Sample records for large performance improvements

  1. Designing a Large-Scale Multilevel Improvement Initiative: The Improving Performance in Practice Program

    ERIC Educational Resources Information Center

    Margolis, Peter A.; DeWalt, Darren A.; Simon, Janet E.; Horowitz, Sheldon; Scoville, Richard; Kahn, Norman; Perelman, Robert; Bagley, Bruce; Miles, Paul

    2010-01-01

    Improving Performance in Practice (IPIP) is a large system intervention designed to align efforts and motivate the creation of a tiered system of improvement at the national, state, practice, and patient levels, assisting primary-care physicians and their practice teams to assess and measurably improve the quality of care for chronic illness and…

  2. Feasibility study of a synthesis procedure for array feeds to improve radiation performance of large distorted reflector antennas

    NASA Technical Reports Server (NTRS)

    Stutzman, W. L.; Takamizawa, K.; Werntz, P.; Lapean, J.; Barts, R.; Shen, B.

    1992-01-01

    Virginia Tech has several activities which support the NASA Langley effort in the area of large aperture radiometric antenna systems. This semi-annual report covers the following activities: a feasibility study of a synthesis procedure for array feeds to improve radiation performance of large distorted reflector antennas, and the design of array feeds for large reflector antennas.

  3. Feasibility study of a synthesis procedure for array feeds to improve radiation performance of large distorted reflector antennas

    NASA Technical Reports Server (NTRS)

    Stutzman, W. L.; Takamizawa, K.; Werntz, P.; Lapean, J.; Barts, R.; Shen, B.

    1991-01-01

    Virginia Tech is involved in a number of activities with NASA Langley related to large aperture radiometric antenna systems. These efforts are summarized and the focus of this report is on the feasibility study of a synthesis procedure for array feeds to improve radiation performance of large distorted reflector antennas; however, some results for all activities are reported.

  4. Active-Learning Methods To Improve Student Performance and Scientific Interest in a Large Introductory Oceanography Course.

    ERIC Educational Resources Information Center

    Yuretich, Richard F.; Khan, Samia A.; Leckie, R. Mark; Clement, John J.

    2001-01-01

    Transforms the environment of a large-enrollment oceanography course by modifying lectures to include cooperative learning via interactive in-class exercises and directed discussion. Results of student surveys, course evaluations, and exam performance demonstrate that learning of the subject under these conditions has improved. (Author/SAH)

  5. Improved technique that allows the performance of large-scale SNP genotyping on DNA immobilized by FTA technology.

    PubMed

    He, Hongbin; Argiro, Laurent; Dessein, Helia; Chevillard, Christophe

    2007-01-01

    FTA technology is a novel method designed to simplify the collection, shipment, archiving and purification of nucleic acids from a wide variety of biological sources. The number of punches that can normally be obtained from a single specimen card is, however, often insufficient for testing the large numbers of loci required to identify genetic factors that control human susceptibility or resistance to multifactorial diseases. In this study, we propose an improved technique to perform large-scale SNP genotyping. We applied a whole genome amplification method to amplify DNA from buccal cell samples stabilized using FTA technology. The results show that using the improved technique it is possible to perform up to 15,000 genotypes from one buccal cell sample. Furthermore, the procedure is simple. We consider this improved technique to be a promising method for performing large-scale SNP genotyping, because the FTA technology simplifies the collection, shipment, archiving and purification of DNA, while whole genome amplification of FTA card-bound DNA produces sufficient material for the determination of thousands of SNP genotypes.

  6. Infrastructure for Large-Scale Quality-Improvement Projects: Early Lessons from North Carolina Improving Performance in Practice

    ERIC Educational Resources Information Center

    Newton, Warren P.; Lefebvre, Ann; Donahue, Katrina E.; Bacon, Thomas; Dobson, Allen

    2010-01-01

    Introduction: Little is known regarding how to accomplish large-scale health care improvement. Our goal is to improve the quality of chronic disease care in all primary care practices throughout North Carolina. Methods: Methods for improvement include (1) common quality measures and shared data system; (2) rapid cycle improvement principles; (3)…

  7. How much improvement in thermoelectric performance can come from reducing thermal conductivity?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaultois, Michael W., E-mail: mgaultois@mrl.ucsb.edu; Sparks, Taylor D., E-mail: sparks@eng.utah.edu

    Large improvements in the performance of thermoelectric materials have come from designing materials with reduced thermal conductivity. Yet as the thermal conductivity of some materials now approaches their amorphous limit, it is unclear if microstructure engineering can further improve thermoelectric performance in these cases. In this contribution, we use large data sets to examine 300 compositions in 11 families of thermoelectric materials and present a type of plot that quickly reveals the maximum possible zT that can be achieved by reducing the thermal conductivity. This plot allows researchers to quickly distinguish materials where the thermal conductivity has been optimized from those where improvement can be made. Moreover, through these large data sets we examine structure-property relationships to identify methods that decrease thermal conductivity and improve thermoelectric performance. We validate, with the data, that increasing (i) the volume of a unit cell and/or (ii) the number of atoms in the unit cell decreases the thermal conductivity of many classes of materials, without changing the electrical resistivity.
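    The bound this record describes can be sketched numerically. The block below is a minimal illustration (not the authors' code; the material numbers and the assumed amorphous limit are invented) of how the figure of merit zT = S²σT/(κ_e + κ_L) is capped once the lattice thermal conductivity κ_L can go no lower:

```python
# Sketch (not the paper's code): thermoelectric figure of merit
# zT = S^2 * sigma * T / (kappa_e + kappa_L).
# Shows the ceiling on zT obtainable purely by lowering the lattice
# thermal conductivity kappa_L toward an assumed amorphous limit.

def zT(seebeck_V_per_K, sigma_S_per_m, T_K, kappa_e, kappa_L):
    """Dimensionless thermoelectric figure of merit."""
    return seebeck_V_per_K**2 * sigma_S_per_m * T_K / (kappa_e + kappa_L)

# Illustrative numbers (order of magnitude for a good thermoelectric)
S, sigma, T = 200e-6, 1e5, 800       # V/K, S/m, K
kappa_e = 0.5                        # W/(m K), electronic contribution
current = zT(S, sigma, T, kappa_e, kappa_L=1.5)
ceiling = zT(S, sigma, T, kappa_e, kappa_L=0.3)  # assumed amorphous limit
print(f"zT now: {current:.2f}, ceiling from kappa_L reduction: {ceiling:.2f}")
```

    With these toy numbers, dropping κ_L from 1.5 to 0.3 W/(m K) raises zT from 1.6 to 4.0; the paper's plots read such ceilings off directly for real material families.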

  8. RTC simulations on large branched sewer systems with SmaRTControl.

    PubMed

    de Korte, Kees; van Beest, Dick; van der Plaat, Marcel; de Graaf, Erno; Schaart, Niels

    2009-01-01

    In The Netherlands, many large branched sewer systems exist. RTC can improve the performance of these systems. The objective of the universal algorithm of SmaRTControl is to improve the performance of the sewer system and the WWTP. The effect of RTC under wet weather flow conditions is simulated using a hydrological model with 19 drainage districts. The system-related inefficiency coefficient (SIC) is introduced for assessment of the performance of sewer systems. The performance can be improved by RTC in combination with increased pumping capacities in the drainage districts, but without increasing the flow to the WWTP. Under dry weather flow conditions the flow to the WWTP can be equalized by storage of wastewater in the sewer system. It is concluded that SmaRTControl can improve the performance, that simulations are necessary, and that SIC is an excellent parameter for assessment of the performance.

  9. Performance monitoring for safe and livable communities : fusing data, to improve arterial operations for all users.

    DOT National Transportation Integrated Search

    2014-10-01

    Measuring or analyzing transportation system performance occupies a large portion of transportation professionals' time, so improving performance measurement methods in terms of accuracy and cost is an important contribution. This research documents develop...

  10. Early experience with pay-for-performance: from concept to practice.

    PubMed

    Rosenthal, Meredith B; Frank, Richard G; Li, Zhonghe; Epstein, Arnold M

    2005-10-12

    The adoption of pay-for-performance mechanisms for quality improvement is growing rapidly. Although there is intense interest in and optimism about pay-for-performance programs, there is little published research on pay-for-performance in health care. To evaluate the impact of a prototypical physician pay-for-performance program on quality of care, we evaluated a natural experiment with pay-for-performance using administrative reports of physician group quality from a large health plan for an intervention group (California physician groups) and a contemporaneous comparison group (Pacific Northwest physician groups). Quality improvement reports issued to approximately 300 large physician organizations from October 2001 through April 2004 were included. Three process measures of clinical quality were used: cervical cancer screening, mammography, and hemoglobin A1c testing. Improvements in clinical quality scores were as follows: for cervical cancer screening, 5.3% for California vs 1.7% for Pacific Northwest; for mammography, 1.9% vs 0.2%; and for hemoglobin A1c, 2.1% vs 2.1%. Compared with physician groups in the Pacific Northwest, the California network demonstrated greater quality improvement after the pay-for-performance intervention only in cervical cancer screening (a 3.6% difference in improvement [P = .02]). In total, the plan awarded 3.4 million dollars (27% of the amount set aside) in bonus payments between July 2003 and April 2004, the first year of the program. For all 3 measures, physician groups with baseline performance at or above the performance threshold for receipt of a bonus improved the least but garnered the largest share of the bonus payments. Paying clinicians to reach a common, fixed performance target may produce little gain in quality for the money spent and will largely reward those with higher performance at baseline.

  11. Feature engineering for MEDLINE citation categorization with MeSH.

    PubMed

    Jimeno Yepes, Antonio Jose; Plaza, Laura; Carrillo-de-Albornoz, Jorge; Mork, James G; Aronson, Alan R

    2015-04-08

    Research in biomedical text categorization has mostly used the bag-of-words representation. Other, more sophisticated representations of text based on syntactic, semantic and argumentative properties have been less studied. In this paper, we evaluate the impact of different text representations of biomedical texts as features for reproducing the MeSH annotations of some of the most frequent MeSH headings. In addition to unigrams and bigrams, these features include noun phrases, citation meta-data, citation structure, and semantic annotation of the citations. Traditional features like unigrams and bigrams exhibit strong performance compared to other feature sets. Little or no improvement is obtained when using meta-data or citation structure. Noun phrases are too sparse and thus have lower performance compared to more traditional features. Conceptual annotation of the texts by MetaMap shows performance similar to unigrams, but adding concepts from the UMLS taxonomy does not improve on using only mapped concepts. The combination of all the features performs considerably better than any individual feature set considered. In addition, this combination improves the performance of a state-of-the-art MeSH indexer. Concerning the machine learning algorithms, we find that those that are more resilient to class imbalance obtain considerably better performance. We conclude that even though traditional features such as unigrams and bigrams have strong performance compared to other features, it is possible to combine them to effectively improve on the bag-of-words representation. We have also found that the combination of learning algorithm and feature sets influences the overall performance of the system. Moreover, using learning algorithms resilient to class imbalance greatly improves performance. However, when using a large set of features, care must be taken in choosing the algorithm because of the risk of over-fitting. Specific combinations of learning algorithms and features for individual MeSH headings could further increase the performance of an indexing system.
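    The winning strategy in this record, combining simple lexical feature sets rather than relying on any one of them, can be sketched as follows (a toy illustration, not the authors' pipeline; the tokenization is deliberately naive):

```python
# Sketch: unigram and bigram feature extraction, then feature-set union.
# A real system would vectorize these and feed them to a learner that
# is resilient to class imbalance, per the paper's findings.

def unigrams(text):
    # Naive whitespace tokenization, lowercased
    return text.lower().split()

def bigrams(text):
    toks = unigrams(text)
    return [f"{a}_{b}" for a, b in zip(toks, toks[1:])]

def features(text):
    # Union of the two feature sets: the combination that outperformed
    # any single representation in the study
    return set(unigrams(text)) | set(bigrams(text))

fs = features("Feature engineering improves MeSH indexing")
print(sorted(fs))
```

    The same union pattern extends to noun phrases, meta-data, and MetaMap concepts; the study's point is that the combined set beats each component alone.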

  12. Improving parallel I/O autotuning with performance modeling

    DOE PAGES

    Behzad, Babak; Byna, Surendra; Wild, Stefan M.; ...

    2014-01-01

    Various layers of the parallel I/O subsystem offer tunable parameters for improving I/O performance on large-scale computers. However, searching through a large parameter space is challenging. We are working towards an autotuning framework for determining the parallel I/O parameters that can achieve good I/O performance for different data write patterns. In this paper, we characterize parallel I/O and discuss the development of predictive models for use in effectively reducing the parameter space. Furthermore, applying our technique on tuning an I/O kernel derived from a large-scale simulation code shows that the search time can be reduced from 12 hours to 2 hours, while achieving 54X I/O performance speedup.
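    The pruning idea in this record can be sketched as follows. This is a hypothetical illustration: the tunable names echo common MPI-IO/Lustre knobs, and the cost model is an invented stand-in for the paper's trained predictive models:

```python
# Sketch of model-directed search-space reduction: rank the Cartesian
# product of I/O tunables with a cheap predictive model, then actually
# benchmark only the top few candidates instead of the full space.
from itertools import product

stripe_counts = [4, 8, 16, 32]
stripe_sizes_mb = [1, 4, 16, 64]
cb_nodes = [1, 2, 4, 8]

def predicted_cost(sc, ss, cb, write_gb=64):
    # Toy stand-in for a trained performance model: predicted seconds
    # to write `write_gb` GB under this configuration.
    bandwidth = sc * min(ss, 16) * (1 + 0.1 * cb)  # arbitrary shape
    return write_gb * 1024 / bandwidth

space = list(product(stripe_counts, stripe_sizes_mb, cb_nodes))
top = sorted(space, key=lambda p: predicted_cost(*p))[:5]
print(f"benchmarking {len(top)} of {len(space)} configurations")
```

    Benchmarking 5 of 64 configurations instead of all of them is the same shape of saving the paper reports (12 hours of search reduced to 2).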

  13. Performance of Extended Local Clustering Organization (LCO) for Large Scale Job-Shop Scheduling Problem (JSP)

    NASA Astrophysics Data System (ADS)

    Konno, Yohko; Suzuki, Keiji

    This paper describes the development of a general-purpose solution algorithm for large-scale problems using “Local Clustering Organization (LCO)”, applied as a new approach to the job-shop scheduling problem (JSP). Building on the effective large-scale scheduling performance of standard LCO, we examine whether solving JSP in a way that preserves stability yields better solutions. To improve solution performance on JSP, the optimization process of LCO is examined, and the scheduling solution structure is extended to a new structure based on machine division. A solving method that introduces effective local clustering for this solution structure is proposed as an extended LCO. The extended LCO improves the scheduling evaluation efficiently by clustering a parallel search that extends over plural machines. Results verified by applying the extended LCO to problems of various scales show that it minimizes makespan and delivers stable performance improvements.

  14. Learning to Collaborate: A Case Study of Performance Improvement CME

    ERIC Educational Resources Information Center

    Shershneva, Marianna B.; Mullikin, Elizabeth A.; Loose, Anne-Sophie; Olson, Curtis A.

    2008-01-01

    Introduction: Performance Improvement Continuing Medical Education (PI CME) is a mechanism for joining quality improvement (QI) in health care and continuing medical education (CME) systems together. Although QI practices and CME approaches have been recognized for years, what emerges from their integration is largely unfamiliar, because it…

  15. Failure tolerance strategy of space manipulator for large load carrying tasks

    NASA Astrophysics Data System (ADS)

    Chen, Gang; Yuan, Bonan; Jia, Qingxuan; Sun, Hanxu; Guo, Wen

    2018-07-01

    During the execution of large load-carrying tasks in long-term service, a space manipulator runs a notable risk of locked-joint failure, so it must have sufficient failure-tolerance performance. This paper studies how to evaluate failure-tolerance performance and re-plan feasible task trajectories for a space manipulator performing large load-carrying tasks. The effects of locked-joint failure on critical performance (reachability and load-carrying capacity) of the space manipulator are analyzed first. According to the requirements of load-carrying tasks, we further propose a new concept, the failure tolerance workspace with load carrying capacity (FTWLCC), to evaluate failure-tolerance performance, and improve the classic A* algorithm to search for a feasible task trajectory. Through the normalized FTWLCC and the improved A* algorithm, the reachability and load-carrying capacity of the degraded space manipulator are evaluated, and a reachable and capable trajectory can be obtained. The FTWLCC combines mathematical statistics with failure-tolerance performance to illustrate the distribution of load-carrying capacity in three-dimensional space, so multiple performance indices can be analyzed simultaneously and visually. Full consideration of all possible failure situations and motion states makes the FTWLCC and the improved A* algorithm universal and effective enough for random joint failures and a variety of large load-carrying task requirements, so they can be extended to other types of manipulators.

  16. Relative health performance in BRICS over the past 20 years: the winners and losers.

    PubMed

    Petrie, Dennis; Tang, Kam Ki

    2014-06-01

    To determine whether the health performance of Brazil, the Russian Federation, India, China and South Africa--the countries known as BRICS--has kept in step with their economic development. Reductions in age- and sex-specific mortality seen in each BRICS country between 1990 and 2011 were measured. These results were compared with those of the best-performing countries in the world and the best-performing countries with similar income levels. We estimated each country's progress in reducing mortality and compared changes in that country's mortality rates against other countries with similar mean incomes to examine changes in avoidable mortality. The relative health performance of the five study countries differed markedly over the study period. Brazil demonstrated fairly even improvement in relative health performance across the different age and sex subgroups that we assessed. India's improvement was more modest and more varied across the subgroups. South Africa and the Russian Federation exhibited large declines in health performance as well as large sex-specific inequalities in health. Although China's levels of avoidable mortality decreased in absolute terms, the level of improvement appeared low in the context of China's economic growth. When evaluating a country's health performance in terms of avoidable mortality, it is useful to compare that performance against the performance of other countries. Such comparison allows any country-specific improvements to be distinguished from general global improvements.

  17. Large-scale three-dimensional phase-field simulations for phase coarsening at ultrahigh volume fraction on high-performance architectures

    NASA Astrophysics Data System (ADS)

    Yan, Hui; Wang, K. G.; Jones, Jim E.

    2016-06-01

    A parallel algorithm for large-scale three-dimensional phase-field simulations of phase coarsening is developed and implemented on high-performance architectures. From the large-scale simulations, a new kinetics of phase coarsening in the region of ultrahigh volume fraction is found. The parallel implementation is capable of harnessing the greater computer power available from high-performance architectures. The parallelized code enables an increase in three-dimensional simulation system size up to a 512³ grid cube. Through the parallelized code, practical runtime can be achieved for three-dimensional large-scale simulations, and the statistical significance of the results from these high-resolution parallel simulations is greatly improved over that obtainable from serial simulations. A detailed performance analysis on speed-up and scalability is presented, showing good scalability which improves with increasing problem size. In addition, a model for prediction of runtime is developed, which shows good agreement with actual run time from numerical tests.

  18. TORNADO-WARNING PERFORMANCE IN THE PAST AND FUTURE: A Perspective from Signal Detection Theory.

    NASA Astrophysics Data System (ADS)

    Brooks, Harold E.

    2004-06-01

    Changes over the years in tornado-warning performance in the United States can be modeled from the perspective of signal detection theory. From this view, it can be seen that there have been distinct periods of change in performance, most likely associated with deployment of radars, and changes in scientific understanding and training. The model also makes it clear that improvements in the false alarm ratio can only occur at the cost of large decreases in the probability of detection, or with large improvements in the overall quality of the warning system.
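    The two verification scores traded off against each other in this record are standard and easy to compute from a 2×2 warning contingency table. The counts below are invented for illustration, not NWS verification data:

```python
# Sketch: probability of detection (POD) and false-alarm ratio (FAR)
# from warning verification counts. Signal detection theory says that
# lowering the warning threshold raises POD but also raises FAR: the
# two cannot be improved independently without a better warning system.

def pod(hits, misses):
    """Fraction of events that were warned for."""
    return hits / (hits + misses)

def far(hits, false_alarms):
    """Fraction of warnings that verified as false."""
    return false_alarms / (hits + false_alarms)

# Illustrative counts only
hits, misses, false_alarms = 70, 30, 160
print(f"POD = {pod(hits, misses):.2f}, FAR = {far(hits, false_alarms):.2f}")
```

    In the paper's framing, moving along this tradeoff (e.g., cutting FAR by warning less often) costs POD; only a shift in the underlying discrimination quality, such as new radars or better training, improves both.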

  19. IM-Chem: The Use of Instant Messaging to Improve Student Performance and Personalize Large Lecture General Chemistry Courses

    ERIC Educational Resources Information Center

    Behmke, Derek A.; Atwood, Charles H.

    2012-01-01

    Previous research has linked poor student performance with the depersonalized feeling of large lecture courses. Various forms of enhanced communication have been tried that appear to enhance personalization in large courses. For general chemistry classes taught in a 365-seat lecture hall at the University of Georgia, we have attempted to enhance…

  20. Study Abroad Field Trip Improves Test Performance through Engagement and New Social Networks

    ERIC Educational Resources Information Center

    Houser, Chris; Brannstrom, Christian; Quiring, Steven M.; Lemmons, Kelly K.

    2011-01-01

    Although study abroad trips provide an opportunity for affective and cognitive learning, it is largely assumed that they improve learning outcomes. The purpose of this study is to determine whether a study abroad field trip improved cognitive learning by comparing test performance between the study abroad participants (n = 20) and their peers who…

  1. Vapor and healing treatment for CH3NH3PbI3-xClx films toward large-area perovskite solar cells

    NASA Astrophysics Data System (ADS)

    Gouda, Laxman; Gottesman, Ronen; Tirosh, Shay; Haltzi, Eynav; Hu, Jiangang; Ginsburg, Adam; Keller, David A.; Bouhadana, Yaniv; Zaban, Arie

    2016-03-01

    Hybrid methyl-ammonium lead trihalide perovskites are promising low-cost materials for use in solar cells and other optoelectronic applications. With a certified photovoltaic conversion efficiency record of 20.1%, scale-up for commercial purposes is already underway. However, preparation of large-area perovskite films remains a challenge, and films of perovskites on large electrodes suffer from non-uniform performance. Thus, production and characterization of the lateral uniformity of large-area films is a crucial step towards scale-up of devices. In this paper, we present a reproducible method for improving the lateral uniformity and performance of large-area perovskite solar cells (32 cm2). The method is based on methyl-ammonium iodide (MAI) vapor treatment as a new step in the sequential deposition of perovskite films. Following the MAI vapor treatment, we used high-throughput techniques to map the photovoltaic performance throughout the large-area device. The lateral uniformity and performance of all photovoltaic parameters (Voc, Jsc, fill factor, photo-conversion efficiency) increased, with an overall improvement in photo-conversion efficiency of ~100% following a vapor treatment at 140 °C. Based on XRD and photoluminescence measurements, we propose that the MAI treatment promotes a “healing effect” in the perovskite film which increases the lateral uniformity across the large-area solar cell. Thus, the straightforward MAI vapor treatment is highly beneficial for large-scale commercialization of perovskite solar cells, regardless of the specific deposition method. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr08658b

  2. High-Performance Monitoring Architecture for Large-Scale Distributed Systems Using Event Filtering

    NASA Technical Reports Server (NTRS)

    Maly, K.

    1998-01-01

    Monitoring is an essential process to observe and improve the reliability and the performance of large-scale distributed (LSD) systems. In an LSD environment, a large number of events are generated by the system components during execution or interaction with external objects (e.g. users or processes). Monitoring such events is necessary for observing the run-time behavior of LSD systems and providing status information required for debugging, tuning and managing such applications. However, correlated events are generated concurrently and may be distributed across various locations in the application environment, which complicates the management decision process and thereby makes monitoring LSD systems an intricate task. We propose a scalable, high-performance monitoring architecture for LSD systems to detect and classify interesting local and global events and disseminate the monitoring information to the corresponding endpoint management applications, such as debugging and reactive control tools, to improve application performance and reliability. A large volume of events may be generated due to the extensive demands of the monitoring applications and the high interaction of LSD systems. The monitoring architecture employs a high-performance event filtering mechanism to efficiently process the large volume of event traffic generated by LSD systems and to minimize the intrusiveness of the monitoring process by reducing the event traffic flow in the system and distributing the monitoring computation. Our architecture also supports dynamic and flexible reconfiguration of the monitoring mechanism via its instrumentation and subscription components. As a case study, we show how our monitoring architecture can be utilized to improve the reliability and the performance of the Interactive Remote Instruction (IRI) system, a large-scale distributed system for collaborative distance learning. The filtering mechanism is an intrinsic component integrated with the monitoring architecture to reduce the volume of event traffic flow in the system, and thereby reduce the intrusiveness of the monitoring process. We are developing an event filtering architecture to efficiently process the large volume of event traffic generated by LSD systems (such as distributed interactive applications). This filtering architecture is used to monitor a collaborative distance-learning application to obtain debugging and feedback information. Our architecture supports the dynamic (re)configuration and optimization of event filters in large-scale distributed systems. Our work makes a major contribution by (1) surveying and evaluating existing event filtering mechanisms for monitoring LSD systems and (2) devising an integrated, scalable, high-performance event filtering architecture that spans several key application domains, presenting techniques to improve functionality, performance and scalability. This paper describes the primary characteristics and challenges of developing high-performance event filtering for monitoring LSD systems. We survey existing event filtering mechanisms and explain the key characteristics of each technique. In addition, we discuss limitations of existing event filtering mechanisms and outline how our architecture will improve key aspects of event filtering.
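    The subscription-based event filtering this record describes can be reduced to a small sketch (the class and field names are hypothetical; real systems distribute the filters across the network, but the core contract is the same):

```python
# Minimal sketch of subscription-based event filtering: consumers
# register predicates, and only matching events are forwarded,
# reducing monitoring traffic and the intrusiveness of monitoring.

class EventFilter:
    def __init__(self):
        self.subs = []  # list of (predicate, callback) pairs

    def subscribe(self, predicate, callback):
        self.subs.append((predicate, callback))

    def publish(self, event):
        # Forward the event only to subscribers whose predicate matches
        for pred, cb in self.subs:
            if pred(event):
                cb(event)

f = EventFilter()
seen = []
# A management tool subscribes only to high-severity events
f.subscribe(lambda e: e["severity"] >= 2, seen.append)
f.publish({"src": "nodeA", "severity": 1})  # filtered out
f.publish({"src": "nodeB", "severity": 3})  # delivered
print(len(seen))
```

    Pushing such predicates close to the event sources is what cuts the traffic flow; the architecture's dynamic (re)configuration amounts to adding, removing, and relocating these subscriptions at run time.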

  3. Experimental and analytical investigations to improve low-speed performance and stability and control characteristics of supersonic cruise fighter vehicles

    NASA Technical Reports Server (NTRS)

    Graham, A. B.

    1977-01-01

    Small- and large-scale models of supersonic cruise fighter vehicles were used to determine the effectiveness of airframe/propulsion integration concepts for improved low-speed performance and stability and control characteristics. Computer programs were used for engine/airframe sizing studies to yield optimum vehicle performance.

  4. Performance model-directed data sieving for high-performance I/O

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yong; Lu, Yin; Amritkar, Prathamesh

    2014-09-10

    Many scientific computing applications and engineering simulations exhibit noncontiguous I/O access patterns. Data sieving is an important technique to improve the performance of noncontiguous I/O accesses by combining small and noncontiguous requests into a large and contiguous request. It has been proven effective even though more data are potentially accessed than demanded. In this study, we propose a new data sieving approach, namely performance model-directed data sieving, or PMD data sieving for short. It improves the existing data sieving approach in two aspects: (1) it dynamically determines when it is beneficial to perform data sieving; and (2) it dynamically determines how to perform data sieving if beneficial. It improves the performance of the existing data sieving approach considerably and reduces the memory consumption, as verified by both theoretical analysis and experimental results. Given the importance of supporting noncontiguous accesses effectively and reducing the memory pressure in a large-scale system, the proposed PMD data sieving approach holds great promise and will have an impact on high-performance I/O systems.
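    The underlying data-sieving transformation, one contiguous read covering many small requests, can be sketched generically (this is not the PMD implementation; the function names are invented):

```python
# Sketch of data sieving: instead of issuing many small noncontiguous
# reads, read one contiguous span covering them all, then slice out the
# requested pieces. More data are read than demanded, but far fewer
# I/O operations are issued.

def sieved_read(read_at, requests):
    """requests: list of (offset, length); read_at(offset, length) -> bytes."""
    lo = min(off for off, _ in requests)
    hi = max(off + ln for off, ln in requests)
    buf = read_at(lo, hi - lo)  # one large contiguous read
    return [buf[off - lo: off - lo + ln] for off, ln in requests]

# Toy backing store standing in for a file, with call counting
data = bytes(range(256))
reads = []
def read_at(off, ln):
    reads.append((off, ln))
    return data[off:off + ln]

parts = sieved_read(read_at, [(10, 4), (50, 2), (200, 8)])
print(len(reads), "backend read(s)")
```

    Three scattered requests become a single backend read. A model-directed variant, as proposed here, would first predict whether the extra bytes covering the holes between requests make that single large read worthwhile, and fall back to individual reads when they do not.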

  5. Leading Educational Change and Improvement at Scale: Some Inconvenient Truths about System Performance

    ERIC Educational Resources Information Center

    Harris, Alma; Jones, Michelle

    2017-01-01

    The challenges of securing educational change and transformation, at scale, remain considerable. While sustained progress has been made in some education systems (Fullan, 2009; Hargreaves & Shirley, 2009) generally, it remains the case that the pathway to large-scale, system improvement is far from easy or straightforward. While large-scale…

  6. Improving Performance through Knowledge Translation in the Veterans Health Administration

    ERIC Educational Resources Information Center

    Francis, Joseph; Perlin, Jonathan B.

    2006-01-01

    The Veterans Health Administration (VA) provides a case study for linking performance measurement, information technology, and aligned research efforts to facilitate quality improvement in a large, complex health system. Dialogue between clinical researchers and VA leaders occurs through structured activities (e.g., the Quality Enhancement…

  7. Does a Brief Mindfulness Intervention Impact Quiz Performance?

    ERIC Educational Resources Information Center

    Calma-Birling, Destany; Gurung, Regan A. R.

    2017-01-01

    Mindfulness practices improve cognition, emotional balance, and well-being in clinical and non-clinical populations. The bulk of mindfulness research in higher education has focused on improving psychological and cognitive variables, leaving academic performance largely unexplored. We investigated the effects of a brief mindfulness intervention on…

  8. Daily online testing in large classes: boosting college performance while reducing achievement gaps.

    PubMed

    Pennebaker, James W; Gosling, Samuel D; Ferrell, Jason D

    2013-01-01

    An in-class computer-based system that included daily online testing was introduced to two large university classes. We examined subsequent improvements in academic performance and reductions in the achievement gap between lower- and upper-middle-class students. Students (N = 901) brought laptop computers to class and took daily quizzes that provided immediate and personalized feedback. Student performance was compared with the same data for traditional classes taught previously by the same instructors (N = 935). Exam performance was approximately half a letter grade above previous semesters, based on comparisons of identical questions asked in earlier years. Students in the experimental classes also performed better in other classes, both in the semester they took the course and in subsequent semesters. The new system resulted in a 50% reduction in the achievement gap, as measured by grades, among students of different social classes. These findings suggest that frequent consequential quizzing should be used routinely in large lecture courses to improve performance in class and in other concurrent and subsequent courses.

  9. Relative health performance in BRICS over the past 20 years: the winners and losers

    PubMed Central

    Petrie, Dennis

    2014-01-01

    Objective: To determine whether the health performance of Brazil, the Russian Federation, India, China and South Africa – the countries known as BRICS – has kept in step with their economic development. Methods: Reductions in age- and sex-specific mortality seen in each BRICS country between 1990 and 2011 were measured. These results were compared with those of the best-performing countries in the world and the best-performing countries with similar income levels. We estimated each country’s progress in reducing mortality and compared changes in that country’s mortality rates against other countries with similar mean incomes to examine changes in avoidable mortality. Findings: The relative health performance of the five study countries differed markedly over the study period. Brazil demonstrated fairly even improvement in relative health performance across the different age and sex subgroups that we assessed. India’s improvement was more modest and more varied across the subgroups. South Africa and the Russian Federation exhibited large declines in health performance as well as large sex-specific inequalities in health. Although China’s levels of avoidable mortality decreased in absolute terms, the level of improvement appeared low in the context of China’s economic growth. Conclusion: When evaluating a country’s health performance in terms of avoidable mortality, it is useful to compare that performance against the performance of other countries. Such comparison allows any country-specific improvements to be distinguished from general global improvements. PMID:24940013

  10. Thermal evaluation of advanced solar dynamic heat receiver performance

    NASA Technical Reports Server (NTRS)

    Crane, Roger A.

    1989-01-01

    The thermal performance of a variety of concepts for thermal energy storage as applied to solar dynamic applications is discussed. It is recognized that designs providing large thermal gradients or large temperature swings during orbit are susceptible to early mechanical failure. Concepts incorporating heat pipe technology may encounter operational limitations over sufficiently large ranges. By reviewing the thermal performance of basic designs, the relative merits of the basic concepts are compared. In addition the effect of thermal enhancement and metal utilization as applied to each design provides a partial characterization of the performance improvements to be achieved by developing these technologies.

  11. Performance evaluation capabilities for the design of physical systems

    NASA Technical Reports Server (NTRS)

    Pilkey, W. D.; Wang, B. P.

    1972-01-01

    The results are presented of a study aimed at developing and formulating a capability for the limiting performance of large steady state systems. The accomplishments reported include: (1) development of a theory of limiting performance of large systems subject to steady state inputs; (2) application and modification of PERFORM, the computational capability for the limiting performance of systems with transient inputs; and (3) demonstration that use of an inherently smooth control force for a limiting performance calculation improves the system identification phase of the design process for physical systems subjected to transient loading.

  12. The effect of various parameters of large scale radio propagation models on improving performance mobile communications

    NASA Astrophysics Data System (ADS)

    Pinem, M.; Fauzi, R.

    2018-02-01

    One technique for ensuring continuity of wireless communication services and keeping transitions smooth on mobile communication networks is the soft handover technique. In the Soft Handover (SHO) technique, the inclusion and removal of a Base Station in the active set are determined by initiation triggers. One of the initiation triggers is based on the strength of the received signal. In this paper we observe the influence of the parameters of large-scale radio propagation models on the performance of mobile communications. The observed parameters characterizing the performance of the specified mobile system are Drop Call rate, Radio Link Degradation Rate, and Average Size of Active Set (AS). The simulated results show that increasing the heights of the Base Station (BS) antenna and the Mobile Station (MS) antenna improves the received signal power level, which improves radio link quality, increases the average size of the Active Set, and reduces the average Drop Call rate. It was also found that Hata’s propagation model contributed significantly greater improvements in system performance parameters than Okumura’s propagation model and Lee’s propagation model.
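The Okumura-Hata urban path-loss model that this record compares can be written down directly. The sketch below uses the standard small/medium-city formulation; the frequencies and antenna heights are illustrative only:

```python
import math

def hata_urban(f_mhz, h_b, h_m, d_km):
    """Okumura-Hata median path loss (dB) for urban areas, valid roughly
    for f = 150-1500 MHz, BS antenna h_b = 30-200 m, distance d = 1-20 km."""
    # Mobile-antenna correction factor for a small/medium city.
    a_hm = (1.1 * math.log10(f_mhz) - 0.7) * h_m - (1.56 * math.log10(f_mhz) - 0.8)
    return (69.55 + 26.16 * math.log10(f_mhz) - 13.82 * math.log10(h_b)
            - a_hm + (44.9 - 6.55 * math.log10(h_b)) * math.log10(d_km))

# Raising either antenna lowers the predicted loss, i.e. improves the
# received signal level, consistent with the finding reported above.
loss_low = hata_urban(900, 50, 1.5, 2)    # 50 m BS antenna
loss_high = hata_urban(900, 100, 1.5, 2)  # 100 m BS antenna
```

Both the `-13.82 log10(h_b)` term and the distance-slope term `(44.9 - 6.55 log10(h_b))` decrease as the base-station antenna is raised, which is the mechanism behind the improvement the abstract observes.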

  13. A Quantitative Evaluation of the Flipped Classroom in a Large Lecture Principles of Economics Course

    ERIC Educational Resources Information Center

    Balaban, Rita A.; Gilleskie, Donna B.; Tran, Uyen

    2016-01-01

    This research provides evidence that the flipped classroom instructional format increases student final exam performance, relative to the traditional instructional format, in a large lecture principles of economics course. The authors find that the flipped classroom directly improves performance by 0.2 to 0.7 standardized deviations, depending on…

  14. Incorporation of Carrier Phase Global Positioning System Measurements into the Navigation Reference System for Improved Performance

    DTIC Science & Technology

    1993-12-01

    [Table-of-contents fragments omitted: Large Cycle Slip Simulation; Small Cycle Slip Simulation; Bibliography.] …when subjected to large and small cycle slips. Results of the simulations indicate that the PNRS can provide an improved navigation solution over…

  15. The effects of low-volume resistance training with and without advanced techniques in trained subjects.

    PubMed

    Gießing, Jürgen; Fisher, James; Steele, James; Rothe, Frank; Raubold, Kristin; Eichmann, Björn

    2016-03-01

    This study examined low-volume resistance training (RT) in trained participants with and without advanced training methods. Trained participants (RT experience 4±3 years) were randomised to groups performing single-set RT: ssRM (N.=21) performing repetitions to self-determined repetition maximum (RM), ssMMF (N.=30) performing repetitions to momentary muscular failure (MMF), and ssRP (N.=28) performing repetitions to self-determined RM using a rest-pause (RP) method. Each group performed supervised RT twice/week for 10 weeks. Outcomes included maximal isometric strength and body composition using bioelectrical impedance analysis. The ssRM group did not significantly improve in any outcome. The ssMMF and ssRP groups both significantly improved strength (P<0.05). The magnitude of changes was examined between groups using effect sizes (ES). Strength ESs were large for ssMMF (0.91 to 1.57) and ranged from small to large for ssRP (0.42 to 1.06). Body composition data revealed significant improvements (P<0.05) in muscle and fat mass and percentages for the whole body, upper limbs, and trunk for ssMMF, but only for the upper limbs for ssRP. Body composition ESs ranged from moderate to large for ssMMF (0.56 to 1.27) and from small to moderate for ssRP (0.28 to 0.52). ssMMF also significantly improved (P<0.05) total abdominal fat and increased intracellular water, with moderate ESs (-0.62 and 0.56, respectively). Training to self-determined RM is not efficacious for trained participants. Training to MMF produces the greatest improvements in strength and body composition; however, RP-style training does offer some benefit.
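The effect sizes quoted above are standardized mean differences. A minimal sketch of the usual Cohen's-d computation, with made-up numbers rather than the study's data:

```python
import math

def cohens_d(group1, group2):
    """Standardized mean difference between two groups using the
    pooled sample standard deviation (Cohen's d)."""
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    # Unbiased sample variances, then the pooled SD.
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    sd_pooled = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m2 - m1) / sd_pooled

# Illustrative pre/post strength scores in arbitrary units.
d = cohens_d([1.0, 2.0, 3.0, 4.0], [2.0, 3.0, 4.0, 5.0])
```

By the usual convention, d ≈ 0.2 is small, 0.5 moderate, and 0.8 large, which is the scale the abstract's "small to large" ranges refer to.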

  16. Gap analysis: assessing the value perception of consultant pharmacist services and the performance of consultant pharmacists.

    PubMed

    Clark, Thomas R

    2008-09-01

    To understand the importance of services provided by consultant pharmacists and to assess perception of their performance of those services. Cross-sectional; nursing facility teams. Random e-mail survey of consultant pharmacists; phone survey of team members. 233 consultant pharmacists (practicing in a nursing facility); 540 team members (practicing in a nursing facility, interacting with ≥1 consultant pharmacist): 120 medical directors, 210 directors of nursing, 210 administrators. Consultant pharmacists, directors of nursing, medical directors, and administrators rated the importance/performance of 21 services. The gap between teams' ratings of importance and consultant pharmacists' performance was assessed to categorize services. Importance/performance was ranked on a five-point scale, and mean scores were used in a gap analysis to cluster services into four categories. For the combined group, six services were categorized as "Keep It Up" (important, good performance), with consensus among individual groups except for a discrepancy with medical directors on one service. Six services each were categorized as "Improve" (important, large gap) and "Improve Second" (lower importance, large gap), with varied responses by individual groups. Three other services were categorized as "Don't Worry," with consensus within individual groups. Consensus from all groups found that 5 of 21 services are important and performed well by consultant pharmacists, indicating that performance of these services should be maintained. For three services, consultant pharmacists do not need to worry about their performance. Thirteen services require improvement in consultant pharmacists' performance; the groups differ on the extent of improvement needed. Results can serve as benchmark comparisons with results obtained by consultant pharmacists in their own facilities.

  17. Hybrid Propulsion Technology Program

    NASA Technical Reports Server (NTRS)

    Jensen, G. E.; Holzman, A. L.

    1990-01-01

    Future launch systems of the United States will require improvements in booster safety, reliability, and cost. In order to increase payload capabilities, performance improvements are also desirable. The hybrid rocket motor (HRM) offers the potential for improvements in all of these areas. Designs are presented for two sizes of hybrid boosters: a large 4.57 m (180 in.) diameter booster duplicating the Advanced Solid Rocket Motor (ASRM) vacuum thrust-time profile, and a smaller 2.44 m (96 in.) diameter, one-quarter-thrust-level booster. The large booster would be used in tandem, while eight small boosters would be used to achieve the same total thrust. These preliminary designs were generated as part of the NASA Hybrid Propulsion Technology Program, the first phase of an eventual three-phase program culminating in the demonstration of a large subscale engine. The initial trade and sizing studies resulted in preferred motor diameters, operating pressures, nozzle geometry, and fuel grain systems for both the large and small boosters. The data were then used for specific performance predictions in terms of payload and for the definition and selection of the requirements for the major components: the oxidizer feed system, nozzle, and thrust vector system. All of the parametric studies were performed using realistic fuel regression models based upon specific experimental data.

  18. Collaborative Testing Improves Performance but Not Content Retention in a Large-Enrollment Introductory Biology Class

    PubMed Central

    Leight, Hayley; Saunders, Cheston; Calkins, Robin; Withers, Michelle

    2012-01-01

    Collaborative testing has been shown to improve performance but not always content retention. In this study, we investigated whether collaborative testing could improve both performance and content retention in a large, introductory biology course. Students were semirandomly divided into two groups based on their performances on exam 1. Each group contained equal numbers of students scoring in each grade category (“A”–“F”) on exam 1. All students completed each of the four exams of the semester as individuals. For exam 2, one group took the exam a second time in small groups immediately following the individually administered test. The other group followed this same format for exam 3. Individual and group exam scores were compared to determine differences in performance. All but exam 1 contained a subset of cumulative questions from the previous exam. Performances on the cumulative questions for exams 3 and 4 were compared for the two groups to determine whether there were significant differences in content retention. Even though group test scores were significantly higher than individual test scores, students who participated in collaborative testing performed no differently on cumulative questions than students who took the previous exam as individuals. PMID:23222835

  19. Aerodynamic Limits on Large Civil Tiltrotor Sizing and Efficiency

    NASA Technical Reports Server (NTRS)

    Acree, C W.

    2014-01-01

    The NASA Large Civil Tiltrotor (2nd generation, or LCTR2) is a useful reference design for technology impact studies. The present paper takes a broad view of technology assessment by examining the extremes of what aerodynamic improvements might hope to accomplish. Performance was analyzed with aerodynamically idealized rotor, wing, and airframe, representing the physical limits of a large tiltrotor. The analysis was repeated with more realistic assumptions, which revealed that increased maximum rotor lift capability is potentially more effective in improving overall vehicle efficiency than higher rotor or wing efficiency. To balance these purely theoretical studies, some practical limitations on airframe layout are also discussed, along with their implications for wing design. Performance of a less efficient but more practical aircraft with non-tilting nacelles is presented.

  20. Treatment of Previously Treated Facial Capillary Malformations: Results of Single-Center Retrospective Objective 3-Dimensional Analysis of the Efficacy of Large Spot 532 nm Lasers.

    PubMed

    Kwiek, Bartłomiej; Ambroziak, Marcin; Osipowicz, Katarzyna; Kowalewski, Cezary; Rożalski, Michał

    2018-06-01

    Current treatment of facial capillary malformations (CM) has limited efficacy. To assess the efficacy of large spot 532 nm lasers for the treatment of previously treated facial CM with the use of 3-dimensional (3D) image analysis. Forty-three white patients aged 6 to 59 years were included in this study. Patients had 3D photography performed before and after treatment with a 532 nm Nd:YAG laser with large spot and contact cooling. Objective analysis of percentage improvement based on 3D digital assessment of combined color and area improvement (global clearance effect [GCE]) was performed. The median maximal improvement achieved during the treatment (GCE) was 59.1%. The mean number of laser procedures required to achieve this improvement was 6.2 (range 1-16). Improvement of at least 25% (GCE25) was achieved by 88.4% of patients, at least 50% (GCE50) by 61.1%, at least 75% (GCE75) by 25.6%, and at least 90% (GCE90) by 4.6%. Patients previously treated with pulsed dye lasers had a significantly weaker response than those treated with other modalities (GCE 37.3% vs 61.8%, respectively). A large spot 532 nm laser is effective in previously treated patients with facial CM.

  1. The Use of Novel Camtasia Videos to Improve Performance of At-Risk Students in Undergraduate Physiology Courses

    ERIC Educational Resources Information Center

    Miller, Cynthia J.

    2014-01-01

    Students in undergraduate physiology courses often have difficulty understanding complex, multi-step processes, and these concepts consume a large portion of class time. For this pilot study, it was hypothesized that online multimedia resources may improve student performance in a high-risk population and reduce the in-class workload. A narrated…

  2. Do pre-trained deep learning models improve computer-aided classification of digital mammograms?

    NASA Astrophysics Data System (ADS)

    Aboutalib, Sarah S.; Mohamed, Aly A.; Zuley, Margarita L.; Berg, Wendie A.; Luo, Yahong; Wu, Shandong

    2018-02-01

    Digital mammography screening is an important exam for the early detection of breast cancer and reduction in mortality. False positives leading to high recall rates, however, result in unnecessary negative consequences for patients and health care systems. To better aid radiologists, computer-aided tools can be utilized to improve distinction between image classifications and thus potentially reduce false recalls. The emergence of deep learning has shown promising results in the area of biomedical imaging data analysis. This study aimed to investigate deep learning and transfer learning methods that can improve digital mammography classification performance. In particular, we evaluated the effect of pre-training deep learning models with other imaging datasets in order to boost classification performance on a digital mammography dataset. Two types of datasets were used for pre-training: (1) a digitized film mammography dataset, and (2) a very large non-medical imaging dataset. By using either of these datasets to pre-train the network initially, and then fine-tuning with the digital mammography dataset, we found an increase in overall classification performance in comparison to a model without pre-training, with the very large non-medical dataset performing best in improving classification accuracy.
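The pre-train-then-fine-tune pattern the study evaluates can be shown schematically. Everything below is synthetic (a toy random feature extractor standing in for pre-trained convolutional layers, and synthetic data standing in for the mammography set), not the paper's CNN pipeline:

```python
import math
import random

random.seed(0)

# "Pre-trained" feature extractor: stands in for layers learned on a large
# source dataset (film mammograms or natural images in the study). It is
# kept frozen during fine-tuning.
W_pre = [[random.gauss(0, 1) for _ in range(4)] for _ in range(8)]

def features(x):
    return [math.tanh(sum(xi * wij for xi, wij in zip(x, col)))
            for col in zip(*W_pre)]

# Tiny synthetic binary-classification task standing in for the target set.
X = [[random.gauss(0, 1) for _ in range(8)] for _ in range(100)]
y = [1.0 if x[0] + 0.5 * x[1] > 0 else 0.0 for x in X]

w = [0.0] * 4  # new classification head, trained from scratch on the target data

def predict(x):
    z = sum(f * wi for f, wi in zip(features(x), w))
    return 1 / (1 + math.exp(-z))

def loss():
    return -sum(yi * math.log(predict(xi) + 1e-9) +
                (1 - yi) * math.log(1 - predict(xi) + 1e-9)
                for xi, yi in zip(X, y)) / len(X)

l0 = loss()
for _ in range(300):  # fine-tune only the head with per-sample gradient steps
    for xi, yi in zip(X, y):
        p, f = predict(xi), features(xi)
        w = [wi - 0.1 * (p - yi) * fi for wi, fi in zip(w, f)]
l1 = loss()
```

The design point this illustrates is the one the abstract reports: reusing representations learned elsewhere and fitting only the final classifier on the (smaller) target dataset.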

  3. Time Course of Improvements in Power Characteristics in Elite Development Netball Players Entering a Full-Time Training Program.

    PubMed

    McKeown, Ian; Chapman, Dale W; Taylor, Kristie Lee; Ball, Nick B

    2016-05-01

    We describe the time course of adaptation to structured resistance training on entering a full-time high-performance sport program. Twelve international-caliber female netballers (aged 19.9 ± 0.4 years) were monitored for 18 weeks with countermovement jumps (CMJ: performed with body weight and 15 kg) and drop jumps (0.35-m box at body weight) at the start of each training week. Performance did not improve linearly or concurrently: loaded CMJ power improved 11% by week 5 (effect size [ES] 0.93 ± 0.72), whereas substantial positive changes were observed for unloaded CMJ power (12%; ES 0.78 ± 0.39) and CMJ velocity (unloaded: 7.1%; ES 0.66 ± 0.34; loaded: 7.5%; ES 0.90 ± 0.41) by week 7. Over the investigation duration, large improvements were observed in unloaded CMJ power (24%; ES 1.45 ± 1.11) and velocity (12%; ES 1.13 ± 0.76). Loaded CMJ power also showed a large improvement (19%; ES 1.49 ± 0.97), but only moderate changes were observed for loaded CMJ velocity (8.4%; ES 1.01 ± 0.67). Jump-height changes in either unloaded or loaded CMJ were unclear over the 18-week period. Drop-jump performance improved throughout the investigation period, with moderate positive changes in reactive strength index observed (35%; ES 0.97 ± 0.69). The adaptation response to a structured resistance training program does not occur linearly in young female athletes. Caution should be taken if assessing jump height only, as this will provide a biased observation of the training response. Frequently assessing CMJ performance can aid program-design coaching decisions to ensure improvements are seen past the initial neuromuscular learning phase of performance training.

  4. NAEP Validity Studies: Improving the Information Value of Performance Items in Large Scale Assessments. Working Paper No. 2003-08

    ERIC Educational Resources Information Center

    Pearson, P. David; Garavaglia, Diane R.

    2003-01-01

    The purpose of this essay is to explore both what is known and what needs to be learned about the information value of performance items "when they are used in large scale assessments." Within the context of the National Assessment of Educational Progress (NAEP), there is substantial motivation for answering these questions. Over the…

  5. A Global Evaluation of Coral Reef Management Performance: Are MPAs Producing Conservation and Socio-Economic Improvements?

    NASA Astrophysics Data System (ADS)

    Hargreaves-Allen, Venetia; Mourato, Susana; Milner-Gulland, Eleanor Jane

    2011-04-01

    There is a consensus that Marine Protected Area (MPA) performance needs regular evaluation against clear criteria, incorporating counterfactual comparisons of ecological and socio-economic performance. However, these evaluations are scarce at the global level. We compiled self-reports from managers and researchers of 78 coral reef-based MPAs world-wide on the conservation and welfare improvements that their MPAs provide. We developed a suite of performance measures including fulfilment of design and management criteria, achievement of aims, the cessation of banned or destructive activities, change in threats, and measurable ecological and socio-economic changes in outcomes, which we evaluated with respect to the MPA's age, geographical location and main aims. The sample was found to be broadly representative of MPAs generally, and suggests that many MPAs do not achieve certain fundamental aims, including improvements in coral cover over time (in 25% of MPAs) and conflict reduction (in 25%). However, the large majority demonstrated improvements in terms of slowing coral loss, reducing destructive uses and increasing tourism and local employment, despite many being small, underfunded and facing multiple large-scale threats beyond the control of managers. However, spatial comparisons suggest that in some regions MPAs are simply mirroring outside changes, which demonstrates the importance of testing for additionality. MPA benefits do not appear to increase linearly over time. In combination with other management efforts and regulations, especially those relating to large-scale threat reduction and targeted fisheries and conflict resolution instruments, MPAs are an important tool to achieve coral reef conservation globally. Given greater resources and changes which incorporate best available science, such as larger MPAs and no-take areas, networks and reduced user pressure, it is likely that performance could be further enhanced.
Performance evaluation should test for the generation of additional ecological and socio-economic improvements over time and compared to unmanaged areas as part of an adaptive management regime.

  6. Improving Service Management in the Internet of Things

    PubMed Central

    Sammarco, Chiara; Iera, Antonio

    2012-01-01

    In the Internet of Things (IoT) research arena, many efforts are devoted to adapting existing IP standards to emerging IoT nodes. This is the direction followed by three Internet Engineering Task Force (IETF) Working Groups, which paved the way for research on IP-based constrained networks. Through a simplification of the whole TCP/IP stack, resource-constrained nodes become direct interlocutors of application-level entities at every point of the network. In this paper we analyze some side effects of this solution in the presence of large amounts of data to transmit. In particular, we conduct a performance analysis of the Constrained Application Protocol (CoAP), a widely accepted web transfer protocol for the Internet of Things, and propose a service management enhancement that improves the exploitation of network and node resources. It is specifically designed for constrained nodes in the abovementioned conditions and proves able to significantly improve the nodes' energy performance in the presence of large resource representations (hence, large data transmissions).
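The "large resource representation" problem described here is what CoAP's block-wise transfer mechanism (RFC 7959) addresses: large payloads are split into numbered blocks of 2^(SZX+4) bytes, each carrying a more-blocks flag. A minimal sketch of that splitting rule (not the paper's proposed enhancement):

```python
def split_blocks(payload, szx=6):
    """Split a large resource representation into CoAP block2-style tuples
    (NUM, M, block): block size is 2**(szx + 4) bytes (szx=6 -> 1024) and
    the more-flag M is True on every block except the last."""
    size = 2 ** (szx + 4)
    nblocks = max(1, -(-len(payload) // size))  # ceiling division
    return [(num,
             (num + 1) * size < len(payload),   # more blocks follow?
             payload[num * size:(num + 1) * size])
            for num in range(nblocks)]

blocks = split_blocks(b"x" * 3000)  # 3000-byte representation, 1024-byte blocks
```

Each block fits a constrained node's buffers and a single link-layer frame budget, which is why block size (and hence transmission count) drives the energy behavior the abstract analyzes.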

  7. Colloquium on Large Scale Improvement: Implications for AISI

    ERIC Educational Resources Information Center

    McEwen, Nelly, Ed.

    2008-01-01

    The Alberta Initiative for School Improvement (AISI) is a province-wide partnership program whose goal is to improve student learning and performance by fostering initiatives that reflect the unique needs and circumstances of each school authority. It is currently ending its third cycle and ninth year of implementation. "The Colloquium on…

  8. Electrolytes for Use in High Energy Lithium-ion Batteries with Wide Operating Temperature Range

    NASA Technical Reports Server (NTRS)

    Smart, Marshall C.; Ratnakumar, B. V.; West, W. C.; Whitcanack, L. D.; Huang, C.; Soler, J.; Krause, F. C.

    2012-01-01

    Met programmatic milestones for the program. Demonstrated improved performance with wide-operating-temperature electrolytes containing ester co-solvents (i.e., methyl butyrate) and electrolyte additives in A123 prototype cells: previously demonstrated excellent low-temperature performance, including 11C rates at -30 C and the ability to perform well down to -60 C. Excellent cycle life at room temperature has been displayed, with over 5,000 cycles demonstrated. Good high-temperature cycle life performance has also been achieved. Demonstrated improved performance with methyl propionate-containing electrolytes in large-capacity prototype cells: demonstrated the wide-operating-temperature-range capability in large cells (12 Ah), successfully scaling up the technology from 0.25 Ah cells. Demonstrated improved performance at low temperature and good cycle life at 40 C with a methyl propionate-based electrolyte containing increasing FEC content and the use of LiBOB as an additive. Utilized three-electrode cells to investigate the electrochemical characteristics of high-voltage systems coupled with wide-operating-temperature-range electrolytes: from Tafel polarization measurements on each electrode, it is evident that the NMC-based cathode displays poor lithium kinetics (being the limiting electrode). The MB-based formulations containing LiBOB delivered the best rate capability at low temperature, which is attributed to improved cathode kinetics, whereas the use of lithium oxalate as an additive led to the highest reversible capacity and lower irreversible losses.

  9. Improving the Academic Achievement of Third and Fourth Grade Underachievers as a Result of Improved Self-Esteem.

    ERIC Educational Resources Information Center

    Coakley, Barbara Fairfax

    This study was designed to improve the academic achievement of 35 third- and fourth-grade underachievers through improved self-esteem. Specific goals included focusing on self-concept and learning skills reinforcement, with the ultimate goal of increasing academic performance and motivation. Large group sessions with students focused on…

  10. Preliminary evaluation of the Community Multiscale Air Quality model for 2002 over the Southeastern United States.

    PubMed

    Morris, Ralph E; McNally, Dennis E; Tesche, Thomas W; Tonnesen, Gail; Boylan, James W; Brewer, Patricia

    2005-11-01

    The Visibility Improvement State and Tribal Association of the Southeast (VISTAS) is one of five Regional Planning Organizations charged with the management of haze, visibility, and other regional air quality issues in the United States. The VISTAS Phase I work effort modeled three episodes (January 2002, July 1999, and July 2001) to identify the optimal model configuration(s) to be used for the 2002 annual modeling in Phase II. Using model configurations recommended in the Phase I analysis, 2002 annual meteorological (Mesoscale Meteorological Model [MM5]), emissions (Sparse Matrix Operator Kernel Emissions [SMOKE]), and air quality (Community Multiscale Air Quality [CMAQ]) simulations were performed on a 36-km grid covering the continental United States and a 12-km grid covering the Eastern United States. Model estimates were then compared against observations. This paper presents the results of the preliminary CMAQ model performance evaluation for the initial 2002 annual base case simulation. Model performance is presented for the Eastern United States using speciated fine particle concentration and wet deposition measurements from several monitoring networks. Initial results indicate fairly good performance for sulfate, with fractional bias values generally within +/-20%. Nitrate is overestimated in the winter by approximately +50% and underestimated in the summer by more than -100%. Organic carbon exhibits a large summer underestimation bias of approximately -100%, with much improved performance seen in the winter with a bias near zero. Performance for elemental carbon is reasonable, with fractional bias values within +/-40%. Other fine particulate (soil) and coarse particulate matter exhibit large (80-150%) overestimation in the winter but improved performance in the summer.
The preliminary 2002 CMAQ runs identified several areas of enhancement to improve model performance, including revised temporal allocation factors for ammonia emissions to improve nitrate performance and addressing missing processes in the secondary organic aerosol module to improve OC performance.
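The fractional bias statistic quoted throughout this record is commonly defined as the mean of 2(M−O)/(M+O) over paired model/observation values, expressed in percent. A short sketch of that standard formulation (the paper may use a variant):

```python
def fractional_bias(model, obs):
    """Mean fractional bias in percent: 2*(M - O)/(M + O) averaged over
    all model/observation pairs. Bounded in [-200%, +200%], symmetric
    between over- and underestimation."""
    pairs = list(zip(model, obs))
    return 100.0 * sum(2 * (m - o) / (m + o) for m, o in pairs) / len(pairs)

fb_perfect = fractional_bias([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])  # no bias
fb_over = fractional_bias([3.0], [1.0])                         # overestimation
```

The boundedness is why biases like "more than -100%" remain meaningful even when the model badly underestimates observations.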

  11. Existing methods for improving the accuracy of digital-to-analog converters

    NASA Astrophysics Data System (ADS)

    Eielsen, Arnfinn A.; Fleming, Andrew J.

    2017-09-01

    The performance of digital-to-analog converters is principally limited by errors in the output voltage levels. Such errors are known as element mismatch and are quantified by the integral non-linearity. Element mismatch limits the achievable accuracy and resolution in high-precision applications as it causes gain and offset errors, as well as harmonic distortion. In this article, five existing methods for mitigating the effects of element mismatch are compared: physical level calibration, dynamic element matching, noise-shaping with digital calibration, large periodic high-frequency dithering, and large stochastic high-pass dithering. These methods are suitable for improving accuracy when using digital-to-analog converters that use multiple discrete output levels to reconstruct time-varying signals. The methods improve linearity and therefore reduce harmonic distortion and can be retrofitted to existing systems with minor hardware variations. The performance of each method is compared theoretically and confirmed by simulations and experiments. Experimental results demonstrate that three of the five methods provide significant improvements in the resolution and accuracy when applied to a general-purpose digital-to-analog converter. As such, these methods can directly improve performance in a wide range of applications including nanopositioning, metrology, and optics.
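The large-periodic-dithering idea from this record can be illustrated with a toy mismatched DAC. The mismatch magnitudes, dither amplitude, and resolution below are arbitrary, and this is a behavioral sketch rather than any of the article's experimental setups:

```python
import random

random.seed(1)

# A 4-bit DAC whose 16 output levels deviate from ideal (element mismatch).
ideal = [k / 15 for k in range(16)]
levels = [v + random.uniform(-0.02, 0.02) for v in ideal]

def dac(code):
    return levels[max(0, min(15, code))]

def reconstruct(x, amp=0.0, n=64):
    """Add a large periodic triangular dither before quantization and
    average over one dither period; the dither spreads each sample across
    many mismatched levels so their individual errors average out."""
    acc = 0.0
    for i in range(n):
        d = amp * (1 - 4 * abs(i / n - 0.5))  # triangle wave in [-amp, amp]
        acc += dac(round((x + d) * 15)) - d   # subtract the dither after the DAC
    return acc / n

# Compare RMS reconstruction error over a sweep, with and without dithering.
xs = [0.2 + 0.6 * k / 49 for k in range(50)]
rms_plain = (sum((reconstruct(x) - x) ** 2 for x in xs) / len(xs)) ** 0.5
rms_dith = (sum((reconstruct(x, amp=0.2) - x) ** 2 for x in xs) / len(xs)) ** 0.5
```

Choosing the dither amplitude as an integer number of LSBs (here 0.2 = 3/15) lets the subtractive dither also average out quantization error, so the residual is dominated by the (reduced) mismatch contribution.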

  12. An exploratory, large-scale study of pain and quality of life outcomes in cancer patients with moderate or severe pain, and variables predicting improvement.

    PubMed

    Maximiano, Constanza; López, Iker; Martín, Cristina; Zugazabeitia, Luis; Martí-Ciriquián, Juan L; Núñez, Miguel A; Contreras, Jorge; Herdman, Michael; Traseira, Susana; Provencio, Mariano

    2018-01-01

    There have been few large-scale, real world studies in Spain to assess change in pain and quality of life (QoL) outcomes in cancer patients with moderate to severe pain. This study aimed to assess changes on both outcomes after 3 months of usual care and to investigate factors associated with change in QoL. Large, multi-centre, observational study in patients with lung, head and neck, colorectal or breast cancer experiencing a first episode of moderate to severe pain while attending one of the participating centres. QoL was assessed using the EuroQol-5D questionnaire and pain using the Brief Pain Inventory (BPI). Instruments were administered at baseline and after 3 months of follow-up. Multivariate analyses were used to assess the impact of treatment factors, demographic and clinical variables, pain and other symptoms on QoL scores. 1711 patients were included for analysis. After 3 months of usual care, a significant improvement was observed in pain and QoL in all four cancer groups (p<0.001). Effect sizes were medium to large on the BPI and EQ-5D Index and Visual Analogue Scale (VAS). Improvements were seen on the majority of EQ-5D dimensions in all patient groups, though breast cancer patients showed the largest gains. Poorer baseline performance status (ECOG) and the presence of anxiety/depression were associated with significantly poorer QoL outcomes. Improvements in BPI pain scores were associated with improved QoL. In the four cancer types studied, pain and QoL outcomes improved considerably after 3 months of usual care. Improvements in pain made a substantial contribution to QoL gains whilst the presence of anxiety and depression and poor baseline performance status significantly constrained improvement.

  13. Feedback associated with expectation for larger-reward improves visuospatial working memory performances in children with ADHD.

    PubMed

    Hammer, Rubi; Tennekoon, Michael; Cooke, Gillian E; Gayda, Jessica; Stein, Mark A; Booth, James R

    2015-08-01

    We tested the interactive effect of feedback and reward on visuospatial working memory in children with ADHD. Seventeen boys with ADHD and 17 Normal Control (NC) boys underwent functional magnetic resonance imaging (fMRI) while performing four visuospatial 2-back tasks that required monitoring the spatial location of letters presented on a display. Tasks varied in reward size (large; small) and feedback availability (no-feedback; feedback). While the performance of NC boys was high in all conditions, boys with ADHD exhibited higher performance (similar to those of NC boys) only when they received feedback associated with large-reward. The performance pattern in both groups was mirrored by neural activity in an executive function neural network comprising a few distinct frontal brain regions. Specifically, neural activity in the left and right middle frontal gyri of boys with ADHD became normal-like only when feedback was available, mainly when feedback was associated with large-reward. When feedback was associated with small-reward, or when large-reward was expected but feedback was not available, boys with ADHD exhibited altered neural activity in the medial orbitofrontal cortex and anterior insula. This suggests that contextual support normalizes activity in executive brain regions in children with ADHD, which results in improved working memory. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. A Continuous Quality Improvement Airway Program Results in Sustained Increases in Intubation Success.

    PubMed

    Olvera, David J; Stuhlmiller, David F E; Wolfe, Allen; Swearingen, Charles F; Pennington, Troy; Davis, Daniel P

    2018-02-21

    Airway management is a critical skill for air medical providers, including the use of rapid sequence intubation (RSI) medications. Mediocre success rates and a high incidence of complications have challenged air medical providers to strengthen training and performance improvement efforts in order to improve clinical performance. The aim of this research was to describe the experience with a novel, integrated advanced airway management program across a large air medical company and explore the impact of the program on improvement in RSI success. The Helicopter Advanced Resuscitation Training (HeART) program was implemented across 160 bases in 2015. The HeART program includes a novel conceptual framework based on thorough understanding of physiology, critical thinking using a novel algorithm, difficult airway predictive tools, training in the optimal use of specific airway techniques and devices, and integrated performance improvement efforts to address opportunities for improvement. The C-MAC video/direct laryngoscope and high-fidelity human patient simulation laboratories were implemented during the study period. Chi-square test for trend was used to evaluate for improvements in airway management and RSI success (overall intubation success, first-attempt success, first-attempt success without desaturation) over the 25-month study period following HeART implementation. A total of 5,132 patients underwent RSI during the study period. Improvements in first-attempt intubation success (85% to 95%, p < 0.01) and first-attempt success without desaturation (84% to 94%, p < 0.01) were observed. Overall intubation success increased from 95% to 99% over the study period, but the trend was not statistically significant (p = 0.311). An integrated advanced airway management program was successful in improving RSI intubation performance in a large air medical company.

  15. Improved cache performance in Monte Carlo transport calculations using energy banding

    NASA Astrophysics Data System (ADS)

    Siegel, A.; Smith, K.; Felker, K.; Romano, P.; Forget, B.; Beckman, P.

    2014-04-01

    We present an energy banding algorithm for Monte Carlo (MC) neutral particle transport simulations which depend on large cross section lookup tables. In MC codes, read-only cross section data tables are accessed frequently, exhibit poor locality, and are typically too large to fit in fast memory. Thus, performance is often limited by long latencies to RAM, or by off-node communication latencies when the data footprint is very large and must be decomposed on a distributed memory machine. The proposed energy banding algorithm allows maximal temporal reuse of data in band sizes that can flexibly accommodate different architectural features. The energy banding algorithm is general and has a number of benefits compared to the traditional approach. In the present analysis we explore its potential to achieve improvements in time-to-solution on modern cache-based architectures.
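    A minimal sketch of the banding idea (the uniform band grid and data layout are simplifying assumptions, not the authors' implementation): particles are grouped by energy band, and each band is then processed in a single pass, so only that band's slice of the cross-section table needs to stay cache-resident at a time:

```python
def band_index(energy, e_min, e_max, n_bands):
    """Map an energy to one of n_bands contiguous bands (a uniform grid is
    assumed here; production codes would band on the cross-section grid)."""
    frac = (energy - e_min) / (e_max - e_min)
    return min(n_bands - 1, max(0, int(frac * n_bands)))

def process_by_band(particles, n_bands, e_min, e_max, lookup):
    """Bucket particle energies by band, then do one lookup pass per band so
    each band's slice of the cross-section table is reused while cache-hot."""
    bands = [[] for _ in range(n_bands)]
    for e in particles:
        bands[band_index(e, e_min, e_max, n_bands)].append(e)
    return [lookup(e) for band in bands for e in band]
```

    The trade-off is the cost of the extra bucketing pass against the temporal reuse gained; band sizes can be tuned to a given cache hierarchy, as the abstract notes.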

  16. A case for Redundant Arrays of Inexpensive Disks (RAID)

    NASA Technical Reports Server (NTRS)

    Patterson, David A.; Gibson, Garth; Katz, Randy H.

    1988-01-01

    Increasing performance of CPUs and memories will be squandered if not matched by a similar performance increase in I/O. While the capacity of Single Large Expensive Disks (SLED) has grown rapidly, the performance improvement of SLED has been modest. Redundant Arrays of Inexpensive Disks (RAID), based on the magnetic disk technology developed for personal computers, offers an attractive alternative to SLED, promising improvements of an order of magnitude in performance, reliability, power consumption, and scalability. This paper introduces five levels of RAIDs, giving their relative cost/performance, and compares RAID to an IBM 3380 and a Fujitsu Super Eagle.
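    The redundancy behind most parity-based RAID levels is byte-wise XOR across the data blocks of a stripe; a minimal illustrative sketch (not from the paper):

```python
def parity(blocks):
    """RAID-4/5 style redundancy: byte-wise XOR across equal-sized blocks."""
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, b in enumerate(block):
            out[i] ^= b
    return bytes(out)

def reconstruct(surviving, parity_block):
    """XOR of the surviving blocks and the parity rebuilds the failed block."""
    return parity(list(surviving) + [parity_block])
```

    Because XOR is its own inverse, losing any single disk in a stripe is recoverable from the survivors plus parity, which is how an array of inexpensive disks achieves reliability comparable to a SLED.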

  17. Evolution of magnetic disk subsystems

    NASA Astrophysics Data System (ADS)

    Kaneko, Satoru

    1994-06-01

    The higher recording density of magnetic disks realized today has brought larger storage capacity per unit and smaller form factors. If the required access performance per MB is constant, the performance of large subsystems has to be several times better. This article describes mainly the technology for improving the performance of the magnetic disk subsystems and the prospects of their future evolution. Also considered are 'crosscall pathing' which makes the data transfer channel more effective, 'disk cache' which improves performance coupling with solid state memory technology, and 'RAID' which improves the availability and integrity of disk subsystems by organizing multiple disk drives in a subsystem. As a result, it is concluded that since the performance of the subsystem is dominated by that of the disk cache, maximization of the performance of the disk cache subsystems is very important.

  18. Large-scale fortification of condiments and seasonings as a public health strategy: equity considerations for implementation.

    PubMed

    Zamora, Gerardo; Flores-Urrutia, Mónica Crissel; Mayén, Ana-Lucia

    2016-09-01

    Fortification of staple foods with vitamins and minerals is an effective approach to increase micronutrient intake and improve nutritional status. The specific use of condiments and seasonings as vehicles in large-scale fortification programs is a relatively new public health strategy. This paper underscores equity considerations for the implementation of large-scale fortification of condiments and seasonings as a public health strategy by examining nonexhaustive examples of programmatic experiences and pilot projects in various settings. An overview of conceptual elements in implementation research and equity is presented, followed by an examination of equity considerations for five implementation strategies: (1) enhancing the capabilities of the public sector, (2) improving the performance of implementing agencies, (3) strengthening the capabilities and performance of frontline workers, (4) empowering communities and individuals, and (5) supporting multiple stakeholders engaged in improving health. Finally, specific considerations related to intersectoral action are discussed. Large-scale fortification of condiments and seasonings cannot be a standalone strategy and needs to be implemented with concurrent and coordinated public health strategies, which should be informed by a health equity lens. © 2016 New York Academy of Sciences.

  19. A Novel Approach to Practice-Based Learning and Improvement Using a Web-Based Audit and Feedback Module.

    PubMed

    Boggan, Joel C; Cheely, George; Shah, Bimal R; Heffelfinger, Randy; Springall, Deanna; Thomas, Samantha M; Zaas, Aimee; Bae, Jonathan

    2014-09-01

    Systematically engaging residents in large programs in quality improvement (QI) is challenging. To coordinate a shared QI project in a large residency program using an online tool. A web-based QI tool guided residents through a 2-phase evaluation of performance of foot examinations in patients with diabetes. In phase 1, residents completed reviews of health records with online data entry. Residents were then presented with personal performance data relative to peers and were prompted to develop improvement plans. In phase 2, residents again reviewed personal performance. Rates of performance were compared at the program and clinic levels for each phase, with data presented for residents. Acceptability was measured by the number of residents completing each phase. Feasibility was measured by estimated faculty, programmer, and administrator time and costs. Seventy-nine of 86 eligible residents (92%) completed improvement plans and reviewed 1471 patients in phase 1, whereas 68 residents (79%) reviewed 1054 patient charts in phase 2. Rates of performance of examination increased significantly between phases (from 52% to 73% for complete examination, P < .001). Development of the tool required 130 hours of programmer time. Project analysis and management required 6 hours of administrator and faculty time monthly. An online tool developed and implemented for program-wide QI initiatives successfully engaged residents to participate in QI activities. Residents using this tool demonstrated improvement in a selected quality target. This tool could be adapted by other graduate medical education programs or for faculty development.

  20. Evaluation of Student Reflection as a Route to Improve Oral Communication

    ERIC Educational Resources Information Center

    Mineart, Kenneth P.; Cooper, Matthew E.

    2016-01-01

    This study describes the use of guided self-reflection and peer feedback activities to improve student oral communication in a large ChE class (n ~ 100) setting. Student performance tracked throughout an experimental semester indicated both reflection activities accelerated improvement in oral communication over control; student perception of the…

  1. Extension of wavelength-modulation spectroscopy to large modulation depth for diode laser absorption measurements in high-pressure gases

    NASA Astrophysics Data System (ADS)

    Li, Hejie; Rieker, Gregory B.; Liu, Xiang; Jeffries, Jay B.; Hanson, Ronald K.

    2006-02-01

    Tunable diode laser absorption measurements at high pressures by use of wavelength-modulation spectroscopy (WMS) require large modulation depths for optimum detection of molecular absorption spectra blended by collisional broadening or dense spacing of the rovibrational transitions. Diode lasers have a large and nonlinear intensity modulation when the wavelength is modulated over a large range by injection-current tuning. In addition to this intensity modulation, other laser performance parameters are measured, including the phase shift between the frequency modulation and the intensity modulation. Following published theory, these parameters are incorporated into an improved model of the WMS signal. The influence of these nonideal laser effects is investigated by means of wavelength-scanned WMS measurements as a function of bath gas pressure on rovibrational transitions of water vapor near 1388 nm. Lock-in detection of the magnitude of the 2f signal is performed to remove the dependence on detection phase. We find good agreement between measurements and the improved model developed for the 2f component of the WMS signal. The effects of the nonideal performance parameters of commercial diode lasers are especially important away from the line center of discrete spectra, and these contributions become more pronounced for 2f signals with the large modulation depths needed for WMS at elevated pressures.

  2. Detrending Algorithms in Large Time Series: Application to TFRM-PSES Data

    NASA Astrophysics Data System (ADS)

    del Ser, D.; Fors, O.; Núñez, J.; Voss, H.; Rosich, A.; Kouprianov, V.

    2015-07-01

    Certain instrumental effects and data reduction anomalies introduce systematic errors in photometric time series. Detrending algorithms such as the Trend Filtering Algorithm (TFA; Kovács et al. 2004) have played a key role in minimizing the effects caused by these systematics. Here we present the results obtained after applying the TFA, Savitzky & Golay (1964) detrending algorithms, and the Box Least Square phase-folding algorithm (Kovács et al. 2002) to the TFRM-PSES data (Fors et al. 2013). Tests performed on these data show that by applying these two filtering methods together the photometric RMS is on average improved by a factor of 3-4, with better efficiency towards brighter magnitudes, while applying TFA alone yields an improvement of a factor 1-2. As a result of this improvement, we are able to detect and analyze a large number of stars per TFRM-PSES field which present some kind of variability. Also, after porting these algorithms to Python and parallelizing them, we have improved, even for large data samples, the computational performance of the overall detrending+BLS algorithm by a factor of ~10 with respect to Kovács et al. (2004).
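    A sketch of the Savitzky-Golay detrending step using the classical 5-point quadratic kernel, whose weights (-3, 12, 17, 12, -3)/35 are standard; this is a simplified stand-in for the pipeline above, not the authors' code:

```python
# Classical 5-point quadratic Savitzky-Golay smoothing weights.
SG5 = [-3 / 35, 12 / 35, 17 / 35, 12 / 35, -3 / 35]

def savgol_smooth(y):
    """Smooth the interior points with the 5-point quadratic kernel;
    the two points at each end are left unsmoothed for simplicity."""
    out = list(y)
    for i in range(2, len(y) - 2):
        out[i] = sum(c * y[i + k] for k, c in zip(range(-2, 3), SG5))
    return out

def detrend(y):
    """Subtract the smoothed trend, keeping short-timescale residuals."""
    return [a - b for a, b in zip(y, savgol_smooth(y))]
```

    The kernel reproduces any quadratic exactly, so slowly varying instrumental trends are removed while genuine short-timescale variability survives in the residuals.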

  3. Evaluating and improving the representation of heteroscedastic errors in hydrological models

    NASA Astrophysics Data System (ADS)

    McInerney, D. J.; Thyer, M. A.; Kavetski, D.; Kuczera, G. A.

    2013-12-01

    Appropriate representation of residual errors in hydrological modelling is essential for accurate and reliable probabilistic predictions. In particular, residual errors of hydrological models are often heteroscedastic, with large errors associated with high rainfall and runoff events. Recent studies have shown that using a weighted least squares (WLS) approach - where the magnitude of the residuals is assumed to be linearly proportional to the magnitude of the flow - captures some of this heteroscedasticity. In this study we explore a range of Bayesian approaches for improving the representation of heteroscedasticity in residual errors. We compare several improved formulations of the WLS approach, the well-known Box-Cox transformation and the more recent log-sinh transformation. Our results confirm that these approaches are able to stabilize the residual error variance, and that it is possible to improve the representation of heteroscedasticity compared with the linear WLS approach. We also find generally good performance of the Box-Cox and log-sinh transformations, although as indicated in earlier publications, the Box-Cox transform sometimes produces unrealistically large prediction limits. Our work explores the trade-offs between these different uncertainty characterization approaches, investigates how their performance varies across diverse catchments and models, and recommends practical approaches suitable for large-scale applications.
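    The two transformations compared above can be written compactly. A sketch follows; the parameter names (lam for Box-Cox, a and b for log-sinh) follow common usage in the hydrology literature and are assumptions here, not this paper's notation:

```python
import math

def box_cox(y, lam):
    """Box-Cox transform: z = (y**lam - 1)/lam, with the log limit at lam = 0.
    Compresses large flows, stabilising variance that grows with flow."""
    if lam == 0:
        return math.log(y)
    return (y ** lam - 1.0) / lam

def log_sinh(y, a, b):
    """log-sinh transform, commonly written z = (1/b) * ln(sinh(a + b*y))."""
    return math.log(math.sinh(a + b * y)) / b
```

    Both are monotone, so predictions can be back-transformed to flow space; the log-sinh form behaves logarithmically for small flows and nearly linearly for large ones, which is one reason it can avoid the overly wide prediction limits the Box-Cox transform sometimes produces.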

  4. Analysing the performance of personal computers based on Intel microprocessors for sequence aligning bioinformatics applications.

    PubMed

    Nair, Pradeep S; John, Eugene B

    2007-01-01

    Aligning specific sequences against a very large number of other sequences is a central aspect of bioinformatics. With the widespread availability of personal computers in biology laboratories, sequence alignment is now often performed locally. This makes it necessary to analyse the performance of personal computers for sequence aligning bioinformatics benchmarks. In this paper, we analyse the performance of a personal computer for the popular BLAST and FASTA sequence alignment suites. Results indicate that these benchmarks have a large number of recurring operations and use memory operations extensively. It seems that the performance can be improved with a bigger L1-cache.

  5. Advanced data structures for the interpretation of image and cartographic data in geo-based information systems

    NASA Technical Reports Server (NTRS)

    Peuquet, D. J.

    1986-01-01

    A growing need to use geographic information systems (GIS) to improve the flexibility and overall performance of very large, heterogeneous data bases was examined. The Vaster structure and the Topological Grid structure were compared to test whether such hybrid structures represent an improvement in performance. The use of artificial intelligence in a geographic/earth sciences data base context is being explored. The architecture of the Knowledge Based GIS (KBGIS) has a dual object/spatial data base and a three tier hierarchical search subsystem. Quadtree Spatial Spectra (QTSS) are derived, based on the quadtree data structure, to generate and represent spatial distribution information for large volumes of spatial data.

  6. An Improved Heuristic Method for Subgraph Isomorphism Problem

    NASA Astrophysics Data System (ADS)

    Xiang, Yingzhuo; Han, Jiesi; Xu, Haijiang; Guo, Xin

    2017-09-01

    This paper focuses on the subgraph isomorphism (SI) problem. We present an improved genetic algorithm, a heuristic method to search for the optimal solution. The contribution of this paper is that we design a dedicated crossover algorithm and a new fitness function to measure the evolution process. Experiments show our improved genetic algorithm performs better than other heuristic methods. For a large graph, such as a subgraph of 40 nodes, our algorithm outperforms the traditional tree search algorithms. We find that the performance of our improved genetic algorithm does not decrease as the number of nodes in prototype graphs increases.
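    The abstract does not spell out the fitness function. As a purely hypothetical illustration of what a genetic algorithm for SI can optimize, a natural choice scores a candidate node mapping by the fraction of pattern edges it preserves:

```python
def mapping_fitness(pattern_edges, target_edges, mapping):
    """Score a candidate node mapping by the fraction of pattern edges it
    preserves in the (undirected) target graph; fitness 1.0 means the
    mapping is a subgraph isomorphism."""
    target = set(target_edges) | {(v, u) for u, v in target_edges}
    hits = sum((mapping[u], mapping[v]) in target for u, v in pattern_edges)
    return hits / len(pattern_edges)
```

    Crossover and mutation then operate on the mapping itself (a permutation-like chromosome), and the search terminates when some individual reaches fitness 1.0.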

  7. Is caffeine a cognitive enhancer?

    PubMed

    Nehlig, Astrid

    2010-01-01

    The effects of caffeine on cognition were reviewed based on the large body of literature available on the topic. Caffeine does not usually affect performance in learning and memory tasks, although caffeine may occasionally have facilitatory or inhibitory effects on memory and learning. Caffeine facilitates learning in tasks in which information is presented passively; in tasks in which material is learned intentionally, caffeine has no effect. Caffeine facilitates performance in tasks involving working memory to a limited extent, but hinders performance in tasks that heavily depend on working memory, and caffeine appears rather to improve memory performance under suboptimal alertness conditions. Most studies, however, found improvements in reaction time. The ingestion of caffeine does not seem to affect long-term memory. At low doses, caffeine improves hedonic tone and reduces anxiety, while at high doses, there is an increase in tense arousal, including anxiety, nervousness, and jitteriness. The larger improvement of performance in fatigued subjects confirms that caffeine is a mild stimulant. Caffeine has also been reported to prevent cognitive decline in healthy subjects but the results of the studies are heterogeneous, some finding no age-related effect while others reported effects only in one sex and mainly in the oldest population. In conclusion, it appears that caffeine cannot be considered a 'pure' cognitive enhancer. Its indirect action on arousal, mood and concentration contributes in large part to its cognitive enhancing properties.

  8. Role of substrate quality on IC performance and yields

    NASA Technical Reports Server (NTRS)

    Thomas, R. N.

    1981-01-01

    The development of silicon and gallium arsenide crystal growth for the production of large-diameter substrates is discussed. Large area substrates of significantly improved compositional purity, dopant distribution and structural perfection on a microscopic as well as macroscopic scale are important requirements. The exploratory use of magnetic fields to suppress convection effects in Czochralski crystal growth is addressed. The growth of large crystals in space appears impractical at present; however, efforts to improve substrate quality could benefit from the experience gained in smaller-scale growth experiments conducted in the zero-gravity environment of space.

  9. Mitigating the Impact of Nurse Manager Large Spans of Control.

    PubMed

    Simpson, Brenda Baird; Dearmon, Valorie; Graves, Rebecca

    Nurse managers are instrumental in the achievement of organizational and unit performance goals. Greater spans of control for managers are associated with decreased satisfaction and performance. An interprofessional team measured one organization's nurse manager spans of control, providing administrative assistant support and transformational leadership development to the nurse managers with the largest spans of control. Nurse manager satisfaction and transformational leadership competency improved significantly following the implementation of the large-span-of-control mitigation strategies.

  10. Smith predictor with sliding mode control for processes with large dead times

    NASA Astrophysics Data System (ADS)

    Mehta, Utkal; Kaya, İbrahim

    2017-11-01

    The paper discusses the Smith Predictor scheme with Sliding Mode Controller (SP-SMC) for processes with large dead times. This technique gives improved load-disturbance rejection with optimum input control signal variations. A power rate reaching law is incorporated in the discontinuous part of the sliding mode control such that the overall performance improves meaningfully. The proposed scheme obtains parameter values by satisfying a new performance index based on a bi-objective constraint. In a simulation study, the efficiency of the method is evaluated for robustness and transient performance against previously reported techniques.

  11. Storage media pipelining: Making good use of fine-grained media

    NASA Technical Reports Server (NTRS)

    Vanmeter, Rodney

    1993-01-01

    This paper proposes a new high-performance paradigm for accessing removable media such as tapes and especially magneto-optical disks. In high-performance computing the striping of data across multiple devices is a common means of improving data transfer rates. Striping has been used very successfully for fixed magnetic disks improving overall system reliability as well as throughput. It has also been proposed as a solution for providing improved bandwidth for tape and magneto-optical subsystems. However, striping of removable media has shortcomings, particularly in the areas of latency to data and restricted system configurations, and is suitable primarily for very large I/Os. We propose that for fine-grained media, an alternative access method, media pipelining, may be used to provide high bandwidth for large requests while retaining the flexibility to support concurrent small requests and different system configurations. Its principal drawback is high buffering requirements in the host computer or file server. This paper discusses the possible organization of such a system, including the hardware conditions under which it may be effective, and the flexibility of configuration. Its expected performance is discussed under varying workloads including large single I/Os and numerous smaller ones. Finally, a specific system incorporating a high-transfer-rate magneto-optical disk drive and autochanger is discussed.

  12. Control of Flexible Structures (COFS) Flight Experiment Background and Description

    NASA Technical Reports Server (NTRS)

    Hanks, B. R.

    1985-01-01

    A fundamental problem in designing and delivering large space structures to orbit is to provide sufficient structural stiffness and static configuration precision to meet performance requirements. These requirements are directly related to control requirements and the degree of control system sophistication available to supplement the as-built structure. Background and rationale are presented for a research study in structures, structural dynamics, and controls using a relatively large, flexible beam as a focus. This experiment would address fundamental problems applicable to large, flexible space structures in general and would involve a combination of ground tests, flight behavior prediction, and instrumented orbital tests. Intended to be multidisciplinary but basic within each discipline, the experiment should provide improved understanding and confidence in making design trades between structural conservatism and control system sophistication for meeting static shape and dynamic response/stability requirements. Quantitative results should be obtained for use in improving the validity of ground tests for verifying flight performance analyses.

  13. Superior supercapacitors based on nitrogen and sulfur co-doped hierarchical porous carbon: Excellent rate capability and cycle stability

    NASA Astrophysics Data System (ADS)

    Zhang, Deyi; Han, Mei; Wang, Bing; Li, Yubing; Lei, Longyan; Wang, Kunjie; Wang, Yi; Zhang, Liang; Feng, Huixia

    2017-08-01

    Vastly improving the charge storage capability of supercapacitors without sacrificing their high power density and cycle performance would greatly broaden their application prospects. Herein, we report a nitrogen and sulfur co-doped hierarchical porous carbon (NSHPC) with very superior capacitance performance, fabricated by KOH activation of a nitrogen and sulfur co-doped ordered mesoporous carbon (NSOMC). A high electrochemical double-layer (EDL) capacitance of 351 F g-1 was observed for the NSHPC electrodes, and the capacitance remains at 288 F g-1 even at a large current density of 20 A g-1. Besides the high specific capacitance and outstanding rate capability, a symmetrical supercapacitor cell based on the NSHPC electrodes also exhibits excellent cycling performance, with 95.61% capacitance retention after 5000 charge/discharge cycles. The large surface area produced by KOH activation (2056 m2 g-1), the high utilized surface area owing to the ideal micro/mesopore ratio (2.88), large micropore diameter (1.38 nm) and short, open micropore structure, as well as the enhanced surface wettability induced by N and S heteroatom doping and the improved conductivity induced by KOH activation, were found to be responsible for the very superior capacitance performance.

  14. Irregular large-scale computed tomography on multiple graphics processors improves energy-efficiency metrics for industrial applications

    NASA Astrophysics Data System (ADS)

    Jimenez, Edward S.; Goodman, Eric L.; Park, Ryeojin; Orr, Laurel J.; Thompson, Kyle R.

    2014-09-01

    This paper will investigate energy efficiency for various real-world industrial computed-tomography reconstruction algorithms, in both CPU- and GPU-based implementations. This work shows that the energy required for a given reconstruction depends on performance and problem size. There are many ways to describe performance and energy efficiency, thus this work will investigate multiple metrics, including performance-per-watt, energy-delay product, and energy consumption. This work found that irregular GPU-based approaches [1] realized tremendous savings in energy consumption when compared to CPU implementations while also significantly improving the performance-per-watt and energy-delay product metrics. Additional energy savings and other metric improvements were realized on the GPU-based reconstructions by improving storage I/O through a parallel MIMD-like modularization of the compute and I/O tasks.
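    The three metrics compared in the paper have standard definitions: energy = power x time, performance-per-watt = throughput / power, and energy-delay product (EDP) = energy x delay. A minimal sketch (the numbers in the usage note below are made up for illustration):

```python
def energy_joules(power_watts, runtime_s):
    """Energy consumed at a given average power draw."""
    return power_watts * runtime_s

def perf_per_watt(work_units, power_watts, runtime_s):
    """Throughput achieved per watt of average power."""
    return (work_units / runtime_s) / power_watts

def energy_delay_product(power_watts, runtime_s):
    """EDP = energy * delay; lower is better, and it penalises
    slow-but-frugal runs that plain energy consumption would favour."""
    return energy_joules(power_watts, runtime_s) * runtime_s
```

    For example, a hypothetical GPU run at 250 W for 10 s versus a CPU run at 100 W for 100 s on the same problem: the GPU uses 4x less energy (2500 J vs 10000 J) and has a 40x lower EDP, which is the kind of gap the paper's GPU results illustrate.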

  15. A strategy for company improvement.

    PubMed

    Howley, L

    2000-03-01

    Strategies based on the kaizen methodology are designed to continuously improve company performance without the need for large capital investments. This article looks at how one company used simple kaizen principles to its advantage, achieving a 67% increase in productivity and a 10% reduction in the standard cost of product.

  16. Improving LC-MS sensitivity through increases in chromatographic performance: comparisons of UPLC-ES/MS/MS to HPLC-ES/MS/MS.

    PubMed

    Churchwell, Mona I; Twaddle, Nathan C; Meeker, Larry R; Doerge, Daniel R

    2005-10-25

    Recent technological advances have made available reverse phase chromatographic media with a 1.7 μm particle size along with a liquid handling system that can operate such columns at much higher pressures. This technology, termed ultra performance liquid chromatography (UPLC), offers significant theoretical advantages in resolution, speed, and sensitivity for analytical determinations, particularly when coupled with mass spectrometers capable of high-speed acquisitions. This paper explores the differences in LC-MS performance by conducting a side-by-side comparison of UPLC for several methods previously optimized for HPLC-based separation and quantification of multiple analytes with maximum throughput. In general, UPLC produced significant improvements in method sensitivity, speed, and resolution. Sensitivity increases with UPLC, which were found to be analyte-dependent, were as large as 10-fold and improvements in method speed were as large as 5-fold under conditions of comparable peak separations. Improvements in chromatographic resolution with UPLC were apparent from generally narrower peak widths and from a separation of diastereomers not possible using HPLC. Overall, the improvements in LC-MS method sensitivity, speed, and resolution provided by UPLC show that further advances can be made in analytical methodology to add significant value to hypothesis-driven research.

  17. The architecture of the High Performance Storage System (HPSS)

    NASA Technical Reports Server (NTRS)

    Teaff, Danny; Watson, Dick; Coyne, Bob

    1994-01-01

    The rapid growth in the size of datasets has caused a serious imbalance in I/O and storage system performance and functionality relative to application requirements and the capabilities of other system components. The High Performance Storage System (HPSS) is a scalable, next-generation storage system that will meet the functionality and performance requirements of large-scale scientific and commercial computing environments. Our goal is to improve the performance and capacity of storage by two orders of magnitude or more over what is available in the general or mass marketplace today. We are also providing corresponding improvements in architecture and functionality. This paper describes the architecture and functionality of HPSS.

  18. Incentives, Selection, and Teacher Performance: Evidence from IMPACT

    ERIC Educational Resources Information Center

    Dee, Thomas S.; Wyckoff, James

    2015-01-01

    Teachers in the United States are compensated largely on the basis of fixed schedules that reward experience and credentials. However, there is a growing interest in whether performance-based incentives based on rigorous teacher evaluations can improve teacher retention and performance. The evidence available to date has been mixed at best. This…

  19. Performance optimisations for distributed analysis in ALICE

    NASA Astrophysics Data System (ADS)

    Betev, L.; Gheata, A.; Gheata, M.; Grigoras, C.; Hristov, P.

    2014-06-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlation or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system still has to be improved substantially to satisfy the analysis needs. We have instrumented all analysis jobs with "sensors" collecting comprehensive monitoring information on the job running conditions and performance in order to identify bottlenecks in the data processing flow. These data are collected by the MonALISA-based ALICE Grid monitoring system and are used to steer and improve the job submission and management policy, to identify operational problems in real time and to perform automatic corrective actions. In parallel with an upgrade of our production system, we are aiming for low-level improvements related to data format, data management and merging of results to allow for better-performing ALICE analysis.

  20. Calcium supplementation improves clinical outcome in intensive care unit patients: a propensity score matched analysis of a large clinical database MIMIC-II.

    PubMed

    Zhang, Zhongheng; Chen, Kun; Ni, Hongying

    2015-01-01

    Observational studies have linked hypocalcemia with adverse clinical outcomes in critically ill patients. However, calcium supplementation has never been formally investigated for its beneficial effect in critically ill patients. To investigate whether calcium supplementation can improve 28-day survival in adult critically ill patients, a secondary analysis of a large clinical database consisting of over 30,000 critically ill patients was performed. Multivariable analysis was performed to examine the independent association of calcium supplementation with 28-day mortality. Furthermore, a propensity score matching technique was employed to investigate the role of calcium supplementation in improving survival. Interventions: none. The primary outcome was 28-day mortality; 90-day mortality was used as the secondary outcome. A total of 32,551 adult patients, including 28,062 survivors and 4489 non-survivors (28-day mortality rate: 13.8%), were included. Calcium supplementation was independently associated with improved 28-day mortality after adjusting for confounding variables (hazard ratio: 0.51; 95% CI 0.47-0.56). Propensity score matching was performed, and the after-matching cohort showed well-balanced covariates. The results showed that calcium supplementation was associated with improved 28- and 90-day mortality (p < 0.05 for both, log-rank test). In adult critically ill patients, calcium supplementation during the ICU stay improved 28-day survival. This finding supports the use of calcium supplementation in critically ill patients.
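
    The matching step described above can be sketched in a few lines. This is a generic greedy 1:1 nearest-neighbour match within a caliper, not the authors' actual MIMIC-II procedure, and the propensity scores below are invented toy values:

```python
# Illustrative sketch only: greedy 1:1 nearest-neighbour propensity score
# matching with a caliper, the general technique named in the abstract.
# The scores and the caliper value are made-up toy inputs, not study data.

def match_pairs(treated, control, caliper=0.05):
    """Greedily pair each treated score with the closest unused control
    score, provided the distance is within the caliper."""
    available = sorted(control)
    pairs = []
    for t in sorted(treated):
        if not available:
            break
        best = min(available, key=lambda c: abs(c - t))
        if abs(best - t) <= caliper:
            pairs.append((t, best))
            available.remove(best)
    return pairs

treated_scores = [0.31, 0.47, 0.62, 0.80]
control_scores = [0.30, 0.33, 0.50, 0.61, 0.95]
print(match_pairs(treated_scores, control_scores))
```

    Subjects with no control within the caliper (here the treated score 0.80) are left unmatched, which is how matching trades sample size for covariate balance.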

  1. Large area silicon sheet by EFG

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Progress was made in improving ribbon flatness and reducing stress, and in raising cell performance for 10 cm wide ribbon grown in single-cartridge EFG furnaces. Optimization of growth conditions resulted in improved ribbon thickness uniformity at a thickness of 200 microns, grown at 4 cm/minute, and growth at this target speed is routinely achieved over periods on the order of one hour or more. With the improved ribbon flatness, fabrication of large-area (50 cm2) cells is now possible, and 10 to 11% efficiencies were demonstrated on ribbon grown at 3.5 to 4 cm/minute. Factors limiting the performance of the existing multiple-ribbon furnace were identified, and growth system improvements were implemented to help raise throughput rates and the time percentage of simultaneous three-ribbon growth. However, it is evident that a major redesign of this furnace would be needed to overcome shortfalls in its ability to achieve the Technical Features Demonstration goals of 1980. It was decided to start construction of a new multiple-ribbon furnace and to incorporate the desired improvements into its design. The construction of this furnace has been completed.

  2. Scale-Up: Improving Large Enrollment Physics Courses

    NASA Astrophysics Data System (ADS)

    Beichner, Robert

    1999-11-01

    The Student-Centered Activities for Large Enrollment University Physics (SCALE-UP) project is working to establish a learning environment that will promote increased conceptual understanding, improved problem-solving performance, and greater student satisfaction, while still maintaining class sizes of approximately 100. We are also addressing the new ABET engineering accreditation requirements for inquiry-based learning along with communication and team-oriented skills development. Results of studies of our latest classroom design, plans for future classroom space, and the current iteration of instructional materials will be discussed.

  3. Improving Design Efficiency for Large-Scale Heterogeneous Circuits

    NASA Astrophysics Data System (ADS)

    Gregerson, Anthony

    Despite increases in logic density, many Big Data applications must still be partitioned across multiple computing devices in order to meet their strict performance requirements. Among the most demanding of these applications is high-energy physics (HEP), which uses complex computing systems consisting of thousands of FPGAs and ASICs to process the sensor data created by experiments at particle accelerators such as the Large Hadron Collider (LHC). Designing such computing systems is challenging due to the scale of the systems, the exceptionally high-throughput and low-latency performance constraints that necessitate application-specific hardware implementations, the requirement that algorithms are efficiently partitioned across many devices, and the possible need to update the implemented algorithms during the lifetime of the system. In this work, we describe our research to develop flexible architectures for implementing such large-scale circuits on FPGAs. In particular, this work is motivated by (but not limited in scope to) high-energy physics algorithms for the Compact Muon Solenoid (CMS) experiment at the LHC. To make efficient use of logic resources in multi-FPGA systems, we introduce Multi-Personality Partitioning, a novel form of the graph partitioning problem, and present partitioning algorithms that can significantly improve resource utilization on heterogeneous devices while also reducing inter-chip connections. To reduce the high communication costs of Big Data applications, we also introduce Information-Aware Partitioning, a partitioning method that analyzes the data content of application-specific circuits, characterizes their entropy, and selects circuit partitions that enable efficient compression of data between chips. We employ our information-aware partitioning method to improve the performance of the hardware validation platform for evaluating new algorithms for the CMS experiment.
Together, these research efforts help to improve the efficiency and decrease the cost of developing the large-scale, heterogeneous circuits needed to enable large-scale applications in high-energy physics and other important areas.
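
    The core idea behind information-aware partitioning, characterizing the entropy of the data crossing a candidate cut so that highly compressible links can be placed between chips, can be illustrated with a toy Shannon-entropy estimate. The bit streams below are invented examples, not CMS trigger data:

```python
# Toy sketch: estimate the Shannon entropy (bits/symbol) of the data on a
# candidate inter-chip link; low-entropy links compress well, so a
# partitioner can prefer cutting them.
from collections import Counter
from math import log2

def entropy_bits(stream):
    """Shannon entropy in bits per symbol of an iterable of symbols."""
    counts = Counter(stream)
    n = len(stream)
    return -sum((c / n) * log2(c / n) for c in counts.values())

sparse_link = "0000100000010000"   # mostly zeros: highly compressible
dense_link  = "0110100110010110"   # balanced bits: ~1 bit/symbol
print(entropy_bits(sparse_link), entropy_bits(dense_link))
```

    A link carrying the sparse stream costs far less than its raw bit rate after compression, while the dense stream is essentially incompressible.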

  4. Feasibility study of a synthesis procedure for array feeds to improve radiation performance of large distorted reflector antennas

    NASA Technical Reports Server (NTRS)

    Stutzman, W. L.; Smith, W. T.

    1990-01-01

    Surface errors on parabolic reflector antennas degrade the overall performance of the antenna. Space antenna structures are difficult to build, deploy and control. They must maintain a nearly perfect parabolic shape in a harsh environment and must be lightweight. Electromagnetic compensation for surface errors in large space reflector antennas can be used to supplement mechanical compensation. Electromagnetic compensation for surface errors in large space reflector antennas has been the topic of several research studies. Most of these studies try to correct the focal plane fields of the reflector near the focal point and, hence, compensate for the distortions over the whole radiation pattern. An alternative approach to electromagnetic compensation is presented. The proposed technique uses pattern synthesis to compensate for the surface errors. The pattern synthesis approach uses a localized algorithm in which pattern corrections are directed specifically towards portions of the pattern requiring improvement. The pattern synthesis technique does not require knowledge of the reflector surface. It uses radiation pattern data to perform the compensation.

  5. Rapid acquisition of novel interface control by small ensembles of arbitrarily selected primary motor cortex neurons

    PubMed Central

    Law, Andrew J.; Rivlis, Gil

    2014-01-01

    Pioneering studies demonstrated that novel degrees of freedom could be controlled individually by directly encoding the firing rate of single motor cortex neurons, without regard to each neuron's role in controlling movement of the native limb. In contrast, recent brain-computer interface work has emphasized decoding outputs from large ensembles that include substantially more neurons than the number of degrees of freedom being controlled. To bridge the gap between direct encoding by single neurons and decoding output from large ensembles, we studied monkeys controlling one degree of freedom by comodulating up to four arbitrarily selected motor cortex neurons. Performance typically exceeded random quite early in single sessions and then continued to improve to different degrees in different sessions. We therefore examined factors that might affect performance. Performance improved with larger ensembles. In contrast, other factors that might have reflected preexisting synaptic architecture—such as the similarity of preferred directions—had little if any effect on performance. Patterns of comodulation among ensemble neurons became more consistent across trials as performance improved over single sessions. Compared with the ensemble neurons, other simultaneously recorded neurons showed less modulation. Patterns of voluntarily comodulated firing among small numbers of arbitrarily selected primary motor cortex (M1) neurons thus can be found and improved rapidly, with little constraint based on the normal relationships of the individual neurons to native limb movement. This rapid flexibility in relationships among M1 neurons may in part underlie our ability to learn new movements and improve motor skill. PMID:24920030

  6. A Novel Approach to Practice-Based Learning and Improvement Using a Web-Based Audit and Feedback Module

    PubMed Central

    Boggan, Joel C.; Cheely, George; Shah, Bimal R.; Heffelfinger, Randy; Springall, Deanna; Thomas, Samantha M.; Zaas, Aimee; Bae, Jonathan

    2014-01-01

    Background Systematically engaging residents in large programs in quality improvement (QI) is challenging. Objective To coordinate a shared QI project in a large residency program using an online tool. Methods A web-based QI tool guided residents through a 2-phase evaluation of performance of foot examinations in patients with diabetes. In phase 1, residents completed reviews of health records with online data entry. Residents were then presented with personal performance data relative to peers and were prompted to develop improvement plans. In phase 2, residents again reviewed personal performance. Rates of performance were compared at the program and clinic levels for each phase, with data presented for residents. Acceptability was measured by the number of residents completing each phase. Feasibility was measured by estimated faculty, programmer, and administrator time and costs. Results Seventy-nine of 86 eligible residents (92%) completed improvement plans and reviewed 1471 patients in phase 1, whereas 68 residents (79%) reviewed 1054 patient charts in phase 2. Rates of performance of examination increased significantly between phases (from 52% to 73% for complete examination, P < .001). Development of the tool required 130 hours of programmer time. Project analysis and management required 6 hours of administrator and faculty time monthly. Conclusions An online tool developed and implemented for program-wide QI initiatives successfully engaged residents to participate in QI activities. Residents using this tool demonstrated improvement in a selected quality target. This tool could be adapted by other graduate medical education programs or for faculty development. PMID:26279782

  7. Whey protein rich in alpha-lactalbumin increases the ratio of plasma tryptophan to the sum of the other large neutral amino acids and improves cognitive performance in stress-vulnerable subjects.

    PubMed

    Markus, C Rob; Olivier, Berend; de Haan, Edward H F

    2002-06-01

    Cognitive performance often declines under chronic stress exposure. The negative effect of chronic stress on performance may be mediated by reduced brain serotonin function. The uptake of the serotonin precursor tryptophan into the brain depends on nutrients that influence the availability of tryptophan by changing the ratio of plasma tryptophan to the sum of the other large neutral amino acids (Trp-LNAA ratio). In addition, a diet-induced increase in tryptophan may increase brain serotonergic activity levels and improve cognitive performance, particularly in high stress-vulnerable subjects. We tested whether alpha-lactalbumin, a whey protein with a high tryptophan content, would increase the plasma Trp-LNAA ratio and improve cognitive performance in high stress-vulnerable subjects. Twenty-three high stress-vulnerable subjects and 29 low stress-vulnerable subjects participated in a double-blind, placebo-controlled, crossover study. All subjects performed a memory-scanning task after the intake of a diet enriched with either alpha-lactalbumin (alpha-lactalbumin diet) or sodium caseinate (control diet). Blood samples were taken to measure the effect of dietary manipulation on the plasma Trp-LNAA ratio. A significantly greater increase in the plasma Trp-LNAA ratio after consumption of the alpha-lactalbumin diet than after the control diet (P = 0.0001) was observed; memory scanning improved significantly only in the high stress-vulnerable subjects (P = 0.019). Because an increase in the plasma Trp-LNAA ratio is considered to be an indirect indication of increased brain serotonin function, the results suggest that dietary protein rich in alpha-lactalbumin improves cognitive performance in stress-vulnerable subjects via increased brain tryptophan and serotonin activities.
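
    The Trp-LNAA ratio used throughout the abstract is a simple quotient: plasma tryptophan over the summed concentrations of the other large neutral amino acids. A minimal sketch, with invented μmol/L concentrations (not study data):

```python
# Sketch of the plasma Trp-LNAA ratio: tryptophan divided by the sum of
# the other large neutral amino acids. All concentration values below
# are illustrative assumptions, not measurements from the study.

def trp_lnaa_ratio(trp, other_lnaa):
    """Ratio of tryptophan to the sum of the other large neutral amino
    acids (e.g., valine, isoleucine, leucine, tyrosine, phenylalanine)."""
    return trp / sum(other_lnaa.values())

plasma = {"Val": 220.0, "Ile": 60.0, "Leu": 120.0, "Tyr": 55.0, "Phe": 50.0}
print(round(trp_lnaa_ratio(50.5, plasma), 3))
```

    A tryptophan-rich protein such as alpha-lactalbumin raises the numerator relative to the denominator, which is the mechanism the study exploits.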

  8. Exploration of genetic and phenotypic diversity within Saccharomyces uvarum for driving strain improvement in winemaking.

    PubMed

    Verspohl, Alexandra; Solieri, Lisa; Giudici, Paolo

    2017-03-01

    The selection and genetic improvement of wine yeast is an ongoing process, since yeast strains should match new technologies in winemaking to satisfy evolving consumer preferences. A large genetic background is the necessary starting point for any genetic improvement programme. For this reason, we collected and characterized a large number of strains belonging to Saccharomyces uvarum. In particular, 70 strains were isolated from cold-stored must samples: they were identified and compared to S. uvarum strains originating from different collections with regard to fermentation profile, spore viability and stress response. The results demonstrate a large biodiversity among the new isolates, with particular emphasis on fermentation performance, genotypes and high spore viability, making the isolates suitable for further genetic improvement programmes. Furthermore, a few of them are competitive with Saccharomyces cerevisiae and, per se, suitable for wine fermentation, due to their resistance to stress, short lag phase and fermentation by-products.

  9. Image Location Estimation by Salient Region Matching.

    PubMed

    Qian, Xueming; Zhao, Yisi; Han, Junwei

    2015-11-01

    Nowadays, the locations of images are widely used in many application scenarios for large geo-tagged image corpora. For images that are not geographically tagged, we estimate their locations with the help of a large geo-tagged image set by content-based image retrieval. In this paper, we exploit the spatial information of useful visual words to improve image location estimation (i.e., content-based image retrieval performance). We propose generating visual word groups by mean-shift clustering. To improve retrieval performance, a spatial constraint is utilized to encode the relative positions of visual words. We generate a position descriptor for each visual word and build a fast indexing structure for the visual word groups. Experiments show the effectiveness of the proposed approach.
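
    The mean-shift grouping step named above can be sketched with a minimal flat-kernel implementation on 2-D keypoint positions. This is a generic stand-in for, and far simpler than, the paper's actual pipeline; the coordinates and bandwidth are toy values:

```python
# Minimal flat-kernel mean shift: each point's mode is shifted to the
# mean of the points within `bandwidth` until it settles; points whose
# modes coincide form one group (here, one visual-word group).

def mean_shift(points, bandwidth=2.0, iters=50):
    modes = [list(p) for p in points]
    for _ in range(iters):
        for i, m in enumerate(modes):
            neigh = [p for p in points
                     if (p[0] - m[0]) ** 2 + (p[1] - m[1]) ** 2 <= bandwidth ** 2]
            modes[i] = [sum(c) / len(neigh) for c in zip(*neigh)]
    # merge modes that converged to (almost) the same location
    groups = []
    for m in modes:
        for g in groups:
            if (g[0] - m[0]) ** 2 + (g[1] - m[1]) ** 2 < 1e-6:
                break
        else:
            groups.append(m)
    return groups

pts = [(0, 0), (1, 0), (0, 1), (10, 10), (11, 10)]
print(len(mean_shift(pts)))  # the two spatial clusters are recovered
```

    Unlike k-means, mean shift needs no preset group count, which suits visual-word occurrences whose number of spatial clusters varies per image.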

  10. Raytheon Stirling/pulse Tube Cryocooler Development

    NASA Astrophysics Data System (ADS)

    Kirkconnell, C. S.; Hon, R. C.; Kesler, C. H.; Roberts, T.

    2008-03-01

    The first generation flight-design Stirling/pulse tube "hybrid" two-stage cryocooler has entered initial performance and environmental testing. The status and early results of the testing are presented. Numerous improvements have been implemented as compared to the preceding brassboard versions to improve performance, extend life, and enhance launch survivability. This has largely been accomplished by incorporating successful flight-design features from the Raytheon Stirling one-stage cryocooler product line. These design improvements are described. In parallel with these mechanical cryocooler development efforts, a third generation electronics module is being developed that will support hybrid Stirling/pulse tube and Stirling cryocoolers. Improvements relative to the second generation design relate to improved radiation hardness, reduced parts count, and improved vibration cancellation capability. Progress on the electronics is also presented.

  11. Transparent and Self-Supporting Graphene Films with Wrinkled- Graphene-Wall-Assembled Opening Polyhedron Building Blocks for High Performance Flexible/Transparent Supercapacitors.

    PubMed

    Li, Na; Huang, Xuankai; Zhang, Haiyan; Li, Yunyong; Wang, Chengxin

    2017-03-22

    Improving mass loading while maintaining high transparency and a large surface area in one self-supporting graphene film is still a challenge. Unfortunately, all of these factors are absolutely essential for enhancing the energy storage performance of transparent supercapacitors for practical applications. To solve this bottleneck problem, we produce a novel self-supporting flexible and transparent graphene film (STF-GF) with wrinkled-wall-assembled opened-hollow polyhedron building units. Taking advantage of the microscopic morphology, the STF-GF exhibits improved mass loading with high transmittance (70.2% at 550 nm), a large surface area (1105.6 m²/g), and good electrochemical performance: a high energy density (552.3 μWh/cm³), a high power density (561.9 mW/cm³), a super-long cycle life, and good cycling stability (capacitance retention of ~94.8% after 20,000 cycles).

  12. Nonlinear Control of Large Disturbances in Magnetic Bearing Systems

    NASA Technical Reports Server (NTRS)

    Jiang, Yuhong; Zmood, R. B.

    1996-01-01

    In this paper, the nonlinear operation of magnetic bearing control methods is reviewed. For large disturbances, the effects of displacement constraints and power amplifier current and di/dt limits on bearing control system performance are analyzed. The operation of magnetic bearings exhibiting self-excited large-scale oscillations has been studied both experimentally and by simulation. The simulation of the bearing system has been extended to include the effects of eddy currents in the actuators, so as to improve the accuracy of the simulation results. The results of these experiments and simulations are compared, and some useful conclusions are drawn for improving bearing system robustness.

  13. Planning of distributed generation in distribution network based on improved particle swarm optimization algorithm

    NASA Astrophysics Data System (ADS)

    Li, Jinze; Qu, Zhi; He, Xiaoyang; Jin, Xiaoming; Li, Tie; Wang, Mingkai; Han, Qiu; Gao, Ziji; Jiang, Feng

    2018-02-01

    Large-scale access of distributed power can relieve current environmental pressure while, at the same time, increasing the complexity and uncertainty of the overall distribution system. Rational planning of distributed power can effectively improve the system voltage level. To this end, the specific impact on distribution network power quality caused by the access of typical distributed power was analyzed and, starting from improvements to the learning factors and the inertia weight, an improved particle swarm optimization algorithm (IPSO) with better local and global search performance was proposed to solve distributed generation planning for the distribution network. Results show that the proposed method can effectively reduce the system network loss and improve the economic performance of system operation with distributed generation.
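
    The kind of "improved" PSO the abstract describes, time-varying learning factors plus a decreasing inertia weight, can be sketched as follows. The sphere objective is a stand-in for the real distribution-network loss model, and all coefficient schedules are illustrative assumptions, not the paper's values:

```python
# Hedged sketch of an inertia-weight/learning-factor PSO variant.
# Objective and all hyperparameters are illustrative, not from the paper.
import random

def ipso(objective, dim=2, n_particles=20, iters=200, seed=1):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    gbest = min(pbest, key=objective)[:]
    for t in range(iters):
        w = 0.9 - 0.5 * t / iters    # inertia weight: 0.9 -> 0.4
        c1 = 2.5 - 2.0 * t / iters   # cognitive factor: 2.5 -> 0.5
        c2 = 0.5 + 2.0 * t / iters   # social factor:    0.5 -> 2.5
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < objective(gbest):
                    gbest = pos[i][:]
    return gbest, objective(gbest)

sphere = lambda x: sum(v * v for v in x)
best, value = ipso(sphere)
print(value)  # close to 0 for this convex test function
```

    Early iterations favour exploration (high inertia, strong cognitive pull); later ones favour exploitation (low inertia, strong social pull), which is the usual motivation for such schedules.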

  14. Factors Important to Success in the Volunteer Long-Term Care Ombudsman Role

    ERIC Educational Resources Information Center

    Nelson, H. Wayne; Hooker, Karen; DeHart, Kimberly N.; Edwards, John A.; Lanning, Kevin

    2004-01-01

    This study found that the satisfaction of one state's largely older volunteers' altruistic, affiliation, and self-improvement motives corresponded to increased organizational loyalty and better performance across several dimensions. Younger volunteers served for shorter periods and were more highly motivated by the "self-improvement" need.…

  15. A Comparison of Service Delivery Models for Special Education Middle School Students Receiving Moderate Intervention Services

    ERIC Educational Resources Information Center

    Jones-Mason, Keely S.

    2012-01-01

    In an effort to improve academic performance for students receiving special education services, a large urban school district in Tennessee has implemented an Integrated Service Delivery Model. The purpose of this study was to compare the performance of students receiving instruction in self-contained classrooms to the performance of students…

  16. Feasibility study of a synthesis procedure for array feeds to improve radiation performance of large distorted reflector antennas

    NASA Technical Reports Server (NTRS)

    Stutzman, W. L.; Takamizawa, K.; Werntz, P.; Lapean, J.; Barts, R.; Shen, B.; Dunn, D.

    1992-01-01

    The topics covered include the following: (1) performance analysis of the Gregorian tri-reflector; (2) design and performance of the type 6 reflector antenna; (3) a new spherical main reflector system design; (4) optimization of reflector configurations using physical optics; (5) radiometric array design; and (6) beam efficiency studies.

  17. Incentives, Selection, and Teacher Performance: Evidence from IMPACT. NBER Working Paper No. 19529

    ERIC Educational Resources Information Center

    Dee, Thomas; Wyckoff, James

    2013-01-01

    Teachers in the United States are compensated largely on the basis of fixed schedules that reward experience and credentials. However, there is a growing interest in whether performance-based incentives based on rigorous teacher evaluations can improve teacher retention and performance. The evidence available to date has been mixed at best. This…

  18. Publicly reported quality-of-care measures influenced Wisconsin physician groups to improve performance

    PubMed Central

    Lamb, Geoffrey C.; Smith, Maureen; Weeks, William B.; Queram, Christopher

    2014-01-01

    Public reporting of performance on quality measures is increasingly common, but little is known about its impact, especially among physician groups. The Wisconsin Collaborative for Healthcare Quality (Collaborative) is a voluntary consortium of physician groups that has publicly reported quality measures since 2004, providing an opportunity to study the effect of this effort on participating groups. Analyses included member performance on 14 ambulatory measures from 2004-2009, a survey regarding reporting and its relationship to improvement efforts, and use of Medicare billing data to independently compare Collaborative members to the rest of Wisconsin, neighboring states, and the rest of the United States. Faced with limited resources, groups prioritized their efforts based on the nature of the measure and their performance compared to others. The outcomes demonstrated that public reporting was associated with improvement in health quality and that large physician group practices will engage in improvement efforts in response. PMID:23459733

  19. UAS in the NAS Project: Large-Scale Communication Architecture Simulations with NASA GRC Gen5 Radio Model

    NASA Technical Reports Server (NTRS)

    Kubat, Gregory

    2016-01-01

    This report provides a description and performance characterization of the large-scale, Relay architecture, UAS communications simulation capability developed for the NASA GRC, UAS in the NAS Project. The system uses a validated model of the GRC Gen5 CNPC, Flight-Test Radio model. Contained in the report is a description of the simulation system and its model components, recent changes made to the system to improve performance, descriptions and objectives of sample simulations used for test and verification, and a sampling and observations of results and performance data.

  20. A Ranking Approach on Large-Scale Graph With Multidimensional Heterogeneous Information.

    PubMed

    Wei, Wei; Gao, Bin; Liu, Tie-Yan; Wang, Taifeng; Li, Guohui; Li, Hang

    2016-04-01

    Graph-based ranking has been extensively studied and frequently applied in many applications, such as webpage ranking. It aims at mining potentially valuable information from raw graph-structured data. Recently, with the proliferation of rich heterogeneous information (e.g., node/edge features and prior knowledge) available in many real-world graphs, how to effectively and efficiently leverage all of this information to improve ranking performance has become a new and challenging problem. Previous methods utilize only part of such information and attempt to rank graph nodes using link-based methods, whose ranking performance is severely affected by several well-known issues, e.g., over-fitting or high computational complexity, especially when the scale of the graph is very large. In this paper, we address the large-scale graph-based ranking problem and focus on how to effectively exploit the rich heterogeneous information of the graph to improve ranking performance. Specifically, we propose an innovative and effective semi-supervised PageRank (SSP) approach to parameterize the derived information within a unified semi-supervised learning framework (SSLF-GR), then simultaneously optimize the parameters and the ranking scores of graph nodes. Experiments on real-world large-scale graphs demonstrate that our method significantly outperforms algorithms that consider such graph information only partially.
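
    As context for the link-based baseline the authors build on, here is a minimal personalized-PageRank power iteration. This is not the paper's SSP algorithm; the tiny graph and the preference vector (standing in for "prior knowledge") are made up:

```python
# Personalized PageRank by power iteration on a small directed graph.
# The preference vector biases the teleport step toward favoured nodes,
# a simple way to inject prior knowledge into a link-based ranking.

def personalized_pagerank(links, prefer, damping=0.85, iters=100):
    nodes = list(links)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {}
        for n in nodes:
            inflow = sum(rank[m] / len(links[m]) for m in nodes if n in links[m])
            new[n] = (1 - damping) * prefer.get(n, 0.0) + damping * inflow
        rank = new
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
scores = personalized_pagerank(graph, prefer={"a": 1.0})
print(max(scores, key=scores.get))
```

    Methods like SSP go further by learning how much weight each piece of heterogeneous information should get, rather than fixing the teleport distribution by hand.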

  1. Performance improvement of a large capacity GM cryocooler

    NASA Astrophysics Data System (ADS)

    Wang, C.; Olesh, A.; Cosco, J.

    2017-12-01

    This paper presents the improvement of a large GM cryocooler, Cryomech model AL600, based on redesigning the cold head stem seal, regenerator, heat exchanger and displacer bumper, as well as optimizing operating parameters. The no-load temperature is reduced from 26.6 K to 23.4 K. The cooling capacity is improved from 615 W to 701 W at 80 K with a power input of 12.5 kW. It achieves the highest relative Carnot efficiency, at 15.4%. The vibration of the AL600 is investigated experimentally. The new displacer bumper significantly reduces the vibration force on the room-temperature flange by 82%, from 520 N to 93 N.
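
    The quoted 15.4% figure is consistent with the standard definition of relative Carnot efficiency, assuming heat rejection near 300 K ambient (an assumption; the abstract does not state the warm-end temperature):

```python
# Quick check of the quoted efficiency figure. The 300 K warm-end
# temperature is an assumed ambient value, not stated in the abstract.

def percent_carnot(q_lift_w, t_cold_k, p_input_w, t_hot_k=300.0):
    """Relative Carnot efficiency: the minimum (Carnot) input power needed
    to lift q_lift_w at t_cold_k, divided by the actual input power."""
    w_carnot = q_lift_w * (t_hot_k - t_cold_k) / t_cold_k
    return 100.0 * w_carnot / p_input_w

print(round(percent_carnot(701.0, 80.0, 12500.0), 1))  # ≈ 15.4
```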

  2. Retrospective single center study of the efficacy of large spot 532 nm laser for the treatment of facial capillary malformations in 44 patients with the use of three-dimensional image analysis.

    PubMed

    Kwiek, Bartłomiej; Rożalski, Michał; Kowalewski, Cezary; Ambroziak, Marcin

    2017-10-01

    We aimed to assess the efficacy of a large-spot 532 nm laser for the treatment of facial capillary malformations with the use of three-dimensional (3D) image analysis. A retrospective single-center study of previously untreated patients with facial capillary malformations (CM) was performed. A total of 44 consecutive Caucasian patients aged 5-66 were included. Patients had 3D photography performed before and after treatment and had at least one session of treatment with a 532 nm neodymium-doped yttrium aluminum garnet (Nd:YAG) laser with contact cooling, fluences ranging from 8 to 11.5 J/cm², pulse durations ranging from 5 to 9 milliseconds, and spot sizes ranging from 5 to 10 mm. An objective analysis of percentage improvement based on 3D digital assessment of combined color and area improvement (global clearance effect [GCE]) was performed. The median maximal improvement achieved during treatment (GCEmax) was 70.4%. The mean number of laser procedures required to achieve this improvement was 7.1 (range: 2 to 14). Improvement of at least 25% (GCE 25) was achieved by all patients, of at least 50% (GCE 50) by 77.3%, of at least 75% (GCE 75) by 38.6%, and of at least 90% (GCE 90) by 13.6%. The large-spot 532 nm laser is highly effective in the treatment of facial CM. 3D color and area image analysis provides an objective method to compare different methods of facial CM treatment in future studies. Lasers Surg. Med. 49:743-749, 2017. © 2017 Wiley Periodicals, Inc.

  3. Profiling and Improving I/O Performance of a Large-Scale Climate Scientific Application

    NASA Technical Reports Server (NTRS)

    Liu, Zhuo; Wang, Bin; Wang, Teng; Tian, Yuan; Xu, Cong; Wang, Yandong; Yu, Weikuan; Cruz, Carlos A.; Zhou, Shujia; Clune, Tom; hide

    2013-01-01

    Exascale computing systems are soon to emerge and will pose great challenges due to the huge gap between computing and I/O performance. Many large-scale scientific applications play an important role in our daily life. The huge amounts of data generated by such applications require highly parallel and efficient I/O management policies. In this paper, we adopt a mission-critical scientific application, GEOS-5, as a case to profile and analyze the communication and I/O issues that are preventing applications from fully utilizing the underlying parallel storage systems. Through detailed architectural and experimental characterization, we observe that current legacy I/O schemes incur significant network communication overheads and are unable to fully parallelize the data access, thus degrading applications' I/O performance and scalability. To address these inefficiencies, we redesign its I/O framework along with a set of parallel I/O techniques to achieve high scalability and performance. Evaluation results on the NASA Discover cluster show that our optimization of GEOS-5 with ADIOS has led to significant performance improvements compared to the original GEOS-5 implementation.

  4. A Large number of fast cosmological simulations

    NASA Astrophysics Data System (ADS)

    Koda, Jun; Kazin, E.; Blake, C.

    2014-01-01

    Mock galaxy catalogs are essential tools to analyze large-scale structure data. Many independent realizations of mock catalogs are necessary to evaluate the uncertainties in the measurements. We perform 3600 cosmological simulations for the WiggleZ Dark Energy Survey to obtain new, improved Baryon Acoustic Oscillation (BAO) cosmic distance measurements using the density-field "reconstruction" technique. We use 1296^3 particles in a periodic box of 600/h Mpc on a side, which is the minimum requirement from the survey volume and observed galaxies. In order to perform such a large number of simulations, we developed a parallel code using the COmoving Lagrangian Acceleration (COLA) method, which can simulate cosmological large-scale structure reasonably well with only 10 time steps. Our simulation is more than 100 times faster than conventional N-body simulations; one COLA simulation takes only 15 minutes with 216 computing cores. We have completed the 3600 simulations with a reasonable computation time of 200,000 core-hours. We also present the results of the revised WiggleZ BAO distance measurement, which are significantly improved by the reconstruction technique.

  5. Strategic partnering to improve community health worker programming and performance: features of a community-health system integrated approach.

    PubMed

    Naimoli, Joseph F; Perry, Henry B; Townsend, John W; Frymus, Diana E; McCaffery, James A

    2015-09-01

    There is robust evidence that community health workers (CHWs) in low- and middle-income countries (LMICs) can improve their clients' health and well-being. The evidence on proven strategies to enhance and sustain CHW performance at scale, however, is limited. Nevertheless, CHW stakeholders need guidance and new ideas, which can emerge from the recognition that CHWs function at the intersection of two dynamic, overlapping systems - the formal health system and the community. Although each typically supports CHWs, their support is not necessarily strategic, collaborative or coordinated. We explore a strategic community-health system partnership as one approach to improving CHW programming and performance in countries with or intending to mount large-scale CHW programmes. To identify the components of the approach, we drew on a year-long evidence synthesis exercise on CHW performance, synthesis records, author consultations, documentation on large-scale CHW programmes published after the synthesis, and other relevant literature. We also established inclusion and exclusion criteria for the components we considered, and examined the challenges and opportunities associated with implementing each component. We identified a minimum package of four strategies that provide opportunities for increased cooperation between communities and health systems, address traditional weaknesses in large-scale CHW programmes, and are feasible to implement at sub-national levels over large geographic areas and among vulnerable populations in the greatest need of care. We postulate that the CHW performance benefits resulting from the simultaneous implementation of all four strategies could outweigh those that either the health system or community could produce independently.
The strategies are (1) joint ownership and design of CHW programmes, (2) collaborative supervision and constructive feedback, (3) a balanced package of incentives, and (4) a practical monitoring system incorporating data from communities and the health system. We believe that strategic partnership between communities and health systems on a minimum package of simultaneously implemented strategies offers the potential for accelerating progress in improving CHW performance at scale. Comparative, retrospective and prospective research can confirm the potential of these strategies. More experience and evidence on strategic partnership can contribute to our understanding of how to achieve sustainable progress in health with equity.

  6. Performance Characterization of Global Address Space Applications: A Case Study with NWChem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hammond, Jeffrey R.; Krishnamoorthy, Sriram; Shende, Sameer

    The use of global address space languages and one-sided communication for complex applications is gaining attention in the parallel computing community. However, lack of good evaluative methods to observe multiple levels of performance makes it difficult to isolate the cause of performance deficiencies and to understand the fundamental limitations of system and application design for future improvement. NWChem is a popular computational chemistry package which depends on the Global Arrays/ARMCI suite for partitioned global address space functionality to deliver high-end molecular modeling capabilities. A workload characterization methodology was developed to support NWChem performance engineering on large-scale parallel platforms. The research involved both the integration of performance instrumentation and measurement in the NWChem software, as well as the analysis of one-sided communication performance in the context of NWChem workloads. Scaling studies were conducted for NWChem on Blue Gene/P and on two large-scale clusters using different generation Infiniband interconnects and x86 processors. The performance analysis and results show how subtle changes in the runtime parameters related to the communication subsystem could have significant impact on performance behavior. The tool has successfully identified several algorithmic bottlenecks which are already being tackled by computational chemists to improve NWChem performance.

  7. 48 CFR 2115.404-71 - Profit analysis factors.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ..., enrollees, beneficiaries, and Congress as measures of economical and efficient contract performance. This..., etc., having viability to the Program at large. Improvements and innovations recognized and rewarded...

  8. 48 CFR 2115.404-71 - Profit analysis factors.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., enrollees, beneficiaries, and Congress as measures of economical and efficient contract performance. This..., etc., having viability to the Program at large. Improvements and innovations recognized and rewarded...

  9. 48 CFR 2115.404-71 - Profit analysis factors.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., enrollees, beneficiaries, and Congress as measures of economical and efficient contract performance. This..., etc., having viability to the Program at large. Improvements and innovations recognized and rewarded...

  10. 48 CFR 2115.404-71 - Profit analysis factors.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ..., enrollees, beneficiaries, and Congress as measures of economical and efficient contract performance. This..., etc., having viability to the Program at large. Improvements and innovations recognized and rewarded...

  11. Improvement of Charge Transportation in Si Quantum Dot-Sensitized Solar Cells Using Vanadium Doped TiO2.

    PubMed

    Seo, Hyunwoong; Ichida, Daiki; Hashimoto, Shinji; Itagaki, Naho; Koga, Kazunori; Shiratani, Masaharu; Nam, Sang-Hun; Boo, Jin-Hyo

    2016-05-01

    The multiple exciton generation characteristics of quantum dots are expected to enhance the performance of photochemical solar cells. In previous work, we first introduced Si quantum dots for sensitized solar cells. The Si quantum dots were fabricated by multi-hollow discharge plasma chemical vapor deposition, and were characterized optically and morphologically. The Si quantum dot-sensitized solar cells had poor performance due to significant electron loss by charge recombination. Although the large Si particle size resulted in the exposure of a large TiO2 surface area, there was a limit to how much the particle size could be decreased due to the reduced absorbance of small particles. Therefore, this work focused on decreasing the internal impedance to improve charge transfer. TiO2 was electronically modified by doping with vanadium, which can improve electron transfer in the TiO2 network, and which is stable in the redox electrolyte. Photogenerated electrons can more easily arrive at the conductive electrode due to the decreased internal impedance. The dark photovoltaic properties confirmed the reduction of charge recombination, and the photon-to-current conversion efficiency reflected the improved electron transfer. Impedance analysis confirmed a decrease in internal impedance and an increased electron lifetime. Consequently, these improvements by vanadium doping enhanced the overall performance of Si quantum dot-sensitized solar cells.

  12. Optimized distributed systems achieve significant performance improvement on sorted merging of massive VCF files.

    PubMed

    Sun, Xiaobo; Gao, Jingjing; Jin, Peng; Eng, Celeste; Burchard, Esteban G; Beaty, Terri H; Ruczinski, Ingo; Mathias, Rasika A; Barnes, Kathleen; Wang, Fusheng; Qin, Zhaohui S

    2018-06-01

    Sorted merging of genomic data is a common data operation necessary in many sequencing-based studies. It involves sorting and merging genomic data from different subjects by their genomic locations. In particular, merging a large number of variant call format (VCF) files is frequently required in large-scale whole-genome sequencing or whole-exome sequencing projects. Traditional single-machine based methods become increasingly inefficient when processing large numbers of files due to the excessive computation time and Input/Output bottleneck. Distributed systems and more recent cloud-based systems offer an attractive solution. However, carefully designed and optimized workflow patterns and execution plans (schemas) are required to take full advantage of the increased computing power while overcoming bottlenecks to achieve high performance. In this study, we custom-design optimized schemas for three Apache big data platforms, Hadoop (MapReduce), HBase, and Spark, to perform sorted merging of a large number of VCF files. These schemas all adopt the divide-and-conquer strategy to split the merging job into sequential phases/stages consisting of subtasks that are conquered in an ordered, parallel, and bottleneck-free way. In two illustrative examples, we test the performance of our schemas on merging multiple VCF files into either a single TPED or a single VCF file, which are benchmarked with the traditional single/parallel multiway-merge methods, message passing interface (MPI)-based high-performance computing (HPC) implementation, and the popular VCFTools. Our experiments suggest all three schemas either deliver a significant improvement in efficiency or render much better strong and weak scalabilities over traditional methods. Our findings provide generalized scalable schemas for performing sorted merging on genetics and genomics data using these Apache distributed systems.
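    The ordered, streaming k-way merge at the heart of the schemas described above can be sketched with Python's standard-library heapq.merge. The chromosome indices, positions, and genotype records below are invented for illustration; they are not the paper's actual data layout.

    ```python
    import heapq

    # Each "file" is a stream of records already sorted by genomic location:
    # (chromosome index, position, sample_id, genotype).
    file_a = [(1, 101, "S1", "0/1"), (1, 250, "S1", "1/1"), (2, 17, "S1", "0/0")]
    file_b = [(1, 99, "S2", "0/0"), (1, 250, "S2", "0/1"), (3, 5, "S2", "1/1")]
    file_c = [(1, 150, "S3", "0/1"), (2, 17, "S3", "0/1")]

    # k-way merge keyed on (chromosome, position); heapq.merge streams the
    # inputs without loading everything into memory, which is the property
    # that makes each subtask bottleneck-free.
    merged = list(heapq.merge(file_a, file_b, file_c, key=lambda r: (r[0], r[1])))

    for record in merged:
        print(record)
    ```

    In the distributed schemas, the genome would first be range-partitioned so that each worker performs an independent merge like this one on its own shard, with the partitions concatenated in order at the end.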

  13. Optimized distributed systems achieve significant performance improvement on sorted merging of massive VCF files

    PubMed Central

    Gao, Jingjing; Jin, Peng; Eng, Celeste; Burchard, Esteban G; Beaty, Terri H; Ruczinski, Ingo; Mathias, Rasika A; Barnes, Kathleen; Wang, Fusheng

    2018-01-01

    Background: Sorted merging of genomic data is a common data operation necessary in many sequencing-based studies. It involves sorting and merging genomic data from different subjects by their genomic locations. In particular, merging a large number of variant call format (VCF) files is frequently required in large-scale whole-genome sequencing or whole-exome sequencing projects. Traditional single-machine based methods become increasingly inefficient when processing large numbers of files due to the excessive computation time and Input/Output bottleneck. Distributed systems and more recent cloud-based systems offer an attractive solution. However, carefully designed and optimized workflow patterns and execution plans (schemas) are required to take full advantage of the increased computing power while overcoming bottlenecks to achieve high performance. Findings: In this study, we custom-design optimized schemas for three Apache big data platforms, Hadoop (MapReduce), HBase, and Spark, to perform sorted merging of a large number of VCF files. These schemas all adopt the divide-and-conquer strategy to split the merging job into sequential phases/stages consisting of subtasks that are conquered in an ordered, parallel, and bottleneck-free way. In two illustrative examples, we test the performance of our schemas on merging multiple VCF files into either a single TPED or a single VCF file, which are benchmarked with the traditional single/parallel multiway-merge methods, message passing interface (MPI)-based high-performance computing (HPC) implementation, and the popular VCFTools. Conclusions: Our experiments suggest all three schemas either deliver a significant improvement in efficiency or render much better strong and weak scalabilities over traditional methods. Our findings provide generalized scalable schemas for performing sorted merging on genetics and genomics data using these Apache distributed systems. PMID:29762754

  14. Metacognition and Transfer: Keys to Improving Marketing Education

    ERIC Educational Resources Information Center

    Ramocki, Stephen P.

    2007-01-01

    A primary purpose of marketing education is to prepare students to perform throughout their careers, and performance largely relies on transferability of knowledge. It has been demonstrated that training in metacognition, along with emphasis on transfer, does lead to increased probability that knowledge will be transferred into environments…

  15. Technology-Enhanced Learning in College Mathematics Remediation

    ERIC Educational Resources Information Center

    Foshee, Cecile M.; Elliott, Stephen N.; Atkinson, Robert K.

    2016-01-01

    US colleges presently face an academic plight; thousands of high school graduates are performing below the expected ability for college-level mathematics. This paper describes an innovative approach intended to improve the mathematics performance of first-year college students, at a large US university. The innovation involved the integration of…

  16. Improving the performance of interferometric imaging through the use of disturbance feedforward.

    PubMed

    Böhm, Michael; Glück, Martin; Keck, Alexander; Pott, Jörg-Uwe; Sawodny, Oliver

    2017-05-01

    In this paper, we present a disturbance compensation technique to improve the performance of interferometric imaging for extremely large ground-based telescopes, e.g., the Large Binocular Telescope (LBT), which serves as the application example in this contribution. The most significant disturbance sources at ground-based telescopes are wind-induced mechanical vibrations in the range of 8-60 Hz. Traditionally, their optical effect is eliminated by feedback systems, such as the adaptive optics control loop combined with a fringe tracking system within the interferometric instrument. In this paper, accelerometers are used to measure the vibrations. These measurements are used to estimate the motion of the mirrors, i.e., tip, tilt and piston, with a dynamic estimator. Additional delay compensation methods are presented to cancel sensor network delays and actuator input delays, improving the estimation result even more, particularly at higher frequencies. Because various instruments benefit from the implementation of telescope vibration mitigation, the estimator is implemented as a separate, independent software on the telescope, publishing the estimated values via multicast on the telescope's ethernet. Every client capable of using and correcting the estimated disturbances can subscribe and use these values in a feedforward for its compensation device, e.g., the deformable mirror, the piston mirror of LINC-NIRVANA, or the fast path length corrector of the Large Binocular Telescope Interferometer. This easy-to-use approach eventually leveraged the presented technology for interferometric use at the LBT and now significantly improves the sky coverage, performance, and operational robustness of interferometric imaging on a regular basis.
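    The core idea of accelerometer-based disturbance feedforward can be reduced to a toy model: for a vibration at a known frequency, displacement can be recovered from measured acceleration and negated before it reaches the optical path. This is a minimal sketch, not the LBT estimator; the 20 Hz frequency, 1 micron amplitude, and ideal noise-free sensor are all assumptions for illustration, and the paper's delay-compensation methods are deliberately omitted.

    ```python
    import math

    FREQ_HZ = 20.0          # dominant vibration frequency, within the 8-60 Hz band
    OMEGA = 2 * math.pi * FREQ_HZ
    AMPLITUDE = 1e-6        # 1 micron piston vibration (illustrative)

    def piston_disturbance(t):
        """True optical path disturbance at time t."""
        return AMPLITUDE * math.sin(OMEGA * t)

    def accelerometer(t):
        # For a pure sinusoid, acceleration is -omega^2 times displacement.
        return -OMEGA**2 * piston_disturbance(t)

    def feedforward_command(accel):
        # Estimator reduced to its simplest form: invert the double
        # differentiation at the known frequency, then negate the estimate.
        estimated_displacement = -accel / OMEGA**2
        return -estimated_displacement

    # Residual optical path error with the feedforward applied, sampled at 1 kHz:
    ts = [i * 1e-3 for i in range(1000)]
    residuals = [piston_disturbance(t) + feedforward_command(accelerometer(t))
                 for t in ts]
    max_residual = max(abs(r) for r in residuals)
    print(f"max residual: {max_residual:.3e} m")
    ```

    In practice the cancellation is imperfect: sensor-network and actuator delays shift the phase of the correction, which is why the delay-compensation methods described in the paper matter most at the higher frequencies.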

  17. Rubisco activity and regulation as targets for crop improvement.

    PubMed

    Parry, Martin A J; Andralojc, P John; Scales, Joanna C; Salvucci, Michael E; Carmo-Silva, A Elizabete; Alonso, Hernan; Whitney, Spencer M

    2013-01-01

    Rubisco (ribulose-1,5-bisphosphate (RuBP) carboxylase/oxygenase) enables net carbon fixation through the carboxylation of RuBP. However, some characteristics of Rubisco make it surprisingly inefficient and compromise photosynthetic productivity. For example, Rubisco catalyses a wasteful reaction with oxygen that leads to the release of previously fixed CO2 and NH3 and the consumption of energy during photorespiration. Furthermore, Rubisco is slow and large amounts are needed to support adequate photosynthetic rates. Consequently, Rubisco has been studied intensively as a prime target for manipulations to 'supercharge' photosynthesis and improve both productivity and resource use efficiency. The catalytic properties of Rubiscos from diverse sources vary considerably, suggesting that changes in turnover rate, affinity, or specificity for CO2 can be introduced to improve Rubisco performance in specific crops and environments. While attempts to manipulate plant Rubisco by nuclear transformation have had limited success, modifying its catalysis by targeted changes to its catalytic large subunit via chloroplast transformation have been much more successful. However, this technique is still in need of development for most major food crops including maize, wheat, and rice. Other bioengineering approaches for improving Rubisco performance include improving the activity of its ancillary protein, Rubisco activase, in addition to modulating the synthesis and degradation of Rubisco's inhibitory sugar phosphate ligands. As the rate-limiting step in carbon assimilation, even modest improvements in the overall performance of Rubisco pose a viable pathway for obtaining significant gains in plant yield, particularly under stressful environmental conditions.

  18. CAT 2 - An improved version of Cryogenic Analysis Tools for online and offline monitoring and analysis of large size cryostats

    NASA Astrophysics Data System (ADS)

    Pagliarone, C. E.; Uttaro, S.; Cappelli, L.; Fallone, M.; Kartal, S.

    2017-02-01

    CAT, Cryogenic Analysis Tools, is a software package developed in the LabVIEW and ROOT environments to analyze the performance of large-size cryostats, where many parameters, inputs, and control variables need to be acquired and studied at the same time. The present paper describes how CAT works and the main improvements achieved in the new version, CAT 2. New graphical user interfaces have been developed to make the full package more user-friendly, and a process of resource optimization has been carried out. Offline analysis of the full cryostat performance is available both through the ROOT command-line interface and through the new graphical interfaces.

  19. Tungsten fiber reinforced superalloys: A status review

    NASA Technical Reports Server (NTRS)

    Petrasek, D. W.; Signorelli, R. A.

    1981-01-01

    Improved performance of heat engines is largely dependent upon maximum cycle temperatures. Tungsten fiber reinforced superalloys (TFRS) are the first of a family of high temperature composites that offer the potential for significantly raising hot component operating temperatures and thus leading to improved heat engine performance. This status review of TFRS research emphasizes the promising property data developed to date, the status of TFRS composite airfoil fabrication technology, and the areas requiring more attention to assure their applicability to hot section components of aircraft gas turbine engines.

  20. Fiber reinforced superalloys

    NASA Technical Reports Server (NTRS)

    Petrasek, Donald W.; Signorelli, Robert A.; Caulfield, Thomas; Tien, John K.

    1987-01-01

    Improved performance of heat engines is largely dependent upon maximum cycle temperatures. Tungsten fiber reinforced superalloys (TFRS) are the first of a family of high temperature composites that offer the potential for significantly raising hot component operating temperatures and thus leading to improved heat engine performance. This status review of TFRS research emphasizes the promising property data developed to date, the status of TFRS composite airfoil fabrication technology, and the areas requiring more attention to assure their applicability to hot section components of aircraft gas turbine engines.

  1. Synthetic thermoelectric materials comprising phononic crystals

    DOEpatents

    El-Kady, Ihab F; Olsson, Roy H; Hopkins, Patrick; Reinke, Charles; Kim, Bongsang

    2013-08-13

    Synthetic thermoelectric materials comprising phononic crystals can simultaneously have a large Seebeck coefficient, high electrical conductivity, and low thermal conductivity. Such synthetic thermoelectric materials can enable improved thermoelectric devices, such as thermoelectric generators and coolers, with improved performance. Such synthetic thermoelectric materials and devices can be fabricated using techniques that are compatible with standard microelectronics.

  2. Understanding quality improvement is more important now than ever before.

    PubMed

    Watkins, R W

    2014-01-01

    With provider payments being adjusted for performance and emphasis being placed on value-based care, large health care systems are already developing the resources necessary to pursue quality improvement (QI) in their practices. This article explains why smaller and/or rural practices also need to learn about and implement QI.

  3. Considering the Efficacy of Web-Based Worked Examples in Introductory Chemistry

    ERIC Educational Resources Information Center

    Crippen, Kent J.; Earl, Boyd L.

    2004-01-01

    Theory suggests that studying worked examples and engaging in self-explanation will improve learning and problem solving. A growing body of evidence supports the use of web-based assessments for improving undergraduate performance in traditional large enrollment courses. This article describes a study designed to investigate these techniques in a…

  4. An Acute Bout of Exercise Improves the Cognitive Performance of Older Adults.

    PubMed

    Johnson, Liam; Addamo, Patricia K; Selva Raj, Isaac; Borkoles, Erika; Wyckelsma, Victoria; Cyarto, Elizabeth; Polman, Remco C

    2016-10-01

    There is evidence that an acute bout of exercise confers cognitive benefits, but it is largely unknown what the optimal mode and duration of exercise is and how cognitive performance changes over time after exercise. We compared the cognitive performance of 31 older adults using the Stroop test before, immediately after, and at 30 and 60 min after a 10 and 30 min aerobic or resistance exercise session. Heart rate and feelings of arousal were also measured before, during, and after exercise. We found that, independent of mode or duration of exercise, the participants improved in the Stroop Inhibition task immediately postexercise. We did not find that exercise influenced the performance of the Stroop Color or Stroop Word Interference tasks. Our findings suggest that an acute bout of exercise can improve cognitive performance and, in particular, the more complex executive functioning of older adults.

  5. Multicore Challenges and Benefits for High Performance Scientific Computing

    DOE PAGES

    Nielsen, Ida M. B.; Janssen, Curtis L.

    2008-01-01

    Until recently, performance gains in processors were achieved largely by improvements in clock speeds and instruction level parallelism. Thus, applications could obtain performance increases with relatively minor changes by upgrading to the latest generation of computing hardware. Currently, however, processor performance improvements are realized by using multicore technology and hardware support for multiple threads within each core, and taking full advantage of this technology to improve the performance of applications requires exposure of extreme levels of software parallelism. We will here discuss the architecture of parallel computers constructed from many multicore chips as well as techniques for managing the complexity of programming such computers, including the hybrid message-passing/multi-threading programming model. We will illustrate these ideas with a hybrid distributed memory matrix multiply and a quantum chemistry algorithm for energy computation using Møller-Plesset perturbation theory.
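    The thread level of the hybrid model described above can be sketched as a row-block-partitioned matrix multiply, with one task per block. This is a pattern sketch only: in CPython the global interpreter lock prevents pure-Python arithmetic from actually running in parallel across threads, and the outer message-passing layer (e.g., MPI scattering distinct row blocks of A, and all of B, to different nodes) is only noted in a comment, not implemented.

    ```python
    from concurrent.futures import ThreadPoolExecutor
    import random

    N = 64          # matrix dimension (illustrative)
    BLOCK = 16      # rows handed to each thread

    random.seed(0)
    A = [[random.random() for _ in range(N)] for _ in range(N)]
    B = [[random.random() for _ in range(N)] for _ in range(N)]

    def multiply_rows(row_start, row_end):
        """Compute one row block of C = A * B (the per-thread work unit)."""
        block = []
        for i in range(row_start, row_end):
            block.append([sum(A[i][k] * B[k][j] for k in range(N))
                          for j in range(N)])
        return row_start, block

    # Thread level of the hybrid model: one task per row block. In the full
    # hybrid scheme, a message-passing layer would first distribute the row
    # blocks across nodes, and each node would run a loop like this one.
    C = [None] * N
    with ThreadPoolExecutor() as pool:
        for start, block in pool.map(lambda s: multiply_rows(s, s + BLOCK),
                                     range(0, N, BLOCK)):
            C[start:start + BLOCK] = block
    ```

    The same decomposition carries over directly when the inner loop is replaced by a native BLAS call, which is where multicore threading pays off in practice.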

  6. Performance Optimization of Marine Science and Numerical Modeling on HPC Cluster

    PubMed Central

    Yang, Dongdong; Yang, Hailong; Wang, Luming; Zhou, Yucong; Zhang, Zhiyuan; Wang, Rui; Liu, Yi

    2017-01-01

    Marine science and numerical modeling (MASNUM) is widely used in forecasting ocean wave movement, through simulating the variation tendency of the ocean wave. Although existing work has improved the performance of MASNUM in various respects, substantial room remains for further performance improvement. In this paper, we aim at improving the performance of the propagation solver and data access during the simulation, in addition to the efficiency of output I/O and load balance. Our optimizations include several effective techniques such as algorithm redesign, load distribution optimization, parallel I/O, and data access optimization. The experimental results demonstrate that our approach achieves higher performance compared to the state-of-the-art work, about 3.5x speedup without degrading the prediction accuracy. In addition, the parameter sensitivity analysis shows our optimizations are effective under various topography resolutions and output frequencies. PMID:28045972

  7. Building high-performance system for processing a daily large volume of Chinese satellites imagery

    NASA Astrophysics Data System (ADS)

    Deng, Huawu; Huang, Shicun; Wang, Qi; Pan, Zhiqiang; Xin, Yubin

    2014-10-01

    The number of Earth observation satellites from China has increased dramatically in recent years, and those satellites are acquiring a large volume of imagery daily. As the main portal for image processing and distribution from those Chinese satellites, the China Centre for Resources Satellite Data and Application (CRESDA) has been working with PCI Geomatics over the last three years to solve two issues in this regard: processing the large volume of data (about 1,500 scenes or 1 TB per day) in a timely manner and generating geometrically accurate orthorectified products. After three years of research and development, a high-performance system has been built and successfully delivered. The system has a service-oriented architecture and can be deployed to a cluster of computers that may be configured with high-end computing power. The high performance is gained by, first, parallelizing image processing algorithms using high-performance graphics processing unit (GPU) cards and multiple cores from multiple CPUs, and, second, distributing processing tasks to a cluster of computing nodes. While achieving thirty or more times faster performance compared with the traditional practice, a particular methodology was developed to improve the geometric accuracy of images acquired from Chinese satellites (including HJ-1 A/B, ZY-1-02C, ZY-3, GF-1, etc.). The methodology consists of fully automatic collection of dense ground control points (GCPs) from various resources and application of those points to improve the photogrammetric model of the images. The delivered system is up and running at CRESDA for pre-operational production and is generating a good return on investment by eliminating a great amount of manual labor and increasing daily data throughput more than tenfold with fewer operators.
Future work, such as development of more performance-optimized algorithms, robust image matching methods and application workflows, is identified to improve the system in the coming years.

  8. Taguchi's technique: an effective method for improving X-ray medical radiographic screen performance.

    PubMed

    Vlachogiannis, J G

    2003-01-01

    Taguchi's technique is a helpful tool for achieving experimental optimization of a large number of decision variables with a small number of off-line experiments. The technique appears to be an ideal tool for improving the performance of X-ray medical radiographic screens under a noise source. Many guides are currently available for improving the efficiency of X-ray medical radiographic screens. These guides can be refined using a second-stage parameter optimization based on Taguchi's technique, selecting the optimum levels of the controllable X-ray radiographic screen factors. A real example of the proposed technique is presented for certain performance criteria. The present research proposes reinforcing X-ray radiography with Taguchi's technique as a novel hardware mechanism.
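    The second-stage parameter selection described above can be sketched with a standard L4 orthogonal array and a larger-is-better signal-to-noise (S/N) ratio. The factor names (screen_thickness, phosphor_grain, backing_layer) and the response values are hypothetical, invented purely to show the mechanics of the method, not taken from the paper.

    ```python
    import math

    # L4 orthogonal array: 3 two-level factors in 4 runs (levels coded 0/1).
    L4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
    FACTORS = ("screen_thickness", "phosphor_grain", "backing_layer")  # hypothetical

    # Screen performance measured twice per run under the noise source
    # (invented values; larger is better).
    responses = {0: [82.0, 80.5], 1: [88.0, 87.0], 2: [75.0, 76.5], 3: [84.0, 83.0]}

    def sn_larger_is_better(ys):
        # Taguchi S/N ratio for larger-is-better: -10 log10( mean(1 / y^2) )
        return -10 * math.log10(sum(1 / y**2 for y in ys) / len(ys))

    sn = [sn_larger_is_better(responses[run]) for run in range(4)]

    # Average the S/N ratio over the runs at each level of each factor,
    # then pick the level with the higher mean (each level occurs twice).
    best = {}
    for f, name in enumerate(FACTORS):
        means = [sum(sn[r] for r in range(4) if L4[r][f] == lvl) / 2
                 for lvl in (0, 1)]
        best[name] = 0 if means[0] >= means[1] else 1
    print(best)
    ```

    The selected level combination would then be confirmed with a follow-up run, since the orthogonal array only samples 4 of the 8 possible combinations.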

  9. Ensemble training to improve recognition using 2D ear

    NASA Astrophysics Data System (ADS)

    Middendorff, Christopher; Bowyer, Kevin W.

    2009-05-01

    The ear has gained popularity as a biometric feature due to the robustness of the shape over time and across emotional expression. Popular methods of ear biometrics analyze the ear as a whole, leaving these methods vulnerable to error due to occlusion. Many researchers explore ear recognition using an ensemble, but none present a method for designing the individual parts that comprise the ensemble. In this work, we introduce a method of modifying the ensemble shapes to improve performance. We determine how different properties of an ensemble training system can affect overall performance. We show that ensembles built from small parts will outperform ensembles built with larger parts, and that incorporating a large number of parts improves the performance of the ensemble.

  10. Origin of the high performance of perovskite solar cells with large grains

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Jian; Shi, Tongfei, E-mail: tongfeishi@gmail.com; Li, Xinhua

    2016-02-01

Due to its excellent carrier-transport characteristics, CH3NH3PbI3 film made of large single-crystal grains is considered key to improving the already remarkable perovskite solar cell (PSC) efficiency. We have used a simple and efficient solvent-vapor annealing method to obtain CH3NH3PbI3 films with grain sizes over 1 μm. PSCs with films of different grain sizes have been fabricated, verifying the potential of large grains for improving solar cell performance. Moreover, the larger-grain films have shown stronger light absorption and more photon-generated carriers under the same illumination. A detailed temperature-dependent PL study has indicated that this originates from the larger radius and lower binding energy of the donor-acceptor pair (DAP) in larger grains, which makes the DAP easy to separate and difficult to recombine.

  11. Cognitive skills, student achievement tests, and schools.

    PubMed

    Finn, Amy S; Kraft, Matthew A; West, Martin R; Leonard, Julia A; Bish, Crystal E; Martin, Rebecca E; Sheridan, Margaret A; Gabrieli, Christopher F O; Gabrieli, John D E

    2014-03-01

    Cognitive skills predict academic performance, so schools that improve academic performance might also improve cognitive skills. To investigate the impact schools have on both academic performance and cognitive skills, we related standardized achievement-test scores to measures of cognitive skills in a large sample (N = 1,367) of eighth-grade students attending traditional, exam, and charter public schools. Test scores and gains in test scores over time correlated with measures of cognitive skills. Despite wide variation in test scores across schools, differences in cognitive skills across schools were negligible after we controlled for fourth-grade test scores. Random offers of enrollment to oversubscribed charter schools resulted in positive impacts of such school attendance on math achievement but had no impact on cognitive skills. These findings suggest that schools that improve standardized achievement-test scores do so primarily through channels other than improving cognitive skills.

  12. Performance-Enhanced Activated Carbon Electrodes for Supercapacitors Combining Both Graphene-Modified Current Collectors and Graphene Conductive Additive.

    PubMed

    Wang, Rubing; Qian, Yuting; Li, Weiwei; Zhu, Shoupu; Liu, Fengkui; Guo, Yufen; Chen, Mingliang; Li, Qi; Liu, Liwei

    2018-05-15

Graphene has been widely used as the active material, conductive agent, binder, or current collector for supercapacitors, due to its large specific surface area, high conductivity, and electron mobility. However, works simultaneously employing graphene as both conductive agent and current collector have rarely been reported. Here, we report improved activated carbon (AC) electrodes (AC@G@NiF/G) that combine chemical vapor deposition (CVD) graphene-modified nickel foam (NiF/G) current collectors with a high-quality few-layer graphene conductive additive in place of carbon black (CB). The synergistic effect of NiF/Gs and the graphene additive makes the performance of AC@G@NiF/G electrodes superior to that of electrodes with CB or with bare nickel foam current collectors. The results show that an optimum few-layer graphene loading exists at around 5 wt %; a larger graphene addition does not perform better. A symmetric supercapacitor assembled from AC@G@NiF/G electrodes exhibits excellent cycling stability. We attribute the improved performance to the graphene-enhanced conductivity of the electrode materials and to NiF/Gs, whose 3D graphene conductive network and lower oxidation largely improve the electrical contact between active materials and current collectors.

  13. Heat transfer enhancement in a lithium-ion cell through improved material-level thermal transport

    NASA Astrophysics Data System (ADS)

    Vishwakarma, Vivek; Waghela, Chirag; Wei, Zi; Prasher, Ravi; Nagpure, Shrikant C.; Li, Jianlin; Liu, Fuqiang; Daniel, Claus; Jain, Ankur

    2015-12-01

While Li-ion cells offer excellent electrochemical performance for several applications, including electric vehicles, they also exhibit poor thermal transport characteristics, resulting in reduced performance, overheating, and thermal runaway. Inadequate heat removal from Li-ion cells originates from poor thermal conductivity within the cell. This paper identifies the rate-limiting material-level process that dominates overall thermal conduction in a Li-ion cell. Results indicate that the thermal characteristics of a Li-ion cell are largely dominated by heat transfer across the cathode-separator interface rather than heat transfer through the materials themselves. This interfacial thermal resistance contributes around 88% of the total thermal resistance in the cell. The measured value of interfacial resistance is close to that obtained from theoretical models that account for weak adhesion and large acoustic mismatch between cathode and separator. Further, to address this problem, an amine-based chemical bridging of the interface is carried out. This is shown to result in a four-times lower interfacial thermal resistance without deterioration in electrochemical performance, thereby increasing the effective thermal conductivity three-fold. This improvement is expected to reduce the peak temperature rise during operation by 60%. By identifying and addressing the material-level root cause of poor thermal transport in Li-ion cells, this work may contribute towards improved thermal performance of Li-ion cells.

  14. Performance-Enhanced Activated Carbon Electrodes for Supercapacitors Combining Both Graphene-Modified Current Collectors and Graphene Conductive Additive

    PubMed Central

    Wang, Rubing; Qian, Yuting; Li, Weiwei; Zhu, Shoupu; Liu, Fengkui; Guo, Yufen; Chen, Mingliang; Li, Qi; Liu, Liwei

    2018-01-01

Graphene has been widely used as the active material, conductive agent, binder, or current collector for supercapacitors, due to its large specific surface area, high conductivity, and electron mobility. However, works simultaneously employing graphene as both conductive agent and current collector have rarely been reported. Here, we report improved activated carbon (AC) electrodes (AC@G@NiF/G) that combine chemical vapor deposition (CVD) graphene-modified nickel foam (NiF/G) current collectors with a high-quality few-layer graphene conductive additive in place of carbon black (CB). The synergistic effect of NiF/Gs and the graphene additive makes the performance of AC@G@NiF/G electrodes superior to that of electrodes with CB or with bare nickel foam current collectors. The results show that an optimum few-layer graphene loading exists at around 5 wt %; a larger graphene addition does not perform better. A symmetric supercapacitor assembled from AC@G@NiF/G electrodes exhibits excellent cycling stability. We attribute the improved performance to the graphene-enhanced conductivity of the electrode materials and to NiF/Gs, whose 3D graphene conductive network and lower oxidation largely improve the electrical contact between active materials and current collectors. PMID:29762528

  15. Challenge for lowering concentration polarization in solid oxide fuel cells

    NASA Astrophysics Data System (ADS)

    Shimada, Hiroyuki; Suzuki, Toshio; Yamaguchi, Toshiaki; Sumi, Hirofumi; Hamamoto, Koichi; Fujishiro, Yoshinobu

    2016-01-01

In the scope of electrochemical phenomena, concentration polarization at electrodes is theoretically inevitable, and lowering the concentration overpotential to improve the performance of electrochemical cells has been a continuing challenge. Electrodes with a highly controlled microstructure, i.e., high porosity and uniform large pores, are therefore essential to achieve high-performance electrochemical cells. In this study, state-of-the-art technology for controlling the microstructure of electrodes has been developed to realize high-performance support electrodes for solid oxide fuel cells (SOFCs). The key is controlling the porosity and pore size distribution to improve gas diffusion, while maintaining the integrity of the electrolyte and the structural strength of actual-sized electrode supports needed for the target application. Planar anode-supported SOFCs developed in this study realize a 5 μm thick dense electrolyte (yttria-stabilized zirconia: YSZ) and an anode substrate (Ni-YSZ) of 53.6 vol.% porosity with a large median pore diameter of 0.911 μm. Electrochemical measurements reveal that the performance of the anode-supported SOFCs improves with increasing anode porosity. This Ni-YSZ anode minimizes the concentration polarization, resulting in a maximum power density of 3.09 W cm-2 at 800 °C using humidified hydrogen fuel without any electrode functional layers.

  16. Adaptive slab laser beam quality improvement using a weighted least-squares reconstruction algorithm.

    PubMed

    Chen, Shanqiu; Dong, LiZhi; Chen, XiaoJun; Tan, Yi; Liu, Wenjin; Wang, Shuai; Yang, Ping; Xu, Bing; Ye, YuTang

    2016-04-10

Adaptive optics is an important technology for improving beam quality in solid-state slab lasers. However, there are uncorrectable aberrations in partial areas of the beam. The criterion of the conventional least-squares reconstruction method makes the zones with small aberrations insensitive and hinders those zones from being further corrected. In this paper, a weighted least-squares reconstruction method is proposed to improve the relative sensitivity of zones with small aberrations and to further improve beam quality. Relatively small weights are applied to the zones with large residual aberrations. Comparisons of results show that peak intensity in the far field improved from 1242 analog-digital units (ADU) to 2248 ADU, and beam quality β improved from 2.5 to 2.0. This indicates that the weighted least-squares method performs better than the conventional least-squares reconstruction method when there are large zonal uncorrectable aberrations in the slab laser system.
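A minimal numerical sketch of the weighting idea (a one-dimensional toy fit with invented data, not the authors' wavefront-reconstruction code): giving a small weight to a zone with a large uncorrectable residual keeps the fit sensitive to the well-behaved zones.

```python
def wls_line(x, y, w):
    """Weighted least-squares fit of y = a + b*x; returns (a, b)."""
    sw = sum(w)
    mx = sum(wi * xi for wi, xi in zip(w, x)) / sw
    my = sum(wi * yi for wi, yi in zip(w, y)) / sw
    b = (sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, x, y))
         / sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, x)))
    a = my - b * mx
    return a, b

x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [0.0, 1.0, 2.0, 3.0, 10.0]   # last zone: a large uncorrectable aberration
uniform = wls_line(x, y, [1.0] * 5)
down_weighted = wls_line(x, y, [1.0, 1.0, 1.0, 1.0, 0.01])
# with uniform weights the bad zone drags the slope to 2.2;
# down-weighting it restores a slope close to the true 1.0
print(uniform[1], down_weighted[1])
```

The same principle carries over to zonal wavefront reconstruction, where the normal equations take the weighted form (AᵀWA)x = AᵀWb with small diagonal entries of W over the bad zones.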

  17. Building a Lattice for School Leadership: Lessons from England. Policy Brief #15-1

    ERIC Educational Resources Information Center

    Supovitz, Jonathan

    2015-01-01

    The flat structure of American schools is ill-suited to meet today's increasing demands for educational improvement. Even with unprecedented pressure to raise performance, America's schools are still largely organized the way they were a century ago--with a single principal presiding over a largely egg-crated faculty. Is such a thin veneer of…

  18. An improved large-field focusing schlieren system

    NASA Technical Reports Server (NTRS)

    Weinstein, Leonard M.

    1991-01-01

    The analysis and performance of a high-brightness large-field focusing schlieren system is described. The system can be used to examine complex two- and three-dimensional flows. Techniques are described to obtain focusing schlieren through distorting optical elements, to use multiple colors in a time multiplexing technique, and to use diffuse screen holography for three-dimensional photographs.

  19. Learner-Centered Use of Student Response Systems Improves Performance in Large Class Environments

    ERIC Educational Resources Information Center

    Pond, Samuel B., III

    2010-01-01

    At the college level, students often participate in introductory courses with large class enrollments that tend to produce many challenges to effective teaching and learning. Many teachers are concerned that this class environment fails to accommodate higher-level thinking and learning. I offer a brief rationale for why a student-response system…

  20. Large area silicon sheet by EFG

    NASA Technical Reports Server (NTRS)

    1980-01-01

Recent advances toward multiple-ribbon silicon growth stations and improved electronic quality of the silicon are discussed. These advances were made in large measure by studies in which the composition of the gas environment around the meniscus area was varied. By introducing gases such as CO2, CO, and CH4 into this region, reproducible increases in diffusion length and cell performance were realized, with the best large-area (5 cm x 10 cm) cells exceeding 11% efficiency.

  1. Towards Scalable Deep Learning via I/O Analysis and Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pumma, Sarunya; Si, Min; Feng, Wu-Chun

Deep learning systems have been growing in prominence as a way to automatically characterize objects, trends, and anomalies. Given the importance of deep learning systems, researchers have been investigating techniques to optimize such systems. An area of particular interest has been using large supercomputing systems to quickly generate effective deep learning networks: a phase often referred to as “training” of the deep learning neural network. As we scale existing deep learning frameworks—such as Caffe—on these large supercomputing systems, we notice that the parallelism can help improve the computation tremendously, leaving data I/O as the major bottleneck limiting the overall system scalability. In this paper, we first present a detailed analysis of the performance bottlenecks of Caffe on large supercomputing systems. Our analysis shows that the I/O subsystem of Caffe—LMDB—relies on memory-mapped I/O to access its database, which can be highly inefficient on large-scale systems because of its interaction with the process scheduling system and the network-based parallel filesystem. Based on this analysis, we then present LMDBIO, our optimized I/O plugin for Caffe that takes into account the data access pattern of Caffe in order to vastly improve I/O performance. Our experimental results show that LMDBIO can improve the overall execution time of Caffe by nearly 20-fold in some cases.

  2. Race Factors Affecting Performance Times in Elite Long-Track Speed Skating.

    PubMed

    Noordhof, Dionne A; Mulder, Roy C; de Koning, Jos J; Hopkins, Will G

    2016-05-01

Analysis of sport performance can quantify the effects of environmental and other venue-specific factors, in addition to estimating within-athlete variability between competitions, which determines the smallest worthwhile effects. The aim was to analyze elite long-track speed-skating events. Log-transformed performance times were analyzed with a mixed linear model that estimated percentage mean effects for altitude, barometric pressure, type of rink, and competition importance. In addition, coefficients of variation representing residual venue-related differences and within-athlete variability between races within clusters spanning ~8 d were determined. Effects and variability were assessed with magnitude-based inference. A 1000-m increase in altitude resulted in very large mean performance improvements of 2.8% in juniors and 2.1% in seniors. An increase in barometric pressure of 100 hPa resulted in a moderate reduction in performance of 1.1% for juniors but an unclear effect for seniors. Only juniors competed at open rinks, resulting in a very large reduction in performance of 3.4%. Juniors and seniors showed small performance improvements (0.4% and 0.3%) at the more important competitions. After accounting for these effects, residual venue-related variability was still moderate to large. The within-athlete within-cluster race-to-race variability was 0.3-1.3%, with a small difference in variability between male (0.8%) and female juniors (1.0%) and no difference between male and female seniors (both 0.6%). The variability in performance times of skaters is similar to that of athletes in other sports in which air or water resistance limits speed. A performance enhancement of 0.1-0.4% by top-10 athletes is necessary to increase medal-winning chances by 10%.
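The percentage effects above fall out of the log transformation. A minimal sketch of the mechanics (illustrative numbers, not the study's data): a coefficient b on log-transformed time corresponds to a 100·(exp(b) − 1)% change in performance time, which is approximately 100·b% when b is small.

```python
import math

def pct_effect(b):
    """Percent change in the outcome implied by a coefficient b on log(outcome)."""
    return 100.0 * (math.exp(b) - 1.0)

# e.g. a hypothetical -0.028 shift in log(time) per 1000 m of altitude
# corresponds to roughly a 2.8% improvement (shorter time)
print(round(pct_effect(-0.028), 1))  # → -2.8
```

The same back-transformation applies to the coefficients of variation: a residual standard deviation on the log scale reads directly as an approximate percent variability.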

  3. 454th Brookhaven Lecture

    ScienceCinema

    Charles Black

    2017-12-09

Black discusses examples of integrating self-assembly into semiconductor microelectronics, where advances in the ability to define circuit elements at ever-higher resolution have largely fueled more than 40 years of consistent performance improvements.

  4. 48 CFR 1615.404-70 - Profit analysis factors.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... of disputed claims as measures of economical and efficient contract performance. This factor will be..., and practices having viability to the program at large. OPM will not consider improvements and...

  5. 48 CFR 1615.404-70 - Profit analysis factors.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... of disputed claims as measures of economical and efficient contract performance. This factor will be..., and practices having viability to the program at large. OPM will not consider improvements and...

  6. Improving Performance Of Industrial Enterprises With CGT

    NASA Astrophysics Data System (ADS)

    Dolgih, I. N.; Bannova, K. A.; Kuzmina, N. A.; Zdanova, A. B.

    2016-04-01

At present, a decline in the overall efficiency of production activities, especially in machine-building companies, makes it necessary to develop various forms of state support, including the creation of a consolidated taxation system. Such support will help improve the efficiency not only of industrial companies, but will also allow improvement of the economic and social situation in regions where large engineering factories are often city-forming.

  7. A generic simulation model to assess the performance of sterilization services in health establishments.

    PubMed

    Di Mascolo, Maria; Gouin, Alexia

    2013-03-01

The work presented here aims to improve the performance of sterilization services in hospitals. We carried out a survey of a large number of health establishments in the Rhône-Alpes region in France. Based on the results of this survey and a detailed study of a specific service, we have built a generic model. The generic nature of the model relies on a common structure with a high level of detail. This model can be used to improve the performance of a specific sterilization service and/or to dimension its resources. It can also serve for quantitative comparison of performance indicators of various sterilization services.

  8. Auxetic hexachiral structures with wavy ligaments for large elasto-plastic deformation

    NASA Astrophysics Data System (ADS)

    Zhu, Yilin; Wang, Zhen-Pei; Hien Poh, Leong

    2018-05-01

    The hexachiral structure is in-plane isotropic in small deformation. When subjected to large elasto-plastic deformation, however, the hexachiral structure tends to lose its auxeticity and/or isotropy—properties which are desirable in many potential applications. The objective of this study is to improve these two mechanical properties, without significantly compromising the effective yield stress, in the regime with significant material and geometrical nonlinearity effects. It is found that the deformation mechanisms underlying the auxeticity and isotropy properties of a hexachiral structure are largely influenced by the extent of rotation of the central ring in a unit cell. To facilitate the development of this deformation mechanism, an improved design with wavy ligaments is proposed. The improved performance of the proposed hexachiral structure is demonstrated. An initial study on possible applications as a protective material is next carried out, where the improved hexachiral design is shown to exhibit higher specific energy absorption capacity compared to the original design, as well as standard honeycomb structures.

  9. The impact of nudging coefficient for the initialization on the atmospheric flow field and the photochemical ozone concentration of Seoul, Korea

    NASA Astrophysics Data System (ADS)

    Choi, Hyun-Jung; Lee, Hwa Woon; Sung, Kyoung-Hee; Kim, Min-Jung; Kim, Yoo-Keun; Jung, Woo-Sik

In order to correctly incorporate large-scale or local-scale circulation in the model, a nudging term is introduced into the equation of motion. Nudging effects should be included properly in the model to reduce uncertainties and improve the air flow field. To improve the meteorological components, the nudging coefficient should exert an adequate influence over complex terrain in the model initialization technique, which is related to data reliability and error suppression. Several numerical experiments have been undertaken to evaluate the effects on air quality modeling by comparing the performance of the meteorological results across experiments with varying nudging coefficients. All experiments are calculated under their respective upper-wind conditions (synoptic or asynoptic). Consequently, it is important to examine the model response to the nudging of wind and mass information. The MM5-CMAQ model was used to assess the ozone differences in each case during the episode day in Seoul, Korea, and we revealed that there were large differences in the ozone concentration for each run. These results suggest that for the appropriate simulation of large- or small-scale circulations, nudging with synoptic and asynoptic nudging coefficients has a clear advantage over dynamic initialization, so appropriate limitation of these nudging coefficient values according to the upper-wind conditions is necessary before making an assessment. The statistical verifications showed that adequate nudging coefficients for both wind and temperature data throughout the model had a consistently positive impact on the atmospheric and air quality fields. In cases dominated by large-scale circulation, a large nudging coefficient shows only a minor improvement in the atmospheric and air quality fields. However, when small-scale convection is present, a large nudging coefficient produces consistent improvement in the atmospheric and air quality fields.

  10. Accountability for What Matters

    ERIC Educational Resources Information Center

    Rothman, Robert

    2016-01-01

    For more than a decade, states have evaluated school performance largely through a single measure--test scores--and rated schools on whether they improved students' performance in reading or math. The idea was to focus schools' attention on the outcomes that mattered most and to focus states' attention on the schools that needed the most help in…

  11. Program Analyzes Radar Altimeter Data

    NASA Technical Reports Server (NTRS)

    Vandemark, Doug; Hancock, David; Tran, Ngan

    2004-01-01

A computer program has been written to perform several analyses of radar altimeter data. The program was designed to improve on previous methods of analysis of altimeter engineering data by (1) facilitating and accelerating the analysis of large amounts of data in a more direct manner and (2) improving the ability to estimate performance of radar-altimeter instrumentation and provide data corrections. The data in question are openly available to the international scientific community and can be downloaded from anonymous file-transfer-protocol (FTP) locations that are accessible via links from altimetry Web sites. The software estimates noise in range measurements, estimates corrections for electromagnetic bias, and performs statistical analyses on various parameters for comparison of different altimeters. Whereas prior techniques used to perform similar analyses of altimeter range noise require comparison of data from repetitions of satellite ground tracks, the present software uses a high-pass filtering technique to obtain similar results from single satellite passes. Elimination of the requirement for repeat-track analysis facilitates the analysis of large amounts of satellite data to assess subtle variations in range noise.
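The single-pass noise estimate can be sketched as follows (assumed mechanics with synthetic data, simplified from whatever filtering the program actually uses): differencing adjacent along-track samples acts as a crude high-pass filter that removes the slowly varying geophysical signal, and for white noise of standard deviation s the differences have standard deviation s·sqrt(2).

```python
import math
import random

def range_noise(samples):
    """Estimate white-noise std from one pass via first differences."""
    diffs = [b - a for a, b in zip(samples, samples[1:])]
    mean = sum(diffs) / len(diffs)  # subtracting the mean removes a linear trend
    var = sum((d - mean) ** 2 for d in diffs) / (len(diffs) - 1)
    return math.sqrt(var / 2.0)    # undo the sqrt(2) inflation from differencing

random.seed(0)
# synthetic along-track ranges: slow linear trend plus 2 cm white noise
track = [0.001 * i + random.gauss(0.0, 0.02) for i in range(5000)]
print(range_noise(track))  # close to 0.02
```

This is why no repeat track is needed: the smooth signal cancels in the differences, leaving only the instrument noise to be measured.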

  12. Efficient Extraction of High Centrality Vertices in Distributed Graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumbhare, Alok; Frincu, Marc; Raghavendra, Cauligi S.

    2014-09-09

Betweenness centrality (BC) is an important measure for identifying high-value or critical vertices in graphs, in a variety of domains such as communication networks, road networks, and social graphs. However, calculating betweenness values is prohibitively expensive and, more often, domain experts are interested only in the vertices with the highest centrality values. In this paper, we first propose a partition-centric algorithm (MS-BC) to calculate BC for a large distributed graph that optimizes resource utilization and improves overall performance. Further, we extend the notion of approximate BC by pruning the graph and removing a subset of edges and vertices that contribute the least to the betweenness values of other vertices (MSL-BC), which further improves the runtime performance. We evaluate the proposed algorithms using a mix of real-world and synthetic graphs on an HPC cluster and analyze their strengths and weaknesses. The experimental results show an improvement in performance of up to 12x for large sparse graphs as compared to the state-of-the-art, and at the same time highlight the need for better partitioning methods to enable a balanced workload across partitions for unbalanced graphs such as small-world or power-law graphs.
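For context, exact betweenness centrality can be computed with Brandes' algorithm. The sketch below (a serial toy on an invented graph, not the paper's distributed MS-BC/MSL-BC code) also illustrates why low-contribution vertices are natural pruning candidates: degree-1 vertices have zero betweenness of their own.

```python
from collections import deque

def betweenness(adj):
    """Brandes' betweenness centrality for an unweighted undirected graph."""
    bc = {v: 0.0 for v in adj}
    for s in adj:
        # single-source shortest paths (BFS)
        dist = {s: 0}
        sigma = {v: 0 for v in adj}; sigma[s] = 1
        preds = {v: [] for v in adj}
        order = []
        q = deque([s])
        while q:
            v = q.popleft(); order.append(v)
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    preds[w].append(v)
        # dependency accumulation, farthest vertices first
        delta = {v: 0.0 for v in adj}
        for w in reversed(order):
            for v in preds[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w] / 2.0  # undirected: each pair counted twice
    return bc

# path a-b-c-d-e with a leaf f hanging off c
adj = {'a': ['b'], 'b': ['a', 'c'], 'c': ['b', 'd', 'f'],
       'd': ['c', 'e'], 'e': ['d'], 'f': ['c']}
bc = betweenness(adj)
leaves = [v for v in adj if len(adj[v]) == 1]
print(max(bc, key=bc.get), all(bc[v] == 0.0 for v in leaves))  # → c True
```

Removing the leaves before ranking leaves the top vertices unchanged, which is the intuition behind the MSL-BC pruning step (the real algorithm prunes more aggressively and corrects for the removed vertices' contributions to others).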

  13. The tubercles on humpback whales' flippers: application of bio-inspired technology.

    PubMed

    Fish, Frank E; Weber, Paul W; Murray, Mark M; Howle, Laurens E

    2011-07-01

The humpback whale (Megaptera novaeangliae) is exceptional among the large baleen whales in its ability to undertake aquabatic maneuvers to catch prey. Humpback whales utilize extremely mobile, wing-like flippers for banking and turning. Large rounded tubercles along the leading edge of the flipper are morphological structures that are unique in nature. The tubercles on the leading edge act as passive-flow control devices that improve performance and maneuverability of the flipper. Experimental analysis of finite wing models has demonstrated that the presence of tubercles delays stall to higher angles of attack, thereby increasing maximum lift and decreasing drag. Possible fluid-dynamic mechanisms for improved performance include delay of stall through generation of a vortex and modification of the boundary layer, and increase in effective span by reduction of both spanwise flow and strength of the tip vortex. The tubercles provide a bio-inspired design that has commercial viability for wing-like structures. Control of passive flow has the advantages of eliminating complex, costly, high-maintenance, and heavy control mechanisms, while improving performance for lifting bodies in air and water. The tubercles on the leading edge can be applied to the design of watercraft, aircraft, ventilation fans, and windmills.

  14. Apply TQM to E-Government Outsourcing Management

    NASA Astrophysics Data System (ADS)

    Huai, Jinmei

This paper develops an approach to e-government outsourcing quality management. E-government initiatives have rapidly increased in recent decades, and the success of these activities largely depends on their operational quality. As an instrument to improve operational quality, outsourcing can be applied to e-government. This paper inspects the process of e-government outsourcing and discusses how to improve outsourcing performance through total quality management (TQM). The characteristics and special requirements of e-government outsourcing are analyzed as the basis for discussion. Then the principles and application of total quality management are interpreted. Finally, the process of improving the performance of e-government is analyzed in the context of outsourcing.

  15. Applying Best Business Practices from Corporate Performance Management to DoD

    DTIC Science & Technology

    2013-01-01

    leading or governing large, complex corporations and are experienced in creating reliable solutions to complex management issues guided by best business ...recommendations and effective solutions aimed at improving DoD. Defense Business Board Corporate Performance Management REPORT FY13-03 Task...Group 1 Applying Best Business Practices from Corporate Performance Management to DoD TASK The Deputy Secretary of Defense (DEPSECDEF

  16. The effects of an enrichment training program for youth football attackers

    PubMed Central

    Santos, Sara; Gonçalves, Bruno; Travassos, Bruno; Wong, Del P.; Schöllhorn, Wolfgang; Sampaio, Jaime

    2018-01-01

The aim of this study was to identify the effects of a complementary training program based on the differential learning approach on the physical, technical, creative, and positioning performance of youth football attackers. Players were allocated into the control (U15C = 9, age: 13.9±0.5 years; U17C = 6, age: 16.1±0.7 years) and the experimental (U15E = 9, age: 14.2±0.8 years; U17E = 6, age: 15.8±0.5 years) groups. The experimental groups participated in a 10-week complementary training program based on the differential learning approach to improve physical literacy and players’ tactical behavior. Variables studied encompassed: motor (vertical jump, speed and repeated change-of-direction), technical (pass, dribble and shot), creative (fluency, attempts, versatility) and positioning-related variables (stretch index, spatial exploration index and regularity of the lateral and longitudinal movements). Results revealed that the U15E improved both jump and repeated change-of-direction performance, while the U17E improved only jump performance. The U15E showed improvements in all technical variables (small to large effects), and in fluency and versatility (moderate effects), while the U17E improved only successful shots (large effects). From a positional perspective, there was a moderate increase in the stretch index, and decreased longitudinal and lateral regularity (small to moderate effects), in the U15E compared to the U15C. In turn, the U17E revealed a moderate increase in the spatial exploration index and a small decrease in the stretch index. Overall, the results suggest that the complementary training program was effective for the development of the overall performance of the U15E attackers, while more time and/or variability may be needed for older age groups. Nevertheless, the overall higher values found in the experimental groups suggest that this type of complementary training program improves performance. PMID:29897985

  17. Design of 240,000 orthogonal 25mer DNA barcode probes.

    PubMed

    Xu, Qikai; Schlabach, Michael R; Hannon, Gregory J; Elledge, Stephen J

    2009-02-17

    DNA barcodes linked to genetic features greatly facilitate screening these features in pooled formats using microarray hybridization, and new tools are needed to design large sets of barcodes to allow construction of large barcoded mammalian libraries such as shRNA libraries. Here we report a framework for designing large sets of orthogonal barcode probes. We demonstrate the utility of this framework by designing 240,000 barcode probes and testing their performance by hybridization. From the test hybridizations, we also discovered new probe design rules that significantly reduce cross-hybridization after their introduction into the framework of the algorithm. These rules should improve the performance of DNA microarray probe designs for many applications.
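A toy version of the orthogonality constraint (a crude proxy for illustration; the paper's actual design rules are richer and the 12-mer threshold below is an assumption): accept a random 25-mer only if it shares no 12-base substring with any previously accepted probe, which bounds the longest exact match between any two probes and thus limits cross-hybridization.

```python
import random

K = 12  # assumed maximum shared-substring length minus one

def kmers(s, k=K):
    """All length-k substrings of s."""
    return {s[i:i + k] for i in range(len(s) - k + 1)}

def design(n, length=25):
    """Greedily collect n probes with pairwise-disjoint k-mer sets."""
    random.seed(1)
    probes, seen = [], set()
    while len(probes) < n:
        cand = ''.join(random.choice('ACGT') for _ in range(length))
        km = kmers(cand)
        if km.isdisjoint(seen):  # no 12-mer shared with any accepted probe
            probes.append(cand)
            seen |= km
    return probes

probes = design(50)
print(len(probes), all(len(p) == 25 for p in probes))  # → 50 True
```

With 4^12 ≈ 16.7 million possible 12-mers, random candidates almost never collide at small scales; designing 240,000 probes requires exactly the kind of additional empirically derived rejection rules (e.g. against partial and mismatched hybridization) that the paper reports.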

  18. Design of 240,000 orthogonal 25mer DNA barcode probes

    PubMed Central

    Xu, Qikai; Schlabach, Michael R.; Hannon, Gregory J.; Elledge, Stephen J.

    2009-01-01

    DNA barcodes linked to genetic features greatly facilitate screening these features in pooled formats using microarray hybridization, and new tools are needed to design large sets of barcodes to allow construction of large barcoded mammalian libraries such as shRNA libraries. Here we report a framework for designing large sets of orthogonal barcode probes. We demonstrate the utility of this framework by designing 240,000 barcode probes and testing their performance by hybridization. From the test hybridizations, we also discovered new probe design rules that significantly reduce cross-hybridization after their introduction into the framework of the algorithm. These rules should improve the performance of DNA microarray probe designs for many applications. PMID:19171886

  19. Large Eddy Simulations and Turbulence Modeling for Film Cooling

    NASA Technical Reports Server (NTRS)

    Acharya, Sumanta

    1999-01-01

    The objective of this research is to perform Direct Numerical Simulations (DNS) and Large Eddy Simulations (LES) of the film cooling process, and to evaluate and improve advanced forms of two-equation turbulence models for turbine blade surface flow analysis. The DNS/LES were used to resolve the large eddies within the flow field near the coolant jet location. The work involved code development and application of the developed codes to film cooling problems. Five different codes were developed and utilized in this research. This report presents a summary of the development of the codes and their application to analyzing the turbulence properties at locations near the coolant injection holes.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, X.P.

    Empirical studies on the effectiveness of workplace safety regulations are inconclusive. This study hypothesizes that the asynchronous effects of safety regulations occur because regulations need time to become effective. Safety regulations will work initially by reducing the most serious accidents, and later by improving overall safety performance. The hypothesis is tested by studying a provincial-level aggregate panel dataset for China's coal industry using two models with different dependent variables: a fixed-effects model on mortality rate, defined as fatalities per 1,000 employees, and a negative binomial model on the annual number (frequency) of disastrous accidents. Safety regulations can reduce the frequency of disastrous accidents, but have not reduced the mortality rate, which represents overall safety performance. Policy recommendations are made, including shifting production from small to large mines through industrial consolidation, improving the safety performance of large mines, addressing the consequences of decentralization, and facilitating the implementation of regulations through sustained institutional action and supporting legislation.
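
    The fixed-effects model on mortality rate can be illustrated with a within estimator: demean the outcome and the regressor within each province, then fit the slope by OLS. The panel below is synthetic; the regulation-intensity proxy, effect size, and noise level are assumptions for illustration, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy panel: 10 provinces x 15 years (synthetic, not the paper's data).
n_prov, n_year = 10, 15
prov = np.repeat(np.arange(n_prov), n_year)       # province id per observation
alpha = rng.normal(2.0, 0.5, n_prov)              # province fixed effects
reg = rng.random(n_prov * n_year)                 # regulation intensity (assumed proxy)
beta_true = -0.8                                  # regulations reduce mortality
mortality = alpha[prov] + beta_true * reg + rng.normal(0, 0.1, prov.size)

def within_slope(y, x, groups):
    """Within (fixed-effects) estimator: demean by group, then OLS slope."""
    yd = y - np.array([y[groups == g].mean() for g in groups])
    xd = x - np.array([x[groups == g].mean() for g in groups])
    return (xd @ yd) / (xd @ xd)

beta_hat = within_slope(mortality, reg, prov)
print(beta_hat)   # close to beta_true; province effects are differenced away
```

Demeaning removes the province fixed effects exactly, which is why the slope is recovered even though each province has a different baseline mortality.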

  1. Warm-up with a weighted vest improves running performance via leg stiffness and running economy.

    PubMed

    Barnes, K R; Hopkins, W G; McGuigan, M R; Kilding, A E

    2015-01-01

    To determine the effects of "strides" with a weighted-vest during a warm-up on endurance performance and its potential neuromuscular and metabolic mediators. A bout of resistance exercise can enhance subsequent high-intensity performance, but little is known about such priming exercise for endurance performance. A crossover with 5-7 days between an experimental and control trial was performed by 11 well-trained distance runners. Each trial was preceded by a warm-up consisting of a 10-min self-paced jog, a 5-min submaximal run to determine running economy, and six 10-s strides with or without a weighted-vest (20% of body mass). After a 10-min recovery period, runners performed a series of jumps to determine leg stiffness and other neuromuscular characteristics, another 5-min submaximal run, and an incremental treadmill test to determine peak running speed. Clinical and non-clinical forms of magnitude-based inference were used to assess outcomes. Correlations and linear regression were used to assess relationships between performance and underlying measures. The weighted-vest condition resulted in a very-large enhancement of peak running speed (2.9%; 90% confidence limits ±0.8%), a moderate increase in leg stiffness (20.4%; ±4.2%) and a large improvement in running economy (6.0%; ±1.6%); there were also small-moderate clear reductions in cardiorespiratory measures. Relationships between change scores showed that changes in leg stiffness could explain all the improvements in performance and economy. Strides with a weighted-vest have a priming effect on leg stiffness and running economy. It is postulated the associated major effect on peak treadmill running speed will translate into enhancement of competitive endurance performance. Copyright © 2013 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  2. First Steps: What School Systems Can Do Right Now to Improve Teacher Compensation and Career Path

    ERIC Educational Resources Information Center

    Frank, Stephen; Baroody, Karen; Gordon, Jeff

    2013-01-01

    Across the country, school districts are struggling to improve student performance on flat or declining budgets. Many districts are understandably cautious about implementing large changes, such as redesigning the step-and-lane system that has existed for decades. New evaluation systems must be implemented and vetted before they are linked to…

  3. Social Problems and America's Youth: Why School Reform Won't Work.

    ERIC Educational Resources Information Center

    Rittenmeyer, Dennis C.

    1987-01-01

    Using the schools to achieve racial balance, eliminate poverty, fight drug abuse, prevent pregnancy, and reduce youth suicide is too large a task. Teachers and principals should address educational issues, not unmet social needs. To improve the educational performance of the schools, the quality of life for youth must first be improved. (MSE)

  4. Quality Circles: How Effective Are They in Improving Employee Performance and Attitudes?

    ERIC Educational Resources Information Center

    Buch, Kimberly; Raban, Amiram

    1990-01-01

    Used a quasi-experimental design to assess the effect of a quality circle intervention on behavior and attitudes of 88 employees at a large Midwestern organization. Results provide mixed support for the purported ability of circles to improve work behavior with no change for absenteeism and productivity but positive change for quality of work.…

  5. Increasing Faculty-Student Communication through Email Messaging to Improve the Success of Online Students

    ERIC Educational Resources Information Center

    Jimison, Donna L.

    2013-01-01

    In a large community college in the Midwest, an online medical terminology course was experiencing success rates below that of college- and state-wide levels. This study evaluated the outcomes of intentional, increased numbers of e-mail communications between under-performing students and faculty for the purpose of improving student academic…

  6. Large-scale synthesis of high-quality hexagonal boron nitride nanosheets for large-area graphene electronics.

    PubMed

    Lee, Kang Hyuck; Shin, Hyeon-Jin; Lee, Jinyeong; Lee, In-yeal; Kim, Gil-Ho; Choi, Jae-Young; Kim, Sang-Woo

    2012-02-08

    Hexagonal boron nitride (h-BN) has received a great deal of attention as a substrate material for high-performance graphene electronics because it has an atomically smooth surface, a lattice constant similar to that of graphene, large optical phonon modes, and a large electrical band gap. Herein, we report the large-scale synthesis of high-quality h-BN nanosheets in a chemical vapor deposition (CVD) process by controlling the surface morphologies of the copper (Cu) catalysts. It was found that morphology control of the Cu foil is critical for the formation of pure h-BN nanosheets as well as for the improvement of their crystallinity. For the first time, we demonstrate the performance enhancement of CVD-based graphene devices with large-scale h-BN nanosheets. The mobility of the graphene device on the h-BN nanosheets was increased threefold compared to that without the h-BN nanosheets, and the on-off ratio of the drain current is 2 times higher than that of the graphene device without h-BN. This work suggests that high-quality h-BN nanosheets based on CVD are very promising for high-performance large-area graphene electronics. © 2012 American Chemical Society

  7. Assessment and improvement of statistical tools for comparative proteomics analysis of sparse data sets with few experimental replicates.

    PubMed

    Schwämmle, Veit; León, Ileana Rodríguez; Jensen, Ole Nørregaard

    2013-09-06

    Large-scale quantitative analyses of biological systems are often performed with few replicate experiments, leading to multiple nonidentical data sets due to missing values. For example, mass spectrometry driven proteomics experiments are frequently performed with few biological or technical replicates due to sample scarcity, duty-cycle or sensitivity constraints, or limited capacity of the available instrumentation, leading to incomplete results where detection of significant feature changes becomes a challenge. This problem is further exacerbated for the detection of significant changes at the peptide level, for example, in phospho-proteomics experiments. In order to assess the extent of this problem and the implications for large-scale proteome analysis, we investigated and optimized the performance of three statistical approaches by using simulated and experimental data sets with varying numbers of missing values. We applied three tools, the standard t test, the moderated t test (also known as limma), and rank products, for the detection of significantly changing features in simulated and experimental proteomics data sets with missing values. The rank product method was improved to work with data sets containing missing values. Extensive analysis of simulated and experimental data sets revealed that the performance of the statistical analysis tools depended on simple properties of the data sets. High-confidence results were obtained by using the limma and rank products methods for analyses of triplicate data sets that exhibited more than 1000 features and more than 50% missing values. The maximum number of differentially represented features was identified by using the limma and rank products methods in a complementary manner. We therefore recommend combined usage of these methods as a novel and optimal way to detect significantly changing features in these data sets. This approach is suitable for large quantitative data sets from stable isotope labeling and mass spectrometry experiments and should be applicable to large data sets of any type. An R script that implements the improved rank products algorithm and the combined analysis is available.
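
    A minimal sketch of the missing-value-aware rank products idea: rank each replicate over its observed values only, then combine the available ranks with a geometric mean. The simulated data set (feature count, spike size, missingness fraction) is an illustrative assumption, not the paper's benchmark:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated log fold-changes: 1000 features x 3 replicates, ~50% missing (NaN).
n_feat, n_rep = 1000, 3
data = rng.normal(0.0, 1.0, size=(n_feat, n_rep))
data[:20] += 4.0                              # 20 truly up-regulated features
data[rng.random(data.shape) < 0.5] = np.nan   # random missingness

def rank_products(x):
    """Geometric mean of per-replicate ranks, ignoring missing values."""
    ranks = np.full_like(x, np.nan)
    for j in range(x.shape[1]):
        obs = np.flatnonzero(~np.isnan(x[:, j]))
        order = obs[np.argsort(-x[obs, j])]    # rank 1 = most up-regulated
        ranks[order, j] = np.arange(1, order.size + 1)
    out = np.full(x.shape[0], np.inf)          # never-observed features: worst
    valid = ~np.isnan(ranks).all(axis=1)
    out[valid] = np.exp(np.nanmean(np.log(ranks[valid]), axis=1))
    return out

rp = rank_products(data)
top = set(np.argsort(rp)[:20])                 # candidate changed features
```

Features consistently near the top of each observed replicate get a small rank product, so most of the spiked features surface in `top` even with half the measurements missing.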

  8. Paper-like electronic displays: Large-area rubber-stamped plastic sheets of electronics and microencapsulated electrophoretic inks

    PubMed Central

    Rogers, John A.; Bao, Zhenan; Baldwin, Kirk; Dodabalapur, Ananth; Crone, Brian; Raju, V. R.; Kuck, Valerie; Katz, Howard; Amundson, Karl; Ewing, Jay; Drzaic, Paul

    2001-01-01

    Electronic systems that use rugged lightweight plastics potentially offer attractive characteristics (low-cost processing, mechanical flexibility, large area coverage, etc.) that are not easily achieved with established silicon technologies. This paper summarizes work that demonstrates many of these characteristics in a realistic system: organic active matrix backplane circuits (256 transistors) for large (≈5 × 5-inch) mechanically flexible sheets of electronic paper, an emerging type of display. The success of this effort relies on new or improved processing techniques and materials for plastic electronics, including methods for (i) rubber stamping (microcontact printing) high-resolution (≈1 μm) circuits with low levels of defects and good registration over large areas, (ii) achieving low leakage with thin dielectrics deposited onto surfaces with relief, (iii) constructing high-performance organic transistors with bottom contact geometries, (iv) encapsulating these transistors, (v) depositing, in a repeatable way, organic semiconductors with uniform electrical characteristics over large areas, and (vi) low-temperature (≈100°C) annealing to increase the on/off ratios of the transistors and to improve the uniformity of their characteristics. The sophistication and flexibility of the patterning procedures, high level of integration on plastic substrates, large area coverage, and good performance of the transistors are all important features of this work. We successfully integrate these circuits with microencapsulated electrophoretic “inks” to form sheets of electronic paper. PMID:11320233

  9. Hybrid Lecture-Online Format Increases Student Grades in an Undergraduate Exercise Physiology Course at a Large Urban University

    ERIC Educational Resources Information Center

    McFarlin, Brian K.

    2008-01-01

    Hybrid courses allow students additional exposure to course content that is not possible in a traditional classroom environment. This exposure may lead to an improvement in academic performance. In this report, I describe the transition of a large undergraduate exercise physiology course from a traditional lecture format to a hybrid…

  10. A Cascaded Approach for Correcting Ionospheric Contamination with Large Amplitude in HF Skywave Radars

    PubMed Central

    Wei, Yinsheng; Guo, Rujiang; Xu, Rongqing; Tang, Xiudong

    2014-01-01

    Ionospheric phase perturbation with large amplitude broadens the sea clutter's Bragg peaks until they overlap, and traditional decontamination methods based on Bragg-peak filtering perform poorly in this case, which greatly limits the detection performance of HF skywave radars. For ionospheric phase perturbations with large amplitude, this paper proposes a cascaded approach based on an improved S-method to correct the ionospheric phase contamination. The approach consists of two correction steps. In the first step, a time-frequency distribution method based on the improved S-method is adopted and an optimal detection method is designed to obtain a coarse estimate of the ionospheric modulation from the time-frequency distribution. In the second step, the phase gradient algorithm (PGA) is exploited to eliminate the residual contamination. Finally, measured data are used to verify the effectiveness of the method. Simulation results show that the time-frequency resolution of this method is high and is not affected by cross-term interference; ionospheric phase perturbations with large amplitude can be corrected at low signal-to-noise ratio (SNR); and the cascaded correction method performs well. PMID:24578656
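
    The correction idea can be illustrated on a toy signal: a single simulated Bragg line multiplied by a large, slow phase perturbation, where the perturbation is recovered as the detrended unwrapped phase after demodulating at the dominant spectral peak. This is a deliberate simplification under assumed parameters, not the paper's improved S-method or its full PGA stage:

```python
import numpy as np

fs = 4.0                              # sweep repetition frequency, Hz (assumed)
t = np.arange(512) / fs
f_bragg = 0.4                         # simulated Bragg frequency, Hz (assumed)
phi_true = 1.5 * np.sin(2 * np.pi * 0.02 * t)   # large slow phase perturbation

contaminated = np.exp(2j * np.pi * f_bragg * t + 1j * phi_true)

# Demodulate at the dominant spectral peak, then take the detrended unwrapped
# residual phase as the contamination estimate.
spec = np.fft.fft(contaminated)
freqs = np.fft.fftfreq(t.size, d=1 / fs)
f_peak = freqs[np.argmax(np.abs(spec))]
residual = contaminated * np.exp(-2j * np.pi * f_peak * t)
phi_est = np.unwrap(np.angle(residual))
phi_est -= np.polyval(np.polyfit(t, phi_est, 1), t)   # drop offset and residual tone

corrected = contaminated * np.exp(-1j * phi_est)      # re-sharpened Bragg line
```

After the correction the spectral energy re-concentrates around the Bragg frequency, which is the prerequisite for the Bragg-peak-based sea-clutter processing mentioned above.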

  11. Multilocus lod scores in large pedigrees: combination of exact and approximate calculations.

    PubMed

    Tong, Liping; Thompson, Elizabeth

    2008-01-01

    To detect the positions of disease loci, lod scores are calculated at multiple chromosomal positions given trait and marker data on members of pedigrees. Exact lod score calculations are often impossible when the size of the pedigree and the number of markers are both large. In this case, a Markov Chain Monte Carlo (MCMC) approach provides an approximation. However, to provide accurate results, mixing performance is always a key issue in these MCMC methods. In this paper, we propose two methods to improve MCMC sampling and hence obtain more accurate lod score estimates in shorter computation time. The first improvement generalizes the block-Gibbs meiosis (M) sampler to multiple meiosis (MM) sampler in which multiple meioses are updated jointly, across all loci. The second one divides the computations on a large pedigree into several parts by conditioning on the haplotypes of some 'key' individuals. We perform exact calculations for the descendant parts where more data are often available, and combine this information with sampling of the hidden variables in the ancestral parts. Our approaches are expected to be most useful for data on a large pedigree with a lot of missing data. (c) 2007 S. Karger AG, Basel

  12. Multilocus Lod Scores in Large Pedigrees: Combination of Exact and Approximate Calculations

    PubMed Central

    Tong, Liping; Thompson, Elizabeth

    2007-01-01

    To detect the positions of disease loci, lod scores are calculated at multiple chromosomal positions given trait and marker data on members of pedigrees. Exact lod score calculations are often impossible when the size of the pedigree and the number of markers are both large. In this case, a Markov Chain Monte Carlo (MCMC) approach provides an approximation. However, to provide accurate results, mixing performance is always a key issue in these MCMC methods. In this paper, we propose two methods to improve MCMC sampling and hence obtain more accurate lod score estimates in shorter computation time. The first improvement generalizes the block-Gibbs meiosis (M) sampler to multiple meiosis (MM) sampler in which multiple meioses are updated jointly, across all loci. The second one divides the computations on a large pedigree into several parts by conditioning on the haplotypes of some ‘key’ individuals. We perform exact calculations for the descendant parts where more data are often available, and combine this information with sampling of the hidden variables in the ancestral parts. Our approaches are expected to be most useful for data on a large pedigree with a lot of missing data. PMID:17934317

  13. Improving Explicit Congestion Notification with the Mark-Front Strategy

    NASA Technical Reports Server (NTRS)

    Liu, Chunlei; Jain, Raj

    2001-01-01

    Delivering congestion signals is essential to the performance of networks. Current TCP/IP networks use packet losses to signal congestion. Packet losses not only reduce TCP performance, but also add large delays. Explicit Congestion Notification (ECN) delivers a faster indication of congestion and has better performance. However, current ECN implementations mark the packet at the tail of the queue. In this paper, we propose the mark-front strategy to send an even faster congestion signal. We show that the mark-front strategy reduces the buffer size requirement, improves link efficiency, and provides better fairness among users. Simulation results that verify our analysis are also presented.
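
    The difference between the two marking positions can be seen in a toy queue model (an illustrative sketch, not the paper's simulation setup): tail-marking stamps the newest arrival, which must drain behind the whole backlog, while mark-front stamps the packet about to depart:

```python
from collections import deque

def signal_delay(queue_len, mark_front):
    """Packets that depart before the marked (CE-bit) packet reaches the receiver."""
    queue = deque(range(queue_len))            # packet ids; index 0 departs first
    marked = queue[0] if mark_front else queue[-1]
    delay = 0
    while queue:
        pkt = queue.popleft()
        if pkt == marked:
            return delay
        delay += 1

print(signal_delay(10, mark_front=False))  # 9 -> signal waits behind the backlog
print(signal_delay(10, mark_front=True))   # 0 -> congestion signal departs first
```

With a 10-packet backlog, mark-front delivers the congestion signal a full queue-drain earlier, which is exactly the head start the paper's analysis builds on.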

  14. Optimization study for the experimental configuration of CMB-S4

    NASA Astrophysics Data System (ADS)

    Barron, Darcy; Chinone, Yuji; Kusaka, Akito; Borril, Julian; Errard, Josquin; Feeney, Stephen; Ferraro, Simone; Keskitalo, Reijo; Lee, Adrian T.; Roe, Natalie A.; Sherwin, Blake D.; Suzuki, Aritoki

    2018-02-01

    The CMB Stage 4 (CMB-S4) experiment is a next-generation, ground-based experiment that will measure the cosmic microwave background (CMB) polarization to unprecedented accuracy, probing the signature of inflation, the nature of cosmic neutrinos, relativistic thermal relics in the early universe, and the evolution of the universe. CMB-S4 will consist of O(500,000) photon-noise-limited detectors that cover a wide range of angular scales in order to probe the cosmological signatures of both the early and late universe. It will measure a wide range of microwave frequencies to cleanly separate the CMB signals from galactic and extra-galactic foregrounds. To advance progress towards designing the instrument for CMB-S4, we have established a framework to optimize the instrumental configuration to maximize its scientific output. The framework combines cost and instrumental models with a cosmology forecasting tool, and evaluates the scientific sensitivity as a function of various instrumental parameters. The cost model also allows us to perform the analysis under a fixed-cost constraint, optimizing the scientific output of the experiment given finite resources. In this paper, we report our first results from this framework, using simplified instrumental and cost models. We have primarily studied two classes of instrumental configurations: arrays of large-aperture telescopes with diameters ranging from 2–10 m, and hybrid arrays that combine small-aperture telescopes (0.5-m diameter) with large-aperture telescopes. We explore performance as a function of telescope aperture size, distribution of the detectors into different microwave frequencies, survey strategy and survey area, low-frequency noise performance, and the balance between small- and large-aperture telescopes for hybrid configurations. Both types of configurations must cover both large (~ degree) and small (~ arcmin) angular scales, and the performance depends on assumptions for performance vs. angular scale. The configurations with large-aperture telescopes have a shallow optimum around 4–6 m in aperture diameter, assuming that large telescopes can achieve good low-frequency noise performance. We explore some of the uncertainties in the instrumental model and cost parameters, and we find that the optimum has a weak dependence on them. The hybrid configuration shows an even broader optimum, spanning a range of 4–10 m in aperture for the large telescopes. We also present two strawperson configurations as an outcome of this optimization study, and we discuss some ideas for improving the simple cost and instrumental models used here. Several areas of this analysis deserve further improvement. In our forecasting framework, we adopt a simple two-component foreground model with spatially varying power-law spectral indices. We estimate de-lensing performance statistically and ignore non-idealities such as anisotropic mode coverage, boundary effects, and possible foreground residuals. Instrumental systematics, which are not accounted for in our analyses, may also influence the conceptual design. Further study of the instrumental and cost models will be one of the main areas of work by the entire CMB-S4 community. We hope that our framework will be useful for estimating the influence of these improvements in the future, and we will incorporate them to further improve the optimization.

  15. The Productive Ward Program™: A Two-Year Implementation Impact Review Using a Longitudinal Multilevel Study.

    PubMed

    Van Bogaert, Peter; Van Heusden, Danny; Verspuy, Martijn; Wouters, Kristien; Slootmans, Stijn; Van der Straeten, Johnny; Van Aken, Paul; White, Mark

    2017-03-01

    Aim To investigate the impact of the quality improvement program "Productive Ward - Releasing Time to Care™" using nurses' and midwives' reports of practice environment, burnout, quality of care, and job outcomes, as well as workload, decision latitude, social capital, and engagement. Background Despite the requirement for health systems to improve quality and the proliferation of quality improvement programs designed for healthcare, the empirical evidence that large-scale quality improvement programs improve patient satisfaction, staff engagement, and quality of care remains sparse. Method A longitudinal study was performed in a large 600-bed acute care university hospital, with two measurement intervals for nurse practice environment, burnout, quality of care, and job outcomes, and three measurement intervals for workload, decision latitude, social capital, and engagement, between June 2011 and November 2014. Results Positive results were identified in practice environment, decision latitude, and social capital. Less favorable results were identified in relation to perceived workload, emotional exhaustion, and vigor. Moreover, measures of quality of care and job satisfaction were reported less favorably. Conclusion This study highlights the need to further understand how to implement large-scale quality improvement programs so that they integrate with daily practices and promote "quality improvement" as "business as usual."

  16. A Study of the Efficiency of Spatial Indexing Methods Applied to Large Astronomical Databases

    NASA Astrophysics Data System (ADS)

    Donaldson, Tom; Berriman, G. Bruce; Good, John; Shiao, Bernie

    2018-01-01

    Spatial indexing of astronomical databases generally uses quadrature methods, which partition the sky into cells used to create an index (usually a B-tree) written as a database column. We report the results of a study comparing the performance of two common indexing methods, HTM and HEALPix, on Solaris and Windows database servers installed with a PostgreSQL database, and a Windows server installed with MS SQL Server. The indexing was applied to the 2MASS All-Sky Catalog and to the Hubble Source Catalog. On each server, the study compared indexing performance by submitting 1 million queries at each index level with random sky positions and random cone search radii, computed on a logarithmic scale between 1 arcsec and 1 degree, and measuring the time to complete the query and write the output. These simulated queries, intended to model realistic use patterns, were run in a uniform way on many combinations of indexing method and indexing level. The query times in all simulations are strongly I/O-bound and are linear with the number of records returned for large numbers of sources. There are, however, considerable differences between simulations, which reveal that hardware I/O throughput is a more important factor in managing the performance of a DBMS than the choice of indexing scheme. The choice of index itself is relatively unimportant: for comparable index levels, the performance is consistent within the scatter of the timings. At small index levels (large cells; e.g. level 4, cell size 3.7 deg), there is large scatter in the timings because of wide variations in the number of sources found in the cells. At larger index levels, performance improves and scatter decreases, but the improvement at level 8 (14 arcmin) and higher is masked to some extent by the timing scatter caused by the range of query sizes. At very high levels (20; 0.0004 arcsec), the granularity of the cells becomes so high that a large number of extraneous empty cells begins to degrade performance. Thus, for the use patterns studied here, the database performance is not critically dependent on the exact choice of index or level.
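
    The quoted cell sizes can be sanity-checked from the HEALPix pixelisation, where nside = 2**level and the mean cell area is 4π/(12·nside²); the square root of the area gives a linear size. This back-of-envelope check reproduces the 3.7-degree level-4 cells mentioned above:

```python
import math

def healpix_cell_size_deg(level):
    """Approximate linear size (deg) of a HEALPix cell at nside = 2**level."""
    npix = 12 * (2 ** level) ** 2          # total cells on the sphere
    area_sr = 4 * math.pi / npix           # mean cell area, steradians
    return math.degrees(math.sqrt(area_sr))

print(round(healpix_cell_size_deg(4), 1))     # -> 3.7 (deg), as quoted for level 4
print(round(healpix_cell_size_deg(8) * 60))   # -> 14 (arcmin), quoted for level 8
```

Each level halves the cell size, which is why scatter shrinks with level until, at very high levels, the sheer number of cells per query radius dominates the cost.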

  17. Improving Docking Performance Using Negative Image-Based Rescoring.

    PubMed

    Kurkinen, Sami T; Niinivehmas, Sanna; Ahinko, Mira; Lätti, Sakari; Pentikäinen, Olli T; Postila, Pekka A

    2018-01-01

    Despite the large computational cost of molecular docking, the default scoring functions are often unable to recognize the active hits among the inactive molecules in large-scale virtual screening experiments. Thus, even though a correct binding pose might be sampled during the docking, the active compound or its biologically relevant pose is not necessarily given a high enough score to attract attention. Various rescoring and post-processing approaches have emerged for improving docking performance. Here, it is shown that the very early enrichment (the number of actives scored higher than the top 1% of ranked decoys) can be improved on average 2.5-fold, or even 8.7-fold, by comparing the docking-based ligand conformers directly against the target protein's cavity shape and electrostatics. The similarity comparison of the conformers is performed without geometry optimization against the negative image of the target protein's ligand-binding cavity using the negative image-based (NIB) screening protocol. The viability of the NIB rescoring, or R-NiB, pioneered in this study was tested with 11 target proteins using benchmark libraries. By focusing on the shape/electrostatics complementarity of the ligand-receptor association, R-NiB is able to improve the early enrichment of docking essentially without adding to the computing cost. By implementing consensus scoring, in which the R-NiB and the original docking scoring are weighted for optimal outcome, the early enrichment is improved to a level that facilitates effective drug discovery. Moreover, using equal weights for the original docking scoring and the R-NiB scoring improves the yield in most cases.
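
    The equal-weight consensus idea can be sketched as a z-score combination of the two score lists. The compound scores below are hypothetical, and the R-NiB shape/electrostatics similarity itself is not computed here:

```python
import numpy as np

def consensus_rank(dock_scores, nib_scores, w=0.5):
    """Rank compounds by a weighted z-score consensus of two score lists."""
    dock = np.asarray(dock_scores, float)
    nib = np.asarray(nib_scores, float)
    # Docking: lower (more negative) is better -> negate so that "higher is
    # better" holds for both terms before averaging.
    zd = (-dock - (-dock).mean()) / (-dock).std()
    zn = (nib - nib.mean()) / nib.std()
    combined = w * zd + (1 - w) * zn
    return np.argsort(-combined)               # compound indices, best first

# Hypothetical 5-compound example: compound 0 is mediocre by docking alone
# but highly shape-complementary, and rises in the consensus ranking.
dock = [-7.0, -9.5, -6.0, -8.0, -5.5]          # assumed docking scores
nib = [0.9, 0.3, 0.4, 0.5, 0.2]                # assumed NIB similarities
print(consensus_rank(dock, nib))               # -> [0 1 3 2 4]
```

Setting `w=1.0` recovers the pure docking ranking, which is how the weighting between the two terms can be tuned for a given target.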

  18. Monolayer graphene-insulator-semiconductor emitter for large-area electron lithography

    NASA Astrophysics Data System (ADS)

    Kirley, Matthew P.; Aloui, Tanouir; Glass, Jeffrey T.

    2017-06-01

    The rapid adoption of nanotechnology in fields as varied as semiconductors, energy, and medicine requires the continual improvement of nanopatterning tools. Lithography is central to this evolving nanotechnology landscape, but current production systems are subject to high costs, low throughput, or low resolution. Herein, we present a solution to these problems with the use of monolayer graphene in a graphene-insulator-semiconductor (GIS) electron emitter device for large-area electron lithography. Our GIS device displayed high emission efficiency (up to 13%) and transferred large patterns (500 × 500 μm) with high fidelity (<50% spread). The performance of our device demonstrates a feasible path to dramatic improvements in lithographic patterning systems, enabling continued progress in existing industries and opening opportunities in nanomanufacturing.

  19. Physical Activity Predicts Performance in an Unpracticed Bimanual Coordination Task.

    PubMed

    Boisgontier, Matthieu P; Serbruyns, Leen; Swinnen, Stephan P

    2017-01-01

    Practice of a given physical activity is known to improve the motor skills related to this activity. However, whether unrelated skills are also improved is still unclear. To test the impact of physical activity on an unpracticed motor task, 26 young adults completed the international physical activity questionnaire and performed a bimanual coordination task they had never practiced before. Results showed that higher total physical activity predicted higher performance in the bimanual task, controlling for multiple factors such as age, physical inactivity, music practice, and computer games practice. Linear mixed models allowed this effect of physical activity to be generalized to a large population of bimanual coordination conditions. This finding runs counter to the notion that generalized motor abilities do not exist and supports the existence of a "learning to learn" skill that could be improved through physical activity and that impacts performance in tasks that are not necessarily related to the practiced activity.

  20. Advanced Lithium-ion Batteries with High Specific Energy and Improved Safety for Nasa's Missions

    NASA Technical Reports Server (NTRS)

    West, William; Smart, Marshall; Soler, Jess; Krause, Charlie; Hwang, Constanza; Bugga, Ratnakumar

    2012-01-01

    High-energy materials (cathodes, anodes) and high-voltage, safe electrolytes are required to meet the needs of future space missions. A. Cathodes: The layered-layered composites of Li2MnO3 and LiMO2 are promising. The power capability of these materials, however, requires further improvement. Suitable morphology is critical for good performance and high tap (packing) density, and surface coatings help the interfacial kinetics and stability. B. Electrolytes: Small additions of flame-retardant additives improve flammability without affecting performance (rate and cycle life). A 1.0 M electrolyte in EC+EMC+TPP was shown to perform well against the high-voltage cathode; performance was demonstrated in large-capacity prototype MCMB-LiNiCoO2 cells. Formulations with higher additive proportions look promising but still require further validation through abuse tests (e.g., on 18650 cells).

  1. A Dual Frequency Carrier Phase Error Difference Checking Algorithm for the GNSS Compass.

    PubMed

    Liu, Shuo; Zhang, Lei; Li, Jian

    2016-11-24

    The performance of the Global Navigation Satellite System (GNSS) compass is related to the quality of carrier phase measurement. Properly handling the carrier phase error is important for improving GNSS compass accuracy. In this work, we propose a dual frequency carrier phase error difference checking algorithm for the GNSS compass. The algorithm aims at eliminating large carrier phase errors in dual frequency double differenced carrier phase measurements according to the error difference between the two frequencies. The advantage of the proposed algorithm is that it does not need additional environment information and performs well against multiple large errors compared with previous research. The core of the proposed algorithm is removing the geometrical distance from the dual frequency carrier phase measurement, so that the carrier phase error is separated and detectable. We generate the Double Differenced Geometry-Free (DDGF) measurement according to the characteristic that the carrier phase measurements on different frequencies contain the same geometrical distance. Then, we propose the DDGF detection to detect a large carrier phase error difference between the two frequencies. The theoretical performance of the proposed DDGF detection is analyzed. An open sky test, a man-made multipath test, and an urban vehicle test were carried out to evaluate the performance of the proposed algorithm. The results show that the proposed DDGF detection is able to detect large errors in dual frequency carrier phase measurement by checking the error difference between the two frequencies. After the DDGF detection, the accuracy of the baseline vector is improved in the GNSS compass.
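
    The geometry-free idea behind the DDGF measurement can be sketched numerically (all values below are simulated assumptions, not real receiver data): because double-differenced phases on the two frequencies share the same geometric distance, differencing them in meters cancels the geometry and leaves any large single-frequency error exposed:

```python
import numpy as np

c = 299_792_458.0
f1, f2 = 1_575.42e6, 1_227.60e6            # GPS L1/L2 carrier frequencies, Hz
lam1 = c / f1                              # L1 wavelength, ~0.19 m

rng = np.random.default_rng(2)
n = 200
geom = 0.35 * np.sin(np.linspace(0, 2 * np.pi, n))   # common DD geometric term, m
phase1 = geom + rng.normal(0, 0.003, n)              # DD carrier phase on L1, m
phase2 = geom + rng.normal(0, 0.003, n)              # DD carrier phase on L2, m
phase1[120] += 0.5 * lam1                            # inject a large L1 error

ddgf = phase1 - phase2                     # geometry cancels in the difference
dev = np.abs(ddgf - np.median(ddgf))
threshold = 6 * np.median(dev)             # robust (MAD-based) detection limit
flagged = np.flatnonzero(dev > threshold)
print(flagged)
```

The injected half-cycle error (~9.5 cm) stands far above the few-millimeter noise floor of the geometry-free difference, so epoch 120 is flagged without any knowledge of the baseline geometry.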

  2. Enhancement of radiative cooling performance of nanoparticle crystals via oxidation

    NASA Astrophysics Data System (ADS)

    Jia, Zi-Xun; Shuai, Yong; Li, Meng; Guo, Yanmin; Tan, He-ping

    2018-03-01

    Nanoparticle crystals are promising candidates for large-scale metamaterial fabrication. In radiative cooling applications, however, the peak blackbody radiation wavelength lies far from the metal's plasmon wavelength. This paper shows that if the metallic nanoparticle crystal is properly oxidized, the absorption performance within the room-temperature blackbody radiation spectrum can be improved. Magnetic polariton and surface plasmon polariton resonances are offered as the mechanism of the absorption improvement. Three different oxidation patterns were investigated, and the results show they share a similar enhancement mechanism.

  3. Improved prosthetic hand control with concurrent use of myoelectric and inertial measurements.

    PubMed

    Krasoulis, Agamemnon; Kyranou, Iris; Erden, Mustapha Suphi; Nazarpour, Kianoush; Vijayakumar, Sethu

    2017-07-11

    Myoelectric pattern recognition systems can decode movement intention to drive upper-limb prostheses. Despite recent advances in academic research, the commercial adoption of such systems remains low. This limitation is mainly due to the lack of classification robustness and a simultaneous requirement for a large number of electromyogram (EMG) electrodes. We propose to address these two issues by using a multi-modal approach which combines surface electromyography (sEMG) with inertial measurements (IMs) and an appropriate training data collection paradigm. We demonstrate that this can significantly improve classification performance as compared to conventional techniques exclusively based on sEMG signals. We collected and analyzed a large dataset comprising recordings with 20 able-bodied and two amputee participants executing 40 movements. Additionally, we conducted a novel real-time prosthetic hand control experiment with 11 able-bodied subjects and an amputee by using a state-of-the-art commercial prosthetic hand. A systematic performance comparison was carried out to investigate the potential benefit of incorporating IMs in prosthetic hand control. The inclusion of IM data improved performance significantly, by increasing classification accuracy (CA) in the offline analysis and improving completion rates (CRs) in the real-time experiment. Our findings were consistent across able-bodied and amputee subjects. Integrating the sEMG electrodes and IM sensors within a single sensor package enabled us to achieve high-level performance by using on average 4-6 sensors. The results from our experiments suggest that IMs can form an excellent complementary signal source for upper-limb myoelectric prostheses. We trust that multi-modal control solutions have the potential to improve the usability of upper-extremity prostheses in real-life applications.
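
    The fusion step described above, concatenating per-window features from both sensor modalities before classification, can be sketched as follows. The feature (mean absolute value) and the nearest-centroid classifier are illustrative stand-ins for the paper's pipeline, not its actual implementation.

```python
# Minimal sketch of multi-modal feature fusion for movement classification.
import math

def window_features(samples):
    """Mean absolute value: one simple amplitude feature per channel window."""
    return sum(abs(s) for s in samples) / len(samples)

def fuse(emg_windows, im_windows):
    """Concatenate per-channel features from both sensor modalities."""
    return [window_features(w) for w in emg_windows] + \
           [window_features(w) for w in im_windows]

class NearestCentroid:
    def fit(self, X, y):
        sums, counts = {}, {}
        for x, label in zip(X, y):
            acc = sums.setdefault(label, [0.0] * len(x))
            for i, v in enumerate(x):
                acc[i] += v
            counts[label] = counts.get(label, 0) + 1
        self.centroids = {l: [v / counts[l] for v in acc]
                          for l, acc in sums.items()}
        return self

    def predict(self, x):
        return min(self.centroids,
                   key=lambda l: math.dist(x, self.centroids[l]))

# Toy usage: two movements, each with one EMG and one IM channel window.
X = [fuse([[1.0, 1.0]], [[0.0, 0.0]]),   # "grip" training example
     fuse([[0.0, 0.0]], [[1.0, 1.0]])]   # "point" training example
clf = NearestCentroid().fit(X, ["grip", "point"])
print(clf.predict(fuse([[0.9, 1.1]], [[0.1, 0.0]])))  # grip
```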

  4. Making Connections for Youth in Washington State: The Role of Data in Developing Sound Public Policy. CEDR Working Paper No. 2010-1.0

    ERIC Educational Resources Information Center

    Goldhaber, Dan

    2010-01-01

    The details of school reform in Washington State continue to evolve, but the unprecedented performance demands that it and NCLB place on schools are unlikely to disappear any time soon. The same is true of the large gap that exists between today's performance and tomorrow's aspirations. By any measure, significant improvements in performance now…

  5. Graduate Programs in Building Science at UC Berkeley

    Science.gov Websites

    aims to influence practice and improve the performance of buildings by educating future members of the field. Coursework is largely decided on an individual basis through consultation between…

  6. Measuring School Performance To Improve Student Achievement and To Reward Effective Programs.

    ERIC Educational Resources Information Center

    Heistad, Dave; Spicuzza, Rick

    This paper describes the method that the Minneapolis Public School system (MPS), Minnesota, uses to measure school and student performance. MPS uses a multifaceted system that both captures and accounts for the complexity of a large urban school district. The system incorporates: (1) a hybrid model of critical indicators that report on level of…

  7. Learning Management System Calendar Reminders and Effects on Time Management and Academic Performance

    ERIC Educational Resources Information Center

    Mei, Jianyang

    2016-01-01

    This research project uses a large research university in the Midwest as a research site to explore the time management skills of international students and analyzes how using the Course Hack, an online Learning Management System (LMS) calendar tool, improves participants' time management skills and positively impacts their academic performance,…

  8. Using Clickers to Improve Student Engagement and Performance in an Introductory Biochemistry Class

    ERIC Educational Resources Information Center

    Addison, Stephen; Wright, Adrienne; Milner, Rachel

    2009-01-01

    As part of ongoing efforts to enhance teaching practices in a large-class introductory biochemistry course, we have recently tested the effects of using a student response system (clickers) on student exam performances and engagement with the course material. We found no measurable difference in class mean composite examination score for students…

  9. Practice makes perfect in memory recall

    PubMed Central

    Romani, Sandro; Katkov, Mikhail

    2016-01-01

    A large variability in performance is observed when participants recall briefly presented lists of words. The sources of such variability are not known. Our analysis of a large data set of free recall revealed a small fraction of participants that reached an extremely high performance, including many trials with the recall of complete lists. Moreover, some of them developed a number of consistent input-position-dependent recall strategies, in particular recalling words consecutively (“chaining”) or in groups of consecutively presented words (“chunking”). The time course of acquisition and particular choice of positional grouping were variable among participants. Our results show that acquiring positional strategies plays a crucial role in improvement of recall performance. PMID:26980785

  10. Slicing of Silicon into Sheet Material. Silicon Sheet Growth Development for the Large Area Silicon Sheet Task of the Low Cost Solar Array Project

    NASA Technical Reports Server (NTRS)

    Fleming, J. R.; Holden, S. C.; Wolfson, R. G.

    1979-01-01

    The use of multiblade slurry sawing to produce silicon wafers from ingots was investigated. The commercially available state of the art process was improved by 20% in terms of area of silicon wafers produced from an ingot. The process was improved 34% on an experimental basis. Economic analyses presented show that further improvements are necessary to approach the desired wafer costs, mostly reduction in expendable materials costs. Tests which indicate that such reduction is possible are included, although demonstration of such reduction was not completed. A new, large capacity saw was designed and tested. Performance comparable with current equipment (in terms of number of wafers/cm) was demonstrated.

  11. Out-coupling membrane for large-size organic light-emitting panels with high efficiency and improved uniformity

    NASA Astrophysics Data System (ADS)

    Ding, Lei; Wang, Lu-Wei; Zhou, Lei; Zhang, Fang-hui

    2016-12-01

    An out-coupling membrane embedded with a scattering film of SiO2 spheres and polyethylene terephthalate (PET) plastic was successfully developed for 150 × 150 mm2 green OLEDs. Compared with a reference OLED panel, an approximately 1-fold enhancement in the forward emission was obtained with an out-coupling membrane adhered to the surface of the external glass substrate of the panel. Moreover, it was verified that the emission color at different viewing angles can be stabilized without apparent spectral distortion. Particularly, the uniformity of the large-area OLEDs was greatly improved. Theoretical calculations clarified that the improved performance of the lighting panels is primarily attributed to the effect of particle scattering.

  12. Dose-response effects of water supplementation on cognitive performance and mood in children and adults.

    PubMed

    Edmonds, Caroline J; Crosbie, Laura; Fatima, Fareeha; Hussain, Maryam; Jacob, Nicole; Gardner, Mark

    2017-01-01

    Water supplementation has been found to facilitate visual attention and short-term memory, but the dose required to improve performance is not yet known. We assessed the dose-response effect of water on thirst, mood and cognitive performance in both adults and children. Participants were offered either no water, 25 ml of water, or 300 ml of water to drink. Study 1 assessed 96 adults, and Study 2 presents data from 60 children aged 7-9 years. In both studies, performance was assessed at baseline and 20 min after drinking (or no drink), using thirst and mood scales, letter cancellation and a digit span test. For both children and adults, a large drink (300 ml) was necessary to reduce thirst, while a small drink (25 ml) was sufficient to improve visual attention (letter cancellation). In adults, a large drink improved digit span, but there was no such effect in children. In children, but not adults, a small drink resulted in increased thirst ratings. Both children and adults show dose-response effects of drinking on visual attention. Visual attention is enhanced by small amounts of fluid and appears not to be contingent on thirst reduction. Memory performance may be related to thirst, but differently for children and adults. These contrasting dose-response characteristics could imply cognitive enhancement by different mechanisms for these two domains. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Topological and canonical kriging for design flood prediction in ungauged catchments: an improvement over a traditional regional regression approach?

    USGS Publications Warehouse

    Archfield, Stacey A.; Pugliese, Alessio; Castellarin, Attilio; Skøien, Jon O.; Kiang, Julie E.

    2013-01-01

    In the United States, estimation of flood frequency quantiles at ungauged locations has been largely based on regional regression techniques that relate measurable catchment descriptors to flood quantiles. More recently, spatial interpolation techniques of point data have been shown to be effective for predicting streamflow statistics (i.e., flood flows and low-flow indices) in ungauged catchments. Literature reports successful applications of two techniques, canonical kriging, CK (or physiographical-space-based interpolation, PSBI), and topological kriging, TK (or top-kriging). CK performs the spatial interpolation of the streamflow statistic of interest in the two-dimensional space of catchment descriptors. TK predicts the streamflow statistic along river networks taking both the catchment area and nested nature of catchments into account. It is of interest to understand how these spatial interpolation methods compare with generalized least squares (GLS) regression, one of the most common approaches to estimate flood quantiles at ungauged locations. By means of a leave-one-out cross-validation procedure, the performance of CK and TK was compared to GLS regression equations developed for the prediction of 10, 50, 100 and 500 yr floods for 61 streamgauges in the southeast United States. TK substantially outperforms GLS and CK for the study area, particularly for large catchments. The performance of TK over GLS highlights an important distinction between the treatments of spatial correlation when using regression-based or spatial interpolation methods to estimate flood quantiles at ungauged locations. The analysis also shows that coupling TK with CK slightly improves the performance of TK; however, the improvement is marginal when compared to the improvement in performance over GLS.
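
    The leave-one-out cross-validation used to compare the estimators can be sketched as follows. The two predictors here are toy stand-ins (a global mean versus a nearest-neighbour lookup in catchment-descriptor space, loosely echoing CK's descriptor-space interpolation); they are not the paper's GLS, CK, or TK models, and the site data are invented.

```python
# Sketch of leave-one-out cross-validation for comparing flood-quantile
# predictors at "ungauged" sites.
import math

def loo_rmse(sites, predict):
    """Hold each gauged site out, predict it from the rest, accumulate error."""
    errs = []
    for i, held_out in enumerate(sites):
        rest = sites[:i] + sites[i + 1:]
        errs.append(predict(rest, held_out) - held_out["q100"])
    return math.sqrt(sum(e * e for e in errs) / len(errs))

def mean_predictor(rest, site):
    # baseline: ignore descriptors entirely
    return sum(s["q100"] for s in rest) / len(rest)

def nn_descriptor_predictor(rest, site):
    # nearest neighbour in (area, slope) descriptor space
    nearest = min(rest, key=lambda s: math.dist(
        (s["area"], s["slope"]), (site["area"], site["slope"])))
    return nearest["q100"]

sites = [  # hypothetical catchments: descriptors and 100-yr flood quantile
    {"area": 10,  "slope": 1.00, "q100": 50},
    {"area": 12,  "slope": 1.10, "q100": 55},
    {"area": 100, "slope": 0.20, "q100": 400},
    {"area": 110, "slope": 0.25, "q100": 420},
]
print(loo_rmse(sites, mean_predictor) > loo_rmse(sites, nn_descriptor_predictor))  # True
```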

  14. Virtual screening methods as tools for drug lead discovery from large chemical libraries.

    PubMed

    Ma, X H; Zhu, F; Liu, X; Shi, Z; Zhang, J X; Yang, S Y; Wei, Y Q; Chen, Y Z

    2012-01-01

    Virtual screening methods have been developed and explored as useful tools for searching for drug lead compounds in chemical libraries, including large libraries that have become publicly available. In this review, we discuss new developments in exploring virtual screening methods for enhanced performance in searching large chemical libraries, their applications in screening libraries of ~1 million or more compounds in the last five years, the difficulties in their applications, and the strategies for further improving these methods.

  15. Heat transfer enhancement in a lithium-ion cell through improved material-level thermal transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vishwakarma, Vivek; Waghela, Chirag; Wei, Zi

    2016-09-25

    We report that while Li-ion cells offer excellent electrochemical performance for several applications including electric vehicles, they also exhibit poor thermal transport characteristics, resulting in reduced performance, overheating and thermal runaway. Inadequate heat removal from Li-ion cells originates from poor thermal conductivity within the cell. This paper identifies the rate-limiting material-level process that dominates overall thermal conduction in a Li-ion cell. Results indicate that thermal characteristics of a Li-ion cell are largely dominated by heat transfer across the cathode-separator interface rather than heat transfer through the materials themselves. This interfacial thermal resistance contributes around 88% of the total thermal resistance in the cell. The measured value of interfacial resistance is close to that obtained from theoretical models that account for weak adhesion and large acoustic mismatch between cathode and separator. Further, to address this problem, an amine-based chemical bridging of the interface is carried out. This is shown to result in four-times lower interfacial thermal resistance without deterioration in electrochemical performance, thereby increasing effective thermal conductivity three-fold. This improvement is expected to reduce peak temperature rise during operation by 60%. Finally, by identifying and addressing the material-level root cause of poor thermal transport in Li-ion cells, this work may contribute towards improved thermal performance of Li-ion cells.
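
    The quoted figures are consistent with a simple series-resistance model, which the following arithmetic check illustrates (total resistance normalised to 1; only the percentages from the abstract are used):

```python
# Arithmetic check: interface contributes ~88% of total thermal resistance.
r_interface = 0.88
r_rest = 1.0 - r_interface          # electrodes, separator bulk, etc.

# Four-times lower interfacial resistance after the chemical bridging:
r_total_new = r_interface / 4 + r_rest

conductivity_gain = 1.0 / r_total_new   # k ~ 1/R for fixed geometry
print(round(r_total_new, 2))        # 0.34
print(round(conductivity_gain, 1))  # 2.9 -> roughly the reported three-fold
```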

  16. A Pilot Program: Using Text Messaging to Improve Timely Communication to Tonsillectomy Patients.

    PubMed

    Newton, Laurie; Sulman, Cecille

    2016-01-01

    Approximately 1,500 tonsillectomies are performed annually at a large pediatric academic medical center. Families need to be educated on how to care for their child after this surgery. Most tonsillectomy patients are discharged home either the same day as surgery or after one night of observation, so post-operative recovery and care fall upon the patient's family. Multiple quality improvement efforts to improve family education after tonsillectomy have been undertaken over the last several years at this center, but none have focused on the use of technology to provide innovative patient education. The purpose of this project is to provide information to parents via text messages and videos to improve patient experience and outcomes following tonsillectomy. Families provided positive feedback, including that the texts were helpful, easy to understand, and reduced pre-operative and recovery anxiety. Also, none of these families needed to call the ENT clinic with other questions or concerns. The recovery from tonsillectomy is not easy, and this pediatric otolaryngology practice is always searching for new ways to improve care and education. Use of technology is an innovative approach and likely one that will be used more often in the future.

  17. Do Teacher Financial Awards Improve Teacher Retention and Student Achievement in an Urban Disadvantaged School District?

    ERIC Educational Resources Information Center

    Shifrer, Dara; Turley, Ruth López; Heard, Holly

    2017-01-01

    Teacher performance pay programs are theorized to improve student achievement by incentivizing teachers, but opponents counter that teachers are not motivated by money. We used regression discontinuity techniques and data on a census of the students, teachers, and schools in a large urban minority-majority school district to show receipt of a…

  18. How to Improve Schooling Outcomes in Low-Income Countries? the Challenges and Hopes of Cognitive Neuroscience

    ERIC Educational Resources Information Center

    Abadzi, Helen

    2014-01-01

    The international Education for All initiative to bring about universal primary education has resulted in large enrollment increases in lower income countries but with limited outcomes. Due to scarcity in material and human resources, all but the better off often fail to learn basic skills. To improve performance within the very limited capacities…

  19. A cloud-based framework for large-scale traditional Chinese medical record retrieval.

    PubMed

    Liu, Lijun; Liu, Li; Fu, Xiaodong; Huang, Qingsong; Zhang, Xianwen; Zhang, Yin

    2018-01-01

    Electronic medical records are increasingly common in medical practice, and their secondary use has become increasingly important. It relies on the ability to retrieve complete information about desired patient populations, and how to effectively and accurately retrieve relevant medical records from large-scale medical big data is becoming a big challenge. Therefore, we propose an efficient and robust cloud-based framework for large-scale Traditional Chinese Medical Records (TCMRs) retrieval. First, we propose a parallel index building method and build a distributed search cluster; the former is used to improve the performance of index building, and the latter is used to provide highly concurrent online TCMRs retrieval. Second, a real-time multi-indexing model is proposed to ensure the latest relevant TCMRs are indexed and retrieved in real time, and a semantics-based query expansion method and a multi-factor ranking model are proposed to improve retrieval quality. Third, we implement a template-based visualization method for displaying medical reports. The proposed parallel indexing method and distributed search cluster improve the performance of index building and provide highly concurrent online TCMRs retrieval. The multi-indexing model ensures the latest relevant TCMRs are indexed and retrieved in real time. The semantics expansion method and the multi-factor ranking model enhance retrieval quality, and the template-based visualization method enhances availability and universality, with medical reports displayed via a friendly web interface. In conclusion, compared with current medical record retrieval systems, our system provides advantages useful in improving the secondary use of large-scale traditional Chinese medical records in a cloud environment. The proposed system is more easily integrated with existing clinical systems and can be used in various scenarios. Copyright © 2017. Published by Elsevier Inc.
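
    A multi-factor ranking model of the kind mentioned above is often a weighted combination of per-record scores. The sketch below illustrates the idea only; the factor names, weights, and record IDs are invented, not the paper's actual model.

```python
# Sketch of a multi-factor ranking model: weighted linear combination of
# per-record factor scores (all factors assumed pre-normalised to [0, 1]).
WEIGHTS = {"text_relevance": 0.6, "semantic_match": 0.3, "recency": 0.1}

def rank(records):
    def score(rec):
        return sum(WEIGHTS[f] * rec["factors"][f] for f in WEIGHTS)
    return sorted(records, key=score, reverse=True)

records = [
    {"id": "tcmr-001",
     "factors": {"text_relevance": 0.4, "semantic_match": 0.9, "recency": 0.2}},
    {"id": "tcmr-002",
     "factors": {"text_relevance": 0.8, "semantic_match": 0.5, "recency": 0.9}},
]
print([r["id"] for r in rank(records)])  # ['tcmr-002', 'tcmr-001']
```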

  20. Plyometric Training Improves Sprinting, Jumping and Throwing Capacities of High Level Female Volleyball Players Better Than Skill-Based Conditioning.

    PubMed

    Gjinovci, Bahri; Idrizovic, Kemal; Uljevic, Ognjen; Sekulic, Damir

    2017-12-01

    There is an evident lack of studies on the effectiveness of plyometric- and skill-based-conditioning in volleyball. This study aimed to evaluate the effects of 12-week plyometric- and volleyball-skill-based training on specific conditioning abilities in female volleyball players. The sample included 41 high-level female volleyball players (21.8 ± 2.1 years of age; 1.76 ± 0.06 m; 60.8 ± 7.0 kg), who participated in a plyometric- (n = 21) or skill-based-conditioning program (n = 20). Both programs were performed twice per week. Participants were tested on body-height, body-mass (BM), countermovement jump (CMJ), standing broad jump (SBJ), medicine ball throw (MBT), and 20-m sprint (S20M). All tests were assessed at the study baseline (pre-) and at the end of the 12-week programs (post-testing). Two-way ANOVA for repeated measurements showed significant (p < 0.05) "Group x Time" effects for all variables but body-height. The plyometric group significantly reduced body-mass (trivial effect size [ES] differences; 1% average pre- to post-measurement changes), and improved their performance in S20M (moderate ES; 8%), MBT (very large ES; 25%), CMJ (large ES; 27%), and SBJ (moderate ES; 8%). Players involved in skill-based-conditioning significantly improved CMJ (large ES; 18%), SBJ (small ES; 3%), and MBT (large ES; 9%). The changes which occurred between pre- and post-testing were more inter-correlated in the plyometric group. Although both training modalities induced positive changes in jumping- and throwing-capacities, plyometric training was found to be more effective than skill-based conditioning in improving the conditioning capacities of senior female volleyball players. Future studies should evaluate differential program effects in less experienced and younger players.

  1. Voices from the Field: The Perceptions of Teachers and Principals on the Class Size Reduction Program in a Large Urban School District.

    ERIC Educational Resources Information Center

    Munoz, Marco A.; Portes, Pedro R.

    A class size reduction (CSR) program was implemented in a large low-performing urban elementary school district. The CSR program helps schools improve student learning by hiring additional teachers so that children in the early elementary grades can attend smaller classes. This study used a participant-oriented evaluation model to examine the…

  2. Modularized battery management for large lithium ion cells

    NASA Astrophysics Data System (ADS)

    Stuart, Thomas A.; Zhu, Wei

    A modular electronic battery management system (BMS) is described along with important features for protecting and optimizing the performance of large lithium ion (LiIon) battery packs. Of particular interest is the use of a much improved cell equalization system that can increase or decrease individual cell voltages. Experimental results are included for a pack of six series connected 60 Ah (amp-hour) LiIon cells.
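
    The bidirectional equalization described above (increasing or decreasing individual cell voltages) amounts to nudging each cell toward the pack mean. The decision logic can be sketched as follows; the deadband value and the charge/discharge interface are illustrative assumptions, not details of the paper's BMS.

```python
# Sketch of bidirectional cell equalization: each cell is driven toward the
# pack mean voltage, either shedding charge (discharge) or receiving it
# (charge), with a deadband to avoid chattering near the mean.
DEADBAND_V = 0.01  # ignore deviations smaller than 10 mV (assumed value)

def equalization_actions(cell_voltages):
    mean_v = sum(cell_voltages) / len(cell_voltages)
    actions = []
    for v in cell_voltages:
        if v > mean_v + DEADBAND_V:
            actions.append("discharge")   # shed charge from a high cell
        elif v < mean_v - DEADBAND_V:
            actions.append("charge")      # push charge into a low cell
        else:
            actions.append("idle")
    return actions

# Six series cells, as in the experimental pack described above:
print(equalization_actions([3.60, 3.70, 3.60, 3.60, 3.60, 3.50]))
# ['idle', 'discharge', 'idle', 'idle', 'idle', 'charge']
```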

  3. An Empirical Study of Personal Response Technology for Improving Attendance and Learning in a Large Class

    ERIC Educational Resources Information Center

    Shapiro, Amy

    2009-01-01

    Student evaluations of a large General Psychology course indicate that students enjoy the class a great deal, yet attendance is low. An experiment was conducted to evaluate a personal response system as a solution. Attendance rose by 30% as compared to extra credit as an inducement, but was equivalent to offering pop quizzes. Performance on test…

  4. Portraiture lens concept in a mobile phone camera

    NASA Astrophysics Data System (ADS)

    Sheil, Conor J.; Goncharov, Alexander V.

    2017-11-01

    A small form-factor lens was designed for the purpose of portraiture photography, compact enough for use within a smartphone casing. The current general requirement that mobile cameras have good all-round performance results in a typical, familiar, many-element design. Such designs have little room for improvement, in terms of the available degrees of freedom and highly demanding target metrics such as low f-number and wide field of view. However, the specific application of the current portraiture lens relaxed the requirement of an all-round high-performing lens, allowing improvement of certain aspects at the expense of others. With a main emphasis on reducing depth of field (DoF), the current design takes advantage of the simple geometrical relationship between DoF and pupil diameter. The system has a large aperture, while a reasonable f-number gives a relatively large focal length, requiring a catadioptric lens design with a double ray path; hence, field of view is reduced. Compared to typical mobile lenses, the large diameter reduces depth of field by a factor of four.
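
    The geometrical relationship the design exploits can be made concrete: for a subject distance s well inside the hyperfocal distance, DoF ≈ 2·c·s²/(f·D), so a larger entrance pupil D (or longer focal length f) shrinks depth of field. The sketch below uses this standard approximation with illustrative numbers, not the paper's actual lens parameters.

```python
# DoF ~ 2*c*s^2/(f*D): near-field approximation, with N = f/D substituted
# into DoF ~ 2*N*c*s^2/f^2. All numbers are illustrative assumptions.
def depth_of_field_m(focal_mm, pupil_mm, subject_m, coc_mm=0.005):
    f = focal_mm / 1000.0   # focal length in metres
    d = pupil_mm / 1000.0   # entrance pupil diameter in metres
    c = coc_mm / 1000.0     # circle of confusion in metres
    return 2 * c * subject_m ** 2 / (f * d)

typical  = depth_of_field_m(focal_mm=4.0, pupil_mm=2.0, subject_m=1.5)
portrait = depth_of_field_m(focal_mm=8.0, pupil_mm=8.0, subject_m=1.5)
print(round(typical / portrait, 1))  # 8.0: longer f and wider pupil -> shallower DoF
```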

  5. Control Laws for a Dual-Spin Stabilized Platform

    NASA Technical Reports Server (NTRS)

    Lim, K. B.; Moerder, D. D.

    2008-01-01

    This paper describes two attitude control laws suitable for atmospheric flight vehicles with a steady angular momentum bias in the vehicle yaw axis. This bias is assumed to be provided by an internal flywheel, and is introduced to enhance roll and pitch stiffness. The first control law is based on Lyapunov stability theory, and stability proofs are given. The second control law, which assumes that the angular momentum bias is large, is based on a classical PID control. It is shown that the large yaw-axis bias requires that the PI feedback component on the roll and pitch angle errors be cross-fed. Both control laws are applied to a vehicle simulation in the presence of disturbances for several values of yaw-axis angular momentum bias. It is seen that both control laws provide a significant improvement in attitude performance when the bias is sufficiently large, but the nonlinear control law is also able to provide improved performance for a small value of bias. This is important because the smaller bias corresponds to a smaller requirement for mass to be dedicated to the flywheel.
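
    The cross-fed PI structure mentioned above can be sketched schematically: with a large yaw-axis momentum bias, roll and pitch are gyroscopically coupled, so the PI feedback on each axis' angle error is also routed into the other axis' torque command. The gains, sign convention, and function name below are illustrative assumptions, not the paper's control law.

```python
# Schematic of a cross-fed PI attitude controller for a momentum-biased
# vehicle. This is a structural sketch only, not the published design.
def crossfed_pi_torques(roll_err, pitch_err, roll_int, pitch_int,
                        kp=1.0, ki=0.1, k_cross=0.5):
    # direct PI terms on each axis' attitude error and its integral
    u_roll_direct = kp * roll_err + ki * roll_int
    u_pitch_direct = kp * pitch_err + ki * pitch_int
    # cross-feed: each axis also reacts to the *other* axis' PI signal,
    # with opposite signs reflecting the gyroscopic coupling
    u_roll = u_roll_direct + k_cross * u_pitch_direct
    u_pitch = u_pitch_direct - k_cross * u_roll_direct
    return u_roll, u_pitch

# A roll error alone also commands a pitch torque through the cross-feed:
print(crossfed_pi_torques(0.1, 0.0, 0.0, 0.0))
```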

  6. Cross-flow turbines: progress report on physical and numerical model studies at large laboratory scale

    NASA Astrophysics Data System (ADS)

    Wosnik, Martin; Bachant, Peter

    2016-11-01

    Cross-flow turbines show potential in marine hydrokinetic (MHK) applications. A research focus is on accurately predicting device performance and wake evolution to improve turbine array layouts for maximizing overall power output, i.e., minimizing wake interference or taking advantage of constructive wake interaction. Experiments were carried out with large laboratory-scale cross-flow turbines of diameter D ~ O(1 m), using a turbine test bed in a large cross-section tow tank designed to achieve sufficiently high Reynolds numbers for the results to be Reynolds number independent with respect to turbine performance and wake statistics, such that they can be reliably extrapolated to full scale and used for model validation. Several turbines of varying solidity were employed, including the UNH Reference Vertical Axis Turbine (RVAT) and a 1:6 scale model of the DOE-Sandia Reference Model 2 (RM2) turbine. To improve parameterization in array simulations, an actuator line model (ALM) was developed to provide a computationally feasible method for simulating full turbine arrays inside Navier-Stokes models. Results are presented for the simulation of performance and wake dynamics of cross-flow turbines and compared with experiments and body-fitted mesh, blade-resolving CFD. Supported by NSF-CBET Grant 1150797, Sandia National Laboratories.

  7. Practice-based learning can improve osteoporosis care.

    PubMed

    Hess, Brian J; Johnston, Mary M; Iobst, William F; Lipner, Rebecca S

    2013-10-01

    To examine physician engagement in practice-based learning using a self-evaluation module to assess and improve their care of individuals with or at risk of osteoporosis. Retrospective cohort study. Internal medicine and subspecialty clinics. Eight hundred fifty U.S. physicians with time-limited certification in general internal medicine or a subspecialty. Performance rates on 23 process measures and seven practice system domain scores were obtained from the American Board of Internal Medicine (ABIM) Osteoporosis Practice Improvement Module (PIM), an Internet-based self-assessment module that physicians use to improve performance on one targeted measure. Physicians remeasured performance on their targeted measures by conducting another medical chart review. Variability in performance on measures was found, with observed differences between general internists, geriatricians, and rheumatologists. Some practice system elements were modestly associated with measure performance; the largest association was between providing patient-centered self-care support and documentation of calcium intake and vitamin D estimation and counseling (correlation coefficients from 0.20 to 0.28, Ps < .002). For all practice types, the most commonly selected measure targeted for improvement was documentation of vitamin D level (38% of physicians). On average, physicians reported significant and large increases in performance on measures targeted for improvement. Gaps exist in the quality of osteoporosis care, and physicians can apply practice-based learning using the ABIM PIM to take action to improve the quality of care. © 2013, Copyright the Authors Journal compilation © 2013, The American Geriatrics Society.

  8. Analysis and Design of Launch Vehicle Flight Control Systems

    NASA Technical Reports Server (NTRS)

    Wie, Bong; Du, Wei; Whorton, Mark

    2008-01-01

    This paper describes the fundamental principles of launch vehicle flight control analysis and design. In particular, the classical concept of "drift-minimum" and "load-minimum" control principles is re-examined and its performance and stability robustness with respect to modeling uncertainties and a gimbal angle constraint is discussed. It is shown that an additional feedback of angle-of-attack or lateral acceleration can significantly improve the overall performance and robustness, especially in the presence of unexpected large wind disturbance. Non-minimum-phase structural filtering of "unstably interacting" bending modes of large flexible launch vehicles is also shown to be effective and robust.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kisaki, M., E-mail: kisaki.masashi@LHD.nifs.ac.jp; Ikeda, K.; Osakabe, M.

    To improve the performance of negative-ion based neutral beam injection on the Large Helical Device, the accelerator was modified on the basis of numerical investigations. A field limiting ring was installed on the upper side of the grounded grid (GG) support, and a multi-slot GG was adopted instead of a multi-aperture GG. As a result, the voltage holding capability is improved and the heat load on the GG decreases by 40%. In addition, the arc efficiency is improved significantly simply by replacing the GG.

  10. A dissociation between engagement and learning: Enthusiastic instructions fail to reliably improve performance on a memory task.

    PubMed

    Motz, Benjamin A; de Leeuw, Joshua R; Carvalho, Paulo F; Liang, Kaley L; Goldstone, Robert L

    2017-01-01

    Despite widespread assertions that enthusiasm is an important quality of effective teaching, empirical research on the effect of enthusiasm on learning and memory is mixed and largely inconclusive. To help resolve these inconsistencies, we conducted a carefully-controlled laboratory experiment, investigating whether enthusiastic instructions for a memory task would improve recall accuracy. Scripted videos, either enthusiastic or neutral, were used to manipulate the delivery of task instructions. We also manipulated the sequence of learning items, replicating the spacing effect, a known cognitive technique for memory improvement. Although spaced study reliably improved test performance, we found no reliable effect of enthusiasm on memory performance across two experiments. We did, however, find that enthusiastic instructions caused participants to respond to more item prompts, leaving fewer test questions blank, an outcome typically associated with increased task motivation. We find no support for the popular claim that enthusiastic instruction will improve learning, although it may still improve engagement. This dissociation between motivation and learning is discussed, as well as its implications for education and future research on student learning.

  11. A dissociation between engagement and learning: Enthusiastic instructions fail to reliably improve performance on a memory task

    PubMed Central

    de Leeuw, Joshua R.; Carvalho, Paulo F.; Liang, Kaley L.; Goldstone, Robert L.

    2017-01-01

    Despite widespread assertions that enthusiasm is an important quality of effective teaching, empirical research on the effect of enthusiasm on learning and memory is mixed and largely inconclusive. To help resolve these inconsistencies, we conducted a carefully-controlled laboratory experiment, investigating whether enthusiastic instructions for a memory task would improve recall accuracy. Scripted videos, either enthusiastic or neutral, were used to manipulate the delivery of task instructions. We also manipulated the sequence of learning items, replicating the spacing effect, a known cognitive technique for memory improvement. Although spaced study reliably improved test performance, we found no reliable effect of enthusiasm on memory performance across two experiments. We did, however, find that enthusiastic instructions caused participants to respond to more item prompts, leaving fewer test questions blank, an outcome typically associated with increased task motivation. We find no support for the popular claim that enthusiastic instruction will improve learning, although it may still improve engagement. This dissociation between motivation and learning is discussed, as well as its implications for education and future research on student learning. PMID:28732087

  12. Development of a test rig for a helium twin-screw compressor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, B. M.; Hu, Z. J.; Zhang, P.

    2014-01-29

    A large helium cryogenic system is being developed for use in large science projects such as the International Thermonuclear Experimental Reactor (ITER), the Large Helical Device (LHD), and the Experimental Advanced Superconducting Tokamak (EAST). In this cryogenic system, a twin-screw compressor is a key component, so it is necessary to characterize the compressor's performance. To obtain the performance characteristics, a test rig for the compressor has been built. All the important performance parameters, including adiabatic efficiency, volumetric efficiency, oil-injection characteristics, and noise characteristics, can be acquired with the rig once sensors are installed in the test system. With the test performance, the helium twin-screw compressor can be evaluated, and the results can be used to improve the compressor design.

  13. The role of aluminum in slow sand filtration.

    PubMed

    Weber-Shirk, Monroe L; Chan, Kwok Loon

    2007-03-01

    Engineering enhancement of slow sand filtration has been an enigma in large part because the mechanisms responsible for particle removal have not been well characterized. The presumed role of biological processes in the filter ripening process nearly precluded the possibility of enhancing filter performance since interventions to enhance biological activity would have required decreasing the quality of the influent water. In previous work, we documented that an acid soluble polymer controls filter performance. The new understanding that particle removal is controlled in large part by physical chemical mechanisms has expanded the possibilities of engineering slow sand filter performance. Herein, we explore the role of naturally occurring aluminum as a ripening agent for slow sand filters and the possibility of using a low dose of alum to improve filter performance or to ripen slow sand filters.

  14. Evaluation philosophy for shuttle launched payloads

    NASA Technical Reports Server (NTRS)

    Heuser, R. E.

    1975-01-01

    Potential benefits of factory-to-pad testing include major cost savings and increased test effectiveness. Overall flight performance will also be improved. The factory-to-pad approach is compatible with space shuttle processing and the Large Space Telescope program.

  15. Source-gated transistors for order-of-magnitude performance improvements in thin-film digital circuits

    NASA Astrophysics Data System (ADS)

    Sporea, R. A.; Trainor, M. J.; Young, N. D.; Shannon, J. M.; Silva, S. R. P.

    2014-03-01

    Ultra-large-scale integrated (ULSI) circuits have benefited from successive refinements in device architecture, yielding enormous improvements in speed, power efficiency, and areal density. In large-area electronics (LAE), however, the basic building block, the thin-film field-effect transistor (TFT), has largely remained static. Now, a device concept with fundamentally different operation, the source-gated transistor (SGT), opens the possibility of unprecedented functionality in future low-cost LAE. With its simple structure and operational characteristics of low saturation voltage, stability under electrical stress, and large intrinsic gain, the SGT is ideally suited for LAE analog applications. Here, we show, using measurements on polysilicon devices, that these characteristics lead to substantial improvements in gain, noise margin, power-delay product, and overall circuit robustness in digital SGT-based designs. These findings have far-reaching consequences, as LAE will form the technological basis for a variety of future developments in the biomedical, civil engineering, remote sensing, and artificial skin areas, as well as in wearable and ubiquitous computing and lightweight applications for space exploration.

  16. Source-gated transistors for order-of-magnitude performance improvements in thin-film digital circuits

    PubMed Central

    Sporea, R. A.; Trainor, M. J.; Young, N. D.; Shannon, J. M.; Silva, S. R. P.

    2014-01-01

    Ultra-large-scale integrated (ULSI) circuits have benefited from successive refinements in device architecture, yielding enormous improvements in speed, power efficiency, and areal density. In large-area electronics (LAE), however, the basic building block, the thin-film field-effect transistor (TFT), has largely remained static. Now, a device concept with fundamentally different operation, the source-gated transistor (SGT), opens the possibility of unprecedented functionality in future low-cost LAE. With its simple structure and operational characteristics of low saturation voltage, stability under electrical stress, and large intrinsic gain, the SGT is ideally suited for LAE analog applications. Here, we show, using measurements on polysilicon devices, that these characteristics lead to substantial improvements in gain, noise margin, power-delay product, and overall circuit robustness in digital SGT-based designs. These findings have far-reaching consequences, as LAE will form the technological basis for a variety of future developments in the biomedical, civil engineering, remote sensing, and artificial skin areas, as well as in wearable and ubiquitous computing and lightweight applications for space exploration. PMID:24599023

  17. Development of superconductor magnetic suspension and balance prototype facility for studying the feasibility of applying this technique to large scale aerodynamic testing

    NASA Technical Reports Server (NTRS)

    Zapata, R. N.; Humphris, R. R.; Henderson, K. C.

    1975-01-01

    The unique design and operational characteristics of a prototype magnetic suspension and balance facility which utilizes superconductor technology are described and discussed from the point of view of scalability to large sizes. The successful experimental demonstration of the feasibility of this new magnetic suspension concept at the University of Virginia, together with the success of the cryogenic wind-tunnel concept developed at Langley Research Center, appears to have finally opened the way to clean-tunnel, high-Reynolds-number aerodynamic testing. Results of calculations corresponding to a two-step design extrapolation from the observed performance of the prototype magnetic suspension system to a system compatible with the projected cryogenic transonic research tunnel are presented to give an order-of-magnitude estimate of expected performance characteristics. Research areas where progress should lead to improved design and performance of large facilities are discussed.

  18. Distributed control of large space antennas

    NASA Technical Reports Server (NTRS)

    Cameron, J. M.; Hamidi, M.; Lin, Y. H.; Wang, S. J.

    1983-01-01

    A systematic way to choose control design parameters and to evaluate performance for large space antennas is presented. The structural dynamics and control properties for a hoop-and-column antenna and a wrap-rib antenna are characterized. Some results on the effects of model parameter uncertainties on stability, surface accuracy, and pointing errors are presented. Critical dynamics and control problems for these antenna configurations are identified and potential solutions are discussed. It was concluded that structural uncertainties and model error can cause serious performance deterioration and can even destabilize the controllers. For the hoop-and-column antenna, the large hoop, the long mast, and the lack of stiffness between the two substructures result in low structural frequencies. Performance can be improved if this design can be strengthened. The two-site control system is more robust than either single-site control system for the hoop-and-column antenna.

  19. Sub-block motion derivation for merge mode in HEVC

    NASA Astrophysics Data System (ADS)

    Chien, Wei-Jung; Chen, Ying; Chen, Jianle; Zhang, Li; Karczewicz, Marta; Li, Xiang

    2016-09-01

    The new state-of-the-art video coding standard, H.265/HEVC, was finalized in 2013 and achieves roughly 50% bit-rate savings compared to its predecessor, H.264/MPEG-4 AVC. In this paper, two additional merge candidates, the advanced temporal motion vector predictor and the spatial-temporal motion vector predictor, are developed to improve the motion information prediction scheme within the HEVC structure. The proposed method allows each Prediction Unit (PU) to fetch multiple sets of motion information from multiple blocks smaller than the current PU. By splitting a large PU into sub-PUs and filling in motion information for all the sub-PUs of the large PU, the signaling cost of motion information can be reduced. This paper describes the above-mentioned techniques in detail and evaluates their coding performance based on the common test conditions used during HEVC development. Simulation results show that a 2.4% performance improvement over HEVC can be achieved.
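
    The sub-PU idea described above, splitting one prediction unit into smaller blocks that each fetch their own motion information, can be sketched as follows. This is an illustrative simplification, not the HEVC-specified derivation: the coordinate convention, sub-block size, and the `motion_field` callback are all hypothetical.

    ```python
    def sub_pu_motion(pu_x, pu_y, pu_w, pu_h, sub_size, motion_field):
        """Split a PU into sub_size x sub_size sub-PUs and fetch, for each
        sub-PU, the motion vector stored at its centre position in a
        collocated motion field (supplied here as a callback)."""
        mvs = {}
        for sy in range(pu_h // sub_size):
            for sx in range(pu_w // sub_size):
                centre_x = pu_x + sx * sub_size + sub_size // 2
                centre_y = pu_y + sy * sub_size + sub_size // 2
                mvs[(sx, sy)] = motion_field(centre_x, centre_y)
        return mvs

    # Toy collocated field: the motion vector depends on the 8x8 block index.
    toy_field = lambda x, y: (x // 8, y // 8)
    per_sub_pu = sub_pu_motion(0, 0, 16, 16, 8, toy_field)
    ```

    A 16x16 PU split into 8x8 sub-PUs yields four motion vectors instead of one, which is the source of the prediction improvement: finer-grained motion is obtained without signaling extra motion data.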

  20. Single-crystalline self-branched anatase titania nanowires for dye-sensitized solar cells

    NASA Astrophysics Data System (ADS)

    Li, Zhenquan; Yang, Huang; Wu, Fei; Fu, Jianxun; Wang, Linjun; Yang, Weiguang

    2017-03-01

    The morphology of anatase titania plays an important role in improving photovoltaic performance in dye-sensitized solar cells. In this work, single-crystalline self-branched anatase TiO2 nanowires have been synthesized by a hydrothermal method using TBAH and CTAB as morphology-controlling agents. The obtained self-branched TiO2 nanowires are dominated by a large percentage of (010) facets. The photovoltaic conversion efficiency (6.37%) of a dye-sensitized solar cell (DSSC) based on the self-branched TiO2 nanowires shows a significant improvement (26.6%) compared to that of P25 TiO2 (5.03%). The enhanced performance of the self-branched TiO2 nanowire-based DSSC is due to the large percentage of exposed (010) facets, which have strong dye adsorption capacity, and to the effective charge transport of the self-branched 1D nanostructures.

  1. A small-gap electrostatic micro-actuator for large deflections

    PubMed Central

    Conrad, Holger; Schenk, Harald; Kaiser, Bert; Langa, Sergiu; Gaudet, Matthieu; Schimmanz, Klaus; Stolz, Michael; Lenz, Miriam

    2015-01-01

    Common quasi-static electrostatic micro-actuators have significant limitations in deflection due to electrode separation and unstable drive regions. State-of-the-art electrostatic actuators achieve maximum deflections of approximately one third of the electrode separation. Large electrode separation and high driving voltages are normally required to achieve large actuator movements. Here we report on an electrostatic actuator class, fabricated in a CMOS-compatible process, which allows high deflections with small electrode separation. The concept presented makes the huge electrostatic forces within nanometre-scale electrode separations accessible for large deflections. Electrostatic actuations larger than the electrode separation were measured. An analytical theory is compared with measurement and simulation results and enables a closer understanding of these actuators. The scaling behaviour discussed indicates significant future improvement in actuator deflection. The presented driving concept enables the investigation and development of novel micro systems with a high potential for improved device and system performance. PMID:26655557

  2. Lockheed L-1011 Test Station on-board in support of the Adaptive Performance Optimization flight res

    NASA Technical Reports Server (NTRS)

    1997-01-01

    This console and its complement of computers, monitors, and communications equipment make up the Research Engineering Test Station, the nerve center for a new aerodynamics experiment being conducted by NASA's Dryden Flight Research Center, Edwards, California. The equipment is installed on a modified Lockheed L-1011 Tristar jetliner operated by Orbital Sciences Corp., of Dulles, Va., for Dryden's Adaptive Performance Optimization project. The experiment seeks to improve the efficiency of long-range jetliners by using small movements of the ailerons to improve the aerodynamics of the wing at cruise conditions. About a dozen research flights in the Adaptive Performance Optimization project are planned over the next two to three years. Improving the aerodynamic efficiency should result in equivalent reductions in fuel usage and costs for airlines operating large, wide-bodied jetliners.

  3. Automatic Information Processing and High Performance Skills: Individual Differences and Mechanisms of Performance Improvement in Search-Detection and Complex Task

    DTIC Science & Technology

    1992-09-01

    abilities is fit along with the autoregressive process. Initially, the influences on search performance of within-group age and sex were included as control... Results: Performance/Ability Structure Measurement Model: Ability Structure. The correlations between all the ability measures, age, and sex are... subsequent analyses for young adults. Age and sex were included as control variables. There was an age range of 15 years; this range is sufficiently large that...

  4. DOMe: A deduplication optimization method for the NewSQL database backups

    PubMed Central

    Wang, Longxiang; Zhu, Zhengdong; Zhang, Xingjun; Wang, Yinfeng

    2017-01-01

    Reducing duplicated data in database backups is an important application scenario for data-deduplication technology. NewSQL is an emerging class of database system that is now used more and more widely. NewSQL systems need to improve data reliability by periodically backing up in-memory data, which produces a great deal of duplicated data. Traditional deduplication methods are not optimized for the NewSQL server system and cannot take full advantage of hardware resources to optimize deduplication performance. Recent research has pointed out that future NewSQL servers will have thousands of CPU cores, large DRAM, and huge NVRAM; how to utilize these hardware resources to optimize deduplication performance is therefore an important issue. To solve this problem, we propose a deduplication optimization method (DOMe) for NewSQL system backup. To take advantage of the large number of CPU cores in the NewSQL server, DOMe parallelizes the deduplication method based on the fork-join framework. The fingerprint index, the key data structure in the deduplication process, is implemented as a pure in-memory hash table, which makes full use of the large DRAM in the NewSQL system and eliminates the fingerprint-index performance bottleneck of traditional deduplication methods. H-Store is used as a typical NewSQL database system to implement the DOMe method, and DOMe is evaluated experimentally with two representative backup data sets. The experimental results show that: 1) DOMe can reduce duplicated NewSQL backup data; 2) DOMe significantly improves deduplication performance by parallelizing CDC algorithms: where the theoretical speedup ratio of the server is 20.8, DOMe achieves a speedup of up to 18; and 3) DOMe improves deduplication throughput by 1.5 times through the pure in-memory index optimization. PMID:29049307
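
    The two core ideas in this abstract, content-defined chunking plus a pure in-memory hash-table fingerprint index, can be sketched minimally as below. This is an assumption-laden illustration, not the DOMe implementation: the chunking parameters, rolling hash, and function names are invented for the example.

    ```python
    import hashlib

    def cdc_chunks(data: bytes, mask: int = 0x3F, min_size: int = 16):
        """Naive content-defined chunking: cut a chunk whenever the low
        bits of a rolling byte hash match a mask, so identical content
        produces identical chunk boundaries regardless of offset."""
        chunks, start, rolling = [], 0, 0
        for i, b in enumerate(data):
            rolling = ((rolling << 1) + b) & 0xFFFFFFFF
            if i - start + 1 >= min_size and (rolling & mask) == mask:
                chunks.append(data[start:i + 1])
                start = i + 1
        if start < len(data):
            chunks.append(data[start:])
        return chunks

    def deduplicate(backup: bytes, index: dict) -> int:
        """Store only chunks whose SHA-256 fingerprint is absent from the
        in-memory hash-table index; return bytes actually written."""
        written = 0
        for chunk in cdc_chunks(backup):
            fp = hashlib.sha256(chunk).hexdigest()
            if fp not in index:
                index[fp] = chunk
                written += len(chunk)
        return written

    index = {}
    first = deduplicate(b"abcdefgh" * 64, index)   # first backup: new data
    second = deduplicate(b"abcdefgh" * 64, index)  # identical backup: all duplicate
    ```

    Because chunking is deterministic, the second identical backup writes zero new bytes; DOMe's contribution is running the chunking/fingerprinting in parallel via fork-join and keeping this index entirely in DRAM.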

  5. Nursery Cultural Practices and Morphological Attributes of Longleaf Pine Bare-Root Stock as Indicators of Early Field Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glyndon E. Hatchell, Research Forester, Retired Institute for Mycorrhizal Research and Development Athens, Georgia and H. David Muse, Professor Department of Mathematics University of North Alabama Florence, Alabama

    1990-02-01

    In a large study of morphological attributes of longleaf pine nursery stock at the Savannah River Site, of the various attributes measured, only the number of lateral roots and seedling diameter were related to field performance. Lateral-root pruning in the nursery also improved performance. Both survival and growth during the first two years were strongly correlated with larger stem diameter and greater root-system development.

  6. Optimizing NEURON Simulation Environment Using Remote Memory Access with Recursive Doubling on Distributed Memory Systems.

    PubMed

    Shehzad, Danish; Bozkuş, Zeki

    2016-01-01

    The increasing complexity of neuronal network models has escalated efforts to make the NEURON simulation environment efficient. Computational neuroscientists divide the equations into subnets among multiple processors to achieve better hardware performance. On parallel machines, interprocessor spike exchange consumes a large fraction of the overall simulation time. In NEURON, the Message Passing Interface (MPI) is used for communication between processors, and the MPI_Allgather collective is exercised for spike exchange after each interval across distributed-memory systems. Increasing the number of processors yields concurrency and better performance, but it adversely affects MPI_Allgather, increasing communication time between processors. This necessitates an improved communication methodology to decrease spike-exchange time over distributed-memory systems. This work improves the MPI_Allgather method using Remote Memory Access (RMA) by moving from two-sided to one-sided communication, and a recursive-doubling mechanism achieves efficient communication between the processors in a precise number of steps. This approach enhanced communication concurrency and improved overall runtime, making NEURON more efficient for simulating large neuronal network models.
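
    The recursive-doubling pattern mentioned above can be illustrated without MPI: in round k, each rank exchanges its accumulated buffer with the rank whose id differs in bit k, so p ranks share all data after log2(p) pairwise exchanges. The sketch below simulates only the communication pattern; it is not NEURON's RMA implementation, and all names are illustrative.

    ```python
    import math

    def recursive_doubling_allgather(per_rank_data):
        """Simulate recursive-doubling allgather: in round k, rank r
        exchanges its accumulated buffer with rank r XOR 2**k, so every
        rank holds all p items after log2(p) rounds."""
        p = len(per_rank_data)
        assert p and p & (p - 1) == 0, "p must be a power of two"
        bufs = [{r: per_rank_data[r]} for r in range(p)]
        for k in range(int(math.log2(p))):
            new = [dict(b) for b in bufs]
            for r in range(p):
                partner = r ^ (1 << k)
                new[r].update(bufs[partner])  # one pairwise exchange
            bufs = new
        return bufs

    # Eight "ranks", each starting with only its own spike buffer.
    spikes = [[f"spike-{r}"] for r in range(8)]
    result = recursive_doubling_allgather(spikes)
    ```

    With 8 ranks, every rank holds all 8 spike buffers after only 3 exchange rounds, which is why the pattern scales as O(log p) rounds rather than O(p).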

  7. Optimizing NEURON Simulation Environment Using Remote Memory Access with Recursive Doubling on Distributed Memory Systems

    PubMed Central

    Bozkuş, Zeki

    2016-01-01

    The increasing complexity of neuronal network models has escalated efforts to make the NEURON simulation environment efficient. Computational neuroscientists divide the equations into subnets among multiple processors to achieve better hardware performance. On parallel machines, interprocessor spike exchange consumes a large fraction of the overall simulation time. In NEURON, the Message Passing Interface (MPI) is used for communication between processors, and the MPI_Allgather collective is exercised for spike exchange after each interval across distributed-memory systems. Increasing the number of processors yields concurrency and better performance, but it adversely affects MPI_Allgather, increasing communication time between processors. This necessitates an improved communication methodology to decrease spike-exchange time over distributed-memory systems. This work improves the MPI_Allgather method using Remote Memory Access (RMA) by moving from two-sided to one-sided communication, and a recursive-doubling mechanism achieves efficient communication between the processors in a precise number of steps. This approach enhanced communication concurrency and improved overall runtime, making NEURON more efficient for simulating large neuronal network models. PMID:27413363

  8. Validation of SplitVectors Encoding for Quantitative Visualization of Large-Magnitude-Range Vector Fields

    PubMed Central

    Zhao, Henan; Bryant, Garnett W.; Griffin, Wesley; Terrill, Judith E.; Chen, Jian

    2017-01-01

    We designed and evaluated SplitVectors, a new vector field display approach to help scientists perform new discrimination tasks on large-magnitude-range scientific data shown in three-dimensional (3D) visualization environments. SplitVectors uses scientific notation to display vector magnitude, thus improving legibility. We present an empirical study comparing the SplitVectors approach with three other approaches - direct linear representation, logarithmic, and text display commonly used in scientific visualizations. Twenty participants performed three domain analysis tasks: reading numerical values (a discrimination task), finding the ratio between values (a discrimination task), and finding the larger of two vectors (a pattern detection task). Participants used both mono and stereo conditions. Our results suggest the following: (1) SplitVectors improve accuracy by about 10 times compared to linear mapping and by four times to logarithmic in discrimination tasks; (2) SplitVectors have no significant differences from the textual display approach, but reduce cluttering in the scene; (3) SplitVectors and textual display are less sensitive to data scale than linear and logarithmic approaches; (4) using logarithmic can be problematic as participants' confidence was as high as directly reading from the textual display, but their accuracy was poor; and (5) Stereoscopy improved performance, especially in more challenging discrimination tasks. PMID:28113469
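
    The scientific-notation encoding at the heart of SplitVectors, a magnitude displayed as a mantissa ("digit") plus an exponent, can be sketched as below. The function name and return convention are assumptions for illustration, not the authors' code.

    ```python
    import math

    def split_magnitude(m: float):
        """Split a non-negative magnitude into (digit, exponent) with
        m == digit * 10**exponent and 1 <= digit < 10, so one glyph can
        encode the digit and a second glyph the exponent."""
        if m == 0:
            return 0.0, 0
        exponent = math.floor(math.log10(m))
        return m / 10 ** exponent, exponent

    digit, exponent = split_magnitude(3400.0)  # 3.4 * 10**3
    ```

    Splitting the magnitude this way is what restores legibility over large magnitude ranges: a value of 3400 and a value of 3.4 differ only in the exponent glyph, so both remain readable in the same scene.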

  9. Validation of SplitVectors Encoding for Quantitative Visualization of Large-Magnitude-Range Vector Fields.

    PubMed

    Henan Zhao; Bryant, Garnett W; Griffin, Wesley; Terrill, Judith E; Jian Chen

    2017-06-01

    We designed and evaluated SplitVectors, a new vector field display approach to help scientists perform new discrimination tasks on large-magnitude-range scientific data shown in three-dimensional (3D) visualization environments. SplitVectors uses scientific notation to display vector magnitude, thus improving legibility. We present an empirical study comparing the SplitVectors approach with three other approaches - direct linear representation, logarithmic, and text display commonly used in scientific visualizations. Twenty participants performed three domain analysis tasks: reading numerical values (a discrimination task), finding the ratio between values (a discrimination task), and finding the larger of two vectors (a pattern detection task). Participants used both mono and stereo conditions. Our results suggest the following: (1) SplitVectors improve accuracy by about 10 times compared to linear mapping and by four times to logarithmic in discrimination tasks; (2) SplitVectors have no significant differences from the textual display approach, but reduce cluttering in the scene; (3) SplitVectors and textual display are less sensitive to data scale than linear and logarithmic approaches; (4) using logarithmic can be problematic as participants' confidence was as high as directly reading from the textual display, but their accuracy was poor; and (5) Stereoscopy improved performance, especially in more challenging discrimination tasks.

  10. Improvement of Automated POST Case Success Rate Using Support Vector Machines

    NASA Technical Reports Server (NTRS)

    Zwack, Mathew R.; Dees, Patrick D.

    2017-01-01

    During early conceptual design of complex systems, concept down-selection can have a large impact on program life-cycle cost. Any concepts selected during early design inherently commit program costs and affect the overall probability of program success. For this reason it is important to consider as large a design space as possible in order to better inform the down-selection process. For conceptual design of launch vehicles, trajectory analysis and optimization often present the largest obstacle to evaluating large trade spaces, owing to the sensitivity of the trajectory discipline to changes in all other aspects of the vehicle design. Small deltas in the performance of other subsystems can result in relatively large fluctuations in the ascent trajectory because the solution space is nonlinear and multi-modal. To help capture large design spaces for new launch vehicles, the authors have performed previous work seeking to automate execution of the industry-standard tool, Program to Optimize Simulated Trajectories (POST). This work initially focused on implementing analyst heuristics to enable closure of cases in an automated fashion, with the goal of applying the concepts of design of experiments (DOE) and surrogate modeling to enable near-instantaneous throughput of vehicle cases [3]. As noted in [4], work was then completed to improve the DOE process by utilizing a graph-theory-based approach to connect similar design points.

  11. Moving Matters: The Causal Effect of Moving Schools on Student Performance

    ERIC Educational Resources Information Center

    Schwartz, Amy Ellen; Stiefel, Leanna; Cordes, Sarah A.

    2017-01-01

    Policy makers and analysts often view the reduction of student mobility across schools as a way to improve academic performance. Prior work indicates that children do worse in the year of a school move, but has been largely unsuccessful in isolating the causal effects of mobility. We use longitudinal data on students in New York City public…

  12. Recent developments in user-job management with Ganga

    NASA Astrophysics Data System (ADS)

    Currie, R.; Elmsheuser, J.; Fay, R.; Owen, P. H.; Richards, A.; Slater, M.; Sutcliffe, W.; Williams, M.

    2015-12-01

    The Ganga project was originally developed for use by LHC experiments and was used extensively throughout Run 1 in both LHCb and ATLAS. This document describes some of the most recent developments within the Ganga project. There have been improvements in the handling of large-scale computational tasks in the form of a new GangaTasks infrastructure. Improvements in file handling through a new IGangaFile interface make handling files largely transparent to the end user. In addition, the performance and usability of Ganga have both been addressed through the development of a new queues system that allows parallel processing of job-related tasks.

  13. A Historical Review of Cermet Fuel Development and the Engine Performance Implications

    NASA Technical Reports Server (NTRS)

    Stewart, Mark E.

    2015-01-01

    To better understand cermet engine performance, historical material-development reports were examined with respect to two issues: the high vaporization rate of UO2 and the high-temperature chemical stability of UO2. Cladding and chemical stabilizers each result in large, order-of-magnitude improvements in high-temperature performance. Few samples were tested above 2770 K, and results above 2770 K are ambiguous; contemporary testing may clarify performance. The review covers cermet sample testing during the NERVA/Rover era, including important properties (melting temperature, vaporization rate, strength, brittle-to-ductile transition), cermet sample test results, engine performance, and peak temperature location.

  14. Performance and accountability : challenges facing the Department of Transportation

    DOT National Transportation Integrated Search

    2001-02-14

    For surface transportation safety, DOT continues to face challenges in improving the safety of highways and pipelines. While the Department of Transportation appears to be making progress on some initiatives to reduce the number of large truck crashes...

  15. Hysteroscopic sterilization using a virtual reality simulator: assessment of learning curve.

    PubMed

    Janse, Juliënne A; Goedegebuure, Ruben S A; Veersema, Sebastiaan; Broekmans, Frank J M; Schreuder, Henk W R

    2013-01-01

    To assess the learning curve using a virtual reality simulator for hysteroscopic sterilization with the Essure method. Prospective multicenter study (Canadian Task Force classification II-2). University and teaching hospital in the Netherlands. Thirty novices (medical students) and five experts (gynecologists who had performed >150 Essure sterilization procedures). All participants performed nine repetitions of bilateral Essure placement on the simulator. Novices returned after 2 weeks and performed a second series of five repetitions to assess retention of skills. Structured observations on performance using the Global Rating Scale and parameters derived from the simulator provided measurements for analysis. The learning curve is represented by improvement per procedure. Two-way repeated-measures analysis of variance was used to analyze learning curves. Effect size (ES) was calculated to express the practical significance of the results (ES ≥ 0.50 indicates a large learning effect). For all parameters, significant improvements were found in novice performance within nine repetitions. Large learning effects were established for six of eight parameters (p < .001; ES, 0.50-0.96). Novices approached expert level within 9 to 14 repetitions. The learning curve established in this study endorses future implementation of the simulator in curricula on hysteroscopic skill acquisition for clinicians who are interested in learning this sterilization technique. Copyright © 2013 AAGL. Published by Elsevier Inc. All rights reserved.

  16. Increases in lower-body strength transfer positively to sprint performance: a systematic review with meta-analysis.

    PubMed

    Seitz, Laurent B; Reyes, Alvaro; Tran, Tai T; Saez de Villarreal, Eduardo; Haff, G Gregory

    2014-12-01

    Although lower-body strength is correlated with sprint performance, whether increases in lower-body strength transfer positively to sprint performance remains unclear. This meta-analysis determined whether increases in lower-body strength (measured with the free-weight back squat exercise) transfer positively to sprint performance, and identified the effects of various subject characteristics and resistance-training variables on the magnitude of sprint improvement. A computerized search was conducted in ADONIS, ERIC, SPORTDiscus, EBSCOhost, Google Scholar, MEDLINE and PubMed databases, and references of original studies and reviews were searched for further relevant studies. The analysis comprised 510 subjects and 85 effect sizes (ESs), nested with 26 experimental and 11 control groups and 15 studies. There is a transfer between increases in lower-body strength and sprint performance as indicated by a very large significant correlation (r = -0.77; p = 0.0001) between squat strength ES and sprint ES. Additionally, the magnitude of sprint improvement is affected by the level of practice (p = 0.03) and body mass (r = 0.35; p = 0.011) of the subject, the frequency of resistance-training sessions per week (r = 0.50; p = 0.001) and the rest interval between sets of resistance-training exercises (r = -0.47; p ≤ 0.001). Conversely, the magnitude of sprint improvement is not affected by the athlete's age (p = 0.86) and height (p = 0.08), the resistance-training methods used through the training intervention (p = 0.06), average load intensity [% of 1 repetition maximum (RM)] used during the resistance-training sessions (p = 0.34), training program duration (p = 0.16), number of exercises per session (p = 0.16), number of sets per exercise (p = 0.06) and number of repetitions per set (p = 0.48). Increases in lower-body strength transfer positively to sprint performance. The magnitude of sprint improvement is affected by numerous subject characteristics and resistance-training variables, but the large difference in number of ESs available should be taken into consideration. Overall, the reported improvement in sprint performance (sprint ES = -0.87, mean sprint improvement = 3.11%) resulting from resistance training is of practical relevance for coaches and athletes in sport activities requiring high levels of speed.

  17. Engineering MoSx/Ti/InP Hybrid Photocathode for Improved Solar Hydrogen Production

    NASA Astrophysics Data System (ADS)

    Li, Qiang; Zheng, Maojun; Zhong, Miao; Ma, Liguo; Wang, Faze; Ma, Li; Shen, Wenzhong

    2016-07-01

    Due to its direct band gap of ~1.35 eV, appropriate energy band-edge positions, and low surface-recombination velocity, p-type InP has attracted considerable attention as a promising photocathode material for solar hydrogen generation. However, challenges remain with p-type InP for achieving high and stable photoelectrochemical (PEC) performances. Here, we demonstrate that surface modifications of InP photocathodes with Ti thin layers and amorphous MoSx nanoparticles can remarkably improve their PEC performances. A high photocurrent density with an improved PEC onset potential is obtained. Electrochemical impedance analyses reveal that the largely improved PEC performance of MoSx/Ti/InP is attributed to the reduced charge-transfer resistance and the increased band bending at the MoSx/Ti/InP/electrolyte interface. In addition, the MoSx/Ti/InP photocathodes function stably for PEC water reduction under continuous light illumination over 2 h. Our study demonstrates an effective approach to develop high-PEC-performance InP photocathodes towards stable solar hydrogen production.

  18. Current state and future direction of computer systems at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Rogers, James L. (Editor); Tucker, Jerry H. (Editor)

    1992-01-01

Computer systems have advanced at a rate unmatched by any other area of technology. As performance has dramatically increased, there has been an equally dramatic reduction in cost. This constant cost-performance improvement has made computer systems pervasive in virtually all areas of technology, and is due primarily to advances in microelectronics. Most people are now convinced that the new generation of supercomputers will be built using a large number (possibly thousands) of high-performance microprocessors. Although the spectacular improvements in computer systems have come about because of these hardware advances, there has also been a steady improvement in software techniques. In an effort to understand how these hardware and software advances will affect research at NASA LaRC, the Computer Systems Technical Committee drafted this white paper to examine the current state and possible future directions of computer systems at the Center. This paper discusses selected important areas of computer systems, including real-time systems, embedded systems, high-performance computing, distributed computing networks, data acquisition systems, artificial intelligence, and visualization.

  19. Engineering MoSx/Ti/InP Hybrid Photocathode for Improved Solar Hydrogen Production

    PubMed Central

    Li, Qiang; Zheng, Maojun; Zhong, Miao; Ma, Liguo; Wang, Faze; Ma, Li; Shen, Wenzhong

    2016-01-01

    Due to its direct band gap of ~1.35 eV, appropriate energy band-edge positions, and low surface-recombination velocity, p-type InP has attracted considerable attention as a promising photocathode material for solar hydrogen generation. However, challenges remain with p-type InP for achieving high and stable photoelectrochemical (PEC) performances. Here, we demonstrate that surface modifications of InP photocathodes with Ti thin layers and amorphous MoSx nanoparticles can remarkably improve their PEC performances. A high photocurrent density with an improved PEC onset potential is obtained. Electrochemical impedance analyses reveal that the largely improved PEC performance of MoSx/Ti/InP is attributed to the reduced charge-transfer resistance and the increased band bending at the MoSx/Ti/InP/electrolyte interface. In addition, the MoSx/Ti/InP photocathodes function stably for PEC water reduction under continuous light illumination over 2 h. Our study demonstrates an effective approach to develop high-PEC-performance InP photocathodes towards stable solar hydrogen production. PMID:27431993

  20. Engineering MoSx/Ti/InP Hybrid Photocathode for Improved Solar Hydrogen Production.

    PubMed

    Li, Qiang; Zheng, Maojun; Zhong, Miao; Ma, Liguo; Wang, Faze; Ma, Li; Shen, Wenzhong

    2016-07-19

    Due to its direct band gap of ~1.35 eV, appropriate energy band-edge positions, and low surface-recombination velocity, p-type InP has attracted considerable attention as a promising photocathode material for solar hydrogen generation. However, challenges remain with p-type InP for achieving high and stable photoelectrochemical (PEC) performances. Here, we demonstrate that surface modifications of InP photocathodes with Ti thin layers and amorphous MoSx nanoparticles can remarkably improve their PEC performances. A high photocurrent density with an improved PEC onset potential is obtained. Electrochemical impedance analyses reveal that the largely improved PEC performance of MoSx/Ti/InP is attributed to the reduced charge-transfer resistance and the increased band bending at the MoSx/Ti/InP/electrolyte interface. In addition, the MoSx/Ti/InP photocathodes function stably for PEC water reduction under continuous light illumination over 2 h. Our study demonstrates an effective approach to develop high-PEC-performance InP photocathodes towards stable solar hydrogen production.

  1. Working memory plasticity and aging.

    PubMed

    Rhodes, Rebecca E; Katz, Benjamin

    2017-02-01

The present research explores how the trajectory of learning on a working memory task changes throughout the life span, and whether gains in working memory performance are exclusively a question of initial working memory capacity (WMC) or whether age exerts an independent effect. In a large, cross-sectional study of younger, middle-aged, and older adults, we examined learning on a widely used working memory task, the dual n-back task, over 20 sessions of practice. We found that, while all age groups improved on the task, older adults demonstrated less improvement and also reached a lower asymptotic maximum performance than younger adults. After controlling for initial WMC, we found that age exerted independent effects on training gains and asymptotic performance; older adults tended to improve less and reached lower levels of performance than younger adults. The difference between younger and older adults' rates of learning depended in part on initial WMC. These results suggest that age-related effects on working memory include not only effects on capacity, but also on plasticity and the ability to improve on a task. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  2. Electrolytes for Use in High Energy Lithium-Ion Batteries with Wide Operating Temperature Range

    NASA Technical Reports Server (NTRS)

    Smart, Marshall C.; Ratnakumar, B. V.; West, W. C.; Whitcanack, L. D.; Huang, C.; Soler, J.; Krause, F. C.

    2011-01-01

The objectives of this work are to: (1) develop advanced Li-ion electrolytes that enable cell operation over a wide temperature range (i.e., -30 to +60 °C); (2) improve the high-temperature stability and lifetime characteristics of wide-operating-temperature electrolytes; (3) improve the high-voltage stability of these candidate electrolyte systems to enable operation up to 5 V with high-specific-energy cathode materials; (4) define the performance limitations at the low- and high-temperature extremes, as well as the life-limiting processes; and (5) demonstrate the performance of advanced electrolytes in large-capacity prototype cells.

  3. A Historical Review of Cermet Fuel Development and the Engine Performance Implications

    NASA Technical Reports Server (NTRS)

    Stewart, Mark E. M.

    2015-01-01

This paper reviews test data for cermet fuel samples developed in the 1960s to better quantify Nuclear Thermal Propulsion (NTP) cermet engine performance and to better understand contemporary fuel testing results. Over 200 cermet (W-UO2) samples were tested by thermal cycling to 2500 °C (2770 K) in hydrogen. The data indicate two issues at high temperatures: the vaporization rate of UO2 and the chemical stability of UO2. The data show that cladding and chemical stabilizers each result in large, order-of-magnitude improvements in high-temperature performance, while other approaches yield smaller, incremental improvements. Data are very limited above 2770 K, which complicates predictions of engine performance at high Isp. The paper considers how this material performance data translates into engine performance. In particular, the location of maximum temperature within the fuel element and the effect of heat deposition rate are examined.

  4. Design control system of telescope force actuators based on WLAN

    NASA Astrophysics Data System (ADS)

    Shuai, Xiaoying; Zhang, Zhenchao

    2010-05-01

With the development of technologies for automatic control, telescopes, computers, networking, and communication, the control systems of modern large and extremely large telescopes have become more and more complicated, especially with the application of active optics. A large telescope based on active optics may contain an enormous number of force actuators. This poses a challenge to traditional control systems based on wired networks, which are difficult to manage, occupy significant space, and lack flexibility. A wireless network can overcome these disadvantages of a wired network. This paper presents a control system for telescope force actuators based on WLAN (WFCS) and designs the control-system framework of the WFCS. To improve real-time performance, we developed the force-actuator control software in Linux. Finally, this paper discusses improvements to the real-time behavior of the WFCS and possible future enhancements.

  5. Effect of recent popularity on heat-conduction based recommendation models

    NASA Astrophysics Data System (ADS)

    Li, Wen-Jun; Dong, Qiang; Shi, Yang-Bo; Fu, Yan; He, Jia-Lin

    2017-05-01

    Accuracy and diversity are two important measures in evaluating the performance of recommender systems. It has been demonstrated that the recommendation model inspired by the heat conduction process has high diversity yet low accuracy. Many variants have been introduced to improve the accuracy while keeping high diversity, most of which regard the current node-degree of an item as its popularity. However in this way, a few outdated items of large degree may be recommended to an enormous number of users. In this paper, we take the recent popularity (recently increased item degrees) into account in the heat-conduction based methods, and propose accordingly the improved recommendation models. Experimental results on two benchmark data sets show that the accuracy can be largely improved while keeping the high diversity compared with the original models.
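Heat-conduction recommendation scores items by a two-step averaging over the user-item bipartite graph, and the variant described above replaces an item's total degree with its recently increased degree in that averaging. A minimal sketch of the two-step propagation (function and variable names are illustrative, not the authors' code):

```python
from collections import defaultdict

def heats_scores(ratings, target, item_degree):
    """Two-step heat-conduction scores for one target user.

    ratings: dict mapping user -> set of collected items.
    item_degree: dict mapping item -> degree used for averaging; passing
    recently increased degrees here gives the paper's variant.
    """
    # Initial resource: one unit on each item the target user has collected.
    f0 = {i: 1.0 for i in ratings[target]}
    # Step 1: items -> users, averaging over each user's collected items.
    user_res = {
        u: sum(f0.get(i, 0.0) for i in items) / len(items)
        for u, items in ratings.items() if items
    }
    # Step 2: users -> items, averaging by the chosen item degree.
    scores = defaultdict(float)
    for u, items in ratings.items():
        for i in items:
            scores[i] += user_res[u] / item_degree[i]
    return scores
```

Uncollected items are then ranked by score; supplying recent rather than total degrees in `item_degree` changes which items the averaging favors.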

  6. Beyond Measurement and Reward: Methods of Motivating Quality Improvement and Accountability.

    PubMed

    Berenson, Robert A; Rice, Thomas

    2015-12-01

    The article examines public policies designed to improve quality and accountability that do not rely on financial incentives and public reporting of provider performance. Payment policy should help temper the current "more is better" attitude of physicians and provider organizations. Incentive neutrality would better support health professionals' intrinsic motivation to act in their patients' best interests to improve overall quality than would pay-for-performance plans targeted to specific areas of clinical care. Public policy can support clinicians' intrinsic motivation through approaches that support systematic feedback to clinicians and provide concrete opportunities to collaborate to improve care. Some programs administered by the Centers for Medicare & Medicaid Services, including Partnership for Patients and Conditions of Participation, deserve more attention; they represent available, but largely ignored, approaches to support providers to improve quality and protect beneficiaries against substandard care. Public policies related to quality improvement should focus more on methods of enhancing professional intrinsic motivation, while recognizing the potential role of organizations to actively promote and facilitate that motivation. Actually achieving improvement, however, will require a reexamination of the role played by financial incentives embedded in payments and the unrealistic expectations placed on marginal incentives in pay-for-performance schemes. © Health Research and Educational Trust.

  7. Development and Field Test of the Trial Battery for Project A. Improving the Selection, Classification and Utilization of Army Enlisted Personnel. Project A: Improving the Selection, Classification and Utilization of Army Enlisted Personnel. ARI Technical Report 739.

    ERIC Educational Resources Information Center

    Peterson, Norman G., Ed.

    As part of the United States Army's Project A, research has been conducted to develop and field test a battery of experimental tests to complement the Armed Services Vocational Aptitude Battery in predicting soldiers' job performance. Project A is the United States Army's large-scale manpower effort to improve selection, classification, and…

  8. Practice makes perfect in memory recall.

    PubMed

    Romani, Sandro; Katkov, Mikhail; Tsodyks, Misha

    2016-04-01

    A large variability in performance is observed when participants recall briefly presented lists of words. The sources of such variability are not known. Our analysis of a large data set of free recall revealed a small fraction of participants that reached an extremely high performance, including many trials with the recall of complete lists. Moreover, some of them developed a number of consistent input-position-dependent recall strategies, in particular recalling words consecutively ("chaining") or in groups of consecutively presented words ("chunking"). The time course of acquisition and particular choice of positional grouping were variable among participants. Our results show that acquiring positional strategies plays a crucial role in improvement of recall performance. © 2016 Romani et al.; Published by Cold Spring Harbor Laboratory Press.

  9. Exploring Advanced Technology Gas Turbine Engine Design and Performance for the Large Civil Tiltrotor (LCTR)

    NASA Technical Reports Server (NTRS)

    Snyder, Christopher A.

    2014-01-01

    A Large Civil Tiltrotor (LCTR) conceptual design was developed as part of the NASA Heavy Lift Rotorcraft Systems Investigation in order to establish a consistent basis for evaluating the benefits of advanced technology for large tiltrotors. The concept has since evolved into the second-generation LCTR2, designed to carry 90 passengers for 1,000 nautical miles at 300 knots, with vertical takeoff and landing capability. This paper explores gas turbine component performance and cycle parameters to quantify performance gains possible for additional improvements in component and material performance beyond those identified in previous LCTR2 propulsion studies and to identify additional research areas. The vehicle-level characteristics from this advanced technology generation 2 propulsion architecture will help set performance levels as additional propulsion and power systems are conceived to meet ever-increasing requirements for mobility and comfort, while reducing energy use, cost, noise and emissions. The Large Civil Tiltrotor vehicle and mission will be discussed as a starting point for this effort. A few, relevant engine and component technology studies, including previous LCTR2 engine study results will be summarized to help orient the reader on gas turbine engine architecture, performance and limitations. Study assumptions and methodology used to explore engine design and performance, as well as assess vehicle sizing and mission performance will then be discussed. Individual performance for present and advanced engines, as well as engine performance effects on overall vehicle size and mission fuel usage, will be given. All results will be summarized to facilitate understanding the importance and interaction of various component and system performance on overall vehicle characteristics.

  10. Large-Format HgCdTe Dual-Band Long-Wavelength Infrared Focal-Plane Arrays

    NASA Astrophysics Data System (ADS)

    Smith, E. P. G.; Venzor, G. M.; Gallagher, A. M.; Reddy, M.; Peterson, J. M.; Lofgreen, D. D.; Randolph, J. E.

    2011-08-01

Raytheon Vision Systems (RVS) continues to further its capability to deliver state-of-the-art high-performance, large-format HgCdTe focal-plane arrays (FPAs) for dual-band long-wavelength infrared (L/LWIR) detection. Specific improvements have recently been implemented at RVS in molecular-beam epitaxy (MBE) growth and wafer fabrication and are reported in this paper. The aim of the improvements is to establish producible processes for 512 × 512 30-μm-unit-cell L/LWIR FPAs, which has resulted in: the growth of triple-layer heterojunction (TLHJ) HgCdTe back-to-back photodiode detector designs on 6 cm × 6 cm CdZnTe substrates with 300-K Fourier-transform infrared (FTIR) cutoff wavelength uniformity of ±0.1 μm across the entire wafer; demonstration of detector dark-current performance for the longer-wavelength detector band approaching that of single-color liquid-phase epitaxy (LPE) LWIR detectors; and uniform, high-operability, 512 × 512 30-μm-unit-cell FPA performance in both LWIR bands.

  11. Use of controlled vocabularies to improve biomedical information retrieval tasks.

    PubMed

    Pasche, Emilie; Gobeill, Julien; Vishnyakova, Dina; Ruch, Patrick; Lovis, Christian

    2013-01-01

    The high heterogeneity of biomedical vocabulary is a major obstacle for information retrieval in large biomedical collections. Therefore, using biomedical controlled vocabularies is crucial for managing these contents. We investigate the impact of query expansion based on controlled vocabularies to improve the effectiveness of two search engines. Our strategy relies on the enrichment of users' queries with additional terms, directly derived from such vocabularies applied to infectious diseases and chemical patents. We observed that query expansion based on pathogen names resulted in improvements of the top-precision of our first search engine, while the normalization of diseases degraded the top-precision. The expansion of chemical entities, which was performed on the second search engine, positively affected the mean average precision. We have shown that query expansion of some types of biomedical entities has a great potential to improve search effectiveness; therefore a fine-tuning of query expansion strategies could help improving the performances of search engines.
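A query-expansion step of the kind evaluated can be sketched as appending down-weighted synonyms from a controlled vocabulary to the user's terms. The toy thesaurus and weighting below are assumptions for illustration, not the vocabularies or search engines used in the study:

```python
def expand_query(query, vocabulary, syn_weight=0.5):
    """Return (term, weight) pairs: original terms plus weighted synonyms."""
    terms = [(t, 1.0) for t in query.lower().split()]
    expanded = list(terms)
    for term, _ in terms:
        # Add each synonym from the controlled vocabulary at reduced weight.
        for syn in vocabulary.get(term, []):
            expanded.append((syn, syn_weight))
    return expanded

# Toy controlled vocabulary: preferred term -> synonyms.
thesaurus = {"influenza": ["flu", "grippe"]}
```

`expand_query("influenza treatment", thesaurus)` keeps the original terms at full weight and adds ("flu", 0.5) and ("grippe", 0.5); a real system would first normalize query terms against the vocabulary.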

  12. Aluminum doped nickel oxide thin film with improved electrochromic performance from layered double hydroxides precursor in situ pyrolytic route

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Jingjing; Lai, Lincong; Zhang, Ping

Electrochromic materials with unique performance arouse great interest on account of their potential applications in smart windows, low-power displays, automobile anti-glare rearview mirrors, and e-papers. In this paper, a high-performing Al-doped NiO porous electrochromic film grown on an ITO substrate has been prepared via a layered double hydroxides (LDHs) precursor in situ pyrolytic route. The Al3+ ions distributed homogeneously within the NiO matrix can significantly influence the crystallinity of the Ni-Al LDH and NiO:Al3+ films. The electrochromic performance of the films was evaluated by means of UV-vis absorption spectroscopy, cyclic voltammetry (CV), electrochemical impedance spectroscopy (EIS), and chronoamperometry (CA) measurements. In addition, the ratio of Ni3+/Ni2+ also varies with Al content, which can lead to different electrochemical performances. Among the as-prepared films, the NiO film prepared from Ni-Al (19:1) LDH shows the best electrochromic performance, with a high transparency of 96%, a large optical modulation range (58.4%), fast switching speed (bleaching/coloration times of 1.8/4.2 s, respectively) and excellent durability (30% decrease after 2000 cycles). The improved performance is owed to the synergy of the large NiO film specific surface area and porous morphology, as well as Al doping stifling the formation of Ni3+, making the bleached state purer. This LDHs precursor pyrolytic method is simple, low-cost and environmentally benign, and is feasible for the preparation of NiO:Al and other Al-doped oxide thin films.

  13. Can anti-gravity running improve performance to the same degree as over-ground running?

    PubMed

    Brennan, Christopher T; Jenkins, David G; Osborne, Mark A; Oyewale, Michael; Kelly, Vincent G

    2018-03-11

This study examined the changes in running performance, maximal blood lactate concentrations and running kinematics between 85%BM anti-gravity (AG) running and normal over-ground (OG) running over an 8-week training period. Fifteen elite male developmental cricketers were assigned to either the AG or over-ground (CON) running group. The AG group (n = 7) ran twice a week on an AG treadmill and once per week over-ground. The CON group (n = 8) completed all sessions OG on grass. Both AG and OG training resulted in similar improvements in time-trial and shuttle-run performance. Maximal running performance showed moderate differences between the groups; however, the AG condition resulted in less improvement. Large differences in maximal blood lactate concentrations existed, with OG running resulting in greater improvements in blood lactate concentrations measured during maximal running. Moderate increases in stride length paired with moderate decreases in stride rate also resulted from AG training. The use of AG training to supplement regular OG training should be approached cautiously, as extended use over long periods could lead to altered stride mechanics and reduced blood lactate responses.

  14. Improved charging performance of Li-O2 batteries by forming Ba-incorporated Li2O2 as the discharge product

    NASA Astrophysics Data System (ADS)

    Matsuda, Shoichi; Uosaki, Kohei; Nakanishi, Shuji

    2017-06-01

    Although Li-O2 batteries can potentially achieve greater than two-fold higher energy densities than Li-ion batteries, the basic performance of Li-O2 batteries remains poor. In particular, the large over-potential of positive electrode reactions during the charging process results in low round-trip energy efficiency and limited cycle life, and is therefore the main barrier to the practical use of rechargeable Li-O2 batteries. In the present study, we demonstrate that the charging performance of Li-O2 batteries can be significantly improved by simply adding barium (Ba) ions into the electrolyte. Elemental analysis revealed that Ba-incorporated Li2O2 was obtained as the main discharge product of a Li-O2 cell operated in the presence of Ba2+. Notably, the improvement of charging performance was confirmed to originate from the Ba-incorporated Li2O2 deposits, rather than the Ba2+ present in the electrolyte. The present results suggest that the incorporation of heteroatoms into the discharge product is an effective approach for improving the charging performance of Li-O2 batteries.

  15. Accelerating Large Scale Image Analyses on Parallel, CPU-GPU Equipped Systems

    PubMed Central

    Teodoro, George; Kurc, Tahsin M.; Pan, Tony; Cooper, Lee A.D.; Kong, Jun; Widener, Patrick; Saltz, Joel H.

    2014-01-01

    The past decade has witnessed a major paradigm shift in high performance computing with the introduction of accelerators as general purpose processors. These computing devices make available very high parallel computing power at low cost and power consumption, transforming current high performance platforms into heterogeneous CPU-GPU equipped systems. Although the theoretical performance achieved by these hybrid systems is impressive, taking practical advantage of this computing power remains a very challenging problem. Most applications are still deployed to either GPU or CPU, leaving the other resource under- or un-utilized. In this paper, we propose, implement, and evaluate a performance aware scheduling technique along with optimizations to make efficient collaborative use of CPUs and GPUs on a parallel system. In the context of feature computations in large scale image analysis applications, our evaluations show that intelligently co-scheduling CPUs and GPUs can significantly improve performance over GPU-only or multi-core CPU-only approaches. PMID:25419545
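A performance-aware co-scheduler in the spirit described can be sketched as a greedy earliest-finish assignment of tasks across CPU cores and GPUs, given an assumed per-task GPU speedup. Everything below (names, speedups, device counts) is an illustrative assumption, not the paper's scheduler:

```python
def co_schedule(tasks, speedup, n_cpu=4, n_gpu=1):
    """Greedily assign each task to the device (CPU core or GPU) that would
    finish it earliest, considering longest CPU-time tasks first.

    tasks: list of (name, cpu_time); speedup: name -> GPU speedup factor.
    Returns (assignment, makespan).
    """
    cpu = [0.0] * n_cpu
    gpu = [0.0] * n_gpu
    plan = {}
    for name, t_cpu in sorted(tasks, key=lambda x: -x[1]):
        t_gpu = t_cpu / speedup.get(name, 1.0)  # GPU time under assumed speedup
        ic = cpu.index(min(cpu))  # least-loaded CPU core
        ig = gpu.index(min(gpu))  # least-loaded GPU
        if cpu[ic] + t_cpu <= gpu[ig] + t_gpu:
            cpu[ic] += t_cpu
            plan[name] = f"cpu{ic}"
        else:
            gpu[ig] += t_gpu
            plan[name] = f"gpu{ig}"
    return plan, max(cpu + gpu)
```

Tasks with large GPU speedups gravitate to the GPU while the CPU cores absorb the rest, which is the intuition behind keeping both resource types busy instead of leaving one idle.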

  16. Scintillator performance considerations for dedicated breast computed tomography

    NASA Astrophysics Data System (ADS)

    Vedantham, Srinivasan; Shi, Linxi; Karellas, Andrew

    2017-09-01

    Dedicated breast computed tomography (BCT) is an emerging clinical modality that can eliminate tissue superposition and has the potential for improved sensitivity and specificity for breast cancer detection and diagnosis. It is performed without physical compression of the breast. Most of the dedicated BCT systems use large-area detectors operating in cone-beam geometry and are referred to as cone-beam breast CT (CBBCT) systems. The large-area detectors in CBBCT systems are energy-integrating, indirect-type detectors employing a scintillator that converts x-ray photons to light, followed by detection of optical photons. A key consideration that determines the image quality achieved by such CBBCT systems is the choice of scintillator and its performance characteristics. In this work, a framework for analyzing the impact of the scintillator on CBBCT performance and its use for task-specific optimization of CBBCT imaging performance is described.

  17. Implementation of the 2011 Therapeutic Activity Act: will commercialization improve the financial performance of Polish hospitals?

    PubMed

    Sagan, Anna; Sobczak, Alicja

    2014-11-01

    The Therapeutic Activity Act that came into force on 1 July 2011 was aimed at achieving a large-scale transformation of public hospitals into Commercial Code companies. The change of the legal form, from a public entity to a for-profit company, was expected to improve the poor economic efficiency of the public hospital sector. However, the mere change of the legal form does not guarantee a better financial performance of hospitals and thus the success of the Act. In many cases, deep internal changes are needed to achieve improvements in the financial performance of particular hospitals. In addition, a set of other measures at the national and regional levels, such as the mapping of health needs of the population, have to accompany the legal transformations in order to improve the efficiency of the hospital sector. The recent slowdown in the rate of the transformations is another factor that renders the success of the Act uncertain. Copyright © 2014. Published by Elsevier Ireland Ltd.

  18. Significantly Increasing the Ductility of High Performance Polymer Semiconductors through Polymer Blending.

    PubMed

    Scott, Joshua I; Xue, Xiao; Wang, Ming; Kline, R Joseph; Hoffman, Benjamin C; Dougherty, Daniel; Zhou, Chuanzhen; Bazan, Guillermo; O'Connor, Brendan T

    2016-06-08

    Polymer semiconductors based on donor-acceptor monomers have recently resulted in significant gains in field effect mobility in organic thin film transistors (OTFTs). These polymers incorporate fused aromatic rings and have been designed to have stiff planar backbones, resulting in strong intermolecular interactions, which subsequently result in stiff and brittle films. The complex synthesis typically required for these materials may also result in increased production costs. Thus, the development of methods to improve mechanical plasticity while lowering material consumption during fabrication will significantly improve opportunities for adoption in flexible and stretchable electronics. To achieve these goals, we consider blending a brittle donor-acceptor polymer, poly[4-(4,4-dihexadecyl-4H-cyclopenta[1,2-b:5,4-b']dithiophen-2-yl)-alt-[1,2,5]thiadiazolo[3,4-c]pyridine] (PCDTPT), with ductile poly(3-hexylthiophene). We found that the ductility of the blend films is significantly improved compared to that of neat PCDTPT films, and when the blend film is employed in an OTFT, the performance is largely maintained. The ability to maintain charge transport character is due to vertical segregation within the blend, while the improved ductility is due to intermixing of the polymers throughout the film thickness. Importantly, the application of large strains to the ductile films is shown to orient both polymers, which further increases charge carrier mobility. These results highlight a processing approach to achieve high performance polymer OTFTs that are electrically and mechanically optimized.

  19. Enhanced LWIR NUC using an uncooled microbolometer camera

    NASA Astrophysics Data System (ADS)

    LaVeigne, Joe; Franks, Greg; Sparkman, Kevin; Prewarski, Marcus; Nehring, Brian

    2011-06-01

Performing a good non-uniformity correction (NUC) is a key part of achieving optimal performance from an infrared scene projector (IRSP), and the best NUC is performed in the band of interest for the sensor being tested. While cooled, large-format MWIR cameras are readily available and have been successfully used to perform NUC, similar cooled, large-format LWIR cameras are not as common and are prohibitively expensive. Large-format uncooled cameras are far more available and affordable, but present a range of challenges in practical use for performing NUC on an IRSP. Some of these challenges were discussed in a previous paper. Here, we report results from a continuing development program to use a microbolometer camera to perform LWIR NUC on an IRSP. Camera instability, temporal response, and thermal resolution were the main problems; they have been solved by implementing several compensation strategies as well as hardware to stabilize the camera. In addition, other processes have been developed to allow iterative improvement and to support changing the post-NUC lookup table (LUT) without re-collecting the pre-NUC data with the new LUT in use.

  20. How is a motor skill learned? Change and invariance at the levels of task success and trajectory control

    PubMed Central

    Krakauer, John W.; Mazzoni, Pietro

    2012-01-01

    The public pays large sums of money to watch skilled motor performance. Notably, however, in recent decades motor skill learning (performance improvement beyond baseline levels) has received less experimental attention than motor adaptation (return to baseline performance in the setting of an external perturbation). Motor skill can be assessed at the levels of task success and movement quality, but the link between these levels remains poorly understood. We devised a motor skill task that required visually guided curved movements of the wrist without a perturbation, and we defined skill learning at the task level as a change in the speed–accuracy trade-off function (SAF). Practice in restricted speed ranges led to a global shift of the SAF. We asked how the SAF shift maps onto changes in trajectory kinematics, to establish a link between task-level performance and fine motor control. Although there were small changes in mean trajectory, improved performance largely consisted of reduction in trial-to-trial variability and increase in movement smoothness. We found evidence for improved feedback control, which could explain the reduction in variability but does not preclude other explanations such as an increased signal-to-noise ratio in cortical representations. Interestingly, submovement structure remained learning invariant. The global generalization of the SAF across a wide range of difficulty suggests that skill for this task is represented in a temporally scalable network. We propose that motor skill acquisition can be characterized as a slow reduction in movement variability, which is distinct from faster model-based learning that reduces systematic error in adaptation paradigms. PMID:22514286

  1. Quick Vegas: Improving Performance of TCP Vegas for High Bandwidth-Delay Product Networks

    NASA Astrophysics Data System (ADS)

    Chan, Yi-Cheng; Lin, Chia-Liang; Ho, Cheng-Yuan

    An important issue in designing a TCP congestion control algorithm is that it should allow the protocol to quickly adjust the end-to-end communication rate to the bandwidth on the bottleneck link. However, the TCP congestion control may function poorly in high bandwidth-delay product networks because of its slow response with large congestion windows. In this paper, we propose an enhanced version of TCP Vegas called Quick Vegas, in which we present an efficient congestion window control algorithm for a TCP source. Our algorithm improves the slow-start and congestion avoidance techniques of the original Vegas. Simulation results show that Quick Vegas significantly improves the performance of connections while remaining fair as the bandwidth-delay product increases.
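    Vegas-family controllers adjust the congestion window from an estimate of how many packets are queued in the network, derived from the gap between expected and actual throughput. The sketch below illustrates that core rule only; it is not the authors' Quick Vegas algorithm, and the alpha/beta thresholds and unit-step updates are the textbook Vegas defaults, used here as assumptions:

```python
def vegas_update(cwnd, base_rtt, rtt, alpha=2.0, beta=4.0):
    """One congestion-avoidance step of a Vegas-style controller.

    diff estimates the number of packets sitting in bottleneck
    queues: (expected - actual throughput) * base_rtt."""
    expected = cwnd / base_rtt          # throughput with empty queues
    actual = cwnd / rtt                 # throughput actually achieved
    diff = (expected - actual) * base_rtt
    if diff < alpha:                    # too little data in flight: grow
        return cwnd + 1
    if diff > beta:                     # queues building up: shrink
        return cwnd - 1
    return cwnd                         # inside the target band: hold
```

    Quick Vegas's contribution, per the abstract, is making this adjustment more aggressive during slow-start and congestion avoidance so large windows converge faster.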

  2. Time-resolved speckle effects on the estimation of laser-pulse arrival times

    NASA Technical Reports Server (NTRS)

    Tsai, B.-M.; Gardner, C. S.

    1985-01-01

    A maximum-likelihood (ML) estimator of the pulse arrival time in laser ranging and altimetry is derived for the case of a pulse distorted by shot noise and time-resolved speckle. The performance of the estimator is evaluated for pulse reflections from flat diffuse targets and compared with the performance of a suboptimal centroid estimator and a suboptimal Bar-David ML estimator derived under the assumption of no speckle. In the large-signal limit the accuracy of the estimator was found to improve as the width of the receiver observational interval increases. The timing performance of the estimator is expected to be highly sensitive to background noise when the received pulse energy is high and the receiver observational interval is large. Finally, in the speckle-limited regime the ML estimator performs considerably better than the suboptimal estimators.
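    The suboptimal centroid estimator used as a baseline is simply the first moment of the received waveform. A minimal sketch, assuming the receiver output has been discretized into a photocount histogram (the discretization is an assumption for illustration):

```python
def centroid_arrival_time(times, counts):
    """First-moment (centroid) estimate of pulse arrival time from a
    histogram of detected photocounts -- the suboptimal baseline the
    ML estimator is compared against."""
    total = sum(counts)
    return sum(t * c for t, c in zip(times, counts)) / total
```

    For a symmetric received pulse the centroid is unbiased; speckle and background noise in the tails of a wide observational interval are what degrade it relative to the ML estimator.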

  3. Improving Hepatitis C Identification: Technology Alone Is Not the Answer.

    PubMed

    Nitsche, Bruce; Miller, Sara C; Giorgio, Margaret; Berry, Carolyn A; Muir, Andrew

    2017-09-01

    An estimated 3 to 5 million Americans are chronically infected with hepatitis C virus (HCV), and approximately 75% of those persons were born between 1945 and 1965 (the so-called baby boomer generation). Because of the largely asymptomatic nature of HCV, up to 50% of those infected are unaware of their disease. Risk-based testing has been largely ineffective. Based on prevalence data, the Centers for Disease Control and Prevention and other organizations recommend a one-time HCV antibody test for all baby boomers. However, uptake of this recommendation requires significant changes in clinical practice for already busy primary care clinicians. We studied the effectiveness of a quality improvement initiative based on continuous audit and feedback combined with education for improving testing in alignment with guidelines; the control group was a cohort of clinicians whose only reminder was an institution-wide electronic health record prompt. Our data show improved testing rates among all clinician groups, but more significant improvement occurred among providers who received continuous feedback about their clinical performance coupled with education.

  4. An improved method to detect correct protein folds using partial clustering.

    PubMed

    Zhou, Jianjun; Wishart, David S

    2013-01-16

    Structure-based clustering is commonly used to identify correct protein folds among candidate folds (also called decoys) generated by protein structure prediction programs. However, traditional clustering methods exhibit a poor runtime performance on large decoy sets. We hypothesized that a more efficient "partial" clustering approach in combination with an improved scoring scheme could significantly improve both the speed and performance of existing candidate selection methods. We propose a new scheme that performs rapid but incomplete clustering on protein decoys. Our method detects structurally similar decoys (measured using either C(α) RMSD or GDT-TS score) and extracts representatives from them without assigning every decoy to a cluster. We integrated our new clustering strategy with several different scoring functions to assess both the performance and speed in identifying correct or near-correct folds. Experimental results on 35 Rosetta decoy sets and 40 I-TASSER decoy sets show that our method can improve the correct fold detection rate as assessed by two different quality criteria. This improvement is significantly better than two recently published clustering methods, Durandal and Calibur-lite. Speed and efficiency testing shows that our method can handle much larger decoy sets and is up to 22 times faster than Durandal and Calibur-lite. The new method, named HS-Forest, avoids the computationally expensive task of clustering every decoy, yet still allows superior correct-fold selection. Its improved speed, efficiency and decoy-selection performance should enable structure prediction researchers to work with larger decoy sets and significantly improve their ab initio structure prediction performance.
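    The general idea of partial clustering — rank a sample of decoys by local neighbour density instead of assigning every decoy to a cluster — can be sketched as follows. This illustrates the strategy only, not the HS-Forest algorithm itself; `dist` stands in for a Cα RMSD or GDT-TS-based distance, and the probe count and cutoff are arbitrary illustrative choices:

```python
import random

def partial_cluster(decoys, dist, cutoff, n_probes=50, seed=0):
    """Rapid, incomplete clustering: score randomly probed decoys by
    how many neighbours fall within `cutoff`, and return the probes
    ranked by neighbour count, without assigning every decoy to a
    cluster.  Densely surrounded probes are candidate representatives
    of correct or near-correct folds."""
    rng = random.Random(seed)
    probes = rng.sample(decoys, min(n_probes, len(decoys)))
    scored = [(sum(1 for d in decoys if dist(p, d) <= cutoff), p)
              for p in probes]
    scored.sort(key=lambda s: -s[0])    # densest neighbourhood first
    return [p for _, p in scored]
```

    The cost is O(probes × decoys) distance evaluations instead of the O(decoys²) a full pairwise clustering needs, which is where the speedup over exhaustive methods comes from.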

  5. An improved method to detect correct protein folds using partial clustering

    PubMed Central

    2013-01-01

    Background Structure-based clustering is commonly used to identify correct protein folds among candidate folds (also called decoys) generated by protein structure prediction programs. However, traditional clustering methods exhibit a poor runtime performance on large decoy sets. We hypothesized that a more efficient “partial” clustering approach in combination with an improved scoring scheme could significantly improve both the speed and performance of existing candidate selection methods. Results We propose a new scheme that performs rapid but incomplete clustering on protein decoys. Our method detects structurally similar decoys (measured using either Cα RMSD or GDT-TS score) and extracts representatives from them without assigning every decoy to a cluster. We integrated our new clustering strategy with several different scoring functions to assess both the performance and speed in identifying correct or near-correct folds. Experimental results on 35 Rosetta decoy sets and 40 I-TASSER decoy sets show that our method can improve the correct fold detection rate as assessed by two different quality criteria. This improvement is significantly better than two recently published clustering methods, Durandal and Calibur-lite. Speed and efficiency testing shows that our method can handle much larger decoy sets and is up to 22 times faster than Durandal and Calibur-lite. Conclusions The new method, named HS-Forest, avoids the computationally expensive task of clustering every decoy, yet still allows superior correct-fold selection. Its improved speed, efficiency and decoy-selection performance should enable structure prediction researchers to work with larger decoy sets and significantly improve their ab initio structure prediction performance. PMID:23323835

  6. 20 years of KVH fiber optic gyro technology: the evolution from large, low performance FOGs to compact, precise FOGs and FOG-based inertial systems

    NASA Astrophysics Data System (ADS)

    Napoli, Jay

    2016-05-01

    Precision fiber optic gyroscopes (FOGs) are critical components for an array of platforms and applications ranging from stabilization and pointing orientation of payloads and platforms to navigation and control for unmanned and autonomous systems. In addition, FOG-based inertial systems provide extremely accurate data for geo-referencing systems. Significant improvements in the performance of FOGs and FOG-based inertial systems at KVH are due, in large part, to advancements in the design and manufacture of optical fiber, as well as in manufacturing operations and signal processing. Open loop FOGs, such as those developed and manufactured by KVH Industries, offer tactical-grade performance in a robust, small package. The success of KVH FOGs and FOG-based inertial systems is due to innovations in key fields, including the development of proprietary D-shaped fiber with an elliptical core, and KVH's unique ThinFiber. KVH continually improves its FOG manufacturing processes and signal processing, which result in improved accuracies across its entire FOG product line. KVH acquired its FOG capabilities, including its patented E•Core fiber, when the company purchased Andrew Corporation's Fiber Optic Group in 1997. E•Core fiber is unique in that the light-guiding core - critical to the FOG's performance - is elliptically shaped. The elliptical core produces a fiber that has low loss and high polarization-maintaining ability. In 2010, KVH developed its ThinFiber, a 170-micron diameter fiber that retains the full performance characteristics of E•Core fiber. ThinFiber has enabled the development of very compact, high-performance open-loop FOGs, which are also used in a line of FOG-based inertial measurement units and inertial navigation systems.

  7. Catalytic performance of Metal-Organic-Frameworks vs. extra-large pore zeolite UTL in condensation reactions

    PubMed Central

    Shamzhy, Mariya; Opanasenko, Maksym; Shvets, Oleksiy; Čejka, Jiří

    2013-01-01

    Catalytic behavior of isomorphously substituted B-, Al-, Ga-, and Fe-containing extra-large pore UTL zeolites was investigated in Knoevenagel condensation involving aldehydes, Pechmann condensation of 1-naphthol with ethylacetoacetate, and Prins reaction of β-pinene with formaldehyde and compared with large-pore aluminosilicate zeolite beta and representative Metal-Organic-Frameworks Cu3(BTC)2 and Fe(BTC). The yield of the target product over the investigated catalysts in Knoevenagel condensation increases in the following sequence: (Al)beta < (Al)UTL < (Ga)UTL < (Fe)UTL < Fe(BTC) < (B)UTL < Cu3(BTC)2 being mainly related to the improving selectivity with decreasing strength of active sites of the individual catalysts. The catalytic performance of Fe(BTC), containing the highest concentration of Lewis acid sites of the appropriate strength is superior over large-pore zeolite (Al)beta and B-, Al-, Ga-, Fe-substituted extra-large pore zeolites UTL in Prins reaction of β-pinene with formaldehyde and Pechmann condensation of 1-naphthol with ethylacetoacetate. PMID:24790940

  8. Ignition improvement by injector arrangement in a multi-fuel combustor for micro gas turbine

    NASA Astrophysics Data System (ADS)

    Antoshkiv, O.; Poojitganont, T.; Jeansirisomboon, S.; Berg, H. P.

    2018-01-01

    The novel combustor design also has an impact on the ignitor arrangement. The conventional ignitor system cannot guarantee optimal ignition performance in the usual radial position. The difficult ignitability of gaseous fuels was the main challenge for the ignitor system improvement. One way to improve the ignition performance significantly is a torch ignitor system in which the gaseous fuel is directly mixed with a large amount of the combustor air. To reach this goal, the ignition process was investigated in detail. The micro gas turbine (MGT) ignition was optimised considering three main procedures: torch ignitor operation, burner ignition, and flame propagation between neighbouring injectors. A successful final result of the chain of ignition procedures depends on multiple aspects of the combustor design. The development work performed marks an important step towards designing modern high-efficiency low-emission combustors.

  9. Cognitive Skills, Student Achievement Tests, and Schools

    PubMed Central

    Finn, Amy S.; Kraft, Matthew A.; West, Martin R.; Leonard, Julia A.; Bish, Crystal E.; Martin, Rebecca E.; Sheridan, Margaret A.; Gabrieli, Christopher F. O.; Gabrieli, John D. E.

    2014-01-01

    Cognitive skills predict academic performance, so schools that improve academic performance might also improve cognitive skills. To investigate the impact schools have on both academic performance and cognitive skills, we related standardized achievement test scores to measures of cognitive skills in a large sample (N=1,367) of 8th-grade students attending traditional, exam, and charter public schools. Test scores and gains in test scores over time correlated with measures of cognitive skills. Despite wide variation in test scores across schools, differences in cognitive skills across schools were negligible after controlling for 4th-grade test scores. Random offers of enrollment to over-subscribed charter schools resulted in positive impacts of such school attendance on math achievement, but had no impact on cognitive skills. These findings suggest that schools that improve standardized achievement tests do so primarily through channels other than cognitive skills. PMID:24434238

  10. A simple route to improve rate performance of LiFePO4/reduced graphene oxide composite cathode by adding Mg2+ via mechanical mixing

    NASA Astrophysics Data System (ADS)

    Huang, Yuan; Liu, Hao; Gong, Li; Hou, Yanglong; Li, Quan

    2017-04-01

    Introducing Mg2+ to a LiFePO4 and reduced graphene oxide composite via mechanical mixing and annealing leads to largely improved rate performance of the cathode (e.g. ∼78 mA h g-1 at 20 C with Mg2+ introduction vs. ∼37 mA h g-1 at 20 C without it). X-ray photoelectron spectroscopy reveals that enhanced reduction of Fe2+ to Fe0 occurs in the simultaneous presence of Mg2+ and reduced graphene oxide, which is beneficial for the rate capability of the cathode. This fabrication route thus provides a simple and effective means to improve the rate performance of the LiFePO4 and reduced graphene oxide composite cathode.

  11. Predictive control strategy of a gas turbine for improvement of combined cycle power plant dynamic performance and efficiency.

    PubMed

    Mohamed, Omar; Wang, Jihong; Khalil, Ashraf; Limhabrash, Marwan

    2016-01-01

    This paper presents a novel strategy for implementing model predictive control (MPC) on a large gas turbine power plant, as part of our research progress towards improving plant thermal efficiency and load-frequency control performance. A generalized state space model for a large gas turbine covering the whole steady operational range is designed according to the subspace identification method, with closed-loop data as input to the identification algorithm. The model is then used to develop an MPC that is integrated into the plant's existing control strategy. The principle of the strategy is to feed the reference signals of the pilot valve, natural gas valve, and compressor pressure ratio controller with the optimized decisions given by the MPC instead of applying the control signals directly. If the set points for the compressor controller and turbine valves are sent in a timely manner, more kinetic energy is available in the plant, enabling faster output responses and improving overall system efficiency. Simulation results illustrate the feasibility of the proposed application, which achieves significant improvement in frequency variations and load-following capability, translating into an improvement in overall combined cycle thermal efficiency of around 1.1% compared to the existing strategy.
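    The receding-horizon principle this strategy relies on — evaluate candidate inputs over a prediction model, apply only the first move, then repeat — can be illustrated on a toy scalar plant. Everything below (the model x[k+1] = a·x[k] + b·u, the gains, and the candidate grid) is purely illustrative, not the identified gas-turbine model:

```python
def mpc_step(x, target, a=0.9, b=0.5, horizon=5, candidates=None):
    """One receding-horizon step: simulate each constant candidate
    input over the horizon, score it by squared tracking error, and
    return the best first move (the MPC principle only; model and
    numbers are illustrative)."""
    if candidates is None:
        candidates = [i / 10 for i in range(-20, 21)]  # u in [-2, 2]
    best_u, best_cost = None, float("inf")
    for u in candidates:
        xk, cost = x, 0.0
        for _ in range(horizon):
            xk = a * xk + b * u          # predicted plant response
            cost += (xk - target) ** 2   # tracking error over horizon
        if cost < best_cost:
            best_u, best_cost = u, cost
    return best_u
```

    In a real MPC the optimization is over a full input sequence with constraints, but the closed-loop idea is the same: only `best_u` is sent to the actuators before the horizon is re-solved at the next sample.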

  12. Performance Metrics as Formal Structures and through the Lens of Social Mechanisms: When Do They Work and How Do They Influence?

    ERIC Educational Resources Information Center

    Colyvas, Jeannette A.

    2012-01-01

    Our current educational environment is subject to persistent calls for accountability, evidence-based practice, and data use for improvement, which largely take the form of performance metrics (PMs). This rapid proliferation of PMs has profoundly influenced the ways in which scholars and practitioners think about their own practices and the larger…

  13. Using Goals, Feedback, Reinforcement, and a Performance Matrix to Improve Customer Service in a Large Department Store

    ERIC Educational Resources Information Center

    Eikenhout, Nelson; Austin, John

    2005-01-01

    This study employed an ABAC and multiple baseline design to evaluate the effects of (B) feedback and (C) a package of feedback, goal-setting, and reinforcement (supervisor praise and an area-wide celebration) as managed through a performance matrix, on a total of 14 various customer service behaviors for a total of 115 employees at a large…

  14. Teacher Reactions to the Performance-Based Bonus Program: How the Expectancy Theory Works in the South Korean School Culture

    ERIC Educational Resources Information Center

    Ha, Bong-Woon; Sung, Youl-Kwan

    2011-01-01

    This study was conducted in order to examine how and to what extent the implementation of the performance-based bonus program in South Korean schools has motivated teachers to improve their behavior, as well as to identify any other positive or negative effects of the program. Interviews with teachers indicated that a large percentage of teachers…

  15. Large-area sheet task advanced dendritic web growth development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Hopkins, R. H.; Meier, D.; Schruben, J.

    1982-01-01

    The thermal stress model was used to generate the design of a low stress lid and shield configuration, which was fabricated and tested experimentally. In preliminary tests, the New Experimental Web Growth Facility performed as designed, producing web on the first run. These experiments suggested desirable design modifications in the melt level sensing system to improve further its performance, and these are being implemented.

  16. Probabilistic double guarantee kidnapping detection in SLAM.

    PubMed

    Tian, Yang; Ma, Shugen

    2016-01-01

    For determining whether kidnapping has happened, and which type of kidnapping it is, while a robot performs autonomous tasks in an unknown environment, a double guarantee kidnapping detection (DGKD) method has been proposed. DGKD performs well in relatively small environments; however, our recent work found a limitation of DGKD in large-scale environments. In order to increase the adaptability of DGKD to large-scale environments, an improved method called probabilistic double guarantee kidnapping detection is proposed in this paper, combining the probability of features' positions with the robot's posture. Simulation results demonstrate the validity and accuracy of the proposed method.

  17. Vent modification of large ribbon parachutes to enhance cluster performance

    NASA Technical Reports Server (NTRS)

    Kolega, D. J.; Woodis, W. R.; Reuter, J. D.

    1986-01-01

    Due to uneven load sharing and lagging inflation rates, the design of the Large Main Parachute (LMP) cluster, used to recover the Space Shuttle steel-case Solid Rocket Boosters, had to be modified. The cause of the problem was excessive variation in effective porosity in the crown area of the LMP during first-stage inflation. The design modification consisted of adding horizontal ribbons above the existing vent band to reduce the vent porosity and better control the position and attitude of the vent lines. Performance of modified LMPs since introduction indicates that the load sharing between the clustered chutes has been significantly improved.

  18. Experience measuring performance improvement in multiphase picture archiving and communications systems implementations.

    PubMed

    Reed, G; Reed, D H

    1999-05-01

    When planning a picture archiving and communications system (PACS) implementation and determining which equipment will be implemented in earlier and later phases, collection and analysis of selected data will aid in setting implementation priorities. If baseline data are acquired relative to performance objectives, the same information used for implementation planning can be used to measure performance improvement and outcomes. The main categories of data to choose from are: (1) financial data; (2) productivity data; (3) operational parameters; (4) clinical data; and (5) information about customer satisfaction. In the authors' experience, detailed workflow data have not proved valuable in measuring PACS performance and outcomes. Reviewing only one category of data in planning will not provide an adequate basis for targeting operational improvements that will lead to the most significant gains. Quality improvement takes into account all factors in production: human capacity, materials, operating capital and assets. Once we have identified key areas of focus for quality improvement in each phase, we can translate objectives into implementation requirements and finally into detailed functional and performance requirements. Here, Integration Resources reports its experience measuring PACS performance relative to phased implementation strategies for three large medical centers. Each medical center had its own objectives for overcoming image management, physical/geographical, and functional/technical barriers. The report outlines (1) principal financial and nonfinancial measures used as performance indicators; (2) implementation strategies chosen by each of the three medical centers; and (3) the results of those strategies as compared with baseline data.

  19. Large-area triple-junction a-Si alloy production scaleup. Annual subcontract report, 17 March 1993--18 March 1994

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oswald, R.; Morris, J.

    1994-11-01

    The objective of this subcontract over its three-year duration is to advance Solarex's photovoltaic manufacturing technologies, reduce its a-Si:H module production costs, increase module performance and expand the Solarex commercial production capacity. Solarex shall meet these objectives by improving the deposition and quality of the transparent front contact, by optimizing the laser patterning process, scaling-up the semiconductor deposition process, improving the back contact deposition, scaling-up and improving the encapsulation and testing of its a-Si:H modules. In the Phase 2 portion of this subcontract, Solarex focused on improving deposition of the front contact, investigating alternate feed stocks for the front contact, maximizing throughput and area utilization for all laser scribes, optimizing a-Si:H deposition equipment to achieve uniform deposition over large areas, optimizing the triple-junction module fabrication process, evaluating the materials to deposit the rear contact, and optimizing the combination of isolation scribe and encapsulant to pass the wet high potential test. Progress is reported on the following: front contact development; laser scribe process development; amorphous silicon based semiconductor deposition; rear contact deposition process; frit/bus/wire/frame; materials handling; and environmental test, yield and performance analysis.

  20. Financial incentives, quality improvement programs, and the adoption of clinical information technology.

    PubMed

    Robinson, James C; Casalino, Lawrence P; Gillies, Robin R; Rittenhouse, Diane R; Shortell, Stephen S; Fernandes-Taylor, Sara

    2009-04-01

    Physician use of clinical information technology (CIT) is important for the management of chronic illness, but has lagged behind expectations. We studied the role of health insurers' financial incentives (including pay-for-performance) and quality improvement initiatives in accelerating adoption of CIT in large physician practices. National survey of all medical groups and independent practice association (IPA) physician organizations with 20 or more physicians in the United States in 2006 to 2007. The response rate was 60.3%. Use of 19 CIT capabilities was measured. Multivariate statistical analysis of financial and organizational factors associated with adoption and use of CIT. Use of information technology varied across physician organizations, including electronic access to laboratory test results (medical groups, 49.3%; IPAs, 19.6%), alerts for potential drug interactions (medical groups, 33.9%; IPAs, 9.5%), electronic drug prescribing (medical groups, 41.9%; IPAs, 25.1%), and physician use of e-mail with patients (medical groups, 34.2%; IPAs, 29.1%). Adoption of CIT was stronger for physician organizations evaluated by external entities for pay-for-performance and public reporting purposes (P = 0.042) and for those participating in quality improvement initiatives (P < 0.001). External incentives and participation in quality improvement initiatives are associated with greater use of CIT by large physician practices.

  1. Optimization of an organic memristor as an adaptive memory element

    NASA Astrophysics Data System (ADS)

    Berzina, Tatiana; Smerieri, Anteo; Bernabò, Marco; Pucci, Andrea; Ruggeri, Giacomo; Erokhin, Victor; Fontana, M. P.

    2009-06-01

    The combination of memory and signal handling characteristics of a memristor makes it a promising candidate for adaptive bioinspired information processing systems. This poses stringent requirements on the basic device, such as stability and reproducibility over a large number of training/learning cycles, and a large anisotropy in the fundamental control material parameter, in our case the electrical conductivity. In this work we report results on the improved performance of electrochemically controlled polymeric memristors, where optimization of a conducting polymer (polyaniline) in the active channel and better environmental control of fabrication methods led to a large increase both in the absolute value of the conductivity in the partially oxidized state of polyaniline and in the on-off conductivity ratio. These improvements are crucial for the application of the organic memristor to adaptive complex signal handling networks.

  2. Deficits in discrimination after experimental frontal brain injury are mediated by motivation and can be improved by nicotinamide administration.

    PubMed

    Vonder Haar, Cole; Maass, William R; Jacobs, Eric A; Hoane, Michael R

    2014-10-15

    One of the largest challenges in experimental neurotrauma work is the development of models relevant to the human condition. This includes both creating similar pathophysiology as well as the generation of relevant behavioral deficits. Recent studies have shown that there is a large potential for the use of discrimination tasks in rats to detect injury-induced deficits. The literature on discrimination and TBI is still limited, however. The current study investigated motivational and motor factors that could potentially contribute to deficits in discrimination. In addition, the efficacy of a neuroprotective agent, nicotinamide, was assessed. Rats were trained on a discrimination task and motivation task, given a bilateral frontal controlled cortical impact TBI (+3.0 AP, 0.0 ML from bregma), and then reassessed. They were also assessed on motor ability and Morris water maze (MWM) performance. Experiment 1 showed that TBI resulted in large deficits in discrimination and motivation. No deficits were observed on gross motor measures; however, the vehicle group showed impairments in fine motor control. Both injured groups were impaired on the reference memory MWM, but only nicotinamide-treated rats were impaired on the working memory MWM. Nicotinamide administration improved performance on discrimination and motivation measures. Experiment 2 evaluated retraining on the discrimination task and suggested that motivation may be a large factor underlying discrimination deficits. Retrained rats improved considerably on the discrimination task. The tasks evaluated in this study demonstrate robust deficits and may improve the detection of pharmaceutical effects by being very sensitive to pervasive cognitive deficits that occur after frontal TBI.

  3. Tuning polarity and improving charge transport in organic semiconductors

    NASA Astrophysics Data System (ADS)

    Oh, Joon Hak; Han, A.-Reum; Yu, Hojeong; Lee, Eun Kwang; Jang, Moon Jeong

    2013-09-01

    Although state-of-the-art ambipolar polymer semiconductors have been extensively reported in recent years, high-performance ambipolar polymers with tunable dominant polarity are still required to realize on-demand, target-specific, high-performance organic circuitry. Herein, dithienyl-diketopyrrolopyrrole (TDPP)-based polymer semiconductors with engineered side-chains have been synthesized, characterized and employed in ambipolar organic field-effect transistors, in order to achieve controllable and improved electrical properties. Thermally removable tert-butoxycarbonyl (t-BOC) groups and hybrid siloxane-solubilizing groups are introduced as the solubilizing groups, and they are found to enable the tunable dominant polarity and the enhanced ambipolar performance, respectively. Such outstanding performance based on our molecular design strategies makes these ambipolar polymer semiconductors highly promising for low-cost, large-area, and flexible electronics.

  4. The effect of code expanding optimizations on instruction cache design

    NASA Technical Reports Server (NTRS)

    Chen, William Y.; Chang, Pohua P.; Conte, Thomas M.; Hwu, Wen-Mei W.

    1991-01-01

    It is shown that code expanding optimizations have strong and non-intuitive implications on instruction cache design. Three types of code expanding optimizations are studied: instruction placement, function inline expansion, and superscalar optimizations. Overall, instruction placement reduces the miss ratio of small caches. Function inline expansion improves the performance for small cache sizes, but degrades the performance of medium caches. Superscalar optimizations increase the cache size required for a given miss ratio. On the other hand, they also increase the sequentiality of instruction access, so that a simple load-forward scheme effectively cancels the negative effects. Overall, it is shown that with load forwarding, the three types of code expanding optimizations jointly improve the performance of small caches and have little effect on large caches.
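    Miss-ratio comparisons of this kind can be reproduced in miniature by running an instruction address trace through a cache simulator. A toy direct-mapped model (line size and geometry are illustrative assumptions, not the paper's configurations):

```python
def miss_ratio(addresses, cache_lines, line_size=16):
    """Miss ratio of a direct-mapped instruction cache over an
    address trace: each block maps to exactly one line, and a miss
    occurs whenever the resident block's tag differs."""
    tags = [None] * cache_lines
    misses = 0
    for addr in addresses:
        block = addr // line_size        # which memory block
        idx = block % cache_lines        # which cache line it maps to
        if tags[idx] != block:           # conflict or cold miss
            tags[idx] = block
            misses += 1
    return misses / len(addresses)
```

    Feeding the same trace through simulators of different sizes shows the effect the abstract describes: code-expanding optimizations lengthen the trace's footprint, which hurts small caches far more than large ones.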

  5. Statistical analysis of RHIC beam position monitors performance

    NASA Astrophysics Data System (ADS)

    Calaga, R.; Tomás, R.

    2004-04-01

    A detailed statistical analysis of beam position monitors (BPM) performance at RHIC is a critical factor in improving regular operations and future runs. Robust identification of malfunctioning BPMs plays an important role in any orbit or turn-by-turn analysis. Singular value decomposition and Fourier transform methods, which have evolved as powerful numerical techniques in signal processing, will aid in such identification from BPM data. This is the first attempt at RHIC to use a large set of data to statistically enhance the capability of these two techniques and determine BPM performance. A comparison from run 2003 data shows striking agreement between the two methods and hence can be used to improve BPM functioning at RHIC and possibly other accelerators.

  6. GoFFish: A Sub-Graph Centric Framework for Large-Scale Graph Analytics1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simmhan, Yogesh; Kumbhare, Alok; Wickramaarachchi, Charith

    2014-08-25

    Large scale graph processing is a major research area for Big Data exploration. Vertex centric programming models like Pregel are gaining traction due to their simple abstraction that naturally allows for scalable execution on distributed systems. However, there are limitations to this approach which cause vertex centric algorithms to under-perform due to a poor compute to communication overhead ratio and slow convergence of iterative supersteps. In this paper we introduce GoFFish, a scalable sub-graph centric framework co-designed with a distributed persistent graph storage for large scale graph analytics on commodity clusters. We introduce a sub-graph centric programming abstraction that combines the scalability of a vertex centric approach with the flexibility of shared memory sub-graph computation. We map Connected Components, SSSP and PageRank algorithms to this model to illustrate its flexibility. Further, we empirically analyze GoFFish using several real world graphs and demonstrate its significant performance improvement, orders of magnitude in some cases, compared to Apache Giraph, the leading open source vertex centric implementation.
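    The sub-graph centric model can be illustrated with connected components: each partition resolves labels over its entire sub-graph in one superstep (a shared-memory computation), and only cut-edges trigger communication between supersteps. A single-process toy sketch of the programming model — not the GoFFish API:

```python
def subgraph_cc(partitions, edges):
    """Connected components, sub-graph centric style: minimum-label
    flooding runs to a fixpoint inside each partition per superstep;
    labels then cross partition boundaries once per superstep."""
    label = {v: v for part in partitions for v in part}
    adj = {v: [] for v in label}
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)
    parts = [set(p) for p in partitions]
    while True:                          # one iteration = one superstep
        for part in parts:               # local phase: whole sub-graph
            changed = True
            while changed:
                changed = False
                for v in part:
                    m = min([label[v]] +
                            [label[w] for w in adj[v] if w in part])
                    if m < label[v]:
                        label[v] = m
                        changed = True
        moved = False                    # communication phase: cut-edges
        for a, b in edges:
            lo = min(label[a], label[b])
            if label[a] != lo or label[b] != lo:
                label[a] = lo
                label[b] = lo
                moved = True
        if not moved:
            return label
```

    The number of supersteps scales with the diameter of the partition graph rather than the vertex graph, which is the source of the faster convergence claimed over purely vertex centric execution.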

  7. NOVEL NANOPARTICULATE CATALYSTS FOR IMPROVED VOC TREATMENT DEVICES - PHASE I

    EPA Science Inventory

    Catalytic oxidation of VOCs is increasingly used for treatment of large-volume emissions at relatively dilute VOC levels. The best performing catalytic oxidation devices for attainment of very high VOC destruction levels employ precious metal catalysts, the costs of which a...

  8. PERFORMANCE, RELIABILITY, AND IMPROVEMENT OF A TISSUE-SPECIFIC METABOLIC SIMULATOR

    EPA Science Inventory

    A methodology is described that has been used to build and enhance a simulator for rat liver metabolism providing reliable predictions within a large chemical domain. The tissue metabolism simulator (TIMES) utilizes a heuristic algorithm to generate plausible metabolic maps using...

  9. SUPERFUND: FOCUSING ON THE NATION AT LARGE

    EPA Science Inventory

    In 1986 Congress enacted sweeping amendments to the nation's law to cleanup abandoned hazardous waste sites. Two years later Administrator Reilly set a course for the Superfund program designed to improve the program's performance and to increase the role of the private sector in...

  10. Microseismic event location by master-event waveform stacking

    NASA Astrophysics Data System (ADS)

    Grigoli, F.; Cesca, S.; Dahm, T.

    2016-12-01

    Waveform stacking location methods are nowadays extensively used to monitor induced seismicity associated with several underground industrial activities, such as mining, oil and gas production, and geothermal energy exploitation. In the last decade a significant effort has been spent to develop or improve methodologies able to perform automated seismological analysis for weak events at a local scale. This effort was accompanied by the improvement of monitoring systems, resulting in an increasing number of large microseismicity catalogs. The analysis of microseismicity is challenging because of the large number of recorded events, often characterized by a low signal-to-noise ratio. A significant limitation of traditional location approaches is that automated picking is often done on each seismogram individually, making little or no use of the coherency information between stations. To improve on the performance of traditional location methods, alternative approaches have recently been proposed. These methods exploit the coherence of the waveforms recorded at different stations and do not require any automated picking procedure. Their main advantage lies in their robustness even when the recorded waveforms are very noisy. On the other hand, as with any other location method, the location performance strongly depends on the accuracy of the available velocity model; with an inaccurate velocity model, location results can be affected by large errors. Here we introduce a new automated waveform stacking location method which is less dependent on knowledge of the velocity model and offers several benefits that improve the location accuracy: 1) it accounts for phase delays due to local site effects, e.g. surface topography or variable sediment thickness; 2) theoretical velocity models are only used to estimate travel times within the source volume, and not along the whole source-sensor path. We finally compare the location results for both synthetic and real data with those obtained using classical waveform stacking approaches.
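
As a toy illustration of the general waveform stacking principle (not the authors' specific method; the velocity, geometry, and characteristic functions below are invented), a 1-D grid search that shifts each station's characteristic function by the predicted move-out and stacks might look like:

```python
import numpy as np

rng = np.random.default_rng(1)

v = 3.0            # assumed medium velocity (km/s)
dt = 0.01          # sample interval (s)
stations = np.array([0.0, 5.0, 10.0, 15.0])   # station positions (km)
true_src, t0 = 6.0, 1.0                       # true source position, origin time

# Synthetic characteristic functions (e.g. an STA/LTA of each trace):
# a noisy baseline plus a peak at each station's predicted arrival.
n = 1000
cf = 0.1 * rng.random((len(stations), n))
for i, x in enumerate(stations):
    arrival = t0 + abs(x - true_src) / v
    cf[i, int(round(arrival / dt))] = 1.0

# Shift each characteristic function by the move-out predicted for every
# trial source position and stack; the true position maximizes coherence.
best, best_stack = None, -np.inf
for s in np.arange(0.0, 15.1, 0.5):
    shifts = (np.abs(stations - s) / v / dt).round().astype(int)
    aligned = [np.roll(cf[i], -shifts[i]) for i in range(len(stations))]
    stack = np.sum(aligned, axis=0).max()
    if stack > best_stack:
        best, best_stack = s, stack

print("located at", best, "km (true:", true_src, "km)")
```

No picking is needed: misaligned peaks simply fail to stack coherently, which is why the approach tolerates noisy traces.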

  11. Laser guide star wavefront sensing for ground-layer adaptive optics on extremely large telescopes.

    PubMed

    Clare, Richard M; Le Louarn, Miska; Béchet, Clementine

    2011-02-01

    We propose ground-layer adaptive optics (GLAO) to improve the seeing on the 42 m European Extremely Large Telescope. Shack-Hartmann wavefront sensors (WFSs) with laser guide stars (LGSs) will experience significant spot elongation due to off-axis observation. This spot elongation influences the design of the laser launch location, laser power, WFS detector, and centroiding algorithm for LGS GLAO on an extremely large telescope. We show, using end-to-end numerical simulations, that with a noise-weighted matrix-vector-multiply reconstructor, the performance in terms of 50% ensquared energy (EE) of the side and central launch of the lasers is equivalent, the matched filter and weighted center of gravity centroiding algorithms are the most promising, and approximately 10×10 undersampled pixels are optimal. Significant improvement in the 50% EE can be observed with a few tens of photons/subaperture/frame, and no significant gain is seen by adding more than 200 photons/subaperture/frame. The LGS GLAO is not particularly sensitive to the sodium profile present in the mesosphere nor to a short-timescale (less than 100 s) evolution of the sodium profile. The performance of LGS GLAO is, however, sensitive to the atmospheric turbulence profile.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spagliardi, Fabio

    Liquid argon Time Projection Chambers (LArTPCs) are becoming widely used as neutrino detectors because their image-like event reconstruction enables precision neutrino measurements. They primarily use ionisation charge to reconstruct neutrino events. It has been shown, however, that the scintillation light emitted by liquid argon could be exploited to improve their performance. As the neutrino measurements planned in the near future require large-scale experiments, their construction presents challenges in terms of both charge and light collection. In this dissertation we present solutions developed to improve the performance of these detectors in both respects. We present a new wire tensioning measurement method that allows a remote measurement of the tension of the large number of wires that constitute the TPC anode. We also discuss the development and installation of WLS-compound-covered foils for the SBND neutrino detector at Fermilab, a technique proposed to augment light collection in LArTPCs. This included preparing an SBND-like mesh cathode and testing it in Run III of LArIAT, a test-beam detector also located at Fermilab. Finally, we present a study, using LArIAT data, aimed at understanding late scintillation light emitted by recombining positive argon ions, which could affect large-scale surface detectors.

  13. Recent developments for the Large Binocular Telescope Guiding Control Subsystem

    NASA Astrophysics Data System (ADS)

    Golota, T.; De La Peña, M. D.; Biddick, C.; Lesser, M.; Leibold, T.; Miller, D.; Meeks, R.; Hahn, T.; Storm, J.; Sargent, T.; Summers, D.; Hill, J.; Kraus, J.; Hooper, S.; Fisher, D.

    2014-07-01

    The Large Binocular Telescope (LBT) has eight Acquisition, Guiding, and wavefront Sensing Units (AGw units). They provide guiding and wavefront sensing capability at eight different locations at both direct and bent Gregorian focal stations. Recent additions of focal stations for PEPSI and MODS instruments doubled the number of focal stations in use including respective motion, camera controller server computers, and software infrastructure communicating with Guiding Control Subsystem (GCS). This paper describes the improvements made to the LBT GCS and explains how these changes have led to better maintainability and contributed to increased reliability. This paper also discusses the current GCS status and reviews potential upgrades to further improve its performance.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Lingda; Hayes, Ari; Song, Shuaiwen

    Modern GPUs employ caches to improve memory system efficiency. However, a large amount of cache space is underutilized due to the irregular memory accesses and poor spatial locality commonly exhibited by GPU applications. Our experiments show that using smaller cache lines could improve cache space utilization, but doing so frequently suffers significant performance loss by introducing a large number of extra cache requests. In this work, we propose a novel cache design named tag-split cache (TSC) that enables fine-grained cache storage to address the problem of cache space underutilization while keeping the number of memory requests unchanged. TSC divides the tag into two parts to reduce storage overhead, and it supports multiple cache line replacements in one cycle.

  15. An improved large signal model of InP HEMTs

    NASA Astrophysics Data System (ADS)

    Li, Tianhao; Li, Wenjun; Liu, Jun

    2018-05-01

    An improved large signal model for InP HEMTs is proposed in this paper. The channel current and charge model equations are constructed based on the Angelov model equations. The equations for both the channel current and gate charge models are continuous and differentiable to high order, and the proposed gate charge model satisfies charge conservation. The Angelov current model equations are improved to account for the strong leakage-induced barrier reduction effect of InP HEMTs, so that the channel current model can fit the DC performance of devices. A 2 × 25 μm × 70 nm InP HEMT device is used to demonstrate the extraction and validation of the model, which accurately predicts the DC I-V, C-V, and bias-dependent S parameters. Project supported by the National Natural Science Foundation of China (No. 61331006).

  16. Modeling Improvements and Users Manual for Axial-flow Turbine Off-design Computer Code AXOD

    NASA Technical Reports Server (NTRS)

    Glassman, Arthur J.

    1994-01-01

    An axial-flow turbine off-design performance computer code used for preliminary studies of gas turbine systems was modified and calibrated based on the experimental performance of large aircraft-type turbines. The flow- and loss-model modifications and calibrations are presented in this report. Comparisons are made between computed performances and experimental data for seven turbines over wide ranges of speed and pressure ratio. This report also serves as the users manual for the revised code, which is named AXOD.

  17. Predicting spacecraft multilayer insulation performance from heat transfer measurements

    NASA Technical Reports Server (NTRS)

    Stimpson, L. D.; Hagemeyer, W. A.

    1974-01-01

    Multilayer insulation (MLI) ideally consists of a series of radiation shields with low-conductivity spacers. When MLI blankets were installed on cryogenic tanks or spacecraft, a large discrepancy between the calorimeter measurements and the performance of the installed blankets was discovered. It was found that discontinuities such as exposed edges coupled with high lateral heat transfer created 'heat leaks' which overshadowed the basic heat transfer of the insulation. Approaches leading to improved performance predictions of MLI units are discussed.

  18. Gpm Level 1 Science Requirements: Science and Performance Viewed from the Ground

    NASA Technical Reports Server (NTRS)

    Petersen, W.; Kirstetter, P.; Wolff, D.; Kidd, C.; Tokay, A.; Chandrasekar, V.; Grecu, M.; Huffman, G.; Jackson, G. S.

    2016-01-01

    GPM meets Level 1 science requirements for rain estimation based on the strong performance of its radar algorithms. Changes in the V5 GPROF algorithm should correct errors in V4 and will likely resolve GPROF performance issues relative to L1 requirements. L1 FOV snow detection is largely verified, but at an unknown SWE rate threshold (likely < 0.5-1 mm/hr liquid equivalent). Work is ongoing to improve SWE rate estimation for both satellite and GV remote sensing.

  19. Comparative study of organic transistors with different graphene electrodes fabricated using a simple patterning method

    NASA Astrophysics Data System (ADS)

    Kang, Narae; Smith, Christian W.; Ishigami, Masa; Khondaker, Saiful I.

    2017-12-01

    The performance of organic field-effect transistors (OFETs) can be greatly limited due to the inefficient charge injection caused by the large interfacial barrier at the metal/organic semiconductor interface. To improve this, two-dimensional graphene films have been suggested as alternative electrode materials; however, a comparative study of OFET performances using different types of graphene electrodes has not been systematically investigated. Here, we present a comparative study on the performance of pentacene OFETs using chemical vapor deposition (CVD) grown graphene and reduced graphene oxide (RGO) as electrodes. The large area electrodes were patterned using a simple and environmentally benign patterning technique. Although both the CVD graphene and RGO electrodes showed enhanced device performance compared to metal electrodes, we found the maximum performance enhancement from CVD grown graphene electrodes. Our study suggests that, in addition to the strong π-π interaction at the graphene/organic interface, the higher conductivity of the electrodes also plays an important role in the performance of OFETs.

  20. Stress management on underlying GaN-based epitaxial films: A new vision for achieving high-performance LEDs on Si substrates

    NASA Astrophysics Data System (ADS)

    Lin, Zhiting; Wang, Haiyan; Lin, Yunhao; Wang, Wenliang; Li, Guoqiang

    2017-11-01

    High-performance blue GaN-based light-emitting diodes (LEDs) on Si substrates have been achieved by applying a suitable tensile stress in the underlying n-GaN. It is demonstrated by simulation that tensile stress in the underlying n-GaN alleviates the negative effect of polarization electric fields on the multiple quantum wells, but an excessively large tensile stress severely bends the band profile of the electron blocking layer, resulting in carrier loss and large electrical resistance. A medium level of tensile stress, ranging from 4 to 5 GPa, maximizes the luminous intensity and minimizes the forward voltage of LEDs on Si substrates. The LED with the optimal tensile stress shows the largest simulated luminous intensity and the smallest simulated voltage at 35 A/cm2. Compared to LEDs with a compressive stress of -3 GPa and a large tensile stress of 8 GPa, the improvement in luminous intensity reaches 102% and 28.34%, respectively. Subsequent experimental results provide evidence of the superiority of applying tensile stress in n-GaN. The experimental light output power of the LED with a tensile stress of 1.03 GPa is 528 mW, a significant improvement of 19.4% at 35 A/cm2 in comparison to the reference LED with a compressive stress of -0.63 GPa. The forward voltage of this LED is 3.08 V, smaller than the 3.11 V of the reference LED. This methodology of stress management in underlying GaN-based epitaxial films shows bright prospects for achieving high-performance LED devices on Si substrates.

  1. Improvement of Repeated-Sprint Ability and Horizontal-Jumping Performance in Elite Young Basketball Players With Low-Volume Repeated-Maximal-Power Training.

    PubMed

    Gonzalo-Skok, Oliver; Tous-Fajardo, Julio; Arjol-Serrano, José Luis; Suarez-Arrones, Luis; Casajús, José Antonio; Mendez-Villanueva, Alberto

    2016-05-01

    To examine the effects of a low-volume repeated-power-ability (RPA) training program on repeated-sprint ability, change-of-direction (COD) ability, and functional jumping performance. Twenty-two male elite young basketball players (age 16.2 ± 1.2 y, height 190.0 ± 10.0 cm, body mass 82.9 ± 10.1 kg) were randomly assigned either to an RPA-training group (n = 11) or a control group (n = 11). RPA training consisted of leg-press exercise, twice a week for 6 wk, of 1 or 2 blocks of 5 sets × 5 repetitions with 20 s of passive recovery between sets and 3 min between blocks, with the load that maximized power output. Before and after training, performance was assessed by a repeated-sprint-ability (RSA) test, a repeated-COD-ability test, a hop for distance, and a drop jump followed by tests of a double unilateral hop with the right and left legs. Within-group and between-groups differences showed substantial improvements in slowest (RSAs) and mean (RSAm) time on the RSA test; best, slowest, and mean time on repeated-COD ability; and unilateral right and left hop in the RPA group in comparison with control. While best time on RSA showed no improvement in either group, there was a large relationship (r = .68, 90% CI .43, .84) between the relative decrement in RSAm and RSAs, suggesting better sprint maintenance with RPA training. The relative improvements in best and mean repeated-COD ability were very largely correlated (r = .89, 90% CI .77, .94). Six weeks of low-volume (4-14 min/wk) RPA training improved several physical-fitness tests in basketball players.

  2. Large Scale Application of Vibration Sensors for Fan Monitoring at Commercial Layer Hen Houses

    PubMed Central

    Chen, Yan; Ni, Ji-Qin; Diehl, Claude A.; Heber, Albert J.; Bogan, Bill W.; Chai, Li-Long

    2010-01-01

    Continuously monitoring the operation of each individual fan can significantly improve the measurement quality of aerial pollutant emissions from animal buildings that have a large number of fans. Monitoring fan operation by detecting fan vibration is a relatively new technique. A low-cost electronic vibration sensor was developed and commercialized; however, its large-scale application had not yet been evaluated. This paper presents long-term performance results for this vibration sensor at two large commercial layer houses. Vibration sensors were installed on 164 fans of 130 cm diameter to continuously monitor fan on/off status for two years. The performance of the vibration sensors was compared with fan rotational speed (FRS) sensors. The vibration sensors exhibited quick response and high sensitivity to fan operations and therefore satisfied the general requirements of air quality research. The study proved that detecting fan vibration is an effective method to monitor the on/off status of a large number of single-speed fans. The vibration sensor itself was $2 more expensive than a magnetic proximity FRS sensor, but the overall cost including installation and data acquisition hardware was $77 less than for the FRS sensor. A total of nine vibration sensors failed during the study, and the failure rate was related to the product batch. A few sensors also exhibited unsteady sensitivity. As a new product, the quality of the sensor should be improved to make it more reliable and acceptable. PMID:22163544
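
As an illustration of the underlying principle only (not the commercial sensor's electronics; the sample rate, hum frequency, and threshold are hypothetical), fan on/off status can be derived from windowed RMS vibration amplitude:

```python
import numpy as np

rng = np.random.default_rng(3)
fs = 1000                        # sample rate (Hz), hypothetical
t = np.arange(0, 60, 1 / fs)

# Hypothetical vibration trace: fan off for 30 s, then running (50 Hz hum).
signal = 0.02 * rng.normal(size=t.size)
on = t >= 30
signal[on] += 0.5 * np.sin(2 * np.pi * 50 * t[on])

# On/off status from 1-second windowed RMS against a threshold: the basic
# principle behind a vibration-based fan status sensor.
rms = np.sqrt((signal.reshape(-1, fs) ** 2).mean(axis=1))
status = rms > 0.1
print(int(status.sum()), "of", status.size, "seconds judged 'on'")
```

A real sensor implements this comparison in hardware, which is why it responds quickly and needs no magnet or moving-part coupling.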

  3. Assessing performance of an Electronic Health Record (EHR) using Cognitive Task Analysis.

    PubMed

    Saitwal, Himali; Feng, Xuan; Walji, Muhammad; Patel, Vimla; Zhang, Jiajie

    2010-07-01

    Many Electronic Health Record (EHR) systems fail to provide user-friendly interfaces due to a lack of systematic consideration of human-centered computing issues. Such interfaces can be improved to provide easy-to-use, easy-to-learn, and error-resistant EHR systems to users. To evaluate the usability of an EHR system and suggest areas of improvement in the user interface, the user interface of the AHLTA (Armed Forces Health Longitudinal Technology Application) was analyzed using the Cognitive Task Analysis (CTA) method called GOMS (Goals, Operators, Methods, and Selection rules) and an associated technique called KLM (Keystroke Level Model). The GOMS method was used to evaluate the AHLTA user interface by classifying each step of a given task as a mental (internal) or physical (external) operator. This analysis was performed independently by two analysts, and inter-rater reliability was computed to verify the reliability of the GOMS method. Further evaluation was performed using KLM to estimate the execution time required to perform a given task through application of its standard set of operators. The results are based on the analysis of 14 prototypical tasks performed by AHLTA users and show that on average a user needs to go through 106 steps to complete a task. To perform all 14 tasks, users would spend about 22 min (independent of system response time) on data entry, of which 11 min are spent on the more effortful mental operators. The inter-rater reliability across all 14 tasks was 0.8 (kappa), indicating good reliability of the method. This paper empirically identifies the following findings related to the performance of AHLTA: (1) a large average number of steps to complete common tasks, (2) high average execution time, and (3) a large percentage of mental operators. The user interface can be improved by reducing (a) the total number of steps and (b) the percentage of mental effort required for the tasks.
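
KLM execution-time estimates of the kind reported here are simple sums of standard operator times. The sketch below uses the commonly cited operator times from the KLM literature; the operator sequence is hypothetical, not an actual AHLTA task:

```python
# Standard Keystroke Level Model operator times (seconds) from the KLM
# literature; the actual AHLTA task sequences are not reproduced here.
KLM_TIMES = {
    "K": 0.2,   # keystroke (skilled typist)
    "P": 1.1,   # point with mouse
    "H": 0.4,   # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def klm_seconds(sequence):
    """Sum operator times for a sequence such as 'MPK'."""
    return sum(KLM_TIMES[op] for op in sequence)

# Hypothetical step: think, point at a field, home to keyboard, type 8 chars.
task = "MP" + "H" + "K" * 8
print(round(klm_seconds(task), 2))  # → 4.45
```

Summing such estimates over a task's full operator sequence yields the per-task execution times the study reports, and the share contributed by "M" operators measures mental effort.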

  4. Application of the EVEX resource to event extraction and network construction: Shared Task entry and result analysis

    PubMed Central

    2015-01-01

    Background Modern methods for mining biomolecular interactions from literature typically make predictions based solely on the immediate textual context, in effect a single sentence. No prior work has been published on extending this context to information automatically gathered from the whole biomedical literature. Thus, our motivation for this study is to explore whether mutually supporting evidence, aggregated across several documents, can be utilized to improve the performance of state-of-the-art event extraction systems. In this paper, we describe our participation in the latest BioNLP Shared Task using the large-scale text mining resource EVEX. We participated in the Genia Event Extraction (GE) and Gene Regulation Network (GRN) tasks with two separate systems. In the GE task, we implemented a re-ranking approach to improve the precision of an existing event extraction system, incorporating features from the EVEX resource. In the GRN task, our system relied solely on the EVEX resource and utilized a rule-based conversion algorithm between the EVEX and GRN formats. Results In the GE task, our re-ranking approach led to a modest performance increase and resulted in the first rank of the official Shared Task results with a 50.97% F-score. Additionally, in this paper we explore and evaluate the usage of distributed vector representations for this challenge. In the GRN task, we ranked fifth in the official results with strict/relaxed SER scores of 0.92/0.81, respectively. To try to improve upon these results, we implemented a novel machine learning based conversion system and benchmarked its performance against the original rule-based system. Conclusions For the GRN task, we were able to produce a gene regulatory network from the EVEX data, warranting the use of such generic large-scale text mining data in network biology settings. A detailed performance and error analysis provides more insight into the relatively low recall rates.
In the GE task we demonstrate that both the re-ranking approach and the word vectors can provide slight performance improvement. A manual evaluation of the re-ranking results pinpoints some of the challenges faced in applying large-scale text mining knowledge to event extraction. PMID:26551766

  5. Probabilistic combination of static and dynamic gait features for verification

    NASA Astrophysics Data System (ADS)

    Bazin, Alex I.; Nixon, Mark S.

    2005-03-01

    This paper describes a novel probabilistic framework for biometric identification and data fusion. Based on intra- and inter-class variation extracted from training data, posterior probabilities describing the similarity between two feature vectors may be calculated directly from the data using the logistic function and Bayes rule. Using a large publicly available database we show that the two imbalanced gait modalities may be fused using this framework. All fusion methods tested provide an improvement over the best single modality, with the weighted sum rule giving the best performance, hence showing that highly imbalanced classifiers may be fused in a probabilistic setting, improving not only the performance but also the generalized application capability.
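
A minimal sketch of this general recipe, with invented calibration parameters, distances, and fusion weight (in the paper's framework the parameters would be fit to intra/inter-class training distances):

```python
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

# Map a raw distance between two feature vectors to a posterior probability
# of "same subject" via the logistic function; a and b stand in for
# parameters that would be fit on intra/inter-class training distances.
def posterior(distance, a, b):
    return logistic(a * distance + b)

# Two gait modalities (static and dynamic features) for one comparison;
# the distances and parameters below are invented.
p_static = posterior(distance=0.4, a=-4.0, b=3.0)   # smaller distance -> higher posterior
p_dynamic = posterior(distance=1.0, a=-2.5, b=3.5)

# Weighted sum rule: weight the stronger modality more heavily.
w = 0.7
p_fused = w * p_static + (1 - w) * p_dynamic
print(round(p_fused, 3), p_fused > 0.5)  # → 0.781 True
```

Because each modality is first mapped to a calibrated probability, modalities of very different strength can be combined on a common scale, which is the point the paper makes about imbalanced classifiers.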

  6. Reasons for low aerodynamic performance of 13.5-centimeter-tip-diameter aircraft engine starter turbine

    NASA Technical Reports Server (NTRS)

    Haas, J. E.; Roelke, R. J.; Hermann, P.

    1981-01-01

    The reasons for the low aerodynamic performance of a 13.5 cm tip diameter aircraft engine starter turbine were investigated. Both the stator and the stage were evaluated. Approximately 10 percent improvement in turbine efficiency was obtained when the honeycomb shroud over the rotor blade tips was filled to obtain a solid shroud surface. Efficiency improvements were obtained for three rotor configurations when the shroud was filled. It is suggested that the large loss associated with the open honeycomb shroud is due primarily to energy loss associated with gas transportation as a result of the blade to blade pressure differential at the tip section.

  7. A study of the use of linear programming techniques to improve the performance in design optimization problems

    NASA Technical Reports Server (NTRS)

    Young, Katherine C.; Sobieszczanski-Sobieski, Jaroslaw

    1988-01-01

    This project has two objectives. The first is to determine whether linear programming techniques can improve performance, relative to the feasible directions algorithm, when handling design optimization problems with a large number of design variables and constraints. The second is to determine whether using the Kreisselmeier-Steinhauser (KS) function to replace the constraints with a single constraint will reduce the cost of the total optimization. Comparisons are made using solutions obtained with linear and nonlinear methods. The results indicate that there is no cost saving using the linear method or using the KS function to replace constraints.
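
The KS function referred to here aggregates many inequality constraints g_i(x) <= 0 into one smooth envelope constraint. A small sketch (the rho value and constraint values are illustrative):

```python
import math

def ks_aggregate(constraints, rho=50.0):
    """Kreisselmeier-Steinhauser envelope of constraints g_i(x) <= 0:
    KS(g) = max(g) + ln(sum exp(rho * (g_i - max(g)))) / rho,
    a smooth upper bound on max(g_i) that tightens as rho grows."""
    g_max = max(constraints)  # factored out for numerical stability
    total = sum(math.exp(rho * (g - g_max)) for g in constraints)
    return g_max + math.log(total) / rho

g = [-0.3, -0.05, -0.2]   # three satisfied inequality constraints (illustrative)
ks = ks_aggregate(g)
print(round(ks, 6))       # slightly above max(g) = -0.05
```

Enforcing KS(g) <= 0 conservatively enforces all the original constraints with a single differentiable function, which is what makes it attractive for reducing optimization cost.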

  8. Organizational climate, occupational stress, and employee mental health: mediating effects of organizational efficiency.

    PubMed

    Arnetz, Bengt B; Lucas, Todd; Arnetz, Judith E

    2011-01-01

    To determine whether the relationship between organizational climate and employee mental health is consistent (ie, invariant) or differs across four large hospitals, and whether organizational efficiency mediates this relationship. Participants (total N = 5316) completed validated measures of organizational climate variables (social climate, participatory management, goal clarity, and performance feedback), organizational efficiency, occupational stress, and mental health. Path analysis best supported a model in which organizational efficiency partially mediated relationships between organizational climate, occupational stress, and mental health. Focusing on improving both the psychosocial work environment and organizational efficiency might contribute to decreased employee stress, improved mental well-being, and organizational performance.

  9. Effect of pentacene/Ag anode buffer and UV-ozone treatment on durability of small-molecule organic solar cells

    NASA Astrophysics Data System (ADS)

    Inagaki, S.; Sueoka, S.; Harafuji, K.

    2017-06-01

    Three surface modifications of indium tin oxide (ITO) are experimentally investigated to improve the performance of small-molecule organic solar cells (OSCs) with an ITO/anode buffer layer (ABL)/copper phthalocyanine (CuPc)/fullerene/bathocuproine/Ag structure. An ultrathin Ag ABL and ultraviolet (UV)-ozone treatment of ITO independently improve the durability of OSCs against illumination stress. The thin pentacene ABL provides good ohmic contact between the ITO and the CuPc layer, thereby producing a large short-circuit current. The combined use of the abovementioned three modifications collectively achieves both better initial performance and durability against illumination stress.

  10. Critical validation studies of neurofeedback.

    PubMed

    Gruzelier, John; Egner, Tobias

    2005-01-01

    The field of neurofeedback training has proceeded largely without validation. In this article the authors review studies directed at validating sensory motor rhythm, beta and alpha-theta protocols for improving attention, memory, and music performance in healthy participants. Importantly, benefits were demonstrable with cognitive and neurophysiologic measures that were predicted on the basis of regression models of learning to enhance sensory motor rhythm and beta activity. The first evidence of operant control over the alpha-theta ratio is provided, together with remarkable improvements in artistic aspects of music performance equivalent to two class grades in conservatory students. These are initial steps in providing a much needed scientific basis to neurofeedback.

  11. Co-digestion of sewage sludge from external small WWTP's in a large plant

    NASA Astrophysics Data System (ADS)

    Miodoński, Stanisław

    2017-11-01

    Improving the energy efficiency of WWTPs (wastewater treatment plants) is a crucial task in modern wastewater treatment technology. Optimization of the treatment process is important, but the main goal cannot be achieved without increasing the production of renewable energy from sewage sludge via anaerobic digestion, which is the most common sludge stabilization method at large WWTPs. Anaerobic digestion reactors are usually designed with a reserve and most are oversized; in many cases that reserve goes unused. On the other hand, smaller WWTPs have problems managing sewage sludge due to a lack of adequately developed infrastructure for sludge stabilization. This paper presents an analysis of using the technological reserve of anaerobic digestion reactors at a large WWTP (1 million P.E.) to stabilize sludge collected from smaller WWTPs in a co-digestion process. Over 30 small WWTPs from the same region as the large WWTP were considered in this study. The analysis also evaluated potential sludge disintegration pre-treatment for improving co-digestion efficiency.

  12. Performance of Compressor of XJ-41-V Turbojet Engine. 4; Performance Analysis Over Range of Compressor Speeds from 5000 to 10,000 RPM

    NASA Technical Reports Server (NTRS)

    Creagh, John W. R.; Ginsburg, Ambrose

    1948-01-01

    An investigation of the XJ-41-V turbojet-engine compressor was conducted to determine the performance of the compressor and to obtain fundamental information on the aerodynamic problems associated with large centrifugal-type compressors. The results of the research conducted on the original compressor indicated that the compressor would not meet the desired engine-design air-flow requirements because of an air-flow restriction in the vaned collector. The compressor air-flow choking point occurred near the entrance to the vaned-collector passage and was caused by a poor mass-flow distribution at the vane entrance, with relatively large negative angles of attack of the air stream along the entrance edges of the vanes at the outer passage wall and large positive angles of attack at the inner passage wall. As a result of the analysis, a design change to the vaned-collector entrance is recommended to improve the maximum flow capacity of the compressor.

  13. Durham extremely large telescope adaptive optics simulation platform.

    PubMed

    Basden, Alastair; Butterley, Timothy; Myers, Richard; Wilson, Richard

    2007-03-01

    Adaptive optics systems are essential on all large telescopes for which image quality is important. These are complex systems with many design parameters requiring optimization before good performance can be achieved. The simulation of adaptive optics systems is therefore necessary to categorize the expected performance. We describe an adaptive optics simulation platform, developed at Durham University, which can be used to simulate adaptive optics systems on the largest proposed future extremely large telescopes as well as on current systems. This platform is modular, object oriented, and has the benefit of hardware application acceleration that can be used to improve the simulation performance, essential for ensuring that the run time of a given simulation is acceptable. The simulation platform described here can be highly parallelized using parallelization techniques suited for adaptive optics simulation, while still offering the user complete control while the simulation is running. The results from the simulation of a ground layer adaptive optics system are provided as an example to demonstrate the flexibility of this simulation platform.

  14. Analysis on applicable error-correcting code strength of storage class memory and NAND flash in hybrid storage

    NASA Astrophysics Data System (ADS)

    Matsui, Chihiro; Kinoshita, Reika; Takeuchi, Ken

    2018-04-01

    A hybrid of storage class memory (SCM) and NAND flash is a promising technology for high performance storage. Error correction is inevitable on SCM and NAND flash because their bit error rate (BER) increases with write/erase (W/E) cycles, data retention, and program/read disturb. In addition, scaling and multi-level cell technologies increase BER. However, error-correcting code (ECC) degrades storage performance because of extra memory reading and encoding/decoding time. Therefore, the applicable ECC strength of SCM and NAND flash is evaluated independently by fixing the ECC strength of one memory in the hybrid storage. As a result, a weak BCH ECC with a small number of correctable bits is recommended for SCM in hybrid storage with large SCM capacity, because SCM is accessed frequently. In contrast, a strong but long-latency LDPC ECC can be applied to NAND flash in hybrid storage with large SCM capacity, because the large-capacity SCM sustains the storage performance.
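To make the per-access cost side of this tradeoff concrete, the sketch below implements Hamming(7,4), the simplest single-error-correcting block code. It is a hypothetical stand-in for the much stronger BCH and LDPC codes the paper evaluates, chosen only because its encode and decode steps fit in a few lines.

```python
# Single-error-correcting Hamming(7,4) over GF(2): every memory read
# must pay for a syndrome computation like the one in decode() below.
import numpy as np

# Generator matrix (data bits followed by three parity bits).
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
# Parity-check matrix; its columns are all distinct, so a single-bit
# error produces a syndrome identifying the flipped position.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(bits4):
    return (bits4 @ G) % 2

def decode(word7):
    syndrome = (H @ word7) % 2
    if syndrome.any():
        # The syndrome equals the column of H at the error position.
        pos = np.where((H.T == syndrome).all(axis=1))[0][0]
        word7 = word7.copy()
        word7[pos] ^= 1
    return word7[:4]

msg = np.array([1, 0, 1, 1])
cw = encode(msg)
cw[2] ^= 1  # inject a single bit error into the stored codeword
```

Stronger codes correct more bits per codeword but, as the abstract notes, at the cost of longer decode latency on every read, which is why the frequently accessed SCM favors the weaker code.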

  15. Profiling Oman education data using data mining approach

    NASA Astrophysics Data System (ADS)

    Alawi, Sultan Juma Sultan; Shaharanee, Izwan Nizal Mohd; Jamil, Jastini Mohd

    2017-10-01

    The large amount of data generated by application services in different learning fields has created new challenges for the education sector. An education portal is an important system that supports the development of the field. This research paper presents data mining techniques for understanding and summarizing the data generated by the Ministry of Education Oman "Educational Portal". The research performs student profiling on the Oman student database, using the k-means clustering technique to determine students' profiles. A total of 42,484 student records from the Sultanate of Oman was extracted for this study. The findings demonstrate the practicality of clustering techniques for investigating students' profiles, allowing a better understanding of students' behaviour and academic performance. The Oman Education Portal contains a large amount of user activity and interaction data, and analysis of these data can help educators improve student performance levels and recognize students who need additional attention.
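The profiling step described above can be sketched with a plain k-means implementation; the feature names and the synthetic student groups below are invented for illustration and do not reflect the portal's actual fields.

```python
# Illustrative k-means student profiling; not the authors' pipeline.
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Plain Lloyd's algorithm: assign each point to its nearest
    centroid, then move each centroid to the mean of its members."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

# Hypothetical per-student features: [attendance rate, grade average,
# portal use], drawn as three loose groups purely for demonstration.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(m, 0.05, size=(100, 3))
               for m in ([0.2, 0.3, 0.1], [0.5, 0.6, 0.5], [0.9, 0.8, 0.9])])
labels, centroids = kmeans(X, k=3)
```

Each resulting cluster centroid can then be read as a student "profile" (e.g. low-engagement vs. high-performing), which is the interpretation step the study performs on the real records.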

  16. Periodically-Scheduled Controller Analysis using Hybrid Systems Reachability and Continuization

    DTIC Science & Technology

    2015-12-01

    tools to verify specifications for hybrid automata do not perform well on such periodically scheduled models. This is due to a combination of the large...an additive nondeterministic input. Reachability tools for hybrid automata can better handle such systems. We further improve the analysis by...formally as a hybrid automaton.

  17. Performance Analysis of the Unitree Central File

    NASA Technical Reports Server (NTRS)

    Pentakalos, Odysseas I.; Flater, David

    1994-01-01

    This report consists of two parts. The first part briefly comments on the documentation status of two major systems at NASA's Center for Computational Sciences, specifically the Cray C98 and the Convex C3830. The second part describes the work done on improving the performance of file transfers between the Unitree Mass Storage System, running on the Convex file server, and users' workstations distributed over a large geographic area.

  18. Survey of Nickel-Aluminium-Bronze Casting Alloys on Marine Applications,

    DTIC Science & Technology

    1981-04-01

    and corrosion performance of nickel-aluminium bronze (NAB)/covered by naval specification DGS-8520 and DGS-348 have been investigated. No evidence was...found to suggest that there would be any significant difference in corrosion performance between alloys meeting the two specifications. Early... corrosion problems associated with the weld repair areas of castings have been overcome largely by using improved foundry and welding techniques followed by a

  19. Large-Eddy Simulations of Atmospheric Flows Over Complex Terrain Using the Immersed-Boundary Method in the Weather Research and Forecasting Model

    NASA Astrophysics Data System (ADS)

    Ma, Yulong; Liu, Heping

    2017-12-01

    Atmospheric flow over complex terrain, particularly recirculation flows, greatly influences wind-turbine siting, forest-fire behaviour, and trace-gas and pollutant dispersion. However, there is a large uncertainty in the simulation of flow over complex topography, which is attributable to the type of turbulence model, the subgrid-scale (SGS) turbulence parametrization, terrain-following coordinates, and numerical errors in finite-difference methods. Here, we upgrade the large-eddy simulation module within the Weather Research and Forecasting model by incorporating the immersed-boundary method into the module to improve simulations of the flow and recirculation over complex terrain. Simulations over the Bolund Hill indicate improved mean absolute speed-up errors with respect to previous studies, as well as an improved simulation of the recirculation zone behind the escarpment of the hill. With regard to the SGS parametrization, the Lagrangian-averaged scale-dependent Smagorinsky model performs better than the classic Smagorinsky model in reproducing both velocity and turbulent kinetic energy. A finer grid resolution also improves the strength of the recirculation in flow simulations, with a higher horizontal grid resolution improving simulations just behind the escarpment, and a higher vertical grid resolution improving results on the lee side of the hill. Our modelling approach has broad applications for the simulation of atmospheric flows over complex topography.

  20. Roll-to-roll fabrication of large scale and regular arrays of three-dimensional nanospikes for high efficiency and flexible photovoltaics

    PubMed Central

    Leung, Siu-Fung; Gu, Leilei; Zhang, Qianpeng; Tsui, Kwong-Hoi; Shieh, Jia-Min; Shen, Chang-Hong; Hsiao, Tzu-Hsuan; Hsu, Chin-Hung; Lu, Linfeng; Li, Dongdong; Lin, Qingfeng; Fan, Zhiyong

    2014-01-01

    Three-dimensional (3-D) nanostructures have demonstrated enticing potential to boost the performance of photovoltaic devices, primarily owing to their improved photon-capturing capability. Nevertheless, cost-effective and scalable fabrication of regular 3-D nanostructures with decent robustness and flexibility remains a challenging task. Meanwhile, establishing rational design guidelines for 3-D nanostructured solar cells that balance electrical and optical performance is of paramount importance and urgently needed. Herein, regular arrays of 3-D nanospikes (NSPs) were fabricated on flexible aluminum foil with a roll-to-roll-compatible process. The NSPs have precisely controlled geometry and periodicity, which allows systematic investigation of the geometry-dependent optical and electrical performance of the devices with experiments and modeling. Intriguingly, it was discovered that the efficiency of an amorphous-Si (a-Si) photovoltaic device fabricated on NSPs can be improved by 43% compared with its planar counterpart in an optimal case. Furthermore, large-scale flexible NSP solar cell devices have been fabricated and demonstrated. These results not only shed light on the design rules for high-performance nanostructured solar cells but also demonstrate a highly practical process for fabricating efficient solar panels with 3-D nanostructures, and thus may have an immediate impact on the thin-film photovoltaic industry. PMID:24603964

  1. Roll-to-roll fabrication of large scale and regular arrays of three-dimensional nanospikes for high efficiency and flexible photovoltaics.

    PubMed

    Leung, Siu-Fung; Gu, Leilei; Zhang, Qianpeng; Tsui, Kwong-Hoi; Shieh, Jia-Min; Shen, Chang-Hong; Hsiao, Tzu-Hsuan; Hsu, Chin-Hung; Lu, Linfeng; Li, Dongdong; Lin, Qingfeng; Fan, Zhiyong

    2014-03-07

    Three-dimensional (3-D) nanostructures have demonstrated enticing potential to boost the performance of photovoltaic devices, primarily owing to their improved photon-capturing capability. Nevertheless, cost-effective and scalable fabrication of regular 3-D nanostructures with decent robustness and flexibility remains a challenging task. Meanwhile, establishing rational design guidelines for 3-D nanostructured solar cells that balance electrical and optical performance is of paramount importance and urgently needed. Herein, regular arrays of 3-D nanospikes (NSPs) were fabricated on flexible aluminum foil with a roll-to-roll-compatible process. The NSPs have precisely controlled geometry and periodicity, which allows systematic investigation of the geometry-dependent optical and electrical performance of the devices with experiments and modeling. Intriguingly, it was discovered that the efficiency of an amorphous-Si (a-Si) photovoltaic device fabricated on NSPs can be improved by 43% compared with its planar counterpart in an optimal case. Furthermore, large-scale flexible NSP solar cell devices have been fabricated and demonstrated. These results not only shed light on the design rules for high-performance nanostructured solar cells but also demonstrate a highly practical process for fabricating efficient solar panels with 3-D nanostructures, and thus may have an immediate impact on the thin-film photovoltaic industry.

  2. Experiments on tandem diffusers with boundary-layer suction applied in between

    NASA Technical Reports Server (NTRS)

    Barna, P. S.

    1979-01-01

    Experiments were performed on conical diffusers of various configurations with the same, but rather unusually large, 16:1 area ratio. Because available performance data on diffusers fall short of very large area ratio configurations, an unconventional design, consisting of two diffusers following each other in tandem, was proposed. Both diffusers had the same area ratio of 4:1, but had different taper angles. While for the first diffuser (called leading) the angle remained constant, for the second (called follower), the taper angle was stepped up to higher values. Boundary layer control, by way of suction, was applied between the diffusers, and a single slot suction ring was inserted between them. The leading diffuser had an enclosed nominal divergence angle 2 theta = 5 degrees, while the follower diffusers had either 10, 20, 30, or 40 degrees, respectively, giving 4 combinations. The experiments were performed at four different Reynolds numbers with various suction rates. The results indicate a general improvement in the performance of all diffusers with boundary-layer suction. It appears that the improvement of the pressure recovery depends on both the Reynolds number and the suction rate, and the largest increase, 0.075, was found at the lowest Reynolds number when the follower divergence was 2 theta = 40 degrees.

  3. Large-Scale Wind-Tunnel Tests and Evaluation of the Low-Speed Performance of a 35 deg Sweptback Wing Jet Transport Model Equipped with a Blowing Boundary-Layer-Control Flap and Leading-Edge Slat

    NASA Technical Reports Server (NTRS)

    Hickey, David H.; Aoyagi, Kiyoshi

    1960-01-01

    A wind-tunnel investigation was conducted to determine the effect of trailing-edge flaps with blowing-type boundary-layer control and leading-edge slats on the low-speed performance of a large-scale jet transport model with four engines and a 35 deg. sweptback wing of aspect ratio 7. Two spanwise extents and several deflections of the trailing-edge flap were tested. Results were obtained with a normal leading-edge and with full-span leading-edge slats. Three-component longitudinal force and moment data and boundary-layer-control flow requirements are presented. The test results are analyzed in terms of possible improvements in low-speed performance. The effect on performance of the source of boundary-layer-control air flow is considered in the analysis.

  4. Improved Edge Performance in MRF

    NASA Technical Reports Server (NTRS)

    Shorey, Aric; Jones, Andrew; Durnas, Paul; Tricard, Marc

    2004-01-01

    The fabrication of large segmented optics requires a polishing process that can correct the figure of a surface to within a short distance from its edges, typically a few millimeters. The work described here develops QED's Magnetorheological Finishing (MRF) precision polishing process to minimize residual edge effects.

  5. IMPROVING HYDROLOGIC SUSTAINABILITY OF TEXAS A&M UNIVERSITY CAMPUS

    EPA Science Inventory

    For small to mid-sized rain events, LID scenarios, including permeable pavements, rainwater harvesting, green roofs, and riparian buffer strips perform similarly to a conventional Best Management Practice, a detention pond, with respect to peak flows and HFR. For large rain ev...

  6. The evolution of commercial launch vehicles : fourth quarter 2001 Quarterly Launch Report

    DOT National Transportation Integrated Search

    2001-01-01

    Launch vehicle performance continues to constantly improve, in large part to meet the demands of an increasing number of larger satellites. Current vehicles are very likely to be changed from last year's versions and are certainly not the same as one...

  7. Generating enhanced site topography data to improve permeable pavement performance assessment methods - presentation

    EPA Science Inventory

    Permeable pavement surfaces are infiltration based stormwater control measures (SCM) commonly applied in parking lots to decrease impervious area and reduce runoff volume. Many are not optimally designed however, as little attention is given to draining a large enough contributin...

  8. Performance Evaluation and Improving Mechanisms of Diatomite-Modified Asphalt Mixture

    PubMed Central

    Yang, Chao; Xie, Jun; Zhou, Xiaojun; Liu, Quantao; Pang, Ling

    2018-01-01

    Diatomite is an inorganic natural resource with large reserves. This study consists of two phases to evaluate the effects of diatomite on asphalt mixtures. In the first phase, we characterized the diatomite in terms of mineralogical properties, chemical composition, particle size distribution, mesopore distribution, morphology, and IR spectra. In the second phase, the road performance of asphalt mixtures with diatomite, in terms of permanent deformation, cracking, fatigue, and moisture resistance, was investigated. The characterization shows that diatomite is a porous material with high SiO2 content and a large specific surface area. It promotes asphalt absorption and therefore enhances bonding between asphalt and aggregate; however, the FTIR results indicate physical absorption rather than chemical reaction. The resistance of asphalt mixtures with diatomite to permanent deformation and moisture is superior to that of the control mixtures, but the addition of diatomite does not improve the crack and fatigue resistance of the asphalt mixture. PMID:29702579

  9. Lockheed L-1011 TriStar to support Adaptive Performance Optimization study with NASA F-18 chase plane

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This Lockheed L-1011 TriStar, seen here in June 1995, is currently the subject of a new flight research experiment developed by NASA's Dryden Flight Research Center, Edwards, California, to improve the efficiency of large transport aircraft. Shown with a NASA F-18 chase plane over California's Sierra Nevada mountains during an earlier baseline flight, the jetliner, operated by Orbital Sciences Corp., recently flew its first data-gathering mission in the Adaptive Performance Optimization project. The experiment seeks to reduce the fuel consumption of large jetliners by improving the aerodynamic efficiency of their wings at cruise conditions. A research computer employing a sophisticated software program adapts to changing flight conditions by commanding small movements of the L-1011's outboard ailerons to give its wings the most efficient, or optimal, airfoil. Up to a dozen research flights will be flown in the current and follow-on phases of the project over the next couple of years.

  10. Performance Evaluation and Improving Mechanisms of Diatomite-Modified Asphalt Mixture.

    PubMed

    Yang, Chao; Xie, Jun; Zhou, Xiaojun; Liu, Quantao; Pang, Ling

    2018-04-27

    Diatomite is an inorganic natural resource with large reserves. This study consists of two phases to evaluate the effects of diatomite on asphalt mixtures. In the first phase, we characterized the diatomite in terms of mineralogical properties, chemical composition, particle size distribution, mesopore distribution, morphology, and IR spectra. In the second phase, the road performance of asphalt mixtures with diatomite, in terms of permanent deformation, cracking, fatigue, and moisture resistance, was investigated. The characterization shows that diatomite is a porous material with high SiO₂ content and a large specific surface area. It promotes asphalt absorption and therefore enhances bonding between asphalt and aggregate; however, the FTIR results indicate physical absorption rather than chemical reaction. The resistance of asphalt mixtures with diatomite to permanent deformation and moisture is superior to that of the control mixtures, but the addition of diatomite does not improve the crack and fatigue resistance of the asphalt mixture.

  11. Reducing graphene device variability with yttrium sacrificial layers

    NASA Astrophysics Data System (ADS)

    Wang, Ning C.; Carrion, Enrique A.; Tung, Maryann C.; Pop, Eric

    2017-05-01

    Graphene technology has made great strides since the material was isolated more than a decade ago. However, despite improvements in growth quality and numerous "hero" devices, challenges of uniformity remain, restricting the large-scale development of graphene-based technologies. Here, we investigate and reduce the variability of graphene transistors by studying the effects of contact metals (with and without a Ti layer), resist, and yttrium (Y) sacrificial layers during the fabrication of hundreds of devices. We find that with optical photolithography, residual resist and process contamination are unavoidable, ultimately limiting the device performance and yield. However, using Y sacrificial layers to isolate the graphene from processing conditions improves the yield (from 73% to 97%), the average device performance (three-fold increase of mobility and 58% lower contact resistance), and the device-to-device variability (standard deviation of Dirac voltage reduced by 20%). In contrast to other sacrificial layer techniques, the removal of the Y sacrificial layer with dilute HCl does not harm surrounding materials, simplifying large-scale graphene fabrication.

  12. Using Neural Networks to Improve the Performance of Radiative Transfer Modeling Used for Geometry Dependent Surface Lambertian-Equivalent Reflectivity Calculations

    NASA Technical Reports Server (NTRS)

    Fasnacht, Zachary; Qin, Wenhan; Haffner, David P.; Loyola, Diego; Joiner, Joanna; Krotkov, Nickolay; Vasilkov, Alexander; Spurr, Robert

    2017-01-01

    Surface Lambertian-equivalent reflectivity (LER) is important for trace gas retrievals in the direct calculation of cloud fractions and the indirect calculation of the air mass factor. Current trace gas retrievals use climatological surface LERs. Surface properties that affect the bidirectional reflectance distribution function (BRDF), as well as varying satellite viewing geometry, can be important for the retrieval of trace gases. Geometry-Dependent LER (GLER) captures these effects with its calculation of sun-normalized radiances (I/F) and can be used in current LER algorithms (Vasilkov et al. 2016). Pixel-by-pixel radiative transfer calculations are computationally expensive for large datasets, and modern satellite missions such as the Tropospheric Monitoring Instrument (TROPOMI) produce very large datasets as they take measurements at much higher spatial and spectral resolutions. Look-up table (LUT) interpolation improves the speed of radiative transfer calculations, but its complexity increases for non-linear functions. Neural networks perform fast calculations and can accurately predict both non-linear and linear functions with little effort.
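The speed argument can be illustrated with a toy surrogate: a tiny network trained to reproduce a smooth non-linear "radiance" function. The target function, network size, and training details here are invented for illustration and are not the authors' GLER model.

```python
# Toy neural-network surrogate for an expensive per-pixel calculation.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for radiative transfer: a smooth non-linear map from a
# geometry input (e.g. cosine of solar zenith angle) to radiance I/F.
def rt_model(x):
    return 0.5 + 0.3 * np.sin(3 * x) * np.exp(-x)

X = rng.uniform(0, 2, size=(512, 1))
y = rt_model(X)

# One-hidden-layer tanh network trained by full-batch gradient descent.
W1 = rng.normal(0, 0.5, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.05
losses = []
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)              # forward pass
    pred = h @ W2 + b2
    err = pred - y
    losses.append(float((err ** 2).mean()))
    # backpropagation of the mean-squared-error gradient
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
```

Once trained, evaluating the surrogate is just two small matrix multiplies per pixel, which is the property that makes this approach attractive for TROPOMI-scale datasets.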

  13. Breaking bad news is a teachable skill in pediatric residents: A feasibility study of an educational intervention.

    PubMed

    Reed, Suzanne; Kassis, Karyn; Nagel, Rollin; Verbeck, Nicole; Mahan, John D; Shell, Richard

    2015-06-01

    Patients and physicians identify communication of bad news as a skill in need of improvement. Our objectives were to measure change in performance of first-year pediatric residents in the delivery of bad news after an educational intervention and to measure if changes in performance were sustained over time. Communication skills of 29 residents were assessed via videotaped standardized patient (SP) encounters at 3 time points: baseline, immediately post-intervention, and 3 months post-intervention. Educational intervention used was the previously published "GRIEV_ING Death Notification Protocol." The intraclass correlation coefficient demonstrated substantial inter-rater agreement with the assessment tool. Performance scores significantly improved from baseline to immediate post-intervention. Performance at 3 months post-intervention showed no change in two subscales and small improvement in one subscale. We concluded that breaking bad news is a complex and teachable skill that can be developed in pediatric residents. Improvement was sustained over time, indicating the utility of this educational intervention. This study brings attention to the need for improved communication training, and the feasibility of an education intervention in a large training program. Further work in development of comprehensive communication curricula is necessary in pediatric graduate medical education programs.

  14. Improving Physical Task Performance with Counterfactual and Prefactual Thinking.

    PubMed

    Hammell, Cecilia; Chan, Amy Y C

    2016-01-01

    Counterfactual thinking (reflecting on "what might have been") has been shown to enhance future performance by translating information about past mistakes into plans for future action. Prefactual thinking (imagining "what might be if…") may serve a greater preparative function than counterfactual thinking as it is future-orientated and focuses on more controllable features, thus providing a practical script to prime future behaviour. However, whether or not this difference in hypothetical thought content may translate into a difference in actual task performance has been largely unexamined. In Experiment 1 (n = 42), participants performed trials of a computer-simulated physical task, in between which they engaged in either task-related hypothetical thinking (counterfactual or prefactual) or an unrelated filler task (control). As hypothesised, prefactuals contained more controllable features than counterfactuals. Moreover, participants who engaged in either form of hypothetical thinking improved significantly in task performance over trials compared to participants in the control group. The difference in thought content between counterfactuals and prefactuals, however, did not yield a significant difference in performance improvement. Experiment 2 (n = 42) replicated these findings in a dynamic balance task environment. Together, these findings provide further evidence for the preparatory function of counterfactuals, and demonstrate that prefactuals share this same functional characteristic.

  15. Improving Large-Scale Image Retrieval Through Robust Aggregation of Local Descriptors.

    PubMed

    Husain, Syed Sameed; Bober, Miroslaw

    2017-09-01

    Visual search and image retrieval underpin numerous applications; however, the task is still challenging, predominantly due to the variability of object appearance and the ever-increasing size of the databases, often exceeding billions of images. Prior art methods rely on aggregation of local scale-invariant descriptors, such as SIFT, via mechanisms including Bag of Visual Words (BoW), Vector of Locally Aggregated Descriptors (VLAD) and Fisher Vectors (FV). However, their performance is still short of what is required. This paper presents a novel method for deriving a compact and distinctive representation of image content called Robust Visual Descriptor with Whitening (RVD-W). It significantly advances the state of the art and delivers world-class performance. In our approach, local descriptors are rank-assigned to multiple clusters. Residual vectors are then computed in each cluster, normalized using a direction-preserving normalization function and aggregated based on the neighborhood rank. Importantly, the residual vectors are de-correlated and whitened in each cluster before aggregation, leading to a balanced energy distribution in each dimension and significantly improved performance. We also propose a new post-PCA normalization approach which improves separability between the matching and non-matching global descriptors. This new normalization benefits not only our RVD-W descriptor but also improves existing approaches based on FV and VLAD aggregation. Furthermore, we show that the aggregation framework developed using hand-crafted SIFT features also performs exceptionally well with Convolutional Neural Network (CNN) based features. The RVD-W pipeline outperforms state-of-the-art global descriptors on both the Holidays and Oxford datasets. On the large-scale datasets, Holidays1M and Oxford1M, the SIFT-based RVD-W representation obtains a mAP of 45.1 and 35.1 percent, while CNN-based RVD-W achieves a mAP of 63.5 and 44.8 percent, all yielding superior performance to the state of the art.
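The aggregation pipeline the paper builds on can be sketched in the plain VLAD style: local descriptors are assigned to codebook clusters, per-cluster residuals are summed, and the result is normalized. RVD-W's rank-based multi-assignment and per-cluster whitening are omitted here, and the codebook and descriptors are random stand-ins.

```python
# Minimal VLAD-style aggregation of local descriptors into one
# global image descriptor; a simplified relative of RVD-W.
import numpy as np

rng = np.random.default_rng(0)
k, d = 8, 32                          # codebook size, descriptor dim
centroids = rng.normal(size=(k, d))   # stand-in for a learned codebook

def vlad(descriptors):
    # Assign each local descriptor to its nearest codebook centroid.
    dists = np.linalg.norm(descriptors[:, None] - centroids[None], axis=2)
    assign = dists.argmin(axis=1)
    agg = np.zeros((k, d))
    for j in range(k):
        members = descriptors[assign == j]
        if len(members):
            # Sum residuals to the centroid (the "VLAD" step).
            agg[j] = (members - centroids[j]).sum(axis=0)
    v = agg.ravel()
    v = np.sign(v) * np.sqrt(np.abs(v))   # power-law normalization
    return v / (np.linalg.norm(v) + 1e-12)  # L2-normalize

image_descriptors = rng.normal(size=(200, d))  # e.g. SIFT or CNN features
global_desc = vlad(image_descriptors)
```

Retrieval then reduces to comparing such fixed-length global descriptors by dot product, which is what makes the representation usable at the million-image scale reported above.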

  16. Improving the performance of auto-parametric pendulum absorbers by means of a flexural beam

    NASA Astrophysics Data System (ADS)

    Mahmoudkhani, S.

    2018-07-01

    Auto-parametric pendulum absorbers perform well only in a very limited range of excitation amplitudes, above which their efficiency is substantially degraded by spillover effects or by the appearance of quasi-periodic and chaotic responses. To address this drawback, the rigid pendulum is replaced in the present study with a low-stiffness viscoelastic beam, and an additional one-to-three internal resonance between the almost non-flexural rotational mode and the first flexural mode of the beam is introduced. With the aid of this internal resonance, the energy transferred to the absorber through the one-to-two internal resonance is prevented from flowing back to the primary system, because vibrations are dissipated faster in the higher-frequency mode, thereby reducing spillover effects. For modeling purposes, a tracking frame with the rigid-body constraint and the third-order nonlinear beam theory are employed to account for arbitrarily large rotation angles coupled to moderately large elastic deformations. The assumed-mode method is used to obtain discretized equations of motion. Numerical continuation of periodic solutions is performed, and the bifurcations with detrimental effects on the performance are determined. Parametric studies show that, with proper setting of the system parameters, higher efficiencies can be achieved over a much wider range of excitation amplitudes.

  17. Improving Large Cetacean Implantable Satellite Tag Designs to Maximize Tag Robustness and Minimize Health Effects to Individual Animals

    DTIC Science & Technology

    2014-09-30

    ...relatively small for quantitative comparisons and some of the deployed tags are still transmitting, their overall performance appears to have improved.

  18. An evaluation of multi-probe locality sensitive hashing for computing similarities over web-scale query logs.

    PubMed

    Cormode, Graham; Dasgupta, Anirban; Goyal, Amit; Lee, Chi Hoon

    2018-01-01

    Many modern applications of AI, such as web search, mobile browsing, image processing, and natural language processing, rely on finding similar items in a large database of complex objects. Due to the very large scale of the data involved (e.g., users' queries from commercial search engines), computing such near or nearest neighbors is a non-trivial task, as the computational cost grows significantly with the number of items. To address this challenge, we adopt Locality Sensitive Hashing (a.k.a. LSH) methods and evaluate four variants in a distributed computing environment (specifically, Hadoop). We identify several optimizations which improve performance and are suitable for deployment at very large scale. The experimental results demonstrate that our variants of LSH achieve robust performance with better recall compared with "vanilla" LSH, even when using the same amount of space.
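A minimal sketch of the underlying LSH idea for cosine similarity, using random-hyperplane signatures; the multi-probe variants evaluated in the paper additionally probe neighboring buckets to raise recall with fewer hash tables. All data here are random stand-ins.

```python
# Random-hyperplane LSH: nearby vectors (in cosine distance) tend to
# fall on the same side of random hyperplanes, so they share buckets.
import numpy as np

rng = np.random.default_rng(0)
dim, k = 64, 16
planes = rng.normal(size=(k, dim))  # k random hyperplanes

def signature(v):
    # Each bit records which side of one hyperplane the vector is on.
    return tuple((planes @ v > 0).astype(int))

# Index a database of vectors by their k-bit signature bucket.
db = rng.normal(size=(1000, dim))
buckets = {}
for i, v in enumerate(db):
    buckets.setdefault(signature(v), []).append(i)

# A query near db[0] is likely to hash into db[0]'s bucket, so only
# that bucket's members need exact similarity checks.
query = db[0] + 0.001 * rng.normal(size=dim)
candidates = buckets.get(signature(query), [])
```

The cost saving comes from comparing the query only against its bucket's candidates instead of all items; multi-probe LSH additionally visits buckets whose signatures differ in the least-confident bits, trading a few extra probes for recall.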

  19. Comparative analysis of machine learning methods in ligand-based virtual screening of large compound libraries.

    PubMed

    Ma, Xiao H; Jia, Jia; Zhu, Feng; Xue, Ying; Li, Ze R; Chen, Yu Z

    2009-05-01

    Machine learning methods have been explored as ligand-based virtual screening tools for facilitating drug lead discovery. These methods predict compounds of specific pharmacodynamic, pharmacokinetic or toxicological properties based on structural and physicochemical properties derived from their structures. Increasing attention has been directed at these methods because of their capability to predict compounds of diverse structures and complex structure-activity relationships without requiring knowledge of the target 3D structure. This article reviews current progress in using machine learning methods for virtual screening of pharmacodynamically active compounds from large compound libraries, and analyzes and compares the reported performances of machine learning tools with those of structure-based and other ligand-based (such as pharmacophore and clustering) virtual screening methods. The feasibility of improving the performance of machine learning methods in screening large libraries is discussed.

  20. Hartmann characterization of the PEEM-3 aberration-corrected X-ray photoemission electron microscope.

    PubMed

    Scholl, A; Marcus, M A; Doran, A; Nasiatka, J R; Young, A T; MacDowell, A A; Streubel, R; Kent, N; Feng, J; Wan, W; Padmore, H A

    2018-05-01

    Aberration correction by an electron mirror dramatically improves the spatial resolution and transmission of photoemission electron microscopes. We will review the performance of the recently installed aberration corrector of the X-ray Photoemission Electron Microscope PEEM-3 and show a large improvement in the efficiency of the electron optics. Hartmann testing is introduced as a quantitative method to measure the geometrical aberrations of a cathode lens electron microscope. We find that aberration correction leads to an order of magnitude reduction of the spherical aberrations, suggesting that a spatial resolution of below 100 nm is possible at 100% transmission of the optics when using x-rays. We demonstrate this improved performance by imaging test patterns employing element and magnetic contrast.

  1. Transparent conductor-embedding nanocones for selective emitters: optical and electrical improvements of Si solar cells

    PubMed Central

    Kim, Joondong; Yun, Ju-Hyung; Kim, Hyunyub; Cho, Yunae; Park, Hyeong-Ho; Kumar, M. Melvin David; Yi, Junsin; Anderson, Wayne A.; Kim, Dong-Wook

    2015-01-01

    Periodic nanocone arrays were employed in the emitter region of high-efficiency Si solar cells. A conventional wet-etching process was used to form the nanocone arrays over a large area, which spontaneously provides the graded doping features of a selective emitter. This lowers the electrical contact resistance and enhances carrier collection owing to the high electric-field distribution through each nanocone. Optically, the convex-shaped nanocones efficiently reduce light reflection, and the incident light is effectively focused into the Si via the nanocone structure, resulting in greatly improved carrier-collection performance. This nanocone-arrayed selective emitter thus provides optical and electrical improvement simultaneously. We report a record high efficiency of 16.3% for the periodically nanoscale-patterned emitter Si solar cell. PMID:25787933

  2. Transparent conductor-embedding nanocones for selective emitters: optical and electrical improvements of Si solar cells.

    PubMed

    Kim, Joondong; Yun, Ju-Hyung; Kim, Hyunyub; Cho, Yunae; Park, Hyeong-Ho; Kumar, M Melvin David; Yi, Junsin; Anderson, Wayne A; Kim, Dong-Wook

    2015-03-19

    Periodic nanocone arrays were employed in the emitter region of high-efficiency Si solar cells. A conventional wet-etching process was used to form the nanocone arrays over a large area, which spontaneously provides the graded doping features of a selective emitter. This lowers the electrical contact resistance and enhances carrier collection owing to the high electric-field distribution through each nanocone. Optically, the convex-shaped nanocones efficiently reduce light reflection, and the incident light is effectively focused into the Si via the nanocone structure, resulting in greatly improved carrier-collection performance. This nanocone-arrayed selective emitter thus provides optical and electrical improvement simultaneously. We report a record high efficiency of 16.3% for the periodically nanoscale-patterned emitter Si solar cell.

  3. Deep Brain Stimulation using Magnetic Fields

    NASA Astrophysics Data System (ADS)

    Jiles, David; Williams, Paul; Crowther, Lawrence; Iowa State University Team; Wolfson CentreMagnetics Team

    2011-03-01

    New applications for transcranial magnetic stimulation (TMS) are developing rapidly for both diagnostic and therapeutic purposes, and so is the demand for improved performance, particularly the ability to stimulate deeper regions of the brain and to do so selectively. The coil designs presently in use are limited in their ability to stimulate the brain at depth and with high spatial focality. Consequently, any improvement in coil performance would significantly extend the usefulness of TMS in both clinical applications and academic research studies. New and improved coil designs have therefore been developed, modeled and tested in this work. A large magnetizing coil, 300 mm in diameter and compatible with a commercial TMS system, has been constructed to determine its feasibility for use as a deep brain stimulator. The results of this work suggest directions that could be pursued to further improve the coil designs.

  4. Learning to leverage existing information systems: Part 1. Principles.

    PubMed

    Neil, Nancy; Nerenz, David

    2003-10-01

    The success of performance improvement efforts depends on effective measurement and feedback regarding clinical processes and outcomes. Yet most health care organizations have fragmented rather than integrated data systems. Methods and practical guidance are provided for leveraging available information sources to obtain and create valid performance improvement-related information for use by clinicians and administrators. At Virginia Mason Health System (VMHS; Seattle), a vertically integrated hospital and multispecialty group practice, patient records are paper based and are supplemented with electronic reporting for laboratory and radiology services. Despite growth in the resources and interest devoted to organization-wide performance measurement, quality improvement, and evidence-based tools, VMHS's information systems consist of largely stand-alone, legacy systems organized around the ability to retrieve information on patients, one at a time. By 2002, without any investment in technology, VMHS had developed standardized, clinic-wide key indicators of performance updated and reported regularly at the patient, provider, site, and organizational levels. On the basis of VMHS's experience, principles can be suggested to guide other organizations to explore solutions using their own information systems: for example, start simply, but start; identify information needs; tap multiple data streams; and improve incrementally.

  5. The impact of carbon sp{sup 2} fraction of reduced graphene oxide on the performance of reduced graphene oxide contacted organic transistors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kang, Narae; Department of Physics, University of Central Florida, 12424 Research Parkway, Suite 400, Orlando, Florida 32826; Khondaker, Saiful I., E-mail: saiful@ucf.edu

    2014-12-01

    One of the major bottlenecks in fabricating high performance organic field effect transistors (OFETs) is a large interfacial contact barrier between metal electrodes and organic semiconductors (OSCs), which makes charge injection inefficient. Recently, reduced graphene oxide (RGO) has been suggested as an alternative electrode material for OFETs. RGO has tunable electronic properties, and its conductivity can be varied by several orders of magnitude by varying the carbon sp{sup 2} fraction. However, whether the sp{sup 2} fraction of RGO in the electrode affects the performance of the fabricated OFETs is yet to be investigated. In this study, we demonstrate that the performance of OFETs with pentacene as OSC and RGO as electrode can be continuously improved by increasing the carbon sp{sup 2} fraction of RGO. When compared to control palladium electrodes, the mobility of the OFETs shows an improvement of ∼200% for 61% sp{sup 2} fraction RGO, which further improves to ∼500% for 80% RGO electrodes. Similar improvements were also observed in current on-off ratio, on-current, and transconductance. Our study suggests that, in addition to π-π interaction at the RGO/pentacene interface, the tunable electronic properties of the RGO electrode have a significant role in OFET performance.

  6. Advanced Model for Extreme Lift and Improved Aeroacoustics (AMELIA)

    NASA Technical Reports Server (NTRS)

    Lichtwardt, Jonathan; Paciano, Eric; Jameson, Tina; Fong, Robert; Marshall, David

    2012-01-01

    With the very recent advent of NASA's Environmentally Responsible Aviation Project (ERA), which is dedicated to designing aircraft that will reduce the impact of aviation on the environment, there is a need for research and development of methodologies to minimize fuel burn, emissions, and reduce community noise produced by regional airliners. ERA tackles airframe technology, propulsion technology, and vehicle systems integration to meet performance objectives in the time frame for the aircraft to be at a Technology Readiness Level (TRL) of 4-6 by the year of 2020 (deemed N+2). The proceeding project that investigated similar goals to ERA was NASA's Subsonic Fixed Wing (SFW). SFW focused on conducting research to improve prediction methods and technologies that will produce lower noise, lower emissions, and higher performing subsonic aircraft for the Next Generation Air Transportation System. The work provided in this investigation was a NASA Research Announcement (NRA) contract #NNL07AA55C funded by Subsonic Fixed Wing. The project started in 2007 with a specific goal of conducting a large-scale wind tunnel test along with the development of new and improved predictive codes for the advanced powered-lift concepts. Many of the predictive codes were incorporated to refine the wind tunnel model outer mold line design. The large scale wind tunnel test goal was to investigate powered lift technologies and provide an experimental database to validate current and future modeling techniques. Powered-lift concepts investigated were Circulation Control (CC) wing in conjunction with over-the-wing mounted engines to entrain the exhaust to further increase the lift generated by CC technologies alone. The NRA was a five-year effort; during the first year the objective was to select and refine CESTOL concepts and then to complete a preliminary design of a large-scale wind tunnel model for the large scale test. 
During the second, third, and fourth years the large-scale wind tunnel model would be designed, manufactured, and calibrated, and during the fifth year the large-scale wind tunnel test was conducted. This technical memo describes all phases of the Advanced Model for Extreme Lift and Improved Aeroacoustics (AMELIA) project and provides a brief summary of the background and modeling efforts involved in the NRA. The conceptual designs considered for this project, and the decision process for the configuration selected and adapted for a wind tunnel model, are briefly discussed, along with the internal configuration of AMELIA and the internal measurements chosen to satisfy the requirement of obtaining a database of experimental data for future computational model validations. The external experimental techniques employed during the test, along with the large-scale wind tunnel test facility, are covered in great detail. Experimental measurements in the database include forces and moments, surface pressure distributions, local skin friction measurements, boundary and shear layer velocity profiles, far-field acoustic data, and noise signatures from turbofan propulsion simulators. Results for the circulation control performance, the over-the-wing mounted engines, and the combined performance are also discussed.

  7. Chromatographic hydrogen isotope separation

    DOEpatents

    Aldridge, Frederick T.

    1981-01-01

    Intermetallic compounds with the CaCu.sub.5 type of crystal structure, particularly LaNiCo.sub.4 and CaNi.sub.5, exhibit high separation factors and fast equilibrium times and therefore are useful for packing a chromatographic hydrogen isotope separation column. The addition of an inert metal to dilute the hydride improves performance of the column. A large-scale multi-stage chromatographic separation process, run as a secondary process off a hydrogen feedstream from an industrial plant which uses large volumes of hydrogen, can produce large quantities of heavy water at an effective cost for use in heavy water reactors.

  8. Chromatographic hydrogen isotope separation

    DOEpatents

    Aldridge, F.T.

    Intermetallic compounds with the CaCu/sub 5/ type of crystal structure, particularly LaNiCo/sub 4/ and CaNi/sub 5/, exhibit high separation factors and fast equilibrium times and therefore are useful for packing a chromatographic hydrogen isotope separation column. The addition of an inert metal to dilute the hydride improves performance of the column. A large-scale multi-stage chromatographic separation process, run as a secondary process off a hydrogen feedstream from an industrial plant which uses large volumes of hydrogen, can produce large quantities of heavy water at an effective cost for use in heavy water reactors.

  9. Playing off the curve - testing quantitative predictions of skill acquisition theories in development of chess performance.

    PubMed

    Gaschler, Robert; Progscha, Johanna; Smallbone, Kieran; Ram, Nilam; Bilalić, Merim

    2014-01-01

    Learning curves have been proposed as an adequate description of learning processes, no matter whether the processes manifest within minutes or across years. Different mechanisms underlying skill acquisition can lead to differences in the shape of learning curves. In the current study, we analyze the tournament performance data of 1383 chess players who began competing at a young age and played tournaments for at least 10 years. We analyze their performance development with the goal of testing the adequacy of learning curves, and the skill acquisition theories they are based on, for describing and predicting expertise acquisition. On the one hand, we show that skill acquisition theories implying a negative exponential learning curve do a better job both in describing early performance gains and in predicting later trajectories of chess performance than theories implying a power-function learning curve. On the other hand, the learning curves of a large proportion of players show systematic qualitative deviations from the predictions of either type of skill acquisition theory. While skill acquisition theories predict larger performance gains in early years and smaller gains in later years, a substantial number of players begin to show substantial improvements only after a delay of several years (with no improvement in the first years), deviations not fully accounted for by quantity of practice. The current work adds to the debate on how learning processes on a small time scale combine into large-scale changes.
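
    The competing functional forms above can be illustrated numerically. The sketch below (pure Python; the parameter values are our own assumptions, not estimates from the chess data) generates ratings from a negative exponential law P(t) = A - B*exp(-c*t) and shows that log(A - P) is then exactly linear in t, so an ordinary least-squares fit recovers the rate c; under a power law, log(A - P) would instead be linear in log t.

```python
import math

# Illustrative sketch only: A, B, c below are assumed values,
# not estimates from the 1383-player tournament dataset.
# Negative exponential law:  P(t) = A - B * exp(-c * t)
# Power law (for contrast):  P(t) = A - B * t ** (-c)

def linfit(xs, ys):
    """Ordinary least-squares fit y = a + b*x in pure Python."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

A, B, c = 2400.0, 900.0, 0.35            # assumed asymptote, gain, rate
ts = list(range(1, 11))                  # ten years of tournament play
ratings = [A - B * math.exp(-c * t) for t in ts]

# Under the exponential law, log(A - P) is linear in t ...
_, slope_t = linfit(ts, [math.log(A - p) for p in ratings])

# ... whereas the power law predicts linearity in log t instead.
_, slope_logt = linfit([math.log(t) for t in ts],
                       [math.log(A - p) for p in ratings])

print(round(slope_t, 3))   # -0.35: the exponential rate c is recovered
```

    In practice the two laws are distinguished by which transformation linearizes the observed gains; the systematic deviations reported above correspond to players whose trajectories are linearized by neither.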

  10. Small-scale test program to develop a more efficient swivel nozzle thrust deflector for V/STOL lift/cruise engines

    NASA Technical Reports Server (NTRS)

    Schlundt, D. W.

    1976-01-01

    The installed-performance degradation of a swivel nozzle thrust deflector system at increased vectoring angles, observed during a large-scale test program, was investigated and improved. Small-scale models were used to generate performance data for analyzing selected swivel nozzle configurations. A single-swivel nozzle design model with five different nozzle configurations, and a twin-swivel nozzle design model, scaled to 0.15 of the size of the large-scale test hardware, were statically tested at low exhaust pressure ratios of 1.4, 1.3, 1.2, and 1.1 and vectored at four nozzle positions from 0 deg (cruise) through 90 deg (vertical, used for the VTOL mode).

  11. Performance Demonstration of Mcmb-LiNiCoO2 Cells Containing Electrolytes Designed for Wide Operating Temperature Range

    NASA Technical Reports Server (NTRS)

    Smart, M. C.; Ratnakumar, B. V.; Whicanack, L. D.; Smith, K. A.; Santee, S.; Puglia, F. J.; Gitzendanner, R.

    2009-01-01

    With the intent of improving the performance of Li-ion cells over a wide operating temperature range, we have investigated the use of co-solvents to improve the properties of electrolyte formulations. In the current study, we have focused upon evaluating promising electrolytes which have been incorporated into large capacity (7 Ah) prototype Li-ion cells, fabricated by Yardney Technical Products, Inc. The electrolytes selected for performance evaluation include a number of esters as co-solvents, including methyl propionate (MP), ethyl propionate (EP), ethyl butyrate (EB), propyl butyrate (PB), and 2,2,2-trifluoroethyl butyrate (TFEB). The performance of the prototype cells containing the ester-based electrolytes was compared with an extensive database generated on cells containing previously developed all-carbonate-based electrolytes. A number of performance tests were performed, including determining (i) the discharge rate capacity over a wide range of temperatures, (ii) the charge characteristics, (iii) the cycle life characteristics under various conditions, and (iv) the impedance characteristics.

  12. Ergonomics Climate Assessment: A measure of operational performance and employee well-being.

    PubMed

    Hoffmeister, Krista; Gibbons, Alyssa; Schwatka, Natalie; Rosecrance, John

    2015-09-01

    Ergonomics interventions have the potential to improve operational performance and employee well-being. We introduce a framework for ergonomics climate, the extent to which an organization emphasizes and supports the design and modification of work to maximize both performance and well-being outcomes. We assessed ergonomics climate at a large manufacturing facility twice during a two-year period. When the organization used ergonomics to promote performance and well-being equally, and at a high level, employees reported less work-related pain. A larger discrepancy between measures of operational performance and employee well-being was associated with increased reports of work-related pain. The direction of this discrepancy was not significantly related to work-related pain; that is, it did not matter which facet was valued more. The Ergonomics Climate Assessment can provide companies with a baseline assessment of the overall value placed on ergonomics and help prioritize areas for improving operational performance and employee well-being. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  13. Advanced Nanostructured Anode Materials for Sodium-Ion Batteries.

    PubMed

    Wang, Qidi; Zhao, Chenglong; Lu, Yaxiang; Li, Yunming; Zheng, Yuheng; Qi, Yuruo; Rong, Xiaohui; Jiang, Liwei; Qi, Xinguo; Shao, Yuanjun; Pan, Du; Li, Baohua; Hu, Yong-Sheng; Chen, Liquan

    2017-11-01

    Sodium-ion batteries (NIBs), with the advantages of low cost and relatively high safety, have attracted widespread attention all over the world, making them a promising candidate for large-scale energy storage systems. However, their inherently lower energy density compared with lithium-ion batteries is an issue that should be further investigated and optimized. For grid-level energy storage applications, designing and discovering appropriate anode materials for NIBs is of great concern. Although many improvements and innovations have been achieved, several challenges still limit large-scale application, including low energy/power densities, moderate cycle performance, and low initial Coulombic efficiency. Advanced nanostructuring strategies for anode materials can significantly improve ion- or electron-transport kinetics, enhancing the electrochemical properties of battery systems. Herein, this Review provides a comprehensive summary of the progress of nanostructured anode materials for NIBs, discussing representative examples and the corresponding storage mechanisms. Potential directions for obtaining high-performance anode materials are also proposed, providing references for the further development of advanced anode materials for NIBs. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Pu Anion Exchange Process Intensification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor-Pashow, Kathryn M. L.

    This research is focused on improving the efficiency of the anion exchange process for purifying plutonium. While initially focused on plutonium, the technology could also be applied to other ion-exchange processes. Work in FY17 focused on the improvement and optimization of porous foam columns that were initially developed in FY16. These foam columns were surface functionalized with poly(4-vinylpyridine) (PVP) to provide the Pu specific anion-exchange sites. Two different polymerization methods were explored for maximizing the surface functionalization with the PVP. The open-celled polymeric foams have large open pores and large surface areas available for sorption. The fluid passes through the large open pores of this material, allowing convection to be the dominant mechanism by which mass transport takes place. These materials generally have very low densities, open-celled structures with high cell interconnectivity, small cell sizes, uniform cell size distributions, and high structural integrity. These porous foam columns provide advantages over the typical porous resin beads by eliminating the slow diffusion through resin beads, making the anion-exchange sites easily accessible on the foam surfaces. The best performing samples exceeded the Pu capacity of the commercially available resin, and also offered the advantage of sharper elution profiles, resulting in a more concentrated product, with less loss of material to the dilute heads and tails cuts. An alternate approach to improving the efficiency of this process was also explored through the development of a microchannel array system for performing the anion exchange.

  15. Impact of Dietary Antioxidants on Sport Performance: A Review.

    PubMed

    Braakhuis, Andrea J; Hopkins, Will G

    2015-07-01

    Many athletes supplement with antioxidants in the belief this will reduce muscle damage, immune dysfunction and fatigue, and will thus improve performance, while some evidence suggests it impairs training adaptations. Here we review the effect of a range of dietary antioxidants and their effects on sport performance, including vitamin E, quercetin, resveratrol, beetroot juice, other food-derived polyphenols, spirulina and N-acetylcysteine (NAC). Older studies suggest vitamin E improves performance at altitude, with possible harmful effects on sea-level performance. Acute intake of vitamin E is worthy of further consideration, if plasma levels can be elevated sufficiently. Quercetin has a small beneficial effect for exercise of longer duration (>100 min), but it is unclear whether this benefits athletes. Resveratrol benefits trained rodents; more research is needed in athletes. Meta-analysis of beetroot juice studies has revealed that the nitrate component of beetroot juice had a substantial but unclear effect on performance when averaged across athletes, non-athletes and modes of exercise (single dose 1.4 ± 2.0%, double dose 0.5 ± 1.9%). The effect of addition of polyphenols and other components to beetroot juice was trivial but unclear (single dose 0.4 ± 3.2%, double dose -0.5 ± 3.3%). Other food-derived polyphenols indicate a range of performance outcomes from a large improvement to moderate impairment. Limited evidence suggests spirulina enhances endurance performance. Intravenous NAC improved endurance cycling performance and reduced muscle fatigue. On the basis of vitamin E and NAC studies, acute intake of antioxidants is likely to be beneficial. However, chronic intakes of most antioxidants have a harmful effect on performance.

  16. Optimizing CyberShake Seismic Hazard Workflows for Large HPC Resources

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.

    2014-12-01

    The CyberShake computational platform is a well-integrated collection of scientific software and middleware that calculates 3D simulation-based probabilistic seismic hazard curves and hazard maps for the Los Angeles region. Currently each CyberShake model comprises about 235 million synthetic seismograms from about 415,000 rupture variations computed at 286 sites. CyberShake integrates large-scale parallel and high-throughput serial seismological research codes into a processing framework in which early stages produce files used as inputs by later stages. Scientific workflow tools are used to manage the jobs, data, and metadata. The Southern California Earthquake Center (SCEC) developed the CyberShake platform using USC High Performance Computing and Communications systems and open-science NSF resources. CyberShake calculations were migrated to the NSF Track 1 system NCSA Blue Waters when it became operational in 2013, via an interdisciplinary team approach including domain scientists, computer scientists, and middleware developers. Due to the excellent performance of Blue Waters and CyberShake software optimizations, we reduced the makespan (a measure of wallclock time-to-solution) of a CyberShake study from 1467 to 342 hours. We will describe the technical enhancements behind this improvement, including judicious introduction of new GPU software, improved scientific software components, increased workflow-based automation, and Blue Waters-specific workflow optimizations. Our CyberShake performance improvements highlight the benefits of scientific workflow tools. The CyberShake workflow software stack includes the Pegasus Workflow Management System (Pegasus-WMS, which includes Condor DAGMan), HTCondor, and Globus GRAM, with Pegasus-mpi-cluster managing the high-throughput tasks on the HPC resources. The workflow tools handle data management, automatically transferring about 13 TB back to SCEC storage. We will present performance metrics from the most recent CyberShake study, executed on Blue Waters. We will compare the performance of CPU and GPU versions of our large-scale parallel wave propagation code, AWP-ODC-SGT. Finally, we will discuss how these enhancements have enabled SCEC to move forward with plans to increase the CyberShake simulation frequency to 1.0 Hz.

  17. An Improved TA-SVM Method Without Matrix Inversion and Its Fast Implementation for Nonstationary Datasets.

    PubMed

    Shi, Yingzhong; Chung, Fu-Lai; Wang, Shitong

    2015-09-01

    Recently, a time-adaptive support vector machine (TA-SVM) was proposed for handling nonstationary datasets. Attractive performance has been reported, and the classifier is distinctive in simultaneously solving several SVM subclassifiers locally and globally using an elegant SVM formulation in an alternative kernel space; however, the coupling of subclassifiers requires matrix inversion, which imposes a high computational burden in large nonstationary dataset applications. To overcome this shortcoming, an improved TA-SVM (ITA-SVM) is proposed using a common vector shared by all the SVM subclassifiers involved. ITA-SVM not only keeps an SVM formulation, but also avoids the computation of matrix inversion. Thus, we can realize its fast version, the improved time-adaptive core vector machine (ITA-CVM), for large nonstationary datasets by using the CVM technique. ITA-CVM has the merit of asymptotically linear time complexity for large nonstationary datasets and inherits the advantages of TA-SVM. The effectiveness of the proposed classifiers ITA-SVM and ITA-CVM is experimentally confirmed.

  18. Rois Langner | NREL

    Science.gov Websites

    Rois Langner has been a research engineer in the Commercial Buildings Research Group at NREL since 2010. Her research efforts have focused on optimizing building design and performance for military and large commercial buildings. She has also worked on continual energy improvement and, more recently, is working to support the small commercial building sector.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aldrich, R.; Williamson, J.

    While single-family, detached homes account for 63% of households (EIA 2009), multi-family homes account for a very large portion of the remaining housing stock, and this fraction is growing. Through recent research efforts, CARB has been evaluating strategies and technologies that can make dramatic improvements in energy performance in multi-family buildings.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    While single-family, detached homes account for 63% of households (EIA 2009), multi-family homes account for a very large portion of the remaining housing stock, and this fraction is growing. Through recent research efforts, CARB has been evaluating strategies and technologies that can make dramatic improvements in energy performance in multi-family buildings.

  1. From People to Profits.

    ERIC Educational Resources Information Center

    Barber, L.; Hayday, S.; Bevan, S.

    An empirical test of the service-profit chain in a large United Kingdom retail business explored how employee attitudes and behavior can improve customer retention and, consequently, company sales performance. Data were collected from 65,000 employees and 25,000 customers from almost 100 stores. The business collected customer satisfaction for…

  2. AIBS Education Review, Vol. 2, No. 4.

    ERIC Educational Resources Information Center

    Creager, Joan G., Ed.

    This issue contains articles on experiences gained in the construction of terminal performance objectives for introductory biology courses, the impact of audiotutorial instruction on faculty load and departmental operating levels, an experiment designed to improve the teaching of biology in large enrollment introductory courses, a minicourse on…

  3. Improving quantitative structure-activity relationship models using Artificial Neural Networks trained with dropout.

    PubMed

    Mendenhall, Jeffrey; Meiler, Jens

    2016-02-01

    Dropout is an Artificial Neural Network (ANN) training technique that has been shown to improve ANN performance across canonical machine learning (ML) datasets. Quantitative Structure Activity Relationship (QSAR) datasets used to relate chemical structure to biological activity in Ligand-Based Computer-Aided Drug Discovery pose unique challenges for ML techniques, such as heavily biased dataset composition and a large number of descriptors relative to the number of actives. To test the hypothesis that dropout also improves QSAR ANNs, we conduct a benchmark on nine large QSAR datasets. Use of dropout improved both the enrichment false positive rate and the log-scaled area under the receiver-operating characteristic curve (logAUC) by 22-46% over conventional ANN implementations. Optimal dropout rates are found to be a function of the signal-to-noise ratio of the descriptor set, and relatively independent of the dataset. Dropout ANNs with 2D and 3D autocorrelation descriptors outperform conventional ANNs as well as optimized fingerprint similarity search methods.
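
    As a minimal sketch of the dropout technique evaluated above, the following pure-Python example implements inverted dropout for a single layer (the 25% rate and layer size are arbitrary choices for illustration, not the benchmarked settings): each unit is zeroed with probability `rate` during training, and survivors are rescaled by 1/(1 - rate) so that expected activations match at training and prediction time.

```python
import random

def dropout(activations, rate, training, rng):
    """Inverted dropout: during training, zero each unit with probability
    `rate` and scale survivors by 1/(1 - rate); at prediction time the
    activations pass through unchanged."""
    if not training or rate == 0.0:
        return list(activations)
    keep = 1.0 - rate
    return [a / keep if rng.random() < keep else 0.0 for a in activations]

rng = random.Random(0)
layer = [1.0] * 10_000                       # a layer of unit activations
train_out = dropout(layer, rate=0.25, training=True, rng=rng)
test_out = dropout(layer, rate=0.25, training=False, rng=rng)

dropped = sum(1 for a in train_out if a == 0.0)
mean = sum(train_out) / len(train_out)       # stays near 1.0 by construction
print(dropped > 0, test_out == layer)        # True True
```

    The benchmark's finding that the optimal rate tracks the descriptor set's signal-to-noise ratio suggests treating `rate` as a tuned hyperparameter rather than a fixed constant.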

  4. Improving Quantitative Structure-Activity Relationship Models using Artificial Neural Networks Trained with Dropout

    PubMed Central

    Mendenhall, Jeffrey; Meiler, Jens

    2016-01-01

    Dropout is an Artificial Neural Network (ANN) training technique that has been shown to improve ANN performance across canonical machine learning (ML) datasets. Quantitative Structure Activity Relationship (QSAR) datasets used to relate chemical structure to biological activity in Ligand-Based Computer-Aided Drug Discovery (LB-CADD) pose unique challenges for ML techniques, such as heavily biased dataset composition and a large number of descriptors relative to the number of actives. To test the hypothesis that dropout also improves QSAR ANNs, we conduct a benchmark on nine large QSAR datasets. Use of dropout improved both the enrichment false positive rate (FPR) and the log-scaled area under the receiver-operating characteristic curve (logAUC) by 22–46% over conventional ANN implementations. Optimal dropout rates are found to be a function of the signal-to-noise ratio of the descriptor set, and relatively independent of the dataset. Dropout ANNs with 2D and 3D autocorrelation descriptors outperform conventional ANNs as well as optimized fingerprint similarity search methods. PMID:26830599

  5. The importance of calorimetry for highly-boosted jet substructure

    DOE PAGES

    Coleman, Evan; Freytsis, Marat; Hinzmann, Andreas; ...

    2018-01-09

    Here, jet substructure techniques are playing an essential role in exploring the TeV scale at the Large Hadron Collider (LHC), since they facilitate the efficient reconstruction and identification of highly-boosted objects. Both for the LHC and for future colliders, there is a growing interest in using jet substructure methods based only on charged-particle information. The reason is that silicon-based tracking detectors offer excellent granularity and precise vertexing, which can improve the angular resolution on highly-collimated jets and mitigate the impact of pileup. In this paper, we assess how much jet substructure performance degrades by using track-only information, and we demonstrate physics contexts in which calorimetry is most beneficial. Specifically, we consider five different hadronic final states - W bosons, Z bosons, top quarks, light quarks, gluons - and test the pairwise discrimination power with a multi-variate combination of substructure observables. In the idealized case of perfect reconstruction, we quantify the loss in discrimination performance when using just charged particles compared to using all detected particles. We also consider the intermediate case of using charged particles plus photons, which provides valuable information about neutral pions. In the more realistic case of a segmented calorimeter, we assess the potential performance gains from improving calorimeter granularity and resolution, comparing a CMS-like detector to more ambitious future detector concepts. Broadly speaking, we find large performance gains from neutral-particle information and from improved calorimetry in cases where jet mass resolution drives the discrimination power, whereas the gains are more modest if an absolute mass scale calibration is not required.

  8. Queue and stack sorting algorithm optimization and performance analysis

    NASA Astrophysics Data System (ADS)

    Qian, Mingzhu; Wang, Xiaobao

    2018-04-01

    Sorting is one of the basic operations in software development, and data structures courses cover many sorting algorithms in depth. The performance of the sorting algorithm is directly related to the efficiency of the software. Much research has been devoted to optimizing queue-based sorting to make the algorithms as efficient as possible; here the authors further study sorting algorithms that combine queues with stacks. The algorithm mainly relies on alternating operations over the storage properties of queues and stacks, thereby avoiding the large number of exchange or move operations required by traditional sorts. Building on this existing basis, the paper continues research, improvement, and optimization, focusing on optimizing time complexity; the experimental results show that the improvements are effective. The time complexity, space complexity, and stability of the algorithms are also studied. The improved and optimized algorithm is more practical.
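    The abstract does not spell out the specific queue-and-stack algorithm, but the exchange-free idea it describes - sorting purely by moving elements between containers rather than swapping them in place - is exemplified by the classic least-significant-digit radix sort, sketched here with FIFO queues as an illustration (not the authors' algorithm):

```python
from collections import deque

def radix_sort_queues(values, base=10):
    """LSD radix sort for non-negative integers.

    Elements circulate between a main queue and `base` bucket queues;
    sorting is achieved purely by enqueue/dequeue moves, with none of the
    pairwise exchanges used by comparison sorts.
    """
    main = deque(values)
    if not main:
        return []
    digits = len(str(max(main)))
    for d in range(digits):
        buckets = [deque() for _ in range(base)]
        while main:
            v = main.popleft()
            buckets[(v // base**d) % base].append(v)   # stable distribute
        for b in buckets:                              # collect in order
            main.extend(b)
    return list(main)

print(radix_sort_queues([170, 45, 75, 90, 802, 24, 2, 66]))
# → [2, 24, 45, 66, 75, 90, 170, 802]
```

    Each pass touches every element exactly twice (one dequeue, one enqueue), so the cost is O(d·n) moves for d digit positions.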

  9. [Financial incentives for quality improvement].

    PubMed

    Belicza, Eva; Evetovits, Tamás

    2010-05-01

    Policy makers and payers of health care services devote increasing attention to improving the quality of services by incentivising health care providers. These so-called pay-for-performance (P4P) programmes have so far been introduced in only a few countries, and evidence on their effectiveness is still scarce; we therefore do not yet know which instruments of these programmes are most effective and efficient in improving quality. The P4P systems implemented so far in primary care and in integrated delivery systems use indicators both for the measurement of performance and as the basis for rewards. These indicators are mostly process indicators, but there are some outcome indicators as well. The desired quality-improvement effects are most likely to be achieved with programmes that provide sizeable financial rewards and also cover the extra cost of quality-improvement efforts. Administration of the programme has to be fully transparent and clear to all involved. It has to be based on scientific evidence and supported with sufficient dedicated funding. Conducting pilot studies is a precondition for large-scale implementation.

  10. Methods and Devices for Modifying Active Paths in a K-Delta-1-Sigma Modulator

    NASA Technical Reports Server (NTRS)

    Ardalan, Sasan (Inventor)

    2017-01-01

    The invention relates to improved K-Delta-1-Sigma Modulators (KD1Ss) that achieve multi-GHz sampling rates in 90 nm and 45 nm CMOS processes and provide the capability to balance performance with power in many applications. The improved KD1Ss activate all paths when high performance (e.g., high bandwidth) is needed, and reduce the effective bandwidth by shutting down multiple paths when low performance is required. The improved KD1Ss can adjust the baseband filtering for lower bandwidth, providing large savings in power consumption while maintaining the communication link, which is a great advantage in space communications. The improved KD1S provides a receiver that adjusts to accommodate a higher rate when a packet is received at low bandwidth: at the initial lower rate, power is saved by turning off paths in the KD1S analog-to-digital converter, and when a higher rate is required, multiple paths are enabled in the KD1S to accommodate the higher bandwidths.

  11. Reading performance with large fonts on high-resolution displays

    NASA Astrophysics Data System (ADS)

    Powers, Maureen K.; Larimer, James O.; Gille, Jennifer; Liu, Hsien-Chang

    2004-06-01

    Reading is a fundamental task and skill in many environments, including business, education, and the home. Today, reading often occurs on electronic displays in addition to traditional hard-copy media such as books and magazines, presenting issues of legibility and other factors that can affect human performance [1]. In fact, the transition to soft-copy media for text images is often met with worker complaints about their vision and comfort while reading [2-6]. Careful comparative evaluations of reading performance across hard- and soft-copy device types are rare, even though they are clearly important given the rapid and substantial improvements in soft-copy devices available in the marketplace over the last 5 years. To begin to fill this evaluation gap, we compared reading performance on three different soft-copy devices and traditional paper. This study does not investigate comfort factors such as display location, seating comfort, and more general issues of lighting; rather, we focus on a narrow examination of reading-performance differences across display types when font sizes are large.

  12. A cross docking pipeline for improving pose prediction and virtual screening performance

    NASA Astrophysics Data System (ADS)

    Kumar, Ashutosh; Zhang, Kam Y. J.

    2018-01-01

    Pose prediction and virtual screening performance of a molecular docking method depend on the choice of protein structures used for docking. Multiple structures for a target protein are often used to take into account receptor flexibility and the problems associated with any single receptor structure. However, the use of multiple receptor structures is computationally expensive when docking a large library of small molecules. Here, we propose a new cross-docking pipeline suitable for docking a large library of molecules while taking advantage of multiple target protein structures. Our method involves the selection of a suitable receptor for each ligand in a screening library utilizing ligand 3D shape similarity with crystallographic ligands. We have prospectively evaluated our method in D3R Grand Challenge 2 and demonstrated that our cross-docking pipeline can achieve similar or better performance than using either single or multiple receptor structures. Moreover, our method displayed not only decent pose prediction performance but also better virtual screening performance than several other methods.
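    The receptor-selection step described above reduces, in sketch form, to an argmax over a ligand-by-receptor similarity table. The scores below are invented for illustration and do not represent the authors' shape-similarity method:

```python
import numpy as np

# Hypothetical shape-similarity scores between each screening ligand and the
# crystallographic ligand bound in each candidate receptor structure
# (rows: ligands, columns: receptor structures; higher = more similar).
similarity = np.array([
    [0.41, 0.78, 0.30],   # ligand 0
    [0.65, 0.12, 0.59],   # ligand 1
    [0.22, 0.35, 0.90],   # ligand 2
])

def assign_receptors(sim):
    """For each ligand, pick the receptor whose co-crystallized ligand is
    most shape-similar, so each molecule is docked once instead of against
    every available structure."""
    return np.argmax(sim, axis=1)

print(assign_receptors(similarity))   # → [1 0 2]
```

    The payoff is linear rather than multiplicative cost: each of N ligands is docked against one structure instead of all M, while still exploiting the full receptor ensemble.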

  13. Fun, collaboration and formative assessment: skinquizition, a class wide gaming competition in a medical school with a large class.

    PubMed

    Schlegel, Elisabeth F M; Selfridge, Nancy J

    2014-05-01

    Formative assessments are tools for assessing content retention, providing valuable feedback to students and teachers. In medical education, information-technology-supported games can accommodate large classes divided into student teams while fostering active engagement. The aim was to establish an innovative, stimulating approach to formative assessment for large classes that furthers collaborative skills and promotes learning and student engagement linked to improved academic performance. Using audience response technology, a fast-paced, competitive, interactive quiz game on dermatology was developed. This stimulating setting, provided on the last day of class, prepares students for the high-stakes exams needed to continue their medical education while training collaborative skills, as supported by survey outcomes and average class scores. Educational game competitions provide formative assessment and feedback for students and faculty alike, enhancing both learning and teaching. In this study, we show an innovative approach to accommodating a large class divided into competing teams, furthering collaborative skills as reflected by academic performance.

  14. A turbocharger for the 1990s

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, E.R.

    1991-07-01

    This paper reports that the large-bore engines of the 1970s and 1980s saw tremendous technological improvement. The key buzzwords were: lower the emissions, and improve the fuel consumption. As the 1990s approach, dramatic improvements in specific output are expected, along with continued research to further improve emissions and fuel consumption. One of the keys to success will be the degree to which the industry responds with improvements in turbocharger performance. In 1985, Cooper Bessemer Rotating Products Division began a program to develop the turbocharger of the 1990s. This paper describes the development of the CB turbocharger series from concept to early production.

  15. weather@home 2: validation of an improved global-regional climate modelling system

    NASA Astrophysics Data System (ADS)

    Guillod, Benoit P.; Jones, Richard G.; Bowery, Andy; Haustein, Karsten; Massey, Neil R.; Mitchell, Daniel M.; Otto, Friederike E. L.; Sparrow, Sarah N.; Uhe, Peter; Wallom, David C. H.; Wilson, Simon; Allen, Myles R.

    2017-05-01

    Extreme weather events can have large impacts on society and, in many regions, are expected to change in frequency and intensity with climate change. Owing to the relatively short observational record, climate models are useful tools as they allow for generation of a larger sample of extreme events, to attribute recent events to anthropogenic climate change, and to project changes in such events into the future. The modelling system known as weather@home, consisting of a global climate model (GCM) with a nested regional climate model (RCM) and driven by sea surface temperatures, allows one to generate a very large ensemble with the help of volunteer distributed computing, and is a key tool for understanding many aspects of extreme events. Here, a new version of the weather@home system (weather@home 2) with a higher-resolution RCM over Europe is documented and a broad validation of the climate is performed. The new model includes a more recent land-surface scheme in both GCM and RCM, where subgrid-scale land-surface heterogeneity is newly represented using tiles, and an increase in RCM resolution from 50 to 25 km. The GCM performs similarly to the previous version, with some improvements in the representation of mean climate. The European RCM temperature biases are overall reduced, in particular the warm bias over eastern Europe, but large biases remain. Precipitation is improved over the Alps in summer, with mixed changes in other regions and seasons. The model is shown to represent the main classes of regional extreme events reasonably well and shows a good sensitivity to its drivers. In particular, given the improvements in this version of the weather@home system, it is likely that more reliable statements can be made with regard to impacts, especially at more localized scales.

  16. Optimization of large matrix calculations for execution on the Cray X-MP vector supercomputer

    NASA Technical Reports Server (NTRS)

    Hornfeck, William A.

    1988-01-01

    A considerable volume of large computational codes was developed for NASA over the past twenty-five years. These codes represent algorithms developed for machines of an earlier generation. With the emergence of the vector supercomputer as a viable, commercially available machine, an opportunity exists to evaluate optimization strategies to improve the efficiency of existing software, primarily because of architectural differences between the latest generation of large-scale machines and the earlier, mostly uniprocessor, machines. A software package being used by NASA to perform computations on large matrices is described, along with a strategy for its conversion to the Cray X-MP vector supercomputer.
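    As a minimal illustration of the kind of restructuring such a conversion involves (a generic sketch, not the NASA package itself), here is the same matrix-vector product written first as a scalar loop nest in the style of older uniprocessor codes and then as a single vector operation:

```python
import numpy as np

def matvec_scalar(a, x):
    """Row-by-row scalar inner products, the style of older serial codes."""
    n, m = a.shape
    y = np.zeros(n)
    for i in range(n):
        s = 0.0
        for j in range(m):
            s += a[i, j] * x[j]     # one multiply-add per trip
        y[i] = s
    return y

def matvec_vectorized(a, x):
    """The same computation expressed as one whole-array operation, the
    kind of restructuring a vector machine such as the X-MP rewards."""
    return a @ x

rng = np.random.default_rng(1)
a = rng.normal(size=(50, 40))
x = rng.normal(size=40)
assert np.allclose(matvec_scalar(a, x), matvec_vectorized(a, x))
```

    The numerical result is identical; only the expression of the inner loop changes, which is exactly what lets vector hardware (or, here, an optimized library) process many elements per instruction.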

  17. Just-in-Time Training: A Novel Approach to Quality Improvement Education.

    PubMed

    Knutson, Allison; Park, Nesha D; Smith, Denise; Tracy, Kelly; Reed, Danielle J W; Olsen, Steven L

    2015-01-01

    Just-in-time training (JITT) is accepted in medical education as a training method for newer concepts or seldom-performed procedures. Providing JITT to a large nursing staff may be an effective method to teach quality improvement (QI) initiatives. We sought to determine if JITT could increase knowledge of a specific nutrition QI initiative. Members of the nutrition QI team interviewed staff using the Frontline Contextual Inquiry to assess knowledge regarding the specific QI project. The inquiry was completed pre- and post-JITT. A JITT educational cart was created, which allowed trainers to bring the educational information to the bedside for a short, small group educational session. The results demonstrated a marked improvement in the knowledge of the frontline staff regarding our Vermont Oxford Network involvement and the specifics of the nutrition QI project. Just-in-time training can be a valuable and effective method to disseminate QI principles to a large audience of staff members.

  18. Gains following perceptual learning are closely linked to the initial visual acuity.

    PubMed

    Yehezkel, Oren; Sterkin, Anna; Lev, Maria; Levi, Dennis M; Polat, Uri

    2016-04-28

    The goal of the present study was to evaluate the dependence of perceptual learning gains on initial visual acuity (VA) in a large sample of subjects with a wide range of VAs. A large sample of normally sighted and presbyopic subjects (N = 119; aged 40 to 63) with a wide range of uncorrected near visual acuities (-0.12 to 0.8 LogMAR) underwent perceptual learning. Training consisted of detecting briefly presented Gabor stimuli under spatial and temporal masking conditions. Consistent with previous findings, perceptual learning induced a significant improvement in near VA and reading speed under conditions of limited exposure duration. Our results show that the improvements in VA and reading speed observed following perceptual learning are closely linked to the initial VA, with only a minor fraction of the observed improvement attributable to the additional sessions performed by those with worse VA.

  19. Organic transistors manufactured using inkjet technology with subfemtoliter accuracy

    PubMed Central

    Sekitani, Tsuyoshi; Noguchi, Yoshiaki; Zschieschang, Ute; Klauk, Hagen; Someya, Takao

    2008-01-01

    A major obstacle to the development of organic transistors for large-area sensor, display, and circuit applications is the fundamental compromise between manufacturing efficiency, transistor performance, and power consumption. In the past, improving the manufacturing efficiency through the use of printing techniques has inevitably resulted in significantly lower performance and increased power consumption, while attempts to improve performance or reduce power have led to higher process temperatures and increased manufacturing cost. Here, we lift this fundamental limitation by demonstrating subfemtoliter inkjet printing to define metal contacts with single-micrometer resolution on the surface of high-mobility organic semiconductors to create high-performance p-channel and n-channel transistors and low-power complementary circuits. The transistors employ an ultrathin low-temperature gate dielectric based on a self-assembled monolayer that allows transistors and circuits on rigid and flexible substrates to operate with very low voltages. PMID:18362348

  20. High-performance a-IGZO thin-film transistor with conductive indium-tin-oxide buried layer

    NASA Astrophysics Data System (ADS)

    Ahn, Min-Ju; Cho, Won-Ju

    2017-10-01

    In this study, we fabricated top-contact top-gate (TCTG) structure amorphous indium-gallium-zinc oxide (a-IGZO) thin-film transistors (TFTs) with a thin buried conductive indium-tin oxide (ITO) layer. The electrical performance of the a-IGZO TFTs was improved by inserting an ITO buried layer under the IGZO channel. The effect of the buried layer's length on the electrical characteristics of the a-IGZO TFTs was also investigated. The electrical performance of the transistors improved as the buried layer's length increased: a large on/off current ratio of 1.1 × 10⁷, a high field-effect mobility of 35.6 cm²/V·s, a small subthreshold slope of 116.1 mV/dec, and a low interface trap density of 4.2 × 10¹¹ cm⁻² eV⁻¹ were obtained. The buried-layer a-IGZO TFTs exhibited enhanced transistor performance and excellent stability against gate bias stress.
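    One of the reported figures of merit, the subthreshold slope, is conventionally extracted as SS = dV_G/d(log10 I_D). A sketch of that extraction on a synthetic transfer curve follows; the device parameters are hypothetical, chosen only to mimic the ~116 mV/dec value quoted above:

```python
import numpy as np

def subthreshold_slope_mv_per_dec(vg, id_):
    """Subthreshold slope SS = dV_G / d(log10 I_D) in mV/decade,
    estimated by a least-squares line through log10(I_D) vs V_G."""
    slope, _ = np.polyfit(vg, np.log10(id_), 1)   # decades per volt
    return 1000.0 / slope                          # mV per decade

# Synthetic exponential subthreshold curve with a known 116 mV/dec slope
vg = np.linspace(0.0, 0.5, 20)                     # gate voltage sweep (V)
id_ = 1e-12 * 10.0**(vg / 0.116)                   # drain current (A)
print(round(subthreshold_slope_mv_per_dec(vg, id_), 1))   # → 116.0
```

    In practice the fit is restricted to the steepest subthreshold region of the measured transfer curve rather than the whole sweep.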

  1. Otoacoustic emissions in the general adult population of Nord-Trøndelag, Norway: III. Relationships with pure-tone hearing thresholds.

    PubMed

    Engdahl, Bo; Tambs, Kristian; Borchgrevink, Hans M; Hoffman, Howard J

    2005-01-01

    This study aims to describe the association between otoacoustic emissions (OAEs) and pure-tone hearing thresholds (PTTs) in an unscreened adult population (N = 6415), to determine the efficiency by which transient-evoked OAEs (TEOAEs) and distortion-product OAEs (DPOAEs) can identify ears with elevated PTTs, and to investigate whether a combination of DPOAE and TEOAE responses improves this performance. Associations were examined by linear regression analysis and ANOVA. Test performance was assessed by receiver operating characteristic (ROC) curves. The relation between OAEs and PTTs appeared curvilinear with a moderate degree of non-linearity. Combining DPOAEs and TEOAEs improved performance. Test performance depended on the cut-off thresholds defining elevated PTTs, with optimal values between 25 and 45 dB HL depending on frequency and type of OAE measure. The unique constitution of the present large sample, which reflects the general adult population, makes these results applicable to population-based studies and screening programs.
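    The ROC-based test performance assessed here reduces, for a single cut-off-free summary, to the rank-sum identity for the area under the curve. The detector scores below are invented for illustration:

```python
def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity:
    the probability that a randomly drawn positive-class score exceeds a
    randomly drawn negative-class score, counting ties as half."""
    wins = ties = 0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(scores_pos) * len(scores_neg))

# Hypothetical OAE-based detector scores for ears with elevated pure-tone
# thresholds (positive class) versus normal-hearing ears (negative class)
auc = roc_auc([0.9, 0.8, 0.7, 0.6], [0.5, 0.65, 0.3, 0.2])
print(round(auc, 3))   # → 0.938
```

    An AUC of 0.5 corresponds to chance performance and 1.0 to perfect separation, which is why it is a convenient scalar for comparing TEOAE, DPOAE, and combined measures.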

  2. LAMMPS strong scaling performance optimization on Blue Gene/Q

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coffman, Paul; Jiang, Wei; Romero, Nichols A.

    2014-11-12

    LAMMPS "Large-scale Atomic/Molecular Massively Parallel Simulator" is an open-source molecular dynamics package from Sandia National Laboratories. Significant performance improvements in strong-scaling and time-to-solution for this application on IBM's Blue Gene/Q have been achieved through computational optimizations of the OpenMP versions of the short-range Lennard-Jones term of the CHARMM force field and the long-range Coulombic interaction implemented with the PPPM (particle-particle-particle mesh) algorithm, enhanced by runtime parameter settings controlling thread utilization. Additionally, MPI communication performance improvements were made to the PPPM calculation by re-engineering the parallel 3D FFT to use MPICH collectives instead of point-to-point. Performance testing was done using an 8.4-million atom simulation scaling up to 16 racks on the Mira system at Argonne Leadership Computing Facility (ALCF). Speedups resulting from this effort were in some cases over 2x.

  3. Optimal reconstruction for closed-loop ground-layer adaptive optics with elongated spots.

    PubMed

    Béchet, Clémentine; Tallon, Michel; Tallon-Bosc, Isabelle; Thiébaut, Éric; Le Louarn, Miska; Clare, Richard M

    2010-11-01

    The design of the laser-guide-star-based adaptive optics (AO) systems for the Extremely Large Telescopes requires careful study of the issue of elongated spots produced on Shack-Hartmann wavefront sensors. The importance of a correct modeling of the nonuniformity and correlations of the noise induced by this elongation has already been demonstrated for wavefront reconstruction. We report here on the first (to our knowledge) end-to-end simulations of closed-loop ground-layer AO with laser guide stars with such an improved noise model. The results are compared with the level of performance predicted by a classical noise model for the reconstruction. The performance is studied in terms of ensquared energy and confirms that, thanks to the improved noise model, central or side launching of the lasers does not affect the performance with respect to the laser guide stars' flux. These two launching schemes also perform similarly whatever the atmospheric turbulence strength.

  4. Shooting performance is related to forearm temperature and hand tremor size.

    PubMed

    Lakie, M; Villagra, F; Bowman, I; Wilby, R

    1995-08-01

    The changes in postural tremor of the hand and the subsequent effect on shooting performance produced by moderate cooling and heating of the forearm were studied in six subjects. Cooling produced a large decrease in tremor size of the ipsilateral hand, whereas warming the limb produced an increase in tremor size. Cooling or warming the forearm did not change the peak frequency of tremor significantly, which was quite stable for each subject. The improvement in shooting performance after cooling the forearm, as measured by grouping pattern of the shots, reached statistical significance and warming caused a significant worsening. This measure of performance was shown to correlate (r = 0.776) inversely with tremor size. The causes and implications of these changes are discussed. It is suggested that local cooling may be useful for people who wish temporarily to reduce tremor in order to improve dexterity for shooting and for other purposes.

  5. Primary and Secondary Lithium Batteries Capable of Operating at Low Temperatures for Planetary Exploration

    NASA Technical Reports Server (NTRS)

    Smart, M. C.; Ratnakumar, B. V.; West, W. C.; Brandon, E. J.

    2011-01-01

    Objectives and Approach: (1) Develop advanced Li-ion electrolytes that enable cell operation over a wide temperature range (i.e., -60 to +60 °C), and improve the high-temperature stability and lifetime characteristics of wide-operating-temperature electrolytes. (2) Define the performance limitations at the low and high temperature extremes, as well as the life-limiting processes. (3) Demonstrate the performance of advanced electrolytes in large-capacity prototype cells.

  6. Mid-term NEAT review: analysing the improvements in hospital ED performance.

    PubMed

    Khanna, Sankalp; Boyle, Justin; Good, Norm; Lind, James

    2014-01-01

    Introduced with a promise to reduce overcrowding in the Emergency Department (ED) and the associated morbidity and mortality linked to bed-access difficulties, the National Emergency Access Target (NEAT) is now over halfway through transitional arrangements towards a target of 90% of patients who visit a hospital ED being admitted or discharged within 4 hours. Facilitation and reward funding has ensured that hospitals around the country are remodelling workflows to ensure compliance. Recent reports, however, show that the majority of hospitals are still far from being able to meet this target. We investigate the NEAT journey of 30 Queensland hospitals over the past two years and compare this performance with a previous study that investigated the 4-hour ED discharge performance of these hospitals at various times of day and under varying occupancy conditions. Our findings reveal that, while most hospitals made significant improvements to their 4-hour discharge performance in 2013, the underlying flow patterns and periods of poor NEAT compliance remain largely unchanged. The work identifies areas for targeted improvement to inform system redesign and workflow planning.

  7. Computer-Tailored Student Support in Introductory Physics.

    PubMed

    Huberth, Madeline; Chen, Patricia; Tritz, Jared; McKay, Timothy A

    2015-01-01

    Large introductory courses are at a disadvantage in providing personalized guidance and advice for students during the semester. We introduce E2Coach (an Expert Electronic Coaching system), which allows instructors to personalize their communication with thousands of students. We describe the E2Coach system, the nature of the personalized support it provides, and the features of the students who did (and did not) opt-in to using it during the first three terms of its use in four introductory physics courses at the University of Michigan. Defining a 'better-than-expected' measure of performance, we compare outcomes for students who used E2Coach to those who did not. We found that moderate and high E2Coach usage was associated with improved performance. This performance boost was prominent among high users, who improved by 0.18 letter grades on average when compared to nonusers with similar incoming GPAs. This improvement in performance was comparable across both genders. E2Coach represents one way to use technology to personalize education at scale, contributing to the move towards individualized learning that is becoming more attainable in the 21st century.
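    One simple way to operationalize a "better-than-expected" measure of the kind described above (a sketch, not necessarily the E2Coach definition) is the residual from a least-squares prediction of course grade from incoming GPA. The cohort numbers below are hypothetical:

```python
import numpy as np

def better_than_expected(gpa, grade):
    """Residual of each student's course grade relative to a least-squares
    linear prediction from incoming GPA; positive values mean performing
    better than students with a similar GPA typically do."""
    coeffs = np.polyfit(gpa, grade, 1)      # linear GPA -> grade model
    predicted = np.polyval(coeffs, gpa)
    return grade - predicted

# Hypothetical cohort: incoming GPA and course grade, both on a 4.0 scale
gpa   = np.array([2.8, 3.0, 3.2, 3.4, 3.6, 3.8])
grade = np.array([2.5, 2.9, 2.9, 3.3, 3.4, 3.8])
resid = better_than_expected(gpa, grade)
print(np.round(resid, 2))
```

    By construction the residuals average to zero across the cohort, so the measure compares each student to peers with similar preparation rather than to the class as a whole.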

  8. Electrochemical properties for high surface area and improved electrical conductivity of platinum-embedded porous carbon nanofibers

    NASA Astrophysics Data System (ADS)

    An, Geon-Hyoung; Ahn, Hyo-Jin; Hong, Woong-Ki

    2015-01-01

    Four different types of carbon nanofibers (CNFs) for electrical double-layer capacitors (EDLCs), porous and non-porous CNFs with and without Pt metal nanoparticles, are synthesized by an electrospinning method and their EDLC performance is characterized. In particular, the Pt-embedded porous CNFs (PCNFs) exhibit a high specific surface area of 670 m² g⁻¹, a large mesopore volume of 55.7%, and a low electrical resistance of 1.7 × 10³. The synergistic effects of the high specific surface area with a large mesopore volume and the superior electrical conductivity result in an excellent specific capacitance of 130.2 F g⁻¹, good high-rate performance, superior cycling durability, and a high energy density of 16.9-15.4 W h kg⁻¹.
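    The reported gravimetric energy density is conventionally related to capacitance and cell voltage by E = ½CV². The sketch below assumes a specific capacitance referred to a single mass basis and a hypothetical 1.0 V aqueous cell voltage, neither taken from the paper:

```python
def specific_energy_wh_per_kg(c_sp_f_per_g, v_cell):
    """Gravimetric energy density E = 1/2 * C * V^2, converted from J/g to
    W h/kg (1 W h = 3600 J; 1 J/g = 1000 J/kg).

    Whether `c_sp_f_per_g` refers to electrode mass or full-cell mass is an
    assumption the abstract does not settle.
    """
    joules_per_gram = 0.5 * c_sp_f_per_g * v_cell**2
    return joules_per_gram * 1000.0 / 3600.0

# Hypothetical numbers: 100 F/g at a 1.0 V cell voltage
print(round(specific_energy_wh_per_kg(100.0, 1.0), 2))   # → 13.89
```

    The quadratic dependence on voltage is why organic or ionic-liquid electrolytes, which tolerate higher cell voltages, can raise energy density far more than a comparable gain in capacitance.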

  9. A laser-sheet flow visualization technique for the large wind tunnels of the National Full-Scale Aerodynamics Complex

    NASA Technical Reports Server (NTRS)

    Reinath, M. S.; Ross, J. C.

    1990-01-01

    A flow visualization technique for the large wind tunnels of the National Full Scale Aerodynamics Complex (NFAC) is described. The technique uses a laser sheet generated by the NFAC Long Range Laser Velocimeter (LRLV) to illuminate a smoke-like tracer in the flow. The LRLV optical system is modified slightly, and a scanned mirror is added to generate the sheet. These modifications are described, in addition to the results of an initial performance test conducted in the 80- by 120-Foot Wind Tunnel. During this test, flow visualization was performed in the wake region behind a truck as part of a vehicle drag reduction study. The problems encountered during the test are discussed, in addition to the recommended improvements needed to enhance the performance of the technique for future applications.

  10. Conceptual Design and Performance Analysis for a Large Civil Compound Helicopter

    NASA Technical Reports Server (NTRS)

    Russell, Carl; Johnson, Wayne

    2012-01-01

    A conceptual design study of a large civil compound helicopter is presented. The objective is to determine how a compound helicopter performs when compared to both a conventional helicopter and a tiltrotor using a design mission that is shorter than optimal for a tiltrotor and longer than optimal for a helicopter. The designs are generated and analyzed using conceptual design software and are further evaluated with a comprehensive rotorcraft analysis code. Multiple metrics are used to determine the suitability of each design for the given mission. Plots of various trade studies and parameter sweeps as well as comprehensive analysis results are presented. The results suggest that the compound helicopter examined for this study would not be competitive with a tiltrotor or conventional helicopter, but multiple possibilities are identified for improving the performance of the compound helicopter in future research.

  11. Efficient parallelization of analytic bond-order potentials for large-scale atomistic simulations

    NASA Astrophysics Data System (ADS)

    Teijeiro, C.; Hammerschmidt, T.; Drautz, R.; Sutmann, G.

    2016-07-01

    Analytic bond-order potentials (BOPs) provide a way to compute atomistic properties with controllable accuracy. For large-scale computations of heterogeneous compounds at the atomistic level, both the computational efficiency and memory demand of BOP implementations have to be optimized. Since the evaluation of BOPs is a local operation within a finite environment, the parallelization concepts known from short-range interacting particle simulations can be applied to improve the performance of these simulations. In this work, several efficient parallelization methods for BOPs that use three-dimensional domain decomposition schemes are described. The schemes are implemented into the bond-order potential code BOPfox, and their performance is measured in a series of benchmarks. Systems of up to several millions of atoms are simulated on a high performance computing system, and parallel scaling is demonstrated for up to thousands of processors.
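    The domain-decomposition idea in this entry can be illustrated with a serial cell-binning step (a hypothetical sketch, not the BOPfox implementation): atoms are binned into a 3D grid of cells no smaller than the interaction cutoff, so each parallel domain only needs a one-cell halo from its neighbors.

```python
import numpy as np

def decompose(positions, box, cutoff):
    """Bin atoms into 3D cells no smaller than the interaction cutoff.

    Returns a dict mapping a cell index (ix, iy, iz) to a list of atom
    indices. With cells at least as large as the cutoff, every neighbor
    of an atom lies in its own cell or one of the 26 adjacent cells, so
    a domain only needs a one-cell halo from neighboring ranks.
    """
    ncells = np.maximum((box // cutoff).astype(int), 1)  # cells per axis
    cell_size = box / ncells
    cells = {}
    for i, pos in enumerate(positions):
        idx = tuple(int(c) for c in (pos // cell_size).astype(int) % ncells)
        cells.setdefault(idx, []).append(i)
    return cells

rng = np.random.default_rng(0)
box = np.array([10.0, 10.0, 10.0])
atoms = rng.uniform(0.0, 10.0, size=(1000, 3))
cells = decompose(atoms, box, cutoff=2.5)  # at most 4 x 4 x 4 = 64 cells
```

    In an MPI setting each rank would own a block of cells and exchange only the halo cells with its neighbors, which is what makes the short-range parallelization scale.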

  12. Improved Saturation Performance in High Speed Waveguide Photodetectors at 1.3 μm Using an Asymmetric InAlGaAs/InGaAsP Structure

    NASA Technical Reports Server (NTRS)

    Vang, T. A.; Davis, L.; Keo, S.; Forouhar, S. F.

    1996-01-01

    Waveguide photodetector (WGPD) results have recently been presented demonstrating the very large bandwidth-efficiency product potential of these devices. Improved saturation and linearity characteristics are realized in waveguide p-i-n photodetectors at 1.3 μm by using an asymmetric cladding structure with InAlGaAs/InGaAsP in the anode and InGaAsP in the cathode.

  13. Fifty years of driving safety research.

    PubMed

    Lee, John D

    2008-06-01

    This brief review covers the 50 years of driving-related research published in Human Factors, its contribution to driving safety, and emerging challenges. Many factors affect driving safety, making it difficult to assess the impact of specific factors such as driver age, cell phone distractions, or collision warnings. The author considers the research themes associated with the approximately 270 articles on driving published in Human Factors in the past 50 years. To a large extent, current and past research has explored similar themes and concepts. Many articles published in the first 25 years focused on issues such as driver impairment, individual differences, and perceptual limits. Articles published in the past 25 years address similar issues but also point toward vehicle technology that can exacerbate or mitigate the negative effect of these issues. Conceptual and computational models have played an important role in this research. Improved crash-worthiness has contributed to substantial improvements in driving safety over the past 50 years, but future improvements will depend on enhancing driver performance and perhaps, more important, improving driver behavior. Developing models to guide this research will become more challenging as new technology enters the vehicle and shifts the focus from driver performance to driver behavior. Over the past 50 years, Human Factors has accumulated a large base of driving-related research that remains relevant for many of today's design and policy concerns.

  14. Using a Feedback Environment to Improve Creative Performance: A Dynamic Affect Perspective.

    PubMed

    Gong, Zhenxing; Zhang, Na

    2017-01-01

    Prior research on feedback and creative performance has neglected the dynamic nature of affect and has focused only on the influence of positive affect. We argue that creative performance is the result of a dynamic process in which a person experiences a phase of negative affect and subsequently enters a state of high positive affect that is influenced by the feedback environment. Hierarchical regression was used to analyze a sample of 264 employees from seven industry firms. The results indicate that employees' perceptions of a supportive supervisor feedback environment indirectly influence their level of creative performance through positive affect (t2); the negative affect (t1) moderates the relationship between positive affect (t2) and creative performance (t2), rendering the relationship more positive if negative affect (t1) is high. The change in positive affect mediates the relationship between the supervisor feedback environment and creative performance; a decrease in negative affect moderates the relationship between increased positive affect and creative performance, rendering the relationship more positive if the decrease in negative affect is large. The implications for improving the creative performances of employees are further discussed.

  15. Human3.6M: Large Scale Datasets and Predictive Methods for 3D Human Sensing in Natural Environments.

    PubMed

    Ionescu, Catalin; Papava, Dragos; Olaru, Vlad; Sminchisescu, Cristian

    2014-07-01

    We introduce a new dataset, Human3.6M, of 3.6 Million accurate 3D Human poses, acquired by recording the performance of 5 female and 6 male subjects, under 4 different viewpoints, for training realistic human sensing systems and for evaluating the next generation of human pose estimation models and algorithms. Besides increasing the size of the datasets in the current state-of-the-art by several orders of magnitude, we also aim to complement such datasets with a diverse set of motions and poses encountered as part of typical human activities (taking photos, talking on the phone, posing, greeting, eating, etc.), with additional synchronized image, human motion capture, and time of flight (depth) data, and with accurate 3D body scans of all the subject actors involved. We also provide controlled mixed reality evaluation scenarios where 3D human models are animated using motion capture and inserted using correct 3D geometry, in complex real environments, viewed with moving cameras, and under occlusion. Finally, we provide a set of large-scale statistical models and detailed evaluation baselines for the dataset illustrating its diversity and the scope for improvement by future work in the research community. Our experiments show that our best large-scale model can leverage our full training set to obtain a 20% improvement in performance compared to a training set of the scale of the largest existing public dataset for this problem. Yet the potential for improvement by leveraging higher capacity, more complex models with our large dataset, is substantially vaster and should stimulate future research. The dataset together with code for the associated large-scale learning models, features, visualization tools, as well as the evaluation server, is available online at http://vision.imar.ro/human3.6m.

  16. Adaptive Laplacian filtering for sensorimotor rhythm-based brain-computer interfaces.

    PubMed

    Lu, Jun; McFarland, Dennis J; Wolpaw, Jonathan R

    2013-02-01

    Sensorimotor rhythms (SMRs) are 8-30 Hz oscillations in the electroencephalogram (EEG) recorded from the scalp over sensorimotor cortex that change with movement and/or movement imagery. Many brain-computer interface (BCI) studies have shown that people can learn to control SMR amplitudes and can use that control to move cursors and other objects in one, two or three dimensions. At the same time, if SMR-based BCIs are to be useful for people with neuromuscular disabilities, their accuracy and reliability must be improved substantially. These BCIs often use spatial filtering methods such as common average reference (CAR), Laplacian (LAP) filter or common spatial pattern (CSP) filter to enhance the signal-to-noise ratio of EEG. Here, we test the hypothesis that a new filter design, called an 'adaptive Laplacian (ALAP) filter', can provide better performance for SMR-based BCIs. An ALAP filter employs a Gaussian kernel to construct a smooth spatial gradient of channel weights and then simultaneously seeks the optimal kernel radius of this spatial filter and the regularization parameter of linear ridge regression. This optimization is based on minimizing the leave-one-out cross-validation error through a gradient descent method and is computationally feasible. Using a variety of kinds of BCI data from a total of 22 individuals, we compare the performances of ALAP filter to CAR, small LAP, large LAP and CSP filters. With a large number of channels and limited data, ALAP performs significantly better than CSP, CAR, small LAP and large LAP both in classification accuracy and in mean-squared error. Using fewer channels restricted to motor areas, ALAP is still superior to CAR, small LAP and large LAP, but equally matched to CSP. Thus, ALAP may help to improve the accuracy and robustness of SMR-based BCIs.
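    The kernel-based filter construction described above can be sketched as follows (an assumed illustration, not the authors' code): the surround weights come from a Gaussian kernel of inter-electrode distances, normalized so they sum to -1 against a unit weight on the filtered channel, as in a conventional Laplacian.

```python
import numpy as np

def alap_weights(chan_xy, center, radius):
    """Gaussian-kernel Laplacian weights for one EEG channel.

    chan_xy : (n_channels, 2) electrode positions
    center  : index of the channel being filtered
    radius  : kernel radius (the parameter the method optimizes)
    """
    d2 = np.sum((chan_xy - chan_xy[center]) ** 2, axis=1)
    w = -np.exp(-d2 / (2.0 * radius ** 2))  # smooth negative surround
    w[center] = 0.0
    w /= -w.sum()        # surround weights sum to -1
    w[center] = 1.0      # unit weight on the filtered channel
    return w

# Four neighbors at equal distance reduce to a small Laplacian.
xy = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1]], dtype=float)
w = alap_weights(xy, center=0, radius=1.0)  # -> [1, -0.25, -0.25, -0.25, -0.25]
```

    The full method then jointly tunes `radius` and the ridge-regression regularization parameter by gradient descent on the leave-one-out cross-validation error, which is not reproduced here.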

  17. Adaptive Laplacian filtering for sensorimotor rhythm-based brain-computer interfaces

    NASA Astrophysics Data System (ADS)

    Lu, Jun; McFarland, Dennis J.; Wolpaw, Jonathan R.

    2013-02-01

    Objective. Sensorimotor rhythms (SMRs) are 8-30 Hz oscillations in the electroencephalogram (EEG) recorded from the scalp over sensorimotor cortex that change with movement and/or movement imagery. Many brain-computer interface (BCI) studies have shown that people can learn to control SMR amplitudes and can use that control to move cursors and other objects in one, two or three dimensions. At the same time, if SMR-based BCIs are to be useful for people with neuromuscular disabilities, their accuracy and reliability must be improved substantially. These BCIs often use spatial filtering methods such as common average reference (CAR), Laplacian (LAP) filter or common spatial pattern (CSP) filter to enhance the signal-to-noise ratio of EEG. Here, we test the hypothesis that a new filter design, called an ‘adaptive Laplacian (ALAP) filter’, can provide better performance for SMR-based BCIs. Approach. An ALAP filter employs a Gaussian kernel to construct a smooth spatial gradient of channel weights and then simultaneously seeks the optimal kernel radius of this spatial filter and the regularization parameter of linear ridge regression. This optimization is based on minimizing the leave-one-out cross-validation error through a gradient descent method and is computationally feasible. Main results. Using a variety of kinds of BCI data from a total of 22 individuals, we compare the performances of ALAP filter to CAR, small LAP, large LAP and CSP filters. With a large number of channels and limited data, ALAP performs significantly better than CSP, CAR, small LAP and large LAP both in classification accuracy and in mean-squared error. Using fewer channels restricted to motor areas, ALAP is still superior to CAR, small LAP and large LAP, but equally matched to CSP. Significance. Thus, ALAP may help to improve the accuracy and robustness of SMR-based BCIs.

  18. Adaptive fuzzy PID control of hydraulic servo control system for large axial flow compressor

    NASA Astrophysics Data System (ADS)

    Wang, Yannian; Wu, Peizhi; Liu, Chengtao

    2017-09-01

    To improve the stability of the large axial compressor, an efficient, purpose-built intelligent hydraulic servo control system is designed and implemented. An adaptive fuzzy PID control algorithm steadily controls the position of the hydraulic servo cylinder, overcoming the drawback that fixed PID parameters must be re-tuned for different applications. The simulation and test results show that the system has better dynamic properties and stable steady-state performance.

  19. Device Performance and Reliability Improvements of AlGaBN/GaN/Si MOSFET

    DTIC Science & Technology

    2016-02-04

    Metal insulator semiconductor AlGaN/GaN high electron mobility transistors (MISHEMTs) are promising for power device applications due to a lower leakage current than the conventional Schottky AlGaN/GaN HEMTs [1-3]. Among a large number of insulator materials, an Al2O3 dielectric layer, deposited by atomic layer deposition (ALD), is often employed as the gate insulator because of its large band gap (and the resultant high conduction band offset on

  20. Facile Site-Directed Mutagenesis of Large Constructs Using Gibson Isothermal DNA Assembly.

    PubMed

    Yonemoto, Isaac T; Weyman, Philip D

    2017-01-01

    Site-directed mutagenesis is a commonly used molecular biology technique to manipulate biological sequences, and is especially useful for studying sequence determinants of enzyme function or designing proteins with improved activity. We describe a strategy using Gibson Isothermal DNA Assembly to perform site-directed mutagenesis on large (>~20 kbp) constructs that are outside the effective range of standard techniques such as QuikChange II (Agilent Technologies), and that is more reliable than traditional cloning using restriction enzymes and ligation.

  1. A new extranodal scoring system based on the prognostically relevant extranodal sites in diffuse large B-cell lymphoma, not otherwise specified treated with chemoimmunotherapy.

    PubMed

    Hwang, Hee Sang; Yoon, Dok Hyun; Suh, Cheolwon; Huh, Jooryung

    2016-08-01

    Extranodal involvement is a well-known prognostic factor in patients with diffuse large B-cell lymphomas (DLBCL). Nevertheless, the prognostic impact of the extranodal scoring system included in the conventional international prognostic index (IPI) has been questioned in an era where rituximab treatment has become widespread. We investigated the prognostic impacts of individual sites of extranodal involvement in 761 patients with DLBCL who received rituximab-based chemoimmunotherapy. Subsequently, we established a new extranodal scoring system based on extranodal sites, showing significant prognostic correlation, and compared this system with conventional scoring systems, such as the IPI and the National Comprehensive Cancer Network-IPI (NCCN-IPI). An internal validation procedure, using bootstrapped samples, was also performed for both univariate and multivariate models. Using multivariate analysis with a backward variable selection, we found nine extranodal sites (the liver, lung, spleen, central nervous system, bone marrow, kidney, skin, adrenal glands, and peritoneum) that remained significant for use in the final model. Our newly established extranodal scoring system, based on these sites, was better correlated with patient survival than standard scoring systems, such as the IPI and the NCCN-IPI. Internal validation by bootstrapping demonstrated an improvement in model performance of our modified extranodal scoring system. Our new extranodal scoring system, based on the prognostically relevant sites, may improve the performance of conventional prognostic models of DLBCL in the rituximab era and warrants further external validation using large study populations.
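    The abstract does not give the exact weighting of the new score, but its core idea, counting involvement only among the nine prognostically relevant sites, can be sketched as a simple unweighted rule (hypothetical site labels and scoring):

```python
# The nine prognostically relevant extranodal sites reported in the study.
RELEVANT_SITES = {"liver", "lung", "spleen", "cns", "bone_marrow",
                  "kidney", "skin", "adrenal", "peritoneum"}

def extranodal_score(involved_sites):
    """Count involvement among the nine relevant sites.

    A hypothetical unweighted rule; the abstract does not specify
    whether the published score weights sites differently.
    """
    return len(RELEVANT_SITES & set(involved_sites))

extranodal_score(["liver", "skin", "stomach"])  # -> 2 (stomach not counted)
```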

  2. Use of lean sigma principles in a tertiary care otolaryngology clinic to improve efficiency.

    PubMed

    Lin, Sandra Y; Gavney, Dean; Ishman, Stacey L; Cady-Reh, Julie

    2013-11-01

    To apply Lean Sigma, a quality-improvement strategy supported by tactical tools to eliminate waste and reduce variation, to improve efficiency of patient flow in a large tertiary otolaryngology clinic. The project goals were to decrease overall lead time from patient arrival to start of interaction with care provider, improve on-time starts of patient visits, and decrease excess staff/patient motion. Prospective observational study. Patient flow was mapped through the clinic, including preregistration processes. A time-stamp observation study was performed on 188 patient visits over 5 days. Using Lean Sigma principles, time stamps were analyzed to identify patient flow constraints and areas for potential interventions. Interventions were evaluated and adjusted based on feedback from stakeholders: removal of bottlenecks in clinic flow, elimination of non-value added registration staff tasks, and alignment of staff hours to accommodate times of high patient census. A postintervention time observation study of 141 patients was performed 5 months later. Patient lead time from clinic arrival to exam start time decreased by 12.2% on average (P = .042). On-time starts for patient exams improved by 34% (χ² = 16.091, P < .001). Excess patient motion was reduced by 74 feet per patient, which represents a 34% reduction in motion per visit. Use of Lean Sigma principles in a large tertiary otolaryngology clinic led to decreased patient wait time and significant improvements in on-time patient exam start time. Process mapping, engagement of leadership and staff, and elimination of non-value added steps or processes were key to improvement. Copyright © 2013 The American Laryngological, Rhinological and Otological Society, Inc.

  3. RabbitQR: fast and flexible big data processing at LSST data rates using existing, shared-use hardware

    NASA Astrophysics Data System (ADS)

    Kotulla, Ralf; Gopu, Arvind; Hayashi, Soichi

    2016-08-01

    Processing astronomical data to science readiness was and remains a challenge, in particular in the case of multi detector instruments such as wide-field imagers. One such instrument, the WIYN One Degree Imager, is available to the astronomical community at large, and, in order to be scientifically useful to its varied user community on a short timescale, provides its users fully calibrated data in addition to the underlying raw data. However, time-efficient re-processing of the often large datasets with improved calibration data and/or software requires more than just a large number of CPU-cores and disk space. This is particularly relevant if all computing resources are general purpose and shared with a large number of users in a typical university setup. Our approach to address this challenge is a flexible framework, combining the best of both high performance (large number of nodes, internal communication) and high throughput (flexible/variable number of nodes, no dedicated hardware) computing. Based on the Advanced Message Queuing Protocol, we developed a Server-Manager-Worker framework. In addition to the server directing the work flow and the workers executing the actual work, the manager maintains a list of available workers, adds and/or removes individual workers from the worker pool, and re-assigns workers to different tasks. This provides the flexibility of optimizing the worker pool to the current task and workload, improves load balancing, and makes the most efficient use of the available resources. We present performance benchmarks and scaling tests, showing that, today and using existing, commodity shared-use hardware, we can process data with throughputs (including data reduction and calibration) approaching those expected in the early 2020s for future observatories such as the Large Synoptic Survey Telescope.

  4. Linking Assessment and School Success.

    ERIC Educational Resources Information Center

    Raham, Helen

    1999-01-01

    School systems have recently experienced a dramatic shift toward the use of large-scale assessment to improve school performance. Discusses the ways in which external assessment may benefit education, the need for multiple measures of various dimensions of school success, and guidelines for using assessment to create a dynamic cycle of continuous…

  5. Improving CD-ROM Management through Networking.

    ERIC Educational Resources Information Center

    Rutherford, John

    1990-01-01

    Summarizes advantages, components, and manufacturers of CD-ROM networks based on experiences at the Central Connecticut State University library. Three configurations are described, and the steps in installing a network where a large number of databases are shared by a number of microcomputers are detailed. Licensing and network performance issues…

  6. The Deconstructive Approach to Understanding Community College Students' Pathways and Outcomes

    ERIC Educational Resources Information Center

    Bahr, Peter Riley

    2013-01-01

    Two related themes currently dominate discourse on open-access colleges, particularly community colleges: increasing college-going and degree attainment and improving the performance of postsecondary institutions with respect to producing graduates. Largely missing from this discourse, however, is cogency concerning the innumerable ways in which…

  7. Design of experiments (DOE) - history, concepts, and relevance to in vitro culture

    USDA-ARS?s Scientific Manuscript database

    Design of experiments (DOE) is a large and well-developed field for understanding and improving the performance of complex systems. Because in vitro culture systems are complex, but easily manipulated in controlled conditions, they are particularly well-suited for the application of DOE principle...

  8. Effect of the time window on the heat-conduction information filtering model

    NASA Astrophysics Data System (ADS)

    Guo, Qiang; Song, Wen-Jun; Hou, Lei; Zhang, Yi-Lu; Liu, Jian-Guo

    2014-05-01

    Recommendation systems have been proposed to identify the potential tastes and preferences of ordinary online users; however, an analysis of how the time window affects performance has been missing, which is critical for saving memory and decreasing computational complexity. In this paper, by gradually expanding the time window, we investigate the impact of the time window on the heat-conduction information filtering model with ten similarity measures. The experimental results on the benchmark dataset Netflix indicate that by using only approximately 11.11% of the recent rating records, the accuracy could be improved by an average of 33.16% and the diversity by 30.62%. In addition, the recommendation performance on the dataset MovieLens could be preserved by considering only approximately 10.91% of the recent records. While preserving or improving recommendation performance, these findings possess significant practical value, largely reducing computation time and shortening data storage space.
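    The time-window restriction itself is simple to express (a hypothetical sketch; the paper's model and similarity measures are not reproduced here): keep only the most recent fraction of rating records before training the filtering model.

```python
def recent_window(records, fraction):
    """Keep only the most recent `fraction` of rating records.

    records : iterable of (user, item, rating, timestamp) tuples
    The filtering model is then trained on this subset alone, as in
    the expanding-time-window experiment described above.
    """
    ordered = sorted(records, key=lambda r: r[3])  # oldest to newest
    keep = max(1, round(len(ordered) * fraction))
    return ordered[-keep:]

ratings = [("u1", "i1", 5, 100), ("u2", "i1", 3, 300),
           ("u1", "i2", 4, 200), ("u3", "i2", 2, 400)]
subset = recent_window(ratings, 0.5)  # the two newest records (t=300, 400)
```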

  9. The impact of extensive green roofs on the improvement of thermal performance for urban areas in Mediterranean climate with reference to the city of Jijel in Algeria

    NASA Astrophysics Data System (ADS)

    Lehtihet, M. C.; Bouchair, A.

    2018-05-01

    Buildings with dark surfaces, concrete and pavement, needed for the expansion of cities, absorb huge amounts of heat, increasing the mean radiant temperatures of urban areas and contributing significantly to the urban heat island (UHI) effect. The purpose of this work is to investigate the impact of green roofs on the improvement of urban thermal performance in a Mediterranean climate. A field investigation is carried out using two large-scale modules built in the city of Jijel in the north of Algeria. The first is a bare reinforced concrete slab, whereas the second is covered with ivy plants. The experimental site, the air and surface temperature parameters, and the measurement points on the modules are described. Measurements are performed using a thermo-hygrometer, surface sensors, and data acquisition apparatus. The results show that green roofs can be a potential means of improving the thermal performance of the surrounding microclimate and the energy performance of buildings in an urban area. The green roof could be an encouraging strategy against the urban heat island effect, not only for Mediterranean cities but also for other areas.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boris, J.P.; Picone, J.M.; Lambrakos, S.G.

    The Surveillance, Correlation, and Tracking (SCAT) problem is the computation-limited kernel of future battle-management systems currently being developed, for example, under the Strategic Defense Initiative (SDI). This report shows how high-performance SCAT can be performed in this decade. Estimates suggest that an increase by a factor of at least one thousand in computational capacity will be necessary to track 10^5 SDI objects in real time. This large improvement is needed because standard algorithms for data organization in important segments of the SCAT problem scale as N^2 and N^3, where N is the number of perceived objects. It is shown that the required speed-up factor can now be achieved because of two new developments: 1) a heterogeneous element supercomputer system based on available parallel-processing technology can account for over one order of magnitude performance improvement today over existing supercomputers; and 2) algorithmic innovations developed recently by the NRL Laboratory for Computational Physics will account for another two orders of magnitude improvement. Based on these advances, a comprehensive, high-performance kernel for a simulator/system to perform the SCAT portion of SDI battle management is described.

  11. Mechanical-Kinetic Modeling of a Molecular Walker from a Modular Design Principle

    NASA Astrophysics Data System (ADS)

    Hou, Ruizheng; Loh, Iong Ying; Li, Hongrong; Wang, Zhisong

    2017-02-01

    Artificial molecular walkers beyond burnt-bridge designs are complex nanomachines that potentially replicate biological walkers in mechanisms and functionalities. Improving the man-made walkers up to performance for widespread applications remains difficult, largely because their biomimetic design principles involve entangled kinetic and mechanical effects to complicate the link between a walker's construction and ultimate performance. Here, a synergic mechanical-kinetic model is developed for a recently reported DNA bipedal walker, which is based on a modular design principle, potentially enabling many directional walkers driven by a length-switching engine. The model reproduces the experimental data of the walker, and identifies its performance-limiting factors. The model also captures features common to the underlying design principle, including counterintuitive performance-construction relations that are explained by detailed balance, entropy production, and bias cancellation. While indicating a low directional fidelity for the present walker, the model suggests the possibility of improving the fidelity above 90% by a more powerful engine, which may be an improved version of the present engine or an entirely new engine motif, thanks to the flexible design principle. The model is readily adaptable to aid these experimental developments towards high-performance molecular walkers.

  12. Feature generation and representations for protein-protein interaction classification.

    PubMed

    Lan, Man; Tan, Chew Lim; Su, Jian

    2009-10-01

    Automatically detecting protein-protein interaction (PPI)-relevant articles is a crucial step for large-scale biological database curation. Previous work adopted POS tagging, shallow parsing and sentence splitting techniques, but achieved worse performance than the simple bag-of-words representation. In this paper, we generated and investigated multiple types of feature representations in order to further improve the performance of the PPI text classification task. Besides the traditional domain-independent bag-of-words approach and the term weighting methods, we also explored other domain-dependent features, i.e. protein-protein interaction trigger keywords, protein named entities and advanced ways of incorporating Natural Language Processing (NLP) output. The integration of these multiple features has been evaluated on the BioCreAtIvE II corpus. The experimental results showed that both the advanced use of NLP output and the integration of bag-of-words and NLP output improved the performance of text classification. Specifically, in comparison with the best performance achieved in the BioCreAtIvE II IAS, the feature-level and classifier-level integration of multiple features improved the performance of classification by 2.71% and 3.95%, respectively.

  13. Compensation of reflector antenna surface distortion using an array feed

    NASA Technical Reports Server (NTRS)

    Cherrette, A. R.; Acosta, R. J.; Lam, P. T.; Lee, S. W.

    1988-01-01

    The dimensional stability of the surface of a large reflector antenna is important when high gain or low sidelobe performance is desired. If the surface is distorted due to thermal or structural reasons, antenna performance can be improved through the use of an array feed. The design of the array feed and its relation to the surface distortion are examined. The sensitivity of antenna performance to changing surface parameters for fixed feed array geometries is also studied. This allows determination of the limits of usefulness for feed array compensation.

  14. Dynamic test/analysis correlation using reduced analytical models

    NASA Technical Reports Server (NTRS)

    Mcgowan, Paul E.; Angelucci, A. Filippo; Javeed, Mehzad

    1992-01-01

    Test/analysis correlation is an important aspect of the verification of analysis models which are used to predict on-orbit response characteristics of large space structures. This paper presents results of a study using reduced analysis models for performing dynamic test/analysis correlation. The reduced test-analysis model (TAM) has the same number and orientation of DOF as the test measurements. Two reduction methods, static (Guyan) reduction and the Improved Reduced System (IRS) reduction, are applied to the test/analysis correlation of a laboratory truss structure. Simulated test results and modal test data are used to examine the performance of each method. It is shown that selection of DOF to be retained in the TAM is critical when large structural masses are involved. In addition, the use of modal test results may provide difficulties in TAM accuracy even if a large number of DOF are retained in the TAM.

  15. Zeolite crystal growth in space

    NASA Technical Reports Server (NTRS)

    Sacco, Albert, Jr.; Thompson, Robert W.; Dixon, Anthony G.

    1991-01-01

    The growth of large, uniform zeolite crystals in high yield in space can have a major impact on the chemical process industry. Large zeolite crystals will be used to improve basic understanding of adsorption and catalytic mechanisms, and to make zeolite membranes. To grow large zeolites in microgravity, it is necessary to control the nucleation event and fluid motion, and to enhance nutrient transfer. Data is presented that suggests nucleation can be controlled using chemical compounds (e.g., triethanolamine, for zeolite A) while not adversely affecting the growth rate. A three-zone furnace has been designed to perform multiple syntheses concurrently. The operating range of the furnace is 295 K to 473 K. Teflon-lined autoclaves (10 ml liquid volume) have been designed to minimize contamination, reduce wall nucleation, and control mixing of pre-gel solutions on orbit. Zeolite synthesis experiments will be performed on USML-1 in 1992.

  16. An evaluation of multi-probe locality sensitive hashing for computing similarities over web-scale query logs

    PubMed Central

    2018-01-01

    Many modern applications of AI such as web search, mobile browsing, image processing, and natural language processing rely on finding similar items from a large database of complex objects. Due to the very large scale of data involved (e.g., users' queries from commercial search engines), computing such near or nearest neighbors is a non-trivial task, as the computational cost grows significantly with the number of items. To address this challenge, we adopt Locality Sensitive Hashing (a.k.a. LSH) methods and evaluate four variants in a distributed computing environment (specifically, Hadoop). We identify several optimizations which improve performance, suitable for deployment in very large scale settings. The experimental results demonstrate that our variants of LSH achieve robust performance with better recall compared with "vanilla" LSH, even when using the same amount of space. PMID:29346410
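    The basic LSH idea the entry builds on can be sketched with random-hyperplane hashing for cosine similarity (a minimal single-table illustration; the paper's four multi-probe variants and the Hadoop deployment are not reproduced):

```python
import numpy as np

def lsh_buckets(vectors, n_bits, seed=0):
    """Bucket vectors by a random-hyperplane LSH signature.

    Vectors sharing an n_bits sign pattern fall into the same bucket
    and become candidate near neighbors, replacing the all-pairs
    comparison whose cost grows quadratically with the number of items.
    """
    rng = np.random.default_rng(seed)
    planes = rng.standard_normal((n_bits, vectors.shape[1]))
    bits = (vectors @ planes.T) > 0  # one sign bit per hyperplane
    buckets = {}
    for i, sig in enumerate(map(tuple, bits)):
        buckets.setdefault(sig, []).append(i)
    return buckets

data = np.array([[1.0, 0.1], [1.0, 0.2], [-1.0, -0.1]])
buckets = lsh_buckets(data, n_bits=8)  # opposite vectors never collide
```

    Multi-probe LSH extends this by also checking buckets whose signatures differ from the query's in a few bits, trading extra probes for fewer hash tables.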

  17. Hybrid Parallelism for Volume Rendering on Large-, Multi-, and Many-Core Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Howison, Mark; Bethel, E. Wes; Childs, Hank

    2012-01-01

    With the computing industry trending towards multi- and many-core processors, we study how a standard visualization algorithm, ray-casting volume rendering, can benefit from a hybrid parallelism approach. Hybrid parallelism provides the best of both worlds: using distributed-memory parallelism across a large number of nodes increases available FLOPs and memory, while exploiting shared-memory parallelism among the cores within each node ensures that each node performs its portion of the larger calculation as efficiently as possible. We demonstrate results from weak and strong scaling studies, at levels of concurrency ranging up to 216,000, and with datasets as large as 12.2 trillion cells. The greatest benefit from hybrid parallelism lies in the communication portion of the algorithm, the dominant cost at higher levels of concurrency. We show that reducing the number of participants with a hybrid approach significantly improves performance.
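    The two-level decomposition can be sketched in Python. This is purely illustrative: the distributed-memory layer is simulated with a plain loop over "nodes" (in practice this would be MPI ranks), threads stand in for cores, and summation stands in for compositing a ray-cast sub-brick. All names are assumptions of this sketch:

```python
from concurrent.futures import ThreadPoolExecutor

def render_chunk(volume, lo, hi):
    # Stand-in for ray-casting one sub-brick of the volume.
    return sum(volume[lo:hi])

def node_render(volume, lo, hi, cores=4):
    # Shared-memory level: split this node's brick across its cores.
    step = max(1, (hi - lo) // cores)
    spans = [(s, min(s + step, hi)) for s in range(lo, hi, step)]
    with ThreadPoolExecutor(max_workers=cores) as pool:
        parts = pool.map(lambda sp: render_chunk(volume, *sp), spans)
    return sum(parts)

def hybrid_render(volume, nodes=3, cores=4):
    # Distributed-memory level: one partial result per node, so the
    # final compositing/reduction involves only `nodes` participants,
    # not nodes * cores -- the communication saving the paper reports.
    step = max(1, len(volume) // nodes)
    spans = [(s, min(s + step, len(volume)))
             for s in range(0, len(volume), step)]
    partials = [node_render(volume, lo, hi, cores) for lo, hi in spans]
    return sum(partials)
```

    The key design point mirrored here is that only one participant per node enters the global reduction, which is where the paper finds the largest benefit at high concurrency.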

  18. Wind tunnel investigation of a high lift system with pneumatic flow control

    NASA Astrophysics Data System (ADS)

    Victor, Pricop Mihai; Mircea, Boscoianu; Daniel-Eugeniu, Crunteanu

    2016-06-01

    Next-generation passenger aircraft require more efficient high-lift systems under size and mass constraints in order to achieve better fuel efficiency. This can be pursued in several ways: maintaining or improving aerodynamic performance while simplifying the mechanical design of the high-lift system to a single-slotted flap, or keeping the current complexity while improving the aerodynamics further. Laminar wings have less efficient leading-edge high-lift systems, if any, requiring more performance from the trailing-edge flap. Pulsed-blowing active flow control (AFC) in the gap of a single-element flap is investigated on a relatively large model. The wind tunnel model, test campaign, results, and conclusions are presented.

  19. High-performance liquid chromatography purification of homogenous-length RNA produced by trans cleavage with a hammerhead ribozyme.

    PubMed Central

    Shields, T P; Mollova, E; Ste Marie, L; Hansen, M R; Pardi, A

    1999-01-01

    An improved method is presented for the preparation of milligram quantities of homogenous-length RNAs suitable for nuclear magnetic resonance or X-ray crystallographic structural studies. Heterogeneous-length RNA transcripts are processed with a hammerhead ribozyme to yield homogenous-length products that are then readily purified by anion exchange high-performance liquid chromatography. This procedure eliminates the need for denaturing polyacrylamide gel electrophoresis, which is the most laborious step in the standard procedure for large-scale production of RNA by in vitro transcription. The hammerhead processing of the heterogeneous-length RNA transcripts also substantially improves the overall yield and purity of the desired RNA product. PMID:10496226

  20. Study of multi-functional precision optical measuring system for large scale equipment

    NASA Astrophysics Data System (ADS)

    Jiang, Wei; Lao, Dabao; Zhou, Weihu; Zhang, Wenying; Jiang, Xingjian; Wang, Yongxi

    2017-10-01

    The effective application of high-performance measurement technology can greatly improve large-scale equipment manufacturing capability. Measuring geometric parameters such as size, attitude, and position therefore requires a measurement system with high precision, multiple functions, portability, and other characteristics. However, existing measuring instruments, such as the laser tracker, total station, and photogrammetry system, mostly offer a single function and require repeated station moves. The laser tracker needs to work with a cooperative target and can hardly meet the requirements of measurement in extreme environments. The total station is mainly used for outdoor surveying and mapping and can hardly achieve the accuracy demanded in industrial measurement. The photogrammetry system can achieve wide-range multi-point measurement, but its measuring range is limited and the station must be moved repeatedly. This paper presents a non-contact opto-electronic measuring instrument that can both work by scanning the measurement path and measure a cooperative target by tracking. The system is based on several key technologies: absolute distance measurement, two-dimensional angle measurement, automatic target recognition and accurate aiming, precision control, assembly of a complex mechanical system, and multi-functional 3D visualization software. Among them, the absolute distance measurement module ensures high-accuracy measurement, and the two-dimensional angle measurement module provides precise angle measurement. The system is suitable for non-contact measurement of large-scale equipment; it can ensure the quality and performance of large-scale equipment throughout the manufacturing process and improve the manufacturing capability of large-scale, high-end equipment.

  1. Interactive large-group teaching in a dermatology course.

    PubMed

    Ochsendorf, F R; Boehncke, W-H; Sommerlad, M; Kaufmann, R

    2006-12-01

    This is a prospective study to find out whether an interactive large-group case-based teaching approach combined with small-group bedside teaching improves student satisfaction and learning outcome in a practical dermatology course. During two consecutive terms a rotating system of large-group interactive case-study-method teaching with two tutors (one content expert, one process facilitator) and bedside teaching with randomly appointed tutors was evaluated with a nine-item questionnaire and multiple-choice test performed at the beginning and the end of the course (n = 204/231 students evaluable). The results of three different didactic approaches utilized over the prior year served as a control. The interactive course was rated significantly better (p < 0.0001) than the standard course with regard to all items. The aggregate mark given by the students for the whole course was 1.58 +/- 0.61 (mean +/- SD; range 1 (good) to 5 (poor)). This was significantly better than the standard course (p < 0.0001) and not different from small-group teaching approaches. The mean test results in the final examination improved significantly (p < 0.01). The combination of large-group interactive teaching and small-group bedside teaching was well accepted, improved the learning outcome, was rated as good as a small-group didactic approach and needed fewer resources in terms of personnel.

  2. Improving Disease Prediction by Incorporating Family Disease History in Risk Prediction Models with Large-Scale Genetic Data.

    PubMed

    Gim, Jungsoo; Kim, Wonji; Kwak, Soo Heon; Choi, Hosik; Park, Changyi; Park, Kyong Soo; Kwon, Sunghoon; Park, Taesung; Won, Sungho

    2017-11-01

    Despite the many successes of genome-wide association studies (GWAS), the known susceptibility variants identified by GWAS have modest effect sizes, leading to notable skepticism about the effectiveness of building a risk prediction model from large-scale genetic data. However, in contrast to genetic variants, the family history of diseases has been largely accepted as an important risk factor in clinical diagnosis and risk prediction. Nevertheless, the complicated structures of the family history of diseases have limited their application in clinical practice. Here, we developed a new method that enables incorporation of the general family history of diseases with a liability threshold model, and propose a new analysis strategy for risk prediction with penalized regression analysis that incorporates both large numbers of genetic variants and clinical risk factors. Application of our model to type 2 diabetes in the Korean population (1846 cases and 1846 controls) demonstrated that single-nucleotide polymorphisms accounted for 32.5% of the variation explained by the predicted risk scores in the test data set, and incorporation of family history led to an additional 6.3% improvement in prediction. Our results illustrate that family medical history provides valuable information on the variation of complex diseases and improves prediction performance. Copyright © 2017 by the Genetics Society of America.
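    The modeling strategy described above, penalized regression over many genetic variants plus a family-history covariate, can be sketched with a toy simulation. This is not the authors' model or data: the effect sizes, sample size, and L1-penalized logistic regression fit by proximal gradient descent are all assumptions of this illustration:

```python
import numpy as np

def l1_logistic(X, y, lam=0.01, lr=0.1, iters=500):
    # L1-penalized logistic regression via proximal gradient descent:
    # a gradient step on the logistic loss, then soft-thresholding,
    # which drives weak predictors toward exactly zero.
    n, p = X.shape
    w = np.zeros(p)
    for _ in range(iters):
        pred = 1.0 / (1.0 + np.exp(-X @ w))
        grad = X.T @ (pred - y) / n
        w = w - lr * grad
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)
    return w

rng = np.random.default_rng(0)
n, p = 400, 20
snps = rng.integers(0, 3, size=(n, p)).astype(float)          # 0/1/2 genotypes
family_history = rng.integers(0, 2, size=(n, 1)).astype(float)  # binary covariate
X = np.hstack([snps, family_history])
# Simulated liability: a few causal SNPs plus a family-history effect.
liability = 0.5 * snps[:, 0] - 0.4 * snps[:, 1] + 1.0 * family_history[:, 0]
y = (liability + rng.normal(0.0, 1.0, n) > liability.mean()).astype(float)
w = l1_logistic(X, y)
```

    In this simulation the last coefficient, corresponding to family history, comes out positive, mirroring the paper's finding that family history adds predictive information on top of the genetic variants.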

  3. Amine-Modulated/Engineered Interfaces of NiMo Electrocatalysts for Improved Hydrogen Evolution Reaction in Alkaline Solutions.

    PubMed

    Gao, Wei; Gou, Wangyan; Zhou, Xuemei; Ho, Johnny C; Ma, Yuanyuan; Qu, Yongquan

    2018-01-17

    The interface between electrolytes and electrocatalysts largely determines their activity and stability. Herein, modulating the surface characteristics of NiMo nanoparticles with various adsorbed amines provides tunability of their interfacial properties and subsequently improves their catalytic performance for the hydrogen evolution reaction (HER) in alkaline solutions. Diamines can significantly improve the HER activity by decreasing the charge-transfer resistance and modulating the electronic structures of interfacial active sites. Importantly, among the various amines, ethylenediamine enhances the HER activity of NiMo most, with a remarkable 268 mV decrease in the overpotential required to reach 10 mA cm⁻² as compared with that of the unmodified NiMo in 1.0 M KOH. This method provides a novel strategy of regulating interfacial properties to strengthen the catalytic performance of electrocatalysts.

  4. Reading with peripheral vision: a comparison of reading dynamic scrolling and static text with a simulated central scotoma.

    PubMed

    Harvey, Hannah; Walker, Robin

    2014-05-01

    Horizontally scrolling text is, in theory, ideally suited to the viewing strategies recommended to improve reading performance under conditions of central vision loss such as macular disease, although it is largely unproven in this regard. This study investigated whether scrolling text produces an observable improvement in reading performed under conditions of eccentric viewing in an artificial-scotoma paradigm. Participants (n = 17) read scrolling and static text with a central artificial scotoma controlled by an eye-tracker. There was an improvement in measures of reading accuracy and in adherence to eccentric viewing strategies with scrolling, compared with static, text. These findings illustrate the potential of scrolling text as a reading aid for those with central vision loss. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. Chapter 24: Strategic Energy Management (SEM) Evaluation Protocol. The Uniform Methods Project: Methods for Determining Energy Efficiency Savings for Specific Measures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, James

    Strategic energy management (SEM) focuses on achieving energy-efficiency improvements through systematic and planned changes in facility operations, maintenance, and behaviors (OM&B) and capital equipment upgrades in large energy-using facilities, including industrial buildings, commercial buildings, and multi-facility organizations such as campuses or communities. Facilities can institute a spectrum of SEM actions, ranging from a simple process for regularly identifying energy-savings actions, to establishing a formal, third-party recognized or certified SEM framework for continuous improvement of energy performance. In general, SEM programs that would be considered part of a utility program will contain a set of energy-reducing goals, principles, and practices emphasizing continuous improvements in energy performance or savings through energy management and an energy management system (EnMS).

  6. SRF niobium characterization using SIMS and FIB-TEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevie, F. A.

    2015-12-04

    Our understanding of superconducting radio frequency (SRF) accelerator cavities has been improved by elemental analysis at high depth resolution and by high-magnification microscopy. This paper summarizes the technique development and the results obtained on poly-crystalline, large-grain, and single-crystal SRF niobium. Focused ion beam milling made possible sample preparation for transmission electron microscopy, and the images obtained showed a very uniform oxide layer on all samples analyzed. Secondary ion mass spectrometry indicated the presence of a high concentration of hydrogen, and the hydrogen content exhibited a relationship with improvement in performance. Depth profiles of carbon, nitrogen, and oxygen did not show major differences with heat treatment. Niobium oxide less than 10 nm thick was shown to be an effective hydrogen barrier. Niobium with titanium contamination showed unexpected performance improvement.

  7. Improve Performance of Data Warehouse by Query Cache

    NASA Astrophysics Data System (ADS)

    Gour, Vishal; Sarangdevot, S. S.; Sharma, Anand; Choudhary, Vinod

    2010-11-01

    The primary goal of a data warehouse is to free the information locked up in the operational database so that decision makers and business analysts can query, analyze, and plan regardless of the data changes in the operational database. Because the number of queries is large, there is in many cases a reasonable probability that the same query is submitted by one or more users at different times. Each time a query is executed, the warehouse data are analyzed to generate its result. In this paper we study how a query cache improves the performance of a data warehouse and examine the common problems faced by data warehouse administrators: minimizing response time and improving overall query efficiency, particularly when the warehouse is updated at regular intervals.
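    The caching idea can be sketched as a minimal in-memory query cache that is flushed whenever the warehouse is refreshed. The class and names are illustrative; production caches would also bound memory use and expire entries:

```python
class QueryCache:
    """Memoize query results keyed by SQL text; flush everything when
    the warehouse is refreshed, since any cached answer may be stale."""
    def __init__(self):
        self._store = {}

    def execute(self, sql, run_query):
        # Cache hit: skip re-analyzing the warehouse data.
        if sql not in self._store:
            self._store[sql] = run_query(sql)
        return self._store[sql]

    def invalidate(self):
        # Call after each periodic warehouse load (ETL).
        self._store.clear()

calls = []
def backend(sql):
    calls.append(sql)          # stands in for a full warehouse scan
    return f"result of {sql}"

cache = QueryCache()
cache.execute("SELECT region, SUM(sales) FROM f GROUP BY region", backend)
cache.execute("SELECT region, SUM(sales) FROM f GROUP BY region", backend)
```

    The repeated query hits the cache, so the backend runs only once; after `invalidate()` the next execution recomputes against the fresh data.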

  8. Classifying injury narratives of large administrative databases for surveillance-A practical approach combining machine learning ensembles and human review.

    PubMed

    Marucci-Wellman, Helen R; Corns, Helen L; Lehto, Mark R

    2017-01-01

    Injury narratives are now available in real time and include useful information for injury surveillance and prevention. However, manual classification of the cause or events leading to injury found in large batches of narratives, such as workers compensation claims databases, can be prohibitive. In this study we compare the utility of four machine learning algorithms (Naïve Bayes single-word and bi-gram models, a Support Vector Machine, and Logistic Regression) for classifying narratives into Bureau of Labor Statistics Occupational Injury and Illness event-leading-to-injury classifications for a large workers compensation database. These algorithms are known to do well classifying narrative text and are fairly easy to implement with off-the-shelf software packages such as Python. We propose human-machine learning ensemble approaches which maximize the power and accuracy of the algorithms for machine-assigned codes and allow for strategic filtering of rare, emerging or ambiguous narratives for manual review. We compare human-machine approaches based on filtering on the prediction strength of the classifier vs. agreement between algorithms. Regularized Logistic Regression (LR) was the best performing algorithm alone. Using this algorithm and filtering out the bottom 30% of predictions for manual review resulted in high accuracy (overall sensitivity/positive predictive value of 0.89) of the final machine-human coded dataset. The best pairings of algorithms included Naïve Bayes with Support Vector Machine, whereby the triple ensemble NB-SW = NB-BIGRAM = SVM (i.e., requiring agreement among the three) had very high performance (0.93 overall sensitivity/positive predictive value) across both large and small categories, leaving 41% of the narratives for manual review. Integrating LR into this ensemble mix improved performance only slightly.
For large administrative datasets we propose incorporating human-machine pairing methods such as those used here, utilizing readily available off-the-shelf machine learning techniques, so that only a fraction of narratives require manual review. Human-machine ensemble methods are likely to improve performance over fully manual coding. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
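    The agreement-based routing can be sketched as follows. Toy keyword matchers stand in for the trained Naïve Bayes/SVM models, and all names and codes are illustrative:

```python
def ensemble_route(narratives, classifiers):
    # Machine-assign a code only when every classifier agrees;
    # otherwise queue the narrative for human review.
    auto, review = {}, []
    for text in narratives:
        preds = {clf(text) for clf in classifiers}
        if len(preds) == 1:
            auto[text] = preds.pop()
        else:
            review.append(text)
    return auto, review

def keyword_clf(word, code, default="OTHER"):
    # Toy stand-in for a trained narrative classifier.
    return lambda text: code if word in text else default

clfs = [keyword_clf("fall", "FALL"),
        keyword_clf("fell", "FALL"),
        keyword_clf("ladder", "FALL")]
auto, review = ensemble_route(
    ["worker fell from ladder", "struck by object"], clfs)
```

    The narrative the classifiers disagree on is routed to a human coder, while the unanimous one is machine-coded, which is the filtering strategy the study evaluates.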

  9. Training children aged 5-10 years in manual compliance control to improve drawing and handwriting.

    PubMed

    Bingham, Geoffrey P; Snapp-Childs, Winona

    2018-04-12

    A large proportion of school-aged children exhibit poor drawing and handwriting. This prevalence limits the availability of therapy. We developed an automated method for training improved manual compliance control and relatedly, prospective control of a stylus. The approach included a difficult training task, while providing parametrically modifiable support that enables the children to perform successfully while developing good compliance control. The task was to use a stylus to push a bead along a 3D wire path. Support was provided by making the wire magnetically attractive to the stylus. Support was progressively reduced as 3D tracing performance improved. We report studies that (1) compared performance of Typically Developing (TD) children and children with Developmental Coordination Disorder (DCD), (2) tested training with active versus passive movement, (3) tested progressively reduced versus constant or no support during training, (4) tested children of different ages, (5) tested the transfer of training to a drawing task, (6) tested the specificity of training in respect to the size, shape and dimensionality of figures, and (7) investigated the relevance of the training task to the Beery VMI, an inventory used to diagnose DCD. The findings were as follows. (1) Pre-training performance of TD and DCD children was the same and good with high support but distinct and poor with low support. Support yielded good self-efficacy that motivated training. Post training performance with no support was improved and the same for TD and DCD children. (2) Actively controlled movements were required for improved performance. (3) Progressively reduced support was required for good performance during and after training. (4) Age differences in performance during pre-training were eliminated post-training. (5) Improvements transferred to drawing. (6) There was no evidence of specificity of training in transfer. 
(7) Disparate Beery scores were reflected in pre-training but not post-training performance. We conclude that the method improves manual compliance control, and more generally, prospective control of movements used in drawing performance. Copyright © 2018. Published by Elsevier B.V.

  10. Comparative study of absorption in tilted silicon nanowire arrays for photovoltaics

    PubMed Central

    2014-01-01

    Silicon nanowire arrays have been shown to demonstrate light trapping properties and promising potential for next-generation photovoltaics. In this paper, we show that the absorption enhancement in vertical nanowire arrays on a perfectly electric conductor can be further improved through tilting. Vertical nanowire arrays have a 66.2% improvement in ultimate efficiency over an ideal double-pass thin film of the equivalent amount of material. Tilted nanowire arrays, with the same amount of material, exhibit improved performance over vertical nanowire arrays across a broad range of tilt angles (from 38° to 72°). The optimum tilt of 53° has an improvement of 8.6% over that of vertical nanowire arrays and 80.4% over that of the ideal double-pass thin film. Tilted nanowire arrays exhibit improved absorption over the solar spectrum compared with vertical nanowires since the tilt allows for the excitation of additional modes besides the HE1m modes that are excited at normal incidence. We also observed that tilted nanowire arrays have improved performance over vertical nanowire arrays for a large range of incidence angles (under about 60°). PMID:25435833

  11. Comparative study of absorption in tilted silicon nanowire arrays for photovoltaics.

    PubMed

    Kayes, Md Imrul; Leu, Paul W

    2014-01-01

    Silicon nanowire arrays have been shown to demonstrate light trapping properties and promising potential for next-generation photovoltaics. In this paper, we show that the absorption enhancement in vertical nanowire arrays on a perfectly electric conductor can be further improved through tilting. Vertical nanowire arrays have a 66.2% improvement in ultimate efficiency over an ideal double-pass thin film of the equivalent amount of material. Tilted nanowire arrays, with the same amount of material, exhibit improved performance over vertical nanowire arrays across a broad range of tilt angles (from 38° to 72°). The optimum tilt of 53° has an improvement of 8.6% over that of vertical nanowire arrays and 80.4% over that of the ideal double-pass thin film. Tilted nanowire arrays exhibit improved absorption over the solar spectrum compared with vertical nanowires since the tilt allows for the excitation of additional modes besides the HE1m modes that are excited at normal incidence. We also observed that tilted nanowire arrays have improved performance over vertical nanowire arrays for a large range of incidence angles (under about 60°).
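    For reference, the ultimate efficiency quoted above is conventionally defined as follows, under the standard assumption that every absorbed photon with energy above the band gap contributes one carrier at the band-gap energy:

```latex
\eta = \frac{\int_{0}^{\lambda_g} I(\lambda)\, A(\lambda)\, \frac{\lambda}{\lambda_g}\, d\lambda}
            {\int_{0}^{\infty} I(\lambda)\, d\lambda}
```

    where I(λ) is the solar spectral irradiance (typically the AM1.5G spectrum), A(λ) is the absorptance of the structure, and λ_g is the wavelength corresponding to the band gap of silicon (about 1100 nm).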

  12. Improving the availability of clinical history accompanying radiographic examinations in a large pediatric radiology department.

    PubMed

    Hawkins, C Matthew; Anton, Christopher G; Bankes, Wendy M; Leach, Alan D; Zeno, Michael J; Pryor, Rebecca M; Larson, David B

    2014-04-01

    The purpose of this quality improvement initiative was to improve the consistency with which radiologists are provided a complete clinical history when interpreting radiography examinations performed in the outpatient and emergency department settings. The clinical history was considered complete if it contained three elements: the nature of the symptoms, description of injury, or cause for clinical concern; the duration of symptoms or time of injury; and the focal site of pain or abnormality, if applicable. These were summarized as "what-when-where." A goal was established that 95% of clinical histories should contain all three elements. To achieve this goal, technologists supplemented referring clinicians' histories. The project was divided into four phases: launch, support, transition to sustainability, and maintenance. During the support phase, results of automated weekly audits populated group-level performance reports. During the transition-to-sustainability phase, audit results populated individual-level performance reports. During the maintenance phase, quarterly audit results were incorporated into technologists' employee performance goals. Before initiation of the project, 38% (76/200) of radiography examinations were accompanied by a complete clinical history. This increased to 92% (928/1006) by the end of the 15-week improvement phase. Performance was sustained at 96% (1168/1213) 7 months later [corrected]. By clearly defining expectations for an appropriate clinical history and establishing system and organizational mechanisms to facilitate verifiable compliance, we were able to successfully and sustainably improve the consistency with which radiography examinations were accompanied by a complete clinical history.
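    The audit's completeness rule can be sketched as a simple check. The field names and report logic here are illustrative assumptions, not the department's actual audit code:

```python
REQUIRED = ("what", "when", "where")  # symptoms/injury, onset/time, focal site

def history_complete(history):
    # A history is complete when all three elements are non-empty.
    return all(history.get(field, "").strip() for field in REQUIRED)

def compliance_rate(histories):
    # Fraction of examinations with a complete clinical history,
    # as a weekly audit report might compute it.
    return sum(history_complete(h) for h in histories) / len(histories)

good = {"what": "wrist pain after fall", "when": "2 days ago",
        "where": "left wrist"}
bad = {"what": "cough", "when": "", "where": ""}
```

    A record missing any of the three elements counts against the site's compliance rate, mirroring the 95% target described above.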

  13. RETINA EXPANSION TECHNIQUE FOR MACULAR HOLE APPOSITION REPORT 2: Efficacy, Closure Rate, and Risks of a Macular Detachment Technique to Close Large Full-Thickness Macular Holes.

    PubMed

    Wong, Roger; Howard, Catherine; Orobona, Giancarlo Dellʼaversana

    2018-04-01

    To describe the safety and efficacy of a technique to close large full-thickness macular holes. A consecutive retrospective interventional case series included 16 patients with macular holes greater than 650 microns in "aperture" diameter. The technique involves vitrectomy followed by internal limiting membrane peeling. The macula is detached using a subretinal injection of saline. Fluid-air exchange is performed to promote detachment and stretching of the retina. After this, the standard fluid-air exchange is performed and perfluoropropane gas is injected. Face-down posturing is advised. Adverse effects and preoperative and postoperative visual acuities were recorded. Optical coherence tomography scans were also taken. The mean hole size was 739 microns (SD: 62 microns; mean base diameter: 1,311 microns). Eighty-three percent (14 of 16) of eyes had successful hole closure after the procedure. At 12-month follow-up, no worsening in visual acuity was reported, and improvement in visual acuity was noted in 14 of 16 eyes. No patients lost vision because of the procedure. It is possible to achieve anatomical closure of large macular holes using RETMA. No patients experienced visual loss. The level of visual improvement is likely limited because of the size and chronicity of these holes.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jie; Cui, Mingjian; Hodge, Bri-Mathias

    The large variability and uncertainty in wind power generation present a concern to power system operators, especially given the increasing amounts of wind power being integrated into the electric power system. Large ramps, one of the biggest concerns, can significantly influence system economics and reliability. The Wind Forecast Improvement Project (WFIP) aimed to improve the accuracy of forecasts and to evaluate the economic benefits of these improvements to grid operators. This paper evaluates the ramp forecasting accuracy gained by improving the performance of short-term wind power forecasting. This study focuses on the WFIP southern study region, which encompasses most of the Electric Reliability Council of Texas (ERCOT) territory, to compare the experimental WFIP forecasts to the existing short-term wind power forecasts (used at ERCOT) at multiple spatial and temporal scales. The study employs four significant wind power ramping definitions according to the power change magnitude, direction, and duration. The optimized swinging door algorithm is adopted to extract ramp events from actual and forecasted wind power time series. The results show that the experimental WFIP forecasts improve the accuracy of wind power ramp forecasting. This improvement can result in substantial cost savings and power system reliability enhancements.
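    As a much-simplified illustration of ramp extraction (not the optimized swinging door algorithm itself), a ramp event can be flagged whenever normalized power changes by more than a magnitude threshold within a bounded duration, with the sign giving the ramp direction. The threshold and window length below are arbitrary:

```python
def find_ramps(power, threshold=0.2, max_len=4):
    # Scan for intervals where power (as a fraction of capacity)
    # changes by at least `threshold` within at most `max_len` steps;
    # the sign of the change gives the ramp direction.
    ramps, i, n = [], 0, len(power)
    while i < n - 1:
        hit = None
        for j in range(i + 1, min(i + max_len, n - 1) + 1):
            delta = power[j] - power[i]
            if abs(delta) >= threshold:
                hit = (i, j, "up" if delta > 0 else "down")
                break
        if hit:
            ramps.append(hit)
            i = hit[1]          # resume after the detected ramp
        else:
            i += 1
    return ramps
```

    Running the same detector on both actual and forecasted power series and matching the detected intervals is the basic idea behind scoring ramp forecasting accuracy.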

  15. Investigation of a Novel Turbulence Model and Using Leading-Edge Slots for Improving the Aerodynamic Performance of Airfoils and Wind Turbines

    NASA Astrophysics Data System (ADS)

    Beyhaghi, Saman

    Because of the problems associated with the increase of greenhouse gases, as well as the limited supplies of fossil fuels, the transition to alternative, clean, renewable sources of energy is inevitable. Renewable sources of energy can be used to decrease our need for fossil fuels, thus reducing impacts on humans, other species, and their habitats. The wind is one of the cleanest forms of energy, and it can be an excellent candidate for producing electrical energy in a more sustainable manner. Vertical- and Horizontal-Axis Wind Turbines (VAWTs and HAWTs) are two common devices used for harvesting electrical energy from the wind. Because wind speed is reduced within the boundary layer that develops over the ground surface, modern commercial wind turbines have to be relatively large to be cost-effective. Because of the high manufacturing and transportation costs of wind turbine components, it is necessary to evaluate the design and predict the performance of a turbine before shipping it to the site where it is to be installed. Computational Fluid Dynamics (CFD) has proven to be a simple, cheap, and yet relatively accurate tool for predicting wind turbine performance, allowing the suitability of different designs to be evaluated at low cost. High-accuracy simulation methods such as Large Eddy Simulation (LES) and Detached Eddy Simulation (DES) have been developed and utilized in the past decades. Despite their strengths in large fluid domains, they fail to make very accurate predictions near solid surfaces. Therefore, in the present effort, the possibility of improving CFD predictions in the near-wall region by using a modified turbulence model is also thoroughly investigated. An Algebraic Stress Model (ASM) is employed in conjunction with Detached Eddy Simulation (DES) to improve the Reynolds stress components, and consequently the predictions of near-wall velocities and surface pressure distributions.
The proposed model shows slightly better performance than the baseline DES. In the second part of this study, the focus is on improving the aerodynamic performance of airfoils and wind turbines in terms of lift and drag coefficients and power generation. One special type of add-on feature for wind turbines and airfoils, the leading-edge slot, is investigated through numerical simulation and laboratory experiments. Although similar slots are designed and employed for aircraft, a special slot with a reversed flow direction is drilled into the leading edge of a sample wind turbine airfoil to study its influence on aerodynamic performance. The objective is to vary the five main geometrical parameters of the slot and characterize the performance improvement of the new design under different operating conditions. A number of Design of Experiments and optimization studies are conducted to determine the slot configuration that maximizes the lift or lift-over-drag ratio. Results indicate that proper sizing and placement of the slot can improve the lift coefficient while having a negligible negative impact on drag. Some recommendations for future investigation of slots are proposed at the end. The performance of a horizontal-axis wind turbine blade equipped with a leading-edge slot is also studied, and it is concluded that slotted blades can generate about 10% more power than solid blades for the two operating conditions investigated. The good agreement between the CFD predictions and experimental data confirms the validity of the model and results.
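    For reference, the lift and drag coefficients optimized in the slot study are the standard nondimensional quantities:

```latex
C_L = \frac{L}{\tfrac{1}{2}\,\rho\, U_\infty^2\, S}, \qquad
C_D = \frac{D}{\tfrac{1}{2}\,\rho\, U_\infty^2\, S}
```

    where L and D are the lift and drag forces, ρ is the air density, U_∞ is the freestream speed, and S is the reference (planform) area; the optimization studies target C_L or the ratio C_L/C_D.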

  16. Storage element performance optimization for CMS analysis jobs

    NASA Astrophysics Data System (ADS)

    Behrmann, G.; Dahlblom, J.; Guldmyr, J.; Happonen, K.; Lindén, T.

    2012-12-01

Tier-2 computing sites in the Worldwide LHC Computing Grid (WLCG) host CPU resources (Compute Element, CE) and storage resources (Storage Element, SE). The vast amount of data that needs to be processed from the Large Hadron Collider (LHC) experiments requires good and efficient use of the available resources. Achieving good CPU efficiency for end users' analysis jobs requires that the performance of the storage system scales with the I/O requests from hundreds or even thousands of simultaneous jobs. In this presentation we report on the work on improving the SE performance at the Helsinki Institute of Physics (HIP) Tier-2 used for the Compact Muon Solenoid (CMS) experiment at the LHC. Statistics from CMS grid jobs are collected and stored in the CMS Dashboard for further analysis, which allows for easy performance monitoring by the sites and by the CMS collaboration. As part of the monitoring framework CMS used the JobRobot, which sent 100 analysis jobs to each site every four hours; CMS also uses the HammerCloud tool for site monitoring and stress testing, and it has since replaced the JobRobot. The performance of the analysis workflow submitted with JobRobot or HammerCloud can be used to track the effect of site configuration changes, since the analysis workflow is kept the same for all sites and for months at a time. The CPU efficiency of the JobRobot jobs at HIP was increased by approximately 50% to more than 90% by tuning the SE and through improvements in the CMSSW and dCache software. The performance of the CMS analysis jobs improved significantly too. Similar work has been done at other CMS Tier sites, and on average the CPU efficiency of CMSSW jobs increased during 2011. Better monitoring of the SE allows faster detection of problems, so that the performance level can be kept high.
The next storage upgrade at HIP consists of SAS disk enclosures which can be stress tested on demand with HammerCloud workflows, to make sure that the I/O performance is good.
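The CPU efficiency tracked here is conventionally the ratio of CPU time to wall-clock time summed over jobs; an I/O-bound job spends wall time waiting on the SE without consuming CPU. A minimal sketch (the job records below are hypothetical, not the CMS Dashboard API):

```python
# Sketch: aggregate CPU efficiency for a batch of analysis jobs.
# Field names are illustrative; the CMS Dashboard records similar quantities.

def cpu_efficiency(jobs):
    """Overall CPU efficiency = total CPU time / total wall-clock time."""
    total_cpu = sum(j["cpu_s"] for j in jobs)
    total_wall = sum(j["wall_s"] for j in jobs)
    return total_cpu / total_wall if total_wall else 0.0

jobs = [
    {"cpu_s": 3100, "wall_s": 3600},   # job stalled on storage I/O
    {"cpu_s": 3500, "wall_s": 3600},   # job served by a well-tuned SE
]
print(f"{cpu_efficiency(jobs):.1%}")
```

A site whose SE scales with concurrent reads keeps `wall_s` close to `cpu_s`, which is exactly the movement from ~50% toward >90% described above.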

  17. Advances towards high performance low-torque qmin > 2 operations with large-radius ITB on DIII-D

    NASA Astrophysics Data System (ADS)

    Xu, G. S.; Solomon, W. M.; Garofalo, A. M.; Ferron, J. R.; Hyatt, A. W.; Wang, Q.; Yan, Z.; McKee, G. R.; Holcomb, C. T.; EAST Team

    2015-11-01

A joint DIII-D/EAST experiment was performed, aimed at extending a fully noninductive scenario with high βP and qmin > 2 to inductive operation at lower torque and higher Ip (0.6 → 0.8 MA) for better performance. Extremely high confinement was obtained, i.e., H98y2 ~ 2.1 at βN ~ 3, which was associated with a strong ITB at large minor radius (ρ ~ 0.7). Alfvén Eigenmodes and broadband turbulence were significantly suppressed in the core, and fast-ion confinement was improved. ITB collapses at 0.8 MA were induced by ELM-triggered n = 1 MHD modes at the ITB location, which is different from the "relaxation oscillations" associated with the steady-state plasmas at lower current (0.6 MA). This successful joint experiment may open up a new avenue towards high performance low-torque qmin > 2 plasmas with large-radius ITBs, which will be demonstrated on EAST in the near future. Work supported by NMCFSP 2015GB102000, 2015GB110001 and the US DOE under DE-AC02-09CH11466, DE-FC02-04ER54698, DE-FG02-89ER53296 and DE-AC52-07NA27344.

  18. A design-build-test cycle using modeling and experiments reveals interdependencies between upper glycolysis and xylose uptake in recombinant S. cerevisiae and improves predictive capabilities of large-scale kinetic models.

    PubMed

    Miskovic, Ljubisa; Alff-Tuomala, Susanne; Soh, Keng Cher; Barth, Dorothee; Salusjärvi, Laura; Pitkänen, Juha-Pekka; Ruohonen, Laura; Penttilä, Merja; Hatzimanikatis, Vassily

    2017-01-01

    Recent advancements in omics measurement technologies have led to an ever-increasing amount of available experimental data that necessitate systems-oriented methodologies for efficient and systematic integration of data into consistent large-scale kinetic models. These models can help us to uncover new insights into cellular physiology and also to assist in the rational design of bioreactor or fermentation processes. The Optimization and Risk Analysis of Complex Living Entities (ORACLE) framework for the construction of large-scale kinetic models can be used as guidance for formulating alternative metabolic engineering strategies. We used ORACLE in a metabolic engineering problem: improvement of the xylose uptake rate during mixed glucose-xylose consumption in a recombinant Saccharomyces cerevisiae strain. Using the data from bioreactor fermentations, we characterized network flux and concentration profiles representing possible physiological states of the analyzed strain. We then identified enzymes that could lead to improved flux through xylose transporters (XTR). For some of the identified enzymes, including hexokinase (HXK), we could not deduce whether their control over XTR was positive or negative. We thus performed a follow-up experiment, and we found that HXK2 deletion improves the xylose uptake rate. The data from the performed experiments were then used to prune the kinetic models, and the predictions of the pruned population of kinetic models were in agreement with the experimental data collected on the HXK2-deficient S. cerevisiae strain. We present a design-build-test cycle composed of modeling efforts and experiments with a glucose-xylose co-utilizing recombinant S. cerevisiae and its HXK2-deficient mutant that allowed us to uncover interdependencies between upper glycolysis and the xylose uptake pathway. Through this cycle, we also obtained kinetic models with improved prediction capabilities.
The present study demonstrates the potential of integrated "modeling and experiments" systems biology approaches that can be applied for diverse applications ranging from biotechnology to drug discovery.

  19. Improving collaboration between large and small-medium enterprises in automobile production

    NASA Astrophysics Data System (ADS)

    Sung, Soyoung; Kim, Yanghoon; Chang, Hangbae

    2018-01-01

    Inter-organisational collaboration is important for achieving qualitative and quantitative performance improvement in the global competitive environment. In particular, the extent of collaboration between the mother company and its suppliers is important for the profitability and sustainability of a company in the automobile industry, which operates on a customisation and order-based production system. The empirical analysis in this study shows that constructing an IT collaboration system shortens the collaborative information-sharing cycle and widens the collaborative information-sharing scope, thereby improving the level of collaboration.

  20. Effects of Milk vs Dark Chocolate Consumption on Visual Acuity and Contrast Sensitivity Within 2 Hours: A Randomized Clinical Trial.

    PubMed

    Rabin, Jeff C; Karunathilake, Nirmani; Patrizi, Korey

    2018-04-26

    Consumption of dark chocolate can improve blood flow, mood, and cognition in the short term, but little is known about the possible effects of dark chocolate on visual performance. To compare the short-term effects of consumption of dark chocolate with those of milk chocolate on visual acuity and large- and small-letter contrast sensitivity. A randomized, single-masked crossover design was used to assess short-term visual performance after consumption of a dark or a milk chocolate bar. Thirty participants without pathologic eye disease each consumed dark and milk chocolate in separate sessions, and within-participant paired comparisons were used to assess outcomes. Testing was conducted at the Rosenberg School of Optometry from June 25 to August 15, 2017. Visual acuity (in logMAR units) and large- and small-letter contrast sensitivity (in the log of the inverse of the minimum detectable contrast [logCS units]) were measured 1.75 hours after consumption of dark and milk chocolate bars. Among the 30 participants (9 men and 21 women; mean [SD] age, 26 [5] years), small-letter contrast sensitivity was significantly higher after consumption of dark chocolate (mean [SE], 1.45 [0.04] logCS) vs milk chocolate (mean [SE], 1.30 [0.05] logCS; mean improvement, 0.15 logCS [95% CI, 0.08-0.22 logCS]; P < .001). Large-letter contrast sensitivity was slightly higher after consumption of dark chocolate (mean [SE], 2.05 [0.02] logCS) vs milk chocolate (mean [SE], 2.00 [0.02] logCS; mean improvement, 0.05 logCS [95% CI, 0.00-0.10 logCS]; P = .07). Visual acuity improved slightly after consumption of dark chocolate (mean [SE], -0.22 [0.01] logMAR; visual acuity, approximately 20/12) and milk chocolate (mean [SE], -0.18 [0.01] logMAR; visual acuity, approximately 20/15; mean improvement, 0.04 logMAR [95% CI, 0.02-0.06 logMAR]; P = .05). 
Composite scores combining results from all tests showed significant improvement after consumption of dark compared with milk chocolate (mean improvement, 0.20 log U [95% CI, 0.10-0.30 log U]; P < .001). Contrast sensitivity and visual acuity were significantly higher 2 hours after consumption of a dark chocolate bar compared with a milk chocolate bar, but the duration of these effects and their influence on real-world performance await further testing. clinicaltrials.gov Identifier: NCT03326934.

  1. Application of the Systematic Sensor Selection Strategy for Turbofan Engine Diagnostics

    NASA Technical Reports Server (NTRS)

    Sowers, T. Shane; Kopasakis, George; Simon, Donald L.

    2008-01-01

    The data acquired from available system sensors forms the foundation upon which any health management system is based, and the available sensor suite directly impacts the overall diagnostic performance that can be achieved. While additional sensors may provide improved fault diagnostic performance, there are other factors that also need to be considered such as instrumentation cost, weight, and reliability. A systematic sensor selection approach is desired to perform sensor selection from a holistic system-level perspective as opposed to making decisions in an ad hoc or heuristic fashion. The Systematic Sensor Selection Strategy is a methodology that optimally selects a sensor suite from a pool of sensors based on the system fault diagnostic approach, with the ability to take cost, weight, and reliability into consideration. This procedure was applied to a large commercial turbofan engine simulation. In this initial study, sensor suites tailored for improved diagnostic performance are constructed from a prescribed collection of candidate sensors. The diagnostic performance of the best performing sensor suites in terms of fault detection and identification is demonstrated, with a discussion of the results and implications for future research.
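The abstract describes selecting a sensor suite by trading diagnostic merit against cost and weight. The actual Systematic Sensor Selection Strategy optimizes against the full fault diagnostic system; the sketch below only illustrates the flavor of the problem with a greedy heuristic over hypothetical per-sensor merit scores and costs (none of these numbers come from the NASA study):

```python
# Simplified greedy sketch of sensor-suite selection: maximize a
# diagnostic merit score subject to an instrumentation-cost budget.
# Sensor names, merits, and costs are illustrative assumptions.

def select_suite(sensors, budget):
    """Greedily add sensors by merit/cost ratio while the budget allows.

    sensors: {name: (merit, cost)}; returns the chosen suite in pick order."""
    suite, spent = [], 0.0
    remaining = dict(sensors)
    while remaining:
        name = max(remaining, key=lambda s: remaining[s][0] / remaining[s][1])
        merit, cost = remaining.pop(name)
        if spent + cost <= budget:
            suite.append(name)
            spent += cost
    return suite

sensors = {"N1": (0.9, 1.0), "N2": (0.8, 1.0), "T48": (0.7, 2.0), "P25": (0.3, 1.5)}
print(select_suite(sensors, budget=3.0))
```

A greedy pass like this is only a baseline; a holistic method evaluates candidate suites against the diagnostic algorithm itself rather than scoring sensors independently, which is the point the abstract makes.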

  3. Clinical outcomes of the high-performance membrane dialyzer.

    PubMed

    Koda, Yutaka

    2011-01-01

    HPM (high-performance membrane or high-flux membrane) has better biocompatibility and a higher capacity to remove retention solutes of large molecular weight, which have been shown to be toxic, especially to cardiovascular and skeletal organs. To date, several non-randomized observational studies have shown a reduction in morbidity and mortality in HPM-treated patients compared with low-flux conventional membranes. Meanwhile, two randomized controlled trials were unable to show a survival benefit of high-flux membranes for all-cause mortality, but suggested a significant benefit in subgroup or post-hoc analyses for patients with diabetes, hypoalbuminemia, and a long duration of prior dialysis. Thus, the results of the published studies are conflicting, and it remains unclear whether the effect is based on the biocompatibility of the membrane, on differences in the clearance of middle molecules, or on the microbiological purity of the dialysate, which improved simultaneously with the flux increment. As survival outcome might be determined by multiple additional confounding factors, dialysis-related or not, investigations to control for them are difficult to perform. Although the clinical results are inconclusive and it is still unanswered how much large-molecule removal is required to improve outcomes in routine clinical practice, there is considerable biological plausibility for high-flux dialysis and middle-molecule removal. Further trials will be required to confirm which patient groups benefit the most, the magnitude of the advantages, and how much removal of which sizes of molecules is acceptable using advanced high-performance dialyzers. Spreading the hazardous effects of a low-quality therapy should be taken more seriously than practicing a high-quality therapy of uncertain superiority. Copyright © 2011 S. Karger AG, Basel.

  4. Medical imaging informatics based solutions for human performance analytics

    NASA Astrophysics Data System (ADS)

    Verma, Sneha; McNitt-Gray, Jill; Liu, Brent J.

    2018-03-01

    For human performance analysis, extensive experimental trials are often conducted to identify the underlying cause or long-term consequences of certain pathologies and to improve motor function by examining the movement patterns of affected individuals. Data collected for human performance analysis include high-speed video, surveys, spreadsheets, force data recordings from instrumented surfaces, etc. These datasets are recorded from various standalone sources and are therefore captured in different folder structures and varying formats depending on the hardware configurations. Data integration and synchronization therefore present a major challenge when handling these multimedia datasets, particularly at large scale. Another challenge faced by researchers is querying large quantities of unstructured data and designing feedback/reporting tools for users who need to use the datasets at various levels. In the past, database server storage solutions have been introduced to securely store these datasets. However, automating the upload of raw files requires various file manipulation steps; in the current workflow, this file manipulation and structuring is done manually, which is not feasible for large amounts of data. By attaching metadata files and data dictionaries to these raw datasets, however, they can provide the information and structure needed for automated server upload. We introduce one such system for metadata creation for unstructured multimedia data based on the DICOM data model design. We discuss the design and implementation of this system and evaluate it with a dataset collected for a movement analysis study. The broader aim of this paper is to present the solution space achievable with medical imaging informatics design and methods for improving the workflow of human performance analysis in a biomechanics research lab.
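The DICOM-inspired idea of pairing each raw capture with structured metadata can be sketched as a "sidecar" file that an upload service could ingest without manual structuring. Everything below is a hypothetical illustration: the field names echo DICOM concepts (e.g. PatientID, Modality) but are not actual DICOM tags, and the file layout is invented:

```python
# Hypothetical sketch: write a DICOM-inspired metadata sidecar next to a raw
# capture file so an automated upload service can ingest it without manual
# restructuring. Field names are illustrative, not real DICOM tags.
import hashlib
import json
import pathlib
import tempfile

def make_sidecar(path, subject_id, modality, trial):
    path = pathlib.Path(path)
    meta = {
        "SubjectID": subject_id,       # analogous to DICOM PatientID
        "Modality": modality,          # e.g. "HighSpeedVideo", "ForcePlate"
        "TrialNumber": trial,
        "SourceFile": path.name,
        "SHA256": hashlib.sha256(path.read_bytes()).hexdigest(),  # integrity
    }
    out = path.with_suffix(".meta.json")
    out.write_text(json.dumps(meta, indent=2))
    return out

# demo on a throwaway force-plate recording
tmp = pathlib.Path(tempfile.mkdtemp()) / "trial01.csv"
tmp.write_text("t,Fx,Fy\n0.0,12.1,3.4\n")
sidecar = make_sidecar(tmp, subject_id="S01", modality="ForcePlate", trial=1)
print(sidecar.read_text())
```

The checksum lets the server verify that the raw multimedia file arrived intact, and the structured fields give it enough context to file the data automatically.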

  5. SemiBoost: boosting for semi-supervised learning.

    PubMed

    Mallapragada, Pavan Kumar; Jin, Rong; Jain, Anil K; Liu, Yi

    2009-11-01

    Semi-supervised learning has attracted a significant amount of attention in pattern recognition and machine learning. Most previous studies have focused on designing special algorithms to effectively exploit the unlabeled data in conjunction with labeled data. Our goal is to improve the classification accuracy of any given supervised learning algorithm by using the available unlabeled examples. We call this the semi-supervised improvement problem, to distinguish the proposed approach from the existing approaches. We design a meta-semi-supervised learning algorithm that wraps around the underlying supervised algorithm and improves its performance using unlabeled data. This problem is particularly important when we need to train a supervised learning algorithm with a limited number of labeled examples and a multitude of unlabeled examples. We present a boosting framework for semi-supervised learning, termed SemiBoost. The key advantages of the proposed semi-supervised learning approach are: 1) performance improvement of any supervised learning algorithm with a multitude of unlabeled data, 2) efficient computation by the iterative boosting algorithm, and 3) exploiting both the manifold and cluster assumptions in training classification models. An empirical study on 16 different data sets and text categorization demonstrates that the proposed framework improves the performance of several commonly used supervised learning algorithms, given a large number of unlabeled examples. We also show that the performance of the proposed algorithm, SemiBoost, is comparable to state-of-the-art semi-supervised learning algorithms.
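The "wrapper" idea can be illustrated with a much simpler relative of SemiBoost: confidence-based self-training, where the base learner repeatedly pseudo-labels the unlabeled point it is most certain about. This sketch is not the SemiBoost algorithm itself (which combines pairwise similarity with boosting weights); the 1-D nearest-centroid base learner and the data are toys:

```python
# Illustrative wrapper that improves a supervised base learner with unlabeled
# data via confidence-based pseudo-labeling. A simplification of the wrapper
# idea only; SemiBoost proper uses pairwise similarity and boosting.

def centroid_fit(X, y):
    """Base learner: per-class mean of 1-D points."""
    groups = {}
    for xi, yi in zip(X, y):
        groups.setdefault(yi, []).append(xi)
    return {c: sum(v) / len(v) for c, v in groups.items()}

def centroid_predict(model, x):
    return min(model, key=lambda c: abs(model[c] - x))

def self_train(X_l, y_l, X_u, rounds=3):
    X_l, y_l, X_u = list(X_l), list(y_l), list(X_u)
    for _ in range(rounds):
        model = centroid_fit(X_l, y_l)
        if not X_u:
            break
        # pseudo-label the unlabeled point closest to any centroid (highest
        # confidence) and move it into the labeled set
        best = min(X_u, key=lambda x: min(abs(model[c] - x) for c in model))
        X_u.remove(best)
        X_l.append(best)
        y_l.append(centroid_predict(model, best))
    return centroid_fit(X_l, y_l)

model = self_train([0.0, 10.0], [0, 1], [0.5, 9.5, 1.0])
print(centroid_predict(model, 2.0), centroid_predict(model, 8.0))
```

With only one labeled example per class, the absorbed unlabeled points pull the centroids toward the true class regions, which is the mechanism by which unlabeled data improves the supervised base learner.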

  6. Improved variance estimation of classification performance via reduction of bias caused by small sample size.

    PubMed

    Wickenberg-Bolin, Ulrika; Göransson, Hanna; Fryknäs, Mårten; Gustafsson, Mats G; Isaksson, Anders

    2006-03-13

    Supervised learning for classification of cancer employs a set of design examples to learn how to discriminate between tumors. In practice it is crucial to confirm that the classifier is robust with good generalization performance to new examples, or at least that it performs better than random guessing. A suggested alternative is to obtain a confidence interval of the error rate using repeated design and test sets selected from the available examples. However, it is known that even in the ideal situation of repeated designs and tests with completely novel samples in each cycle, a small test set size leads to a large bias in the estimate of the true variance between design sets. Therefore, different methods for small-sample performance estimation, such as the recently proposed Repeated Random Sampling (RSS) procedure, are also expected to result in heavily biased estimates, which in turn translate into biased confidence intervals. Here we explore such biases and develop a refined algorithm called Repeated Independent Design and Test (RIDT). Our simulations reveal that repeated designs and tests based on resampling within a fixed bag of samples yield a biased variance estimate. We also demonstrate that it is possible to obtain an improved variance estimate by means of a procedure that explicitly models how this bias depends on the number of samples used for testing. For the special case of repeated designs and tests using new samples for each design and test, we present an exact analytical expression for how the expected value of the bias decreases with the size of the test set. We show that, via modeling and subsequent reduction of the small-sample bias, it is possible to obtain an improved estimate of the variance of classifier performance between design sets. However, the uncertainty of the variance estimate is large in the simulations performed, indicating that the method in its present form cannot be directly applied to small data sets.
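The resampling scheme under discussion, repeatedly splitting a fixed bag of samples into design and test sets and recording the test error of each cycle, can be sketched as follows. The between-split variance computed this way is exactly the quantity the article argues is biased for small test sets; the 1-D threshold classifier and Gaussian data here are toys, not the article's setup:

```python
# Sketch of repeated design/test error estimation from a fixed sample bag.
# Classifier and data are illustrative; the point is the procedure.
import random
import statistics

def threshold_classifier(design):
    """'Train' a 1-D classifier: threshold at the midpoint of class means."""
    m0 = statistics.mean(x for x, y in design if y == 0)
    m1 = statistics.mean(x for x, y in design if y == 1)
    t = (m0 + m1) / 2
    return lambda x: int(x > t)

def repeated_random_sampling(data, n_test, n_repeats, seed=0):
    rng = random.Random(seed)
    errors = []
    for _ in range(n_repeats):
        shuffled = data[:]
        rng.shuffle(shuffled)
        test, design = shuffled[:n_test], shuffled[n_test:]
        clf = threshold_classifier(design)
        errors.append(sum(clf(x) != y for x, y in test) / n_test)
    # mean error and the (biased, per the article) between-split variance
    return statistics.mean(errors), statistics.variance(errors)

rng = random.Random(1)
data = ([(rng.gauss(0, 1), 0) for _ in range(40)]
        + [(rng.gauss(2, 1), 1) for _ in range(40)])
mean_err, var_err = repeated_random_sampling(data, n_test=10, n_repeats=200)
print(f"mean error {mean_err:.3f}, between-split variance {var_err:.4f}")
```

Because the design sets overlap heavily across cycles, the splits are not independent, which is the source of the bias that RIDT models and corrects.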

  7. Genetic relationships between feed efficiency in growing males and beef cow performance.

    PubMed

    Crowley, J J; Evans, R D; Mc Hugh, N; Kenny, D A; McGee, M; Crews, D H; Berry, D P

    2011-11-01

    Most studies on feed efficiency in beef cattle have focused on performance in young animals despite the contribution of the cow herd to overall profitability of beef production systems. The objective of this study was to quantify, using a large data set, the genetic covariances between feed efficiency in growing animals measured in a performance-test station, and beef cow performance including fertility, survival, calving traits, BW, maternal weaning weight, cow price, and cull cow carcass characteristics in commercial herds. Feed efficiency data were available on 2,605 purebred bulls from 1 test station. Records on cow performance were available on up to 94,936 crossbred beef cows. Genetic covariances were estimated using animal and animal-dam linear mixed models. Results showed that selection for feed efficiency, defined as feed conversion ratio (FCR) or residual BW gain (RG), improved maternal weaning weight as evidenced by the respective genetic correlations of -0.61 and 0.57. Despite residual feed intake (RFI) being phenotypically independent of BW, a negative genetic correlation existed between RFI and cow BW (-0.23; although the SE of 0.31 was large). None of the feed efficiency traits were correlated with fertility, calving difficulty, or perinatal mortality. However, genetic correlations estimated between age at first calving and FCR (-0.55 ± 0.14), Kleiber ratio (0.33 ± 0.15), RFI (-0.29 ± 0.14), residual BW gain (0.36 ± 0.15), and relative growth rate (0.37 ± 0.15) all suggest that selection for improved efficiency may delay the age at first calving, and we speculate, using information from other studies, that this may be due to a delay in the onset of puberty. Results from this study, based on the estimated genetic correlations, suggest that selection for improved feed efficiency will have no deleterious effect on cow performance traits with the exception of delaying the age at first calving.

  8. Improving large class performance and engagement through student-generated question banks.

    PubMed

    Hancock, Dale; Hare, Nicole; Denny, Paul; Denyer, Gareth

    2018-03-12

    Disciplines such as Biochemistry and Molecular Biology, which involve concepts not included in the high-school curriculum, are very challenging for many first-year university students. These subjects are particularly difficult for students accustomed to surface learning strategies involving memorization and recall of facts, as a deeper understanding of the relationships between concepts is needed for successful transfer to related areas and subsequent study. In this article, we explore an activity in a very large first-year Molecular Biology course, in which students create multiple-choice questions related to targeted learning outcomes, and then answer and evaluate one another's questions. This activity encompasses elements of both self- and peer-assessment, and the generative tasks of creating questions and producing written feedback may contribute to a deeper understanding of the material. We make use of a free online platform to facilitate all aspects of the process and analyze the effect of student engagement with the task on overall course performance. When compared to previous semesters' cohorts, we observe a pronounced improvement in class performance on exam questions targeting similar concepts to the student-generated questions. In addition, those students who engage to a greater extent with the activity perform significantly better on the targeted exam questions than those who are less active, yet all students perform similarly on a set of isolated control questions appearing on the same exam. © 2018 The International Union of Biochemistry and Molecular Biology.

  9. Greenland Inland Traverse (GrIT): 2010 Mobility Performance and Implications

    DTIC Science & Technology

    2011-10-01

    solar irradiance were also measured. The right-hand bladder sled (Fig. 5), designated Sled2, had black-rubber covers (EPDM roofing material) wrapped... Sled2 had thin, black-rubber covers over the bladders to increase solar gain. Some performance improvement... Pole Station from McMurdo Station, a distance of 1030 miles, using large, rubber-track tractors to haul fuel and cargo over the snow on flexible

  10. Efficient genotype compression and analysis of large genetic variation datasets

    PubMed Central

    Layer, Ryan M.; Kindlon, Neil; Karczewski, Konrad J.; Quinlan, Aaron R.

    2015-01-01

    Genotype Query Tools (GQT) is a new indexing strategy that expedites analyses of genome variation datasets in VCF format based on sample genotypes, phenotypes and relationships. GQT’s compressed genotype index minimizes decompression for analysis, and performance relative to existing methods improves with cohort size. We show substantial (up to 443-fold) performance gains over existing methods and demonstrate GQT’s utility for exploring massive datasets involving thousands to millions of genomes. PMID:26550772
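The core idea of a genotype index like this is to turn per-sample genotype columns into bitmaps so that cohort queries become bitwise operations rather than VCF decompression. The toy below sketches only that idea; GQT's actual index additionally applies word-aligned compression to the bitmaps, and the genotype vector here is invented:

```python
# Toy sketch of bitmap genotype indexing: for one variant, keep one bitmap
# per genotype class (0=hom-ref, 1=het, 2=hom-alt) across samples, so a query
# like "which samples carry an alt allele" is a bitwise OR plus a popcount.
# GQT's real index adds word-aligned compression on top of such bitmaps.

def index_variant(genotypes):
    """genotypes: list of 0/1/2 per sample -> {genotype class: int bitmap}."""
    maps = {0: 0, 1: 0, 2: 0}
    for i, g in enumerate(genotypes):
        maps[g] |= 1 << i            # set bit i in that class's bitmap
    return maps

genos = [0, 1, 2, 0, 1, 0, 2, 2]     # eight samples at one variant
idx = index_variant(genos)
carriers = idx[1] | idx[2]           # samples with at least one alt allele
print(bin(carriers), bin(carriers).count("1"))
```

Because the bitmaps never need to be expanded back into per-sample genotype calls, this style of query gets relatively cheaper as the cohort grows, matching the scaling behavior described in the abstract.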

  11. What can isolated skeletal muscle experiments tell us about the effects of caffeine on exercise performance?

    PubMed Central

    Tallis, Jason; Duncan, Michael J; James, Rob S

    2015-01-01

    Caffeine is an increasingly popular nutritional supplement due to the legal, significant improvements in sporting performance that it has been documented to elicit, with minimal side effects. Therefore, the effects of caffeine on human performance continue to be a popular area of research as we strive to improve our understanding of this drug and make more precise recommendations for its use in sport. Although variations in exercise intensity seem to affect its ergogenic benefits, it is largely thought that caffeine can induce significant improvements in endurance, power and strength-based activities. There are a number of limitations to testing caffeine-induced effects on human performance that can be better controlled when investigating its effects on isolated muscles under in vitro conditions. The hydrophobic nature of caffeine results in a post-digestion distribution to all tissues of the body making it difficult to accurately quantify its key mechanism of action. This review considers the contribution of evidence from isolated muscle studies to our understanding of the direct effects of caffeine on muscle during human performance. The body of in vitro evidence presented suggests that caffeine can directly potentiate skeletal muscle force, work and power, which may be important contributors to the performance-enhancing effects seen in humans. PMID:25988508

  12. Improving Physical Task Performance with Counterfactual and Prefactual Thinking

    PubMed Central

    Hammell, Cecilia; Chan, Amy Y. C.

    2016-01-01

    Counterfactual thinking (reflecting on “what might have been”) has been shown to enhance future performance by translating information about past mistakes into plans for future action. Prefactual thinking (imagining “what might be if…”) may serve a greater preparative function than counterfactual thinking as it is future-orientated and focuses on more controllable features, thus providing a practical script to prime future behaviour. However, whether or not this difference in hypothetical thought content may translate into a difference in actual task performance has been largely unexamined. In Experiment 1 (n = 42), participants performed trials of a computer-simulated physical task, in between which they engaged in either task-related hypothetical thinking (counterfactual or prefactual) or an unrelated filler task (control). As hypothesised, prefactuals contained more controllable features than counterfactuals. Moreover, participants who engaged in either form of hypothetical thinking improved significantly in task performance over trials compared to participants in the control group. The difference in thought content between counterfactuals and prefactuals, however, did not yield a significant difference in performance improvement. Experiment 2 (n = 42) replicated these findings in a dynamic balance task environment. Together, these findings provide further evidence for the preparatory function of counterfactuals, and demonstrate that prefactuals share this same functional characteristic. PMID:27942041

  13. How do Stability Corrections Perform in the Stable Boundary Layer Over Snow?

    NASA Astrophysics Data System (ADS)

    Schlögl, Sebastian; Lehning, Michael; Nishimura, Kouichi; Huwald, Hendrik; Cullen, Nicolas J.; Mott, Rebecca

    2017-10-01

    We assess sensible heat-flux parametrizations in stable conditions over snow surfaces by testing and developing stability correction functions for two alpine and two polar test sites. Five turbulence datasets are analyzed with respect to (a) the validity of the Monin-Obukhov similarity theory, (b) the model performance of well-established stability corrections, and (c) the development of new univariate and multivariate stability corrections. Using a wide range of stability corrections reveals an overestimation of the turbulent sensible heat flux for high wind speeds and a generally poor performance of all investigated functions for large temperature differences between the snow and the atmosphere above (>10 K). Applying the Monin-Obukhov bulk formulation introduces a mean absolute error in the sensible heat flux of 6 W m^{-2} (compared with heat fluxes calculated directly from eddy covariance). The stability corrections produce an additional error between 1 and 5 W m^{-2}, with the smallest error among published stability corrections found for the Holtslag scheme. We confirm from previous studies that stability corrections need improvements for large temperature differences and wind speeds, where sensible heat fluxes are distinctly overestimated. Under these atmospheric conditions our newly developed stability corrections slightly improve the model performance. However, the differences between stability corrections are typically small when compared to the residual error, which stems from the Monin-Obukhov bulk formulation.
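The bulk formulation being evaluated can be sketched as H = ρ c_p C_H U (T_s − T_a), where the transfer coefficient C_H is built from log profiles corrected for stability. The sketch below uses the simple log-linear stable form ψ = −5ζ (a Businger-Dyer-type correction, not the Holtslag scheme tested in the article), takes the stability parameter ζ = z/L as given rather than solving iteratively for the Obukhov length, and uses illustrative roughness lengths for snow:

```python
# Hedged sketch of the Monin-Obukhov bulk formula for sensible heat flux over
# snow, with a log-linear stable correction psi = -5*zeta. zeta = z/L is an
# input here; in practice L must be solved iteratively. Roughness lengths
# z0m, z0h are illustrative values for a snow surface.
import math

RHO, CP, KARMAN = 1.2, 1005.0, 0.4   # air density, heat capacity, von Karman

def sensible_heat_flux(u, t_surf, t_air, z=2.0, z0m=1e-3, z0h=1e-4, zeta=0.0):
    psi = -5.0 * zeta                # log-linear stability correction (stable)
    c_h = KARMAN**2 / ((math.log(z / z0m) - psi) * (math.log(z / z0h) - psi))
    return RHO * CP * c_h * u * (t_surf - t_air)   # W m^-2; negative = downward

# neutral vs. stable: the correction damps the downward flux over cold snow
print(sensible_heat_flux(5.0, 263.15, 268.15, zeta=0.0))
print(sensible_heat_flux(5.0, 263.15, 268.15, zeta=0.3))
```

Increasing ζ shrinks C_H and hence the flux magnitude, which is why the choice of correction function matters most precisely in the strongly stable, large temperature-difference regime where the article finds all schemes overestimate the flux.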

  14. Nature-Inspired Capillary-Driven Welding Process for Boosting Metal-Oxide Nanofiber Electronics.

    PubMed

    Meng, You; Lou, Kaihua; Qi, Rui; Guo, Zidong; Shin, Byoungchul; Liu, Guoxia; Shan, Fukai

    2018-06-20

    Recently, semiconducting nanofiber networks (NFNs) have been considered one of the most promising platforms for large-area and low-cost electronics applications. However, the high contact resistance among stacked nanofibers has remained a major challenge, leading to poor device performance and parasitic energy consumption. In this report, a controllable welding technique for NFNs was successfully demonstrated via a bioinspired capillary-driven process. The interfiber connections were achieved via a cooperative concept combining localized capillary condensation and curvature-induced surface diffusion. With the improved interfiber connections, the welded NFNs exhibited enhanced mechanical properties and high electrical performance. Field-effect transistors (FETs) based on the welded Hf-doped In2O3 (InHfO) NFNs were demonstrated for the first time. Meanwhile, the mechanisms involved in grain-boundary modulation for polycrystalline metal-oxide nanofibers are discussed. When high-k ZrOx dielectric thin films were integrated into the FETs, the field-effect mobility and operating voltage were further improved to 25 cm2 V-1 s-1 and 3 V, respectively. This is one of the best device performances among reported nanofiber-based FETs. These results demonstrate the potential of the capillary-driven welding process and the grain-boundary modulation mechanism for metal-oxide NFNs, which could be applicable to high-performance, large-scale, and low-power functional electronics.

  15. Strengths, challenges, and opportunities for hydrothermal pretreatment in lignocellulosic biorefineries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Bin; Tao, Ling; Wyman, Charles E.

    Pretreatment prior to or during biological conversion is required to achieve the high sugar yields essential to economic production of fuels and chemicals from low-cost, abundant lignocellulosic biomass. Aqueous thermochemical pretreatments achieve this performance objective from pretreatment coupled with subsequent enzymatic hydrolysis, but chemical pretreatment can also suffer from additional costs for exotic materials of construction, the need to recover or neutralize the chemicals, the introduction of compounds that inhibit downstream operations, and waste disposal, as well as for the chemicals themselves. The simplicity of hydrothermal pretreatment with just hot water offers the potential to greatly improve the cost of the entire conversion process if sugar degradation during pretreatment, production of un-fermentable oligomers, and the amount of expensive enzymes needed to obtain satisfactory yields from hydrothermally pretreated solids can be reduced. Biorefinery economics would also benefit if value could be generated from lignin and other components that are currently fated to be burned for power. However, achieving these goals will no doubt require development of advanced hydrothermal pretreatment configurations. For example, passing water through a stationary bed of lignocellulosic biomass in a flowthrough configuration achieves very high yields of hemicellulose sugars, removes more than 75% of the lignin for potential valorization, and improves sugar release from the pretreated solids with lower enzyme loadings. Unfortunately, the large quantities of water needed to achieve this performance result in very dilute sugars, high energy costs for pretreatment and product recovery, and large amounts of oligomers.
Furthermore, improving our understanding of hydrothermal pretreatment fundamentals is needed to gain insights into R&D opportunities to improve performance and to help identify novel configurations that lower capital and operating costs and achieve higher yields.

  16. Strengths, challenges, and opportunities for hydrothermal pretreatment in lignocellulosic biorefineries

    DOE PAGES

    Yang, Bin; Tao, Ling; Wyman, Charles E.

    2017-10-11

    Pretreatment prior to or during biological conversion is required to achieve the high sugar yields essential to economic production of fuels and chemicals from low-cost, abundant lignocellulosic biomass. Aqueous thermochemical pretreatments achieve this performance objective when coupled with subsequent enzymatic hydrolysis, but chemical pretreatment can also suffer from additional costs for exotic materials of construction, the need to recover or neutralize the chemicals, introduction of compounds that inhibit downstream operations, and waste disposal, as well as for the chemicals themselves. The simplicity of hydrothermal pretreatment with just hot water offers the potential to greatly improve the cost of the entire conversion process if sugar degradation during pretreatment, production of unfermentable oligomers, and the amount of expensive enzymes needed to obtain satisfactory yields from hydrothermally pretreated solids can be reduced. Biorefinery economics would also benefit if value could be generated from lignin and other components that are currently fated to be burned for power. However, achieving these goals will no doubt require development of advanced hydrothermal pretreatment configurations. For example, passing water through a stationary bed of lignocellulosic biomass in a flowthrough configuration achieves very high yields of hemicellulose sugars, removes more than 75% of the lignin for potential valorization, and improves sugar release from the pretreated solids at lower enzyme loadings. Unfortunately, the large quantities of water needed to achieve this performance result in very dilute sugars, high energy costs for pretreatment and product recovery, and large amounts of oligomers. Furthermore, improved understanding of hydrothermal pretreatment fundamentals is needed to gain insights into R&D opportunities to improve performance and to help identify novel configurations that lower capital and operating costs and achieve higher yields.

  17. Optomechanical design and analysis of a self-adaptive mounting method for optimizing phase matching of large potassium dihydrogen phosphate converter

    NASA Astrophysics Data System (ADS)

    Zhang, Zheng; Tian, Menjiya; Quan, Xusong; Pei, Guoqing; Wang, Hui; Liu, Tianye; Long, Kai; Xiong, Zhao; Rong, Yiming

    2017-11-01

    Surface control and phase matching of large laser conversion optics are urgent requirements and huge challenges in high-power solid-state laser facilities. A self-adaptive, nanocompensating mounting configuration for a large-aperture potassium dihydrogen phosphate (KDP) frequency doubler is proposed based on a lever-type surface correction mechanism. A mechanical, numerical, and optical model is developed and employed to evaluate the comprehensive performance of this mounting method. The results validate the method's advantages in surface adjustment and phase-matching improvement. In addition, the optimal value of the modulation force is determined through a series of simulations and calculations.

  18. Real-time simulation of large-scale floods

    NASA Astrophysics Data System (ADS)

    Liu, Q.; Qin, Y.; Li, G. D.; Liu, Z.; Cheng, D. J.; Zhao, Y. H.

    2016-08-01

    Given the complexity of real-time water conditions, real-time simulation of large-scale floods is very important in flood prevention practice. Model robustness and running efficiency are two critical factors in successful real-time flood simulation. This paper proposes a robust, two-dimensional shallow water model based on the unstructured Godunov-type finite volume method. A robust wet/dry front method is used to enhance numerical stability, and an adaptive method is proposed to improve running efficiency. The proposed model is applied to large-scale flood simulation on real topography; results compared with those of MIKE21 show the strong performance of the proposed model.
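The Godunov-type finite volume update with a wet/dry guard can be illustrated with a minimal 1D sketch (first-order Rusanov flux on a periodic grid; the paper's model is 2D, unstructured, and includes an adaptive method, none of which is reproduced here):

```python
import math

G = 9.81          # gravitational acceleration (m/s^2)
DRY_TOL = 1e-6    # depth threshold below which a cell is treated as dry

def flux(h, hu):
    """Physical flux of the 1D shallow water equations."""
    u = hu / h if h > DRY_TOL else 0.0
    return hu, hu * u + 0.5 * G * h * h

def rusanov(hL, huL, hR, huR):
    """Rusanov (local Lax-Friedrichs) numerical flux with a wet/dry guard."""
    uL = huL / hL if hL > DRY_TOL else 0.0
    uR = huR / hR if hR > DRY_TOL else 0.0
    a = max(abs(uL) + math.sqrt(G * max(hL, 0.0)),
            abs(uR) + math.sqrt(G * max(hR, 0.0)))
    fL, gL = flux(hL, huL)
    fR, gR = flux(hR, huR)
    return (0.5 * (fL + fR) - 0.5 * a * (hR - hL),
            0.5 * (gL + gR) - 0.5 * a * (huR - huL))

def step(h, hu, dx, dt):
    """One first-order Godunov-type update on a periodic 1D grid."""
    n = len(h)
    fh, fhu = [0.0] * n, [0.0] * n
    for i in range(n):  # flux through the interface between cells i-1 and i
        fh[i], fhu[i] = rusanov(h[i - 1], hu[i - 1], h[i], hu[i])
    hn, hun = h[:], hu[:]
    for i in range(n):
        jr = (i + 1) % n
        hn[i] -= dt / dx * (fh[jr] - fh[i])
        hun[i] -= dt / dx * (fhu[jr] - fhu[i])
        if hn[i] < DRY_TOL:          # dry cell: clip depth, zero momentum
            hn[i], hun[i] = max(hn[i], 0.0), 0.0
    return hn, hun

# Dam-break initial condition on a periodic domain
n, dx, dt = 100, 1.0, 0.05
h = [2.0 if i < n // 2 else 1.0 for i in range(n)]
hu = [0.0] * n
mass0 = sum(h) * dx
for _ in range(50):
    h, hu = step(h, hu, dx, dt)
```

Because the update is conservative and the boundaries are periodic, total water mass is preserved to floating-point accuracy, one of the robustness properties such schemes are designed around.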

  19. Large guanidinium cation mixed with methylammonium in lead iodide perovskites for 19% efficient solar cells

    NASA Astrophysics Data System (ADS)

    Jodlowski, Alexander D.; Roldán-Carmona, Cristina; Grancini, Giulia; Salado, Manuel; Ralaiarisoa, Maryline; Ahmad, Shahzada; Koch, Norbert; Camacho, Luis; de Miguel, Gustavo; Nazeeruddin, Mohammad Khaja

    2017-12-01

    Organic-inorganic lead halide perovskites have shown photovoltaic performances above 20% in a range of solar cell architectures while offering simple and low-cost processability. Despite the multiple ionic compositions that have been reported so far, the presence of organic constituents is an essential element in all of the high-efficiency formulations, with the methylammonium and formamidinium cations being the sole efficient options available to date. In this study, we demonstrate improved material stability after the incorporation of a large organic cation, guanidinium, into the MAPbI3 crystal structure, which delivers average power conversion efficiencies over 19%, and stabilized performance for 1,000 h under continuous light illumination, a fundamental step within the perovskite field.

  20. Recent progress in high-mobility thin-film transistors based on multilayer 2D materials

    NASA Astrophysics Data System (ADS)

    Hong, Young Ki; Liu, Na; Yin, Demin; Hong, Seongin; Kim, Dong Hak; Kim, Sunkook; Choi, Woong; Yoon, Youngki

    2017-04-01

    Two-dimensional (2D) layered semiconductors are emerging as promising candidates for next-generation thin-film electronics because of their high mobility, relatively large bandgap, low-power switching, and the availability of large-area growth methods. Thin-film transistors (TFTs) based on multilayer transition metal dichalcogenides or black phosphorus offer unique opportunities for next-generation electronic and optoelectronic devices. Here, we review recent progress in high-mobility transistors based on multilayer 2D semiconductors. We describe the theoretical background and methods for characterizing TFT performance and material properties, followed by their applications in flexible, transparent, and optoelectronic devices. Finally, we highlight some of the methods used in metal-semiconductor contacts, hybrid structures, heterostructures, and chemical doping to improve device performance.

  1. Automated Decomposition of Model-based Learning Problems

    NASA Technical Reports Server (NTRS)

    Williams, Brian C.; Millar, Bill

    1996-01-01

    A new generation of sensor-rich, massively distributed autonomous systems is being developed that has the potential for unprecedented performance, such as smart buildings, reconfigurable factories, adaptive traffic systems, and remote earth ecosystem monitoring. To achieve high performance, these massive systems will need to accurately model themselves and their environment from sensor information. Accomplishing this on a grand scale requires automating the art of large-scale modeling. This paper presents a formalization of decompositional model-based learning (DML), a method developed by observing a modeler's expertise at decomposing large-scale model estimation tasks. The method exploits a striking analogy between learning and consistency-based diagnosis. Moriarty, an implementation of DML, has been applied to thermal modeling of a smart building, demonstrating a significant improvement in learning rate.

  2. Position-dependent performance of copper phthalocyanine based field-effect transistors by gold nanoparticles modification.

    PubMed

    Luo, Xiao; Li, Yao; Lv, Wenli; Zhao, Feiyu; Sun, Lei; Peng, Yingquan; Wen, Zhanwei; Zhong, Junkang; Zhang, Jianping

    2015-01-21

    The facile fabrication and characterization of copper phthalocyanine (CuPc)-based organic field-effect transistors (OFETs) modified with gold nanoparticles (Au NPs) is reported, achieving highly improved performance. The effect of Au NPs located at three different positions, that is, at the SiO2/CuPc interface (device B), embedded in the middle of the CuPc layer (device C), and on top of the CuPc layer (device D), is investigated; the results show that device D has the best performance. Compared with the device without Au NPs (reference device A), device D displays an improvement in field-effect mobility (μsat) from 1.65 × 10⁻³ to 5.51 × 10⁻³ cm² V⁻¹ s⁻¹, and its threshold voltage decreases from −23.24 to −16.12 V. A strategy for improving the performance of CuPc-based OFETs, with large field-effect mobility and saturation drain current, is thus developed on the basis of nanoscale Au modification. A model in which FET operation forms an additional electron-transport channel at the Au NPs/CuPc interface is proposed to explain the observed performance improvement. The optimum CuPc thickness is confirmed to be about 50 nm in the present study. Device-to-device uniformity and time stability are discussed for future applications.
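Saturation-regime mobilities like the μsat values quoted above are conventionally extracted from the slope of √I_D versus V_G, using I_D,sat = (W·μ·C_i / 2L)·(V_G − V_T)². A minimal sketch of that extraction follows; the geometry, capacitance, bias range, and "true" values below are hypothetical placeholders, not the paper's device parameters:

```python
import math

# Hypothetical device parameters (illustrative only, not from the paper):
W, L = 1e-3, 50e-6          # channel width and length (m)
C_i = 1.15e-4               # gate capacitance per area (F/m^2), ~300 nm SiO2
mu_true = 5.5e-7            # assumed mobility (m^2 V^-1 s^-1 = 5.5e-3 cm^2/Vs)
V_T = -16.0                 # threshold voltage (V), p-type device

# Synthetic saturation-regime transfer curve: I_D = k (V_T - V_G)^2
k = W * mu_true * C_i / (2 * L)
vg = [-20.0 - i for i in range(21)]          # V_G swept from -20 V to -40 V
id_sat = [k * (V_T - v) ** 2 for v in vg]

# Least-squares slope of sqrt(I_D) vs V_G
y = [math.sqrt(i) for i in id_sat]
n = len(vg)
mx, my = sum(vg) / n, sum(y) / n
slope = sum((v - mx) * (t - my) for v, t in zip(vg, y)) \
        / sum((v - mx) ** 2 for v in vg)

# Recover the mobility from the fitted slope
mu_sat = 2 * L / (W * C_i) * slope ** 2      # m^2 V^-1 s^-1
mu_sat_cm2 = mu_sat * 1e4                    # convert to cm^2 V^-1 s^-1
```

On exactly quadratic data the fit returns the assumed mobility, which is how reported μsat values are typically checked for self-consistency.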

  3. Development of low-cost technology for the next generation of high efficiency solar cells composed of earth abundant elements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agrawal, Rakesh

    2014-09-28

    The development of renewable, affordable, and environmentally conscious means of generating energy on a global scale represents a grand challenge of our time. Owing to the “permanence” of radiation from the sun, solar energy promises to remain a viable and sustainable power source far into the future. Established single-junction photovoltaic technologies achieve high power conversion efficiencies (pce) near 20% but require complicated manufacturing processes that prohibit the marriage of large-scale throughput (e.g., on the GW scale), profitability, and quality control. Our approach to this problem begins with the synthesis of nanocrystals of semiconductor materials comprising earth-abundant elements and characterized by material and optoelectronic properties ideal for photovoltaic applications, namely Cu2ZnSn(S,Se)4 (CZTSSe). Once synthesized, these nanocrystals are formulated into an ink, coated onto substrates, and processed into completed solar cells in a way that enables scale-up to high-throughput, roll-to-roll manufacturing. This project aimed to address the major limitation on CZTSSe solar cell pce: the low open-circuit voltage (Voc) reported throughout the literature for devices made from this material. Over the course of the project, significant advances were made in the fundamental understanding of the CZTSSe material and its device limitations, and notable improvements were made to our nanocrystal-based processing technique to alleviate the identified performance limitations.
Notably, (1) significant improvements have been made in reducing intra- and inter-nanoparticle heterogeneity, (2) improvements in device performance have been realized with novel cation substitution in Ge-alloyed CZTGeSSe absorbers, (3) systematic analysis of absorber sintering has been conducted to optimize the selenization process for large-grain CZTSSe absorbers, (4) novel electrical characterization techniques have been developed to identify significant limitations of traditional electrical characterization of CZTSSe devices, and (5) the developed electrical analysis techniques have been used to identify the role that band-gap and electrostatic-potential fluctuations play in limiting device performance in this material system. The device modeling and characterization of CZTSSe undertaken in this project have significant implications for the CZTSSe research community, as the identified limitations due to potential fluctuations are expected to constrain high-efficiency CZTSSe devices fabricated by all current processing techniques. Additionally, the improvements realized through enhanced absorber processing conditions that minimize nanoparticle and large-grain absorber heterogeneity should benefit CZTSSe devices fabricated by all processing techniques. Ultimately, our research has indicated that improved CZTSSe performance will be achieved through novel absorber processing that minimizes defect formation, elemental losses, secondary-phase formation, and compositional non-uniformity in CZTSSe absorbers; we believe this processing can be achieved through nanocrystal-based fabrication of CZTSSe, which remains an active area of research at the conclusion of this award.
While significant fundamental understanding of CZTSSe and its performance limitations, as well as notable improvements in the processing of nanocrystal-based CZTSSe absorbers, have been achieved under this project, the limited two-year funding period prevented further significant advances in pce beyond those reported herein. As the characterization and modeling subtask of this project has been the main driving force for understanding device limitations, the conclusions of this analysis have only recently been applied to the processing of nanocrystal-based CZTSSe absorbers, with notable success. We expect the fundamental understanding of device limitations and absorber sintering achieved under this project to lead to significant improvements in device performance for CZTSSe devices fabricated by a variety of processing techniques in the near future.

  4. Improved PVDF membrane performance by doping extracellular polymeric substances of activated sludge.

    PubMed

    Guan, Yan-Fang; Huang, Bao-Cheng; Qian, Chen; Wang, Long-Fei; Yu, Han-Qing

    2017-04-15

    Polyvinylidene fluoride (PVDF) membrane has been widely applied in water and wastewater treatment because of its high mechanical strength, thermal stability, and chemical resistance. However, the hydrophobic nature of PVDF membrane makes it readily fouled, substantially reducing water flux and overall membrane rejection ability. In this work, an in-situ blending modifier, extracellular polymeric substances (EPS) from activated sludge, was used to enhance the anti-fouling ability of PVDF membrane. Results indicate that the pure water flux of the membrane and its anti-fouling performance were substantially improved by blending 8% EPS into the membrane. By introducing EPS, the membrane hydrophilicity was increased and the cross-section morphology was changed through interaction with polyvinyl pyrrolidone, resulting in the formation of large cavities below the finger-like pores. In addition, the fraction of pores with sizes of 100-500 nm increased, which was also beneficial to membrane performance. Surface thermodynamic calculations indicate that the EPS-functionalized membrane had a higher cohesion free energy, implying good pollutant rejection and anti-fouling ability. This work provides a simple, efficient, and cost-effective method to improve membrane performance and also extends the applications of EPS. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Optimizing agent-based transmission models for infectious diseases.

    PubMed

    Willem, Lander; Stijven, Sean; Tijskens, Engelbert; Beutels, Philippe; Hens, Niel; Broeckhove, Jan

    2015-06-02

    Infectious disease modeling and computational power have evolved such that large-scale agent-based models (ABMs) have become feasible. However, increasing hardware complexity requires adapted software designs to realize the full potential of current high-performance workstations. We found large performance differences, due to data locality, in a discrete-time ABM for close-contact disease transmission. Sorting the population according to social contact clusters reduced simulation time by a factor of two. Data locality and model performance can also be improved by storing person attributes separately instead of using person objects. Decreasing the number of operations by sorting people by health status before processing disease transmission also has a large impact on model performance. Depending on the clinical attack rate, target population, and computer hardware, introducing the sort phase decreased the run time by 26% to more than 70%. We investigated the application of parallel programming techniques and found that the speedup is significant but drops quickly with the number of cores. We observed that the effect of scheduling and workload chunk size is model specific and can make a large difference. Investment in performance optimization of ABM simulator code can lead to significant run-time reductions. The key steps are straightforward: choosing the data structure for the population and sorting people by health status before effecting disease propagation. We believe these conclusions to be valid for a wide range of infectious disease ABMs, and we recommend that future studies evaluate the impact of data management, algorithmic procedures, and parallelization on model performance.
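The two key optimizations described, attribute-wise (structure-of-arrays) storage and sorting by health status before the transmission step, can be sketched as follows. This is a minimal toy illustration, not the authors' simulator; the random-mixing contact model is a placeholder:

```python
import random

random.seed(1)  # deterministic for illustration

# Structure-of-arrays layout: person attributes live in parallel lists
# instead of one object per person, which improves data locality.
N = 10_000
state = [random.choice("SIR") for _ in range(N)]  # S(usceptible)/I(nfectious)/R(ecovered)
age = [random.randint(0, 90) for _ in range(N)]

# Sort people by health status so all infectious individuals are contiguous.
order = sorted(range(N), key=lambda i: state[i])
state = [state[i] for i in order]
age = [age[i] for i in order]

# "I" sorts before "R" and "S", so the infectious block is state[:n_inf].
n_inf = state.count("I")

# The transmission step now scans only the infectious slice instead of
# branching on every person's status inside the hot loop.
new_infections = 0
for i in range(n_inf):
    j = random.randrange(N)          # random contact (toy mixing model)
    if state[j] == "S":
        new_infections += 1
```

The payoff is that the per-step work scales with the number of infectious people rather than the whole population, and the sorted, attribute-wise lists keep the hot loop's memory accesses contiguous.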

  6. High-performance compression and double cryptography based on compressive ghost imaging with the fast Fourier transform

    NASA Astrophysics Data System (ADS)

    Leihong, Zhang; Zilan, Pan; Luying, Wu; Xiuhua, Ma

    2016-11-01

    To solve the problems that large images can hardly be retrieved under stringent hardware restrictions and that the security level is low, a method based on compressive ghost imaging (CGI) with the fast Fourier transform (FFT), named FFT-CGI, is proposed. Initially, the information is encrypted by the sender with the FFT, and the FFT-coded image is then encrypted by the CGI system with a secret key. The receiver decrypts the image with the aid of compressive sensing (CS) and the inverse FFT. Simulation results verify the feasibility, security, and compression of the proposed encryption scheme. The simulations suggest that the method improves the quality of large images compared with conventional ghost imaging and achieves imaging of large-sized images; that the amount of data transmitted is greatly reduced by the combination of compressive sensing and the FFT; and that the security level of ghost imaging is improved, as evaluated against ciphertext-only attack (COA), chosen-plaintext attack (CPA), and noise attack. This technique can be immediately applied to encryption and data storage, with the advantages of high security, fast transmission, and high quality of reconstructed information.
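The FFT-coding stage on the sender side and its inversion at the receiver can be illustrated in isolation. A naive DFT stands in for the FFT here, and the CGI encryption and compressive-sensing recovery steps between the two transforms are omitted; the signal values are arbitrary placeholders:

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (stands in for the FFT coding step)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    """Inverse transform applied by the receiver after CS recovery."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

# One row of the "image" to be transmitted (placeholder values)
signal = [0.0, 1.0, 2.0, 3.0, 2.0, 1.0, 0.0, 0.0]

coded = dft(signal)                          # sender: FFT-code before CGI encryption
recovered = [c.real for c in idft(coded)]    # receiver: invert after decryption
```

The round trip recovers the original data, which is the property the scheme relies on: the CGI layer only ever sees the Fourier-coded image.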

  7. Brute force meets Bruno force in parameter optimisation: introduction of novel constraints for parameter accuracy improvement by symbolic computation.

    PubMed

    Nakatsui, M; Horimoto, K; Lemaire, F; Ürgüplü, A; Sedoglavic, A; Boulier, F

    2011-09-01

    Recent remarkable advances in computer performance have enabled parameter values to be estimated by the sheer power of numerical computation, the so-called 'Brute force', resulting in high-speed simultaneous estimation of a large number of parameter values. However, these advances have not been fully utilised to improve the accuracy of parameter estimation. Here the authors review a novel method for parameter estimation using symbolic computation power, 'Bruno force', named after Bruno Buchberger, who discovered the Gröbner basis. In this method, objective functions are formulated by combining symbolic computation techniques. First, the authors utilise a symbolic computation technique, differential elimination, which symbolically reduces the system of differential equations in a given model to an equivalent system. Second, since this equivalent system frequently comprises large equations, it is further simplified by another symbolic computation. The performance of the authors' method for improving parameter accuracy is illustrated with two representative models in biology, a simple cascade model and a negative feedback model, in comparison with previous numerical methods. Finally, the limits and extensions of the authors' method are discussed in terms of the possible power of 'Bruno force' for the development of a new horizon in parameter estimation.

  8. [Audit system on quality of breast cancer diagnosis and treatment: results of quality indicators on screen-detected lesions in Italy, 2010].

    PubMed

    Ponti, Antonio; Mano, Maria Piera; Tomatis, Mariano; Baiocchi, Diego; Barca, Alessandra; Berti, Rosa; Bisanti, Luigi; Casella, Denise; Deandrea, Silvia; Delrio, Daria; Donati, Giovanni; Falcini, Fabio; Frammartino, Brunella; Frigerio, Alfonso; Mantellini, Paola; Naldoni, Carlo; Orzalesi, Lorenzo; Pagano, Giovanni; Pietribiasi, Francesca; Ravaioli, Alessandra; Sedda, Maria Laura; Taffurelli, Mario; Cataliotti, Luigi; Segnan, Nereo

    2012-01-01

    This survey, conducted by the Italian breast screening network (GISMa), collects yearly individual data on diagnosis and treatment for about 50% of all screen-detected, operated lesions in Italy. The 2010 results show good overall quality and an improving trend over time. Critical issues were identified, including waiting times and compliance with the recommendation not to perform frozen-section examination on small lesions. Preoperative diagnosis improved constantly over the years, but there is still large variation between regions and programmes. For almost 90% of screen-detected invasive cancers, the sentinel lymph node (SLN) technique was performed on the axilla, avoiding a large number of potentially harmful dissections. On the other hand, potential overuse of SLN for ductal carcinoma in situ deserves further investigation. The detailed results have been distributed, also by means of a web data warehouse, to regional and local screening programmes to allow multidisciplinary discussion and identification of appropriate solutions to the issues documented by the data. Priority should be given to the problem of waiting times. Specialist breast units with adequate case volume and sufficient resources would provide the best setting for making monitoring effective in producing quality improvements with shorter waiting times.

  9. Towards Scalable Graph Computation on Mobile Devices.

    PubMed

    Chen, Yiqi; Lin, Zhiyuan; Pienta, Robert; Kahng, Minsuk; Chau, Duen Horng

    2014-10-01

    Mobile devices have become increasingly central to our everyday activities, due to their portability, multi-touch capabilities, and ever-improving computational power. Such attractive features have spurred research interest in leveraging mobile devices for computation. We explore a novel approach that aims to use a single mobile device to perform scalable graph computation on large graphs that do not fit in the device's limited main memory, opening up the possibility of performing on-device analysis of large datasets without relying on the cloud. Based on the familiar memory-mapping capability provided by today's mobile operating systems, our approach to scaling up computation is powerful and intentionally kept simple to maximize its applicability across the iOS and Android platforms. Our experiments demonstrate that an iPad mini can perform fast computation on large real graphs with as many as 272 million edges (Google+ social graph), at a speed that is only a few times slower than a 13″ MacBook Pro. Through creating a real-world iOS app with this technique, we demonstrate the strong potential for scalable graph computation on a single mobile device using our approach.
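The memory-mapping idea can be illustrated with a minimal sketch (a hypothetical binary edge-list layout, not the authors' app): the operating system pages edges in on demand, so the full edge list never has to reside in the process's main memory at once.

```python
import mmap
import os
import struct
import tempfile

# Build a small binary edge list: each edge is two little-endian uint32s.
edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 0)]
path = os.path.join(tempfile.mkdtemp(), "edges.bin")
with open(path, "wb") as f:
    for u, v in edges:
        f.write(struct.pack("<II", u, v))

# Memory-map the file and stream over it. The OS faults pages in as they
# are touched, so a graph far larger than RAM can be scanned the same way.
degree = {}
with open(path, "rb") as f, \
        mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
    for off in range(0, len(mm), 8):          # 8 bytes per edge record
        u, v = struct.unpack_from("<II", mm, off)
        degree[u] = degree.get(u, 0) + 1
        degree[v] = degree.get(v, 0) + 1
```

The computation code is identical whether the file is 40 bytes or 40 GB, which is what makes the approach "intentionally kept simple".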

  10. Towards Scalable Graph Computation on Mobile Devices

    PubMed Central

    Chen, Yiqi; Lin, Zhiyuan; Pienta, Robert; Kahng, Minsuk; Chau, Duen Horng

    2015-01-01

    Mobile devices have become increasingly central to our everyday activities, due to their portability, multi-touch capabilities, and ever-improving computational power. Such attractive features have spurred research interest in leveraging mobile devices for computation. We explore a novel approach that aims to use a single mobile device to perform scalable graph computation on large graphs that do not fit in the device's limited main memory, opening up the possibility of performing on-device analysis of large datasets without relying on the cloud. Based on the familiar memory-mapping capability provided by today's mobile operating systems, our approach to scaling up computation is powerful and intentionally kept simple to maximize its applicability across the iOS and Android platforms. Our experiments demonstrate that an iPad mini can perform fast computation on large real graphs with as many as 272 million edges (Google+ social graph), at a speed that is only a few times slower than a 13″ MacBook Pro. Through creating a real-world iOS app with this technique, we demonstrate the strong potential for scalable graph computation on a single mobile device using our approach. PMID:25859564

  11. A priori and a posteriori investigations for developing large eddy simulations of multi-species turbulent mixing under high-pressure conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borghesi, Giulio; Bellan, Josette, E-mail: josette.bellan@jpl.nasa.gov; Jet Propulsion Laboratory, California Institute of Technology, Pasadena, California 91109-8099

    2015-03-15

    A Direct Numerical Simulation (DNS) database was created representing mixing of species under high-pressure conditions. The configuration considered is that of a temporally evolving mixing layer. The database was examined and analyzed for the purpose of modeling some of the unclosed terms that appear in the Large Eddy Simulation (LES) equations. Several metrics are used to understand the LES modeling requirements. First, a statistical analysis of the DNS-database large-scale flow structures was performed to provide a metric for probing the accuracy of the proposed LES models, as the flow fields obtained from accurate LESs should contain structures of morphology statistically similar to those observed in the filtered-and-coarsened DNS (FC-DNS) fields. To characterize the morphology of the large-scale structures, the Minkowski functionals of the iso-surfaces were evaluated for two different fields: the second invariant of the rate-of-deformation tensor and the irreversible entropy production rate. To remove the presence of the small flow scales, both of these fields were computed using the FC-DNS solutions. It was found that the large-scale structures of the irreversible entropy production rate exhibit higher morphological complexity than those of the second invariant of the rate-of-deformation tensor, indicating that the burden of modeling will be on recovering the thermodynamic fields. Second, to evaluate the physical effects which must be modeled at the subfilter scale, an a priori analysis was conducted. This a priori analysis, conducted in the coarse-grid LES regime, revealed that standard closures for the filtered pressure, the filtered heat flux, and the filtered species mass fluxes, in which a filtered function of a variable is equal to the function of the filtered variable, may no longer be valid for the high-pressure flows considered in this study.
The terms requiring modeling are the filtered pressure, the filtered heat flux, the filtered pressure work, and the filtered species mass fluxes. Improved models were developed based on a scale-similarity approach and were found to perform considerably better than the classical ones. These improved models were also assessed in an a posteriori study, in which different combinations of the standard models and the improved ones were tested. At the relatively small Reynolds numbers achievable in DNS and the relatively small filter widths used here, the standard models for the filtered pressure, the filtered heat flux, and the filtered species fluxes were found to yield accurate results for the morphology of the large-scale structures present in the flow. Analysis of the temporal evolution of several volume-averaged quantities representative of the mixing-layer growth, of the cross-stream variation of homogeneous-plane averages and second-order correlations, and of visualizations indicated that the models performed equivalently for the conditions of the simulations. The expectation is that at the much larger Reynolds numbers and much larger filter widths used in practical applications, the improved models will perform considerably more accurately than the standard ones.
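The closure issue probed by the a priori analysis can be stated compactly. Writing an overbar for the LES filter and a hat for a coarser test filter, and using a generic, illustrative Bardina-type form (not the authors' exact expressions):

```latex
% Standard closure tested in the a priori analysis: the filtered value of a
% nonlinear function f (pressure, heat flux, species mass flux) is taken to
% equal the function of the filtered variable,
\overline{f(\phi)} \;\approx\; f(\bar{\phi}).
% A generic scale-similarity correction estimates the subfilter contribution
% from the smallest resolved scales via the test filter:
\overline{f(\phi)} \;\approx\; f(\bar{\phi})
  + C\left[\widehat{f(\bar{\phi})} - f\!\left(\hat{\bar{\phi}}\right)\right],
% with C a similarity coefficient calibrated against the FC-DNS fields.
```

The first relation is exact only when f is linear; at high pressure the strongly nonlinear equation of state is what makes the correction term matter.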

  12. Performance verification of adaptive optics for satellite-to-ground coherent optical communications at large zenith angle.

    PubMed

    Chen, Mo; Liu, Chao; Rui, Daoman; Xian, Hao

    2018-02-19

    Despite urgent demand, applying coherent optical communication technology to satellite-to-ground data transmission remains a tremendous challenge, especially at large zenith angles, owing to atmospheric turbulence. Adaptive optics (AO) is a promising scheme to solve this problem. In this paper, we integrate AO into coherent laser communications and study the mixing efficiency and bit-error rate (BER) at different zenith angles. The analytical results show that increasing the zenith angle severely degrades the performance of coherent detection and raises the BER above 10⁻³, which is unacceptable. Simulations of coherent detection with AO compensation indicate that higher mixing efficiency and lower BER can be achieved by a coherent receiver with high-order AO compensation. An experiment correcting atmospheric-turbulence wavefront distortion with a 249-element AO system at large zenith angles was carried out. The results demonstrate that the AO system significantly improves the satellite-to-ground coherent optical communication system at large zenith angles, and also indicate that the 249-element AO system can meet the needs of coherent communication systems only at zenith angles smaller than 65° for the 1.8 m telescope under weak and moderate turbulence.

  13. Switching electrochromic performance improvement enabled by highly developed mesopores and oxygen vacancy defects of Fe-doped WO3 films

    NASA Astrophysics Data System (ADS)

    Koo, Bon-Ryul; Kim, Kue-Ho; Ahn, Hyo-Jin

    2018-09-01

    In recent years, owing to their capability to reversibly adjust transparency, reflection, and color under a low electric field, electrochromic devices (ECDs) have received extensive attention for potential use in optoelectronic applications. However, because the performance of ECDs, including coloration efficiency (CE, <30.0 cm2/C) and switching speed (>10.0 s), is still too low for practical use, critical efforts are needed to develop unique nanostructured films that improve electrochromic (EC) performance. In particular, as large-scale applications of ECDs (e.g., refrigerators, vehicles, and airplanes) have recently been developed, improving switching speed is urgently needed for commercialization of the devices. In this context, the present study reports a novel nanostructured film of Fe-doped WO3 with highly developed mesopores and oxygen-vacancy defects, fabricated using an Fe agent and a camphene-assisted sol-gel method. These films show remarkable EC performance, with both fast switching (2.8 s coloration and 0.3 s bleaching) and high CE (71.1 cm2/C), values that are outstanding compared with previously reported WO3-based materials and that arise from the synergistic effects of optimized Fe doping and camphene. The fast switching speed is attributed to the shortened Li+ diffusion pathway through the highly developed mesopores and to the improved electrical conductivity from the increased oxygen-vacancy defects, while the high CE results from efficient charge transport enabled by the more effective electroactive contact of the mesoporous morphology, yielding a large transmittance modulation for a small intercalated charge density.
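Coloration efficiency values like the one reported above follow the standard definition CE = ΔOD/Q = log10(T_bleached/T_colored)/Q. The arithmetic can be shown with illustrative transmittance and charge values (placeholders chosen only to demonstrate the formula, not the paper's measured data):

```python
import math

# Illustrative electrochromic measurement (hypothetical numbers):
T_bleached = 0.80        # transmittance in the bleached state
T_colored = 0.25         # transmittance in the colored state
Q = 7.1e-3               # intercalated charge density (C/cm^2)

delta_OD = math.log10(T_bleached / T_colored)   # optical density change
CE = delta_OD / Q                               # coloration efficiency (cm^2/C)
```

A high CE thus means a large transmittance modulation is obtained from a small intercalated charge density, exactly the trade-off the abstract describes.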

  14. Harnessing Diversity towards the Reconstruction of Large Scale Gene Regulatory Networks

    PubMed Central

    Yamanaka, Ryota; Kitano, Hiroaki

    2013-01-01

    Elucidating gene regulatory networks (GRNs) from large-scale experimental data remains a central challenge in systems biology. Recently, numerous techniques, particularly consensus-driven approaches that combine different algorithms, have emerged as a potentially promising strategy for inferring accurate GRNs. Here, we develop a novel consensus inference algorithm, TopkNet, which can integrate multiple algorithms to infer GRNs. Comprehensive performance benchmarking on a cloud computing framework demonstrated that (i) a simple strategy of combining many algorithms does not always lead to performance improvement relative to the cost of consensus and (ii) TopkNet, integrating only high-performance algorithms, provides significant performance improvement over the best individual algorithms and community prediction. These results suggest that a priori determination of high-performance algorithms is key to reconstructing an unknown regulatory network. Similarity among gene-expression datasets can help identify potentially optimal algorithms for reconstructing unknown regulatory networks: if the expression data associated with a known regulatory network are similar to those associated with an unknown one, the algorithms found optimal for the known network can be repurposed to infer the unknown network. Based on this observation, we developed a quantitative measure of similarity among gene-expression datasets and demonstrated that, if similarity between two expression datasets is high, TopkNet integrating the algorithms optimal for the known dataset performs well on the unknown dataset. The consensus framework TopkNet, together with the similarity measure proposed in this study, provides a powerful strategy for harnessing the wisdom of the crowds in the reconstruction of unknown regulatory networks. PMID:24278007
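    The core consensus idea can be illustrated with a rank-average combiner: each selected algorithm ranks candidate edges by its confidence score, and edges are ordered by mean rank across the chosen algorithms. This is a simplified sketch of a TopkNet-style consensus (the scores and algorithm names below are hypothetical, and the published method is more elaborate):

```python
from statistics import mean

def consensus_rank(edge_scores_per_algo, top_algos):
    """Rank-average consensus: each selected algorithm ranks all candidate
    edges by score; edges are then ordered by their mean rank. A didactic
    sketch of the consensus idea, not the published implementation."""
    ranks = {}
    for algo in top_algos:
        s = edge_scores_per_algo[algo]
        ordered = sorted(s, key=s.get, reverse=True)  # best score first
        for r, edge in enumerate(ordered, start=1):
            ranks.setdefault(edge, []).append(r)
    return sorted(ranks, key=lambda e: mean(ranks[e]))

# Hypothetical edge scores from three inference algorithms "A", "B", "C".
scores = {
    "A": {("g1", "g2"): 0.9, ("g2", "g3"): 0.4, ("g1", "g3"): 0.7},
    "B": {("g1", "g2"): 0.8, ("g2", "g3"): 0.6, ("g1", "g3"): 0.5},
    "C": {("g1", "g2"): 0.7, ("g2", "g3"): 0.9, ("g1", "g3"): 0.2},
}
print(consensus_rank(scores, top_algos=["A", "B", "C"]))
```

    Restricting `top_algos` to the algorithms known to perform well on a similar dataset is the paper's key finding: the consensus is only as good as the members it integrates.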

  15. A Novel Telemanipulated Robotic Assistant for Surgical Endoscopy: Preclinical Application to ESD.

    PubMed

    Zorn, Lucile; Nageotte, Florent; Zanne, Philippe; Legner, Andras; Dallemagne, Bernard; Marescaux, Jacques; de Mathelin, Michel

    2018-04-01

    Minimally invasive surgical interventions in the gastrointestinal tract, such as endoscopic submucosal dissection (ESD), are very difficult for surgeons when performed with standard flexible endoscopes. Robotic flexible systems have been identified as a solution to improve manipulation. However, only a few such systems have been brought to preclinical trials to date. As a result, novel robotic tools are required. We developed a telemanipulated robotic device, called STRAS, which aims to assist surgeons during intraluminal surgical endoscopy. This is a modular system, based on a flexible endoscope and flexible instruments, which provides 10 degrees of freedom (DoFs). The modularity allows the user to easily set up the robot and to navigate toward the operating area. The robot can then be teleoperated using master interfaces specifically designed to intuitively control all available DoFs. STRAS capabilities have been tested in laboratory conditions and during preclinical experiments. We report 12 colorectal ESDs performed in pigs, in which large lesions were successfully removed. Dissection speeds are compared with those obtained in similar conditions with the manual Anubiscope platform from Karl Storz, and we show significant improvements. These experiments show that STRAS (v2) provides sufficient DoFs, workspace, and force to perform ESD, that it allows a single surgeon to perform all the surgical tasks, and that performance is improved with respect to manual systems. The concepts developed for STRAS are validated and could provide surgeons with new tools to improve comfort, ease, and performance in intraluminal surgical endoscopy.

  16. Performance characteristics of NuVal and the Overall Nutritional Quality Index (ONQI).

    PubMed

    Katz, David L; Njike, Valentine Y; Rhee, Lauren Q; Reingold, Arthur; Ayoob, Keith T

    2010-04-01

    Improving diets has considerable potential to improve health, but progress in this area has been limited, and advice to increase fruit and vegetable intake has largely gone unheeded. Our objective was to test the performance characteristics of the Overall Nutritional Quality Index (ONQI), a tool designed to help improve dietary patterns one well-informed choice at a time. The ONQI was developed by a multidisciplinary group of nutrition and public health scientists independent of food industry interests and is the basis for the NuVal Nutritional Guidance System. Dietary guidelines, existing nutritional scoring systems, and other pertinent scientific literature were reviewed. An algorithm incorporating >30 entries representing both micro- and macronutrient properties of foods, as well as weighting coefficients representing epidemiologic associations between nutrients and health outcomes, was developed and subjected to consumer research and testing of performance characteristics. ONQI and expert panel rankings correlated highly (R = 0.92, P < 0.001). In consumer testing, approximately 80% of >800 study participants indicated that the ONQI would influence their purchase intent. ONQI scoring distinguished the more-healthful DASH (Dietary Approaches to Stop Hypertension) diet (mean score: 46) from the typical American diet according to the National Health and Nutrition Examination Survey (NHANES) 2003-2006 (mean score: 26.5; P < 0.01). In linear regression analysis of the NHANES 2003-2006 population (n = 15,900), the NuVal system was significantly associated with the Healthy Eating Index 2005 (P < 0.0001). Recently generated data from ongoing studies indicate favorable effects on purchase patterns and significant correlation with health outcomes in large cohorts of men and women followed for decades. NuVal offers universally applicable nutrition guidance that is independent of food industry interests and is supported by consumer research and scientific evaluation of its performance characteristics.

  17. PeakRanger: A cloud-enabled peak caller for ChIP-seq data

    PubMed Central

    2011-01-01

    Background Chromatin immunoprecipitation (ChIP), coupled with massively parallel short-read sequencing (seq), is used to probe chromatin dynamics. Although there are many algorithms to call peaks from ChIP-seq datasets, most are tuned either to handle punctate sites, such as transcription factor binding sites, or broad regions, such as histone modification marks; few can do both. Other algorithms are limited in their configurability, performance on large data sets, and ability to distinguish closely spaced peaks. Results In this paper, we introduce PeakRanger, a peak-caller software package that works equally well on punctate and broad sites, can resolve closely spaced peaks, has excellent performance, and is easily customized. In addition, PeakRanger can be run in a parallel cloud computing environment to obtain extremely high performance on very large data sets. We present a series of benchmarks evaluating PeakRanger against 10 other peak callers, and demonstrate the performance of PeakRanger on both real and synthetic data sets. We also present real-world usages of PeakRanger, including peak calling in the modENCODE project. Conclusions Compared to other peak callers tested, PeakRanger offers improved resolution in distinguishing extremely closely spaced peaks. PeakRanger has above-average spatial accuracy in terms of identifying the precise location of binding events. PeakRanger also has excellent sensitivity and specificity in all benchmarks evaluated. In addition, PeakRanger offers significant improvements in run time when running on a single-processor system, and very marked improvements when allowed to take advantage of the MapReduce parallel environment offered by a cloud computing resource. PeakRanger can be downloaded at the official site of the modENCODE project: http://www.modencode.org/software/ranger/ PMID:21554709
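    The essence of peak calling can be sketched with a toy threshold-based caller: contiguous runs of above-threshold coverage become peaks, each reporting its boundaries and summit. This is a didactic simplification only, not PeakRanger's actual algorithm (which adds statistical modeling and summit-splitting for closely spaced peaks):

```python
def call_peaks(coverage, threshold):
    """Toy peak caller: contiguous runs of per-base coverage above the
    threshold become peaks; each peak reports (start, end, summit)."""
    peaks, start = [], None
    for i, depth in enumerate(coverage):
        if depth > threshold and start is None:
            start = i  # entering a peak region
        elif depth <= threshold and start is not None:
            region = coverage[start:i]
            summit = start + region.index(max(region))
            peaks.append((start, i, summit))
            start = None
    if start is not None:  # peak extends to the end of the track
        region = coverage[start:]
        summit = start + region.index(max(region))
        peaks.append((start, len(coverage), summit))
    return peaks

# Hypothetical per-base read coverage with two separated enriched regions.
cov = [0, 1, 5, 9, 6, 1, 0, 2, 7, 7, 3, 0]
print(call_peaks(cov, threshold=2))
```

    Because each chromosome (or chunk of one) can be scanned independently, this style of computation parallelizes naturally, which is what lets a MapReduce deployment scale it to very large data sets.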

  18. First-order error budgeting for LUVOIR mission

    NASA Astrophysics Data System (ADS)

    Lightsey, Paul A.; Knight, J. Scott; Feinberg, Lee D.; Bolcar, Matthew R.; Shaklan, Stuart B.

    2017-09-01

    Future large astronomical telescopes in space will have architectures with complex and demanding requirements to meet their science goals. The Large UV/Optical/IR Surveyor (LUVOIR) mission concept being assessed by the NASA/Goddard Space Flight Center is expected to be 9 to 15 meters in diameter, have a segmented primary mirror, and be diffraction limited at a wavelength of 500 nanometers. The optical stability is expected to be in the picometer range over minutes to hours. Architecture studies to support the NASA Science and Technology Definition Teams (STDTs) are underway to evaluate the system performance improvements needed to meet the science goals. To help define the technology needs and assess performance, a first-order error budget has been developed. Like the JWST error budget, it includes the active, adaptive, and passive elements in the spatial and temporal domains. JWST performance is scaled using first-order approximations where appropriate, incorporating technical advances in telescope control.
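    The standard first-order way to roll contributors up such a budget is to treat them as independent and combine them by root-sum-square. A minimal sketch, where the allocation names and values are hypothetical placeholders rather than actual LUVOIR numbers:

```python
import math

def rss_budget(terms_nm):
    """Combine independent error terms (e.g. nm RMS wavefront error)
    by root-sum-square: total = sqrt(sum of squares)."""
    return math.sqrt(sum(t * t for t in terms_nm))

# Hypothetical wavefront-error allocation (nm RMS), for illustration only.
allocation = {
    "mirror figure": 20.0,
    "alignment": 15.0,
    "thermal drift": 10.0,
    "jitter": 8.0,
}
total = rss_budget(allocation.values())
print(f"total = {total:.1f} nm RMS")
```

    A budget built this way makes trade-offs explicit: tightening one contributor frees margin that can be reallocated to another while holding the diffraction-limited total fixed.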

  19. Approximate Computing Techniques for Iterative Graph Algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Panyala, Ajay R.; Subasi, Omer; Halappanavar, Mahantesh

    Approximate computing enables processing of large-scale graphs by trading off quality for performance. Approximate computing techniques have become critical not only due to the emergence of parallel architectures but also due to the availability of large-scale datasets enabling data-driven discovery. Using two prototypical graph algorithms, PageRank and community detection, we present several approximate computing heuristics to scale performance with minimal loss of accuracy. These heuristics include loop perforation, data caching, and incomplete graph coloring and synchronization, and we evaluate the efficiency of each. We demonstrate performance improvements of up to 83% for PageRank and up to 450x for community detection, with low impact on accuracy for both algorithms. We expect the proposed approximate techniques will enable scalable graph analytics on data of importance to several applications in science, and their subsequent adoption to scale similar graph algorithms.
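    Loop perforation means executing only a fraction of a loop's iterations and accepting the resulting approximation. A minimal sketch applied to power-iteration PageRank, where only every `perforation`-th vertex is recomputed per sweep; this illustrates the heuristic in general, not the paper's implementation:

```python
def pagerank(adj, damping=0.85, iters=50, perforation=1):
    """Power-iteration PageRank with loop perforation: each sweep updates
    only the vertices whose index matches the sweep's residue modulo
    `perforation`, trading accuracy for fewer updates per sweep.
    `adj[v]` lists the out-neighbors of vertex v."""
    n = len(adj)
    rank = [1.0 / n] * n
    out_deg = [len(adj[v]) or 1 for v in range(n)]  # avoid div-by-zero
    for it in range(iters):
        contrib = [rank[v] / out_deg[v] for v in range(n)]
        for u in range(n):
            if u % perforation != it % perforation:
                continue  # perforated: skip this vertex this sweep
            incoming = sum(contrib[v] for v in range(n) if u in adj[v])
            rank[u] = (1 - damping) / n + damping * incoming
    return rank

# Tiny example graph: 0 -> 1, 1 -> 0, 2 -> {0, 1}.
g = [[1], [0], [0, 1]]
print(pagerank(g))                  # exact sweeps
print(pagerank(g, perforation=2))   # half the updates per sweep
```

    With `perforation=2`, each sweep does roughly half the work; for well-behaved graphs the ranks still converge to nearly the same fixed point, which is the quality-for-performance trade the abstract describes.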

  20. Improved uniformity in high-performance organic photovoltaics enabled by (3-aminopropyl)triethoxysilane cathode functionalization.

    PubMed

    Luck, Kyle A; Shastry, Tejas A; Loser, Stephen; Ogien, Gabriel; Marks, Tobin J; Hersam, Mark C

    2013-12-28

    Organic photovoltaics have the potential to serve as lightweight, low-cost, mechanically flexible solar cells. However, losses in efficiency as laboratory cells are scaled up to the module level have to date impeded large-scale deployment. Here, we report that a 3-aminopropyltriethoxysilane (APTES) cathode interfacial treatment significantly enhances performance reproducibility in inverted high-efficiency PTB7:PC71BM organic photovoltaic cells, as demonstrated by the fabrication of 100 APTES-treated devices versus 100 untreated controls. The APTES-treated devices achieve a power conversion efficiency of 8.08 ± 0.12% with histogram skewness of -0.291, whereas the untreated controls achieve 7.80 ± 0.26% with histogram skewness of -1.86. By substantially suppressing the interfacial origins of underperforming cells, the APTES treatment offers a pathway for fabricating large-area modules with high spatial performance uniformity.
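    The histogram skewness cited here is the standardized third central moment; a strongly negative value flags a left tail of under-performing cells. A minimal sketch with hypothetical efficiency values (not the paper's data):

```python
def skewness(xs):
    """Sample skewness g1 = m3 / m2^(3/2), the standardized third central
    moment. Negative values indicate a left tail of low outliers."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n  # variance (population)
    m3 = sum((x - mean) ** 3 for x in xs) / n  # third central moment
    return m3 / m2 ** 1.5

# Hypothetical efficiency batches (%): a tight symmetric batch vs. one
# dragged left by a single under-performing cell.
treated = [8.0, 8.1, 8.2, 8.1, 8.0, 7.9]
control = [7.9, 8.0, 7.8, 6.5, 7.9, 8.0]
print(skewness(treated), skewness(control))
```

    Moving the skewness from -1.86 toward -0.291, as the APTES treatment does, means the low-efficiency tail has largely been suppressed even though the mean shifts only modestly.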

  1. Large bandgap reduced graphene oxide (rGO) based n-p + heterojunction photodetector with improved NIR performance

    NASA Astrophysics Data System (ADS)

    Singh, Manjri; Kumar, Gaurav; Prakash, Nisha; Khanna, Suraj P.; Pal, Prabir; Singh, Surinder P.

    2018-04-01

    Integration of two-dimensional reduced graphene oxide (rGO) with conventional Si semiconductor offers novel strategies for realizing broadband photodiode with enhanced device performance. In this quest, we have synthesized large bandgap rGO and fabricated metal-free broadband (300–1100 nm) back-to-back connected np-pn hybrid photodetector utilizing drop casted n-rGO/p +-Si heterojunctions with high performance in NIR region (830 nm). With controlled illumination, the device exhibited a peak responsivity of 16.7 A W‑1 and peak detectivity of 2.56 × 1012 Jones under 830 nm illumination (11 μW cm‑2) at 1 V applied bias with fast response (∼460 μs) and recovery time (∼446 μs). The fabricated device demonstrated excellent repeatability, durability and photoswitching behavior with high external quantum efficiency (∼2.5 × 103%), along with ultrasensitive behavior at low light conditions.

  2. Characterization of the Q-switched MOBLAS Laser Transmitter and Its Ranging Performance Relative to a PTM Q-switched System

    NASA Technical Reports Server (NTRS)

    Degnan, J. J., III; Zagwodski, T. W.

    1979-01-01

    A prototype Q-switched Nd:YAG laser transmitter intended for use in the NASA mobile laser ranging system was subjected to various tests of temporal pulse shape and stability, output energy and stability, beam divergence, and range bias errors. Peak to peak variations in the mean range were as large as 30 cm and drift rates of system bias with time as large as 6 mm per minute of operation were observed. The incorporation of a fast electro-optic cavity dump into the oscillator gave significantly improved results. Reevaluation of the ranging performance after modification showed a reduction in the peak to peak variation in the mean range to the 2 or 3 cm level and a drift rate of system time biases of less than 1 mm per minute of operation. A qualitative physical explanation for the superior performance of cavity dumped lasers is given.

  3. Upgrade of the MEG liquid xenon calorimeter with VUV-light sensitive large area SiPMs

    NASA Astrophysics Data System (ADS)

    Ieki, K.

    2016-07-01

    The MEG experiment searches for the muon lepton flavor violating decay, μ+ →e+ γ. An upgrade of the experiment is ongoing, aiming at reaching a sensitivity of Br(μ+ →e+ γ) = 4 ×10-14, an order of magnitude better than the sensitivity of the current MEG. To achieve this goal, all of the detectors are being upgraded. In MEG, the energy, position and timing of the gamma ray were measured by a liquid Xe calorimeter, which consists of 900 l of liquid Xe and 846 2-in. round-shaped photo-multiplier tubes (PMTs). In the upgrade, the granularity at the gamma-ray incident face will be improved by replacing 216 PMTs with 4092 SiPMs (MPPCs), each with an active area of 12×12 mm2. The energy resolution for the gamma ray is expected to improve by a factor of 2, because the efficiency of collecting scintillation light will become more uniform. The position resolution is also expected to improve by a factor of 2. In collaboration with Hamamatsu Photonics K.K., we have successfully developed a high-performance MPPC for our detector. It has excellent photon detection efficiency for liquid xenon scintillation light in the VUV range. The chips are large enough to cover a large area with a manageable number of readout channels. The characteristics of the MPPCs are being tested in liquid Xe and at room temperature. The results of these tests will be presented, together with the expected performance of the upgraded detector.

  4. Externally induced frontoparietal synchronization modulates network dynamics and enhances working memory performance.

    PubMed

    Violante, Ines R; Li, Lucia M; Carmichael, David W; Lorenz, Romy; Leech, Robert; Hampshire, Adam; Rothwell, John C; Sharp, David J

    2017-03-14

    Cognitive functions such as working memory (WM) are emergent properties of large-scale network interactions. Synchronisation of oscillatory activity might contribute to WM by enabling the coordination of long-range processes. However, causal evidence for the way oscillatory activity shapes network dynamics and behavior in humans is limited. Here we applied transcranial alternating current stimulation (tACS) to exogenously modulate oscillatory activity in a right frontoparietal network that supports WM. Externally induced synchronization improved performance when cognitive demands were high. Simultaneously collected fMRI data reveals tACS effects dependent on the relative phase of the stimulation and the internal cognitive processing state. Specifically, synchronous tACS during the verbal WM task increased parietal activity, which correlated with behavioral performance. Furthermore, functional connectivity results indicate that the relative phase of frontoparietal stimulation influences information flow within the WM network. Overall, our findings demonstrate a link between behavioral performance in a demanding WM task and large-scale brain synchronization.

  5. Externally induced frontoparietal synchronization modulates network dynamics and enhances working memory performance

    PubMed Central

    Violante, Ines R; Li, Lucia M; Carmichael, David W; Lorenz, Romy; Leech, Robert; Hampshire, Adam; Rothwell, John C; Sharp, David J

    2017-01-01

    Cognitive functions such as working memory (WM) are emergent properties of large-scale network interactions. Synchronisation of oscillatory activity might contribute to WM by enabling the coordination of long-range processes. However, causal evidence for the way oscillatory activity shapes network dynamics and behavior in humans is limited. Here we applied transcranial alternating current stimulation (tACS) to exogenously modulate oscillatory activity in a right frontoparietal network that supports WM. Externally induced synchronization improved performance when cognitive demands were high. Simultaneously collected fMRI data reveals tACS effects dependent on the relative phase of the stimulation and the internal cognitive processing state. Specifically, synchronous tACS during the verbal WM task increased parietal activity, which correlated with behavioral performance. Furthermore, functional connectivity results indicate that the relative phase of frontoparietal stimulation influences information flow within the WM network. Overall, our findings demonstrate a link between behavioral performance in a demanding WM task and large-scale brain synchronization. DOI: http://dx.doi.org/10.7554/eLife.22001.001 PMID:28288700

  6. Ten-year performance of the United States national elm trial

    Treesearch

    Jason J. Griffin; William R. Jacobi; E. Gregory McPherson; Clifford S. Sadof; James R. McKenna; Mark L. Gleason; Nicole Ward Gauthier; Daniel A. Potter; David R. Smitley; Gerard C. Adams; Ann Brooks Gould; Christian R. Cash; James A. Walla; Mark C. Starrett; Gary Chastagner; Jeff L. Sibley; Vera A. Krischik; Adam F. Newby

    2017-01-01

    Ulmus americana (American elm) was an important urban tree in North America prior to the introduction of the Dutch elm disease pathogen in 1930. Subsequently, urban and community forests were devastated by the loss of large canopies. Tree improvement programs produced disease tolerant American and Eurasian elm cultivars and introduced them into the...

  7. Foundations for Success: Case Studies of How Urban School Systems Improve Student Achievement [and] Abstract.

    ERIC Educational Resources Information Center

    Snipes, Jason; Doolittle, Fred; Herlihy, Corinne

    This report examines the experiences of three large urban school districts (and part of a fourth) that raised academic performance for their districts as a whole, while also reducing racial differences in achievement. Educational challenges included low achievement, political conflict, inexperienced teachers, low expectations, and lack of…

  8. Metacognitive Monitoring and Academic Performance in College

    ERIC Educational Resources Information Center

    Wagener, Bastien

    2016-01-01

    In French universities, only one out of two students is successful in his/her first year. The drastic changes in the organization of work and the greater emphasis put on self-regulated learning (relying on metacognition) can largely explain these low success rates. In this regard, techniques have been developed to help students improve monitoring…

  9. Leveraging Successful Collaborative Processes to Improve Performance Outcomes in Large-Scale Event Planning: Super Bowl, A Planned Homeland Security Event

    DTIC Science & Technology

    2010-03-01

    are turning out to be counterproductive because they are culturally anathema (Wollman, 2009). A consideration of the psychological tenets of Sigmund Freud suggests a principal aspect of dysfunction in collaboration. He reasoned that an Ego governed by social convention and a Superego governed by

  10. Altering Test Environments for Reducing Test Anxiety and for Improving Academic Performance.

    ERIC Educational Resources Information Center

    Bushnell, Don D.

    To test the effects of altering situational variables in stressful examinations on high test anxious and low test anxious undergraduates, mid-terms and final examinations were administered in two environmental settings: large lecture halls and small language laboratories. Mean test scores for high test anxious students in the language labs were…

  11. No One Way: Differentiating School District Leadership and Support for School Improvement

    ERIC Educational Resources Information Center

    Anderson, Stephen E.; Mascall, Blair; Stiegelbauer, Suzanne; Park, Jaddon

    2012-01-01

    This article examines findings from a qualitative investigation of how school district administrators in four mid- to large-sized urban school districts (10,000-50,000) identify and address differences in school performance. The analysis explores the interaction between district policies and actions that centralize and standardize expectations for…

  12. Data-Acquisition System With Remotely Adjustable Amplifiers

    NASA Technical Reports Server (NTRS)

    Nurge, Mark A.; Larson, William E.; Hallberg, Carl G.; Thayer, Steven W.; Ake, Jeffrey C.; Gleman, Stuart M.; Thompson, David L.; Medelius, Pedro J.; Crawford, Wayne A.; Vangilder, Richard M.

    1994-01-01

    An improved data-acquisition system with both centralized and decentralized characteristics has been developed. It provides an infrastructure for automating and standardizing the operation, maintenance, calibration, and adjustment of many transducers, and increases efficiency by reducing the need for a diminishing work force of highly trained technicians to perform routine tasks. Large industrial and academic laboratory facilities benefit from systems like this one.

  13. Statistical Techniques for Efficient Indexing and Retrieval of Document Images

    ERIC Educational Resources Information Center

    Bhardwaj, Anurag

    2010-01-01

    We have developed statistical techniques to improve the performance of document image search systems where the intermediate step of OCR based transcription is not used. Previous research in this area has largely focused on challenges pertaining to generation of small lexicons for processing handwritten documents and enhancement of poor quality…

  14. Improving the Physical and Social Environment of School: A Question of Equity

    ERIC Educational Resources Information Center

    Uline, Cynthia L.; Wolsey, Thomas DeVere; Tschannen-Moran, Megan; Lin, Chii-Dean

    2010-01-01

    This study explored the interplay between quality facilities and school climate, charting the effects of facility conditions on student and teacher attitudes, behaviors, and performance within schools slated for renovations in a large metropolitan school district. The research applied a school leadership-building design model to explore how six…

  15. Conway Street Apartments: A Multifamily Deep Energy Retrofit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aldrich, R.; Williamson, J.

    2014-11-01

    While single-family detached homes account for 63% of households (EIA 2009), multi-family homes account for a very large portion of the remaining housing stock, and this fraction is growing. Through recent research efforts, CARB has been evaluating strategies and technologies that can make dramatic improvements in energy performance in multi-family buildings.

  16. Campus Schools: The Search for Safe and Orderly Environment in Large School Settings

    ERIC Educational Resources Information Center

    Ortiz, Monica

    2012-01-01

    Establishing "new small schools" is a major focus of school improvement, especially at the high school level, with the hopes of increasing academic success and reducing violence. Key arguments for small schools are the personalization of schooling and increased academic performance. The structures and process of small schools are…

  17. Team Machine: A Decision Support System for Team Formation

    ERIC Educational Resources Information Center

    Bergey, Paul; King, Mark

    2014-01-01

    This paper reports on the cross-disciplinary research that resulted in a decision-support tool, Team Machine (TM), which was designed to create maximally diverse student teams. TM was used at a large United States university between 2004 and 2012, and resulted in significant improvement in the performance of student teams, superior overall balance…

  18. The Effect of Poverty on Student Achievement. Information Capsule. Volume 0901

    ERIC Educational Resources Information Center

    Blazer, Christie

    2009-01-01

    There is a strong relationship between students' socioeconomic status and their levels of academic achievement. Although educators should be held accountable for improving the performance of all students, including those living in poverty, schools alone can't eliminate the negative factors associated with poverty that lead to a large achievement…

  19. The New Century High Schools Initiative. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2008

    2008-01-01

    The "New Century High Schools Initiative" is a program designed to improve large, under-performing high schools by transforming them into small schools with links to community organizations. "New Century High Schools" each have about 400 students; the small size is intended to foster strong relationships between students and…

  20. North by Northwest: Quality Assurance and Evaluation Processes in European Education

    ERIC Educational Resources Information Center

    Grek, Sotiria; Lawn, Martin; Lingard, Bob; Varjo, Janne

    2009-01-01

    Governing processes in Europe and within Europeanization are often opaque and appearances can deceive. The normative practices of improvement in education, and the connected growth in performance measurement, have been largely understood in their own terms. However, the management of flows of information through quality assurance can be examined…
