Sample records for modest computational effort

  1. Identifying Differences between Depressed Adolescent Suicide Ideators and Attempters

    PubMed Central

    Auerbach, Randy P.; Millner, Alexander J.; Stewart, Jeremy G.; Esposito, Erika

    2015-01-01

    Background Adolescent depression and suicide are pressing public health concerns, and identifying key differences between suicide ideators and attempters is critical. The goal of the current study is to test whether depressed adolescent suicide attempters report greater anhedonia severity and exhibit aberrant effort-cost computations in the face of uncertainty. Methods Depressed adolescents (n = 101) ages 13–19 years were administered structured clinical interviews to assess current mental health disorders and a history of suicidality (suicide ideators = 55, suicide attempters = 46). Then, participants completed self-report instruments assessing symptoms of suicidal ideation, depression, anhedonia, and anxiety as well as a computerized effort-cost computation task. Results Compared with depressed adolescent suicide ideators, attempters report greater anhedonia severity, even after concurrently controlling for symptoms of suicidal ideation, depression, and anxiety. Additionally, when completing the effort-cost computation task, suicide attempters are less likely to pursue the difficult, high-value option when outcomes are uncertain. Follow-up, trial-level analyses of effort-cost computations suggest that receipt of reward does not influence future decision-making among suicide attempters; suicide ideators, however, exhibit a win-stay approach when receiving rewards on previous trials. Limitations Findings should be considered in light of limitations including a modest sample size, which limits generalizability, and the cross-sectional design. Conclusions Depressed adolescent suicide attempters are characterized by greater anhedonia severity, which may impair the ability to integrate previous rewarding experiences to inform future decisions. Taken together, this may generate a feeling of powerlessness that contributes to increased suicidality and a needless loss of life. PMID:26233323

  2. 3DHZETRN: Inhomogeneous Geometry Issues

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.

    2017-01-01

    Historical methods for assessing radiation exposure inside complicated geometries for space applications were limited by computational constraints and lack of knowledge associated with nuclear processes occurring over a broad range of particles and energies. Various methods were developed and utilized to simplify geometric representations and enable coupling with simplified but efficient particle transport codes. Recent transport code development efforts, leading to 3DHZETRN, now enable such approximate methods to be carefully assessed to determine if past exposure analyses and validation efforts based on those approximate methods need to be revisited. In this work, historical methods of representing inhomogeneous spacecraft geometry for radiation protection analysis are first reviewed. Two inhomogeneous geometry cases, previously studied with 3DHZETRN and Monte Carlo codes, are considered with various levels of geometric approximation. Fluence, dose, and dose equivalent values are computed in all cases and compared. It is found that although these historical geometry approximations can induce large errors in neutron fluences up to 100 MeV, errors on dose and dose equivalent are modest (<10%) for the cases studied here.

  3. A collaborative institutional model for integrating computer applications in the medical curriculum.

    PubMed Central

    Friedman, C. P.; Oxford, G. S.; Juliano, E. L.

    1991-01-01

    The introduction and promotion of information technology in an established medical curriculum with existing academic and technical support structures poses a number of challenges. The UNC School of Medicine has developed the Taskforce on Educational Applications in Medicine (TEAM), to coordinate this effort. TEAM works as a confederation of existing research and support units with interests in computers and education, along with a core of interested faculty with curricular responsibilities. Constituent units of the TEAM confederation include the medical center library, medical television studios, basic science teaching laboratories, educational development office, microcomputer and network support groups, academic affairs administration, and a subset of course directors and teaching faculty. Among our efforts have been the establishment of (1) a mini-grant program to support faculty-initiated development and implementation of computer applications in the curriculum, (2) a symposium series with visiting speakers to acquaint faculty with current developments in medical informatics and related curricular efforts at other institutions, (3) 20 computer workstations located in the multipurpose teaching labs where first and second year students do much of their academic work, (4) a demonstration center for evaluation of courseware and technologically advanced delivery systems. The student workstations provide convenient access to electronic mail, University schedules and calendars, the CoSy computer conferencing system, and several software applications integral to their courses in pathology, histology, microbiology, biochemistry, and neurobiology. The progress achieved toward the primary goal has modestly exceeded our initial expectations, while the collegiality and interest expressed toward TEAM activities in the local environment stand as empirical measures of the success of the concept. PMID:1807705

  4. Impact of scaffold rigidity on the design and evolution of an artificial Diels-Alderase

    PubMed Central

    Preiswerk, Nathalie; Beck, Tobias; Schulz, Jessica D.; Milovník, Peter; Mayer, Clemens; Siegel, Justin B.; Baker, David; Hilvert, Donald

    2014-01-01

    By combining targeted mutagenesis, computational refinement, and directed evolution, a modestly active, computationally designed Diels-Alderase was converted into the most proficient biocatalyst for [4+2] cycloadditions known. The high stereoselectivity and minimal product inhibition of the evolved enzyme enabled preparative scale synthesis of a single product diastereomer. X-ray crystallography of the enzyme–product complex shows that the molecular changes introduced over the course of optimization, including addition of a lid structure, gradually reshaped the pocket for more effective substrate preorganization and transition state stabilization. The good overall agreement between the experimental structure and the original design model with respect to the orientations of both the bound product and the catalytic side chains contrasts with other computationally designed enzymes. Because design accuracy appears to correlate with scaffold rigidity, improved control over backbone conformation will likely be the key to future efforts to design more efficient enzymes for diverse chemical reactions. PMID:24847076

  5. Calculation of the Hadronic Vacuum Polarization Disconnected Contribution to the Muon Anomalous Magnetic Moment

    NASA Astrophysics Data System (ADS)

    Blum, T.; Boyle, P. A.; Izubuchi, T.; Jin, L.; Jüttner, A.; Lehner, C.; Maltman, K.; Marinkovic, M.; Portelli, A.; Spraggs, M.; RBC; UKQCD Collaborations

    2016-06-01

    We report the first lattice QCD calculation of the hadronic vacuum polarization (HVP) disconnected contribution to the muon anomalous magnetic moment at physical pion mass. The calculation uses a refined noise-reduction technique that enables the control of statistical uncertainties at the desired level with modest computational effort. Measurements were performed on the 48³×96 physical-pion-mass lattice generated by the RBC and UKQCD Collaborations. We find the leading-order hadronic vacuum polarization a_μ^{HVP(LO)disc} = -9.6(3.3)(2.3)×10^{-10}, where the first error is statistical and the second systematic.

  6. Calculation of the Hadronic Vacuum Polarization Disconnected Contribution to the Muon Anomalous Magnetic Moment.

    PubMed

    Blum, T; Boyle, P A; Izubuchi, T; Jin, L; Jüttner, A; Lehner, C; Maltman, K; Marinkovic, M; Portelli, A; Spraggs, M

    2016-06-10

    We report the first lattice QCD calculation of the hadronic vacuum polarization (HVP) disconnected contribution to the muon anomalous magnetic moment at physical pion mass. The calculation uses a refined noise-reduction technique that enables the control of statistical uncertainties at the desired level with modest computational effort. Measurements were performed on the 48^{3}×96 physical-pion-mass lattice generated by the RBC and UKQCD Collaborations. We find the leading-order hadronic vacuum polarization a_{μ}^{HVP(LO)disc}=-9.6(3.3)(2.3)×10^{-10}, where the first error is statistical and the second systematic.

  7. Climatic response variability and machine learning: development of a modular technology framework for predicting bio-climatic change in pacific northwest ecosystems

    NASA Astrophysics Data System (ADS)

    Seamon, E.; Gessler, P. E.; Flathers, E.

    2015-12-01

    The creation and use of large amounts of data in scientific investigations has become common practice. Data collection and analysis for large scientific computing efforts are increasing not only in volume but also in number, and the methods and analysis procedures are evolving toward greater complexity (Bell, 2009, Clarke, 2009, Maimon, 2010). In addition, the growth of diverse data-intensive scientific computing efforts (Soni, 2011, Turner, 2014, Wu, 2008) has demonstrated the value of supporting scientific data integration. Efforts to bridge the gap between these perspectives have been attempted, in varying degrees, with modular scientific computing analysis regimes implemented with a modest amount of success (Perez, 2009). This constellation of effects - 1) increasing growth in the volume and amount of data, 2) a growing data-intensive science base with challenging needs, and 3) disparate data organization and integration efforts - has created a critical gap. Namely, systems of scientific data organization and management typically do not effectively enable integrated data collaboration or data-intensive science-based communications. Our research effort attempts to address this gap by developing a modular technology framework for data science integration, with climate variation as the focus. The intention is that this model, if successful, could be generalized to other application areas. Our research aim focused on the design and implementation of a modular, deployable technology architecture for data integration. Developed using aspects of R, interactive Python, SciDB, THREDDS, JavaScript, and varied data mining and machine learning techniques, the Modular Data Response Framework (MDRF) was implemented to explore case scenarios for bio-climatic variation as they relate to Pacific Northwest ecosystem regions. Our preliminary results, using historical NetCDF climate data for calibration purposes across the inland Pacific Northwest region (Abatzoglou, Brown, 2011), show clear ecosystem shifts over a ten-year period (2001-2011), based on multiple supervised classifier methods for bioclimatic indicators.

  8. Reducing costs of managing and accessing navigation and ancillary data by relying on the extensive capabilities of NASA's spice system

    NASA Technical Reports Server (NTRS)

    Semenov, Boris V.; Acton, Charles H., Jr.; Bachman, Nathaniel J.; Elson, Lee S.; Wright, Edward D.

    2005-01-01

    The SPICE system of navigation and ancillary data possesses a number of traits that make its use in modern space missions of all types highly cost efficient. The core of the system is a software library providing API interfaces for storing and retrieving such data as trajectories, orientations, time conversions, and instrument geometry parameters. Applications used at any stage of a mission life cycle can call SPICE APIs to access this data and compute geometric quantities required for observation planning, engineering assessment and science data analysis. SPICE is implemented in three different languages, supported on 20+ computer environments, and distributed with complete source code and documentation. It includes capabilities that are extensively tested by everyday use in many active projects and are applicable to all types of space missions - flyby, orbiters, observatories, landers and rovers. While a customer's initial SPICE adaptation for the first mission or experiment requires a modest effort, this initial effort pays off because adaptation for subsequent missions/experiments is just a small fraction of the initial investment, with the majority of tools based on SPICE requiring no or very minor changes.
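
    A minimal sketch of the kind of API usage described above, using the SpiceyPy Python wrapper around the toolkit (an assumption made for illustration; the toolkit itself ships in other languages, and the meta-kernel path, epoch, and bodies below are placeholders rather than mission-specific values):

    ```python
    import spiceypy as spice

    spice.furnsh("meta_kernel.tm")            # load the kernels listed in a meta-kernel (placeholder path)
    et = spice.str2et("2005-07-04T12:00:00")  # convert a UTC string to ephemeris time
    # Position of Mars as seen from Earth in the J2000 frame,
    # with light-time plus stellar aberration correction
    pos, light_time = spice.spkpos("MARS", et, "J2000", "LT+S", "EARTH")
    spice.kclear()                            # unload all kernels when finished
    ```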

  9. Calculation of the Hadronic Vacuum Polarization Disconnected Contribution to the Muon Anomalous Magnetic Moment

    DOE PAGES

    Blum, T.; Boyle, P. A.; Izubuchi, T.; ...

    2016-06-08

    Here we report the first lattice QCD calculation of the hadronic vacuum polarization (HVP) disconnected contribution to the muon anomalous magnetic moment at physical pion mass. The calculation uses a refined noise-reduction technique that enables the control of statistical uncertainties at the desired level with modest computational effort. Measurements were performed on the 48³×96 physical-pion-mass lattice generated by the RBC and UKQCD Collaborations. In conclusion, we find the leading-order hadronic vacuum polarization a_μ^{HVP(LO)disc} = -9.6(3.3)(2.3)×10^{-10}, where the first error is statistical and the second systematic.

  10. State-to-state reactive scattering in six dimensions using reactant-product decoupling: OH + H2 → H2O + H (J = 0).

    PubMed

    Cvitaš, Marko T; Althorpe, Stuart C

    2011-01-14

    We extend to full dimensionality a recently developed wave packet method [M. T. Cvitaš and S. C. Althorpe, J. Phys. Chem. A 113, 4557 (2009)] for computing the state-to-state quantum dynamics of AB + CD → ABC + D reactions and also increase the computational efficiency of the method. This is done by introducing a new set of product coordinates, by applying the Crank-Nicolson approximation to the angular kinetic energy part of the split-operator propagator and by using a symmetry-adapted basis-to-grid transformation to evaluate integrals over the potential energy surface. The newly extended method is tested on the benchmark OH + H2 → H2O + H reaction, where it allows us to obtain accurately converged state-to-state reaction probabilities (on the Wu-Schatz-Fang-Lendvay-Harding potential energy surface) with modest computational effort. These methodological advances will make possible efficient calculations of state-to-state differential cross sections on this system in the near future.
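
    As a rough illustration of the split-operator idea mentioned above (a one-dimensional sketch only: the six-dimensional machinery, the Crank-Nicolson treatment of the angular kinetic energy, and the symmetry-adapted transforms of the paper are not reproduced, and the grid, mass, and potential below are placeholders):

    ```python
    import numpy as np

    hbar, m, dt = 1.0, 1.0, 0.01
    x = np.linspace(-10.0, 10.0, 512)
    dx = x[1] - x[0]
    k = 2 * np.pi * np.fft.fftfreq(x.size, d=dx)

    V = 0.5 * x**2                                   # placeholder potential
    psi = np.exp(-(x + 2.0)**2).astype(complex)      # placeholder initial wave packet
    psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

    expV = np.exp(-0.5j * V * dt / hbar)             # half-step potential propagator
    expT = np.exp(-1j * hbar * k**2 * dt / (2 * m))  # full-step kinetic propagator (momentum space)

    for _ in range(1000):
        psi = expV * psi                             # half potential step
        psi = np.fft.ifft(expT * np.fft.fft(psi))    # kinetic step applied in momentum space
        psi = expV * psi                             # remaining half potential step
    ```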

  11. High resolution flow field prediction for tail rotor aeroacoustics

    NASA Technical Reports Server (NTRS)

    Quackenbush, Todd R.; Bliss, Donald B.

    1989-01-01

    The prediction of tail rotor noise due to the impingement of the main rotor wake poses a significant challenge to current analysis methods in rotorcraft aeroacoustics. This paper describes the development of a new treatment of the tail rotor aerodynamic environment that permits highly accurate resolution of the incident flow field with modest computational effort relative to alternative models. The new approach incorporates an advanced full-span free wake model of the main rotor in a scheme which reconstructs high-resolution flow solutions from preliminary, computationally inexpensive simulations with coarse resolution. The heart of the approach is a novel method for using local velocity correction terms to capture the steep velocity gradients characteristic of the vortex-dominated incident flow. Sample calculations have been undertaken to examine the principal types of interactions between the tail rotor and the main rotor wake and to examine the performance of the new method. The results of these sample problems confirm the success of this approach in capturing the high-resolution flows necessary for analysis of rotor-wake/rotor interactions with dramatically reduced computational cost. Computations of radiated sound are also carried out that explore the role of various portions of the main rotor wake in generating tail rotor noise.

  12. The Lunar CELSS Test Module

    NASA Technical Reports Server (NTRS)

    Hoehn, Alexander; Gomez, Shawn; Luttges, Marvin W.

    1992-01-01

    The evolutionarily-developed Lunar Controlled Ecological Life Support System (CELSS) Test Module presented can address questions concerning long-term human presence-related issues both at LEO and in the lunar environment. By achieving well-defined research goals at each of numerous developmental stages (each economically modest), easily justifiable operations can be undertaken. Attention is given to the possibility of maximizing non-NASA involvement in these CELSS developmental efforts via the careful definability and modest risk of each developmental stage.

  13. Israeli Special Libraries

    ERIC Educational Resources Information Center

    Foster, Barbara

    1974-01-01

    Israel is sprinkled with a noteworthy representation of special libraries which run the gamut from modest kibbutz efforts to highly technical scientific and humanities libraries. A few examples are discussed here. (Author/CH)

  14. An adaptive mesh-moving and refinement procedure for one-dimensional conservation laws

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Flaherty, Joseph E.; Arney, David C.

    1993-01-01

    We examine the performance of an adaptive mesh-moving and/or local mesh refinement procedure for the finite difference solution of one-dimensional hyperbolic systems of conservation laws. Adaptive motion of a base mesh is designed to isolate spatially distinct phenomena, and recursive local refinement of the time step and cells of the stationary or moving base mesh is performed in regions where a refinement indicator exceeds a prescribed tolerance. These adaptive procedures are incorporated into a computer code that includes a MacCormack finite difference scheme with Davis' artificial viscosity model and a discretization error estimate based on Richardson's extrapolation. Experiments are conducted on three problems in order to quantify the advantages of adaptive techniques relative to uniform mesh computations and the relative benefits of mesh moving and refinement. Key results indicate that local mesh refinement, with and without mesh moving, can provide reliable solutions at much lower computational cost than possible on uniform meshes; that mesh motion can be used to improve the results of uniform mesh solutions for a modest computational effort; that the cost of managing the tree data structure associated with refinement is small; and that a combination of mesh motion and refinement reliably produces solutions for the least cost per unit accuracy.
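
    For context, a bare-bones MacCormack predictor-corrector step for a one-dimensional scalar conservation law is sketched below (an illustrative assumption, not the paper's code: the Davis artificial-viscosity term, Richardson-based error estimation, mesh motion, and refinement are all omitted, and the Burgers flux, grid, and initial data are placeholders):

    ```python
    import numpy as np

    def f(u):
        return 0.5 * u**2            # placeholder flux (inviscid Burgers)

    nx, dt = 200, 0.002
    x = np.linspace(0.0, 1.0, nx)
    dx = x[1] - x[0]
    u = np.sin(2 * np.pi * x)        # placeholder initial condition

    for _ in range(100):
        # predictor: forward difference of the flux
        up = u.copy()
        up[:-1] = u[:-1] - dt / dx * (f(u[1:]) - f(u[:-1]))
        # corrector: backward difference of the predicted flux, then average
        un = u.copy()
        un[1:] = 0.5 * (u[1:] + up[1:] - dt / dx * (f(up[1:]) - f(up[:-1])))
        u = un
    ```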

  15. Fast Determination of Distribution-Connected PV Impacts Using a Variable Time-Step Quasi-Static Time-Series Approach: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mather, Barry

    The increasing deployment of distribution-connected photovoltaic (DPV) systems requires utilities to complete complex interconnection studies. Relatively simple interconnection study methods worked well for low penetrations of photovoltaic systems, but more complicated quasi-static time-series (QSTS) analysis is required to make better interconnection decisions as DPV penetration levels increase. Tools and methods must be developed to support this. This paper presents a variable-time-step solver for QSTS analysis that significantly shortens the computational time and effort to complete a detailed analysis of the operation of a distribution circuit with many DPV systems. Specifically, it demonstrates that the proposed variable-time-step solver can reduce the required computational time by as much as 84% without introducing any important errors to metrics, such as the highest and lowest voltage occurring on the feeder, number of voltage regulator tap operations, and total amount of losses realized in the distribution circuit during a 1-yr period. Further improvement in computational speed is possible with the introduction of only modest errors in these metrics, such as a 91% reduction with less than 5% error when predicting voltage regulator operations.
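
    A hedged sketch of the general variable-time-step idea (not the preprint's solver: the solve_power_flow callable, tolerance, and input profiles are hypothetical placeholders) is to solve the circuit only when the driving inputs have moved by more than a tolerance since the last solved step, and otherwise reuse the previous solution:

    ```python
    def run_qsts(pv_profile, load_profile, solve_power_flow, tol=0.01):
        """Variable-time-step quasi-static time-series loop (illustrative sketch)."""
        results = []
        last_pv = last_load = last_solution = None
        for pv, load in zip(pv_profile, load_profile):
            changed = (
                last_solution is None
                or abs(pv - last_pv) > tol
                or abs(load - last_load) > tol
            )
            if changed:                                   # only re-solve on significant change
                last_solution = solve_power_flow(pv, load)
                last_pv, last_load = pv, load
            results.append(last_solution)                 # otherwise carry the last solution forward
        return results
    ```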

  16. Rising College Costs and an Illinois Effort to Control Them: A Preliminary Review

    ERIC Educational Resources Information Center

    North, Teresa Lynn

    2013-01-01

    Rising college costs are of increasing concern. At the 12 public universities in Illinois, average increases in tuition were modest, generally in the 4% range, until 1999 when individual campuses began to increase tuition at double-digit rates. In 2002-2003, the overall average increase in tuition/fees more than doubled at 13.79%. In an effort to…

  17. Vodcasting for Everyone

    NASA Astrophysics Data System (ADS)

    Hurt, R.; Christensen, L. L.

    2008-06-01

    Video podcasting, or vodcasting, is the latest evolution of the podcast revolution. The market for on-demand content spans the gamut, ranging from portable media players to computers, and increasingly to televisions through home media centers. This new mode of accessing video content is rapidly growing in popularity, particularly among younger audiences. Because it allows a direct link between consumer and content producer, bypassing traditional media networks, it is ideal for EPO efforts. Even modest budgets can yield compelling astronomy vodcasts that will appeal to a large audience. Gateways like the iTunes Music Store and YouTube have created new content markets where none existed before. This paper highlights the key steps for producing a vodcast. The reader will see how to make (or improve) a video podcast for science communication purposes and learn about some of the latest developments in this rapidly-evolving field.

  18. Vodcasting for Everyone

    NASA Astrophysics Data System (ADS)

    Christensen, L. L.; Hurt, R. L.

    2008-06-01

    Video podcasting, or vodcasting, is the latest evolution of the podcast revolution. The market for on-demand content spans the gamut, ranging from portable media players to computers, and increasingly to televisions through home media centres. This new mode of accessing video content is rapidly growing in popularity, particularly among younger audiences. Because it allows a direct link between consumer and content producer, bypassing traditional media networks, it is ideal for EPO efforts. Even modest budgets can yield compelling astronomy vodcasts that will appeal to a large audience. Gateways like the iTunes Music Store and YouTube have created new content markets where none existed before. This paper highlights the key steps for producing a vodcast. The reader will see how to make (or improve) a video podcast for science communication purposes and learn about some of the latest developments in this rapidly-evolving field.

  19. Digital Maps, Matrices and Computer Algebra

    ERIC Educational Resources Information Center

    Knight, D. G.

    2005-01-01

    The way in which computer algebra systems, such as Maple, have made the study of complex problems accessible to undergraduate mathematicians with modest computational skills is illustrated by some large matrix calculations, which arise from representing the Earth's surface by digital elevation models. Such problems are often considered to lie in…

  20. Trends in data locality abstractions for HPC systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Unat, Didem; Dubey, Anshu; Hoefler, Torsten

    The cost of data movement has always been an important concern in high performance computing (HPC) systems. It has now become the dominant factor in terms of both energy consumption and performance. Support for expression of data locality has been explored in the past, but those efforts have had only modest success in being adopted in HPC applications for various reasons. However, with the increasing complexity of the memory hierarchy and higher parallelism in emerging HPC systems, locality management has acquired a new urgency. Developers can no longer limit themselves to low-level solutions and ignore the potential for productivity and performance portability obtained by using locality abstractions. Fortunately, the trend emerging in recent literature on the topic alleviates many of the concerns that got in the way of their adoption by application developers. Data locality abstractions are available in the forms of libraries, data structures, languages and runtime systems; a common theme is increasing productivity without sacrificing performance. Furthermore, this paper examines these trends and identifies commonalities that can combine various locality concepts to develop a comprehensive approach to expressing and managing data locality on future large-scale high-performance computing systems.

  1. Perspective: Web-based machine learning models for real-time screening of thermoelectric materials properties

    NASA Astrophysics Data System (ADS)

    Gaultois, Michael W.; Oliynyk, Anton O.; Mar, Arthur; Sparks, Taylor D.; Mulholland, Gregory J.; Meredig, Bryce

    2016-05-01

    The experimental search for new thermoelectric materials remains largely confined to a limited set of successful chemical and structural families, such as chalcogenides, skutterudites, and Zintl phases. In principle, computational tools such as density functional theory (DFT) offer the possibility of rationally guiding experimental synthesis efforts toward very different chemistries. However, in practice, predicting thermoelectric properties from first principles remains a challenging endeavor [J. Carrete et al., Phys. Rev. X 4, 011019 (2014)], and experimental researchers generally do not directly use computation to drive their own synthesis efforts. To bridge this practical gap between experimental needs and computational tools, we report an open machine learning-based recommendation engine (http://thermoelectrics.citrination.com) for materials researchers that suggests promising new thermoelectric compositions based on pre-screening about 25 000 known materials and also evaluates the feasibility of user-designed compounds. We show this engine can identify interesting chemistries very different from known thermoelectrics. Specifically, we describe the experimental characterization of one example set of compounds derived from our engine, RE12Co5Bi (RE = Gd, Er), which exhibits surprising thermoelectric performance given its unprecedentedly high loading with metallic d and f block elements and warrants further investigation as a new thermoelectric material platform. We show that our engine predicts this family of materials to have low thermal and high electrical conductivities, but modest Seebeck coefficient, all of which are confirmed experimentally. We note that the engine also predicts materials that may simultaneously optimize all three properties entering into zT; we selected RE12Co5Bi for this study due to its interesting chemical composition and known facile synthesis.
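
    A hedged sketch of the surrogate-screening pattern such an engine implies (the descriptors, training data, and model choice here are placeholders invented for illustration; the engine's actual features, models, and data are not described in the abstract):

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    X_train = rng.random((500, 20))          # placeholder composition-derived descriptors
    y_train = rng.random(500)                # placeholder measured property (e.g., Seebeck coefficient)

    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X_train, y_train)

    X_candidates = rng.random((25000, 20))   # placeholder descriptors for candidate compositions
    scores = model.predict(X_candidates)
    shortlist = np.argsort(scores)[::-1][:100]   # top-ranked candidates to suggest for synthesis
    ```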

  2. A Hands-on Guide to Video Podcasting

    NASA Astrophysics Data System (ADS)

    Christensen, L. L.; Hurt, R.

    2008-02-01

    Video podcasting, or vodcasting, is the latest evolution of the podcast revolution. The market for on demand multimedia content spans the gamut, ranging from portable media players to computers, and increasingly to televisions through home media centres. This new mode of accessing content is rapidly growing in popularity, particularly among younger audiences. Vodcasting allows a direct link between consumer and content producer, bypassing traditional media networks, making it ideal for EPO efforts. Even modest budgets can yield compelling astronomy vodcasts that will appeal to a large audience. Gateways like the iTunes Store and video community websites such as Veoh and YouTube have created new content markets where none existed before. This paper highlights the key steps for producing a vodcast and shows some statistics from two leading astronomy vodcasts. The reader will see how to make (or improve) a science video podcast and learn about some of the latest developments in this rapidly-evolving field.

  3. A Comparison of PETSC Library and HPF Implementations of an Archetypal PDE Computation

    NASA Technical Reports Server (NTRS)

    Hayder, M. Ehtesham; Keyes, David E.; Mehrotra, Piyush

    1997-01-01

    Two paradigms for distributed-memory parallel computation that free the application programmer from the details of message passing are compared for an archetypal structured scientific computation: a nonlinear, structured-grid partial differential equation boundary value problem, using the same algorithm on the same hardware. Both paradigms, parallel libraries represented by Argonne's PETSc, and parallel languages represented by the Portland Group's HPF, are found to be easy to use for this problem class, and both are reasonably effective in exploiting concurrency after a short learning curve. The level of involvement required by the application programmer under either paradigm includes specification of the data partitioning (corresponding to a geometrically simple decomposition of the domain of the PDE). Programming in SPMD style for the PETSc library requires writing the routines that discretize the PDE and its Jacobian, managing subdomain-to-processor mappings (affine global-to-local index mappings), and interfacing to library solver routines. Programming for HPF requires a complete sequential implementation of the same algorithm, introducing concurrency through subdomain blocking (an effort similar to the index mapping), and modest experimentation with rewriting loops to elucidate to the compiler the latent concurrency. Correctness and scalability are cross-validated on up to 32 nodes of an IBM SP2.
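
    A small sketch of the kind of affine global-to-local index mapping that this programming model requires (a one-dimensional illustration under an assumed contiguous block ownership; it does not use PETSc's own mapping objects):

    ```python
    def block_bounds(n_global, n_procs, rank):
        """Contiguous block of global indices owned by process `rank`."""
        base, rem = divmod(n_global, n_procs)
        start = rank * base + min(rank, rem)
        size = base + (1 if rank < rem else 0)
        return start, size

    def global_to_local(i_global, start, size, ghost=1):
        """Affine map from a global index into the local array (with ghost cells), or None if off-process."""
        i_local = i_global - start + ghost
        if 0 <= i_local < size + 2 * ghost:
            return i_local
        return None

    # Example: a 1000-point grid split across 4 processes; rank 2 asks where global index 517 lives.
    start, size = block_bounds(1000, 4, 2)
    print(global_to_local(517, start, size))
    ```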

  4. The importance of gene conservation in the USDA Forest Service

    Treesearch

    Robert D. Mangold

    2017-01-01

    Aldo Leopold once said “to keep every cog and wheel is the first precaution of intelligent tinkering.” The USDA Forest Service has embarked on a long-term effort to do just that. Our gene conservation efforts in forest trees are a modest beginning to this urgent need. In the early 2000s, the Forest Health Protection Program and its partners in the National Forest...

  5. Tunable Coarse Graining for Monte Carlo Simulations of Proteins via Smoothed Energy Tables: Direct and Exchange Simulations

    PubMed Central

    2015-01-01

    Many commonly used coarse-grained models for proteins are based on simplified interaction sites and consequently may suffer from significant limitations, such as the inability to properly model protein secondary structure without the addition of restraints. Recent work on a benzene fluid (Lettieri, S.; Zuckerman, D. M. J. Comput. Chem. 2012, 33, 268−275) suggested an alternative strategy of tabulating and smoothing fully atomistic orientation-dependent interactions among rigid molecules or fragments. Here we report our initial efforts to apply this approach to the polar and covalent interactions intrinsic to polypeptides. We divide proteins into nearly rigid fragments, construct distance and orientation-dependent tables of the atomistic interaction energies between those fragments, and apply potential energy smoothing techniques to those tables. The amount of smoothing can be adjusted to give coarse-grained models that range from the underlying atomistic force field all the way to a bead-like coarse-grained model. For a moderate amount of smoothing, the method is able to preserve about 70–90% of the α-helical structure while providing a factor of 3–10 improvement in sampling per unit computation time (depending on how sampling is measured). For a greater amount of smoothing, multiple folding–unfolding transitions of the peptide were observed, along with a factor of 10–100 improvement in sampling per unit computation time, although the time spent in the unfolded state was increased compared with less smoothed simulations. For a β hairpin, secondary structure is also preserved, albeit for a narrower range of the smoothing parameter and, consequently, for a more modest improvement in sampling. We have also applied the new method in a “resolution exchange” setting, in which each replica runs a Monte Carlo simulation with a different degree of smoothing. We obtain exchange rates that compare favorably to our previous efforts at resolution exchange (Lyman, E.; Zuckerman, D. M. J. Chem. Theory Comput. 2006, 2, 656−666). PMID:25400525
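
    A toy sketch of Metropolis Monte Carlo driven by a tabulated, smoothed energy (a one-dimensional, distance-only stand-in; the paper tabulates full distance- and orientation-dependent fragment interactions, and the Lennard-Jones table, smoothing width, and move size below are placeholders):

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter1d

    r_grid = np.linspace(0.8, 3.0, 400)
    e_table = 4.0 * ((1.0 / r_grid)**12 - (1.0 / r_grid)**6)   # placeholder pair energies
    e_smooth = gaussian_filter1d(e_table, sigma=10)            # tunable smoothing of the table

    def energy(r):
        return np.interp(r, r_grid, e_smooth)                  # look up the smoothed table

    rng = np.random.default_rng(1)
    beta, r, n_accept = 1.0, 1.5, 0
    for _ in range(10000):
        r_new = r + rng.normal(scale=0.05)                     # trial displacement
        if not 0.8 < r_new < 3.0:
            continue
        if rng.random() < np.exp(-beta * (energy(r_new) - energy(r))):
            r, n_accept = r_new, n_accept + 1                  # Metropolis acceptance
    ```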

  6. Infusing Software Engineering Technology into Practice at NASA

    NASA Technical Reports Server (NTRS)

    Pressburger, Thomas; Feather, Martin S.; Hinchey, Michael; Markosia, Lawrence

    2006-01-01

    We present an ongoing effort of the NASA Software Engineering Initiative to encourage the use of advanced software engineering technology on NASA projects. Technology infusion is in general a difficult process yet this effort seems to have found a modest approach that is successful for some types of technologies. We outline the process and describe the experience of the technology infusions that occurred over a two year period. We also present some lessons from the experiences.

  7. Challenges and opportunities with standardized monitoring for management decision-making

    USDA-ARS?s Scientific Manuscript database

    The importance of monitoring for adaptive management of rangelands has been well established. However, the actual use of monitoring data in rangeland management decisions has been modest despite extensive efforts to develop and implement monitoring programs from local to national scales. More effect...

  8. Repetitive Domain-Referenced Testing Using Computers: the TITA System.

    ERIC Educational Resources Information Center

    Olympia, P. L., Jr.

    The TITA (Totally Interactive Testing and Analysis) System algorithm for the repetitive construction of domain-referenced tests utilizes a compact data bank, is highly portable, is useful in any discipline, requires modest computer hardware, and does not present a security problem. Clusters of related keyphrases, statement phrases, and distractors…

  9. Trends in data locality abstractions for HPC systems

    DOE PAGES

    Unat, Didem; Dubey, Anshu; Hoefler, Torsten; ...

    2017-05-10

    The cost of data movement has always been an important concern in high performance computing (HPC) systems. It has now become the dominant factor in terms of both energy consumption and performance. Support for expression of data locality has been explored in the past, but those efforts have had only modest success in being adopted in HPC applications for various reasons. However, with the increasing complexity of the memory hierarchy and higher parallelism in emerging HPC systems, locality management has acquired a new urgency. Developers can no longer limit themselves to low-level solutions and ignore the potential for productivity and performance portability obtained by using locality abstractions. Fortunately, the trend emerging in recent literature on the topic alleviates many of the concerns that got in the way of their adoption by application developers. Data locality abstractions are available in the forms of libraries, data structures, languages and runtime systems; a common theme is increasing productivity without sacrificing performance. Furthermore, this paper examines these trends and identifies commonalities that can combine various locality concepts to develop a comprehensive approach to expressing and managing data locality on future large-scale high-performance computing systems.

  10. A general concept for consistent documentation of computational analyses

    PubMed Central

    Müller, Fabian; Nordström, Karl; Lengauer, Thomas; Schulz, Marcel H.

    2015-01-01

    The ever-growing amount of data in the field of life sciences demands standardized ways of high-throughput computational analysis. This standardization requires a thorough documentation of each step in the computational analysis to enable researchers to understand and reproduce the results. However, due to the heterogeneity in software setups and the high rate of change during tool development, reproducibility is hard to achieve. One reason is that there is no common agreement in the research community on how to document computational studies. In many cases, simple flat files or other unstructured text documents are provided by researchers as documentation, which are often missing software dependencies, versions and sufficient documentation to understand the workflow and parameter settings. As a solution we suggest a simple and modest approach for documenting and verifying computational analysis pipelines. We propose a two-part scheme that defines a computational analysis using a Process and an Analysis metadata document, which jointly describe all necessary details to reproduce the results. In this design we separate the metadata specifying the process from the metadata describing an actual analysis run, thereby reducing the effort of manual documentation to an absolute minimum. Our approach is independent of a specific software environment, results in human readable XML documents that can easily be shared with other researchers and allows an automated validation to ensure consistency of the metadata. Because our approach has been designed with little to no assumptions concerning the workflow of an analysis, we expect it to be applicable in a wide range of computational research fields. Database URL: http://deep.mpi-inf.mpg.de/DAC/cmds/pub/pyvalid.zip PMID:26055099
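
    A minimal sketch of the two-document idea described above, built with Python's standard XML tooling (the element and attribute names are assumptions made up for illustration; they are not the schema distributed with the paper):

    ```python
    import xml.etree.ElementTree as ET

    # Process metadata: what the pipeline is (tools, versions, parameters).
    process = ET.Element("process", name="rnaseq-alignment", version="1.0")
    step = ET.SubElement(process, "step", tool="bwa", toolVersion="0.7.17")
    ET.SubElement(step, "parameter", name="threads", value="8")

    # Analysis metadata: one concrete run of that process (inputs, outputs, timing).
    analysis = ET.Element("analysis", processRef="rnaseq-alignment")
    ET.SubElement(analysis, "input", path="sample1.fastq.gz")
    ET.SubElement(analysis, "output", path="sample1.bam")
    ET.SubElement(analysis, "startedAt").text = "2015-05-01T10:00:00"

    ET.ElementTree(process).write("process.xml")
    ET.ElementTree(analysis).write("analysis.xml")
    ```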

  11. Education for an Aging Planet

    ERIC Educational Resources Information Center

    Ingman, Stan; Amin, Iftekhar; Clarke, Egerton; Brune, Kendall

    2010-01-01

    As low income societies experience rapid aging of their populations, they face major challenges in developing educational policies to prepare their workforce for the future. We review modest efforts undertaken to assist colleagues in three societies: Mexico, China, and Jamaica. Graduate education in gerontology has an important opportunity to…

  12. The current state of Bayesian methods in medical product development: survey results and recommendations from the DIA Bayesian Scientific Working Group.

    PubMed

    Natanegara, Fanni; Neuenschwander, Beat; Seaman, John W; Kinnersley, Nelson; Heilmann, Cory R; Ohlssen, David; Rochester, George

    2014-01-01

    Bayesian applications in medical product development have recently gained popularity. Despite many advances in Bayesian methodology and computations, the increase in application across the various areas of medical product development has been modest. The DIA Bayesian Scientific Working Group (BSWG), which includes representatives from industry, regulatory agencies, and academia, has adopted the vision to ensure Bayesian methods are well understood, accepted more broadly, and appropriately utilized to improve decision making and enhance patient outcomes. As Bayesian applications in medical product development are wide ranging, several sub-teams were formed to focus on various topics such as patient safety, non-inferiority, prior specification, comparative effectiveness, joint modeling, program-wide decision making, analytical tools, and education. The focus of this paper is on the recent effort of the BSWG Education sub-team to administer a Bayesian survey to statisticians across 17 organizations involved in medical product development. We summarize results of this survey, from which we provide recommendations on how to accelerate progress in Bayesian applications throughout medical product development. The survey results support findings from the literature and provide additional insight on regulatory acceptance of Bayesian methods and information on the need for a Bayesian infrastructure within an organization. The survey findings support the claim that only modest progress in areas of education and implementation has been made recently, despite substantial progress in Bayesian statistical research and software availability. Copyright © 2013 John Wiley & Sons, Ltd.

  13. The Multiple Pendulum Problem via Maple[R]

    ERIC Educational Resources Information Center

    Salisbury, K. L.; Knight, D. G.

    2002-01-01

    The way in which computer algebra systems, such as Maple, have made the study of physical problems of some considerable complexity accessible to mathematicians and scientists with modest computational skills is illustrated by solving the multiple pendulum problem. A solution is obtained for four pendulums with no restriction on the size of the…

  14. Building software tools to help contextualize and interpret monitoring data

    USDA-ARS?s Scientific Manuscript database

    Even modest monitoring efforts at landscape scales produce large volumes of data. These are most useful if they can be interpreted relative to land potential or other similar sites. However, for many ecological systems reference conditions may not be defined or are poorly described, which hinders und...

  15. Make the most of your samples: Bayes factor estimators for high-dimensional models of sequence evolution.

    PubMed

    Baele, Guy; Lemey, Philippe; Vansteelandt, Stijn

    2013-03-06

    Accurate model comparison requires extensive computation times, especially for parameter-rich models of sequence evolution. In the Bayesian framework, model selection is typically performed through the evaluation of a Bayes factor, the ratio of two marginal likelihoods (one for each model). Recently introduced techniques to estimate (log) marginal likelihoods, such as path sampling and stepping-stone sampling, offer increased accuracy over the traditional harmonic mean estimator at an increased computational cost. Most often, each model's marginal likelihood will be estimated individually, which leads the resulting Bayes factor to suffer from errors associated with each of these independent estimation processes. We here assess the original 'model-switch' path sampling approach for direct Bayes factor estimation in phylogenetics, as well as an extension that uses more samples, to construct a direct path between two competing models, thereby eliminating the need to calculate each model's marginal likelihood independently. Further, we provide a competing Bayes factor estimator using an adaptation of the recently introduced stepping-stone sampling algorithm and set out to determine appropriate settings for accurately calculating such Bayes factors, with context-dependent evolutionary models as an example. While we show that modest efforts are required to roughly identify the increase in model fit, only drastically increased computation times ensure the accuracy needed to detect more subtle details of the evolutionary process. We show that our adaptation of stepping-stone sampling for direct Bayes factor calculation outperforms the original path sampling approach as well as an extension that exploits more samples. Our proposed approach for Bayes factor estimation also has preferable statistical properties over the use of individual marginal likelihood estimates for both models under comparison. Assuming a sigmoid function to determine the path between two competing models, we provide evidence that a single well-chosen sigmoid shape value requires less computational efforts in order to approximate the true value of the (log) Bayes factor compared to the original approach. We show that the (log) Bayes factors calculated using path sampling and stepping-stone sampling differ drastically from those estimated using either of the harmonic mean estimators, supporting earlier claims that the latter systematically overestimate the performance of high-dimensional models, which we show can lead to erroneous conclusions. Based on our results, we argue that highly accurate estimation of differences in model fit for high-dimensional models requires much more computational effort than suggested in recent studies on marginal likelihood estimation.
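
    For orientation, a sketch of the standard per-model stepping-stone estimator is given below (the paper's direct, model-switch path between two competing models is not reproduced; the power schedule and the per-power posterior sampling are assumed to be supplied by the caller, and a Bayes factor then follows as the difference of two such log marginal likelihoods):

    ```python
    import numpy as np

    def stepping_stone_log_ml(betas, log_lik_samples):
        """Stepping-stone estimate of the log marginal likelihood.

        betas: increasing power schedule from 0.0 to 1.0.
        log_lik_samples[k]: log-likelihood values of posterior samples drawn at power betas[k]
                            (the MCMC sampling itself is assumed to happen elsewhere).
        """
        betas = np.asarray(betas)
        log_ml = 0.0
        for k in range(len(betas) - 1):
            delta = betas[k + 1] - betas[k]
            ll = np.asarray(log_lik_samples[k])
            m = ll.max()                      # stabilize the log-mean-exp
            log_ml += delta * m + np.log(np.mean(np.exp(delta * (ll - m))))
        return log_ml

    # log Bayes factor between two models, each with its own per-power log-likelihood samples:
    # log_bf = stepping_stone_log_ml(betas, ll_model1) - stepping_stone_log_ml(betas, ll_model2)
    ```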

  16. Make the most of your samples: Bayes factor estimators for high-dimensional models of sequence evolution

    PubMed Central

    2013-01-01

    Background Accurate model comparison requires extensive computation times, especially for parameter-rich models of sequence evolution. In the Bayesian framework, model selection is typically performed through the evaluation of a Bayes factor, the ratio of two marginal likelihoods (one for each model). Recently introduced techniques to estimate (log) marginal likelihoods, such as path sampling and stepping-stone sampling, offer increased accuracy over the traditional harmonic mean estimator at an increased computational cost. Most often, each model’s marginal likelihood will be estimated individually, which leads the resulting Bayes factor to suffer from errors associated with each of these independent estimation processes. Results We here assess the original ‘model-switch’ path sampling approach for direct Bayes factor estimation in phylogenetics, as well as an extension that uses more samples, to construct a direct path between two competing models, thereby eliminating the need to calculate each model’s marginal likelihood independently. Further, we provide a competing Bayes factor estimator using an adaptation of the recently introduced stepping-stone sampling algorithm and set out to determine appropriate settings for accurately calculating such Bayes factors, with context-dependent evolutionary models as an example. While we show that modest efforts are required to roughly identify the increase in model fit, only drastically increased computation times ensure the accuracy needed to detect more subtle details of the evolutionary process. Conclusions We show that our adaptation of stepping-stone sampling for direct Bayes factor calculation outperforms the original path sampling approach as well as an extension that exploits more samples. Our proposed approach for Bayes factor estimation also has preferable statistical properties over the use of individual marginal likelihood estimates for both models under comparison. Assuming a sigmoid function to determine the path between two competing models, we provide evidence that a single well-chosen sigmoid shape value requires less computational efforts in order to approximate the true value of the (log) Bayes factor compared to the original approach. We show that the (log) Bayes factors calculated using path sampling and stepping-stone sampling differ drastically from those estimated using either of the harmonic mean estimators, supporting earlier claims that the latter systematically overestimate the performance of high-dimensional models, which we show can lead to erroneous conclusions. Based on our results, we argue that highly accurate estimation of differences in model fit for high-dimensional models requires much more computational effort than suggested in recent studies on marginal likelihood estimation. PMID:23497171

  17. Error Reduction Program. [combustor performance evaluation codes

    NASA Technical Reports Server (NTRS)

    Syed, S. A.; Chiappetta, L. M.; Gosman, A. D.

    1985-01-01

    The details of a study to select, incorporate and evaluate the best available finite difference scheme to reduce numerical error in combustor performance evaluation codes are described. The combustor performance computer programs chosen were the two dimensional and three dimensional versions of Pratt & Whitney's TEACH code. The criteria used to select schemes required that the difference equations mirror the properties of the governing differential equation, be more accurate than the current hybrid difference scheme, be stable and economical, be compatible with TEACH codes, use only modest amounts of additional storage, and be relatively simple. The methods of assessment used in the selection process consisted of examination of the difference equation, evaluation of the properties of the coefficient matrix, Taylor series analysis, and performance on model problems. Five schemes from the literature and three schemes developed during the course of the study were evaluated. This effort resulted in the incorporation of a scheme in 3D-TEACH which is usually more accurate than the hybrid differencing method and never less accurate.

  18. Changing the Discourse in Schools.

    ERIC Educational Resources Information Center

    Eubanks, Eugene; Parish, Ralph

    Efforts in the United States to provide a higher quality education for everyone regardless of race, class, and gender have had, at best, a very modest effect. This paper suggests that the effect of a change strategy depends on the discourse (how things are talked about when teachers solve problems, plan their work, create policy, and explain…

  19. Creating Better Citizens? Effects of a Model Citizens' Assembly on Student Political Attitudes and Behavior

    ERIC Educational Resources Information Center

    Gershtenson, Joseph; Rainey, Glenn W., Jr.; Rainey, Jane G.

    2010-01-01

    Perceiving political engagement to be dangerously low among American citizens, many political science professors in recent years have attempted to promote engagement and "healthier" political attitudes. The effectiveness of these efforts appears variable and generally quite modest. Following the model of Canadian citizens' assemblies, we…

  20. Self-Entitled College Students: Contributions of Personality, Parenting, and Motivational Factors

    ERIC Educational Resources Information Center

    Greenberger, Ellen; Lessard, Jared; Chen, Chuansheng; Farruggia, Susan P.

    2008-01-01

    Anecdotal evidence suggests an increase in entitled attitudes and behaviors of youth in school and college settings. Using a newly developed scale to assess "academic entitlement" (AE), a construct that includes expectations of high grades for modest effort and demanding attitudes towards teachers, this research is the first to investigate the…

  1. Treatment options after sorafenib failure in patients with hepatocellular carcinoma

    PubMed Central

    Dika, Imane El

    2017-01-01

    Second-line therapy after failure of sorafenib continues to be under study. Prognosis of hepatocellular carcinoma is measured in months, with median overall survival reaching 10.7 months with sorafenib. Because of the modest net benefit sorafenib has contributed and the rising incidence of hepatocellular carcinoma worldwide, efforts are ongoing to find effective upfront, second-line, or combination therapies. Herein we review the most relevant published literature to date on treatment options beyond sorafenib, reported studies, ongoing investigational efforts, and possibilities for future studies in advanced hepatocellular carcinoma. PMID:29151326

  2. Evaluation Realities or How I Learned to Love "The Standards" While Evaluating a Computer Assisted Instruction Project.

    ERIC Educational Resources Information Center

    Payne, David A.

    This case study presents a narrative summary of the evaluation of a two semester computer assisted instruction (CAI) project in an all minority high school. Use of PLATO software with Control Data microcomputers brought about modest achievement advantages, higher internal locus of control, more positive attitudes toward school and specific course…

  3. Characterization of phenotypic variation for dermo resistance among selectively-bred families of the Eastern oyster, Crassostrea virginica

    USDA-ARS?s Scientific Manuscript database

    Dermo disease impacts nearly every region where oysters are cultured in the Eastern U.S. and is a significant concern to industry stakeholders. Efforts to breed for Dermo resistance in the Eastern Oyster have had modest success, yet the range of existing phenotypic variation with respect to Dermo r...

  4. Assessing the extent of phenotypic variation for dermo resistance among selectively-bred families of the Eastern Oyster, Crassostrea virginica

    USDA-ARS?s Scientific Manuscript database

    Dermo disease impacts nearly every region where oysters are cultured in the Eastern U.S. and is a significant concern to industry stakeholders. Efforts to breed for Dermo resistance in the Eastern Oyster have had modest success, yet the range of existing phenotypic variation with respect to Dermo ...

  5. Curricula, Competition and Conventional Bonds: The Educational Role in Drug Control.

    ERIC Educational Resources Information Center

    Norland, Stephen; And Others

    1996-01-01

    Evaluations of school curricular drug control efforts show they are only modestly successful because they are based on an inaccurate theory of drug taking. Social control theory is suggested as a better model of drug taking and drug resistance. Asserts strong bonds to school decrease the likelihood of interaction with delinquent peers and thereby…

  6. Distributed and Lumped Parameter Models for the Characterization of High Throughput Bioreactors

    PubMed Central

    Conoscenti, Gioacchino; Cutrì, Elena; Tuan, Rocky S.; Raimondi, Manuela T.; Gottardi, Riccardo

    2016-01-01

    Next generation bioreactors are being developed to generate multiple human cell-based tissue analogs within the same fluidic system, to better recapitulate the complexity and interconnection of human physiology [1, 2]. The effective development of these devices requires a solid understanding of their interconnected fluidics, to predict the transport of nutrients and waste through the constructs and improve the design accordingly. In this work, we focus on a specific model of bioreactor, with multiple input/outputs, aimed at generating osteochondral constructs, i.e., a biphasic construct in which one side is cartilaginous in nature, while the other is osseous. We next develop a general computational approach to model the microfluidics of a multi-chamber, interconnected system that may be applied to human-on-chip devices. This objective requires overcoming several challenges at the level of computational modeling. The main one consists of addressing the multi-physics nature of the problem that combines free flow in channels with hindered flow in porous media. Fluid dynamics is also coupled with advection-diffusion-reaction equations that model the transport of biomolecules throughout the system and their interaction with living tissues and C constructs. Ultimately, we aim at providing a predictive approach useful for the general organ-on-chip community. To this end, we have developed a lumped parameter approach that allows us to analyze the behavior of multi-unit bioreactor systems with modest computational effort, provided that the behavior of a single unit can be fully characterized. PMID:27669413

  7. Distributed and Lumped Parameter Models for the Characterization of High Throughput Bioreactors.

    PubMed

    Iannetti, Laura; D'Urso, Giovanna; Conoscenti, Gioacchino; Cutrì, Elena; Tuan, Rocky S; Raimondi, Manuela T; Gottardi, Riccardo; Zunino, Paolo

    Next generation bioreactors are being developed to generate multiple human cell-based tissue analogs within the same fluidic system, to better recapitulate the complexity and interconnection of human physiology [1, 2]. The effective development of these devices requires a solid understanding of their interconnected fluidics, to predict the transport of nutrients and waste through the constructs and improve the design accordingly. In this work, we focus on a specific model of bioreactor, with multiple input/outputs, aimed at generating osteochondral constructs, i.e., a biphasic construct in which one side is cartilaginous in nature, while the other is osseous. We next develop a general computational approach to model the microfluidics of a multi-chamber, interconnected system that may be applied to human-on-chip devices. This objective requires overcoming several challenges at the level of computational modeling. The main one consists of addressing the multi-physics nature of the problem that combines free flow in channels with hindered flow in porous media. Fluid dynamics is also coupled with advection-diffusion-reaction equations that model the transport of biomolecules throughout the system and their interaction with living tissues and C constructs. Ultimately, we aim at providing a predictive approach useful for the general organ-on-chip community. To this end, we have developed a lumped parameter approach that allows us to analyze the behavior of multi-unit bioreactor systems with modest computational effort, provided that the behavior of a single unit can be fully characterized.
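
    A hedged sketch of what a lumped-parameter view of such a fluidic system can look like (a toy two-node hydraulic resistance network with placeholder values; it is not the model, geometry, or parameter set characterized in the paper): each channel or chamber is reduced to a resistance R, and node pressures follow from flow conservation in analogy with a resistor network.

    ```python
    import numpy as np

    # Network: inlet -> node0 -> (two parallel chamber branches) -> node1 -> outlet
    R_in, R_a, R_b, R_out = 1.0, 4.0, 6.0, 1.0     # placeholder hydraulic resistances
    P_inlet, P_outlet = 10.0, 0.0                  # imposed boundary pressures

    # Unknowns: pressures at node0 and node1. Flow balance at each node:
    #   (P_inlet - P0)/R_in = (P0 - P1)/R_a + (P0 - P1)/R_b
    #   (P0 - P1)/R_a + (P0 - P1)/R_b = (P1 - P_outlet)/R_out
    G_par = 1.0 / R_a + 1.0 / R_b                  # conductance of the parallel branches
    A = np.array([[1.0 / R_in + G_par, -G_par],
                  [-G_par, G_par + 1.0 / R_out]])
    b = np.array([P_inlet / R_in, P_outlet / R_out])
    P0, P1 = np.linalg.solve(A, b)

    Q_a = (P0 - P1) / R_a                          # flow through chamber branch a
    Q_b = (P0 - P1) / R_b                          # flow through chamber branch b
    ```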

  8. The influence of inspiratory effort and emphysema on pulmonary nodule volumetry reproducibility.

    PubMed

    Moser, J B; Mak, S M; McNulty, W H; Padley, S; Nair, A; Shah, P L; Devaraj, A

    2017-11-01

    To evaluate the impact of inspiratory effort and emphysema on reproducibility of pulmonary nodule volumetry. Eighty-eight nodules in 24 patients with emphysema were studied retrospectively. All patients had undergone volumetric inspiratory and end-expiratory thoracic computed tomography (CT) for consideration of bronchoscopic lung volume reduction. Inspiratory and expiratory nodule volumes were measured using commercially available software. Local emphysema extent was established by analysing a segmentation area extended circumferentially around each nodule (quantified as percent of lung with density of -950 HU or less). Lung volumes were established using the same software. Differences in inspiratory and expiratory nodule volumes were illustrated using the Bland-Altman test. The influences of percentage reduction in lung volume at expiration, local emphysema extent, and nodule size on nodule volume variability were tested with multiple linear regression. The majority of nodules (59/88 [67%]) showed an increased volume at expiration. Mean difference in nodule volume between expiration and inspiration was +7.5% (95% confidence interval: -24.1, 39.1%). No relationships were demonstrated between nodule volume variability and emphysema extent, degree of expiration, or nodule size. Expiration causes a modest increase in volumetry-derived nodule volumes; however, the effect is unpredictable. Local emphysema extent had no significant effect on volume variability in the present cohort. Copyright © 2017 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
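
    The agreement analysis described above amounts to computing, for each nodule, the percentage volume change between the expiratory and inspiratory measurements and reporting the mean difference with its 95% limits of agreement. A minimal sketch of that calculation is shown below on made-up volume pairs; the numbers are placeholders, not data from the study.

        # Bland-Altman-style agreement sketch for paired nodule volumes (mm^3).
        # The volume pairs are made-up placeholders, not data from the study.
        import numpy as np

        insp = np.array([120.0, 85.0, 240.0, 64.0, 150.0])   # inspiratory volumes
        expi = np.array([131.0, 83.0, 255.0, 70.0, 149.0])   # expiratory volumes

        pct_diff = 100.0 * (expi - insp) / insp               # percent change at expiration
        mean_diff = pct_diff.mean()
        sd_diff = pct_diff.std(ddof=1)

        # 95% limits of agreement: mean difference +/- 1.96 standard deviations.
        loa_low, loa_high = mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff
        print(f"mean difference {mean_diff:+.1f}%, limits of agreement ({loa_low:.1f}%, {loa_high:.1f}%)")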

  9. Female Staff Members in the Fabrication Shop

    NASA Image and Video Library

    1944-06-21

    The loss of male NACA employees to the war effort and the military’s increased demand for expedited aeronautical research results led to a sharp demand for increased staffing in the early 1940s. The Aircraft Engine Research Laboratory (AERL) undertook an extensive recruiting effort to remedy the situation. Current employees were asked to bring in friends and family members, including women. The number of women employed at the AERL nearly doubled to 412 between 1943 and 1944. In May 1944 Director Raymond Sharp initiated a program to train women as machine operators, electricians, instrumentation engineers, and other technical positions. The move coincided with the lab’s implementation of a third shift to meet the military’s demands for improved aircraft performance. There was also a modest, but important, number of female engineers and chemists, as well as a large group employed in more traditional positions such as data analysts, editors, and clerks. The integration of women in the research process was critical. Researchers developed a test and submitted plans to the Drafting Section to be converted into blueprints. In some instances the Instrumentation Shop was asked to create instruments for the test. During the test, computers gathered and analyzed the data. The researcher then wrote the report which was reviewed by the Editorial Department and printed in the Duplication Unit. All of these tasks were generally performed by female employees.

  10. Solving optimization problems on computational grids.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wright, S. J.; Mathematics and Computer Science

    2001-05-01

    Multiprocessor computing platforms, which have become more and more widely available since the mid-1980s, are now heavily used by organizations that need to solve very demanding computational problems. Parallel computing is now central to the culture of many research communities. Novel parallel approaches were developed for global optimization, network optimization, and direct-search methods for nonlinear optimization. Activity was particularly widespread in parallel branch-and-bound approaches for various problems in combinatorial and network optimization. As the cost of personal computers and low-end workstations has continued to fall, while the speed and capacity of processors and networks have increased dramatically, 'cluster' platforms have become popular in many settings. A somewhat different type of parallel computing platform known as a computational grid (alternatively, metacomputer) has arisen in comparatively recent times. Broadly speaking, this term refers not to a multiprocessor with identical processing nodes but rather to a heterogeneous collection of devices that are widely distributed, possibly around the globe. The advantage of such platforms is obvious: they have the potential to deliver enormous computing power. Just as obviously, however, the complexity of grids makes them very difficult to use. The Condor team, headed by Miron Livny at the University of Wisconsin, was among the pioneers in providing infrastructure for grid computations. More recently, the Globus project has developed technologies to support computations on geographically distributed platforms consisting of high-end computers, storage and visualization devices, and other scientific instruments. In 1997, we started the metaneos project as a collaborative effort between optimization specialists and the Condor and Globus groups. Our aim was to address complex, difficult optimization problems in several areas, designing and implementing the algorithms and the software infrastructure needed to solve these problems on computational grids. This article describes some of the results we have obtained during the first three years of the metaneos project. Our efforts have led to the development of the runtime support library MW for implementing algorithms with master-worker control structure on Condor platforms. This work is discussed here, along with work on algorithms and codes for integer linear programming, the quadratic assignment problem, and stochastic linear programming. Our experiences in the metaneos project have shown that cheap, powerful computational grids can be used to tackle large optimization problems of various types. In an industrial or commercial setting, the results demonstrate that one may not have to buy powerful computational servers to solve many of the large problems arising in areas such as scheduling, portfolio optimization, or logistics; the idle time on employee workstations (or, at worst, an investment in a modest cluster of PCs) may do the job. For the optimization research community, our results motivate further work on parallel, grid-enabled algorithms for solving very large problems of other types. The fact that very large problems can be solved cheaply allows researchers to better understand issues of 'practical' complexity and of the role of heuristics.
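
    The master-worker control structure mentioned above keeps a master that holds a pool of independent subproblems and a set of workers that repeatedly take one, solve it, and return the result, which is what makes the approach tolerant of workers joining and leaving a grid. The sketch below mimics that control structure on a single machine with Python's multiprocessing; it is only an illustration of the pattern, not the MW library or Condor interface itself, and the toy subproblem is an assumption.

        # Master-worker sketch: a master farms independent subproblems out to workers.
        # This illustrates the control structure only; it is not the MW/Condor API.
        from multiprocessing import Pool

        def solve_subproblem(bounds):
            """Stand-in for one unit of optimization work: crude search on an interval."""
            lo, hi = bounds
            return min(((x / 1000.0) ** 2 - 3.0 * (x / 1000.0), x / 1000.0)
                       for x in range(int(lo * 1000), int(hi * 1000)))

        if __name__ == "__main__":
            # Master: partition the search range into tasks, dispatch, keep the best result.
            tasks = [(float(i), float(i + 1)) for i in range(6)]
            with Pool(processes=4) as pool:
                results = pool.map(solve_subproblem, tasks)
            value, x = min(results)
            print(f"best objective {value:.3f} at x = {x:.3f}")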

  11. Stress Management-Augmented Behavioral Weight Loss Intervention for African American Women: A Pilot, Randomized Controlled Trial

    ERIC Educational Resources Information Center

    Cox, Tiffany L.; Krukowski, Rebecca; Love, ShaRhonda J.; Eddings, Kenya; DiCarlo, Marisha; Chang, Jason Y.; Prewitt, T. Elaine; West, Delia Smith

    2013-01-01

    The relationship between chronic stress and weight management efforts may be a concern for African American (AA) women, who have a high prevalence of obesity, high stress levels, and modest response to obesity treatment. This pilot study randomly assigned 44 overweight/obese AA women with moderate to high stress levels to either a 12-week…

  12. Guidelines and Options for Computer Access from a Reclined Position.

    PubMed

    Grott, Ray

    2015-01-01

    Many people can benefit from working in a reclined position when accessing a computer. This can be due to disabilities involving musculoskeletal weakness, or the need to offload pressure on the spine or elevate the legs. Although there are "reclining workstations" on the market that work for some people, potentially better solutions tailored to individual needs can be configured at modest cost by following some basic principles.

  13. Computer simulation of space charge

    NASA Astrophysics Data System (ADS)

    Yu, K. W.; Chung, W. K.; Mak, S. S.

    1991-05-01

    Using the particle-mesh (PM) method, a one-dimensional simulation of the well-known Langmuir-Child's law is performed on an INTEL 80386-based personal computer system. The program is coded in turbo basic (trademark of Borland International, Inc.). The numerical results obtained were in excellent agreement with theoretical predictions, and the computational time required was quite modest. This simulation exercise demonstrates that simple computer simulations using particles may be implemented successfully on the PCs that are available today, and hopefully this will provide the necessary incentives for newcomers to the field who wish to acquire a flavor of the elementary aspects of the practice.
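
    For reference, the relation such a simulation is expected to reproduce is the Child-Langmuir (space-charge-limited current) law for a planar diode with gap d and applied potential V,

        J = \frac{4\varepsilon_0}{9}\sqrt{\frac{2e}{m_e}}\,\frac{V^{3/2}}{d^{2}},

    so the steady-state current density collected in the simulation should scale as V^{3/2} and as 1/d^2.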

  14. Subgrid or Reynolds stress-modeling for three-dimensional turbulence computations

    NASA Technical Reports Server (NTRS)

    Rubesin, M. W.

    1975-01-01

    A review is given of recent advances in two distinct computational methods for evaluating turbulence fields, namely, statistical Reynolds stress modeling and turbulence simulation, where large eddies are followed in time. It is shown that evaluation of the mean Reynolds stresses, rather than use of a scalar eddy viscosity, permits an explanation of streamline curvature effects found in several experiments. Turbulence simulation, with a new volume averaging technique and third-order accurate finite-difference computing is shown to predict the decay of isotropic turbulence in incompressible flow with rather modest computer storage requirements, even at Reynolds numbers of aerodynamic interest.

  15. Approximations for Quantitative Feedback Theory Designs

    NASA Technical Reports Server (NTRS)

    Henderson, D. K.; Hess, R. A.

    1997-01-01

    The computational requirements for obtaining the results summarized in the preceding section were very modest and were easily accomplished using computer-aided control system design software. Of special significance is the ability of the PDT to indicate a loop closure sequence for MIMO QFT designs that employ sequential loop closure. Although discussed as part of a 2 x 2 design, the PDT is obviously applicable to designs with a greater number of inputs and system responses.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolme, David S; Mikkilineni, Aravind K; Rose, Derek C

    Analog computational circuits have been demonstrated to provide substantial improvements in power and speed relative to digital circuits, especially for applications requiring extreme parallelism but only modest precision. Deep machine learning is one such area and stands to benefit greatly from analog and mixed-signal implementations. However, even at modest precisions, offsets and non-linearity can degrade system performance. Furthermore, in all but the simplest systems, it is impossible to directly measure the intermediate outputs of all sub-circuits. The result is that circuit designers are unable to accurately evaluate the non-idealities of computational circuits in-situ and are therefore unable to fully utilize measurement results to improve future designs. In this paper we present a technique to use deep learning frameworks to model physical systems. Recently developed libraries like TensorFlow make it possible to use back propagation to learn parameters in the context of modeling circuit behavior. Offsets and scaling errors can be discovered even for sub-circuits that are deeply embedded in a computational system and not directly observable. The learned parameters can be used to refine simulation methods or to identify appropriate compensation strategies. We demonstrate the framework using a mixed-signal convolution operator as an example circuit.
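
    The core idea, treating unknown offsets and gains of embedded sub-circuits as trainable parameters, can be sketched in a few lines of TensorFlow: an idealized model of the stage is wrapped in learnable offset and scale variables, and back propagation fits those variables to the measured outputs. The toy single-stage version below uses synthetic data; the affine error model and all constants are assumptions for illustration, not the circuit from the paper.

        # Toy sketch: recover an analog stage's gain and offset errors by back propagation.
        # The affine error model and the synthetic data are illustrative assumptions.
        import numpy as np
        import tensorflow as tf

        ideal_in = np.linspace(-1.0, 1.0, 200).astype(np.float32)
        # "Measured" outputs of a stage with a hidden gain error, an offset, and noise.
        measured = (0.93 * ideal_in + 0.05
                    + np.random.normal(0.0, 0.01, ideal_in.shape).astype(np.float32))

        gain = tf.Variable(1.0)    # learnable estimate of the stage gain
        offset = tf.Variable(0.0)  # learnable estimate of the stage offset
        opt = tf.keras.optimizers.SGD(learning_rate=0.1)

        for _ in range(500):
            with tf.GradientTape() as tape:
                pred = gain * ideal_in + offset          # simple model of the sub-circuit
                loss = tf.reduce_mean((pred - measured) ** 2)
            grads = tape.gradient(loss, [gain, offset])
            opt.apply_gradients(zip(grads, [gain, offset]))

        print(f"estimated gain {gain.numpy():.3f}, estimated offset {offset.numpy():.3f}")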

  17. An Automated Motion Detection and Reward System for Animal Training.

    PubMed

    Miller, Brad; Lim, Audrey N; Heidbreder, Arnold F; Black, Kevin J

    2015-12-04

    A variety of approaches has been used to minimize head movement during functional brain imaging studies in awake laboratory animals. Many laboratories expend substantial effort and time training animals to remain essentially motionless during such studies. We could not locate an "off-the-shelf" automated training system that suited our needs.  We developed a time- and labor-saving automated system to train animals to hold still for extended periods of time. The system uses a personal computer and modest external hardware to provide stimulus cues, monitor movement using commercial video surveillance components, and dispense rewards. A custom computer program automatically increases the motionless duration required for rewards based on performance during the training session but allows changes during sessions. This system was used to train cynomolgus monkeys (Macaca fascicularis) for awake neuroimaging studies using positron emission tomography (PET) and functional magnetic resonance imaging (fMRI). The automated system saved the trainer substantial time, presented stimuli and rewards in a highly consistent manner, and automatically documented training sessions. We have limited data to prove the training system's success, drawn from the automated records during training sessions, but we believe others may find it useful. The system can be adapted to a range of behavioral training/recording activities for research or commercial applications, and the software is freely available for non-commercial use.
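
    The adaptive element of such a trainer reduces to a simple rule: require a motionless period, reward success, and lengthen the required period as performance improves. The loop below shows one plausible version of that rule with a simulated motion flag; the timing constants and adaptation step are assumptions, not the published protocol.

        # Sketch of an adaptive hold-still trainer: reward motionless periods and
        # lengthen the requirement after each success. All constants are assumptions.
        import random

        def motion_detected():
            """Stand-in for a per-second motion flag from the video surveillance hardware."""
            return random.random() < 0.05

        required_hold = 2.0     # seconds of stillness required for a reward (assumed start)
        step_up = 0.5           # lengthen the requirement after each success (assumed)

        for trial in range(20):
            held = 0.0
            while held < required_hold:
                if motion_detected():
                    held = 0.0          # any movement resets the motionless timer
                else:
                    held += 1.0         # one simulated second of stillness accumulated
            print(f"trial {trial:2d}: reward dispensed after a {required_hold:.1f} s hold")
            required_hold += step_up    # make the next trial slightly harder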

  18. Robust Multivariable Optimization and Performance Simulation for ASIC Design

    NASA Technical Reports Server (NTRS)

    DuMonthier, Jeffrey; Suarez, George

    2013-01-01

    Application-specific-integrated-circuit (ASIC) design for space applications involves multiple challenges of maximizing performance, minimizing power, and ensuring reliable operation in extreme environments. This is a complex multidimensional optimization problem, which must be solved early in the development cycle of a system due to the time required for testing and qualification severely limiting opportunities to modify and iterate. Manual design techniques, which generally involve simulation at one or a small number of corners with a very limited set of simultaneously variable parameters in order to make the problem tractable, are inefficient and not guaranteed to achieve the best possible results within the performance envelope defined by the process and environmental requirements. What is required is a means to automate design parameter variation, allow the designer to specify operational constraints and performance goals, and to analyze the results in a way that facilitates identifying the tradeoffs defining the performance envelope over the full set of process and environmental corner cases. The system developed by the Mixed Signal ASIC Group (MSAG) at the Goddard Space Flight Center is implemented as a framework of software modules, templates, and function libraries. It integrates CAD tools and a mathematical computing environment, and can be customized for new circuit designs with only a modest amount of effort as most common tasks are already encapsulated. Customization is required for simulation test benches to determine performance metrics and for cost function computation.

  19. The prediction of satellite ephemeris errors as they result from surveillance system measurement errors

    NASA Astrophysics Data System (ADS)

    Simmons, B. E.

    1981-08-01

    This report derives equations predicting satellite ephemeris error as a function of measurement errors of space-surveillance sensors. These equations lend themselves to rapid computation with modest computer resources. They are applicable over prediction times such that measurement errors, rather than uncertainties of atmospheric drag and of Earth shape, dominate in producing ephemeris error. This report describes the specialization of these equations underlying the ANSER computer program, SEEM (Satellite Ephemeris Error Model). The intent is that this report be of utility to users of SEEM for interpretive purposes, and to computer programmers who may need a mathematical point of departure for limited generalization of SEEM.

  20. Temperamental factors in remitted depression: The role of effortful control and attentional mechanisms.

    PubMed

    Marchetti, Igor; Shumake, Jason; Grahek, Ivan; Koster, Ernst H W

    2018-08-01

    Temperamental effortful control and attentional networks are increasingly viewed as important underlying processes in depression and anxiety. However, it is still unknown whether these factors facilitate depressive and anxiety symptoms in the general population and, more specifically, in remitted depressed individuals. We investigated to what extent effortful control and attentional networks (i.e., Attention Network Task) explain concurrent depressive and anxious symptoms in healthy individuals (n = 270) and remitted depressed individuals (n = 90). Both samples were highly representative of the US population. Increased effortful control predicted a substantial decrease in symptoms of both depression and anxiety in the whole sample, whereas decreased efficiency of executive attention predicted a modest increase in depressive symptoms. Remitted depressed individuals did not show less effortful control nor less efficient attentional networks than healthy individuals. Moreover, clinical status did not moderate the relationship between temperamental factors and either depressive or anxiety symptoms. Limitations include the cross-sectional nature of the study. Our study shows that temperamental effortful control represents an important transdiagnostic process for depressive and anxiety symptoms in adults. Copyright © 2018 Elsevier B.V. All rights reserved.

  1. Design synthesis and optimization of permanent magnet synchronous machines based on computationally-efficient finite element analysis

    NASA Astrophysics Data System (ADS)

    Sizov, Gennadi Y.

    In this dissertation, a model-based multi-objective optimal design of permanent magnet ac machines, supplied by sine-wave current regulated drives, is developed and implemented. The design procedure uses an efficient electromagnetic finite element-based solver to accurately model nonlinear material properties and complex geometric shapes associated with magnetic circuit design. Application of an electromagnetic finite element-based solver allows for accurate computation of intricate performance parameters and characteristics. The first contribution of this dissertation is the development of a rapid computational method that allows accurate and efficient exploration of large multi-dimensional design spaces in search of optimum design(s). The computationally efficient finite element-based approach developed in this work provides a framework of tools that allow rapid analysis of synchronous electric machines operating under steady-state conditions. In the developed modeling approach, major steady-state performance parameters such as, winding flux linkages and voltages, average, cogging and ripple torques, stator core flux densities, core losses, efficiencies and saturated machine winding inductances, are calculated with minimum computational effort. In addition, the method includes means for rapid estimation of distributed stator forces and three-dimensional effects of stator and/or rotor skew on the performance of the machine. The second contribution of this dissertation is the development of the design synthesis and optimization method based on a differential evolution algorithm. The approach relies on the developed finite element-based modeling method for electromagnetic analysis and is able to tackle large-scale multi-objective design problems using modest computational resources. Overall, computational time savings of up to two orders of magnitude are achievable, when compared to current and prevalent state-of-the-art methods. These computational savings allow one to expand the optimization problem to achieve more complex and comprehensive design objectives. The method is used in the design process of several interior permanent magnet industrial motors. The presented case studies demonstrate that the developed finite element-based approach practically eliminates the need for using less accurate analytical and lumped parameter equivalent circuit models for electric machine design optimization. The design process and experimental validation of the case-study machines are detailed in the dissertation.
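
    Differential evolution of the kind described maintains a population of candidate designs, builds trial designs by adding a scaled difference of two population members to a third, and accepts a trial whenever it improves the objective. The short DE/rand/1/bin loop below uses a stand-in analytic objective; in the dissertation each evaluation would instead call the finite-element machine model, and the bounds and control constants here are illustrative assumptions.

        # Minimal differential evolution (DE/rand/1/bin) sketch on a stand-in objective.
        # In the design setting, objective() would call the finite-element machine model.
        import numpy as np

        rng = np.random.default_rng(0)

        def objective(x):
            """Placeholder design objective (sphere function); lower is better."""
            return float(np.sum(x**2))

        dim, pop_size, n_gen = 4, 20, 200
        F, CR = 0.8, 0.9                    # mutation factor and crossover rate (assumed)
        lo, hi = -5.0, 5.0                  # illustrative design-variable bounds

        pop = rng.uniform(lo, hi, size=(pop_size, dim))
        cost = np.array([objective(x) for x in pop])

        for _ in range(n_gen):
            for i in range(pop_size):
                idx = [j for j in range(pop_size) if j != i]
                a, b, c = pop[rng.choice(idx, 3, replace=False)]
                mutant = np.clip(a + F * (b - c), lo, hi)
                cross = rng.random(dim) < CR
                cross[rng.integers(dim)] = True          # ensure at least one gene crosses
                trial = np.where(cross, mutant, pop[i])
                trial_cost = objective(trial)
                if trial_cost < cost[i]:                 # greedy one-to-one selection
                    pop[i], cost[i] = trial, trial_cost

        print(f"best cost {cost.min():.4e} at design {pop[np.argmin(cost)]}")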

  2. An Examination of Unsteady Airloads on a UH-60A Rotor: Computation Versus Measurement

    NASA Technical Reports Server (NTRS)

    Biedron, Robert T.; Lee-Rausch, Elizabeth

    2012-01-01

    An unsteady Reynolds-averaged Navier-Stokes solver for unstructured grids is used to simulate the flow over a UH-60A rotor. Traditionally, the computed pressure and shear stresses are integrated on the computational mesh at selected radial stations and compared to measured airloads. However, the corresponding integration of experimental data uses only the pressure contribution, and the set of integration points (pressure taps) is modest compared to the computational mesh resolution. This paper examines the difference between the traditional integration of computed airloads and an integration consistent with that used for the experimental data. In addition, a comparison of chordwise pressure distributions between computation and measurement is made. Examination of this unsteady pressure data provides new opportunities to understand differences between computation and flight measurement.

  3. Suspension Parameter Measurements of Wheeled Military Vehicles

    DTIC Science & Technology

    2012-08-01

    suspension through the wheel pads. The SPIdER was designed so that in the future, with a modest amount of modification, it can be upgraded to include the... was built to measure the suspension parameters of any military wheeled vehicle. This is part of an ongoing effort to model and predict vehicle

  4. Reform in Secondary Education: The Continuing Efforts to Reform Secondary Education, and a Modest Proposal. Curriculum Bulletin Vol. XXXII, No. 340.

    ERIC Educational Resources Information Center

    Saylor, Galen

    The author begins by examining the functions of the school and the basic principles governing the provision of education in the American democracy as a way of providing a framework for analyzing proposals for the reform of secondary education. He then examines proposals for reform. His major focus is on ten proposals made by agencies,…

  5. Sensitivity of Pliocene Arctic climate to orbital forcing, atmospheric CO2 and sea ice albedo parameterisation

    USGS Publications Warehouse

    Howell, Fergus W.; Haywood, Alan M.; Dowsett, Harry J.; Pickering, Steven J.

    2016-01-01

    With varying CO2, orbit and sea ice albedo values we are able to reproduce proxy temperature records that lean towards modest levels of high latitude warming, but other proxy data showing greater warming remain beyond the reach of our model. This highlights the importance of additional proxy records at high latitudes and ongoing efforts to compare proxy signals between sites.

  6. The Chinese Armed Forces in the 21st Century

    DTIC Science & Technology

    1999-12-01

    economic growth. Military vs. Economic Considerations. Without saying so, China recognizes that the real potential for trouble on the...only making a modest effort to exploit the RMA. "How to balance investment in the present vs. future was the fundamental contradiction facing the U.S...zhuangjiabing wuqizhuangbei fazhan de huigu yu zhanwang" ("Research and Development of Armour"), in Huitou yu zhanwang (Retrospect and Prospect: Chinese

  7. 5 CFR 630.310 - Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... determined necessary for Year 2000 computer conversion efforts. 630.310 Section 630.310 Administrative... Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion efforts. (a) Year 2000 computer conversion efforts are deemed to be an exigency of the public business for the...

  8. 5 CFR 630.310 - Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... determined necessary for Year 2000 computer conversion efforts. 630.310 Section 630.310 Administrative... Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion efforts. (a) Year 2000 computer conversion efforts are deemed to be an exigency of the public business for the...

  9. 5 CFR 630.310 - Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... determined necessary for Year 2000 computer conversion efforts. 630.310 Section 630.310 Administrative... Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion efforts. (a) Year 2000 computer conversion efforts are deemed to be an exigency of the public business for the...

  10. 5 CFR 630.310 - Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... determined necessary for Year 2000 computer conversion efforts. 630.310 Section 630.310 Administrative... Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion efforts. (a) Year 2000 computer conversion efforts are deemed to be an exigency of the public business for the...

  11. 5 CFR 630.310 - Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... determined necessary for Year 2000 computer conversion efforts. 630.310 Section 630.310 Administrative... Scheduling of annual leave by employees determined necessary for Year 2000 computer conversion efforts. (a) Year 2000 computer conversion efforts are deemed to be an exigency of the public business for the...

  12. Hybrid Methods in Quantum Information

    NASA Astrophysics Data System (ADS)

    Marshall, Kevin

    Today, the potential power of quantum information processing comes as no surprise to physicist or science-fiction writer alike. However, the grand promises of this field remain unrealized, despite significant strides forward, due to the inherent difficulties of manipulating quantum systems. Simply put, it turns out that it is incredibly difficult to interact, in a controllable way, with the quantum realm when we seem to live our day to day lives in a classical world. In an effort to solve this challenge, people are exploring a variety of different physical platforms, each with their strengths and weaknesses, in hopes of developing new experimental methods that one day might allow us to control a quantum system. One path forward rests in combining different quantum systems in novel ways to exploit the benefits of different systems while circumventing their respective weaknesses. In particular, quantum systems come in two different flavours: either discrete-variable systems or continuous-variable ones. The field of hybrid quantum information seeks to combine these systems, in clever ways, to help overcome the challenges blocking the path between what is theoretically possible and what is achievable in a laboratory. In this thesis we explore four topics in the context of hybrid methods in quantum information, in an effort to contribute to the resolution of existing challenges and to stimulate new avenues of research. First, we explore the manipulation of a continuous-variable quantum system consisting of phonons in a linear chain of trapped ions where we use the discretized internal levels to mediate interactions. Using our proposed interaction we are able to implement, for example, the acoustic equivalent of a beam splitter with modest experimental resources. Next we propose an experimentally feasible implementation of the cubic phase gate, a primitive non-Gaussian gate required for universal continuous-variable quantum computation, based off sequential photon subtraction. We then discuss the notion of embedding a finite dimensional state into a continuous-variable system, and propose a method of performing quantum computations on encrypted continuous-variable states. This protocol allows for a client, of limited quantum ability, to outsource a computation while hiding their information. Next, we discuss the possibility of performing universal quantum computation on discrete-variable logical states encoded in mixed continuous-variable quantum states. Finally, we present an account of open problems related to our results, and possible future avenues of research.

  13. Cloud Computing with iPlant Atmosphere.

    PubMed

    McKay, Sheldon J; Skidmore, Edwin J; LaRose, Christopher J; Mercer, Andre W; Noutsos, Christos

    2013-10-15

    Cloud Computing refers to distributed computing platforms that use virtualization software to provide easy access to physical computing infrastructure and data storage, typically administered through a Web interface. Cloud-based computing provides access to powerful servers, with specific software and virtual hardware configurations, while eliminating the initial capital cost of expensive computers and reducing the ongoing operating costs of system administration, maintenance contracts, power consumption, and cooling. This eliminates a significant barrier to entry into bioinformatics and high-performance computing for many researchers. This is especially true of free or modestly priced cloud computing services. The iPlant Collaborative offers a free cloud computing service, Atmosphere, which allows users to easily create and use instances on virtual servers preconfigured for their analytical needs. Atmosphere is a self-service, on-demand platform for scientific computing. This unit demonstrates how to set up, access and use cloud computing in Atmosphere. Copyright © 2013 John Wiley & Sons, Inc.

  14. Children's beliefs about causes of childhood depression and ADHD: a study of stigmatization.

    PubMed

    Coleman, Daniel; Walker, Janet S; Lee, Junghee; Friesen, Barbara J; Squire, Peter N

    2009-07-01

    Children's causal attributions about childhood mental health problems were examined in a national sample for prevalence; relative stigmatization; variation by age, race and ethnicity, and gender; and self-report of a diagnosis of depression or attention-deficit hyperactivity disorder (ADHD). A national sample of 1,091 children were randomly assigned to read vignettes about a peer with depression, ADHD, or asthma and respond to an online survey. Causal attributions and social distance were assessed, and correlations were examined. Logistic regression models for each causal item tested main effects and interaction terms for conditions, demographic characteristics, and self-reported diagnosis. The beliefs that parenting, substance abuse, and low effort caused the condition were all strongly intercorrelated and were moderately correlated with social distance. The depression condition was the strongest predictor of endorsement of the most stigmatizing causal beliefs. Stigmatizing causal beliefs were evident for ADHD, but with more modest effects. Children who reported a diagnosis were more likely to endorse parenting and substance abuse as causes (attenuated for ADHD). Modest to moderate effects were found for variation in causal beliefs across ethnic groups. This study demonstrated a consistent presence of stigmatization in children's beliefs about the causes of childhood mental health problems. Low effort, parenting, and substance abuse together tapped a moralistic and blaming view of mental health problems. The results reinforce the need to address stigmatization of mental disorders and the relative stigmatization of different causal beliefs. The findings of variation by ethnicity and diagnosis can inform and target antistigmatization efforts.

  15. A DNA sequence analysis package for the IBM personal computer.

    PubMed Central

    Lagrimini, L M; Brentano, S T; Donelson, J E

    1984-01-01

    We present here a collection of DNA sequence analysis programs, called "PC Sequence" (PCS), which are designed to run on the IBM Personal Computer (PC). These programs are written in IBM PC compiled BASIC and take full advantage of the IBM PC's speed, error handling, and graphics capabilities. For a modest initial expense in hardware any laboratory can use these programs to quickly perform computer analysis on DNA sequences. They are written with the novice user in mind and require very little training or previous experience with computers. Also provided are a text editing program for creating and modifying DNA sequence files and a communications program which enables the PC to communicate with and collect information from mainframe computers and DNA sequence databases. PMID:6546433

  16. Model studies of laser absorption computed tomography for remote air pollution measurement

    NASA Technical Reports Server (NTRS)

    Wolfe, D. C., Jr.; Byer, R. L.

    1982-01-01

    Model studies of the potential of laser absorption-computed tomography are presented which demonstrate the possibility of sensitive remote atmospheric pollutant measurements, over kilometer-sized areas, with two-dimensional resolution, at modest laser source powers. An analysis of this tomographic reconstruction process as a function of measurement SNR, laser power, range, and system geometry, shows that the system is able to yield two-dimensional maps of pollutant concentrations at ranges and resolutions superior to those attainable with existing, direct-detection laser radars.

  17. Toward a Virtual Solar Observatory: Starting Before the Petabytes Fall

    NASA Technical Reports Server (NTRS)

    Gurman, J. B.; Fisher, Richard R. (Technical Monitor)

    2002-01-01

    NASA is currently engaged in the study phase of a modest effort to establish a Virtual Solar Observatory (VSO). The VSO would serve ground- and space-based solar physics data sets from a distributed network of archives through a small number of interfaces to the scientific community. The basis of this approach, as of all planned virtual observatories, is the translation of metadata from the various sources via source-specific dictionaries so the user will not have to distinguish among keyword usages. A single Web interface should give access to all the distributed data. We present the current status of the VSO, its initial scope, and its relation to the European EGSO effort.

  18. Monte Carlo MCNP-4B-based absorbed dose distribution estimates for patient-specific dosimetry.

    PubMed

    Yoriyaz, H; Stabin, M G; dos Santos, A

    2001-04-01

    This study was intended to verify the capability of the Monte Carlo MCNP-4B code to evaluate spatial dose distribution based on information gathered from CT or SPECT. A new three-dimensional (3D) dose calculation approach for internal emitter use in radioimmunotherapy (RIT) was developed using the Monte Carlo MCNP-4B code as the photon and electron transport engine. It was shown that the MCNP-4B computer code can be used with voxel-based anatomic and physiologic data to provide 3D dose distributions. This study showed that the MCNP-4B code can be used to develop a treatment planning system that will provide such information in a timely manner, if dose reporting is suitably optimized. If each organ is divided into small regions where the average energy deposition is calculated with a typical volume of 0.4 cm(3), regional dose distributions can be provided with reasonable central processing unit times (on the order of 12-24 h on a 200-MHz personal computer or modest workstation). Further efforts to provide semiautomated region identification (segmentation) and improvement of marrow dose calculations are needed to supply a complete system for RIT. It is envisioned that all such efforts will continue to develop and that internal dose calculations may soon be brought to a similar level of accuracy, detail, and robustness as is commonly expected in external dose treatment planning. For this study we developed a code with a user-friendly interface that works on several nuclear medicine imaging platforms and provides timely patient-specific dose information to the physician and medical physicist. Future therapy with internal emitters should use a 3D dose calculation approach, which represents a significant advance over dose information provided by the standard geometric phantoms used for more than 20 y (which permit reporting of only average organ doses for certain standardized individuals).

  19. Clean fuels from biomass. [cellulose fermentation to methane

    NASA Technical Reports Server (NTRS)

    Hsu, Y. Y.

    1974-01-01

    The potential of growing crops as a source of fuels is examined, and it is shown that enough arable land is available in the U.S. so that, even with a modest rate of crop yield, the nation could be supplied by fuel crops. The technologies for fuel conversion are available; however, some R&D efforts are needed for scaling up design. Fuel crop economics are discussed and shown to be nonprohibitive.

  20. An Attempt to Observe Debris from the Breakup of a Titan 3C-4 Transtage

    NASA Technical Reports Server (NTRS)

    Barker, E. S.; Matney, M. J.; Yanagisawa, T.; Liou, J.-C.; Abercromby, K. J.; Rodriquez, H. M.; Horstman, M. F.; Seitzer, P.

    2007-01-01

    In February 2007 dedicated observations were made of the orbital space predicted to contain debris from the breakup of the Titan 3C-4 transtage back on February 21, 1992. These observations were carried out on the Michigan Orbital DEbris Survey Telescope (MODEST) in Chile with its 1.3° field of view. The search region or orbital space (inclination and right ascension of the ascending node, RAAN) was predicted using NASA's LEGEND (LEO-to-GEO Environment Debris) code to generate a Titan debris cloud. Breakup fragments are created based on the NASA Standard Breakup Model (including fragment size, area-to-mass (A/M), and delta-V distributions). Once fragments are created, they are propagated forward in time with a subroutine GEOPROP. Perturbations included in GEOPROP are those due to solar/lunar gravity, radiation pressure, and major geopotential terms. Barker et al. (AMOS Conference Proceedings, 2006, pp. 596-604) used similar LEGEND predictions to correlate survey observations made by MODEST (February 2002) and found several possible night-to-night correlations in the limited survey dataset. One conclusion of the survey search was to dedicate a MODEST run to observing a GEO region predicted to contain debris fragments and actual Titan debris objects (SSN 25000, 25001 and 30000). Such a dedicated run was undertaken with MODEST between February 17 and 23, 2007 (UT dates). MODEST's limiting magnitude of 18.0 (S/N approx. 10) corresponds to a size of 22 cm assuming a diffuse Lambertian albedo of 0.2. However, based on observed break-up data, we expect most debris fragments to be smaller than 22 cm, which implies a need to increase the effective sensitivity of MODEST for smaller objects. MODEST's limiting size can be lowered by increasing the exposure time (20 instead of 5 seconds) and applying special image processing. The special processing combines individual CCD images to detect faint objects that are invisible on a single CCD image. Sub-images are cropped from six consecutive CCD images with pixel shifts between images being consistent with the predicted movement of a Titan object. A median image of all the sub-images is then created leaving only those objects with the proper Titan motion. Limiting the median image in this manner brings the needed computer time to process all images taken on one night down to about 50 hours of CPU time.
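
    The special processing described at the end, shifting consecutive frames by the predicted object motion and taking a per-pixel median, can be sketched in a few lines: a mover at the assumed rate adds coherently while stars and noise are suppressed. The code below shows the core shift-and-median step on synthetic frames; the array sizes, the shift per frame, and the synthetic scene are assumptions for illustration, not the MODEST pipeline.

        # Shift-and-stack median sketch: align frames along an assumed object motion so a
        # faint mover survives the median while static stars and noise are suppressed.
        import numpy as np

        rng = np.random.default_rng(1)
        n_frames, size = 6, 64
        shift_per_frame = 2                                   # pixels per frame (assumed rate)

        frames = rng.normal(0.0, 1.0, (n_frames, size, size)) # background noise
        frames[:, 20, 30] += 20.0                             # a bright static star
        for k in range(n_frames):
            frames[k, 40, 10 + k * shift_per_frame] += 3.0    # faint object moving in x

        # Shift each frame back by the predicted motion, then take the per-pixel median.
        aligned = np.stack([np.roll(frames[k], -k * shift_per_frame, axis=1)
                            for k in range(n_frames)])
        stacked = np.median(aligned, axis=0)

        print("signal at the mover's aligned position:", round(float(stacked[40, 10]), 2))
        print("residual at the (smeared-out) star position:", round(float(stacked[20, 30]), 2))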

  1. Silicon graphene waveguide tunable broadband microwave photonics phase shifter.

    PubMed

    Capmany, José; Domenech, David; Muñoz, Pascual

    2014-04-07

    We propose the use of silicon graphene waveguides to implement a tunable broadband microwave photonics phase shifter based on integrated ring cavities. Numerical computation results show the feasibility for broadband operation over 40 GHz bandwidth and full 360° radiofrequency phase-shift with a modest voltage excursion of 0.12 volt.

  2. Hemodialysis Catheter Heat Transfer for Biofilm Prevention and Treatment.

    PubMed

    Richardson, Ian P; Sturtevant, Rachael; Heung, Michael; Solomon, Michael J; Younger, John G; VanEpps, J Scott

    2016-01-01

    Central line-associated bloodstream infections (CLABSIs) are not easily treated, and many catheters (e.g., hemodialysis catheters) are not easily replaced. Biofilms (the source of infection) on catheter surfaces are notoriously difficult to eradicate. We have recently demonstrated that modest elevations of temperature lead to increased staphylococcal susceptibility to vancomycin and significantly soften the biofilm matrix. In this study, using a combination of microbiological, computational, and experimental studies, we demonstrate the efficacy, feasibility, and safety of using heat as an adjuvant treatment for infected hemodialysis catheters. Specifically, we show that treating with heat in the presence of antibiotics led to additive killing of Staphylococcus epidermidis with similar trends seen for Staphylococcus aureus and Klebsiella pneumoniae. The magnitude of temperature elevation required is relatively modest (45-50°C) and similar to that used as an adjuvant to traditional cancer therapy. Using a custom-designed benchtop model of a hemodialysis catheter, positioned with tip in the human vena cava as well as computational fluid dynamic simulations, we demonstrate that these temperature elevations are likely achievable in situ with minimal increased in overall blood temperature.

  3. Data Processing Factory for the Sloan Digital Sky Survey

    NASA Astrophysics Data System (ADS)

    Stoughton, Christopher; Adelman, Jennifer; Annis, James T.; Hendry, John; Inkmann, John; Jester, Sebastian; Kent, Steven M.; Kuropatkin, Nickolai; Lee, Brian; Lin, Huan; Peoples, John, Jr.; Sparks, Robert; Tucker, Douglas; Vanden Berk, Dan; Yanny, Brian; Yocum, Dan

    2002-12-01

    The Sloan Digital Sky Survey (SDSS) data handling presents two challenges: large data volume and timely production of spectroscopic plates from imaging data. A data processing factory, using technologies both old and new, handles this flow. Distribution to end users is via disk farms, to serve corrected images and calibrated spectra, and a database, to efficiently process catalog queries. For distribution of modest amounts of data from Apache Point Observatory to Fermilab, scripts use rsync to update files, while larger data transfers are accomplished by shipping magnetic tapes commercially. All data processing pipelines are wrapped in scripts to address consecutive phases: preparation, submission, checking, and quality control. We constructed the factory by chaining these pipelines together while using an operational database to hold processed imaging catalogs. The science database catalogs all imaging and spectroscopic objects, with pointers to the various external files associated with them. Diverse computing systems address particular processing phases. UNIX computers handle tape reading and writing, as well as calibration steps that require access to a large amount of data with relatively modest computational demands. Commodity CPUs process steps that require access to a limited amount of data with more demanding computational requirements. Disk servers optimized for cost per Gbyte serve terabytes of processed data, while servers optimized for disk read speed run SQLServer software to process queries on the catalogs. This factory produced data for the SDSS Early Data Release in June 2001, and it is currently producing Data Release One, scheduled for January 2003.

  4. Rapid and accurate species tree estimation for phylogeographic investigations using replicated subsampling.

    PubMed

    Hird, Sarah; Kubatko, Laura; Carstens, Bryan

    2010-11-01

    We describe a method for estimating species trees that relies on replicated subsampling of large data matrices. One application of this method is phylogeographic research, which has long depended on large datasets that sample intensively from the geographic range of the focal species; these datasets allow systematicists to identify cryptic diversity and understand how contemporary and historical landscape forces influence genetic diversity. However, analyzing any large dataset can be computationally difficult, particularly when newly developed methods for species tree estimation are used. Here we explore the use of replicated subsampling, a potential solution to the problem posed by large datasets, with both a simulation study and an empirical analysis. In the simulations, we sample different numbers of alleles and loci, estimate species trees using STEM, and compare the estimated to the actual species tree. Our results indicate that subsampling three alleles per species for eight loci nearly always results in an accurate species tree topology, even in cases where the species tree was characterized by extremely rapid divergence. Even more modest subsampling effort, for example one allele per species and two loci, was more likely than not (>50%) to identify the correct species tree topology, indicating that in nearly all cases, computing the majority-rule consensus tree from replicated subsampling provides a good estimate of topology. These results were supported by estimating the correct species tree topology and reasonable branch lengths for an empirical 10-locus great ape dataset. Copyright © 2010 Elsevier Inc. All rights reserved.
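
    The replicated subsampling procedure itself is easy to express: repeatedly draw a few alleles per species and a few loci from the full matrix, estimate a species tree from each subsample, and keep the topology that appears most often (or build a majority-rule consensus) as the final estimate. The sketch below shows that bookkeeping with a placeholder standing in for the STEM estimation step; the data layout and the placeholder estimator are assumptions for illustration.

        # Replicated-subsampling sketch: estimate a tree on many small subsamples and
        # keep the most frequent topology. estimate_tree() is a placeholder for STEM.
        import random
        from collections import Counter

        random.seed(0)

        species = ["sp_A", "sp_B", "sp_C", "sp_D"]
        # Assumed layout: alleles[species][locus] is the list of sequenced alleles.
        alleles = {sp: {locus: [f"{sp}_allele{i}" for i in range(10)]
                        for locus in range(20)} for sp in species}

        def draw_subsample(n_alleles=3, n_loci=8):
            loci = random.sample(range(20), n_loci)
            return {sp: {l: random.sample(alleles[sp][l], n_alleles) for l in loci}
                    for sp in species}

        def estimate_tree(subsample):
            """Placeholder for a coalescent species-tree estimator such as STEM."""
            return ("((sp_A,sp_B),(sp_C,sp_D));" if random.random() < 0.8
                    else "((sp_A,sp_C),(sp_B,sp_D));")

        topologies = Counter(estimate_tree(draw_subsample()) for _ in range(100))
        topology, count = topologies.most_common(1)[0]
        print(f"most frequent topology ({count}/100 replicates): {topology}")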

  5. Observed differences in upper extremity forces, muscle efforts, postures, velocities and accelerations across computer activities in a field study of office workers.

    PubMed

    Bruno Garza, J L; Eijckelhof, B H W; Johnson, P W; Raina, S M; Rynell, P W; Huysmans, M A; van Dieën, J H; van der Beek, A J; Blatter, B M; Dennerlein, J T

    2012-01-01

    This study, a part of the PRedicting Occupational biomechanics in OFfice workers (PROOF) study, investigated whether there are differences in field-measured forces, muscle efforts, postures, velocities and accelerations across computer activities. These parameters were measured continuously for 120 office workers performing their own work for two hours each. There were differences in nearly all forces, muscle efforts, postures, velocities and accelerations across keyboard, mouse and idle activities. Keyboard activities showed a 50% increase in the median right trapezius muscle effort when compared to mouse activities. Median shoulder rotation changed from 25 degrees internal rotation during keyboard use to 15 degrees external rotation during mouse use. Only keyboard use was associated with median ulnar deviations greater than 5 degrees. Idle activities led to the greatest variability observed in all muscle efforts and postures measured. In future studies, measurements of computer activities could be used to provide information on the physical exposures experienced during computer use. Practitioner Summary: Computer users may develop musculoskeletal disorders due to their force, muscle effort, posture and wrist velocity and acceleration exposures during computer use. We report that many physical exposures are different across computer activities. This information may be used to estimate physical exposures based on patterns of computer activities over time.

  6. Computation of an Underexpanded 3-D Rectangular Jet by the CE/SE Method

    NASA Technical Reports Server (NTRS)

    Loh, Ching Y.; Himansu, Ananda; Wang, Xiao Y.; Jorgenson, Philip C. E.

    2000-01-01

    Recently, an unstructured three-dimensional space-time conservation element and solution element (CE/SE) Euler solver was developed. It has now also been developed for parallel computation, using METIS for domain decomposition and MPI (message passing interface). The method is employed here to numerically study the near-field of a typical 3-D rectangular under-expanded jet. For the computed case, a jet with Mach number Mj = 1.6, with a very modest grid of 1.7 million tetrahedrons, flow features such as the shock-cell structures and the axis switching are in good qualitative agreement with experimental results.

  7. Finite difference time domain electromagnetic scattering from frequency-dependent lossy materials

    NASA Technical Reports Server (NTRS)

    Luebbers, Raymond J.; Beggs, John H.

    1991-01-01

    Four different FDTD computer codes and companion Radar Cross Section (RCS) conversion codes on magnetic media are submitted. A single three dimensional dispersive FDTD code for both dispersive dielectric and magnetic materials was developed, along with a user's manual. The extension of FDTD to more complicated materials was made. The code is efficient and is capable of modeling interesting radar targets using a modest computer workstation platform. RCS results for two different plate geometries are reported. The FDTD method was also extended to computing far zone time domain results in two dimensions. Also the capability to model nonlinear materials was incorporated into FDTD and validated.

  8. Decentralized coordinated control of elastic web winding systems without tension sensor.

    PubMed

    Hou, Hailiang; Nian, Xiaohong; Chen, Jie; Xiao, Dengfeng

    2018-06-26

    In elastic web winding systems, precise regulation of web tension in each span is critical to ensure final product quality, and to achieve low cost by reducing the occurrence of web break or fold. Generally, web winding systems use load cells or swing rolls as tension sensors, which add cost, reduce system reliability and increase the difficulty of control. In this paper, a decentralized coordinated control scheme with tension observers is designed for a three-motor web-winding system. First, two tension observers are proposed to estimate the unwinding and winding tension. The designed observers consider the essential dynamic, radius, and inertial variation effects and require only modest computational effort. Then, using the estimated tensions as feedback signals, a robust decentralized coordinated controller is adopted to reduce the interaction between subsystems. Asymptotic stabilities of the observer error dynamics and the closed-loop winding systems are demonstrated via Lyapunov stability theory. The observer gains and the controller gains can be obtained by solving matrix inequalities. Finally, some simulations and experiments are performed on a paper winding setup to test the performance of the designed observers and the observer-based decentralized coordinated control method, respectively. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  9. Fleet Astronomy

    NASA Astrophysics Data System (ADS)

    Klebe, D. I.; Colorado College Student Astronomy Instrument Team; Pikes Peak Observatory Team

    1999-12-01

    The Colorado College Student Astronomy Instrument Team (CCSAIT) and the Pikes Peak Observatory (PPO) present preliminary optical and mechanical designs as well as discussion on a fleet of small research-class 0.4-0.5-meter telescopes. Each telescope is being designed to accommodate a variety of visible and near-infrared instrumentation, ranging from wide-field imaging cameras to moderate resolution spectrometers. The design of these telescopes is predicated on the use of lightweight primary mirrors, which will enable the entire optical telescope assembly (OTA) including instrumentation to come in under 50 kilograms. The lightweight OTA’s will further allow the use of inexpensive high-quality off-the-shelf robotic telescope mounts for future access and computer control of these telescopes over the Internet. The basic idea is to provide astronomers with a comprehensive arsenal of modest instrumentation at their fingertips in order to conduct a wide variety of interesting scientific research programs. Some of these research programs are discussed and input from the astronomical community is strongly encouraged. Connectivity and Internet control issues are also briefly discussed as development in this area is already underway through a collaborative effort between the PPO and the Cowan-Fouts Foundation of Woodland Park, Colorado.

  10. The Future of Remote Sensing from Space: Civilian Satellite Systems and Applications.

    DTIC Science & Technology

    1993-07-01

    image shows abundant (dark green) vegetation across the Amazon of South America, while lack of vegetation (black areas) is seen across the Sahara Desert...primarily through the space shuttle and space station Freedom programs. Hence, if NASA’s overall budget remains flat or includes only modest growth... remain the primary collector of satellite remote sensing data for both meteorological and climate monitoring efforts through the decade of the 1990s

  11. Ain't no mountain high enough? Setting high weight loss goals predict effort and short-term weight loss.

    PubMed

    De Vet, Emely; Nelissen, Rob M A; Zeelenberg, Marcel; De Ridder, Denise T D

    2013-05-01

    Although psychological theories outline that it might be beneficial to set more challenging goals, people attempting to lose weight are generally recommended to set modest weight loss goals. The present study explores whether the amount of weight loss individuals strive for is associated with more positive psychological and behavioral outcomes. To this end, 447 overweight and obese participants trying to lose weight completed two questionnaires with a 2-month interval. Many participants set goals that could be considered unrealistically high. However, higher weight loss goals did not predict dissatisfaction but predicted more effort in the weight loss attempt, as well as more self-reported short-term weight loss when baseline commitment and motivation were controlled for.

  12. Reflections on a half century of injury control.

    PubMed Central

    Waller, J A

    1994-01-01

    Using both historical analysis and personal reminiscence, this article describes the development of injury control activities since about 1940, focusing particular attention on the rise and fall of the Public Health Service's Division of Accident Prevention. By the 1940s and 1950s, modest but useful efforts in injury control research and programming had been made. The 1960s and early 1970s then saw an explosion of new concepts, programs, and enthusiasm, but much of this soon dissipated. Since 1985 there has been a renaissance of interest and effort, and the development of a new cadre of injury control professionals. This progress is threatened, however, by both old and new problems. PMID:8154576

  13. Final Report for ALCC Allocation: Predictive Simulation of Complex Flow in Wind Farms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barone, Matthew F.; Ananthan, Shreyas; Churchfield, Matt

    This report documents work performed using ALCC computing resources granted under a proposal submitted in February 2016, with the resource allocation period spanning the period July 2016 through June 2017. The award allocation was 10.7 million processor-hours at the National Energy Research Scientific Computing Center. The simulations performed were in support of two projects: the Atmosphere to Electrons (A2e) project, supported by the DOE EERE office; and the Exascale Computing Project (ECP), supported by the DOE Office of Science. The project team for both efforts consists of staff scientists and postdocs from Sandia National Laboratories and the National Renewable Energy Laboratory. At the heart of these projects is the open-source computational-fluid-dynamics (CFD) code, Nalu. Nalu solves the low-Mach-number Navier-Stokes equations using an unstructured-grid discretization. Nalu leverages the open-source Trilinos solver library and the Sierra Toolkit (STK) for parallelization and I/O. This report documents baseline computational performance of the Nalu code on problems of direct relevance to the wind plant physics application - namely, Large Eddy Simulation (LES) of an atmospheric boundary layer (ABL) flow and wall-modeled LES of a flow past a static wind turbine rotor blade. Parallel performance of Nalu and its constituent solver routines residing in the Trilinos library has been assessed previously under various campaigns. However, both Nalu and Trilinos have been, and remain, in active development and resources have not been available previously to rigorously track code performance over time. With the initiation of the ECP, it is important to establish and document baseline code performance on the problems of interest. This will allow the project team to identify and target any deficiencies in performance, as well as highlight any performance bottlenecks as we exercise the code on a greater variety of platforms and at larger scales. The current study is rather modest in scale, examining performance on problem sizes of O(100 million) elements and core counts up to 8k cores. This will be expanded as more computational resources become available to the projects.

  14. When Learning Is Just a Click Away: Does Simple User Interaction Foster Deeper Understanding of Multimedia Messages?

    ERIC Educational Resources Information Center

    Mayer, Richard E.; Chandler, Paul

    2001-01-01

    In two experiments, students received two presentations of a narrated animation explaining how lightning forms, followed by retention and transfer tests. The goal was to determine possible benefits of incorporating a modest amount of computer-user interactivity within a multimedia explanation. Results were consistent with cognitive load theory and…

  15. Strategies and methodologies to develop techniques for computer-assisted analysis of gas phase formation during altitude decompression

    NASA Technical Reports Server (NTRS)

    Powell, Michael R.; Hall, W. A.

    1993-01-01

    It would be of operational significance if one possessed a device that would indicate the presence of gas phase formation in the body during hypobaric decompression. Automated analysis of Doppler gas bubble signals has been attempted for 2 decades but with generally unfavorable results, except with surgically implanted transducers. Recently, efforts have intensified with the introduction of low-cost computer programs. Current NASA work is directed towards the development of a computer-assisted method specifically targeted to EVA, and we are most interested in Spencer Grade 4. We note that Spencer Doppler Grades 1 to 3 show increased amplitude in the FFT sonogram and spectrogram, and the frequency content is sometimes increased over that created by the normal blood flow envelope. The amplitude perturbations are of very short duration, occur in both systole and diastole, and appear at random temporal positions. Grade 4 is characteristic in the amplitude domain, but with modest increases in the FFT sonogram and spectral frequency power from 2K to 4K over all of the cardiac cycle. Heart valve motion appears to display characteristic signals: (1) the demodulated Doppler signal amplitude is considerably above the Doppler-shifted blood flow signal (even Grade 4); and (2) demodulated Doppler frequency shifts are considerably greater (often several kHz) than the upper edge of the blood flow envelope. Knowledge of these facts will aid in the construction of a real-time, computer-assisted discriminator to eliminate cardiac motion artifacts. There could also exist perturbations in the following: (1) modifications of the pattern of blood flow in accordance with Poiseuille's Law, (2) flow changes with a change in the Reynolds number, (3) an increase in the pulsatility index, and/or (4) diminished diastolic flow or 'runoff.' Doppler ultrasound devices have been constructed with a three-transducer array and a pulsed frequency generator.
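
    The two signal criteria listed above suggest a simple rule-based discriminator: flag spectrogram frames whose peak power sits well above the blood-flow level and whose peak frequency lies beyond the blood-flow envelope. A minimal sketch of that idea follows; the thresholds (amp_factor, freq_margin_hz) and the function name are illustrative assumptions, not the NASA implementation.

```python
import numpy as np
from scipy.signal import spectrogram

def flag_valve_artifacts(signal, fs, flow_env_amp, flow_env_edge_hz,
                         amp_factor=3.0, freq_margin_hz=1000.0):
    """Flag spectrogram frames whose power and frequency content exceed the
    blood-flow envelope, following the two criteria quoted in the abstract.
    flow_env_amp: typical spectral power of the Doppler-shifted blood-flow signal.
    flow_env_edge_hz: upper edge of the blood-flow frequency envelope.
    amp_factor and freq_margin_hz are illustrative thresholds, not validated."""
    f, t, S = spectrogram(signal, fs=fs, nperseg=256)
    frame_amp = S.max(axis=0)                 # peak power per time frame
    peak_freq = f[S.argmax(axis=0)]           # frequency at that peak
    artifact = (frame_amp > amp_factor * flow_env_amp) & \
               (peak_freq > flow_env_edge_hz + freq_margin_hz)
    return t[artifact]                        # times of suspected valve artifacts
```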

  16. Electric utility of the year for 1984: Potomac Electric Power

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1984-11-01

    High performance, efficiency improvements, a modest construction program, a clean balance sheet, and an effort to extend power plant life were among the qualities that earned Potomac Electric Power (PEPCO) the title of 1984 Utility of the Year. Other key elements in the utility's selection were its strategy for purchasing power, a load management plan, diversified investments in subsidiary businesses, community concern that considers the aesthetics of transmission facilities, and its interest in personnel development, especially among minorities. 3 figures.

  17. Basic Skills Resource Center: The Effects of Learning Strategies; Training on the Development of Skills in English as a Second Language

    DTIC Science & Technology

    1985-05-01

    Teachers interested in helping students to become more effective learners should be aware of strategies which can be embedded in curricula and ... taught to students with only modest extra effort. Teachers can expand their instructional role to include a variety of learning strategies which ... can be used with specific types of language tasks. Future research should be directed to refining strategy training approaches, and determining ...

  18. Investigation of beam-plasma interactions

    NASA Technical Reports Server (NTRS)

    Olsen, Richard C.

    1987-01-01

    Data from the SCATHA satellite were analyzed to address the problems of establishing electrical contact between a satellite and the ambient plasma. The original focus of the work was the electron gun experiments conducted near geosynchronous orbit, which produced observations bearing a startling similarity to observations from the SEPAC experiments on SPACELAB 1. The study has evolved to include the ion gun experiments on SCATHA, a modest laboratory effort in hollow cathode performance, and preparation for flight experiments pertinent to tether technology. These areas are addressed separately.

  19. Parametric bicubic spline and CAD tools for complex targets shape modelling in physical optics radar cross section prediction

    NASA Astrophysics Data System (ADS)

    Delogu, A.; Furini, F.

    1991-09-01

    Increasing interest in radar cross section (RCS) reduction is placing new demands on theoretical, computational, and graphic techniques for calculating scattering properties of complex targets. In particular, computer codes capable of predicting the RCS of an entire aircraft at high frequency and of achieving RCS control with modest structural changes are becoming of paramount importance in stealth design. A computer code, evaluating the RCS of arbitrarily shaped metallic objects that are computer aided design (CAD) generated, and its validation with measurements carried out using ALENIA RCS test facilities are presented. The code, based on the physical optics method, is characterized by an efficient integration algorithm with error control, in order to contain the computer time within acceptable limits, and by an accurate parametric representation of the target surface in terms of bicubic splines.

  20. Sustained inflation during neonatal resuscitation.

    PubMed

    Keszler, Martin

    2015-04-01

    Sustained inflation performed shortly after birth to help clear lung fluid and establish functional residual capacity in preterm infants is gaining popularity, but definitive evidence for its effectiveness is lacking. Although there is a sound physiologic basis for this approach, and much preclinical experimental evidence of effectiveness, the results of recent animal studies and clinical trials have been inconsistent. The most recent data from a multicenter randomized trial suggest a modest benefit of sustained inflation in reducing the need for mechanical ventilation in extremely-low-birth-weight infants. However, the impact may be more modest than earlier retrospective cohort comparisons suggested. The trend toward more air leak and a higher rate of intraventricular hemorrhage is worrisome. Sustained inflation may be ineffective unless some spontaneous respiratory effort is present. Several ongoing trials should further clarify the putative benefits of sustained inflation. Delivery room sustained inflation is an attractive concept that holds much promise, but widespread clinical application should await definitive evidence from ongoing clinical trials.

  1. Strategic communication related to academic performance: Evidence from China.

    PubMed

    Zhao, Li; Chen, Lulu; He, Luwei; Heyman, Gail D

    2017-09-01

    We examined a range of forms of strategic communication relevant to academic performance among 151 seventh- and eleventh-grade adolescents in China. Participants were asked to rate the frequency of their engagement in strategic communication and to evaluate the possible motives for each strategy. The most commonly adopted strategy was to give a vague response about one's own performance, and the predominant motives for strategic communication were the desires to outcompete others, to be prosocial, and to be modest. Males were more likely than females to focus on gaining social approval, and eleventh graders were more likely than seventh graders to focus on being prosocial and modest when engaging in strategic communication. These findings provide insight into the development of strategic communication beyond Western culture. Statement of contribution What is already known on this subject? Adolescents in the West often hide their effort to appear more competent or to gain social acceptance. Little is known about other communication strategies related to academic performance. Little is known about the development of these strategies in non-Western samples. What does this study add? We show that in China, as in Western cultures, children often engage in strategic communication. We demonstrate links between different forms of strategic communication and specific motives. We demonstrate that strategic communication can be motivated by outcompeting others, by being prosocial, and by being modest. © 2017 The British Psychological Society.

  2. Skin self-examinations and visual identification of atypical nevi: comparing individual and crowdsourcing approaches.

    PubMed

    King, Andy J; Gehl, Robert W; Grossman, Douglas; Jensen, Jakob D

    2013-12-01

    Skin self-examination (SSE) is one method for identifying atypical nevi among members of the general public. Unfortunately, past research has shown that SSE has low sensitivity in detecting atypical nevi. The current study investigates whether crowdsourcing (collective effort) can improve SSE identification accuracy. Collective effort is potentially useful for improving people's visual identification of atypical nevi during SSE because, even when a single person has low reliability at a task, the pattern of the group can overcome the limitations of each individual. Adults (N=500) were recruited from a shopping mall in the Midwest. Participants viewed educational pamphlets about SSE and then completed a mole identification task. For the task, participants were asked to circle mole images that appeared atypical. Forty nevi images were provided; nine of the images were of nevi that were later diagnosed as melanoma. Consistent with past research, individual effort exhibited modest sensitivity (.58) for identifying atypical nevi in the mole identification task. As predicted, collective effort overcame the limitations of individual effort. Specifically, a 19% collective effort identification threshold exhibited superior sensitivity (.90). The results of the current study suggest that limitations of SSE can be countered by collective effort, a finding that supports the pursuit of interventions promoting early melanoma detection that contain crowdsourced visual identification components. Copyright © 2013 Elsevier Ltd. All rights reserved.
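
    The crowd decision rule described above is easy to state in code: an image counts as atypical when at least 19% of participants circled it, and sensitivity is the fraction of melanoma images that clear that bar. The sketch below is a toy illustration with simulated ratings; only the 19% threshold comes from the abstract.

```python
import numpy as np

def collective_sensitivity(circled, is_melanoma, threshold=0.19):
    """circled: boolean array (participants x images), True if a participant
    circled the image as atypical. is_melanoma: boolean array per image.
    An image is flagged by the crowd when at least `threshold` of participants
    circled it; sensitivity is the fraction of melanoma images flagged."""
    flag_rate = circled.mean(axis=0)          # proportion circling each image
    crowd_flags = flag_rate >= threshold
    return crowd_flags[is_melanoma].mean()

# Toy example: 500 raters, 40 images, 9 melanomas (all values invented).
rng = np.random.default_rng(0)
is_mel = np.zeros(40, dtype=bool); is_mel[:9] = True
circled = rng.random((500, 40)) < np.where(is_mel, 0.35, 0.10)
print(collective_sensitivity(circled, is_mel))
```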

  3. Intermediate-mass-ratio black-hole binaries: numerical relativity meets perturbation theory.

    PubMed

    Lousto, Carlos O; Nakano, Hiroyuki; Zlochower, Yosef; Campanelli, Manuela

    2010-05-28

    We study black-hole binaries in the intermediate-mass-ratio regime 0.01≲q≲0.1 with a new technique that makes use of nonlinear numerical trajectories and efficient perturbative evolutions to compute waveforms at large radii for the leading and nonleading (ℓ, m) modes. As a proof-of-concept, we compute waveforms for q=1/10. We discuss applications of these techniques for LIGO and VIRGO data analysis and the possibility that our technique can be extended to produce accurate waveform templates from a modest number of fully nonlinear numerical simulations.

  4. WELDSMART: A vision-based expert system for quality control

    NASA Technical Reports Server (NTRS)

    Andersen, Kristinn; Barnett, Robert Joel; Springfield, James F.; Cook, George E.

    1992-01-01

    This work was aimed at exploring means for utilizing computer technology in quality inspection and evaluation. Inspection of metallic welds was selected as the main application for this development and primary emphasis was placed on visual inspection, as opposed to other inspection methods, such as radiographic techniques. Emphasis was placed on methodologies with the potential for use in real-time quality control systems. Because quality evaluation is somewhat subjective, despite various efforts to classify discontinuities and standardize inspection methods, the task of using a computer for both inspection and evaluation was not trivial. The work started out with a review of the various inspection techniques that are used for quality control in welding. Among other observations from this review was the finding that most weld defects result in abnormalities that may be seen by visual inspection. This supports the approach of emphasizing visual inspection for this work. Quality control consists of two phases: (1) identification of weld discontinuities (some of which may be severe enough to be classified as defects), and (2) assessment or evaluation of the weld based on the observed discontinuities. Usually the latter phase results in a pass/fail judgement for the inspected piece. It is the conclusion of this work that the first of the above tasks, identification of discontinuities, is the most challenging one. It calls for sophisticated image processing and image analysis techniques, and frequently ad hoc methods have to be developed to identify specific features in the weld image. The difficulty of this task is generally not due to limited computing power. In most cases it was found that a modest personal computer or workstation could carry out most computations in a reasonably short time period. Rather, the algorithms and methods necessary for identifying weld discontinuities were in some cases limited. The fact that specific techniques were finally developed and successfully demonstrated to work illustrates that the general approach taken here appears to be promising for commercial development of computerized quality inspection systems. Inspection based on these techniques may be used to supplement or substitute for more elaborate inspection methods, such as x-ray inspections.

  5. Using Computational Toxicology to Enable Risk-Based ...

    EPA Pesticide Factsheets

    Slide presentation at Drug Safety Gordon Research Conference 2016 on research efforts in NCCT to enable Computational Toxicology to support risk assessment.

  6. U.S. EPA computational toxicology programs: Central role of chemical-annotation efforts and molecular databases

    EPA Science Inventory

    EPA’s National Center for Computational Toxicology is engaged in high-profile research efforts to improve the ability to more efficiently and effectively prioritize and screen thousands of environmental chemicals for potential toxicity. A central component of these efforts invol...

  7. An initial assessment of the cost and utilization of the Integrated Academic Information System (IAIMS) at Columbia Presbyterian Medical Center.

    PubMed Central

    Clayton, P. D.; Anderson, R. K.; Hill, C.; McCormack, M.

    1991-01-01

    The concept of "one stop information shopping" is becoming a reality at Columbia Presbyterian Medical Center (CPMC). The goal of our effort is to provide access to university and hospital administrative systems as well as clinical and library applications from a single workstation, which also provides utility functions such as word processing and mail. Since June 1987, CPMC has invested the equivalent of $23 million to install a digital communications network that encompasses 18 buildings at seven geographically separate sites and to develop clinical and library applications that are integrated with the existing hospital and university administrative and research computing facilities. During June 1991, 2425 different individuals used the clinical information system, 425 different individuals used the library applications, and 900 different individuals used the hospital administrative applications via network access. If we were to freeze the system in its current state, amortize the development and network installation costs, and add projected maintenance costs for the clinical and library applications, our integrated information system would cost $2.8 million on an annual basis. This cost is 0.3% of the medical center's annual budget. These expenditures could be justified by very small improvements in time savings for personnel and/or decreased length of hospital stay and/or more efficient use of resources. In addition to the direct benefits which we detail, a major benefit is the ease with which additional computer-based applications can be added incrementally at an extremely modest cost. PMID:1666966

  8. Microelectromechanical reprogrammable logic device.

    PubMed

    Hafiz, M A A; Kosuru, L; Younis, M I

    2016-03-29

    In modern computing, the Boolean logic operations are set by interconnect schemes between the transistors. As miniaturization at the component level to enhance computational power rapidly approaches physical limits, alternative computing methods are vigorously pursued. One of the desired aspects in future computing approaches is the provision for hardware reconfigurability at run time to allow enhanced functionality. Here we demonstrate a reprogrammable logic device based on the electrothermal frequency modulation scheme of a single microelectromechanical resonator, capable of performing all the fundamental 2-bit logic functions as well as n-bit logic operations. Logic functions are performed by actively tuning the linear resonance frequency of the resonator operated at room temperature and under modest vacuum conditions, reprogrammable by the a.c.-driving frequency. The device is fabricated using a complementary metal oxide semiconductor compatible mass fabrication process, suitable for on-chip integration, and promises an alternative electromechanical computing scheme.

  9. Microelectromechanical reprogrammable logic device

    PubMed Central

    Hafiz, M. A. A.; Kosuru, L.; Younis, M. I.

    2016-01-01

    In modern computing, the Boolean logic operations are set by interconnect schemes between the transistors. As miniaturization at the component level to enhance computational power rapidly approaches physical limits, alternative computing methods are vigorously pursued. One of the desired aspects in future computing approaches is the provision for hardware reconfigurability at run time to allow enhanced functionality. Here we demonstrate a reprogrammable logic device based on the electrothermal frequency modulation scheme of a single microelectromechanical resonator, capable of performing all the fundamental 2-bit logic functions as well as n-bit logic operations. Logic functions are performed by actively tuning the linear resonance frequency of the resonator operated at room temperature and under modest vacuum conditions, reprogrammable by the a.c.-driving frequency. The device is fabricated using a complementary metal oxide semiconductor compatible mass fabrication process, suitable for on-chip integration, and promises an alternative electromechanical computing scheme. PMID:27021295

  10. Utilizing Structural Equation Modeling and Social Cognitive Career Theory to Identify Factors in Choice of It as a Major

    ERIC Educational Resources Information Center

    Luse, Andy; Rursch, Julie A.; Jacobson, Doug

    2014-01-01

    In the United States, the number of students entering into and completing degrees in science, technology, engineering, and mathematics (STEM) areas has declined significantly over the past decade. Although modest increases have been shown in enrollments in computer-related majors in the past 4 years, the prediction is that even in 3 to 4 years…

  11. Global Warming, Africa and National Security

    DTIC Science & Technology

    2008-01-15

    African populations. This includes awareness from a global perspective in line with The Army Strategy for the Environment, the UN's Intergovernmental... attention. At the time, computer models did not indicate a significant issue with global warming, suggesting only a modest increase of 2°C... projected climate changes. Current Science: The science surrounding climate change and global warming was, until recently, a point of

  12. Strategies for sustainable management of renewable resources during environmental change.

    PubMed

    Lindkvist, Emilie; Ekeberg, Örjan; Norberg, Jon

    2017-03-15

    As a consequence of global environmental change, management strategies that can deal with unexpected change in resource dynamics are becoming increasingly important. In this paper we undertake a novel approach to studying resource growth problems using a computational form of adaptive management to find optimal strategies for prevalent natural resource management dilemmas. We scrutinize adaptive management, or learning-by-doing, to better understand how to simultaneously manage and learn about a system when its dynamics are unknown. We study important trade-offs in decision-making with respect to choosing optimal actions (harvest efforts) for sustainable management during change. This is operationalized through an artificially intelligent model where we analyze how different trends and fluctuations in growth rates of a renewable resource affect the performance of different management strategies. Our results show that the optimal strategy for managing resources with declining growth is capable of managing resources with fluctuating or increasing growth at a negligible cost, creating a management strategy that is both efficient and robust towards future unknown changes. To obtain this strategy, adaptive management should strive for: high learning rates for new knowledge, high valuation of future outcomes, and modest exploration around what is perceived as the optimal action. © 2017 The Author(s).
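
    The abstract does not spell out the learning algorithm, but the three recommendations (high learning rate, high valuation of future outcomes, modest exploration) map naturally onto the parameters of a generic value-learning loop. The sketch below is only an illustration of learning-by-doing in that spirit; the logistic resource model, the discrete effort grid, and every numeric value are invented.

```python
import numpy as np

# Illustrative learning-by-doing loop: discrete harvest efforts, tabular
# value learning with a high learning rate, high discount, and modest
# epsilon-greedy exploration. Not the paper's model.
rng = np.random.default_rng(1)
efforts = np.linspace(0.0, 0.5, 6)      # candidate harvest fractions
q = np.zeros(len(efforts))              # learned long-run value per action
alpha, gamma, eps = 0.5, 0.95, 0.05     # learning rate, discount, exploration

stock, r, K = 50.0, 0.4, 100.0          # logistic-growth resource (invented)
for step in range(2000):
    a = rng.integers(len(efforts)) if rng.random() < eps else int(q.argmax())
    harvest = efforts[a] * stock
    stock = max(stock - harvest + r * stock * (1 - stock / K), 1e-3)
    q[a] += alpha * (harvest + gamma * q.max() - q[a])   # reward = harvest

print("learned preference over efforts:", np.round(q, 1))
```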

  13. Hierarchy of folding and unfolding events of protein G, CI2, and ACBP from explicit-solvent simulations

    NASA Astrophysics Data System (ADS)

    Camilloni, Carlo; Broglia, Ricardo A.; Tiana, Guido

    2011-01-01

    The study of the mechanism which is at the basis of the phenomenon of protein folding requires the knowledge of multiple folding trajectories under biological conditions. Using a biasing molecular-dynamics algorithm based on the physics of the ratchet-and-pawl system, we carry out all-atom, explicit solvent simulations of the sequence of folding events which proteins G, CI2, and ACBP undergo in evolving from the denatured to the folded state. Starting from highly disordered conformations, the algorithm allows the proteins to reach, at the price of a modest computational effort, nativelike conformations, within a root mean square deviation (RMSD) of approximately 1 Å. A scheme is developed to extract, from the myriad of events, information concerning the sequence of native contact formation and of their eventual correlation. Such an analysis indicates that all the studied proteins fold hierarchically, through pathways which, although not deterministic, are well-defined with respect to the order of contact formation. The algorithm also allows one to study unfolding, a process which looks, to a large extent, like the reverse of the major folding pathway. This is also true in situations in which many pathways contribute to the folding process, like in the case of protein G.
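
    The ratchet-and-pawl bias can be illustrated on a one-dimensional reaction coordinate (for example, the RMSD to the native structure): the bias exerts no force while the coordinate improves on its best value so far, and applies a harmonic restoring force otherwise. The toy dynamics, force constant, and step sizes below are invented for illustration and are not the paper's all-atom implementation.

```python
import numpy as np

def ratchet_force(rho, rho_min, k=10.0):
    """Ratchet-and-pawl bias: zero when the coordinate improves on its best
    value so far (rho <= rho_min), harmonic restoring force otherwise."""
    return -k * (rho - rho_min) if rho > rho_min else 0.0

# Toy overdamped dynamics of a reaction coordinate (e.g., RMSD to the target).
rng = np.random.default_rng(2)
rho, rho_min = 10.0, 10.0
for step in range(10000):
    drift = ratchet_force(rho, rho_min) * 1e-3   # biased part of the motion
    noise = 0.05 * rng.normal()                  # thermal fluctuations
    rho = max(rho + drift + noise, 0.0)
    rho_min = min(rho_min, rho)                  # the pawl: record best value
print(round(rho_min, 2))
```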

  14. Disparities in child mortality trends: what is the evidence from disadvantaged states in India? the case of Orissa and Madhya Pradesh.

    PubMed

    Nguyen, Kim-Huong; Jimenez-Soto, Eliana; Dayal, Prarthna; Hodge, Andrew

    2013-06-27

    The Millennium Development Goals prompted renewed international efforts to reduce under-five mortality and measure national progress. However, scant evidence exists about the distribution of child mortality at low sub-national levels, which in diverse and decentralized countries like India are required to inform policy-making. This study estimates changes in child mortality across a range of markers of inequalities in Orissa and Madhya Pradesh, two of India's largest, poorest, and most disadvantaged states. Estimates of under-five and neonatal mortality rates were computed using seven datasets from three available sources--sample registration system, summary birth histories in surveys, and complete birth histories. Inequalities were gauged by comparison of mortality rates within four sub-state populations defined by the following characteristics: rural-urban location, ethnicity, wealth, and district. Trend estimates suggest that progress has been made in mortality rates at the state levels. However, reduction rates have been modest, particularly for neonatal mortality. Different mortality rates are observed across all the equity markers, although there is a pattern of convergence between rural and urban areas, largely due to inadequate progress in urban settings. Inter-district disparities and differences between socioeconomic groups are also evident. Although child mortality rates continue to decline at the national level, our evidence shows that considerable disparities persist. While progress in reducing under-five and neonatal mortality rates in urban areas appears to be levelling off, policies targeting rural populations and scheduled caste and tribe groups appear to have achieved some success in reducing mortality differentials. The results of this study thus add weight to recent government initiatives targeting these groups. Equitable progress, particularly for neonatal mortality, requires continuing efforts to strengthen health systems and overcome barriers to identify and reach vulnerable groups.

  15. Improve your marketing effectiveness and net income through better prospecting.

    PubMed

    Gombeski, William R; Kantor, David; Bendycki, Nadine A; Wack, Jeff

    2002-01-01

    Prospecting is the process of finding customers who are ready to buy and can generate high net income for an organization. Leads for prospects come from three categories of sources: (1) organization-initiated; (2) acquired leads; and (3) marketing activity-initiated leads. Findings from a study of academic medical organizations showed a modest use of effective prospecting by hospitals surveyed and that there are opportunities to increase database marketing efforts. The data suggests that prospecting and its companion concept of qualifying are not fully integrated into many healthcare organization's marketing strategies and tactics.

  16. Advances in HIV-1 Vaccine Development

    PubMed Central

    Gao, Yong

    2018-01-01

    An efficacious HIV-1 vaccine is regarded as the best way to halt the ongoing HIV-1 epidemic. However, despite significant efforts to develop a safe and effective vaccine, the modestly protective RV144 trial remains the only efficacy trial to provide some level of protection against HIV-1 acquisition. This review will outline the history of HIV vaccine development, novel technologies being applied to HIV vaccinology and immunogen design, as well as the studies that are ongoing to advance our understanding of vaccine-induced immune correlates of protection. PMID:29614779

  17. Early Formulation Model-centric Engineering on Nasa's Europa Mission Concept Study

    NASA Technical Reports Server (NTRS)

    Bayer, Todd; Chung, Seung; Cole, Bjorn; Cooke, Brian; Dekens, Frank; Delp, Chris; Gontijo, I.; Lewis, Kari; Moshir, Mehrdad; Rasmussen, Robert; hide

    2012-01-01

    By leveraging the existing Model-Based Systems Engineering (MBSE) infrastructure at JPL and adding a modest investment, the Europa Mission Concept Study made striking advances in mission concept capture and analysis. This effort has reaffirmed the importance of architecting and successfully harnessed the synergistic relationship of system modeling to mission architecting. It clearly demonstrated that MBSE can provide greater agility than traditional systems engineering methods. This paper will describe the successful application of MBSE in the dynamic environment of early mission formulation, the significant results produced and lessons learned in the process.

  18. Reflections on ESO, 1957 - 2002. Perspectives from the Directors General past and present: Adriaan Blaauw, ESO Director General, 1970 - 1974

    NASA Astrophysics Data System (ADS)

    Blaauw, Adriaan

    2002-09-01

    Nearly half a century ago, I witnessed Walter Baade and Jan Oort dreaming of a joint enterprise which would lift observational astronomy in Europe from the level of their modest national efforts to that of the leading observatories in the United States. I have been privileged to see, and to have been able to contribute to, the realization of that dream. This half century has left a wealth of recollections and sentiments from which it is difficult to select for this occasion.

  19. Shor's factoring algorithm and modern cryptography. An illustration of the capabilities inherent in quantum computers

    NASA Astrophysics Data System (ADS)

    Gerjuoy, Edward

    2005-06-01

    The security of messages encoded via the widely used RSA public key encryption system rests on the enormous computational effort required to find the prime factors of a large number N using classical (conventional) computers. In 1994 Peter Shor showed that for sufficiently large N, a quantum computer could perform the factoring with much less computational effort. This paper endeavors to explain, in a fashion comprehensible to the nonexpert, the RSA encryption protocol; the various quantum computer manipulations constituting the Shor algorithm; how the Shor algorithm performs the factoring; and the precise sense in which a quantum computer employing Shor's algorithm can be said to accomplish the factoring of very large numbers with less computational effort than a classical computer. It is made apparent that factoring N generally requires many successive runs of the algorithm. Our analysis reveals that the probability of achieving a successful factorization on a single run is about twice as large as commonly quoted in the literature.
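
    The quantum part of Shor's algorithm returns the order r of a randomly chosen a modulo N; the surrounding classical bookkeeping is short and makes clear why several runs may be needed (r must be even and a^(r/2) must not be congruent to -1 mod N). In the sketch below a brute-force loop stands in for the quantum order-finding subroutine.

```python
from math import gcd

def order(a, n):
    """Classical stand-in for the quantum order-finding subroutine."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_step(n, a):
    """Turn the order r of a mod n into factors, when the run is 'lucky'."""
    if gcd(a, n) > 1:                        # lucky guess already shares a factor
        return gcd(a, n), n // gcd(a, n)
    r = order(a, n)
    if r % 2 or pow(a, r // 2, n) == n - 1:  # failed run: retry with another a
        return None
    p = gcd(pow(a, r // 2) - 1, n)
    return p, n // p

print(shor_classical_step(15, 7))   # (3, 5)
```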

  20. Start-Up and Ongoing Practice Expenses of Behavioral Health and Primary Care Integration Interventions in the Advancing Care Together (ACT) Program.

    PubMed

    Wallace, Neal T; Cohen, Deborah J; Gunn, Rose; Beck, Arne; Melek, Steve; Bechtold, Donald; Green, Larry A

    2015-01-01

    Provide credible estimates of the start-up and ongoing effort and incremental practice expenses for the Advancing Care Together (ACT) behavioral health and primary care integration interventions. Expenditure data were collected from 10 practice intervention sites using an instrument with a standardized general format that could accommodate the unique elements of each intervention. Average start-up effort expenses were $44,076 and monthly ongoing effort expenses per patient were $40.39. Incremental expenses averaged $20,788 for start-up and $4.58 per patient for monthly ongoing activities. Variations in expenditures across practices reflect the differences in intervention specifics and organizational settings. Differences between effort and incremental expenditures reflect the extensive use of existing resources in implementing the interventions. ACT program incremental expenses suggest that widespread adoption would likely have a relatively modest effect on overall health systems expenditures. Practice effort expenses are not trivial and may pose barriers to adoption. Payers and purchasers interested in attaining widespread adoption of integrated care must consider external support to practices that accounts for both incremental and effort expense levels. Existing knowledge transfer mechanisms should be employed to minimize developmental start-up expenses, and payment reform focused toward value-based, Triple Aim-oriented reimbursement and purchasing mechanisms is likely needed. © Copyright 2015 by the American Board of Family Medicine.

  1. Proposed Directions for Research in Computer-Based Education.

    ERIC Educational Resources Information Center

    Waugh, Michael L.

    Several directions for potential research efforts in the field of computer-based education (CBE) are discussed. (For the purposes of this paper, CBE is defined as any use of computers to promote learning with no intended inference as to the specific nature or organization of the educational application under discussion.) Efforts should be directed…

  2. Faithful qubit transmission in a quantum communication network with heterogeneous channels

    NASA Astrophysics Data System (ADS)

    Chen, Na; Zhang, Lin Xi; Pei, Chang Xing

    2018-04-01

    Quantum communication networks enable long-distance qubit transmission and distributed quantum computation. In this paper, a quantum communication network with heterogeneous quantum channels is constructed. A faithful qubit transmission scheme is presented. Detailed calculations and performance analyses show that even in a low-quality quantum channel with serious decoherence, only a modest number of locally prepared target qubits are required to achieve near-deterministic qubit transmission.

  3. Time-optimal aircraft pursuit-evasion with a weapon envelope constraint

    NASA Technical Reports Server (NTRS)

    Menon, P. K. A.; Duke, E. L.

    1990-01-01

    The optimal pursuit-evasion problem between two aircraft, including nonlinear point-mass vehicle models and a realistic weapon envelope, is analyzed. Using a linear combination of flight time and the square of the vehicle acceleration as the performance index, a closed-form solution is obtained in nonlinear feedback form. Due to its modest computational requirements, this guidance law can be used for onboard real-time implementation.
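
    The abstract does not give the exact weights, but a performance index formed as a linear combination of flight time and squared acceleration can plausibly be written as follows (c_1 and c_2 are assumed positive constants, not values from the paper):

```latex
J = \int_{0}^{t_f} \left( c_1 + c_2\,\lVert \mathbf{a}(t) \rVert^{2} \right) dt, \qquad c_1,\ c_2 > 0,
```

    where t_f is the (free) final time and a(t) is the vehicle acceleration; minimizing J trades capture time against control effort.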

  4. Reprocessing Multiyear GPS Data from Continuously Operating Reference Stations on Cloud Computing Platform

    NASA Astrophysics Data System (ADS)

    Yoon, S.

    2016-12-01

    To define a geodetic reference frame using GPS data collected by the Continuously Operating Reference Stations (CORS) network, historical GPS data need to be reprocessed regularly. Reprocessing GPS data collected by up to 2000 CORS sites over the last two decades requires substantial computational resources. At the National Geodetic Survey (NGS), one reprocessing was completed in 2011, and a second reprocessing is currently under way. For the first reprocessing effort, in-house computing resources were utilized. In the current second reprocessing effort, an outsourced cloud computing platform is being utilized. In this presentation, the data processing strategy at NGS is outlined, as well as the effort to parallelize the data processing procedure in order to maximize the benefit of the cloud computing. The time and cost savings realized by utilizing the cloud computing approach will also be discussed.

  5. Patient Hospital Experience Improved Modestly, But No Evidence Medicare Incentives Promoted Meaningful Gains.

    PubMed

    Papanicolas, Irene; Figueroa, José F; Orav, E John; Jha, Ashish K

    2017-01-01

    The Centers for Medicare and Medicaid Services (CMS) has played a leading role in efforts to improve patients' experiences with hospital care. Yet little is known about how much patient experience has changed over the past decade, and even less is known about the impact of CMS's most recent strategy: tying payments to performance under the Value-Based Purchasing (VBP) program. We examined trends in multiple measures of patient satisfaction in the period 2008-14. We found that patient experience has improved modestly at US hospitals, both those participating in the VBP program and others, with the majority of improvement concentrated in the period before the program was implemented. While certain subsets of hospitals improved more than others, we found no evidence that the program has had a beneficial effect. As policy makers continue to promote value-based payment as a way to improve patient experience, it will be critical to ensure that payment is structured in ways that actually drive improvement. Project HOPE—The People-to-People Health Foundation, Inc.

  6. Leveraging the Happy Meal Effect: Substituting Food with Modest Nonfood Incentives Decreases Portion Size Choice

    PubMed Central

    Reimann, Martin; Bechara, Antoine; MacInnis, Deborah

    2015-01-01

    Despite much effort to decrease food intake by altering portion sizes, “super-sized” meals are the preferred choice of many. This research investigated the extent to which individuals can be subtly incentivized to choose smaller portion sizes. Three randomized experiments (2 in the lab and 1 in the field) established that individuals’ choice of full-sized food portions is reduced when they are given the opportunity to choose a half-sized version with a modest nonfood incentive. This substitution effect was robust across different nonfood incentives, foods, populations, and time. Experiment 1 established the effect with children, using inexpensive headphones as nonfood incentives. Experiment 2—a longitudinal study across multiple days—generalized this effect with adults, using the mere chance to win either gift cards or frequent flyer miles as nonfood incentives. Experiment 3 demonstrated the effect among actual restaurant customers who had originally planned to eat a full-sized portion, using the mere chance to win small amounts of money. Our investigation broadens the psychology of food portion choice from perceptual and social factors to motivational determinants. PMID:26372082

  7. TORC3: Token-ring clearing heuristic for currency circulation

    NASA Astrophysics Data System (ADS)

    Humes, Carlos, Jr.; Lauretto, Marcelo S.; Nakano, Fábio; Pereira, Carlos A. B.; Rafare, Guilherme F. G.; Stern, Julio Michael

    2012-10-01

    Clearing algorithms are at the core of modern payment systems, facilitating the settling of multilateral credit messages with (near) minimum transfers of currency. Traditional clearing procedures use batch processing based on MILP - mixed-integer linear programming algorithms. The MILP approach demands intensive computational resources; moreover, it is also vulnerable to operational risks generated by possible defaults during the inter-batch period. This paper presents TORC3 - the Token-Ring Clearing Algorithm for Currency Circulation. In contrast to the MILP approach, TORC3 is a real time heuristic procedure, demanding modest computational resources, and able to completely shield the clearing operation against the participating agents' risk of default.
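
    The token-ring heuristic itself is not described in the abstract, but the netting idea underneath any clearing scheme is easy to illustrate: reduce the set of bilateral credit messages to each agent's net position, since only the net balances need to be settled with actual currency. A minimal sketch (agent names and amounts are invented):

```python
from collections import defaultdict

def net_positions(messages):
    """messages: list of (payer, payee, amount). Returns each agent's net
    position; a clearing scheme only needs to move the net balances."""
    net = defaultdict(float)
    for payer, payee, amount in messages:
        net[payer] -= amount
        net[payee] += amount
    return dict(net)

msgs = [("A", "B", 100), ("B", "C", 80), ("C", "A", 90)]
print(net_positions(msgs))   # {'A': -10.0, 'B': 20.0, 'C': -10.0}
```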

  8. Data Sets, Ensemble Cloud Computing, and the University Library (Invited)

    NASA Astrophysics Data System (ADS)

    Plale, B. A.

    2013-12-01

    The environmental researcher at the public university has new resources at their disposal to aid in research and publishing. Cloud computing provides compute cycles on demand for analysis and modeling scenarios. Cloud computing is attractive for e-Science because of the ease with which cores can be accessed on demand, and because the virtual machine implementation that underlies cloud computing reduces the cost of porting a numeric or analysis code to a new platform. At the university, many libraries at larger universities are developing the e-Science skills to serve as repositories of record for publishable data sets. But these are confusing times for the publication of data sets from environmental research. The large publishers of scientific literature are advocating a process whereby data sets are tightly tied to a publication. In other words, a paper published in the scientific literature that gives results based on data, must have an associated data set accessible that backs up the results. This approach supports reproducibility of results in that publishers maintain a repository for the papers they publish, and the data sets that the papers used. Does such a solution that maps one data set (or subset) to one paper fit the needs of the environmental researcher who among other things uses complex models, mines longitudinal data bases, and generates observational results? The second school of thought has emerged out of NSF, NOAA, and NASA funded efforts over time: data sets exist coherent at a location, such as occurs at National Snow and Ice Data Center (NSIDC). But when a collection is coherent, reproducibility of individual results is more challenging. We argue for a third complementary option: the university repository as a location for data sets produced as a result of university-based research. This location for a repository relies on the expertise developing in the university libraries across the country, and leverages tools, such as are being developed in the Sustainable Environments - Actionable Data (SEAD) project, an NSF funded DataNet partner, for reducing the burden of describing, publishing, and sharing research data. We use as example the university institutional repository (IR) and an application taken from climate studies. The application is a storm surge model running as a cloud-based software as a service (SaaS). One of the more immediate and dangerous impacts of climate change could be change in the strength of storms that form over the oceans. There have already been indications that even modest changes in ocean surface temperature can have a disproportionate effect on hurricane strength. In an effort to understand these impacts, modelers turn to predictions generated by hydrodynamic coastal ocean models such as the Sea, Lake and Overland Surges from Hurricanes (SLOSH) model. We step through a use scenario of SLOSH in emergency management. To publish a newly created data set resulting from the ensemble runs on the cloud, one needs tools that minimize the burden of describing the data. SEAD has such tools, and engages the e-Science data curation librarian in the process to aid the data set's ingest into the university IR. We finally bring attention to ongoing effort in the Research Data Alliance (RDA) to make data lifecycle issues easier for environmental researchers so that they invest less time and get more credit for their data sets, giving research wider adoption and impact.

  9. FFTs in external or hierarchical memory

    NASA Technical Reports Server (NTRS)

    Bailey, David H.

    1989-01-01

    A description is given of advanced techniques for computing an ordered FFT on a computer with external or hierarchical memory. These algorithms (1) require as few as two passes through the external data set, (2) use strictly unit stride, long vector transfers between main memory and external storage, (3) require only a modest amount of scratch space in main memory, and (4) are well suited for vector and parallel computation. Performance figures are included for implementations of some of these algorithms on Cray supercomputers. Of interest is the fact that a main memory version outperforms the current Cray library FFT routines on the Cray-2, the Cray X-MP, and the Cray Y-MP systems. Using all eight processors on the Cray Y-MP, this main memory routine runs at nearly 2 Gflops.
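
    One standard way to organize an FFT so that a large data set is touched in only a few passes is the four-step (transpose-based) decomposition: view the length n1*n2 signal as an n1 x n2 array, FFT the columns, apply twiddle factors, FFT the rows, and read the result out transposed. The NumPy sketch below only verifies the arithmetic in main memory; it is not the paper's vectorized external-memory implementation.

```python
import numpy as np

def four_step_fft(x, n1, n2):
    """FFT of length n1*n2 via the four-step decomposition: column FFTs,
    twiddle multiply, row FFTs, transpose."""
    a = np.asarray(x, dtype=complex).reshape(n1, n2)
    a = np.fft.fft(a, axis=0)                          # step 1: column FFTs
    j, k = np.meshgrid(np.arange(n2), np.arange(n1))
    a *= np.exp(-2j * np.pi * j * k / (n1 * n2))       # step 2: twiddle factors
    a = np.fft.fft(a, axis=1)                          # step 3: row FFTs
    return a.T.reshape(-1)                             # step 4: transpose

x = np.random.default_rng(3).normal(size=64)
print(np.allclose(four_step_fft(x, 8, 8), np.fft.fft(x)))   # True
```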

  10. Scaling up a CMS tier-3 site with campus resources and a 100 Gb/s network connection: what could go wrong?

    NASA Astrophysics Data System (ADS)

    Wolf, Matthias; Woodard, Anna; Li, Wenzhao; Hurtado Anampa, Kenyi; Tovar, Benjamin; Brenner, Paul; Lannon, Kevin; Hildreth, Mike; Thain, Douglas

    2017-10-01

    The University of Notre Dame (ND) CMS group operates a modest-sized Tier-3 site suitable for local, final-stage analysis of CMS data. However, through the ND Center for Research Computing (CRC), Notre Dame researchers have opportunistic access to roughly 25k CPU cores of computing and a 100 Gb/s WAN network link. To understand the limits of what might be possible in this scenario, we undertook to use these resources for a wide range of CMS computing tasks, from user analysis through large-scale Monte Carlo production (including both detector simulation and data reconstruction). We will discuss the challenges inherent in effectively utilizing CRC resources for these tasks and the solutions deployed to overcome them.

  11. Megatux

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-09-25

    The Megatux platform enables the emulation of large scale (multi-million node) distributed systems. In particular, it allows for the emulation of large-scale networks interconnecting a very large number of emulated computer systems. It does this by leveraging virtualization and associated technologies to allow hundreds of virtual computers to be hosted on a single moderately sized server or workstation. Virtualization technology provided by modern processors allows for multiple guest OSs to run at the same time, sharing the hardware resources. The Megatux platform can be deployed on a single PC, a small cluster of a few boxes, or a large cluster of computers. With a modest cluster, the Megatux platform can emulate complex organizational networks. By using virtualization, we emulate the hardware, but run actual software, enabling large scale without sacrificing fidelity.

  12. Numerical Simulation of Rolling-Airframes Using a Multi-Level Cartesian Method

    NASA Technical Reports Server (NTRS)

    Murman, Scott M.; Aftosmis, Michael J.; Berger, Marsha J.; Kwak, Dochan (Technical Monitor)

    2002-01-01

    A supersonic rolling missile with two synchronous canard control surfaces is analyzed using an automated, inviscid, Cartesian method. Sequential-static and time-dependent dynamic simulations of the complete motion are computed for canard dither schedules for level flight, pitch, and yaw maneuvers. The dynamic simulations are compared directly against both high-resolution viscous simulations and relevant experimental data, and are also utilized to compute dynamic stability derivatives. The results show that both the body roll rate and canard dither motion influence the roll-averaged forces and moments on the body. At the relatively low roll rates analyzed in the current work, these dynamic effects are modest; however, the dynamic computations are effective in predicting the dynamic stability derivatives, which can be significant for highly maneuverable missiles.

  13. Mobile clusters of single board computers: an option for providing resources to student projects and researchers.

    PubMed

    Baun, Christian

    2016-01-01

    Clusters usually consist of servers, workstations or personal computers as nodes. But especially for academic purposes like student projects or scientific projects, the cost for purchase and operation can be a challenge. Single board computers cannot compete with the performance or energy-efficiency of higher-value systems, but they are an option to build inexpensive cluster systems. Because of the compact design and modest energy consumption, it is possible to build clusters of single board computers in a way that they are mobile and can be easily transported by the users. This paper describes the construction of such a cluster, useful applications and the performance of the single nodes. Furthermore, the clusters' performance and energy-efficiency is analyzed by executing the High Performance Linpack benchmark with a different number of nodes and different proportion of the systems total main memory utilized.
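
    The performance and energy-efficiency figures discussed above follow directly from the High Performance Linpack convention: the nominal operation count (2/3)N^3 + 2N^2 divided by run time gives the FLOP rate, and dividing again by average power draw gives FLOPS per watt. A small helper with illustrative numbers (not measurements from the paper):

```python
def hpl_gflops(n, seconds):
    """Nominal HPL operation count (2/3 N^3 + 2 N^2) divided by run time."""
    return (2.0 / 3.0 * n**3 + 2.0 * n**2) / seconds / 1e9

def gflops_per_watt(n, seconds, watts):
    """Energy efficiency of an HPL run at problem size n."""
    return hpl_gflops(n, seconds) / watts

# Illustrative numbers for a small cluster of single board computers.
print(round(hpl_gflops(n=20_000, seconds=1_500), 1), "GFLOPS")
print(round(gflops_per_watt(n=20_000, seconds=1_500, watts=40), 2), "GFLOPS/W")
```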

  14. MODEST - JPL GEODETIC AND ASTROMETRIC VLBI MODELING AND PARAMETER ESTIMATION PROGRAM

    NASA Technical Reports Server (NTRS)

    Sovers, O. J.

    1994-01-01

    Observations of extragalactic radio sources in the gigahertz region of the radio frequency spectrum by two or more antennas, separated by a baseline as long as the diameter of the Earth, can be reduced, by radio interferometry techniques, to yield time delays and their rates of change. The Very Long Baseline Interferometric (VLBI) observables can be processed by the MODEST software to yield geodetic and astrometric parameters of interest in areas such as geophysical satellite and spacecraft tracking applications and geodynamics. As the accuracy of radio interferometry has improved, increasingly complete models of the delay and delay rate observables have been developed. MODEST is a delay model (MOD) and parameter estimation (EST) program that takes into account delay effects such as geometry, clock, troposphere, and the ionosphere. MODEST includes all known effects at the centimeter level in modeling. As the field evolves and new effects are discovered, these can be included in the model. In general, the model includes contributions to the observables from Earth orientation, antenna motion, clock behavior, atmospheric effects, and radio source structure. Within each of these categories, a number of unknown parameters may be estimated from the observations. Since all parts of the time delay model contain nearly linear parameter terms, a square-root-information filter (SRIF) linear least-squares algorithm is employed in parameter estimation. Flexibility (via dynamic memory allocation) in the MODEST code ensures that the same executable can process a wide array of problems. These range from a few hundred observations on a single baseline, yielding estimates of tens of parameters, to global solutions estimating tens of thousands of parameters from hundreds of thousands of observations at antennas widely distributed over the Earth's surface. Depending on memory and disk storage availability, large problems may be subdivided into more tractable pieces that are processed sequentially. MODEST is written in FORTRAN 77, C-language, and VAX ASSEMBLER for DEC VAX series computers running VMS. It requires 6Mb of RAM for execution. The standard distribution medium for this package is a 1600 BPI 9-track magnetic tape in DEC VAX BACKUP format. It is also available on a TK50 tape cartridge in DEC VAX BACKUP format. Instructions for use and sample input and output data are available on the distribution media. This program was released in 1993 and is a copyrighted work with all copyright vested in NASA.
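
    The square-root-information filter (SRIF) mentioned above has a compact core: the current triangular information factor and right-hand side are stacked with a new block of whitened observations and re-triangularized with a QR factorization. The sketch below shows that update on a toy linear problem; the matrix names and data are generic illustrations, not MODEST's internal structures.

```python
import numpy as np

def srif_update(R, z, A, y):
    """Square-root-information measurement update.
    Prior information: R x ~ z (R upper triangular); new whitened
    observations: A x ~ y. Stacking and QR-factorizing yields the updated
    triangular factor and right-hand side."""
    n = R.shape[0]
    stacked = np.vstack([np.hstack([R, z[:, None]]),
                         np.hstack([A, y[:, None]])])
    q, r = np.linalg.qr(stacked)
    return r[:n, :n], r[:n, n]               # updated R and z

# Toy example: estimate x from two batches of noisy linear measurements.
rng = np.random.default_rng(4)
x_true = np.array([1.0, -2.0])
A1, A2 = rng.normal(size=(5, 2)), rng.normal(size=(5, 2))
y1 = A1 @ x_true + 0.01 * rng.normal(size=5)
y2 = A2 @ x_true + 0.01 * rng.normal(size=5)
R, z = np.zeros((2, 2)), np.zeros(2)         # diffuse prior
R, z = srif_update(R, z, A1, y1)
R, z = srif_update(R, z, A2, y2)
print(np.linalg.solve(R, z))                 # close to x_true
```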

  15. Technology and Sexuality--What's the Connection? Addressing Youth Sexualities in Efforts to Increase Girls' Participation in Computing

    ERIC Educational Resources Information Center

    Ashcraft, Catherine

    2015-01-01

    To date, girls and women are significantly underrepresented in computer science and technology. Concerns about this underrepresentation have sparked a wealth of educational efforts to promote girls' participation in computing, but these programs have demonstrated limited impact on reversing current trends. This paper argues that this is, in part,…

  16. Computer simulation and performance assessment of the packet-data service of the Aeronautical Mobile Satellite Service (AMSS)

    NASA Technical Reports Server (NTRS)

    Ferzali, Wassim; Zacharakis, Vassilis; Upadhyay, Triveni; Weed, Dennis; Burke, Gregory

    1995-01-01

    The ICAO Aeronautical Mobile Communications Panel (AMCP) completed the drafting of the Aeronautical Mobile Satellite Service (AMSS) Standards and Recommended Practices (SARP's) and the associated Guidance Material and submitted these documents to ICAO Air Navigation Commission (ANC) for ratification in May 1994. This effort, encompassed an extensive, multi-national SARP's validation. As part of this activity, the US Federal Aviation Administration (FAA) sponsored an effort to validate the SARP's via computer simulation. This paper provides a description of this effort. Specifically, it describes: (1) the approach selected for the creation of a high-fidelity AMSS computer model; (2) the test traffic generation scenarios; and (3) the resultant AMSS performance assessment. More recently, the AMSS computer model was also used to provide AMSS performance statistics in support of the RTCA standardization activities. This paper describes this effort as well.

  17. Computer use, sleep duration and health symptoms: a cross-sectional study of 15-year olds in three countries.

    PubMed

    Nuutinen, Teija; Roos, Eva; Ray, Carola; Villberg, Jari; Välimaa, Raili; Rasmussen, Mette; Holstein, Bjørn; Godeau, Emmanuelle; Beck, Francois; Léger, Damien; Tynjälä, Jorma

    2014-08-01

    This study investigated whether computer use is associated with health symptoms through sleep duration among 15-year olds in Finland, France and Denmark. We used data from the WHO cross-national Health Behaviour in School-aged Children study collected in Finland, France and Denmark in 2010, including data on 5,402 adolescents (mean age 15.61 (SD 0.37), girls 53%). Symptoms assessed included feeling low, irritability/bad temper, nervousness, headache, stomachache, backache, and feeling dizzy. We used structural equation modeling to explore the mediating effect of sleep duration on the association between computer use and symptom load. Adolescents slept approximately 8 h a night and computer use was approximately 2 h a day. Computer use was associated with shorter sleep duration and higher symptom load. Sleep duration partly mediated the association between computer use and symptom load, but the indirect effects of sleep duration were quite modest in all countries. Sleep duration may be a potential underlying mechanism behind the association between computer use and health symptoms.
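
    Full structural equation modeling is more involved, but the indirect (mediated) effect the study estimates can be conveyed with a bare-bones product-of-coefficients sketch: the path from computer use to sleep, times the path from sleep to symptoms adjusting for computer use. All data below are simulated for illustration.

```python
import numpy as np

def ols_slope(x, y):
    """Slope of y on x (with intercept) via least squares."""
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

rng = np.random.default_rng(5)
computer_use = rng.normal(2.0, 1.0, 1000)                    # hours/day (simulated)
sleep = 8.5 - 0.3 * computer_use + rng.normal(0, 0.5, 1000)  # hours/night
symptoms = 3.0 - 0.4 * sleep + 0.1 * computer_use + rng.normal(0, 0.5, 1000)

a = ols_slope(computer_use, sleep)           # path: computer use -> sleep
# path: sleep -> symptoms, adjusting for computer use
X = np.column_stack([np.ones(1000), computer_use, sleep])
b = np.linalg.lstsq(X, symptoms, rcond=None)[0][2]
print("indirect effect (a*b):", round(a * b, 3))
```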

  18. Cost effectiveness of a computer-delivered intervention to improve HIV medication adherence

    PubMed Central

    2013-01-01

    Background High levels of adherence to medications for HIV infection are essential for optimal clinical outcomes and to reduce viral transmission, but many patients do not achieve required levels. Clinician-delivered interventions can improve patients’ adherence, but usually require substantial effort by trained individuals and may not be widely available. Computer-delivered interventions can address this problem by reducing required staff time for delivery and by making the interventions widely available via the Internet. We previously developed a computer-delivered intervention designed to improve patients’ level of health literacy as a strategy to improve their HIV medication adherence. The intervention was shown to increase patients’ adherence, but it was not clear that the benefits resulting from the increase in adherence could justify the costs of developing and deploying the intervention. The purpose of this study was to evaluate the relation of development and deployment costs to the effectiveness of the intervention. Methods Costs of intervention development were drawn from accounting reports for the grant under which its development was supported, adjusted for costs primarily resulting from the project’s research purpose. Effectiveness of the intervention was drawn from results of the parent study. The relation of the intervention’s effects to changes in health status, expressed as utilities, was also evaluated in order to assess the net cost of the intervention in terms of quality adjusted life years (QALYs). Sensitivity analyses evaluated ranges of possible intervention effectiveness and durations of its effects, and costs were evaluated over several deployment scenarios. Results The intervention’s cost effectiveness depends largely on the number of persons using it and the duration of its effectiveness. Even with modest effects for a small number of patients the intervention was associated with net cost savings in some scenarios and for durations greater than three months and longer it was usually associated with a favorable cost per QALY. For intermediate and larger assumed effects and longer durations of intervention effectiveness, the intervention was associated with net cost savings. Conclusions Computer-delivered adherence interventions may be a cost-effective strategy to improve adherence in persons treated for HIV. Trial registration Clinicaltrials.gov identifier NCT01304186. PMID:23446180
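
    The cost-effectiveness logic described above reduces to simple arithmetic once effect size, duration, user counts, and cost offsets are assumed: net programme cost divided by total QALYs gained, swept over scenarios. The sketch below uses invented numbers purely to illustrate how longer effect durations can flip the intervention from a positive cost per QALY to net cost savings.

```python
def cost_per_qaly(dev_cost, deploy_cost_per_user, users,
                  qaly_gain_per_user, cost_offset_per_user):
    """Net programme cost divided by total QALYs gained.
    cost_offset_per_user: downstream savings from improved adherence.
    A negative result means the intervention is cost-saving overall."""
    net_cost = dev_cost + users * (deploy_cost_per_user - cost_offset_per_user)
    return net_cost / (users * qaly_gain_per_user)

# Sensitivity sweep over assumed effect durations (all values illustrative).
for months in (3, 6, 12):
    icer = cost_per_qaly(dev_cost=200_000, deploy_cost_per_user=50,
                         users=5_000, qaly_gain_per_user=0.002 * months / 12,
                         cost_offset_per_user=10 * months)
    print(months, "months:", round(icer))
```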

  19. Cost effectiveness of a computer-delivered intervention to improve HIV medication adherence.

    PubMed

    Ownby, Raymond L; Waldrop-Valverde, Drenna; Jacobs, Robin J; Acevedo, Amarilis; Caballero, Joshua

    2013-02-28

    High levels of adherence to medications for HIV infection are essential for optimal clinical outcomes and to reduce viral transmission, but many patients do not achieve required levels. Clinician-delivered interventions can improve patients' adherence, but usually require substantial effort by trained individuals and may not be widely available. Computer-delivered interventions can address this problem by reducing required staff time for delivery and by making the interventions widely available via the Internet. We previously developed a computer-delivered intervention designed to improve patients' level of health literacy as a strategy to improve their HIV medication adherence. The intervention was shown to increase patients' adherence, but it was not clear that the benefits resulting from the increase in adherence could justify the costs of developing and deploying the intervention. The purpose of this study was to evaluate the relation of development and deployment costs to the effectiveness of the intervention. Costs of intervention development were drawn from accounting reports for the grant under which its development was supported, adjusted for costs primarily resulting from the project's research purpose. Effectiveness of the intervention was drawn from results of the parent study. The relation of the intervention's effects to changes in health status, expressed as utilities, was also evaluated in order to assess the net cost of the intervention in terms of quality adjusted life years (QALYs). Sensitivity analyses evaluated ranges of possible intervention effectiveness and durations of its effects, and costs were evaluated over several deployment scenarios. The intervention's cost effectiveness depends largely on the number of persons using it and the duration of its effectiveness. Even with modest effects for a small number of patients the intervention was associated with net cost savings in some scenarios and for durations greater than three months and longer it was usually associated with a favorable cost per QALY. For intermediate and larger assumed effects and longer durations of intervention effectiveness, the intervention was associated with net cost savings. Computer-delivered adherence interventions may be a cost-effective strategy to improve adherence in persons treated for HIV. Clinicaltrials.gov identifier NCT01304186.

  20. Experiences running NASTRAN on the Microvax 2 computer

    NASA Technical Reports Server (NTRS)

    Butler, Thomas G.; Mitchell, Reginald S.

    1987-01-01

    The MicroVAX operates NASTRAN so well that the only detectable difference in its operation compared to an 11/780 VAX is in the execution time. On the modest installation described here, the engineer has all of the tools he needs to do an excellent job of analysis. System configuration decisions, system sizing, preparation of the system disk, definition of user quotas, installation, monitoring of system errors, and operation policies are discussed.

  1. Applications of computational modeling in ballistics

    NASA Technical Reports Server (NTRS)

    Sturek, Walter B.

    1987-01-01

    The development of the technology of ballistics as applied to gun-launched Army weapon systems is the main objective of research at the U.S. Army Ballistic Research Laboratory (BRL). The primary research programs at the BRL consist of three major ballistic disciplines: exterior, interior, and terminal. The work done at the BRL in these areas was traditionally highly dependent on experimental testing. Considerable emphasis was placed on the development of computational modeling to augment the experimental testing in the development cycle; however, the impact of computational modeling to date has been modest. With the availability of supercomputer computational resources recently installed at the BRL, a new emphasis on the application of computational modeling to ballistics technology is taking place. The major application areas currently receiving considerable attention at the BRL are outlined, along with the modeling approaches involved. Some indication is given of the degree of success achieved and of the areas of greatest need.

  2. Rapid insights from remote sensing in the geosciences

    NASA Astrophysics Data System (ADS)

    Plaza, Antonio

    2015-03-01

    The growing availability of capacity computing for atomistic materials modeling has encouraged the use of high-accuracy computationally intensive interatomic potentials, such as SNAP. These potentials also happen to scale well on petascale computing platforms. SNAP has a very general form and uses machine-learning techniques to reproduce the energies, forces, and stress tensors of a large set of small configurations of atoms, which are obtained using high-accuracy quantum electronic structure (QM) calculations. The local environment of each atom is characterized by a set of bispectrum components of the local neighbor density projected on to a basis of hyperspherical harmonics in four dimensions. The computational cost per atom is much greater than that of simpler potentials such as Lennard-Jones or EAM, while the communication cost remains modest. We discuss a variety of strategies for implementing SNAP in the LAMMPS molecular dynamics package. We present scaling results obtained running SNAP on three different classes of machine: a conventional Intel Xeon CPU cluster; the Titan GPU-based system; and the combined Sequoia and Vulcan BlueGene/Q. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corp., for the U.S. Dept. of Energy's National Nuclear Security Admin. under Contract DE-AC04-94AL85000.

  3. WeFold: A Coopetition for Protein Structure Prediction

    PubMed Central

    Khoury, George A.; Liwo, Adam; Khatib, Firas; Zhou, Hongyi; Chopra, Gaurav; Bacardit, Jaume; Bortot, Leandro O.; Faccioli, Rodrigo A.; Deng, Xin; He, Yi; Krupa, Pawel; Li, Jilong; Mozolewska, Magdalena A.; Sieradzan, Adam K.; Smadbeck, James; Wirecki, Tomasz; Cooper, Seth; Flatten, Jeff; Xu, Kefan; Baker, David; Cheng, Jianlin; Delbem, Alexandre C. B.; Floudas, Christodoulos A.; Keasar, Chen; Levitt, Michael; Popović, Zoran; Scheraga, Harold A.; Skolnick, Jeffrey; Crivelli, Silvia N.; Players, Foldit

    2014-01-01

    The protein structure prediction problem continues to elude scientists. Despite the introduction of many methods, only modest gains were made over the last decade for certain classes of prediction targets. To address this challenge, a social-media based worldwide collaborative effort, named WeFold, was undertaken by thirteen labs. During the collaboration, the labs were simultaneously competing with each other. Here, we present the first attempt at “coopetition” in scientific research applied to the protein structure prediction and refinement problems. The coopetition was possible by allowing the participating labs to contribute different components of their protein structure prediction pipelines and create new hybrid pipelines that they tested during CASP10. This manuscript describes both successes and areas needing improvement as identified throughout the first WeFold experiment and discusses the efforts that are underway to advance this initiative. A footprint of all contributions and structures are publicly accessible at http://www.wefold.org. PMID:24677212

  4. Enhancing Student Success in Biology, Chemistry, and Physics by Transforming the Faculty Culture

    NASA Astrophysics Data System (ADS)

    Jackson, Howard; Smith, Leigh; Koenig, Kathleen; Beyette, Jill; Kinkle, Brian; Vonderheide, Anne

    We present preliminary results of an effort to enhance undergraduate student success in the STEM disciplines. We explore a multistep approach that reflects recent literature and report initial results by each of the Departments of Biology, Chemistry, and Physics of implementing several change strategies. The central elements of our approach involve identified departmental Teaching and Learning Liaisons, a unique faculty development component by our teaching center, a vertical integration of leadership across department heads, the Dean, and the Provost, and the explicit acknowledgement that change happens locally. Teaching and Learning lunches across the departments have attracted an attendance of ~65% of the faculty. The use of Learning Assistants in classrooms has also increased sharply. Modest changes in the student success rates have been observed. These efforts and others at the decanal and provostal levels promise changes in student success. We acknowledge the financial support of the National Science Foundation through DUE 1544001 and 1431350.

  5. A brief history of the global effort to develop a preventive HIV vaccine.

    PubMed

    Esparza, José

    2013-08-02

    Soon after HIV was discovered as the cause of AIDS in 1983-1984, there was an expectation that a preventive vaccine would be rapidly developed. In trying to achieve that goal, three successive scientific paradigms have been explored: induction of neutralizing antibodies, induction of cell-mediated immunity, and exploration of combination approaches and novel concepts. Although major progress has been made in understanding the scientific basis for HIV vaccine development, efficacy trials have been critical in moving the field forward. In 2009, the field was reinvigorated with the modest results obtained from the RV144 trial conducted in Thailand. Here, we review those vaccine development efforts, with an emphasis on events that occurred during the earlier years. The goal is to provide younger generations of scientists with information and inspiration to continue the search for an HIV vaccine. Copyright © 2013 The Author. Published by Elsevier Ltd. All rights reserved.

  6. To close the childhood immunization gap, we need a richer understanding of parents' decision-making.

    PubMed

    Corben, Paul; Leask, Julie

    2016-12-01

    Vaccination is widely acknowledged as one of the most successful public health interventions globally and in most high-income countries childhood vaccination coverage rates are moderately high. Yet in many instances, immunisation rates remain below aspirational targets and have shown only modest progress toward those targets in recent years, despite concerted efforts to improve uptake. In part, coverage rates reflect individual parents' vaccination attitudes and decisions and, because vaccination decision-making is complex and context-specific, it remains challenging at individual and community levels to assist parents to make positive decisions. Consequently, in the search for opportunities to improve immunisation coverage, there has been a renewed research focus on parents' decision-making. This review provides an overview of the literature surrounding parents' vaccination decision-making, offering suggestions for where efforts to increase vaccination coverage should be targeted and identifying areas for further research.

  7. New strategies to improve food marketing to children.

    PubMed

    Dietz, William H

    2013-09-01

    Federal efforts to address the impact of food marketing on children began more than thirty years ago, when the Federal Trade Commission sought comment on strategies to reduce young children's exposure to food advertising. The food, advertising, and television industries mounted a virulent response, and Congress withdrew the commission's authority to regulate unfair advertising to children. The same industries and Congress responded equally aggressively to the proposed nutrition criteria for food products marketed to children drafted by a working group of federal agencies in 2011. Although federal efforts over the past thirty years have led to modest improvements in food quality and marketing practices, commercial interests have consistently overridden the health concerns of children. Mobilization of parents as a political force to improve standards for food marketed to children, use of social media for counteradvertising, and the development of new technologies to decrease exposure to food advertisements could reduce the impact of food marketing to children.

  8. TEMPEST: A three-dimensional time-dependent computer program for hydrothermal analysis: Volume 2, Assessment and verification results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eyler, L L; Trent, D S; Budden, M J

    During the course of the TEMPEST computer code development a concurrent effort was conducted to assess the code's performance and the validity of computed results. The results of this work are presented in this document. The principal objective of this effort was to assure the code's computational correctness for a wide range of hydrothermal phenomena typical of fast breeder reactor application. 47 refs., 94 figs., 6 tabs.

  9. Accelerating 3D Elastic Wave Equations on Knights Landing based Intel Xeon Phi processors

    NASA Astrophysics Data System (ADS)

    Sourouri, Mohammed; Birger Raknes, Espen

    2017-04-01

    In advanced imaging methods like reverse-time migration (RTM) and full waveform inversion (FWI) the elastic wave equation (EWE) is numerically solved many times to create the seismic image or the elastic parameter model update. Thus, it is essential to optimize the solution time for solving the EWE as this will have a major impact on the total computational cost in running RTM or FWI. From a computational point of view, applications implementing EWEs are associated with two major challenges. The first challenge is the amount of memory-bound computations involved, while the second challenge is the execution of such computations over very large datasets. So far, multi-core processors have not been able to tackle these two challenges, which eventually led to the adoption of accelerators such as Graphics Processing Units (GPUs). Compared to conventional CPUs, GPUs are densely populated with many floating-point units and fast memory, a type of architecture that has proven to map well to many scientific computations. Despite its architectural advantages, full-scale adoption of accelerators has yet to materialize. First, accelerators require a significant programming effort imposed by programming models such as CUDA or OpenCL. Second, accelerators come with a limited amount of memory, which also requires explicit data transfers between the CPU and the accelerator over the slow PCI bus. The second generation of the Xeon Phi processor, based on the Knights Landing (KNL) architecture, promises the computational capabilities of an accelerator but requires the same programming effort as traditional multi-core processors. The high computational performance is realized through many integrated cores (the number of cores, tiles, and memory varies with the model) organized in tiles that are connected via a 2D mesh-based interconnect. In contrast to accelerators, KNL is a self-hosted system, meaning explicit data transfers over the PCI bus are no longer required. However, like most accelerators, KNL sports a memory subsystem consisting of low-level caches and 16 GB of high-bandwidth MCDRAM memory. For capacity computing, up to 400 GB of conventional DDR4 memory is provided. Such a strict hierarchical memory layout means that data locality is imperative if the true potential of this product is to be harnessed. In this work, we study a series of optimizations specifically targeting KNL for our EWE-based application to reduce the time-to-solution for the following 3D model sizes in grid points: 128³, 256³, and 512³. We compare the results with an optimized version for multi-core CPUs running on a dual-socket Xeon E5 2680v3 system using OpenMP. Our initial naive implementation on the KNL is roughly 20% faster than the multi-core version, but by using only one thread per core and careful memory placement using the memkind library, we could achieve higher speedups. Additionally, by using the MCDRAM as cache for problem sizes smaller than 16 GB, further performance improvements were unlocked. Depending on the problem size, our overall results indicate that the KNL-based system is approximately 2.2x faster than the 24-core Xeon E5 2680v3 system, with only modest changes to the code.

  10. 42 CFR 441.182 - Maintenance of effort: Computation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... SERVICES Inpatient Psychiatric Services for Individuals Under Age 21 in Psychiatric Facilities or Programs § 441.182 Maintenance of effort: Computation. (a) For expenditures for inpatient psychiatric services... total State Medicaid expenditures in the current quarter for inpatient psychiatric services and...

  11. An accurate binding interaction model in de novo computational protein design of interactions: if you build it, they will bind.

    PubMed

    London, Nir; Ambroggio, Xavier

    2014-02-01

    Computational protein design efforts aim to create novel proteins and functions in an automated manner and, in the process, these efforts shed light on the factors shaping natural proteins. The focus of these efforts has progressed from the interior of proteins to their surface and the design of functions, such as binding or catalysis. Here we examine progress in the development of robust methods for the computational design of non-natural interactions between proteins and molecular targets such as other proteins or small molecules. This problem is referred to as the de novo computational design of interactions. Recent successful efforts in de novo enzyme design and the de novo design of protein-protein interactions open a path towards solving this problem. We examine the common themes in these efforts, and review recent studies aimed at understanding the nature of successes and failures in the de novo computational design of interactions. While several approaches culminated in success, the use of a well-defined structural model for a specific binding interaction in particular has emerged as a key strategy for a successful design, and is therefore reviewed with special consideration. Copyright © 2013 Elsevier Inc. All rights reserved.

  12. Consumer decision making in the individual health insurance market.

    PubMed

    Marquis, M Susan; Buntin, Melinda Beeuwkes; Escarce, José J; Kapur, Kanika; Louis, Thomas A; Yegian, Jill M

    2006-01-01

    This paper summarizes the results from a study of consumer decision making in California's individual health insurance market. We conclude that price subsidies will have only modest effects on participation and that efforts to reduce nonprice barriers might be just as effective. We also find that there is substantial pooling in the individual market and that it increases over time because people who become sick can continue coverage without new underwriting. Finally, we show that people prefer more-generous benefits and that it is difficult to induce people in poor health to enroll in high-deductible health plans.

  13. Determining potential 30/20 GHZ domestic satellite system concepts and establishment of a suitable experimental configuration

    NASA Technical Reports Server (NTRS)

    Stevens, G. H.; Anzic, G.

    1979-01-01

    NASA is conducting a series of millimeter wave satellite communication systems and market studies to: (1) determine potential domestic 30/20 GHz satellite concepts and market potential, and (2) establish the requirements for a suitable technology verification payload which, although intended to be modest in capacity, would sufficiently demonstrate key technologies and experimentally address key operational issues. Preliminary results and critical issues of the current contracted effort are described. Also included is a description of a NASA-developed multibeam satellite payload configuration which may be representative of concepts utilized in a technology flight verification program.

  14. Preparing for the 90s using today's communications assets

    NASA Technical Reports Server (NTRS)

    Posner, Edward C.

    1987-01-01

    Such existing NASA/U.S. facilities and spacecraft as those of the Deep Space Network, VLA, and Arecibo are presently judged capable, with modest additional investment during the next five years, of acquiring unique space science data, generating mission planning data for missions to be launched in the early 1990s, and evaluating and demonstrating communications and navigation technology for missions of the late 1990s and beyond. The more ambitious of these efforts will contribute to the continued attractiveness of space research for students, as well as furnish an important part of their scientific training.

  15. Epigenetics: spotlight on type 2 diabetes and obesity.

    PubMed

    Desiderio, A; Spinelli, R; Ciccarelli, M; Nigro, C; Miele, C; Beguinot, F; Raciti, G A

    2016-10-01

    Type 2 diabetes (T2D) and obesity are major public health problems. Substantial efforts have been made to define loci and variants contributing to the individual risk of these disorders. However, the overall risk explained by genetic variation is very modest. Epigenetics is one of the fastest-growing research areas in biomedicine, as changes in the epigenome are involved in many biological processes, affect the risk for several complex diseases including diabetes, and may explain susceptibility. In this review, we focus on the role of DNA methylation in contributing to the risk of T2D and obesity.

  16. Turbulence modeling of free shear layers for high performance aircraft

    NASA Technical Reports Server (NTRS)

    Sondak, Douglas

    1993-01-01

    In many flowfield computations, accuracy of the turbulence model employed is frequently a limiting factor in the overall accuracy of the computation. This is particularly true for complex flowfields such as those around full aircraft configurations. Free shear layers such as wakes, impinging jets (in V/STOL applications), and mixing layers over cavities are often part of these flowfields. Although flowfields have been computed for full aircraft, the memory and CPU requirements for these computations are often excessive. Additional computer power is required for multidisciplinary computations such as coupled fluid dynamics and conduction heat transfer analysis. Massively parallel computers show promise in alleviating this situation, and the purpose of this effort was to adapt and optimize CFD codes to these new machines. The objective of this research effort was to compute the flowfield and heat transfer for a two-dimensional jet impinging normally on a cool plate. The results of this research effort were summarized in an AIAA paper titled 'Parallel Implementation of the k-epsilon Turbulence Model'. Appendix A contains the full paper.

  17. Sex differences in impulsivity: a meta-analysis.

    PubMed

    Cross, Catharine P; Copping, Lee T; Campbell, Anne

    2011-01-01

    Men are overrepresented in socially problematic behaviors, such as aggression and criminal behavior, which have been linked to impulsivity. Our review of impulsivity is organized around the tripartite theoretical distinction between reward hypersensitivity, punishment hyposensitivity, and inadequate effortful control. Drawing on evolutionary, criminological, developmental, and personality theories, we predicted that sex differences would be most pronounced in risky activities with men demonstrating greater sensation seeking, greater reward sensitivity, and lower punishment sensitivity. We predicted a small female advantage in effortful control. We analyzed 741 effect sizes from 277 studies, including psychometric and behavioral measures. Women were consistently more punishment sensitive (d = -0.33), but men did not show greater reward sensitivity (d = 0.01). Men showed significantly higher sensation seeking on questionnaire measures (d = 0.41) and on a behavioral risk-taking task (d = 0.36). Questionnaire measures of deficits in effortful control showed a very modest effect size in the male direction (d = 0.08). Sex differences were not found on delay discounting or executive function tasks. The results indicate a stronger sex difference in motivational rather than effortful or executive forms of behavior control. Specifically, they support evolutionary and biological theories of risk taking predicated on sex differences in punishment sensitivity. A clearer understanding of sex differences in impulsivity depends upon recognizing important distinctions between sensation seeking and impulsivity, between executive and effortful forms of control, and between impulsivity as a deficit and as a trait.

  18. Game theory of pre-emptive vaccination before bioterrorism or accidental release of smallpox.

    PubMed

    Molina, Chai; Earn, David J D

    2015-06-06

    Smallpox was eradicated in the 1970s, but new outbreaks could be seeded by bioterrorism or accidental release. Substantial vaccine-induced morbidity and mortality make pre-emptive mass vaccination controversial, and if vaccination is voluntary, then there is a conflict between self- and group interests. This conflict can be framed as a tragedy of the commons, in which herd immunity plays the role of the commons, and free-riding (i.e. not vaccinating pre-emptively) is analogous to exploiting the commons. This game has been analysed previously for a particular post-outbreak vaccination scenario. We consider several post-outbreak vaccination scenarios and compare the expected increase in mortality that results from voluntary versus imposed vaccination. Below a threshold level of post-outbreak vaccination effort, expected mortality is independent of the level of response effort. A lag between an outbreak starting and a response being initiated increases the post-outbreak vaccination effort necessary to reduce mortality. For some post-outbreak vaccination scenarios, even modest response lags make it impractical to reduce mortality by increasing post-outbreak vaccination effort. In such situations, if decreasing the response lag is impossible, the only practical way to reduce mortality is to make the vaccine safer (greater post-outbreak vaccination effort leads only to fewer people vaccinating pre-emptively). © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  19. Object Classification Based on Analysis of Spectral Characteristics of Seismic Signal Envelopes

    NASA Astrophysics Data System (ADS)

    Morozov, Yu. V.; Spektor, A. A.

    2017-11-01

    A method for classifying moving objects having a seismic effect on the ground surface is proposed which is based on statistical analysis of the envelopes of received signals. The values of the components of the amplitude spectrum of the envelopes obtained applying Hilbert and Fourier transforms are used as classification criteria. Examples illustrating the statistical properties of spectra and the operation of the seismic classifier are given for an ensemble of objects of four classes (person, group of people, large animal, vehicle). It is shown that the computational procedures for processing seismic signals are quite simple and can therefore be used in real-time systems with modest requirements for computational resources.
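
    The processing chain described above (envelope extraction via the Hilbert transform, followed by the amplitude spectrum of the envelope as classification features) can be sketched in a few lines of Python, assuming NumPy and SciPy are available; the feature count and toy signal below are arbitrary illustration choices rather than the authors' parameters.

```python
import numpy as np
from scipy.signal import hilbert

def envelope_spectrum_features(x, n_features=16):
    """Amplitude spectrum of the signal envelope, truncated to a short feature vector."""
    analytic = hilbert(x)                 # analytic signal via the Hilbert transform
    envelope = np.abs(analytic)           # instantaneous amplitude (the envelope)
    envelope -= envelope.mean()           # remove the DC component so it does not dominate
    spectrum = np.abs(np.fft.rfft(envelope)) / len(envelope)
    return spectrum[:n_features]          # low-frequency envelope components as features

# Toy example: a 40 Hz carrier with a 2 Hz amplitude modulation; the modulation
# appears as a peak near the low end of the envelope spectrum.
fs = 500.0
t = np.arange(0, 4.0, 1.0 / fs)
x = (1.0 + 0.5 * np.sin(2 * np.pi * 2.0 * t)) * np.sin(2 * np.pi * 40.0 * t)
print(envelope_spectrum_features(x))
```

    A classifier of whatever kind would then be trained on such feature vectors collected for each object class.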

  20. The detection and analysis of point processes in biological signals

    NASA Technical Reports Server (NTRS)

    Anderson, D. J.; Correia, M. J.

    1977-01-01

    A pragmatic approach to the detection and analysis of discrete events in biomedical signals is taken. Examples from both clinical and basic research are provided. Introductory sections discuss not only discrete events which are easily extracted from recordings by conventional threshold detectors but also events embedded in other information carrying signals. The primary considerations are factors governing event-time resolution and the effects limits to this resolution have on the subsequent analysis of the underlying process. The analysis portion describes tests for qualifying the records as stationary point processes and procedures for providing meaningful information about the biological signals under investigation. All of these procedures are designed to be implemented on laboratory computers of modest computational capacity.
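
    A minimal sketch of the conventional threshold-detection step mentioned above is given below, assuming a uniformly sampled signal held in a NumPy array; the threshold value and refractory period are arbitrary illustration choices, and any subsequent point-process analysis would operate on the returned event times.

```python
import numpy as np

def detect_events(signal, fs, threshold, refractory_s=0.002):
    """Return event times (s) at upward threshold crossings, enforcing a refractory period."""
    above = signal >= threshold
    crossings = np.flatnonzero(~above[:-1] & above[1:]) + 1   # first samples above threshold
    refractory = int(round(refractory_s * fs))
    events, last = [], -np.inf
    for idx in crossings:
        if idx - last >= refractory:       # ignore re-crossings within the refractory window
            events.append(idx / fs)
            last = idx
    return np.array(events)

# Toy example: three pulses riding on noise.
rng = np.random.default_rng(0)
fs = 10_000.0
t = np.arange(0, 0.1, 1 / fs)
x = 0.05 * rng.standard_normal(t.size)
for t0 in (0.02, 0.05, 0.08):
    x[int(t0 * fs):int(t0 * fs) + 20] += 1.0
print(detect_events(x, fs, threshold=0.5))
```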

  1. U.S. aerospace industry opinion of the effect of computer-aided prediction-design technology on future wind-tunnel test requirements for aircraft development programs

    NASA Technical Reports Server (NTRS)

    Treon, S. L.

    1979-01-01

    A survey of the U.S. aerospace industry in late 1977 suggests that there will be an increasing use of computer-aided prediction-design technology (CPD Tech) in the aircraft development process but that, overall, only a modest reduction in wind-tunnel test requirements from the current level is expected in the period through 1995. Opinions were received from key spokesmen in 23 of the 26 solicited major companies or corporate divisions involved in the design and manufacture of nonrotary wing aircraft. Development programs for nine types of aircraft related to test phases and wind-tunnel size and speed range were considered.

  2. Computational psychiatry

    PubMed Central

    Montague, P. Read; Dolan, Raymond J.; Friston, Karl J.; Dayan, Peter

    2013-01-01

    Computational ideas pervade many areas of science and have an integrative explanatory role in neuroscience and cognitive science. However, computational depictions of cognitive function have had surprisingly little impact on the way we assess mental illness because diseases of the mind have not been systematically conceptualized in computational terms. Here, we outline goals and nascent efforts in the new field of computational psychiatry, which seeks to characterize mental dysfunction in terms of aberrant computations over multiple scales. We highlight early efforts in this area that employ reinforcement learning and game theoretic frameworks to elucidate decision-making in health and disease. Looking forwards, we emphasize a need for theory development and large-scale computational phenotyping in human subjects. PMID:22177032

  3. ``But it doesn't come naturally'': how effort expenditure shapes the benefit of growth mindset on women's sense of intellectual belonging in computing

    NASA Astrophysics Data System (ADS)

    Stout, Jane G.; Blaney, Jennifer M.

    2017-10-01

    Research suggests growth mindset, or the belief that knowledge is acquired through effort, may enhance women's sense of belonging in male-dominated disciplines, like computing. However, other research indicates women who spend a great deal of time and energy in technical fields experience a low sense of belonging. The current study assessed the benefits of a growth mindset on women's (and men's) sense of intellectual belonging in computing, accounting for the amount of time and effort dedicated to academics. We define "intellectual belonging" as the sense that one is believed to be a competent member of the community. Whereas a stronger growth mindset was associated with stronger intellectual belonging for men, a growth mindset only boosted women's intellectual belonging when they did not work hard on academics. Our findings suggest, paradoxically, women may not benefit from a growth mindset in computing when they exert a lot of effort.

  4. Predicting Motivation: Computational Models of PFC Can Explain Neural Coding of Motivation and Effort-based Decision-making in Health and Disease.

    PubMed

    Vassena, Eliana; Deraeve, James; Alexander, William H

    2017-10-01

    Human behavior is strongly driven by the pursuit of rewards. In daily life, however, benefits mostly come at a cost, often requiring that effort be exerted to obtain potential benefits. Medial PFC (MPFC) and dorsolateral PFC (DLPFC) are frequently implicated in the expectation of effortful control, showing increased activity as a function of predicted task difficulty. Such activity partially overlaps with expectation of reward and has been observed both during decision-making and during task preparation. Recently, novel computational frameworks have been developed to explain activity in these regions during cognitive control, based on the principle of prediction and prediction error (predicted response-outcome [PRO] model [Alexander, W. H., & Brown, J. W. Medial prefrontal cortex as an action-outcome predictor. Nature Neuroscience, 14, 1338-1344, 2011], hierarchical error representation [HER] model [Alexander, W. H., & Brown, J. W. Hierarchical error representation: A computational model of anterior cingulate and dorsolateral prefrontal cortex. Neural Computation, 27, 2354-2410, 2015]). Despite the broad explanatory power of these models, it is not clear whether they can also accommodate effects related to the expectation of effort observed in MPFC and DLPFC. Here, we propose a translation of these computational frameworks to the domain of effort-based behavior. First, we discuss how the PRO model, based on prediction error, can explain effort-related activity in MPFC, by reframing effort-based behavior in a predictive context. We propose that MPFC activity reflects monitoring of motivationally relevant variables (such as effort and reward), by coding expectations and discrepancies from such expectations. Moreover, we derive behavioral and neural model-based predictions for healthy controls and clinical populations with impairments of motivation. Second, we illustrate the possible translation to effort-based behavior of the HER model, an extended version of PRO model based on hierarchical error prediction, developed to explain MPFC-DLPFC interactions. We derive behavioral predictions that describe how effort and reward information is coded in PFC and how changing the configuration of such environmental information might affect decision-making and task performance involving motivation.
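
    The published PRO and HER models are considerably richer than this, but the prediction-error principle they build on can be illustrated with a generic delta-rule update over expected reward and expected effort; the learning rate, trial values, and the crude effort-discounted value below are simplifying assumptions for illustration only, not the published models.

```python
# Generic prediction-error (delta-rule) updates for expected reward and expected effort.
# This is a simplified illustration of the prediction-error principle, not the PRO or
# HER model itself.

def delta_update(expectation, outcome, learning_rate=0.1):
    """Move an expectation toward an observed outcome by a fraction of the prediction error."""
    prediction_error = outcome - expectation
    return expectation + learning_rate * prediction_error, prediction_error

expected_reward, expected_effort = 0.0, 0.0
trials = [(1.0, 0.8), (1.0, 0.8), (0.0, 0.8), (1.0, 0.2)]   # (reward, effort) per trial

for reward, effort in trials:
    expected_reward, rpe = delta_update(expected_reward, reward)
    expected_effort, epe = delta_update(expected_effort, effort)
    net_value = expected_reward - expected_effort            # crude effort-discounted value
    print(f"RPE={rpe:+.2f}  EPE={epe:+.2f}  net value={net_value:+.2f}")
```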

  5. Status of Computational Aerodynamic Modeling Tools for Aircraft Loss-of-Control

    NASA Technical Reports Server (NTRS)

    Frink, Neal T.; Murphy, Patrick C.; Atkins, Harold L.; Viken, Sally A.; Petrilli, Justin L.; Gopalarathnam, Ashok; Paul, Ryan C.

    2016-01-01

    A concerted effort has been underway over the past several years to evolve computational capabilities for modeling aircraft loss-of-control under the NASA Aviation Safety Program. A principal goal has been to develop reliable computational tools for predicting and analyzing the non-linear stability & control characteristics of aircraft near stall boundaries affecting safe flight, and for utilizing those predictions for creating augmented flight simulation models that improve pilot training. Pursuing such an ambitious task with limited resources required the forging of close collaborative relationships with a diverse body of computational aerodynamicists and flight simulation experts to leverage their respective research efforts into the creation of NASA tools to meet this goal. Considerable progress has been made and work remains to be done. This paper summarizes the status of the NASA effort to establish computational capabilities for modeling aircraft loss-of-control and offers recommendations for future work.

  6. The use of a proactive dissemination strategy to optimize reach of an internet-delivered computer tailored lifestyle intervention.

    PubMed

    Schneider, Francine; Schulz, Daniela N; Pouwels, Loes H L; de Vries, Hein; van Osch, Liesbeth A D M

    2013-08-05

    The use of reactive strategies to disseminate effective Internet-delivered lifestyle interventions restricts their level of reach within the target population. This stresses the need to invest in proactive strategies to offer these interventions to the target population. The present study used a proactive strategy to increase reach of an Internet-delivered multi component computer tailored intervention, by embedding the intervention in an existing online health monitoring system of the Regional Public Health Services in the Netherlands. The research population consisted of Dutch adults who were invited to participate in the Adult Health Monitor (N = 96,388) offered by the Regional Public Health Services. This Monitor consisted of an online or a written questionnaire. A prospective design was used to determine levels of reach, by focusing on actual participation in the lifestyle intervention. Furthermore, adequacy of reach among the target group was assessed by composing detailed profiles of intervention users. Participants' characteristics, like demographics, behavioral and mental health status and quality of life, were included in the model as predictors. A total of 41,155 (43%) people participated in the Adult Health Monitor, of which 41% (n = 16,940) filled out the online version. More than half of the online participants indicated their interest (n = 9169; 54%) in the computer tailored intervention and 5168 participants (31%) actually participated in the Internet-delivered computer tailored intervention. Males, older respondents and individuals with a higher educational degree were significantly more likely to participate in the intervention. Furthermore, results indicated that especially participants with a relatively healthier lifestyle and a healthy BMI were likely to participate. With one out of three online Adult Health Monitor participants actually participating in the computer tailored lifestyle intervention, the employed proactive dissemination strategy succeeded in ensuring relatively high levels of reach. Reach among at-risk individuals (e.g. low socioeconomic status and unhealthy lifestyle) was modest. It is therefore essential to further optimize reach by putting additional effort into increasing interest in the lifestyle intervention among at-risk individuals and to encourage them to actually use the intervention. Dutch Trial Register (NTR1786) and Medical Ethics Committee of Maastricht University and the University Hospital Maastricht (NL2723506809/MEC0903016).

  7. The use of a proactive dissemination strategy to optimize reach of an internet-delivered computer tailored lifestyle intervention

    PubMed Central

    2013-01-01

    Background The use of reactive strategies to disseminate effective Internet-delivered lifestyle interventions restricts their level of reach within the target population. This stresses the need to invest in proactive strategies to offer these interventions to the target population. The present study used a proactive strategy to increase reach of an Internet-delivered multi component computer tailored intervention, by embedding the intervention in an existing online health monitoring system of the Regional Public Health Services in the Netherlands. Methods The research population consisted of Dutch adults who were invited to participate in the Adult Health Monitor (N = 96,388) offered by the Regional Public Health Services. This Monitor consisted of an online or a written questionnaire. A prospective design was used to determine levels of reach, by focusing on actual participation in the lifestyle intervention. Furthermore, adequacy of reach among the target group was assessed by composing detailed profiles of intervention users. Participants’ characteristics, like demographics, behavioral and mental health status and quality of life, were included in the model as predictors. Results A total of 41,155 (43%) people participated in the Adult Health Monitor, of which 41% (n = 16,940) filled out the online version. More than half of the online participants indicated their interest (n = 9169; 54%) in the computer tailored intervention and 5168 participants (31%) actually participated in the Internet-delivered computer tailored intervention. Males, older respondents and individuals with a higher educational degree were significantly more likely to participate in the intervention. Furthermore, results indicated that especially participants with a relatively healthier lifestyle and a healthy BMI were likely to participate. Conclusions With one out of three online Adult Health Monitor participants actually participating in the computer tailored lifestyle intervention, the employed proactive dissemination strategy succeeded in ensuring relatively high levels of reach. Reach among at-risk individuals (e.g. low socioeconomic status and unhealthy lifestyle) was modest. It is therefore essential to further optimize reach by putting additional effort into increasing interest in the lifestyle intervention among at-risk individuals and to encourage them to actually use the intervention. Trial registration Dutch Trial Register (NTR1786) and Medical Ethics Committee of Maastricht University and the University Hospital Maastricht (NL2723506809/MEC0903016). PMID:23914991

  8. Report of the Commission to Assess the Threat to the United States from Electromagnetic Pulse (EMP) Attack: Critical National Infrastructures

    DTIC Science & Technology

    2008-04-01

    ... consumers and electric utilities in Arizona and Southern California. Twelve people, including five children, died as a result of the explosion. ... Modern electronics, communications, protection, control and computers have allowed the physical system to be utilized fully with ever smaller margins for error. Therefore, a relatively modest upset to the system can cause functional collapse. As the system grows in complexity and interdependence ...

  9. Extraction of Vertical Profiles of Atmospheric Variables from Gridded Binary, Edition 2 (GRIB2) Model Output Files

    DTIC Science & Technology

    2018-01-18

    ... processing. Specifically, the method described herein uses wgrib2 commands along with a Python script or program to produce tabular text files that ... It makes use of software that is readily available and can be implemented on many computer systems combined with relatively modest additional ... example), extracts appropriate information, and lists the extracted information in a readable tabular form. The Python script used here is described in ...
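
    A minimal sketch of how such a wgrib2-plus-Python workflow might be wired together is shown below; it assumes wgrib2 is installed and on the PATH and that its -match and -csv options behave as commonly documented, and the GRIB2 file name, match pattern, and output path are placeholders rather than details taken from the report.

```python
# Sketch of driving wgrib2 from Python to pull one variable into a tabular text file.
# Assumes wgrib2 is on PATH; "-match" and "-csv" are standard wgrib2 options, but check
# your local build's documentation. File names and the match pattern are placeholders.
import csv
import subprocess

def extract_variable(grib2_path, pattern, csv_path):
    """Run wgrib2 to dump records matching `pattern` (e.g. ':TMP:') to a CSV file."""
    subprocess.run(
        ["wgrib2", grib2_path, "-match", pattern, "-csv", csv_path],
        check=True,
    )
    with open(csv_path, newline="") as fh:
        # wgrib2 CSV rows are: time1, time2, variable, level, longitude, latitude, value
        return [row for row in csv.reader(fh)]

rows = extract_variable("model_output.grb2", ":TMP:", "tmp_profile.csv")
for row in rows[:5]:
    print(row)
```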

  10. Resolute efforts to cure hepatitis C: Understanding patients' reasons for completing antiviral treatment.

    PubMed

    Clark, Jack A; Gifford, Allen L

    2015-09-01

    Antiviral treatment for hepatitis C is usually difficult, demanding, and debilitating and has long offered modest prospects of successful cure. Most people who may need treatment have faced stigma of an illness associated with drug and alcohol misuse and thus may be deemed poor candidates for treatment, while completing a course of treatment typically calls for resolve and responsibility. Patients' efforts and their reasons for completing treatment have received scant attention in hepatitis C clinical policy discourse that instead focuses on problems of adherence and patients' expected failures. Thus, we conducted qualitative interviews with patients who had recently undertaken treatment to explore their reasons for completing antiviral treatment. Analysis of their narrative accounts identified four principal reasons: cure the infection, avoid a bad end, demonstrate the virtue of perseverance through a personal trial, and achieve personal rehabilitation. Their reasons reflect moral rationales that mark the social discredit ascribed to the infection and may represent efforts to restore creditable social membership. Their reasons may also reflect the selection processes that render some of the infected as good candidates for treatment, while excluding others. Explication of the moral context of treatment may identify opportunities to support patients' efforts in completing treatment, as well as illuminate the choices people with hepatitis C make about engaging in care. © US Government 2014.

  11. Clinical utility of the Conners' Continuous Performance Test-II to detect poor effort in U.S. military personnel following traumatic brain injury.

    PubMed

    Lange, Rael T; Iverson, Grant L; Brickell, Tracey A; Staver, Tara; Pancholi, Sonal; Bhagwat, Aditya; French, Louis M

    2013-06-01

    The purpose of this study is to examine the clinical utility of the Conners' Continuous Performance Test (CPT-II) as an embedded marker of poor effort in military personnel undergoing neuropsychological evaluations following traumatic brain injury. Participants were 158 U.S. military service members divided into 3 groups on the basis of brain injury severity and performance (pass/fail) on 2 symptom validity tests: Mild Traumatic Brain Injury (MTBI)-Pass (n = 87), MTBI-Fail (n = 42), and severe traumatic brain injury (STBI)-Pass (n = 29). The MTBI-Fail group performed worse on the majority of CPT-II measures compared with both the MTBI-Pass and STBI-Pass groups. When comparing the MTBI-Fail group and MTBI-Pass groups, the most accurate measure for identifying poor effort was the Commission T score. When selected measures were combined (i.e., Omissions, Commissions, and Perseverations), there was a very small increase in sensitivity (from .26 to .29). When comparing the MTBI-Fail group and STBI-Pass groups, the most accurate measure for identifying poor effort was the Omission and Commissions T score. When selected measures were combined, sensitivity again increased (from .24 to .45). Overall, these results suggest that individual CPT-II measures can be useful for identifying people who are suspected of providing poor effort from those who have provided adequate effort. However, due to low sensitivity and modest negative predictive power values, this measure cannot be used in isolation to detect poor effort, and is largely useful as a test to "rule in," not "rule out" poor effort. PsycINFO Database Record (c) 2013 APA, all rights reserved.
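
    The trade-off the authors emphasize (useful for ruling in, but not ruling out, poor effort) follows directly from confusion-matrix arithmetic; the sketch below uses invented cell counts chosen only to mirror the scale of the reported groups, not the study's actual data.

```python
# Illustrative confusion-matrix arithmetic with made-up counts (not the study's data).

def classifier_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, and negative predictive value from confusion counts."""
    sensitivity = tp / (tp + fn)   # proportion of poor-effort cases flagged
    specificity = tn / (tn + fp)   # proportion of adequate-effort cases passed
    npv = tn / (tn + fn)           # confidence that a negative result means adequate effort
    return sensitivity, specificity, npv

# Hypothetical: 42 poor-effort and 87 adequate-effort examinees, with a cutoff that
# flags 11 of the 42 and 4 of the 87.
sens, spec, npv = classifier_metrics(tp=11, fp=4, tn=83, fn=31)
print(f"sensitivity={sens:.2f}  specificity={spec:.2f}  NPV={npv:.2f}")
```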

  12. Mild KCC2 Hypofunction Causes Inconspicuous Chloride Dysregulation that Degrades Neural Coding

    PubMed Central

    Doyon, Nicolas; Prescott, Steven A.; De Koninck, Yves

    2016-01-01

    Disinhibition caused by Cl− dysregulation is implicated in several neurological disorders. This form of disinhibition, which stems primarily from impaired Cl− extrusion through the co-transporter KCC2, is typically identified by a depolarizing shift in the GABA reversal potential (E_GABA). Here we show, using computer simulations, that intracellular [Cl−] exhibits exaggerated fluctuations during transient Cl− loads and recovers more slowly to baseline when KCC2 level is even modestly reduced. Using information theory and signal detection theory, we show that increased Cl− lability and settling time degrade neural coding. Importantly, these deleterious effects manifest after less KCC2 reduction than needed to produce the gross changes in E_GABA required for detection by most experiments, which assess KCC2 function under weak Cl− load conditions. By demonstrating the existence and functional consequences of “occult” Cl− dysregulation, these results suggest that modest KCC2 hypofunction plays a greater role in neurological disorders than previously believed. PMID:26858607
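
    A toy one-compartment simulation conveys the qualitative point about slower recovery from a transient chloride load when extrusion capacity is reduced; the single-exponential kinetics, rate constant, load size, and concentrations below are assumptions made purely for illustration and are not the authors' biophysical model.

```python
import numpy as np

def simulate_cl(kcc2_fraction, baseline=5.0, load=10.0, k_extrude=2.0,
                t_end=5.0, dt=0.001):
    """Euler integration of d[Cl]/dt = -k_extrude*kcc2_fraction*([Cl]-baseline) after a step load."""
    n = int(t_end / dt)
    cl = np.empty(n)
    cl[0] = baseline + load                       # transient Cl- load applied at t = 0
    for i in range(1, n):
        dcl = -k_extrude * kcc2_fraction * (cl[i - 1] - baseline)
        cl[i] = cl[i - 1] + dt * dcl
    return cl

baseline = 5.0
t = np.arange(0, 5.0, 0.001)
for fraction in (1.0, 0.7, 0.4):                  # full, modestly reduced, strongly reduced KCC2
    cl = simulate_cl(fraction, baseline=baseline)
    settle = t[np.argmax(cl < baseline + 0.5)]    # time to recover to within 0.5 mM of baseline
    print(f"KCC2 x{fraction:.1f}: settling time ~{settle:.2f} s")
```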

  13. Diversification of Processors Based on Redundancy in Instruction Set

    NASA Astrophysics Data System (ADS)

    Ichikawa, Shuichi; Sawada, Takashi; Hata, Hisashi

    By diversifying processor architecture, computer software is expected to be more resistant to plagiarism, analysis, and attacks. This study presents a new method to diversify instruction set architecture (ISA) by utilizing the redundancy in the instruction set. Our method is particularly suited for embedded systems implemented with FPGA technology, and realizes a genuine instruction set randomization, which has not been provided by the preceding studies. The evaluation results on four typical ISAs indicate that our scheme can provide a far larger degree of freedom than the preceding studies. Diversified processors based on MIPS architecture were actually implemented and evaluated with Xilinx Spartan-3 FPGA. The increase of logic scale was modest: 5.1% in Specialized design and 3.6% in RAM-mapped design. The performance overhead was also modest: 3.4% in Specialized design and 11.6% in RAM-mapped design. From these results, our scheme is regarded as a practical and promising way to secure FPGA-based embedded systems.

  14. Lawrence Livermore National Laboratories Perspective on Code Development and High Performance Computing Resources in Support of the National HED/ICF Effort

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clouse, C. J.; Edwards, M. J.; McCoy, M. G.

    2015-07-07

    Through its Advanced Scientific Computing (ASC) and Inertial Confinement Fusion (ICF) code development efforts, Lawrence Livermore National Laboratory (LLNL) provides a world-leading numerical simulation capability for the National HED/ICF program in support of the Stockpile Stewardship Program (SSP). In addition, the ASC effort provides high performance computing platform capabilities upon which these codes are run. LLNL remains committed to, and will work with, the national HED/ICF program community to help ensure numerical simulation needs are met and to make those capabilities available, consistent with programmatic priorities and available resources.

  15. Capturing in situ skeletal muscle power for circulatory support: a new approach to device design.

    PubMed

    Trumble, Dennis R; Magovern, James A

    2003-01-01

    Efforts to harness in situ skeletal muscle for circulatory support have been extensive, but implants designed to tap this power source have yet to meet the strict performance standards incumbent upon such devices. A fourth generation muscle energy converter (MEC4) is described that represents a significant departure from previous hydraulic muscle pump designs, all of which have assumed a long cylindrical profile. The MEC4, in contrast, features a puck shaped metallic bellows oriented so that its end fittings lie parallel to the chest wall. The fixed end is centered over a fluid port that passes into the thoracic cavity across one resected rib. The opposite end of the bellows supports a roller bearing that moves beneath a linear cam fixed to a reciprocating shaft. The shaft exits the housing through a spring loaded seal and is attached to a sintered anchor pad for muscle tendon fixation. This configuration was chosen to improve bellows durability, lower device profile, and reduce tissue encumbrance to actuator recoil. Bench tests show that modest actuation forces can effect full actuator displacement in 0.25 seconds against high pressure loads, transmitting up to 0.9 J/stroke at 60% efficiency. In vitro tests also confirm that key device performance parameters can be computed from pressure readings transmitted via radiotelemetry, clearing the way for long-term implant studies in conscious animals.

  16. A Spatial Probit Econometric Model of Land Change: The Case of Infrastructure Development in Western Amazonia, Peru

    PubMed Central

    Arima, E. Y.

    2016-01-01

    Tropical forests are now at the center stage of climate mitigation policies worldwide given their roles as sources of carbon emissions resulting from deforestation and forest degradation. Although the international community has created mechanisms such as REDD+ to reduce those emissions, developing tropical countries continue to invest in infrastructure development in an effort to spur economic growth. Construction of roads in particular is known to be an important driver of deforestation. This article simulates the impact of road construction on deforestation in Western Amazonia, Peru, and quantifies the amount of carbon emissions associated with projected deforestation. To accomplish this objective, the article adopts a Bayesian probit land change model in which spatial dependencies are defined between regions or groups of pixels instead of between individual pixels, thereby reducing computational requirements. It also compares and contrasts the patterns of deforestation predicted by both spatial and non-spatial probit models. The spatial model replicates complex patterns of deforestation whereas the non-spatial model fails to do so. In terms of policy, both models suggest that road construction will increase deforestation by a modest amount, between 200–300 km². This translates into aboveground carbon emissions of 1.36 and 1.85 × 10⁶ tons. However, recent introduction of palm oil in the region serves as a cautionary example that the models may be underestimating the impact of roads. PMID:27010739

  17. A Spatial Probit Econometric Model of Land Change: The Case of Infrastructure Development in Western Amazonia, Peru.

    PubMed

    Arima, E Y

    2016-01-01

    Tropical forests are now at the center stage of climate mitigation policies worldwide given their roles as sources of carbon emissions resulting from deforestation and forest degradation. Although the international community has created mechanisms such as REDD+ to reduce those emissions, developing tropical countries continue to invest in infrastructure development in an effort to spur economic growth. Construction of roads in particular is known to be an important driver of deforestation. This article simulates the impact of road construction on deforestation in Western Amazonia, Peru, and quantifies the amount of carbon emissions associated with projected deforestation. To accomplish this objective, the article adopts a Bayesian probit land change model in which spatial dependencies are defined between regions or groups of pixels instead of between individual pixels, thereby reducing computational requirements. It also compares and contrasts the patterns of deforestation predicted by both spatial and non-spatial probit models. The spatial model replicates complex patterns of deforestation whereas the non-spatial model fails to do so. In terms of policy, both models suggest that road construction will increase deforestation by a modest amount, between 200-300 km². This translates into aboveground carbon emissions of 1.36 and 1.85 × 10⁶ tons. However, recent introduction of palm oil in the region serves as a cautionary example that the models may be underestimating the impact of roads.
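
    Setting the spatial dependence structure aside, the non-spatial benchmark such an analysis compares against is essentially a probit regression of a deforestation indicator on pixel covariates; the sketch below fits one to synthetic data, assuming statsmodels is available, with covariate names and coefficients invented purely for illustration.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2_000

# Synthetic pixel covariates (invented): distance to nearest road (km) and slope (degrees).
dist_road = rng.uniform(0, 50, n)
slope = rng.uniform(0, 30, n)

# Synthetic "true" process: deforestation more likely near roads and on gentle slopes.
latent = 0.5 - 0.08 * dist_road - 0.03 * slope + rng.standard_normal(n)
deforested = (latent > 0).astype(int)

X = sm.add_constant(np.column_stack([dist_road, slope]))
model = sm.Probit(deforested, X).fit(disp=False)   # non-spatial probit benchmark
print(model.params)                                # constant, dist_road, slope coefficients
```

    A spatial probit would additionally let the latent propensity of one region depend on its neighbors, which is the part that raises the computational cost.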

  18. Coupled Human-Environment Dynamics of Forest Pest Spread and Control in a Multi-Patch, Stochastic Setting

    PubMed Central

    Ali, Qasim; Bauch, Chris T.; Anand, Madhur

    2015-01-01

    Background The transportation of camp firewood infested by non-native forest pests such as Asian long-horned beetle (ALB) and emerald ash borer (EAB) has severe impacts on North American forests. Once invasive forest pests are established, it can be difficult to eradicate them. Hence, preventing the long-distance transport of firewood by individuals is crucial. Methods Here we develop a stochastic simulation model that captures the interaction between forest pest infestations and human decisions regarding firewood transportation. The population of trees is distributed across 10 patches (parks) comprising a “low volume” partition of 5 patches that experience a low volume of park visitors, and a “high volume” partition of 5 patches experiencing a high visitor volume. The infestation spreads within a patch—and also between patches—according to the probability of between-patch firewood transportation. Individuals decide to transport firewood or buy it locally based on the costs of locally purchased versus transported firewood, social norms, social learning, and level of concern for observed infestations. Results We find that the average time until a patch becomes infested depends nonlinearly on many model parameters. In particular, modest increases in the tree removal rate, modest increases in public concern for infestation, and modest decreases in the cost of locally purchased firewood, relative to baseline (current) values, cause very large increases in the average time until a patch becomes infested due to firewood transport from other patches, thereby better preventing long-distance spread. Patches that experience lower visitor volumes benefit more from firewood movement restrictions than patches that experience higher visitor volumes. Also, cross–patch infestations not only seed new infestations, they can also worsen existing infestations to a surprising extent: long-term infestations are more intense in the high volume patches than the low volume patches, even when infestation is already endemic everywhere. Conclusions The success of efforts to prevent long-distance spread of forest pests may depend sensitively on the interaction between outbreak dynamics and human social processes, with similar levels of effort producing very different outcomes depending on where the coupled human and natural system exists in parameter space. Further development of such modeling approaches through better empirical validation should yield more precise recommendations for ways to optimally prevent the long-distance spread of invasive forest pests. PMID:26430902
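
    A heavily stripped-down sketch of the kind of patch-level stochastic spread process described above (new infestations seeded when visitors move firewood from an infested patch) is given below; the patch count, visitor volumes, transport probability, and the simple Bernoulli transport rule are assumptions made for illustration, not the paper's coupled human-environment model.

```python
import random

def simulate_spread(n_patches=10, transport_prob=0.01, visits_per_step=(5, 50),
                    steps=200, seed=1):
    """Track when patches become infested as visitors may carry firewood from infested patches."""
    random.seed(seed)
    infested = [False] * n_patches
    infested[0] = True                           # one patch infested at the start
    first_infested = {0: 0}
    # first half of the patches receive low visitor volume, second half high volume
    volumes = ([visits_per_step[0]] * (n_patches // 2) +
               [visits_per_step[1]] * (n_patches - n_patches // 2))
    for t in range(1, steps + 1):
        for dest in range(n_patches):
            if infested[dest]:
                continue
            # each visitor to `dest` may carry firewood from a random (possibly infested) patch
            for _ in range(volumes[dest]):
                source = random.randrange(n_patches)
                if infested[source] and random.random() < transport_prob:
                    infested[dest] = True
                    first_infested[dest] = t
                    break
    return first_infested

print(simulate_spread())   # patch index -> time step of first infestation
```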

  19. Human Influences on Geomorphic Dynamics in Western Montana Gravel-Bed Rivers

    NASA Astrophysics Data System (ADS)

    Wilcox, A. C.

    2016-12-01

    Management of river ecosystems, river restoration, climate-change vulnerability assessments, and other applications require understanding of how current channel conditions and processes compare to historical ranges of variability. This is particularly true with respect to evaluation of sediment balances, including whether and how current sediment supply compares to background conditions. In western Montana, management and restoration efforts are in some cases driven by the perception that anthropogenic activities have elevated sediment yields above background levels; human-induced erosional increases have been documented in certain environments, but empirical supporting evidence is lacking for western Montana rivers. Here, human-induced changes in channel form and in sediment balances, including flow, sediment supply, and erosion rates, are evaluated for rivers in western Montana, with a particular focus on the Clark Fork and Bitterroot Rivers. These rivers are characteristic of systems in the northern Rocky Mountains with gravel beds, historically wandering channel patterns, modest bed-material loads, and land uses including logging, mining, and agriculture. The Clark Fork is influenced by legacy mining-related sediments and associated contaminants, remediation efforts, and the 2008 removal of Milltown Dam. These influences have caused temporary shifts in sediment balances, but overall, sediment fluxes are modest (e.g., suspended sediment fluxes of 6 tonnes km⁻² yr⁻¹ at the USGS Turah gage). The Bitterroot River is influenced by a mix of glaciated and unglaciated landscapes with fire-dominated erosional regimes and larger sand supply than the Clark Fork, reflecting lithologic differences; erosion rates, and the imprint of anthropogenic activities on sediment dynamics, are being investigated. This work has implications for river restoration, including whether measures are needed to impose channel stability, and for evaluating how climate-change-induced changes in fire, runoff, and erosion will alter fluvial sediment balances.

  20. Prevention of chronic lung disease

    PubMed Central

    Laughon, Matthew M.; Smith, P. Brian; Bose, Carl

    2010-01-01

    Considerable effort has been devoted to the development of strategies to reduce the incidence of chronic lung disease, including use of medications, nutritional therapies, and respiratory care practices. Unfortunately, most of these strategies have not been successful. To date, the only two treatments developed specifically to prevent CLD whose efficacy is supported by evidence from randomized, controlled trials are the parenteral administration of vitamin A and corticosteroids. Two other therapies, the use of caffeine for the treatment of apnea of prematurity and aggressive phototherapy for the treatment of hyperbilirubinemia, were evaluated for the improvement of other outcomes and found to reduce CLD. Cohort studies suggest that the use of CPAP as a strategy for avoiding mechanical ventilation might also be beneficial. Other therapies reduce lung injury in animal models but do not appear to reduce CLD in humans. The benefits of the efficacious therapies have been modest, with an absolute risk reduction in the 7–11% range. Further preventive strategies are needed to reduce the burden of this disease. However, each will need to be tested in randomized, controlled trials, and the expectations of new therapies should be modest reductions of the incidence of the disease. PMID:19736053

  1. Individualized adjustments to reference phantom internal organ dosimetry—scaling factors given knowledge of patient external anatomy

    NASA Astrophysics Data System (ADS)

    Wayson, Michael B.; Bolch, Wesley E.

    2018-04-01

    Internal radiation dose estimates for diagnostic nuclear medicine procedures are typically calculated for a reference individual. As a result, there is uncertainty when determining the organ doses to patients who are not at the 50th percentile for height or weight. This study aims to better personalize internal radiation dose estimates for individual patients by modifying the dose estimates calculated for reference individuals based on easily obtainable morphometric characteristics of the patient. Phantoms of different sitting heights and waist circumferences were constructed based on computational reference phantoms for the newborn, 10-year-old, and adult. Monoenergetic photons and electrons were then simulated separately at 15 energies. Photon and electron specific absorbed fractions (SAFs) were computed for the newly constructed non-reference phantoms and compared to SAFs previously generated for the age-matched reference phantoms. Differences in SAFs were correlated to changes in sitting height and waist circumference to develop scaling factors that could be applied to reference SAFs as morphometry corrections. A further set of arbitrary non-reference phantoms was then constructed and used in validation studies for the SAF scaling factors. Both photon and electron dose scaling methods were found to increase average accuracy when sitting height was used as the scaling parameter (~11%). Photon waist circumference-based scaling factors showed modest increases in average accuracy (~7%) for underweight individuals, but not for overweight individuals. Electron waist circumference-based scaling factors did not show increases in average accuracy. When sitting height and waist circumference scaling factors were combined, modest average gains in accuracy were observed for photons (~6%), but not for electrons. Both photon and electron absorbed doses can be scaled more reliably using the scaling factors computed in this study, and they can be scaled effectively using sitting height alone as the patient-specific morphometric parameter.
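    As a purely illustrative sketch of how such a morphometry correction might be applied in practice, the snippet below scales a reference SAF by the fractional deviation in sitting height. The linear form, the coefficient, and all numbers are assumptions for illustration; they are not the fitted scaling factors reported in the study.

      def scaled_saf(saf_reference, sitting_height_patient, sitting_height_reference,
                     slope=0.8):
          """Scale a reference SAF by the fractional deviation in sitting height."""
          fractional_change = ((sitting_height_patient - sitting_height_reference)
                               / sitting_height_reference)
          # Hypothetical linear correction: self-dose SAFs tend to fall as body
          # size grows, so larger-than-reference patients get a downward scaling.
          return saf_reference * (1.0 - slope * fractional_change)

      # Example: reference adult sitting height 88 cm, patient 95 cm (values illustrative)
      print(scaled_saf(saf_reference=2.5e-3, sitting_height_patient=95.0,
                       sitting_height_reference=88.0))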

  2. Individualized adjustments to reference phantom internal organ dosimetry-scaling factors given knowledge of patient external anatomy.

    PubMed

    Wayson, Michael B; Bolch, Wesley E

    2018-04-13

    Internal radiation dose estimates for diagnostic nuclear medicine procedures are typically calculated for a reference individual. As a result, there is uncertainty when determining the organ doses to patients who are not at the 50th percentile for height or weight. This study aims to better personalize internal radiation dose estimates for individual patients by modifying the dose estimates calculated for reference individuals based on easily obtainable morphometric characteristics of the patient. Phantoms of different sitting heights and waist circumferences were constructed based on computational reference phantoms for the newborn, 10-year-old, and adult. Monoenergetic photons and electrons were then simulated separately at 15 energies. Photon and electron specific absorbed fractions (SAFs) were computed for the newly constructed non-reference phantoms and compared to SAFs previously generated for the age-matched reference phantoms. Differences in SAFs were correlated to changes in sitting height and waist circumference to develop scaling factors that could be applied to reference SAFs as morphometry corrections. A further set of arbitrary non-reference phantoms was then constructed and used in validation studies for the SAF scaling factors. Both photon and electron dose scaling methods were found to increase average accuracy when sitting height was used as the scaling parameter (~11%). Photon waist circumference-based scaling factors showed modest increases in average accuracy (~7%) for underweight individuals, but not for overweight individuals. Electron waist circumference-based scaling factors did not show increases in average accuracy. When sitting height and waist circumference scaling factors were combined, modest average gains in accuracy were observed for photons (~6%), but not for electrons. Both photon and electron absorbed doses can be scaled more reliably using the scaling factors computed in this study, and they can be scaled effectively using sitting height alone as the patient-specific morphometric parameter.

  3. New Mexico district work-effort analysis computer program

    USGS Publications Warehouse

    Hiss, W.L.; Trantolo, A.P.; Sparks, J.L.

    1972-01-01

    The computer program (CAN 2) described in this report is one of several related programs used in the New Mexico District cost-analysis system. The work-effort information used in these programs is accumulated and entered to the nearest hour on forms completed by each employee. Tabulating cards are punched directly from these forms after visual examinations for errors are made. Reports containing detailed work-effort data itemized by employee within each project and account and by account and project for each employee are prepared for both current-month and year-to-date periods by the CAN 2 computer program. An option allowing preparation of reports for a specified 3-month period is provided. The total number of hours worked on each account and project and a grand total of hours worked in the New Mexico District is computed and presented in a summary report for each period. Work effort not chargeable directly to individual projects or accounts is considered as overhead and can be apportioned to the individual accounts and projects on the basis of the ratio of the total hours of work effort for the individual accounts or projects to the total New Mexico District work effort at the option of the user. The hours of work performed by a particular section, such as General Investigations or Surface Water, are prorated and charged to the projects or accounts within the particular section. A number of surveillance or buffer accounts are employed to account for the hours worked on special events or on those parts of large projects or accounts that require a more detailed analysis. Any part of the New Mexico District operation can be separated and analyzed in detail by establishing an appropriate buffer account. With the exception of statements associated with word size, the computer program is written in FORTRAN IV in a relatively low and standard language level to facilitate its use on different digital computers. The program has been run only on a Control Data Corporation 6600 computer system. Central processing computer time has seldom exceeded 5 minutes on the longest year-to-date runs.
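    The overhead-apportionment rule described above, overhead distributed to each project in proportion to its share of total direct hours, is simple enough to sketch in a few lines (Python here rather than the original FORTRAN IV). The project names and hour totals are invented for illustration.

      # Sketch of the overhead-apportionment rule described above: overhead hours
      # are distributed to projects in proportion to each project's share of the
      # total direct hours. Project names and hours are illustrative only.

      direct_hours = {
          "Surface Water Project A": 320.0,
          "General Investigations B": 180.0,
          "Ground Water Project C": 500.0,
      }
      overhead_hours = 150.0

      total_direct = sum(direct_hours.values())
      charged = {
          project: hours + overhead_hours * (hours / total_direct)
          for project, hours in direct_hours.items()
      }

      for project, hours in charged.items():
          print(f"{project}: {hours:.1f} h (incl. prorated overhead)")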

  4. Hypersonic Experimental and Computational Capability, Improvement and Validation. Volume 2

    NASA Technical Reports Server (NTRS)

    Muylaert, Jean (Editor); Kumar, Ajay (Editor); Dujarric, Christian (Editor)

    1998-01-01

    The results of the phase 2 effort conducted under AGARD Working Group 18 on Hypersonic Experimental and Computational Capability, Improvement and Validation are presented in this report. The first volume, published in May 1996, mainly focused on the design methodology, plans and some initial results of experiments that had been conducted to serve as validation benchmarks. The current volume presents the detailed experimental and computational data base developed during this effort.

  5. [Earth Science Technology Office's Computational Technologies Project

    NASA Technical Reports Server (NTRS)

    Fischer, James (Technical Monitor); Merkey, Phillip

    2005-01-01

    This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies Project, to engage the Beowulf Cluster Computing Community as well as the High Performance Computing Research Community so that we can predict the applicability of said technologies to the scientific community represented by the CT project and formulate long term strategies to provide the computational resources necessary to attain the anticipated scientific objectives of the CT project. Specifically, the goal of the evaluation effort is to use the information gathered over the course of the Round-3 investigations to quantify the trends in scientific expectations, the algorithmic requirements and capabilities of high-performance computers to satisfy this anticipated need.

  6. A Review of High-Performance Computational Strategies for Modeling and Imaging of Electromagnetic Induction Data

    NASA Astrophysics Data System (ADS)

    Newman, Gregory A.

    2014-01-01

    Many geoscientific applications exploit electrostatic and electromagnetic fields to interrogate and map subsurface electrical resistivity—an important geophysical attribute for characterizing mineral, energy, and water resources. In complex three-dimensional geologies, where many of these resources remain to be found, resistivity mapping requires large-scale modeling and imaging capabilities, as well as the ability to treat significant data volumes, which can easily overwhelm single-core and modest multicore computing hardware. To treat such problems requires large-scale parallel computational resources, necessary for reducing the time to solution to a time frame acceptable to the exploration process. The recognition that significant parallel computing processes must be brought to bear on these problems gives rise to choices that must be made in parallel computing hardware and software. In this review, some of these choices are presented, along with the resulting trade-offs. We also discuss future trends in high-performance computing and the anticipated impact on electromagnetic (EM) geophysics. Topics discussed in this review article include a survey of parallel computing platforms, from graphics processing units to multicore CPUs with a fast interconnect, along with parallel solvers and associated solver libraries effective for inductive EM modeling and imaging.

  7. Meshfree and efficient modeling of swimming cells

    NASA Astrophysics Data System (ADS)

    Gallagher, Meurig T.; Smith, David J.

    2018-05-01

    Locomotion in Stokes flow is an intensively studied problem because it describes important biological phenomena such as the motility of many species' sperm, bacteria, algae, and protozoa. Numerical computations can be challenging, particularly in three dimensions, due to the presence of moving boundaries and complex geometries; methods which combine ease of implementation and computational efficiency are therefore needed. A recently proposed method to discretize the regularized Stokeslet boundary integral equation without the need for a connected mesh is applied to the inertialess locomotion problem in Stokes flow. The mathematical formulation and key aspects of the computational implementation in matlab® or GNU Octave are described, followed by numerical experiments with biflagellate algae and multiple uniflagellate sperm swimming between no-slip surfaces, for which both swimming trajectories and flow fields are calculated. These computational experiments required minutes of time on modest hardware; an extensible implementation is provided in a GitHub repository. The nearest-neighbor discretization dramatically improves convergence and robustness, a key challenge in extending the regularized Stokeslet method to complicated three-dimensional biological fluid problems.
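    The paper's implementation is in MATLAB/GNU Octave; as a language-neutral illustration, the Python/NumPy sketch below evaluates the velocity induced at a set of points by regularized point forces using a standard Cortez-type blob kernel. It shows only the kernel evaluation at the heart of such methods, not the paper's nearest-neighbour discretization or swimmer models, and the viscosity and regularization parameter are placeholders.

      import numpy as np

      # Minimal sketch of evaluating the flow due to regularized Stokeslets
      # (Cortez-type blob kernel). Not the paper's discretised solver; mu and
      # epsilon values are illustrative.

      def regularized_stokeslet_velocity(x_eval, x_src, forces, epsilon=0.01, mu=1.0):
          """Velocity at points x_eval (M,3) from point forces at x_src (N,3)."""
          u = np.zeros_like(x_eval)
          for x0, f in zip(x_src, forces):
              r = x_eval - x0                       # (M,3) displacement vectors
              r2 = np.sum(r * r, axis=1)            # squared distances
              denom = (r2 + epsilon**2) ** 1.5
              rdotf = r @ f                         # (M,) dot products with the force
              u += (f * (r2 + 2 * epsilon**2)[:, None] + r * rdotf[:, None]) / denom[:, None]
          return u / (8.0 * np.pi * mu)

      # Example: one Stokeslet at the origin pushing in +x, sampled on the x-axis
      x_eval = np.array([[0.5, 0.0, 0.0], [1.0, 0.0, 0.0]])
      x_src = np.array([[0.0, 0.0, 0.0]])
      forces = np.array([[1.0, 0.0, 0.0]])
      print(regularized_stokeslet_velocity(x_eval, x_src, forces))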

  8. Seeing the forest for the trees: Networked workstations as a parallel processing computer

    NASA Technical Reports Server (NTRS)

    Breen, J. O.; Meleedy, D. M.

    1992-01-01

    Unlike traditional 'serial' processing computers in which one central processing unit performs one instruction at a time, parallel processing computers contain several processing units, thereby performing several instructions at once. Many of today's fastest supercomputers achieve their speed by employing thousands of processing elements working in parallel. Few institutions can afford these state-of-the-art parallel processors, but many already have the makings of a modest parallel processing system. Workstations on existing high-speed networks can be harnessed as nodes in a parallel processing environment, bringing the benefits of parallel processing to many. While such a system cannot rival the industry's latest machines, many common tasks can be accelerated greatly by spreading the processing burden and exploiting idle network resources. We study several aspects of this approach, from algorithms to select nodes to speed gains in specific tasks. With ever-increasing volumes of astronomical data, it becomes all the more necessary to utilize our computing resources fully.

  9. Effects of Geometric Details on Slat Noise Generation and Propagation

    NASA Technical Reports Server (NTRS)

    Khorrami, Mehdi R.; Lockard, David P.

    2009-01-01

    The relevance of geometric details to the generation and propagation of noise from leading-edge slats is considered. Typically, such details are omitted in computational simulations and model-scale experiments, thereby creating ambiguities in comparisons with acoustic results from flight tests. The current study uses two-dimensional computational simulations in conjunction with a Ffowcs Williams-Hawkings (FW-H) solver to investigate the effects of previously neglected slat "bulb" and "blade" seals on the local flow field and the associated acoustic radiation. The computations show that the presence of the "blade" seal at the cusp in the simulated geometry significantly changes the slat cove flow dynamics, reduces the amplitudes of the radiated sound, and to a lesser extent, alters the directivity beneath the airfoil. Furthermore, the computations suggest that a modest extension of the baseline "blade" seal further enhances the suppression of slat noise. As a side issue, the utility and equivalence of the FW-H methodology for calculating far-field noise, as opposed to a more direct approach, are examined and demonstrated.

  10. Development of an autonomous video rendezvous and docking system, phase 2

    NASA Technical Reports Server (NTRS)

    Tietz, J. C.; Richardson, T. E.

    1983-01-01

    The critical elements of an autonomous video rendezvous and docking system were built and used successfully in a physical laboratory simulation. The laboratory system demonstrated that a small, inexpensive electronic package and a flight computer of modest size can analyze television images to derive guidance information for spacecraft. In the ultimate application, the system would use a docking aid consisting of three flashing lights mounted on a passive target spacecraft. Television imagery of the docking aid would be processed aboard an active chase vehicle to derive relative positions and attitudes of the two spacecraft. The demonstration system used scale models of the target spacecraft with working docking aids. A television camera mounted on a 6 degree of freedom (DOF) simulator provided imagery of the target to simulate observations from the chase vehicle. A hardware video processor extracted statistics from the imagery, from which a computer quickly computed position and attitude. Computer software known as a Kalman filter derived velocity information from position measurements.

  11. Thermodynamics of quasideterministic digital computers

    NASA Astrophysics Data System (ADS)

    Chu, Dominique

    2018-02-01

    A central result of stochastic thermodynamics is that irreversible state transitions of Markovian systems entail a cost in terms of an infinite entropy production. A corollary of this is that strictly deterministic computation is not possible. Using a thermodynamically consistent model, we show that quasideterministic computation can be achieved at finite, and indeed modest, cost with accuracies that are indistinguishable from deterministic behavior for all practical purposes. Concretely, we consider the entropy production of stochastic (Markovian) systems that behave like AND and NOT gates. Combinations of these gates can implement any logical function. We require that these gates return the correct result with a probability that is very close to 1, and additionally, that they do so within finite time. The central component of the model is a machine that can read and write binary tapes. We find that the error probability of the computation of these gates falls as a power of the system size, whereas the cost only increases linearly with the system size.

  12. On the Rapid Computation of Various Polylogarithmic Constants

    NASA Technical Reports Server (NTRS)

    Bailey, David H.; Borwein, Peter; Plouffe, Simon

    1996-01-01

    We give algorithms for the computation of the d-th digit of certain transcendental numbers in various bases. These algorithms can be easily implemented (multiple precision arithmetic is not needed), require virtually no memory, and feature run times that scale nearly linearly with the order of the digit desired. They make it feasible to compute, for example, the billionth binary digit of log(2) or pi on a modest workstation in a few hours run time. We demonstrate this technique by computing the ten billionth hexadecimal digit of pi, the billionth hexadecimal digits of pi-squared, log(2) and log-squared(2), and the ten billionth decimal digit of log(9/10). These calculations rest on the observation that very special types of identities exist for certain numbers like pi, pi-squared, log(2) and log-squared(2). These are essentially polylogarithmic ladders in an integer base. A number of these identities that we derive in this work appear to be new, for example a critical identity for pi.
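    As a concrete illustration of the digit-extraction idea, the sketch below computes hexadecimal digits of pi from the Bailey-Borwein-Plouffe identity pi = sum_k 16^(-k) [4/(8k+1) - 2/(8k+4) - 1/(8k+5) - 1/(8k+6)], using modular exponentiation so that memory stays negligible. It is a minimal rendering of the published technique, not the authors' code; the tail tolerance and output length are our choices.

      def bbp_series(d, j):
          """Fractional part of sum_{k>=0} 16**(d-k) / (8*k + j)."""
          s = 0.0
          for k in range(d + 1):                     # exponent d-k >= 0: use modular exponentiation
              s = (s + pow(16, d - k, 8 * k + j) / (8 * k + j)) % 1.0
          k = d + 1                                  # rapidly convergent tail (negative exponents)
          term = 16.0 ** (d - k) / (8 * k + j)
          while term > 1e-17:
              s = (s + term) % 1.0
              k += 1
              term = 16.0 ** (d - k) / (8 * k + j)
          return s

      def pi_hex_digits(d, n=8):
          """Return n hexadecimal digits of pi starting at position d+1 after the point."""
          x = (4 * bbp_series(d, 1) - 2 * bbp_series(d, 4)
               - bbp_series(d, 5) - bbp_series(d, 6)) % 1.0
          digits = ""
          for _ in range(n):
              x *= 16
              digit = int(x)
              digits += "0123456789ABCDEF"[digit]
              x -= digit
          return digits

      print(pi_hex_digits(0))        # 243F6A88  (pi = 3.243F6A88... in hexadecimal)
      print(pi_hex_digits(1000000))  # hex digits starting at the millionth position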

  13. Analysis of Crack Arrest Toughness.

    DTIC Science & Technology

    1988-01-15

    KIa, and that the microstructural features that affect "eligibility" may have a modest effect on KIa. From 1953 to 1955 he served in the Titanium Section of... ductile fracture criterion; computations which assumed that the δ-Δa history was the same for rapid fracture as it was for stable crack growth agree... around the crack tip [25]. The δ-Δa history, used as the fracture criterion for the first 4 mm of growth in the dynamic analysis, was obtained from the

  14. Toward using games to teach fundamental computer science concepts

    NASA Astrophysics Data System (ADS)

    Edgington, Jeffrey Michael

    Video and computer games have become an important area of study in the field of education. Games have been designed to teach mathematics, physics, raise social awareness, teach history and geography, and train soldiers in the military. Recent work has created computer games for teaching computer programming and understanding basic algorithms. We present an investigation where computer games are used to teach two fundamental computer science concepts: boolean expressions and recursion. The games are intended to teach the concepts and not how to implement them in a programming language. For this investigation, two computer games were created. One is designed to teach basic boolean expressions and operators and the other to teach fundamental concepts of recursion. We describe the design and implementation of both games. We evaluate the effectiveness of these games using before and after surveys. The surveys were designed to ascertain basic understanding, attitudes and beliefs regarding the concepts. The boolean game was evaluated with local high school students and students in a college level introductory computer science course. The recursion game was evaluated with students in a college level introductory computer science course. We present the analysis of the collected survey information for both games. This analysis shows a significant positive change in student attitude towards recursion and modest gains in student learning outcomes for both topics.

  15. Computational Methods for Stability and Control (COMSAC): The Time Has Come

    NASA Technical Reports Server (NTRS)

    Hall, Robert M.; Biedron, Robert T.; Ball, Douglas N.; Bogue, David R.; Chung, James; Green, Bradford E.; Grismer, Matthew J.; Brooks, Gregory P.; Chambers, Joseph R.

    2005-01-01

    Powerful computational fluid dynamics (CFD) tools have emerged that appear to offer significant benefits as an adjunct to the experimental methods used by the stability and control community to predict aerodynamic parameters. The decreasing cost and increasing availability of computing hours are making these applications increasingly viable as time goes on. This paper summarizes the efforts of four organizations to utilize high-end CFD tools to address the challenges of the stability and control arena. General motivation and the backdrop for these efforts will be summarized, along with examples of current applications.

  16. Simulated quantum computation of molecular energies.

    PubMed

    Aspuru-Guzik, Alán; Dutoi, Anthony D; Love, Peter J; Head-Gordon, Martin

    2005-09-09

    The calculation time for the energy of atoms and molecules scales exponentially with system size on a classical computer but polynomially using quantum algorithms. We demonstrate that such algorithms can be applied to problems of chemical interest using modest numbers of quantum bits. Calculations of the water and lithium hydride molecular ground-state energies have been carried out on a quantum computer simulator using a recursive phase-estimation algorithm. The recursive algorithm reduces the number of quantum bits required for the readout register from about 20 to 4. Mappings of the molecular wave function to the quantum bits are described. An adiabatic method for the preparation of a good approximate ground-state wave function is described and demonstrated for a stretched hydrogen molecule. The number of quantum bits required scales linearly with the number of basis functions, and the number of gates required grows polynomially with the number of quantum bits.
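    To convey the recursive readout at a purely classical level, the toy below refines an eigenphase a few bits per round and magnifies the residual between rounds, mimicking only the bookkeeping of a small readout register. It is not a quantum simulation and does not reproduce the paper's algorithm or Hamiltonians; the example phase and register size are arbitrary.

      # Purely classical toy of a recursive phase-estimation readout: each round
      # reads only a few bits of the remaining phase (as if from a small readout
      # register), then the residual is magnified for the next round. The
      # "measurement" here is an idealized rounding, not a quantum circuit.

      def recursive_phase_estimate(true_phase, bits_per_round=4, rounds=5):
          estimate = 0.0
          scale = 1.0
          remaining = true_phase
          for _ in range(rounds):
              # idealized k-bit readout: returns the nearest k-bit fraction
              measured = round(remaining * 2**bits_per_round) / 2**bits_per_round
              estimate += scale * measured
              # magnify the residual so the next round resolves finer bits
              remaining = (remaining - measured) * 2**bits_per_round
              scale /= 2**bits_per_round
          return estimate

      true_phase = 0.3724108  # arbitrary stand-in for a ground-state eigenphase
      print(recursive_phase_estimate(true_phase))   # converges toward 0.3724108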

  17. A compendium of computational fluid dynamics at the Langley Research Center

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Through numerous summary examples, the scope and general nature of the computational fluid dynamics (CFD) effort at Langley are identified. These summaries will help inform researchers in CFD and line management at Langley of the overall effort. In addition to the in-house efforts, out-of-house CFD work supported by Langley through industrial contracts and university grants is included. Researchers were encouraged to include summaries of work in preliminary and tentative states of development as well as current research approaching definitive results.

  18. Δ9-Tetrahydrocannabinol decreases willingness to exert cognitive effort in male rats.

    PubMed

    Silveira, Mason M; Adams, Wendy K; Morena, Maria; Hill, Matthew N; Winstanley, Catharine A

    2017-03-01

    Acceptance of cannabis use is growing. However, prolonged use is associated with diminished psychosocial outcomes, potentially mediated by drug-induced cognitive impairments. Δ9-Tetrahydrocannabinol (THC) is the main psychoactive ingredient in cannabis, yet other phytocannabinoids in the plant, such as cannabidiol (CBD), have unique properties. Given that CBD can modulate the undesirable effects of THC, therapeutic agents, such as nabiximols, contain higher CBD:THC ratios than illicit marijuana. We tested the hypothesis that THC impairs a relevant cognitive function for long-term success, namely willingness to exert cognitive effort for greater rewards, and that CBD could attenuate such decision-making impairments. Male Long-Evans rats (n = 29) performing the rat cognitive effort task (rCET) received acute THC and CBD, independently and concurrently, in addition to other cannabinoids. Rats chose between 2 options differing in reward magnitude, but also in the cognitive effort (attentional load) required to obtain them. We found that THC decreased choice of hard trials without impairing the animals' ability to accurately complete them. Strikingly, this impairment was correlated with CB1 receptor density in the medial prefrontal cortex - an area previously implicated in effortful decision-making. In contrast, CBD did not affect choice. Coadministration of 1:1 CBD:THC matching that in nabiximols modestly attenuated the deleterious effects of THC in "slacker" rats. Only male rats were investigated, and the THC/CBD coadministration experiment was carried out in a subset of individuals. These findings confirm that THC, but not CBD, selectively impairs decision-making involving cognitive effort costs. However, coadministration of CBD only partially ameliorates such THC-induced dysfunction.

  19. Affordable Manufacturing Technologies Being Developed for Actively Cooled Ceramic Components

    NASA Technical Reports Server (NTRS)

    Bhatt, Ramakrishna T.

    1999-01-01

    Efforts to improve the performance of modern gas turbine engines have imposed increasing service temperature demands on structural materials. Through active cooling, the useful temperature range of nickel-base superalloys in current gas turbine engines has been extended, but the margin for further improvement appears modest. Because of their low density, high-temperature strength, and high thermal conductivity, in situ toughened silicon nitride ceramics have received a great deal of attention for cooled structures. However, high processing costs have proven to be a major obstacle to their widespread application. Advanced rapid prototyping technology, which is developing rapidly, offers the possibility of an affordable manufacturing approach.

  20. An Empirical Test of the Theory of Planned Behaviour Applied to Contraceptive Use in Rural Uganda

    PubMed Central

    Kiene, Susan M.; Hopwood, Sarah; Lule, Haruna; Wanyenze, Rhoda K.

    2013-01-01

    There is a high unmet need for contraceptives in developing countries such as Uganda, with high population growth, where efforts are needed to promote family planning and contraceptive use. Despite this high need, little research has investigated applications of health behaviour change theories to contraceptive use amongst this population. The present study tested the Theory of Planned Behaviour’s ability to predict contraceptive use-related behaviours among postpartum women in rural Uganda. Results gave modest support to the theory’s application and suggest an urgent need for improved theory-based interventions to promote contraceptive use in the populations of developing countries. PMID:23928989

  1. THESIS: the terrestrial habitable-zone exoplanet spectroscopy infrared spacecraft

    NASA Astrophysics Data System (ADS)

    Swain, Mark R.; Vasisht, Gautam; Henning, Thomas; Tinetti, Giovanna; Beaulieu, Jean-Phillippe

    2010-07-01

    THESIS, the Transiting Habitable-zone Exoplanet Spectroscopy Infrared Spacecraft, is a concept for a medium/Probe class exoplanet mission. Building on the recent Spitzer successes in exoplanet characterization, THESIS would extend these types of measurements to super-Earth-like planets. A strength of the THESIS concept is simplicity, low technical risk, and modest cost. The mission concept has the potential to dramatically advance our understanding of conditions on extrasolar worlds and could serve as a stepping stone to more ambitious future missions. We envision this mission as a joint US-European effort with science objectives that resonate with both the traditional astronomy and planetary science communities.

  2. A climate trend analysis of Mali

    USGS Publications Warehouse

    Funk, Christopher C.; Rowland, Jim; Adoum, Alkhalil; Eilerts, Gary; White, Libby

    2012-01-01

    This brief report, drawing from a multi-year effort by the U.S. Agency for International Development (USAID) Famine Early Warning Systems Network (FEWS NET), identifies modest declines in rainfall, accompanied by increases in air temperatures. These analyses are based on quality-controlled station observations. Conclusions: * Summer rains have remained relatively steady for the past 20 years, but are 12 percent below the 1920-1969 average. * Temperatures have increased by 0.8° Celsius since 1975, amplifying the effect of droughts. * Cereal yields are low but have been improving. * Current population and agricultural trends indicate that increased yields have offset population expansion, keeping per capita cereal production steady.

  3. Impact of a Brief Training on Medical Resident Screening for Alcohol Misuse and Illicit Drug Use

    PubMed Central

    Gunderson, Erik W.; Levin, Frances R.; Owen, Patricia

    2011-01-01

    Educational initiatives are needed to improve primary care substance use screening. This study assesses the impact on 24 medical residents of a 2.5-day curriculum combining experiential and manual-based training on screening for alcohol misuse and illicit drug use. A retrospective chart review of new primary care outpatients demonstrated that nearly all were asked about current alcohol use before and after curriculum participation. Adherence to national screening guidelines on quantification of alcohol consumption modestly improved (p < .05), as did inquiry about current illicit drug use (p < .05). Continued efforts are needed to enhance educational initiatives for primary care physicians. PMID:18393059

  4. Filling the Graduate Student Pipeline

    NASA Astrophysics Data System (ADS)

    Winey, Karen I.

    2003-03-01

    As a professor who relies on graduate students to participate in my research program, I work to ensure that the pipeline of graduate students is full. This presentation will discuss a variety of strategies that I have used to advertise the opportunities of graduate school, many of which use existing infrastructure at the University of Pennsylvania. These strategies involve a combination of public speaking, discussion groups, and faculty advising. During these exchanges it's important to both contrast the career opportunities for B.S., M.S. and Ph.D. degree holders and outline the financial facts about graduate school. These modest efforts have increased the number of Penn undergraduates pursuing doctorate degrees.

  5. Cognitive styles in creative leadership practices: exploring the relationship between level and style.

    PubMed

    Isaksen, Scott G; Babij, Barbara J; Lauer, Kenneth J

    2003-12-01

    This study investigated the relationship between two measures used to assist change and transformation efforts, the Kirton Adaption-Innovation Inventory which assesses style or manner of cognition and problem-solving, not level or capability, and the Leadership Practices Inventory which measures the extent to which leaders exhibit certain leadership behaviors associated with accomplishing extraordinary results. These two measures of level and style should be conceptually distinct and show no or only modest correlation. Analysis yielded statistically significant and meaningful relationships between scores on the Kirton inventory and two scales of the Leadership Practices Inventory. Implications and challenges for research and practice were outlined.

  6. Impact of a brief training on medical resident screening for alcohol misuse and illicit drug use.

    PubMed

    Gunderson, Erik W; Levin, Frances R; Owen, Patricia

    2008-01-01

    Educational initiatives are needed to improve primary care substance use screening. This study assesses the impact on 24 medical residents of a 2.5-day curriculum combining experiential and manual-based training on screening for alcohol misuse and illicit drug use. A retrospective chart review of new primary care outpatients demonstrated that nearly all were asked about current alcohol use before and after curriculum participation. Adherence to national screening guidelines on quantification of alcohol consumption modestly improved (p < .05), as did inquiry about current illicit drug use (p < .05). Continued efforts are needed to enhance educational initiatives for primary care physicians.

  7. Recommendations for evaluation of computational methods

    NASA Astrophysics Data System (ADS)

    Jain, Ajay N.; Nicholls, Anthony

    2008-03-01

    The field of computational chemistry, particularly as applied to drug design, has become increasingly important in terms of the practical application of predictive modeling to pharmaceutical research and development. Tools for exploiting protein structures or sets of ligands known to bind particular targets can be used for binding-mode prediction, virtual screening, and prediction of activity. A serious weakness within the field is a lack of standards with respect to quantitative evaluation of methods, data set preparation, and data set sharing. Our goal should be to report new methods or comparative evaluations of methods in a manner that supports decision making for practical applications. Here we propose a modest beginning, with recommendations for requirements on statistical reporting, requirements for data sharing, and best practices for benchmark preparation and usage.

  8. Periodic control of the individual-blade-control helicopter rotor

    NASA Technical Reports Server (NTRS)

    Mckillip, R. M., Jr.

    1985-01-01

    This paper describes the results of an investigation into methods of controller design for linear periodic systems utilizing an extension of modern control methods. Trends present in the selection of various cost functions are outlined, and closed-loop controller results are demonstrated for two cases: first, on an analog computer simulation of the rigid out of plane flapping dynamics of a single rotor blade, and second, on a 4 ft diameter single-bladed model helicopter rotor in the MIT 5 x 7 subsonic wind tunnel, both for various high levels of advance ratio. It is shown that modal control using the IBC concept is possible over a large range of advance ratios with only a modest amount of computational power required.

  9. Motivational Beliefs, Student Effort, and Feedback Behaviour in Computer-Based Formative Assessment

    ERIC Educational Resources Information Center

    Timmers, Caroline F.; Braber-van den Broek, Jannie; van den Berg, Stephanie M.

    2013-01-01

    Feedback can only be effective when students seek feedback and process it. This study examines the relations between students' motivational beliefs, effort invested in a computer-based formative assessment, and feedback behaviour. Feedback behaviour is represented by whether a student seeks feedback and the time a student spends studying the…

  10. Establishing a K-12 Circuit Design Program

    ERIC Educational Resources Information Center

    Inceoglu, Mustafa M.

    2010-01-01

    Outreach, as defined by Wikipedia, is an effort by an organization or group to connect its ideas or practices to the efforts of other organizations, groups, specific audiences, or the general public. This paper describes a computer engineering outreach project of the Department of Computer Engineering at Ege University, Izmir, Turkey, to a local…

  11. Identifying Predictors of Achievement in the Newly Defined Information Literacy: A Neural Network Analysis

    ERIC Educational Resources Information Center

    Sexton, Randall; Hignite, Michael; Margavio, Thomas M.; Margavio, Geanie W.

    2009-01-01

    Information Literacy is a concept that evolved as a result of efforts to move technology-based instructional and research efforts beyond the concepts previously associated with "computer literacy." While computer literacy was largely a topic devoted to knowledge of hardware and software, information literacy is concerned with students' abilities…

  12. Eradication of Yaws: Historical Efforts and Achieving WHO's 2020 Target

    PubMed Central

    Asiedu, Kingsley; Fitzpatrick, Christopher; Jannin, Jean

    2014-01-01

    Background Yaws, one of the 17 neglected tropical diseases (NTDs), is targeted for eradication by 2020 in resolution WHA66.12 of the World Health Assembly (2013) and the WHO roadmap on NTDs (2012). The disease frequently affects children who live in poor socioeconomic conditions. Between 1952 and 1964, WHO and the United Nations Children's Fund (UNICEF) led a global eradication campaign using injectable benzathine penicillin. Recent developments using a single dose of oral azithromycin have renewed optimism that eradication can be achieved through a comprehensive large-scale treatment strategy. We review historical efforts to eradicate yaws and argue that this goal is now technically feasible using new tools and with the favorable environment for control of NTDs. We also summarize the work of WHO's Department of Control of Neglected Tropical Diseases in leading the renewed eradication initiative and call on the international community to support efforts to achieve the 2020 eradication goal. The critical factor remains access to azithromycin. Excluding medicines, the financial cost of yaws eradication could be as little as US$ 100 million. Conclusions The development of new tools has renewed interest in eradication of yaws; with modest support, the WHO eradication target of 2020 can be achieved. PMID:25254372

  13. A streamlined failure mode and effects analysis.

    PubMed

    Ford, Eric C; Smith, Koren; Terezakis, Stephanie; Croog, Victoria; Gollamudi, Smitha; Gage, Irene; Keck, Jordie; DeWeese, Theodore; Sibley, Greg

    2014-06-01

    Explore the feasibility and impact of a streamlined failure mode and effects analysis (FMEA) using a structured process that is designed to minimize staff effort. FMEA for the external beam process was conducted at an affiliate radiation oncology center that treats approximately 60 patients per day. A structured FMEA process was developed which included clearly defined roles and goals for each phase. A core group of seven people was identified and a facilitator was chosen to lead the effort. Failure modes were identified and scored according to the FMEA formalism. A risk priority number, RPN, was calculated and used to rank failure modes. Failure modes with RPN > 150 received safety improvement interventions. Staff effort was carefully tracked throughout the project. Fifty-two failure modes were identified, 22 collected during meetings, and 30 from take-home worksheets. The four top-ranked failure modes were: delay in film check, missing pacemaker protocol/consent, critical structures not contoured, and pregnant patient simulated without the team's knowledge of the pregnancy. These four failure modes had RPN > 150 and received safety interventions. The FMEA was completed in one month in four 1-h meetings. A total of 55 staff hours were required and, additionally, 20 h by the facilitator. Streamlined FMEA provides a means of accomplishing a relatively large-scale analysis with modest effort. One potential value of FMEA is that it provides a means of measuring the impact of quality improvement efforts through a reduction in risk scores. Future study of this possibility is needed.
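    The scoring arithmetic behind the ranking can be illustrated in a few lines: the risk priority number is the product of severity, occurrence, and detectability scores (the usual FMEA formalism), and failure modes above the action threshold are flagged. The failure modes and 1-10 scores below are hypothetical, not the study's worksheet data; only the 150 threshold mirrors the value quoted above.

      # Generic sketch of FMEA scoring: RPN = severity * occurrence * detectability,
      # with each score on a 1-10 scale. Failure modes and scores are hypothetical.
      ACTION_THRESHOLD = 150

      failure_modes = [
          # (description, severity, occurrence, detectability)
          ("Plan approved without physics check", 8, 4, 6),
          ("Wrong patient orientation at simulation", 9, 2, 5),
          ("Outdated contour set used for planning", 7, 3, 5),
          ("Accessory omitted at first treatment", 5, 4, 4),
      ]

      ranked = sorted(((s * o * d, desc) for desc, s, o, d in failure_modes), reverse=True)
      for rpn, desc in ranked:
          action = "intervene" if rpn > ACTION_THRESHOLD else "monitor"
          print(f"RPN {rpn:4d}  {action:9s}  {desc}")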

  14. Optical Observations of GEO Debris with Two Telescopes

    NASA Technical Reports Server (NTRS)

    Seitzer, P.; Abercromby, K.; Rodriguez, H.; Barker, E.

    2007-01-01

    For several years, the Michigan Orbital DEbris Survey Telescope (MODEST), the University of Michigan's 0.6/0.9-m Schmidt telescope on Cerro Tololo Inter-American Observatory in Chile has been used to survey the debris population at GEO in the visible regime. Magnitudes, positions, and angular rates are determined for GEO objects as they move across the telescope's field-of-view (FOV) during a 5-minute window. This short window of time is not long enough to determine a full six-parameter orbit so usually a circular orbit is assumed. A longer arc of time is necessary to determine eccentricity and to look for changes in the orbit with time. MODEST can follow objects in real-time, but only at the price of stopping survey operations. A second telescope would allow for longer arcs of orbit to obtain the full six orbital parameters, as well as assess the changes over time. An additional benefit of having a second telescope is the capability of obtaining BVRI colors of the faint targets, aiding efforts to determine the material type of faint debris. For 14 nights in March 2007, two telescopes were used simultaneously to observe the GEO debris field. MODEST was used exclusively in survey mode. As objects were detected, they were handed off in near real-time to the Cerro Tololo 0.9-m telescope for follow-up observations. The goal was to determine orbits and colors for all objects fainter than R = 15th magnitude (corresponds to 1 meter in size assuming a 0.2 albedo) detected by MODEST. The hand-off process was completely functional during the final eight nights and follow-ups for objects from night-to-night were possible. The cutoff magnitude level of 15th was selected on the basis of an abrupt change in the observed angular rate distribution in the MODEST surveys. Objects brighter than 15th magnitude tend to lie on a well defined locus in the angular rate plane (and have orbits in the catalog), while fainter objects fill the plane almost uniformly. We need to determine full six-parameter orbits to investigate what causes this change in observed angular rates. Are these faint objects either the same population of high area-to-mass (A/M) objects on eccentric orbits as discovered by the ESA Space Debris Telescope (Schildknecht, et al. 2004), or are they just normal debris from breakups in GEO?

  15. Impact of remote sensing upon the planning, management, and development of water resources

    NASA Technical Reports Server (NTRS)

    Loats, H. L.; Fowler, T. R.; Frech, S. L.

    1974-01-01

    A survey of the principal water resource users was conducted to determine the impact of new remote data streams on hydrologic computer models. The analysis of the responses and direct contact demonstrated that: (1) the majority of water resource effort of the type suitable to remote sensing inputs is conducted by major federal water resources agencies or through federally stimulated research, (2) the federal government develops most of the hydrologic models used in this effort; and (3) federal computer power is extensive. The computers, computer power, and hydrologic models in current use were determined.

  16. An opportunity cost model of subjective effort and task performance

    PubMed Central

    Kurzban, Robert; Duckworth, Angela; Kable, Joseph W.; Myers, Justus

    2013-01-01

    Why does performing certain tasks cause the aversive experience of mental effort and concomitant deterioration in task performance? One explanation posits a physical resource that is depleted over time. We propose an alternate explanation that centers on mental representations of the costs and benefits associated with task performance. Specifically, certain computational mechanisms, especially those associated with executive function, can be deployed for only a limited number of simultaneous tasks at any given moment. Consequently, the deployment of these computational mechanisms carries an opportunity cost – that is, the next-best use to which these systems might be put. We argue that the phenomenology of effort can be understood as the felt output of these cost/benefit computations. In turn, the subjective experience of effort motivates reduced deployment of these computational mechanisms in the service of the present task. These opportunity cost representations, then, together with other cost/benefit calculations, determine effort expended and, everything else equal, result in performance reductions. In making our case for this position, we review alternate explanations both for the phenomenology of effort associated with these tasks and for performance reductions over time. Likewise, we review the broad range of relevant empirical results from across subdisciplines, especially psychology and neuroscience. We hope that our proposal will help to build links among the diverse fields that have been addressing similar questions from different perspectives, and we emphasize ways in which alternate models might be empirically distinguished. PMID:24304775

  17. An efficient method for hybrid density functional calculation with spin-orbit coupling

    NASA Astrophysics Data System (ADS)

    Wang, Maoyuan; Liu, Gui-Bin; Guo, Hong; Yao, Yugui

    2018-03-01

    In first-principles calculations, a hybrid functional is often used to improve accuracy over local exchange-correlation functionals. A drawback is that evaluating the hybrid functional needs significantly more computing effort. When spin-orbit coupling (SOC) is taken into account, the non-collinear spin structure increases computing effort by at least eight times. As a result, hybrid functional calculations with SOC are intractable in most cases. In this paper, we present an approximate solution to this problem by developing an efficient method based on a mixed linear combination of atomic orbitals (LCAO) scheme. We demonstrate the power of this method using several examples and we show that the results compare very well with those of direct hybrid functional calculations with SOC, yet the method only requires a computing effort similar to that without SOC. The presented technique provides a good balance between computing efficiency and accuracy, and it can be extended to magnetic materials.

  18. Computational Models of Anterior Cingulate Cortex: At the Crossroads between Prediction and Effort.

    PubMed

    Vassena, Eliana; Holroyd, Clay B; Alexander, William H

    2017-01-01

    In the last two decades the anterior cingulate cortex (ACC) has become one of the most investigated areas of the brain. Extensive neuroimaging evidence suggests countless functions for this region, ranging from conflict and error coding, to social cognition, pain and effortful control. In response to this burgeoning amount of data, a proliferation of computational models has tried to characterize the neurocognitive architecture of ACC. Early seminal models provided a computational explanation for a relatively circumscribed set of empirical findings, mainly accounting for EEG and fMRI evidence. More recent models have focused on ACC's contribution to effortful control. In parallel to these developments, several proposals attempted to explain within a single computational framework a wider variety of empirical findings that span different cognitive processes and experimental modalities. Here we critically evaluate these modeling attempts, highlighting the continued need to reconcile the array of disparate ACC observations within a coherent, unifying framework.

  19. An Investigation of the Flow Physics of Acoustic Liners by Direct Numerical Simulation

    NASA Technical Reports Server (NTRS)

    Watson, Willie R. (Technical Monitor); Tam, Christopher

    2004-01-01

    This report concentrates on the effort and status of work done on three-dimensional (3-D) simulation of a multi-hole resonator in an impedance tube. This work is coordinated with a parallel experimental effort to be carried out at the NASA Langley Research Center. The outline of this report is as follows: 1. Preliminary consideration. 2. Computation model. 3. Mesh design and parallel computing. 4. Visualization. 5. Status of computer code development.

  20. Genome sequencing of bacteria: sequencing, de novo assembly and rapid analysis using open source tools.

    PubMed

    Kisand, Veljo; Lettieri, Teresa

    2013-04-01

    De novo genome sequencing of previously uncharacterized microorganisms has the potential to open up new frontiers in microbial genomics by providing insight into both functional capabilities and biodiversity. Until recently, Roche 454 pyrosequencing was the NGS method of choice for de novo assembly because it generates hundreds of thousands of long reads (<450 bps), which are presumed to aid in the analysis of uncharacterized genomes. The array of tools for processing NGS data are increasingly free and open source and are often adopted for both their high quality and role in promoting academic freedom. The error rate of pyrosequencing the Alcanivorax borkumensis genome was such that thousands of insertions and deletions were artificially introduced into the finished genome. Despite a high coverage (~30 fold), it did not allow the reference genome to be fully mapped. Reads from regions with errors had low quality, low coverage, or were missing. The main defect of the reference mapping was the introduction of artificial indels into contigs through lower than 100% consensus and distracting gene calling due to artificial stop codons. No assembler was able to perform de novo assembly comparable to reference mapping. Automated annotation tools performed similarly on reference mapped and de novo draft genomes, and annotated most CDSs in the de novo assembled draft genomes. Free and open source software (FOSS) tools for assembly and annotation of NGS data are being developed rapidly to provide accurate results with less computational effort. Usability is not high priority and these tools currently do not allow the data to be processed without manual intervention. Despite this, genome assemblers now readily assemble medium short reads into long contigs (>97-98% genome coverage). A notable gap in pyrosequencing technology is the quality of base pair calling and conflicting base pairs between single reads at the same nucleotide position. Regardless, using draft whole genomes that are not finished and remain fragmented into tens of contigs allows one to characterize unknown bacteria with modest effort.

  1. [Earth and Space Sciences Project Services for NASA HPCC

    NASA Technical Reports Server (NTRS)

    Merkey, Phillip

    2002-01-01

    This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies Project, to engage the Beowulf Cluster Computing Community as well as the High Performance Computing Research Community so that we can predict the applicability of said technologies to the scientific community represented by the CT project and formulate long term strategies to provide the computational resources necessary to attain the anticipated scientific objectives of the CT project. Specifically, the goal of the evaluation effort is to use the information gathered over the course of the Round-3 investigations to quantify the trends in scientific expectations, the algorithmic requirements and capabilities of high-performance computers to satisfy this anticipated need.

  2. Automated Estimation Of Software-Development Costs

    NASA Technical Reports Server (NTRS)

    Roush, George B.; Reini, William

    1993-01-01

    COSTMODL is automated software development-estimation tool. Yields significant reduction in risk of cost overruns and failed projects. Accepts description of software product developed and computes estimates of effort required to produce it, calendar schedule required, and distribution of effort and staffing as function of defined set of development life-cycle phases. Written for IBM PC(R)-compatible computers.
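    COSTMODL's internal cost models are not reproduced here; as a stand-in for the kind of estimate such a tool produces, the sketch below applies the basic COCOMO effort and schedule relationships with organic-mode coefficients. Treating this as representative of COSTMODL is an assumption for illustration only.

      # Illustration of a parametric software cost estimate using the basic COCOMO
      # relationships (organic mode): effort = a * KLOC**b person-months and
      # schedule = c * effort**d months. This is NOT COSTMODL's own model.

      def cocomo_basic(kloc, a=2.4, b=1.05, c=2.5, d=0.38):
          effort_pm = a * kloc ** b              # effort in person-months
          schedule_months = c * effort_pm ** d   # calendar schedule in months
          avg_staff = effort_pm / schedule_months
          return effort_pm, schedule_months, avg_staff

      effort, schedule, staff = cocomo_basic(kloc=32)   # hypothetical 32 KSLOC product
      print(f"effort   : {effort:6.1f} person-months")
      print(f"schedule : {schedule:6.1f} months")
      print(f"staffing : {staff:6.1f} full-time equivalents")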

  3. Engineering oilseeds for sustainable production of industrial and nutritional feedstocks: solving bottlenecks in fatty acid flux.

    PubMed

    Cahoon, Edgar B; Shockey, Jay M; Dietrich, Charles R; Gidda, Satinder K; Mullen, Robert T; Dyer, John M

    2007-06-01

    Oilseeds provide a unique platform for the production of high-value fatty acids that can replace non-sustainable petroleum and oceanic sources of specialty chemicals and aquaculture feed. However, recent efforts to engineer the seeds of crop and model plant species to produce new types of fatty acids, including hydroxy and conjugated fatty acids for industrial uses and long-chain omega-3 polyunsaturated fatty acids for farmed fish feed, have met with only modest success. The collective results from these studies point to metabolic 'bottlenecks' in the engineered plant seeds that substantially limit the efficient or selective flux of unusual fatty acids between different substrate pools and ultimately into storage triacylglycerol. Evidence is emerging that diacylglycerol acyltransferase 2, which catalyzes the final step in triacylglycerol assembly, is an important contributor to the synthesis of unusual fatty acid-containing oils, and is likely to be a key target for future oilseed metabolic engineering efforts.

  4. Board development in two hospitals: lessons from a demonstration.

    PubMed

    Kovner, A R; Ritvo, R A; Holland, T P

    1997-01-01

    A recently concluded demonstration project examined efforts to improve the effectiveness of nonprofit boards. This article focuses on the interventions at two participating healthcare organizations and examines the outcomes of these efforts. Changes made at the Alpha Health Care system included: reduction in the number of boards, term limits established for board members, election of new board chairs for two of the remaining boards, reduction in the size of those boards, implementation of a consent agenda, and reorganization of the boards' committee structure. Fewer changes were implemented at the Beta Hospital, where several initiatives were started but only some were retained by the project's conclusion. Key factors limiting the extent of changes there were the modest interest in an active board by a new CEO and the limited investment of trustees in change. The article concludes with a discussion of lessons learned about board assessment, the use of retreats to initiate board development, and the importance of time management and CEO support to strengthen board effectiveness.

  5. A company-instituted program to improve blood pressure control in primary care.

    PubMed

    Alderman, M H; Melcher, L A

    1981-01-01

    An occupation-based effort to improve the outcome of antihypertensive therapy provided in the community was instituted by the Massachusetts Mutual Life Insurance Company in 1977. The goal of the program was to utilize the administrative and organizational resources of the company to enhance employee/patient adherence to treatment provided in conventional primary care settings. Key elements of the program were: companywide education and on-site screening, referral to community physicians and company assumption of all patient costs, linked to a monitoring system to permit oversight of care. Initially, 98% of employees were screened, 70% accepted referral for care and 59% fully adhered to program performance criteria. Blood pressure control has risen from 36% at the beginning to 69% at the end of the second year. Fully compliant patients have achieved the greatest lowering of blood pressure and compiled the best work attendance record. Program costs are modest and acceptance by employees and physicians supports the concept that occupation-based, systematic efforts can enhance the impact of primary care.

  6. Concurrent planning and execution for a walking robot

    NASA Astrophysics Data System (ADS)

    Simmons, Reid

    1990-07-01

    The Planetary Rover project is developing the Ambler, a novel legged robot, and an autonomous software system for walking the Ambler over rough terrain. As part of the project, we have developed a system that integrates perception, planning, and real-time control to navigate a single leg of the robot through complex obstacle courses. The system is integrated using the Task Control Architecture (TCA), a general-purpose set of utilities for building and controlling distributed mobile robot systems. The walking system, as originally implemented, utilized a sequential sense-plan-act control cycle. This report describes efforts to improve the performance of the system by concurrently planning and executing steps. Concurrency was achieved by modifying the existing sequential system to utilize TCA features such as resource management, monitors, temporal constraints, and hierarchical task trees. Performance was increased in excess of 30 percent with only a relatively modest effort to convert and test the system. The results lend support to the utility of using TCA to develop complex mobile robot systems.

  7. In vitro and in vivo transfection of primary phagocytes via microbubble-mediated intraphagosomal sonoporation.

    PubMed

    Lemmon, Jason C M; McFarland, Ryan J; Rybicka, Joanna M; Balce, Dale R; McKeown, Kyle R; Krohn, Regina M; Matsunaga, Terry O; Yates, Robin M

    2011-08-31

    The professional phagocytes, such as macrophages and dendritic cells, are the subject of numerous research efforts in immunology and cell biology. The use of primary phagocytes in these investigations, however, is limited by their inherent resistance to transfection with DNA constructs. As a result, the use of phagocyte-like immortalized cell lines is widespread. While these cell lines are transfection permissive, they are generally regarded as poor biological substitutes for primary phagocytes. By exploiting the phagocytic machinery of primary phagocytes, we developed a non-viral method of DNA transfection of macrophages that employs intraphagosomal sonoporation mediated by internalized lipid-based microbubbles. This approach enables the transfection of primary phagocytes in vitro with modest but reliable efficiency. Furthermore, this methodology was readily adapted to transfect murine peritoneal macrophages in vivo. This technology has immediate application to current research efforts and has potential for use in gene therapy and vaccination strategies. Copyright © 2011 Elsevier B.V. All rights reserved.

  8. Reference Computational Meshing Strategy for Computational Fluid Dynamics Simulation of Departure from Nucleate Boiling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pointer, William David

    The objective of this effort is to establish a strategy and process for generation of suitable computational mesh for computational fluid dynamics simulations of departure from nucleate boiling in a 5 by 5 fuel rod assembly held in place by PWR mixing vane spacer grids. This mesh generation process will support ongoing efforts to develop, demonstrate and validate advanced multi-phase computational fluid dynamics methods that enable more robust identification of dryout conditions and DNB occurrence. Building upon prior efforts and experience, multiple computational meshes were developed using the native mesh generation capabilities of the commercial CFD code STAR-CCM+. These meshes were used to simulate two test cases from the Westinghouse 5 by 5 rod bundle facility. The sensitivity of predicted quantities of interest to the mesh resolution was then established using two evaluation methods, the Grid Convergence Index method and the Least Squares method. This evaluation suggests that the Least Squares method can reliably establish the uncertainty associated with local parameters such as vector velocity components at a point in the domain or surface averaged quantities such as outlet velocity magnitude. However, neither method is suitable for characterization of uncertainty in global extrema such as peak fuel surface temperature, primarily because such parameters are not necessarily associated with a fixed point in space. This shortcoming is significant because the current generation algorithm for identification of DNB event conditions relies on identification of such global extrema. Ongoing efforts to identify DNB based on local surface conditions will address this challenge.
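
    As a concrete illustration of the Grid Convergence Index method mentioned above, the sketch below estimates an observed order of accuracy and a GCI from solutions on three successively refined meshes. This is a generic textbook-style calculation with hypothetical values, not code or data from the report.

```python
# Illustrative sketch (not from the report): Richardson-extrapolation-based
# Grid Convergence Index (GCI) for a quantity of interest computed on three
# successively refined meshes. Values below are hypothetical placeholders.
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Estimate the observed order of accuracy p from three solutions with a
    constant grid refinement ratio r."""
    return math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)

def gci_fine(f_medium, f_fine, r, p, safety_factor=1.25):
    """Grid Convergence Index on the fine mesh, as a fractional uncertainty."""
    e = abs((f_medium - f_fine) / f_fine)   # relative change between meshes
    return safety_factor * e / (r**p - 1.0)

# Hypothetical outlet-velocity magnitudes (m/s) from coarse, medium, fine meshes
f3, f2, f1 = 2.310, 2.285, 2.278
r = 2.0                                     # uniform refinement ratio
p = observed_order(f3, f2, f1, r)
print(f"observed order p = {p:.2f}")
print(f"GCI (fine mesh)  = {gci_fine(f2, f1, r, p):.3%}")
```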

  9. 24 CFR 891.670 - Cost containment and modest design standards.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 4 2011-04-01 2011-04-01 false Cost containment and modest design... Handicapped Families and Individuals-Section 162 Assistance § 891.670 Cost containment and modest design standards. (a) Restrictions on amenities. Projects must be modest in design. Except as provided in paragraph...

  10. 24 CFR 891.670 - Cost containment and modest design standards.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 24 Housing and Urban Development 4 2014-04-01 2014-04-01 false Cost containment and modest design... Handicapped Families and Individuals-Section 162 Assistance § 891.670 Cost containment and modest design standards. (a) Restrictions on amenities. Projects must be modest in design. Except as provided in paragraph...

  11. 24 CFR 891.670 - Cost containment and modest design standards.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 24 Housing and Urban Development 4 2013-04-01 2013-04-01 false Cost containment and modest design... Handicapped Families and Individuals-Section 162 Assistance § 891.670 Cost containment and modest design standards. (a) Restrictions on amenities. Projects must be modest in design. Except as provided in paragraph...

  12. 24 CFR 891.670 - Cost containment and modest design standards.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 24 Housing and Urban Development 4 2012-04-01 2012-04-01 false Cost containment and modest design... Handicapped Families and Individuals-Section 162 Assistance § 891.670 Cost containment and modest design standards. (a) Restrictions on amenities. Projects must be modest in design. Except as provided in paragraph...

  13. 24 CFR 891.670 - Cost containment and modest design standards.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Cost containment and modest design... Handicapped Families and Individuals-Section 162 Assistance § 891.670 Cost containment and modest design standards. (a) Restrictions on amenities. Projects must be modest in design. Except as provided in paragraph...

  14. A Survey of Techniques for Approximate Computing

    DOE PAGES

    Mittal, Sparsh

    2016-03-18

    Approximate computing trades off computation quality against the effort expended, and as rising performance demands confront plateauing resource budgets, approximate computing has become not merely attractive but imperative. Here, we present a survey of techniques for approximate computing (AC). We discuss strategies for finding approximable program portions and monitoring output quality, techniques for using AC in different processing units (e.g., CPU, GPU, and FPGA), processor components, memory technologies, etc., and programming frameworks for AC. Moreover, we classify these techniques based on several key characteristics to emphasize their similarities and differences. Finally, the aim of this paper is to provide researchers with insight into the workings of AC techniques and to inspire further efforts in this area, so that AC can become a mainstream computing approach in future systems.
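
    To make the quality-versus-effort trade concrete, the sketch below implements loop perforation, one simple approximate-computing technique of the kind covered by such surveys: skip a fraction of loop iterations and accept a small error in exchange for proportionally less work. The function and data are illustrative and not taken from the survey.

```python
# Minimal sketch of loop perforation: visit only a subset of iterations and
# accept a bounded accuracy loss in exchange for reduced effort. The data and
# workload are illustrative, not drawn from the survey.
def perforated_mean(values, skip=2):
    """Approximate the mean by visiting only every `skip`-th element."""
    sampled = values[::skip]
    return sum(sampled) / len(sampled)

data = [float(i % 97) for i in range(1_000_000)]
exact = sum(data) / len(data)
approx = perforated_mean(data, skip=4)      # roughly 4x fewer additions
print(f"exact={exact:.4f} approx={approx:.4f} rel_err={(approx - exact) / exact:.2%}")
```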

  15. 34 CFR 461.45 - How does the Secretary compute maintenance of effort in the event of a waiver?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Education (Continued) OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION ADULT EDUCATION... awarded for the year after the year of the waiver by comparing the amount spent for adult education from... 34 Education 3 2012-07-01 2012-07-01 false How does the Secretary compute maintenance of effort in...

  16. 34 CFR 461.45 - How does the Secretary compute maintenance of effort in the event of a waiver?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Education (Continued) OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION ADULT EDUCATION... awarded for the year after the year of the waiver by comparing the amount spent for adult education from... 34 Education 3 2013-07-01 2013-07-01 false How does the Secretary compute maintenance of effort in...

  17. 34 CFR 461.45 - How does the Secretary compute maintenance of effort in the event of a waiver?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Education (Continued) OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION ADULT EDUCATION... awarded for the year after the year of the waiver by comparing the amount spent for adult education from... 34 Education 3 2011-07-01 2011-07-01 false How does the Secretary compute maintenance of effort in...

  18. 34 CFR 461.45 - How does the Secretary compute maintenance of effort in the event of a waiver?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Education (Continued) OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION ADULT EDUCATION... awarded for the year after the year of the waiver by comparing the amount spent for adult education from... 34 Education 3 2014-07-01 2014-07-01 false How does the Secretary compute maintenance of effort in...

  19. 34 CFR 461.45 - How does the Secretary compute maintenance of effort in the event of a waiver?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Education (Continued) OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION ADULT EDUCATION... awarded for the year after the year of the waiver by comparing the amount spent for adult education from... 34 Education 3 2010-07-01 2010-07-01 false How does the Secretary compute maintenance of effort in...

  20. Integrating computation into the undergraduate curriculum: A vision and guidelines for future developments

    NASA Astrophysics Data System (ADS)

    Chonacky, Norman; Winch, David

    2008-04-01

    There is substantial evidence of a need to make computation an integral part of the undergraduate physics curriculum. This need is consistent with data from surveys in both the academy and the workplace, and has been reinforced by two years of exploratory efforts by a group of physics faculty for whom computation is a special interest. We have examined past and current efforts at reform and a variety of strategic, organizational, and institutional issues involved in any attempt to broadly transform existing practice. We propose a set of guidelines for development based on this past work and discuss our vision of computationally integrated physics.

  1. Micro-video display with ocular tracking and interactive voice control

    NASA Technical Reports Server (NTRS)

    Miller, James E.

    1993-01-01

    In certain space-restricted environments, many of the benefits resulting from computer technology have been foregone because of the size, weight, inconvenience, and lack of mobility associated with existing computer interface devices. Accordingly, an effort to develop a highly miniaturized and 'wearable' computer display and control interface device, referred to as the Sensory Integrated Data Interface (SIDI), is underway. The system incorporates a micro-video display that provides data display and ocular tracking on a lightweight headset. Software commands are implemented by conjunctive eye movement and voice commands of the operator. In this initial prototyping effort, various 'off-the-shelf' components have been integrated with a desktop computer and a customized menu-tree software application to demonstrate feasibility and conceptual capabilities. When fully developed as a customized system, the interface device will allow mobile, 'hands-free' operation of portable computer equipment. It will thus allow integration of information technology applications into those restrictive environments, both military and industrial, that have not yet taken advantage of the computer revolution. This effort is Phase 1 of Small Business Innovative Research (SBIR) Topic number N90-331 sponsored by the Naval Undersea Warfare Center Division, Newport. The prime contractor is Foster-Miller, Inc. of Waltham, MA.

  2. Measuring and Modeling Change in Examinee Effort on Low-Stakes Tests across Testing Occasions

    ERIC Educational Resources Information Center

    Sessoms, John; Finney, Sara J.

    2015-01-01

    Because schools worldwide use low-stakes tests to make important decisions, value-added indices computed from test scores must accurately reflect student learning, which requires equal test-taking effort across testing occasions. Evaluating change in effort assumes effort is measured equivalently across occasions. We evaluated the longitudinal…

  3. Vector and Scalar Bosons at DØ and ATLAS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lammers, Sabine

    2014-09-26

    Vector Boson Fusion (VBF) has never been measured in hadron collisions, but it is one of the most sensitive modes for low mass Standard Model Higgs production at ATLAS. The objective of this proposal is to measure VBF production of W and Z bosons at the DØ Experiment taking place at the Tevatron Collider near Chicago, Illinois, and at the ATLAS Experiment, running at the Large Hadron Collider in Geneva, Switzerland. The framework developed in these measurements will be used to discover and study the Higgs Boson produced through the same mechanism (VBF) at ATLAS. The 10 fb-1 dataset recently collected by the DØ experiment provides a unique opportunity to observe evidence of VBF production of W bosons, which will provide the required theoretical knowledge - VBF cross sections - and experimental knowledge - tuning of measurement techniques - on which to base the VBF measurements at the LHC. At the time of this writing, the ATLAS experiment has recorded 5 fb-1 of data at √s = 7 TeV, and expects to collect at least another 5 fb-1 in 2012. Assuming Standard Model cross sections, this dataset will allow for the observation of VBF production of W, Z and Higgs bosons. The major challenges for the first observation of VBF interactions are: developing highly optimized forward jet identification algorithms, and accurately modeling both rates and kinematics of background processes. With the research program outlined in this grant proposal, I plan to address each of these areas, paving the way for VBF observation. The concentration on VBF production for the duration of this grant will be at ATLAS, where the anticipated high pileup rates necessitate a cleaner signal. My past experience with forward jet identification at the ZEUS experiment, and with W+(n)Jets measurements at DØ, puts me in a unique position to lead this effort. The proposed program will have a dual focus: on DØ, where the VBF analysis effort is mature and the efforts of a postdoc will be required to bring the VBF W analysis to a paper, and at ATLAS, where a graduate student will begin the effort. I therefore request funding for a student and a postdoc, as well as summer support for myself, for the four year duration of the grant proposal. I also request travel funds to facilitate interactions with my group, presentation at conferences, and a modest amount of money to purchase computing resources.

  4. Materials Genome Initiative

    NASA Technical Reports Server (NTRS)

    Vickers, John

    2015-01-01

    The Materials Genome Initiative (MGI) project element is a cross-Center effort that is focused on the integration of computational tools to simulate manufacturing processes and materials behavior. These computational simulations will be utilized to gain understanding of processes and materials behavior to accelerate process development and certification, to more efficiently integrate new materials in existing NASA projects, and to lead to the design of new materials for improved performance. This NASA effort looks to collaborate with efforts at other government agencies and universities working under the national MGI. MGI plans to develop integrated computational/experimental/processing methodologies for accelerating discovery and insertion of materials to satisfy NASA's unique mission demands. The challenges include validated design tools that incorporate materials properties, processes, and design requirements, and materials process control to rapidly mature emerging manufacturing methods and develop certified manufacturing processes.

  5. Tobacco use in popular movies during the past decade.

    PubMed

    Mekemson, C; Glik, D; Titus, K; Myerson, A; Shaivitz, A; Ang, A; Mitchell, S

    2004-12-01

    The top 50 commercially successful films released per year from 1991 to 2000 were content coded to assess trends in tobacco use over time and attributes of films predictive of higher smoking rates. This observational study used media content analysis methods to generate data about tobacco use depictions in the films studied (n = 497). Films are the basic unit of analysis. Once films were coded and preliminary analysis completed, outcome data were transformed to approximate multivariate normality before being analysed with general linear models and longitudinal mixed method regression methods. Tobacco use per minute of film was the main outcome measure used. Predictor variables include attributes of films and actors. Tobacco use was defined as any cigarette, cigar, and chewing tobacco use as well as the display of smoke and cigarette paraphernalia such as ashtrays, brand names, or logos within frames of films reviewed. Smoking rates in the top films fluctuated yearly over the decade with an overall modest downward trend (p < 0.005), with the exception of R-rated films, where rates went up. The decrease in smoking rates found in films is modest given the extensive efforts over the past decade to educate the entertainment industry on this issue. Monitoring, education, advocacy, and policy change to bring tobacco depiction rates down further should continue.

  6. Ion Beam Propulsion Study

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The Ion Beam Propulsion Study was a joint high-level study between the Applied Physics Laboratory operated by NASA and ASRC Aerospace at Kennedy Space Center, Florida, and Berkeley Scientific, Berkeley, California. The results were promising and suggested that work should continue if future funding becomes available. The application of ion thrusters for spacecraft propulsion is limited to quite modest ion sources with similarly modest ion beam parameters because of the mass penalty associated with the ion source and its power supply system. Also, the ion source technology has not been able to provide very high-power ion beams. Small ion beam propulsion systems were used with considerable success. Ion propulsion systems brought into practice use an onboard ion source to form an energetic ion beam, typically Xe+ ions, as the propellant. Such systems were used for steering and correction of telecommunication satellites and as the main thruster for the Deep Space 1 demonstration mission. In recent years, "giant" ion sources were developed for the controlled-fusion research effort worldwide, with beam parameters many orders of magnitude greater than the tiny ones of conventional space thruster application. The advent of such huge ion beam sources and the need for advanced propulsion systems for exploration of the solar system suggest a fresh look at ion beam propulsion, now with the giant fusion sources in mind.

  7. Δ9-Tetrahydrocannabinol decreases willingness to exert cognitive effort in male rats

    PubMed Central

    Silveira, Mason M.; Adams, Wendy K.; Morena, Maria; Hill, Matthew N.; Winstanley, Catharine A.

    2017-01-01

    Background Acceptance of cannabis use is growing. However, prolonged use is associated with diminished psychosocial outcomes, potentially mediated by drug-induced cognitive impairments. Δ9-Tetrahydrocannabinol (THC) is the main psychoactive ingredient in cannabis, yet other phytocannabinoids in the plant, such as cannabidiol (CBD), have unique properties. Given that CBD can modulate the undesirable effects of THC, therapeutic agents, such as nabiximols, contain higher CBD:THC ratios than illicit marijuana. We tested the hypothesis that THC impairs a relevant cognitive function for long-term success, namely willingness to exert cognitive effort for greater rewards, and that CBD could attenuate such decision-making impairments. Methods Male Long–Evans rats (n = 29) performing the rat cognitive effort task (rCET) received acute THC and CBD, independently and concurrently, in addition to other cannabinoids. Rats chose between 2 options differing in reward magnitude, but also in the cognitive effort (attentional load) required to obtain them. Results We found that THC decreased choice of hard trials without impairing the animals’ ability to accurately complete them. Strikingly, this impairment was correlated with CB1 receptor density in the medial prefrontal cortex — an area previously implicated in effortful decision-making. In contrast, CBD did not affect choice. Coadministration of 1:1 CBD:THC matching that in nabiximols modestly attenuated the deleterious effects of THC in “slacker” rats. Limitations Only male rats were investigated, and the THC/CBD coadministration experiment was carried out in a subset of individuals. Conclusion These findings confirm that THC, but not CBD, selectively impairs decision-making involving cognitive effort costs. However, coadministration of CBD only partially ameliorates such THC-induced dysfunction. PMID:28245177

  8. Multiphysics Thrust Chamber Modeling for Nuclear Thermal Propulsion

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Cheng, Gary; Chen, Yen-Sen

    2006-01-01

    The objective of this effort is to develop an efficient and accurate thermo-fluid computational methodology to predict environments for a solid-core, nuclear thermal engine thrust chamber. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics formulation. A two-pronged approach is employed in this effort: A detailed thermo-fluid analysis on a multi-channel flow element for mid-section corrosion investigation; and a global modeling of the thrust chamber to understand the effect of heat transfer on thrust performance. Preliminary results on both aspects are presented.

  9. Is nuclear matter a quantum crystal?

    NASA Technical Reports Server (NTRS)

    Canuto, V.; Chitre, S. M.

    1973-01-01

    A possible alternative to the ordinary gas-like computation for nuclear matter is investigated under the assumption that the nucleons are arranged in a lattice. BCC, FCC and HCP structures are investigated. Only HCP shows a minimum in the energy vs. density curve with a modest binding energy of -1.5 MeV. The very low density limit is investigated and sensible results are obtained only if the tensor force decreases with the density. A study of the elastic properties indicates that the previous structures are mechanically unstable against shearing stresses.

  10. Waterspout, Gust Fronts and Associated Cloud Systems

    NASA Technical Reports Server (NTRS)

    Simpson, J.

    1983-01-01

    Nine waterspouts observed on five experimental days during the GATE period of observations are discussed. Primary data used are from 2 aircraft flying in different patterns, one above the other between 30 and 300 m. There is strong evidence associating whirl initiation with cumulus outflow. Computations prepared from estimates of convergence within the region suggest the possibility of vortex generation within 4 minutes. This analysis supports (1) the importance that cumulus outflows may have in waterspout initiation and (2) the possibility that sea surface temperature gradients may be important in enabling waterspout development from modest size cumuli.

  11. On-line data display

    NASA Astrophysics Data System (ADS)

    Lang, Sherman Y. T.; Brooks, Martin; Gauthier, Marc; Wein, Marceli

    1993-05-01

    A data display system for embedded realtime systems has been developed for use as an operator's user interface and debugging tool. The motivation for development of the On-Line Data Display (ODD) has come from several sources. In particular the design reflects the needs of researchers developing an experimental mobile robot within our laboratory. A proliferation of specialized user interfaces revealed a need for a flexible communications and graphical data display system. At the same time the system had to be readily extensible for arbitrary graphical display formats which would be required for the data visualization needs of the researchers. The system defines a communication protocol transmitting 'datagrams' between tasks executing on the realtime system and virtual devices displaying the data in a meaningful way on a graphical workstation. The communication protocol multiplexes logical channels on a single data stream. The current implementation consists of a server for the Harmony realtime operating system and an application written for the Macintosh computer. Flexibility requirements resulted in a highly modular server design, and a layered modular object-oriented design for the Macintosh part of the system. Users assign data types to specific channels at run time. Then devices are instantiated by the user and connected to channels to receive datagrams. The current suite of device types does not provide enough functionality for most users' specialized needs. Instead the system design allows the creation of new device types with modest programming effort. The protocol, design and use of the system are discussed.
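
    The sketch below illustrates the kind of channel-multiplexed datagram framing the abstract describes, with each datagram tagged by a logical channel id so that many realtime tasks can share one stream and each virtual display device subscribes to a channel. The field layout, sizes, and names are assumptions for illustration, not the actual ODD wire format.

```python
# Hypothetical sketch of channel-multiplexed datagram framing: each datagram
# carries a logical channel id plus a payload length and payload, so many data
# sources share one stream and each virtual device reads only its channel.
# The header layout and field names are assumptions, not the ODD protocol.
import struct

HEADER = struct.Struct("!HI")                 # channel id (u16), payload length (u32)

def pack_datagram(channel: int, payload: bytes) -> bytes:
    return HEADER.pack(channel, len(payload)) + payload

def unpack_stream(stream: bytes):
    """Yield (channel, payload) pairs from a byte stream of datagrams."""
    offset = 0
    while offset < len(stream):
        channel, length = HEADER.unpack_from(stream, offset)
        offset += HEADER.size
        yield channel, stream[offset:offset + length]
        offset += length

# A realtime task sends telemetry on channel 3; a plotting device reads it.
stream = pack_datagram(3, b"wheel_odometry:1.042") + pack_datagram(7, b"battery:11.9V")
for ch, data in unpack_stream(stream):
    print(ch, data.decode())
```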

  12. Physicians' fears of malpractice lawsuits are not assuaged by tort reforms.

    PubMed

    Carrier, Emily R; Reschovsky, James D; Mello, Michelle M; Mayrell, Ralph C; Katz, David

    2010-09-01

    Physicians contend that the threat of malpractice lawsuits forces them to practice defensive medicine, which in turn raises the cost of health care. This argument underlies efforts to change malpractice laws through legislative tort reform. We evaluated physicians' perceptions about malpractice claims in states where more objective indicators of malpractice risk, such as malpractice premiums, varied considerably. We found high levels of malpractice concern among both generalists and specialists in states where objective measures of malpractice risk were low. We also found relatively modest differences in physicians' concerns across states with and without common tort reforms. These results suggest that many policies aimed at controlling malpractice costs may have a limited effect on physicians' malpractice concerns.

  13. An empirical test of the Theory of Planned Behaviour applied to contraceptive use in rural Uganda.

    PubMed

    Kiene, Susan M; Hopwood, Sarah; Lule, Haruna; Wanyenze, Rhoda K

    2014-12-01

    There is a high unmet need for contraceptives in developing countries such as Uganda, with high population growth, where efforts are needed to promote family planning and contraceptive use. Despite this high need, little research has investigated applications of health-behaviour-change theories to contraceptive use among this population. This study tested the Theory of Planned Behaviour's ability to predict contraceptive-use-related behaviours among post-partum women in rural Uganda. Results gave modest support to the theory's application and suggest an urgent need for improved theory-based interventions to promote contraceptive use in the populations of developing countries. © The Author(s) 2013.

  14. A climate trend analysis of Senegal

    USGS Publications Warehouse

    Funk, Christopher C.; Rowland, Jim; Adoum, Alkhalil; Eilerts, Gary; Verdin, James; White, Libby

    2012-01-01

    This brief report, drawing from a multi-year effort by the U.S. Agency for International Development (USAID) Famine Early Warning Systems Network (FEWS NET), identifies modest declines in rainfall, accompanied by increases in air temperatures. These analyses are based on quality-controlled station observations. Conclusions: * Summer rains have remained steady in Senegal over the past 20 years but are 15 percent below the 1920-1969 average. * Temperatures have increased by 0.9° Celsius since 1975, amplifying the effect of droughts. * Cereal yields are low but have been improving. * The amount of farmland per person is low and declining rapidly. * Current population and agriculture trends could lead to a 30-percent reduction in per capita cereal production by 2025.

  15. 34 CFR 403.185 - How does the Secretary compute maintenance of effort in the event of a waiver?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... VOCATIONAL AND APPLIED TECHNOLOGY EDUCATION PROGRAM What Financial Conditions Must Be Met by a State? § 403... 34 Education 3 2010-07-01 2010-07-01 false How does the Secretary compute maintenance of effort in the event of a waiver? 403.185 Section 403.185 Education Regulations of the Offices of the Department...

  16. Complete distributed computing environment for a HEP experiment: experience with ARC-connected infrastructure for ATLAS

    NASA Astrophysics Data System (ADS)

    Read, A.; Taga, A.; O-Saada, F.; Pajchel, K.; Samset, B. H.; Cameron, D.

    2008-07-01

    Computing and storage resources connected by the Nordugrid ARC middleware in the Nordic countries, Switzerland and Slovenia are a part of the ATLAS computing Grid. This infrastructure is being commissioned with the ongoing ATLAS Monte Carlo simulation production in preparation for the commencement of data taking in 2008. The unique non-intrusive architecture of ARC, its straightforward interplay with the ATLAS Production System via the Dulcinea executor, and its performance during the commissioning exercise are described. ARC support for flexible and powerful end-user analysis within the GANGA distributed analysis framework is also shown. Whereas the storage solution for this Grid was earlier based on a large, distributed collection of GridFTP-servers, the ATLAS computing design includes a structured SRM-based system with a limited number of storage endpoints. The characteristics, integration and performance of the old and new storage solutions are presented. Although the hardware resources in this Grid are quite modest, it has provided more than double the agreed contribution to the ATLAS production with an efficiency above 95% during long periods of stable operation.

  17. Quantum Computation using Arrays of N Polar Molecules in Pendular States.

    PubMed

    Wei, Qi; Cao, Yudong; Kais, Sabre; Friedrich, Bretislav; Herschbach, Dudley

    2016-11-18

    We investigate several aspects of realizing quantum computation using entangled polar molecules in pendular states. Quantum algorithms typically start from a product state |00⋯0⟩ and we show that up to a negligible error, the ground states of polar molecule arrays can be considered as the unentangled qubit basis state |00⋯0⟩. This state can be prepared by simply allowing the system to reach thermal equilibrium at low temperature (<1 mK). We also evaluate entanglement, characterized by concurrence of pendular state qubits in dipole arrays as governed by the external electric field, dipole-dipole coupling and number N of molecules in the array. In the parameter regime that we consider for quantum computing, we find that qubit entanglement is modest, typically no greater than 10^-4, confirming the negligible entanglement in the ground state. We discuss methods for realizing quantum computation in the gate model, measurement-based model, instantaneous quantum polynomial time circuits and the adiabatic model using polar molecules in pendular states. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
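
    For readers unfamiliar with the entanglement measure quoted above, the sketch below computes the Wootters concurrence of a two-qubit density matrix and evaluates it for a weakly entangled example state chosen to give a concurrence near the 10^-4 scale mentioned in the abstract. The example state is generic and not a pendular-state calculation from the paper.

```python
# Sketch (not from the paper): Wootters concurrence of a two-qubit density
# matrix, the entanglement measure the authors report for pendular-state
# qubit pairs. The example state below is a generic illustration.
import numpy as np

def concurrence(rho: np.ndarray) -> float:
    """Wootters concurrence C(rho) for a 4x4 two-qubit density matrix."""
    sy = np.array([[0, -1j], [1j, 0]])
    spin_flip = np.kron(sy, sy)
    rho_tilde = spin_flip @ rho.conj() @ spin_flip
    # Eigenvalues of rho * rho_tilde are non-negative in exact arithmetic;
    # clip tiny negative round-off before taking square roots.
    eigs = np.sort(np.real(np.linalg.eigvals(rho @ rho_tilde)))[::-1]
    lam = np.sqrt(np.clip(eigs, 0.0, None))
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

# Weakly entangled pure state a|00> + b|11>; concurrence is 2|ab|, chosen here
# to land near the ~1e-4 scale quoted in the abstract.
eps = 5e-5
psi = np.array([np.sqrt(1 - eps**2), 0, 0, eps])
rho = np.outer(psi, psi.conj())
print(f"concurrence = {concurrence(rho):.2e}")
```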

  18. A streamlined failure mode and effects analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ford, Eric C., E-mail: eford@uw.edu; Smith, Koren; Terezakis, Stephanie

    Purpose: Explore the feasibility and impact of a streamlined failure mode and effects analysis (FMEA) using a structured process that is designed to minimize staff effort. Methods: FMEA for the external beam process was conducted at an affiliate radiation oncology center that treats approximately 60 patients per day. A structured FMEA process was developed which included clearly defined roles and goals for each phase. A core group of seven people was identified and a facilitator was chosen to lead the effort. Failure modes were identified and scored according to the FMEA formalism. A risk priority number, RPN, was calculated and used to rank failure modes. Failure modes with RPN > 150 received safety improvement interventions. Staff effort was carefully tracked throughout the project. Results: Fifty-two failure modes were identified, 22 collected during meetings, and 30 from take-home worksheets. The four top-ranked failure modes were: delay in film check, missing pacemaker protocol/consent, critical structures not contoured, and pregnant patient simulated without the team's knowledge of the pregnancy. These four failure modes had RPN > 150 and received safety interventions. The FMEA was completed in one month in four 1-h meetings. A total of 55 staff hours were required and, additionally, 20 h by the facilitator. Conclusions: Streamlined FMEA provides a means of accomplishing a relatively large-scale analysis with modest effort. One potential value of FMEA is that it provides a means of measuring the impact of quality improvement efforts through a reduction in risk scores. Future study of this possibility is needed.
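
    The scoring step described above reduces to a small calculation: each failure mode receives severity, occurrence, and detectability scores, the risk priority number is their product, and modes exceeding the threshold (150 in this study) are flagged for intervention. The sketch below illustrates this with hypothetical scores; the failure-mode names echo the abstract but the numbers are invented.

```python
# Illustrative FMEA scoring sketch: RPN = severity * occurrence * detectability,
# with modes above a threshold flagged for safety interventions. Scores are
# hypothetical, not the study's data.
failure_modes = [
    {"name": "delay in film check",              "S": 7, "O": 6, "D": 5},
    {"name": "missing pacemaker protocol",       "S": 9, "O": 4, "D": 5},
    {"name": "critical structures not contoured", "S": 8, "O": 4, "D": 6},
    {"name": "wrong patient chart opened",       "S": 6, "O": 3, "D": 4},
]

THRESHOLD = 150
for fm in failure_modes:
    fm["RPN"] = fm["S"] * fm["O"] * fm["D"]

for fm in sorted(failure_modes, key=lambda f: f["RPN"], reverse=True):
    flag = "intervene" if fm["RPN"] > THRESHOLD else "monitor"
    print(f'{fm["RPN"]:4d}  {flag:9s}  {fm["name"]}')
```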

  19. Two-body Schrödinger wave functions in a plane-wave basis via separation of dimensions

    NASA Astrophysics Data System (ADS)

    Jerke, Jonathan; Poirier, Bill

    2018-03-01

    Using a combination of ideas, the ground and several excited electronic states of the helium atom and the hydrogen molecule are computed to chemical accuracy—i.e., to within 1-2 mhartree or better. The basic strategy is very different from the standard electronic structure approach in that the full two-electron six-dimensional (6D) problem is tackled directly, rather than starting from a single-electron Hartree-Fock approximation. Electron correlation is thus treated exactly, even though computational requirements remain modest. The method also allows for exact wave functions to be computed, as well as energy levels. From the full-dimensional 6D wave functions computed here, radial distribution functions and radial correlation functions are extracted—as well as a 2D probability density function exhibiting antisymmetry for a single Cartesian component. These calculations support a more recent interpretation of Hund's rule, which states that the lower energy of the higher spin-multiplicity states is actually due to reduced screening, rather than reduced electron-electron repulsion. Prospects for larger systems and/or electron dynamics applications appear promising.

  20. Comparison of scientific computing platforms for MCNP4A Monte Carlo calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendricks, J.S.; Brockhoff, R.C.

    1994-04-01

    The performance of seven computer platforms is evaluated with the widely used and internationally available MCNP4A Monte Carlo radiation transport code. All results are reproducible and are presented in such a way as to enable comparison with computer platforms not in the study. The authors observed that the HP/9000-735 workstation runs MCNP 50% faster than the Cray YMP 8/64. Compared with the Cray YMP 8/64, the IBM RS/6000-560 is 68% as fast, the Sun Sparc10 is 66% as fast, the Silicon Graphics ONYX is 90% as fast, the Gateway 2000 model 4DX2-66V personal computer is 27% as fast, and the Sun Sparc2 is 24% as fast. In addition to comparing the timing performance of the seven platforms, the authors observe that changes in compilers and software over the past 2 yr have resulted in only modest performance improvements, hardware improvements have enhanced performance by less than a factor of approximately 3, timing studies are very problem dependent, and MCNP4A runs about as fast as MCNP4.

  1. Global Static Indexing for Real-Time Exploration of Very Large Regular Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pascucci, V; Frank, R

    2001-07-23

    In this paper we introduce a new indexing scheme for progressive traversal and visualization of large regular grids. We demonstrate the potential of our approach by providing a tool that displays at interactive rates planar slices of scalar field data with very modest computing resources. We obtain unprecedented results both in terms of absolute performance and, more importantly, in terms of scalability. On a laptop computer we provide real time interaction with a 2048^3 grid (8 Giga-nodes) using only 20MB of memory. On an SGI Onyx we slice interactively an 8192^3 grid (1/2 tera-nodes) using only 60MB of memory. The scheme relies simply on the determination of an appropriate reordering of the rectilinear grid data and a progressive construction of the output slice. The reordering minimizes the amount of I/O performed during the out-of-core computation. The progressive and asynchronous computation of the output provides flexible quality/speed tradeoffs and a time-critical and interruptible user interface.
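
    The core idea, reordering a regular grid so that spatially nearby nodes are stored near each other, can be illustrated with a Morton (Z-order) key built by bit interleaving, as in the sketch below. This is a generic locality-preserving ordering for illustration and is not claimed to be the paper's exact hierarchical indexing scheme.

```python
# Sketch of a locality-preserving reordering for a regular grid: map each
# (i, j, k) node to a Morton / Z-order key by bit interleaving, so nearby
# nodes land near each other on disk and a slice touches few regions. This is
# an illustration, not necessarily the paper's exact hierarchical ordering.
def part1by2(n: int) -> int:
    """Spread the bits of n so they occupy every third bit position."""
    result = 0
    for bit in range(21):                 # supports grids up to 2**21 per axis
        result |= ((n >> bit) & 1) << (3 * bit)
    return result

def morton3d(i: int, j: int, k: int) -> int:
    """Interleave the bits of i, j, k into a single Z-order key."""
    return part1by2(i) | (part1by2(j) << 1) | (part1by2(k) << 2)

# Reorder the nodes of a small grid; neighbouring nodes get nearby keys.
N = 4
order = sorted(((i, j, k) for i in range(N) for j in range(N) for k in range(N)),
               key=lambda ijk: morton3d(*ijk))
print(order[:8])   # the first 8 nodes form the 2x2x2 corner block
```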

  2. Two-body Schrödinger wave functions in a plane-wave basis via separation of dimensions.

    PubMed

    Jerke, Jonathan; Poirier, Bill

    2018-03-14

    Using a combination of ideas, the ground and several excited electronic states of the helium atom and the hydrogen molecule are computed to chemical accuracy-i.e., to within 1-2 mhartree or better. The basic strategy is very different from the standard electronic structure approach in that the full two-electron six-dimensional (6D) problem is tackled directly, rather than starting from a single-electron Hartree-Fock approximation. Electron correlation is thus treated exactly, even though computational requirements remain modest. The method also allows for exact wave functions to be computed, as well as energy levels. From the full-dimensional 6D wave functions computed here, radial distribution functions and radial correlation functions are extracted-as well as a 2D probability density function exhibiting antisymmetry for a single Cartesian component. These calculations support a more recent interpretation of Hund's rule, which states that the lower energy of the higher spin-multiplicity states is actually due to reduced screening, rather than reduced electron-electron repulsion. Prospects for larger systems and/or electron dynamics applications appear promising.

  3. Multiphysics simulations: Challenges and opportunities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keyes, D.; McInnes, L. C.; Woodward, C.

    This report is an outcome of the workshop Multiphysics Simulations: Challenges and Opportunities, sponsored by the Institute of Computing in Science (ICiS). Additional information about the workshop, including relevant reading and presentations on multiphysics issues in applications, algorithms, and software, is available via https://sites.google.com/site/icismultiphysics2011/. We consider multiphysics applications from algorithmic and architectural perspectives, where 'algorithmic' includes both mathematical analysis and computational complexity and 'architectural' includes both software and hardware environments. Many diverse multiphysics applications can be reduced, en route to their computational simulation, to a common algebraic coupling paradigm. Mathematical analysis of multiphysics coupling in this form is not always practical for realistic applications, but model problems representative of applications discussed herein can provide insight. A variety of software frameworks for multiphysics applications have been constructed and refined within disciplinary communities and executed on leading-edge computer systems. We examine several of these, expose some commonalities among them, and attempt to extrapolate best practices to future systems. From our study, we summarize challenges and forecast opportunities. We also initiate a modest suite of test problems encompassing features present in many applications.

  4. Computational Fluid Dynamics Technology for Hypersonic Applications

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.

    2003-01-01

    Several current challenges in computational fluid dynamics and aerothermodynamics for hypersonic vehicle applications are discussed. Example simulations are presented from code validation and code benchmarking efforts to illustrate capabilities and limitations. Opportunities to advance the state of the art in algorithms, grid generation and adaptation, and code validation are identified. Highlights of diverse efforts to address these challenges are then discussed. One such effort to re-engineer and synthesize the existing analysis capability in LAURA, VULCAN, and FUN3D will provide context for these discussions. The critical (and evolving) role of agile software engineering practice in the capability enhancement process is also noted.

  5. Equal Time for Women.

    ERIC Educational Resources Information Center

    Kolata, Gina

    1984-01-01

    Examines social influences which discourage women from pursuing studies in computer science, including monopoly of computer time by boys at the high school level, sexual harassment in college, movies, and computer games. Describes some initial efforts to encourage females of all ages to study computer science. (JM)

  6. Combining Computational and Social Effort for Collaborative Problem Solving

    PubMed Central

    Wagy, Mark D.; Bongard, Josh C.

    2015-01-01

    Rather than replacing human labor, there is growing evidence that networked computers create opportunities for collaborations of people and algorithms to solve problems beyond either of them. In this study, we demonstrate the conditions under which such synergy can arise. We show that, for a design task, three elements are sufficient: humans apply intuitions to the problem, algorithms automatically determine and report back on the quality of designs, and humans observe and innovate on others’ designs to focus creative and computational effort on good designs. This study suggests how such collaborations should be composed for other domains, as well as how social and computational dynamics mutually influence one another during collaborative problem solving. PMID:26544199

  7. Computing Cluster for Large Scale Turbulence Simulations and Applications in Computational Aeroacoustics

    NASA Astrophysics Data System (ADS)

    Lele, Sanjiva K.

    2002-08-01

    Funds were received in April 2001 under the Department of Defense DURIP program for construction of a 48 processor high performance computing cluster. This report details the hardware which was purchased and how it has been used to enable and enhance research activities directly supported by, and of interest to, the Air Force Office of Scientific Research and the Department of Defense. The report is divided into two major sections. The first section after this summary describes the computer cluster, its setup, and some cluster performance benchmark results. The second section explains ongoing research efforts which have benefited from the cluster hardware, and presents highlights of those efforts since installation of the cluster.

  8. Eric Bonnema | NREL

    Science.gov Websites

    Contributes to research efforts for commercial buildings, with work spanning commercial-sector whole-building energy simulation, scientific computing, and software configuration.

  9. Genetic susceptibility to neuroblastoma: current knowledge and future directions.

    PubMed

    Ritenour, Laura E; Randall, Michael P; Bosse, Kristopher R; Diskin, Sharon J

    2018-05-01

    Neuroblastoma, a malignancy of the developing peripheral nervous system that affects infants and young children, is a complex genetic disease. Over the past two decades, significant progress has been made toward understanding the genetic determinants that predispose to this often lethal childhood cancer. Approximately 1-2% of neuroblastomas are inherited in an autosomal dominant fashion and a combination of co-morbidity and linkage studies has led to the identification of germline mutations in PHOX2B and ALK as the major genetic contributors to this familial neuroblastoma subset. The genetic basis of "sporadic" neuroblastoma is being studied through a large genome-wide association study (GWAS). These efforts have led to the discovery of many common susceptibility alleles, each with modest effect size, associated with the development and progression of sporadic neuroblastoma. More recently, next-generation sequencing efforts have expanded the list of potential neuroblastoma-predisposing mutations to include rare germline variants with a predicted larger effect size. The evolving characterization of neuroblastoma's genetic basis has led to a deeper understanding of the molecular events driving tumorigenesis, more precise risk stratification and prognostics and novel therapeutic strategies. This review details the contemporary understanding of neuroblastoma's genetic predisposition, including recent advances and discusses ongoing efforts to address gaps in our knowledge regarding this malignancy's complex genetic underpinnings.

  10. A review of high-speed, convective, heat-transfer computation methods

    NASA Technical Reports Server (NTRS)

    Tauber, Michael E.

    1989-01-01

    The objective of this report is to provide useful engineering formulations and to instill a modest degree of physical understanding of the phenomena governing convective aerodynamic heating at high flight speeds. Some physical insight is not only essential to the application of the information presented here, but also to the effective use of computer codes which may be available to the reader. A discussion is given of cold-wall, laminar boundary layer heating. A brief presentation of the complex boundary layer transition phenomenon follows. Next, cold-wall turbulent boundary layer heating is discussed. This topic is followed by a brief coverage of separated flow-region and shock-interaction heating. A review of heat protection methods follows, including the influence of mass addition on laminar and turbulent boundary layers. Also included are a discussion of finite-difference computer codes and a comparison of some results from these codes. An extensive list of references is also provided from sources such as the various AIAA journals and NASA reports which are available in the open literature.

  11. A review of high-speed, convective, heat-transfer computation methods

    NASA Technical Reports Server (NTRS)

    Tauber, Michael E.

    1989-01-01

    The objective is to provide useful engineering formulations and to instill a modest degree of physical understanding of the phenomena governing convective aerodynamic heating at high flight speeds. Some physical insight is not only essential to the application of the information presented here, but also to the effective use of computer codes which may be available to the reader. Given first is a discussion of cold-wall, laminar boundary layer heating. A brief presentation of the complex boundary layer transition phenomenon follows. Next, cold-wall turbulent boundary layer heating is discussed. This topic is followed by a brief coverage of separated flow-region and shock-interaction heating. A review of heat protection methods follows, including the influence of mass addition on laminar and turbulent boundary layers. Next is a discussion of finite-difference computer codes and a comparison of some results from these codes. An extensive list of references is also provided from sources such as the various AIAA journals and NASA reports which are available in the open literature.
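
    As an example of the engineering-level formulations such a review presents, laminar stagnation-point convective heating is commonly scaled as the square root of density over nose radius times velocity cubed. The sketch below compares two hypothetical flight conditions using that proportionality; taking a ratio avoids the dimensional constant, and none of the numbers come from the report.

```python
# Sketch of a common engineering scaling for laminar stagnation-point
# convective heating: q is roughly proportional to sqrt(rho / R_n) * V**3
# (freestream density rho, nose radius R_n, velocity V). Comparing two
# conditions as a ratio sidesteps the dimensional constant; the flight
# conditions below are hypothetical, not taken from the report.
import math

def relative_stagnation_heating(rho, radius_n, velocity):
    """Quantity proportional to laminar stagnation-point heat flux."""
    return math.sqrt(rho / radius_n) * velocity**3

entry_peak = relative_stagnation_heating(rho=3e-4, radius_n=0.5, velocity=7500.0)
cruise     = relative_stagnation_heating(rho=2e-2, radius_n=0.5, velocity=2000.0)
print(f"peak-entry / hypersonic-cruise heating ratio ~ {entry_peak / cruise:.1f}")
```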

  12. Effects of Geometric Details on Slat Noise Generation and Propagation

    NASA Technical Reports Server (NTRS)

    Khorrami, Mehdi R.; Lockard, David P.

    2006-01-01

    The relevance of geometric details to the generation and propagation of noise from leading-edge slats is considered. Typically, such details are omitted in computational simulations and model-scale experiments thereby creating ambiguities in comparisons with acoustic results from flight tests. The current study uses two-dimensional, computational simulations in conjunction with a Ffowcs Williams-Hawkings (FW-H) solver to investigate the effects of previously neglected slat "bulb" and "blade" seals on the local flow field and the associated acoustic radiation. The computations clearly show that the presence of the "blade" seal at the cusp significantly changes the slat cove flow dynamics, reduces the amplitudes of the radiated sound, and to a lesser extent, alters the directivity beneath the airfoil. Furthermore, it is demonstrated that a modest extension of the baseline "blade" seal further enhances the suppression of slat noise. As a side issue, the utility and equivalence of FW-H methodology for calculating far-field noise as opposed to a more direct approach is examined and demonstrated.

  13. How wearable technologies will impact the future of health care.

    PubMed

    Barnard, Rick; Shea, J Timothy

    2004-01-01

    After four hundred years of delivering health care in hospitals, industrialized countries are now shifting towards treating patients at the "point of need". This trend will likely accelerate demand for, and adoption of, wearable computing and smart fabric and interactive textile (SFIT) solutions. These healthcare solutions will be designed to provide real-time vital and diagnostic information to health care providers, patients, and related stakeholders in such a manner as to improve quality of care, reduce the cost of care, and allow patients greater control over their own health. The current market size for wearable computing and SFIT solutions is modest; however, the future outlook is extremely strong. Venture Development Corporation, a technology market research and strategy firm, was founded in 1971. Over the years, VDC has developed and implemented a unique and highly successful methodology for forecasting and analyzing highly dynamic technology markets. VDC has extensive experience in providing multi-client and proprietary analysis in the electronic components, advanced materials, and mobile computing markets.

  14. Assessment of time-dependent density functional theory with the restricted excitation space approximation for excited state calculations of large systems

    NASA Astrophysics Data System (ADS)

    Hanson-Heine, Magnus W. D.; George, Michael W.; Besley, Nicholas A.

    2018-06-01

    The restricted excitation subspace approximation is explored as a basis to reduce the memory storage required in linear response time-dependent density functional theory (TDDFT) calculations within the Tamm-Dancoff approximation. It is shown that excluding the core orbitals and up to 70% of the virtual orbitals in the construction of the excitation subspace does not result in significant changes in computed UV/vis spectra for large molecules. The reduced size of the excitation subspace greatly reduces the size of the subspace vectors that need to be stored when using the Davidson procedure to determine the eigenvalues of the TDDFT equations. Furthermore, additional screening of the two-electron integrals in combination with a reduction in the size of the numerical integration grid used in the TDDFT calculation leads to significant computational savings. The use of these approximations represents a simple approach to extend TDDFT to the study of large systems and make the calculations increasingly tractable using modest computing resources.
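
    The memory saving comes from shrinking the occupied-virtual product space whose trial vectors the Davidson procedure must store. The sketch below shows that bookkeeping for a hypothetical molecule, dropping core orbitals and keeping only 30% of the virtuals (i.e., excluding 70%, as in the abstract); the orbital counts are invented for illustration.

```python
# Illustrative sketch (not the paper's code): how excluding core orbitals and
# a fraction of high-lying virtual orbitals shrinks the occupied-virtual
# excitation space whose subspace vectors a Davidson TDDFT solver must store.
# Orbital counts below are hypothetical.
def excitation_space(n_core, n_occ, n_virt, virt_fraction_kept):
    """Return (occupied indices, virtual indices) kept in the restricted space."""
    occ_kept = list(range(n_core, n_occ))                        # drop core orbitals
    n_virt_kept = int(round(n_virt * virt_fraction_kept))
    virt_kept = list(range(n_occ, n_occ + n_virt_kept))          # drop highest virtuals
    return occ_kept, virt_kept

n_core, n_occ, n_virt = 20, 150, 1200          # hypothetical large molecule
full = n_occ * n_virt
occ_kept, virt_kept = excitation_space(n_core, n_occ, n_virt, virt_fraction_kept=0.30)
restricted = len(occ_kept) * len(virt_kept)
print(f"full space: {full} single excitations")
print(f"restricted space: {restricted} ({restricted / full:.1%} of full)")
```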

  15. Using Computer-Extracted Data from Electronic Health Records to Measure the Quality of Adolescent Well-Care

    PubMed Central

    Gardner, William; Morton, Suzanne; Byron, Sepheen C; Tinoco, Aldo; Canan, Benjamin D; Leonhart, Karen; Kong, Vivian; Scholle, Sarah Hudson

    2014-01-01

    Objective To determine whether quality measures based on computer-extracted EHR data can reproduce findings based on data manually extracted by reviewers. Data Sources We studied 12 measures of care indicated for adolescent well-care visits for 597 patients in three pediatric health systems. Study Design Observational study. Data Collection/Extraction Methods Manual reviewers collected quality data from the EHR. Site personnel programmed their EHR systems to extract the same data from structured fields in the EHR according to national health IT standards. Principal Findings Overall performance measured via computer-extracted data was 21.9 percent, compared with 53.2 percent for manual data. Agreement measures were high for immunizations. Otherwise, agreement between computer extraction and manual review was modest (Kappa = 0.36) because computer-extracted data frequently missed care events (sensitivity = 39.5 percent). Measure validity varied by health care domain and setting. A limitation of our findings is that we studied only three domains and three sites. Conclusions The accuracy of computer-extracted EHR quality reporting depends on the use of structured data fields, with the highest agreement found for measures and in the setting that had the greatest concentration of structured fields. We need to improve documentation of care, data extraction, and adaptation of EHR systems to practice workflow. PMID:24471935
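
    For reference, the agreement statistics reported above can be computed as in the sketch below: Cohen's kappa between the two extraction methods and the sensitivity of computer extraction with manual review taken as the reference standard. The 0/1 vectors are hypothetical, not the study's data.

```python
# Sketch of the agreement statistics reported above: Cohen's kappa and
# sensitivity of computer extraction against manual review, treating manual
# review as the reference standard. The 0/1 vectors here are hypothetical.
def kappa_and_sensitivity(manual, computer):
    n = len(manual)
    tp = sum(1 for m, c in zip(manual, computer) if m and c)
    fn = sum(1 for m, c in zip(manual, computer) if m and not c)
    fp = sum(1 for m, c in zip(manual, computer) if c and not m)
    tn = n - tp - fn - fp
    observed = (tp + tn) / n
    # Chance agreement from the marginal positive rates of each rater.
    p_manual, p_computer = (tp + fn) / n, (tp + fp) / n
    expected = p_manual * p_computer + (1 - p_manual) * (1 - p_computer)
    kappa = (observed - expected) / (1 - expected)
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    return kappa, sensitivity

manual   = [1, 1, 1, 1, 0, 0, 1, 0, 1, 1]   # care event documented per record
computer = [1, 0, 1, 0, 0, 0, 0, 0, 1, 1]   # same indicator from structured fields
k, sens = kappa_and_sensitivity(manual, computer)
print(f"kappa = {k:.2f}, sensitivity = {sens:.2f}")
```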

  16. Study of the Use of Time-Mean Vortices to Generate Lift for MAV Applications

    DTIC Science & Technology

    2011-05-31

    A suspended microplate was fabricated via MEMS technology and driven to in-plane resonance via a Lorentz force. The computational effort centers on optimization of a range of parameters (geometry, frequency, amplitude of oscillation, etc.).

  17. A General Approach to Measuring Test-Taking Effort on Computer-Based Tests

    ERIC Educational Resources Information Center

    Wise, Steven L.; Gao, Lingyun

    2017-01-01

    There has been an increased interest in the impact of unmotivated test taking on test performance and score validity. This has led to the development of new ways of measuring test-taking effort based on item response time. In particular, Response Time Effort (RTE) has been shown to provide an assessment of effort down to the level of individual…

  18. Key issues in the design of NO{sub x} emission trading programs to reduce ground-level ozone. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nichols, A.; Harrison, D.

    1994-07-01

    This report is the first product of a study being conducted by National Economic Research Associates for the Electric Power Research Institute to evaluate various market-based alternatives for managing emissions of nitrogen oxides (NOx) as part of strategies to achieve the ambient ozone standard. The report focuses on choices in the design of relatively broad, ambitious emission trading programs, rather than on more modest programs designed to generate offsets within a regulatory framework that continues to rely primarily on traditional emission standards and nontransferable permits. After a brief introductory chapter, Chapter 2 reviews both the conceptual underpinnings of emission trading and prior experience. This review suggests the need for clear initial allocations, generally based on emission caps, to simplify trading while assuring the achievement of emission-reduction goals. Chapter 3 lays out the basic choices required in establishing an emission trading program. For concreteness, the basic design is discussed in terms of trading among utilities and other large stationary sources of NOx, generally the most promising candidates for trading. Chapter 4 discusses various ways in which a basic trading program could be extended to other source categories and to volatile organic compounds (VOCs), the other major precursor of ozone. Chapter 5 analyzes various ways in which trading programs can be refined to focus control efforts on those times and at those locations where ozone problems are most severe. Although highly refined targeting programs are unlikely to be worth the effort, modest differentials can be implemented by making the number of allowances required for each ton of emissions vary with the time and location of emissions. Chapter 6 reviews various alternatives for making the initial allocation of emission allowances among sources in the trading program, breaking the process into two components, an emission rate and an activity level.

  19. Computational strategies for three-dimensional flow simulations on distributed computer systems

    NASA Technical Reports Server (NTRS)

    Sankar, Lakshmi N.; Weed, Richard A.

    1995-01-01

    This research effort is directed towards an examination of issues involved in porting large computational fluid dynamics codes in use within the industry to a distributed computing environment. This effort addresses strategies for implementing the distributed computing in a device independent fashion and load balancing. A flow solver called TEAM presently in use at Lockheed Aeronautical Systems Company was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha Workstations in the Graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented. Specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated. Specifically, the static load balancing, task queue balancing and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme. And (5) the implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing, undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.
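
    The manager-worker and task-queue ideas mentioned above can be sketched in a few lines; this is only an illustration of the pattern (a pool of workers pulling grid blocks from a shared queue), not the TEAM/PVM implementation, and solve_block is a hypothetical stand-in for one block's solver iteration.

      from multiprocessing import Pool

      def solve_block(block_id):
          # stand-in for advancing the flow solution on one grid block
          return block_id, sum(i * i for i in range(10000))

      if __name__ == "__main__":
          blocks = range(32)                  # hypothetical decomposition into 32 blocks
          with Pool(processes=4) as pool:     # four "worker" processes
              # imap_unordered hands out blocks as workers become free (task-queue balancing)
              for block_id, residual in pool.imap_unordered(solve_block, blocks):
                  print(f"block {block_id} done, residual proxy {residual}")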

  20. Computational strategies for three-dimensional flow simulations on distributed computer systems

    NASA Astrophysics Data System (ADS)

    Sankar, Lakshmi N.; Weed, Richard A.

    1995-08-01

    This research effort is directed towards an examination of issues involved in porting large computational fluid dynamics codes in use within the industry to a distributed computing environment. This effort addresses strategies for implementing the distributed computing in a device independent fashion and load balancing. A flow solver called TEAM presently in use at Lockheed Aeronautical Systems Company was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha Workstations in the Graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented. Specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated. Specifically, the static load balancing, task queue balancing and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme. And (5) the implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing, undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.

  1. Increasing the impact of medical image computing using community-based open-access hackathons: The NA-MIC and 3D Slicer experience.

    PubMed

    Kapur, Tina; Pieper, Steve; Fedorov, Andriy; Fillion-Robin, J-C; Halle, Michael; O'Donnell, Lauren; Lasso, Andras; Ungi, Tamas; Pinter, Csaba; Finet, Julien; Pujol, Sonia; Jagadeesan, Jayender; Tokuda, Junichi; Norton, Isaiah; Estepar, Raul San Jose; Gering, David; Aerts, Hugo J W L; Jakab, Marianna; Hata, Nobuhiko; Ibanez, Luiz; Blezek, Daniel; Miller, Jim; Aylward, Stephen; Grimson, W Eric L; Fichtinger, Gabor; Wells, William M; Lorensen, William E; Schroeder, Will; Kikinis, Ron

    2016-10-01

    The National Alliance for Medical Image Computing (NA-MIC) was launched in 2004 with the goal of investigating and developing an open source software infrastructure for the extraction of information and knowledge from medical images using computational methods. Several leading research and engineering groups participated in this effort that was funded by the US National Institutes of Health through a variety of infrastructure grants. This effort transformed 3D Slicer from an internal, Boston-based, academic research software application into a professionally maintained, robust, open source platform with an international leadership and developer and user communities. Critical improvements to the widely used underlying open source libraries and tools-VTK, ITK, CMake, CDash, DCMTK-were an additional consequence of this effort. This project has contributed to close to a thousand peer-reviewed publications and a growing portfolio of US and international funded efforts expanding the use of these tools in new medical computing applications every year. In this editorial, we discuss what we believe are gaps in the way medical image computing is pursued today; how a well-executed research platform can enable discovery, innovation and reproducible science ("Open Science"); and how our quest to build such a software platform has evolved into a productive and rewarding social engineering exercise in building an open-access community with a shared vision. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. The potential supply of organ donors. An assessment of the efficacy of organ procurement efforts in the United States.

    PubMed

    Evans, R W; Orians, C E; Ascher, N L

    1992-01-08

    To estimate the potential supply of organ donors and to measure the efficiency of organ procurement efforts in the United States. A geographic database has been developed consisting of multiple cause of death and sociodemographic data compiled by the National Center for Health Statistics. All deaths are evaluated as to their potential for organ donation. Two classes of potential donors are identified: class 1 estimates are restricted to causes of death involving significant head trauma only, and class 2 estimates include class 1 estimates as well as deaths in which brain death was less probable. Over 23,000 people are currently awaiting a kidney, heart, liver, heart-lung, pancreas, or lung transplantation. Donor supply is inadequate, and the number of donors remained unchanged at approximately 4000 annually for 1986 through 1989, with a modest 9.1% increase in 1990. Between 6900 and 10,700 potential donors are available annually (eg, 28.5 to 43.7 per million population). Depending on the class of donor considered, organ procurement efforts are between 37% and 59% efficient. Efficiency greatly varies by state and organ procurement organization. Many more organ donors are available than are being accessed through existing organ procurement efforts. Realistically, it may be possible to increase by 80% the number of donors available in the United States (up to 7300 annually). It is conceivable, although unlikely, that the supply of donor organs could achieve a level to meet demand.
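
    The efficiency range quoted above follows directly from the donor counts; a quick check with the rounded figures given in the abstract is below.

      # Procurement efficiency = actual donors / estimated potential donors, per estimate class.
      actual_donors = 4000                          # approximate annual donors, 1986-1989
      potential = {"class 1 (significant head trauma only)": 6900,
                   "class 2 (broader brain-death criteria)": 10700}

      for label, pool_size in potential.items():
          print(f"{label}: {actual_donors / pool_size:.0%} efficient")
      # prints roughly 58% and 37%, consistent with the 37%-59% range reported above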

  3. 1999 NCCS Highlights

    NASA Technical Reports Server (NTRS)

    Bennett, Jerome (Technical Monitor)

    2002-01-01

    The NASA Center for Computational Sciences (NCCS) is a high-performance scientific computing facility operated, maintained and managed by the Earth and Space Data Computing Division (ESDCD) of NASA Goddard Space Flight Center's (GSFC) Earth Sciences Directorate. The mission of the NCCS is to advance leading-edge science by providing the best people, computers, and data storage systems to NASA's Earth and space sciences programs and those of other U.S. Government agencies, universities, and private institutions. Among the many computationally demanding Earth science research efforts supported by the NCCS in Fiscal Year 1999 (FY99) are the NASA Seasonal-to-Interannual Prediction Project, the NASA Search and Rescue Mission, Earth gravitational model development efforts, the National Weather Service's North American Observing System program, Data Assimilation Office studies, a NASA-sponsored project at the Center for Ocean-Land-Atmosphere Studies, a NASA-sponsored microgravity project conducted by researchers at the City University of New York and the University of Pennsylvania, the completion of a satellite-derived global climate data set, simulations of a new geodynamo model, and studies of Earth's torque. This document presents highlights of these research efforts and an overview of the NCCS, its facilities, and its people.

  4. Quantum Information Science: An Update

    NASA Astrophysics Data System (ADS)

    Kwek, L. C.; Zen, Freddy P.

    2016-08-01

    It is now roughly thirty years since the incipient ideas of quantum information science were concretely formalized. Over the last three decades, there has been much development in this field, and at least one technology, namely devices for quantum cryptography, is now commercialized. Yet the holy grail of a workable quantum computing machine still lies far away on the horizon. In any case, it took several centuries after the first mechanical calculating machines were constructed before vacuum tubes were invented, and several decades more for the transistor to bring current computer technology to fruition. In this review, we provide a short survey of the current development and progress in quantum information science. It clearly does not do justice to the amount of work of the past thirty years. Nevertheless, despite this modest attempt, the review hopes to draw younger researchers into this exciting field.

  5. Efficient Variational Quantum Simulator Incorporating Active Error Minimization

    NASA Astrophysics Data System (ADS)

    Li, Ying; Benjamin, Simon C.

    2017-04-01

    One of the key applications for quantum computers will be the simulation of other quantum systems that arise in chemistry, materials science, etc., in order to accelerate the process of discovery. It is important to ask the following question: Can this simulation be achieved using near-future quantum processors, of modest size and under imperfect control, or must it await the more distant era of large-scale fault-tolerant quantum computing? Here, we propose a variational method involving closely integrated classical and quantum coprocessors. We presume that all operations in the quantum coprocessor are prone to error. The impact of such errors is minimized by boosting them artificially and then extrapolating to the zero-error case. In comparison to a more conventional optimized Trotterization technique, we find that our protocol is efficient and appears to be fundamentally more robust against error accumulation.
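
    The "boost the error, then extrapolate to zero" idea can be illustrated with a toy model: evaluate an observable at several artificially amplified error rates and extrapolate the fit back to zero error. The exponential-decay noise model and all numbers below are assumptions for illustration, not the paper's simulator.

      import numpy as np

      def noisy_expectation(scale, ideal=0.83, decay=0.35, shots=20000, rng=np.random.default_rng(0)):
          # assumed noise model: the observable decays toward 0.5 as the error rate grows
          mean = 0.5 + (ideal - 0.5) * np.exp(-decay * scale)
          return rng.binomial(shots, mean) / shots          # finite-shot sampling noise

      scales = np.array([1.0, 1.5, 2.0, 3.0])               # error-amplification factors
      values = np.array([noisy_expectation(s) for s in scales])
      slope, intercept = np.polyfit(scales, values, 1)      # linear fit in the amplification factor
      print("zero-error (linear extrapolation) estimate:", round(intercept, 3))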

  6. Large-eddy simulations of compressible convection on massively parallel computers. [stellar physics

    NASA Technical Reports Server (NTRS)

    Xie, Xin; Toomre, Juri

    1993-01-01

    We report preliminary implementation of the large-eddy simulation (LES) technique in 2D simulations of compressible convection carried out on the CM-2 massively parallel computer. The convective flow fields in our simulations possess structures similar to those found in a number of direct simulations, with roll-like flows coherent across the entire depth of the layer that spans several density scale heights. Our detailed assessment of the effects of various subgrid scale (SGS) terms reveals that they may affect the gross character of convection. Yet, somewhat surprisingly, we find that our LES solutions, and another in which the SGS terms are turned off, only show modest differences. The resulting 2D flows realized here are rather laminar in character, and achieving substantial turbulence may require stronger forcing and less dissipation.

  7. Nicotine increases impulsivity and decreases willingness to exert cognitive effort despite improving attention in "slacker" rats: insights into cholinergic regulation of cost/benefit decision making.

    PubMed

    Hosking, Jay G; Lam, Fred C W; Winstanley, Catharine A

    2014-01-01

    Successful decision making in our daily lives requires weighing an option's costs against its associated benefits. The neuromodulator acetylcholine underlies both the etiology and treatment of a number of illnesses in which decision making is perturbed, including Alzheimer's disease, attention-deficit/hyperactivity disorder, and schizophrenia. Nicotine acts on the cholinergic system and has been touted as a cognitive enhancer by both smokers and some researchers for its attention-boosting effects; however, it is unclear whether treatments that have a beneficial effect on attention would also have a beneficial effect on decision making. Here we utilize the rodent Cognitive Effort Task (rCET), wherein animals can choose to allocate greater visuospatial attention for a greater reward, to examine cholinergic contributions to both attentional performance and choice based on attentional demand. Following the establishment of baseline behavior, four drug challenges were administered: nicotine, mecamylamine, scopolamine, and oxotremorine (saline plus three doses for each). As per previous rCET studies, animals were divided by their baseline preferences, with "worker" rats choosing high-effort/high-reward options more than their "slacker" counterparts. Nicotine caused slackers to choose even fewer high-effort trials than at baseline, but had no effect on workers' choice. Despite slackers' decreased willingness to expend effort, nicotine improved their attentional performance on the task. Nicotine also increased measures of motor impulsivity in all animals. In contrast, scopolamine decreased animals' choice of high-effort trials, especially for workers, while oxotremorine decreased motor impulsivity for all animals. In sum, the cholinergic system appears to contribute to decision making, and in part these contributions can be understood as a function of individual differences. While nicotine has been considered as a cognitive enhancer, these data suggest that its modest benefits to attention may be coupled with impulsiveness and decreased willingness to work hard, especially in individuals who are particularly sensitive to effort costs (i.e. slackers).

  8. Nicotine Increases Impulsivity and Decreases Willingness to Exert Cognitive Effort despite Improving Attention in “Slacker” Rats: Insights into Cholinergic Regulation of Cost/Benefit Decision Making

    PubMed Central

    Hosking, Jay G.; Lam, Fred C. W.; Winstanley, Catharine A.

    2014-01-01

    Successful decision making in our daily lives requires weighing an option’s costs against its associated benefits. The neuromodulator acetylcholine underlies both the etiology and treatment of a number of illnesses in which decision making is perturbed, including Alzheimer’s disease, attention-deficit/hyperactivity disorder, and schizophrenia. Nicotine acts on the cholinergic system and has been touted as a cognitive enhancer by both smokers and some researchers for its attention-boosting effects; however, it is unclear whether treatments that have a beneficial effect on attention would also have a beneficial effect on decision making. Here we utilize the rodent Cognitive Effort Task (rCET), wherein animals can choose to allocate greater visuospatial attention for a greater reward, to examine cholinergic contributions to both attentional performance and choice based on attentional demand. Following the establishment of baseline behavior, four drug challenges were administered: nicotine, mecamylamine, scopolamine, and oxotremorine (saline plus three doses for each). As per previous rCET studies, animals were divided by their baseline preferences, with “worker” rats choosing high-effort/high-reward options more than their “slacker” counterparts. Nicotine caused slackers to choose even fewer high-effort trials than at baseline, but had no effect on workers’ choice. Despite slackers’ decreased willingness to expend effort, nicotine improved their attentional performance on the task. Nicotine also increased measures of motor impulsivity in all animals. In contrast, scopolamine decreased animals’ choice of high-effort trials, especially for workers, while oxotremorine decreased motor impulsivity for all animals. In sum, the cholinergic system appears to contribute to decision making, and in part these contributions can be understood as a function of individual differences. While nicotine has been considered as a cognitive enhancer, these data suggest that its modest benefits to attention may be coupled with impulsiveness and decreased willingness to work hard, especially in individuals who are particularly sensitive to effort costs (i.e. slackers). PMID:25353339

  9. Computerizing the Accounting Curriculum.

    ERIC Educational Resources Information Center

    Nash, John F.; England, Thomas G.

    1986-01-01

    Discusses the use of computers in college accounting courses. Argues that the success of new efforts in using computers in teaching accounting is dependent upon increasing instructors' computer skills, and choosing appropriate hardware and software, including commercially available business software packages. (TW)

  10. Computers in Schools: White Boys Only?

    ERIC Educational Resources Information Center

    Hammett, Roberta F.

    1997-01-01

    Discusses the role of computers in today's world and the construction of computer use attitudes, such as gender gaps. Suggests how schools might close the gaps. Includes a brief explanation about how facility with computers is important for women in their efforts to gain equitable treatment in all aspects of their lives. (PA)

  11. Computers and Instruction: Implications of the Rising Tide of Criticism for Reading Education.

    ERIC Educational Resources Information Center

    Balajthy, Ernest

    1988-01-01

    Examines two major reasons that schools have adopted computers without careful prior examination and planning. Surveys a variety of criticisms targeted toward some aspects of computer-based instruction in reading in an effort to direct attention to the beneficial implications of computers in the classroom. (MS)

  12. Preparing Future Secondary Computer Science Educators

    ERIC Educational Resources Information Center

    Ajwa, Iyad

    2007-01-01

    Although nearly every college offers a major in computer science, many computer science teachers at the secondary level have received little formal training. This paper presents details of a project that could make a significant contribution to national efforts to improve computer science education by combining teacher education and professional…

  13. Computers for the Faculty: How on a Limited Budget.

    ERIC Educational Resources Information Center

    Arman, Hal; Kostoff, John

    An informal investigation of the use of computers at Delta College (DC) in Michigan revealed reasonable use of computers by faculty in disciplines such as mathematics, business, and technology, but very limited use in the humanities and social sciences. In an effort to increase faculty computer usage, DC decided to make computers available to any…

  14. Possible Computer Vision Systems and Automated or Computer-Aided Edging and Trimming

    Treesearch

    Philip A. Araman

    1990-01-01

    This paper discusses research which is underway to help our industry reduce costs, increase product volume and value recovery, and market more accurately graded and described products. The research is part of a team effort to help the hardwood sawmill industry automate with computer vision systems, and computer-aided or computer controlled processing. This paper...

  15. Biomechanics of Head, Neck, and Chest Injury Prevention for Soldiers: Phase 2 and 3

    DTIC Science & Technology

    2016-08-01

    ...understanding of the biomechanics of the head and brain. Task 2.3 details the computational modeling efforts conducted to evaluate the response of the cervical spine and the effects of cervical arthrodesis and arthroplasty. The section also details the progress made on the development of a testing apparatus to evaluate cervical spine implants in survivable loading scenarios.

  16. Limits on fundamental limits to computation.

    PubMed

    Markov, Igor L

    2014-08-14

    An indispensable part of our personal and working lives, computing has also become essential to industries and governments. Steady improvements in computer hardware have been supported by periodic doubling of transistor densities in integrated circuits over the past fifty years. Such Moore scaling now requires ever-increasing efforts, stimulating research in alternative hardware and stirring controversy. To help evaluate emerging technologies and increase our understanding of integrated-circuit scaling, here I review fundamental limits to computation in the areas of manufacturing, energy, physical space, design and verification effort, and algorithms. To outline what is achievable in principle and in practice, I recapitulate how some limits were circumvented, and compare loose and tight limits. Engineering difficulties encountered by emerging technologies may indicate yet unknown limits.

  17. Computational electronics and electromagnetics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shang, C C

    The Computational Electronics and Electromagnetics thrust area serves as the focal point for Engineering R and D activities for developing computer-based design and analysis tools. Representative applications include design of particle accelerator cells and beamline components; design of transmission line components; engineering analysis and design of high-power (optical and microwave) components; photonics and optoelectronics circuit design; electromagnetic susceptibility analysis; and antenna synthesis. The FY-97 effort focuses on development and validation of (1) accelerator design codes; (2) 3-D massively parallel, time-dependent EM codes; (3) material models; (4) coupling and application of engineering tools for analysis and design of high-power components; and (5) development of beam control algorithms coupled to beam transport physics codes. These efforts are in association with technology development in the power conversion, nondestructive evaluation, and microtechnology areas. The efforts complement technology development in Lawrence Livermore National programs.

  18. Office workers' computer use patterns are associated with workplace stressors.

    PubMed

    Eijckelhof, Belinda H W; Huysmans, Maaike A; Blatter, Birgitte M; Leider, Priscilla C; Johnson, Peter W; van Dieën, Jaap H; Dennerlein, Jack T; van der Beek, Allard J

    2014-11-01

    This field study examined associations between workplace stressors and office workers' computer use patterns. We collected keyboard and mouse activities of 93 office workers (68F, 25M) for approximately two work weeks. Linear regression analyses examined the associations between self-reported effort, reward, overcommitment, and perceived stress and software-recorded computer use duration, number of short and long computer breaks, and pace of input device usage. Daily duration of computer use was, on average, 30 min longer for workers with high compared to low levels of overcommitment and perceived stress. The number of short computer breaks (30 s-5 min long) was approximately 20% lower for those with high compared to low effort and for those with low compared to high reward. These outcomes support the hypothesis that office workers' computer use patterns vary across individuals with different levels of workplace stressors. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  19. Neurocomputational mechanisms underlying subjective valuation of effort costs

    PubMed Central

    Giehl, Kathrin; Sillence, Annie

    2017-01-01

    In everyday life, we have to decide whether it is worth exerting effort to obtain rewards. Effort can be experienced in different domains, with some tasks requiring significant cognitive demand and others being more physically effortful. The motivation to exert effort for reward is highly subjective and varies considerably across the different domains of behaviour. However, very little is known about the computational or neural basis of how different effort costs are subjectively weighed against rewards. Is there a common, domain-general system of brain areas that evaluates all costs and benefits? Here, we used computational modelling and functional magnetic resonance imaging (fMRI) to examine the mechanisms underlying value processing in both the cognitive and physical domains. Participants were trained on two novel tasks that parametrically varied either cognitive or physical effort. During fMRI, participants indicated their preferences between a fixed low-effort/low-reward option and a variable higher-effort/higher-reward offer for each effort domain. Critically, reward devaluation by both cognitive and physical effort was subserved by a common network of areas, including the dorsomedial and dorsolateral prefrontal cortex, the intraparietal sulcus, and the anterior insula. Activity within these domain-general areas also covaried negatively with reward and positively with effort, suggesting an integration of these parameters within these areas. Additionally, the amygdala appeared to play a unique, domain-specific role in processing the value of rewards associated with cognitive effort. These results are the first to reveal the neurocomputational mechanisms underlying subjective cost–benefit valuation across different domains of effort and provide insight into the multidimensional nature of motivation. PMID:28234892
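
    As an illustration of the kind of cost-benefit model such studies fit (a sketch of one common form, not necessarily the exact model used here), effort can discount reward quadratically and choices can follow a softmax over the discounted values; k and beta would be fit per participant.

      import math

      def subjective_value(reward, effort, k):
          return reward - k * effort ** 2       # parabolic effort discounting, k = discount rate

      def p_choose_offer(offer_value, baseline_value, beta):
          # softmax (logistic) choice rule over the value difference
          return 1.0 / (1.0 + math.exp(-beta * (offer_value - baseline_value)))

      # Hypothetical trial: a 10-credit offer at 0.8 normalized effort versus a fixed
      # 1-credit, low-effort baseline option.
      offer = subjective_value(reward=10.0, effort=0.8, k=8.0)
      base = subjective_value(reward=1.0, effort=0.1, k=8.0)
      print("P(choose high-effort offer) =", round(p_choose_offer(offer, base, beta=1.5), 3))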

  20. Parallel computing for automated model calibration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burke, John S.; Danielson, Gary R.; Schulz, Douglas A.

    2002-07-29

    Natural resources model calibration is a significant burden on computing and staff resources in modeling efforts. Most assessments must consider multiple calibration objectives (for example, magnitude and timing of stream flow peak). An automated calibration process that allows real-time updating of data/models and lets scientists focus effort on improving models is needed. We are in the process of building a fully featured multi-objective calibration tool capable of processing multiple models cheaply and efficiently using null cycle computing. Our parallel processing and calibration software routines have been written generically, but our focus has been on natural resources model calibration. So far, the natural resources models have been friendly to parallel calibration efforts in that they require no inter-process communication, only need a small amount of input data, and only output a small amount of statistical information for each calibration run. A typical auto calibration run might involve running a model 10,000 times with a variety of input parameters and summary statistical output. In the past, model calibration has been done against individual models for each data set. The individual model runs are relatively fast, ranging from seconds to minutes. The process was run on a single computer using a simple iterative process. We have completed two Auto Calibration prototypes and are currently designing a more feature-rich tool. Our prototypes have focused on running the calibration in a distributed computing, cross-platform environment. They allow incorporation of "smart" calibration parameter generation (using artificial intelligence processing techniques). Null cycle computing similar to SETI@Home has also been a focus of our efforts. This paper details the design of the latest prototype and discusses our plans for the next revision of the software.
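
    The embarrassingly parallel pattern described above, where each calibration run takes a parameter set, runs independently with no inter-process communication, and returns a small summary statistic, can be sketched as follows; run_model is a hypothetical stand-in for one model run.

      from multiprocessing import Pool
      import random

      def run_model(params):
          # stand-in for one calibration run; returns the parameters and a goodness-of-fit value
          a, b = params
          return {"params": params, "rmse": abs(a - 2.0) + abs(b - 0.5) + random.random() * 0.01}

      if __name__ == "__main__":
          candidate_sets = [(random.uniform(0, 5), random.uniform(0, 1)) for _ in range(10000)]
          with Pool() as pool:
              results = pool.map(run_model, candidate_sets, chunksize=100)
          best = min(results, key=lambda r: r["rmse"])
          print("best parameter set:", best["params"], "rmse:", round(best["rmse"], 3))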

  1. Making the hard work of recovery more attractive for those with substance use disorders

    PubMed Central

    McKay, James R.

    2016-01-01

    Background Research has led to improvements in the effectiveness of interventions for substance use disorders (SUD), but for the most part progress has been modest, particularly with regard to longer-term outcomes. Moreover, most individuals with SUD do not seek out treatment. Argument/analysis This paper presents two recommendations on how to improve treatment engagement and long-term outcomes for those with SUD. First, treatments should go beyond a focus on reducing or eliminating substance use to target greater access to and more time spent in experiences that will be enjoyable or otherwise rewarding to clients. Second, there must be sufficient incentives in the environment to justify the effort needed to sustain long-term abstinence for individuals who often have limited access to such incentives. Conclusions To increase rates of long-term recovery from substance misuse, treatments should link clients to reinforcers that will make continued abstinence more appealing. This work needs to extend beyond interventions focused on the individual or family to include the local community and national policy in an effort to more strongly incentivize longer-term recoveries. PMID:27535787

  2. 4-Benzothiazole-7-hydroxyindolinyl diaryl ureas are potent P2Y1 antagonists with favorable pharmacokinetics: low clearance and small volume of distribution.

    PubMed

    Qiao, Jennifer X; Wang, Tammy C; Hiebert, Sheldon; Hu, Carol H; Schumacher, William A; Spronk, Steven A; Clark, Charles G; Han, Ying; Hua, Ji; Price, Laura A; Shen, Hong; Chacko, Silvi A; Everlof, Gerry; Bostwick, Jeffrey S; Steinbacher, Thomas E; Li, Yi-Xin; Huang, Christine S; Seiffert, Dietmar A; Rehfuss, Robert; Wexler, Ruth R; Lam, Patrick Y S

    2014-10-01

    Current antithrombotic discovery efforts target compounds that are highly efficacious in thrombus reduction with less bleeding liability than the standard of care. Preclinical data suggest that P2Y1 antagonists may have lower bleeding liabilities than P2Y12 antagonists while providing similar antithrombotic efficacy. This article describes our continuous SAR efforts in a series of 7-hydroxyindolinyl diaryl ureas. When dosed orally, 4-trifluoromethyl-7-hydroxy-3,3-dimethylindolinyl analogue 4 was highly efficacious in a model of arterial thrombosis in rats with limited bleeding. The chemically labile CF3 group in 4 was then transformed to various groups via a novel one-step synthesis, yielding a series of potent P2Y1 antagonists. Among them, the 4-benzothiazole-substituted indolines had desirable PK properties in rats, specifically, low clearance and small volume of distribution. In addition, compound 40 had high i.v. exposure and modest bioavailability, giving it the best overall profile. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Expanding primary care capacity by reducing waste and improving the efficiency of care.

    PubMed

    Shipman, Scott A; Sinsky, Christine A

    2013-11-01

    Most solutions proposed for the looming shortage of primary care physicians entail strategies that fall into one of three categories: train more, lose fewer, or find someone else. A fourth strategy deserves more attention: waste less. This article examines the remarkable inefficiency and waste in primary care today and highlights practices that have addressed these problems. For example, delegating certain administrative tasks such as managing task lists in the electronic health record can give physicians more time to see additional patients. Flow managers who guide physicians from task to task throughout the clinical day have been shown to improve physicians' efficiency and capacity. Even something as simple as placing a printer in every exam room can save each physician twenty minutes per day. Modest but systemwide improvements could yield dramatic gains in physician capacity while potentially reducing physician burnout and its implications for the quality of care. If widely adopted, small efforts to empower nonphysicians, reengineer workflows, exploit technology, and update policies to eliminate wasted effort could yield the capacity for millions of additional patient visits per year in the United States.

  4. Predictive biomarkers of sorafenib efficacy in advanced hepatocellular carcinoma: Are we getting there?

    PubMed Central

    Shao, Yu-Yun; Hsu, Chih-Hung; Cheng, Ann-Lii

    2015-01-01

    Sorafenib is the current standard treatment for advanced hepatocellular carcinoma (HCC), but its efficacy is modest with low response rates and short response duration. Predictive biomarkers for sorafenib efficacy are necessary. However, efforts to determine biomarkers for sorafenib have led only to potential candidates rather than clinically useful predictors. Studies based on patient cohorts identified the potential of blood levels of angiopoietin-2, hepatocyte growth factor, insulin-like growth factor-1, and transforming growth factor-β1 for predicting sorafenib efficacy. Alpha-fetoprotein response, dynamic contrast-enhanced magnetic resonance imaging, and treatment-related side effects may serve as early surrogate markers. Novel approaches based on super-responders or experimental mouse models may provide new directions in biomarker research. These studies identified tumor amplification of FGF3/FGF4 or VEGFA and tumor expression of phospho-Mapk14 and phospho-Atf2 as possible predictive markers that await validation. A group effort that considers various prognostic factors and proper collection of tumor tissues before treatment is imperative for the success of future biomarker research in advanced HCC. PMID:26420960

  5. Making the hard work of recovery more attractive for those with substance use disorders.

    PubMed

    McKay, James R

    2017-05-01

    Research has led to improvements in the effectiveness of interventions for substance use disorders (SUD), but for the most part progress has been modest, particularly with regard to longer-term outcomes. Moreover, most individuals with SUD do not seek out treatment. This paper presents two recommendations on how to improve treatment engagement and long-term outcomes for those with SUD. First, treatments should go beyond a focus on reducing or eliminating substance use to target greater access to and more time spent in experiences that will be enjoyable or otherwise rewarding to clients. Secondly, there must be sufficient incentives in the environment to justify the effort needed to sustain long-term abstinence for individuals who often have limited access to such incentives. To increase rates of long-term recovery from substance misuse, treatments should link clients to reinforcers that will make continued abstinence more appealing. This work needs to extend beyond interventions focused on the individual or family to include the local community and national policy in an effort to incentivize longer-term recoveries more strongly. © 2016 Society for the Study of Addiction.

  6. Predictive biomarkers of sorafenib efficacy in advanced hepatocellular carcinoma: Are we getting there?

    PubMed

    Shao, Yu-Yun; Hsu, Chih-Hung; Cheng, Ann-Lii

    2015-09-28

    Sorafenib is the current standard treatment for advanced hepatocellular carcinoma (HCC), but its efficacy is modest with low response rates and short response duration. Predictive biomarkers for sorafenib efficacy are necessary. However, efforts to determine biomarkers for sorafenib have led only to potential candidates rather than clinically useful predictors. Studies based on patient cohorts identified the potential of blood levels of angiopoietin-2, hepatocyte growth factor, insulin-like growth factor-1, and transforming growth factor-β1 for predicting sorafenib efficacy. Alpha-fetoprotein response, dynamic contrast-enhanced magnetic resonance imaging, and treatment-related side effects may serve as early surrogate markers. Novel approaches based on super-responders or experimental mouse models may provide new directions in biomarker research. These studies identified tumor amplification of FGF3/FGF4 or VEGFA and tumor expression of phospho-Mapk14 and phospho-Atf2 as possible predictive markers that await validation. A group effort that considers various prognostic factors and proper collection of tumor tissues before treatment is imperative for the success of future biomarker research in advanced HCC.

  7. Information Security: Governmentwide Guidance Needed to Assist Agencies in Implementing Cloud Computing

    DTIC Science & Technology

    2010-07-01

    Cloud computing, an emerging form of computing in which users have access to scalable, on-demand capabilities that are provided through Internet... cloud computing, (2) the information security implications of using cloud computing services in the Federal Government, and (3) federal guidance and... efforts to address information security when using cloud computing. The complete report is titled Information Security: Federal Guidance Needed to...

  8. Factors associated with receipt of pension and compensation benefits for homeless veterans in the VBA/VHA Homeless Outreach Initiative.

    PubMed

    Chen, Joyce H; Rosenheck, Robert A; Greenberg, Greg A; Seibyl, Catherine

    2007-03-01

    Public support payments may facilitate exit from homelessness for persons with mental illness. We examined data from 10,641 homeless veterans contacted from October 1, 1995 to September 30, 2002 in a collaborative outreach program designed to facilitate access to Department of Veterans Affairs (VA) disability benefits. Those who were awarded benefits (22% of contacted veterans) were more likely to report disability, poor to fair self-rated health, and were more likely to have used VA services in the past. Thus, this program achieved only modest success and was most successful with veterans who were already receiving VA services and who might have received benefits even without the outreach effort.

  9. Design study of wind turbines 50 kW to 3000 kW for electric utility applications. Volume 3: Supplementary design and analysis tasks

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Additional design and analysis data are provided to supplement the results of the two parallel design study efforts. The key results of the three supplemental tasks investigated are: (1) The velocity duration profile has a significant effect in determining the optimum wind turbine design parameters and the energy generation cost. (2) Modest increases in capacity factor can be achieved with small increases in energy generation costs and capital costs. (3) Reinforced concrete towers that are esthetically attractive can be designed and built at a cost comparable to those for steel truss towers. The approach used, method of analysis, assumptions made, design requirements, and the results for each task are discussed in detail.

  10. Proceedings of the third biennial conference of research on the Colorado Plateau

    USGS Publications Warehouse

    Deshler, Elena T.; van Riper, Charles

    1997-01-01

    The papers in this volume are contributions from federal, state, and private sector researchers, who have come together to share scientific information with land managers on the Colorado Plateau. This Proceedings is the third in a series of publications that focuses on providing information to land managers on baseline scientific information pertaining to physical, cultural and biological resources of the Colorado Plateau. Support for these studies came from a spectrum of federal, state, and private partners concerned about the well-being of the Plateau's resources. I applaud the effort of the contributors. With modest funding and a broad base of public and institutional support, these authors have pursued important lines of work in the four states that comprise the Colorado Plateau biogeographic region.

  11. Quality control: can compounding pharmacy learn from the automotive industry?

    PubMed

    Dillon, L Rad

    2014-01-01

    The healthcare system is vast in scope, with constant concerns of how the system's major changes will affect the quality of future patient care. This article concerns only one small corner of the healthcare system--pharmaceutical compounding. Despite our best efforts, we are not going to single-handedly change all of the grim statistics, no matter how many years we are given or how much assistance Americans obtain from their international colleagues. Yet, the "good" news is that although in my opinion the overall quality of America's healthcare system has declined, perhaps the modest niche of compounding pharmacy can become a role model of what actually works and can offer immense opportunities to improve the efficiency and effectiveness of health care.

  12. Internet-based system for simulation-based medical planning for cardiovascular disease.

    PubMed

    Steele, Brooke N; Draney, Mary T; Ku, Joy P; Taylor, Charles A

    2003-06-01

    Current practice in vascular surgery utilizes only diagnostic and empirical data to plan treatments, which does not enable quantitative a priori prediction of the outcomes of interventions. We have previously described simulation-based medical planning methods to model blood flow in arteries and plan medical treatments based on physiologic models. An important consideration for the design of these patient-specific modeling systems is the accessibility to physicians with modest computational resources. We describe a simulation-based medical planning environment developed for the World Wide Web (WWW) using the Virtual Reality Modeling Language (VRML) and the Java programming language.

  13. A case for Redundant Arrays of Inexpensive Disks (RAID)

    NASA Technical Reports Server (NTRS)

    Patterson, David A.; Gibson, Garth; Katz, Randy H.

    1988-01-01

    Increasing performance of CPUs and memories will be squandered if not matched by a similar performance increase in I/O. While the capacity of Single Large Expensive Disks (SLED) has grown rapidly, the performance improvement of SLED has been modest. Redundant Arrays of Inexpensive Disks (RAID), based on the magnetic disk technology developed for personal computers, offers an attractive alternative to SLED, promising improvements of an order of magnitude in performance, reliability, power consumption, and scalability. This paper introduces five levels of RAIDs, giving their relative cost/performance, and compares RAID to an IBM 3380 and a Fujitsu Super Eagle.
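
    The cost/capacity side of the argument reduces to simple arithmetic: mirroring stores every block twice, while a single-parity organization (RAID level 5 style) gives up one disk's worth of capacity per group. A back-of-the-envelope sketch:

      def usable_fraction(n_disks, scheme):
          if scheme == "mirroring":            # RAID level 1: every block stored twice
              return 0.5
          if scheme == "parity":               # one disk's worth of parity per group
              return (n_disks - 1) / n_disks
          raise ValueError(scheme)

      for n in (4, 8, 16):
          print(f"{n} disks: mirroring {usable_fraction(n, 'mirroring'):.2f}, "
                f"single parity {usable_fraction(n, 'parity'):.2f} of raw capacity usable")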

  14. Solid-state greenhouses and their implications for icy satellites

    NASA Technical Reports Server (NTRS)

    Matson, Dennis L.; Brown, Robert H.

    1989-01-01

    The 'solid-state greenhouse effect' model constituted by the subsurface solar heating of translucent, high-albedo materials is presently applied to the study of planetary surfaces, with attention to frost and ice surfaces of the solar system's outer satellites. Temperature is computed as a function of depth for an illustrative range of thermal variables, and it is discovered that the surfaces and interiors of such bodies can be warmer than otherwise suspected. Mechanisms are identified through which the modest alteration of surface properties can substantially change the solid-state greenhouse and force an interior temperature adjustment.

  15. Single-Step BLUP with Varying Genotyping Effort in Open-Pollinated Picea glauca.

    PubMed

    Ratcliffe, Blaise; El-Dien, Omnia Gamal; Cappa, Eduardo P; Porth, Ilga; Klápště, Jaroslav; Chen, Charles; El-Kassaby, Yousry A

    2017-03-10

    Maximization of genetic gain in forest tree breeding programs is contingent on the accuracy of the predicted breeding values and precision of the estimated genetic parameters. We investigated the effect of the combined use of contemporary pedigree information and genomic relatedness estimates on the accuracy of predicted breeding values and precision of estimated genetic parameters, as well as rankings of selection candidates, using single-step genomic evaluation (HBLUP). In this study, two traits with diverse heritabilities [tree height (HT) and wood density (WD)] were assessed at various levels of family genotyping efforts (0, 25, 50, 75, and 100%) from a population of white spruce (Picea glauca) consisting of 1694 trees from 214 open-pollinated families, representing 43 provenances in Québec, Canada. The results revealed that HBLUP bivariate analysis is effective in reducing the known bias in heritability estimates of open-pollinated populations, as it exposes hidden relatedness, potential pedigree errors, and inbreeding. The addition of genomic information in the analysis considerably improved the accuracy in breeding value estimates by accounting for both Mendelian sampling and historical coancestry that were not captured by the contemporary pedigree alone. Increasing family genotyping efforts were associated with continuous improvement in model fit, precision of genetic parameters, and breeding value accuracy. Yet, improvements were observed even at minimal genotyping effort, indicating that even modest genotyping effort is effective in improving genetic evaluation. The combined utilization of both pedigree and genomic information may be a cost-effective approach to increase the accuracy of breeding values in forest tree breeding programs where shallow pedigrees and large testing populations are the norm. Copyright © 2017 Ratcliffe et al.
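
    For readers unfamiliar with the single-step machinery, the core of HBLUP is a combined relationship matrix whose inverse, in the standard formulation, is H^-1 = A^-1 with (G^-1 - A22^-1) added to the block for genotyped individuals, where A is the pedigree relationship matrix, G the genomic relationship matrix, and A22 the pedigree submatrix for the genotyped subset. The toy matrices below are assumptions for illustration only, not the study's data.

      import numpy as np

      def h_inverse(A_inv, G, A22, genotyped_idx):
          H_inv = A_inv.copy()
          adjustment = np.linalg.inv(G) - np.linalg.inv(A22)
          H_inv[np.ix_(genotyped_idx, genotyped_idx)] += adjustment
          return H_inv

      # Toy pedigree of three trees (two unrelated parents, one offspring); only the
      # last two individuals are genotyped.
      A = np.array([[1.0, 0.0, 0.5],
                    [0.0, 1.0, 0.5],
                    [0.5, 0.5, 1.0]])
      G = np.array([[1.02, 0.48],
                    [0.48, 0.99]])              # hypothetical genomic relationships
      idx = [1, 2]
      print(h_inverse(np.linalg.inv(A), G, A[np.ix_(idx, idx)], idx))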

  16. Evaluating biomarkers for prognostic enrichment of clinical trials.

    PubMed

    Kerr, Kathleen F; Roth, Jeremy; Zhu, Kehao; Thiessen-Philbrook, Heather; Meisner, Allison; Wilson, Francis Perry; Coca, Steven; Parikh, Chirag R

    2017-12-01

    A potential use of biomarkers is to assist in prognostic enrichment of clinical trials, where only patients at relatively higher risk for an outcome of interest are eligible for the trial. We investigated methods for evaluating biomarkers for prognostic enrichment. We identified five key considerations when evaluating a biomarker and a screening threshold for prognostic enrichment: (1) clinical trial sample size, (2) calendar time to enroll the trial, (3) total patient screening costs and the total per-patient trial costs, (4) generalizability of trial results, and (5) ethical evaluation of trial eligibility criteria. Items (1)-(3) are amenable to quantitative analysis. We developed the Biomarker Prognostic Enrichment Tool for evaluating biomarkers for prognostic enrichment at varying levels of screening stringency. We demonstrate that both modestly prognostic and strongly prognostic biomarkers can improve trial metrics using the Biomarker Prognostic Enrichment Tool. The Biomarker Prognostic Enrichment Tool is available as a webtool at http://prognosticenrichment.com and as a package for the R statistical computing platform. In some clinical settings, even biomarkers with modest prognostic performance can be useful for prognostic enrichment. In addition to the quantitative analysis provided by the Biomarker Prognostic Enrichment Tool, investigators must consider the generalizability of trial results and evaluate the ethics of trial eligibility criteria.
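
    The sample-size and screening trade-off behind items (1)-(3) can be sketched with generic two-proportion formulas (an illustration only, not the Biomarker Prognostic Enrichment Tool's API; the event rates are hypothetical): enrichment raises the control-arm event rate, shrinking the trial, but every enrolled patient now requires screening several candidates.

      from math import ceil

      def n_per_arm(p_ctrl, rel_reduction, z_alpha=1.959964, z_power=0.841621):
          # z_alpha: two-sided 5% significance; z_power: 80% power (standard normal quantiles)
          p_trt = p_ctrl * (1 - rel_reduction)
          z = z_alpha + z_power
          return ceil(z**2 * (p_ctrl*(1 - p_ctrl) + p_trt*(1 - p_trt)) / (p_ctrl - p_trt)**2)

      def total_screened(n_arm, fraction_eligible):
          return ceil(2 * n_arm / fraction_eligible)

      # Hypothetical: unselected event rate 10%; the top 30% by biomarker have a 25% rate.
      for label, rate, frac in [("no enrichment", 0.10, 1.0), ("enrich to top 30%", 0.25, 0.3)]:
          n = n_per_arm(rate, rel_reduction=0.25)
          print(f"{label}: {n} per arm, {total_screened(n, frac)} screened")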

  17. Future Computer Requirements for Computational Aerodynamics

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Recent advances in computational aerodynamics are discussed as well as motivations for and potential benefits of a National Aerodynamic Simulation Facility having the capability to solve fluid dynamic equations at speeds two to three orders of magnitude faster than presently possible with general computers. Two contracted efforts to define processor architectures for such a facility are summarized.

  18. Emerging Neuromorphic Computing Architectures & Enabling Hardware for Cognitive Information Processing Applications

    DTIC Science & Technology

    2010-06-01

    The highly cross-disciplinary emerging field of neuromorphic computing architectures for cognitive information processing applications... belief systems, software, computer engineering, etc. In our effort to develop cognitive systems atop a neuromorphic computing architecture, we explored...

  19. Overview 1993: Computational applications

    NASA Technical Reports Server (NTRS)

    Benek, John A.

    1993-01-01

    Computational applications include projects that apply or develop computationally intensive computer programs. Such programs typically require supercomputers to obtain solutions in a timely fashion. This report describes two CSTAR projects involving Computational Fluid Dynamics (CFD) technology. The first, the Parallel Processing Initiative, is a joint development effort and the second, the Chimera Technology Development, is a transfer of government developed technology to American industry.

  20. Computing at the speed limit (supercomputers)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernhard, R.

    1982-07-01

    The author discusses how unheralded efforts in the United States, mainly in universities, have removed major stumbling blocks to building cost-effective superfast computers for scientific and engineering applications within five years. These computers would have sustained speeds of billions of floating-point operations per second (flops), whereas with the fastest machines today the top sustained speed is only 25 million flops, with bursts to 160 megaflops. Cost-effective superfast machines can be built because of advances in very large-scale integration and the special software needed to program the new machines. VLSI greatly reduces the cost per unit of computing power. The development of such computers would come at an opportune time. Although the US leads the world in large-scale computer technology, its supremacy is now threatened, not surprisingly, by the Japanese. Publicized reports indicate that the Japanese government is funding a cooperative effort by commercial computer manufacturers to develop superfast computers, about 1000 times faster than modern supercomputers. The US computer industry, by contrast, has balked at attempting to boost computer power so sharply because of the uncertain market for the machines and the failure of similar projects in the past to show significant results.

  1. Computer Augmented Video Education.

    ERIC Educational Resources Information Center

    Sousa, M. B.

    1979-01-01

    Describes project CAVE (Computer Augmented Video Education), an ongoing effort at the U.S. Naval Academy to present lecture material on videocassette tape, reinforced by drill and practice through an interactive computer system supported by a 12 channel closed circuit television distribution and production facility. (RAO)

  2. Computer Guided Instructional Design.

    ERIC Educational Resources Information Center

    Merrill, M. David; Wood, Larry E.

    1984-01-01

    Describes preliminary efforts to create the Lesson Design System, a computer-guided instructional design system written in Pascal for Apple microcomputers. Its content outline, strategy, display, and online lesson editors correspond roughly to instructional design phases of content and strategy analysis, display creation, and computer programing…

  3. CAROLINA CENTER FOR COMPUTATIONAL TOXICOLOGY

    EPA Science Inventory

    The Center will advance the field of computational toxicology through the development of new methods and tools, as well as through collaborative efforts. In each Project, new computer-based models will be developed and published that represent the state-of-the-art. The tools p...

  4. Tobacco use in popular movies during the past decade

    PubMed Central

    Mekemson, C; Glik, D; Titus, K; Myerson, A; Shaivitz, A; Ang, A; Mitchell, S

    2004-01-01

    Objective: The top 50 commercially successful films released per year from 1991 to 2000 were content coded to assess trends in tobacco use over time and attributes of films predictive of higher smoking rates. Design: This observational study used media content analysis methods to generate data about tobacco use depictions in films studied (n = 497). Films are the basic unit of analysis. Once films were coded and preliminary analysis completed, outcome data were transformed to approximate multivariate normality before being analysed with general linear models and longitudinal mixed method regression methods. Main outcome measures: Tobacco use per minute of film was the main outcome measure used. Predictor variables include attributes of films and actors. Tobacco use was defined as any cigarette, cigar, and chewing tobacco use as well as the display of smoke and cigarette paraphernalia such as ashtrays, brand names, or logos within frames of films reviewed. Results: Smoking rates in the top films fluctuated yearly over the decade with an overall modest downward trend (p < 0.005), with the exception of R rated films where rates went up. Conclusions: The decrease in smoking rates found in films in the past decade is modest given extensive efforts to educate the entertainment industry on this issue over the past decade. Monitoring, education, advocacy, and policy change to bring tobacco depiction rates down further should continue. PMID:15564625

  5. Immediate financial impact of computerized clinical decision support for long-term care residents with renal insufficiency: a case study.

    PubMed

    Subramanian, Sujha; Hoover, Sonja; Wagner, Joann L; Donovan, Jennifer L; Kanaan, Abir O; Rochon, Paula A; Gurwitz, Jerry H; Field, Terry S

    2012-01-01

    In a randomized trial of a clinical decision support system for drug prescribing for residents with renal insufficiency in a large long-term care facility, analyses were conducted to estimate the system's immediate, direct financial impact. We determined the costs that would have been incurred if drug orders that triggered the alert system had actually been completed compared to the costs of the final submitted orders and then compared intervention units to control units. The costs incurred by additional laboratory testing that resulted from alerts were also estimated. Drug orders were conservatively assigned a duration of 30 days of use for a chronic drug and 10 days for antibiotics. It was determined that there were modest reductions in drug costs, partially offset by an increase in laboratory-related costs. Overall, there was a reduction in direct costs (US$1391.43, net 7.6% reduction). However, sensitivity analyses based on alternative estimates of duration of drug use suggested a reduction as high as US$7998.33 if orders for non-antibiotic drugs were assumed to be continued for 180 days. The authors conclude that the immediate and direct financial impact of a clinical decision support system for medication ordering for residents with renal insufficiency is modest and that the primary motivation for such efforts must be to improve the quality and safety of medication ordering.

  6. Computing Models of M-type Host Stars and their Panchromatic Spectral Output

    NASA Astrophysics Data System (ADS)

    Linsky, Jeffrey; Tilipman, Dennis; France, Kevin

    2018-06-01

    We have begun a program of computing state-of-the-art model atmospheres from the photospheres to the coronae of M stars that are the host stars of known exoplanets. For each model we are computing the emergent radiation at all wavelengths that are critical for assessing photochemistry and mass loss from exoplanet atmospheres. In particular, we are computing the stellar extreme ultraviolet radiation that drives hydrodynamic mass loss from exoplanet atmospheres and is essential for determining whether an exoplanet is habitable. The model atmospheres are computed with the SSRPM radiative transfer/statistical equilibrium code developed by Dr. Juan Fontenla. The code solves for the non-LTE statistical equilibrium populations of 18,538 levels of 52 atomic and ion species and computes the radiation from all species (435,986 spectral lines) and about 20,000,000 spectral lines of 20 diatomic species. The first model computed in this program was for the modestly active M1.5 V star GJ 832 by Fontenla et al. (ApJ 830, 152 (2016)). We will report on a preliminary model for the more active M5 V star GJ 876 and compare this model and its emergent spectrum with GJ 832. In the future, we will compute and intercompare semi-empirical models and spectra for all of the stars observed with the HST MUSCLES Treasury Survey, the Mega-MUSCLES Treasury Survey, and additional stars including Proxima Cen and Trappist-1. This multiyear theory program is supported by a grant from the Space Telescope Science Institute.

  7. Langley's Computational Efforts in Sonic-Boom Softening of the Boeing HSCT

    NASA Technical Reports Server (NTRS)

    Fouladi, Kamran

    1999-01-01

    NASA Langley's computational efforts in the sonic-boom softening of the Boeing high-speed civil transport are discussed in this paper. In these efforts, an optimization process using a higher order Euler method for analysis was employed to reduce the sonic boom of a baseline configuration through fuselage camber and wing dihedral modifications. Fuselage modifications did not provide any improvements, but the dihedral modifications were shown to be an important tool for the softening process. The study also included aerodynamic and sonic-boom analyses of the baseline and some of the proposed "softened" configurations. Comparisons of two Euler methodologies and two propagation programs for sonic-boom predictions are also discussed in the present paper.

  8. Predicting adult pulmonary ventilation volume and wearing complianceby on-board accelerometry during personal level exposure assessments

    NASA Astrophysics Data System (ADS)

    Rodes, C. E.; Chillrud, S. N.; Haskell, W. L.; Intille, S. S.; Albinali, F.; Rosenberger, M. E.

    2012-09-01

    Background: Metabolic functions typically increase with human activity, but optimal methods to characterize activity levels for real-time predictions of ventilation volume (l min-1) during exposure assessments have not been available. Could tiny, triaxial accelerometers be incorporated into personal level monitors to define periods of acceptable wearing compliance, and allow the exposures (μg m-3) to be extended to potential doses in μg min-1 kg-1 of body weight? Objectives: In a pilot effort, we tested: 1) whether appropriately processed accelerometer data could be utilized to predict compliance and in linear regressions to predict ventilation volumes in real time as an on-board component of personal level exposure sensor systems, and 2) whether locating the exposure monitors on the chest in the breathing zone provided comparable accelerometric data to other locations more typically utilized (waist, thigh, wrist, etc.). Methods: Prototype exposure monitors from RTI International and Columbia University were worn on the chest by a pilot cohort of adults while conducting an array of scripted activities (all <10 METS), spanning common recumbent, sedentary, and ambulatory activity categories. Referee Wocket accelerometers placed at various body locations allowed comparison with the chest-located exposure sensor accelerometers. An Oxycon Mobile mask was used to measure oral-nasal ventilation volumes in situ. For the subset of participants with complete data (n = 22), linear regressions were constructed (processed accelerometric variable versus ventilation rate) for each participant and exposure monitor type, and Pearson correlations were computed to compare across scenarios. Results: Triaxial accelerometer data were demonstrated to be adequately sensitive indicators for predicting exposure monitor wearing compliance. Strong linear correlations (R values from 0.77 to 0.99) were observed for all participants for both exposure sensor accelerometer variables against ventilation volume for recumbent, sedentary, and ambulatory activities with MET values ˜<6. The RTI monitors' mean R value of 0.91 was slightly higher than the Columbia monitors' mean of 0.86, owing to the use of a 20 Hz data rate rather than a slower 1 Hz rate. A nominal mean regression slope computed for the RTI system across participants showed a modest RSD of +/-36.6%. Comparison of the correlation values of the exposure monitors with the Wocket accelerometers at various body locations showed statistically identical regressions for all sensors at alternate hip, ankle, upper arm, thigh, and pocket locations, but not for the Wocket accelerometer located at the dominant-side wrist location (R = 0.57; p = 0.016). Conclusions: Even with a modest number of adult volunteers, the consistency and linearity of regression slopes for all subjects were very good, with excellent within-person Pearson correlations for the accelerometer versus ventilation volume data. Computing accelerometric standard deviations allowed good sensitivity for compliance assessments even for sedentary activities. These pilot findings supported the hypothesis that a common linear regression is likely to be usable for a wider range of adults to predict ventilation volumes from accelerometry data over a range of low to moderate energy level activities. The predicted volumes would then allow real-time estimates of potential dose, enabling more robust panel studies. The poorer correlation in predicting ventilation rate for an accelerometer located on the wrist suggested that this location should not be considered for predictions of ventilation volume.
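
    A minimal sketch of the per-participant analysis described above: fit an ordinary least-squares line of ventilation volume on a processed accelerometer variable and report the Pearson correlation. The accelerometer metric (a standard deviation of triaxial counts) and the numbers below are placeholder assumptions, not the study's measurements.

```python
# Per-participant linear regression and Pearson correlation (synthetic data).
import numpy as np

def fit_participant(accel_sd, ventilation):
    """Return slope, intercept, and Pearson r for one participant."""
    accel_sd = np.asarray(accel_sd, dtype=float)
    ventilation = np.asarray(ventilation, dtype=float)
    slope, intercept = np.polyfit(accel_sd, ventilation, 1)   # ordinary least squares, degree 1
    r = np.corrcoef(accel_sd, ventilation)[0, 1]
    return slope, intercept, r

# Synthetic example: accelerometer standard deviation vs. ventilation volume (l/min)
accel = [0.02, 0.05, 0.11, 0.20, 0.35, 0.55]
vent = [8.0, 9.5, 13.0, 18.0, 26.0, 35.0]
print(fit_participant(accel, vent))
```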

  9. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diachin, L F; Garaizar, F X; Henson, V E

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.

  10. Overview of NASA/OAST efforts related to manufacturing technology

    NASA Technical Reports Server (NTRS)

    Saunders, N. T.

    1976-01-01

    An overview of some of NASA's current efforts related to manufacturing technology and some possible directions for the future are presented. The topics discussed are: computer-aided design, composite structures, and turbine engine components.

  11. Data Network Weather Service Reporting - Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael Frey

    2012-08-30

    A final report is made of a three-year effort to develop a new forecasting paradigm for computer network performance. This effort was made in coordination with Fermi Lab's construction of the e-Weather Center.

  12. Defining Computational Thinking for Mathematics and Science Classrooms

    NASA Astrophysics Data System (ADS)

    Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri

    2016-02-01

    Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new urgency has come to the challenge of defining computational thinking and providing a theoretical grounding for what form it should take in school science and mathematics classrooms. This paper presents a response to this challenge by proposing a definition of computational thinking for mathematics and science in the form of a taxonomy consisting of four main categories: data practices, modeling and simulation practices, computational problem solving practices, and systems thinking practices. In formulating this taxonomy, we draw on the existing computational thinking literature, interviews with mathematicians and scientists, and exemplary computational thinking instructional materials. This work was undertaken as part of a larger effort to infuse computational thinking into high school science and mathematics curricular materials. In this paper, we argue for the approach of embedding computational thinking in mathematics and science contexts, present the taxonomy, and discuss how we envision the taxonomy being used to bring current educational efforts in line with the increasingly computational nature of modern science and mathematics.

  13. Benchmarking optimization software with COPS.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dolan, E.D.; More, J.J.

    2001-01-08

    The COPS test set provides a modest selection of difficult nonlinearly constrained optimization problems from applications in optimal design, fluid dynamics, parameter estimation, and optimal control. In this report we describe version 2.0 of the COPS problems. The formulation and discretization of the original problems have been streamlined and improved. We have also added new problems. The presentation of COPS follows the original report, but the description of the problems has been streamlined. For each problem we discuss the formulation of the problem and the structural data in Table 0.1 on the formulation. The aim of presenting this data is to provide an approximate idea of the size and sparsity of the problem. We also include the results of computational experiments with the LANCELOT, LOQO, MINOS, and SNOPT solvers. These computational experiments differ from the original results in that we have deleted problems that were considered to be too easy. Moreover, in the current version of the computational experiments, each problem is tested with four variations. An important difference between this report and the original report is that the tables that present the computational experiments are generated automatically from the testing script. This is explained in more detail in the report.

  14. Implementing Equal Access Computer Labs.

    ERIC Educational Resources Information Center

    Clinton, Janeen; And Others

    This paper discusses the philosophy followed in Palm Beach County to adapt computer literacy curriculum, hardware, and software to meet the needs of all children. The Department of Exceptional Student Education and the Department of Instructional Computing Services cooperated in planning strategies and coordinating efforts to implement equal…

  15. The Effort Paradox: Effort Is Both Costly and Valued.

    PubMed

    Inzlicht, Michael; Shenhav, Amitai; Olivola, Christopher Y

    2018-04-01

    According to prominent models in cognitive psychology, neuroscience, and economics, effort (be it physical or mental) is costly: when given a choice, humans and non-human animals alike tend to avoid effort. Here, we suggest that the opposite is also true and review extensive evidence that effort can also add value. Not only can the same outcomes be more rewarding if we apply more (not less) effort, sometimes we select options precisely because they require effort. Given the increasing recognition of effort's role in motivation, cognitive control, and value-based decision-making, considering this neglected side of effort will not only improve formal computational models, but also provide clues about how to promote sustained mental effort across time. Copyright © 2018 Elsevier Ltd. All rights reserved.

  16. 76 FR 28443 - President's National Security Telecommunications Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-17

    ... Government's use of cloud computing; the Federal Emergency Management Agency's NS/EP communications... Commercial Satellite Mission Assurance; and the way forward for the committee's cloud computing effort. The...

  17. Home

    Science.gov Websites

    System Award for developing a tool that has had a lasting influence on computing. Project Jupyter evolved from IPython, an effort pioneered by Fernando Pérez.

  18. The 'Wow' Signal, Drake Equation and Exoplanet Considerations

    NASA Astrophysics Data System (ADS)

    Wheeler, E.

    It has been 38 years since the most likely artificial transmission ever recorded from a possible extraterrestrial source was received [1, 2]. Using greatly improved technology, subsequent efforts by the Search for Extraterrestrial Intelligence (SETI) have continued, yet silence from space prevails [3]. This article examines whether the transmission was an artificial signal, and if so why it matters, including the possibility that the modest technology used by the "Big Ear" receiver could have been accommodated by the source. The transmission and the ensuing long silence may be intended. This paper reconsiders the Drake equation, an estimate for the number of civilizations in our galaxy that may possess technology for interstellar signaling [4, 5], and shows that the currently claimed best estimate of two civilizations is not supported [6]. An alternate and original method suggests ~100 civilizations. It importantly relies on experience and detectable events, including recent astronomical evidence about exoplanets as cataloged by the European Exoplanet program and by the National Aeronautics and Space Administration (NASA) Exoplanet Science Institute [7, 8]. In addition it addresses major geological and astronomical occurrences that profoundly affected development of life on Earth and might apply similarly for Extraterrestrial Intelligence (ETI). The alternate approach is not intended to compute ETI precisely but to examine the possibility that, though vastly spread, it likely exists. The discussion anticipates difficulties in communication with an alien civilization, hardly an exercise in science fiction, and explores how international groups can participate in a future specific response. One response might be to monitor the electromagnetic radiation spectral line of an element to be determined by consensus.
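
    For readers unfamiliar with the form of the estimate being reconsidered, the sketch below evaluates the classical Drake equation, N = R* · fp · ne · fl · fi · fc · L. The factor values are illustrative assumptions chosen only to show the arithmetic; they are not the paper's alternate method or its inputs.

```python
# Worked Drake-equation calculation with illustrative (assumed) factor values.

def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """N = R* · fp · ne · fl · fi · fc · L
    R_star : mean rate of star formation (stars/year)
    f_p    : fraction of stars with planets
    n_e    : habitable planets per planetary system
    f_l    : fraction of those that develop life
    f_i    : fraction of those that develop intelligence
    f_c    : fraction that release detectable signals
    L      : years over which detectable signals are released
    """
    return R_star * f_p * n_e * f_l * f_i * f_c * L

# One illustrative set of assumed factor values:
print(drake(R_star=2.0, f_p=0.9, n_e=0.5, f_l=0.5, f_i=0.2, f_c=0.1, L=10_000))
```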

  19. Periodic control of the individual-blade-control helicopter rotor. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Mckillip, R. M., Jr.

    1984-01-01

    Results of an investigation into methods of controller design for an individual helicopter rotor blade in the high forward-flight speed regime are described. This operating condition poses a unique control problem in that the perturbation equations of motion are linear with coefficients that vary periodically with time. The design of a control law was based on extensions to modern multivariate synthesis techniques and incorporated a novel approach to the reconstruction of the missing system state variables. The controller was tested both on an electronic analog computer simulation of the out-of-plane flapping dynamics and on a four-foot-diameter single-bladed model helicopter rotor in the M.I.T. 5x7 subsonic wind tunnel at high levels of advance ratio. It is shown that modal control using the IBC concept is possible over a large range of advance ratios with only a modest amount of computational power required.

  20. Influence of system size on the properties of a fluid adsorbed in a nanopore: Physical manifestations and methodological consequences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Puibasset, Joël, E-mail: puibasset@cnrs-orleans.fr; Kierlik, Edouard, E-mail: edouard.kierlik@upmc.fr; Tarjus, Gilles, E-mail: tarjus@lptl.jussieu.fr

    Hysteresis and discontinuities in the isotherms of a fluid adsorbed in a nanopore in general hamper the determination of equilibrium thermodynamic properties, even in computer simulations. A way around this has been to consider both a reservoir of small size and a pore of small extent in order to restrict the fluctuations of density and approach a classical van der Waals loop. We assess this suggestion by thoroughly studying through Monte Carlo simulations and density functional theory the influence of system size on the equilibrium configurations of the adsorbed fluid and on the resulting isotherms. We stress the importance of pore-symmetry-breaking states that even for modest pore sizes lead to discontinuous isotherms and we discuss the physical relevance of these states and the methodological consequences for computing thermodynamic quantities.

  1. Predicting Pilot Error in Nextgen: Pilot Performance Modeling and Validation Efforts

    NASA Technical Reports Server (NTRS)

    Wickens, Christopher; Sebok, Angelia; Gore, Brian; Hooey, Becky

    2012-01-01

    We review 25 articles presenting 5 general classes of computational models to predict pilot error. This more targeted review is placed within the context of the broader review of computational models of pilot cognition and performance, including such aspects as models of situation awareness or pilot-automation interaction. Particular emphasis is placed on the degree of validation of such models against empirical pilot data, and the relevance of the modeling and validation efforts to Next Gen technology and procedures.

  2. A neuronal model of a global workspace in effortful cognitive tasks.

    PubMed

    Dehaene, S; Kerszberg, M; Changeux, J P

    1998-11-24

    A minimal hypothesis is proposed concerning the brain processes underlying effortful tasks. It distinguishes two main computational spaces: a unique global workspace composed of distributed and heavily interconnected neurons with long-range axons, and a set of specialized and modular perceptual, motor, memory, evaluative, and attentional processors. Workspace neurons are mobilized in effortful tasks for which the specialized processors do not suffice. They selectively mobilize or suppress, through descending connections, the contribution of specific processor neurons. In the course of task performance, workspace neurons become spontaneously coactivated, forming discrete though variable spatio-temporal patterns subject to modulation by vigilance signals and to selection by reward signals. A computer simulation of the Stroop task shows workspace activation to increase during acquisition of a novel task, effortful execution, and after errors. We outline predictions for spatio-temporal activation patterns during brain imaging, particularly about the contribution of dorsolateral prefrontal cortex and anterior cingulate to the workspace.

  3. A specific role for serotonin in overcoming effort cost.

    PubMed

    Meyniel, Florent; Goodwin, Guy M; Deakin, Jf William; Klinge, Corinna; MacFadyen, Christine; Milligan, Holly; Mullings, Emma; Pessiglione, Mathias; Gaillard, Raphaël

    2016-11-08

    Serotonin is implicated in many aspects of behavioral regulation. Theoretical attempts to unify the multiple roles assigned to serotonin proposed that it regulates the impact of costs, such as delay or punishment, on action selection. Here, we show that serotonin also regulates other types of action costs such as effort. We compared behavioral performance in 58 healthy humans treated during 8 weeks with either placebo or the selective serotonin reuptake inhibitor escitalopram. The task involved trading handgrip force production against monetary benefits. Participants in the escitalopram group produced more effort and thereby achieved a higher payoff. Crucially, our computational analysis showed that this effect was underpinned by a specific reduction of effort cost, and not by any change in the weight of monetary incentives. This specific computational effect sheds new light on the physiological role of serotonin in behavioral regulation and on the clinical effect of drugs for depression. ISRCTN75872983.
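
    As a hedged illustration of the kind of effort-discounting model such computational analyses typically fit, the sketch below scores each offer as the monetary incentive minus a participant-specific effort cost (parabolic in grip force) and converts that value into an acceptance probability with a softmax rule. The functional forms and parameter values are common modeling assumptions for illustration, not the published model or its estimates.

```python
# Effort-discounting sketch: value = incentive - k * effort^2, softmax choice rule.
import math

def subjective_value(incentive, effort, k):
    """Net value of exerting `effort` (0..1 of max grip force) for `incentive`."""
    return incentive - k * effort ** 2          # assumed parabolic effort cost

def p_accept(incentive, effort, k, beta=5.0):
    """Softmax probability of accepting the offer rather than resting (value 0)."""
    v = subjective_value(incentive, effort, k)
    return 1.0 / (1.0 + math.exp(-beta * v))

# A lower effort-cost parameter k (the direction of the effect described above)
# yields a higher probability of accepting the same effortful offer:
print(p_accept(incentive=0.5, effort=0.8, k=1.0))   # higher effort cost
print(p_accept(incentive=0.5, effort=0.8, k=0.5))   # reduced effort cost
```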

  4. DARPA-funded efforts in the development of novel brain-computer interface technologies.

    PubMed

    Miranda, Robbin A; Casebeer, William D; Hein, Amy M; Judy, Jack W; Krotkov, Eric P; Laabs, Tracy L; Manzo, Justin E; Pankratz, Kent G; Pratt, Gill A; Sanchez, Justin C; Weber, Douglas J; Wheeler, Tracey L; Ling, Geoffrey S F

    2015-04-15

    The Defense Advanced Research Projects Agency (DARPA) has funded innovative scientific research and technology developments in the field of brain-computer interfaces (BCI) since the 1970s. This review highlights some of DARPA's major advances in the field of BCI, particularly those made in recent years. Two broad categories of DARPA programs are presented with respect to the ultimate goals of supporting the nation's warfighters: (1) BCI efforts aimed at restoring neural and/or behavioral function, and (2) BCI efforts aimed at improving human training and performance. The programs discussed are synergistic and complementary to one another, and, moreover, promote interdisciplinary collaborations among researchers, engineers, and clinicians. Finally, this review includes a summary of some of the remaining challenges for the field of BCI, as well as the goals of new DARPA efforts in this domain. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.

  5. Immersive Earth: Teaching Earth and Space with inexpensive immersive technology

    NASA Astrophysics Data System (ADS)

    Reiff, P. H.; Sumners, C.; Law, C. C.; Handron, K.

    2003-12-01

    In 1995 we pioneered "Space Update," the "Digital Library for the rest of us," software that was so simple that a child could use it without a keyboard and yet would allow one-click updating of the daily earth and space science images without the dangers of having an open web browser on display. Thanks to NASA support, it allowed museums and schools to have a powerful exhibit for a tiny price. Over 40,000 disks in our series have been distributed so far to educators and the public. In 2003, with our partners we are again revolutionizing educational technology with a low-cost hardware and software solution for creating and displaying immersive content. Recently selected for funding as part of the REASoN competition, Immersive Earth is a partnership of scientists, museums, educators, and content providers. The hardware consists of a modest projector with a special fisheye lens to be used in an inflatable dome, which many schools already have. This, coupled with a modest personal computer, can easily project images and movies of earth and space and allows training students in 3-D content at a tiny fraction of the cost of a CAVE or full-scale dome theater. Another low-cost solution is the "Imove" system, where spherical movies can play on a personal computer, with the user changing the viewing direction with a joystick. We were the first to create immersive earth science shows and remain the leader in creating educational content that people want to see. We encourage people with "allsky" images or movies to bring them and see what they look like inside a dome! Your content could be in our next show!

  6. Cross-Sectoral Collaboration: The State Health Official's Role in Elevating and Promoting Health Equity in All Policies in Minnesota.

    PubMed

    Bliss, Dorothy; Mishra, Meenoo; Ayers, Jeanne; Lupi, Monica Valdes

    2016-01-01

    For many years, the Minnesota Department of Health (MDH) has been intentionally engaged in decreasing race- and ethnicity-based health disparities in the state. It has seen modest success in some areas, but overall, the disparities remain. Research over the last several decades has shown that race- and ethnicity-based health disparities are the result of persistent social and economic inequities, which have a greater influence on health outcomes than either individual choices or interventions by the health care system. The MDH leaders recognized that to focus health improvement efforts solely on access to health care and individual behavior change (the traditional public health approaches of the last 30 years) would fail to make adequate advances in eliminating health disparities. Working with a statewide group known as the Healthy Minnesota Partnership, MDH decided to shift the public conversations about health in Minnesota to focus on the factors that actually create health. This effort to develop and implement a new narrative about health, focused on upstream issues such as education, employment, and home ownership, led to an emphasis on health in all policies approach for MDH and its partners. This case example will highlight Minnesota's efforts and discuss the new Council on Institutional Collaboration initiative in partnering large research universities with state health departments in addressing the social determinants of health.

  7. HOST turbine heat transfer subproject overview

    NASA Technical Reports Server (NTRS)

    Gladden, Herbert J.

    1986-01-01

    The experimental part of the turbine heat transfer subproject consists of six large experiments, which are highlighted in this overview, and three of somewhat more modest scope. One of the initial efforts was the stator airfoil heat transfer program. The non-film cooled and the showerhead film cooled data have already been reported. The gill region film cooling effort is currently underway. The investigation of secondary flows in a 90 deg curved duct was completed. The first phase examined flows with a relatively thin inlet boundary layer and low free stream turbulence. The second phase studied a thicker inlet boundary layer and higher free stream turbulence. A comparison of analytical and experimental cross flow velocity vectors is shown for the 60 deg plane. Two experiments were also conducted in the high pressure facility. One examined full coverage film cooled vanes, and the other, advanced instrumentation. The other three large experimental efforts were conducted in a rotating reference frame. An experiment to obtain gas path airfoil heat transfer coefficients in the large, low speed turbine was completed. Single-stage data with both high and low inlet turbulence were taken. The second phase examined a one and one-half stage turbine and focused on the second vane row. Under phase 3, aerodynamic quantities such as interrow time-averaged and rms values of velocity, flow angle, inlet turbulence, and surface pressure distribution were measured.

  8. 2006 Status of the Momentum eXchange Electrodynamic Re-Boost (MXER) Tether Development

    NASA Technical Reports Server (NTRS)

    Bonometti, Joseph A.; Sorensen, Kirk F.; Dankanich, John W.; Frame, Kyle L.

    2006-01-01

    The MXER Tether technology development is a high-payoff/high-risk investment area within the NASA In-Space Propulsion Technology (ISPT) Program. The ISPT program is managed by the NASA Headquarters Science Mission Directorate and implemented by the Marshall Space Flight Center in Huntsville, Alabama. The MXER concept was identified and competitively ranked within NASA's comprehensive Integrated In-Space Transportation Plan (IISTP); an agency-wide technology assessment activity. The objective of the MXER tether project within ISPT is to advance the technological maturation level for the MXER system, and its subsystems, as well as other space and terrestrial tether applications. Recent hardware efforts have focused on the manufacturability of space-survivable high-strength tether material and coatings, high-current electrodynamic tether, lightweight catch mechanism, high-accuracy propagator/predictor code, and efficient electron collection/current generation. Significant technical progress has been achieved with modest ISPT funding to the extent that MXER has evolved to a well-characterized system with greater capability as the design has been matured. Synergistic efforts in high-current electrodynamic tethers and efficient electron collection/current generation have been made possible through SBIR and STTR support. The entire development endeavor was orchestrated as a collaborative team effort across multiple individual contracts and has established a solid technology resource base, which permits a wide variety of future space cable/tether applications to be realized.

  9. Computer Technology and Social Issues.

    ERIC Educational Resources Information Center

    Garson, G. David

    Computing involves social issues and political choices. Issues such as privacy, computer crime, gender inequity, disemployment, and electronic democracy versus "Big Brother" are addressed in the context of efforts to develop a national public policy for information technology. A broad range of research and case studies are examined in an…

  10. Advances in computational design and analysis of airbreathing propulsion systems

    NASA Technical Reports Server (NTRS)

    Klineberg, John M.

    1989-01-01

    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview of several NASA Lewis research efforts is provided that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  11. The association between job stress and leisure-time physical inactivity adjusted for individual attributes: evidence from a Japanese occupational cohort survey.

    PubMed

    Oshio, Takashi; Tsutsumi, Akizumi; Inoue, Akiomi

    2016-05-01

    We examined the association between job stress and leisure-time physical inactivity, adjusting for individual time-invariant attributes. We used data from a Japanese occupational cohort survey, which included 31 025 observations of 9871 individuals. Focusing on the evolution of job stress and leisure-time physical inactivity within the same individual over time, we employed fixed-effects logistic models to examine the association between job stress and leisure-time physical inactivity. We compared the results with those in pooled cross-sectional models and fixed-effects ordered logistic models. Fixed-effects models showed that the odds of physical inactivity were 22% higher for those with high strain jobs [high demands/low control; odds ratio (OR) 1.22, 95% confidence interval (95% CI) 1.03-1.43] and 17% higher for those with active jobs (high demands/high control; OR 1.17, 95% CI 1.02-1.34) than those with low strain jobs (low demands/high control). The models also showed that the odds of physical inactivity were 28% higher for those with high effort/low reward jobs (OR 1.28, 95% CI 1.10-1.50) and 24% higher for those with high effort/high reward jobs (OR 1.24, 95% CI 1.07-1.43) than those with low effort/high reward jobs. Fixed-effects ordered logistic models led to similar results. Job stress, especially high job strain and effort-reward imbalance, was modestly associated with higher risks of physical inactivity, even after controlling for individual time-invariant attributes.

  12. Multiphysics Analysis of a Solid-Core Nuclear Thermal Engine Thrust Chamber

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Canabal, Francisco; Cheng, Gary; Chen, Yen-Sen

    2006-01-01

    The objective of this effort is to develop an efficient and accurate thermo-fluid computational methodology to predict environments for a hypothetical solid-core, nuclear thermal engine thrust chamber. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics methodology. Formulations for heat transfer in solids and porous media were implemented and anchored. A two-pronged approach was employed in this effort: A detailed thermo-fluid analysis on a multi-channel flow element for mid-section corrosion investigation; and a global modeling of the thrust chamber to understand the effect of hydrogen dissociation and recombination on heat transfer and thrust performance. The formulations and preliminary results on both aspects are presented.

  13. Experimental Evaluation and Workload Characterization for High-Performance Computer Architectures

    NASA Technical Reports Server (NTRS)

    El-Ghazawi, Tarek A.

    1995-01-01

    This research is conducted in the context of the Joint NSF/NASA Initiative on Evaluation (JNNIE). JNNIE is an inter-agency research program that goes beyond typical benchmarking to provide in-depth evaluations and understanding of the factors that limit the scalability of high-performance computing systems. Many NSF and NASA centers have participated in the effort. Our research effort was an integral part of implementing JNNIE in the NASA ESS grand challenge applications context. Our research work under this program was composed of three distinct but related activities. They include the evaluation of NASA ESS high-performance computing testbeds using the wavelet decomposition application; evaluation of NASA ESS testbeds using astrophysical simulation applications; and developing an experimental model for workload characterization for understanding workload requirements. In this report, we provide a summary of findings that covers all three parts, a list of the publications that resulted from this effort, and three appendices with the details of each of the studies using a key publication developed under the respective work.

  14. Quadratic Programming for Allocating Control Effort

    NASA Technical Reports Server (NTRS)

    Singh, Gurkirpal

    2005-01-01

    A computer program calculates an optimal allocation of control effort in a system that includes redundant control actuators. The program implements an iterative (but otherwise single-stage) algorithm of the quadratic-programming type. In general, in the quadratic-programming problem, one seeks the values of a set of variables that minimize a quadratic cost function, subject to a set of linear equality and inequality constraints. In this program, the cost function combines control effort (typically quantified in terms of energy or fuel consumed) and control residuals (differences between commanded and sensed values of variables to be controlled). In comparison with prior control-allocation software, this program offers approximately equal accuracy but much greater computational efficiency. In addition, this program offers flexibility, robustness to actuation failures, and a capability for selective enforcement of control requirements. The computational efficiency of this program makes it suitable for such complex, real-time applications as controlling redundant aircraft actuators or redundant spacecraft thrusters. The program is written in the C language for execution in a UNIX operating system.
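
    The sketch below poses a toy version of the allocation problem the abstract describes: minimize a weighted combination of control residuals and control effort subject to actuator limits. The effectiveness matrix, demand vector, weights, and bounds are invented, and the SciPy bound-constrained least-squares routine stands in for the program's own iterative quadratic-programming algorithm.

```python
# Toy control allocation as a bound-constrained quadratic program:
# minimize ||B u - d||^2 + w ||u||^2  subject to  u_min <= u <= u_max.
import numpy as np
from scipy.optimize import lsq_linear

def allocate(B, d, effort_weight, u_min, u_max):
    """Allocate commands u over redundant actuators.

    The residual term and the scaled effort term are stacked into one linear
    least-squares problem with box constraints, which lsq_linear solves as a
    convex quadratic program.
    """
    n = B.shape[1]
    A = np.vstack([B, np.sqrt(effort_weight) * np.eye(n)])
    b = np.concatenate([d, np.zeros(n)])
    res = lsq_linear(A, b, bounds=(u_min, u_max))
    return res.x

# Three redundant actuators producing two controlled quantities (made-up numbers):
B = np.array([[1.0, 0.5, 0.2],
              [0.0, 1.0, 0.8]])
d = np.array([0.6, 0.4])                      # commanded values
u = allocate(B, d, effort_weight=0.1, u_min=-1.0, u_max=1.0)
print(u, B @ u)                               # allocated commands and achieved response
```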

  15. Topical perspective on massive threading and parallelism.

    PubMed

    Farber, Robert M

    2011-09-01

    Unquestionably computer architectures have undergone a recent and noteworthy paradigm shift that now delivers multi- and many-core systems with tens to many thousands of concurrent hardware processing elements per workstation or supercomputer node. GPGPU (General Purpose Graphics Processor Unit) technology in particular has attracted significant attention as new software development capabilities, namely CUDA (Compute Unified Device Architecture) and OpenCL™, have made it possible for students as well as small and large research organizations to achieve excellent speedup for many applications over more conventional computing architectures. The current scientific literature reflects this shift with numerous examples of GPGPU applications that have achieved one, two, and in some special cases, three orders of magnitude increased computational performance through the use of massive threading to exploit parallelism. Multi-core architectures are also evolving quickly to exploit both massive threading and massive parallelism, as in the 1.3-million-thread Blue Waters supercomputer. The challenge confronting scientists in planning future experimental and theoretical research efforts--be they individual efforts with one computer or collaborative efforts proposing to use the largest supercomputers in the world--is how to capitalize on these new massively threaded computational architectures, especially as not all computational problems will scale to massive parallelism. In particular, the costs associated with restructuring software (and potentially redesigning algorithms) to exploit the parallelism of these multi- and many-threaded machines must be considered along with application scalability and lifespan. This perspective is an overview of the current state of threading and parallelism with some insight into the future. Published by Elsevier Inc.

  16. Model of Procedure Usage – Results from a Qualitative Study to Inform Design of Computer-Based Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johanna H Oxstrand; Katya L Le Blanc

    The nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. As a step toward the goal of improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however, most research has focused on procedures used in the main control room. Procedures reviewed in these efforts are mainly emergency operating procedures and normal operating procedures. Based on lessons learned from these previous efforts, we are now exploring a less familiar application for computer-based procedures - field procedures, i.e., procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory, the Institute for Energy Technology, and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field operators. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how to best design the computer-based procedures to do this. The underlying philosophy in the research effort is “Stop – Start – Continue”, i.e. what features from the use of paper-based procedures should we not incorporate (Stop), what should we keep (Continue), and what new features or work processes should be added (Start). One step in identifying the Stop – Start – Continue was to conduct a baseline study where affordances related to the current usage of paper-based procedures were identified. The purpose of the study was to develop a model of paper-based procedure use which will help to identify desirable features for computer-based procedure prototypes. Affordances such as note taking, markups, sharing procedures between fellow coworkers, the use of multiple procedures at once, etc. were considered. The model describes which affordances associated with paper-based procedures should be transferred to computer-based procedures as well as what features should not be incorporated. The model also provides a means to identify what new features not present in paper-based procedures need to be added to the computer-based procedures to further enhance performance. The next step is to use the requirements and specifications to develop concepts and prototypes of computer-based procedures. User tests and other data collection efforts will be conducted to ensure that the real issues with field procedures and their usage are being addressed and solved in the best manner possible. This paper describes the baseline study, the construction of the model of procedure use, and the requirements and specifications for computer-based procedures that were developed based on the model. It also addresses how the model and the insights gained from it were used to develop concepts and prototypes for computer-based procedures.

  17. Near-Source Modeling Updates: Building Downwash & Near-Road

    EPA Science Inventory

    The presentation describes recent research efforts in near-source model development focusing on building downwash and near-road barriers. The building downwash section summarizes a recent wind tunnel study, ongoing computational fluid dynamics simulations and efforts to improve ...

  18. Validity of questionnaire self‐reports on computer, mouse and keyboard usage during a four‐week period

    PubMed Central

    Mikkelsen, Sigurd; Vilstrup, Imogen; Lassen, Christina Funch; Kryger, Ann Isabel; Thomsen, Jane Frølund; Andersen, Johan Hviid

    2007-01-01

    Objective To examine the validity and potential biases in self‐reports of computer, mouse and keyboard usage times, compared with objective recordings. Methods A study population of 1211 people was asked in a questionnaire to estimate the average time they had worked with computer, mouse and keyboard during the past four working weeks. During the same period, a software program recorded these activities objectively. The study was part of a one‐year follow‐up study from 2000–1 of musculoskeletal outcomes among Danish computer workers. Results Self‐reports on computer, mouse and keyboard usage times were positively associated with objectively measured activity, but the validity was low. Self‐reports explained only between a quarter and a third of the variance of objectively measured activity, and were even lower for one measure (keyboard time). Self‐reports overestimated usage times. Overestimation was large at low levels and declined with increasing levels of objectively measured activity. Mouse usage time proportion was an exception with a near 1:1 relation. Variability in objectively measured activity, arm pain, gender and age influenced self‐reports in a systematic way, but the effects were modest and sometimes in different directions. Conclusion Self‐reported durations of computer activities are positively associated with objective measures but they are quite inaccurate. Studies using self‐reports to establish relations between computer work times and musculoskeletal pain could be biased and lead to falsely increased or decreased risk estimates. PMID:17387136

  19. Validity of questionnaire self-reports on computer, mouse and keyboard usage during a four-week period.

    PubMed

    Mikkelsen, Sigurd; Vilstrup, Imogen; Lassen, Christina Funch; Kryger, Ann Isabel; Thomsen, Jane Frølund; Andersen, Johan Hviid

    2007-08-01

    To examine the validity and potential biases in self-reports of computer, mouse and keyboard usage times, compared with objective recordings. A study population of 1211 people was asked in a questionnaire to estimate the average time they had worked with computer, mouse and keyboard during the past four working weeks. During the same period, a software program recorded these activities objectively. The study was part of a one-year follow-up study from 2000-1 of musculoskeletal outcomes among Danish computer workers. Self-reports on computer, mouse and keyboard usage times were positively associated with objectively measured activity, but the validity was low. Self-reports explained only between a quarter and a third of the variance of objectively measured activity, and were even lower for one measure (keyboard time). Self-reports overestimated usage times. Overestimation was large at low levels and declined with increasing levels of objectively measured activity. Mouse usage time proportion was an exception with a near 1:1 relation. Variability in objectively measured activity, arm pain, gender and age influenced self-reports in a systematic way, but the effects were modest and sometimes in different directions. Self-reported durations of computer activities are positively associated with objective measures but they are quite inaccurate. Studies using self-reports to establish relations between computer work times and musculoskeletal pain could be biased and lead to falsely increased or decreased risk estimates.

  20. 76 FR 17424 - President's National Security Telecommunications Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-29

    ... discuss and vote on the Communications Resiliency Report and receive an update on the cloud computing... Communications Resiliency Report III. Update on the Cloud Computing Scoping Effort IV. Closing Remarks Dated...

  1. Conjugate Gradient Algorithms For Manipulator Simulation

    NASA Technical Reports Server (NTRS)

    Fijany, Amir; Scheid, Robert E.

    1991-01-01

    Report discusses applicability of conjugate-gradient algorithms to computation of forward dynamics of robotic manipulators. Rapid computation of forward dynamics essential to teleoperation and other advanced robotic applications. Part of continuing effort to find algorithms meeting requirements for increased computational efficiency and speed. Method used for iterative solution of systems of linear equations.
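
    A minimal conjugate-gradient sketch of the kind of iterative linear solve the report evaluates (for example, solving the mass-matrix system that arises in forward dynamics for the joint accelerations). The 3x3 "mass matrix" and force vector below are illustrative placeholders, not a manipulator model.

```python
# Conjugate gradient for a symmetric positive-definite linear system A x = b.
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
    """Iteratively solve A x = b for symmetric positive-definite A."""
    x = np.zeros_like(b)
    r = b - A @ x               # residual
    p = r.copy()                # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# Illustrative 3x3 SPD "mass matrix" and generalized-force vector:
M = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 0.5],
              [0.0, 0.5, 2.0]])
tau = np.array([1.0, 0.2, -0.4])
print(conjugate_gradient(M, tau))
```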

  2. Using Information Technology in Mathematics Education.

    ERIC Educational Resources Information Center

    Tooke, D. James, Ed.; Henderson, Norma, Ed.

    This collection of essays examines the history and impact of computers in mathematics and mathematics education from the early, computer-assisted instruction efforts through LOGO, the constructivist educational software for K-9 schools developed in the 1980s, to MAPLE, the computer algebra system for mathematical problem solving developed in the…

  3. Cooperation Support in Computer-Aided Authoring and Learning.

    ERIC Educational Resources Information Center

    Muhlhauser, Max; Rudebusch, Tom

    This paper discusses the use of Computer Supported Cooperative Work (CSCW) techniques for computer-aided learning (CAL); the work was started in the context of project Nestor, a joint effort of German universities about cooperative multimedia authoring/learning environments. There are four major categories of cooperation for CAL: author/author,…

  4. 2016 Annual Report - Argonne Leadership Computing Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, Jim; Papka, Michael E.; Cerny, Beth A.

    The Argonne Leadership Computing Facility (ALCF) helps researchers solve some of the world’s largest and most complex problems, while also advancing the nation’s efforts to develop future exascale computing systems. This report presents some of the ALCF’s notable achievements in key strategic areas over the past year.

  5. Computer-Based Training: Capitalizing on Lessons Learned

    ERIC Educational Resources Information Center

    Bedwell, Wendy L.; Salas, Eduardo

    2010-01-01

    Computer-based training (CBT) is a methodology for providing systematic, structured learning; a useful tool when properly designed. CBT has seen a resurgence given the serious games movement, which is at the forefront of integrating primarily entertainment computer-based games into education and training. This effort represents a multidisciplinary…

  6. "Computer Science Can Feed a Lot of Dreams"

    ERIC Educational Resources Information Center

    Educational Horizons, 2014

    2014-01-01

    Pat Yongpradit is the director of education at Code.org. He leads all education efforts, including professional development and curriculum creation, and he builds relationships with school districts. Pat joined "Educational Horizons" to talk about why it is important to teach computer science--even for non-computer science teachers. This…

  7. Understanding the Internet.

    ERIC Educational Resources Information Center

    Oblinger, Diana

    The Internet is an international network linking hundreds of smaller computer networks in North America, Europe, and Asia. Using the Internet, computer users can connect to a variety of computers with little effort or expense. The potential for use by college faculty is enormous. The largest problem faced by most users is understanding what such…

  8. "Intelligent" Computer Assisted Instruction (CAI) Applications. Interim Report.

    ERIC Educational Resources Information Center

    Brown, John Seely; And Others

    Interim work is documented describing efforts to modify computer techniques used to recognize and process English language requests to an instructional simulator. The conversion from a hand-coded to a table driven technique are described in detail. Other modifications to a simulation based computer assisted instruction program to allow a gaming…

  9. Investigation of Grid Adaptation to Reduce Computational Efforts for a 2-D Hydrogen-Fueled Dual-Mode Scramjet

    NASA Astrophysics Data System (ADS)

    Foo, Kam Keong

    A two-dimensional dual-mode scramjet flowpath is developed and evaluated using the ANSYS Fluent density-based flow solver with various computational grids. Results are obtained for fuel-off, fuel-on non-reacting, and fuel-on reacting cases at different equivalence ratios. A one-step global chemical kinetics hydrogen-air model is used in conjunction with the eddy-dissipation model. Coarse, medium and fine computational grids are used to evaluate grid sensitivity and to investigate a lack of grid independence. Different grid adaptation strategies are performed on the coarse grid in an attempt to emulate the solutions obtained from the finer grids. The goal of this study is to investigate the feasibility of using various mesh adaptation criteria to significantly decrease computational efforts for high-speed reacting flows.
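
    As a hedged illustration of one common adaptation criterion of the sort such studies evaluate, the sketch below flags cells whose normalized field gradient exceeds a threshold. The 1-D field, the gradient criterion, and the threshold are assumptions for illustration; ANSYS Fluent applies its own built-in adaption registers rather than a stand-alone routine like this.

```python
# Gradient-based refinement marking on a 1-D field (illustrative only).
import numpy as np

def mark_cells_for_refinement(field, threshold=0.3):
    """Return a boolean mask flagging cells where the normalized gradient is large."""
    grad = np.abs(np.gradient(field))
    norm = grad / grad.max() if grad.max() > 0 else grad
    return norm > threshold

# Illustrative 1-D field with a sharp front (e.g., across a flame or shock):
x = np.linspace(0.0, 1.0, 50)
field = np.tanh((x - 0.5) / 0.02)
print(np.where(mark_cells_for_refinement(field))[0])   # indices of cells to refine
```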

  10. Pseudo-point transport technique: a new method for solving the Boltzmann transport equation in media with highly fluctuating cross sections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakhai, B.

    A new method for solving radiation transport problems is presented. The heart of the technique is a new cross section processing procedure for the calculation of group-to-point and point-to-group cross section sets. The method is ideally suited for problems which involve media with highly fluctuating cross sections, where the results of the traditional multigroup calculations are beclouded by the group averaging procedures employed. Extensive computational efforts, which would be required to evaluate double integrals in the multigroup treatment numerically, prohibit iteration to optimize the energy boundaries. On the other hand, use of point-to-point techniques (as in the stochastic technique) is often prohibitively expensive due to the large computer storage requirement. The pseudo-point code is a hybrid of the two aforementioned methods (group-to-group and point-to-point) - hence the name pseudo-point - that reduces the computational efforts of the former and the large core requirements of the latter. The pseudo-point code generates the group-to-point or the point-to-group transfer matrices, and can be coupled with the existing transport codes to calculate pointwise energy-dependent fluxes. This approach yields much more detail than is available from the conventional energy-group treatments. Due to the speed of this code, several iterations could be performed (in affordable computing efforts) to optimize the energy boundaries and the weighting functions. The pseudo-point technique is demonstrated by solving six problems, each depicting a certain aspect of the technique. The results are presented as flux vs energy at various spatial intervals. The sensitivity of the technique to the energy grid and the savings in computational effort are clearly demonstrated.
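
    To make the group-averaging idea concrete, the sketch below performs the flux-weighted collapse of a pointwise cross section onto a coarse group structure, the kind of processing step that group-to-point techniques build on. The energy grid, the resonance-like cross section, the 1/E weighting flux, and the group edges are all illustrative assumptions, not the pseudo-point code's algorithm.

```python
# Flux-weighted group collapsing of a pointwise cross section (illustrative only).
import numpy as np

def collapse_to_groups(E, sigma, weight, group_edges):
    """Flux-weighted group-averaged cross sections from pointwise data."""
    sigma_g = []
    for lo, hi in zip(group_edges[:-1], group_edges[1:]):
        mask = (E >= lo) & (E < hi)
        sigma_g.append(np.sum(sigma[mask] * weight[mask]) / np.sum(weight[mask]))
    return np.array(sigma_g)

# Illustrative pointwise grid with a resonance-like fluctuation:
E = np.linspace(1.0, 10.0, 200, endpoint=False)       # energy points (arbitrary units)
sigma = 2.0 + 5.0 * np.exp(-((E - 6.0) / 0.2) ** 2)   # highly fluctuating cross section
weight = 1.0 / E                                       # assumed 1/E weighting flux
print(collapse_to_groups(E, sigma, weight, group_edges=[1.0, 4.0, 7.0, 10.0]))
```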

  11. Final Report Scalable Analysis Methods and In Situ Infrastructure for Extreme Scale Knowledge Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Leary, Patrick

    The primary challenge motivating this project is the widening gap between the ability to compute information and to store it for subsequent analysis. This gap adversely impacts science code teams, who can perform analysis only on a small fraction of the data they calculate, resulting in the substantial likelihood of lost or missed science, when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing as possible on data while it is still resident in memory, which is known as in situ processing. The idea of in situ processing was not new at the time of the start of this effort in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community that aimed to foster production-quality software tools suitable for use by Department of Energy (DOE) science projects. Our objective was to produce and enable the use of production-quality in situ methods and infrastructure, at scale, on DOE high-performance computing (HPC) facilities, though we expected to have an impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve this objective, we engaged in software technology research and development (R&D), in close partnerships with DOE science code teams, to produce software technologies that were shown to run efficiently at scale on DOE HPC platforms.

  12. Urban Form, Health, and the Law’s Limits

    PubMed Central

    Buzbee, William W.

    2003-01-01

    Urban form, the law, and health are undoubtedly linked. However, nonlegal factors such as 20th-century reliance on the automobile as well as associated governmental actions and private investment choices have greatly influenced urban form, especially urban sprawl. The American system of federalism, with its traditional allocation of land-use legal authority to local governments, and resulting fragmented legal authority over causes and effects of urban sprawl, renders difficult legal efforts to reshape urban form. Legal frameworks and the dynamics and effects of urban sprawl are largely mismatched. Still, existing legal frameworks and modest legal reforms provide means to encourage or at least allow urban forms that are more conducive to health. However, the law will not easily transform urban form and deter urban sprawl. PMID:12948950

  13. States and the politics of incrementalism: health policy in Wisconsin during the 1990s.

    PubMed

    Sparer, Michael S

    2004-04-01

    Wisconsin officials during the 1990s seemed poised to enact innovative and comprehensive health care reform. During that era, an ambitious, popular, and reform-minded governor led the state. The state had an unusually professional legislature. The state's economy was strong. Even with these advantages, however, the report card on the state's efforts is mixed. The state enacted a fairly modest set of reforms that were financed largely by the federal government and subject to extensive federal oversight. The Wisconsin story thus seems to be about the politics of incrementalism. But while critics of incrementalist politics point out that the number of uninsured continues to grow, the catalytic federalism witnessed in Wisconsin in the 1990s may well be the best model for implementing health care reform.

  14. SQLGEN: a framework for rapid client-server database application development.

    PubMed

    Nadkarni, P M; Cheung, K H

    1995-12-01

    SQLGEN is a framework for rapid client-server relational database application development. It relies on an active data dictionary on the client machine that stores metadata on one or more database servers to which the client may be connected. The dictionary generates dynamic Structured Query Language (SQL) to perform common database operations; it also stores information about the access rights of the user at log-in time, which is used to partially self-configure the behavior of the client to disable inappropriate user actions. SQLGEN uses a microcomputer database as the client to store metadata in relational form, to transiently capture server data in tables, and to allow rapid application prototyping followed by porting to client-server mode with modest effort. SQLGEN is currently used in several production biomedical databases.
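
    A minimal sketch of the dynamic-SQL idea described above: a client-side data dictionary stores table and column metadata, and common parameterized statements are generated from it rather than hand-written. The table, columns, and dictionary layout are hypothetical examples, not SQLGEN's actual schema.

```python
# Generating parameterized SQL from a small client-side data dictionary (illustrative).

DATA_DICTIONARY = {
    "patient": {
        "columns": ["patient_id", "last_name", "first_name", "birth_date"],
        "primary_key": "patient_id",
    },
}

def select_by_key(table):
    """Generate a parameterized SELECT for one row, keyed on the primary key."""
    meta = DATA_DICTIONARY[table]
    cols = ", ".join(meta["columns"])
    return f"SELECT {cols} FROM {table} WHERE {meta['primary_key']} = ?"

def insert_row(table):
    """Generate a parameterized INSERT covering all dictionary columns."""
    meta = DATA_DICTIONARY[table]
    cols = ", ".join(meta["columns"])
    placeholders = ", ".join("?" for _ in meta["columns"])
    return f"INSERT INTO {table} ({cols}) VALUES ({placeholders})"

print(select_by_key("patient"))
print(insert_row("patient"))
```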

  15. Applying Community Organizing Principles to Assess Health Needs in New Haven, Connecticut.

    PubMed

    Santilli, Alycia; Carroll-Scott, Amy; Ickovics, Jeannette R

    2016-05-01

    The Affordable Care Act added requirements for nonprofit hospitals to conduct community health needs assessments. Guidelines are minimal; however, they require input and representation from the broader community. This call echoes 2 decades of literature on the importance of including community members in all aspects of research design, a tenet of community organizing. We describe a community-engaged research approach to a community health needs assessment in New Haven, Connecticut. We demonstrate that a robust community organizing approach provided unique research benefits: access to residents for data collection, reliable data, leverage for community-driven interventions, and modest improvements in behavioral risk. We make recommendations for future community-engaged efforts and workforce development, which are important for responding to increasing calls for community health needs assessments.

  16. Heart rate, rate-pressure product, and oxygen uptake during four sexual activities.

    PubMed

    Bohlen, J G; Held, J P; Sanderson, M O; Patterson, R P

    1984-09-01

    Heart rate, rate-pressure product, and VO2 were measured in ten healthy men during four specified sexual activities: coitus with husband on top, coitus with wife on top, noncoital stimulation of husband by wife, and self-stimulation by husband. Foreplay generated slight, but statistically significant, increases above resting baseline in cardiac and metabolic variables. From stimulation through orgasm, average effort was modest for relatively short spans. Maximum exercise values occurred during the brief spans of orgasm, then returned quickly to near baseline levels. The two noncoital activities required lower expenditures than the two coital positions, with man-on-top coitus rating the highest. Large variations among subjects and among activities discourage use of a general equivalent activity for comparison, such as "two flights of stairs," to represent "sexual activity."

  17. East Timor in transition: health and health care.

    PubMed

    Povey, George; Mercer, Mary Anne

    2002-01-01

    East Timor was liberated from 400 years of conquest and exploitation in an armed struggle that ended, in September 1999, in a conflagration that destroyed its social and physical infrastructures. For two years the territory has been under United Nations administration. Political conditions remain unstable as the result of many intrinsic and external factors. Its economy continues to depend upon infusions of funds from multilateral, bilateral, and private sources. Efforts by expatriates to introduce Euro-American cultural and technical models have been applied to the factors that determine health, with modest results. East Timor expects to be totally independent of foreign control early in 2002. Its future health will depend upon continuing collaboration between international and local leadership in evolving effective government, economy, and health services designed, managed, and executed by Timorese.

  18. Applying Community Organizing Principles to Assess Health Needs in New Haven, Connecticut

    PubMed Central

    Carroll-Scott, Amy; Ickovics, Jeannette R.

    2016-01-01

    The Affordable Care Act added requirements for nonprofit hospitals to conduct community health needs assessments. Guidelines are minimal; however, they require input and representation from the broader community. This call echoes 2 decades of literature on the importance of including community members in all aspects of research design, a tenet of community organizing. We describe a community-engaged research approach to a community health needs assessment in New Haven, Connecticut. We demonstrate that a robust community organizing approach provided unique research benefits: access to residents for data collection, reliable data, leverage for community-driven interventions, and modest improvements in behavioral risk. We make recommendations for future community-engaged efforts and workforce development, which are important for responding to increasing calls for community health needs assessments. PMID:26985599

  19. Observation model and parameter partials for the JPL VLBI parameter estimation software MODEST/1991

    NASA Technical Reports Server (NTRS)

    Sovers, O. J.

    1991-01-01

    A revision is presented of MASTERFIT-1987, which it supersedes. Changes during 1988 to 1991 included introduction of the octupole component of solid Earth tides, the NUVEL tectonic motion model, partial derivatives for the precession constant and source position rates, the option to correct for source structure, a refined model for antenna offsets, modeling the unique antenna at Richmond, FL, improved nutation series due to Zhu, Groten, and Reigber, and reintroduction of the old (Woolard) nutation series for simulation purposes. Text describing the relativistic transformations and gravitational contributions to the delay model was also revised in order to reflect the computer code more faithfully.

  20. Meraculous2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-06-01

    meraculous2 is a whole genome shotgun assembler for short reads that is capable of assembling large, polymorphic genomes with modest computational requirements. Meraculous relies on an efficient and conservative traversal of the subgraph of the k-mer (deBruijn) graph of oligonucleotides with unique high quality extensions in the dataset, avoiding an explicit error correction step as used in other short-read assemblers. Additional features include (1) handling of allelic variation using "bubble" structures within the deBruijn graph, (2) gap closing of repetitive and low quality regions using localized assemblies, and (3) an improved scaffolding algorithm that produces more complete assemblies without compromising on scaffolding accuracy.
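
    As a rough illustration of the conservative traversal described above (not Meraculous's actual implementation, which adds base-quality filtering, bubble handling, and distributed hash tables), the Python sketch below extends a contig only while the current k-mer has exactly one observed extension in the reads, and stops at any branch.

```python
from collections import defaultdict


def kmer_extensions(reads, k):
    """Count the bases observed immediately after each k-mer in the reads."""
    ext = defaultdict(lambda: defaultdict(int))
    for r in reads:
        for i in range(len(r) - k):
            ext[r[i:i + k]][r[i + k]] += 1
    return ext


def unique_extension(counts):
    """Return the single extension base if exactly one is observed, else None."""
    bases = [b for b, c in counts.items() if c > 0]
    return bases[0] if len(bases) == 1 else None


def extend_contig(seed, ext, k, max_steps=10_000):
    """Greedy, conservative walk: stop as soon as the extension is not unique."""
    contig = seed
    for _ in range(max_steps):
        last = contig[-k:]
        b = unique_extension(ext.get(last, {}))
        if b is None:
            break
        contig += b
    return contig


if __name__ == "__main__":
    reads = ["ACGTTGCA", "GTTGCAAC", "TGCAACGG"]   # toy overlapping reads
    ext = kmer_extensions(reads, k=4)
    print(extend_contig("ACGT", ext, k=4))         # walks out to ACGTTGCAACGG
```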

  1. The upwind control volume scheme for unstructured triangular grids

    NASA Technical Reports Server (NTRS)

    Giles, Michael; Anderson, W. Kyle; Roberts, Thomas W.

    1989-01-01

    A new algorithm for the numerical solution of the Euler equations is presented. This algorithm is particularly suited to the use of unstructured triangular meshes, allowing geometric flexibility. Solutions are second-order accurate in the steady state. Implementation of the algorithm requires minimal grid connectivity information, resulting in modest storage requirements, and should enhance the implementation of the scheme on massively parallel computers. A novel form of upwind differencing is developed, and is shown to yield sharp resolution of shocks. Two new artificial viscosity models are introduced that enhance the performance of the new scheme. Numerical results for transonic airfoil flows are presented, which demonstrate the performance of the algorithm.
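
    The paper's control-volume formulation cannot be reconstructed from this abstract; for readers unfamiliar with the general idea of upwind differencing, the Python sketch below shows the standard first-order upwind scheme for 1-D linear advection, where the spatial difference is biased toward the side the flow comes from. It is a textbook illustration of the concept only, not the scheme of the paper.

```python
import numpy as np


def upwind_advection(u0, a=1.0, dx=0.01, dt=0.005, steps=100):
    """First-order upwind scheme for u_t + a u_x = 0 (generic textbook method)."""
    u = u0.copy()
    c = a * dt / dx                                 # CFL number, |c| <= 1 for stability
    for _ in range(steps):
        if a >= 0:
            u[1:] = u[1:] - c * (u[1:] - u[:-1])    # difference toward the left (upstream)
        else:
            u[:-1] = u[:-1] - c * (u[1:] - u[:-1])  # difference toward the right (upstream)
        # the untouched boundary value acts as a simple fixed inflow condition
    return u


if __name__ == "__main__":
    x = np.linspace(0.0, 1.0, 101)
    u0 = np.where((x > 0.1) & (x < 0.3), 1.0, 0.0)  # a square pulse
    u = upwind_advection(u0)
    print("peak of the advected pulse is near x =", round(float(x[np.argmax(u)]), 2))
```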

  2. SSTs, nitrogen fertiliser and stratospheric ozone

    NASA Technical Reports Server (NTRS)

    Turco, R. P.; Whitten, R. C.; Poppoff, I. G.; Capone, L. A.

    1978-01-01

    A recently revised model of the stratosphere is used to show that a substantial enhancement in the ozone layer could accompany worldwide SST fleet operations and that water vapor may be an important factor in SST assessments. Revised rate coefficients for various ozone-destroying reactions are employed in calculations which indicate a slight increase in the total content of stratospheric ozone for modest-sized fleets of SSTs flying below about 25 km. It is found that water-vapor chemical reactions can negate in large part the NOx-induced ozone gains computed below 25 km and that increased use of nitrogen fertilizer might also enhance the ozone layer.

  3. Time-optimal Aircraft Pursuit-evasion with a Weapon Envelope Constraint

    NASA Technical Reports Server (NTRS)

    Menon, P. K. A.

    1990-01-01

    The optimal pursuit-evasion problem between two aircraft including a realistic weapon envelope is analyzed using differential game theory. Sixth-order nonlinear point-mass vehicle models are employed and the inclusion of an arbitrary weapon envelope geometry is allowed. The performance index is a linear combination of flight time and the square of the vehicle acceleration. A closed-form solution to this high-order differential game is then obtained using feedback linearization. The solution is in the form of a feedback guidance law together with a quartic polynomial for time-to-go. Due to its modest computational requirements, this nonlinear guidance law is useful for on-board real-time implementation.

  4. Application Portable Parallel Library

    NASA Technical Reports Server (NTRS)

    Cole, Gary L.; Blech, Richard A.; Quealy, Angela; Townsend, Scott

    1995-01-01

    Application Portable Parallel Library (APPL) computer program is subroutine-based message-passing software library intended to provide consistent interface to variety of multiprocessor computers on market today. Minimizes effort needed to move application program from one computer to another. User develops application program once and then easily moves application program from parallel computer on which created to another parallel computer. ("Parallel computer" here also includes a heterogeneous collection of networked computers.) Written in C language with one FORTRAN 77 subroutine for UNIX-based computers and callable from application programs written in C language or FORTRAN 77.

  5. Using a higher criticism statistic to detect modest effects in a genome-wide study of rheumatoid arthritis

    PubMed Central

    2009-01-01

    In high-dimensional studies such as genome-wide association studies, the correction for multiple testing in order to control total type I error results in decreased power to detect modest effects. We present a new analytical approach based on the higher criticism statistic that allows identification of the presence of modest effects. We apply our method to the genome-wide study of rheumatoid arthritis provided in the Genetic Analysis Workshop 16 Problem 1 data set. There is evidence for unknown bias in this study that could be explained by the presence of undetected modest effects. We compared the asymptotic and empirical thresholds for the higher criticism statistic. Using the asymptotic threshold we detected the presence of modest effects genome-wide. We also detected modest effects using the 90th percentile of the empirical null distribution as a threshold; however, there is no such evidence when the 95th and 99th percentiles were used. While the higher criticism method suggests that there is some evidence for modest effects, interpreting individual single-nucleotide polymorphisms with significant higher criticism statistics is of undetermined value. The goal of higher criticism is to alert the researcher that genetic effects remain to be discovered and to promote the use of more targeted and powerful studies to detect the remaining effects. PMID:20018032
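
    The abstract does not give the authors' exact implementation, but one standard form of the higher criticism statistic (Donoho and Jin) can be computed from sorted p-values as in the Python sketch below; the restriction to the smallest α0·n order statistics and the choice α0 = 0.5 are conventional assumptions, not details taken from the paper.

```python
import numpy as np


def higher_criticism(pvalues, alpha0=0.5):
    """One standard form of the higher criticism statistic:

    HC = max over the smallest alpha0*n order statistics of
         sqrt(n) * (i/n - p_(i)) / sqrt(p_(i) * (1 - p_(i)))
    """
    p = np.sort(np.asarray(pvalues, dtype=float))
    n = p.size
    i = np.arange(1, n + 1)
    with np.errstate(divide="ignore", invalid="ignore"):
        hc = np.sqrt(n) * (i / n - p) / np.sqrt(p * (1.0 - p))
    k = max(1, int(alpha0 * n))
    return np.nanmax(hc[:k])


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    null_p = rng.uniform(size=10_000)                # pure noise: HC stays small
    mixed_p = null_p.copy()
    mixed_p[:200] = rng.uniform(0, 1e-3, size=200)   # a few modest signals: HC grows
    print("null :", round(higher_criticism(null_p), 2))
    print("mixed:", round(higher_criticism(mixed_p), 2))
```

    A large HC value signals only that some modest effects are present somewhere in the set, not which individual tests carry them, which is consistent with the interpretation given in the abstract.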

  6. Randomized Trial of Desktop Humidifier for Dry Eye Relief in Computer Users.

    PubMed

    Wang, Michael T M; Chan, Evon; Ea, Linda; Kam, Clifford; Lu, Yvonne; Misra, Stuti L; Craig, Jennifer P

    2017-11-01

    Dry eye is a frequently reported problem among computer users. Low relative humidity environments are recognized to exacerbate signs and symptoms of dry eye, yet are common in offices of computer operators. Desktop USB-powered humidifiers are available commercially, but their efficacy for dry eye relief has not been established. This study aims to evaluate the potential for a desktop USB-powered humidifier to improve tear-film parameters, ocular surface characteristics, and subjective comfort of computer users. Forty-four computer users were enrolled in a prospective, masked, randomized crossover study. On separate days, participants were randomized to 1 hour of continuous computer use, with and without exposure to a desktop humidifier. Lipid-layer grade, noninvasive tear-film breakup time, and tear meniscus height were measured before and after computer use. Following the 1-hour period, participants reported whether ocular comfort was greater, equal, or lesser than that at baseline. The desktop humidifier effected a relative difference in humidity between the two environments of +5.4 ± 5.0% (P < .001). Participants demonstrated no significant differences in lipid-layer grade and tear meniscus height between the two environments (all P > .05). However, a relative increase in the median noninvasive tear-film breakup time of +4.0 seconds was observed in the humidified environment (P < .001), which was associated with a higher proportion of subjects reporting greater comfort relative to baseline (36% vs. 5%, P < .001). Even with a modest increase in relative humidity locally, the desktop humidifier shows potential to improve tear-film stability and subjective comfort during computer use.Trial registration no: ACTRN12617000326392.

  7. A Modest Critical Pedagogy for English as a Foreign Language Education

    ERIC Educational Resources Information Center

    Kim, Mi Kyong; Pollard, Vikki Ann

    2017-01-01

    This paper uses the introduction of critical pedagogy to an English as a Foreign Language class in the Republic of Korea as a case study for a "modest critical pedagogy" (Tinning 2002). Focusing on the stress and resistances experienced during the introduction, we suggest a modest critical pedagogy that 1) makes the paradigm itself an…

  8. Infrared Algorithm Development for Ocean Observations with EOS/MODIS

    NASA Technical Reports Server (NTRS)

    Brown, Otis B.

    1997-01-01

    Efforts continue under this contract to develop algorithms for the computation of sea surface temperature (SST) from MODIS infrared measurements. This effort includes radiative transfer modeling, comparison of in situ and satellite observations, development and evaluation of processing and networking methodologies for algorithm computation and data accession, evaluation of surface validation approaches for IR radiances, development of experimental instrumentation, and participation in MODIS (project) related activities. Activities in this contract period have focused on radiative transfer modeling, evaluation of atmospheric correction methodologies, field campaigns, analysis of field data, and participation in MODIS meetings.

  9. Hypercube matrix computation task

    NASA Technical Reports Server (NTRS)

    Calalo, Ruel H.; Imbriale, William A.; Jacobi, Nathan; Liewer, Paulett C.; Lockhart, Thomas G.; Lyzenga, Gregory A.; Lyons, James R.; Manshadi, Farzin; Patterson, Jean E.

    1988-01-01

    A major objective of the Hypercube Matrix Computation effort at the Jet Propulsion Laboratory (JPL) is to investigate the applicability of a parallel computing architecture to the solution of large-scale electromagnetic scattering problems. Three scattering analysis codes are being implemented and assessed on a JPL/California Institute of Technology (Caltech) Mark 3 Hypercube. The codes, which utilize different underlying algorithms, give a means of evaluating the general applicability of this parallel architecture. The three analysis codes being implemented are a frequency domain method of moments code, a time domain finite difference code, and a frequency domain finite elements code. These analysis capabilities are being integrated into an electromagnetics interactive analysis workstation which can serve as a design tool for the construction of antennas and other radiating or scattering structures. The first two years of work on the Hypercube Matrix Computation effort is summarized. It includes both new developments and results as well as work previously reported in the Hypercube Matrix Computation Task: Final Report for 1986 to 1987 (JPL Publication 87-18).

  10. Staff | Computational Science | NREL

    Science.gov Websites

    Staff directory listing for NREL's Computational Science group, covering staff who develop and lead laboratory-wide efforts in high-performance computing, energy-efficient data centers, and high-performance algorithms and modeling (names, titles, and contact information).

  11. Evaluating the Implementation of International Computing Curricular in African Universities: A Design-Reality Gap Approach

    ERIC Educational Resources Information Center

    Dasuki, Salihu Ibrahim; Ogedebe, Peter; Kanya, Rislana Abdulazeez; Ndume, Hauwa; Makinde, Julius

    2015-01-01

    Efforts are being made by universities in developing countries to ensure that their graduates are not left behind in the competitive global information society; thus, they have adopted international computing curricula for their computing degree programs. However, adopting these international curricula seems to be very challenging for developing countries…

  12. Automated computer grading of hardwood lumber

    Treesearch

    P. Klinkhachorn; J.P. Franklin; Charles W. McMillin; R.W. Conners; H.A. Huber

    1988-01-01

    This paper describes an improved computer program to grade hardwood lumber. The program was created as part of a system to automate various aspects of the hardwood manufacturing industry. It enhances previous efforts by considering both faces of the board and provides easy application of species dependent rules. The program can be readily interfaced with a computer...

  13. Distance Learning and Cloud Computing: "Just Another Buzzword or a Major E-Learning Breakthrough?"

    ERIC Educational Resources Information Center

    Romiszowski, Alexander J.

    2012-01-01

    "Cloud computing is a model for the enabling of ubiquitous, convenient, and on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and other services) that can be rapidly provisioned and released with minimal management effort or service provider interaction." This…

  14. The Use of Computers in the Math Classroom.

    ERIC Educational Resources Information Center

    Blass, Barbara; And Others

    In an effort to increase faculty use and knowledge of computers, Oakland Community College (OCC), in Michigan, developed a Summer Technology Institute (STI), and a Computer Technology Grants (CTG) project beginning in 1989. The STI involved 3-day forums during summers 1989, 1990, and 1991 to expose faculty to hardware and software applications.…

  15. Commentary: It Is Not Only about the Computers--An Argument for Broadening the Conversation

    ERIC Educational Resources Information Center

    DeWitt, Scott W.

    2006-01-01

    In 2002 the members of the National Technology Leadership Initiative (NTLI) framed seven conclusions relating to handheld computers and ubiquitous computing in schools. While several of the conclusions are laudable efforts to increase research and professional development, the factual and conceptual bases for this document are seriously flawed.…

  16. The Relationship between Computational Fluency and Student Success in General Studies Mathematics

    ERIC Educational Resources Information Center

    Hegeman, Jennifer; Waters, Gavin

    2012-01-01

    Many developmental mathematics programs emphasize computational fluency with the assumption that this is a necessary contributor to student success in general studies mathematics. In an effort to determine which skills are most essential, scores on a computational fluency test were correlated with student success in general studies mathematics at…

  17. Computational procedure for finite difference solution of one-dimensional heat conduction problems reduces computer time

    NASA Technical Reports Server (NTRS)

    Iida, H. T.

    1966-01-01

    Computational procedure reduces the numerical effort whenever the method of finite differences is used to solve ablation problems for which the surface recession is large relative to the initial slab thickness. The number of numerical operations required for a given maximum space mesh size is reduced.

  18. Computer-Based Simulations for Maintenance Training: Current ARI Research. Technical Report 544.

    ERIC Educational Resources Information Center

    Knerr, Bruce W.; And Others

    Three research efforts that used computer-based simulations for maintenance training were in progress when this report was written: Game-Based Learning, which investigated the use of computer-based games to train electronics diagnostic skills; Human Performance in Fault Diagnosis Tasks, which evaluated the use of context-free tasks to train…

  19. Computational protein design-the next generation tool to expand synthetic biology applications.

    PubMed

    Gainza-Cirauqui, Pablo; Correia, Bruno Emanuel

    2018-05-02

    One powerful approach to engineer synthetic biology pathways is the assembly of proteins sourced from one or more natural organisms. However, synthetic pathways often require custom functions or biophysical properties not displayed by natural proteins, limitations that could be overcome through modern protein engineering techniques. Structure-based computational protein design is a powerful tool to engineer new functional capabilities in proteins, and it is beginning to have a profound impact in synthetic biology. Here, we review efforts to increase the capabilities of synthetic biology using computational protein design. We focus primarily on computationally designed proteins not only validated in vitro, but also shown to modulate different activities in living cells. Efforts made to validate computational designs in cells can illustrate both the challenges and opportunities in the intersection of protein design and synthetic biology. We also highlight protein design approaches, which although not validated as conveyors of new cellular function in situ, may have rapid and innovative applications in synthetic biology. We foresee that in the near-future, computational protein design will vastly expand the functional capabilities of synthetic cells. Copyright © 2018. Published by Elsevier Ltd.

  20. Design and implementation of a Windows NT network to support CNC activities

    NASA Technical Reports Server (NTRS)

    Shearrow, C. A.

    1996-01-01

    The Manufacturing, Materials, & Processes Technology Division is undergoing dramatic changes to bring its manufacturing practices current with today's technological revolution. The Division is developing Computer Automated Design and Computer Automated Manufacturing (CAD/CAM) abilities. The development of resource tracking is underway in the form of an accounting software package called Infisy. These two efforts will bring the division into the 1980's in relation to manufacturing processes. Computer Integrated Manufacturing (CIM) is the final phase of change to be implemented. This document is a qualitative study and application of a CIM application capable of finishing the changes necessary to bring the manufacturing practices into the 1990's. The documentation provided in this qualitative research effort includes discovery of the current status of manufacturing in the Manufacturing, Materials, & Processes Technology Division, including the software, hardware, network, and mode of operation. The proposed direction of research included a network design, computers to be used, software to be used, machine-to-computer connections, an estimated timeline for implementation, and a cost estimate. Recommendations for the division's improvement include action to be taken, software to utilize, and computer configurations.

  1. Many Masses on One Stroke:. Economic Computation of Quark Propagators

    NASA Astrophysics Data System (ADS)

    Frommer, Andreas; Nöckel, Bertold; Güsken, Stephan; Lippert, Thomas; Schilling, Klaus

    The computational effort in the calculation of Wilson fermion quark propagators in Lattice Quantum Chromodynamics can be considerably reduced by exploiting the Wilson fermion matrix structure in inversion algorithms based on the non-symmetric Lanczos process. We consider two such methods: QMR (quasi minimal residual) and BCG (biconjugate gradients). Based on the decomposition M/κ = 1/κ-D of the Wilson mass matrix, using QMR, one can carry out inversions on a whole trajectory of masses simultaneously, merely at the computational expense of a single propagator computation. In other words, one has to compute the propagator corresponding to the lightest mass only, while all the heavier masses are given for free, at the price of extra storage. Moreover, the symmetry γ5M = M†γ5 can be used to cut the computational effort in QMR and BCG by a factor of two. We show that both methods then become — in the critical regime of small quark masses — competitive to BiCGStab and significantly better than the standard MR method, with optimal relaxation factor, and CG as applied to the normal equations.
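
    The "many masses in one stroke" idea rests on the observation that dividing the Wilson matrix by κ turns M(κ) = 1 − κD into the shifted matrix (1/κ)I − D, so the systems for all κ values differ only by a shift of the same matrix and can share one Krylov process. The toy NumPy sketch below merely checks this shift structure on a small random stand-in for D; it is not an implementation of the multi-mass QMR or BCG solvers discussed above.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
D = rng.normal(size=(n, n)) * 0.1            # stand-in for the hopping term
b = rng.normal(size=n)                       # source vector

for kappa in (0.12, 0.15, 0.158):            # several "quark masses"
    M = np.eye(n) - kappa * D                # Wilson-type matrix M(kappa) = 1 - kappa*D
    x_direct = np.linalg.solve(M, b)

    # The same propagator from the shifted system ((1/kappa) I - D) x = b / kappa:
    shifted = np.eye(n) / kappa - D
    x_shift = np.linalg.solve(shifted, b / kappa)

    print(kappa, np.allclose(x_direct, x_shift))
```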

  2. Computational Aerodynamic Simulations of a Spacecraft Cabin Ventilation Fan Design

    NASA Technical Reports Server (NTRS)

    Tweedt, Daniel L.

    2010-01-01

    Quieter working environments for astronauts are needed if future long-duration space exploration missions are to be safe and productive. Ventilation and payload cooling fans are known to be dominant sources of noise, with the International Space Station being a good case in point. To address this issue cost effectively, early attention to fan design, selection, and installation has been recommended, leading to an effort by NASA to examine the potential for small-fan noise reduction by improving fan aerodynamic design. As a preliminary part of that effort, the aerodynamics of a cabin ventilation fan designed by Hamilton Sundstrand has been simulated using computational fluid dynamics codes, and the computed solutions analyzed to quantify various aspects of the fan aerodynamics and performance. Four simulations were performed at the design rotational speed: two at the design flow rate and two at off-design flow rates. Following a brief discussion of the computational codes, various aerodynamic- and performance-related quantities derived from the computed flow fields are presented along with relevant flow field details. The results show that the computed fan performance is in generally good agreement with stated design goals.

  3. F‐GHG Emissions Reduction Efforts: FY2015 Supplier Profiles

    EPA Pesticide Factsheets

    The Supplier Profiles outlined in this document detail the efforts of large‐area flat panel suppliers to reduce their F‐GHG emissions in manufacturing facilities that make today’s large‐area panels used for products such as TVs and computer monitors.

  4. F‐GHG Emissions Reduction Efforts: FY2016 Supplier Profiles

    EPA Pesticide Factsheets

    The Supplier Profiles outlined in this document detail the efforts of large‐area flat panel suppliers to reduce their F‐GHG emissions in manufacturing facilities that make today’s large‐area panels used for products such as TVs and computer monitors.

  5. Operating manual for coaxial injection combustion model. [for the space shuttle main engine

    NASA Technical Reports Server (NTRS)

    Sutton, R. D.; Schuman, M. D.; Chadwick, W. D.

    1974-01-01

    An operating manual for the coaxial injection combustion model (CICM) is presented as the final report for an eleven-month effort designed to improve, verify, and document the comprehensive computer program for analyzing the performance of thrust chamber operation with gas/liquid coaxial jet injection. The effort culminated in delivery of an operational FORTRAN IV computer program and associated documentation pertaining to the combustion conditions in the space shuttle main engine. The computer program is structured for compatibility with the standardized Joint Army-Navy-NASA-Air Force (JANNAF) performance evaluation procedure. Use of the CICM in conjunction with the JANNAF procedure allows the analysis of engine systems using coaxial gas/liquid injection.

  6. Aviation security : vulnerabilities still exist in the aviation security system

    DOT National Transportation Integrated Search

    2000-04-06

    The testimony today discusses the Federal Aviation Administration's (FAA) efforts to implement and improve security in two key areas: air traffic control computer systems and airport passenger screening checkpoints. Computer systems-and the informati...

  7. Psychological Issues in Online Adaptive Task Allocation

    NASA Technical Reports Server (NTRS)

    Morris, N. M.; Rouse, W. B.; Ward, S. L.; Frey, P. R.

    1984-01-01

    Adaptive aiding is an idea that offers potential for improvement over many current approaches to aiding in human-computer systems. The expected return of tailoring the system to fit the user could be in the form of improved system performance and/or increased user satisfaction. Issues such as the manner in which information is shared between human and computer, the appropriate division of labor between them, and the level of autonomy of the aid are explored. A simulated visual search task was developed. Subjects are required to identify targets in a moving display while performing a compensatory sub-critical tracking task. By manipulating characteristics of the situation such as imposed task-related workload and effort required to communicate with the computer, it is possible to create conditions in which interaction with the computer would be more or less desirable. The results of preliminary research using this experimental scenario are presented, and future directions for this research effort are discussed.

  8. Role of chemotherapy and targeted therapy in early-stage non-small cell lung cancer.

    PubMed

    Nagasaka, Misako; Gadgeel, Shirish M

    2018-01-01

    Adjuvant platinum-based chemotherapy is accepted as standard of care in stage II and III non-small cell lung cancer (NSCLC) patients and is often considered in patients with stage IB disease who have tumors ≥ 4 cm. The survival advantage is modest, approximately 5% at 5 years. Areas covered: This review article presents relevant data regarding chemotherapy use in the perioperative setting for early stage NSCLC. A literature search was performed utilizing PubMed as well as ClinicalTrials.gov. Randomized phase III studies in this setting, including adjuvant and neoadjuvant use of chemotherapy, as well as ongoing trials on targeted therapy and immunotherapy are also discussed. Expert commentary: With increasing utilization of screening computed tomography scans, it is possible that the percentage of early stage NSCLC patients will increase in the coming years. Benefits of adjuvant chemotherapy in early stage NSCLC patients remain modest. There is a need to better define patients most likely to derive survival benefit from adjuvant therapy and spare patients who do not need adjuvant chemotherapy due to the toxicity of such therapy. Trials for adjuvant targeted therapy, including adjuvant EGFR-TKI trials and trials of immunotherapy drugs, are ongoing and will define the role of these agents as adjuvant therapy.

  9. Automated Boundary Conditions for Wind Tunnel Simulations

    NASA Technical Reports Server (NTRS)

    Carlson, Jan-Renee

    2018-01-01

    Computational fluid dynamics (CFD) simulations of models tested in wind tunnels require a high level of fidelity and accuracy, particularly for the purposes of CFD validation efforts. Considerable effort is required to properly characterize both the physical geometry of the wind tunnel and the flow conditions inside it. The typical trial-and-error effort used for determining the boundary condition values for a particular tunnel configuration is time and computer resource intensive. This paper describes a method for calculating and updating the back pressure boundary condition in wind tunnel simulations by using a proportional-integral-derivative controller. The controller methodology and equations are discussed, and simulations using the controller to set a tunnel Mach number in the NASA Langley 14- by 22-Foot Subsonic Tunnel are demonstrated.
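
    The paper's actual controller formulation, gains, and tunnel model are not reproduced here; the Python sketch below is only a generic discrete PID update that adjusts a back-pressure setting to drive a toy Mach-number model toward a target value. The linear "plant" and the gain values are invented for illustration.

```python
# Illustrative only: a discrete PID update adjusting the outflow (back) pressure
# so a simulated test-section Mach number approaches a target value.
# The linear "plant" below and the gains are invented for this sketch.

def toy_tunnel_mach(back_pressure):
    """Hypothetical plant: Mach number falls as back pressure rises."""
    return 0.9 - 0.5 * (back_pressure - 1.0)


def run_pid(target_mach=0.20, kp=0.8, ki=0.4, kd=0.05, dt=1.0, steps=30):
    back_pressure = 1.0                       # nondimensional initial guess
    integral, prev_err = 0.0, 0.0
    for step in range(steps):
        mach = toy_tunnel_mach(back_pressure)
        err = mach - target_mach              # Mach too high -> raise back pressure
        integral += err * dt
        derivative = (err - prev_err) / dt
        back_pressure += kp * err + ki * integral + kd * derivative
        prev_err = err
        if step % 5 == 0:
            print(f"step {step:2d}  Mach {mach:.3f}  back pressure {back_pressure:.3f}")
    return back_pressure


if __name__ == "__main__":
    run_pid()
```

    In an actual CFD simulation the "plant" is the flow solver itself, so the measured Mach number would be sampled from the converging solution between controller updates rather than from a closed-form model.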

  10. Increasingly inbred and fragmented populations of Plasmodium vivax associated with the eastward decline in malaria transmission across the Southwest Pacific

    PubMed Central

    Waltmann, Andreea; Koepfli, Cristian; Tessier, Natacha; Karl, Stephan; Fola, Abebe; Darcy, Andrew W.; Wini, Lyndes; Harrison, G. L. Abby; Barnadas, Céline; Jennison, Charlie; Karunajeewa, Harin; Boyd, Sarah; Whittaker, Maxine; Kazura, James; Bahlo, Melanie; Mueller, Ivo

    2018-01-01

    The human malaria parasite Plasmodium vivax is more resistant to malaria control strategies than Plasmodium falciparum, and maintains high genetic diversity even when transmission is low. To investigate whether declining P. vivax transmission leads to increasing population structure that would facilitate elimination, we genotyped samples from across the Southwest Pacific region, which experiences an eastward decline in malaria transmission, as well as samples from two time points at one site (Tetere, Solomon Islands) during intensified malaria control. Analysis of 887 P. vivax microsatellite haplotypes from hyperendemic Papua New Guinea (PNG, n = 443), meso-hyperendemic Solomon Islands (n = 420), and hypoendemic Vanuatu (n = 24) revealed increasing population structure and multilocus linkage disequilibrium yet a modest decline in diversity as transmission decreases over space and time. In Solomon Islands, which has had sustained control efforts for 20 years, and Vanuatu, which has experienced sustained low transmission for many years, significant population structure was observed at different spatial scales. We conclude that control efforts will eventually impact P. vivax population structure and with sustained pressure, populations may eventually fragment into a limited number of clustered foci that could be targeted for elimination. PMID:29373596

  11. To apply or not to apply: a survey analysis of grant writing costs and benefits.

    PubMed

    von Hippel, Ted; von Hippel, Courtney

    2015-01-01

    We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January, 2009 and November, 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write; although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads.

  12. Smoking Trends and Disparities Among Black and Non-Hispanic Whites in California

    PubMed Central

    Felicitas, Jamie; Fagan, Pebbles; Gruder, Charles L.; Blanco, Lyzette; Cappelli, Christopher; Trinidad, Dennis R.

    2015-01-01

    Objectives: The current study examined disparities in smoking trends across Blacks and non-Hispanic Whites in California. Methods: Data from the 1996 to 2008 California Tobacco Survey were analyzed to examine trends in smoking behaviors and cessation across Blacks and non-Hispanic Whites. Results: A decrease in overall ever and current smoking was observed for both Black and non-Hispanic Whites across the 12-year time period. A striking decrease in proportions of heavy daily smokers for both Black and non-Hispanic Whites was observed. Proportions of light and intermittent smokers and moderate daily smokers displayed modest increases for Blacks, but large increases for non-Hispanic Whites. Increases in successful cessation were also observed for Blacks and, to a lesser extent, for non-Hispanic Whites. Discussion: Smoking behavior and cessation trends across Blacks and non-Hispanic Whites were revealing. The decline in heavy daily and former smokers may demonstrate the success and effectiveness of tobacco control efforts in California. However, the increase in proportions of light and intermittent smokers and moderate daily smokers for both Blacks and non-Hispanic Whites demonstrates a need for tobacco cessation efforts focused on lighter smokers. PMID:25666813

  13. The potential roles of science centers in climate change adaptation

    NASA Astrophysics Data System (ADS)

    Hamilton, P.

    2012-12-01

    The overwhelming consensus amongst climatologists is that anthropogenic climate change is underway, but leading climate scientists also anticipate that over the next 20 years research may only modestly reduce the uncertainty about where, when and by how much climate will change. Uncertainty presents not only scientific challenges but social, political and economic quandaries as well. Both scientific and educational communities understand that climate change will test the resilience of societies especially because of the uncertainties regarding the timing, nature and severity of climate change. Thus the need is great for civic conversations regarding climate change adaptation. What roles might science centers play in helping their audiences and communities make decisions about climate change adaptation despite less-than-perfect knowledge? And how might informal and formal education work together on this task? This session will begin with a review of some initial efforts by selected science centers and their partners to engage their audiences in and help their communities grapple with climate change adaptation. It then will conclude with an audience discussion about potential future efforts by science centers both individually and in collaboration with formal education institutions to elevate public and policymaker awareness and appreciation of the need for climate change adaptation.

  14. Health communication, genetic determinism, and perceived control: the roles of beliefs about susceptibility and severity versus disease essentialism.

    PubMed

    Parrott, Roxanne; Kahl, Mary L; Ndiaye, Khadidiatou; Traeder, Tara

    2012-08-01

    This research examined the lay public's beliefs about genes and health that might be labeled deterministic. The goals of this research were to sort through the divergent and contested meanings of genetic determinism in an effort to suggest directions for public health genomic communication. A survey conducted in community-based settings of 717 participants included 267 who self-reported race as African American and 450 who self-reported race as Caucasian American. The survey results revealed that the structure of genetic determinism included 2 belief sets. One set aligned with perceived threat, encompassing susceptibility and severity beliefs linked to genes and health. The other set represents beliefs about biological essentialism linked to the role of genes for health. These concepts were found to be modestly positively related. Threat beliefs predicted perceived control over genes. Public health efforts to communicate about genes and health should consider effects of these messages for (a) perceived threat relating to susceptibility and severity and (b) perceptions of disease essentialism. Perceived threat may enhance motivation to act in health protective ways, whereas disease essentialist beliefs may contribute to a loss of motivation associated with control over health.

  15. Vaccine-induced T cells Provide Partial Protection Against High-dose Rectal SIVmac239 Challenge of Rhesus Macaques

    PubMed Central

    Lasaro, Marcio O; Haut, Larissa H; Zhou, Xiangyang; Xiang, Zhiquan; Zhou, Dongming; Li, Yan; Giles-Davis, Wynetta; Li, Hua; Engram, Jessica C; DiMenna, Lauren J; Bian, Ang; Sazanovich, Marina; Parzych, Elizabeth M; Kurupati, Raj; Small, Juliana C; Wu, Te-Lang; Leskowitz, Rachel M; Klatt, Nicole R; Brenchley, Jason M; Garber, David A; Lewis, Mark; Ratcliffe, Sarah J; Betts, Michael R; Silvestri, Guido; Ertl, Hildegund C

    2011-01-01

    Despite enormous efforts by the scientific community, an effective HIV vaccine remains elusive. To further address to what degree T cells in absence of antibodies may protect against simian immunodeficiency virus (SIV) disease progression, rhesus macaques were vaccinated intramuscularly with a chimpanzee-derived Ad vector (AdC) serotype 6 and then boosted intramuscularly with a serologically distinct AdC vector of serotype 7 both expressing Gag of SIVmac239. Animals were subsequently boosted intramuscularly with a modified vaccinia Ankara (MVA) virus expressing Gag and Tat of the homologous SIV before mucosal challenge with a high dose of SIVmac239 given rectally. Whereas vaccinated animals showed only a modest reduction of viral loads, their overall survival was improved, in association with a substantial protection from the loss of CD4+ T cells. In addition, the two vaccinated Mamu-A*01+ macaques controlled viral loads to levels below detection within weeks after challenge. These data strongly suggest that T cells, while unable to affect SIV acquisition upon high-dose rectal infection, can reduce disease progression. Induction of potent T-cell responses should thus remain a component of our efforts to develop an efficacious vaccine to HIV-1. PMID:21081905

  16. Vaccine-induced T cells provide partial protection against high-dose rectal SIVmac239 challenge of rhesus macaques.

    PubMed

    Lasaro, Marcio O; Haut, Larissa H; Zhou, Xiangyang; Xiang, Zhiquan; Zhou, Dongming; Li, Yan; Giles-Davis, Wynetta; Li, Hua; Engram, Jessica C; Dimenna, Lauren J; Bian, Ang; Sazanovich, Marina; Parzych, Elizabeth M; Kurupati, Raj; Small, Juliana C; Wu, Te-Lang; Leskowitz, Rachel M; Klatt, Nicole R; Brenchley, Jason M; Garber, David A; Lewis, Mark; Ratcliffe, Sarah J; Betts, Michael R; Silvestri, Guido; Ertl, Hildegund C

    2011-02-01

    Despite enormous efforts by the scientific community, an effective HIV vaccine remains elusive. To further address to what degree T cells in absence of antibodies may protect against simian immunodeficiency virus (SIV) disease progression, rhesus macaques were vaccinated intramuscularly with a chimpanzee-derived Ad vector (AdC) serotype 6 and then boosted intramuscularly with a serologically distinct AdC vector of serotype 7 both expressing Gag of SIVmac239. Animals were subsequently boosted intramuscularly with a modified vaccinia Ankara (MVA) virus expressing Gag and Tat of the homologous SIV before mucosal challenge with a high dose of SIVmac239 given rectally. Whereas vaccinated animals showed only a modest reduction of viral loads, their overall survival was improved, in association with a substantial protection from the loss of CD4(+) T cells. In addition, the two vaccinated Mamu-A*01(+) macaques controlled viral loads to levels below detection within weeks after challenge. These data strongly suggest that T cells, while unable to affect SIV acquisition upon high-dose rectal infection, can reduce disease progression. Induction of potent T-cell responses should thus remain a component of our efforts to develop an efficacious vaccine to HIV-1.

  17. Increasingly inbred and fragmented populations of Plasmodium vivax associated with the eastward decline in malaria transmission across the Southwest Pacific.

    PubMed

    Waltmann, Andreea; Koepfli, Cristian; Tessier, Natacha; Karl, Stephan; Fola, Abebe; Darcy, Andrew W; Wini, Lyndes; Harrison, G L Abby; Barnadas, Céline; Jennison, Charlie; Karunajeewa, Harin; Boyd, Sarah; Whittaker, Maxine; Kazura, James; Bahlo, Melanie; Mueller, Ivo; Barry, Alyssa E

    2018-01-01

    The human malaria parasite Plasmodium vivax is more resistant to malaria control strategies than Plasmodium falciparum, and maintains high genetic diversity even when transmission is low. To investigate whether declining P. vivax transmission leads to increasing population structure that would facilitate elimination, we genotyped samples from across the Southwest Pacific region, which experiences an eastward decline in malaria transmission, as well as samples from two time points at one site (Tetere, Solomon Islands) during intensified malaria control. Analysis of 887 P. vivax microsatellite haplotypes from hyperendemic Papua New Guinea (PNG, n = 443), meso-hyperendemic Solomon Islands (n = 420), and hypoendemic Vanuatu (n = 24) revealed increasing population structure and multilocus linkage disequilibrium yet a modest decline in diversity as transmission decreases over space and time. In Solomon Islands, which has had sustained control efforts for 20 years, and Vanuatu, which has experienced sustained low transmission for many years, significant population structure was observed at different spatial scales. We conclude that control efforts will eventually impact P. vivax population structure and with sustained pressure, populations may eventually fragment into a limited number of clustered foci that could be targeted for elimination.

  18. Does adding clinical data to administrative data improve agreement among hospital quality measures?

    PubMed

    Hanchate, Amresh D; Stolzmann, Kelly L; Rosen, Amy K; Fink, Aaron S; Shwartz, Michael; Ash, Arlene S; Abdulkerim, Hassen; Pugh, Mary Jo V; Shokeen, Priti; Borzecki, Ann

    2017-09-01

    Hospital performance measures based on patient mortality and readmission have indicated modest rates of agreement. We examined if combining clinical data on laboratory tests and vital signs with administrative data leads to improved agreement with each other, and with other measures of hospital performance in the nation's largest integrated health care system. We used patient-level administrative and clinical data, and hospital-level data on quality indicators, for 2007-2010 from the Veterans Health Administration (VA). For patients admitted for acute myocardial infarction (AMI), heart failure (HF) and pneumonia we examined changes in hospital performance on 30-d mortality and 30-d readmission rates as a result of adding clinical data to administrative data. We evaluated whether this enhancement yielded improved measures of hospital quality, based on concordance with other hospital quality indicators. For 30-d mortality, data enhancement improved model performance, and significantly changed hospital performance profiles; for 30-d readmission, the impact was modest. Concordance between enhanced measures of both outcomes, and with other hospital quality measures - including Joint Commission process measures, VA Surgical Quality Improvement Program (VASQIP) mortality and morbidity, and case volume - remained poor. Adding laboratory tests and vital signs to measure hospital performance on mortality and readmission did not improve the poor rates of agreement across hospital quality indicators in the VA. Efforts to improve risk adjustment models should continue; however, evidence of validation should precede their use as reliable measures of quality. Published by Elsevier Inc.

  19. Individual wealth rank, community wealth inequality, and self-reported adult poor health: a test of hypotheses with panel data (2002-2006) from native Amazonians, Bolivia.

    PubMed

    Undurraga, Eduardo A; Nyberg, Colleen; Eisenberg, Dan T A; Magvanjav, Oyunbileg; Reyes-García, Victoria; Huanca, Tomás; Leonard, William R; McDade, Thomas W; Tanner, Susan; Vadez, Vincent; Godoy, Ricardo

    2010-12-01

    Growing evidence suggests that economic inequality in a community harms the health of a person. Using panel data from a small-scale, preindustrial rural society, we test whether individual wealth rank and village wealth inequality affects self-reported poor health in a foraging-farming native Amazonian society. A person's wealth rank was negatively but weakly associated with self-reported morbidity. Each step up/year in the village wealth hierarchy reduced total self-reported days ill by 0.4 percent. The Gini coefficient of village wealth inequality bore a positive association with self-reported poor health that was large in size, but not statistically significant. We found small village wealth inequality, and evidence that individual economic rank did not change. The modest effects may have to do with having used subjective rather than objective measures of health, having small village wealth inequality, and with the possibly true modest effect of a person's wealth rank on health in a small-scale, kin-based society. Finally, we also found that an increase in mean individual wealth by village was related to worse self-reported health. As the Tsimane' integrate into the market economy, their possibilities of wealth accumulation rise, which may affect their well-being. Our work contributes to recent efforts in biocultural anthropology to link the study of social inequalities, human biology, and human-environment interactions.

  20. A Scalable O(N) Algorithm for Large-Scale Parallel First-Principles Molecular Dynamics Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osei-Kuffuor, Daniel; Fattebert, Jean-Luc

    2014-01-01

    Traditional algorithms for first-principles molecular dynamics (FPMD) simulations only gain a modest capability increase from current petascale computers, due to their O(N³) complexity and their heavy use of global communications. To address this issue, we are developing a truly scalable O(N) complexity FPMD algorithm, based on density functional theory (DFT), which avoids global communications. The computational model uses a general nonorthogonal orbital formulation for the DFT energy functional, which requires knowledge of selected elements of the inverse of the associated overlap matrix. We present a scalable algorithm for approximately computing selected entries of the inverse of the overlap matrix, based on an approximate inverse technique, by inverting local blocks corresponding to principal submatrices of the global overlap matrix. The new FPMD algorithm exploits sparsity and uses nearest neighbor communication to provide a computational scheme capable of extreme scalability. Accuracy is controlled by the mesh spacing of the finite difference discretization, the size of the localization regions in which the electronic orbitals are confined, and a cutoff beyond which the entries of the overlap matrix can be omitted when computing selected entries of its inverse. We demonstrate the algorithm's excellent parallel scaling for up to O(100K) atoms on O(100K) processors, with a wall-clock time of O(1) minute per molecular dynamics time step.
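
    As a toy illustration of the localization idea described above (not the paper's algorithm or its parallel implementation), the NumPy sketch below approximates the diagonal of S⁻¹ for a banded, diagonally dominant "overlap" matrix by inverting only a local principal submatrix around each index and reading off the central entry.

```python
import numpy as np


def local_inverse_diagonal(S, half_width=4):
    """Approximate diag(S^{-1}) by inverting local principal submatrices of S."""
    n = S.shape[0]
    approx = np.empty(n)
    for i in range(n):
        lo, hi = max(0, i - half_width), min(n, i + half_width + 1)
        block = S[lo:hi, lo:hi]                      # local principal submatrix
        approx[i] = np.linalg.inv(block)[i - lo, i - lo]
    return approx


if __name__ == "__main__":
    n = 200
    # Banded, diagonally dominant "overlap" matrix: identity plus small
    # nearest-neighbour couplings, so off-diagonal influence decays quickly.
    S = np.eye(n)
    off = 0.2 * np.ones(n - 1)
    S += np.diag(off, 1) + np.diag(off, -1)

    exact = np.diag(np.linalg.inv(S))
    approx = local_inverse_diagonal(S, half_width=6)
    print("max abs error:", np.max(np.abs(exact - approx)))
```

    The locality of the true inverse (and hence the accuracy of such block truncation) depends on how strongly diagonally dominant or well-conditioned the overlap matrix is, which is why the abstract lists the localization-region size and overlap cutoff as accuracy controls.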

  1. Tropospheric ozone observations - How well can we assess tropospheric ozone changes?

    NASA Astrophysics Data System (ADS)

    Tarasick, D. W.; Galbally, I. E.; Ancellet, G.; Leblanc, T.; Wallington, T. J.; Ziemke, J. R.; Steinbacher, M.; Stähelin, J.; Vigouroux, C.; Hannigan, J. W.; García, O. E.; Foret, G.; Zanis, P.; Liu, X.; Weatherhead, E. C.; Petropavlovskikh, I. V.; Worden, H. M.; Osman, M.; Liu, J.; Lin, M.; Cooper, O. R.; Schultz, M. G.; Granados-Muñoz, M. J.; Thompson, A. M.; Cuesta, J.; Dufour, G.; Thouret, V.; Hassler, B.; Trickl, T.

    2017-12-01

    Since the early 20th century, measurements of ozone in the free troposphere have evolved and changed. Data records have different uncertainties and biases, and differ with respect to coverage, information content, and representativeness. Almost all validation studies employ ECC ozonesondes. These have been compared to UV-absorption measurements in a number of intercomparison studies, and show a modest (≈1-5%) high bias in the troposphere, with an uncertainty of 5%, but no evidence of a change over time. Umkehr, lidar, FTIR, and commercial aircraft all show modest low biases relative to the ECCs, and so -- if the ECC biases are transferable -- all agree within 1σ with the modern UV standard. Relative to the UV standard, Brewer-Mast sondes show a 20% increase in sensitivity from 1970-1995, while Japanese KC sondes show an increase of 5-10%. Combined with the shift of the global ozonesonde network to ECCs, this can induce a false positive trend in analyses based on sonde data. Passive sounding methods -- Umkehr, FTIR and satellites -- have much lower vertical resolution than active methods, and this can limit the attribution of trends. Satellite biases are larger than those of other measurement systems, ranging between -10% and +20%, and standard deviations are large: about 10-30%, versus 5-10% for sondes, aircraft, lidar and ground-based FTIR. There is currently little information on measurement drift for satellite measurements of tropospheric ozone. This is an evident area of concern if satellite retrievals are used for trend studies. The importance of ECC sondes as a transfer standard for satellite validation means that efforts to homogenize existing records, by correcting for known changes and by adopting strict standard operating procedures, should continue, and additional research effort should be put into understanding and reducing sonde uncertainties. Representativeness is also a potential source of large errors, which are difficult to quantify. The global observation network is unevenly distributed, and so additional sites (or airports) would be of benefit. Objective methods of quantifying spatial representativeness can optimize future network design. International cooperation and data sharing will be of paramount importance, as the TOAR project has demonstrated.

  2. A repeated cross-sectional study of socio-economic inequities in dietary sodium consumption among Canadian adults: implications for national sodium reduction strategies.

    PubMed

    McLaren, Lindsay; Heidinger, Shayla; Dutton, Daniel J; Tarasuk, Valerie; Campbell, Norman R

    2014-06-05

    In many countries including Canada, excess consumption of dietary sodium is common, and this has adverse implications for population health. Socio-economic inequities in sodium consumption seem likely, but research is limited. Knowledge of socio-economic inequities in sodium consumption is important for informing population-level sodium reduction strategies, to ensure that they are both impactful and equitable. We examined the association between socio-economic indicators (income and education) and sodium, using two outcome variables: 1) sodium consumption in mg/day, and 2) reported use of table salt, in two national surveys: the 1970/72 Nutrition Canada Survey and the 2004 Canadian Community Health Survey, Cycle 2.2. This permitted us to explore whether there were any changes in socio-economic patterning in dietary sodium during a time period characterized by modest, information-based national sodium reduction efforts, as well as to provide baseline information against which to examine the impact (equitable or not) of future sodium reduction strategies in Canada. There was no evidence of a socio-economic inequity in sodium consumption (mg/day) in 2004. In fact findings pointed to a positive association in women, whereby women of higher education consumed more sodium than women of lower education in 2004. For men, income was positively associated with reported use of table salt in 1970/72, but negatively associated in 2004. An emerging inequity in reported use of table salt among men could reflect the modest, information-based sodium reduction efforts that were implemented during the time frame considered. However, for sodium consumption in mg/day, we found no evidence of a contemporary inequity, and in fact observed the opposite effect among women. Our findings could reflect data limitations, or they could signal that sodium differs from some other nutrients in terms of its socio-economic patterning, perhaps reflecting very high prevalence of excess consumption. It is possible that socio-economic inequities in sodium consumption will emerge as excess consumption declines, consistent with fundamental cause theory. It is important that national sodium reduction strategies are both impactful and equitable.

  3. A repeated cross-sectional study of socio-economic inequities in dietary sodium consumption among Canadian adults: implications for national sodium reduction strategies

    PubMed Central

    2014-01-01

    Introduction In many countries including Canada, excess consumption of dietary sodium is common, and this has adverse implications for population health. Socio-economic inequities in sodium consumption seem likely, but research is limited. Knowledge of socio-economic inequities in sodium consumption is important for informing population-level sodium reduction strategies, to ensure that they are both impactful and equitable. Methods We examined the association between socio-economic indicators (income and education) and sodium, using two outcome variables: 1) sodium consumption in mg/day, and 2) reported use of table salt, in two national surveys: the 1970/72 Nutrition Canada Survey and the 2004 Canadian Community Health Survey, Cycle 2.2. This permitted us to explore whether there were any changes in socio-economic patterning in dietary sodium during a time period characterized by modest, information-based national sodium reduction efforts, as well as to provide baseline information against which to examine the impact (equitable or not) of future sodium reduction strategies in Canada. Results There was no evidence of a socio-economic inequity in sodium consumption (mg/day) in 2004. In fact findings pointed to a positive association in women, whereby women of higher education consumed more sodium than women of lower education in 2004. For men, income was positively associated with reported use of table salt in 1970/72, but negatively associated in 2004. Conclusions An emerging inequity in reported use of table salt among men could reflect the modest, information-based sodium reduction efforts that were implemented during the time frame considered. However, for sodium consumption in mg/day, we found no evidence of a contemporary inequity, and in fact observed the opposite effect among women. Our findings could reflect data limitations, or they could signal that sodium differs from some other nutrients in terms of its socio-economic patterning, perhaps reflecting very high prevalence of excess consumption. It is possible that socio-economic inequities in sodium consumption will emerge as excess consumption declines, consistent with fundamental cause theory. It is important that national sodium reduction strategies are both impactful and equitable. PMID:24903535

  4. A Noise-Assisted Reprogrammable Nanomechanical Logic Gate

    DTIC Science & Technology

    2009-01-01

    effort toward scalable mechanical computation.1-4 This effort can be traced back to 1822 (at least), when Charles Babbage presented a mechanical...the ONR (N000140910963). REFERENCES AND NOTES (1) Babbage, H. P. Babbage's Calculating Engines; Charles Babbage Reprint Series for the History of

  5. On the evaluation of derivatives of Gaussian integrals

    NASA Technical Reports Server (NTRS)

    Helgaker, Trygve; Taylor, Peter R.

    1992-01-01

    We show that by a suitable change of variables, the derivatives of molecular integrals over Gaussian-type functions required for analytic energy derivatives can be evaluated with significantly less computational effort than current formulations. The reduction in effort increases with the order of differentiation.

  6. How the Brain Decides When to Work and When to Rest: Dissociation of Implicit-Reactive from Explicit-Predictive Computational Processes

    PubMed Central

    Meyniel, Florent; Safra, Lou; Pessiglione, Mathias

    2014-01-01

    A pervasive case of cost-benefit problem is how to allocate effort over time, i.e. deciding when to work and when to rest. An economic decision perspective would suggest that duration of effort is determined beforehand, depending on expected costs and benefits. However, the literature on exercise performance emphasizes that decisions are made on the fly, depending on physiological variables. Here, we propose and validate a general model of effort allocation that integrates these two views. In this model, a single variable, termed cost evidence, accumulates during effort and dissipates during rest, triggering effort cessation and resumption when reaching bounds. We assumed that such a basic mechanism could explain implicit adaptation, whereas the latent parameters (slopes and bounds) could be amenable to explicit anticipation. A series of behavioral experiments manipulating effort duration and difficulty was conducted in a total of 121 healthy humans to dissociate implicit-reactive from explicit-predictive computations. Results show 1) that effort and rest durations are adapted on the fly to variations in cost-evidence level, 2) that the cost-evidence fluctuations driving the behavior do not match explicit ratings of exhaustion, and 3) that actual difficulty impacts effort duration whereas expected difficulty impacts rest duration. Taken together, our findings suggest that cost evidence is implicitly monitored online, with an accumulation rate proportional to actual task difficulty. In contrast, cost-evidence bounds and dissipation rate might be adjusted in anticipation, depending on explicit task difficulty. PMID:24743711
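
    A minimal sketch of the accumulation-to-bound mechanism described above, assuming arbitrary placeholder slopes and bounds rather than the parameters fitted in the study:

    ```python
    # Minimal sketch of the cost-evidence model: a single variable rises during
    # effort and decays during rest; hitting the upper bound stops effort, and
    # hitting the lower bound resumes it. Parameter values are illustrative only.
    def simulate_effort_allocation(steps=2000, dt=0.01,
                                   accumulation_rate=1.0,   # grows with task difficulty
                                   dissipation_rate=0.6,
                                   upper_bound=1.0, lower_bound=0.0):
        cost_evidence, working = 0.0, True
        trace = []
        for _ in range(steps):
            rate = accumulation_rate if working else -dissipation_rate
            cost_evidence += rate * dt
            if working and cost_evidence >= upper_bound:
                working = False            # exhaustion bound reached: rest
            elif not working and cost_evidence <= lower_bound:
                working = True             # recovery bound reached: resume effort
            trace.append((cost_evidence, working))
        return trace

    trace = simulate_effort_allocation()
    effort_fraction = sum(w for _, w in trace) / len(trace)
    print(f"fraction of time spent exerting effort: {effort_fraction:.2f}")
    ```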

  7. Comfort and experience with online learning: trends over nine years and associations with knowledge.

    PubMed

    Cook, David A; Thompson, Warren G

    2014-07-01

    Some evidence suggests that attitude toward computer-based instruction is an important determinant of success in online learning. We sought to determine how comfort using computers and perceptions of prior online learning experiences have changed over the past decade, and how these associate with learning outcomes. Each year from 2003-2011 we conducted a prospective trial of online learning. As part of each year's study, we asked medicine residents about their comfort using computers and if their previous experiences with online learning were favorable. We assessed knowledge using a multiple-choice test. We used regression to analyze associations and changes over time. 371 internal medicine and family medicine residents participated. Neither comfort with computers nor perceptions of prior online learning experiences showed a significant change across years (p > 0.61), with mean comfort rating 3.96 (maximum 5 = very comfortable) and mean experience rating 4.42 (maximum 6 = strongly agree [favorable]). Comfort showed no significant association with knowledge scores (p = 0.39) but perceptions of prior experiences did, with a 1.56% rise in knowledge score for a 1-point rise in experience score (p = 0.02). Correlations among comfort, perceptions of prior experiences, and number of prior experiences were all small and not statistically significant. Comfort with computers and perceptions of prior experience with online learning remained stable over nine years. Prior good experiences (but not comfort with computers) demonstrated a modest association with knowledge outcomes, suggesting that prior course satisfaction may influence subsequent learning.

  8. Scalable Analysis Methods and In Situ Infrastructure for Extreme Scale Knowledge Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, Wes

    2016-07-24

    The primary challenge motivating this team’s work is the widening gap between the ability to compute information and to store it for subsequent analysis. This gap adversely impacts science code teams, who are able to perform analysis only on a small fraction of the data they compute, resulting in the very real likelihood of lost or missed science, when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing on data while it is still resident in memory, an approach that is known as in situ processing. The idea of in situ processing was not new at the time of the start of this effort in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community that aimed to foster production-quality software tools suitable for use by DOE science projects. In large part, our objective was to produce and enable use of production-quality in situ methods and infrastructure, at scale, on DOE HPC facilities, though we expected to have impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve that objective, we assembled a unique team of researchers consisting of representatives from DOE national laboratories, academia, and industry, and engaged in software technology R&D, as well as engaged in close partnerships with DOE science code teams, to produce software technologies that were shown to run effectively at scale on DOE HPC platforms.
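
    As a toy illustration of the in situ idea (not the project's actual software stack), the loop below computes summary statistics on simulated data while it is still in memory instead of writing full snapshots to disk for later analysis:

    ```python
    import numpy as np

    # Illustrative only: couple an "analysis" callback to a simulation loop so
    # summaries are computed in memory rather than after writing data to disk.
    def in_situ_analysis(step, field):
        # Reduce the full field to a few scalars while it is still resident.
        return {"step": step, "mean": float(field.mean()), "max": float(field.max())}

    def run_simulation(n_steps=5, shape=(256, 256), seed=0):
        rng = np.random.default_rng(seed)
        field = rng.random(shape)
        summaries = []
        for step in range(n_steps):
            field = 0.9 * field + 0.1 * rng.random(shape)   # stand-in for a solver update
            summaries.append(in_situ_analysis(step, field)) # analyze without any I/O
        return summaries

    for summary in run_simulation():
        print(summary)
    ```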

  9. Computation of mass-density images from x-ray refraction-angle images.

    PubMed

    Wernick, Miles N; Yang, Yongyi; Mondal, Indrasis; Chapman, Dean; Hasnah, Moumen; Parham, Christopher; Pisano, Etta; Zhong, Zhong

    2006-04-07

    In this paper, we investigate the possibility of computing quantitatively accurate images of mass density variations in soft tissue. This is a challenging task, because density variations in soft tissue, such as the breast, can be very subtle. Beginning from an image of refraction angle created by either diffraction-enhanced imaging (DEI) or multiple-image radiography (MIR), we estimate the mass-density image using a constrained least squares (CLS) method. The CLS algorithm yields accurate density estimates while effectively suppressing noise. Our method improves on an analytical method proposed by Hasnah et al (2005 Med. Phys. 32 549-52), which can produce significant artefacts when even a modest level of noise is present. We present a quantitative evaluation study to determine the accuracy with which mass density can be determined in the presence of noise. Based on computer simulations, we find that the mass-density estimation error can be as low as a few per cent for typical density variations found in the breast. Example images computed from less-noisy real data are also shown to illustrate the feasibility of the technique. We anticipate that density imaging may have application in assessment of water content of cartilage resulting from osteoarthritis, in evaluation of bone density, and in mammographic interpretation.
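
    The abstract does not spell out the exact forward operator or constraints, so the following sketch only illustrates the general regularized least-squares idea: recover a density profile from noisy gradient-like (refraction-angle) data with a smoothness penalty that suppresses noise:

    ```python
    import numpy as np

    # Generic regularized least-squares sketch (not the paper's exact operator):
    # refraction angle is modeled as a linear operator A applied to density, and
    # density is recovered with a second-difference smoothness penalty.
    n = 200
    x_true = np.exp(-0.5 * ((np.arange(n) - 100) / 15.0) ** 2)   # toy density bump

    # Forward model: first-difference operator (refraction ~ density gradient).
    A = np.eye(n, k=1) - np.eye(n)
    rng = np.random.default_rng(1)
    y = A @ x_true + 0.02 * rng.standard_normal(n)               # noisy angle data

    # Smoothness regularizer suppresses noise amplification in the inversion.
    L = np.eye(n, k=1) + np.eye(n, k=-1) - 2 * np.eye(n)
    lam = 1.0
    x_hat = np.linalg.solve(A.T @ A + lam * (L.T @ L), A.T @ y)

    print(f"relative reconstruction error: "
          f"{np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true):.3f}")
    ```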

  10. Fractional Steps methods for transient problems on commodity computer architectures

    NASA Astrophysics Data System (ADS)

    Krotkiewski, M.; Dabrowski, M.; Podladchikov, Y. Y.

    2008-12-01

    Fractional Steps methods are suitable for modeling transient processes that are central to many geological applications. Low memory requirements and modest computational complexity facilitate calculations on high-resolution three-dimensional models. An efficient implementation of Alternating Direction Implicit/Locally One-Dimensional schemes for an Opteron-based shared memory system is presented. The memory bandwidth usage, the main bottleneck on modern computer architectures, is specially addressed. High efficiency of above 2 GFlops per CPU is sustained for problems of 1 billion degrees of freedom. The optimized sequential implementation of all 1D sweeps is comparable in execution time to copying the used data in the memory. Scalability of the parallel implementation on up to 8 CPUs is close to perfect. Performing one timestep of the Locally One-Dimensional scheme on a system of 1000³ unknowns on 8 CPUs takes only 11 s. We validate the LOD scheme using a computational model of an isolated inclusion subject to a constant far field flux. Next, we study numerically the evolution of a diffusion front and the effective thermal conductivity of composites consisting of multiple inclusions and compare the results with predictions based on the differential effective medium approach. Finally, application of the developed parabolic solver is suggested for a real-world problem of fluid transport and reactions inside a reservoir.
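
    Each locally one-dimensional step reduces to independent tridiagonal solves along one axis; the sketch below shows one implicit x-sweep of a diffusion step using the Thomas algorithm, with illustrative parameters rather than those of the study:

    ```python
    import numpy as np

    def thomas_solve(a, b, c, d):
        """Solve a tridiagonal system with sub-, main-, and super-diagonals a, b, c."""
        n = len(d)
        cp, dp = np.empty(n), np.empty(n)
        cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
        for i in range(1, n):
            m = b[i] - a[i] * cp[i - 1]
            cp[i] = c[i] / m
            dp[i] = (d[i] - a[i] * dp[i - 1]) / m
        x = np.empty(n)
        x[-1] = dp[-1]
        for i in range(n - 2, -1, -1):
            x[i] = dp[i] - cp[i] * x[i + 1]
        return x

    def lod_sweep_x(u, alpha, dt, dx):
        """One implicit sweep along x: (I - r d2/dx2) u_new = u_old, column by column."""
        r = alpha * dt / dx**2
        nx = u.shape[0]
        a = np.full(nx, -r); a[0] = 0.0
        c = np.full(nx, -r); c[-1] = 0.0
        b = np.full(nx, 1.0 + 2.0 * r)
        b[0] = b[-1] = 1.0 + r        # simple zero-flux ends for this sketch
        out = np.empty_like(u)
        for j in range(u.shape[1]):   # each column is an independent 1D problem
            out[:, j] = thomas_solve(a, b, c, u[:, j])
        return out

    u = np.zeros((64, 64)); u[32, 32] = 1.0        # point source
    u = lod_sweep_x(u, alpha=1.0, dt=0.1, dx=1.0)  # x-sweep; a y-sweep would follow
    print(f"total heat after sweep: {u.sum():.3f}")
    ```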

  11. DIGGING DEEPER INTO DEEP DATA: MOLECULAR DOCKING AS A HYPOTHESIS-DRIVEN BIOPHYSICAL INTERROGATION SYSTEM IN COMPUTATIONAL TOXICOLOGY.

    EPA Science Inventory

    Developing and evaluating predictive strategies to elucidate the mode of biological activity of environmental chemicals is a major objective of the concerted efforts of the US-EPA's computational toxicology program.

  12. Globus Quick Start Guide. Globus Software Version 1.1

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The Globus Project is a community effort, led by Argonne National Laboratory and the University of Southern California's Information Sciences Institute. Globus is developing the basic software infrastructure for computations that integrate geographically distributed computational and information resources.

  13. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    NASA Astrophysics Data System (ADS)

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.
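
    Hairball's actual plugin API is not described in the abstract; the fragment below is only a generic, hypothetical illustration of the kind of static check such a tool might perform, here scanning a simplified Scratch-like project description for scripts that are never triggered by the green flag:

    ```python
    # Hypothetical, simplified project structure; real Scratch projects are JSON
    # with a richer schema, and Hairball's own API may differ from this sketch.
    project = {
        "sprites": [
            {"name": "Cat", "scripts": [["whenGreenFlag", "forward:10", "say:Hello"]]},
            {"name": "Dog", "scripts": [["playSound:bark"]]},   # never triggered
        ]
    }

    def sprites_without_green_flag(project):
        """Return sprites none of whose scripts start from the green-flag hat block."""
        flagged = []
        for sprite in project["sprites"]:
            starts = {script[0] for script in sprite["scripts"] if script}
            if "whenGreenFlag" not in starts:
                flagged.append(sprite["name"])
        return flagged

    print("sprites with unreachable scripts:", sprites_without_green_flag(project))
    ```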

  14. A Comparison of Approaches for Solving Hard Graph-Theoretic Problems

    DTIC Science & Technology

    2015-05-01

    collaborative effort "Adiabatic Quantum Computing Applications Research" (14-RI-CRADA-02) between the Information Directorate and Lock- 3 Algorithm 3...using Matlab, a quantum annealing approach using the D-Wave computer, and lastly using satisfiability modulo theory (SMT) and corresponding SMT...methods are explored and consist of a parallel computing approach using Matlab, a quantum annealing approach using the D-Wave computer, and lastly using

  15. Computational Omics Pre-Awardees | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    The National Cancer Institute's Clinical Proteomic Tumor Analysis Consortium (CPTAC) is pleased to announce the pre-awardees of the Computational Omics solicitation. Working with NVIDIA Foundation's Compute the Cure initiative and Leidos Biomedical Research Inc., the NCI, through this solicitation, seeks to leverage computational efforts to provide tools for the mining and interpretation of large-scale publicly available ‘omics’ datasets.

  16. Computers on Wheels: An Alternative to Each One Has One

    ERIC Educational Resources Information Center

    Grant, Michael M.; Ross, Steven M.; Wang, Weiping; Potter, Allison

    2005-01-01

    Four fifth-grade classrooms embarked on a modified ubiquitous computing initiative in the fall of 2003. Two 15-computer wireless laptop carts were shared among the four classrooms in an effort to integrate technology across the curriculum and effect change in student learning and teacher pedagogy. This initiative--in contrast to other one-to-one…

  17. Apple Seeks To Regain Its Stature in World of Academic Computing.

    ERIC Educational Resources Information Center

    Young, Jeffrey R.; Blumenstyk, Goldie

    1998-01-01

    Managers of Apple Computer, the company that pioneered campus personal computing and later lost most of its share of the market, are again focusing energies on academic buyers. Campus technology officials, even those fond of Apples, are greeting the company's efforts with caution. Some feel it may be too late for Apple to regain a significant…

  18. Crossbar Nanocomputer Development

    DTIC Science & Technology

    2012-04-01

    their utilization. Areas such as neuromorphic computing, signal processing, arithmetic processing, and crossbar computing are only some of the...due to its intrinsic, network-on-chip flexibility to re-route around defects. Preliminary efforts in crossbar computing have been demonstrated by...they approach their scaling limits [2]. Other applications that memristive devices are suited for include FPGA [3], encryption [4], and neuromorphic

  19. A new generation in computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kahn, R.E.

    1983-11-01

    The fifth generation of computers is described. The three disciplines involved in bringing such a new generation to reality are: microelectronics; artificial intelligence; and computer systems and architecture. Applications in industry, offices, aerospace, education, health care and retailing are outlined. An analysis is given of research efforts in the US, Japan, U.K., and Europe. Fifth generation programming languages are detailed.

  20. 45 CFR 309.145 - What costs are allowable for Tribal IV-D programs carried out under § 309.65(a) of this part?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    .... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...

  1. 45 CFR 309.145 - What costs are allowable for Tribal IV-D programs carried out under § 309.65(a) of this part?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    .... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...

  2. 45 CFR 309.145 - What costs are allowable for Tribal IV-D programs carried out under § 309.65(a) of this part?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    .... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...

  3. 45 CFR 309.145 - What costs are allowable for Tribal IV-D programs carried out under § 309.65(a) of this part?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    .... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...

  4. 45 CFR 309.145 - What costs are allowable for Tribal IV-D programs carried out under § 309.65(a) of this part?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    .... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...

  5. Reading Teachers' Beliefs and Utilization of Computer and Technology: A Case Study

    ERIC Educational Resources Information Center

    Remetio, Jessica Espinas

    2014-01-01

    Many researchers believe that computers have the ability to help improve the reading skills of students. In an effort to improve the poor reading scores of students on state tests, as well as improve students' overall academic performance, computers and other technologies have been installed in Frozen Bay School classrooms. As the success of these…

  6. Attitudes of Design Students toward Computer Usage in Design

    ERIC Educational Resources Information Center

    Pektas, Sule Tasli; Erkip, Feyzan

    2006-01-01

    The success of efforts to integrate technology with design education is largely affected by the attitudes of students toward technology. This paper presents the findings of a research on the attitudes of design students toward the use of computers in design and its correlates. Computer Aided Design (CAD) tools are the most widely used computer…

  7. Using an Online Homework System to Submit Accounting Homework: Role of Cognitive Need, Computer Efficacy, and Perception

    ERIC Educational Resources Information Center

    Peng, Jacob C.

    2009-01-01

    The author investigated whether students' effort in working on homework problems was affected by their need for cognition, their perception of the system, and their computer efficacy when instructors used an online system to collect accounting homework. Results showed that individual intrinsic motivation and computer efficacy are important factors…

  8. Education:=Coding+Aesthetics; Aesthetic Understanding, Computer Science Education, and Computational Thinking

    ERIC Educational Resources Information Center

    Good, Jonathon; Keenan, Sarah; Mishra, Punya

    2016-01-01

    The popular press is rife with examples of how students in the United States and around the globe are learning to program, make, and tinker. The Hour of Code, maker-education, and similar efforts are advocating that more students be exposed to principles found within computer science. We propose an expansion beyond simply teaching computational…

  9. Human-computer interaction in multitask situations

    NASA Technical Reports Server (NTRS)

    Rouse, W. B.

    1977-01-01

    Human-computer interaction in multitask decisionmaking situations is considered, and it is proposed that humans and computers have overlapping responsibilities. Queueing theory is employed to model this dynamic approach to the allocation of responsibility between human and computer. Results of simulation experiments are used to illustrate the effects of several system variables including number of tasks, mean time between arrivals of action-evoking events, human-computer speed mismatch, probability of computer error, probability of human error, and the level of feedback between human and computer. Current experimental efforts are discussed and the practical issues involved in designing human-computer systems for multitask situations are considered.
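
    A rough discrete-event sketch of the queueing idea: action-evoking events arrive at random, the computer serves a task when it is idle (and is bypassed for error-prone cases), and remaining tasks go to the human. The rates and the routing rule are placeholders, not the paper's model:

    ```python
    import random

    # Illustrative single-run simulation of shared human/computer task service.
    def simulate(n_events=10000, mean_interarrival=1.0,
                 computer_service=0.4, human_service=1.5,
                 p_computer_error=0.1, seed=42):
        rng = random.Random(seed)
        t = 0.0
        computer_free_at = human_free_at = 0.0
        handled_by = {"computer": 0, "human": 0}
        for _ in range(n_events):
            t += rng.expovariate(1.0 / mean_interarrival)       # next action-evoking event
            # Route to the computer when it is idle and the task is not one the
            # computer is expected to get wrong; otherwise queue it for the human.
            if computer_free_at <= t and rng.random() > p_computer_error:
                computer_free_at = t + rng.expovariate(1.0 / computer_service)
                handled_by["computer"] += 1
            else:
                human_free_at = max(human_free_at, t) + rng.expovariate(1.0 / human_service)
                handled_by["human"] += 1
        return handled_by

    print(simulate())
    ```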

  10. Effects of environmental covariates and density on the catchability of fish populations and interpretation of catch per unit effort trends

    USGS Publications Warehouse

    Korman, Josh; Yard, Mike

    2017-01-01

    Quantifying temporal and spatial trends in abundance or relative abundance is required to evaluate effects of harvest and changes in habitat for exploited and endangered fish populations. In many cases, the proportion of the population or stock that is captured (catchability or capture probability) is unknown but is often assumed to be constant over space and time. We used data from a large-scale mark-recapture study to evaluate the extent of spatial and temporal variation, and the effects of fish density, fish size, and environmental covariates, on the capture probability of rainbow trout (Oncorhynchus mykiss) in the Colorado River, AZ. Estimates of capture probability for boat electrofishing varied 5-fold across five reaches, 2.8-fold across the range of fish densities that were encountered, 2.1-fold over 19 trips, and 1.6-fold over five fish size classes. Shoreline angle and turbidity were the best covariates explaining variation in capture probability across reaches and trips. Patterns in capture probability were driven by changes in gear efficiency and spatial aggregation, but the latter was more important. Failure to account for effects of fish density on capture probability when translating a historical catch per unit effort time series into a time series of abundance led to 2.5-fold underestimation of the maximum extent of variation in abundance over the period of record, and resulted in unreliable estimates of relative change in critical years. Catch per unit effort surveys have utility for monitoring long-term trends in relative abundance, but are too imprecise and potentially biased to evaluate population response to habitat changes or to modest changes in fishing effort.
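
    The core caution can be shown with a toy calculation (invented numbers): if catchability falls as density rises, converting catch per unit effort to abundance with a constant catchability compresses the true variation in abundance:

    ```python
    # Toy illustration (invented numbers): CPUE = q(N) * N, with catchability q
    # declining as density N rises. Assuming a constant q when back-calculating
    # abundance from CPUE understates how much abundance really varied.
    true_abundance = [2000, 4000, 8000, 16000]                    # fish in a reach
    q = [0.020 / (1 + n / 8000.0) for n in true_abundance]        # density-dependent q
    cpue = [qi * n for qi, n in zip(q, true_abundance)]

    q_const = sum(q) / len(q)                                     # naive constant catchability
    naive_abundance = [c / q_const for c in cpue]

    true_range = max(true_abundance) / min(true_abundance)
    naive_range = max(naive_abundance) / min(naive_abundance)
    print(f"true fold-change in abundance:  {true_range:.1f}x")
    print(f"fold-change inferred from CPUE: {naive_range:.1f}x")
    ```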

  11. Robotics-Centered Outreach Activities: An Integrated Approach

    ERIC Educational Resources Information Center

    Ruiz-del-Solar, Javier

    2010-01-01

    Nowadays, universities are making extensive efforts to attract prospective students to the fields of electrical, electronic, and computer engineering. Thus, outreach is becoming increasingly important, and activities with schoolchildren are being extensively carried out as part of this effort. In this context, robotics is a very attractive and…

  12. Automated Lumber Processing

    Treesearch

    Powsiri Klinkhachorn; J. Moody; Philip A. Araman

    1995-01-01

    For the past few decades, researchers have devoted time and effort to apply automation and modern computer technologies towards improving the productivity of traditional industries. To be competitive, one must streamline operations and minimize production costs, while maintaining an acceptable margin of profit. This paper describes the effort of one such endeavor...

  13. A Framework for Robust Multivariable Optimization of Integrated Circuits in Space Applications

    NASA Technical Reports Server (NTRS)

    DuMonthier, Jeffrey; Suarez, George

    2013-01-01

    Application Specific Integrated Circuit (ASIC) design for space applications involves multiple challenges of maximizing performance, minimizing power and ensuring reliable operation in extreme environments. This is a complex multidimensional optimization problem which must be solved early in the development cycle of a system due to the time required for testing and qualification severely limiting opportunities to modify and iterate. Manual design techniques which generally involve simulation at one or a small number of corners with a very limited set of simultaneously variable parameters in order to make the problem tractable are inefficient and not guaranteed to achieve the best possible results within the performance envelope defined by the process and environmental requirements. What is required is a means to automate design parameter variation, allow the designer to specify operational constraints and performance goals, and to analyze the results in a way which facilitates identifying the tradeoffs defining the performance envelope over the full set of process and environmental corner cases. The system developed by the Mixed Signal ASIC Group (MSAG) at the Goddard Space Flight Center is implemented as framework of software modules, templates and function libraries. It integrates CAD tools and a mathematical computing environment, and can be customized for new circuit designs with only a modest amount of effort as most common tasks are already encapsulated. Customization is required for simulation test benches to determine performance metrics and for cost function computation. Templates provide a starting point for both while toolbox functions minimize the code required. Once a test bench has been coded to optimize a particular circuit, it is also used to verify the final design. The combination of test bench and cost function can then serve as a template for similar circuits or be re-used to migrate the design to different processes by re-running it with the new process specific device models. The system has been used in the design of time to digital converters for laser ranging and time-of-flight mass spectrometry to optimize analog, mixed signal and digital circuits such as charge sensitive amplifiers, comparators, delay elements, radiation tolerant dual interlocked (DICE) flip-flops and two of three voter gates.
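
    As a hedged sketch of the general pattern (not the MSAG framework's actual API), the fragment below sweeps two design parameters, evaluates a worst-case-over-corners cost combining a delay target and a power budget, and keeps the best candidate:

    ```python
    import itertools

    # Illustrative stand-in for a circuit simulator: returns (delay_ns, power_mW)
    # for a given bias current and device width at one process/temperature corner.
    def fake_simulation(bias_uA, width_um, corner):
        slowdown = {"ss_125C": 1.3, "tt_25C": 1.0, "ff_m55C": 0.8}[corner]
        delay = slowdown * 10.0 / (bias_uA * width_um) ** 0.5
        power = 0.002 * bias_uA * (1.0 + 0.1 * width_um)
        return delay, power

    def cost(delay, power, delay_target=1.0, power_budget=0.6):
        # Penalize missing the delay target heavily; otherwise minimize power.
        penalty = max(0.0, delay - delay_target) * 100.0
        return penalty + power / power_budget

    corners = ["ss_125C", "tt_25C", "ff_m55C"]
    best = None
    for bias, width in itertools.product([50, 100, 200, 400], [2, 4, 8, 16]):
        worst = max(cost(*fake_simulation(bias, width, c)) for c in corners)
        if best is None or worst < best[0]:
            best = (worst, bias, width)

    print(f"best worst-corner cost {best[0]:.2f} at bias={best[1]} uA, width={best[2]} um")
    ```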

  14. Converting Advances in Seismology into Earthquake Science

    NASA Astrophysics Data System (ADS)

    Hauksson, Egill; Shearer, Peter; Vidale, John

    2004-01-01

    Federal and state agencies and university groups all operate seismic networks in California. The U.S. Geological Survey (USGS) operates seismic networks in California in cooperation with the California Institute of Technology (Caltech) in southern California, and the University of California (UC) at Berkeley in northern California. The California Geological Survey (CGS) and the USGS National Strong Motion Program (NSMP) operate dial-out strong motion instruments in the state, primarily to capture data from large earthquakes for earthquake engineering and, more recently, emergency response. The California Governor's Office of Emergency Services (OES) provides leadership for the most recent project, the California Integrated Seismic Network (CISN), to integrate all of the California efforts, and to take advantage of the emergency response capabilities of the seismic networks. The core members of the CISN are Caltech, UC Berkeley, CGS, USGS Menlo Park, and USGS Pasadena (http://www.cisn.org). New seismic instrumentation is in place across southern California, and significant progress has been made in improving instrumentation in northern California. Since 2001, these new field instrumentation efforts, data sharing, and software development for real-time reporting and archiving have been coordinated through the California Integrated Seismic Network (CISN). The CISN is also the California region of the Advanced National Seismic Network (ANSS). In addition, EarthScope deployments of USArray that will begin in early 2004 in California are coordinated with the CISN. The southern and northern California earthquake data centers (SCEDC and NCEDC) have new capabilities that enable seismologists to obtain large volumes of data with only modest effort.

  15. Allele Identification for Transcriptome-Based Population Genomics in the Invasive Plant Centaurea solstitialis

    PubMed Central

    Dlugosch, Katrina M.; Lai, Zhao; Bonin, Aurélie; Hierro, José; Rieseberg, Loren H.

    2013-01-01

    Transcriptome sequences are becoming more broadly available for multiple individuals of the same species, providing opportunities to derive population genomic information from these datasets. Using the 454 Life Science Genome Sequencer FLX and FLX-Titanium next-generation platforms, we generated 11−430 Mbp of sequence for normalized cDNA for 40 wild genotypes of the invasive plant Centaurea solstitialis, yellow starthistle, from across its worldwide distribution. We examined the impact of sequencing effort on transcriptome recovery and overlap among individuals. To do this, we developed two novel publicly available software pipelines: SnoWhite for read cleaning before assembly, and AllelePipe for clustering of loci and allele identification in assembled datasets with or without a reference genome. AllelePipe is designed specifically for cases in which read depth information is not appropriate or available to assist with disentangling closely related paralogs from allelic variation, as in transcriptome or previously assembled libraries. We find that modest applications of sequencing effort recover most of the novel sequences present in the transcriptome of this species, including single-copy loci and a representative distribution of functional groups. In contrast, the coverage of variable sites, observation of heterozygosity, and overlap among different libraries are all highly dependent on sequencing effort. Nevertheless, the information gained from overlapping regions was informative regarding coarse population structure and variation across our small number of population samples, providing the first genetic evidence in support of hypothesized invasion scenarios. PMID:23390612

  16. Obesity and severe obesity forecasts through 2030.

    PubMed

    Finkelstein, Eric A; Khavjou, Olga A; Thompson, Hope; Trogdon, Justin G; Pan, Liping; Sherry, Bettylou; Dietz, William

    2012-06-01

    Previous efforts to forecast future trends in obesity applied linear forecasts assuming that the rise in obesity would continue unabated. However, evidence suggests that obesity prevalence may be leveling off. This study presents estimates of adult obesity and severe obesity prevalence through 2030 based on nonlinear regression models. The forecasted results are then used to simulate the savings that could be achieved through modestly successful obesity prevention efforts. The study was conducted in 2009-2010 and used data from the 1990 through 2008 Behavioral Risk Factor Surveillance System (BRFSS). The analysis sample included nonpregnant adults aged ≥ 18 years. The individual-level BRFSS variables were supplemented with state-level variables from the U.S. Bureau of Labor Statistics, the American Chamber of Commerce Research Association, and the Census of Retail Trade. Future obesity and severe obesity prevalence were estimated through regression modeling by projecting trends in explanatory variables expected to influence obesity prevalence. Linear time trend forecasts suggest that by 2030, 51% of the population will be obese. The model estimates a much lower obesity prevalence of 42% and severe obesity prevalence of 11%. If obesity were to remain at 2010 levels, the combined savings in medical expenditures over the next 2 decades would be $549.5 billion. The study estimates a 33% increase in obesity prevalence and a 130% increase in severe obesity prevalence over the next 2 decades. If these forecasts prove accurate, this will further hinder efforts for healthcare cost containment. Copyright © 2012 Elsevier Inc. All rights reserved.
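
    To make the contrast between the two forecasting approaches concrete, here is a small sketch using an invented prevalence series (not BRFSS data) that fits both a straight line and a saturating logistic curve and extrapolates each to 2030:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Invented illustrative prevalence series (fraction obese), NOT BRFSS data.
    years = np.arange(1990, 2009)
    rng = np.random.default_rng(0)
    prev = 0.42 / (1 + np.exp(-0.12 * (years - 2004))) + 0.01 * rng.standard_normal(years.size)

    # Linear extrapolation assumes the rise continues unabated.
    lin = np.polyfit(years, prev, 1)
    linear_2030 = np.polyval(lin, 2030)

    # A saturating logistic trend allows prevalence to level off.
    def logistic(t, ceiling, rate, midpoint):
        return ceiling / (1 + np.exp(-rate * (t - midpoint)))

    (ceiling, rate, midpoint), _ = curve_fit(logistic, years, prev, p0=[0.5, 0.1, 2005])
    logistic_2030 = logistic(2030, ceiling, rate, midpoint)

    print(f"linear forecast for 2030:   {linear_2030:.0%}")
    print(f"logistic forecast for 2030: {logistic_2030:.0%}")
    ```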

  17. The opportunities for and obstacles against prevention: the example of Germany in the areas of tobacco and alcohol

    PubMed Central

    2010-01-01

    Background Recent years have seen a growing research and policy interest in prevention in many developed countries. However, the actual efforts and resources devoted to prevention appear to have lagged well behind the lip service paid to the topic. Discussion We review the evidence on the considerable existing scope for health gains from prevention as well as for greater prevention policy efforts in Germany. We also discuss the barriers to "more and better" prevention and provide modest suggestions about how some of the obstacles could be overcome. Summary In Germany, there are substantial health gains to be reaped from the implementation of evidence-based, cost-effective preventive interventions and policies. Barriers to more prevention include social, historical, political, legal and economic factors. While there is sufficient evidence to scale up prevention efforts in some public health domains in Germany, in general there is a comparative shortage of research on non-clinical preventive interventions. Some of the existing barriers in Germany are at least in principle amenable to change, provided sufficient political will exists. More research on prevention by itself is no panacea, but could help facilitate more policy action. In particular, there is an economic efficiency-based case for public funding and promotion of research on non-clinical preventive interventions, in Germany and beyond, to confront the peculiar challenges that set this research apart from its clinical counterpart. PMID:20718995

  18. Bridging the Gap: Need for a Data Repository to Support Vaccine Prioritization Efforts*

    PubMed Central

    Madhavan, Guruprasad; Phelps, Charles; Sangha, Kinpritma; Levin, Scott; Rappuoli, Rino

    2015-01-01

    As the mechanisms for discovery, development, and delivery of new vaccines become increasingly complex, strategic planning and priority setting have become ever more crucial. Traditional single value metrics such as disease burden or cost-effectiveness no longer suffice to rank vaccine candidates for development. The Institute of Medicine—in collaboration with the National Academy of Engineering—has developed a novel software system to support vaccine prioritization efforts. The Strategic Multi-Attribute Ranking Tool for Vaccines—SMART Vaccines—allows decision makers to specify their own value structure, selecting from among 28 pre-defined and up to 7 user-defined attributes relevant to the ranking of vaccine candidates. Widespread use of SMART Vaccines will require compilation of a comprehensive data repository for numerous relevant populations—including their demographics, disease burdens and associated treatment costs, as well as characterizing performance features of potential or existing vaccines that might be created, improved, or deployed. While the software contains preloaded data for a modest number of populations, a large gap exists between the existing data and a comprehensive data repository necessary to make full use of SMART Vaccines. While some of these data exist in disparate sources and forms, constructing a data repository will require much new coordination and focus. Finding strategies to bridge the gap to a comprehensive data repository remains the most important task in bringing SMART Vaccines to full fruition, and to support strategic vaccine prioritization efforts in general. PMID:26022565

  19. Positioning Continuing Education Computer Programs for the Corporate Market.

    ERIC Educational Resources Information Center

    Tilney, Ceil

    1993-01-01

    Summarizes the findings of the market assessment phase of Bellevue Community College's evaluation of its continuing education computer training program. Indicates that marketing efforts must stress program quality and software training to help overcome strong antiacademic client sentiment. (MGB)

  20. Researching and Reducing the Health Burden of Stroke

    MedlinePlus

    ... the result of continuing research to map the brain and interface it with a computer to enable stroke patients to regain function. How important is the new effort to map the human brain? The brain is more complex than any computer ...

  1. Brain transcriptome atlases: a computational perspective.

    PubMed

    Mahfouz, Ahmed; Huisman, Sjoerd M H; Lelieveldt, Boudewijn P F; Reinders, Marcel J T

    2017-05-01

    The immense complexity of the mammalian brain is largely reflected in the underlying molecular signatures of its billions of cells. Brain transcriptome atlases provide valuable insights into gene expression patterns across different brain areas throughout the course of development. Such atlases allow researchers to probe the molecular mechanisms which define neuronal identities, neuroanatomy, and patterns of connectivity. Despite the immense effort put into generating such atlases, to answer fundamental questions in neuroscience, an even greater effort is needed to develop methods to probe the resulting high-dimensional multivariate data. We provide a comprehensive overview of the various computational methods used to analyze brain transcriptome atlases.

  2. Cloud Computing for Complex Performance Codes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Appel, Gordon John; Hadgu, Teklu; Klein, Brandon Thorin

    This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 was to demonstrate that complex codes, on several differently configured servers, could run and compute trivial small-scale problems in a commercial cloud infrastructure. Phase 2 focused on proving non-trivial large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.

  3. Space shuttle low cost/risk avionics study

    NASA Technical Reports Server (NTRS)

    1971-01-01

    All work breakdown structure elements containing any avionics-related effort were examined for pricing the life cycle costs. The analytical, testing, and integration efforts are included for the basic onboard avionics and electrical power systems. The design and procurement of special test equipment and maintenance and repair equipment are considered. Program management associated with these efforts is described. Flight test spares and labor and materials associated with the operations and maintenance of the avionics systems throughout the horizontal flight test are examined. It was determined that cost savings can be achieved by using existing hardware, maximizing orbiter-booster commonality, specifying new equipment to MIL quality standards, basing redundancy on cost-effectiveness analysis, minimizing software complexity and reducing cross strapping and computer-managed functions, utilizing compilers and floating point computers, and evolving the design as dictated by the horizontal flight test schedules.

  4. Distributed Accounting on the Grid

    NASA Technical Reports Server (NTRS)

    Thigpen, William; Hacker, Thomas J.; McGinnis, Laura F.; Athey, Brian D.

    2001-01-01

    By the late 1990s, the Internet was adequately equipped to move vast amounts of data between HPC (High Performance Computing) systems, and efforts were initiated to link the national infrastructure of high performance computational and data storage resources together into a general computational utility 'grid', analogous to the national electrical power grid infrastructure. The purpose of the Computational grid is to provide dependable, consistent, pervasive, and inexpensive access to computational resources for the computing community in the form of a computing utility. This paper presents a fully distributed view of Grid usage accounting and a methodology for allocating Grid computational resources for use on a Grid computing system.

  5. A Population-Based Study of Childhood Sexual Contact in China: Prevalence and Long-Term Consequences

    PubMed Central

    Luo, Ye; Parish, William L.; Laumann, Edward O.

    2008-01-01

    Objectives This study provides national estimates of the prevalence of childhood sexual contact and its association with sexual well-being and psychological distress among adults in China. Method A national stratified probability sample of 1,519 women and 1,475 men aged 20 to 64 years in urban China completed a computer-administered survey in 1999–2000. The data from this survey on both adult-to-child and peer-to-peer sexual contact before age 14 were subjected to descriptive and multivariate analyses that were adjusted for both sampling weights and sampling design. Results The overall prevalence of reported childhood sexual contact was 4.2%, with prevalence higher among men (5.1%) than among women (3.3%) and higher among those aged 20–29 years (8.3%). Childhood sexual contact was associated with multiplex consequences, including hyper-sexuality (high levels of masturbation, thoughts about sex, varieties of sexual practices, partner turnover), adult sexual victimization (unwanted sex, unwanted sexual acts, sexual harassment), sexual difficulties (genito-urinary symptoms, sexually transmitted infections, sexual dysfunctions), and psychological distress. Psychological distress was largely mediated by adult sexual victimization, sexual difficulties, and hyper-sexuality. Conclusions Despite the relatively modest prevalence of childhood sexual contact among Chinese adults, the association with multiplex adult outcomes suggests that much as in the West early sexual contact is a significant issue. Practice Implications The findings underscore the importance of public education about childhood sexual contact and abuse in China. The findings suggest a need for public health campaigns that tackle the stigma associated with being abused and encourage victims to report abusive behavior to proper sources. The findings are also consistent with new efforts to alleviate the negative long-term impact of childhood sexual abuse. PMID:18614231

  6. Institutional delivery in India, 2004-14: unravelling the equity-enhancing contributions of the public sector.

    PubMed

    Joe, William; Perkins, Jessica M; Kumar, Saroj; Rajpal, Sunil; Subramanian, S V

    2018-06-01

    To achieve faster and equitable improvements in maternal and child health outcomes, the government of India launched the National Rural Health Mission in 2005. This paper describes the equity-enhancing role of the public sector in increasing use of institutional delivery care services in India between 2004 and 2014. Information on 24 661 births from nationally representative survey data for 2004 and 2014 is analysed. Concentration index is computed to describe socioeconomic-rank-related relative inequalities in institutional delivery and decomposition is used to assess the contributions of public and private sectors in overall socioeconomic inequality. Multilevel logistic regression is applied to examine the changes in socioeconomic gradient between 2004 and 2014. The analysis finds that utilization of institutional delivery care in India increased from 43% in 2004 to 83% in 2014. The bulk of the increase was in public sector use (21% in 2004 to 53% in 2014) with a modest increase in private sector use (22% in 2004 to 30% in 2014). The shift from a pro-rich to pro-poor distribution of public sector use is confirmed. Decomposition analysis indicates that 51% of these reductions in socioeconomic inequality are associated with improved pro-poor distribution of public sector births. Multilevel logistic regressions confirm the disappearance of a wealth-based gradient in public sector births between 2004 and 2014. We conclude that public health investments in India have significantly contributed towards an equitable increase in the coverage of institutional delivery care. Sustained policy efforts are necessary, however, with an emphasis on education, sociocultural and geographical factors to ensure universal coverage of institutional delivery care services in India.
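
    A minimal sketch of the concentration index under the usual covariance formula, CI = 2*cov(h, r)/mean(h), where h is use of institutional delivery and r is the fractional socioeconomic rank; the data below are invented for illustration:

    ```python
    import numpy as np

    def concentration_index(outcome, ses_score):
        """CI = 2 * cov(h, r) / mean(h), with r the fractional rank by SES."""
        order = np.argsort(ses_score)                # poorest to richest
        h = np.asarray(outcome, dtype=float)[order]
        n = h.size
        r = (np.arange(1, n + 1) - 0.5) / n          # fractional socioeconomic rank
        return 2.0 * np.cov(h, r, bias=True)[0, 1] / h.mean()

    # Invented toy data: 1 = institutional delivery, more likely at higher wealth.
    rng = np.random.default_rng(3)
    wealth = rng.random(1000)
    delivery = (rng.random(1000) < 0.3 + 0.5 * wealth).astype(int)

    print(f"concentration index: {concentration_index(delivery, wealth):.3f}")
    # Positive CI: use concentrated among the better-off; negative CI: pro-poor.
    ```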

  7. A population-based study of childhood sexual contact in China: prevalence and long-term consequences.

    PubMed

    Luo, Ye; Parish, William L; Laumann, Edward O

    2008-07-01

    This study provides national estimates of the prevalence of childhood sexual contact and its association with sexual well-being and psychological distress among adults in China. A national stratified probability sample of 1,519 women and 1,475 men aged 20-64 years in urban China completed a computer-administered survey in 1999-2000. The data from this survey on both adult-to-child and peer-to-peer sexual contact before age 14 were subjected to descriptive and multivariate analyses that were adjusted for both sampling weights and sampling design. The overall prevalence of reported childhood sexual contact was 4.2%, with prevalence higher among men (5.1%) than among women (3.3%) and higher among those aged 20-29 years (8.3%). Childhood sexual contact was associated with multiplex consequences, including hyper-sexuality (high levels of masturbation, thoughts about sex, varieties of sexual practices, partner turnover), adult sexual victimization (unwanted sex, unwanted sexual acts, sexual harassment), sexual difficulties (genito-urinary symptoms, sexually transmitted infections, sexual dysfunctions), and psychological distress. Psychological distress was largely mediated by adult sexual victimization, sexual difficulties, and hyper-sexuality. Despite the relatively modest prevalence of childhood sexual contact among Chinese adults, the association with multiplex adult outcomes suggests that much as in the West early sexual contact is a significant issue. The findings underscore the importance of public education about childhood sexual contact and abuse in China. The findings suggest a need for public health campaigns that tackle the stigma associated with being abused and encourage victims to report abusive behavior to proper sources. The findings are also consistent with new efforts to alleviate the negative long-term impact of childhood sexual abuse.

  8. Airline Safety and Economy

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This video documents efforts at NASA Langley Research Center to improve safety and economy in aircraft. Featured are the cockpit weather information needs computer system, which relays real time weather information to the pilot, and efforts to improve techniques to detect structural flaws and corrosion, such as the thermal bond inspection system.

  9. MUMPS Based Integration of Disparate Computer-Assisted Medical Diagnosis Modules

    DTIC Science & Technology

    1989-12-12

    ...the Abdominal and Chest Pain modules use a Bayesian approach, while the Ophthalmology module uses a Rule Based approach. In the current effort, MUMPS is used to develop an...

  10. Interactive Electronic Storybooks for Kindergartners to Promote Vocabulary Growth

    ERIC Educational Resources Information Center

    Smeets, Daisy J. H.; Bus, Adriana G.

    2012-01-01

    The goals of this study were to examine (a) whether extratextual vocabulary instructions embedded in electronic storybooks facilitated word learning over reading alone and (b) whether instructional formats that required children to invest more effort were more effective than formats that required less effort. A computer-based "assistant" was added…

  11. Using the base-of-the-pyramid perspective to catalyze interdependence-based collaborations

    PubMed Central

    London, Ted; Anupindi, Ravi

    2012-01-01

    Improving food security and nutrition in the developing world remains among society's most intractable challenges and continues despite a wide variety of investments. Both donor- and enterprise-led initiatives, for example, have explored including smallholder farmers in their value chains. However, these efforts have had only modest success, partly because the private and development sectors prefer to maintain their independence. Research from the base-of-the-pyramid domain offers new insights into how collaborative interdependence between sectors can enhance the connection between profits and the alleviation of poverty. In this article, we identify the strengths and weaknesses of donor-led and enterprise-led value chain initiatives. We then explore how insights from the base-of-the-pyramid domain yield a set of interdependence-based collaboration strategies that can achieve more sustainable and scalable outcomes. PMID:21482752

  12. The Importance of Stochastic Effects for Explaining Entrainment in the Zebrafish Circadian Clock.

    PubMed

    Heussen, Raphaela; Whitmore, David

    2015-01-01

    The circadian clock plays a pivotal role in modulating physiological processes and has been implicated, either directly or indirectly, in a range of pathological states including cancer. Here we investigate how the circadian clock is entrained by external cues such as light. Working with zebrafish cell lines and combining light pulse experiments with simulation efforts focused on the role of synchronization effects, we find that even very modest doses of light exposure are sufficient to trigger some entrainment, whereby a higher light intensity or duration correlates with strength of the circadian signal. Moreover, we observe in the simulations that stochastic effects may be considered an essential feature of the circadian clock in order to explain the circadian signal decay in prolonged darkness, as well as light initiated resynchronization as a strong component of entrainment.
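
    A toy sketch of the synchronization argument: a population of noisy circadian phase oscillators desynchronizes in prolonged darkness (the population-level signal decays), and a brief light pulse that nudges phases toward a common value partially restores it. Parameters are arbitrary, not fitted to the zebrafish data:

    ```python
    import numpy as np

    # Population of noisy ~24 h phase oscillators; synchrony across cells is
    # measured with the order parameter R = |mean(exp(i*phase))|, which stands
    # in for the amplitude of the population-level circadian signal.
    rng = np.random.default_rng(7)
    n_cells, hours = 500, 240
    period = 24.0 + 0.5 * rng.standard_normal(n_cells)   # cell-to-cell period spread
    phase = np.zeros(n_cells)

    sync = []
    for h in range(hours):
        phase += 2 * np.pi / period + 0.05 * rng.standard_normal(n_cells)  # drift + noise
        if h == 120:                       # brief light pulse after 5 days of darkness
            phase -= 0.8 * np.sin(phase)   # nudges phases toward a common value
        sync.append(abs(np.exp(1j * phase).mean()))

    print(f"synchrony just before the pulse:         {sync[119]:.2f}")
    print(f"synchrony just after the pulse:          {sync[121]:.2f}")
    print(f"synchrony after 5 more days of darkness: {sync[-1]:.2f}")
    ```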

  13. Advanced components for spaceborne infrared astronomy

    NASA Technical Reports Server (NTRS)

    Davidson, A. W.

    1984-01-01

    The need for improved cryogenic components to be used in future spaceborne infrared astronomy missions was identified. Improved low noise cryogenic amplifiers operated with infrared detectors, and better cryogenic actuators and motors with extremely low power dissipation are needed. The feasibility of achieving technological breakthroughs in both of these areas was studied. An improved silicon junction field effect transistor (JFET) could be developed if: (1) high purity silicon; (2) optimum dopants; and (3) very high doping levels are used. The feasibility of a simple stepper motor equipped with superconducting coils is demonstrated by construction of such a device based on a standard commercial motor. It is found that useful levels of torque at immeasurably low power levels were achieved. It is concluded that with modest development and optimization efforts, significant performance gains are possible for both cryogenic preamplifiers and superconducting motors and actuators.

  14. Opportunities for improving legislative public health policy in Rhode Island through evidence-based education.

    PubMed

    Bourdeau, Moise; Winter, Ronald; Marshall, Robert

    2013-10-01

    The Rhode Island General Assembly considers nearly 3000 bills yearly--spanning the entire range of issues related to state government and legislative policy. This review analyzes the modest number of 40 "health-related" bills introduced during the 2009 session. It is often not clear to what extent these proposals consistently received analysis by both informed and independent organizations or experts regarding their "evidence-based" foundations. Only 25 of these bills received a committee hearing and eventually became law. Hence, there may be a reasonable opportunity for expert, non-partisan organizations to provide the General Assembly with information related to proposed legislation on a routine or "as requested" basis. This study provides a systematic analysis of this degree of effort based on data regarding health-related legislation proposed during the 2009 session of the RI General Assembly.

  15. Using the base-of-the-pyramid perspective to catalyze interdependence-based collaborations.

    PubMed

    London, Ted; Anupindi, Ravi

    2012-07-31

    Improving food security and nutrition in the developing world remains among society's most intractable challenges and continues despite a wide variety of investments. Both donor- and enterprise-led initiatives, for example, have explored including smallholder farmers in their value chains. However, these efforts have had only modest success, partly because the private and development sectors prefer to maintain their independence. Research from the base-of-the-pyramid domain offers new insights into how collaborative interdependence between sectors can enhance the connection between profits and the alleviation of poverty. In this article, we identify the strengths and weaknesses of donor-led and enterprise-led value chain initiatives. We then explore how insights from the base-of-the-pyramid domain yield a set of interdependence-based collaboration strategies that can achieve more sustainable and scalable outcomes.

  16. Emerging markets for satellite data communications in the public service

    NASA Technical Reports Server (NTRS)

    Potter, J. G.

    1978-01-01

    The paper discusses some of the current and potential markets for satellite data communications as projected by the Public Service Satellite Consortium (PSSC). Organizations in the public service sector are divided into three categories, depending on their expected benefits and organizational changes due to increased satellite telecommunications use: A - modest institutional adjustments are necessary and significant productivity gains are likely; B - institutional requirements picture is promising, but more information is needed to assess benefits and risk; and C - major institutional adjustments are needed, risks are high but possible benefits are high. These criteria are applied to the U.S. health care system, continuing education, equipment maintenance, libraries, environmental monitoring, and other potential markets. The potential revenues are seen to be significant, but what is needed is a cooperative effort by common carriers and major public service institutions to aggregate the market.

  17. The Andrea Levialdi Fellowship

    NASA Astrophysics Data System (ADS)

    Fieschi, Roberto

    My first encounter with Cuba dates back to the winter of 1967-1968, at the Cultural Congress of Havana, a very large international event to promote greater understanding of the reality of the Cuban Revolution. In fact, the person invited was my friend and colleague Andrea Levialdi (Andrea already knew Cuba and loved it) who, unable to participate, allowed me to go in his place. So I landed at the airport of the "first free country in Latin America" with the delegation of the Italian Communist Party. In Havana I met other Italian physicists whom I already knew, among them Bruno Vitale and Daniele Amati. They, like me, were embarrassed by the generous hospitality of the 'Havana Libre,' especially in a country which was going through such difficulties. Despite our best efforts, we did not succeed in receiving a more modest welcome.

  18. Rethinking construction: inclusion of slow learners as taker-off in quantity surveying practice

    NASA Astrophysics Data System (ADS)

    Majid, Masidah Abdul; Ashaari, Norul Izzati M.; @ Suhana Kamarudin Nurul Aini Osman, Suhaida; Suhaimi, Mohamad Saifulnizam Mohd

    2017-11-01

    The objective of this paper is to present preliminary findings regarding the participation of OKU (persons with disabilities) with learning disabilities in the Science, Technology, Engineering and Mathematics (STEM) sectors. A review of past research suggests that OKU are a potential workforce in the STEM sectors but remain under-represented, owing to a lack of effort from stakeholders and learning institutions in providing information on the opportunities that are available. A research project has been initiated to explore the potential of slow learners to join the construction industry workforce as taker-offs, part of the work of a Quantity Surveyor. Against the findings from the literature review, this modest attempt to attract slow learners to become taker-offs in the construction industry requires the formulation of an appropriate learning environment and strong support from the respective key players and stakeholders.

  19. Cloud Computing for Pharmacometrics: Using AWS, NONMEM, PsN, Grid Engine, and Sonic

    PubMed Central

    Sanduja, S; Jewell, P; Aron, E; Pharai, N

    2015-01-01

    Cloud computing allows pharmacometricians to access advanced hardware, network, and security resources available to expedite analysis and reporting. Cloud-based computing environments are available at a fraction of the time and effort when compared to traditional local datacenter-based solutions. This tutorial explains how to get started with building your own personal cloud computer cluster using Amazon Web Services (AWS), NONMEM, PsN, Grid Engine, and Sonic. PMID:26451333

  20. Cloud Computing for Pharmacometrics: Using AWS, NONMEM, PsN, Grid Engine, and Sonic.

    PubMed

    Sanduja, S; Jewell, P; Aron, E; Pharai, N

    2015-09-01

    Cloud computing allows pharmacometricians to access advanced hardware, network, and security resources available to expedite analysis and reporting. Cloud-based computing environments are available at a fraction of the time and effort when compared to traditional local datacenter-based solutions. This tutorial explains how to get started with building your own personal cloud computer cluster using Amazon Web Services (AWS), NONMEM, PsN, Grid Engine, and Sonic.
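
    As a rough illustration of the kind of setup such a tutorial walks through, the sketch below uses Python with boto3 to launch a few EC2 instances that could serve as grid worker nodes. The AMI ID, key-pair name, and instance sizing are hypothetical placeholders, and the NONMEM, PsN, Grid Engine, and Sonic configuration covered by the tutorial is not shown here.

      # Minimal sketch (assumptions noted above): launch worker nodes on AWS EC2 with boto3.
      import boto3

      ec2 = boto3.resource("ec2", region_name="us-east-1")

      instances = ec2.create_instances(
          ImageId="ami-0123456789abcdef0",   # hypothetical AMI with NONMEM/PsN preinstalled
          InstanceType="c5.xlarge",          # illustrative instance type
          MinCount=1,
          MaxCount=4,                        # request up to four grid nodes
          KeyName="my-keypair",              # hypothetical SSH key pair
          TagSpecifications=[{
              "ResourceType": "instance",
              "Tags": [{"Key": "Role", "Value": "grid-engine-node"}],
          }],
      )

      for inst in instances:
          print("launched", inst.id)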

  1. Challenging the Myth of Disability.

    ERIC Educational Resources Information Center

    Brightman, Alan

    1989-01-01

    Discussion of the rhetoric of disability, including physical, hearing, and visual impairments, highlights possible benefits that computer technology can provide. Designing for disabled individuals is discussed, and product development efforts by Apple Computer to increase microcomputer access to disabled children and adults are described. (LRW)

  2. Physical-depth architectural requirements for generating universal photonic cluster states

    NASA Astrophysics Data System (ADS)

    Morley-Short, Sam; Bartolucci, Sara; Gimeno-Segovia, Mercedes; Shadbolt, Pete; Cable, Hugo; Rudolph, Terry

    2018-01-01

    Most leading proposals for linear-optical quantum computing (LOQC) use cluster states, which act as a universal resource for measurement-based (one-way) quantum computation. In ballistic approaches to LOQC, cluster states are generated passively from small entangled resource states using so-called fusion operations. Results from percolation theory have previously been used to argue that universal cluster states can be generated in the ballistic approach using schemes which exceed the critical threshold for percolation, but these results consider cluster states with unbounded size. Here we consider how successful percolation can be maintained using a physical architecture with fixed physical depth, assuming that the cluster state is continuously generated and measured, and therefore that only a finite portion of it is visible at any one point in time. We show that universal LOQC can be implemented using a constant-size device with modest physical depth, and that percolation can be exploited using simple pathfinding strategies without the need for high-complexity algorithms.
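
    The percolation-plus-pathfinding idea can be pictured with a toy model. The Python sketch below is not the authors' architecture or software; it keeps each bond of a finite-depth grid with probability p and uses breadth-first search as the "simple pathfinding strategy" to look for a left-to-right crossing path. The grid size and value of p are arbitrary illustrative choices.

      import random
      from collections import deque

      def crossing_path(width, depth, p, seed=0):
          """Toy bond percolation on a width x depth grid of sites: keep each
          nearest-neighbour bond with probability p, then BFS from the left
          column to the right column.  Returns a crossing path of (row, col)
          sites, or None if no path exists."""
          rng = random.Random(seed)
          open_bond = {}
          for r in range(width):
              for c in range(depth):
                  if c + 1 < depth:
                      open_bond[((r, c), (r, c + 1))] = rng.random() < p
                  if r + 1 < width:
                      open_bond[((r, c), (r + 1, c))] = rng.random() < p

          def linked(a, b):
              # Bonds are stored in one canonical orientation only.
              return open_bond.get((a, b), False) or open_bond.get((b, a), False)

          start = [(r, 0) for r in range(width)]
          prev = {s: None for s in start}     # also serves as the visited set
          queue = deque(start)
          while queue:
              site = queue.popleft()
              r, c = site
              if c == depth - 1:              # reached the right-hand column
                  path = []
                  while site is not None:
                      path.append(site)
                      site = prev[site]
                  return path[::-1]
              for nb in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                  if (0 <= nb[0] < width and 0 <= nb[1] < depth
                          and nb not in prev and linked(site, nb)):
                      prev[nb] = site
                      queue.append(nb)
          return None

      # Above the 2D bond-percolation threshold (p = 0.5) a crossing path is likely.
      print(crossing_path(width=20, depth=8, p=0.6))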

  3. Chemically frozen multicomponent boundary layer theory of salt and/or ash deposition rates from combustion gases

    NASA Technical Reports Server (NTRS)

    Rosner, D. E.; Chen, B.-K.; Fryburg, G. C.; Kohl, F. J.

    1979-01-01

    There is increased interest in, and concern about, deposition and corrosion phenomena in combustion systems containing inorganic condensible vapors and particles (salts, ash). To meet the need for a computationally tractable deposition rate theory general enough to embrace multielement/component situations of current and future gas turbine and magnetogasdynamic interest, a multicomponent chemically 'frozen' boundary layer (CFBL) deposition theory is presented and its applicability to the special case of Na2SO4 deposition from seeded laboratory burner combustion products is demonstrated. The coupled effects of Fick (concentration) diffusion and Soret (thermal) diffusion are included, along with explicit corrections for effects of variable properties and free stream turbulence. The present formulation is sufficiently general to include the transport of particles provided they are small enough to be formally treated as heavy molecules. Quantitative criteria developed to delineate the domain of validity of CFBL-rate theory suggest considerable practical promise for the present framework, which is characterized by relatively modest demands for new input information and computer time.
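
    For orientation, a common textbook expression combining the two diffusion mechanisms for a dilute species i is sketched below in LaTeX; this is an illustrative standard form and not necessarily the exact expression used in the CFBL formulation.

      % Illustrative dilute-species diffusive mass flux with Fick and Soret terms
      % (standard textbook form; not necessarily the exact CFBL expression).
      \mathbf{j}_i \;=\; -\,\rho\, D_i \left( \nabla \omega_i \;+\; \alpha_{T,i}\, \omega_i\, \nabla \ln T \right)

    Here the first term is Fick (concentration) diffusion and the second is Soret (thermal) diffusion; the symbols (mixture density rho, Fick diffusivity D_i, mass fraction omega_i, thermal diffusion factor alpha_{T,i}) are standard notation introduced here only for illustration.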

  4. Hierarchical surface code for network quantum computing with modules of arbitrary size

    NASA Astrophysics Data System (ADS)

    Li, Ying; Benjamin, Simon C.

    2016-10-01

    The network paradigm for quantum computing involves interconnecting many modules to form a scalable machine. Typically it is assumed that the links between modules are prone to noise while operations within modules have a significantly higher fidelity. To optimize fault tolerance in such architectures we introduce a hierarchical generalization of the surface code: a small "patch" of the code exists within each module and constitutes a single effective qubit of the logic-level surface code. Errors primarily occur in a two-dimensional subspace, i.e., patch perimeters extruded over time, and the resulting noise threshold for intermodule links can exceed ~10% even in the absence of purification. Increasing the number of qubits within each module decreases the number of qubits necessary for encoding a logical qubit. But this advantage is relatively modest, and broadly speaking, a "fine-grained" network of small modules containing only about eight qubits is competitive in total qubit count versus a "coarse" network with modules containing many hundreds of qubits.

  5. Equatorially trapped convection in a rapidly rotating shallow shell

    NASA Astrophysics Data System (ADS)

    Miquel, Benjamin; Xie, Jin-Han; Featherstone, Nicholas; Julien, Keith; Knobloch, Edgar

    2018-05-01

    Motivated by the recent discovery of subsurface oceans on planetary moons and the interest they have generated, we explore convective flows in shallow spherical shells of dimensionless gap width ε² ≪ 1 in the rapid rotation limit E ≪ 1, where E is the Ekman number. We employ direct numerical simulation (DNS) of the Boussinesq equations to compute the local heat flux Nu(λ) as a function of the latitude λ and use the results to characterize the trapping of convection at low latitudes, around the equator. We show that these results are quantitatively reproduced by an asymptotically exact nonhydrostatic equatorial β-plane convection model at a much more modest computational cost than DNS. We identify the trapping parameter β = ε E⁻¹ as the key parameter that controls the vigor and latitudinal extent of convection for moderate thermal forcing when E ~ ε and ε ↓ 0. This model provides a theoretical paradigm for nonlinear investigations.
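
    As a purely numerical illustration of how the trapping parameter is formed (the values below are arbitrary and are not taken from the cited study):

      % Illustrative values only; not from the cited study.
      \beta \;=\; \varepsilon\, E^{-1} \;=\; \frac{10^{-2}}{10^{-4}} \;=\; 10^{2},
      \qquad \varepsilon = 10^{-2},\; E = 10^{-4}.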

  6. A computational search for lipases that can preferentially hydrolyze long-chain omega-3 fatty acids from fish oil triacylglycerols.

    PubMed

    Kamal, Md Zahid; Barrow, Colin J; Rao, Nalam Madhusudhana

    2015-04-15

    Consumption of long-chain omega-3 fatty acids is known to decrease the risk of major cardiovascular events. Lipases, a class of triacylglycerol hydrolases, have been extensively tested to concentrate omega-3 fatty acids from fish oils under mild enzymatic conditions. However, no lipases with selectivity for omega-3 fatty acids have yet been discovered or developed. In this study we performed an exhaustive computational study of substrate-lipase interactions by docking, both covalent and non-covalent, for 38 lipases with a large number of structured triacylglycerols containing omega-3 fatty acids. We identified some lipases with the potential to preferentially hydrolyze omega-3 fatty acids from structured triacylglycerols. However, the omega-3 fatty acid preferences were found to be modest. Our study provides an explanation for the absence of reports of lipases with omega-3 fatty acid hydrolyzing ability and suggests methods for developing such selective lipases. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Regioselectivity of intermolecular Pauson-Khand reaction of aliphatic alkynes: experimental and theoretical study of the effect of alkyne polarization.

    PubMed

    Fager-Jokela, Erika; Muuronen, Mikko; Khaizourane, Héléa; Vázquez-Romero, Ana; Verdaguer, Xavier; Riera, Antoni; Helaja, Juho

    2014-11-21

    The generally poor electronic regioselectivity of alkyne insertion in the intermolecular Pauson-Khand reaction (PKR) has severely restricted its synthetic applications. In our previous rational study concerning diarylalkynes (Fager-Jokela, E.; Muuronen, M.; Patzschke, M.; Helaja, J. J. Org. Chem. 2012, 77, 9134-9147), both experimental and theoretical results indicated that purely electronic factors, i.e., alkyne polarization via the resonance effect, induced the observed modest regioselectivity. In the present work, we substantiate that alkyne polarization via the inductive effect can result in notable, synthetically valuable regioselectivity. A computational study at the DFT level was performed to disclose the electronic origin of the selectivity. Overall, the NBO charges of the alkynes correlated qualitatively with the regioisomer outcome. In a detailed computational PKR case study, the obtained Boltzmann distributions of the transition state (TS) populations correlate closely with the experimental regioselectivity. Analysis of the TS structures revealed that weak interactions, e.g., hydrogen bonding and steric repulsion, affect the regioselectivity and can easily override the electronic guidance.
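
    The Boltzmann weighting of competing transition states mentioned above reduces to a short calculation. The Python sketch below converts hypothetical relative TS free energies into predicted regioisomer percentages; the energy values are illustrative placeholders, not results from the paper.

      import math

      # Hypothetical relative transition-state free energies in kcal/mol for two
      # regioisomeric pathways; illustrative numbers, not values from the paper.
      delta_g = {"regioisomer A TS": 0.0, "regioisomer B TS": 1.2}

      R = 1.987204e-3   # gas constant in kcal/(mol*K)
      T = 298.15        # temperature in K

      # Boltzmann weighting of the TS populations predicts the regioisomer ratio.
      weights = {name: math.exp(-g / (R * T)) for name, g in delta_g.items()}
      total = sum(weights.values())
      for name, w in weights.items():
          print(f"{name}: {100 * w / total:.1f}%")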

  8. Brief History of Computer-Assisted Instruction at the Institute for Mathematical Studies in the Social Sciences.

    ERIC Educational Resources Information Center

    Stanford Univ., CA. Inst. for Mathematical Studies in Social Science.

    In 1963, the Institute began a program of research and development in computer-assisted instruction (CAI). Their efforts have been funded at various times by the Carnegie Corporation of New York, the National Science Foundation, and the United States Office of Education. Starting with a medium-sized computer and six student stations, the Institute…

  9. Efficient Computational Prototyping of Mixed Technology Microfluidic Components and Systems

    DTIC Science & Technology

    2002-08-01

    AFRL-IF-RS-TR-2002-190, Final Technical Report, August 2002. Title: Efficient Computational Prototyping of Mixed Technology Microfluidic Components and Systems. Author(s): Narayan R. Aluru, Jacob White... Computer-Aided Design (CAD) tools for microfluidic components and systems were developed in this effort. Innovative numerical methods and algorithms for mixed...

  10. A Case Study on Collective Cognition and Operation in Team-Based Computer Game Design by Middle-School Children

    ERIC Educational Resources Information Center

    Ke, Fengfeng; Im, Tami

    2014-01-01

    This case study examined team-based computer-game design efforts by children with diverse abilities to explore the nature of their collective design actions and cognitive processes. Ten teams of middle-school children, with a high percentage of minority students, participated in a 6-week, computer-assisted math-game-design program. Essential…

  11. Computer Science Lesson Study: Building Computing Skills among Elementary School Teachers

    ERIC Educational Resources Information Center

    Newman, Thomas R.

    2017-01-01

    The lack of diversity in the technology workforce in the United States has proven to be a stubborn problem, resisting even the most well-funded reform efforts. With the absence of computer science education in the mainstream K-12 curriculum, only a narrow band of students in public schools go on to careers in technology. The problem persists…

  12. Using Computer Games to Train Information Warfare Teams

    DTIC Science & Technology

    2004-01-01

    Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC) 2004, Paper No. 1729. Using Computer Games to... responses they will experience on real missions is crucial. 3D computer games have proved themselves to be highly effective in engaging players... motivationally and emotionally. This effort, therefore, uses gaming technology to provide realistic simulations. These games are augmented with...

  13. High-Performance Computing: High-Speed Computer Networks in the United States, Europe, and Japan. Report to Congressional Requesters.

    ERIC Educational Resources Information Center

    General Accounting Office, Washington, DC. Information Management and Technology Div.

    This report was prepared in response to a request from the Senate Committee on Commerce, Science, and Transportation, and from the House Committee on Science, Space, and Technology, for information on efforts to develop high-speed computer networks in the United States, Europe (limited to France, Germany, Italy, the Netherlands, and the United…

  14. Terrestrial implications of mathematical modeling developed for space biomedical research

    NASA Technical Reports Server (NTRS)

    Lujan, Barbara F.; White, Ronald J.; Leonard, Joel I.; Srinivasan, R. Srini

    1988-01-01

    This paper summarizes several related research projects supported by NASA which seek to apply computer models to space medicine and physiology. These efforts span a wide range of activities, including mathematical models used for computer simulations of physiological control systems; power spectral analysis of physiological signals; pattern recognition models for detection of disease processes; and computer-aided diagnosis programs.
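
    One of the activities listed above, power spectral analysis of physiological signals, can be pictured with a brief sketch. The Python example below applies Welch's method from SciPy to a synthetic signal; the signal and parameters are illustrative assumptions, not data or code from the NASA projects described.

      import numpy as np
      from scipy.signal import welch

      # Synthetic signal standing in for a physiological recording: a 0.1 Hz
      # oscillation plus noise (illustrative only, not data from the NASA projects).
      rng = np.random.default_rng(0)
      fs = 4.0                                  # sampling rate, Hz
      t = np.arange(0, 600, 1 / fs)             # ten minutes of samples
      x = np.sin(2 * np.pi * 0.1 * t) + 0.5 * rng.standard_normal(t.size)

      # Welch's method averages windowed segments, trading frequency resolution
      # for a lower-variance power spectral density estimate.
      f, pxx = welch(x, fs=fs, nperseg=512)
      print("dominant frequency (Hz):", f[np.argmax(pxx)])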

  15. Writing. A Research-Based Writing Program for Students with High Access to Computers. ACOT Report #2.

    ERIC Educational Resources Information Center

    Hiebert, Elfrieda H.; And Others

    This report summarizes the curriculum development and research effort that took place at the Cupertino Apple Classrooms of Tomorrow (ACOT) site from January through June 1987. Based on the premise that computers make revising and editing much easier, the four major objectives emphasized by the computer-intensive writing program are fluency,…

  16. Small Computer Applications for Base Supply.

    DTIC Science & Technology

    1984-03-01

    ...research on small computer utilization at base-level organizations. This research effort studies whether small computers and commercial software can assist...

  17. Technology for Kids' Desktops: How One School Brought Its Computers Out of the Lab and into Classrooms.

    ERIC Educational Resources Information Center

    Bozzone, Meg A.

    1997-01-01

    Purchasing custom-made desks with durable glass tops to house computers and double as student work space solved the problem of how to squeeze in additional classroom computers at Johnson Park Elementary School in Princeton, New Jersey. This article describes a K-5 grade school's efforts to overcome barriers to integrating technology. (PEN)

  18. Twenty Years of Girls into Computing Days: Has It Been Worth the Effort?

    ERIC Educational Resources Information Center

    Craig, Annemieke; Lang, Catherine; Fisher, Julie

    2008-01-01

    The first documented day-long program to encourage girls to consider computing as a career was held in 1987 in the U.K. Over the last 20 years these one-day events, labeled "Girls into Computing" days, have been conducted by academics and professionals to foster female-student interest in information technology (IT) degrees and careers.…

  19. Wusor II: A Computer Aided Instruction Program with Student Modelling Capabilities. AI Memo 417.

    ERIC Educational Resources Information Center

    Carr, Brian

    Wusor II is the second intelligent computer aided instruction (ICAI) program that has been developed to monitor the progress of, and offer suggestions to, students playing Wumpus, a computer game designed to teach logical thinking and problem solving. From the earlier efforts with Wusor I, it was possible to produce a rule-based expert which…

  20. Milestones toward Majorana-based quantum computing

    NASA Astrophysics Data System (ADS)

    Alicea, Jason

    Experiments on nanowire-based Majorana platforms now appear poised to move beyond the preliminary problem of zero-mode detection and towards loftier goals of realizing non-Abelian statistics and quantum information applications. Using an approach that synthesizes recent materials growth breakthroughs with tools long successfully deployed in quantum-dot research, I will outline a number of relatively modest milestones that progressively bridge the gap between the current state of the art and these grand longer-term challenges. The intermediate Majorana experiments surveyed in this talk should be broadly adaptable to other approaches as well. Supported by the National Science Foundation (DMR-1341822), Institute for Quantum Information and Matter, and Walter Burke Institute at Caltech.
