Missed deadline notification in best-effort schedulers
NASA Astrophysics Data System (ADS)
Banachowski, Scott A.; Wu, Joel; Brandt, Scott A.
2003-12-01
It is common to run multimedia and other periodic, soft real-time applications on general-purpose computer systems. These systems use best-effort scheduling algorithms that cannot guarantee applications will receive responsive scheduling to meet deadline or timing requirements. We present a simple mechanism called Missed Deadline Notification (MDN) that allows applications to notify the system when they do not receive their desired level of responsiveness. Consisting of a single system call with no arguments, this simple interface allows the operating system to provide better support for soft real-time applications without any a priori information about their timing or resource needs. We implemented MDN in three different schedulers: Linux, BEST, and BeRate. We describe these implementations and their performance when running real-time applications and discuss policies to prevent applications from abusing MDN to gain extra resources.
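The notification mechanism can be sketched in user space. The following toy proportional-share scheduler is an illustrative assumption, not the paper's kernel implementation: a task issues a notify call carrying no information beyond the caller's identity, and the scheduler boosts that task's share, with a simple rate-limit policy of the kind the paper discusses for preventing abuse.

```python
class BestEffortScheduler:
    """Toy proportional-share scheduler with Missed Deadline Notification.

    Sketch only; the paper's MDN is an argument-free kernel system call,
    and these class/method names are invented for the example.
    """

    def __init__(self, boost=2.0, max_boosts_per_window=3):
        self.shares = {}        # task id -> scheduling share
        self.boosts_used = {}   # task id -> boosts granted this window
        self.boost = boost
        self.max_boosts = max_boosts_per_window

    def add_task(self, tid, share=1.0):
        self.shares[tid] = share
        self.boosts_used[tid] = 0

    def notify_missed_deadline(self, tid):
        """The MDN call: no arguments beyond the caller's identity."""
        if self.boosts_used[tid] >= self.max_boosts:
            return False        # abuse-prevention policy: ignore the request
        self.boosts_used[tid] += 1
        self.shares[tid] *= self.boost
        return True

    def end_window(self):
        """Periodically decay boosted shares and reset abuse counters."""
        for tid in self.shares:
            self.shares[tid] = max(1.0, self.shares[tid] / self.boost)
            self.boosts_used[tid] = 0


sched = BestEffortScheduler()
sched.add_task("video", share=1.0)
sched.notify_missed_deadline("video")   # share doubles to 2.0
```

The design point mirrors the abstract: the interface requires no a priori timing or resource information, so all policy lives on the scheduler side.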
Double absorbing boundaries for finite-difference time-domain electromagnetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
LaGrone, John, E-mail: jlagrone@smu.edu; Hagstrom, Thomas, E-mail: thagstrom@smu.edu
We describe the implementation of optimal local radiation boundary condition sequences for second order finite difference approximations to Maxwell's equations and the scalar wave equation using the double absorbing boundary formulation. Numerical experiments are presented which demonstrate that the design accuracy of the boundary conditions is achieved and, for comparable effort, exceeds that of a convolution perfectly matched layer with reasonably chosen parameters. An advantage of the proposed approach is that parameters can be chosen using an accurate a priori error bound.
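The role of a radiation boundary condition can be illustrated with a much simpler ancestor of the double absorbing boundary: a first-order Mur-type condition for the 1D scalar wave equation. This is a minimal sketch for intuition only, not the optimal local sequences or the DAB formulation described above; all parameters are invented.

```python
import math

def simulate_wave(n=400, steps=600, courant=0.9, absorbing=True):
    """Leapfrog scheme for u_tt = u_xx on a grid of n points; a
    right-moving Gaussian pulse hits the right boundary, which is either
    reflecting (Dirichlet) or a first-order absorbing (Mur) boundary.
    Returns the maximum residual field amplitude after `steps` steps."""
    c2 = courant * courant

    def pulse(x, t):
        # Right-moving Gaussian, speed = courant grid points per step.
        return math.exp(-((x - 100.0 - courant * t) ** 2) / 200.0)

    u_old = [pulse(i, -1.0) for i in range(n)]
    u = [pulse(i, 0.0) for i in range(n)]
    for _ in range(steps):
        u_new = [0.0] * n
        for i in range(1, n - 1):
            u_new[i] = 2 * u[i] - u_old[i] + c2 * (u[i + 1] - 2 * u[i] + u[i - 1])
        if absorbing:
            # First-order Mur ABC: discretize the one-way equation u_t + u_x = 0.
            k = (courant - 1.0) / (courant + 1.0)
            u_new[-1] = u[-2] + k * (u_new[-2] - u[-1])
        u_old, u = u, u_new
    return max(abs(v) for v in u)
```

After the pulse exits, the absorbing run leaves only a small discretization-level reflection, whereas the Dirichlet boundary reflects the pulse back at nearly full amplitude; higher-order conditions like the DAB push the residual down further at comparable cost.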
Tang, Shaojie; Yang, Yi; Tang, Xiangyang
2012-01-01
The interior tomography problem can be solved using the so-called differentiated backprojection-projection onto convex sets (DBP-POCS) method, which requires a priori knowledge within a small area interior to the region of interest (ROI) to be imaged. In theory, the small area wherein the a priori knowledge is required can be of any shape, but most existing implementations carry out the Hilbert filtering either horizontally or vertically, leading to a vertical or horizontal strip that may cross a large area of the object. In this work, we implement a practical DBP-POCS method with radial Hilbert filtering, so that the small area with the a priori knowledge can be roughly round (e.g., a sinus or ventricle among other anatomic cavities in the human or animal body). We also conduct an experimental evaluation to verify the performance of this practical implementation. We specifically re-derive the reconstruction formula in the DBP-POCS fashion with radial Hilbert filtering to ensure that only a small round area with the a priori knowledge is needed (henceforth the radial DBP-POCS method). The performance of the practical DBP-POCS method with radial Hilbert filtering and a priori knowledge in a small round area is evaluated with computer-simulated projection data of the standard and modified Shepp-Logan phantoms, followed by a verification using real projection data acquired by a computed tomography (CT) scanner. The preliminary performance study shows that, if a priori knowledge in a small round area is available, the radial DBP-POCS method can solve the interior tomography problem in a more practical way at high accuracy. In comparison to implementations of the DBP-POCS method demanding a priori knowledge in a horizontal or vertical strip, the radial DBP-POCS method requires the a priori knowledge within a small round area only.
Such a relaxed requirement on the availability of a priori knowledge can be readily met in practice, because a variety of small round areas (e.g., air-filled sinuses or fluid-filled ventricles among other anatomic cavities) exist in the human or animal body. The radial DBP-POCS method with a priori knowledge in a small round area is therefore more feasible in clinical and preclinical practice.
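The POCS component of the method can be illustrated in miniature: alternately project a candidate signal onto two convex sets, one fixing the a priori known values in a small region and one enforcing a measured linear constraint, until the iterate lies in their intersection. The toy sets below are invented for illustration; they are not the Hilbert-transform constraint sets of the actual algorithm, only the same convergence mechanism.

```python
def pocs(known, total, n=8, iters=200):
    """Alternating projections onto two convex sets:
    S1 = {f : f[i] = known[i] for i in known}  (a priori knowledge region)
    S2 = {f : sum(f) = total}                  (a measured linear constraint)
    """
    f = [0.0] * n
    for _ in range(iters):
        # Project onto S1: overwrite the a priori known samples.
        for i, v in known.items():
            f[i] = v
        # Project onto S2: distribute the constraint residual evenly.
        shift = (total - sum(f)) / n
        f = [x + shift for x in f]
    return f

# Known values in a small region, plus one global measurement.
f = pocs(known={0: 1.0, 1: 2.0}, total=10.0)
```

For these affine sets the iteration contracts geometrically, so the limit satisfies both constraints simultaneously, which is what lets the a priori region anchor an otherwise underdetermined reconstruction.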
Extension of HCDstruct for Transonic Aeroservoelastic Analysis of Unconventional Aircraft Concepts
NASA Technical Reports Server (NTRS)
Quinlan, Jesse R.; Gern, Frank H.
2017-01-01
A substantial effort has been made to implement an enhanced aerodynamic modeling capability in the Higher-fidelity Conceptual Design and structural optimization tool. This additional capability is needed for a rapid, physics-based method of modeling advanced aircraft concepts at risk of structural failure due to dynamic aeroelastic instabilities. To adequately predict these instabilities, in particular for transonic applications, a generalized aerodynamic matching algorithm was implemented to correct the doublet-lattice model available in Nastran using solution data from a priori computational fluid dynamics analysis. This new capability is demonstrated for two tube-and-wing aircraft configurations, including a Boeing 737-200 for implementation validation and the NASA D8 as a first use case. Results validate the current implementation of the aerodynamic matching utility and demonstrate the importance of using such a method for aircraft configurations featuring fuselage-wing aerodynamic interaction.
Bokhour, Barbara G; Fix, Gemmae M; Mueller, Nora M; Barker, Anna M; Lavela, Sherri L; Hill, Jennifer N; Solomon, Jeffrey L; Lukas, Carol VanDeusen
2018-03-07
Healthcare organizations increasingly are focused on providing care which is patient-centered rather than disease-focused. Yet little is known about how best to transform the culture of care in these organizations. We sought to understand key organizational factors for implementing patient-centered care cultural transformation through an examination of efforts in the US Department of Veterans Affairs. We conducted multi-day site visits at four US Department of Veterans Affairs medical centers designated as leaders in providing patient-centered care. We conducted qualitative semi-structured interviews with 108 employees (22 senior leaders, 42 middle managers, 37 front-line providers and 7 staff). Transcripts of audio recordings were analyzed using a priori codes based on the Consolidated Framework for Implementation Research. We used constant comparison analysis to synthesize codes into meaningful domains. Sites described actions taken to foster patient-centered care in seven domains: 1) leadership; 2) patient and family engagement; 3) staff engagement; 4) focus on innovations; 5) alignment of staff roles and priorities; 6) organizational structures and processes; 7) environment of care. Within each domain, we identified multi-faceted strategies for implementing change. These included efforts by organizational leaders at all levels who modeled patient-centered care in their interactions and fostered willingness among staff to try novel approaches to care. Alignment and integration of patient-centered care within the organization, particularly surrounding roles, priorities and bureaucratic rules, remained major challenges. Transforming healthcare systems to focus on patient-centered care and better serve the "whole" patient is a complex endeavor. Efforts to transform healthcare culture require robust, multi-pronged efforts at all levels of the organization; leadership is only the beginning.
Challenges remain for incorporating patient-centered approaches in the context of competing priorities and regulations. Through actions within each of the domains, organizations may begin to truly transform to patient-driven care.
NASA Astrophysics Data System (ADS)
Smeltzer, C. D.; Wang, Y.; Zhao, C.; Boersma, F.
2009-12-01
Polar orbiting satellite retrievals of tropospheric nitrogen dioxide (NO2) columns are important to a variety of scientific applications. These NO2 retrievals rely on a priori profiles from chemical transport models and radiative transfer models to derive the vertical columns (VCs) from slant column measurements. In this work, we compare the retrieval results using a priori profiles from a global model (TM4) and a higher resolution regional model (REAM) at the OMI overpass hour of 1330 local time, implementing the Dutch OMI NO2 (DOMINO) retrieval. We also compare the retrieval results using a priori profiles from REAM model simulations with and without lightning NOx (NO + NO2) production. A priori model resolution and lightning NOx production are both found to have a large impact on satellite retrievals, since they shift the NO2 vertical distribution interpreted by the radiative transfer model and thereby alter the satellite sensitivity to a particular observation. The retrieved tropospheric NO2 VCs may increase by 25-100% in urban regions and be reduced by 50% in rural regions if the a priori profiles from REAM simulations are used during the retrievals instead of the profiles from TM4 simulations. The a priori profiles with lightning NOx may result in a 25-50% reduction of the retrieved tropospheric NO2 VCs compared to the a priori profiles without lightning. A priori vertical NO2 profiles from a high-resolution chemical transport model, which can better simulate urban-rural NO2 gradients in the boundary layer and make use of observation-based parameterizations of lightning NOx production, should therefore be implemented first to obtain more accurate NO2 retrievals over the United States, where NOx source regions are spatially separated and lightning NOx production is significant.
As a consequence of the a priori NO2 profile variability arising from lightning and model resolution, geostationary daylight observations of sufficient resolution would be the natural next step towards a more complete NO2 data product. Both the corrected retrieval algorithm and the proposed next-generation geostationary satellite observations would thus improve emission inventories, better validate model simulations, and help optimize region-specific ozone control strategies.
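The sensitivity to the a priori profile enters through the air mass factor (AMF) that converts slant columns to vertical columns. A minimal sketch (with invented scattering weights and profiles, not DOMINO's actual lookup tables) shows why shifting NO2 toward the surface, where satellite sensitivity is low, lowers the AMF and raises the retrieved column:

```python
def vertical_column(slant_column, scattering_weights, profile):
    """VC = SC / AMF, where the AMF is the profile-weighted mean of the
    altitude-dependent scattering weights: AMF = sum(w*x) / sum(x)."""
    amf = sum(w * x for w, x in zip(scattering_weights, profile)) / sum(profile)
    return slant_column / amf

# Satellite sensitivity grows with altitude; all values are illustrative.
weights = [0.3, 0.5, 0.8, 1.0]
urban   = [8.0, 3.0, 1.0, 0.5]   # NO2 concentrated near the surface
rural   = [1.0, 1.0, 1.0, 1.0]   # well-mixed profile

vc_urban = vertical_column(1.0, weights, urban)
vc_rural = vertical_column(1.0, weights, rural)
```

The same slant column yields a larger vertical column for the surface-weighted profile, which is the mechanism behind the 25-100% urban increases quoted above when a higher-resolution a priori is used.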
Elias, Lucília de Almeida; Bastos, Francisco Inacio
2011-12-01
This article assesses the historical context and the conceptual frame of setting up damage containment programs in the field of public health, with special emphasis on the Brazilian experience. The survey seeks to assess the relevance of such programs in the ongoing efforts to curb the spread of blood-borne and sexually transmitted infections, especially AIDS and hepatitis C. Findings from both the Brazilian and the international literature demonstrate that practical damage containment initiatives tend to be more effective when integrated with other public health measures based on common goals. Damage containment initiatives aligned with the basic principles of public health do not limit themselves to a priori models or health care per se. They encompass a variety of pragmatic measures based on public policies, should be in line with the demands of the communities from the moment of their inception, and should be implemented in full partnership with such communities.
Ichthyoplankton abundance and variance in a large river system: concerns for long-term monitoring
Holland-Bartels, Leslie E.; Dewey, Michael R.; Zigler, Steven J.
1995-01-01
System-wide spatial patterns of ichthyoplankton abundance and variability were assessed in the upper Mississippi and lower Illinois rivers to address the experimental design and statistical confidence in density estimates. Ichthyoplankton was sampled from June to August 1989 in primary milieus (vegetated and non-vegetated backwaters and impounded areas, main channels and main channel borders) in three navigation pools (8, 13 and 26) of the upper Mississippi River and in a downstream reach of the Illinois River. Ichthyoplankton densities varied among stations of similar aquatic landscapes (milieus) more than among subsamples within a station. An analysis of sampling effort indicated that the collection of single samples at many stations in a given milieu type is statistically and economically preferable to the collection of multiple subsamples at fewer stations. Cluster analyses also revealed that stations only generally grouped by their preassigned milieu types. Pilot studies such as this can define station groupings and sources of variation beyond an a priori habitat classification. Thus the minimum intensity of sampling required to achieve a desired statistical confidence can be identified before implementing monitoring efforts.
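The sampling-effort tradeoff above follows from a standard two-stage variance model: with among-station variance s_a, within-station variance s_w, n stations and m subsamples per station, the variance of the grand mean is s_a/n + s_w/(n*m). A short sketch with invented variance components reproduces the paper's conclusion that when among-station variance dominates, single samples spread over many stations beat subsampling at fewer stations for the same total effort:

```python
def var_of_mean(s_among, s_within, n_stations, m_subsamples):
    """Variance of the grand mean under two-stage (station/subsample)
    sampling: among-station term + within-station term."""
    return s_among / n_stations + s_within / (n_stations * m_subsamples)

# Equal total effort: 30 samples either way; variances are illustrative.
many_stations = var_of_mean(4.0, 1.0, n_stations=30, m_subsamples=1)
few_stations  = var_of_mean(4.0, 1.0, n_stations=10, m_subsamples=3)
```

Extra subsamples only shrink the within-station term, so once s_a dominates, adding stations is the only way to tighten the estimate.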
What roles do middle managers play in implementation of innovative practices?
Engle, Ryann L.; Lopez, Emily R.; Gormley, Katelyn E.; Chan, Jeffrey A.; Charns, Martin P.; Lukas, Carol VanDeusen
2017-01-01
Background: Middle managers play key roles in hospitals as the bridge between senior leaders and frontline staff. Yet relatively little research has focused on their role in implementing new practices. Purpose: The aim of this study was to expand the understanding of middle managers’ influence in organizations by looking at their activities through the lens of two complementary conceptual frameworks. Methodology/Approach: We analyzed qualitative data from 17 Veterans Affairs Medical Centers with high and low potential to change organizational practices. We analyzed 98 interviews with staff ranging from senior leaders to frontline staff to identify themes within an a priori framework reflecting middle manager activities. Findings: Analyses yielded 14 emergent themes that allowed us to classify specific expressions of middle manager commitment to implementation of innovative practices (e.g., facilitate improvement innovation, garner staff buy-in). In comparing middle manager behaviors in high and low change potential sites, we found that most emergent themes were present in both groups. However, the activities and interactions described differed between the groups. Practice Implications: Middle managers can use the promising strategies identified by our analyses to guide and improve their effectiveness in implementing new practices. These strategies can also inform senior leaders striving to guide middle managers in those efforts. PMID:26488239
Feedback control laws for highly maneuverable aircraft
NASA Technical Reports Server (NTRS)
Garrard, William L.; Balas, Gary J.
1995-01-01
During this year, we concentrated our efforts on the design of controllers for lateral/directional control using mu synthesis. This proved to be a more difficult task than we anticipated and we are still working on the designs. In the lateral-directional control problem, the inputs are pilot lateral stick and pedal commands and the outputs are roll rate about the velocity vector and sideslip angle. The control effectors are ailerons, rudder deflection, and directional thrust vectoring vane deflection, which produces a yawing moment about the body axis. Our math model does not contain any provision for thrust vectoring of the rolling moment. This has resulted in limitations of performance at high angles of attack. During 1994-95, the following tasks for the lateral-directional controllers were accomplished: (1) Designed both inner and outer loop dynamic inversion controllers. These controllers are implemented using accelerometer outputs rather than an a priori model of the vehicle aerodynamics; (2) Used classical techniques to design controllers for the system linearized by dynamic inversion. These controllers acted to control roll rate and Dutch roll response; (3) Implemented the inner loop dynamic inversion and classical controllers on the six-DOF simulation; (4) Developed a lateral-directional control allocation scheme based on minimizing required control effort among the ailerons, rudder, and directional thrust vectoring; and (5) Developed mu outer loop controllers combined with classical inner loop controllers.
Miller, Robin Lin; Reed, Sarah J; Chiaramonte, Danielle; Strzyzykowski, Trevor; Spring, Hannah; Acevedo-Polakovich, Ignacio D; Chutuape, Kate; Cooper-Walker, Bendu; Boyer, Cherrie B; Ellen, Jonathan M
2017-09-01
Connect to Protect (C2P), a 10-year community mobilization effort, pursued the dual aims of creating communities competent to address youth's HIV-related risks and removing structural barriers to youth health. We used Community Coalition Action Theory (CCAT) to examine the perceived contributions and accomplishments of 14 C2P coalitions. We interviewed 318 key informants, including youth and community leaders, to identify the features of coalitions' context and operation that facilitated and undermined their ability to achieve structural change and build communities' capability to manage their local adolescent HIV epidemic effectively. We coded the interviews using an a priori coding scheme informed by CCAT and scholarship on AIDS-competent communities. We found community mobilization efforts like C2P can contribute to addressing the structural factors that promote HIV-risk among youth and to community development. We describe how coalition leadership, collaborative synergy, capacity building, and local community context influence coalitions' ability to successfully implement HIV-related structural change, demonstrating empirical support for many of CCAT's propositions. We discuss implications for how community mobilization efforts might succeed in laying the foundation for an AIDS-competent community. © Society for Community Research and Action 2017.
Fuzzy logic of Aristotelian forms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perlovsky, L.I.
1996-12-31
Model-based approaches to pattern recognition and machine vision have been proposed to overcome the exorbitant training requirements of earlier computational paradigms. However, uncertainties in data were found to lead to a combinatorial explosion of the computational complexity. This issue is related here to the roles of a priori knowledge vs. adaptive learning. What is the a priori knowledge representation that supports learning? I introduce Modeling Field Theory (MFT), a model-based neural network whose adaptive learning is based on a priori models. These models combine deterministic, fuzzy, and statistical aspects to account for a priori knowledge, its fuzzy nature, and data uncertainties. In the process of learning, a priori fuzzy concepts converge to crisp or probabilistic concepts. The MFT is a convergent dynamical system of only linear computational complexity. Fuzzy logic turns out to be essential for reducing the combinatorial complexity to a linear one. I will discuss the relationship of the new computational paradigm to two theories due to Aristotle: the theory of Forms and logic. While the theory of Forms argued that the mind cannot be based on ready-made a priori concepts, Aristotelian logic operated with just such concepts. I discuss an interpretation of MFT suggesting that its fuzzy logic, combining apriority and adaptivity, implements the Aristotelian theory of Forms (theory of mind). Thus, 2300 years after Aristotle, a logic is developed suitable for his theory of mind.
IGS preparations for the next reprocessing and ITRF
NASA Astrophysics Data System (ADS)
Griffiths, J.; Rebischung, P.; Garayt, B.; Ray, J.
2012-04-01
The International GNSS Service (IGS) is preparing for a second reanalysis of the full history of data collected by the global network using the latest models and methodologies. This effort is designed to obtain improved, consistent satellite orbits, station and satellite clocks, Earth orientation parameters (EOPs) and terrestrial frame products using the current IGS framework, IGS08/igs08.atx. It follows a successful first reprocessing campaign, which provided the IGS input to ITRF2008. Likewise, this second campaign (repro2) should provide the IGS contribution to the next ITRF. We will discuss the analysis standards adopted for repro2, including treatment of and mitigation against non-tidal loading effects, and improvements expected with respect to the first reprocessing campaign. The International Earth Rotation and Reference Systems Service (IERS) Conventions of 2010 are expected to be implemented. However, no improvements in the diurnal and semidiurnal EOP tide models will be made, so the associated errors will remain. Adoption of new orbital force models and consistent handling of satellite attitude changes are expected to improve IGS clock and orbit products. A priori Earth-reflected radiation pressure models should nearly eliminate the ~2.5 cm orbit radial bias previously observed using laser ranging methods. A priori modeling of the radiation forces exerted in signal transmission should also improve the orbit products, and use of consistent satellite attitude models should help with satellite clock estimation during Earth and Moon eclipses. Improvements of the terrestrial frame products are expected from, for example, the inclusion of second-order ionospheric corrections and the a priori modeling of Earth-reflected radiation pressure. Because of remaining unmodeled orbital forces, however, systematic errors will likely continue to affect the origin of the repro2 frames and prevent a contribution of GNSS to the origin of the next ITRF.
On the other hand, the planned inclusion of satellite phase center offsets in the long-term stacking of the repro2 frames could help in defining the scale rate of the next ITRF.
Validation of Suomi NPP OMPS Limb Profiler Ozone Measurements
NASA Astrophysics Data System (ADS)
Buckner, S. N.; Flynn, L. E.; McCormick, M. P.; Anderson, J.
2017-12-01
The Ozone Mapping and Profiler Suite (OMPS) Limb Profiler onboard the Suomi National Polar-orbiting Partnership satellite (SNPP) makes measurements of limb-scattered solar radiances over ultraviolet and visible wavelengths. These measurements are used in retrieval algorithms to create high vertical resolution ozone profiles, helping monitor the evolution of the atmospheric ozone layer. NOAA is in the process of implementing these algorithms to make near-real-time versions of these products. The main objective of this project is to generate estimates of the accuracy and precision of the OMPS Limb products by analysis of matchup comparisons with similar products from the Earth Observing System Microwave Limb Sounder (EOS Aura MLS). The studies investigated the sources of errors and classified them with respect to height, geographic location, and atmospheric and observation conditions. In addition, this project included working with the algorithm developers in an attempt to develop corrections and adjustments. Collocation and zonal mean comparisons were made, and statistics were gathered on both a daily and monthly basis encompassing the entire OMPS data record. This validation effort for the OMPS-LP data will be used to help validate data from the Stratospheric Aerosol and Gas Experiment III on the International Space Station (SAGE III ISS) and will also be used in conjunction with the NOAA Total Ozone from Assimilation of Stratosphere and Troposphere (TOAST) product to develop a new a priori for the NOAA Unique Combined Atmosphere Processing System (NUCAPS) ozone product. The current NUCAPS ozone product uses a combination of Cross-track Infrared Sounder (CrIS) data for the troposphere and, as the stratospheric a priori, a tropopause-based climatology derived from ozonesonde data. The latest version of TOAST uses a combination of both CrIS and OMPS-LP data.
We will further develop the newest version of TOAST and incorporate it into the NUCAPS system as a new a priori, with the goal of creating a better global ozone product.
Intelligent Control Systems Research
NASA Technical Reports Server (NTRS)
Loparo, Kenneth A.
1994-01-01
Results of a three phase research program into intelligent control systems are presented. The first phase looked at implementing the lowest or direct level of a hierarchical control scheme using a reinforcement learning approach assuming no a priori information about the system under control. The second phase involved the design of an adaptive/optimizing level of the hierarchy and its interaction with the direct control level. The third and final phase of the research was aimed at combining the results of the previous phases with some a priori information about the controlled system.
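The first phase, learning a direct controller with no a priori plant model, can be sketched with tabular Q-learning on a toy one-dimensional regulation task. The task, reward, and parameters below are invented for illustration; the learner never sees the transition rule, only observed rewards.

```python
import random

def train(episodes=2000, alpha=0.3, gamma=0.9, eps=0.2, seed=0):
    """Q-learning: drive the state to a setpoint (position 5 on a 0..10
    line) using only observed rewards -- no model of the dynamics."""
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in range(11) for a in (-1, 1)}
    for _ in range(episodes):
        s = rng.randrange(11)
        for _ in range(30):
            # Epsilon-greedy action selection over the two actions.
            if rng.random() < eps:
                a = rng.choice((-1, 1))
            else:
                a = max((-1, 1), key=lambda x: q[(s, x)])
            s2 = min(10, max(0, s + a))      # plant dynamics, unknown to learner
            r = 1.0 if s2 == 5 else 0.0      # reward only at the setpoint
            q[(s, a)] += alpha * (r + gamma * max(q[(s2, -1)], q[(s2, 1)])
                                  - q[(s, a)])
            s = s2
    return q

policy = train()
```

After training, the greedy policy pushes the state toward the setpoint from either side, which is the "direct level with no a priori information" idea in miniature; the report's hierarchy then layers an adaptive/optimizing level on top.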
Rutterford, Clare; Taljaard, Monica; Dixon, Stephanie; Copas, Andrew; Eldridge, Sandra
2015-06-01
To assess the quality of reporting and accuracy of a priori estimates used in sample size calculations for cluster randomized trials (CRTs). We reviewed 300 CRTs published between 2000 and 2008. The prevalence of reporting sample size elements from the 2004 CONSORT recommendations was evaluated and a priori estimates compared with those observed in the trial. Of the 300 trials, 166 (55%) reported a sample size calculation. Only 36 of 166 (22%) reported all recommended descriptive elements. Elements specific to CRTs were the worst reported: a measure of within-cluster correlation was specified in only 58 of 166 (35%). Only 18 of 166 articles (11%) reported both a priori and observed within-cluster correlation values. Except in two cases, observed within-cluster correlation values were either close to or less than a priori values. Even with the CONSORT extension for cluster randomization, the reporting of sample size elements specific to these trials remains below that necessary for transparent reporting. Journal editors and peer reviewers should implement stricter requirements for authors to follow CONSORT recommendations. Authors should report observed and a priori within-cluster correlation values to enable comparisons between these over a wider range of trials. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
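The cluster-specific elements the review found under-reported feed directly into the standard design-effect calculation: an individually randomized sample size n is inflated by 1 + (m - 1) * ICC for clusters of size m. A short sketch with illustrative numbers:

```python
import math

def crt_sample_size(n_individual, cluster_size, icc):
    """Inflate an individually randomized sample size by the design
    effect 1 + (m - 1) * ICC, then round up to whole clusters."""
    deff = 1.0 + (cluster_size - 1) * icc
    clusters = math.ceil(n_individual * deff / cluster_size)
    return clusters, clusters * cluster_size

clusters, n_total = crt_sample_size(n_individual=300, cluster_size=20, icc=0.05)
```

Because the design effect is linear in the ICC, an optimistic a priori within-cluster correlation directly under-powers the trial, which is why the review urges reporting both the assumed and the observed values.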
1982-06-02
...to Army modeling efforts, including design for future priorities and specific actions. (13) Establish standards, methodology, and formats for external interfaces... with models and the wider technological-scientific-academic community, (4) increased centralized management of data, and (5) design of a proactive...
Pressure Scalings and Influence Region Research
2018-01-01
...the results are briefly discussed. Additionally, updated experimental results are presented along with discussion of collaborative research efforts... with upstream and downstream influence, where the influence lengths are defined in terms of a priori quantities (freestream conditions and... governing equations...).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sciarrino, Fabio, E-mail: (unavailable); Dipartimento di Fisica and Consorzio Nazionale Interuniversitario per le Scienze Fisiche della Materia, Università 'La Sapienza', Rome 00185; De Martini, Francesco
The optimal phase-covariant quantum cloning machine (PQCM) broadcasts the information associated to an input qubit into a multiqubit system, exploiting a partial a priori knowledge of the input state. This additional a priori information leads to a higher fidelity than for universal cloning. The present article first analyzes different innovative schemes to implement the 1→3 PQCM. The method is then generalized to any 1→M machine for an odd value of M by a theoretical approach based on the general angular momentum formalism. Finally, different experimental schemes based either on linear or nonlinear methods and valid for single-photon polarization-encoded qubits are discussed.
Masia, Lorenzo; Cappello, Leonardo; Morasso, Pietro; Lachenal, Xavier; Pirrera, Alberto; Weaver, Paul; Mattioni, Filippo
2013-06-01
A novel actuator is introduced that combines an elastically compliant composite structure with conventional electromechanical elements. The proposed design is analogous to that used in Series Elastic Actuators, its distinctive feature being that the compliant composite part offers different stable configurations. In other words, its elastic potential presents points of local minima that correspond to robust stable positions (multistability). This potential is known a priori as a function of the structural geometry, thus providing tremendous benefits in terms of control implementation. Such knowledge enables the complexities arising from the additional degrees of freedom associated with link deformations to be overcome, addressing challenges that extend beyond those posed by standard rigid-link robot dynamics. It is thought that integrating a multistable elastic element in a robotic transmission can provide new scenarios in the field of assistive robotics, as the system may help a subject to stand or carry a load without the need for an active control effort by the actuators.
Lost in space: Onboard star identification using CCD star tracker data without an a priori attitude
NASA Technical Reports Server (NTRS)
Ketchum, Eleanor A.; Tolson, Robert H.
1993-01-01
There are many algorithms in use today which determine spacecraft attitude by identifying stars in the field of view of a star tracker. Some methods, which date from the early 1960's, compare the angular separation between observed stars with a small catalog. In the last 10 years, several methods have been developed which speed up the process and reduce the amount of memory needed, a key element of onboard attitude determination. However, each of these methods requires some a priori knowledge of the spacecraft attitude. Although the Sun and magnetic field generally provide the necessary coarse attitude information, there are occasions when a spacecraft could become lost when it is not prudent to wait for sunlight. Also, the possibility of efficient attitude determination using only the highly accurate CCD star tracker could lead to fully autonomous spacecraft attitude determination. The need for redundant coarse sensors could thus be eliminated at substantial cost reduction. Some groups have extended their algorithms to implement a computation-intensive full sky scan. Some require large databases. Both storage and speed are concerns for autonomous onboard systems. Neural network technology is even being explored by some as a possible solution, but because of the limited number of patterns that can be stored and the large overhead, nothing concrete has resulted from these efforts. This paper presents an algorithm which, by discretizing the sky and filtering on the visual magnitude of the brightest observed star, speeds up the lost-in-space star identification process while reducing the amount of necessary onboard computer storage compared to existing techniques.
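The classic angular-separation approach the paper builds on can be sketched as follows: precompute pairwise separations for a bright-star catalog, filter by visual magnitude, and match an observed pair against the catalog within a tolerance. The mini-catalog and tolerances are invented; the paper's contribution (sky discretization plus filtering on the brightest observed star) is only gestured at here by the magnitude cut.

```python
import math

def angle(v1, v2):
    """Angular separation (radians) between two unit vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    return math.acos(max(-1.0, min(1.0, dot)))

def unit(ra, dec):
    """Unit vector from right ascension/declination (radians)."""
    return (math.cos(dec) * math.cos(ra),
            math.cos(dec) * math.sin(ra),
            math.sin(dec))

# Invented mini-catalog: (name, ra, dec, visual magnitude).
catalog = [("A", 0.00, 0.00, 1.2), ("B", 0.10, 0.05, 2.0),
           ("C", 1.50, 0.70, 0.5), ("D", 3.00, -0.30, 3.1)]

def match_pair(obs1, obs2, mag_limit=2.5, tol=1e-3):
    """Identify an observed star pair by its angular separation alone
    (no a priori attitude), considering only stars brighter than
    mag_limit to shrink the search."""
    sep = angle(obs1, obs2)
    bright = [(n, unit(ra, dec)) for n, ra, dec, m in catalog if m < mag_limit]
    for i, (n1, u1) in enumerate(bright):
        for n2, u2 in bright[i + 1:]:
            if abs(angle(u1, u2) - sep) < tol:
                return (n1, n2)
    return None

# Two stars seen in the tracker, with no attitude estimate available.
pair = match_pair(unit(0.00, 0.00), unit(0.10, 0.05))
```

The magnitude cut is what keeps the pair search small; the paper's discretization of the sky prunes it further so the full-sky "lost in space" case stays tractable onboard.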
DOT National Transportation Integrated Search
2009-11-01
The Lane Transit District, in cooperation with the National Bus Rapid Transit Institute (NBRTI) at the University of South Florida, completed a preliminary implementation study to determine the potential impacts of a new and innovative transit priori...
Wiuf, Carsten; Schaumburg-Müller Pallesen, Jonatan; Foldager, Leslie; Grove, Jakob
2016-08-01
In many areas of science it is customary to perform many, potentially millions, of tests simultaneously. To gain statistical power it is common to group tests based on a priori criteria such as predefined regions or sliding windows. However, it is not straightforward to choose grouping criteria, and the results might depend on the chosen criteria. Methods that summarize, or aggregate, test statistics or p-values without relying on a priori criteria are therefore desirable. We present a simple method to aggregate a sequence of stochastic variables, such as test statistics or p-values, into fewer variables without assuming a priori defined groups. We provide different ways to evaluate the significance of the aggregated variables based on theoretical considerations and resampling techniques, and show that under certain assumptions the FWER is controlled in the strong sense. Validity of the method was demonstrated using simulations and real data analyses. Our method may be a useful supplement to standard procedures relying on evaluation of test statistics individually. Moreover, by being agnostic and not relying on predefined selected regions, it might be a practical alternative to conventionally used methods of aggregation of p-values over regions. The method is implemented in Python and freely available online (through GitHub; see the Supplementary information).
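The conventional baseline the authors aim to improve on, aggregation of p-values over predefined sliding windows, can be sketched with Fisher's method. This is the a priori grouped approach, not the paper's agnostic method; the chi-square survival function is computed in closed form since the degrees of freedom are even.

```python
import math

def fisher_combine(pvalues):
    """Fisher's method: X = -2 * sum(log p) ~ chi-square with 2k df
    under the global null. Returns the combined p-value."""
    k = len(pvalues)
    x = -2.0 * sum(math.log(p) for p in pvalues)
    # Survival function of chi-square with even df = 2k (closed form):
    # P(X > x) = exp(-x/2) * sum_{i=0}^{k-1} (x/2)^i / i!
    term, total = 1.0, 1.0
    for i in range(1, k):
        term *= (x / 2.0) / i
        total += term
    return math.exp(-x / 2.0) * total

def sliding_windows(pvalues, width):
    """Aggregate p-values over fixed-width sliding windows -- the kind
    of a priori grouping the paper's method is designed to avoid."""
    return [fisher_combine(pvalues[i:i + width])
            for i in range(len(pvalues) - width + 1)]
```

The dependence of the output on the arbitrary `width` is exactly the sensitivity to grouping criteria that motivates the agnostic aggregation scheme.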
Dean, Laura; Gregorius, Stefanie; Pulford, Justin
2017-01-01
Objectives Substantial development assistance and research funding are invested in health research capacity strengthening (HRCS) interventions in low-income and middle-income countries, yet the effectiveness, impact and value for money of these investments are not well understood. A major constraint to evidence-informed HRCS intervention has been the disparate nature of the research effort to date. This review aims to map and critically analyse the existing HRCS effort to better understand the level, type, cohesion and conceptual sophistication of the current evidence base. The overall goal of this article is to advance the development of a unified, implementation-focused HRCS science. Methods We used a scoping review methodology to identify peer-reviewed HRCS literature within the following databases: PubMed, Global Health and Scopus. HRCS publications available in English between 2000 and 2016 were included. 1195 articles were retrieved, of which 172 met the final inclusion criteria. A priori thematic analysis of all included articles was completed, as was content analysis of the identified HRCS definitions. Results The number of HRCS publications increased exponentially between 2000 and 2016. Most publications during this period were perspective, opinion or commentary pieces; however, original research publications have been the primary publication type since 2013. Twenty-five different definitions of research capacity strengthening were identified, of which three aligned with current HRCS guidelines. Conclusions The review findings indicate that an HRCS research field with a focus on implementation science is emerging, although the conceptual and empirical bases are not yet sufficiently advanced to effectively inform HRCS programme planning. Consolidating an HRCS implementation science therefore appears to be a viable option that may accelerate the development of a useful evidence base to inform HRCS programme planning. 
Identifying an agreed operational definition of HRCS, standardising HRCS-related terminology, developing a needs-based HRCS-specific research agenda and synthesising currently available evidence may be useful first steps. PMID:29217727
Star Identification Without Attitude Knowledge: Testing with X-Ray Timing Experiment Data
NASA Technical Reports Server (NTRS)
Ketchum, Eleanor
1997-01-01
As the budget for the scientific exploration of space shrinks, the need for more autonomous spacecraft increases. For a spacecraft with a star tracker, the ability to determine attitude autonomously from a lost-in-space state requires the capability to identify the stars in the field of view of the tracker. Although there have been efforts to produce autonomous star trackers that perform this function internally, many programs cannot afford these sensors. The author previously presented a method for identifying stars without a priori attitude knowledge, specifically targeted at onboard computers as it minimizes the necessary computer storage. The method has previously been tested with simulated data. This paper provides results of star identification without a priori attitude knowledge using flight data from two 8 by 8 degree charge-coupled device (CCD) star trackers onboard the X-Ray Timing Experiment.
ERIC Educational Resources Information Center
Singell, Larry D.; Waddell, Glen R.
2010-01-01
We examine the extent to which readily available data at a large public university can be used to a priori identify at-risk students who may benefit from targeted retention efforts. Although it is possible to identify such students, there remains an inevitable tradeoff in any resource allocation between not treating the students who are likely to…
Glisson, Charles; Green, Philip; Williams, Nathaniel J
2012-09-01
The study: (1) provides the first assessment of the a priori measurement model and psychometric properties of the Organizational Social Context (OSC) measurement system in a US nationwide probability sample of child welfare systems; (2) illustrates the use of the OSC in constructing norm-based organizational culture and climate profiles for child welfare systems; and (3) estimates the association of child welfare system-level organizational culture and climate profiles with individual caseworker-level job satisfaction and organizational commitment. The study applies confirmatory factor analysis (CFA) and hierarchical linear models (HLM) analysis to a US nationwide sample of 1,740 caseworkers from 81 child welfare systems participating in the second National Survey of Child and Adolescent Wellbeing (NSCAW II). The participating child welfare systems were selected using a national probability procedure reflecting the number of children served by child welfare systems nationwide. The a priori OSC measurement model is confirmed in this nationwide sample of child welfare systems. In addition, caseworker responses to the OSC scales generate acceptable to high scale reliabilities, moderate to high within-system agreement, and significant between-system differences. Caseworkers in the child welfare systems with the best organizational culture and climate profiles report higher levels of job satisfaction and organizational commitment. Organizational climates characterized by high engagement and functionality, and organizational cultures characterized by low rigidity are associated with the most positive work attitudes. The OSC is the first valid and reliable measure of organizational culture and climate with US national norms for child welfare systems. The OSC provides a useful measure of Organizational Social Context for child welfare service improvement and implementation research efforts which include a focus on child welfare system culture and climate. 
Copyright © 2012 Elsevier Ltd. All rights reserved.
Wickham, J.D.; Stehman, S.V.; Smith, J.H.; Wade, T.G.; Yang, L.
2004-01-01
Two-stage cluster sampling reduces the cost of collecting accuracy assessment reference data by constraining sample elements to fall within a limited number of geographic domains (clusters). However, because classification error is typically positively spatially correlated, within-cluster correlation may reduce the precision of the accuracy estimates. The detailed population information needed to quantify a priori the effect of within-cluster correlation on precision is typically unavailable. Consequently, a convenient, practical approach to evaluate the likely performance of a two-stage cluster sample is needed. We describe such an a priori evaluation protocol, focusing on the spatial distribution of the sample by land-cover class across different cluster sizes and on the costs of different sampling options, including options not imposing clustering. This protocol also assesses the two-stage design's adequacy for estimating the precision of accuracy estimates for rare land-cover classes. We illustrate the approach using two large-area, regional accuracy assessments from the National Land-Cover Data (NLCD), and describe how the a priori evaluation was used as a decision-making tool when implementing the NLCD design.
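The precision penalty from within-cluster correlation that motivates this protocol can be illustrated with the standard design-effect formula (a textbook approximation, not the paper's protocol; the numbers are hypothetical):

```python
def design_effect(cluster_size, icc):
    """Design effect for cluster sampling: variance inflation of an
    estimate from a clustered sample relative to simple random sampling,
    given the intracluster correlation coefficient (icc)."""
    return 1.0 + (cluster_size - 1) * icc

def effective_sample_size(n_total, cluster_size, icc):
    """Number of independent samples the clustered design is 'worth'."""
    return n_total / design_effect(cluster_size, icc)
```

With 10 sample elements per cluster and an intracluster correlation of 0.1, a sample of 1,000 pixels carries the information of only about 526 independent ones, which is why evaluating cluster size and cost trade-offs a priori matters.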
NASA Astrophysics Data System (ADS)
Lukyanenko, D. V.; Shishlenin, M. A.; Volkov, V. T.
2018-01-01
We propose a numerical method for solving a coefficient inverse problem for a nonlinear singularly perturbed reaction-diffusion-advection equation with final-time observation data, based on asymptotic analysis and the gradient method. Asymptotic analysis allows us to extract a priori information about the interior layer (moving front), which appears in the direct problem, and the boundary layers, which appear in the conjugate problem. We describe and implement a method of constructing a dynamically adapted mesh based on this a priori information. The dynamically adapted mesh significantly reduces the complexity of the numerical calculations and improves numerical stability in comparison with the usual approaches. A numerical example demonstrates the effectiveness of the proposed method.
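The idea of concentrating mesh points where a priori asymptotics predict a layer can be sketched by equidistributing nodes with respect to a monitor function (a generic construction, not the authors' scheme; the front location, width, and amplitude below are hypothetical):

```python
import math

def adapted_mesh(n, front, width, amp=10.0, resolution=10000):
    """Place n mesh points on [0, 1] by equidistributing a monitor
    function that is large near a known interior layer at x = front."""
    xs = [i / resolution for i in range(resolution + 1)]
    monitor = [1.0 + amp * math.exp(-((x - front) / width) ** 2) for x in xs]
    # cumulative integral of the monitor function (trapezoidal rule)
    cum = [0.0]
    for k in range(resolution):
        cum.append(cum[-1] + 0.5 * (monitor[k] + monitor[k + 1]) / resolution)
    total = cum[-1]
    # invert: node j sits where the cumulative reaches fraction j/(n-1)
    mesh, k = [], 0
    for j in range(n):
        target = total * j / (n - 1)
        while cum[k] < target and k < resolution:
            k += 1
        mesh.append(xs[k])
    return mesh
```

Nodes cluster where the monitor is large, so the grid resolves the layer with far fewer points than a uniform mesh of comparable local spacing.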
Becan, Jennifer E; Bartkowski, John P; Knight, Danica K; Wiley, Tisha R A; DiClemente, Ralph; Ducharme, Lori; Welsh, Wayne N; Bowser, Diana; McCollister, Kathryn; Hiller, Matthew; Spaulding, Anne C; Flynn, Patrick M; Swartzendruber, Andrea; Dickson, Megan F; Fisher, Jacqueline Horan; Aarons, Gregory A
2018-04-13
This paper describes the means by which a United States National Institute on Drug Abuse (NIDA)-funded cooperative, Juvenile Justice-Translational Research on Interventions for Adolescents in the Legal System (JJ-TRIALS), utilized an established implementation science framework in conducting a multi-site, multi-research center implementation intervention initiative. The initiative aimed to bolster the ability of juvenile justice agencies to address unmet client needs related to substance use while enhancing inter-organizational relationships between juvenile justice and local behavioral health partners. The EPIS (Exploration, Preparation, Implementation, Sustainment) framework was selected and utilized as the guiding model from inception through project completion; including the mapping of implementation strategies to EPIS stages, articulation of research questions, and selection, content, and timing of measurement protocols. Among other key developments, the project led to a reconceptualization of its governing implementation science framework into cyclical form as the EPIS Wheel. The EPIS Wheel is more consistent with rapid-cycle testing principles and permits researchers to track both progressive and recursive movement through EPIS. Moreover, because this randomized controlled trial was predicated on a bundled strategy method, JJ-TRIALS was designed to rigorously test progress through the EPIS stages as promoted by facilitation of data-driven decision making principles. The project extended EPIS by (1) elucidating the role and nature of recursive activity in promoting change (yielding the circular EPIS Wheel), (2) by expanding the applicability of the EPIS framework beyond a single evidence-based practice (EBP) to address varying process improvement efforts (representing varying EBPs), and (3) by disentangling outcome measures of progression through EPIS stages from the a priori established study timeline. 
The utilization of EPIS in JJ-TRIALS provides a model for practical and applied use of implementation frameworks in real-world settings that span outer service-system and inner organizational contexts in improving care for vulnerable populations. NCT02672150. Retrospectively registered on 22 January 2016.
Teaching of transcendence in physics
NASA Astrophysics Data System (ADS)
Jaki, Stanley L.
1987-10-01
Efforts aimed at showing that modern physics points to a truly transcendental factor as the explanation of the universe should be welcomed by those who have urged the teaching of physics in a broad cultural context. Those efforts may profit from the following guidelines: avoid the antiontological basis of the Copenhagen interpretation of quantum mechanics; make much of the reality of the universe and its enormous degree of specificity as revealed by general relativity and the cosmic background radiation; exploit Gödel's incompleteness theorems against any grand unified theory proposed as if it were true a priori and necessarily; and realize that the design argument always presupposes the validity of the cosmological argument.
Distributed Compression in Camera Sensor Networks
2006-02-13
complicated in this context. This effort will make use of the correlation structure of the data given by the plenoptic function in the case of multi-camera systems. In many cases the structure of the plenoptic function can be estimated without requiring inter-sensor communications, but by using some a priori global geometrical information. Once the structure of the plenoptic function has been predicted, it is possible to develop specific distributed
Investigating prior probabilities in a multiple hypothesis test for use in space domain awareness
NASA Astrophysics Data System (ADS)
Hardy, Tyler J.; Cain, Stephen C.
2016-05-01
The goal of this research effort is to improve Space Domain Awareness (SDA) capabilities of current telescope systems through improved detection algorithms. Ground-based optical SDA telescopes are often spatially under-sampled, or aliased. This fact negatively impacts the detection performance of traditionally proposed binary and correlation-based detection algorithms. A Multiple Hypothesis Test (MHT) algorithm has been previously developed to mitigate the effects of spatial aliasing. This is done by testing potential Resident Space Objects (RSOs) against several sub-pixel-shifted Point Spread Functions (PSFs). An MHT has been shown to increase detection performance for the same false alarm rate. In this paper, the assumption of a priori probability used in an MHT algorithm is investigated. First, an analysis of the pixel decision space is completed to determine alternative-hypothesis prior probabilities. These probabilities are then implemented in an MHT algorithm, and the algorithm is tested against previous MHT algorithms using simulated RSO data. Results are reported with Receiver Operating Characteristic (ROC) curves and probability of detection, Pd, analysis.
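The role of the alternative-hypothesis prior probabilities can be illustrated with a toy single-pixel version of such a test (a schematic sketch under Gaussian noise, not the paper's algorithm; the PSF responses, priors, and threshold are hypothetical):

```python
import math

def gauss_pdf(x, mu, sigma):
    """Gaussian probability density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def mht_detect(pixel, psf_means, priors, sigma=1.0, threshold=0.5):
    """Multiple-hypothesis detection for one pixel value: H0 = noise only,
    H1..Hk = object present with one of several sub-pixel-shifted PSF
    responses. priors[0] is the H0 prior; priors[1:] match psf_means.
    Returns the posterior probability that any object hypothesis holds,
    and the detection decision."""
    p0 = priors[0] * gauss_pdf(pixel, 0.0, sigma)
    pk = [pr * gauss_pdf(pixel, mu, sigma)
          for pr, mu in zip(priors[1:], psf_means)]
    post_obj = sum(pk) / (p0 + sum(pk))
    return post_obj, post_obj > threshold
```

Changing the priors shifts the decision boundary, which is exactly the trade-off the paper explores with ROC curves.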
How Can TOLNet Help to Better Understand Tropospheric Ozone? A Satellite Perspective
NASA Technical Reports Server (NTRS)
Johnson, Matthew S.
2018-01-01
Potential sources of a priori ozone (O3) profiles for use in Tropospheric Emissions: Monitoring of Pollution (TEMPO) satellite tropospheric O3 retrievals are evaluated with observations from multiple Tropospheric Ozone Lidar Network (TOLNet) systems in North America. An O3 profile climatology (tropopause-based O3 climatology (TB-Clim), currently proposed for use in the TEMPO O3 retrieval algorithm) derived from ozonesonde observations and O3 profiles from three separate models (operational Goddard Earth Observing System (GEOS-5) Forward Processing (FP) product, reanalysis product from Modern-Era Retrospective analysis for Research and Applications version 2 (MERRA2), and the GEOS-Chem chemical transport model (CTM)) were: 1) evaluated with TOLNet measurements on various temporal scales (seasonally, daily, hourly) and 2) implemented as a priori information in theoretical TEMPO tropospheric O3 retrievals in order to determine how each a priori impacts the accuracy of retrieved tropospheric (0-10 km) and lowermost tropospheric (LMT, 0-2 km) O3 columns. We found that all sources of a priori O3 profiles evaluated in this study generally reproduced the vertical structure of summer-averaged observations. However, larger differences between the a priori profiles and lidar observations were observed when evaluating inter-daily and diurnal variability of tropospheric O3. The TB-Clim O3 profile climatology was unable to replicate observed inter-daily and diurnal variability of O3 while model products, in particular GEOS-Chem simulations, displayed more skill in reproducing these features. Due to the ability of models, primarily the CTM used in this study, on average to capture the inter-daily and diurnal variability of tropospheric and LMT O3 columns, using a priori profiles from CTM simulations resulted in TEMPO retrievals with the best statistical comparison with lidar observations. 
Furthermore, important from an air quality perspective, when high LMT O3 values were observed, using CTM a priori profiles resulted in TEMPO LMT O3 retrievals with the least bias. The application of time-specific (non-climatological) hourly/daily model predictions as the a priori profile in TEMPO O3 retrievals will be best suited to studies of air quality or event-based processes, as the standard retrieval algorithm will still need to use a climatology product. Follow-on studies to this work are currently being conducted to investigate the application of different CTM-predicted O3 climatology products in the standard TEMPO retrieval algorithm. Finally, methods similar to those used in this study can easily be applied by TEMPO data users to recalculate tropospheric O3 profiles provided by the standard retrieval using a different source of a priori information.
Dynamic Programming Method for Impulsive Control Problems
ERIC Educational Resources Information Center
Balkew, Teshome Mogessie
2015-01-01
In many control systems changes in the dynamics occur unexpectedly or are applied by a controller as needed. The time at which a controller implements changes is not necessarily known a priori. For example, many manufacturing systems and flight operations have complicated control systems, and changes in the control systems may be automatically…
Ekman, Drew R.; Ankley, Gerald T.; Blazer, Vicki; Collette, Timothy W.; Garcia-Reyero, Natàlia; Iwanowicz, Luke R.; Jorgensen, Zachary G.; Lee, Kathy E.; Mazik, Pat M.; Miller, David H.; Perkins, Edward J.; Smith, Edwin T.; Tietge, Joseph E.; Villeneuve, Daniel L.
2013-01-01
There is increasing demand for the implementation of effects-based monitoring and surveillance (EBMS) approaches in the Great Lakes Basin to complement traditional chemical monitoring. Herein, we describe an ongoing multiagency effort to develop and implement EBMS tools, particularly with regard to monitoring potentially toxic chemicals and assessing Areas of Concern (AOCs), as envisioned by the Great Lakes Restoration Initiative (GLRI). Our strategy includes use of both targeted and open-ended/discovery techniques, as appropriate to the amount of information available, to guide a priori end point and/or assay selection. Specifically, a combination of in vivo and in vitro tools is employed by using both wild and caged fish (in vivo), and a variety of receptor- and cell-based assays (in vitro). We employ a work flow that progressively emphasizes in vitro tools for long-term or high-intensity monitoring because of their greater practicality (e.g., lower cost, labor) and relying on in vivo assays for initial surveillance and verification. Our strategy takes advantage of the strengths of a diversity of tools, balancing the depth, breadth, and specificity of information they provide against their costs, transferability, and practicality. Finally, a series of illustrative scenarios is examined that align EBMS options with management goals to illustrate the adaptability and scaling of EBMS approaches and how they can be used in management decisions.
Application of Bayesian a Priori Distributions for Vehicles' Video Tracking Systems
NASA Astrophysics Data System (ADS)
Mazurek, Przemysław; Okarma, Krzysztof
Intelligent Transportation Systems (ITS) help to improve the quality and quantity of measured road traffic parameters. The use of ITS is possible when adequate measuring infrastructure is available. Video systems allow its implementation at relatively low cost, owing to the possibility of simultaneously recording a few lanes of the road at a considerable distance from the camera. The tracking process can be realized through different algorithms; the most attractive are Bayesian, because they use a priori information derived from previous observations or known limitations. Use of this information is crucial for improving the quality of tracking, especially under the difficult observability conditions that occur in video systems under the influence of smog, fog, rain, snow and poor lighting.
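A minimal example of a Bayesian tracker that fuses a priori information with noisy observations is the scalar Kalman filter below (an illustrative sketch, not the systems described above; the noise variances are hypothetical):

```python
def kalman_track(measurements, x0=0.0, p0=1.0, q=0.01, r=0.25):
    """Scalar Kalman filter: at each step the a priori prediction (the
    previous estimate, inflated by process noise q) is fused with a noisy
    measurement z of variance r; the Kalman gain weights prior vs.
    observation. Returns the sequence of a posteriori estimates."""
    x, p, estimates = x0, p0, []
    for z in measurements:
        p_prior = p + q                # a priori covariance (prediction)
        k = p_prior / (p_prior + r)    # Kalman gain
        x = x + k * (z - x)            # fuse a priori state with measurement
        p = (1.0 - k) * p_prior        # a posteriori covariance
        estimates.append(x)
    return estimates
```

When visibility degrades (larger measurement variance r), the gain shrinks and the filter leans more heavily on the a priori prediction, which is why prior information helps under smog, fog, rain, snow, or poor lighting.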
Large eddy simulation and direct numerical simulation of high speed turbulent reacting flows
NASA Technical Reports Server (NTRS)
Adumitroaie, V.; Frankel, S. H.; Madnia, C. K.; Givi, P.
1993-01-01
The objective of this research is to make use of Large Eddy Simulation (LES) and Direct Numerical Simulation (DNS) for the computational analysis of high speed reacting flows. Our efforts in the first phase of this research, conducted within the past three years, have been directed at several issues pertaining to the intricate physics of turbulent reacting flows. In our previous five semi-annual reports submitted to NASA LaRC, as well as several technical papers in archival journals, the results of our investigations have been fully described. In this progress report, which differs in format from our previous documents, we focus only on the issue of LES. The reason for doing so is that LES is the primary issue of interest to our Technical Monitor and that our other findings were needed to support the activities conducted under this prime issue. The outcomes of our related investigations, nevertheless, are included in the appendices accompanying this report. The relevance of the materials in these appendices is, therefore, discussed only briefly within the body of the report. Here, results are presented of a priori and a posteriori analyses for validity assessments of assumed Probability Density Function (PDF) methods as potential subgrid scale (SGS) closures for LES of turbulent reacting flows. Simple non-premixed reacting systems involving an isothermal reaction of the type A + B yields Products under both chemical equilibrium and non-equilibrium conditions are considered. A priori analyses are conducted of a homogeneous box flow and a spatially developing planar mixing layer to investigate the performance of the Pearson family of PDFs as SGS models. A posteriori analyses are conducted of the mixing layer using a hybrid one-equation Smagorinsky/PDF SGS closure. 
The Smagorinsky closure augmented by the solution of the subgrid turbulent kinetic energy (TKE) equation is employed to account for hydrodynamic fluctuations, and the PDF is employed for modeling the effects of scalar fluctuations. The implementation of the model requires the knowledge of the local values of the first two SGS moments. These are provided by additional modeled transport equations. In both a priori and a posteriori analyses, the predicted results are appraised by comparison with subgrid averaged results generated by DNS. Based on these results, the paths to be followed in future investigations are identified.
Impact of a priori information on IASI ozone retrievals and trends
NASA Astrophysics Data System (ADS)
Barret, B.; Peiro, H.; Emili, E.; Le Flocgmoën, E.
2017-12-01
The IASI sensor has documented atmospheric water vapor, temperature and composition since 2007. The Software for a Fast Retrieval of IASI Data (SOFRID) has been developed to retrieve O3 and CO profiles from IASI in near-real time on a global scale. Information content analyses have shown that IASI enables the quantification of O3 independently in the troposphere, the UTLS and the stratosphere. Validation studies have demonstrated that the daily to seasonal variability of tropospheric and UTLS O3 is well captured by IASI, especially in the tropics. IASI-SOFRID retrievals have also been used to document the tropospheric composition during the Asian monsoon and contributed to determining the O3 evolution during the 2008-2016 period in the framework of the TOAR project. Nevertheless, IASI-SOFRID O3 is biased high in the UTLS and in the tropical troposphere, and the 8-year O3 trends from the different IASI products are significantly different from the O3 trends from UV-Vis satellite sensors (e.g. OMI). SOFRID is based on the Optimal Estimation Method, which requires a priori information to complete the information provided by the measured thermal infrared radiances. In SOFRID-O3 v1.5, used in TOAR, the a priori consists of a single O3 profile and associated covariance matrix based on global O3 radiosoundings. Such a global a priori is characterized by a very large variability, does not represent our best knowledge of the O3 profile at a given time and location, and is furthermore biased towards northern hemisphere middle latitudes. We have therefore implemented the possibility to use dynamical a priori data in SOFRID and performed experiments using O3 climatological data and MLS O3 analyses. We will present O3 distributions and comparisons with O3 radiosoundings for the different SOFRID-O3 retrievals. In particular, we will assess the impact of the different a priori data upon the O3 biases and trends during the IASI period.
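How strongly the a priori pulls a retrieval can be seen in a scalar caricature of the optimal estimation method (a one-variable sketch, not SOFRID itself; all numbers are hypothetical):

```python
def oem_scalar(y, k, x_a, s_a, s_e):
    """Scalar optimal-estimation retrieval: combine a measurement
    y = k * x + noise (noise variance s_e) with an a priori state x_a
    (prior variance s_a). Returns the retrieved state and the
    averaging-kernel value, which measures how much of the result comes
    from the measurement rather than the a priori."""
    gain = s_a * k / (k * k * s_a + s_e)
    x_hat = x_a + gain * (y - k * x_a)
    averaging_kernel = gain * k
    return x_hat, averaging_kernel
```

When the averaging kernel is well below 1 (low measurement sensitivity), the retrieval relaxes toward x_a, so a biased or unrepresentative a priori directly biases the retrieved O3 and any trends derived from it, which motivates the use of dynamical a priori data.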
NASA Technical Reports Server (NTRS)
Quinlan, Jesse R.; Drozda, Tomasz G.; McDaniel, James C.; Lacaze, Guilhem; Oefelein, Joseph
2015-01-01
In an effort to make large eddy simulation of hydrocarbon-fueled scramjet combustors more computationally accessible using realistic chemical reaction mechanisms, a compressible flamelet/progress variable (FPV) model was proposed that extends current FPV model formulations to high-speed, compressible flows. Development of this model relied on observations garnered from an a priori analysis of the Reynolds-Averaged Navier-Stokes (RANS) data obtained for the Hypersonic International Flight Research and Experimentation (HIFiRE) dual-mode scramjet combustor. The RANS data were obtained using a reduced chemical mechanism for the combustion of a JP-7 surrogate and were validated using available experimental data. These RANS data were then post-processed to obtain, in an a priori fashion, the scalar fields corresponding to an FPV-based modeling approach. In the current work, in addition to the proposed compressible flamelet model, a standard incompressible FPV model was also considered. Several candidate progress variables were investigated for their ability to recover static temperature and major and minor product species. The effects of pressure and temperature on the tabulated progress variable source term were characterized, and model coupling terms embedded in the Reynolds-averaged Navier-Stokes equations were studied. Finally, results for the novel compressible flamelet/progress variable model were presented to demonstrate the improvement attained by modeling the effects of pressure and flamelet boundary conditions on the combustion.
Quantification and propagation of disciplinary uncertainty via Bayesian statistics
NASA Astrophysics Data System (ADS)
Mantis, George Constantine
2002-08-01
Several needs exist in the military, commercial, and civil sectors for new hypersonic systems. These needs remain unfulfilled, due in part to the uncertainty encountered in designing these systems. This uncertainty takes a number of forms, including disciplinary uncertainty, that which is inherent in the analytical tools utilized during the design process. Yet, few efforts to date empower the designer with the means to account for this uncertainty within the disciplinary analyses. In the current state-of-the-art in design, the effects of this unquantifiable uncertainty significantly increase the risks associated with new design efforts. Typically, the risk proves too great to allow a given design to proceed beyond the conceptual stage. To that end, the research encompasses the formulation and validation of a new design method, a systematic process for probabilistically assessing the impact of disciplinary uncertainty. The method implements Bayesian Statistics theory to quantify this source of uncertainty, and propagate its effects to the vehicle system level. Comparison of analytical and physical data for existing systems, modeled a priori in the given analysis tools, leads to quantification of uncertainty in those tools' calculation of discipline-level metrics. Then, after exploration of the new vehicle's design space, the quantified uncertainty is propagated probabilistically through the design space. This ultimately results in the assessment of the impact of disciplinary uncertainty on the confidence in the design solution: the final shape and variability of the probability functions defining the vehicle's system-level metrics. Although motivated by the hypersonic regime, the proposed treatment of uncertainty applies to any class of aerospace vehicle, just as the problem itself affects the design process of any vehicle. A number of computer programs comprise the environment constructed for the implementation of this work. 
Application to a single-stage-to-orbit (SSTO) reusable launch vehicle concept, developed by the NASA Langley Research Center under the Space Launch Initiative, provides the validation case for this work, with the focus placed on economics, aerothermodynamics, propulsion, and structures metrics. (Abstract shortened by UMI.)
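The core quantification step described above, comparing a tool's predictions against physical data to obtain a probability distribution on its error, can be sketched with a conjugate normal Bayesian update (a simplified illustration, not the dissertation's full method; the prior and data are hypothetical):

```python
def normal_update(prior_mean, prior_var, observations, obs_var):
    """Conjugate normal Bayesian update: posterior over a tool's bias,
    given observed discrepancies (analysis minus test data), each with
    known measurement variance obs_var."""
    n = len(observations)
    post_var = 1.0 / (1.0 / prior_var + n / obs_var)
    post_mean = post_var * (prior_mean / prior_var
                            + sum(observations) / obs_var)
    return post_mean, post_var
```

With a diffuse prior, the posterior mean approaches the average observed discrepancy, and the shrinking posterior variance expresses growing confidence in the tool's quantified bias, which can then be propagated through the design space.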
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Kandler A; Santhanagopalan, Shriram; Yang, Chuanbo
Computer models are helping to accelerate the design and validation of next generation batteries and provide valuable insights not possible through experimental testing alone. Validated 3-D physics-based models exist for predicting electrochemical performance, thermal and mechanical response of cells and packs under normal and abuse scenarios. The talk describes present efforts to make the models better suited for engineering design, including improving their computation speed, developing faster processes for model parameter identification including under aging, and predicting the performance of a proposed electrode material recipe a priori using microstructure models.
Photochemical Phenomenology Model for the New Millennium
NASA Technical Reports Server (NTRS)
Bishop, James; Evans, J. Scott
2000-01-01
This project tackles the problem of conversion of validated a priori physics-based modeling capabilities, specifically those relevant to the analysis and interpretation of planetary atmosphere observations, to application-oriented software for use in science and science-support activities. The software package under development, named the Photochemical Phenomenology Modeling Tool (PPMT), has particular focus on the atmospheric remote sensing data to be acquired by the CIRS instrument during the CASSINI Jupiter flyby and orbital tour of the Saturnian system. Overall, the project has followed the development outline given in the original proposal, and the Year 1 design and architecture goals have been met. Specific accomplishments and the difficulties encountered are summarized in this report. Most of the effort has gone into complete definition of the PPMT interfaces within the context of today's IT arena: adoption and adherence to the CORBA Component Model (CCM) has yielded a solid architecture basis, and CORBA-related issues (services, specification options, development plans, etc.) have been largely resolved. Implementation goals have been redirected somewhat so as to be more relevant to the upcoming CASSINI flyby of Jupiter, with focus now being more on data analysis and remote sensing retrieval applications.
Developing an emergency department crowding dashboard: A design science approach.
Martin, Niels; Bergs, Jochen; Eerdekens, Dorien; Depaire, Benoît; Verelst, Sandra
2017-08-30
As an emergency department (ED) is a complex adaptive system, the analysis of continuously gathered data is valuable to gain insight into the real-time patient flow. To support the analysis and management of ED operations, relevant data should be provided in an intuitive way. Within this context, this paper outlines the development of a dashboard which provides real-time information regarding ED crowding. The research project underlying this paper follows the principles of design science research, which involves the development and study of artifacts that aim to solve a generic problem. To determine the crowding indicators that are desired in the dashboard, a modified Delphi study is used. The dashboard is implemented using the open-source shinydashboard package in R. A dashboard is developed containing the desired crowding indicators, together with general patient flow characteristics. It is demonstrated using a dataset from a Flemish ED and fulfills the requirements that were defined a priori. The developed dashboard provides real-time information on ED crowding. This information enables ED staff to judge whether corrective actions are required in an effort to avoid the adverse effects of ED crowding. Copyright © 2017 Elsevier Ltd. All rights reserved.
Mass univariate analysis of event-related brain potentials/fields I: a critical tutorial review.
Groppe, David M; Urbach, Thomas P; Kutas, Marta
2011-12-01
Event-related potentials (ERPs) and magnetic fields (ERFs) are typically analyzed via ANOVAs on mean activity in a priori windows. Advances in computing power and statistics have produced an alternative, mass univariate analyses consisting of thousands of statistical tests and powerful corrections for multiple comparisons. Such analyses are most useful when one has little a priori knowledge of effect locations or latencies, and for delineating effect boundaries. Mass univariate analyses complement and, at times, obviate traditional analyses. Here we review this approach as applied to ERP/ERF data and four methods for multiple comparison correction: strong control of the familywise error rate (FWER) via permutation tests, weak control of FWER via cluster-based permutation tests, false discovery rate control, and control of the generalized FWER. We end with recommendations for their use and introduce free MATLAB software for their implementation. Copyright © 2011 Society for Psychophysiological Research.
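One of the four corrections reviewed, false discovery rate control, is commonly realized with the Benjamini-Hochberg step-up procedure. The sketch below is a generic Python illustration with made-up p-values, not the authors' MATLAB toolbox:

```python
import numpy as np

def fdr_bh(p_values, q=0.05):
    """Benjamini-Hochberg FDR control: return a boolean mask of the
    tests declared significant at false discovery rate q."""
    p = np.asarray(p_values, dtype=float)
    m = p.size
    order = np.argsort(p)                       # sort p-values ascending
    thresholds = q * np.arange(1, m + 1) / m    # step-up thresholds q*i/m
    below = p[order] <= thresholds
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])        # largest i with p_(i) <= q*i/m
        reject[order[:k + 1]] = True            # reject all smaller p-values too
    return reject

# Toy example: a few genuine effects among mostly null tests
p = np.array([0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.212, 0.216])
sig = fdr_bh(p, q=0.05)
```

In a mass univariate ERP analysis, `p` would hold one p-value per (channel, time point) test; the step-up rule adapts the threshold to the number of apparently real effects.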
Implementation of safety checklists in surgery: a realist synthesis of evidence.
Gillespie, Brigid M; Marshall, Andrea
2015-09-28
The aim of this review is to present a realist synthesis of the evidence on implementation interventions to improve adherence to the use of safety checklists in surgery. Surgical safety checklists have been shown to improve teamwork and patient safety in the operating room. Yet, despite the benefits associated with their use, universal implementation of and compliance with these checklists have been inconsistent. An overview of the literature from 2008 is examined in relation to checklist implementation, compliance, and sustainability. Pawson and Rycroft-Malone's realist synthesis methodology was used to explain the interaction between context, mechanism, and outcome. This approach incorporated the following: defining the scope of the review, searching and appraising the evidence, extracting and synthesising the findings, and disseminating, implementing, and evaluating the evidence. We identified two theories a priori that explain contextual nuances associated with the implementation and evaluation of checklists in surgery: Normalisation Process Theory and Responsive Regulation Theory. We identified four a priori propositions: (1) checklist protocols that are prospectively tailored to the context are more likely to be used and sustained in practice; (2) fidelity and sustainability are increased when checklist protocols can be seamlessly integrated into daily professional practice; (3) routine embedding of checklist protocols in practice is influenced by factors that promote or inhibit clinicians' participation; and (4) regulation reinforcement mechanisms that are more contextually responsive should lead to greater compliance in using checklist protocols. The final explanatory model suggests that the sustained use of surgical checklists is discipline-specific and is more likely to occur when medical staff are actively engaged and leading the process of implementation.
Involving clinicians in tailoring the checklist to better fit their context of practice and giving them the opportunity to reflect and evaluate the implementation intervention enables greater participation and ownership of the process. A major limitation in the surgical checklist literature is the lack of robust descriptions of intervention methods and implementation strategies. Despite this, two consequential findings have emerged through this realist synthesis: First, the sustained use of surgical checklists is discipline-specific and is more successful when physicians are actively engaged and leading implementation. Second, involving clinicians in tailoring the checklist to their context and encouraging them to reflect on and evaluate the implementation process enables greater participation and ownership.
The ethical use of existing samples for genome research.
Bathe, Oliver F; McGuire, Amy L
2009-10-01
Modern biobanking efforts consist of prospective collections of tissues linked to clinical data for patients who have given informed consent for the research use of their specimens and data, including their DNA. In such efforts, patient autonomy and privacy are well respected because of the prospective nature of the informed consent process. However, one of the richest sources of tissue for research continues to be the millions of archived samples collected by pathology departments during normal clinical care or for research purposes without specific consent for future research or genetic analysis. Because specific consent was not obtained a priori, issues related to individual privacy and autonomy are much more complicated. A framework for accessing these existing samples and related clinical data for research is presented. Archival tissues may be accessed only when there is a reasonable likelihood of generating beneficial and scientifically valid information. To minimize risks, databases containing information related to the tissue and to clinical data should be coded, no personally identifying phenotypic information should be included, and access should be restricted to bona fide researchers for legitimate research purposes. These precautions, if implemented appropriately, should ensure that the research use of archival tissue and data is no more than minimal risk. A waiver of the requirement for informed consent would then be justified if reconsent is shown to be impracticable. A waiver of consent should not be granted, however, if there is a significant risk to privacy, if the proposed research use is inconsistent with the original consent (where there is one), or if the potential harm from a privacy breach is considerable.
Kerfriden, P.; Goury, O.; Rabczuk, T.; Bordas, S.P.A.
2013-01-01
We propose in this paper a reduced order modelling technique based on domain partitioning for parametric problems of fracture. We show that coupling domain decomposition with projection-based model order reduction makes it possible to focus the numerical effort where it is most needed: around the zones where damage propagates. No a priori knowledge of the damage pattern is required; the extraction of the corresponding spatial regions is based solely on algebra. The efficiency of the proposed approach is demonstrated numerically with an example relevant to engineering fracture. PMID:23750055
NASA Astrophysics Data System (ADS)
Feng, J.; Bai, L.; Liu, S.; Su, X.; Hu, H.
2012-07-01
In this paper, MODIS remote sensing data, featuring low cost, high temporal frequency, and moderate-to-low spatial resolution, were first used in the North China Plain (NCP) study region to carry out mixed-pixel spectral decomposition and extract a useful regionalized indicator parameter (RIP) from the initially selected indicators, namely the fraction of winter wheat planting area in each pixel, which serves as a regionalized indicator variable (RIV) for spatial sampling. The RIV values were then analyzed spatially to characterize the spatial structure (i.e., spatial correlation and variation) of the NCP, and this structure was further processed into scale-fitting, valid a priori knowledge for spatial sampling. Subsequently, building on the idea of rationally integrating probability-based and model-based sampling techniques and effectively utilizing this a priori knowledge, spatial sampling models and design schemes, together with their optimization and optimal selection, were developed, providing a scientific basis for improving and optimizing existing spatial sampling schemes for large-scale cropland remote sensing monitoring. Additionally, through an adaptive analysis and decision strategy, optimal local spatial prediction and a gridded system of extrapolation results were used to implement an adaptive reporting pattern of spatial sampling in accordance with report-covering units, so as to satisfy the actual needs of sampling surveys.
A Priori Design of Optimal Electro-Optic Materials for Laser Eye Protection
2008-12-01
(Jacquemin et al., 2008). In this paper, we have also used the recently proposed BNL functional (Livshits and Baer, 2007). The BNL functional was recently implemented in the highly parallel NWChem program (Andzelm et al., 2008) and optimized for several systems. The static hyperpolarizability β, characterized by its projection onto the dipole moment, (Σi μi Σj βijj) / |μ|, was calculated using a finite-field method in the LC-DFT calculations.
Assimilation of the Microwave Limb Sounder Radiances
NASA Technical Reports Server (NTRS)
Wargan, K.; Read, W.; Livesey, N.; Wagner, P.; Nguyen, H.; Pawson, S.
2012-01-01
It has been shown that the assimilation of limb-sounder data can significantly improve the representation of ozone in NASA's GEOS Data Assimilation System (GEOS-DAS), particularly in the stratosphere. The studies conducted so far utilized retrieved data from the MIPAS, POAM, ILAS and EOS Microwave Limb Sounder (EOS MLS) instruments. Direct assimilation of the radiance data is the natural next step to those studies. The motivation for working with radiances is twofold. First, retrieval algorithms use a priori data which are either climatological or obtained from previous analyses. This introduces additional uncertainty and, in some cases, may lead to "self-contamination", when the a priori is taken from the same assimilation system that subsequently ingests the retrieved observations. Second, radiances can be available in near real time, providing an opportunity for operational assimilation, which could also help improve the use of infrared radiances from operational satellite instruments. In this presentation we summarize our ongoing work on implementing the assimilation of EOS MLS radiances into the GEOS-5 DAS. This work focuses on assimilation of band 7 brightness temperatures, which are sensitive to ozone. Our implementation uses the MLS Callable Forward Model developed by the MLS team at NASA JPL as the observation operator. We will describe our approach and recent, still-preliminary results. In particular, we will demonstrate that this approach has the potential to improve the vertical structure of ozone in the lower tropical stratosphere as compared with the retrieved MLS product. We will discuss the computational efficiency of this implementation.
NASA Astrophysics Data System (ADS)
Trauth, N.; Schmidt, C.; Munz, M.
2016-12-01
Heat as a natural tracer to quantify water fluxes between groundwater and surface water has evolved into a standard hydrological method. Typically, time series of temperatures in the surface water and in the sediment are observed and subsequently evaluated with a vertical 1D representation of heat transport by advection and dispersion. Several analytical solutions, as well as implementations in user-friendly software, exist to estimate water fluxes from the observed temperatures. Analytical solutions can be easily implemented, but assumptions about the boundary conditions have to be made a priori, e.g. a sinusoidal upper temperature boundary. Numerical models offer more flexibility and can handle temperature data characterized by irregular variations, such as storm-event induced temperature changes, that cannot readily be incorporated in analytical solutions. This also reduces the effort of data preprocessing, such as the extraction of the diurnal temperature variation. We developed software to estimate water FLUXes Based On Temperatures (FLUX-BOT). FLUX-BOT is a numerical code written in MATLAB that calculates vertical water fluxes in saturated sediments, based on the inversion of measured temperature time series observed at multiple depths. It applies a cell-centered Crank-Nicolson implicit finite difference scheme to solve the one-dimensional heat advection-conduction equation. Besides its core inverse numerical routines, FLUX-BOT includes functions for visualizing the results and for performing uncertainty analysis. We provide applications of FLUX-BOT to generic as well as measured temperature data to demonstrate its performance.
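The forward problem at the core of the scheme described, a Crank-Nicolson discretization of the 1D heat advection-conduction equation with measured temperatures as Dirichlet boundaries, can be sketched as follows. This is a simplified Python illustration (FLUX-BOT itself is MATLAB and additionally solves the inverse problem); grid spacing, time step, and parameter values are invented for the example:

```python
import numpy as np

def crank_nicolson_step(T, dt, dz, kappa, v, T_top, T_bot):
    """Advance dT/dt = kappa * d2T/dz2 - v * dT/dz by one time step with a
    Crank-Nicolson scheme; the endpoint temperatures are prescribed, as when
    measured temperature series serve as boundary conditions."""
    n = T.size
    a = kappa / dz**2 + v / (2 * dz)   # coefficient of T[i-1]
    b = -2 * kappa / dz**2             # coefficient of T[i]
    c = kappa / dz**2 - v / (2 * dz)   # coefficient of T[i+1]
    A = np.zeros((n, n))
    for i in range(1, n - 1):
        A[i, i - 1], A[i, i], A[i, i + 1] = a, b, c
    I = np.eye(n)
    M1 = I - 0.5 * dt * A              # implicit half of the operator
    rhs = (I + 0.5 * dt * A) @ T       # explicit half applied to current state
    # Dirichlet boundaries: pin endpoints to the measured values
    M1[0, :], M1[-1, :] = 0.0, 0.0
    M1[0, 0] = M1[-1, -1] = 1.0
    rhs[0], rhs[-1] = T_top, T_bot
    return np.linalg.solve(M1, rhs)

# Sanity check: a uniform profile with matching boundary temps is stationary
T = np.full(11, 12.0)
T_new = crank_nicolson_step(T, dt=60.0, dz=0.02, kappa=1e-7, v=1e-6,
                            T_top=12.0, T_bot=12.0)
```

An inversion in the spirit of the paper would wrap this step in an optimizer that adjusts the flux velocity `v` until modeled temperatures at the sensor depths match the observed series.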
On the regularization for nonlinear tomographic absorption spectroscopy
NASA Astrophysics Data System (ADS)
Dai, Jinghang; Yu, Tao; Xu, Lijun; Cai, Weiwei
2018-02-01
Tomographic absorption spectroscopy (TAS) has attracted increased research effort recently due to developments in both hardware and new imaging concepts such as nonlinear tomography and compressed sensing. Nonlinear TAS is an emerging modality based on the concept of nonlinear tomography, and it has been successfully demonstrated both numerically and experimentally. However, all previous demonstrations were realized using only two orthogonal projections, simply for ease of implementation. In this work, we examine the performance of nonlinear TAS using other beam arrangements and test the effectiveness of the beam optimization technique that has been developed for linear TAS. In addition, so far only the smoothness prior has been adopted and applied in nonlinear TAS. Nevertheless, there are other useful priors, such as sparsity and model-based priors, which have not yet been investigated. This work aims to show how these priors can be implemented and included in the reconstruction process. Regularization through a Bayesian formulation is introduced specifically for this purpose, and a method for determining a proper regularization factor is proposed. The comparative studies performed with different beam arrangements and regularization schemes on a few representative phantoms suggest that the beam optimization method developed for linear TAS also works for its nonlinear counterpart, and that the regularization scheme should be selected according to the a priori information available in a specific application scenario so as to achieve the best reconstruction fidelity. Though this work is conducted in the context of nonlinear TAS, it can also provide useful insights for other tomographic modalities.
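As a concrete illustration of regularization through a Bayesian (MAP) formulation with a smoothness prior, the sketch below solves a deliberately underdetermined two-projection toy problem. The operator, data, and regularization factor are invented for illustration and are not taken from the paper:

```python
import numpy as np

def regularized_solve(A, b, lam):
    """MAP estimate under a Gaussian smoothness prior: minimizes
    ||A x - b||^2 + lam * ||L x||^2, where L is a first-difference operator,
    via the normal equations (A^T A + lam L^T L) x = A^T b."""
    n = A.shape[1]
    L = np.diff(np.eye(n), axis=0)      # first-difference (smoothness) prior
    return np.linalg.solve(A.T @ A + lam * (L.T @ L), A.T @ b)

# Toy "two-projection" problem: each row sums a pair of pixels in a 4-pixel
# field, so the data alone cannot determine the pixels individually.
A = np.array([[1., 1., 0., 0.],
              [0., 0., 1., 1.]])
b = np.array([2., 4.])
x = regularized_solve(A, b, lam=1e-3)
```

The smoothness prior resolves the ambiguity by distributing each projection sum evenly within its pair (here roughly [1, 1, 2, 2]); swapping in a sparsity or model-based prior changes only the penalty term, and the regularization factor `lam` trades data fidelity against the prior.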
NASA Astrophysics Data System (ADS)
Sutliff, T. J.; Otero, A. M.; Urban, D. L.
2002-01-01
The Physical Sciences Research Program of NASA has chartered a broad suite of peer-reviewed research investigating both fundamental combustion phenomena and applied combustion research topics. Fundamental research provides insights to develop accurate simulations of complex combustion processes and allows developers to improve the efficiency of combustion devices, to reduce the production of harmful emissions, and to reduce the incidence of accidental uncontrolled combustion (fires, explosions). The applied research benefits humans living and working in space through its fire safety program. The Combustion Science Discipline is implementing a structured flight research program utilizing the International Space Station (ISS) and two of its premier facilities, the Combustion Integrated Rack of the Fluids and Combustion Facility and the Microgravity Science Glovebox, to conduct this space-based research. This paper reviews the current vision of Combustion Science research planned for International Space Station implementation from 2003 through 2012. A variety of research efforts in droplets and sprays, solid-fuels combustion, and gaseous combustion have been independently selected and critiqued through a series of peer-review processes. During this period, while both the ISS carrier and its research facilities are under development, the Combustion Science Discipline has synergistically combined research efforts into sub-topical areas. To conduct this research aboard the ISS in the most cost-effective and resource-efficient manner, the sub-topic research areas are implemented via a multi-user hardware approach. This paper also summarizes the multi-user hardware approach and recaps the progress made in developing these research hardware systems. A balanced program content has been developed to maximize the production of fundamental and applied combustion research results within the current budgetary and ISS operational resource constraints.
Decisions on utilizing the Combustion Integrated Rack and the Microgravity Science Glovebox are made based on facility capabilities and research requirements. To maximize research potential, additional research objectives are specified as desires a priori during the research design phase. These expanded research goals, which are designed to be achievable even with late addition of operational resources, allow additional research of a known, peer-endorsed scope to be conducted at marginal cost. Additional operational resources such as upmass, crewtime, data downlink bandwidth, and stowage volume may be presented by the ISS planners late in the research mission planning process. The Combustion Discipline has put in place plans to be prepared to take full advantage of such opportunities.
A framework for interval-valued information system
NASA Astrophysics Data System (ADS)
Yin, Yunfei; Gong, Guanghong; Han, Liang
2012-09-01
An interval-valued information system is used to transform a conventional dataset into interval-valued form. To conduct interval-valued data mining, we carry out two investigations: (1) constructing the interval-valued information system, and (2) conducting interval-valued knowledge discovery. In constructing the interval-valued information system, we first discover the paired attributes in the database, then store them in neighbouring locations in a common database and regard them as 'one' new field. In conducting interval-valued knowledge discovery, we utilise related a priori knowledge, regard it as the control objective, and design an approximate closed-loop control mining system. On the implemented experimental platform (prototype), we conduct the corresponding experiments and compare the proposed algorithms with several typical algorithms, such as the Apriori algorithm, the FP-growth algorithm and the CLOSE+ algorithm. The experimental results show that the interval-valued information system method is more effective than the conventional algorithms in discovering interval-valued patterns.
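The construction step, discovering related attribute pairs and storing each pair as a single interval-valued field, might look like the following sketch. The column names, pairings, and records are hypothetical, chosen only to illustrate the transformation:

```python
# Hypothetical records with paired columns (e.g. daily min/max readings)
records = [
    {"temp_min": 3.1, "temp_max": 9.4, "hum_min": 40, "hum_max": 70},
    {"temp_min": 4.0, "temp_max": 11.2, "hum_min": 35, "hum_max": 65},
]

# Each discovered pair (low column, high column) becomes one interval field
PAIRS = [("temp_min", "temp_max", "temp"), ("hum_min", "hum_max", "hum")]

def to_interval_system(rows, pairs):
    """Transform conventional rows into an interval-valued information
    system: each attribute pair is stored as one (low, high) field."""
    out = []
    for row in rows:
        out.append({name: (row[lo], row[hi]) for lo, hi, name in pairs})
    return out

system = to_interval_system(records, PAIRS)
```

Mining algorithms can then operate on the interval fields directly, e.g. testing interval overlap or containment instead of equality of scalar values.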
Super-resolution structured illumination in optically thick specimens without fluorescent tagging
NASA Astrophysics Data System (ADS)
Hoffman, Zachary R.; DiMarzio, Charles A.
2017-11-01
This research extends the work of Hoffman et al. to provide both sectioning and super-resolution using random patterns within thick specimens. Two methods of processing structured illumination in reflectance have been developed without the need for a priori knowledge of either the optical system or the modulation patterns. We explore the use of two deconvolution algorithms that assume either Gaussian or sparse priors. This paper will show that while both methods accomplish their intended objective, the sparse priors method provides superior resolution and contrast against all tested targets, providing anywhere from ˜1.6× to ˜2× resolution enhancement. The methods developed here can reasonably be implemented to work without a priori knowledge about the patterns or point spread function. Further, all experiments are run using an incoherent light source, unknown random modulation patterns, and without the use of fluorescent tagging. These additional modifications are challenging, but the generalization of these methods makes them prime candidates for clinical application, providing super-resolved noninvasive sectioning in vivo.
CLAES Product Improvement by use of GSFC Data Assimilation System
NASA Technical Reports Server (NTRS)
Kumer, J. B.; Douglass, Anne (Technical Monitor)
2001-01-01
Recent developments in chemistry transport models (CTM) and in data assimilation systems (DAS) indicate impressive predictive capability for the movement of air parcels and the chemistry that goes on within them. This project was aimed at exploring the use of this capability to achieve improved retrieval of geophysical parameters from remote sensing data. The specific goal was to improve retrieval of the CLAES CH4 data obtained during the active north high latitude dynamics event of 18 to 25 February 1992. The model capabilities would be used: (1) rather than climatology, to improve on the first guess and the a priori fields, and (2) to provide horizontal gradients to include in the retrieval forward model. The retrieval would be implemented with the first forward DAS prediction. The results would feed back to the DAS, and a second DAS prediction for first guess, a priori fields, and gradients would feed to the retrieval. The process would repeat to convergence and then proceed to the next day.
NASA Technical Reports Server (NTRS)
Moller, Bjorn; Garro, Alfredo; Falcone, Alberto; Crues, Edwin Z.; Dexter, Daniel E.
2016-01-01
Distributed and Real-Time Simulation plays a key role in the Space domain, being exploited for mission and systems analysis and engineering as well as for crew training and operational support. One of the most popular standards is the 1516-2010 IEEE Standard for Modeling and Simulation (M&S) High Level Architecture (HLA). HLA supports the implementation of distributed simulations (called Federations) in which a set of simulation entities (called Federates) interact using a Run-Time Infrastructure (RTI). In a given Federation, a Federate can publish and/or subscribe to objects and interactions on the RTI only in accordance with their structures as defined in a FOM (Federation Object Model). Currently, the Space domain is characterized by a set of incompatible FOMs that, although they meet the specific needs of different organizations and projects, increase the long-term cost of interoperability. In this context, the availability of a reference FOM for the Space domain will enable the development of interoperable HLA-based simulators for related joint projects and collaborations among worldwide organizations involved in the Space domain (e.g. NASA, ESA, Roscosmos, and JAXA). The paper presents a first set of results achieved by a SISO standardization effort that aims at providing a Space Reference FOM for international collaboration on Space systems simulations.
NASA Astrophysics Data System (ADS)
Basu, N. B.; Van Meter, K. J.
2012-12-01
Increased nutrient loads delivered from watersheds due to agricultural intensification, industrialization, and urbanization have contributed globally to the persistence of large hypoxic zones in inland and coastal waters. Watershed management practices targeting these non-point source pollutants often lead to little or no improvement in water quality, even after extensive implementation of conservation measures or Best Management Practices (BMPs). The lag time between implementation of a conservation measure and resultant water quality benefits has recently been recognized as an important factor in the "apparent" failure of these BMPs. When conservation measures are implemented without explicit consideration of the lag time and with expectations that they will lead to immediate benefits, the resulting failure to meet such expectations can discourage vital restoration efforts. It is therefore important to quantify the lag times associated with watershed management efforts a priori and to implement restoration strategies targeted specifically at minimizing lag times and maximizing restoration benefits. The focus of this research is to develop a framework for understanding the time lags between land-use changes and stream water quality benefits. We hypothesize that such time lags arise from nutrient legacies building over decades of fertilizer application. For nitrogen (N), one can conceptualize this as either hydrologic legacy, in the form of dissolved nitrate that is delayed due to slow groundwater transport, or as biogeochemical legacy, in the form of organic N, possibly in dissolved or readily mineralizable forms. Indeed, mass-balance studies across the Mississippi and Thames river basins indicate the possibility of missing N mass in these landscapes, with inputs being consistently greater than the outputs even when accounting for all possible pathways of nitrogen transformation. 
Historical soil data within the upper Mississippi River Basin (MRB) indicate that agriculture depletes organic N in surface soil, but leads to N accumulations deeper in the profile. Nitrogen accumulation estimates (approximately 2 million Mt/yr) based on the historical data are startlingly close to the deficit suggested by mass-balance studies of the MRB (3 million Mt/yr). Understanding the lag times associated with such biogeochemical legacies requires quantification of this accumulation as a function of landscape attributes, climate, and management controls, as well as the rate of mineralization of accumulated N after implementation of management practices. Understanding hydrologic legacy requires a partitioning of flow along various pathways (e.g., overland flow, tile flow, or groundwater pathways), and the distribution of travel times along the pathways. Based on this framework, we developed a coupled hydrologic and biogeochemical model to quantify these legacies and predict landscape recovery times as a function of natural and anthropogenic controls.
Blind detection of giant pulses: GPU implementation
NASA Astrophysics Data System (ADS)
Ait-Allal, Dalal; Weber, Rodolphe; Dumez-Viou, Cédric; Cognard, Ismael; Theureau, Gilles
2012-01-01
Radio astronomical pulsar observations require specific instrumentation and dedicated signal processing to cope with the dispersion caused by the interstellar medium. Moreover, the quality of observations can be limited by radio frequency interference (RFI) generated by telecommunications activity. This article presents the innovative pulsar instrumentation based on graphics processing units (GPUs) which has been designed at the Nançay Radio Astronomical Observatory. In addition, for giant pulse searches, we propose a new approach which combines a hardware-efficient search method with some RFI mitigation capabilities. Although this approach is less sensitive than the classical approach, its advantage is that no a priori information on the pulsar parameters is required. The validation of a GPU implementation is under way.
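For context, the interstellar dispersion that such instrumentation must compensate for follows the standard cold-plasma delay law, Δt = K · DM · (f_lo⁻² − f_hi⁻²). The sketch below evaluates it for an assumed dispersion measure and band; the DM and frequencies are illustrative values, not taken from the article:

```python
# Standard dispersion constant, in MHz^2 pc^-1 cm^3 s
K_DM = 4.148808e3

def dispersion_delay(dm, f_lo_mhz, f_hi_mhz):
    """Extra arrival delay (seconds) of the low-frequency edge of the band
    relative to the high-frequency edge, for dispersion measure dm
    in pc cm^-3 and frequencies in MHz."""
    return K_DM * dm * (f_lo_mhz**-2 - f_hi_mhz**-2)

# e.g. a Crab-like DM of ~56.8 pc cm^-3 across a 1400-1500 MHz band
dt = dispersion_delay(56.8, 1400.0, 1500.0)
```

A search pipeline must remove this frequency-dependent delay (by incoherent or coherent dedispersion) before summing the band, otherwise short giant pulses are smeared below detectability.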
NASA Technical Reports Server (NTRS)
Gingras, David R.; Barnhart, Billy P.; Martos, Borja; Ratvasky, Thomas P.; Morelli, Eugene
2011-01-01
Fatal loss-of-control (LOC) accidents have been directly related to in-flight airframe icing. The prototype system presented in this paper directly addresses the need for real-time onboard envelope protection in icing conditions. The combination of a priori information and real-time aerodynamic estimation is shown to provide sufficient input for determining safe limits of the flight envelope during in-flight icing encounters. The Icing Contamination Envelope Protection (ICEPro) system has been designed and implemented to identify degradations in airplane performance and flying qualities resulting from ice contamination and to provide safe flight-envelope cues to the pilot. Components of ICEPro are described and results from preliminary tests are presented.
NASA Astrophysics Data System (ADS)
Newman, A. J.; Sampson, K. M.; Wood, A. W.; Hopson, T. M.; Brekke, L. D.; Arnold, J.; Raff, D. A.; Clark, M. P.
2013-12-01
Skill in model-based hydrologic forecasting depends on the ability to estimate a watershed's initial moisture and energy conditions, to forecast future weather and climate inputs, and on the quality of the hydrologic model's representation of watershed processes. The impact of these factors on prediction skill varies regionally, seasonally, and by model. We are investigating these influences using a watershed simulation platform that spans the continental US (CONUS), encompassing a broad range of hydroclimatic variation, and that uses the current simulation models of National Weather Service streamflow forecasting operations. The first phase of this effort centered on the implementation and calibration of the SNOW-17 and Sacramento soil moisture accounting (SAC-SMA) based hydrologic modeling system for a range of watersheds. The base configuration includes 630 basins in the United States Geological Survey's Hydro-Climatic Data Network 2009 (HCDN-2009, Lins 2012) conterminous U.S. basin subset. Retrospective model forcings were derived from Daymet (http://daymet.ornl.gov/), and, where available, a priori parameter estimates were based on or compared with the operational NWS model parameters. Model calibration was accomplished by several objective, automated strategies, including the shuffled complex evolution (SCE) optimization approach developed within the NWS in the early 1990s (Duan et al. 1993). This presentation describes outcomes from this effort, including insights about measuring simulation skill and about relationships between simulation skill and model parameters, basin characteristics (climate, topography, vegetation, soils), and the quality of forcing inputs. Reference: Thornton, P.; Thornton, M.; Mayer, B.; Wilhelmi, N.; Wei, Y.; Devarakonda, R.; Cook, R. Daymet: Daily Surface Weather on a 1 km Grid for North America, 1980-2008; Oak Ridge National Laboratory Distributed Active Archive Center: Oak Ridge, TN, USA, 2012; Volume 10.
Effectiveness-implementation Hybrid Designs
Curran, Geoffrey M.; Bauer, Mark; Mittman, Brian; Pyne, Jeffrey M.; Stetler, Cheryl
2013-01-01
Objectives This study proposes methods for blending design components of clinical effectiveness and implementation research. Such blending can provide benefits over pursuing these lines of research independently; for example, more rapid translational gains, more effective implementation strategies, and more useful information for decision makers. This study proposes a “hybrid effectiveness-implementation” typology, describes a rationale for their use, outlines the design decisions that must be faced, and provides several real-world examples. Results An effectiveness-implementation hybrid design is one that takes a dual focus a priori in assessing clinical effectiveness and implementation. We propose 3 hybrid types: (1) testing effects of a clinical intervention on relevant outcomes while observing and gathering information on implementation; (2) dual testing of clinical and implementation interventions/strategies; and (3) testing of an implementation strategy while observing and gathering information on the clinical intervention’s impact on relevant outcomes. Conclusions The hybrid typology proposed herein must be considered a construct still in evolution. Although traditional clinical effectiveness and implementation trials are likely to remain the most common approach to moving a clinical intervention through from efficacy research to public health impact, judicious use of the proposed hybrid designs could speed the translation of research findings into routine practice. PMID:22310560
Merging Digital Surface Models Implementing Bayesian Approaches
NASA Astrophysics Data System (ADS)
Sadeq, H.; Drummond, J.; Li, Z.
2016-06-01
In this research, DSMs from different sources have been merged. The merging is based on a probabilistic model using a Bayesian approach. The input data have been sourced from very high resolution satellite imagery sensors (e.g. WorldView-1 and Pleiades). A Bayesian approach is deemed preferable when the data obtained from the sensors are limited and many measurements are difficult or costly to obtain; the problem of lack of data can then be mitigated by introducing a priori estimates. To infer the prior data, the roofs of the buildings are assumed to be smooth, and for that purpose local entropy has been implemented. In addition to the a priori estimates, GNSS RTK measurements have been collected in the field, which are used as check points to assess the quality of the DSMs and to validate the merging result. The model has been applied in the West End of Glasgow, which contains different kinds of buildings, such as flat-roofed and hipped-roofed buildings. Both quantitative and qualitative methods have been employed to validate the merged DSM. The validation results have shown that the model successfully improved the quality of the DSMs, improving characteristics such as the roof surfaces, which consequently led to better representations. In addition, the developed model has been compared with the well-established Maximum Likelihood model and showed similar quantitative statistical results and better qualitative results. Although the proposed model has been applied to DSMs derived from satellite imagery, it can be applied to DSMs from any other source.
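A minimal sketch of this kind of Bayesian fusion treats each DSM height as a Gaussian estimate and combines them by precision weighting. The grids and variances below are invented for illustration, and the actual model also uses entropy-derived roof-smoothness priors not reproduced here:

```python
import numpy as np

def bayesian_merge(h1, var1, h2, var2):
    """Fuse two height grids viewed as independent Gaussian estimates:
    the posterior mean is the precision-weighted average and the posterior
    variance is the reciprocal of the summed precisions."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    h = (w1 * h1 + w2 * h2) / (w1 + w2)
    var = 1.0 / (w1 + w2)
    return h, var

# Two 2x2 DSM patches with assumed per-source variances (m^2)
h_a = np.array([[10.0, 10.5], [11.0, 11.5]])   # e.g. from one stereo source
h_b = np.array([[10.4, 10.3], [11.2, 11.1]])   # e.g. from a second source
h, var = bayesian_merge(h_a, 0.25, h_b, 1.0)
```

The merged surface leans toward the more precise source (here the first, with variance 0.25 m²), and the posterior variance is smaller than either input variance, which is the formal sense in which merging improves DSM quality.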
Curran, Geoffrey M; Bauer, Mark; Mittman, Brian; Pyne, Jeffrey M; Stetler, Cheryl
2012-03-01
This study proposes methods for blending design components of clinical effectiveness and implementation research. Such blending can provide benefits over pursuing these lines of research independently; for example, more rapid translational gains, more effective implementation strategies, and more useful information for decision makers. This study proposes a "hybrid effectiveness-implementation" typology, describes a rationale for their use, outlines the design decisions that must be faced, and provides several real-world examples. An effectiveness-implementation hybrid design is one that takes a dual focus a priori in assessing clinical effectiveness and implementation. We propose 3 hybrid types: (1) testing effects of a clinical intervention on relevant outcomes while observing and gathering information on implementation; (2) dual testing of clinical and implementation interventions/strategies; and (3) testing of an implementation strategy while observing and gathering information on the clinical intervention's impact on relevant outcomes. The hybrid typology proposed herein must be considered a construct still in evolution. Although traditional clinical effectiveness and implementation trials are likely to remain the most common approach to moving a clinical intervention through from efficacy research to public health impact, judicious use of the proposed hybrid designs could speed the translation of research findings into routine practice.
Hennemann, Severin; Witthöft, Michael; Bethge, Matthias; Spanier, Katja; Beutel, Manfred E; Zwerenz, Rüdiger
2018-04-01
Occupational e-mental-health (OEMH) may extend existing instruments for the preservation or restoration of health and work ability. As a key precondition to efficient implementation, this study examined acceptance and person-centered barriers to potential uptake of OEMH for work-related distress in employees with an elevated risk of early retirement. Within the framework of the "Third German Sociomedical Panel of Employees", 1829 employees with prior sickness absence payments filled out a self-administered questionnaire. Participants had a mean age of 49.93 years (SD = 4.06). 6.2% indicated prior use of eHealth interventions. Potential predictors of acceptance of OEMH were examined based on the "Unified Theory of Acceptance and Use of Technology" (UTAUT), extended by work ability, mental health, eHealth literacy and demographic characteristics. 89.1% (n = 1579) showed low to moderate acceptance (M = 2.20, SD = 1.05, range 1-5). A path analysis revealed significant, positive direct effects of UTAUT predictors on acceptance (performance expectancy: 0.48, SE = 0.02, p < 0.001; effort expectancy: 0.20, SE = 0.02, p < 0.001; social influence: 0.28, SE = 0.02, p < 0.001). Online time and frequency of online health information search were further positive direct predictors of acceptance. Model fit was good [χ2(7) = 12.91, p = 0.07, RMSEA = 0.02, CFI = 1.00, TLI = 0.99, SRMR = 0.01]. Attitudes towards OEMH are rather disadvantageous in the studied risk group. Implementation of OEMH therefore requires a priori education, including promotion of awareness and of favorable attitudes regarding efficacy and usability, in a collaborative approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pin, F.G.; Bender, S.R.
Most fuzzy logic-based reasoning schemes developed for robot control are fully reactive, i.e., the reasoning modules consist of fuzzy rule bases that represent direct mappings from the stimuli provided by the perception systems to the responses implemented by the motion controllers. Due to their totally reactive nature, such reasoning systems can encounter problems such as infinite loops and limit cycles. In this paper, we propose an approach to remedy these problems by adding a memory and memory-related behaviors to basic reactive systems. Three major types of memory behaviors are addressed: memory creation, memory management, and memory utilization. These are first presented, and examples of their implementation for the recognition of limit cycles during the navigation of an autonomous robot in a priori unknown environments are then discussed.
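One way to picture the three memory behaviors is a small sketch (not the paper's fuzzy implementation): discretize poses into cells, remember visited cells (creation/management), and flag a revisit with a similar heading as a suspected limit cycle (utilization). Cell size and heading binning are arbitrary choices here:

```python
import math

class CycleMemory:
    """Minimal memory behavior for a reactive navigator (illustrative sketch).

    Poses are discretized into grid cells; revisiting a cell with a similar
    heading is taken as evidence of a limit cycle, something a purely
    reactive rule base cannot detect on its own.
    """
    def __init__(self, cell=0.5, heading_bins=8):
        self.cell, self.heading_bins = cell, heading_bins
        self.visited = set()                 # memory creation
    def _key(self, x, y, theta):
        b = int((theta % (2 * math.pi)) / (2 * math.pi) * self.heading_bins)
        return (round(x / self.cell), round(y / self.cell), b)
    def update(self, x, y, theta):
        """Return True when a limit cycle is suspected (memory utilization)."""
        k = self._key(x, y, theta)
        if k in self.visited:
            return True
        self.visited.add(k)                  # memory management: grow the map
        return False

# a robot circling the same obstacle trips the detector on the second lap
mem = CycleMemory()
hits = [mem.update(math.cos(t), math.sin(t), t + math.pi / 2)
        for t in [i * math.pi / 4 for i in range(16)]]
```

The first lap produces only new cells; the second lap revisits them with the same heading and raises the flag.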
Liu, Jie; Ying, Dongwen; Zhou, Ping
2014-01-01
Voluntary surface electromyogram (EMG) signals from neurological injury patients are often corrupted by involuntary background interference or spikes, imposing difficulties for myoelectric control. We present a novel framework to suppress involuntary background spikes during voluntary surface EMG recordings. The framework applies a Wiener filter to restore voluntary surface EMG signals based on tracking the a priori signal-to-noise ratio (SNR) using the decision-directed method. Semi-synthetic surface EMG signals contaminated by different levels of involuntary background spikes were constructed from a database of surface EMG recordings in a group of spinal cord injury subjects. After the processing, the onset detection of voluntary muscle activity was significantly improved against involuntary background spikes. The magnitude of voluntary surface EMG signals can also be reliably estimated for myoelectric control purposes. Compared with the previous sample entropy analysis for suppressing involuntary background spikes, the proposed framework is characterized by quick and simple implementation, making it more suitable for application in a myoelectric control system toward neurological injury rehabilitation. PMID:25443536
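The decision-directed a priori SNR tracker inside a Wiener filter can be sketched as follows; the frame powers, noise estimate, and smoothing constant are illustrative stand-ins, not the paper's EMG data:

```python
import numpy as np

def dd_wiener(frames_power, noise_power, alpha=0.98):
    """Decision-directed Wiener gains, one per frame/bin (illustrative sketch).

    frames_power: noisy signal power |X|^2, shape (T, F);
    noise_power:  estimated background-spike/noise power per bin, shape (F,).
    The a priori SNR xi is tracked recursively from the previous clean
    estimate (the decision-directed rule), giving the Wiener gain xi/(1+xi).
    """
    T, F = frames_power.shape
    gains = np.empty((T, F))
    xi_prev = np.ones(F)                       # initial a priori SNR guess
    for t in range(T):
        gamma = frames_power[t] / noise_power  # a posteriori SNR
        xi = alpha * xi_prev + (1 - alpha) * np.maximum(gamma - 1.0, 0.0)
        gains[t] = xi / (1.0 + xi)             # Wiener gain
        xi_prev = gains[t] ** 2 * gamma        # decision-directed update
    return gains

# two strong-signal frames followed by noise-only frames (unit noise power)
g = dd_wiener(np.array([[100.0], [100.0], [1.0], [1.0], [1.0]]), np.array([1.0]))
```

Strong frames keep a gain near 1 while sustained noise-only frames are progressively attenuated, which is the behaviour exploited to suppress involuntary spikes.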
Utilization of electrical impedance imaging for estimation of in-vivo tissue resistivities
NASA Astrophysics Data System (ADS)
Eyuboglu, B. Murat; Pilkington, Theo C.
1993-08-01
In order to determine the in vivo resistivity of tissues in the thorax, the possibility of combining electrical impedance imaging (EII) techniques with (1) anatomical data extracted from high resolution images, (2) a priori knowledge of tissue resistivities, and (3) a priori noise information was assessed in this study. A Least Square Error Estimator (LSEE) and a statistically constrained Minimum Mean Square Error Estimator (MiMSEE) were implemented to estimate regional electrical resistivities from potential measurements made on the body surface. A two-dimensional boundary element model of the human thorax, which consists of four different conductivity regions (the skeletal muscle, the heart, the right lung, and the left lung), was adopted to simulate the measured EII torso potentials. The calculated potentials were then perturbed by simulated instrumentation noise. The signal information used to form the statistical constraint for the MiMSEE was obtained from a priori knowledge of the physiological range of tissue resistivities. The noise constraint was determined from a priori knowledge of errors due to linearization of the forward problem and to the instrumentation noise.
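The contrast between the two estimators can be sketched on a toy linear model y = Ax + n; the sensitivity matrix, covariances, and "resistivity" values below are hypothetical stand-ins, not the study's boundary element model:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(40, 4))                # stand-in sensitivity matrix
x_true = np.array([2.0, 1.0, 5.0, 5.0])     # "regional resistivities" (invented)
Sn = 0.5 * np.eye(40)                       # instrumentation-noise covariance
Sx = 4.0 * np.eye(4)                        # physiological-range prior covariance
x0 = np.array([2.5, 1.5, 4.0, 4.0])         # prior mean from published ranges

y = A @ x_true + rng.multivariate_normal(np.zeros(40), Sn)

# least-square-error estimator: no statistical constraint
x_lse = np.linalg.lstsq(A, y, rcond=None)[0]

# statistically constrained minimum-mean-square-error estimator:
# prior covariance Sx and noise covariance Sn enter explicitly
P = np.linalg.inv(A.T @ np.linalg.inv(Sn) @ A + np.linalg.inv(Sx))
x_mmse = x0 + P @ A.T @ np.linalg.inv(Sn) @ (y - A @ x0)
```

The MiMSEE-style estimate blends the data with the prior mean in proportion to their covariances; with abundant, well-conditioned data both estimators agree, and the statistical constraint matters most when measurements are noisy or few.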
The factors affecting Nigeria's success toward implementation of global public health priorities.
Echebiri, Vitalis C
2015-06-01
This paper examines the challenges facing the Nigerian government toward the implementation of global public health priorities. The Nigerian government recognizes the need to implement these priorities by putting in place the necessary policy framework, but political instability, poor infrastructural development and inadequate funding have remained barriers toward the achievement of success in implementing these priorities. The rest of the paper elucidates the fact that despite leadership and influence from the World Health Organization and other United Nations agencies, and some responses from the Nigerian government, tackling these public health problems requires much more fundamental reform to primary health services and a reduction in poverty. Although the government has shown enough political will to tackle these problems, it is expected that a better result will be achieved through injecting more funds into the Nigerian health sector, and deploying astute health administrators to manage the sector rather than pure health professionals without managerial acumen. © The Author(s) 2014.
Optimal policy for value-based decision-making.
Tajima, Satohiro; Drugowitsch, Jan; Pouget, Alexandre
2016-08-18
For decades now, normative theories of perceptual decisions, and their implementation as drift diffusion models, have driven and significantly improved our understanding of human and animal behaviour and the underlying neural processes. While similar processes seem to govern value-based decisions, we still lack the theoretical understanding of why this ought to be the case. Here, we show that, similar to perceptual decisions, drift diffusion models implement the optimal strategy for value-based decisions. Such optimal decisions require the models' decision boundaries to collapse over time, and to depend on the a priori knowledge about reward contingencies. Diffusion models only implement the optimal strategy under specific task assumptions, and cease to be optimal once we start relaxing these assumptions, by, for example, using non-linear utility functions. Our findings thus provide the much-needed theory for value-based decisions, explain the apparent similarity to perceptual decisions, and predict conditions under which this similarity should break down.
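A hedged numerical sketch of a diffusion race to collapsing bounds; the drift, noise, boundary shape, and time step are illustrative, not fitted to behavioural data:

```python
import numpy as np

# Drift-diffusion trials racing to time-varying bounds +/- b(t), where
# b(t) = b0 - c*t shrinks over time -- the qualitative shape of the optimal
# collapsing boundary discussed above (all parameters invented).
rng = np.random.default_rng(1)
n, dt, t_max = 2000, 1e-3, 3.0
drift, sigma, b0, c = 0.8, 1.0, 1.5, 0.5

steps = int(t_max / dt)
x = np.zeros(n)                       # accumulated evidence per trial
rt = np.full(n, t_max)                # reaction times
choice = np.zeros(n)                  # 1 = choice matching the drift sign
done = np.zeros(n, dtype=bool)
for k in range(1, steps + 1):
    t = k * dt
    active = ~done
    x[active] += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal(active.sum())
    b = max(b0 - c * t, 0.0)          # collapsing boundary
    hit = active & (np.abs(x) >= b)
    choice[hit] = (x[hit] > 0).astype(float)
    rt[hit] = t
    done |= hit
    if done.all():
        break

accuracy = choice.mean()
mean_rt = rt.mean()
```

Because the bound reaches zero at t = b0/c, every trial terminates; early decisions are accurate while late, low-bound decisions approach chance, which is exactly the accuracy/time trade-off the collapsing policy encodes.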
In-Flight System Identification
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.
1998-01-01
A method is proposed and studied whereby the system identification cycle consisting of experiment design and data analysis can be repeatedly implemented aboard a test aircraft in real time. This adaptive in-flight system identification scheme has many advantages, including increased flight test efficiency, adaptability to dynamic characteristics that are imperfectly known a priori, in-flight improvement of data quality through iterative input design, and immediate feedback of the quality of flight test results. The technique uses equation error in the frequency domain with a recursive Fourier transform for the real time data analysis, and simple design methods employing square wave input forms to design the test inputs in flight. Simulation examples are used to demonstrate that the technique produces increasingly accurate model parameter estimates resulting from sequentially designed and implemented flight test maneuvers. The method has reasonable computational requirements, and could be implemented aboard an aircraft in real time.
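The recursive Fourier transform at the heart of the real-time analysis can be sketched as a running update of X(f) at a handful of analysis frequencies: each new sample costs O(number of frequencies), so the transform is always current in flight. The sample rate and test signal below are arbitrary:

```python
import numpy as np

def recursive_dft(x, dt, freqs):
    """Recursively accumulated finite Fourier transform at selected frequencies.

    Mirrors the real-time frequency-domain equation-error idea: after each
    sample x[n], X(f) += x[n] * exp(-2j*pi*f*n*dt) * dt, so no batch FFT is
    ever needed.  Returns the history of X after every sample.
    """
    X = np.zeros(len(freqs), complex)
    history = []
    for n, xn in enumerate(x):
        X = X + xn * np.exp(-2j * np.pi * freqs * n * dt) * dt
        history.append(X.copy())
    return history

dt = 0.02
t = np.arange(0.0, 10.0, dt)
sig = np.sin(2 * np.pi * 1.0 * t)                # 1 Hz test "maneuver" signal
hist = recursive_dft(sig, dt, np.array([0.5, 1.0, 1.5]))
final = hist[-1]
```

After 10 s the 1 Hz component has accumulated magnitude T/2 = 5 while off-resonance frequencies stay near zero, so intermediate values of `hist` can drive input redesign while the maneuver is still running.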
Inverse Modeling of Texas NOx Emissions Using Space-Based and Ground-Based NO2 Observations
NASA Technical Reports Server (NTRS)
Tang, Wei; Cohan, D.; Lamsal, L. N.; Xiao, X.; Zhou, W.
2013-01-01
Inverse modeling of nitrogen oxide (NOx) emissions using satellite-based NO2 observations has become more prevalent in recent years, but has rarely been applied to regulatory modeling at regional scales. In this study, OMI satellite observations of NO2 column densities are used to conduct inverse modeling of NOx emission inventories for two Texas State Implementation Plan (SIP) modeling episodes. Addition of lightning, aircraft, and soil NOx emissions to the regulatory inventory narrowed but did not close the gap between modeled and satellite-observed NO2 over rural regions. Satellite-based top-down emission inventories are created with the regional Comprehensive Air Quality Model with extensions (CAMx) using two techniques: the direct scaling method and discrete Kalman filter (DKF) with Decoupled Direct Method (DDM) sensitivity analysis. The simulations with satellite-inverted inventories are compared to the modeling results using the a priori inventory as well as an inventory created by a ground-level NO2-based DKF inversion. The DKF inversions yield conflicting results: the satellite-based inversion scales up the a priori NOx emissions in most regions by factors of 1.02 to 1.84, leading to a 3-55% increase in modeled NO2 column densities and a 1-7 ppb increase in ground 8 h ozone concentrations, while the ground-based inversion indicates the a priori NOx emissions should be scaled by factors of 0.34 to 0.57 in each region. However, none of the inversions improve the model performance in simulating aircraft-observed NO2 or ground-level ozone (O3) concentrations.
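The DKF update for regional emission scaling factors can be sketched for a toy two-region case; the sensitivities, covariances, and "observed" columns are invented for illustration and are not the CAMx/DDM quantities:

```python
import numpy as np

# Toy two-region DKF inversion (all numbers invented for illustration).
S = np.array([[8.0, 1.5],
              [1.0, 6.0]])           # DDM-style sensitivities d(column_i)/d(scale_j)
x_a = np.array([1.0, 1.0])           # a priori emission scaling factors
P = 0.25 * np.eye(2)                 # a priori error covariance
R = 0.5 * np.eye(2)                  # observation error covariance
c_a = S @ x_a                        # modeled NO2 columns at a priori emissions
y = np.array([12.0, 5.0])            # "observed" columns (first high, second low)

K = P @ S.T @ np.linalg.inv(S @ P @ S.T + R)   # Kalman gain
x = x_a + K @ (y - c_a)              # updated scaling factors
P_post = (np.eye(2) - K @ S) @ P     # posterior error covariance
```

The region whose modeled column is too low gets scaled up and the other down, mirroring how the satellite-based and ground-based inversions in the abstract can pull the a priori inventory in opposite directions when fed different observations.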
Undersampling strategies for compressed sensing accelerated MR spectroscopic imaging
NASA Astrophysics Data System (ADS)
Vidya Shankar, Rohini; Hu, Houchun Harry; Bikkamane Jayadev, Nutandev; Chang, John C.; Kodibagkar, Vikram D.
2017-03-01
Compressed sensing (CS) can accelerate magnetic resonance spectroscopic imaging (MRSI), facilitating its widespread clinical integration. The objective of this study was to assess the effect of different undersampling strategies on CS-MRSI reconstruction quality. Phantom data were acquired on a Philips 3 T Ingenia scanner. Four types of undersampling masks, one per strategy, namely low resolution, variable density, iterative design, and a priori, were simulated in Matlab and retrospectively applied to the test 1X MRSI data to generate undersampled datasets corresponding to 2X-5X and 7X accelerations for each type of mask. Reconstruction parameters were kept the same in each case (all masks and accelerations) to ensure that any resulting differences can be attributed to the type of mask being employed. The reconstructed datasets from each mask were statistically compared with the reference 1X, and assessed using metrics such as the root mean square error and metabolite ratios. Simulation results indicate that both the a priori and variable density undersampling masks maintain high fidelity with the 1X up to five-fold acceleration. The low resolution mask-based reconstructions showed statistically significant differences from the 1X, with the reconstruction failing at 3X, while the iterative design reconstructions maintained fidelity with the 1X up to 4X acceleration. In summary, a pilot study was conducted to identify an optimal sampling mask in CS-MRSI. Simulation results demonstrate that the a priori and variable density masks can provide statistically similar results to the fully sampled reference. Future work would involve implementing these two masks prospectively on a clinical scanner.
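A variable-density mask of the general kind compared here can be sketched as radius-dependent random sampling; the density law and its exponent are assumptions for illustration, not the study's masks:

```python
import numpy as np

def variable_density_mask(shape, accel, power=3.0, seed=0):
    """Generate a variable-density Cartesian undersampling mask (sketch).

    Sampling probability decays with k-space radius as (1 - r)**power, so the
    densely sampled centre retains low spatial frequencies; `accel` sets the
    target undersampling factor (e.g. 4 for 4X).  The density law is an
    illustrative choice, not the masks used in the study.
    """
    rng = np.random.default_rng(seed)
    ky, kx = np.indices(shape)
    r = np.hypot(ky - shape[0] / 2, kx - shape[1] / 2)
    r = r / r.max()
    pdf = (1.0 - r) ** power
    pdf *= (shape[0] * shape[1] / accel) / pdf.sum()   # normalize to target count
    return rng.random(shape) < np.clip(pdf, 0.0, 1.0)

mask = variable_density_mask((32, 32), accel=4)
rate = mask.mean()
```

The clipped centre is sampled with probability one, which is what preserves the bulk spectral content even at high nominal acceleration.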
Lake bed classification using acoustic data
Yin, Karen K.; Li, Xing; Bonde, John; Richards, Carl; Cholwek, Gary
1998-01-01
As part of our effort to identify the lake bed surficial substrates using remote sensing data, this work designs pattern classifiers by multivariate statistical methods. Probability distribution of the preprocessed acoustic signal is analyzed first. A confidence region approach is then adopted to improve the design of the existing classifier. A technique for further isolation is proposed which minimizes the expected loss from misclassification. The devices constructed are applicable for real-time lake bed categorization. A minimax approach is suggested to treat more general cases where the a priori probability distribution of the substrate types is unknown. Comparison of the suggested methods with the traditional likelihood ratio tests is discussed.
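The likelihood-ratio test, and the minimax fallback when the a priori class probabilities are unknown, can be sketched for two Gaussian substrate classes; all means, variances, and costs are illustrative:

```python
import math

def lr_classify(x, mu0, mu1, sigma, prior0=0.5, c01=1.0, c10=1.0):
    """Likelihood-ratio test between two Gaussian substrate classes (sketch).

    Decide class 1 when log p(x|1)/p(x|0) exceeds
    log[(prior0 * c10) / ((1 - prior0) * c01)], i.e. the Bayes rule with
    misclassification costs c01, c10.  When the a priori class probabilities
    are unknown, a minimax design instead uses the prior that equalizes the
    two conditional error rates; for equal-variance Gaussians and symmetric
    costs that is prior0 = 0.5, the midpoint threshold used below.
    """
    llr = ((x - mu0) ** 2 - (x - mu1) ** 2) / (2.0 * sigma ** 2)
    return 1 if llr > math.log((prior0 * c10) / ((1.0 - prior0) * c01)) else 0

# an echo feature near the class-0 mean vs near the class-1 mean (illustrative)
cls_a = lr_classify(0.2, mu0=0.0, mu1=1.0, sigma=0.3)
cls_b = lr_classify(0.8, mu0=0.0, mu1=1.0, sigma=0.3)
```

The minimax choice guards the worst case over unknown priors at the cost of some average-risk optimality, which is the trade-off against the plain likelihood-ratio test discussed above.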
The constitutive a priori and the distinction between mathematical and physical possibility
NASA Astrophysics Data System (ADS)
Everett, Jonathan
2015-11-01
This paper is concerned with Friedman's recent revival of the notion of the relativized a priori. It is particularly concerned with addressing the question as to how Friedman's understanding of the constitutive function of the a priori has changed since his defence of the idea in his Dynamics of Reason. Friedman's understanding of the a priori remains influenced by Reichenbach's initial defence of the idea; I argue that this notion of the a priori does not naturally lend itself to describing the historical development of space-time physics. Friedman's analysis of the role of the rotating frame thought experiment in the development of general relativity - which he suggests made the mathematical possibility of four-dimensional space-time a genuine physical possibility - has a central role in his argument. I analyse this thought experiment and argue that it is better understood by following Cassirer and placing emphasis on regulative principles. Furthermore, I argue that Cassirer's Kantian framework enables us to capture Friedman's key insights into the nature of the constitutive a priori.
Ecosystem context and historical contingency in apex predator recoveries.
Stier, Adrian C; Samhouri, Jameal F; Novak, Mark; Marshall, Kristin N; Ward, Eric J; Holt, Robert D; Levin, Phillip S
2016-05-01
Habitat loss, overexploitation, and numerous other stressors have caused global declines in apex predators. This "trophic downgrading" has generated widespread concern because of the fundamental role that apex predators can play in ecosystem functioning, disease regulation, and biodiversity maintenance. In attempts to combat declines, managers have conducted reintroductions, imposed stricter harvest regulations, and implemented protected areas. We suggest that full recovery of viable apex predator populations is currently the exception rather than the rule. We argue that, in addition to well-known considerations, such as continued exploitation and slow life histories, there are several underappreciated factors that complicate predator recoveries. These factors include three challenges. First, a priori identification of the suite of trophic interactions, such as resource limitation and competition, that will influence recovery can be difficult. Second, defining and accomplishing predator recovery in the context of a dynamic ecosystem requires an appreciation of the timing of recovery, which can determine the relative density of apex predators and other predators and therefore affect competitive outcomes. Third, successful recovery programs require designing adaptive sequences of management strategies that embrace key environmental and species interactions as they emerge. Consideration of recent research on food web modules, alternative stable states, and community assembly offer important insights for predator recovery efforts and restoration ecology more generally. Foremost among these is the importance of a social-ecological perspective in facilitating a long-lasting predator restoration while avoiding unintended consequences.
Austin, Jane E.; Slattery, Stuart; Clark, Robert G.
2014-01-01
There are 30 threatened or endangered species of waterfowl worldwide, and several sub-populations are also threatened. Some of these species occur in North America, where other species are also of conservation concern due to declining population trends and their importance to hunters. Here we review conservation initiatives being undertaken for several of these latter species, along with conservation measures in place in Europe, to seek common themes and approaches that could be useful in developing broad conservation guidelines. While focal species may vary in their life histories, population threats and geopolitical context, most conservation efforts have used a systematic approach to understand factors limiting populations and to identify possible management or policy actions. This approach generally includes a priori identification of plausible hypotheses about population declines or status, incorporation of hypotheses into conceptual or quantitative planning models, and the use of some form of structured decision making and adaptive management to develop and implement conservation actions in the face of many uncertainties. A climate of collaboration among jurisdictions sharing these birds is important to the success of a conservation or management programme. The structured conservation approach exemplified herein provides an opportunity to involve stakeholders at all planning stages, allows for all views to be examined and incorporated into model structures, and yields a format for improved communication, cooperation and learning, which may ultimately be one of the greatest benefits of this strategy.
Kapiriri, Lydia; Tharao, Wangari; Muchenje, Marvelous; Masinde, Khatundi I; Ongoiba, Fanta
2016-12-01
Over 60 countries criminalise the 'willful' transmission of HIV. Such a law has the potential to hinder public health interventions. There is limited literature discussing perceptions of this law and the impact it has had on HIV-positive women. This paper describes the knowledge of and attitudes towards this law among HIV-positive women living in Ontario, and their experiences with its application. Three group discussions (n = 10) and 17 in-depth interviews were held with HIV-positive women aged 21-56 years. Data were analysed using a modified thematic approach. Most of the respondents knew about the law with regard to adult HIV transmission. However, very few knew about any laws related to mother-to-child HIV transmission, although some reported having had their children taken away because of breastfeeding. Respondents felt that the law could be fair and protective if there were means of providing a priori support to those women who have been disadvantaged socio-culturally and structurally. Without this support, the law could potentially lead HIV-positive women into hiding and not accessing services that could help them. There is a need for the law's implementers to consider these findings if they are to support public health efforts to control HIV.
An adaptive finite element method for the inequality-constrained Reynolds equation
NASA Astrophysics Data System (ADS)
Gustafsson, Tom; Rajagopal, Kumbakonam R.; Stenberg, Rolf; Videman, Juha
2018-07-01
We present a stabilized finite element method for the numerical solution of cavitation in lubrication, modeled as an inequality-constrained Reynolds equation. The cavitation model is written as a variable coefficient saddle-point problem and approximated by a residual-based stabilized method. Based on our recent results on the classical obstacle problem, we present optimal a priori estimates and derive novel a posteriori error estimators. The method is implemented as a Nitsche-type finite element technique and shown in numerical computations to be superior to the usually applied penalty methods.
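As a rough illustration of the inequality constraint (a projected-SOR stand-in, not the paper's stabilized finite element method), a 1D cavitated Reynolds equation can be solved with the pointwise condition p ≥ 0; the bearing geometry and scaling are illustrative:

```python
import numpy as np

# 1D cavitated Reynolds equation d/dx(h^3 dp/dx) = dh/dx with p >= 0,
# solved by projected SOR: the projection enforces the cavitation constraint
# at every node (illustrative stand-in for the stabilized FEM of the paper).
n = 100
x = np.linspace(0.0, 2.0 * np.pi, n)
dx = x[1] - x[0]
h = 1.0 + 0.8 * np.cos(x)            # journal-bearing film thickness (assumed)
h3 = h ** 3
p = np.zeros(n)                      # ambient pressure p = 0 at both ends
omega = 1.9                          # over-relaxation factor
for sweep in range(20000):
    delta = 0.0
    for i in range(1, n - 1):
        w = 0.5 * (h3[i - 1] + h3[i])          # west/east face coefficients
        e = 0.5 * (h3[i] + h3[i + 1])
        rhs = (h[i + 1] - h[i - 1]) / (2.0 * dx) * dx ** 2
        p_gs = (w * p[i - 1] + e * p[i + 1] - rhs) / (w + e)
        p_new = max(p[i] + omega * (p_gs - p[i]), 0.0)   # projection: p >= 0
        delta = max(delta, abs(p_new - p[i]))
        p[i] = p_new
    if delta < 1e-9:
        break
```

The converged solution carries a positive pressure hump in the converging film and is identically zero in the cavitated (diverging) region, the complementarity structure that the variable-coefficient saddle-point formulation above captures variationally.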
NASA Astrophysics Data System (ADS)
Ma, Wenjuan; Gao, Feng; Duan, Linjing; Zhu, Qingzhen; Wang, Xin; Zhang, Wei; Wu, Linhui; Yi, Xi; Zhao, Huijuan
2012-03-01
We obtain absorption and scattering reconstructed images by incorporating a priori information on target location obtained from fluorescence diffuse optical tomography (FDOT) into diffuse optical tomography (DOT). The main disadvantage of DOT lies in the low spatial resolution resulting from the highly scattering nature of tissue in the near-infrared (NIR), but it can be used to monitor hemoglobin concentration and oxygen saturation simultaneously, as well as several other chromophores such as water, lipids, and cytochrome-c-oxidase. To date, extensive effort has been made to integrate DOT with other imaging modalities, such as MRI and CT, to obtain accurate optical property maps of the tissue; however, the experimental apparatus is intricate. In this study, a DOT image reconstruction algorithm that incorporates a priori structural information provided by FDOT is investigated in an attempt to optimize recovery of a simulated optical property distribution. Using a specifically designed multi-channel time-correlated single photon counting system, the proposed scheme in a transmission mode is experimentally validated to achieve simultaneous reconstruction of the fluorescent yield, lifetime, absorption coefficient and scattering coefficient. The experimental results demonstrate that the quantitative recovery of the tumor optical properties has doubled and that the spatial resolution improves as well by applying the new method.
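The "soft prior" idea, lowering the regularization penalty inside the FDOT-identified region, can be sketched on a toy linear DOT problem; the Jacobian, noise level, and penalty weights are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
npix, nmeas = 64, 40                  # 8x8 image, 40 boundary measurements
J = rng.normal(size=(nmeas, npix))    # stand-in DOT Jacobian (hypothetical)
x_true = np.zeros(npix)
target = [27, 28, 35, 36]             # inclusion pixels, "known" from FDOT
x_true[target] = 1.0
y = J @ x_true + 0.01 * rng.normal(size=nmeas)

def recon(lmbda):
    """Tikhonov reconstruction with a spatially varying penalty: the FDOT
    prior lowers the regularization weight inside the suspected target."""
    return np.linalg.solve(J.T @ J + np.diag(lmbda), J.T @ y)

lam_uniform = np.full(npix, 100.0)    # no spatial prior
lam_prior = lam_uniform.copy()
lam_prior[target] = 1.0               # trust the data inside the prior region
x_plain = recon(lam_uniform)
x_prior = recon(lam_prior)

err_plain = np.linalg.norm(x_plain - x_true)
err_prior = np.linalg.norm(x_prior - x_true)
```

Relaxing the penalty only where FDOT places the target lets the inclusion contrast develop fully while the background stays suppressed, the mechanism behind the improved quantitative recovery reported above.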
NASA Technical Reports Server (NTRS)
Hoffman, Matthew J.; Eluszkiewicz, Janusz; Weisenstein, Deborah; Uymin, Gennady; Moncet, Jean-Luc
2012-01-01
Motivated by the needs of Mars data assimilation. particularly quantification of measurement errors and generation of averaging kernels. we have evaluated atmospheric temperature retrievals from Mars Global Surveyor (MGS) Thermal Emission Spectrometer (TES) radiances. Multiple sets of retrievals have been considered in this study; (1) retrievals available from the Planetary Data System (PDS), (2) retrievals based on variants of the retrieval algorithm used to generate the PDS retrievals, and (3) retrievals produced using the Mars 1-Dimensional Retrieval (M1R) algorithm based on the Optimal Spectral Sampling (OSS ) forward model. The retrieved temperature profiles are compared to the MGS Radio Science (RS) temperature profiles. For the samples tested, the M1R temperature profiles can be made to agree within 2 K with the RS temperature profiles, but only after tuning the prior and error statistics. Use of a global prior that does not take into account the seasonal dependence leads errors of up 6 K. In polar samples. errors relative to the RS temperature profiles are even larger. In these samples, the PDS temperature profiles also exhibit a poor fit with RS temperatures. This fit is worse than reported in previous studies, indicating that the lack of fit is due to a bias correction to TES radiances implemented after 2004. To explain the differences between the PDS and Ml R temperatures, the algorithms are compared directly, with the OSS forward model inserted into the PDS algorithm. Factors such as the filtering parameter, the use of linear versus nonlinear constrained inversion, and the choice of the forward model, are found to contribute heavily to the differences in the temperature profiles retrieved in the polar regions, resulting in uncertainties of up to 6 K. Even outside the poles, changes in the a priori statistics result in different profile shapes which all fit the radiances within the specified error. 
The importance of the a priori statistics prevents reliable global retrievals based on a single a priori, and strongly implies that a robust science analysis must instead rely on retrievals employing localized a priori information, for example from an ensemble-based data assimilation system such as the Local Ensemble Transform Kalman Filter (LETKF).
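The pull of the a priori on a retrieval of this kind can be illustrated with a minimal linear optimal-estimation sketch. All matrices below are toy values, not TES quantities, and `oe_retrieval` is a name chosen here for illustration:

```python
import numpy as np

def oe_retrieval(y, K, x_a, S_a, S_e):
    """MAP estimate x = x_a + G (y - K x_a), gain G = S_a K^T (K S_a K^T + S_e)^-1."""
    G = S_a @ K.T @ np.linalg.inv(K @ S_a @ K.T + S_e)
    return x_a + G @ (y - K @ x_a), G

# Toy two-level 'temperature' retrieval: a tight prior covariance pins the
# solution to x_a, while a loose one lets the radiances y dominate.
K = np.eye(2)                          # trivial forward-model Jacobian
x_a = np.array([200.0, 210.0])         # a priori profile (K)
y = np.array([205.0, 215.0])           # 'observed' values
x_tight, _ = oe_retrieval(y, K, x_a, 1e-4 * np.eye(2), np.eye(2))
x_loose, G = oe_retrieval(y, K, x_a, 1e4 * np.eye(2), np.eye(2))
A = G @ K   # averaging kernel: close to the identity when data-driven
```

With the tight prior the retrieval barely moves off `x_a`; with the loose prior it follows `y`, and the averaging kernel approaches the identity.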
NASA Astrophysics Data System (ADS)
Panagoulia, D.; Trichakis, I.
2012-04-01
Considering the growing interest in simulating hydrological phenomena with artificial neural networks (ANNs), it is useful to figure out the potential and limits of these models. In this study, the main objective is to examine how to improve the ability of an ANN model to simulate extreme values of flow by utilizing a priori knowledge of threshold values. A three-layer feedforward ANN was trained using the back-propagation algorithm and the logistic function as the activation function. Using the thresholds, the flow was partitioned into low (x < μ), medium (μ ≤ x ≤ μ + 2σ) and high (x > μ + 2σ) values. The ANN model was trained both on the high-flow partition and on the full flow record. The developed methodology was implemented over a mountainous river catchment (the Mesochora catchment in northwestern Greece). The ANN model received as inputs pseudo-precipitation (rain plus melt) and previously observed flow data. After the training was completed, the bootstrapping methodology was applied to calculate the ANN confidence intervals (CIs) for a 95% nominal coverage. The calculated CIs included only the uncertainty that comes from the calibration procedure. The results showed that an ANN model trained specifically for high flows, with a priori knowledge of the thresholds, can simulate these extreme values much better (RMSE is 31.4% less) than an ANN model trained with all data of the available time series and using a posteriori threshold values. On the other hand, the width of the CIs increases by 54.9%, with a simultaneous increase of 64.4% in the actual coverage for the high flows (a priori partition). The narrower CIs of the high flows trained with all data may be attributed to the smoothing effect produced by the use of the full data sets.
Overall, the results suggest that an ANN model trained with a priori knowledge of the threshold values has an increased ability in simulating extreme values compared with an ANN model trained with all the data and a posteriori knowledge of the thresholds.
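The a priori threshold partition described above can be sketched as follows; μ and σ are taken from the training record, and the synthetic series and function name are illustrative:

```python
import numpy as np

def partition_flows(q, mu, sigma):
    """Split a flow series into the paper's classes:
    low (x < mu), medium (mu <= x <= mu + 2*sigma), high (x > mu + 2*sigma)."""
    hi = q > mu + 2.0 * sigma
    lo = q < mu
    med = ~(hi | lo)
    return lo, med, hi

# Synthetic record: nine dry-season values and one flood peak.
q = np.array([0.0] * 9 + [100.0])
lo, med, hi = partition_flows(q, q.mean(), q.std())
# A model trained only on q[hi] targets the extreme-flow regime.
```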
Programming new geometry restraints: Parallelity of atomic groups
Sobolev, Oleg V.; Afonine, Pavel V.; Adams, Paul D.; ...
2015-08-01
Improvements in structural biology methods, in particular crystallography and cryo-electron microscopy, have created an increased demand for the refinement of atomic models against low-resolution experimental data. One way to compensate for the lack of high-resolution experimental data is to use a priori information about model geometry that can be utilized in refinement in the form of stereochemical restraints or constraints. Here, the definition and calculation of the restraints that can be imposed on planar atomic groups, in particular the angle between such groups, are described. Detailed derivations of the restraint targets and their gradients are provided so that they can be readily implemented in other contexts. Practical implementations of the restraints, and of associated data structures, in the Computational Crystallography Toolbox (cctbx) are presented.
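As a sketch of the underlying geometry, the angle between two planar groups can be computed from best-fit plane normals obtained by SVD. The function names and coordinates below are illustrative, not the cctbx implementation:

```python
import numpy as np

def plane_normal(coords):
    """Unit normal of the best-fit plane through a set of atom coordinates:
    the right singular vector with the smallest singular value."""
    centered = coords - coords.mean(axis=0)
    return np.linalg.svd(centered)[2][-1]

def parallelity_angle(group_a, group_b):
    """Angle (degrees) between the best-fit planes of two atomic groups;
    a parallelity restraint penalizes its deviation from a target value."""
    c = abs(float(np.dot(plane_normal(group_a), plane_normal(group_b))))
    return np.degrees(np.arccos(np.clip(c, 0.0, 1.0)))   # planes are unsigned

ring_xy = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [1., 1., 0.]])
ring_xz = np.array([[0., 0., 0.], [1., 0., 0.], [0., 0., 1.], [1., 0., 1.]])
angle = parallelity_angle(ring_xy, ring_xz)   # perpendicular planes
```

A restraint target would then be, e.g., a harmonic penalty on `angle` minus its ideal value; the paper derives the analytic gradients needed for refinement.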
Some Simultaneous Inference Procedures for A Priori Contrasts.
ERIC Educational Resources Information Center
Convey, John J.
The testing of a priori contrasts, post hoc contrasts, and experimental error rates are discussed. Methods for controlling the experimental error rate for a set of a priori contrasts tested simultaneously have been developed by Dunnett, Dunn, Sidak, and Krishnaiah. Each of these methods is discussed and contrasted as to applicability, power, and…
Adaptability and phenotypic stability of common bean genotypes through Bayesian inference.
Corrêa, A M; Teodoro, P E; Gonçalves, M C; Barroso, L M A; Nascimento, M; Santos, A; Torres, F E
2016-04-27
This study used Bayesian inference to investigate the genotype x environment interaction in common bean grown in Mato Grosso do Sul State, and it also evaluated the efficiency of using informative and minimally informative a priori distributions. Six trials were conducted in randomized blocks, and the grain yield of 13 common bean genotypes was assessed. To represent the minimally informative a priori distributions, a probability distribution with high variance was used, and a meta-analysis concept was adopted to represent the informative a priori distributions. Bayes factors were used to conduct comparisons between the a priori distributions. The Bayesian inference was effective for the selection of upright common bean genotypes with high adaptability and phenotypic stability using the Eberhart and Russell method. Bayes factors indicated that the use of informative a priori distributions provided more accurate results than minimally informative a priori distributions. According to Bayesian inference, the EMGOPA-201, BAMBUÍ, CNF 4999, CNF 4129 A 54, and CNFv 8025 genotypes had specific adaptability to favorable environments, while the IAPAR 14 and IAC CARIOCA ETE genotypes had specific adaptability to unfavorable environments.
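The comparison of informative versus minimally informative priors via Bayes factors can be sketched for a simple conjugate normal model; the numbers below are illustrative, not the bean-yield data:

```python
import math

def log_marginal_normal(y_bar, n, sigma2, mu0, tau2):
    """Log marginal likelihood of a sample mean under a Normal(mu0, tau2)
    prior with known data variance sigma2: y_bar ~ N(mu0, tau2 + sigma2/n)."""
    var = tau2 + sigma2 / n
    return -0.5 * math.log(2.0 * math.pi * var) - 0.5 * (y_bar - mu0) ** 2 / var

# An informative prior centered near the observed mean vs. a high-variance
# (minimally informative) prior; the Bayes factor rewards the sharper prior.
lm_informative = log_marginal_normal(y_bar=2.0, n=30, sigma2=1.0, mu0=2.1, tau2=0.1)
lm_vague = log_marginal_normal(y_bar=2.0, n=30, sigma2=1.0, mu0=0.0, tau2=100.0)
log_bayes_factor = lm_informative - lm_vague   # > 0 favors the informative prior
```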
Matthews, Karen A; Adler, Nancy E; Forrest, Christopher B; Stead, William W
2016-09-01
Social, psychological, and behavioral factors are recognized as key contributors to health, but they are rarely measured in a systematic way in health care settings. Electronic health records (EHRs) can be used in these settings to routinely collect a standardized set of social, psychological, and behavioral determinants of health. The expanded use of EHRs provides opportunities to improve individual and population health, and offers new ways for the psychological community to engage in health promotion and disease prevention efforts. This article addresses 3 issues. First, it discusses what led to current efforts to include measures of psychosocial and behavioral determinants of health in EHRs. Second, it presents recommendations of an Institute of Medicine committee regarding inclusion in EHRS of a panel of measures that meet a priori criteria. Third, it identifies new opportunities and challenges these recommendations present for psychologists in practice and research. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
On the numerical treatment of selected oscillatory evolutionary problems
NASA Astrophysics Data System (ADS)
Cardone, Angelamaria; Conte, Dajana; D'Ambrosio, Raffaele; Paternoster, Beatrice
2017-07-01
We focus on evolutionary problems whose qualitative behaviour is known a priori and can be exploited to provide efficient and accurate numerical schemes. For classical numerical methods with constant coefficients, the required computational effort can be quite heavy, owing to the very small stepsizes needed to accurately reproduce the qualitative behaviour of the solution. In these situations, it may be convenient to use special-purpose formulae, i.e., non-polynomially fitted formulae built on basis functions adapted to the problem (see [16, 17] and references therein). We show examples of special-purpose strategies for solving two families of evolutionary problems exhibiting periodic solutions, i.e., partial differential equations and Volterra integral equations.
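As a hedged illustration of such special-purpose formulae, compare a classical two-step scheme for y'' = -ω²y with a trigonometrically fitted variant whose coefficient 2 cos(ωh) makes it exact on the basis {cos ωt, sin ωt}. This is a standard textbook construction, not the specific methods of [16, 17]:

```python
import math

def stormer_classical(omega, h, steps):
    """Classical two-step scheme for y'' = -omega^2 y, y(0)=1, y'(0)=0."""
    y_prev, y = 1.0, math.cos(omega * h)   # seed the second point exactly
    for _ in range(steps - 1):
        y_prev, y = y, 2.0 * y - y_prev - (h * omega) ** 2 * y
    return y

def stormer_fitted(omega, h, steps):
    """Trigonometrically fitted variant: exact on {cos(omega t), sin(omega t)}."""
    y_prev, y = 1.0, math.cos(omega * h)
    for _ in range(steps - 1):
        y_prev, y = y, 2.0 * math.cos(omega * h) * y - y_prev
    return y

# Oscillatory regime omega*h = 1: the classical scheme drifts badly, the
# fitted one reproduces cos(omega*t) to roundoff.
exact = math.cos(10.0 * 10.0)
err_classical = abs(stormer_classical(10.0, 0.1, 100) - exact)
err_fitted = abs(stormer_fitted(10.0, 0.1, 100) - exact)
```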
Discoveries far from the lamppost with matrix elements and ranking
DOE Office of Scientific and Technical Information (OSTI.GOV)
Debnath, Dipsikha; Gainer, James S.; Matchev, Konstantin T.
2015-04-01
The prevalence of null results in searches for new physics at the LHC motivates the effort to make these searches as model-independent as possible. We describe procedures for adapting the Matrix Element Method for situations where the signal hypothesis is not known a priori. We also present general and intuitive approaches for performing analyses and presenting results, which involve the flattening of background distributions using likelihood information. The first flattening method involves ranking events by background matrix element, the second involves quantile binning with respect to likelihood (and other) variables, and the third method involves reweighting histograms by the inverse of the background distribution.
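The second flattening method, quantile binning, can be sketched as follows; the exponential sample stands in for a background likelihood variable and the names are illustrative:

```python
import numpy as np

def quantile_bins(background, n_bins):
    """Bin edges at background quantiles, so background events populate
    every bin equally -- a 'flat' background distribution."""
    edges = np.quantile(background, np.linspace(0.0, 1.0, n_bins + 1))
    edges[0], edges[-1] = -np.inf, np.inf   # catch both tails
    return edges

rng = np.random.default_rng(0)
bkg = rng.exponential(1.0, size=100_000)    # stand-in likelihood variable
edges = quantile_bins(bkg, 10)
counts, _ = np.histogram(bkg, edges)        # ~10% of background per bin
```

Signal events histogrammed in the same `edges` then show up as an excess over a flat background expectation.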
ERIC Educational Resources Information Center
Sollervall, Håkan; Stadler, Erika
2015-01-01
The aim of the presented case study is to investigate how coherent analytical instruments may guide the a priori and a posteriori analyses of a didactical situation. In the a priori analysis we draw on the notion of affordances, as artefact-mediated opportunities for action, to construct hypothetical trajectories of goal-oriented actions that have…
NASA Astrophysics Data System (ADS)
Ren, Xia; Yang, Yuanxi; Zhu, Jun; Xu, Tianhe
2017-11-01
Intersatellite Link (ISL) technology helps to realize the autonomous updating of broadcast ephemeris and clock error parameters for Global Navigation Satellite Systems (GNSS). ISL constitutes an important approach with which to both improve the observation geometry and extend the tracking coverage of China's Beidou Navigation Satellite System (BDS). However, ISL-only orbit determination can lead to constellation drift and rotation, and even to divergence of the orbit determination. Fortunately, predicted orbits with good precision can be used as a priori information with which to constrain the estimated satellite orbit parameters. Therefore, the precision of satellite autonomous orbit determination can be improved by consideration of a priori orbit information, and vice versa. However, the errors of rotation and translation in the a priori orbit will remain in the ultimate result. This paper proposes a constrained precise orbit determination (POD) method for a sub-constellation of the new Beidou satellite constellation with only a few ISLs. The observation model of dual one-way measurements eliminating satellite clock errors is presented, and the orbit determination precision is analyzed with different data processing backgrounds. The conclusions are as follows. (1) With ISLs, the estimated parameters are strongly correlated, especially the positions and velocities of satellites. (2) The performance of determined BDS orbits is improved by constraints with more precise a priori orbits. The POD precision is better than 45 m with an a priori orbit constraint of 100 m precision (e.g., orbits predicted by the telemetry, tracking, and control system), and better than 6 m with a precise a priori orbit constraint of 10 m precision (e.g., orbits predicted by the international GNSS Monitoring & Assessment System (iGMAS)). (3) The POD precision is improved by additional ISLs.
Constrained by a priori iGMAS orbits, the POD precision with two, three, and four ISLs is better than 6, 3, and 2 m, respectively. (4) The in-plane link and out-of-plane link have different contributions to observation configuration and system observability. The POD with weak observation configuration (e.g., one in-plane link and one out-of-plane link) should be tightly constrained with a priori orbits.
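A one-dimensional sketch of why the a priori constraint matters: an inter-satellite range observes only the relative geometry, so the normal matrix is singular until prior pseudo-observations are added. All values here are illustrative, not BDS data:

```python
import numpy as np

# 1-D analogue of ISL-only orbit determination: a dual one-way range observes
# only x2 - x1, so the common translation is unobservable; a priori
# pseudo-observations of each position constrain it.
H = np.array([[-1.0, 1.0]])            # range partials d(x2 - x1)/dx
y = np.array([5.0])                    # measured inter-satellite separation
x_a = np.array([100.0, 104.0])         # a priori positions (predicted orbit)
W = np.eye(1)                          # range observation weight
Pa_inv = (1.0 / 4.0 ** 2) * np.eye(2)  # prior constraint, sigma = 4 m

N_free = H.T @ W @ H
assert np.linalg.matrix_rank(N_free) == 1   # singular without the prior

# Normal equations of min ||Hx - y||_W^2 + ||x - x_a||_{Pa^-1}^2
x_hat = np.linalg.solve(N_free + Pa_inv, H.T @ W @ y + Pa_inv @ x_a)
```

The solution splits the range residual between the two satellites while the prior pins the (otherwise free) common translation at its a priori value.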
ERIC Educational Resources Information Center
Hall, Charles Dana
2013-01-01
Research investigating the complex, multi-directional relationships inherent to public education has become a focal point of reform research. This study investigated the perceptions held by district-level leaders regarding the Colorado Department of Education's efforts to facilitate the successful implementation of reading policy. In addition, it…
Li, Hong; Liu, Mingyong; Liu, Kun; Zhang, Feihu
2017-12-25
By simulating the geomagnetic field and analyzing the variation of intensities, this paper presents a model for calculating the objective function of an Autonomous Underwater Vehicle (AUV) geomagnetic navigation task. By investigating biologically inspired strategies, the AUV successfully reaches the destination during geomagnetic navigation without using an a priori geomagnetic map. Similar to the pattern of a flatworm, the proposed algorithm relies on a motion pattern that triggers a local searching strategy by detecting the real-time geomagnetic intensity. An adapted strategy, biased toward the specific target, is then implemented. The results show the reliability and effectiveness of the proposed algorithm.
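The flatworm-like behaviour can be sketched as a run-and-tumble search on the sensed intensity mismatch. The field model, step size, and function names below are assumptions for illustration, not the paper's model:

```python
import math
import random

def run_and_tumble(f, start, target_val, step=0.1, max_iter=20000, seed=1):
    """Flatworm-like search sketch: keep the current heading while the sensed
    objective improves, tumble to a random heading when it worsens."""
    rng = random.Random(seed)
    x, y = start
    heading = 0.0
    best = f(x, y)
    for _ in range(max_iter):
        nx = x + step * math.cos(heading)
        ny = y + step * math.sin(heading)
        val = f(nx, ny)
        if val < best:                      # improving: keep swimming
            x, y, best = nx, ny, val
        else:                               # worsening: random tumble
            heading = rng.uniform(0.0, 2.0 * math.pi)
        if best < target_val:
            break
    return (x, y), best

# Objective: mismatch between sensed and destination geomagnetic intensity,
# mimicked here by the distance to a destination at the origin.
f = lambda xx, yy: math.hypot(xx, yy)
pos, best = run_and_tumble(f, (5.0, 5.0), target_val=0.2)
```

No map is needed: only the instantaneous scalar reading steers the vehicle, which is the essence of the strategy described above.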
Trust and Extra Effort Implementing Curriculum Reform: The Mediating Effects of Collaboration
ERIC Educational Resources Information Center
Cerit, Yusuf
2013-01-01
This study aims to examine the relationship between trust and extra effort implementing reform, and relationship between trust and extra effort are mediated by collaboration. The study was carried out in elementary schools in Turkey. Faculty trust in schools was measured using the Omnibus T-Scale, collaboration was measured using collaboration…
Manafò, Elizabeth; Petermann, Lisa; Lobb, Rebecca; Keen, Deb; Kerner, Jon
2011-01-01
To describe the development stages of the Coalitions Linking Action and Science for Prevention (CLASP) initiative of the Canadian Partnership Against Cancer to support research, practice, and policy coalitions focused on cancer and chronic disease prevention in Canada. Coalitions Linking Action and Science for Prevention was implemented in 3 stages. This article describes Stage 1, which consisted of an online concept-mapping consultation process, 3 topic-specific networking and consultation workshops, and 3 context-specific networking, coalition development, and planning meetings. These were all completed using a participatory engagement approach to encourage knowledge exchange across jurisdictions and sectors in Canada. Toronto, Ontario; Calgary, Alberta; Montreal, Québec; and Ottawa, Ontario. More than 500 researchers, practitioners, and policy specialists were invited to take part in the first stage activities. (1) Participant-identified high-priority opportunities for strategic collaboration; (2) Cross-jurisdictional and cross-sector representation; and (3) Participant feedback on the CLASP processes and activities. Participants from Stage 1 activities were distributed across all provinces/territories; 3 jurisdictional levels; and research, practice, and policy sectors. Ninety priority opportunities for strategic collaboration were identified across all 3 workshops. Participants provided detailed feedback about transparency of the RFP (Request for Proposals) application process, support needed to level the playing field for potential applicants, and valuable suggestions for the adjudication process. Coalitions Linking Action and Science for Prevention engaged hundreds of research, practice, and policy experts across Canada, focusing on social-behavioral, clinical, and environmental and occupational opportunities for cancer and chronic disease prevention.
Given the extent of expert and jurisdictional engagement, the substantial Partnership investment in a participatory engagement approach to RFP development, and the response from potential applicants, efforts to link cancer and chronic disease prevention across jurisdictions and through research, practice, and policy collaboration may require this type of a priori investment in networking, communication, coordination, and collaboration.
Freydefont, Laure; Gollwitzer, Peter M; Oettingen, Gabriele
2016-09-01
Two experiments investigate the influence of goal and implementation intentions on effort mobilization during task performance. Although numerous studies have demonstrated the beneficial effects of setting goals and making plans on performance, the effects of goals and plans on effort-related cardiac activity and especially the cardiac preejection period (PEP) during goal striving have not yet been addressed. According to the Motivational Intensity Theory, participants should increase effort mobilization proportionally to task difficulty as long as success is possible and justified. Forming goals and making plans should allow for reduced effort mobilization when participants perform an easy task. However, when the task is difficult, goals and plans should differ in their effect on effort mobilization. Participants who set goals should disengage, whereas participants who made if-then plans should stay in the field showing high effort mobilization during task performance. As expected, using an easy task in Experiment 1, we observed a lower cardiac PEP in both the implementation intention and the goal intention condition than in the control condition. In Experiment 2, we varied task difficulty and demonstrated that while participants with a mere goal intention disengaged from difficult tasks, participants with an implementation intention increased effort mobilization proportionally with task difficulty. These findings demonstrate the influence of goal striving strategies (i.e., mere goals vs. if-then plans) on effort mobilization during task performance. Copyright © 2016 Elsevier B.V. All rights reserved.
Liver segmentation from CT images using a sparse priori statistical shape model (SP-SSM).
Wang, Xuehu; Zheng, Yongchang; Gan, Lan; Wang, Xuan; Sang, Xinting; Kong, Xiangfeng; Zhao, Jie
2017-01-01
This study proposes a new liver segmentation method based on a sparse a priori statistical shape model (SP-SSM). First, mark points are selected in the a priori liver model and the original image. Then, the a priori shape and its mark points are used to obtain a dictionary for the liver boundary information. Second, the sparse coefficient is calculated based on the correspondence between mark points in the original image and those in the a priori model, and the sparse statistical model is then established by combining the sparse coefficients and the dictionary. Finally, the intensity energy and boundary energy models are built based on the intensity information and the specific boundary information of the original image. Then, the sparse matching constraint model is established based on sparse coding theory. These models jointly drive the iterative deformation of the sparse statistical model to approximate and accurately extract the liver boundaries. The sparse dictionary thereby addresses both deformation-model initialization and the accuracy limitations of a priori methods. The SP-SSM can achieve a mean overlap error of 4.8% and a mean volume difference of 1.8%, whereas the average symmetric surface distance and the root mean square symmetric surface distance can reach 0.8 mm and 1.4 mm, respectively.
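The sparse-coefficient step can be illustrated with a minimal orthogonal matching pursuit, a standard sparse-coding solver. This is a generic sketch, not the SP-SSM code, and the trivial identity dictionary is only for demonstration:

```python
import numpy as np

def omp(D, y, k):
    """Orthogonal matching pursuit: greedily build a k-sparse alpha with
    D @ alpha ~= y, where the columns of D are dictionary atoms
    (in the SP-SSM setting, atoms would encode a priori boundary shapes)."""
    residual = y.astype(float).copy()
    support, coef = [], np.zeros(0)
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ residual)))   # most correlated atom
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    alpha = np.zeros(D.shape[1])
    alpha[support] = coef
    return alpha

alpha = omp(np.eye(4), np.array([0.0, 2.0, 0.0, 1.0]), 2)
```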
Conventional Principles in Science: On the foundations and development of the relativized a priori
NASA Astrophysics Data System (ADS)
Ivanova, Milena; Farr, Matt
2015-11-01
The present volume consists of a collection of papers originally presented at the conference Conventional Principles in Science, held at the University of Bristol, August 2011, which featured contributions on the history and contemporary development of the notion of 'relativized a priori' principles in science, from Henri Poincaré's conventionalism to Michael Friedman's contemporary defence of the relativized a priori. In Science and Hypothesis, Poincaré assessed the problematic epistemic status of Euclidean geometry and Newton's laws of motion, famously arguing that each has the status of 'convention' in that their justification is neither analytic nor empirical in nature. In The Theory of Relativity and A Priori Knowledge, Hans Reichenbach, in light of the general theory of relativity, proposed an updated notion of the Kantian synthetic a priori to account for the dynamic inter-theoretic status of geometry and other non-empirical physical principles. Reichenbach noted that one may reject the 'necessarily true' aspect of the synthetic a priori whilst preserving the feature of being constitutive of the object of knowledge. Such constitutive principles are theory-relative, as illustrated by the privileged role of non-Euclidean geometry in general relativity theory. This idea of relativized a priori principles in spacetime physics has been analysed and developed at great length in the modern literature in the work of Michael Friedman, in particular the roles played by the light postulate and the equivalence principle - in special and general relativity respectively - in defining the central terms of their respective theories and connecting the abstract mathematical formalism of the theories with their empirical content. 
The papers in this volume guide the reader through the historical development of conventional and constitutive principles in science, from the foundational work of Poincaré, Reichenbach and others, to contemporary issues and applications of the relativized a priori concerning the notion of measurement, physical possibility, and the interpretation of scientific theories.
DAISY: a new software tool to test global identifiability of biological and physiological systems.
Bellu, Giuseppina; Saccomani, Maria Pia; Audoly, Stefania; D'Angiò, Leontina
2007-10-01
A priori global identifiability is a structural property of biological and physiological models. It is considered a prerequisite for well-posed estimation, since it concerns the possibility of recovering uniquely the unknown model parameters from measured input-output data, under ideal conditions (noise-free observations and error-free model structure). Of course, determining if the parameters can be uniquely recovered from observed data is essential before investing resources, time and effort in performing actual biomedical experiments. Many interesting biological models are nonlinear, but identifiability analysis for nonlinear systems turns out to be a difficult mathematical problem. Different methods have been proposed in the literature to test identifiability of nonlinear models but, to the best of our knowledge, so far no software tools have been proposed for automatically checking identifiability of nonlinear models. In this paper, we describe a software tool implementing a differential algebra algorithm to perform parameter identifiability analysis for (linear and) nonlinear dynamic models described by polynomial or rational equations. Our goal is to provide the biological investigator with a fully automated software tool, requiring minimum prior knowledge of mathematical modelling and no in-depth understanding of the mathematical tools. The DAISY (Differential Algebra for Identifiability of SYstems) software will potentially be useful in biological modelling studies, especially in physiology and clinical medicine, where research experiments are particularly expensive and/or difficult to perform. Practical examples of use of the software tool DAISY are presented. DAISY is available at the web site http://www.dei.unipd.it/~pia/.
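The flavour of such a check can be sketched symbolically: the measured output fixes a set of "exhaustive summary" expressions, and the parameters are globally identifiable iff those equations have a unique solution. This toy example uses SymPy as a stand-in and is not the DAISY algorithm:

```python
import sympy as sp

# Suppose the output only determines the sum (lam) and product (A) of two
# rate parameters p1, p2 -- the 'exhaustive summary' of the toy model.
p1, p2, A, lam = sp.symbols('p1 p2 A lam')
summary = [sp.Eq(p1 + p2, lam), sp.Eq(p1 * p2, A)]
solutions = sp.solve(summary, [p1, p2], dict=True)
# Two symmetric solutions: p1 and p2 can be swapped, so the model is only
# locally, not globally, identifiable.
```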
Design of an Axisymmetric Afterbody Test Case for CFD Validation
NASA Technical Reports Server (NTRS)
Disotell, Kevin J.; Rumsey, Christopher L.
2017-01-01
As identified in the CFD Vision 2030 Study commissioned by NASA, validation of advanced RANS models and scale-resolving methods for computing turbulent flow fields must be supported by continuous improvements in fundamental, high-fidelity experiments designed specifically for CFD implementation. In accordance with this effort, the underpinnings of a new test platform referred to herein as the NASA Axisymmetric Afterbody are presented. The devised body-of-revolution is a modular platform consisting of a forebody section and afterbody section, allowing for a range of flow behaviors to be studied on interchangeable afterbody geometries. A body-of-revolution offers advantages in shape definition and fabrication, in avoiding direct contact with wind tunnel sidewalls, and in tail-sting integration to facilitate access to higher Reynolds number tunnels. The current work is focused on validation of smooth-body turbulent flow separation, for which a six-parameter body has been developed. A priori RANS computations are reported for a risk-reduction test configuration in order to demonstrate critical variation among turbulence model results for a given afterbody, ranging from barely attached to mildly separated flow. RANS studies of the effects of forebody nose (with/without) and wind tunnel boundary (slip/no-slip) on the selected afterbody are presented. Representative modeling issues that can be explored with this configuration are the effect of higher Reynolds number on separation behavior, flow physics of the progression from attached to increasingly-separated afterbody flows, and the effect of embedded longitudinal vortices on turbulence structure.
Jump-Starting Educational Reform. Implementing British Columbia's Comprehensive School Act.
ERIC Educational Resources Information Center
Goldman, Paul
An educational reform effort to implement a comprehensive school act in British Columbia (Canada) is analyzed with a focus on some sociotechnical and political aspects. An overview of the content, background, and implementation of the reform effort is followed by identification of seven contradictions inherent in the plan. Contradictions are as…
Wallace, Lauren; Kapiriri, Lydia
2017-01-01
Background: To date, research on priority-setting for new vaccines has not adequately explored the influence of the global, national and sub-national levels of decision-making or contextual issues such as political pressure and stakeholder influence and power. Using Kapiriri and Martin’s conceptual framework, this paper evaluates priority setting for new vaccines in Uganda at national and sub-national levels, and considers how global priorities can influence country priorities. This study focuses on 2 specific vaccines, the human papilloma virus (HPV) vaccine and the pneumococcal conjugate vaccine (PCV). Methods: This was a qualitative study that involved reviewing relevant Ugandan policy documents and media reports, as well as 54 key informant interviews at the global level and national and sub-national levels in Uganda. Kapiriri and Martin’s conceptual framework was used to evaluate the prioritization process. Results: Priority setting for PCV and HPV was conducted by the Ministry of Health (MoH), which is considered to be a legitimate institution. While respondents described the priority setting process for PCV process as transparent, participatory, and guided by explicit relevant criteria and evidence, the prioritization of HPV was thought to have been less transparent and less participatory. Respondents reported that neither process was based on an explicit priority setting framework nor did it involve adequate representation from the districts (program implementers) or publicity. The priority setting process for both PCV and HPV was negatively affected by the larger political and economic context, which contributed to weak institutional capacity as well as power imbalances between development assistance partners and the MoH. 
Conclusion: Priority setting in Uganda would be improved by strengthening institutional capacity and leadership and by ensuring a transparent and participatory process in which key stakeholders such as program implementers (the districts) and beneficiaries (the public) are involved. Kapiriri and Martin's framework has the potential to guide priority setting evaluation efforts; however, evaluation should be built into the priority setting process a priori such that information on priority setting is gathered throughout the implementation cycle. PMID:29172378
Optimal phase estimation with arbitrary a priori knowledge
DOE Office of Scientific and Technical Information (OSTI.GOV)
Demkowicz-Dobrzanski, Rafal
2011-06-15
The optimal-phase estimation strategy is derived when partial a priori knowledge of the estimated phase is available. The solution is found with the help of the most famous result from entanglement theory: the positive partial transpose criterion. The structure of the optimal measurements, estimators, and optimal probe states is analyzed. This Rapid Communication provides a unified framework bridging the gap in the literature on the subject, which until now dealt almost exclusively with two extreme cases: almost perfect knowledge (local approach based on Fisher information) and no a priori knowledge (global approach based on covariant measurements). Special attention is paid to a natural a priori probability distribution arising from a diffusion process.
NASA Astrophysics Data System (ADS)
Lerot, C.; Van Roozendael, M.; Spurr, R.; Loyola, D.; Coldewey-Egbers, M.; Kochenova, S.; van Gent, J.; Koukouli, M.; Balis, D.; Lambert, J.-C.; Granville, J.; Zehner, C.
2014-02-01
Within the European Space Agency's Climate Change Initiative, total ozone column records from GOME (Global Ozone Monitoring Experiment), SCIAMACHY (SCanning Imaging Absorption SpectroMeter for Atmospheric CartograpHY), and GOME-2 have been reprocessed with GODFIT version 3 (GOME-type Direct FITting). This algorithm is based on the direct fitting of reflectances simulated in the Huggins bands to the observations. We report on new developments in the algorithm from the version implemented in the operational GOME Data Processor v5. The a priori ozone profile database TOMSv8 is now combined with a recently compiled OMI/MLS tropospheric ozone climatology to improve the representativeness of a priori information. The Ring procedure that corrects simulated radiances for the rotational Raman inelastic scattering signature has been improved using a revised semi-empirical expression. Correction factors are also applied to the simulated spectra to account for atmospheric polarization. In addition, the computational performance has been significantly enhanced through the implementation of new radiative transfer tools based on principal component analysis of the optical properties. Furthermore, a soft-calibration scheme for measured reflectances and based on selected Brewer measurements has been developed in order to reduce the impact of level-1 errors. This soft-calibration corrects not only for possible biases in backscattered reflectances, but also for artificial spectral features interfering with the ozone signature. Intersensor comparisons and ground-based validation indicate that these ozone data sets are of unprecedented quality, with stability better than 1% per decade, a precision of 1.7%, and systematic uncertainties less than 3.6% over a wide range of atmospheric states.
Implementation and testing of the gridded Vienna Mapping Function 1 (VMF1)
NASA Astrophysics Data System (ADS)
Kouba, J.
2008-04-01
The new gridded Vienna Mapping Function (VMF1) was implemented and compared to the well-established site-dependent VMF1, directly and by using precise point positioning (PPP) with International GNSS Service (IGS) Final orbits/clocks for a 1.5-year GPS data set of 11 globally distributed IGS stations. The gridded VMF1 data can be interpolated for any location and for any time after 1994, whereas the site-dependent VMF1 data are only available at selected IGS stations and only after 2004. Both gridded and site-dependent VMF1 PPP solutions agree within 1 and 2 mm for the horizontal and vertical position components, respectively, provided that respective VMF1 hydrostatic zenith path delays (ZPD) are used for hydrostatic ZPD mapping to slant delays. The total ZPD of the gridded and site-dependent VMF1 data agree with PPP ZPD solutions with RMS of 1.5 and 1.8 cm, respectively. Such precise total ZPDs could provide useful initial a priori ZPD estimates for kinematic PPP and regional static GPS solutions. The hydrostatic ZPDs of the gridded VMF1 compare with the site-dependent VMF1 ZPDs with RMS of 0.3 cm, subject to some biases and discontinuities of up to 4 cm, which are likely due to different strategies used in the generation of the site-dependent VMF1 data. The precision of gridded hydrostatic ZPD should be sufficient for accurate a priori hydrostatic ZPD mapping in all precise GPS and very long baseline interferometry (VLBI) solutions. Conversely, precise and globally distributed geodetic solutions of total ZPDs, which need to be linked to VLBI to control biases and stability, should also provide a consistent and stable reference frame for long-term and state-of-the-art numerical weather modeling.
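Interpolating the gridded product to a site reduces to standard bilinear interpolation. The 2.0° × 2.5° spacing below follows the VMF1 grid convention, while the grid values and function name are illustrative:

```python
import math

def bilinear(grid, lat, lon, lat0=90.0, dlat=-2.0, lon0=0.0, dlon=2.5):
    """Bilinear interpolation of a gridded product (e.g. a VMF1 coefficient
    or hydrostatic ZPD) to an arbitrary site."""
    i = (lat - lat0) / dlat                      # fractional row index
    j = (lon - lon0) / dlon                      # fractional column index
    i0, j0 = int(math.floor(i)), int(math.floor(j))
    di, dj = i - i0, j - j0
    return ((1 - di) * (1 - dj) * grid[i0][j0] + (1 - di) * dj * grid[i0][j0 + 1]
            + di * (1 - dj) * grid[i0 + 1][j0] + di * dj * grid[i0 + 1][j0 + 1])

# 2 x 2 patch of grid values around a site at 89 deg N, 1.25 deg E:
value = bilinear([[1.0, 2.0], [3.0, 4.0]], 89.0, 1.25)
```

A full implementation would additionally interpolate in time between the 6-hourly epochs and correct for the height difference between the grid and the site.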
ERIC Educational Resources Information Center
Anderson, Kimberly; Mire, Mary Elizabeth
2016-01-01
This report presents a multi-year study of how states are implementing their state college- and career-readiness standards. In this report, the Southern Regional Education Board's (SREB's) Benchmarking State Implementation of College- and Career-Readiness Standards project studied state efforts in 2014-15 and 2015-16 to foster effective…
Loh, Ivory H; Schwendler, Teresa; Trude, Angela C B; Anderson Steeves, Elizabeth T; Cheskin, Lawrence J; Lange, Sarah; Gittelsohn, Joel
2018-01-01
Social media and text messaging show promise as public health interventions, but little evaluation of their implementation exists. The B'more Healthy Communities for Kids (BHCK) was a multilevel, multicomponent (wholesalers, food stores, recreation centers) childhood obesity prevention trial that included social media and text-messaging components. The BHCK was implemented in 28 low-income areas of Baltimore City, Maryland, in 2 waves. The texting intervention targeted 241 low-income African American caregivers (of 283), who received 3 texts/week reinforcing key messages, providing nutrition information, and setting weekly goals. Regular posting on social media platforms (Facebook, Instagram, Twitter) targeted community members and local stakeholders. High implementation standards were set a priori (57 for social media, 11 for texting), with low implementation defined as <50%, medium as 50% to 99%, and high as ≥100% of the high standard for each measure. Reach, dose delivered, and fidelity were assessed via web-based analytic tools. Between waves, social media implementation improved from low-moderate to high in reach, dose delivered, and fidelity. Text messaging increased from moderate to high in reach and dose delivered, while fidelity decreased from high to moderate. Data were used to monitor and revise the BHCK intervention throughout implementation. Our model for evaluating text messaging-based and social media-based interventions may be applicable to other settings.
Specification-based software sizing: An empirical investigation of function metrics
NASA Technical Reports Server (NTRS)
Jeffery, Ross; Stathis, John
1993-01-01
For some time the software industry has espoused the need for improved specification-based software size metrics. This paper reports on a study of nineteen recently developed systems in a variety of application domains. The systems were developed by a single software services corporation using a variety of languages. The study investigated several metric characteristics. It shows that: earlier research into inter-item correlation within the overall function count is partially supported; a priori function counts, by themselves, do not explain the majority of the effort variation in software development in the organization studied; documentation quality is critical to accurate function identification; and rater error is substantial in manual function counting. The implications of these findings for organizations using function-based metrics are explored.
Use of a priori statistics to minimize acquisition time for RFI immune spread spectrum systems
NASA Technical Reports Server (NTRS)
Holmes, J. K.; Woo, K. T.
1978-01-01
The optimum acquisition sweep strategy was determined for a PN code despreader when the a priori probability density function was not uniform. A pseudo-noise spread-spectrum system was considered which could be utilized in the DSN to combat radio frequency interference. In a sample case, when the a priori probability density function was Gaussian, the acquisition time was reduced by about 41% compared to a uniform sweep approach.
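The benefit of exploiting a non-uniform prior can be sketched directly: if code-phase cells are searched in decreasing order of a priori probability, the expected number of dwells drops relative to a plain sequential sweep. The toy below uses a hypothetical discretized Gaussian prior; the 41% figure above comes from the paper's own system parameters, which this sketch does not reproduce.

```python
import math

def expected_cells(prior):
    """Expected number of cells examined when cells are searched in a
    fixed order and the true code phase lies in cell k with
    probability prior[k]."""
    return sum((k + 1) * p for k, p in enumerate(prior))

# Hypothetical discretized Gaussian prior over 101 code-phase cells
n = 101
center, sigma = n // 2, 10.0
weights = [math.exp(-0.5 * ((k - center) / sigma) ** 2) for k in range(n)]
total = sum(weights)
prior = [w / total for w in weights]

uniform_sweep = expected_cells(prior)          # sweep cells 0..n-1 in order
optimal_sweep = expected_cells(sorted(prior, reverse=True))  # most probable first

print(optimal_sweep / uniform_sweep)  # < 1: prior-ordered search is faster
```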
Digital transceiver implementation for wavelet packet modulation
NASA Astrophysics Data System (ADS)
Lindsey, Alan R.; Dill, Jeffrey C.
1998-03-01
Current transceiver designs for wavelet-based communication systems typically rely on analog waveform synthesis; however, digital processing is an important part of the eventual success of these techniques. In this paper, a transceiver implementation is introduced for the recently proposed wavelet packet modulation scheme, which moves the analog processing as far as possible toward the antenna. The transceiver is based on the discrete wavelet packet transform, which incorporates level and node parameters for generalized computation of wavelet packets. In this transform no particular structure is imposed on the filter bank save dyadic branching and a maximum level, which is specified a priori and depends mainly on speed and/or cost considerations. The transmitter/receiver structure takes a binary sequence as input and, based on the desired time-frequency partitioning, processes the signal through demultiplexing, synthesis, analysis, multiplexing, and data determination entirely in the digital domain, except for conversion into and out of the analog domain for transmission.
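The dyadic branching of the discrete wavelet packet transform down to an a priori fixed maximum level can be sketched compactly. Haar filters stand in here for whatever filter bank an actual transceiver would use.

```python
import math

def haar_step(x):
    """One Haar analysis step: split a signal into (approximation, detail)."""
    s = math.sqrt(2.0)
    a = [(x[2 * i] + x[2 * i + 1]) / s for i in range(len(x) // 2)]
    d = [(x[2 * i] - x[2 * i + 1]) / s for i in range(len(x) // 2)]
    return a, d

def wavelet_packet(x, max_level):
    """Full dyadic wavelet packet decomposition down to max_level.

    Every node splits into two children at each level (dyadic branching
    only), and the maximum level is fixed a priori, as in the paper.
    Haar filters are a stand-in for the transceiver's actual filter bank.
    """
    nodes = [x]
    for _ in range(max_level):
        nodes = [half for node in nodes for half in haar_step(node)]
    return nodes

leaves = wavelet_packet([1.0, 2.0, 3.0, 4.0, 4.0, 3.0, 2.0, 1.0], 2)
print(len(leaves), len(leaves[0]))  # 4 subbands of 2 samples each
```

Because the Haar filters are orthonormal, the decomposition preserves signal energy, which makes the subband coefficients directly usable as modulation symbols.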
NASA Technical Reports Server (NTRS)
Ketchum, Eleanor A. (Inventor)
2000-01-01
A computer-implemented method and apparatus for autonomously determining the position of a vehicle to within 100 km from magnetic field measurements and attitude data, without a priori knowledge of position. An inverted dipole solution yielding two possible position solutions for each measurement of magnetic field data is deterministically calculated by a program-controlled processor solving the inverted first-order spherical harmonic representation of the geomagnetic field for two unit position vectors 180 degrees apart and a vehicle distance from the center of the earth. Correction schemes such as successive substitutions and the Newton-Raphson method are applied to each dipole solution. The two position solutions for each measurement are saved separately. Velocity vectors for the position solutions are calculated so that a total energy difference for each of the two resultant position paths can be computed. The position path with the smaller absolute total energy difference is chosen as the true position path of the vehicle.
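The Newton-Raphson correction mentioned above can be illustrated on the simplest case, a pure dipole: given a field-magnitude measurement and the magnetic latitude, iterate for the geocentric distance. The constants and the reduction to a scalar equation are illustrative, not the patent's full first-order spherical-harmonic formulation.

```python
import math

def dipole_radius(B, lat_rad, B0=3.12e-5, Re=6371.2e3):
    """Geocentric distance r from a field-magnitude measurement B using
    the dipole model |B| = B0*(Re/r)**3*sqrt(1 + 3*sin(lat)**2),
    refined by Newton-Raphson iteration.  B0 (tesla) and Re (m) are
    nominal dipole constants assumed for this sketch.
    """
    k = B0 * math.sqrt(1.0 + 3.0 * math.sin(lat_rad) ** 2)
    r = Re                                 # initial guess: Earth's surface
    for _ in range(50):
        f = k * (Re / r) ** 3 - B          # residual of the field equation
        df = -3.0 * k * Re ** 3 / r ** 4   # derivative of the residual
        step = f / df
        r -= step
        if abs(step) < 1e-6:               # converged to sub-micrometre level
            break
    return r

# A measurement synthesized at r = 7000 km over the magnetic equator
B_meas = 3.12e-5 * (6371.2e3 / 7000e3) ** 3
print(dipole_radius(B_meas, 0.0) / 1e3)  # recovered radius in km
```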
Benefits of a one health approach: An example using Rift Valley fever.
Rostal, Melinda K; Ross, Noam; Machalaba, Catherine; Cordel, Claudia; Paweska, Janusz T; Karesh, William B
2018-06-01
One Health has been promoted by international institutions as a framework to improve public health outcomes. Despite strong overall interest in One Health, country-, local- and project-level implementation remains limited, likely due to the lack of pragmatic and tested operational methods for implementation and metrics for evaluation. Here we use Rift Valley fever virus as an example to demonstrate the value of using a One Health approach for both scientific and resource advantages. We demonstrate that coordinated, a priori investigations between One Health sectors can yield higher statistical power to elucidate important public health relationships as compared to siloed investigations and post-hoc analyses. Likewise, we demonstrate that across a project or multi-ministry health study a One Health approach can result in improved resource efficiency, with resultant cost savings (35% in the presented case). The results of these analyses demonstrate that One Health approaches can be directly and tangibly applied to health investigations.
Time Domain Simulations of Arm Locking in LISA
NASA Technical Reports Server (NTRS)
Thorpe, J. I.; Maghami, P.; Livas, Jeff
2011-01-01
Arm locking is a technique that has been proposed for reducing laser frequency fluctuations in the Laser Interferometer Space Antenna (LISA), a gravitational-wave observatory sensitive in the milliHertz frequency band. Arm locking takes advantage of the geometric stability of the triangular constellation of three spacecraft that comprise LISA to provide a frequency reference with a stability in the LISA measurement band that exceeds that available from a standard reference such as an optical cavity or molecular absorption line. We have implemented a time-domain simulation of arm locking including the expected limiting noise sources (shot noise, clock noise, spacecraft jitter noise, and residual laser frequency noise). The effect of imperfect a priori knowledge of the LISA heterodyne frequencies and the associated "pulling" of an arm-locked laser is included. We find that our implementation meets requirements on both the noise and the dynamic range of the laser frequency.
Experimental statistical signature of many-body quantum interference
NASA Astrophysics Data System (ADS)
Giordani, Taira; Flamini, Fulvio; Pompili, Matteo; Viggianiello, Niko; Spagnolo, Nicolò; Crespi, Andrea; Osellame, Roberto; Wiebe, Nathan; Walschaers, Mattia; Buchleitner, Andreas; Sciarrino, Fabio
2018-03-01
Multi-particle interference is an essential ingredient for fundamental quantum mechanics phenomena and for quantum information processing to provide a computational advantage, as recently emphasized by boson sampling experiments. Hence, developing a reliable and efficient technique to witness its presence is pivotal in achieving the practical implementation of quantum technologies. Here, we experimentally identify genuine many-body quantum interference via a recent efficient protocol, which exploits statistical signatures at the output of a multimode quantum device. We successfully apply the test to validate three-photon experiments in an integrated photonic circuit, providing an extensive analysis on the resources required to perform it. Moreover, drawing upon established techniques of machine learning, we show how such tools help to identify the—a priori unknown—optimal features to witness these signatures. Our results provide evidence on the efficacy and feasibility of the method, paving the way for its adoption in large-scale implementations.
Adaptively restrained molecular dynamics in LAMMPS
NASA Astrophysics Data System (ADS)
Kant Singh, Krishna; Redon, Stephane
2017-07-01
Adaptively restrained molecular dynamics (ARMD) is a recently introduced particle simulation method that switches positional degrees of freedom on and off during simulation in order to speed up calculations. In the NVE ensemble, ARMD allows users to trade between precision and speed while, in the NVT ensemble, it makes it possible to compute statistical averages faster. Despite the conceptual simplicity of the approach, however, integrating it in existing molecular dynamics packages is non-trivial, in particular since implemented potentials would a priori have to be rewritten to take advantage of frozen particles and achieve a speed-up. In this paper, we present novel algorithms for integrating ARMD in LAMMPS, a popular multi-purpose molecular simulation package. In particular, we demonstrate how to enable ARMD in LAMMPS without having to re-implement all available force fields. The proposed algorithms are assessed on four different benchmarks, which show how they allow us to speed up simulations by up to one order of magnitude.
Envelope Protection for In-Flight Ice Contamination
NASA Technical Reports Server (NTRS)
Gingras, David R.; Barnhart, Billy P.; Ranaudo, Richard J.; Ratvasky, Thomas P.; Morelli, Eugene A.
2010-01-01
Fatal loss-of-control (LOC) accidents have been directly related to in-flight airframe icing. The prototype system presented in this paper directly addresses the need for real-time onboard envelope protection in icing conditions. The combination of a priori information and real-time aerodynamic estimation is shown to provide sufficient input for determining safe limits of the flight envelope during in-flight icing encounters. The Icing Contamination Envelope Protection (ICEPro) system has been designed and implemented to identify degradations in airplane performance and flying qualities resulting from ice contamination and to provide safe flight-envelope cues to the pilot. Components of ICEPro are described and results from preliminary tests are presented.
Near-Field Diffraction Imaging from Multiple Detection Planes
NASA Astrophysics Data System (ADS)
Loetgering, L.; Golembusch, M.; Hammoud, R.; Wilhein, T.
2017-06-01
We present diffraction imaging results obtained from multiple near-field diffraction constraints. An iterative phase retrieval algorithm was implemented that uses the data redundancy achieved by measuring near-field diffraction intensities at various sample-detector distances. The procedure allows the exit surface wave of a sample to be reconstructed within a multiple-constraint-satisfaction framework, making use neither of the a priori knowledge enforced in coherent diffraction imaging (CDI) nor of the exact scanning-grid knowledge required in ptychography. We also investigate the potential of the presented technique to deal with polychromatic radiation, which is important for potential applications in diffraction imaging with tabletop EUV and X-ray sources.
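The core loop of such a multi-plane scheme (propagate the current estimate to a detector plane, replace its modulus with the measured one, continue) can be sketched with an angular-spectrum propagator. All parameters below are illustrative, not the authors' experimental settings.

```python
import numpy as np

def angular_spectrum(field, dz, wl, dx):
    """Propagate a sampled 2-D complex field over a distance dz using
    the angular spectrum method (evanescent components are neglected)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = np.maximum(1.0 / wl ** 2 - FX ** 2 - FY ** 2, 0.0)
    H = np.exp(2j * np.pi * np.sqrt(arg) * dz)
    return np.fft.ifft2(np.fft.fft2(field) * H)

def multiplane_retrieve(intensities, dists, wl, dx, iters=20):
    """Cycle through intensities measured at several detector distances,
    enforcing each measured modulus in turn, then return the estimate
    propagated back to the sample plane.  A minimal sketch of the
    multiple-constraint idea, not the authors' exact algorithm."""
    field = np.sqrt(intensities[0]).astype(complex)
    z = dists[0]
    for _ in range(iters):
        for I, d in zip(intensities, dists):
            field = angular_spectrum(field, d - z, wl, dx)  # move to plane d
            z = d
            field = np.sqrt(I) * np.exp(1j * np.angle(field))  # modulus constraint
    return angular_spectrum(field, -z, wl, dx)  # back to the sample plane

# Synthetic demo: a square aperture recorded at two (made-up) distances
n, dx, wl = 64, 1e-6, 0.5e-6
truth = np.zeros((n, n), dtype=complex)
truth[24:40, 24:40] = 1.0
planes = [20e-6, 40e-6]
data = [np.abs(angular_spectrum(truth, d, wl, dx)) ** 2 for d in planes]
recon = multiplane_retrieve(data, planes, wl, dx)
```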
Bhuiya, Nazmim; House, L Duane; Desmarais, Jeffrey; Fletcher, Erica; Conlin, Maeve; Perez-McAdoo, Sarah; Waggett, Jessica; Tendulkar, Shalini A
2017-03-01
This paper describes an assessment of community readiness to implement a community-wide teen pregnancy prevention initiative, Youth First, and presents strategies used to enhance this readiness as informed by the assessment. Twenty-five community stakeholder interviews were conducted to assess four domains of readiness: (1) attitudes, perception, and knowledge of teen pregnancy; (2) perceived level of readiness; (3) resources, existing and current efforts; and (4) leadership. Interview transcripts were coded and analyzed to identify key themes. Stakeholders acknowledged teen pregnancy as an issue but lacked contextual information. They also perceived the community as ready to address the issue and recognized some organizations already championing efforts. However, many key players were not involved, and ongoing data collection to assess teen pregnancy and prevention efforts was limited. Though many stakeholders were ready to engage in teen pregnancy prevention efforts, they required additional information and training to appropriately address the issue. In response to the assessment findings, several strategies were applied to address readiness and build Youth First partners' capacity to implement the community-wide initiative. Thus, to successfully implement community-wide prevention efforts, it is valuable to assess the level of community readiness to address health issues. Copyright © 2016 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.
Aarons, Gregory A; Ehrhart, Mark G; Farahnak, Lauren R
2014-04-14
In healthcare and allied healthcare settings, leadership that supports effective implementation of evidenced-based practices (EBPs) is a critical concern. However, there are no empirically validated measures to assess implementation leadership. This paper describes the development, factor structure, and initial reliability and convergent and discriminant validity of a very brief measure of implementation leadership: the Implementation Leadership Scale (ILS). Participants were 459 mental health clinicians working in 93 different outpatient mental health programs in Southern California, USA. Initial item development was supported as part of a two United States National Institutes of Health (NIH) studies focused on developing implementation leadership training and implementation measure development. Clinician work group/team-level data were randomly assigned to be utilized for an exploratory factor analysis (n = 229; k = 46 teams) or for a confirmatory factor analysis (n = 230; k = 47 teams). The confirmatory factor analysis controlled for the multilevel, nested data structure. Reliability and validity analyses were then conducted with the full sample. The exploratory factor analysis resulted in a 12-item scale with four subscales representing proactive leadership, knowledgeable leadership, supportive leadership, and perseverant leadership. Confirmatory factor analysis supported an a priori higher order factor structure with subscales contributing to a single higher order implementation leadership factor. The scale demonstrated excellent internal consistency reliability as well as convergent and discriminant validity. The ILS is a brief and efficient measure of unit level leadership for EBP implementation. The availability of the ILS will allow researchers to assess strategic leadership for implementation in order to advance understanding of leadership as a predictor of organizational context for implementation. 
The ILS also holds promise as a tool for leader and organizational development to improve EBP implementation.
The importance of using dynamical a priori profiles for infrared O3 retrievals: the case of IASI
NASA Astrophysics Data System (ADS)
Peiro, H.; Emili, E.; Le Flochmoen, E.; Barret, B.; Cariolle, D.
2016-12-01
Tropospheric ozone (O3) is a trace gas involved in the global greenhouse effect. To quantify its contribution to global warming, an accurate determination of O3 profiles is necessary. The IASI (Infrared Atmospheric Sounding Interferometer) instrument, on board the MetOp-A satellite, is the sensor most sensitive to tropospheric O3 with a high spatio-temporal coverage. Satellite retrievals are often based on the inversion of the measured radiance data with a variational approach. This requires an a priori profile and the corresponding error covariance matrix (COV) as ancillary input. Previous studies have shown biases (about 20%) in IASI retrievals of the tropospheric column in the Southern Hemisphere (SH). A possible source of error is the a priori profile. This study aims to (i) build a dynamical a priori O3 profile with a Chemistry Transport Model (CTM) and (ii) integrate this a priori profile into IASI retrievals and demonstrate its benefit. Global O3 profiles are retrieved from IASI radiances with the SOFRID (Software for a Fast Retrieval of IASI Data) algorithm, which is based on the RTTOV (Radiative Transfer for TOVS) code and a 1D-Var retrieval scheme. Until now, a constant a priori profile based on a combination of MOZAIC, WOUDC-SHADOZ and Aura/MLS data, named here CLIM PR, was used. The global CTM MOCAGE (Modèle de Chimie Atmosphérique à Grande Echelle) has been used with a linear O3 chemistry scheme to assimilate Microwave Limb Sounder (MLS) data. A model resolution of 2°x2°, with 60 sigma-hybrid vertical levels covering the stratosphere, has been used. MLS level 2 products have been assimilated with a 4D-Var variational algorithm to constrain stratospheric O3 and obtain high-quality a priori O3 profiles above the tropopause.
From this reanalysis, we built these profiles at a 6-h frequency on a coarser 10°x20° resolution grid, named MOCAGE+MLS PR. Statistical comparisons between retrievals and ozonesondes have shown better correlations and smaller biases for MOCAGE+MLS PR than for CLIM PR. We found biases of 6% instead of 33% in the SH, showing that the a priori plays an important role in O3 infrared retrievals. Improvements of IASI retrievals have been obtained in the free troposphere and lower stratosphere by inserting dynamical a priori profiles from a CTM in SOFRID. A possible further advancement would be to insert a dynamical COV in SOFRID.
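The 1D-Var scheme at the heart of SOFRID minimizes the standard variational cost function, in which the a priori profile and its covariance are exactly the ancillary inputs this study replaces. In conventional notation:

```latex
J(\mathbf{x}) = \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_a)^{\mathsf T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_a)
              + \tfrac{1}{2}\,\bigl(\mathbf{y}-H(\mathbf{x})\bigr)^{\mathsf T}\mathbf{R}^{-1}\bigl(\mathbf{y}-H(\mathbf{x})\bigr)
```

where x_a is the a priori O3 profile (CLIM PR or MOCAGE+MLS PR), B its error covariance (the COV), y the measured IASI radiances, H the radiative transfer operator (RTTOV), and R the observation error covariance.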
Schulze, Christin; Newell, Ben R
2016-07-01
Cognitive load has previously been found to have a positive effect on strategy selection in repeated risky choice. Specifically, whereas inferior probability matching often prevails under single-task conditions, optimal probability maximizing sometimes dominates when a concurrent task competes for cognitive resources. We examined the extent to which this seemingly beneficial effect of increased task demands hinges on the effort required to implement each of the choice strategies. Probability maximizing typically involves a simple repeated response to a single option, whereas probability matching requires choice proportions to be tracked carefully throughout a sequential choice task. Here, we flipped this pattern by introducing a manipulation that made the implementation of maximizing more taxing and, at the same time, allowed decision makers to probability match via a simple repeated response to a single option. The results from two experiments showed that increasing the implementation effort of probability maximizing resulted in decreased adoption rates of this strategy. This was the case both when decision makers simultaneously learned about the outcome probabilities and responded to a dual task (Exp. 1) and when these two aspects were procedurally separated in two distinct stages (Exp. 2). We conclude that the effort involved in implementing a choice strategy is a key factor in shaping repeated choice under uncertainty. Moreover, highlighting the importance of implementation effort casts new light on the sometimes surprising and inconsistent effects of cognitive load that have previously been reported in the literature.
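The gap between the two strategies is easy to make concrete for a binary outcome with probability p: matching predicts each option in proportion to its own probability, while maximizing always picks the more likely one. A quick sketch with an illustrative p:

```python
def matching_accuracy(p):
    """Expected accuracy when choice proportions match the outcome
    probabilities in a binary prediction task."""
    return p * p + (1 - p) * (1 - p)

def maximizing_accuracy(p):
    """Expected accuracy when the more likely option is chosen on every trial."""
    return max(p, 1 - p)

p = 0.7  # illustrative outcome probability
print(matching_accuracy(p), maximizing_accuracy(p))  # about 0.58 vs 0.7
```

Maximizing is at least as accurate as matching for every p, with equality only at p = 0.5, which is why matching is described as the inferior strategy.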
Zimmerman, Lindsey; Lounsbury, David W; Rosen, Craig S; Kimerling, Rachel; Trafton, Jodie A; Lindley, Steven E
2016-11-01
Implementation planning typically incorporates stakeholder input. Quality improvement efforts provide data-based feedback regarding progress. Participatory system dynamics modeling (PSD) triangulates stakeholder expertise, data and simulation of implementation plans prior to attempting change. Frontline staff in one VA outpatient mental health system used PSD to examine policy and procedural "mechanisms" they believe underlie local capacity to implement evidence-based psychotherapies (EBPs) for PTSD and depression. We piloted the PSD process, simulating implementation plans to improve EBP reach. Findings indicate PSD is a feasible, useful strategy for building stakeholder consensus, and may save time and effort as compared to trial-and-error EBP implementation planning.
DOE Office of Scientific and Technical Information (OSTI.GOV)
J.D. Sanders
Under the U.S.-Russian Material Protection, Control and Accounting (MPC&A) Program, the Material Control and Accounting Measurements (MCAM) Project has supported a joint U.S.-Russian effort to coordinate improvements of the Russian MC&A measurement system. These efforts have resulted in the development of a MC&A Equipment and Methodological Support (MEMS) Strategic Plan (SP), developed by the Russian MEMS Working Group. The MEMS SP covers implementation of MC&A measurement equipment, as well as the development, attestation and implementation of measurement methodologies and reference materials at the facility and industry levels. This paper provides an overview of the activities conducted under the MEMS SP, as well as a status on current efforts to develop reference materials, implement destructive and nondestructive assay measurement methodologies, and implement sample exchange, scrap and holdup measurement programs across Russian nuclear facilities.
Driving a car with custom-designed fuzzy inferencing VLSI chips and boards
NASA Technical Reports Server (NTRS)
Pin, Francois G.; Watanabe, Yutaka
1993-01-01
Vehicle control in a priori unknown, unpredictable, and dynamic environments requires many calculational and reasoning schemes to operate on the basis of very imprecise, incomplete, or unreliable data. For such systems, in which all the uncertainties cannot be engineered away, approximate reasoning may provide an alternative to the complexity and computational requirements of conventional uncertainty analysis and propagation techniques. Two types of computer boards including custom-designed VLSI chips were developed to add a fuzzy inferencing capability to real-time control systems. All inferencing rules on a chip are processed in parallel, allowing execution of the entire rule base in about 30 microseconds and therefore making control of 'reflex-type' motions conceivable. We first discuss the use of these boards and an approach, based on the superposition of elemental sensor-based behaviors, for developing qualitative reasoning schemes that emulate human-like navigation in a priori unknown environments. We then describe how the human-like navigation scheme implemented on one of the qualitative inferencing boards was installed on a test-bed platform to investigate two control modes for driving a car in a priori unknown environments on the basis of sparse and imprecise sensor data. In the first mode, the car navigates fully autonomously, while in the second mode, the system acts as a driver's aid, providing the driver with linguistic (fuzzy) commands to turn left or right and speed up or slow down depending on the obstacles perceived by the sensors. Experiments with both modes of control are described in which the system uses only three acoustic range (sonar) sensor channels to perceive the environment.
Simulation results as well as indoor and outdoor experiments are presented and discussed to illustrate the feasibility and robustness of autonomous navigation and/or a safety-enhancing driver's aid using the new fuzzy inferencing hardware system and some human-like reasoning schemes, which may include as few as six elemental behaviors embodied in fourteen qualitative rules.
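The flavor of such qualitative rules can be sketched with a two-rule Mamdani-style inference. The membership functions, centroids, and rules below are invented for illustration and are not the fourteen rules used on the test bed.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_speed(front_range):
    """Two toy rules mapping a sonar range reading (m) to a normalized
    speed command, with weighted-centroid defuzzification:

    Rule 1: IF obstacle is NEAR THEN speed is SLOW (centroid 0.2)
    Rule 2: IF obstacle is FAR  THEN speed is FAST (centroid 0.8)
    """
    near = tri(front_range, -1.0, 0.0, 3.0)   # membership of NEAR
    far = tri(front_range, 1.0, 4.0, 6.0)     # membership of FAR
    if near + far == 0.0:
        return 0.0
    return (near * 0.2 + far * 0.8) / (near + far)

print(fuzzy_speed(0.5) < fuzzy_speed(3.5))  # closer obstacle -> slower command
```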
The Neural Correlates of Emotion Regulation by Implementation Intentions
Hallam, Glyn P.; Webb, Thomas L.; Sheeran, Paschal; Miles, Eleanor; Wilkinson, Iain D.; Hunter, Michael D.; Barker, Anthony T.; Woodruff, Peter W. R.; Totterdell, Peter; Lindquist, Kristen A.; Farrow, Tom F. D.
2015-01-01
Several studies have investigated the neural basis of effortful emotion regulation (ER) but the neural basis of automatic ER has been less comprehensively explored. The present study investigated the neural basis of automatic ER supported by ‘implementation intentions’. 40 healthy participants underwent fMRI while viewing emotion-eliciting images and used either a previously-taught effortful ER strategy, in the form of a goal intention (e.g., try to take a detached perspective), or a more automatic ER strategy, in the form of an implementation intention (e.g., “If I see something disgusting, then I will think these are just pixels on the screen!”), to regulate their emotional response. Whereas goal intention ER strategies were associated with activation of brain areas previously reported to be involved in effortful ER (including dorsolateral prefrontal cortex), ER strategies based on an implementation intention strategy were associated with activation of right inferior frontal gyrus and ventro-parietal cortex, which may reflect the attentional control processes automatically captured by the cue for action contained within the implementation intention. Goal intentions were also associated with less effective modulation of left amygdala, supporting the increased efficacy of ER under implementation intention instructions, which showed coupling of orbitofrontal cortex and amygdala. The findings support previous behavioural studies in suggesting that forming an implementation intention enables people to enact goal-directed responses with less effort and more efficiency. PMID:25798822
Multiparameter elastic full waveform inversion with facies-based constraints
NASA Astrophysics Data System (ADS)
Zhang, Zhen-dong; Alkhalifah, Tariq; Naeini, Ehsan Zabihi; Sun, Bingbing
2018-06-01
Full waveform inversion (FWI) incorporates all the data characteristics to estimate the parameters described by the assumed physics of the subsurface. However, current efforts to utilize FWI beyond improved acoustic imaging, as in reservoir delineation, face inherent challenges related to the limited resolution and the potential trade-off between the elastic model parameters. Some anisotropic parameters are insufficiently updated because of their minor contributions to the surface collected data. Adding rock physics constraints to the inversion helps mitigate such limited sensitivity, but current approaches to adding such constraints include them either as a priori knowledge, mostly valid around the well, or as a global constraint for the whole area. Since similar rock formations inside the Earth admit consistent elastic properties and relative values of elasticity and anisotropy parameters (this enables us to define them as a seismic facies), utilizing such localized facies information in FWI can improve the resolution of inverted parameters. We propose a novel approach to use facies-based constraints in both isotropic and anisotropic elastic FWI. We invert for such facies using Bayesian theory and update them at each iteration of the inversion using both the inverted models and a priori information. We take the uncertainties of the estimated parameters (approximated by radiation patterns) into consideration and improve the quality of estimated facies maps. Four numerical examples corresponding to different acquisition, physical assumptions and model circumstances are used to verify the effectiveness of the proposed method.
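The facies update described above can be illustrated as a per-point Bayesian classification: each grid point is assigned the facies maximizing the product of a Gaussian likelihood of the inverted elastic parameters and a prior. All means, variances, and priors below are made up for the sketch.

```python
import math

def classify_facies(vp, vs, rho, facies):
    """Return the facies with the highest (log) posterior for one grid
    point, assuming independent Gaussian likelihoods per parameter.
    An illustrative Bayesian update in the spirit of the paper; the
    facies statistics here are hypothetical."""
    best, best_post = None, -math.inf
    for name, (mu, var, prior) in facies.items():
        ll = sum(-0.5 * (x - m) ** 2 / v - 0.5 * math.log(2 * math.pi * v)
                 for x, m, v in zip((vp, vs, rho), mu, var))
        post = ll + math.log(prior)
        if post > best_post:
            best, best_post = name, post
    return best

# Hypothetical facies statistics: (means for vp, vs, rho), variances, prior
facies = {
    "shale": ((2800.0, 1400.0, 2.45), (200.0 ** 2, 120.0 ** 2, 0.05 ** 2), 0.6),
    "sand":  ((3200.0, 1800.0, 2.30), (200.0 ** 2, 120.0 ** 2, 0.05 ** 2), 0.4),
}
print(classify_facies(3150.0, 1750.0, 2.32, facies))
```

In the paper's setting the likelihood statistics would come from the well data and the priors from the previous iteration's facies map, so the classification sharpens as the inversion proceeds.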
Analyzing and Predicting Effort Associated with Finding and Fixing Software Faults
NASA Technical Reports Server (NTRS)
Hamill, Maggie; Goseva-Popstojanova, Katerina
2016-01-01
Context: Software developers spend a significant amount of time fixing faults. However, not many papers have addressed the actual effort needed to fix software faults. Objective: The objective of this paper is twofold: (1) analysis of the effort needed to fix software faults and how it was affected by several factors and (2) prediction of the level of fix implementation effort based on the information provided in software change requests. Method: The work is based on data related to 1200 failures, extracted from the change tracking system of a large NASA mission. The analysis includes descriptive and inferential statistics. Predictions are made using three supervised machine learning algorithms and three sampling techniques aimed at addressing the imbalanced data problem. Results: Our results show that (1) 83% of the total fix implementation effort was associated with only 20% of failures. (2) Both safety-critical failures and post-release failures required three times more effort to fix compared to their non-critical and pre-release counterparts, respectively. (3) Failures with fixes spread across multiple components or across multiple types of software artifacts required more effort. The spread across artifacts was more costly than the spread across components. (4) Surprisingly, some types of faults associated with later life-cycle activities did not require significant effort. (5) The level of fix implementation effort was predicted with 73% overall accuracy using the original, imbalanced data. Using oversampling techniques improved the overall accuracy up to 77%. More importantly, oversampling significantly improved the prediction of the high effort level, from 31% to around 85%.
Conclusions: This paper shows the importance of tying software failures to changes made to fix all associated faults, in one or more software components and/or in one or more software artifacts, and the benefit of studying how the spread of faults and other factors affect the fix implementation effort.
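The abstract reports that oversampling raised prediction of the high effort level from 31% to around 85%. The snippet below is a minimal sketch of one such sampling technique, random oversampling, which duplicates minority-class examples until all classes match the majority count; the effort labels and counts are illustrative, not the mission's data.

```python
import random
from collections import Counter

random.seed(0)
# Hypothetical imbalanced effort-level labels (illustrative counts only).
labels = ["low"] * 80 + ["medium"] * 15 + ["high"] * 5

def oversample(labels):
    """Duplicate minority-class items at random until every class matches the majority count."""
    counts = Counter(labels)
    target = max(counts.values())
    balanced = list(labels)
    for cls, n in counts.items():
        pool = [x for x in labels if x == cls]
        balanced.extend(random.choices(pool, k=target - n))
    return balanced

balanced = oversample(labels)
print(Counter(balanced))  # each class now has 80 samples
```

A classifier trained on `balanced` sees the rare high-effort class as often as the common classes, which is why recall on that class can improve so sharply.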
TMDL Implementation: Lessons Learned
Virginia Tech Center for TMDL and Watershed Studies provided state TMDL implementation information and reviewed ongoing TMDL implementation efforts across the country to identify factors that contribute to successful implementation.
Mediterranean Diet and Cardiovascular Disease: A Critical Evaluation of A Priori Dietary Indexes
D’Alessandro, Annunziata; De Pergola, Giovanni
2015-01-01
The aim of this paper is to analyze the a priori dietary indexes used in the studies that have evaluated the role of the Mediterranean Diet in influencing the risk of developing cardiovascular disease. All the studies show that this dietary pattern protects against cardiovascular disease, but studies show quite different effects on specific conditions such as coronary heart disease or cerebrovascular disease. A priori dietary indexes used to measure dietary exposure imply quantitative and/or qualitative divergences from the traditional Mediterranean Diet of the early 1960s, and, therefore, it is very difficult to compare the results of different studies. Based on real cultural heritage and traditions, we believe that the a priori indexes used to evaluate adherence to the Mediterranean Diet should consider classifying whole grains and refined grains, olive oil and monounsaturated fats, and wine and alcohol differently. PMID:26389950
Data Prediction for Public Events in Professional Domains Based on Improved RNN- LSTM
NASA Astrophysics Data System (ADS)
Song, Bonan; Fan, Chunxiao; Wu, Yuexin; Sun, Juanjuan
2018-02-01
The traditional data services of prediction for emergency or non-periodic events usually cannot generate satisfying results or serve the intended prediction purpose. However, these events are influenced by external causes, which means that certain a priori information about them can generally be collected through the Internet. This paper studied the above problems and proposed an improved model, an LSTM (Long Short-term Memory) dynamic prediction and a priori information sequence generation model, combining RNN-LSTM with public events' a priori information. In prediction tasks, the model is capable of determining trends, and its accuracy has been validated. The model delivers better performance and prediction results than the previous one. Using a priori information increases the accuracy of prediction; LSTM adapts better to changes in the time sequence; and LSTM can be widely applied to the same type of prediction task, as well as to other prediction tasks related to time sequences.
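A single LSTM cell step, written out in NumPy, illustrates the gating mechanism that lets such a model adapt to changes in a time sequence. All dimensions, the random weights, and the appended "a priori information" feature below are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden = 4, 8  # 3 observed event features + 1 a priori information feature

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One weight matrix per gate (input, forget, output, candidate), acting on [x_t, h_prev].
W = {g: rng.standard_normal((n_hidden, n_in + n_hidden)) * 0.1 for g in "ifoc"}
b = {g: np.zeros(n_hidden) for g in "ifoc"}

def lstm_step(x_t, h_prev, c_prev):
    z = np.concatenate([x_t, h_prev])
    i = sigmoid(W["i"] @ z + b["i"])   # input gate: how much new information to admit
    f = sigmoid(W["f"] @ z + b["f"])   # forget gate: how much old cell state to keep
    o = sigmoid(W["o"] @ z + b["o"])   # output gate: how much cell state to expose
    g = np.tanh(W["c"] @ z + b["c"])   # candidate cell state
    c = f * c_prev + i * g
    h = o * np.tanh(c)
    return h, c

obs = rng.standard_normal(3)       # observed event features at time t
prior_info = np.array([1.0])       # a priori information gathered from the Internet (assumed scalar)
x_t = np.concatenate([obs, prior_info])
h, c = lstm_step(x_t, np.zeros(n_hidden), np.zeros(n_hidden))
print(h.shape)  # (8,)
```

In the proposed model such steps are chained over the event sequence, with the a priori information sequence generated and fed alongside the observations at every step.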
Assessing the risk of work-related international travel.
Druckman, Myles; Harber, Philip; Liu, Yihang; Quigley, Robert L
2014-11-01
To identify factors affecting the likelihood of requiring medical services during international business trips. Data from more than 800,000 international trips and medical assistance cases provided to 48 multinational corporations in 2009. Travel destination countries were grouped into four a priori risk-related categories. Travel to "low" medical risk countries in aggregate accounted for more hospitalizations and medical evacuations than travel to "high" medical risk countries. Nevertheless, the risk per trip was much higher for travel to higher medical risk countries. Corporations with employees on international travel should allocate sufficient resources to manage and ideally prevent medical issues during business travel. Travel medicine must focus on more than infectious diseases, and programs are necessary for both high- and low-risk regions. Improved understanding of travel-related needs determines resource allocation and risk mitigation efforts.
Direct-to-consumer marketing of evidence-based psychological interventions: introduction.
Santucci, Lauren C; McHugh, R Kathryn; Barlow, David H
2012-06-01
The dissemination and implementation of evidence-based psychological interventions (EBPIs) to service provision settings has been a major challenge. Most efforts to disseminate and implement EBPIs have focused on clinicians and clinical systems as the consumers of these treatments and thus have targeted efforts to these groups. An alternative, complementary approach to achieve more widespread utilization of EBPIs is to disseminate directly to patients themselves. The aim of this special section is to explore several direct-to-consumer (i.e., patient) dissemination and education efforts currently underway. This manuscript highlights the rationale for direct-to-patient dissemination strategies as well as the application of marketing science to dissemination efforts. Achieving greater access to EBPIs will require the use of multiple approaches to overcome the many and varied barriers to successful dissemination and implementation. Copyright © 2011. Published by Elsevier Ltd.
Implementation Issues in Federal Reform Efforts in Education: The United States and Australia.
ERIC Educational Resources Information Center
Porter, Paige
Multiple data sources are used in this study of educational change in the United States and Australia. The author considers political issues that may affect the implementation of educational reform efforts at the federal level, such as homogeneity versus heterogeneity, centralization versus decentralization, constitutional responsibility for…
Regional efforts to promote forestry best management practices: a southern success story
Herb Nicholson; John Colberg; Hughes Simpson; Tom Gerow; Wib Owen
2016-01-01
The Southern Group of State Foresters has a long history of water resource protection efforts, providing leadership in BMP development, improvement, and implementation, enhancing state BMP programs, establishing effective partnerships, and standardizing an approach to consistently monitor implementation across the region.
Sanghavi, Ankit; Siddiqui, Nadia J
2017-06-01
While a large body of work documents the interconnections between oral health and obesity, less is known about the role that oral health professionals and organizations play to prevent childhood obesity, especially by influencing children's consumption of sugar-sweetened beverages (SSBs). This review identifies efforts by oral health professionals and organizations to influence such policy and advocacy, while informing future opportunities to leverage and expand on existing efforts. A scoping review of peer-reviewed literature and a web-based review of oral health policy and advocacy initiatives addressing prevention of obesity and reducing children's consumption of SSBs were conducted. Of 30 unique references identified, four peer-reviewed and seven non-peer-reviewed references met selection criteria. Qualitative and quantitative data were extracted using a priori determined headings. Findings suggest a strong role for oral health professionals in preventing childhood obesity and reducing children's consumption of SSBs; however, only a few national, state, and local oral-health-advocacy and -policy efforts were identified, such as policy statements by national associations, state and local education campaigns, and clinical guidelines. Evidence was limited on the role of oral health professionals in influencing broader communitywide advocacy and policy efforts such as soda taxation and limiting SSB consumption in schools. This review provides an emerging evidence base to support growing recognition among oral health professionals of their dual role in preventing childhood obesity and dental caries by targeting SSB consumption. It also identifies opportunities for oral health professionals to build on initial efforts to more proactively influence future policy and advocacy. © 2017 American Association of Public Health Dentistry.
Schneider, Marguerite
2009-01-01
This article discusses the current efforts to measure disability in a comparable manner internationally, the effects of using different types of wording in questions, and the implications of the approach of asking about 'difficulties' rather than 'disability' on the use of disability statistics. The study design was qualitative. Twenty-one focus groups were run with adults responding for themselves. Nine groups were classified a priori by the author as 'disabled', six as 'unsure', and the last six as 'non-disabled'. The participants completed a questionnaire using the Washington Group on Disability Statistics (WG) Short Set, the South African Census 2001 question, and the question 'Are you disabled?'. This was followed by group discussion on these questions and on how the concept of disability is understood by group participants. Participants understand disability as being a permanent, unchangeable state, mostly physical, and where a person is unable to do anything. The participants in the three groups of allocated disability status (disabled, unsure and non-disabled) provided quite different responses on the three questions. All participants in the 'disabled' and 'unsure' groups reported having 'difficulty' on the WG questions, but the 'unsure' groups did not identify as being 'disabled' on either of the two other questions. Using questions that ask about 'difficulty' rather than 'disability' provides a more comprehensive and inclusive measure of disability with a clearer understanding of what is being measured. Asking about 'difficulty' provides an improved measure of disability status for effective data collection and analysis to promote development, implementation and monitoring of disability-inclusive policies.
GOCE User Toolbox and Tutorial
NASA Astrophysics Data System (ADS)
Benveniste, Jérôme; Knudsen, Per
2016-07-01
The GOCE User Toolbox GUT is a compilation of tools for the utilisation and analysis of GOCE Level 2 products. GUT supports applications in Geodesy, Oceanography and Solid Earth Physics. The GUT Tutorial provides information and guidance on how to use the toolbox for a variety of applications. GUT consists of a series of advanced computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux workstations, and Mac. The toolbox is supported by The GUT Algorithm Description and User Guide and The GUT Install Guide. A set of a priori data and models is made available as well. Without any doubt, the development of the GOCE User Toolbox has played a major role in paving the way to successful use of the GOCE data for oceanography. GUT version 2.2 was released in April 2014 and, besides some bug fixes, added the capability to compute the Simple Bouguer Anomaly (Solid Earth). This fall, a new GUT version 3 was released. GUTv3 was further developed through a collaborative effort in which the scientific communities participated, aiming at implementing the remaining functionalities and facilitating a wider span of research in the fields of Geodesy, Oceanography and Solid Earth studies. Accordingly, GUT version 3 has: - an attractive and easy-to-use Graphic User Interface (GUI) for the toolbox, - further software functionalities, such as support for the use of gradients, anisotropic diffusive filtering, and the computation of Bouguer and isostatic gravity anomalies, - an associated GUT VCM tool for analyzing the GOCE variance-covariance matrices.
A community-wide media campaign to promote walking in a Missouri town.
Wray, Ricardo J; Jupka, Keri; Ludwig-Bell, Cathy
2005-10-01
Engaging in moderate physical activity for 30 minutes five or more times per week substantially reduces the risk of coronary heart disease, stroke, colon cancer, diabetes, high blood pressure, and obesity, and walking is an easy and accessible way to achieve this goal. A theory-based mass media campaign promoted walking and local community-sponsored wellness initiatives through four types of media (billboard, newspaper, radio, and poster advertisements) in St Joseph, Mo, over 5 months during the summer of 2003. The Walk Missouri campaign was conducted in four phases: 1) formative research, 2) program design and pretesting, 3) implementation, and 4) impact assessment. Using a postcampaign-only, cross-sectional design, a telephone survey (N = 297) was conducted in St Joseph to assess campaign impact. Study outcomes were pro-walking beliefs and behaviors. One in three survey respondents reported seeing or hearing campaign messages on one or more types of media. Reported exposure to the campaign was significantly associated with two of four pro-walking belief scales (social and pleasure benefits) and with one of three community-sponsored activities (participation in a community-sponsored walk) controlling for demographic, health status, and environmental factors. Exposure was also significantly associated with one of three general walking behaviors (number of days per week walking) when controlling for age and health status but not when beliefs were introduced into the model, consistent with an a priori theoretical mechanism: the mediating effect of pro-walking beliefs on the exposure-walking association. These results suggest that a media campaign can enhance the success of community-based efforts to promote pro-walking beliefs and behaviors.
DAISY: a new software tool to test global identifiability of biological and physiological systems
Bellu, Giuseppina; Saccomani, Maria Pia; Audoly, Stefania; D’Angiò, Leontina
2009-01-01
A priori global identifiability is a structural property of biological and physiological models. It is considered a prerequisite for well-posed estimation, since it concerns the possibility of recovering uniquely the unknown model parameters from measured input-output data, under ideal conditions (noise-free observations and error-free model structure). Of course, determining whether the parameters can be uniquely recovered from observed data is essential before investing resources, time and effort in performing actual biomedical experiments. Many interesting biological models are nonlinear, but identifiability analysis for nonlinear systems turns out to be a difficult mathematical problem. Different methods have been proposed in the literature to test identifiability of nonlinear models but, to the best of our knowledge, so far no software tools have been proposed for automatically checking identifiability of nonlinear models. In this paper, we describe a software tool implementing a differential algebra algorithm to perform parameter identifiability analysis for (linear and) nonlinear dynamic models described by polynomial or rational equations. Our goal is to provide the biological investigator with a completely automated software tool, requiring minimum prior knowledge of mathematical modelling and no in-depth understanding of the mathematical tools. The DAISY (Differential Algebra for Identifiability of SYstems) software will potentially be useful in biological modelling studies, especially in physiology and clinical medicine, where research experiments are particularly expensive and/or difficult to perform. Practical examples of use of the software tool DAISY are presented. DAISY is available at the web site http://www.dei.unipd.it/~pia/. PMID:17707944
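DAISY itself implements a differential algebra algorithm; the toy sketch below only illustrates what a priori global identifiability means, using an assumed one-compartment model dx/dt = -k*x with output y = x/V and known dose x(0) = D. Since the ideal noise-free observables y(0) = D/V and y'(0) = -k*D/V invert uniquely, both parameters are globally identifiable; the model and parameter values are invented for illustration.

```python
def observables(k, V, D):
    """Ideal noise-free output value and slope at t = 0 for y(t) = (D/V)*exp(-k*t)."""
    y0 = D / V          # y(0)  = D/V
    y1 = -k * D / V     # y'(0) = -k*D/V
    return y0, y1

def recover(y0, y1, D):
    """Invert the observable map; a unique solution means global identifiability."""
    V = D / y0
    k = -y1 / y0
    return k, V

# Round-trip check under ideal (error-free) conditions.
k_true, V_true, D = 0.7, 48.3, 100.0
y0, y1 = observables(k_true, V_true, D)
k_hat, V_hat = recover(y0, y1, D)
print(k_hat, V_hat)  # approx 0.7 and 48.3
```

For nonlinear models no such closed-form inversion is generally available, which is why DAISY's differential algebra machinery is needed to decide uniqueness symbolically.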
Web 2.0 collaboration tool to support student research in hydrology - an opinion
NASA Astrophysics Data System (ADS)
Pathirana, A.; Gersonius, B.; Radhakrishnan, M.
2012-08-01
A growing body of evidence suggests that it is unwise to make the a-priori assumption that university students are ready and eager to embrace modern online technologies employed to enhance the educational experience. We present our opinion on employing Wiki, a popular Web 2.0 technology, in small student groups, based on a case-study of using it customized to work as a personal learning environment (PLE1) (Fiedler and Väljataga, 2011) for supporting thesis research in hydrology. Since inception in 2006, the system presented has proven to facilitate knowledge construction and peer-communication within and across groups of students of different academic years and to stimulate learning. Being an open ended and egalitarian system, it was a minimal burden to maintain, as all students became content authors and shared responsibility. A number of unintended uses of the system were also observed, like using it as a backup medium and mobile storage. We attribute the success and sustainability of the proposed Web 2.0-based approach to the fact that the efforts were not limited to the application of the technology, but comprised the creation of a supporting environment with educational activities organized around it. We propose that Wiki-based PLEs are much more suitable than traditional learning management systems for supporting non-classroom education activities like thesis research in hydrology. 1Here we use the term PLE to refer to the conceptual framework to make the process of knowledge construction a personalized experience - rather than to refer to the technology (in this case Wiki) used to attempt implementing such a system.
A haphazard reading of McHugh and Barlow (2010).
McHugh, R Kathryn; Barlow, David H
2010-12-01
Replies to comments on Do haphazard reviews provide sound directions for dissemination efforts? (see record 2010-24768-012) by Eileen Gambrill and Julia H. Littell on the current authors' article The dissemination and implementation of evidence-based psychological treatments: A review of current efforts (see record 2010-02208-010) by Kathryn R. McHugh and David H. Barlow. In their commentary, Gambrill and Littell (2010, this issue) suggested that we provided misleading guidance on the selection of treatments for dissemination in our recent article (McHugh & Barlow, February-March 2010) on the dissemination and implementation of evidence-based treatments. These authors misread our article as an affirmation of the evidence base of the treatments involved in the dissemination and implementation efforts we described. In fact, we explicitly disclaimed in the third paragraph that "we do not revisit controversies surrounding the identification or appropriateness of [evidence-based psychological treatments] . . . rather, we focus on the status and adequacy of [dissemination and implementation] efforts currently under way" (McHugh & Barlow, 2010, p. 73). Thus, our review was not intended as a guideline for which treatments to disseminate, nor was it a thorough review of the evidence base for the treatments included in the efforts we reviewed. We chose several programs for illustrative purposes as representative efforts from three general domains: national, state, and investigator initiated. PsycINFO Database Record (c) 2010 APA, all rights reserved.
Applying the Policy Ecology Framework to Philadelphia’s Behavioral Health Transformation Efforts
Powell, Byron J.; Beidas, Rinad S.; Rubin, Ronnie M.; Stewart, Rebecca E.; Wolk, Courtney Benjamin; Matlin, Samantha L.; Weaver, Shawna; Hurford, Matthew O.; Evans, Arthur C.; Hadley, Trevor R.; Mandell, David S.
2016-01-01
Raghavan et al. (2008) proposed that effective implementation of evidence-based practices requires implementation strategies deployed at multiple levels of the “policy ecology,” including the organizational, regulatory or purchaser agency, political, and social levels. However, much of implementation research and practice targets providers without accounting for contextual factors that may influence provider behavior. This paper examines Philadelphia’s efforts to work toward an evidence-based and recovery-oriented behavioral health system, and uses the policy ecology framework to illustrate how multifaceted, multilevel implementation strategies can facilitate the widespread implementation of evidence-based practices. Ongoing challenges and implications for research and practice are discussed. PMID:27032411
Ex Priori: Exposure-based Prioritization across Chemical Space
EPA's Exposure Prioritization (Ex Priori) is a simplified, quantitative visual dashboard that makes use of data from various inputs to provide rank-ordered internalized dose metric. This complements other high throughput screening by viewing exposures within all chemical space si...
Pravatà, Emanuele; Zecca, Chiara; Sestieri, Carlo; Caulo, Massimo; Riccitelli, Gianna Carla; Rocca, Maria Assunta; Filippi, Massimo; Cianfoni, Alessandro; Gobbi, Claudio
2016-11-01
To investigate the dynamic temporal changes of brain resting-state functional connectivity (RS-FC) following mental effort in multiple sclerosis (MS) patients with cognitive fatigue (CF). Twenty-two MS patients, 11 with (F) and 11 without CF, and 12 healthy controls were included. Separate RS-FC scans were acquired on a 3T MR scanner immediately before (t0), immediately after (t1) and 30 minutes after (t2) execution of the paced auditory serial addition test (PASAT), a cognitively demanding task. Subjectively perceived CF after PASAT execution was also assessed. RS-FC changes were investigated using a data-driven approach (the Intrinsic Connectivity Contrast-power), complemented by a priori defined region-of-interest analyses. The F-group patients experienced stronger RS-FC at t2 between the left superior frontal gyrus (L-SFG) and occipital, frontal and temporal areas, which increased over time after PASAT execution. In the F-group patients, the L-SFG was hyperconnected at t1 with the left caudate nucleus and hypoconnected at t2 with the left anterior thalamus. These variations were correlated with both subjectively perceived and clinically assessed CF, and, for the left thalamus, with PASAT performance. The development of cortico-cortical and cortico-subcortical hyperconnectivity following mental effort is related to CF symptoms in MS patients. © The Author(s), 2016.
NASA Astrophysics Data System (ADS)
Naguib, Hussein; Bol, Igor I.; Lora, J.; Chowdhry, R.
1994-09-01
This paper presents a case study on the implementation of ABC to calculate the cost per wafer and to drive cost reduction efforts for a new IC product line. The cost reduction activities were conducted through the efforts of 11 cross-functional teams, which included members of the finance, purchasing, technology development, process engineering, equipment engineering, production control, and facility groups. The activities of these cross-functional teams were coordinated by a cost council. It will be shown that these activities resulted in a 57% reduction in the wafer manufacturing cost of the new product line. Factors that contributed to the successful implementation of an ABC management system are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kasemir, Kay; Pearson, Matthew R
For several years, the Control System Studio (CS-Studio) Scan System has successfully automated the operation of beam lines at the Oak Ridge National Laboratory (ORNL) High Flux Isotope Reactor (HFIR) and Spallation Neutron Source (SNS). As it is applied to additional beam lines, we need to support simultaneous adjustments of temperatures or motor positions. While this can be implemented via virtual motors or similar logic inside the Experimental Physics and Industrial Control System (EPICS) Input/Output Controllers (IOCs), doing so requires a priori knowledge of experimenters' requirements. By adding support for the parallel control of multiple process variables (PVs) to the Scan System, we can better support ad hoc automation of experiments that benefit from such simultaneous PV adjustments.
[The semi-structured interview: at the border of public health and anthropology].
Imbert, Geneviève
2010-09-01
The interview is the most widely used data-collection tool in research conducted in the health, human and social sciences. After presenting some general points about the different types of interviews, the focus is placed on the semi-structured interview through its various stages, including data processing and analysis, drawing on a lived experience of research at the border between the fields of public health and anthropology. While this contextualized approach to the semi-structured interview may a priori appear specific, the reader interested in the development of qualitative research in a humanistic perspective and in the implementation of multidisciplinary strategies will ascertain its universal character.
Geometric modeling for computer aided design
NASA Technical Reports Server (NTRS)
Schwing, James L.
1988-01-01
Research focused on two major areas. The first effort addressed the design and implementation of a technique that allows for the visualization of the real time variation of physical properties. The second effort focused on the design and implementation of an on-line help system with components designed for both authors and users of help information.
Professional Development. State Implementation of Common Core State Standards
ERIC Educational Resources Information Center
Anderson, Kimberly; Mira, Mary Elizabeth
2014-01-01
The following profiles address how the state departments of education are helping educators prepare for and implement the Common Core and aligned assessments through professional learning. The major professional development efforts around the Common Core were examined in order to understand the overall efforts of each state. An exhaustive list of…
Do Haphazard Reviews Provide Sound Directions for Dissemination Efforts?
ERIC Educational Resources Information Center
Gambrill, Eileen; Littell, Julia H.
2010-01-01
Comments on The dissemination and implementation of evidence-based psychological treatments: A review of current efforts by Kathryn R. McHugh and David H. Barlow. The lead article in the February-March issue by McHugh and Barlow (2010) emphasized the need for "dissemination and implementation of evidence-based psychological treatments."…
Implementing RTI in a High School: A Case Study
ERIC Educational Resources Information Center
Fisher, Douglas; Frey, Nancy
2013-01-01
This case study chronicles the efforts of a small high school over a 2-year period as it designed and implemented a response to intervention (RTI) program for students at the school. Their efforts were largely successful, with improved achievement, attendance, and grade point averages and a decrease in special education referrals. Major themes…
The status of states' policies to support evidence-based practices in children's mental health.
Cooper, Janice L; Aratani, Yumiko
2009-12-01
This study examined the efforts of states' mental health authorities to promote the use of evidence-based practices through policy. Data were drawn from three components of a national study, including a survey of state children's mental health directors (N=53), which was developed using a three-step process that involved stakeholders. Data from the directors' survey revealed that over 90% of states are implementing strategies to support the use of evidence-based practices. The scope of these efforts varies, with 36% reporting statewide reach. Further, states' strategies for implementing evidence-based practices are often not accompanied by comparable efforts to enhance information systems, even though enhancing such systems can bolster opportunities for successful implementation. Variability in the adoption of evidence-based practices, poor attention to information systems, and inconsistent fiscal policies threaten states' efforts to improve the quality of children's mental health services.
[Population pharmacokinetics applied to optimising cisplatin doses in cancer patients].
Ramón-López, A; Escudero-Ortiz, V; Carbonell, V; Pérez-Ruixo, J J; Valenzuela, B
2012-01-01
To develop and internally validate a population pharmacokinetics model for cisplatin and assess its prediction capacity for personalising doses in cancer patients. Cisplatin plasma concentrations in forty-six cancer patients were used to determine the pharmacokinetic parameters of a two-compartment pharmacokinetic model implemented in NONMEM VI software. Pharmacokinetic parameter identification capacity was assessed using the parametric bootstrap method, and the model was validated using the nonparametric bootstrap method and standardised visual and numerical predictive checks. The final model's prediction capacity was evaluated in terms of accuracy and precision during the first (a priori) and second (a posteriori) chemotherapy cycles. Mean population cisplatin clearance is 1.03 L/h with an interpatient variability of 78.0%. The estimated distribution volume at steady state was 48.3 L, with inter- and intrapatient variabilities of 31.3% and 11.7%, respectively. Internal validation confirmed that the population pharmacokinetics model is appropriate to describe changes over time in cisplatin plasma concentrations, as well as its variability in the study population. The accuracy and precision of a posteriori prediction of cisplatin concentrations improved by 21% and 54% compared to a priori prediction. The population pharmacokinetic model developed adequately described the changes in cisplatin plasma concentrations in cancer patients and can be used to optimise cisplatin dosing regimens accurately and precisely. Copyright © 2011 SEFH. Published by Elsevier Espana. All rights reserved.
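As a rough illustration of the two-compartment structure used above, the sketch below integrates an IV-bolus model with forward Euler. Only the population clearance (1.03 L/h) and the steady-state volume (48.3 L) come from the abstract; the split between central and peripheral volumes, the intercompartmental clearance Q, and the dose are invented for the example.

```python
# Two-compartment IV-bolus pharmacokinetic sketch (illustrative parameters).
CL, V1, V2, Q = 1.03, 20.0, 28.3, 2.0   # clearance L/h, volumes L (V1+V2 = 48.3), intercompartmental L/h
dose = 100.0                             # mg bolus into the central compartment (assumed)

def simulate(hours, dt=0.01):
    """Forward-Euler integration of drug amounts; returns (time, central concentration) pairs."""
    A1, A2 = dose, 0.0                   # drug amounts (mg) in central and peripheral compartments
    conc, t = [], 0.0
    while t < hours:
        c1, c2 = A1 / V1, A2 / V2        # concentrations (mg/L)
        dA1 = (-CL * c1 - Q * c1 + Q * c2) * dt   # elimination + exchange
        dA2 = (Q * c1 - Q * c2) * dt              # exchange only
        conc.append((t, c1))
        A1, A2 = A1 + dA1, A2 + dA2
        t += dt
    return conc

curve = simulate(24.0)
print(curve[0][1])  # initial plasma concentration = dose/V1 = 5.0 mg/L
```

A population model such as the one in the study additionally places interpatient and intrapatient random effects on CL and the volumes, which is what NONMEM estimates from the pooled concentration data.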
Linking customisation of ERP systems to support effort: an empirical study
NASA Astrophysics Data System (ADS)
Koch, Stefan; Mitteregger, Kurt
2016-01-01
The amount of customisation to an enterprise resource planning (ERP) system has always been a major concern in the context of the implementation. This article focuses on the phase of maintenance and presents an empirical study about the relationship between the amount of customising and the resulting support effort. We establish a structural equation modelling model that explains support effort using customisation effort, organisational characteristics and scope of implementation. The findings using data from an ERP provider show that there is a statistically significant effect: with an increasing amount of customisation, the quantity of telephone calls to support increases, as well as the duration of each call.
Understanding middle managers' influence in implementing patient safety culture.
Gutberg, Jennifer; Berta, Whitney
2017-08-22
The past fifteen years have been marked by large-scale change efforts undertaken by healthcare organizations to improve patient safety and patient-centered care. Despite substantial investment of effort and resources, many of these large-scale or "radical change" initiatives, like those in other industries, have enjoyed limited success - with practice and behavioural changes neither fully adopted nor ultimately sustained - which has in large part been ascribed to inadequate implementation efforts. Culture change to "patient safety culture" (PSC) is among these radical change initiatives, where results to date have been mixed at best. This paper responds to calls for research that focus on explicating factors that affect efforts to implement radical change in healthcare contexts, and focuses on PSC as the radical change implementation. Specifically, this paper offers a novel conceptual model based on Organizational Learning Theory to explain the ability of middle managers in healthcare organizations to influence patient safety culture change. We propose that middle managers can capitalize on their unique position between upper and lower levels in the organization and engage in 'ambidextrous' learning that is critical to implementing and sustaining radical change. This organizational learning perspective offers an innovative way of framing the mid-level managers' role, through both explorative and exploitative activities, which further considers the necessary organizational context in which they operate.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2009-04-01
This report documents implementation strategies to leverage public and private resources for the development of an adequate national security workforce as part of the National Security Preparedness Project (NSPP), being performed under a U.S. Department of Energy (DOE)/National Nuclear Security Administration (NNSA) grant. There are numerous efforts across the United States to develop a properly skilled and trained national security workforce. Some of these efforts are the result of the leveraging of public and private dollars. As budget dollars decrease and the demand for a properly skilled and trained national security workforce increases, it will become even more important to leverage every education and training dollar. This report details some of the efforts that have been implemented to leverage public and private resources, as well as implementation strategies to further leverage public and private resources.
Raveis, Victoria H; Conway, Laurie J; Uchida, Mayuko; Pogorzelska-Maziarz, Monika; Larson, Elaine L; Stone, Patricia W
2014-04-01
Health-care-associated infections (HAIs) remain a major patient safety problem even as policy and programmatic efforts designed to reduce HAIs have increased. Although information on implementing effective infection control (IC) efforts has steadily grown, knowledge gaps remain regarding the organizational elements that improve bedside practice and accommodate variations in clinical care settings. We conducted in-depth, semistructured interviews in 11 hospitals across the United States with a range of hospital personnel involved in IC (n = 116). We examined the collective nature of IC and the organizational elements that can enable disparate groups to work together to prevent HAIs. Our content analysis of participants' narratives yielded a rich description of the organizational process of implementing adherence to IC. Findings document the dynamic, fluid, interactional, and reactive nature of this process. Three themes emerged: implementing adherence efforts institution-wide, promoting an institutional culture to sustain adherence, and contending with opposition to the IC mandate.
Noise Control in Space Shuttle Orbiter
NASA Technical Reports Server (NTRS)
Goodman, Jerry R.
2009-01-01
Acoustic limits in habitable space enclosures are required to ensure crew safety, comfort, and habitability. Noise control is implemented to ensure compliance with the acoustic requirements. The purpose of this paper is to describe problems with establishing acoustic requirements and noise control efforts, and present examples of noise control treatments and design applications used in the Space Shuttle Orbiter. Included is the need to implement the design discipline of acoustics early in the design process, and noise control throughout a program to ensure that limits are met. The use of dedicated personnel to provide expertise and oversight of acoustic requirements and noise control implementation has been shown to be of value in the Space Shuttle Orbiter program. It is concluded that to achieve acceptable and safe noise levels in the crew habitable space, early resolution of acoustic requirements and implementation of effective noise control efforts are needed. Management support of established acoustic requirements and noise control efforts is essential.
Space Vehicle Pose Estimation via Optical Correlation and Nonlinear Estimation
NASA Technical Reports Server (NTRS)
Rakoczy, John M.; Herren, Kenneth A.
2008-01-01
A technique for 6-degree-of-freedom (6DOF) pose estimation of space vehicles is being developed. This technique draws upon recent developments in implementing optical correlation measurements in a nonlinear estimator, which relates the optical correlation measurements to the pose states (orientation and position). For the optical correlator, the use of both conjugate filters and binary, phase-only filters in the design of synthetic discriminant function (SDF) filters is explored. A static neural network is trained a priori and used as the nonlinear estimator. New commercial animation and image rendering software is exploited to design the SDF filters and to generate a large filter set with which to train the neural network. The technique is applied to pose estimation for rendezvous and docking of free-flying spacecraft and to terrestrial surface mobility systems for NASA's Vision for Space Exploration. Quantitative pose estimation performance will be reported. Advantages and disadvantages of the implementation of this technique are discussed.
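The correlation measurement at the core of this estimator can be sketched in a few lines: a conjugate (matched) filter is applied in the frequency domain and the correlation peak located. Random synthetic images stand in here for the paper's SDF filters and optical correlator; this is an illustrative assumption, not the authors' implementation.

```python
import numpy as np

# Sketch of frequency-domain correlation with a conjugate matched filter.
def correlate(scene, reference):
    """Circular cross-correlation of scene against reference via FFTs."""
    S = np.fft.fft2(scene)
    H = np.conj(np.fft.fft2(reference))  # conjugate filter
    return np.real(np.fft.ifft2(S * H))

rng = np.random.default_rng(1)
ref = rng.standard_normal((32, 32))
scene = np.roll(np.roll(ref, 5, axis=0), 3, axis=1)  # reference shifted by (5, 3)

c = correlate(scene, ref)
peak = np.unravel_index(np.argmax(c), c.shape)  # peak location recovers the shift
```

In the full technique, a trained neural network maps such correlation outputs to the six pose states rather than reading off a single shift.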
Peak reduction for commercial buildings using energy storage
NASA Astrophysics Data System (ADS)
Chua, K. H.; Lim, Y. S.; Morris, S.
2017-11-01
Battery-based energy storage has emerged as a cost-effective solution for peak reduction due to declining battery prices. In this study, a battery-based energy storage system is developed and implemented to achieve an optimal peak reduction for commercial customers within the limited energy capacity of the storage. The energy storage system is formed by three bi-directional power converters rated at 5 kVA each and a battery bank with a capacity of 64 kWh. Three control algorithms, namely fixed-threshold, adaptive-threshold, and fuzzy-based control algorithms, have been developed and implemented in the energy storage system in a campus building. The control algorithms are evaluated and compared under different load conditions. The overall experimental results show that the fuzzy-based controller is the most effective algorithm among the three controllers in peak reduction. The fuzzy-based control algorithm is capable of incorporating a priori qualitative knowledge and expertise about the load characteristic of the buildings as well as the useable energy without over-discharging the batteries.
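The fixed-threshold controller, the simplest of the three algorithms, can be sketched as follows. The function name, threshold, time step, and load profile are illustrative assumptions; 15 kW (three 5 kVA converters) and the 64 kWh bank are used only as rough bounds, not the study's operating settings.

```python
# Fixed-threshold peak shaving: discharge the battery to clip any load above
# the threshold, subject to converter power and remaining-energy limits.
def fixed_threshold_dispatch(load_kw, threshold_kw, soc_kwh, max_power_kw,
                             dt_h=0.5):
    """Return (grid_kw, new_soc_kwh) for one dispatch interval."""
    excess = max(0.0, load_kw - threshold_kw)
    discharge = min(excess, max_power_kw, soc_kwh / dt_h)  # kW we can shave
    return load_kw - discharge, soc_kwh - discharge * dt_h

loads = [40.0, 45.0, 80.0, 95.0, 70.0, 50.0]  # half-hourly demand, kW
soc = 32.0                                    # usable energy remaining, kWh
grid = []
for p in loads:
    g, soc = fixed_threshold_dispatch(p, threshold_kw=60.0, soc_kwh=soc,
                                      max_power_kw=15.0)
    grid.append(g)
# The 95 kW peak is only clipped to 80 kW: the 15 kW converter limit binds,
# which is the weakness the adaptive and fuzzy controllers aim to mitigate.
```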
Space Vehicle Pose Estimation via Optical Correlation and Nonlinear Estimation
NASA Technical Reports Server (NTRS)
Rakoczy, John; Herren, Kenneth
2007-01-01
A technique for 6-degree-of-freedom (6DOF) pose estimation of space vehicles is being developed. This technique draws upon recent developments in implementing optical correlation measurements in a nonlinear estimator, which relates the optical correlation measurements to the pose states (orientation and position). For the optical correlator, the use of both conjugate filters and binary, phase-only filters in the design of synthetic discriminant function (SDF) filters is explored. A static neural network is trained a priori and used as the nonlinear estimator. New commercial animation and image rendering software is exploited to design the SDF filters and to generate a large filter set with which to train the neural network. The technique is applied to pose estimation for rendezvous and docking of free-flying spacecraft and to terrestrial surface mobility systems for NASA's Vision for Space Exploration. Quantitative pose estimation performance will be reported. Advantages and disadvantages of the implementation of this technique are discussed.
Design of adaptive control systems by means of self-adjusting transversal filters
NASA Technical Reports Server (NTRS)
Merhav, S. J.
1986-01-01
The design of closed-loop adaptive control systems based on nonparametric identification was addressed. Implementation is by self-adjusting Least Mean Square (LMS) transversal filters. The design concept is Model Reference Adaptive Control (MRAC). Major issues are to preserve the linearity of the error equations of each LMS filter, and to prevent estimation bias that is due to process or measurement noise, thus providing necessary conditions for the convergence and stability of the control system. The controlled element is assumed to be asymptotically stable and minimum phase. Because of the nonparametric Finite Impulse Response (FIR) estimates provided by the LMS filters, a priori information on the plant model is needed only in broad terms. Following a survey of control system configurations and filter design considerations, system implementation is shown here in Single Input Single Output (SISO) format which is readily extendable to multivariable forms. In extensive computer simulation studies the controlled element is represented by a second-order system with widely varying damping, natural frequency, and relative degree.
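The core of each self-adjusting transversal filter is the standard LMS update. A minimal sketch, assuming a noise-free FIR plant and white excitation (not the report's flight-control configuration or design values):

```python
import numpy as np

# LMS identification of an unknown FIR plant with a transversal filter:
# no a priori parametric model of the plant is needed, only a tap count.
rng = np.random.default_rng(0)
plant = np.array([0.5, 0.3, 0.1])   # unknown FIR plant (illustrative)
n_taps, mu = 3, 0.05                # filter length and LMS step size

w = np.zeros(n_taps)                # adaptive tap weights
x_buf = np.zeros(n_taps)            # tapped delay line (transversal filter)
for _ in range(5000):
    x = rng.standard_normal()       # white input preserves unbiased estimates
    x_buf = np.concatenate(([x], x_buf[:-1]))
    d = plant @ x_buf               # desired response (plant output)
    e = d - w @ x_buf               # estimation error, linear in w
    w = w + mu * e * x_buf          # LMS weight update
# With white input and no noise, the weights converge to the plant taps.
```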
Improved chemical identification from sensor arrays using intelligent algorithms
NASA Astrophysics Data System (ADS)
Roppel, Thaddeus A.; Wilson, Denise M.
2001-02-01
Intelligent signal processing algorithms are shown to improve identification rates significantly in chemical sensor arrays. This paper focuses on the use of independently derived sensor status information to modify the processing of sensor array data by using a fast, easily-implemented "best-match" approach to filling in missing sensor data. Most fault conditions of interest (e.g., stuck high, stuck low, sudden jumps, excess noise, etc.) can be detected relatively simply by adjunct data processing, or by on-board circuitry. The objective then is to devise, implement, and test methods for using this information to improve the identification rates in the presence of faulted sensors. In one typical example studied, utilizing separately derived, a priori knowledge about the health of the sensors in the array improved the chemical identification rate by an artificial neural network from below 10 percent correct to over 99 percent correct. While this study focuses experimentally on chemical sensor arrays, the results are readily extensible to other types of sensor platforms.
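The "best-match" fill-in can be sketched as a nearest-neighbor lookup over a library of known response patterns, using only the healthy channels. The library, readings, and function name below are illustrative assumptions, not the paper's sensor data.

```python
import numpy as np

# Library of known response patterns, one row per analyte (illustrative).
library = np.array([
    [0.9, 0.1, 0.4, 0.2],
    [0.2, 0.8, 0.3, 0.7],
    [0.1, 0.2, 0.9, 0.5],
])

def best_match_fill(reading, faulty):
    """Impute faulty channels from the library pattern nearest on healthy ones."""
    healthy = [i for i in range(len(reading)) if i not in faulty]
    dists = np.linalg.norm(library[:, healthy] - reading[healthy], axis=1)
    best = library[int(np.argmin(dists))]
    fixed = reading.copy()
    fixed[list(faulty)] = best[list(faulty)]  # substitute best-match values
    return fixed

obs = np.array([0.85, 0.15, 0.45, 5.0])   # channel 3 stuck high
fixed = best_match_fill(obs, faulty={3})  # channel 3 imputed from row 0
```

The repaired vector can then be passed to the downstream classifier (e.g., the neural network) as if all sensors were healthy.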
1991-12-01
abstract data type is, what an object-oriented design is and how to apply "software engineering" principles to the design of both of them. I owe a great... Program (ASVP), a research and development effort by two aerospace contractors to redesign and implement subsets of two existing flight simulators in...effort addresses how to implement a simulator designed using the SEI OOD Paradigm on a distributed, parallel, multiple instruction, multiple data (MIMD
ERIC Educational Resources Information Center
Ashby, Cornelia M.
2009-01-01
Presented herein is a statement of Cornelia M. Ashby, Director Education, Workforce, and Income Security. The early efforts of the District of Columbia Public Schools' (DCPS) to improve student achievement focused on implementing initiatives to improve student performance, including implementing a new staffing model; restructuring underperforming…
Liljedahl, Sophie I; Helleman, Marjolein; Daukantaité, Daiva; Westrin, Åsa; Westling, Sofie
2017-06-15
Brief Admission is a crisis and risk management strategy in which self-harming and suicidal individuals with three or more diagnostic criteria of borderline personality disorder self-admit to hospital at times of increasing risk when other efforts to stay safe are failing. In the current randomized controlled trial, the intensity of Brief Admission Skåne is standardized to stays of three days, with a maximum frequency of three times a month. Brief Admission is integrated into existing treatment plans in advance of crises to prevent reliance on general psychiatric admissions for risk management, as these may be lengthy, unstructured, and of uncertain therapeutic value. The overall objective of the Brief Admission Skåne randomized controlled trial is to determine if Brief Admission can replace general psychiatric admission for self-harming and suicidal individuals with complex mental illness at times of escalating risk. Other objectives of the study are to evaluate whether Brief Admission increases daily functioning, enhances coping, and reduces psychiatric symptoms, including the frequency and severity of self-harm and suicidal behaviours. A final objective is to determine if Brief Admission is an effective crisis management model for this population. Participants are randomized at an individual level to either Brief Admission Skåne plus Treatment as Usual or Treatment As Usual. Based on a priori power analyses, N = 124 participants will be recruited to the study. Data collection is in progress, and will continue until June 2018. All participant data are single-blinded and will be handled with intention-to-treat analysis. 
Based on the combined clinical experience of our international research group, the Brief Admission Skåne randomized controlled trial upon which the current protocol is based represents the first initiative to standardize, implement and evaluate Brief Admission amongst self-harming and suicidal individuals, including those with borderline traits. Objectively measuring protocol fidelity and developing English-language Brief Admission study protocols and training materials are implementation and dissemination targets developed in order to facilitate adherent international export of Brief Admission Skåne. NCT02985047 . Registered November 25, 2016. Retrospectively registered.
Wallace, Lauren; Kapiriri, Lydia
2017-04-08
To date, research on priority-setting for new vaccines has not adequately explored the influence of the global, national and sub-national levels of decision-making or contextual issues such as political pressure and stakeholder influence and power. Using Kapiriri and Martin's conceptual framework, this paper evaluates priority setting for new vaccines in Uganda at national and sub-national levels, and considers how global priorities can influence country priorities. This study focuses on 2 specific vaccines, the human papilloma virus (HPV) vaccine and the pneumococcal conjugate vaccine (PCV). This was a qualitative study that involved reviewing relevant Ugandan policy documents and media reports, as well as 54 key informant interviews at the global level and national and sub-national levels in Uganda. Kapiriri and Martin's conceptual framework was used to evaluate the prioritization process. Priority setting for PCV and HPV was conducted by the Ministry of Health (MoH), which is considered to be a legitimate institution. While respondents described the priority setting process for PCV process as transparent, participatory, and guided by explicit relevant criteria and evidence, the prioritization of HPV was thought to have been less transparent and less participatory. Respondents reported that neither process was based on an explicit priority setting framework nor did it involve adequate representation from the districts (program implementers) or publicity. The priority setting process for both PCV and HPV was negatively affected by the larger political and economic context, which contributed to weak institutional capacity as well as power imbalances between development assistance partners and the MoH. 
Priority setting in Uganda would be improved by strengthening institutional capacity and leadership and ensuring transparent and participatory processes in which key stakeholders such as program implementers (the districts) and beneficiaries (the public) are involved. Kapiriri and Martin's framework has the potential to guide priority setting evaluation efforts; however, evaluation should be built into the priority setting process a priori such that information on priority setting is gathered throughout the implementation cycle. © 2017 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
NASA Astrophysics Data System (ADS)
Arrigo, J. S.; Famiglietti, J. S.; Murdoch, L. C.; Lakshmi, V.; Hooper, R. P.
2012-12-01
The Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) continues a major effort towards supporting Community Hydrologic Modeling. From 2009 - 2011, the Community Hydrologic Modeling Platform (CHyMP) initiative held three workshops, the ultimate goal of which was to produce recommendations and an implementation plan to establish a community modeling program that enables comprehensive simulation of water anywhere on the North American continent. Such an effort would include connections to and advances in global climate models, biogeochemistry, and efforts of other disciplines that require an understanding of water patterns and processes in the environment. To achieve such a vision will require substantial investment in human and cyber-infrastructure and significant advances in the science of hydrologic modeling and spatial scaling. CHyMP concluded with a final workshop, held March 2011, and produced several recommendations. CUAHSI and the university community continue to advance community modeling and implement these recommendations through several related and follow on efforts. Key results from the final 2011 workshop included agreement among participants that the community is ready to move forward with implementation. It is recognized that initial implementation of this larger effort can begin with simulation capabilities that currently exist, or that can be easily developed. CHyMP identified four key activities in support of community modeling: benchmarking, dataset evaluation and development, platform evaluation, and developing a national water model framework. Key findings included: 1) The community supported the idea of a National Water Model framework; a community effort is needed to explore what the ultimate implementation of a National Water Model is. A true community modeling effort would support the modeling of "water anywhere" and would include all relevant scales and processes. 
2) Implementation of a community modeling program could initially focus on continental scale modeling of water quantity (rather than quality). The goal of this initial model is the comprehensive description of water stores and fluxes in such a way as to permit linkage to GCMs, biogeochemical, ecological, and geomorphic models. This continental scale focus allows systematic evaluation of our current state of knowledge and data, leverages existing efforts done by large scale modelers, contributes to scientific discovery that informs globally and societally relevant questions, and provides an initial framework to evaluate hydrologic information relevant to other disciplines and a structure into which to incorporate other classes of hydrologic models. 3) Dataset development will be a key aspect of any successful national water model implementation. Our current knowledge of the subsurface is limiting our ability to truly integrate soil and groundwater into large scale models, and to answer critical science questions with societal relevance (e.g., groundwater's influence on climate). 4) The CHyMP workshops and efforts to date have achieved collaboration between university scientists, government agencies and the private sector that must be maintained. Follow on efforts in community modeling should aim at leveraging and maintaining this collaboration for maximum scientific and societal benefit.
Large eddy simulations and direct numerical simulations of high speed turbulent reacting flows
NASA Technical Reports Server (NTRS)
Givi, P.; Frankel, S. H.; Adumitroaie, V.; Sabini, G.; Madnia, C. K.
1993-01-01
The primary objective of this research is to extend current capabilities of Large Eddy Simulations (LES) and Direct Numerical Simulations (DNS) for the computational analyses of high speed reacting flows. Our efforts in the first two years of this research have been concentrated on a priori investigations of single-point Probability Density Function (PDF) methods for providing subgrid closures in reacting turbulent flows. In the efforts initiated in the third year, our primary focus has been on performing actual LES by means of PDF methods. The approach is based on assumed PDF methods and we have performed extensive analysis of turbulent reacting flows by means of LES. This includes simulations of both three-dimensional (3D) isotropic compressible flows and two-dimensional reacting planar mixing layers. In addition to these LES analyses, some work is in progress to assess the extent of validity of our assumed PDF methods. This assessment is done by making detailed comparisons with recent laboratory data in predicting the rate of reactant conversion in parallel reacting shear flows. This report provides a summary of our achievements for the first six months of the third year of this program.
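In assumed-PDF closures of the kind evaluated here, the filtered reaction rate is obtained by weighting the instantaneous rate with a presumed subgrid PDF of the scalar, commonly a beta distribution set by the resolved mean and subgrid variance. A generic sketch of that closure (symbols generic, not the report's exact formulation):

```latex
\overline{\dot{\omega}} \;=\; \int_{0}^{1} \dot{\omega}(\phi)\,
    P_{\beta}\!\bigl(\phi;\,\tilde{\phi},\,\widetilde{\phi''^{2}}\bigr)\, d\phi,
\qquad
P_{\beta}(\phi) \;=\; \frac{\phi^{\,a-1}(1-\phi)^{\,b-1}}{B(a,b)},
```

where the shape parameters a and b are fixed by matching the resolved mean and variance, so no additional transport equations for the PDF itself are required.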
Enhanced Medical Rehabilitation: Effectiveness of a clinical training model.
Bland, Marghuretta D; Birkenmeier, Rebecca L; Barco, Peggy; Lenard, Emily; Lang, Catherine E; Lenze, Eric J
2016-10-14
Patient engagement in medical rehabilitation can be greatly influenced by the provider during therapy sessions. We developed Enhanced Medical Rehabilitation (EMR), a set of provider skills grounded in theories of behavior change. EMR utilizes 18 motivational techniques focused on providing frequent feedback to patients on their effort and progress and linking these to patient goals. To examine the effectiveness of a clinical training protocol for clinicians to do EMR, as measured by clinician adherence. A physical therapist, physical therapist assistant, occupational therapist, and certified occupational therapist assistant were trained in EMR. Training consisted of five formal training sessions and individual and group coaching. Adherence to EMR techniques was measured during two phases: Pre-Training and Maintenance, with an a priori target of 90% adherence by clinicians to each EMR technique. With training and coaching, clinician adherence per therapeutic activity significantly improved in 13 out of 18 items (p < 0.05). The target of 90% adherence was not achieved for many items. Our training and coaching program successfully trained clinicians to promote patient engagement during therapeutic service delivery, although not typically to 90% or greater adherence. Ongoing coaching efforts were necessary to increase adherence.
A Comprehensive Model of the Near-Earth Magnetic Field. Phase 3
NASA Technical Reports Server (NTRS)
Sabaka, Terence J.; Olsen, Nils; Langel, Robert A.
2000-01-01
The near-Earth magnetic field is due to sources in Earth's core, ionosphere, magnetosphere, lithosphere, and from coupling currents between ionosphere and magnetosphere and between hemispheres. Traditionally, the main field (low degree internal field) and magnetospheric field have been modeled simultaneously, and fields from other sources modeled separately. Such a scheme, however, can introduce spurious features. A new model, designated CMP3 (Comprehensive Model: Phase 3), has been derived from quiet-time Magsat and POGO satellite measurements and observatory hourly and annual means measurements as part of an effort to coestimate fields from all of these sources. This model represents a significant advancement in the treatment of the aforementioned field sources over previous attempts, and includes an accounting for main field influences on the magnetosphere, main field and solar activity influences on the ionosphere, seasonal influences on the coupling currents, a priori characterization of ionospheric and magnetospheric influence on Earth-induced fields, and an explicit parameterization and estimation of the lithospheric field. The result of this effort is a model whose fits to the data are generally superior to previous models and whose parameter states for the various constituent sources are very reasonable.
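The internal (core plus lithospheric) part of such a model is conventionally represented as a spherical harmonic expansion of a scalar potential. A generic sketch of that standard parameterization (the truncation degree N and normalization conventions vary by model, so take these as assumptions):

```latex
V_{\mathrm{int}}(r,\theta,\phi) \;=\; a \sum_{n=1}^{N}
    \left(\frac{a}{r}\right)^{n+1} \sum_{m=0}^{n}
    \left( g_{n}^{m}\cos m\phi + h_{n}^{m}\sin m\phi \right)
    P_{n}^{m}(\cos\theta),
\qquad \mathbf{B} = -\nabla V_{\mathrm{int}},
```

where a is the reference radius, (g_n^m, h_n^m) are the Gauss coefficients, and P_n^m are Schmidt semi-normalized associated Legendre functions; low degrees capture the main (core) field and high degrees the lithospheric field coestimated here.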
Tiger in the fault tree jungle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rubel, P.
1976-01-01
There is yet little evidence of serious efforts to apply formal reliability analysis methods to evaluate, or even to identify, potential common-mode failures (CMF) of reactor safeguard systems. The prospects for event logic modeling in this regard are examined by the primitive device of reviewing actual CMF experience in terms of what the analyst might have perceived a priori. Further insights into the probability and risk aspects of CMFs are sought through consideration of three key likelihood factors: (1) prior probability of cause ever existing, (2) opportunities for removing cause, and (3) probability that a CMF cause will be activated by conditions associated with a real system challenge. It was concluded that the principal needs for formal logical discipline in the endeavor to decrease CMF-related risks are to discover and to account for strong ''energetic'' dependency couplings that could arise in the major accidents usually classed as ''hypothetical.'' This application would help focus research, design and quality assurance efforts to cope with major CMF causes. But without extraordinary challenges to the reactor safeguard systems, there must continue to be virtually no statistical evidence pertinent to that class of failure dependencies.
Using Current Resources to Implement Wellness Programming for Preschoolers
ERIC Educational Resources Information Center
Cirignano, Sherri M.
2013-01-01
Currently, there is a nationwide effort to include preschool-aged children in wellness efforts for the prevention of obesity. National resources include guidelines, best practices, and tip sheets to assist in the implementation of these interventions. The Let's Move! Child Care Checklist is a resource that can be used to assess the level at…
ERIC Educational Resources Information Center
Peck, Craig
2014-01-01
This policy study critically compares two different efforts to implement an accountability system in the New York City public schools. In 1971, the New York City Board of Education contracted with the Educational Testing Service (ETS), which created a lengthy accountability plan for the district. Fitful maneuvers to execute the ETS plan fizzled…
ERIC Educational Resources Information Center
Yuksel, Yusuf
2013-01-01
Despite the popularity of planned change efforts, the failure rates of implementation are as high as 50 to 70 percent (Lewis & Seibold, 1998). While these efforts are affected by technical issues, the organizations' approach to change, structure, technological capabilities, and organizational culture and communication practices are…
Engineer, Rakesh S; Podolsky, Seth R; Fertel, Baruch S; Grover, Purva; Jimenez, Heather; Simon, Erin L; Smalley, Courtney M
2018-05-15
The American College of Emergency Physicians embarked on the "Choosing Wisely" campaign to avoid computed tomographic (CT) scans in patients with minor head injury who are at low risk based on validated decision rules. We hypothesized that a Pediatric Mild Head Injury Care Path could be developed and implemented to reduce inappropriate CT utilization with support of a clinical decision support tool (CDST) and a structured parent discussion tool. A quality improvement project was initiated for 9 weeks to reduce inappropriate CT utilization through 5 interventions: (1) engagement of leadership, (2) provider education, (3) incorporation of a parent discussion tool to guide discussion during the emergency department (ED) visit between the parent and the provider, (4) CDST embedded in the electronic medical record, and (5) importation of data into the note to drive compliance. Patients were prospectively enrolled when providers at a pediatric and a freestanding ED entered data into the CDST for decision making. Rates of care path utilization and head CT reduction were determined for all patients with minor head injury based on International Classification of Diseases, Ninth Revision codes. Targets for care path utilization and head CT reduction were established a priori. Results were compared with baseline data collected from 2013. The CDST was used in 176 (77.5%) of 227 eligible patients. Twelve patients were excluded based on a priori criteria. Adherence to recommendations occurred in 162 (99%) of 164 patients. Head CT utilization was reduced from 62.7% to 22% (odds ratio, 0.17; 95% confidence interval, 0.12-0.24) where CDST was used by the provider. There were no missed traumatic brain injuries in our study group. A Pediatric Mild Head Injury Care Path can be implemented in a pediatric and freestanding ED, resulting in reduced head CT utilization and high levels of adherence to CDST recommendations.
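The way a CDST embedded in the electronic medical record gates a recommendation can be sketched as a simple rule check. The feature names and logic below are hypothetical placeholders for illustration only, not the validated decision rule or care path used in the study.

```python
# Illustrative-only CDST sketch: recommend CT when any (hypothetical)
# high-risk feature entered by the provider is present.
def cdst_recommend_ct(features):
    """Return True if any hypothetical high-risk feature is present."""
    high_risk = {"altered_mental_status", "signs_of_skull_fracture",
                 "worsening_symptoms"}
    return bool(high_risk & set(features))

low_risk_child = cdst_recommend_ct({"brief_loss_of_consciousness"})
high_risk_child = cdst_recommend_ct({"altered_mental_status"})
```

In the study's workflow, the tool's output would accompany the parent discussion tool and be imported into the clinical note to drive compliance.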
Return of results in translational iPS cell research: considerations for donor informed consent
2013-01-01
Efforts have emerged internationally to recruit donors with specific disease indications and to derive induced pluripotent cell lines. These disease-specific induced pluripotent stem cell lines have the potential to accelerate translational goals such as drug discovery and testing. One consideration for donor recruitment and informed consent is the possibility that research will result in findings that are clinically relevant to the cell donor. Management protocols for such findings should be developed a priori and disclosed during the informed consent process. The California Institute for Regenerative Medicine has developed recommendations for informing donors in sponsored research. These recommendations include obtaining consent to recontact tissue donors for a range of scientific, medical and ethical considerations. This article reviews the basis for these recommendations and suggests conditions that may be appropriate when reporting findings to donors. PMID:23336317
NASA Technical Reports Server (NTRS)
Peuquet, Donna J.
1991-01-01
The most important issue facing science is understanding global change: the causes, the processes involved and their consequences. The key to success in this massive Earth science research effort will depend on efficient identification of, and access to, the wealth of data available across the atmospheric, oceanographic, and land sciences. Current mechanisms used by earth scientists for accessing these data fall far short of meeting this need. As a result, scientists must frequently rely on a priori knowledge and informal person-to-person networks to find relevant data. The Master Directory/Catalog Interoperability Program (MC/CI) undertaken by NASA is an important step in overcoming these problems. The stated goal of the MD project is to enable researchers to efficiently identify, locate, and obtain access to space and Earth science data.
When students can choose easy, medium, or hard homework problems
NASA Astrophysics Data System (ADS)
Teodorescu, Raluca E.; Seaton, Daniel T.; Cardamone, Caroline N.; Rayyan, Saif; Abbott, Jonathan E.; Barrantes, Analia; Pawl, Andrew; Pritchard, David E.
2012-02-01
We investigate student-chosen, multi-level homework in our Integrated Learning Environment for Mechanics [1] built using the LON-CAPA [2] open-source learning system. Multi-level refers to problems categorized as easy, medium, and hard. Problem levels were determined a priori based on the knowledge needed to solve them [3]. We analyze these problems using three measures: time-per-problem, LON-CAPA difficulty, and item difficulty measured by item response theory. Our analysis of student behavior in this environment suggests that time-per-problem is strongly dependent on problem category, unlike either score-based measure. We also found trends in student choice of problems, overall effort, and efficiency across the student population. Allowing students choice in problem solving seems to improve their motivation; 70% of students worked additional problems for which no credit was given.
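Two of the measures above, fraction correct (classical item difficulty) and time-per-problem, can be aggregated per a priori category in a few lines. The records and function name below are illustrative assumptions, not the course data.

```python
# Per-category item statistics: fraction correct and mean time-per-problem.
records = [  # (category, correct?, seconds spent) -- illustrative data
    ("easy",   1, 120), ("easy",   1, 150), ("easy",   0, 200),
    ("medium", 1, 300), ("medium", 0, 380),
    ("hard",   0, 700), ("hard",   1, 650),
]

def by_category(rows):
    """Aggregate fraction-correct and mean time for each problem category."""
    totals = {}
    for cat, ok, sec in rows:
        n, c, t = totals.get(cat, (0, 0, 0))
        totals[cat] = (n + 1, c + ok, t + sec)
    return {cat: {"p_correct": c / n, "mean_time": t / n}
            for cat, (n, c, t) in totals.items()}

stats = by_category(records)
# Time rises steeply with category even when fraction correct does not,
# mirroring the paper's observation about score-based measures.
```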
Helfrich, Christian D; Kohn, Marlana J; Stapleton, Austin; Allen, Claire L; Hammerback, Kristen Elizabeth; Chan, K C Gary; Parrish, Amanda T; Ryan, Daron E; Weiner, Bryan J; Harris, Jeffrey R; Hannon, Peggy A
2018-01-01
Organizational readiness to change may be a key determinant of implementation success and a mediator of the effectiveness of implementation interventions. If organizational readiness can be reliably and validly assessed at the outset of a change initiative, it could be used to assess the effectiveness of implementation-support activities by measuring changes in readiness factors over time. We analyzed two waves of readiness-to-change survey data collected as part of a three-arm, randomized controlled trial to implement evidence-based health promotion practices in small worksites in low-wage industries. We measured five readiness factors: context (favorable broader conditions); change valence (valuing health promotion); information assessment (demands and resources to implement health promotion); change commitment (an intention to implement health promotion); and change efficacy (a belief in shared ability to implement health promotion). We expected commitment and efficacy to increase at intervention sites along with their self-reported effort to implement health promotion practices, termed wellness-program effort. We compared means between baseline and 15 months, and between intervention and control sites. We used linear regression to test whether intervention and control sites differed in their change-readiness scores over time. Only context and change commitment met reliability thresholds. Change commitment declined significantly for both control (-0.39) and interventions sites (-0.29) from baseline to 15 months, while context did not change for either. Only wellness program effort at 15 months, but not at baseline, differed significantly between control and intervention sites (1.20 controls, 2.02 intervention). 
Regression analyses resulted in two significant differences between intervention and control sites in changes from baseline to 15 months: (1) intervention sites exhibited significantly smaller change in context scores relative to control sites over time and (2) intervention sites exhibited significantly greater increases in wellness-program effort relative to control sites. Contrary to our hypothesis, change commitment declined significantly at both HealthLinks and control sites, even as wellness-program effort increased significantly at HealthLinks sites. Regression to the mean may explain the decline in change commitment. Future research needs to assess whether baseline commitment is an independent predictor of wellness-program effort or an effect modifier of the HealthLinks intervention.
Nutrient intake and food habits of soccer players: analyzing the correlates of eating practice.
García-Rovés, Pablo M; García-Zapico, Pedro; Patterson, Angeles M; Iglesias-Gutiérrez, Eduardo
2014-07-18
Despite the impact and popularity of soccer, and the growing field of soccer-related scientific research, little attention has been devoted to the nutritional intake and eating habits of soccer players. Moreover, the few studies that have addressed this issue suggest that the nutritional intake of soccer players is inadequate, underscoring the need for better adherence to nutritional recommendations and the development and implementation of nutrition education programs. The objective of these programs would be to promote healthy eating habits for male and female soccer players of all ages to optimize performance and provide health benefits that last beyond the end of a player's career. To date, no well-designed nutrition education program has been implemented for soccer players. The design and implementation of such an intervention requires a priori knowledge of nutritional intake and other correlates of food selection, such as food preferences and the influence of field position on nutrient intake, as well as detailed analysis of nutritional intake on match days, on which little data is available. Our aim is to provide an up-to-date overview of the nutritional intake, eating habits, and correlates of eating practice of soccer players.
A Priori Knowledge and Heuristic Reasoning in Architectural Design.
ERIC Educational Resources Information Center
Rowe, Peter G.
1982-01-01
It is proposed that the various classes of a priori knowledge incorporated in heuristic reasoning processes exert a strong influence over architectural design activity. Some design problems require exercise of some provisional set of rules, inference, or plausible strategy which requires heuristic reasoning. A case study illustrates this concept.…
Implementation of the Every Student Succeeds Act: Update and Next Steps
ERIC Educational Resources Information Center
Strobach, Kelly Vaillancourt
2018-01-01
Efforts to implement the Every Student Succeeds Act (ESSA) are well underway, as states submitted their state plans to the Department of Education in September 2017. The entire education policy and advocacy community has been combing through the state plans to get an idea of how states will tackle efforts to improve student outcomes under the new…
ERIC Educational Resources Information Center
Lewis, Jodi; Nodine, Thad; Venezia, Andrea
2016-01-01
This brief shares the perspectives and concerns of high school teachers in two districts regarding implementing the Common Core State Standards, specifically as the Common Core pertains to preparing more students for college and well-paying careers. The brief also makes state policy recommendations for ways to support teachers in their efforts to…
Abrines-Jaume, Neus; Midgley, Nick; Hopkins, Katy; Hoffman, Jasmine; Martin, Kate; Law, Duncan; Wolpert, Miranda
2016-01-01
To explore the implementation of shared decision making (SDM) in Child and Adolescent Mental Health Services (CAMHS), and identify clinician-determined facilitators to SDM. Professionals from four UK CAMHS tried a range of tools to support SDM. They reflected on their experiences using plan-do-study-act log books. A total of 23 professionals completed 307 logs, which were transcribed and analysed using Framework Analysis in Atlas.Ti. Three states of implementation (apprehension, feeling clunky, and integration) and three aspects of clinician behavior or approach (effort, trust, and flexibility) were identified. Implementation of SDM in CAMHS requires key positive clinician behaviors, including preparedness to put in effort, trust in young people, and use of the approach flexibly. Implementation of SDM in CAMHS is effortful, and while tools may help support SDM, clinicians need to be allowed to use the tools flexibly to allow them to move from a state of apprehension through a sense of feeling "clunky" to integration in practice. © The Author(s) 2014.
Raiche, Gilles; Blais, Jean-Guy
2009-01-01
In a computerized adaptive test, we would like to obtain an acceptable precision of the proficiency level estimate using an optimal number of items. Unfortunately, decreasing the number of items is accompanied by a certain degree of bias when the true proficiency level differs significantly from the a priori estimate. The authors suggest that it is possible to reduce the bias, and even the standard error of the estimate, by applying one or a combination of the following strategies to each provisional estimate: adaptive correction for bias proposed by Bock and Mislevy (1982), adaptive a priori estimate, and adaptive integration interval.
Attitude determination and parameter estimation using vector observations - Theory
NASA Technical Reports Server (NTRS)
Markley, F. Landis
1989-01-01
Procedures for attitude determination based on Wahba's loss function are generalized to include the estimation of parameters other than the attitude, such as sensor biases. Optimization with respect to the attitude is carried out using the q-method, which does not require an a priori estimate of the attitude. Optimization with respect to the other parameters employs an iterative approach, which does require an a priori estimate of these parameters. Conventional state estimation methods require a priori estimates of both the parameters and the attitude, while the algorithm presented in this paper always computes the exact optimal attitude for given values of the parameters. Expressions for the covariance of the attitude and parameter estimates are derived.
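The q-method mentioned above maximizes Wahba's gain function by taking the dominant eigenvector of Davenport's 4x4 K matrix, so no a priori attitude is needed. A minimal numerical sketch (scalar-last quaternion convention; an illustration of the q-method step only, not the paper's full parameter-estimation algorithm):

```python
import numpy as np

def q_method(body_vecs, ref_vecs, weights):
    """Davenport's q-method: the optimal quaternion minimizing Wahba's loss
    is the eigenvector of K for its largest eigenvalue.
    Convention: q = [qx, qy, qz, qw] (scalar last), b = A(q) r."""
    B = sum(a * np.outer(b, r) for a, b, r in zip(weights, body_vecs, ref_vecs))
    S = B + B.T
    sigma = np.trace(B)
    z = sum(a * np.cross(b, r) for a, b, r in zip(weights, body_vecs, ref_vecs))
    K = np.zeros((4, 4))
    K[:3, :3] = S - sigma * np.eye(3)
    K[:3, 3] = z
    K[3, :3] = z
    K[3, 3] = sigma
    w, V = np.linalg.eigh(K)   # symmetric K: eigenvalues in ascending order
    return V[:, -1]            # eigenvector of the largest eigenvalue

def quat_to_dcm(q):
    """Direction-cosine (attitude) matrix for a scalar-last unit quaternion."""
    x, y, z, w = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y + z*w),     2*(x*z - y*w)],
        [2*(x*y - z*w),     1 - 2*(x*x + z*z), 2*(y*z + x*w)],
        [2*(x*z + y*w),     2*(y*z - x*w),     1 - 2*(x*x + y*y)]])
```

For noise-free measurements the recovered direction-cosine matrix reproduces the true attitude exactly (up to the harmless q vs. -q sign ambiguity); the paper's contribution layers iterative estimation of non-attitude parameters on top of this step.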
Baumann, Ana A.; Domenech Rodríguez, Melanie M.; Amador, Nancy G.; Forgatch, Marion S.; Parra-Cardona, J. Rubén
2015-01-01
This article describes the process of cultural adaptation at the start of the implementation of the Parent Management Training intervention-Oregon model (PMTO) in Mexico City. The implementation process was guided by the model, and the cultural adaptation of PMTO was theoretically guided by the cultural adaptation process (CAP) model. During the process of the adaptation, we uncovered the potential for the CAP to be embedded in the implementation process, taking into account broader training and economic challenges and opportunities. We discuss how cultural adaptation and implementation processes are inextricably linked and iterative and how maintaining a collaborative relationship with the treatment developer has guided our work and has helped expand our research efforts, and how building human capital to implement PMTO in Mexico supported the implementation efforts of PMTO in other places in the United States. PMID:26052184
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brost, Randolph C.; McLendon, William Clarence
2013-01-01
Modeling geospatial information with semantic graphs enables search for sites of interest based on relationships between features, without requiring strong a priori models of feature shape or other intrinsic properties. Geospatial semantic graphs can be constructed from raw sensor data with suitable preprocessing to obtain a discretized representation. This report describes initial work toward extending geospatial semantic graphs to include temporal information, and initial results applying semantic graph techniques to SAR image data. We describe an efficient graph structure that includes geospatial and temporal information, which is designed to support simultaneous spatial and temporal search queries. We also report a preliminary implementation of feature recognition, semantic graph modeling, and graph search based on input SAR data. The report concludes with lessons learned and suggestions for future improvements.
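As a toy illustration of the kind of spatio-temporal query such a graph supports, the sketch below stores labeled edges with validity intervals in plain dictionaries; the schema, relation labels, and feature names are invented for illustration and are not the report's actual data structure:

```python
# Minimal geospatial-temporal semantic graph: nodes are features, edges
# carry a relation label plus a validity interval, and a search returns
# node pairs matching a relation within a time window.

def add_edge(graph, src, dst, relation, t_start, t_end):
    graph.setdefault(src, []).append((dst, relation, t_start, t_end))

def query(graph, relation, t0, t1):
    """All (src, dst) pairs with `relation` whose interval overlaps [t0, t1]."""
    hits = []
    for src, edges in graph.items():
        for dst, rel, ts, te in edges:
            if rel == relation and ts <= t1 and te >= t0:
                hits.append((src, dst))
    return hits

g = {}
add_edge(g, "building_7", "road_2", "adjacent_to", 0, 100)
add_edge(g, "vehicle_1", "building_7", "near", 40, 50)
```

A query such as `query(g, "near", 45, 60)` finds the vehicle-building relationship because its validity interval overlaps the search window; shifting the window past t = 50 drops it.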
Tresoldi, Claudia; Bianchi, Elena; Pellegata, Alessandro Filippo; Dubini, Gabriele; Mantero, Sara
2017-08-01
The in vitro replication of physiological mechanical conditioning through bioreactors plays a crucial role in the development of functional Small-Caliber Tissue-Engineered Blood Vessels. An in silico scaffold-specific model under pulsatile perfusion provided by a bioreactor was implemented using a fluid-structure interaction (FSI) approach for viscoelastic tubular scaffolds (e.g. decellularized swine arteries, DSA). Results of working pressures, circumferential deformations, and wall shear stress on DSA fell within the desired physiological range and indicated the ability of this model to correctly predict the mechanical conditioning acting on the cells-scaffold system. Consequently, the FSI model allowed us to define a priori the stimulation pattern driving in vitro physiological maturation of scaffolds, especially those with viscoelastic properties.
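The conditioning predictions above come from a full FSI simulation; a much cruder back-of-the-envelope check on wall shear stress uses the steady Poiseuille relation. A sketch with assumed, illustrative values (viscosity, flow rate, and radius are placeholders, not the study's parameters):

```python
import math

def poiseuille_wss(mu, flow_rate, radius):
    """Steady Poiseuille wall shear stress: tau = 4*mu*Q/(pi*R^3), SI units."""
    return 4.0 * mu * flow_rate / (math.pi * radius**3)

# Illustrative small-caliber vessel (values assumed):
mu = 3.5e-3        # blood viscosity, Pa*s
q = 100e-6 / 60    # 100 mL/min mean flow -> m^3/s
r = 2.0e-3         # 2 mm lumen radius
tau = poiseuille_wss(mu, q, r)   # ~1 Pa, in the commonly cited arterial range
```

Such a steady estimate only ballparks the mean; capturing the pulsatile waveform and the scaffold's viscoelastic response is precisely what motivates the FSI model.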
Complex amplitude reconstruction by iterative amplitude-phase retrieval algorithm with reference
NASA Astrophysics Data System (ADS)
Shen, Cheng; Guo, Cheng; Tan, Jiubin; Liu, Shutian; Liu, Zhengjun
2018-06-01
Multi-image iterative phase retrieval methods have been successfully applied in many research fields due to their simple but efficient implementation. However, there is a mismatch between the measurement of the first, long imaging distance and the sequential interval. In this paper, an amplitude-phase retrieval algorithm with reference is put forward that requires no additional measurements or a priori knowledge, eliminating the need to measure the first imaging distance. With a designed update formula, it significantly raises the convergence speed and the reconstruction fidelity, especially in phase retrieval. Its superiority over the original amplitude-phase retrieval (APR) method is validated by numerical analysis and experiments. Furthermore, it provides a conceptual design for a compact holographic image sensor that can achieve numerical refocusing easily.
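The APR family alternates between enforcing known amplitude constraints in different planes. The classic two-plane Gerchberg-Saxton error-reduction loop below illustrates that projection structure; it is a simpler relative of the paper's multi-plane APR with reference, not the proposed algorithm itself:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 32
amp_obj = np.ones((n, n))                       # known object-plane amplitude
field = amp_obj * np.exp(1j * rng.uniform(0, 2*np.pi, (n, n)))
amp_fourier = np.abs(np.fft.fft2(field))        # "measured" Fourier magnitude

def fourier_error(g):
    """Normalized mismatch between |FFT(g)| and the measured magnitude."""
    return (np.linalg.norm(np.abs(np.fft.fft2(g)) - amp_fourier)
            / np.linalg.norm(amp_fourier))

# Alternating projections: enforce the measured Fourier magnitude, then
# the known object-plane amplitude (Gerchberg-Saxton error reduction).
g = amp_obj * np.exp(1j * rng.uniform(0, 2*np.pi, (n, n)))
err_start = fourier_error(g)
for _ in range(200):
    G = np.fft.fft2(g)
    g = np.fft.ifft2(amp_fourier * np.exp(1j * np.angle(G)))
    g = amp_obj * np.exp(1j * np.angle(g))
err_end = fourier_error(g)
```

Error reduction is provably non-increasing for this scheme; the paper's reference wave and update formula are aimed at accelerating exactly this kind of convergence and avoiding stagnation.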
Real-Time Parameter Estimation in the Frequency Domain
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.
2000-01-01
A method for real-time estimation of parameters in a linear dynamic state-space model was developed and studied. The application is aircraft dynamic model parameter estimation from measured data in flight. Equation error in the frequency domain was used with a recursive Fourier transform for the real-time data analysis. Linear and nonlinear simulation examples and flight test data from the F-18 High Alpha Research Vehicle were used to demonstrate that the technique produces accurate model parameter estimates with appropriate error bounds. Parameter estimates converged in less than one cycle of the dominant dynamic mode, using no a priori information, with control surface inputs measured in flight during ordinary piloted maneuvers. The real-time parameter estimation method has low computational requirements and could be implemented
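The core of the frequency-domain equation-error approach can be sketched for a scalar system: a recursive Fourier-transform update accumulates transforms of the state, input, and state derivative sample by sample, and a linear least-squares fit over the analysis frequencies recovers the model parameters. The dynamics, input, and frequency grid below are illustrative choices, not the flight-test setup (the state derivative is taken from the model so the fit is exact by construction):

```python
import numpy as np

# True scalar dynamics: xdot = a*x + b*u (illustrative values)
a_true, b_true = -2.0, 3.0
dt, n_steps = 0.01, 2000
freqs = 2*np.pi*np.array([0.1, 0.2, 0.4, 0.7, 1.0])  # analysis grid, rad/s

X = np.zeros(len(freqs), complex)    # running Fourier transform of x
U = np.zeros(len(freqs), complex)    # ... of u
Xd = np.zeros(len(freqs), complex)   # ... of xdot

x = 0.0
for n in range(n_steps):
    t = n * dt
    u = np.sin(0.5*t) + np.sin(1.3*t)
    xdot = a_true*x + b_true*u          # in flight: measured or derived
    phasor = np.exp(-1j*freqs*t) * dt   # recursive Fourier-transform update
    X += x*phasor; U += u*phasor; Xd += xdot*phasor
    x += xdot * dt                      # advance state (Euler)

# Equation error in the frequency domain: Xd(w) = a*X(w) + b*U(w)
A = np.column_stack([X, U])
A_ri = np.vstack([A.real, A.imag])      # stack real/imag parts for real LS
y_ri = np.concatenate([Xd.real, Xd.imag])
(a_hat, b_hat), *_ = np.linalg.lstsq(A_ri, y_ri, rcond=None)
```

Because each Fourier accumulator needs only one complex multiply-add per sample per frequency, the computational cost stays low enough for onboard real-time use, which is the point of the method.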
High Bar Swing Performance in Novice Adults: Effects of Practice and Talent
ERIC Educational Resources Information Center
Busquets, Albert; Marina, Michel; Irurtia, Alfredo; Ranz, Daniel; Angulo-Barroso, Rosa M.
2011-01-01
An individual's a priori talent can affect movement performance during learning. Also, task requirements and motor-perceptual factors are critical to the learning process. This study describes changes in high bar swing performance after a 2-month practice period. Twenty-five novice participants were divided by a priori talent level…
Five Methods for Estimating Angoff Cut Scores with IRT
ERIC Educational Resources Information Center
Wyse, Adam E.
2017-01-01
This article illustrates five different methods for estimating Angoff cut scores using item response theory (IRT) models. These include maximum likelihood (ML), expected a priori (EAP), modal a priori (MAP), and weighted maximum likelihood (WML) estimators, as well as the most commonly used approach based on translating ratings through the test…
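An EAP estimate integrates the item-response likelihood against a normal prior over a quadrature grid. The sketch below treats Angoff-style ratings as fractional 2PL pseudo-responses; this is a generic illustration of EAP scoring, not necessarily the article's exact formulation (item parameters and ratings are invented):

```python
import numpy as np

def p2pl(theta, a, b):
    """2PL item response function."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def eap_theta(ratings, a, b, n_quad=81):
    """EAP estimate of theta from item ratings in [0, 1], treated as
    (possibly fractional) 2PL pseudo-responses under a N(0,1) prior."""
    theta = np.linspace(-4, 4, n_quad)
    prior = np.exp(-0.5 * theta**2)
    p = p2pl(theta[:, None], a[None, :], b[None, :])   # quad points x items
    loglik = (ratings*np.log(p) + (1 - ratings)*np.log(1 - p)).sum(axis=1)
    post = prior * np.exp(loglik - loglik.max())       # unnormalized posterior
    return (theta * post).sum() / post.sum()           # posterior mean
```

Note the characteristic EAP behavior the article's comparisons turn on: with few items, estimates are pulled toward the prior mean, so a cut score derived from ratings consistent with theta = 0.8 comes out noticeably below 0.8.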
On-orbit calibration for star sensors without priori information.
Zhang, Hao; Niu, Yanxiong; Lu, Jiazhen; Zhang, Chengfen; Yang, Yanqiang
2017-07-24
The star sensor is a prerequisite navigation device for a spacecraft, and on-orbit calibration is essential to guarantee its operating performance. However, traditional calibration methods rely on ground information and are invalid without a priori information. The uncertain on-orbit parameters will eventually influence the performance of the guidance, navigation, and control system. In this paper, a novel calibration method for on-orbit star sensors that requires no a priori information is proposed. First, a simplified back-propagation neural network is designed for focal length and main point estimation along with system property evaluation, called coarse calibration. Then the unscented Kalman filter is adopted for precise calibration of all parameters, including focal length, main point, and distortion. The proposed method benefits from self-initialization; no attitude information or preinstalled sensor parameters are required. Precise star sensor parameter estimation can be achieved without a priori information, which is a significant improvement for on-orbit devices. Simulation and experiment results demonstrate that the calibration is easy to operate, with high accuracy and robustness.
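The paper's coarse step uses a simplified back-propagation network; as a more elementary illustration of why focal length and main point are observable from star measurements alone, a linear least-squares fit of the distortion-free pinhole model can be sketched (function and variable names are illustrative assumptions):

```python
import numpy as np

def fit_pinhole(dirs, pix):
    """Least-squares fit of focal length f and main point (u0, v0) from
    star direction vectors in the sensor frame and matched centroid pixels,
    using the pinhole model  u = u0 + f*x/z,  v = v0 + f*y/z."""
    xz = dirs[:, 0] / dirs[:, 2]
    yz = dirs[:, 1] / dirs[:, 2]
    n = len(dirs)
    A = np.zeros((2*n, 3))              # columns: f, u0, v0
    A[:n, 0] = xz; A[:n, 1] = 1.0
    A[n:, 0] = yz; A[n:, 2] = 1.0
    rhs = np.concatenate([pix[:, 0], pix[:, 1]])
    (f, u0, v0), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return f, u0, v0
```

The linear fit above ignores distortion and noise; handling those, along with recursive refinement, is what the paper's unscented Kalman filter stage addresses.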
Effects of daily, high spatial resolution a priori profiles of satellite-derived NOx emissions
NASA Astrophysics Data System (ADS)
Laughner, J.; Zare, A.; Cohen, R. C.
2016-12-01
The current generation of space-borne NO2 column observations provides a powerful method of constraining NOx emissions due to the spatial resolution and global coverage afforded by the Ozone Monitoring Instrument (OMI). The greater resolution available in next generation instruments such as TROPOMI and the capabilities of geosynchronous platforms TEMPO, Sentinel-4, and GEMS will provide even greater capabilities in this regard, but we must apply lessons learned from the current generation of retrieval algorithms to make the best use of these instruments. Here, we focus on the effect of the resolution of the a priori NO2 profiles used in the retrieval algorithms. We show that for an OMI retrieval, using daily high-resolution a priori profiles results in changes in the retrieved VCDs up to 40% when compared to a retrieval using monthly average profiles at the same resolution. Further, comparing a retrieval with daily high spatial resolution a priori profiles to a more standard one, we show that the derived emissions increase by 100% when using the optimized retrieval.
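The sensitivity to the a priori profile enters through the air-mass factor (AMF) that converts slant to vertical columns: the AMF is a scattering-weighted average over the profile shape, so a surface-peaked profile (resolved by daily high-resolution a priori data) yields a smaller AMF, and hence a larger retrieved VCD, than a smoothed profile. A schematic sketch with invented numbers:

```python
import numpy as np

def amf(scattering_weights, apriori_profile):
    """Air-mass factor: scattering-weight average over the a priori
    NO2 partial-column profile (units cancel in the ratio)."""
    w = np.asarray(scattering_weights, float)
    x = np.asarray(apriori_profile, float)
    return (w * x).sum() / x.sum()

# Scattering weights typically decrease toward the surface (illustrative):
w = np.array([0.3, 0.6, 0.9, 1.1])          # surface -> top of troposphere
polluted = np.array([8.0, 3.0, 1.0, 0.5])   # NO2 concentrated near surface
clean    = np.array([1.0, 1.0, 1.0, 1.0])   # well-mixed profile
scd = 5.0                                    # slant column (arbitrary units)
vcd_polluted = scd / amf(w, polluted)
vcd_clean = scd / amf(w, clean)
```

With these numbers the surface-peaked profile roughly halves the AMF relative to the well-mixed one, which is the mechanism behind the 40% VCD changes reported above.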
Chan, Wiley V; Pearson, Thomas A; Bennett, Glen C; Cushman, William C; Gaziano, Thomas A; Gorman, Paul N; Handler, Joel; Krumholz, Harlan M; Kushner, Robert F; MacKenzie, Thomas D; Sacco, Ralph L; Smith, Sidney C; Stevens, Victor J; Wells, Barbara L
2017-02-28
In 2008, the National Heart, Lung, and Blood Institute convened an Implementation Science Work Group to assess evidence-based strategies for effectively implementing clinical practice guidelines. This was part of a larger effort to update existing clinical practice guidelines on cholesterol, blood pressure, and overweight/obesity. Review evidence from the published implementation science literature and identify effective or promising strategies to enhance the adoption and implementation of clinical practice guidelines. This systematic review was conducted on 4 critical questions, each focusing on the adoption and effectiveness of 4 intervention strategies: (1) reminders, (2) educational outreach visits, (3) audit and feedback, and (4) provider incentives. A scoping review of the Rx for Change database of systematic reviews was used to identify promising guideline implementation interventions aimed at providers. Inclusion and exclusion criteria were developed a priori for each question, and the published literature was initially searched up to 2012, and then updated with a supplemental search to 2015. Two independent reviewers screened the returned citations to identify relevant reviews and rated the quality of each included review. Audit and feedback and educational outreach visits were generally effective in improving both process of care (15 of 21 reviews and 12 of 13 reviews, respectively) and clinical outcomes (7 of 12 reviews and 3 of 5 reviews, respectively). Provider incentives showed mixed effectiveness for improving both process of care (3 of 4 reviews) and clinical outcomes (3 reviews equally distributed between generally effective, mixed, and generally ineffective). Reminders showed mixed effectiveness for improving process of care outcomes (27 reviews with 11 mixed and 3 generally ineffective results) and were generally ineffective for clinical outcomes (18 reviews with 6 mixed and 9 generally ineffective results). 
Educational outreach visits (2 of 2 reviews), reminders (3 of 4 reviews), and provider incentives (1 of 1 review) were generally effective for cost reduction. Educational outreach visits (1 of 1 review) and provider incentives (1 of 1 review) were also generally effective for cost-effectiveness outcomes. Barriers to clinician adoption or adherence to guidelines included time constraints (8 reviews/overviews); limited staffing resources (2 overviews); timing (5 reviews/overviews); clinician skepticism (5 reviews/overviews); clinician knowledge of guidelines (4 reviews/overviews); and higher age of the clinician (1 overview). Facilitating factors included guideline characteristics such as format, resources, and end-user involvement (6 reviews/overviews); involving stakeholders (5 reviews/overviews); leadership support (5 reviews/overviews); scope of implementation (5 reviews/overviews); organizational culture such as multidisciplinary teams and low-baseline adherence (9 reviews/overviews); and electronic guidelines systems (3 reviews). The strategies of audit and feedback and educational outreach visits were generally effective in improving both process of care and clinical outcomes. Reminders and provider incentives showed mixed effectiveness, or were generally ineffective. No general conclusion could be reached about cost effectiveness, because of limitations in the evidence. Important gaps exist in the evidence on effectiveness of implementation interventions, especially regarding clinical outcomes, cost effectiveness and contextual issues affecting successful implementation. Copyright © 2017 American College of Cardiology Foundation and American Heart Association, Inc. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Embi, Zarina Che; Hussain, Hanafizan
2005-01-01
In the world of "edutainment" where multimedia is the ultimate content provider, educational electronic games are a new and fun way for young children to learn concepts and processes that have usually been delivered via books within the traditional classroom. In an effort to implement a design framework for developing educational games…
Bengtsson-Palme, Johan; Larsson, D G Joakim
2016-01-01
There are concerns that selection pressure from antibiotics in the environment may accelerate the evolution and dissemination of antibiotic-resistant pathogens. Nevertheless, there is currently no regulatory system that takes such risks into account. In part, this is due to limited knowledge of environmental concentrations that might exert selection for resistant bacteria. To experimentally determine minimal selective concentrations in complex microbial ecosystems for all antibiotics would involve considerable effort. In this work, our aim was to estimate upper boundaries for selective concentrations for all common antibiotics, based on the assumption that selective concentrations a priori need to be lower than those completely inhibiting growth. Data on Minimal Inhibitory Concentrations (MICs) were obtained for 111 antibiotics from the public EUCAST database. The 1% lowest observed MICs were identified, and to compensate for limited species coverage, predicted lowest MICs adjusted for the number of tested species were extrapolated through modeling. Predicted No Effect Concentrations (PNECs) for resistance selection were then assessed using an assessment factor of 10 to account for differences between MICs and minimal selective concentrations. The resulting PNECs ranged from 8 ng/L to 64 μg/L. Furthermore, the link between taxonomic similarity between species and lowest MIC was weak. This work provides estimated upper boundaries for selective concentrations (lowest MICs) and PNECs for resistance selection for all common antibiotics. In most cases, PNECs for selection of resistance were below available PNECs for ecotoxicological effects. The generated PNECs can guide implementation of compound-specific emission limits that take into account risks for resistance promotion. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
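In outline, the pipeline above reduces to taking a low quantile of observed MICs and dividing by an assessment factor. A simplified sketch (the species-number extrapolation step is omitted, and the MIC values below are invented):

```python
import numpy as np

def pnec_resistance(mics_ug_per_ml, assessment_factor=10.0, quantile=0.01):
    """Upper-boundary selective concentration taken as the 1% lowest
    observed MIC, divided by an assessment factor. Sketch of the
    paper's approach; the adjustment for number of tested species
    is deliberately left out here."""
    mics = np.sort(np.asarray(mics_ug_per_ml, float))
    lowest = np.quantile(mics, quantile)   # 1% lowest observed MIC
    return lowest / assessment_factor
```

The assessment factor of 10 encodes the assumption that selection for resistance can begin well below the concentration that fully inhibits growth, so the PNEC always falls below the smallest observed MIC.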
Kareksela, Santtu; Moilanen, Atte; Tuominen, Seppo; Kotiaho, Janne S
2013-12-01
Globally expanding human land use sets constantly increasing pressure for maintenance of biological diversity and functioning ecosystems. To fight the decline of biological diversity, conservation science has broken ground with methods such as the operational model of systematic conservation planning (SCP), which focuses on design and on-the-ground implementation of conservation areas. The most commonly used method in SCP is reserve selection that focuses on the spatial design of reserve networks and their expansion. We expanded these methods by introducing another form of spatial allocation of conservation effort relevant for land-use zoning at the landscape scale that avoids negative ecological effects of human land use outside protected areas. We call our method inverse spatial conservation prioritization. It can be used to identify areas suitable for economic development while simultaneously limiting total ecological and environmental effects of that development at the landscape level by identifying areas with highest economic but lowest ecological value. Our method is not based on a priori targets, and as such it is applicable to cases where the effects of land use on, for example, individual species or ecosystem types are relatively small and would not lead to violation of regional or national conservation targets. We applied our method to land-use allocation to peat mining. Our method identified a combination of profitable production areas that provides the needed area for peat production while retaining most of the landscape-level ecological value of the ecosystem. The results of this inverse spatial conservation prioritization are being used in land-use zoning in the province of Central Finland. © 2013 Society for Conservation Biology.
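The selection logic can be sketched as a greedy trade-off: take candidate areas in order of lowest ecological value per unit of economic value until the required production area is covered. This is an illustrative simplification, not the full prioritization analysis used in the study (the site records below are invented):

```python
def inverse_prioritize(sites, area_needed):
    """Pick production sites covering `area_needed` while giving up as
    little ecological value as possible: greedily take sites with the
    lowest ecological-to-economic value ratio (illustrative sketch)."""
    ranked = sorted(sites, key=lambda s: s["eco"] / s["econ"])
    chosen, area = [], 0.0
    for s in ranked:
        if area >= area_needed:
            break
        chosen.append(s["name"])
        area += s["area"]
    return chosen
```

The real analysis additionally accounts for landscape-level connectivity and complementarity, which is why a dedicated prioritization framework is used rather than independent per-site ranking.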
GOCE User Toolbox and Tutorial
NASA Astrophysics Data System (ADS)
Knudsen, Per; Benveniste, Jerome
2017-04-01
The GOCE User Toolbox GUT is a compilation of tools for the utilisation and analysis of GOCE Level 2 products. GUT supports applications in Geodesy, Oceanography and Solid Earth Physics. The GUT Tutorial provides information and guidance on how to use the toolbox for a variety of applications. GUT consists of a series of advanced computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux Workstations, and Mac. The toolbox is supported by The GUT Algorithm Description and User Guide and The GUT Install Guide. A set of a-priori data and models are made available as well. Without any doubt the development of the GOCE user toolbox has played a major role in paving the way to successful use of the GOCE data for oceanography. GUT version 2.2 was released in April 2014 and, besides some bug fixes, it added the capability to compute the Simple Bouguer Anomaly (Solid Earth). This fall a new GUT version 3 was released. GUTv3 was further developed through a collaborative effort in which the scientific communities participated, aiming at implementing the remaining functionalities and facilitating a wider span of research in the fields of Geodesy, Oceanography and Solid Earth studies. Accordingly, GUT version 3 has: - an attractive and easy-to-use Graphical User Interface (GUI) for the toolbox, - further software functionalities, such as support for gradients, anisotropic diffusive filtering, and computation of Bouguer and isostatic gravity anomalies, - an associated GUT VCM tool for analyzing the GOCE variance-covariance matrices.
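The Simple Bouguer Anomaly mentioned above amounts to subtracting the attraction of an infinite slab of standard crustal density from the free-air anomaly. A sketch using the conventional reduction density and unit handling (illustrative, not necessarily GUT's exact implementation):

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
RHO_CRUST = 2670.0   # conventional reduction density, kg/m^3

def simple_bouguer_anomaly(free_air_anomaly_mgal, elevation_m):
    """Simple Bouguer anomaly: free-air anomaly minus the attraction
    2*pi*G*rho*h of an infinite slab of thickness `elevation_m`
    (1 mGal = 1e-5 m/s^2)."""
    slab_mgal = 2.0 * math.pi * G * RHO_CRUST * elevation_m * 1e5
    return free_air_anomaly_mgal - slab_mgal
```

With the standard 2670 kg/m^3 density the slab term works out to roughly 0.1119 mGal per metre of elevation.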
Case management to reduce risk of cardiovascular disease in a county health care system.
Ma, Jun; Berra, Kathy; Haskell, William L; Klieman, Linda; Hyde, Shauna; Smith, Mark W; Xiao, Lan; Stafford, Randall S
2009-11-23
Case management (CM) is a systematic approach to supplement physician-centered efforts to prevent cardiovascular disease (CVD). Research is limited on its implementation and efficacy in low-income, ethnic minority populations. We conducted a randomized clinical trial to evaluate a nurse- and dietitian-led CM program for reducing major CVD risk factors in low-income, primarily ethnic minority patients in a county health care system, 63.0% of whom had type 2 diabetes mellitus. The primary outcome was the Framingham risk score (FRS). A total of 419 patients at elevated risk of CVD events were randomized and followed up for a mean of 16 months (81.4% retention). The mean FRS was significantly lower for the CM vs usual care group at follow-up (7.80 [95% confidence interval, 7.21-8.38] vs 8.93 [8.36-9.49]; P = .001) after adjusting for baseline FRS. This is equivalent to 5 fewer heart disease events per 1000 individuals per year attributable to the intervention or to 200 individuals receiving the intervention to prevent 1 event per year. The pattern of group differences in the FRS was similar in subgroups defined a priori by sex and ethnicity. The main driver of these differences was lowering the mean (SD) systolic (-4.2 [18.5] vs 2.6 [22.7] mm Hg; P = .003) and diastolic (-6.0 [11.6] vs -3.0 [11.7] mm Hg; P = .02) blood pressures for the CM vs usual care group. Nurse and dietitian CM targeting multifactor risk reduction can lead to modest improvements in CVD risk factors among high-risk patients in low-income, ethnic minority populations receiving care in county health clinics. clinicaltrials.gov Identifier: NCT00128687.
On the Hilbert-Huang Transform Data Processing System Development
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Flatley, Thomas P.; Huang, Norden E.; Cornwell, Evette; Smith, Darell
2003-01-01
One of the main heritage tools used in scientific and engineering data spectrum analysis is the Fourier Integral Transform and its high performance digital equivalent - the Fast Fourier Transform (FFT). The Fourier view of nonlinear mechanics that had existed for a long time, and the associated FFT (a fairly recent development), carry strong a-priori assumptions about the source data, such as linearity and stationarity. Natural phenomena measurements are essentially nonlinear and nonstationary. A very recent development at the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC), known as the Hilbert-Huang Transform (HHT), proposes a novel approach to the solution for the nonlinear class of spectrum analysis problems. Using the Empirical Mode Decomposition (EMD) followed by the Hilbert Transform (HT) of the empirical decomposition data, the HHT allows spectrum analysis of nonlinear and nonstationary data by using an engineering a-posteriori data processing approach based on the EMD algorithm. This results in a non-constrained decomposition of a source real-valued data vector into a finite set of Intrinsic Mode Functions (IMF) that can be further analyzed for spectrum interpretation by the classical Hilbert Transform. This paper describes phase one of the development of a new engineering tool, the HHT Data Processing System (HHTDPS). The HHTDPS allows applying the HHT to a data vector in a fashion similar to the heritage FFT. It is a generic, low cost, high performance personal computer (PC) based system that implements the HHT computational algorithms in a user friendly, file driven environment. This paper also presents a quantitative analysis for a complex waveform data sample, a summary of technology commercialization efforts and the lessons learned from this new technology development.
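The core of the EMD step described above can be sketched in a few lines: repeatedly subtract the mean of the upper and lower extremal envelopes until an intrinsic mode function emerges, then peel it off the signal. The sketch below is a simplified illustration of the sifting idea, assuming a SciPy environment; the stopping criteria and envelope handling are far cruder than in the HHTDPS.

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift(h, t, n_sift=8):
    """One simplified EMD sifting loop: subtract the mean of the extremal envelopes."""
    for _ in range(n_sift):
        maxima = argrelextrema(h, np.greater)[0]
        minima = argrelextrema(h, np.less)[0]
        if len(maxima) < 3 or len(minima) < 3:
            return None  # too few extrema left to build envelopes
        upper = CubicSpline(t[maxima], h[maxima])(t)  # spline extrapolates at edges
        lower = CubicSpline(t[minima], h[minima])(t)
        h = h - 0.5 * (upper + lower)
    return h

def emd(x, t, max_imfs=6):
    """Decompose x into intrinsic mode functions plus a residue."""
    imfs, residue = [], x.copy()
    for _ in range(max_imfs):
        imf = sift(residue.copy(), t)
        if imf is None:
            break
        imfs.append(imf)
        residue = residue - imf
    return imfs, residue

# Two-tone demonstration signal; the IMFs plus the residue reconstruct it exactly.
t = np.linspace(0.0, 2.0, 2000)
x = np.sin(2 * np.pi * 2 * t) + 0.5 * np.sin(2 * np.pi * 11 * t)
imfs, residue = emd(x, t)
```

By construction the decomposition is non-constrained, as the abstract notes: the residue is simply whatever remains after the extracted modes are subtracted, so the sum of IMFs and residue always reproduces the input vector.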
A Field Trial of Alternative Targeted Screening Strategies for Chagas Disease in Arequipa, Peru
Hunter, Gabrielle C.; Borrini-Mayorí, Katty; Ancca Juárez, Jenny; Castillo Neyra, Ricardo; Verastegui, Manuela R.; Malaga Chavez, Fernando S.; Cornejo del Carpio, Juan Geny; Córdova Benzaquen, Eleazar; Náquira, César; Gilman, Robert H.; Bern, Caryn; Levy, Michael Z.
2012-01-01
Background: Chagas disease is endemic in the rural areas of southern Peru and a growing urban problem in the regional capital of Arequipa, population ∼860,000. It is unclear how to implement cost-effective screening programs across a large urban and periurban environment. Methods: We compared four alternative screening strategies in 18 periurban communities, testing individuals in houses with 1) infected vectors; 2) high vector densities; 3) low vector densities; and 4) no vectors. Vector data were obtained from routine Ministry of Health insecticide application campaigns. We performed ring case detection (radius of 15 m) around seropositive individuals, and collected data on costs of implementation for each strategy. Results: Infection was detected in 21 of 923 (2.28%) participants. Cases had lived more time on average in rural places than non-cases (7.20 years versus 3.31 years, respectively). Significant risk factors on univariate logistic regression for infection were age (OR 1.02; p = 0.041), time lived in a rural location (OR 1.04; p = 0.022), and time lived in an infested area (OR 1.04; p = 0.008). No multivariate model with these variables fit the data better than a simple model including only the time lived in an area with triatomine bugs. There was no significant difference in prevalence across the screening strategies; however, a self-assessment of disease risk may have biased participation, inflating prevalence among residents of houses where no infestation was detected. Testing houses with infected vectors was least expensive. Ring case detection yielded four secondary cases in only one community, possibly due to vector-borne transmission in this community, apparently absent in the others. Conclusions: Targeted screening for urban Chagas disease is promising in areas with ongoing vector-borne transmission; however, these pockets of epidemic transmission remain difficult to detect a priori. 
The flexibility to adapt to the epidemiology that emerges during screening is key to an efficient case detection intervention. In heterogeneous urban environments, self-assessments of risk and simple residence history questionnaires may be useful to identify those at highest risk for Chagas disease to guide diagnostic efforts. PMID:22253939
Energy-Efficient Implementation of ECDH Key Exchange for Wireless Sensor Networks
NASA Astrophysics Data System (ADS)
Lederer, Christian; Mader, Roland; Koschuch, Manuel; Großschädl, Johann; Szekely, Alexander; Tillich, Stefan
Wireless Sensor Networks (WSNs) are playing a vital role in an ever-growing number of applications ranging from environmental surveillance over medical monitoring to home automation. Since WSNs are often deployed in unattended or even hostile environments, they can be subject to various malicious attacks, including the manipulation and capture of nodes. The establishment of a shared secret key between two or more individual nodes is one of the most important security services needed to guarantee the proper functioning of a sensor network. Despite some recent advances in this field, the efficient implementation of cryptographic key establishment for WSNs remains a challenge due to the resource constraints of small sensor nodes such as the MICAz mote. In this paper we present a lightweight implementation of the elliptic curve Diffie-Hellman (ECDH) key exchange for ZigBee-compliant sensor nodes equipped with an ATmega128 processor running the TinyOS operating system. Our implementation uses a 192-bit prime field specified by the NIST as underlying algebraic structure and requires only 5.20·10^6 clock cycles to compute a scalar multiplication if the base point is fixed and known a priori. A scalar multiplication using a random base point takes about 12.33·10^6 cycles. Our results show that a full ECDH key exchange between two MICAz motes consumes an energy of 57.33 mJ (including radio communication), which is significantly better than most previously reported ECDH implementations on comparable platforms.
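The arithmetic behind such an exchange can be sketched with textbook affine double-and-add over the NIST P-192 curve mentioned in the abstract. This is an illustrative sketch only: the private keys are hypothetical values, and real sensor-node implementations (like the one in the paper) use heavily optimized, side-channel-hardened field arithmetic rather than Python big integers.

```python
# NIST P-192 (secp192r1) domain parameters.
P = 0xfffffffffffffffffffffffffffffffeffffffffffffffff
A = P - 3
B = 0x64210519e59c80e70fa7e9ab72243049feb8deecc146b9b1
G = (0x188da80eb03090f67cbf20eb43a18800f4ff0afd82ff1012,
     0x07192b95ffc8da78631011ed6b24cdd573f977a11e794811)
N = 0xffffffffffffffffffffffff99def836146bc9b1b4d22831

def point_add(p1, p2):
    # Affine addition on y^2 = x^3 + A*x + B over GF(P); None is the identity.
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    x1, y1 = p1
    x2, y2 = p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None
    if p1 == p2:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P) % P
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def scalar_mult(k, point):
    # Right-to-left double-and-add (not constant-time; illustrative only).
    result, addend = None, point
    while k:
        if k & 1:
            result = point_add(result, addend)
        addend = point_add(addend, addend)
        k >>= 1
    return result

# ECDH: each party multiplies the other's public point by its own secret.
alice_priv = 0x1a2b3c4d5e6f708192a3b4c5d6e7f8091a2b3c4d  # hypothetical
bob_priv = 0x0f1e2d3c4b5a69788796a5b4c3d2e1f00f1e2d3c    # hypothetical
alice_pub = scalar_mult(alice_priv, G)
bob_pub = scalar_mult(bob_priv, G)
shared_a = scalar_mult(alice_priv, bob_pub)
shared_b = scalar_mult(bob_priv, alice_pub)
```

The abstract's distinction between fixed and random base points maps onto this sketch directly: when the base point is known a priori (as G is here), precomputed multiples of it can replace most of the doubling steps, which is why the fixed-point scalar multiplication costs roughly half as many cycles.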
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berry, Ray Alden; Zou, Ling; Zhao, Haihua
This document summarizes the physical models and mathematical formulations used in the RELAP-7 code. The MOOSE-based RELAP-7 code development is an ongoing effort; the MOOSE framework enables rapid development of the RELAP-7 code, and the developmental efforts and results demonstrate that the RELAP-7 project is on a path to success. This theory manual documents the main features implemented in the RELAP-7 code. Because the code is under active development, this RELAP-7 Theory Manual will evolve with periodic updates to keep it current with the state of the development, implementation, and model additions/revisions.
Shah, Hemant; Allard, Raymond D; Enberg, Robert; Krishnan, Ganesh; Williams, Patricia; Nadkarni, Prakash M
2012-03-09
A large body of work in the clinical guidelines field has identified requirements for guideline systems, but there are formidable challenges in translating such requirements into production-quality systems that can be used in routine patient care. Detailed analysis of requirements from an implementation perspective can be useful in helping define sub-requirements to the point where they are implementable. Further, additional requirements emerge as a result of such analysis. During such an analysis, study of examples of existing, software-engineering efforts in non-biomedical fields can provide useful signposts to the implementer of a clinical guideline system. In addition to requirements described by guideline-system authors, comparative reviews of such systems, and publications discussing information needs for guideline systems and clinical decision support systems in general, we have incorporated additional requirements related to production-system robustness and functionality from publications in the business workflow domain, in addition to drawing on our own experience in the development of the Proteus guideline system (http://proteme.org). The sub-requirements are discussed by conveniently grouping them into the categories used by the review of Isern and Moreno 2008. We cite previous work under each category and then provide sub-requirements under each category, and provide example of similar work in software-engineering efforts that have addressed a similar problem in a non-biomedical context. When analyzing requirements from the implementation viewpoint, knowledge of successes and failures in related software-engineering efforts can guide implementers in the choice of effective design and development strategies.
Feasibility of Implementing the 511 National Traveler Information Number in Texas
DOT National Transportation Integrated Search
2001-10-01
In this report, researchers assess the feasibility of implementing a three-digit (511) traveler information number within Texas. Researchers reviewed implementation efforts in other states, contacted key telephone company representatives and associat...
Jensen, Cathrine Elgaard; Riis, Allan; Petersen, Karin Dam; Jensen, Martin Bach; Pedersen, Kjeld Møller
2017-05-01
In connection with the publication of a clinical practice guideline on the management of low back pain (LBP) in general practice in Denmark, a cluster randomised controlled trial was conducted. In this trial, a multifaceted guideline implementation strategy to improve general practitioners' treatment of patients with LBP was compared with a usual implementation strategy. The aim was to determine whether the multifaceted strategy was cost effective compared with the usual implementation strategy. The economic evaluation was conducted as a cost-utility analysis in which costs collected from a societal perspective and quality-adjusted life years were used as outcome measures. The analysis was conducted as a within-trial analysis with a 12-month time horizon consistent with the follow-up period of the clinical trial. To adjust for a priori selected covariates, generalised linear models with a gamma family were used to estimate incremental costs and quality-adjusted life years. Furthermore, both deterministic and probabilistic sensitivity analyses were conducted. Results showed that costs associated with primary health care were higher, whereas secondary health care costs were lower, for the intervention group when compared with the control group. When adjusting for covariates, the intervention was less costly, and there was no significant difference in effect between the 2 groups. Sensitivity analyses showed that results were sensitive to uncertainty. In conclusion, the multifaceted implementation strategy was cost saving when compared with the usual strategy for implementing LBP clinical practice guidelines in general practice. Furthermore, there was no significant difference in effect, and the estimate was sensitive to uncertainty.
Sriram, Ganesh; Shanks, Jacqueline V
2004-04-01
The biosynthetically directed fractional (13)C labeling method for metabolic flux evaluation relies on performing a 2-D [(13)C, (1)H] NMR experiment on extracts from organisms cultured on a uniformly labeled carbon substrate. This article focuses on improvements in the interpretation of data obtained from such an experiment by employing the concept of bondomers. Bondomers take into account the natural abundance of (13)C; therefore many bondomers in a real network are zero and can be precluded a priori, thus resulting in fewer balances. Using this method, we obtained a set of linear equations which can be solved to obtain analytical formulas for NMR-measurable quantities in terms of fluxes in glycolysis and the pentose phosphate pathways. For a specific case of this network with four degrees of freedom, a priori identifiability of the fluxes was shown to be possible for any set of fluxes. For a more general case with five degrees of freedom, the fluxes were shown to be identifiable for a representative set of fluxes. Minimal sets of measurements which best identify the fluxes are listed. Furthermore, we have delineated Boolean function mapping, a new method to iteratively simulate bondomer abundances or efficiently convert carbon skeleton rearrangement information to mapping matrices. The efficiency of this method is expected to be valuable when analyzing metabolic networks which are not completely known (such as in plant metabolism) or when implementing iterative bondomer balancing methods.
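The linear-equation structure described above can be illustrated with a toy system. The matrix and values below are hypothetical stand-ins, not the paper's glycolysis/pentose-phosphate network: the sketch only shows how a priori identifiability reduces to a rank condition on the measurement-to-flux map and how fluxes are then recovered by least squares.

```python
import numpy as np

# Hypothetical toy system: three measurable quantities as linear functions
# of two free fluxes, m = S @ v + c. This mimics the structure of
# bondomer-derived balances, not the paper's actual network.
S = np.array([[1.0, 0.0],
              [0.5, 0.5],
              [0.0, 1.0]])
c = np.array([0.1, 0.1, 0.1])

# A priori identifiability: every free flux is determined if and only if
# S has full column rank.
identifiable = np.linalg.matrix_rank(S) == S.shape[1]

# "Measured" data generated from known fluxes, then recovered by least squares.
v_true = np.array([0.7, 0.3])
m = S @ v_true + c
v_hat, *_ = np.linalg.lstsq(S, m - c, rcond=None)
```

For a consistent, full-column-rank system the least-squares solution recovers the generating fluxes exactly, which is the computational content of the identifiability results the abstract reports for the four- and five-degree-of-freedom cases.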
ERIC Educational Resources Information Center
Tisdell, Christopher C.
2017-01-01
For over 50 years, the learning and teaching of "a priori" bounds on solutions to linear differential equations has involved a Euclidean approach to measuring the size of a solution. While the Euclidean approach to "a priori" bounds on solutions is somewhat manageable in the learning and teaching of the proofs involving…
A-Priori Rupture Models for Northern California Type-A Faults
Wills, Chris J.; Weldon, Ray J.; Field, Edward H.
2008-01-01
This appendix describes how a-priori rupture models were developed for the northern California Type-A faults. As described in the main body of this report, and in Appendix G, 'a-priori' models represent an initial estimate of the rate of single and multi-segment surface ruptures on each fault. Whether or not a given model is moment balanced (i.e., satisfies section slip-rate data) depends on assumptions made regarding the average slip on each segment in each rupture (which in turn depends on the chosen magnitude-area relationship). Therefore, for a given set of assumptions, or branch on the logic tree, the methodology of the present Working Group (WGCEP-2007) is to find a final model that is as close as possible to the a-priori model, in the least-squares sense, but that also satisfies slip-rate and perhaps other data. This is analogous to the WGCEP-2002 approach of effectively voting on the relative rate of each possible rupture and then finding the closest moment-balanced model (under a more limiting set of assumptions than adopted by the present WGCEP, as described in detail in Appendix G). The 2002 Working Group Report (WGCEP, 2003, referred to here as WGCEP-2002) created segmented earthquake rupture forecast models for all faults in the region, including some that had been designated as Type B faults in the NSHMP, 1996, and one that had not previously been considered. The 2002 National Seismic Hazard Maps used the values from WGCEP-2002 for all the faults in the region, essentially treating all the listed faults as Type A faults. As discussed in Appendix A, the current WGCEP found that there are a number of faults with little or no data on slip per event or dates of previous earthquakes. As a result, the WGCEP recommends that faults with minimal available earthquake recurrence data (the Greenville, Mount Diablo, San Gregorio, Monte Vista-Shannon, and Concord-Green Valley faults) be modeled as Type B faults to be consistent with similarly poorly known faults statewide. 
As a result, the modified segmented models discussed here concern only the San Andreas, Hayward-Rodgers Creek, and Calaveras faults. Given the extensive level of effort by the recent Bay Area WGCEP-2002, our approach has been to adopt their final average models as our preferred a-priori models. We have modified the WGCEP-2002 models where necessary to match data that were not available to, or not used by, that WGCEP, and where the models needed by WGCEP-2007 for a uniform statewide model require different assumptions and/or logic-tree branch weights. In these cases we have made what are usually slight modifications to the WGCEP-2002 model. This appendix presents the minor changes needed to accommodate updated information and model construction. We do not attempt to reproduce here the extensive documentation of data, model parameters, and earthquake probabilities in the WGCEP-2002 report.
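The "closest moment-balanced model" step described above is an equality-constrained least-squares problem, which can be sketched via its KKT system. All numbers below (rupture rates, slips per event, section slip rate) are hypothetical illustrations, not WGCEP values; the sketch only shows the mathematical operation of projecting an a-priori rate vector onto the set of models that satisfy a linear slip-rate constraint.

```python
import numpy as np

def closest_balanced_model(r_apriori, A, s):
    """Minimize ||r - r_apriori||^2 subject to A @ r = s, via the KKT system
       [ I  A.T ] [ r ]   [ r_apriori ]
       [ A   0  ] [ l ] = [ s         ]."""
    n, m = len(r_apriori), len(s)
    kkt = np.block([[np.eye(n), A.T],
                    [A, np.zeros((m, m))]])
    rhs = np.concatenate([r_apriori, s])
    return np.linalg.solve(kkt, rhs)[:n]

# Hypothetical a-priori rupture rates (events/yr) for three ruptures on a
# fault section, with a single moment-balance constraint: average slip per
# event times rupture rate must sum to the section slip rate.
r0 = np.array([0.004, 0.002, 0.001])
A = np.array([[1.5, 3.0, 4.5]])   # hypothetical slip per event (m)
s = np.array([0.02])              # hypothetical section slip rate (m/yr)
r_balanced = closest_balanced_model(r0, A, s)
```

If the a-priori model already satisfies the slip-rate data, the projection returns it unchanged; otherwise it returns the nearest rate vector (in the least-squares sense) that does, mirroring the WGCEP-2007 methodology the appendix describes.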
Task-focused modeling in automated agriculture
NASA Astrophysics Data System (ADS)
Vriesenga, Mark R.; Peleg, K.; Sklansky, Jack
1993-01-01
Machine vision systems analyze image data to carry out automation tasks. Our interest is in machine vision systems that rely on models to achieve their designed task. When the model is interrogated from an a priori menu of questions, the model need not be complete. Instead, the machine vision system can use a partial model that contains a large amount of information in regions of interest and less information elsewhere. We propose an adaptive modeling scheme for machine vision, called task-focused modeling, which constructs a model having just sufficient detail to carry out the specified task. The model is detailed in regions of interest to the task and is less detailed elsewhere. This focusing effect saves time and reduces the computational effort expended by the machine vision system. We illustrate task-focused modeling by an example involving real-time micropropagation of plants in automated agriculture.
Global/local stress analysis of composite panels
NASA Technical Reports Server (NTRS)
Ransom, Jonathan B.; Knight, Norman F., Jr.
1989-01-01
A method for performing a global/local stress analysis is described, and its capabilities are demonstrated. The method employs spline interpolation functions which satisfy the linear plate bending equation to determine displacements and rotations from a global model which are used as boundary conditions for the local model. Then, the local model is analyzed independent of the global model of the structure. This approach can be used to determine local, detailed stress states for specific structural regions using independent, refined local models which exploit information from less-refined global models. The method presented is not restricted to having a priori knowledge of the location of the regions requiring local detailed stress analysis. This approach also reduces the computational effort necessary to obtain the detailed stress state. Criteria for applying the method are developed. The effectiveness of the method is demonstrated using a classical stress concentration problem and a graphite-epoxy blade-stiffened panel with a discontinuous stiffener.
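The displacement hand-off from global to local model can be sketched in one dimension along a shared boundary. Note the simplification: the paper's interpolation functions satisfy the linear plate-bending equation, whereas the sketch below uses an ordinary cubic spline (and hypothetical nodal values) purely to illustrate how coarse global results become boundary conditions for a refined local model.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Coarse global-model nodes along a shared boundary (arc-length s) and their
# computed transverse displacements w; values are hypothetical.
s_global = np.linspace(0.0, 1.0, 6)
w_global = np.array([0.0, 0.8, 1.5, 1.9, 1.4, 0.6]) * 1e-3

# Fit a spline through the global results and evaluate it at the refined
# local-model boundary nodes to obtain prescribed displacements; the
# spline's derivative supplies the corresponding rotations.
spline = CubicSpline(s_global, w_global)
s_local = np.linspace(0.0, 1.0, 41)
w_local = spline(s_local)           # prescribed displacements
theta_local = spline(s_local, 1)    # prescribed rotations dw/ds
```

Because the local model is then analyzed independently with these prescribed boundary values, the refined region never needs to be embedded in the global mesh, which is the source of the computational savings the abstract describes.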
Global/local stress analysis of composite structures. M.S. Thesis
NASA Technical Reports Server (NTRS)
Ransom, Jonathan B.
1989-01-01
A method for performing a global/local stress analysis is described and its capabilities are demonstrated. The method employs spline interpolation functions which satisfy the linear plate bending equation to determine displacements and rotations from a global model which are used as boundary conditions for the local model. Then, the local model is analyzed independent of the global model of the structure. This approach can be used to determine local, detailed stress states for specific structural regions using independent, refined local models which exploit information from less-refined global models. The method presented is not restricted to having a priori knowledge of the location of the regions requiring local detailed stress analysis. This approach also reduces the computational effort necessary to obtain the detailed stress state. Criteria for applying the method are developed. The effectiveness of the method is demonstrated using a classical stress concentration problem and a graphite-epoxy blade-stiffened panel with a discontinuous stiffener.
Stability of Contact Lines in Fluids: 2D Stokes Flow
NASA Astrophysics Data System (ADS)
Guo, Yan; Tice, Ian
2018-02-01
In an effort to study the stability of contact lines in fluids, we consider the dynamics of an incompressible viscous Stokes fluid evolving in a two-dimensional open-top vessel under the influence of gravity. This is a free boundary problem: the interface between the fluid in the vessel and the air above (modeled by a trivial fluid) is free to move and experiences capillary forces. The three-phase interface where the fluid, air, and solid vessel wall meet is known as a contact point, and the angle formed between the free interface and the vessel is called the contact angle. We consider a model of this problem that allows for fully dynamic contact points and angles. We develop a scheme of a priori estimates for the model, which then allow us to show that for initial data sufficiently close to equilibrium, the model admits global solutions that decay to equilibrium exponentially quickly.
Indicators for the automated analysis of drug prescribing quality.
Coste, J; Séné, B; Milstein, C; Bouée, S; Venot, A
1998-01-01
Irrational and inconsistent drug prescription has considerable impact on morbidity, mortality, health service utilization, and community burden. However, few studies have addressed the methodology of processing the information contained in drug orders to study the quality of drug prescriptions and prescriber behavior. We present a comprehensive set of quantitative indicators of drug prescription quality that can be derived from a drug order. These indicators were constructed using explicit a priori criteria that were previously validated on the basis of scientific data. Automatic computation is straightforward using a relational database system, such that large sets of prescriptions can be processed with minimal human effort. We illustrate the feasibility and value of this approach using a large set of 23,000 prescriptions for several diseases, selected from a nationally representative prescriptions database. Our study may result in direct and wide applications in the epidemiology of medical practice and in quality control procedures.
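The "automatic computation over a relational database" idea can be sketched with Python's built-in sqlite3 module. The schema and the indicator below (share of prescriptions containing two drugs of the same therapeutic class) are hypothetical illustrations of an explicit a priori criterion, not one of the paper's validated indicators.

```python
import sqlite3

# Hypothetical schema: one row per prescribed drug line on a prescription.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE drug_line (
    prescription_id INTEGER,
    drug_name       TEXT,
    atc_class       TEXT
);
INSERT INTO drug_line VALUES
    (1, 'drug_a', 'C09A'), (1, 'drug_b', 'C09A'),
    (2, 'drug_c', 'N02B'), (2, 'drug_d', 'A02B'),
    (3, 'drug_e', 'C09A');
""")

# Flag prescriptions with a within-class duplication, then compute the rate
# of flagged prescriptions across the whole table in a single query.
row = con.execute("""
SELECT CAST(SUM(dup) AS REAL) / COUNT(*) FROM (
    SELECT prescription_id, MAX(cnt > 1) AS dup
    FROM (SELECT prescription_id, atc_class, COUNT(*) AS cnt
          FROM drug_line GROUP BY prescription_id, atc_class)
    GROUP BY prescription_id
)
""").fetchone()
duplication_rate = row[0]
```

Because each indicator reduces to a declarative query, scaling from this toy table to tens of thousands of prescriptions is a matter of loading more rows, which matches the minimal-human-effort claim in the abstract.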
Ethics research in critically ill patients.
Estella, A
2018-05-01
Research in critical care patients is an ethical obligation. The ethical conflicts of intensive care research arise from patient vulnerability, since during ICU admission these individuals sometimes lose all or part of their decision-making capacity and autonomy. We must therefore dedicate effort to ensuring that neither treatment (sedation or mechanical ventilation) nor the disease itself can affect the right to individual freedom of participants in research, improving the conditions under which informed consent must be obtained. Fragility, understood as a decrease in the capacity to tolerate adverse effects derived from research, must be taken into account in selecting participants. Research should be relevant, not possible to carry out in non-critical patients, and should a priori offer potential benefits that outweigh the risks, which must be known and assumable, based on principles of responsibility.
Floquet analysis of Kuznetsov-Ma breathers: A path towards spectral stability of rogue waves.
Cuevas-Maraver, J; Kevrekidis, P G; Frantzeskakis, D J; Karachalios, N I; Haragus, M; James, G
2017-07-01
In the present work, we aim at taking a step towards the spectral stability analysis of Peregrine solitons, i.e., wave structures that are used to emulate extreme wave events. Given the space-time localized nature of Peregrine solitons, this is a priori a nontrivial task. Our main tool in this effort will be the study of the spectral stability of the periodic generalization of the Peregrine soliton in the evolution variable, namely the Kuznetsov-Ma breather. Given the periodic structure of the latter, we compute the corresponding Floquet multipliers, and examine them in the limit where the period of the orbit tends to infinity. This way, we extrapolate towards the stability of the limiting structure, namely the Peregrine soliton. We find that multiple unstable modes of the background are enhanced, yet no additional unstable eigenmodes arise as the Peregrine limit is approached. We explore the instability evolution also in direct numerical simulations.
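The Floquet-multiplier computation underlying this analysis follows a generic recipe: integrate a fundamental matrix of the periodic linearization over one period and take the eigenvalues of the resulting monodromy matrix. The sketch below applies that recipe to the Mathieu equation as a simple stand-in, not to the Kuznetsov-Ma linearization itself; parameters are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Stand-in periodic linear system: the Mathieu equation
#   x'' + (a - 2 q cos 2t) x = 0, with period T = pi.
a_par, q_par = 1.0, 0.2
T = np.pi

def rhs(t, y):
    x, v = y
    return [v, -(a_par - 2.0 * q_par * np.cos(2.0 * t)) * x]

# Integrate the two columns of the fundamental matrix over one period; the
# Floquet multipliers are the eigenvalues of the monodromy matrix.
cols = []
for y0 in ([1.0, 0.0], [0.0, 1.0]):
    sol = solve_ivp(rhs, (0.0, T), y0, rtol=1e-10, atol=1e-12)
    cols.append(sol.y[:, -1])
monodromy = np.array(cols).T
multipliers = np.linalg.eigvals(monodromy)
```

Since the system matrix here is traceless, Liouville's formula forces the product of the multipliers to equal one, a useful consistency check. Taking the period to infinity, as the abstract describes for the Kuznetsov-Ma breather, then probes the stability of the limiting Peregrine structure.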
ERIC Educational Resources Information Center
Putro, S. Eko; Sukirno; Budi, S.; Didik, W.
2016-01-01
The effort to improve human resource quality is not easy to implement. It becomes more complicated when applied to poor communities, especially, in this case, the marginal communities of small islands. This research analyzes the characteristics of poor households on small islands as well as strategies for poverty…
Effectiveness of the Department of Defense Information Assurance Accreditation Process
2013-03-01
meeting the requirements of ISO 27001, Information Security Management System. ISO 27002 provides "security techniques" or best practices that can be... efforts to the next level and implement a recognized standard such as the International Organization for Standardization (ISO) 27000 series of standards... implemented by an organization as part of their certification effort. Most likely, the main motivation a company would have for achieving an ISO
NASA Astrophysics Data System (ADS)
Fisher, Daniel; Poulsen, Caroline A.; Thomas, Gareth E.; Muller, Jan-Peter
2016-03-01
In this paper we evaluate the impact on the cloud parameter retrievals of the ORAC (Optimal Retrieval of Aerosol and Cloud) algorithm following the inclusion of stereo-derived cloud top heights as a priori information. This is performed in a mathematically rigorous way using the ORAC optimal estimation retrieval framework, which includes the facility to use such independent a priori information. Key to the use of a priori information is a characterisation of its associated uncertainty. This paper demonstrates the improvements that are possible using this approach and also considers their impact on the microphysical cloud parameters retrieved. The Along-Track Scanning Radiometer (AATSR) instrument has two views and three thermal channels, so it is well placed to demonstrate the synergy of the two techniques. The stereo retrieval is able to improve the accuracy of the retrieved cloud top height when compared to collocated Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO), particularly in the presence of boundary layer inversions and high clouds. The impact of the stereo a priori information on the microphysical cloud properties of cloud optical thickness (COT) and effective radius (RE) was evaluated and generally found to be very small for single-layer cloud conditions over open water (mean RE differences of 2.2 (±5.9) microns and mean COT differences of 0.5 (±1.8) for single-layer ice clouds over open water at elevations above 9 km, which are most strongly affected by the inclusion of the a priori).
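The role that a priori information and its uncertainty play in an optimal estimation retrieval can be illustrated with the scalar Bayesian update: the retrieved value is the inverse-variance-weighted combination of the a priori and the measurement-derived estimate. The numbers below are hypothetical, and a real ORAC retrieval is a multivariate, iterative version of this idea.

```python
# Scalar optimal-estimation update: combine an a priori cloud-top height
# x_a (e.g., stereo-derived) with an independent radiometric estimate y,
# weighting each by its inverse variance. Values are hypothetical.
x_a, sigma_a = 9.0, 0.5     # a priori CTH (km) and its 1-sigma uncertainty
y, sigma_y = 10.2, 1.5      # radiometric estimate and its uncertainty

w_a = 1.0 / sigma_a ** 2
w_y = 1.0 / sigma_y ** 2
x_hat = (w_a * x_a + w_y * y) / (w_a + w_y)   # retrieved height
sigma_hat = (w_a + w_y) ** -0.5               # retrieved uncertainty
```

This makes the abstract's point concrete: a well-characterised, tight stereo prior pulls the retrieval strongly toward itself and shrinks the posterior uncertainty, while a loose prior leaves the radiometric solution nearly unchanged.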
Development Impact Assessment Highlights Co-benefits of GHG Mitigation Actions
DOE Office of Scientific and Technical Information (OSTI.GOV)
2016-06-01
This EC-LEDS document describes the Development Impact Assessment (DIA) process, which explores interactions between development goals and low emission development strategies. DIA aims to support informed decision-making by considering how policies and programs intended to meet one goal may impact other development priorities. Enhancing Capacity for Low Emission Development Strategies (EC-LEDS) is a flagship U.S. government-led effort that assists countries in developing and implementing LEDS. The program enhances partner country efforts by providing targeted technical assistance and building a shared global knowledge base on LEDS.
Lodge, Amy C; Kaufman, Laura; Stevens Manser, Stacey
2017-05-01
Despite being an established practice in the disabilities service systems, person-centered planning is a relatively new practice in the behavioral health system. As a result, little is known about the barriers that mental health organizations face in implementing person-centered recovery planning (PCRP). To fill this gap, results are presented from a qualitative analysis of nine focus groups at three public mental health organizations in Texas that have been implementing PCRP for at least 2 years. Findings suggest that organizations experienced 12 distinct barriers to PCRP implementation which were categorized into the Consolidated Framework for Implementation Research domains of intervention characteristics, the outer setting, the inner setting, characteristics of individuals, and the implementation process. Half of these 12 barriers fell within the inner setting domain, suggesting that implementation efforts should be flexible and adaptable to organizational culture and context. One-quarter of the barriers fell into the domain of characteristics of individuals involved in the intervention, which further suggests implementation efforts should assess the impact that both staff and consumers have on implementation success.
Realism, functions, and the a priori: Ernst Cassirer's philosophy of science.
Heis, Jeremy
2014-12-01
This paper presents the main ideas of Cassirer's general philosophy of science, focusing on the two aspects of his thought that--in addition to being the most central ideas in his philosophy of science--have received the most attention from contemporary philosophers of science: his theory of the a priori aspects of physical theory, and his relation to scientific realism.
NASA Technical Reports Server (NTRS)
Giriunas, Julius A.
2012-01-01
Facility upgrades and large maintenance tasks needed at the NASA Glenn 10x10 Supersonic Wind Tunnel require significant planning to ensure implementation proceeds in an efficient and cost-effective manner. Funding, design efforts, and installation scheduling need to be thought out years in advance to avoid interference with wind tunnel testing. This presentation describes five facility tasks planned for implementation over the next few years. The main focus of the presentation is the possible replacement of the diesel generator and the rationale behind that effort.
Development of an intelligent hypertext system for wind tunnel testing
NASA Technical Reports Server (NTRS)
Lo, Ching F.; Shi, George Z.; Steinle, Frank W.; Wu, Y. C. L. Susan; Hoyt, W. Andes
1991-01-01
This paper summarizes the results of a system utilizing artificial intelligence technology to improve the productivity of project engineers who conduct wind tunnel tests. The objective was to create an intelligent hypertext system (IHS) which integrates a hypertext manual with an expert system that stores experts' knowledge and experience. The preliminary (Phase I) effort implemented a prototype IHS module encompassing a portion of the manuals and knowledge used for wind tunnel testing. The effort successfully demonstrated the feasibility of the intelligent hypertext system concept. A module for the internal strain gage balance, implemented on both IBM-PC and Macintosh computers, is presented. A description of the Phase II effort is included.
Zhang, Hongping; Gao, Zhouzheng; Ge, Maorong; Niu, Xiaoji; Huang, Ling; Tu, Rui; Li, Xingxing
2013-11-18
Precise Point Positioning (PPP) has become a prominent topic in GNSS research and applications. However, it usually takes several tens of minutes to obtain positions with better than 10 cm accuracy, which prevents PPP from being widely used in real-time kinematic positioning services; a large effort has therefore been made to tackle this convergence problem. One of the recent approaches is ionospheric-delay-constrained precise point positioning (IC-PPP), which uses the spatial and temporal characteristics of ionospheric delays as well as delays from an a priori model. In this paper, the impact of the quality of ionospheric models on the convergence of IC-PPP is evaluated using the IGS global ionospheric map (GIM), updated every two hours, and a regional satellite-specific correction model. Furthermore, the effect of the receiver differential code bias (DCB) is investigated by comparing the convergence time for IC-PPP with and without estimation of the DCB parameter. The results of processing a large amount of data show that, on the one hand, the quality of the a priori ionospheric delays plays a very important role in IC-PPP convergence: regional dense GNSS networks can generally provide more precise ionospheric delays than GIM and can consequently reduce the convergence time. On the other hand, ignoring the receiver DCB may considerably extend convergence, and the larger the DCB, the longer the convergence time; estimating the receiver DCB in IC-PPP is a proper way to overcome this problem. Therefore, current IC-PPP should be enhanced by estimating receiver DCB and employing regional satellite-specific ionospheric correction models in order to speed up convergence for more practical applications.
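The abstract's point about the receiver DCB can be illustrated with a toy two-signal model: if the bias is not estimated, it leaks into the other parameters, whereas the geometry-free combination of the two signals plus the a priori ionosphere separates it cleanly. This is a simplified illustration, not the IC-PPP algorithm; the frequency factor g, noise levels, and variable names are all assumptions.

```python
import numpy as np

# Toy model: signal 1 carries the receiver DCB, signal 2 sees the ionosphere
# scaled by a frequency-dependent factor g. An a priori ionospheric model
# (with its own noise) plays the role of the external constraint.
rng = np.random.default_rng(7)
n = 50
g = 1.65                                  # illustrative iono scaling factor
rho_true, dcb_true = 100.0, 2.5           # "position" parameter and DCB
iono_true = rng.uniform(1.0, 5.0, n)

y1 = rho_true + iono_true + dcb_true + rng.normal(0, 0.1, n)   # signal 1
y2 = rho_true + g * iono_true + rng.normal(0, 0.1, n)          # signal 2
iono_prior = iono_true + rng.normal(0, 0.2, n)                 # a priori model

# ignoring the DCB: the position-like estimate absorbs the full bias
rho_ignored = np.mean(y1 - iono_prior)

# estimating the DCB from the geometry-free combination y1 - y2
dcb_hat = np.mean(y1 - y2 - (1.0 - g) * iono_prior)
rho_hat = np.mean(y1 - iono_prior - dcb_hat)
```

Here `rho_ignored` converges to roughly `rho_true + dcb_true`, mirroring the abstract's observation that an unmodeled DCB degrades the solution, while the DCB-aware estimate recovers both parameters.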
Livingston, Kara A.; Chung, Mei; Sawicki, Caleigh M.; Lyle, Barbara J.; Wang, Ding Ding; Roberts, Susan B.; McKeown, Nicola M.
2016-01-01
Background Dietary fiber is a broad category of compounds historically defined as partially or completely indigestible plant-based carbohydrates and lignin with, more recently, the additional criteria that fibers incorporated into foods as additives should demonstrate functional human health outcomes to receive a fiber classification. Thousands of research studies have been published examining fibers and health outcomes. Objectives (1) Develop a database listing studies testing fiber and physiological health outcomes identified by experts at the Ninth Vahouny Conference; (2) Use evidence mapping methodology to summarize this body of literature. This paper summarizes the rationale, methodology, and resulting database. The database will help both scientists and policy-makers to evaluate evidence linking specific fibers with physiological health outcomes, and identify missing information. Methods To build this database, we conducted a systematic literature search for human intervention studies published in English from 1946 to May 2015. Our search strategy included a broad definition of fiber search terms, as well as search terms for nine physiological health outcomes identified at the Ninth Vahouny Fiber Symposium. Abstracts were screened using a priori defined eligibility criteria and a low threshold for inclusion to minimize the likelihood of rejecting articles of interest. Publications then were reviewed in full text, applying additional a priori defined exclusion criteria. The database was built and published on the Systematic Review Data Repository (SRDR™), a web-based, publicly available application. Conclusions A fiber database was created. This resource will reduce the unnecessary replication of effort in conducting systematic reviews by serving as both a central database archiving PICO (population, intervention, comparator, outcome) data on published studies and as a searchable tool through which this data can be extracted and updated. PMID:27348733
Automatic Artifact Removal from Electroencephalogram Data Based on A Priori Artifact Information.
Zhang, Chi; Tong, Li; Zeng, Ying; Jiang, Jingfang; Bu, Haibing; Yan, Bin; Li, Jianxin
2015-01-01
Electroencephalogram (EEG) data are susceptible to various nonneural physiological artifacts. Automatic artifact removal from EEG data remains a key challenge for extracting relevant information from brain activities. To adapt to variable subjects and EEG acquisition environments, this paper presents an automatic online artifact removal method based on a priori artifact information. The combination of discrete wavelet transform and independent component analysis (ICA), wavelet-ICA, was utilized to separate artifact components. The artifact components were then automatically identified using a priori artifact information acquired in advance. Subsequently, signal reconstruction without the artifact components was performed to obtain artifact-free signals. The results showed that, using this automatic online artifact removal method, there were statistically significant improvements in classification accuracy in both experiments, namely motor imagery and emotion recognition.
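The identification-and-reconstruction step described above can be sketched as follows. For clarity the component separation is stood in for by a known unmixing matrix rather than an actual wavelet-ICA decomposition, and the signals, mixing matrix, and correlation threshold are all illustrative assumptions; only the "flag components against an a priori artifact template, then reconstruct without them" logic follows the abstract.

```python
import numpy as np

t = np.linspace(0, 1, 500)
neural = np.sin(2 * np.pi * 10 * t)          # 10 Hz "neural" rhythm
blink = np.exp(-((t - 0.5) ** 2) / 0.001)    # eye-blink-like artifact
S = np.vstack([neural, blink])               # true sources
A = np.array([[1.0, 0.8], [0.6, 1.0]])       # mixing matrix
X = A @ S                                    # observed "EEG" channels

W = np.linalg.inv(A)                         # unmixing (ICA stand-in)
components = W @ X

# identify artifact components by correlation with a priori template
template = blink
corr = [abs(np.corrcoef(c, template)[0, 1]) for c in components]
keep = np.array(corr) < 0.9                  # drop artifact-like components

# reconstruct channels with artifact columns of the mixing matrix zeroed
A_clean = A.copy()
A_clean[:, ~keep] = 0.0
X_clean = A_clean @ components
```

With the artifact component removed, each reconstructed channel contains only its neural contribution.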
A New Expanded Mixed Element Method for Convection-Dominated Sobolev Equation
Wang, Jinfeng; Li, Hong; Fang, Zhichao
2014-01-01
We propose and analyze a new expanded mixed element method, whose gradient belongs to the simple square-integrable space instead of the classical H(div; Ω) space of Chen's expanded mixed element method. We study the new expanded mixed element method for the convection-dominated Sobolev equation, prove existence and uniqueness of the finite element solution, and introduce a new expanded mixed projection. We derive the optimal a priori error estimates in the L^2-norm for the scalar unknown u and a priori error estimates in the (L^2)^2-norm for its gradient λ and its flux σ. Moreover, we obtain the optimal a priori error estimates in the H^1-norm for the scalar unknown u. Finally, we present some numerical results to illustrate the efficiency of the new method. PMID:24701153
Multispectral guided fluorescence diffuse optical tomography using upconverting nanoparticles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Svenmarker, Pontus, E-mail: pontus.svenmarker@physics.umu.se; Department of Physics, Umeå University, SE-901 87 Umeå; Centre for Microbial Research
2014-02-17
We report on improved image detectability for fluorescence diffuse optical tomography using upconverting nanoparticles doped with rare-earth elements. Core-shell NaYF4:Yb3+/Er3+@NaYF4 upconverting nanoparticles were synthesized through a stoichiometric method. The Yb3+/Er3+ sensitizer-activator pair yielded two anti-Stokes shifted fluorescence emission bands at 540 nm and 660 nm, here used to a priori estimate the fluorescence source depth with sub-millimeter precision. A spatially varying regularization incorporated the a priori fluorescence source depth estimation into the tomography reconstruction scheme. Tissue phantom experiments showed both improved resolution and contrast in the reconstructed images as compared to not using any a priori information.
ERIC Educational Resources Information Center
Goutard, M.
1992-01-01
Examines cooperative efforts among nongovernmental organizations (NGOs) in France to implement the United Nation's Convention on the Rights of the Child. Describes collaborative efforts between NGO committees and government authorities, highlighting government experts' recommendations for changes in four areas of French law to conform to…
Organizational readiness in specialty mental health care.
Hamilton, Alison B; Cohen, Amy N; Young, Alexander S
2010-01-01
Implementing quality improvement efforts in clinics is challenging. Assessment of organizational "readiness" for change can set the stage for implementation by providing information regarding existing strengths and deficiencies, thereby increasing the chance of a successful improvement effort. This paper discusses organizational assessment in specialty mental health, in preparation for improving care for individuals with schizophrenia. To assess organizational readiness for change in specialty mental health in order to facilitate locally tailored implementation strategies. EQUIP-2 is a site-level controlled trial at nine VA medical centers (four intervention, five control). Providers at all sites completed an organizational readiness for change (ORC) measure, and key stakeholders at the intervention sites completed a semi-structured interview at baseline. At the four intervention sites, 16 administrators and 43 clinical staff completed the ORC, and 38 key stakeholders were interviewed. The readiness domains of training needs, communication, and change were the domains with lower mean scores (i.e., potential deficiencies) ranging from a low of 23.8 to a high of 36.2 on a scale of 10-50, while staff attributes of growth and adaptability had higher mean scores (i.e., potential strengths) ranging from a low of 35.4 to a high of 41.1. Semi-structured interviews revealed that staff perceptions and experiences of change and decision-making are affected by larger structural factors such as change mandates from VA headquarters. Motivation for change, organizational climate, staff perceptions and beliefs, and prior experience with change efforts contribute to readiness for change in specialty mental health. Sites with less readiness for change may require more flexibility in the implementation of a quality improvement intervention. 
We suggest that uptake of evidence-based practices can be enhanced by tailoring implementation efforts to the strengths and deficiencies of the organizations that are implementing quality improvement changes.
Adams, Hieab H H; Hilal, Saima; Schwingenschuh, Petra; Wittfeld, Katharina; van der Lee, Sven J; DeCarli, Charles; Vernooij, Meike W; Katschnig-Winter, Petra; Habes, Mohamad; Chen, Christopher; Seshadri, Sudha; van Duijn, Cornelia M; Ikram, M Kamran; Grabe, Hans J; Schmidt, Reinhold; Ikram, M Arfan
2015-12-01
Virchow-Robin spaces (VRS), or perivascular spaces, are compartments of interstitial fluid enclosing cerebral blood vessels and are potential imaging markers of various underlying brain pathologies. Despite a growing interest in the study of enlarged VRS, the heterogeneity in rating and quantification methods combined with small sample sizes have so far hampered advancement in the field. The Uniform Neuro-Imaging of Virchow-Robin Spaces Enlargement (UNIVRSE) consortium was established with primary aims to harmonize rating and analysis (www.uconsortium.org). The UNIVRSE consortium brings together 13 (sub)cohorts from five countries, totaling 16,000 subjects and over 25,000 scans. Eight different magnetic resonance imaging protocols were used in the consortium. VRS rating was harmonized using a validated protocol that was developed by the two founding members, with high reliability independent of scanner type, rater experience, or concomitant brain pathology. Initial analyses revealed risk factors for enlarged VRS including increased age, sex, high blood pressure, brain infarcts, and white matter lesions, but this varied by brain region. Early collaborative efforts between cohort studies with respect to data harmonization and joint analyses can advance the field of population (neuro)imaging. The UNIVRSE consortium will focus efforts on other potential correlates of enlarged VRS, including genetics, cognition, stroke, and dementia.
2012-01-01
The Village/Commune Safety Policy was launched by the Ministry of Interior of the Kingdom of Cambodia in 2010 and, due to a priority focus on “cleaning the streets”, has created difficulties for HIV prevention programs attempting to implement programs that work with key affected populations including female sex workers and people who inject drugs. The implementation of the policy has forced HIV program implementers, the UN and various government counterparts to explore and develop collaborative ways of delivering HIV prevention services within this difficult environment. The following case study explores some of these efforts and highlights the promising development of a Police Community Partnership Initiative that it is hoped will find a meaningful balance between the Village/Commune Safety Policy and HIV prevention efforts with key affected populations in Cambodia. PMID:22770267
Implementation Issues for Departure Planning Systems
NASA Technical Reports Server (NTRS)
Hansman, R. John; Feron, Eric; Clarke, John-Paul; Odoni, Amedeo
1999-01-01
The objective of the proposed effort is to investigate issues associated with the design and implementation of decision aiding tools to assist in improving the departure process at congested airports. This effort follows a preliminary investigation of potential Departure Planning approaches and strategies, which identified potential benefits in departure efficiency and in reducing the environmental impact of aircraft in the departure queue. The preliminary study was based, in large part, on observations and analysis of departure processes at Boston Logan Airport. The objective of this follow-on effort is to address key implementation issues and to expand the observational base to include airports with different constraints and traffic demand. Specifically, the objectives of this research are to: (1) expand the observational base to include airports with different underlying operational dynamics; (2) develop prototype decision aiding algorithms/approaches and assess potential benefits; and (3) investigate Human Machine Integration (HMI) issues associated with decision aids in tower environments.
ADMIT: a toolbox for guaranteed model invalidation, estimation and qualitative–quantitative modeling
Streif, Stefan; Savchenko, Anton; Rumschinski, Philipp; Borchers, Steffen; Findeisen, Rolf
2012-01-01
Summary: Often competing hypotheses for biochemical networks exist in the form of different mathematical models with unknown parameters. Considering available experimental data, it is then desired to reject model hypotheses that are inconsistent with the data, or to estimate the unknown parameters. However, these tasks are complicated because experimental data are typically sparse, uncertain, and frequently only available in the form of qualitative if-then observations. ADMIT (Analysis, Design and Model Invalidation Toolbox) is a MatLab™-based tool for guaranteed model invalidation, state and parameter estimation. The toolbox allows the integration of quantitative measurement data, a priori knowledge of parameters and states, and qualitative information on the dynamic or steady-state behavior. A constraint satisfaction problem is automatically generated, and algorithms are implemented for solving the desired estimation, invalidation or analysis tasks. The implemented methods build on convex relaxation and optimization and therefore provide guaranteed estimation results and certificates for invalidity. Availability: ADMIT, tutorials and illustrative examples are available free of charge for non-commercial use at http://ifatwww.et.uni-magdeburg.de/syst/ADMIT/ Contact: stefan.streif@ovgu.de PMID:22451270
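The idea of guaranteed (set-based) model invalidation can be shown in miniature: given interval-uncertain measurements and an interval prior on a parameter, a model hypothesis is invalidated exactly when the intersection of all consistent parameter sets is empty. This toy is in the spirit of ADMIT but is not its API; ADMIT handles far richer constraint systems via convex relaxation.

```python
import numpy as np

# Model hypothesis: y = p * x with p in [p_lo, p_hi] (x > 0 assumed).
# Each measurement interval [y_lo_i, y_hi_i] constrains p to [y_lo_i/x_i,
# y_hi_i/x_i]; the hypothesis is invalid iff the overall intersection with
# the prior interval is empty.
def invalidate(x, y_lo, y_hi, p_lo, p_hi):
    lo = np.max(y_lo / x)                 # tightest lower bound on p
    hi = np.min(y_hi / x)                 # tightest upper bound on p
    feasible_lo = max(lo, p_lo)
    feasible_hi = min(hi, p_hi)
    return feasible_lo > feasible_hi      # empty intersection -> invalidated

x = np.array([1.0, 2.0, 4.0])
y_lo = np.array([1.8, 3.9, 7.8])          # uncertain data around y = 2x
y_hi = np.array([2.2, 4.1, 8.2])

consistent_model = invalidate(x, y_lo, y_hi, p_lo=1.5, p_hi=2.5)  # not invalid
wrong_model = invalidate(x, y_lo, y_hi, p_lo=3.0, p_hi=4.0)       # invalid
```

Because the feasible set is computed exactly (no sampling), an empty intersection is a certificate of invalidity, which is the "guaranteed" property the abstract refers to.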
NASA Technical Reports Server (NTRS)
West, M. E.
1992-01-01
A real-time estimation filter which reduces sensitivity to system variations and reduces the amount of preflight computation is developed for the instrument pointing subsystem (IPS). The IPS is a three-axis stabilized platform developed to point various astronomical observation instruments aboard the shuttle. Currently, the IPS utilizes a linearized Kalman filter (LKF), with pre-mission-defined gains, to compensate for system drifts and accumulated attitude errors. Since the a priori gains are generated for an expected system, variations result in a suboptimal estimation process. This report compares the performance of three real-time estimation filters with the current LKF implementation. An extended Kalman filter and a second-order Kalman filter are developed to account for the system nonlinearities, while a linear Kalman filter implementation assumes that the nonlinearities are negligible. The performance of each of the four estimation filters is compared with respect to accuracy, stability, settling time, robustness, and computational requirements. It is shown that, for the current IPS pointing requirements, the linear Kalman filter provides improved robustness over the LKF with lower computational requirements than the two real-time nonlinear estimation filters.
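The linear Kalman filter the report favors, with gains computed in real time rather than fixed pre-mission, follows the standard predict/update recursion. The sketch below estimates an attitude error and its drift rate from noisy attitude measurements; the state model, noise levels, and dimensions are illustrative stand-ins, not the actual IPS dynamics.

```python
import numpy as np

def kalman_filter(zs, F, H, Q, R, x0, P0):
    """Standard linear Kalman filter; gains K are recomputed every step."""
    x, P = x0, P0
    estimates = []
    for z in zs:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(len(x)) - K @ H) @ P
        estimates.append(x.copy())
    return np.array(estimates)

# state: [attitude error, drift rate]; only the attitude error is measured
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = 1e-6 * np.eye(2)                      # small process noise
R = np.array([[0.01]])                    # measurement noise variance

rng = np.random.default_rng(1)
true_drift = 0.05
truth = np.array([[true_drift * dt * k, true_drift] for k in range(200)])
zs = truth[:, :1] + rng.normal(0, 0.1, size=(200, 1))

est = kalman_filter(zs, F, H, Q, R, np.zeros(2), np.eye(2))
```

The unmeasured drift-rate state converges toward the true value, which is the mechanism the IPS uses to compensate for system drifts without pre-mission gain schedules.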
MuffinInfo: HTML5-Based Statistics Extractor from Next-Generation Sequencing Data.
Alic, Andy S; Blanquer, Ignacio
2016-09-01
Usually, the information known a priori about a newly sequenced organism is limited. Even resequencing the same organism can generate unpredictable output. We introduce MuffinInfo, a FastQ/Fasta/SAM information extractor implemented in HTML5, capable of offering insights into next-generation sequencing (NGS) data. Our new tool can run on any software or hardware environment, in command line or graphically, and in browser or standalone. It presents information such as average length, base distribution, quality score distribution, k-mer histogram, and homopolymer analysis. MuffinInfo improves upon existing extractors by adding the ability to save and then reload the results obtained after a run as a navigable file (also supporting saving pictures of the charts), by supporting custom statistics implemented by the user, and by offering user-adjustable parameters involved in the processing, all in one piece of software. At the moment, the extractor works with all base-space technologies, such as Illumina, Roche, Ion Torrent, Pacific Biosciences, and Oxford Nanopore. Owing to HTML5, our software demonstrates the readiness of web technologies for the mildly intensive tasks encountered in bioinformatics.
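The kinds of statistics listed above (average read length, base distribution, quality scores) can be computed with a few lines over FastQ records. The sketch below is a deliberately minimal FastQ reader for illustration, not MuffinInfo's implementation; it assumes well-formed four-line records and Phred+33 quality encoding.

```python
import io
from collections import Counter

def fastq_stats(handle):
    """Collect basic per-file statistics from a four-line-record FastQ."""
    lengths, bases, quals = [], Counter(), []
    while True:
        header = handle.readline()
        if not header:
            break
        seq = handle.readline().strip()
        handle.readline()                         # '+' separator line
        qual = handle.readline().strip()
        lengths.append(len(seq))
        bases.update(seq)
        quals.extend(ord(c) - 33 for c in qual)   # Phred+33 encoding
    return {
        "reads": len(lengths),
        "avg_length": sum(lengths) / len(lengths),
        "base_distribution": dict(bases),
        "mean_quality": sum(quals) / len(quals),
    }

# tiny in-memory example file
sample = io.StringIO(
    "@read1\nACGT\n+\nIIII\n"
    "@read2\nGGCC\n+\nHHHH\n"
)
stats = fastq_stats(sample)
```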
Autonomic Closure for Turbulent Flows Using Approximate Bayesian Computation
NASA Astrophysics Data System (ADS)
Doronina, Olga; Christopher, Jason; Hamlington, Peter; Dahm, Werner
2017-11-01
Autonomic closure is a new technique for achieving fully adaptive and physically accurate closure of coarse-grained turbulent flow governing equations, such as those solved in large eddy simulations (LES). Although autonomic closure has been shown in recent a priori tests to more accurately represent unclosed terms than do dynamic versions of traditional LES models, the computational cost of the approach makes it challenging to implement for simulations of practical turbulent flows at realistically high Reynolds numbers. The optimization step used in the approach introduces large matrices that must be inverted and is highly memory intensive. In order to reduce memory requirements, here we propose to use approximate Bayesian computation (ABC) in place of the optimization step, thereby yielding a computationally-efficient implementation of autonomic closure that trades memory-intensive for processor-intensive computations. The latter challenge can be overcome as co-processors such as general purpose graphical processing units become increasingly available on current generation petascale and exascale supercomputers. In this work, we outline the formulation of ABC-enabled autonomic closure and present initial results demonstrating the accuracy and computational cost of the approach.
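The ABC substitution described above trades a large matrix inversion for many cheap forward simulations. The rejection-ABC kernel at its core can be shown on a toy problem: draw parameters from a prior, simulate, and keep only draws whose summary statistic falls within a tolerance of the observed one. This toy infers a single coefficient and is purely illustrative; it is not the autonomic-closure formulation itself.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate(theta, n=1000):
    # stand-in "model": observable scales linearly with the parameter
    return theta * rng.normal(1.0, 0.1, n)

observed = simulate(2.0)                     # data from true theta = 2.0
obs_stat = observed.mean()                   # summary statistic

# rejection ABC: sample the prior, accept draws within tolerance epsilon
prior_draws = rng.uniform(0.0, 5.0, 20000)
accepted = []
for theta in prior_draws:
    stat = simulate(theta, n=200).mean()
    if abs(stat - obs_stat) < 0.05:          # distance tolerance epsilon
        accepted.append(theta)

posterior_mean = np.mean(accepted)
```

Each accepted draw costs one forward simulation and constant memory, which is the memory-for-compute trade the abstract argues suits GPU-heavy machines.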
A geometric approach to identify cavities in particle systems
NASA Astrophysics Data System (ADS)
Voyiatzis, Evangelos; Böhm, Michael C.; Müller-Plathe, Florian
2015-11-01
The implementation of a geometric algorithm to identify cavities in particle systems in an open-source Python program is presented. The algorithm makes use of the Delaunay space tessellation. The Python software is based on platform-independent tools, leading to a portable program. Its successful execution provides information concerning the accessible volume fraction of the system, the size and shape of the cavities, and the group of atoms forming each of them. The program can be easily incorporated into the LAMMPS software. An advantage of the present algorithm is that no a priori assumption on the cavity shape has to be made. As an example, the cavity size and shape distributions in a polyethylene melt system are presented for three spherical probe particles. This paper also serves as an introductory manual to the script. It summarizes the algorithm, its implementation, the required user-defined parameters, and the format of the input and output files. Additionally, we demonstrate possible applications of our approach and compare its capability with those of well-documented cavity size estimators.
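The Delaunay-based cavity idea can be sketched directly from the tessellation's defining property: a tetrahedron belongs to the Delaunay tessellation if its circumsphere contains no other particle, and it contributes cavity space if that circumsphere can host a spherical probe after subtracting the particle radius. The brute-force enumeration below is only viable for tiny systems and is not the paper's implementation (which uses a proper tessellation library); radii and thresholds are illustrative assumptions.

```python
import itertools
import numpy as np

def circumsphere(p):
    # circumcenter c solves 2*(p_i - p_0) . c = |p_i|^2 - |p_0|^2
    A = 2.0 * (p[1:] - p[0])
    b = np.sum(p[1:] ** 2, axis=1) - np.sum(p[0] ** 2)
    c = np.linalg.solve(A, b)
    return c, np.linalg.norm(p[0] - c)

def cavity_tetrahedra(points, particle_radius, probe_radius):
    cavities = []
    for idx in itertools.combinations(range(len(points)), 4):
        p = points[list(idx)]
        # skip near-degenerate slivers (nearly coplanar point sets)
        if abs(np.linalg.det(p[1:] - p[0])) / 6.0 < 0.05:
            continue
        c, R = circumsphere(p)
        others = np.delete(np.arange(len(points)), idx)
        # Delaunay condition: no other particle inside the circumsphere
        if np.all(np.linalg.norm(points[others] - c, axis=1) >= R - 1e-9):
            if R - particle_radius >= probe_radius:
                cavities.append(idx)
    return cavities

# toy system: particles near a cube's corners leave a central cavity
rng = np.random.default_rng(3)
pts = np.array([[x, y, z] for x in (0.0, 1.0) for y in (0.0, 1.0)
                for z in (0.0, 1.0)]) + rng.normal(0, 0.01, (8, 3))
small_probe = cavity_tetrahedra(pts, particle_radius=0.3, probe_radius=0.3)
big_probe = cavity_tetrahedra(pts, particle_radius=0.3, probe_radius=0.7)
```

Varying the probe radius reproduces the paper's point that cavity statistics are probe-dependent: the central void admits the small probe but not the large one.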
[The trajectory of a public service for alcohol and drug addicts in Vitória: the case of the CPTT].
Dos Reis, Rossana; Garcia, Maria Lúcia Teixeira
2008-01-01
The aim of this case study is to analyze the Center for Prevention and Treatment of Alcohol and Drug Addiction (CPTT)/Psychosocial Care Center in Drugs and Alcohol (CAPS ad), and to reflect on the implementation process of the local addiction recovery policy for drug users and alcoholics. A document research was performed using as sources the CPTT's mid-year/annual and/or management reports and articles about the service. Data were analyzed using a priori and a posteriori content analysis. The CPTT, a health service of the government of the city of Vitória, was created in 1992 as a psychosocial service. The services provided by the CPTT include individual care, daily group activities, and reception and follow-up groups. Today, the situation at the CPTT is characterized by precarious employment relations for most of the professionals working there. The trajectory of the implementation of the CPTT in the city of Vitória expresses two contradictory features: on one hand, the advances made in the implementation of the public policy for prevention and treatment of drug and alcohol abuse, and on the other hand, the challenge posed by the lack of a human resources policy capable of putting into effect the advances proposed by the policy.
Organizational theory for dissemination and implementation research.
Birken, Sarah A; Bunger, Alicia C; Powell, Byron J; Turner, Kea; Clary, Alecia S; Klaman, Stacey L; Yu, Yan; Whitaker, Daniel J; Self, Shannon R; Rostad, Whitney L; Chatham, Jenelle R Shanley; Kirk, M Alexis; Shea, Christopher M; Haines, Emily; Weiner, Bryan J
2017-05-12
Even under optimal internal organizational conditions, implementation can be undermined by changes in organizations' external environments, such as fluctuations in funding, adjustments in contracting practices, new technology, new legislation, changes in clinical practice guidelines and recommendations, or other environmental shifts. Internal organizational conditions are increasingly reflected in implementation frameworks, but nuanced explanations of how organizations' external environments influence implementation success are lacking in implementation research. Organizational theories offer implementation researchers a host of existing, highly relevant, and heretofore largely untapped explanations of the complex interaction between organizations and their environment. In this paper, we demonstrate the utility of organizational theories for implementation research. We applied four well-known organizational theories (institutional theory, transaction cost economics, contingency theories, and resource dependency theory) to published descriptions of efforts to implement SafeCare, an evidence-based practice for preventing child abuse and neglect. Transaction cost economics theory explained how frequent, uncertain processes for contracting for SafeCare may have generated inefficiencies and thus compromised implementation among private child welfare organizations. Institutional theory explained how child welfare systems may have been motivated to implement SafeCare because doing so aligned with expectations of key stakeholders within child welfare systems' professional communities. Contingency theories explained how efforts such as interagency collaborative teams promoted SafeCare implementation by facilitating adaptation to child welfare agencies' internal and external contexts. 
Resource dependency theory (RDT) explained how interagency relationships, supported by contracts, memoranda of understanding, and negotiations, facilitated SafeCare implementation by balancing autonomy and dependence on funding agencies and SafeCare developers. In addition to the retrospective application of organizational theories demonstrated above, we advocate for the proactive use of organizational theories to design implementation research. For example, implementation strategies should be selected to minimize transaction costs, promote and maintain congruence between organizations' dynamic internal and external contexts over time, and simultaneously attend to organizations' financial needs while preserving their autonomy. We describe implications of applying organizational theory in implementation research for implementation strategies, the evaluation of implementation efforts, measurement, research design, theory, and practice. We also offer guidance to implementation researchers for applying organizational theory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartkiewicz, Karol; Miranowicz, Adam
We find an optimal quantum cloning machine, which clones qubits of arbitrary symmetrical distribution around the Bloch vector with the highest fidelity. The process is referred to as phase-independent cloning, in contrast to standard phase-covariant cloning, for which the input qubit state is a priori better known. We assume that the information about the input state is encoded in an arbitrary axisymmetric distribution (phase function) on the Bloch sphere of the cloned qubits. We find analytical expressions describing the optimal cloning transformation and the fidelity of the clones. As an illustration, we analyze cloning of a qubit state described by the von Mises-Fisher and Brosseau distributions. Moreover, we show that the optimal phase-independent cloning machine can be implemented by modifying the mirror phase-covariant cloning machine, for which quantum circuits are known.
Development of an Axisymmetric Afterbody Test Case for Turbulent Flow Separation Validation
NASA Technical Reports Server (NTRS)
Disotell, Kevin J.; Rumsey, Christopher L.
2017-01-01
As identified in the CFD Vision 2030 Study commissioned by NASA, validation of advanced RANS models and scale-resolving methods for computing turbulent flows must be supported by improvements in high-quality experiments designed specifically for CFD implementation. A new test platform referred to as the Axisymmetric Afterbody allows for a range of flow behaviors to be studied on interchangeable afterbodies while facilitating access to higher Reynolds number facilities. A priori RANS computations are reported for a risk-reduction configuration to demonstrate critical variation among turbulence model results for a given afterbody, ranging from barely attached to mildly separated flow. The effects of body nose geometry and tunnel-wall boundary condition on the computed afterbody flow are explored to inform the design of an experimental test program.
NASA Technical Reports Server (NTRS)
Raibstein, A. I.; Kalev, I.; Pipano, A.
1976-01-01
A procedure for the local stiffness modifications of large structures is described. It enables structural modifications without an a priori definition of the changes in the original structure and without loss of efficiency due to multiple loading conditions. The solution procedure, implemented in NASTRAN, involved the decomposed stiffness matrix and the displacement vectors of the original structure. It solves the modified structure exactly, irrespective of the magnitude of the stiffness changes. In order to investigate the efficiency of the present procedure and to test its applicability within a design environment, several real and large structures were solved. The results of the efficiency studies indicate that the break-even point of the procedure varies between 8% and 60% stiffness modifications, depending upon the structure's characteristics and the options employed.
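The exact reanalysis idea behind such a procedure, reusing the decomposed stiffness matrix of the original structure regardless of the magnitude of the local change, can be sketched with the Sherman-Morrison-Woodbury identity. The following is a minimal illustration with invented matrix sizes and values, not the NASTRAN implementation itself:

```python
import numpy as np

# Original stiffness matrix K (symmetric positive definite) and multiple load cases F.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
K = A @ A.T + 6 * np.eye(6)          # stands in for the assembled stiffness
F = rng.standard_normal((6, 3))      # three loading conditions

# Local modification dK = U C U^T touching only DOFs 1 and 4.
U = np.zeros((6, 2)); U[1, 0] = 1.0; U[4, 1] = 1.0
C = np.array([[2.0, -0.5], [-0.5, 1.5]])   # stiffness change on those DOFs

# Reuse solves against the original K via the Woodbury identity:
# (K + U C U^T)^-1 F = K^-1 F - K^-1 U (C^-1 + U^T K^-1 U)^-1 U^T K^-1 F
Kinv_F = np.linalg.solve(K, F)
Kinv_U = np.linalg.solve(K, U)
S = np.linalg.inv(C) + U.T @ Kinv_U        # small 2x2 system only
u_mod = Kinv_F - Kinv_U @ np.linalg.solve(S, U.T @ Kinv_F)

# Exact, irrespective of the magnitude of the stiffness change:
u_direct = np.linalg.solve(K + U @ C @ U.T, F)
assert np.allclose(u_mod, u_direct)
```

Because only a small system of the size of the modified region is solved per change, the approach stays efficient under multiple loading conditions, consistent with the break-even behavior the abstract reports.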
Pas, Elise T; Bradshaw, Catherine P
2012-10-01
Although there is an established literature supporting the efficacy of a variety of prevention programs, there has been less empirical work on the translation of such research to everyday practice or when scaled up state-wide. There is a considerable need for more research on factors that enhance implementation of programs and optimize outcomes, particularly in school settings. The current paper examines how the implementation fidelity of an increasingly popular and widely disseminated prevention model called School-wide Positive Behavioral Interventions and Supports (SW-PBIS) relates to student outcomes within the context of a state-wide scale-up effort. Data come from a scale-up effort of SW-PBIS in Maryland; the sample included 421 elementary and middle schools trained in SW-PBIS. SW-PBIS fidelity, as measured by one of three fidelity measures, was found to be associated with higher math achievement, higher reading achievement, and lower truancy. School contextual factors were related to implementation levels and outcomes. Implications for scale-up efforts of behavioral and mental health interventions and measurement considerations are discussed.
Examining the Association Between Implementation and Outcomes
Pas, Elise T.; Bradshaw, Catherine P.
2012-01-01
Although there is an established literature supporting the efficacy of a variety of prevention programs, there has been less empirical work on the translation of such research to everyday practice or when scaled up state-wide. There is a considerable need for more research on factors that enhance implementation of programs and optimize outcomes, particularly in school settings. The current paper examines how the implementation fidelity of an increasingly popular and widely disseminated prevention model called School-wide Positive Behavioral Interventions and Supports (SW-PBIS) relates to student outcomes within the context of a state-wide scale-up effort. Data come from a scale-up effort of SW-PBIS in Maryland; the sample included 421 elementary and middle schools trained in SW-PBIS. SW-PBIS fidelity, as measured by one of three fidelity measures, was found to be associated with higher math achievement, higher reading achievement, and lower truancy. School contextual factors were related to implementation levels and outcomes. Implications for scale-up efforts of behavioral and mental health interventions and measurement considerations are discussed. PMID:22836758
ERIC Educational Resources Information Center
Schmidt, Sebastian; Troge, Thomas A.; Lorrain, Denis
2013-01-01
A theory of listening to music is proposed. It suggests that, for listeners, the process of prediction is the starting point to experiencing music. This implies that perception of music starts through both a predisposed and an experience-based extrapolation into the future (this is labeled "a priori" listening). Indications for this…
Phase estimation without a priori phase knowledge in the presence of loss
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kolodynski, Jan; Demkowicz-Dobrzanski, Rafal
2010-11-15
We find the optimal scheme for quantum phase estimation in the presence of loss when no a priori knowledge on the estimated phase is available. We prove analytically an explicit lower bound on estimation uncertainty, which shows that, as a function of the number of probes, quantum precision enhancement amounts at most to a constant factor improvement over classical strategies.
Duan, Q.; Schaake, J.; Andreassian, V.; Franks, S.; Goteti, G.; Gupta, H.V.; Gusev, Y.M.; Habets, F.; Hall, A.; Hay, L.; Hogue, T.; Huang, M.; Leavesley, G.; Liang, X.; Nasonova, O.N.; Noilhan, J.; Oudin, L.; Sorooshian, S.; Wagener, T.; Wood, E.F.
2006-01-01
The Model Parameter Estimation Experiment (MOPEX) is an international project aimed at developing enhanced techniques for the a priori estimation of parameters in hydrologic models and in land surface parameterization schemes of atmospheric models. The MOPEX science strategy involves three major steps: data preparation, a priori parameter estimation methodology development, and demonstration of parameter transferability. A comprehensive MOPEX database has been developed that contains historical hydrometeorological data and land surface characteristics data for many hydrologic basins in the United States (US) and in other countries. This database is being continuously expanded to include more basins in all parts of the world. A number of international MOPEX workshops have been convened to bring together interested hydrologists and land surface modelers from all over the world to exchange knowledge and experience in developing a priori parameter estimation techniques. This paper describes the results from the second and third MOPEX workshops. The specific objective of these workshops is to examine the state of a priori parameter estimation techniques and how they can potentially be improved with observations from well-monitored hydrologic basins. Participants of the second and third MOPEX workshops were provided with data from 12 basins in the southeastern US and were asked to carry out a series of numerical experiments using a priori parameters as well as calibrated parameters developed for their respective hydrologic models. Different modeling groups carried out all the required experiments independently using eight different models, and the results from these models have been assembled for analysis in this paper. This paper presents an overview of the MOPEX experiment and its design. The main experimental results are analyzed. A key finding is that existing a priori parameter estimation procedures are problematic and need improvement.
Significant improvement of these procedures may be achieved through model calibration of well-monitored hydrologic basins. This paper concludes with a discussion of the lessons learned, and points out further work and future strategy. ?? 2005 Elsevier Ltd. All rights reserved.
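One common way to construct a priori parameter estimates of the kind studied in MOPEX is regionalization: regress calibrated parameter values from gauged donor basins on basin attributes, then apply the fitted relation to an ungauged basin. The sketch below uses made-up attributes and a synthetic linear relation; it illustrates the general idea, not the MOPEX methodology itself:

```python
import numpy as np

rng = np.random.default_rng(1)

# Attributes of 12 calibrated donor basins: [area_km2, mean_slope, forest_frac]
X = np.column_stack([rng.uniform(50, 500, 12),
                     rng.uniform(0.01, 0.2, 12),
                     rng.uniform(0.1, 0.9, 12)])

# Calibrated model parameter (e.g. a storage coefficient) with a synthetic
# linear dependence on the attributes plus calibration noise.
true_w = np.array([0.001, 2.0, 0.5])
k_calibrated = X @ true_w + 0.3 + 0.02 * rng.standard_normal(12)

# Fit the transfer (regionalization) equation by least squares.
X1 = np.column_stack([X, np.ones(len(X))])   # add intercept column
w, *_ = np.linalg.lstsq(X1, k_calibrated, rcond=None)

# A priori estimate for an ungauged basin from its attributes alone.
x_new = np.array([120.0, 0.05, 0.6, 1.0])
k_apriori = x_new @ w
print(round(float(k_apriori), 3))
```

The quality of such a priori estimates depends on how well the transfer relation learned from well-monitored basins carries over, which is exactly the transferability question the workshops examined.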
A New Self-Constrained Inversion Method of Potential Fields Based on Probability Tomography
NASA Astrophysics Data System (ADS)
Sun, S.; Chen, C.; WANG, H.; Wang, Q.
2014-12-01
The self-constrained inversion method of potential fields uses a priori information self-extracted from the potential field data. Differing from external a priori information, the self-extracted information generally consists of parameters derived exclusively from the analysis of the gravity and magnetic data (Paoletti et al., 2013). Here we develop a new self-constrained inversion method based on probability tomography. Probability tomography requires neither a priori information nor large inversion matrix operations. Moreover, its result can describe the sources clearly and completely, even when their distribution is complex and irregular. We therefore use the a priori information extracted from probability tomography results to constrain the inversion for physical properties. Magnetic anomaly data are taken as an example in this work. The probability tomography result of the magnetic total field anomaly (ΔΤ) shows a smoother distribution than the anomalous source and cannot display the source edges exactly. However, the gradients of ΔΤ have higher resolution than ΔΤ along their respective directions, and this characteristic is also present in their probability tomography results. We therefore use a set of rules to combine the probability tomography results of ∂ΔΤ/∂x, ∂ΔΤ/∂y and ∂ΔΤ/∂z into a new result from which the a priori information is extracted, and then incorporate this information into the model objective function as spatial weighting functions to invert for the final magnetic susceptibility. Synthetic magnetic examples, inverted with and without the a priori information extracted from the probability tomography results, show that the constrained results are more concentrated and resolve the source body edges with higher resolution. The method is finally applied to field-measured ΔΤ data from an iron mine in China and performs well.
References: Paoletti, V., Ialongo, S., Florio, G., Fedi, M. & Cella, F., 2013. Self-constrained inversion of potential fields, Geophys. J. Int. This research is supported by the Fundamental Research Funds for the Institute for Geophysical and Geochemical Exploration, Chinese Academy of Geological Sciences (Grant Nos. WHS201210 and WHS201211).
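The role of spatial weighting functions in the model objective function can be sketched with a linear Tikhonov inversion: cells that a (hypothetical) probability tomography result flags as likely source cells receive a smaller regularization penalty, so the inversion concentrates the recovered property there. Everything below (forward operator, weights, data) is synthetic:

```python
import numpy as np

rng = np.random.default_rng(2)

n_data, n_cells = 30, 50
G = rng.standard_normal((n_data, n_cells))       # synthetic forward operator
m_true = np.zeros(n_cells); m_true[20:25] = 1.0  # compact source body
d = G @ m_true + 0.01 * rng.standard_normal(n_data)

def invert(w_cells, lam=1.0):
    # Minimize ||G m - d||^2 + lam * ||W m||^2 via the normal equations.
    W = np.diag(w_cells)
    return np.linalg.solve(G.T @ G + lam * (W.T @ W), G.T @ d)

# Unconstrained: uniform weights. Constrained: down-weight the penalty
# on cells the a priori information marks as probable source cells.
w_uniform = np.ones(n_cells)
w_apriori = np.ones(n_cells); w_apriori[20:25] = 0.1

m_plain = invert(w_uniform)
m_constrained = invert(w_apriori)

err_plain = np.linalg.norm(m_plain - m_true)
err_constrained = np.linalg.norm(m_constrained - m_true)
print(err_constrained < err_plain)
```

The down-weighted cells are free to take larger values, which is the mechanism behind the more concentrated, better-edged models the abstract reports.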
Nielsen, Matthew E; Birken, Sarah A
2018-05-01
The field of implementation science has been conventionally applied in the context of increasing the application of evidence-based practices into clinical care, given evidence of underusage of appropriate interventions in many settings. Increasingly, however, there is recognition of the potential for similar frameworks to inform efforts to reduce the application of ineffective or potentially harmful practices. In this article, we provide some examples of clinical scenarios in which the quality problem may be overuse and misuse, and review relevant theories and frameworks that may inform improvement activities. Copyright © 2018 Elsevier Inc. All rights reserved.
76 FR 21044 - Notice of Entering Into a Compact With the Republic of Malawi
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-14
...) Operations: MCC funding will support change management efforts, designing human resources strategies... efforts to implement a suitable market model which will include efforts to: (a) Study and design a single... from time to time in accordance with the terms hereof, the ``Program'') to counter a key binding...
De Donno, Giorgio; Cardarelli, Ettore
2017-01-01
In this paper, we present a new code for the modelling and inversion of resistivity and chargeability data that uses a priori information to improve the accuracy of the reconstructed model of a landfill. When a priori information is available in the study area, we can incorporate it by means of inequality constraints on the whole model or on a single layer, or by assigning weighting factors that enhance anomalies elongated in the horizontal or vertical direction. However, when we have to face a multilayered scenario with numerous resistive-to-conductive transitions (the case of controlled landfills), the effective thickness of the layers can be biased. The presented code includes a model-tuning scheme, applied after the inversion of field data, in which synthetic data are inverted starting from an initial guess and the absolute difference between the field and synthetic inverted models is minimized. The reliability of the proposed approach is demonstrated in two real-world examples: we were able to identify an unauthorized landfill and to reconstruct the geometrical and physical layout of an old waste dump. The combined analysis of the resistivity and (normalised) chargeability models helps us to remove ambiguity due to the presence of the waste mass. Nevertheless, the presence of certain layers can remain hidden without using a priori information, as demonstrated by a comparison of the constrained inversion with a standard inversion. The robustness of the above-cited method (using a priori information in combination with model tuning) has been validated against the cross-section from the construction plans, with which the reconstructed model is in agreement. Copyright © 2016 Elsevier Ltd. All rights reserved.
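Imposing inequality constraints from a priori information on part of a linear(ized) inversion can be sketched with scipy's bounded least squares. The operator, model, and bound values below are invented for illustration; the paper's actual code and parameterization are not reproduced here:

```python
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(3)

# Linearized forward operator and synthetic data for a 20-cell model.
G = rng.standard_normal((40, 20))
m_true = np.linspace(1.0, 3.0, 20)
d = G @ m_true

# A priori information as inequality constraints: suppose independent
# information (e.g. construction plans) bounds cells 5..9 to [1.3, 1.9].
lb = np.full(20, -np.inf); ub = np.full(20, np.inf)
lb[5:10] = 1.3; ub[5:10] = 1.9

res = lsq_linear(G, d, bounds=(lb, ub))
m_est = res.x
# The constrained cells respect the a priori bounds by construction.
print(np.all(m_est[5:10] >= 1.3 - 1e-9) and np.all(m_est[5:10] <= 1.9 + 1e-9))
```

Bounds may be applied to the whole model or only to selected layers, mirroring the layer-wise constraints the abstract describes.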
Stucki, Gerold; Zampolini, Mauro; Juocevicius, Alvydas; Negrini, Stefano; Christodoulou, Nicolas
2017-04-01
Since its launch in 2001, relevant international, regional and national PRM bodies have aimed to implement the International Classification of Functioning, Disability and Health (ICF) in Physical and Rehabilitation Medicine (PRM), thereby contributing to the development of suitable practical tools. These tools are available for implementing the ICF in day-to-day clinical practice, for standardized reporting of functioning outcomes in quality management and research, and for guiding evidence-informed policy. Educational efforts have reinforced PRM physicians' and other rehabilitation professionals' ICF knowledge, and numerous implementation projects have explored how the ICF is applied in clinical practice, research and policy. Largely lacking, though, is the system-wide implementation of the ICF in day-to-day practice across all rehabilitation services of national health systems. In Europe, system-wide implementation of the ICF requires interaction between practice, science and governance. Considering its mandate, the UEMS PRM Section and Board have decided to lead a European effort towards system-wide ICF implementation in PRM, rehabilitation and health care at large, in interaction with governments, non-governmental actors and the private sector, and aligned with ISPRM's collaboration plan with WHO. In this paper we present the current PRM internal and external policy agenda towards system-wide ICF implementation and the corresponding implementation action plan, while highlighting priority action steps: promotion of ICF-based standardized reporting in national quality management and assurance programs, development of unambiguous rehabilitation service descriptions using the International Classification System for Service Organization in Health-related Rehabilitation, development of Clinical Assessment Schedules, qualitative linkage and quantitative mapping of data to the ICF, and the cultural adaptation of the ICF Clinical Data Collection Tool in European languages.
Martin-Gill, Christian; Higgins, J Stephen; Van Dongen, Hans P A; Buysse, Daniel J; Thackery, Ronald W; Kupas, Douglas F; Becker, David S; Dean, Bradley E; Lindbeck, George H; Guyette, Francis X; Penner, Josef H; Violanti, John M; Lang, Eddy S; Patterson, P Daniel
2018-02-15
Performance measures are a key component of implementation, dissemination, and evaluation of evidence-based guidelines (EBGs). We developed performance measures for Emergency Medical Services (EMS) stakeholders to enable the implementation of guidelines for fatigue risk management in the EMS setting. Panelists associated with the Fatigue in EMS Project, which was supported by the National Highway Traffic Safety Administration (NHTSA), used an iterative process to develop a draft set of performance measures linked to 5 recommendations for fatigue risk management in EMS. We used a cross-sectional survey design and the Content Validity Index (CVI) to quantify agreement among panelists on the wording and content of draft measures. An anonymous web-based tool was used to solicit the panelists' perceptions of clarity and relevance of draft measures. Panelists rated the clarity and relevance separately for each draft measure on a 4-point scale. CVI scores ≥0.78 for clarity and relevance were specified a priori to signify agreement and completion of measurement development. Panelists judged 5 performance measures for fatigue risk management as clear and relevant. These measures address use of fatigue and/or sleepiness survey instruments, optimal duration of shifts, access to caffeine as a fatigue countermeasure, use of napping during shift work, and the delivery of education and training on fatigue risk management for EMS personnel. Panelists complemented performance measures with suggestions for implementation by EMS agencies. Performance measures for fatigue risk management in the EMS setting will facilitate the implementation and evaluation of the EBG for Fatigue in EMS.
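The Content Validity Index used to quantify panel agreement is commonly computed, at the item level, as the proportion of panelists rating an item 3 or 4 on the 4-point scale; the 0.78 threshold then decides acceptance. A small sketch with hypothetical ratings (the study's actual panel data are not shown here):

```python
def cvi(ratings, threshold=3):
    """Item-level CVI: fraction of panelists rating the item >= threshold
    on a 4-point scale (3 = relevant, 4 = highly relevant)."""
    return sum(r >= threshold for r in ratings) / len(ratings)

# Hypothetical ratings from 9 panelists for one draft performance measure.
relevance = [4, 3, 4, 4, 2, 3, 4, 3, 4]
score = cvi(relevance)
print(round(score, 2))  # 8 of 9 panelists rated >= 3 -> 0.89
accepted = score >= 0.78  # the a priori criterion specified in the study
```

The same computation is applied separately to the clarity ratings, and a measure is iterated until both scores clear the threshold.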
Munce, Sarah; Kastner, Monika; Cramm, Heidi; Lal, Shalini; Deschêne, Sarah-Maude; Auais, Mohammad; Stacey, Dawn; Brouwers, Melissa
2013-09-01
Integrated knowledge translation (IKT) interventions may be one solution to improving the uptake of clinical guidelines. IKT research initiatives are particularly relevant for breast cancer research and for initiatives targeting the implementation of clinical guidelines, where collaboration with an interdisciplinary team of practitioners, patients, caregivers, and policy makers is needed to produce optimum patient outcomes. The objective of this paper was to describe the process of developing an IKT strategy that could be used by guideline developers to improve the uptake of their new clinical practice guidelines on breast cancer screening. An interprofessional group of students as well as two faculty members met six times over three days at the KT Canada Summer Institute in 2011. The team used all of the phases of the action cycle in the Knowledge to Action Framework as an organizing framework. While the entire framework was used, the step involving assessing barriers to knowledge use was judged to be particularly relevant in anticipating implementation problems and being able to inform the specific KT interventions that would be appropriate to mitigate these challenges and to accomplish goals and outcomes. This activity also underscored the importance of group process and teamwork in IKT. We propose that an a priori assessment of barriers to knowledge use (i.e., level and corresponding barriers), along with the other phases of the Knowledge to Action Framework, is a strategic approach for KT strategy development, implementation, and evaluation planning and could be used in the future planning of KT strategies.
Validation of PV-RPM Code in the System Advisor Model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klise, Geoffrey Taylor; Lavrova, Olga; Freeman, Janine
2017-04-01
This paper describes efforts made by Sandia National Laboratories (SNL) and the National Renewable Energy Laboratory (NREL) to validate the SNL-developed PV Reliability Performance Model (PV-RPM) algorithm as implemented in the NREL System Advisor Model (SAM). The PV-RPM model is a library of functions that estimates component failure and repair in a photovoltaic system over a desired simulation period. The failure and repair distributions in this paper are probabilistic representations of component failure and repair based on data collected by SNL for a PV power plant operating in Arizona. The validation effort focuses on whether the failure and repair distributions used in the SAM implementation result in estimated failures that match the expected failures developed in the proof-of-concept implementation. Results indicate that the SAM implementation of PV-RPM provides the same results as the proof-of-concept implementation, indicating the algorithms were reproduced successfully.
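The core of a reliability-performance model of this kind is drawing component failure and repair times from probability distributions and accumulating downtime over the simulation period. The sketch below is a generic alternating failure/repair simulation with invented Weibull and lognormal parameters; it is not the PV-RPM library or its fitted distributions:

```python
import numpy as np

def simulate_downtime(years=25.0, shape=1.4, scale=12.0,
                      repair_mu=np.log(30 / 365), repair_sigma=0.8, seed=0):
    """Alternating failure/repair process for one component.

    Time to failure ~ Weibull(shape, scale) in years; repair duration ~
    lognormal (median about 30 days). Returns (n_failures, downtime_years).
    """
    rng = np.random.default_rng(seed)
    t, failures, down = 0.0, 0, 0.0
    while True:
        t += scale * rng.weibull(shape)        # draw time to next failure
        if t >= years:
            break
        failures += 1
        repair = rng.lognormal(repair_mu, repair_sigma)
        down += min(repair, years - t)         # clip at end of simulation
        t += repair
    return failures, down

n_fail, downtime = simulate_downtime()
availability = 1.0 - downtime / 25.0
print(n_fail, round(availability, 3))
```

Validating two implementations of such a model amounts to checking that, given the same distributions, their simulated failure counts and downtime statistics agree, as the abstract describes.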
Effectiveness and Implementation of Evidence-Based Practices in Residential Care Settings
James, Sigrid; Alemi, Qais; Zepeda, Veronica
2013-01-01
Purpose: Prompted by calls to implement evidence-based practices (EBPs) into residential care settings (RCS), this review addresses three questions: (1) Which EBPs have been tested with children and youth within the context of RCS? (2) What is the evidence for their effectiveness within such settings? (3) What implementation issues arise when transporting EBPs into RCS? Methods: Evidence-based psychosocial interventions and respective outcome studies, published from 1990–2012, were identified through a multi-phase search process, involving the review of four major clearinghouse websites and relevant electronic databases. To be included, effectiveness had to have been previously established through a comparison group design regardless of the setting, and interventions tested subsequently with youth in RCS. All outcome studies were evaluated for quality and bias using a structured appraisal tool. Results: Ten interventions matching a priori criteria were identified: Adolescent Community Reinforcement Approach, Aggression Replacement Training, Dialectical Behavioral Therapy, Ecologically-Based Family Therapy, Eye Movement and Desensitization Therapy, Functional Family Therapy, Multimodal Substance Abuse Prevention, Residential Student Assistance Program, Solution-Focused Brief Therapy, and Trauma Intervention Program for Adjudicated and At-Risk Youth. Interventions were tested in 13 studies, which were conducted in different types of RCS, using a variety of study methods. Outcomes were generally positive, establishing the relative effectiveness of the interventions with youth in RCS across a range of psychosocial outcomes. However, concerns about methodological bias and confounding factors remain. Most studies addressed implementation issues, reporting on treatment adaptations, training and supervision, treatment fidelity and implementation barriers.
Conclusion: The review unearthed a small but important body of knowledge that demonstrates that EBPs can be implemented in RCS with encouraging results. PMID:23606781
Effectiveness and Implementation of Evidence-Based Practices in Residential Care Settings.
James, Sigrid; Alemi, Qais; Zepeda, Veronica
2013-04-01
Prompted by calls to implement evidence-based practices (EBPs) into residential care settings (RCS), this review addresses three questions: (1) Which EBPs have been tested with children and youth within the context of RCS? (2) What is the evidence for their effectiveness within such settings? (3) What implementation issues arise when transporting EBPs into RCS? Evidence-based psychosocial interventions and respective outcome studies, published from 1990-2012, were identified through a multi-phase search process, involving the review of four major clearinghouse websites and relevant electronic databases. To be included, effectiveness had to have been previously established through a comparison group design regardless of the setting, and interventions tested subsequently with youth in RCS. All outcome studies were evaluated for quality and bias using a structured appraisal tool. Ten interventions matching a priori criteria were identified: Adolescent Community Reinforcement Approach, Aggression Replacement Training, Dialectical Behavioral Therapy, Ecologically-Based Family Therapy, Eye Movement and Desensitization Therapy, Functional Family Therapy, Multimodal Substance Abuse Prevention, Residential Student Assistance Program, Solution-Focused Brief Therapy, and Trauma Intervention Program for Adjudicated and At-Risk Youth. Interventions were tested in 13 studies, which were conducted in different types of RCS, using a variety of study methods. Outcomes were generally positive, establishing the relative effectiveness of the interventions with youth in RCS across a range of psychosocial outcomes. However, concerns about methodological bias and confounding factors remain. Most studies addressed implementation issues, reporting on treatment adaptations, training and supervision, treatment fidelity and implementation barriers. The review unearthed a small but important body of knowledge that demonstrates that EBPs can be implemented in RCS with encouraging results.
Global identifiability of linear compartmental models--a computer algebra algorithm.
Audoly, S; D'Angiò, L; Saccomani, M P; Cobelli, C
1998-01-01
A priori global identifiability deals with the uniqueness of the solution for the unknown parameters of a model and is thus a prerequisite for parameter estimation in biological dynamic models. Global identifiability is, however, difficult to test, since it requires solving a system of algebraic nonlinear equations which increases both in nonlinearity degree and in number of terms and unknowns with increasing model order. In this paper, a computer algebra tool, GLOBI (GLOBal Identifiability), is presented, which combines the topological transfer function method with the Buchberger algorithm to test global identifiability of linear compartmental models. GLOBI allows for the automatic testing of a priori global identifiability of general-structure compartmental models from general multi-input multi-output experiments. Examples of the use of GLOBI to analyze the a priori global identifiability of some complex biological compartmental models are provided.
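The transfer-function approach to identifiability can be sketched on a toy two-compartment model: express the observable transfer function coefficients as polynomials in the rate constants and ask a computer algebra system how many parameter sets reproduce them; a unique solution means a priori global identifiability. (GLOBI performs the elimination with the Buchberger/Gröbner-basis algorithm; the sketch below simply lets sympy's solver do it.)

```python
import sympy as sp

k01, k21, k12 = sp.symbols('k01 k21 k12', positive=True)
a, b, c = sp.symbols('a b c', positive=True)

# Two-compartment model, input and observation in compartment 1:
#   x1' = -(k01 + k21) x1 + k12 x2 + u,   x2' = k21 x1 - k12 x2,   y = x1
# Transfer function: H(s) = (s + k12) / (s^2 + (k01 + k21 + k12) s + k01 k12)
# Equate the observable coefficients (the "exhaustive summary") to knowns:
summary = [sp.Eq(k12, a),
           sp.Eq(k01 + k21 + k12, b),
           sp.Eq(k01 * k12, c)]

solutions = sp.solve(summary, [k01, k21, k12], dict=True)
print(len(solutions))  # a unique solution -> a priori globally identifiable
```

For larger models the resulting polynomial systems are exactly the ones whose degree and size grow with model order, which is why Gröbner-basis machinery is needed in practice.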
Mehrotra, Abhishek; Sklar, David P; Tayal, Vivek S; Kocher, Keith E; Handel, Daniel A; Myles Riner, R
2010-12-01
This article is drawn from a report created for the American College of Emergency Physicians (ACEP) Emergency Department (ED) Categorization Task Force and also reflects the proceedings of a breakout session, "Beyond ED Categorization-Matching Networks to Patient Needs," at the 2010 Academic Emergency Medicine consensus conference, "Beyond Regionalization: Integrated Networks of Emergency Care." The authors describe a brief history of the significant national and state efforts at categorization and suggest reasons why many of these efforts failed to persevere or gain wider implementation. The history of efforts to categorize hospital (and ED) emergency services demonstrates recognition of the potential benefits of categorization, but reflects repeated failures to implement full categorization systems or limited excursions into categorization through licensing of EDs or designation of receiving and referral facilities. An understanding of the history of hospital and ED categorization could better inform current efforts to develop categorization schemes and processes. 2010 by the Society for Academic Emergency Medicine.
The ABLe change framework: a conceptual and methodological tool for promoting systems change.
Foster-Fishman, Pennie G; Watson, Erin R
2012-06-01
This paper presents a new approach to the design and implementation of community change efforts like a System of Care. Called the ABLe Change Framework, the model provides simultaneous attention to the content and process of the work, ensuring effective implementation and the pursuit of systems change. Three key strategies are employed in this model to ensure the integration of content and process efforts and effective mobilization of broad scale systems change: Systemic Action Learning Teams, Simple Rules, and Small Wins. In this paper we describe the ABLe Change Framework and present a case study in which we successfully applied this approach to one system of care effort in Michigan.
A Self-Contained Mapping Closure Approximation for Scalar Mixing
2003-12-01
hierarchy in statistical mechanics (Balescu 1975), where the correlations are specified a priori and then fixed. The MCA approach does not invoke ... and thus the scalar fields. Unlike usual treatments in the BBGKY hierarchy (Balescu 1975), where the representations are specified a priori, the ... discussions. This work was supported by the Special Funds for Major Basic Research Project G. 2000077305, P. R. China. REFERENCES: Balescu, R. 1975
ERIC Educational Resources Information Center
Raiche, Gilles; Blais, Jean-Guy
In a computerized adaptive test (CAT), it would be desirable to obtain an acceptable precision of the proficiency level estimate using an optimal number of items. Decreasing the number of items is accompanied, however, by a certain degree of bias when the true proficiency level differs significantly from the a priori estimate. G. Raiche (2000) has…
An algorithmic approach to crustal deformation analysis
NASA Technical Reports Server (NTRS)
Iz, Huseyin Baki
1987-01-01
In recent years the analysis of crustal deformation measurements has become important as a result of current improvements in geodetic methods and an increasing amount of theoretical and observational data provided by several earth sciences. A first-generation data analysis algorithm which combines a priori information with current geodetic measurements was proposed. Relevant methods which can be used in the algorithm were discussed. Prior information is the unifying feature of this algorithm. Some of the problems which may arise through the use of a priori information in the analysis were indicated and preventive measures were demonstrated. The first step in the algorithm is the optimal design of deformation networks. The second step in the algorithm identifies the descriptive model of the deformation field. The final step in the algorithm is the improved estimation of deformation parameters. Although deformation parameters are estimated in the process of model discrimination, they can further be improved by the use of a priori information about them. According to the proposed algorithm this information must first be tested against the estimates calculated using the sample data only. Null-hypothesis testing procedures were developed for this purpose. Six different estimators which employ a priori information were examined. Emphasis was put on the case when the prior information is wrong and analytical expressions for possible improvements under incompatible prior information were derived.
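The compatibility test between prior information and the sample-only estimate can be sketched as a chi-square test on the difference of the two estimates: under the null hypothesis that the prior is correct (and assuming the two estimates are independent), the quadratic form of the difference weighted by the combined covariance follows a χ² distribution. The parameter values and covariances below are invented:

```python
import numpy as np
from scipy.stats import chi2

# Sample-only estimate of deformation parameters (e.g. strain components)
x_hat = np.array([1.2e-6, -0.4e-6])
Sigma_hat = np.diag([0.1e-6, 0.1e-6]) ** 2

# A priori values with their covariance
x_prior = np.array([1.0e-6, -0.5e-6])
Sigma_prior = np.diag([0.2e-6, 0.2e-6]) ** 2

# Test statistic: d^T (Sigma_hat + Sigma_prior)^-1 d ~ chi2(k) under H0.
d = x_hat - x_prior
T = float(d @ np.linalg.solve(Sigma_hat + Sigma_prior, d))
k = len(d)
compatible = T <= chi2.ppf(0.95, k)
print(round(T, 2), compatible)  # -> 1.0 True
```

Only when the prior passes such a test is it combined with the sample estimate; an incompatible prior would degrade rather than improve the deformation parameters, which is the failure mode the abstract analyzes.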
D'Alessandro, Annunziata; De Pergola, Giovanni
2018-01-18
We analysed the definition of the Mediterranean Diet in 28 studies included in six meta-analyses evaluating the relation between the Mediterranean Diet and primary prevention of cardiovascular disease. Some typical foods of this dietary pattern, such as whole cereals, olive oil and red wine, were taken into account in only a few a priori indexes, and the dietary pattern defined as Mediterranean showed many differences among the studies and compared with the traditional Mediterranean Diet of the early 1960s. Altogether, the analysed studies show a protective effect of the Mediterranean Diet against cardiovascular disease, but they report different effects against specific conditions such as cerebrovascular disease and coronary heart disease. These differences might depend on the definition of the Mediterranean Diet and on the indexes used to measure adherence to it. To compare the effects of the Mediterranean Diet on cardiovascular disease, coronary heart disease and stroke, a univocal model of the Mediterranean Diet should be established as a reference; it might be represented by the Modern Mediterranean Diet Pyramid. The a priori index used to evaluate adherence to the Mediterranean Diet might be the Mediterranean-Style Dietary Pattern Score, which has some advantages over the other a priori indexes.
NASA Astrophysics Data System (ADS)
Dolloff, John; Hottel, Bryant; Edwards, David; Theiss, Henry; Braun, Aaron
2017-05-01
This paper presents an overview of the Full Motion Video-Geopositioning Test Bed (FMV-GTB) developed to investigate algorithm performance and issues related to the registration of motion imagery and subsequent extraction of feature locations along with predicted accuracy. A case study is included corresponding to a video taken from a quadcopter. Registration of the corresponding video frames is performed without the benefit of a priori sensor attitude (pointing) information. In particular, tie points are automatically measured between adjacent frames using standard optical flow matching techniques from computer vision, an a priori estimate of sensor attitude is then computed based on supplied GPS sensor positions contained in the video metadata and a photogrammetric/search-based structure from motion algorithm, and then a Weighted Least Squares adjustment of all a priori metadata across the frames is performed. Extraction of absolute 3D feature locations, including their predicted accuracy based on the principles of rigorous error propagation, is then performed using a subset of the registered frames. Results are compared to known locations (check points) over a test site. Throughout this entire process, no external control information (e.g. surveyed points) is used other than for evaluation of solution errors and corresponding accuracy.
NASA Astrophysics Data System (ADS)
Schneiderbauer, Simon; Saeedipour, Mahdi
2018-02-01
Highly resolved two-fluid model (TFM) simulations of gas-solid flows in vertical periodic channels have been performed to study closures for the filtered drag force and the Reynolds-stress-like contribution stemming from the convective terms. An approximate deconvolution model (ADM) for the large-eddy simulation of turbulent gas-solid suspensions is detailed and subsequently used to reconstruct those unresolved contributions in an a priori manner. With such an approach, an approximation of the unfiltered solution is obtained by repeated filtering allowing the determination of the unclosed terms of the filtered equations directly. A priori filtering shows that predictions of the ADM model yield fairly good agreement with the fine grid TFM simulations for various filter sizes and different particle sizes. In particular, strong positive correlation (ρ > 0.98) is observed at intermediate filter sizes for all sub-grid terms. Additionally, our study reveals that the ADM results moderately depend on the choice of the filters, such as box and Gaussian filter, as well as the deconvolution order. The a priori test finally reveals that ADM is superior compared to isotropic functional closures proposed recently [S. Schneiderbauer, "A spatially-averaged two-fluid model for dense large-scale gas-solid flows," AIChE J. 63, 3544-3562 (2017)].
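The reconstruction-by-repeated-filtering idea described above can be sketched with a truncated van Cittert series on a one-dimensional signal (an illustrative sketch, not the authors' code; the box filter stands in for the TFM grid filter, and the filter size, deconvolution order, and test signal are assumed):

```python
import numpy as np
from scipy.ndimage import uniform_filter1d

def box_filter(u, size):
    """Periodic top-hat (box) filter, standing in for the grid filter."""
    return uniform_filter1d(u, size, mode="wrap")

def adm_reconstruct(u_bar, size, order):
    """Truncated van Cittert series: u* = sum_{k=0}^{order} (I - G)^k u_bar,
    an approximation of the unfiltered field built by repeated filtering."""
    u_star = np.zeros_like(u_bar)
    term = u_bar.copy()
    for _ in range(order + 1):
        u_star += term
        term = term - box_filter(term, size)
    return u_star

# Synthetic periodic "fine-grid" field with small-scale content
n = 512
x = 2.0 * np.pi * np.arange(n) / n
u = np.sin(x) + 0.3 * np.sin(12.0 * x)
u_bar = box_filter(u, 15)                    # resolved (filtered) field
u_star = adm_reconstruct(u_bar, 15, order=5)

err_filtered = np.linalg.norm(u - u_bar)
err_adm = np.linalg.norm(u - u_star)
assert err_adm < err_filtered                # deconvolution recovers lost content
```

Once an approximate unfiltered field is available, the unclosed terms of the filtered equations can be evaluated directly from it, which is the essence of the a priori test described in the abstract.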
Malinoski, Darren J; Patel, Madhukar S; Daly, Michael C; Oley-Graybill, Chrystal; Salim, Ali
2012-10-01
Many organ procurement organizations have implemented critical care end points as donor management goals in efforts to increase organs transplanted per donor after neurologic determination of death. Although retrospective studies have demonstrated an association between meeting donor management goals and organ yield, prospective studies are lacking. In June 2008, nine donor management goals were prospectively implemented as a checklist and every donor after neurologic determination of death was managed to meet them. The donor management goals represented normal cardiovascular, pulmonary, renal, and endocrine end points. Data were collected for 7 months. Donor management goals "met" was defined a priori as achieving any seven of the nine donor management goals, and this was recorded at the time of consent, 12-18 hrs later, and prior to organ recovery. The primary outcome measure was ≥4 organs transplanted per donor, and binary logistic regression was used to identify independent predictors of this outcome with a p<.05. All eight organ procurement organizations in the five Southwestern United States (United Network for Organ Sharing Region 5). All standard criteria donors after neurologic determination of death. Prospective implementation of a donor management goal checklist. There were 380 standard criteria donors with 3.6±1.7 organs transplanted per donor. Fifteen percent had donor management goals met at the time of consent, 33% at 12-18 hrs, and 38% prior to organ recovery. Forty-eight percent had ≥4 organs transplanted per donor. Donors with ≥4 organs transplanted per donor had significantly more individual donor management goals met at all three time points.
Independent predictors of ≥4 organs transplanted per donor were age (odds ratio=0.95 per year), final creatinine (odds ratio=0.75 per 1-unit increase), donor management goals "met" at consent (odds ratio=2.03), donor management goals "met" prior to organ recovery (odds ratio=2.34), and a change in the number of donor management goals achieved from consent to 12-18 hrs later (odds ratio=1.13 per additional donor management goal). Meeting donor management goals prior to consent and prior to organ recovery were both associated with achieving ≥4 organs transplanted per donor. However, only 15% of donors have donor management goals met at the time of consent. The donor hospital management of patients with catastrophic brain injuries, before the intent to donate organs is known, affects outcomes and should remain a priority in the intensive care unit.
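The binary logistic regression used to identify independent predictors, with effects reported as odds ratios, can be illustrated with a small sketch (synthetic data with assumed effect sizes, not the study's dataset; `logit_fit` is a hypothetical helper using Newton-Raphson with Wald p-values):

```python
import numpy as np
from math import erfc, sqrt

def logit_fit(X, y, iters=25):
    """Binary logistic regression via Newton-Raphson, returning the
    coefficients (intercept first) and two-sided Wald p-values."""
    X = np.column_stack([np.ones(len(X)), X])
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1.0 - p)
        H = X.T @ (X * W[:, None])            # observed information matrix
        beta += np.linalg.solve(H, X.T @ (y - p))
    se = np.sqrt(np.diag(np.linalg.inv(H)))
    pvals = np.array([erfc(abs(b / s) / sqrt(2)) for b, s in zip(beta, se)])
    return beta, pvals

# Synthetic "donors": age and a binary goals-met flag drive the outcome
rng = np.random.default_rng(2)
n = 380
age = rng.uniform(18.0, 70.0, n)
goals_met = (rng.random(n) < 0.38).astype(float)
logit = 2.0 - 0.05 * age + 0.7 * goals_met    # assumed true effects
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

beta, pvals = logit_fit(np.column_stack([age, goals_met]), y)
odds_ratios = np.exp(beta[1:])                # per-unit odds ratios
assert odds_ratios[0] < 1.0                   # older age lowers the odds
assert pvals[1] < 0.05                        # age is a significant predictor
```

An odds ratio below 1 (as for age above, and for final creatinine in the study) means each unit increase reduces the odds of transplanting ≥4 organs; an odds ratio above 1 (as for goals "met") increases them.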
Rosen, C S; Matthieu, M M; Wiltsey Stirman, S; Cook, J M; Landes, S; Bernardy, N C; Chard, K M; Crowley, J; Eftekhari, A; Finley, E P; Hamblen, J L; Harik, J M; Kehle-Forbes, S M; Meis, L A; Osei-Bonsu, P E; Rodriguez, A L; Ruggiero, K J; Ruzek, J I; Smith, B N; Trent, L; Watts, B V
2016-11-01
Since 2006, the Veterans Health Administration (VHA) has instituted policy changes and training programs to support system-wide implementation of two evidence-based psychotherapies (EBPs) for posttraumatic stress disorder (PTSD). To assess lessons learned from this unprecedented effort, we used PubMed and the PILOTS databases and networking with researchers to identify 32 reports on contextual influences on implementation or sustainment of EBPs for PTSD in VHA settings. Findings were initially organized using the exploration, planning, implementation, and sustainment framework (EPIS; Aarons et al. in Adm Policy Ment Health Health Serv Res 38:4-23, 2011). Results that could not be adequately captured within the EPIS framework, such as implementation outcomes and adopter beliefs about the innovation, were coded using constructs from the reach, effectiveness, adoption, implementation, maintenance (RE-AIM) framework (Glasgow et al. in Am J Public Health 89:1322-1327, 1999) and Consolidated Framework for Implementation Research (CFIR; Damschroder et al. in Implement Sci 4(1):50, 2009). We highlight key areas of progress in implementation, identify continuing challenges and research questions, and discuss implications for future efforts to promote EBPs in large health care systems.
ADAPTIVE FINITE ELEMENT MODELING TECHNIQUES FOR THE POISSON-BOLTZMANN EQUATION
HOLST, MICHAEL; MCCAMMON, JAMES ANDREW; YU, ZEYUN; ZHOU, YOUNGCHENG; ZHU, YUNRONG
2011-01-01
We consider the design of an effective and reliable adaptive finite element method (AFEM) for the nonlinear Poisson-Boltzmann equation (PBE). We first examine the two-term regularization technique for the continuous problem recently proposed by Chen, Holst, and Xu based on the removal of the singular electrostatic potential inside biomolecules; this technique made possible the development of the first complete solution and approximation theory for the Poisson-Boltzmann equation, the first provably convergent discretization, and also allowed for the development of a provably convergent AFEM. However, in practical implementation, this two-term regularization exhibits numerical instability. Therefore, we examine a variation of this regularization technique which can be shown to be less susceptible to such instability. We establish a priori estimates and other basic results for the continuous regularized problem, as well as for Galerkin finite element approximations. We show that the new approach produces regularized continuous and discrete problems with the same mathematical advantages of the original regularization. We then design an AFEM scheme for the new regularized problem, and show that the resulting AFEM scheme is accurate and reliable, by proving a contraction result for the error. This result, which is one of the first results of this type for nonlinear elliptic problems, is based on using continuous and discrete a priori L∞ estimates to establish quasi-orthogonality. To provide a high-quality geometric model as input to the AFEM algorithm, we also describe a class of feature-preserving adaptive mesh generation algorithms designed specifically for constructing meshes of biomolecular structures, based on the intrinsic local structure tensor of the molecular surface. All of the algorithms described in the article are implemented in the Finite Element Toolkit (FETK), developed and maintained at UCSD. 
The stability advantages of the new regularization scheme are demonstrated with FETK through comparisons with the original regularization approach for a model problem. The convergence and accuracy of the overall AFEM algorithm is also illustrated by numerical approximation of electrostatic solvation energy for an insulin protein. PMID:21949541
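For orientation, the two-term regularization referred to above is commonly written as a splitting of the potential u into a singular Coulomb part G and a regular remainder (a sketch of the standard form from the literature; notation and boundary treatment vary by presentation):

```latex
u = G + u^{r}, \qquad
-\epsilon_m \,\Delta G = \sum_i q_i \,\delta_{x_i}, \\
-\nabla\cdot\bigl(\epsilon\,\nabla u^{r}\bigr)
  + \bar{\kappa}^{2}\,\sinh\bigl(u^{r} + G\bigr)
  = \nabla\cdot\bigl((\epsilon - \epsilon_m)\,\nabla G\bigr),
```

where \epsilon_m is the solute dielectric constant, \epsilon the spatially varying dielectric coefficient, and \bar{\kappa} the modified Debye-Hückel parameter. The point-charge singularities are carried entirely by the analytically known G, so the unknown u^{r} is regular enough for finite element approximation.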
ERIC Educational Resources Information Center
Dariotis, Jacinda K.; Bumbarger, Brian K.; Duncan, Larissa G.; Greenberg, Mark T.
2008-01-01
Widespread replications of evidence-based prevention programs (EBPPs) prompt prevention scientists to examine program implementation adherence in real world settings. Based on Chen's model (1990), we identified five key factors of the implementation system and assessed which characteristics related to program adherence. The sample included 32…
MRP (materiel requirements planning) II: successful implementation the hard way.
Grubbs, S C
1994-05-01
Many manufacturing companies embark on MRP II implementation projects as a method for improvement. In spite of an increasing body of knowledge regarding successful implementations, companies continue to attempt new approaches. This article reviews an actual implementation, featuring some of the mistakes made and the efforts required to still achieve "Class A" performance levels.
Psychosocial Pain Management Moderation: The Limit, Activate, and Enhance Model.
Day, Melissa A; Ehde, Dawn M; Jensen, Mark P
2015-10-01
There is a growing emphasis in the pain literature on understanding the following second-order research questions: Why do psychosocial pain treatments work? For whom do various treatments work? This critical review summarizes research that addresses the latter question and proposes a moderation model to help guide future research. A theoretical moderation framework for matching individuals to specific psychosocial pain interventions has been lacking. However, several such frameworks have been proposed in the broad psychotherapy and implementation science literature. Drawing on these theories and adapting them specifically for psychosocial pain treatment, here we propose a Limit, Activate, and Enhance model of pain treatment moderation. This model is unique in that it includes algorithms not only for matching treatments on the basis of patient weaknesses but also for directing patients to interventions that build on their strengths. Critically, this model provides a basis for specific a priori hypothesis generation, and a selection of the possible hypotheses drawn from the model is proposed and discussed. Future research considerations are presented that could refine and expand the model based on theoretically driven empirical evidence. The Limit, Activate, and Enhance model presented here is a theoretically derived framework that provides an a priori basis for hypothesis generation regarding psychosocial pain treatment moderators. The model will advance moderation research via its unique focus on matching patients to specific treatments that (1) limit maladaptive responses, (2) activate adaptive responses, and (3) enhance treatment outcomes based on patient strengths and resources. Copyright © 2015 American Pain Society. Published by Elsevier Inc. All rights reserved.
SCGICAR: Spatial concatenation based group ICA with reference for fMRI data analysis.
Shi, Yuhu; Zeng, Weiming; Wang, Nizhuan
2017-09-01
With the rapid development of big data, multi-subject analysis of functional magnetic resonance imaging (fMRI) data is becoming more and more important. As a blind source separation technique, group independent component analysis (GICA) has been widely applied to multi-subject fMRI data analysis. However, spatially concatenated GICA is rarely used compared with temporally concatenated GICA because of its disadvantages. In this paper, to overcome these issues, and considering that the performance of GICA for fMRI data analysis can be improved by adding a priori information, we propose a novel spatial concatenation based GICA with reference (SCGICAR) method that takes advantage of a priori information extracted from the group subjects; a multi-objective optimization strategy is then used to implement the method. Finally, post-processing by principal component analysis and anti-reconstruction is used to obtain the group spatial component and the individual temporal components within the group, respectively. The experimental results show that the proposed SCGICAR method outperforms classical methods on both single-subject and multi-subject fMRI data analysis. It not only detects more accurate spatial and temporal components for each subject of the group but also obtains a better group component in both the temporal and spatial domains. These results demonstrate that the proposed SCGICAR method has advantages over classical methods and better reflects the commonness of subjects in the group. Copyright © 2017 Elsevier B.V. All rights reserved.
Groundspeed filtering for CTAS
NASA Technical Reports Server (NTRS)
Slater, Gary L.
1994-01-01
Ground speed is one of the radar observables obtained, along with position and heading, from NASA Ames Center radar. Within the Center TRACON Automation System (CTAS), groundspeed is converted into airspeed using the wind speeds that CTAS obtains from the NOAA weather grid. This airspeed is then used in the trajectory synthesis logic, which computes the trajectory for each individual aircraft. The time history of typical radar groundspeed data is generally quite noisy, with high-frequency variations on the order of five knots and occasional 'outliers' that can differ significantly from the probable true speed. To smooth these speeds and make the ETA estimate less erratic, the ground speed is filtered within CTAS. In its base form, the CTAS filter is a 'moving average' filter that averages the last ten radar values. In addition, there is separate logic to detect and correct for 'outliers', and acceleration logic that limits the groundspeed change between adjacent time samples. As will be shown, these additional modifications cause significant changes in the actual groundspeed filter output. The conclusion is that the current groundspeed filter logic is unable to accurately track the speed variations observed on many aircraft. Kalman filter logic, however, appears to be an improvement over the current algorithm used to smooth groundspeed variations, while being simpler and more efficient to implement. Additional logic to test for true 'outliers' can easily be added by examining the difference between the a priori and a posteriori Kalman estimates, and not updating if the difference is too large.
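The filtering approach described above can be sketched as follows (a minimal illustration, not CTAS code; the process noise `q`, measurement noise `r`, and three-sigma gate are assumed tuning values). The outlier test follows the idea in the abstract: reject a measurement when it differs too much from the a priori prediction:

```python
import numpy as np

def kalman_groundspeed(z, q=4.0, r=25.0, gate=3.0):
    """Scalar Kalman filter for groundspeed with an innovation-based
    outlier test: if a measurement differs from the a priori prediction
    by more than `gate` standard deviations of the innovation, it is
    rejected and no update is performed."""
    x, p = z[0], r                    # initialize from the first measurement
    est = [x]
    for meas in z[1:]:
        p = p + q                     # predict (constant-speed model)
        nu = meas - x                 # innovation (a priori residual)
        s = p + r                     # innovation variance
        if abs(nu) > gate * np.sqrt(s):
            est.append(x)             # outlier: keep the a priori estimate
            continue
        k = p / s                     # Kalman gain
        x = x + k * nu
        p = (1.0 - k) * p
        est.append(x)
    return np.array(est)

# Noisy speeds around 250 kt with a single radar outlier
rng = np.random.default_rng(0)
z = 250.0 + rng.normal(0.0, 5.0, 60)
z[30] += 80.0                          # inject an outlier
est = kalman_groundspeed(z)
assert est[30] == est[29]              # the outlier is rejected outright
assert abs(est[-1] - 250.0) < 10.0     # estimate tracks the true speed
```

Unlike a ten-sample moving average, the Kalman filter keeps only a scalar state and variance per aircraft, and the innovation test gives a statistically principled version of the ad hoc outlier logic.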
Sullivan, Jennifer L.; Adjognon, Omonyêlé L.; Engle, Ryann L.; Shin, Marlena H.; Afable, Melissa K.; Rudin, Whitney; White, Bert; Shay, Kenneth; Lukas, Carol VanDeusen
2018-01-01
Background: From 2010 to 2013, the Department of Veterans Affairs (VA) funded a large pilot initiative to implement noninstitutional long-term services and supports (LTSS) programs to support aging Veterans. Our team evaluated implementation of 59 VA noninstitutional LTSS programs. Purpose: The specific objectives of this study are to (a) examine the challenges influencing program implementation comparing active sites that remained open and inactive sites that closed during the funding period and (b) identify ways that active sites overcame the challenges they experienced. Methodology: Key informant semistructured interviews occurred between 2011 and 2013. We conducted 217 telephone interviews over four time points. Content analysis was used to identify emergent themes. The study team met regularly to define each challenge, review all codes, and discuss discrepancies. For each follow-up interview with the sites, the list of established challenges was used as a priori themes. Emergent data were also coded. Results: The challenges affecting implementation included human resources and staffing issues, infrastructure, resource allocation and geography, referrals and marketing, leadership support, and team dynamics and processes. Programs were able to overcome challenges by communicating with team members and other areas in the organization, utilizing information technology solutions, creative use of staff and flexible schedules, and obtaining additional resources. Discussion: This study highlights several common challenges programs can address during program implementation. The most often mentioned strategy was effective communication.
Strategies also targeted several components of the organization including organizational functions and processes (e.g., importance of coordination within a team and across disciplines to provide good care), infrastructure (e.g., information technology and human resources), and program fit with priorities in the organization (e.g., leadership support). Implications: Anticipating potential pitfalls of program implementation for future noninstitutional LTSS programs can improve implementation efficiency and program sustainability. Staff at multiple levels in the organization must fully support noninstitutional LTSS programs to address these challenges. PMID:28125459
Flieger, Signe Peterson
2017-01-01
Background This study explores the implementation experience of nine primary care practices becoming patient-centered medical homes (PCMH) as part of the New Hampshire Citizens Health Initiative Multi-Stakeholder Medical Home Pilot. Purpose The purpose of this study is to apply complex adaptive systems theory and relationship-centered organizations theory to explore how nine diverse primary care practices in New Hampshire implemented the PCMH model and to offer insights for how primary care practices can move from a structural PCMH to a relationship-centered PCMH. Methodology/Approach Eighty-three interviews were conducted with administrative and clinical staff at the nine pilot practices, payers, and conveners of the pilot between November and December 2011. The interviews were transcribed, coded, and analyzed using both a priori and emergent themes. Findings Although there is value in the structural components of the PCMH (e.g., disease registries), these structures are not enough. Becoming a relationship-centered PCMH requires attention to reflection, sensemaking, learning, and collaboration. This can be facilitated by setting aside time for communication and relationship building through structured meetings about PCMH components as well as the implementation process itself. Moreover, team-based care offers a robust opportunity to move beyond the structures to focus on relationships and collaboration. Practice Implications (a) Recognize that PCMH implementation is not a linear process. (b) Implementing the PCMH from a structural perspective is not enough. Although the National Committee for Quality Assurance or other guidelines can offer guidance on the structural components of PCMH implementation, this should serve only as a starting point. (c) During implementation, set aside structured time for reflection and sensemaking. (d) Use team-based care as a cornerstone of transformation. Reflect on team structures and also interactions of the team members.
Taking the time to reflect will facilitate greater sensemaking and learning and will ultimately help foster a relationship-centered PCMH. PMID:26939031
Implementation of a research prototype onboard fault monitoring and diagnosis system
NASA Technical Reports Server (NTRS)
Palmer, Michael T.; Abbott, Kathy H.; Schutte, Paul C.; Ricks, Wendell R.
1987-01-01
Due to the dynamic and complex nature of in-flight fault monitoring and diagnosis, a research effort was undertaken at NASA Langley Research Center to investigate the application of artificial intelligence techniques for improved situational awareness. Under this research effort, concepts were developed and a software architecture was designed to address the complexities of onboard monitoring and diagnosis. This paper describes the implementation of these concepts in a computer program called FaultFinder. The implementation of the monitoring, diagnosis, and interface functions as separate modules is discussed, as well as the blackboard designed for the communication of these modules. Some related issues concerning the future installation of FaultFinder in an aircraft are also discussed.
MR-based keyhole SPECT for small animal imaging
Lee, Keum Sil; Roeck, Werner W; Gullberg, Grant T; Nalcioglu, Orhan
2011-01-01
The rationale for multi-modality imaging is to integrate the strengths of different imaging technologies while reducing the shortcomings of an individual modality. The work presented here proposes a limited-field-of-view (LFOV) SPECT reconstruction technique that can be implemented on a multi-modality MR/SPECT system to obtain simultaneous MRI and SPECT images for small animal imaging. The reason for using a combined MR/SPECT system in this work is to eliminate any possible misregistration between the two sets of images when MR images are used as a priori information for SPECT. In nuclear imaging the target area is usually smaller than the entire object; thus, focusing the detector on the LFOV offers various advantages, including the use of a smaller nuclear detector (lower cost), a smaller reconstruction region (faster reconstruction) and higher spatial resolution when used in conjunction with pinhole collimators with magnification. The MR/SPECT system can be used to choose a region of interest (ROI) for SPECT. A priori information obtained from the full field-of-view (FOV) MRI, combined with the preliminary SPECT image, can be used to reduce the dimensions of the SPECT reconstruction by limiting the computation to the smaller FOV while reducing artifacts resulting from the truncated data. Since the technique is based on SPECT imaging within the LFOV, it will be called the keyhole SPECT (K-SPECT) method. First, MRI images of the entire object are obtained using a larger FOV to determine the location of the ROI covering the target organ. Once the ROI is determined, the animal is moved inside the radiofrequency (rf) coil to bring the target area inside the LFOV, and then simultaneous MRI and SPECT are performed. The spatial resolution of the SPECT image is improved by employing a pinhole collimator with magnification >1, with carefully calculated acceptance angles for each pinhole to avoid multiplexing.
In our design all the pinholes are focused at the center of the LFOV. K-SPECT reconstruction is accomplished by generating an adaptive weighting matrix using a priori information obtained from simultaneously acquired MR images together with the radioactivity distribution in the ROI of a SPECT image reconstructed without any a priori input. Preliminary simulations with numerical phantoms show that the resolution of the SPECT image within the LFOV is improved while artifacts arising from parts of the object outside the LFOV are minimized, owing to the chosen magnification and the new reconstruction technique. The root-mean-square error (RMSE) in the out-of-field artifacts was reduced by 60% for spherical phantoms using the K-SPECT reconstruction technique and by 48.5–52.6% for the heart in the case of the MOBY phantom. The K-SPECT reconstruction technique significantly improved spatial resolution and quantification while reducing artifacts from contributions outside the LFOV, as well as reducing the dimension of the reconstruction matrix. PMID:21220840
Ohri-Vachaspati, Punam; Turner, Lindsey; Chaloupka, Frank J
2012-10-01
The availability of competitive foods in schools is a modifiable factor in efforts to prevent childhood obesity. The Alliance for a Healthier Generation launched the Healthy Schools Program in 2006 to encourage schools to create healthier food environments, including the adoption of nutritional guidelines for competitive beverages and foods. This study examines nationwide awareness and implementation of the guidelines in US public elementary schools. Data were collected from a nationally representative sample of elementary schools using mail-back surveys in 2006-2007, 2007-2008, 2008-2009, and 2009-2010. From 2006-2007 to 2009-2010, awareness of the Alliance's beverage guidelines increased from 35.0% to 51.8% among school administrators (p < .01); awareness of the food guidelines increased from 29.4% to 40.2% (p < .01). By 2009-2010, almost one third of the schools that sold competitive beverages and foods reported having implemented or being in the process of implementing the guidelines. Implementation was higher among schools from Southern states. Schools with a majority of Black or Latino students were less likely to implement the guidelines. Awareness and implementation of the Alliance's beverage and food guidelines have significantly increased since the 2006-2007 school year, indicating successful diffusion of the guidelines. However, many administrators at schools that sold competitive products were not aware of the guidelines, indicating a need for continued efforts. In addition, lower implementation among schools serving minority students suggests that the Alliance's targeted efforts to provide intensive technical assistance to such schools are warranted and necessary. © 2012, American School Health Association.
Global a priori estimates for the inhomogeneous Landau equation with moderately soft potentials
NASA Astrophysics Data System (ADS)
Cameron, Stephen; Silvestre, Luis; Snelson, Stanley
2018-05-01
We establish a priori upper bounds for solutions to the spatially inhomogeneous Landau equation in the case of moderately soft potentials, with arbitrary initial data, under the assumption that mass, energy and entropy densities stay under control. Our pointwise estimates decay polynomially in the velocity variable. We also show that if the initial data satisfies a Gaussian upper bound, this bound is propagated for all positive times.
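Stated schematically, the conditional assumption and the shape of the resulting bounds take the following form (a sketch of the standard setup only; the constants, exponents and precise dependencies here are placeholders, not the paper's theorem):

```latex
% Hydrodynamic quantities assumed controlled, uniformly in (t,x):
0 < m_0 \le \int_{\mathbb{R}^3} f(t,x,v)\,\mathrm{d}v \le M_0, \quad
\int_{\mathbb{R}^3} |v|^2 f(t,x,v)\,\mathrm{d}v \le E_0, \quad
\int_{\mathbb{R}^3} f \log f\,\mathrm{d}v \le H_0.

% Shape of the pointwise estimates (polynomial decay in the velocity):
f(t,x,v) \le C \,(1 + |v|)^{-q},

% and propagation of Gaussian upper bounds:
f(0,x,v) \le C_0\, e^{-\alpha |v|^2}
\;\Longrightarrow\;
f(t,x,v) \le C_1\, e^{-\alpha' |v|^2} \quad \text{for all } t > 0.
```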
Optimism as a Prior Belief about the Probability of Future Reward
Kalra, Aditi; Seriès, Peggy
2014-01-01
Optimists hold positive a priori beliefs about the future. In Bayesian statistical theory, a priori beliefs can be overcome by experience. However, optimistic beliefs can at times appear surprisingly resistant to evidence, suggesting that optimism might also influence how new information is selected and learned. Here, we use a novel Pavlovian conditioning task, embedded in a normative framework, to directly assess how trait optimism, as classically measured using self-report questionnaires, influences choices between visual targets as learning about their association with reward progresses. We find that trait optimism relates to an a priori belief about the likelihood of rewards, but not losses, in our task. Critically, this positive belief behaves like a probabilistic prior, i.e. its influence reduces with increasing experience. Contrary to findings in the literature related to unrealistic optimism and self-beliefs, it does not appear to influence the iterative learning process directly. PMID:24853098
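The key behavior reported here, a prior whose influence wanes as evidence accumulates, mirrors textbook Bayesian updating. A minimal Beta-Bernoulli sketch (illustrative only; this is not the study's model, and the prior and reward-rate numbers are hypothetical):

```python
# Beta-Bernoulli updating: an "optimistic" prior Beta(a, b) with a > b
# is gradually overwhelmed by observed 0/1 reward outcomes.

def posterior_mean(a, b, outcomes):
    """Posterior mean of the reward probability under a Beta(a, b) prior."""
    a_post = a + sum(outcomes)
    b_post = b + len(outcomes) - sum(outcomes)
    return a_post / (a_post + b_post)

# Optimistic prior: expects reward with probability 8/(8+2) = 0.8.
prior_a, prior_b = 8.0, 2.0

# True reward rate is ~0.25-0.3; with more data the estimate approaches it.
outcomes_few  = [1, 0, 0]                      # 3 trials
outcomes_many = [1, 0, 0] + [0, 1, 0, 0] * 10  # 43 trials

few  = posterior_mean(prior_a, prior_b, outcomes_few)   # prior dominates
many = posterior_mean(prior_a, prior_b, outcomes_many)  # data dominate
print(few, many)
```

With few trials the estimate stays near the optimistic prior mean; with many trials it converges toward the empirical rate, i.e. the prior's influence reduces with increasing experience.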
Quantum information and the problem of mechanisms of biological evolution.
Melkikh, Alexey V
2014-01-01
One of the most important conditions for replication in early evolution is the de facto elimination of the conformational degrees of freedom of the replicators, the mechanisms of which remain unclear. In addition, realistic evolutionary timescales can be established based only on partially directed evolution, further complicating this issue. A division of the various evolutionary theories into two classes has been proposed based on the presence or absence of a priori information about the evolving system. A priori information plays a key role in solving problems in evolution. Here, a model of partially directed evolution, based on the learning automata theory, which includes a priori information about the fitness space, is proposed. A potential repository of such prior information is the states of biologically important molecules. Thus, the need for extended evolutionary synthesis is discussed. Experiments to test the hypothesis of partially directed evolution are proposed. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
A gantry-based tri-modality system for bioluminescence tomography
Yan, Han; Lin, Yuting; Barber, William C.; Unlu, Mehmet Burcin; Gulsen, Gultekin
2012-01-01
A gantry-based tri-modality system that combines bioluminescence (BLT), diffuse optical (DOT), and x-ray computed tomography (XCT) in the same setting is presented here. The purpose of this system is to perform bioluminescence tomography using a multi-modality imaging approach. As part of this hybrid system, XCT and DOT provide anatomical information and background optical property maps. This structural and functional a priori information is used to guide and constrain the bioluminescence reconstruction algorithm and ultimately improve the BLT results. The performance of the combined system is evaluated using multi-modality phantoms. In particular, a cylindrical heterogeneous multi-modality phantom that contains regions with higher optical absorption and x-ray attenuation is constructed. We showed that a 1.5 mm diameter bioluminescence inclusion can be localized accurately with the functional a priori information while its source strength can be recovered more accurately using both structural and the functional a priori information. PMID:22559540
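Using structural a priori maps to guide but not hard-constrain an optical reconstruction is commonly implemented as a "soft prior" regularizer: anatomical labels define a penalty on intra-region variation. A toy linear-inversion sketch under assumed labels and a made-up forward matrix (not the paper's BLT algorithm):

```python
import numpy as np

# "Soft prior" sketch: labels (e.g., from an XCT segmentation) build a
# regularization matrix that penalizes variation WITHIN each region,
# guiding an underdetermined reconstruction without hard constraints.
# All numbers below are hypothetical.

def soft_prior_matrix(labels):
    """Per-region centering matrix: L @ x subtracts each region's mean."""
    n = len(labels)
    L = np.eye(n)
    for r in set(labels):
        idx = [i for i in range(n) if labels[i] == r]
        for i in idx:
            for j in idx:
                L[i, j] -= 1.0 / len(idx)
    return L

# Six unknown pixels in two labeled regions, only two measurements.
labels = [0, 0, 0, 1, 1, 1]
A = np.array([[1.0, 0.9, 0.8, 0.2, 0.1, 0.05],
              [0.1, 0.2, 0.3, 0.9, 1.0, 0.80]])
x_true = np.array([2.0, 2.0, 2.0, 5.0, 5.0, 5.0])
y = A @ x_true

# Regularized normal equations: minimize ||Ax - y||^2 + lam*||Lx||^2.
L = soft_prior_matrix(labels)
lam = 1e-2
x_rec = np.linalg.solve(A.T @ A + lam * L.T @ L, A.T @ y)
print(np.round(x_rec, 2))
```

Because the true image is constant within each labeled region, the prior term vanishes on it and the two measurements suffice to pin down the two region values.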
Implementation and evaluation of ILLIAC 4 algorithms for multispectral image processing
NASA Technical Reports Server (NTRS)
Swain, P. H.
1974-01-01
Data concerning a multidisciplinary and multi-organizational effort to implement multispectral data analysis algorithms on a revolutionary computer, the ILLIAC 4, are reported. The effectiveness and efficiency of implementing the digital multispectral data analysis techniques for producing useful land use classifications from satellite collected data were demonstrated.
Reverse technology transfer; obtaining feedback from managers.
A.B. Carey; J.M. Calhoun; B. Dick; K. O'Halloran; L.S. Young; R.E. Bigley; S. Chan; C.A. Harrington; J.P. Hayes; J. Marzluff
1999-01-01
Forestry policy, planning, and practice have changed rapidly with implementation of ecosystem management by federal, state, tribal, and private organizations. Implementation entails new concepts, terminology, and management approaches. Yet there seems to have been little organized effort to obtain feedback from on-the-ground managers on the practicality of implementing...
Complex Organizations: The Implementation of Major Organizational Innovations.
ERIC Educational Resources Information Center
Gross, Neal; And Others
Based upon selected findings of a case study of an elementary school which attempted to implement a major organizational innovation--the redefinition of the teacher's role in an individualized instructional program--factors were identified that help to explain why implementation efforts fail. The laboratory school, with a positive climate for…
Educational Reform Implementation: A Co-Constructed Process. Research Report 5.
ERIC Educational Resources Information Center
Datnow, Amanda; Hubbard, Lea; Mehan, Hugh
This research report argues for viewing the complex, often messy process of school reform implementation as a "conditional matrix" coupled with qualitative research. As illustration, two studies (of six reform efforts in one county and of implementation of an untracking program in Kentucky) are reported. Preliminary analysis reveals that…
Culyba, Alison Journey; Patton, William Wesley
2017-01-01
One reason that scientific research takes so long to reach patients is that medical researchers and practitioners often lack training in public policy implementation theory and strategy. General medical and specific psychiatric ethical precepts in the United States and in international ethics codes invest public policy duties in psychiatric researchers and individual clinicians. This essay discusses those medical ethical rules and suggests means for training psychiatrists to meet their public health policy duties in legal fora. The discussion presents a case study of the evolution of polyvictimization research, its initial lack of implementation in clinical practice and public policy debates, and a detailed demonstration of the incorporation of polyvictimization research in informing legislative action. Through systematic efforts to expand training and involvement of psychiatrists, we can expedite the implementation of psychiatric research by marshalling individual psychiatrists to affect decisions in legislative, executive, and judicial proceedings. These individual efforts can occur synergistically with ongoing psychiatric and psychological organizations’ efforts to better effect timely incorporation of evidence-based policies to improve mental health at the local, state, national, and international levels. PMID:28758050
Implementing Cleaner Printed Wiring Board Technologies: Surface Finishes
This document describes the problems, solutions, and the time and effort involved in implementing alternative surface finish technologies. The guide was produced as part of the DfE Printed Wiring Board Project.
December Assumable Waters Subcommittee Presentations
Presentations covering adjacent wetlands as defined in the Clean Water Act (CWA) and its implementing regulations, state and tribal efforts to implement section 404(g), and the legislative history of CWA section 404(g)(1).
Effects of measurement unobservability on neural extended Kalman filter tracking
NASA Astrophysics Data System (ADS)
Stubberud, Stephen C.; Kramer, Kathleen A.
2009-05-01
An important component of tracking fusion systems is the ability to fuse various sensors into a coherent picture of the scene. When multiple sensor systems are used in an operational setting, the types of data vary. A significant but often overlooked concern with multiple sensors is the incorporation of measurements that are unobservable. An unobservable measurement is one that may provide information about the state but cannot recreate a full target state. A line-of-bearing measurement, for example, cannot provide complete position information. Often, such measurements come from passive sensors such as a passive sonar array or an electronic surveillance measure (ESM) system. Unobservable measurements will, over time, cause the measurement uncertainty to grow without bound. While some tracking implementations have triggers to protect against these detrimental effects, many maneuver tracking algorithms avoid discussing this implementation issue. One maneuver tracking technique is the neural extended Kalman filter (NEKF). The NEKF is an adaptive estimation algorithm that estimates the target track as it trains a neural network on-line to reduce the error between the a priori target motion model and the actual target dynamics. The weights of the neural network are trained in a manner similar to the state-estimation/parameter-estimation Kalman filter techniques. The NEKF has been shown to improve target tracking accuracy through maneuvers and has been used to predict target behavior using the new model that consists of the a priori model and the neural network. The key to the on-line adaptation of the NEKF is the fact that the neural network is trained using the same residuals as the Kalman filter for the tracker. The neural network weights are treated as augmented states of the target track. Through the state-coupling function, the weights are coupled to the target states.
Thus, if the measurements cause the states of the target track to be unobservable, then the weights of the neural network have unobservable modes as well. In recent analysis, the NEKF was shown to have a significantly larger growth in the eigenvalues of the error covariance matrix than the standard EKF tracker when the measurements were purely bearings-only. This caused detrimental effects to the ability of the NEKF to model the target dynamics. In this work, the analysis is expanded to determine the detrimental effects of bearings-only measurements of various uncertainties on the performance of the NEKF when these unobservable measurements are interlaced with completely observable measurements. This analysis provides the ability to put implementation limitations on the NEKF when bearings-only sensors are present.
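The unbounded covariance growth under bearings-only measurements can be reproduced with a plain EKF on a toy 2-D problem. The sketch below is not the NEKF itself, and the noise levels and update schedule are hypothetical; it only illustrates that the largest error-covariance eigenvalue grows without bound under bearings-only updates and stays bounded once observable fixes are interlaced:

```python
import numpy as np

def ekf_step(x, P, z, R, Q, observable):
    """One EKF predict/update on a static 2-D position target."""
    P = P + Q                              # prediction adds process noise
    if observable:
        H = np.eye(2)                      # full position fix
        hx = x
    else:
        r2 = x[0]**2 + x[1]**2             # bearing h(x) = atan2(y, x)
        H = np.array([[-x[1] / r2, x[0] / r2]])
        hx = np.array([np.arctan2(x[1], x[0])])
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - hx)
    P = (np.eye(2) - K @ H) @ P
    return x, P

Q = 0.01 * np.eye(2)
x_true = np.array([100.0, 50.0])
R_brg, R_pos = np.array([[1e-4]]), 1e-2 * np.eye(2)

# Bearings-only: range is never constrained, so P grows along the LOS.
x, P = x_true.copy(), np.eye(2)
for _ in range(200):
    z = np.array([np.arctan2(x_true[1], x_true[0])])
    x, P = ekf_step(x, P, z, R_brg, Q, observable=False)
bearing_only_max_eig = np.linalg.eigvalsh(P).max()

# Interlacing a full-position fix every 10th step restores observability.
x, P = x_true.copy(), np.eye(2)
for k in range(200):
    if k % 10 == 0:
        x, P = ekf_step(x, P, x_true, R_pos, Q, observable=True)
    else:
        z = np.array([np.arctan2(x_true[1], x_true[0])])
        x, P = ekf_step(x, P, z, R_brg, Q, observable=False)
interlaced_max_eig = np.linalg.eigvalsh(P).max()

print(bearing_only_max_eig, interlaced_max_eig)
```

In the bearings-only run the range-direction variance simply accumulates process noise, exactly the covariance eigenvalue growth the abstract describes.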
The Application of Modeling and Simulation in Capacity Management within the ITIL Framework
NASA Technical Reports Server (NTRS)
Rahmani, Sonya; vonderHoff, Otto
2010-01-01
Tightly integrating modeling and simulation techniques into Information Technology Infrastructure Library (ITIL) practices can be one of the driving factors behind a successful and cost-effective capacity management effort for any Information Technology (IT) system. ITIL is a best practices framework for managing IT infrastructure, development and operations. Translating ITIL theory into operational reality can be a challenge. This paper aims to highlight how to best integrate modeling and simulation into an ITIL implementation. For cases where the project team initially has difficulty gaining consensus on investing in modeling and simulation resources, a clear definition for M&S implementation into the ITIL framework, specifically its role in supporting Capacity Management, is critical to gaining the support required to garner these resources. This implementation should also help to clearly define M&S support to the overall system mission. This paper will describe the development of an integrated modeling approach and how best to tie M&S to definitive goals for evaluating system capacity and performance requirements. Specifically the paper will discuss best practices for implementing modeling and simulation into ITIL. These practices hinge on implementing integrated M&S methods that 1) encompass at least two or more predictive modeling techniques, 2) complement each other's respective strengths and weaknesses to support the validation of predicted results, and 3) are tied to the system's performance and workload monitoring efforts. How to structure two forms of modeling, statistical and simulation, in the development of "As Is" and "To Be" efforts will be used to exemplify the integrated M&S methods. The paper will show how these methods can better support the project's overall capacity management efforts.
Acceptance of lean redesigns in primary care: A contextual analysis.
Hung, Dorothy; Gray, Caroline; Martinez, Meghan; Schmittdiel, Julie; Harrison, Michael I
Lean is a leading change strategy used in health care to achieve short-term efficiency and quality improvement while promising longer-term system transformation. Most research examines Lean intervention to address isolated problems, rather than to achieve broader systemic changes to care delivery. Moreover, no studies examine contextual influences on system-wide Lean implementation efforts in primary care. The aim of this study was to identify contextual factors most critical to implementing and scaling Lean redesigns across all primary care clinics in a large, ambulatory care delivery system. Over 100 interviews and focus groups were conducted with frontline physicians, clinical staff, and operational leaders. Data analysis was guided by a modified Consolidated Framework for Implementation Research (CFIR), a popular implementation science framework. On the basis of expert recommendations, the modified framework targets factors influencing the implementation of process redesigns. This modified framework, the CFIR-PR, informed our identification of contextual factors that most impacted Lean acceptance among frontline physicians and staff. Several domains identified by the CFIR-PR were critical to acceptance of Lean redesigns. Regarding the implementation process, acceptance was influenced by time and intensity of exposure to changes, "top-down" versus "bottom-up" implementation styles, and degrees of employee engagement in developing new workflows. Important factors in the inner setting were the clinic's culture and style of leadership, along with availability of information about Lean's effectiveness. Last, implementation efforts were impacted by individual and team characteristics regarding changed work roles and related issues of professional identity, authority, and autonomy. This study underscores the need for change leaders to consider the contextual factors that surround efforts to implement Lean in primary care.
As Lean redesigns are scaled across a system, special attention is warranted with respect to the implementation approach, internal clinic setting, and implications for professional roles and identities of physicians and staff.
A data distributed parallel algorithm for ray-traced volume rendering
NASA Technical Reports Server (NTRS)
Ma, Kwan-Liu; Painter, James S.; Hansen, Charles D.; Krogh, Michael F.
1993-01-01
This paper presents a divide-and-conquer ray-traced volume rendering algorithm and a parallel image compositing method, along with their implementation and performance on the Connection Machine CM-5 and networked workstations. This algorithm distributes both the data and the computations to individual processing units to achieve fast, high-quality rendering of high-resolution data. The volume data, once distributed, is left intact. The processing nodes perform local ray tracing of their subvolumes concurrently. No communication between processing units is needed during this local ray-tracing process. A subimage is generated by each processing unit and the final image is obtained by compositing subimages in the proper order, which can be determined a priori. Test results on both the CM-5 and a group of networked workstations demonstrate the practicality of our rendering algorithm and compositing method.
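The subimage merge relies on the standard "over" compositing operator; its associativity is what allows the depth order to be fixed a priori and the merges to be grouped pairwise across nodes. A single-pixel sketch (illustrative only, not the CM-5 implementation):

```python
# "Over" compositing of per-node subimage pixels (premultiplied alpha).

def over(front, back):
    """Composite one (color, alpha) pixel over another."""
    fc, fa = front
    bc, ba = back
    return (fc + (1 - fa) * bc, fa + (1 - fa) * ba)

def composite(subimages):
    """Blend per-node subimage pixels in their a priori depth order."""
    color, alpha = 0.0, 0.0            # start with an empty pixel
    for pixel in subimages:            # front-to-back traversal
        color, alpha = over((color, alpha), pixel)
    return color, alpha

# Three subvolume contributions for one pixel, ordered front to back:
pixels = [(0.2, 0.3), (0.4, 0.5), (0.1, 0.6)]
print(composite(pixels))  # (0.515, 0.86)
```

Because `over` is associative, neighboring nodes can merge their subimages pairwise in a tree, in parallel, and still obtain the same final pixel as a strictly sequential front-to-back pass.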
Phaser.MRage: automated molecular replacement
Bunkóczi, Gábor; Echols, Nathaniel; McCoy, Airlie J.; Oeffner, Robert D.; Adams, Paul D.; Read, Randy J.
2013-01-01
Phaser.MRage is a molecular-replacement automation framework that implements a full model-generation workflow and provides several layers of model exploration to the user. It is designed to handle a large number of models and can distribute calculations efficiently onto parallel hardware. In addition, phaser.MRage can identify correct solutions and use this information to accelerate the search. Firstly, it can quickly score all alternative models of a component once a correct solution has been found. Secondly, it can perform extensive analysis of identified solutions to find protein assemblies and can employ assembled models for subsequent searches. Thirdly, it is able to use a priori assembly information (derived from, for example, homologues) to speculatively place and score molecules, thereby customizing the search procedure to a certain class of protein molecule (for example, antibodies) and incorporating additional biological information into molecular replacement. PMID:24189240
Dynamic test input generation for multiple-fault isolation
NASA Technical Reports Server (NTRS)
Schaefer, Phil
1990-01-01
Recent work in Causal Reasoning has provided practical techniques for multiple fault diagnosis. These techniques provide a hypothesis/measurement diagnosis cycle. Using probabilistic methods, they choose the best measurements to make, then update fault hypotheses in response. For many applications such as computers and spacecraft, few measurement points may be accessible, or values may change quickly as the system under diagnosis operates. In these cases, a hypothesis/measurement cycle is insufficient. A technique is presented for a hypothesis/test-input/measurement diagnosis cycle. In contrast to generating tests a priori for determining device functionality, it dynamically generates tests in response to current knowledge about fault probabilities. It is shown how the mathematics previously used for measurement specification can be applied to the test input generation process. An example from an efficient implementation called Multi-Purpose Causal (MPC) is presented.
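One standard way to realize "choose the best test input given current fault probabilities" is to pick the input that minimizes expected posterior entropy over the hypotheses. A small sketch with hypothetical likelihood tables (not the MPC internals):

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a probability vector."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def expected_posterior_entropy(prior, likelihoods):
    """likelihoods[h][o] = P(observation o | hypothesis h, this input)."""
    n_obs = len(likelihoods[0])
    h_exp = 0.0
    for o in range(n_obs):
        p_o = sum(prior[h] * likelihoods[h][o] for h in range(len(prior)))
        if p_o == 0:
            continue
        post = [prior[h] * likelihoods[h][o] / p_o for h in range(len(prior))]
        h_exp += p_o * entropy(post)     # average over possible outcomes
    return h_exp

prior = [0.5, 0.3, 0.2]  # current fault-hypothesis probabilities

# Two candidate test inputs (hypothetical outcome likelihoods per fault):
inputs = {
    "t1": [[0.9, 0.1], [0.1, 0.9], [0.1, 0.9]],  # separates h0 from rest
    "t2": [[0.5, 0.5], [0.5, 0.5], [0.5, 0.5]],  # uninformative
}
best = min(inputs, key=lambda t: expected_posterior_entropy(prior, inputs[t]))
print(best)  # the discriminating input is preferred
```

The same scoring machinery used for measurement selection applies directly: a test input is just a choice that reshapes the outcome likelihoods before the next Bayesian update.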
Study of stability of the difference scheme for the model problem of the gaslift process
NASA Astrophysics Data System (ADS)
Temirbekov, Nurlan; Turarov, Amankeldy
2017-09-01
The paper studies a model of the gaslift process where the motion in a gas-lift well is described by partial differential equations. The system describing the studied process consists of equations of motion, continuity, equations of thermodynamic state, and hydraulic resistance. A two-layer finite-difference Lax-Wendroff scheme is constructed for the numerical solution of the problem. The stability of the difference scheme for the model problem is investigated using the method of a priori estimates, the order of approximation is investigated, the algorithm for numerical implementation of the gaslift process model is given, and graphs of the results are presented. The development and investigation of difference schemes for the numerical solution of systems of equations of gas dynamics make it possible to obtain simultaneously exact and monotonic solutions.
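The stability constraint that a priori (von Neumann) analysis yields for this family of schemes is easiest to see on the model advection equation u_t + a u_x = 0, where Lax-Wendroff is stable iff the Courant number |a dt/dx| <= 1. A minimal sketch on this standard model problem (not the paper's gas-lift system):

```python
import numpy as np

def lax_wendroff(u0, a, dx, dt, steps):
    """One-step Lax-Wendroff for u_t + a u_x = 0 with periodic boundaries."""
    u = u0.copy()
    c = a * dt / dx                      # Courant number
    for _ in range(steps):
        up = np.roll(u, -1)              # u_{j+1}
        um = np.roll(u, 1)               # u_{j-1}
        u = u - 0.5 * c * (up - um) + 0.5 * c**2 * (up - 2 * u + um)
    return u

x = np.linspace(0, 1, 100, endpoint=False)
u0 = np.sin(2 * np.pi * x)

stable   = lax_wendroff(u0, a=1.0, dx=0.01, dt=0.008, steps=250)  # c = 0.8
unstable = lax_wendroff(u0, a=1.0, dx=0.01, dt=0.012, steps=250)  # c = 1.2

print(np.abs(stable).max(), np.abs(unstable).max())
```

With c = 0.8 the solution amplitude stays bounded by the initial data; with c = 1.2 the high-wavenumber modes are amplified every step and the solution blows up, exactly the behavior an a priori stability estimate rules out.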
Cresswell, Kathrin; Morrison, Zoe; Crowe, Sarah; Robertson, Ann; Sheikh, Aziz
2011-01-01
The absence of meaningful end user engagement has repeatedly been highlighted as a key factor contributing to 'failed' implementations of electronic health records (EHRs), but achieving this is particularly challenging in the context of national scale initiatives. In 2002, the National Health Service (NHS) embarked on a so-called 'top-down' national implementation strategy aimed at introducing commercial, centrally procured, EHRs into hospitals throughout England. We aimed to examine approaches to, and experiences of, user engagement in the context of a large-scale EHR implementation across purposefully selected hospital care providers implementing early versions of nationally procured software. We conducted a qualitative, case-study based, socio-technically informed, longitudinal investigation, purposefully sampling and collecting data from four hospitals. Our data comprised a total of 123 semi-structured interviews with users and managers, 15 interviews with additional stakeholders, 43 hours of non-participant observations of meetings and system use, and relevant organisation-specific documents from each case study site. Analysis was thematic, building on an existing model of user engagement that was originally developed in the context of studying the implementation of relatively simple technologies in commercial settings. NVivo8 software was used to facilitate coding. Despite an enduring commitment to the vision of shared EHRs and an appreciation of their potential benefits, meaningful end user engagement was never achieved. Hospital staff were not consulted in systems choice, leading to frustration; they were then further alienated by the implementation of systems that they perceived as inadequately customised. Various efforts to achieve local engagement were attempted, but these were in effect risk mitigation strategies. 
We found the role of clinical champions to be important in these engagement efforts, but progress was hampered by the hierarchical structures within healthcare teams. As a result, engagement efforts focused mainly on clinical staff with inadequate consideration of management and administrative staff. This work has allowed us to further develop an existing model of user engagement from the commercial sector and adapt it to inform user engagement in the context of large-scale eHealth implementations. By identifying key points of possible engagement, disengagement and re-engagement, this model will, we hope, both help those planning similar large-scale EHR implementation efforts and act as a much-needed catalyst to further research in this neglected field of enquiry.
NASA Astrophysics Data System (ADS)
Rainaud, Jean-François; Clochard, Vincent; Delépine, Nicolas; Crabié, Thomas; Poudret, Mathieu; Perrin, Michel; Klein, Emmanuel
2018-07-01
Accurate reservoir characterization is needed all along the development of an oil and gas field study. It helps build 3D numerical reservoir simulation models for estimating the original oil and gas volumes in place and for simulating fluid flow behaviors. At a later stage of the field development, reservoir characterization can also help decide which recovery techniques need to be used for fluids extraction. In complex media, such as faulted reservoirs, flow behavior predictions within volumes close to faults can be a very challenging issue. During the development plan, it is necessary to determine which types of communication exist between faults or which potential barriers exist for fluid flows. Solving these issues rests on accurate fault characterization. In most cases, faults are not preserved along reservoir characterization workflows: the memory of the faults interpreted from seismic is not kept during seismic inversion and further interpretation of the result. The goal of our study is, first, to integrate a 3D fault network as a priori information into a model-based stratigraphic inversion procedure. Secondly, we apply our methodology to a well-known oil and gas case study over a typical North Sea field (UK Northern North Sea) in order to demonstrate its added value for determining reservoir properties. More precisely, the a priori model is composed of several geological units populated by physical attributes extrapolated from well-log data following the deposition mode, but usually a priori model building methods respect neither the 3D fault geometry nor the stratification dips on the fault sides. We address this difficulty by applying an efficient flattening method for each stratigraphic unit in our workflow. Even before seismic inversion, the obtained stratigraphic model has been directly used to model synthetic seismic on our case study.
Comparisons between synthetic seismic obtained from our 3D fault network model give much lower residuals than with a "basic" stratigraphic model. Finally, we apply our model-based inversion considering both faulted and non-faulted a priori models. By comparing the rock impedance results obtained in the two cases, we can see a better delineation of the Brent-reservoir compartments by using the 3D faulted a priori model built with our method.
NASA Technical Reports Server (NTRS)
Johnson, Matthew Stephen
2017-01-01
A primary objective for TOLNet is the evaluation and validation of space-based tropospheric O3 retrievals from future systems such as the Tropospheric Emissions: Monitoring of Pollution (TEMPO) satellite. This study is designed to evaluate the tropopause-based O3 climatology (TB-Clim) dataset which will be used as the a priori profile information in TEMPO O3 retrievals. This study also evaluates model simulated O3 profiles, which could potentially serve as a priori O3 profile information in TEMPO retrievals, from near-real-time (NRT) data assimilation model products (NASA Global Modeling and Assimilation Office (GMAO) Goddard Earth Observing System (GEOS-5) Forward Processing (FP) and Modern-Era Retrospective analysis for Research and Applications version 2 (MERRA2)) and full chemical transport model (CTM), GEOS-Chem, simulations. The TB-Clim dataset and model products are evaluated with surface (0-2 km) and tropospheric (0-10 km) TOLNet observations to demonstrate the accuracy of the suggested a priori dataset and information which could potentially be used in TEMPO O3 algorithms. This study also presents the impact of individual a priori profile sources on the accuracy of theoretical TEMPO O3 retrievals in the troposphere and at the surface. Preliminary results indicate that while the TB-Clim climatological dataset can replicate seasonally-averaged tropospheric O3 profiles observed by TOLNet, model-simulated profiles from a full CTM (GEOS-Chem is used as a proxy for CTM O3 predictions) resulted in more accurate tropospheric and surface-level O3 retrievals from TEMPO when compared to hourly (diurnal cycle evaluation) and daily-averaged (daily variability evaluation) TOLNet observations. 
Furthermore, it was determined that when large daily-averaged surface O3 mixing ratios are observed (65 ppb), which are important for air quality purposes, TEMPO retrieval values at the surface display higher correlations and less bias when applying CTM a priori profile information compared to all other data products. The primary reason for this is that CTM predictions better capture the spatio-temporal variability of the vertical profiles of observed tropospheric O3 compared to the TB-Clim dataset and other NRT data assimilation models evaluated during this study.
Wadolowska, Lidia; Kowalkowska, Joanna; Czarnocinska, Jolanta; Jezewska-Zychowicz, Marzena; Babicz-Zielinska, Ewa
2017-05-01
To compare dietary patterns (DPs) derived by two methods and their assessment as a factor of obesity in girls aged 13-21 years. Data from a cross-sectional study conducted among the representative sample of Polish females ( n = 1,107) aged 13-21 years were used. Subjects were randomly selected. Dietary information was collected using three short, validated food frequency questionnaires (FFQs) regarding fibre intake, fat intake and overall food intake variety. DPs were identified by two methods: a priori approach (a priori DPs) and cluster analysis (data-driven DPs). The association between obesity and DPs and three single dietary characteristics was examined using multiple logistic regression analysis. Four data-driven DPs were obtained: 'Low-fat-Low-fibre-Low-varied' (21.2%), 'Low-fibre' (29.1%), 'Low-fat' (25.0%) and 'High-fat-Varied' (24.7%). Three a priori DPs were pre-defined: 'Non-healthy' (16.6%), 'Neither-pro-healthy-nor-non-healthy' (79.1%) and 'Pro-healthy' (4.3%). Girls with 'Low-fibre' DP were less likely to have central obesity (adjusted odds ratio (OR) = 0.36; 95% confidence interval (CI): 0.17, 0.75) than girls with 'Low-fat-Low-fibre-Low-varied' DP (reference group, OR = 1.00). No significant associations were found between a priori DPs and overweight including obesity or central obesity. The majority of girls with 'Non-healthy' DP were also classified as 'Low-fibre' DP in the total sample, in girls with overweight including obesity and in girls with central obesity (81.7%, 80.6% and 87.3%, respectively), while most girls with 'Pro-healthy' DP were classified as 'Low-fat' DP (67.8%, 87.6% and 52.1%, respectively). We found that the a priori approach as well as cluster analysis can be used to derive opposite health-oriented DPs in Polish females. Both methods have provided disappointing outcomes in explaining the association between obesity and DPs.
The cluster analysis, in comparison with the a priori approach, was more useful for finding any relationship between DPs and central obesity. Our study highlighted the importance of the method used to derive DPs in exploring associations between diet and obesity.
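The reported effect measure, an odds ratio with a 95% CI, can be illustrated on a 2x2 table using Woolf's log-OR method. The counts below are hypothetical (the study's ORs were adjusted via multiple logistic regression, which this sketch does not reproduce):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted OR with 95% CI (Woolf's method).

    a, b: exposed group with / without the outcome;
    c, d: reference group with / without the outcome.
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: central obesity among 'Low-fibre' DP girls vs the
# 'Low-fat-Low-fibre-Low-varied' reference DP.
or_, lo, hi = odds_ratio_ci(15, 290, 30, 210)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

An OR below 1 with a CI excluding 1, as here, corresponds to the "less likely to have central obesity" finding in the abstract.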
NASA Astrophysics Data System (ADS)
Fernandes, R. M. S.; Bos, M. S.; Bruyninx, C.; Crocker, P.; Dousa, J.; Walpersdorf, A.; Socquet, A.; Avallone, A.; Ganas, A.; Ionescu, C.; Kenyeres, A.; Ofeigsson, B.; Ozener, H.; Vergnolle, M.; Lidberg, M.; Liwosz, T.; Soehne, W.; Bezdeka, P.; Cardoso, R.; Cotte, N.; Couto, R.; D'Agostino, N.; Deprez, A.; Fabian, A.; Gonçalves, H.; Féres, L.; Legrand, J.; Menut, J. L.; Nastase, E.; Ngo, K. M.; Sigurðarson, F.; Vaclavovic, P.
2017-12-01
The GNSS working group of the EPOS-IP (European Plate Observing System - Implementation Phase) project oversees the implementation of services focused on GNSS data and derived products for the use of the geo-sciences community. The objective is to serve essentially the Solid Earth community, but other scientific and technical communities will also be able to benefit from the efforts being carried out to access the data (and derived products) of the European Geodetic Infrastructures. The geodetic component of EPOS is dealing essentially with implementing an e-infrastructure to store and disseminate continuous GNSS data (and derived solutions) from existing Research Infrastructures and new dedicated services. Present efforts focus on developing an integrated software package, called GLASS, that will permit seamless dissemination of quality-controlled data (using special tools) from dozens of Geodetic Research Infrastructures in Europe. Conceptually, GLASS can be used in a single Research Infrastructure or in hundreds of cooperating ones. We present and discuss the status of the implementation of these services, including also the generation of products - time-series, velocity fields and strain rate fields. Concretely, we present the results of the current validation phase of these services and discuss in detail the technical and cooperative efforts being implemented. EPOS-IP is an ESFRI project funded by the European Union.
Good governance and budget reform in Lesotho Public Hospitals: performance, root causes and reality.
Vian, Taryn; Bicknell, William J
2014-09-01
Lesotho has been implementing financial management reforms, including performance-based budgeting (PBB) since 2005 in an effort to increase accountability, transparency and effectiveness in governance, yet little is known about how these efforts are affecting the health sector. Supported by several development partners and $24 million in external resources, the PBB reform is intended to strengthen government capacity to manage aid funds directly and to target assistance to pressing social priorities. This study designed and tested a methodology for measuring implementation progress for PBB reform in the hospital sector in Lesotho. We found that despite some efforts on the national level to promote and support reform implementation, staff at the hospital level were largely unaware of the purpose of the reform and had made almost no progress in transforming institutions and systems to fully realize reform goals. Problems can be traced to a complex reform design, inadequate personnel and capacity to implement, professional boundaries between financial and clinical personnel and weak leadership. The Lesotho reform experience suggests that less complex designs for budget reform, better adapted to the context and realities of health sectors in developing countries, may be needed to improve governance. It also highlights the importance of measuring reform implementation at the sectoral level. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2013; all rights reserved.
A priori mesh grading for the numerical calculation of the head-related transfer functions
Ziegelwanger, Harald; Kreuzer, Wolfgang; Majdak, Piotr
2017-01-01
Head-related transfer functions (HRTFs) describe the directional filtering of the incoming sound caused by the morphology of a listener’s head and pinnae. When an accurate model of a listener’s morphology exists, HRTFs can be calculated numerically with the boundary element method (BEM). However, the general recommendation to model the head and pinnae with at least six elements per wavelength renders the BEM a time-consuming procedure when calculating HRTFs for the full audible frequency range. In this study, a mesh preprocessing algorithm is proposed, viz., a priori mesh grading, which significantly reduces the computational costs of the HRTF calculation process. The mesh grading algorithm deliberately violates the recommendation of at least six elements per wavelength in certain regions of the head and pinnae and varies the size of elements gradually according to an a priori defined grading function. The evaluation of the algorithm involved HRTFs calculated for various geometric objects, including meshes of three human listeners, and various grading functions. The numerical accuracy and the predicted sound-localization performance of the calculated HRTFs were analyzed. A priori mesh grading appeared to be suitable for the numerical calculation of HRTFs over the full audible frequency range and outperformed uniform meshes in terms of numerical errors, perception-based predictions of sound-localization performance, and computational costs. PMID:28239186
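The abstract above only states that element size varies gradually with an a priori defined grading function while keeping the lambda/6 rule where it matters. A minimal sketch of one such function follows; the linear distance-based form, the 30/200 mm breakpoints, and the 4x coarsening factor are illustrative assumptions, not the paper's actual grading function.

```python
SPEED_OF_SOUND_MM_S = 343_000.0  # speed of sound in air, mm/s

def graded_edge_length(dist_mm, f_max=18_000.0,
                       d_near=30.0, d_far=200.0, coarsen=4.0):
    """Target BEM edge length as a function of distance from the ear canal.

    Hypothetical linear grading in the spirit of the paper: within
    d_near mm the classic lambda/6 rule at f_max is kept; the allowed
    edge length then grows linearly, reaching `coarsen` times the fine
    value at d_far mm and beyond, deliberately relaxing the
    six-elements-per-wavelength recommendation far from the pinna."""
    fine = SPEED_OF_SOUND_MM_S / f_max / 6.0   # lambda/6 at the highest frequency
    if dist_mm <= d_near:
        return fine
    if dist_mm >= d_far:
        return coarsen * fine
    frac = (dist_mm - d_near) / (d_far - d_near)
    return fine * (1.0 + (coarsen - 1.0) * frac)
```

A remesher would evaluate this function at each vertex and coarsen elements toward the back of the head accordingly; the gradual growth avoids abrupt element-size jumps that degrade BEM accuracy.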
Allaert, Francois-André; Mazen, Noël-Jean; Legrand, Louis; Quantin, Catherine
2017-01-17
The market for Connected Health Devices (CHD) with healthcare applications is growing fast and should be worth several billion euros in turnover in the coming years. Their development will completely transform the organisation of our healthcare system, profoundly change the way patients are managed, and revolutionize disease prevention. CHDs with healthcare applications are a tidal wave with societal impact, calling into question the privacy of patients' personal and healthcare information and its protection in secure systems. Rather than trying to stop the use of CHD, we must channel the wave by clearly weighing the advantages against the risks and threats to patients, and find counter-measures for implementation. The main difficulty is channeling the wave in a way that is acceptable to CHD developers, who will otherwise bypass the rules, even if they can be sued for it. It therefore appears necessary to implement guidelines that can be used by all developers, defining the minimum requirements for assuring the security of patient privacy and healthcare management. In European healthcare systems, there is an imperative need to establish security guidelines that CHD producers could use to ensure compliance, so that patient privacy and healthcare management are safeguarded. The aim would be to apply the guidelines as a posteriori rather than a priori control, so as not to hamper innovation.
Implementing Small Scale ICT Projects in Developing Countries--How Challenging Is It?
ERIC Educational Resources Information Center
Karunaratne, Thashmee; Peiris, Colombage; Hansson, Henrik
2018-01-01
This paper summarises experiences of efforts made by twenty individuals when implementing small-scale ICT development projects in their organizations located in seven developing countries. The main focus of these projects was the use of ICT in educational settings. Challenges encountered and the contributing factors for implementation success of…
ERIC Educational Resources Information Center
Takanishi, Stacey M.
2012-01-01
NCLB policies in the United States focus schools' efforts on implementing effective instructional processes to improve student outcomes. This study looks more specifically at how schools are perceived to be implementing state required curricula and benchmarks and developing teaching and learning processes that support the teaching of state…
Implementation Science: Why It Matters for the Future of Social Work
ERIC Educational Resources Information Center
Cabassa, Leopoldo J.
2016-01-01
Bridging the gap between research and practice is a critical frontier for the future of social work. Integrating implementation science into social work can advance our profession's effort to bring research and practice closer together. Implementation science examines the factors, processes, and strategies that influence the uptake, use, and…
ERIC Educational Resources Information Center
Lara, Tracy M.; Hughey, Aaron W.
2008-01-01
Many companies have implemented the team approach as a way to empower their employees in an effort to enhance productivity, quality and overall profitability. While application of the concept to higher education administration has been limited, colleges and universities could benefit from the team approach if implemented appropriately and…
ERIC Educational Resources Information Center
Carrera, Larisa Ivon; Tellez, Tomas Eduardo; D'Ottavio, Alberto Enrique
2003-01-01
Describes the difficulties Argentina's medical schools are likely to face in implementing a problem-based learning (PBL) curriculum. Outlines the basic requirements for successful implementation of PBL curricula and describes the contradiction in Argentina between a health care system that forces specialization and the efforts of medical schools…
ERIC Educational Resources Information Center
Rusman
2015-01-01
Applying a new curriculum, namely implementation of 2013 Curriculum at schools has been commenced in July 2013. The implementation of the curriculum is expected to give a push to an increasing quality of managing and processing educational efforts towards betterments at every unit of learning and education. Backgrounded by application of the…
Advanced Health Management of a Brushless Direct Current Motor/Controller
NASA Technical Reports Server (NTRS)
Pickett, R. D.
2003-01-01
This effort demonstrates that health management can be taken to the component level for electromechanical systems. The same techniques can be applied to take any health management system to the component level, based on the practicality of the implementation for that particular system. This effort allows various logic schemes to be implemented for the identification and management of failures. By taking health management to the component level, integrated vehicle health management systems can be enhanced by protecting box-level avionics from being shut down in order to isolate a failed computer.
Model Checking Verification and Validation at JPL and the NASA Fairmont IV and V Facility
NASA Technical Reports Server (NTRS)
Schneider, Frank; Easterbrook, Steve; Callahan, Jack; Montgomery, Todd
1999-01-01
We show how a technology transfer effort was carried out. The successful use of model checking on a pilot JPL flight project demonstrates the usefulness and the efficacy of the approach. The pilot project was used to model a complex spacecraft controller. Software design and implementation validation were carried out successfully. To suggest future applications we also show how the implementation validation step can be automated. The effort was followed by the formal introduction of the modeling technique as a part of the JPL Quality Assurance process.
Integrating research, policy, and practice in juvenile justice education.
Blomberg, Thomas G; Waldo, Gordon P
2002-06-01
This article provides an overview of the history and context leading to Florida's efforts to implement an evaluation-driven research and associated quality assurance system for its juvenile justice education policies and practices. The Juvenile Justice Educational Enhancement Program began implementing Florida's evaluation research and quality assurance system to juvenile justice education in 1998. The article includes a brief summary of articles comprising this special issue of Evaluation Review that address the Juvenile Justice Educational Enhancement Program's various functions, methodological components, data, preliminary findings, continuing evaluation research efforts, and impediments.
Time-Resolved Data Acquisition for In Situ Subsurface Planetary Geochemistry
NASA Technical Reports Server (NTRS)
Bodnarik, Julia Gates; Burger, Dan M.; Burger, Arnold; Evans, Larry G.; Parsons, Ann M.; Starr, Richard D.; Stassun, Keivan G.
2012-01-01
The current gamma-ray/neutron instrumentation development effort at NASA Goddard Space Flight Center aims to extend the use of active pulsed neutron interrogation techniques to probe the subsurface geochemistry of planetary bodies in situ. All previous NASA planetary science missions that used neutron and/or gamma-ray spectroscopy instruments have relied on the constant neutron source produced by galactic cosmic rays. One of the distinguishing features of this effort is the inclusion of a high-intensity 14.1 MeV pulsed neutron generator synchronized with a custom data acquisition system that times each event relative to the pulse. With usually only one opportunity to collect data, it is difficult to set a priori time-gating windows that obtain the best possible results. Acquiring time-tagged, event-by-event data from nuclear-induced reactions provides raw data sets containing channel/energy and event time for each gamma ray or neutron detected. The resulting data set can be plotted as a function of time or energy using analysis windows optimized after the data are acquired. Time windows can then be chosen to produce energy spectra that yield the most statistically significant and accurate elemental composition results derivable from the complete data set. The advantages of post-processing gamma-ray time-tagged event-by-event data in experimental tests using our prototype instrument will be demonstrated.
Time-resolved Neutron-gamma-ray Data Acquisition for in Situ Subsurface Planetary Geochemistry
NASA Technical Reports Server (NTRS)
Bodnarik, Julie G.; Burger, Dan Michael; Burger, A.; Evans, L. G.; Parsons, A. M.; Schweitzer, J. S.; Starr R. D.; Stassun, K. G.
2013-01-01
The current gamma-ray/neutron instrumentation development effort at NASA Goddard Space Flight Center aims to extend the use of active pulsed neutron interrogation techniques to probe the subsurface elemental composition of planetary bodies in situ. Previous NASA planetary science missions that used neutron and/or gamma-ray spectroscopy instruments have relied on neutrons produced by galactic cosmic rays. One of the distinguishing features of this effort is the inclusion of a high-intensity 14.1 MeV pulsed neutron generator synchronized with a custom data acquisition system that times each event relative to the pulse. With usually only one opportunity to collect data, it is difficult to set a priori time-gating windows that obtain the best possible results. Acquiring time-tagged, event-by-event data from nuclear-induced reactions provides raw data sets containing channel/energy and event time for each gamma ray or neutron detected. The resulting data set can be plotted as a function of time or energy using analysis windows optimized after the data are acquired. Time windows can then be chosen to produce energy spectra that yield the most statistically significant and accurate elemental composition results derivable from the complete data set. The advantages of post-processing gamma-ray time-tagged event-by-event data in experimental tests using our prototype instrument will be demonstrated.
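The post-acquisition time gating described in this abstract (and the preceding one) can be sketched in a few lines: keep every event as a (time, energy) pair, then histogram the energies of whichever time slice turns out to be most informative. The synthetic event list, microsecond/keV units, window boundaries, and bin count below are all illustrative assumptions, not the instrument's actual parameters.

```python
import numpy as np

# Hypothetical time-tagged event list: each row is (time since pulse in
# microseconds, detected gamma-ray energy in keV). With event-by-event
# data, gating windows are chosen *after* acquisition, not fixed a priori.
rng = np.random.default_rng(0)
events = np.column_stack([
    rng.uniform(0.0, 1000.0, 5000),   # event time relative to neutron pulse
    rng.uniform(50.0, 9000.0, 5000),  # event energy
])

def gated_spectrum(events, t_lo, t_hi, e_bins):
    """Energy spectrum of the events falling inside the chosen time window."""
    mask = (events[:, 0] >= t_lo) & (events[:, 0] < t_hi)
    counts, edges = np.histogram(events[mask, 1], bins=e_bins)
    return counts, edges

# Try two candidate windows on the same data set; no re-acquisition needed.
prompt, _ = gated_spectrum(events, 0.0, 100.0, e_bins=64)
delayed, _ = gated_spectrum(events, 100.0, 1000.0, e_bins=64)
```

Because the raw event list is retained, any number of candidate windows can be evaluated offline and the one yielding the most statistically significant spectrum kept.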
Paradis, Tiffany; St-Louis, Etienne; Landry, Tara; Poenaru, Dan
2018-02-21
The benefits of trauma registries have been well described. The crucial data they provide may guide injury prevention strategies, inform resource allocation, and support advocacy and policy, and this has been shown to reduce trauma-related mortality in various settings. Trauma remains a leading cause of mortality in low- and middle-income countries (LMICs). However, the implementation of trauma registries in LMICs can be challenging due to lack of funding, specialized personnel, and infrastructure. This study explores strategies for successful trauma registry implementation in LMICs. The protocol was registered a priori (CRD42017058586). A peer-reviewed search strategy of multiple databases will be developed with a senior librarian. As per PRISMA guidelines, an initial screen of references based on title and abstract, and a subsequent full-text review, will be conducted by two independent reviewers. Disagreements that cannot be resolved by discussion between reviewers shall be arbitrated by the principal investigator. Data extraction will be performed using a pre-defined data extraction sheet. Finally, bibliographies of included articles will be hand-searched. Studies of any design will be included if they describe or review the development and implementation of a trauma registry in LMICs. No language or period restrictions will be applied. Summary statistics and qualitative meta-narrative analyses will be performed. The significant burden of trauma in LMIC environments presents unique challenges and limitations. Adapted strategies for deployment and maintenance of sustainable trauma registries are needed. Our methodology will systematically identify recommendations and strategies for successful trauma registry implementation in LMICs and describe threats and barriers to this endeavor. The protocol was registered on the PROSPERO international prospective register of systematic reviews (CRD42017058586).
Hochgesang, Mindy; Zamudio-Haas, Sophia; Moran, Lissa; Nhampossa, Leopoldo; Packel, Laura; Leslie, Hannah; Richards, Janise; Shade, Starley B
2017-01-01
The rapid scale-up of HIV care and treatment in resource-limited countries requires concurrent, rapid development of health information systems to support quality service delivery. Mozambique, a country with an 11.5% prevalence of HIV, has developed nation-wide patient monitoring systems (PMS) with standardized reporting tools, utilized by all HIV treatment providers in paper or electronic form. Evaluation of the initial implementation of PMS can inform and strengthen future development as the country moves towards a harmonized, sustainable health information system. This assessment was conducted in order to 1) characterize data collection and reporting processes and PMS resources available and 2) provide evidence-based recommendations for harmonization and sustainability of PMS. This baseline assessment of PMS was conducted with eight non-governmental organizations that supported the Ministry of Health to provide 90% of HIV care and treatment in Mozambique. The study team conducted structured and semi-structured surveys at 18 health facilities located in all 11 provinces. Seventy-nine staff were interviewed. Deductive a priori analytic categories guided analysis. Health facilities have implemented paper and electronic monitoring systems with varying success. Where in use, robust electronic PMS facilitate facility-level reporting of required indicators; improve ability to identify patients lost to follow-up; and support facility and patient management. Challenges to implementation of monitoring systems include a lack of national guidelines and norms for patient level HIS, variable system implementation and functionality, and limited human and infrastructure resources to maximize system functionality and information use. This initial assessment supports the need for national guidelines to harmonize, expand, and strengthen HIV-related health information systems. 
Recommendations may benefit other countries with similar epidemiologic and resource-constrained environments seeking to improve PMS implementation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Accuracy of Binary Black Hole Waveform Models for Advanced LIGO
NASA Astrophysics Data System (ADS)
Kumar, Prayush; Fong, Heather; Barkett, Kevin; Bhagwat, Swetha; Afshari, Nousha; Chu, Tony; Brown, Duncan; Lovelace, Geoffrey; Pfeiffer, Harald; Scheel, Mark; Szilagyi, Bela; Simulating Extreme Spacetimes (SXS) Team
2016-03-01
Coalescing binaries of compact objects, such as black holes and neutron stars, are the primary targets for gravitational-wave (GW) detection with Advanced LIGO. Accurate modeling of the emitted GWs is required to extract information about the binary source. The most accurate solution to the general relativistic two-body problem is available in numerical relativity (NR), which is however limited in application by its computational cost. Current searches use semi-analytic models that are based on post-Newtonian (PN) theory and calibrated to NR. In this talk, I will present comparisons between contemporary models and high-accuracy numerical simulations performed using the Spectral Einstein Code (SpEC), focusing on two questions: (i) how well do models capture the binary's late inspiral, where they lack a priori accurate information from PN or NR, and (ii) how accurately do they model binaries with parameters outside their range of calibration? These results guide the choice of templates for future GW searches and motivate future modeling efforts.
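Waveform-model comparisons of the kind described above are usually quantified by the mismatch, one minus the normalized inner product between two waveforms maximized over relative time shift. The toy below assumes a flat noise spectrum and a synthetic chirp-like signal; a real LIGO faithfulness study would weight the inner product by the detector noise power spectral density and also maximize over phase and intrinsic parameters.

```python
import numpy as np

def overlap(h1, h2):
    """Normalized inner product between two discretely sampled waveforms,
    maximized over a circular time shift via the FFT correlation theorem.
    White noise is assumed for simplicity (no PSD weighting)."""
    h1 = h1 / np.sqrt(np.vdot(h1, h1).real)
    h2 = h2 / np.sqrt(np.vdot(h2, h2).real)
    corr = np.fft.ifft(np.fft.fft(h1) * np.conj(np.fft.fft(h2)))
    return np.max(np.abs(corr))

t = np.linspace(0.0, 1.0, 4096)
model_a = np.sin(2 * np.pi * 30 * t**2)   # toy chirp-like signal
model_b = np.roll(model_a, 100)           # same signal, time-shifted
mismatch = 1.0 - overlap(model_a, model_b)  # near zero: shift is maximized away
```

Two identical waveforms that differ only by a time shift give a mismatch near zero; genuinely different models (e.g. PN-calibrated vs. NR) would leave a residual mismatch that quantifies modeling error.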
Neuroimaging in psychiatric pharmacogenetics research: the promise and pitfalls.
Falcone, Mary; Smith, Ryan M; Chenoweth, Meghan J; Bhattacharjee, Abesh Kumar; Kelsoe, John R; Tyndale, Rachel F; Lerman, Caryn
2013-11-01
The integration of research on neuroimaging and pharmacogenetics holds promise for improving treatment for neuropsychiatric conditions. Neuroimaging may provide a more sensitive early measure of treatment response in genetically defined patient groups, and could facilitate development of novel therapies based on an improved understanding of pathogenic mechanisms underlying pharmacogenetic associations. This review summarizes progress in efforts to incorporate neuroimaging into genetics and treatment research on major psychiatric disorders, such as schizophrenia, major depressive disorder, bipolar disorder, attention-deficit/hyperactivity disorder, and addiction. Methodological challenges include: performing genetic analyses in small study populations used in imaging studies; inclusion of patients with psychiatric comorbidities; and the extensive variability across studies in neuroimaging protocols, neurobehavioral task probes, and analytic strategies. Moreover, few studies use pharmacogenetic designs that permit testing of genotype × drug effects. As a result of these limitations, few findings have been fully replicated. Future studies that pre-screen participants for genetic variants selected a priori based on drug metabolism and targets have the greatest potential to advance the science and practice of psychiatric treatment.
New materials and structures for photovoltaics
NASA Astrophysics Data System (ADS)
Zunger, Alex; Wagner, S.; Petroff, P. M.
1993-01-01
Despite the fact that over the years crystal chemists have discovered numerous semiconducting substances, and that modern epitaxial growth techniques are able to produce many novel atomic-scale architectures, current electronic and opto-electronic technologies are based on only a handful (~10) of traditional semiconductor core materials. This paper surveys a number of as-yet-unexploited classes of semiconductors, pointing to the much-needed research in screening, growing, and characterizing promising members of these classes. In light of the unmanageably large number of a priori possibilities, we emphasize the role that structural chemistry and modern computer-aided design must play in screening potentially important candidates. The basic classes of materials discussed here include nontraditional alloys, such as non-isovalent and heterostructural semiconductors; materials at reduced dimensionality, including superlattices, zeolite-caged nanostructures, and organic semiconductors; spontaneously ordered alloys; interstitial semiconductors; filled tetrahedral structures; ordered vacancy compounds; and compounds based on d- and f-electron elements. A collaborative effort among material predictor, material grower, and material characterizer holds the promise for a successful identification of new and exciting systems.
Stillman, Jennifer A; Fletcher, Richard B; Carr, Stuart C
2007-04-01
Research on groups is often applied to sport teams, and research on teams is often applied to groups. This study investigates the extent to which individuals have distinct schemas for groups and teams. A list of team and group characteristics was generated from 250 individuals, for use in this and related research. Questions about teams versus groups carry an a priori implication that differences exist; therefore, list items were presented to new participants and were analyzed using signal detection theory, which can accommodate a finding of no detectable difference between a nominated category and similar items. Participants were 30 members from each of the following: netball teams, the general public, and hobby groups. Analysis revealed few features that set groups apart from teams; however, teams were perceived as more structured and demanding, requiring commitment and effort toward shared goals. Team and group characteristics were more clearly defined to team members than they were to other participant groups. The research has implications for coaches and practitioners.
Oulevey Bachmann, Annie; Danuser, Brigitta; Morin, Diane
2015-10-01
Coexisting workloads from professional, household and family, and caregiving activities for frail parents expose middle-aged individuals, the so-called "Sandwich Generation" (SG), to potential health risks. Current trends suggest that this situation will continue or increase, so SG health promotion has become a nursing concern. Most existing research considers coexisting workloads a priori pathogenic, and most studies have examined the association of only one or two of these three activities with health. Few studies have used a nursing perspective. This article presents the development of a framework based on a nursing model. We integrated Siegrist's Effort-Reward Imbalance middle-range theory into the Neuman Systems Model. The latter was chosen for its salutogenic orientation, its attention to preventive nursing interventions, and the opportunity it provides to simultaneously consider positive and negative perceptions of SG health and SG coexisting workloads. Finally, it facilitated a theoretical identification of health-protective factors. © The Author(s) 2015.
NASA Astrophysics Data System (ADS)
Pollizzi, J. A.; Lezon, K.
The Hubble Space Telescope (HST) generates on the order of 7,000 telemetry values, many of which are sampled at 1 Hz, with several hundred parameters sampled at 40 Hz. Such data volumes would quickly tax even the largest of processing facilities. Yet the ability to access the telemetry data in a variety of ways, and in particular using ad hoc (i.e., not a priori fixed) queries, is essential to assuring the long-term viability and usefulness of this instrument. As part of the recent NASA initiative to re-engineer HST's ground control systems, a concept arose to apply newly available data warehousing technologies to this problem. The Space Telescope Science Institute was engaged to develop a pilot to investigate the technology and to create a proof-of-concept testbed that could be demonstrated and evaluated for operational use. This paper describes this effort and its results.
Under-Track CFD-Based Shape Optimization for a Low-Boom Demonstrator Concept
NASA Technical Reports Server (NTRS)
Wintzer, Mathias; Ordaz, Irian; Fenbert, James W.
2015-01-01
The detailed outer mold line shaping of a Mach 1.6, demonstrator-sized low-boom concept is presented. Cruise trim is incorporated a priori as part of the shaping objective, using an equivalent-area-based approach. Design work is performed using a gradient-driven optimization framework that incorporates a three-dimensional, nonlinear flow solver, a parametric geometry modeler, and sensitivities derived using the adjoint method. The shaping effort is focused on reducing the under-track sonic boom level using an inverse design approach, while simultaneously satisfying the trim requirement. Conceptual-level geometric constraints are incorporated in the optimization process, including the internal layout of fuel tanks, landing gear, engine, and crew station. Details of the model parameterization and design process are documented for both flow-through and powered states, and the performance of these optimized vehicles presented in terms of inviscid L/D, trim state, pressures in the near-field and at the ground, and predicted sonic boom loudness.
Teaching differential diagnosis to nurse practitioner students in a distance program.
Colella, Christine L; Beery, Theresa A
2014-08-01
An interactive case study (ICS) is a novel way to enhance the teaching of differential diagnosis to distance-learning nurse practitioner students. Distance education renders the use of many teaching strategies commonly used with face-to-face students difficult, if not impossible. To meet this pedagogical dilemma and to provide excellence in education, the ICS was developed. Kolb's theory of experiential learning supported the development and utilization of the ICS. This study sought to determine whether learning outcomes for the distance-learning students were equivalent to those of on-campus students who engaged in a live-patient encounter. The accuracy of differential diagnosis lists generated by onsite and online students was compared. Equivalence testing assessed clinical, rather than only statistical, significance in data from 291 students. The ICS responses from the distance-learning and onsite students differed by 4.9%, which was within the a priori equivalence margin of 10%. Narrative data supported the findings. Copyright 2014, SLACK Incorporated.
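Equivalence testing with an a priori margin, as in the study above, is commonly run as a two one-sided tests (TOST) procedure: declare equivalence when the (1 - 2*alpha) confidence interval for the group difference lies entirely inside the margin. The counts below are invented to echo the abstract's setup (two groups, ~5% observed difference, 10% margin); they are not the study's data, and its actual statistical procedure may differ.

```python
from math import sqrt
from statistics import NormalDist

def tost_two_proportions(x1, n1, x2, n2, margin, alpha=0.05):
    """TOST for equivalence of two proportions via the normal
    approximation: equivalent iff the (1 - 2*alpha) CI for p1 - p2
    falls strictly inside (-margin, +margin)."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    z = NormalDist().inv_cdf(1 - alpha)            # one-sided critical value
    lo, hi = diff - z * se, diff + z * se
    return (-margin < lo) and (hi < margin), diff, (lo, hi)

# Hypothetical counts: 138/145 correct onsite vs 131/145 online,
# an observed difference of about 4.8%, tested against a 10% margin.
equivalent, diff, ci = tost_two_proportions(138, 145, 131, 145, margin=0.10)
```

Note that a small observed difference alone is not enough: with too few students the confidence interval widens past the margin and equivalence cannot be claimed, which is why the margin must be fixed a priori and the sample sized accordingly.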
Discovering spatio-temporal models of the spread of West Nile virus.
Orme-Zavaleta, Jennifer; Jorgensen, Jane; D'Ambrosio, Bruce; Altendorf, Eric; Rossignol, Philippe A
2006-04-01
Emerging infectious diseases are characterized by complex interactions among disease agents, vectors, wildlife, humans, and the environment. Since the appearance of West Nile virus (WNV) in New York City in 1999, it has infected over 8,000 people in the United States, resulting in several hundred deaths in 46 contiguous states. The virus is transmitted by mosquitoes and maintained in various bird reservoir hosts. Its unexpected introduction, high morbidity, and rapid spread have left public health agencies facing severe time constraints in a theory-poor environment, dependent largely on observational data collected by independent survey efforts and subject to much uncertainty. Current knowledge may be expressed as a priori constraints on models learned from data. Accordingly, we applied a Bayesian probabilistic relational approach to generate spatially and temporally linked models from heterogeneous data sources. Using data collected from multiple independent sources in Maryland, we discovered the integrated context in which infected birds are plausible indicators of positive mosquito pools and human cases for 2001 and 2002.
Youth self-report of child maltreatment in representative surveys: a systematic review
Jessica, Laurin*; Caroline, Wallace*; Jasminka, Draca; Sarah, Aterman; Lil, Tonmyr
2018-01-01
Introduction: This systematic review identified population-representative youth surveys containing questions on self-reported child maltreatment. Data quality and ethical issues pertinent to maltreatment data collection were also examined. Methods: A search was conducted of relevant online databases for articles published from January 2000 through March 2016 reporting on population-representative data measuring child maltreatment. Inclusion criteria were established a priori; two reviewers independently assessed articles to ensure that the criteria were met and to verify the accuracy of extracted information. Results: A total of 73 articles reporting on 71 surveys met the inclusion criteria. A variety of strategies to ensure accurate information and to mitigate survey participants’ distress were reported. Conclusion: The extent to which efforts have been undertaken to measure the prevalence of child maltreatment reflects its perceived importance across the world. Data on child maltreatment can be effectively collected from youth, although our knowledge of best practices related to ethics and data quality is incomplete. PMID:29443484
Web-client based distributed generalization and geoprocessing
Wolf, E.B.; Howe, K.
2009-01-01
Generalization and geoprocessing operations on geospatial information were once the domain of complex software running on high-performance workstations. Currently, these computationally intensive processes are the domain of desktop applications. Recent efforts have been made to move geoprocessing operations server-side in a distributed, web-accessible environment. This paper initiates research into portable client-side generalization and geoprocessing operations as part of a larger effort in user-centered design for the US Geological Survey's The National Map. An implementation of the Ramer-Douglas-Peucker (RDP) line simplification algorithm was created in the open-source OpenLayers geoweb client. This algorithm implementation was benchmarked using differing data structures and browser platforms. The implementation and results of the benchmarks are discussed in the general context of client-side geoprocessing.
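The RDP algorithm benchmarked in the paper above is compact enough to sketch. This is a generic Python rendering of the well-known algorithm, not the authors' OpenLayers JavaScript implementation: keep the endpoints of a polyline, find the vertex farthest from the chord, and recurse on both halves only if that distance exceeds the tolerance.

```python
import math

def _perp_dist(pt, a, b):
    """Perpendicular distance from pt to the infinite line through a and b."""
    (x, y), (x1, y1), (x2, y2) = pt, a, b
    dx, dy = x2 - x1, y2 - y1
    seg = math.hypot(dx, dy)
    if seg == 0.0:                     # degenerate chord: fall back to point distance
        return math.hypot(x - x1, y - y1)
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / seg

def rdp(points, epsilon):
    """Ramer-Douglas-Peucker line simplification with tolerance epsilon."""
    if len(points) < 3:
        return list(points)
    dmax, idx = 0.0, 0
    for i in range(1, len(points) - 1):
        d = _perp_dist(points[i], points[0], points[-1])
        if d > dmax:
            dmax, idx = d, i
    if dmax <= epsilon:                # every vertex is within tolerance of the chord
        return [points[0], points[-1]]
    left = rdp(points[: idx + 1], epsilon)
    right = rdp(points[idx:], epsilon)
    return left[:-1] + right           # drop the duplicated split vertex
```

Client-side benchmarking of this routine mostly stresses the distance loop, which is why the paper's choice of data structure (arrays vs. object vertices) and browser engine matters for performance.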
Implementation Plan for Chemical Industry R&D Roadmap for Nanomaterials by Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
none,
2006-04-01
The purpose of this effort is to develop an implementation plan to realize the vision and goals identified in the Chemical Industry R&D Roadmap for Nanomaterials By Design: From Fundamentals to Function.
Implementation program on high performance concrete: guidelines for instrumentation on bridges
DOT National Transportation Integrated Search
1996-08-01
This report provides an outline for the instrumentation of bridges being constructed under the Federal Highway Administration's (FHWA's) Strategic Highway Research Program (SHRP) implementation effort in High Performance Concrete (HPC). The report de...
FY 95 engineering work plan for the design reconstitution implementation action plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bigbee, J.D.
Design reconstitution work is to be performed as part of an overall effort to upgrade Configuration Management (CM) at TWRS. WHC policy is to implement a program that is compliant with DOE-STD-1073-93, Guide for Operational Configuration Management Program. DOE-STD-1073 requires an adjunct program for reconstituting design information. WHC-SD-WM-CM-009, Design Reconstitution Program Plan for Waste Tank Farms and 242-A Evaporator of Tank Waste Remediation System, is the TWRS plan for meeting DOE-STD-1073 design reconstitution requirements. The design reconstitution plan is complex, requiring significant time and effort for implementation. In order to control costs and integrate the work into other TWRS activities, a Design Reconstitution Implementation Action Plan (DR IAP) will be developed and approved by those organizations having ownership or functional interest in this activity.
KSC management training system project
NASA Technical Reports Server (NTRS)
Sepulveda, Jose A.
1993-01-01
The stated objectives for the summer of 1993 were: to review the Individual Development Plan Surveys for 1994 in order to automate the analysis of the Needs Assessment effort; and to develop and implement evaluation methodologies to perform ongoing program-wide course-to-course assessment. This includes the following: to propose a methodology to develop and implement objective, performance-based assessment instruments for each training effort; to mechanize course evaluation forms and develop software to facilitate the data gathering, analysis, and reporting processes; and to implement the methodology, forms, and software in at least one training course or seminar selected among those normally offered in the summer at KSC. Section two of this report addresses the work done in regard to the Individual Development Plan Surveys for 1994. Section three presents the methodology proposed to develop and implement objective, performance-based assessment instruments for each training course offered at KSC.
A computational approach to compare regression modelling strategies in prediction research.
Pajouheshnia, Romin; Pestman, Wiebe R; Teerenstra, Steven; Groenwold, Rolf H H
2016-08-25
It is often unclear which approach to fit, assess and adjust a model will yield the most accurate prediction model. We present an extension of an approach for comparing modelling strategies in linear regression to the setting of logistic regression and demonstrate its application in clinical prediction research. A framework for comparing logistic regression modelling strategies by their likelihoods was formulated using a wrapper approach. Five different strategies for modelling, including simple shrinkage methods, were compared in four empirical data sets to illustrate the concept of a priori strategy comparison. Simulations were performed in both randomly generated data and empirical data to investigate the influence of data characteristics on strategy performance. We applied the comparison framework in a case study setting. Optimal strategies were selected based on the results of a priori comparisons in a clinical data set and the performance of models built according to each strategy was assessed using the Brier score and calibration plots. The performance of modelling strategies was highly dependent on the characteristics of the development data in both linear and logistic regression settings. A priori comparisons in four empirical data sets found that no strategy consistently outperformed the others. The percentage of times that a model adjustment strategy outperformed a logistic model ranged from 3.9 to 94.9 %, depending on the strategy and data set. However, in our case study setting the a priori selection of optimal methods did not result in detectable improvement in model performance when assessed in an external data set. The performance of prediction modelling strategies is a data-dependent process and can be highly variable between data sets within the same clinical domain. A priori strategy comparison can be used to determine an optimal logistic regression modelling strategy for a given data set before selecting a final modelling approach.
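For reference, the Brier score used to assess the models is simply the mean squared difference between predicted probabilities and observed binary outcomes; a minimal sketch (variable names are ours, not from the paper):

```python
import numpy as np

def brier_score(y_true, y_prob):
    """Mean squared difference between predicted probabilities and observed
    binary outcomes; lower is better. A minimal sketch of the score used
    to assess prediction models."""
    y_true = np.asarray(y_true, dtype=float)
    y_prob = np.asarray(y_prob, dtype=float)
    return float(np.mean((y_prob - y_true) ** 2))

# four patients: predicted risks vs. observed events
score = brier_score([1, 0, 1, 0], [0.9, 0.1, 0.8, 0.3])   # 0.0375
```

A model that always predicted the observed outcome exactly would score 0; an uninformative constant prediction of 0.5 scores 0.25.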
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pin, F.G.; de Saussure, G.; Spelt, P.F.
1988-01-01
This paper describes recent research activities at the Center for Engineering Systems Advanced Research (CESAR) in the area of sensor-based reasoning, with emphasis being given to their application and implementation on our HERMIES-IIB autonomous mobile vehicle. These activities, including navigation and exploration in a priori unknown and dynamic environments, goal recognition, vision-guided manipulation and sensor-driven machine learning, are discussed within the framework of a scenario in which an autonomous robot is asked to navigate through an unknown dynamic environment, explore, find and dock at a process control panel, read and understand the status of the panel's meters and dials, learn the functioning of the panel, and successfully manipulate its control devices to solve a maintenance emergency problem. A demonstration of the successful implementation of the algorithms on our HERMIES-IIB autonomous robot for resolution of this scenario is presented. Conclusions are drawn concerning the applicability of the methodologies to more general classes of problems and implications for future work on sensor-driven reasoning for autonomous robots are discussed. 8 refs., 3 figs.
A comparison of two adaptive algorithms for the control of active engine mounts
NASA Astrophysics Data System (ADS)
Hillis, A. J.; Harrison, A. J. L.; Stoten, D. P.
2005-08-01
This paper describes work conducted in order to control automotive active engine mounts, consisting of a conventional passive mount and an internal electromagnetic actuator. Active engine mounts seek to cancel the oscillatory forces generated by the rotation of out-of-balance masses within the engine. The actuator generates a force dependent on a control signal from an algorithm implemented with a real-time DSP. The filtered-x least-mean-square (FXLMS) adaptive filter is used as a benchmark for comparison with a new implementation of the error-driven minimal controller synthesis (Er-MCSI) adaptive controller. Both algorithms are applied to an active mount fitted to a saloon car equipped with a four-cylinder turbo-diesel engine, and have no a priori knowledge of the system dynamics. The steady-state and transient performance of the two algorithms are compared and the relative merits of the two approaches are discussed. The Er-MCSI strategy offers significant computational advantages as it requires no cancellation path modelling. The Er-MCSI controller is found to perform in a fashion similar to the FXLMS filter—typically reducing chassis vibration by 50-90% under normal driving conditions.
A labview-based GUI for the measurement of otoacoustic emissions.
Wu, Ye; McNamara, D M; Ziarani, A K
2006-01-01
This paper presents the outcome of a software development project aimed at creating a stand-alone user-friendly signal processing algorithm for the estimation of distortion product otoacoustic emission (OAE) signals. OAE testing is one of the most commonly used methods of first screening of newborns' hearing. Most of the currently available commercial devices rely upon averaging long strings of data and subsequent discrete Fourier analysis to estimate low level OAE signals from within the background noise in the presence of the strong stimuli. The main shortcomings of the presently employed technology are the long measurement time required and its low noise immunity. The result of the software development project presented here is a graphical user interface (GUI) module that implements a recently introduced adaptive technique of OAE signal estimation. This software module is easy to use and is freely disseminated on the Internet for the use of the hearing research community. This GUI module allows loading of previously recorded OAE signals into the workspace, and provides the user with interactive instructions for OAE signal estimation. Moreover, the user can generate simulated OAE signals to objectively evaluate the performance capability of the implemented signal processing technique.
Benchimol, Eric I.; Smeeth, Liam; Guttmann, Astrid; Harron, Katie; Moher, David; Petersen, Irene; Sørensen, Henrik T.; von Elm, Erik; Langan, Sinéad M.
2015-01-01
Routinely collected health data, obtained for administrative and clinical purposes without specific a priori research goals, are increasingly used for research. The rapid evolution and availability of these data have revealed issues not addressed by existing reporting guidelines, such as Strengthening the Reporting of Observational Studies in Epidemiology (STROBE). The REporting of studies Conducted using Observational Routinely collected health Data (RECORD) statement was created to fill these gaps. RECORD was created as an extension to the STROBE statement to address reporting items specific to observational studies using routinely collected health data. RECORD consists of a checklist of 13 items related to the title, abstract, introduction, methods, results, and discussion section of articles, and other information required for inclusion in such research reports. This document contains the checklist and explanatory and elaboration information to enhance the use of the checklist. Examples of good reporting for each RECORD checklist item are also included herein. This document, as well as the accompanying website and message board (http://www.record-statement.org), will enhance the implementation and understanding of RECORD. Through implementation of RECORD, authors, journal editors, and peer reviewers can encourage transparency of research reporting. PMID:26440803
Benchimol, Eric I; Smeeth, Liam; Guttmann, Astrid; Harron, Katie; Moher, David; Petersen, Irene; Sørensen, Henrik T; von Elm, Erik; Langan, Sinéad M
2015-10-01
Routinely collected health data, obtained for administrative and clinical purposes without specific a priori research goals, are increasingly used for research. The rapid evolution and availability of these data have revealed issues not addressed by existing reporting guidelines, such as Strengthening the Reporting of Observational Studies in Epidemiology (STROBE). The REporting of studies Conducted using Observational Routinely collected health Data (RECORD) statement was created to fill these gaps. RECORD was created as an extension to the STROBE statement to address reporting items specific to observational studies using routinely collected health data. RECORD consists of a checklist of 13 items related to the title, abstract, introduction, methods, results, and discussion section of articles, and other information required for inclusion in such research reports. This document contains the checklist and explanatory and elaboration information to enhance the use of the checklist. Examples of good reporting for each RECORD checklist item are also included herein. This document, as well as the accompanying website and message board (http://www.record-statement.org), will enhance the implementation and understanding of RECORD. Through implementation of RECORD, authors, journal editors, and peer reviewers can encourage transparency of research reporting.
Benchimol, Eric I; Smeeth, Liam; Guttmann, Astrid; Harron, Katie; Hemkens, Lars G; Moher, David; Petersen, Irene; Sørensen, Henrik T; von Elm, Erik; Langan, Sinéad M
2016-10-01
Routinely collected health data, obtained for administrative and clinical purposes without specific a priori research goals, are increasingly used for research. The rapid evolution and availability of these data have revealed issues not addressed by existing reporting guidelines, such as Strengthening the Reporting of Observational Studies in Epidemiology (STROBE). The REporting of studies Conducted using Observational Routinely collected health Data (RECORD) statement was created to fill these gaps. RECORD was created as an extension to the STROBE statement to address reporting items specific to observational studies using routinely collected health data. RECORD consists of a checklist of 13 items related to the title, abstract, introduction, methods, results, and discussion section of articles, and other information required for inclusion in such research reports. This document contains the checklist as well as explanatory and elaboration information to enhance the use of the checklist. Examples of good reporting for each RECORD checklist item are also included. This document, as well as the accompanying website and message board (http://www.record-statement.org), will improve the implementation and understanding of RECORD. By implementing RECORD, authors, journal editors, and peer reviewers can enhance transparency of research reporting. Copyright © 2016. Published by Elsevier GmbH.
Mendieta-Moreno, Jesús I; Marcos-Alcalde, Iñigo; Trabada, Daniel G; Gómez-Puertas, Paulino; Ortega, José; Mendieta, Jesús
2015-01-01
Quantum mechanics/molecular mechanics (QM/MM) methods are excellent tools for the modeling of biomolecular reactions. Recently, we have implemented a new QM/MM method (Fireball/Amber), which combines an efficient density functional theory method (Fireball) and a well-recognized molecular dynamics package (Amber), offering an excellent balance between accuracy and sampling capabilities. Here, we present a detailed explanation of the Fireball method and Fireball/Amber implementation. We also discuss how this tool can be used to analyze reactions in biomolecules using steered molecular dynamics simulations. The potential of this approach is shown by the analysis of a reaction catalyzed by the enzyme triose-phosphate isomerase (TIM). The conformational space and energetic landscape for this reaction are analyzed without a priori assumptions about the protonation states of the different residues during the reaction. The results offer a detailed description of the reaction and reveal some new features of the catalytic mechanism. In particular, we find a new reaction mechanism that is characterized by the intramolecular proton transfer from O1 to O2 and the simultaneous proton transfer from Glu 165 to C2. Copyright © 2015 Elsevier Inc. All rights reserved.
The potential application of the blackboard model of problem solving to multidisciplinary design
NASA Technical Reports Server (NTRS)
Rogers, James L.
1989-01-01
The potential application of the blackboard model of problem solving to multidisciplinary design is discussed. Multidisciplinary design problems are complex, poorly structured, and lack a predetermined decision path from the initial starting point to the final solution. The final solution is achieved using data from different engineering disciplines. Ideally, for the final solution to be the optimum solution, there must be a significant amount of communication among the different disciplines plus intradisciplinary and interdisciplinary optimization. In reality, this is not what happens in today's sequential approach to multidisciplinary design. Therefore it is highly unlikely that the final solution is the true optimum solution from an interdisciplinary optimization standpoint. A multilevel decomposition approach is suggested as a technique to overcome the problems associated with the sequential approach, but no tool currently exists with which to fully implement this technique. A system based on the blackboard model of problem solving appears to be an ideal tool for implementing this technique because it offers an incremental problem solving approach that requires no a priori determined reasoning path. Thus it has the potential of finding a more optimum solution for the multidisciplinary design problems found in today's aerospace industries.
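The blackboard architecture described above can be sketched in a few lines: independent knowledge sources watch a shared data store and fire whenever their preconditions are met, so the reasoning path emerges from the data rather than being fixed a priori. A toy sketch with invented, hypothetical "discipline" rules (not from the NASA report):

```python
class Blackboard:
    """Shared data store that independent knowledge sources read and write."""
    def __init__(self):
        self.data = {}

def run(blackboard, knowledge_sources):
    """Control loop: fire any knowledge source whose precondition holds,
    repeating until no source can act. No reasoning path is predetermined."""
    progress = True
    while progress:
        progress = False
        for precondition, action in knowledge_sources:
            if precondition(blackboard.data):
                action(blackboard.data)
                progress = True
    return blackboard.data

# Two toy "discipline" sources (hypothetical names and rules, for
# illustration only): one posts a wing area, the next derives wing loading.
sources = [
    (lambda d: "span" in d and "chord" in d and "area" not in d,
     lambda d: d.update(area=d["span"] * d["chord"])),
    (lambda d: "area" in d and "weight" in d and "loading" not in d,
     lambda d: d.update(loading=d["weight"] / d["area"])),
]
bb = Blackboard()
bb.data.update(span=10.0, chord=2.0, weight=5000.0)
result = run(bb, sources)   # area=20.0, loading=250.0
```

The second source cannot fire until the first has posted its partial result to the blackboard, which is the incremental, opportunistic behavior the passage describes.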
Polydisperse sphere packing in high dimensions, a search for an upper critical dimension
NASA Astrophysics Data System (ADS)
Morse, Peter; Clusel, Maxime; Corwin, Eric
2012-02-01
The recently introduced granocentric model for polydisperse sphere packings has been shown to be in good agreement with experimental and simulational data in two and three dimensions. This model relies on two effective parameters that have to be estimated from experimental/simulational results. The non-trivial values obtained allow the model to take into account the essential effects of correlations in the packing. Once these parameters are set, the model provides a full statistical description of a sphere packing for a given polydispersity. We investigate the evolution of these effective parameters with the spatial dimension to see if, in analogy with the upper critical dimension in critical phenomena, there exists a dimension above which correlations become irrelevant and the model parameters can be fixed a priori as a function of polydispersity. This would turn the model into a proper theory of polydisperse sphere packings at that upper critical dimension. We perform infinite temperature quench simulations of frictionless polydisperse sphere packings in dimensions 2-8 using a parallel algorithm implemented on a GPGPU. We analyze the resulting packings by implementing an algorithm to calculate the additively weighted Voronoi diagram in arbitrary dimension.
NASA Technical Reports Server (NTRS)
Bey, Kim S.; Oden, J. Tinsley
1993-01-01
A priori error estimates are derived for hp-versions of the finite element method for discontinuous Galerkin approximations of a model class of linear, scalar, first-order hyperbolic conservation laws. These estimates are derived in a mesh-dependent norm in which the coefficients depend upon both the local mesh size h_K and a number p_K, which can be identified with the spectral order of the local approximations over each element.
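For orientation, hp a priori estimates of this kind are typically stated in the following form (an illustrative sketch only; the exponents and the precise mesh-dependent norm in the paper may differ):

```latex
% Typical shape of an hp a priori estimate for DG applied to a linear
% hyperbolic problem; h_K is the local mesh size, p_K the local spectral
% order, and s_K the local Sobolev regularity of the exact solution u.
\| u - u_{hp} \|_{DG}
  \le C \left( \sum_{K} \frac{h_K^{2\mu_K - 1}}{p_K^{2 s_K - 1}}
  \, \| u \|_{H^{s_K}(K)}^2 \right)^{1/2},
\qquad \mu_K = \min(p_K + 1,\; s_K).
```

The estimate expresses the usual hp trade-off: refining the mesh (smaller h_K) or raising the local order (larger p_K) both drive the error down, with the rate limited by the local regularity s_K.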
Monte Carlo Simulations: Number of Iterations and Accuracy
2015-07-01
This report addresses a priori estimation of the number of Monte Carlo (MC) iterations and the accuracy of MC results. It recommends that the WM be used for a priori estimates of the number of MC iterations, because of the added complexity of the WSM compared to the WM, while noting that both the WM and the WSM have generally proven useful in estimating the number of MC iterations and addressing the accuracy of MC results. Contents include: A Priori Estimate of Number of MC Iterations; MC Result Accuracy; Using Percentage Error of the Mean to Estimate Number of MC ...
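As a concrete illustration of an a priori iteration estimate, a normal-approximation bound on the percentage error of the mean gives the following rule of thumb (a generic z-based sketch; the function name and the bound are our assumptions, not the report's definition of the WM):

```python
import math

def mc_iterations_for_percent_error(sample_mean, sample_std, percent_error, z=1.96):
    """Iterations n needed so the z-level confidence half-width
    z * std / sqrt(n) is within `percent_error` percent of the mean.
    Normal-approximation sketch from a pilot run's mean and std."""
    target = abs(sample_mean) * percent_error / 100.0
    return math.ceil((z * sample_std / target) ** 2)

# pilot run gave mean 10.0 and std 2.0; target 1% error at ~95% confidence
n = mc_iterations_for_percent_error(10.0, 2.0, 1.0)   # 1537 iterations
```

Halving the target percentage error roughly quadruples the required number of iterations, which is the familiar 1/sqrt(n) cost of MC accuracy.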
Ohlander, Johan; Weigl, Matthias; Petru, Raluca; Angerer, Peter; Radon, Katja
2015-05-01
Work stress among physicians is a growing concern in various countries and has led to migration. We compared the working conditions and the work stress between a migrated population of German physicians in Sweden and a population of physicians based in Germany. Additionally, specific risk factors for work stress were examined for each country. Using a cross-sectional design, 85 German physicians employed in Sweden were surveyed on working conditions and effort-reward imbalance and compared with corresponding data on 561 physicians working in Germany. Multiple linear regression analyses were applied to both populations separately to model the associations between working conditions and effort-reward ratio (ERR), adjusted for a priori confounders. German physicians in Sweden had a significantly lower ERR than physicians in Germany: mean (M) = 0.47, standard deviation (SD) = 0.24 vs. M = 0.80, SD = 0.35. Physicians in Sweden worked on average 8 h less per week and reported higher work support and responsibility. Multivariate analyses showed in both populations a negative association between work support and the ERR (β = -0.148, 95% CI -0.215 to (-0.081) for physicians in Sweden and β = -0.174, 95% CI -0.240 to (-0.106) for physicians in Germany). Further significant associations with the ERR were found among physicians in Sweden for daily breaks (β = -0.002, 95% CI -0.004 to (-0.001)) and among physicians in Germany for working hours per week (β = 0.006, 95% CI 0.002-0.009). Our findings show substantial differences in work stress and working conditions in favor of migrated German physicians in Sweden. To confirm our results and to explain demonstrated differences in physicians' work stress, longitudinal studies are recommended.
Leadbeater, Bonnie J; Dishion, Tom; Sandler, Irwin; Bradshaw, Catherine P; Dodge, Kenneth; Gottfredson, Denise; Graham, Phillip W; Lindstrom Johnson, Sarah; Maldonado-Molina, Mildred M; Mauricio, Anne M; Smith, Emilie Phillips
2018-06-23
Prevention science researchers and practitioners are increasingly engaged in a wide range of activities and roles to promote evidence-based prevention practices in the community. Ethical concerns invariably arise in these activities and roles that may not be explicitly addressed by university or professional guidelines for ethical conduct. In 2015, the Society for Prevention Research (SPR) Board of Directors commissioned Irwin Sandler and Tom Dishion to organize a series of roundtables and establish a task force to identify salient ethical issues encountered by prevention scientists and community-based practitioners as they collaborate to implement evidence-based prevention practices. This article documents the process and findings of the SPR Ethics Task Force and aims to inform continued efforts to articulate ethical practice. Specifically, the SPR membership and task force identified prevention activities that commonly stemmed from implementation and scale-up efforts. This article presents examples that illustrate typical ethical dilemmas. We present principles and concepts that can be used to frame the discussion of ethical concerns that may be encountered in implementation and scale-up efforts. We summarize value statements that stemmed from our discussion. We also conclude that the field of prevention science in general would benefit from standards and guidelines to promote ethical behavior and social justice in the process of implementing evidence-based prevention practices in community settings. It is our hope that this article serves as an educational resource for students, investigators, and Human Subjects Review Board members regarding some of the complexity of issues of fairness, equality, diversity, and personal rights for implementation of preventive interventions.
Access management implementation in Kentucky technical support document and status report.
DOT National Transportation Integrated Search
2008-05-01
This report describes the efforts of the Kentucky Transportation Cabinet's Access Management implementation Task Force. The task force was established in May 2004 and was charged with the responsibility of reviewing and refining the recommendations i...
Recommendations for Establishing the Texas Roadway Research Implementation Center
DOT National Transportation Integrated Search
1998-07-01
The overall objective of the Roadway Research Initiative study was to describe an advanced testing capability, one that would speed implementation of the results from traditional computer and laboratory-based research efforts by providing a reusable t...
Rajan, Sonali; Roberts, Katherine J; Guerra, Laura; Pirsch, Moira; Morrell, Ernest
2017-12-01
School-based health education efforts can positively affect health behaviors and learning outcomes; however, there is limited available time during the school day for separate health education classes. The purpose of this study was to assess the feasibility and sustainability of implementing a classroom-based health education program that integrates skill development with health learning. A wait-list control study design was conducted among 168 6th graders in 2 urban schools. Data on program implementation, feasibility, and health outcomes were collected from students at 3 time points and from 5 teachers across the implementation of the 10-week program. There were barriers to implementation, including time limitations, unexpected school-wide disruptions, and variations in student reading ability and teacher preparedness. However, analyses revealed there were significant increases in self-efficacy regarding fruit and vegetable consumption and outcome expectations following program implementation, which were also sustained post-program implementation. Despite inconsistent implementation in the wait-list control school, small gains were also noted following the completion of the program. Integrating health education efforts within core curricula classes can lead to favorable outcomes. However, implementation barriers must be actively addressed by schools and program developers to improve program fidelity and maximize the sustainability of program gains. © 2017, American School Health Association.
NASA Astrophysics Data System (ADS)
Akuma, Fru Vitalis; Callaghan, Ronel
2017-11-01
Inquiry-based science education has been incorporated in science curricula internationally. In this regard, however, many teachers encounter challenges. The challenges have been characterised into those linked to the personal characteristics of these teachers (intrinsic challenges) and others associated with contextual factors (extrinsic challenges). However, this level of characterisation is inadequate in terms of appreciating the complexity of the challenges, tracking of their development, and discovering knowledge within specific categories. Against this background, the purpose of the research presented here was to characterise extrinsic challenges linked to the design and implementation of inquiry-based practical work. In order to do so, we used a conceptual framework of teaching challenges based on Bronfenbrenner's ecological theory of human development. The data gathered using a multi-method case study of practical work in two South African high schools, was analysed by combining the data-driven inductive approach and the deductive a priori template of codes approach in thematic analysis. On this basis, the extrinsic challenges linked to the design and implementation of inquiry-based practical work that participants are confronted with, were found to consist of macrosystem challenges (such as a restrictive curriculum) and microsystem challenges. At the latter level, the challenges are material-related (e.g., lack of science education equipment and materials) or non-material-related (such as time constraints and the lack of access to interactive computer simulations). We have discussed the theory-, practice- and research-based implications of these results in relation to the design and implementation of inquiry-based practical work in South Africa and internationally.
Combating terrorism : federal agencies' efforts to implement national policy and strategy
DOT National Transportation Integrated Search
1997-09-01
The threat of terrorist attacks against U.S. citizens and property both at home and abroad has been an issue of growing national concern. As requested, the General Accounting Office (GAO) reviewed U.S. efforts to combat terrorism. This report discuss...
Cultivating Innovation in an Age of Accountability: Tech-Savvy Leadership
ERIC Educational Resources Information Center
Sterrett, William L.; Richardson, Jayson W.
2017-01-01
District and school leaders are uniquely poised to serve as collaborative, innovative leaders. However, the context of current challenges often detracts from implementation efforts. This case describes how district leadership works collaboratively with school administrators through targeted efforts related to professional development, articulation…
Road weather information system environmental sensor station siting guide, version 2.0
DOT National Transportation Integrated Search
2008-11-01
FHWA initiated an effort in 2007 to evaluate and update, as necessary, the ESS Guidelines first published in 2004 (FHWA-HOP-05-026). This effort is summarized in a companion report Implementation and Evaluation of RWIS ESS Siting Guidelines. The...
The Conservation Efforts Database: Improving our knowledge of landscape conservation actions
Heller, Matthew M.; Welty, Justin; Wiechman , Lief A.
2017-01-01
The Conservation Efforts Database (CED) is a secure, cloud-based tool that can be used to document and track conservation actions across landscapes. A recently released factsheet describes this tool ahead of the rollout of CED version 2.0. The CED was developed by the U.S. Fish and Wildlife Service, the USGS, and the Great Northern Landscape Conservation Cooperative to support the 2015 Endangered Species Act status review for greater sage-grouse. Currently, the CED accepts policy-level data, such as Land Use Plans, and treatment level data, such as conifer removals and post-fire recovery efforts, as custom spatial and non-spatial records. In addition to a species assessment tool, the CED can also be used to summarize the extent of restoration efforts within a specific area or to strategically site conservation actions based on the location of other implemented actions. The CED can be an important tool, along with post-conservation monitoring, for implementing landscape-scale adaptive management.
ERIC Educational Resources Information Center
Green, Tim; Williamson, Margie E.; Endris, William L., Jr.
2000-01-01
Describes how the Vernon Parish (Louisiana) School District implemented Governmental Accounting Standards Board (GASB) Statement No. 34 for fiscal year 1999. Implementing GASB 34 was mentally challenging and demanded a team effort. The system uses columnar displays for major funds and contains burdensome capital-assets accounting standards. (MLH)
ERIC Educational Resources Information Center
Brasiel, S.; Nafziger, D.
2013-01-01
State education agencies (SEAs) are central players in initiating and leading new reform efforts and in supporting and implementing Federal initiatives. Traditional approaches to providing public information are not adequate for producing public awareness and support and in supporting program implementation at the local level. With limited…
ERIC Educational Resources Information Center
Lee, Sarah L.
2012-01-01
The purpose of this study was to describe levels of RTI implementation in West Virginia elementary schools. Little is known about the national efforts that states are collectively undertaking to scale up implementation of RTI (Hoover, Baca, Wexler-Love, & Saenz, 2008). West Virginia's elementary schools were required by state policy to…
ERIC Educational Resources Information Center
Greenwood, Charles R.; Abbott, Mary; Beecher, Constance; Atwater, Jane; Petersen, Sarah
2017-01-01
Increasingly, prekindergarten programs with literacy outcome goals are seeking to implement evidence-based practices to improve results. Such efforts require instructional intervention strategies to engage children as well as strategies to support teacher implementation. Reported is the iterative development of Literacy 3D, an enhanced support…
Equilibrium statistical mechanics on correlated random graphs
NASA Astrophysics Data System (ADS)
Barra, Adriano; Agliari, Elena
2011-02-01
Biological and social networks have recently attracted great attention from physicists. Among several aspects, two main ones may be stressed: a non-trivial topology of the graph describing the mutual interactions between agents and, typically, imitative, weighted, interactions. Despite such aspects being widely accepted and empirically confirmed, the schemes currently exploited in order to generate the expected topology are based on a priori assumptions and, in most cases, implement constant intensities for links. Here we propose a simple shift [-1,+1] → [0,+1] in the definition of patterns in a Hopfield model: a straightforward effect is the conversion of frustration into dilution. In fact, we show that by varying the bias of the pattern distribution, the network topology (generated by the reciprocal affinities among agents, i.e. the Hebbian rule) crosses various well-known regimes, ranging from fully connected, to an extreme dilution scenario, then to completely disconnected. These features, as well as small-world properties, are, in this context, emergent and no longer imposed a priori. The model is also investigated throughout from a thermodynamic perspective: the Ising model defined on the resulting graph is analytically solved (at a replica symmetric level) by extending the double stochastic stability technique, and presented together with its fluctuation theory for a picture of criticality. Overall, our findings show that, at least at equilibrium, dilution (of whatever kind) simply decreases the strength of the coupling felt by the spins, but leaves the paramagnetic/ferromagnetic flavors unchanged. The main difference with respect to previous investigations is that, within our approach, replicas do not appear: instead of (multi)-overlaps as order parameters, we introduce a class of magnetizations on all the possible subgraphs belonging to the main one investigated: as a consequence, for these objects a closure for a self-consistent relation is achieved.
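A toy sketch of the pattern shift: drawing 0/1 patterns with bias p and forming Hebbian couplings yields non-negative (hence unfrustrated) couplings, many of which vanish exactly at low bias, i.e. frustration is traded for dilution. Illustrative only; the sizes and parameter names are invented, and the paper's replica-free analysis is not reproduced here:

```python
import numpy as np

def hebbian_couplings(n_spins, n_patterns, bias, seed=0):
    """Hebbian couplings J = xi^T xi built from 0/1 patterns drawn with
    P(xi = 1) = bias. All entries are non-negative, so there is no
    frustration; at low bias most couplings are exactly zero (dilution)."""
    rng = np.random.default_rng(seed)
    xi = (rng.random((n_patterns, n_spins)) < bias).astype(float)
    J = xi.T @ xi
    np.fill_diagonal(J, 0.0)   # no self-couplings
    return J

J = hebbian_couplings(200, 5, 0.05)
print("fraction of zero couplings:", float(np.mean(J == 0.0)))
```

Sweeping the bias from high to low moves the induced graph from essentially fully connected to extremely diluted and finally disconnected, mirroring the regimes described in the abstract.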
Zeni, Mary Beth
2012-03-01
The purpose of this study was to evaluate if paediatric asthma educational intervention studies included in the Cochrane Collaboration database incorporated concepts of health literacy. Inclusion criteria were established to identify review categories in the Cochrane Collaboration database specific to paediatric asthma educational interventions. Articles that met the inclusion criteria were selected from the Cochrane Collaboration database in 2010. The health literacy definition from Healthy People 2010 was used to develop a 4-point a priori rating scale to determine the extent a study reported aspects of health literacy in the development of an educational intervention for parents and/or children. Five Cochrane review categories met the inclusion criteria; 75 studies were rated for health literacy content regarding educational interventions with families and children living with asthma. A priori criteria were used for the rating process. While 52 (69%) studies had no information pertaining to health literacy, 23 (31%) reported an aspect of health literacy. Although all studies maintained the rigorous standards of randomized clinical trials, a model of health literacy was not reported regarding the design and implementation of interventions. While a more comprehensive health literacy model for the development of educational interventions with families and children may have been available after the reviewed studies were conducted, general literacy levels still could have been addressed. The findings indicate a need to incorporate health literacy in the design of client-centred educational interventions and in the selection criteria of relevant Cochrane reviews. Inclusion assures that health literacy is as important as randomization and statistical analyses in the research design of educational interventions and may even assure participation of people with literacy challenges. © 2012 The Author. 
International Journal of Evidence-Based Healthcare © 2012 The Joanna Briggs Institute.
A Hydrological Modeling Framework for Flood Risk Assessment for Japan
NASA Astrophysics Data System (ADS)
Ashouri, H.; Chinnayakanahalli, K.; Chowdhary, H.; Sen Gupta, A.
2016-12-01
Flooding has been the most frequent natural disaster, claiming lives and imposing significant economic losses on human societies worldwide. Japan, with an annual rainfall of up to approximately 4000 mm, is extremely vulnerable to flooding. The focus of this research is to develop a macroscale hydrologic model for simulating flooding, toward an improved understanding and assessment of flood risk across Japan. The framework employs a conceptual hydrological model, known as the Probability Distributed Model (PDM), as well as the Muskingum-Cunge flood routing procedure, for simulating streamflow. In addition, a Temperature-Index model is incorporated to account for snowmelt and its contribution to streamflow. For an efficient calibration of the model, in terms of computational time and convergence of the parameters, a set of a priori parameters is obtained from the relationships between the model parameters and the physical properties of watersheds. In this regard, we have implemented a particle tracking algorithm and a statistical model that use high-resolution Digital Terrain Models to estimate time-related parameters of the model, such as the time to peak of the unit hydrograph. In addition, global soil moisture and depth data are used to generate an a priori estimate of maximum soil moisture capacity, an important parameter of the PDM. Once the model is calibrated, its performance is examined for Typhoon Nabi, which struck Japan in September 2005 and caused severe flooding throughout the country. The model is also validated against the extreme precipitation event that affected Kyushu in 2012. In both cases, quantitative measures show that the simulated streamflow agrees well with gauge-based observations. The model is employed to simulate thousands of possible flood events for all of Japan, forming the basis for a comprehensive flood risk assessment and loss estimation for the flood insurance industry.
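A Temperature-Index (degree-day) snowmelt component of the kind mentioned above can be sketched in a few lines. The degree-day factor and threshold temperatures below are hypothetical calibration parameters for illustration, not values from the study:

```python
def degree_day_snowmelt(temps, precip, ddf=3.0, t_base=0.0, t_snow=0.0):
    """Temperature-Index (degree-day) snowmelt sketch.

    temps:  daily mean air temperature [deg C]
    precip: daily precipitation [mm]; accumulates as snow when T <= t_snow
    ddf:    degree-day factor [mm / (deg C * day)], a calibration parameter
    Returns the daily melt series [mm] contributing to streamflow.
    """
    swe = 0.0          # snow water equivalent of the pack [mm]
    melt_series = []
    for t, p in zip(temps, precip):
        if t <= t_snow:
            swe += p                          # precipitation falls as snow
        potential = ddf * max(t - t_base, 0.0)  # melt scales with degree-days
        melt = min(potential, swe)            # cannot melt more than the pack
        swe -= melt
        melt_series.append(melt)
    return melt_series
```

For two cold snowy days followed by a warm spell, `degree_day_snowmelt([-5, -5, 2, 10], [10, 10, 0, 0])` returns `[0.0, 0.0, 6.0, 14.0]`: the 20 mm pack is released over the two warm days, with the second day's melt capped by the remaining snow.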
Howard, Michelle; Day, Andrew G; Bernard, Carrie; Tan, Amy; You, John; Klein, Doug; Heyland, Daren K
2018-01-01
Valid and reliable measurement of barriers to advance care planning (ACP) in health care settings can inform the design of robust interventions. This article describes the development and psychometric evaluation of an instrument to measure the presence and magnitude of perceived barriers to ACP discussion with patients from the perspective of family physicians. A questionnaire was designed through literature review and expert input, asking family physicians to rate the importance of barriers (0 = not at all a barrier and 6 = an extreme amount) to ACP discussions with patients, and administered to 117 physicians. Floor effects and missing data patterns were examined. Item-by-item correlations were examined using Pearson correlation. Exploratory factor analysis was conducted (iterated principal factor analysis with oblique rotation), internal consistency (Cronbach's alpha) overall and within factors was calculated, and construct validity was evaluated by calculating three correlations with related questions that were specified a priori. The questionnaire included 31 questions in three domains relating to the clinician, patient/family and system or external factors. No items were removed due to missing data, floor effects, or high correlation with another item. A solution of three factors accounted for 71% of variance. One item was removed because it did not load strongly on any factor. All other items except one remained in the original domain in the questionnaire. Cronbach's alpha for the three factors ranged from 0.84 to 0.90. Two of three a priori correlations with related questions were statistically significant. This questionnaire to assess barriers to ACP discussion from the perspective of family physicians demonstrates preliminary evidence of reliability and validity. Copyright © 2017 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.
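Cronbach's alpha, the internal-consistency statistic reported above, can be computed directly from item-level variances. A minimal pure-Python sketch of the standard formula (the data shape is hypothetical, not the study's dataset):

```python
def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / var(total)).

    items: k lists of scores, one per questionnaire item, each of length n
    (one score per respondent).  Population variances are used.
    """
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # total score for each respondent across all items
    totals = [sum(item[r] for item in items) for r in range(n)]
    return (k / (k - 1)) * (1 - sum(var(it) for it in items) / var(totals))
```

Perfectly parallel items yield alpha = 1, while uncorrelated or opposed items drive alpha toward or below 0; values in the 0.84 to 0.90 range reported above indicate strong internal consistency within each factor.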
Critical appraisal of emergency medicine education research: the best publications of 2012.
Lin, Michelle; Fisher, Jonathan; Coates, Wendy C; Farrell, Susan E; Shayne, Philip; Maggio, Lauren; Kuhn, Gloria
2014-03-01
The objective was to critically appraise and highlight medical education research published in 2012 that was methodologically superior and whose outcomes were pertinent to teaching and education in emergency medicine (EM). A search of the English language literature in 2012 querying Education Resources Information Center (ERIC), PsycINFO, PubMed, and Scopus identified EM studies using hypothesis-testing or observational investigations of educational interventions. Two reviewers independently screened all of the publications and removed articles using established exclusion criteria. This year, publications limited to a single-site survey design that measured satisfaction or self-assessment on unvalidated instruments were not formally reviewed. Six reviewers then independently ranked all remaining publications using one of two scoring systems depending on whether the study methodology was primarily qualitative or quantitative. Each scoring system had nine criteria, including four related to methodology, that were chosen a priori to standardize evaluation by reviewers. The quantitative study scoring system was used previously to appraise medical education published annually in 2008 through 2011, while a separate, new qualitative study scoring system was derived and implemented consisting of parallel metrics. Forty-eight medical education research papers met the a priori criteria for inclusion, and 33 (30 quantitative and three qualitative studies) were reviewed. Seven quantitative and two qualitative studies met the criteria for inclusion as exemplary and are summarized in this article. This critical appraisal series aims to promote superior education research by reviewing and highlighting nine of the 48 major education research studies with relevance to EM published in 2012. Current trends and common methodologic pitfalls in the 2012 papers are noted. © 2014 by the Society for Academic Emergency Medicine.
Shaw, Rachel L; Holland, Carol; Pattison, Helen M; Cooke, Richard
2016-05-01
This review provides a worked example of 'best fit' framework synthesis using the Theoretical Domains Framework (TDF) of health psychology theories as an a priori framework in the synthesis of qualitative evidence. Framework synthesis works best with 'policy urgent' questions. The review question selected was: what are patients' experiences of prevention programmes for cardiovascular disease (CVD) and diabetes? The significance of these conditions is clear: CVD claims more deaths worldwide than any other; diabetes is a risk factor for CVD and leading cause of death. A systematic review and framework synthesis were conducted. This novel method for synthesizing qualitative evidence aims to make health psychology theory accessible to implementation science and advance the application of qualitative research findings in evidence-based healthcare. Findings from 14 original studies were coded deductively into the TDF and subsequently an inductive thematic analysis was conducted. Synthesized findings produced six themes relating to: knowledge, beliefs, cues to (in)action, social influences, role and identity, and context. A conceptual model was generated illustrating combinations of factors that produce cues to (in)action. This model demonstrated interrelationships between individual (beliefs and knowledge) and societal (social influences, role and identity, context) factors. Several intervention points were highlighted where factors could be manipulated to produce favourable cues to action. However, a lack of transparency of behavioural components of published interventions needs to be corrected and further evaluations of acceptability in relation to patient experience are required. Further work is needed to test the comprehensiveness of the TDF as an a priori framework for 'policy urgent' questions using 'best fit' framework synthesis. Copyright © 2016 Elsevier Ltd. All rights reserved.
A New Linearized Crank-Nicolson Mixed Element Scheme for the Extended Fisher-Kolmogorov Equation
Wang, Jinfeng; Li, Hong; He, Siriguleng; Gao, Wei; Liu, Yang
2013-01-01
We present a new mixed finite element method for solving the extended Fisher-Kolmogorov (EFK) equation. We first decompose the EFK equation into two second-order equations, then treat one second-order equation with a standard finite element method and handle the other with a new mixed finite element method. In the new mixed method, the gradient ∇u belongs to the weaker (L²(Ω))² space, taking the place of the classical H(div; Ω) space. We prove a priori bounds for the solution of the semidiscrete scheme and derive a fully discrete mixed scheme based on a linearized Crank-Nicolson method. We also obtain optimal a priori error estimates in the L² and H¹ norms for both the scalar unknown u and the diffusion term w = -Δu, and a priori error estimates in the (L²)² norm for the gradient χ = ∇u, for both the semidiscrete and fully discrete schemes. PMID:23864831
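Assuming the standard form of the EFK equation (fourth-order term with coefficient γ > 0 and nonlinearity u³ - u, which matches the quantities u and w = -Δu named above), the splitting into two second-order equations reads:

```latex
% EFK equation in its standard form
u_t + \gamma \Delta^2 u - \Delta u + u^3 - u = 0
% introduce the auxiliary variable w = -\Delta u, so that \Delta^2 u = -\Delta w:
\begin{cases}
  w + \Delta u = 0, \\
  u_t - \gamma \Delta w + w + u^3 - u = 0.
\end{cases}
```

The mixed weak formulation then seeks the gradient χ = ∇u in (L²(Ω))² rather than requiring the stronger H(div; Ω) regularity, which is the relaxation the abstract emphasizes.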
The mere exposure effect depends on an odor's initial pleasantness.
Delplanque, Sylvain; Coppin, Géraldine; Bloesch, Laurène; Cayeux, Isabelle; Sander, David
2015-01-01
The mere exposure phenomenon refers to improvement of one's attitude toward an a priori neutral stimulus after its repeated exposure. The extent to which such a phenomenon influences evaluation of a priori emotional stimuli remains under-investigated. Here we investigated this question by presenting participants with different odors varying in a priori pleasantness during different sessions spaced over time. Participants were requested to report each odor's pleasantness, intensity, and familiarity. As expected, participants became more familiar with all stimuli after the repetition procedure. However, while neutral and mildly pleasant odors showed an increase in pleasantness ratings, unpleasant and very pleasant odors remained unaffected. Correlational analyses revealed an inverse U-shape between the magnitude of the mere exposure effect and the initial pleasantness of the odor. Consequently, the initial pleasantness of the stimuli appears to modulate the impact of repeated exposures on an individual's attitude. These data underline the limits of mere exposure effect and are discussed in light of the biological relevance of odors for individual survival.
NASA Astrophysics Data System (ADS)
Laughner, J.; Cohen, R. C.
2017-12-01
Recent work has identified a number of assumptions made in NO2 retrievals that lead to biases in the retrieved NO2 column density. These include the treatment of the surface as an isotropic reflector, the absence of lightning NO2 in high resolution a priori profiles, and the use of monthly averaged a priori profiles. We present a new release of the Berkeley High Resolution (BEHR) OMI NO2 retrieval based on the new NASA Standard Product (version 3) that addresses these assumptions by accounting for surface anisotropy with a BRDF albedo product, using an updated method of regridding NO2 data, and using revised NO2 a priori profiles that better account for lightning NO2 and daily variation in the profile shape. We quantify the effect these changes have on the retrieved NO2 column densities and the resultant impact these updates have on constraints of urban NOx emissions for select cities throughout the United States.
An introduction to implementation science for the non-specialist.
Bauer, Mark S; Damschroder, Laura; Hagedorn, Hildi; Smith, Jeffrey; Kilbourne, Amy M
2015-09-16
The movement of evidence-based practices (EBPs) into routine clinical usage is not spontaneous, but requires focused efforts. The field of implementation science has developed to facilitate the spread of EBPs, including both psychosocial and medical interventions for mental and physical health concerns. The authors aim to introduce implementation science principles to non-specialist investigators, administrators, and policymakers seeking to become familiar with this emerging field. This introduction is based on published literature and the authors' experience as researchers in the field, as well as extensive service as implementation science grant reviewers. Implementation science is "the scientific study of methods to promote the systematic uptake of research findings and other EBPs into routine practice, and, hence, to improve the quality and effectiveness of health services." Implementation science is distinct from, but shares characteristics with, both quality improvement and dissemination methods. Implementation studies can either assess naturalistic variability or measure change in response to a planned intervention. Implementation studies typically employ mixed quantitative-qualitative designs, identifying factors that impact uptake across multiple levels, including patient, provider, clinic, facility, organization, and often the broader community and policy environment. Accordingly, implementation science requires a solid grounding in theory and the involvement of trans-disciplinary research teams. The business case for implementation science is clear: As healthcare systems work under increasingly dynamic and resource-constrained conditions, evidence-based strategies are essential in order to ensure that research investments maximize healthcare value and improve public health. Implementation science plays a critical role in supporting these efforts.
Incorporating Implementation of ITS Technologies into Local Planning Processes
DOT National Transportation Integrated Search
1998-09-16
In this paper, the author discusses the barriers to implementing intelligent transportation systems (ITS) at the local level. Chief among these is the need to educate the public. At the same time, this educational effort must be tailored for...
Reporting and methodological quality of meta-analyses in urological literature.
Xia, Leilei; Xu, Jing; Guzzo, Thomas J
2017-01-01
To assess the overall quality of published urological meta-analyses and identify factors predictive of high quality. We systematically searched PubMed to identify meta-analyses published from January 1st, 2011 to December 31st, 2015 in 10 predetermined major paper-based urology journals. The characteristics of the included meta-analyses were collected, and their reporting and methodological qualities were assessed with the PRISMA checklist (27 items) and the AMSTAR tool (11 items), respectively. Descriptive statistics were used for individual items as a measure of overall compliance, and PRISMA and AMSTAR scores were calculated as the sum of adequately reported domains. Logistic regression was used to identify factors predictive of high quality. A total of 183 meta-analyses were included. The mean PRISMA and AMSTAR scores were 22.74 ± 2.04 and 7.57 ± 1.41, respectively. PRISMA item 5 (protocol and registration), items 15 and 22 (risk of bias across studies), and items 16 and 23 (additional analysis) had less than 50% adherence. AMSTAR item 1 ("a priori" design), item 5 (list of studies), and item 10 (publication bias) had less than 50% adherence. Logistic regression analyses showed that funding support and "a priori" design were associated with superior reporting quality, while following the PRISMA guideline and "a priori" design were associated with superior methodological quality. The reporting and methodological qualities of recently published meta-analyses in major paper-based urology journals are generally good. Further improvement could be achieved by strictly adhering to the PRISMA guideline and having an "a priori" protocol.
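The predictive analysis described above (logistic regression of a high-quality indicator on study characteristics) can be sketched as follows. The data rows, feature names, and labels below are hypothetical, purely to illustrate the method, not the paper's actual dataset:

```python
import math

def predict(w, xi):
    """P(high quality = 1 | features) under the logistic model."""
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Logistic regression fit by stochastic gradient ascent on the
    log-likelihood; w[0] is the intercept."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            err = yi - predict(w, xi)      # gradient of the log-likelihood
            w[0] += lr * err
            for j, xj in enumerate(xi):
                w[j + 1] += lr * err * xj
    return w

# Hypothetical rows: (funding support, "a priori" design) -> 1 if the
# meta-analysis scored in the top quality tier, 0 otherwise.
X = [(1, 1), (1, 0), (0, 1), (0, 0)] * 10
y = [1, 1, 1, 0] * 10
w = fit_logistic(X, y)
```

Positive fitted coefficients (`w[1]`, `w[2]` > 0) would correspond to the paper's finding that funding support and "a priori" design predict higher quality; in practice one would also report odds ratios and confidence intervals.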
Kant and the Conservation of Matter
NASA Astrophysics Data System (ADS)
Morris, Joel
This dissertation is an examination of Kant's rather notorious claim that natural science, or physics, has a priori principles, understood as the claim that physics is constrained by rules warranted by the essential nature of thought. The overall direction of this study is towards examining Kant's claim by close study of a particular principle of physics, the principle of the conservation of matter. If indeed this is a principle of physics, and Kant can successfully show that it is a priori, then it will be reasonable to conclude, in company with Kant, that physics has a priori principles. Although Kant's proof of the principle of the conservation of matter has been traditionally regarded as a reasonably straightforward consequence of his First Analogy of Experience, a careful reading of his proof reveals that this is not really the case. Rather, Kant's proof of the conservation of matter is a consequence of (i) his schematisation of the category of substance in terms of permanence, and (ii) his identification of matter as substance, by appeal to what he thinks is the empirical criterion of substance, activity. Careful examination of Kant's argument in defence of the principle of the conservation of matter, however, reveals a number of deficiencies, and it is concluded that Kant cannot be said to have satisfactorily demonstrated the principle of the conservation of matter or to have convincingly illustrated his claim that physics has a priori principles by appeal to this instance.
Mantziari, Styliani; Allemann, Pierre; Winiker, Michael; Sempoux, Christine; Demartines, Nicolas; Schäfer, Markus
2017-09-01
Lymph node (LN) involvement by esophageal cancer is associated with compromised long-term prognosis. This study assessed whether LN downstaging by neoadjuvant treatment (NAT) might offer a survival benefit compared to patients with a priori negative LN. Patients undergoing esophagectomy for cancer between 2005 and 2014 were screened for inclusion. Group 1 included cN0 patients confirmed as pN0 who were treated with surgery first, whereas group 2 included patients initially cN+ and downstaged to ypN0 after NAT. Survival analysis was performed with the Kaplan-Meier and Cox regression methods. Fifty-seven patients were included in our study, 24 in group 1 and 33 in group 2. Group 2 patients had more locally advanced lesions compared to a priori negative patients, and despite complete LN sterilization by NAT they still had worse long-term survival. Overall 3-year survival was 86.8% for a priori LN negative versus 63.3% for downstaged patients (P = 0.013), while disease-free survival was 79.6% and 57.9%, respectively (P = 0.021). Tumor recurrence was also earlier and more disseminated for the downstaged group. Downstaged LN, despite the systemic effect of NAT, still inherit an increased risk for early tumor recurrence and worse long-term survival compared to a priori negative LN. © 2017 Wiley Periodicals, Inc.
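The Kaplan-Meier estimate behind survival comparisons like the one above can be sketched in a few lines. This is a minimal pure-Python version of the product-limit estimator for illustration, not the authors' statistical software:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit survival estimator.

    times:  follow-up time for each patient
    events: 1 if the event (death/recurrence) occurred, 0 if censored
    Returns a step curve [(t, S(t))] at each distinct event time,
    where S drops by the factor (1 - d_t / n_t).
    """
    s = 1.0
    curve = []
    for t in sorted({t for t, e in zip(times, events) if e == 1}):
        n_t = sum(1 for tt in times if tt >= t)   # patients still at risk at t
        d_t = sum(1 for tt, e in zip(times, events) if tt == t and e == 1)
        s *= 1.0 - d_t / n_t
        curve.append((t, s))
    return curve
```

Censored patients (e.g. still alive at last follow-up) simply leave the risk set without forcing a drop in S(t). Figures like the 3-year survival of 86.8% versus 63.3% are read off such curves at t = 36 months, and the group difference is then tested with log-rank or Cox regression methods as in the study.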
Old and New Ideas about School Desegregation.
ERIC Educational Resources Information Center
Willie, Charles V.
This paper presents a set of new ideas which support the value of school desegregation in the United States. First, school desegregation efforts have been almost universally effective, no matter what implementation strategy was employed. Even busing, although widely criticized, has been effective. Second, desegregation efforts have usually…
Lowrey, Kerri McGowan; Morain, Stephanie R
2014-01-01
While provisions of youth sports concussion laws are very similar, little is known as to how they are being implemented, factors that promote or impede implementation, or the level of compliance in each jurisdiction. We aimed to describe state experiences with implementation in order to inform ongoing efforts to reduce the harm of sports-related traumatic brain injury and to guide future evaluations of the laws' impacts and the development of future public health laws. We conducted key-informant interviews in 35 states with recently enacted concussion legislation. States varied considerably in their readiness and capacity for implementation. Factors facilitating implementation included existing partnerships, procedures, and resources; centralized implementation authority; prior related efforts; and involvement in the policymaking process by those now charged with implementation. Inhibitors included ambiguous statutory language, unclear delegation of authority, and compliance difficulties. Ongoing challenges persist, including primary prevention; determining which providers are qualified to make return-to-play assessments and contents of those assessments; compliance difficulties in rural and under-served areas; and unclear responsibility for enforcement. Despite the similarity of youth sports concussion laws, early evidence suggests there is considerable variation in their implementation. These findings are critical for ongoing empirical investigations to accurately evaluate the laws' provisions and to identify successful legal approaches to protecting young athletes. © 2014 American Society of Law, Medicine & Ethics, Inc.
Walker, Daniel M; Hefner, Jennifer L; Sova, Lindsey N; Hilligoss, Brian; Song, Paula H; McAlearney, Ann Scheck
Accountable care organizations (ACOs) are emerging across the healthcare marketplace and now include Medicare, Medicaid, and private sector payers covering more than 24 million lives. However, little is known about the process of organizational change required to achieve cost savings and quality improvements from the ACO model. This study applies the complex innovation implementation framework to understand the challenges and facilitators associated with the ACO implementation process. We conducted four case studies of private sector ACOs, selected to achieve variation in terms of geography and organizational maturity. Across sites, we used semistructured interviews with 68 key informants to elicit information regarding ACO implementation. Our analysis found challenges and facilitators across all domains in the conceptual framework. Notably, our findings deviated from the framework in two ways. First, findings from the financial resource availability domain revealed both financial and nonfinancial (i.e., labor) resources that contributed to implementation effectiveness. Second, a new domain, patient engagement, emerged as an important factor in implementation effectiveness. We present these deviations in an adapted framework. As the ACO model proliferates, these findings can support implementation efforts, and they highlight the importance of focusing on patients throughout the process. Importantly, this study extends the complex innovation implementation framework to incorporate consumers into the implementation framework, making it more patient centered and aiding future efforts.
Saluja, Saurabh; Silverstein, Allison; Mukhopadhyay, Swagoto; Lin, Yihan; Raykar, Nakul; Keshavjee, Salmaan; Samad, Lubna; Meara, John G
2017-01-01
The Lancet Commission on Global Surgery defined six surgical indicators and a framework for a national surgical plan that aimed to incorporate surgical care as a part of global public health. Multiple countries have since begun national surgical planning; each faces unique challenges in doing so. Implementation science can be used to more systematically explain this heterogeneous process, guide implementation efforts and ultimately evaluate progress. We describe our intervention using the Consolidated Framework for Implementation Research. This framework requires identifying characteristics of the intervention, the individuals involved, the inner and outer setting of the intervention, and finally describing implementation processes. By hosting a consultative symposium with clinicians and policy makers from around the world, we are able to specify key aspects of each element of this framework. We define our intervention as the incorporation of surgical care into public health planning, identify local champions as the key individuals involved, and describe elements of the inner and outer settings. Ultimately we describe top-down and bottom-up models that are distinct implementation processes. With the Consolidated Framework for Implementation Research, we are able to identify specific strategic models that can be used by implementers in various settings. While the integration of surgical care into public health throughout the world may seem like an insurmountable challenge, this work adds to a growing effort that seeks to find a way forward. PMID:29225930
Kerrissey, Michaela; Satterstrom, Patricia; Leydon, Nicholas; Schiff, Gordon; Singer, Sara
How some organizations improve while others remain stagnant is a key question in health care research. Studies identifying how organizations can implement improvement despite barriers are needed, particularly in primary care. This inductive qualitative study examines primary care clinics implementing improvement efforts in order to identify mechanisms that enable implementation despite common barriers, such as lack of time and fragmentation across stakeholder groups. Using an embedded multiple case study design, we leverage a longitudinal data set of field notes, meeting minutes, and interviews from 16 primary care clinics implementing improvement over 15 months. We segment clinics into those that implemented more versus those that implemented less, comparing similarities and differences. We identify interpersonal mechanisms promoting implementation, develop a conceptual model of our key findings, and test the relationship with performance using patient surveys conducted pre-/post-implementation. Nine clinics implemented more successfully over the study period, whereas seven implemented less. Successfully implementing clinics exhibited the managerial practice of integrating, which we define as achieving unity of effort among stakeholder groups in the pursuit of a shared and mutually developed goal. We theorize that integrating is critical in improvement implementation because of the fragmentation observed in health care settings, and we extend theory about clinic managers' role in implementation. We identify four integrating mechanisms that clinic managers enacted: engaging groups, bridging communication, sensemaking, and negotiating. The mean patient survey results for integrating clinics improved by 0.07 units over time, whereas the other clinics' survey scores declined by 0.08 units on a scale of 5 (p = .02). 
Our research explores an understudied element of how clinics can implement improvement despite barriers: integrating stakeholders within and outside the clinic into the process. It provides clinic managers with an actionable path for implementing improvement.
Scaling participation in payments for ecosystem services programs
Donlan, C. Josh; Boyle, Kevin J.; Xu, Weibin; Gelcich, Stefan
2018-01-01
Payments for ecosystem services programs have become common tools but most have failed to achieve wide-ranging conservation outcomes. The capacity for scale and impact increases when PES programs are designed through the lens of the potential participants, yet this has received little attention in research or practice. Our work with small-scale marine fisheries integrates the social science of PES programs and provides a framework for designing programs that focus a priori on scaling. In addition to payments, desirable non-monetary program attributes and ecological feedbacks attract a wider range of potential participants into PES programs, including those who have more negative attitudes and lower trust. Designing programs that draw individuals into participating in PES programs is likely the most strategic path to reaching scale. Research should engage in new models of participatory research to understand these dynamics and to design programs that explicitly integrate a broad range of needs, values, and modes of implementation. PMID:29522554
The ethical implications and religious significance of organ transplantation payment systems.
Smith, Hunter Jackson
2016-03-01
One of the more polarizing policies proposed to alleviate the organ shortage is financial payment of donors in return for organs. A priori and empirical investigation concludes that such systems are ethically inadequate. A new methodological approach towards policy formation and implementation is proposed which places ethical concerns at its core. From a hypothetical secular origin, the optimal ethical policy structure concerning organ donation is derived. However, when applied universally, it does not yield ideal results for every culture and society due to region-specific variation. Since religion holds significant influence in the organ donation debate, three religions (Catholicism, Islam, and Shinto) were examined in order to illustrate this variation. Although secular ethical concerns should rest at the core of policy construction, certain region-specific contexts require cultural and religious competence and necessitate the adjustment of the optimal template policy accordingly to yield the best moral and practical results.
An effective automatic procedure for testing parameter identifiability of HIV/AIDS models.
Saccomani, Maria Pia
2011-08-01
Realistic HIV models tend to be rather complex and many recent models proposed in the literature could not yet be analyzed by traditional identifiability testing techniques. In this paper, we check a priori global identifiability of some of these nonlinear HIV models taken from the recent literature, by using a differential algebra algorithm based on previous work of the author. The algorithm is implemented in a software tool, called DAISY (Differential Algebra for Identifiability of SYstems), which has been recently released (DAISY is freely available on the web site http://www.dei.unipd.it/~pia/ ). The software can be used to automatically check global identifiability of (linear and) nonlinear models described by polynomial or rational differential equations, thus providing a general and reliable tool to test global identifiability of several HIV models proposed in the literature. It can be used by researchers with a minimum of mathematical background.
[Serious events: from statutory requirements to the implementation].
Aullen, J-P; Lassale, B; Verdot, J-J
2008-11-01
Since 2005, the PACA region has run a think-tank group on a priori risk in the transfusion chain. Its work has made it possible to map each step of the elementary process and to evaluate the frequency, seriousness, and criticality of errors. Blood sampling and conformity checks are the most critical points, and both depend on identity vigilance. In September 2007, the southern blood bank of France established 12 nonconformity levels for blood samples and now sends a listing of nonconformities every month. This listing enables the executive staff to identify the errors and, therefore, to correct them. Regional notifications from 2007 to 2008 confirm the analysis of the think-tank team, and we were thus able to list the most serious cases. Public and private hospitals must notify serious events and will be required to evaluate professional practices. These requirements will be taken into account in the regional medical contract.
An omnibus test for the global null hypothesis.
Futschik, Andreas; Taus, Thomas; Zehetmayer, Sonja
2018-01-01
Global hypothesis tests are a useful tool in the context of clinical trials, genetic studies, or meta-analyses, when researchers are not interested in testing individual hypotheses, but in testing whether none of the hypotheses is false. There are several ways to test the global null hypothesis when the individual null hypotheses are independent. If it is assumed that many of the individual null hypotheses are false, combination tests have been recommended to maximize power. If, however, it is assumed that only one or a few null hypotheses are false, global tests based on individual test statistics are more powerful (e.g. the Bonferroni or Simes test). However, there is usually no a priori knowledge of the number of false individual null hypotheses. We therefore propose an omnibus test based on cumulative sums of the transformed p-values. We show that this test yields an impressive overall performance. The proposed method is implemented in an R package called omnibus.
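The authors' cumulative-sum statistic lives in their R package omnibus and is not reproduced here. As a hedged illustration of the trade-off the abstract describes, the following Python sketch contrasts Fisher's combination test (powerful when many nulls are false) with a Bonferroni-style global test on the minimum p-value (powerful when only a few are false); the data are synthetic.

```python
import numpy as np
from scipy import stats

def fisher_global(pvals):
    """Fisher's combination test: sums evidence over all hypotheses,
    powerful when many individual nulls are false."""
    stat = -2.0 * np.sum(np.log(pvals))
    return stats.chi2.sf(stat, df=2 * len(pvals))

def bonferroni_global(pvals):
    """Global test on the minimum p-value: powerful when only one
    or a few individual nulls are false."""
    return min(1.0, len(pvals) * np.min(pvals))

# One strong signal among 20 hypotheses: the min-p test detects it easily.
p_sparse = np.array([1e-6] + [0.5] * 19)
print(bonferroni_global(p_sparse))  # 2e-05
print(fisher_global(p_sparse))
```

An omnibus procedure like the one proposed aims to perform well in both regimes without knowing in advance which one holds.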
NASA Astrophysics Data System (ADS)
Del Vescovo, D.; D'Ambrogio, W.
1995-01-01
A frequency domain method is presented to design a closed-loop control for vibration reduction in flexible mechanisms. The procedure is developed on a single-link flexible arm, driven by a one rotary degree of freedom servomotor, although the same technique may be applied to similar systems such as supports for aerospace antennae or solar panels. The method uses the structural frequency response functions (FRFs), thus avoiding system identification, which introduces modeling uncertainties. Two closed loops are implemented: the inner loop uses acceleration feedback with the aim of making the FRF similar to that of an equivalent rigid link; the outer loop feeds back displacements to achieve a fast positioning response and null steady-state error. In both cases, the controller type is established a priori, while the actual characteristics are defined by an optimisation procedure in which the relevant FRF is constrained within prescribed bounds and stability is taken into account.
Neural Classifiers for Learning Higher-Order Correlations
NASA Astrophysics Data System (ADS)
Güler, Marifi
1999-01-01
Studies by various authors suggest that higher-order networks can be more powerful, and are biologically more plausible, than the more traditional multilayer networks. These architectures make explicit use of nonlinear interactions between input variables in the form of higher-order units or product units. If it is known a priori that the problem to be implemented possesses a given set of invariances, as in translation-, rotation-, and scale-invariant pattern recognition problems, those invariances can be encoded, thus eliminating all higher-order terms which are incompatible with them. In general, however, it is a serious setback that the complexity of learning increases exponentially with the size of the inputs. This paper reviews higher-order networks and introduces an implicit representation in which learning complexity is mainly decided by the number of higher-order terms to be learned and increases only linearly with the input size.
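As a small illustration of what a higher-order unit buys (this example is not from the paper), the following sketch augments a two-dimensional input with its pairwise product and shows that a single threshold unit on the augmented features solves XOR, which no first-order threshold unit can; the quadratic growth of the pairwise-product list also hints at the scaling problem the paper targets.

```python
import numpy as np
from itertools import combinations

def second_order_features(x):
    """Augment an input vector with all pairwise products (second-order
    terms). The number of added terms grows quadratically with input size."""
    pairs = [x[i] * x[j] for i, j in combinations(range(len(x)), 2)]
    return np.concatenate([x, pairs])

def higher_order_unit(x, w, b):
    """A single threshold unit acting on the augmented higher-order input."""
    return 1 if w @ second_order_features(x) + b > 0 else 0

# XOR is not linearly separable in (x1, x2) but becomes separable
# once the product term x1*x2 is available to the unit.
w = np.array([1.0, 1.0, -2.0])  # weights for x1, x2, x1*x2
b = -0.5
for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, higher_order_unit(np.array(x, dtype=float), w, b))  # XOR table
```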
NASA Astrophysics Data System (ADS)
Atta Yaseen, Amer; Bayart, Mireille
2017-01-01
In this work, a new approach is introduced as a development of the attack-tolerant scheme in Networked Control Systems (NCS). The objective is to be able to detect an attack such as the Stuxnet case, where the controller is reprogrammed and hijacked. Besides the ability to detect a stealthy controller hijacking attack, the advantage of this approach is that no a priori mathematical model of the controller is needed. To implement the proposed scheme, a specific detector for the controller hijacking attack is designed. The performance of the scheme is evaluated by connecting the detector to an NCS with basic security elements such as the Data Encryption Standard (DES), Message Digest (MD5), and timestamps. The detector is tested along with a networked PI controller under a stealthy hijacking attack. The test results show that the hijacked controller can be reliably detected and recovered.
Generic distortion model for metrology under optical microscopes
NASA Astrophysics Data System (ADS)
Liu, Xingjian; Li, Zhongwei; Zhong, Kai; Chao, YuhJin; Miraldo, Pedro; Shi, Yusheng
2018-04-01
For metrology under optical microscopes, lens distortion is the dominant source of error. Previous distortion models and correction methods mostly rely on parametric distortion models, which require a priori knowledge of the microscope's lens system. However, because of the numerous optical elements in a microscope, distortion can hardly be represented by a simple parametric model. In this paper, a generic distortion model considering both symmetric and asymmetric distortions is developed. The model is obtained by using radial basis functions (RBFs) to interpolate the radius and distortion values of symmetric distortions (image coordinates and distortion rays for asymmetric distortions). An accurate and easy-to-implement distortion correction method is presented. With the proposed approach, quantitative measurement with better accuracy can be achieved, for example in digital image correlation for deformation measurement under an optical microscope. The proposed technique is verified by experiments with both synthetic and real data.
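The paper's full model handles both symmetric and asymmetric terms; as a minimal sketch of the symmetric (radial) case only, with synthetic calibration data standing in for real measurements, one can interpolate radius-to-distortion samples with SciPy's RBF interpolator:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical calibration data: normalized radii of grid points and the
# radial distortion measured at each radius; a synthetic cubic profile
# stands in for real measurements here.
r_samples = np.linspace(0.0, 1.0, 12)[:, None]
d_samples = (0.05 * r_samples**3 + 0.01 * r_samples).ravel()

# Non-parametric model: RBF interpolation of distortion vs. radius,
# requiring no a priori parametric lens model.
model = RBFInterpolator(r_samples, d_samples, kernel='thin_plate_spline')

def undistort_radius(r):
    """Correct observed radii by subtracting the interpolated distortion."""
    r = np.asarray(r, dtype=float).reshape(-1, 1)
    return r.ravel() - model(r)

print(undistort_radius([0.5]))  # ~0.5 minus the distortion at r = 0.5
```

In the paper's asymmetric case the same idea applies, but the RBFs interpolate over 2-D image coordinates rather than a scalar radius.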
Precise Geolocation Of Persistent Scatterers Aided And Validated By Lidar DSM
NASA Astrophysics Data System (ADS)
Chang, Ling; Dheenathayalan, Prabu; Hanessen, Ramon
2013-12-01
Persistent Scatterers (PS) interferometry results in the deformation history of time-coherent scatterers. Although several applications focus on smooth, spatially correlated signals, we aim for the detection, identification and analysis of single anomalies. These targets can be indicative of, e.g., strain in structures, potentially leading to the failure of such structures. For the identification and analysis it is of the greatest importance to know the exact position of the effective scattering center, to avoid an improper interpretation of the driving mechanism. Here we present an approach to optimize the geolocation of important scatterers, when necessary aided by an a priori Lidar-derived DSM (AHN-1 data) with 15 cm and 5 m resolution in vertical and horizontal directions, respectively. The DSM is also used to validate the geocoding. We implement our approach on a near-collapse event of a shopping mall in Heerlen, the Netherlands, to generate the precise geolocation of local PS points.
Data-Rate Estimation for Autonomous Receiver Operation
NASA Technical Reports Server (NTRS)
Tkacenko, A.; Simon, M. K.
2005-01-01
In this article, we present a series of algorithms for estimating the data rate of a signal whose admissible data rates are integer-base, integer-powered multiples of a known basic data rate. These algorithms can be applied to the Electra radio currently used in the Deep Space Network (DSN), which employs data rates having the above relationship. The estimation is carried out in an autonomous setting in which very little a priori information is assumed. It is done by exploiting an elegant property of the split symbol moments estimator (SSME), which is traditionally used to estimate the signal-to-noise ratio (SNR) of the received signal. By quantizing the assumed symbol-timing error or jitter, we present an all-digital implementation of the SSME which can be used to jointly estimate the data rate, SNR, and jitter. Simulation results show that these joint estimation algorithms perform well, even in the low SNR regions typically encountered in the DSN.
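The article's exact SSME formulation is not reproduced here. The following simplified sketch (synthetic BPSK, hypothetical parameters) illustrates the underlying idea: the correlation between the two half-symbol accumulations peaks when the assumed symbol length is correct, so scanning the admissible candidate rates and maximizing an SSME-style SNR estimate recovers the data rate.

```python
import numpy as np

rng = np.random.default_rng(1)

def ssme_snr(samples, sps):
    """Simplified split-symbol moments SNR estimate: accumulate each assumed
    symbol in two halves and correlate them; mismatched symbol timing
    destroys the half-symbol correlation."""
    n = (len(samples) // sps) * sps
    halves = samples[:n].reshape(-1, 2, sps // 2).sum(axis=2)
    yp, ym = halves[:, 0], halves[:, 1]
    m_p = np.mean(yp * ym)                  # signal-power moment
    m_s = 0.5 * np.mean(yp**2 + ym**2)      # total-power moment
    return m_p / max(m_s - m_p, 1e-12)

# BPSK symbols at a true rate of 8 samples/symbol in unit-variance noise.
true_sps = 8
symbols = rng.choice([-1.0, 1.0], size=4000)
signal = np.repeat(symbols, true_sps) + rng.normal(0.0, 1.0, 4000 * true_sps)

# Candidate rates: integer-base, integer-powered multiples of a basic rate.
candidates = [2, 4, 8, 16, 32]
estimates = {sps: ssme_snr(signal, sps) for sps in candidates}
best = max(estimates, key=estimates.get)
print(best)  # 8
```

The article's joint estimator additionally quantizes the symbol-timing jitter and estimates it alongside rate and SNR; this sketch fixes the timing at zero for clarity.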
NASA Astrophysics Data System (ADS)
Liu, Derong; Huang, Yuzhu; Wang, Ding; Wei, Qinglai
2013-09-01
In this paper, an observer-based optimal control scheme is developed for unknown nonlinear systems using adaptive dynamic programming (ADP) algorithm. First, a neural-network (NN) observer is designed to estimate system states. Then, based on the observed states, a neuro-controller is constructed via ADP method to obtain the optimal control. In this design, two NN structures are used: a three-layer NN is used to construct the observer which can be applied to systems with higher degrees of nonlinearity and without a priori knowledge of system dynamics, and a critic NN is employed to approximate the value function. The optimal control law is computed using the critic NN and the observer NN. Uniform ultimate boundedness of the closed-loop system is guaranteed. The actor, critic, and observer structures are all implemented in real-time, continuously and simultaneously. Finally, simulation results are presented to demonstrate the effectiveness of the proposed control scheme.
[A computerised clinical decision-support system for the management of depression in Primary Care].
Aragonès, Enric; Comín, Eva; Cavero, Myriam; Pérez, Víctor; Molina, Cristina; Palao, Diego
Despite its clinical relevance and its importance as a public health problem, there are major gaps in the management of depression. Evidence-based clinical guidelines are useful to improve processes and clinical outcomes. In order to make their implementation easier these guidelines have been transformed into computerised clinical decision support systems. In this article, a description is presented on the basics and characteristics of a new computerised clinical guideline for the management of major depression, developed in the public health system in Catalonia. This tool helps the clinician to establish reliable and accurate diagnoses of depression, to choose the best treatment a priori according to the disease and the patient characteristics. It also emphasises the importance of systematic monitoring to assess the clinical course, and to adjust therapeutic interventions to the patient's needs at all times. Copyright © 2016 Elsevier España, S.L.U. All rights reserved.
NASA Astrophysics Data System (ADS)
Medgyesi-Mitschang, L. N.; Putnam, J. M.
1980-04-01
A hierarchy of computer programs implementing the method of moments for bodies of translation (MM/BOT) is described. The algorithm treats the far-field radiation and scattering from finite-length open cylinders of arbitrary cross section as well as the near fields and aperture-coupled fields for rectangular apertures on such bodies. The theoretical development underlying the algorithm is described in Volume 1. The structure of the computer algorithm is such that neither a priori knowledge of the method of moments technique nor detailed FORTRAN experience is presupposed for the user. A set of carefully drawn example problems illustrates all the options of the algorithm. For a more detailed understanding of the workings of the codes, special cross-referencing to the equations in Volume 1 is provided. For additional clarity, comment statements are liberally interspersed in the code listings, which are summarized in the present volume.
Behavior and neural basis of near-optimal visual search
Ma, Wei Ji; Navalpakkam, Vidhya; Beck, Jeffrey M; van den Berg, Ronald; Pouget, Alexandre
2013-01-01
The ability to search efficiently for a target in a cluttered environment is one of the most remarkable functions of the nervous system. This task is difficult under natural circumstances, as the reliability of sensory information can vary greatly across space and time and is typically a priori unknown to the observer. In contrast, visual-search experiments commonly use stimuli of equal and known reliability. In a target detection task, we randomly assigned high or low reliability to each item on a trial-by-trial basis. An optimal observer would weight the observations by their trial-to-trial reliability and combine them using a specific nonlinear integration rule. We found that humans were near-optimal, regardless of whether distractors were homogeneous or heterogeneous and whether reliability was manipulated through contrast or shape. We present a neural-network implementation of near-optimal visual search based on probabilistic population coding. The network matched human performance. PMID:21552276
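The paper's specific integration rule is not reproduced here. As a generic Bayesian sketch under assumed Gaussian noise, the log-likelihood ratio for "exactly one target among N items" weights each item's local evidence by its known reliability and combines the items with a log of a mean of exponentials, a nonlinear rule rather than a max over items:

```python
import numpy as np

def optimal_target_present_llr(x, sigma, s_T=1.0, s_D=0.0):
    """Log-likelihood ratio for 'exactly one target among N items' versus
    'all distractors', with per-item noise levels sigma known on each
    trial. Local evidence is precision-weighted (divided by sigma**2);
    combining uses log-mean-exp, marginalizing over target location."""
    d = ((x - s_D) ** 2 - (x - s_T) ** 2) / (2.0 * sigma ** 2)
    return np.log(np.mean(np.exp(d)))

# One trial: four items with mixed reliability; item 2 looks target-like
# and is highly reliable, so the model responds "target present".
x = np.array([0.1, -0.2, 0.9, 0.3])
sigma = np.array([0.3, 1.0, 0.3, 1.0])
print(optimal_target_present_llr(x, sigma) > 0.0)  # True
```

This mirrors the abstract's point: an unreliable item close to the target value contributes far less evidence than a reliable one, and the per-trial sigmas must be known (or inferred) for near-optimal behavior.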
Learning-Based Adaptive Optimal Tracking Control of Strict-Feedback Nonlinear Systems.
Gao, Weinan; Jiang, Zhong-Ping
2018-06-01
This paper proposes a novel data-driven control approach to address the problem of adaptive optimal tracking for a class of nonlinear systems taking the strict-feedback form. Adaptive dynamic programming (ADP) and nonlinear output regulation theories are integrated for the first time to compute an adaptive near-optimal tracker without any a priori knowledge of the system dynamics. Fundamentally different from adaptive optimal stabilization problems, the solution to a Hamilton-Jacobi-Bellman (HJB) equation, not necessarily a positive definite function, cannot be approximated through the existing iterative methods. This paper proposes a novel policy iteration technique for solving positive semidefinite HJB equations with rigorous convergence analysis. A two-phase data-driven learning method is developed and implemented online by ADP. The efficacy of the proposed adaptive optimal tracking control methodology is demonstrated via a Van der Pol oscillator with time-varying exogenous signals.
Commentary: evidence to guide gun violence prevention in America.
Webster, Daniel W
2015-03-18
Gun violence is a major threat to the public's health and safety in the United States. The articles in this volume's symposium on gun violence reveal the scope of the problem and new trends in mortality rates from gunfire. Leading scholars synthesize research evidence that demonstrates the ability of numerous policies and programs, each consistent with lessons learned from successful efforts to combat public health problems, to prevent gun violence. Each approach presents challenges to successful implementation. Future research should inform efforts to assess which approaches are most effective and how to implement evidence-based interventions most effectively.
Cost Benefit and Alternatives Analysis of Distribution Systems with Energy Storage Systems: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, Tom; Nagarajan, Adarsh; Baggu, Murali
This paper explores monetized and non-monetized benefits from storage interconnected to the distribution system through use cases illustrating potential applications for energy storage in California's electric utility system. This work supports SDG&E in its efforts to quantify, summarize, and compare the cost and benefit streams related to the implementation and operation of energy storage on its distribution feeders. This effort develops a cost-benefit and alternatives analysis platform, integrated with QSTS feeder simulation capability, and analyzes use cases to explore the cost-benefit of implementing and operating energy storage for feeder support and market participation.
Electronic Chemotherapy Order Entry: A Major Cancer Center's Implementation
Sklarin, Nancy T.; Granovsky, Svetlana; O'Reilly, Eileen M.; Zelenetz, Andrew D.
2011-01-01
Implementation of a computerized provider order entry system for complex chemotherapy regimens at a large cancer center required intense effort from a multidisciplinary team of clinical and systems experts with experience in all facets of the chemotherapy process. The online tools had to resemble the paper forms used at the time and parallel the successful established process as well as add new functionality. Close collaboration between the institution and the vendor was necessary. This article summarizes the institutional efforts, challenges, and collaborative processes that facilitated universal chemotherapy computerized electronic order entry across multiple sites during a period of several years. PMID:22043182
Open SHMEM Reference Implementation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pritchard, Howard; Curtis, Anthony; Welch, Aaron
2016-05-12
OpenSHMEM is an effort to create a specification for a standardized API for parallel programming in the Partitioned Global Address Space. Along with the specification the project is also creating a reference implementation of the API. This implementation attempts to be portable, to allow it to be deployed in multiple environments, and to be a starting point for implementations targeted to particular hardware platforms. It will also serve as a springboard for future development of the API.
Implementation Fidelity in Community-Based Interventions
Breitenstein, Susan M.; Gross, Deborah; Garvey, Christine; Hill, Carri; Fogg, Louis; Resnick, Barbara
2012-01-01
Implementation fidelity is the degree to which an intervention is delivered as intended and is critical to successful translation of evidence-based interventions into practice. Diminished fidelity may be why interventions that work well in highly controlled trials may fail to yield the same outcomes when applied in real life contexts. The purpose of this paper is to define implementation fidelity and describe its importance for the larger science of implementation, discuss data collection methods and current efforts in measuring implementation fidelity in community-based prevention interventions, and present future research directions for measuring implementation fidelity that will advance implementation science. PMID:20198637
Autonomous interplanetary constellation design
NASA Astrophysics Data System (ADS)
Chow, Cornelius Channing, II
According to NASA's integrated space technology roadmaps, space-based infrastructures are envisioned as necessary ingredients to a sustained effort in continuing space exploration. Whether it be for extra-terrestrial habitats, roving/cargo vehicles, or space tourism, autonomous space networks will provide a vital communications lifeline for both future robotic and human missions alike. Projecting that the Moon will be a bustling hub of activity within a few decades, a near-term opportunity for in-situ infrastructure development is within reach. This dissertation addresses the anticipated need for in-space infrastructure by investigating a general design methodology for autonomous interplanetary constellations; to illustrate the theory, this manuscript presents results from an application to the Earth-Moon neighborhood. The constellation design methodology is formulated as an optimization problem, involving a trajectory design step followed by a spacecraft placement sequence. Modeling the dynamics as a restricted 3-body problem, the investigated design space consists of families of periodic orbits which play host to the constellations, punctuated by arrangements of spacecraft autonomously guided by a navigation strategy called LiAISON (Linked Autonomous Interplanetary Satellite Orbit Navigation). Instead of more traditional exhaustive search methods, a numerical continuation approach is implemented to map the admissible configuration space. In particular, Keller's pseudo-arclength technique is used to follow folding/bifurcating solution manifolds, which are otherwise inaccessible with other parameter continuation schemes. A succinct characterization of the underlying structure of the local, as well as global, extrema is thus achievable with little a priori intuition of the solution space. Furthermore, the proposed design methodology offers benefits in computation speed plus the ability to handle mildly stochastic systems. 
An application of the constellation design methodology to the restricted Earth-Moon system reveals optimal pairwise configurations for various L1, L2, and L5 (halo, axial, and vertical) periodic orbit families. Navigation accuracies on the order of 10^(+/-1) meters in position space are obtained for the optimal Earth-Moon constellations, given measurement noise on the order of 1 meter.
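Keller's pseudo-arclength technique mentioned above can be illustrated on a toy fold (this example is not from the dissertation): the solution curve of x^2 + lam = 1 turns around at lam = 1, where continuation in lam alone fails, but a predictor along the tangent plus a Newton corrector on the arclength-augmented system passes through the fold smoothly.

```python
import numpy as np

def f(x, lam):
    # Toy fold: solutions of x**2 + lam = 1 trace a parabola that turns
    # around at (x, lam) = (0, 1), where continuation in lam alone fails.
    return x**2 + lam - 1.0

def tangent(x, t_prev):
    # Unit null vector of the Jacobian [df/dx, df/dlam] = [2x, 1],
    # oriented to keep traversing the curve in the same direction.
    t = np.array([1.0, -2.0 * x])
    t /= np.linalg.norm(t)
    return t if t @ t_prev >= 0.0 else -t

def continue_curve(x, lam, ds=0.05, steps=80):
    pts = [(x, lam)]
    t = np.array([1.0, 0.0])
    for _ in range(steps):
        t = tangent(x, t)
        xp, lp = x + ds * t[0], lam + ds * t[1]      # tangent predictor
        x, lam = xp, lp
        for _ in range(20):                          # Newton corrector on
            F = np.array([f(x, lam),                 # the augmented system
                          t[0] * (x - xp) + t[1] * (lam - lp)])
            if np.linalg.norm(F) < 1e-12:
                break
            J = np.array([[2.0 * x, 1.0], [t[0], t[1]]])
            step = np.linalg.solve(J, -F)
            x, lam = x + step[0], lam + step[1]
        pts.append((x, lam))
    return np.array(pts)

curve = continue_curve(x=-1.0, lam=0.0)
print(round(curve[:, 1].max(), 3))  # the fold near lam = 1 is rounded smoothly
```

The augmented Jacobian stays nonsingular at the fold (where df/dx vanishes) because the arclength constraint supplies the missing row, which is the essential advantage over natural parameter continuation.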
NASA Astrophysics Data System (ADS)
Ellery, A.
Since the remarkable British Interplanetary Society starship study of the late 1970s - Daedalus - there have been significant developments in the areas of artificial intelligence and robotics. These will be critical technologies for any starship, as indeed they are for the current generation of exploratory spacecraft and in-situ planetary robotic explorers. Although early visions of truly intelligent robots have yet to materialize (reasons for which will be outlined), there are nonetheless revolutionary developments which have attempted to address at least some of these earlier unperceived deficiencies. The current state of the art comprises a number of separate strands of research which provide components of robotic intelligence, though no overarching approach has been forthcoming. The first question to be considered is the level of intelligent functionality required to support a long-duration starship mission. This will, at a minimum, need to be extensive, a requirement imposed by the need for complex reconfigurability and repair. The second question concerns the tools that we have at our disposal to implement the required intelligent functions of the starship. These are based on two very different approaches - good old-fashioned artificial intelligence (GOFAI) based on logical theorem-proving and knowledge encoding, recently augmented by modal, temporal, circumscriptive and fuzzy logics to address the well-known “frame problem”; and the more recent soft computing approaches based on artificial neural networks, evolutionary algorithms and immunity models and their variants to implement learning. The former has some flight heritage through the Remote Agent architecture whilst the latter has yet to be deployed on any space mission.
However, the notion of reconfigurable hardware of recent interest in the space community warrants the use of evolutionary algorithms and neural networks implemented on field programmable gate array technology, blurring the distinction between hardware and software. The primary question in space engineering has traditionally been one of predictability and controllability, which online learning compromises. A further factor to be accounted for is the notion that intelligence is derived primarily from robot-environment interaction, which stresses the sensory and actuation capabilities (exemplified by the behavioural or situated robotics paradigm). One major concern is whether the major deficiency of current methods, lack of scalability, can be overcome using a highly distributed approach rather than the hierarchical approach suggested by the NASREM architecture. It is contended here that a mixed solution will be required, in which a priori programming is augmented by a posteriori learning, resembling the biological distinction between fixed, genetically inherited behaviour and learned, neurally implemented behaviour in animals. In particular, a biomimetic approach is proffered which exploits the neural processes and architecture of the human brain through the use of forward models, attempting to marry the conflicting requirements of learning and predictability. Some small-scale efforts in this direction will be outlined.
Recruitment for a Diabetes Prevention Program translation effort in a worksite setting.
Taradash, J; Kramer, M; Molenaar, D; Arena, V; Vanderwood, K; Kriska, Andrea M
2015-03-01
The success of the Diabetes Prevention Program (DPP) lifestyle intervention has led to community-based translation efforts in a variety of settings. One community setting which holds promise for the delivery of prevention intervention is the worksite; however, information regarding recruitment in this setting is limited. The current effort describes the initial processes surrounding provision of an adapted DPP lifestyle intervention at a corporate worksite. Investigators and key management at the worksite collaborated to develop and implement a recruitment plan for the intervention focusing on 1) in-person onsite activities and 2) implementation of a variety of media recruitment tools and methods. Adult, non-diabetic overweight/obese employees and family members with pre-diabetes and/or the metabolic syndrome were eligible for the study. Telephone pre-screening was completed for 176 individuals resulting in 171 eligible for onsite screening. Of that number, 160 completed onsite screening, 107 met eligibility criteria, and 89 enrolled in the study. Support from worksite leadership, an invested worksite planning team and a solid recruitment plan consisting of multiple strategies were identified as crucial elements of this effective workplace recruitment effort. A worksite team successfully developed and implemented a recruitment plan using existing mechanisms appropriate to that worksite in order to identify and enroll eligible individuals. The results of this effort indicate that employee recruitment in a worksite setting is feasible as the first step in offering onsite behavioral lifestyle intervention programs as part of a widespread dissemination plan to prevent diabetes and lower risk for cardiovascular disease. Copyright © 2015 Elsevier Inc. All rights reserved.
James, Pam; Bebee, Patty; Beekman, Linda; Browning, David; Innes, Mathew; Kain, Jeannie; Royce-Westcott, Theresa; Waldinger, Marcy
2011-11-01
Quantifying data management and regulatory workload for clinical research is a difficult task that would benefit from a robust tool to assess and allocate effort. As in most clinical research environments, The University of Michigan Comprehensive Cancer Center (UMCCC) Clinical Trials Office (CTO) struggled to effectively allocate data management and regulatory time with frequently inaccurate estimates of how much time was required to complete the specific tasks performed by each role. In a dynamic clinical research environment in which volume and intensity of work ebbs and flows, determining requisite effort to meet study objectives was challenging. In addition, a data-driven understanding of how much staff time was required to complete a clinical trial was desired to ensure accurate trial budget development and effective cost recovery. Accordingly, the UMCCC CTO developed and implemented a Web-based effort-tracking application with the goal of determining the true costs of data management and regulatory staff effort in clinical trials. This tool was developed, implemented, and refined over a 3-year period. This article describes the process improvement and subsequent leveling of workload within data management and regulatory that enhanced the efficiency of UMCCC's clinical trials operation.
ERIC Educational Resources Information Center
Fernando, Sheara
2010-01-01
The success of an implementation effort depends on the ability for a system to utilize the innovation effectively; the effective usage of an innovation can be determined by monitoring for program integrity and fidelity, and assessing the degree to which the program implementation matches the intended plan (Fixsen, Blase, Horner, & Sugai 2007). The…
ERIC Educational Resources Information Center
Nariman, Nahid; Chrispeels, Janet
2016-01-01
We explore teachers' efforts to implement problem-based learning (PBL) in an elementary school serving predominantly English learners. Teachers had an opportunity to implement the Next Generation Science Standards (NGSS) using PBL in a summer school setting with no test-pressures. To understand the challenges and benefits of PBL implementation, a…
German Language and Culture: 9-Year Program Guide to Implementation, Grades 4-5-6
ERIC Educational Resources Information Center
Alberta Education, 2008
2008-01-01
This implementation guide is intended to support the Grade 4 to Grade 6 portion of the German Language and Culture Nine-year Program (the program of studies). It was developed primarily for teachers, yet it includes information that may be useful for administrators and other stakeholders in their efforts to plan for and implement the new German…
ERIC Educational Resources Information Center
Bustami, Yakobus; Corebima, Aloysius Duran; Suarsini, Endang; Ibrohim
2017-01-01
The empowerment of social attitudes in higher education is indispensable. The aim of this research was to uncover the effect of the empowerment efforts on the social attitudes of multiethnic biology students through the implementation of JiRQA learning strategy. This research was a quasi experimental of 2 x 3 factorial design implemented on the…
Punjabi Language and Culture: 9-Year Program Guide to Implementation, Grades 4-5-6
ERIC Educational Resources Information Center
Alberta Education, 2008
2008-01-01
This implementation guide is intended to support the Grade 4 to Grade 6 portion of the Punjabi Language and Culture Nine-Year Program (the program of studies). It was developed primarily for teachers, yet it includes information that may be useful for administrators and other stakeholders in their efforts to plan for and implement the new Punjabi…
ERIC Educational Resources Information Center
Pinkelman, Sarah E.; Horner, Robert H.
2017-01-01
The success of function-based interventions depends not just on the quality of procedures but also on the extent to which procedures are implemented as planned. Too often in schools, effort is committed to functional assessment and behavior support plan design, only to be followed by weak implementation. This study used a multiple baseline across…
FCTC followed by accelerated implementation of tobacco advertising bans.
Hiilamo, Heikki; Glantz, Stanton
2017-07-01
We sought to evaluate changes in countries' enactment of advertising bans after ratification of the WHO Framework Convention on Tobacco Control (FCTC). We compared adoption of advertising bans in five areas (TV and radio, print media, billboards, point-of-sale, sponsorship) in countries that did versus did not ratify the FCTC, accounting for years since ratification of the Convention. On average, passage of complete advertising bans accelerated after FCTC ratification. The development was strongest among lower-middle-income countries. Lack of state capacity was associated with lower likelihood of countries implementing complete advertising bans. Implementation of complete advertising bans slowed after 2007. Implementation of FCTC Article 13 was followed by increased progress towards complete advertising bans, but progress is incomplete, especially among low-income countries. Low-income countries need comprehensive support to implement the FCTC as part of a broad effort to reinvigorate progress on its global implementation. Enforcing complete bans requires constant monitoring and countering of tobacco industry efforts to circumvent them.
Water Quality Criteria for Human Health and Aquatic Life
Collaborative effort with the Office of Water to provide science in support of the development and implementation of new or revised ambient water quality criteria for microbial and chemical contaminants for human health and aquatic life. The research also addresses implementation...
DOT National Transportation Integrated Search
2015-09-01
This study provides resistance factors (φ) for design of deep foundations to implement Load and Resistance Factor Design (LRFD) for bridge foundations using Texas Cone Penetrometer (TCP) Test data. Initial efforts were made to determine resistance fa...
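The LRFD framework this study supports reduces to verifying that factored resistance meets factored load demand, φ·Rn ≥ Σ γi·Qi. A minimal sketch of that check (the φ value, load factors, and magnitudes below are illustrative placeholders, not values from the study):

```python
def lrfd_check(nominal_resistance, resistance_factor, factored_loads):
    """Return True if phi * Rn >= sum(gamma_i * Q_i).

    factored_loads: list of (load_factor, nominal_load) tuples.
    """
    demand = sum(gamma * q for gamma, q in factored_loads)
    return resistance_factor * nominal_resistance >= demand

# Hypothetical deep-foundation check: dead load 400 kN (gamma = 1.25),
# live load 150 kN (gamma = 1.75), nominal resistance 2000 kN, phi = 0.5.
ok = lrfd_check(nominal_resistance=2000.0, resistance_factor=0.5,
                factored_loads=[(1.25, 400.0), (1.75, 150.0)])
```

Calibrating φ against TCP data, as the study does, amounts to choosing the resistance factor so that this inequality achieves a target reliability index.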
ERIC Educational Resources Information Center
Amato, Sheila
2003-01-01
This brief report describes the development and implementation of a unique, full-year, credit-bearing, technology course in literary Braille transcription offered at a Long Island (New York) high school. It describes the program's goals, development, implementation, students, ongoing activities, outreach efforts, and student attitudes. Suggestions…
Aviation Maintenance Technology. General. Curriculum Implementation Guide.
ERIC Educational Resources Information Center
Moore, John, Jr.; And Others
This curriculum implementation guide is a scope and sequence for the general section of a course in aviation maintenance technology. The course materials were prepared through a cooperative effort of airframe and powerplant mechanics, general aviation industry representatives, Federal Aviation Administration representatives, and vocational…
Case study of open-source enterprise resource planning implementation in a small business
NASA Astrophysics Data System (ADS)
Olson, David L.; Staley, Jesse
2012-02-01
Enterprise resource planning (ERP) systems have been recognised as offering great benefit to some organisations, although they are expensive and problematic to implement. The cost and risk make well-developed proprietary systems unaffordable to small businesses. Open-source software (OSS) has become a viable means of producing ERP system products. The question this paper addresses is the feasibility of OSS ERP systems for small businesses. A case is reported involving two efforts to implement freely distributed ERP software products in a small US make-to-order engineering firm. The case emphasises the potential of freely distributed ERP systems, as well as some of the hurdles involved in their implementation. The paper briefly reviews highlights of OSS ERP systems, with the primary focus on reporting the case experiences for efforts to implement ERPLite software and xTuple software. Both systems worked from a technical perspective, yet both failed due to economic factors. Although these economic conditions led to imperfect results, the case demonstrates the feasibility of OSS ERP for small businesses. Both experiences are evaluated in terms of risk dimensions.
ERIC Educational Resources Information Center
Ujifusa, Andrew
2013-01-01
Opponents of the Common Core State Standards are ramping up legislative pressure and public relations efforts aimed at getting states to scale back--or even abandon--the high-profile initiative, even as implementation proceeds and tests aligned with the standards loom. Critics of the common core have focused recent lobbying and media efforts on…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buche, D. L.
This report describes Northern Indiana Public Service Co. project efforts to develop an automated energy distribution and reliability system. The purpose of this project was to implement a database-driven GIS solution that would manage all of the company's gas, electric, and landbase objects. This report is the second in a series of reports detailing this effort.
Fiscal Year 2007 Performance and Accountability Report
ERIC Educational Resources Information Center
US Department of Education, 2007
2007-01-01
This document represents the annual report card of the U.S. Department of Education's efforts and outcomes during fiscal year 2007. This year's report builds on efforts to increase transparency, and more effectively communicate goals and objectives. The report emphasizes achievements and challenges associated with implementing the Department of…
Cooperative Research Centres: The Concept and Its Implementation.
ERIC Educational Resources Information Center
Slatyer, Ralph O.
1994-01-01
Australia's Cooperative Research Centres Program, a system of 52 research and development (R&D) units, links researchers from public and private sectors, helping industry and scientific community coordinate research efforts. The program represents 6% of the national R&D effort and spans six major R&D and industry sectors. (MSE)
A Collective Effort to Improve Sociology Students' Writing Skills
ERIC Educational Resources Information Center
Burgess-Proctor, Amanda; Cassano, Graham; Condron, Dennis J.; Lyons, Heidi A.; Sanders, George
2014-01-01
Nationwide, academic sociologists at all types of higher education institutions face the challenge of working to improve students' writing skills. In this article, we describe a collective effort by a group of faculty members in one undergraduate sociology program to implement several effective writing-improvement strategies. We advocate…
Impact of an Extension Social Media Tool Kit on Audience Engagement
ERIC Educational Resources Information Center
Garcia, Aileen S.; Dev, Dipti; McGinnis, Colin M.; Thomas, Tyler
2018-01-01
Extension professionals can improve their use of social media as channels for extending programmatic efforts by maximizing target audience reach and engagement. We describe how implementation of a tool kit highlighting best practices for using social media improved Extension professionals' efforts to engage target audience members via social…
The Challenges of Educational Reform in Modern-Day Peru
ERIC Educational Resources Information Center
McDonald, Jane; Lammert, Jill
2006-01-01
The purpose of this article is to examine a nationwide effort of educational reform in Peru. Specifically, the authors take a close look at the nation's efforts to change secondary education through the implementation of a 2-year postsecondary learning opportunity called the "bachillerato." First, the authors briefly present the…
Inviting the "Outsiders" In: Local Efforts to Improve Adjunct Working Conditions
ERIC Educational Resources Information Center
Schreyer, Jessica
2012-01-01
An adjunct turned writing program administrator reflects on her professional journey and describes efforts to improve the teaching environment amongst composition faculty--primarily part-time--within her department. Based on a local program review, a pilot faculty relations plan was implemented that addressed two major areas: offering more…
Profiles of Schools in Change: Four Urban High Schools.
ERIC Educational Resources Information Center
Wermuth, Thomas R.; And Others
1997-01-01
This report highlights four urban comprehensive secondary schools that are developing, implementing, and evaluating reform initiatives that include vocational and technical education as a key component of these efforts. Efforts of these four high schools are described: Bryan High School, Omaha, Nebraska; Humboldt Secondary Complex, St. Paul,…
Case Studies of Urban Schools: Portrayals of Schools in Change.
ERIC Educational Resources Information Center
Wermuth, Thomas R.; Maddy-Bernstein, Carolyn; Grayson, Thomas E.
A purposeful sample of four comprehensive urban high schools involved in educational restructuring initiatives was analyzed to determine how each site has implemented educational restructuring efforts and how vocational education fits into those restructuring efforts. The four sites studied were as follows: Bryan High School in Omaha, Nebraska;…
Implementing Guided Pathways: Early Insights from the AACC Pathways Colleges
ERIC Educational Resources Information Center
Jenkins, Davis; Lahr, Hana; Fink, John
2017-01-01
Across the United States, a growing number of colleges are redesigning their programs and student support services according to the "guided pathways" model. Central to this approach are efforts to clarify pathways to program completion, career advancement, and further education. Equally essential are efforts to help students explore…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-29
...)(3) (Accounting for Disclosures) because release of the accounting of disclosures could alert the... recipient agency. Disclosure of the accounting would therefore present a serious impediment to law enforcement efforts and/or efforts to preserve national security. Disclosure of the accounting would also...
A New Technique for Mitigating Risk on US College Campuses
ERIC Educational Resources Information Center
Hughes, Stephanie; White, Rebecca J.; Hertz, Giles
2008-01-01
High-profile criminal acts continue to plague United States (US) college campuses despite recent efforts to implement more aggressive risk mitigation practices, such as criminal background checks. Despite these efforts, incidents such as the most recent shootings at Virginia Polytechnic Institute and State University continue to demonstrate that,…
Career Development in the Work Place. Overview: ERIC Fact Sheet No. 11.
ERIC Educational Resources Information Center
Eabon, Michelle F.
Changes in the composition and attitudes of the work force have resulted in increased interest by employers in formulating and implementing career development efforts for their employees. Surveys have revealed: companies believe that career development efforts enhance employee performance and improve utilization of talents; most organizations have…
Rapid near-optimal aerospace plane trajectory generation and guidance
NASA Technical Reports Server (NTRS)
Calise, A. J.; Corban, J. E.; Markopoulos, N.
1991-01-01
Effort was directed toward the problems of real-time trajectory optimization and guidance law development for National Aerospace Plane (NASP) applications. In particular, singular perturbation methods were used to develop guidance algorithms suitable for onboard, real-time implementation. The progress made in this research effort is reported.
New Horizons Risk Communication Strategy, Planning, Implementation, and Lessons Learned
NASA Technical Reports Server (NTRS)
Dawson, Sandra A.
2006-01-01
This paper discusses the risk communication goals, strategy, planning process and product development for the New Horizons mission, including lessons from the Cassini mission that were applied in that effort, and presents lessons learned from the New Horizons effort that could be applicable to future missions.
South Korean Digital Textbook Project
ERIC Educational Resources Information Center
Kim, Jackie Hee-Young; Jung, Hye-Yoon
2010-01-01
South Korea has adopted the widespread use of digital textbooks. Part school reform and part an effort to prepare today's children for tomorrow's challenging world, the way in which this effort was implemented and the lessons learned are valuable. This article highlights the history of the digital textbook project and compares printed textbooks…
School Reform Efforts for Lesbian, Gay, Bisexual, and Transgendered Students
ERIC Educational Resources Information Center
Mayberry, Maralee
2006-01-01
Recent efforts of school personnel across the country to implement a variety of initiatives aimed at providing safe and tolerant learning environments for lesbian, gay, bisexual, and transgendered (LGBT) students have resulted in inclusion of homosexual identities in school curricula, identification of positive role models, counseling programs,…
Institutional transformation: An analysis of change initiatives at NSF ADVANCE institutions
NASA Astrophysics Data System (ADS)
Plummer, Ellen W.
The purpose of this study was to examine how institutional culture promoted or impeded the implementation of round one and two NSF ADVANCE initiatives designed to improve academic climates for women in science and engineering. This study was conducted in two phases. In phase one, 35 participants from 18 institutions were interviewed to answer three research questions. Participants identified a policy, process, or program designed to improve academic cultures for women in science and engineering fields. Participants also identified strategies that promoted the implementation of these efforts, and discussed factors that impeded these efforts. In phase two, site visits were conducted at two institutions to answer a fourth research question: How did institutional culture shape the design and implementation of faculty search processes? Policies, processes, and programs were implemented by participants at the institutional, departmental, and individual levels, and included family-friendly and dual-career policies at the institutional level, improved departmental faculty search and climate improvement processes, and mentoring programs and training for department heads at the individual level. Communication and leadership strategies were key to the successful implementation of policies, processes, and programs designed to achieve institutional transformation. Communication strategies involved shaping change messages to reach varied audiences, often with the argument that change efforts would improve the climate for everyone, not just women faculty members. Administrative and faculty leaders from multiple levels proved important to change efforts. Institutional culture shaped initiatives to improve faculty search processes. Faculty leaders in both settings used data to persuade faculty members of the need for change. At one site, data that included national availability information was critical to advancing the change agenda.
At the other site, social science data that illustrated gender bias was persuasive. Faculty members who were effective as change agents were those who were credible with their peers in that setting.
Emotional intensity influences pre-implementation and implementation of distraction and reappraisal
Shafir, Roni; Schwartz, Naama; Blechert, Jens
2015-01-01
Although emotional intensity powerfully challenges regulatory strategies, its influence remains largely unexplored in affective neuroscience. Accordingly, the present study addressed the moderating role of emotional intensity in two regulatory stages—implementation (during regulation) and pre-implementation (prior to regulation)—of two major cognitive regulatory strategies—distraction and reappraisal. According to our framework, because distraction implementation involves early attentional disengagement from emotional information before it gathers force, at high intensity it should be more effective in the short term, relative to reappraisal, which modulates emotional processing only at a late semantic-meaning phase. Supporting findings showed that at high (but not low) intensity, distraction implementation resulted in stronger modulation of negative experience and reduced neural emotional processing (centro-parietal late positive potential, LPP), with suggestive evidence for less cognitive effort (frontal LPP), relative to reappraisal. Related pre-implementation findings confirmed that anticipating regulation of high-intensity stimuli resulted in a preference for distraction (over reappraisal). In contrast, anticipating regulation of low-intensity stimuli resulted in a preference for reappraisal (over distraction), which is most beneficial for long-term adaptation. Furthermore, anticipating cognitively demanding regulation, either in cases of regulating counter to these preferences or via the more effortful strategy of reappraisal, enhanced neural attentional resource allocation (Stimulus Preceding Negativity). Broad implications are discussed. PMID:25700568
An empirical approach to symmetry and probability
NASA Astrophysics Data System (ADS)
North, Jill
We often rely on symmetries to infer outcomes' probabilities, as when we infer that each side of a fair coin is equally likely to come up on a given toss. Why are these inferences successful? I argue against answering this question with an a priori indifference principle. Reasons to reject such a principle are familiar, yet instructive. They point to a new, empirical explanation for the success of our probabilistic predictions. This has implications for indifference reasoning generally. I argue that a priori symmetries need never constrain our probability attributions, even for initial credences.
1982-12-01
[Garbled table/figure captions; recoverable content: Relationship of PDOP and HDOP with a priori altitude uncertainty in 3-dimensional navigation, and relationship of HDOP with a priori altitude uncertainty in 2-dimensional navigation, each tabulated for satellite azimuth/elevation (AZEL) configurations such as (0°, 10°), (90°, 10°), (180°, 10°), (270°, 40°).]
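The PDOP and HDOP quantities tabulated in the report are dilution-of-precision measures computed from satellite geometry. A sketch of the standard computation from azimuth/elevation angles (the configuration below echoes the caption's azimuth sets, with an extra high-elevation satellite added by assumption so the toy geometry is non-singular):

```python
import math

def dop(az_el_deg):
    """Compute (HDOP, PDOP) from satellite azimuth/elevation angles in degrees.

    Builds the geometry matrix G (unit line-of-sight components plus a clock
    column), forms Q = (G^T G)^-1 by Gauss-Jordan elimination, and reads the
    DOPs from Q's diagonal. Requires at least 4 satellites.
    """
    G = []
    for az, el in az_el_deg:
        a, e = math.radians(az), math.radians(el)
        G.append([math.cos(e) * math.sin(a),   # east component
                  math.cos(e) * math.cos(a),   # north component
                  math.sin(e),                 # up component
                  1.0])                        # receiver clock term
    n = 4
    N = [[sum(g[i] * g[j] for g in G) for j in range(n)] for i in range(n)]
    Q = [[float(i == j) for j in range(n)] for i in range(n)]
    for col in range(n):                       # invert N in place into Q
        piv = max(range(col, n), key=lambda r: abs(N[r][col]))
        N[col], N[piv] = N[piv], N[col]
        Q[col], Q[piv] = Q[piv], Q[col]
        d = N[col][col]
        N[col] = [x / d for x in N[col]]
        Q[col] = [x / d for x in Q[col]]
        for r in range(n):
            if r != col:
                f = N[r][col]
                N[r] = [x - f * y for x, y in zip(N[r], N[col])]
                Q[r] = [x - f * y for x, y in zip(Q[r], Q[col])]
    hdop = math.sqrt(Q[0][0] + Q[1][1])
    pdop = math.sqrt(Q[0][0] + Q[1][1] + Q[2][2])
    return hdop, pdop

# Four low satellites at the cardinal azimuths plus one near zenith:
hdop, pdop = dop([(0, 10), (90, 10), (180, 10), (270, 10), (0, 90)])
```

Incorporating an a priori altitude constraint, as the report studies, amounts to augmenting the normal matrix with a pseudo-observation of the "up" component, which lowers the effective PDOP.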
Training outreach workers for AIDS prevention in rural India: is it sustainable?
Sivaram, S; Celentano, D D
2003-12-01
Through a process of community diagnosis and participation, a non-governmental organization in rural Karnataka state in India selected and trained peer outreach workers to implement and sustain AIDS prevention education activities. This activity was part of a larger AIDS education project that aimed at creating awareness and promoting risk-reducing behaviours in the community. This paper describes efforts of the project to identify and train peer educators during its implementation phase and discusses strategies used to facilitate sustainability. We evaluate the impact of these efforts by conducting an analysis in the project area 2 years after the end of the project. The findings reveal generalized interest among rural communities in HIV prevention issues. The project originally conducted an extensive survey to understand community organization and composition, which helped to identify potential partners and peer educators. Training peer educators was a multi-step process, and one with high attrition. While individual peer educators were an excellent resource during the life of the project, peer educators affiliated with village level institutions had the interest, access to resources and willingness to sustain project efforts. However, the sustainability of their efforts was associated with the quality of interactions with the project implementation team, the strength and leadership of their own institutions, the perceived benefits of implementing AIDS education activities after project life and the gender of the outreach worker. Non-sustainers did not have an organizational structure to backstop their work, were often poor and unemployed persons who later found gainful employment, and overwhelmingly, were female. We present a conceptual model based on these findings to help future projects plan for and achieve sustainability.
Implementing Equal Access Computer Labs.
ERIC Educational Resources Information Center
Clinton, Janeen; And Others
This paper discusses the philosophy followed in Palm Beach County to adapt computer literacy curriculum, hardware, and software to meet the needs of all children. The Department of Exceptional Student Education and the Department of Instructional Computing Services cooperated in planning strategies and coordinating efforts to implement equal…
Considerations in change management related to technology.
Luo, John S; Hilty, Donald M; Worley, Linda L; Yager, Joel
2006-01-01
The authors describe the complexity of social processes for implementing technological change. Once a new technology is available, information about its availability and benefits must be made available to the community of users, with opportunities to try the innovations and find them worthwhile, despite organizational resistances. The authors reviewed the literature from psychiatry, psychology, sociology, business, and technology to distill common denominators for success and failure related to implementing technology. Beneficial technological innovations that are simple to use and obviously save everyone time and effort are easy to inaugurate. However, innovations that primarily serve management rather than subordinates or front-line utilizers may fail, despite considerable institutional effort. This article reviews and outlines several of the more prominent theoretical models governing successful institutional change. Successful implementation of difficult technological changes requires visionary leadership that has carefully considered the benefits, consulted with influence leaders at all organizational levels to spot unintended consequences and sources of resistance, and developed a detailed plan and continuous quality assurance process to foster implementation over time.
Pinnock, Hilary; Epiphaniou, Eleni; Sheikh, Aziz; Griffiths, Chris; Eldridge, Sandra; Craig, Peter; Taylor, Stephanie J C
2015-03-30
Dissemination and implementation of health care interventions are currently hampered by the variable quality of reporting of implementation research. Reporting of other study types has been improved by the introduction of reporting standards (e.g. CONSORT). We are therefore developing guidelines for reporting implementation studies (StaRI). Using established methodology for developing health research reporting guidelines, we systematically reviewed the literature to generate items for a checklist of reporting standards. We then recruited an international, multidisciplinary panel for an e-Delphi consensus-building exercise which comprised an initial open round to revise/suggest a list of potential items for scoring in the subsequent two scoring rounds (scale 1 to 9). Consensus was defined a priori as 80% agreement with the priority scores of 7, 8, or 9. We identified eight papers from the literature review from which we derived 36 potential items. We recruited 23 experts to the e-Delphi panel. Open round comments resulted in revisions, and 47 items went forward to the scoring rounds. Thirty-five items achieved consensus: 19 achieved 100% agreement. Prioritised items addressed the need to: provide an evidence-based justification for implementation; describe the setting, professional/service requirements, eligible population and intervention in detail; measure process and clinical outcomes at population level (using routine data); report impact on health care resources; describe local adaptations to the implementation strategy and describe barriers/facilitators. Over-arching themes from the free-text comments included balancing the need for detailed descriptions of interventions with publishing constraints, addressing the dual aims of reporting on the process of implementation and effectiveness of the intervention and monitoring fidelity to an intervention whilst encouraging adaptation to suit diverse local contexts. 
We have identified priority items for reporting implementation studies and key issues for further discussion. An international, multidisciplinary workshop, where participants will debate the issues raised, clarify specific items and develop StaRI standards that fit within the suite of EQUATOR reporting guidelines, is planned. The protocol is registered with Equator: http://www.equator-network.org/library/reporting-guidelines-under-development/#17 .
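The a priori consensus rule described above (80% of panellists scoring an item 7, 8, or 9) is straightforward to operationalize. A minimal sketch, with invented panel scores for illustration:

```python
def reaches_consensus(scores, threshold=0.80, priority_range=(7, 9)):
    """Apply the StaRI-style a priori consensus rule: an item reaches
    consensus when at least `threshold` of panellists score it within
    the priority range (inclusive)."""
    lo, hi = priority_range
    in_range = sum(1 for s in scores if lo <= s <= hi)
    return in_range / len(scores) >= threshold

# Hypothetical scoring round for one checklist item (23 panellists):
item_scores = [9, 8, 8, 7, 9, 9, 8, 7, 7, 8, 9, 9, 8, 7, 8,
               9, 7, 8, 9, 6, 5, 8, 9]
retained = reaches_consensus(item_scores)  # 21/23 in range -> consensus
```

Applying this per item across the two scoring rounds reproduces the selection step that took 47 candidate items down to the 35 reaching consensus.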
Modeling and Simulation of Amorphous Materials
NASA Astrophysics Data System (ADS)
Pandey, Anup
The general and practical inversion of diffraction data - producing a computer model correctly representing the material explored - is an important unsolved problem for disordered materials. Such modeling should proceed by using our full knowledge base, both from experiment and theory. In this dissertation, we introduce a robust method, Force-Enhanced Atomic Refinement (FEAR), which jointly exploits the power of ab initio atomistic simulation along with the information carried by diffraction data. As a preliminary trial, the method has been implemented using empirical potentials for amorphous silicon (a-Si) and silica (SiO2). The models obtained are comparable to the ones prepared by the conventional approaches as well as the experiments. Using ab initio interactions, the method is applied to two very different systems: amorphous silicon (a-Si) and two compositions of the solid-electrolyte memory material silver-doped GeSe3. It is shown that the method works well for both materials. Besides that, the technique is easy to implement, is faster, and yields results much improved over conventional simulation methods for the materials explored. It offers a means to add a priori information in first-principles modeling of materials, and represents a significant step toward the computational design of non-crystalline materials using accurate interatomic interactions and experimental information. Moreover, the method has also been used to create a computer model of a-Si using highly precise X-ray diffraction data. The model predicts properties that are close to those of continuous random network models, but with no a priori assumptions. In addition, using ab initio molecular dynamics simulations (AIMD), we explored doping and transport in hydrogenated amorphous silicon (a-Si:H) with the most popular impurities: boron and phosphorus. We investigated doping for these impurities and the role of H in the doping process.
We found that the network motion and H hopping induced by thermal fluctuations significantly impact conduction in this material. In the last section of the dissertation, we employed AIMD to model the structure of amorphous zinc oxide (a-ZnO) and a-ZnO doped with trivalent elements (Al, Ga, and In). We studied the structure and electronic structure of these models, as well as the effect of the trivalent dopants on both the structure and electronic structure of a-ZnO.
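FEAR's core idea, alternating data-driven refinement with force relaxation, can be caricatured in a few lines. This 1-D toy sketch uses invented stand-ins (a quadratic data-mismatch "chi2" and a contraction "relax") rather than the actual Reverse Monte Carlo and ab initio machinery:

```python
import random

def fear_refine(positions, chi2, relax, n_cycles=100, n_rmc_steps=50, step=0.1):
    """Toy FEAR loop: alternate RMC-style moves that reduce the model-vs-data
    mismatch chi2(positions) with calls to `relax`, which stands in for a
    partial energy minimization under ab initio (or empirical) forces."""
    for _ in range(n_cycles):
        for _ in range(n_rmc_steps):            # data-fitting sub-loop
            i = random.randrange(len(positions))
            trial = list(positions)
            trial[i] = positions[i] + random.uniform(-step, step)
            if chi2(trial) <= chi2(positions):  # greedy accept (real RMC also
                positions = trial               # accepts some uphill moves)
        positions = relax(positions)            # force/energy sub-step
    return positions

# Toy demo: the "experiment" wants atoms at unit spacing; "relaxation"
# nudges each coordinate halfway toward that target.
target = [0.0, 1.0, 2.0, 3.0]
chi2 = lambda p: sum((a - b) ** 2 for a, b in zip(p, target))
relax = lambda p: [a + 0.5 * (b - a) for a, b in zip(p, target)]
model = fear_refine([0.3, 0.8, 2.4, 2.7], chi2, relax)
```

In the real method, chi2 compares the model's structure factor or pair distribution function against diffraction data, and the relaxation is a few steps of conjugate-gradient minimization in a density functional theory code.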
Multilevel Interventions Targeting Obesity: Research Recommendations for Vulnerable Populations.
Stevens, June; Pratt, Charlotte; Boyington, Josephine; Nelson, Cheryl; Truesdale, Kimberly P; Ward, Dianne S; Lytle, Leslie; Sherwood, Nancy E; Robinson, Thomas N; Moore, Shirley; Barkin, Shari; Cheung, Ying Kuen; Murray, David M
2017-01-01
The origins of obesity are complex and multifaceted. To be successful, an intervention aiming to prevent or treat obesity may need to address multiple layers of biological, social, and environmental influences. NIH recognizes the importance of identifying effective strategies to combat obesity, particularly in high-risk and disadvantaged populations with heightened susceptibility to obesity and subsequent metabolic sequelae. To move this work forward, the National Heart, Lung, and Blood Institute, in collaboration with the NIH Office of Behavioral and Social Science Research and NIH Office of Disease Prevention convened a working group to inform research on multilevel obesity interventions in vulnerable populations. The working group reviewed relevant aspects of intervention planning, recruitment, retention, implementation, evaluation, and analysis, and then made recommendations. Recruitment and retention techniques used in multilevel research must be culturally appropriate and suited to both individuals and organizations. Adequate time and resources for preliminary work are essential. Collaborative projects can benefit from complementary areas of expertise and shared investigations rigorously pretesting specific aspects of approaches. Study designs need to accommodate the social and environmental levels under study, and include appropriate attention given to statistical power. Projects should monitor implementation in the multiple venues and include a priori estimation of the magnitude of change expected within and across levels. The complexity and challenges of delivering interventions at several levels of the social-ecologic model require careful planning and implementation, but hold promise for successful reduction of obesity in vulnerable populations. Copyright © 2016. Published by Elsevier Inc.
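The a priori planning the working group recommends, estimating expected change and statistical power across levels, typically runs through the cluster design effect DEFF = 1 + (m − 1)·ICC. A minimal sketch using the normal-approximation sample-size formula (the effect size, ICC, and cluster size below are illustrative, not recommendations from the report):

```python
import math

def cluster_sample_size(effect_size, icc, cluster_size):
    """Per-arm sample size for comparing two means in a cluster-randomized
    design: the individually randomized requirement (two-sided alpha = 0.05,
    power = 0.80, normal approximation) inflated by the design effect
    DEFF = 1 + (cluster_size - 1) * ICC."""
    z_alpha = 1.959964   # z for two-sided alpha = 0.05
    z_beta = 0.841621    # z for power = 0.80
    n_individual = 2 * (z_alpha + z_beta) ** 2 / effect_size ** 2
    deff = 1 + (cluster_size - 1) * icc
    return math.ceil(n_individual * deff)

# e.g. detecting a standardized effect of 0.3 with 20 participants per site
# and a site-level intraclass correlation of 0.05:
n_per_arm = cluster_sample_size(effect_size=0.3, icc=0.05, cluster_size=20)
```

Even a modest ICC nearly doubles the required sample here, which is why the report stresses accommodating the social and environmental levels in the design stage rather than after data collection.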
Partial Fourier techniques in single-shot cross-term spatiotemporal encoded MRI.
Zhang, Zhiyong; Frydman, Lucio
2018-03-01
Cross-term spatiotemporal encoding (xSPEN) is a single-shot approach with exceptional immunity to field heterogeneities, the images of which faithfully deliver 2D spatial distributions without requiring a priori information or using postacquisition corrections. xSPEN, however, suffers from signal-to-noise ratio penalties due to its non-Fourier nature and due to diffusion losses, especially when seeking high resolution. This study explores partial Fourier transform approaches that, acting along either the readout or the spatiotemporally encoded dimension, reduce these penalties. xSPEN uses an orthogonal (e.g., z) gradient to read, in direct space, the low-bandwidth (e.g., y) dimension. This substantially changes the nature of partial Fourier acquisitions vis-à-vis conventional imaging counterparts. A suitable theoretical analysis is derived to implement these procedures along either the spatiotemporal or readout axes. Partial Fourier single-shot xSPEN images were recorded on preclinical and human scanners. Owing to the reduction in acquisition times, this approach provided substantial sensitivity gains vis-à-vis previous implementations for a given targeted in-plane resolution. The physical origins of these gains are explained. Partial Fourier approaches, particularly when implemented along the low-bandwidth spatiotemporal dimension, provide several-fold sensitivity advantages at minimal costs to the execution and processing of the single-shot experiments. Magn Reson Med 79:1506-1514, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
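The principle that makes partial Fourier acquisition possible is conjugate (Hermitian) symmetry: for a real-valued image, the k-space satisfies S[N−k] = conj(S[k]), so just over half the samples determine the rest. A 1-D toy demonstration of that filling step (this is the generic principle, not the actual xSPEN processing chain):

```python
import cmath

def dft(x, inverse=False):
    """Naive O(N^2) 1-D DFT, sufficient for a toy demonstration."""
    N, s = len(x), (1 if inverse else -1)
    out = [sum(x[n] * cmath.exp(s * 2j * cmath.pi * k * n / N) for n in range(N))
           for k in range(N)]
    return [v / N for v in out] if inverse else out

def partial_fourier_fill(acquired, n_total):
    """Synthesize unacquired high-frequency k-space samples by Hermitian
    symmetry, valid when the underlying image is real. Samples
    0..len(acquired)-1 are measured; more than half must be acquired."""
    filled = list(acquired) + [0j] * (n_total - len(acquired))
    for k in range(len(acquired), n_total):
        filled[k] = filled[n_total - k].conjugate()
    return filled

# Toy real "image": acquire 5 of 8 k-space samples, then restore the rest.
image = [0.0, 1.0, 2.0, 1.0, 0.0, 0.0, 0.0, 0.0]
full_k = dft(image)
recon_k = partial_fourier_fill(full_k[:5], 8)
recon = [v.real for v in dft(recon_k, inverse=True)]
```

In practice the image phase is not exactly zero, so real partial Fourier reconstructions (homodyne, POCS) estimate a low-resolution phase map from the symmetrically sampled center before enforcing the symmetry.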
Implementation of Complex Signal Processing Algorithms for Position-Sensitive Microcalorimeters
NASA Technical Reports Server (NTRS)
Smith, Stephen J.
2008-01-01
We have recently reported on a theoretical digital signal-processing algorithm for improved energy and position resolution in position-sensitive, transition-edge sensor (PoST) X-ray detectors [Smith et al., Nucl. Instr. and Meth. A 556 (2006) 237]. PoSTs consist of one or more transition-edge sensors (TESs) on a large continuous or pixellated X-ray absorber and are under development as an alternative to arrays of single-pixel TESs. PoSTs provide a means to increase the field of view for the fewest number of read-out channels. In this contribution we extend the theoretical correlated energy position optimal filter (CEPOF) algorithm (originally developed for 2-TES continuous-absorber PoSTs) to investigate practical implementation on multi-pixel single-TES PoSTs, or Hydras. We use numerically simulated data for a nine-absorber device, including realistic detector noise, to demonstrate an iterative scheme that converges on the correct photon absorption position and energy without any a priori assumptions. The position sensitivity of the CEPOF implemented on simulated data agrees very well with the theoretically predicted resolution. We discuss practical issues such as the impact of the random arrival phase of the measured data on the performance of the CEPOF. The CEPOF algorithm demonstrates that a full-width-at-half-maximum energy resolution of < 8 eV, coupled with position sensitivity down to a few hundred eV, should be achievable for a fully optimized device.
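The optimal filtering at the heart of CEPOF builds on the standard frequency-domain matched filter used in microcalorimeter pulse processing. A minimal single-channel sketch (hedged: names are illustrative, and the actual CEPOF jointly estimates energy and position by correlating such estimates across absorbers):

```python
import numpy as np

def optimal_filter_amplitude(pulse, template, noise_psd):
    """Frequency-domain optimal filter: estimate the pulse amplitude by
    weighting each frequency bin with the template conjugate divided by
    the noise power spectral density."""
    P = np.fft.rfft(pulse)
    S = np.fft.rfft(template)
    w = np.conj(S) / noise_psd          # inverse-noise-weighted template
    # Normalized so that a pulse identical to the template returns 1.0.
    return float(np.real(np.sum(w * P) / np.sum(w * S)))
```

The returned amplitude, multiplied by a calibration constant, gives the photon energy; the random arrival phase discussed in the abstract matters because a pulse not aligned with the template biases this estimate.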
Multilevel Interventions Targeting Obesity: Research Recommendations for Vulnerable Populations
Stevens, June; Pratt, Charlotte; Boyington, Josephine; Nelson, Cheryl; Truesdale, Kimberly P.; Ward, Dianne S.; Lytle, Leslie; Sherwood, Nancy E.; Robinson, Thomas N.; Moore, Shirley; Barkin, Shari; Cheung, Ying Kuen; Murray, David M.
2017-01-01
Introduction The origins of obesity are complex and multifaceted. To be successful, an intervention aiming to prevent or treat obesity may need to address multiple layers of biological, social, and environmental influences. Methods NIH recognizes the importance of identifying effective strategies to combat obesity, particularly in high-risk and disadvantaged populations with heightened susceptibility to obesity and subsequent metabolic sequelae. To move this work forward, the National Heart, Lung, and Blood Institute, in collaboration with the NIH Office of Behavioral and Social Science Research and NIH Office of Disease Prevention, convened a working group to inform research on multilevel obesity interventions in vulnerable populations. The working group reviewed relevant aspects of intervention planning, recruitment, retention, implementation, evaluation, and analysis, and then made recommendations. Results Recruitment and retention techniques used in multilevel research must be culturally appropriate and suited to both individuals and organizations. Adequate time and resources for preliminary work are essential. Collaborative projects can benefit from complementary areas of expertise and shared investigations rigorously pretesting specific aspects of approaches. Study designs need to accommodate the social and environmental levels under study, and include appropriate attention given to statistical power. Projects should monitor implementation in the multiple venues and include a priori estimation of the magnitude of change expected within and across levels. Conclusions The complexity and challenges of delivering interventions at several levels of the social-ecologic model require careful planning and implementation, but hold promise for successful reduction of obesity in vulnerable populations. PMID:28340973
Sepers, Charles E.; McKain, Wesley
2015-01-01
Successful implementation of the Affordable Care Act (ACA) depends on the capacity of local communities to mobilize for action. Yet the literature offers few systematic investigations of what communities are doing to ensure support for enrollment. In this empirical case study, we report implementation and outcomes of Enroll Wyandotte, a community mobilization effort to facilitate enrollment through the ACA in Wyandotte County, Kansas. We describe mobilization activities during the first round of open enrollment in coverage under the ACA (October 1, 2013–March 31, 2014), including the unfolding of community and organizational changes (e.g., new enrollment sites) and services provided to assist enrollment over time. The findings show an association between implementation measures and newly created accounts under the ACA (the primary outcome). PMID:25905820
Stellefson, Michael; Barry, Adam; Chaney, Beth H; Chaney, J Don; Hanik, Bruce
2011-05-01
What exactly is health education? Professionals with advanced degrees in health education have most likely encountered questions such as these either during introductory coursework or from those inquiring about the field. These queries can prove quite perplexing when asked by individuals who are unaware of the health education profession. Because the act of marketing health education is crucial to the sustainability of the field, the purpose of this article is to (a) explore the issue of describing and promoting health education, (b) establish ideas that can facilitate the provision of coordinated marketing efforts, and (c) offer marketing management and implementation principles that can assist in marketing both health education and health educators. Based on this discussion, the authors suggest building mainstream consensus in regards to marketing message development and implementation to better position health education.
ERIC Educational Resources Information Center
Agency for International Development (Dept. of State), Washington, DC.
This report discusses the many policy and procedural issues of the Agency for International Development (AID) in implementing the reforms included in the congressional Foreign Assistance Act of 1973. The act concentrated aid efforts on food and nutrition improvement, population control, health improvement, education, and human resource…
ERIC Educational Resources Information Center
Balfanz, Robert; Mac Iver, Douglas J.; Byrnes, Vaughan
2006-01-01
This article reports on the first 4 years of an effort to develop comprehensive and sustainable mathematics education reforms in high poverty middle schools. In four related analyses, we examine the levels of implementation achieved and impact of the reforms on various measures of achievement in the first 3 schools to implement the Talent…
2014-06-01
activities in 11 countries—Algeria, Burkina Faso, Cameroon, Chad, Mali, Mauritania, Morocco, Niger, Nigeria, Senegal, and Tunisia...are typically State-implemented and are used to support activities designed to enhance the law enforcement capacity and antiterrorism skills of...for other anticrime purposes. INCLE funds are typically State-implemented and are used to develop and implement policies and programs that maintain
ERIC Educational Resources Information Center
Wong, Vivian C.; Wing, Coady; Martin, David; Krishnamachari, Anandita
2018-01-01
When No Child Left Behind (NCLB) became law in 2002, it was viewed as an effort to create uniform standards for students and schools across the country. More than a decade later, we know surprisingly little about how states actually implemented NCLB and the extent to which state implementation decisions managed to undo the centralizing objectives…
Quaternion normalization in spacecraft attitude determination
NASA Technical Reports Server (NTRS)
Deutschmann, J.; Markley, F. L.; Bar-Itzhack, Itzhack Y.
1993-01-01
Attitude determination of spacecraft usually utilizes vector measurements such as Sun, center of Earth, star, and magnetic field direction to update the quaternion which determines the spacecraft orientation with respect to some reference coordinates in the three dimensional space. These measurements are usually processed by an extended Kalman filter (EKF) which yields an estimate of the attitude quaternion. Two EKF versions for quaternion estimation were presented in the literature; namely, the multiplicative EKF (MEKF) and the additive EKF (AEKF). In the multiplicative EKF, it is assumed that the error between the correct quaternion and its a-priori estimate is, by itself, a quaternion that represents the rotation necessary to bring the attitude which corresponds to the a-priori estimate of the quaternion into coincidence with the correct attitude. The EKF basically estimates this quotient quaternion and then the updated quaternion estimate is obtained by the product of the a-priori quaternion estimate and the estimate of the difference quaternion. In the additive EKF, it is assumed that the error between the a-priori quaternion estimate and the correct one is an algebraic difference between two four-tuple elements and thus the EKF is set to estimate this difference. The updated quaternion is then computed by adding the estimate of the difference to the a-priori quaternion estimate. If the quaternion estimate converges to the correct quaternion, then, naturally, the quaternion estimate has unity norm. This fact was utilized in the past to obtain superior filter performance by applying normalization to the filter measurement update of the quaternion. It was observed for the AEKF that when the attitude changed very slowly between measurements, normalization merely resulted in a faster convergence; however, when the attitude changed considerably between measurements, without filter tuning or normalization, the quaternion estimate diverged. 
However, when the quaternion estimate was normalized, the estimate converged faster and to a lower error than with tuning alone. In last year's symposium we presented three new AEKF normalization techniques and compared them to the brute-force method presented in the literature. The present paper addresses the issue of normalization in the MEKF and examines several MEKF normalization techniques.
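The two update conventions contrasted above can be sketched in a few lines. This is a hedged illustration of the additive and multiplicative quaternion updates with brute-force normalization only (scalar-last convention assumed; the EKF covariance machinery and the normalization variants compared in the paper are omitted):

```python
import numpy as np

def additive_update(q_prior, dq):
    """AEKF-style update: treat the estimated error dq as an algebraic
    difference of four-tuples, add it, then restore the unit norm."""
    q = q_prior + dq
    return q / np.linalg.norm(q)        # brute-force normalization

def multiplicative_update(q_prior, dq):
    """MEKF-style update: compose the a priori estimate with the estimated
    error quaternion via the Hamilton product (scalar-last)."""
    x1, y1, z1, w1 = q_prior
    x2, y2, z2, w2 = dq
    q = np.array([
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
    ])
    return q / np.linalg.norm(q)
```

The multiplicative form preserves unit norm up to round-off by construction, which is why normalization strategies play a different role there than in the additive filter.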
NASA Astrophysics Data System (ADS)
Chouaib, Wafa; Alila, Younes; Caldwell, Peter V.
2018-05-01
The need for predictions of flow time-series persists at ungauged catchments, motivating the research goals of our study. By means of the Sacramento model, this paper explores the use of parameter transfer within homogeneous regions of similar climate and flow characteristics and makes comparisons with predictions from a priori parameters. We assessed performance using the Nash-Sutcliffe efficiency (NS), bias, mean monthly hydrograph, and flow duration curve (FDC). The study was conducted on a large dataset of 73 catchments within the eastern US. Two approaches to parameter transferability were developed and evaluated: (i) parameter transfer within homogeneous regions, using one donor catchment specific to each region; and (ii) parameter transfer disregarding the geographical limits of homogeneous regions, with one donor catchment common to all regions. Comparing the two approaches enabled us to assess the gain in performance from parameter regionalization and its respective constraints and limitations. The parameter transfer within homogeneous regions outperformed the a priori parameters, decreasing bias and increasing efficiency to a median NS of 0.77, with an NS of 0.85 at individual catchments. The use of the FDC revealed the effect of bias on the inaccuracy of predictions from parameter transfer. In one specific region, of mountainous and forested catchments, the prediction accuracy of the parameter transfer was less satisfactory and equivalent to that of the a priori parameters. In this region, the parameter transfer from the donor catchment outside the region provided the best performance: less biased, with smaller uncertainty in the medium flow percentiles (40%-60%). The large disparity of energy conditions explained the lack of performance from parameter transfer in this region.
In addition, subsurface stormflow is predominant in this region and lateral preferential flow is likely, which further explains the reduced efficiency. Testing parameter transferability at ungauged catchments using criteria of similar climate and flow characteristics, and comparing it with predictions from a priori parameters, is a novel contribution of this work. The limitations of both approaches are recognized and recommendations are made for future research.
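The evaluation metrics named in this study are standard and easy to state precisely. A minimal sketch under the usual definitions (variable names are illustrative; the paper's exact bias convention is not specified in the abstract):

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the model is
    no better than predicting the observed mean, negative is worse."""
    obs = np.asarray(obs, dtype=float)
    sim = np.asarray(sim, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def percent_bias(obs, sim):
    """Percent bias of simulated vs. observed flow volume; under the common
    convention used here, positive values indicate underestimation."""
    obs = np.asarray(obs, dtype=float)
    sim = np.asarray(sim, dtype=float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)
```

Because NS squares the residuals, it is dominated by errors on high flows; that is why the study also inspects the FDC and flow percentiles, where systematic bias in medium and low flows becomes visible.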
Controller Requirements for Uncoupled Aircraft Motion. Volume 2.
1984-09-01
The effort was divided into two phases. Phase I consisted of defining existing data on the design of cockpit controllers that allow efficient implementation of the 6-DOF control capability. The proposed criteria are described in Volume I of this report.
Hall, F Scott; Drgonova, Jana; Jain, Siddharth; Uhl, George R
2013-12-01
Substantial genetic contributions to addiction vulnerability are supported by data from twin studies, linkage studies, candidate gene association studies and, more recently, genome-wide association studies (GWAS). Parallel to this work, animal studies have attempted to identify the genes that may contribute to responses to addictive drugs and addiction liability, initially focusing upon genes for the targets of the major drugs of abuse. These studies identified genes/proteins that affect responses to drugs of abuse; however, this does not necessarily mean that variation in these genes contributes to the genetic component of addiction liability. One of the major problems with initial linkage and candidate gene studies was an a priori focus on the genes thought to be involved in addiction based upon the known contributions of those proteins to drug actions, making the identification of novel genes unlikely. The GWAS approach is systematic and agnostic to such a priori assumptions. From the numerous GWAS now completed, several conclusions may be drawn: (1) addiction is highly polygenic, with each allelic variant contributing in a small, additive fashion to addiction vulnerability; (2) classes of genes unexpected under our a priori assumptions are the most important in explaining addiction vulnerability; (3) although substantial genetic heterogeneity exists, there is substantial convergence of GWAS signals on particular genes. This review traces the history of this research, from initial transgenic mouse models based upon candidate gene and linkage studies, through the progression of GWAS for addiction and nicotine cessation, to the current human and transgenic mouse studies post-GWAS. © 2013.
Sturke, Rachel; Harmston, Christine; Simonds, R J; Mofenson, Lynne M; Siberry, George K; Watts, D Heather; McIntyre, James; Anand, Nalini; Guay, Laura; Castor, Delivette; Brouwers, Pim; Nagel, Joan D
2014-11-01
In resource-limited countries, interventions to prevent mother-to-child HIV transmission (PMTCT) have not yet realized their full potential health impact, illustrating the common gap between the scientific proof of an intervention's efficacy and effectiveness and its successful implementation at scale into routine health services. For PMTCT, this gap results, in part, from inadequate adaptation of PMTCT interventions to the realities of the implementation environment, including client and health care worker behaviors and preferences, health care policies and systems, and infrastructure and resource constraints. Elimination of mother-to-child HIV transmission can only be achieved through understanding of key implementation barriers and successful adaptation of scientifically proven interventions to the local environment. Central to such efforts is implementation science (IS), which aims to investigate and address major bottlenecks that impede effective implementation and to test new approaches to identifying, understanding, and overcoming barriers to the adoption, adaptation, integration, scale-up, and sustainability of evidence-based interventions. Advancing IS will require deliberate and strategic efforts to facilitate collaboration, communication, and relationship-building among researchers, implementers, and policy-makers. To speed the translation of effective PMTCT interventions into practice and advance IS more broadly, the US National Institutes of Health, in collaboration with the President's Emergency Plan for AIDS Relief, launched the National Institutes of Health/President's Emergency Plan for AIDS Relief PMTCT IS Alliance, composed of IS researchers, PMTCT program implementers, and policy-makers, as an innovative platform for interaction and coordination.
The Ecology of Sustainable Implementation: Reflection on a 10-Year Case History Illustration.
Rimehaug, Tormod
2014-01-01
The primary aim of this paper is to illustrate the strategic and ecological nature of implementation. The ultimate aim of implementation is not dissemination but sustainability beyond the implementation effort. A case study is utilized to illustrate these broad and long-term perspectives of sustainable implementation based on qualitative analyses of a 10-year implementation effort. The purveyors aimed to develop selective community prevention services for children in families burdened by parental psychiatric or addictive problems. Services were gradually disseminated to 23 sites serving 40 municipalities by 2013. Up to 2013, only one site terminated services after initial implementation. Although many sites suspended services for shorter periods, services are still offered at 22 sites. This case analysis is based on project reports, user evaluations, practitioner interviews, and service statistics. The paper focuses on the analyses and strategies utilized to cope with quality decay and setbacks as well as progress and success in disseminating and sustaining the services and their quality. Low-cost multilevel strategies to implement services at the community level were organized by a prevention unit in child psychiatry, supervised by a university department (purveyors). The purveyors were also involved in national and international collaboration and development. Multilevel strategies included manualized intervention, in-practice training methods, organizational responsibility, media strategies, service evaluation, staff motivation maintenance, quality assurance, and proposals for new law regulations. These case history aspects will be discussed in relation to the implementation literature, focusing on possible applicability across settings.
State Practices in the Assessment of Outcomes for Students with Disabilities. Technical Report.
ERIC Educational Resources Information Center
Shriner, James G.; And Others
This technical report describes the methodology, results, and conclusions of a 1991 survey, which was conducted to determine state efforts to develop systems to assess educational outcomes, states' needs for solutions to technical/implementation problems, existing databases, and efforts of states to design a comprehensive system of indicators in…
The Potomac River watershed is a critical drinking water supply for the Washington DC metropolitan area. In 2004, the Drinking Water Source Protection Partnership (DWSPP) was formed to help coordinate efforts by local drinking water utilities and government agencies to protect th...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-11
... federally threatened Higo Chumbo cactus. Removal of invasive animal species would also continue, and we... animal species and would implement efforts to avoid introduction of new invasive species from increased... restoration efforts. Over the 15-year life of the CCP, we would complete the removal of all invasive animal...
Ecological assessment of sagebrush grasslands in eastern Wyoming
Amy C. Ganguli; Jonathan B. Haufler; Carolyn A. Mehl; Scott D. Yeats
2011-01-01
An understanding of existing ecosystem conditions is necessary for planning efforts that include formulation of landscape conservation goals and implementation strategies. In support of a landscape planning effort for a 946,000-ac mixed-ownership area in eastern Wyoming, we used remote sensing and field sampling to assess existing ecosystem conditions of terrestrial...
Philosophies of Leadership and Management and Its Influence on Change.
ERIC Educational Resources Information Center
Moore, Rock D.
Since the National Commission on Excellence in Education stated the nation was "at risk," there have been efforts to design and implement improved teacher delivery strategies that would enhance classroom instruction. In spite of reform efforts, crucial decisions affecting teachers' classroom practices still have a tendency to be made by…
The Moderating Effects of School Climate on Bullying Prevention Efforts
ERIC Educational Resources Information Center
Low, Sabina; Van Ryzin, Mark
2014-01-01
Bullying prevention efforts have yielded mixed effects over the last 20 years. Program effectiveness is driven by a number of factors (e.g., program elements and implementation), but there remains a dearth of understanding regarding the role of school climate on the impact of bullying prevention programs. This gap is surprising, given research…
[The University of California Cooperative Teacher Preparation Project (UCCTPP)].
ERIC Educational Resources Information Center
California Univ., Berkeley. School of Education.
The University of California Cooperative Teacher Preparation Project (UCCTPP) began in 1971 as an effort to improve the quality and effectiveness of teacher education programs. UCCTPP is currently implemented through the cooperative efforts of the School of Education at the University of California at Berkeley and the Mount Diablo and Vallejo…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-23
...)(3) and (4) (Accounting for Disclosures) because release of the accounting of disclosures could alert... the recipient agency. Disclosure of the accounting would therefore present a serious impediment to law enforcement efforts and/or efforts to preserve national security. Disclosure of the accounting would also...
An RCT of an Evidence-Based Practice Teaching Model with the Field Instructor
ERIC Educational Resources Information Center
Tennille, Julie Anne
2013-01-01
Problem: Equipping current and future social work practitioners with skills to deliver evidence-based practice (EBP) has remained an elusive prospect since synchronized efforts with field instructors have not been a consistent part of dissemination and implementation efforts. Recognizing the highly influential position of field instructors, this…
ERIC Educational Resources Information Center
Foster, Jamie S.; Shiel-Rolle, Nikita
2011-01-01
To enhance scientific literacy in the general public efforts are needed that both inspire and engage the learner. Often such efforts are provided through school programs or science learning centers, however, in many rural communities such resources are unavailable. Alternate strategies are needed to provide individuals with quality educational…
ERIC Educational Resources Information Center
Case, Alissa; Ngo, Bic
2017-01-01
The authors discuss the ways in which neoliberal multiculturalism influences the reception and implementation of antiracist initiatives on college campuses. They suggest the intersections of neoliberalism and racism produce a resistance to antiracist efforts and desire for softened multicultural approaches that maintain the status quo. Since the…
Implementing Total Quality Management at El Camino College.
ERIC Educational Resources Information Center
Gonzales, Frank S.
The California Community Colleges are in the midst of a reform process initiated by Assembly Bill 1725. A major goal of the bill is to fund colleges' staff development efforts. In an effort to transform the campus organizational culture from one of predominant competitiveness to collaboration, El Camino College (ECC) in Torrance, California,…
ERIC Educational Resources Information Center
Gutierez, Sally Baricaua
2015-01-01
In the Philippines, inquiry-based teaching has been promoted and implemented together with recently instigated curriculum reforms. Serious teacher professional development efforts are being used extensively to properly orient and present the benefits of inquirybased teaching. Despite these efforts, there still exists a big gap in the effective…
Examination of Teacher Observation Dynamics: Role of Observer Effort on Teacher Growth
ERIC Educational Resources Information Center
Bury, Michael Shaun
2017-01-01
This study examined the teacher observation cycle to understand the effect of observer knowledge, observer effort, observer power, and school culture on teachers' perceptions of whether the observation process helped them grow, implement strategies, or increase student learning. The concepts of power and expertise were defined by blending the…
Modeling Psychological Empowerment among Youth Involved in Local Tobacco Control Efforts
ERIC Educational Resources Information Center
Holden, Debra J.; Evans, W. Douglas; Hinnant, Laurie W.; Messeri, Peter
2005-01-01
The American Legacy Foundation funded 13 state health departments for their Statewide Youth Movement Against Tobacco Use in September 2000. Its goal was to create statewide tobacco control initiatives implemented with youth leadership. The underlying theory behind these initiatives was that tobacco control efforts can best be accomplished by…
PBTE (Performance-Based Teacher Education); Vol 2, No. 5, November 1973.
ERIC Educational Resources Information Center
Andrews, Theodore E., Ed.
This issue of the Multi-State Consortium on Performance-Based Teacher Education (PBTE) newsletter presents an examination of Florida International University's efforts over two academic years to develop and implement performance-based curricula across all of its programs at both the undergraduate and graduate levels. Efforts were aimed at…
Space station final study report. Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
1987-01-01
Volume 1 of the Final Study Report provides an Executive Summary of the Phase B study effort conducted under contract NAS8-36526. Space station Phase B implementation resulted in the timely establishment of preliminary design tasks, including trades and analyses. A comprehensive summary of project activities in conducting this study effort is included.