Sample records for method initially developed

  1. An advanced analysis method of initial orbit determination with too short arc data

    NASA Astrophysics Data System (ADS)

    Li, Binzhe; Fang, Li

    2018-02-01

    This paper studies initial orbit determination (IOD) based on space-based angle measurements. These space-based observations commonly have short durations, so classical IOD algorithms such as the Laplace and Gauss methods give poor results. In this paper, an advanced analysis method of initial orbit determination is developed for space-based observations. The admissible region and triangulation are introduced into the method, and a genetic algorithm is used to impose constraints on the parameters. Simulation results show that the algorithm can successfully complete the initial orbit determination.
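
    The abstract names the ingredients (admissible region, triangulation, genetic algorithm) without detail. As a rough, hedged illustration of the last ingredient only, the sketch below evolves candidate (range, range-rate) hypotheses inside an assumed admissible region; the bounds, the toy residual function, and all helper names are placeholders rather than the authors' code, and a real implementation would score each candidate by propagating the implied orbit and comparing predicted angles with the observed short arc.

```python
import numpy as np

rng = np.random.default_rng(0)

# Admissible region for the unknown range and range-rate at the first epoch.
# The bounds are illustrative placeholders, not values from the paper.
RHO_BOUNDS = (1.0e6, 5.0e7)        # range [m]
RHO_DOT_BOUNDS = (-5.0e3, 5.0e3)   # range-rate [m/s]

def residual(rho, rho_dot):
    """Stand-in objective: a real IOD code would propagate the candidate
    orbit implied by (rho, rho_dot) and return the RMS of predicted-minus-
    observed angles over the short arc."""
    return (rho - 2.3e7) ** 2 / 1e14 + (rho_dot - 1.2e3) ** 2 / 1e6

def genetic_search(pop_size=60, generations=200):
    pop = np.column_stack([
        rng.uniform(*RHO_BOUNDS, pop_size),
        rng.uniform(*RHO_DOT_BOUNDS, pop_size),
    ])
    for _ in range(generations):
        fit = np.array([residual(r, rd) for r, rd in pop])
        elite = pop[np.argsort(fit)[: pop_size // 2]]        # selection: keep best half
        moms = elite[rng.integers(0, len(elite), pop_size // 2)]
        dads = elite[rng.integers(0, len(elite), pop_size // 2)]
        kids = np.column_stack([moms[:, 0], dads[:, 1]])     # crossover
        kids[:, 0] += rng.normal(0, 5e4, len(kids))          # mutation, then clip back
        kids[:, 1] += rng.normal(0, 10.0, len(kids))         # into the admissible region
        kids[:, 0] = np.clip(kids[:, 0], *RHO_BOUNDS)
        kids[:, 1] = np.clip(kids[:, 1], *RHO_DOT_BOUNDS)
        pop = np.vstack([elite, kids])
    return pop[np.argmin([residual(r, rd) for r, rd in pop])]

print(genetic_search())   # -> approximately [2.3e7, 1.2e3] for the toy objective
```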

  2. Ignitability test method

    NASA Technical Reports Server (NTRS)

    Bement, Laurence J.; Schimmel, Morry L.

    1989-01-01

    To overcome serious weaknesses in determining the performance of initiating devices, a novel 'ignitability test method', representing actual design interfaces and ignition materials, has been developed. Ignition device output consists of heat, light, gas, and burning particles. Past research methods have evaluated these parameters individually. This paper describes the development and demonstration of an ignitability test method combining all these parameters, and the quantitative assessment of the ignition performance of two widely used percussion primers, the M42C1-PA101 and the M42C2-793. The ignition materials used for this evaluation were several powder, granule and pellet sizes of black powder and boron-potassium nitrate. This test method should be useful for performance evaluation of all initiator types, quality assurance, evaluation of ignition interfaces, and service life studies of initiators and ignition materials.

  3. Elimination of initial stress-induced curvature in a micromachined bi-material composite-layered cantilever

    NASA Astrophysics Data System (ADS)

    Liu, Ruiwen; Jiao, Binbin; Kong, Yanmei; Li, Zhigang; Shang, Haiping; Lu, Dike; Gao, Chaoqun; Chen, Dapeng

    2013-09-01

    Micro-devices with a bi-material-cantilever (BMC) commonly suffer initial curvature due to the mismatch of residual stress. Traditional corrective methods to reduce the residual stress mismatch generally involve the development of different material deposition recipes. In this paper, a new method for reducing residual stress mismatch in a BMC is proposed based on various previously developed deposition recipes. An initial material film is deposited using two or more developed deposition recipes. This first film is designed to introduce a stepped stress gradient, which is then balanced by overlapping a second material film on the first and using appropriate deposition recipes to form a nearly stress-balanced structure. A theoretical model is proposed based on both the moment balance principle and total equal strain at the interface of two adjacent layers. Experimental results and analytical models suggest that the proposed method is effective in producing multi-layer micro cantilevers that display balanced residual stresses. The method provides a generic solution to the problem of mismatched initial stresses which universally exists in micro-electro-mechanical systems (MEMS) devices based on a BMC. Moreover, the method can be incorporated into a MEMS design automation package for the efficient design of various multi-layer devices from a MEMS material library and developed deposition recipes.
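
    For readers unfamiliar with the "moment balance principle" mentioned above, the following is a generic zero-curvature condition for a released multi-layer cantilever (a textbook form, not the paper's specific model, which additionally enforces equal strain at the layer interfaces): the residual-stress distribution σᵢ(z) of the stacked films must carry no net force and no net bending moment about the neutral axis z_n.

```latex
% Illustrative zero-curvature conditions for a stack of layers i, each
% occupying z in [z_i, z_i + t_i]; sigma_i(z) is the residual stress profile.
\sum_i \int_{z_i}^{z_i+t_i} \sigma_i(z)\,\mathrm{d}z = 0 ,
\qquad
\sum_i \int_{z_i}^{z_i+t_i} \sigma_i(z)\,\big(z - z_n\big)\,\mathrm{d}z = 0 .
```

    In the paper's scheme, the stepped stress gradient deliberately built into the first film is what allows the second film, deposited with existing recipes, to approximately satisfy such balance conditions.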

  4. Calibrating reaction rates for the CREST model

    NASA Astrophysics Data System (ADS)

    Handley, Caroline A.; Christie, Michael A.

    2017-01-01

    The CREST reactive-burn model uses entropy-dependent reaction rates that, until now, have been manually tuned to fit shock-initiation and detonation data in hydrocode simulations. This paper describes the initial development of an automatic method for calibrating CREST reaction-rate coefficients, using particle swarm optimisation. The automatic method is applied to EDC32, to help develop the first CREST model for this conventional high explosive.
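
    Particle swarm optimisation itself is standard; the sketch below shows the generic update loop that such a calibration could wrap around a hydrocode run. The misfit function here is a toy stand-in (the real objective would compare simulated and measured shock-initiation and detonation behaviour), and the bounds, swarm size, and coefficients are assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def misfit(coeffs):
    """Stand-in objective: the real calibration would run hydrocode
    simulations with the candidate reaction-rate coefficients and return
    the mismatch against shock-initiation and detonation data."""
    target = np.array([0.4, 2.0, 7.5])       # arbitrary "true" coefficients
    return float(np.sum((coeffs - target) ** 2))

def particle_swarm(n_particles=30, n_iters=200, dim=3, bounds=(0.0, 10.0),
                   w=0.7, c1=1.5, c2=1.5):
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))      # particle positions
    v = np.zeros_like(x)                             # particle velocities
    pbest, pbest_f = x.copy(), np.array([misfit(p) for p in x])
    gbest = pbest[np.argmin(pbest_f)].copy()
    for _ in range(n_iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([misfit(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, float(pbest_f.min())

print(particle_swarm())   # -> coefficients near [0.4, 2.0, 7.5] for the toy misfit
```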

  5. Systems Alignment for Comprehensive Faculty Development in Liberal Arts Colleges

    ERIC Educational Resources Information Center

    Baker, Vicki L.; Lunsford, Laura G.; Pifer, Meghan J.

    2015-01-01

    Using an alignment framework, the authors explore faculty development initiatives in liberal arts colleges in order to understand the connection between organizational priorities and processes as connected to faculty members' stated needs. The study draws on mixed-methods data from The Initiative for Faculty Development in Liberal Arts Colleges…

  6. Trimming Line Design using New Development Method and One Step FEM

    NASA Astrophysics Data System (ADS)

    Chung, Wan-Jin; Park, Choon-Dal; Yang, Dong-yol

    2005-08-01

    In most automobile panel manufacturing, trimming is performed prior to flanging. Finding a feasible trimming line is crucial to obtaining an accurate edge profile after flanging. The section-based method develops the blank along section planes and finds the trimming line by generating a loop of end points, but it gives inaccurate results for regions with out-of-section motion. The simulation-based method, on the other hand, can produce a more accurate trimming line through an iterative strategy; however, owing to time limitations and the lack of information at the initial die design stage, it is still not widely accepted in the industry. In this study, a new, fast method to find a feasible trimming line is proposed. One-step FEM is used to analyze the flanging process because the desired final shape after flanging can be defined and most strain paths in flanging are simple. The main obstacle in using one-step FEM is the generation of the initial guess. A robust initial-guess generation method is developed to handle badly shaped meshes, widely varying mesh sizes, and undercut parts. The new method develops the 3D triangular mesh in a propagational way from the final mesh onto the drawing tool surface. To remedy mesh distortion during development, an energy minimization technique is utilized. The trimming line is extracted from the outer boundary after the one-step FEM simulation. This method offers many benefits since the trimming line can be obtained in the early design stage. The developed method has been successfully applied to complex industrial applications such as the flanging of a fender and a door outer panel.
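
    The abstract mentions an energy-minimization step to remedy mesh distortion during development but does not give the functional. As a loose analogue only (assumed spring-type distortion energy and fixed boundary nodes; not the authors' formulation), the sketch below relaxes the interior node of a small triangular mesh by gradient descent.

```python
import numpy as np

def smooth_mesh(nodes, edges, fixed, n_iters=200, step=0.2):
    """Minimal sketch of distortion-energy relaxation (assumed spring energy
    E = sum over edges of |xi - xj|^2): free nodes move down the energy
    gradient while boundary/fixed nodes stay put.  The paper's actual energy
    functional and the propagation onto the tool surface are not reproduced."""
    nodes = nodes.astype(float).copy()
    for _ in range(n_iters):
        grad = np.zeros_like(nodes)
        for i, j in edges:                     # gradient of the spring energy
            d = nodes[i] - nodes[j]
            grad[i] += 2.0 * d
            grad[j] -= 2.0 * d
        grad[list(fixed)] = 0.0                # keep boundary nodes in place
        nodes -= step * grad / max(len(edges), 1)
    return nodes

# Tiny usage example: a quad split into four triangles, with a distorted centre node.
nodes = np.array([[0, 0], [1, 0], [1, 1], [0, 1], [0.9, 0.8]])   # last node distorted
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 4), (1, 4), (2, 4), (3, 4)]
print(smooth_mesh(nodes, edges, fixed={0, 1, 2, 3})[-1])         # -> near [0.5, 0.5]
```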

  7. Relationship between Defect Size and Fatigue Life Distributions in Al-7 Pct Si-Mg Alloy Castings

    NASA Astrophysics Data System (ADS)

    Tiryakioğlu, Murat

    2009-07-01

    A new method for predicting the variability in fatigue life of castings was developed by combining the size distribution for the fatigue-initiating defects and a fatigue life model based on the Paris-Erdoğan law for crack propagation. Two datasets for the fatigue-initiating defects in Al-7 pct Si-Mg alloy castings, reported previously in the literature, were used to demonstrate that (1) the sizes of fatigue-initiating defects follow the Gumbel distribution; (2) the crack propagation model developed previously provides respectable fits to experimental data; and (3) the method developed in the present study expresses the variability in both datasets almost as well as the lognormal distribution and better than the Weibull distribution.
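
    Both building blocks named in the abstract (a Gumbel fit to fatigue-initiating defect sizes and a Paris-Erdoğan crack-growth life) are standard, so a minimal sketch is easy to give; the defect-size sample, Paris constants, stress range, geometry factor, and critical crack size below are all placeholders rather than values from the cited datasets.

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(2)

# Illustrative defect sizes (equivalent initial crack sizes in metres); the
# two literature datasets used in the paper are not reproduced here.
defect_sizes = gumbel_r.rvs(loc=120e-6, scale=40e-6, size=200, random_state=rng)

# 1) Fit the Gumbel (largest extreme value) distribution to the defect sizes.
loc, scale = gumbel_r.fit(defect_sizes)

# 2) Paris-Erdogan life from initial size a0 to a critical size af, with
#    da/dN = C * (Y * dS * sqrt(pi*a))**m.  C, m, Y, dS, af are placeholders
#    (C in m/cycle with delta-K in MPa*sqrt(m); dS is the stress range in MPa).
def paris_life(a0, af=2e-3, C=3e-11, m=3.0, Y=0.65, dS=150.0):
    k = C * (Y * dS * np.sqrt(np.pi)) ** m
    return (af ** (1 - m / 2) - a0 ** (1 - m / 2)) / (k * (1 - m / 2))

# 3) Propagate the fitted defect-size distribution into a life distribution.
a0 = gumbel_r.rvs(loc=loc, scale=scale, size=10_000, random_state=rng)
a0 = np.clip(a0, 10e-6, None)            # guard against non-physical sizes
lives = paris_life(a0)
print(f"median predicted life ~ {np.median(lives):.2e} cycles")
```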

  8. Using Mixed Methods to Assess Initiatives with Broad-Based Goals

    ERIC Educational Resources Information Center

    Inkelas, Karen Kurotsuchi

    2017-01-01

    This chapter describes a process for assessing programmatic initiatives with broad-ranging goals with the use of a mixed-methods design. Using an example of a day-long teaching development conference, this chapter provides practitioners step-by-step guidance on how to implement this assessment process.

  9. THE ONTARIO HYDRO METHOD FOR SPECIATED MERCURY MEASUREMENTS: ISSUES AND CONSIDERATIONS

    EPA Science Inventory

    The Ontario Hydro (OH) method has been developed for the measurement of total and speciated mercury emissions from coal-fired combustion sources. The OH method was initially developed to support EPA's information collection request to characterize and inventory mercury emissions ...

  10. Cumulative Advantage in the Skill Development of STEM Graduate Students: A Mixed-Methods Study

    ERIC Educational Resources Information Center

    Feldon, David F.; Maher, Michelle A.; Roksa, Josipa; Peugh, James

    2016-01-01

    Studies of skill development often describe a process of cumulative advantage, in which small differences in initial skill compound over time, leading to increasing skill gaps between those with an initial advantage and those without. We offer evidence of a similar phenomenon accounting for differential patterns of research skill development in…

  11. Impact of the Alzheimer’s Disease Neuroimaging Initiative, 2004 to 2014

    PubMed Central

    Weiner, Michael W.; Veitch, Dallas P.; Aisen, Paul S.; Beckett, Laurel A.; Cairns, Nigel J.; Cedarbaum, Jesse; Donohue, Michael C.; Green, Robert C.; Harvey, Danielle; Jack, Clifford R.; Jagust, William; Morris, John C.; Petersen, Ronald C.; Saykin, Andrew J.; Shaw, Leslie; Thompson, Paul M.; Toga, Arthur W.; Trojanowski, John Q.

    2015-01-01

    Introduction: The Alzheimer’s Disease Neuroimaging Initiative (ADNI) was established in 2004 to facilitate the development of effective treatments for Alzheimer’s disease (AD) by validating biomarkers for AD clinical trials. Methods: We searched for ADNI publications using established methods. Results: ADNI has (1) developed standardized biomarkers for use in clinical trial subject selection and as surrogate outcome measures; (2) standardized protocols for use across multiple centers; (3) initiated worldwide ADNI; (4) inspired initiatives investigating traumatic brain injury and post-traumatic stress disorder in military populations, as well as depression, as AD risk factors; (5) acted as a data-sharing model; (6) generated data used in over 600 publications, leading to the identification of novel AD risk alleles, and an understanding of the relationship between biomarkers and AD progression; and (7) inspired other public-private partnerships developing biomarkers for Parkinson’s disease and multiple sclerosis. Discussion: ADNI has made myriad impacts in its first decade. A competitive renewal of the project in 2015 would see the use of newly developed tau imaging ligands, and the continued development of recruitment strategies and outcome measures for clinical trials. PMID:26194320

  12. Preliminary Tests For Development Of A Non-Pertechnetate Analysis Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diprete, D.; McCabe, D.

    2016-09-28

    The objective of this task was to develop a non-pertechnetate analysis method that the 222-S lab could easily implement. The initial scope involved working with 222-S laboratory personnel to adapt the existing Tc analytical method to fractionate the non-pertechnetate and pertechnetate. SRNL then developed and tested a method using commercial sorbents containing Aliquat® 336 to extract the pertechnetate (thereby separating it from non-pertechnetate), followed by oxidation, extraction, and stripping steps, and finally analysis by beta counting and mass spectrometry. Several additional items were partially investigated, including impacts of a ¹³⁷Cs removal step. The method was initially tested on SRS tank waste samples to determine its viability. Although SRS tank waste does not contain non-pertechnetate, testing with it was useful to investigate the compatibility, separation efficiency, interference removal efficacy, and method sensitivity.

  13. The Development of a Strategic Prioritisation Method for Green Supply Chain Initiatives.

    PubMed

    Masoumik, S Maryam; Abdul-Rashid, Salwa Hanim; Olugu, Ezutah Udoncy

    2015-01-01

    To maintain a competitive position, companies are increasingly required to integrate their proactive environmental strategies into their business strategies. The shift from reactive and compliance-based to proactive and strategic environmental management has driven companies to consider the strategic factors while identifying the areas in which they should focus their green initiatives. In previous studies, little attention was given to providing managers with a basis from which they could strategically prioritise these green initiatives across their companies' supply chains. Considering this lacuna in the literature, we present a decision-making method for prioritising green supply chain initiatives aligned with the preferred green strategy alternatives for manufacturing companies. To develop this method, the study adopted a position between the determinism and voluntarism orientations of environmental management, taking both external pressures and internal competitive drivers and key resources as decision factors. This decision-making method was developed using the analytic network process (ANP) technique. The elements of the decision model were derived from the literature. The causal relationships among the multiple decision variables were validated based on the results of structural equation modelling (SEM) using a dataset collected from a survey of the ISO 14001-certified manufacturers in Malaysia. A portion of the relative weights required for computation in ANP was also calculated using the SEM results. A case study is presented to demonstrate the applicability of the method.
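
    The ANP computation rests on Saaty-style pairwise comparisons. As a minimal, hedged illustration of that core step only (the judgment matrix below is invented; the paper's full ANP network, clusters, and SEM-derived weights are not reproduced), the sketch extracts a priority vector from one comparison matrix and checks its consistency.

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for three green initiatives
# (Saaty-style judgments; the values are illustrative, not the paper's data).
A = np.array([
    [1.0,   3.0, 5.0],
    [1/3.0, 1.0, 2.0],
    [1/5.0, 1/2.0, 1.0],
])

# Priorities = principal right eigenvector of A, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, principal].real)
weights /= weights.sum()

# Consistency ratio (CR < 0.1 is the usual acceptability threshold).
n = A.shape[0]
CI = (eigvals.real[principal] - n) / (n - 1)
RI = 0.58                       # Saaty's random index for n = 3
print("priorities:", weights.round(3), " CR:", round(CI / RI, 3))
```

    In a full ANP model, local priority vectors like this one populate the columns of a supermatrix, which is then raised to powers until the weights stabilise.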

  14. The Development of a Strategic Prioritisation Method for Green Supply Chain Initiatives

    PubMed Central

    Masoumik, S. Maryam; Abdul-Rashid, Salwa Hanim; Olugu, Ezutah Udoncy

    2015-01-01

    To maintain a competitive position, companies are increasingly required to integrate their proactive environmental strategies into their business strategies. The shift from reactive and compliance-based to proactive and strategic environmental management has driven companies to consider the strategic factors while identifying the areas in which they should focus their green initiatives. In previous studies, little attention was given to providing managers with a basis from which they could strategically prioritise these green initiatives across their companies’ supply chains. Considering this lacuna in the literature, we present a decision-making method for prioritising green supply chain initiatives aligned with the preferred green strategy alternatives for manufacturing companies. To develop this method, the study adopted a position between the determinism and voluntarism orientations of environmental management, taking both external pressures and internal competitive drivers and key resources as decision factors. This decision-making method was developed using the analytic network process (ANP) technique. The elements of the decision model were derived from the literature. The causal relationships among the multiple decision variables were validated based on the results of structural equation modelling (SEM) using a dataset collected from a survey of the ISO 14001-certified manufacturers in Malaysia. A portion of the relative weights required for computation in ANP was also calculated using the SEM results. A case study is presented to demonstrate the applicability of the method. PMID:26618353

  15. Development of a Research Methods and Statistics Concept Inventory

    ERIC Educational Resources Information Center

    Veilleux, Jennifer C.; Chapman, Kate M.

    2017-01-01

    Research methods and statistics are core courses in the undergraduate psychology major. To assess learning outcomes, it would be useful to have a measure that assesses research methods and statistical literacy beyond course grades. In two studies, we developed and provided initial validation results for a research methods and statistical knowledge…

  16. Analysis of Perfluorinated Chemicals and Their Fluorinated Precursors in Sludge: Method Development and Initial Results

    EPA Science Inventory

    A rigorous method was developed to maximize the extraction efficacy for perfluorocarboxylic acids (PFCAs), perfluorosulfonates (PFSAs), fluorotelomer alcohols (FTOHs), fluorotelomer acrylates (FTAc), perfluorosulfonamides (FOSAs), and perfluorosulfonamidoethanols (FOSEs) from was...

  17. Delivering spacecraft control centers with embedded knowledge-based systems: The methodology issue

    NASA Technical Reports Server (NTRS)

    Ayache, S.; Haziza, M.; Cayrac, D.

    1994-01-01

    Matra Marconi Space (MMS) occupies a leading place in Europe in the domain of satellite and space data processing systems. The maturity of knowledge-based systems (KBS) technology and the theoretical and practical experience acquired in the development of prototype, pre-operational and operational applications make it possible today to consider the wide operational deployment of KBSs in space applications. In this perspective, MMS has to prepare the introduction of the new methods and support tools that will form the basis of the development of such systems. This paper introduces elements of the MMS methodology initiatives in the domain and the main rationale that motivated the approach. These initiatives develop along two main axes: knowledge engineering methods and tools, and a hybrid method approach for coexisting knowledge-based and conventional developments.

  18. The promises and limitations of female-initiated methods of HIV/STI protection.

    PubMed

    Mantell, Joanne E; Dworkin, Shari L; Exner, Theresa M; Hoffman, Susie; Smit, Jenni A; Susser, Ida

    2006-10-01

    New methods are now available, and others are being developed, that could enable women to take the initiative in preventing sexually transmitted infections. However, attempts to capitalize on "female-controlled" preventive methods thus far have met with limited success. Female-initiated methods were introduced to intervene in the state of gender relations and assist women who are disempowered vis-à-vis their male partners. Paradoxically, however, we underscore that it is the very structure of regional and local gender relations that shapes the acceptability (or lack of acceptability) of these methods. This paper specifically addresses how the structure of gender relations-for better and for worse-shapes the promises and limitations of widespread use and acceptance of female-initiated methods. We draw on examples from around the world to underscore how the regional specificities of gender (in)equality shape the acceptance, negotiation, and use of these methods. Simultaneously, we demonstrate how the introduction and sustained use of methods are shaped by gender relations and offer possibilities for reinforcing or challenging their current state. Based on our analyses, we offer key policy and programmatic recommendations to increase promotion and effective use of women-initiated HIV/STI protection methods for both women and men.

  19. Organizational Context Matters: A Research Toolkit for Conducting Standardized Case Studies of Integrated Care Initiatives

    PubMed Central

    Grudniewicz, Agnes; Gray, Carolyn Steele; Wodchis, Walter P.; Carswell, Peter; Baker, G. Ross

    2017-01-01

    Introduction: The variable success of integrated care initiatives has led experts to recommend tailoring design and implementation to the organizational context. Yet, organizational contexts are rarely described, understood, or measured with sufficient depth and breadth in empirical studies or in practice. We thus lack knowledge of when and specifically how organizational contexts matter. To facilitate the accumulation of evidence, we developed a research toolkit for conducting case studies using standardized measures of the (inter-)organizational context for integrating care. Theory and Methods: We used a multi-method approach to develop the research toolkit: (1) development and validation of the Context and Capabilities for Integrating Care (CCIC) Framework, (2) identification, assessment, and selection of survey instruments, (3) development of document review methods, (4) development of interview guide resources, and (5) pilot testing of the document review guidelines, consolidated survey, and interview guide. Results: The toolkit provides a framework and measurement tools that examine 18 organizational and inter-organizational factors that affect the implementation and success of integrated care initiatives. Discussion and Conclusion: The toolkit can be used to characterize and compare organizational contexts across cases and enable comparison of results across studies. This information can enhance our understanding of the influence of organizational contexts, support the transfer of best practices, and help explain why some integrated care initiatives succeed and some fail. PMID:28970750

  20. Brief summary of the evolution of high-temperature creep-fatigue life prediction models for crack initiation

    NASA Technical Reports Server (NTRS)

    Halford, Gary R.

    1993-01-01

    The evolution of high-temperature, creep-fatigue, life-prediction methods used for cyclic crack initiation is traced from inception in the late 1940's. The methods reviewed are material models as opposed to structural life prediction models. Material life models are used by both structural durability analysts and by material scientists. The latter use micromechanistic models as guidance to improve a material's crack initiation resistance. Nearly one hundred approaches and their variations have been proposed to date. This proliferation poses a problem in deciding which method is most appropriate for a given application. Approaches were identified as being combinations of thirteen different classifications. This review is intended to aid both developers and users of high-temperature fatigue life prediction methods by providing a background from which choices can be made. The need for high-temperature, fatigue-life prediction methods followed immediately on the heels of the development of large, costly, high-technology industrial and aerospace equipment immediately following the second world war. Major advances were made in the design and manufacture of high-temperature, high-pressure boilers and steam turbines, nuclear reactors, high-temperature forming dies, high-performance poppet valves, aeronautical gas turbine engines, reusable rocket engines, etc. These advances could no longer be accomplished simply by trial and error using the 'build-em and bust-em' approach. Development lead times were too great and costs too prohibitive to retain such an approach. Analytic assessments of anticipated performance, cost, and durability were introduced to cut costs and shorten lead times. The analytic tools were quite primitive at first and out of necessity evolved in parallel with hardware development. After forty years more descriptive, more accurate, and more efficient analytic tools are being developed. These include thermal-structural finite element and boundary element analyses, advanced constitutive stress-strain-temperature-time relations, and creep-fatigue-environmental models for crack initiation and propagation. The high-temperature durability methods that have evolved for calculating high-temperature fatigue crack initiation lives of structural engineering materials are addressed. Only a few of the methods were refined to the point of being directly useable in design. Recently, two of the methods were transcribed into computer software for use with personal computers.

  1. Brief summary of the evolution of high-temperature creep-fatigue life prediction models for crack initiation

    NASA Astrophysics Data System (ADS)

    Halford, Gary R.

    1993-10-01

    The evolution of high-temperature, creep-fatigue, life-prediction methods used for cyclic crack initiation is traced from inception in the late 1940's. The methods reviewed are material models as opposed to structural life prediction models. Material life models are used by both structural durability analysts and by material scientists. The latter use micromechanistic models as guidance to improve a material's crack initiation resistance. Nearly one hundred approaches and their variations have been proposed to date. This proliferation poses a problem in deciding which method is most appropriate for a given application. Approaches were identified as being combinations of thirteen different classifications. This review is intended to aid both developers and users of high-temperature fatigue life prediction methods by providing a background from which choices can be made. The need for high-temperature, fatigue-life prediction methods followed immediately on the heels of the development of large, costly, high-technology industrial and aerospace equipment immediately following the second world war. Major advances were made in the design and manufacture of high-temperature, high-pressure boilers and steam turbines, nuclear reactors, high-temperature forming dies, high-performance poppet valves, aeronautical gas turbine engines, reusable rocket engines, etc. These advances could no longer be accomplished simply by trial and error using the 'build-em and bust-em' approach. Development lead times were too great and costs too prohibitive to retain such an approach. Analytic assessments of anticipated performance, cost, and durability were introduced to cut costs and shorten lead times. The analytic tools were quite primitive at first and out of necessity evolved in parallel with hardware development. After forty years more descriptive, more accurate, and more efficient analytic tools are being developed. These include thermal-structural finite element and boundary element analyses, advanced constitutive stress-strain-temperature-time relations, and creep-fatigue-environmental models for crack initiation and propagation. The high-temperature durability methods that have evolved for calculating high-temperature fatigue crack initiation lives of structural engineering materials are addressed. Only a few of the methods were refined to the point of being directly useable in design.

  2. Scale Development and Initial Tests of the Multidimensional Complex Adaptive Leadership Scale for School Principals: An Exploratory Mixed Method Study

    ERIC Educational Resources Information Center

    Özen, Hamit; Turan, Selahattin

    2017-01-01

    This study was designed to develop the scale of the Complex Adaptive Leadership for School Principals (CAL-SP) and examine its psychometric properties. This was an exploratory mixed method research design (ES-MMD). Both qualitative and quantitative methods were used to develop and assess psychometric properties of the questionnaire. This study…

  3. The Development and Evaluation of Training Methods for Group IV Personnel. 1. Orientation and Implementation of the Training Methods Development School (TMDS).

    ERIC Educational Resources Information Center

    Steinemann, John H.

    The investigation is part of continuing Navy research on the Trainability of Group IV (low ability) personnel intended to maximize the utilization and integration of marginal personnel in the fleet. An experimental Training Methods Development School (TMDS) was initiated to provide an experimental training program, with research controls, for…

  4. Evaluating Teachers' Professional Development Initiatives: Towards an Extended Evaluative Framework

    ERIC Educational Resources Information Center

    Merchie, Emmelien; Tuytens, Melissa; Devos, Geert; Vanderlinde, Ruben

    2018-01-01

    Evaluating teachers' professional development initiatives (PDI) is one of the main challenges for the teacher professionalisation field. Although different studies have focused on the effectiveness of PDI, the obtained effects and evaluative methods have been found to be widely divergent. By means of a narrative review, this study provides an…

  5. Developing and Validating a Competence Framework for Secondary Mathematics Student Teachers through a Delphi Method

    ERIC Educational Resources Information Center

    Muñiz-Rodríguez, Laura; Alonso, Pedro; Rodríguez-Muñiz, Luis J.; Valcke, Martin

    2017-01-01

    Initial teacher education programmes provide student teachers with the desired competences to develop themselves as teachers. Although a generic framework for teaching competences is available covering all school subjects in Spain, the initial teacher education programmes curriculum does not specify which competences secondary mathematics student…

  6. Size-guided multi-seed heuristic method for geometry optimization of clusters: Application to benzene clusters.

    PubMed

    Takeuchi, Hiroshi

    2018-05-08

    Since searching for the global minimum on the potential energy surface of a cluster is very difficult, many geometry optimization methods have been proposed, in which initial geometries are randomly generated and subsequently improved with different algorithms. In this study, a size-guided multi-seed heuristic method is developed and applied to benzene clusters. It produces initial configurations of the cluster with n molecules from the lowest-energy configurations of the cluster with n - 1 molecules (seeds). The initial geometries are further optimized with the geometrical perturbations previously used for molecular clusters. These steps are repeated until the size n reaches a predefined value. The method locates putative global minima of benzene clusters with up to 65 molecules. The performance of the method is discussed using the computational cost, rates of locating the global minima, and energies of initial geometries. © 2018 Wiley Periodicals, Inc.
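
    To make the size-guided, seed-based idea concrete, the sketch below grows Lennard-Jones clusters (a stand-in for benzene molecules, which would also carry orientations and an intermolecular potential) by adding one particle near the surface of each low-energy size-(n-1) seed and locally relaxing; the trial counts, placement rule, and number of retained seeds are assumptions, and the paper's geometrical perturbation operators are omitted.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)

def lj_energy(flat):
    """Lennard-Jones energy; a stand-in for the benzene intermolecular
    potential used in the paper."""
    x = flat.reshape(-1, 3)
    d = np.linalg.norm(x[:, None] - x[None, :], axis=-1)
    r = d[np.triu_indices(len(x), k=1)]
    return float(np.sum(4.0 * (r ** -12 - r ** -6)))

def grow(seeds, n_trials=20):
    """Size-guided step: build size-(n+1) candidates from size-n seeds by
    placing one new particle near the cluster surface, then locally relax."""
    best = []
    for seed in seeds:
        for _ in range(n_trials):
            direction = rng.normal(size=3)
            direction /= np.linalg.norm(direction)
            radius = np.max(np.linalg.norm(seed, axis=1)) + 1.1
            cand = np.vstack([seed, radius * direction])
            res = minimize(lj_energy, cand.ravel(), method="L-BFGS-B")
            best.append((res.fun, res.x.reshape(-1, 3)))
    best.sort(key=lambda t: t[0])
    return [b[1] for b in best[:3]]            # keep a few lowest-energy seeds

# Start from a relaxed dimer and grow up to 6 particles.
seeds = [np.array([[0.0, 0.0, 0.0], [1.12, 0.0, 0.0]])]
for n in range(3, 7):
    seeds = grow(seeds)
    print(n, "particles, best energy:", round(lj_energy(seeds[0].ravel()), 4))
```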

  7. Automatic selection of landmarks in T1-weighted head MRI with regression forests for image registration initialization.

    PubMed

    Wang, Jianing; Liu, Yuan; Noble, Jack H; Dawant, Benoit M

    2017-10-01

    Medical image registration establishes a correspondence between images of biological structures, and it is at the core of many applications. Commonly used deformable image registration methods depend on a good preregistration initialization. We develop a learning-based method to automatically find a set of robust landmarks in three-dimensional MR image volumes of the head. These landmarks are then used to compute a thin plate spline-based initialization transformation. The process involves two steps: (1) identifying a set of landmarks that can be reliably localized in the images and (2) selecting among them the subset that leads to a good initial transformation. To validate our method, we use it to initialize five well-established deformable registration algorithms that are subsequently used to register an atlas to MR images of the head. We compare our proposed initialization method with a standard approach that involves estimating an affine transformation with an intensity-based approach. We show that for all five registration algorithms the final registration results are statistically better when they are initialized with the method that we propose than when a standard approach is used. The technique that we propose is generic and could be used to initialize nonrigid registration algorithms for other applications.
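
    The paper's pipeline detects landmarks with regression forests and fits a thin-plate spline; as a simplified, hedged stand-in for the second half only, the sketch below estimates a least-squares affine initialization from already-matched landmark pairs (a thin-plate spline would add a radial-basis warp on top of this affine part). The landmark coordinates are synthetic.

```python
import numpy as np

def affine_from_landmarks(src, dst):
    """Least-squares affine transform mapping source landmarks to target
    landmarks.  This is a simplified stand-in for the paper's initialization,
    which fits a thin-plate spline on automatically detected landmarks."""
    n = len(src)
    X = np.hstack([src, np.ones((n, 1))])          # homogeneous source points, n x 4
    # Solve X @ M = dst for the 4x3 affine parameters in the least-squares sense.
    M, *_ = np.linalg.lstsq(X, dst, rcond=None)
    return M

def apply_affine(M, pts):
    return np.hstack([pts, np.ones((len(pts), 1))]) @ M

# Toy example: target = rotated + translated source, with a little noise.
rng = np.random.default_rng(4)
src = rng.uniform(0, 100, (8, 3))
R = np.array([[0.96, -0.28, 0.0], [0.28, 0.96, 0.0], [0.0, 0.0, 1.0]])
dst = src @ R.T + np.array([5.0, -3.0, 10.0]) + rng.normal(0, 0.2, src.shape)

M = affine_from_landmarks(src, dst)
print(np.abs(apply_affine(M, src) - dst).max())    # small residual -> usable initialization
```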

  8. A Mixed-Methods Analysis in Assessing Students' Professional Development by Applying an Assessment for Learning Approach.

    PubMed

    Peeters, Michael J; Vaidya, Varun A

    2016-06-25

    Objective. To describe an approach for assessing the Accreditation Council for Pharmacy Education's (ACPE) doctor of pharmacy (PharmD) Standard 4.4, which focuses on students' professional development. Methods. This investigation used mixed methods with triangulation of qualitative and quantitative data to assess professional development. Qualitative data came from an electronic developmental portfolio of professionalism and ethics, completed by PharmD students during their didactic studies. Quantitative confirmation came from the Defining Issues Test (DIT)-an assessment of pharmacists' professional development. Results. Qualitatively, students' development reflections described growth through this course series. Quantitatively, the 2015 PharmD class's DIT N2-scores illustrated positive development overall; the lower 50% had a large initial improvement compared to the upper 50%. Subsequently, the 2016 PharmD class confirmed these average initial improvements of students and also showed further substantial development among students thereafter. Conclusion. Applying an assessment for learning approach, triangulation of qualitative and quantitative assessments confirmed that PharmD students developed professionally during this course series.

  9. A risk assessment method for multi-site damage

    NASA Astrophysics Data System (ADS)

    Millwater, Harry Russell, Jr.

    This research focused on developing probabilistic methods suitable for computing small probabilities of failure, e.g., 10⁻⁶, of structures subject to multi-site damage (MSD). MSD is defined as the simultaneous development of fatigue cracks at multiple sites in the same structural element such that the fatigue cracks may coalesce to form one large crack. MSD is modeled as an array of collinear cracks with random initial crack lengths with the centers of the initial cracks spaced uniformly apart. The data used was chosen to be representative of aluminum structures. The structure is considered failed whenever any two adjacent cracks link up. A fatigue computer model is developed that can accurately and efficiently grow a collinear array of arbitrary length cracks from initial size until failure. An algorithm is developed to compute the stress intensity factors of all cracks considering all interaction effects. The probability of failure of two to 100 cracks is studied. Lower bounds on the probability of failure are developed based upon the probability of the largest crack exceeding a critical crack size. The critical crack size is based on the initial crack size that will grow across the ligament when the neighboring crack has zero length. The probability is evaluated using extreme value theory. An upper bound is based on the probability of the maximum sum of initial cracks being greater than a critical crack size. A weakest link sampling approach is developed that can accurately and efficiently compute small probabilities of failure. This methodology is based on predicting the weakest link, i.e., the two cracks to link up first, for a realization of initial crack sizes, and computing the cycles-to-failure using these two cracks. Criteria to determine the weakest link are discussed. Probability results using the weakest link sampling method are compared to Monte Carlo-based benchmark results. The results indicate that very small probabilities can be computed accurately in a few minutes using a Hewlett-Packard workstation.
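
    A minimal sketch of the weakest-link sampling idea (not the dissertation's code): initial half-lengths of collinear cracks are drawn at random, the adjacent pair with the largest combined size is taken as the predicted weakest link, and only that pair is grown, here with an interaction-free Paris law, until the ligament between them is consumed. The crack count, pitch, size distribution, and Paris constants are placeholders.

```python
import numpy as np

rng = np.random.default_rng(5)

# Placeholder parameters (not the dissertation's data): 20 collinear cracks,
# centres spaced a fixed pitch apart, lognormal initial half-lengths.
N_CRACKS, PITCH = 20, 25.0e-3                 # [m]
C, m, dS = 1e-11, 3.0, 90.0                   # Paris constants (MPa*sqrt(m) units)

def cycles_to_linkup(a1, a2, dN=1000):
    """Grow a pair of adjacent cracks (interaction effects neglected in this
    sketch) until their tips meet across the ligament."""
    n = 0
    while a1 + a2 < PITCH:
        a1 += dN * C * (dS * np.sqrt(np.pi * a1)) ** m
        a2 += dN * C * (dS * np.sqrt(np.pi * a2)) ** m
        n += dN
    return n

def weakest_link_sample(n_samples=1000):
    lives = np.empty(n_samples)
    for k in range(n_samples):
        a = rng.lognormal(mean=np.log(1.0e-3), sigma=0.4, size=N_CRACKS)
        i = int(np.argmax(a[:-1] + a[1:]))     # predicted weakest link
        lives[k] = cycles_to_linkup(a[i], a[i + 1])
    return lives

lives = weakest_link_sample()
print("median life: %.2e cycles, 1st percentile: %.2e cycles" %
      (np.median(lives), np.percentile(lives, 1)))
```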

  10. Analysis of Perfluorinated Chemicals in Sludge: Method Development and Initial Results

    EPA Science Inventory

    A fast, rigorous method was developed to maximize the extraction efficacy for ten perfluorocarboxylic acids and perfluorooctanesulfonate from wastewater-treatment sludge and to quantitate using liquid chromatography, tandem-mass spectrometry (LC/MS/MS). First, organic solvents w...

  11. 76 FR 68769 - Bridging the Idea Development Evaluation Assessment and Long-Term Initiative and Total Product...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-07

    ... complementary methodological frameworks of the IDEAL and TPLC initiatives, more comprehensive and applicable... devices, surgical operations, and invasive medical procedures; Unique study designs and reporting methods...

  12. The rideability of a deflected bridge approach slab : technical summary report 457.

    DOT National Transportation Integrated Search

    2009-11-01

    The Louisiana Department of Transportation and Development (LADOTD) initiated the Louisiana Quality Initiative (LQI) entitled Preservation of Bridge Approach Rideability to explore different potential methods of solving what has been observ...

  13. Building analytic capacity, facilitating partnerships, and promoting data use in state health agencies: a distance-based workforce development initiative applied to maternal and child health epidemiology.

    PubMed

    Rankin, Kristin M; Kroelinger, Charlan D; Rosenberg, Deborah; Barfield, Wanda D

    2012-12-01

    The purpose of this article is to summarize the methodology, partnerships, and products developed as a result of a distance-based workforce development initiative to improve analytic capacity among maternal and child health (MCH) epidemiologists in state health agencies. This effort was initiated by the Centers for Disease Control's MCH Epidemiology Program and faculty at the University of Illinois at Chicago to encourage and support the use of surveillance data by MCH epidemiologists and program staff in state agencies. Beginning in 2005, distance-based training in advanced analytic skills was provided to MCH epidemiologists. To support participants, this model of workforce development included lectures about the practical application of innovative epidemiologic methods, development of multidisciplinary teams within and across agencies, and systematic, tailored technical assistance. The goal of this initiative evolved to emphasize the direct application of advanced methods to the development of state data products using complex sample surveys, resulting in the articles published in this supplement to MCHJ. Innovative methods were applied by participating MCH epidemiologists, including regional analyses across geographies and datasets, multilevel analyses of state policies, and new indicator development. Support was provided for developing cross-state and regional partnerships and for developing and publishing the results of analytic projects. This collaboration was successful in building analytic capacity, facilitating partnerships and promoting surveillance data use to address state MCH priorities, and may have broader application beyond MCH epidemiology. In an era of decreasing resources, such partnership efforts between state and federal agencies and academia are essential for promoting effective data use.

  14. Development and validation of sensitive kinetic spectrophotometric method for the determination of moxifloxacin antibiotic in pure and commercial tablets

    NASA Astrophysics Data System (ADS)

    Ashour, Safwan; Bayram, Roula

    2015-04-01

    A new, accurate, sensitive and reliable kinetic spectrophotometric method for the assay of moxifloxacin hydrochloride (MOXF) in pure form and pharmaceutical formulations has been developed. The method involves the oxidative coupling reaction of MOXF with 3-methyl-2-benzothiazolinone hydrazone hydrochloride monohydrate (MBTH) in the presence of Ce(IV) in an acidic medium to form a colored product with λmax at 623 and 660 nm. The reaction is followed spectrophotometrically by measuring the increase in absorbance at 623 nm as a function of time. The initial rate and fixed time methods were adopted for constructing the calibration curves. The linearity range was found to be 1.89-40.0 μg mL⁻¹ for the initial rate and fixed time methods. The limits of detection for the initial rate and fixed time methods are 0.644 and 0.043 μg mL⁻¹, respectively. The molar absorptivity for the method was found to be 0.89 × 10⁴ L mol⁻¹ cm⁻¹. Statistical treatment of the experimental results indicates that the methods are precise and accurate. The proposed method has been applied successfully to the estimation of moxifloxacin hydrochloride in tablet dosage form with no interference from the excipients. The results are compared with those of the official method.
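
    The initial-rate method reduces to a linear calibration of measured rate against concentration; the sketch below shows that step with synthetic data (the rates, slope, and unknown sample are illustrative, not the paper's measurements).

```python
import numpy as np

# Synthetic calibration data for the initial-rate method (illustrative only):
# initial reaction rate dA/dt at 623 nm versus analyte concentration.
conc = np.array([2.0, 5.0, 10.0, 20.0, 30.0, 40.0])                 # ug/mL
rate = np.array([0.0021, 0.0052, 0.0105, 0.0208, 0.0310, 0.0415])   # abs. units/min

# Least-squares straight line: rate = slope * conc + intercept
slope, intercept = np.polyfit(conc, rate, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((rate - pred) ** 2) / np.sum((rate - rate.mean()) ** 2)
print(f"slope={slope:.5f}, intercept={intercept:.5f}, r^2={r2:.4f}")

# An unknown sample's measured initial rate is converted back to concentration.
unknown_rate = 0.0150
print("estimated concentration: %.2f ug/mL" % ((unknown_rate - intercept) / slope))
```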

  15. Scalable Production of Glioblastoma Tumor-initiating Cells in 3 Dimension Thermoreversible Hydrogels

    NASA Astrophysics Data System (ADS)

    Li, Qiang; Lin, Haishuang; Wang, Ou; Qiu, Xuefeng; Kidambi, Srivatsan; Deleyrolle, Loic P.; Reynolds, Brent A.; Lei, Yuguo

    2016-08-01

    There is growing interest in developing drugs that specifically target glioblastoma tumor-initiating cells (TICs). Current cell culture methods, however, cannot cost-effectively produce the large numbers of glioblastoma TICs required for drug discovery and development. In this paper we report a new method that encapsulates patient-derived primary glioblastoma TICs and grows them in 3-dimensional thermoreversible hydrogels. Our method allows long-term culture (~50 days, 10 passages tested, cumulative ~>10¹⁰-fold expansion) with both high growth rate (~20-fold expansion per 7 days) and high volumetric yield (~2.0 × 10⁷ cells/mL) without the loss of stemness. The scalable method can be used to produce sufficient, affordable glioblastoma TICs for drug discovery.

  16. Using Academic Literacies and Genre-Based Models for Academic Writing Instruction: A "Literacy" Journey

    ERIC Educational Resources Information Center

    Wingate, Ursula

    2012-01-01

    Three writing development initiatives carried out at King's College London UK are discussed in this article to illustrate the need to draw on different theoretical models to create effective methods of teaching academic writing. The sequence of initiatives resembles a journey: the destination is to develop academic writing programmes suitable for…

  17. Development and Initial Psychometric Properties of the Computer Assisted Maltreatment Inventory (CAMI): A Comprehensive Self-Report Measure of Child Maltreatment History

    ERIC Educational Resources Information Center

    DiLillo, David; Hayes-Skelton, Sarah A.; Fortier, Michelle A.; Perry, Andrea R.; Evans, Sarah E.; Messman Moore, Terri L.; Walsh, Kate; Nash, Cindy; Fauchier, Angele

    2010-01-01

    Objectives: The present study reports on the development and initial psychometric properties of the Computer Assisted Maltreatment Inventory (CAMI), a web-based self-report measure of child maltreatment history, including sexual and physical abuse, exposure to interparental violence, psychological abuse, and neglect. Methods: The CAMI was…

  18. Teaching Theory Construction With Initial Grounded Theory Tools: A Reflection on Lessons and Learning.

    PubMed

    Charmaz, Kathy

    2015-12-01

    This article addresses criticisms of qualitative research for spawning studies that lack analytic development and theoretical import. It focuses on teaching initial grounded theory tools while interviewing, coding, and writing memos for the purpose of scaling up the analytic level of students' research and advancing theory construction. Adopting these tools can improve teaching qualitative methods at all levels although doctoral education is emphasized here. What teachers cover in qualitative methods courses matters. The pedagogy presented here requires a supportive environment and relies on demonstration, collective participation, measured tasks, progressive analytic complexity, and accountability. Lessons learned from using initial grounded theory tools are exemplified in a doctoral student's coding and memo-writing excerpts that demonstrate progressive analytic development. The conclusion calls for increasing the number and depth of qualitative methods courses and for creating a cadre of expert qualitative methodologists. © The Author(s) 2015.

  19. The promises and limitations of female-initiated methods of HIV/STI protection

    PubMed Central

    Mantell, Joanne E.; Dworkin, Shari L.; Exner, Theresa M.; Hoffman, Susie; Smit, Jenni A.; Susser, Ida

    2014-01-01

    New methods are now available, and others are being developed, that could enable women to take the initiative in preventing sexually transmitted infections. However, attempts to capitalize on “female-controlled” preventive methods thus far have met with limited success. Female-initiated methods were introduced to intervene in the state of gender relations and assist women who are disempowered vis-à-vis their male partners. Paradoxically, however, we underscore that it is the very structure of regional and local gender relations that shapes the acceptability (or lack of acceptability) of these methods. This paper specifically addresses how the structure of gender relations—for better and for worse—shapes the promises and limitations of widespread use and acceptance of female-initiated methods. We draw on examples from around the world to underscore how the regional specificities of gender (in)equality shape the acceptance, negotiation, and use of these methods. Simultaneously, we demonstrate how the introduction and sustained use of methods are shaped by gender relations and offer possibilities for reinforcing or challenging their current state. Based on our analyses, we offer key policy and programmatic recommendations to increase promotion and effective use of women-initiated HIV/STI protection methods for both women and men. PMID:16814912

  20. Direct Numerical Simulation of a Temporally Evolving Incompressible Plane Wake: Effect of Initial Conditions on Evolution and Topology

    NASA Technical Reports Server (NTRS)

    Sondergaard, R.; Cantwell, B.; Mansour, N.

    1997-01-01

    Direct numerical simulations have been used to examine the effect of the initial disturbance field on the development of three-dimensionality and the transition to turbulence in the incompressible plane wake. The simulations were performed using a new numerical method for solving the time-dependent, three-dimensional, incompressible Navier-Stokes equations in flows with one infinite and two periodic directions. The method uses standard Fast Fourier Transforms and is applicable to cases where the vorticity field is compact in the infinite direction. Initial disturbance fields examined were combinations of two-dimensional waves and symmetric pairs of 60 deg oblique waves at the fundamental, subharmonic, and sub-subharmonic wavelengths. The results of these simulations indicate that the presence of 60 deg disturbances at the subharmonic streamwise wavelength results in the development of strong coherent three-dimensional structures. The resulting strong three-dimensional rate-of-strain triggers the growth of intense fine scale motions. Wakes initiated with 60 deg disturbances at the fundamental streamwise wavelength develop weak coherent streamwise structures, and do not develop significant fine scale motions, even at high Reynolds numbers. The wakes which develop strong three-dimensional structures exhibit growth rates on par with experimentally observed turbulent plane wakes. Wakes which develop only weak three-dimensional structures exhibit significantly lower late time growth rates. Preliminary studies of wakes initiated with an oblique fundamental and a two-dimensional subharmonic, which develop asymmetric coherent oblique structures at the subharmonic wavelength, indicate that significant fine scale motions only develop if the resulting oblique structures are above an angle of approximately 45 deg.

  1. INTEGRATION OF SPATIAL DATA: METHODS EVALUATION WITH REGARD TO DATA ISSUES AND ASSESSMENT QUESTIONS

    EPA Science Inventory

    EPA's Regional Vulnerability Assessment (REVA) Program is developing and demonstrating approaches to assess current and future environmental vulnerabilities at a regional scale. An initial effort within this research program has been to develop and evaluate methods to synthesize ...

  2. Comparison of initial perturbation methods for the mesoscale ensemble prediction system of the Meteorological Research Institute for the WWRP Beijing 2008 Olympics Research and Development Project (B08RDP)

    NASA Astrophysics Data System (ADS)

    Saito, Kazuo; Hara, Masahiro; Kunii, Masaru; Seko, Hiromu; Yamaguchi, Munehiko

    2011-05-01

    Different initial perturbation methods for the mesoscale ensemble prediction were compared by the Meteorological Research Institute (MRI) as a part of the intercomparison of mesoscale ensemble prediction systems (EPSs) of the World Weather Research Programme (WWRP) Beijing 2008 Olympics Research and Development Project (B08RDP). Five initial perturbation methods for mesoscale ensemble prediction were developed for B08RDP and compared at MRI: (1) a downscaling method of the Japan Meteorological Agency (JMA)'s operational one-week EPS (WEP), (2) a targeted global model singular vector (GSV) method, (3) a mesoscale model singular vector (MSV) method based on the adjoint model of the JMA non-hydrostatic model (NHM), (4) a mesoscale breeding growing mode (MBD) method based on the NHM forecast and (5) a local ensemble transform (LET) method based on the local ensemble transform Kalman filter (LETKF) using NHM. These perturbation methods were applied to the preliminary experiments of the B08RDP Tier-1 mesoscale ensemble prediction with a horizontal resolution of 15 km. To make the comparison easier, the same horizontal resolution (40 km) was employed for the three mesoscale model-based initial perturbation methods (MSV, MBD and LET). The GSV method completely outperformed the WEP method, confirming the advantage of targeting in mesoscale EPS. The GSV method generally performed well with regard to root mean square errors of the ensemble mean, large growth rates of ensemble spreads throughout the 36-h forecast period, and high detection rates and high Brier skill scores (BSSs) for weak rains. On the other hand, the mesoscale model-based initial perturbation methods showed good detection rates and BSSs for intense rains. The MSV method showed a rapid growth in the ensemble spread of precipitation up to a forecast time of 6 h, which suggests suitability of the mesoscale SV for short-range EPSs, but the initial large growth of the perturbation did not last long. The performance of the MBD method was good for ensemble prediction of intense rain with a relatively small computing cost. The LET method showed similar characteristics to the MBD method, but the spread and growth rate were slightly smaller and the relative operating characteristic area skill score and BSS did not surpass those of MBD. These characteristic features of the five methods were confirmed by checking the evolution of the total energy norms and their growth rates. Characteristics of the initial perturbations obtained by four methods (GSV, MSV, MBD and LET) were examined for the case of a synoptic low-pressure system passing over eastern China. With GSV and MSV, the regions of large spread were near the low-pressure system, but with MSV, the distribution was more concentrated on the mesoscale disturbance. On the other hand, large-spread areas were observed southwest of the disturbance in MBD and LET. The horizontal pattern of LET perturbation was similar to that of MBD, but the amplitude of the LET perturbation reflected the observation density.
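
    Of the five methods compared, breeding (the MBD approach) is the simplest to caricature. The sketch below runs a breeding cycle on the Lorenz-96 toy model as a stand-in for the non-hydrostatic model: control and perturbed states are integrated forward, their difference is rescaled to a fixed amplitude, and the rescaled difference is re-added to the control to seed the next cycle. The model, amplitude, and cycle lengths are assumptions, not B08RDP settings.

```python
import numpy as np

rng = np.random.default_rng(6)

def l96_tendency(x, F=8.0):
    """Lorenz-96 tendencies; a toy stand-in for the non-hydrostatic model."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, dt=0.05):
    k1 = l96_tendency(x)
    k2 = l96_tendency(x + 0.5 * dt * k1)
    k3 = l96_tendency(x + 0.5 * dt * k2)
    k4 = l96_tendency(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def breed(x0, n_cycles=50, steps_per_cycle=4, amplitude=0.1):
    """Breeding of growing modes: run control and perturbed forecasts,
    rescale their difference to a fixed amplitude, and re-add it to the
    control state to seed the next cycle."""
    ctrl = x0.copy()
    pert = x0 + amplitude * rng.normal(size=x0.size)
    for _ in range(n_cycles):
        for _ in range(steps_per_cycle):
            ctrl, pert = rk4_step(ctrl), rk4_step(pert)
        diff = pert - ctrl
        diff *= amplitude / np.linalg.norm(diff)   # keep the fastest-growing direction
        pert = ctrl + diff
    return diff                                    # bred vector = initial perturbation

x0 = 8.0 + rng.normal(0.0, 0.5, 40)
bred_vector = breed(x0)
print("bred-vector norm:", float(np.linalg.norm(bred_vector)))   # equals the amplitude
```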

  3. Mapping debris-flow hazard in Honolulu using a DEM

    USGS Publications Warehouse

    Ellen, Stephen D.; Mark, Robert K.; ,

    1993-01-01

    A method for mapping hazard posed by debris flows has been developed and applied to an area near Honolulu, Hawaii. The method uses studies of past debris flows to characterize sites of initiation, volume at initiation, and volume-change behavior during flow. Digital simulations of debris flows based on these characteristics are then routed through a digital elevation model (DEM) to estimate degree of hazard over the area.
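
    The routing idea can be caricatured in a few lines: a simulated flow follows the steepest-descent (D8) path through the DEM while a simple rule adjusts its volume. The synthetic DEM, the start cell, and the gain/loss rule below are placeholders for the mapped initiation sites and the empirically derived volume-change behaviour used in the study.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic DEM (elevations in metres); the study would use the Honolulu-area
# DEM, with initiation sites and volumes characterised from past debris flows.
ny, nx = 60, 60
yy, xx = np.mgrid[0:ny, 0:nx]
dem = 300.0 - 3.0 * yy + 2.0 * np.sin(xx / 4.0) + rng.normal(0, 0.3, (ny, nx))

def route_debris_flow(dem, start, volume, gain=5.0, loss=20.0):
    """Route one simulated debris flow down the steepest-descent (D8) path,
    bulking by `gain` m^3 on steep cells and depositing `loss` m^3 on gentle
    cells, until the volume is exhausted or a pit is reached.  The gain/loss
    rule is a placeholder for the study's volume-change behaviour."""
    path, (i, j) = [start], start
    while volume > 0:
        window = dem[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2]
        di, dj = np.unravel_index(np.argmin(window), window.shape)
        ni, nj = max(i - 1, 0) + di, max(j - 1, 0) + dj
        if dem[ni, nj] >= dem[i, j]:
            break                                    # pit: flow stops
        drop = dem[i, j] - dem[ni, nj]
        volume += gain if drop > 2.0 else -loss      # bulking vs deposition
        i, j = ni, nj
        path.append((i, j))
    return path

path = route_debris_flow(dem, start=(2, 30), volume=200.0)
print("cells inundated:", len(path))
```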

  4. Pilot-in-the-Loop CFD Method Development

    DTIC Science & Technology

    2017-04-20

    the methods on the NAVAIR Manned Flight Simulator. Activities this period: During this report period, we implemented the CRAFT CFD code on the... Penn State VLRCROE flight simulator and performed the first Pilot-in-the-Loop CFD (PILCFD) tests at Penn State using the COCOA5 clusters. The initial tests... integration of the flight simulator and Penn State computing infrastructure. Initial tests showed slower performance than real-time (3x slower than real

  5. Benchmarking the Performance of Employment and Training Programs: A Pilot Effort of the Annie E. Casey Foundation's Jobs Initiative.

    ERIC Educational Resources Information Center

    Welch, Doug

    As part of its Jobs Initiative (JI) program in six metropolitan areas (Denver, Milwaukee, New Orleans, Philadelphia, St. Louis, and Seattle), the Annie E. Casey Foundation sought to develop and test a method for establishing benchmarks for workforce development agencies. Data collected from 10 projects in the JI from April through March, 2000,…

  6. Hierarchically-Driven Approach for Quantifying Fatigue Crack Initiation and Short Crack Growth Behavior in Aerospace Materials

    DTIC Science & Technology

    2016-08-31

    crack initiation and SCG mechanisms (initiation and growth versus resistance). 2. Final summary: Here, we present a hierarchical form of multiscale... prismatic faults in α-Ti: a combined quantum mechanics/molecular mechanics study; 2. nano-indentation and slip transfer (critical in understanding crack... initiation); 3. an extended finite element framework (XFEM) to study SCG mechanisms; 4. atomistic methods to develop a grain and twin boundaries database

  7. Development of Optimization method about Capital Structure and Senior-Sub Structure by considering Project-Risk

    NASA Astrophysics Data System (ADS)

    Kawamoto, Shigeru; Ikeda, Yuichi; Fukui, Chihiro; Tateshita, Fumihiko

    A private finance initiative (PFI) is a business scheme that delivers social infrastructure and public services by utilizing private-sector resources. In this paper we propose a new method to optimize the capital structure, which is the ratio of capital to debt, and the senior-sub structure, which is the ratio of senior loan to subordinated loan, for private finance initiatives. We carry out a quantitative analysis of a PFI project using the proposed method, analyze the trade-off between risk and return in the project, and optimize the capital structure and senior-sub structure. The proposed method helps to improve the financial stability of the project and to produce a fund-raising plan that is reasonable for both the project sponsor and the lender.
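
    As a very rough, hedged caricature of this kind of optimization (not the authors' model; the cash-flow distribution, debt pricing, and default rule are invented, and the senior/sub split is omitted), the sketch below grid-searches the debt ratio that maximises a leveraged-return proxy subject to a cap on the simulated probability of a debt-service shortfall.

```python
import numpy as np

rng = np.random.default_rng(8)

# Illustrative project: total cost 100, uncertain annual operating cash flow,
# 20-year concession.  All figures are placeholders, not the paper's data.
TOTAL_COST, YEARS, N_SIM = 100.0, 20, 20_000
cash_flows = rng.normal(9.0, 2.5, (N_SIM, YEARS))       # project risk

def evaluate(debt_ratio, rate=0.05):
    """Return an equity-return proxy and the simulated default probability
    for one capital structure (debt vs equity)."""
    debt, equity = TOTAL_COST * debt_ratio, TOTAL_COST * (1 - debt_ratio)
    debt_service = debt * rate / (1 - (1 + rate) ** -YEARS) if debt > 0 else 0.0
    equity_cf = cash_flows - debt_service                    # annuity payment deducted
    default = (equity_cf < 0).any(axis=1)                    # any year short of service
    avg_equity_return = equity_cf.mean() * YEARS / equity
    return avg_equity_return, default.mean()

best = None
for ratio in np.arange(0.0, 0.95, 0.05):
    ret, pd = evaluate(ratio)
    if pd <= 0.05 and (best is None or ret > best[1]):       # default-probability cap
        best = (ratio, ret, pd)

print("chosen debt ratio %.2f, equity-return proxy %.2f, default prob %.3f" % best)
```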

  8. Multiscale atomistic simulation of metal-oxygen surface interactions: Methodological development, theoretical investigation, and correlation with experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Judith C.

    The purpose of this grant is to develop multi-scale theoretical methods to describe the nanoscale oxidation of metal thin films, as the PI (Yang) has extensive previous experience in the experimental elucidation of the initial stages of Cu oxidation, primarily by in situ transmission electron microscopy methods. Through the use and development of computational tools at varying length (and time) scales, from atomistic quantum mechanical calculations and force-field mesoscale simulations to large-scale Kinetic Monte Carlo (KMC) modeling, the fundamental underpinnings of the initial stages of Cu oxidation have been elucidated. The development of computational modeling tools allows for accelerated materials discovery. The theoretical tools developed from this program impact a wide range of technologies that depend on surface reactions, including corrosion, catalysis, and nanomaterials fabrication.

  9. One-year contraceptive continuation and pregnancy in adolescent girls and women initiating hormonal contraceptives.

    PubMed

    Raine, Tina R; Foster-Rosales, Anne; Upadhyay, Ushma D; Boyer, Cherrie B; Brown, Beth A; Sokoloff, Abby; Harper, Cynthia C

    2011-02-01

    To assess contraceptive discontinuation, switching, factors associated with method discontinuation, and pregnancy among women initiating hormonal contraceptives. This was a 12-month longitudinal cohort study of adolescent girls and women (n=1,387) aged 15 to 24 years attending public family planning clinics who did not desire pregnancy for at least 1 year and selected to initiate the patch, ring, depot medroxyprogesterone acetate, or pills. Participants completed follow-up assessments at 3, 6, and 12 months after baseline. Life table analysis was used to estimate survival rates for contraceptive continuation. Cox proportional hazards models were used to estimate factors associated with method discontinuation. The continuation rate (per 100 person-years) at 12 months was low for all methods; however, it was lowest for patch and depot medroxyprogesterone acetate initiators, 10.9 and 12.1 per 100 person-years, respectively (P≤.003); continuation among ring initiators was comparable to pill initiators, 29.4 and 32.7 per 100 person-years, respectively (P=.06). Discontinuation was independently associated with method initiated and younger age. The only factors associated with lower risk of discontinuation were greater intent to use the method and being in school or working. The pregnancy rate (per 100 person-years) was highest for patch and ring initiators (30.1 and 30.5) and comparable for pill and depot medroxyprogesterone acetate initiators (16.5 and 16.1; P<.001). The patch and the ring may not be better options than the pill or depot medroxyprogesterone acetate for women at high risk for unintended pregnancy. This study highlights the need for counseling interventions to improve contraceptive continuation, education about longer-acting methods, and developing new contraceptives that women may be more likely to continue. Level of evidence: II.

  10. Novel kinetic spectrophotometric method for estimation of certain biologically active phenolic sympathomimetic drugs in their bulk powders and different pharmaceutical formulations

    NASA Astrophysics Data System (ADS)

    Omar, Mahmoud A.; Badr El-Din, Khalid M.; Salem, Hesham; Abdelmageed, Osama H.

    2018-03-01

    A simple, selective and sensitive kinetic spectrophotometric method was described for the estimation of four phenolic sympathomimetic drugs, namely terbutaline sulfate, fenoterol hydrobromide, isoxsuprine hydrochloride and etilefrine hydrochloride. The method depends on the oxidation of the phenolic drugs with Folin-Ciocalteu reagent in the presence of sodium carbonate. The rate of color development at 747-760 nm was measured spectrophotometrically. The experimental parameters controlling the color development were fully studied and optimized, and a reaction mechanism for the color development was proposed. Calibration graphs for both the initial rate and fixed time methods were constructed; linear correlations were found over the general concentration ranges of 3.65 × 10⁻⁶ to 2.19 × 10⁻⁵ mol L⁻¹ and 2.0-24.0 μg mL⁻¹, with correlation coefficients in the ranges 0.9992-0.9999 and 0.9991-0.9998, respectively. The limits of detection and quantitation were found to be in the general ranges 0.109-0.273 and 0.363-0.910 μg mL⁻¹ for the initial rate method and 0.210-0.483 and 0.700-1.611 μg mL⁻¹ for the fixed time method, respectively. The developed method was validated according to ICH and USP 30-NF 25 guidelines. The suggested method was successfully applied to the estimation of these drugs in their commercial pharmaceutical formulations, and the recovery percentages ranged from 97.63% ± 1.37 to 100.17% ± 0.95 and from 97.29% ± 0.74 to 100.14% ± 0.81 for the initial rate and fixed time methods, respectively. The data obtained from the analysis of dosage forms were compared with those obtained by reported methods; statistical analysis of these results indicated no significant difference in the accuracy and precision of the proposed and reported methods.
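
    The difference between the two calibration strategies can be illustrated with a short sketch; the absorbance-time traces, rate constant and concentrations below are fabricated for illustration and do not reproduce the reported data.

      import numpy as np
      from scipy.stats import linregress

      conc = np.array([2, 6, 10, 14, 18, 24], dtype=float)        # ug/mL (illustrative)
      t = np.linspace(0, 20, 41)                                   # minutes
      # fabricated first-order color development: A(t) = A_inf * (1 - exp(-k t)), A_inf ~ conc
      traces = [0.04 * c * (1 - np.exp(-0.15 * t)) for c in conc]

      # Initial-rate method: slope of A vs t over the first few points, regressed on concentration
      init_rates = [linregress(t[:5], a[:5]).slope for a in traces]
      rate_cal = linregress(conc, init_rates)

      # Fixed-time method: absorbance at a fixed time (here 10 min), regressed on concentration
      a_fixed = [np.interp(10.0, t, a) for a in traces]
      fixed_cal = linregress(conc, a_fixed)

      print(f"initial-rate calibration r = {rate_cal.rvalue:.4f}")
      print(f"fixed-time calibration  r = {fixed_cal.rvalue:.4f}")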

  11. A GEO Initiative to Support the Sustainable Development Goals

    NASA Astrophysics Data System (ADS)

    Friedl, L.

    2016-12-01

    The United Nations Agenda 2030 serves as a global development agenda for progress on economic, social and environmental sustainability. These Sustainable Development Goals (SDGs) include a specific provision for the use of Earth observations and geospatial information to support progress. The international Group on Earth Observations (GEO) has a dedicated initiative focused on the SDGs. This initiative supports efforts to integrate Earth observations and geospatial information into national development and monitoring frameworks for the SDGs. It helps enable countries and stakeholders to leverage Earth observations to support the implementation, planning, measuring, monitoring, reporting, and evaluation of the SDGs. This paper will present an overview of the GEO initiative and ways that Earth observations support the development goals. It will address how information and knowledge can be shared on effective methods to apply Earth observations to the SDGs and their associated targets and indicators. It will also highlight some existing information sources and tools on the SDGs, which can help identify key approaches for developing a knowledge base.

  12. Lateral Root Initiation in Arabidopsis: Developmental Window, Spatial Patterning, Density and Predictability

    PubMed Central

    DUBROVSKY, J. G.; GAMBETTA, G. A.; HERNÁNDEZ-BARRERA, A.; SHISHKOVA, S.; GONZÁLEZ, I.

    2006-01-01

    • Background and Aims The basic regulatory mechanisms that control lateral root (LR) initiation are still poorly understood. An attempt is made to characterize the pattern and timing of LR initiation, to define a developmental window in which LR initiation takes place and to address the question of whether LR initiation is predictable. • Methods The spatial patterning of LRs and LR primordia (LRPs) on cleared root preparations was characterized. New measures of LR and LRP densities (number of LRs and/or LRPs divided by the length of the root portions where they are present) were introduced, and the shortcomings of the more customarily used measure are illustrated through a comparative analysis of the mutant aux1-7. The enhancer trap line J0121 was used to monitor LR initiation in time-lapse experiments and a plasmolysis-based method was developed to determine the number of pericycle cells between successive LRPs. • Key Results LRP initiation occurred strictly acropetally and no de novo initiation events were found between already developed LRs or LRPs. However, LRPs did not become LRs in a similar pattern. The longitudinal spacing of lateral organs was variable, and the distance between lateral organs was proportional to the number of cells and the time between initiations of successive LRPs. There was a strong tendency towards alternation in LR initiation between the two pericycle cell files adjacent to the protoxylem poles. LR density increased with time due to the emergence of slowly developing LRPs and appears to be unique for individual Arabidopsis accessions. • Conclusions In Arabidopsis there is a narrow developmental window for LR initiation, and no specific cell-count or distance-measuring mechanisms have been found that determine the site of successive initiation events. Nevertheless, the branching density and lateral organ density (density of LRs and LRPs) are accession-specific, and based on the latter density the average distance between successive LRs can be predicted. PMID:16390845

  13. An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1988-01-01

    The initial effort was concentrated on developing the quasi-analytical approach for two-dimensional transonic flow. To keep the problem computationally efficient and straightforward, only the two-dimensional flow was considered and the problem was modeled using the transonic small perturbation equation.

  14. Dynamic Assessment: One Approach and Some Initial Data. Technical Report No. 361.

    ERIC Educational Resources Information Center

    Campione, Joseph C.; Brown, Ann L.

    In an effort to validate dynamic assessment methods influenced by Vygotsky's (1978) definition of zones of proximal development (an indicator of readiness), three sets of experiments addressed two goals: the development of diagnostic assessment methods and the use of diagnostic results to guide the design of instructional programs. The first two…

  15. Development of practical diagnostic methods for monitoring rice bacterial panicle blight disease and evaluation of rice germplasm for resistance

    USDA-ARS?s Scientific Manuscript database

    A study was initiated to understand Burkholderia glumae, the major causal agent for bacterial panicle blight disease of rice; to develop practical diagnostic methods for monitoring the disease; and to evaluate rice germplasm for resistance. Burkholderia glumae was frequently isolated from infected p...

  16. Development of practical diagnostic methods for monitoring rice bacterial panicle blight disease and evaluation of rice germplasm for resistance

    USDA-ARS?s Scientific Manuscript database

    A study was initiated to understand Burkholderia glumae (the major causal agent of bacterial panicle blight disease of rice); to develop practical diagnostic methods for monitoring the disease; and to evaluate rice germplasm for resistance. B. glumae was frequently isolated from symptomatic panicles on...

  17. Housing decision making methods for initiation development phase process

    NASA Astrophysics Data System (ADS)

    Zainal, Rozlin; Kasim, Narimah; Sarpin, Norliana; Wee, Seow Ta; Shamsudin, Zarina

    2017-10-01

    Late delivery and sick housing project problems have been attributed to poor decision making. These problems stem from housing developers who prefer to create their own approaches based on their experience and expertise, taking the simplest route of applying only the readily available standards and rules in decision making. This paper seeks to identify the decision making methods for housing development at the initiation phase in Malaysia. The research employed the Delphi method using a questionnaire survey, with 50 developers sampled for the first round of data collection. However, only 34 developers contributed to the second round of the information gathering process, and only 12 developers remained for the final round. The findings affirm that Malaysian developers prefer to make their investment decisions based on simple interpolation of historical data and on simple statistical or mathematical techniques when producing the required reports. The results suggest that they skip several important decision-making functions at the primary development stage. These shortcomings were mainly due to time and financial constraints and the lack of statistical or mathematical expertise among the professional and management groups in developer organisations.

  18. Master Logic Diagram: method for hazard and initiating event identification in process plants.

    PubMed

    Papazoglou, I A; Aneziris, O N

    2003-02-28

    Master Logic Diagram (MLD), a method for identifying events initiating accidents in chemical installations, is presented. MLD is a logic diagram that resembles a fault tree but without the formal mathematical properties of the latter. MLD starts with a Top Event "Loss of Containment" and decomposes it into simpler contributing events. A generic MLD has been developed which may be applied to all chemical installations storing toxic and/or flammable substances. The method is exemplified through its application to an ammonia storage facility.
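
    A minimal sketch of the MLD idea follows: the top event is decomposed level by level and the leaves are collected as candidate initiating events. The decomposition shown is a generic illustration, not the paper's generic MLD for storage installations.

      # Top event decomposed into contributing events; leaves are candidate initiators.
      # The tree below is an invented illustration.
      mld = {
          "Loss of Containment": {
              "Containment breach": {
                  "Corrosion of vessel wall": {},
                  "Mechanical impact": {},
              },
              "Overpressure": {
                  "Overfilling": {},
                  "External fire": {},
                  "Loss of cooling": {},
              },
          }
      }

      def initiating_events(node, path=()):
          """Depth-first walk; a node with no children is a candidate initiating event."""
          events = []
          for name, children in node.items():
              if children:
                  events += initiating_events(children, path + (name,))
              else:
                  events.append(" -> ".join(path + (name,)))
          return events

      for ev in initiating_events(mld):
          print(ev)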

  19. Predicting Ice Sheet and Climate Evolution at Extreme Scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heimbach, Patrick

    2016-02-06

    A main research objective of PISCEES is the development of formal methods for quantifying uncertainties in ice sheet modeling. Uncertainties in simulating and projecting mass loss from the polar ice sheets arise primarily from initial conditions, surface and basal boundary conditions, and model parameters. In general terms, two main chains of uncertainty propagation may be identified: 1. inverse propagation of observation and/or prior uncertainties onto posterior control variable uncertainties; 2. forward propagation of prior or posterior control variable uncertainties onto those of target output quantities of interest (e.g., climate indices or ice sheet mass loss). A related goal is the development of computationally efficient methods for producing initial conditions for an ice sheet that are close to available present-day observations and essentially free of artificial model drift, which is required in order to be useful for model projections (the “initialization problem”). To be of maximum value, such optimal initial states should be accompanied by “useful” uncertainty estimates that account for the different sources of uncertainty, as well as the degree to which the optimum state is constrained by available observations. The PISCEES proposal outlined two approaches for quantifying uncertainties. The first targets the full exploration of the uncertainty in model projections with sampling-based methods and a workflow managed by DAKOTA (the main delivery vehicle for software developed under QUEST). This is feasible for low-dimensional problems, e.g., those with a handful of global parameters to be inferred. This approach can benefit from derivative/adjoint information, but such information is not necessary, which is why the approach is often referred to as “non-intrusive”. The second approach makes heavy use of derivative information from model adjoints to address quantifying uncertainty in high dimensions (e.g., basal boundary conditions in ice sheet models). The use of local gradient or Hessian information (i.e., second derivatives of the cost function) requires additional code development and implementation, and is thus often referred to as an “intrusive” approach. Within PISCEES, MIT has been tasked to develop methods for derivative-based UQ, the “intrusive” approach discussed above. These methods rely on the availability of first (adjoint) and second (Hessian) derivative code, developed through intrusive methods such as algorithmic differentiation (AD). While representing a significant burden in terms of code development, derivative-based UQ is able to cope with very high-dimensional uncertainty spaces. That is, unlike sampling methods (all variations of Monte Carlo), the computational burden is independent of the dimension of the uncertainty space. This is a significant advantage for spatially distributed uncertainty fields, such as three-dimensional initial conditions, three-dimensional parameter fields, or two-dimensional surface and basal boundary conditions. Importantly, uncertainty fields for ice sheet models generally fall into this category.
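
    As a toy illustration of the “non-intrusive”, sampling-based route (forward propagation of prior parameter uncertainties onto a quantity of interest), the sketch below pushes two uncertain parameters through an invented stand-in response function; it is not the PISCEES/DAKOTA workflow itself.

      import numpy as np

      rng = np.random.default_rng(7)

      def g(friction, thickness):
          # Invented response surface standing in for an expensive ice-sheet forward model:
          # maps a basal-friction-like parameter and an initial-thickness-like parameter
          # to a scalar "mass loss" quantity of interest.
          return 100.0 * np.exp(-friction) + 0.5 * thickness

      friction = rng.normal(1.0, 0.2, 10_000)       # prior uncertainty on parameter 1 (made up)
      thickness = rng.normal(200.0, 15.0, 10_000)   # prior uncertainty on parameter 2 (made up)
      qoi = g(friction, thickness)                  # Monte Carlo forward propagation

      print(f"mass-loss QoI: mean = {qoi.mean():.1f}, std = {qoi.std():.1f}")
      # The derivative-based ("intrusive") route would instead use adjoint gradients and
      # Hessian information of g to characterize the same spread, at a cost that does not
      # grow with the number of uncertain parameters.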

  20. Methods of developing core collections based on the predicted genotypic value of rice ( Oryza sativa L.).

    PubMed

    Li, C T; Shi, C H; Wu, J G; Xu, H M; Zhang, H Z; Ren, Y L

    2004-04-01

    The selection of an appropriate sampling strategy and a clustering method is important in the construction of core collections based on predicted genotypic values in order to retain the greatest degree of genetic diversity of the initial collection. In this study, methods of developing rice core collections were evaluated based on the predicted genotypic values for 992 rice varieties with 13 quantitative traits. The genotypic values of the traits were predicted by the adjusted unbiased prediction (AUP) method. Based on the predicted genotypic values, Mahalanobis distances were calculated and employed to measure the genetic similarities among the rice varieties. Six hierarchical clustering methods, including the single linkage, median linkage, centroid, unweighted pair-group average, weighted pair-group average and flexible-beta methods, were combined with random, preferred and deviation sampling to develop 18 core collections of rice germplasm. The results show that the deviation sampling strategy in combination with the unweighted pair-group average method of hierarchical clustering retains the greatest degree of genetic diversity of the initial collection. The core collections sampled using predicted genotypic values had more genetic diversity than those based on phenotypic values.
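
    The winning recipe reported above (Mahalanobis distances, unweighted pair-group average clustering, one accession sampled per cluster) can be sketched as follows; the trait matrix is random stand-in data, and the per-cluster "deviation" pick used here is a simplified reading of the sampling strategy, not the paper's exact procedure.

      import numpy as np
      from scipy.spatial.distance import pdist
      from scipy.cluster.hierarchy import linkage, fcluster

      rng = np.random.default_rng(3)
      traits = rng.normal(size=(200, 13))              # 200 varieties x 13 quantitative traits (stand-in)

      d = pdist(traits, metric="mahalanobis")          # pairwise Mahalanobis distances
      tree = linkage(d, method="average")              # unweighted pair-group average (UPGMA)
      clusters = fcluster(tree, t=20, criterion="maxclust")   # cut the tree into 20 groups

      core = []
      for c in np.unique(clusters):
          idx = np.where(clusters == c)[0]
          centre = traits[idx].mean(axis=0)
          dev = np.linalg.norm(traits[idx] - centre, axis=1)
          core.append(idx[np.argmax(dev)])             # simplified "deviation" pick per cluster

      print("core collection indices:", sorted(core))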

  1. Time-dependent importance sampling in semiclassical initial value representation calculations for time correlation functions.

    PubMed

    Tao, Guohua; Miller, William H

    2011-07-14

    An efficient time-dependent importance sampling method is developed for the Monte Carlo calculation of time correlation functions via the initial value representation (IVR) of semiclassical (SC) theory. A prefactor-free time-dependent sampling function weights the importance of a trajectory based on the magnitude of its contribution to the time correlation function, and global trial moves are used to facilitate efficient sampling of the phase space of initial conditions. The method can be generally applied to sampling rare events efficiently while avoiding being trapped in a local region of the phase space. Results presented in the paper for two system-bath models demonstrate the efficiency of this new importance sampling method for full SC-IVR calculations.
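
    A generic importance-sampling sketch in the same spirit (not the SC-IVR implementation): sample from a trial density biased toward the region where the integrand contributes, then correct with weights p/q. The Gaussian integrand below stands in for the far more involved time-correlation-function integrand.

      import numpy as np

      rng = np.random.default_rng(11)

      def integrand(x):                       # contributes mainly near x = 4 (illustrative)
          return np.exp(-0.5 * (x - 4.0) ** 2)

      def p(x):                               # nominal density of initial conditions
          return np.exp(-0.5 * x ** 2) / np.sqrt(2 * np.pi)

      n = 50_000
      # plain sampling from p: few samples land where the integrand matters
      x_plain = rng.normal(0.0, 1.0, n)
      est_plain = np.mean(integrand(x_plain))

      # importance sampling from q = N(4, 1), reweighted by p/q
      x_is = rng.normal(4.0, 1.0, n)
      q = np.exp(-0.5 * (x_is - 4.0) ** 2) / np.sqrt(2 * np.pi)
      est_is = np.mean(integrand(x_is) * p(x_is) / q)

      print(f"plain MC: {est_plain:.5f}   importance sampling: {est_is:.5f}")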

  2. A simplified method of performance indicators development for epidemiological surveillance networks--application to the RESAPATH surveillance network.

    PubMed

    Sorbe, A; Chazel, M; Gay, E; Haenni, M; Madec, J-Y; Hendrikx, P

    2011-06-01

    Developing and calculating performance indicators makes it possible to continuously follow the operation of an epidemiological surveillance network. This is an internal evaluation method, implemented by the coordinators in collaboration with all the actors of the network. Its purpose is to detect weak points in order to optimize management. A method for the development of performance indicators for epidemiological surveillance networks was developed in 2004 and has been applied to several networks. Its implementation requires a thorough description of the network environment and all its activities in order to define priority indicators. Since this method is considered complex, our objective was to develop a simplified approach and apply it to an epidemiological surveillance network. We applied the initial method to a theoretical network model to obtain a list of generic indicators that can be adapted to any surveillance network. We obtained a list of 25 generic performance indicators, intended to be reformulated and described according to the specificities of each network. This list was used to develop performance indicators for RESAPATH, an epidemiological surveillance network for antimicrobial resistance in pathogenic bacteria of animal origin in France. This application allowed us to validate the simplified method, its value in terms of practical implementation, and its level of user acceptance. Its ease of use and speed of application compared to the initial method argue in favor of its use on a broader scale. Copyright © 2011 Elsevier Masson SAS. All rights reserved.
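
    A toy calculation of the kind of generic indicator such a method yields (the real list contains 25 generic indicators to be reformulated per network); the indicator definitions, thresholds and records below are purely illustrative.

      import pandas as pd

      # Invented surveillance records: reporting delay and completeness per submission.
      records = pd.DataFrame({
          "lab":            ["A", "A", "B", "C", "C", "C"],
          "days_to_report": [12, 35, 8, 20, 45, 15],     # delay between sampling and report
          "species_filled": [True, True, False, True, True, True],
      })

      timeliness = (records["days_to_report"] <= 30).mean()   # share reported within 30 days
      completeness = records["species_filled"].mean()         # share of complete records
      participation = records["lab"].nunique() / 10           # labs reporting / labs enrolled (assume 10)

      print(f"timeliness={timeliness:.0%} completeness={completeness:.0%} participation={participation:.0%}")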

  3. Mathematical modeling of vibration processes in reinforced concrete structures for setting up crack initiation monitoring

    NASA Astrophysics Data System (ADS)

    Bykov, A. A.; Matveenko, B. P.; Serovaev, G. S.; Shardakov, I. N.; Shestakov, A. P.

    2015-03-01

    The contemporary construction industry relies on reinforced concrete structures, but emergency situations resulting in fracture can arise during their operation. In the majority of cases, reinforced concrete fracture takes the form of crack formation and development. As a rule, the appearance of the first cracks does not lead to a complete loss of the carrying capacity but is a fracture precursor. One method for ensuring the safe operation of building structures is based on crack initiation monitoring. A vibration method for the monitoring of reinforced concrete structures is justified in this paper. An example of a reinforced concrete beam is used to consider all stages related to the analysis of the behavior of natural frequencies during the development of a crack-shaped defect and the use of the obtained numerical results for the vibration test method. The efficiency of the method is illustrated by modeling results for the physical part of the method, namely the analysis of natural frequency evolution in response to impact excitation as the crack develops.
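
    The physical idea behind the vibration method can be sketched briefly: a crack lowers stiffness, so the natural frequency identified from the impact response drops. The synthetic single-mode signals and frequency values below are illustrative only, not the paper's finite-element results.

      import numpy as np

      fs = 1000.0                              # sampling rate, Hz
      t = np.arange(0, 5.0, 1.0 / fs)

      def impact_response(f_nat, zeta=0.02):
          # damped single-mode response to an impact (synthetic)
          return np.exp(-zeta * 2 * np.pi * f_nat * t) * np.sin(2 * np.pi * f_nat * t)

      def peak_frequency(signal):
          spectrum = np.abs(np.fft.rfft(signal))
          freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
          return freqs[np.argmax(spectrum)]

      f_healthy = peak_frequency(impact_response(24.0))   # intact beam (illustrative 24 Hz mode)
      f_cracked = peak_frequency(impact_response(22.6))   # same beam with a crack-induced drop

      print(f"healthy: {f_healthy:.2f} Hz, cracked: {f_cracked:.2f} Hz, "
            f"shift: {100 * (f_healthy - f_cracked) / f_healthy:.1f}%")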

  4. State Policy Initiatives for Financing Energy Efficiency in Public Buildings.

    ERIC Educational Resources Information Center

    Business Officer, 1984

    1984-01-01

    Alternative financing methods (other than state financing) for developing cost-effective energy efficiency projects are discussed. It is suggested that by properly financing energy efficiency investments, state campuses can generate immediate positive cash savings. The following eight initiatives for maximizing energy savings potential are…

  5. Perceptions of community-based participatory research in the delta nutrition intervention research initiative:an academic perspective

    USDA-ARS?s Scientific Manuscript database

    Lower Mississippi Delta Nutrition Intervention Research Initiative (Delta NIRI) is an academic-community partnership between seven academic institutions and three communities in Mississippi, Arkansas, and Louisiana. A range of community-based participatory methods have been employed to develop susta...

  6. Materials Genome Initiative Element

    NASA Technical Reports Server (NTRS)

    Vickers, John

    2015-01-01

    NASA is committed to developing new materials and manufacturing methods that can enable new missions with ever increasing mission demands. Typically, the development and certification of new materials and manufacturing methods in the aerospace industry has required more than 20 years of development time with a costly testing and certification program. To reduce the cost and time to mature these emerging technologies, NASA is developing computational materials tools to improve understanding of the material and guide the certification process.

  7. The Joint Experiment for Crop Assessment and Monitoring (JECAM) Initiative: Developing methods and best practices for global agricultural monitoring

    NASA Astrophysics Data System (ADS)

    Champagne, C.; Jarvis, I.; Defourny, P.; Davidson, A.

    2014-12-01

    Agricultural systems differ significantly throughout the world, making a 'one size fits all' approach to remote sensing and monitoring of agricultural landscapes problematic. The Joint Experiment for Crop Assessment and Monitoring (JECAM) was established in 2009 to bring together the global scientific community to work towards a set of best practices and recommendations for using earth observation data to map, monitor and report on agricultural productivity globally across an array of diverse agricultural systems. These methods form the research and development component of the Group on Earth Observation Global Agricultural Monitoring (GEOGLAM) initiative to harmonize global monitoring efforts and increase market transparency. The JECAM initiative brings together researchers from a large number of globally distributed, well monitored agricultural test sites that cover a range of crop types, cropping systems and climate regimes. Each test site works independently as well as together across multiple sites to test methods, sensors and field data collection techniques to derive key agricultural parameters, including crop type, crop condition, crop yield and soil moisture. The outcome of this project will be a set of best practices that cover the range of remote sensing monitoring and reporting needs, including satellite data acquisition, pre-processing techniques, information retrieval and ground data validation. These outcomes provide the research and development foundation for GEOGLAM and will help to inform the development of the GEOGLAM "system of systems" for global agricultural monitoring. The outcomes of the 2014 JECAM science meeting will be discussed as well as examples of methods being developed by JECAM scientists.

  8. From Indoctrination to Initiation: A Non-Coercive Approach to Faith-Learning Integration

    ERIC Educational Resources Information Center

    Reichard, Joshua D.

    2013-01-01

    This article contributes to ongoing discussions related to the nature, scope, and methods of faith-learning integration. The "initiation" approach developed by Tim McDonough (2011) is adapted to faith-learning integration in an attempt to bridge polarizing discussions regarding indoctrination versus rational autonomy and critical…

  9. Pilot-in-the-Loop CFD Method Development

    DTIC Science & Technology

    2014-06-16

    CFD analysis. Coupled simulations will be run at PSU on the COCOA-4 cluster, a high performance computing cluster. The CRUNCH CFD software has...been installed on the COCOA-4 servers and initial software tests are being conducted. Initial efforts will use the Generic Frigate Shape SFS-2 to

  10. General Practitioners' Management of Psychostimulant Drug Misuse: Implications for Education and Training

    ERIC Educational Resources Information Center

    Alkhamis, Ahmed; Matheson, Catriona; Bond, Christine

    2009-01-01

    Aims: To provide baseline data regarding GPs' knowledge, experience, and attitudes toward the management of PsychoStimulant Drug Misuse (PSDM) patients to inform future education and training initiatives. Methods: A structured cross-sectional postal questionnaire was developed following initial content setting interviews, piloted then sent to a…

  11. Hampton Roads climate impact quantification initiative : baseline assessment of the transportation assets & overview of economic analyses useful in quantifying impacts

    DOT National Transportation Integrated Search

    2016-09-13

    The Hampton Roads Climate Impact Quantification Initiative (HRCIQI) is a multi-part study sponsored by the U.S. Department of Transportation (DOT) Climate Change Center with the goals that include developing a cost tool that provides methods for volu...

  12. A Delphi Study and Initial Validation of Counselor Supervision Competencies

    ERIC Educational Resources Information Center

    Neuer Colburn, Anita A.; Grothaus, Tim; Hays, Danica G.; Milliken, Tammi

    2016-01-01

    The authors addressed the lack of supervision training standards for doctoral counseling graduates by developing and validating an initial list of supervision competencies. They used content analysis, Delphi polling, and content validity methods to generate a list, vetted by 2 different panels of supervision experts, of 33 competencies grouped…

  13. Brownfields initiatives offer few incentives for prospective developers, purchasers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wesolowski, T.; Antol, S.M.

    There has been much discussion and analysis in recent years among government agencies and state legislators regarding redevelopment of contaminated industrial sites, or brownfields. The goal of brownfields programs is to encourage developers and purchasers to redevelop existing industrial sites rather than choosing greenfield suburban sites. These programs create incentives for redevelopment, such as reducing the potential liability of innocent purchasers or developers, streamlining administrative processes relating to cleanups, and issuing guidance on cleanup methods and standards. Although such attempts are admirable and important, brownfields initiatives have failed because they address only one component of what is necessary to achieve redevelopment. The initiatives attempt to minimize disincentives to brownfields redevelopment; however, what is necessary for progress in this area is to create real economic and other incentives to make such redevelopment more attractive.

  14. Staff Study on Cost and Training Effectiveness of Proposed Training Systems. TAEG Report 1.

    ERIC Educational Resources Information Center

    Naval Training Equipment Center, Orlando, FL. Training Analysis and Evaluation Group.

    A study began the development and initial testing of a method for predicting cost and training effectiveness of proposed training programs. A prototype Training Effectiveness and Cost Effectiveness Prediction (TECEP) model was developed and tested. The model was a method for optimization of training media allocation on the basis of fixed training…

  15. Solar Power Tower Integrated Layout and Optimization Tool | Concentrating

    Science.gov Websites

    methods to reduce the overall computational burden while generating accurate and precise results. These methods have been developed as part of the U.S. Department of Energy (DOE) SunShot Initiative research

  16. Modeling, implementation, and validation of arterial travel time reliability.

    DOT National Transportation Integrated Search

    2013-11-01

    Previous research funded by Florida Department of Transportation (FDOT) developed a method for estimating travel time reliability for arterials. This method was not initially implemented or validated using field data. This project evaluated and r...

  17. An inviscid-viscous interaction approach to the calculation of dynamic stall initiation on airfoils

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cebeci, T.; Platzer, M.F.; Jang, H.M.

    An interactive boundary-layer method is described for computing unsteady incompressible flow over airfoils, including the initiation of dynamic stall. The inviscid unsteady panel method developed by Platzer and Teng is extended to include viscous effects. The solutions of the boundary-layer equations are obtained with an inverse finite-difference method employing an interaction law based on the Hilbert integral, and the algebraic eddy-viscosity formulation of Cebeci and Smith. The method is applied to airfoils subject to periodic and ramp-type motions and its abilities are examined for a range of angles of attack, reduced frequency, and pitch rate.

  18. The Professional Development Needs of School-Based Leadership in Preparation for a District-Wide One-to-One Initiative in a Large Urban School District

    ERIC Educational Resources Information Center

    Simmons, Brandon Dean

    2015-01-01

    The purpose of this study was to determine the professional development needs of school-based leadership in preparation for a district-wide one-to-one initiative in a large urban school district. This study used an explanatory sequential mixed methods design to answer the three research questions that drove the study. Research for this study was…

  19. Use of Information--LMC Connection

    ERIC Educational Resources Information Center

    Darrow, Rob

    2005-01-01

    Note taking plays an important part in correctly extracting information from reference sources. The "Cornell Note Taking Method," initially developed as a method of taking notes during a lecture, is well suited for taking notes from print sources and is one of the best "Use of Information" methods.

  20. Development of Reliability Based Life Prediction Methods for Thermal and Environmental Barrier Coatings in Ceramic Matrix Composites

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin

    2001-01-01

    A literature survey related to EBC/TBC (environmental barrier coating/thermal barrier coating) life models and failure mechanisms in EBC/TBC was carried out, the initial work plan for the proposed EBC/TBC life prediction methods development was developed, and the finite element model for the thermal/stress analysis of the GRC-developed EBC system was prepared. A technical report for these activities is given in the subsequent sections.

  1. A method to generate the surface cell layer of the 3D virtual shoot apex from apical initials.

    PubMed

    Kucypera, Krzysztof; Lipowczan, Marcin; Piekarska-Stachowiak, Anna; Nakielski, Jerzy

    2017-01-01

    The development of the cell pattern in the surface cell layer of the shoot apex can be investigated in vivo by use of time-lapse confocal images, showing the naked meristem in 3D at successive times. However, how this layer originates from apical initials and develops as a result of growth and divisions of their descendants remains unknown. This is an open area for computer modelling. A method to generate the surface cell layer is presented using the example of a 3D paraboloidal shoot apical dome. In the model, the layer originates from three apical initials that meet at the dome summit and develops through growth and cell divisions under isotropic surface growth, defined by the growth tensor. The cells, which are described by polyhedrons, divide anticlinally along the smallest division plane, which passes either through the cell center or through a point found randomly near this center, depending on the mode used. The formation of the surface cell pattern is described with attention paid to the activity of the apical initials and the fates of their descendants. The computer-generated surface layer, which included about 350 cells, required about 1200 divisions of the apical initials and their derivatives. The derivatives were arranged into three more or less equal clonal sectors composed of cellular clones of different ages. Each apical initial renewed itself 7-8 times to produce its sector. The successive divisions of each initial were manifested in the shape and location of the cellular clones. Applying the random factor resulted in a more realistic cell pattern in comparison to the pure mode. The cell divisions were analyzed statistically in top view. When all of the division walls were considered, their angular distribution was uniform, whereas in the distribution limited to the apical initials only, some preferences related to their arrangement at the dome summit were observed. A realistic surface cell pattern was obtained. The present method is a useful tool to generate the surface cell layer, to study the activity of initial cells and their derivatives, and to investigate how cell expansion and division are coordinated during growth. We expect its further application to clarify the question of the number and permanence or impermanence of initial cells, and the possible relationship between their shape and oriented divisions, both within the framework of the growth tensor approach.

  2. INTEGRATION OF SPATIAL DATA: EVALUATION OF METHODS BASED ON DATA ISSUES AND ASSESSMENT QUESTIONS

    EPA Science Inventory

    EPA's Regional Vulnerability Assessment (ReVA) Program has focused initially on the synthesis of existing data. We have used the same set of spatial data and synthesized these data using a total of 11 existing and newly developed integration methods. These methods were evaluated ...

  3. Capital update factor: a new era approaches.

    PubMed

    Grimaldi, P L

    1993-02-01

    The Health Care Financing Administration (HCFA) has constructed a preliminary model of a new capital update method which is consistent with the framework being developed to refine the update method for PPS operating costs. HCFA's eventual goal is to develop a single update framework for operating and capital costs. Initial results suggest that adopting the new capital update method would reduce capital payments substantially, which might intensify creditor's concerns about extending loans to hospitals.

  4. The Pixon Method for Data Compression Image Classification, and Image Reconstruction

    NASA Technical Reports Server (NTRS)

    Puetter, Richard; Yahil, Amos

    2002-01-01

    As initially proposed, this program had three goals: (1) continue to develop the highly successful Pixon method for image reconstruction and support other scientist in implementing this technique for their applications; (2) develop image compression techniques based on the Pixon method; and (3) develop artificial intelligence algorithms for image classification based on the Pixon approach for simplifying neural networks. Subsequent to proposal review the scope of the program was greatly reduced and it was decided to investigate the ability of the Pixon method to provide superior restorations of images compressed with standard image compression schemes, specifically JPEG-compressed images.

  5. Evolution of solar magnetic fields - A new approach to MHD initial-boundary value problems by the method of nearcharacteristics

    NASA Technical Reports Server (NTRS)

    Nakagawa, Y.

    1980-01-01

    A method of analysis for the MHD initial-boundary problem is presented in which the model's formulation is based on the method of nearcharacteristics developed by Werner (1968) and modified by Shin and Kot (1978). With this method, the physical causality relationship can be traced from the perturbation to the response as in the method of characteristics, while achieving the advantage of a considerable reduction in mathematical procedures. The method offers the advantage of examining not only the evolution of nonforce free fields, but also the changes of physical conditions in the atmosphere accompanying the evolution of magnetic fields. The physical validity of the method is demonstrated with examples, and their significance in interpreting observations is discussed.

  6. Precision pointing and control of flexible spacecraft

    NASA Technical Reports Server (NTRS)

    Bantell, M. H., Jr.

    1987-01-01

    The problem and long term objectives for the precision pointing and control of flexible spacecraft are given. The four basic objectives are stated in terms of two principle tasks. Under Task 1, robust low order controllers, improved structural modeling methods for control applications and identification methods for structural dynamics are being developed. Under Task 2, a lab test experiment for verification of control laws and system identification algorithms is being developed. For Task 1, work has focused on robust low order controller design and some initial considerations for structural modeling in control applications. For Task 2, work has focused on experiment design and fabrication, along with sensor selection and initial digital controller implementation. Conclusions are given.

  7. A commercially viable virtual reality knee arthroscopy training system.

    PubMed

    McCarthy, A D; Hollands, R J

    1998-01-01

    Arthroscopy is a minimally invasive form of surgery used to inspect joints. It is complex to learn, yet current training methods appear inadequate, thus negating the potential benefits to the patient. This paper describes the development and initial assessment of a cost-effective virtual reality based system for training surgeons in arthroscopy of the knee. The system runs on a PC. Initial assessments by surgeons have been positive, and current developments in deformable models are described.

  8. The Impact of Adaptive Complex Assessment on the HOT Skill Development of Students

    ERIC Educational Resources Information Center

    Raiyn, Jamal; Tilchin, Oleg

    2016-01-01

    In this paper we propose a method for the adaptive complex assessment (ACA) of the higher-order thinking (HOT) skills needed by students for problem solving, and we examine the impact of the method on the development of HOT skills in a problem-based learning (PBL) environment. Complexity in the assessment is provided by initial, formative, and…

  9. Developing Skills in Counselling and Psychotherapy: A Scoping Review of Interpersonal Process Recall and Reflecting Team Methods in Initial Therapist Training

    ERIC Educational Resources Information Center

    Meekums, Bonnie; Macaskie, Jane; Kapur, Tricia

    2016-01-01

    The authors conducted a scoping review of the peer-reviewed literature associated with Interpersonal Process Recall (IPR) and Reflecting Team (RT) methods in order to find evidence for their use within skills development in therapist trainings. Inclusion criteria were: empirical research, reviews of empirical research, and responses to these; RT…

  10. Aeroelastic stability and response of rotating structures

    NASA Technical Reports Server (NTRS)

    Keith, Theo G., Jr.

    1993-01-01

    A summary of the work performed during the progress period is presented. Analysis methods for predicting loads and instabilities of wind turbines were developed. Three new areas of research to aid the Advanced Turboprop Project (ATP) were initiated and developed. These three areas of research are aeroelastic analysis methods for cascades including blade and disk flexibility; stall flutter analysis; and computational aeroelasticity.

  11. Development of precursors recognition methods in vector signals

    NASA Astrophysics Data System (ADS)

    Kapralov, V. G.; Elagin, V. V.; Kaveeva, E. G.; Stankevich, L. A.; Dremin, M. M.; Krylov, S. V.; Borovov, A. E.; Harfush, H. A.; Sedov, K. S.

    2017-10-01

    Precursor recognition methods in vector signals of plasma diagnostics are presented. Their requirements and possible options for their development are considered. In particular, the variants of using symbolic regression for building a plasma disruption prediction system are discussed. The initial data preparation using correlation analysis and symbolic regression is discussed. Special attention is paid to the possibility of using algorithms in real time.

  12. Evaluating Phase II of a New York City-Wide STEM Initiative Using Propensity Score Methods: A Replication Study

    ERIC Educational Resources Information Center

    Thomas, Ally S.; Bonner, Sarah M.; Everson, Howard T.

    2014-01-01

    Recently, the authors have been exploring the use of propensity score methods for developing evidence of program impact. Specifically, they have been developing evidence (after one year of implementation) of the effects of the Math Science Partnership in New York City ("MSPinNYC2") on high school students' achievement--both in terms of…

  13. Development of methods for the restoration of the American elm in forested landscapes

    Treesearch

    James M. Slavicek

    2013-01-01

    A project was initiated in 2003 to establish test sites to develop methods to reintroduce the American elm (Ulmus americana L.) in forested landscapes. American elm tree strains with high levels of tolerance to Dutch elm disease (DED) were established in areas where the trees can naturally regenerate and spread. The process of regeneration will...

  14. Multilingual Phoneme Models for Rapid Speech Processing System Development

    DTIC Science & Technology

    2006-09-01

    processes are used to develop an Arabic speech recognition system starting from monolingual English models, International Phonetic Association (IPA)...clusters. It was found that multilingual bootstrapping methods outperform monolingual English bootstrapping methods on the Arabic evaluation data initially...

  15. Lab-on-a-Chip Based Protein Crystallization

    NASA Technical Reports Server (NTRS)

    vanderWoerd, Mark J.; Brasseur, Michael M.; Spearing, Scott F.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    We are developing a novel technique with which we will grow protein crystals in very small volumes, utilizing chip-based, microfluidic ("LabChip") technology. This development, which is a collaborative effort between NASA's Marshall Space Flight Center and Caliper Technologies Corporation, promises a breakthrough in the field of protein crystal growth. Our initial results obtained from two model proteins, Lysozyme and Thaumatin, show that it is feasible to dispense and adequately mix protein and precipitant solutions on a nano-liter scale. The mixtures have shown crystal growth in volumes in the range of 10 nanoliters to 5 microliters. In addition, large diffraction quality crystals were obtained by this method. X-ray data from these crystals were shown to be of excellent quality. Our future efforts will include the further development of protein crystal growth with LabChip(trademark) technology for more complex systems. We will initially address the batch growth method, followed by the vapor diffusion method and the liquid-liquid diffusion method. The culmination of these chip developments is to lead to an on-orbit protein crystallization facility on the International Space Station. Structural biologists will be invited to utilize the on-orbit Iterative Biological Crystallization facility to grow high quality macromolecular crystals in microgravity.

  16. Restructuring Kindergarten in an Urban School District: The Case of Newark, New Jersey.

    ERIC Educational Resources Information Center

    Kopacsi, Rosemarie; Hochwald, Eve

    A collaborative project of Bank Street College and the Newark Public Schools, the New Beginnings initiative was designed to bring about progressive restructuring of kindergarten classrooms. This study used a combination of qualitative and quantitative methods to examine the impact of the initiative on curriculum, professional development, and…

  17. Improving CME: Using Participant Satisfaction Measures to Specify Educational Methods

    ERIC Educational Resources Information Center

    Olivieri, Jason J.; Regala, Roderick P.

    2013-01-01

    Imagine having developed a continuing medical education (CME) initiative to educate physicians on updated guidelines regarding high cholesterol in adults. This initiative consisted of didactic presentations and case-based discussions offered in 5 major US cities, followed by a Web-based enduring component to distill key points of the live…

  18. Laboratory imaging of hydraulic fractures using microseismicity

    NASA Astrophysics Data System (ADS)

    Zeng, Zhengwen

    2002-09-01

    This dissertation starts with an investigation of the industry's needs for future research and development of hydraulic fracturing (HF) technology. Based on the results of a questionnaire answered by industrial experts, it was found that reliable hydraulic fracturing diagnostic techniques are needed. A further critical review showed that the microseismic method was one of the most promising techniques in need of further development. Developing robust algorithms and software for locating the coordinates of hydraulic fracturing-induced microseismic events, and for simulating the first motion of the induced waveforms, were central tasks of this research. In addition, the initiation and propagation characteristics of asymmetrical hydraulic fractures were investigated; a recently discovered tight gas sandstone was systematically characterized; a method for measuring Mode-I fracture toughness was upgraded; and the packer influence on the initiation of asymmetrical fractures was numerically simulated. By completing this research, the following contributions have been made: (1) Development of a simplex-based microseismic LOCATION program. This program overcame the ill-conditioning problems encountered in conventional location programs. (2) Development of a variance-based computer program, ArrTime, to automatically search for the first arrival times in the full waveform data. (3) Development of a first motion simulator for the induced microseismic waveforms. Using this program, the first motion waveform amplitude in any direction at any location, induced by seismic sources at an arbitrary location in a known fracturing mode, can be calculated. (4) Complete characterization of a newly discovered tight gas formation, the Jackfork sandstone. (5) Upgrade of a core sample-based method for the measurement of fracture toughness. Mode-I fracture toughness of common core samples can be measured in any direction using this method. (6) Discernment of the packer influence on HF initiation. It is numerically shown that a properly functioning packer transfers tensile stress concentrations from the sealed ends to the borehole wall in the maximum principal stress direction, whereas a malfunctioning packer induces tensile stress concentrations at the sealed ends that, in turn, induce transverse fractures. (7) Imaging of the dynamics of asymmetrical hydraulic fracture initiation and propagation.
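
    The simplex-based location step can be sketched as follows, assuming a homogeneous velocity model and synthetic sensor geometry, picks and noise (none of which are taken from the dissertation); the Nelder-Mead routine here stands in for the simplex search in the LOCATION program.

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(5)
      V = 3.0                                           # km/s, assumed medium velocity
      sensors = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0],
                          [1, 1, 0], [0.5, 0.5, 1.0]])  # sensor positions, km (invented)
      true_src, true_t0 = np.array([0.6, 0.3, 0.4]), 0.10

      def travel_times(src, t0):
          # straight-ray P-wave arrival times for a homogeneous medium
          return t0 + np.linalg.norm(sensors - src, axis=1) / V

      picks = travel_times(true_src, true_t0) + rng.normal(0, 0.002, len(sensors))  # noisy picks

      def misfit(params):
          src, t0 = params[:3], params[3]
          return np.sum((picks - travel_times(src, t0)) ** 2)   # travel-time residuals

      result = minimize(misfit, x0=[0.5, 0.5, 0.5, 0.0], method="Nelder-Mead")
      print("located source (km):", np.round(result.x[:3], 3),
            " origin time (s):", round(result.x[3], 3))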

  19. High speed transition prediction

    NASA Technical Reports Server (NTRS)

    Gasperas, Gediminis

    1993-01-01

    The main objective of this work period was to develop, maintain and exercise state-of-the-art methods for transition prediction in supersonic flow fields. Basic state and stability codes, acquired during the last work period, were exercised and applied to calculate the properties of various flowfields. The development of a code for the prediction of transition location using a currently novel method (the PSE or Parabolized Stability Equation method), initiated during the last work period and continued during the present work period, was cancelled at mid-year for budgetary reasons. Other activities during this period included the presentation of a paper at the APS meeting in Tallahassee, Florida entitled 'Stability of Two-Dimensional Compressible Boundary Layers', as well as the initiation of a paper co-authored with H. Reed of the Arizona State University entitled 'Stability of Boundary Layers'.

  20. Developments at the Advanced Design Technologies Testbed

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.; Livingston, Mary E.; Melton, John E.; Torres, Francisco J.; Stremel, Paul M.

    2003-01-01

    A report presents background and historical information, as of August 1998, on the Advanced Design Technologies Testbed (ADTT) at Ames Research Center. The ADTT is characterized as an activity initiated to facilitate improvements in aerospace design processes; provide a proving ground for product-development methods and computational software and hardware; develop bridging methods, software, and hardware that can facilitate integrated solutions to design problems; and disseminate lessons learned to the aerospace and information technology communities.

  1. Rapid Progressive Disease After Nivolumab Therapy in Three Patients with Metastatic Renal Cell Carcinoma

    PubMed Central

    KOBARI, YUKI; KONDO, TSUNENORI; TAKAGI, TOSHIO; OMAE, KENJI; NAKAZAWA, HAYAKAZU; TANABE, KAZUNARI

    2017-01-01

    Background/Aim: Rapid progressive disease (RPD), accelerated tumour growth immediately after the initiation of immune checkpoint inhibitor therapy, has been reported in melanoma and lung cancer. Herein, we describe 3 cases of RPD during the initial phase of nivolumab treatment for metastatic renal cell carcinoma. Patients and Methods: The first and second patients received nivolumab in the fourth-line setting. The third patient received nivolumab therapy as third-line treatment. Results: The first patient developed severe respiratory failure due to carcinomatous lymphangiosis 14 days after initiation of nivolumab therapy. The second patient developed leg paraplegia due to rapid growth of the metastatic tumour at the sixth thoracic vertebra 5 days later. The third patient developed grade 4 hypercalcemia due to RPD on day 3. Conclusion: Clinicians should be aware of RPD during the initial phase of nivolumab therapy, especially in patients with critical lesions in the late-line setting. PMID:28652455

  2. Local status and power in area-based health improvement partnerships.

    PubMed

    Powell, Katie; Thurston, Miranda; Bloyce, Daniel

    2014-11-01

    Area-based initiatives have formed an important part of public policy towards more socio-economically deprived areas in many countries. Co-ordinating service provision within and across sectors has been a common feature of these initiatives. Despite sustained policy interest in area-based initiatives, little empirical work has explored relations between area-based initiative providers, and partnership development within this context remains under-theorised. This article addresses both of these gaps by exploring partnerships as a social and developmental process, drawing on concepts from figurational sociology to explain how provider relations develop within an area-based initiative. Qualitative methods were used to explore, prospectively, the development of an area-based initiative targeted at a town in the north west of England. A central finding was that although effective delivery of area-based initiatives is premised on a high level of co-ordination between service providers, the pattern of interdependencies between providers limits the frequency and effectiveness of co-operation. In particular, the interdependency of area-based initiative providers with others in their organisation (what is termed here as 'organisational pull') constrained the ways in which they worked with providers outside of their own organisations. 'Local' status, which could be earned over time, enabled some providers to exert greater control over the way in which provider relations developed during the course of the initiative. These findings demonstrate how historically constituted social networks, within which all providers are embedded, shape partnership development. The theoretical insight developed here suggests a need for more realistic expectations among policymakers about how and to what extent provider partnerships can be managed. © The Author(s) 2014.

  3. Strengthening the evidence and action on multi-sectoral partnerships in public health: an action research initiative

    PubMed Central

    Willis, C. D.; Greene, J. K.; Abramowicz, A.; Riley, B. L.

    2016-01-01

    Introduction: The Public Health Agency of Canada's Multi-sectoral Partnerships Initiative, administered by the Centre for Chronic Disease Prevention (CCDP), brings together diverse partners to design, implement and advance innovative approaches for improving population health. This article describes the development and initial priorities of an action research project (a learning and improvement strategy) that aims to facilitate continuous improvement of the CCDP's partnership initiative and contribute to the evidence on multi-sectoral partnerships. Methods: The learning and improvement strategy for the CCDP's multi-sectoral partnership initiative was informed by (1) consultations with CCDP staff and senior management, and (2) a review of conceptual frameworks to do with multi-sectoral partnerships. Consultations explored the development of the multi-sectoral initiative, barriers and facilitators to success, and markers of effectiveness. Published and grey literature was reviewed using a systematic search strategy with findings synthesized using a narrative approach. Results: Consultations and the review highlighted the importance of understanding partnership impacts, developing a shared vision, implementing a shared measurement system and creating opportunities for knowledge exchange. With that in mind, we propose a six-component learning and improvement strategy that involves (1) prioritizing learning needs, (2) mapping needs to evidence, (3) using relevant data-collection methods, (4) analyzing and synthesizing data, (5) feeding data back to CCDP staff and teams and (6) taking action. Initial learning needs include investigating partnership reach and the unanticipated effects of multi-sectoral partnerships for individuals, groups, organizations or communities. Conclusion: While the CCDP is the primary audience for the learning and improvement strategy, it may prove useful for a range of audiences, including other government departments and external organizations interested in capturing and sharing new knowledge generated from multi-sectoral partnerships. PMID:27284702

  4. Stream-channel and watershed delineations and basin-characteristic measurements using lidar elevation data for small drainage basins within the Des Moines Lobe landform region in Iowa

    USGS Publications Warehouse

    Eash, David A.; Barnes, Kimberlee K.; O'Shea, Padraic S.; Gelder, Brian K.

    2018-02-14

    Basin-characteristic measurements related to stream length, stream slope, stream density, and stream order have been identified as significant variables for estimation of flood, flow-duration, and low-flow discharges in Iowa. The placement of channel initiation points, however, has always been a matter of individual interpretation, leading to differences in stream definitions between analysts. This study investigated five different methods to define stream initiation using 3-meter light detection and ranging (lidar) digital elevation model (DEM) data for 17 streamgages with drainage areas less than 50 square miles within the Des Moines Lobe landform region in north-central Iowa. Each DEM was hydrologically enforced and the five stream initiation methods were used to define channel initiation points and the downstream flow paths. The five different methods to define stream initiation were tested side-by-side for three watershed delineations: (1) the total drainage-area delineation, (2) an effective drainage-area delineation of basins based on a 2-percent annual exceedance probability (AEP) 12-hour rainfall, and (3) an effective drainage-area delineation based on a 20-percent AEP 12-hour rainfall. Generalized least squares regression analysis was used to develop a set of equations for sites in the Des Moines Lobe landform region for estimating discharges for ungaged stream sites with 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent AEPs. A total of 17 streamgages were included in the development of the regression equations. In addition, geographic information system software was used to measure 58 selected basin characteristics for each streamgage. Results of the regression analyses of the 15 lidar datasets indicate that the datasets that produce regional regression equations (RREs) with the best overall predictive accuracy are the National Hydrographic Dataset, Iowa Department of Natural Resources, and profile curvature of 0.5 stream initiation methods combined with the 20-percent AEP 12-hour rainfall watershed delineation method. These RREs have a mean average standard error of prediction (SEP) for 4-, 2-, and 1-percent AEP discharges of 53.9 percent and a mean SEP for all eight AEPs of 55.5 percent. Compared to the RREs developed in this study using the basin characteristics from the U.S. Geological Survey StreamStats application, the lidar basin characteristics provide better overall predictive accuracy.

  5. Origin and initiation mechanisms of neuroblastoma.

    PubMed

    Tsubota, Shoma; Kadomatsu, Kenji

    2018-05-01

    Neuroblastoma is an embryonal malignancy that affects normal development of the adrenal medulla and paravertebral sympathetic ganglia in early childhood. Extensive studies have revealed the molecular characteristics of human neuroblastomas, including abnormalities at the genome, epigenome and transcriptome levels. However, neuroblastoma initiation mechanisms, and even its origin, are long-standing mysteries. In this review article, we summarize the current knowledge about normal development of putative neuroblastoma sources, namely the sympathoadrenal lineage of neural crest cells and the Schwann cell precursors that were recently identified as the source of adrenal chromaffin cells. A plausible origin of enigmatic stage 4S neuroblastoma is also discussed. With regard to initiation mechanisms, we review genetic abnormalities in neuroblastomas and their possible association with initiation. We also summarize evidence of neuroblastoma initiation observed in genetically engineered animal models, in which epigenetic alterations were involved, including transcriptomic upregulation by N-Myc and downregulation by polycomb repressive complex 2. Finally, several in vitro experimental methods are proposed that will hopefully accelerate our comprehension of neuroblastoma initiation. Thus, this review summarizes the state-of-the-art knowledge about the mechanisms of neuroblastoma initiation, which is critical for developing new strategies to cure children with neuroblastoma.

  6. Software Dependability and Safety Evaluations ESA's Initiative

    NASA Astrophysics Data System (ADS)

    Hernek, M.

    ESA has allocated funds for an initiative to evaluate dependability and safety methods for software. The objectives of this initiative are (1) more extensive validation of safety and dependability techniques for software, and (2) provision of valuable results to improve software quality, thereby promoting the application of dependability and safety methods and techniques. ESA space systems are developed according to defined product assurance (PA) requirement specifications. These requirements may be implemented through various design concepts, e.g. redundancy, diversity, etc., varying from project to project. Analysis methods (FMECA, FTA, HA, etc.) are frequently used during requirements analysis and design activities to assure the correct implementation of system PA requirements. The criticality level of failures, functions and systems is determined, and in doing so the critical sub-systems are identified, on which dependability and safety techniques are to be applied during development. Proper performance of software development requires a technical specification for the products at the beginning of the life cycle. Such a technical specification comprises both functional and non-functional requirements; the non-functional requirements address characteristics of the product such as quality, dependability, safety and maintainability. Software in space systems is increasingly used in critical functions, and the trend towards more frequent use of COTS and reusable components poses new difficulties in assuring reliable and safe systems. Because of this, software dependability and safety must be carefully analysed. ESA identified and documented techniques, methods and procedures to ensure that software dependability and safety requirements are specified and taken into account during the design and development of a software system, and to verify/validate that the implemented software systems comply with these requirements [R1].

  7. Design and development of linked data from the National Map

    USGS Publications Warehouse

    Usery, E. Lynn; Varanka, Dalia E.

    2012-01-01

    The development of linked data on the World-Wide Web provides the opportunity for the U.S. Geological Survey (USGS) to supply its extensive volumes of geospatial data, information, and knowledge in a machine-interpretable form and reach users and applications that heretofore have been unavailable. To pilot a process to take advantage of this opportunity, the USGS is developing an ontology for The National Map and converting selected data from nine research test areas to a Semantic Web format to support machine processing and linked data access. In a case study, the USGS has developed initial methods for accessing legacy vector- and raster-formatted geometry, attributes, and spatial relationships in a linked data environment while maintaining the capability to generate graphic or image output from semantic queries. The description of an initial USGS approach to developing the ontology, linked data, and initial query capability from The National Map databases is presented.
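
    A minimal sketch of what linked data access with semantic queries can look like in practice, using rdflib with a hypothetical namespace and hand-made triples; the class and predicate names are invented and do not reflect the actual USGS ontology for The National Map.

```python
# Minimal linked-data sketch with rdflib; the namespace, classes, predicates and
# triples below are hypothetical illustrations only.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, RDFS

TNM = Namespace("http://example.org/tnm#")   # hypothetical namespace
g = Graph()
g.bind("tnm", TNM)

# Hypothetical feature: a stream segment with a name and a downstream connection
seg = URIRef("http://example.org/tnm/feature/12345")
g.add((seg, RDF.type, TNM.StreamSegment))
g.add((seg, RDFS.label, Literal("Example Creek")))
g.add((seg, TNM.flowsInto, URIRef("http://example.org/tnm/feature/67890")))

# Semantic query: find all stream segments and where they flow
q = """
PREFIX tnm: <http://example.org/tnm#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?name ?downstream WHERE {
    ?s a tnm:StreamSegment ;
       rdfs:label ?name ;
       tnm:flowsInto ?downstream .
}
"""
for name, downstream in g.query(q):
    print(name, "->", downstream)
```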

  8. Disaster Risk Reduction in Myanmar: A Need for Focus on Community Preparedness and Improved Evaluation of Initiatives.

    PubMed

    Smith, Andrew D; Chan, Emily Y Y

    2017-11-20

    Myanmar is a country in political and economic transition. Facing a wide variety of natural hazards and ongoing conflict, the country's under-developed infrastructure has resulted in high disaster risk. Following the devastation of Cyclone Nargis in 2008 and increased global focus on disaster management and risk reduction, Myanmar has begun development of national disaster policies. Myanmar's Action Plan for Disaster Risk Reduction addressed multiple stages of disaster development and has made progress towards national projects; however, it has struggled to implement community-based preparedness and response initiatives. This article analyses Myanmar's disaster strategy through the use of a disaster development framework and suggests areas for possible improvement. In particular, the article aims to generate discussion regarding methods of supporting objective evaluation of risk reduction initiatives in developing countries. (Disaster Med Public Health Preparedness. 2017;page 1 of 5).

  9. A comparison between Gauss-Newton and Markov chain Monte Carlo based methods for inverting spectral induced polarization data for Cole-Cole parameters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Jinsong; Kemna, Andreas; Hubbard, Susan S.

    2008-05-15

    We develop a Bayesian model to invert spectral induced polarization (SIP) data for Cole-Cole parameters using Markov chain Monte Carlo (MCMC) sampling methods. We compare the performance of the MCMC-based stochastic method with an iterative Gauss-Newton-based deterministic method for Cole-Cole parameter estimation through inversion of synthetic and laboratory SIP data. The Gauss-Newton-based method can provide an optimal solution for given objective functions under constraints, but the obtained optimal solution generally depends on the choice of initial values, and the estimated uncertainty information is often inaccurate or insufficient. In contrast, the MCMC-based inversion method provides extensive global information on unknown parameters, such as the marginal probability distribution functions, from which we can obtain better estimates and tighter uncertainty bounds of the parameters than with the deterministic method. Additionally, the results obtained with the MCMC method are independent of the choice of initial values. Because the MCMC-based method does not explicitly offer a single optimal solution for given objective functions, the deterministic and stochastic methods can complement each other. For example, the stochastic method can first be used to obtain the means of the unknown parameters by starting from an arbitrary set of initial values, and the deterministic method can then be initiated using the means as starting values to obtain the optimal estimates of the Cole-Cole parameters.
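
    The complementary two-stage workflow described above can be sketched as follows for a synthetic single Cole-Cole (Pelton) model: a simple random-walk Metropolis sampler estimates parameter means from an arbitrary starting point, and a deterministic least-squares refinement is then initiated at those means. This is a toy illustration, not the authors' Bayesian formulation or their Gauss-Newton implementation.

```python
# Toy two-stage inversion: random-walk Metropolis for Cole-Cole parameter means,
# then a deterministic least-squares fit started from those means.
import numpy as np
from scipy.optimize import least_squares

def cole_cole(omega, rho0, m, tau, c):
    """Pelton Cole-Cole complex resistivity model."""
    return rho0 * (1.0 - m * (1.0 - 1.0 / (1.0 + (1j * omega * tau) ** c)))

def residuals(p, omega, data):
    model = cole_cole(omega, *p)
    return np.concatenate([(model - data).real, (model - data).imag])

rng = np.random.default_rng(0)
omega = 2 * np.pi * np.logspace(-2, 3, 30)
true_p = np.array([100.0, 0.3, 0.05, 0.5])           # rho0, m, tau, c (hypothetical)
noise = 0.5 * (rng.standard_normal(omega.size) + 1j * rng.standard_normal(omega.size))
data = cole_cole(omega, *true_p) + noise

def log_like(p):
    if np.any(p <= 0) or p[1] >= 1 or p[3] >= 1:      # keep m and c in (0, 1)
        return -np.inf
    r = residuals(p, omega, data)
    return -0.5 * np.sum(r ** 2) / 0.5 ** 2

# Stage 1: random-walk Metropolis from an arbitrary starting point
p = np.array([50.0, 0.5, 0.01, 0.7])
ll = log_like(p)
step = np.array([2.0, 0.02, 0.005, 0.02])
samples = []
for _ in range(20000):
    prop = p + step * rng.standard_normal(4)
    ll_prop = log_like(prop)
    if np.log(rng.random()) < ll_prop - ll:
        p, ll = prop, ll_prop
    samples.append(p)
posterior_mean = np.mean(samples[5000:], axis=0)      # discard burn-in

# Stage 2: deterministic refinement initiated at the posterior means
fit = least_squares(residuals, posterior_mean, args=(omega, data))
print("posterior means :", posterior_mean)
print("refined estimate:", fit.x)
```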

  10. Analysis and Inverse Design of the HSR Arrow Wing Configuration with Fuselage, Wing, and Flow Through Nacelles

    NASA Technical Reports Server (NTRS)

    Krist, Steven E.; Bauer, Steven X. S.

    1999-01-01

    The design process for developing the natural flow wing design on the HSR arrow wing configuration utilized several design tools and analysis methods. Initial fuselage/wing designs were generated with inviscid analysis and optimization methods in conjunction with the natural flow wing design philosophy. A number of designs were generated, satisfying different system constraints. Of the three natural flow wing designs developed, the NFWAc2 configuration is the design that satisfies the constraints utilized by McDonnell Douglas Aerospace (MDA) in developing a series of optimized configurations; a wind tunnel model of the MDA-designed OPT5 configuration was constructed and tested. The present paper is concerned with the viscous analysis and inverse design of the arrow wing configurations, including the effects of the installed diverters/nacelles. Analyses were conducted with OVERFLOW, a Navier-Stokes flow solver for overset grids. Inverse designs were conducted with OVERDISC, which couples OVERFLOW with the CDISC inverse design method. An initial system of overset grids was generated for the OPT5 configuration with installed diverters/nacelles. An automated regridding process was then developed to use the OPT5 component grids to create grids for the natural flow wing designs. The inverse design process was initiated using the NFWAc2 configuration as a starting point, eventually culminating in the NFWAc4 design, for which a wind tunnel model was constructed. Due to the time constraints on the design effort, initial analyses and designs were conducted with a fairly coarse grid; subsequent analyses have been conducted on a refined system of grids. Comparisons of the computational results to experiment are provided at the end of this paper.

  11. 75 FR 6031 - Policy Paper on Revised Risk Assessment Methods for Workers, Children of Workers in Agricultural...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-05

    ... and its relationship to several other key Agency initiatives that are currently under development and... Assessment Methods for Workers, Children of Workers in Agricultural Fields, and Pesticides with No Food Uses... for comment a policy paper entitled ``Revised Risk Assessment Methods for Workers, Children of Workers...

  12. Use of a Numerical Strategy Framework in the Professional Development of Teachers

    ERIC Educational Resources Information Center

    Laxman, Kumar; Hughes, Peter

    2015-01-01

    Derived initially from a strategic analysis of children's methods of counting, the New Zealand Numeracy Projects used, as a starting point for the professional development of teachers, a strategy framework that traces children's development in number reasoning. A pilot study indicated the usefulness of professional development where teachers use…

  13. A COMPARISON OF TWO RAPID BIOLOGICAL ASSESSMENT SAMPLING METHODS FOR MACROINVERTEBRATES

    EPA Science Inventory

    In 2003, the Office of Research and Developments (ORD's) National Exposure Research Laboratory initiated a collaborative research effort with U.S. EPA Region 3 to conduct a study comparing two rapid biological assessment methods for collecting stream macroinvertebrates. One metho...

  14. Triaxial- and uniaxial-compression testing methods developed for extraction of pore water from unsaturated tuff, Yucca Mountain, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mower, T.E.; Higgins, J.D.; Yang, I.C.

    1989-12-31

    To support the study of the hydrologic system in the unsaturated zone at Yucca Mountain, Nevada, two extraction methods were examined to obtain representative, uncontaminated pore-water samples from unsaturated tuff. Results indicate that triaxial compression, which uses a standard cell, can remove pore water from nonwelded tuff that has an initial moisture content greater than 11% by weight; uniaxial compression, which uses a specifically fabricated cell, can extract pore water from nonwelded tuff that has an initial moisture content greater than 8% and from welded tuff that has an initial moisture content greater than 6.5%. For the ambient moisture conditions of Yucca Mountain tuffs, uniaxial compression is the most efficient method of pore-water extraction. 12 refs., 7 figs., 2 tabs.

  15. Fatigue and fracture: Overview

    NASA Technical Reports Server (NTRS)

    Halford, G. R.

    1984-01-01

    A brief overview of the status of the fatigue and fracture programs is given. The programs involve the development of appropriate analytic material behavior models for cyclic stress-strain-temperature-time response, cyclic crack initiation, and cyclic crack propagation. The underlying thrust of these programs is the development and verification of workable engineering methods for the calculation, in advance of service, of the local cyclic stress-strain response at the critical life-governing location in hot section components, and the resultant crack initiation and crack growth lifetimes.

  16. [Perioperative management of a patient with myotonic dystrophy developing the cardiac symptoms initially prior to the neuromuscular symptoms].

    PubMed

    Wake, M; Matsushita, M; Aono, H; Matsumoto, M; Kohri, Y

    1994-08-01

    The authors anesthetized a 48-year-old woman with endometrial cancer and a large ovarian cyst. She initially developed cardiac failure, followed by sick sinus syndrome and A-V block from hypertrophic cardiomyopathy, prior to the onset of neuromuscular symptoms. Epidural anesthesia assisted by general anesthesia was carried out safely without intravenous administration of any muscle relaxants. From this experience, it is considered that epidural anesthesia, supplemented with other appropriate methods, is suitable for surgery of the lower abdomen.

  17. Impact of the Alzheimer's Disease Neuroimaging Initiative, 2004 to 2014.

    PubMed

    Weiner, Michael W; Veitch, Dallas P; Aisen, Paul S; Beckett, Laurel A; Cairns, Nigel J; Cedarbaum, Jesse; Donohue, Michael C; Green, Robert C; Harvey, Danielle; Jack, Clifford R; Jagust, William; Morris, John C; Petersen, Ronald C; Saykin, Andrew J; Shaw, Leslie; Thompson, Paul M; Toga, Arthur W; Trojanowski, John Q

    2015-07-01

    The Alzheimer's Disease Neuroimaging Initiative (ADNI) was established in 2004 to facilitate the development of effective treatments for Alzheimer's disease (AD) by validating biomarkers for AD clinical trials. We searched for ADNI publications using established methods. ADNI has (1) developed standardized biomarkers for use in clinical trial subject selection and as surrogate outcome measures; (2) standardized protocols for use across multiple centers; (3) initiated worldwide ADNI; (4) inspired initiatives investigating traumatic brain injury and post-traumatic stress disorder in military populations, as well as depression, as AD risk factors; (5) acted as a data-sharing model; (6) generated data used in over 600 publications, leading to the identification of novel AD risk alleles, and an understanding of the relationship between biomarkers and AD progression; and (7) inspired other public-private partnerships developing biomarkers for Parkinson's disease and multiple sclerosis. ADNI has made myriad impacts in its first decade. A competitive renewal of the project in 2015 would see the use of newly developed tau imaging ligands, and the continued development of recruitment strategies and outcome measures for clinical trials. Copyright © 2015 The Alzheimer's Association. All rights reserved.

  18. Text-in-context: a method for extracting findings in mixed-methods mixed research synthesis studies.

    PubMed

    Sandelowski, Margarete; Leeman, Jennifer; Knafl, Kathleen; Crandell, Jamie L

    2013-06-01

    Our purpose in this paper is to propose a new method for extracting findings from research reports included in mixed-methods mixed research synthesis studies. International initiatives in the domains of systematic review and evidence synthesis have been focused on broadening the conceptualization of evidence, increased methodological inclusiveness and the production of evidence syntheses that will be accessible to and usable by a wider range of consumers. Initiatives in the general mixed-methods research field have been focused on developing truly integrative approaches to data analysis and interpretation. The data extraction challenges described here were encountered, and the method proposed for addressing these challenges was developed, in the first year of the ongoing (2011-2016) study: Mixed-Methods Synthesis of Research on Childhood Chronic Conditions and Family. To preserve the text-in-context of findings in research reports, we describe a method whereby findings are transformed into portable statements that anchor results to relevant information about sample, source of information, time, comparative reference point, magnitude and significance and study-specific conceptions of phenomena. The data extraction method featured here was developed specifically to accommodate mixed-methods mixed research synthesis studies conducted in nursing and other health sciences, but reviewers might find it useful in other kinds of research synthesis studies. This data extraction method itself constitutes a type of integration to preserve the methodological context of findings when statements are read individually and in comparison to each other. © 2012 Blackwell Publishing Ltd.

  19. Investigating the Baseline Skills of Research Students Using a Competency-Based Self-Assessment Method

    ERIC Educational Resources Information Center

    Bromley, Anthony P.; Boran, James R.; Myddelton, William A.

    2007-01-01

    Recent government-led initiatives are changing the nature of the UK PhD to support the greater development of transferable skills. There are similar initiatives internationally. A key requirement and challenge is to effectively assess the "baseline" skills of a cohort on entry to a research programme and then monitor their progress in…

  20. The CAST Initiative in Guam: A Model of Effective Teachers Teaching Teachers

    ERIC Educational Resources Information Center

    Zuercher, Deborah K.; Kessler, Cristy; Yoshioka, Jon

    2011-01-01

    The CAST (content area specialized training) model of professional development enables sustainable teacher leadership and is responsive to the need for culturally relevant educational practices. The purpose of this paper is to share the background, methods, findings and recommendations of a case study on the CAST initiative in Guam. The case study…

  1. Physical Interventions for Adults with Intellectual Disabilities: Survey of Use, Policy, Training and Monitoring

    ERIC Educational Resources Information Center

    Deveau, Roy; McGill, Peter

    2009-01-01

    Background: Perceived problems around the use of physical intervention (PI) to manage challenging behaviour have led to UK initiatives to encourage policy development and accredited training. However, information on PI use and the impact of these initiatives remains limited. Method: Adult residential services within an English region were sent a…

  2. Developing Tomorrow's Talent: The Case of an Undergraduate Mentoring Programme

    ERIC Educational Resources Information Center

    Gannon, Judie M.; Maher, Angela

    2012-01-01

    Purpose: The purpose of this paper is to explore the value of an alumni and employer engagement mentoring initiative in a hospitality and tourism school within a UK university. Design/methodology/approach: The paper uses the survey method and interviews to provide qualitative and quantitative data on the participants' reactions to the initiative.…

  3. The characteristics of national health initiatives promoting earlier cancer diagnosis among adult populations: a systematic review protocol

    PubMed Central

    Calanzani, Natalia; Weller, David; Campbell, Christine

    2017-01-01

    Introduction The increasing burden of cancer morbidity and mortality has led to the development of national health initiatives to promote earlier cancer diagnosis and improve cancer survival. This protocol describes a systematic review aiming to identify the evidence about such initiatives among the adult population. We will describe their components, stakeholders and target populations, and summarise their outcomes. Methods and analysis We will search databases and websites for peer-reviewed publications and grey literature on national health initiatives in high-income countries as defined by the World Bank. Quantitative, qualitative and mixed-methods studies will be included and assessed for their methodological quality. Study selection, quality assessment and data extraction will be carried out independently by two reviewers. Narrative synthesis will be used to analyse the findings. Ethics and dissemination This systematic review analyses secondary data and ethical approval is not required. Review findings will be helpful to researchers, policy makers, governments and other key stakeholders developing similar initiatives and assessing cancer outcomes. The results will be submitted to a peer-reviewed journal in order to reach a diverse group of healthcare professionals, researchers and policy makers. This systematic review protocol is registered at PROSPERO (CRD42016047233). PMID:28698336

  4. 48 CFR 715.370-2 - Title XII selection procedure-collaborative assistance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... AGENCY FOR INTERNATIONAL DEVELOPMENT CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION... contracting system is appropriate. See AIDR appendix F (of this chapter)—Use of Collaborative Assistance... initiating any contract actions under the collaborative assistance method: (1) The cognizant technical office...

  5. 48 CFR 715.370-2 - Title XII selection procedure-collaborative assistance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... AGENCY FOR INTERNATIONAL DEVELOPMENT CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION... contracting system is appropriate. See AIDR appendix F (of this chapter)—Use of Collaborative Assistance... initiating any contract actions under the collaborative assistance method: (1) The cognizant technical office...

  6. Preparation for an online asynchronous university doctoral course. Lessons learned.

    PubMed

    Milstead, J A; Nelson, R

    1998-01-01

    This article addresses the development of the initial course in the first completely online doctoral program in nursing. Synchronous and asynchronous methods of distance education were assessed. Planning focused at the university, school, and course levels. University planning involved the technical infrastructure, registration, student services, and library services. School planning examined administrative commitment and faculty commitment and willingness. Course planning focused on marketing, precourse information, time frame, modular design, planned interaction, and professor availability and support. Implementation issues centered on getting students connected, learning the software, changing instructional methods, and managing chats. Traditional methods of evaluating student learning and course evaluation were supplemented with the development of qualitative and quantitative tools to gather data for making administrative decisions. The Dean and faculty agreed that the Internet was an effective method of delivering content in the initial Health Policy course and decided to continue the PhD program online for one cohort while continuing to evaluate student progress and faculty and student satisfaction.

  7. Monitoring River Water Levels from Space: Quality Assessment of 20 Years of Satellite Altimetry Data

    NASA Astrophysics Data System (ADS)

    Bercher, Nicolas; Kosuth, Pascal

    2013-09-01

    This paper presents the results of 20 years of validation of altimetry data for the monitoring of river water levels using a standardized method. The method was initially developed by Cemagref (2006-2011, [5, 6, 3]), now Irstea; its implementation is now pursued at LEGOS. Our initial statement was: "what if someone wants to use satellite measurements of river water levels?" The obvious question that comes to mind is "what is the quality of the data?". Moreover, there is also a need, and a demand from data producers, to monitor product quality in a standardized fashion. We addressed such questions and developed a method to assess the quality of so-called "Alti-Hydro products". The method was implemented for the following Alti-Hydro products (those marked * were automatically derived from a L2 product): AVISO* (Topex/Poseidon, Jason-2), CASH project (Topex/Poseidon), HydroWeb (Topex/Poseidon, ENVISAT), River & Lake Hydrology (ERS-2, ENVISAT) and PISTACH* (Jason-2).
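
    One ingredient of any such quality assessment is comparing an altimetric water-level series against an in-situ reference after temporal matching; the sketch below computes bias and RMSE for hypothetical series and is not the standardized Cemagref/Irstea procedure.

```python
# Illustrative only: match altimetric water levels to the nearest in-situ gauge
# reading by date, then compute bias and RMSE. All values are hypothetical.
import numpy as np
import pandas as pd

gauge = pd.Series([10.2, 10.5, 11.1, 12.0, 11.4],
                  index=pd.to_datetime(["2002-03-01", "2002-03-11", "2002-03-21",
                                        "2002-03-31", "2002-04-10"]))
altimetry = pd.Series([10.6, 11.0, 12.3, 11.2],
                      index=pd.to_datetime(["2002-03-10", "2002-03-20",
                                            "2002-03-30", "2002-04-09"]))

# Match each altimetric measurement to the nearest gauge reading within 2 days
matched = pd.merge_asof(altimetry.rename("alt").sort_index().reset_index(),
                        gauge.rename("gauge").sort_index().reset_index(),
                        on="index", direction="nearest",
                        tolerance=pd.Timedelta(days=2)).dropna()

diff = matched["alt"] - matched["gauge"]
print("bias (m):", diff.mean(), " RMSE (m):", np.sqrt((diff ** 2).mean()))
```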

  8. Individual Differences in Initial Sensitivity and Acute Tolerance Predict Patterns of Chronic Drug Tolerance to Nitrous-Oxide-Induced Hypothermia in Rats

    PubMed Central

    Ramsay, Douglas S.; Kaiyala, Karl J.; Leroux, Brian G.; Woods, Stephen C.

    2006-01-01

    Rationale: A preventive strategy for drug addiction would benefit from being able to identify vulnerable individuals. Understanding how an individual responds during an initial drug exposure may be useful for predicting how that individual will respond to repeated drug administrations. Objectives: This study investigated whether individual differences in initial drug sensitivity and acute tolerance can predict how chronic tolerance develops. Methods: During an initial 3-h administration of 60% nitrous oxide (N2O), male Long-Evans rats were screened for N2O’s hypothermic effect and divided into subsets based on being initially insensitive (II), sensitive with acute tolerance (AT), or sensitive with no intrasessional recovery (NR). Animals in each individual difference category were randomly assigned to receive six 90-min exposures of either 60% N2O or placebo gas. Core temperature was measured telemetrically. Results: Rats that exhibited a comparable degree of hypothermia during an initial N2O exposure, but differed in acute tolerance development, developed different patterns of chronic tolerance. Specifically, the NR group did not become fully tolerant over repeated N2O exposures, while the AT group developed an initial hyperthermia followed by a return of core temperature to control levels indicative of full tolerance development. By the second N2O exposure, the II group breathing N2O became hyperthermic relative to the placebo control group and this hyperthermia persisted throughout the multiple N2O exposures. Conclusions: Individual differences in initial drug sensitivity and acute tolerance development predict different patterns of chronic tolerance. The hypothesis is suggested that individual differences in opponent adaptive responses may mediate this relationship. PMID:15778887

  9. Simultaneous Polymerization and Polypeptide Particle Production via Reactive Spray-Drying

    PubMed Central

    2016-01-01

    A method for producing polypeptide particles via in situ polymerization of N-carboxyanhydrides during spray-drying has been developed. This method was enabled by the development of a fast and robust synthetic pathway to polypeptides using 1,8-diazabicyclo[5.4.0]undec-7-ene (DBU) as an initiator for the ring-opening polymerization of N-carboxyanhydrides. The polymerizations finished within 5 s and proved to be very tolerant toward impurities such as amino acid salts and water. The particles were formed by mixing the monomer, N-carboxyanhydride of l-glutamic acid benzyl ester (NCAGlu), and the initiator (DBU) during the atomization process in the spray-dryer and were spherical with a size of ∼1 μm. This method combines two steps, making it a straightforward process that facilitates the production of polypeptide particles. Hence, it furthers the use of spray-drying and polypeptide particles in the pharmaceutical industry. PMID:27445061

  10. Simultaneous Polymerization and Polypeptide Particle Production via Reactive Spray-Drying.

    PubMed

    Glavas, Lidija; Odelius, Karin; Albertsson, Ann-Christine

    2016-09-12

    A method for producing polypeptide particles via in situ polymerization of N-carboxyanhydrides during spray-drying has been developed. This method was enabled by the development of a fast and robust synthetic pathway to polypeptides using 1,8-diazabicyclo[5.4.0]undec-7-ene (DBU) as an initiator for the ring-opening polymerization of N-carboxyanhydrides. The polymerizations finished within 5 s and proved to be very tolerant toward impurities such as amino acid salts and water. The particles were formed by mixing the monomer, N-carboxyanhydride of l-glutamic acid benzyl ester (NCAGlu), and the initiator (DBU) during the atomization process in the spray-dryer and were spherical with a size of ∼1 μm. This method combines two steps, making it a straightforward process that facilitates the production of polypeptide particles. Hence, it furthers the use of spray-drying and polypeptide particles in the pharmaceutical industry.

  11. Global quasi-linearization (GQL) versus QSSA for a hydrogen-air auto-ignition problem.

    PubMed

    Yu, Chunkan; Bykov, Viatcheslav; Maas, Ulrich

    2018-04-25

    A recently developed automatic reduction method for systems of chemical kinetics, the so-called Global Quasi-Linearization (GQL) method, has been implemented to study and reduce the dimensions of a homogeneous combustion system. The results of application of the GQL and the Quasi-Steady State Assumption (QSSA) are compared. A number of drawbacks of the QSSA are discussed, i.e. the selection criteria of QSS-species and its sensitivity to system parameters, initial conditions, etc. To overcome these drawbacks, the GQL approach has been developed as a robust, automatic and scaling invariant method for a global analysis of the system timescale hierarchy and subsequent model reduction. In this work the auto-ignition problem of the hydrogen-air system is considered in a wide range of system parameters and initial conditions. The potential of the suggested approach to overcome most of the drawbacks of the standard approaches is illustrated.
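
    As a generic illustration of the timescale-hierarchy idea that both QSSA and GQL exploit (and not the GQL algorithm itself), the following toy example exposes a fast/slow gap from the eigenvalues of a linear kinetics Jacobian; the mechanism and rate constants are invented.

```python
# Generic illustration of timescale-hierarchy analysis (not the GQL algorithm):
# the eigenvalues of a toy linear kinetics Jacobian reveal a clear gap between
# fast and slow modes, the kind of gap model-reduction methods exploit.
import numpy as np

# Toy mechanism: A -> B (fast, k1), B -> C (slow, k2); state y = [A, B, C]
k1, k2 = 1.0e4, 1.0
J = np.array([[-k1, 0.0, 0.0],
              [ k1, -k2, 0.0],
              [0.0,  k2, 0.0]])

eigvals = np.linalg.eigvals(J)
timescales = np.full(eigvals.shape, np.inf)
finite = np.abs(eigvals.real) > 1e-12
timescales[finite] = -1.0 / eigvals[finite].real

for lam, tau in sorted(zip(eigvals.real, timescales), key=lambda p: p[1]):
    print(f"eigenvalue {lam: .3e}  ->  timescale {tau: .3e} s")
# A gap of several orders of magnitude between the fastest and slowest finite
# timescales indicates that the fast mode can be relaxed away in a reduced model.
```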

  12. Comparing methods for assessing the effectiveness of subnational REDD+ initiatives

    NASA Astrophysics Data System (ADS)

    Bos, Astrid B.; Duchelle, Amy E.; Angelsen, Arild; Avitabile, Valerio; De Sy, Veronique; Herold, Martin; Joseph, Shijo; de Sassi, Claudio; Sills, Erin O.; Sunderlin, William D.; Wunder, Sven

    2017-07-01

    The central role of forests in climate change mitigation, as recognized in the Paris agreement, makes it increasingly important to develop and test methods for monitoring and evaluating the carbon effectiveness of REDD+. Over the last decade, hundreds of subnational REDD+ initiatives have emerged, presenting an opportunity to pilot and compare different approaches to quantifying impacts on carbon emissions. This study (1) develops a Before-After-Control-Intervention (BACI) method to assess the effectiveness of these REDD+ initiatives; (2) compares the results at the meso (initiative) and micro (village) scales; and (3) compares BACI with the simpler Before-After (BA) results. Our study covers 23 subnational REDD+ initiatives in Brazil, Peru, Cameroon, Tanzania, Indonesia and Vietnam. As a proxy for deforestation, we use annual tree cover loss. We aggregate data into two periods (before and after the start of each initiative). Analysis using control areas (‘control-intervention’) suggests better REDD+ performance, although the effect is more pronounced at the micro than at the meso level. Yet, BACI requires more data than BA, and is subject to possible bias in the before period. Selection of proper control areas is vital, but at either scale is not straightforward. Low absolute deforestation numbers and peak years influence both our BA and BACI results. In principle, BACI is superior, with its potential to effectively control for confounding factors. We conclude that the more local the scale of performance assessment, the more relevant is the use of the BACI approach. For various reasons, we find overall minimal impact of REDD+ in reducing deforestation on the ground thus far. Incorporating results from micro and meso level monitoring into national reporting systems is important, since overall REDD+ impact depends on land use decisions on the ground.
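
    The difference between the BA and BACI estimators reduces to a difference-in-differences; the sketch below uses hypothetical tree-cover-loss numbers, not data from the 23 initiatives.

```python
# Minimal BACI sketch: a difference-in-differences of mean annual tree cover
# loss, aggregated into before/after periods. All numbers are hypothetical.

# Mean tree cover loss (% per year)
intervention_before, intervention_after = 1.8, 1.2
control_before, control_after = 1.7, 1.6

# Before-After (BA) effect: ignores what happened in the control area
ba_effect = intervention_after - intervention_before

# BACI effect: change in the intervention area minus change in the control area
baci_effect = ((intervention_after - intervention_before)
               - (control_after - control_before))

print(f"BA estimate of REDD+ impact:   {ba_effect:+.2f} percentage points")
print(f"BACI estimate of REDD+ impact: {baci_effect:+.2f} percentage points")
```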

  13. STRengthening Analytical Thinking for Observational Studies: the STRATOS initiative

    PubMed Central

    Sauerbrei, Willi; Abrahamowicz, Michal; Altman, Douglas G; le Cessie, Saskia; Carpenter, James

    2014-01-01

    The validity and practical utility of observational medical research depends critically on good study design, excellent data quality, appropriate statistical methods and accurate interpretation of results. Statistical methodology has seen substantial development in recent times. Unfortunately, many of these methodological developments are ignored in practice. Consequently, design and analysis of observational studies often exhibit serious weaknesses. The lack of guidance on vital practical issues discourages many applied researchers from using more sophisticated and possibly more appropriate methods when analyzing observational studies. Furthermore, many analyses are conducted by researchers with a relatively weak statistical background and limited experience in using statistical methodology and software. Consequently, even ‘standard’ analyses reported in the medical literature are often flawed, casting doubt on their results and conclusions. An efficient way to help researchers to keep up with recent methodological developments is to develop guidance documents that are spread to the research community at large. These observations led to the initiation of the strengthening analytical thinking for observational studies (STRATOS) initiative, a large collaboration of experts in many different areas of biostatistical research. The objective of STRATOS is to provide accessible and accurate guidance in the design and analysis of observational studies. The guidance is intended for applied statisticians and other data analysts with varying levels of statistical education, experience and interests. In this article, we introduce the STRATOS initiative and its main aims, present the need for guidance documents and outline the planned approach and progress so far. We encourage other biostatisticians to become involved. PMID:25074480

  14. Robust, Optimal Subsonic Airfoil Shapes

    NASA Technical Reports Server (NTRS)

    Rai, Man Mohan

    2014-01-01

    A method has been developed to create an airfoil robust enough to operate satisfactorily in different environments. This method determines a robust, optimal, subsonic airfoil shape, beginning with an arbitrary initial airfoil shape, and imposes the necessary constraints on the design. Also, this method is flexible and extendible to a larger class of requirements and changes in constraints imposed.

  15. Final report on cermet high-level waste forms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kobisk, E.H.; Quinby, T.C.; Aaron, W.S.

    1981-08-01

    Cermets are being developed as an alternate method for the fixation of defense and commercial high level radioactive waste in a terminal disposal form. Following initial feasibility assessments of this waste form, consisting of ceramic particles dispersed in an iron-nickel base alloy, significantly improved processing methods were developed. The characterization of cermets has continued through property determinations on samples prepared by various methods from a variety of simulated and actual high-level wastes. This report describes the status of development of the cermet waste form as it has evolved since 1977. 6 tables, 18 figures.

  16. Developing community-driven quality improvement initiatives to enhance chronic disease care in Indigenous communities in Canada: the FORGE AHEAD program protocol.

    PubMed

    Naqshbandi Hayward, Mariam; Paquette-Warren, Jann; Harris, Stewart B

    2016-07-26

    Given the dramatic rise and impact of chronic diseases and gaps in care in Indigenous peoples in Canada, a shift from the dominant episodic and responsive healthcare model most common in First Nations communities to one that places emphasis on proactive prevention and chronic disease management is urgently needed. The Transformation of Indigenous Primary Healthcare Delivery (FORGE AHEAD) Program partners with 11 First Nations communities across six provinces in Canada to develop and evaluate community-driven quality improvement (QI) initiatives to enhance chronic disease care. FORGE AHEAD is a 5-year research program (2013-2017) that utilizes a pre-post mixed-methods observational design rooted in participatory research principles to work with communities in developing culturally relevant innovations and improved access to available services. This intensive program incorporates a series of 10 inter-related and progressive program activities designed to foster community-driven initiatives with type 2 diabetes mellitus as the action disease. Preparatory activities include a national community profile survey, best practice and policy literature review, and readiness tool development. Community-level intervention activities include community and clinical readiness consultations, development of a diabetes registry and surveillance system, and QI activities. With a focus on capacity building, all community-level activities are driven by trained community members who champion QI initiatives in their community. Program wrap-up activities include readiness tool validation, cost-analysis and process evaluation. In collaboration with Health Canada and the Aboriginal Diabetes Initiative, scale-up toolkits will be developed in order to build on lessons learned, tools and methods, and to fuel sustainability and spread of successful innovations. The outcomes of this research program, its related costs, and the subsequent policy recommendations will have the potential to significantly affect future policy decisions pertaining to chronic disease care in First Nations communities in Canada. Current ClinicalTrials.gov protocol ID NCT02234973. Date of Registration: July 30, 2014.

  17. Development and validation of a SYBR Green I-based real-time polymerase chain reaction method for detection of haptoglobin gene deletion in clinical materials.

    PubMed

    Soejima, Mikiko; Tsuchiya, Yuji; Egashira, Kouichi; Kawano, Hiroyuki; Sagawa, Kimitaka; Koda, Yoshiro

    2010-06-01

    Anhaptoglobinemic patients run the risk of severe anaphylactic transfusion reactions because they can produce antibodies against serum haptoglobin (Hp). Being homozygous for the Hp gene deletion (HP(del)) is the only known cause of congenital anhaptoglobinemia, and clinical diagnosis of HP(del) before transfusion is important to prevent anaphylactic shock. We recently developed a 5'-nuclease (TaqMan) real-time polymerase chain reaction (PCR) method. A SYBR Green I-based duplex real-time PCR assay using two forward primers and a common reverse primer followed by melting curve analysis was developed to determine HP(del) zygosity in a single tube. In addition, to obviate initial DNA extraction, we examined serially diluted blood samples as PCR templates. Allelic discrimination of HP(del) yielded optimal results at blood sample dilutions of 1:64 to 1:1024. The results from 2231 blood samples were fully concordant with those obtained by the TaqMan-based real-time PCR method. The detection rate of the HP(del) allele by the SYBR Green I-based method is comparable with that using the TaqMan-based method. This method is readily applicable due to its low initial cost and its compatibility with economical real-time PCR machines, and is suitable for high-throughput analysis as an alternative method for allelic discrimination of HP(del).

  18. Reconstructing the Initial Density Field of the Local Universe: Methods and Tests with Mock Catalogs

    NASA Astrophysics Data System (ADS)

    Wang, Huiyuan; Mo, H. J.; Yang, Xiaohu; van den Bosch, Frank C.

    2013-07-01

    Our research objective in this paper is to reconstruct an initial linear density field, which follows the multivariate Gaussian distribution with variances given by the linear power spectrum of the current cold dark matter model and evolves through gravitational instabilities to the present-day density field in the local universe. For this purpose, we develop a Hamiltonian Markov Chain Monte Carlo method to obtain the linear density field from a posterior probability function that consists of two components: a prior of a Gaussian density field with a given linear spectrum and a likelihood term that is given by the current density field. The present-day density field can be reconstructed from galaxy groups using the method developed in Wang et al. Using a realistic mock Sloan Digital Sky Survey DR7 catalog, obtained by populating dark matter halos in the Millennium simulation (MS) with galaxies, we show that our method can effectively and accurately recover both the amplitudes and phases of the initial, linear density field. To examine the accuracy of our method, we use N-body simulations to evolve these reconstructed initial conditions to the present day. The resimulated density field thus obtained accurately matches the original density field of the MS in the density range 0.3 ≲ ρ/ρ̄ ≲ 20 without any significant bias. In particular, the Fourier phases of the resimulated density fields are tightly correlated with those of the original simulation down to a scale corresponding to a wavenumber of ~1 h Mpc⁻¹, much smaller than the translinear scale, which corresponds to a wavenumber of ~0.15 h Mpc⁻¹.
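
    A minimal sketch of the Hamiltonian Monte Carlo machinery (leapfrog integration plus a Metropolis accept/reject step) on a toy two-dimensional Gaussian posterior; the actual reconstruction applies the same idea to a density field with many thousands of degrees of freedom and a likelihood tied to the reconstructed present-day field.

```python
# Minimal Hamiltonian Monte Carlo sketch on a toy 2D Gaussian posterior; the
# leapfrog-plus-Metropolis structure is the same in spirit as the reconstruction.
import numpy as np

rng = np.random.default_rng(1)
cov = np.array([[1.0, 0.8], [0.8, 1.0]])
icov = np.linalg.inv(cov)

def U(x):                                     # potential energy = -log posterior
    return 0.5 * x @ icov @ x

def grad_U(x):
    return icov @ x

def hmc_step(x, step=0.15, n_leapfrog=20):
    p = rng.standard_normal(x.size)           # sample momenta
    x_new, p_new = x.copy(), p.copy()
    p_new -= 0.5 * step * grad_U(x_new)       # leapfrog integration
    for _ in range(n_leapfrog - 1):
        x_new += step * p_new
        p_new -= step * grad_U(x_new)
    x_new += step * p_new
    p_new -= 0.5 * step * grad_U(x_new)
    # Metropolis accept/reject on the total Hamiltonian
    h_old = U(x) + 0.5 * p @ p
    h_new = U(x_new) + 0.5 * p_new @ p_new
    return x_new if np.log(rng.random()) < h_old - h_new else x

x = np.zeros(2)
samples = []
for _ in range(5000):
    x = hmc_step(x)
    samples.append(x)
samples = np.array(samples[1000:])            # discard burn-in
print("sample covariance:\n", np.cov(samples.T))
```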

  19. Iterative image reconstruction in elastic inhomogenous media with application to transcranial photoacoustic tomography

    NASA Astrophysics Data System (ADS)

    Poudel, Joemini; Matthews, Thomas P.; Mitsuhashi, Kenji; Garcia-Uribe, Alejandro; Wang, Lihong V.; Anastasio, Mark A.

    2017-03-01

    Photoacoustic computed tomography (PACT) is an emerging computed imaging modality that exploits optical contrast and ultrasonic detection principles to form images of the photoacoustically induced initial pressure distribution within tissue. The PACT reconstruction problem corresponds to a time-domain inverse source problem, where the initial pressure distribution is recovered from the measurements recorded on an aperture outside the support of the source. A major challenge in transcranial PACT brain imaging is to compensate for aberrations in the measured data due to the propagation of the photoacoustic wavefields through the skull. To properly account for these effects, a wave equation-based inversion method should be employed that can model the heterogeneous elastic properties of the medium. In this study, an iterative image reconstruction method for 3D transcranial PACT is developed based on the elastic wave equation. To accomplish this, a forward model based on a finite-difference time-domain discretization of the elastic wave equation is established. Subsequently, gradient-based methods are employed for computing penalized least squares estimates of the initial source distribution that produced the measured photoacoustic data. The developed reconstruction algorithm is validated and investigated through computer-simulation studies.
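
    As a toy stand-in for the reconstruction, the sketch below runs gradient descent on the penalized least-squares objective (1/2)||A x − y||² + (λ/2)||x||², with a generic random matrix playing the role of the elastic-wave forward operator; it illustrates the estimation framework only, not the FDTD forward model or the penalties used in the study.

```python
# Toy penalized least-squares reconstruction by gradient descent; the matrix A
# is a surrogate for the (much more expensive) elastic-wave forward model.
import numpy as np

rng = np.random.default_rng(2)
n_pixels, n_measurements = 64, 48
A = rng.standard_normal((n_measurements, n_pixels))       # surrogate forward operator
x_true = np.zeros(n_pixels)
x_true[20:28] = 1.0                                        # "initial pressure" source
y = A @ x_true + 0.05 * rng.standard_normal(n_measurements)

lam = 0.1
x = np.zeros(n_pixels)
step = 1.0 / (np.linalg.norm(A, 2) ** 2 + lam)             # safe step size
for _ in range(2000):
    grad = A.T @ (A @ x - y) + lam * x                     # gradient of the objective
    x -= step * grad

print("relative reconstruction error:",
      np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```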

  20. Error analysis of finite difference schemes applied to hyperbolic initial boundary value problems

    NASA Technical Reports Server (NTRS)

    Skollermo, G.

    1979-01-01

    Finite difference methods for the numerical solution of mixed initial boundary value problems for hyperbolic equations are studied. The objective of the reported investigation is to develop a technique for the total error analysis of a finite difference scheme, taking into account the initial approximation, the boundary conditions, and the interior approximation. Attention is given to the Cauchy problem and the initial approximation, the homogeneous problem in an infinite strip with inhomogeneous boundary data, the reflection of errors at the boundaries, and two different boundary approximations for the leapfrog scheme with a fourth-order accurate difference operator in space.
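
    A small numerical experiment in the spirit of such error analysis: the leapfrog scheme applied to linear advection on a periodic grid, with the discrete L2 error measured against the exact solution (periodic boundaries are a simplification here; the report treats genuine initial-boundary value problems and boundary approximations).

```python
# Leapfrog scheme for u_t + a u_x = 0 on a periodic grid, with the total error
# measured against the exact solution. Periodic boundaries keep the sketch simple.
import numpy as np

a, L, N, T = 1.0, 1.0, 200, 0.5
dx = L / N
dt = 0.4 * dx / a                                 # CFL number 0.4
x = np.arange(N) * dx
u0 = np.sin(2 * np.pi * x)

exact = lambda t: np.sin(2 * np.pi * (x - a * t))

u_prev = u0.copy()
u_curr = exact(dt)                                # start-up step taken from the exact solution
nsteps = int(round(T / dt))
for n in range(2, nsteps + 1):
    u_next = u_prev - a * dt / dx * (np.roll(u_curr, -1) - np.roll(u_curr, 1))
    u_prev, u_curr = u_curr, u_next

err = np.sqrt(dx * np.sum((u_curr - exact(nsteps * dt)) ** 2))
print(f"discrete L2 error at t = {nsteps * dt:.3f}: {err:.3e}")
```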

  1. Initiation of small-satellite formations via satellite ejector

    NASA Astrophysics Data System (ADS)

    McMullen, Matthew G

    Small satellites can be constructed at a fraction of the cost of a full-size satellite. One full-size satellite can be replaced with a multitude of small satellites, offering expanded area coverage through formation flight. However, the shortcoming to the smaller size is usually a lack of thrusting capabilities. Furthermore, current designs for small satellite deployment mechanisms are only capable of love deployment velocities (on the order of meters per second). Motivated to address this shortcoming, a conceived satellite ejector would offer a significant orbit change by ejecting the satellite at higher deployment velocities (125-200 m/s). Focusing on the applications of the ejector, it is sought to bridge the gap in prior research by offering a method to initiate formation flight. As a precursor to the initiation, the desired orbit properties to initiate the formation are specified in terms of spacing and velocity change vector. From this, a systematic method is developed to find the relationship among velocity change vector, the desired orbit's orientation, and the spacing required to initiate the formation.

  2. A comparative study of turbulence decay using Navier-Stokes and a discrete particle simulation

    NASA Technical Reports Server (NTRS)

    Goswami, A.; Baganoff, D.; Lele, S.; Feiereisen, W.

    1993-01-01

    A comparative study of the two dimensional temporal decay of an initial turbulent state of flow is presented using a direct Navier-Stokes simulation and a particle method, ranging from the near continuum to more rarefied regimes. Various topics related to matching the initial conditions between the two simulations are considered. The determination of the initial velocity distribution function in the particle method was found to play an important role in the comparison. This distribution was first developed by matching the initial Navier-Stokes state of stress, but was found to be inadequate beyond the near continuum regime. An alternative approach of using the Lees two-sided Maxwellian to match the initial strain-rate is discussed. Results of the comparison of the temporal decay of mean kinetic energy are presented for a range of Knudsen numbers. As expected, good agreement was observed for the near continuum regime, but the differences found for the more rarefied conditions were unexpectedly small.

  3. Time-dependent importance sampling in semiclassical initial value representation calculations for time correlation functions. II. A simplified implementation.

    PubMed

    Tao, Guohua; Miller, William H

    2012-09-28

    An efficient time-dependent (TD) Monte Carlo (MC) importance sampling method has recently been developed [G. Tao and W. H. Miller, J. Chem. Phys. 135, 024104 (2011)] for the evaluation of time correlation functions using the semiclassical (SC) initial value representation (IVR) methodology. In this TD-SC-IVR method, the MC sampling uses information from both time-evolved phase points as well as their initial values, and only the "important" trajectories are sampled frequently. Even though the TD-SC-IVR was shown in some benchmark examples to be much more efficient than the traditional time-independent sampling method (which uses only initial conditions), the calculation of the SC prefactor-which is computationally expensive, especially for large systems-is still required for accepted trajectories. In the present work, we present an approximate implementation of the TD-SC-IVR method that is completely prefactor-free; it gives the time correlation function as a classical-like magnitude function multiplied by a phase function. Application of this approach to flux-flux correlation functions (which yield reaction rate constants) for the benchmark H + H(2) system shows very good agreement with exact quantum results. Limitations of the approximate approach are also discussed.

  4. Gaussian Decomposition of Laser Altimeter Waveforms

    NASA Technical Reports Server (NTRS)

    Hofton, Michelle A.; Minster, J. Bernard; Blair, J. Bryan

    1999-01-01

    We develop a method to decompose a laser altimeter return waveform into its Gaussian components assuming that the position of each Gaussian within the waveform can be used to calculate the mean elevation of a specific reflecting surface within the laser footprint. We estimate the number of Gaussian components from the number of inflection points of a smoothed copy of the laser waveform, and obtain initial estimates of the Gaussian half-widths and positions from the positions of its consecutive inflection points. Initial amplitude estimates are obtained using a non-negative least-squares method. To reduce the likelihood of fitting the background noise within the waveform and to minimize the number of Gaussians needed in the approximation, we rank the "importance" of each Gaussian in the decomposition using its initial half-width and amplitude estimates. The initial parameter estimates of all Gaussians ranked "important" are optimized using the Levenberg-Marquardt method. If the sum of the Gaussians does not approximate the return waveform to a prescribed accuracy, then additional Gaussians are included in the optimization procedure. The Gaussian decomposition method is demonstrated on data collected by the airborne Laser Vegetation Imaging Sensor (LVIS) in October 1997 over the Sequoia National Forest, California.
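
    The sketch below mirrors the described steps on a synthetic waveform: smooth, locate inflection points, derive initial positions and half-widths from consecutive inflection points, obtain initial amplitudes by non-negative least squares, and refine with Levenberg-Marquardt. A simple noise-floor mask stands in for the paper's importance ranking, and this is not the LVIS processing code.

```python
# Gaussian decomposition of a synthetic waveform, following the outline above.
import numpy as np
from scipy.ndimage import gaussian_filter1d
from scipy.optimize import nnls, least_squares

def gaussians(t, params):                       # params = [A1, mu1, sig1, A2, mu2, sig2, ...]
    p = np.asarray(params).reshape(-1, 3)
    return sum(A * np.exp(-0.5 * ((t - mu) / sig) ** 2) for A, mu, sig in p)

t = np.arange(200.0)
truth = [1.0, 50.0, 5.0, 0.6, 130.0, 10.0]      # two hypothetical reflecting surfaces
wave = gaussians(t, truth) + 0.005 * np.random.default_rng(3).standard_normal(t.size)

# 1) Smooth, then locate inflection points (sign changes of the second difference),
#    masking the noise floor instead of using the paper's importance ranking.
smooth = gaussian_filter1d(wave, 3.0)
d2 = np.diff(smooth, 2)
strong = smooth[1:-1] > 5.0 * np.std(smooth[:30])
crossing = (np.diff(np.sign(d2)) != 0) & strong[:-1]
t_inflect = t[1:-2][crossing]

# 2) Consecutive inflection points give initial positions and half-widths
mus = (0.5 * (t_inflect[:-1] + t_inflect[1:]))[::2]
sigs = (0.5 * (t_inflect[1:] - t_inflect[:-1]))[::2]

# 3) Initial amplitudes from non-negative least squares
basis = np.column_stack([np.exp(-0.5 * ((t - m) / s) ** 2) for m, s in zip(mus, sigs)])
amps, _ = nnls(basis, wave)

# 4) Levenberg-Marquardt refinement of all parameters
p0 = np.column_stack([amps, mus, sigs]).ravel()
fit = least_squares(lambda p: gaussians(t, p) - wave, p0, method="lm")
print(fit.x.reshape(-1, 3))                     # fitted (amplitude, position, width) per surface
```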

  5. Numerical methods for incompressible viscous flows with engineering applications

    NASA Technical Reports Server (NTRS)

    Rose, M. E.; Ash, R. L.

    1988-01-01

    A numerical scheme has been developed to solve the incompressible, 3-D Navier-Stokes equations using velocity-vorticity variables. This report summarizes the development of the numerical approximation schemes for the divergence and curl of the velocity vector fields and the development of compact schemes for handling boundary and initial boundary value problems.

  6. Employed Carers' Empathy towards People with Intellectual Disabilities: The Development of a New Measure and Some Initial Theory

    ERIC Educational Resources Information Center

    Collins, Kirsten; Gratton, Caroline; Heneage, Celia; Dagnan, Dave

    2017-01-01

    Background: This study aimed to develop a self-report measure of paid caregivers' empathy towards people with intellectual disabilities. Materials and Methods: Following questionnaire development, 194 staff working in services for people with intellectual disabilities completed self-report questionnaires, including the new empathy measure. The…

  7. Research Committee Issues Brief: Professional Development for Virtual Schooling and Online Learning

    ERIC Educational Resources Information Center

    Davis, Niki; Rose, Ray

    2007-01-01

    This report examines the types of professional development necessary to implement successful online learning initiatives. The potential for schools utilizing online learning is tremendous: schools can develop new distribution methods to enable equity and access for all students, they can provide high quality content for all students and they can…

  8. Development and Initial Testing of a Structured Clinical Observation Tool to Assess Pharmacotherapy Competence

    ERIC Educational Resources Information Center

    Young, John Q.; Lieu, Sandra; O'Sullivan, Patricia; Tong, Lowell

    2011-01-01

    Objective: The authors developed and tested the feasibility and utility of a new direct-observation instrument to assess trainee performance of a medication management session. Methods: The Psychopharmacotherapy-Structured Clinical Observation (P-SCO) instrument was developed based on multiple sources of expertise and then implemented in 4…

  9. Development of a Drug Use Resistance Self-Efficacy (DURSE) Scale

    ERIC Educational Resources Information Center

    Carpenter, Carrie M.; Howard, Donna

    2009-01-01

    Objectives: To develop and evaluate psychometric properties of a new instrument, the drug use resistance self-efficacy (DURSE) scale, designed for young adolescents. Methods: Scale construction occurred in 3 phases: (1) initial development, (2) pilot testing of preliminary items, and (3) final scale administration among a sample of seventh graders…

  10. Adult Education in Development. Methods and Approaches from Changing Societies.

    ERIC Educational Resources Information Center

    McGivney, Veronica; Murray, Frances

    The case studies described in this book provide examples of initiatives illustrating the role of adult education in development and its contribution to the process of change in developing countries. The book is organized in five sections. Case studies in Part 1, "Health Education," illustrate the links between primary health care and…

  11. Initial Development and Validation of the BullyHARM: The Bullying, Harassment, and Aggression Receipt Measure

    ERIC Educational Resources Information Center

    Hall, William J.

    2016-01-01

    This article describes the development and preliminary validation of the Bullying, Harassment, and Aggression Receipt Measure (BullyHARM). The development of the BullyHARM involved a number of steps and methods, including a literature review, expert review, cognitive testing, readability testing, data collection from a large sample, reliability…

  12. Multiple-Methods Needs Assessment of California 4-H Science Education Programming

    ERIC Educational Resources Information Center

    Worker, Steven M.; Schmitt-McQuitty, Lynn; Ambrose, Andrea; Brian, Kelley; Schoenfelder, Emily; Smith, Martin H.

    2017-01-01

    The California 4-H Science Leadership Team conducted a statewide assessment to evaluate the needs of county-based 4-H programs related to the key areas of the 4-H Science Initiative: program development and design, professional development, curricula, evaluation, partnerships, and fund development. The use of multiple qualitative data sources…

  13. Fuel Optimal, Finite Thrust Guidance Methods to Circumnavigate with Lighting Constraints

    NASA Astrophysics Data System (ADS)

    Prince, E. R.; Carr, R. W.; Cobb, R. G.

    This paper details improvements made to the authors' most recent work to find fuel-optimal, finite-thrust guidance to inject an inspector satellite into a prescribed natural motion circumnavigation (NMC) orbit about a resident space object (RSO) in geosynchronous orbit (GEO). Better initial-guess methodologies are developed for the low-fidelity-model nonlinear programming problem (NLP), including Clohessy-Wiltshire (CW) targeting, a modified particle swarm optimization (PSO), and MATLAB's genetic algorithm (GA). These solutions may then be fed as initial guesses into the NLP solver; in this work a different NLP solver, IPOPT, is used. Celestial lighting constraints are taken into account in addition to the sunlight constraint, ensuring that the resulting NMC also adheres to Moon and Earth lighting constraints. The guidance is initially calculated for a fixed final time, and solutions are then also calculated for fixed final times before and after the original one, allowing mission planners to choose the lowest-cost solution in the resulting range that satisfies all constraints. The developed algorithms provide computationally fast and highly reliable methods for determining fuel-optimal guidance for NMC injections while adhering to multiple lighting constraints.
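
    A minimal example of the kind of closed-form Clohessy-Wiltshire targeting that can seed an NLP solver with an initial guess: solve for the relative velocity that carries the inspector to a desired relative position in a fixed transfer time. The orbit, transfer time and offsets are hypothetical, and this covers only the initial-guess step, not the authors' full guidance algorithm.

```python
# Clohessy-Wiltshire (CW) two-point targeting: required initial relative velocity
# to reach a desired relative position after a fixed transfer time near GEO.
import numpy as np

mu = 398600.4418                      # km^3/s^2
a_geo = 42164.0                       # km, geosynchronous semi-major axis
n = np.sqrt(mu / a_geo ** 3)          # mean motion of the chief (RSO)
t = 6 * 3600.0                        # transfer time, s

s, c = np.sin(n * t), np.cos(n * t)
phi_rr = np.array([[4 - 3 * c,        0.0,                     0.0],
                   [6 * (s - n * t),  1.0,                     0.0],
                   [0.0,              0.0,                     c  ]])
phi_rv = np.array([[s / n,            2 * (1 - c) / n,         0.0],
                   [2 * (c - 1) / n,  (4 * s - 3 * n * t) / n, 0.0],
                   [0.0,              0.0,                     s / n]])

r0 = np.array([0.0, -10.0, 0.0])      # current relative position, km (trailing the RSO)
v0 = np.zeros(3)                      # current relative velocity, km/s
r_target = np.array([2.0, 0.0, 1.0])  # desired NMC injection point, km

# Required initial relative velocity and the impulsive delta-v that supplies it
v0_req = np.linalg.solve(phi_rv, r_target - phi_rr @ r0)
delta_v = v0_req - v0
print("delta-v (m/s):", 1000 * delta_v, " magnitude:", 1000 * np.linalg.norm(delta_v))
```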

  14. The Adam Williams initiative: collaborating with community resources to improve care for traumatic brain injury.

    PubMed

    Bader, Mary Kay; Stutzman, Sonja E; Palmer, Sylvain; Nwagwu, Chiedozie I; Goodman, Gary; Whittaker, Margie; Olson, Daiwai M

    2014-12-01

    The Brain Trauma Foundation has developed treatment guidelines for the care of patients with acute traumatic brain injury. However, a method to provide broad acceptance and application of these guidelines has not been published. To describe methods for the development, funding, and continued educational efforts of the Adam Williams Initiative; the experiences from the first 10 years may serve as a template for hospitals and nurses that seek to engage in long-term quality improvement collaborations with foundations and/or industry. In 2004, the nonprofit Adam Williams Initiative was established with the goal of providing education and resources that would encourage hospitals across the United States to incorporate the Brain Trauma Foundation's guidelines into practice. Between 2004 and 2014, 37 hospitals have been funded by the Adam Williams Initiative and have had staff members participate in an immersion experience at Mission Hospital (Mission Viejo, California) during which team members received both didactic and hands-on education in the care of traumatic brain injury. Carefully cultivated relationships and relentless teamwork have contributed to successful implementation of the Brain Trauma Foundation's guidelines in US hospitals. ©2014 American Association of Critical-Care Nurses.

  15. Development of phantom and methodology for 3D and 4D dose intercomparisons for advanced lung radiotherapy

    NASA Astrophysics Data System (ADS)

    Caloz, Misael; Kafrouni, Marilyne; Leturgie, Quentin; Corde, Stéphanie; Downes, Simon; Lehmann, Joerg; Thwaites, David

    2015-01-01

    There are few reported intercomparisons or audits of combinations of advanced radiotherapy methods, particularly for 4D treatments. As part of an evaluation of the implementation of advanced radiotherapy technology, a phantom and associated methods, initially developed for in-house commissioning and QA of 4D lung treatments, have been developed further with the aim of using them for end-to-end dose intercomparisons of 4D treatment planning and delivery. The respiratory thorax phantom can house moving inserts with variable speed (breathing rate) and motion amplitude. In one set-up mode it contains a small ion chamber for point dose measurements, or alternatively it can hold strips of radiochromic film to measure dose distributions. Initial pilot and feasibility measurements have been carried out in one hospital to thoroughly test the methods and procedures before using them more widely across a range of hospitals and treatment systems. Overall, the results show good agreement between measured and calculated doses and distributions, supporting the use of the phantom and methodology for multi-centre intercomparisons. However, before wider use, refinements of the method and analysis are underway, particularly for the film measurements.

  16. Automatic detection of multiple UXO-like targets using magnetic anomaly inversion and self-adaptive fuzzy c-means clustering

    NASA Astrophysics Data System (ADS)

    Yin, Gang; Zhang, Yingtang; Fan, Hongbo; Ren, Guoquan; Li, Zhining

    2017-12-01

    We have developed a method for automatically detecting UXO-like targets based on magnetic anomaly inversion and self-adaptive fuzzy c-means clustering. Magnetic anomaly inversion methods are used to estimate the initial locations of multiple UXO-like sources. Although these initial locations have some errors with respect to the real positions, they form dense clouds around the actual positions of the magnetic sources. Then we use the self-adaptive fuzzy c-means clustering algorithm to cluster these initial locations. The estimated number of cluster centroids represents the number of targets, and the cluster centroids are regarded as the locations of the magnetic targets. The effectiveness of the method has been demonstrated using synthetic datasets. Computational results show that the proposed method can be applied to the case of several UXO-like targets randomly scattered within a confined, shallow subsurface volume. A field test was carried out to assess the validity of the proposed method, and the experimental results show that the prearranged magnets can be detected unambiguously and located precisely.
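
    To make the clustering step concrete, here is a minimal fuzzy c-means sketch applied to a cloud of estimated source locations. It assumes a fixed number of clusters and synthetic 3-D location estimates; the paper's self-adaptive variant, which also estimates the number of clusters, is not reproduced here.

    ```python
    import numpy as np

    def fuzzy_c_means(X, c, m=2.0, max_iter=200, tol=1e-6, seed=0):
        """Basic fuzzy c-means. X: (n_points, n_dims) estimated source locations."""
        rng = np.random.default_rng(seed)
        U = rng.random((X.shape[0], c))
        U /= U.sum(axis=1, keepdims=True)                          # memberships sum to 1 per point
        for _ in range(max_iter):
            Um = U ** m
            centers = (Um.T @ X) / Um.sum(axis=0)[:, None]         # membership-weighted centroids
            d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
            w = d ** (-2.0 / (m - 1.0))
            U_new = w / w.sum(axis=1, keepdims=True)               # standard FCM membership update
            if np.abs(U_new - U).max() < tol:
                return centers, U_new
            U = U_new
        return centers, U

    # Synthetic stand-in for inversion output: noisy location estimates around 3 buried targets.
    rng = np.random.default_rng(1)
    truth = np.array([[2.0, 3.0, -0.5], [8.0, 1.0, -0.8], [5.0, 7.0, -0.3]])   # x, y, depth (m)
    estimates = np.vstack([t + 0.2 * rng.standard_normal((50, 3)) for t in truth])

    centers, memberships = fuzzy_c_means(estimates, c=3)
    print("recovered target locations (m):\n", np.round(centers, 2))
    ```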

  17. Participatory methods for Inuit public health promotion and programme evaluation in Nunatsiavut, Canada.

    PubMed

    Saini, Manpreet

    2017-01-01

    Engaging stakeholders is crucial for health promotion and programme evaluations; understanding how to best engage stakeholders is less clear, especially within Indigenous communities. The objectives of this thesis research were to use participatory methods to: (1) co-develop and evaluate a whiteboard video for use as a public health promotion tool in Rigolet, Nunatsiavut, and (2) develop and validate a framework for participatory evaluation of Inuit public health initiatives in Nunatsiavut, Labrador. Data collection tools included interactive workshops, community events, interviews, focus-group discussions and surveys. Results indicated the whiteboard video was an engaging and suitable medium for sharing public health messaging due to its contextually relevant elements. Participants identified 4 foundational evaluation framework components necessary to conduct appropriate evaluations, including: (1) community engagement, (2) collaborative evaluation development, (3) tailored evaluation data collection and (4) evaluation scope. This research illustrates stakeholder participation is critical to develop and evaluate contextually relevant public health initiatives in Nunatsiavut, Labrador and should be considered in other Indigenous communities.

  18. ACCELERATED SOLVENT EXTRACTION COMBINED WITH ...

    EPA Pesticide Factsheets

    A research project was initiated to address a recurring problem of elevated detection limits above required risk-based concentrations for the determination of semivolatile organic compounds in high moisture content solid samples. This project was initiated, in cooperation with the EPA Region 1 Laboratory, under the Regional Methods Program administered through the ORD Office of Science Policy. The aim of the project was to develop an approach for the rapid removal of water in high moisture content solids (e.g., wetland sediments) in preparation for analysis via Method 8270. Alternative methods for water removal have been investigated to enhance compound solid concentrations and improve extraction efficiency, with the use of pressure filtration providing a high-throughput alternative for removal of the majority of free water in sediments and sludges. In order to eliminate problems with phase separation during extraction of solids using Accelerated Solvent Extraction, a variation of a water-isopropanol extraction method developed at the USGS National Water Quality Laboratory in Denver, CO is being employed. The concentrations of target compounds in water-isopropanol extraction fluids are subsequently analyzed using an automated Solid Phase Extraction (SPE)-GC/MS method developed in our laboratory. The coupled approaches for dewatering, extraction, and target compound identification-quantitation provide a useful alternative to enhance sample throughput for Me

  19. New Reduced Two-Time Step Method for Calculating Combustion and Emission Rates of Jet-A and Methane Fuel With and Without Water Injection

    NASA Technical Reports Server (NTRS)

    Molnar, Melissa; Marek, C. John

    2004-01-01

    A simplified kinetic scheme for Jet-A and methane fuels with water injection was developed for use in numerical combustion codes, such as the National Combustor Code (NCC) or even the simple FORTRAN codes being developed at Glenn. The two-time-step method uses either an initial time-averaged value (step one) or an instantaneous value (step two). The switch is based on a water concentration of 1x10(exp -20) moles/cc. The results presented here yield a correlation that gives the chemical kinetic time as two separate functions. This two-step method is used, as opposed to a one-step time-averaged method previously developed, to determine the chemical kinetic time with increased accuracy. The first, time-averaged step is used at the initial times for smaller water concentrations. It gives the average chemical kinetic time as a function of initial overall fuel-air ratio, initial water-to-fuel mass ratio, temperature, and pressure. The second, instantaneous step, to be used with higher water concentrations, gives the chemical kinetic time as a function of instantaneous fuel and water mole concentrations, pressure, and temperature (T4). The simple correlations are then compared to the turbulent mixing times to determine the limiting properties of the reaction. The NASA Glenn GLSENS kinetics code calculates the reaction rates and rate constants for each species in a kinetic scheme for finite kinetic rates. These reaction rates were then used to calculate the necessary chemical kinetic times. Chemical kinetic time equations for fuel, carbon monoxide, and NOx were obtained for Jet-A fuel and methane with and without water injection up to water mass loadings of 2/1 water to fuel. A similar correlation was also developed using data from NASA's Chemical Equilibrium Applications (CEA) code to determine the equilibrium concentrations of carbon monoxide and nitrogen oxide as functions of overall equivalence ratio, water-to-fuel mass ratio, pressure, and temperature (T3). The temperature of the gas entering the turbine (T4) was also correlated as a function of the initial combustor temperature (T3), equivalence ratio, water-to-fuel mass ratio, and pressure.
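
    The switching logic between the two steps can be sketched as follows; the correlation functions themselves are placeholders, since the fitted coefficients live in the report, not in this abstract.

    ```python
    WATER_SWITCH_MOL_PER_CC = 1.0e-20   # switch value quoted in the abstract

    def chemical_kinetic_time(h2o_mol_per_cc,
                              tau_time_averaged,    # callable(f_a0, w_f0, T, P) -- step-one fit
                              tau_instantaneous,    # callable(c_fuel, c_h2o, T4, P) -- step-two fit
                              avg_inputs, inst_inputs):
        """Two-time-step selection: use the time-averaged correlation while the water
        concentration is still below the switch value, and the instantaneous
        correlation afterwards. The correlation callables stand in for the report's
        fitted expressions, which are not given here."""
        if h2o_mol_per_cc < WATER_SWITCH_MOL_PER_CC:
            return tau_time_averaged(*avg_inputs)
        return tau_instantaneous(*inst_inputs)
    ```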

  20. Evaluation of Training Programs for Rural Development

    ERIC Educational Resources Information Center

    Indira, A.

    2008-01-01

    An Evaluation of the "Impact Assessment of the Training Programs" of a National Level Training Institution in India was conducted using the Kirkpatrick Method (KP Method). The studied Institution takes up research, provides training, offers consultancy and initiates action in the rural sector of India. The evaluation study used a…

  1. OVERBURDEN MINERALOGY AS RELATED TO GROUND-WATER CHEMICAL CHANGES IN COAL STRIP MINING

    EPA Science Inventory

    A research program was initiated to define and develop an inclusive, effective, and economical method for predicting potential ground-water quality changes resulting from the strip mining of coal in the Western United States. To utilize the predictive method, it is necessary to s...

  2. EVALUATION OF A TEST METHOD FOR MEASURING INDOOR AIR EMISSIONS FROM DRY-PROCESS PHOTOCOPIERS

    EPA Science Inventory

    A large chamber test method for measuring indoor air emissions from office equipment was developed, evaluated, and revised based on the initial testing of four dry-process photocopiers. Because all chambers may not necessarily produce similar results (e.g., due to differences in ...

  3. Text-in-Context: A Method for Extracting Findings in Mixed-Methods Mixed Research Synthesis Studies

    PubMed Central

    Leeman, Jennifer; Knafl, Kathleen; Crandell, Jamie L.

    2012-01-01

    Aim: Our purpose in this paper is to propose a new method for extracting findings from research reports included in mixed-methods mixed research synthesis studies. Background: International initiatives in the domains of systematic review and evidence synthesis have been focused on broadening the conceptualization of evidence, increased methodological inclusiveness and the production of evidence syntheses that will be accessible to and usable by a wider range of consumers. Initiatives in the general mixed-methods research field have been focused on developing truly integrative approaches to data analysis and interpretation. Data source: The data extraction challenges described here were encountered and the method proposed for addressing these challenges was developed, in the first year of the ongoing (2011–2016) study: Mixed-Methods Synthesis of Research on Childhood Chronic Conditions and Family. Discussion: To preserve the text-in-context of findings in research reports, we describe a method whereby findings are transformed into portable statements that anchor results to relevant information about sample, source of information, time, comparative reference point, magnitude and significance and study-specific conceptions of phenomena. Implications for nursing: The data extraction method featured here was developed specifically to accommodate mixed-methods mixed research synthesis studies conducted in nursing and other health sciences, but reviewers might find it useful in other kinds of research synthesis studies. Conclusion: This data extraction method itself constitutes a type of integration to preserve the methodological context of findings when statements are read individually and in comparison to each other. PMID:22924808

  4. Composite load spectra for select space propulsion structural components

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Kurth, R. E.; Ho, H.

    1986-01-01

    A multiyear program is being performed with the objective of developing generic load models, with multiple levels of progressive sophistication, to simulate the composite (combined) load spectra induced in space propulsion system components representative of Space Shuttle Main Engines (SSME), such as transfer ducts, turbine blades, and liquid oxygen (LOX) posts. Progress in the first year's effort includes completion of a sufficient portion of each task: probabilistic models, code development, validation, and an initial operational code. From its inception, this code has had an expert-system philosophy that can be extended throughout the program and in the future. The initial operational code is applicable only to turbine-blade-type loadings. The probabilistic model included in the operational code has fitting routines for loads that utilize a modified discrete probabilistic distribution termed RASCAL, a barrier-crossing method, and a Monte Carlo method. An initial load model, currently used for slowly varying, duty-cycle-type loading, was developed by Battelle. The intent is to use the model and related codes essentially in their current form for all loads that are based on measured or calculated data following a slowly varying profile.
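
    As a toy illustration of the Monte Carlo option for building a composite load spectrum, the sketch below samples a few hypothetical component loads for a turbine-blade-like case and reports percentiles of their combination. The distributions, the linear combination rule, and the numbers are invented for illustration and are far simpler than the program's RASCAL and barrier-crossing models.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000  # Monte Carlo samples

    # Hypothetical, normalized component load distributions (values invented).
    pressure = rng.normal(loc=1.00, scale=0.05, size=N)   # pressure load
    thermal  = rng.normal(loc=0.80, scale=0.10, size=N)   # thermal load
    speed    = rng.normal(loc=1.00, scale=0.02, size=N)   # shaft-speed factor
    centrifugal = speed**2                                # centrifugal load scales as speed^2

    composite = pressure + thermal + centrifugal          # simple linear combination

    for p in (50, 99, 99.9):
        print(f"{p:5.1f}th percentile composite load: {np.percentile(composite, p):.3f}")
    ```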

  5. Simplified Two-Time Step Method for Calculating Combustion and Emission Rates of Jet-A and Methane Fuel With and Without Water Injection

    NASA Technical Reports Server (NTRS)

    Molnar, Melissa; Marek, C. John

    2005-01-01

    A simplified kinetic scheme for Jet-A and methane fuels with water injection was developed for use in numerical combustion codes, such as the National Combustor Code (NCC) or even simple FORTRAN codes. The two-time-step method uses either an initial time-averaged value (step one) or an instantaneous value (step two). The switch is based on a water concentration of 1x10(exp -20) moles/cc. The results presented here yield a correlation that gives the chemical kinetic time as two separate functions. This two-time-step method is used, as opposed to a one-step time-averaged method previously developed, to determine the chemical kinetic time with increased accuracy. The first, time-averaged step is used at the initial times for smaller water concentrations. It gives the average chemical kinetic time as a function of initial overall fuel-air ratio, initial water-to-fuel mass ratio, temperature, and pressure. The second, instantaneous step, to be used with higher water concentrations, gives the chemical kinetic time as a function of instantaneous fuel and water mole concentrations, pressure, and temperature (T4). The simple correlations are then compared to the turbulent mixing times to determine the limiting rates of the reaction. The NASA Glenn GLSENS kinetics code calculates the reaction rates and rate constants for each species in a kinetic scheme for finite kinetic rates. These reaction rates are used to calculate the necessary chemical kinetic times. Chemical kinetic time equations for fuel, carbon monoxide, and NOx are obtained for Jet-A fuel and methane with and without water injection up to water mass loadings of 2/1 water to fuel. A similar correlation was also developed using data from NASA's Chemical Equilibrium Applications (CEA) code to determine the equilibrium concentrations of carbon monoxide and nitrogen oxide as functions of overall equivalence ratio, water-to-fuel mass ratio, pressure, and temperature (T3). The temperature of the gas entering the turbine (T4) was also correlated as a function of the initial combustor temperature (T3), equivalence ratio, water-to-fuel mass ratio, and pressure.

  6. Building the field of population health intervention research: The development and use of an initial set of competencies.

    PubMed

    Riley, Barbara; Harvey, Jean; Di Ruggiero, Erica; Potvin, Louise

    2015-01-01

    Population health intervention research (PHIR) is a relatively new research field that studies interventions that can improve health and health equity at a population level. Competencies are one way to give legitimacy and definition to a field. An initial set of PHIR competencies was developed with leadership from a multi-sector group in Canada. This paper describes the development process for these competencies and their possible uses. Methods to develop the competencies included key informant interviews; a targeted review of scientific and gray literature; a 2-round, online adapted Delphi study with a 24-member panel; and a focus group with 9 international PHIR experts. The resulting competencies consist of 25 items grouped into 6 categories. They include principles of good science that are applicable, though not exclusive, to PHIR, and they are more suitable for PHIR teams than for individuals. This initial set of competencies, released in 2013, may be used to develop graduate student curriculum, recruit trainees and faculty to academic institutions, plan non-degree professional development, and develop job descriptions for PHIR-related research and professional positions. The competencies provide some initial guideposts for the field and will need to be adapted as the PHIR field matures and to meet the unique needs of different jurisdictions.

  7. Building the field of population health intervention research: The development and use of an initial set of competencies

    PubMed Central

    Riley, Barbara; Harvey, Jean; Di Ruggiero, Erica; Potvin, Louise

    2015-01-01

    Population health intervention research (PHIR) is a relatively new research field that studies interventions that can improve health and health equity at a population level. Competencies are one way to give legitimacy and definition to a field. An initial set of PHIR competencies was developed with leadership from a multi-sector group in Canada. This paper describes the development process for these competencies and their possible uses. Methods to develop the competencies included key informant interviews; a targeted review of scientific and gray literature; a 2-round, online adapted Delphi study with a 24-member panel; and a focus group with 9 international PHIR experts. The resulting competencies consist of 25 items grouped into 6 categories. They include principles of good science that are applicable, though not exclusive, to PHIR, and they are more suitable for PHIR teams than for individuals. This initial set of competencies, released in 2013, may be used to develop graduate student curriculum, recruit trainees and faculty to academic institutions, plan non-degree professional development, and develop job descriptions for PHIR-related research and professional positions. The competencies provide some initial guideposts for the field and will need to be adapted as the PHIR field matures and to meet the unique needs of different jurisdictions. PMID:26844160

  8. Piloting a Process Maturity Model as an e-Learning Benchmarking Method

    ERIC Educational Resources Information Center

    Petch, Jim; Calverley, Gayle; Dexter, Hilary; Cappelli, Tim

    2007-01-01

    As part of a national e-learning benchmarking initiative of the UK Higher Education Academy, the University of Manchester is carrying out a pilot study of a method to benchmark e-learning in an institution. The pilot was designed to evaluate the operational viability of a method based on the e-Learning Maturity Model developed at the University of…

  9. Encouraging Teacher Change within the Realities of School-Based Agricultural Education: Lessons from Teachers' Initial Use of Socioscientific Issues-Based Instruction

    ERIC Educational Resources Information Center

    Wilcox, Amie K.; Shoulders, Catherine W.; Myers, Brian E.

    2014-01-01

    Calls for increased interdisciplinary education have led to the development of numerous teaching methods designed to help teachers provide meaningful experiences for their students. However, methods of guiding teachers in the successful adoption of innovative teaching methods are not firmly set. This qualitative study sought to better understand…

  10. Phased Array Ultrasound: Initial Development of PAUT Inspection of Self-Reacting Friction Stir Welds

    NASA Technical Reports Server (NTRS)

    Rairigh, Ryan

    2008-01-01

    This slide presentation reviews the development of Phased Array Ultrasound (PAUT) as a non-destructive examination method for Self-Reacting Friction Stir Welds (SR-FSW). PAUT is the only NDE method that has been shown to detect detrimental levels of Residual Oxide Defect (ROD), which can result in a significant decrease in weld strength. The presentation reviews the PAUT process and shows the results in comparison with x-ray radiography.

  11. Overview of Future of Probabilistic Methods and RMSL Technology and the Probabilistic Methods Education Initiative for the US Army at the SAE G-11 Meeting

    NASA Technical Reports Server (NTRS)

    Singhal, Surendra N.

    2003-01-01

    The SAE G-11 RMSL Division and Probabilistic Methods Committee meeting, sponsored by the Picatinny Arsenal during March 1-3, 2004 at the Westin Morristown, will report progress on projects for probabilistic assessment of Army systems and launch an initiative for probabilistic education. The meeting features several Army and industry senior executives and an Ivy League professor to provide an industry/government/academia forum to review RMSL technology; reliability and probabilistic technology; reliability-based design methods; software reliability; and maintainability standards. With over 100 members, including members of national/international standing, the G-11's Probabilistic Methods Committee has the mission of enabling and facilitating rapid deployment of probabilistic technology to enhance the competitiveness of our industries through better, faster, greener, smarter, affordable, and reliable product development.

  12. An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1992-01-01

    Research conducted during the period from July 1991 through December 1992 is covered. A method based upon the quasi-analytical approach was developed for computing the aerodynamic sensitivity coefficients of three dimensional wings in transonic and subsonic flow. In addition, the method computes for comparison purposes the aerodynamic sensitivity coefficients using the finite difference approach. The accuracy and validity of the methods are currently under investigation.
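
    For the finite-difference comparison path, the core computation is just a divided difference of an aerodynamic objective with respect to each design variable. The sketch below is a generic central-difference routine under assumed names (`objective`, `design_vars`); the quasi-analytical approach described in the report, which differentiates the flow equations directly, is not shown.

    ```python
    import numpy as np

    def finite_difference_sensitivities(objective, design_vars, rel_step=1e-3):
        """Central-difference approximation of d(objective)/d(design variable).
        `objective` maps a design-variable vector (e.g., twist, thickness, Mach
        number) to a scalar aerodynamic quantity such as lift or drag coefficient."""
        x = np.asarray(design_vars, dtype=float)
        grads = np.zeros_like(x)
        for i in range(x.size):
            h = rel_step * max(abs(x[i]), 1.0)   # step scaled to the variable's magnitude
            xp, xm = x.copy(), x.copy()
            xp[i] += h
            xm[i] -= h
            grads[i] = (objective(xp) - objective(xm)) / (2.0 * h)
        return grads

    # Example with a toy quadratic objective standing in for a flow solver call.
    print(finite_difference_sensitivities(lambda v: v[0]**2 + 3.0*v[1], [1.0, 2.0]))
    ```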

  13. Integrated hydrologic modeling: Effects of spatial scale, discretization and initialization

    NASA Astrophysics Data System (ADS)

    Seck, A.; Welty, C.; Maxwell, R. M.

    2011-12-01

    Groundwater discharge contributes significantly to the annual flows of Chesapeake Bay tributaries and is presumed to contribute to the observed lag time between the implementation of management actions and the environmental response in the Chesapeake Bay. To investigate groundwater fluxes and flow paths and interaction with surface flow, we have developed a fully distributed integrated hydrologic model of the Chesapeake Bay Watershed using ParFlow. Here we present a comparison of model spatial resolution and initialization methods. We have studied the effect of horizontal discretization on overland flow processes at a range of scales. Three nested model domains have been considered: the Monocacy watershed (5600 sq. km), the Potomac watershed (92000 sq. km) and the Chesapeake Bay watershed (400,000 sq. km). Models with homogeneous subsurface and topographically-derived slopes were evaluated at 500-m, 1000-m, 2000-m, and 4000-m grid resolutions. Land surface slopes were derived from resampled DEMs and corrected using stream networks. Simulation results show that the overland flow processes are reasonably well represented with a resolution up to 2000 m. We observe that the effects of horizontal resolution dissipate with larger scale models. Using a homogeneous model that includes subsurface and surface terrain characteristics, we have evaluated various initialization methods for the integrated Monocacy watershed model. This model used several options for water table depths and two rainfall forcing methods including (1) a synthetic rainfall-recession cycle corresponding to the region's average annual rainfall rate, and (2) an initial shut-off of rainfall forcing followed by a rainfall-recession cycling. Results show the dominance of groundwater generated runoff during a first phase of the simulation followed by a convergence towards more balanced runoff generation mechanisms. We observe that the influence of groundwater runoff increases in dissected relief areas characterized by high slope magnitudes. This is due to the increase in initial water table gradients in these regions. As a result, in the domain conditions for this study, an initial shut-off of rainfall forcing proved to be the more efficient initialization method. The initialized model is then coupled with a Land Surface Model (CLM). Ongoing work includes coupling a heterogeneous subsurface field with spatially variable meteorological forcing using the National Land Data Assimilation System (NLDAS) data products. Seasonal trends of groundwater levels for current and pre-development conditions of the basin will be compared.

  14. "I Finally Get It!": Developing Mathematical Understanding during Teacher Education

    ERIC Educational Resources Information Center

    Holm, Jennifer; Kajander, Ann

    2012-01-01

    A deep conceptual understanding of elementary mathematics as appropriate for teaching is increasingly thought to be an important aspect of elementary teacher capacity. This study explores preservice teachers' initial mathematical understandings and how these understandings developed during a mathematics methods course for upper elementary…

  15. DEVELOPMENT OF A MICROSCALE EMISSION FACTOR MODEL FOR CO FOR PREDICTING REAL-TIME MOTOR VEHICLE EMISSIONS

    EPA Science Inventory

    The United States Environmental Protection Agency's (EPA) National Exposure Research Laboratory (NERL) has initiated a project to improve the methodology for modeling human exposure to motor vehicle emission. The overall project goal is to develop improved methods for modeling...

  16. An Object-Based Requirements Modeling Method.

    ERIC Educational Resources Information Center

    Cordes, David W.; Carver, Doris L.

    1992-01-01

    Discusses system modeling and specification as it relates to object-based information systems development and software development. An automated system model based on the objects in the initial requirements document is described, the requirements document translator is explained, and a sample application of the technique is provided. (12…

  17. Applying the Scientific Method of Cybersecurity Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tardiff, Mark F.; Bonheyo, George T.; Cort, Katherine A.

    The cyber environment has rapidly evolved from a curiosity to an essential component of the contemporary world. As the cyber environment has expanded and become more complex, so have the nature of adversaries and styles of attacks. Today, cyber incidents are an expected part of life. As a result, cybersecurity research emerged to address adversarial attacks interfering with or preventing normal cyber activities. Historical response to cybersecurity attacks is heavily skewed to tactical responses with an emphasis on rapid recovery. While threat mitigation is important and can be time critical, a knowledge gap exists with respect to developing the science of cybersecurity. Such a science will enable the development and testing of theories that lead to understanding the broad sweep of cyber threats and the ability to assess trade-offs in sustaining network missions while mitigating attacks. The Asymmetric Resilient Cybersecurity Initiative at Pacific Northwest National Laboratory is a multi-year, multi-million dollar investment to develop approaches for shifting the advantage to the defender and sustaining the operability of systems under attack. The initiative established a Science Council to focus attention on the research process for cybersecurity. The Council shares science practices, critiques research plans, and aids in documenting and reporting reproducible research results. The Council members represent ecology, economics, statistics, physics, computational chemistry, microbiology and genetics, and geochemistry. This paper reports the initial work of the Science Council to implement the scientific method in cybersecurity research. The second section describes the scientific method. The third section in this paper discusses scientific practices for cybersecurity research. Section four describes initial impacts of applying the science practices to cybersecurity research.

  18. Numerical Analysis of Effectiveness of Strengthening Concrete Slab in Tension of the Steel-Concrete Composite Beam Using Pretensioned CFRP Strips

    NASA Astrophysics Data System (ADS)

    Jankowiak, Iwona; Madaj, Arkadiusz

    2017-12-01

    One of the methods to increase the load-carrying capacity of a reinforced concrete (RC) structure is to strengthen it with carbon fiber (CFRP) strips. There are two methods of strengthening with CFRP strips: the passive method and the active method. In the passive method, a strip is applied to the concrete surface without initial strains, whereas in the active method the strip is pretensioned before application. In the case of a steel-concrete composite beam, strips may be used to strengthen the concrete slab located in the tension zone (in the parts of beams with negative bending moments). A finite element model has been developed and validated by experimental tests to evaluate the strengthening efficiency of the composite girder with pretensioned CFRP strips applied to the concrete slab in its tension zone.

  19. Oxygen production on the Lunar materials processing frontier

    NASA Technical Reports Server (NTRS)

    Altenberg, Barbara H.

    1992-01-01

    During the pre-conceptual design phase of an initial lunar oxygen processing facility, it is essential to identify, compare, and evaluate the available processes in order to ensure the success of such an endeavor. The focus of this paper is to provide an overview of materials processing to produce lunar oxygen as one part of a given scenario of a developing lunar occupation. More than twenty-five techniques to produce oxygen from lunar materials have been identified. While it is important to continue research on any feasible method, not all methods can be implemented at the initial lunar facility. Hence, it is necessary during the pre-conceptual design phase to evaluate all methods and determine the leading processes for initial focus. Researchers have developed techniques for evaluating the numerous proposed methods in order to suggest which processes would be the best candidates to send to the Moon first. In one section of this paper, the recent evaluation procedures presented in the literature are compared and contrasted. In general, the production methods for lunar oxygen fall into four categories: thermochemical, reactive solvent, pyrolytic, and electrochemical. Examples from two of the four categories are described, operating characteristics are contrasted, and terrestrial analogs are presented when possible. In addition to producing oxygen for use as a propellant and for life support, valuable co-products can be derived from some of the processes. This information is also highlighted in the description of a given process.

  20. Analytical Method of Approximating the Motion of a Spinning Vehicle with Variable Mass and Inertia Properties Acted Upon by Several Disturbing Parameters

    NASA Technical Reports Server (NTRS)

    Buglia, James J.; Young, George R.; Timmons, Jesse D.; Brinkworth, Helen S.

    1961-01-01

    An analytical method has been developed that approximates the dispersion of a spinning symmetrical body in a vacuum, with time-varying mass and inertia characteristics, under the action of several external disturbances: initial pitching rate, thrust misalignment, and dynamic unbalance. The ratio of the roll inertia to the pitch or yaw inertia is assumed constant. Spin was found to be very effective in reducing the dispersion due to an initial pitch rate or thrust misalignment, but was completely ineffective in reducing the dispersion of a dynamically unbalanced body.

  1. EVA Glove Research Team

    NASA Technical Reports Server (NTRS)

    Strauss, Alvin M.; Peterson, Steven W.; Main, John A.; Dickenson, Rueben D.; Shields, Bobby L.; Lorenz, Christine H.

    1992-01-01

    The goal of the basic research portion of the extravehicular activity (EVA) glove research program is to gain a greater understanding of the kinematics of the hand, the characteristics of the pressurized EVA glove, and the interaction of the two. Examination of the literature showed that there existed no acceptable, non-invasive method of obtaining accurate biomechanical data on the hand. For this reason a project was initiated to develop magnetic resonance imaging as a tool for biomechanical data acquisition and visualization. Literature reviews also revealed a lack of practical modeling methods for fabric structures, so a basic science research program was also initiated in this area.

  2. Advanced Mechanistic 3D Spatial Modeling and Analysis Methods to Accurately Represent Nuclear Facility External Event Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sezen, Halil; Aldemir, Tunc; Denning, R.

    Probabilistic risk assessment of nuclear power plants initially focused on events initiated by internal faults at the plant, rather than external hazards including earthquakes and flooding. Although the importance of external hazards risk analysis is now well recognized, the methods for analyzing low probability external hazards rely heavily on subjective judgment of specialists, often resulting in substantial conservatism. This research developed a framework to integrate the risk of seismic and flooding events using realistic structural models and simulation of response of nuclear structures. The results of four application case studies are presented.

  3. Nonlinear Simulation of the Tooth Enamel Spectrum for EPR Dosimetry

    NASA Astrophysics Data System (ADS)

    Kirillov, V. A.; Dubovsky, S. V.

    2016-07-01

    Software was developed in which initial EPR spectra of tooth enamel are deconvoluted based on nonlinear simulation: line shapes and signal amplitudes in the model initial spectrum are calculated, the regression coefficient is evaluated, and individual spectra are summed. Software validation demonstrated that doses calculated with it agreed very well with the applied radiation doses and with the doses reconstructed by the method of additive doses.
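
    A deconvolution of this kind typically amounts to a nonlinear least-squares fit of parametric line shapes to the measured spectrum. The sketch below fits a two-component Gaussian model to synthetic data with SciPy; the actual line-shape models, background handling, and dose-response calibration used by the software are not described in the abstract and are not reproduced here.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def gaussian(x, amp, center, width):
        return amp * np.exp(-0.5 * ((x - center) / width) ** 2)

    def two_component_model(x, a1, c1, w1, a2, c2, w2):
        # Native (background) signal plus radiation-induced signal, both modeled
        # here as simple Gaussians purely for illustration.
        return gaussian(x, a1, c1, w1) + gaussian(x, a2, c2, w2)

    # Synthetic "measured" spectrum standing in for a tooth-enamel EPR scan.
    x = np.linspace(-10.0, 10.0, 400)                     # field axis, arbitrary units
    y_true = two_component_model(x, 1.0, -1.0, 2.0, 0.4, 2.0, 1.0)
    y = y_true + np.random.default_rng(1).normal(0.0, 0.02, x.size)

    popt, _ = curve_fit(two_component_model, x, y, p0=[0.8, -0.5, 1.5, 0.3, 1.5, 1.2])
    print("fitted radiation-signal amplitude:", round(popt[3], 3))  # feeds dose reconstruction
    ```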

  4. Traveler Phase 1A Joint Review

    NASA Technical Reports Server (NTRS)

    St. John, Clint; Scofield, Jan; Skoog, Mark; Flock, Alex; Williams, Ethan; Guirguis, Luke; Loudon, Kevin; Sutherland, Jeffrey; Lehmann, Richard; Garland, Michael; hide

    2017-01-01

    The briefing contains the preliminary findings and suggestions for improving the methods used in the development and evaluation of a multi-monitor runtime assurance architecture for autonomous flight vehicles. Initial system design, implementation, verification, and flight testing have been conducted. As yet, detailed data review is incomplete, and flight testing has been limited to initial monitor force fights. Detailed monitor flight evaluations have yet to be performed.

  5. Curvature estimation for multilayer hinged structures with initial strains

    NASA Astrophysics Data System (ADS)

    Nikishkov, G. P.

    2003-10-01

    A closed-form estimate of curvature for hinged multilayer structures with initial strains is developed. The finite element method is used for modeling self-positioning microstructures. The geometrically nonlinear problem with large rotations and large displacements is solved using a step procedure with node-coordinate updates. Finite element results for the curvature of a hinged micromirror with variable width are compared to the closed-form estimates.
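
    As a point of comparison (not the paper's own closed-form result), the classical Timoshenko bimetal formula gives the curvature of a two-layer strip with mismatched initial strains; a quick sketch under illustrative layer properties:

    ```python
    def bilayer_curvature(eps1, eps2, t1, t2, E1, E2):
        """Classical Timoshenko bimetal estimate of curvature (1/m) for two bonded
        layers with mismatched initial strains eps1 (layer 1) and eps2 (layer 2).
        Generic textbook formula, not the closed form derived in this paper."""
        m = t1 / t2            # thickness ratio
        n = E1 / E2            # modulus ratio
        h = t1 + t2            # total thickness
        d_eps = eps2 - eps1    # strain mismatch driving the bending
        return (6.0 * d_eps * (1.0 + m) ** 2 /
                (h * (3.0 * (1.0 + m) ** 2 + (1.0 + m * n) * (m ** 2 + 1.0 / (m * n)))))

    # Illustrative numbers only: a 0.5 um film with -0.1% initial strain on a 1.0 um layer.
    kappa = bilayer_curvature(eps1=-1.0e-3, eps2=0.0, t1=0.5e-6, t2=1.0e-6, E1=70e9, E2=160e9)
    print(f"curvature: {kappa:.1f} 1/m, radius: {1.0/kappa*1e3:.2f} mm")
    ```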

  6. Creating Workforce Development Systems That Work: An Evaluation of the Initial One-Stop Implementation Experience. Final Report.

    ERIC Educational Resources Information Center

    Kogan, Deborah; Dickinson, Katherine P.; Fedrau, Ruth; Midling, Michael J.; Wolff, Kristin E.

    This report analyzes the progress that states and local sites have made in implementing One-Stop Career Center systems. An executive summary is followed by Section A, Introduction, which provides an overview of the One-Stop initiative and describes evaluation objectives and methods. The main portion of the report is organized into three major sections.…

  7. National Audubon society's technology initiatives for bird conservation: a summary of application development for the Christmas bird count

    Treesearch

    Kathy Dale

    2005-01-01

    Since 1998, Audubon's Christmas Bird Count (CBC) has been supported by an Internet-based data entry application that was initially designed to accommodate the traditional paper-based methods of this long-running bird monitoring program. The first efforts to computerize the data and the entry procedures have informed a planned strategy to revise the current...

  8. The NanoSustain and NanoValid project--two new EU FP7 research initiatives to assess the unique physical-chemical and toxicological properties of engineered nanomaterials.

    PubMed

    Reuther, Rudolf

    2011-02-01

    In 2010, the EU FP NanoSustain project (247989) was successfully launched with the objective of developing innovative solutions for the sustainable use, recycling, and final treatment of engineered nanomaterials (ENMs). The same year, NanoValid (263147), a large-scale integrating EU FP7 project, was initiated and contract negotiations with the European Commission commenced, with the goal of developing new reference methods and materials applicable to the unique properties of ENMs. This paper gives an overview of the main objectives of these two new European research initiatives, the main tasks required to achieve those objectives, and the impact on current standardization efforts and technical innovations.

  9. Methods of analysis by the U.S. Geological Survey National Water Quality Laboratory; a method supplement for the determination of Fipronil and degradates in water by gas chromatography/mass spectrometry

    USGS Publications Warehouse

    Madsen, James F.; Sandstrom, Mark W.; Zaugg, Steven D.

    2002-01-01

    A method for the isolation and determination of fipronil and four of its degradates has been developed. This method adapts an analytical method created by the U.S. Geological Survey National Water Quality Laboratory in 1995 for the determination of a broad range of high-use pesticides typically found in filtered natural-water samples. In 2000, fipronil and four of its degradates were extracted, analyzed, and validated using this method. The recoveries for these five compounds in reagent-water samples fortified at 1 microgram per liter (ug/L) averaged 98 percent. Initial method detection limits averaged 0.0029 ug/L. The performance of these five new compounds is consistent with the performance of the compounds in the initial method, making it possible to include them in addition to the other 41 pesticides and pesticide degradates in the original method.

  10. The Substrata-Factor Theory of Reading: Differential Development of Subsystems Underlying Reading Comprehension in the First Year of Instruction.

    ERIC Educational Resources Information Center

    Katz, Ina; Singer, Harry

    A study tested the instructional hypothesis that variation in instructional methods in the initial stages of formal reading development will differentially develop subsystems for attaining comprehension. The 91 kindergarten and first grade students in the study received their usual reading instruction plus supplementary instruction in one of four…

  11. Disability Statistics in the Developing World: A Reflection on the Meanings in Our Numbers

    ERIC Educational Resources Information Center

    Fujiura, Glenn T.; Park, Hye J.; Rutkowski-Kmitta, Violet

    2005-01-01

    Background: The imbalance between the sheer size of the developing world and what little is known about the lives and life circumstances of persons with disabilities living there should command our attention. Method: International development initiatives routinely give great priority to the collection of statistical indicators yet even the most…

  12. A review of empirical research related to the use of small quantitative samples in clinical outcome scale development.

    PubMed

    Houts, Carrie R; Edwards, Michael C; Wirth, R J; Deal, Linda S

    2016-11-01

    There has been a notable increase in the advocacy of using small-sample designs as an initial quantitative assessment of item and scale performance during the scale development process. This is particularly true in the development of clinical outcome assessments (COAs), where Rasch analysis has been advanced as an appropriate statistical tool for evaluating the developing COAs using a small sample. We review the benefits such methods are purported to offer from both a practical and statistical standpoint and detail several problematic areas, including both practical and statistical theory concerns, with respect to the use of quantitative methods, including Rasch-consistent methods, with small samples. The feasibility of obtaining accurate information and the potential negative impacts of misusing large-sample statistical methods with small samples during COA development are discussed.

  13. Development of the adult PedsQL™ neurofibromatosis type 1 module: initial feasibility, reliability and validity.

    PubMed

    Nutakki, Kavitha; Hingtgen, Cynthia M; Monahan, Patrick; Varni, James W; Swigonski, Nancy L

    2013-02-21

    Neurofibromatosis type 1 (NF1) is a common autosomal dominant genetic disorder with significant impact on health-related quality of life (HRQOL). Research into the pathogenetic mechanisms of neurofibroma development has led to new clinical trials for the treatment of NF1. One of the most important outcomes of a trial is improvement in quality of life; however, no condition-specific HRQOL instrument for NF1 exists. The objective of this study was to develop an NF1 HRQOL instrument as a module of PedsQL™ and to test its initial feasibility, internal consistency reliability, and validity in adults with NF1. The NF1-specific HRQOL instrument was developed using the standard method of PedsQL™ module development: literature review, focus group/semi-structured interviews, cognitive interviews and expert review of the initial draft, pilot testing, and field testing. Field testing involved 134 adults with NF1. Feasibility was measured by the percentage of missing responses, internal consistency reliability was measured with Cronbach's alpha, and validity was measured by the known-groups method. Feasibility, measured by the percentage of missing responses, was 4.8% for all subscales on the adult version of the NF1-specific instrument. Internal consistency reliability for the Total Score (alpha = 0.97) and subscale reliabilities ranging from 0.72 to 0.96 were acceptable for group comparisons. The PedsQL™ NF1 module distinguished between NF1 adults with excellent to very good, good, and fair to poor health status. The results demonstrate the initial feasibility, reliability, and validity of the PedsQL™ NF1 module in adult patients. The PedsQL™ NF1 Module can be used to understand the multidimensional nature of NF1 on the HRQOL of patients with this disorder.
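
    For reference, internal consistency reliability of the kind reported here is typically computed as Cronbach's alpha over an items-by-respondents matrix. The sketch below shows the standard formula on made-up data; it is not tied to the PedsQL™ NF1 item set.

    ```python
    import numpy as np

    def cronbach_alpha(item_scores):
        """item_scores: 2-D array with rows = respondents and columns = scale items."""
        X = np.asarray(item_scores, dtype=float)
        k = X.shape[1]                                   # number of items in the subscale
        item_variances = X.var(axis=0, ddof=1).sum()     # sum of per-item variances
        total_variance = X.sum(axis=1).var(ddof=1)       # variance of respondents' total scores
        return (k / (k - 1.0)) * (1.0 - item_variances / total_variance)

    # Made-up 0-4 responses from 8 respondents on a hypothetical 5-item subscale.
    demo = np.array([[4, 4, 3, 4, 4],
                     [2, 1, 2, 2, 1],
                     [3, 3, 3, 2, 3],
                     [4, 3, 4, 4, 4],
                     [1, 1, 0, 1, 1],
                     [3, 2, 3, 3, 2],
                     [2, 2, 2, 1, 2],
                     [4, 4, 4, 3, 4]])
    print(f"Cronbach's alpha: {cronbach_alpha(demo):.2f}")
    ```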

  14. STRengthening analytical thinking for observational studies: the STRATOS initiative.

    PubMed

    Sauerbrei, Willi; Abrahamowicz, Michal; Altman, Douglas G; le Cessie, Saskia; Carpenter, James

    2014-12-30

    The validity and practical utility of observational medical research depends critically on good study design, excellent data quality, appropriate statistical methods and accurate interpretation of results. Statistical methodology has seen substantial development in recent times. Unfortunately, many of these methodological developments are ignored in practice. Consequently, design and analysis of observational studies often exhibit serious weaknesses. The lack of guidance on vital practical issues discourages many applied researchers from using more sophisticated and possibly more appropriate methods when analyzing observational studies. Furthermore, many analyses are conducted by researchers with a relatively weak statistical background and limited experience in using statistical methodology and software. Consequently, even 'standard' analyses reported in the medical literature are often flawed, casting doubt on their results and conclusions. An efficient way to help researchers to keep up with recent methodological developments is to develop guidance documents that are spread to the research community at large. These observations led to the initiation of the strengthening analytical thinking for observational studies (STRATOS) initiative, a large collaboration of experts in many different areas of biostatistical research. The objective of STRATOS is to provide accessible and accurate guidance in the design and analysis of observational studies. The guidance is intended for applied statisticians and other data analysts with varying levels of statistical education, experience and interests. In this article, we introduce the STRATOS initiative and its main aims, present the need for guidance documents and outline the planned approach and progress so far. We encourage other biostatisticians to become involved. © 2014 The Authors. Statistics in Medicine published by John Wiley & Sons, Ltd.

  15. Procedure for analysis and design of weaving sections : volume 1, research findings and development of techniques for application.

    DOT National Transportation Integrated Search

    1983-12-01

    This research was performed to complete and advance the status of recently developed procedures for analysis and design of weaving sections (known as the Leisch method and initially published in the 1979 issue of ITE Journal). The objective was to ...

  16. Initial investigation of a hypothesized link between thyroid peroxidase inhibition and fish early-life stage toxicity

    EPA Science Inventory

    There is an interest in developing alternatives to the fish early-life stage (FELS) test (OECD test guideline 210), for predicting adverse outcomes (e.g., impacts on growth and survival) using less resource-intensive methods. Development and characterization of adverse outcome pa...

  17. DEVELOPMENT OF A MICROSCALE EMISSION FACTOR MODEL FOR PARTICULATE MATTER (MICROFACPM) FOR PREDICTING REAL-TIME MOTOR VEHICLE EMISSIONS

    EPA Science Inventory

    The United States Environmental Protection Agency's National Exposure Research Laboratory has initiated a project to improve the methodology for modeling human exposure to motor vehicle emissions. The overall project goal is to develop improved methods for modeling the source t...

  18. DEVELOPMENT OF A MICROSCALE EMISSION FACTOR MODEL FOR CO (MICROFACCO) FOR PREDICTING REAL-TIME VEHICLE EMISSIONS

    EPA Science Inventory

    The United States Environmental Protection Agency's National Exposure Research Laboratory has initiated a project to improve the methodology for modeling human exposure to motor vehicle emissions. The overall project goal is to develop improved methods for modeling the source t...

  19. Developing a New Field-Validated Methodology for Landfill Methane Emissions in California

    USDA-ARS?s Scientific Manuscript database

    This project was initiated in the US by the California Energy Commission (CEC) in cooperation with the California Integrated Waste Management Board (CIWMB) to develop improved methods for landfill methane emissions for the California greenhouse gas inventory. This 3-year project (2007-2010) is devel...

  20. Merging Quality Processes & Tools with DACUM.

    ERIC Educational Resources Information Center

    McLennan, Krystyna S.

    This paper explains how merging DACUM (Developing a Curriculum) analysis with quality initiatives can reduce waste, increase job efficiency, assist in development of standard operating procedures, and involve employees in positive job improvement methods. In the first half of the paper, the following principles of total quality management (TQM)…

  1. Adolescent Domain Screening Inventory-Short Form: Development and Initial Validation

    ERIC Educational Resources Information Center

    Corrigan, Matthew J.

    2017-01-01

    This study sought to develop a short version of the ADSI and investigate its psychometric properties. Methods: This is a secondary analysis. Analyses to determine Cronbach's alpha, correlations to determine concurrent criterion validity and known-instrument validity, and a logistic regression to determine predictive validity were conducted.…

  2. Managing Change in Small Scottish Primary Schools. SCRE Research Report Series.

    ERIC Educational Resources Information Center

    Wilson, Valerie; McPake, Joanna

    This report describes Scottish research on ways in which headteachers in small primary schools managed mandated changes. The research focused on implementation of four recent major initiatives: 5-14 Curriculum Guidelines, School Development Planning, Staff Development and Appraisal, and Devolved School Management. Research methods included a…

  3. Module for phosphorus separation and recycling from liquid manures

    USDA-ARS?s Scientific Manuscript database

    A method has been developed to extract and concentrate soluble phosphates from livestock wastewater. The research was conducted over a 10-year period and went from initial bench studies and discovery, to pilot module development, to full-scale demonstrations of the phosphorus (P) module in swine fa...

  4. Exploring the Nature and Implications of Student Teacher Engagement with Development Education Initiatives

    ERIC Educational Resources Information Center

    Baily, Fiona; O'Flaherty, Joanne; Hogan, Deirdre

    2017-01-01

    In this article, the authors outline and discuss the findings of a research study, which explored student teacher engagement with development education (DE) interventions implemented within Professional Master of Education (PME) programmes across eight Irish Higher Education Institutions. Interpretivist methods were employed incorporating…

  5. Mapping Children's Understanding of Mathematical Equivalence

    ERIC Educational Resources Information Center

    Taylor, Roger S.; Rittle-Johnson, Bethany; Matthews, Percival G.; McEldoon, Katherine L.

    2009-01-01

    The focus of this research is to develop an initial framework for assessing and interpreting students' level of understanding of mathematical equivalence. Although this topic has been studied for many years, there has been no systematic development or evaluation of a valid measure of equivalence knowledge. A powerful method for accomplishing this…

  6. A review for identification of initiating events in event tree development process on nuclear power plants

    NASA Astrophysics Data System (ADS)

    Riyadi, Eko H.

    2014-09-01

    An initiating event is defined as any event, either internal or external to the nuclear power plant (NPP), that perturbs the steady-state operation of the plant, if operating, thereby initiating an abnormal event such as a transient or a loss-of-coolant accident (LOCA) within the NPP. These initiating events trigger sequences of events that challenge plant control and safety systems whose failure could potentially lead to core damage or large early release. Selection of initiating events consists of two steps: first, definition of possible events, for example by performing a comprehensive engineering evaluation and by constructing a top-level logic model; second, grouping of the identified initiating events by the safety function to be performed or by combinations of system responses. Therefore, the purpose of this paper is to discuss initiating-event identification in the event tree development process and to review other probabilistic safety assessments (PSAs). The identification of initiating events also involves past operating experience, review of other PSAs, failure mode and effects analysis (FMEA), feedback from system modeling, and the master logic diagram (a special type of fault tree). By studying the traditional US PSA categorization in detail, the important initiating events can be obtained and categorized into LOCAs, transients, and external events.
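
    To illustrate what an event tree does with a grouped initiating event, the sketch below enumerates success/failure branches of a few safety functions and sums sequence frequencies. The initiating-event frequency, the failure probabilities, and the "any failure means core damage" end-state rule are all invented for illustration; a real PSA assigns end states sequence by sequence.

    ```python
    from itertools import product

    # Illustrative (made-up) numbers: frequency of one grouped initiating event per
    # year, and failure probabilities of the safety functions it challenges.
    INIT_EVENT_FREQ = 1.0e-2
    SAFETY_FUNCTIONS = {"reactor_trip": 1.0e-5, "emergency_cooling": 1.0e-3, "heat_removal": 1.0e-2}

    def event_tree_sequences():
        """Yield (branch outcomes, sequence frequency) for every success/failure path."""
        names = list(SAFETY_FUNCTIONS)
        for outcome in product([True, False], repeat=len(names)):   # True = success branch
            freq = INIT_EVENT_FREQ
            for name, ok in zip(names, outcome):
                p_fail = SAFETY_FUNCTIONS[name]
                freq *= (1.0 - p_fail) if ok else p_fail
            yield dict(zip(names, outcome)), freq

    # Crude surrogate end-state rule: any safety-function failure counts as core damage.
    cdf = sum(freq for outcome, freq in event_tree_sequences() if not all(outcome.values()))
    print(f"illustrative core-damage frequency: {cdf:.2e} per year")
    ```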

  7. Qualitative methods: what are they and why use them?

    PubMed Central

    Sofaer, S

    1999-01-01

    OBJECTIVE: To provide an overview of reasons why qualitative methods have been used and can be used in health services and health policy research, to describe a range of specific methods, and to give examples of their application. DATA SOURCES: Classic and contemporary descriptions of the underpinnings and applications of qualitative research methods and studies that have used such methods to examine important health services and health policy issues. PRINCIPAL FINDINGS: Qualitative research methods are valuable in providing rich descriptions of complex phenomena; tracking unique or unexpected events; illuminating the experience and interpretation of events by actors with widely differing stakes and roles; giving voice to those whose views are rarely heard; conducting initial explorations to develop theories and to generate and even test hypotheses; and moving toward explanations. Qualitative and quantitative methods can be complementary, used in sequence or in tandem. The best qualitative research is systematic and rigorous, and it seeks to reduce bias and error and to identify evidence that disconfirms initial or emergent hypotheses. CONCLUSIONS: Qualitative methods have much to contribute to health services and health policy research, especially as such research deals with rapid change and develops a more fully integrated theory base and research agenda. However, the field must build on the best traditions and techniques of qualitative methods and must recognize that special training and experience are essential to the application of these methods. PMID:10591275

  8. Experiments and simulations of Richtmyer-Meshkov Instability with measured,volumetric initial conditions

    NASA Astrophysics Data System (ADS)

    Sewell, Everest; Ferguson, Kevin; Jacobs, Jeffrey; Greenough, Jeff; Krivets, Vitaliy

    2016-11-01

    We describe experiments on single-shock Richtmyer-Meshkov Instability (RMI) performed on the shock tube apparatus at the University of Arizona, in which the initial conditions are volumetrically imaged prior to shock wave arrival. Initial perturbations play a major role in the evolution of RMI, and previous experimental efforts captured only a single plane of the initial condition. The method presented uses a rastered laser sheet to capture additional images throughout the depth of the initial condition immediately before the shock arrival time. These images are then used to reconstruct a volumetric approximation of the experimental perturbation. The initial perturbations are analyzed and then used as initial conditions in simulations with the hydrodynamics code ARES, developed at Lawrence Livermore National Laboratory (LLNL). Experiments are presented and comparisons are made with simulation results.

  9. Computerization of guidelines: a knowledge specification method to convert text to detailed decision tree for electronic implementation.

    PubMed

    Aguirre-Junco, Angel-Ricardo; Colombet, Isabelle; Zunino, Sylvain; Jaulent, Marie-Christine; Leneveut, Laurence; Chatellier, Gilles

    2004-01-01

    The initial step in the computerization of guidelines is knowledge specification from the prose text of the guidelines. We describe a method of knowledge specification based on a structured and systematic analysis of the text, allowing detailed specification of a decision tree. We use decision tables to validate the decision algorithm and decision trees to specify and represent this algorithm, along with elementary messages of recommendation. Editing tools are also necessary to facilitate the validation process and the workflow between the expert physicians who validate the specified knowledge and the computer scientists who encode the specified knowledge in a guideline model. Applied to eleven different guidelines issued by an official agency, the method allows a quick and valid computerization and integration into a larger decision support system called EsPeR (Personalized Estimate of Risks). However, the quality of the text guidelines still needs further development. The method used for computerization could help to define a framework usable at the initial step of guideline development in order to produce guidelines ready for electronic implementation.
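
    A decision tree of the kind specified here can be represented very simply in code as nodes that either test a criterion on patient data or carry an elementary recommendation message. The fragment below is a hypothetical illustration (the blood-pressure rule and threshold are invented), not part of the EsPeR knowledge base or its guideline model.

    ```python
    from dataclasses import dataclass
    from typing import Any, Callable, Dict, Optional

    @dataclass
    class Node:
        """One decision node: a criterion over patient data with yes/no branches,
        or a leaf carrying an elementary recommendation message."""
        label: str
        criterion: Optional[Callable[[Dict[str, Any]], bool]] = None
        yes: Optional["Node"] = None
        no: Optional["Node"] = None
        recommendation: Optional[str] = None

    def traverse(node: Node, patient: Dict[str, Any]) -> str:
        """Walk the tree until a leaf recommendation is reached."""
        while node.recommendation is None:
            node = node.yes if node.criterion(patient) else node.no
        return node.recommendation

    # Hypothetical fragment: a single blood-pressure triage rule.
    leaf_treat = Node("treat", recommendation="Start antihypertensive therapy.")
    leaf_monitor = Node("monitor", recommendation="Re-measure in 3 months.")
    root = Node("bp_check", criterion=lambda p: p["systolic_bp"] >= 140,
                yes=leaf_treat, no=leaf_monitor)

    print(traverse(root, {"systolic_bp": 152}))
    ```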

  10. Development of Gold Standard Ion-Selective Electrode-Based Methods for Fluoride Analysis

    PubMed Central

    Martínez-Mier, E.A.; Cury, J.A.; Heilman, J.R.; Katz, B.P.; Levy, S.M.; Li, Y.; Maguire, A.; Margineda, J.; O’Mullane, D.; Phantumvanit, P.; Soto-Rojas, A.E.; Stookey, G.K.; Villa, A.; Wefel, J.S.; Whelton, H.; Whitford, G.M.; Zero, D.T.; Zhang, W.; Zohouri, V.

    2011-01-01

    Background/Aims: Currently available techniques for fluoride analysis are not standardized. Therefore, this study was designed to develop standardized methods for analyzing fluoride in biological and nonbiological samples used for dental research. Methods: A group of nine laboratories analyzed a set of standardized samples for fluoride concentration using their own methods. The group then reviewed existing analytical techniques for fluoride analysis, identified inconsistencies in the use of these techniques and conducted testing to resolve differences. Based on the results of the testing undertaken to define the best approaches for the analysis, the group developed recommendations for direct and microdiffusion methods using the fluoride ion-selective electrode. Results: Initial results demonstrated that there was no consensus regarding the choice of analytical techniques for different types of samples. Although for several types of samples the results of the fluoride analyses were similar among some laboratories, greater differences were observed for saliva, food and beverage samples. In spite of these initial differences, precise and true values of fluoride concentration, as well as smaller differences between laboratories, were obtained once the standardized methodologies were used. Intraclass correlation coefficients ranged from 0.90 to 0.93 for the analysis of a certified reference material using the standardized methodologies. Conclusion: The results of this study demonstrate that the development and use of standardized protocols for F analysis significantly decreased differences among laboratories and resulted in more precise and true values. PMID:21160184
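
    A minimal Python sketch of one common intraclass correlation, ICC(2,1) (two-way random effects, absolute agreement, single measurement), applied to a small inter-laboratory matrix; the 4-sample by 3-laboratory data are invented, and the exact ICC variant used in the study is not reproduced here:

      # ICC(2,1) from the two-way mean squares (Shrout & Fleiss formulation).
      import numpy as np

      def icc_2_1(x):
          """x: (n_samples, n_labs) matrix of measured concentrations."""
          n, k = x.shape
          grand = x.mean()
          row_means = x.mean(axis=1)
          col_means = x.mean(axis=0)
          msr = k * np.sum((row_means - grand) ** 2) / (n - 1)
          msc = n * np.sum((col_means - grand) ** 2) / (k - 1)
          resid = x - row_means[:, None] - col_means[None, :] + grand
          mse = np.sum(resid ** 2) / ((n - 1) * (k - 1))
          return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

      measurements = np.array([      # ppm F; rows are samples, columns are laboratories
          [0.70, 0.72, 0.69],
          [1.01, 1.05, 0.99],
          [0.25, 0.24, 0.26],
          [0.52, 0.55, 0.50],
      ])
      print(round(icc_2_1(measurements), 3))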

  11. In silico platform for predicting and initiating β-turns in a protein at desired locations.

    PubMed

    Singh, Harinder; Singh, Sandeep; Raghava, Gajendra P S

    2015-05-01

    Numerous studies have been performed for the analysis and prediction of β-turns in proteins. This study focuses on analyzing, predicting, and designing β-turns to understand the preference of amino acids in β-turn formation. We analyzed around 20,000 PDB chains to understand the preference of residues or pairs of residues at different positions in β-turns. Based on the results, a propensity-based method has been developed for predicting β-turns with an accuracy of 82%. We introduced a new approach, entitled "turn-level prediction method," which predicts the complete β-turn rather than focusing on individual residues in a β-turn. Finally, we developed BetaTPred3, a Random Forest-based method for predicting β-turns that utilizes various features of the four residues present in β-turns. BetaTPred3 achieved an accuracy of 79% with an MCC of 0.51, which is comparable to or better than existing methods on the BT426 dataset. Additionally, models were developed to predict β-turn types with better performance than other methods available in the literature. In order to improve the quality of turn prediction, we developed prediction models on a large and recent dataset of 6376 nonredundant protein chains. Based on this study, a web server has been developed for the prediction of β-turns and their types in proteins. This web server also predicts the minimum number of mutations required to initiate or break a β-turn at a specified location in a protein. © 2015 Wiley Periodicals, Inc.
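
    As an illustration only (not the BetaTPred3 implementation), a minimal Python sketch of a Random Forest classifier over one-hot-encoded 4-residue windows; the feature encoding and the toy training data are assumptions:

      # Random Forest on 4-residue windows: one-hot encode each residue and
      # train a classifier distinguishing turn from non-turn windows.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

      def encode_window(window):
          """One-hot encode a 4-residue window into an 80-dimensional vector."""
          vec = np.zeros(4 * len(AMINO_ACIDS))
          for i, res in enumerate(window):
              vec[i * len(AMINO_ACIDS) + AMINO_ACIDS.index(res)] = 1.0
          return vec

      # Hypothetical training data: (4-residue window, is_beta_turn) pairs.
      windows = ["NPDG", "GKTS", "ALLV", "DGKS", "IVAL", "PNGS"]
      labels  = [1, 1, 0, 1, 0, 1]

      X = np.array([encode_window(w) for w in windows])
      y = np.array(labels)

      clf = RandomForestClassifier(n_estimators=200, random_state=0)
      print(cross_val_score(clf, X, y, cv=2).mean())   # toy accuracy estimate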

  12. Computer-assisted initial diagnosis of rare diseases

    PubMed Central

    Piñol, Marc; Vilaplana, Jordi; Teixidó, Ivan; Cruz, Joaquim; Comas, Jorge; Vilaprinyo, Ester; Sorribas, Albert

    2016-01-01

    Introduction. Most documented rare diseases have genetic origin. Because of their low individual frequency, an initial diagnosis based on phenotypic symptoms is not always easy, as practitioners might never have been exposed to patients suffering from the relevant disease. It is thus important to develop tools that facilitate symptom-based initial diagnosis of rare diseases by clinicians. In this work we aimed at developing a computational approach to aid in that initial diagnosis. We also aimed at implementing this approach in a user friendly web prototype. We call this tool Rare Disease Discovery. Finally, we also aimed at testing the performance of the prototype. Methods. Rare Disease Discovery uses the publicly available ORPHANET data set of association between rare diseases and their symptoms to automatically predict the most likely rare diseases based on a patient’s symptoms. We apply the method to retrospectively diagnose a cohort of 187 rare disease patients with confirmed diagnosis. Subsequently we test the precision, sensitivity, and global performance of the system under different scenarios by running large scale Monte Carlo simulations. All settings account for situations where absent and/or unrelated symptoms are considered in the diagnosis. Results. We find that this expert system has high diagnostic precision (≥80%) and sensitivity (≥99%), and is robust to both absent and unrelated symptoms. Discussion. The Rare Disease Discovery prediction engine appears to provide a fast and robust method for initial assisted differential diagnosis of rare diseases. We coupled this engine with a user-friendly web interface and it can be freely accessed at http://disease-discovery.udl.cat/. The code and most current database for the whole project can be downloaded from https://github.com/Wrrzag/DiseaseDiscovery/tree/no_classifiers. PMID:27547534
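
    A minimal Python sketch of symptom-overlap scoring against a disease-symptom map, in the spirit of the prediction engine described above; the scoring rule (Jaccard-style overlap) and the toy ORPHANET-like catalogue are illustrative assumptions, not the published algorithm:

      # Rank candidate rare diseases by overlap between patient symptoms and
      # a disease-symptom catalogue (toy data, illustrative scoring).
      disease_symptoms = {
          "Disease A": {"ataxia", "seizures", "hypotonia"},
          "Disease B": {"ataxia", "cataract", "short stature"},
          "Disease C": {"seizures", "macrocephaly"},
      }

      def rank_diseases(patient_symptoms, catalogue):
          """Rank diseases by a simple Jaccard-style overlap with patient symptoms."""
          patient = set(patient_symptoms)
          scores = {}
          for disease, symptoms in catalogue.items():
              scores[disease] = len(patient & symptoms) / len(patient | symptoms)
          return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

      print(rank_diseases({"ataxia", "seizures"}, disease_symptoms))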

  13. Extension of rezoned Eulerian-Lagrangian method to astrophysical plasma applications

    NASA Technical Reports Server (NTRS)

    Song, M. T.; Wu, S. T.; Dryer, Murray

    1993-01-01

    The rezoned Eulerian-Lagrangian procedure developed by Brackbill and Pracht (1973), which is limited to simple configurations of the magnetic fields, is modified in order to make it applicable to astrophysical plasma. For this purpose, two specific methods are introduced, which make it possible to determine the initial field topology for which no analytical expressions are available. Numerical examples illustrating these methods are presented.

  14. Some New Mathematical Methods for Variational Objective Analysis

    NASA Technical Reports Server (NTRS)

    Wahba, G.; Johnson, D. R.

    1984-01-01

    New and/or improved variational methods for simultaneously combining forecast, heterogeneous observational data, a priori climatology, and physics to obtain improved estimates of the initial state of the atmosphere for the purpose of numerical weather prediction are developed. Cross validated spline methods are applied to atmospheric data for the purpose of improved description and analysis of atmospheric phenomena such as the tropopause and frontal boundary surfaces.

  15. Pioneers and Followers: Migrant Selectivity and the Development of U.S. Migration Streams in Latin America

    PubMed Central

    Lindstrom, David P.; Ramírez, Adriana López

    2013-01-01

    We present a method for dividing the historical development of community migration streams into an initial period and a subsequent takeoff stage with the purpose of systemically differentiating pioneer migrants from follower migrants. The analysis is organized around five basic research questions. First, can we empirically identify a juncture in the historical development of community-based migration that marks the transition from an initial stage of low levels of migration and gradual growth into a takeoff stage in which the prevalence of migration grows at a more accelerated rate? Second, does this juncture point exist at roughly similar migration prevalence levels across communities? Third, are first-time migrants in the initial stage (pioneers) different from first-time migrants in the takeoff stage (followers)? Fourth, what is the nature of this migrant selectivity? Finally, does the nature and degree of pioneer selectivity vary across country migration streams? PMID:24489382

  16. Analog Design for Digital Deployment of a Serious Leadership Game

    NASA Technical Reports Server (NTRS)

    Maxwell, Nicholas; Lang, Tristan; Herman, Jeffrey L.; Phares, Richard

    2012-01-01

    This paper presents the design, development, and user testing of a leadership development simulation. The authors share lessons learned from using a design process for a board game to allow for quick and inexpensive revision cycles during the development of a serious leadership development game. The goal of this leadership simulation is to accelerate the development of leadership capacity in high-potential mid-level managers (GS-15 level) in a federal government agency. Simulation design included a mixed-method needs analysis, using both quantitative and qualitative approaches to determine organizational leadership needs. Eight design iterations were conducted, including three user testing phases. Three re-design iterations followed initial development, enabling game testing as part of comprehensive instructional events. Subsequent design, development and testing processes targeted digital application to a computer- and tablet-based environment. Recommendations include pros and cons of development and learner testing of an initial analog simulation prior to full digital simulation development.

  17. Using exact solutions to develop an implicit scheme for the baroclinic primitive equations

    NASA Technical Reports Server (NTRS)

    Marchesin, D.

    1984-01-01

    The exact solutions presently obtained by means of a novel method for nonlinear initial value problems are used in the development of numerical schemes for the computer solution of these problems. The method is applied to a new, fully implicit scheme on a vertical slice of the isentropic baroclinic equations. It was not possible to find a global scale phenomenon that could be simulated by the baroclinic primitive equations on a vertical slice.

  18. Impact of a theory-based video on initiation of long-acting reversible contraception after abortion.

    PubMed

    Davidson, AuTumn S; Whitaker, Amy K; Martins, Summer L; Hill, Brandon; Kuhn, Caroline; Hagbom-Ma, Catherine; Gilliam, Melissa

    2015-03-01

    Adoption of long-acting reversible contraception (LARC) (ie, the intrauterine device or the contraceptive implant) immediately after abortion is associated with high contraceptive satisfaction and reduced rates of repeat abortion. Theory-based counseling interventions have been demonstrated to improve a variety of health behaviors; data on theory-based counseling interventions for postabortion contraception are lacking. Informed by the transtheoretical model of behavioral change, a video intervention was developed to increase awareness of, and dispel misconceptions about, LARC methods. The intervention was evaluated in a randomized controlled trial among women aged 18-29 years undergoing surgical abortion at a clinic in Chicago, IL. Participants were randomized 1:1 to watch the intervention video or to watch a stress management video (control), both 7 minutes in duration. Contraceptive methods were supplied to all participants free of charge. Rates of LARC initiation immediately after abortion were compared. Rates of LARC initiation immediately after abortion were not significantly different between the 2 study arms; 59.6% in the intervention and 51.6% in the control arm chose a LARC method (P = .27). This study resulted in an unexpectedly high rate of LARC initiation immediately after abortion. High rates of LARC initiation could not be attributed to a theory-based counseling intervention. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Ocean Predictability and Uncertainty Forecasts Using Local Ensemble Transform Kalman Filter (LETKF)

    NASA Astrophysics Data System (ADS)

    Wei, M.; Hogan, P. J.; Rowley, C. D.; Smedstad, O. M.; Wallcraft, A. J.; Penny, S. G.

    2017-12-01

    Ocean predictability and uncertainty are studied with an ensemble system that has been developed based on the US Navy's operational HYCOM using the Local Ensemble Transform Kalman Filter (LETKF) technology. One of the advantages of this method is that the best possible initial analysis states for the HYCOM forecasts are provided by the LETKF, which assimilates operational observations using an ensemble method. The background covariance during this assimilation process is supplied implicitly by the ensemble, avoiding the difficult task of developing tangent linear and adjoint models of HYCOM, with its complicated hybrid isopycnal vertical coordinate, for 4D-Var. The flow-dependent background covariance from the ensemble will be an indispensable part of the next-generation hybrid 4D-Var/ensemble data assimilation system. The predictability and uncertainty of the ocean forecasts are studied initially for the Gulf of Mexico. The results are compared with another ensemble system using the Ensemble Transform (ET) method, which has been used at the Navy's operational center. The advantages and disadvantages are discussed.
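
    A minimal Python sketch of the flow-dependent background covariance implied by an ensemble, which is the quantity an LETKF-style scheme exploits in place of a static covariance; the state dimension and ensemble size are toy values, and the local analysis and transform steps of the full LETKF are not shown:

      # Sample background covariance estimated from ensemble perturbations.
      import numpy as np

      rng = np.random.default_rng(0)
      n_state, n_members = 5, 20
      ensemble = rng.normal(size=(n_state, n_members))   # columns are members

      mean = ensemble.mean(axis=1, keepdims=True)
      perturbations = ensemble - mean                    # X' in the usual notation
      P_b = perturbations @ perturbations.T / (n_members - 1)   # flow-dependent covariance

      print(P_b.shape)   # (5, 5) background covariance estimate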

  20. Flow analysis and design optimization methods for nozzle-afterbody of a hypersonic vehicle

    NASA Technical Reports Server (NTRS)

    Baysal, O.

    1992-01-01

    This report summarizes the methods developed for the aerodynamic analysis and the shape optimization of the nozzle-afterbody section of a hypersonic vehicle. Initially, exhaust gases were assumed to be air. Internal-external flows around a single scramjet module were analyzed by solving the 3D Navier-Stokes equations. Then, exhaust gases were simulated by a cold mixture of Freon and Ar. Two different models were used to compute these multispecies flows as they mixed with the hypersonic airflow. Surface and off-surface properties were successfully compared with the experimental data. The Aerodynamic Design Optimization with Sensitivity analysis was then developed. Pre- and postoptimization sensitivity coefficients were derived and used in this quasi-analytical method. These coefficients were also used to predict inexpensively the flow field around a changed shape when the flow field of an unchanged shape was given. Starting with totally arbitrary initial afterbody shapes, independent computations were converged to the same optimum shape, which rendered the maximum axial thrust.

  1. Flow analysis and design optimization methods for nozzle afterbody of a hypersonic vehicle

    NASA Technical Reports Server (NTRS)

    Baysal, Oktay

    1991-01-01

    This report summarizes the methods developed for the aerodynamic analysis and the shape optimization of the nozzle-afterbody section of a hypersonic vehicle. Initially, exhaust gases were assumed to be air. Internal-external flows around a single scramjet module were analyzed by solving the three-dimensional Navier-Stokes equations. Then, exhaust gases were simulated by a cold mixture of Freon and Argon. Two different models were used to compute these multispecies flows as they mixed with the hypersonic airflow. Surface and off-surface properties were successfully compared with the experimental data. In the second phase of this project, the Aerodynamic Design Optimization with Sensitivity analysis (ADOS) was developed. Pre- and post-optimization sensitivity coefficients were derived and used in this quasi-analytical method. These coefficients were also used to predict inexpensively the flow field around a changed shape when the flow field of an unchanged shape was given. Starting with totally arbitrary initial afterbody shapes, independent computations were converged to the same optimum shape, which rendered the maximum axial thrust.

  2. Investigation of direct solar-to-microwave energy conversion techniques

    NASA Technical Reports Server (NTRS)

    Chatterton, N. E.; Mookherji, T. K.; Wunsch, P. K.

    1978-01-01

    Alternative methods of producing microwave energy from solar radiation, for the purpose of directing power to Earth from space, are identified and investigated. Specifically, methods for converting optical radiation into microwave radiation by the most direct means are examined. Approaches based on demonstrated device operation and basic phenomenologies are developed. No system concept was developed that is competitive with current baseline concepts. The most direct methods of conversion appear to require an initial step of producing coherent laser radiation; other methods generally require production of electron streams for use in solid-state or cavity-oscillator systems. Further development is suggested as worthwhile for the proposed devices and for concepts utilizing a free-electron stream as the intra-space-station power transport mechanism.

  3. Self-catalyzed photo-initiated RAFT polymerization for fabrication of fluorescent polymeric nanoparticles with aggregation-induced emission feature.

    PubMed

    Zeng, Guangjian; Liu, Meiying; Jiang, Ruming; Huang, Qiang; Huang, Long; Wan, Qing; Dai, Yanfeng; Wen, Yuanqing; Zhang, Xiaoyong; Wei, Yen

    2018-02-01

    In recent years, fluorescent polymeric nanoparticles (FPNs) with the aggregation-induced emission (AIE) feature have been extensively exploited in various biomedical fields owing to their advantages, such as low toxicity, biodegradability, excellent biocompatibility, good designability and favorable optical properties. The development of a facile, efficient and readily designable strategy is therefore of great importance for the biomedical applications of these AIE-active FPNs. In this work, a novel method for the fabrication of AIE-active FPNs has been developed through self-catalyzed photo-initiated reversible addition fragmentation chain transfer (RAFT) polymerization using an AIE-dye-containing chain transfer agent (CTA), which can initiate the RAFT polymerization under light irradiation. The results suggest that the final AIE-active FPNs (named TPE-poly(St-PEGMA)) show great potential for biomedical applications owing to their optical and biological properties. More importantly, the method described in this work is simple and effective and can be extended to prepare many other AIE-active FPNs owing to the good monomer adaptability of RAFT polymerization. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Study of the plastic zone around the ligament of thin sheet D.E.N.T specimen subjected to tensile

    NASA Astrophysics Data System (ADS)

    Djebali, S.; Larbi, S.; Bilek, A.

    2015-03-01

    One of the assumptions of Cotterell and Reddel's method for determining the essential work of fracture is the existence of a fracture process zone surrounded by an outer plastic zone extending over the whole ligament before crack initiation. To verify this hypothesis, we developed a method based on microhardness. The hardness values measured in the region surrounding the tensile fracture area of ST-37-2 steel sheet D.E.N.T specimens confirm the existence of the two plastic zones. The extension of the plastic deformations to the whole ligament before crack initiation and the circular shape of the outer plastic zone are revealed by the brittle coating method.

  5. Development of a numerical system to improve particulate matter forecasts in South Korea using geostationary satellite-retrieved aerosol optical data over Northeast Asia

    NASA Astrophysics Data System (ADS)

    Lee, Sojin; Song, Chul-han; Park, Rae Seol; Park, Mi Eun; Han, Kyung man; Kim, Jhoon; Choi, Myungje; Ghim, Young Sung; Woo, Jung-Hun

    2016-04-01

    To improve short-term particulate matter (PM) forecasts in South Korea, the initial distribution of PM composition, particularly over the upwind regions, is of primary importance. To prepare the initial PM composition, aerosol optical depth (AOD) data retrieved from a geostationary equatorial orbit (GEO) satellite sensor, GOCI (Geostationary Ocean Color Imager), which covers part of Northeast Asia (113-146° E; 25-47° N), were used. Although GOCI provides a larger number of AOD data in a semicontinuous manner than low Earth orbit (LEO) satellite sensors, it still has a serious limitation in that AOD data are not available at cloud pixels and over high-reflectance areas, such as desert and snow-covered regions. To overcome this limitation, a spatiotemporal-kriging (STK) method was used to better prepare the initial AOD distributions that were converted into the PM composition over Northeast Asia. One of the largest advantages of using the STK method in this study is that more observed AOD data can be used to prepare the best initial AOD fields, compared with other methods that use a single frame of observation data around the time of initialization. It is demonstrated in this study that the short-term PM forecast system developed with the STK method can greatly improve PM10 predictions in the Seoul metropolitan area (SMA) when evaluated with ground-based observations. For example, errors and biases of PM10 predictions decreased by approximately 60% and 70%, respectively, during the first 6 h of short-term PM forecasting, compared with those without the initial PM composition. In addition, the influences of several factors on the performance of the short-term PM forecast were explored. The influences of the choices of control variables on the PM chemical composition were also investigated with composition data measured via PILS-IC (particle-into-liquid sampler coupled with ion chromatography) and low air-volume sampling instruments at a site near Seoul. To improve the overall performance of the short-term PM forecast system, several future research directions are also discussed and suggested.
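
    A minimal Python sketch of ordinary kriging with an exponential semivariogram, to illustrate the kind of gap-filling a kriging step performs on cloud-masked AOD fields; the variogram parameters and station data are invented, and the operational STK method kriges jointly in space and time rather than in space alone:

      # Ordinary kriging: solve the semivariogram system (with a Lagrange
      # multiplier) for interpolation weights and fill a missing AOD value.
      import numpy as np

      def exp_variogram(h, sill=0.04, range_km=150.0, nugget=0.005):
          """Illustrative exponential semivariogram for AOD."""
          return nugget + sill * (1.0 - np.exp(-h / range_km))

      def ordinary_krige(xy_obs, z_obs, xy_target):
          """Estimate the value at xy_target from scattered observations."""
          n = len(z_obs)
          d_obs = np.linalg.norm(xy_obs[:, None, :] - xy_obs[None, :, :], axis=-1)
          d_tgt = np.linalg.norm(xy_obs - xy_target, axis=-1)
          # Kriging system with a Lagrange-multiplier row/column.
          A = np.ones((n + 1, n + 1))
          A[:n, :n] = exp_variogram(d_obs)
          np.fill_diagonal(A[:n, :n], 0.0)   # gamma(0) = 0
          A[n, n] = 0.0
          b = np.append(exp_variogram(d_tgt), 1.0)
          weights = np.linalg.solve(A, b)[:n]
          return float(weights @ z_obs)

      xy = np.array([[0.0, 0.0], [50.0, 10.0], [20.0, 80.0]])   # station coordinates, km
      aod = np.array([0.35, 0.28, 0.41])                        # observed AOD
      print(ordinary_krige(xy, aod, np.array([30.0, 30.0])))    # gap-filled AOD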

  6. Native State Mass Spectrometry, Surface Plasmon Resonance, and X-ray Crystallography Correlate Strongly as a Fragment Screening Combination.

    PubMed

    Woods, Lucy A; Dolezal, Olan; Ren, Bin; Ryan, John H; Peat, Thomas S; Poulsen, Sally-Ann

    2016-03-10

    Fragment-based drug discovery (FBDD) is contingent on the development of analytical methods to identify weak protein-fragment noncovalent interactions. Herein we have combined an underutilized fragment screening method, native state mass spectrometry, with two proven and popular fragment screening methods, surface plasmon resonance and X-ray crystallography, in a fragment screening campaign against human carbonic anhydrase II (CA II). In an initial fragment screen against a 720-member fragment library (the "CSIRO Fragment Library"), seven CA II binding fragments, including a selection of nonclassical CA II binding chemotypes, were identified. A further 70 compounds that comprised the initial hit chemotypes were subsequently sourced from the full CSIRO compound collection and screened. The fragment results were extremely well correlated across the three methods. Our findings demonstrate that there is a tremendous opportunity to apply native state mass spectrometry as a complementary fragment screening method to accelerate drug discovery.

  7. A Methodology for Anatomic Ultrasound Image Diagnostic Quality Assessment.

    PubMed

    Hemmsen, Martin Christian; Lange, Theis; Brandt, Andreas Hjelm; Nielsen, Michael Bachmann; Jensen, Jorgen Arendt

    2017-01-01

    This paper discusses methods for the assessment of ultrasound image quality based on our experiences with evaluating new methods for anatomic imaging. It presents a methodology to ensure a fair assessment between competing imaging methods using clinically relevant evaluations. The methodology is valuable in the continuing process of method optimization and guided development of new imaging methods. It includes a three-phase study plan covering initial prototype development through clinical assessment. Recommendations for the clinical assessment protocol, software, and statistical analysis are presented. Earlier uses of the methodology have shown that it ensures validity of the assessment, as it separates the influences of developer, investigator, and assessor once a research protocol has been established. This separation reduces the developer's confounding influence on the result, so that the clinical value is properly revealed. This paper exemplifies the methodology using recent studies of synthetic aperture sequential beamforming tissue harmonic imaging.

  8. Development of a Hybrid RANS/LES Method for Compressible Mixing Layer Simulations

    NASA Technical Reports Server (NTRS)

    Georgiadis, Nicholas J.; Alexander, J. Iwan D.; Reshotko, Eli

    2001-01-01

    A hybrid method has been developed for simulations of compressible turbulent mixing layers. Such mixing layers dominate the flows in exhaust systems of modern-day aircraft and also those of hypersonic vehicles currently under development. The hybrid method uses a Reynolds-averaged Navier-Stokes (RANS) procedure to calculate wall bounded regions entering a mixing section, and a Large Eddy Simulation (LES) procedure to calculate the mixing dominated regions. A numerical technique was developed to enable the use of the hybrid RANS/LES method on stretched, non-Cartesian grids. The hybrid RANS/LES method is applied to a benchmark compressible mixing layer experiment. Preliminary two-dimensional calculations are used to investigate the effects of axial grid density and boundary conditions. Actual LES calculations, performed in three spatial directions, indicated an initial vortex shedding followed by rapid transition to turbulence, which is in agreement with experimental observations.

  9. Model Robust Calibration: Method and Application to Electronically-Scanned Pressure Transducers

    NASA Technical Reports Server (NTRS)

    Walker, Eric L.; Starnes, B. Alden; Birch, Jeffery B.; Mays, James E.

    2010-01-01

    This article presents the application of a recently developed statistical regression method to the controlled instrument calibration problem. The statistical method of Model Robust Regression (MRR), developed by Mays, Birch, and Starnes, is shown to improve instrument calibration by reducing the reliance of the calibration on a predetermined parametric (e.g. polynomial, exponential, logarithmic) model. This is accomplished by allowing fits from the predetermined parametric model to be augmented by a certain portion of a fit to the residuals from the initial regression using a nonparametric (locally parametric) regression technique. The method is demonstrated for the absolute scale calibration of silicon-based pressure transducers.
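
    A minimal Python sketch of the model robust regression idea described above: a parametric fit augmented by a fraction of a nonparametric fit to its residuals. The straight-line parametric model, the LOWESS smoother standing in for the locally parametric technique, and the fixed mixing fraction are illustrative choices, not the MRR calibration procedure itself:

      # Parametric fit + lambda * (nonparametric fit to the residuals).
      import numpy as np
      from statsmodels.nonparametric.smoothers_lowess import lowess

      rng = np.random.default_rng(1)
      x = np.linspace(0.0, 1.0, 80)
      y = 2.0 + 3.0 * x + 0.3 * np.sin(6.0 * x) + rng.normal(scale=0.05, size=x.size)

      # Step 1: parametric fit (here, an ordinary least-squares line).
      coef = np.polyfit(x, y, deg=1)
      y_param = np.polyval(coef, x)

      # Step 2: nonparametric fit to the residuals of the parametric model.
      resid = y - y_param
      resid_fit = lowess(resid, x, frac=0.3, return_sorted=False)

      # Step 3: blend; lam=0 is purely parametric, lam=1 adds the full residual fit.
      lam = 0.7
      y_mrr = y_param + lam * resid_fit
      print(float(np.mean((y - y_mrr) ** 2)))   # in-sample mean squared error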

  10. Anesthesiology training using 3D imaging and virtual reality

    NASA Astrophysics Data System (ADS)

    Blezek, Daniel J.; Robb, Richard A.; Camp, Jon J.; Nauss, Lee A.

    1996-04-01

    Current training for regional nerve block procedures by anesthesiology residents requires expert supervision and the use of cadavers; both of which are relatively expensive commodities in today's cost-conscious medical environment. We are developing methods to augment and eventually replace these training procedures with real-time and realistic computer visualizations and manipulations of the anatomical structures involved in anesthesiology procedures, such as nerve plexus injections (e.g., celiac blocks). The initial work is focused on visualizations: both static images and rotational renderings. From the initial results, a coherent paradigm for virtual patient and scene representation will be developed.

  11. Program/project management resource lists

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Program/Project Management Collection at NASA Headquarters Library is part of a larger initiative by the Training and Development Division, Code FT, NASA Headquarters. The collection is being developed to support the Program/Project Management Initiative which includes the training of NASA managers. These PPM Resource Lists have proven to be a useful method of informing NASA employees nationwide about the subject coverage of the library collection. All resources included on the lists are available at or through NASA Headquarters Library. NASA employees at other Centers may request listed books through interlibrary loan, and listed articles by contacting me by phone, mail, or e-mail.

  12. Development of an Educational Video to Improve Patient Knowledge and Communication with Their Healthcare Providers about Colorectal Cancer Screening

    ERIC Educational Resources Information Center

    Katz, Mira L.; Heaner, Sarah; Reiter, Paul; van Putten, Julie; Murray, Lee; McDougle, Leon; Cegala, Donald J.; Post, Douglas; David, Prabu; Slater, Michael; Paskett, Electra D.

    2009-01-01

    Background: Low rates of colorectal cancer (CRC) screening persist due to individual, provider, and system level barriers. Purpose: To develop and obtain initial feedback about a CRC screening educational video from community members and medical professionals. Methods: Focus groups of patients were conducted prior to the development of the CRC…

  13. Laboratory and 3-D distinct element analysis of the failure mechanism of a slope under external surcharge

    NASA Astrophysics Data System (ADS)

    Li, N.; Cheng, Y. M.

    2015-01-01

    Landslide is a major disaster resulting in considerable loss of human lives and property damage in hilly terrain in Hong Kong, China and many other countries. The factor of safety and the critical slip surface for slope stabilization have been the main considerations in slope stability analysis in the past, while the post-failure conditions of slopes have not been considered in sufficient detail. There is, however, increasing interest in the consequences after failure has initiated, which include the development and propagation of the failure surfaces, the amount of failed mass and runoff, and the affected region. To assess the development of slope failure in more detail and to consider the potential danger of slopes after failure has initiated, the slope stability problem under external surcharge is analyzed by the distinct element method (DEM) and a laboratory model test in the present research. A more refined study of the development of failure, the microscopic failure mechanisms and the post-failure mechanisms of slopes is carried out. The numerical modeling method and the various findings from the present work provide an alternative method of analysis of slope failure, which can give additional information not available from the classical methods of analysis.

  14. Laboratory and 3-D-distinct element analysis of failure mechanism of slope under external surcharge

    NASA Astrophysics Data System (ADS)

    Li, N.; Cheng, Y. M.

    2014-09-01

    Landslide is a major disaster resulting in considerable loss of human lives and property damage in hilly terrain in Hong Kong, China and many other countries. The factor of safety and the critical slip surface for slope stabilization have been the main considerations in slope stability analysis in the past, while the post-failure conditions of slopes have not been considered in sufficient detail. There is, however, increasing interest in the consequences after failure has initiated, which include the development and propagation of the failure surfaces, the amount of failed mass and runoff, and the affected region. To assess the development of slope failure in more detail and to consider the potential danger of slopes after failure has initiated, the slope stability problem under external surcharge is analyzed by the distinct element method (DEM) and a laboratory model test in the present research. A more refined study of the development of failure, the microscopic failure mechanism and the post-failure mechanism of slopes is carried out. The numerical modeling method and the various findings from the present work provide an alternative method of analysis of slope failure, which can give additional information not available from the classical methods of analysis.

  15. A method to stabilize linear systems using eigenvalue gradient information

    NASA Technical Reports Server (NTRS)

    Wieseman, C. D.

    1985-01-01

    Formal optimization methods and eigenvalue gradient information are used to develop a stabilizing control law for a closed-loop linear system that is initially unstable. The method was originally formulated using direct, constrained optimization methods with the constraints being the real parts of the eigenvalues. However, because of problems in achieving stabilizing control laws, the problem was reformulated to be solved differently. The method described uses the Davidon-Fletcher-Powell minimization technique to solve an indirect, constrained minimization problem in which the performance index is the Kreisselmeier-Steinhauser function of the real parts of all the eigenvalues. The method is applied successfully to two different problems: the determination of a fourth-order control law that stabilizes a single-input single-output active flutter suppression system, and the determination of a second-order control law for a multi-input multi-output lateral-directional flight control system. Various sets of design variables and initial starting points were chosen to show the robustness of the method.
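
    A minimal Python sketch of aggregating the eigenvalue real parts with the Kreisselmeier-Steinhauser (KS) function and minimizing it over a feedback gain. The 2x2 plant, the full-state feedback structure, the small gain penalty that keeps the problem bounded, and the BFGS optimizer (standing in for Davidon-Fletcher-Powell, which SciPy does not provide) are all illustrative assumptions:

      # Minimize the KS envelope of closed-loop eigenvalue real parts.
      import numpy as np
      from scipy.optimize import minimize

      A = np.array([[0.5, 1.0],
                    [0.0, 0.2]])   # open-loop eigenvalues 0.5 and 0.2 (unstable)
      B = np.array([[0.0],
                    [1.0]])

      def ks(values, rho=50.0):
          """KS envelope: a smooth, conservative approximation of max(values)."""
          m = np.max(values)
          return m + np.log(np.sum(np.exp(rho * (values - m)))) / rho

      def objective(k):
          K = k.reshape(1, 2)
          eig_real = np.linalg.eigvals(A - B @ K).real
          # Small gain penalty keeps the problem bounded (illustrative choice).
          return ks(eig_real) + 1e-3 * float(k @ k)

      result = minimize(objective, x0=np.zeros(2), method="BFGS")
      K_opt = result.x.reshape(1, 2)
      print(np.linalg.eigvals(A - B @ K_opt).real)   # real parts pushed negative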

  16. Reduction of movement resistance force of pipeline in horizontal curved well at stage of designing underground passage

    NASA Astrophysics Data System (ADS)

    Toropov, V. S.; Toropov, S. Yu

    2018-05-01

    A method has been developed to reduce the resistance to movement of a pipeline in a horizontal curved well during the construction of underground passages using trenchless technologies. The method can be applied at the design stage. The idea of the proposed method is to approximate the trajectory of the designed trenchless passage to the equilibrium profile. It has been proved that, in order to reduce the resistance to movement of the pipeline arising from contact with the borehole wall, the profile of its initial and final sections must correspond, depending on the initial conditions, to a parabola or a hyperbolic cosine (catenary) equation. Analytical dependences are obtained that supplement the methods for calculating traction effort in trenchless construction for the case when the well profile is given by an arbitrary function.
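
    A minimal Python sketch of the two profile shapes named above for the entry and exit sections of the well, a parabola and a hyperbolic cosine (catenary); the parameter values are illustrative, whereas the paper derives them from the initial conditions of the pull-in operation:

      # Depth below the entry point for the two candidate profile shapes.
      import numpy as np

      def parabola_profile(x, depth_coeff=0.002):
          """Parabolic approximation of the equilibrium profile."""
          return depth_coeff * x ** 2

      def catenary_profile(x, a=250.0):
          """Hyperbolic cosine (catenary) profile with parameter a."""
          return a * (np.cosh(x / a) - 1.0)

      x = np.linspace(0.0, 100.0, 5)   # metres along the section
      print(parabola_profile(x))
      print(catenary_profile(x))

    Near the entry point the two curves nearly coincide, since cosh(x/a) - 1 is approximately x²/(2a) for small x/a.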

  17. Automated startup of the MIT research reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kwok, K.S.

    1992-01-01

    This summary describes the development, implementation, and testing of a generic method for performing automated startups of nuclear reactors described by space-independent kinetics under conditions of closed-loop digital control. The technique entails first obtaining a reliable estimate of the reactor's initial degree of subcriticality and then substituting that estimate into a model-based control law so as to permit a power increase from subcritical on a demanded trajectory. The estimation of subcriticality is accomplished by application of the perturbed reactivity method. The shutdown reactor is perturbed by the insertion of reactivity at a known rate. Observation of the resulting period permits determination of the initial degree of subcriticality. A major advantage to this method is that repeated estimates are obtained of the same quantity. Hence, statistical methods can be applied to improve the quality of the calculation.

  18. Mid-career faculty development in academic medicine: How does it impact faculty and institutional vitality?

    PubMed Central

    Campion, MaryAnn W.; Bhasin, Robina M.; Beaudette, Donald J.; Shann, Mary H.; Benjamin, Emelia J.

    2016-01-01

    Purpose: Faculty vitality is integral to the advancement of higher education. Strengthening vitality is particularly important for mid-career faculty, who represent the largest and most dissatisfied segment. The demands of academic medicine appear to be another factor that may put faculty at risk of attrition. To address these issues, we initiated a ten-month mid-career faculty development program. Methods: A mixed-methods quasi-experimental design was used to evaluate the program's impact on faculty and institutional vitality. Pre/post surveys compared participants with a matched reference group. Quantitative data were augmented by interviews and focus groups with multiple stakeholders. Results: At the program's conclusion, participants showed statistically significant gains in knowledge, skills, attitudes, and connectivity when compared to the referents. Conclusion: Given that mid-career faculty development in academic medicine has not been extensively studied, our evaluation provides a useful perspective to guide future initiatives aimed at enhancing the vitality and leadership capacity of mid-career faculty. PMID:27942418

  19. Adaptive Wiener filter super-resolution of color filter array images.

    PubMed

    Karch, Barry K; Hardie, Russell C

    2013-08-12

    Digital color cameras using a single detector array with a Bayer color filter array (CFA) require interpolation or demosaicing to estimate missing color information and provide full-color images. However, demosaicing does not specifically address fundamental undersampling and aliasing inherent in typical camera designs. Fast super-resolution (SR) based on non-uniform interpolation is an attractive approach to reduce or eliminate aliasing, and its relatively low computational load is amenable to real-time applications. The adaptive Wiener filter (AWF) SR algorithm was initially developed for grayscale imaging and has not previously been applied to color SR demosaicing. Here, we develop a novel fast SR method for CFA cameras that is based on the AWF SR algorithm and uses global channel-to-channel statistical models. We apply this new method as a stand-alone algorithm and also as an initialization image for a variational SR algorithm. This paper presents the theoretical development of the color AWF SR approach and applies it in performance comparisons to other SR techniques for both simulated and real data.
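
    A minimal Python sketch of the Wiener-filter core, weights w = R⁻¹p applied to nearby, non-uniformly placed samples; the assumed autocorrelation model, sample geometry, and noise term are illustrative, whereas the published AWF SR algorithm builds R and p from channel-to-channel statistical models over a moving window:

      # Estimate one pixel from non-uniform samples with Wiener weights.
      import numpy as np

      def gaussian_corr(d, rho=1.5):
          """Assumed spatial autocorrelation as a function of distance d (pixels)."""
          return np.exp(-(d / rho) ** 2)

      # Positions of a few observed samples and of the pixel to estimate.
      samples = np.array([[0.0, 0.0], [1.0, 0.2], [0.3, 1.1], [1.2, 1.4]])
      target = np.array([0.6, 0.6])
      values = np.array([0.30, 0.42, 0.38, 0.45])   # observed intensities

      d_ss = np.linalg.norm(samples[:, None, :] - samples[None, :, :], axis=-1)
      d_st = np.linalg.norm(samples - target, axis=-1)

      R = gaussian_corr(d_ss) + 1e-3 * np.eye(len(samples))   # sample autocorrelation (+ noise)
      p = gaussian_corr(d_st)                                 # cross-correlation with target
      w = np.linalg.solve(R, p)                               # Wiener weights

      print(float(w @ values))   # estimated intensity at the target pixel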

  20. Long-term seafloor monitoring at an open ocean aquaculture site in the western Gulf of Maine, USA: development of an adaptive protocol.

    PubMed

    Grizzle, R E; Ward, L G; Fredriksson, D W; Irish, J D; Langan, R; Heinig, C S; Greene, J K; Abeels, H A; Peter, C R; Eberhardt, A L

    2014-11-15

    The seafloor at an open ocean finfish aquaculture facility in the western Gulf of Maine, USA was monitored from 1999 to 2008 by sampling sites inside a predicted impact area modeled by oceanographic conditions and fecal and food settling characteristics, and nearby reference sites. Univariate and multivariate analyses of benthic community measures from box core samples indicated minimal or no significant differences between impact and reference areas. These findings resulted in development of an adaptive monitoring protocol involving initial low-cost methods that required more intensive and costly efforts only when negative impacts were initially indicated. The continued growth of marine aquaculture is dependent on further development of farming methods that minimize negative environmental impacts, as well as effective monitoring protocols. Adaptive monitoring protocols, such as the one described herein, coupled with mathematical modeling approaches, have the potential to provide effective protection of the environment while minimizing monitoring effort and costs. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Using Population Dose to Evaluate Community-level Health Initiatives.

    PubMed

    Harner, Lisa T; Kuo, Elena S; Cheadle, Allen; Rauzon, Suzanne; Schwartz, Pamela M; Parnell, Barbara; Kelly, Cheryl; Solomon, Loel

    2018-05-01

    Successful community-level health initiatives require implementing an effective portfolio of strategies and understanding their impact on population health. These factors are complicated by the heterogeneity of overlapping multicomponent strategies and availability of population-level data that align with the initiatives. To address these complexities, the population dose methodology was developed for planning and evaluating multicomponent community initiatives. Building on the population dose methodology previously developed, this paper operationalizes dose estimates of one initiative targeting youth physical activity as part of the Kaiser Permanente Community Health Initiative, a multicomponent community-level obesity prevention initiative. The technical details needed to operationalize the population dose method are explained, and the use of population dose as an interim proxy for population-level survey data is introduced. The alignment of the estimated impact from strategy-level data analysis using the dose methodology and the data from the population-level survey suggest that dose is useful for conducting real-time evaluation of multiple heterogeneous strategies, and as a viable proxy for existing population-level surveys when robust strategy-level evaluation data are collected. This article is part of a supplement entitled Building Thriving Communities Through Comprehensive Community Health Initiatives, which is sponsored by Kaiser Permanente, Community Health. Copyright © 2018 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
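
    A toy Python sketch of how population dose is commonly operationalized, as reach (proportion of the target population exposed) times strength (average effect per exposed person), summed over strategies; the strategy names and all numbers below are hypothetical:

      # Population dose = reach x strength, summed over strategies (toy numbers).
      strategies = [
          # (people reached, target population, per-person effect, e.g. extra
          #  minutes of physical activity per week)
          ("after-school activity program", 1200, 20000, 30.0),
          ("park infrastructure upgrade",   8000, 20000, 5.0),
      ]

      total_dose = 0.0
      for name, reached, population, strength in strategies:
          reach = reached / population
          dose = reach * strength
          total_dose += dose
          print(f"{name}: reach={reach:.2f}, strength={strength}, dose={dose:.2f}")

      print(f"combined population dose: {total_dose:.2f}")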

  2. The Global Evidence Mapping Initiative: Scoping research in broad topic areas

    PubMed Central

    2011-01-01

    Background Evidence mapping describes the quantity, design and characteristics of research in broad topic areas, in contrast to systematic reviews, which usually address narrowly-focused research questions. The breadth of evidence mapping helps to identify evidence gaps, and may guide future research efforts. The Global Evidence Mapping (GEM) Initiative was established in 2007 to create evidence maps providing an overview of existing research in Traumatic Brain Injury (TBI) and Spinal Cord Injury (SCI). Methods The GEM evidence mapping method involved three core tasks: 1. Setting the boundaries and context of the map: Definitions for the fields of TBI and SCI were clarified, the prehospital, acute inhospital and rehabilitation phases of care were delineated and relevant stakeholders (patients, carers, clinicians, researchers and policymakers) who could contribute to the mapping were identified. Researchable clinical questions were developed through consultation with key stakeholders and a broad literature search. 2. Searching for and selection of relevant studies: Evidence search and selection involved development of specific search strategies, development of inclusion and exclusion criteria, searching of relevant databases and independent screening and selection by two researchers. 3. Reporting on yield and study characteristics: Data extraction was performed at two levels - 'interventions and study design' and 'detailed study characteristics'. The evidence map and commentary reflected the depth of data extraction. Results One hundred and twenty-nine researchable clinical questions in TBI and SCI were identified. These questions were then prioritised into high (n = 60) and low (n = 69) importance by the stakeholders involved in question development. Since 2007, 58 263 abstracts have been screened, 3 731 full text articles have been reviewed and 1 644 relevant neurotrauma publications have been mapped, covering fifty-three high priority questions. Conclusions GEM Initiative evidence maps have a broad range of potential end-users including funding agencies, researchers and clinicians. Evidence mapping is at least as resource-intensive as systematic reviewing. The GEM Initiative has made advancements in evidence mapping, most notably in the area of question development and prioritisation. Evidence mapping complements other review methods for describing existing research, informing future research efforts, and addressing evidence gaps. PMID:21682870

  3. Numerical investigation of shape domain effect to its elasticity and surface energy using adaptive finite element method

    NASA Astrophysics Data System (ADS)

    Alfat, Sayahdin; Kimura, Masato; Firihu, Muhammad Zamrun; Rahmat

    2018-05-01

    In engineering area, investigation of shape effect in elastic materials was very important. It can lead changing elasticity and surface energy, and also increase of crack propagation in the material. A two-dimensional mathematical model was developed to investigation of elasticity and surface energy in elastic material by Adaptive Finite Element Method. Besides that, behavior of crack propagation has observed for every those materials. The government equations were based on a phase field approach in crack propagation model that developed by Takaishi-Kimura. This research has varied four shape domains where physical properties of materials were same (Young's modulus E = 70 GPa and Poisson's ratio ν = 0.334). Investigation assumptions were; (1) homogeneous and isotropic material, (2) there was not initial cracking at t = 0, (3) initial displacement was zero [u1, u2] = 0) at initial condition (t = 0), and (4) length of time simulation t = 5 with interval Δt = 0.005. Mode I/II or mixed mode crack propagation has been used for the numerical investigation. Results of this studies were very good and accurate to show changing energy and behavior of crack propagation. In the future time, this research can be developed to complex phenomena and domain. Furthermore, shape optimization can be investigation by the model.

  4. Detection of chemical pollutants by passive LWIR hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Lavoie, Hugo; Thériault, Jean-Marc; Bouffard, François; Puckrin, Eldon; Dubé, Denis

    2012-09-01

    Toxic industrial chemicals (TICs) represent a major threat to public health and security. Their detection constitutes a real challenge to the security and first responder communities. One promising detection method is based on the passive standoff identification of chemical vapors emanating from the laboratory under surveillance. To investigate this method, the Department of National Defence and Public Safety Canada have mandated Defence Research and Development Canada (DRDC) - Valcartier to develop and test passive Long Wave Infrared (LWIR) hyperspectral imaging (HSI) sensors for standoff detection. The initial effort focused on the standoff detection and identification of toxic industrial chemicals (TICs) and precursors. Sensors such as the Multi-option Differential Detection and Imaging Fourier Spectrometer (MoDDIFS) and the Improved Compact ATmospheric Sounding Interferometer (iCATSI) were developed for this application. This paper describes the sensor developments and presents initial results of standoff detection and identification of TICs and precursors. The standoff sensors are based on differential Fourier-transform infrared (FTIR) radiometric technology and are able to detect, spectrally resolve and identify small leak plumes at ranges in excess of 1 km. Results from a series of trials in asymmetric threat type scenarios will be presented. These results will serve to establish the potential of the method for standoff detection of TIC precursors and surrogates.

  5. Development of Anxiety Disorders in a Traumatized Pediatric Population: A Preliminary Longitudinal Evaluation

    ERIC Educational Resources Information Center

    Cortes, Adriana M.; Saltzman, Kassey M.; Weems, Carl F.; Regnault, Heather P.; Reiss, Allan L.; Carrion, Victor G.

    2005-01-01

    Objective: The current study was conducted to determine if post-traumatic stress disorder (PTSD) symptomatology predicted later development of non-PTSD anxiety disorders in children and adolescents victimized by interpersonal trauma. Methods: Thirty-four children with a history of interpersonal trauma and no initial diagnosis of anxiety disorder…

  6. Reflection--A Method for Organisational and Individual Development

    ERIC Educational Resources Information Center

    Randle, Hanne; Tilander, Kristian

    2007-01-01

    This paper presents how organisational development can be the results when politicians, managers, social workers and teaching staff take part in reflection. The results are based on a government-funded initiative in Sweden for lowering sick absenteeism. Three local governments introduced reflection as a strategy to combat work related stress and a…

  7. A STUDY OF THE INFLUENCE OF OUTDOOR ENVIRONMENT INCLUDING COMBUSTION RELATED PRODUCTS ON THE INDOOR ENVIRONMENT (ASTHMA INITIATIVE)

    EPA Science Inventory

    This project is a component of a multi-disciplinary collaboration between NHEERL, NERL, and NRMRL to develop and evaluate methods to examine the role of environmental factors on the induction and exacerbation of asthma within the Long-term Child Development Study (LCDS). This pr...

  8. 75 FR 4453 - Health Services Research and Development Service Merit Review Board; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-27

    ... testing of new methods of health care delivery and management, and nursing research. Applications are... Research and Development Officer. On March 2, the subcommittee on Nursing Research Initiative will convene... and consistency of the review process. During the closed portion of each meeting, discussion and...

  9. Effective Evaluation of Training and Development in Higher Education.

    ERIC Educational Resources Information Center

    Thackwray, Bob

    This book examines approaches, techniques, and instruments that relate to the evaluation of training and development in the context of higher education in the United Kingdom (UK), focusing on the importance of identifying the purpose of the evaluation as an initial step. Some financial evaluation methods are also considered. It is suggested that…

  10. Providing Authentic Leadership Opportunities through Collaboratively Developed Internships: A University-School District Partnership Initiative

    ERIC Educational Resources Information Center

    Havard, Timothy S.; Morgan, Joyce; Patrick, Lynne

    2010-01-01

    Programs designed to develop future educational leaders must include practical learning experiences that connect the theoretical content of university coursework with the realities of the K-12 workplace. Internships, which offer a common method of providing these experiences, have been generally lacking in the degree to which aspiring leaders…

  11. Visual detection of driving while intoxicated. Project interim report : identification of visual cues and development of detection methods

    DOT National Transportation Integrated Search

    1979-01-01

    The report describes the initial phase of a two-phase project on the visual, on-the-road detection of driving while intoxicated (DWI). The purpose of the overall project is to develop and test procedures for enhancing on-the-road detection of DWI. Th...

  12. Projecting insect voltinism under high and low greenhouse gas emission conditions

    Treesearch

    Shi Chen; Shelby J. Fleischer; Patrick C. Tobin; Michael C. Saunders

    2011-01-01

    We develop individual-based Monte Carlo methods to explore how climate change can alter insect voltinism under varying greenhouse gas emissions scenarios by using input distributions of diapause termination or spring emergence, development rate, and diapause initiation, linked to daily temperature and photoperiod. We show concurrence of these projections with a field...
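
    An illustrative individual-based Monte Carlo sketch in Python of the voltinism idea described above: sample emergence and a per-generation degree-day requirement, accumulate daily degree-days, and stop at a photoperiod-linked diapause cue. All parameter values and the weather series are hypothetical and not taken from the study:

      # Individual-based Monte Carlo estimate of mean generations per year.
      import numpy as np

      rng = np.random.default_rng(3)
      days = np.arange(365)
      daily_temp = (12.0 + 12.0 * np.sin(2 * np.pi * (days - 100) / 365)
                    + rng.normal(scale=2.0, size=days.size))   # deg C
      base_temp = 10.0
      diapause_day = 260                                        # photoperiod cue

      def simulate_individual():
          emergence = int(rng.normal(120, 10))        # sampled spring emergence day
          dd_per_generation = rng.normal(350, 30)     # sampled development requirement
          generations, accumulated = 0, 0.0
          for day in range(emergence, diapause_day):
              accumulated += max(daily_temp[day] - base_temp, 0.0)
              if accumulated >= dd_per_generation:
                  generations += 1
                  accumulated = 0.0
          return generations

      voltinism = [simulate_individual() for _ in range(1000)]
      print(np.mean(voltinism))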

  13. A Method for Evaluating and Standardizing Ontologies

    ERIC Educational Resources Information Center

    Seyed, Ali Patrice

    2012-01-01

    The Open Biomedical Ontology (OBO) Foundry initiative is a collaborative effort for developing interoperable, science-based ontologies. The Basic Formal Ontology (BFO) serves as the upper ontology for the domain-level ontologies of OBO. BFO is an upper ontology of types as conceived by defenders of realism. Among the ontologies developed for OBO…

  14. NON-NEOPLASTIC LESIONS: USE OF DATA FROM PRE- OR NON-NEOPLASTIC LESIONS THAT MAY INDICATE POTENTIAL FOR CARCINOGENESIS

    EPA Science Inventory

    The Toxicology and Microbiology Division of the US EPA, Health Effects Research Laboratory has initiated a research program to develop a matrix of short-term tests to distinguish carcinogens from non-carcinogens among genotoxic substances and to develop methods for predicting rel...

  15. Why Problem-Based Learning Works: Theoretical Foundations

    ERIC Educational Resources Information Center

    Marra, Rose M.; Jonassen, David H.; Palmer, Betsy; Luft, Steve

    2014-01-01

    Problem-based learning (PBL) is an instructional method where student learning occurs in the context of solving an authentic problem. PBL was initially developed out of an instructional need to help medical school students learn their basic sciences knowledge in a way that would be more lasting while helping to develop clinical skills…

  16. Development and Evaluation of Pretraining as an Adjunct to a Pilot Training Study.

    ERIC Educational Resources Information Center

    McFadden, Robert W.; And Others

    The utility of the pretraining of task-relevant cognitive skills within the context of experimental research methodology was investigated in this study. A criterion-referenced pretraining multimedia product was developed and applied to support the initial phase of an experimental research effort in which several instructional methods for training…

  17. Circles of Care: Development and Initial Evaluation of a Peer Support Model for African Americans with Advanced Cancer

    ERIC Educational Resources Information Center

    Hanson, Laura C.; Armstrong, Tonya D.; Green, Melissa A.; Hayes, Michelle; Peacock, Stacie; Elliot-Bynum, Sharon; Goldmon, Moses V.; Corbie-Smith, Giselle; Earp, Jo Anne

    2013-01-01

    Peer support interventions extend care and health information to underserved populations yet rarely address serious illness. Investigators from a well-defined academic-community partnership developed and evaluated a peer support intervention for African Americans facing advanced cancer. Evaluation methods used the Reach, Efficacy, Adoption,…

  18. Development and Initial Validation of the Symptoms and Assets Screening Scale

    ERIC Educational Resources Information Center

    Downs, Andrew; Boucher, Laura A.; Campbell, Duncan G.; Dasse, Michelle

    2013-01-01

    Objective: To develop and test a screening measure of mental health symptoms and well-being in college students, the Symptoms and Assets Screening Scale (SASS). Participants: Participants were 758 college students at 2 universities in the Northwest sampled between October 2009 and April 2011. Methods: Participants completed the SASS, as well as…

  19. Initiating a Developmental Motor Skills Program for Identified Primary Students.

    ERIC Educational Resources Information Center

    Harville, Valerie Terrill

    A physical education specialist at an elementary school in one of the fastest growing sections of the country developed and implemented a developmental motor skills program for primary school students. The program focused on: (1) developing a method of referring students for testing; (2) providing a specialized motor diagnostic test; (3) improving…

  20. Mother-Infant Activities: The Initial Step in Language Development in the Deaf-Blind Child.

    ERIC Educational Resources Information Center

    Vitagliano, James; Purdy, Susan

    1987-01-01

    The exploratory study examined the effectiveness of the Van Dijk method of developing language skills with four deaf-blind infants and their mothers over a two-month period. Findings indicated increased expressive/elocutionary communicative output with concomitant reduction in self-stimulatory, abusive, and tantrum-like behaviors. (DB)

  1. Sifting, sorting and saturating data in a grounded theory study of information use by practice nurses: a worked example.

    PubMed

    Hoare, Karen J; Mills, Jane; Francis, Karen

    2012-12-01

    The terminology used to analyse data in a grounded theory study can be confusing. Different grounded theorists use a variety of terms which all have similar meanings. In the following study, we use terms adopted by Charmaz, including initial, focused and axial coding. Initial codes are used to analyse data with an emphasis on identifying gerunds (verbs acting as nouns). If initial codes are relevant to the developing theory, they are grouped with similar codes into categories. Categories become saturated when no new codes are identified in the data. Axial codes are used to link categories together into a grounded theory process. Memo writing accompanies this data sifting and sorting. The following article explains how one initial code became a category, providing a worked example of the grounded theory method of constant comparative analysis. The interplay between coding and categorization is facilitated by the constant comparative method. © 2012 Wiley Publishing Asia Pty Ltd.

  2. Automatic tracking of wake vortices using ground-wind sensor data

    DOT National Transportation Integrated Search

    1977-01-03

Algorithms for automatic tracking of wake vortices using ground-wind anemometer data are developed. Methods of bad-data suppression, track initiation, and track termination are included. An effective sensor-failure detection-and-identification ...

  3. 48 CFR 15.602 - Policy.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., Small Business Innovation Research topics, Small Business Technology Transfer Research topics, Program Research and Development Announcements, or any other Government-initiated solicitation or program. When the....602 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACTING METHODS AND...

  4. EVOLUTION OF ENVIRONMENTAL IMMUNOCHEMISTRY

    EPA Science Inventory

    Enzyme-linked immunosorbent assays (ELISAs), initially developed for clinical applications, have made a tremendous impact as clinical diagnostic indicators. Pesticide chemists became attracted to the potential of these sensitive and selective methods in the 1970s. Thus, beg...

  5. Risk Stratification Methods and Provision of Care Management Services in Comprehensive Primary Care Initiative Practices.

    PubMed

    Reddy, Ashok; Sessums, Laura; Gupta, Reshma; Jin, Janel; Day, Tim; Finke, Bruce; Bitton, Asaf

    2017-09-01

Risk-stratified care management is essential to improving population health in primary care settings, but evidence is limited on the type of risk stratification method and its association with care management services. We describe risk stratification patterns and their association with care management services for primary care practices in the Comprehensive Primary Care (CPC) initiative. We undertook a qualitative approach to categorize risk stratification methods being used by CPC practices and tested whether these stratification methods were associated with delivery of care management services. CPC practices reported using 4 primary methods to stratify risk for their patient populations: a practice-developed algorithm (n = 215), the American Academy of Family Physicians' clinical algorithm (n = 155), payer claims and electronic health records (n = 62), and clinical intuition (n = 52). CPC practices using a practice-developed algorithm identified the largest number of high-risk patients per primary care physician (282 patients, P = .006). CPC practices using clinical intuition had the most high-risk patients in care management and a greater proportion of high-risk patients receiving care management per primary care physician (91 patients and 48%, P = .036 and P = .128, respectively). CPC practices used 4 primary methods to identify high-risk patients. Although practices that developed their own algorithm identified the greatest number of high-risk patients, practices that used clinical intuition connected the greatest proportion of patients to care management services. © 2017 Annals of Family Medicine, Inc.

  6. FOCUS: A Model of Sensemaking

    DTIC Science & Technology

    2007-05-01

    of the current project was to unpack and develop the concept of sensemaking, principally by developing and testing a cognitive model of the processes...themselves. In Year 2, new Cognitive Task Analysis data collection methods were developed and used to further test the model. Cognitive Task Analysis is a...2004) to examine the phenomenon of "sensemaking," a concept initially formulated by Weick (1995), but not developed from a cognitive perspective

  7. GVE-Based Dynamics and Control for Formation Flying Spacecraft

    NASA Technical Reports Server (NTRS)

    Breger, Louis; How, Jonathan P.

    2004-01-01

    Formation flying is an enabling technology for many future space missions. This paper presents extensions to the equations of relative motion expressed in Keplerian orbital elements, including new initialization techniques for general formation configurations. A new linear time-varying form of the equations of relative motion is developed from Gauss Variational Equations and used in a model predictive controller. The linearizing assumptions for these equations are shown to be consistent with typical formation flying scenarios. Several linear, convex initialization techniques are presented, as well as a general, decentralized method for coordinating a tetrahedral formation using differential orbital elements. Control methods are validated using a commercial numerical propagator.

  8. Improved Fractal Space Filling Curves Hybrid Optimization Algorithm for Vehicle Routing Problem.

    PubMed

    Yue, Yi-xiang; Zhang, Tong; Yue, Qun-xing

    2015-01-01

The Vehicle Routing Problem (VRP) is one of the key issues in the optimization of modern logistics systems. In this paper, a modified VRP model with hard time windows is established and a Hybrid Optimization Algorithm (HOA) based on the Fractal Space Filling Curves (SFC) method and a Genetic Algorithm (GA) is introduced. In the proposed algorithm, the SFC method finds an initial feasible solution very quickly, and the GA is then used to improve that initial solution. Thereafter, experimental software was developed and a large number of experimental computations on Solomon's benchmark were studied. The experimental results demonstrate the feasibility and effectiveness of the HOA.
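
    As an illustration of the two-stage idea described above, the sketch below (not the authors' code; time windows and vehicle capacities are omitted, and all parameters are made up) orders customers along a Hilbert space-filling curve to get a fast initial tour and then lets a tiny genetic algorithm refine it.

    ```python
    import random
    import math

    def hilbert_index(n, x, y):
        """Map grid cell (x, y) in an n x n grid (n a power of two) to its
        position along the Hilbert curve (classic xy -> d conversion)."""
        d, s = 0, n // 2
        while s > 0:
            rx = 1 if (x & s) > 0 else 0
            ry = 1 if (y & s) > 0 else 0
            d += s * s * ((3 * rx) ^ ry)
            if ry == 0:                          # rotate the quadrant
                if rx == 1:
                    x, y = s - 1 - x, s - 1 - y
                x, y = y, x
            s //= 2
        return d

    def tour_length(tour, pts):
        return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
                   for i in range(len(tour)))

    def sfc_initial_tour(pts, grid=1024):
        xs = [p[0] for p in pts]; ys = [p[1] for p in pts]
        def to_cell(v, lo, hi):
            return min(grid - 1, int((v - lo) / (hi - lo + 1e-12) * grid))
        keys = [hilbert_index(grid, to_cell(x, min(xs), max(xs)),
                              to_cell(y, min(ys), max(ys))) for x, y in pts]
        return sorted(range(len(pts)), key=lambda i: keys[i])

    def ga_improve(tour, pts, generations=300, pop_size=30):
        def mutate(t):                           # 2-opt style segment reversal
            a, b = sorted(random.sample(range(len(t)), 2))
            return t[:a] + t[a:b + 1][::-1] + t[b + 1:]
        pop = [tour] + [mutate(tour) for _ in range(pop_size - 1)]
        for _ in range(generations):
            pop.sort(key=lambda t: tour_length(t, pts))
            survivors = pop[:pop_size // 2]
            pop = survivors + [mutate(random.choice(survivors)) for _ in survivors]
        return min(pop, key=lambda t: tour_length(t, pts))

    if __name__ == "__main__":
        random.seed(0)
        pts = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(60)]
        init = sfc_initial_tour(pts)
        best = ga_improve(init, pts)
        print(tour_length(init, pts), "->", tour_length(best, pts))
    ```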

  9. Improved Fractal Space Filling Curves Hybrid Optimization Algorithm for Vehicle Routing Problem

    PubMed Central

    Yue, Yi-xiang; Zhang, Tong; Yue, Qun-xing

    2015-01-01

The Vehicle Routing Problem (VRP) is one of the key issues in the optimization of modern logistics systems. In this paper, a modified VRP model with hard time windows is established and a Hybrid Optimization Algorithm (HOA) based on the Fractal Space Filling Curves (SFC) method and a Genetic Algorithm (GA) is introduced. In the proposed algorithm, the SFC method finds an initial feasible solution very quickly, and the GA is then used to improve that initial solution. Thereafter, experimental software was developed and a large number of experimental computations on Solomon's benchmark were studied. The experimental results demonstrate the feasibility and effectiveness of the HOA. PMID:26167171

  10. Application of artificial neural networks in nonlinear analysis of trusses

    NASA Technical Reports Server (NTRS)

    Alam, J.; Berke, L.

    1991-01-01

A method is developed to incorporate a neural network model, based upon the backpropagation algorithm, for material response into nonlinear elastic truss analysis using the initial stiffness method. Different network configurations are developed to assess the accuracy of neural network modeling of nonlinear material response. In addition, a scheme based upon linear interpolation of material data is also implemented for comparison purposes. It is found that the neural network approach can yield very accurate results if used with care. For the type of problems under consideration, it offers a viable alternative to other material modeling methods.
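
    A minimal sketch of the modeling idea, under assumed data: a one-hidden-layer backpropagation network is fitted to a synthetic nonlinear stress-strain curve and compared with the linear-interpolation scheme mentioned in the abstract. The material law, network size, and learning rate are illustrative, not the paper's.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    strain = np.linspace(0.0, 0.05, 200)
    stress = 200e3 * strain / (1.0 + 40.0 * strain)      # hypothetical nonlinear material law

    x = (strain / strain.max()).reshape(-1, 1)           # scale to O(1) for stable training
    y = (stress / stress.max()).reshape(-1, 1)

    hidden, lr = 10, 0.2
    W1 = rng.normal(0.0, 1.0, (1, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 1.0, (hidden, 1)); b2 = np.zeros(1)

    for _ in range(20000):                               # full-batch backpropagation
        h = np.tanh(x @ W1 + b1)
        pred = h @ W2 + b2
        err = pred - y
        dW2 = h.T @ err / len(x);  db2 = err.mean(0)
        dh = err @ W2.T * (1.0 - h ** 2)
        dW1 = x.T @ dh / len(x);   db1 = dh.mean(0)
        W1 -= lr * dW1; b1 -= lr * db1; W2 -= lr * dW2; b2 -= lr * db2

    test_strain = np.array([0.012, 0.03])
    xt = (test_strain / strain.max()).reshape(-1, 1)
    nn_stress = (np.tanh(xt @ W1 + b1) @ W2 + b2).ravel() * stress.max()
    interp_stress = np.interp(test_strain, strain, stress)   # the comparison scheme
    print("network:", nn_stress, "interpolation:", interp_stress)
    ```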

  11. Algorithms for Solvents and Spectral Factors of Matrix Polynomials

    DTIC Science & Technology

    1981-01-01

spectral factors of matrix polynomials LEANG S. SHIEH, YIH T. TSAY and NORMAN P. COLEMAN A generalized Newton method, based on the contracted gradient...of a matrix polynomial, is derived for solving the right (left) solvents and spectral factors of matrix polynomials. Two methods of selecting initial...estimates for rapid convergence of the newly developed numerical method are proposed. Also, new algorithms for solving complete sets of the right

  12. Northwest Manufacturing Initiative

    DTIC Science & Technology

    2012-03-27

    crack growth and threshold stress corrosion cracking evaluation. Threshold stress corrosion cracking was done using the rising step load method with...Group Technology methods to establish manufacturing cells for production efficiency, to develop internal Lean Champions, and to implement rapid... different levels, advisory, core, etc. VI. Core steering committee composed of members that have a significant vested interest. Action Item: Draft

  13. A New Approach to Aircraft Robust Performance Analysis

    NASA Technical Reports Server (NTRS)

    Gregory, Irene M.; Tierno, Jorge E.

    2004-01-01

    A recently developed algorithm for nonlinear system performance analysis has been applied to an F16 aircraft to begin evaluating the suitability of the method for aerospace problems. The algorithm has a potential to be much more efficient than the current methods in performance analysis for aircraft. This paper is the initial step in evaluating this potential.

  14. Methods of Improving the Cognitive and Verbal Development of Children with Cerebral Palsy. Monograph 23.

    ERIC Educational Resources Information Center

    Danilova, L. A.

This four-chapter monograph, translated from a 1977 book written originally in Russian, describes the methodology and results of a study of cognitive activity in children with cerebral palsy. An initial chapter reviews research on impairments in cognitive activity and speech defects in such children and on methods of…

  15. A Mixed-Method Exploration of Functioning in Safe Schools/Healthy Students Partnerships

    ERIC Educational Resources Information Center

    Merrill, Marina L.; Taylor, Nicole L.; Martin, Alison J.; Maxim, Lauren A.; D'Ambrosio, Ryan; Gabriel, Roy M.; Wendt, Staci J.; Mannix, Danyelle; Wells, Michael E.

    2012-01-01

    This paper presents a mixed-method approach to measuring the functioning of Safe Schools/Healthy Students (SS/HS) Initiative partnerships. The SS/HS national evaluation team developed a survey to collect partners' perceptions of functioning within SS/HS partnerships. Average partnership functioning scores were used to rank each site from lowest to…

  16. Shape oscillations of acoustically levitated drops in water: Early research with Bob Apfel on modulated radiation pressure

    NASA Astrophysics Data System (ADS)

    Marston, Philip L.

    2004-05-01

    In 1976, research in collaboration with Bob Apfel demonstrated that low-frequency shape oscillations of hydrocarbon drops levitated in water could be driven using modulated radiation pressure. While that response to modulated ultrasound was subsequently extended to a range of systems, the emphasis here is to recall the initial stages of development in Bob Apfel's laboratory leading to some publications [P. L. Marston and R. E. Apfel, J. Colloid Interface Sci. 68, 280-286 (1979); J. Acoust. Soc. Am. 67, 27-37 (1980)]. The levitation technology used at that time was such that it was helpful to develop a sensitive method for detecting weak oscillations using the interference pattern in laser light scattered by levitated drops. The initial experiments to verify this scattering method used shape oscillations induced by modulated electric fields within the acoustic levitator. Light scattering was subsequently used to detect shape oscillations induced by amplitude modulating a carrier having a high frequency (around 680 kHz) at a resonance of the transducer. Methods were also developed for quantitative measurements of the drop's response and with improved acoustic coupling drop fission was observed. The connection with research currently supported by NASA will also be noted.

  17. Harmonic growth of spherical Rayleigh-Taylor instability in weakly nonlinear regime

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Wanhai; LHD, Institute of Mechanics, Chinese Academy of Sciences, Beijing 100190; Chen, Yulian

Harmonic growth in classical Rayleigh-Taylor instability (RTI) on a spherical interface is analytically investigated using the method of the parameter expansion up to the third order. Our results show that the amplitudes of the first four harmonics will recover those in planar RTI as the interface radius tends to infinity compared against the initial perturbation wavelength. The initial radius dramatically influences the harmonic development. The appearance of the second-order feedback to the initial unperturbed interface (i.e., the zeroth harmonic) makes the interface move towards the spherical center. For these four harmonics, the smaller the initial radius is, the faster they grow.

  18. Protocols for the Initial Treatment of Moderately Severe Juvenile Dermatomyositis: Results of a Children's Arthritis and Rheumatology Research Alliance Consensus Conference

    PubMed Central

    Huber, Adam M.; Giannini, Edward H.; Bowyer, Suzanne L.; Kim, Susan; Lang, Bianca; Lindsley, Carol B.; Pachman, Lauren M.; Pilkington, Clarissa; Reed, Ann M.; Rennebohm, Robert M.; Rider, Lisa G.; Wallace, Carol A.; Feldman, Brian M.

    2010-01-01

Objective: To use juvenile dermatomyositis (JDM) survey data and expert opinion to develop a small number of consensus treatment protocols which reflect current initial treatment of moderately severe JDM. Methods: A consensus meeting was held in Toronto, Ontario, Canada on December 1-2, 2007. Nominal group technique was used to achieve consensus on treatment protocols which represented typical management of moderately severe JDM. Consensus was also reached on which patients these protocols would be applicable to (inclusion and exclusion criteria), initial investigations which should be done prior to initiating one of these protocols, data which should be collected to evaluate these protocols, and concomitant interventions that would be required or recommended. Results: Three protocols were developed which described the first 2 months of treatment. All protocols included corticosteroids and methotrexate. One protocol also included intravenous gammaglobulin. Consensus was achieved for all issues that were addressed by conference participants, although there were some areas of controversy. Conclusions: This study shows that it is possible to achieve consensus on the initial treatment of JDM, despite considerable variation in clinical practice. Once these protocols are extended beyond 2 months, they will be available for clinical use. By using methods which account for differences between patients (confounding by indication), the comparative effectiveness of the protocols will be evaluated. In the future, the goal will be to identify the optimal treatment of moderately severe JDM. PMID:20191521

  19. Idealized Experiments for Optimizing Model Parameters Using a 4D-Variational Method in an Intermediate Coupled Model of ENSO

    NASA Astrophysics Data System (ADS)

    Gao, Chuan; Zhang, Rong-Hua; Wu, Xinrong; Sun, Jichang

    2018-04-01

Large biases exist in real-time ENSO prediction, which can be attributed to uncertainties in initial conditions and model parameters. Previously, a 4D variational (4D-Var) data assimilation system was developed for an intermediate coupled model (ICM) and used to improve ENSO modeling through optimized initial conditions. In this paper, this system is further applied to optimize model parameters. In the ICM used, one important process for ENSO is related to the anomalous temperature of subsurface water entrained into the mixed layer (Te), which is empirically and explicitly related to sea level (SL) variation. The strength of the thermocline effect on SST (referred to simply as "the thermocline effect") is represented by an introduced parameter, αTe. A numerical procedure is developed to optimize this model parameter through the 4D-Var assimilation of SST data in a twin experiment context with an idealized setting. Experiments having their initial condition optimized only, and having their initial condition plus this additional model parameter optimized, are compared. It is shown that ENSO evolution can be more effectively recovered by including the additional optimization of this parameter in ENSO modeling. The demonstrated feasibility of optimizing model parameters and initial conditions together through the 4D-Var method provides a modeling platform for ENSO studies. Further applications of the 4D-Var data assimilation system implemented in the ICM are also discussed.
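
    A toy twin-experiment sketch of the parameter-optimization step (a scalar stand-in for the ICM, not the authors' model): a parameter playing the role of αTe is recovered by minimizing the misfit between simulated and synthetic "observed" SST anomalies. All dynamics and constants below are assumptions.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    rng = np.random.default_rng(1)
    nt, dt, gamma = 200, 0.1, 0.2
    sea_level = np.sin(np.linspace(0, 6 * np.pi, nt))       # prescribed SL anomaly forcing

    def simulate_sst(alpha_te):
        """Toy SST anomaly equation: dT/dt = -gamma*T + alpha_Te * F(SL)."""
        T = np.zeros(nt)
        for k in range(nt - 1):
            T[k + 1] = T[k] + dt * (-gamma * T[k] + alpha_te * sea_level[k])
        return T

    alpha_true = 0.8
    sst_obs = simulate_sst(alpha_true) + 0.02 * rng.normal(size=nt)  # synthetic observations

    def cost(alpha_te):                                      # quadratic misfit functional
        return np.sum((simulate_sst(alpha_te) - sst_obs) ** 2)

    result = minimize_scalar(cost, bounds=(0.0, 2.0), method="bounded")
    print("recovered alpha_Te:", result.x, "truth:", alpha_true)
    ```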

  20. Using a 4D-Variational Method to Optimize Model Parameters in an Intermediate Coupled Model of ENSO

    NASA Astrophysics Data System (ADS)

    Gao, C.; Zhang, R. H.

    2017-12-01

Large biases exist in real-time ENSO prediction, which is attributed to uncertainties in initial conditions and model parameters. Previously, a four-dimensional variational (4D-Var) data assimilation system was developed for an intermediate coupled model (ICM) and used to improve ENSO modeling through optimized initial conditions. In this paper, this system is further applied to optimize model parameters. In the ICM used, one important process for ENSO is related to the anomalous temperature of subsurface water entrained into the mixed layer (Te), which is empirically and explicitly related to sea level (SL) variation, written as Te = αTe × FTe(SL). The introduced parameter, αTe, represents the strength of the thermocline effect on sea surface temperature (SST; referred to as the thermocline effect). A numerical procedure is developed to optimize this model parameter through the 4D-Var assimilation of SST data in a twin experiment context with an idealized setting. Experiments in which only the initial condition is optimized are compared with experiments in which both the initial condition and this additional model parameter are optimized. It is shown that ENSO evolution can be more effectively recovered by including the additional optimization of this parameter in ENSO modeling. The demonstrated feasibility of optimizing the model parameter and initial condition together through the 4D-Var method provides a modeling platform for ENSO studies. Further applications of the 4D-Var data assimilation system implemented in the ICM are also discussed.

  1. A heuristic neural network initialization scheme for modeling nonlinear functions in engineering mechanics: continuous development

    NASA Astrophysics Data System (ADS)

    Pei, Jin-Song; Mai, Eric C.

    2007-04-01

This paper introduces a continuous effort towards the development of a heuristic initialization methodology for constructing multilayer feedforward neural networks to model nonlinear functions. In this and previous studies that this work is built upon, including the one presented at SPIE 2006, the authors do not presume to provide a universal method to approximate arbitrary functions, rather the focus is given to the development of a rational and unambiguous initialization procedure that applies to the approximation of nonlinear functions in the specific domain of engineering mechanics. The applications of this exploratory work can be numerous, including those associated with potential correlation and interpretation of the inner workings of neural networks, such as damage detection. The goal of this study is fulfilled by utilizing the governing physics and mathematics of nonlinear functions and the strength of the sigmoidal basis function. A step-by-step graphical procedure utilizing a few neural network prototypes as "templates" to approximate commonly seen memoryless nonlinear functions of one or two variables is further developed in this study. Decomposition of complex nonlinear functions into a summation of some simpler nonlinear functions is utilized to exploit this prototype-based initialization methodology. Training examples are presented to demonstrate the rationality and efficiency of the proposed methodology when compared with the popular Nguyen-Widrow initialization algorithm. Future work is also identified.
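
    For reference, a minimal sketch of the Nguyen-Widrow initialization used as the comparison baseline in the abstract (the authors' prototype-based heuristic itself is not reproduced); inputs are assumed to be scaled to [-1, 1].

    ```python
    import numpy as np

    def nguyen_widrow_init(n_inputs, n_hidden, rng=None):
        rng = rng if rng is not None else np.random.default_rng()
        beta = 0.7 * n_hidden ** (1.0 / n_inputs)            # scale factor
        W = rng.uniform(-0.5, 0.5, size=(n_hidden, n_inputs))
        W *= beta / np.linalg.norm(W, axis=1, keepdims=True) # rescale each neuron's weights
        b = rng.uniform(-beta, beta, size=n_hidden)          # spread the active regions
        return W, b

    W, b = nguyen_widrow_init(n_inputs=2, n_hidden=8, rng=np.random.default_rng(0))
    print(W.shape, b.shape, np.linalg.norm(W, axis=1))       # each row norm equals beta
    ```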

  2. Patient safety initiatives in Central and Eastern Europe: A mixed methods approach by the LINNEAUS collaboration on patient safety in primary care

    PubMed Central

    Godycki-Cwirko, Maciek; Esmail, Aneez; Dovey, Susan; Wensing, Michel; Parker, Dianne; Kowalczyk, Anna; Błaszczyk, Honorata; Kosiek, Katarzyna

    2015-01-01

Background: Despite patient safety being recognized as an important healthcare issue in the European Union, there has been variable implementation of patient safety initiatives in Central and Eastern Europe (CEE). Objective: To assess the status of patient safety initiatives in countries in CEE; to describe a process of engagement in Poland, which can serve as a template for the implementation of patient safety initiatives in primary care. Methods: A mixed methods design was used. We conducted a review of literature focusing on publications from CEE, an inventory of patient safety initiatives in CEE countries, interviews with key informants, international survey, review of national reporting systems, and pilot demonstrator project in Poland with implementation of patient safety toolkits assessment. Results: There was no published patient safety research from Albania, Belarus, Greece, Latvia, Lithuania, Romania, or Russia. Nine papers were found from Bulgaria, Croatia, the Czech Republic, Poland, Serbia, and Slovenia. In most of the CEE countries, patient safety had been addressed at the policy level although the focus was mainly in hospital care. There was a dearth of activity in primary care. The use of patient improvement strategies was low. Conclusion: International cooperation as exemplified in the demonstrator project can help in the development and implementation of patient safety initiatives in primary care in changing the emphasis away from a blame culture to one where greater emphasis is placed on improvement and learning. PMID:26339839

  3. Frequent development of combined pituitary hormone deficiency in patients initially diagnosed as isolated growth hormone deficiency: a long term follow-up of patients from a single center.

    PubMed

    Otto, Aline P; França, Marcela M; Correa, Fernanda A; Costalonga, Everlayny F; Leite, Claudia C; Mendonca, Berenice B; Arnhold, Ivo J P; Carvalho, Luciani R S; Jorge, Alexander A L

    2015-08-01

Children initially diagnosed with isolated GH deficiency (IGHD) have a variable rate of progression to combined pituitary hormone deficiency (CPHD) during follow-up. The aim was to evaluate the development of CPHD in a group of childhood-onset IGHD patients followed at a single tertiary center over a long period of time. We retrospectively analyzed data from 83 patients initially diagnosed as IGHD with a mean follow-up of 15.2 years. The Kaplan-Meier method and Cox regression analysis were used to estimate the temporal progression and to identify risk factors for the development of CPHD over time. Of the 83 patients initially with IGHD, 37 (45%) developed CPHD after a median follow-up time of 5.4 years (range from 1.2 to 21 years). LH and FSH deficiencies were the most commonly developed pituitary hormone deficiencies (38%), followed by TSH (31%), ACTH (12%) and ADH (5%) deficiencies. ADH deficiency (3.1 ± 1 years from GHD diagnosis) presented earlier and ACTH deficiency (9.3 ± 3.5 years) presented later during follow-up compared to LH/FSH (8.3 ± 4 years) and TSH (7.5 ± 5.6 years) deficiencies. In a Cox regression model, pituitary stalk abnormality was the strongest risk factor for the development of CPHD (hazard ratio of 3.28; p = 0.002). Our study indicated a high frequency of development of CPHD in patients initially diagnosed as IGHD in childhood. Half of our patients with IGHD had developed a second hormone deficiency after 5 years of follow-up, reinforcing the need for lifelong monitoring of pituitary function in these patients.
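
    A minimal sketch of the survival-analysis workflow the abstract describes, on synthetic data (the lifelines package is assumed available; all numbers are illustrative, not study data): a Kaplan-Meier estimate of time from IGHD diagnosis to CPHD and a Cox model with pituitary-stalk abnormality as a covariate.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import KaplanMeierFitter, CoxPHFitter

    rng = np.random.default_rng(0)
    n = 83
    stalk_abnormal = rng.integers(0, 2, n)
    # shorter time-to-CPHD when the stalk is abnormal (hazard ratio > 1 by construction)
    time_to_cphd = rng.exponential(scale=np.where(stalk_abnormal == 1, 6.0, 18.0))
    followup = rng.uniform(2, 21, n)
    observed = (time_to_cphd <= followup).astype(int)        # 1 = developed CPHD
    duration = np.minimum(time_to_cphd, followup)            # right-censored otherwise

    df = pd.DataFrame({"years": duration, "cphd": observed,
                       "stalk_abnormal": stalk_abnormal})

    kmf = KaplanMeierFitter()
    kmf.fit(df["years"], event_observed=df["cphd"])
    print(kmf.median_survival_time_)                         # median time to CPHD

    cph = CoxPHFitter()
    cph.fit(df, duration_col="years", event_col="cphd")
    cph.print_summary()                                      # hazard ratio for stalk abnormality
    ```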

  4. A review for identification of initiating events in event tree development process on nuclear power plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riyadi, Eko H., E-mail: e.riyadi@bapeten.go.id

    2014-09-30

An initiating event is defined as any event, either internal or external to the nuclear power plant (NPP), that perturbs the steady-state operation of the plant, if operating, thereby initiating an abnormal event such as a transient or a loss of coolant accident (LOCA) within the NPP. These initiating events trigger sequences of events that challenge plant control and safety systems whose failure could potentially lead to core damage or a large early release. Selection of initiating events consists of two steps: the first step is the definition of possible events, for example through a comprehensive engineering evaluation and the construction of a top-level logic model. The second step is the grouping of identified initiating events by the safety function to be performed or by combinations of system responses. Therefore, the purpose of this paper is to discuss initiating event identification in the event tree development process and to review other probabilistic safety assessments (PSAs). The identification of initiating events also involves past operating experience, review of other PSAs, failure mode and effects analysis (FMEA), feedback from system modeling, and the master logic diagram (a special type of fault tree). By studying the traditional US PSA categorization in detail, the important initiating events can be obtained and categorized into LOCAs, transients, and external events.

  5. Mixed-Initiative Activity Planning for Mars Rovers

    NASA Technical Reports Server (NTRS)

    Bresina, John; Jonsson, Ari; Morris, Paul; Rajan, Kanna

    2005-01-01

    One of the ground tools used to operate the Mars Exploration Rovers is a mixed-initiative planning system called MAPGEN. The role of the system is to assist operators building daily plans for each of the rovers, maximizing science return, while maintaining rover safety and abiding by science and engineering constraints. In this paper, we describe the MAPGEN system, focusing on the mixed-initiative planning aspect. We note important challenges, both in terms of human interaction and in terms of automated reasoning requirements. We then describe the approaches taken in MAPGEN, focusing on the novel methods developed by our team.

  6. Application of the cognitive therapy model to initial crisis assessment.

    PubMed

    Calvert, Patricia; Palmer, Christine

    2003-03-01

    This article provides a background to the development of cognitive therapy and cognitive therapeutic skills with a specific focus on the treatment of a depressive episode. It discusses the utility of cognitive therapeutic strategies to the model of crisis theory and initial crisis assessment currently used by the Community Assessment & Treatment Team of Waitemata District Health Board on the North Shore of Auckland, New Zealand. A brief background to cognitive therapy is provided, followed by a comprehensive example of the use of the Socratic questioning method in guiding collaborative assessment and treatment of suicidality by nurses during the initial crisis assessment.

  7. Study of high-performance canonical molecular orbitals calculation for proteins

    NASA Astrophysics Data System (ADS)

    Hirano, Toshiyuki; Sato, Fumitoshi

    2017-11-01

The canonical molecular orbital (CMO) calculation can help in understanding chemical properties and reactions in proteins. However, it is difficult to perform the CMO calculation of proteins because of its self-consistent field (SCF) convergence problem and expensive computational cost. To reliably obtain the CMOs of proteins, we carry out research and development of high-performance CMO applications and perform experimental studies. We have proposed the third-generation density-functional calculation method for the SCF procedure, which is more advanced than the FILE and direct methods. Our method is based on Cholesky decomposition for the two-electron integral calculation and the modified grid-free method for the pure-XC term evaluation. By using the third-generation density-functional calculation method, the Coulomb, Fock-exchange, and pure-XC terms can be obtained by a simple linear algebraic procedure in the SCF loop. Therefore, we can expect good parallel performance in solving the SCF problem by using a well-optimized linear algebra library such as BLAS on distributed-memory parallel computers. The third-generation density-functional calculation method is implemented in our program, ProteinDF. To compute the electronic structure of a large molecule, we must not only overcome the expensive computational cost but also provide a good initial guess for safe SCF convergence. In order to prepare a precise initial guess for the macromolecular system, we have developed the quasi-canonical localized orbital (QCLO) method. The QCLO has the characteristics of both localized and canonical orbitals in a certain region of the molecule. We have succeeded in the CMO calculations of proteins by using the QCLO method. For simplified and semi-automated calculation with the QCLO method, we have also developed a Python-based program, QCLObot.
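
    A minimal sketch (not ProteinDF code) of the tolerance-truncated pivoted Cholesky decomposition that underlies compressing a symmetric positive semidefinite matrix such as the two-electron-integral supermatrix; a random low-rank matrix stands in for real integrals.

    ```python
    import numpy as np

    def pivoted_cholesky(A, tol=1e-6):
        """Return L (n x m) and pivot order so that A[piv][:, piv] ~ L @ L.T."""
        A = np.array(A, dtype=float, copy=True)
        n = A.shape[0]
        L = np.zeros((n, n))
        piv = np.arange(n)
        d = np.diag(A).copy()                         # remaining Schur-complement diagonal
        for k in range(n):
            j = k + int(np.argmax(d[k:]))             # pivot on largest remaining diagonal
            if d[j] < tol:
                return L[:, :k], piv                  # truncate: residual below tolerance
            piv[[k, j]] = piv[[j, k]]                 # swap rows/columns k and j
            d[[k, j]] = d[[j, k]]
            A[[k, j], :] = A[[j, k], :]
            A[:, [k, j]] = A[:, [j, k]]
            L[[k, j], :] = L[[j, k], :]
            L[k, k] = np.sqrt(d[k])
            L[k + 1:, k] = (A[k + 1:, k] - L[k + 1:, :k] @ L[k, :k]) / L[k, k]
            d[k + 1:] -= L[k + 1:, k] ** 2
        return L, piv

    rng = np.random.default_rng(0)
    B = rng.normal(size=(40, 8))
    V = B @ B.T                                       # rank-8 PSD stand-in for an ERI matrix
    L, piv = pivoted_cholesky(V, tol=1e-10)
    err = np.abs(V[np.ix_(piv, piv)] - L @ L.T).max()
    print("retained vectors:", L.shape[1], "max reconstruction error:", err)
    ```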

  8. Lightweight Small Arms Technologies

    DTIC Science & Technology

    2006-11-01

conducted using several methods. Initial measurements were obtained using a strand burner, followed by closed bomb measurements using both pressed... pellets and entire cases. Specialized fixtures were developed to measure primer and booster combustion properties. The final verification of interior

  9. An Overview of Food Loss and Waste: why does it Matter?

    NASA Astrophysics Data System (ADS)

    Ghosh, Purabi R.; Sharma, Shashi B.; Haigh, Yvonne T.; Evers, A. L. Barbara; Ho, Goen

    2015-10-01

    This paper provides an overview of food waste in the context of food security, resources management and environment health. It compares approaches taken by various governments, community groups, civil societies and private sector organisations to reduce food waste in the developed and developing countries. What constitutes ‘food waste’ is not as simple as it may appear due to diverse food waste measurement protocols and different data documentation methods used worldwide. There is a need to improve food waste data collection methods and implementation of effective strategies, policies and actions to reduce food waste. Global initiatives are urgently needed to: enhance awareness of the value of food; encourage countries to develop policies that motivate community and businesses to reduce food waste; encourage and provide assistance to needy countries for improving markets, transport and storage infrastructure to minimise food waste across the value chain; and, develop incentives that encourage businesses to donate food. In some countries, particularly in Europe, initiatives on food waste management have started to gain momentum. Food waste is a global problem and it needs urgent attention and integrated actions of stakeholders across the food value chain to develop global solutions for the present and future generations.

  10. National Quality Measures for Child Mental Health Care: Background, Progress, and Next Steps

    PubMed Central

    Murphy, J. Michael; Scholle, Sarah Hudson; Hoagwood, Kimberly Eaton; Sachdeva, Ramesh C.; Mangione-Smith, Rita; Woods, Donna; Kamin, Hayley S.; Jellinek, Michael

    2013-01-01

    OBJECTIVE: To review recent health policies related to measuring child health care quality, the selection processes of national child health quality measures, the nationally recommended quality measures for child mental health care and their evidence strength, the progress made toward developing new measures, and early lessons learned from these national efforts. METHODS: Methods used included description of the selection process of child health care quality measures from 2 independent national initiatives, the recommended quality measures for child mental health care, and the strength of scientific evidence supporting them. RESULTS: Of the child health quality measures recommended or endorsed during these national initiatives, only 9 unique measures were related to child mental health. CONCLUSIONS: The development of new child mental health quality measures poses methodologic challenges that will require a paradigm shift to align research with its accelerated pace. PMID:23457148

  11. An approach to constrained aerodynamic design with application to airfoils

    NASA Technical Reports Server (NTRS)

    Campbell, Richard L.

    1992-01-01

    An approach was developed for incorporating flow and geometric constraints into the Direct Iterative Surface Curvature (DISC) design method. In this approach, an initial target pressure distribution is developed using a set of control points. The chordwise locations and pressure levels of these points are initially estimated either from empirical relationships and observed characteristics of pressure distributions for a given class of airfoils or by fitting the points to an existing pressure distribution. These values are then automatically adjusted during the design process to satisfy the flow and geometric constraints. The flow constraints currently available are lift, wave drag, pitching moment, pressure gradient, and local pressure levels. The geometric constraint options include maximum thickness, local thickness, leading-edge radius, and a 'glove' constraint involving inner and outer bounding surfaces. This design method was also extended to include the successive constraint release (SCR) approach to constrained minimization.

  12. Charging conditions research to increase the initial projected velocity at different initial charge temperatures

    NASA Astrophysics Data System (ADS)

    Ishchenko, Aleksandr; Burkin, Viktor; Kasimov, Vladimir; Samorokova, Nina; Zykova, Angelica; Diachkovskii, Alexei

    2017-11-01

The problems of the defense industry occupy an important place in the constantly developing modern world. The development of defense technology does not stop, nor do studies of internal ballistics. Scientists worldwide face the task of controlling the main characteristics of a ballistic experiment. The main characteristics of a ballistic experiment are the maximum pressure in the combustion chamber, Pmax, and the velocity of the projected element at the moment it leaves the barrel, UM. During this work, the combustion law of a new high-energy fuel was determined in a ballistic experiment for different initial temperatures. This combustion law was then used in a parametric study of the dependence of Pmax and UM on the powder charge mass and on a traveling charge. Optimal loading conditions for improving the initial velocity at pressures up to 600 MPa were obtained for different initial temperatures. In this paper, one of the most promising launching schemes is considered, as well as a method for increasing the muzzle velocity of a projected element to 3317 m/s.

  13. Prediction of fracture initiation in square cup drawing of DP980 using an anisotropic ductile fracture criterion

    NASA Astrophysics Data System (ADS)

    Park, N.; Huh, H.; Yoon, J. W.

    2017-09-01

This paper deals with the prediction of fracture initiation in square cup drawing of DP980 steel sheet with a thickness of 1.2 mm. In an attempt to consider the influence of material anisotropy on the fracture initiation, an uncoupled anisotropic ductile fracture criterion is developed based on the Lou-Huh ductile fracture criterion. Tensile tests are carried out at different loading directions of 0°, 45°, and 90° to the rolling direction of the sheet using various specimen geometries including pure shear, dog-bone, and flat grooved specimens so as to calibrate the parameters of the proposed fracture criterion. The equivalent plastic strain distribution on the specimen surface is computed using the Digital Image Correlation (DIC) method until a surface crack initiates. The proposed fracture criterion is implemented into the commercial finite element code ABAQUS/Explicit by developing a Vectorized User-defined MATerial (VUMAT) subroutine which features the non-associated flow rule. Simulation results of the square cup drawing test clearly show that the proposed fracture criterion is capable of predicting fracture initiation with sufficient accuracy, considering the material anisotropy.

  14. A case study of polypharmacy management in nine European countries: Implications for change management and implementation

    PubMed Central

    MacLure, Katie; Stewart, Derek; Kempen, Thomas; Mair, Alpana; Castel-Branco, Margarida; Codina, Carles; Fernandez-Llimos, Fernando; Fleming, Glenda; Gennimata, Dimitra; Gillespie, Ulrika; Harrison, Cathy; Illario, Maddalena; Junius-Walker, Ulrike; Kampolis, Christos F.; Kardas, Przemyslaw; Lewek, Pawel; Malva, João; Menditto, Enrica; Scullin, Claire; Wiese, Birgitt

    2018-01-01

    Background Multimorbidity and its associated polypharmacy contribute to an increase in adverse drug events, hospitalizations, and healthcare spending. This study aimed to address: what exists regarding polypharmacy management in the European Union (EU); why programs were, or were not, developed; and, how identified initiatives were developed, implemented, and sustained. Methods Change management principles (Kotter) and normalization process theory (NPT) informed data collection and analysis. Nine case studies were conducted in eight EU countries: Germany (Lower Saxony), Greece, Italy (Campania), Poland, Portugal, Spain (Catalonia), Sweden (Uppsala), and the United Kingdom (Northern Ireland and Scotland). The workflow included a review of country/region specific polypharmacy policies, key informant interviews with stakeholders involved in policy development and implementation and, focus groups of clinicians and managers. Data were analyzed using thematic analysis of individual cases and framework analysis across cases. Results Polypharmacy initiatives were identified in five regions (Catalonia, Lower Saxony, Northern Ireland, Scotland, and Uppsala) and included all care settings. There was agreement, even in cases without initiatives, that polypharmacy is a significant issue to address. Common themes regarding the development and implementation of polypharmacy management initiatives were: locally adapted solutions, organizational culture supporting innovation and teamwork, adequate workforce training, multidisciplinary teams, changes in workflow, redefinition of roles and responsibilities of professionals, policies and legislation supporting the initiative, and data management and information and communication systems to assist development and implementation. Depending on the setting, these were considered either facilitators or barriers to implementation. Conclusion Within the studied EU countries, polypharmacy management was not widely addressed. These results highlight the importance of change management and theory-based implementation strategies, and provide examples of polypharmacy management initiatives that can assist managers and policymakers in developing new programs or scaling up existing ones, particularly in places currently lacking such initiatives. PMID:29668763

  15. Derivatives of buckling loads and vibration frequencies with respect to stiffness and initial strain parameters

    NASA Technical Reports Server (NTRS)

    Haftka, Raphael T.; Cohen, Gerald A.; Mroz, Zenon

    1990-01-01

    A uniform variational approach to sensitivity analysis of vibration frequencies and bifurcation loads of nonlinear structures is developed. Two methods of calculating the sensitivities of bifurcation buckling loads and vibration frequencies of nonlinear structures, with respect to stiffness and initial strain parameters, are presented. A direct method requires calculation of derivatives of the prebuckling state with respect to these parameters. An adjoint method bypasses the need for these derivatives by using instead the strain field associated with the second-order postbuckling state. An operator notation is used and the derivation is based on the principle of virtual work. The derivative computations are easily implemented in structural analysis programs. This is demonstrated by examples using a general purpose, finite element program and a shell-of-revolution program.
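
    A minimal sketch of the direct-differentiation idea on a linear eigenvalue stand-in (the paper's nonlinear prebuckling and adjoint machinery is not reproduced): the classical formula dλ/dp = φᵀ(dK/dp - λ dM/dp)φ / (φᵀMφ) for a hypothetical 3-DOF spring-mass system, checked against a finite difference.

    ```python
    import numpy as np
    from scipy.linalg import eigh

    def assemble(p):
        """Hypothetical 3-DOF spring-mass system; p scales one spring stiffness."""
        K = np.array([[2.0 + p, -1.0,  0.0],
                      [-1.0,     2.0, -1.0],
                      [0.0,     -1.0,  1.0]])
        M = np.diag([1.0, 1.0, 2.0])
        return K, M

    p0, dp = 1.0, 1e-6
    K, M = assemble(p0)
    lam, vecs = eigh(K, M)                         # K phi = lam M phi
    phi = vecs[:, 0]                               # lowest mode (buckling/frequency analogue)

    dK = (assemble(p0 + dp)[0] - K) / dp           # dK/dp (dM/dp = 0 here)
    analytic = phi @ dK @ phi / (phi @ M @ phi)    # eigenvalue derivative formula

    lam_pert = eigh(*assemble(p0 + dp), eigvals_only=True)[0]
    finite_diff = (lam_pert - lam[0]) / dp         # finite-difference check
    print("analytic:", analytic, "finite difference:", finite_diff)
    ```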

  16. Development Optimization and Uncertainty Analysis Methods for Oil and Gas Reservoirs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ettehadtavakkol, Amin, E-mail: amin.ettehadtavakkol@ttu.edu; Jablonowski, Christopher; Lake, Larry

Uncertainty complicates the development optimization of oil and gas exploration and production projects, but methods have been devised to analyze uncertainty and its impact on optimal decision-making. This paper compares two methods for development optimization and uncertainty analysis: Monte Carlo (MC) simulation and stochastic programming. Two example problems for a gas field development and an oilfield development are solved and discussed to elaborate the advantages and disadvantages of each method. Development optimization involves decisions regarding the configuration of initial capital investment and subsequent operational decisions. Uncertainty analysis involves the quantification of the impact of uncertain parameters on the optimum design concept. The gas field development problem is designed to highlight the differences in the implementation of the two methods and to show that both methods yield the exact same optimum design. The results show that both MC optimization and stochastic programming provide unique benefits, and that the choice of method depends on the goal of the analysis. While the MC method generates more useful information, along with the optimum design configuration, the stochastic programming method is more computationally efficient in determining the optimal solution. Reservoirs comprise multiple compartments and layers with multiphase flow of oil, water, and gas. We present a workflow for development optimization under uncertainty for these reservoirs, and solve an example on the design optimization of a multicompartment, multilayer oilfield development.
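
    A toy sketch of the comparison (made-up economics, not the paper's reservoir models): a single capacity decision under uncertain deliverability is optimized once by Monte Carlo search over candidate designs and once as a scenario-based deterministic-equivalent stochastic program; the two land on essentially the same design.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    rng = np.random.default_rng(0)
    price, capex_per_unit = 3.0, 1.0                       # made-up economics

    def profit(capacity, deliverability):
        return price * np.minimum(capacity, deliverability) - capex_per_unit * capacity

    # Monte Carlo optimization: sample outcomes, grid-search the design variable
    samples = rng.lognormal(mean=np.log(50.0), sigma=0.4, size=20000)
    grid = np.linspace(10.0, 150.0, 281)
    x_mc = grid[int(np.argmax([profit(x, samples).mean() for x in grid]))]

    # Stochastic programming: a scenario-based deterministic-equivalent model
    scenarios = np.quantile(samples, np.linspace(0.0025, 0.9975, 200))
    weights = np.full(scenarios.size, 1.0 / scenarios.size)
    expected_loss = lambda x: -np.sum(weights * profit(x, scenarios))
    x_sp = minimize_scalar(expected_loss, bounds=(10.0, 150.0), method="bounded").x

    print("MC optimum capacity:", x_mc, " SP optimum capacity:", round(x_sp, 1))
    ```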

  17. Non-Destructive Current Sensing for Energy Efficiency Monitoring in Buildings with Environmental Certification

    PubMed Central

    Mota, Lia Toledo Moreira; Mota, Alexandre de Assis; Coiado, Lorenzo Campos

    2015-01-01

    Nowadays, buildings environmental certifications encourage the implementation of initiatives aiming to increase energy efficiency in buildings. In these certification systems, increased energy efficiency arising from such initiatives must be demonstrated. Thus, a challenge to be faced is how to check the increase in energy efficiency related to each of the employed initiatives without a considerable building retrofit. In this context, this work presents a non-destructive method for electric current sensing to assess implemented initiatives to increase energy efficiency in buildings with environmental certification. This method proposes the use of a sensor that can be installed directly in the low voltage electrical circuit conductors that are powering the initiative under evaluation, without the need for reforms that result in significant costs, repair, and maintenance. The proposed sensor consists of three elements: an air-core transformer current sensor, an amplifying/filtering stage, and a microprocessor. A prototype of the proposed sensor was developed and tests were performed to validate this sensor. Based on laboratory tests, it was possible to characterize the proposed current sensor with respect to the number of turns and cross-sectional area of the primary and secondary coils. Furthermore, using the Least Squares Method, it was possible to determine the efficiency of the air core transformer current sensor (the best efficiency found, considering different test conditions, was 2%), which leads to a linear output response. PMID:26184208
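
    A minimal sketch of the least-squares step mentioned in the abstract, on synthetic readings: fit the linear response of the current sensor and invert it to map a raw output back to conductor current. Gain, offset, and noise level are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    true_current = np.linspace(0.5, 20.0, 40)                 # amperes applied in the test
    sensor_output = (0.02 * true_current + 0.003
                     + 0.0008 * rng.normal(size=true_current.size))

    # fit sensor_output = gain * current + offset by ordinary least squares
    A = np.column_stack([true_current, np.ones_like(true_current)])
    (gain, offset), *_ = np.linalg.lstsq(A, sensor_output, rcond=None)

    measured_current = (sensor_output - offset) / gain        # inverse calibration
    print("gain:", gain, "offset:", offset,
          "max error [A]:", np.abs(measured_current - true_current).max())
    ```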

  18. Non-Destructive Current Sensing for Energy Efficiency Monitoring in Buildings with Environmental Certification.

    PubMed

    Mota, Lia Toledo Moreira; Mota, Alexandre de Assis; Coiado, Lorenzo Campos

    2015-07-10

    Nowadays, buildings environmental certifications encourage the implementation of initiatives aiming to increase energy efficiency in buildings. In these certification systems, increased energy efficiency arising from such initiatives must be demonstrated. Thus, a challenge to be faced is how to check the increase in energy efficiency related to each of the employed initiatives without a considerable building retrofit. In this context, this work presents a non-destructive method for electric current sensing to assess implemented initiatives to increase energy efficiency in buildings with environmental certification. This method proposes the use of a sensor that can be installed directly in the low voltage electrical circuit conductors that are powering the initiative under evaluation, without the need for reforms that result in significant costs, repair, and maintenance. The proposed sensor consists of three elements: an air-core transformer current sensor, an amplifying/filtering stage, and a microprocessor. A prototype of the proposed sensor was developed and tests were performed to validate this sensor. Based on laboratory tests, it was possible to characterize the proposed current sensor with respect to the number of turns and cross-sectional area of the primary and secondary coils. Furthermore, using the Least Squares Method, it was possible to determine the efficiency of the air core transformer current sensor (the best efficiency found, considering different test conditions, was 2%), which leads to a linear output response.

  19. Methods of extending signatures and training without ground information. [data processing, pattern recognition

    NASA Technical Reports Server (NTRS)

    Henderson, R. G.; Thomas, G. S.; Nalepka, R. F.

    1975-01-01

Methods of performing signature extension, using LANDSAT-1 data, are explored. The emphasis is on improving the performance and cost-effectiveness of large area wheat surveys. Two methods were developed: ASC and MASC. Two methods previously used with aircraft data, Ratio and RADIFF, were adapted to and tested on LANDSAT-1 data. An investigation into the sources and nature of between-scene data variations was included. Initial investigations into the selection of training fields without in situ ground truth were undertaken.

  20. A method for the analysis of nonlinearities in aircraft dynamic response to atmospheric turbulence

    NASA Technical Reports Server (NTRS)

    Sidwell, K.

    1976-01-01

    An analytical method is developed which combines the equivalent linearization technique for the analysis of the response of nonlinear dynamic systems with the amplitude modulated random process (Press model) for atmospheric turbulence. The method is initially applied to a bilinear spring system. The analysis of the response shows good agreement with exact results obtained by the Fokker-Planck equation. The method is then applied to an example of control-surface displacement limiting in an aircraft with a pitch-hold autopilot.
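
    A minimal sketch of statistical equivalent linearization for a bilinear spring under stationary Gaussian white-noise excitation (illustrative parameters; the Press amplitude-modulation model is not included): the equivalent stiffness k_eq = E[x f(x)]/E[x^2] and the response variance are iterated to a fixed point.

    ```python
    import numpy as np
    from scipy.integrate import quad

    k1, k2, a = 1.0, 0.3, 1.0            # bilinear spring: stiffness k1 up to |x|=a, then k2
    c, S0 = 0.1, 0.05                    # damping and assumed white-noise spectral density

    def f_spring(x):
        return np.where(np.abs(x) <= a, k1 * x,
                        np.sign(x) * (k1 * a + k2 * (np.abs(x) - a)))

    def k_equivalent(sigma):
        """k_eq minimizing E[(f(x) - k x)^2] for x ~ N(0, sigma^2)."""
        pdf = lambda x: np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
        num, _ = quad(lambda x: x * f_spring(x) * pdf(x), -8 * sigma, 8 * sigma)
        return num / sigma**2

    sigma = 1.0
    for _ in range(50):                  # fixed-point iteration on the response statistics
        keq = k_equivalent(sigma)
        sigma = np.sqrt(np.pi * S0 / (c * keq))   # assumed SDOF white-noise variance formula
    print("equivalent stiffness:", keq, "response std:", sigma)
    ```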

  1. Lessons learned applying CASE methods/tools to Ada software development projects

    NASA Technical Reports Server (NTRS)

    Blumberg, Maurice H.; Randall, Richard L.

    1993-01-01

    This paper describes the lessons learned from introducing CASE methods/tools into organizations and applying them to actual Ada software development projects. This paper will be useful to any organization planning to introduce a software engineering environment (SEE) or evolving an existing one. It contains management level lessons learned, as well as lessons learned in using specific SEE tools/methods. The experiences presented are from Alpha Test projects established under the STARS (Software Technology for Adaptable and Reliable Systems) project. They reflect the front end efforts by those projects to understand the tools/methods, initial experiences in their introduction and use, and later experiences in the use of specific tools/methods and the introduction of new ones.

  2. Implementing real-time GNSS monitoring to investigate continental rift initiation processes

    NASA Astrophysics Data System (ADS)

    Jones, J. R.; Stamps, D. S.; Wauthier, C.; Daniels, M. D.; Saria, E.; Ji, K. H.; Mencin, D.; Ntambila, D.

    2017-12-01

    Continental rift initiation remains an elusive, yet fundamental, process in the context of plate tectonic theory. Our early work in the Natron Rift, Tanzania, the Earth's archetype continental rift initiation setting, indicates feedback between volcanic deformation and fault slip play a key role in the rift initiation process. We found evidence that fault slip on the Natron border fault during active volcanism at Ol Doniyo Lengai in 2008 required only 0.01 MPa of Coulomb stress change. This previous study was limited by GPS constraints 18 km from the volcano, rather than immediately adjacent on the rift shoulder. We hypothesize that fault slip adjacent to the volcano creeps, and without the need for active eruption. We also hypothesize silent slip events may occur over time-scales less than 1 day. To test our hypotheses we designed a GNSS network with 4 sites on the flanks of Ol Doinyo Lengai and 1 site on the adjacent Natron border fault with the capability to calculate 1 second, 3-5 cm precision positions. Data is transmitted to UNAVCO in real-time with remote satellite internet, which we automatically import to the EarthCube building block CHORDS (Cloud Hosted Real-time Data Services for the Geosciences) using our newly developed method. We use CHORDS to monitor and evaluate the health of our network while visualizing the GNSS data in real-time. In addition to our import method we have also developed user-friendly capabilities to export GNSS positions (longitude, latitude, height) with CHORDS assuming the data are available at UNAVCO in NMEA standardized format through the Networked Transport of RTCM via Internet Protocol (NTRIP). The ability to access the GNSS data that continuously monitors volcanic deformation, tectonics, and their interactions on and around Ol Doinyo Lengai is a crucial component in our investigation of continental rift initiation in the Natron Rift, Tanzania. Our new user-friendly methods developed to access and post-process real-time GNSS positioning data can also be used by others in the geodesy community that need 3-5 cm precision positions (longitude, latitude, height).
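
    A minimal sketch of the kind of NMEA handling the export path assumes (the GGA sentence below is hypothetical, not project data): converting a $..GGA record into decimal-degree latitude, longitude, and ellipsoidal height.

    ```python
    def parse_gga(sentence):
        """Convert a $..GGA sentence into decimal degrees and ellipsoidal height [m].
        (Checksum validation is omitted in this sketch.)"""
        fields = sentence.split(",")

        def dm_to_deg(value, hemisphere, deg_digits):
            degrees = float(value[:deg_digits])
            minutes = float(value[deg_digits:])
            sign = -1.0 if hemisphere in ("S", "W") else 1.0
            return sign * (degrees + minutes / 60.0)

        lat = dm_to_deg(fields[2], fields[3], 2)      # ddmm.mmmm
        lon = dm_to_deg(fields[4], fields[5], 3)      # dddmm.mmmm
        antenna_alt = float(fields[9])                # metres above the geoid
        geoid_sep = float(fields[11])                 # geoid separation [m]
        return lat, lon, antenna_alt + geoid_sep      # ellipsoidal height

    # hypothetical fix roughly in the Natron basin (checksum not meaningful)
    example = "$GPGGA,123519.00,0245.6000,S,03554.6000,E,4,09,0.9,612.4,M,-18.0,M,1.0,0000*47"
    print(parse_gga(example))
    ```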

  3. A Track Initiation Method for the Underwater Target Tracking Environment

    NASA Astrophysics Data System (ADS)

    Li, Dong-dong; Lin, Yang; Zhang, Yao

    2018-04-01

A novel efficient track initiation method is proposed for the harsh underwater target tracking environment (heavy clutter and large measurement errors): the track splitting, evaluating, pruning and merging method (TSEPM). Track initiation demands that the method should determine the existence and initial state of a target quickly and correctly. Heavy clutter and large measurement errors certainly pose additional difficulties and challenges, which deteriorate and complicate the track initiation in the harsh underwater target tracking environment. There are three primary shortcomings in how current track initiation methods initialize a target: (a) they cannot eliminate the disturbances of clutter effectively; (b) there may be a high false alarm probability and low detection probability of a track; (c) they cannot estimate the initial state for a new confirmed track correctly. Based on the multiple hypotheses tracking principle and the modified logic-based track initiation method, in order to increase the detection probability of a track, track splitting creates a large number of tracks which include the true track originated from the target. In order to decrease the false alarm probability, based on the evaluation mechanism, track pruning and track merging are proposed to reduce the false tracks. The TSEPM method can deal with the track initiation problems caused by heavy clutter and large measurement errors, determine the target's existence, and estimate its initial state with the least squares method. Moreover, our method is fully automatic and does not require any kind of manual input for initializing or tuning any parameter. Simulation results indicate that our new method significantly improves the performance of track initiation in the harsh underwater target tracking environment.
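
    A minimal sketch of the least-squares initial-state estimate mentioned above, under a constant-velocity assumption and fabricated detections: position and velocity are fitted to the first few noisy measurements of a candidate track.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    dt = 2.0                                          # assumed scan interval [s]
    times = np.arange(5) * dt
    true_p0 = np.array([1000.0, -500.0])              # initial position [m]
    true_v = np.array([-4.0, 2.5])                    # velocity [m/s]
    meas = true_p0 + np.outer(times, true_v) + 25.0 * rng.normal(size=(5, 2))

    # measurement model z_k = p0 + v * t_k, solved for [p0; v] by least squares
    H = np.column_stack([np.ones_like(times), times])
    sol, *_ = np.linalg.lstsq(H, meas, rcond=None)
    p0_hat, v_hat = sol[0], sol[1]
    print("initial position estimate:", p0_hat, "velocity estimate:", v_hat)
    ```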

  4. Computational Methods for Structural Mechanics and Dynamics, part 1

    NASA Technical Reports Server (NTRS)

    Stroud, W. Jefferson (Editor); Housner, Jerrold M. (Editor); Tanner, John A. (Editor); Hayduk, Robert J. (Editor)

    1989-01-01

    The structural analysis methods research has several goals. One goal is to develop analysis methods that are general. This goal of generality leads naturally to finite-element methods, but the research will also include other structural analysis methods. Another goal is that the methods be amenable to error analysis; that is, given a physical problem and a mathematical model of that problem, an analyst would like to know the probable error in predicting a given response quantity. The ultimate objective is to specify the error tolerances and to use automated logic to adjust the mathematical model or solution strategy to obtain that accuracy. A third goal is to develop structural analysis methods that can exploit parallel processing computers. The structural analysis methods research will focus initially on three types of problems: local/global nonlinear stress analysis, nonlinear transient dynamics, and tire modeling.

  5. A Weight-Adaptive Laplacian Embedding for Graph-Based Clustering.

    PubMed

    Cheng, De; Nie, Feiping; Sun, Jiande; Gong, Yihong

    2017-07-01

    Graph-based clustering methods perform clustering on a fixed input data graph. Thus such clustering results are sensitive to the particular graph construction. If this initial construction is of low quality, the resulting clustering may also be of low quality. We address this drawback by allowing the data graph itself to be adaptively adjusted in the clustering procedure. In particular, our proposed weight adaptive Laplacian (WAL) method learns a new data similarity matrix that can adaptively adjust the initial graph according to the similarity weight in the input data graph. We develop three versions of these methods based on the L2-norm, fuzzy entropy regularizer, and another exponential-based weight strategy, that yield three new graph-based clustering objectives. We derive optimization algorithms to solve these objectives. Experimental results on synthetic data sets and real-world benchmark data sets exhibit the effectiveness of these new graph-based clustering methods.
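
    For context, a minimal sketch of the fixed-graph spectral baseline whose sensitivity the abstract sets out to remove (this is not the WAL method itself): a Gaussian affinity graph, its normalized Laplacian, and a sign split of the second eigenvector on two synthetic clusters.

    ```python
    import numpy as np
    from scipy.linalg import eigh

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal([0.0, 0.0], 0.3, (50, 2)),
                   rng.normal([3.0, 3.0], 0.3, (50, 2))])   # two synthetic clusters

    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)     # pairwise squared distances
    W = np.exp(-d2 / (2 * 0.5 ** 2))                        # Gaussian affinity (the fixed graph)
    np.fill_diagonal(W, 0.0)

    deg = W.sum(axis=1)
    L = np.eye(len(X)) - W / np.sqrt(np.outer(deg, deg))    # normalized Laplacian I - D^-1/2 W D^-1/2
    _, vecs = eigh(L)                                       # eigenvalues in ascending order

    labels = (vecs[:, 1] > 0).astype(int)                   # sign of the second eigenvector
    print(labels[:50].sum(), labels[50:].sum())             # one group near 0, the other near 50
    ```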

  6. Dating Studies of Elephant Tusks Using Accelerator Mass Spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sideras-Haddad, E; Brown, T A

A new method for determining the year of birth, the year of death, and hence, the age at death, of post-bomb and recently deceased elephants has been developed. The technique is based on Accelerator Mass Spectrometry radiocarbon analyses of small-sized samples extracted from along the length of a ge-line of an elephant tusk. The measured radiocarbon concentrations in the samples from a tusk can be compared to the 14C atmospheric bomb-pulse curve to derive the growth years of the initial and final samples from the tusk. Initial data from the application of this method to two tusks will be presented. Potentially, the method may play a significant role in wildlife management practices of African national parks. Additionally, the method may contribute to the underpinnings of efforts to define new international trade regulations, which could, in effect, decrease poaching and the killing of very young animals.

  7. An effective method for terrestrial arthropod euthanasia.

    PubMed

    Bennie, Neil A C; Loaring, Christopher D; Bennie, Mikaella M G; Trim, Steven A

    2012-12-15

As scientific understanding of invertebrate life increases, so does the concern for how to end that life in an effective way that minimises (potential) suffering and is also safe for those carrying out the procedure. There is increasing debate on the most appropriate euthanasia methods for invertebrates as their use in experimental research and zoological institutions grows. Their popularity as pet species has also led to an increase in the need for greater veterinary understanding. Using a local injection of potassium chloride (KCl), an approach initially developed for use in American lobsters, this paper describes a safe and effective method for euthanasia in terrestrial invertebrates. Initial work focused on empirically determining the dose for cockroaches, which was then extrapolated to other arthropod species. For this method of euthanasia, we propose the term 'targeted hyperkalosis' to describe death through terminal depolarisation of the thoracic ganglia as a result of high potassium concentration.

  8. An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1991-01-01

    Continuing studies associated with the development of the quasi-analytical (QA) sensitivity method for three dimensional transonic flow about wings are presented. Furthermore, initial results using the quasi-analytical approach were obtained and compared to those computed using the finite difference (FD) approach. The basic goals achieved were: (1) carrying out various debugging operations pertaining to the quasi-analytical method; (2) addition of section design variables to the sensitivity equation in the form of multiple right hand sides; (3) reconfiguring the analysis/sensitivity package in order to facilitate the execution of analysis/FD/QA test cases; and (4) enhancing the display of output data to allow careful examination of the results and to permit various comparisons of sensitivity derivatives obtained using the FD/QA methods to be conducted easily and quickly. In addition to discussing the above goals, the results of executing subcritical and supercritical test cases are presented.
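
    The contrast between the finite-difference and quasi-analytical approaches can be sketched on a toy residual system: once the flow state w satisfies R(w, x) = 0, differentiation gives (∂R/∂w)(dw/dx_i) = -∂R/∂x_i, i.e. a single Jacobian factorization applied to one right-hand side per design variable. The example below uses an invented two-equation residual, not the transonic wing solver.

    ```python
    # A toy contrast (invented two-equation residual, not the transonic wing
    # code) between finite-difference (FD) and quasi-analytical (QA)
    # sensitivities.  For a converged state w with R(w, x) = 0, differentiating
    # gives (dR/dw)(dw/dx_i) = -(dR/dx_i): one factorization, several
    # right-hand sides, one per design variable x_i.
    import numpy as np

    def R(w, x):                                   # toy residual: 2 states, 2 design variables
        return np.array([w[0]**2 + x[0]*w[1] - 1.0,
                         w[1] + x[1]*w[0] - 2.0])

    def dR_dw(w, x):                               # Jacobian with respect to the state
        return np.array([[2.0*w[0], x[0]],
                         [x[1],     1.0]])

    def dR_dx(w, x):                               # Jacobian with respect to the design
        return np.array([[w[1], 0.0],
                         [0.0,  w[0]]])

    def solve_state(x):                            # plain Newton iteration for R(w, x) = 0
        w = np.zeros(2)
        for _ in range(50):
            w = w - np.linalg.solve(dR_dw(w, x), R(w, x))
        return w

    x = np.array([0.5, 0.3])
    w = solve_state(x)

    # Quasi-analytical: a single linear solve with multiple right-hand sides.
    dw_dx_qa = np.linalg.solve(dR_dw(w, x), -dR_dx(w, x))

    # Finite difference: re-converge the state once per perturbed design variable.
    h = 1e-6
    dw_dx_fd = np.column_stack([(solve_state(x + h*e) - w) / h for e in np.eye(2)])
    print(dw_dx_qa)
    print(dw_dx_fd)
    ```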

  9. Case history of Yakin Field: its development and sand control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sawolo, N.; Krueger, R.F.; Maly, G.P.

    1982-01-01

    This study deals with the development of the Yakin Field in E. Kalimantan, Indonesia, with emphasis on the sand control methods used. Implementation of an effective sand control program insured the successful development of this field. Gravel packed wells had substantially lower production decline rates than the initial completions without gravel packs. Control of sand production also has been demonstrated by the lack of sand problems during the 4-1/2 yr since the sand control program was initiated. During this time there have been no failures of submersible pumps that were associated with sand production. The successful sand control program was achieved by a well coordinated and cooperative effort of drilling, reservoir engineering, production research, and service company personnel.

  10. The State of the NIH BRAIN Initiative.

    PubMed

    Koroshetz, Walter; Gordon, Joshua; Adams, Amy; Beckel-Mitchener, Andrea; Churchill, James; Farber, Gregory; Freund, Michelle; Gnadt, Jim; Hsu, Nina; Langhals, Nicholas; Lisanby, Sarah; Liu, Guoying; Peng, Grace; Ramos, Khara; Steinmetz, Michael; Talley, Edmund; White, Samantha

    2018-06-19

    The BRAIN Initiative® arose from a grand challenge to "accelerate the development and application of new technologies that will enable researchers to produce dynamic pictures of the brain that show how individual brain cells and complex neural circuits interact at the speed of thought." The BRAIN Initiative is a public-private effort focused on the development and use of powerful tools for acquiring fundamental insights about how information processing occurs in the central nervous system. As the Initiative enters its fifth year, NIH has supported over 500 principal investigators, who have answered the Initiative's challenge via hundreds of publications describing novel tools, methods, and discoveries that address the Initiative's seven scientific priorities. We describe scientific advances produced by individual labs, multi-investigator teams, and entire consortia that, over the coming decades, will produce more comprehensive and dynamic maps of the brain, deepen our understanding of how circuit activity can produce a rich tapestry of behaviors, and lay the foundation for understanding how its circuitry is disrupted in brain disorders. Much more work remains to bring this vision to fruition, and NIH continues to look to the diverse scientific community, from mathematics, to physics, chemistry, engineering, neuroethics, and neuroscience, to ensure that the greatest scientific benefit arises from this unique research Initiative. Copyright © 2018 the authors.

  11. Development of test methods for textile composites

    NASA Technical Reports Server (NTRS)

    Masters, John E.; Ifju, Peter G.; Fedro, Mark J.

    1993-01-01

    NASA's Advanced Composite Technology (ACT) Program was initiated in 1990 with the purpose of developing less costly composite aircraft structures. A number of innovative materials and processes were evaluated as a part of this effort. Chief among them are composite materials reinforced with textile preforms. These new forms of composite materials bring with them potential testing problems. Methods currently in practice were developed over the years for composite materials made from prepreg tape or simple 2-D woven fabrics. A wide variety of 2-D and 3-D braided, woven, stitched, and knit preforms were suggested for application in the ACT program. The applicability of existing test methods to the wide range of emerging materials bears investigation. The overriding concern is that the values measured are accurate representations of the true material response. The ultimate objective of this work is to establish a set of test methods to evaluate the textile composites developed for the ACT Program.

  12. Assessing College Student-Athletes' Life Stress: Initial Measurement Development and Validation

    ERIC Educational Resources Information Center

    Lu, Frank Jing-Horng; Hsu, Ya-Wen; Chan, Yuan-Shuo; Cheen, Jang-Rong; Kao, Kuei-Tsu

    2012-01-01

    College student-athletes have unique life stress that warrants close attention. The purpose of this study was to develop a reliable and valid measurement assessing college student-athletes' life stress. In Study 1, a focus group discussion and Delphi method produced a questionnaire draft, termed the College Student-Athletes' Life Stress Scale. In…

  13. Rediscovering Ruth Faison Shaw and Her Finger-Painting Method

    ERIC Educational Resources Information Center

    Mayer, Veronica

    2005-01-01

    Ruth Faison Shaw was an art educator who developed a nontraditional educational perspective of teaching and a different vision about children's art. As such, she is considered by some to be the initiator of finger-painting in America (The History of Art Education Timeline 1930-1939, 2002.) Shaw developed the technique of finger-painting and a…

  14. Developing Face-to-Face Argumentation Skills: Does Arguing on the Computer Help?

    ERIC Educational Resources Information Center

    Iordanou, Kalypso

    2013-01-01

    Arguing on the computer was used as a method to promote development of face-to-face argumentation skills in middle schoolers. In the study presented, sixth graders engaged in electronic dialogues with peers on a controversial topic and in some reflective activities based on transcriptions of the dialogues. Although participants initially exhibited…

  15. Development and Evaluation of a Questionnaire to Assess Physical Educators' Knowledge of Student Assessment

    ERIC Educational Resources Information Center

    Emmanouilidou, Kyriaki; Derri, Vassiliki; Aggelousis, Nicolaos; Vassiliadou, Olga

    2012-01-01

    The purpose of this pilot study was to develop and evaluate an instrument for measuring Greek elementary physical educators' knowledge of student assessment. A multiple-choice questionnaire comprised of items about concepts, methods, tools, and types of student assessment in physical education was designed and tested. The initial 35-item…

  16. Establishing the severity of personality disorder.

    PubMed

    Tyrer, P; Johnson, T

    1996-12-01

    The authors developed a simplified method of rating the severity of personality disorder. The new rating method is based on four levels of severity: no personality disorder, personality difficulty, simple personality disorder, and diffuse personality disorder. The new method was applied to different diagnostic systems and was then compared with an old rating system based on six severity levels. Data were derived from a longitudinal study in which 163 patients with anxiety and depressive disorders had initial assessments of personality status and were followed up over 2 years. Ratings of psychiatric symptoms were made by using the Comprehensive Psychopathological Rating Scale over this period. The results were analyzed with special attention to linear and quadratic trends. The new system was clinically useful in separating patients' initial assessments and outcomes. Patients with no personality disorder had the lowest initial symptom scores and the best outcomes, and those with diffuse personality disorder had the highest initial levels of symptoms and improved least over the 2 years. When the patients were separated by the old classification system, 72% of the variation between groups was accounted for by linear and quadratic trends; the comparable percentage was 97% when the patients were categorized by the new system. The new system of rating severity of personality disturbance is an improvement on existing methods and allows ratings to be made easily from DSM-IV and ICD-10.

  17. [A new method of processing quantitative PCR data].

    PubMed

    Ke, Bing-Shen; Li, Guang-Yun; Chen, Shi-Min; Huang, Xiang-Yan; Chen, Ying-Jian; Xu, Jun

    2003-05-01

    Standard PCR can no longer satisfy the needs of biotechnology development and clinical research. Through extensive kinetic studies, the PE company found a linear relation between the initial template number and the cycle at which the accumulating fluorescent product first becomes detectable, and on that basis developed a quantitative PCR technique for the PE7700 and PE5700 instruments. The error of that technique, however, is still too large for many applications in biotechnology and clinical research, so a better quantitative PCR method is needed. The mathematical model presented here draws on related work and is based on the PCR principle and a careful analysis of the molecular relationships among the main components of the PCR reaction system. The model describes the functional relation between product quantity (or fluorescence intensity) and the initial template number and other reaction conditions, and accurately reflects the accumulation of PCR product. Using this relation, accurate quantitative PCR analysis can be performed, and the accumulated product quantity can be obtained from the initial template number. With this model, the quantification error depends only on the accuracy of the fluorescence measurement, i.e., on the instrument used. For example, when the fluorescence intensity is accurate to six digits and the template size is between 100 and 1,000,000 copies, the quantification accuracy exceeds 99%. Under the same conditions and on the same instrument, different analysis methods yield distinctly different errors; processing the data with the proposed quantitative analysis system gives results about 80 times more accurate than the CT method.
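
    For orientation, the sketch below shows the standard exponential amplification relation that any such model refines: N_n = N_0(1+E)^n for efficiency E, so the threshold cycle is linear in log N_0. The efficiency, threshold, and copy numbers are illustrative, and this is not the authors' more detailed fluorescence model.

    ```python
    # A minimal sketch of the basic exponential qPCR relation that the more
    # detailed model in the abstract refines: product after n cycles is
    # N_n = N0 * (1 + E)^n, so the threshold cycle CT is linear in log(N0).
    # All numbers below are illustrative only.
    import math

    def cycles_to_threshold(n0, e=0.95, n_threshold=1e10):
        """Cycle at which the product first exceeds the detection threshold."""
        return (math.log(n_threshold) - math.log(n0)) / math.log(1.0 + e)

    def template_from_ct(ct, e=0.95, n_threshold=1e10):
        """Invert the relation: estimate initial template copies from CT."""
        return n_threshold / (1.0 + e) ** ct

    for n0 in (1e2, 1e4, 1e6):
        ct = cycles_to_threshold(n0)
        print(f"N0={n0:.0e}  CT={ct:5.2f}  back-estimate={template_from_ct(ct):.3g}")
    ```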

  18. NASA Perspective on Requirements for Development of Advanced Methods Predicting Unsteady Aerodynamics and Aeroelasticity

    NASA Technical Reports Server (NTRS)

    Schuster, David M.

    2008-01-01

    Over the past three years, the National Aeronautics and Space Administration (NASA) has initiated design, development, and testing of a new human-rated space exploration system under the Constellation Program. Initial designs within the Constellation Program are scheduled to replace the present Space Shuttle, which is slated for retirement within the next three years. The development of vehicles for the Constellation system has encountered several unsteady aerodynamics challenges that have bearing on more traditional unsteady aerodynamic and aeroelastic analysis. This paper focuses on the synergy between the present NASA challenges and the ongoing challenges that have historically been the subject of research and method development. There are specific similarities in the flows required to be analyzed for the space exploration problems and those required for some of the more nonlinear unsteady aerodynamic and aeroelastic problems encountered on aircraft. The aggressive schedule, significant technical challenge, and high-priority status of the exploration system development is forcing engineers to implement existing tools and techniques in a design and application environment that is significantly stretching the capability of their methods. While these methods afford the users with the ability to rapidly turn around designs and analyses, their aggressive implementation comes at a price. The relative immaturity of the techniques for specific flow problems and the inexperience with their broad application to them, particularly on manned spacecraft flight system, has resulted in the implementation of an extensive wind tunnel and flight test program to reduce uncertainty and improve the experience base in the application of these methods. This provides a unique opportunity for unsteady aerodynamics and aeroelastic method developers to test and evaluate new analysis techniques on problems with high potential for acquisition of test and even flight data against which they can be evaluated. However, researchers may be required to alter the geometries typically used in their analyses, the types of flows analyzed, and even the techniques by which computational tools are verified and validated. This paper discusses these issues and provides some perspective on the potential for new and innovative approaches to the development of methods to attack problems in nonlinear unsteady aerodynamics.

  19. Environmental Methods Review: Retooling Impact Assessment for the New Century

    DTIC Science & Technology

    1998-03-01

    strategic environmental assessment (SEA), to support sustainable development. The International Study on the Effectiveness of Environmental...Directed by Barry Sadler, initiated and supported prominently by the Canadian Environmental Assessment Agency, and facilitated by IAIA, this study reviews...toward regional EIA, and vertically toward strategic and policy EIA; Larry Canter arrays twenty-two types of methods against seven typical study

  20. Developing strategic planning of green supply chain in refinery CPO company

    NASA Astrophysics Data System (ADS)

    Hidayati, J.; Mumtaz, G.; Hasibuan, S.

    2018-02-01

    This research was conducted at a company that processes CPO into cooking oil, margarine, and raw materials for oleochemical industries. Palm-oil-based industries today face global challenges related to environmental issues, and meeting these challenges requires an environmentally friendly supply chain. Because the company's resources are limited, its environmental strategy must be integrated with its business strategy. The model is developed based on management orientation toward external pressure, internal key resources, and the attainable competitive advantage, taken as the decision factors. The decision-making method used is the Analytical Network Process (ANP). The results show that institutional pressure is the criterion with the greatest influence on green supply chain initiatives, and that the sub-criteria of customer desires and stakeholder integration have the most significant influence. Five alternative green initiatives are identified: green product design, greening upstream, greening production, greening downstream, and greening post-use. Among these, greening upstream is the top priority.
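
    As a simplified illustration of the priority computation at the heart of ANP (the full method additionally builds and limits a supermatrix of interdependencies, which is omitted here), the sketch below derives criterion weights from the principal eigenvector of a pairwise-comparison matrix. The comparison values and criterion labels are invented for illustration.

    ```python
    # Simplified sketch of the core priority computation shared by AHP/ANP:
    # the principal eigenvector of a pairwise-comparison matrix gives criterion
    # weights.  A full ANP also forms and limits a supermatrix, not shown here.
    # The comparison values below are illustrative only.
    import numpy as np

    criteria = ["institutional pressure", "internal key resources", "competitive advantage"]
    # A[i, j] = how much more important criterion i is than j (Saaty's 1-9 scale)
    A = np.array([[1.0,   3.0,   5.0],
                  [1/3.0, 1.0,   2.0],
                  [1/5.0, 1/2.0, 1.0]])

    vals, vecs = np.linalg.eig(A)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    w = w / w.sum()                                   # normalized priority weights
    for name, weight in zip(criteria, w):
        print(f"{name:25s} {weight:.3f}")
    ```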

  1. Learning-based deformable image registration for infant MR images in the first year of life.

    PubMed

    Hu, Shunbo; Wei, Lifang; Gao, Yaozong; Guo, Yanrong; Wu, Guorong; Shen, Dinggang

    2017-01-01

    Many brain development studies have been devoted to investigate dynamic structural and functional changes in the first year of life. To quantitatively measure brain development in such a dynamic period, accurate image registration between infant subjects with possibly large age gaps is in high demand. Although many state-of-the-art image registration methods have been proposed for young and elderly brain images, very few registration methods work for infant brain images acquired in the first year of life, because of (a) large anatomical changes due to fast brain development and (b) dynamic appearance changes due to white-matter myelination. To address these two difficulties, we propose a learning-based registration method to not only align the anatomical structures but also alleviate the appearance differences between two arbitrary infant MR images (with large age gap) by leveraging the regression forest to predict both the initial displacement vector and appearance changes. Specifically, in the training stage, two regression models are trained separately, with (a) one model learning the relationship between local image appearance (of one development phase) and its displacement toward the template (of another development phase) and (b) another model learning the local appearance changes between the two brain development phases. Then, in the testing stage, to register a new infant image to the template, we first predict both its voxel-wise displacement and appearance changes by the two learned regression models. Since such initializations alleviate the significant appearance and shape differences between the new infant image and the template, a conventional registration method can then be used to refine the remaining registration. We apply our proposed registration method to align 24 infant subjects at five different time points (i.e., 2-week-old, 3-month-old, 6-month-old, 9-month-old, and 12-month-old), and achieve more accurate and robust registration results, compared to the state-of-the-art registration methods. The proposed learning-based registration method addresses the challenging task of registering infant brain images and achieves higher registration accuracy compared with other counterpart registration methods. © 2016 American Association of Physicists in Medicine.
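
    The two-regressor idea can be sketched with off-the-shelf random forests on synthetic patch features, as below; this is a toy stand-in for the paper's pipeline, with invented data shapes and no actual image I/O.

    ```python
    # Toy sketch (synthetic data, not the paper's pipeline) of the two-regressor
    # idea: one random forest maps local patch appearance to an initial
    # displacement toward the template, a second maps appearance at one
    # development phase to appearance at another.  Requires scikit-learn.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    n_voxels, patch_dim = 2000, 27            # e.g. flattened 3x3x3 patches

    patches_young = rng.normal(size=(n_voxels, patch_dim))   # stand-in appearance features
    true_disp = patches_young[:, :3] * 0.5                   # synthetic displacement field (n x 3)
    patches_old = patches_young * 1.2 + 0.1                  # synthetic appearance change

    disp_model = RandomForestRegressor(n_estimators=50, random_state=0)
    disp_model.fit(patches_young, true_disp)                 # appearance -> displacement

    app_model = RandomForestRegressor(n_estimators=50, random_state=0)
    app_model.fit(patches_young, patches_old)                # phase A appearance -> phase B

    # At test time: predict an initial displacement field and a template-like
    # appearance for a new image, then hand both to a conventional registration
    # method for refinement.
    new_patches = rng.normal(size=(10, patch_dim))
    init_disp = disp_model.predict(new_patches)              # (10, 3)
    predicted_appearance = app_model.predict(new_patches)    # (10, 27)
    ```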

  2. Ageing airplane repair assessment program for Airbus A300

    NASA Technical Reports Server (NTRS)

    Gaillardon, J. M.; Schmidt, HANS-J.; Brandecker, B.

    1992-01-01

    This paper describes the current status of the repair categorization activities and includes all details about the methodologies developed for determination of the inspection program for the skin on pressurized fuselages. For inspection threshold determination two methods are defined based on fatigue life approach, a simplified and detailed method. The detailed method considers 15 different parameters to assess the influences of material, geometry, size location, aircraft usage, and workmanship on the fatigue life of the repair and the original structure. For definition of the inspection intervals a general method is developed which applies to all concerned repairs. For this the initial flaw concept is used by considering 6 parameters and the detectable flaw sizes depending on proposed nondestructive inspection methods. An alternative method is provided for small repairs allowing visual inspection with shorter intervals.

  3. The motion of throw away detectors relative to the space shuttle

    NASA Technical Reports Server (NTRS)

    Mullins, L. D.

    1975-01-01

    The motions of throw away detectors (TAD's) are analyzed using the linearized relative motion equations. The TAD's are to be used in the AMPS program as diagnostic instruments for making various measurements near the shuttle. The TAD's are ejected from the shuttle in arbitrary directions with small relative velocities (0.1 to 1.0 m/s), and their subsequent trajectories relative to the shuttle are analyzed. Initial conditions that are likely to result in recontact between the TAD and the shuttle are identified. The sensitivity of the motion to variations in the initial conditions, possibly resulting from inaccuracy in the ejection mechanism, is analyzed, as are the effects of atmospheric drag. A targeting method, a method of giving the TAD correct initial conditions such that it will pass through a given point relative to the shuttle at a given time, is developed. The results of many specific cases are presented in graphical form.
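
    A common concrete form of such linearized relative-motion equations is the Clohessy-Wiltshire (Hill) model for a small body near a vehicle in circular orbit; assuming that form (the report's exact formulation may differ), the targeting step reduces to inverting the velocity-to-position block of the state transition matrix, as sketched below with illustrative numbers.

    ```python
    # Sketch of targeting with the Clohessy-Wiltshire (Hill) equations, assumed
    # here as the linearized relative-motion model.  Given a desired relative
    # position at time t, invert the velocity-to-position block of the state
    # transition matrix to get the required ejection velocity.
    import numpy as np

    def cw_matrices(n, t):
        """Blocks of r(t) = Phi_rr r0 + Phi_rv v0 (x radial, y along-track, z cross-track)."""
        s, c = np.sin(n*t), np.cos(n*t)
        phi_rr = np.array([[4 - 3*c,      0.0, 0.0],
                           [6*(s - n*t),  1.0, 0.0],
                           [0.0,          0.0, c  ]])
        phi_rv = np.array([[s/n,           2*(1 - c)/n,      0.0],
                           [-2*(1 - c)/n,  (4*s - 3*n*t)/n,  0.0],
                           [0.0,           0.0,              s/n]])
        return phi_rr, phi_rv

    def target_velocity(r0, r_target, n, t):
        """Ejection velocity so the TAD passes through r_target at time t."""
        phi_rr, phi_rv = cw_matrices(n, t)
        return np.linalg.solve(phi_rv, r_target - phi_rr @ r0)

    n = 2*np.pi / 5400.0                      # mean motion for a ~90-minute orbit [rad/s]
    r0 = np.zeros(3)                          # ejected from the shuttle
    r_target = np.array([100.0, 0.0, 0.0])    # pass 100 m radially away after 10 minutes
    v0 = target_velocity(r0, r_target, n, 600.0)
    print(v0)                                 # required ejection velocity [m/s]
    ```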

  4. Hydrocarbon polymeric binder for advanced solid propellant

    NASA Technical Reports Server (NTRS)

    Potts, J. E. (Editor)

    1972-01-01

    A series of DEAB initiated isoprene polymerizations were run in the 5-gallon stirred autoclave reactor. Polymerization run parameters such as initiator concentration and feed rate were correlated with the molecular weight to provide a basis for molecular weight control in future runs. Synthetic methods were developed for the preparation of n-1,3-alkadienes. By these methods, 1,3-nonadiene was polymerized using DEAB initiator to give an ester-telechelic polynonadiene. This was subsequently hydrogenated with copper chromite catalyst to give a hydroxyl terminated saturated liquid hydrocarbon prepolymer having greatly improved viscosity characteristics and a Tg 18 degrees lower than that of the hydrogenated polyisoprenes. The hydroxyl-telechelic saturated polymers prepared by the hydrogenolysis of ester-telechelic polyisoprene were reacted with diisocyanates under conditions favoring linear chain extension; gel permeation chromatography was used to monitor this condensation polymerization. Fractions having molecular weights above one million were produced.

  5. Conducting a longitudinal survey of overnight travel : methods and preliminary findings.

    DOT National Transportation Integrated Search

    2015-06-01

    This report summarizes the implementation and initial results of the Longitudinal : Study of Overnight Travel (LSOT), conducted monthly between February 2013 and : February 2014 using an online survey instrument developed by researchers at the : Univ...

  6. Two autowire versions for CDC-3200 and IBM-360

    NASA Technical Reports Server (NTRS)

    Billingsley, J. B.

    1972-01-01

    Microelectronics program was initiated to evaluate circuitry, packaging methods, and fabrication approaches necessary to produce completely procured logic system. Two autowire programs were developed for CDC-3200 and IBM-360 computers for use in designing logic systems.

  7. Automated Indexing of the Hazardous Substances Data Bank (HSDB)

    PubMed Central

    Nuss, Carlo; Chang, Hua Florence; Moore, Dorothy; Fonger, George C.

    2003-01-01

    The Hazardous Substances Data Bank (HSDB), produced and maintained by the National Library of Medicine (NLM), contains over 4600 records on potentially hazardous chemicals. To enhance information retrieval from HSDB, NLM has undertaken the development of an automated HSDB indexing protocol as part of its Indexing Initiative. The NLM Indexing Initiative investigates methods whereby automated indexing may partially or completely substitute for human indexing. The poster’s purpose is to describe the HSDB Automated Indexing Project. PMID:14728459

  8. Relationship between Air Traffic Selection and Training (AT-SAT)) Battery Test Scores and Composite Scores in the Initial en Route Air Traffic Control Qualification Training Course at the Federal Aviation Administration (FAA) Academy

    ERIC Educational Resources Information Center

    Kelley, Ronald Scott

    2012-01-01

    Scope and Method of Study: This study focused on the development and use of the AT-SAT test battery and the Initial En Route Qualification training course for the selection, training, and evaluation of air traffic controller candidates. The Pearson product moment correlation coefficient was used to measure the linear relationship between the…

  9. Yes, but Can They Earn a Living? Methods for Creating an Effective System of Measuring Labor Market Outcomes in Higher Education. Research & Occasional Paper Series: CSHE.5.13

    ERIC Educational Resources Information Center

    Moore, Richard W.; Chapman, Kenneth; Huber, Bettina; Shors, Mark

    2013-01-01

    A new federal initiative calls for a College Scorecard which will include a yet to be determined measure of graduate earnings. In this paper we examine the political context that drives this initiative and examine the nascent efforts of four states to develop statewide systems to measure the labor market outcomes of higher education. We propose…

  10. Developing a laser shockwave model for characterizing diffusion bonded interfaces

    NASA Astrophysics Data System (ADS)

    Lacy, Jeffrey M.; Smith, James A.; Rabin, Barry H.

    2015-03-01

    The US National Nuclear Security Administration has a Global Threat Reduction Initiative (GTRI) with the goal of reducing the worldwide use of high-enriched uranium (HEU). A salient component of that initiative is the conversion of research reactors from HEU to low enriched uranium (LEU) fuels. An innovative fuel is being developed to replace HEU in high-power research reactors. The new LEU fuel is a monolithic fuel made from a U-Mo alloy foil encapsulated in Al-6061 cladding. In order to support the fuel qualification process, the Laser Shockwave Technique (LST) is being developed to characterize the clad-clad and fuel-clad interface strengths in fresh and irradiated fuel plates. LST is a non-contact method that uses lasers for the generation and detection of large amplitude acoustic waves to characterize interfaces in nuclear fuel plates. However, because the deposition of laser energy into the containment layer on a specimen's surface is intractably complex, the shock wave energy is inferred from the surface velocity measured on the backside of the fuel plate and the depth of the impression left on the surface by the high pressure plasma pulse created by the shock laser. To help quantify the stresses generated at the interfaces, a finite element method (FEM) model is being utilized. This paper will report on initial efforts to develop and validate the model by comparing numerical and experimental results for back surface velocities and front surface depressions in a single aluminum plate representative of the fuel cladding.

  11. An adjoint method of sensitivity analysis for residual vibrations of structures subject to impacts

    NASA Astrophysics Data System (ADS)

    Yan, Kun; Cheng, Gengdong

    2018-03-01

    For structures subject to impact loads, reducing residual vibration becomes increasingly important as machines become faster and lighter. An efficient sensitivity analysis of residual vibration with respect to structural or operational parameters is indispensable for using a gradient-based optimization algorithm, which reduces the residual vibration in either an active or a passive way. In this paper, an integrated quadratic performance index is used as the measure of the residual vibration, since it globally measures the residual vibration response and its calculation can be simplified greatly with the Lyapunov equation. Several sensitivity analysis approaches for this performance index were developed based on the assumption that the initial excitations of the residual vibration were given and independent of the structural design. Since the excitations resulting from the impact load often depend on the structural design, this paper proposes a new efficient sensitivity analysis method for residual vibration of structures subject to impacts that accounts for this dependence. The new method is developed by combining two existing methods and using the adjoint variable approach. Three numerical examples are carried out and demonstrate the accuracy of the proposed method. The numerical results show that the dependence of the initial excitations on the structural design variables may strongly affect the accuracy of the sensitivities.
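
    The Lyapunov-based evaluation of the index can be sketched for a single-degree-of-freedom oscillator: for x' = Ax excited to an initial state x0 by an impact, J = ∫ xᵀQx dt = x0ᵀPx0 with AᵀP + PA + Q = 0. The example below checks a design sensitivity by simple finite differences only; the paper's adjoint formulation computes it more efficiently and properly accounts for x0 depending on the design.

    ```python
    # Sketch (not the paper's adjoint formulation) of the integrated quadratic
    # residual-vibration measure for a linear system x' = A x with impact-
    # induced initial state x0: J = x0' P x0, where A'P + P A + Q = 0.
    # The sensitivity below is a plain central finite difference.
    import numpy as np
    from scipy.linalg import solve_continuous_lyapunov

    def residual_vibration_index(k, m=1.0, c=0.1):
        """Single-DOF oscillator; design variable k (stiffness); unit impulse excitation."""
        A = np.array([[0.0,  1.0],
                      [-k/m, -c/m]])
        Q = np.eye(2)
        P = solve_continuous_lyapunov(A.T, -Q)   # solves A'P + P A = -Q
        x0 = np.array([0.0, 1.0/m])              # unit impulse -> initial velocity 1/m
        return x0 @ P @ x0

    k = 4.0
    J = residual_vibration_index(k)
    h = 1e-6
    dJ_dk = (residual_vibration_index(k + h) - residual_vibration_index(k - h)) / (2*h)
    print(J, dJ_dk)
    ```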

  12. Entrapment of subtilisin in ceramic sol-gel coating for antifouling applications.

    PubMed

    Regina, Viduthalai Rasheedkhan; Søhoel, Helmer; Lokanathan, Arcot Raghupathi; Bischoff, Claus; Kingshott, Peter; Revsbech, Niels Peter; Meyer, Rikke Louise

    2012-11-01

    Enzymes with antifouling properties are of great interest in developing nontoxic antifouling coatings. A bottleneck in developing enzyme-based antifouling coatings is to immobilize the enzyme in a suitable coating matrix without compromising its activity and stability. Entrapment of enzymes in ceramics using the sol-gel method is known to have several advantages over other immobilization methods. The sol-gel method can be used to make robust coatings, and the aim of this study was to explore if sol-gel technology can be used to develop robust coatings harboring active enzymes for antifouling applications. We successfully entrapped a protease, subtilisin (Savinase, Novozymes), in a ceramic coating using a sol-gel method. The sol-gel formulation, when coated on a stainless steel surface, adhered strongly and cured at room temperature in less than 8 h. The resultant coating was smoother and less hydrophobic than stainless steel. Changes in the coating's surface structure, thickness and chemistry indicate that the coating undergoes gradual erosion in aqueous medium, which results in release of subtilisin. Subtilisin activity in the coating increased initially, and then gradually decreased. After 9 months, 13% of the initial enzyme activity remained. Compared to stainless steel, the sol-gel-coated surfaces with active subtilisin were able to reduce bacterial attachment of both Gram positive and Gram negative bacteria by 2 orders of magnitude. Together, our results demonstrate that the sol-gel method is a promising coating technology for entrapping active enzymes, presenting an interesting avenue for enzyme-based antifouling solutions.

  13. Babbling, vegetative function, and language development after cricotracheal resection in aphonic children.

    PubMed

    Bohm, Lauren A; Nelson, Marc E; Driver, Lynn E; Green, Glenn E

    2010-12-01

    To determine the importance of prelinguistic babbling by studying patterns of speech and language development after cricotracheal resection in aphonic children. Retrospective review of seven previously aphonic children who underwent cricotracheal resection by our pediatric thoracic airway team. The analyzed variables include age, sex, comorbidity, grade of stenosis, length of resected trachea, and communication methods. Data regarding the children's pre- and postsurgical communication methods, along with their utilization of speech therapy services, were obtained via speech-language pathology evaluations, clinical observations, and a standardized telephone survey supplemented by parental documentation. Postsurgical voice quality was assessed using the Pediatric Voice Outcomes Survey. All seven subjects underwent tracheostomy prior to 2 months of age when corrected for prematurity. The subjects remained aphonic for the entire duration of cannulation. Following cricotracheal resection, they experienced an initial delay in speech acquisition. Vegetative functions were the first laryngeal sounds to emerge. Initially, the children were only able to produce these sounds reflexively, but they subsequently gained voluntary control over these laryngeal functions. All subjects underwent an identifiable stage of canonical babbling that often occurred concomitantly with vocalizations. This was followed by the emergence of true speech. The initial delay in speech acquisition observed following decannulation, along with the presence of a postsurgical canonical stage in all study subjects, supports the hypothesis that babbling is necessary for speech and language development. Furthermore, the presence of babbling is universally evident regardless of the age at which speech develops. Finally, there is no demonstrable correlation between preoperative sign language and rate of speech development. Copyright © 2010 The American Laryngological, Rhinological, and Otological Society, Inc.

  14. Value Iteration Adaptive Dynamic Programming for Optimal Control of Discrete-Time Nonlinear Systems.

    PubMed

    Wei, Qinglai; Liu, Derong; Lin, Hanquan

    2016-03-01

    In this paper, a value iteration adaptive dynamic programming (ADP) algorithm is developed to solve infinite horizon undiscounted optimal control problems for discrete-time nonlinear systems. The present value iteration ADP algorithm permits an arbitrary positive semi-definite function to initialize the algorithm. A novel convergence analysis is developed to guarantee that the iterative value function converges to the optimal performance index function. Initialized by different initial functions, it is proven that the iterative value function will be monotonically nonincreasing, monotonically nondecreasing, or nonmonotonic and will converge to the optimum. In this paper, for the first time, the admissibility properties of the iterative control laws are developed for value iteration algorithms. It is emphasized that new termination criteria are established to guarantee the effectiveness of the iterative control laws. Neural networks are used to approximate the iterative value function and compute the iterative control law, respectively, for facilitating the implementation of the iterative ADP algorithm. Finally, two simulation examples are given to illustrate the performance of the present method.
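
    A tabular sketch of the underlying value iteration on a discretized scalar system is given below; it stands in for the paper's neural-network approximators, uses invented dynamics and utility, and simply illustrates the property that different positive semidefinite initial value functions converge to the same optimum.

    ```python
    # Tabular stand-in (a sketch, not the paper's neural-network implementation)
    # for value iteration on a discretized scalar system x_{k+1} = f(x_k, u_k)
    # with utility U(x,u) = x^2 + u^2.  The iteration
    # V_{i+1}(x) = min_u [U(x,u) + V_i(f(x,u))] may start from any positive
    # semidefinite V_0; different choices converge to the same optimum.
    import numpy as np

    xs = np.linspace(-2.0, 2.0, 81)          # state grid
    us = np.linspace(-1.5, 1.5, 61)          # control grid

    def f(x, u):
        return 0.8*np.sin(x) + u             # toy nonlinear dynamics

    def value_iteration(v0, n_iter=200):
        v = v0.copy()
        for _ in range(n_iter):
            # next-state value for every (x, u) pair, via interpolation on the grid
            xn = np.clip(f(xs[:, None], us[None, :]), xs[0], xs[-1])
            v_next = np.interp(xn, xs, v)
            q = xs[:, None]**2 + us[None, :]**2 + v_next
            v = q.min(axis=1)
        return v

    v_from_zero      = value_iteration(np.zeros_like(xs))   # V_0 = 0 (nondecreasing iterates)
    v_from_quadratic = value_iteration(10.0 * xs**2)        # large PSD V_0 (nonincreasing iterates)
    print(np.max(np.abs(v_from_zero - v_from_quadratic)))   # both approach the same optimum
    ```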

  15. Accelerating the development of formal thinking in middle and high school students II: Postproject effects on science achievement

    NASA Astrophysics Data System (ADS)

    Shayer, Michael; Adey, Philip S.

    A one-year lag was found between the effect of an intervention intended to promote formal operational thinking in students initially 11 or 12 years of age and the appearance of substantial science achievement in the experimental groups. A one-year lag was also reported on cognitive development: Whereas at the end of the two-year intervention the experimental groups were up to 0.9 ahead of the control groups, one year later the differential on Piagetian measures had disappeared, but the experimentals now showed better science achievement of even greater magnitude. Although the control groups showed normal distribution both on science achievement and cognitive development, the experimental groups showed bi- or trimodal distribution. Between one-half and one-quarter of the students involved in the experiment in different groups showed effects of the order of 2 both on cognitive development and science achievement; some students appeared unaffected (compared with the controls), and others demonstrated modest effects on science achievement. An age/gender interaction is reported: the most substantial effects were found in boys initially aged 12+ and girls initially 11+. The only group to show no effects was boys initially aged 11+. It is suggested that the intervention methods may have favored the abstract analytical learning style as described by Cohen 1986.

  16. 42 CFR 414.313 - Initial method of payment.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 3 2011-10-01 2011-10-01 false Initial method of payment. 414.313 Section 414.313... Charges Under the ESRD Program § 414.313 Initial method of payment. (a) Basic rule. Under this method, the... vaccine. (c) Physician election of the initial method. (1) Each physician in a facility must submit to the...

  17. Succession planning and leadership development: critical business strategies for healthcare organizations.

    PubMed

    Collins, Sandra K; Collins, Kevin S

    2007-01-01

    As labor shortages intensify, succession planning and leadership development have become strategic initiatives requiring rigorous consideration. Traditional methods of replacing personnel will not accommodate the vacancies expected to plague healthcare organizations. Managers should focus on identifying potential gaps of key personnel and adapting programs to accommodate organizational need. Attention should be placed on capturing the intellectual capital existent in the organization and developing diverse groups of leadership candidates.

  18. Next Generation Wiring

    NASA Technical Reports Server (NTRS)

    Medelius, Petro; Jolley, Scott; Fitzpatrick, Lilliana; Vinje, Rubiela; Williams, Martha; Clayton, LaNetra; Roberson, Luke; Smith, Trent; Santiago-Maldonado, Edgardo

    2007-01-01

    Wiring is a major operational component on aerospace hardware that accounts for substantial weight and volumetric space. Over time wire insulation can age and fail, often leading to catastrophic events such as system failure or fire. The next generation of wiring must be reliable and sustainable over long periods of time. These features will be achieved by the development of a wire insulation capable of autonomous self-healing that mitigates failure before it reaches a catastrophic level. In order to develop a self-healing insulation material, three steps must occur. First, methods of bonding similar materials must be developed that are capable of being initiated autonomously. This process will lead to the development of a manual repair system for polyimide wire insulation. Second, ways to initiate these bonding methods that lead to materials that are similar to the primary insulation must be developed. Finally, steps one and two must be integrated to produce a material that has no residues from the process that degrades the insulating properties of the final repaired insulation. The self-healing technology, teamed with the ability to identify and locate damage, will greatly improve reliability and safety of electrical wiring of critical systems. This paper will address these topics, discuss the results of preliminary testing, and remaining development issues related to self-healing wire insulation.

  19. Authentic assessment based showcase portfolio on learning of mathematical problem solving in senior high school

    NASA Astrophysics Data System (ADS)

    Sukmawati, Zuhairoh, Faihatuz

    2017-05-01

    The purpose of this research was to develop an authentic assessment model based on a showcase portfolio for learning of mathematical problem solving. This research used the research and development (R&D) method, which consists of four stages: Phase I, conducting a preliminary study; Phase II, determining the purpose of the development and preparing the initial model; and Phase III, trial testing of the instruments for the initial draft model and the initial product. The respondents of this research were the students of SMAN 8 and SMAN 20 Makassar. Data were collected through observation, interviews, documentation, a student questionnaire, and tests of mathematical problem-solving ability, and were analyzed with descriptive and inferential statistics. The results of this research are an authentic assessment model design based on a showcase portfolio, which involves: 1) the steps in implementing the showcase-based authentic assessment, together with assessment rubrics for the cognitive, affective, and skill aspects; and 2) the finding that the students' average problem-solving ability, scored using the showcase-portfolio-based authentic assessment, was in the high category and the students' responses were in the good category.

  20. Process equipped with a sloped UV lamp for the fabrication of gradient-refractive-index lenses.

    PubMed

    Liu, Jui-Hsiang; Chiu, Yi-Hong

    2009-05-01

    In this investigation, a method for the preparation of gradient-refractive-index (GRIN) lenses by UV-energy-controlled polymerization has been developed. A glass reaction tube equipped with a sloped UV lamp was designed. Methyl methacrylate and diphenyl sulfide were used as the reactive monomer and nonreactive dopant, respectively. Ciba IRGACURE 184 (1-hydroxy-cyclohexyl-phenyl-ketone) was used as the initiator. The effects of initiator concentration, the addition of acrylic polymers, and the preparation conditions on the optical characteristics of the GRIN lenses produced by this method were also investigated. Refractive index distributions and image transmission properties were estimated for all GRIN lenses prepared.

  1. Rapid and accurate prediction of degradant formation rates in pharmaceutical formulations using high-performance liquid chromatography-mass spectrometry.

    PubMed

    Darrington, Richard T; Jiao, Jim

    2004-04-01

    Rapid and accurate stability prediction is essential to pharmaceutical formulation development. Commonly used stability prediction methods include monitoring parent drug loss at intended storage conditions or initial rate determination of degradants under accelerated conditions. Monitoring parent drug loss at the intended storage condition does not provide a rapid and accurate stability assessment because often <0.5% drug loss is all that can be observed in a realistic time frame, while the accelerated initial rate method in conjunction with extrapolation of rate constants using the Arrhenius or Eyring equations often introduces large errors in shelf-life prediction. In this study, the shelf life prediction of a model pharmaceutical preparation utilizing sensitive high-performance liquid chromatography-mass spectrometry (LC/MS) to directly quantitate degradant formation rates at the intended storage condition is proposed. This method was compared to traditional shelf life prediction approaches in terms of time required to predict shelf life and associated error in shelf life estimation. Results demonstrated that the proposed LC/MS method using initial rates analysis provided significantly improved confidence intervals for the predicted shelf life and required less overall time and effort to obtain the stability estimation compared to the other methods evaluated. Copyright 2004 Wiley-Liss, Inc. and the American Pharmacists Association.
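
    A worked example of the initial-rates idea, with invented numbers: fit the early, approximately linear growth of a degradant measured at the intended storage temperature and extrapolate to the specification limit, rather than tracking parent-drug loss or extrapolating accelerated data with the Arrhenius equation.

    ```python
    # Illustrative numbers only: estimating shelf life from the initial,
    # near-linear rate of degradant formation measured directly at the storage
    # temperature (the sensitivity of LC/MS makes such small levels measurable).
    import numpy as np

    days      = np.array([0, 14, 28, 56, 84])                    # early time points at 25 C
    degradant = np.array([0.002, 0.005, 0.008, 0.014, 0.020])    # % of label claim (hypothetical)

    rate, intercept = np.polyfit(days, degradant, 1)             # zero-order initial rate, %/day
    spec_limit = 0.2                                             # % specification for the degradant
    shelf_life_days = (spec_limit - intercept) / rate
    print(f"initial rate = {rate*365:.3f} %/year, "
          f"predicted shelf life ~ {shelf_life_days/365:.1f} years")
    ```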

  2. Combining wet etching and real-time damage event imaging to reveal the most dangerous laser damage initiator in fused silica.

    PubMed

    Hu, Guohang; Zhao, Yuanan; Liu, Xiaofeng; Li, Dawei; Xiao, Qiling; Yi, Kui; Shao, Jianda

    2013-08-01

    A reliable method, combining a wet etch process and real-time damage event imaging during a raster scan laser damage test, has been developed to directly determine the most dangerous precursor inducing low-density laser damage at 355 nm in fused silica. It is revealed that ~16% of laser damage sites were initiated at the place of the scratches, ~49% initiated at the digs, and ~35% initiated at invisible defects. The morphologies of dangerous scratches and digs were compared with those of moderate ones. It is found that local sharp variation at the edge, twist, or inside of a subsurface defect is the most dangerous laser damage precursor.

  3. Ammonia Analysis by Gas Chromatograph/Infrared Detector (GC/IRD)

    NASA Technical Reports Server (NTRS)

    Scott, Joseph P.; Whitfield, Steve W.

    2003-01-01

    Methods are being developed at Marshall Space Flight Center's Toxicity Lab on a GC/IRD system that will be used to detect ammonia at low part per million (ppm) levels. These methods will allow analysis of gas samples by syringe injections. The GC is equipped with a unique cryogenic-cooled inlet system that will enable our lab to make large injections of a gas sample. Although the initial focus of the work will be analysis of ammonia, this instrument could identify other compounds on a molecular level. If proper methods can be developed, the IRD could work as a powerful addition to our offgassing capabilities.

  4. What makes a sustainability tool valuable, practical and useful in real-world healthcare practice? A mixed-methods study on the development of the Long Term Success Tool in Northwest London

    PubMed Central

    Lennox, Laura; Doyle, Cathal; Reed, Julie E

    2017-01-01

    Objectives: Although improvement initiatives show benefits to patient care, they often fail to sustain. Models and frameworks exist to address this challenge, but issues with design, clarity and usability have been barriers to use in healthcare settings. This work aimed to collaborate with stakeholders to develop a sustainability tool relevant to people in healthcare settings and practical for use in improvement initiatives. Design: Tool development was conducted in six stages. A scoping literature review, group discussions and a stakeholder engagement event explored literature findings and their resonance with stakeholders in healthcare settings. Interviews, small-scale trialling and piloting explored the design and tested the practicality of the tool in improvement initiatives. Setting: National Institute for Health Research Collaboration for Leadership in Applied Health Research and Care for Northwest London (CLAHRC NWL). Participants: CLAHRC NWL improvement initiative teams and staff. Results: The iterative design process and engagement of stakeholders informed the articulation of the sustainability factors identified from the literature and guided tool design for practical application. Key iterations of factors and tool design are discussed. From the development process, the Long Term Success Tool (LTST) has been designed. The Tool supports those implementing improvements to reflect on 12 sustainability factors to identify risks to increase chances of achieving sustainability over time. The Tool is designed to provide a platform for improvement teams to share their own views on sustainability as well as learn about the different views held within their team to prompt discussion and actions. Conclusion: The development of the LTST has reinforced the importance of working with stakeholders to design strategies which respond to their needs and preferences and can practically be implemented in real-world settings. Further research is required to study the use and effectiveness of the tool in practice and assess engagement with the method over time. PMID:28947436

  5. A Rapid Method of Isolating Neoantigen-specific T Cell Receptor Sequences | NCI Technology Transfer Center | TTC

    Cancer.gov

    Recent research has demonstrated that neoantigen-specific T-cell receptors (TCRs) can be isolated from a cancer patient’s lymphocytes. These TCRs may be used to engineer populations of tumor-reactive T cells for cancer immunotherapies. Obtaining sequences of these functional TCRs is a critical initial step in preparing this type of personalized cancer treatment; however, current methods are time-consuming and labor-intensive. Scientists at the National Cancer Institute (NCI) have developed a rapid and robust method of isolating the sequences of mutation-specific TCRs to alleviate these issues; they seek licensing and/or co-development research collaborations for the development of a method for isolating the sequences of tumor-reactive TCRs. For collaboration opportunities, please contact Steven A. Rosenberg, M.D., Ph.D. at sar@nih.gov.

  6. High‐Volume Processed, ITO‐Free Superstrates and Substrates for Roll‐to‐Roll Development of Organic Electronics

    PubMed Central

    Hösel, Markus; Angmo, Dechan; Søndergaard, Roar R.; dos Reis Benatto, Gisele A.; Carlé, Jon E.; Jørgensen, Mikkel

    2014-01-01

    The fabrication of substrates and superstrates prepared by scalable roll‐to‐roll methods is reviewed. The substrates and superstrates that act as the flexible carrier for the processing of functional organic electronic devices are an essential component, and proposals are made about how the general availability of various forms of these materials is needed to accelerate the development of the field of organic electronics. The initial development of the replacement of indium‐tin‐oxide (ITO) for the flexible carrier materials is described and a description of how roll‐to‐roll processing development led to simplification from an initially complex make‐up to higher performing materials through a more simple process is also presented. This process intensification through process simplification is viewed as a central strategy for upscaling, increasing throughput, performance, and cost reduction. PMID:27980893

  7. A Tale of Two Trails: Exploring Different Paths to Success

    PubMed Central

    Walker, Jennifer G.; Evenson, Kelly R.; Davis, William J.; Bors, Philip; Rodríguez, Daniel A.

    2016-01-01

    Background: This comparative case study investigates 2 successful community trail initiatives, using the Active Living By Design (ALBD) Community Action Model as an analytical framework. The model includes 5 strategies: preparation, promotion, programs, policy, and physical projects. Methods: Key stakeholders at 2 sites participated in in-depth interviews (N = 14). Data were analyzed for content using Atlas Ti and grouped according to the 5 strategies. Results: Preparation: Securing trail resources was challenging, but shared responsibilities facilitated trail development. Promotions: The initiatives demonstrated minimal physical activity encouragement strategies. Programs: Community stakeholders did not coordinate programmatic opportunities for routine physical activity. Policy: Trails' inclusion in regional greenway master plans contributed to trail funding and development. Policies that were formally institutionalized and enforced led to more consistent trail construction and safer conditions for users. Physical projects: Consistent standards for way finding signage and design safety features enhanced trail usability and safety. Conclusions: Communities with different levels of government support contributed unique lessons to inform best practices of trail initiatives. This study revealed a disparity between trail development and use-encouragement strategies, which may limit trails' impact on physical activity. The ALBD Community Action Model provided a viable framework to structure cross-disciplinary community trail initiatives. PMID:21597125

  8. Advancing working and learning through critical action research: creativity and constraints.

    PubMed

    Bellman, Loretta; Bywood, Catherine; Dale, Susan

    2003-12-01

    Continuous professional development is an essential component within many health care 'Learning Organisations'. The paper describes the first phase of an initiative to develop a professional practice development framework for nurses in an NHS general hospital. The project was undertaken within a critical action research methodology. A tripartite arrangement between the hospital, a university and professional nursing organisation enabled clinical, educational and research support for the nurses (co-researchers) engaged in the project. Initial challenges were from some managers, educationalists and the ethics committee who did not appear to understand the action research process. A multi-method approach to data collection was undertaken to capture the change process from different stakeholders' perceptions. Triangulation of the data was undertaken. Despite organisational constraints, transformational leadership and peer support enabled the co-researchers to identify and initiate three patient-focused initiatives. The change process for the co-researchers included: enlightening personal journey, exploring the research-practice gap, enhancing personal and professional knowledge, evolving cultural change and collaborative working, empowering and disempowering messages. A hospital merger and corporate staff changes directly impacted on the project. A more flexible time-scale and longer term funding are required to enable continuity for trust-wide projects undertaken in dynamic clinical settings.

  9. A high-order gas-kinetic Navier-Stokes flow solver

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li Qibing, E-mail: lqb@tsinghua.edu.c; Xu Kun, E-mail: makxu@ust.h; Fu Song, E-mail: fs-dem@tsinghua.edu.c

    2010-09-20

    The foundation for the development of modern compressible flow solver is based on the Riemann solution of the inviscid Euler equations. The high-order schemes are basically related to high-order spatial interpolation or reconstruction. In order to overcome the low-order wave interaction mechanism due to the Riemann solution, the temporal accuracy of the scheme can be improved through the Runge-Kutta method, where the dynamic deficiencies in the first-order Riemann solution is alleviated through the sub-step spatial reconstruction in the Runge-Kutta process. The close coupling between the spatial and temporal evolution in the original nonlinear governing equations seems weakened due to its spatial and temporal decoupling. Many recently developed high-order methods require a Navier-Stokes flux function under piece-wise discontinuous high-order initial reconstruction. However, the piece-wise discontinuous initial data and the hyperbolic-parabolic nature of the Navier-Stokes equations seem inconsistent mathematically, such as the divergence of the viscous and heat conducting terms due to initial discontinuity. In this paper, based on the Boltzmann equation, we are going to present a time-dependent flux function from a high-order discontinuous reconstruction. The theoretical basis for such an approach is due to the fact that the Boltzmann equation has no specific requirement on the smoothness of the initial data and the kinetic equation has the mechanism to construct a dissipative wave structure starting from an initially discontinuous flow condition on a time scale being larger than the particle collision time. The current high-order flux evaluation method is an extension of the second-order gas-kinetic BGK scheme for the Navier-Stokes equations (BGK-NS). The novelty for the easy extension from a second-order to a higher order is due to the simple particle transport and collision mechanism on the microscopic level. This paper will present a hierarchy to construct such a high-order method. The necessity to couple spatial and temporal evolution nonlinearly in the flux evaluation can be clearly observed through the numerical performance of the scheme for the viscous flow computations.

  10. Hybrid Weighted Minimum Norm Method A new method based LORETA to solve EEG inverse problem.

    PubMed

    Song, C; Zhuang, T; Wu, Q

    2005-01-01

    This paper puts forward a new method for solving the EEG inverse problem. It is based on three physiological characteristics of neural electrical activity sources: first, neighboring neurons tend to be active synchronously; second, the distribution of sources in source space is sparse; third, the activity of the sources is highly focal. Taking only this prior knowledge as the prerequisite, and assuming no other characteristics of the inverse solution, the method realizes the commonly used 3D EEG reconstruction map. The proposed algorithm takes advantage of LORETA, a low-resolution method that emphasizes localization, and of FOCUSS, a high-resolution method that emphasizes separability, and remains within the framework of the weighted minimum norm method. The key step is to construct a weight matrix, drawing on the existing smoothness operator, a competition mechanism, and a learning algorithm. The basic procedure is to obtain an initial estimate of the solution, construct a new estimate using the information in the previous one, and repeat this process until the solutions from the last two estimation steps remain unchanged.
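
    The weighted minimum-norm framework that such hybrid methods build on can be sketched as s = WLᵀ(LWLᵀ + λI)⁻¹φ, with the weight matrix re-estimated from the previous solution in the spirit of FOCUSS refining an initially smooth LORETA-like estimate. In the sketch below the lead field, the weights, and the reweighting rule are all illustrative, not the authors' construction.

    ```python
    # Generic sketch of the weighted minimum-norm framework (synthetic lead
    # field; the authors' particular weight matrix is not reproduced here).
    # Each pass re-weights sources by the previous estimate, FOCUSS-style,
    # starting from a uniform-weight (smooth) initial solution.
    import numpy as np

    rng = np.random.default_rng(1)
    n_sensors, n_sources = 32, 500
    L = rng.normal(size=(n_sensors, n_sources))          # stand-in lead-field matrix

    s_true = np.zeros(n_sources)
    s_true[[50, 300]] = [1.0, -0.7]                      # sparse, focal sources
    phi = L @ s_true + 0.01 * rng.normal(size=n_sensors) # sensor measurements

    def weighted_min_norm(L, phi, w, lam=1e-2):
        """s = W L' (L W L' + lam I)^(-1) phi  with W = diag(w)."""
        LW = L * w                                       # L @ diag(w)
        G = LW @ L.T + lam * np.eye(len(phi))
        return w * (L.T @ np.linalg.solve(G, phi))

    s = weighted_min_norm(L, phi, np.ones(n_sources))    # initial smooth estimate
    for _ in range(10):                                  # FOCUSS-style reweighting
        s = weighted_min_norm(L, phi, s**2 + 1e-8)
    print(np.argsort(np.abs(s))[-2:])                    # indices of the strongest sources
    ```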

  11. Investigation of Soman Adducts of Human Hemoglobin by Liquid Chromatography

    DTIC Science & Technology

    2004-04-01

    acid standard, with fifteen primary amino acids, was used to evaluate and refine the chromatographic methods. An LC/MS/MS was used to analyze the non...several chromatographic conditions and stationary phases were used to create an LC/MS/MS method to directly analyze the amino acids, these studies...terminated because of a lack of resolution of the amino acid peaks. Also, initial attempts to develop an HPLC method to separate individual amino acids

  12. Improved hydrostatic pressure sample injection by tilting the microchip towards the disposable miniaturized CE device.

    PubMed

    Wang, Wei; Zhou, Fang; Zhao, Liang; Zhang, Jian-Rong; Zhu, Jun-Jie

    2008-02-01

    A simple method of hydrostatic pressure sample injection for a disposable microchip CE device was developed. By tilting the microchip, the liquid level in the sample reservoir was made higher than that in the sample waste reservoir (SWR); the resulting hydrostatic pressure drove the sample through the injection channel into the SWR. After sample loading, the microchip was levelled for separation under an applied high separation voltage. Effects of tilt angle, initial liquid height and injection duration on electrophoresis were investigated. With sufficient injection duration, the injection result was little affected by the tilt angle and the initial liquid heights in the reservoirs. The injection duration needed to obtain a stable sample plug depended mainly on the tilt angle rather than on the initial liquid height. Experimental results were consistent with theoretical prediction. Fluorescence observation and electrochemical detection of dopamine and catechol were employed to verify the feasibility of tilted-microchip hydrostatic pressure injection. Good reproducibility of this injection method was obtained. Because the instrumentation was simplified and no additional hardware was needed, the proposed method would be potentially useful in disposable devices.
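
    A back-of-envelope estimate of the driving pressure, with assumed chip dimensions: tilting by an angle θ gives a liquid-level difference Δh ≈ L sin θ between reservoirs a distance L apart, hence ΔP = ρgΔh, which drives a Poiseuille-type flow through the injection channel.

    ```python
    # Back-of-envelope sketch (hypothetical dimensions, not the paper's chip)
    # of the driving pressure produced by tilting: dh = L*sin(theta),
    # dP = rho*g*dh, then a Hagen-Poiseuille estimate of the resulting flow.
    import math

    rho, g = 1000.0, 9.81            # water density [kg/m^3], gravity [m/s^2]
    L_chip = 0.02                    # reservoir spacing along the chip [m] (assumed)
    theta = math.radians(10)         # tilt angle

    dh = L_chip * math.sin(theta)
    dP = rho * g * dh                # hydrostatic driving pressure [Pa]

    # Flow through an assumed circular channel of radius r and length l,
    # with the viscosity mu of water (Hagen-Poiseuille):
    mu, r, l = 1e-3, 25e-6, 0.005
    Q = math.pi * r**4 * dP / (8 * mu * l)
    print(f"dP = {dP:.1f} Pa, Q = {Q*1e12*60:.1f} nL/min")
    ```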

  13. Single-Specimen Technique to Establish the J-Resistance of Linear Viscoelastic Solids with Constant Poisson's Ratio

    NASA Technical Reports Server (NTRS)

    Gutierrez-Lemini, Danton; McCool, Alex (Technical Monitor)

    2001-01-01

    A method is developed to establish the J-resistance function for an isotropic linear viscoelastic solid of constant Poisson's ratio using the single-specimen technique with constant-rate test data. The method is based on the fact that, for a test specimen of fixed crack size under constant rate, the initiation J-integral may be established from the crack size itself, the actual external load and load-point displacement at growth initiation, and the relaxation modulus of the viscoelastic solid, without knowledge of the complete test record. Since crack size alone, of the required data, would be unknown at each point of the load-vs-load-point-displacement curve of a single-specimen test, an expression is derived to estimate it. With it, the physical J-integral at each point of the test record may be established. Because of its basis on single-specimen testing, the method not only avoids the use of multiple specimens with differing initial crack sizes, but also avoids the need for tracking crack growth.

  14. Material Testing and Initial Pavement Design Modeling: Minnesota Road Research Project

    DOT National Transportation Integrated Search

    1996-09-01

    Between January 1990 and December 1994, a study verified and applied a Corps of Engineers-developed mechanistic design and evaluation method for pavements in seasonal frost areas as part of a Construction Productivity Advancement Research (CPAR) proj...

  15. The Childhood Obesity Declines Project: Implications for Research and Evaluation Approaches.

    PubMed

    Young-Hyman, Deborah; Morris, Kathryn; Kettel Khan, Laura; Dawkins-Lyn, Nicola; Dooyema, Carrie; Harris, Carole; Jernigan, Jan; Ottley, Phyllis; Kauh, Tina

    2018-03-01

    Childhood obesity remains prevalent and is increasing in some disadvantaged populations. Numerous research, policy and community initiatives have been undertaken to address this pandemic. Natural experiments remain understudied. The need to learn from these efforts is paramount. Resulting evidence may not be readily available to inform future research, community initiatives, and policy development/implementation. We discuss the implications of using an adaptation of the Systematic Screening and Assessment (SSA) method to evaluate the Childhood Obesity Declines (COBD) project. The project examined successful initiatives, programs and policies in four diverse communities which were concurrent with significant declines in child obesity. In the context of other research designs and evaluation schemas, the rationale for use of SSA is presented. Evidence generated by this method is highlighted and guidance suggested for evaluation of future studies of community-based childhood obesity prevention initiatives. Support for the role of stakeholder collaboratives, in particular the National Collaborative on Childhood Obesity Research, as a synergistic vehicle to accelerate research on childhood obesity is discussed. SSA mapped active processes and provided contextual understanding of multi-level/component simultaneous efforts to reduce rates of childhood obesity in community settings. Initiatives, programs and policies were not necessarily coordinated, and although direct attribution to specific intervention, initiative, or policy components could not be made, the what, by whom, how, and to whom were temporally associated with statistically significant reductions in childhood obesity. SSA provides evidence for context and processes which are not often evaluated in other data analytic methods. SSA provides an additional tool to layer with other evaluation approaches.

  16. [Determinants of strategic management of a health center].

    PubMed

    Huard, Pierre; Schaller, Philippe

    2014-01-01

    The article highlights the value of a strategic approach for the development of a primary care health centre. The method is adapted from corporate strategy: (i) analysis of the situation of the health centre and the obstacles to its development; (ii) selection of the relations on which the strategy can be developed; (iii) elaboration of a system of interventions to create a cumulative development process; (iv) illustration of the method by application to a case. The example illustrates the principles and method and highlights the importance of interpretations and choices in the elaboration of a strategy, which is therefore always a unique construction. The strategic approach provides a framework that (i) provides a subject of discussion and negotiation between members of the health centre, (ii) strengthens the consistency of structural decisions, and (iii) helps the health centre to overcome obstacles and initiate a development process.

  17. Estimation of chaotic coupled map lattices using symbolic vector dynamics

    NASA Astrophysics Data System (ADS)

    Wang, Kai; Pei, Wenjiang; Cheung, Yiu-ming; Shen, Yi; He, Zhenya

    2010-01-01

    In [K. Wang, W.J. Pei, Z.Y. He, Y.M. Cheung, Phys. Lett. A 367 (2007) 316], an original symbolic-vector-dynamics-based method was proposed for initial condition estimation in an additive white Gaussian noise environment. The precision of this estimation method is determined by the symbolic errors of the symbolic vector sequence obtained by symbolizing the received signal. This Letter further develops the symbolic vector dynamical estimation method. We correct symbolic errors using the backward vector and the values estimated with different symbols, and thus the estimation precision can be improved. Both theoretical and experimental results show that this algorithm enables us to recover the initial condition of a coupled map lattice exactly in both noisy and noise-free cases. Therefore, we provide novel analytical techniques for understanding turbulence in coupled map lattices.
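
    As a concrete reference for the system being estimated, a diffusively coupled logistic-map lattice is a standard coupled map lattice; the sketch below iterates one from a given initial condition. The map, the coupling form and the parameter values are common textbook choices and are not necessarily those of the Letter.

    import numpy as np

    def f(x, a=1.9):
        # Chaotic single-site map f(x) = 1 - a*x^2 on [-1, 1].
        return 1.0 - a * x**2

    def cml_step(x, eps=0.1, a=1.9):
        # One update of a diffusively coupled map lattice (periodic boundaries):
        # x_{n+1}(i) = (1-eps)*f(x_n(i)) + (eps/2)*[f(x_n(i-1)) + f(x_n(i+1))].
        fx = f(x, a)
        return (1.0 - eps) * fx + 0.5 * eps * (np.roll(fx, 1) + np.roll(fx, -1))

    # Usage: evolve a 64-site lattice from the initial condition one would
    # later try to recover from the (symbolized) observed orbit.
    rng = np.random.default_rng(1)
    x = rng.uniform(-1.0, 1.0, 64)
    orbit = [x.copy()]
    for _ in range(200):
        x = cml_step(x)
        orbit.append(x.copy())
    orbit = np.array(orbit)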

  18. Back-Face Strain for Monitoring Stable Crack Extension in Precracked Flexure Specimens

    NASA Technical Reports Server (NTRS)

    Salem, Jonathan A.; Ghosn, Louis J.

    2010-01-01

    Calibrations relating back-face strain to crack length in precracked flexure specimens were developed for different strain gage sizes. The functions were verified via experimental compliance measurements of notched and precracked ceramic beams. Good agreement between the functions and experiments occurred, and fracture toughness was calculated via several operational methods: maximum test load and optically measured precrack length; load at 2 percent crack extension and optical precrack length; and maximum load and back-face strain crack length. All the methods gave very comparable results. The initiation toughness, K(sub Ii), was also estimated from the initial compliance and load. The results demonstrate that stability of precracked ceramic specimens tested in four-point flexure is a common occurrence, and that methods such as remotely monitored load-point displacement are only adequate for detecting stable extension of relatively deep cracks.

  19. Pediatric hospital medicine: a strategic planning roundtable to chart the future.

    PubMed

    Rauch, Daniel A; Lye, Patricia S; Carlson, Douglas; Daru, Jennifer A; Narang, Steve; Srivastava, Rajendu; Melzer, Sanford; Conway, Patrick H

    2012-04-01

    Given the growing field of Pediatric Hospital Medicine (PHM) and the need to define strategic direction, the Society of Hospital Medicine, the American Academy of Pediatrics, and the Academic Pediatric Association sponsored a roundtable to discuss the future of the field. Twenty-one leaders plus a facilitator were invited, and established health care strategic planning methods were utilized. A "vision statement" was developed. Specific initiatives in 4 domains (clinical practice, quality of care, research, and workforce) were identified that would advance PHM, with a plan to complete each initiative. Review of the current issues demonstrated gaps between the current state of affairs and the full vision of the potential impact of PHM. Clinical initiatives were to develop an educational plan supporting the PHM Core Competencies and a clinical practice monitoring dashboard template. Quality initiatives included an environmental assessment of PHM participation on key committees, societies, and agencies to ensure appropriate PHM representation. Three QI collaboratives are underway. A Research Leadership Task Force was created and the Pediatric Research in Inpatient Settings (PRIS) network was refocused, defining a strategic framework for PRIS and developing a funding strategy. Workforce initiatives were to develop a descriptive statement that can be used by any PHM physician, a communications tool describing the "value added" of PHM, and a tool to assess career satisfaction among PHM physicians. We believe the Roundtable was successful in describing the current state of PHM and laying a course for the near future. Copyright © 2011 Society of Hospital Medicine.

  20. Formation of Microcracks During Micro-Arc Oxidation in a Phytic Acid-Containing Solution on Two-Phase AZ91HP

    NASA Astrophysics Data System (ADS)

    Zhang, R. F.; Chang, W. H.; Jiang, L. F.; Qu, B.; Zhang, S. F.; Qiao, L. P.; Xiang, J. H.

    2016-04-01

    Micro-arc oxidation (MAO) is an effective method to produce ceramic coatings on magnesium alloys and can considerably improve their corrosion resistance. The coating properties are closely related to microcracks, which inevitably develop on the coating surface. In order to determine how microcracks form and develop, anodic coatings were fabricated on two-phase AZ91HP after different anodizing times in a solution containing the environmentally friendly organic electrolyte phytic acid. The results show that the anodic film initially develops on the α phase. At 50 s, anodic coatings begin to develop on the β phase, evidencing the formation of a rough area. Owing to the successive development of the coating, microcracks initially appear at the boundary between the coating formed first on the α phase and the coating subsequently developed on the β phase. With prolonged treatment time, the microcracks near the β phase become evident. After treating for 3 min, the originally rough area on the β phase disappears and the coatings become almost uniform, with microcracks randomly distributed on the sample surface. Inorganic phosphates are found in the MAO coatings, suggesting that phytate salts are decomposed due to the high instantaneous temperature on the sample surface resulting from spark discharge.

  1. Advances in Small Particle Handling of Astromaterials in Preparation for OSIRIS-REx and Hayabusa2: Initial Developments

    NASA Technical Reports Server (NTRS)

    Snead, C. J.; McCubbin, F. M.; Nakamura-Messenger, K.; Righter, K.

    2018-01-01

    The Astromaterials Acquisition and Curation office at NASA Johnson Space Center has established an Advanced Curation program that is tasked with developing procedures, technologies, and data sets necessary for the curation of future astromaterials collections as envisioned by NASA exploration goals. One particular objective of the Advanced Curation program is the development of new methods for the collection, storage, handling and characterization of small (less than 100 micrometer) particles. Astromaterials Curation currently maintains four small particle collections: Cosmic Dust that has been collected in Earth's stratosphere by ER2 and WB-57 aircraft, Comet 81P/Wild 2 dust returned by NASA's Stardust spacecraft, interstellar dust that was returned by Stardust, and asteroid Itokawa particles that were returned by the JAXA's Hayabusa spacecraft. NASA Curation is currently preparing for the anticipated return of two new astromaterials collections - asteroid Ryugu regolith to be collected by Hayabusa2 spacecraft in 2021 (samples will be provided by JAXA as part of an international agreement), and asteroid Bennu regolith to be collected by the OSIRIS-REx spacecraft and returned in 2023. A substantial portion of these returned samples are expected to consist of small particle components, and mission requirements necessitate the development of new processing tools and methods in order to maximize the scientific yield from these valuable acquisitions. Here we describe initial progress towards the development of applicable sample handling methods for the successful curation of future small particle collections.

  2. Strengthening Methods for Assessing Students' Metahistorical Conceptions: Initial Development of the Historical Account Differences Survey

    ERIC Educational Resources Information Center

    O'Neill, D. Kevin; Guloy, Sheryl; Sensoy, Özlem

    2014-01-01

    To prepare students for participation in a pluralistic, democratic society, history curriculum should help them develop mature ideas about why multiple accounts of the same events exist. But how can we know if we are successful? In this article, we describe work on the design, validation, and piloting of a paper-and-pencil instrument called the…

  3. The development of uneven-aged southern pine silviculture before the Crossett Experimental Forest (Arkansas, USA)

    Treesearch

    Don C. Bragg

    2017-01-01

    Although the Crossett Experimental Forest (CEF) played a well-publicized role in the development of uneven-aged southern pine silviculture, work on a selection method in Arkansas (USA) did not originate there. In 1925, Leslie Pomeroy and Eugene Connor acquired the Ozark Badger Lumber Company and initiated an expert-driven selection management system compatible with...

  4. Wikis: Developing Pre-Service Teachers' Leadership Skills and Knowledge of Content Standards

    ERIC Educational Resources Information Center

    Reid-Griffin, Angelia; Slaten, Kelli M.

    2016-01-01

    In this initial phase of our multi-year research study we set out to explore the development of leadership skills in our pre-service secondary teachers after using an online wiki, Wikispaces. This paper presents our methods for preparing a group of 13 mathematics and 3 science secondary pre-service teachers to demonstrate the essential knowledge,…

  5. Action Research in a Non-Profit Agency School Setting: Analyzing the Adoption of an Innovation after Initial Training and Coaching

    ERIC Educational Resources Information Center

    Sandoval-Lucero, Elena; Maes, Johanna B.; Pappas, Georgia

    2013-01-01

    Action research is a method of organizational development and improvement often used in educational settings. This study implemented an action research process in an alternative school that serves students with significant special needs. The action research process was implemented by classroom teams who developed a research question, collected and…

  6. Developing and Testing an Online Tool for Teaching GIS Concepts Applied to Spatial Decision-Making

    ERIC Educational Resources Information Center

    Carver, Steve; Evans, Andy; Kingston, Richard

    2004-01-01

    The development and testing of a Web-based GIS e-learning resource is described. This focuses on the application of GIS for siting a nuclear waste disposal facility and the associated principles of spatial decision-making using Boolean and weighted overlay methods. Initial student experiences in using the system are analysed as part of a research…

  7. A new practice environment measure based on the reality and experiences of nurses working lives.

    PubMed

    Webster, Joan; Flint, Anndrea; Courtney, Mary

    2009-01-01

    To explore the underlying organizational issues affecting nurses' decisions to leave and to develop a contemporary practice environment measure based on the experiences of nurses' working lives. Turnover had reached an unacceptable level in our organization but the underlying reasons for leaving were unknown. In-depth interviews were conducted with 13 nurses who had resigned. Transcripts were analysed using the constant comparative method. Information from the interviews informed the development of a new practice environment tool, which has undergone initial testing using the Content Validity Index and Cronbach's alpha. Two domains ('work life' and 'personal life/professional development') and five themes ('feeling safe', 'feeling valued', 'getting things done', 'professional development' and 'being flexible') emerged from the interviews. The content validity score for the new instrument was 0.79 and Cronbach's alpha was 0.93. The new practice environment tool has shown useful initial reliability and validity but requires wider testing in other settings. The reality and experiences of nurses' working lives can be identified through exit interviews conducted by an independent person. Information from such interviews is useful in identifying an organization's strengths and weaknesses and in developing initiatives to support retention.
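
    For reference, the Cronbach's alpha reported above can be computed from item-level responses as alpha = (k/(k-1)) * (1 - sum of item variances / variance of the total score). A minimal sketch with simulated Likert-style data follows; the data are illustrative, not the study's.

    import numpy as np

    def cronbach_alpha(items):
        # items: (n_respondents, k_items) array of item scores.
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)       # variance of each item
        total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
        return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

    # Usage: simulated responses to a 5-item scale driven by one latent factor.
    rng = np.random.default_rng(0)
    latent = rng.normal(size=(200, 1))
    items = np.clip(np.round(3 + latent + 0.7 * rng.normal(size=(200, 5))), 1, 5)
    print(round(cronbach_alpha(items), 2))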

  8. Origins of Montessori Programming for Dementia

    PubMed Central

    Camp, Cameron J.

    2011-01-01

    The focus of this article is on the evolution of the use of Montessori educational methods as the basis for creating interventions for persons with dementia. The account of this evolution is autobiographical, as the development of Montessori Programming for Dementia (MPD) initially was through the efforts of myself and my research associates. My initial exposure to Maria Montessori’s work came as a result of my involvement with my own children’s education. This exposure influenced ongoing research on development of cognitive interventions for persons with dementia. A brief description of Montessori’s work with children and the educational methods she developed is followed by a description of how this approach can be translated into development of activities for persons with dementia. Assessment tools to document effects of MPD were created, focusing on observational tools to measure engagement and affect during individual and group activities programming for persons with dementia. Examples of the use of MPD by researchers, staff members, and family members are given, as well as examples of how persons with dementia can provide MPD to other persons with dementia or to children. Finally, examples of MPD’s dissemination internationally and future directions for research are presented. PMID:23515663

  9. Origins of Montessori Programming for Dementia.

    PubMed

    Camp, Cameron J

    2010-01-01

    The focus of this article is on the evolution of the use of Montessori educational methods as the basis for creating interventions for persons with dementia. The account of this evolution is autobiographical, as the development of Montessori Programming for Dementia (MPD) initially was through the efforts of myself and my research associates. My initial exposure to Maria Montessori's work came as a result of my involvement with my own children's education. This exposure influenced ongoing research on development of cognitive interventions for persons with dementia. A brief description of Montessori's work with children and the educational methods she developed is followed by a description of how this approach can be translated into development of activities for persons with dementia. Assessment tools to document effects of MPD were created, focusing on observational tools to measure engagement and affect during individual and group activities programming for persons with dementia. Examples of the use of MPD by researchers, staff members, and family members are given, as well as examples of how persons with dementia can provide MPD to other persons with dementia or to children. Finally, examples of MPD's dissemination internationally and future directions for research are presented.

  10. Initialization methods and ensembles generation for the IPSL GCM

    NASA Astrophysics Data System (ADS)

    Labetoulle, Sonia; Mignot, Juliette; Guilyardi, Eric; Denvil, Sébastien; Masson, Sébastien

    2010-05-01

    The protocol used and developments made for decadal and seasonal predictability studies at IPSL (Paris, France) are presented. The strategy chosen is to initialize the IPSL-CM5 (NEMO ocean and LMDZ atmosphere) model only at the ocean-atmosphere interface, following the guidance and expertise gained from ocean-only NEMO experiments. Two novel approaches are presented for initializing the coupled system. First, a nudging of sea surface temperature and wind stress towards available reanalysis is made with the surface salinity climatologically restored. Second, the heat, salt and momentum fluxes received by the ocean model are computed as a linear combination of the fluxes computed by the atmospheric model and by a CORE-style bulk formulation using up-to-date reanalysis. The steps that led to these choices are presented, as well as a description of the code adaptation and a comparison of the computational cost of both methods. The strategy for the generation of ensembles at the end of the initialization phase is also presented. We show how the technical environment of IPSL-CM5 (LibIGCM) was modified to achieve these goals.
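
    The first approach above amounts to adding a relaxation (nudging) tendency that pulls the model surface fields towards the reanalysis over a chosen time-scale. A minimal sketch of such a term is given below; the field, the grid and the 30-day time-scale are illustrative assumptions, not the IPSL-CM5 configuration.

    import numpy as np

    def nudge(field, target, dt_seconds, tau_seconds):
        # Explicitly discretized relaxation towards a reanalysis target:
        # d(field)/dt += -(field - target) / tau.
        return field + dt_seconds * (target - field) / tau_seconds

    # Usage: nudge a model SST field towards a reanalysis SST with a 30-day
    # time-scale, applied every 1-hour coupling step.
    rng = np.random.default_rng(0)
    sst_model = 15.0 + rng.normal(scale=0.5, size=(90, 180))   # degrees C
    sst_reanalysis = np.full((90, 180), 15.0)
    sst_model = nudge(sst_model, sst_reanalysis, dt_seconds=3600.0,
                      tau_seconds=30 * 86400.0)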

  11. Development of Scatterometer-Derived Surface Pressures

    NASA Astrophysics Data System (ADS)

    Hilburn, K. A.; Bourassa, M. A.; O'Brien, J. J.

    2001-12-01

    SeaWinds scatterometer-derived wind fields can be used to estimate surface pressure fields. The method to be used has been developed and tested with Seasat-A and NSCAT wind measurements. The method involves blending two dynamically consistent values of vorticity. Geostrophic relative vorticity is calculated from an initial guess surface pressure field (AVN analysis in this case). Relative vorticity is calculated from SeaWinds winds, adjusted to a geostrophic value, and then blended with the initial guess. An objective method is then applied to minimize the differences between the initial-guess field and the scatterometer field, subject to regularization. The long-term goal of this project is to derive research-quality pressure fields from the SeaWinds winds for the Southern Ocean from the Antarctic ice sheet to 30 deg S. The intermediate goal of this report involves generation of pressure fields over the northern hemisphere for testing purposes. Specifically, two issues need to be addressed. First, the most appropriate initial guess field will be determined: the pure AVN analysis or the previously assimilated pressure field. The independent comparison data to be used in answering this question will involve data near land, ship data, and ice data that were not included in the AVN analysis. Second, the smallest number of pressure observations required to anchor the assimilated field will be determined. This study will use Neumann (derivative) boundary conditions on the region of interest. Such boundary conditions only determine the solution to within a constant that must be determined by a number of anchoring points. If only a small number of anchoring points is required, this will demonstrate the viability of the general use of the scatterometer as a barometer over the oceans.
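
    The geostrophic relative vorticity used as the pressure-field counterpart above follows, on an f-plane with constant density, from zeta_g = laplacian(p) / (rho * f). A minimal centred finite-difference sketch on a regular grid is given below; the grid spacing, density, latitude and idealized pressure field are illustrative assumptions.

    import numpy as np

    def geostrophic_vorticity(p, dx, dy, lat_deg=45.0, rho=1.25):
        # zeta_g = laplacian(p) / (rho * f) using centred second differences
        # (periodic wrap at the edges, acceptable for an interior-point sketch).
        omega = 7.292e-5                                # Earth rotation rate (1/s)
        f = 2.0 * omega * np.sin(np.radians(lat_deg))   # Coriolis parameter
        d2p_dx2 = (np.roll(p, -1, axis=1) - 2.0 * p + np.roll(p, 1, axis=1)) / dx**2
        d2p_dy2 = (np.roll(p, -1, axis=0) - 2.0 * p + np.roll(p, 1, axis=0)) / dy**2
        return (d2p_dx2 + d2p_dy2) / (rho * f)

    # Usage: an idealized low-pressure system on a 100 km grid.
    ny, nx, d = 80, 80, 1.0e5
    y, x = np.meshgrid(np.arange(ny) * d, np.arange(nx) * d, indexing="ij")
    p = 101325.0 - 2000.0 * np.exp(-((x - 4.0e6)**2 + (y - 4.0e6)**2) / (5.0e5)**2)
    zeta = geostrophic_vorticity(p, d, d)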

  12. [Ego-state Therapy: Psychotherapy for Multiple Personality Disorders].

    PubMed

    Sugiyama, Toshiro

    2018-01-01

    The author describes ego-state therapy. This psychotherapy is used for treating multiple personality disorders. The author mentions the theoretical background of this method, and practical points. Initially, ego-state therapy was developed as a type of hypnotherapy, but it evolved as a safe therapeutic method in combination with trauma processing therapies. The author presents a case study, and discusses the clinical significance of this treatment.

  13. Exploring Strategic Thinking: Insights to Assess, Develop, and Retain Strategic Thinkers

    DTIC Science & Technology

    2013-02-01

    rise even as goal difficulty increases (Nisan, 1972). Individual time perspective is associated with the ability to perform professional... initial association with things military, strategy has assumed other more generic meanings that people use broadly and commonly. To mention just a...quantitative methods. Even within the holistic or systemic approach to design there are times when such methods are useful, particularly when

  14. Developing a policy guidance for financing dental care in Iran using the RAND Appropriateness Method.

    PubMed

    Jadidfard, M P; Yazdani, S; Khoshnevisan, M H

    2013-12-01

    This study aimed to provide recommendations on health care financing with special emphasis on dental care. The RAND Appropriateness Method was employed to obtain the collective opinion of a multidisciplinary panel of experts on a set of recommendation statements regarding Iranian dental care financing. An initial set of recommendations was identified from a literature review. Panel members, selected purposively and by peer nomination, each rated the appropriateness and necessity of the recommendations in a structured process of two rounds. Each recommendation was classified as inappropriate, uncertain, appropriate but not necessary, or appropriate and necessary according to the median rating score and the level of disagreement among the panellists. Of 28 initial recommendations, 25 were agreed on as appropriate, of which 22 were considered necessary. Altogether, these recommendations provide a holistic picture of an oral health system's financing in three domains: revenue collection, pooling of revenues and purchasing of dental services. The policy guidance recommendations are intended to provide the Iranian oral health authorities with an evidence base for financing dental care. The recommendations may be transferable, at least in part, particularly to developing countries with similar hybrid health system structures. Finally, the method used to develop the recommendations can serve as a model for use elsewhere.
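
    The classification step described above is commonly operationalized by combining the panel median on a 1-9 scale with a disagreement rule. The sketch below uses thresholds typical of RAND/UCLA applications (median 7-9 appropriate, 1-3 inappropriate, otherwise uncertain; disagreement when at least a third of the panel rates in each extreme tertile); these thresholds are an assumption and not necessarily the exact rules used in this study.

    import numpy as np

    def classify_recommendation(appropriateness, necessity=None):
        # Classify one recommendation from panel ratings on a 1-9 scale,
        # using the assumed RAND/UCLA-style rules described above.
        r = np.asarray(appropriateness)
        n = len(r)
        low, high = np.sum(r <= 3), np.sum(r >= 7)
        if low >= n / 3.0 and high >= n / 3.0:
            return "uncertain (disagreement)"
        med = np.median(r)
        if med >= 7:
            if necessity is not None and np.median(necessity) >= 7:
                return "appropriate and necessary"
            return "appropriate but not necessary"
        return "inappropriate" if med <= 3 else "uncertain"

    # Usage: a recommendation most panellists rate 7-9 on both dimensions.
    print(classify_recommendation([8, 9, 7, 8, 6, 9, 8], [9, 8, 7, 8, 7, 9, 8]))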

  15. Determinants of timely initiation of breastfeeding among mothers in Goba Woreda, South East Ethiopia: A cross sectional study

    PubMed Central

    2011-01-01

    Background Although breastfeeding is universal in Ethiopia, a range of regional differences in the timely initiation of breastfeeding has been documented. Initiation of breastfeeding is highly bound to cultural factors that may either enhance or inhibit optimal practices. The government of Ethiopia developed the National Infant and Young Child Feeding Guideline in 2004, and behavior change communications on breastfeeding have been going on since then. However, there is little information on the practice of timely initiation of breastfeeding and the factors that predict these practices after the implementation of the national guideline. The objective of this study is to determine the prevalence and determinant factors of timely initiation of breastfeeding among mothers in Bale Goba District, South East Ethiopia. Methods A community-based cross-sectional study was carried out from February to March 2010 using both quantitative and qualitative methods of data collection. A total of 608 mother-infant pairs were selected using a simple random sampling method, and key informants for the in-depth interviews were selected conveniently. Descriptive statistics, bivariate analysis and multivariable logistic regression analyses were employed to identify factors associated with timely initiation of breastfeeding. Results The prevalence of timely initiation of breastfeeding was 52.4%. Bivariate analysis showed that attendance of formal education, being an urban resident, institutional delivery and postnatal counseling on breastfeeding were significantly associated with timely initiation of breastfeeding (P < 0.05). After adjusting for other factors in the multivariable logistic model, living in an urban area [AOR: 4.1 (95% C.I: 2.31-7.30)] and getting postnatal counseling [AOR: 2.7 (1.86-3.94)] were independent predictors of timely initiation of breastfeeding. Conclusions The practice of timely initiation of breastfeeding is low, as nearly half the mothers did not start breastfeeding within one hour after delivery. The results suggest that breastfeeding behavior change communication, especially during the postnatal period, is critical in promoting optimal practice in the initiation of breastfeeding. Rural mothers need special attention as they are distant from various information sources. PMID:21473791

  16. A Roadmap for Using Agile Development in a Traditional Environment

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara; Starbird, Thomas; Grenander, Sven

    2006-01-01

    One of the newer classes of software engineering techniques is called 'Agile Development'. In Agile Development software engineers take small implementation steps and, in some cases, they program in pairs. In addition, they develop automatic tests prior to implementing their small functional piece. Agile Development focuses on rapid turnaround, incremental planning, customer involvement and continuous integration. Agile Development is not the traditional waterfall method or even a rapid prototyping method (although this methodology is closer to Agile Development). At the Jet Propulsion Laboratory (JPL) a few groups have begun Agile Development software implementations. The difficulty with this approach becomes apparent when Agile Development is used in an organization that has specific criteria and requirements handed down for how software development is to be performed. The work at the JPL is performed for the National Aeronautics and Space Agency (NASA). Both organizations have specific requirements, rules and processes for developing software. This paper will discuss some of the initial uses of the Agile Development methodology, the spread of this method and the current status of the successful incorporation into the current JPL development policies and processes.

  17. A Roadmap for Using Agile Development in a Traditional Environment

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara A.; Starbird, Thomas; Grenander, Sven

    2006-01-01

    One of the newer classes of software engineering techniques is called 'Agile Development'. In Agile Development software engineers take small implementation steps and, in some cases, they program in pairs. In addition, they develop automatic tests prior to implementing their small functional piece. Agile Development focuses on rapid turnaround, incremental planning, customer involvement and continuous integration. Agile Development is not the traditional waterfall method or even a rapid prototyping method (although this methodology is closer to Agile Development). At the Jet Propulsion Laboratory (JPL) a few groups have begun Agile Development software implementations. The difficulty with this approach becomes apparent when Agile Development is used in an organization that has specific criteria and requirements handed down for how software development is to be performed. The work at the JPL is performed for the National Aeronautics and Space Agency (NASA). Both organizations have specific requirements, rules and procedures for developing software. This paper will discuss some of the initial uses of the Agile Development methodology, the spread of this method and the current status of the successful incorporation into the current JPL development policies.

  18. Assessing Patients' Cognitive Therapy Skills: Initial Evaluation of the Competencies of Cognitive Therapy Scale.

    PubMed

    Strunk, Daniel R; Hollars, Shannon N; Adler, Abby D; Goldstein, Lizabeth A; Braun, Justin D

    2014-10-01

    In Cognitive Therapy (CT), therapists work to help patients develop skills to cope with negative affect. Most current methods of assessing patients' skills are cumbersome and impractical for clinical use. To address this issue, we developed and conducted an initial psychometric evaluation of self and therapist reported versions of a new measure of CT skills: the Competencies of Cognitive Therapy Scale (CCTS). We evaluated the CCTS at intake and post-treatment in a sample of 67 patients participating in CT. The CCTS correlated with a preexisting measure of CT skills (the Ways of Responding Questionnaire) and was also related to concurrent depressive symptoms. Across CT, self-reported improvements in CT competencies were associated with greater changes in depressive symptoms. These findings offer initial evidence for the validity of the CCTS. We discuss the CCTS in comparison with other measures of CT skills and suggest future research directions.

  19. Assessing Patients’ Cognitive Therapy Skills: Initial Evaluation of the Competencies of Cognitive Therapy Scale

    PubMed Central

    Strunk, Daniel R.; Hollars, Shannon N.; Adler, Abby D.; Goldstein, Lizabeth A.; Braun, Justin D.

    2014-01-01

    In Cognitive Therapy (CT), therapists work to help patients develop skills to cope with negative affect. Most current methods of assessing patients’ skills are cumbersome and impractical for clinical use. To address this issue, we developed and conducted an initial psychometric evaluation of self and therapist reported versions of a new measure of CT skills: the Competencies of Cognitive Therapy Scale (CCTS). We evaluated the CCTS at intake and post-treatment in a sample of 67 patients participating in CT. The CCTS correlated with a preexisting measure of CT skills (the Ways of Responding Questionnaire) and was also related to concurrent depressive symptoms. Across CT, self-reported improvements in CT competencies were associated with greater changes in depressive symptoms. These findings offer initial evidence for the validity of the CCTS. We discuss the CCTS in comparison with other measures of CT skills and suggest future research directions. PMID:25408560

  20. Cardiovascular point of care initiative: enhancements in clinical data management.

    PubMed

    Robertson, Jane

    2003-01-01

    The Department of Cardiovascular Surgery at East Alabama Medical Center (EAMC) initiated a program in 1996 to improve the quality and usefulness of clinical outcomes data. After years of using a commercial vendor product and enduring a tedious collection process, the department decided to develop its own tools to support quality improvement efforts. Using a hand-held personal data assistant (PDA), the team developed tools that allowed ongoing data collection at the point of care delivery. The tools and methods facilitated the collection of real time, accurate information that allowed EAMC to participate in multiple clinical quality initiatives. The ability to conduct rapid-cycle performance improvement studies propelled EAMC's Cardiovascular Surgery Program into the Top 100 as recognized by HCIA, now Solucient, for 3 consecutive years (1999-2001). This report will describe the evolution of the data collection process as well as the quality improvements that resulted.

  1. Materials Genome Initiative

    NASA Technical Reports Server (NTRS)

    Vickers, John

    2015-01-01

    The Materials Genome Initiative (MGI) project element is a cross-Center effort that is focused on the integration of computational tools to simulate manufacturing processes and materials behavior. These computational simulations will be utilized to gain understanding of processes and materials behavior to accelerate process development and certification to more efficiently integrate new materials in existing NASA projects and to lead to the design of new materials for improved performance. This NASA effort looks to collaborate with efforts at other government agencies and universities working under the national MGI. MGI plans to develop integrated computational/experimental/processing methodologies for accelerating discovery and insertion of materials to satisfy NASA's unique mission demands. The challenges include validated design tools that incorporate materials properties, processes, and design requirements; and materials process control to rapidly mature emerging manufacturing methods and develop certified manufacturing processes.

  2. Spline Approximation of Thin Shell Dynamics

    NASA Technical Reports Server (NTRS)

    delRosario, R. C. H.; Smith, R. C.

    1996-01-01

    A spline-based method for approximating thin shell dynamics is presented here. While the method is developed in the context of the Donnell-Mushtari thin shell equations, it can be easily extended to the Byrne-Flugge-Lur'ye equations or other models for shells of revolution as warranted by applications. The primary requirements for the method include accuracy, flexibility and efficiency in smart material applications. To accomplish this, the method was designed to be flexible with regard to boundary conditions, material nonhomogeneities due to sensors and actuators, and inputs from smart material actuators such as piezoceramic patches. The accuracy of the method was also of primary concern, both to guarantee full resolution of structural dynamics and to facilitate the development of PDE-based controllers which ultimately require real-time implementation. Several numerical examples provide initial evidence demonstrating the efficacy of the method.

  3. A comparison of optical gradation analysis devices to current test methods--phase 2.

    DOT National Transportation Integrated Search

    2012-04-01

    Optical devices are being developed to deliver accurate size and shape of aggregate particles with less labor, less consistency error, and greater reliability. This study was initiated to review the existing technology and generate basic data to ...

  4. Report: Office of Research and Development Needs to Improve Its Method of Measuring Administrative Savings

    EPA Pesticide Factsheets

    Report #11-P-0333, July 14, 2011. ORD’s efforts to reduce its administrative costs are noteworthy, but ORD needs to improve its measurement mechanism for assessing the effectiveness of its initiatives to reduce administrative costs.

  5. Analysis of Peanut Seed Oil by NIR

    USDA-ARS?s Scientific Manuscript database

    Near infrared reflectance spectra (NIRS) were collected from Arachis hypogaea seed samples and used in predictive models to rapidly identify varieties with high oleic acid. The method was developed for shelled peanut seeds with intact testa. Spectra were evaluated initially by principal component an...

  6. Analysis of plant germline development by high-throughput RNA profiling: technical advances and new insights.

    PubMed

    Schmidt, Anja; Schmid, Marc W; Grossniklaus, Ueli

    2012-04-01

    Reproduction is a crucial step in the life cycle of plants. The male and female germline lineages develop in the reproductive organs of the flower, which in higher plants are the anthers and ovules, respectively. Development of the germline lineage initiates from a dedicated sporophytic cell that undergoes meiosis to form spores that subsequently give rise to the gametophytes through mitotic cell divisions. The mature male and female gametophytes harbour the male (sperm cells) and female gametes (egg and central cell), respectively. Those unite during double fertilization to initiate embryo and endosperm development in sexually reproducing higher plants. While cytological changes involved in development of the germline lineages have been well characterized in a number of species, investigation of the transcriptional basis underlying their development and the specification of the gametes proved challenging. This is largely due to the inaccessibility of the cells constituting the germline lineages, which are enclosed by sporophytic tissues. Only recently, these technical limitations could be overcome by combining new methods to isolate the relevant cells with powerful transcriptional profiling methods, such as microarrays or high-throughput sequencing of RNA. This review focuses on these technical advances and the new insights gained from them concerning the transcriptional basis and molecular mechanisms underlying germline development. © 2012 The Authors. The Plant Journal © 2012 Blackwell Publishing Ltd.

  7. MAGI: many-component galaxy initializer

    NASA Astrophysics Data System (ADS)

    Miki, Yohei; Umemura, Masayuki

    2018-04-01

    Providing initial conditions is an essential procedure for numerical simulations of galaxies. The initial conditions for idealized individual galaxies in N-body simulations should resemble observed galaxies and be dynamically stable for time-scales much longer than their characteristic dynamical times. However, generating a galaxy model ab initio as a system in dynamical equilibrium is a difficult task, since a galaxy contains several components, including a bulge, disc, and halo. Moreover, it is desirable that the initial-condition generator be fast and easy to use. We have now developed an initial-condition generator for galactic N-body simulations that satisfies these requirements. The developed generator adopts a distribution-function-based method, and it supports various kinds of density models, including custom-tabulated inputs and the presence of more than one disc. We tested the dynamical stability of systems generated by our code, representing early- and late-type galaxies, with N = 2,097,152 and 8,388,608 particles, respectively, and we found that the model galaxies maintain their initial distributions for at least 1 Gyr. The execution times required to generate the two models were 8.5 and 221.7 seconds, respectively, which are negligible compared to typical execution times for N-body simulations. The code is provided as open-source software and is publicly and freely available at https://bitbucket.org/ymiki/magi.
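
    As a generic illustration of a distribution-function-based initial-condition method (and emphatically not MAGI's actual implementation), the sketch below samples positions and velocities for a Plummer sphere using the classic inversion-plus-rejection recipe, in units where G = M = a = 1.

    import numpy as np

    def plummer_ics(n, seed=0):
        # Sample a Plummer sphere: radius from the inverted cumulative mass
        # profile, isotropic directions, and speed by rejection sampling of
        # q = v / v_escape against g(q) = q^2 * (1 - q^2)^(7/2).
        rng = np.random.default_rng(seed)
        r = (rng.uniform(1e-10, 1.0, n) ** (-2.0 / 3.0) - 1.0) ** -0.5
        pos = rng.normal(size=(n, 3))
        pos *= (r / np.linalg.norm(pos, axis=1))[:, None]
        q = np.empty(n)
        for i in range(n):
            while True:
                x4, x5 = rng.uniform(size=2)
                if 0.1 * x5 < x4**2 * (1.0 - x4**2) ** 3.5:
                    q[i] = x4
                    break
        v = q * np.sqrt(2.0) * (1.0 + r**2) ** -0.25
        vel = rng.normal(size=(n, 3))
        vel *= (v / np.linalg.norm(vel, axis=1))[:, None]
        return pos, vel

    # Usage: a 10,000-particle model, ready to hand to an N-body integrator.
    pos, vel = plummer_ics(10000)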

  8. Quality Improvement Initiatives: The Missed Opportunity for Health Plans

    PubMed Central

    Fernandez-Lopez, Sara; Lennert, Barbara

    2009-01-01

    Background The increase in healthcare cost without direct improvements in health outcomes, coupled with a desire to expand access to the large uninsured population, has underscored the importance of quality initiatives and organizations that provide more affordable healthcare by maximizing value. Objectives To determine the knowledge of managed care organizations about quality organizations and initiatives and to identify potential opportunities in which pharmaceutical companies could collaborate with health plans in the development and implementation of quality initiatives. Methods We conducted a survey of 36 pharmacy directors and 15 medical directors of different plans during a Managed Care Network meeting in 2008. The represented plans cover almost 74 million lives in commercial, Medicare, and Medicaid programs, or a combination of them. Results The responses show limited knowledge among pharmacy and medical directors about current quality organizations and initiatives, except for quality organizations that provide health plan quality accreditation. The results also reveal an opportunity for pharmaceutical companies to collaborate with private health plans in the development of quality initiatives, especially those related to drug utilization, such as patient adherence and education and correct drug utilization. Conclusion Our survey shows clearly that today's focus for managed care organizations is mostly limited to the organizations that provide health plan quality accreditation, with less focus on other organizations. PMID:25126303

  9. A New Method to Grow SiC: Solvent-Laser Heated Floating Zone

    NASA Technical Reports Server (NTRS)

    Woodworth, Andrew A.; Neudeck, Philip G.; Sayir, Ali

    2012-01-01

    The solvent-laser heated floating zone (solvent-LHFZ) growth method is being developed to grow long single crystal SiC fibers. The technique combines the single crystal fiber growth ability of laser heated floating zone with solvent based growth techniques (e.g. traveling solvent method) ability to grow SiC from the liquid phase. Initial investigations reported in this paper show that the solvent-LHFZ method readily grows single crystal SiC (retains polytype and orientation), but has a significant amount of inhomogeneous strain and solvent rich inclusions.

  10. High Frequency Vibration Based Fatigue Testing of Developmental Alloys

    NASA Astrophysics Data System (ADS)

    Holycross, Casey M.; Srinivasan, Raghavan; George, Tommy J.; Tamirisakandala, Seshacharyulu; Russ, Stephan M.

    Many fatigue test methods have been previously developed to rapidly evaluate fatigue behavior. This increased test speed can come at some expense, since these methods may require non-standard specimen geometry or increased facility and equipment capability. One such method, developed by George et al., involves a base-excited plate specimen driven into a high frequency bending resonant mode. This resonant mode is of sufficient frequency (typically 1200 to 1700 Hertz) to accumulate 10^7 cycles in a few hours. One of the main limitations of this test method is that fatigue cracking is almost certainly guaranteed to be surface initiated at regions of high stress. This brings into question the validity of the fatigue test results, as compared to more traditional uniaxial, smooth-bar testing, since the high stresses subject only a small volume to fatigue damage. This limitation also brings into question the suitability of this method to screen developmental alloys, should their initiation life be governed by subsurface flaws. However, if applicable, the rapid generation of fatigue data using this method would facilitate faster design iterations, identifying material and manufacturing process deficiencies more quickly. The developmental alloy used in this study was a powder metallurgy boron-modified Ti-6Al-4V, a new alloy currently being considered for gas turbine engine fan blades. Plate specimens were subjected to fully reversed bending fatigue. Results are compared with existing data from commercially available Ti-6Al-4V using both vibration-based and more traditional fatigue test methods.
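
    The test-time advantage quoted above follows directly from the resonant frequency. A minimal sketch of the cycle-accumulation arithmetic is shown below; the frequencies come from the range quoted in the abstract, while the 20 Hz comparison rate is an assumption added for contrast.

    def hours_to_accumulate(cycles, frequency_hz):
        # Test time in hours needed to accumulate a given number of cycles.
        return cycles / frequency_hz / 3600.0

    # Usage: 1e7 cycles at 1200-1700 Hz takes roughly 1.6 to 2.3 hours,
    # versus several days at an assumed conventional rate of ~20 Hz.
    for f in (1200.0, 1700.0, 20.0):
        print(f, round(hours_to_accumulate(1e7, f), 2))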

  11. Factors Associated With Contraceptive Method Choice and Initiation in Adolescents and Young Women.

    PubMed

    Cohen, Rebecca; Sheeder, Jeanelle; Kane, Meghan; Teal, Stephanie B

    2017-10-01

    The purpose of the study was to identify factors associated with uptake of contraceptive implants or intrauterine devices (IUDs) by adolescents and young women. For this prospective cohort study, we recruited English-speaking female contraceptive initiators aged 14-24 years attending a Title X-supported, youth-focused clinic. Immediately prior to their visits, participants completed surveys assessing demographic and reproductive characteristics and awareness of, interest in, and intent to initiate specific contraceptive methods. Participants also answered questions about their social contacts' contraceptive experiences. Following the visit, participants reported the method initiated and the perceived importance of provider counseling. We used a multivariable regression model to ascertain factors associated with initiation of an IUD, an implant, or a short-acting reversible method. We enrolled 1,048 contraceptive initiators: 277 initiated short-acting methods, 384 IUDs, and 387 implants. High previsit personal acceptability of the method was associated with choosing that method for both implants and IUDs. Knowing someone who uses a specific method and likes it was predictive of personal acceptability of that method (IUD adjusted odds ratio: 10.9, 95% confidence interval: 3.8-31.1; implant adjusted odds ratio: 7.0, 95% confidence interval: 2.3-21.0). However, 10.4% of those initiating IUDs and 14.2% of those initiating implants had never heard of the method before their appointment. Even women with previsit intent to initiate a specific method found importance in contraceptive counseling. Previsit personal acceptability, which was associated with social contacts' experiences, was the strongest predictor of specific method uptake in our study. However, counseling informed the decisions of those with low previsit awareness and supported patients with formed intent. Copyright © 2017 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  12. Development of the M. D. Anderson Cancer Center Gynecologic Applicators for the Treatment of Cervical Cancer: Historical Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yordy, John S., E-mail: john.yordy@utsouthwestern.edu; Almond, Peter R.; Delclos, Luis

    Purpose: To provide historical background on the development and initial studies of the gynecological (gyn) applicators developed by Dr. Gilbert H. Fletcher, a radiation oncologist and chairperson from 1948 to 1981 of the department at the M.D. Anderson Hospital (MDAH) for Cancer Research in Houston, TX, and to acknowledge the previously unrecognized contribution that Dr. Leonard G. Grimmett, a radiation physicist and chairperson from 1949 to 1951 of the physics department at MDAH, made to the development of the gynecological applicators. Methods and Materials: We reviewed archival materials from the Historical Resource Center and from the Department of Radiation Physics at the University of Texas M. D. Anderson Cancer Center, as well as contemporary published papers, to trace the history of the applicators. Conclusions: Dr. Fletcher's work was influenced by the work on gynecologic applicators in the 1940s in Europe, especially work done at the Royal Cancer Hospital in London. Those efforts influenced not only Dr. Fletcher's approach to the design of the applicators but also the methods used to perform in vivo measurements and determine the dose distribution. Much of the initial development of the dosimetry techniques and measurements at MDAH was carried out by Dr. Grimmett.

  13. Towards the Prediction of Decadal to Centennial Climate Processes in the Coupled Earth System Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Zhengyu; Kutzbach, J.; Jacob, R.

    2011-12-05

    In this proposal, we have made major advances in the understanding of decadal and long-term climate variability. (a) We performed a systematic study of multidecadal climate variability in FOAM-LPJ and CCSM-T31, and are starting to explore decadal variability in the IPCC AR4 models. (b) We developed several novel methods for the assessment of climate feedbacks in observations. (c) We also developed a new initialization scheme, DAI (Dynamical Analogue Initialization), for ensemble decadal prediction. (d) We also studied climate-vegetation feedback in observations and models. (e) Finally, we started a pilot program using an Ensemble Kalman Filter in a CGCM for decadal climate prediction.

  14. Healthy competition drives success in results-based aid: Lessons from the Salud Mesoamérica Initiative.

    PubMed

    El Bcheraoui, Charbel; Palmisano, Erin B; Dansereau, Emily; Schaefer, Alexandra; Woldeab, Alexander; Moradi-Lakeh, Maziar; Salvatierra, Benito; Hernandez-Prado, Bernardo; Mokdad, Ali H

    2017-01-01

    The Salud Mesoamérica Initiative (SMI) is a three-operation strategy, and is a pioneer in the world of results-based aid (RBA) in terms of the success it has achieved in improving health system inputs following its initial operation. This success in meeting pre-defined targets is rare in the world of financial assistance for health. We investigated the influential aspects of SMI that could have contributed to its effectiveness in improving health systems, with the aim of providing international donors, bilateral organizations, philanthropies, and recipient countries with new perspectives that can help increase the effectiveness of future assistance for health, specifically in the arena of RBA. Qualitative methods were used, based on the criteria of relevance and effectiveness proposed by the Development Assistance Committee of the Organization for Economic Co-operation and Development. Our methods included document review, key informant interviews, a focus group discussion, and a partnership analysis. A purposive sample of 113 key informants was drawn, comprising donors, representatives from the Inter-American Development Bank, ministries of health, technical assistance organizations, evaluation organizations, and health care providers. During May-October 2016, we interviewed them regarding the relevance and effectiveness of SMI. Themes emerged relative to the topics we investigated, and covered the design and the drivers of success of the initiative. The success is due to 1) the initiative's regional approach, which pressured recipient countries to compete toward meeting targets, 2) a robust and flexible design that incorporated the richness of input from stakeholders at all levels, 3) the design-embedded evaluation component that created a culture of accountability among recipient countries, and 4) the reflective knowledge environment that created a culture of evidence-based decision-making. A regional approach involving all appropriate stakeholders, based on knowledge sharing and embedded evaluation, can help ensure the effectiveness of future results-based aid programs for health in global settings.

  15. A brief measure of attitudes toward mixed methods research in psychology.

    PubMed

    Roberts, Lynne D; Povee, Kate

    2014-01-01

    The adoption of mixed methods research in psychology has trailed behind other social science disciplines. Teaching psychology students, academics, and practitioners about mixed methodologies may increase the use of mixed methods within the discipline. However, tailoring and evaluating education and training in mixed methodologies requires an understanding of, and way of measuring, attitudes toward mixed methods research in psychology. To date, no such measure exists. In this article we present the development and initial validation of a new measure: Attitudes toward Mixed Methods Research in Psychology. A pool of 42 items developed from previous qualitative research on attitudes toward mixed methods research along with validation measures was administered via an online survey to a convenience sample of 274 psychology students, academics and psychologists. Principal axis factoring with varimax rotation on a subset of the sample produced a four-factor, 12-item solution. Confirmatory factor analysis on a separate subset of the sample indicated that a higher order four factor model provided the best fit to the data. The four factors ('Limited Exposure', '(in)Compatibility', 'Validity', and 'Tokenistic Qualitative Component') each have acceptable internal reliability. Known groups validity analyses based on preferred research orientation and self-rated mixed methods research skills, and convergent and divergent validity analyses based on measures of attitudes toward psychology as a science and scientist and practitioner orientation, provide initial validation of the measure. This brief, internally reliable measure can be used in assessing attitudes toward mixed methods research in psychology, measuring change in attitudes as part of the evaluation of mixed methods education, and in larger research programs.

  16. Cardiac biplane strain imaging: initial in vivo experience

    NASA Astrophysics Data System (ADS)

    Lopata, R. G. P.; Nillesen, M. M.; Verrijp, C. N.; Singh, S. K.; Lammens, M. M. Y.; van der Laak, J. A. W. M.; van Wetten, H. B.; Thijssen, J. M.; Kapusta, L.; de Korte, C. L.

    2010-02-01

    In this study, first we propose a biplane strain imaging method using a commercial ultrasound system, yielding estimation of the strain in three orthogonal directions. Secondly, an animal model of a child's heart was introduced that is suitable to simulate congenital heart disease and was used to test the method in vivo. The proposed approach can serve as a framework to monitor the development of cardiac hypertrophy and fibrosis. A 2D strain estimation technique using radio frequency (RF) ultrasound data was applied. Biplane image acquisition was performed at a relatively low frame rate (<100 Hz) using a commercial platform with an RF interface. For testing the method in vivo, biplane image sequences of the heart were recorded during the cardiac cycle in four dogs with an aortic stenosis. Initial results reveal the feasibility of measuring large radial, circumferential and longitudinal cumulative strain (up to 70%) at a frame rate of 100 Hz. Mean radial strain curves of a manually segmented region-of-interest in the infero-lateral wall show excellent correlation between the measured strain curves acquired in two perpendicular planes. Furthermore, the results show the feasibility and reproducibility of assessing radial, circumferential and longitudinal strains simultaneously. In this preliminary study, three beagles developed an elevated pressure gradient over the aortic valve (Δp: 100-200 mmHg) and myocardial hypertrophy. One dog did not develop any sign of hypertrophy (Δp = 20 mmHg). Initial strain (rate) results showed that the maximum strain (rate) decreased with increasing valvular stenosis (-50%), which is in accordance with previous studies. Histological findings corroborated these results and showed an increase in fibrotic tissue for the hearts with larger pressure gradients (100, 200 mmHg), as well as lower strain and strain rate values.
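
    Cumulative strain of the kind reported above is typically built up by compounding frame-to-frame strain increments over the cardiac cycle. A minimal sketch of that accumulation is given below; the incremental strains are synthetic, not derived from RF data.

    import numpy as np

    def cumulative_strain(incremental_strains):
        # Compound frame-to-frame strain increments into cumulative strain
        # relative to the first frame: 1 + E_cum = prod(1 + e_i).
        return np.cumprod(1.0 + np.asarray(incremental_strains)) - 1.0

    # Usage: ~1 s of synthetic per-frame radial strain increments at 100 Hz,
    # rising during systole and relaxing back towards baseline.
    frames = 100
    increments = 0.01 * np.sin(np.linspace(0.0, 2.0 * np.pi, frames))
    radial_cum = cumulative_strain(increments)
    print(round(radial_cum.max(), 2))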

  17. Reminova and EAER: Keeping Enamel Whole through Caries Remineralization.

    PubMed

    Pitts, N B; Wright, J P

    2018-02-01

    This article aims to outline the early development of a King's College London dental spinout company, Reminova, formed to commercialize a novel clinical method of caries remineralization: electrically accelerated and enhanced remineralization (EAER). This method is being developed to address the unmet clinical need identified by modern caries management strategies to keep enamel "whole" through remineralization of clinical caries as a form of nonoperative caries treatment for initial-stage and moderate lesions. A progressive movement within dentistry is shifting away from the restorative-only model, which, it is suggested, has failed. The high prevalence of initial-stage caries across populations provides a significant opportunity to prevent restorations and reduce repeat restorations over a patient's lifetime. Reminova has set out to provide a method to repair lesions without drilling, filling, pain, or injections. The article outlines the rationale for and the chronological stages of the technology and company development. It then outlines corroborative evidence to show that EAER treatment can, in this preliminary in vitro investigation, remineralize clinically significant caries throughout the depth of the lesion as measured by Knoop microhardness and corroborated by scanning electron microscopy. Furthermore, the presented data show that EAER-treated enamel is harder than the healthy enamel measured nearby in each sample and is very similar in appearance to healthy enamel from the subjective interpretation made possible by scanning electron microscopy imagery. The data presented also show that this more "complete" remineralization to a high hardness level has been achieved with 2 remineralizing agents via in vitro human tooth samples. The broad clinical potential of this new treatment methodology seems to be very encouraging from these results. Reminova will strive to continue its mission, to ensure that, in the future, dental teams will not need to drill holes for the treatment of initial-stage and moderate caries lesions.

  18. Optimal sixteenth order convergent method based on quasi-Hermite interpolation for computing roots.

    PubMed

    Zafar, Fiza; Hussain, Nawab; Fatimah, Zirwah; Kharal, Athar

    2014-01-01

    We present a four-step, multipoint iterative method without memory for solving nonlinear equations. The method is constructed by using quasi-Hermite interpolation and has order of convergence sixteen. As this method requires four function evaluations and one derivative evaluation at each step, it is optimal in the sense of the Kung and Traub conjecture. Comparisons are given with some other newly developed sixteenth-order methods. The interval Newton method is also used to find sufficiently accurate initial approximations. Some figures show the enclosure of finitely many zeroes of nonlinear equations in an interval. Basins of attraction demonstrate the effectiveness of the method.
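
    The Kung-Traub notion of optimality referenced above states that a multipoint method without memory using d evaluations per iteration can reach order at most 2^(d-1); the paper's sixteenth-order scheme attains this bound with five evaluations. That scheme is not reproduced here, but the sketch below shows the same idea one level down: Ostrowski's classical two-step method, which reaches the optimal order four from two function evaluations and one derivative evaluation.

        # Ostrowski's optimal fourth-order two-step method (2 function + 1 derivative
        # evaluations per iteration), shown only to illustrate Kung-Traub optimality;
        # it is not the sixteenth-order quasi-Hermite scheme of the paper.
        def ostrowski(f, df, x0, tol=1e-14, max_iter=50):
            x = x0
            for _ in range(max_iter):
                fx = f(x)
                if abs(fx) < tol:
                    break
                dfx = df(x)
                y = x - fx / dfx                                 # Newton predictor
                fy = f(y)
                x = y - fy * fx / ((fx - 2.0 * fy) * dfx)        # Ostrowski corrector
            return x

        # example: the positive root of x**3 - 10 = 0
        root = ostrowski(lambda x: x**3 - 10, lambda x: 3 * x**2, x0=2.0)
        print(root, root**3)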

  19. Brain Biology Machine Initiative: Developing Innovative Novel Methods to Improve Neuro-Rehabilitation for Amputees and Treatment for Patients at Remote Sites with Acute Brain Injury

    DTIC Science & Technology

    2010-10-01

    bode well for the future. The paper we submitted to the Journal of Neuroscience detailing the TVAG rabies tracer system was accepted with revisions ... of brain electrical activity. Stas Kounitsky successfully completed the port of the new vector-additive implicit (VAI) method for the anisotropic ... Alternating Direction Implicit (ADI) for isotropic head models, and the Vector Additive Implicit (VAI) for anisotropic head models. The ADI method

  20. Method for determining how to operate and control wind turbine arrays in utility systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Javid, S.H.; Hauth, R.L.; Younkins, T.D.

    1984-01-01

    A method for determining how utility wind turbine arrays should be controlled and operated on the load frequency control time-scale is presented. Initial considerations for setting wind turbine control requirements are followed by a description of open loop operation and of closed loop and feed forward wind turbine array control concepts. The impact of variations in array output on meeting minimum criteria is then assessed. The method for determining the required control functions is then presented and results are tabulated. (LEW)

  1. Experiments and simulations of single shock Richtmyer-Meshkov Instability with measured, volumetric initial conditions

    NASA Astrophysics Data System (ADS)

    Sewell, Everest; Ferguson, Kevin; Greenough, Jeffrey; Jacobs, Jeffrey

    2014-11-01

    We describe new experiments on single-shock Richtmyer-Meshkov Instability (RMI) performed on the shock tube apparatus at the University of Arizona in which the initial conditions are volumetrically imaged prior to shock wave arrival. The initial perturbation plays a major role in the evolution of the RMI, and previous experimental efforts captured only a narrow slice of the initial condition. The method presented uses a rastered laser sheet to capture additional images through the depth of the initial condition shortly before the experimental start time. These images are then used to reconstruct a volumetric approximation of the experimental perturbation, which is simulated using the hydrodynamics code ARES, developed at Lawrence Livermore National Laboratory (LLNL). Comparisons are made between the time evolution of the interface width and the mixedness ratio measured from the experiments and the predictions from the numerical simulations.
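
    The reconstruction step described above amounts to stacking the rastered laser-sheet images into a coarse 3D array and interpolating between sheets to obtain a volumetric initial condition on the simulation grid. A minimal sketch of that interpolation is shown below; the array names, raster count, and grid sizes are hypothetical placeholders rather than the experiment's actual values.

        # Stack N laser-sheet slices recorded at known spanwise positions and
        # interpolate onto a finer spanwise grid -- a rough sketch of building a
        # volumetric initial condition from planar measurements.
        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        nslices, ny, nx = 8, 128, 128                 # hypothetical raster of 8 sheets
        slices = np.random.rand(nslices, ny, nx)      # stand-in for the measured images
        z_meas = np.linspace(0.0, 1.0, nslices)       # normalized spanwise sheet positions

        interp = RegularGridInterpolator(
            (z_meas, np.arange(ny), np.arange(nx)), slices, method="linear")

        z_fine = np.linspace(0.0, 1.0, 32)            # finer spanwise grid for the solver
        Z, Y, X = np.meshgrid(z_fine, np.arange(ny), np.arange(nx), indexing="ij")
        volume = interp(np.stack([Z, Y, X], axis=-1)) # (32, 128, 128) volumetric field
        print(volume.shape)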

  2. Model correlation and damage location for large space truss structures: Secant method development and evaluation

    NASA Technical Reports Server (NTRS)

    Smith, Suzanne Weaver; Beattie, Christopher A.

    1991-01-01

    On-orbit testing of a large space structure will be required to complete the certification of any mathematical model for the structure dynamic response. The process of establishing a mathematical model that matches measured structure response is referred to as model correlation. Most model correlation approaches have an identification technique to determine structural characteristics from the measurements of the structure response. This problem is approached with one particular class of identification techniques - matrix adjustment methods - which use measured data to produce an optimal update of the structure property matrix, often the stiffness matrix. New methods were developed for identification to handle problems of the size and complexity expected for large space structures. Further development and refinement of these secant-method identification algorithms were undertaken. Also, evaluation of these techniques as an approach for model correlation was initiated.
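
    Matrix adjustment methods of the kind described above update a stiffness matrix so that the analytical model reproduces measured modal data. The secant-method algorithms of the report are not reproduced here; the sketch below shows only the general flavor, fitting elementwise stiffness scale factors to "measured" natural frequencies by least squares, with the toy matrices and measurements invented for illustration.

        # Generic illustration of a matrix-adjustment (model correlation) step on a
        # toy 3-DOF spring-mass chain; not the secant method itself.
        import numpy as np
        from scipy.linalg import eigh
        from scipy.optimize import least_squares

        M = np.diag([1.0, 1.0, 1.0])

        def k_elem(k, i):
            # stiffness contribution of spring i in a fixed-free 3-spring chain
            K = np.zeros((3, 3))
            if i == 0:
                K[0, 0] = k
            else:
                K[np.ix_([i - 1, i], [i - 1, i])] = k * np.array([[1, -1], [-1, 1]])
            return K

        k_nominal = np.array([100.0, 100.0, 100.0])

        def frequencies(scales):
            K = sum(k_elem(s * k, i) for i, (s, k) in enumerate(zip(scales, k_nominal)))
            return np.sqrt(eigh(K, M, eigvals_only=True))        # rad/s

        f_measured = frequencies(np.array([1.0, 0.7, 1.0]))      # pretend spring 2 is degraded

        fit = least_squares(lambda s: frequencies(s) - f_measured,
                            x0=np.ones(3), bounds=(0.1, 2.0))
        print(fit.x)   # recovered scale factors; the low one flags the "damaged" element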

  3. The Incorporation and Initialization of Cloud Water/ice in AN Operational Forecast Model

    NASA Astrophysics Data System (ADS)

    Zhao, Qingyun

    Quantitative precipitation forecasts have been one of the weakest aspects of numerical weather prediction models. Theoretical studies show that the errors in precipitation calculation can arise from three sources: errors in the large-scale forecasts of primary variables, errors in the crude treatment of condensation/evaporation and precipitation processes, and errors in the model initial conditions. A new precipitation parameterization scheme has been developed to investigate the forecast value of improved precipitation physics via the introduction of cloud water and cloud ice into a numerical prediction model. The main feature of this scheme is the explicit calculation of cloud water and cloud ice in both the convective and stratiform precipitation parameterization. This scheme has been applied to the eta model at the National Meteorological Center. Four extensive tests have been performed. The statistical results showed a significant improvement in the model precipitation forecasts. Diagnostic studies suggest that the inclusion of cloud ice is important in transferring water vapor to precipitation and in the enhancement of latent heat release; the latter subsequently affects the vertical motion field significantly. Since three-dimensional cloud data is absent from the analysis/assimilation system for most numerical models, a method has been proposed to incorporate observed precipitation and nephanalysis data into the data assimilation system to obtain the initial cloud field for the eta model. In this scheme, the initial moisture and vertical motion fields are also improved at the same time as cloud initialization. The physical initialization is performed in a dynamical initialization framework that uses the Newtonian dynamical relaxation method to nudge the model's wind and mass fields toward analyses during a 12-hour data assimilation period. Results from a case study showed that a realistic cloud field was produced by this method at the end of the data assimilation period. Precipitation forecasts have been significantly improved as a result of the improved initial cloud, moisture and vertical motion fields.
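
    For reference, the Newtonian dynamical relaxation (nudging) used in the assimilation step above is usually written as an extra forcing term that relaxes a model variable toward the analysis over a chosen time scale; a generic form (notation assumed here, not taken from the paper) is

        \frac{\partial \psi}{\partial t} = F(\psi) + \frac{\psi_{\mathrm{analysis}} - \psi}{\tau},

    where \psi is a nudged model variable (e.g., a wind or mass field), F(\psi) collects the model's dynamical and physical tendencies, and \tau is the relaxation time scale chosen so that the model is drawn toward the analyses over the 12-hour assimilation period.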

  4. Initial draft of CSE-UCLA evaluation model based on weighted product in order to optimize digital library services in computer college in Bali

    NASA Astrophysics Data System (ADS)

    Divayana, D. G. H.; Adiarta, A.; Abadi, I. B. G. S.

    2018-01-01

    The aim of this research was to create an initial design of the CSE-UCLA evaluation model, modified with the Weighted Product method, for evaluating digital library services at Computer Colleges in Bali. The research used a developmental research method following the Borg and Gall design. The result obtained from the research conducted earlier this month was a rough sketch of the Weighted Product based CSE-UCLA evaluation model; this design provides a general overview of the stages of the Weighted Product based CSE-UCLA evaluation model used to optimize digital library services at the Computer Colleges in Bali.
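
    The Weighted Product component mentioned above is a standard multi-criteria scoring rule: each alternative's criterion values are raised to the criterion weights and multiplied, and the products are normalized into preference values. A minimal sketch with invented criterion scores and weights (not the authors' evaluation data) is shown below.

        # Weighted Product scoring of alternatives (e.g., digital-library service aspects).
        # Criterion values and weights are purely illustrative.
        import numpy as np

        x = np.array([            # rows: alternatives, columns: criterion scores
            [4.0, 3.0, 5.0],
            [3.0, 4.0, 4.0],
            [5.0, 2.0, 3.0],
        ])
        w = np.array([0.5, 0.3, 0.2])      # weights, assumed normalized to sum to 1
        # (cost-type criteria would enter with a negative exponent)

        s = np.prod(x ** w, axis=1)        # weighted product score S_i
        v = s / s.sum()                    # normalized preference value V_i
        ranking = np.argsort(v)[::-1]
        print(v.round(3), "best alternative:", ranking[0])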

  5. A unified approach to computational drug discovery.

    PubMed

    Tseng, Chih-Yuan; Tuszynski, Jack

    2015-11-01

    It has been reported that a slowdown in the development of new medical therapies is affecting clinical outcomes. The FDA has thus initiated the Critical Path Initiative project investigating better approaches. We review the current strategies in drug discovery and focus on the advantages of the maximum entropy method being introduced in this area. The maximum entropy principle is derived from statistical thermodynamics and has been demonstrated to be an inductive inference tool. We propose a unified approach to drug discovery that hinges on robust information processing using entropic inductive inference. Increasingly, applications of maximum entropy in drug discovery employ this unified approach and demonstrate the usefulness of the concept in the area of pharmaceutical sciences. Copyright © 2015. Published by Elsevier Ltd.
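
    As a concrete illustration of the maximum entropy principle invoked above, the sketch below finds the discrete distribution of maximum Shannon entropy subject to a single moment constraint using a generic convex solver. The support and constraint are invented for illustration and are unrelated to any specific drug-discovery data set.

        # Maximum entropy distribution on {0,...,9} with a prescribed mean, found by
        # directly maximizing entropy under the constraints (illustrative only).
        import numpy as np
        from scipy.optimize import minimize

        support = np.arange(10)
        target_mean = 3.0

        def neg_entropy(p):
            p = np.clip(p, 1e-12, 1.0)
            return np.sum(p * np.log(p))

        constraints = (
            {"type": "eq", "fun": lambda p: p.sum() - 1.0},
            {"type": "eq", "fun": lambda p: p @ support - target_mean},
        )
        p0 = np.full(10, 0.1)
        res = minimize(neg_entropy, p0, bounds=[(0.0, 1.0)] * 10,
                       constraints=constraints, method="SLSQP")
        print(res.x.round(4))   # an exponential-family (Gibbs-like) form, as theory predicts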

  6. Improving the Standards-Based Management-Recognition initiative to provide high-quality, equitable maternal health services in Malawi: an implementation research protocol.

    PubMed

    Mumtaz, Zubia; Salway, Sarah; Nyagero, Josephat; Osur, Joachim; Chirwa, Ellen; Kachale, Fannie; Saunders, Duncan

    2016-01-01

    The Government of Malawi is seeking evidence to improve implementation of its flagship quality of care improvement initiative-the Standards Based Management-Recognition for Reproductive Health (SBM-R(RH)). This implementation study will assess the quality of maternal healthcare in facilities where the SBM-R(RH) initiative has been employed, identify factors that support or undermine effectiveness of the initiative and develop strategies to further enhance its operation. Data will be collected in 4 interlinked modules using quantitative and qualitative research methods. Module 1 will develop the programme theory underlying the SBM-R(RH) initiative, using document review and in-depth interviews with policymakers and programme managers. Module 2 will quantitatively assess the quality and equity of maternal healthcare provided in facilities where the SBM-R(RH) initiative has been implemented, using the Malawi Integrated Performance Standards for Reproductive Health. Module 3 will conduct an organisational ethnography to explore the structures and processes through which SBM-R(RH) is currently operationalised. Barriers and facilitators will be identified. Module 4 will involve coordinated co-production of knowledge by researchers, policymakers and the public, to identify and test strategies to improve implementation of the initiative. The research outcomes will provide empirical evidence of strategies that will enhance the facilitators and address the barriers to effective implementation of the initiative. It will also contribute to the theoretical advances in the emerging science of implementation research.

  7. High speed imaging for assessment of impact damage in natural fibre biocomposites

    NASA Astrophysics Data System (ADS)

    Ramakrishnan, Karthik Ram; Corn, Stephane; Le Moigne, Nicolas; Ienny, Patrick; Leger, Romain; Slangen, Pierre R.

    2017-06-01

    The use of Digital Image Correlation has been generally limited to the estimation of mechanical properties and fracture behaviour at low to moderate strain rates. High speed cameras dedicated to ballistic testing are often used to measure the initial and residual velocities of the projectile but rarely for damage assessment. The evaluation of impact damage is frequently achieved post-impact using visual inspection, ultrasonic C-scan or other NDI methods. Ultra-high speed cameras and developments in image processing have made possible the measurement of surface deformations and stresses in real time during dynamic cracking. In this paper, a method is presented to correlate the force-displacement data from the sensors to the slow motion tracking of the transient failure cracks using real-time high speed imaging. Natural fibre reinforced composites made of flax fibres and a polypropylene matrix were chosen for the study. The creation of macro-cracks during the impact results in the loss of stiffness and a corresponding drop in the force history. However, optical instrumentation shows that the initiation of damage is not always evident and so the assessment of damage requires the use of a local approach. Digital Image Correlation is used to study the strain history of the composite and to identify the initiation and progression of damage. The effect of fly-speckled texture on strain measurement by image correlation is also studied. The developed method can be used for the evaluation of impact damage for different composite materials.
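
    Digital Image Correlation of the kind described above ultimately reduces to tracking the displacement of small speckle subsets between consecutive high-speed frames; the strain field follows from spatial gradients of those displacements. The sketch below shows that core step for a single subset using a phase-correlation routine from scikit-image; the synthetic frames, subset location, and upsampling factor are illustrative assumptions, not the authors' DIC pipeline.

        # Track the displacement of one speckle subset between two frames with
        # sub-pixel phase correlation (illustrative of the DIC core step only).
        import numpy as np
        from scipy import ndimage
        from skimage.registration import phase_cross_correlation

        rng = np.random.default_rng(1)
        frame0 = ndimage.gaussian_filter(rng.random((512, 512)), 2)  # synthetic speckle
        frame1 = ndimage.shift(frame0, (0.6, -1.3))                  # known sub-pixel motion

        y0, x0, win = 200, 240, 64                                   # subset location and size
        ref = frame0[y0:y0 + win, x0:x0 + win]
        cur = frame1[y0:y0 + win, x0:x0 + win]

        shift, error, _ = phase_cross_correlation(ref, cur, upsample_factor=100)
        # shift ~ (-0.6, +1.3): the offset that maps the current subset back onto the reference
        print("estimated subset displacement (dy, dx):", shift)
        # repeating this over a grid of subsets gives the displacement field,
        # whose spatial gradients yield the surface strain history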

  8. Current Status of Single Particle Imaging with X-ray Lasers

    DOE PAGES

    Sun, Zhibin; Fan, Jiadong; Li, Haoyuan; ...

    2018-01-22

    The advent of ultrafast X-ray free-electron lasers (XFELs) opens the tantalizing possibility of the atomic-resolution imaging of reproducible objects such as viruses, nanoparticles, single molecules, clusters, and perhaps biological cells, achieving a resolution for single particle imaging better than a few tens of nanometers. Improving upon this is a significant challenge which has been the focus of a global single particle imaging (SPI) initiative launched in December 2014 at the Linac Coherent Light Source (LCLS), SLAC National Accelerator Laboratory, USA. A roadmap was outlined, and significant multi-disciplinary effort has since been devoted to work on the technical challenges of SPI such as radiation damage, beam characterization, beamline instrumentation and optics, sample preparation and delivery and algorithm development at multiple institutions involved in the SPI initiative. Currently, the SPI initiative has achieved 3D imaging of rice dwarf virus (RDV) and coliphage PR772 viruses at ~10 nm resolution by using soft X-ray FEL pulses at the Atomic Molecular and Optical (AMO) instrument of LCLS. Meanwhile, diffraction patterns with signal above noise up to the corner of the detector with a resolution of ~6 Ångström (Å) were also recorded with hard X-rays at the Coherent X-ray Imaging (CXI) instrument, also at LCLS. Achieving atomic resolution is truly a grand challenge and there is still a long way to go in light of recent developments in electron microscopy. However, the potential for studying dynamics at physiological conditions and capturing ultrafast biological, chemical and physical processes represents a tremendous potential application, attracting continued interest in pursuing further method development. In this paper, we give a brief introduction of SPI developments and look ahead to further method development.

  9. Current Status of Single Particle Imaging with X-ray Lasers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Zhibin; Fan, Jiadong; Li, Haoyuan

    The advent of ultrafast X-ray free-electron lasers (XFELs) opens the tantalizing possibility of the atomic-resolution imaging of reproducible objects such as viruses, nanoparticles, single molecules, clusters, and perhaps biological cells, achieving a resolution for single particle imaging better than a few tens of nanometers. Improving upon this is a significant challenge which has been the focus of a global single particle imaging (SPI) initiative launched in December 2014 at the Linac Coherent Light Source (LCLS), SLAC National Accelerator Laboratory, USA. A roadmap was outlined, and significant multi-disciplinary effort has since been devoted to work on the technical challenges of SPI such as radiation damage, beam characterization, beamline instrumentation and optics, sample preparation and delivery and algorithm development at multiple institutions involved in the SPI initiative. Currently, the SPI initiative has achieved 3D imaging of rice dwarf virus (RDV) and coliphage PR772 viruses at ~10 nm resolution by using soft X-ray FEL pulses at the Atomic Molecular and Optical (AMO) instrument of LCLS. Meanwhile, diffraction patterns with signal above noise up to the corner of the detector with a resolution of ~6 Ångström (Å) were also recorded with hard X-rays at the Coherent X-ray Imaging (CXI) instrument, also at LCLS. Achieving atomic resolution is truly a grand challenge and there is still a long way to go in light of recent developments in electron microscopy. However, the potential for studying dynamics at physiological conditions and capturing ultrafast biological, chemical and physical processes represents a tremendous potential application, attracting continued interest in pursuing further method development. In this paper, we give a brief introduction of SPI developments and look ahead to further method development.

  10. Role of the National Institute of Standards and Technology (NIST) in Support of the Vitamin D Initiative of the National Institutes of Health, Office of Dietary Supplements.

    PubMed

    Wise, Stephen A; Tai, Susan S-C; Burdette, Carolyn Q; Camara, Johanna E; Bedner, Mary; Lippa, Katrice A; Nelson, Michael A; Nalin, Federica; Phinney, Karen W; Sander, Lane C; Betz, Joseph M; Sempos, Christopher T; Coates, Paul M

    2017-09-01

    Since 2005, the National Institute of Standards and Technology (NIST) has collaborated with the National Institutes of Health (NIH), Office of Dietary Supplements (ODS) to improve the quality of measurements related to human nutritional markers of vitamin D status. In support of the NIH-ODS Vitamin D Initiative, including the Vitamin D Standardization Program (VDSP), NIST efforts have focused on (1) development of validated analytical methods, including reference measurement procedures (RMPs); (2) development of Standard Reference Materials (SRMs); (3) value assignment of critical study samples using NIST RMPs; and (4) development and coordination of laboratory measurement QA programs. As a result of this collaboration, NIST has developed RMPs for 25-hydroxyvitamin D2 [25(OH)D2], 25(OH)D3, and 24R,25-dihydroxyvitamin D3 [24R,25(OH)2D3]; disseminated serum-based SRMs with values assigned for 25(OH)D2, 25(OH)D3, 3-epi-25(OH)D3, and 24R,25(OH)2D3; assigned values for critical samples for VDSP studies, including an extensive interlaboratory comparison and reference material commutability study; provided an accuracy basis for the Vitamin D External Quality Assurance Scheme; coordinated the first accuracy-based measurement QA program for the determination of 25(OH)D2, 25(OH)D3, and 3-epi-25(OH)D3 in human serum/plasma; and developed methods and SRMs for the determination of vitamin D and 25(OH)D in food and supplement matrix SRMs. The details of these activities and their benefit and impact to the NIH-ODS Vitamin D Initiative are described.

  11. Publication Guidelines for Quality Improvement Studies in Health Care: Evolution of the SQUIRE Project

    PubMed Central

    Batalden, Paul; Stevens, David; Ogrinc, Greg; Mooney, Susan

    2008-01-01

    In 2005 we published draft guidelines for reporting studies of quality improvement interventions as the initial step in a consensus process for development of a more definitive version. The current article contains the revised version, which we refer to as SQUIRE (Standards for QUality Improvement Reporting Excellence). We describe the consensus process, which included informal feedback, formal written commentaries, input from publication guideline developers, review of the literature on the epistemology of improvement and on methods for evaluating complex social programs, and a meeting of stakeholders for critical review of the guidelines’ content and wording, followed by commentary on sequential versions from an expert consultant group. Finally, we examine major differences between SQUIRE and the initial draft, and consider limitations of and unresolved questions about SQUIRE; we also describe ancillary supporting documents and alternative versions under development, and plans for dissemination, testing, and further development of SQUIRE. PMID:18830766

  12. High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME

    NASA Astrophysics Data System (ADS)

    Otis, Richard A.; Liu, Zi-Kui

    2017-05-01

    One foundational component of the integrated computational materials engineering (ICME) and Materials Genome Initiative is the computational thermodynamics based on the calculation of phase diagrams (CALPHAD) method. The CALPHAD method pioneered by Kaufman has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, our recent efforts will be presented in terms of developing new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput, first-principles calculations and the CALPHAD method along with their potential propagations to downstream ICME modeling and simulations.
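
    At the core of the CALPHAD method discussed above is a parameterized molar Gibbs energy for each phase. For a binary substitutional solution phase \varphi, the conventional Redlich-Kister form (standard in the CALPHAD literature, quoted here for context rather than taken from this article) is

        G_m^{\varphi} = x_A\,{}^{0}G_A^{\varphi} + x_B\,{}^{0}G_B^{\varphi} + RT\,(x_A \ln x_A + x_B \ln x_B) + x_A x_B \sum_{v} {}^{v}L_{A,B}^{\varphi} (x_A - x_B)^{v},

    where the interaction parameters {}^{v}L_{A,B}^{\varphi} (often linear functions of temperature) are the quantities assessed from experimental and first-principles data and stored in the databases mentioned above.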

  13. Developing a laser shockwave model for characterizing diffusion bonded interfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lacy, Jeffrey M., E-mail: Jeffrey.Lacy@inl.gov; Smith, James A.; Rabin, Barry H.

    2015-03-31

    The US National Nuclear Security Agency has a Global Threat Reduction Initiative (GTRI) with the goal of reducing the worldwide use of high-enriched uranium (HEU). A salient component of that initiative is the conversion of research reactors from HEU to low enriched uranium (LEU) fuels. An innovative fuel is being developed to replace HEU in high-power research reactors. The new LEU fuel is a monolithic fuel made from a U-Mo alloy foil encapsulated in Al-6061 cladding. In order to support the fuel qualification process, the Laser Shockwave Technique (LST) is being developed to characterize the clad-clad and fuel-clad interface strengths in fresh and irradiated fuel plates. LST is a non-contact method that uses lasers for the generation and detection of large amplitude acoustic waves to characterize interfaces in nuclear fuel plates. However, because the deposition of laser energy into the containment layer on a specimen's surface is intractably complex, the shock wave energy is inferred from the surface velocity measured on the backside of the fuel plate and the depth of the impression left on the surface by the high pressure plasma pulse created by the shock laser. To help quantify the stresses generated at the interfaces, a finite element method (FEM) model is being utilized. This paper will report on initial efforts to develop and validate the model by comparing numerical and experimental results for back surface velocities and front surface depressions in a single aluminum plate representative of the fuel cladding.

  14. Procedure for analysis and design of weaving sections : volume 2, users guide.

    DOT National Transportation Integrated Search

    1983-12-01

    This research was performed to complete and advance the status of recently developed procedures for analysis and design of weaving sections (known as the Leisch method and-initially published in the 1979 issue of ITE Journal). The objective was to en...

  15. Development of Alternating Current Potential Drop (ACPD) Procedures for Crack Detection in Aluminum Aircraft Panels.

    DOT National Transportation Integrated Search

    1993-12-01

    The Alternating Current Potential Drop (ACPD) method is investigated as a means of making measurements in laboratory experiments on the initiation and growth of multiple site damage (MSD) cracks in a common aluminum alloy used for aircraft constructi...

  16. 75 FR 12753 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-17

    ... effective at improving health care quality. While evidence-based approaches for decisionmaking have become standard in healthcare, this has been limited in laboratory medicine. No single evidence-based model for... (LMBP) initiative to develop new systematic evidence review methods for making evidence-based...

  17. ANALYTICAL METHOD DEVELOPMENT FOR THE ANALYSIS OF N-NITROSODIMETHYLAMINE (NDMA) IN DRINKING WATER

    EPA Science Inventory

    N-Nitrosodimethylamine (NDMA), a by-product of the manufacture of liquid rocket fuel, has recently been identified as a contaminant in several California drinking water sources. The initial source of the contamination was identified as an aerospace facility. Subsequent testing ...

  18. Creating Livable Communities

    ERIC Educational Resources Information Center

    Oberlink, Mia R.

    2006-01-01

    This report identifies barriers to developing livable communities and sheds light on potential methods for overcoming these barriers. It identifies and highlights multiple strategies that may be applied to the design and support of livable community principles. The identified strategies have been initiated by federal and state government agencies…

  19. DETERMINATION OF METALS IN COMPOSITE DIET SAMPLES BY ICP-MS

    EPA Science Inventory

    In order to assess an individual's total exposure to contaminants in the environment, it is essential that the contribution of dietary exposure be quantified. As a result, USEPA's National Exposure Research Laboratory has initiated a program to develop methods to measure chemical...

  20. DETERMINATION OF METALS IN COMPOSITE DIET SAMPLES BY ICP-MS

    EPA Science Inventory

    In order to assess an individual's total exposure to contaminants in the environment, it is essential that the contribution of dietary exposure be quantified. As a result, USEPA's National Exposure Research Laboratory has initiated a program to develop methods to measure chemica...

  1. EUTROPHICATION OF COASTAL WATER BODIES: RELATIONSHIPS BETWEEN NUTRIENT LOADING AND ECOLOGICAL RESPONSE

    EPA Science Inventory

    This newly initiated research will provide environmental managers with an empirical method to develop regional nutrient input limits for East Coast estuaries/coastal water bodies. The goal will be to reduce the current uncertainty associated with nutrient load-response relationsh...

  2. DEVELOPMENT OF DNA MICROARRAYS FOR ECOLOGICAL EXPOSURE ASSESSMENT

    EPA Science Inventory

    EPA/ORD is moving forward with a computational toxicology initiative in FY 04 which aims to integrate genomics and computational methods to provide a mechanistic basis for prediction of exposure and effects of chemical stressors in the environment.

    The goal of the presen...

  3. The Development of Accepted Performance Items to Demonstrate Braille Competence in the Nemeth Code for Mathematics and Science Notation

    ERIC Educational Resources Information Center

    Smith, Derrick; Rosenblum, L. Penny

    2013-01-01

    Introduction: The purpose of the study presented here was the initial validation of a comprehensive set of competencies focused solely on the Nemeth code. Methods: Using the Delphi method, 20 expert panelists were recruited to participate in the study on the basis of their past experience in teaching a university-level course in the Nemeth code.…

  4. Quantitative PCR for Tracking the Megaplasmid-Borne Biodegradation Potential of a Model Sphingomonad

    PubMed Central

    Hartmann, Erica M.; Badalamenti, Jonathan P.; Krajmalnik-Brown, Rosa

    2012-01-01

    We developed a quantitative PCR method for tracking the dxnA1 gene, the initial, megaplasmid-borne gene in Sphingomonas wittichii RW1's dibenzo-p-dioxin degradation pathway. We used this method on complex environmental samples and report on growth of S. wittichii RW1 in landfill leachate, thus furnishing a novel tool for monitoring megaplasmid-borne, dioxygenase-encoding genes. PMID:22492441

  5. Introducing Research Initiatives into Healthcare: What Do Doctors Think?

    PubMed Central

    Wyld, Lucy; Smith, Sian; Hawkins, Nicholas J.; Long, Janet

    2014-01-01

    Background: Current national and international policies emphasize the need to develop research initiatives within our health care system. Institutional biobanking represents a modern, large-scale research initiative that is reliant upon the support of several aspects of the health care organization. This research project aims to explore doctors' views on the concept of institutional biobanking and to gain insight into the factors which impact the development of research initiatives within healthcare systems. Methods: Qualitative research study using semi-structured interviews. The research was conducted across two public teaching hospitals in Sydney, Australia where institutional biobanking was being introduced. Twenty-five participants were interviewed, of whom 21 were medical practitioners at the specialist trainee level or above in a specialty directly related to biobanking; four were key stakeholders responsible for the design and implementation of the biobanking initiative. Results: All participants strongly supported the concept of institutional biobanking. Participants highlighted the discordance between the doctors who work to establish the biobank (the contributors) and the researchers who use it (the consumers). Participants identified several barriers that limit the success of research initiatives in the hospital setting including: the ‘resistance to change’ culture; the difficulties in engaging health professionals in research initiatives; and the lack of incentives offered to doctors for their contribution. Doctors positively valued the opportunity to advise the implementation team, and felt that the initiative could benefit from their knowledge and expertise. Conclusion: Successful integration of research initiatives into hospitals requires early collaboration between the implementing team and the health care professionals to produce a plan that is sensitive to the needs of the health professionals and tailored to the hospital setting. Research initiatives must consider incentives that encourage doctors to adopt operational responsibility for hospital research initiatives. PMID:24749875

  6. Examining the Fieldwork Experience from the Site Supervisor Perspective: A Mixed-Methods Study Using Vygotsky's Zone of Proximal Development Theory

    ERIC Educational Resources Information Center

    Brannon, Sian

    2013-01-01

    The purpose of this study was to identify feelings and behaviors of fieldwork supervisors in public libraries using Lev Vygotsky's Zone of Proximal Development theory as a background for design, analysis, and discussion of results. This research sought to find out how fieldwork supervisors perform initial assessments of their fieldwork students,…

  7. Development and Standardization of the Diagnostic Adaptive Behavior Scale: Application of Item Response Theory to the Assessment of Adaptive Behavior

    ERIC Educational Resources Information Center

    Tassé, Marc J.; Schalock, Robert L.; Thissen, David; Balboni, Giulia; Bersani, Henry, Jr.; Borthwick-Duffy, Sharon A.; Spreat, Scott; Widaman, Keith F.; Zhang, Dalun; Navas, Patricia

    2016-01-01

    The Diagnostic Adaptive Behavior Scale (DABS) was developed using item response theory (IRT) methods and was constructed to provide the most precise and valid adaptive behavior information at or near the cutoff point of making a decision regarding a diagnosis of intellectual disability. The DABS initial item pool consisted of 260 items. Using IRT…

  8. In Preparation of the Nationwide Dissemination of the School-Based Obesity Prevention Program DOiT: Stepwise Development Applying the Intervention Mapping Protocol

    ERIC Educational Resources Information Center

    van Nassau, Femke; Singh, Amika S.; van Mechelen, Willem; Brug, Johannes; Chin A. Paw, Mai J. M.

    2014-01-01

    Background: The school-based Dutch Obesity Intervention in Teenagers (DOiT) program is an evidence-based obesity prevention program. In preparation for dissemination throughout the Netherlands, this study aimed to adapt the initial program and to develop an implementation strategy and materials. Methods: We revisited the Intervention Mapping (IM)…

  9. Development and Psychometric Evaluation of the Reasons for Living-Older Adults Scale: A Suicide Risk Assessment Inventory

    ERIC Educational Resources Information Center

    Edelstein, Barry A.; Heisel, Marnin J.; McKee, Deborah R.; Martin, Ronald R.; Koven, Lesley P.; Duberstein, Paul R.; Britton, Peter C.

    2009-01-01

    Purpose: The purposes of these studies were to develop and initially evaluate the psychometric properties of the Reasons for Living Scale-Older Adult version (RFL-OA), an older adults version of a measure designed to assess reasons for living among individuals at risk for suicide. Design and Methods: Two studies are reported. Study 1 involved…

  10. A Gompertz population model with Allee effect and fuzzy initial values

    NASA Astrophysics Data System (ADS)

    Amarti, Zenia; Nurkholipah, Nenden Siti; Anggriani, Nursanti; Supriatna, Asep K.

    2018-03-01

    Growth and population dynamics models are important tools that help society prepare good management plans by predicting the future of a population or species. This has been done by various known methods; one of them is to develop a mathematical model that describes population growth. Models are usually formulated as differential equations or systems of differential equations, depending on the complexity of the underlying properties of the population. One example of biological complexity is the Allee effect, a phenomenon in which, at very small population sizes, population size is strongly correlated with the mean individual fitness of the population. In this paper the population growth model used is the Gompertz equation, modified to include the Allee effect. We explore the properties of the solution to the model numerically using the Runge-Kutta method. Further exploration is done via a fuzzy theoretical approach to accommodate uncertainty in the initial values of the model. It is known that an initial value greater than the Allee threshold causes the solution to rise towards the carrying capacity asymptotically, whereas an initial value smaller than the Allee threshold causes the solution to decrease towards zero asymptotically, meaning the population eventually goes extinct. Numerical solutions show that treating an initial value near the critical point A (the Allee threshold) as fuzzy rather than crisp can lead to extinction of the population to a certain possibilistic degree, depending on the predetermined membership function of the initial value.
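
    The abstract does not give the exact equation, but a common way to append a strong Allee threshold A to the Gompertz law is the form used in the sketch below, which reproduces the qualitative behavior described: solutions started above A approach the carrying capacity K, while solutions started below A decay to zero. The model form and parameter values are assumptions for illustration only.

        # Gompertz growth with a strong Allee threshold, integrated with a Runge-Kutta
        # scheme (scipy's RK45).  Model form and parameters are illustrative assumptions.
        import numpy as np
        from scipy.integrate import solve_ivp

        r, K, A = 0.5, 100.0, 20.0          # growth rate, carrying capacity, Allee threshold

        def gompertz_allee(t, n):
            n = max(n[0], 1e-9)             # keep the log argument positive
            return [r * n * np.log(K / n) * (n / A - 1.0)]

        t_span, t_eval = (0.0, 40.0), np.linspace(0.0, 40.0, 200)
        for n0 in (10.0, 30.0):             # below and above the Allee threshold
            sol = solve_ivp(gompertz_allee, t_span, [n0], t_eval=t_eval, method="RK45")
            print(f"N0 = {n0:5.1f} -> N(40) = {sol.y[0, -1]:7.2f}")
        # expected: the first run collapses toward 0, the second saturates near K = 100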

  11. The legacy of the Child Health and Nutrition Research Initiative (CHNRI).

    PubMed

    Black, Robert E

    2016-06-01

    Under the Global Forum for Health Research, the Child Health and Nutrition Research Initiative (CHNRI) began its operations in 1999 and became a Swiss foundation in 2006. The vision of CHNRI was to improve child health and nutrition of all children in low- and middle-income countries (LMIC) through research that informs health policy and practice. Specific objectives included expanding global knowledge on childhood disease burden and cost-effectiveness of interventions, promoting priority setting in research, ensuring inclusion of institutions and scientists in LMIC in setting priorities, promoting capacity development in LMIC and stimulating donors and countries to increase resources for research. CHNRI created a knowledge network, funded research through multiple rounds of a global competitive process and published research papers and policy briefs. A signature effort was to develop a systematic methodology for prioritizing health and nutrition research investments. The "CHNRI method" has been extensively applied to global health problems and is now the most commonly used method for prioritizing health research questions.

  12. Optimal cooperative time-fixed impulsive rendezvous

    NASA Technical Reports Server (NTRS)

    Mirfakhraie, Koorosh; Conway, Bruce A.; Prussing, John E.

    1988-01-01

    A method has been developed for determining optimal, i.e., minimum fuel, trajectories for the fixed-time cooperative rendezvous of two spacecraft. The method presently assumes that the vehicles perform a total of three impulsive maneuvers with each vehicle being active, that is, making at least one maneuver. The cost of a feasible 'reference' trajectory is improved by an optimizer which uses an analytical gradient developed using primer vector theory and a new solution for the optimal terminal (rendezvous) maneuver. Results are presented for a large number of cases in which the initial orbits of both vehicles are circular but in which the initial positions of the vehicles and the allotted time for rendezvous are varied. In general, the cost of the cooperative rendezvous is less than that of rendezvous with one vehicle passive. Further improvement in cost may be obtained in the future when additional, i.e., midcourse, impulses are allowed and inserted as indicated for some cases by the primer vector histories which are generated by the program.
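
    For context, the primer vector p(t) used in analyses of this kind is the velocity adjoint; along a coasting arc it satisfies the linear variational equation, and Lawden's necessary conditions for an optimal impulsive trajectory (standard primer vector theory, restated here rather than taken from the paper) are

        \ddot{\mathbf{p}} = G(\mathbf{r})\,\mathbf{p}, \qquad G(\mathbf{r}) = \frac{\partial \mathbf{g}}{\partial \mathbf{r}},

    with |\mathbf{p}(t)| \le 1 on coasting arcs, |\mathbf{p}| = 1 at each impulse with \mathbf{p} aligned with the impulse direction, and \mathrm{d}|\mathbf{p}|/\mathrm{d}t = 0 at interior impulses. A primer magnitude exceeding unity on a coast indicates that inserting an additional midcourse impulse can lower the cost, which is how the primer vector histories generated by the program indicate where extra impulses should be added.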

  13. Advanced superposition methods for high speed turbopump vibration analysis

    NASA Technical Reports Server (NTRS)

    Nielson, C. E.; Campany, A. D.

    1981-01-01

    The small, high pressure Mark 48 liquid hydrogen turbopump was analyzed and dynamically tested to determine the cause of high speed vibration at an operating speed of 92,400 rpm. This approaches the design point operating speed of 95,000 rpm. The initial dynamic analysis in the design stage and subsequent further analysis of the rotor only dynamics failed to predict the vibration characteristics found during testing. An advanced procedure for dynamics analysis was used in this investigation. The procedure involves developing accurate dynamic models of the rotor assembly and casing assembly by finite element analysis. The dynamically instrumented assemblies are independently rap tested to verify the analytical models. The verified models are then combined by modal superposition techniques to develop a completed turbopump model where dynamic characteristics are determined. The results of the dynamic testing and analysis obtained are presented and methods of moving the high speed vibration characteristics to speeds above the operating range are recommended. Recommendations for use of these advanced dynamic analysis procedures during initial design phases are given.

  14. Multiresidue analytical method for pharmaceuticals and personal care products in sewage and sewage sludge by online direct immersion SPME on-fiber derivatization - GCMS.

    PubMed

    López-Serna, Rebeca; Marín-de-Jesús, David; Irusta-Mata, Rubén; García-Encina, Pedro Antonio; Lebrero, Raquel; Fdez-Polanco, María; Muñoz, Raúl

    2018-08-15

    The work here presented aimed at developing an analytical method for the simultaneous determination of 22 pharmaceuticals and personal care products, including 3 transformation products, in sewage and sludge. A meticulous method optimization, involving an experimental design, was carried out. The developed method was fully automated and consisted of the online extraction of 17 mL of water sample by Direct Immersion Solid Phase MicroExtraction followed by On-fiber Derivatization coupled to Gas Chromatography - Mass Spectrometry (DI-SPME - On-fiber Derivatization - GC - MS). This methodology was validated for 12 of the initial compounds as a reliable (relative recoveries above 90% for sewage and 70% for sludge; repeatability as %RSD below 10% in all cases), sensitive (LODs below 20 ng L⁻¹ in sewage and 10 ng g⁻¹ in sludge), versatile (sewage and sewage-sludge samples up to 15,000 ng L⁻¹ and 900 ng g⁻¹, respectively) and green analytical alternative for many medium-tech routine laboratories around the world to keep up with both current and forecast environmental regulations requirements. The remaining 10 analytes initially considered showed insufficient suitability to be included in the final method. The methodology was successfully applied to real samples generated in a pilot scale sewage treatment reactor. Copyright © 2018 Elsevier B.V. All rights reserved.

  15. Improvements in Block-Krylov Ritz Vectors and the Boundary Flexibility Method of Component Synthesis

    NASA Technical Reports Server (NTRS)

    Carney, Kelly Scott

    1997-01-01

    A method of dynamic substructuring is presented which utilizes a set of static Ritz vectors as a replacement for normal eigenvectors in component mode synthesis. This set of Ritz vectors is generated in a recurrence relationship, proposed by Wilson, which has the form of a block-Krylov subspace. The initial seed to the recurrence algorithm is based upon the boundary flexibility vectors of the component. Improvements have been made in the formulation of the initial seed to the Krylov sequence, through the use of block-filtering. A method to shift the Krylov sequence to create Ritz vectors that will represent the dynamic behavior of the component at target frequencies, the target frequency being determined by the applied forcing functions, has been developed. A method to terminate the Krylov sequence has also been developed. Various orthonormalization schemes have been developed and evaluated, including the Cholesky/QR method. Several auxiliary theorems and proofs which illustrate issues in component mode synthesis and loss of orthogonality in the Krylov sequence have also been presented. The resulting methodology is applicable to both fixed and free-interface boundary components, and results in a general component model appropriate for any type of dynamic analysis. The accuracy is found to be comparable to that of component synthesis based upon normal modes, using fewer generalized coordinates. In addition, the block-Krylov recurrence algorithm is a series of static solutions and so requires significantly less computation than solving the normal eigenspace problem. The requirement for fewer vectors to form the component, coupled with the lower computational expense of calculating these Ritz vectors, combines to create a method more efficient than traditional component mode synthesis.
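
    The recurrence at the heart of the method above is a sequence of static solutions: the first block of vectors is the static response to the boundary-flexibility (or applied) load block, and each subsequent block is the static response to the inertia forces of the previous one, kept M-orthonormal. A bare-bones single-vector version with dense toy matrices is sketched below; the block handling, filtering, shifting, and Cholesky/QR orthonormalization of the dissertation are omitted.

        # Load-dependent (Wilson/Ritz) vectors by a Krylov-type recurrence of static
        # solves: K x1 = f, then K x_{i+1} = M x_i, with M-orthonormalization.
        import numpy as np

        def ritz_vectors(K, M, f, n_vec):
            X = []
            x = np.linalg.solve(K, f)                 # static response to the load
            x /= np.sqrt(x @ M @ x)                   # M-normalize
            X.append(x)
            for _ in range(n_vec - 1):
                x = np.linalg.solve(K, M @ X[-1])     # static response to inertia forces
                for v in X:                           # Gram-Schmidt in the M-inner product
                    x -= (v @ M @ x) * v
                x /= np.sqrt(x @ M @ x)
                X.append(x)
            return np.column_stack(X)

        # toy 5-DOF spring-mass chain with a tip load
        n = 5
        K = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
        M = np.eye(n)
        f = np.zeros(n); f[-1] = 1.0
        R = ritz_vectors(K, M, f, 3)
        print(R.T @ M @ R)        # ~ identity: the vectors are M-orthonormal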

  16. Tracing children's vocabulary development from preschool through the school-age years: An 8-year longitudinal study

    PubMed Central

    Kang, Cuiping; Liu, Hongyun; Zhang, Yuping; McBride-Chang, Catherine; Tardif, Twila; Li, Hong; Liang, Weilan; Zhang, Zhixiang; Shu, Hua

    2014-01-01

    In this 8-year longitudinal study, we traced the vocabulary growth of Chinese children, explored potential precursors of vocabulary knowledge, and investigated how vocabulary growth predicted future reading skills. Two hundred sixty-four (264) native Chinese children from Beijing were measured on a variety of reading and language tasks over 8 years. Between the ages of 4 to 10 years, they were administered tasks of vocabulary and related cognitive skills. At age 11, comprehensive reading skills, including character recognition, reading fluency, and reading comprehension were examined. Individual differences in vocabulary developmental profiles were estimated using the intercept-slope cluster method. Vocabulary development was then examined in relation to later reading outcomes. Three subgroups of lexical growth were classified, namely high-high (with a large initial vocabulary size and a fast growth rate), low-high (with a small initial vocabulary size and a fast growth rate) and low-low (with a small initial vocabulary size and a slow growth rate) groups. Low-high and low-low groups were distinguishable mostly through phonological skills, morphological skills and other reading-related cognitive skills. Childhood vocabulary development (using intercept and slope) explained subsequent reading skills. Findings suggest that language-related and reading-related cognitive skills differ among groups with different developmental trajectories of vocabulary, and the initial size and growth rate of vocabulary may be two predictors for later reading development. PMID:24962559

  17. Training Methods and Training Instructors' Qualification Are Related to Recruits' Fitness Development During Basic Military Training.

    PubMed

    Roos, Lilian; Hofstetter, Marie-Claire; Mäder, Urs; Wyss, Thomas

    2015-11-01

    Adequate physical fitness is essential for successful military service. Military organizations worldwide therefore make continuous efforts to improve their army's physical training (PT) programs. To investigate the effect of the training methods and the qualification of PT instructors on the development of recruits' physical fitness, the present study compared the outcomes of 2 training groups. Both study groups participated in approximately 145 minutes per week of PT. The control group executed the standard army PT prepared and supervised by army PT instructors. Content of the PT in the intervention group was similar to that of the control group, but their training sessions' methods were different. Their training sessions were organized, prepared, and delivered by more and better-qualified supervisors (tertiary-educated physical education teachers). After 10 weeks of training, the participants of the intervention group experienced a significantly greater physical fitness improvement than those of the control group (positive change in endurance 32 and 17%, balance 30 and 21%, and core strength 74 and 45%, respectively). In both groups, the recruits with the lowest initial fitness levels significantly increased their performance. In the intervention group, but not the control, one-third of the recruits with the highest initial fitness levels were able to further improve their general fitness performance. This study demonstrates that the training methods and quality of instruction during PT sessions are relevant for recruits' fitness development in basic military training.

  18. Space Manufacturing: The Next Great Challenge

    NASA Technical Reports Server (NTRS)

    Whitaker, Ann F.; Curreri, Peter; Sharpe, Jonathan B.; Colberg, Wendell R.; Vickers, John H.

    1998-01-01

    Space manufacturing encompasses the research, development and manufacture necessary for the production of any product to be used in near zero gravity, and the production of spacecraft required for transporting research or production devices to space. Manufacturing for space, and manufacturing in space will require significant breakthroughs in materials and manufacturing technology, as well as in equipment designs. This report reviews some of the current initiatives in achieving space manufacturing. The first initiative deals with materials processing in space, e.g., processing non-terrestrial and terrestrial materials, especially metals. Some of the ramifications of the United States Microgravity Payloads fourth (USMP-4) mission are discussed. Some problems in non-terrestrial materials processing are mentioned. The second initiative is structures processing in space. In order to accomplish this, the International Space Welding Experiment was designed to demonstrate welding technology in near-zero gravity. The third initiative is advancements in earth-based manufacturing technologies necessary to achieve low cost access to space. The advancements discussed include development of lightweight material having high specific strength, and automated fabrication and manufacturing methods for these materials.

  19. Tobacco-related disease burden and preventive initiatives in China. Global health and the chronic diseases: perspective, policy and practice.

    PubMed

    Niu, Bolin

    2011-06-01

    The burden of chronic diseases in global health is a surging area of research. The Global Health Initiative at the National Heart, Lung, and Blood Institute brings together investigators from developing countries with those from the developed world to study these diseases. In China, approximately 83 percent of all deaths in 2000 were attributed to chronic illnesses, which are the research focuses of the Chinese center of the Global Health Initiative. Tobacco use as well as passive smoking are modifiable risk factors in a large number of such chronic conditions. The prevalence of smoking in China is extensive and has inseparable ties to the economy, with tobacco taxes making up a large portion of government revenue in poorer provinces. Methods of smoking prevention have been piloted in some Chinese schools, which have mitigated the increase in smoking rate but have not resulted in a primary preventive effect. Efforts by the Yale Global Health Initiative and the Yale-China Association are bringing researchers together to address chronic disease in China as Yale School of Medicine enters its 200th year.

  20. Dynamic Modeling of Solar Dynamic Components and Systems

    NASA Technical Reports Server (NTRS)

    Hochstein, John I.; Korakianitis, T.

    1992-01-01

    The purpose of this grant was to support NASA in modeling efforts to predict the transient dynamic and thermodynamic response of the space station solar dynamic power generation system. In order to meet the initial schedule requirement of providing results in time to support installation of the system as part of the initial phase of the space station, early efforts were executed with alacrity and often in parallel. Initially, methods to predict the transient response of a Rankine as well as a Brayton cycle were developed. Review of preliminary design concepts led NASA to select a regenerative gas-turbine cycle using a helium-xenon mixture as the working fluid and, from that point forward, the modeling effort focused exclusively on that system. Although initial project planning called for a three year period of performance, revised NASA schedules moved system installation to later and later phases of station deployment. Eventually, NASA elected to halt development of the solar dynamic power generation system for the space station and to reduce support for this project to two-thirds of the original level.

  1. Axisymmetric inlet minimum weight design method

    NASA Technical Reports Server (NTRS)

    Nadell, Shari-Beth

    1995-01-01

    An analytical method for determining the minimum weight design of an axisymmetric supersonic inlet has been developed. The goal of this method development project was to improve the ability to predict the weight of high-speed inlets in conceptual and preliminary design. The initial model was developed using information that was available from inlet conceptual design tools (e.g., the inlet internal and external geometries and pressure distributions). Stiffened shell construction was assumed. Mass properties were computed by analyzing a parametric cubic curve representation of the inlet geometry. Design loads and stresses were developed at analysis stations along the length of the inlet. The equivalent minimum structural thicknesses for both shell and frame structures required to support the maximum loads produced by various load conditions were then determined. Preliminary results indicated that inlet hammershock pressures produced the critical design load condition for a significant portion of the inlet. By improving the accuracy of inlet weight predictions, the method will improve the fidelity of propulsion and vehicle design studies and increase the accuracy of weight versus cost studies.

  2. Sleeve Push Technique: A Novel Method of Space Gaining.

    PubMed

    Verma, Sanjeev; Bhupali, Nameksh Raj; Gupta, Deepak Kumar; Singh, Sombir; Singh, Satinder Pal

    2018-01-01

    Space gaining is frequently required in orthodontics. Multiple loops were initially used for space gaining and alignment. The most commonly used mechanics for space gaining is the use of nickel-titanium open coil springs. The disadvantage of nickel-titanium coil springs is that they cannot be used until the arches are well aligned to receive the stiffer stainless steel wires. Therefore, a new method of gaining space during initial alignment and leveling has been developed and named the sleeve push technique (SPT). The nickel-titanium wires (0.012 inch and 0.014 inch), along with an archwire sleeve (protective tubing), can be used in a modified way to gain space along with alignment. This method helps in gaining space right from day 1 of treatment. The archwire sleeve and nickel-titanium wire in this new SPT act as a mutually synergistic combination and provide the orthodontist with a completely new technique for space opening.

  3. Towards adjoint-based inversion of time-dependent mantle convection with nonlinear viscosity

    NASA Astrophysics Data System (ADS)

    Li, Dunzhu; Gurnis, Michael; Stadler, Georg

    2017-04-01

    We develop and study an adjoint-based inversion method for the simultaneous recovery of initial temperature conditions and viscosity parameters in time-dependent mantle convection from the current mantle temperature and historic plate motion. Based on a realistic rheological model with temperature-dependent and strain-rate-dependent viscosity, we formulate the inversion as a PDE-constrained optimization problem. The objective functional includes the misfit of surface velocity (plate motion) history, the misfit of the current mantle temperature, and a regularization for the uncertain initial condition. The gradient of this functional with respect to the initial temperature and the uncertain viscosity parameters is computed by solving the adjoint of the mantle convection equations. This gradient is used in a pre-conditioned quasi-Newton minimization algorithm. We study the prospects and limitations of the inversion, as well as the computational performance of the method using two synthetic problems, a sinking cylinder and a realistic subduction model. The subduction model is characterized by the migration of a ridge toward a trench whereby both plate motions and subduction evolve. The results demonstrate: (1) for known viscosity parameters, the initial temperature can be well recovered, as in previous initial condition-only inversions where the effective viscosity was given; (2) for known initial temperature, viscosity parameters can be recovered accurately, despite the existence of trade-offs due to ill-conditioning; (3) for the joint inversion of initial condition and viscosity parameters, initial condition and effective viscosity can be reasonably recovered, but the high dimension of the parameter space and the resulting ill-posedness may limit recovery of viscosity parameters.
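
    The objective functional described above can be written schematically (the notation and weights here are assumptions for illustration; the paper's exact norms may differ) as

        J(T_0, \theta) = \frac{1}{2}\int_0^{t_f}\!\!\int_{\Gamma} \bigl|\mathbf{u}(\mathbf{x},t;T_0,\theta) - \mathbf{u}_{\mathrm{plate}}(\mathbf{x},t)\bigr|^2 \, \mathrm{d}\Gamma\,\mathrm{d}t + \frac{\beta}{2}\int_{\Omega} \bigl|T(\mathbf{x},t_f) - T_{\mathrm{obs}}(\mathbf{x})\bigr|^2 \, \mathrm{d}\Omega + \frac{\alpha}{2}\,\mathcal{R}(T_0),

    where \mathbf{u} is the surface velocity predicted by the forward mantle-convection model, T(\cdot, t_f) is the predicted present-day temperature, and \mathcal{R}(T_0) regularizes the uncertain initial temperature. The gradient of J with respect to T_0 and the viscosity parameters \theta is then obtained from a single adjoint solve and drives the preconditioned quasi-Newton iteration.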

  4. The Influence of Pretreatment Characteristics and Radiotherapy Parameters on Time Interval to Development of Radiation-Associated Meningioma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paulino, Arnold C., E-mail: apaulino@tmhs.or; Ahmed, Irfan M.; Mai, Wei Y.

    2009-12-01

    Purpose: To identify pretreatment characteristics and radiotherapy parameters which may influence time interval to development of radiation-associated meningioma (RAM). Methods and Materials: A Medline/PUBMED search of articles dealing with RAM yielded 66 studies between 1981 and 2006. Factors analyzed included patient age and gender, type of initial tumor treated, radiotherapy (RT) dose and volume, and time interval from RT to development of RAM. Results: A total of 143 patients with a median age at RT of 12 years form the basis of this report. The most common initial tumors or conditions treated with RT were medulloblastoma (n = 27), pituitary adenoma (n = 20), acute lymphoblastic leukemia (n = 20), low-grade astrocytoma (n = 19), and tinea capitis (n = 14). In the 116 patients whose RT fields were known, 55 (47.4%) had a portion of the brain treated, whereas 32 (27.6%) and 29 (25.0%) had craniospinal and whole-brain fields. The median time from RT to develop a RAM or latent time (LT) was 19 years (range, 1-63 years). Male gender (p = 0.001), initial diagnosis of leukemia (p = 0.001), and use of whole brain or craniospinal field (p <= 0.0001) were associated with a shorter LT, whereas patients who received lower doses of RT had a longer LT (p < 0.0001). Conclusions: The latent time to develop a RAM was related to gender, initial tumor type, radiotherapy volume, and radiotherapy dose.

  5. Feynman path integral application on deriving black-scholes diffusion equation for european option pricing

    NASA Astrophysics Data System (ADS)

    Utama, Briandhika; Purqon, Acep

    2016-08-01

    Path Integral is a method to transform a function from its initial condition to its final condition by multiplying the initial condition with a transition probability function, known as the propagator. In its early development, several studies focused on applying this method only to problems in Quantum Mechanics. Nevertheless, Path Integral can also be applied to other subjects with some modifications of the propagator function. In this study, we investigate the application of the Path Integral method to financial derivatives, specifically stock options. The Black-Scholes Model (Nobel 1997) was a starting anchor in Option Pricing studies. Although this model does not predict option prices perfectly, especially because of its sensitivity to major market changes, the Black-Scholes Model is still a legitimate equation for pricing an option. The derivation of Black-Scholes is highly difficult because it is a stochastic partial differential equation. The Black-Scholes equation shares a principle with the Path Integral, in that the share's initial price is transformed to its final price. The Black-Scholes propagator function is then derived by introducing a modified Lagrangian based on the Black-Scholes equation. Furthermore, we study the correlation between the path integral analytical solution and the Monte-Carlo numerical solution to find the similarity between these two methods.
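
    For reference, the equation in question and the propagator relation exploited in the path-integral treatment take the standard forms below (notation assumed, with r the risk-free rate and sigma the volatility).

```latex
\frac{\partial V}{\partial t}
+\frac{1}{2}\sigma^{2}S^{2}\frac{\partial^{2} V}{\partial S^{2}}
+rS\frac{\partial V}{\partial S}-rV=0,
\qquad
V(S,t)=\int_{0}^{\infty} K(S,t;S',T)\,V(S',T)\,\mathrm{d}S'
```

    The kernel K plays the role of the transition probability (propagator) connecting the option value at time t to the terminal payoff at expiry T, which is the step that the modified Lagrangian formulation makes explicit.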

  6. Structural dynamics of ribosome subunit association studied by mixing-spraying time-resolved cryo-EM

    PubMed Central

    Chen, Bo; Kaledhonkar, Sandip; Sun, Ming; Shen, Bingxin; Lu, Zonghuan; Barnard, David; Lu, Toh-Ming; Gonzalez, Ruben L.; Frank, Joachim

    2015-01-01

    Ribosomal subunit association is a key checkpoint in translation initiation, but its structural dynamics are poorly understood. Here, we used a recently developed mixing-spraying, time-resolved, cryogenic electron microscopy (cryo-EM) method to study ribosomal subunit association in the sub-second time range. We have improved this method and increased the cryo-EM data yield by tenfold. Pre-equilibrium states of the association reaction were captured by reacting the mixture of ribosomal subunits for 60 ms and 140 ms. We also identified three distinct ribosome conformations in the associated ribosomes. The observed proportions of these conformations are the same in these two time points, suggesting that ribosomes equilibrate among the three conformations within less than 60 ms upon formation. Our results demonstrate that the mixing-spraying method can capture multiple states of macromolecules during a sub-second reaction. Other fast processes, such as translation initiation, decoding and ribosome recycling, are amenable to study with this method. PMID:26004440

  7. Application of ultrasonic signature analysis for fatigue detection in complex structures

    NASA Technical Reports Server (NTRS)

    Zuckerwar, A. J.

    1974-01-01

    Ultrasonic signature analysis shows promise of being a singularly well-suited method for detecting fatigue in structures as complex as aircraft. The method employs instrumentation centered about a Fourier analyzer system, which features analog-to-digital conversion, digital data processing, and digital display of cross-correlation functions and cross-spectra. These features are essential to the analysis of ultrasonic signatures according to the procedure described here. In order to establish the feasibility of the method, the initial experiments were confined to simple plates with simulated and fatigue-induced defects respectively. In the first test the signature proved sensitive to the size of a small hole drilled into the plate. In the second test, performed on a series of fatigue-loaded plates, the signature proved capable of indicating both the initial appearance and subsequent growth of a fatigue crack. In view of these encouraging results it is concluded that the method has reached a sufficiently advanced stage of development to warrant application to small-scale structures or even actual aircraft.
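
    As a loose modern analogue of the signature analysis described above (not the original Fourier-analyzer setup), the cross-correlation and cross-spectrum of transmitted and received ultrasonic records can be computed digitally; the signals and parameters below are synthetic assumptions.

```python
# Illustrative only: digital cross-correlation / cross-spectrum "signature"
# of a transmitted and a received ultrasonic record, using synthetic signals.
import numpy as np
from scipy.signal import csd, correlate

fs = 5.0e6                              # assumed 5 MHz sampling rate
t = np.arange(0, 2e-3, 1.0 / fs)        # 2 ms record
transmitted = np.sin(2 * np.pi * 1.0e5 * t)   # 100 kHz tone burst (assumed)
received = 0.6 * np.roll(transmitted, 40) + 0.05 * np.random.randn(t.size)  # delayed, noisy echo

# Cross-spectrum between transmitted and received signals (the "signature").
f, Pxy = csd(transmitted, received, fs=fs, nperseg=4096)

# Time-domain counterpart: the cross-correlation function.
xcorr = correlate(received, transmitted, mode="full")

print("peak cross-spectral magnitude near", f[np.abs(Pxy).argmax()] / 1e3, "kHz")
```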

  8. Simplified Dynamic Analysis of Grinders Spindle Node

    NASA Astrophysics Data System (ADS)

    Demec, Peter

    2014-12-01

    The contribution deals with the simplified dynamic analysis of a surface grinding machine spindle node. The dynamic analysis is based on the transfer matrix method, which is essentially a matrix form of the method of initial parameters. The advantage of the described method, despite the seemingly complex mathematical apparatus, is primarily that it does not require costly commercial finite element method software to solve the problem. All calculations can be made, for example, in MS Excel, which is advantageous especially in the initial stages of designing the spindle node for a rapid assessment of the suitability of its design. After the entire structure of the spindle node is detailed, it is then also necessary to perform a refined dynamic analysis in an FEM environment, which requires the necessary skills and experience and is therefore economically demanding. This work was developed within grant project KEGA No. 023TUKE-4/2012 Creation of a comprehensive educational - teaching material for the article Production technique using a combination of traditional and modern information technology and e-learning.
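
    A minimal sketch of the transfer matrix method (method of initial parameters) for a pinned-pinned shaft carrying lumped disks is shown below; the matrices, sign conventions, and all numerical data are simplifying assumptions, not the paper's spindle model.

```python
# Simplified transfer matrix method sketch: massless elastic beam fields
# carrying lumped disks. State vector: [deflection y, slope theta, moment M, shear V].
import numpy as np

def field_matrix(L, EI):
    """Transfer matrix of a massless elastic beam segment of length L."""
    return np.array([
        [1.0, L,   L**2 / (2*EI), L**3 / (6*EI)],
        [0.0, 1.0, L / EI,        L**2 / (2*EI)],
        [0.0, 0.0, 1.0,           L],
        [0.0, 0.0, 0.0,           1.0],
    ])

def point_matrix(m, omega):
    """Transfer matrix of a lumped mass m at circular frequency omega."""
    P = np.eye(4)
    P[3, 0] = m * omega**2   # inertia force couples deflection into shear
    return P

def boundary_determinant(omega, segments, masses, EI=2.0e4):
    """Pinned-pinned residual: deflection and moment vanish at both ends."""
    U = np.eye(4)
    for L, m in zip(segments, masses):
        U = point_matrix(m, omega) @ field_matrix(L, EI) @ U
    # Left-end unknowns are theta0 and V0; require y = M = 0 at the right end.
    return np.linalg.det(np.array([[U[0, 1], U[0, 3]],
                                   [U[2, 1], U[2, 3]]]))

segments, masses = [0.2, 0.3, 0.2], [1.5, 2.0, 1.0]   # assumed geometry and disks
omegas = np.linspace(10.0, 5000.0, 2000)
d = np.array([boundary_determinant(w, segments, masses) for w in omegas])

# Sign changes of the determinant bracket the natural frequencies.
roots = omegas[:-1][np.sign(d[:-1]) != np.sign(d[1:])]
print("approximate natural frequencies [rad/s]:", roots[:3])
```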

  9. Improved fuzzy clustering algorithms in segmentation of DC-enhanced breast MRI.

    PubMed

    Kannan, S R; Ramathilagam, S; Devi, Pandiyarajan; Sathya, A

    2012-02-01

    Segmentation of medical images is a difficult and challenging problem due to poor image contrast and artifacts that result in missing or diffuse organ/tissue boundaries. Many researchers have applied various techniques; however, fuzzy c-means (FCM)-based algorithms are more effective than other methods. The objective of this work is to develop robust fuzzy clustering segmentation systems for effective segmentation of DCE breast MRI. This paper obtains robust fuzzy clustering algorithms by incorporating kernel methods, penalty terms, tolerance of the neighborhood attraction, an additional entropy term, and fuzzy parameters. The initial centers are obtained using an initialization algorithm to reduce the computational complexity and running time of the proposed algorithms. Experimental work on breast images shows that the proposed algorithms are effective in improving the similarity measurement, handling large amounts of noise, and producing better results on data corrupted by noise and other artifacts. The clustering results of the proposed methods are validated using the Silhouette Method.
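
    A minimal sketch of the standard fuzzy c-means iteration that the algorithms above extend (without the kernel, penalty, neighborhood-attraction, or entropy terms); the toy intensity data stand in for tissue classes.

```python
# Standard fuzzy c-means sketch (assumed baseline, not the paper's extended
# algorithms) applied to a toy one-dimensional intensity distribution.
import numpy as np

def fcm(X, c=3, m=2.0, n_iter=50, seed=0):
    """X: (n_samples, n_features). Returns memberships U and cluster centers V."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=X.shape[0])        # random initial memberships
    for _ in range(n_iter):
        Um = U ** m
        V = (Um.T @ X) / Um.sum(axis=0)[:, None]          # update cluster centers
        d = np.linalg.norm(X[:, None, :] - V[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2.0 / (m - 1.0)))                # update memberships
        U /= U.sum(axis=1, keepdims=True)                 # normalise over clusters
    return U, V

# Toy example: three intensity populations standing in for tissue classes.
rng = np.random.default_rng(1)
X = np.concatenate([rng.normal(mu, 0.05, 300) for mu in (0.2, 0.5, 0.8)])[:, None]
U, V = fcm(X, c=3)
print("estimated cluster centers:", np.sort(V.ravel()))
```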

  10. A Funding Initiative for Community-Based Participatory Research: Lessons from the Harvard Catalyst Seed Grants

    PubMed Central

    Tendulkar, Shalini A.; Chu, Jocelyn; Opp, Jennifer; Geller, Alan; DiGirolamo, Ann; Gandelman, Ediss; Grullon, Milagro; Patil, Pratima; King, Stacey; Hacker, Karen

    2013-01-01

    Background The National Institutes of Health–funded Clinical and Translational Science Awards (CTSA) have increasingly focused on community-engaged research and funded investigators for community-based participatory research (CBPR). However, because CBPR is a collaborative process focused on community-identified research topics, the Harvard CTSA and its Community Advisory Board (CERAB) funded community partners through a CBPR initiative. Objectives We describe lessons learned from this seed grants initiative designed to stimulate community–academic CBPR partnerships. Methods The CBPR program of the Harvard CTSA and the CERAB developed this initiative and each round incorporated participant and advisory feedback toward program improvement. Lessons Learned Although this initiative facilitated relevant and innovative research, challenges included variable community research readiness, insufficient project time, and difficulties identifying investigators for new partnerships. Conclusion Seed grants can foster innovative CBPR projects. Similar initiatives should consider preliminary assessments of community research readiness as well as strategies for meaningful academic researcher engagement. PMID:21441667

  11. The Influence of the Shape of Model Hydrometeors on the Formation of Discharge between an Artificial-Thunderstorm Cell and the Ground

    NASA Astrophysics Data System (ADS)

    Temnikov, A. G.; Chernenskii, L. L.; Orlov, A. V.; Lysov, N. Yu.; Belova, O. S.; Gerastenok, T. K.; Zhuravkova, D. S.

    2017-12-01

    We have experimentally studied how arrays of model coarse hydrometeors influence the initiation and propagation of discharge between an artificial-thunderstorm cell of negative or positive polarity and the ground. It is established for the first time that the probability of initiation and stimulation of a channeled discharge between a negatively or positively charged cloud and the ground significantly depends on the shape and size of coarse hydrometeors occurring near the thunderstorm cell boundaries. The obtained results can be used in developing methods for the artificial initiation of cloud-to-ground lightning of both polarities and the targeted discharge of thunderstorm clouds.

  12. Bundle Payment Program Initiative: Roles of a Nurse Navigator and Home Health Professionals.

    PubMed

    Peiritsch, Heather

    2017-06-01

    With the passage of the Affordable Care Act, the Centers for Medicare and Medicaid Services (CMS) introduced a new value-based payment model, the Bundle Payment Care Initiative. The CMS Innovation Center (Innovation Center) authorized hospitals to participate in a pilot to test innovative payment and service delivery models that have the potential to reduce Medicare expenditures while maintaining or improving the quality of care for beneficiaries. A hospital-based home care agency, the Abington Jefferson Health Home Care Department, led the initiative for the development and implementation of the Bundled Payment Program. This was a creative and innovative method to improve care along the continuum while testing a value-based care model.

  13. Non-equilibrium dynamics from RPMD and CMD.

    PubMed

    Welsch, Ralph; Song, Kai; Shi, Qiang; Althorpe, Stuart C; Miller, Thomas F

    2016-11-28

    We investigate the calculation of approximate non-equilibrium quantum time correlation functions (TCFs) using two popular path-integral-based molecular dynamics methods, ring-polymer molecular dynamics (RPMD) and centroid molecular dynamics (CMD). It is shown that for the cases of a sudden vertical excitation and an initial momentum impulse, both RPMD and CMD yield non-equilibrium TCFs for linear operators that are exact for high temperatures, in the t = 0 limit, and for harmonic potentials; the subset of these conditions that are preserved for non-equilibrium TCFs of non-linear operators is also discussed. Furthermore, it is shown that for these non-equilibrium initial conditions, both methods retain the connection to Matsubara dynamics that has previously been established for equilibrium initial conditions. Comparison of non-equilibrium TCFs from RPMD and CMD to Matsubara dynamics at short times reveals the orders in time to which the methods agree. Specifically, for the position-autocorrelation function associated with sudden vertical excitation, RPMD and CMD agree with Matsubara dynamics up to O(t^4) and O(t^1), respectively; for the position-autocorrelation function associated with an initial momentum impulse, RPMD and CMD agree with Matsubara dynamics up to O(t^5) and O(t^2), respectively. Numerical tests using model potentials for a wide range of non-equilibrium initial conditions show that RPMD and CMD yield non-equilibrium TCFs with an accuracy that is comparable to that for equilibrium TCFs. RPMD is also used to investigate excited-state proton transfer in a system-bath model, and it is compared to numerically exact calculations performed using a recently developed version of the Liouville space hierarchical equation of motion approach; again, similar accuracy is observed for non-equilibrium and equilibrium initial conditions.
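
    For context, a non-equilibrium time correlation function of the kind discussed above has the generic quantum form below (notation assumed); RPMD and CMD provide path-integral approximations to such objects when the initial density is not the equilibrium one.

```latex
C_{AB}(t)=\mathrm{Tr}\!\left[\hat{\rho}_0\,\hat{A}\,
e^{\,i\hat{H}t/\hbar}\,\hat{B}\,e^{-i\hat{H}t/\hbar}\right]
```

    Here the initial density operator rho_0 is not the equilibrium Boltzmann operator but, for example, a thermal distribution subjected to a sudden vertical excitation or to an initial momentum impulse.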

  14. A common evaluation framework for the African Health Initiative

    PubMed Central

    2013-01-01

    Background The African Health Initiative includes highly diverse partnerships in five countries (Ghana, Mozambique, Rwanda, Tanzania, and Zambia), each of which is working to improve population health by strengthening health systems and to evaluate the results. One aim of the Initiative is to generate cross-site learning that can inform implementation in the five partnerships during the project period and identify lessons that may be generalizable to other countries in the region. Collaborators in the Initiative developed a common evaluation framework as a basis for this cross-site learning. Methods This paper describes the components of the framework; this includes the conceptual model, core metrics to be measured in all sites, and standard guidelines for reporting on the implementation of partnership activities and contextual factors that may affect implementation, or the results it produces. We also describe the systems that have been put in place for data management, data quality assessments, and cross-site analysis of results. Results and conclusions The conceptual model for the Initiative highlights points in the causal chain between health system strengthening activities and health impact where evidence produced by the partnerships can contribute to learning. This model represents an important advance over its predecessors by including contextual factors and implementation strength as potential determinants, and explicitly including equity as a component of both outcomes and impact. Specific measurement challenges include the prospective documentation of program implementation and contextual factors. Methodological issues addressed in the development of the framework include the aggregation of data collected using different methods and the challenge of evaluating a complex set of interventions being improved over time based on continuous monitoring and intermediate results. PMID:23819778

  15. Ultimately Reliable Pyrotechnic Systems

    NASA Technical Reports Server (NTRS)

    Scott, John H.; Hinkel, Todd

    2015-01-01

    This paper presents the methods by which NASA has designed, built, tested, and certified pyrotechnic devices for high reliability operation in extreme environments and illustrates the potential applications in the oil and gas industry. NASA's extremely successful application of pyrotechnics is built upon documented procedures and test methods that have been maintained and developed since the Apollo Program. Standards are managed and rigorously enforced for performance margins, redundancy, lot sampling, and personnel safety. The pyrotechnics utilized in spacecraft include such devices as small initiators and detonators with the power of a shotgun shell, detonating cord systems for explosive energy transfer across many feet, precision linear shaped charges for breaking structural membranes, and booster charges to actuate valves and pistons. NASA's pyrotechnics program is one of the more successful in the history of Human Spaceflight. No pyrotechnic device developed in accordance with NASA's Human Spaceflight standards has ever failed in flight use. NASA's pyrotechnic initiators work reliably in temperatures as low as -420 F. Each of the 135 Space Shuttle flights fired 102 of these initiators, some setting off multiple pyrotechnic devices, with never a failure. The recent landing on Mars of the Opportunity rover fired 174 of NASA's pyrotechnic initiators to complete the famous '7 minutes of terror.' Even after traveling through extreme radiation and thermal environments on the way to Mars, every one of them worked. These initiators have fired on the surface of Titan. NASA's design controls, procedures, and processes produce the most reliable pyrotechnics in the world. Application of pyrotechnics designed and procured in this manner could enable the energy industry's emergency equipment, such as shutoff valves and deep-sea blowout preventers, to be left in place for years in extreme environments and still be relied upon to function when needed, thus greatly enhancing safety and operational availability.

  16. Timely initiation of complementary feeding and associated factors among children aged 6 to 12 months in Northern Ethiopia: an institution-based cross-sectional study

    PubMed Central

    2013-01-01

    Background Exclusive breastfeeding (EBF) for the first six months of life is critical for the wellbeing of the child. Meanwhile, timely initiation of nutritionally adequate, safe, age-appropriate complementary feeding at six months is recommended for the better health and development of infants. According to the Ethiopian Demographic and Health Survey 2011, timely initiation of complementary feeding in Ethiopia at the 6th month was only 51%. The purpose of this study is to determine the magnitude of timely initiation of complementary feeding and associated factors in Mekelle town, Northern Ethiopia. Methods An institution-based cross-sectional study was conducted among 422 mothers of infants aged from six months to one year selected from six public health facilities. A sample size proportional to the patient flow rate of each institution was allocated, and a systematic random sampling method was used to select the study participants. An exit interview using a structured questionnaire was conducted about their experience with complementary feeding and related practices. The questionnaire was pretested among 21 mothers. Data were entered with EPI Info version 3.5.1, and cleaning and analysis were done using SPSS version 16. Frequency distributions and binary and multiple logistic regressions were computed, with ORs and 95% confidence intervals. Result The prevalence of timely initiation of complementary feeding at the sixth month was 62.8% (265/422, 95% C.I: 58.1, 67.31%). Educational level, occupation of the mother, parity, having ANC follow-up, and birth preparedness were found to be independent predictors of timely initiation of complementary feeding. Conclusions Almost two-thirds of mothers initiated complementary feeding at six months of the child's age as recommended. This is a relatively higher prevalence than in most developing countries. However, a significant proportion of mothers still did not initiate complementary feeding on time. Mothers who are illiterate or completed only primary school need more attention. All mothers must be encouraged to attend antenatal care follow-up. PMID:24195592

  17. Live imaging of developmental processes in a living meristem of Davidia involucrata (Nyssaceae)

    PubMed Central

    Jerominek, Markus; Bull-Hereñu, Kester; Arndt, Melanie; Claßen-Bockhoff, Regine

    2014-01-01

    Morphogenesis in plants is usually reconstructed by scanning electron microscopy and histology of meristematic structures. These techniques are destructive and require many samples to obtain a consecutive series of states. Unfortunately, using this methodology the absolute timing of growth and complete relative initiation of organs remain obscure. To overcome this limitation, an in vivo observational method based on Epi-Illumination Light Microscopy (ELM) was developed and tested with a male inflorescence meristem (floral unit) of the handkerchief tree Davidia involucrata Baill. (Nyssaceae). We asked whether the most basal flowers of this floral unit arise in a basipetal sequence or, alternatively, are delayed in their development. The growing meristem was observed for 30 days, the longest live observation of a meristem achieved to date. The sequence of primordium initiation indicates a later initiation of the most basal flowers and not earlier or simultaneously as SEM images could suggest. D. involucrata exemplarily shows that live-ELM gives new insights into developmental processes of plants. In addition to morphogenetic questions such as the transition from vegetative to reproductive meristems or the absolute timing of ontogenetic processes, this method may also help to quantify cellular growth processes in the context of molecular physiology and developmental genetics studies. PMID:25431576

  18. A fast 4D cone beam CT reconstruction method based on the OSC-TV algorithm.

    PubMed

    Mascolo-Fortin, Julia; Matenine, Dmitri; Archambault, Louis; Després, Philippe

    2018-01-01

    Four-dimensional cone beam computed tomography allows for temporally resolved imaging with useful applications in radiotherapy, but raises particular challenges in terms of image quality and computation time. The purpose of this work is to develop a fast and accurate 4D algorithm by adapting a GPU-accelerated ordered subsets convex algorithm (OSC), combined with the total variation minimization regularization technique (TV). Different initialization schemes were studied to adapt the OSC-TV algorithm to 4D reconstruction: each respiratory phase was initialized either with a 3D reconstruction or a blank image. Reconstruction algorithms were tested on a dynamic numerical phantom and on a clinical dataset. 4D iterations were implemented for a cluster of 8 GPUs. All developed methods allowed for an adequate visualization of the respiratory movement and compared favorably to the McKinnon-Bates and adaptive steepest descent projection onto convex sets algorithms, while the 4D reconstructions initialized from a prior 3D reconstruction led to better overall image quality. The most suitable adaptation of OSC-TV to 4D CBCT was found to be a combination of a prior FDK reconstruction and a 4D OSC-TV reconstruction with a reconstruction time of 4.5 minutes. This relatively short reconstruction time could facilitate a clinical use.
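
    A highly simplified sketch of the reconstruction flow described above (prior 3D reconstruction followed by per-phase OSC iterations interleaved with TV regularization); the update and regularization routines are placeholders, not the authors' GPU implementation.

```python
# Schematic only: 4D CBCT reconstruction loop in the spirit of OSC-TV with a
# prior 3D (FDK) initialization. Projection/volume shapes and routines are placeholders.
import numpy as np

def fdk_reconstruction(all_projections):
    """Placeholder for a 3D FDK reconstruction from all projections."""
    return np.zeros((64, 64, 64))

def osc_update(volume, projections, n_subsets=10):
    """Placeholder for one ordered-subsets convex (OSC) data-fidelity pass."""
    return volume

def tv_minimization(volume, n_steps=20, weight=0.05):
    """Placeholder for a few steps of total-variation minimization."""
    return volume

def reconstruct_4d(projections_by_phase, n_outer=10):
    prior = fdk_reconstruction(np.concatenate(list(projections_by_phase.values())))
    phases = {}
    for phase, projs in projections_by_phase.items():
        vol = prior.copy()                # initialize each phase from the 3D prior
        for _ in range(n_outer):
            vol = osc_update(vol, projs)  # data-fidelity step
            vol = tv_minimization(vol)    # regularization step
        phases[phase] = vol
    return phases

# Example call with dummy projection stacks for 10 respiratory phases.
dummy = {p: np.zeros((36, 128, 128)) for p in range(10)}
recon = reconstruct_4d(dummy)
```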

  19. GEOS. User Tutorials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fu, Pengchen; Settgast, Randolph R.; Johnson, Scott M.

    2014-12-17

    GEOS is a massively parallel, multi-physics simulation application utilizing high performance computing (HPC) to address subsurface reservoir stimulation activities with the goal of optimizing current operations and evaluating innovative stimulation methods. GEOS enables coupling of different solvers associated with the various physical processes occurring during reservoir stimulation in unique and sophisticated ways, adapted to various geologic settings, materials and stimulation methods. Developed at the Lawrence Livermore National Laboratory (LLNL) as a part of a Laboratory-Directed Research and Development (LDRD) Strategic Initiative (SI) project, GEOS represents the culmination of a multi-year ongoing code development and improvement effort that has leveraged existing code capabilities and staff expertise to design new computational geosciences software.

  1. Development and evaluation of a hybrid averaged orbit generator

    NASA Technical Reports Server (NTRS)

    Mcclain, W. D.; Long, A. C.; Early, L. W.

    1978-01-01

    A rapid orbit generator based on a first-order application of the Generalized Method of Averaging has been developed for the Research and Development (R&D) version of the Goddard Trajectory Determination System (GTDS). The evaluation of the averaged equations of motion can use both numerically averaged and recursively evaluated, analytically averaged perturbation models. These equations are numerically integrated to obtain the secular and long-period motion. Factors affecting efficient orbit prediction are discussed and guidelines are presented for treatment of each major perturbation. Guidelines for obtaining initial mean elements compatible with the theory are presented. An overview of the orbit generator is presented and comparisons with high precision methods are given.

  2. Development of advanced lightweight containment systems

    NASA Technical Reports Server (NTRS)

    Stotler, C.

    1981-01-01

    Parametric type data were obtained on advanced lightweight containment systems. These data were used to generate design methods and procedures necessary for the successful development of such systems. The methods were then demonstrated through the design of a lightweight containment system for a CF6 size engine. The containment concept evaluated consisted basically of a lightweight structural sandwich shell wrapped with dry Kevlar cloth. The initial testing was directed towards the determination of the amount of Kevlar required to result in threshold containment for a specific set of test conditions. A relationship was then developed between the thickness required and the energy of the released blade so that the data could be used to design for conditions other than those tested.

  3. 29 CFR 4211.35 - Direct attribution method for withdrawals after the initial plan year.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... WITHDRAWING EMPLOYERS, Allocation Methods for Merged Multiemployer Plans, § 4211.35 Direct attribution method for withdrawals after the initial plan year. The allocation method under this section is the...

  4. Wet refractivity tomography with an improved Kalman-Filter method

    NASA Astrophysics Data System (ADS)

    Cao, Yunchang; Chen, Yongqi; Li, Pingwha

    2006-10-01

    An improved retrieval method, which uses the solution with a Gaussian constraint as the initial state variables for the Kalman Filtering (KF) method, was developed to retrieve the wet refractivity profiles from slant wet delays (SWD) extracted by the double-differenced (DD) GPS method. The accuracy of the GPS-derived SWDs is also tested in this study against the measurements of a water vapor radiometer (WVR) and a weather model. It is concluded that the GPS-derived SWDs have similar accuracy to those measured with WVR and are much higher in quality than those derived from the weather model used. The developed method is used to retrieve the 3D wet refractivity distribution in the Hong Kong region. The retrieved profiles agree well with the radiosonde observations, with a difference of about 4 mm km-1 in the low levels. The accurate profiles obtained with this method are applicable in a number of meteorological applications.
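
    A minimal linear Kalman-filter update for a toy tomography problem is sketched below, with the state holding wet refractivity in voxels and slant wet delays modelled as line integrals; the Gaussian-constrained initialization described above is mimicked by the assumed prior state and covariance.

```python
# Toy Kalman-filter step for wet-refractivity tomography (assumed geometry):
# state x = wet refractivity per voxel, observations y = slant wet delays,
# modelled as y = H x + noise with H holding ray path lengths through voxels.
import numpy as np

rng = np.random.default_rng(0)
n_voxels, n_rays = 20, 15

x_prior = np.full(n_voxels, 40.0)          # prior wet refractivity (assumed units)
P_prior = np.eye(n_voxels) * 25.0          # prior covariance from the Gaussian constraint
H = rng.uniform(0.0, 1.0, (n_rays, n_voxels))   # assumed ray path lengths (km)
R = np.eye(n_rays) * 4.0                   # SWD observation noise covariance

x_true = x_prior + rng.normal(0.0, 5.0, n_voxels)
y = H @ x_true + rng.normal(0.0, 2.0, n_rays)   # synthetic slant wet delays

# Standard Kalman update (the dynamics/prediction step is omitted for brevity).
S = H @ P_prior @ H.T + R
K = P_prior @ H.T @ np.linalg.solve(S, np.eye(n_rays))   # Kalman gain
x_post = x_prior + K @ (y - H @ x_prior)
P_post = (np.eye(n_voxels) - K @ H) @ P_prior

print("prior RMS error    :", np.sqrt(np.mean((x_prior - x_true) ** 2)))
print("posterior RMS error:", np.sqrt(np.mean((x_post - x_true) ** 2)))
```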

  5. Sparse QSAR modelling methods for therapeutic and regenerative medicine

    NASA Astrophysics Data System (ADS)

    Winkler, David A.

    2018-02-01

    The quantitative structure-activity relationships method was popularized by Hansch and Fujita over 50 years ago. The usefulness of the method for drug design and development has been shown in the intervening years. As it was developed initially to elucidate which molecular properties modulated the relative potency of putative agrochemicals, and at a time when computing resources were scarce, there is much scope for applying modern mathematical methods to improve the QSAR method and to extend the general concept to the discovery and optimization of bioactive molecules and materials more broadly. I describe research over the past two decades where we have rebuilt the unit operations of the QSAR method using improved mathematical techniques, and have applied this valuable platform technology to new important areas of research and industry such as nanoscience, omics technologies, advanced materials, and regenerative medicine. This paper was presented as the 2017 ACS Herman Skolnik lecture.

  6. Initiating a Standardized Regional Referral and Counter-Referral System in Guatemala: A Mixed-Methods Study.

    PubMed

    Kapoor, Rupa; Avendaño, Leslie; Sandoval, Maria Antonieta; Cruz, Andrea T; Sampayo, Esther M; Soto, Miguel A; Camp, Elizabeth A; Crouse, Heather L

    2017-01-01

    Background: Few data exist for referral processes in resource-limited settings. We utilized mixed-methods to evaluate the impact of a standardized algorithm and training module developed for locally identified needs in referral/counter-referral procedures between primary health centers (PHCs) and a Guatemalan referral hospital. Methods: PHC personnel and hospital physicians participated in surveys and focus groups pre-implementation and 3, 6, and 12 months post-implementation to evaluate providers' experience with the system. Referred patient records were reviewed to evaluate system effectiveness. Results: A total of 111 initial focus group participants included 96 (86.5%) from PHCs and 15 from the hospital. Of these participants, 53 PHC physicians and nurses and 15 hospital physicians initially completed written surveys. Convenience samples participated in follow-up. Eighteen focus groups achieved thematic saturation. Four themes emerged: effective communication; provision of timely, quality patient care with adequate resources; educational opportunities; and development of empowerment and relationships. Pre- and post-implementation surveys demonstrated significant improvement at the PHCs (P < .001) and the hospital (P = .02). Chart review included 435 referrals, 98 (22.5%) pre-implementation and 337 (77.5%) post-implementation. There was a trend toward an increased percentage of appropriately referred patients requiring medical intervention (30% vs 40%, P = .08) and of patients requiring intervention who received it prior to transport (55% vs 73%, P = .06). Conclusions: Standardizing a referral/counter-referral system improved communication, education, and trust across different levels of pediatric health care delivery. This model may be used for extension throughout Guatemala or be modified for use in other countries. Mixed-methods research design can evaluate complex systems in resource-limited settings.

  7. Facile Fabrication of Hierarchically Thermoresponsive Binary Polymer Pattern for Controlled Cell Adhesion.

    PubMed

    Hou, Jianwen; Cui, Lele; Chen, Runhai; Xu, Xiaodong; Chen, Jiayue; Yin, Ligang; Liu, Jingchuan; Shi, Qiang; Yin, Jinghua

    2018-03-01

    A versatile platform allowing capture and detection of normal and dysfunctional cells on the same patterned surface is important for accessing the cellular mechanism, developing diagnostic assays, and implementing therapy. Here, an original and effective method for fabricating a binary polymer brush pattern is developed for controlled cell adhesion. The binary polymer brush pattern, composed of poly(N-isopropylacrylamide) (PNIPAAm) and poly[poly(ethylene glycol) methyl ether methacrylate] (POEGMA) chains, is simply obtained via a combination of surface-initiated photopolymerization and surface-activated free radical polymerization. This method is unique in that it does not utilize any protecting groups or procedures of backfilling with immobilized initiator. It is demonstrated that precise and well-defined binary polymer patterns with high resolution are fabricated using this facile method. PNIPAAm chains capture and release cells by thermoresponsiveness, while POEGMA chains possess a high capability to capture dysfunctional cells specifically, inducing a switch of normal red blood cell (RBC) arrays to hemolytic RBC arrays on the pattern with temperature. This novel platform composed of a binary polymer brush pattern is smart and versatile, which opens up pathways to potential applications as microsensors, biochips, and bioassays. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Spectral biopsy for skin cancer diagnosis: initial clinical results

    NASA Astrophysics Data System (ADS)

    Moy, Austin J.; Feng, Xu; Nguyen, Hieu T. M.; Zhang, Yao; Sebastian, Katherine R.; Reichenberg, Jason S.; Tunnell, James W.

    2017-02-01

    Skin cancer is the most common form of cancer in the United States and is a recognized public health issue. Diagnosis of skin cancer involves biopsy of the suspicious lesion followed by histopathology. Biopsies, which involve excision of the lesion, are invasive, at times unnecessary, and are costly procedures ($2.8B/year in the US). An unmet critical need exists to develop a non-invasive and inexpensive screening method that can eliminate the need for unnecessary biopsies. To address this need, our group has reported on the continued development of a noninvasive method that utilizes multimodal spectroscopy towards the goal of a "spectral biopsy" of skin. Our approach combines Raman spectroscopy, fluorescence spectroscopy, and diffuse reflectance spectroscopy to collect comprehensive optical property information from suspicious skin lesions. We previously described an updated spectral biopsy system that allows acquisition of all three forms of spectroscopy through a single fiber optic probe and is composed of off-the-shelf OEM components that are smaller, cheaper, and enable a more clinic-friendly system. We present initial patient data acquired with the spectral biopsy system, the first from an extensive clinical study (n = 250) to characterize its performance in identifying skin cancers (basal cell carcinoma, squamous cell carcinoma, and melanoma). We also present our first attempts at analyzing this initial set of clinical data using statistical-based models, and with models currently being developed to extract biophysical information from the collected spectra, all towards the goal of noninvasive skin cancer diagnosis.

  9. Failure mechanisms and lifetime prediction methodology for polybutylene pipe in water distribution system

    NASA Astrophysics Data System (ADS)

    Niu, Xiqun

    Polybutylene (PB) is a semicrystalline thermoplastic. It has been widely used in potable water distribution piping systems. However, field practice shows that failure occurs much earlier than the expected service lifetime. The causes of these failures and how to appropriately evaluate the lifetime motivate this study. This thesis comprises three parts of work. The first is the understanding of PB, which includes thermal and mechanical material characterization, aging phenomena, and notch sensitivity. The second part analyzes the applicability of the existing lifetime testing method to PB. It is shown that PB is an anomaly in terms of the temperature-lifetime relation because of the fracture mechanism transition across the testing temperature range. The third part is the development of a lifetime prediction methodology for PB pipe. The fracture process of PB pipe consists of three stages, i.e., crack initiation, slow crack growth (SCG) and crack instability. The practical lifetime of PB pipe is primarily determined by the duration of the first two stages. The mechanism of crack initiation and the quantitative estimation of the time to crack initiation are studied by employing an environmental stress cracking technique. A fatigue slow crack growth testing method has been developed and applied in the study of SCG. By using the Paris-Erdogan equation, a model is constructed to evaluate the time for SCG. As a result, the total lifetime is determined. Through this work, the failure mechanisms of PB pipe have been analyzed and a lifetime prediction methodology has been developed.
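
    The slow-crack-growth stage mentioned above is commonly described by the Paris-Erdogan relation; written generically (symbols assumed), the SCG life follows by integrating from the initiated crack size to the critical size.

```latex
\frac{\mathrm{d}a}{\mathrm{d}N}=C\,(\Delta K)^{m},
\qquad
N_{\mathrm{SCG}}=\int_{a_i}^{a_c}\frac{\mathrm{d}a}{C\,\big[\Delta K(a)\big]^{m}},
\qquad
\Delta K(a)=Y(a)\,\Delta\sigma\,\sqrt{\pi a}
```

    Here a_i is the initiated crack depth, a_c the critical depth at instability, Y(a) a geometry factor for the pipe wall, and C and m material constants obtained from fatigue SCG tests; the total lifetime is then the crack-initiation time plus N_SCG converted to time at the loading frequency.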

  10. Initialization and simulation of a landfalling typhoon using a variational bogus mapped data assimilation (BMDA)

    NASA Astrophysics Data System (ADS)

    Zhao, Y.; Wang, B.; Wang, Y.

    2007-12-01

    Recently, a new data assimilation method called “3-dimensional variational data assimilation of mapped observation (3DVM)” has been developed by the authors. We have shown that the new method is very efficient and inexpensive compared with its counterpart, 4-dimensional variational data assimilation (4DVar). The new method has been implemented into the Penn State/NCAR mesoscale model MM5V1 (MM5_3DVM). In this study, we apply the new method to the bogus data assimilation (BDA) available in the original MM5 with the 4DVar. In the new approach, a specified sea-level pressure (SLP) field (bogus data) is incorporated into MM5 through the 3DVM (for convenience, we call it variational bogus mapped data assimilation, BMDA) instead of the original 4DVar data assimilation. To demonstrate the effectiveness of the new 3DVM method, the initialization and simulation of a landfalling typhoon, typhoon Dan (1999) over the western North Pacific, with the new method are compared with those obtained with its counterpart 4DVar in MM5. Results show that the initial structure and the simulated intensity and track are improved more significantly using 3DVM than 4DVar. Sensitivity experiments also show that the simulated typhoon track and intensity are more sensitive to the size of the assimilation window in the 4DVar than in the 3DVM. Meanwhile, 3DVM requires much less computing cost than its counterpart 4DVar for a given time window.
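
    A generic variational cost function of the type minimized in bogus data assimilation is sketched below; the notation is assumed for illustration and does not reproduce the authors' 3DVM formulation.

```latex
J(\mathbf{x}_0)=
\tfrac{1}{2}\,(\mathbf{x}_0-\mathbf{x}_b)^{\mathrm T}\mathbf{B}^{-1}(\mathbf{x}_0-\mathbf{x}_b)
+\tfrac{1}{2}\sum_{k}\big[H_k(\mathbf{x}_0)-\mathbf{p}^{\mathrm{bogus}}_k\big]^{\mathrm T}
\mathbf{R}^{-1}\big[H_k(\mathbf{x}_0)-\mathbf{p}^{\mathrm{bogus}}_k\big]
```

    Here x_b is the background state, B and R the background and observation error covariances, p^bogus the specified sea-level pressure field, and H_k the operator mapping the model state to SLP at the assimilation times inside the window.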

  11. Theoretical development and first-principles analysis of strongly correlated systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Chen

    A variety of quantum many-body methods have been developed for studying strongly correlated electron systems. We have also proposed a computationally efficient and accurate approach, named the correlation matrix renormalization (CMR) method, to address the challenges. The initial implementation of the CMR method is designed for molecules, which offer theoretical advantages, including small system size, a transparent mechanism, and strong correlation effects such as the bond breaking process. The theoretical development and benchmark tests of the CMR method are included in this thesis. Meanwhile, the ground state total energy is the most important property in electronic structure calculations. We also investigated an alternative approach to calculate the total energy, and extended this method to the magnetic anisotropy energy (MAE) of ferromagnetic materials. In addition, another theoretical tool, dynamical mean-field theory (DMFT) on top of DFT, has also been used in electronic structure calculations for an iridium oxide to study the phase transition, which results from an interplay of the d electrons' internal degrees of freedom.

  12. Fire-protection research for energy technology: Fy 80 year end report

    NASA Astrophysics Data System (ADS)

    Hasegawa, H. K.; Alvares, N. J.; Lipska, A. E.; Ford, H.; Priante, S.; Beason, D. G.

    1981-05-01

    This continuing research program was initiated in order to advance fire protection strategies for Fusion Energy Experiments (FEE). The program expanded to encompass other forms of energy research. Accomplishments for fiscal year 1980 were: finalization of the fault-tree analysis of the Shiva fire management system; development of a second-generation fire-growth analysis using an alternate model and new LLNL combustion dynamics data; improvements of techniques for chemical smoke aerosol analysis; development and testing of a simple method to assess the corrosive potential of smoke aerosols; development of an initial aerosol dilution system; completion of primary small-scale tests for measurements of the dynamics of cable fires; finalization of a primary survey format for non-LLNL energy technology facilities; and studies of fire dynamics and aerosol production from electrical insulation and computer tape cassettes.

  13. Harmonizing and Optimizing Fish Testing Methods: The OECD Framework Project

    EPA Science Inventory

    The Organisation for Economic Cooperation and Development (OECD) serves a key role in the international harmonization of testing of a wide variety of chemicals. An integrated fish testing framework project was initiated in mid-2009 through the OECD with the US as the lead country...

  14. Handwriting Instruction in Elementary Schools: Revisited!

    ERIC Educational Resources Information Center

    Asher, Asha; Estes, Joanne

    2016-01-01

    Handwriting is an essential literacy and communication skill developed through a variety of instructional methods in elementary school. This study explored the consistency in handwriting instruction across grade levels in a Midwest public school district 15 years after the school initially implemented a uniform handwriting program. Additionally,…

  15. Physical activity problem-solving inventory for adolescents: Development and initial validation

    USDA-ARS?s Scientific Manuscript database

    Youth encounter physical activity barriers, often called problems. The purpose of problem-solving is to generate solutions to overcome the barriers. Enhancing problem-solving ability may enable youth to be more physically active. Therefore, a method for reliably assessing physical activity problem-s...

  16. 78 FR 9698 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-11

    ... effective at improving health care quality. While evidence-based approaches for decision-making have become standard in healthcare, this has been limited in laboratory medicine. No single-evidence-based model for... (LMBP) initiative to develop new systematic evidence reviews methods for making evidence-based...

  17. Collective Biography and Memory Work: Girls Reading Fiction

    ERIC Educational Resources Information Center

    Gannon, Susanne

    2015-01-01

    Collective biography draws on memory work methods developed initially by feminist sociologists (Haug et al., 1987) where people collaboratively examined the social and discursive resources through which they take themselves up as particular gendered subjects in the world. Their own memories become resources to investigate processes of…

  18. ANALYTICAL PROCEDURES FOR CHARACTERIZING UNREGULATED EMISSIONS FROM VEHICLES USING MIDDLE-DISTILLATE FUELS

    EPA Science Inventory

    This research program was initiated with the objective of developing, codifying and testing a group of chemical analytical methods for measuring toxic compounds in the exhaust of distillate-fueled engines (i.e. diesel, gas turbine, Stirling, or Rankin cycle powerplants). It is a ...

  19. DianaHealth.com, an On-Line Database Containing Appraisals of the Clinical Value and Appropriateness of Healthcare Interventions: Database Development and Retrospective Analysis

    PubMed Central

    Bonfill, Xavier; Osorio, Dimelza; Solà, Ivan; Pijoan, Jose Ignacio; Balasso, Valentina; Quintana, Maria Jesús; Puig, Teresa; Bolibar, Ignasi; Urrútia, Gerard; Zamora, Javier; Emparanza, José Ignacio; Gómez de la Cámara, Agustín; Ferreira-González, Ignacio

    2016-01-01

    Objective To describe the development of a novel on-line database aimed to serve as a source of information concerning healthcare interventions appraised for their clinical value and appropriateness by several initiatives worldwide, and to present a retrospective analysis of the appraisals already included in the database. Methods and Findings Database development and a retrospective analysis. The database DianaHealth.com is already on-line and it is regularly updated, independent, open access and available in English and Spanish. Initiatives are identified in medical news, in article references, and by contacting experts in the field. We include appraisals in the form of clinical recommendations, expert analyses, conclusions from systematic reviews, and original research that label any health care intervention as low-value or inappropriate. We obtain the information necessary to classify the appraisals according to type of intervention, specialties involved, publication year, authoring initiative, and key words. The database is accessible through a search engine which retrieves a list of appraisals and a link to the website where they were published. DianaHealth.com also provides a brief description of the initiatives and a section where users can report new appraisals or suggest new initiatives. From January 2014 to July 2015, the on-line database included 2940 appraisals from 22 initiatives: eleven campaigns gathering clinical recommendations from scientific societies, five sets of conclusions from literature review, three sets of recommendations from guidelines, two collections of articles on low clinical value in medical journals, and an initiative of our own. Conclusions We have developed an open access on-line database of appraisals about healthcare interventions considered of low clinical value or inappropriate. DianaHealth.com could help physicians and other stakeholders make better decisions concerning patient care and healthcare systems sustainability. Future efforts should be focused on assessing the impact of these appraisals in the clinical practice. PMID:26840451

  20. Cross-Disciplinary Consultancy to Bridge Public Health Technical Needs and Analytic Developers: Asyndromic Surveillance Use Case

    PubMed Central

    Faigen, Zachary; Deyneka, Lana; Ising, Amy; Neill, Daniel; Conway, Mike; Fairchild, Geoffrey; Gunn, Julia; Swenson, David; Painter, Ian; Johnson, Lauren; Kiley, Chris; Streichert, Laura

    2015-01-01

    Introduction: We document a funded effort to bridge the gap between constrained scientific challenges of public health surveillance and methodologies from academia and industry. Component tasks are the collection of epidemiologists’ use case problems, multidisciplinary consultancies to refine them, and dissemination of problem requirements and shareable datasets. We describe an initial use case and consultancy as a concrete example and challenge to developers. Materials and Methods: Supported by the Defense Threat Reduction Agency Biosurveillance Ecosystem project, the International Society for Disease Surveillance formed an advisory group to select tractable use case problems and convene inter-disciplinary consultancies to translate analytic needs into well-defined problems and to promote development of applicable solution methods. The initial consultancy’s focus was a problem originated by the North Carolina Department of Health and its NC DETECT surveillance system: Derive a method for detection of patient record clusters worthy of follow-up based on free-text chief complaints and without syndromic classification. Results: Direct communication between public health problem owners and analytic developers was informative to both groups and constructive for the solution development process. The consultancy achieved refinement of the asyndromic detection challenge and of solution requirements. Participants summarized and evaluated solution approaches and discussed dissemination and collaboration strategies. Practice Implications: A solution meeting the specification of the use case described above could improve human monitoring efficiency with expedited warning of events requiring follow-up, including otherwise overlooked events with no syndromic indicators. This approach can remove obstacles to collaboration with efficient, minimal data-sharing and without costly overhead. PMID:26834939

  1. Improved quality-by-design compliant methodology for method development in reversed-phase liquid chromatography.

    PubMed

    Debrus, Benjamin; Guillarme, Davy; Rudaz, Serge

    2013-10-01

    A complete strategy dedicated to quality-by-design (QbD) compliant method development using design of experiments (DOE), multiple linear regression response modelling and Monte Carlo simulations for error propagation was evaluated for liquid chromatography (LC). The proposed approach includes four main steps: (i) the initial screening of column chemistry, mobile phase pH and organic modifier, (ii) the selectivity optimization through changes in gradient time and mobile phase temperature, (iii) the adaptation of column geometry to reach sufficient resolution, and (iv) the robust resolution optimization and identification of the method design space. This procedure was employed to obtain a complex chromatographic separation of 15 widely prescribed basic antipsychotic drugs. To fully automate and expedite the QbD method development procedure, short columns packed with sub-2 μm particles were employed, together with a UHPLC system possessing column and solvent selection valves. Through this example, the possibilities of the proposed QbD method development workflow were exposed and the different steps of the automated strategy were critically discussed. A baseline separation of the mixture of antipsychotic drugs was achieved with an analysis time of less than 15 min, and the robustness of the method was demonstrated simultaneously with the method development phase. Copyright © 2013 Elsevier B.V. All rights reserved.
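
    A toy sketch of the final step described above: Monte Carlo propagation of model error through a fitted response surface to map the probability of meeting a resolution criterion over the gradient-time/temperature grid. The quadratic model, its coefficients, and the error level are invented for illustration only.

```python
# Illustrative QbD design-space mapping: a fitted response model for the
# critical resolution Rs(gradient time, temperature) plus Monte Carlo error
# propagation gives P(Rs >= 1.5) over the factor grid. Coefficients are invented.
import numpy as np

rng = np.random.default_rng(0)

def critical_resolution(tg, temp):
    """Assumed quadratic DOE response model (illustrative, not fitted to real data)."""
    return 0.9 + 0.08 * tg - 0.0012 * tg**2 + 0.015 * (temp - 30) - 0.0004 * (temp - 30)**2

sigma_model = 0.15                    # assumed model/measurement error (Rs units)
tg_grid = np.linspace(5, 60, 56)      # gradient time, min
temp_grid = np.linspace(25, 50, 26)   # column temperature, deg C

prob_ok = np.zeros((tg_grid.size, temp_grid.size))
for i, tg in enumerate(tg_grid):
    for j, temp in enumerate(temp_grid):
        rs = critical_resolution(tg, temp) + rng.normal(0.0, sigma_model, 2000)
        prob_ok[i, j] = np.mean(rs >= 1.5)   # probability of meeting the Rs criterion

# Design space = factor combinations with, e.g., >= 95% probability of success.
inside = np.argwhere(prob_ok >= 0.95)
print(f"{inside.shape[0]} of {prob_ok.size} grid points fall inside the design space")
```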

  2. The integrative review: updated methodology.

    PubMed

    Whittemore, Robin; Knafl, Kathleen

    2005-12-01

    The aim of this paper is to distinguish the integrative review method from other review methods and to propose methodological strategies specific to the integrative review method to enhance the rigour of the process. Recent evidence-based practice initiatives have increased the need for and the production of all types of reviews of the literature (integrative reviews, systematic reviews, meta-analyses, and qualitative reviews). The integrative review method is the only approach that allows for the combination of diverse methodologies (for example, experimental and non-experimental research), and has the potential to play a greater role in evidence-based practice for nursing. With respect to the integrative review method, strategies to enhance data collection and extraction have been developed; however, methods of analysis, synthesis, and conclusion drawing remain poorly formulated. A modified framework for research reviews is presented to address issues specific to the integrative review method. Issues related to specifying the review purpose, searching the literature, evaluating data from primary sources, analysing data, and presenting the results are discussed. Data analysis methods of qualitative research are proposed as strategies that enhance the rigour of combining diverse methodologies as well as empirical and theoretical sources in an integrative review. An updated integrative review method has the potential to allow for diverse primary research methods to become a greater part of evidence-based practice initiatives.

  3. Application of Artificial Thunderstorm Cells for the Investigation of Lightning Initiation Problems between a Thundercloud and the Ground

    NASA Astrophysics Data System (ADS)

    Temnikov, A. G.; Chernensky, L. L.; Orlov, A. V.; Lysov, N. Y.; Zhuravkova, D. S.; Belova, O. S.; Gerastenok, T. K.

    2017-12-01

    The results of the experimental application of artificial thunderstorm cells of negative and positive polarities to the investigation of lightning initiation problems between the thundercloud and the ground using model hydrometeor arrays are presented. Possible modes of initiation and development of a discharge between the charged cloud and the ground in the presence of model hydrometeors are established. It is experimentally shown that groups of large hydrometeors of various shapes significantly increase the probability of channel discharge initiation between the artificial thunderstorm cell and the ground, especially in the case of positive polarity of the cloud. The authors assume that large hail arrays in the thundercloud can initiate the preliminary breakdown stage in the lower part of the thundercloud or initiate and stimulate the propagation of positive lightning from its upper part. A significant effect of the shape of model hydrometeors and the way they are grouped on the processes of initiation and stimulation of channel discharge propagation in the gap between an artificial thunderstorm cell of negative or positive polarity and the ground is experimentally established. It is found that, in the case of negative polarity of the charged cloud, a group of conductive cylindrical hydrometeors connected by a dielectric string more effectively initiates the channel discharge between the artificial thunderstorm cell and the ground. In the case of positive polarity of the artificial thunderstorm cell, the best channel discharge initiation is achieved for model hydrometeors grouped together by a dielectric tape. The obtained results can be used in the development of a method for directed artificial lightning initiation between the thundercloud and the ground.

  4. Development and Validation of a Reversed-Phase Chiral HPLC Method to Determine the Chiral Purity of Bulk Batches of (S)-Enantiomer in Afoxolaner.

    PubMed

    Padivitage, Nilusha; Kumar, Satish; Rustum, Abu

    2017-01-01

    Afoxolaner is a new antiparasitic molecule from the isoxazoline family that acts on insect and acarine γ-aminobutyric acid and glutamate receptors. Afoxolaner is a racemic mixture, which has a chiral center at the isoxazoline ring. A reversed-phase chiral HPLC method has been developed to determine the chiral purity of bulk batches of (S)-enantiomer in afoxolaner for the first time. This method can also be used to verify that afoxolaner is a racemic mixture, which was demonstrated by specific rotation. ChromSword, an artificial intelligence method development tool, was used for initial method development. The column selected for the final method was CHIRALPAK AD-RH (150 × 4.6 mm, 5 μm particle size), maintained at 45°C, with isocratic elution using water-isopropanol-acetonitrile (40 + 50 + 10, v/v/v) as the mobile phase and a detection wavelength of 312 nm. The run time for the method was 11 min. The resolution and selectivity factors of the two enantiomers were 2.3 and 1.24, respectively. LOQ and LOD of the method were 1.6 and 0.8 μg/mL, respectively. This method was appropriately validated according to International Conference on Harmonization guidelines for its intended use.

  5. A Novel Approach to Rotorcraft Damage Tolerance

    NASA Technical Reports Server (NTRS)

    Forth, Scott C.; Everett, Richard A.; Newman, John A.

    2002-01-01

    Damage-tolerance methodology is positioned to replace safe-life methodologies for designing rotorcraft structures. The argument for implementing a damage-tolerance method comes from the fundamental fact that rotorcraft structures typically fail by fatigue cracking. Therefore, if technology permits prediction of fatigue-crack growth in structures, a damage-tolerance method should deliver the most accurate prediction of component life. Implementing damage tolerance (DT) in high-cycle-fatigue (HCF) components will require a shift from traditional DT methods that rely on detecting an initial flaw with nondestructive inspection (NDI) methods. Because cycles accumulate rapidly in an HCF component, a design based on a traditional DT method will either be impractical because of frequent inspections or too heavy to operate efficiently. Furthermore, once an HCF component develops a detectable propagating crack, the remaining fatigue life is short, sometimes less than one flight hour, which does not leave sufficient time for inspection. Therefore, designing an HCF component will require basing the life analysis on an initial flaw that is undetectable with current NDI technology.
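
    The DT argument above rests on being able to predict fatigue-crack growth from an assumed initial flaw. One common way to do this, not specific to this report, is to integrate a Paris-law crack-growth relation, da/dN = C(ΔK)^m with ΔK = YΔσ√(πa), from the assumed initial flaw size to a critical size. The sketch below is a minimal illustration with made-up material constants and geometry, not values from the paper.

      import math

      # Minimal Paris-law crack-growth integration (illustrative constants only).
      #   da/dN = C * (dK)^m,   dK = Y * dsigma * sqrt(pi * a)
      C, m = 1.0e-11, 3.0         # Paris constants (MPa*sqrt(m) units, assumed)
      Y = 1.12                    # geometry factor (edge crack, assumed)
      dsigma = 120.0              # stress range, MPa (assumed)
      a, a_crit = 0.2e-3, 5.0e-3  # initial (undetectable) and critical crack sizes, m

      cycles = 0
      while a < a_crit:
          dK = Y * dsigma * math.sqrt(math.pi * a)   # stress-intensity range
          da = C * dK ** m                           # growth per cycle
          a += 1000 * da                             # advance in 1000-cycle blocks
          cycles += 1000

      print(f"predicted life ~ {cycles:,} cycles to reach a = {a*1e3:.2f} mm")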

  6. Military Housing Privatization Initiative (MHPI), Eglin AFB, Florida and Hurlburt Field, Florida. Final Environmental Impact Statement

    DTIC Science & Technology

    2011-05-01

    There are several different methods available for determining stormwater runoff peak flows. Two of the most widely used methods are the Rational...environmental factors between the alternatives differ in terms of their respective potential for adverse effects relative to their location. ENVIRONMENTAL...Force selects a development proposal. As a result, the actual project scope may result in different numbers of units constructed or demolished, or
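
    The excerpt above is truncated, but the "Rational" method it names is the widely used Rational Method for estimating peak stormwater runoff, Q = C·i·A (Q in cfs when i is rainfall intensity in in/hr and A is drainage area in acres). The snippet below is a generic illustration of that formula with assumed inputs; it is not taken from the EIS.

      # Rational Method peak-runoff estimate: Q = C * i * A
      # Q is in cfs when i is in in/hr and A is in acres (the unit conversion
      # factor is ~1.008 and is customarily taken as 1). Inputs below are assumed.

      def rational_peak_flow(runoff_coeff, intensity_in_per_hr, area_acres):
          return runoff_coeff * intensity_in_per_hr * area_acres

      # Example: 25-acre residential site, C = 0.45, design intensity 3.2 in/hr
      q_peak = rational_peak_flow(0.45, 3.2, 25.0)
      print(f"Estimated peak runoff: {q_peak:.1f} cfs")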

  7. Assessment of fragment projection hazard: probability distributions for the initial direction of fragments.

    PubMed

    Tugnoli, Alessandro; Gubinelli, Gianfilippo; Landucci, Gabriele; Cozzani, Valerio

    2014-08-30

    The initial direction and velocity of the fragments generated when a vessel fragments due to internal pressure are important inputs to the assessment of damage caused by fragments, in particular within the quantitative risk assessment (QRA) of chemical and process plants. In the present study, an approach is proposed for the identification and validation of probability density functions (pdfs) for the initial direction of the fragments. A detailed review of a large number of past accidents provided the background information for the validation procedure. A specific method was developed for the validation of the proposed pdfs. Validated pdfs were obtained for both the vertical and horizontal angles of projection and for the initial velocity of the fragments. Copyright © 2014 Elsevier B.V. All rights reserved.
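
    The paper's validated pdfs are not reproduced in this abstract, so the sketch below only illustrates how such pdfs are typically used in the fragment-projection step of a QRA: Monte Carlo sampling of the initial horizontal and vertical projection angles and initial speed from assumed placeholder distributions (uniform horizontal angle, clipped normal vertical angle, lognormal speed). The distributions and parameters are assumptions for illustration, not the authors' results.

      import numpy as np

      # Monte Carlo sampling of fragment initial directions from *assumed* pdfs.
      # The validated pdfs from the paper are not given in the abstract; the
      # distributions below are placeholders only.

      rng = np.random.default_rng(0)
      n = 100_000

      phi = rng.uniform(0.0, 2.0 * np.pi, n)                     # horizontal angle [rad]
      theta = np.clip(rng.normal(loc=np.deg2rad(10.0),           # vertical angle [rad]
                                 scale=np.deg2rad(15.0), size=n),
                      -np.pi / 2, np.pi / 2)
      v0 = rng.lognormal(mean=np.log(150.0), sigma=0.3, size=n)  # initial speed [m/s]

      # Fraction of fragments launched within +/-15 deg of horizontal (illustrative)
      near_horizontal = np.mean(np.abs(theta) < np.deg2rad(15.0))
      print(f"fraction within 15 deg of horizontal: {near_horizontal:.2f}")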

  8. Guide to conducting tinnitus retraining therapy initial and follow-up interviews.

    PubMed

    Henry, James A; Jastreboff, Margaret M; Jastreboff, Pawel J; Schechter, Martin A; Fausti, Stephen A

    2003-01-01

    Tinnitus Retraining Therapy (TRT) is a structured method of tinnitus treatment that has been performed since 1990. The TRT Initial Interview form was developed to guide clinicians in obtaining the essential information from patients needed to specify treatment; the TRT Follow-up Interview form is similar to the initial interview form and is designed to evaluate treatment outcomes. The clinician administers both forms verbally. The forms had previously been used in a highly abbreviated format, with the potential for inconsistent interview administration between examiners. The goal of this project was to expand the forms to provide specific wording for each question. The expanded forms are presented in this article, and the intent of each question is explained. Standardized administration of these interview forms will facilitate greater uniformity in the initial evaluation and outcomes analyses of patients treated with TRT.

  9. Connected Classroom: A Program Evaluation of the Professional Development Program of a One-to-One Educational Technology Initiative in South Carolina

    ERIC Educational Resources Information Center

    Grant, Kelly J.

    2016-01-01

    The purpose of this study was to evaluate the impact of the first year of a multi-year, district-wide professional development program for teachers that accompanied a one-to-one Apple device rollout for all students. A mixed-method research design was used to perform a logic model of program evaluation. Teacher self-reported proficiency in basic…

  10. Experimenters' reference based upon Skylab experiment management

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The methods and techniques for experiment development and integration that evolved during the Skylab Program are described to facilitate transferring this experience to experimenters in future manned space programs. Management responsibilities and the sequential process of experiment evolution from initial concept through definition, development, integration, operation and postflight analysis are outlined and amplified, as appropriate. Emphasis is placed on specific lessons learned on Skylab that are worthy of consideration by future programs.

  11. The Role of Context in the Development of Second Language Interactional Competence: A Comparative Microanalysis of Topic Initiation Practices in the Study Abroad Homestay and the Language Classroom

    ERIC Educational Resources Information Center

    Van Booven, Christopher D.

    2017-01-01

    This dissertation research aims to better specify the role of context in the development of second language interactional competence. Drawing on conversation-analytic methods and Wong and Waring's (2010) model of interactional practices, I described and compared the opportunities that two study abroad contexts--the homestay and the language…

  12. Pennsylvania Action Research Network (PA-ARN) Staff Development through Five Regional Staff Development Centers. Final Report. July 1997-June 1998.

    ERIC Educational Resources Information Center

    Pennsylvania State Univ., McKeesport.

    The Pennsylvania Action Research Network project was initiated in 1995-1996 to provide Pennsylvania literacy educators with the following: a better method for taking published research findings and testing and adapting them in their own classrooms; a way to study their own research ideas on a daily-action basis; and a systematic way to share and…

  13. Development and Reliability of Items Measuring the Nonmedical Use of Prescription Drugs for the Youth Risk Behavior Survey: Results From an Initial Pilot Test

    ERIC Educational Resources Information Center

    Howard, Melissa M.; Weiler, Robert M.; Haddox, J. David

    2009-01-01

    Background: The purpose of this study was to develop and test the reliability of self-report survey items designed to monitor the nonmedical use of prescription drugs among adolescents. Methods: Eighteen nonmedical prescription drug items designed to be congruent with the substance abuse items in the US Centers for Disease Control and Prevention's…

  14. Using Method of Instruction to Predict the Skills Supporting Initial Reading Development: Insight from a Synthetic Phonics Approach

    ERIC Educational Resources Information Center

    McGeown, Sarah P.; Medford, Emma

    2014-01-01

    This study examined the skills predicting early reading development when children were taught by a synthetic phonics approach. Eighty five children taught to read by systematic synthetic phonics were assessed on reading and cognitive assessments prior to reading instruction (average age 4 years, 7 months), 6 months later (5 years, 1 month), and 73…

  15. Robust iterative method for nonlinear Helmholtz equation

    NASA Astrophysics Data System (ADS)

    Yuan, Lijun; Lu, Ya Yan

    2017-08-01

    A new iterative method is developed for solving the two-dimensional nonlinear Helmholtz equation which governs polarized light in media with the optical Kerr nonlinearity. In the strongly nonlinear regime, the nonlinear Helmholtz equation could have multiple solutions related to phenomena such as optical bistability and symmetry breaking. The new method exhibits a much more robust convergence behavior than existing iterative methods, such as frozen-nonlinearity iteration, Newton's method and damped Newton's method, and it can be used to find solutions when good initial guesses are unavailable. Numerical results are presented for the scattering of light by a nonlinear circular cylinder based on the exact nonlocal boundary condition and a pseudospectral method in the polar coordinate system.
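
    As a point of reference for the baseline the abstract mentions (not the authors' new method), the sketch below implements a frozen-nonlinearity (Picard) iteration for a one-dimensional Kerr-type Helmholtz problem on a finite-difference grid: the Kerr term is evaluated at the previous iterate and the resulting linear problem is solved repeatedly. The grid, wavenumber, Kerr coefficient, and source are illustrative; in the strongly nonlinear regime this simple iteration can fail to converge, which is exactly the difficulty the paper addresses.

      import numpy as np

      # Frozen-nonlinearity (Picard) iteration for a 1D Kerr-type Helmholtz problem:
      #     u''(x) + k0^2 * (1 + gamma*|u|^2) * u = f(x),   u(0) = u(L) = 0.
      # Baseline iteration only, with illustrative parameters.

      L_dom, N = 1.0, 200
      x = np.linspace(0.0, L_dom, N + 2)
      h = x[1] - x[0]
      k0, gamma = 20.0, 0.01
      f = np.exp(-200.0 * (x[1:-1] - 0.5 * L_dom) ** 2)    # localized source term

      # Standard three-point second-derivative matrix with Dirichlet boundaries
      D2 = (np.diag(-2.0 * np.ones(N)) + np.diag(np.ones(N - 1), 1)
            + np.diag(np.ones(N - 1), -1)) / h ** 2

      u = np.zeros(N)
      for it in range(200):
          # Freeze |u|^2 at the previous iterate and solve the linear system
          A = D2 + np.diag(k0 ** 2 * (1.0 + gamma * np.abs(u) ** 2))
          u_new = np.linalg.solve(A, f)
          if np.max(np.abs(u_new - u)) < 1e-10:
              u = u_new
              break
          u = u_new

      print(f"stopped after {it + 1} iterations, max |u| = {np.max(np.abs(u)):.3e}")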

  16. Survival analysis and classification methods for forest fire size

    PubMed Central

    2018-01-01

    Factors affecting wildland-fire size distribution include weather, fuels, and fire suppression activities. We present a novel application of survival analysis to quantify the effects of these factors on a sample of sizes of lightning-caused fires from Alberta, Canada. Two events were observed for each fire: the size at initial assessment (by the first fire fighters to arrive at the scene) and the size at “being held” (a state when no further increase in size is expected). We developed a statistical classifier to try to predict cases where there will be a growth in fire size (i.e., the size at “being held” exceeds the size at initial assessment). Logistic regression was preferred over two alternative classifiers, with covariates consistent with similar past analyses. We conducted survival analysis on the group of fires exhibiting a size increase. A screening process selected three covariates: an index of fire weather at the day the fire started, the fuel type burning at initial assessment, and a factor for the type and capabilities of the method of initial attack. The Cox proportional hazards model performed better than three accelerated failure time alternatives. Both fire weather and fuel type were highly significant, with effects consistent with known fire behaviour. The effects of initial attack method were not statistically significant, but did suggest a reverse causality that could arise if fire management agencies were to dispatch resources based on a-priori assessment of fire growth potentials. We discuss how a more sophisticated analysis of larger data sets could produce unbiased estimates of fire suppression effect under such circumstances. PMID:29320497

  17. Survival analysis and classification methods for forest fire size.

    PubMed

    Tremblay, Pier-Olivier; Duchesne, Thierry; Cumming, Steven G

    2018-01-01

    Factors affecting wildland-fire size distribution include weather, fuels, and fire suppression activities. We present a novel application of survival analysis to quantify the effects of these factors on a sample of sizes of lightning-caused fires from Alberta, Canada. Two events were observed for each fire: the size at initial assessment (by the first fire fighters to arrive at the scene) and the size at "being held" (a state when no further increase in size is expected). We developed a statistical classifier to try to predict cases where there will be a growth in fire size (i.e., the size at "being held" exceeds the size at initial assessment). Logistic regression was preferred over two alternative classifiers, with covariates consistent with similar past analyses. We conducted survival analysis on the group of fires exhibiting a size increase. A screening process selected three covariates: an index of fire weather at the day the fire started, the fuel type burning at initial assessment, and a factor for the type and capabilities of the method of initial attack. The Cox proportional hazards model performed better than three accelerated failure time alternatives. Both fire weather and fuel type were highly significant, with effects consistent with known fire behaviour. The effects of initial attack method were not statistically significant, but did suggest a reverse causality that could arise if fire management agencies were to dispatch resources based on a-priori assessment of fire growth potentials. We discuss how a more sophisticated analysis of larger data sets could produce unbiased estimates of fire suppression effect under such circumstances.
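
    To make the two-stage analysis described above concrete, the sketch below shows the generic pattern on simulated data: a logistic-regression classifier for whether a fire grows beyond its initial-assessment size, followed by a Cox proportional-hazards fit (here via the lifelines package) on the subset that grew, with a fire-weather index and fuel type as covariates. The data, covariate names, and coefficients are placeholders, not the Alberta dataset used in the paper.

      import numpy as np
      import pandas as pd
      from sklearn.linear_model import LogisticRegression
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(1)
      n = 2000

      # --- Simulated placeholder data (NOT the Alberta fire records) ---
      fwi = rng.gamma(shape=2.0, scale=10.0, size=n)    # fire-weather index
      fuel_conifer = rng.integers(0, 2, size=n)         # 1 = conifer fuel type
      grew = rng.random(n) < 1.0 / (1.0 + np.exp(-(0.03 * fwi + 0.5 * fuel_conifer - 2.0)))

      # Stage 1: classify which fires grow beyond their initial-assessment size
      X = np.column_stack([fwi, fuel_conifer])
      clf = LogisticRegression().fit(X, grew)
      print("classifier coefficients:", clf.coef_)

      # Stage 2: survival-style analysis on the fires that grew; here "duration"
      # plays the role of the size increase and every increase is fully observed.
      g = grew
      growth = rng.exponential(scale=np.exp(1.0 - 0.02 * fwi[g] - 0.3 * fuel_conifer[g]))
      df = pd.DataFrame({
          "duration": growth,
          "event": 1,
          "fwi": fwi[g],
          "fuel_conifer": fuel_conifer[g],
      })
      cph = CoxPHFitter()
      cph.fit(df, duration_col="duration", event_col="event")
      cph.print_summary()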

  18. 42 CFR 414.313 - Initial method of payment.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 3 2013-10-01 2013-10-01 false Initial method of payment. 414.313 Section 414.313... of Reasonable Charges Under the ESRD Program § 414.313 Initial method of payment. (a) Basic rule. Under this method, the intermediary pays the facility for routine professional services furnished by...

  19. Adaptive designs in clinical trials.

    PubMed

    Bowalekar, Suresh

    2011-01-01

    In addition to the expensive and lengthy process of developing a new medicine, the attrition rate in clinical research was on the rise, resulting in stagnation in the development of new compounds. In response, the US Food and Drug Administration released a critical path initiative document in 2004, highlighting the need for innovative trial designs. One of the suggested innovations was the use of adaptive designs for clinical trials. Thus, following the critical path initiative, there is growing interest in using adaptive designs for the development of pharmaceutical products. Adaptive designs are expected to have great potential to reduce the number of patients and the duration of the trial, and to reduce exposure to the new drug. Adaptive designs are not new in the sense that the interim analysis (IA)/review of accumulated data used in adaptive designs existed in the past too. However, such reviews/analyses of accumulated data were not necessarily planned at the stage of designing the clinical trial, and the methods used were not necessarily compliant with the clinical trial process. The Bayesian approach commonly used in adaptive designs was developed by Thomas Bayes in the 18th century, about a hundred years before the development of modern statistical methods by the father of modern statistics, Sir Ronald A. Fisher, but the complexity of the Bayesian approach long prevented its use in practice. Advances in computer and information technology over the last three to four decades have changed this, and Bayesian techniques are now used in adaptive designs alongside the other sequential methods used in IA. This paper describes the various adaptive designs used in clinical trials and the views of stakeholders about the feasibility of using them, without going into mathematical complexities.
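
    As a deliberately simplified illustration of the Bayesian machinery referred to above, the sketch below performs a single interim analysis in a two-arm trial with a binary outcome: a Beta-Binomial posterior is updated with the accumulated data, and the trial stops early for efficacy or futility depending on the posterior probability that the new treatment beats control. All priors, counts, and thresholds are hypothetical and not tied to any specific design discussed in the paper.

      import numpy as np

      # One Bayesian interim look in a two-arm trial with binary response.
      # Beta(1, 1) priors; all counts and thresholds are hypothetical.

      rng = np.random.default_rng(42)

      # Accumulated interim data (responders / enrolled), hypothetical
      resp_trt, n_trt = 18, 40
      resp_ctl, n_ctl = 11, 40

      # Posterior for each arm: Beta(1 + responders, 1 + non-responders)
      post_trt = rng.beta(1 + resp_trt, 1 + n_trt - resp_trt, size=100_000)
      post_ctl = rng.beta(1 + resp_ctl, 1 + n_ctl - resp_ctl, size=100_000)

      prob_superior = np.mean(post_trt > post_ctl)   # P(p_trt > p_ctl | data)
      print(f"P(treatment better than control | interim data) = {prob_superior:.3f}")

      # Hypothetical decision rule for this illustration only
      if prob_superior > 0.99:
          print("stop early for efficacy")
      elif prob_superior < 0.10:
          print("stop early for futility")
      else:
          print("continue enrolment to the next interim analysis")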

  20. A Psychometric Approach to Theory-Based Behavior Change Intervention Development: Example From the Colorado Meaning-Activity Project.

    PubMed

    Masters, Kevin S; Ross, Kaile M; Hooker, Stephanie A; Wooldridge, Jennalee L

    2018-05-18

    There has been a notable disconnect between theories of behavior change and behavior change interventions. Because few interventions are both explicitly and adequately theory-based, investigators cannot assess the impact of theory on intervention effectiveness. Theory-based interventions, designed to deliberately engage a theory's proposed mechanisms of change, are needed to adequately test theories; thus, systematic approaches to theory-based intervention development are needed. This article introduces and discusses the psychometric method of developing theory-based interventions. The psychometric approach applies basic psychometric principles at each step of the intervention development process in order to build a theoretically driven intervention that can subsequently be tested in process (mechanism) and outcome studies. Five stages of intervention development are presented: (i) choice of theory; (ii) identification and characterization of key concepts and expected relations; (iii) intervention construction; (iv) initial testing and revision; and (v) empirical testing of the intervention. Examples of this approach from the Colorado Meaning-Activity Project (COMAP) are presented. Based on self-determination theory integrated with meaning or purpose, and utilizing a motivational interviewing approach, the COMAP intervention is individually based, with an initial interview followed by smartphone-delivered interventions for increasing daily activity. The psychometric approach to intervention development is one method of ensuring careful consideration of theory at all steps of intervention development. This structured approach supports a research culture that endorses the deliberate and systematic operationalization of theory into behavior change interventions from the outset of intervention development.
