Sample records for method detailed evaluation

  1. Evaluation of streams in selected communities for the application of limited-detail study methods for flood-insurance studies

    USGS Publications Warehouse

    Cobb, Ernest D.

    1986-01-01

    The U.S. Geological Survey evaluated 2,349 communities in 1984 for the application of limited-detail flood-insurance study methods, that is, methods requiring less effort and cost than detailed studies. Limited-detail study methods were found to be appropriate for 1,705 communities, detailed studies for 62 communities, and no studies for 582 communities. The total length of streams for which limited-detail studies are recommended is 9,327 miles, with a corresponding cost of $23,007,000, an average estimated cost of about $2,500 per mile of studied stream length. The purpose of the report is to document the limited-detail study methods and the results of the evaluation. (USGS)
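
    As a quick sanity check on the quoted average, a minimal Python sketch reproducing the per-mile figure from the stated totals (not part of the original record):

    ```python
    # Back-of-the-envelope check of the record's quoted average cost per mile.
    total_cost_usd = 23_007_000
    stream_miles = 9_327

    avg_cost = total_cost_usd / stream_miles
    print(f"${avg_cost:,.0f} per mile")  # ~$2,467, consistent with the rounded $2,500 figure
    ```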

  2. Reported credibility techniques in higher education evaluation studies that use qualitative methods: A research synthesis.

    PubMed

    Liao, Hongjing; Hitchcock, John

    2018-06-01

    This synthesis study examined the reported use of credibility techniques in higher education evaluation articles that use qualitative methods. The sample included 118 articles published in six leading higher education evaluation journals from 2003 to 2012. Mixed methods approaches were used to identify key credibility techniques reported across the articles, document the frequency of these techniques, and describe their use and properties. Two broad sets of techniques were of interest: primary design techniques (i.e., basic techniques such as sampling/participant recruitment strategies, data collection methods, and analytic details) and additional qualitative credibility techniques (e.g., member checking, negative case analysis, peer debriefing). The majority of evaluation articles reported use of primary techniques, although there was wide variation in the amount of supporting detail; most of the articles did not describe the use of additional credibility techniques. This suggests that editors of evaluation journals should encourage the reporting of qualitative design details and that authors should develop strategies yielding fuller methodological description. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. Methods for comparative evaluation of propulsion system designs for supersonic aircraft

    NASA Technical Reports Server (NTRS)

    Tyson, R. M.; Mairs, R. Y.; Halferty, F. D., Jr.; Moore, B. E.; Chaloff, D.; Knudsen, A. W.

    1976-01-01

    The propulsion system comparative evaluation study was conducted to define a rapid, approximate method for evaluating the effects of propulsion system changes for an advanced supersonic cruise airplane, and to verify the approximate method by comparing its mission performance results with those from a more detailed analysis. A table look-up computer program was developed to determine nacelle drag increments for a range of parametric nacelle shapes and sizes. Aircraft sensitivities to propulsion parameters were defined. Nacelle shapes, installed weights, and installed performance were determined for four study engines selected from the NASA supersonic cruise aircraft research (SCAR) engine studies program. Both the rapid evaluation method (using sensitivities) and traditional preliminary design methods were then used to assess the four engines. The rapid method was found to compare well with the more detailed analyses.

  4. Evaluation of fatigue-prone details using a low-cost thermoelastic stress analysis system.

    DOT National Transportation Integrated Search

    2016-11-01

    This study was designed to develop a novel approach for in situ evaluation of stress fields in the vicinity of fatigue-prone details on highway bridges using a low-cost microbolometer thermal imager. The method was adapted into a field-deployable i...

  5. Quantitative method of medication system interface evaluation.

    PubMed

    Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F

    2007-01-01

    The objective of this study was to develop a quantitative method of evaluating the user interface for medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced with use of the system under evaluation provided estimates of failure rates for each point in this simplified fault tree. Means of estimated failure rates provided quantitative data for fault analysis. Authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.
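
    The record names a simplified fault tree but not its structure, so the sketch below only illustrates the general idea: per-step failure estimates elicited from several raters are averaged, then combined through an OR gate. Step names, rates, and the gate layout are hypothetical.

    ```python
    from statistics import mean

    # Hypothetical per-step failure-rate estimates from several nurses; the mean of
    # each step's estimates feeds the fault tree, as the abstract describes.
    estimates = {
        "select_patient": [0.02, 0.04, 0.03],
        "select_drug":    [0.05, 0.06, 0.04],
        "confirm_dose":   [0.01, 0.02, 0.02],
    }
    p = {step: mean(vals) for step, vals in estimates.items()}

    def or_gate(probs):
        """OR gate: the task fails if any independent step fails."""
        ok = 1.0
        for pi in probs:
            ok *= (1.0 - pi)
        return 1.0 - ok

    print(f"Estimated task failure probability: {or_gate(p.values()):.3f}")
    ```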

  6. How Much Detail Needs to Be Elucidated in Self-Harm Research?

    ERIC Educational Resources Information Center

    Stanford, Sarah; Jones, Michael P.

    2010-01-01

    Assessing self-harm through brief multiple choice items is simple and less invasive than more detailed methods of assessment. However, there is currently little validation for brief methods of self-harm assessment. This study evaluates the extent to which adolescents' perceptions of self-harm agree with definitions in the literature, and what…

  7. "Expectations to Change" ((E2C): A Participatory Method for Facilitating Stakeholder Engagement with Evaluation Findings

    ERIC Educational Resources Information Center

    Adams, Adrienne E.; Nnawulezi, Nkiru A.; Vandenberg, Lela

    2015-01-01

    From a utilization-focused evaluation perspective, the success of an evaluation is rooted in the extent to which the evaluation was used by stakeholders. This paper details the "Expectations to Change" (E2C) process, an interactive, workshop-based method designed to engage primary users with their evaluation findings as a means of…

  8. Methods for the evaluation of alternative disaster warning systems

    NASA Technical Reports Server (NTRS)

    Agnew, C. E.; Anderson, R. J., Jr.; Lanen, W. N.

    1977-01-01

    For each of the methods identified, a theoretical basis is provided and an illustrative example is described. The example includes sufficient realism and detail to enable an analyst to conduct an evaluation of other systems. The methods discussed in the study include equal capability cost analysis, consumers' surplus, and statistical decision theory.

  9. Methods of assessing structural integrity for space shuttle vehicles

    NASA Technical Reports Server (NTRS)

    Anderson, R. E.; Stuckenberg, F. H.

    1971-01-01

    A detailed description and evaluation of nondestructive evaluation (NDE) methods are given which have application to space shuttle vehicles. Appropriate NDE design data is presented in twelve specifications in an appendix. Recommendations for NDE development work for the space shuttle program are presented.

  10. Traveler Phase 1A Joint Review

    NASA Technical Reports Server (NTRS)

    St. John, Clint; Scofield, Jan; Skoog, Mark; Flock, Alex; Williams, Ethan; Guirguis, Luke; Loudon, Kevin; Sutherland, Jeffrey; Lehmann, Richard; Garland, Michael

    2017-01-01

    The briefing contains the preliminary findings and suggestions for improving the methods used in the development and evaluation of a multi-monitor runtime assurance architecture for autonomous flight vehicles. Initial system design, implementation, verification, and flight testing have been conducted. Detailed data review is not yet complete, and flight testing has been limited to initial monitor force fights. Detailed monitor flight evaluations have yet to be performed.

  11. A Review of Methods and Instruments Used in State and Local School Readiness Evaluations. Issues & Answers. REL 2007-No. 004

    ERIC Educational Resources Information Center

    Brown, Glyn; Scott-Little, Catherine; Amwake, Lynn; Wynn, Lucy

    2007-01-01

    The report provides detailed information about the methods and instruments used to evaluate school readiness initiatives, discusses important considerations in selecting instruments, and provides resources and recommendations that may be helpful to those who are designing and implementing school readiness evaluations. Study results indicate that…

  12. A pilot study evaluating alternative approaches of academic detailing in rural family practice clinics

    PubMed Central

    2012-01-01

    Background Academic detailing is an interactive, convenient, and user-friendly approach to delivering non-commercial education to healthcare clinicians. While evidence suggests academic detailing is associated with improvements in prescribing behavior, uncertainty exists about its generalizability and scalability in diverse settings. Our study evaluates different models of delivering academic detailing in a rural family medicine setting. Methods We conducted a pilot project to assess the feasibility, effectiveness, and satisfaction with academic detailing delivered face-to-face as compared with a modified approach using distance-learning technology. The recipients were four family medicine clinics within the Oregon Rural Practice-based Research Network (ORPRN). Two clinics were allocated to receive face-to-face detailing, and two received outreach via video conferencing or asynchronous web-based methods. Surveys at midpoint and completion were used to assess effectiveness and satisfaction. Results Each clinic received four outreach visits over an eight-month period. Topics included treatment-resistant depression, management of atypical antipsychotics, drugs for insomnia, and benzodiazepine tapering. Overall, 90% of participating clinicians were satisfied with the program. Respondents who received in-person detailing reported a higher likelihood of changing their behavior than respondents in the distance-detailing group for five of seven content areas. While 90%-100% of respondents indicated they would continue to participate if the program were continued, the likelihood of participation declined if only distance approaches were offered. Conclusions We found strong support and satisfaction for the program among participating clinicians. Participants favored in-person approaches over distance interactions. Future efforts will be directed at quantitative methods for evaluating the economic and clinical effectiveness of detailing in rural family practice settings. PMID:23276303

  13. Performance and evaluation of real-time multicomputer control systems

    NASA Technical Reports Server (NTRS)

    Shin, K. G.

    1983-01-01

    New performance measures, detailed examples, modeling of error detection process, performance evaluation of rollback recovery methods, experiments on FTMP, and optimal size of an NMR cluster are discussed.

  14. Structural damages observed in state buildings after Simav/Turkey earthquake occurred on 19 May 2011

    NASA Astrophysics Data System (ADS)

    Tama, Y. S.

    2012-08-01

    Different levels of damage occurred in state buildings, especially educational facilities, during the Simav earthquake (ML=5.7) of 19 May 2011. A site survey was carried out in the area after the earthquake, in which six state buildings were examined in detail. The survey showed that the main causes of damage in these buildings were the use of low-strength concrete, insufficient reinforcement, inappropriate detailing, and low-quality workmanship. The investigated buildings were also evaluated by the P25 rapid assessment method. The method indicates that two of the buildings in question are in the "high risk band", the other two fall into the "detailed evaluation band", and the rest are in the "low risk band". These results match the damage observed in the site survey.

  15. Route visualization using detail lenses.

    PubMed

    Karnick, Pushpak; Cline, David; Jeschke, Stefan; Razdan, Anshuman; Wonka, Peter

    2010-01-01

    We present a method designed to address some limitations of typical route map displays of driving directions. The main goal of our system is to generate a printable version of a route map that shows the overview and detail views of the route within a single, consistent visual frame. Our proposed visualization provides a more intuitive spatial context than a simple list of turns. We present a novel multifocus technique to achieve this goal, where the foci are defined by points of interest (POI) along the route. A detail lens that encapsulates the POI at a finer geospatial scale is created for each focus. The lenses are laid out on the map to avoid occlusion with the route and each other, and to optimally utilize the free space around the route. We define a set of layout metrics to evaluate the quality of a lens layout for a given route map visualization. We compare standard lens layout methods to our proposed method and demonstrate the effectiveness of our method in generating aesthetically pleasing layouts. Finally, we perform a user study to evaluate the effectiveness of our layout choices.
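
    The abstract names layout metrics without defining them, so the following is a minimal sketch of one plausible metric: an occlusion penalty between lens rectangles and the route, assuming axis-aligned boxes. All names and coordinates are illustrative, not the paper's actual metrics.

    ```python
    from itertools import combinations

    # A lens is an axis-aligned rectangle: (x0, y0, x1, y1).
    def overlap_area(a, b):
        w = min(a[2], b[2]) - max(a[0], b[0])
        h = min(a[3], b[3]) - max(a[1], b[1])
        return max(w, 0) * max(h, 0)

    def layout_penalty(lenses, route_boxes):
        """Lower is better: penalize lens-lens and lens-route occlusion."""
        lens_lens = sum(overlap_area(a, b) for a, b in combinations(lenses, 2))
        lens_route = sum(overlap_area(l, r) for l in lenses for r in route_boxes)
        return lens_lens + lens_route

    # Route approximated by bounding boxes of its segments (illustrative values).
    route = [(0, 0, 10, 1), (10, 0, 11, 8)]
    lenses = [(2, 2, 5, 5), (6, 2, 9, 5)]
    print(layout_penalty(lenses, route))  # 0.0: no occlusion in this toy layout
    ```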

  16. Develop a new testing and evaluation protocol to assess flexbase performance using strength of soil binder.

    DOT National Transportation Integrated Search

    2008-01-01

    This research involved a detailed laboratory study of a new test method for evaluating road base materials based on the strength of the soil binder. In this test method, small test specimens (5.0 in. length and 0.75 in. square cross section) of binde...

  17. A guide to the use of distance sampling to estimate abundance of Karner blue butterflies

    USGS Publications Warehouse

    Grundel, Ralph

    2015-01-01

    This guide is intended to describe the use of distance sampling as a method for evaluating the abundance of Karner blue butterflies at a location. Other methods for evaluating abundance exist, including mark-release-recapture and index counts derived from Pollard-Yates surveys, for example. Although this guide is not intended to be a detailed comparison of the pros and cons of each type of method, there are important preliminary considerations to think about before selecting any method for evaluating the abundance of Karner blue butterflies.

  18. Post Occupancy Evaluation of Educational Buildings and Equipment.

    ERIC Educational Resources Information Center

    Watson, Chris

    1997-01-01

    Details the post occupancy evaluation (POE) process for public buildings. POEs are used to improve design and optimize educational building and equipment use. The evaluation participants, the method used, the results and recommendations, model schools, and classroom alterations using POE are described. (9 references.) (RE)

  19. Key Features of Academic Detailing: Development of an Expert Consensus Using the Delphi Method

    PubMed Central

    Yeh, James S.; Van Hoof, Thomas J.; Fischer, Michael A.

    2016-01-01

    Background Academic detailing is an outreach education technique that combines the direct social marketing traditionally used by pharmaceutical representatives with unbiased content summarizing the best evidence for a given clinical issue. Academic detailing is conducted with clinicians to encourage evidence-based practice in order to improve the quality of care and patient outcomes. The adoption of academic detailing has increased substantially since the original studies in the 1980s. However, the lack of standard agreement on its implementation makes the evaluation of academic detailing outcomes challenging. Objective To identify consensus on the key elements of academic detailing among a group of experts with varying experiences in academic detailing. Methods This study is based on an online survey of 20 experts with experience in academic detailing. We used the Delphi process, an iterative and systematic method of developing consensus within a group. We conducted 3 rounds of online surveys, which addressed 72 individual items derived from a previous literature review of 5 features of academic detailing, including (1) content, (2) communication process, (3) clinicians targeted, (4) change agents delivering intervention, and (5) context for intervention. Nonrespondents were removed from later rounds of the surveys. For most questions, a 4-point ordinal scale was used for responses. We defined consensus agreement as 70% of respondents for a single rating category or 80% for dichotomized ratings. Results The overall survey response rate was 95% (54 of 57 surveys) and nearly 92% consensus agreement on the survey items (66 of 72 items) by the end of the Delphi exercise. The experts' responses suggested that (1) focused clinician education offering support for clinical decision-making is a key component of academic detailing, (2) detailing messages need to be tailored and provide feasible strategies and solutions to challenging cases, and (3) academic detailers need to develop specific skill sets required to overcome barriers to changing clinician behavior. Conclusion Consensus derived from this Delphi exercise can serve as a useful template of general principles in academic detailing initiatives and evaluation. The study findings are limited by the lack of standard definitions of certain terms used in the Delphi process. PMID:27066195
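
    A minimal sketch of the stated consensus rules (70% of respondents in a single rating category, or 80% under dichotomized ratings). The split of the 4-point scale into agree/disagree halves is an assumption, not detail given in the record.

    ```python
    from collections import Counter

    def consensus(ratings, single_cut=0.70, dichotomous_cut=0.80):
        """Apply the two consensus rules from the abstract to one item's
        4-point ratings (1-4). Returns which rule, if any, is satisfied."""
        n = len(ratings)
        counts = Counter(ratings)
        if max(counts.values()) / n >= single_cut:
            return "single-category consensus"
        # Assumed dichotomy of the 4-point scale: 1-2 = disagree, 3-4 = agree.
        agree = sum(1 for r in ratings if r >= 3) / n
        if agree >= dichotomous_cut or (1 - agree) >= dichotomous_cut:
            return "dichotomized consensus"
        return "no consensus"

    print(consensus([4, 3, 4, 4, 3, 4, 2, 4, 3, 4]))  # dichotomized consensus (90% agree)
    ```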

  20. Lunar-base construction equipment and methods evaluation

    NASA Technical Reports Server (NTRS)

    Boles, Walter W.; Ashley, David B.; Tucker, Richard L.

    1993-01-01

    A process for evaluating lunar-base construction equipment and methods concepts is presented. The process is driven by the need for more quantitative, systematic, and logical methods for assessing further research and development requirements in an area where uncertainties are high, dependence upon terrestrial heuristics is questionable, and quantitative methods are seldom applied. Decision theory concepts are used in determining the value of accurate information and the process is structured as a construction-equipment-and-methods selection methodology. Total construction-related, earth-launch mass is the measure of merit chosen for mathematical modeling purposes. The work is based upon the scope of the lunar base as described in the National Aeronautics and Space Administration's Office of Exploration's 'Exploration Studies Technical Report, FY 1989 Status'. Nine sets of conceptually designed construction equipment are selected as alternative concepts. It is concluded that the evaluation process is well suited for assisting in the establishment of research agendas in an approach that is first broad, with a low level of detail, followed by more-detailed investigations into areas that are identified as critical due to high degrees of uncertainty and sensitivity.

  1. Satellite Power System (SPS) financial/management scenarios

    NASA Technical Reports Server (NTRS)

    Vajk, J. P.

    1978-01-01

    The possible benefits of a Satellite Power System (SPS) program, both domestically and internationally, justify detailed and imaginative investigation of the issues involved in financing and managing such a large-scale program. In this study, ten possible methods of financing an SPS program are identified, ranging from a pure government agency to private corporations. The following were analyzed and evaluated: (1) capital requirements for SPS; (2) ownership and control; (3) management principles; (4) organizational forms for SPS; (5) criteria for evaluation; (6) detailed description and preliminary evaluation of alternatives; (7) phased approaches; and (8) comparative evaluation. Key issues, observations, and recommendations for further study are also presented.

  2. Academic Detailing Has a Positive Effect on Prescribing and Decreasing Prescription Drug Costs: A Health Plan's Perspective

    PubMed Central

    Ndefo, Uche Anadu; Norman, Rolicia; Henry, Andrea

    2017-01-01

    Background When initiated by a health plan, academic detailing can be used to change prescribing practices, which can lead to increased safety and savings. Objective To evaluate the impact of academic detailing on the prescribing and prescription drug costs of cefixime for a health plan. Methods A prospective intervention study was carried out that evaluated the prescribing practices and prescription drug costs of cefixime. A total of 11 prescribers were detailed by 1 pharmacist between August 2014 and March 2015. Two of the 11 prescribers did not respond to the academic detailing and were not followed up. The physicians' prescribing habits and prescription costs were compared before and after detailing to evaluate the effectiveness of the intervention. Data were collected for approximately 5 months before and after the intervention. Each prescriber served as his or her own control. Results Overall, an approximate 36% reduction in the number of cefixime prescriptions written and an approximate 20% decrease in prescription costs were seen with academic detailing compared with the year before the intervention. In 9 of 11 (82%) prescribers, intervention with academic detailing was successful and resulted in fewer prescriptions for cefixime during the study period. Conclusion Academic detailing had a positive impact on prescribing, by decreasing the number of cefixime prescriptions and lowering the drug costs to the health plan. PMID:28626509

  3. [Forensic medical evaluation of professional working capacity in victims who need additional care].

    PubMed

    Kapustin, A V; Tomilin, V V; Ol'khovnik, V P; Panfilenko, O A; Serebriakova, V G

    2000-01-01

    The philosophy of evaluating a victim's need for extra care is discussed. The method for evaluating the victim's need for transport vehicles is described in detail. Legislative documents that help solve such problems are cited, including those used by committees of forensic medical evaluations.

  4. A Study on Project Priority Evaluation Method on Road Slope Disaster Prevention Management

    NASA Astrophysics Data System (ADS)

    Sekiguchi, Nobuyasu; Ohtsu, Hiroyasu; Izu, Ryuutarou

    To improve the safety and security of driving while coping with today's stagnant economy and frequent natural disasters, road slopes must be appropriately managed. To achieve this, road managers should establish project priority evaluation methods for each stage of road slope management by clarifying the social losses that would result from drops in service levels. Proper evaluation of project priority is essential for managing road slopes effectively. From this viewpoint, this study proposes "project priority evaluation methods" for road slope disaster prevention that use the slope information available at each stage of road slope management under limited funds. The study also investigates the effect of managing slopes in order of priority by evaluating the risk of slope failure. In terms of the amount of available information, staged information provision is needed, ranging from macroscopic studies, which evaluate an entire route at each stage of decision making, through semi-microscopic studies, to microscopic investigations that evaluate individual slopes. With limited funds, additional detailed surveys are difficult to perform, so it is effective to use the slope risk assessment system, which was constructed to complement detailed data, to select sites for precise investigation.

  5. Care Staff Perceptions of Choking Incidents: What Details Are Reported?

    ERIC Educational Resources Information Center

    Guthrie, Susan; Lecko, Caroline; Roddam, Hazel

    2015-01-01

    Background: Following a series of fatal choking incidents in one UK specialist service, this study evaluated the detail included in incident reporting. This study compared the enhanced reporting system in the specialist service with the national reporting and learning system. Methods: Eligible reports were selected from a national organization and…

  6. Validating Analytical Methods

    ERIC Educational Resources Information Center

    Ember, Lois R.

    1977-01-01

    The procedures utilized by the Association of Official Analytical Chemists (AOAC) to develop, evaluate, and validate analytical methods for the analysis of chemical pollutants are detailed. Methods validated by AOAC are used by the EPA and FDA in their enforcement programs and are granted preferential treatment by the courts. (BT)

  7. Lower-upper-threshold correlation for underwater range-gated imaging self-adaptive enhancement.

    PubMed

    Sun, Liang; Wang, Xinwei; Liu, Xiaoquan; Ren, Pengdao; Lei, Pingshun; He, Jun; Fan, Songtao; Zhou, Yan; Liu, Yuliang

    2016-10-10

    In underwater range-gated imaging (URGI), enhancement of low-brightness, low-contrast images is critical for human observation. Traditional histogram equalization over-enhances images, with the result that details are lost. To suppress over-enhancement, a lower-upper-threshold correlation method is proposed for self-adaptive enhancement in URGI, based on double-plateau histogram equalization. The lower threshold determines image details and compresses over-enhancement, and it is correlated with the upper threshold. First, the upper threshold is updated in real time by searching for the local maximum; the lower threshold is then calculated from the upper threshold and the number of nonzero bins selected from a filtered histogram. With this method, the backgrounds of underwater images are constrained while details are enhanced. Finally, proof-of-concept experiments were performed. Peak signal-to-noise ratio, variance, contrast, and human visual properties are used to evaluate the objective quality of the global and region-of-interest images. The evaluation results demonstrate that the proposed method adaptively selects proper upper and lower thresholds under different conditions and contributes to URGI with effective image enhancement for human eyes.
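
    A minimal sketch of the underlying double-plateau equalization step, assuming an 8-bit image. The paper's real-time threshold search and correlation rule are replaced here by fixed `t_low`/`t_up` values to keep the sketch self-contained.

    ```python
    import numpy as np

    def double_plateau_equalize(img, t_low, t_up):
        """Double-plateau histogram equalization sketch for an 8-bit image:
        bins above t_up are clipped (limits over-enhancement of backgrounds),
        nonzero bins below t_low are raised (protects weak details)."""
        hist = np.bincount(img.ravel(), minlength=256).astype(float)
        clipped = hist.copy()
        clipped[clipped > t_up] = t_up
        clipped[(clipped > 0) & (clipped < t_low)] = t_low
        cdf = np.cumsum(clipped)
        lut = np.round(255 * cdf / cdf[-1]).astype(np.uint8)
        return lut[img]

    # The paper derives t_up from a local-maximum search and t_low from t_up and
    # the count of nonzero bins of a filtered histogram; fixed values used here.
    img = (np.random.rand(64, 64) * 80).astype(np.uint8)  # dim, low-contrast frame
    out = double_plateau_equalize(img, t_low=4, t_up=40)
    ```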

  8. Towards cleaner combustion engines through groundbreaking detailed chemical kinetic models

    PubMed Central

    Battin-Leclerc, Frédérique; Blurock, Edward; Bounaceur, Roda; Fournet, René; Glaude, Pierre-Alexandre; Herbinet, Olivier; Sirjean, Baptiste; Warth, V.

    2013-01-01

    In the context of limiting the environmental impact of transportation, this paper reviews new directions being followed in the development of more predictive and more accurate detailed chemical kinetic models for the combustion of fuels. In the first part, the performance of current models, especially in terms of the prediction of pollutant formation, is evaluated. In the following parts, recent methods and ways to improve these models are described. Emphasis is given to the development of detailed models based on elementary reactions, to the production of the related thermochemical and kinetic parameters, and to the experimental techniques available to produce the data necessary to evaluate model predictions under well-defined conditions. PMID:21597604

  9. Speciation and Attenuation of Arsenic and Selenium at Coal Combustion By-Product Management Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    K. Ladwig

    2005-12-31

    The overall objective of this project was to evaluate the impact of key constituents captured from power plant air streams (principally arsenic and selenium) on the disposal and utilization of coal combustion products (CCPs). Specific objectives of the project were: (1) to develop a comprehensive database of field leachate concentrations at a wide range of CCP management sites, including speciation of arsenic and selenium, and low-detection-limit analyses for mercury; (2) to perform detailed evaluations of the release and attenuation of arsenic species at three CCP sites; and (3) to perform detailed evaluations of the release and attenuation of selenium species at three CCP sites. Each of these objectives was accomplished using a combination of field sampling and laboratory analysis and experimentation. All of the methods used and results obtained are contained in this report. For ease of use, the report is subdivided into three parts. Volume 1 contains methods and results for the field leachate characterization. Volume 2 contains methods and results for arsenic adsorption. Volume 3 contains methods and results for selenium adsorption.

  10. IMPROVING THE REPORTING OF THERAPEUTIC EXERCISE INTERVENTIONS IN REHABILITATION RESEARCH.

    PubMed

    Page, Phil; Hoogenboom, Barb; Voight, Michael

    2017-04-01

    The foundation of evidence-based practice lies in clinical research, which is based on the utilization of the scientific method. The scientific method requires that all details of the experiment be provided in publications to support replication of the study in order to evaluate and validate the results. More importantly, clinical research can only be translated into practice when researchers provide explicit details of the study. Too often, rehabilitation exercise intervention studies lack the appropriate detail to allow clinicians to replicate the exercise protocol in their patient populations. Therefore, the purpose of this clinical commentary is to provide guidelines for optimal reporting of therapeutic exercise interventions in rehabilitation research.

  11. in silico Surveillance: evaluating outbreak detection with simulation models

    PubMed Central

    2013-01-01

    Background Detecting outbreaks is a crucial task for public health officials, yet gaps remain in the systematic evaluation of outbreak detection protocols. The authors' objectives were to design, implement, and test a flexible methodology for generating detailed synthetic surveillance data that provides realistic geographical and temporal clustering of cases, and to use it to evaluate outbreak detection protocols. Methods A detailed representation of the Boston area was constructed, based on data about individuals, locations, and activity patterns. Influenza-like illness (ILI) transmission was simulated, producing 100 years of in silico ILI data. Six different surveillance systems were designed and developed using gathered cases from the simulated disease data. Performance was measured by inserting test outbreaks into the surveillance streams and analyzing the likelihood and timeliness of detection. Results Detection of outbreaks varied from 21% to 95%. Increased coverage did not linearly improve detection probability for all surveillance systems. Relaxing the decision threshold for signaling outbreaks greatly increased false-positives, improved outbreak detection slightly, and led to earlier outbreak detection. Conclusions Geographical distribution can be more important than coverage level. Detailed simulations of infectious disease transmission can be configured to represent nearly any conceivable scenario. They are a powerful tool for evaluating the performance of surveillance systems and methods used for outbreak detection. PMID:23343523
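
    The paper's six surveillance systems and agent-based simulation are not reproduced here; the sketch below only illustrates the evaluation pattern described, namely injecting a test outbreak into a synthetic count stream, running a simple threshold detector, and measuring detection probability and timeliness. All parameters are illustrative.

    ```python
    import random

    def detect(counts, baseline, threshold):
        """Flag the first day the count exceeds baseline + threshold; None if never."""
        for day, c in enumerate(counts):
            if c > baseline + threshold:
                return day
        return None

    random.seed(1)
    baseline = 50
    trials, detected, delays = 200, 0, []
    for _ in range(trials):
        stream = [random.gauss(baseline, 5) for _ in range(30)]
        for day in range(10, 17):                 # inject a 7-day test outbreak
            stream[day] += 4 * (day - 9)          # linearly growing excess cases
        hit = detect(stream, baseline, threshold=15)
        if hit is not None and hit >= 10:         # ignore pre-outbreak false alarms
            detected += 1
            delays.append(hit - 10)

    print(f"detection probability: {detected / trials:.2f}")
    if delays:
        print(f"mean detection delay:  {sum(delays) / len(delays):.1f} days")
    ```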

  12. MDCT evaluation of potential living renal donor, prior to laparoscopic donor nephrectomy: What the transplant surgeon wants to know?

    PubMed Central

    Ghonge, Nitin P; Gadanayak, Satyabrat; Rajakumari, Vijaya

    2014-01-01

    As Laparoscopic Donor Nephrectomy (LDN) offers several advantages for the donor, such as less post-operative pain, fewer cosmetic concerns, and faster recovery time, there is a growing global trend towards LDN as compared with open nephrectomy. Comprehensive pre-LDN donor evaluation includes assessment of renal morphology, including the pelvi-calyceal and vascular systems. Apart from donor selection, evaluation of the regional anatomy allows precise surgical planning. Due to limited visualization during laparoscopic renal harvesting, detailed pre-transplant evaluation of the regional anatomy, including the renal venous anatomy, is of utmost importance. MDCT is the modality of choice for pre-LDN evaluation of potential renal donors. Apart from an appropriate scan protocol and post-processing methods, a detailed understanding of surgical techniques is essential for the radiologist for accurate image interpretation during pre-LDN MDCT evaluation of potential renal donors. This review article describes MDCT evaluation of the potential living renal donor prior to LDN, with emphasis on the scan protocol, post-processing methods, and image interpretation. The article lays special emphasis on the surgical perspectives of pre-LDN MDCT evaluation and addresses the important points that transplant surgeons want to know. PMID:25489130

  13. Validation and evaluation of the advanced aeronautical CFD system SAUNA: A method developer's view

    NASA Astrophysics Data System (ADS)

    Shaw, J. A.; Peace, A. J.; Georgala, J. M.; Childs, P. N.

    1993-09-01

    This paper is concerned with a detailed validation and evaluation of the SAUNA CFD system for complex aircraft configurations. The methodology of the complete system is described in brief, including its unique use of differing grid generation strategies (structured, unstructured, or both) depending on the geometric complexity of the configuration. A wide range of configurations and flow conditions is chosen in the validation and evaluation exercise to demonstrate the scope of SAUNA. A detailed description of the results from the method is preceded by a discussion of the philosophy behind the strategy followed in the exercise, in terms of quality assessment and the differing roles of the code developer and the code user. It is considered that SAUNA has grown into a highly usable tool for the aircraft designer, combining flexibility and accuracy in an efficient manner.

  14. Childhood Obesity Research Demonstration project: Cross-site evaluation method

    USDA-ARS?s Scientific Manuscript database

    The Childhood Obesity Research Demonstration (CORD) project links public health and primary care interventions in three projects described in detail in accompanying articles in this issue of Childhood Obesity. This article describes a comprehensive evaluation plan to determine the extent to which th...

  15. Qualitative evaluations and comparisons of six night-vision colorization methods

    NASA Astrophysics Data System (ADS)

    Zheng, Yufeng; Reese, Kristopher; Blasch, Erik; McManamon, Paul

    2013-05-01

    Current multispectral night vision (NV) colorization techniques can manipulate images to produce colorized images that closely resemble natural scenes. Colorized NV images can enhance human perception by improving observer object classification and reaction times, especially under low light conditions. This paper focuses on the qualitative (subjective) evaluation and comparison of six NV colorization methods. The multispectral images include visible (Red-Green-Blue), near infrared (NIR), and long wave infrared (LWIR) images. The six colorization methods are channel-based color fusion (CBCF), statistic matching (SM), histogram matching (HM), joint-histogram matching (JHM), statistic matching then joint-histogram matching (SM-JHM), and the lookup table (LUT). Four categories of quality measurements are used for the qualitative evaluations: contrast, detail, colorfulness, and overall quality. The score of each measurement is rated on a 1-to-3 scale representing low, average, and high quality, respectively. Specifically, high contrast (rated score 3) means an adequate level of brightness and contrast. High detail represents high clarity of detailed contents while maintaining low artifacts. High colorfulness preserves more natural colors (i.e., closely resembles the daylight image). Overall quality is determined from the NV image compared with the reference image. Nine sets of multispectral NV images were used in our experiments. For each set, the six colorized NV images (produced from NIR and LWIR images) were concurrently presented to users along with the reference color (RGB) image (taken in daytime). A total of 67 subjects passed a screening test ("Ishihara Color Blindness Test") and were asked to evaluate the nine sets of colorized images. The experimental results showed the quality order of the colorization methods from best to worst: CBCF < SM < SM-JHM < LUT < JHM < HM. It is anticipated that this work will provide a benchmark for NV colorization and for quantitative evaluation using an objective metric such as the objective evaluation index (OEI).
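
    Of the six methods, statistic matching is simple enough to sketch: match each channel's mean and standard deviation to a daylight reference. This shows only the generic technique; the paper's full pipelines (fusion of NIR/LWIR bands, CBCF, joint-histogram variants) are not reproduced, and the arrays below are toy stand-ins.

    ```python
    import numpy as np

    def statistic_match(src, ref):
        """Match the mean and standard deviation of `src` to `ref`, per channel."""
        out = np.empty_like(src, dtype=float)
        for c in range(src.shape[2]):
            s, r = src[..., c], ref[..., c]
            out[..., c] = (s - s.mean()) / (s.std() + 1e-8) * r.std() + r.mean()
        return np.clip(out, 0, 255).astype(np.uint8)

    # Toy stand-ins: a fused false-color NV frame and a daylight reference image.
    nv_fused = (np.random.rand(32, 32, 3) * 120).astype(np.uint8)
    daylight = (np.random.rand(32, 32, 3) * 255).astype(np.uint8)
    colorized = statistic_match(nv_fused, daylight)
    ```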

  16. On the Unification of Psychology, Methodology, and Pedagogy.

    ERIC Educational Resources Information Center

    Wettersten, John

    1987-01-01

    The psychological and methodological bases of the Agassi teaching method are described to provide a context for evaluating the theory. A brief history of Selzian psychology and Popper's methodology is given. The Agassi method, which stresses learning through questioning, is detailed. (JL)

  17. An approach to the preliminary evaluation of Closed Ecological Life Support System (CELSS) scenarios and control strategies

    NASA Technical Reports Server (NTRS)

    Stahr, J. D.; Auslander, D. M.; Spear, R. C.; Young, G. E.

    1982-01-01

    Life support systems for manned space missions are discussed. A scenario analysis method was proposed for the initial step of comparing possible partial or total recycle scenarios. The method is discussed in detail.

  18. Clinical Decision Support Alert Appropriateness: A Review and Proposal for Improvement

    PubMed Central

    McCoy, Allison B.; Thomas, Eric J.; Krousel-Wood, Marie; Sittig, Dean F.

    2014-01-01

    Background Many healthcare providers are adopting clinical decision support (CDS) systems to improve patient safety and meet meaningful use requirements. Computerized alerts that prompt clinicians about drug-allergy, drug-drug, and drug-disease warnings or provide dosing guidance are most commonly implemented. Alert overrides, which occur when clinicians do not follow the guidance presented by the alert, can hinder improved patient outcomes. Methods We present a review of CDS alerts and describe a proposal to develop novel methods for evaluating and improving CDS alerts that builds upon traditional informatics approaches. Our proposal incorporates previously described models for predicting alert overrides that utilize retrospective chart review to determine which alerts are clinically relevant and which overrides are justifiable. Results Despite increasing implementations of CDS alerts, detailed evaluations rarely occur because of the extensive labor involved in manual chart reviews to determine alert and response appropriateness. Further, most studies have solely evaluated alert overrides that are appropriate or justifiable. Our proposal expands the use of web-based monitoring tools with an interactive dashboard for evaluating CDS alert and response appropriateness that incorporates the predictive models. The dashboard provides 2 views, an alert detail view and a patient detail view, to provide a full history of alerts and help put the patient's events in context. Conclusion The proposed research introduces several innovations to address the challenges and gaps in alert evaluations. This research can transform alert evaluation processes across healthcare settings, leading to improved CDS, reduced alert fatigue, and increased patient safety. PMID:24940129

  19. COMPARISON AND EVALUATION OF LABORATORY PERFORMANCE ON A METHOD FOR THE DETERMINATION OF PERCHLORATE IN FERTILIZERS

    EPA Science Inventory

    This report details the interlaboratory validation of a method for the determination of perchlorate in fertilizers. In this method (EPA/600/R-01/026), a solid sample of fertilizer is first ground. subsequently, the ground material is either leached with deionized water to dissolv...

  20. [Research progresses on ergonomics assessment and measurement methods for push-pull behavior].

    PubMed

    Zhao, Yan; Li, Dongxu; Guo, Shengpeng

    2011-10-01

    Pushing and pulling (P&P) is a common operating mode of operator's physical works, and plays an important role in evaluation of human behavior health and operation performance. At present, there are many research methods of P&P, and this article is a state-of-art review of the classification of P&P research methods, the various impact factors in P&P program, technical details of internal/external P&P force measurement and evaluation, the limitation of current research methods and the future developments in the ergonomics field.

  1. An Improved Image Ringing Evaluation Method with Weighted Sum of Gray Extreme Value

    NASA Astrophysics Data System (ADS)

    Yang, Ling; Meng, Yanhua; Wang, Bo; Bai, Xu

    2018-03-01

    Blind image restoration algorithms usually produce ringing that is most obvious at edges. Ringing is mainly affected by noise, by the choice of restoration algorithm, and by errors in blur-kernel estimation during restoration. Based on the physical mechanism of ringing, a method for evaluating ringing in blindly restored images is proposed. The method extracts the overshoot and ripple regions of the image and computes weighted statistics of the regional gradient values. The weights, set through multiple experiments, use edge information to characterize edge detail and to quantify the severity of the ringing effect, yielding an evaluation method for ringing caused by blind restoration. The experimental results show that the method can effectively evaluate the ringing effect in images restored by different algorithms with different restoration parameters, and the evaluation results are consistent with visual assessment.
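
    The paper's exact region extraction and experimentally tuned weights are not given in the abstract, so the following is only a rough, assumption-laden illustration of an edge-weighted overshoot statistic in the spirit described.

    ```python
    import numpy as np

    def ringing_score(img, edge_thresh=40, win=5):
        """Toy ringing indicator: around strong horizontal-gradient pixels, measure
        how far local gray-level maxima overshoot the window endpoints, weighted by
        edge strength. The paper's overshoot/ripple region extraction and tuned
        weights are NOT reproduced; this is an illustrative stand-in only."""
        f = img.astype(float)
        g = np.abs(np.diff(f, axis=1))              # horizontal gradient magnitude
        score = weight = 0.0
        for y, x in zip(*np.where(g > edge_thresh)):
            lo, hi = max(x - win, 0), min(x + win + 2, img.shape[1])
            patch = f[y, lo:hi]
            overshoot = max(patch.max() - max(patch[0], patch[-1]), 0.0)
            score += g[y, x] * overshoot
            weight += g[y, x]
        return score / weight if weight else 0.0

    img = (np.random.rand(64, 64) * 255).astype(np.uint8)
    print(ringing_score(img))
    ```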

  2. Phase-locked tracking loops for LORAN-C

    NASA Technical Reports Server (NTRS)

    Burhans, R. W.

    1978-01-01

    Portable battery operated LORAN-C receivers were fabricated to evaluate simple envelope detector methods with hybrid analog to digital phase locked loop sensor processors. The receivers are used to evaluate LORAN-C in general aviation applications. Complete circuit details are given for the experimental sensor and readout system.

  3. Industrial ecology: Quantitative methods for exploring a lower carbon future

    NASA Astrophysics Data System (ADS)

    Thomas, Valerie M.

    2015-03-01

    Quantitative methods for environmental and cost analyses of energy, industrial, and infrastructure systems are briefly introduced and surveyed, with the aim of encouraging broader utilization and development of quantitative methods in sustainable energy research. Material and energy flow analyses can provide an overall system overview. The methods of engineering economics and cost benefit analysis, such as net present values, are the most straightforward approach for evaluating investment options, with the levelized cost of energy being a widely used metric in electricity analyses. Environmental lifecycle assessment has been extensively developed, with both detailed process-based and comprehensive input-output approaches available. Optimization methods provide an opportunity to go beyond engineering economics to develop detailed least-cost or least-impact combinations of many different choices.
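
    As an example of the engineering-economics metrics mentioned, a minimal levelized-cost-of-energy sketch; the plant numbers are hypothetical, and real analyses vary costs and output by year.

    ```python
    def lcoe(capex, annual_opex, annual_mwh, rate, years):
        """Levelized cost of energy: discounted lifetime cost per discounted MWh.
        Assumes constant O&M and output for simplicity (an assumption)."""
        disc_cost = capex + sum(annual_opex / (1 + rate) ** t for t in range(1, years + 1))
        disc_energy = sum(annual_mwh / (1 + rate) ** t for t in range(1, years + 1))
        return disc_cost / disc_energy

    # Hypothetical plant: $1.2M capex, $30k/yr O&M, 4,000 MWh/yr, 5% rate, 25 years.
    print(f"${lcoe(1_200_000, 30_000, 4_000, 0.05, 25):.0f}/MWh")
    ```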

  4. MEASUREMENT OF INDOOR AIR EMISSIONS FROM DRY-PROCESS PHOTOCOPY MACHINES

    EPA Science Inventory

    The article provides background information on indoor air emissions from office equipment, with emphasis on dry-process photocopy machines. The test method is described in detail along with results of a study to evaluate the test method using four dry-process photocopy machines. ...

  5. MOVES2014: Heavy-duty Vehicle Emissions Report

    EPA Science Inventory

    This report updates MOVES methods for evaluating current HD diesel NOx emission rates based on comparisons to independent data from EPA’s IUVP and Houston drayage programs. The report also details methods/assumptions made for HD gasoline HC, CO and NOx emission rates using reduct...

  6. Stereo photos for evaluating jack pine slash fuels.

    Treesearch

    Richard W. Blank

    1982-01-01

    Describes a quick, visual method for estimating jack pine logging residue and other fuels. The method uses a series of large color photographs and stereo pairs as well as data sheets that detail size classes and loadings of the logging slash and other fuels.

  7. Study on the Application of TOPSIS Method to the Introduction of Foreign Players in CBA Games

    NASA Astrophysics Data System (ADS)

    Zhongyou, Xing

    The TOPSIS method is a multiple attribute decision-making method. This paper introduces the current situation of the introduction of foreign players in CBA games, presents the principles and calculation steps of the TOPSIS method in detail, and applies it to the quantitative evaluation of the comprehensive competitive ability of introduced foreign players. Analysis of this practical application shows that the TOPSIS method has relatively high rationality and applicability when used to evaluate the comprehensive competitive ability of introduced foreign players.
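
    The abstract does not list the paper's criteria or weights, so the sketch below implements the generic TOPSIS steps (vector normalization, weighting, ideal/anti-ideal points, closeness coefficient) on hypothetical basketball statistics.

    ```python
    import numpy as np

    def topsis(matrix, weights, benefit):
        """Standard TOPSIS ranking.
        matrix: alternatives x criteria; benefit[j] is True if higher is better."""
        m = matrix / np.linalg.norm(matrix, axis=0)          # vector normalization
        v = m * weights                                      # weighted normalized matrix
        ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
        anti  = np.where(benefit, v.min(axis=0), v.max(axis=0))
        d_pos = np.linalg.norm(v - ideal, axis=1)
        d_neg = np.linalg.norm(v - anti, axis=1)
        return d_neg / (d_pos + d_neg)                       # closeness to ideal

    # Three hypothetical players scored on points, rebounds, turnovers (lower better).
    scores = np.array([[22.0, 8.0, 3.5],
                       [18.0, 11.0, 2.0],
                       [25.0, 6.0, 4.5]])
    cc = topsis(scores, weights=np.array([0.5, 0.3, 0.2]),
                benefit=np.array([True, True, False]))
    print(cc.argsort()[::-1])  # ranking of players, best first
    ```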

  8. Multipurpose contrast enhancement on epiphyseal plates and ossification centers for bone age assessment

    PubMed Central

    2013-01-01

    Background The high variation in background luminance, low contrast, and excessively enhanced contrast of hand bone radiographs often impede bone age assessment rating systems in evaluating the degree of development of epiphyseal plates and ossification centers. Global histogram equalization (GHE) has been the most frequently adopted image contrast enhancement technique, but its performance is not satisfying. A brightness- and detail-preserving histogram equalization method with good contrast enhancement has been a goal of much recent research, yet producing a histogram-equalized radiograph that is well balanced in terms of brightness preservation, detail preservation, and contrast enhancement is a daunting task. Method In this paper, we propose a novel histogram equalization framework that takes several desirable properties into account, the Multipurpose Beta Optimized Bi-Histogram Equalization (MBOBHE). This method optimizes each sub-histogram separately after segmenting the histogram at an optimized separating point determined from a regularization function constituted by three components. The result is then assessed by qualitative and quantitative analyses of the essential aspects of the equalized image, using a total of 160 hand radiographs acquired from an online hand bone database. Result From the qualitative analysis, we found that basic bi-histogram equalizations are not capable of displaying small features in the image because they select the separating point by focusing on a single metric, without considering contrast enhancement and detail preservation. From the quantitative analysis, we found that MBOBHE correlates well with human visual perception, and this improvement shortens the time taken by inspectors in assessing bone age. Conclusions The proposed MBOBHE outperforms the other methods examined in terms of the overall performance of histogram equalization. All features pertinent to bone age assessment are more prominent than with other methods, which shortens the evaluation time required in manual bone age assessment using the TW method, while accuracy remains unaffected or slightly improves relative to the unprocessed original image. Brightness preservation, detail preservation, and contrast enhancement are taken into consideration simultaneously, and the resulting visual quality supports manual inspection. PMID:23565999
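
    MBOBHE's beta-optimized separating point is not specified in the abstract, so the sketch below shows only the underlying bi-histogram equalization step, with the classic mean-based split standing in for the optimized one.

    ```python
    import numpy as np

    def bi_histogram_equalize(img, sep):
        """Bi-histogram equalization sketch: split the histogram at `sep` and
        equalize each sub-histogram into its own gray range. MBOBHE additionally
        optimizes `sep` via a three-component regularization function (not shown)."""
        f = img.ravel()
        lut = np.arange(256, dtype=float)
        for lo, hi in ((0, sep), (sep + 1, 255)):
            hist = np.bincount(f[(f >= lo) & (f <= hi)], minlength=256)[lo:hi + 1]
            if hist.sum() == 0:
                continue
            cdf = np.cumsum(hist) / hist.sum()
            lut[lo:hi + 1] = lo + cdf * (hi - lo)
        return lut.astype(np.uint8)[img]

    img = (np.random.rand(128, 128) * 200).astype(np.uint8)
    out = bi_histogram_equalize(img, sep=int(img.mean()))  # mean split = classic BBHE
    ```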

  9. Evaluating Organizational Change at a Multinational Transportation Corporation: Method and Reflections

    ERIC Educational Resources Information Center

    Plakhotnik, Maria S.

    2016-01-01

    The purpose of this perspective on practice is to share my experience conducting an organizational change evaluation using qualitative methodology at a multinational transportation company Global Logistics. I provide a detailed description of the three phase approach to data analysis and my reflections on the process.

  10. Accreditation for Armed Forces Educational Institutions.

    ERIC Educational Resources Information Center

    Tarquine, Robert Blaine

    The report establishes the need for educational accreditation and consolidates the various means of achieving accreditation that are available to the Armed Forces into one accessible reference. The scope of each accrediting method is presented in detail, allowing educational officials to evaluate the methods in respect to their individual…

  11. A Framework for Usability Evaluation in EHR Procurement.

    PubMed

    Tyllinen, Mari; Kaipio, Johanna; Lääveri, Tinja

    2018-01-01

    Usability should already be considered by procuring organizations when they select future systems. In this paper, we present a framework for usability evaluation during electronic health record (EHR) system procurement. We describe the objectives of the evaluation, the procedure, the selected usability attributes, and the evaluation methods used to measure them. We also present the emphasis usability had in the selection process. We do not elaborate on the details of the results, the application of the methods, or the gathering of data. Instead we focus on the components of the framework, to inform and give an example to other similar procurement projects.

  12. Home Brew Salinity Measuring Devices: Their Construction and Use.

    ERIC Educational Resources Information Center

    Schlenker, Richard M.

    This paper discusses several inexpensive methods of evaluating the salinity of seawater. One method is presented in some detail. This method has several attractive features. First, it can be used to provide instruction, not only in marine chemistry, but also in studying the mathematics of the point-slope formula, and as an aid in teaching students…

  13. Impacts of Outer Continental Shelf (OCS) development on recreation and tourism. Volume 3. Detailed methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The final report for the project is presented in five volumes. This volume, Detailed Methodology Review, presents a discussion of the methods considered and used to estimate the impacts of Outer Continental Shelf (OCS) oil and gas development on coastal recreation in California. The purpose is to provide the Minerals Management Service with data and methods to improve their ability to analyze the socio-economic impacts of OCS development. Chapter II provides a review of previous attempts to evaluate the effects of OCS development and of oil spills on coastal recreation. The review also discusses the strengths and weaknesses of different approaches and presents the rationale for the methodology selection made. Chapter III presents a detailed discussion of the methods actually used in the study. The volume contains the bibliography for the entire study.

  14. The effect of berberine on insulin resistance in women with polycystic ovary syndrome: detailed statistical analysis plan (SAP) for a multicenter randomized controlled trial.

    PubMed

    Zhang, Ying; Sun, Jin; Zhang, Yun-Jiao; Chai, Qian-Yun; Zhang, Kang; Ma, Hong-Li; Wu, Xiao-Ke; Liu, Jian-Ping

    2016-10-21

    Although Traditional Chinese Medicine (TCM) has been widely used in clinical settings, a major challenge that remains in TCM is to evaluate its efficacy scientifically. This randomized controlled trial aims to evaluate the efficacy and safety of berberine in the treatment of patients with polycystic ovary syndrome. In order to improve the transparency and research quality of this clinical trial, we prepared this statistical analysis plan (SAP). The trial design, primary and secondary outcomes, and safety outcomes were declared to reduce selection biases in data analysis and result reporting. We specified detailed methods for data management and statistical analyses. Statistics in the corresponding tables, listings, and graphs were outlined. The SAP provides more detailed information than the trial protocol on data management and statistical analysis methods. Any post hoc analyses can be identified by referring to this SAP, reducing possible selection and performance biases in the trial. This study is registered at ClinicalTrials.gov, NCT01138930, registered on 7 June 2010.

  15. The Bayesian Evaluation of Categorization Models: Comment on Wills and Pothos (2012)

    ERIC Educational Resources Information Center

    Vanpaemel, Wolf; Lee, Michael D.

    2012-01-01

    Wills and Pothos (2012) reviewed approaches to evaluating formal models of categorization, raising a series of worthwhile issues, challenges, and goals. Unfortunately, in discussing these issues and proposing solutions, Wills and Pothos (2012) did not consider Bayesian methods in any detail. This means not only that their review excludes a major…

  16. Performance and non-destructive evaluation methods of airborne radome and stealth structures

    NASA Astrophysics Data System (ADS)

    Panwar, Ravi; Ryul Lee, Jung

    2018-06-01

    In the past few years, great effort has been devoted to the fabrication of highly efficient, broadband radome and stealth (R&S) structures for distinct control, guidance, surveillance, and communication applications on airborne platforms. The evaluation of non-planar aircraft R&S structures in terms of their electromagnetic performance and structural damage is still a very challenging task. In this article, distinct measurement techniques are discussed for the electromagnetic performance and non-destructive evaluation (NDE) of R&S structures. The paper gives an overview of transmission-line and free-space microwave measurement techniques for the electromagnetic performance evaluation of R&S structures. In addition, various conventional as well as advanced methods, such as millimetre-wave and terahertz-wave imaging techniques with great potential for NDE of load-bearing R&S structures, are discussed in detail. A glimpse of in situ NDE techniques, with the corresponding experimental setups for R&S structures, is also presented. The basic concepts, measurement ranges and their instrumentation, and measurement methods for different R&S structures, along with some miscellaneous topics, are discussed in detail. Some of the challenges and issues pertaining to the measurement of curved R&S structures are also presented. The study also lists various mathematical models and analytical techniques for the electromagnetic performance evaluation and NDE of R&S structures. The research directions described in this study may be of interest to the scientific community in the aerospace sector.

  18. Inside the Primary Classroom.

    ERIC Educational Resources Information Center

    Simon, Brian

    1980-01-01

    Presents some of the findings of the ORACLE research program (Observational Research and Classroom Learning Evaluation), a detailed observational study of teacher-student interaction, teaching styles, and management methods within a sample of primary classrooms. (Editor/SJL)

  19. Occupant Motion Sensors

    DOT National Transportation Integrated Search

    1971-03-01

    An analysis was made of methods for measuring vehicle occupant motion during crash or impact conditions. The purpose of the measurements is to evaluate restraint performance using human, anthropometric dummy, or animal occupants. A detailed Fourier f...

  20. A simplified method of evaluating the stress wave environment of internal equipment

    NASA Technical Reports Server (NTRS)

    Colton, J. D.; Desmond, T. P.

    1979-01-01

    A simplified method called the transfer function technique (TFT) was devised for evaluating the stress wave environment in a structure containing internal equipment. The TFT consists of following the initial in-plane stress wave that propagates through a structure subjected to a dynamic load and characterizing how the wave is altered as it is transmitted through intersections of structural members. As a basis for evaluating the TFT, impact experiments and detailed stress wave analyses were performed for structures with two, three, or more members. Transfer functions that relate the wave transmitted through an intersection to the incident wave were deduced from the predicted wave response. By sequentially applying these transfer functions to a structure with several intersections, it was found that the environment produced by the initial stress wave propagating through the structure can be approximated well. The TFT can be used as a design tool or as an analytical tool to determine whether a more detailed wave analysis is warranted.
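
    As a rough illustration of how the TFT chains transfer functions through successive intersections, consider the following Python sketch; the transmission factors and the single-amplitude wave description are hypothetical placeholders, not values or formulations from the study.

        # Illustrative TFT-style calculation: follow the initial in-plane
        # stress wave and scale it by one transfer function per intersection.
        incident_amplitude = 1.0  # normalized initial wave amplitude

        # Hypothetical transmission factors for three successive intersections
        transfer_functions = [0.62, 0.85, 0.71]

        amplitude = incident_amplitude
        for i, tf in enumerate(transfer_functions, start=1):
            amplitude *= tf  # wave transmitted through intersection i
            print(f"after intersection {i}: amplitude = {amplitude:.3f}")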

  1. Method for technology-delivered healthcare measures.

    PubMed

    Kramer-Jackman, Kelli Lee; Popkess-Vawter, Sue

    2011-12-01

    Current healthcare literature lacks development and evaluation methods for research and practice measures administered by technology. Researchers with varying levels of informatics experience are developing technology-delivered measures because of the numerous advantages they offer. Hasty development of technology-delivered measures can present issues that negatively influence administration and psychometric properties. The Method for Technology-delivered Healthcare Measures is designed to systematically guide the development and evaluation of technology-delivered measures. The five-step Method for Technology-delivered Healthcare Measures includes establishment of content, e-Health literacy, technology delivery, expert usability, and participant usability. Background information and Method for Technology-delivered Healthcare Measures steps are detailed.

  2. Real-Time Nonlocal Means-Based Despeckling.

    PubMed

    Breivik, Lars Hofsoy; Snare, Sten Roar; Steen, Erik Normann; Solberg, Anne H Schistad

    2017-06-01

    In this paper, we propose a multiscale nonlocal means-based despeckling method for medical ultrasound. The multiscale approach leads to large computational savings and improves despeckling results over single-scale iterative approaches. We present two variants of the method. The first, denoted multiscale nonlocal means (MNLM), yields uniform robust filtering of speckle both in structured and homogeneous regions. The second, denoted unnormalized MNLM (UMNLM), is more conservative in regions of structure assuring minimal disruption of salient image details. Due to the popularity of anisotropic diffusion-based methods in the despeckling literature, we review the connection between anisotropic diffusion and iterative variants of NLM. These iterative variants in turn relate to our multiscale variant. As part of our evaluation, we conduct a simulation study making use of ground truth phantoms generated from clinical B-mode ultrasound images. We evaluate our method against a set of popular methods from the despeckling literature on both fine and coarse speckle noise. In terms of computational efficiency, our method outperforms the other considered methods. Quantitatively on simulations and on a tissue-mimicking phantom, our method is found to be competitive with the state-of-the-art. On clinical B-mode images, our method is found to effectively smooth speckle while preserving low-contrast and highly localized salient image detail.
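
    The multiscale variants (MNLM/UMNLM) described above are not off-the-shelf library routines, but the single-scale nonlocal means building block they extend is available in scikit-image; the sketch below applies it to a synthetic image with multiplicative speckle. All parameter values are illustrative assumptions.

        import numpy as np
        from skimage.restoration import denoise_nl_means, estimate_sigma

        # Synthetic "B-mode-like" image corrupted by multiplicative speckle
        rng = np.random.default_rng(0)
        clean = np.tile(np.linspace(0.2, 0.8, 128), (128, 1))
        speckled = clean * rng.rayleigh(scale=1.0, size=clean.shape)

        # Single-scale NLM despeckling; h controls filtering strength
        sigma = estimate_sigma(speckled)
        denoised = denoise_nl_means(speckled, h=0.8 * sigma, patch_size=5,
                                    patch_distance=6, fast_mode=True)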

  3. Evaluation of the environmental impact of Brownfield remediation options: comparison of two life cycle assessment-based evaluation tools.

    PubMed

    Cappuyns, Valérie; Kessen, Bram

    2012-01-01

    The choice between different options for the remediation of a contaminated site traditionally relies on economic, technical and regulatory criteria, without consideration of the environmental impact of the soil remediation process itself. In the present study, the environmental impact assessment of two potential soil remediation techniques (excavation with off-site cleaning, and in situ steam extraction) was performed using two life cycle assessment (LCA)-based evaluation tools, namely the REC (risk reduction, environmental merit and cost) method and the ReCiPe method. The comparison and evaluation of the different tools used to estimate the environmental impact of Brownfield remediation were based on a case study consisting of the remediation of a former oil and fat processing plant. For the environmental impact assessment, both the REC and ReCiPe methods result in a single score for the environmental impact of the soil remediation process, and both lead to the same conclusion: excavation and off-site cleaning has a more pronounced environmental impact than in situ soil remediation by means of steam extraction. The ReCiPe method takes into account more impact categories, but is also more complex to work with and needs more input data. Within the routine evaluation of soil remediation alternatives, a detailed LCA evaluation will often be too time-consuming and costly, and the estimation of the environmental impact with the REC method will in most cases be sufficient. The case study worked out in this paper is intended to provide a basis for a better-founded selection of soil remediation technologies, based on a more detailed assessment of the secondary impact of soil remediation.

  4. Econometric Methods for Causal Evaluation of Education Policies and Practices: A Non-Technical Guide

    ERIC Educational Resources Information Center

    Schlotter, Martin; Schwerdt, Guido; Woessmann, Ludger

    2011-01-01

    Education policy-makers and practitioners want to know which policies and practices can best achieve their goals. But research that can inform evidence-based policy often requires complex methods to distinguish causation from accidental association. Avoiding econometric jargon and technical detail, this paper explains the main idea and intuition…

  5. Review of digital holography reconstruction methods

    NASA Astrophysics Data System (ADS)

    Dovhaliuk, Rostyslav Yu.

    2018-01-01

    The development of digital holography has opened new ways for the non-destructive study of both transparent and opaque objects. In this paper, the digital hologram reconstruction process is investigated. The advantages and limitations of common wave propagation methods are discussed. Details of a software implementation of the digital hologram reconstruction methods are presented. Finally, the performance of each wave propagation method is evaluated, and recommendations about possible use cases for each of them are given.
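
    As an example of one common wave propagation method used in hologram reconstruction, the sketch below implements angular spectrum propagation in NumPy; the aperture object, pixel pitch, and propagation distance are illustrative assumptions, not details from the paper.

        import numpy as np

        def angular_spectrum_propagate(field, wavelength, dx, z):
            """Propagate a complex field a distance z via the angular
            spectrum method (FFT, multiply by the free-space transfer
            function, inverse FFT)."""
            ny, nx = field.shape
            fx = np.fft.fftfreq(nx, d=dx)
            fy = np.fft.fftfreq(ny, d=dx)
            FX, FY = np.meshgrid(fx, fy)
            arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
            kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
            H = np.exp(1j * kz * z) * (arg > 0)  # drop evanescent components
            return np.fft.ifft2(np.fft.fft2(field) * H)

        # Toy "hologram": plane wave through a circular aperture,
        # reconstructed 5 cm downstream at a 10-micron pixel pitch
        n = 256
        x = (np.arange(n) - n / 2) * 10e-6
        X, Y = np.meshgrid(x, x)
        aperture = (X**2 + Y**2 < (0.4e-3) ** 2).astype(complex)
        reconstructed = angular_spectrum_propagate(aperture, 633e-9, 10e-6, 0.05)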

  6. Assessing the reliability of ecotoxicological studies: An overview of current needs and approaches.

    PubMed

    Moermond, Caroline; Beasley, Amy; Breton, Roger; Junghans, Marion; Laskowski, Ryszard; Solomon, Keith; Zahner, Holly

    2017-07-01

    In general, reliable studies are well designed and well performed, and enough details on study design and performance are reported to assess the study. For hazard and risk assessment in various legal frameworks, many different types of ecotoxicity studies need to be evaluated for reliability. These studies vary in study design, methodology, quality, and level of detail reported (e.g., reviews, peer-reviewed research papers, or industry-sponsored studies documented under Good Laboratory Practice [GLP] guidelines). Regulators have the responsibility to make sound and verifiable decisions and should evaluate each study for reliability in accordance with scientific principles regardless of whether they were conducted in accordance with GLP and/or standardized methods. Thus, a systematic and transparent approach is needed to evaluate studies for reliability. In this paper, 8 different methods for reliability assessment were compared using a number of attributes: categorical versus numerical scoring methods, use of exclusion and critical criteria, weighting of criteria, whether methods are tested with case studies, domain of applicability, bias toward GLP studies, incorporation of standard guidelines in the evaluation method, number of criteria used, type of criteria considered, and availability of guidance material. Finally, some considerations are given on how to choose a suitable method for assessing reliability of ecotoxicity studies. Integr Environ Assess Manag 2017;13:640-651. © 2016 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of Society of Environmental Toxicology & Chemistry (SETAC).

  7. Test Program for Evaluation of Variable Frequency Power Conditioners

    DOT National Transportation Integrated Search

    1973-08-01

    A test program is outlined for variable frequency power conditioners for 3-phase induction motors in vehicle propulsion applications. The Power Conditioner Unit (PCU) performance characteristics are discussed in some detail. Measurement methods, reco...

  8. Effective approaches to disorientation familiarization for aviation personnel.

    DOT National Transportation Integrated Search

    1970-11-01

    Techniques are discussed for providing familiarization of aviation personnel with disorientation problems. The procedures are spelled out in detail. Methods of modifying existing equipment as well as an evaluation of available commercial equipment ar...

  9. Measuring the Carotid to Femoral Pulse Wave Velocity (Cf-PWV) to Evaluate Arterial Stiffness.

    PubMed

    Ji, Hongwei; Xiong, Jing; Yu, Shikai; Chi, Chen; Bai, Bin; Teliewubai, Jiadela; Lu, Yuyan; Zhang, Yi; Xu, Yawei

    2018-05-03

    For the elderly, arterial stiffening is a good marker for aging evaluation, and it is recommended that arterial stiffness be determined noninvasively by measurement of the carotid to femoral pulse wave velocity (cf-PWV) (Class I; Level of Evidence A). In the literature, numerous community-based or disease-specific studies have reported that higher cf-PWV is associated with increased cardiovascular risk. Here, we discuss strategies to evaluate arterial stiffness with cf-PWV. Following the well-defined steps detailed here (e.g., proper operator position, distance measurement, and tonometer placement), a standard cf-PWV value for evaluating arterial stiffness is obtained. In this paper, a detailed stepwise method to record a good quality PWV and pulse wave analysis (PWA) using a non-invasive tonometry-based device is discussed.
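
    The arithmetic behind the measurement is simply distance travelled by the pulse divided by transit time; the sketch below assumes the common convention of scaling the direct carotid-to-femoral tape distance by 0.8, which is a widely used adjustment rather than a detail taken from this paper, and the numbers are illustrative.

        # cf-PWV = (0.8 x direct carotid-to-femoral distance) / transit time
        direct_distance_m = 0.60   # tape-measured carotid-to-femoral distance (m)
        transit_time_s = 0.055     # pulse transit time between the two sites (s)

        cf_pwv = 0.8 * direct_distance_m / transit_time_s
        print(f"cf-PWV = {cf_pwv:.1f} m/s")  # about 8.7 m/s for these numbers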

  10. Hydrochemical analysis to evaluate the seawater ingress in a small coral island of India.

    PubMed

    Banerjee, Pallavi; Singh, V S; Singh, Ajay; Prasad, R K; Rangarajan, R

    2012-06-01

    The sustainable development of the limited groundwater resources of a tropical island requires a thorough understanding of the detailed hydrogeological regime, including the hydrochemical behavior of the groundwater. Detailed analysis of groundwater chemical data helps in assessing the different groundwater zones affected by the formation as well as by seawater. The interaction between groundwater and saline water is better understood using major-ion chemistry across an island aquifer. Multivariate methods are used to analyze the geochemical data and understand the geochemical evolution of the groundwater; they successfully group the data to evaluate the influence of the various environs in the study area. Classification methods such as Piper diagrams, correlation analysis, and salinity-hazard measurements are also employed to critically study the geochemical characteristics of the groundwater and identify vulnerable parts of the aquifer. These approaches have been used to successfully evaluate the aquifer zones of a tiny island off the west coast of India. Most of the island is found to be safe for drinking water; however, some parts are identified as affected by seawater ingress and dissolution of formation minerals. The analysis has successfully led to the identification of the part of the island's aquifer that needs immediate attention for restoration to avoid further deterioration.

  11. Improved medical image fusion based on cascaded PCA and shift invariant wavelet transforms.

    PubMed

    Reena Benjamin, J; Jayasree, T

    2018-02-01

    In the medical field, radiologists need more informative and high-quality medical images to diagnose diseases. Image fusion plays a vital role in the field of biomedical image analysis. It aims to integrate the complementary information from multimodal images, producing a new composite image which is expected to be more informative for visual perception than any of the individual input images. The main objective of this paper is to improve the information, to preserve the edges and to enhance the quality of the fused image using cascaded principal component analysis (PCA) and shift invariant wavelet transforms. A novel image fusion technique based on cascaded PCA and shift invariant wavelet transforms is proposed in this paper. PCA in the spatial domain extracts relevant information from the large dataset based on eigenvalue decomposition, and the wavelet transform operating in the complex domain with shift invariant properties brings out more directional and phase details of the image. The maximum fusion rule applied in the dual-tree complex wavelet transform domain enhances the average information and morphological details. The input images of the human brain in two different modalities (MRI and CT) are collected from the whole brain atlas data distributed by Harvard University. Both MRI and CT images are fused using the cascaded PCA and shift invariant wavelet transform method. The proposed method is evaluated based on three main key factors, namely structure preservation, edge preservation, and contrast preservation. The experimental results and comparison with other existing fusion methods show the superior performance of the proposed image fusion framework in terms of visual and quantitative evaluations. In this paper, a complex wavelet-based image fusion method has been discussed. The experimental results demonstrate that the proposed method enhances the directional features as well as fine edge details. Also, it reduces the redundant details, artifacts, and distortions.
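
    A minimal sketch of the PCA stage is given below: the fusion weights come from the leading eigenvector of the 2x2 covariance matrix of the two source images. This illustrates classic PCA fusion only; the cascading with shift-invariant wavelets and the maximum fusion rule described in the paper are not reproduced, and the random inputs are stand-ins for registered MRI/CT slices.

        import numpy as np

        def pca_fusion_weights(img_a, img_b):
            """Weights from the dominant eigenvector of the covariance
            matrix of the two (flattened) source images."""
            data = np.stack([img_a.ravel(), img_b.ravel()])
            cov = np.cov(data)
            eigvals, eigvecs = np.linalg.eigh(cov)
            v = np.abs(eigvecs[:, np.argmax(eigvals)])
            return v / v.sum()

        rng = np.random.default_rng(1)
        mri, ct = rng.random((64, 64)), rng.random((64, 64))
        w = pca_fusion_weights(mri, ct)
        fused = w[0] * mri + w[1] * ct  # weighted-average fused image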

  12. Echocardiography-guided or "sided" pericardiocentesis.

    PubMed

    Degirmencioglu, Aleks; Karakus, Gultekin; Güvenc, Tolga Sinan; Pinhan, Osman; Sipahi, Ilke; Akyol, Ahmet

    2013-10-01

    Echocardiography-guided pericardiocentesis is the first-choice method for relieving cardiac tamponade, but the exact role of echocardiography at the moment of the puncture is still controversial. In this report, detailed echocardiographic evaluation was performed in 21 consecutive patients with cardiac tamponade just before pericardiocentesis. The appropriate needle position was determined according to the probe position using imaginary x, y, and z axes. Pericardiocentesis was performed successfully using this technique without simultaneous echocardiography, and no complications were observed. We concluded that bedside echocardiography with detailed evaluation of the puncture site and angle is sufficient for pericardiocentesis, making real-time guidance unnecessary. © 2013, Wiley Periodicals, Inc.

  13. Evaluating a normalized conceptual representation produced from natural language patient discharge summaries.

    PubMed Central

    Zweigenbaum, P.; Bouaud, J.; Bachimont, B.; Charlet, J.; Boisvieux, J. F.

    1997-01-01

    The Menelas project aimed to produce a normalized conceptual representation from natural language patient discharge summaries. Because of the complex and detailed nature of conceptual representations, evaluating the quality of output of such a system is difficult. We present the method designed to measure the quality of Menelas output, and its application to the state of the French Menelas prototype as of the end of the project. We examine this method in the framework recently proposed by Friedman and Hripcsak. We also propose two conditions that make it possible to reduce the evaluation preparation workload. PMID:9357694

  14. A method for examining the geospatial distribution of CO2 storage resources applied to the Pre-Punta Gorda Composite and Dollar Bay reservoirs of the South Florida Basin, U.S.A

    USGS Publications Warehouse

    Roberts-Ashby, Tina; Brandon N. Ashby,

    2016-01-01

    This paper demonstrates a geospatial modification of the USGS methodology for assessing geologic CO2 storage resources, applied to the Pre-Punta Gorda Composite and Dollar Bay reservoirs of the South Florida Basin. The study provides a detailed evaluation of porous intervals within these reservoirs and utilizes GIS to evaluate the potential spatial distribution of reservoir parameters and the volume of CO2 that can be stored. The study also shows that incorporating the spatial variation of parameters using detailed and robust datasets may improve estimates of storage resources compared with applying uniform values, derived from small datasets, across the study area, as many assessment methodologies do. The geospatially derived estimates of storage resources presented here (Pre-Punta Gorda Composite = 105,570 MtCO2; Dollar Bay = 24,760 MtCO2) were greater than previous assessments, largely because the detailed evaluation of these reservoirs resulted in higher estimates of porosity and net-porous thickness, and because areas of high porosity and thick net-porous intervals were incorporated into the model, increasing the calculated volume of storage space available for CO2 sequestration. The geospatial method for evaluating CO2 storage resources also makes it possible to identify areas that potentially contain higher volumes of storage resources, as well as areas that might be less favorable.
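
    A cell-by-cell volumetric calculation of the general form described above might look like the following sketch, where storage mass per grid cell is area x net-porous thickness x porosity x storage efficiency x CO2 density; all grids and coefficients are illustrative assumptions, not assessment values.

        import numpy as np

        cell_area_m2 = 1.0e6                                  # 1 km x 1 km cells
        thickness_m = np.array([[30.0, 45.0], [60.0, 20.0]])  # net-porous thickness
        porosity = np.array([[0.12, 0.18], [0.22, 0.10]])
        efficiency = 0.02     # fraction of pore volume usable for storage
        rho_co2 = 700.0       # CO2 density at reservoir conditions (kg/m^3)

        # Retaining the per-cell variation is what distinguishes the
        # geospatial approach from applying one uniform value everywhere.
        mass_kg = cell_area_m2 * thickness_m * porosity * efficiency * rho_co2
        print(f"total storage = {mass_kg.sum() / 1e9:.2f} Mt CO2")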

  15. Development of an external ceramic insulation for the space shuttle orbiter. Part 2: Optimization

    NASA Technical Reports Server (NTRS)

    Tanzilli, R. A. (Editor)

    1973-01-01

    The basic insulation improvement study concentrated upon evaluating variables which could result in significant near-term gains in mechanical behavior and insulation effectiveness of the baseline system. The approaches undertaken included: evaluation of small diameter fibers, optimization of binder:slurry characteristics, evaluation of techniques for controlling fiber orientation, optimization of firing cycle, and the evaluation of methods for improving insulation efficiency. A detailed discussion of these basic insulation improvement studies is presented.

  16. The TMDL Program Results Analysis Project: Matching Results Measures with Program Expectations

    EPA Pesticide Factsheets

    The paper provides a detailed description of the aims, methods and outputs of the program evaluation project undertaken by EPA in order to generate the insights needed to make TMDL program improvements.

  17. Advanced composite elevator for Boeing 727 aircraft, volume 2

    NASA Technical Reports Server (NTRS)

    Chovil, D. V.; Grant, W. D.; Jamison, E. S.; Syder, H.; Desper, O. E.; Harvey, S. T.; Mccarty, J. E.

    1980-01-01

    Preliminary design activity consisted of developing and analyzing alternate design concepts and selecting the optimum elevator configuration. This included trade studies in which durability, inspectability, producibility, repairability, and customer acceptance were evaluated. Preliminary development efforts consisted of evaluating and selecting material, identifying ancillary structural development test requirements, and defining full scale ground and flight test requirements necessary to obtain Federal Aviation Administration (FAA) certification. After selection of the optimum elevator configuration, detail design was begun and included basic configuration design improvements resulting from manufacturing verification hardware, the ancillary test program, weight analysis, and structural analysis. Detail and assembly tools were designed and fabricated to support a full-scope production program, rather than a limited run. The producibility development programs were used to verify tooling approaches, fabrication processes, and inspection methods for the production mode. Quality parts were readily fabricated and assembled with a minimum rejection rate, using prior inspection methods.

  18. Natural air leak test without submergence for spontaneous pneumothorax.

    PubMed

    Uramoto, Hidetaka; Tanaka, Fumihiro

    2011-12-24

    Postoperative air leaks are frequent complications after surgery for a spontaneous pneumothorax (SP). We herein describe a new method to test for air leaks by using a transparent film and thoracic tube in a closed system. Between 2005 and 2010, 35 patients were evaluated for air leaks with this novel method without submergence, and their clinical records were retrospectively reviewed. The data on patient characteristics, surgical details, and perioperative outcomes were analyzed. The clinical background and intraoperative factors did not differ significantly between the new and classical methods. The incidence of recurrence was also equivalent to that of the standard method. However, the length of the operation and the drainage periods were significantly shorter in patients evaluated using the new method than with the conventional method. Further, no postoperative complications were observed in patients evaluated using the new method. This simple technique is satisfactorily effective and does not result in any complications.

  19. Quantitative Investigation of Protein-Nucleic Acid Interactions by Biosensor Surface Plasmon Resonance.

    PubMed

    Wang, Shuo; Poon, Gregory M K; Wilson, W David

    2015-01-01

    Biosensor-surface plasmon resonance (SPR) technology has emerged as a powerful label-free approach for the study of nucleic acid interactions in real time. The method provides simultaneous equilibrium and kinetic characterization for biomolecular interactions with low sample requirements and without the need for external probes. A detailed and practical guide for protein-DNA interaction analyses using biosensor-SPR methods is presented. Details of SPR technology and basic fundamentals are described with recommendations on the preparation of the SPR instrument, sensor chips and samples, experimental design, quantitative and qualitative data analyses and presentation. A specific example of the interaction of a transcription factor with DNA is provided with results evaluated by both kinetic and steady-state SPR methods.

  20. Developing criteria to establish Trusted Digital Repositories

    USGS Publications Warehouse

    Faundeen, John L.

    2017-01-01

    This paper details the drivers, methods, and outcomes of the U.S. Geological Survey’s quest to establish criteria by which to judge its own digital preservation resources as Trusted Digital Repositories. Drivers included recent U.S. legislation focused on data and asset management conducted by federal agencies spending $100M USD or more annually on research activities. The methods entailed seeking existing evaluation criteria from national and international organizations such as International Standards Organization (ISO), U.S. Library of Congress, and Data Seal of Approval upon which to model USGS repository evaluations. Certification, complexity, cost, and usability of existing evaluation models were key considerations. The selected evaluation method was derived to allow the repository evaluation process to be transparent, understandable, and defensible; factors that are critical for judging competing, internal units. Implementing the chosen evaluation criteria involved establishing a cross-agency, multi-disciplinary team that interfaced across the organization. 

  1. Software Tools for Formal Specification and Verification of Distributed Real-Time Systems

    DTIC Science & Technology

    1994-07-29

    The goals of Phase 1 are to design in detail a toolkit environment based on formal methods for the specification and verification of distributed real-time systems and to evaluate the design. The evaluation of the design includes investigation of both the capability and potential usefulness of the toolkit environment and the feasibility of its implementation.

  2. Research on uncertainty evaluation measure and method of voltage sag severity

    NASA Astrophysics Data System (ADS)

    Liu, X. N.; Wei, J.; Ye, S. Y.; Chen, B.; Long, C.

    2018-01-01

    Voltage sag is an unavoidable and serious power-quality problem in power systems. This paper provides a general summary and review of the concepts, indices, and evaluation methods for voltage sag severity. Considering the complexity and uncertainty of the influencing factors and damage degree, and the characteristics and requirements of voltage sag severity on the source, network, and load sides, the measurement concepts and the conditions under which they apply, along with the evaluation indices and methods for voltage sag severity, are analyzed. Current evaluation techniques, such as stochastic theory and fuzzy logic, as well as their fusion, are reviewed in detail. An index system for voltage sag severity is provided for comprehensive study. The main aim of this paper is to propose a line of thought and a method for severity research based on advanced uncertainty theory and uncertainty measures. The study may serve as a valuable guide for researchers interested in the domain of voltage sag severity.

  3. Protocol for the process evaluation of a complex intervention designed to increase the use of research in health policy and program organisations (the SPIRIT study).

    PubMed

    Haynes, Abby; Brennan, Sue; Carter, Stacy; O'Connor, Denise; Schneider, Carmen Huckel; Turner, Tari; Gallego, Gisselle

    2014-09-27

    Process evaluation is vital for understanding how interventions function in different settings, including if and why they have different effects or do not work at all. This is particularly important in trials of complex interventions in 'real world' organisational settings where causality is difficult to determine. Complexity presents challenges for process evaluation, and process evaluations that tackle complexity are rarely reported. This paper presents the detailed protocol for a process evaluation embedded in a randomised trial of a complex intervention known as SPIRIT (Supporting Policy In health with Research: an Intervention Trial). SPIRIT aims to build capacity for using research in health policy and program agencies. We describe the flexible and pragmatic methods used for capturing, managing and analysing data across three domains: (a) the intervention as it was implemented; (b) how people participated in and responded to the intervention; and (c) the contextual characteristics that mediated this relationship and may influence outcomes. Qualitative and quantitative data collection methods include purposively sampled semi-structured interviews at two time points, direct observation and coding of intervention activities, and participant feedback forms. We provide examples of the data collection and data management tools developed. This protocol provides a worked example of how to embed process evaluation in the design and evaluation of a complex intervention trial. It tackles complexity in the intervention and its implementation settings. To our knowledge, it is the only detailed example of the methods for a process evaluation of an intervention conducted as part of a randomised trial in policy organisations. We identify strengths and weaknesses, and discuss how the methods are functioning during early implementation. Using 'insider' consultation to develop methods is enabling us to optimise data collection while minimising discomfort and burden for participants. Embedding the process evaluation within the trial design is facilitating access to data, but may impair participants' willingness to talk openly in interviews. While it is challenging to evaluate the process of conducting a randomised trial of a complex intervention, our experience so far suggests that it is feasible and can add considerably to the knowledge generated.

  4. Performance evaluation of contrast-detail in full field digital mammography systems using ideal (Hotelling) observer vs. conventional automated analysis of CDMAM images for quality control of contrast-detail characteristics.

    PubMed

    Delakis, Ioannis; Wise, Robert; Morris, Lauren; Kulama, Eugenia

    2015-11-01

    The purpose of this work was to evaluate the contrast-detail performance of full field digital mammography (FFDM) systems using the ideal (Hotelling) observer Signal-to-Noise Ratio (SNR) methodology and to ascertain whether it can be considered an alternative to the conventional, automated analysis of CDMAM phantom images. Five FFDM units currently used in the national breast screening programme were evaluated, which differed with respect to age, detector, Automatic Exposure Control (AEC) and target/filter combination. Contrast-detail performance was analysed using the CDMAM and ideal observer SNR methodologies. The ideal observer SNR was calculated for input signals originating from gold discs of varying thicknesses and diameters, and then used to estimate the threshold gold thickness for each diameter as in the CDMAM analysis. The variability of both methods and the dependence of the CDMAM analysis on phantom manufacturing discrepancies were also investigated. Results from both the CDMAM and ideal observer methodologies were informative differentiators of the FFDM systems' contrast-detail performance, displaying comparable patterns with respect to the FFDM systems' type and age. The CDMAM results suggested higher threshold gold thickness values than the ideal observer methodology, especially for small-diameter details, which can be attributed to the behaviour of the CDMAM phantom used in this study. In addition, the ideal observer methodology results showed lower variability than the CDMAM results. The ideal observer SNR methodology can provide a useful metric of the FFDM systems' contrast-detail characteristics and could be considered a surrogate for conventional, automated analysis of CDMAM images. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
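
    For reference, the ideal (Hotelling) observer SNR for a known signal in correlated Gaussian noise is SNR^2 = s^T K^-1 s, with s the expected signal difference and K the noise covariance; the toy numbers below are only meant to show the computation, not data from this study.

        import numpy as np

        s = np.array([0.8, 1.2, 0.5])     # expected signal (e.g., disc profile)
        K = np.array([[1.0, 0.3, 0.1],
                      [0.3, 1.0, 0.3],
                      [0.1, 0.3, 1.0]])   # noise covariance

        snr = np.sqrt(s @ np.linalg.solve(K, s))  # Hotelling observer SNR
        print(f"SNR = {snr:.2f}")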

  5. Simplified aerosol representations in global modeling

    NASA Astrophysics Data System (ADS)

    Kinne, Stefan; Peters, Karsten; Stevens, Bjorn; Rast, Sebastian; Schutgens, Nick; Stier, Philip

    2015-04-01

    The detailed treatment of aerosol in global modeling is complex and time-consuming. Thus simplified approaches are investigated, which prescribe 4D (space and time) distributions of aerosol optical properties and of aerosol microphysical properties. Aerosol optical properties are required to assess aerosol direct radiative effects and aerosol microphysical properties (in terms of their ability as aerosol nuclei to modify cloud droplet concentrations) are needed to address the indirect aerosol impact on cloud properties. Following the simplifying concept of the monthly gridded (1x1 lat/lon) aerosol climatology (MAC), new approaches are presented and evaluated against more detailed methods, including comparisons to detailed simulations with complex aerosol component modules.

  6. Potential-scour assessments and estimates of scour depth using different techniques at selected bridge sites in Missouri

    USGS Publications Warehouse

    Huizinga, Richard J.; Rydlund, Jr., Paul H.

    2004-01-01

    The evaluation of scour at bridges throughout the state of Missouri has been ongoing since 1991 in a cooperative effort by the U.S. Geological Survey and Missouri Department of Transportation. A variety of assessment methods have been used to identify bridges susceptible to scour and to estimate scour depths. A potential-scour assessment (Level 1) was used at 3,082 bridges to identify bridges that might be susceptible to scour. A rapid estimation method (Level 1+) was used to estimate contraction, pier, and abutment scour depths at 1,396 bridge sites to identify bridges that might be scour critical. A detailed hydraulic assessment (Level 2) was used to compute contraction, pier, and abutment scour depths at 398 bridges to determine which bridges are scour critical and would require further monitoring or application of scour countermeasures. The rapid estimation method (Level 1+) was designed to be a conservative estimator of scour depths compared to depths computed by a detailed hydraulic assessment (Level 2). Detailed hydraulic assessments were performed at 316 bridges that also had received a rapid estimation assessment, providing a broad data base to compare the two scour assessment methods. The scour depths computed by each of the two methods were compared for bridges that had similar discharges. For Missouri, the rapid estimation method (Level 1+) did not provide a reasonable conservative estimate of the detailed hydraulic assessment (Level 2) scour depths for contraction scour, but the discrepancy was the result of using different values for variables that were common to both of the assessment methods. The rapid estimation method (Level 1+) was a reasonable conservative estimator of the detailed hydraulic assessment (Level 2) scour depths for pier scour if the pier width is used for piers without footing exposure and the footing width is used for piers with footing exposure. Detailed hydraulic assessment (Level 2) scour depths were conservatively estimated by the rapid estimation method (Level 1+) for abutment scour, but there was substantial variability in the estimates and several substantial underestimations.

  7. Improved methods for distribution loss evaluation. Volume 1: analytic and evaluative techniques. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flinn, D.G.; Hall, S.; Morris, J.

    This volume describes the background research, the application of the proposed loss evaluation techniques, and the results. The research identified present loss calculation methods as appropriate, provided care was taken to represent the various system elements in sufficient detail. The literature search of past methods and typical data revealed that extreme caution in using typical values (load factor, etc.) should be taken to ensure that all factors were referred to the same time base (daily, weekly, etc.). The performance of the method (and computer program) proposed in this project was determined by comparison of results with a rigorous evaluation of losses on the Salt River Project system. This rigorous evaluation used statistical modeling of the entire system as well as explicit enumeration of all substation and distribution transformers. Further tests were conducted at Public Service Electric and Gas of New Jersey to check the appropriateness of the methods in a northern environment. Finally, sensitivity tests identified the data elements whose inaccuracy would most affect the determination of losses using the method developed in this project.

  8. Measurement System Analyses - Gauge Repeatability and Reproducibility Methods

    NASA Astrophysics Data System (ADS)

    Cepova, Lenka; Kovacikova, Andrea; Cep, Robert; Klaput, Pavel; Mizera, Ondrej

    2018-02-01

    The submitted article focuses on a detailed explanation of the average and range method (Automotive Industry Action Group, Measurement System Analysis approach) and of the honest Gauge Repeatability and Reproducibility method (Evaluating the Measurement Process approach). The measured data (thickness of plastic parts) were evaluated by both methods and their results were compared on the basis of numerical evaluation. Both methods were additionally compared and their advantages and disadvantages were discussed. One difference between the methods is the calculation of variation components. The AIAG method calculates the variation components based on standard deviation (so the variation components do not sum to 100 %), while the honest GRR study calculates the variation components based on variance, where the sum of all variation components (part-to-part variation, EV, and AV) gives the total variation of 100 %. Acceptance of both methods in the professional community, their future use, and their acceptance by the manufacturing industry were also discussed. Nowadays, the AIAG method is the leading one in industry.
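
    The bookkeeping difference is easy to demonstrate numerically: variance components are additive, standard deviations are not. The component values below are invented for illustration only.

        import numpy as np

        var_ev, var_av, var_pv = 0.20, 0.05, 0.75  # EV, AV, part-to-part (variances)
        total_var = var_ev + var_av + var_pv

        # Variance-based shares (honest GRR style) sum to exactly 100 %
        print([f"{100 * v / total_var:.1f}%" for v in (var_ev, var_av, var_pv)])

        # Standard-deviation-based shares (AIAG style) do not sum to 100 %
        shares = [100 * np.sqrt(v) / np.sqrt(total_var)
                  for v in (var_ev, var_av, var_pv)]
        print([f"{p:.1f}%" for p in shares], f"sum = {sum(shares):.1f}%")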

  9. Evaluation of surface detail reproduction, dimensional stability and gypsum compatibility of monophase polyvinyl-siloxane and polyether elastomeric impression materials under dry and moist conditions

    PubMed Central

    Vadapalli, Sriharsha Babu; Atluri, Kaleswararao; Putcha, Madhu Sudhan; Kondreddi, Sirisha; Kumar, N. Suman; Tadi, Durga Prasad

    2016-01-01

    Objectives: This in vitro study was designed to compare polyvinyl-siloxane (PVS) monophase and polyether (PE) monophase materials under dry and moist conditions for properties such as surface detail reproduction, dimensional stability, and gypsum compatibility. Materials and Methods: Surface detail reproduction was evaluated using two criteria. Dimensional stability was evaluated according to American Dental Association (ADA) specification no. 19. Gypsum compatibility was assessed by two criteria. All the samples were evaluated, and the data obtained were analyzed by two-way analysis of variance (ANOVA) and Pearson's Chi-square tests. Results: When surface detail reproduction was evaluated with the modification of ADA specification no. 19, the two groups showed no statistically significant difference under either condition; when evaluated macroscopically, the groups differed significantly. Results for dimensional stability showed that the deviation from standard differed significantly between the two groups, with the Aquasil group deviating significantly more than the Impregum group (P < 0.001). The two conditions also differed significantly, with the moist condition producing more deviation than the dry condition (P < 0.001). Gypsum compatibility, evaluated both with the modification of ADA specification no. 19 and by grading the casts, showed no statistically significant difference between the groups under either condition. Conclusion: Regarding dimensional stability, both Impregum and Aquasil performed better in the dry condition than in the moist condition, and Impregum performed better than Aquasil under both conditions. When tested for surface detail reproduction according to the ADA specification, both performed almost equally under dry and moist conditions. By macroscopic evaluation, Impregum and Aquasil performed significantly better in the dry condition than in the moist condition; in the dry condition both materials performed almost equally, while in the moist condition Aquasil performed significantly better than Impregum. Regarding gypsum compatibility according to the ADA specification, both materials performed almost equally in the dry condition, and in the moist condition Aquasil performed better than Impregum. By macroscopic evaluation, Impregum performed better than Aquasil under both conditions. PMID:27583217

  10. Introducing Contemporary Anthropology: A Team-Taught Course for Large Classes.

    ERIC Educational Resources Information Center

    Plotnicov, Leonard

    1985-01-01

    Describes a method of teaching large sections of college introductory anthropology by members of the anthropology faculty giving their best lectures. Presents details of how such a course was initiated, operated, and evaluated at the University of Pittsburgh. (KH)

  11. Recovery of Memory After Posthypnotic Amnesia

    ERIC Educational Resources Information Center

    Kihlstrom, John F.; Evans, Frederick J.

    1976-01-01

    This research uses a sample of 691 male and female college students and adopts an alternative method of evaluating reversibility, an important aspect of posthypnotic amnesia, to explore in greater detail the relations among hypnotic susceptibility, initial amnesia, and subsequent reversibility. (Author/RK)

  12. LANL seismic screening method for existing buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickson, S.L.; Feller, K.C.; Fritz de la Orta, G.O.

    1997-01-01

    The purpose of the Los Alamos National Laboratory (LANL) Seismic Screening Method is to provide a comprehensive, rational, and inexpensive method for evaluating the relative seismic integrity of a large building inventory using substantial life-safety as the minimum goal. The substantial life-safety goal is deemed to be satisfied if the extent of structural damage or nonstructural component damage does not pose a significant risk to human life. The screening is limited to Performance Category (PC) -0, -1, and -2 buildings and structures. Because of their higher performance objectives, PC-3 and PC-4 buildings automatically fail the LANL Seismic Screening Method and will be subject to a more detailed seismic analysis. The Laboratory has also designated that PC-0, PC-1, and PC-2 unreinforced masonry bearing wall and masonry infill shear wall buildings fail the LANL Seismic Screening Method because of their historically poor seismic performance or complex behavior. These building types are also recommended for a more detailed seismic analysis. The results of the LANL Seismic Screening Method are expressed in terms of separate scores for potential configuration or physical hazards (Phase One) and calculated capacity/demand ratios (Phase Two). This two-phase method allows the user to quickly identify buildings that have adequate seismic characteristics and structural capacity and screen them out from further evaluation. The resulting scores also provide a ranking of those buildings found to be inadequate. Thus, buildings not passing the screening can be rationally prioritized for further evaluation. For the purpose of complying with Executive Order 12941, the buildings failing the LANL Seismic Screening Method are deemed to have seismic deficiencies, and cost estimates for mitigation must be prepared. Mitigation techniques and cost-estimate guidelines are not included in the LANL Seismic Screening Method.

  13. Short-cut Methods versus Rigorous Methods for Performance-evaluation of Distillation Configurations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramapriya, Gautham Madenoor; Selvarajah, Ajiththaa; Jimenez Cucaita, Luis Eduardo

    Here, this study demonstrates the efficacy of a short-cut method such as the Global Minimization Algorithm (GMA), which uses assumptions of ideal mixtures, constant molar overflow (CMO) and pinched columns, in pruning the search-space of distillation column configurations for zeotropic multicomponent separation, to provide a small subset of attractive configurations with low minimum heat duties. The short-cut method, due to its simplifying assumptions, is computationally efficient, yet reliable in identifying the small subset of useful configurations for further detailed process evaluation. This two-tier approach allows expedient search of the configuration space containing hundreds to thousands of candidate configurations for a given application.

  14. Short-cut Methods versus Rigorous Methods for Performance-evaluation of Distillation Configurations

    DOE PAGES

    Ramapriya, Gautham Madenoor; Selvarajah, Ajiththaa; Jimenez Cucaita, Luis Eduardo; ...

    2018-05-17

    Here, this study demonstrates the efficacy of a short-cut method such as the Global Minimization Algorithm (GMA), which uses assumptions of ideal mixtures, constant molar overflow (CMO) and pinched columns, in pruning the search-space of distillation column configurations for zeotropic multicomponent separation, to provide a small subset of attractive configurations with low minimum heat duties. The short-cut method, due to its simplifying assumptions, is computationally efficient, yet reliable in identifying the small subset of useful configurations for further detailed process evaluation. This two-tier approach allows expedient search of the configuration space containing hundreds to thousands of candidate configurations for a given application.

  15. Pile Driving

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Machine-oriented structural engineering firm TERA, Inc. is engaged in a project to evaluate the reliability of offshore pile driving prediction methods, with the eventual goal of predicting the best pile driving technique for each new offshore oil platform. In Phase I, pile driving records of 48 offshore platforms, including such information as blow counts, soil composition and pertinent construction details, were digitized. In Phase II, the pile driving records were statistically compared with current methods of prediction. The result was the development of modular software, the CRIPS80 Software Design Analyzer System, which companies can use to evaluate other prediction procedures or other data bases.

  16. The research of full automatic oil filtering control technology of high voltage insulating oil

    NASA Astrophysics Data System (ADS)

    Gong, Gangjun; Zhang, Tong; Yan, Guozeng; Zhang, Han; Chen, Zhimin; Su, Chang

    2017-09-01

    In this paper, the design of an automatic oil-filtering control system for transformer oil in a UHV substation is summarized. The scheme covers the typical double-tank filter connection control method for UHV substation transformer oil, distinguishing between single-port and double-port connection structures of the oil tank. Finally, the designs of the temperature sensor and the respirator are given in detail, together with a detailed evaluation and application scenarios for reference.

  17. Simplification of an MCNP model designed for dose rate estimation

    NASA Astrophysics Data System (ADS)

    Laptev, Alexander; Perry, Robert

    2017-09-01

    A study was made to investigate the methods of building a simplified MCNP model for radiological dose estimation. The research was done using an example of a complicated glovebox with extra shielding. The paper presents several different calculations for neutron and photon dose evaluations where glovebox elements were consecutively excluded from the MCNP model. The analysis indicated that to obtain a fast and reasonable estimation of dose, the model should be realistic in details that are close to the tally. Other details may be omitted.

  18. Chemical Synthesis Coxiella Burnetti Lipopolysaccharides: Structural Studies of Coxiella Burnetti Lipopolysaccharides.

    DTIC Science & Technology

    1987-11-30

    The project is currently evaluating two instrumental techniques that seem highly appropriate to this LPS project: supercritical fluid chromatography (SFC) and SFC combined with mass spectrometry. These new instrumental techniques and methods of approach are discussed in the appropriate sections of the report.

  19. Evaluation of different flamelet tabulation methods for laminar spray combustion

    NASA Astrophysics Data System (ADS)

    Luo, Yujuan; Wen, Xu; Wang, Haiou; Luo, Kun; Fan, Jianren

    2018-05-01

    In this work, three different flamelet tabulation methods for spray combustion are evaluated. Major differences among these methods lie in the treatment of the temperature boundary conditions of the flamelet equations. Particularly, in the first tabulation method ("M1"), both the fuel and oxidizer temperature boundary conditions are set to be fixed. In the second tabulation method ("M2"), the fuel temperature boundary condition is varied while the oxidizer temperature boundary condition is fixed. In the third tabulation method ("M3"), both the fuel and oxidizer temperature boundary conditions are varied and set to be equal. The focus of this work is to investigate whether the heat transfer between the droplet phase and gas phase can be represented by the studied tabulation methods through a priori analyses. To this end, spray flames stabilized in a three-dimensional counterflow are first simulated with detailed chemistry. Then, the trajectory variables are calculated from the detailed chemistry solutions. Finally, the tabulated thermo-chemical quantities are compared to the corresponding values from the detailed chemistry solutions. The comparisons show that the gas temperature cannot be predicted by "M1" with only a mixture fraction and reaction progress variable being the trajectory variables. The gas temperature can be correctly predicted by both "M2" and "M3," in which the total enthalpy is introduced as an additional manifold. In "M2," variations of the oxidizer temperature are considered with a temperature modification technique, which is not required in "M3." Interestingly, it is found that the mass fractions of the reactants and major products are not sensitive to the representation of the interphase heat transfer in the flamelet chemtables, and they can be correctly predicted by all tabulation methods. By contrast, the intermediate species CO and H2 in the premixed flame reaction zone are over-predicted by all tabulation methods.
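
    In a priori tests like these, the tabulated quantities are retrieved by interpolating a precomputed chemtable at the local trajectory-variable values; the sketch below shows such a lookup with total enthalpy as the third manifold, in the spirit of "M2"/"M3". The table contents here are synthetic functions, not flamelet solutions.

        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        # Synthetic chemtable T(Z, C, h): mixture fraction, progress
        # variable, and total enthalpy (spanning heat-loss states)
        Z = np.linspace(0.0, 1.0, 21)
        C = np.linspace(0.0, 1.0, 11)
        h = np.linspace(-2.0e5, 1.0e5, 7)   # J/kg
        ZZ, CC, HH = np.meshgrid(Z, C, h, indexing="ij")
        T_table = 300.0 + 1500.0 * CC * 4.0 * ZZ * (1.0 - ZZ) + 5.0e-4 * HH

        lookup_T = RegularGridInterpolator((Z, C, h), T_table)
        print(lookup_T([[0.35, 0.8, -5.0e4]]))  # temperature at one query point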

  20. Developing, implementing, and evaluating a handbook for parents of pediatric hematology/oncology patients.

    PubMed

    Heiney, S P; Wells, L M

    1995-07-01

    This article details the development of a parent handbook for pediatric hematology and oncology patients. The planning and content development are discussed. Adult learning principles were incorporated throughout the handbook. Use of the handbook in a pediatric cancer center is described. Both subjective and objective methods were used to evaluate the handbook. Results from the evaluation verify the value of the handbook to parents and give direction for future revisions of the handbook.

  1. Filter methods to preserve local contrast and to avoid artifacts in gamut mapping

    NASA Astrophysics Data System (ADS)

    Meili, Marcel; Küpper, Dennis; Barańczuk, Zofia; Caluori, Ursina; Simon, Klaus

    2010-01-01

    In contrast to high dynamic range imaging, the preservation of details and the avoidance of artifacts are not explicitly considered in popular color management systems. An effective way to overcome these difficulties is image filtering. In this paper we investigate several image filter concepts for detail preservation as part of a practical gamut mapping strategy. In particular, we define four concepts including various image filters and check their performance with a psycho-visual test. Additionally, we compare our performance evaluation to two image quality measures with emphasis on local contrast. Surprisingly, the simplest filter concept is highly efficient and achieves image quality comparable to that of the more established but slower methods.

  2. CARETS: A prototype regional environmental information system. Volume 12: User evaluation of experimental land use maps and related products from the central Atlantic test site

    NASA Technical Reports Server (NTRS)

    Alexander, R. H. (Principal Investigator); Mcginty, H. K., III

    1975-01-01

    The author has identified the following significant results. Recommendations resulting from the CARETS evaluation reflect the need to establish a flexible and reliable system for providing more detailed raw and processed land resource information as well as the need to improve the methods of making information available to users.

  3. Development of iterative techniques for the solution of unsteady compressible viscous flows

    NASA Technical Reports Server (NTRS)

    Sankar, Lakshmi; Hixon, Duane

    1993-01-01

    The work done under this project was documented in detail in the Ph.D. dissertation of Dr. Duane Hixon. The objectives of the research project were the evaluation of the generalized minimum residual method (GMRES) as a tool for accelerating 2-D and 3-D unsteady flow computations, and the evaluation of the suitability of the GMRES algorithm for unsteady flows computed on parallel computer architectures.
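
    As a reminder of what the GMRES building block does, the sketch below solves a small sparse linear system with SciPy's implementation; the tridiagonal operator merely stands in for the implicit systems arising in unsteady flow computations and has no connection to the dissertation's solver.

        import numpy as np
        from scipy.sparse import diags
        from scipy.sparse.linalg import gmres

        n = 200
        A = diags([-1.0, 2.1, -1.0], offsets=[-1, 0, 1],
                  shape=(n, n), format="csr")   # stand-in implicit operator
        b = np.ones(n)

        x, info = gmres(A, b)                   # info == 0 means converged
        print(info == 0, np.linalg.norm(A @ x - b))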

  4. Going One-to-One in Urban Schools: An Evaluation of the XO Champions Initiative in Project LIFT Elementary Schools

    ERIC Educational Resources Information Center

    Beaver, Jessica K.; Englander, Katie; Leow, Christine; Barnes, Marvin

    2015-01-01

    Beginning in spring 2013, students in seven elementary schools throughout the Project LIFT zone in Charlotte, North Carolina began to receive XO laptops provided by the organization One Laptop Per Child (OLPC) for use both within their classroom and at home. This report details Research for Action's (RFA) mixed-method evaluation of the first year…

  5. A Variational Assimilation Method for Satellite and Conventional Data: a Revised Basic Model 2B

    NASA Technical Reports Server (NTRS)

    Achtemeier, Gary L.; Scott, Robert W.; Chen, J.

    1991-01-01

    A variational objective analysis technique that modifies observations of temperature, height, and wind on the cyclone scale to satisfy the five 'primitive' model forecast equations is presented. This analysis method overcomes all of the problems that hindered previous versions, such as over-determination, time consistency, solution method, and constraint decoupling. A preliminary evaluation of the method shows that it converges rapidly, the divergent part of the wind is strongly coupled in the solution, fields of height and temperature are well-preserved, and derivative quantities such as vorticity and divergence are improved. Problem areas are systematic increases in the horizontal velocity components, and large magnitudes of the local tendencies of the horizontal velocity components. The preliminary evaluation makes note of these problems but detailed evaluations required to determine the origin of these problems await future research.

  6. Gender counts: A systematic review of evaluations of gender-integrated health interventions in low- and middle-income countries.

    PubMed

    Schriver, Brittany; Mandal, Mahua; Muralidharan, Arundati; Nwosu, Anthony; Dayal, Radhika; Das, Madhumita; Fehringer, Jessica

    2017-11-01

    As a result of new global priorities, there is a growing need for high-quality evaluations of gender-integrated health programmes. This systematic review examined 99 peer-reviewed articles on evaluations of gender-integrated (accommodating and transformative) health programmes with regard to their theory of change (ToC), study design, gender integration in data collection, analysis, and gender measures used. Half of the evaluations explicitly described a ToC or conceptual framework (n = 50) that guided strategies for their interventions. Over half (61%) of the evaluations used quantitative methods exclusively; 11% used qualitative methods exclusively; and 28% used mixed methods. Qualitative methods were not commonly detailed. Evaluations of transformative interventions were less likely than those of accommodating interventions to employ randomised control trials. Two-thirds of the reviewed evaluations reported including at least one specific gender-related outcome (n = 18 accommodating, n = 44 transformative). To strengthen evaluations of gender-integrated programmes, we recommend use of ToCs, explicitly including gender in the ToC, use of gender-sensitive measures, mixed-method designs, in-depth descriptions of qualitative methods, and attention to gender-related factors in data collection logistics. We also recommend further research to develop valid and reliable gender measures that are globally relevant.

  7. A new evaluation tool to obtain practice-based evidence of worksite health promotion programs.

    PubMed

    Dunet, Diane O; Sparling, Phillip B; Hersey, James; Williams-Piehota, Pamela; Hill, Mary D; Hanssen, Carl; Lawrenz, Frances; Reyes, Michele

    2008-10-01

    The Centers for Disease Control and Prevention developed the Swift Worksite Assessment and Translation (SWAT) evaluation method to identify promising practices in worksite health promotion programs. The new method complements research studies and evaluation studies of evidence-based practices that promote healthy weight in working adults. We used nationally recognized program evaluation standards of utility, feasibility, accuracy, and propriety as the foundation for our 5-step method: 1) site identification and selection, 2) site visit, 3) post-visit evaluation of promising practices, 4) evaluation capacity building, and 5) translation and dissemination. An independent, outside evaluation team conducted process and summative evaluations of SWAT to determine its efficacy in providing accurate, useful information and its compliance with evaluation standards. The SWAT evaluation approach is feasible in small and medium-sized workplace settings. The independent evaluation team judged SWAT favorably as an evaluation method, noting among its strengths its systematic and detailed procedures and service orientation. Experts in worksite health promotion evaluation concluded that the data obtained by using this evaluation method were sufficient to allow them to make judgments about promising practices. SWAT is a useful, business-friendly approach to systematic, yet rapid, evaluation that comports with program evaluation standards. The method provides a new tool to obtain practice-based evidence of worksite health promotion programs that help prevent obesity and, more broadly, may advance public health goals for chronic disease prevention and health promotion.

  8. Evaluation of linear induction motor characteristics : the Yamamura model

    DOT National Transportation Integrated Search

    1975-04-30

    The Yamamura theory of the double-sided linear induction motor (LIM) excited by a constant current source is discussed in some detail. The report begins with a derivation of thrust and airgap power using the method of vector potentials and theorem of...

  9. Assessment and recommendations for using high-resolution weather information to improve winter maintenance operations.

    DOT National Transportation Integrated Search

    2013-11-01

    A variety of methods for obtaining detailed analyses regarding the timing and duration of winter weather across the state of Indiana for multiple seasons were compared and evaluated during this project. Meteorological information from sources such ...

  10. Evaluation of geophysical methods and geophysical contractors on four projects in Kentucky.

    DOT National Transportation Integrated Search

    2007-03-01

    This report details four geophysical testing projects that were conducted in Kentucky for the Kentucky Transportation Cabinet. The four projects were as follows: KY 101, Edmonson and Warren Counties, US 31-W, Elizabethtown Bypass, Hardin County, KY 61...

  11. A Tool for the Automated Design and Evaluation of Habitat Interior Layouts

    NASA Technical Reports Server (NTRS)

    Simon, Matthew A.; Wilhite, Alan W.

    2013-01-01

    The objective of space habitat design is to minimize mass and system size while providing adequate space for all necessary equipment and a functional layout that supports crew health and productivity. Unfortunately, development and evaluation of interior layouts is often ignored during conceptual design because of the subjectivity and long times required using current evaluation methods (e.g., human-in-the-loop mockup tests and in-depth CAD evaluations). Early, more objective assessment could prevent expensive design changes that may increase vehicle mass and compromise functionality. This paper describes a new interior design evaluation method to enable early, structured consideration of habitat interior layouts. This interior layout evaluation method features a comprehensive list of quantifiable habitat layout evaluation criteria, automatic methods to measure these criteria from a geometry model, and application of systems engineering tools and numerical methods to construct a multi-objective value function measuring the overall habitat layout performance. In addition to a detailed description of this method, a C++/OpenGL software tool which has been developed to implement this method is also discussed. This tool leverages geometry modeling coupled with collision detection techniques to identify favorable layouts subject to multiple constraints and objectives (e.g., minimize mass, maximize contiguous habitable volume, maximize task performance, and minimize crew safety risks). Finally, a few habitat layout evaluation examples are described to demonstrate the effectiveness of this method and tool to influence habitat design.
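
    The multi-objective value function is the heart of the method. A minimal sketch of a weighted-sum version follows; the criteria names, weights, and scores are hypothetical, and the paper's actual criteria list and aggregation are more elaborate.

    ```python
    import numpy as np

    # Illustrative criteria scores for candidate habitat layouts, each
    # normalized to [0, 1] where higher is better. Criteria and weights
    # are hypothetical, not the paper's actual evaluation set.
    layouts = {
        "layout_A": {"habitable_volume": 0.82, "task_performance": 0.70, "safety": 0.90},
        "layout_B": {"habitable_volume": 0.75, "task_performance": 0.88, "safety": 0.85},
    }
    weights = {"habitable_volume": 0.40, "task_performance": 0.35, "safety": 0.25}

    def layout_value(scores, weights):
        """Weighted-sum multi-objective value function."""
        return sum(weights[k] * scores[k] for k in weights)

    for name, scores in layouts.items():
        print(name, round(layout_value(scores, weights), 3))
    ```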

  12. Are LOD and LOQ Reliable Parameters for Sensitivity Evaluation of Spectroscopic Methods?

    PubMed

    Ershadi, Saba; Shayanfar, Ali

    2018-03-22

    The limit of detection (LOD) and the limit of quantification (LOQ) are common parameters to assess the sensitivity of analytical methods. In this study, the LOD and LOQ of previously reported terbium sensitized analysis methods were calculated by different methods, and the results were compared with sensitivity parameters [lower limit of quantification (LLOQ)] of U.S. Food and Drug Administration guidelines. The details of the calibration curve and standard deviation of blank samples of three different terbium-sensitized luminescence methods for the quantification of mycophenolic acid, enrofloxacin, and silibinin were used for the calculation of LOD and LOQ. A comparison of LOD and LOQ values calculated by various methods and LLOQ shows a considerable difference. The significant difference of the calculated LOD and LOQ with various methods and LLOQ should be considered in the sensitivity evaluation of spectroscopic methods.
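
    For reference, one common convention for calculating these parameters (the ICH-style 3.3σ/S and 10σ/S rules, one of the several conventions such studies compare) can be sketched as follows; the blank readings and slope are hypothetical.

    ```python
    import numpy as np

    def lod_loq_from_calibration(blank_signals, slope):
        """ICH-style estimates: LOD = 3.3*sigma/S, LOQ = 10*sigma/S, where
        sigma is the standard deviation of blank responses and S is the
        calibration-curve slope."""
        sigma = np.std(blank_signals, ddof=1)
        return 3.3 * sigma / slope, 10.0 * sigma / slope

    blanks = [0.011, 0.013, 0.010, 0.012, 0.011]  # hypothetical blank readings
    lod, loq = lod_loq_from_calibration(blanks, slope=0.85)
    print(f"LOD = {lod:.4f}, LOQ = {loq:.4f}")
    ```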

  13. Improving the evaluation of therapeutic interventions in multiple sclerosis: the role of new psychometric methods.

    PubMed

    Hobart, J; Cano, S

    2009-02-01

    In this monograph we examine the added value of new psychometric methods (Rasch measurement and Item Response Theory) over traditional psychometric approaches by comparing and contrasting their psychometric evaluations of existing sets of rating scale data. We have concentrated on Rasch measurement rather than Item Response Theory because we believe that it is the more advantageous method for health measurement from a conceptual, theoretical and practical perspective. Our intention is to provide an authoritative document that describes the principles of Rasch measurement and the practice of Rasch analysis in a clear, detailed, non-technical form that is accurate and accessible to clinicians and researchers in health measurement. A comparison was undertaken of traditional and new psychometric methods in five large sets of rating scale data: (1) evaluation of the Rivermead Mobility Index (RMI) in data from 666 participants in the Cannabis in Multiple Sclerosis (CAMS) study; (2) evaluation of the Multiple Sclerosis Impact Scale (MSIS-29) in data from 1725 people with multiple sclerosis; (3) evaluation of test-retest reliability of MSIS-29 in data from 150 people with multiple sclerosis; (4) examination of the use of Rasch analysis to equate scales purporting to measure the same health construct in 585 people with multiple sclerosis; and (5) comparison of relative responsiveness of the Barthel Index and Functional Independence Measure in data from 1400 people undergoing neurorehabilitation. Both Rasch measurement and Item Response Theory are conceptually and theoretically superior to traditional psychometric methods. Findings from each of the five studies show that Rasch analysis is empirically superior to traditional psychometric methods for evaluating rating scales, developing rating scales, analysing rating scale data, understanding and measuring stability and change, and understanding the health constructs we seek to quantify. There is considerable added value in using Rasch analysis rather than traditional psychometric methods in health measurement. Future research directions include the need to reproduce our findings in a range of clinical populations, detailed head-to-head comparisons of Rasch analysis and Item Response Theory, and the application of Rasch analysis to clinical practice.
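
    For readers unfamiliar with the model, the dichotomous Rasch model underlying these analyses can be stated in a few lines; the numbers below are illustrative only, and the monograph's analyses build on polytomous extensions of this form.

    ```python
    import numpy as np

    def rasch_probability(theta, b):
        """Dichotomous Rasch model: probability that a person with ability
        theta passes/endorses an item with difficulty b."""
        return 1.0 / (1.0 + np.exp(-(theta - b)))

    print(rasch_probability(theta=0.5, b=-0.2))  # ability slightly above difficulty
    ```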

  14. ECSIN's methodological approach for hazard evaluation of engineered nanomaterials

    NASA Astrophysics Data System (ADS)

    Bregoli, Lisa; Benetti, Federico; Venturini, Marco; Sabbioni, Enrico

    2013-04-01

    The increasing production volumes and commercialization of engineered nanomaterials (ENM), together with data on their higher biological reactivity compared to their bulk counterparts and their ability to cross biological barriers, have caused concerns about their potential impacts on the health and safety of both humans and the environment. A multidisciplinary component of the scientific community has been called to evaluate the real risks associated with the use of products containing ENM, and is today in the process of developing specific definitions and testing strategies for nanomaterials. At ECSIN we are developing an integrated multidisciplinary methodological approach for the evaluation of the biological effects of ENM on the environment and human health. While our testing strategy agrees with the most widely advanced line of work at the European level, the choice of methods and optimization of protocols is made with an extended treatment of details. Our attention to the methodological and technical details is based on the acknowledgment that the innovative characteristics of matter at the nano-size range may influence the existing testing methods in a partially unpredictable manner, an aspect which is frequently recognized at the discussion level but oftentimes disregarded at the laboratory bench level. This work outlines the most important steps of our testing approach. In particular, each step will be briefly discussed in terms of potential technical and methodological pitfalls that we have encountered, and which are often ignored in nanotoxicology research. The final aim is to draw attention to the need for preliminary studies in developing reliable tests, a crucial aspect in confirming the suitability of the chosen analytical and toxicological methods for the specific tested nanoparticle, and to express the idea that, in nanotoxicology, "the devil is in the detail".

  15. Multiple and mixed methods in formative evaluation: Is more better? Reflections from a South African study.

    PubMed

    Odendaal, Willem; Atkins, Salla; Lewin, Simon

    2016-12-15

    Formative programme evaluations assess intervention implementation processes, and are seen widely as a way of unlocking the 'black box' of any programme in order to explore and understand why a programme functions as it does. However, few critical assessments of the methods used in such evaluations are available, and there are especially few that reflect on how well the evaluation achieved its objectives. This paper describes a formative evaluation of a community-based lay health worker programme for TB and HIV/AIDS clients across three low-income communities in South Africa. It assesses each of the methods used in relation to the evaluation objectives, and offers suggestions on ways of optimising the use of multiple, mixed-methods within formative evaluations of complex health system interventions. The evaluation's qualitative methods comprised interviews, focus groups, observations and diary keeping. Quantitative methods included a time-and-motion study of the lay health workers' scope of practice and a client survey. The authors conceptualised and conducted the evaluation, and through iterative discussions, assessed the methods used and their results. Overall, the evaluation highlighted programme issues and insights beyond the reach of traditional single methods evaluations. The strengths of the multiple, mixed-methods in this evaluation included a detailed description and nuanced understanding of the programme and its implementation, and triangulation of the perspectives and experiences of clients, lay health workers, and programme managers. However, the use of multiple methods needs to be carefully planned and implemented as this approach can overstretch the logistic and analytic resources of an evaluation. For complex interventions, formative evaluation designs including multiple qualitative and quantitative methods hold distinct advantages over single method evaluations. However, their value is not in the number of methods used, but in how each method matches the evaluation questions and the scientific integrity with which the methods are selected and implemented.

  16. Iterative image-domain ring artifact removal in cone-beam CT

    NASA Astrophysics Data System (ADS)

    Liang, Xiaokun; Zhang, Zhicheng; Niu, Tianye; Yu, Shaode; Wu, Shibin; Li, Zhicheng; Zhang, Huailing; Xie, Yaoqin

    2017-07-01

    Ring artifacts in cone beam computed tomography (CBCT) images are caused by pixel gain variations using flat-panel detectors, and may lead to structured non-uniformities and deterioration of image quality. The purpose of this study is to propose a method of general ring artifact removal in CBCT images. This method is based on the polar coordinate system, where the ring artifacts manifest as stripe artifacts. Using relative total variation, the CBCT images are first smoothed to generate template images with fewer image details and ring artifacts. By subtracting the template images from the CBCT images, residual images with image details and ring artifacts are generated. As the ring artifact manifests as a stripe artifact in a polar coordinate system, the artifact image can be extracted by mean value from the residual image; the image details are generated by subtracting the artifact image from the residual image. Finally, the image details are compensated to the template image to generate the corrected images. The proposed framework is iterated until the differences in the extracted ring artifacts are minimized. We use a 3D Shepp-Logan phantom, Catphan©504 phantom, uniform acrylic cylinder, and images from a head patient to evaluate the proposed method. In the experiments using simulated data, the spatial uniformity is increased by 1.68 times and the structural similarity index is increased from 87.12% to 95.50% using the proposed method. In the experiment using clinical data, our method shows high efficiency in ring artifact removal while preserving the image structure and detail. The iterative approach we propose for ring artifact removal in cone-beam CT is practical and attractive for CBCT guided radiation therapy.
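
    A much-simplified sketch of the core idea follows, assuming the slice has already been resampled to polar coordinates (rows = radius, columns = angle, so rings appear as horizontal stripes) and substituting Gaussian smoothing for the paper's relative-total-variation smoothing.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def remove_ring_artifacts_polar(polar_img, n_iter=3, sigma=3.0):
        """Simplified sketch of iterative image-domain ring removal on a
        slice already resampled to polar coordinates."""
        img = polar_img.astype(float)
        for _ in range(n_iter):
            template = gaussian_filter(img, sigma)          # few details, few rings
            residual = img - template                       # details + ring stripes
            stripes = residual.mean(axis=1, keepdims=True)  # mean along angle axis
            details = residual - stripes                    # details without rings
            img = template + details                        # compensate details back
        return img
    ```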

  17. An edge-directed interpolation method for fetal spine MR images.

    PubMed

    Yu, Shaode; Zhang, Rui; Wu, Shibin; Hu, Jiani; Xie, Yaoqin

    2013-10-10

    Fetal spinal magnetic resonance imaging (MRI) is a routine prenatal examination for assessing fetal development, especially when spinal malformations are suspected and ultrasound fails to provide details. Limited by hardware, fetal spine MR images suffer from low resolution. High-resolution MR images can directly enhance readability and improve diagnostic accuracy. Image interpolation to higher resolution is required in clinical situations, yet many methods fail to preserve edge structures. Edges carry important structural information that doctors use to detect suspicious findings, classify malformations and make correct diagnoses. Effective interpolation with well-preserved edge structures is still challenging. In this paper, we propose an edge-directed interpolation (EDI) method and apply it to a group of fetal spine MR images to evaluate its feasibility and performance. The method takes edge information from a Canny edge detector to guide further pixel modification. First, low-resolution (LR) images of the fetal spine are interpolated into high-resolution (HR) images at the targeted factor by the bilinear method. Then edge information from the LR and HR images is fed into a twofold strategy to sharpen or soften edge structures. Finally, an HR image with well-preserved edge structures is generated. The HR images obtained from the proposed method are validated and compared with those from four other EDI methods. Performance is evaluated with six metrics, and subjective analysis of visual quality is based on regions of interest (ROI). All five EDI methods are able to generate HR images with enriched details. In the quantitative analysis, the proposed method outperforms the other four in signal-to-noise ratio (SNR), peak signal-to-noise ratio (PSNR), structural similarity index (SSIM), feature similarity index (FSIM) and mutual information (MI), with seconds-level time consumption (TC). Visual analysis of the ROI shows that the proposed method maintains better consistency of edge structures with the original images. The proposed method classifies edge orientations into four categories and preserves structures well. It generates convincing HR images with fine details and is suitable for real-time situations. The iterative curvature-based interpolation (ICBI) method may produce crisper edges, while the other three methods are sensitive to noise and artifacts.
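
    A loose sketch of the pipeline's shape follows, reducing the paper's twofold sharpen/soften strategy to Canny-masked unsharp masking; this helper is illustrative, not the authors' algorithm, and assumes an 8-bit grayscale input.

    ```python
    import cv2
    import numpy as np

    def edge_directed_upscale(lr_img, factor=2, amount=0.6):
        """Bilinear upscaling followed by Canny-guided edge sharpening
        (a simplified stand-in for the paper's twofold strategy)."""
        hr = cv2.resize(lr_img, None, fx=factor, fy=factor,
                        interpolation=cv2.INTER_LINEAR)
        edges = cv2.Canny(hr, 50, 150) > 0          # edge mask on the HR image
        blurred = cv2.GaussianBlur(hr, (5, 5), 0)
        sharpened = cv2.addWeighted(hr, 1 + amount, blurred, -amount, 0)
        out = hr.copy()
        out[edges] = sharpened[edges]               # modify pixels only at edges
        return out
    ```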

  18. A novel use of Twitter to provide feedback and evaluations.

    PubMed

    Desai, Bobby

    2014-04-01

    Inconsistencies in work schedules and faculty supervision are barriers to monthly emergency medicine (EM) resident doctor evaluations. Direct and contemporaneous feedback may be effective in providing specific details that determine a resident's evaluation. To determine whether Twitter, an easy to use application that is available on the Internet via smartphones and desktops, can provide direct and contemporaneous feedback that is easily accessible, and easy to store and refer back to. First- to third-year EM residents were administered a survey to assess their thoughts on the current monthly evaluation system. Subsequently, residents obtained a Twitter account and were instructed to follow a single general faculty Twitter account for ease of data collection. Following completion of an 8-week study period, a second survey was administered to assess resident thoughts on contemporaneous feedback and evaluations versus the traditional form. Of the 24 EM residents, 13 were available for study. A total of 220 'tweets' were provided by seven faculty members, with a mean of 11 tweets (range 8-17) per resident. The 13 residents received a total of eight formal evaluations from 19 faculty members. The second survey demonstrated that this method provided more detailed evaluations and increased the volume of feedback. Contemporaneous feedback and evaluation provides a greater volume of feedback that is more detailed than end-of-course evaluations. Twitter is an effective and easy means to provide this feedback. Limitations included the length of study time and the inability to have all of the EM residents involved in the study. © 2014 John Wiley & Sons Ltd.

  19. Generic Verification Protocol for Testing Pesticide Application Spray Drift Reduction Technologies for Row and Field Crops

    EPA Pesticide Factsheets

    This generic verification protocol provides a detailed method to conduct and report results from a verification test of pesticide application technologies that can be used to evaluate these technologies for their potential to reduce spray drift.

  20. Impact of Universities' Promotional Materials on College Choice.

    ERIC Educational Resources Information Center

    Armstrong, Jami J.; Lumsden, D. Barry

    1999-01-01

    Evaluated the impact of printed promotional materials on the recruitment of college freshmen using focus groups of students attending a large, southern metropolitan university. Students provided detailed suggestions on ways to improve the method of distribution, graphic design, and content of the materials. (Author/DB)

  1. Understanding the Biology and Technology of ToxCast and Tox21 Assays

    EPA Science Inventory

    The ToxCast high-throughput toxicity (HTT) testing methods have been developed to evaluate the hazard potential of diverse environmental, industrial and consumer product chemicals. The main goal is prioritizing the compounds of greatest concern for more detailed toxicological stu...

  2. Weightless Environment Training Facility (WETF) materials coating evaluation, volume 3

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This volume consists of Appendices C, D, E, and F to the report on the Weightless Environment Training Facility Materials Coating Evaluation project. The project selected 10 coating systems to be evaluated in six separate exposure environments, and subject to three tests for physical properties. Appendix C is the photographic appendix of the test panels. Appendix D details methods and procedures. Appendix E lists application equipment costs. Appendix F is a compilation of the solicitation of the candidate coating systems.

  3. Pan-sharpening algorithm to remove thin cloud via mask dodging and nonsampled shift-invariant shearlet transform

    NASA Astrophysics Data System (ADS)

    Shi, Cheng; Liu, Fang; Li, Ling-Ling; Hao, Hong-Xia

    2014-01-01

    The goal of pan-sharpening is to get an image with higher spatial resolution and better spectral information. However, the resolution of the pan-sharpened image is seriously affected by thin clouds. For a single image, filtering algorithms are widely used to remove clouds. These kinds of methods can remove clouds effectively, but the loss of detail in the cloud-removed image is also serious. To solve this problem, a pan-sharpening algorithm to remove thin cloud via mask dodging and nonsampled shift-invariant shearlet transform (NSST) is proposed. For the low-resolution multispectral (LR MS) and high-resolution panchromatic images with thin clouds, a mask dodging method is used to remove clouds. For the cloud-removed LR MS image, an adaptive principal component analysis transform is proposed to balance the spectral information and spatial resolution in the pan-sharpened image. Since the cloud removal process causes detail loss, a weight matrix is designed to enhance the details of the cloud regions in the pan-sharpening process, while noncloud regions remain unchanged. The details of the image are obtained by NSST. Experimental results, assessed both visually and with evaluation metrics, demonstrate that the proposed method preserves better spectral information and spatial resolution, especially for images with thin clouds.

  4. Promoting a smokers' quitline in Ontario, Canada: an evaluation of an academic detailing approach.

    PubMed

    Kirst, Maritt; Schwartz, Robert

    2015-06-01

    This study assesses the impact of an academic detailing quitline promotional outreach program on integration of patient referrals to the quitline by fax in healthcare settings and quitline utilization in Ontario, Canada. The study employed a mixed methods approach for evaluation, with trend analysis of quitline administrative data from the year before program inception (2005) to 2011 and qualitative interviews with quitline stakeholders. Participants in the qualitative interviews included academic detailing program staff, regional tobacco control stakeholders and quitline promotion experts. Quantitative outcomes included the number of fax referral partners and fax referrals received, and quitline reach. Trends in proximal and distal outreach program outcomes were assessed. The qualitative data were analysed through a process of data coding involving the constant comparative technique derived from grounded theory methods. The study identified that the outreach program has had some success in integrating the fax referral program in healthcare settings through evidence of increased fax referrals since program inception. However, organizational barriers to program partner engagement have been encountered. While referral from health professionals through the fax referral programs has increased since the inception of the outreach program, the overall reach of the quitline has not increased. The study findings highlight that an academic detailing approach to quitline promotion can have some success in achieving increased fax referral program integration in healthcare settings. However, findings suggest that investment in a comprehensive promotional strategy, incorporating academic detailing, media and the provision of free cessation medications may be a more effective approach to quitline promotion. © The Author (2013). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  5. Advances in the Use of Thermography to Inspect Composite Tanks for Liquid Fuel Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Lansing, Matthew D.; Russell, Samuel S.; Walker, James L.; Jones, Clyde S. (Technical Monitor)

    2001-01-01

    This viewgraph presentation gives an overview of advances in the use of thermography to inspect composite tanks for liquid fuel propulsion systems. Details are given on the thermographic inspection system, thermographic analysis method (includes scan and defect map, method of inspection, and inclusions, ply wrinkle, and delamination defects), graphite composite cryogenic feedline (including method, image map, and deep/shallow inclusions and resin rich area defects), and material degradation nondestructive evaluation.

  6. Evaluating Sleep Disturbance: A Review of Methods

    NASA Technical Reports Server (NTRS)

    Smith, Roy M.; Oyung, R.; Gregory, K.; Miller, D.; Rosekind, M.; Rosekind, Mark R. (Technical Monitor)

    1996-01-01

    There are three general approaches to evaluating sleep disturbance with regard to noise: subjective, behavioral, and physiological. Subjective methods range from standardized questionnaires and scales to self-report measures designed for specific research questions. There are two behavioral methods that provide useful sleep disturbance data. One behavioral method is actigraphy, a motion detector that provides an empirical estimate of sleep quantity and quality. An actigraph, worn on the non-dominant wrist, provides a 24-hr estimate of the rest/activity cycle. The other method involves a behavioral response, either to a specific probe or stimulus or subject-initiated (e.g., indicating wakefulness). The classic gold standard for evaluating sleep disturbance is continuous physiological monitoring of brain, eye, and muscle activity. This allows detailed distinctions of the states and stages of sleep, awakenings, and sleep continuity. Physiological data can be obtained in controlled laboratory settings and in natural environments. Current ambulatory physiological recording equipment allows evaluation in home and work settings. These approaches will be described and the relative strengths and limitations of each method will be discussed.

  7. The Hundred Languages of Children Exhibition: A Unique Early Childhood Education Professional Development Program. Final Evaluation Report (September 15 to December 15, 1998).

    ERIC Educational Resources Information Center

    Abramson, Shareen; Huggins, Joyce M.

    The "Exhibition of the Hundred Languages of Children" (HLC) was organized in the early 1980s by the early childhood schools in Reggio Emilia, Italy to promote the study of their educational methods and to reveal the potential of young children for learning and creative expression. This report details an evaluation of the exhibition and…

  8. Logistics Enterprise Evaluation Model Based On Fuzzy Clustering Analysis

    NASA Astrophysics Data System (ADS)

    Fu, Pei-hua; Yin, Hong-bo

    In this thesis, we introduce an evaluation model for logistics enterprises based on a fuzzy clustering algorithm. First, we present the evaluation index system, which contains basic information, management level, technical strength, transport capacity, informatization level, market competition and customer service. We determine the index weights according to the grades and evaluate the integrated ability of the logistics enterprises using the fuzzy cluster analysis method. We describe the system evaluation module and the cluster analysis module in detail, including how the two modules were implemented. Finally, we give the results of the system.
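
    The clustering step can be illustrated with a minimal fuzzy c-means implementation; the enterprise index scores below are hypothetical, and the thesis's full model adds the weighting and evaluation modules around this core.

    ```python
    import numpy as np

    def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, seed=0):
        """Minimal fuzzy c-means. X: (n_samples, n_indexes) score matrix."""
        rng = np.random.default_rng(seed)
        U = rng.random((len(X), c))
        U /= U.sum(axis=1, keepdims=True)              # memberships sum to 1
        for _ in range(n_iter):
            W = U ** m
            centers = (W.T @ X) / W.sum(axis=0)[:, None]
            d = np.linalg.norm(X[:, None, :] - centers[None], axis=2) + 1e-12
            U = 1.0 / (d ** (2 / (m - 1)))             # inverse-distance update
            U /= U.sum(axis=1, keepdims=True)
        return centers, U

    # Hypothetical normalized index scores for five logistics enterprises
    scores = np.array([[0.80, 0.70, 0.90], [0.40, 0.50, 0.30],
                       [0.90, 0.80, 0.85], [0.30, 0.40, 0.35],
                       [0.60, 0.55, 0.60]])
    centers, memberships = fuzzy_c_means(scores, c=2)
    print(memberships.round(2))
    ```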

  9. Effects of methodological variation on assessment of riboflavin status using the erythrocyte glutathione reductase activation coefficient assay.

    PubMed

    Hill, Marilyn H E; Bradley, Angela; Mushtaq, Sohail; Williams, Elizabeth A; Powers, Hilary J

    2009-07-01

    Riboflavin status is usually measured as the in vitro stimulation with flavin adenine dinucleotide (FAD) of the erythrocyte enzyme glutathione reductase, and expressed as an erythrocyte glutathione reductase activation coefficient (EGRAC). This method is used for the National Diet and Nutrition Surveys (NDNS) of the UK. In the period between the 1990 and 2003 surveys of UK adults, the estimated prevalence of riboflavin deficiency, expressed as an EGRAC value ≥ 1.30, increased from 2% to 46% in males and from 1% to 34% in females. We hypothesised that subtle but important differences in the detail of the methodology between the two NDNS accounted for this difference. We carried out an evaluation of the performance of the methods used in the two NDNS and compared them against an 'in-house' method, using blood samples collected from a riboflavin intervention study. Results indicated that the method used for the 1990 NDNS gave a significantly lower mean EGRAC value than both the 2003 NDNS method and the 'in-house' method (P < 0.0001). The key differences between the methods relate to the concentration of FAD used in the assay and the duration of the period of incubation of FAD with the enzyme. The details of the EGRAC method should be standardised for use in different laboratories and over time. Additionally, it is proposed that consideration be given to re-evaluating the basis of the EGRAC threshold for riboflavin deficiency.
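
    The coefficient itself is a simple ratio, which makes the methodological sensitivities described above easy to see; the enzyme activities below are hypothetical.

    ```python
    def egrac(activity_with_fad, activity_without_fad):
        """EGRAC = FAD-stimulated glutathione reductase activity divided by
        basal (unstimulated) activity; EGRAC >= 1.30 was the deficiency
        threshold used in the surveys discussed above."""
        return activity_with_fad / activity_without_fad

    # Hypothetical enzyme activities (arbitrary units)
    print(egrac(42.0, 30.0))  # 1.4 -> classified as riboflavin-deficient
    ```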

  10. Evaluating a Control System Architecture Based on a Formally Derived AOCS Model

    NASA Astrophysics Data System (ADS)

    Ilic, Dubravka; Latvala, Timo; Varpaaniemi, Kimmo; Vaisanen, Pauli; Troubitsyna, Elena; Laibinis, Linas

    2010-08-01

    Attitude & Orbit Control System (AOCS) refers to a wider class of control systems which are used to determine and control the attitude of the spacecraft while in orbit, based on the information obtained from various sensors. In this paper, we propose an approach to evaluate a typical (yet somewhat simplified) AOCS architecture using formal development - based on the Event-B method. As a starting point, an Ada specification of the AOCS is translated into a formal specification and further refined to incorporate all the details of its original source code specification. This way we are able not only to evaluate the Ada specification by expressing and verifying specific system properties in our formal models, but also to determine how well the chosen modelling framework copes with the level of detail required for an actual implementation and code generation from the derived models.

  11. Standard Test Methods for Textile Composites

    NASA Technical Reports Server (NTRS)

    Masters, John E.; Portanova, Marc A.

    1996-01-01

    Standard testing methods for composite laminates reinforced with continuous networks of braided, woven, or stitched fibers have been evaluated. The microstructure of these 'textile' composite materials differs significantly from that of tape laminates. Consequently, specimen dimensions and loading methods developed for tape-type composites may not be applicable to textile composites. To this end, a series of evaluations were made comparing testing practices currently used in the composite industry. Information was gathered from a variety of sources and analyzed to establish a series of recommended test methods for textile composites. The current practices established for laminated composite materials by ASTM and the MIL-HDBK-17 Committee were considered. This document provides recommended test methods for determining both in-plane and out-of-plane properties. Specifically, test methods are suggested for: unnotched tension and compression; open and filled hole tension; open hole compression; bolt bearing; and interlaminar tension. A detailed description of the material architectures evaluated is also provided, as is a recommended instrumentation practice.

  12. A novel evaluation method for building construction project based on integrated information entropy with reliability theory.

    PubMed

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting construction schemes of the building engineering project is a complex multiobjective optimization decision process, in which many indexes need to be selected to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses the quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for progress, quality, and safety indexes, and integrates engineering economics, reliability theories, and information entropy theory to present a new evaluation method for building construction project. Combined with a practical case, this paper also presents detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making analysis and decision. Presented method can offer valuable references for risk computing of building construction projects.
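
    One standard way to realize the information-entropy part of such a method is the entropy weight technique sketched below; the index matrix is hypothetical, and the paper's full method additionally incorporates reliability theory and the qualitative/quantitative index scoring described above.

    ```python
    import numpy as np

    def entropy_weights(matrix):
        """Information-entropy weighting. matrix: rows = candidate
        construction schemes, columns = indexes (e.g., cost, progress,
        quality, safety), normalized so that larger is better."""
        P = matrix / matrix.sum(axis=0)                       # column-wise shares
        n = len(matrix)
        E = -(P * np.log(P + 1e-12)).sum(axis=0) / np.log(n)  # entropy per index
        d = 1 - E                                             # degree of diversity
        return d / d.sum()                                    # normalized weights

    # Hypothetical normalized scores for three schemes on four indexes
    M = np.array([[0.9, 0.6, 0.8, 0.7],
                  [0.7, 0.8, 0.7, 0.9],
                  [0.6, 0.9, 0.9, 0.6]])
    w = entropy_weights(M)
    print("index weights:", w.round(3), "scheme scores:", (M @ w).round(3))
    ```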

  13. NERC Policy 10: Measurement of two generation and load balancing IOS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spicer, P.J.; Galow, G.G.

    1999-11-01

    Policy 10 will describe specific standards and metrics for most of the reliability functions described in the Interconnected Operations Services Working Group (IOS WG) report. The purpose of this paper is to discuss, in detail, the proposed metrics for two generation and load balancing IOSs: Regulation and Load Following. For purposes of this paper, metrics include both measurement and performance evaluation. The measurement methods discussed are included in the current draft of the proposed Policy 10. The performance evaluation method discussed is offered by the authors for consideration by the IOS ITF (Implementation Task Force) for inclusion into Policy 10.

  14. Effective Approaches to Disorientation Familiarization for Aviation Personnel.

    ERIC Educational Resources Information Center

    Collins, William E.

    Techniques are discussed for providing familiarization of aviation personnel with disorientation problems (dizziness). Procedures are spelled out in detail. Methods of modifying existing equipment as well as an evaluation of available commercial equipment, are presented. The techniques have been used with notable success both at the Civil…

  15. Developing a Reporting Guideline for Social and Psychological Intervention Trials

    ERIC Educational Resources Information Center

    Grant, Sean; Montgomery, Paul; Hopewell, Sally; Macdonald, Geraldine; Moher, David; Mayo-Wilson, Evan

    2013-01-01

    Social and psychological interventions are often complex. Understanding randomized controlled trials (RCTs) of these complex interventions requires a detailed description of the interventions tested and the methods used to evaluate them; however, RCT reports often omit, or inadequately report, this information. Incomplete and inaccurate reporting…

  16. Comprehensive histological evaluation of bone implants

    PubMed Central

    Rentsch, Claudia; Schneiders, Wolfgang; Manthey, Suzanne; Rentsch, Barbe; Rammelt, Stephan

    2014-01-01

    When investigating and assessing bone regeneration around new implant materials in sheep, classical histological staining methods as well as immunohistochemistry may provide information beyond standard radiographs or computed tomography. However, published data on bone defect regeneration in sheep often present unlabeled or sparsely labeled histological images. Frequently, the exact location of the sample remains unclear, detail enlargements are missing and the labeling of different tissues or cells is absent. The aim of this article is to present an overview of sample preparation and staining methods and their benefits, together with a detailed histological description of bone regeneration in the sheep tibia. General histological staining methods such as hematoxylin and eosin, Masson-Goldner trichrome, Movat's pentachrome and alcian blue were used to define new bone formation within a sheep tibia critical-size defect containing a polycaprolactone-co-lactide (PCL) scaffold implanted for 3 months (n = 4). Special attention was given to describing the bone healing patterns down to the cell level. Additionally, one histological quantification method and immunohistochemical staining methods are described. PMID:24504113

  17. Evaluating nuclear physics inputs in core-collapse supernova models

    NASA Astrophysics Data System (ADS)

    Lentz, E.; Hix, W. R.; Baird, M. L.; Messer, O. E. B.; Mezzacappa, A.

    Core-collapse supernova models depend on the details of the nuclear and weak interaction physics inputs just as they depend on the details of the macroscopic physics (transport, hydrodynamics, etc.), numerical methods, and progenitors. We present preliminary results from our ongoing comparison studies of nuclear and weak interaction physics inputs to core-collapse supernova models using the spherically-symmetric, general relativistic, neutrino radiation hydrodynamics code Agile-Boltztran. We focus on comparisons of the effects of the nuclear EoS and the effects of improving the opacities, particularly neutrino-nucleon interactions.

  18. Diagnosis of condensation-induced waterhammer: Methods and background

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Izenson, M.G.; Rothe, P.H.; Wallis, G.B.

    This guidebook provides reference material and diagnostic procedures concerning condensation-induced waterhammer in nuclear power plants. Condensation-induced waterhammer is the most damaging form of waterhammer and its diagnosis is complicated by the complex nature of the underlying phenomena. In Volume 1, the guidebook groups condensation-induced waterhammers into five event classes which have similar phenomena and levels of damage. Diagnostic guidelines focus on locating the event center where condensation and slug acceleration take place. Diagnosis is described in three stages: an initial assessment, detailed evaluation and final confirmation. Graphical scoping analyses are provided to evaluate whether an event from one of the event classes could have occurred at the event center. Examples are provided for each type of waterhammer. Special instructions are provided for walking down damaged piping and evaluating damage due to waterhammer. To illustrate the diagnostic methods and document past experience, six case studies have been compiled in Volume 2. These case studies, based on actual condensation-induced waterhammer events at nuclear plants, present detailed data and work through the event diagnosis using the tools introduced in the first volume. 65 figs., 8 tabs.

  19. Evaluation of Patient Centered Medical Home Practice Transformation Initiatives

    PubMed Central

    Crabtree, Benjamin F.; Chase, Sabrina M.; Wise, Christopher G.; Schiff, Gordon D.; Schmidt, Laura A.; Goyzueta, Jeanette R.; Malouin, Rebecca A.; Payne, Susan M. C.; Quinn, Michael T.; Nutting, Paul A.; Miller, William L.; Jaén, Carlos Roberto

    2011-01-01

    Background The patient-centered medical home (PCMH) has become a widely cited solution to the deficiencies in primary care delivery in the United States. To achieve the magnitude of change being called for in primary care, quality improvement interventions must focus on whole-system redesign, and not just isolated parts of medical practices. Methods Investigators participating in 9 different evaluations of Patient Centered Medical Home implementation shared experiences, methodological strategies, and evaluation challenges for evaluating primary care practice redesign. Results A year-long iterative process of sharing and reflecting on experiences produced consensus on 7 recommendations for future PCMH evaluations: (1) look critically at models being implemented and identify aspects requiring modification; (2) include embedded qualitative and quantitative data collection to detail the implementation process; (3) capture details concerning how different PCMH components interact with one another over time; (4) understand and describe how and why physician and staff roles do, or do not evolve; (5) identify the effectiveness of individual PCMH components and how they are used; (6) capture how primary care practices interface with other entities such as specialists, hospitals, and referral services; and (7) measure resources required for initiating and sustaining innovations. Conclusions Broad-based longitudinal, mixed-methods designs that provide for shared learning among practice participants, program implementers, and evaluators are necessary to evaluate the novelty and promise of the PCMH model. All PCMH evaluations should be as comprehensive as possible, and at a minimum should include a combination of brief observations and targeted qualitative interviews along with quantitative measures. PMID:21079525

  20. Evaluation of Wavelet Denoising Methods for Small-Scale Joint Roughness Estimation Using Terrestrial Laser Scanning

    NASA Astrophysics Data System (ADS)

    Bitenc, M.; Kieffer, D. S.; Khoshelham, K.

    2015-08-01

    The precision of Terrestrial Laser Scanning (TLS) data depends mainly on the inherent random range error, which hinders extraction of small details from TLS measurements. New post processing algorithms have been developed that reduce or eliminate the noise and therefore enable modelling details at a smaller scale than one would traditionally expect. The aim of this research is to find the optimum denoising method such that the corrected TLS data provides a reliable estimation of small-scale rock joint roughness. Two wavelet-based denoising methods are considered, namely Discrete Wavelet Transform (DWT) and Stationary Wavelet Transform (SWT), in combination with different thresholding procedures. The question is, which technique provides a more accurate roughness estimates considering (i) wavelet transform (SWT or DWT), (ii) thresholding method (fixed-form or penalised low) and (iii) thresholding mode (soft or hard). The performance of denoising methods is tested by two analyses, namely method noise and method sensitivity to noise. The reference data are precise Advanced TOpometric Sensor (ATOS) measurements obtained on 20 × 30 cm rock joint sample, which are for the second analysis corrupted by different levels of noise. With such a controlled noise level experiments it is possible to evaluate the methods' performance for different amounts of noise, which might be present in TLS data. Qualitative visual checks of denoised surfaces and quantitative parameters such as grid height and roughness are considered in a comparative analysis of denoising methods. Results indicate that the preferred method for realistic roughness estimation is DWT with penalised low hard thresholding.
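
    A minimal sketch of one of the compared variants (DWT with the fixed-form, i.e. universal, threshold) using PyWavelets follows; the height grid here is a random placeholder rather than real TLS data.

    ```python
    import numpy as np
    import pywt

    def dwt_denoise(surface, wavelet="db4", level=3, mode="soft"):
        """2-D DWT denoising with the fixed-form (universal) threshold.
        'surface' is a 2-D grid of TLS heights; mode is 'soft' or 'hard'."""
        coeffs = pywt.wavedec2(surface, wavelet, level=level)
        # Noise estimate from the finest-level diagonal detail coefficients
        sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
        thresh = sigma * np.sqrt(2 * np.log(surface.size))  # fixed-form rule
        denoised = [coeffs[0]] + [
            tuple(pywt.threshold(d, thresh, mode=mode) for d in detail)
            for detail in coeffs[1:]
        ]
        return pywt.waverec2(denoised, wavelet)

    noisy = np.random.default_rng(0).normal(size=(128, 128))  # placeholder grid
    smooth = dwt_denoise(noisy, mode="hard")  # or mode="soft"
    ```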

  1. Nighttime images fusion based on Laplacian pyramid

    NASA Astrophysics Data System (ADS)

    Wu, Cong; Zhan, Jinhao; Jin, Jicheng

    2018-02-01

    This paper describes average weighted fusion, image pyramid fusion, and wavelet transform methods, and applies them to the fusion of multiple-exposure nighttime images. By calculating the information entropy and cross entropy of the fused images, we can evaluate the effect of the different fusion methods. Experiments showed that the Laplacian pyramid image fusion algorithm is suitable for nighttime image fusion; it can reduce halo while preserving image details.
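
    A compact sketch of Laplacian pyramid fusion with the entropy metric mentioned above follows; the max-magnitude rule for detail levels and averaging for the base level are common choices assumed here, not necessarily the paper's exact rule. Inputs are assumed to be same-sized 8-bit grayscale exposures.

    ```python
    import cv2
    import numpy as np

    def laplacian_pyramid(img, levels=4):
        """Laplacian pyramid: differences between successive Gaussian levels."""
        g = img.astype(np.float32)
        pyr = []
        for _ in range(levels):
            down = cv2.pyrDown(g)
            up = cv2.pyrUp(down, dstsize=(g.shape[1], g.shape[0]))
            pyr.append(g - up)          # band-pass detail level
            g = down
        pyr.append(g)                   # coarsest Gaussian level
        return pyr

    def fuse_exposures(images, levels=4):
        """Fuse exposures: max-magnitude per detail level, mean base level."""
        pyrs = [laplacian_pyramid(im, levels) for im in images]
        fused = []
        for lvl in range(levels):
            stack = np.stack([p[lvl] for p in pyrs])
            idx = np.abs(stack).argmax(axis=0)
            fused.append(np.take_along_axis(stack, idx[None], axis=0)[0])
        fused.append(np.mean([p[-1] for p in pyrs], axis=0))
        out = fused[-1]                 # reconstruct coarse-to-fine
        for lap in reversed(fused[:-1]):
            out = cv2.pyrUp(out, dstsize=(lap.shape[1], lap.shape[0])) + lap
        return np.clip(out, 0, 255).astype(np.uint8)

    def information_entropy(img):
        """Shannon entropy of the grey-level histogram, used as a fusion metric."""
        hist = np.bincount(img.ravel(), minlength=256) / img.size
        hist = hist[hist > 0]
        return float(-(hist * np.log2(hist)).sum())
    ```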

  2. Comparison of Provider Types Who Performed Prehospital Lifesaving Interventions: A Prospective Study

    DTIC Science & Technology

    2014-12-01

    In less than 2 hours, 15 critically ill children were triaged and admitted to the PICU or surge spaces. Conclusions: Identified strengths included... details increasing telemedicine utilization during a 4-year period and outlines program structural changes that improved utilization. Methods: The study... population survival. CSC ICU resource-allocation algorithms (ALGs) exist for adults. Our goal was to evaluate a CSC pandemic ALG for children. Methods

  3. The truncated conjugate gradient (TCG), a non-iterative/fixed-cost strategy for computing polarization in molecular dynamics: Fast evaluation of analytical forces

    NASA Astrophysics Data System (ADS)

    Aviat, Félix; Lagardère, Louis; Piquemal, Jean-Philip

    2017-10-01

    In a recent paper [F. Aviat et al., J. Chem. Theory Comput. 13, 180-190 (2017)], we proposed the Truncated Conjugate Gradient (TCG) approach to compute the polarization energy and forces in polarizable molecular simulations. The method consists in truncating the conjugate gradient algorithm at a fixed predetermined order leading to a fixed computational cost and can thus be considered "non-iterative." This gives the possibility to derive analytical forces avoiding the usual energy conservation (i.e., drifts) issues occurring with iterative approaches. A key point concerns the evaluation of the analytical gradients, which is more complex than that with a usual solver. In this paper, after reviewing the present state of the art of polarization solvers, we detail a viable strategy for the efficient implementation of the TCG calculation. The complete cost of the approach is then measured as it is tested using a multi-time step scheme and compared to timings using usual iterative approaches. We show that the TCG methods are more efficient than traditional techniques, making it a method of choice for future long molecular dynamics simulations using polarizable force fields where energy conservation matters. We detail the various steps required for the implementation of the complete method by software developers.

  4. The truncated conjugate gradient (TCG), a non-iterative/fixed-cost strategy for computing polarization in molecular dynamics: Fast evaluation of analytical forces.

    PubMed

    Aviat, Félix; Lagardère, Louis; Piquemal, Jean-Philip

    2017-10-28

    In a recent paper [F. Aviat et al., J. Chem. Theory Comput. 13, 180-190 (2017)], we proposed the Truncated Conjugate Gradient (TCG) approach to compute the polarization energy and forces in polarizable molecular simulations. The method consists in truncating the conjugate gradient algorithm at a fixed predetermined order leading to a fixed computational cost and can thus be considered "non-iterative." This gives the possibility to derive analytical forces avoiding the usual energy conservation (i.e., drifts) issues occurring with iterative approaches. A key point concerns the evaluation of the analytical gradients, which is more complex than that with a usual solver. In this paper, after reviewing the present state of the art of polarization solvers, we detail a viable strategy for the efficient implementation of the TCG calculation. The complete cost of the approach is then measured as it is tested using a multi-time step scheme and compared to timings using usual iterative approaches. We show that the TCG methods are more efficient than traditional techniques, making it a method of choice for future long molecular dynamics simulations using polarizable force fields where energy conservation matters. We detail the various steps required for the implementation of the complete method by software developers.
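
    Since the defining idea is simply conjugate gradient truncated at a fixed order, it can be sketched generically; T and E below are stand-ins for the polarization interaction matrix and the field vector, and the analytical-force machinery the paper develops is not shown.

    ```python
    import numpy as np

    def truncated_cg(T, E, order=3):
        """TCG-n: exactly n conjugate-gradient steps for T mu ~ E, with no
        convergence test, hence a fixed, predetermined cost."""
        mu = np.zeros_like(E)
        r = E - T @ mu
        p = r.copy()
        for _ in range(order):
            Tp = T @ p
            alpha = (r @ r) / (p @ Tp)
            mu = mu + alpha * p
            r_new = r - alpha * Tp
            beta = (r_new @ r_new) / (r @ r)
            p = r_new + beta * p
            r = r_new
        return mu

    # Small symmetric positive-definite test system
    rng = np.random.default_rng(1)
    A = rng.normal(size=(6, 6))
    T = A @ A.T + 6 * np.eye(6)
    E = rng.normal(size=6)
    print(truncated_cg(T, E, order=3))
    ```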

  5. A GIHS-based spectral preservation fusion method for remote sensing images using edge restored spectral modulation

    NASA Astrophysics Data System (ADS)

    Zhou, Xiran; Liu, Jun; Liu, Shuguang; Cao, Lei; Zhou, Qiming; Huang, Huawen

    2014-02-01

    High spatial resolution and spectral fidelity are basic standards for evaluating an image fusion algorithm. Numerous fusion methods for remote sensing images have been developed. Some of these methods are based on the intensity-hue-saturation (IHS) transform and the generalized IHS (GIHS), which may cause serious spectral distortion. Spectral distortion in the GIHS is proven to result from changes in saturation during fusion. Therefore, reducing such changes can achieve high spectral fidelity. A GIHS-based spectral preservation fusion method that can theoretically reduce spectral distortion is proposed in this study. The proposed algorithm consists of two steps. The first step is spectral modulation (SM), which uses the Gaussian function to extract spatial details and conduct SM of multispectral (MS) images. This method yields a desirable visual effect without requiring histogram matching between the panchromatic image and the intensity of the MS image. The second step uses the Gaussian convolution function to restore lost edge details during SM. The proposed method is proven effective and shown to provide better results compared with other GIHS-based methods.
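
    For context, the GIHS baseline that such methods build on injects the difference between the panchromatic band and the mean multispectral intensity into every band; the paper's Gaussian spectral modulation and edge restoration are layered on top of this and are not shown.

    ```python
    import numpy as np

    def gihs_fuse(ms, pan):
        """Textbook GIHS detail injection.
        ms: (bands, H, W) upsampled multispectral image; pan: (H, W)."""
        intensity = ms.mean(axis=0)                  # generalized intensity I
        return ms + (pan - intensity)[None, :, :]    # same detail in every band
    ```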

  6. Evolution of optic nerve photography for glaucoma screening: a review.

    PubMed

    Myers, Jonathan S; Fudemberg, Scott J; Lee, Daniel

    2018-03-01

    Visual evaluation of the optic nerve is one of the earliest and most widely used methods of evaluating patients for glaucoma. Photography has proven very useful for documenting the nerve's appearance at a given time, allowing more detailed scrutiny at that time and later comparison for change. Photography serves as the basis for real-time or non-simultaneous review in telemedicine and screening events, allowing fundus and optic nerve evaluation by experts elsewhere. Expert evaluation of disc photographs has shown diagnostic performance similar to other methods of optic nerve evaluation for glaucoma. Newer technology has made optic nerve photography simpler, cheaper and more portable, creating opportunities for broader use by non-physicians in screening underserved populations. Recent investigations suggest that non-physicians or software algorithms for disc photograph evaluation show promise for allowing more screening to be done with fewer experts. © 2017 Royal Australian and New Zealand College of Ophthalmologists.

  7. Goal-oriented evaluation of binarization algorithms for historical document images

    NASA Astrophysics Data System (ADS)

    Obafemi-Ajayi, Tayo; Agam, Gady

    2013-01-01

    Binarization is of significant importance in document analysis systems. It is an essential first step, prior to further stages such as Optical Character Recognition (OCR), document segmentation, or enhancement of readability of the document after some restoration stages. Hence, proper evaluation of binarization methods to verify their effectiveness is of great value to the document analysis community. In this work, we perform a detailed goal-oriented evaluation of image quality assessment of the 18 binarization methods that participated in the DIBCO 2011 competition using the 16 historical document test images used in the contest. We are interested in the image quality assessment of the outputs generated by the different binarization algorithms as well as the OCR performance, where possible. We compare our evaluation of the algorithms based on human perception of quality to the DIBCO evaluation metrics. The results obtained provide an insight into the effectiveness of these methods with respect to human perception of image quality as well as OCR performance.
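
    The quantitative side of such evaluations typically includes the DIBCO-style F-measure over foreground pixels, sketched here; this is a standard metric, not the paper's full evaluation protocol.

    ```python
    import numpy as np

    def binarization_f_measure(result, ground_truth):
        """F-measure: harmonic mean of precision and recall over foreground
        (ink) pixels. Inputs are boolean arrays with True = foreground."""
        tp = np.logical_and(result, ground_truth).sum()
        precision = tp / max(result.sum(), 1)
        recall = tp / max(ground_truth.sum(), 1)
        return 2 * precision * recall / max(precision + recall, 1e-12)
    ```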

  8. A Sample of NASA Langley Unsteady Pressure Experiments for Computational Aerodynamics Code Evaluation

    NASA Technical Reports Server (NTRS)

    Schuster, David M.; Scott, Robert C.; Bartels, Robert E.; Edwards, John W.; Bennett, Robert M.

    2000-01-01

    As computational fluid dynamics methods mature, code development is rapidly transitioning from prediction of steady flowfields to unsteady flows. This change in emphasis offers a number of new challenges to the research community, not the least of which is obtaining detailed, accurate unsteady experimental data with which to evaluate new methods. Researchers at NASA Langley Research Center (LaRC) have been actively measuring unsteady pressure distributions for nearly 40 years. Over the last 20 years, these measurements have focused on developing high-quality datasets for use in code evaluation. This paper provides a sample of unsteady pressure measurements obtained by LaRC and available for government, university, and industry researchers to evaluate new and existing unsteady aerodynamic analysis methods. A number of cases are highlighted and discussed with attention focused on the unique character of the individual datasets and their perceived usefulness for code evaluation. Ongoing LaRC research in this area is also presented.

  9. A Model For Teaching Advanced Neuroscience Methods: A Student-Run Seminar to Increase Practical Understanding and Confidence

    PubMed Central

    Harrison, Theresa M.; Ching, Christopher R. K.; Andrews, Anne M.

    2016-01-01

    Neuroscience doctoral students must master specific laboratory techniques and approaches to complete their thesis work (hands-on learning). Due to the highly interdisciplinary nature of the field, learning about a diverse range of methodologies through literature surveys and coursework is also necessary for student success (hands-off learning). Traditional neuroscience coursework stresses what is known about the nervous system with relatively little emphasis on the details of the methods used to obtain this knowledge. Furthermore, hands-off learning is made difficult by a lack of detail in methods sections of primary articles, subfield-specific jargon and vague experimental rationales. We designed a student-taught course to enable first-year neuroscience doctoral students to overcome difficulties in hands-off learning by introducing a new approach to reading and presenting primary research articles that focuses on methodology. In our literature-based course students were encouraged to present a method with which they had no previous experience. To facilitate weekly discussions, “experts” were invited to class sessions. Experts were advanced graduate students who had hands-on experience with the method being covered and served as discussion co-leaders. Self-evaluation worksheets were administered on the first and last days of the 10-week course and used to assess students’ confidence in discussing research and methods outside of their primary research expertise. These evaluations revealed that the course significantly increased the students’ confidence in reading, presenting and discussing a wide range of advanced neuroscience methods. PMID:27980464

  10. Endoscopic Shearography and Thermography Methods for Nondestructive Evaluation of Lined Pressure Vessels

    NASA Technical Reports Server (NTRS)

    Russell, S. S.; Lansing, M. D.

    1997-01-01

    The goal of this research effort was the development of methods for shearographic and thermographic inspection of coatings, bonds, or laminates inside rocket fuel or oxidizer tanks, fuel lines, and other closed structures. The endoscopic methods allow imaging and inspection inside cavities that are traditionally inaccessible with shearography or thermography cameras. The techniques are demonstrated and suggestions for practical application are made in this report. Drawings of the experimental setups, detailed procedures, and experimental data are included.

  11. Endoscopic Shearography and Thermography Methods for Nondestructive Evaluation of Lined Pressure Vessels

    NASA Technical Reports Server (NTRS)

    Lansing, Matthew D.; Bullock, Michael W.

    1996-01-01

    The goal of this research effort was the development of methods for shearography and thermography inspection of coatings, bonds, or laminates inside rocket fuel or oxidizer tanks, fuel lines, and other closed structures. The endoscopic methods allow imaging and inspection inside cavities which are traditionally inaccessible with shearography or thermography cameras. The techniques are demonstrated and suggestions for practical application are made in this report. Drawings of the experimental setups, detailed procedures, and experimental data are included.

  12. Prediction of overall and blade-element performance for axial-flow pump configurations

    NASA Technical Reports Server (NTRS)

    Serovy, G. K.; Kavanagh, P.; Okiishi, T. H.; Miller, M. J.

    1973-01-01

    A method and a digital computer program for prediction of the distributions of fluid velocity and properties in axial-flow pump configurations are described and evaluated. The method uses the blade-element flow model and an iterative numerical solution of the radial equilibrium and continuity conditions. Correlated experimental results are used to generate alternative methods for estimating blade-element turning and loss characteristics. Detailed descriptions of the computer program are included, with example input and typical computed results.
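
    For readers unfamiliar with the underlying relations, here is a hedged sketch of what such a blade-element scheme conventionally iterates on, written in standard turbomachinery notation (the report's own symbols and formulation may differ). The simple radial equilibrium condition balances the radial pressure gradient against the swirl,

      \frac{1}{\rho}\,\frac{dp}{dr} \;=\; \frac{V_{\theta}^{2}}{r},

    and the continuity condition fixes the mass flow through the annulus,

      \dot{m} \;=\; \int_{r_h}^{r_t} 2\pi r\,\rho\,V_x(r)\,dr,

    where V_theta and V_x are the tangential and axial velocities and r_h, r_t the hub and tip radii. The iteration guesses an axial velocity profile V_x(r), obtains V_theta(r) from the blade-element turning correlations, updates V_x(r) from radial equilibrium, and rescales the profile until continuity is satisfied.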

  13. Variational method based on Retinex with double-norm hybrid constraints for uneven illumination correction

    NASA Astrophysics Data System (ADS)

    Li, Shuo; Wang, Hui; Wang, Liyong; Yu, Xiangzhou; Yang, Le

    2018-01-01

    The uneven illumination phenomenon reduces the quality of remote sensing images and interferes with subsequent processing and applications. A variational method based on Retinex with double-norm hybrid constraints for uneven illumination correction is proposed. The L1 norm and the L2 norm are adopted to constrain the textures and details of the reflectance image and the smoothness of the illumination image, respectively. The problem of separating the illumination image from the reflectance image is transformed into finding the optimal solution of the variational model. To accelerate the solution, the split Bregman method is used to decompose the variational model into three subproblems, which are solved by alternating iteration. Two groups of experiments were implemented on two synthetic images and three real remote sensing images. Compared with the variational Retinex method with a single-norm constraint and the Mask method, the proposed method performs better in both visual evaluation and quantitative measurements. The proposed method effectively eliminates uneven illumination while maintaining the textures and details of the remote sensing image. Moreover, the split Bregman solver is more than 10 times faster than solving the same model with the steepest descent method.
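
    One plausible log-domain form of the double-norm model (an illustrative reconstruction; the paper's exact functional may differ): with s the log of the observed image and l the log illumination to be estimated,

      \min_{l}\; \lVert \nabla (s - l) \rVert_{1} \;+\; \lambda\, \lVert \nabla l \rVert_{2}^{2},

    where the L1 term preserves the textures and details of the reflectance r = s - l and the L2 term enforces a smooth illumination field. In the split Bregman scheme, an auxiliary variable d \approx \nabla(s - l) and a Bregman variable b are introduced, giving the three alternating subproblems mentioned above: an l-update (a linear system), a d-update (shrinkage/soft-thresholding), and a b-update.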

  14. Digging Deeper into Helmont's Famous Willow Tree Experiment.

    ERIC Educational Resources Information Center

    Hershey, David R.

    1991-01-01

    Author provides details about Helmont's experiment that was a milestone in the history of plant biology since it marked the start of experimental plant physiology. Concludes that Jean Baptista van Helmont's supposedly simple experiment becomes complex when trying to determine the methods he used, evaluate his sources of experimental error, and…

  15. Scintigraphic evaluation in musculoskeletal sepsis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merkel, K.D.; Fitzgerald, R.H. Jr.; Brown, M.L.

    In this article, the mechanism of technetium, gallium, and indium-labeled white blood cell localization in septic processes is detailed, and the method of interpreting these three isotopes in relation to musculoskeletal infection is outlined. Specific clinical application of technetium, gallium, and indium-labeled white blood cell imaging for musculoskeletal sepsis is reviewed.

  16. 20180416 - Understanding the Biology and Technology of ToxCast and Tox21 Assays (SETAC Durham NC)

    EPA Science Inventory

    The ToxCast high-throughput toxicity (HTT) testing methods have been developed to evaluate the hazard potential of diverse environmental, industrial and consumer product chemicals. The main goal is prioritizing the compounds of greatest concern for more detailed toxicological stu...

  17. Using mixed methods to develop and evaluate complex interventions in palliative care research.

    PubMed

    Farquhar, Morag C; Ewing, Gail; Booth, Sara

    2011-12-01

    There is increasing interest in combining qualitative and quantitative research methods to provide comprehensiveness and greater knowledge yield. Mixed methods are valuable in the development and evaluation of complex interventions. They are therefore particularly valuable in palliative care research, where the majority of interventions are complex and the identification of outcomes particularly challenging. This paper aims to introduce the role of mixed methods in the development and evaluation of complex interventions in palliative care, and how they may be used in palliative care research. The paper defines mixed methods and outlines why and how mixed methods are used to develop and evaluate complex interventions, with a pragmatic focus on design and data collection issues and data analysis. Useful texts are signposted and illustrative examples provided of mixed method studies in palliative care, including a detailed worked example of the development and evaluation of a complex intervention in palliative care for breathlessness. Key challenges to conducting mixed methods in palliative care research are identified in relation to data collection, data integration in analysis, costs and dissemination, and how these might be addressed. The development and evaluation of complex interventions in palliative care benefit from the application of mixed methods. Mixed methods enable better understanding of whether and how an intervention works (or does not work) and inform the design of subsequent studies. However, they can be challenging: mixed method studies in palliative care will benefit from working with agreed protocols, multidisciplinary teams and engaging staff with appropriate skill sets.

  18. Evaluation of Treatment Technologies for Wastewater from Insensitive Munitions Production. Phase 1: Technology Down-Selection

    DTIC Science & Technology

    2013-11-01

    the AOP reactor according to the target process formulation. Gases were vented to a GAC vessel. ...destructive and filtration methods such as biological treatment (destructive), chemical reduction (destructive), reverse osmosis (RO)/nanofiltration (filtration), and advanced oxidation processes (destructive). A comprehensive evaluation of alternatives relies on a detailed list of criteria, allowing for

  19. A method for quickly and exactly extracting hepatic vein

    NASA Astrophysics Data System (ADS)

    Xiong, Qing; Yuan, Rong; Wang, Luyao; Wang, Yanchun; Li, Zhen; Hu, Daoyu; Xie, Qingguo

    2013-02-01

    Providing detailed and accurate information about the hepatic vein (HV) is of vital importance for liver surgery planning, such as pre-operative planning of living donor liver transplantation (LDLT). Because of the different blood flow rates of the intra-hepatic vascular systems and the restrictions of CT scanning, the HV and the hepatic portal vein (HPV) are commonly both filled with contrast medium and appear at high intensity in hepatic venous phase images. As a result, the HV segmentation obtained from hepatic venous phase images is often contaminated by HPV, which makes accurate HV modeling difficult. In this paper, we propose a method for quick and accurate HV extraction. Based on the topological structure of the intra-hepatic vessels, we analyzed the anatomical features of HV and HPV. From this analysis, three conditions were formulated to identify the nodes that connect HV with HPV in the topological structure, and thus to distinguish HV from HPV. The method takes less than one minute to extract the HV and provides a correct and detailed HV model even when vessel anatomy varies. Evaluated by two experienced radiologists, the accuracy of the HV model obtained by our method is over 97%. In future work, we will extend this to a comprehensive clinical evaluation and apply the method to actual LDLT surgical planning.
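
    The three connection-identifying conditions are not spelled out in this abstract, so the sketch below (Python, networkx) only illustrates the overall scheme: cut the vessel graph at candidate HV-HPV connection nodes and keep the component rooted at the HV trunk. The function and attribute names, and the caller-supplied node test standing in for the paper's three conditions, are hypothetical.

      import networkx as nx

      def extract_hepatic_vein(vessel_graph, hv_root, is_connection_node):
          """Separate HV from HPV in a skeletonized vessel graph.
          vessel_graph: undirected graph of skeleton branch points, carrying
          whatever attributes (radius, direction, ...) the node test needs.
          hv_root: a node on the HV trunk (assumed not a connection node).
          is_connection_node: placeholder for the paper's three conditions."""
          g = vessel_graph.copy()
          # Remove the nodes judged to connect the HV and HPV trees.
          for node in [n for n in g.nodes if is_connection_node(g, n)]:
              g.remove_node(node)
          # Keep only the branches still connected to the HV trunk.
          return g.subgraph(nx.node_connected_component(g, hv_root)).copy()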

  20. International comparison of observation-specific spatial buffers: maximizing the ability to estimate physical activity.

    PubMed

    Frank, Lawrence D; Fox, Eric H; Ulmer, Jared M; Chapman, James E; Kershaw, Suzanne E; Sallis, James F; Conway, Terry L; Cerin, Ester; Cain, Kelli L; Adams, Marc A; Smith, Graham R; Hinckson, Erica; Mavoa, Suzanne; Christiansen, Lars B; Hino, Adriano Akira F; Lopes, Adalberto A S; Schipperijn, Jasper

    2017-01-23

    Advancements in geographic information systems over the past two decades have increased the specificity by which an individual's neighborhood environment may be spatially defined for physical activity and health research. This study investigated how different types of street network buffering methods compared in measuring a set of commonly used built environment measures (BEMs) and tested their performance on associations with physical activity outcomes. An internationally-developed set of objective BEMs using three different spatial buffering techniques was used to evaluate the relative differences in resulting explanatory power on self-reported physical activity outcomes. BEMs were developed in five countries using 'sausage', 'detailed-trimmed', and 'detailed' network buffers at a distance of 1 km around participant household addresses (n = 5883). BEM values were significantly different (p < 0.05) for 96% of sausage versus detailed-trimmed buffer comparisons and 89% of sausage versus detailed network buffer comparisons. Results showed that BEM coefficients in physical activity models did not differ significantly across buffering methods, and in most cases BEM associations with physical activity outcomes had the same level of statistical significance across buffer types. However, BEM coefficients differed in significance for 9% of the sausage versus detailed models, which may warrant further investigation. Results of this study inform the selection of spatial buffering methods to estimate physical activity outcomes using an internationally consistent set of BEMs. Using three different network-based buffering methods, the findings indicate significant variation among BEM values; however, associations with physical activity outcomes were similar across each buffering technique. The study advances knowledge by presenting consistently assessed relationships between three different network buffer types and utilitarian travel, sedentary behavior, and leisure-oriented physical activity outcomes.
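
    As an illustration of the 'sausage' buffer idea, here is a minimal Python sketch assuming the street network is held as a networkx graph whose edges carry a length in metres and a shapely LineString geometry; the names are ours, and partially reachable segments are simply dropped rather than split, a simplification the study's GIS workflow would not make.

      import networkx as nx
      from shapely.ops import unary_union

      def sausage_buffer(streets, home_node, network_dist=1000.0, width=50.0):
          """Buffer every street segment fully reachable within network_dist
          metres of the home node, then dissolve into one polygon."""
          reach = nx.single_source_dijkstra_path_length(
              streets, home_node, cutoff=network_dist, weight="length")
          segments = [data["geometry"]
                      for u, v, data in streets.edges(data=True)
                      if u in reach and v in reach]
          return unary_union([seg.buffer(width) for seg in segments])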

  1. Toward Mixed Method Evaluations of Scientific Visualizations and Design Process as an Evaluation Tool.

    PubMed

    Jackson, Bret; Coffey, Dane; Thorson, Lauren; Schroeder, David; Ellingson, Arin M; Nuckley, David J; Keefe, Daniel F

    2012-10-01

    In this position paper we discuss successes and limitations of current evaluation strategies for scientific visualizations and argue for embracing a mixed methods strategy of evaluation. The most novel contribution of the approach that we advocate is a new emphasis on employing design processes as practiced in related fields (e.g., graphic design, illustration, architecture) as a formalized mode of evaluation for data visualizations. To motivate this position, we describe a series of recent evaluations of scientific visualization interfaces and computer graphics strategies conducted within our research group. Complementing these more traditional evaluations, our visualization research group also regularly employs sketching, critique, and other design methods that have been formalized over years of practice in design fields. Our experience has convinced us that these activities are invaluable, often providing much more detailed evaluative feedback about our visualization systems than that obtained via more traditional user studies and the like. We believe that if design-based evaluation methodologies (e.g., ideation, sketching, critique) can be taught and embraced within the visualization community then these may become one of the most effective future strategies for both formative and summative evaluations.

  2. Toward Mixed Method Evaluations of Scientific Visualizations and Design Process as an Evaluation Tool

    PubMed Central

    Jackson, Bret; Coffey, Dane; Thorson, Lauren; Schroeder, David; Ellingson, Arin M.; Nuckley, David J.

    2017-01-01

    In this position paper we discuss successes and limitations of current evaluation strategies for scientific visualizations and argue for embracing a mixed methods strategy of evaluation. The most novel contribution of the approach that we advocate is a new emphasis on employing design processes as practiced in related fields (e.g., graphic design, illustration, architecture) as a formalized mode of evaluation for data visualizations. To motivate this position, we describe a series of recent evaluations of scientific visualization interfaces and computer graphics strategies conducted within our research group. Complementing these more traditional evaluations, our visualization research group also regularly employs sketching, critique, and other design methods that have been formalized over years of practice in design fields. Our experience has convinced us that these activities are invaluable, often providing much more detailed evaluative feedback about our visualization systems than that obtained via more traditional user studies and the like. We believe that if design-based evaluation methodologies (e.g., ideation, sketching, critique) can be taught and embraced within the visualization community then these may become one of the most effective future strategies for both formative and summative evaluations. PMID:28944349

  3. Experimental Study on Fatigue Behaviour of Shot-Peened Open-Hole Steel Plates

    PubMed Central

    Wang, Zhi-Yu; Wang, Qing-Yuan; Cao, Mengqin

    2017-01-01

    This paper presents an experimental study of the fatigue behaviour of shot-peened open-hole plates made of Q345 steel. The beneficial effects of shot peening on fatigue life improvement are highlighted. The characteristic fatigue crack initiation and propagation modes of open-hole details under fatigue loading are revealed. The surface hardening effect produced by shot peening is analyzed in terms of in-depth micro-hardness and compressive residual stress. The fatigue life results are evaluated against codified detail categories and related design suggestions are made. In particular, a fracture mechanics-based method is proposed and its validity demonstrated in predicting the fatigue life of the studied shot-peened open-hole details. PMID:28841160

  4. The Role of Formative Evaluation in Implementation Research and the QUERI Experience

    PubMed Central

    Stetler, Cheryl B; Legro, Marcia W; Wallace, Carolyn M; Bowman, Candice; Guihan, Marylou; Hagedorn, Hildi; Kimmel, Barbara; Sharp, Nancy D; Smith, Jeffrey L

    2006-01-01

    This article describes the importance and role of 4 stages of formative evaluation in our growing understanding of how to implement research findings into practice in order to improve the quality of clinical care. It reviews limitations of traditional approaches to implementation research and presents a rationale for new thinking and use of new methods. Developmental, implementation-focused, progress-focused, and interpretive evaluations are then defined and illustrated with examples from Veterans Health Administration Quality Enhancement Research Initiative projects. This article also provides methodologic details and highlights challenges encountered in actualizing formative evaluation within implementation research. PMID:16637954

  5. [Anatomy of the skull base and the cranial nerves in slice imaging].

    PubMed

    Bink, A; Berkefeld, J; Zanella, F

    2009-07-01

    Computed tomography (CT) and magnetic resonance imaging (MRI) are suitable methods for examination of the skull base. Whereas CT is used mainly to evaluate bone destruction, e.g. for planning surgical therapy, MRI is used to show pathologies in the soft tissue and bone invasion. High resolution and thin slice thickness are indispensable for both modalities of skull base imaging. Detailed anatomical knowledge is necessary even for correct planning of the examination procedures. This knowledge is a requirement to be able to recognize and interpret pathologies. MRI is the method of choice for examining the cranial nerves. The total path of a cranial nerve can be visualized by choosing different sequences taking into account the tissue surrounding this cranial nerve. This article summarizes examination methods of the skull base in CT and MRI, gives a detailed description of the anatomy and illustrates it with image examples.

  6. A dual-rating method for evaluating impact noise isolation of floor-ceiling assemblies.

    PubMed

    LoVerde, John J; Dong, D Wayland

    2017-01-01

    Impact Insulation Class (IIC), the single-number rating for evaluating the impact noise insulation of a floor-ceiling assembly, and the associated field testing ratings are unsatisfactory because they neither correlate strongly with subjective reaction nor provide suitably detailed information for the evaluation or design of floor-ceiling assemblies. Various proposals have been made for improving the method, but the data presented indicate that no single-number rating can adequately characterize the impact noise isolation of an assembly. For realistic impact noise sources and floor-ceiling assembly types, there are two frequency domains for impact noise, and the impact noise levels in the two domains can vary independently. Therefore, two ratings are required to satisfactorily evaluate the impact isolation provided by a floor-ceiling assembly. Two different ratings are introduced for measuring field impact isolation in the two frequency domains, using the existing impact source and measurement method. They are named the low-frequency impact rating (LIR) and the high-frequency impact rating (HIR). LIR and HIR are proposed to improve the current method for the design and evaluation of floor-ceiling assemblies and to provide a better method for predicting subjective reaction.

  7. Concordance Between Current Job and Usual Job in Occupational and Industry Groupings

    PubMed Central

    Luckhaupt, Sara E.; Cohen, Martha A.; Calvert, Geoffrey M.

    2015-01-01

    Objective To determine whether current job is a reasonable surrogate for usual job. Methods Data from the 2010 National Health Interview Survey were utilized to determine concordance between current and usual jobs for workers employed within the past year. Concordance was quantitated by kappa values for both simple and detailed industry and occupational groups. Good agreement is considered to be present when kappa values exceed 60. Results Overall kappa values ± standard errors were 74.5 ± 0.5 for simple industry, 72.4 ± 0.5 for detailed industry, 76.3 ± 0.4 for simple occupation, 73.7 ± 0.5 for detailed occupation, and 80.4 ± 0.6 for very broad occupational class. Sixty-five of 73 detailed industry groups and 78 of 81 detailed occupation groups evaluated had good agreement between current and usual jobs. Conclusions Current job can often serve as a reliable surrogate for usual job in epidemiologic studies. PMID:23969506
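
    For readers who want to reproduce the concordance statistic, here is a minimal sketch using scikit-learn's Cohen's kappa; the job codes below are invented, and the study reports kappa on a 0-100 scale:

      from sklearn.metrics import cohen_kappa_score

      # Hypothetical occupation group codes for the same six workers.
      current_job = ["11", "29", "47", "11", "53", "29"]
      usual_job   = ["11", "29", "47", "25", "53", "29"]

      kappa = cohen_kappa_score(current_job, usual_job)
      print(f"kappa = {100 * kappa:.1f}")   # values above 60 taken as good agreement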

  8. High-quality observation of surface imperviousness for urban runoff modelling using UAV imagery

    NASA Astrophysics Data System (ADS)

    Tokarczyk, P.; Leitao, J. P.; Rieckermann, J.; Schindler, K.; Blumensaat, F.

    2015-10-01

    Modelling rainfall-runoff in urban areas is increasingly applied to support flood risk assessment, particularly against the background of a changing climate and increasing urbanization. These models typically rely on high-quality data for rainfall and surface characteristics of the catchment area as model input. While recent research in urban drainage has been focusing on providing spatially detailed rainfall data, the technological advances in remote sensing that ease the acquisition of detailed land-use information are less prominently discussed within the community. The relevance of such methods increases as in many parts of the globe accurate land-use information is generally lacking, because detailed image data are often unavailable. Modern unmanned aerial vehicles (UAVs) allow one to acquire high-resolution images on a local level at comparatively low cost, performing on-demand repetitive measurements and obtaining a degree of detail tailored to the purpose of the study. In this study, we investigate for the first time the possibility of deriving high-resolution imperviousness maps for urban areas from UAV imagery and of using this information as input for urban drainage models. To do so, an automatic processing pipeline with a modern classification method is proposed and evaluated in a state-of-the-art urban drainage modelling exercise. In a real-life case study (Lucerne, Switzerland), we compare imperviousness maps generated using a fixed-wing consumer micro-UAV and standard large-format aerial images acquired by the Swiss national mapping agency (swisstopo). After assessing their overall accuracy, we perform an end-to-end comparison, in which they are used as an input for an urban drainage model. Then, we evaluate the influence which different image data sources and their processing methods have on hydrological and hydraulic model performance. We analyse the surface runoff of the 307 individual subcatchments regarding relevant attributes, such as peak runoff and runoff volume. Finally, we evaluate the model's channel flow prediction performance through a cross-comparison with reference flow measured at the catchment outlet. We show that imperviousness maps generated from UAV images processed with modern classification methods achieve an accuracy comparable to standard, off-the-shelf aerial imagery. In the examined case study, we find that the different imperviousness maps have only a limited influence on predicted surface runoff and pipe flows when traditional workflows are used. We expect that they will have a substantial influence when more detailed modelling approaches are employed to characterize land use and to predict surface runoff. We conclude that UAV imagery represents a valuable alternative data source for urban drainage model applications due to the possibility of flexibly acquiring up-to-date aerial images at a quality comparable with off-the-shelf image products and at a competitive price. We believe that in the future, urban drainage models representing a higher degree of spatial detail will fully benefit from the strengths of UAV imagery.
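
    The classification method used in the study is not named in this abstract; the sketch below stands in with a generic pixel-wise random forest (scikit-learn) to show the shape of such an imperviousness-mapping step. The function and the raw-band feature choice are illustrative only.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      def classify_imperviousness(image, train_pixels, train_labels):
          """Pixel-wise imperviousness map from an orthophoto.
          image: (H, W, B) array of band values.
          train_pixels: (N, B) labelled pixels; train_labels: (N,) with
          1 = impervious, 0 = pervious."""
          clf = RandomForestClassifier(n_estimators=200, random_state=0)
          clf.fit(train_pixels, train_labels)
          h, w, b = image.shape
          return clf.predict(image.reshape(-1, b)).reshape(h, w)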

  9. Integration of Evidence into a Detailed Clinical Model-based Electronic Nursing Record System

    PubMed Central

    Park, Hyeoun-Ae; Jeon, Eunjoo; Chung, Eunja

    2012-01-01

    Objectives The purpose of this study was to test the feasibility of an electronic nursing record system for perinatal care that is based on detailed clinical models and clinical practice guidelines in perinatal care. Methods This study was carried out in five phases: 1) generating nursing statements using detailed clinical models; 2) identifying the relevant evidence; 3) linking nursing statements with the evidence; 4) developing a prototype electronic nursing record system based on detailed clinical models and clinical practice guidelines; and 5) evaluating the prototype system. Results We first generated 799 nursing statements describing nursing assessments, diagnoses, interventions, and outcomes using entities, attributes, and value sets of detailed clinical models for perinatal care which we developed in a previous study. We then extracted 506 recommendations from nine clinical practice guidelines and created sets of nursing statements to be used for nursing documentation by grouping nursing statements according to these recommendations. Finally, we developed and evaluated a prototype electronic nursing record system that can provide nurses with recommendations for nursing practice and sets of nursing statements based on the recommendations for guiding nursing documentation. Conclusions The prototype system was found to be sufficiently complete, relevant, useful, and applicable in terms of content, and easy to use and useful in terms of system user interface. This study has revealed the feasibility of developing such an ENR system. PMID:22844649

  10. Methods for external event screening quantification: Risk Methods Integration and Evaluation Program (RMIEP) methods development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ravindra, M.K.; Banon, H.

    1992-07-01

    In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete in that all events are considered; (2) by following selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered as a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical release are described.

  11. Marine Corps Body Composition Program: The Flawed Measurement System

    DTIC Science & Technology

    2006-02-07

    fitness expert and writer for ABC Bodybuilding, an error of 3% in a body fat evaluation is extreme and methods that have this margin of error should not...most other methods. In fact, bodybuilders use a seven to nine point skin fold measurement weekly during their training to monitor body fat...19.95 and recommended and endorsed by "Body-For-Life" and the World Natural Bodybuilding Federation. The caliper comes with detailed instructions

  12. Dynamic Modeling and Evaluation of Recurring Infrastructure Maintenance Budget Determination Methods

    DTIC Science & Technology

    2005-03-01

    represent the annual M&R costs for the entire facility (Melvin, 1992). This method requires immense amounts of detailed data for each facility to be...and where facility and infrastructure maintenance must occur. Uzarski et al. (1995) discuss that the data gathered produces a candidate list that can...facilities or an infrastructure plant. Government agencies like the DoD, major universities, and large corporations are the major players. Data

  13. CNN-Based Retinal Image Upscaling Using Zero Component Analysis

    NASA Astrophysics Data System (ADS)

    Nasonov, A.; Chesnakov, K.; Krylov, A.

    2017-05-01

    The aim of the paper is to obtain high-quality image upscaling for noisy images, which are typical in medical image processing. A new training scenario for a convolutional neural network based image upscaling method is proposed. Its main idea is a novel dataset preparation method for deep learning. The dataset contains pairs of noisy low-resolution images and corresponding noiseless high-resolution images. To achieve better results at edges and textured areas, Zero Component Analysis is applied to these images. The upscaling results are compared with other state-of-the-art methods like DCCI, SI-3 and SRCNN on noisy medical ophthalmological images. Objective evaluation of the results confirms the high quality of the proposed method. Visual analysis shows that fine details and structures like blood vessels are preserved, noise level is reduced, and no artifacts or non-existing details are added. These properties are essential in retinal diagnosis, so the proposed algorithm is recommended for use in real medical applications.
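
    The abstract does not spell out the preprocessing, but Zero Component Analysis is commonly implemented as ZCA whitening; here is a minimal numpy sketch under that assumption:

      import numpy as np

      def zca_whiten(patches, eps=1e-5):
          """ZCA-whiten flattened image patches of shape (n_samples, n_features)."""
          x = patches - patches.mean(axis=0)       # centre each feature
          cov = np.cov(x, rowvar=False)            # feature covariance
          eigvals, eigvecs = np.linalg.eigh(cov)   # symmetric eigendecomposition
          # W = E diag(1/sqrt(lambda + eps)) E^T keeps the result close to the
          # original pixel space, unlike plain PCA whitening.
          w = eigvecs @ np.diag(1.0 / np.sqrt(eigvals + eps)) @ eigvecs.T
          return x @ w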

  14. Evaluating the use of drone photogrammetry for measurement of stream channel morphology and response to high flow events

    NASA Astrophysics Data System (ADS)

    Price, Katie; Ballow, William

    2015-04-01

    Traditional high-precision survey methods for stream channel measurement are labor-intensive and require wadeability or boat access to streams. These conditions limit the number of sites researchers are able to study and generally prohibit repeat channel surveys to evaluate short-term fluctuations in channel morphology. In recent years, unmanned aerial vehicles (drones) equipped with photo and video capabilities have become widely available and affordable. Concurrently, developments in photogrammetric software offer unprecedented mapping and 3D rendering capabilities for drone-captured photography. In this study, we evaluate the potential use of drone-mounted cameras for detailed stream channel morphometric analysis. We used a relatively low-cost drone (DJI Phantom 2+ Vision) and commercially available, user-friendly software (Agisoft PhotoScan) for photogrammetric analysis of drone-captured stream channel photography. Our test study was conducted on Proctor Creek, a highly responsive urban stream in Atlanta, Georgia, within the crystalline Piedmont region of the southeastern United States. As a baseline, we performed traditional high-precision survey methods to collect morphological measurements (e.g., bankfull and wetted width, bankfull and wetted thalweg depth) at 11 evenly-spaced transects, following USGS protocols, along reaches of 20 times average channel width. We additionally used the drone to capture 200+ photos along the same reaches, concurrent with the channel survey. Using the photogrammetry software, we generated georeferenced 3D models of the stream channel, from which morphological measurements were derived for the 11 transects and compared with measurements from the traditional survey method. We additionally explored possibilities for novel morphometric characterization available from the continuous 3D surface, as an improvement on the limited number of detailed cross-sections available from standard methods. These results showed great promise for the drone photogrammetry methods, which encouraged exploration of repeat aerial surveys to evaluate channel response to high flow events. Repeat drone surveys were performed following a sequence of high-flow events in Proctor Creek to evaluate the possibility of using these methods for assessment of stream channel response to flooding.

  15. Community reporting of ambient air polychlorinated biphenyl concentrations near a Superfund site.

    PubMed

    Tomsho, Kathryn S; Basra, Komal; Rubin, Staci M; Miller, Claire B; Juang, Richard; Broude, Sylvia; Martinez, Andres; Hornbuckle, Keri C; Heiger-Bernays, Wendy; Scammell, Madeleine K

    2017-10-27

    In this manuscript, we describe the process of establishing partnerships for community-based environmental exposure research, the tools and methods implemented for data report-back to community members, and the results of evaluations of these efforts. Data discovery and report-back materials developed by Statistics for Action (SFA) were employed as the framework to communicate the environmental data to community members and workshops. These data communication and research translation efforts are described in detail and evaluated for effectiveness based on feedback provided from community members who attended the workshops. Overall, the methods were mostly effective for the intended data communication.

  16. Preliminary shuttle structural dynamics modeling design study

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The design and development of a structural dynamics model of the space shuttle are discussed. The model provides for early study of structural dynamics problems, permits evaluation of the accuracy of the structural and hydroelastic analysis methods used on test vehicles, and provides for efficiently evaluating potential cost savings in structural dynamic testing techniques. The discussion is developed around the modes in which major input forces and responses occur and the significant structural details in these modes.

  17. Saco River-Camp Ellis Harbor, Saco, Maine. Small Navigation Project, Detailed Project Report and Environmental Assessment.

    DTIC Science & Technology

    1981-12-01

    41294) further evaluation of chemical-biological interactive effects is not necessary because the sediments meet the following evaluation criteria...such release would be predicted at the dredge site. Bulk chemical analyses were run on the five samples taken in conjunction with maintenance...material to be dredged is therefore considered to be chemically acceptable for beach nourishment purposes or other disposal methods.

  18. The Preliminary Design of a Standardized Spacecraft Bus for Small Tactical Satellites (Volume 1)

    DTIC Science & Technology

    1996-11-01

    characteristics, and not detailed design recommendations, the team decided to avoid modeling the interaction among the objective attributes. ...in the Modsat computer model are necessarily "generic" in nature to provide both flexibility in design evaluation and a foundation on which more...the methods employed during the study, the scope of the problem, the value system used to evaluate alternatives, tradeoff studies performed, modeling

  19. Sports Injury Surveillance Systems: A Review of Methods and Data Quality.

    PubMed

    Ekegren, Christina L; Gabbe, Belinda J; Finch, Caroline F

    2016-01-01

    Data from sports injury surveillance systems are a prerequisite to the development and evaluation of injury prevention strategies. This review aimed to identify ongoing sports injury surveillance systems and determine whether there are gaps in our understanding of injuries in certain sport settings. A secondary aim was to determine which of the included surveillance systems have evaluated the quality of their data, a key factor in determining their usefulness. A systematic search was carried out to identify (1) publications presenting methodological details of sports injury surveillance systems within clubs and organisations; and (2) publications describing quality evaluations and the quality of data from these systems. Data extracted included methodological details of the surveillance systems, methods used to evaluate data quality, and results of these evaluations. Following literature search and review, a total of 15 sports injury surveillance systems were identified. Data relevant to each aim were summarised descriptively. Most systems were found to exist within professional and elite sports. Publications concerning data quality were identified for seven (47%) systems. Validation of system data through comparison with alternate sources has been undertaken for only four systems (27%). This review identified a shortage of ongoing injury surveillance data from amateur and community sport settings and limited information about the quality of data in professional and elite settings. More surveillance systems are needed across a range of sport settings, as are standards for data quality reporting. These efforts will enable better monitoring of sports injury trends and the development of sports safety strategies.

  20. Are insular populations of the Philippine falconet (Microhierax erythrogenys) steps in a cline?

    Treesearch

    Todd E. Katzner; Nigel J. Collar

    2013-01-01

    Founder effects, new environments, and competition often produce changes in species colonizing islands, although the resulting endemism sometimes requires molecular identification. One method to identify fruitful areas for more detailed genetic study is through comparative morphological analyses. We measured 210 museum specimens to evaluate the potential morphological...

  1. EVALUATION OF A FORMER LANDFILL SITE IN FORT COLLINS, COLORADO USING GROUND-BASED OPTICAL REMOTE SENSING TECHNOLOGY

    EPA Science Inventory

    This report details a measurement campaign conducted using the Radial Plume Mapping (RPM) method and optical remote sensing technologies to characterize fugitive emissions. This work was funded by EPA's Monitoring and Measurement for the 21st Century Initiative, or 21M2. The si...

  2. Generic Verification Protocol for Testing Pesticide Application Spray Drift Reduction Technologies for Row and Field Crops (Version 1.4)

    EPA Science Inventory

    This generic verification protocol provides a detailed method for conducting and reporting results from verification testing of pesticide application technologies. It can be used to evaluate technologies for their potential to reduce spray drift, hence the term “drift reduction t...

  3. Aerial Infrared Photos for Citrus Growers

    NASA Technical Reports Server (NTRS)

    Blazquez, C. H.; Horn, F. W. J.

    1982-01-01

    Handbook advises on benefits and methods of aerial photography with color infrared film. Interpretation of photographs is discussed in detail. Necessary equipment for interpretation is described--light table, magnifying lenses, and microfiche viewers, for example. Advice is given on rating tree condition; identifying effects of diseases, insects, and nematodes; and evaluating effects of soil, water, and weather.

  4. Medical Advances in Child Sexual Abuse

    ERIC Educational Resources Information Center

    Alexander, Randell A.

    2011-01-01

    This volume is the first of a two-part special issue detailing state of the art practice in medical issues around child sexual abuse. The six articles in this issue explore methods for medical history evaluation, the rationale for when sexual examinations should take place, specific hymenal findings that suggest a child has been sexually abused,…

  5. The Child Diary as a Research Tool

    ERIC Educational Resources Information Center

    Lamsa, Tiina; Ronka, Anna; Poikonen, Pirjo-Liisa; Malinen, Kaisa

    2012-01-01

    The aim of this article is to introduce the use of the child diary as a method in daily diary research. By describing the research process and detailing its structure, a child diary, a structured booklet in which children's parents and day-care personnel (N = 54 children) reported their observations, was evaluated. The participants reported the…

  6. Facility Pollution Prevention Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The U.S. Environmental Protection Agency (U.S. EPA) has developed the Facility Pollution Prevention Guide for those who are interested in and responsible for pollution prevention in industrial or service facilities. It summarizes the benefits of a company-wide pollution prevention program and suggests ways to incorporate pollution prevention in company policies and practices. The Guide describes how to establish a company-wide pollution prevention program. It outlines procedures for conducting a preliminary assessment to identify opportunities for waste reduction or elimination. Then, it describes how to use the results of the preassessment to prioritize areas for detailed assessment, how to use the detailed assessment to develop pollution prevention options, and how to implement those options that withstand feasibility analysis. Methods of evaluating, adjusting, and maintaining the program are described. Later chapters deal with cost analysis for pollution prevention projects and with the roles of product design and energy conservation in pollution prevention. Appendices consist of materials that will support the pollution prevention effort: assessment worksheets, sources of additional information, examples of evaluative methods, and a glossary.

  7. Evaluation of Deblur Methods for Radiography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, William M.

    2014-03-31

    Radiography is used as a primary diagnostic for dynamic experiments, providing time-resolved radiographic measurements of areal mass density along a line of sight through the experiment. It is well known that the finite spot extent of the radiographic source, as well as scattering, are sources of blurring of the radiographic images. This blurring interferes with quantitative measurement of the areal mass density. In order to improve the quantitative utility of this diagnostic, it is necessary to deblur or "restore" the radiographs to recover the "true" areal mass density from a radiographic transmission measurement. Towards this end, I am evaluating three separate methods currently in use for deblurring radiographs. I begin by briefly describing the problems associated with image restoration, and outlining the three methods. Next, I illustrate how blurring affects the quantitative measurements using radiographs. I then present the results of the various deblur methods, evaluating each according to several criteria. After I have summarized the results of the evaluation, I give a detailed account of how the restoration process is actually implemented.
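
    The report's three methods are not named in this abstract; as an illustrative stand-in, the sketch below uses Richardson-Lucy deconvolution from scikit-image to show the basic restoration step and the conversion from transmission to areal density. The PSF and iteration count are assumptions.

      import numpy as np
      from skimage.restoration import richardson_lucy

      def deblur_radiograph(transmission, psf, iterations=30):
          """Deconvolve a measured transmission image with the source-spot
          PSF, then convert to areal density (arbitrary units)."""
          restored = richardson_lucy(transmission, psf, num_iter=iterations)
          restored = np.clip(restored, 1e-6, 1.0)   # guard against log(0)
          return -np.log(restored)                  # areal density ~ -ln(T)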

  8. Shuttle Space Suit: Fabric/LCVG Model Validation. Chapter 8

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Tweed, J.; Zeitlin, C.; Kim, M.-H. Y.; Anderson, B. M.; Cucinotta, F. A.; Ware, J.; Persans, A. E.

    2003-01-01

    A detailed space suit computational model is being developed at the Langley Research Center for radiation exposure evaluation studies. The details of the construction of the space suit are critical to estimation of exposures and assessing the risk to the astronaut on EVA. Past evaluations of space suit shielding properties assumed the basic fabric layup (Thermal Micrometeoroid Garment, fabric restraints, and pressure envelope) and LCVG could be homogenized as a single layer, overestimating the protective properties over 60 percent of the fabric area. The present space suit model represents the inhomogeneous distributions of LCVG materials (mainly the water-filled cooling tubes). An experimental test is performed using a 34-MeV proton beam and high-resolution detectors to compare with model-predicted transmission factors. Some suggestions are made on possible improved construction methods to enhance the space suit's protective properties.

  9. Shuttle Spacesuit: Fabric/LCVG Model Validation

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Tweed, J.; Zeitlin, C.; Kim, M.-H. Y.; Anderson, B. M.; Cucinotta, F. A.; Ware, J.; Persans, A. E.

    2001-01-01

    A detailed spacesuit computational model is being developed at the Langley Research Center for radiation exposure evaluation studies. The details of the construction of the spacesuit are critical to estimation of exposures and assessing the risk to the astronaut on EVA. Past evaluations of spacesuit shielding properties assumed the basic fabric lay-up (Thermal Micrometeroid Garment, fabric restraints, and pressure envelope) and Liquid Cooling and Ventilation Garment (LCVG) could be homogenized as a single layer, overestimating the protective properties over 60 percent of the fabric area. The present spacesuit model represents the inhomogeneous distributions of LCVG materials (mainly the water-filled cooling tubes). An experimental test is performed using a 34-MeV proton beam and high-resolution detectors to compare with model-predicted transmission factors. Some suggestions are made on possible improved construction methods to improve the spacesuit's protection properties.

  10. Chemometric analysis of soil pollution data using the Tucker N-way method.

    PubMed

    Stanimirova, I; Zehl, K; Massart, D L; Vander Heyden, Y; Einax, J W

    2006-06-01

    N-way methods, particularly the Tucker method, are often the methods of choice when analyzing data sets arranged in three- (or higher) way arrays, which is the case for most environmental data sets. In the future, applying N-way methods will become an increasingly popular way to uncover hidden information in complex data sets. The reason for this is that classical two-way approaches such as principal component analysis are not as good at revealing the complex relationships present in data sets. This study describes in detail the application of a chemometric N-way approach, namely the Tucker method, in order to evaluate the level of pollution in soil from a contaminated site. The analyzed soil data set was five-way in nature. The samples were collected at different depths (way 1) from two locations (way 2) and the levels of thirteen metals (way 3) were analyzed using a four-step-sequential extraction procedure (way 4), allowing detailed information to be obtained about the bioavailability and activity of the different binding forms of the metals. Furthermore, the measurements were performed under two conditions (way 5), inert and non-inert. The preferred Tucker model of definite complexity showed that there was no significant difference in measurements analyzed under inert or non-inert conditions. It also allowed two depth horizons, characterized by different accumulation pathways, to be distinguished, and it allowed the relationships between chemical elements and their biological activities and mobilities in the soil to be described in detail.
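
    As a concrete illustration of fitting a Tucker model to a multiway array, here is a sketch using the tensorly package; the array shape mimics the five ways described above, but the data and the core ranks are invented for illustration.

      import numpy as np
      import tensorly as tl
      from tensorly.decomposition import tucker

      # Hypothetical five-way soil array:
      # (depth, location, metal, extraction step, measurement condition)
      x = tl.tensor(np.random.rand(10, 2, 13, 4, 2))

      # Tucker model with an illustrative number of components per mode.
      core, factors = tucker(x, rank=[3, 2, 4, 2, 1])

      # One loading matrix per mode; e.g. factors[2] holds the metal loadings.
      print(core.shape, [f.shape for f in factors])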

  11. Evaluation method on steering for the shape-shifting robot in different configurations

    NASA Astrophysics Data System (ADS)

    Chang, Jian; Li, Bin; Wang, Chong; Zheng, Huaibing; Li, Zhiqiang

    2016-01-01

    Existing methods for evaluating steering are qualitative, which makes the results inaccurate and fuzzy and reduces the efficiency of task execution. A quantitative evaluation method for the shape-shifting robot in different configurations is therefore proposed. In contrast to traditional evaluation methods, the most important factors influencing the steering ability of the robot in its different configurations are studied in detail, including energy, angular velocity, time, and space. To improve the robustness of the system, both ideal and slippage conditions are considered in the mathematical model. In contrast to traditional weight-determination methods, the steering evaluation combines subjective and objective weighting. The subjective weighting method reflects the preferences of the experts and is based on a five-grade scale; the objective weighting method determines the factor weights from information entropy. Sensors mounted on the robot measure the contact force between the track grousers and the ground as well as the intrinsic motion characteristics of the robot, and experiments with the robot in different common configurations validate the proposed algorithm. The proposed method resolves the fuzziness and inaccuracy of earlier evaluations, so operators can choose the most suitable configuration of the robot to fulfil different tasks more quickly and simply.
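
    A minimal sketch of the entropy-based objective weighting step described above, assuming a decision matrix of benefit-type criteria (larger is better, all entries positive); the scores are invented for illustration, and cost-type criteria would need inverting first.

      import numpy as np

      def entropy_weights(decision_matrix):
          """Objective criterion weights via the entropy weight method.
          decision_matrix: (m alternatives x n criteria), larger = better."""
          m, _ = decision_matrix.shape
          p = decision_matrix / decision_matrix.sum(axis=0)   # column-normalise
          entropy = -(1.0 / np.log(m)) * (p * np.log(p)).sum(axis=0)
          divergence = 1.0 - entropy      # more spread across alternatives -> more weight
          return divergence / divergence.sum()

      # Four candidate configurations scored on energy, angular velocity,
      # time and space (hypothetical values).
      scores = np.array([[0.8, 1.2, 10.0, 3.0],
                         [0.6, 1.0, 12.0, 2.5],
                         [0.9, 0.7,  9.0, 3.5],
                         [0.7, 1.1, 11.0, 2.8]])
      print(entropy_weights(scores))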

  12. Analysis of polonium-210 in food products and bioassay samples by isotope-dilution alpha spectrometry.

    PubMed

    Lin, Zhichao; Wu, Zhongyu

    2009-05-01

    A rapid and reliable radiochemical method coupled with a simple and compact plating apparatus was developed, validated, and applied for the analysis of (210)Po in a variety of food products and bioassay samples. The method performance characteristics, including accuracy, precision, robustness, and specificity, were evaluated along with a detailed measurement uncertainty analysis. With high Po recovery, improved energy resolution, and effective removal of interfering elements by chromatographic extraction, the overall method accuracy was determined to be better than 5%, with a measurement precision of 10% at the 95% confidence level.

  13. Evaluation of Methods for Multidisciplinary Design Optimization (MDO). Part 2

    NASA Technical Reports Server (NTRS)

    Kodiyalam, Srinivas; Yuan, Charles; Sobieski, Jaroslaw (Technical Monitor)

    2000-01-01

    A new MDO method, BLISS, and two variants of the method, BLISS/RS and BLISS/S, have been implemented using iSIGHT's scripting language and are evaluated in this report on multidisciplinary problems. All of these methods are based on decomposing a modular system optimization problem into several subtask optimizations, which may be executed concurrently, and a system-level optimization that coordinates the subtask optimizations. The BLISS method and its variants are well suited for exploiting the concurrent processing capabilities of a multiprocessor machine. Several steps, including the local sensitivity analysis, local optimization, and response surface construction and updates, are all ideally suited for concurrent processing. Needless to say, algorithms that can effectively exploit the concurrent processing capabilities of compute servers will be a key requirement for solving large-scale industrial design problems, such as the automotive vehicle problem detailed in Section 3.4.

  14. Gait Analysis Methods for Rodent Models of Osteoarthritis

    PubMed Central

    Jacobs, Brittany Y.; Kloefkorn, Heidi E.; Allen, Kyle D.

    2014-01-01

    Patients with osteoarthritis (OA) primarily seek treatment due to pain and disability, yet the primary endpoints for rodent OA models tend to be histological measures of joint destruction. The discrepancy between clinical and preclinical evaluations is problematic, given that radiographic evidence of OA in humans does not always correlate to the severity of patient-reported symptoms. Recent advances in behavioral analyses have provided new methods to evaluate disease sequelae in rodents. Of particular relevance to rodent OA models are methods to assess rodent gait. While obvious differences exist between quadrupedal and bipedal gait sequences, the gait abnormalities seen in humans and in rodent OA models reflect similar compensatory behaviors that protect an injured limb from loading. The purpose of this review is to describe these compensations and current methods used to assess rodent gait characteristics, while detailing important considerations for the selection of gait analysis methods in rodent OA models. PMID:25160712

  15. An introduction to planar chromatography and its application to natural products isolation.

    PubMed

    Gibbons, Simon

    2012-01-01

    Thin-layer chromatography (TLC) is an easy, inexpensive, rapid, and the most widely used method for the analysis and isolation of small organic natural and synthetic products. It also has use in the biological evaluation of organic compounds, particularly in the areas of antimicrobial and antioxidant metabolites and for the evaluation of acetylcholinesterase inhibitors which have utility in the treatment of Alzheimer's disease. The ease and inexpensiveness of use of this technique, coupled with the ability to rapidly develop separation and bioassay protocols will ensure that TLC will be used for some considerable time alongside conventional instrumental methods. This chapter deals with the basic principles of TLC and describes methods for the analysis and isolation of natural products. Examples of methods for isolation of several classes of natural product are detailed and protocols for TLC bioassays are given.

  16. Evaluation of TOPLATS on three Mediterranean catchments

    NASA Astrophysics Data System (ADS)

    Loizu, Javier; Álvarez-Mozos, Jesús; Casalí, Javier; Goñi, Mikel

    2016-08-01

    Physically based hydrological models are complex tools that provide a complete description of the different processes occurring on a catchment. The TOPMODEL-based Land-Atmosphere Transfer Scheme (TOPLATS) simulates water and energy balances at different time steps, in both lumped and distributed modes. In order to gain insight into the behavior of TOPLATS and its applicability in different conditions, a detailed evaluation needs to be carried out. This study aimed to develop a complete evaluation of TOPLATS including: (1) a detailed review of previous research using this model; (2) a sensitivity analysis (SA) of the model with two contrasting methods (Morris and Sobol) of different complexity; (3) a 4-step calibration strategy based on a multi-start Powell optimization algorithm; and (4) an analysis of the influence of simulation time step (hourly vs. daily). The model was applied to three catchments of varying size (La Tejeria, Cidacos and Arga), located in Navarre (Northern Spain) and characterized by different levels of Mediterranean climate influence. Both the Morris and Sobol methods gave very similar results, identifying the Brooks-Corey pore size distribution index (B), bubbling pressure (ψc), and hydraulic conductivity decay (f) as the three overall most influential parameters in TOPLATS. After calibration and validation, adequate streamflow simulations were obtained in the two wettest catchments, but the driest (Cidacos) gave poor results in validation due to the large climatic variability between the calibration and validation periods. To overcome this issue, an alternative random and discontinuous method of cal/val period selection was implemented, improving model results.
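
    A minimal sketch of the kind of Sobol analysis described above, using the SALib package with a toy model standing in for TOPLATS; the parameter names follow the abstract (B, psi_c, f), but the bounds and the model are invented for illustration.

      import numpy as np
      from SALib.sample import saltelli
      from SALib.analyze import sobol

      problem = {
          "num_vars": 3,
          "names": ["B", "psi_c", "f"],
          "bounds": [[0.1, 1.0], [0.01, 0.5], [0.1, 5.0]],  # illustrative only
      }

      def toy_model(x):
          # Placeholder for a TOPLATS goodness-of-fit metric for one run.
          return x[0] ** 2 + np.sin(x[1]) + 0.5 * x[2]

      x = saltelli.sample(problem, 1024)              # Sobol' sampling design
      y = np.array([toy_model(row) for row in x])
      si = sobol.analyze(problem, y)                  # first/total-order indices
      print(si["S1"], si["ST"])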

  17. A comparative study of new and current methods for dental micro-CT image denoising

    PubMed Central

    Lashgari, Mojtaba; Qin, Jie; Swain, Michael

    2016-01-01

    Objectives: The aim of the current study was to evaluate the application of two advanced noise-reduction algorithms for dental micro-CT images and to implement a comparative analysis of the performance of new and current denoising algorithms. Methods: Denoising was performed using Gaussian and median filters as the current filtering approaches and the block-matching and 3D filtering (BM3D) method and total variation method as the proposed new filtering techniques. The performance of the denoising methods was evaluated quantitatively using contrast-to-noise ratio (CNR), edge preserving index (EPI) and blurring indexes, as well as qualitatively using the double-stimulus continuous quality scale procedure. Results: The BM3D method had the best performance with regard to preservation of fine textural features (CNR_edge), non-blurring of the whole image (blurring index), the clinical visual score in images with very fine features and the overall visual score for all types of images. On the other hand, the total variation method provided the best results with regard to smoothing of images in texture-free areas (CNR_tex-free) and in preserving the edges and borders of image features (EPI). Conclusions: The BM3D method is the most reliable technique for denoising dental micro-CT images with very fine textural details, such as shallow enamel lesions, in which the preservation of the texture and fine features is of the greatest importance. On the other hand, the total variation method is the technique of choice for denoising images without very fine textural details in which the clinician or researcher is interested mainly in anatomical features and structural measurements. PMID:26764583
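
    As a small illustration of the total variation method and the CNR metric above, here is a sketch using scikit-image (BM3D itself is available from the third-party bm3d package); the phantom, noise level, weight, and masks are invented, and the paper's CNR_edge / CNR_tex-free variants are region-specific refinements of the plain CNR computed here.

      import numpy as np
      from skimage.restoration import denoise_tv_chambolle

      def cnr(image, roi_mask, bg_mask):
          """Contrast-to-noise ratio between feature and background regions."""
          return abs(image[roi_mask].mean() - image[bg_mask].mean()) / image[bg_mask].std()

      rng = np.random.default_rng(0)
      truth = np.zeros((128, 128)); truth[40:90, 40:90] = 0.6   # toy phantom
      noisy = np.clip(truth + rng.normal(0, 0.1, truth.shape), 0, 1)

      denoised = denoise_tv_chambolle(noisy, weight=0.1)        # TV method
      roi = truth > 0.3
      print(cnr(noisy, roi, ~roi), cnr(denoised, roi, ~roi))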

  18. Deductive Evaluation: Formal Code Analysis With Low User Burden

    NASA Technical Reports Server (NTRS)

    Di Vito, Ben L.

    2016-01-01

    We describe a framework for symbolically evaluating iterative C code using a deductive approach that automatically discovers and proves program properties. Although verification is not performed, the method can infer detailed program behavior. Software engineering work flows could be enhanced by this type of analysis. Floyd-Hoare verification principles are applied to synthesize loop invariants, using a library of iteration-specific deductive knowledge. When needed, theorem proving is interleaved with evaluation and performed on the fly. Evaluation results take the form of inferred expressions and type constraints for values of program variables. An implementation using PVS (Prototype Verification System) is presented along with results for sample C functions.

  19. Testing and Evaluation of Multifunctional Smart Coatings

    NASA Technical Reports Server (NTRS)

    Buhrow, Jerry; Li, Wenyan; Jolley, Scott; Calle, Luz M.; Pearman, Benjamin; Zhang, Xuejun

    2015-01-01

    A smart coating system, based on pH sensitive microcontainers (microparticles and microcapsules) has been developed. Various corrosion inhibitors have been encapsulated and incorporated into commercial and formulated coatings to test the functionality imparted on the coating by the incorporation of the inhibitor microcontainers. Coated carbon steel and aluminum alloy panels were tested using salt immersion, salt fog, and coastal atmospheric exposure conditions. This paper provides the details on coating sample preparation, evaluation methods, as well as test results of the inhibiting function of smart coatings.

  20. Success factors for telehealth--a case study.

    PubMed

    Moehr, J R; Schaafsma, J; Anglin, C; Pantazi, S V; Grimm, N A; Anglin, S

    2006-01-01

    To present the lessons learned from an evaluation of a comprehensive telehealth project regarding success factors and evaluation methodology for such projects. A recent experience with the evaluation of new telehealth services in BC, Canada, is summarized. Two domains of clinical applications, as well as educational and administrative uses, and the project environment were evaluated. In order to contribute to the success of the project, the evaluation included formative and summative approaches employing qualitative and quantitative methods with data collection from telehealth events, participants and existing databases. The evaluation had to be carried out under severe budgetary and time constraints. We therefore deliberately chose a broad ranging exploratory approach within a framework provided, and generated questions to be answered on the basis of initial observations and participant driven interviews with progressively more focused and detailed data gathering, including perusal of a variety of existing data sources. A unique feature was an economic evaluation using static simulation models. The evaluation yielded rich and detailed data, which were able to explain a number of unanticipated findings. One clinical application domain was cancelled after 6 months, the other continues. The factors contributing to success include: Focus on chronic conditions which require visual information for proper management. Involvement of established teams in regular scheduled visits or in sessions scheduled well in advance. Problems arose with: Ad hoc applications, in particular under emergency conditions. Applications that disregard established referral patterns. Applications that support only part of a unit's services. The latter leads to the service mismatch dilemma (SMMD) with the end result that even those e-health services provided are not used. The problems encountered were compounded by issues arising from the manner in which the telehealth services had been introduced, in particular the lack of time for preparation and establishment of routine use. Educational applications had significant clinical benefits. Administrative applications generated savings which exceeded the substantial capital investment and made educational and clinical applications available at variable cost. Evaluation under severe constraints can yield rich information. The identified success factors, including provision of an overarching architecture and infrastructure, strong program management, thorough needs analysis and detailing applications to match the identified needs should improve the sustainability of e-health projects. Insights gained: Existing assumptions before the study was conducted: Evaluation has to proceed from identified questions according to a rigorous experimental design. Emergency and trauma services in remote regions can and should be supported via telehealth based on video-conferencing. Educational applications of telehealth directed at providers are beneficial for recruitment and retention of providers in remote areas. Insights gained by the study: An exploratory approach to evaluation using a multiplicity of methods can yield rich and detailed information even under severe constraints. Ad hoc and emergency clinical applications of telehealth can present problems unless they are based on thorough, detailed analyses of environment and need, conform to established practice patterns and rely on established trusting collaborative relationships. 
Less difficult applications should be introduced before attempting to support use under emergency conditions. Educational applications are of interest beyond the provider community to patients, family and community members, and have clinical value. In large, sparsely populated areas with difficult travel conditions administrative applications by themselves generate savings that compensate for the substantial capital investment for telehealth required for clinical applications.

  1. A review of evaluative studies of computer-based learning in nursing education.

    PubMed

    Lewis, M J; Davies, R; Jenkins, D; Tait, M I

    2001-01-01

    Although there have been numerous attempts to evaluate the learning benefits of computer-based learning (CBL) packages in nursing education, the results obtained have been equivocal. A literature search conducted for this review found 25 reports of the evaluation of nursing CBL packages since 1966. Detailed analysis of the evaluation methods used in these reports revealed that most had significant design flaws, including the use of too small a sample group, the lack of a control group, etc. Because of this, the conclusions reached were not always valid. More effort is required in the design of future evaluation studies of nursing CBL packages. Copyright 2001 Harcourt Publishers Ltd.

  2. Laser Pencil Beam Based Techniques for Visualization and Analysis of Interfaces Between Media

    NASA Technical Reports Server (NTRS)

    Adamovsky, Grigory; Giles, Sammie, Jr.

    1998-01-01

    Traditional optical methods that include interferometry, Schlieren, and shadowgraphy have been used successfully for visualization and evaluation of various media. Aerodynamics and hydrodynamics are major fields where these methods have been applied. However, these methods have such major drawbacks as a relatively low power density and suppression of the secondary order phenomena. A novel method introduced at NASA Lewis Research Center minimizes disadvantages of the 'classical' methods. The method involves a narrow pencil-like beam that penetrates a medium of interest. The paper describes the laser pencil beam flow visualization methods in detail. Various system configurations are presented. The paper also discusses interfaces between media in general terms and provides examples of interfaces.

  3. Instructional Methodology and Experimental Design for Evaluating Audio-Video Support to Undergraduate Pilot Training.

    ERIC Educational Resources Information Center

    Purifoy, George R., Jr.

    This report presents a detailed description of the methods by which airborne video recording will be utilized in training Air Force pilots, and presents the format for an experiment testing the effectiveness of such training. Portable airborne recording with ground playback permits more economical and efficient teaching of the critical visual and…

  4. Comparing fire spread algorithms using equivalence testing and neutral landscape models

    Treesearch

    Brian R. Miranda; Brian R. Sturtevant; Jian Yang; Eric J. Gustafson

    2009-01-01

    We demonstrate a method to evaluate the degree to which a meta-model approximates spatial disturbance processes represented by a more detailed model across a range of landscape conditions, using neutral landscapes and equivalence testing. We illustrate this approach by comparing burn patterns produced by a relatively simple fire spread algorithm with those generated by...

  5. EVALUATION OF FUGITIVE EMISSIONS AT A FORMER LANDFILL SITE IN COLORADO SPRINGS, COLORADO, USING GROUND-BASED OPTICAL REMOTE SENSING TECHNOLOGY

    EPA Science Inventory

    This report details a measurement campaign conducted using the Radial Plume Mapping (RPM) method and optical remote sensing technologies to characterize fugitive emissions. This work was funded by EPA's Monitoring and Measurement for the 21st Century Initiative, or 21M2. The si...

  6. Estimating forest canopy bulk density using six indirect methods

    Treesearch

    Robert E. Keane; Elizabeth D. Reinhardt; Joe Scott; Kathy Gray; James Reardon

    2005-01-01

    Canopy bulk density (CBD) is an important crown characteristic needed to predict crown fire spread, yet it is difficult to measure in the field. Presented here is a comprehensive research effort to evaluate six indirect sampling techniques for estimating CBD. As reference data, detailed crown fuel biomass measurements were taken on each tree within fixed-area plots...

  7. Educating Information Systems Students on Business Process Management (BPM) through Digital Gaming Metaphors of Virtual Reality

    ERIC Educational Resources Information Center

    Lawler, James P.; Joseph, Anthony

    2010-01-01

    Digital gaming continues to be an approach for enhancing methods of pedagogy. The study evaluates the effectiveness of a gaming product of a leading technology firm in engaging graduate students in an information systems course at a major northeast institution. Findings from a detailed perception survey of the students indicate favorable…

  8. A New Peer Instruction Method for Teaching Practical Skills in the Health Sciences: An Evaluation of the "Learning Trail"

    ERIC Educational Resources Information Center

    Dollman, James

    2005-01-01

    The "Learning Trail" is an innovative application of peer-mediated instruction designed to enhance student learning in large practical classes. The strategy specifically seeks to improve participants' attention to details of protocol that are often difficult to observe during teacher-centered demonstrations to large groups. Students…

  9. Detection of fatigue cracks by nondestructive testing methods

    NASA Technical Reports Server (NTRS)

    Anderson, R. T.; Delacy, T. J.; Stewart, R. C.

    1973-01-01

    The effectiveness of various NDT methods in detecting small, tight cracks was assessed by randomly introducing fatigue cracks into aluminum sheets. The study included optimizing the NDT methods, calibrating NDT equipment with fatigue-cracked standards, and evaluating a number of cracked specimens by the optimized NDT methods. The evaluations were conducted by highly trained personnel, provided with detailed procedures, in order to minimize the effects of human variability. These personnel performed the NDT on the test specimens without knowledge of the flaw locations and reported on the flaws detected. The performance of these tests was measured by comparing the flaws detected against the flaws present. The principal NDT methods utilized were radiographic, ultrasonic, penetrant, and eddy current. Holographic interferometry, acoustic emission monitoring, and replication methods were also applied on a reduced number of specimens. Generally, the best performance was shown by the eddy current, ultrasonic, penetrant and holographic tests. Etching provided no measurable improvement, while proof loading improved flaw detectability. Data are shown that quantify the performances of the NDT methods applied.

  10. A study on the real-time reliability of on-board equipment of train control system

    NASA Astrophysics Data System (ADS)

    Zhang, Yong; Li, Shiwei

    2018-05-01

    Real-time reliability evaluation is conducive to establishing a condition-based maintenance system for the purpose of guaranteeing continuous train operation. According to the inherent characteristics of the on-board equipment, the connotation of reliability evaluation of on-board equipment is defined and an evaluation index of real-time reliability is provided in this paper. From the perspective of methodology and practical application, the real-time reliability of the on-board equipment is discussed in detail, and a method of evaluating the real-time reliability of on-board equipment at the component level based on a Hidden Markov Model (HMM) is proposed. In this method, performance degradation data are used directly to track the hidden state transition process of the on-board equipment, which yields a better description of the equipment's real-time reliability.
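
    The HMM machinery behind such an evaluation reduces to the forward algorithm: maintain a posterior over hidden degradation states as observations arrive, and read off reliability as the probability of not being in the failed state. A minimal numpy sketch follows; the three-state chain and its transition and observation probabilities are invented placeholders, not the paper's model.

        import numpy as np

        def forward_reliability(A, B, pi, obs, failed_state):
            """Normalized forward algorithm: posterior over hidden degradation
            states given the observation sequence; reliability is the
            probability of not being in the failed state."""
            alpha = pi * B[:, obs[0]]
            alpha /= alpha.sum()
            for o in obs[1:]:
                alpha = (alpha @ A) * B[:, o]
                alpha /= alpha.sum()
            return 1.0 - alpha[failed_state]

        # Hypothetical 3-state chain: healthy -> degraded -> failed (absorbing)
        A = np.array([[0.95, 0.05, 0.00],
                      [0.00, 0.90, 0.10],
                      [0.00, 0.00, 1.00]])
        B = np.array([[0.8, 0.2],      # P(observation | state); 0 = nominal, 1 = abnormal
                      [0.4, 0.6],
                      [0.1, 0.9]])
        pi = np.array([1.0, 0.0, 0.0])
        print(forward_reliability(A, B, pi, obs=[0, 0, 1, 1], failed_state=2))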

  11. Opening the black box of ethics policy work: evaluating a covert practice.

    PubMed

    Frolic, Andrea; Drolet, Katherine; Bryanton, Kim; Caron, Carole; Cupido, Cynthia; Flaherty, Barb; Fung, Sylvia; McCall, Lori

    2012-01-01

    Hospital ethics committees (HECs) and ethicists generally describe themselves as engaged in four domains of practice: case consultation, research, education, and policy work. Despite the increasing attention to quality indicators, practice standards, and evaluation methods for the other domains, comparatively little is known or published about the policy work of HECs or ethicists. This article attempts to open the "black box" of this health care ethics practice by providing two detailed case examples of ethics policy reviews. We also describe the development and application of an evaluation strategy to assess the quality of ethics policy review work, and to enable continuous improvement of ethics policy review processes. Given the potential for policy work to impact entire patient populations and organizational systems, it is imperative that HECs and ethicists develop clearer roles, responsibilities, procedural standards, and evaluation methods to ensure the delivery of consistent, relevant, and high-quality ethics policy reviews.

  12. Reducing Physical Risk Factors in Construction Work Through a Participatory Intervention: Protocol for a Mixed-Methods Process Evaluation.

    PubMed

    Ajslev, Jeppe; Brandt, Mikkel; Møller, Jeppe Lykke; Skals, Sebastian; Vinstrup, Jonas; Jakobsen, Markus Due; Sundstrup, Emil; Madeleine, Pascal; Andersen, Lars Louis

    2016-05-26

    Previous research has shown that reducing physical workload among workers in the construction industry is complicated. In order to address this issue, we developed a process evaluation in a formative mixed-methods design, drawing on existing knowledge of the potential barriers for implementation. We present the design of a mixed-methods process evaluation of the organizational, social, and subjective practices that play roles in the intervention study, integrating technical measurements to detect excessive physical exertion measured with electromyography and accelerometers, video documentation of working tasks, and a 3-phased workshop program. The evaluation is designed in an adapted process evaluation framework, addressing recruitment, reach, fidelity, satisfaction, intervention delivery, intervention received, and context of the intervention companies. Observational studies, interviews, and questionnaires among 80 construction workers organized in 20 work gangs, as well as health and safety staff, contribute to the creation of knowledge about these phenomena. At the time of publication, the process of participant recruitment is underway. Intervention studies are challenging to conduct and evaluate in the construction industry, often because of narrow time frames and ever-changing contexts. The mixed-methods design presents opportunities for obtaining detailed knowledge of the practices intra-acting with the intervention, while offering the opportunity to customize parts of the intervention.

  13. Stress corrosion evaluation of powder metallurgy aluminum alloy 7091 with the breaking load test method

    NASA Technical Reports Server (NTRS)

    Domack, Marcia S.

    1987-01-01

    The stress corrosion behavior of the P/M aluminum alloy 7091 is evaluated in two overaged heat treatment conditions, T7E69 and T7E70, using an accelerated test technique known as the breaking load test method. The breaking load data obtained in this study indicate that P/M 7091 alloy is highly resistant to stress corrosion in both longitudinal and transverse orientations at stress levels up to 90 percent of the material yield strength. The reduction in mean breaking stress as a result of corrosive attack is smallest for the more overaged T7E70 condition. Details of the test procedure are included.

  14. BIMOS transistor solutions for ESD protection in FD-SOI UTBB CMOS technology

    NASA Astrophysics Data System (ADS)

    Galy, Philippe; Athanasiou, S.; Cristoloveanu, S.

    2016-01-01

    We evaluate the Electro-Static Discharge (ESD) protection capability of BIpolar MOS (BIMOS) transistors integrated in ultrathin silicon film for 28 nm Fully Depleted SOI (FD-SOI) Ultra Thin Body and BOX (UTBB) high-k metal gate technology. Using as a reference our measurements in hybrid bulk-SOI structures, we extend the BIMOS design towards the ultrathin silicon film. Detailed studies and pragmatic evaluations are performed with 3D TCAD simulations using standard physical models, applying the Average Current Slope (ACS) method and quasi-static DC stress (Average Voltage Slope, AVS, method). These preliminary 3D TCAD results are very encouraging in terms of ESD protection efficiency in advanced FD-SOI CMOS.

  15. Methods for external event screening quantification: Risk Methods Integration and Evaluation Program (RMIEP) methods development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ravindra, M.K.; Banon, H.

    1992-07-01

    In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete in that all events are considered; (2) by following some selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered as a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical release are described.

  16. The Method of Manufacturing Nonmetallic Test-Blocks on Different Sensitivity Classes

    NASA Astrophysics Data System (ADS)

    Kalinichenko, N. P.; Kalinichenko, A. N.; Lobanova, I. S.; Zaitseva, A. A.; Loboda, E. L.

    2016-01-01

    Quality control of parts made from nonmetallic materials has become a pressing issue owing to their widespread use. Nondestructive penetrant testing is effective and, in some cases, the only possible method of accident prevention at high-risk sites. A brief review of the reference (check) samples necessary for evaluating the quality of penetrant materials is given. A method is proposed for manufacturing test blocks for verifying penetrant material quality across the different sensitivity classes of liquid penetrant testing.

  17. Remote sensing fusion based on guided image filtering

    NASA Astrophysics Data System (ADS)

    Zhao, Wenfei; Dai, Qinling; Wang, Leiguang

    2015-12-01

    In this paper, we propose a novel remote sensing fusion approach based on guided image filtering. The fused images preserve the spectral features of the original multispectral (MS) images well while enhancing the spatial detail information. Four quality assessment indexes are also introduced to evaluate the fusion effect in comparison with other fusion methods. Experiments were carried out on Gaofen-2, QuickBird, WorldView-2 and Landsat-8 images, and the results show an excellent performance of the proposed method.
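
    The guided filter at the core of such a fusion scheme is itself only a few lines. The sketch below implements the standard local linear model of He et al. with box filters and applies it to a toy pan-sharpening case; the radius, regularization and synthetic images are placeholder assumptions, not the authors' configuration.

        import numpy as np
        from scipy.ndimage import uniform_filter

        def guided_filter(guide, src, radius=4, eps=1e-3):
            """Guided image filter: per-window local linear model q = a*I + b,
            here used to inject spatial detail from a sharp guide band into a
            coarser source band."""
            size = 2 * radius + 1
            mean_I = uniform_filter(guide, size)
            mean_p = uniform_filter(src, size)
            corr_Ip = uniform_filter(guide * src, size)
            corr_II = uniform_filter(guide * guide, size)
            var_I = corr_II - mean_I * mean_I
            cov_Ip = corr_Ip - mean_I * mean_p
            a = cov_Ip / (var_I + eps)
            b = mean_p - a * mean_I
            return uniform_filter(a, size) * guide + uniform_filter(b, size)

        # Toy demo: recover detail in a blurred "MS band" using a sharp "pan band"
        rng = np.random.default_rng(0)
        pan = rng.random((64, 64))
        ms = uniform_filter(pan, 9)          # stand-in for a coarse MS band
        fused = guided_filter(pan, ms)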

  18. Nailfold capillaroscopy in systemic sclerosis: is there any difference between videocapillaroscopy and dermatoscopy?

    PubMed

    Dogan, Sibel; Akdogan, Ali; Atakan, Nilgün

    2013-11-01

    Vasculopathy is known to destroy the nailfold capillary pattern (NCP) in systemic sclerosis (SSc). There are several methods for the evaluation of NCP, of which the most common are dermatoscopy and videocapillaroscopy (VCAP). No study comparing the diagnostic value of these two techniques has been reported in the literature. To compare the diagnostic value of dermatoscopy and VCAP, which are widely used to determine changes in the NCP in SSc patients. A total of 382 nailfolds were visualized. NCP was evaluated in 39 SSc patients using dermatoscopy and VCAP. Defined dermatoscopic groups were matched with early, active and late phase NCP groups determined by VCAP for comparisons. Both dermatoscopy and VCAP demonstrated the distinct NCP of SSc efficiently. Dermatoscopy was able to visualize capillary dilatation, giant capillaries and disrupted vascular configuration. VCAP revealed early phase NCP in 8 patients (20.5%), active phase in 18 (46.2%) and late phase NCP in 13 (33.3%). Statistical evaluation of the grouped data yielded a Cohen kappa value of K = 0.527. Although VCAP facilitated a more detailed evaluation of NCP, there was no difference between dermatoscopy and VCAP in the identification of the distinct NCP of SSc. We suggest that dermatoscopy is efficient enough to identify pathognomonic changes in NCP in SSc as well as VCAP does, and that it is an easier and more convenient method, although VCAP facilitates a more detailed evaluation of NCP. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
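
    The agreement statistic reported above is straightforward to reproduce. A minimal sketch of Cohen's kappa follows, applied to invented paired ratings (0 = early, 1 = active, 2 = late) rather than the study's data; sklearn.metrics.cohen_kappa_score gives the same result.

        import numpy as np

        def cohen_kappa(r1, r2, categories):
            """Cohen's kappa: chance-corrected agreement between two raters."""
            r1, r2 = np.asarray(r1), np.asarray(r2)
            po = np.mean(r1 == r2)                                  # observed agreement
            pe = sum(np.mean(r1 == c) * np.mean(r2 == c)            # expected by chance
                     for c in categories)
            return (po - pe) / (1.0 - pe)

        # Hypothetical paired classifications for a subset of patients
        derm = [0, 1, 1, 2, 0, 1, 2, 2, 1, 0, 1, 1, 2]
        vcap = [0, 1, 2, 2, 0, 1, 2, 1, 1, 0, 1, 1, 2]
        print(cohen_kappa(derm, vcap, categories=(0, 1, 2)))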

  19. Evaluation of feedback interventions for improving the quality assurance of cancer screening in Japan: study design and report of the baseline survey.

    PubMed

    Machii, Ryoko; Saika, Kumiko; Higashi, Takahiro; Aoki, Ayako; Hamashima, Chisato; Saito, Hiroshi

    2012-02-01

    The importance of quality assurance in cancer screening has recently gained increasing attention in Japan. To evaluate and improve quality, checklists and process indicators have been developed. To explore effective methods of enhancing quality in cancer screening, we started a randomized controlled study of methods of evaluation and feedback for cancer control from 2009 to 2014. We randomly assigned 1270 municipal governments, equivalent to 71% of all Japanese municipal governments that performed screening programs, to three groups. The high-intensity intervention group (n = 425) was individually evaluated using both checklist performance and process indicator values, while the low-intensity intervention group (n = 421) was individually evaluated on the basis of checklist performance only. The control group (n = 424) received only a basic report that included the national average of checklist performance scores. We repeated the survey of each municipality's quality assurance performance using checklists and process indicators. In this paper, we report our study design and the results of the baseline survey. The checklist adherence rates were especially low for the checklist elements related to invitation of individuals, detailed monitoring of process indicators such as cancer detection rates according to screening histories, and appropriate selection of screening facilities. The screening rate and the percentage of examinees who underwent detailed examination tended to be lower for large cities when compared with smaller cities for all cancer sites. The performance of the Japanese cancer screening program in 2009 was identified for the first time.

  20. Microscopic evaluation and physiochemical analysis of Dillenia indica leaf

    PubMed Central

    Kumar, S; Kumar, V; Prakash, Om

    2011-01-01

    Objective To perform a detailed microscopic evaluation and physiochemical analysis of Dillenia indica (D. indica) leaf. Methods Fresh leaf samples and dried powder of the leaf were studied macroscopically and microscopically. A preliminary phytochemical investigation of the plant material was done. Other WHO-recommended parameters for standardization were also performed. Results The detailed microscopy revealed the presence of anomocytic stomata, unicellular trichomes, xylem fibres, calcium oxalate crystals, vascular bundles, etc. Leaf constants such as stomatal number, stomatal index, vein-islet number and veinlet termination number were also measured. Physiochemical parameters such as ash values, loss on drying, extractive values, percentage of foreign matter, swelling index, etc. were also determined. Preliminary phytochemical screening showed the presence of steroids, terpenoids, glycosides, fatty acids, flavonoids, phenolic compounds and carbohydrates. Conclusions The microscopic and physiochemical analysis of the D. indica leaf is useful in standardization for quality, purity and sample identification. PMID:23569789

  1. Complementary and Alternative Therapies for Autism Spectrum Disorder

    PubMed Central

    Fusar-Poli, Laura; Rocchetti, Matteo; Provenzani, Umberto; Barale, Francesco

    2015-01-01

    Background. Complementary and alternative medicine (CAM) represents a popular therapeutic option for patients with autism spectrum disorder (ASD). Unfortunately, there is a paucity of data regarding the efficacy of CAM in ASD. The aim of the present systematic review is to investigate trials of CAM in ASD. Material and Methods. We searched the following databases: MEDLINE, EMBASE, Cochrane Database of Systematic Reviews, CINAHL, Psychology and Behavioral Sciences Collection, Agricola, and Food Science Source. Results. Our literature search identified 2687 clinical publications. After the title/abstract screening, 139 publications were retained for detailed evaluation. After detailed evaluation, 67 studies were included; a hand search of references retrieved 13 additional studies, for a total of 80. Conclusion. There is no conclusive evidence supporting the efficacy of CAM therapies in ASD. Promising results are reported for music therapy, sensory integration therapy, acupuncture, and massage. PMID:26064157

  2. Enabling high-quality observations of surface imperviousness for water runoff modelling from unmanned aerial vehicles

    NASA Astrophysics Data System (ADS)

    Tokarczyk, Piotr; Leitao, Joao Paulo; Rieckermann, Jörg; Schindler, Konrad; Blumensaat, Frank

    2015-04-01

    Modelling rainfall-runoff in urban areas is increasingly applied to support flood risk assessment, particularly against the background of a changing climate and an increasing urbanization. These models typically rely on high-quality data for rainfall and surface characteristics of the area. While recent research in urban drainage has been focusing on providing spatially detailed rainfall data, the technological advances in remote sensing that ease the acquisition of detailed land-use information are less prominently discussed within the community. The relevance of such methods increases because, in many parts of the globe, accurate land-use information is generally lacking, as detailed image data are unavailable. Modern unmanned air vehicles (UAVs) allow acquiring high-resolution images on a local level at comparatively low cost, performing on-demand repetitive measurements, and obtaining a degree of detail tailored for the purpose of the study. In this study, we investigate for the first time the possibility to derive high-resolution imperviousness maps for urban areas from UAV imagery and to use this information as input for urban drainage models. To do so, an automatic processing pipeline with a modern classification method is tested and applied in a state-of-the-art urban drainage modelling exercise. In a real-life case study in the area of Lucerne, Switzerland, we compare imperviousness maps generated from a consumer micro-UAV and standard large-format aerial images acquired by the Swiss national mapping agency (swisstopo). After assessing their correctness, we perform an end-to-end comparison, in which they are used as an input for an urban drainage model. Then, we evaluate the influence that different image data sources and their processing methods have on hydrological and hydraulic model performance. We analyze the surface runoff of the 307 individual sub-catchments regarding relevant attributes, such as peak runoff and volume. Finally, we evaluate the model's channel flow prediction performance through a cross-comparison with reference flow measured at the catchment outlet. We show that imperviousness maps generated using UAV imagery processed with modern classification methods achieve accuracy comparable with standard, off-the-shelf aerial imagery. In the examined case study, we find that the different imperviousness maps only have a limited influence on modelled surface runoff and pipe flows. We conclude that UAV imagery represents a valuable alternative data source for urban drainage model applications due to the possibility to flexibly acquire up-to-date aerial images at a superior quality and a competitive price. Our analyses furthermore suggest that spatially more detailed urban drainage models can even better benefit from the full detail of UAV imagery.

  3. High-quality observation of surface imperviousness for urban runoff modelling using UAV imagery

    NASA Astrophysics Data System (ADS)

    Tokarczyk, P.; Leitao, J. P.; Rieckermann, J.; Schindler, K.; Blumensaat, F.

    2015-01-01

    Modelling rainfall-runoff in urban areas is increasingly applied to support flood risk assessment, particularly against the background of a changing climate and an increasing urbanization. These models typically rely on high-quality data for rainfall and surface characteristics of the area. While recent research in urban drainage has been focusing on providing spatially detailed rainfall data, the technological advances in remote sensing that ease the acquisition of detailed land-use information are less prominently discussed within the community. The relevance of such methods increases because, in many parts of the globe, accurate land-use information is generally lacking, as detailed image data are unavailable. Modern unmanned air vehicles (UAVs) allow acquiring high-resolution images on a local level at comparatively low cost, performing on-demand repetitive measurements, and obtaining a degree of detail tailored for the purpose of the study. In this study, we investigate for the first time the possibility to derive high-resolution imperviousness maps for urban areas from UAV imagery and to use this information as input for urban drainage models. To do so, an automatic processing pipeline with a modern classification method is tested and applied in a state-of-the-art urban drainage modelling exercise. In a real-life case study in the area of Lucerne, Switzerland, we compare imperviousness maps generated from a consumer micro-UAV and standard large-format aerial images acquired by the Swiss national mapping agency (swisstopo). After assessing their correctness, we perform an end-to-end comparison, in which they are used as an input for an urban drainage model. Then, we evaluate the influence that different image data sources and their processing methods have on hydrological and hydraulic model performance. We analyze the surface runoff of the 307 individual subcatchments regarding relevant attributes, such as peak runoff and volume. Finally, we evaluate the model's channel flow prediction performance through a cross-comparison with reference flow measured at the catchment outlet. We show that imperviousness maps generated using UAV imagery processed with modern classification methods achieve accuracy comparable with standard, off-the-shelf aerial imagery. In the examined case study, we find that the different imperviousness maps only have a limited influence on modelled surface runoff and pipe flows. We conclude that UAV imagery represents a valuable alternative data source for urban drainage model applications due to the possibility to flexibly acquire up-to-date aerial images at a superior quality and a competitive price. Our analyses furthermore suggest that spatially more detailed urban drainage models can even better benefit from the full detail of UAV imagery.
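
    Both studies feed the imperviousness maps into a full urban drainage model, but the first-order link between a classified raster and event runoff can be sketched very simply. The runoff coefficients and synthetic raster below are invented placeholders, not values from the studies.

        import numpy as np

        def runoff_depth(classified, rainfall_mm, c_imp=0.95, c_perv=0.15):
            """Composite runoff coefficient from an imperviousness raster
            (1 = impervious, 0 = pervious), applied to an event rainfall depth."""
            frac_imp = classified.mean()
            c = frac_imp * c_imp + (1.0 - frac_imp) * c_perv
            return c * rainfall_mm, frac_imp

        rng = np.random.default_rng(1)
        raster = (rng.random((500, 500)) < 0.4).astype(int)   # ~40% impervious cells
        depth, frac = runoff_depth(raster, rainfall_mm=20.0)
        print(round(depth, 1), "mm runoff,", round(frac, 2), "impervious fraction")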

  4. Flow prediction over a transport multi-element high-lift system and comparison with flight measurements

    NASA Technical Reports Server (NTRS)

    Vijgen, P. M. H. W.; Hardin, J. D.; Yip, L. P.

    1992-01-01

    Accurate prediction of surface-pressure distributions, merging boundary-layers, and separated-flow regions over multi-element high-lift airfoils is required to design advanced high-lift systems for efficient subsonic transport aircraft. The availability of detailed measurements of pressure distributions and both averaged and time-dependent boundary-layer flow parameters at flight Reynolds numbers is critical to evaluate computational methods and to model the turbulence structure for closure of the flow equations. Several detailed wind-tunnel measurements at subscale Reynolds numbers were conducted to obtain detailed flow information including the Reynolds-stress component. As part of a subsonic-transport high-lift research program, flight experiments are conducted using the NASA-Langley B737-100 research aircraft to obtain detailed flow characteristics for support of computational and wind-tunnel efforts. Planned flight measurements include pressure distributions at several spanwise locations, boundary-layer transition and separation locations, surface skin friction, as well as boundary-layer profiles and Reynolds stresses in adverse pressure-gradient flow.

  5. An experimental investigation devoted to determine heat transfer characteristics in a radiant ceiling heating system

    NASA Astrophysics Data System (ADS)

    Koca, Aliihsan; Acikgoz, Ozgen; Çebi, Alican; Çetin, Gürsel; Dalkilic, Ahmet Selim; Wongwises, Somchai

    2018-02-01

    Investigation of the heated-ceiling method is a comparatively new research area relative to the common wall heating-cooling and cooled-ceiling methods. In this work, the heat transfer characteristics of a heated radiant ceiling system were investigated experimentally. Different configurations of a single-room design were used to determine the convective and radiative heat transfer rates. The arrangement of the test chamber, the hydraulic circuit and radiant panels, the measurement equipment, and the experimental method, including an uncertainty analysis, are described in detail with reference to the relevant international standards. The total heat transfer from the panels was calculated as the sum of radiation to the unheated surfaces, convection to the air, and conduction heat loss from the backside of the panels. The integral expressions for the view factors were evaluated numerically using a Matlab code. By means of this experimental chamber, the radiative, convective and total heat-transfer coefficients, along with the heat fluxes provided from the ceiling to the unheated surrounding surfaces, were calculated. Moreover, the details of 28 different experimental case-study measurements from the chamber, including the convective, radiative and total heat flux and heat output results, are tabulated for other researchers to validate their theoretical models and empirical correlations.
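
    The split of panel output into radiative and convective parts can be sketched with the usual linearized radiative coefficient h_r = εσ(T_s² + T_a²)(T_s + T_a). The emissivity, convective coefficient and temperatures below are placeholder values, not the chamber's measured data.

        SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

        def h_radiative(t_panel_c, t_surr_c, emissivity=0.9):
            """Linearized radiative coefficient between panel and surroundings."""
            ts, ta = t_panel_c + 273.15, t_surr_c + 273.15
            return emissivity * SIGMA * (ts ** 2 + ta ** 2) * (ts + ta)

        def panel_heat_flux(t_panel_c, t_air_c, t_surr_c, h_conv=6.0, emissivity=0.9):
            """Total flux = radiation to unheated surfaces + convection to air;
            h_conv and emissivity are illustrative placeholders."""
            q_rad = h_radiative(t_panel_c, t_surr_c, emissivity) * (t_panel_c - t_surr_c)
            q_conv = h_conv * (t_panel_c - t_air_c)
            return q_rad, q_conv, q_rad + q_conv

        print(panel_heat_flux(t_panel_c=40.0, t_air_c=20.0, t_surr_c=18.0))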

  6. Screening organic chemicals in commerce for emissions in the context of environmental and human exposure.

    PubMed

    Breivik, Knut; Arnot, Jon A; Brown, Trevor N; McLachlan, Michael S; Wania, Frank

    2012-08-01

    Quantitative knowledge of organic chemical release into the environment is essential to understand and predict human exposure as well as to develop rational control strategies for any substances of concern. While significant efforts have been invested to characterize and screen organic chemicals for hazardous properties, relatively less effort has been directed toward estimating emissions and hence also risks. Here, a rapid throughput method to estimate emissions of discrete organic chemicals in commerce has been developed, applied and evaluated to support screening studies aimed at ranking and identifying chemicals of potential concern. The method builds upon information in the European Union Technical Guidance Document and utilizes information on quantities in commerce (production and/or import rates), chemical function (use patterns) and physical-chemical properties to estimate emissions to air, soil and water within the OECD for five stages of the chemical life-cycle. The method is applied to 16,029 discrete substances (identified by CAS numbers) from five national and international high production volume lists. As access to consistent input data remains fragmented or even impossible, particular attention is given to estimating, evaluating and discussing uncertainties in the resulting emission scenarios. The uncertainty for individual substances typically spans 3 to 4 orders of magnitude for this initial tier screening method. Information on uncertainties in emissions is useful as any screening or categorization methods which solely rely on threshold values are at risk of leading to a significant number of either false positives or false negatives. A limited evaluation of the screening method's estimates for a sub-set of about 100 substances, compared against independent and more detailed emission scenarios presented in various European Risk Assessment Reports, highlights that up-to-date and accurate information on quantities in commerce as well as a detailed breakdown on chemical function are critically needed for developing more realistic emission scenarios.
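
    The screening logic described above amounts to multiplying a quantity in commerce by stage- and compartment-specific emission factors and summing over the life cycle. A minimal sketch follows; the quantity and factors are invented for illustration and do not come from the Technical Guidance Document.

        # Hypothetical HPV chemical, tonnes per year in commerce
        quantity_t_per_yr = 5000.0

        # emission_factors[stage][compartment]: illustrative placeholder values
        emission_factors = {
            "production":  {"air": 0.001, "water": 0.002, "soil": 0.00},
            "formulation": {"air": 0.005, "water": 0.010, "soil": 0.00},
            "use":         {"air": 0.020, "water": 0.050, "soil": 0.01},
            "disposal":    {"air": 0.002, "water": 0.005, "soil": 0.02},
        }

        emissions = {c: 0.0 for c in ("air", "water", "soil")}
        for stage, factors in emission_factors.items():
            for compartment, f in factors.items():
                emissions[compartment] += quantity_t_per_yr * f
        print(emissions)   # tonnes/year released to each compartment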

  7. Quantitative analysis of small molecule-nucleic acid interactions with a biosensor surface and surface plasmon resonance detection.

    PubMed

    Liu, Yang; Wilson, W David

    2010-01-01

    Surface plasmon resonance (SPR) technology with biosensor surfaces has become a widely used tool for the study of nucleic acid interactions without any labeling requirements. The method provides simultaneous kinetic and equilibrium characterization of the interactions of biomolecules as well as small molecule-biopolymer binding. SPR monitors molecular interactions in real time and provides significant advantages over optical or calorimetric methods for systems with strong binding coupled to small spectroscopic signals and/or reaction heats. A detailed and practical guide for nucleic acid interaction analysis using SPR-biosensor methods is presented. Details of the SPR technology and basic fundamentals are described with recommendations on the preparation of the SPR instrument, sensor chips, and samples, as well as extensive information on experimental design, quantitative and qualitative data analysis and presentation. A specific example of the interaction of a minor-groove-binding agent with DNA is evaluated by both kinetic and steady-state SPR methods to illustrate the technique. Since the molecules that bind cooperatively to specific DNA sequences are attractive for many applications, a cooperative small molecule-DNA interaction is also presented.
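
    For the common 1:1 interaction model underlying such analyses, the sensorgram has a closed form: dR/dt = k_a·C·(R_max - R) - k_d·R, giving exponential approach to R_eq during association and exponential decay during dissociation, with K_D = k_d/k_a. A short sketch with invented rate constants (not data from the minor-groove-binding example):

        import numpy as np

        def spr_1to1(t, conc, ka, kd, rmax, r0=0.0):
            """Analytical 1:1 binding response: association approaches
            Req = ka*C*Rmax/(ka*C + kd) at rate ka*C + kd; dissociation
            (C = 0) decays exponentially at rate kd."""
            if conc > 0:
                k_obs = ka * conc + kd
                req = ka * conc * rmax / k_obs
                return req + (r0 - req) * np.exp(-k_obs * t)
            return r0 * np.exp(-kd * t)

        ka, kd, rmax = 1.0e5, 1.0e-2, 100.0          # invented rate constants
        t = np.linspace(0.0, 300.0, 301)
        assoc = spr_1to1(t, conc=1.0e-6, ka=ka, kd=kd, rmax=rmax)
        dissoc = spr_1to1(t, conc=0.0, ka=ka, kd=kd, rmax=rmax, r0=assoc[-1])
        print("KD =", kd / ka, "M")                  # equilibrium constant from kinetics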

  8. The secret lives of experiments: methods reporting in the fMRI literature.

    PubMed

    Carp, Joshua

    2012-10-15

    Replication of research findings is critical to the progress of scientific understanding. Accordingly, most scientific journals require authors to report experimental procedures in sufficient detail for independent researchers to replicate their work. To what extent do research reports in the functional neuroimaging literature live up to this standard? The present study evaluated methods reporting and methodological choices across 241 recent fMRI articles. Many studies did not report critical methodological details with regard to experimental design, data acquisition, and analysis. Further, many studies were underpowered to detect any but the largest statistical effects. Finally, data collection and analysis methods were highly flexible across studies, with nearly as many unique analysis pipelines as there were studies in the sample. Because the rate of false positive results is thought to increase with the flexibility of experimental designs, the field of functional neuroimaging may be particularly vulnerable to false positives. In sum, the present study documented significant gaps in methods reporting among fMRI studies. Improved methodological descriptions in research reports would yield significant benefits for the field. Copyright © 2012 Elsevier Inc. All rights reserved.

  9. Error Reduction Program. [combustor performance evaluation codes

    NASA Technical Reports Server (NTRS)

    Syed, S. A.; Chiappetta, L. M.; Gosman, A. D.

    1985-01-01

    The details of a study to select, incorporate and evaluate the best available finite difference scheme to reduce numerical error in combustor performance evaluation codes are described. The combustor performance computer programs chosen were the two dimensional and three dimensional versions of Pratt & Whitney's TEACH code. The criteria used to select schemes required that the difference equations mirror the properties of the governing differential equation, be more accurate than the current hybrid difference scheme, be stable and economical, be compatible with TEACH codes, use only modest amounts of additional storage, and be relatively simple. The methods of assessment used in the selection process consisted of examination of the difference equation, evaluation of the properties of the coefficient matrix, Taylor series analysis, and performance on model problems. Five schemes from the literature and three schemes developed during the course of the study were evaluated. This effort resulted in the incorporation of a scheme in 3D-TEACH which is usually more accurate than the hybrid differencing method and never less accurate.
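
    The hybrid differencing baseline that such studies improve upon can be demonstrated on steady 1-D convection-diffusion, where an exact solution exists for comparison. The sketch below implements Patankar's hybrid scheme (central differencing when the cell Peclet number is below 2, first-order upwind otherwise); the grid size and Peclet number are arbitrary choices, and this is not the TEACH code itself.

        import numpy as np

        def hybrid_convection_diffusion(n=20, peclet=10.0):
            """Steady 1-D convection-diffusion on [0, 1] with phi(0)=0, phi(1)=1,
            discretized with the hybrid scheme: central differencing where the
            cell Peclet number |F/D| < 2, first-order upwind otherwise."""
            dx = 1.0 / n
            F, D = peclet, 1.0 / dx                    # convective / diffusive strengths
            aE = max(-F, D - F / 2.0, 0.0)
            aW = max(F, D + F / 2.0, 0.0)
            aP = aE + aW
            m = n - 1                                  # interior nodes
            A = np.zeros((m, m))
            b = np.zeros(m)
            for i in range(m):
                A[i, i] = aP
                if i > 0:
                    A[i, i - 1] = -aW
                if i < m - 1:
                    A[i, i + 1] = -aE
            b[-1] = aE * 1.0                           # Dirichlet value phi(1) = 1
            phi = np.linalg.solve(A, b)
            x = np.linspace(dx, 1.0 - dx, m)
            exact = (np.exp(peclet * x) - 1.0) / (np.exp(peclet) - 1.0)
            return np.max(np.abs(phi - exact))         # maximum nodal error

        print(hybrid_convection_diffusion())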

  10. Using experimental design modules for process characterization in manufacturing/materials processes laboratories

    NASA Technical Reports Server (NTRS)

    Ankenman, Bruce; Ermer, Donald; Clum, James A.

    1994-01-01

    Modules dealing with statistical experimental design (SED), process modeling and improvement, and response surface methods have been developed and tested in two laboratory courses. One course was a manufacturing processes course in Mechanical Engineering and the other course was a materials processing course in Materials Science and Engineering. Each module is used as an 'experiment' in the course with the intent that subsequent course experiments will use SED methods for analysis and interpretation of data. Evaluation of the modules' effectiveness has been done by both survey questionnaires and inclusion of the module methodology in course examination questions. Results of the evaluation have been very positive. Those evaluation results and details of the modules' content and implementation are presented. The modules represent an important component for updating laboratory instruction and to provide training in quality for improved engineering practice.

  11. Benchmark data sets for structure-based computational target prediction.

    PubMed

    Schomburg, Karen T; Rarey, Matthias

    2014-08-25

    Structure-based computational target prediction methods identify potential targets for a bioactive compound. Methods based on protein-ligand docking so far face many challenges, the greatest of which is probably the ranking of true targets in a large data set of protein structures. Currently, no standard data sets for evaluation exist, rendering the comparison and demonstration of method improvements cumbersome. Therefore, we propose two data sets and evaluation strategies for a meaningful evaluation of new target prediction methods, i.e., a small data set consisting of three target classes for detailed proof-of-concept and selectivity studies and a large data set consisting of 7992 protein structures and 72 drug-like ligands allowing statistical evaluation with performance metrics on a drug-like chemical space. Both data sets are built from openly available resources, and any information needed to perform the described experiments is reported. We describe the composition of the data sets, the setup of screening experiments, and the evaluation strategy. Performance metrics capable of measuring the early recognition of enrichment, such as AUC, BEDROC, and NSLR, are proposed. We apply a sequence-based target prediction method to the large data set to analyze its content of nontrivial evaluation cases. The proposed data sets are used for method evaluation of our new inverse screening method iRAISE. The small data set reveals the method's capability and limitations in selectively distinguishing between rather similar protein structures. The large data set simulates real target identification scenarios. iRAISE achieves excellent or good enrichment in 55% of cases and a median AUC of 0.67, with RMSDs below 2.0 Å for 74% of cases, and it predicted the first true target within the top 2% of the roughly 8000-structure protein data set in 59 of 72 cases.
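
    Of the proposed metrics, AUC is the simplest to compute from a ranked screening list via the rank-sum (Mann-Whitney) identity. A small sketch with invented scores and labels, not the benchmark data:

        import numpy as np

        def auc_from_scores(scores, labels):
            """Rank-based AUC: probability that a true target outranks a decoy,
            computed from the Wilcoxon/Mann-Whitney rank sum (ties ignored)."""
            scores = np.asarray(scores, dtype=float)
            labels = np.asarray(labels, dtype=bool)
            ranks = scores.argsort().argsort() + 1      # 1-based ascending ranks
            n_pos, n_neg = labels.sum(), (~labels).sum()
            return (ranks[labels].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

        # Hypothetical inverse-screening run: higher score = more likely target
        scores = [9.1, 7.4, 6.8, 6.5, 5.9, 5.2, 4.8, 3.3]
        labels = [1, 0, 1, 0, 0, 1, 0, 0]               # 1 = true target
        print(auc_from_scores(scores, labels))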

  12. Initial progress in the recording of crime scene simulations using 3D laser structured light imagery techniques for law enforcement and forensic applications

    NASA Astrophysics Data System (ADS)

    Altschuler, Bruce R.; Monson, Keith L.

    1998-03-01

    Representation of crime scenes as virtual reality 3D computer displays promises to become a useful and important tool for law enforcement evaluation and analysis, forensic identification and pathological study and archival presentation during court proceedings. Use of these methods for assessment of evidentiary materials demands complete accuracy of reproduction of the original scene, both in data collection and in its eventual virtual reality representation. The recording of spatially accurate information as soon as possible after first arrival of law enforcement personnel is advantageous for unstable or hazardous crime scenes and reduces the possibility that either inadvertent measurement error or deliberate falsification may occur or be alleged concerning processing of a scene. Detailed measurements and multimedia archiving of critical surface topographical details in a calibrated, uniform, consistent and standardized quantitative 3D coordinate method are needed. These methods would afford professional personnel in initial contact with a crime scene the means for remote, non-contacting, immediate, thorough and unequivocal documentation of the contents of the scene. Measurements of the relative and absolute global positions of objects and victims, and their dispositions within the scene before their relocation and detailed examination, could be made. Resolution must be sufficient to map both small and large objects. Equipment must be able to map regions at varied resolution as collected from different perspectives. Progress is presented in devising methods for collecting and archiving 3D spatial numerical data from crime scenes, sufficient for law enforcement needs, by remote laser structured light and video imagery. Two types of simulation studies were done. One study evaluated the potential of 3D topographic mapping and 3D telepresence using a robotic platform for explosive ordnance disassembly. The second study involved using the laser mapping system on a fixed optical bench with simulated crime scene models of people and furniture to assess feasibility, requirements and utility of such a system for crime scene documentation and analysis.

  13. A Review of Methods for Analysis of the Expected Value of Information.

    PubMed

    Heath, Anna; Manolopoulou, Ioanna; Baio, Gianluca

    2017-10-01

    In recent years, value-of-information analysis has become more widespread in health economic evaluations, specifically as a tool to guide further research and perform probabilistic sensitivity analysis. This is partly due to methodological advancements allowing for the fast computation of a typical summary known as the expected value of partial perfect information (EVPPI). A recent review discussed some approximation methods for calculating the EVPPI, but as the research has been active over the intervening years, that review does not discuss some key estimation methods. Therefore, this paper presents a comprehensive review of these new methods. We begin by providing the technical details of these computation methods. We then present two case studies in order to compare the estimation performance of these new methods. We conclude that a method based on nonparametric regression offers the best method for calculating the EVPPI in terms of accuracy, computational time, and ease of implementation. This means that the EVPPI can now be used practically in health economic evaluations, especially as all the methods are developed in parallel with R functions and a web app to aid practitioners.
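
    The nonparametric-regression estimator favored by the review can be sketched in a few lines for a single parameter: regress each option's net benefit on the parameter, then take E[max of fitted values] minus max of mean net benefits. The smoother below is a plain polynomial stand-in and the decision problem is invented, so this is an illustration of the estimator's shape, not a reference implementation.

        import numpy as np

        def evppi_regression(theta, nb, degree=3):
            """Single-parameter EVPPI via regression: fit g_d(theta) for each
            option d, then EVPPI = E[max_d g_d(theta)] - max_d E[NB_d]."""
            fitted = np.column_stack([
                np.polyval(np.polyfit(theta, nb[:, d], degree), theta)
                for d in range(nb.shape[1])])
            return fitted.max(axis=1).mean() - nb.mean(axis=0).max()

        # Toy probabilistic sensitivity analysis: 5000 samples, two options
        rng = np.random.default_rng(0)
        theta = rng.normal(0.0, 1.0, 5000)
        noise = rng.normal(0.0, 1.0, (5000, 2))
        nb = np.column_stack([1000 + 500 * theta, 1200 - 300 * theta]) + 400 * noise
        print(evppi_regression(theta, nb))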

  14. Design and evaluation of a freeform lens by using a method of luminous intensity mapping and a differential equation

    NASA Astrophysics Data System (ADS)

    Essameldin, Mahmoud; Fleischmann, Friedrich; Henning, Thomas; Lang, Walter

    2017-02-01

    Freeform optical systems are playing an important role in the field of illumination engineering for redistributing the light intensity, because of their capability of achieving accurate and efficient results. The authors presented the basic idea of the freeform lens design method at the 117th annual meeting of the German Society of Applied Optics (DGAO Proceedings). Now, we demonstrate the feasibility of the design method by designing and evaluating a freeform lens. The concepts of luminous intensity mapping, energy conservation and a differential equation are combined in designing a lens for non-imaging applications. The procedures required to design a lens, including the simulations, are explained in detail. The optical performance is investigated by using a numerical simulation of optical ray tracing. For evaluation, the results are compared with another recently published design method, showing the accurate performance of the proposed method using a reduced number of mapping angles. As a part of the tolerance analyses of the fabrication processes, the influence of the light source misalignments (translation and orientation) on the beam-shaping performance is presented. Finally, the importance of considering the extended light source while designing a freeform lens using the proposed method is discussed.

  15. Do Low-Income Students Have Equal Access to the Highest-Performing Teachers? Technical Appendix. NCEE 2011-4016

    ERIC Educational Resources Information Center

    Glazerman, Steven; Max, Jeffrey

    2011-01-01

    This appendix describes the methods and provides further detail to support the evaluation brief, "Do Low-Income Students Have Equal Access to the Highest-Performing Teachers?" (Contains 8 figures, 6 tables and 5 footnotes.) [For the main report, "Do Low-Income Students Have Equal Access to the Highest-Performing Teachers? NCEE…

  16. Measuring Learning Gains in Chemical Education: A Comparison of Two Methods

    ERIC Educational Resources Information Center

    Pentecost, Thomas C.; Barbera, Jack

    2013-01-01

    Evaluating the effect of a pedagogical innovation is often done by looking for a significant difference in a content measure using a pre-post design. While this approach provides valuable information regarding the presence or absence of an effect, it is limited in providing details about the nature of the effect. A measure of the magnitude of the…

  17. Microcomputer Programs for Educational Statistics: A Review of Popular Programs. TME Report 89.

    ERIC Educational Resources Information Center

    Stemmer, Paul M.; Berger, Carl F.

    This publication acquaints the user with microcomputer statistical packages and offers a method for evaluation based on a set of criteria that can be adapted to the needs of the user. Several popular packages, typical of those available, are reviewed in detail: (1) Abstat, an easy to use command driven package compatible with the IBM PC or the…

  18. Evaluation of methods for delineating riparian zones in a semi-arid montane watershed

    Treesearch

    Jessica A. Salo; David M. Theobald; Thomas C. Brown

    2016-01-01

    Riparian zones in semi-arid, mountainous regions provide a disproportionate amount of the available wildlife habitat and ecosystem services. Despite their importance, there is little guidance on the best way to map riparian zones for broad spatial extents (e.g., large watersheds) when detailed maps from field data or high-resolution imagery and terrain data...

  19. Description of Audio-Visual Recording Equipment and Method of Installation for Pilot Training.

    ERIC Educational Resources Information Center

    Neese, James A.

    The Audio-Video Recorder System was developed to evaluate the effectiveness of in-flight audio/video recording as a pilot training technique for the U.S. Air Force Pilot Training Program. It will be used to gather background and performance data for an experimental program. A detailed description of the system is presented and construction and…

  20. TO "LIMITATIONS OF ROI TESTING FOR VENTING DESIGN: DESCRIPTION OF AN ALTERNATIVE APPROACH BASED ON ATTAINMENT OF A CRITICAL PORE-GAS VELOCITY IN CONTAMINATED MEDIA

    EPA Science Inventory

    In this paper, we describe the limitations of radius of influence (ROI) evaluation for venting design in more detail than has been done previously and propose an alternative method based on specification and attainment of critical pore-gas velocities in contaminated subsurface me...

  1. A method for assessing the silvicultural effects of releasing young trees from competition.

    Treesearch

    P.W. Owsten; M. Greenup; V.A. Davis

    1986-01-01

    Systematic, long-term measurements of the survival and growth effects of releasing crop trees from competing vegetation are important for evaluating vegetation management treatments in forest plantations. This report details field-tested procedures for use in any type of release treatment—mechanical, manual, biological, or chemical. The basic concept is to delineate...

  2. Controlled Release of Antibiotics from Biodegradable Microcapsules for Wound Infection Control.

    DTIC Science & Technology

    1982-06-18

    Evaporation and phase separation methods were used in formulating the microcapsules (11). The microencapsulation process will be described in detail in a ... intensity to the antibiotic content. Using both microencapsulation processes, 14C-labeled ampicillin anhydrate microcapsules were synthesized (12) ...

  3. Investigating Qualities of Teachers' Feedback Conversations for Fostering Reasoning and Feeling of Self-Worth in Learners: A Tool Called Feedback Mapping

    ERIC Educational Resources Information Center

    Quynn, Jennifer Ann

    2013-01-01

    Teacher feedback has been identified throughout the educational literature as a powerful classroom intervention. However few tools exist that allow teachers to understand their own feedback practice. This study details a method for evaluating the feedback experiences of students. The feedback conversations of middle school science teachers were…

  4. Comparison of Feature Learning Methods for Human Activity Recognition Using Wearable Sensors

    PubMed Central

    Li, Frédéric; Shirahama, Kimiaki; Nisar, Muhammad Adeel; Köping, Lukas; Grzegorzek, Marcin

    2018-01-01

    Getting a good feature representation of data is paramount for Human Activity Recognition (HAR) using wearable sensors. An increasing number of feature learning approaches—in particular deep-learning based—have been proposed to extract an effective feature representation by analyzing large amounts of data. However, getting an objective interpretation of their performances faces two problems: the lack of a baseline evaluation setup, which makes a strict comparison between them impossible, and the insufficiency of implementation details, which can hinder their use. In this paper, we attempt to address both issues: we firstly propose an evaluation framework allowing a rigorous comparison of features extracted by different methods, and use it to carry out extensive experiments with state-of-the-art feature learning approaches. We then provide all the codes and implementation details to make both the reproduction of the results reported in this paper and the re-use of our framework easier for other researchers. Our studies carried out on the OPPORTUNITY and UniMiB-SHAR datasets highlight the effectiveness of hybrid deep-learning architectures involving convolutional and Long-Short-Term-Memory (LSTM) to obtain features characterising both short- and long-term time dependencies in the data. PMID:29495310

  5. Comparison of Feature Learning Methods for Human Activity Recognition Using Wearable Sensors.

    PubMed

    Li, Frédéric; Shirahama, Kimiaki; Nisar, Muhammad Adeel; Köping, Lukas; Grzegorzek, Marcin

    2018-02-24

    Getting a good feature representation of data is paramount for Human Activity Recognition (HAR) using wearable sensors. An increasing number of feature learning approaches-in particular deep-learning based-have been proposed to extract an effective feature representation by analyzing large amounts of data. However, getting an objective interpretation of their performances faces two problems: the lack of a baseline evaluation setup, which makes a strict comparison between them impossible, and the insufficiency of implementation details, which can hinder their use. In this paper, we attempt to address both issues: we firstly propose an evaluation framework allowing a rigorous comparison of features extracted by different methods, and use it to carry out extensive experiments with state-of-the-art feature learning approaches. We then provide all the codes and implementation details to make both the reproduction of the results reported in this paper and the re-use of our framework easier for other researchers. Our studies carried out on the OPPORTUNITY and UniMiB-SHAR datasets highlight the effectiveness of hybrid deep-learning architectures involving convolutional and Long-Short-Term-Memory (LSTM) to obtain features characterising both short- and long-term time dependencies in the data.
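    For intuition, a minimal Keras sketch of the kind of hybrid convolutional + LSTM feature extractor the authors report as effective; the window length, channel count, layer sizes and class count below are assumptions for illustration, not the configuration from the paper (whose own code is published separately).

```python
import tensorflow as tf

# Hypothetical setup: 128-sample sliding windows over 9 inertial channels,
# classified into 18 activity classes (all sizes are illustrative).
model = tf.keras.Sequential([
    tf.keras.layers.Conv1D(64, 5, activation="relu",
                           input_shape=(128, 9)),     # short-term motion patterns
    tf.keras.layers.Conv1D(64, 5, activation="relu"),
    tf.keras.layers.LSTM(128, return_sequences=True), # long-term time dependencies
    tf.keras.layers.LSTM(128),                        # final state = feature vector
    tf.keras.layers.Dense(18, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```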

  6. Research and development of a field-ready protocol for sampling of phosgene from stationary source emissions: Diethylamine reagent studies. Research report, 11 July 1995--30 September 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steger, J.L.; Bursey, J.T.; Merrill, R.G.

    1999-03-01

    This report presents the results of laboratory studies to develop and evaluate a method for the sampling and analysis of phosgene from stationary sources of air emissions using diethylamine (DEA) in toluene as the collection medium. The method extracts stack gas from emission sources and stabilizes the reactive gas for subsequent analysis. DEA was evaluated both in a benchtop study and in a laboratory train spiking study, and this report includes results for both. Benchtop studies to evaluate the suitability of DEA for collecting and analyzing phosgene investigated five variables: storage time, DEA concentration, moisture/pH, phosgene concentration, and sample storage temperature. Prototype sampling train studies were performed to determine whether the benchtop chemical studies were transferable to a Modified Method 5 sampling train collecting phosgene in the presence of clean air mixed with typical stack gas components. Four conditions, which varied the moisture and phosgene spike, were evaluated in triplicate. In addition to research results, the report includes a detailed draft method for sampling and analysis of phosgene from stationary source emissions.

  7. Analysis techniques for the evaluation of the neutrinoless double-β decay lifetime in 130Te with the CUORE-0 detector

    DOE PAGES

    Alduino, C.; Alfonso, K.; Artusa, D. R.; ...

    2016-04-25

    Here, we describe in detail the methods used to obtain the lower bound on the lifetime of neutrinoless double-beta (0νββ) decay in 130Te and the associated limit on the effective Majorana mass of the neutrino using the CUORE-0 detector. CUORE-0 is a bolometric detector array located at the Laboratori Nazionali del Gran Sasso that was designed to validate the background reduction techniques developed for CUORE, a next-generation experiment scheduled to come online in 2016. CUORE-0 is also a competitive 0νββ decay search in its own right and functions as a platform to further develop the analysis tools and procedures to be used in CUORE. These include data collection, event selection and processing, as well as an evaluation of signal efficiency. In particular, we describe the amplitude evaluation, thermal gain stabilization, energy calibration methods, and the analysis event selection used to create our final 0νββ search spectrum. We define our high-level analysis procedures, with emphasis on the new insights gained and challenges encountered. We outline in detail our fitting methods near the hypothesized 0νββ decay peak and catalog the main sources of systematic uncertainty. Finally, we derive the 0νββ decay half-life limits previously reported for CUORE-0, T^0ν_1/2 > 2.7 × 10^24 yr, and in combination with the Cuoricino limit, T^0ν_1/2 > 4.0 × 10^24 yr.

  8. A Systematic Literature Review of US Engineering Ethics Interventions.

    PubMed

    Hess, Justin L; Fore, Grant

    2018-04-01

    Promoting the ethical formation of engineering students through the cultivation of their discipline-specific knowledge, sensitivity, imagination, and reasoning skills has become a goal for many engineering education programs throughout the United States. However, there is neither a consensus throughout the engineering education community regarding which strategies are most effective towards which ends, nor which ends are most important. This study provides an overview of engineering ethics interventions within the U.S. through the systematic analysis of articles that featured ethical interventions in engineering, published in select peer-reviewed journals, and published between 2000 and 2015. As a core criterion, each journal article reviewed must have provided an overview of the course as well as how the authors evaluated course-learning goals. In sum, 26 articles were analyzed with a coding scheme that included 56 binary items. The results indicate that the most common methods for integrating ethics into engineering involved exposing students to codes/standards, utilizing case studies, and discussion activities. Nearly half of the articles had students engage with ethical heuristics or philosophical ethics. Following the presentation of the results, this study describes in detail four articles to highlight less common but intriguing pedagogical methods and evaluation techniques. The findings indicate that there is limited empirical work on ethics education within engineering across the United States. Furthermore, due to the large variation in goals, approaches, and evaluation methods described across interventions, this study does not detail "best" practices for integrating ethics into engineering. The science and engineering education community should continue exploring the relative merits of different approaches to ethics education in engineering.

  9. Analysis techniques for the evaluation of the neutrinoless double-β decay lifetime in 130Te with the CUORE-0 detector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alduino, C.; Alfonso, K.; Artusa, D. R.

    2016-04-25

    We describe in detail the methods used to obtain the lower bound on the lifetime of neutrinoless double-beta (0νββ) decay in 130Te and the associated limit on the effective Majorana mass of the neutrino using the CUORE-0 detector. CUORE-0 is a bolometric detector array located at the Laboratori Nazionali del Gran Sasso that was designed to validate the background reduction techniques developed for CUORE, a next-generation experiment scheduled to come online in 2016. CUORE-0 is also a competitive 0νββ decay search in its own right and functions as a platform to further develop the analysis tools and procedures to be used in CUORE. These include data collection, event selection and processing, as well as an evaluation of signal efficiency. In particular, we describe the amplitude evaluation, thermal gain stabilization, energy calibration methods, and the analysis event selection used to create our final 0νββ search spectrum. We define our high-level analysis procedures, with emphasis on the new insights gained and challenges encountered. We outline in detail our fitting methods near the hypothesized 0νββ decay peak and catalog the main sources of systematic uncertainty. Finally, we derive the 0νββ decay half-life limits previously reported for CUORE-0, T^0ν_1/2 > 2.7 × 10^24 yr, and in combination with the Cuoricino limit, T^0ν_1/2 > 4.0 × 10^24 yr.

  10. Evaluation of repetitive-PCR and matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) for rapid strain typing of Bacillus coagulans

    PubMed Central

    Nakayama, Motokazu; Tomita, Ayumi; Sonoda, Takumi; Hasumi, Motomitsu; Miyamoto, Takahisa

    2017-01-01

    To establish a rapid and accurate typing method for Bacillus coagulans, a species important to control in the manufacture of some canned foods and tea-based beverages because of the high heat resistance of its spores and the high tolerance of its vegetative cells to catechins and chemicals, matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) and repetitive-PCR (rep-PCR) were evaluated. For this purpose, 28 strains of B. coagulans obtained from various culture collections were tested. DNA sequence analyses of the genes encoding 16S rRNA and DNA gyrase classified the test strains into two and three groups, respectively, regardless of their phenotypes. Both the MALDI-TOF MS and rep-PCR methods classified the test strains in much finer detail. Strains within each group showed similar phenotypes, such as carbohydrate utilization determined using API 50CH. In particular, two pairs of strains with matching metabolic characteristics were each classified into the same group by both MALDI-TOF MS and rep-PCR, apart from the other strains, whereas strains with differing carbohydrate-utilization profiles were separated into different groups by these methods. These results suggest that the combination of MALDI-TOF MS and rep-PCR analyses is advantageous for rapid and detailed typing of bacterial strains with respect to both phenotype and genotype. PMID:29020109

  11. Evaluation of repetitive-PCR and matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) for rapid strain typing of Bacillus coagulans.

    PubMed

    Sato, Jun; Nakayama, Motokazu; Tomita, Ayumi; Sonoda, Takumi; Hasumi, Motomitsu; Miyamoto, Takahisa

    2017-01-01

    To establish a rapid and accurate typing method for Bacillus coagulans, a species important to control in the manufacture of some canned foods and tea-based beverages because of the high heat resistance of its spores and the high tolerance of its vegetative cells to catechins and chemicals, matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) and repetitive-PCR (rep-PCR) were evaluated. For this purpose, 28 strains of B. coagulans obtained from various culture collections were tested. DNA sequence analyses of the genes encoding 16S rRNA and DNA gyrase classified the test strains into two and three groups, respectively, regardless of their phenotypes. Both the MALDI-TOF MS and rep-PCR methods classified the test strains in much finer detail. Strains within each group showed similar phenotypes, such as carbohydrate utilization determined using API 50CH. In particular, two pairs of strains with matching metabolic characteristics were each classified into the same group by both MALDI-TOF MS and rep-PCR, apart from the other strains, whereas strains with differing carbohydrate-utilization profiles were separated into different groups by these methods. These results suggest that the combination of MALDI-TOF MS and rep-PCR analyses is advantageous for rapid and detailed typing of bacterial strains with respect to both phenotype and genotype.

  12. Developing and validating a nutrition knowledge questionnaire: key methods and considerations.

    PubMed

    Trakman, Gina Louise; Forsyth, Adrienne; Hoye, Russell; Belski, Regina

    2017-10-01

    To outline key statistical considerations and detailed methodologies for the development and evaluation of a valid and reliable nutrition knowledge questionnaire. Literature on questionnaire development in a range of fields was reviewed and a set of evidence-based guidelines specific to the creation of a nutrition knowledge questionnaire have been developed. The recommendations describe key qualitative methods and statistical considerations, and include relevant examples from previous papers and existing nutrition knowledge questionnaires. Where details have been omitted for the sake of brevity, the reader has been directed to suitable references. We recommend an eight-step methodology for nutrition knowledge questionnaire development as follows: (i) definition of the construct and development of a test plan; (ii) generation of the item pool; (iii) choice of the scoring system and response format; (iv) assessment of content validity; (v) assessment of face validity; (vi) purification of the scale using item analysis, including item characteristics, difficulty and discrimination; (vii) evaluation of the scale including its factor structure and internal reliability, or Rasch analysis, including assessment of dimensionality and internal reliability; and (viii) gathering of data to re-examine the questionnaire's properties, assess temporal stability and confirm construct validity. Several of these methods have previously been overlooked. The measurement of nutrition knowledge is an important consideration for individuals working in the nutrition field. Improved methods in the development of nutrition knowledge questionnaires, such as the use of factor analysis or Rasch analysis, will enable more confidence in reported measures of nutrition knowledge.
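    As one concrete illustration of step (vi), the sketch below computes classical item-analysis statistics for a 0/1-scored questionnaire. The function name is hypothetical, and using the corrected item-total correlation as the discrimination index is one common convention we assume here, not a formula prescribed by the paper.

```python
import numpy as np

def item_analysis(responses):
    """Classical item analysis for a 0/1-scored knowledge questionnaire.

    difficulty     = proportion of respondents answering each item correctly
    discrimination = corrected item-total correlation (item score vs.
                     total score with that item removed)
    """
    R = np.asarray(responses, dtype=float)   # shape: (respondents, items)
    difficulty = R.mean(axis=0)
    totals = R.sum(axis=1)
    discrimination = np.array([
        np.corrcoef(R[:, j], totals - R[:, j])[0, 1]
        for j in range(R.shape[1])
    ])
    return difficulty, discrimination

# Toy data: 5 respondents x 4 items
scores = [[1, 1, 0, 1],
          [1, 0, 0, 1],
          [0, 1, 0, 0],
          [1, 1, 1, 1],
          [0, 0, 0, 1]]
print(item_analysis(scores))
```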

  13. Materials Compatibility Testing in Concentrated Hydrogen Peroxide

    NASA Technical Reports Server (NTRS)

    Boxwell, R.; Bromley, G.; Mason, D.; Crockett, D.; Martinez, L.; McNeal, C.; Lyles, G. (Technical Monitor)

    2000-01-01

    Materials test methods from the 1960's have been used as a starting point in evaluating materials for today's space launch vehicles. These established test methods have been modified to incorporate today's analytical laboratory equipment. The Orbital test objective was to test a wide range of materials to incorporate the revolution in polymer and composite materials that has occurred since the 1960's. Testing is accomplished in 3 stages from rough screening to detailed analytical tests. Several interesting test observations have been made during this testing and are included in the paper. A summary of the set-up, test and evaluation of long-term storage sub-scale tanks is also included. This sub-scale tank test lasted for a 7-month duration prior to being stopped due to a polar boss material breakdown. Chemical evaluations of the hydrogen peroxide and residue left on the polar boss surface identify the material breakdown quite clearly. The paper concludes with recommendations for future testing and a specific effort underway within the industry to standardize the test methods used in evaluating materials.

  14. Organic food quality: a framework for concept, definition and evaluation from the European perspective.

    PubMed

    Kahl, Johannes; Baars, Ton; Bügel, Susanne; Busscher, Nicolaas; Huber, Machteld; Kusche, Daniel; Rembiałkowska, Ewa; Schmid, Otto; Seidel, Kathrin; Taupier-Letage, Bruno; Velimirov, Alberta; Załecka, Aneta

    2012-11-01

    Consumers buy organic food because they believe in the high quality of the product. Furthermore, the EU legal regulatory framework for organic food and farming defines high quality of the products as an important goal of production. A major challenge is the need to define food quality concepts and methods for determination. A background is described which allows embedding of the quality definitions as well as evaluation methods into a conceptual framework connected to the vision and mission of organic agriculture and food production. Organic food quality is defined through specific aspects and criteria. For evaluation, each criterion has to be described by indicators. The determination of indicators should be through parameters, where parameters are described by methods. The conceptual framework is described according to underlying principles and starting definitions are given, but further work has to be done on the detailed scientific description of the indicators. Furthermore, parameters have to be defined for the evaluation of the suitability of these indicators for organic food production. Copyright © 2012 Society of Chemical Industry.

  15. Algorithms for computing the time-corrected instantaneous frequency (reassigned) spectrogram, with applications.

    PubMed

    Fulop, Sean A; Fitz, Kelly

    2006-01-01

    A modification of the spectrogram (log magnitude of the short-time Fourier transform) to more accurately show the instantaneous frequencies of signal components was first proposed in 1976 [Kodera et al., Phys. Earth Planet. Inter. 12, 142-150 (1976)], and has been considered or reinvented a few times since but never widely adopted. This paper presents a unified theoretical picture of this time-frequency analysis method, the time-corrected instantaneous frequency spectrogram, together with detailed implementable algorithms comparing three published techniques for its computation. The new representation is evaluated against the conventional spectrogram for its superior ability to track signal components. The lack of a uniform framework for either mathematics or implementation details which has characterized the disparate literature on the schemes has been remedied here. Fruitful application of the method is shown in the realms of speech phonation analysis, whale song pitch tracking, and additive sound modeling.
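    A sketch of the core cross-spectral (phase-difference) computation underlying the time-corrected instantaneous frequency, in the spirit of one of the published techniques the paper compares. Only the frequency-correction half is shown (the time correction via local group delay is omitted), and all names and parameters are illustrative, not the authors' implementation.

```python
import numpy as np

def cif_spectrogram(x, fs, n_fft=1024, hop=256):
    """Per-bin instantaneous frequency via the phase advance between two
    STFTs offset by one sample: arg(X2 * conj(X1)) = 2*pi*f/fs exactly for
    a stationary sinusoid, independent of the bin centre frequency."""
    w = np.hanning(n_fft)
    starts = np.arange(0, len(x) - n_fft - 1, hop)
    X1 = np.array([np.fft.rfft(w * x[s:s + n_fft]) for s in starts]).T
    X2 = np.array([np.fft.rfft(w * x[s + 1:s + 1 + n_fft]) for s in starts]).T
    inst_freq = np.angle(X2 * np.conj(X1)) * fs / (2.0 * np.pi)
    return inst_freq, np.abs(X1)

fs = 8000.0
t = np.arange(fs) / fs                       # 1 s test tone
freqs, mags = cif_spectrogram(np.cos(2 * np.pi * 440 * t), fs)
k = np.argmax(mags[:, 0])                    # strongest bin of first frame
print(freqs[k, 0])                           # ~440.0, despite ~7.8 Hz bin width
```

    The test at the end illustrates the point of the method: the recovered frequency is far more precise than the FFT bin spacing.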

  16. Public Health Detailing—A Successful Strategy to Promote Judicious Opioid Analgesic Prescribing

    PubMed Central

    Tuazon, Ellenie; Paone, Denise; Dowell, Deborah; Vo, Linda; Starrels, Joanna L.; Jones, Christopher M.; Kunins, Hillary V.

    2016-01-01

    Objectives. To evaluate knowledge and prescribing changes following a 2-month public health detailing campaign (one-to-one educational visits) about judicious opioid analgesic prescribing conducted among health care providers in Staten Island, New York City, in 2013. Methods. Three detailing campaign recommendations were (1) a 3-day supply of opioids is usually sufficient for acute pain, (2) avoid prescribing opioids for chronic noncancer pain, and (3) avoid high-dose opioid prescriptions. Evaluation consisted of a knowledge survey, and assessing prescribing rates and median day supply per prescription. Prescribing data from the 3-month period before the campaign were compared with 2 sequential 3-month periods after the campaign. Results. Among 866 health care providers visited, knowledge increased for all 3 recommendations (P < .01). After the campaign, the overall prescribing rate decreased similarly in Staten Island and other New York City counties (boroughs), but the high-dose prescribing rate decreased more in Staten Island than in other boroughs (P < .01). Median day supply remained stable in Staten Island and increased in other boroughs. Conclusions. The public health detailing campaign improved knowledge and likely prescribing practices and could be considered by other jurisdictions to promote judicious opioid prescribing. PMID:27400353

  17. Spiking Cortical Model Based Multimodal Medical Image Fusion by Combining Entropy Information with Weber Local Descriptor

    PubMed Central

    Zhang, Xuming; Ren, Jinxia; Huang, Zhiwen; Zhu, Fei

    2016-01-01

    Multimodal medical image fusion (MIF) plays an important role in clinical diagnosis and therapy. Existing MIF methods tend to introduce artifacts, lead to loss of image details or produce low-contrast fused images. To address these problems, a novel spiking cortical model (SCM) based MIF method has been proposed in this paper. The proposed method can generate high-quality fused images using the weighting fusion strategy based on the firing times of the SCM. In the weighting fusion scheme, the weight is determined by combining the entropy information of pulse outputs of the SCM with the Weber local descriptor operating on the firing mapping images produced from the pulse outputs. The extensive experiments on multimodal medical images show that compared with the numerous state-of-the-art MIF methods, the proposed method can preserve image details very well and avoid the introduction of artifacts effectively, and thus it significantly improves the quality of fused images in terms of human vision and objective evaluation criteria such as mutual information, edge preservation index, structural similarity based metric, fusion quality index, fusion similarity metric and standard deviation. PMID:27649190

  18. Spiking Cortical Model Based Multimodal Medical Image Fusion by Combining Entropy Information with Weber Local Descriptor.

    PubMed

    Zhang, Xuming; Ren, Jinxia; Huang, Zhiwen; Zhu, Fei

    2016-09-15

    Multimodal medical image fusion (MIF) plays an important role in clinical diagnosis and therapy. Existing MIF methods tend to introduce artifacts, lead to loss of image details or produce low-contrast fused images. To address these problems, a novel spiking cortical model (SCM) based MIF method has been proposed in this paper. The proposed method can generate high-quality fused images using the weighting fusion strategy based on the firing times of the SCM. In the weighting fusion scheme, the weight is determined by combining the entropy information of pulse outputs of the SCM with the Weber local descriptor operating on the firing mapping images produced from the pulse outputs. The extensive experiments on multimodal medical images show that compared with the numerous state-of-the-art MIF methods, the proposed method can preserve image details very well and avoid the introduction of artifacts effectively, and thus it significantly improves the quality of fused images in terms of human vision and objective evaluation criteria such as mutual information, edge preservation index, structural similarity based metric, fusion quality index, fusion similarity metric and standard deviation.

  19. Research on the use of data fusion technology to evaluate the state of electromechanical equipment

    NASA Astrophysics Data System (ADS)

    Lin, Lin

    2018-04-01

    To address the problems of heterogeneous test-information modes and the coexistence of quantitative and qualitative information in the state evaluation of electromechanical equipment, this paper proposes applying data fusion technology to that evaluation. The paper introduces the state evaluation process for electromechanical equipment in detail, uses Dempster-Shafer (D-S) evidence theory to fuse the decision-making layers of the state evaluation, and carries out simulation tests. The simulation results show that applying data fusion technology to the state evaluation of electromechanical equipment is feasible and effective: after the multiple pieces of decision information provided by different evaluation methods are fused and the useful information is extracted repeatedly, the fuzziness of the judgment is reduced and the credibility of the state evaluation is improved.
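    The decision-layer fusion step described above rests on Dempster's rule of combination. Below is a minimal, generic sketch of that rule; the equipment states and mass values are invented for illustration and are not from the paper.

```python
def dempster_combine(m1, m2):
    """Combine two mass functions (dicts: frozenset of hypotheses -> mass)
    with Dempster's rule; the conflict mass is renormalized away."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    return {h: v / (1.0 - conflict) for h, v in combined.items()}

# Hypothetical example: two monitors rating an equipment state
m_vib = {frozenset({"normal"}): 0.6,
         frozenset({"degraded", "faulty"}): 0.3,
         frozenset({"normal", "degraded", "faulty"}): 0.1}
m_temp = {frozenset({"normal"}): 0.5,
          frozenset({"degraded"}): 0.4,
          frozenset({"normal", "degraded", "faulty"}): 0.1}
print(dempster_combine(m_vib, m_temp))
```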

  20. Structural dynamic analysis of the Space Shuttle Main Engine

    NASA Technical Reports Server (NTRS)

    Scott, L. P.; Jamison, G. T.; Mccutcheon, W. A.; Price, J. M.

    1981-01-01

    This structural dynamic analysis supports development of the SSME by evaluating components subjected to critical dynamic loads, identifying significant parameters, and evaluating solution methods. Engine operating parameters at both rated and full power levels are considered. Detailed structural dynamic analyses of operationally critical and life-limited components support the assessment of engine design modifications and environmental changes. Engine system test results are used to verify analytic model simulations. The SSME main chamber injector assembly comprises 600 injector elements, called LOX posts. The overall LOX post analysis procedure is shown.

  1. Development of an Innovative Algorithm for Aerodynamics-Structure Interaction Using Lattice Boltzmann Method

    NASA Technical Reports Server (NTRS)

    Mei, Ren-Wei; Shyy, Wei; Yu, Da-Zhi; Luo, Li-Shi; Rudy, David (Technical Monitor)

    2001-01-01

    The lattice Boltzmann equation (LBE) is a kinetic formulation which offers an alternative computational method capable of solving fluid dynamics for various systems. The major advantages of the method are that the solution for the particle distribution functions is explicit, the method is easy to implement, and the algorithm is naturally parallelizable. In this final report, we summarize the work accomplished in the past three years. Since most of the work has been published, the technical details can be found in the literature; a brief summary is provided here. In this project, a second-order accurate treatment of the boundary condition in the LBE method was developed for curved boundaries and tested successfully in various 2-D and 3-D configurations. To evaluate the aerodynamic force on a body in the context of the LBE method, several force evaluation schemes were investigated; a simple momentum exchange method is shown to give reliable and accurate values for the force on a body in both 2-D and 3-D cases. Various 3-D LBE models were assessed in terms of efficiency, accuracy, and robustness. In general, accurate 3-D results can be obtained using LBE methods, and the 3-D 19-bit model was found to be the best among the 15-bit, 19-bit, and 27-bit LBE models. To achieve the desired grid resolution and to accommodate far-field boundary conditions in aerodynamics computations, a multi-block LBE method was developed by dividing the flow field into blocks, each having constant lattice spacing. Substantial contributions to the LBE method were also made through the development of a new, generalized lattice Boltzmann equation constructed in moment space to improve computational stability, detailed theoretical analysis of the stability, dispersion, and dissipation characteristics of the LBE method, and computational studies of high-Reynolds-number flows with singular gradients. Finally, a finite-difference-based lattice Boltzmann method was developed for inviscid compressible flows.
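    The momentum-exchange force evaluation mentioned above has a compact generic form on a D2Q9 lattice. The sketch below assumes simple bounce-back on a stationary body, lattice units, and illustrative array shapes; it shows one common form of the formula, not the report's actual code.

```python
import numpy as np

# D2Q9 lattice: discrete velocities and the index of each opposite direction
E = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
OPP = np.array([0, 3, 4, 1, 2, 7, 8, 5, 6])

def momentum_exchange_force(f, solid):
    """Force on a stationary body in lattice units.

    f     : distributions, shape (9, nx, ny)
    solid : boolean mask of the body, shape (nx, ny)
    Each fluid node with a solid neighbour along direction i contributes
    e_i * (f_i(x) + f_opp(x)) through the bounced-back link.
    """
    force = np.zeros(2)
    fluid = ~solid
    for i in range(1, 9):
        ex, ey = E[i]
        # neighbour_solid[x, y] is True when the node at x + e_i is solid
        neighbour_solid = np.roll(np.roll(solid, -ex, axis=0), -ey, axis=1)
        links = fluid & neighbour_solid
        force += E[i] * (f[i][links] + f[OPP[i]][links]).sum()
    return force
```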

  2. Imaging regional renal function parameters using radionuclide tracers

    NASA Astrophysics Data System (ADS)

    Qiao, Yi

    A compartmental model is given for evaluating kidney function accurately and noninvasively. This model is cast into a parallel multi-compartment structure, and each pixel region (picture element) of the kidneys is considered a single kidney compartment. The loss of radionuclide tracers from the blood to the kidney and from the kidney to the bladder is modelled in great detail. Both the uptake function and the excretion function of the kidneys can be evaluated pixel by pixel, and regional diagnostic information on renal function is obtained. Gamma camera image data are required by this model, and a screening-test-based renal function measurement is provided. The regional blood background is subtracted from the kidney region of interest (ROI), and the kidney regional rate constants are estimated analytically using the Kuhn-Tucker multiplier method in convex programming by considering the input/output behavior of the kidney compartments. The detailed physiological model of the peripheral compartments of the system, which is not available for most radionuclide tracers, is not required in the determination of the kidney regional rate constants and the regional blood background factors within the kidney ROI. Moreover, the statistical significance of measurements is considered to assure the improved statistical properties of the estimated kidney rate constants. The relations between various renal function parameters and the kidney rate constants are established, and multiple renal function measurements can be found from the renal compartmental model. The blood radioactivity curve and the regional (or total) radiorenogram, determining the regional (or total) summed behavior of the kidneys, are obtained analytically, with consideration of the statistical significance of measurements, using convex programming methods for a single peripheral compartment system. In addition, a new technique for the determination of 'initial conditions' in both the blood compartment and the kidney compartment is presented. The blood curve and the radiorenogram are analyzed in great detail, and a physiological analysis from the radiorenogram is given. Applications of Kuhn-Tucker multiplier methods are illustrated for the renal compartmental model in the field of nuclear medicine. Conventional kinetic data analysis methods, the maximum likelihood method, and the weighted integration method are investigated and used for comparisons. Moreover, the effect of the blood background subtraction is shown using gamma camera images in man. Several functional images are calculated, and the functional imaging technique is applied to evaluate renal function in man quantitatively and visually, and compared with a physician's assessment.
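    For intuition, a minimal single-compartment version of such a kidney model can be integrated directly. The rate constants and the mono-exponential blood curve below are invented for illustration; the thesis's pixel-wise model, background subtraction, and Kuhn-Tucker estimation are not reproduced here.

```python
import numpy as np
from scipy.integrate import solve_ivp

def kidney_rhs(t, q, k_in, k_out, blood):
    """dq/dt = uptake from blood minus excretion toward the bladder."""
    return [k_in * blood(t) - k_out * q[0]]

blood = lambda t: np.exp(-0.1 * t)       # assumed mono-exponential blood curve
sol = solve_ivp(kidney_rhs, (0.0, 60.0), [0.0],
                args=(0.05, 0.02, blood), dense_output=True)
t = np.linspace(0.0, 60.0, 121)
renogram = sol.sol(t)[0]                 # regional time-activity curve
print(renogram.max(), t[renogram.argmax()])   # peak activity and time-to-peak
```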

  3. A hyperspectral image optimizing method based on sub-pixel MTF analysis

    NASA Astrophysics Data System (ADS)

    Wang, Yun; Li, Kai; Wang, Jinqiang; Zhu, Yajie

    2015-04-01

    Hyperspectral imaging collects tens or hundreds of images continuously divided across the electromagnetic spectrum, so that details under different wavelengths can be represented. A popular hyperspectral imaging method uses a tunable optical band-pass filter placed in front of the focal plane to acquire images at different wavelengths. To alleviate the influence of chromatic aberration in some segments of a hyperspectral series, this paper provides a hyperspectral optimizing method that uses the sub-pixel MTF to evaluate image blurring. The method acquires the edge feature in the target window by means of the line spread function (LSF) to calculate a reliable position for the edge feature; the evaluation grid in each line is then interpolated from the real pixel values based on its position relative to the optimal edge, and the sub-pixel MTF is used to analyze the image in the frequency domain, by which the MTF calculation dimension is increased. The sub-pixel MTF evaluation is reliable, since no image rotation or pixel-value estimation is needed, and no artificial information is introduced. Theoretical analysis shows that the proposed method is reliable and efficient when evaluating common real-scene images with edges of small tilt angle. It also provides a direction for subsequent hyperspectral image blurring evaluation and real-time focal-plane adjustment in related imaging systems.
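    A sketch of the ESF-to-MTF chain the abstract describes, in the spirit of the standard slanted-edge procedure: fit the edge position per row at sub-pixel precision, bin an oversampled edge-spread function along the distance-to-edge axis, differentiate to the LSF, and take the FFT. The gradient-centroid edge fit, Hann window, and bin width are our assumptions, not the paper's exact algorithm.

```python
import numpy as np

def slanted_edge_mtf(roi, oversample=4):
    """Sub-pixel MTF from a near-vertical edge patch (slanted-edge sketch).
    Assumes the edge is slanted enough that every oversampled bin is hit."""
    rows, cols = roi.shape
    x = np.arange(cols)
    # per-row edge location = centroid of the gradient magnitude (sub-pixel)
    grad = np.abs(np.diff(roi.astype(float), axis=1))
    edge = (grad * (x[:-1] + 0.5)).sum(axis=1) / grad.sum(axis=1)
    # least-squares line through the per-row edge positions
    slope, intercept = np.polyfit(np.arange(rows), edge, 1)
    # signed distance of every pixel from the fitted edge, in pixels
    dist = x[None, :] - (slope * np.arange(rows)[:, None] + intercept)
    # bin into an oversampled edge-spread function
    bins = np.round(dist * oversample).astype(int)
    bins -= bins.min()
    esf = np.bincount(bins.ravel(), weights=roi.ravel().astype(float))
    esf /= np.bincount(bins.ravel())
    lsf = np.diff(esf) * np.hanning(len(esf) - 1)   # window to limit noise
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]
    freqs = np.fft.rfftfreq(len(lsf), d=1.0 / oversample)  # cycles/pixel
    return freqs, mtf
```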

  4. Comparing Anisotropic Output-Based Grid Adaptation Methods by Decomposition

    NASA Technical Reports Server (NTRS)

    Park, Michael A.; Loseille, Adrien; Krakos, Joshua A.; Michal, Todd

    2015-01-01

    Anisotropic grid adaptation is examined by decomposing the steps of flow solution, adjoint solution, error estimation, metric construction, and simplex grid adaptation. Multiple implementations of each of these steps are evaluated by comparison to each other and to expected analytic results when available. For example, grids are adapted to analytic metric fields and grid measures are computed to illustrate the properties of multiple independent implementations of grid adaptation mechanics. Different implementations of each step in the adaptation process can be evaluated in a system where the other components of the adaptive cycle are fixed. Detailed examination of these properties allows comparison of different methods to identify the current state of the art and where further development should be targeted.

  5. Into the decomposed body-forensic digital autopsy using multislice-computed tomography.

    PubMed

    Thali, M J; Yen, K; Schweitzer, W; Vock, P; Ozdoba, C; Dirnhofer, R

    2003-07-08

    It is impossible to obtain representative anatomical documentation of an entire body using classical X-ray methods, since they collapse three-dimensional bodies onto a two-dimensional plane. We used the novel multislice computed tomography (MSCT) technique to evaluate a case of homicide, with putrefaction of the corpse, before performing a classical forensic autopsy. This non-invasive method showed the gaseous distension of the decomposing organs and tissues in detail, as well as a complex fracture of the calvarium. MSCT also proved useful in screening for foreign matter in decomposing bodies, and full-body scanning took only a few minutes. In conclusion, we believe postmortem MSCT imaging is an excellent visualization tool with great potential for forensic documentation and evaluation of decomposed bodies.

  6. On Applying the Prognostic Performance Metrics

    NASA Technical Reports Server (NTRS)

    Saxena, Abhinav; Celaya, Jose; Saha, Bhaskar; Saha, Sankalita; Goebel, Kai

    2009-01-01

    Prognostics performance evaluation has gained significant attention in the past few years. As prognostics technology matures and more sophisticated methods for prognostic uncertainty management are developed, a standardized methodology for performance evaluation becomes extremely important to guide improvement efforts in a constructive manner. This paper continues previous efforts in which several new evaluation metrics tailored for prognostics were introduced and shown to evaluate various algorithms more effectively than conventional metrics. Specifically, this paper presents a detailed discussion of how these metrics should be interpreted and used. Several shortcomings identified while applying these metrics to a variety of real applications are also summarized, along with discussions that attempt to alleviate these problems. Further, these metrics have been enhanced to include the capability of incorporating probability distribution information from prognostic algorithms, as opposed to evaluation based on point estimates only. Several methods have been suggested, and guidelines have been provided to help choose one method over another based on probability distribution characteristics. These approaches also offer a convenient and intuitive visualization of algorithm performance with respect to some of these new metrics, such as the prognostic horizon and alpha-lambda performance, and quantify the corresponding performance while incorporating the uncertainty information.
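    As a concrete example of one of these metrics, the sketch below implements a point-estimate form of the alpha-lambda test; the paper's enhanced versions score full probability distributions instead, and the names and default values here are illustrative.

```python
import numpy as np

def alpha_lambda_pass(times, rul_preds, t_start, eol, alpha=0.2, lam=0.5):
    """True if the RUL prediction made at t_lambda lies within +/- alpha
    of the true remaining useful life (RUL) at that time."""
    times = np.asarray(times, dtype=float)
    t_lambda = t_start + lam * (eol - t_start)
    i = int(np.argmin(np.abs(times - t_lambda)))   # nearest prediction epoch
    true_rul = eol - times[i]
    return abs(rul_preds[i] - true_rul) <= alpha * true_rul

# Hypothetical run: predictions every 10 h on a unit that fails at t = 100 h
times = [10, 20, 30, 40, 50, 60]
rul_preds = [95, 82, 73, 55, 52, 38]
print(alpha_lambda_pass(times, rul_preds, t_start=10, eol=100))  # True
```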

  7. Sum of top-hat transform based algorithm for vessel enhancement in MRA images

    NASA Astrophysics Data System (ADS)

    Ouazaa, Hibet-Allah; Jlassi, Hajer; Hamrouni, Kamel

    2018-04-01

    Magnetic resonance angiography (MRA) images are rich in information but suffer from poor contrast, uneven illumination and noise. The images therefore need enhancement, yet significant information can be lost if improper techniques are applied. In this paper, we propose a new enhancement method: we first apply the CLAHE method to increase the contrast of the image, then apply the sum of top-hat transforms, performed with a structuring element oriented at different angles, to increase the brightness of the vessels. The methodology is tested and evaluated on the publicly available BRAINIX database, using the MSE (mean square error), PSNR (peak signal-to-noise ratio) and SNR (signal-to-noise ratio) measures for the evaluation. The results demonstrate that the proposed method efficiently enhances image details and is comparable with state-of-the-art algorithms; hence, it could be broadly used in various applications.
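    A sketch of the oriented sum-of-top-hats step (the CLAHE pre-step could be, e.g., skimage.exposure.equalize_adapthist). The line-kernel construction and the angle count are our assumptions, not the paper's exact parameters.

```python
import numpy as np
from scipy import ndimage

def line_kernel(length, angle_deg):
    """Binary line structuring element of a given (odd) length and angle."""
    assert length % 2 == 1, "use an odd length so the kernel is centred"
    k = np.zeros((length, length), dtype=bool)
    c = length // 2
    t = np.deg2rad(angle_deg)
    for r in np.linspace(-c, c, 2 * length):
        k[int(round(c + r * np.sin(t))), int(round(c + r * np.cos(t)))] = True
    return k

def sum_of_tophats(img, length=15, n_angles=12):
    """Sum of white top-hat transforms over rotated line elements,
    brightening thin curvilinear structures (vessels) in every direction."""
    out = np.zeros(img.shape, dtype=float)
    for ang in np.linspace(0, 180, n_angles, endpoint=False):
        opened = ndimage.grey_opening(img, footprint=line_kernel(length, ang))
        out += img.astype(float) - opened   # white top-hat = img - opening
    return out
```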

  8. Spatio-Temporal Super-Resolution Reconstruction of Remote-Sensing Images Based on Adaptive Multi-Scale Detail Enhancement

    PubMed Central

    Zhu, Hong; Tang, Xinming; Xie, Junfeng; Song, Weidong; Mo, Fan; Gao, Xiaoming

    2018-01-01

    There are many problems in existing reconstruction-based super-resolution algorithms, such as the lack of texture-feature representation and of high-frequency details. Multi-scale detail enhancement can produce more texture information and high-frequency information. Therefore, super-resolution reconstruction of remote-sensing images based on adaptive multi-scale detail enhancement (AMDE-SR) is proposed in this paper. First, the information entropy of each remote-sensing image is calculated, and the image with the maximum entropy value is regarded as the reference image. Subsequently, spatio-temporal remote-sensing images are processed using phase normalization, which reduces the time-phase difference of the image data and enhances the complementarity of information. The multi-scale image information is then decomposed using the L0 gradient minimization model, and the non-redundant information is processed by difference calculation and by expanding the non-redundant layers and the redundant layer with the iterative back-projection (IBP) technique. The different-scale non-redundant information is adaptively weighted and fused using cross-entropy. Finally, a nonlinear texture-detail-enhancement function is built to enhance small details, and the peak signal-to-noise ratio (PSNR) is used as an iterative constraint. Ultimately, high-resolution remote-sensing images with abundant texture information are obtained by iterative optimization. Real results show an average gain in entropy of up to 0.42 dB for an up-scaling of 2 and a significant gain in the enhancement-measure evaluation for an up-scaling of 2. The experimental results show that the performance of the AMDE-SR method is better than that of existing super-resolution reconstruction methods in terms of visual and accuracy improvements. PMID:29414893

  9. Spatio-Temporal Super-Resolution Reconstruction of Remote-Sensing Images Based on Adaptive Multi-Scale Detail Enhancement.

    PubMed

    Zhu, Hong; Tang, Xinming; Xie, Junfeng; Song, Weidong; Mo, Fan; Gao, Xiaoming

    2018-02-07

    There are many problems in existing reconstruction-based super-resolution algorithms, such as the lack of texture-feature representation and of high-frequency details. Multi-scale detail enhancement can produce more texture information and high-frequency information. Therefore, super-resolution reconstruction of remote-sensing images based on adaptive multi-scale detail enhancement (AMDE-SR) is proposed in this paper. First, the information entropy of each remote-sensing image is calculated, and the image with the maximum entropy value is regarded as the reference image. Subsequently, spatio-temporal remote-sensing images are processed using phase normalization, which reduces the time-phase difference of the image data and enhances the complementarity of information. The multi-scale image information is then decomposed using the L0 gradient minimization model, and the non-redundant information is processed by difference calculation and by expanding the non-redundant layers and the redundant layer with the iterative back-projection (IBP) technique. The different-scale non-redundant information is adaptively weighted and fused using cross-entropy. Finally, a nonlinear texture-detail-enhancement function is built to enhance small details, and the peak signal-to-noise ratio (PSNR) is used as an iterative constraint. Ultimately, high-resolution remote-sensing images with abundant texture information are obtained by iterative optimization. Real results show an average gain in entropy of up to 0.42 dB for an up-scaling of 2 and a significant gain in the enhancement-measure evaluation for an up-scaling of 2. The experimental results show that the performance of the AMDE-SR method is better than that of existing super-resolution reconstruction methods in terms of visual and accuracy improvements.
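    The first step (choosing the maximum-entropy frame as the reference image) is easy to make concrete. Below is a generic sketch assuming 8-bit imagery; the function names are ours.

```python
import numpy as np

def image_entropy(img, levels=256):
    """Shannon entropy (bits) of an image's grey-level histogram."""
    hist, _ = np.histogram(img, bins=levels, range=(0, levels))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def pick_reference(images):
    """Reference frame for the pipeline above: the maximum-entropy input."""
    return max(images, key=image_entropy)
```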

  10. Evaluation of health promotion in schools: a realistic evaluation approach using mixed methods

    PubMed Central

    2010-01-01

    Background Schools are key settings for health promotion (HP) but the development of suitable approaches for evaluating HP in schools is still a major topic of discussion. This article presents a research protocol of a program developed to evaluate HP. After reviewing HP evaluation issues, the various possible approaches are analyzed and the importance of a realistic evaluation framework and a mixed methods (MM) design are demonstrated. Methods/Design The design is based on a systemic approach to evaluation, taking into account the mechanisms, context and outcomes, as defined in realistic evaluation, adjusted to our own French context using an MM approach. The characteristics of the design are illustrated through the evaluation of a nationwide HP program in French primary schools designed to enhance children's social, emotional and physical health by improving teachers' HP practices and promoting a healthy school environment. An embedded MM design is used in which a qualitative data set plays a supportive, secondary role in a study based primarily on a different quantitative data set. The way the qualitative and quantitative approaches are combined through the entire evaluation framework is detailed. Discussion This study is a contribution towards the development of suitable approaches for evaluating HP programs in schools. The systemic approach of the evaluation carried out in this research is appropriate since it takes account of the limitations of traditional evaluation approaches and considers suggestions made by the HP research community. PMID:20109202

  11. Applying micro-costing methods to estimate the costs of pharmacy interventions: an illustration using multi-professional clinical medication reviews in care homes for older people.

    PubMed

    Sach, Tracey H; Desborough, James; Houghton, Julie; Holland, Richard

    2014-11-06

    Economic methods are underutilised within pharmacy research, resulting in a lack of quality evidence to support funding decisions for pharmacy interventions. The aim of this study is to illustrate the methods of micro-costing within the pharmacy context in order to raise awareness and use of this approach in pharmacy research. Micro-costing methods are particularly useful where a new service or intervention is being evaluated and for which no previous estimates of the costs of providing the service exist. This paper describes the rationale for undertaking a micro-costing study before detailing and illustrating the process involved. The illustration relates to a recently completed trial of multi-professional medication reviews as an intervention provided in care homes. All costs are presented in UK£2012. In general, costing methods involve three broad steps (identification, measurement and valuation); when using micro-costing, closer attention to detail is required within all three stages of this process. The mean (standard deviation; 95% confidence interval (CI)) cost per resident of the multi-professional medication review intervention was £104.80 (50.91; 98.72 to 109.45), such that the overall cost of providing the intervention to all intervention home residents was £36,221.29 (95% CI, £32,810.81 to £39,631.77). This study has demonstrated that micro-costing can be a useful method, not only for estimating the cost of a pharmacy intervention to feed into a pharmacy economic evaluation, but also as a source of information to help inform those designing pharmacy services about the potential time and costs involved in delivering such services. © 2014 Royal Pharmaceutical Society.
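    To make the identification-measurement-valuation logic concrete, a toy sketch: per-resident cost equals staff minutes valued at unit wages plus consumables, summarized with a normal-approximation 95% CI. All names, rates, and figures below are invented for illustration; they are not the study's unit costs.

```python
import numpy as np

def micro_cost_summary(staff_minutes, wage_per_hour, consumables):
    """Mean per-resident cost, SD, and normal-approximation 95% CI."""
    costs = (np.asarray(staff_minutes, dtype=float) / 60.0 * wage_per_hour
             + np.asarray(consumables, dtype=float))
    mean, sd = costs.mean(), costs.std(ddof=1)
    half = 1.96 * sd / np.sqrt(costs.size)
    return mean, sd, (mean - half, mean + half)

# Hypothetical: pharmacist review time valued at 50 GBP/h plus materials
mins = [90, 120, 75, 150, 110]
mats = [12.0, 15.5, 9.0, 18.0, 11.0]
print(micro_cost_summary(mins, wage_per_hour=50.0, consumables=mats))
```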

  12. HDlive rendering images of the fetal stomach: a preliminary report.

    PubMed

    Inubashiri, Eisuke; Abe, Kiyotaka; Watanabe, Yukio; Akutagawa, Noriyuki; Kuroki, Katumaru; Sugawara, Masaki; Maeda, Nobuhiko; Minami, Kunihiro; Nomura, Yasuhiro

    2015-01-01

    This study aimed to show reconstruction of the fetal stomach using the HDlive rendering mode in ultrasound. Seventeen healthy singleton fetuses at 18-34 weeks' gestational age were observed using the HDlive rendering mode of ultrasound in utero. In all of the fetuses, we identified specific spatial structures, including macroscopic anatomical features (e.g., the pylorus, cardia, fundus, and greater curvature) of the fetal stomach, using the HDlive rendering mode. In particular, HDlive rendering images showed remarkably fine details that appeared as if they were being viewed under an endoscope, with visible rugal folds after 27 weeks' gestational age. Our study suggests that the HDlive rendering mode can be used as an additional method for evaluating the fetal stomach. The HDlive rendering mode shows detailed 3D structural images and anatomically realistic images of the fetal stomach. This technique may be effective in prenatal diagnosis for examining detailed information of fetal organs.

  13. Exploring Architectural Details Through a Wearable Egocentric Vision Device

    PubMed Central

    Alletto, Stefano; Abati, Davide; Serra, Giuseppe; Cucchiara, Rita

    2016-01-01

    Augmented user experiences in the cultural heritage domain are in increasing demand by the new digital native tourists of the 21st century. In this paper, we propose a novel solution that aims at assisting the visitor during an outdoor tour of a cultural site using the unique first person perspective of wearable cameras. In particular, the approach exploits computer vision techniques to retrieve the details by proposing a robust descriptor based on the covariance of local features. Using a lightweight wearable board, the solution can localize the user with respect to the 3D point cloud of the historical landmark and provide him with information about the details at which he is currently looking. Experimental results validate the method both in terms of accuracy and computational effort. Furthermore, user evaluation based on real-world experiments shows that the proposal is deemed effective in enriching a cultural experience. PMID:26901197

  14. Exploring Architectural Details Through a Wearable Egocentric Vision Device.

    PubMed

    Alletto, Stefano; Abati, Davide; Serra, Giuseppe; Cucchiara, Rita

    2016-02-17

    Augmented user experiences in the cultural heritage domain are in increasing demand by the new digital native tourists of the 21st century. In this paper, we propose a novel solution that aims at assisting the visitor during an outdoor tour of a cultural site using the unique first person perspective of wearable cameras. In particular, the approach exploits computer vision techniques to retrieve the details by proposing a robust descriptor based on the covariance of local features. Using a lightweight wearable board, the solution can localize the user with respect to the 3D point cloud of the historical landmark and provide him with information about the details at which he is currently looking. Experimental results validate the method both in terms of accuracy and computational effort. Furthermore, user evaluation based on real-world experiments shows that the proposal is deemed effective in enriching a cultural experience.
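    A generic sketch of a covariance-of-local-features descriptor of the kind this work builds on. The feature channels (position, intensity, gradient magnitudes) and the log-Euclidean distance are common choices we assume here, not the authors' exact design.

```python
import numpy as np
from scipy.linalg import logm

def covariance_descriptor(patch):
    """Region covariance of per-pixel features (x, y, intensity, |Ix|, |Iy|)."""
    h, w = patch.shape
    yy, xx = np.mgrid[0:h, 0:w]
    iy, ix = np.gradient(patch.astype(float))
    feats = np.stack([xx.ravel(), yy.ravel(), patch.ravel().astype(float),
                      np.abs(ix).ravel(), np.abs(iy).ravel()])
    return np.cov(feats)   # 5 x 5 symmetric positive semi-definite matrix

def cov_distance(c1, c2, eps=1e-6):
    """Log-Euclidean distance between covariance descriptors; a small
    ridge keeps the matrices strictly positive definite before logm."""
    I = np.eye(c1.shape[0])
    return np.linalg.norm(logm(c1 + eps * I) - logm(c2 + eps * I), "fro")
```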

  15. Best practices for evaluating single nucleotide variant calling methods for microbial genomics

    PubMed Central

    Olson, Nathan D.; Lund, Steven P.; Colman, Rebecca E.; Foster, Jeffrey T.; Sahl, Jason W.; Schupp, James M.; Keim, Paul; Morrow, Jayne B.; Salit, Marc L.; Zook, Justin M.

    2015-01-01

    Innovations in sequencing technologies have allowed biologists to make incredible advances in understanding biological systems. As experience grows, researchers increasingly recognize that analyzing the wealth of data provided by these new sequencing platforms requires careful attention to detail for robust results. Thus far, much of the scientific community's focus in bacterial genomics has been on evaluating genome assembly algorithms and rigorously validating assembly program performance. Missing, however, is a focus on critical evaluation of variant callers for these genomes. Variant calling is essential for comparative genomics, as it yields insights into nucleotide-level organismal differences. Variant calling is a multistep process with a host of potential error sources that may lead to incorrect variant calls. Identifying and resolving these incorrect calls is critical for bacterial genomics to advance. The goal of this review is to provide guidance on validating algorithms and pipelines used in variant calling for bacterial genomics. First, we will provide an overview of the variant calling procedures and the potential sources of error associated with the methods. We will then identify appropriate datasets for use in evaluating algorithms and describe statistical methods for evaluating algorithm performance. As variant calling moves from basic research to the applied setting, standardized methods for performance evaluation and reporting are required; it is our hope that this review provides the groundwork for the development of these standards. PMID:26217378
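    The statistical evaluation the review calls for typically reduces to comparing a call set against a truth set. A minimal sketch, with variants keyed by (chromosome, position, ref, alt); the function name and toy data are ours.

```python
def variant_caller_metrics(called, truth):
    """Precision, recall, and F1 of a variant call set against a truth set."""
    called, truth = set(called), set(truth)
    tp = len(called & truth)                       # true-positive calls
    precision = tp / len(called) if called else 0.0
    recall = tp / len(truth) if truth else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

calls = {("chr1", 101, "A", "G"), ("chr1", 250, "T", "C")}
truth = {("chr1", 101, "A", "G"), ("chr1", 500, "G", "T")}
print(variant_caller_metrics(calls, truth))  # (0.5, 0.5, 0.5)
```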

  16. A review of economic evaluations of behavior change interventions: setting an agenda for research methods and practice.

    PubMed

    Alayli-Goebbels, Adrienne F G; Evers, Silvia M A A; Alexeeva, Daria; Ament, André J H A; de Vries, Nanne K; Tilly, Jan C; Severens, Johan L

    2014-06-01

    The objective of this study was to review the methodological quality of economic evaluations of lifestyle behavior change interventions (LBCIs) and to examine how they address methodological challenges for public health economic evaluation identified in the literature. PubMed and the NHS Economic Evaluation Database were searched for published studies in six key areas for behavior change: smoking, physical activity, dietary behavior, (illegal) drug use, alcohol use and sexual behavior. From the included studies (n = 142), we extracted data on general study characteristics, characteristics of the LBCIs, methodological quality and handling of methodological challenges. Economic evaluation evidence for LBCIs showed a number of weaknesses: methods, study design and characteristics of evaluated interventions were not well reported; methodological quality showed several shortcomings; and progress with addressing methodological challenges remained limited. Based on the findings of this review, we propose an agenda for improving future evidence to support decision-making. Recommendations for practice include improving reporting of essential study details and increasing adherence to good practice standards. Recommendations for research methods focus on mapping out complex causal pathways for modeling, developing measures to capture broader domains of wellbeing and community outcomes, testing methods for considering equity, identifying relevant non-health-sector costs and advancing methods for evidence synthesis. © The Author 2013. Published by Oxford University Press on behalf of Faculty of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  17. Application research for 4D technology in flood forecasting and evaluation

    NASA Astrophysics Data System (ADS)

    Li, Ziwei; Liu, Yutong; Cao, Hongjie

    1998-08-01

    To monitor regions of China where disastrous floods happen frequently, satisfy the strong need of provincial governments for high-accuracy monitoring and evaluation data, and improve the efficiency of disaster relief, a method for flood forecasting and evaluation using satellite and aerial remote-sensing images and ground monitoring data was researched under the Ninth Five-Year National Key Technologies Programme. An effective and practicable flood forecasting and evaluation system was established, with DongTing Lake selected as the test site. Modern digital photogrammetry, remote sensing and GIS technology are used in this system, and disastrous floods can be forecast and losses evaluated on the basis of a '4D' disaster background database (DEM, digital elevation model; DOQ, digital orthophoto quads; DRG, digital raster graph; DTI, digital thematic information). The technology for gathering and establishing the '4D' disaster environment background database, the application of that database to flood forecasting and evaluation, and experimental results for the DongTing Lake test site are presented in detail in this paper.

  18. A synchrotron-based local computed tomography combined with data-constrained modelling approach for quantitative analysis of anthracite coal microstructure

    PubMed Central

    Chen, Wen Hao; Yang, Sam Y. S.; Xiao, Ti Qiao; Mayo, Sherry C.; Wang, Yu Dan; Wang, Hai Peng

    2014-01-01

    Quantifying three-dimensional spatial distributions of pores and material compositions in samples is a key materials characterization challenge, particularly in samples where compositions are distributed across a range of length scales, and where such compositions have similar X-ray absorption properties, such as in coal. Consequently, obtaining detailed information within sub-regions of a multi-length-scale sample by conventional approaches may not provide the resolution and level of detail one might desire. Herein, an approach for quantitative high-definition determination of material compositions from X-ray local computed tomography combined with a data-constrained modelling method is proposed. The approach can dramatically improve spatial resolution, revealing finer details within a region of interest of a sample larger than the field of view than is possible with conventional techniques. A coal sample containing distributions of porosity and several mineral compositions is employed to demonstrate the approach. The optimal experimental parameters are pre-analyzed. The quantitative results demonstrate that the approach can reveal significantly finer details of compositional distributions in the sample region of interest. The elevated spatial resolution is crucial for coal-bed methane reservoir evaluation and for understanding the transformation of the minerals during coal processing. The method is generic and can be applied to three-dimensional compositional characterization of other materials. PMID:24763649

  19. Standardized Methods to Generate Mock (Spiked) Clinical Specimens by Spiking Blood or Plasma with Cultured Pathogens

    PubMed Central

    Dong, Ming; Fisher, Carolyn; Añez, Germán; Rios, Maria; Nakhasi, Hira L.; Hobson, J. Peyton; Beanan, Maureen; Hockman, Donna; Grigorenko, Elena; Duncan, Robert

    2016-01-01

    Aims: To demonstrate standardized methods for spiking pathogens into human matrices for evaluation and comparison among diagnostic platforms. Methods and Results: This study presents detailed methods for spiking bacteria or protozoan parasites into whole blood and virus into plasma. Proper methods must start with a documented, reproducible pathogen source, followed by steps that include standardized culture, preparation of cryopreserved aliquots, quantification of the aliquots by molecular methods, production of sufficient numbers of individual specimens, and testing of the platform with multiple mock specimens. Results are presented following the described procedures that showed acceptable reproducibility comparing in-house real-time PCR assays to a commercially available multiplex molecular assay. Conclusions: A step-by-step procedure has been described that can be followed by assay developers who are targeting low-prevalence pathogens. Significance and Impact of Study: The development of diagnostic platforms for detection of low-prevalence pathogens such as biothreat or emerging agents is challenged by the lack of clinical specimens for performance evaluation. This deficit can be overcome using mock clinical specimens made by spiking cultured pathogens into human matrices. To facilitate evaluation and comparison among platforms, standardized methods must be followed in the preparation and application of spiked specimens. PMID:26835651
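    The dilution arithmetic behind preparing a spiked specimen from a quantified cryopreserved aliquot is simple; the sketch below computes the stock volume needed to hit a target concentration. The function name and the negligible-added-volume simplification are ours, not the study's protocol.

```python
def spike_volume_ul(stock_copies_per_ml, target_copies_per_ml, matrix_volume_ml):
    """Microlitres of quantified stock to add to a matrix aliquot to reach
    the target concentration (assumes the added volume is negligible)."""
    total_copies = target_copies_per_ml * matrix_volume_ml
    return total_copies / stock_copies_per_ml * 1000.0

# e.g. 1e8 copies/mL stock, target 1e4 copies/mL in a 5 mL plasma aliquot
print(spike_volume_ul(1e8, 1e4, 5.0))  # 0.5 uL -> pre-dilute the stock first
```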

  20. Reflectance confocal microscopy of cutaneous melanoma. Correlation with dermoscopy and histopathology

    PubMed Central

    Rstom, Silvia Arroyo; Libório, Lorena Silva; Paschoal, Francisco Macedo

    2015-01-01

    In vivo confocal microscopy is a method for non-invasive, real-time visualization of microscopic structures and cellular details of the epidermis and dermis, with a degree of resolution similar to that obtained with histology. We present a case of cutaneous melanoma in which diagnosis was aided by confocal microscopy examination. We also correlate the observed features with the dermoscopic and histopathological findings. Confocal microscopy proved to be a useful adjunct to dermoscopy, playing an important role as a method 'between clinical evaluation and histopathology'. PMID:26131877

  1. Research on conceptual/innovative design for the life cycle

    NASA Technical Reports Server (NTRS)

    Cagan, Jonathan; Agogino, Alice M.

    1990-01-01

    The goal of this research is to develop and integrate qualitative and quantitative methods for life cycle design. The problem definition includes the following: formal computer-based methods are limited to the final detailing stages of design; CAD data bases do not capture design intent or design history; and life cycle issues are ignored during the early stages of design. Viewgraphs outline research in conceptual design; the SYMON (SYmbolic MONotonicity analyzer) algorithm; a multistart vector quantization optimization algorithm; intelligent manufacturing (IDES - Influence Diagram Architecture); and 1st PRINCE (FIRST PRINciple Computational Evaluator).

  2. Thiol-Ene functionalized siloxanes for use as elastomeric dental impression materials

    PubMed Central

    Cole, Megan A.; Jankousky, Katherine C.; Bowman, Christopher N.

    2014-01-01

    Objectives: Thiol- and allyl-functionalized siloxane oligomers are synthesized and evaluated for use as a radical-mediated, rapid-set elastomeric dental impression material. Thiol-ene siloxane formulations are crosslinked using a redox-initiated polymerization scheme, and the mechanical properties of the thiol-ene network are manipulated through the incorporation of varying degrees of plasticizer and kaolin filler. Formulations with medium and light body consistencies are further evaluated for their ability to accurately replicate features on both the gross and microscopic levels. We hypothesize that thiol-ene functionalized siloxane systems will exhibit faster setting times and greater detail reproduction than commercially available polyvinylsiloxane (PVS) materials of comparable consistencies. Methods: Thiol-ene functionalized siloxane mixtures formulated with varying levels of redox initiators, plasticizer, and kaolin filler are made and evaluated for their polymerization speed (FTIR), consistency (ISO 4823.9.2), and surface energy (goniometer). Feature replication is evaluated quantitatively by SEM. The Tg, storage modulus, and creep behavior are determined by DMA. Results: Increasing the redox initiation rate increases the polymerization rate but at high levels also limits working time. Combining 0.86 wt% oxidizing agent with up to 5 wt% plasticizer gave a working time of 3 min and a setting time of 2 min. The selected medium and light body thiol-ene formulations also achieved greater qualitative detail reproduction than the commercial material and reproduced micrometer-scale patterns with 98% accuracy. Significance: Improving detail reproduction and setting speed is a primary focus of dental impression material design and synthesis. Radical-mediated polymerizations, particularly thiol-ene reactions, are recognized for their speed, reduced shrinkage, and ‘click’ nature. PMID:24553250

  3. Beam shape coefficients calculation for an elliptical Gaussian beam with 1-dimensional quadrature and localized approximation methods

    NASA Astrophysics Data System (ADS)

    Wang, Wei; Shen, Jianqi

    2018-06-01

    The use of a shaped beam for applications relying on light scattering depends heavily on the ability to evaluate the beam shape coefficients (BSCs) efficiently. Numerical techniques for evaluating the BSCs of a shaped beam, such as the quadrature, localized approximation (LA), and integral localized approximation (ILA) methods, have been developed within the framework of generalized Lorenz-Mie theory (GLMT). The quadrature methods usually employ 2- or 3-dimensional integrations. In this work, the expressions for the BSCs of an elliptical Gaussian beam (EGB) are simplified into a 1-dimensional integral so as to speed up the numerical computation. Numerical results of the BSCs are used to reconstruct the beam field, and the fidelity of the reconstructed field to the given beam field is estimated. It is demonstrated that the proposed method is much faster than the 2-dimensional integrations and acquires more accurate results than the LA method. Limitations of the quadrature method and the LA method in numerical calculation are analyzed in detail.
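
    For illustration, the following is a minimal Python sketch of the kind of 1-D Gauss-Legendre quadrature such a reduction enables; the integrand here is a stand-in oscillatory kernel, not the actual BSC integrand from the GLMT formulation.

        import numpy as np

        def quad_1d(f, a, b, n=64):
            """Integrate f over [a, b] with n-point Gauss-Legendre quadrature."""
            x, w = np.polynomial.legendre.leggauss(n)  # nodes/weights on [-1, 1]
            t = 0.5 * (b - a) * x + 0.5 * (b + a)      # map nodes to [a, b]
            return 0.5 * (b - a) * np.sum(w * f(t))

        # Stand-in integrand loosely resembling the oscillatory angular
        # integrals met in beam-shape-coefficient evaluation.
        f = lambda theta: np.sin(theta) * np.exp(-4.0 * np.sin(theta) ** 2) * np.cos(6 * theta)
        print(quad_1d(f, 0.0, np.pi))

    A single 1-D rule of modest order replaces a full 2-D tensor grid of evaluation points, which is where the reported speed-up comes from.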

  4. Infrared and visual image fusion method based on discrete cosine transform and local spatial frequency in discrete stationary wavelet transform domain

    NASA Astrophysics Data System (ADS)

    Jin, Xin; Jiang, Qian; Yao, Shaowen; Zhou, Dongming; Nie, Rencan; Lee, Shin-Jye; He, Kangjian

    2018-01-01

    In order to improve the performance of infrared and visual image fusion and provide better visual effects, this paper proposes a hybrid fusion method for infrared and visual images that combines the discrete stationary wavelet transform (DSWT), the discrete cosine transform (DCT), and local spatial frequency (LSF). The proposed method has three key processing steps. Firstly, DSWT is employed to decompose the important features of the source image into a series of sub-images with different levels and spatial frequencies. Secondly, DCT is used to separate the significant details of the sub-images according to the energy of different frequencies. Thirdly, LSF is applied to enhance the regional features of the DCT coefficients, which aids image feature extraction. Several frequently used image fusion methods and evaluation metrics are employed to assess the validity of the proposed method. The experiments indicate that the proposed method achieves a good fusion effect and is more efficient than other conventional image fusion methods.
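
    As a rough illustration of the third step, the Python sketch below computes a local spatial frequency map (the classic row/column-gradient definition applied over a sliding window) and uses it as a choose-max fusion rule; the DSWT and DCT stages are omitted, and the window size is an assumed parameter.

        import numpy as np
        from scipy.ndimage import uniform_filter

        def local_spatial_frequency(img, size=3):
            """sqrt(local mean row-gradient^2 + local mean column-gradient^2)
            over a size x size window (the classic SF definition, made local)."""
            img = img.astype(float)
            rf2 = np.zeros_like(img)
            cf2 = np.zeros_like(img)
            rf2[:, 1:] = (img[:, 1:] - img[:, :-1]) ** 2   # row-frequency term
            cf2[1:, :] = (img[1:, :] - img[:-1, :]) ** 2   # column-frequency term
            return np.sqrt(uniform_filter(rf2, size) + uniform_filter(cf2, size))

        def fuse_by_lsf(a, b, size=3):
            """Keep, at each position, the input whose neighbourhood is busier."""
            mask = local_spatial_frequency(a, size) >= local_spatial_frequency(b, size)
            return np.where(mask, a, b)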

  5. Multiresolution generalized N dimension PCA for ultrasound image denoising

    PubMed Central

    2014-01-01

    Background: Ultrasound images are usually affected by speckle noise, which is a type of random multiplicative noise. Thus, reducing speckle and improving image visual quality are vital to obtaining better diagnoses. Method: In this paper, a novel noise reduction method for medical ultrasound images, called multiresolution generalized N dimension PCA (MR-GND-PCA), is presented. In this method, the Gaussian pyramid and multiscale image stacks on each level are built first. GND-PCA, a multilinear subspace learning method, is used for denoising. The levels are then combined to achieve the final denoised image based on Laplacian pyramids. Results: The proposed method is tested with synthetically speckled and real ultrasound images, and quality evaluation metrics, including MSE, SNR, and PSNR, are used to evaluate its performance. Conclusion: Experimental results show that the proposed method achieved the lowest noise interference and improved image quality by reducing noise and preserving structure. The method is also robust for images with much higher levels of speckle noise. For clinical images, the results show that MR-GND-PCA can reduce speckle and preserve resolvable details. PMID:25096917
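
    The evaluation metrics named above have standard definitions; a minimal Python sketch follows, assuming 8-bit images for the PSNR peak value.

        import numpy as np

        def mse(ref, img):
            """Mean squared error between a reference and a denoised image."""
            return np.mean((ref.astype(float) - img.astype(float)) ** 2)

        def snr_db(ref, img):
            """Signal-to-noise ratio in dB: signal power over error power."""
            return 10 * np.log10(np.mean(ref.astype(float) ** 2) / mse(ref, img))

        def psnr_db(ref, img, peak=255.0):
            """Peak signal-to-noise ratio in dB (peak assumes 8-bit data)."""
            return 10 * np.log10(peak ** 2 / mse(ref, img))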

  6. Millimeter wave satellite communication studies. Results of the 1981 propagation modeling effort

    NASA Technical Reports Server (NTRS)

    Stutzman, W. L.; Tsolakis, A.; Dishman, W. K.

    1982-01-01

    Theoretical modeling associated with rain effects on millimeter wave propagation is detailed. Three areas of work are discussed. A simple model for prediction of rain attenuation is developed and evaluated. A method for computing scattering from single rain drops is presented. A complete multiple scattering model is described which permits accurate calculation of the effects on dual polarized signals passing through rain.
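
    Simple rain-attenuation models of this kind are typically written as a power law in rain rate; the sketch below uses that generic form with placeholder coefficients and is not the model developed in this report.

        def rain_attenuation_db(rain_rate_mm_h, path_km, k=0.05, alpha=1.1):
            """Power-law specific attenuation integrated over an effective path.
            k and alpha depend on frequency and polarization; the values here
            are placeholders, not coefficients from the 1981 modeling effort."""
            gamma = k * rain_rate_mm_h ** alpha   # specific attenuation, dB/km
            return gamma * path_km                # total path attenuation, dB

        print(rain_attenuation_db(25.0, 5.0))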

  7. Penile enlargement: from medication to surgery.

    PubMed

    Nugteren, Helena M; Balkema, G T; Pascal, A L; Schultz, W C M Weijmar; Nijman, J M; van Driel, M F

    2010-01-01

    Penis lengthening pills, stretch apparatus, vacuum pumps, silicone injections, and lengthening and thickening operations are available for men who worry about their penis size. Surgery is thus far the only proven scientific method for penile enlargement. In this article, we consider patient selection, outcome evaluation, and techniques applied. In our view, sexological counseling and detailed explanation of risks and complications are mandatory before any operative intervention.

  8. Demonstration and Evaluation of Solid Phase Microextraction for the Assessment of Bioavailability and Contaminant Mobility (User’s Manual)

    DTIC Science & Technology

    2012-05-01

    with HPLC and PCBs with GC-ECD. Details of the chemical analysis are not included in this description, but standard methods are referenced. [Table-of-contents fragment: 4.4 Analysis of samples to get the accumulated uptake in the fiber; 4.5 Determination of pore water; 5.5 QC samples for chemical analysis.]

  9. A study for development of aerothermodynamic test model materials and fabrication technique

    NASA Technical Reports Server (NTRS)

    Dean, W. G.; Connor, L. E.

    1972-01-01

    A literature survey, materials reformulation and tailoring, fabrication problems, and materials selection and evaluation for fabricating models to be used with the phase-change technique for obtaining quantitative aerodynamic heat transfer data are presented. The study resulted in the selection of the two best materials: Stycast 2762 FT and an alumina ceramic. Characteristics of these materials and detailed fabrication methods are presented.

  10. Measurement of the Tidal Dissipation in Multiple Stars

    NASA Astrophysics Data System (ADS)

    Tokovinin, Andrei

    2007-08-01

    Considerable effort has been spent to date in measuring the period of tidal circularisation in close binaries as a function of age, in order to constrain the tidal dissipation theory. Here we evaluate a new, direct method of measuring the tidal dissipation by precise timings of periastron passages in a very eccentric binary. The example of the 41 Dra system is studied in some detail.

  11. CO2 laser cutting of MDF. 2. Estimation of power distribution

    NASA Astrophysics Data System (ADS)

    Ng, S. L.; Lum, K. C. P.; Black, I.

    2000-02-01

    Part 2 of this paper details an experimentally-based method to evaluate the power distribution for both CW and PM cutting. Variations in power distribution with different cutting speeds, material thickness and pulse ratios are presented. The paper also provides information on both the cutting efficiency and absorptivity index for MDF, and comments on the beam dispersion characteristics after the cutting process.

  12. Revisions in Natural Gas Monthly Consumption and Price Data, 2004 - 2007

    EIA Publications

    2009-01-01

    This report summarizes the methods by which natural gas consumption data are collected and processed for publication, and details the most notable revisions in natural gas consumption data for the period 2004 to 2007. It is intended to assist data users in evaluating the quality of the monthly consumption and price data for residential, commercial, and industrial consumers of natural gas.

  13. Digital Curation and Digital Literacy: Evaluating the Role of Curation in Developing Critical Literacies for Participation in Digital Culture

    ERIC Educational Resources Information Center

    Mihailidis, Paul

    2015-01-01

    Despite the increased role of digital curation tools and platforms in the daily life of social network users, little research has focused on the competencies and dispositions that young people develop to effectively curate content online. This paper details the results of a mixed method study exploring the curation competencies of young people in…

  14. Fuel quality-processing study. Volume 1: Overview and results

    NASA Technical Reports Server (NTRS)

    Jones, G. E., Jr.

    1982-01-01

    The methods whereby the intermediate results were obtained are outlined, and the evaluation of the feasible paths from liquid fossil fuel sources to generated electricity is presented. The segments from which these paths were built are the results from the fuel upgrading schemes, on-site treatments, and exhaust gas treatments detailed in the subsequent volumes. The salient cost and quality parameters are included.

  15. Probabilistic sensitivity analysis incorporating the bootstrap: an example comparing treatments for the eradication of Helicobacter pylori.

    PubMed

    Pasta, D J; Taylor, J L; Henning, J M

    1999-01-01

    Decision-analytic models are frequently used to evaluate the relative costs and benefits of alternative therapeutic strategies for health care. Various types of sensitivity analysis are used to evaluate the uncertainty inherent in the models. Although probabilistic sensitivity analysis is more difficult theoretically and computationally, the results can be much more powerful and useful than deterministic sensitivity analysis. The authors show how a Monte Carlo simulation can be implemented using standard software to perform a probabilistic sensitivity analysis incorporating the bootstrap. The method is applied to a decision-analytic model evaluating the cost-effectiveness of Helicobacter pylori eradication. The necessary steps are straightforward and are described in detail. The use of the bootstrap avoids certain difficulties encountered with theoretical distributions. The probabilistic sensitivity analysis provided insights into the decision-analytic model beyond the traditional base-case and deterministic sensitivity analyses and should become the standard method for assessing sensitivity.
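
    A minimal Python sketch of the bootstrap step follows; the per-patient cost and effect samples are entirely hypothetical and stand in for the trial data behind the decision-analytic model.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical per-patient (cost, effect) samples for two strategies.
        cost_a, eff_a = rng.normal(900, 150, 200), rng.normal(0.80, 0.10, 200)
        cost_b, eff_b = rng.normal(600, 120, 200), rng.normal(0.72, 0.12, 200)

        def bootstrap_icer(n_rep=5000):
            """Resample patients with replacement; return the ICER distribution."""
            icers = np.empty(n_rep)
            for i in range(n_rep):
                ia = rng.integers(0, len(cost_a), len(cost_a))
                ib = rng.integers(0, len(cost_b), len(cost_b))
                d_cost = cost_a[ia].mean() - cost_b[ib].mean()
                d_eff = eff_a[ia].mean() - eff_b[ib].mean()
                icers[i] = d_cost / d_eff
            return icers

        print(np.percentile(bootstrap_icer(), [2.5, 50, 97.5]))

    The resulting percentile interval is one way to summarize the uncertainty around the base-case cost-effectiveness ratio.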

  16. Program Evaluation: Two Management-Oriented Samples

    ERIC Educational Resources Information Center

    Alford, Kenneth Ray

    2010-01-01

    Two Management-Oriented Samples details two examples of the management-oriented approach to program evaluation. Kenneth Alford, a doctorate candidate at the University of the Cumberlands, details two separate program evaluations conducted in his school district and seeks to compare and contrast the two evaluations based upon the characteristics of…

  17. Assessment of Methodological Quality of Economic Evaluations in Belgian Drug Reimbursement Applications

    PubMed Central

    Simoens, Steven

    2013-01-01

    Objectives: This paper aims to assess the methodological quality of economic evaluations included in Belgian reimbursement applications for Class 1 drugs. Materials and Methods: For 19 reimbursement applications submitted during 2011 and Spring 2012, a descriptive analysis assessed the methodological quality of the economic evaluation, the assessment of that economic evaluation by the Drug Reimbursement Committee, and the response to that assessment by the company. Compliance with methodological guidelines issued by the Belgian Healthcare Knowledge Centre was assessed using a detailed checklist of 23 methodological items. The rate of compliance was calculated based on the number of economic evaluations for which the item was applicable. Results: Economic evaluations tended to comply with guidelines regarding perspective, target population, subgroup analyses, comparator, use of comparative clinical data and final outcome measures, calculation of costs, incremental analysis, discounting and time horizon. However, more attention needs to be paid to the description of limitations of indirect comparisons, the choice of an appropriate analytic technique, the expression of unit costs in values for the current year, the estimation and valuation of outcomes, the presentation of results of sensitivity analyses, and testing the face validity of model inputs and outputs. Also, a large variation was observed in the scope and depth of the quality assessment by the Drug Reimbursement Committee. Conclusions: Although general guidelines exist, pharmaceutical companies and the Drug Reimbursement Committee would benefit from the existence of a more detailed checklist of methodological items that need to be reported in an economic evaluation. PMID:24386474

  18. Evaluation of patient centered medical home practice transformation initiatives.

    PubMed

    Crabtree, Benjamin F; Chase, Sabrina M; Wise, Christopher G; Schiff, Gordon D; Schmidt, Laura A; Goyzueta, Jeanette R; Malouin, Rebecca A; Payne, Susan M C; Quinn, Michael T; Nutting, Paul A; Miller, William L; Jaén, Carlos Roberto

    2011-01-01

    The patient-centered medical home (PCMH) has become a widely cited solution to the deficiencies in primary care delivery in the United States. To achieve the magnitude of change being called for in primary care, quality improvement interventions must focus on whole-system redesign, and not just isolated parts of medical practices. Investigators participating in 9 different evaluations of Patient Centered Medical Home implementation shared experiences, methodological strategies, and evaluation challenges for evaluating primary care practice redesign. A year-long iterative process of sharing and reflecting on experiences produced consensus on 7 recommendations for future PCMH evaluations: (1) look critically at models being implemented and identify aspects requiring modification; (2) include embedded qualitative and quantitative data collection to detail the implementation process; (3) capture details concerning how different PCMH components interact with one another over time; (4) understand and describe how and why physician and staff roles do, or do not, evolve; (5) identify the effectiveness of individual PCMH components and how they are used; (6) capture how primary care practices interface with other entities such as specialists, hospitals, and referral services; and (7) measure resources required for initiating and sustaining innovations. Broad-based longitudinal, mixed-methods designs that provide for shared learning among practice participants, program implementers, and evaluators are necessary to evaluate the novelty and promise of the PCMH model. All PCMH evaluations should be as comprehensive as possible, and at a minimum should include a combination of brief observations and targeted qualitative interviews along with quantitative measures.

  19. [Optimal scan parameters for a method of k-space trajectory (radial scan method) in evaluation of carotid plaque characteristics].

    PubMed

    Nakamura, Manami; Makabe, Takeshi; Tezuka, Hideomi; Miura, Takahiro; Umemura, Takuma; Sugimori, Hiroyuki; Sakata, Motomichi

    2013-04-01

    The purpose of this study was to optimize scan parameters for evaluation of carotid plaque characteristics by k-space trajectory (radial scan method), using a custom-made carotid plaque phantom. The phantom was composed of simulated sternocleidomastoid muscle and four types of carotid plaque. The effect of chemical shift artifact was compared using T1 weighted images (T1WI) of the phantom obtained with and without fat suppression, and using two types of k-space trajectory (the radial scan method and the Cartesian method). The ratio of signal intensity of simulated sternocleidomastoid muscle to the signal intensity of hematoma, blood (including heparin), lard, and mayonnaise was compared among various repetition times (TR) using T1WI and T2 weighted imaging (T2WI). In terms of chemical shift artifacts, image quality was improved using fat suppression for both the radial scan and Cartesian methods. In terms of signal ratio, the highest values were obtained for the radial scan method with TR of 500 ms for T1WI, and TR of 3000 ms for T2WI. For evaluation of carotid plaque characteristics using the radial scan method, chemical shift artifacts were reduced with fat suppression. Signal ratio was improved by optimizing the TR settings for T1WI and T2WI. These results suggest the potential for using magnetic resonance imaging for detailed evaluation of carotid plaque.

  20. Airborne Infrared and Visible Image Fusion Combined with Region Segmentation

    PubMed Central

    Zuo, Yujia; Liu, Jinghong; Bai, Guanbing; Wang, Xuan; Sun, Mingchao

    2017-01-01

    This paper proposes an infrared (IR) and visible image fusion method that introduces region segmentation into the dual-tree complex wavelet transform (DTCWT) domain. The method aims to improve both the target indication and scene spectrum features of fusion images, and the target identification and tracking reliability of the fusion system, on an airborne photoelectric platform. The method involves segmenting the IR image by significance to identify the target region and the background region, then fusing the low-frequency components in the DTCWT domain according to the region segmentation result. For the high-frequency components, region weights are assigned according to the information richness of region details to conduct fusion based on both weights and adaptive phases, and a shrinkage function is introduced to suppress noise. Finally, the fused low-frequency and high-frequency components are reconstructed to obtain the fusion image. The experimental results show that the proposed method can fully extract complementary information from the source images to obtain a fusion image with good target indication and rich information on scene details. It also gives a fusion result superior to existing popular fusion methods, based on either subjective or objective evaluation. With good stability and high fusion accuracy, this method can meet the fusion requirements of IR-visible image fusion systems. PMID:28505137

  2. Applications of hyperspectral imaging in chicken meat safety and quality detection and evaluation: a review.

    PubMed

    Xiong, Zhenjie; Xie, Anguo; Sun, Da-Wen; Zeng, Xin-An; Liu, Dan

    2015-01-01

    Currently, the issue of food safety and quality is of great public concern. In order to satisfy the demands of consumers and obtain superior food quality, non-destructive and fast methods are required for quality evaluation. As one of these methods, the hyperspectral imaging (HSI) technique has emerged as a smart and promising analytical tool for quality evaluation and has attracted much interest in non-destructive analysis of different food products. With the main advantage of combining both spectroscopy and imaging, HSI has shown convincing potential for objective detection and evaluation of chicken meat quality. Moreover, developing a quality evaluation system based on HSI technology would bring economic benefits to the chicken meat industry. Therefore, in recent years, many studies have been conducted on using HSI technology for the safety and quality detection and evaluation of chicken meat. The aim of this review is thus to give a detailed overview of HSI and to focus on recently developed methods in HSI technology for microbiological spoilage detection and quality classification of chicken meat. Moreover, the usefulness of the HSI technique for detecting fecal contamination and bone fragments in chicken carcasses is presented. Finally, some viewpoints on future research and its applicability in the modern poultry industry are proposed.

  3. Monitoring of concentrated radiation beam for photovoltaic and thermal solar energy conversion applications.

    PubMed

    Parretta, Antonio; Privato, Carlo; Nenna, Giuseppe; Antonini, Andrea; Stefancich, Marco

    2006-10-20

    Methods for evaluating the light intensity distribution on receivers of concentrated solar radiation systems are described. They are based on the use of Lambertian diffusers in place of the illuminated receiver and on the acquisition of the scattered light, in reflection or transmission mode, by a CCD camera. The spatial distribution of the radiation intensity is then numerically derived from the recorded images via a proprietary code. The details of the method are presented, and a short survey of the main applications of the method in the photovoltaic and thermal solar energy conversion field is given. Methods for investigating the Lambertian character of commercial diffusers are also discussed.
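
    The numerical derivation code is proprietary, but the normalization it rests on can be sketched; the snippet below assumes a dark frame is available and that the CCD response is linear.

        import numpy as np

        def relative_intensity_map(ccd_frame, dark_frame):
            """Background-subtract a CCD image of the Lambertian target and
            normalize to the peak, giving a relative intensity-profile map."""
            img = ccd_frame.astype(float) - dark_frame.astype(float)
            img = np.clip(img, 0.0, None)   # suppress negative noise pixels
            return img / img.max()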

  4. Evaluating the performance of a fault detection and diagnostic system for vapor compression equipment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Breuker, M.S.; Braun, J.E.

    This paper presents a detailed evaluation of the performance of a statistical, rule-based fault detection and diagnostic (FDD) technique presented by Rossi and Braun (1997). Steady-state and transient tests were performed on a simple rooftop air conditioner over a range of conditions and fault levels. The steady-state data without faults were used to train models that predict outputs for normal operation. The transient data with faults were used to evaluate FDD performance. The effect of a number of design variables on FDD sensitivity for different faults was evaluated and two prototype systems were specified for more complete evaluation. Good performance was achieved in detecting and diagnosing five faults using only six temperatures (2 input and 4 output) and linear models. The performance improved by about a factor of two when ten measurements (three input and seven output) and higher order models were used. This approach for evaluating and optimizing the performance of the statistical, rule-based FDD technique could be used as a design and evaluation tool when applying this FDD method to other packaged air-conditioning systems. Furthermore, the approach could also be modified to evaluate the performance of other FDD methods.
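
    The core idea, training models that predict outputs for normal operation and flagging measured deviations, can be sketched generically; the snippet below uses a least-squares linear model with a k-sigma residual rule, as an illustration rather than Rossi and Braun's actual statistical rule base.

        import numpy as np

        def train_normal_model(X_train, y_train):
            """Fit y ~ [1, X] by least squares on fault-free data;
            returns the coefficients and the residual standard deviation."""
            A = np.column_stack([np.ones(len(X_train)), X_train])
            coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)
            resid = y_train - A @ coef
            return coef, resid.std(ddof=A.shape[1])

        def detect_fault(x, y_meas, coef, sigma, k=3.0):
            """True when a measured output deviates more than k*sigma
            from the normal-operation prediction."""
            y_pred = coef[0] + coef[1:] @ np.asarray(x, float)
            return abs(y_meas - y_pred) > k * sigma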

  5. Thick Concrete Specimen Construction, Testing, and Preliminary Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clayton, Dwight A.; Hoegh, Kyle; Khazanovich, Lev

    The purpose of the U.S. Department of Energy Office of Nuclear Energy’s Light Water Reactor Sustainability (LWRS) Program is to develop technologies and other solutions that can improve the reliability, sustain the safety, and extend the operating lifetimes of nuclear power plants (NPPs) beyond 60 years. Since many important safety structures in an NPP are constructed of concrete, inspection techniques must be developed and tested to evaluate the internal condition. In-service containment structures generally do not allow for the destructive measures necessary to validate the accuracy of these inspection techniques. This creates a need for comparative testing of the various nondestructive evaluation (NDE) measurement techniques on concrete specimens with known material properties, voids, internal microstructure flaws, and reinforcement locations. A preliminary report detailed some of the challenges associated with thick reinforced concrete sections and prioritized conceptual designs of specimens that could be fabricated to represent NPP concrete structures for use in NDE evaluation comparisons. This led to the construction of the concrete specimen presented in this report, which has sufficient reinforcement density and cross-sectional size to represent an NPP containment wall. Details on how a suitably thick concrete specimen was constructed are presented, including the construction materials, final nominal design schematic, as well as formwork and rigging required to safely meet the desired dimensions of the concrete structure. The report also details the type and methods of forming the concrete specimen as well as information on how the rebar and simulated defects were embedded. Details on how the resulting specimen was transported, safely anchored, and marked to allow access for systematic comparative NDE testing of defects in a representative NPP containment wall concrete specimen are also given. Data collection using the MIRA Ultrasonic NDE equipment and initial results are also presented along with a discussion of the preliminary findings. Comparative NDE of various defects in reinforced concrete specimens is a key component in identifying the most promising techniques and directing the research and development efforts needed to characterize concrete degradation in commercial NPPs. This requires access to the specimens for data collection using state-of-the-art technology. The construction of the specimen detailed in this report allows for an evaluation of how different NDE techniques may interact with the size and complexities of NPP concrete structures. These factors were taken into account when determining specimen size and features to ensure a realistic design. The lateral dimensions of the specimen were also chosen to mitigate unrealistic boundary effects that would not affect the results of field NPP concrete testing. Preliminary results show that, while the current methods are able to identify some of the deeper defects, improvements in data processing or hardware are necessary to be able to achieve the precision and reliability achieved in evaluating thinner and less heavily reinforced concrete structures.

  6. Evaluation of Sensor Configurations for Robotic Surgical Instruments

    PubMed Central

    Gómez-de-Gabriel, Jesús M.; Harwin, William

    2015-01-01

    Designing surgical instruments for robotic-assisted minimally-invasive surgery (RAMIS) is challenging due to constraints on the number and type of sensors imposed by considerations such as space or the need for sterilization. A new method for evaluating the usability of virtual teleoperated surgical instruments based on virtual sensors is presented. This method uses virtual prototyping of the surgical instrument with a dual physical interaction, which allows testing of different sensor configurations in a real environment. Moreover, the proposed approach has been applied to the evaluation of prototypes of a two-finger grasper for lump detection by remote pinching. In this example, the usability of a set of five different sensor configurations, with a different number of force sensors, is evaluated in terms of quantitative and qualitative measures in clinical experiments with 23 volunteers. As a result, the smallest number of force sensors needed in the surgical instrument that ensures the usability of the device can be determined. The details of the experimental setup are also included. PMID:26516863

  7. Magnetic resonance imaging of articular cartilage: trauma, degeneration, and repair.

    PubMed

    Potter, Hollis G; Foo, Li F

    2006-04-01

    The assessment of articular cartilage using magnetic resonance imaging has seen considerable advances in recent years. Cartilage morphologic characteristics can now be evaluated with a high degree of accuracy and reproducibility using dedicated pulse sequences, which are becoming standard at many institutions. These techniques detect clinically unsuspected traumatic cartilage lesions, allowing the physician to study their natural history with longitudinal evaluation and also to assess disease status in degenerative osteoarthritis. Magnetic resonance imaging also provides a more objective assessment of cartilage repair to augment the information obtained from more subjective clinical outcome instruments. Newly developed methods that provide detail at an ultrastructural level offer an important addition to cartilage evaluation, particularly in the detection of early alterations in the extracellular matrix. These methods have created an undeniably important role for magnetic resonance imaging in the reproducible, noninvasive, and objective evaluation and monitoring of cartilage. An overview of the advances, current techniques, and impact of magnetic resonance imaging in the setting of trauma, degenerative arthritides, and surgical treatment for cartilage injury is presented.

  9. Using evaluation theory in priority setting and resource allocation.

    PubMed

    Smith, Neale; Mitton, Craig; Cornelissen, Evelyn; Gibson, Jennifer; Peacock, Stuart

    2012-01-01

    Public sector interest in methods for priority setting and program or policy evaluation has grown considerably over the last several decades, given increased expectations for accountable and efficient use of resources and emphasis on evidence-based decision making as a component of good management practice. While there has been some occasional effort to conduct evaluation of priority setting projects, the literatures around priority setting and evaluation have largely evolved separately. In this paper, the aim is to bring them together. The contention is that evaluation theory is a means by which evaluators reflect upon what it is they are doing when they do evaluation work. Theories help to organize thinking, sort out relevant from irrelevant information, provide transparent grounds for particular implementation choices, and can help resolve problematic issues which may arise in the conduct of an evaluation project. A detailed review of three major branches of evaluation theory--methods, utilization, and valuing--identifies how such theories can guide the development of efforts to evaluate priority setting and resource allocation initiatives. Evaluation theories differ in terms of their guiding question, anticipated setting or context, evaluation foci, perspective from which benefits are calculated, and typical methods endorsed. Choosing a particular theoretical approach will structure the way in which any priority setting process is evaluated. The paper suggests that explicitly considering evaluation theory makes key aspects of the evaluation process more visible to all stakeholders, and can assist in the design of effective evaluation of priority setting processes; this should iteratively serve to improve the understanding of priority setting practices themselves.

  10. Hotspot-Centric De Novo Design of Protein Binders

    PubMed Central

    Fleishman, Sarel J.; Corn, Jacob E.; Strauch, Eva-Maria; Whitehead, Timothy A.; Karanicolas, John; Baker, David

    2014-01-01

    Protein–protein interactions play critical roles in biology, and computational design of interactions could be useful in a range of applications. We describe in detail a general approach to de novo design of protein interactions based on computed, energetically optimized interaction hotspots, which was recently used to produce high-affinity binders of influenza hemagglutinin. We present several alternative approaches to identify and build the key hotspot interactions within both core secondary structural elements and variable loop regions and evaluate the method's performance in natural-interface recapitulation. We show that the method generates binding surfaces that are more conformationally restricted than previous design methods, reducing opportunities for off-target interactions. PMID:21945116

  11. Research study on high energy radiation effect and environment solar cell degradation methods

    NASA Technical Reports Server (NTRS)

    Horne, W. E.; Wilkinson, M. C.

    1974-01-01

    The most detailed and comprehensively verified analytical model was used to evaluate the effects of simplifying assumptions on the accuracy of predictions made by the external damage coefficient method. It was found that the most serious discrepancies were present in heavily damaged cells, particularly proton damaged cells, in which a gradient in damage across the cell existed. In general, it was found that the current damage coefficient method tends to underestimate damage at high fluences. An exception to this rule was thick cover-slipped cells experiencing heavy degradation due to omnidirectional electrons. In such cases, the damage coefficient method overestimates the damage. Comparisons of degradation predictions made by the two methods and measured flight data confirmed the above findings.

  12. A Mapmark method of standard setting as implemented for the National Assessment Governing Board.

    PubMed

    Schulz, E Matthew; Mitzel, Howard C

    2011-01-01

    This article describes a Mapmark standard setting procedure, developed under contract with the National Assessment Governing Board (NAGB). The procedure enhances the bookmark method with spatially representative item maps, holistic feedback, and an emphasis on independent judgment. A rationale for these enhancements, and for the bookmark method, is presented, followed by a detailed description of the materials and procedures used in a meeting to set standards for the 2005 National Assessment of Educational Progress (NAEP) in Grade 12 mathematics. The use of difficulty-ordered content domains to provide holistic feedback is a particularly novel feature of the method. Process evaluation results comparing Mapmark to Angoff-based methods previously used for NAEP standard setting are also presented.

  13. Methods for semi-automated indexing for high precision information retrieval.

    PubMed

    Berrios, Daniel C; Cucina, Russell J; Fagan, Lawrence M

    2002-01-01

    To evaluate a new system, ISAID (Internet-based Semi-automated Indexing of Documents), and to generate textbook indexes that are more detailed and more useful to readers. Pilot evaluation: a simple, nonrandomized trial comparing ISAID with manual indexing methods. Methods evaluation: a randomized, cross-over trial comparing three versions of ISAID, plus a usability survey. Pilot evaluation: two physicians. Methods evaluation: twelve physicians, each of whom used three different versions of the system for a total of 36 indexing sessions. Measurements: total index term tuples generated per document per minute (TPM), with and without adjustment for concordance with other subjects; inter-indexer consistency; and ratings of the usability of the ISAID indexing system. Compared with manual methods, ISAID decreased indexing times greatly. Using three versions of ISAID, inter-indexer consistency ranged from 15% to 65%, with means of 41%, 31%, and 40% for the three documents. Subjects using the full version of ISAID were faster (average TPM: 5.6) and had higher rates of concordant index generation. There were substantial learning effects, despite our use of a training/run-in phase. Subjects using the full version of ISAID were much faster by the third indexing session (average TPM: 9.1). There was a statistically significant increase in the three-subject concordant indexing rate using the full version of ISAID during the second indexing session (p < 0.05). Users of the ISAID indexing system create complex, precise, and accurate indexing for full-text documents much faster than users of manual methods. Furthermore, the natural language processing methods that ISAID uses to suggest indexes contribute substantially to increased indexing speed and accuracy.

  14. Poor methodological detail precludes experimental repeatability and hampers synthesis in ecology.

    PubMed

    Haddaway, Neal R; Verhoeven, Jos T A

    2015-10-01

    Despite the scientific method's central tenets of reproducibility (the ability to obtain similar results when repeated) and repeatability (the ability to replicate an experiment based on the methods described), published ecological research continues to fail to provide sufficient methodological detail to allow either repeatability or verification. Recent systematic reviews highlight the problem, with one example demonstrating that an average of 13% of studies per year (±8.0 [SD]) failed to report sample sizes. The problem affects the ability to verify the accuracy of any analysis, to repeat the methods used, and to assimilate the study findings into powerful and useful meta-analyses. The problem is common across the variety of ecological topics examined to date, and despite previous calls for improved reporting and metadata archiving, which could indirectly alleviate the problem, there is no indication of an improvement in reporting standards over time. Here, we call on authors, editors, and peer reviewers to consider repeatability as a top priority when evaluating research manuscripts, bearing in mind that legacy and integration into the evidence base can drastically improve the impact of individual research reports.

  15. Heat flux instrumentation for Hyflite thermal protection system

    NASA Technical Reports Server (NTRS)

    Diller, T. E.

    1994-01-01

    Using Thermal Protection Tile core samples supplied by NASA, the surface characteristics of the FRCI, TUFI, and RCG coatings were evaluated. Based on these results, appropriate methods of surface preparation were determined and tested for the required sputtering processes. Sample sensors were fabricated on the RCG coating and adhesion was acceptable. Based on these encouraging results, complete Heat Flux Microsensors were fabricated on the RCG coating. The issue of lead attachment was addressed with the annealing and welding methods developed at NASA Lewis. Parallel gap welding appears to be the best method of lead attachment, with prior heat treatment of the sputtered pads. Sample Heat Flux Microsensors were submitted for testing in the NASA Ames arc jet facility. Details of the project are contained in two attached reports. One additional item of interest is contained in the attached AIAA paper, which gives details of the transient response of a Heat Flux Microsensor in a shock tube facility at Virginia Tech. The response of the heat flux sensor was measured to be faster than 10 microseconds.

  16. Vadose Zone Transport Field Study: Detailed Test Plan for Simulated Leak Tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ward, Anderson L.; Gee, Glendon W.

    2000-06-23

    This report describes controlled transport experiments at well-instrumented field tests to be conducted during FY 2000 in support of DOE's Vadose Zone Transport Field Study (VZTFS). The VZTFS supports the Groundwater/Vadose Zone Integration Project Science and Technology Initiative. The field tests will improve understanding of field-scale transport and lead to the development or identification of efficient and cost-effective characterization methods. These methods will capture the extent of contaminant plumes using existing steel-cased boreholes. Specific objectives are to 1) identify mechanisms controlling transport processes in soils typical of the hydrogeologic conditions of Hanford's waste disposal sites; 2) reduce uncertainty in conceptual models; 3) develop a detailed and accurate data base of hydraulic and transport parameters for validation of three-dimensional numerical models; and 4) identify and evaluate advanced, cost-effective characterization methods with the potential to assess changing conditions in the vadose zone, particularly as surrogates of currently undetectable high-risk contaminants. Pacific Northwest National Laboratory (PNNL) manages the VZTFS for DOE.

  17. Panel methods: An introduction

    NASA Technical Reports Server (NTRS)

    Erickson, Larry L.

    1990-01-01

    Panel methods are numerical schemes for solving (the Prandtl-Glauert equation) for linear, inviscid, irrotational flow about aircraft flying at subsonic or supersonic speeds. The tools at the panel-method user's disposal are (1) surface panels of source-doublet-vorticity distributions that can represent nearly arbitrary geometry, and (2) extremely versatile boundary condition capabilities that can frequently be used for creative modeling. Panel-method capabilities and limitations, basic concepts common to all panel-method codes, different choices that were made in the implementation of these concepts into working computer programs, and various modeling techniques involving boundary conditions, jump properties, and trailing wakes are discussed. An approach for extending the method to nonlinear transonic flow is also presented. Three appendices supplement the main text. In appendix 1, additional detail is provided on how the basic concepts are implemented into a specific computer program (PANAIR). In appendix 2, it is shown how to evaluate analytically the fundamental surface integral that arises in the expressions for influence coefficients, and how to evaluate its jump property. In appendix 3, a simple example is used to illustrate the so-called finite part of the improper integrals.

  18. Very Large Scale Optimization

    NASA Technical Reports Server (NTRS)

    Vanderplaats, Garrett; Townsend, James C. (Technical Monitor)

    2002-01-01

    The purpose of this research under the NASA Small Business Innovative Research program was to develop algorithms and associated software to solve very large nonlinear, constrained optimization tasks. Key issues included efficiency, reliability, memory, and gradient calculation requirements. This report describes the general optimization problem, ten candidate methods, and detailed evaluations of four candidates. The algorithm chosen for final development is a modern recreation of a 1960s external penalty function method that uses very limited computer memory and computational time. Although of lower efficiency, the new method can solve problems orders of magnitude larger than current methods. The resulting BIGDOT software has been demonstrated on problems with 50,000 variables and about 50,000 active constraints. For unconstrained optimization, it has solved a problem in excess of 135,000 variables. The method includes a technique for solving discrete variable problems that finds a "good" design, although a theoretical optimum cannot be guaranteed. It is very scalable in that the number of function and gradient evaluations does not change significantly with increased problem size. Test cases are provided to demonstrate the efficiency and reliability of the methods and software.
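
    An exterior penalty method of the general kind described can be sketched in a few lines of Python; this is a generic illustration (not BIGDOT), with scipy's BFGS standing in for the inner unconstrained minimizer.

        import numpy as np
        from scipy.optimize import minimize

        def exterior_penalty_minimize(f, g_list, x0, r0=1.0, growth=10.0, outer=6):
            """Minimize f(x) subject to g(x) <= 0 for every g in g_list by
            repeatedly minimizing f + r * sum(max(0, g)^2) with growing r."""
            x, r = np.asarray(x0, dtype=float), r0
            for _ in range(outer):
                phi = lambda x: f(x) + r * sum(max(0.0, g(x)) ** 2 for g in g_list)
                x = minimize(phi, x, method="BFGS").x
                r *= growth
            return x

        # Example: minimize (x-2)^2 + (y-1)^2 subject to x + y <= 2;
        # the exact constrained optimum is (1.5, 0.5).
        print(exterior_penalty_minimize(
            lambda v: (v[0] - 2) ** 2 + (v[1] - 1) ** 2,
            [lambda v: v[0] + v[1] - 2.0],
            x0=[0.0, 0.0]))

    Because the penalized objective needs only function and gradient information, memory use stays modest as the problem size grows, which is the property the report emphasizes.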

  19. Evaluation of Contamination Inspection and Analysis Methods through Modeling System Performance

    NASA Technical Reports Server (NTRS)

    Seasly, Elaine; Dever, Jason; Stuban, Steven M. F.

    2016-01-01

    Contamination is usually identified as a risk on the risk register for sensitive space systems hardware. Despite detailed, time-consuming, and costly contamination control efforts during assembly, integration, and test of space systems, contaminants are still found during visual inspections of hardware. Improved methods are needed to gather information during systems integration to catch potential contamination issues earlier and manage contamination risks better. This research explores evaluation of contamination inspection and analysis methods to determine optical system sensitivity to minimum detectable molecular contamination levels based on IEST-STD-CC1246E non-volatile residue (NVR) cleanliness levels. Potential future degradation of the system is modeled given chosen modules representative of optical elements in an optical system, minimum detectable molecular contamination levels for a chosen inspection and analysis method, and determining the effect of contamination on the system. By modeling system performance based on when molecular contamination is detected during systems integration and at what cleanliness level, the decision maker can perform trades amongst different inspection and analysis methods and determine if a planned method is adequate to meet system requirements and manage contamination risk.

  20. Evaluation of a 3D local multiresolution algorithm for the correction of partial volume effects in positron emission tomography

    PubMed Central

    Le Pogam, Adrien; Hatt, Mathieu; Descourt, Patrice; Boussion, Nicolas; Tsoumpas, Charalampos; Turkheimer, Federico E.; Prunier-Aesch, Caroline; Baulieu, Jean-Louis; Guilloteau, Denis; Visvikis, Dimitris

    2011-01-01

    Purpose: Partial volume effects (PVE) are consequences of the limited spatial resolution in emission tomography, leading to under-estimation of uptake in tissues of size similar to the point spread function (PSF) of the scanner as well as activity spillover between adjacent structures. Among PVE correction methodologies, a voxel-wise mutual multi-resolution analysis (MMA) was recently introduced. MMA is based on the extraction and transformation of high resolution details from an anatomical image (MR/CT) and their subsequent incorporation into a low resolution PET image using wavelet decompositions. Although this method allows creating PVE corrected images, it is based on a 2D global correlation model, which may introduce artefacts in regions where no significant correlation exists between anatomical and functional details. Methods: A new model was designed to overcome these two issues (2D only and global correlation) using a 3D wavelet decomposition process combined with a local analysis. The algorithm was evaluated on synthetic, simulated and patient images, and its performance was compared to the original approach as well as the geometric transfer matrix (GTM) method. Results: Quantitative performance was similar to the 2D global model and GTM in correlated cases. In cases where mismatches between anatomical and functional information were present, the new model outperformed the 2D global approach, avoiding artefacts and significantly improving the quality of the corrected images and their quantitative accuracy. Conclusions: A new 3D local model was proposed for voxel-wise PVE correction based on the original mutual multi-resolution analysis approach. Its evaluation demonstrated improved and more robust qualitative and quantitative accuracy compared to the original MMA methodology, particularly in the absence of full correlation between anatomical and functional information. PMID:21978037
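
    The gist of wavelet-based detail transfer can be sketched with PyWavelets; note that this sketch uses a single global blending weight, unlike the local 3D model proposed here, and assumes the anatomical and functional images are co-registered and equally sized.

        import pywt

        def wavelet_detail_transfer(pet_lr, mri_hr, wavelet="db2", level=2, alpha=0.5):
            """MMA-flavoured sketch: keep the PET approximation band, blend the
            anatomical detail coefficients into the PET ones with a global
            weight alpha (a placeholder for a proper correlation model)."""
            cp = pywt.wavedec2(pet_lr, wavelet, level=level)
            cm = pywt.wavedec2(mri_hr, wavelet, level=level)
            fused = [cp[0]]   # PET approximation band is kept as-is
            for (ph, pv, pd), (mh, mv, md) in zip(cp[1:], cm[1:]):
                fused.append((ph + alpha * mh, pv + alpha * mv, pd + alpha * md))
            return pywt.waverec2(fused, wavelet)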

  1. HEALTH TECHNOLOGY ASSESSMENT OF PUBLIC HEALTH INTERVENTIONS: A SYNTHESIS OF METHODOLOGICAL GUIDANCE.

    PubMed

    Mathes, Tim; Antoine, Sunya-Lee; Prengel, Peggy; Bühn, Stefanie; Polus, Stephanie; Pieper, Dawid

    2017-01-01

    The evaluation of public health interventions poses some challenges. As a consequence, health technology assessment (HTA) methods for public health interventions (PHI) have to be adapted. This study aimed to summarize the available guidance on methods for HTA of PHI. We systematically searched for methodological guidance on HTA of PHIs. Our focus was on research synthesis methods to evaluate effectiveness. Relevant information was synthesized narratively in a standardized way. Only four guidance documents were identified specifically for HTAs of PHI. The approaches used for HTAs of PHIs are broader and more flexible than those for medical interventions. For this reason, there is a tendency to identify the intervention components and context factors that influence the effectiveness and transferability of an intervention rather than to assess its effectiveness in general. The details in the guidance vary without justification. Unjustified heterogeneity between the different guidance approaches is most pronounced for quality assessment, assessment of applicability, and methods to integrate qualitative and quantitative evidence. Descriptions for the assessment of integrity, heterogeneity, sustainability, context factors, and applicability are often vague. The heterogeneity in approaches indicates that there is currently no consensus on methods to deal with the challenges of the PHI evaluations. A possible explanation for this may be that the methods are not sufficiently developed, and advantages and disadvantages of a certain method in relation to the research question (e.g., broad/focused) have not yet been sufficiently evaluated.

  2. Evaluation of a Cubature Kalman Filtering-Based Phase Unwrapping Method for Differential Interferograms with High Noise in Coal Mining Areas

    PubMed Central

    Liu, Wanli; Bian, Zhengfu; Liu, Zhenguo; Zhang, Qiuzhao

    2015-01-01

    Differential interferometric synthetic aperture radar has been shown to be effective for monitoring subsidence in coal mining areas. Phase unwrapping can have a dramatic influence on the monitoring result. In this paper, a filtering-based phase unwrapping algorithm in combination with path-following is introduced to unwrap differential interferograms with high noise in mining areas. It can perform simultaneous noise filtering and phase unwrapping so that the pre-filtering steps can be omitted, thus usually retaining more details and improving the detectable deformation. For the method, the nonlinear measurement model of phase unwrapping is processed using a simplified Cubature Kalman filtering, which is an effective and efficient tool used in many nonlinear fields. Three case studies are designed to evaluate the performance of the method. In Case 1, two tests are designed to evaluate the performance of the method under different factors including the number of multi-looks and path-guiding indexes. The result demonstrates that the unwrapped results are sensitive to the number of multi-looks and that the Fisher Distance is the most suitable path-guiding index for our study. Two case studies are then designed to evaluate the feasibility of the proposed phase unwrapping method based on Cubature Kalman filtering. The results indicate that, compared with the popular Minimum Cost Flow method, the Cubature Kalman filtering-based phase unwrapping can achieve promising results without pre-filtering and is an appropriate method for coal mining areas with high noise. PMID:26153776
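
    The Cubature Kalman filter at the heart of the method propagates a fixed set of 2n cubature points; a minimal Python sketch of the standard point generation and time update follows (the paper's simplified variant and the phase-unwrapping measurement model are not reproduced here).

        import numpy as np

        def cubature_points(mean, cov):
            """Generate the 2n third-degree cubature points for N(mean, cov)."""
            n = len(mean)
            S = np.linalg.cholesky(cov)                            # cov = S S^T
            units = np.sqrt(n) * np.concatenate([np.eye(n), -np.eye(n)])
            return mean + units @ S.T                              # shape (2n, n)

        def ckf_predict(mean, cov, f, Q):
            """CKF time update: push the cubature points through the dynamics f
            and recompute the mean and covariance (Q is the process noise)."""
            pts = np.array([f(p) for p in cubature_points(mean, cov)])
            m = pts.mean(axis=0)
            P = (pts - m).T @ (pts - m) / len(pts) + Q
            return m, P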

  3. An accurate method to measure alpha-emitting natural radionuclides in atmospheric filters: Application in two NORM industries

    NASA Astrophysics Data System (ADS)

    Lozano, R. L.; Bolívar, J. P.; San Miguel, E. G.; García-Tenorio, R.; Gázquez, M. J.

    2011-12-01

    In this work, an accurate method for measuring natural alpha-emitting radionuclides in aerosols collected on air filters is presented and discussed in detail. Knowledge of the levels of several natural alpha-emitting radionuclides (238U, 234U, 232Th, 230Th, 228Th, 226Ra and 210Po) in atmospheric aerosols is essential not only for a better understanding of various atmospheric processes and changes, but also for a proper evaluation of the potential doses that may inadvertently be received by the population via inhalation. The proposed method takes into account the intrinsic amounts of these radionuclides in the matrices of the quartz filters used, as well as the possible variation in filter humidity throughout the collection process. In both cases, the corrections needed to account for these effects have been evaluated and parameterized. Furthermore, a detailed study has been performed to optimise the volume of air sampled in order to increase the accuracy of the radionuclide determinations. The method as a whole has been applied to determine the activity concentrations of U- and Th-isotopes in aerosols collected at two NORM (Naturally Occurring Radioactive Material) industries located in the southwest of Spain. Based on the levels found, a conservative estimate has been made of the additional committed effective doses that workers may receive through inhalation of the anthropogenic material present in the environment of these two NORM industries.
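
    The dose model behind such estimates is not given in the abstract; a common conservative form multiplies each radionuclide's activity concentration in air by the breathing rate, the annual exposure time, and an inhalation dose coefficient. A sketch with entirely hypothetical numbers (the concentrations and dose coefficients below are placeholders, not measured or ICRP values):

    ```python
    # committed effective dose from inhalation: E = sum_i C_i * B * T * h_i
    # C_i: activity concentration in air (Bq/m^3); B: breathing rate (m^3/h);
    # T: annual working time (h); h_i: inhalation dose coefficient (Sv/Bq).
    # All numbers below are hypothetical placeholders.
    conc = {"U-238": 1.0e-4, "Th-230": 5.0e-5, "Po-210": 2.0e-4}        # Bq/m^3
    dose_coeff = {"U-238": 3.0e-6, "Th-230": 1.0e-5, "Po-210": 3.0e-6}  # Sv/Bq
    B, T = 1.2, 2000.0                                                   # m^3/h, h/year

    E = sum(conc[r] * B * T * dose_coeff[r] for r in conc)               # Sv/year
    print(f"additional committed effective dose: {E * 1e6:.2f} microSv/year")
    ```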

  4. Exploring the Application of Optical Remote Sensing as a Method to Estimate the Depth of Backwater Nursery Habitats of the Colorado Pikeminnow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamada, Yuki; LaGory, Kirk E.

    2016-02-01

    Low-velocity channel-margin habitats serve as important nursery habitats for the endangered Colorado pikeminnow (Ptychocheilus lucius) in the middle Green River between Jensen and Ouray, Utah. These habitats, known as backwaters, are associated with emergent sand bars, and are shaped and reformed annually by peak flows. A recent synthesis of information on backwater characteristics and the factors that influence inter-annual variability in those backwaters (Grippo et al. 2015) evaluated detailed survey information collected annually since 2003 on a relatively small sample of backwaters, as well as reach-wide evaluations of backwater surface area from aerial and satellite imagery. An approach is needed to bridge the gap between these detailed surveys, which estimate surface area, volume, and depth, and the reach-wide assessment of surface area to enable an assessment of the amount of habitat that meets the minimum depth requirements for suitable habitat.

  5. Proposal of a micromagnetic standard problem for ferromagnetic resonance simulations

    NASA Astrophysics Data System (ADS)

    Baker, Alexander; Beg, Marijan; Ashton, Gregory; Albert, Maximilian; Chernyshenko, Dmitri; Wang, Weiwei; Zhang, Shilei; Bisotti, Marc-Antonio; Franchin, Matteo; Hu, Chun Lian; Stamps, Robert; Hesjedal, Thorsten; Fangohr, Hans

    2017-01-01

    Micromagnetic simulations are nowadays a common tool for studying a wide range of magnetic phenomena, including ferromagnetic resonance. A technique for evaluating the reliability and validity of different micromagnetic simulation tools is the simulation of proposed standard problems. We propose a new standard problem by providing a detailed specification and analysis of a sufficiently simple problem. By analyzing the magnetization dynamics in a thin permalloy square sample, triggered by a well-defined excitation, we obtain the ferromagnetic resonance spectrum and identify the resonance modes via Fourier transform. Simulations are performed using both finite difference and finite element numerical methods, with the OOMMF and Nmag simulators, respectively. We report the effects of initial conditions and simulation parameters on the character of the observed resonance modes for this standard problem. We provide detailed instructions and code to assist in using the results for the evaluation of new simulator tools, and to help with the numerical calculation of ferromagnetic resonance spectra and modes in general.
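
    As a rough illustration of the ringdown analysis described above, the spatially averaged magnetization recorded after the excitation can be Fourier transformed to obtain the resonance spectrum. A minimal sketch, assuming `my` is the simulator's sampled output (all names and the toy signal are illustrative, not the paper's specification):

    ```python
    import numpy as np

    def fmr_spectrum(my, dt):
        """FMR spectrum from a ringdown signal: remove the static component of
        the spatially averaged magnetization and Fourier transform it."""
        my = my - my.mean()
        amp = np.abs(np.fft.rfft(my))            # spectral amplitude
        freq = np.fft.rfftfreq(my.size, d=dt)    # frequency axis in Hz
        return freq, amp

    # toy usage: a damped 8 GHz oscillation sampled every 5 ps
    t = np.arange(4096) * 5e-12
    freq, amp = fmr_spectrum(np.exp(-t / 2e-9) * np.sin(2 * np.pi * 8e9 * t), 5e-12)
    print(freq[amp.argmax()] / 1e9, "GHz")       # peaks mark the resonance modes
    ```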

  6. SU-C-207B-02: Maximal Noise Reduction Filter with Anatomical Structures Preservation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maitree, R; Guzman, G; Chundury, A

    Purpose: All medical images contain noise, which can result in an undesirable appearance and can reduce the visibility of anatomical details. A variety of techniques is used to reduce noise, such as increasing the image acquisition time and applying post-processing noise reduction algorithms. However, these techniques either increase imaging time and cost or reduce tissue contrast and effective spatial resolution, which carry useful diagnostic information. The three main goals of this study are: 1) to develop a novel approach that can adaptively and maximally reduce noise while preserving valuable details of anatomical structures, 2) to evaluate the effectiveness of available noise reduction algorithms in comparison with the proposed algorithm, and 3) to demonstrate that the proposed noise reduction approach can be used clinically. Methods: To achieve maximal noise reduction without destroying anatomical details, the proposed approach automatically estimates the local image noise strength and detects anatomical structures, i.e., tissue boundaries. This information is used to adaptively adjust the strength of the noise reduction filter. The proposed algorithm was tested on 34 repeated swine head datasets and 54 patients' MRI and CT images. The performance was quantitatively evaluated by image quality metrics and manually validated for clinical usage by two radiation oncologists and one radiologist. Results: Quantitative measurements on the repeated swine head images demonstrated that the proposed algorithm efficiently removed noise while preserving structures and tissue boundaries. In comparisons, the proposed algorithm obtained competitive noise reduction performance and outperformed other filters in preserving anatomical structures. Assessments from the manual validation indicate that the proposed noise reduction algorithm is adequate for some clinical usages. Conclusion: According to both clinical evaluation (human expert ranking) and quantitative assessment, the proposed approach has superior noise reduction and anatomical structure preservation capabilities compared with existing noise removal methods. Senior author Dr. Deshan Yang received research funding from ViewRay and Varian.
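
    The abstract describes the approach only at a high level. One plausible reading, sketched below purely for illustration (this is not the authors' algorithm), is to blend a smoothed image with the original using a normalized edge map, so smoothing is strong in flat regions and suppressed at tissue boundaries:

    ```python
    import numpy as np
    from scipy import ndimage

    def adaptive_denoise(img, sigma=2.0):
        """Edge-aware smoothing sketch: strong filtering in flat regions,
        little filtering where the gradient (tissue boundary) is large."""
        smooth = ndimage.gaussian_filter(img, sigma)
        grad = ndimage.gaussian_gradient_magnitude(img, sigma=1.0)
        edge = grad / (grad.max() + 1e-12)         # ~0 in flat areas, ~1 at boundaries
        return edge * img + (1.0 - edge) * smooth  # preserve edges, smooth elsewhere
    ```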

  7. Assessment of Protein Side-Chain Conformation Prediction Methods in Different Residue Environments

    PubMed Central

    Peterson, Lenna X.; Kang, Xuejiao; Kihara, Daisuke

    2016-01-01

    Computational prediction of side-chain conformation is an important component of protein structure prediction. Accurate side-chain prediction is crucial for practical applications of protein structure models that require atomic-level detail, such as protein and ligand design. We evaluated the accuracy of eight side-chain prediction methods in reproducing the side-chain conformations of experimentally solved structures deposited in the Protein Data Bank. Prediction accuracy was evaluated for four different structural environments (buried, surface, interface, and membrane-spanning) in three different protein types (monomeric, multimeric, and membrane). Overall, the highest accuracy was observed for buried residues in monomeric and multimeric proteins. Notably, side-chains at protein interfaces and in membrane-spanning regions were better predicted than surface residues, even though not all of the methods used multimeric and membrane proteins for training. We therefore conclude that the current methods are as practically useful for modeling protein docking interfaces and membrane-spanning regions as they are for modeling monomers. PMID:24619909

  8. Modeling of unit operating considerations in generating-capacity reliability evaluation. Volume 1. Mathematical models, computing methods, and results. Final report. [GENESIS, OPCON and OPPLAN]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patton, A.D.; Ayoub, A.K.; Singh, C.

    1982-07-01

    Existing methods for generating-capacity reliability evaluation do not explicitly recognize a number of operating considerations which may have important effects on system reliability performance. Thus, current methods may yield estimates of system reliability which differ appreciably from actual observed reliability. Further, current methods offer no means of accurately studying or evaluating alternatives which may differ in one or more operating considerations. Operating considerations which are considered to be important in generating-capacity reliability evaluation include: unit duty cycles as influenced by load cycle shape, reliability performance of other units, unit commitment policy, and operating reserve policy; unit start-up failures distinct from unit running failures; unit start-up times; and unit outage postponability and the management of postponable outages. A detailed Monte Carlo simulation computer model called GENESIS and two analytical models called OPCON and OPPLAN have been developed which are capable of incorporating the effects of many operating considerations, including those noted above. These computer models have been used to study a variety of actual and synthetic systems and are available from EPRI. The new models are shown to produce system reliability indices which differ appreciably from index values computed using traditional models which do not recognize operating considerations.

  9. Index of cyber integrity

    NASA Astrophysics Data System (ADS)

    Anderson, Gustave

    2014-05-01

    Unfortunately, there is no metric, or set of metrics, that is both general enough to encompass all possible types of applications and specific enough to capture application- and attack-specific details. As a result, we are left with ad hoc methods for evaluating the security of our systems. Current state-of-the-art methods for evaluating the security of systems include penetration testing and cyber evaluation tests. In these evaluations, security professionals simulate attacks from malicious outsiders and malicious insiders. Such evaluations are very productive and are able to discover potential vulnerabilities resulting from improper system configuration, hardware and software flaws, or operational weaknesses. We therefore propose the index of cyber integrity (ICI), modeled after the index of biological integrity (IBI), to provide a holistic measure of the health of a system under test in a cyber environment. The ICI provides a broad-based measure through a collection of application- and system-specific metrics. In this paper, following the example of the IBI, we demonstrate how a multi-metric index may be used as a holistic measure of the health of a system under test in a cyber environment.

  10. Measuring Diagnoses: ICD Code Accuracy

    PubMed Central

    O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M

    2005-01-01

    Objective To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. Data Sources/Study Setting The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. Study Design/Methods We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Principal Findings Main error sources along the “patient trajectory” include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the “paper trail” include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. Conclusions By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways. PMID:16178999

  11. Space station Simulation Computer System (SCS) study for NASA/MSFC. Volume 4: Conceptual design report

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The Simulation Computer System (SCS) comprises the computer hardware, software, and workstations that will support the Payload Training Complex (PTC) at Marshall Space Flight Center (MSFC). The PTC will train the space station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be onboard Space Station Freedom. In the first step of this task, a methodology was developed to ensure that all relevant design dimensions were addressed and that all feasible designs could be considered. The development effort yielded the following method for generating and comparing designs in task 4: (1) extract SCS system requirements (functions) from the system specification; (2) develop design evaluation criteria; (3) identify system architectural dimensions relevant to SCS system designs; (4) develop conceptual designs based on the system requirements and architectural dimensions identified in steps 1 and 3 above; (5) evaluate the designs with respect to the design evaluation criteria developed in step 2 above. The results of the method detailed in the above five steps are discussed. The task 4 work provides the set of designs from which two or three candidate designs are to be selected by MSFC as input to task 5, refinement of the SCS conceptual designs. The designs selected for refinement will be developed to a lower level of detail, and further analyses will be done to begin to determine the size and speed of the components required to implement them.

  12. Detailed seismic evaluation of bridges on and over the parkways in Western Kentucky.

    DOT National Transportation Integrated Search

    2008-06-01

    The report outlines a rating system and details an evaluation procedure for the seismic evaluation of highway bridges. These processes are later used to investigate the structural integrity of selected highway bridges on and over the parkways in West...

  13. HANFORD DOUBLE SHELL TANK (DST) THERMAL & SEISMIC PROJECT BUCKLING EVALUATION METHODS & RESULTS FOR THE PRIMARY TANKS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MACKEY, T.C.

    2006-03-17

    This report documents a detailed buckling evaluation of the primary tanks in the Hanford double-shell waste tanks. The analysis is part of a comprehensive structural review for the Double-Shell Tank Integrity Project. This work also provides information on tank integrity that specifically responds to concerns raised by the Office of Environment, Safety, and Health (ES&H) Oversight (EH-22) during a review (in April and May 2001) of work being performed on the double-shell tank farms and the operation of the aging waste facility (AWF) primary tank ventilation system.

  14. Improved numerical methods for turbulent viscous flows aerothermal modeling program, phase 2

    NASA Technical Reports Server (NTRS)

    Karki, K. C.; Patankar, S. V.; Runchal, A. K.; Mongia, H. C.

    1988-01-01

    The details of a study to develop accurate and efficient numerical schemes to predict complex flows are described. In this program, several discretization schemes were evaluated using simple test cases. This assessment led to the selection of three schemes for an in-depth evaluation based on two-dimensional flows. The scheme with the superior overall performance was incorporated in a computer program for three-dimensional flows. To improve the computational efficiency, the selected discretization scheme was combined with a direct solution approach in which the fluid flow equations are solved simultaneously rather than sequentially.

  15. Numerical evaluation of heating in the human head due to magnetic resonance imaging (MRI)

    NASA Astrophysics Data System (ADS)

    Nguyen, Uyen; Brown, Steve; Chang, Isaac; Krycia, Joe; Mirotznik, Mark S.

    2003-06-01

    In this paper we present a numerical model for evaluating tissue heating during magnetic resonance imaging (MRI). Our method, which included a detailed anatomical model of a human head, calculated both the electromagnetic power deposition and the associated temperature elevations during an MRI head examination. Numerical studies were conducted using a realistic birdcage coil excited at frequencies ranging from 63 MHz to 500 MHz. The model was validated both experimentally and analytically. The experimental validation was performed at the MR test facility located at the FDA's Center for Devices and Radiological Health (CDRH).
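
    The abstract couples an electromagnetic power-deposition calculation with a temperature solver; the thermal model conventionally used in such MRI heating studies is the Pennes bioheat equation. A one-dimensional explicit finite-difference sketch with illustrative tissue constants and a hypothetical SAR hot spot (not the paper's anatomical model):

    ```python
    import numpy as np

    # Pennes bioheat equation: rho*c*dT/dt = k*d2T/dx2 - wb*cb*(T - Ta) + rho*SAR
    rho, c, k = 1050.0, 3600.0, 0.5      # tissue density, heat capacity, conductivity (illustrative)
    wb, cb, Ta = 8.0, 3800.0, 37.0       # perfusion (kg/m^3/s), blood heat capacity, arterial temp
    dx, dt = 1e-3, 0.05                  # grid step (m), time step (s); dt well below stability limit
    T = np.full(200, 37.0)               # initial temperature profile (deg C)
    SAR = np.zeros(200); SAR[90:110] = 4.0   # hypothetical local SAR hot spot (W/kg)

    for _ in range(12000):               # roughly 10 minutes of exposure
        lap = np.zeros_like(T)
        lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
        T += dt / (rho * c) * (k * lap - wb * cb * (T - Ta) + rho * SAR)
    print(f"peak temperature: {T.max():.2f} C")
    ```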

  16. Surface Detail Reproduction and Dimensional Stability of Contemporary Irreversible Hydrocolloid Alternatives after Immediate and Delayed Pouring

    PubMed Central

    Kusugal, Preethi; Chourasiya, Ritu Sunil; Ruttonji, Zarir; Astagi, Preeti; Nayak, Ajay Kumar; Patil, Abhishekha

    2018-01-01

    Purpose: To overcome the poor dimensional stability of irreversible hydrocolloids, alternative materials have been introduced. The dimensional changes of these alternatives after delayed pouring are not well studied or documented in the literature. The purpose of this study was to evaluate and compare the surface detail reproduction and dimensional stability of two irreversible hydrocolloid alternatives with an extended-pour irreversible hydrocolloid at different time intervals. Materials and Methods: All testing was performed according to ANSI/ADA specification number 18 for surface detail reproduction and specification number 19 for dimensional change. The test materials were the newer irreversible hydrocolloid alternatives AlgiNot FS and Algin-X Ultra FS, with Kromopan 100, an extended-pour irreversible hydrocolloid, as the control. Surface detail reproduction was evaluated using a stereomicroscope. The dimensional change after storage periods of 1 h, 24 h, and 120 h was assessed and compared between the test materials and the control. The data were analyzed using one-way ANOVA and the post hoc Bonferroni test. Results: Statistically significant differences (P < 0.001) were seen when the mean scores of the tested materials were compared with respect to reproduction of the 22 μm line on the metal block. Kromopan 100 showed statistically significant differences between time intervals (P < 0.001) and exhibited more dimensional change. Algin-X Ultra FS proved to be more accurate and dimensionally stable. Conclusions: The newer irreversible hydrocolloid alternative impression materials were more accurate in surface detail reproduction and exhibited less dimensional change after storage periods of 1 h, 24 h, and 120 h than the extended-pour irreversible hydrocolloid impression material. PMID:29599578

  17. Report of the Federation of European Laboratory Animal Science Associations Working Group on animal identification.

    PubMed

    Dahlborn, K; Bugnon, P; Nevalainen, T; Raspa, M; Verbost, P; Spangenberg, E

    2013-01-01

    The primary aim of this report is to assist scientists in selecting more reliable and suitable identification (ID) methods for their studies. This is especially true for genetically altered (GA) animals, where individual identification is strictly necessary to link samples, research design, and genotype. The aim of this Federation of European Laboratory Animal Science Associations working group was to provide an update on the methods used to identify rodents in different situations and to assess their implications for animal welfare. ID procedures are an indispensable prerequisite for conducting good science, but the degree of invasiveness differs between methods; therefore, a sound ethical evaluation of the chosen method is needed. Based on the scientific literature, the advantages and disadvantages of the various methods are presented comprehensively, and this report is intended as a practical guide for researchers. New upcoming methods are included alongside the traditional techniques. Ideally, an ID method should provide reliable identification, be technically easy to apply, and not inflict adverse effects on the animals, while taking into account the type of research. There is no gold-standard method because each situation is unique; more studies are needed to better evaluate ID systems, and the desirable introduction of new and modern approaches will need to be assessed by detailed scientific evaluation.

  18. An Evaluation of a Proposed Revision of the ASTM D 1990 Grouping Procedure

    Treesearch

    Steve P Verrill; James W. Evans; David E. Kretschmann; Cherilyn A. Hatfield

    2013-01-01

    Lum, Taylor, and Zidek have proposed a revised procedure for wood species grouping in ASTM standard D 1990. We applaud the authors’ recognition of the importance of considering a strength distribution’s variability as well as its fifth percentile. However, we have concerns about their proposed method of incorporating this information into a standard. We detail these...

  19. Sea-level evaluation of digitally implemented turbojet engine control functions

    NASA Technical Reports Server (NTRS)

    Arpasi, D. J.; Cwynar, D. S.; Wallhagen, R. E.

    1972-01-01

    The standard hydromechanical control system of a turbojet engine was replaced with a digital control system that implemented the same control laws. A detailed discussion of the digital control system in use with the engine is presented. The engine was operated in a sea-level test stand. The effects of control update interval are defined, and a method for extending this interval by using digital compensation is discussed.

  20. The Queensland experience of participation in a national drug use evaluation project, Community-acquired pneumonia – towards improving outcomes nationally (CAPTION)

    PubMed Central

    Pulver, Lisa K; Tett, Susan E; Coombes, Judith

    2009-01-01

    Background Multicentre drug use evaluations are described in the literature infrequently, and usually only the results are published. The purpose of this paper is to describe the experience of Queensland hospitals participating in the Community-Acquired Pneumonia Towards Improving Outcomes Nationally (CAPTION) project, specifically evaluating the implementation of the project and detailing the benefits and drawbacks of involvement in a national drug use evaluation program. Methods Emergency departments from nine hospitals in Queensland, Australia, participated in CAPTION, a national quality improvement project conducted in 37 Australian hospitals. CAPTION was aimed at optimising prescribing in the management of community-acquired pneumonia according to the recommendations of the Australian Therapeutic Guidelines: Antibiotic, 12th edition. The project involved data collection and evaluation, feedback of results, and a suite of targeted educational interventions including audit and feedback, group presentations, and academic detailing. A baseline audit and two drug use evaluation cycles were conducted during the 2-year project. The implementation of the project was evaluated using feedback forms after each phase of the project (audit or intervention). At completion, a group meeting with the hospital coordinators identified positive and negative elements of the project. Results Evaluation by hospitals of their participation in CAPTION demonstrated both benefits and drawbacks. The benefits were grouped into impacts on the hospital dynamic, such as improved interdisciplinary working relationships (e.g., between pharmacist and doctor), recognition of the educational/academic role of the pharmacist, creation of ED pharmacist positions and enhanced involvement with the National Prescribing Service, and personal benefits. Personal benefits included academic detailing training for participants, improved communication skills, and opportunities to present at conferences. The principal drawback of participation was the extra burden on already busy staff members. Conclusion A national multicentre drug use evaluation project such as CAPTION gives hospitals which would otherwise not undertake such projects the opportunity to participate. The Queensland arm of CAPTION demonstrated benefits to both the individual participants and their hospitals, highlighting the additional value of participating in a multicentre project of this type. PMID:19646287

  1. Quality assessment of remote sensing image fusion using feature-based fourth-order correlation coefficient

    NASA Astrophysics Data System (ADS)

    Ma, Dan; Liu, Jun; Chen, Kai; Li, Huali; Liu, Ping; Chen, Huijuan; Qian, Jing

    2016-04-01

    In remote sensing fusion, the spatial details of a panchromatic (PAN) image and the spectral information of multispectral (MS) images are transferred into the fused image in accordance with the characteristics of the human visual system. Thus, a remote sensing image fusion quality assessment called the feature-based fourth-order correlation coefficient (FFOCC) is proposed. FFOCC is based on the feature-based coefficient concept. Spatial features related to the spatial details of the PAN image and spectral features related to the spectral information of the MS images are first extracted from the fused image. Then, the fourth-order correlation coefficient between the spatial and spectral features is calculated and treated as the assessment result. FFOCC was then compared with existing, widely used indices, such as the Erreur Relative Globale Adimensionnelle de Synthese and the quality-with-no-reference index. Results of the fusion and distortion experiments indicate that FFOCC is consistent with subjective evaluation. FFOCC significantly outperforms the other indices in evaluating fusion images that are produced by different fusion methods and that are distorted in spatial and spectral features by blurring, adding noise, and changing intensity. All the findings indicate that the proposed method is an objective and effective quality assessment for remote sensing image fusion.

  2. Development of Flight-Test Performance Estimation Techniques for Small Unmanned Aerial Systems

    NASA Astrophysics Data System (ADS)

    McCrink, Matthew Henry

    This dissertation provides a flight-testing framework for assessing the performance of fixed-wing, small-scale unmanned aerial systems (sUAS) by leveraging sub-system models of components unique to these vehicles. The development of the sub-system models, and their links to broader impacts on sUAS performance, is the key contribution of this work. The sub-system modeling and analysis focuses on the vehicle's propulsion, navigation and guidance, and airframe components. Quantification of the uncertainty in the vehicle's power available and control states is essential for assessing the validity of both the methods and results obtained from flight-tests. Therefore, detailed propulsion and navigation system analyses are presented to validate the flight testing methodology. Propulsion system analysis required the development of an analytic model of the propeller in order to predict the power available over a range of flight conditions. The model is based on the blade element momentum (BEM) method. Additional corrections are added to the basic model in order to capture the Reynolds-dependent scale effects unique to sUAS. The model was experimentally validated using a ground based testing apparatus. The BEM predictions and experimental analysis allow for a parameterized model relating the electrical power, measurable during flight, to the power available required for vehicle performance analysis. Navigation system details are presented with a specific focus on the sensors used for state estimation, and the resulting uncertainty in vehicle state. Uncertainty quantification is provided by detailed calibration techniques validated using quasi-static and hardware-in-the-loop (HIL) ground based testing. The HIL methods introduced use a soft real-time flight simulator to provide inertial quality data for assessing overall system performance. Using this tool, the uncertainty in vehicle state estimation based on a range of sensors, and vehicle operational environments is presented. The propulsion and navigation system models are used to evaluate flight-testing methods for evaluating fixed-wing sUAS performance. A brief airframe analysis is presented to provide a foundation for assessing the efficacy of the flight-test methods. The flight-testing presented in this work is focused on validating the aircraft drag polar, zero-lift drag coefficient, and span efficiency factor. Three methods are detailed and evaluated for estimating these design parameters. Specific focus is placed on the influence of propulsion and navigation system uncertainty on the resulting performance data. Performance estimates are used in conjunction with the propulsion model to estimate the impact sensor and measurement uncertainty on the endurance and range of a fixed-wing sUAS. Endurance and range results for a simplistic power available model are compared to the Reynolds-dependent model presented in this work. Additional parameter sensitivity analysis related to state estimation uncertainties encountered in flight-testing are presented. Results from these analyses indicate that the sub-system models introduced in this work are of first-order importance, on the order of 5-10% change in range and endurance, in assessing the performance of a fixed-wing sUAS.
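
    As a rough illustration of the blade element momentum (BEM) method named above, the sketch below iterates a textbook axial-induction balance between blade-element thrust and momentum-theory thrust on each annulus. It omits the Reynolds-dependent corrections the dissertation adds, and all geometry and aerodynamic inputs are invented:

    ```python
    import numpy as np

    def bem_thrust(V, rpm, R=0.2, B=2, chord=0.02, pitch_deg=15.0, n_elem=20):
        """Generic BEM propeller thrust sketch (not the dissertation's model)."""
        omega = rpm * 2.0 * np.pi / 60.0
        rho = 1.225                                   # air density (kg/m^3)
        r = np.linspace(0.15 * R, 0.98 * R, n_elem)   # radial stations (m)
        dr = r[1] - r[0]
        a = np.zeros(n_elem)                          # axial induction factor
        for _ in range(200):                          # relaxed fixed-point iteration
            phi = np.arctan2(V * (1 + a), omega * r)  # inflow angle
            alpha = np.radians(pitch_deg) - phi       # local angle of attack
            cl = 2.0 * np.pi * alpha                  # thin-airfoil lift slope
            W2 = (V * (1 + a))**2 + (omega * r)**2    # relative speed squared
            dT = 0.5 * rho * W2 * B * chord * cl * np.cos(phi) * dr  # blade-element thrust
            # momentum theory on the same annulus: dT = 4*pi*r*rho*V^2*(1+a)*a*dr
            a_new = dT / (4.0 * np.pi * r * rho * V**2 * (1 + a) * dr)
            a = 0.5 * a + 0.5 * a_new                 # under-relax for stability
        return dT.sum()

    print(bem_thrust(V=10.0, rpm=6000.0), "N (toy inputs)")
    ```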

  3. How to write a materials and methods section of a scientific article?

    PubMed

    Erdemir, Fikret

    2013-09-01

    In contrast to past centuries, scientific research is now conducted systematically in all countries as part of an education strategy, and as a consequence scientists have published thousands of reports. Writing an effective article is generally a significant problem for researchers. All parts of an article, specifically the abstract, materials and methods, results, discussion, and references sections, should contain certain features that should always be considered before sending a manuscript to a journal for publication. The materials and methods section is generally regarded as a relatively easy section to write; therefore, it is often a good idea to begin with it, and it is also a crucial part of an article. Because "reproducible results" are very important in science, a detailed account of the study should be given in this section. If the authors provide sufficient detail, other scientists can repeat their experiments to verify their findings. It is generally recommended that the materials and methods be written in the past tense, in either the active or passive voice. In this section, ethical approval, study dates, number of subjects, groups, evaluation criteria, exclusion criteria, and statistical methods should be described sequentially. It should be noted that a well-written materials and methods section markedly enhances the chances of an article being published.

  4. A practical guideline for examining a uterine niche using ultrasonography in non-pregnant women: a modified Delphi method amongst European experts.

    PubMed

    Jordans, I P M; de Leeuw, R; Stegwee, S I; Amso, N N; Barri-Soldevila, P N; van den Bosch, T; Bourne, T; Brolmann, H A M; Donnez, O; Dueholm, M; Hehenkamp, W J K; Jastrow, N; Jurkovic, D; Mashiach, R; Naji, O; Streuli, I; Timmerman, D; Vd Voet, L F; Huirne, J A F

    2018-03-14

    To generate a uniform, internationally recognized guideline for detailed uterine niche evaluation by ultrasonography in non-pregnant women, using a modified Delphi method amongst international experts. Fifteen international gynecological experts were recruited through their membership of the European niche taskforce group. All experts were physicians with extensive experience in niche evaluation in clinical practice and/or authors of niche studies. Relevant items for niche measurement were determined based on the results of a literature search and the recommendations of a focus group. Two online questionnaires were sent to the expert panel and one group meeting was organized. Consensus was predefined as a rate of agreement of at least 70%. In total, 15 experts participated in the study. Consensus was reached on a total of 42 items of niche evaluation, including definitions, relevance, method of measurement, and tips for visualization of the niche. All experts agreed on the proposed guideline for niche evaluation in non-pregnant women as presented in this paper. Consensus between niche experts was thus achieved on all items regarding ultrasonographic niche measurement.

  5. Semiclassical evaluation of quantum fidelity

    NASA Astrophysics Data System (ADS)

    Vanicek, Jiri

    2004-03-01

    We present a numerically feasible semiclassical method to evaluate quantum fidelity (the Loschmidt echo) in a classically chaotic system. It was thought that such an evaluation would be intractable, but we show that a uniform semiclassical expression not only is tractable but gives remarkably accurate numerical results for the standard map in both the Fermi-golden-rule and Lyapunov regimes. Because it allows a Monte Carlo evaluation, this uniform expression is accurate at times where there are 10^70 semiclassical contributions. Remarkably, the method also explicitly contains the "building blocks" of the analytical theories in the recent literature, and thus permits a direct test of the approximations made by other authors in these regimes, rather than an a posteriori comparison with numerical results. We explain in more detail the extended validity of the classical perturbation approximation and thus provide a "defense" of linear response theory against the famous Van Kampen objection. We point out the potential use of our uniform expression in other areas because it gives a most direct link between the quantum Feynman propagator based on the path integral and the semiclassical Van Vleck propagator based on the sum over classical trajectories. Finally, we test the applicability of our method in integrable and mixed systems.
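
    For readers unfamiliar with the quantity being evaluated, the quantum fidelity (Loschmidt echo) is M(t) = |<psi| e^{+iH2 t} e^{-iH1 t} |psi>|^2 for two nearby Hamiltonians H1 and H2. A brute-force numerical sketch (not the paper's semiclassical method), with random Hermitian matrices as stand-ins for a real system:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200
    A = rng.standard_normal((n, n)); H1 = (A + A.T) / 2                  # Hermitian H1
    Vp = rng.standard_normal((n, n)); H2 = H1 + 1e-2 * (Vp + Vp.T) / 2   # weakly perturbed H2

    e1, U1 = np.linalg.eigh(H1)
    e2, U2 = np.linalg.eigh(H2)
    psi = np.zeros(n); psi[0] = 1.0                                      # arbitrary initial state

    def fidelity(t):
        f = U1 @ (np.exp(-1j * e1 * t) * (U1.T @ psi))   # e^{-i H1 t} |psi>
        b = U2 @ (np.exp(-1j * e2 * t) * (U2.T @ psi))   # e^{-i H2 t} |psi>
        return abs(np.vdot(b, f))**2                     # |<psi| e^{+iH2 t} e^{-iH1 t} |psi>|^2

    for t in (0.0, 1.0, 5.0):
        print(t, fidelity(t))                            # decays from 1 as the echoes dephase
    ```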

  6. Evaluation of a CFD Method for Aerodynamic Database Development using the Hyper-X Stack Configuration

    NASA Technical Reports Server (NTRS)

    Parikh, Paresh; Engelund, Walter; Armand, Sasan; Bittner, Robert

    2004-01-01

    A computational fluid dynamic (CFD) study is performed on the Hyper-X (X-43A) Launch Vehicle stack configuration in support of the aerodynamic database generation in the transonic to hypersonic flow regime. The main aim of the study is the evaluation of a CFD method that can be used to support aerodynamic database development for similar future configurations. The CFD method uses the NASA Langley Research Center developed TetrUSS software, which is based on tetrahedral, unstructured grids. The Navier-Stokes computational method is first evaluated against a set of wind tunnel test data to gain confidence in the code's application to hypersonic Mach number flows. The evaluation includes comparison of the longitudinal stability derivatives on the complete stack configuration (which includes the X-43A/Hyper-X Research Vehicle, the launch vehicle and an adapter connecting the two), detailed surface pressure distributions at selected locations on the stack body, and component (rudder, elevons) forces and moments. The CFD method is further used to predict the stack aerodynamic performance at flow conditions where no experimental data are available, as well as component loads for mechanical design and aero-elastic analyses. An excellent match between the computed and the test data over a range of flow conditions provides a computational tool that may be used for future similar hypersonic configurations with confidence.

  7. The use of high resolution magnetic resonance on 3.0-T system in the diagnosis and surgical planning of intraosseous lesions of the jaws: preliminary results of a retrospective study.

    PubMed

    Cassetta, M; Di Carlo, S; Pranno, N; Stagnitti, A; Pompa, V; Pompa, G

    2012-12-01

    Pre-operative evaluation in oral and maxillofacial surgery is currently performed by computerized tomography (CT). However, in some cases the information provided by traditional imaging methods is not sufficient for diagnosis and surgical planning, and the efficacy of these methods in the evaluation of soft tissues is lower than that of magnetic resonance imaging (MRI). The aim of this study was to show the use of MRI in evaluating the relation between intraosseous lesions of the jaws and anatomical structures when this is difficult with traditional radiographic methods, and to evaluate the usefulness of MRI for depicting the morphostructural characterization of the lesions and infiltration of the soft tissues. Ten patients with a lesion of the jaw were selected. All the patients underwent panoramic radiography (OPT), CT, and MRI. The images were examined by dental and maxillofacial radiologists, who compared the different imaging methods to analyze the morphological and structural characteristics of the lesion and assessed the relationship between the lesion and the anatomical structures. MRI provided more detailed spatial and structural information than the other imaging methods. MRI allowed us to characterize the intraosseous lesions of the jaws and to plan the surgery, resulting in a lower risk of surgical injury to anatomical structures.

  8. Thermographic inspection of pipes, tanks, and containment liners

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Renshaw, Jeremy B., E-mail: jrenshaw@epri.com; Muthu, Nathan; Lhota, James R.

    2015-03-31

    Nuclear power plants are required to operate at a high level of safety. Recent industry and license renewal commitments aim to further increase safety by requiring the inspection of components that have not traditionally undergone detailed inspection in the past, such as tanks and liners. NEI 09-14 requires the inspection of buried pipes and tanks, while containment liner inspections are required as part of license renewal commitments. Containment liner inspections must inspect the carbon steel liner for defects - such as corrosion - that could threaten the pressure boundary and, ideally, should be able to inspect the surrounding concrete for foreign material that could be in contact with the steel liner and potentially initiate corrosion. Such an inspection requires a simultaneous evaluation of two materials with very different material properties. Rapid, yet detailed, inspection results are required due to the massive size of the tanks and containment liners to be inspected. For this reason, thermal NDE methods were evaluated to inspect tank and containment liner mockups with simulated defects. Thermographic Signal Reconstruction (TSR) was utilized to enhance the images and provide detailed information on the sizes and shapes of the observed defects. The results show that thermographic inspection is highly sensitive to the defects of interest and is capable of rapidly inspecting large areas.

  9. Thermographic inspection of pipes, tanks, and containment liners

    NASA Astrophysics Data System (ADS)

    Renshaw, Jeremy B.; Lhota, James R.; Muthu, Nathan; Shepard, Steven M.

    2015-03-01

    Nuclear power plants are required to operate at a high level of safety. Recent industry and license renewal commitments aim to further increase safety by requiring the inspection of components that have not traditionally undergone detailed inspection in the past, such as tanks and liners. NEI 09-14 requires the inspection of buried pipes and tanks, while containment liner inspections are required as part of license renewal commitments. Containment liner inspections must inspect the carbon steel liner for defects - such as corrosion - that could threaten the pressure boundary and, ideally, should be able to inspect the surrounding concrete for foreign material that could be in contact with the steel liner and potentially initiate corrosion. Such an inspection requires a simultaneous evaluation of two materials with very different material properties. Rapid, yet detailed, inspection results are required due to the massive size of the tanks and containment liners to be inspected. For this reason, thermal NDE methods were evaluated to inspect tank and containment liner mockups with simulated defects. Thermographic Signal Reconstruction (TSR) was utilized to enhance the images and provide detailed information on the sizes and shapes of the observed defects. The results show that thermographic inspection is highly sensitive to the defects of interest and is capable of rapidly inspecting large areas.

  10. Magnetic Resonance Imaging (MRI) -- Head

    MedlinePlus Videos and Cool Tools

    ... are clearer and more detailed than other imaging methods. This exam does not use ionizing radiation and ... clearer and more detailed than with other imaging methods. This detail makes MRI an invaluable tool in ...

  11. Definition, analysis and development of an optical data distribution network for integrated avionics and control systems

    NASA Technical Reports Server (NTRS)

    Burns, R. R.

    1981-01-01

    The potential and functional requirements of fiber optic bus designs for next generation aircraft are assessed. State-of-the-art component evaluations and projections were used in the system study. Complex networks were decomposed into dedicated structures, star buses, and serial buses for detailed analysis. Comparisons of dedicated links, star buses, and serial buses, with and without full duplex operation and with consideration of terminal-to-terminal communication requirements, were obtained. This baseline was then used to consider potential extensions of the busing methods to include wavelength multiplexing and optical switches. Example buses were illustrated for various areas of the aircraft as potential starting points for more detailed analysis as the platform becomes better defined.

  12. Exploration of Advanced Probabilistic and Stochastic Design Methods

    NASA Technical Reports Server (NTRS)

    Mavris, Dimitri N.

    2003-01-01

    The primary objective of the three-year research effort was to explore advanced, non-deterministic aerospace system design methods that may have relevance to designers and analysts. The research pursued emerging areas in design methodology and leveraged current fundamental research in design decision-making, probabilistic modeling, and optimization. The specific focus of the investigation was on methods to identify and analyze emerging aircraft technologies in a consistent and complete manner, and on means to make optimal decisions based on this knowledge in a probabilistic environment. The research efforts were classified into two main areas. Task A had the objective of conducting research into the relative merits of possible approaches that account for both multiple criteria and uncertainty in design decision-making; in the final year, the focus was on comparing and contrasting the three methods researched, namely the Joint Probabilistic Decision-Making (JPDM) technique, Physical Programming, and Dempster-Shafer (D-S) theory. Task B focused on exploration of the Technology Identification, Evaluation, and Selection (TIES) methodology developed at ASDL, especially with regard to identifying research needs in the baseline method through implementation exercises. The end result of Task B was documentation of the evolution of the method over time and a technology transfer to the sponsor, such that an initial capability for execution could be obtained by the sponsor. Specifically, the result of the year-3 effort was the creation of a detailed tutorial for implementing the TIES method; within the tutorial package, templates and detailed examples were created for learning and understanding the details of each step. For both research tasks, sample files and tutorials are attached in electronic form on the enclosed CD.

  13. Metal artifact reduction for CT-based luggage screening.

    PubMed

    Karimi, Seemeen; Martz, Harry; Cosman, Pamela

    2015-01-01

    In aviation security, checked luggage is screened by computed tomography scanning. Metal objects in the bags create artifacts that degrade image quality. Although metal artifact reduction (MAR) methods exist, mainly in the medical imaging literature, they either require knowledge of the materials in the scan or are outlier rejection methods. Our aim was to improve and evaluate a MAR method we previously introduced that does not require knowledge of the materials in the scan and gives good results on data with large quantities and different kinds of metal. We describe in detail an optimization that de-emphasizes metal projections and includes a constraint for beam hardening and scatter. This method isolates and reduces artifacts in an intermediate image, which is then fed to a previously published sinogram replacement method. We evaluate the algorithm on luggage data containing multiple and large metal objects. We define measures of artifact reduction and compare this method against others in the MAR literature. Metal artifacts were reduced in our test images, even for multiple and large metal objects, without much loss of structure or resolution. Our MAR method outperforms the methods with which we compared it. Our approach does not make assumptions about image content, nor does it discard metal projections.

  14. The need for precise and well-documented experimental data on prompt fission neutron spectra from neutron-induced fission of 239Pu

    DOE PAGES

    Neudecker, Denise; Taddeucci, Terry Nicholas; Haight, Robert Cameron; ...

    2016-01-06

    The spectrum of neutrons emitted promptly after 239Pu(n,f)—a so-called prompt fission neutron spectrum (PFNS)—is a quantity of high interest, for instance, for reactor physics and global security. However, only a few experimental data sets suitable for evaluations are available, and some of those data sets differ by more than their 1-σ uncertainty boundaries. We present the results of MCNP studies indicating that these differences are partly caused by underestimated multiple scattering contributions, over-corrected background, and inconsistent deconvolution methods. A detailed uncertainty quantification for suitable experimental data was undertaken including these effects, and test-evaluations were performed with the improved uncertainty information. The test-evaluations illustrate that the inadequately estimated effects and the detailed uncertainty quantification have an impact on the evaluated PFNS and associated uncertainties, as well as on the neutron multiplicity of selected critical assemblies. A summary of the data and documentation needed to improve the quality of the experimental database is provided based on the results of the simulations and test-evaluations. Furthermore, given the possibly substantial distortion of the PFNS by multiple scattering and background effects, special care should be taken to reduce these effects in future measurements, e.g., by measuring the 239Pu PFNS as a ratio to either the 235U or 252Cf PFNS.

  15. Evaluation of thermometric monitoring for intradiscal laser ablation in an open 1.0 T MR scanner.

    PubMed

    Wonneberger, Uta; Schnackenburg, Bernhard; Wlodarczyk, Waldemar; Rump, Jens; Walter, Thula; Streitparth, Florian; Teichgräber, Ulf Karl Mart

    2010-01-01

    The purpose of this study was to evaluate different methods of magnetic resonance thermometry (MRTh) for the monitoring of intradiscal laser ablation therapy in an open 1.0 Tesla magnetic resonance (MR) scanner. MRTh methods based on the two endogenous MR temperature indicators, the spin-lattice relaxation time T1 and the water proton resonance frequency (PRF) shift, were optimised and compared in vitro. For the latter, we measured the effective spin-spin relaxation times T2* in intervertebral discs of volunteers. We then compared four gradient echo-based imaging techniques for monitoring laser ablations in human disc specimens. The assessment criteria were outline of anatomic detail, immunity against needle artefacts, signal-to-noise ratio (SNR), and accuracy of the calculated temperature. T2* decreased almost linearly with the volunteers' age (r = 0.9), from 70 to 30 ms (mean 49 ms). The optimum image quality (anatomic detail, needle artefacts, SNR) and temperature accuracy (+/-1.09 degrees C for T1-based and +/-1.11 degrees C for PRF-based MRTh) were achieved with a non-spoiled gradient-echo sequence with an echo time of TE = 10 ms. Combined anatomic and thermometric non-invasive monitoring of laser ablations in the lumbar spine is feasible. The temperature accuracy of the investigated T1- and PRF-based MRTh methods in vitro is high enough to promise reliability in vivo as well.
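
    The PRF-based thermometry mentioned above converts a gradient-echo phase difference into a temperature change via dT = dphi / (2*pi * gamma * alpha * B0 * TE). A minimal sketch at the study's nominal field strength and echo time (the phase values themselves are invented):

    ```python
    import numpy as np

    gamma = 42.58e6          # gyromagnetic ratio of 1H (Hz/T)
    alpha = -0.01e-6         # PRF temperature coefficient (~ -0.01 ppm per deg C)
    B0, TE = 1.0, 10e-3      # field strength (T) and echo time (s), as in the study

    def prf_delta_T(phase_now, phase_ref):
        """Temperature change from the phase difference of two GRE images."""
        dphi = np.angle(np.exp(1j * (phase_now - phase_ref)))   # wrap to (-pi, pi]
        return dphi / (2 * np.pi * gamma * alpha * B0 * TE)

    # example: a -0.027 rad phase change corresponds to roughly +1 deg C at 1.0 T
    print(prf_delta_T(-0.027, 0.0))
    ```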

  16. Occupational exposures and chronic obstructive pulmonary disease (COPD): comparison of a COPD-specific job exposure matrix and expert-evaluated occupational exposures

    PubMed Central

    Kurth, Laura; Doney, Brent; Weinmann, Sheila

    2017-01-01

    Objectives To compare the occupational exposure levels assigned by our National Institute for Occupational Safety and Health chronic obstructive pulmonary disease-specific job exposure matrix (NIOSH COPD JEM) and by expert evaluation of detailed occupational information for various jobs held by members of an integrated health plan in the Northwest USA. Methods We analysed data from a prior study examining COPD and occupational exposures. Jobs were assigned exposure levels using 2 methods: (1) the COPD JEM and (2) expert evaluation. Agreement (Cohen’s κ coefficients), sensitivity and specificity were calculated to compare exposure levels assigned by the 2 methods for 8 exposure categories. Results κ indicated slight to moderate agreement (0.19–0.51) between the 2 methods and was highest for organic dust and overall exposure. Sensitivity of the matrix ranged from 33.9% to 68.5% and was highest for sensitisers, diesel exhaust and overall exposure. Specificity ranged from 74.7% to 97.1% and was highest for fumes, organic dust and mineral dust. Conclusions This COPD JEM was compared with exposures assigned by experts and offers a generalisable approach to assigning occupational exposure. PMID:27777373
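
    As a worked illustration of the agreement statistics reported above, Cohen's kappa, sensitivity, and specificity can all be computed from a 2x2 table of JEM-assigned versus expert-assigned exposure; the counts below are hypothetical, not the study's data:

    ```python
    # 2x2 table: JEM+/expert+, JEM+/expert-, JEM-/expert+, JEM-/expert- (hypothetical counts)
    tp, fp, fn, tn = 68, 12, 31, 189
    n = tp + fp + fn + tn
    po = (tp + tn) / n                                            # observed agreement
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2   # agreement expected by chance
    kappa = (po - pe) / (1 - pe)                                  # Cohen's kappa
    sensitivity = tp / (tp + fn)                                  # JEM detects expert-positive jobs
    specificity = tn / (tn + fp)                                  # JEM rejects expert-negative jobs
    print(f"kappa={kappa:.2f} sens={sensitivity:.1%} spec={specificity:.1%}")
    ```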

  17. An Innovative Method to Involve Community Health Workers as Partners in Evaluation Research

    PubMed Central

    Issel, L. Michele; Townsell, Stephanie J.; Chapple-McGruder, Theresa; Handler, Arden

    2011-01-01

    Objectives. We developed a process through which community outreach workers, whose role is not typically that of a trained researcher, could actively participate in collection of qualitative evaluation data. Methods. Outreach workers for a community-based intervention project received training in qualitative research methodology and certification in research ethics. They used a Voice over Internet Protocol phone-in system to provide narrative reports about challenges faced by women they encountered in their outreach activities as well as their own experiences as outreach workers. Results. Qualitative data contributed by outreach workers provided insights not otherwise available to the evaluation team, including details about the complex lives of underserved women at risk for poor pregnancy outcomes and the challenges and rewards of the outreach worker role. Conclusions. Lay health workers can be a valuable asset as part of a research team. Training in research ethics and methods can be tailored to their educational level and preferences, and their insights provide important information and perspectives that may not be accessible via other data collection methods. Challenges encountered in the dual roles of researcher and lay health worker can be addressed in training. PMID:22021290

  18. Rotor Wake Vortex Definition: Initial Evaluation of 3-C PIV Results of the Hart-II Study

    NASA Technical Reports Server (NTRS)

    Burley, Casey L.; Brooks, Thomas F.; vanderWall, Berend; Richard, Hughes; Raffel, Markus; Beaumier, Philippe; Delrieux, Yves; Lim, Joon W.; Yu, Yung H.; Tung, Chee

    2002-01-01

    An initial evaluation is made of extensive three-component (3C) particle image velocimetry (PIV) measurements within the wake across a rotor disk plane. The model is a 40 percent scale BO-105 helicopter main rotor in forward flight simulation. This study is part of the HART II test program conducted in the German-Dutch Wind Tunnel (DNW). Included are wake vortex field measurements over the advancing and retreating sides of the rotor operating at a typical descent landing condition important for impulsive blade-vortex interaction (BVI) noise. Also included are advancing side results for rotor angle variations from climb to steep descent. Using detailed PIV vector maps of the vortex fields, methods of extracting key vortex parameters are examined and a new method was developed and evaluated. An objective processing method, involving a center-of-vorticity criterion and a vorticity 'disk' integration, was used to determine vortex core size, strength, core velocity distribution characteristics, and unsteadiness. These parameters are mapped over the rotor disk and offer unique physical insight for these parameters of importance for rotor noise and vibration prediction.
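
    A minimal sketch of the objective processing idea described above, assuming a planar PIV velocity field (u, v) on a uniform grid: compute the out-of-plane vorticity, locate its weighted centroid (the center of vorticity), and integrate vorticity over a disk about that center to estimate the core circulation. The function names and the disk radius are illustrative, not the paper's implementation:

    ```python
    import numpy as np

    def vortex_params(u, v, dx, dy, radius):
        """Center of vorticity and disk-integrated circulation from a PIV field."""
        wz = np.gradient(v, dx, axis=1) - np.gradient(u, dy, axis=0)   # out-of-plane vorticity
        y, x = np.mgrid[0:u.shape[0], 0:u.shape[1]]
        x, y = x * dx, y * dy
        w = np.abs(wz)
        xc = (w * x).sum() / w.sum()                                   # vorticity-weighted centroid
        yc = (w * y).sum() / w.sum()
        disk = (x - xc)**2 + (y - yc)**2 <= radius**2                  # "disk" integration region
        gamma = wz[disk].sum() * dx * dy                               # circulation within the disk
        return xc, yc, gamma
    ```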

  19. An innovative method to involve community health workers as partners in evaluation research.

    PubMed

    Peacock, Nadine; Issel, L Michele; Townsell, Stephanie J; Chapple-McGruder, Theresa; Handler, Arden

    2011-12-01

    We developed a process through which community outreach workers, whose role is not typically that of a trained researcher, could actively participate in collection of qualitative evaluation data. Outreach workers for a community-based intervention project received training in qualitative research methodology and certification in research ethics. They used a Voice over Internet Protocol phone-in system to provide narrative reports about challenges faced by women they encountered in their outreach activities as well as their own experiences as outreach workers. Qualitative data contributed by outreach workers provided insights not otherwise available to the evaluation team, including details about the complex lives of underserved women at risk for poor pregnancy outcomes and the challenges and rewards of the outreach worker role. Lay health workers can be a valuable asset as part of a research team. Training in research ethics and methods can be tailored to their educational level and preferences, and their insights provide important information and perspectives that may not be accessible via other data collection methods. Challenges encountered in the dual roles of researcher and lay health worker can be addressed in training.

  20. In Vitro Methods for Comparing Target Binding and CDC Induction Between Therapeutic Antibodies: Applications in Biosimilarity Analysis.

    PubMed

    Salinas-Jazmín, Nohemi; González-González, Edith; Vásquez-Bochm, Luz X; Pérez-Tapia, Sonia M; Velasco-Velázquez, Marco A

    2017-05-04

    Therapeutic monoclonal antibodies (mAbs) are relevant to the treatment of different pathologies, including cancers. The development of biosimilar mAbs by pharmaceutical companies is a market opportunity, but it is also a strategy to increase drug accessibility and reduce therapy-associated costs. The protocols detailed here describe the evaluation of target binding and complement-dependent cytotoxicity (CDC) induction by rituximab in Daudi cells. These two functions require different structural regions of the antibody and are relevant to the clinical effect induced by rituximab. The protocols allow the side-by-side comparison of a reference rituximab and a marketed rituximab biosimilar. The evaluated products showed differences in both target binding and CDC induction, suggesting underlying physicochemical differences and highlighting the need to analyze the impact of those differences in the clinical setting. The methods reported here constitute simple and inexpensive in vitro models for the evaluation of the activity of rituximab biosimilars. Thus, they can be useful during biosimilar development, as well as for quality control in biosimilar production. Furthermore, the presented methods can be extrapolated to other therapeutic mAbs.

  1. Methods of measuring soil moisture in the field

    USGS Publications Warehouse

    Johnson, A.I.

    1962-01-01

    For centuries, the amount of moisture in the soil has been of interest in agriculture. The subject of soil moisture is also of great importance to the hydrologist, forester, and soils engineer. Much equipment and many methods have been developed to measure soil moisture under field conditions. This report discusses and evaluates the various methods for measurement of soil moisture and describes the equipment needed for each method. The advantages and disadvantages of each method are discussed and an extensive list of references is provided for those desiring to study the subject in more detail. The gravimetric method is concluded to be the most satisfactory method for most problems requiring one-time moisture-content data. The radioactive method is normally best for obtaining repeated measurements of soil moisture in place. It is concluded that all methods have some limitations and that the ideal method for measurement of soil moisture under field conditions has yet to be perfected.

  2. Analysis and evaluation of the applicability of green energy technology

    NASA Astrophysics Data System (ADS)

    Xu, Z. J.; Song, Y. K.

    2017-11-01

    With the growing seriousness of environmental problems and the shortage of resources, the applicability of green energy technology has received increasing attention from scholars in different fields. However, current research often takes a single perspective and uses simple methods. Drawing on the Theory of Applicable Technology, this paper analyzes and defines green energy technology and its applicability from the combined perspectives of economy, society, environment, and science and technology, and constructs a corresponding evaluation index system. The paper further applies Fuzzy Comprehensive Evaluation to the assessment of applicability, discusses the evaluation models and methods in depth, and illustrates them with a detailed example. The authors hold that the applicability of green energy technology involves many aspects of economy, society, environment and science and technology, and can be evaluated comprehensively by an index system composed of a number of independent indexes. The evaluation is multi-objective, multi-factor, multi-level and fuzzy-comprehensive, for which Fuzzy Comprehensive Evaluation is an effective and feasible approach. A comprehensive understanding and evaluation of the applicability of green energy technology is of vital theoretical and practical significance for the rational development and utilization of green energy and for the better promotion of sustainable development of humans and nature.
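
    At its core, a fuzzy comprehensive evaluation composes an index-weight vector with a membership matrix. The minimal sketch below shows the common weighted-average operator B = W · R; the index weights, comment grades and membership values are illustrative assumptions, not data from the paper.

        import numpy as np

        # Rows of R: evaluation indexes (economic, social, environmental, technical);
        # columns: comment grades for applicability.
        W = np.array([0.3, 0.25, 0.25, 0.2])   # index weights, summing to 1
        R = np.array([[0.6, 0.3, 0.1],
                      [0.5, 0.4, 0.1],
                      [0.4, 0.4, 0.2],
                      [0.7, 0.2, 0.1]])

        B = W @ R                              # weighted-average composition
        grades = ["high", "medium", "low"]
        print(dict(zip(grades, B.round(3))))
        print("overall:", grades[int(B.argmax())])  # maximum-membership principle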

  3. Methods for economic evaluation of a factorial-design cluster randomised controlled trial of a nutrition supplement and an exercise programme among healthy older people living in Santiago, Chile: the CENEX study

    PubMed Central

    Walker, Damian G; Aedo, Cristian; Albala, Cecilia; Allen, Elizabeth; Dangour, Alan D; Elbourne, Diana; Grundy, Emily; Uauy, Ricardo

    2009-01-01

    Background: In an effort to promote healthy ageing and preserve health and function, the government of Chile has formulated a package of actions into the Programme for Complementary Food in Older People (Programa de Alimentación Complementaria para el Adulto Mayor - PACAM). The CENEX study was designed to evaluate the impact, cost and cost-effectiveness of the PACAM and a specially designed exercise programme on pneumonia incidence, walking capacity and body mass index in healthy older people living in low- to medium-socioeconomic-status areas of Santiago. The purpose of this paper is to describe in detail the methods that will be used to estimate the incremental costs and cost-effectiveness of the interventions. Methods and design: The base-case analysis will adopt a societal perspective, including the direct medical and non-medical costs borne by the government and patients. The cost of the interventions will be calculated by the ingredients approach, in which the total quantities of goods and services actually employed in applying the interventions will be estimated and multiplied by their respective unit prices. Relevant information on costs of interventions will be obtained mainly from administrative records. The costs borne by patients will be collected via exit and telephone interviews. An annual discount rate of 8% will be used, consistent with the rate recommended by the Government of Chile. All costs will be converted from Chilean pesos to US dollars at the 2007 average period exchange rate of US$1 = 522.37 Chilean pesos. To test the robustness of model results, we will vary the assumptions over a plausible range in sensitivity analyses. Discussion: The protocol described here indicates our intent to conduct an economic evaluation alongside the CENEX study. It provides a detailed and transparent statement of planned data collection methods and analyses. Trial registration: ISRCTN48153354 PMID:19473513
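
    The ingredients approach with discounting described above can be sketched in a few lines. In the illustration below, only the 8% discount rate and the 522.37 CLP/USD exchange rate come from the protocol; the item names, quantities, unit prices and timing are invented.

        # Ingredients approach: total cost = sum of (quantity x unit price),
        # discounted to present value and converted to US dollars.
        DISCOUNT_RATE = 0.08
        CLP_PER_USD = 522.37

        ingredients = [  # (item, quantity, unit price in CLP, year incurred)
            ("nutrition supplement sachets", 120000, 350.0, 0),
            ("exercise instructor hours", 4800, 4500.0, 1),
            ("clinic staff hours", 2400, 6000.0, 1),
        ]

        total_usd = 0.0
        for item, qty, price_clp, year in ingredients:
            pv_clp = qty * price_clp / (1 + DISCOUNT_RATE) ** year
            total_usd += pv_clp / CLP_PER_USD
        print(f"total intervention cost: US${total_usd:,.0f}")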

  4. Development of a 3D rockfall simulation model for point cloud topography

    NASA Astrophysics Data System (ADS)

    Noël, François; Wyser, Emmanuel; Jaboyedoff, Michel; Clouthier, Catherine; Locat, Jacques

    2017-04-01

    Rockfall simulations are generally used, for example, as input data to generate rockfall susceptibility maps, to evaluate the reach probability of an infrastructure, or to define input parameter values for mitigation designs. During the simulations, the lateral and vertical deviations of the particle and the change of velocity occurring during the impacts have to be evaluated. Numerous factors control rockfall paths and velocities, such as the particle's and terrain's shapes and compositions. Some models, especially the ones using discrete element methods, can consider many physical factors. However, a compromise often has to be made between the time needed to produce a sufficient number of 2D or 3D rockfall trajectories and the level of complexity of the model. In this presentation, the current version of our rockfall model in development is detailed and the compromises that were made are explained. For example, it is hard to predict the sizes and shapes of the components that could fall from a developing rock instability, or whether they will break after the first impact or stay as massive blocks. For this reason, we decided for now to simplify the particle's shape to a sphere which can vary in size and to use a cubical shape to compute the 3D rotational inertia. In contrast to the particle's characteristics, the terrain's shape is known and can be acquired in detail using current topographical acquisition methods, e.g. airborne and terrestrial laser scans and aerial-based structure from motion. We made no sacrifice on that side and developed our model so it can simulate rockfalls directly on 3D point cloud topographical data. It has also been shown that calibrating velocity weighting factors, often called restitution coefficients, is not an easy task. Divergent results can be obtained by different users using the same simulation program simply because they use different weighting factors, which are hard to evaluate and quantify from field work. Moreover, the normal velocity weighting factor does not seem to be constant as the impact conditions change, even if the terrain composition does not change; it could be correlated with the incident angle. We therefore decided for now to let impact characteristics control velocity changes with some variability and to use the detailed topographic representation to control the direction after a rebound. As a high topographical level of detail is used, less random variability is needed. Therefore, it would be easier for different users working on the same study area to get similar results as long as detailed enough topographical data are used. Some application cases are also shown. Further development should focus on more calibration with known rockfall events, taking into account impacts against trees and fragmentation of rock blocks, and improving the impact model by studying impacts on different terrain compositions from a mechanical approach using discrete element method based simulations.
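
    The simplified particle model described above can be sketched as follows, assuming the cube used for rotational inertia is taken at equal volume to the sphere (the abstract does not state the exact convention) and an arbitrary example rock density.

        import math

        def particle_properties(volume_m3, density_kg_m3=2700.0):
            """Sphere for geometry, cube of equal volume for rotational inertia."""
            mass = density_kg_m3 * volume_m3
            radius = (3.0 * volume_m3 / (4.0 * math.pi)) ** (1.0 / 3.0)
            side = volume_m3 ** (1.0 / 3.0)     # edge of the equal-volume cube
            inertia = mass * side ** 2 / 6.0    # cube about a central axis
            return mass, radius, inertia

        print(particle_properties(0.5))  # a 0.5 m^3 block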

  5. Family History of Breast Cancer, Breast Density, and Breast Cancer Risk in a U.S. Breast Cancer Screening Population.

    PubMed

    Ahern, Thomas P; Sprague, Brian L; Bissell, Michael C S; Miglioretti, Diana L; Buist, Diana S M; Braithwaite, Dejana; Kerlikowske, Karla

    2017-06-01

    Background: The utility of incorporating detailed family history into breast cancer risk prediction hinges on its independent contribution to breast cancer risk. We evaluated associations between detailed family history and breast cancer risk while accounting for breast density. Methods: We followed 222,019 participants ages 35 to 74 in the Breast Cancer Surveillance Consortium, of whom 2,456 developed invasive breast cancer. We calculated standardized breast cancer risks within joint strata of breast density and simple (first-degree female relative) or detailed (first-degree, second-degree, or first- and second-degree female relative) breast cancer family history. We fit log-binomial models to estimate age-specific breast cancer associations for simple and detailed family history, accounting for breast density. Results: Simple first-degree family history was associated with increased breast cancer risk compared with no first-degree history (risk ratio (RR), 1.5; 95% confidence interval (CI), 1.0-2.1 at age 40; RR, 1.5; 95% CI, 1.3-1.7 at age 50; RR, 1.4; 95% CI, 1.2-1.6 at age 60; RR, 1.3; 95% CI, 1.1-1.5 at age 70). Breast cancer associations with detailed family history were strongest for women with first- and second-degree family history compared with no history (RR, 1.9; 95% CI, 1.1-3.2 at age 40); this association weakened in higher age groups (RR, 1.2; 95% CI, 0.88-1.5 at age 70). Associations did not change substantially when adjusted for breast density. Conclusions: Even with adjustment for breast density, a history of breast cancer in both first- and second-degree relatives is more strongly associated with breast cancer than simple first-degree family history. Impact: Future efforts to improve breast cancer risk prediction models should evaluate detailed family history as a risk factor. Cancer Epidemiol Biomarkers Prev; 26(6); 938-44. ©2017 American Association for Cancer Research.
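
    A log-binomial model of the kind used above can be fit with standard GLM tooling. The sketch below uses simulated data (the BCSC cohort and covariates are not reproduced here); note that log-binomial fits can be numerically fragile, and Poisson regression with robust errors is a common fallback.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 50000
        history = rng.binomial(1, 0.15, n)        # 1 = first-degree family history
        dense = rng.binomial(1, 0.4, n)           # 1 = dense breasts
        p = np.exp(np.log(0.01) + np.log(1.5) * history + np.log(1.2) * dense)
        cancer = rng.binomial(1, p)               # simulated outcomes, true RR = 1.5

        X = sm.add_constant(np.column_stack([history, dense]))
        fit = sm.GLM(cancer, X,
                     family=sm.families.Binomial(link=sm.families.links.Log())).fit()
        print(np.exp(fit.params[1]))              # estimated risk ratio for history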

  6. Evaluation of stabilization techniques for ion implant processing

    NASA Astrophysics Data System (ADS)

    Ross, Matthew F.; Wong, Selmer S.; Minter, Jason P.; Marlowe, Trey; Narcy, Mark E.; Livesay, William R.

    1999-06-01

    With the integration of high current ion implant processing into volume CMOS manufacturing, the need for photoresist stabilization to achieve a stable ion implant process is critical. This study compares electron beam stabilization, a non-thermal process, with more traditional thermal stabilization techniques such as hot plate baking and vacuum oven processing. The electron beam processing is carried out in a flood exposure system with no active heating of the wafer. These stabilization techniques are applied to typical ion implant processes that might be found in a CMOS production process flow. The stabilization processes are applied to a 1.1-micrometer-thick PFI-38A i-line photoresist film prior to ion implant processing. Post-stabilization critical dimension (CD) variation is detailed with respect to wall slope and feature integrity. SEM photographs detail the effects of the stabilization technique on photoresist features. The thermal stability of the photoresist is shown for different levels of stabilization and post-stabilization thermal cycling. Thermal flow stability of the photoresist is detailed via SEM photographs. A significant improvement in thermal stability is achieved with the electron beam process, such that photoresist features are stable to temperatures in excess of 200 degrees C. Ion implant processing parameters are evaluated and compared for the different stabilization methods. The ion implant system's end-station chamber pressure is detailed as a function of ion implant process and stabilization condition. The ion implant process conditions are detailed for varying factors such as ion current, energy, and total dose. A reduction in the ion implant system's end-station chamber pressure is achieved with the electron beam stabilization process over the other techniques considered. This reduction in end-station chamber pressure is shown to provide a reduction in total process time for a given ion implant dose. Improvements in the ion implant process are detailed across several combinations of current and energy.

  7. Comparison of the spatial landmark scatter of various 3D digitalization methods.

    PubMed

    Boldt, Florian; Weinzierl, Christian; Hertrich, Klaus; Hirschfelder, Ursula

    2009-05-01

    The aim of this study was to compare four different three-dimensional digitalization methods on the basis of the complex anatomical surface of a cleft lip and palate plaster cast, and to ascertain their accuracy when positioning 3D landmarks. A cleft lip and palate plaster cast was digitalized with the SCAN3D photo-optical scanner, the OPTIX 400S laser-optical scanner, the Somatom Sensation 64 computed tomography system and the MicroScribe MLX 3-axis articulated-arm digitizer. First, four examiners appraised by individual visual inspection the surface detail reproduction of the three non-tactile digitalization methods in comparison to the reference plaster cast. The four examiners then localized the landmarks five times at intervals of 2 weeks. This involved simply copying, or spatially tracing, the landmarks from a reference plaster cast to each model digitally reproduced by each digitalization method. Statistical analysis of the landmark distribution specific to each method was performed based on the 3D coordinates of the positioned landmarks. Visual evaluation of surface detail conformity assigned the photo-optical digitalization method an average score of 1.5, the highest subjectively determined conformity (surpassing the computed tomography and laser-optical methods). The tactile scanning method revealed the lowest degree of 3D landmark scatter, 0.12 mm, and at 1.01 mm the lowest maximum 3D landmark scatter; it was followed by the computed tomography, photo-optical and laser-optical methods (in that order). This study demonstrates that landmark precision and reproducibility are determined by the complexity of the reference-model surface as well as the digital surface quality and the individual ability of each evaluator to capture 3D spatial relationships. The differences in 3D landmark scatter and in lowest maximum 3D landmark scatter between the best and the worst methods were minor. The measurement results in this study reveal that it is not the method's precision but rather the complexity of the object analysis being planned that should determine which method is ultimately employed.
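
    One simple way to quantify the 3D landmark scatter reported above is the distance of each repeated placement from the mean position of that landmark. This is an assumed definition for illustration, since the abstract does not give the exact formula, and the coordinates below are invented.

        import numpy as np

        def landmark_scatter(placements):
            """placements: (n_repeats, 3) coordinates of one landmark in mm."""
            centroid = placements.mean(axis=0)
            d = np.linalg.norm(placements - centroid, axis=1)
            return d.mean(), d.max()   # mean and maximum 3D scatter

        pts = np.array([[10.1, 5.2, 3.3], [10.0, 5.1, 3.4], [10.2, 5.3, 3.2],
                        [9.9, 5.2, 3.3], [10.1, 5.1, 3.3]])
        print(landmark_scatter(pts))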

  8. Comparison of accuracy of fibrosis degree classifications by liver biopsy and non-invasive tests in chronic hepatitis C

    PubMed Central

    2011-01-01

    Background: Non-invasive tests have been constructed and evaluated mainly for binary diagnoses such as significant fibrosis. Recently, detailed fibrosis classifications for several non-invasive tests have been developed, but their accuracy has not been thoroughly evaluated in comparison to liver biopsy, especially in clinical practice and for Fibroscan. Therefore, the main aim of the present study was to evaluate the accuracy of detailed fibrosis classifications available for non-invasive tests and liver biopsy. The secondary aim was to validate these accuracies in independent populations. Methods: Four HCV populations provided 2,068 patients with liver biopsy, four different pathologist skill levels and non-invasive tests. Results were expressed as percentages of correctly classified patients. Results: In population #1, including 205 patients and comparing liver biopsy (reference: consensus reading by two experts) and blood tests, Metavir fibrosis (FM) stage accuracy was 64.4% with local pathologists vs. 82.2% (p < 10⁻³) with a single expert pathologist. Significant discrepancy (≥ 2 FM vs. the reference histological result) rates were: Fibrotest: 17.2%, FibroMeter2G: 5.6%, local pathologists: 4.9%, FibroMeter3G: 0.5%, expert pathologist: 0% (p < 10⁻³). In population #2, including 1,056 patients and comparing blood tests, the discrepancy scores of the detailed fibrosis classification, taking into account the error magnitude, were significantly different between FibroMeter2G (0.30 ± 0.55) and FibroMeter3G (0.14 ± 0.37, p < 10⁻³) or Fibrotest (0.84 ± 0.80, p < 10⁻³). In populations #3 (and #4), including 458 (359) patients and comparing blood tests and Fibroscan, accuracies of the detailed fibrosis classification were, respectively: Fibrotest: 42.5% (33.5%), Fibroscan: 64.9% (50.7%), FibroMeter2G: 68.7% (68.2%), FibroMeter3G: 77.1% (83.4%), p < 10⁻³ (p < 10⁻³). Significant discrepancy (≥ 2 FM) rates were, respectively: Fibrotest: 21.3% (22.2%), Fibroscan: 12.9% (12.3%), FibroMeter2G: 5.7% (6.0%), FibroMeter3G: 0.9% (0.9%), p < 10⁻³ (p < 10⁻³). Conclusions: The accuracy in detailed fibrosis classification of the best-performing blood test outperforms liver biopsy read by a local pathologist, i.e., in clinical practice; however, the classification precision is apparently lesser. This detailed classification accuracy is much lower than that of significant fibrosis with Fibroscan and even Fibrotest but higher with FibroMeter3G. FibroMeter classification accuracy was significantly higher than those of other non-invasive tests. Finally, for hepatitis C evaluation in clinical practice, fibrosis degree can be evaluated using an accurate blood test. PMID:22129438
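
    The two headline metrics above reduce to simple comparisons of predicted and reference stages: the percentage of correctly classified patients, and the rate of significant discrepancy (two or more Metavir stages from the reference). The stage assignments below are invented for illustration.

        import numpy as np

        ref = np.array([0, 1, 2, 3, 4, 2, 3, 1])    # reference Metavir stages (biopsy)
        test = np.array([0, 1, 3, 3, 4, 0, 3, 1])   # stages from a non-invasive test
        print("correctly classified:", 100.0 * (ref == test).mean(), "%")
        print("discrepancy >= 2 FM:", 100.0 * (np.abs(ref - test) >= 2).mean(), "%")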

  9. Diagnosis of condensation-induced waterhammer: Case studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Izenson, M.G.; Rothe, P.H.; Wallis, G.B.

    1988-10-01

    This guidebook provides reference material and diagnostic procedures concerning condensation-induced waterhammer in nuclear power plants. Condensation-induced waterhammer is the most damaging form of waterhammer, and its diagnosis is complicated by the complex nature of the underlying phenomena. In Volume 1, the guidebook groups condensation-induced waterhammers into five event classes which have similar phenomena and levels of damage. Diagnostic guidelines focus on locating the event center where condensation and slug acceleration take place. Diagnosis is described in three stages: an initial assessment, detailed evaluation and final confirmation. Graphical scoping analyses are provided to evaluate whether an event from one of the event classes could have occurred at the event center. Examples are provided for each type of waterhammer. Special instructions are provided for walking down damaged piping and evaluating damage due to waterhammer. To illustrate the diagnostic methods and document past experience, six case studies have been compiled in Volume 2. These case studies, based on actual condensation-induced waterhammer events at nuclear plants, present detailed data and work through the event diagnosis using the tools introduced in the first volume. 20 refs., 21 figs., 6 tabs.

  10. Performance evaluation of a digital mammography unit using a contrast-detail phantom

    NASA Astrophysics Data System (ADS)

    Elizalde-Cabrera, J.; Brandan, M.-E.

    2015-01-01

    The relation between image quality and mean glandular dose (MGD) has been studied for a Senographe 2000D mammographic unit used for research in our laboratory. The magnitudes were evaluated for a clinically relevant range of acrylic thicknesses and radiological techniques. The CDMAM phantom was used to determine the contrast-detail curve. Also, an alternative method based on the analysis of signal-to-noise (SNR) and contrast-to-noise (CNR) ratios from the CDMAM image was proposed and applied. A simple numerical model was utilized to successfully interpret the results. Optimum radiological techniques were determined using the figures-of-merit FOM_SNR = SNR²/MGD and FOM_CNR = CNR²/MGD. Main results were: the evaluation of the detector response flattening process (it reduces by about one half the spatial non-homogeneities due to the X-ray field), MGD measurements (the values comply with standards), and verification of the automatic exposure control performance (it is sensitive to fluence attenuation, not to contrast). For 4-5 cm phantom thicknesses, the optimum radiological techniques were Rh/Rh 34 kV to optimize SNR, and Rh/Rh 28 kV to optimize CNR.
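
    The figures of merit defined above are straightforward to compute once SNR (or CNR) and MGD are measured; because SNR² scales roughly linearly with dose, the ratio is largely dose-independent for a given technique. The SNR, CNR and dose values below are invented for illustration.

        def figure_of_merit(quality, mgd_mGy):
            """FOM = SNR^2/MGD or CNR^2/MGD, per the definitions above."""
            return quality ** 2 / mgd_mGy

        techniques = {                 # (SNR, CNR, mean glandular dose in mGy)
            "Rh/Rh 28 kV": (55.0, 4.1, 1.6),
            "Rh/Rh 34 kV": (68.0, 3.7, 2.0),
        }
        for name, (snr, cnr, mgd) in techniques.items():
            print(name, figure_of_merit(snr, mgd), figure_of_merit(cnr, mgd))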

  11. Evaluation of health promotion in schools: a realistic evaluation approach using mixed methods.

    PubMed

    Pommier, Jeanine; Guével, Marie-Renée; Jourdan, Didier

    2010-01-28

    Schools are key settings for health promotion (HP) but the development of suitable approaches for evaluating HP in schools is still a major topic of discussion. This article presents a research protocol of a program developed to evaluate HP. After reviewing HP evaluation issues, the various possible approaches are analyzed and the importance of a realistic evaluation framework and a mixed methods (MM) design are demonstrated. The design is based on a systemic approach to evaluation, taking into account the mechanisms, context and outcomes, as defined in realistic evaluation, adjusted to our own French context using an MM approach. The characteristics of the design are illustrated through the evaluation of a nationwide HP program in French primary schools designed to enhance children's social, emotional and physical health by improving teachers' HP practices and promoting a healthy school environment. An embedded MM design is used in which a qualitative data set plays a supportive, secondary role in a study based primarily on a different quantitative data set. The way the qualitative and quantitative approaches are combined through the entire evaluation framework is detailed. This study is a contribution towards the development of suitable approaches for evaluating HP programs in schools. The systemic approach of the evaluation carried out in this research is appropriate since it takes account of the limitations of traditional evaluation approaches and considers suggestions made by the HP research community.

  12. RootGraph: a graphic optimization tool for automated image analysis of plant roots

    PubMed Central

    Cai, Jinhai; Zeng, Zhanghui; Connor, Jason N.; Huang, Chun Yuan; Melino, Vanessa; Kumar, Pankaj; Miklavcic, Stanley J.

    2015-01-01

    This paper outlines a numerical scheme for accurate, detailed, and high-throughput image analysis of plant roots. In contrast to existing root image analysis tools that focus on root system-average traits, a novel, fully automated and robust approach for the detailed characterization of root traits, based on a graph optimization process is presented. The scheme, firstly, distinguishes primary roots from lateral roots and, secondly, quantifies a broad spectrum of root traits for each identified primary and lateral root. Thirdly, it associates lateral roots and their properties with the specific primary root from which the laterals emerge. The performance of this approach was evaluated through comparisons with other automated and semi-automated software solutions as well as against results based on manual measurements. The comparisons and subsequent application of the algorithm to an array of experimental data demonstrate that this method outperforms existing methods in terms of accuracy, robustness, and the ability to process root images under high-throughput conditions. PMID:26224880

  13. Particle swarm optimization-based local entropy weighted histogram equalization for infrared image enhancement

    NASA Astrophysics Data System (ADS)

    Wan, Minjie; Gu, Guohua; Qian, Weixian; Ren, Kan; Chen, Qian; Maldague, Xavier

    2018-06-01

    Infrared image enhancement plays a significant role in intelligent urban surveillance systems for smart city applications. Unlike existing methods that only exaggerate the global contrast, we propose a particle swarm optimization-based local entropy weighted histogram equalization which involves the enhancement of both local details and fore- and background contrast. First of all, a novel local entropy weighted histogram depicting the distribution of detail information is calculated based on a modified hyperbolic tangent function. Then, the histogram is divided into two parts via a threshold maximizing the inter-class variance in order to improve the contrasts of foreground and background, respectively. To avoid over-enhancement and noise amplification, double plateau thresholds of the presented histogram are formulated by means of a particle swarm optimization algorithm. Lastly, each sub-image is equalized independently according to the constrained sub-local entropy weighted histogram. Comparative experiments implemented on real infrared images prove that our algorithm outperforms other state-of-the-art methods in terms of both visual and quantitative evaluations.
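
    The plateau-limited equalization step that the PSO constrains can be sketched as follows; here the upper and lower plateau thresholds are fixed by hand rather than optimized, a plain intensity histogram stands in for the entropy-weighted one, and a random image stands in for an infrared frame.

        import numpy as np

        def double_plateau_equalize(img, t_up, t_low):
            """Histogram equalization with upper/lower plateau clipping."""
            hist = np.bincount(img.ravel(), minlength=256).astype(float)
            nz = hist > 0
            hist[nz] = np.clip(hist[nz], t_low, t_up)  # limit over-enhancement
            cdf = hist.cumsum()                        # and noise amplification
            lut = np.round(255.0 * cdf / cdf[-1]).astype(np.uint8)
            return lut[img]

        img = np.random.randint(0, 256, (128, 128), dtype=np.uint8)
        out = double_plateau_equalize(img, t_up=200, t_low=5)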

  14. Refining atmosphere light to improve the dark channel prior algorithm

    NASA Astrophysics Data System (ADS)

    Gan, Ling; Li, Dagang; Zhou, Can

    2017-05-01

    Defogged images obtained with the dark channel prior algorithm have shortcomings such as color distortion, dim illumination and detail loss near the observer. The main reasons are that the atmospheric light is estimated as a single value and its change with scene depth is not considered. We therefore model the atmospheric light, one parameter of the defogging model. Firstly, we discretize the atmospheric light into equivalent point sources and build a discrete model of the light. Secondly, we build several rough candidate models by analyzing the relationship between the atmospheric light and the medium transmission. Finally, by analyzing the results of many experiments qualitatively and quantitatively, we obtain the selected and optimized model. Although this method slightly increases the processing time, the evaluation metrics (histogram correlation coefficient and peak signal-to-noise ratio) are improved significantly and the defogging result conforms better to human vision. The color and the details near the observer in the defogged image are also better than those achieved by the original method.
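
    For context, the standard dark-channel-prior recovery that this work refines assumes the haze model I(x) = J(x)t(x) + A(1 − t(x)), so the scene is recovered as J = (I − A)/max(t, t0) + A. The sketch below uses a single global atmospheric light A, as in the original algorithm; the paper's contribution is precisely to replace this constant with a depth-dependent model.

        import numpy as np
        from scipy.ndimage import minimum_filter

        def dark_channel(img, patch=15):
            """img: HxWx3 float array in [0, 1]."""
            return minimum_filter(img.min(axis=2), size=patch)

        def defog(img, omega=0.95, t0=0.1):
            dc = dark_channel(img)
            A = img.reshape(-1, 3)[dc.ravel().argsort()[-100:]].mean(axis=0)
            t = 1.0 - omega * dark_channel(img / A)   # estimated transmission
            return np.clip((img - A) / np.maximum(t, t0)[..., None] + A, 0, 1)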

  15. Computer program to perform cost and weight analysis of transport aircraft. Volume 1: Summary

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A digital computer program for evaluating the weight and costs of advanced transport designs was developed. The resultant program, intended for use at the preliminary design level, incorporates both batch mode and interactive graphics run capability. The basis of the weight and cost estimation method developed is a unique way of predicting the physical design of each detail part of a vehicle structure at a time when only configuration concept drawings are available. In addition, the technique relies on methods to predict the precise manufacturing processes and the associated material required to produce each detail part. Weight data are generated in four areas of the program. Overall vehicle system weights are derived on a statistical basis as part of the vehicle sizing process. Theoretical weights, actual weights, and the weight of the raw material to be purchased are derived as part of the structural synthesis and part definition processes based on the computed part geometry.

  16. Visualization of chorioretinal vasculature in mice in vivo using a combined OCT/SLO imaging system

    NASA Astrophysics Data System (ADS)

    Goswami, Mayank; Zhang, Pengfei; Pugh, Edward N.; Zawadzki, Robert J.

    2016-03-01

    Chorioretinal blood vessel morphology in mice is of great interest to researchers studying eye disease mechanisms in animal models. Two leading retinal imaging modalities -- Optical Coherence Tomography (OCT) and Scanning Laser Ophthalmoscopy (SLO) -- have offered much insight into vascular morphology and blood flow. OCT "flow-contrast" methods have provided detailed mapping of vascular morphology with micrometer depth resolution, while OCT Doppler methods have enabled the measurement of local flow velocities. SLO remains indispensable in studying blood leakage, microaneurysms, and the clearance time of contrast agents of different sizes. In this manuscript we present results obtained with a custom OCT/SLO system applied to visualize the chorioretinal vascular morphology of pigmented C57Bl/6J and albino nude (Nu/Nu) mice. Blood perfusion maps of choroidal vessels and choriocapillaris created by OCT and SLO are presented, along with detailed evaluation of different OCT imaging parameters, including the use of the scattering contrast agent Intralipid. Future applications are discussed.

  17. Challenges and opportunities in bioanalytical support for gene therapy medicinal product development.

    PubMed

    Ma, Mark; Balasubramanian, Nanda; Dodge, Robert; Zhang, Yan

    2017-09-01

    Gene and nucleic acid therapies have demonstrated patient benefits in addressing unmet medical needs. Besides considerations regarding the biological nature of the gene therapy, the quality of bioanalytical methods plays an important role in ensuring the success of these novel therapies. Inconsistent approaches among bioanalytical labs during preclinical and clinical phases have been observed. There are many underlying reasons for this inconsistency: the various platforms and reagents used in quantitative methods, the lack of detailed regulatory guidance on method validation, and uncertainty about immunogenicity strategy in supporting gene therapy may all be influential. This review summarizes recent practices and considerations in bioanalytical support of pharmacokinetics/pharmacodynamics and immunogenicity evaluations in gene therapy development, with insight into method design, development and validation.

  18. A first approach for digital representation and automated classification of toolmarks on locking cylinders using confocal laser microscopy

    NASA Astrophysics Data System (ADS)

    Clausing, Eric; Kraetzer, Christian; Dittmann, Jana; Vielhauer, Claus

    2012-10-01

    An important part of criminalistic forensics is the analysis of toolmarks. Such toolmarks often consist of many single striations, scratches and dents which can allow for conclusions regarding the sequence of events or the tools used. To obtain reliable results from an automated analysis and contactless acquisition of such toolmarks, a detailed digital representation of the marks, their orientation and their placement relative to one another is required. For marks of firearms and tools, the desired result of an analysis is a conclusion as to whether or not a mark has been generated by a tool under suspicion. For toolmark analysis on locking cylinders, the aim is not the identification of the used tool but rather the identification of the opening method. The challenge of such an identification is that a one-to-one comparison of two images is not sufficient: although two marked objects may look completely different with regard to the specific location and shape of the found marks, they can still represent samples of the identical opening method. This paper provides a first approach for modelling toolmarks on lock pins and takes into consideration the different requirements necessary to generate a detailed and interpretable digital representation of these traces. These requirements are 'detail', i.e. adequate features which allow for a suitable representation and interpretation of single marks; 'meta detail', i.e. adequate representation of the context and connection between all marks; and 'distinctiveness', i.e. the possibility to reliably distinguish different sample types by the corresponding model. The model is evaluated with a set of 15 physical samples (resulting in 675 digital scans) of lock pins from cylinders opened with different opening methods, contactlessly scanned with a confocal laser microscope. The presented results suggest a high suitability for the intended purpose of opening method determination.

  19. Orbit transfer vehicle engine technology program. Task B-6 high speed turbopump bearings

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Bearing types were evaluated for use on the Orbit Transfer Vehicle (OTV) high pressure fuel pump. The high speed, high load, and long bearing life requirements dictated selection of hydrostatic bearings as the logical candidate for this engine. Design and fabrication of a bearing tester to evaluate these cryogenic hydrostatic bearings was then conducted. Detailed analysis, evaluation of bearing materials, and design of the hydrostatic bearings were completed, resulting in fabrication of Carbon P5N and Kentanium hydrostatic bearings. Rotordynamic analyses determined the exact bearing geometry chosen. Instrumentation was evaluated and data acquisition methods were determined for monitoring shaft motion up to speeds in excess of 200,000 RPM in a cryogenic atmosphere. Fabrication of all hardware was completed, but assembly and testing were conducted outside of this contract.

  20. Improving the Reporting Quality of Nonrandomized Evaluations of Behavioral and Public Health Interventions: The TREND Statement

    PubMed Central

    Des Jarlais, Don C.; Lyles, Cynthia; Crepaz, Nicole

    2004-01-01

    Developing an evidence base for making public health decisions will require using data from evaluation studies with randomized and nonrandomized designs. Assessing individual studies and using studies in quantitative research syntheses require transparent reporting of the study, with sufficient detail and clarity to readily see differences and similarities among studies in the same area. The Consolidated Standards of Reporting Trials (CONSORT) statement provides guidelines for transparent reporting of randomized clinical trials. We present the initial version of the Transparent Reporting of Evaluations with Nonrandomized Designs (TREND) statement. These guidelines emphasize the reporting of theories used and descriptions of intervention and comparison conditions, research design, and methods of adjusting for possible biases in evaluation studies that use nonrandomized designs. PMID:14998794

  1. SNR Improvement of QEPAS System by Preamplifier Circuit Optimization and Frequency Locked Technique

    NASA Astrophysics Data System (ADS)

    Zhang, Qinduan; Chang, Jun; Wang, Zongliang; Wang, Fupeng; Jiang, Fengting; Wang, Mengyao

    2018-06-01

    Preamplifier circuit noise is of great importance in quartz-enhanced photoacoustic spectroscopy (QEPAS) systems. In this paper, several noise sources are evaluated and discussed in detail. Based on the noise characteristics, a corresponding noise reduction method is proposed. In addition, a frequency-locked technique is introduced to further reduce the QEPAS system noise and improve the signal, achieving better performance than the conventional frequency scan method. As a result, the signal-to-noise ratio (SNR) could be increased 14 times by utilizing the frequency-locked technique and a numerical averaging technique in the QEPAS system for water vapor detection.
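
    One ingredient of such an SNR gain, numerical averaging, is easy to demonstrate: averaging N repeated scans suppresses white noise by roughly the square root of N. The toy signal and noise level below are invented and do not represent the QEPAS measurement itself.

        import numpy as np

        rng = np.random.default_rng(1)
        t = np.linspace(0, 1, 2000)
        signal = np.exp(-((t - 0.5) ** 2) / 2e-4)   # absorption-line-like peak

        def snr(n_avg):
            scans = signal + rng.normal(0, 0.5, (n_avg, t.size))
            avg = scans.mean(axis=0)
            return avg.max() / avg[:500].std()      # baseline region as noise

        print(snr(1), snr(100))   # expect roughly a 10x gain at N = 100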

  2. Demand Response Resource Quantification with Detailed Building Energy Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hale, Elaine; Horsey, Henry; Merket, Noel

    Demand response is a broad suite of technologies that enables changes in electrical load operations in support of power system reliability and efficiency. Although demand response is not a new concept, there is new appetite for comprehensively evaluating its technical potential in the context of renewable energy integration. The complexity of demand response makes this task difficult -- we present new methods for capturing the heterogeneity of potential responses from buildings, their time-varying nature, and metrics such as thermal comfort that help quantify likely acceptability of specific demand response actions. Computed with an automated software framework, the methods are scalable.

  3. SKYDOSE: A code for gamma skyshine calculations using the integral line-beam method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shultis, J.K.; Faw, R.E.; Brockhoff, R.C.

    1994-07-01

    SKYDOSE evaluates skyshine dose from an isotropic, monoenergetic, point photon source collimated by three simple geometries: (1) a source in a silo; (2) a source behind an infinitely long, vertical, black wall; and (3) a source in a rectangular building. In all three geometries, an optional overhead shield may be specified. The source energy must be between 0.02 and 100 MeV (10 MeV for sources with an overhead shield). This is a user's manual. Other references give more detail on the integral line-beam method used by SKYDOSE.

  4. Analytical Methods to Evaluate the Quality of Edible Fats and Oils: The JOCS Standard Methods for Analysis of Fats, Oils and Related Materials (2013) and Advanced Methods.

    PubMed

    Endo, Yasushi

    2018-01-01

    Edible fats and oils are among the basic components of the human diet, along with carbohydrates and proteins, and they are a source of high energy and of essential fatty acids such as linoleic and linolenic acids. Edible fats and oils are used for pan- and deep-frying, and in salad dressings, mayonnaise and processed foods such as chocolates and creams. The physical and chemical properties of edible fats and oils can affect the quality of oil-based foods and hence must be evaluated in detail. The physical characteristics of edible fats and oils include color, specific gravity, refractive index, melting point, congeal point, smoke point, flash point, fire point, and viscosity, while the chemical characteristics include acid value, saponification value, iodine value, fatty acid composition, trans isomers, triacylglycerol composition, unsaponifiable matter (sterols, tocopherols) and minor components (phospholipids, chlorophyll pigments, glycidyl fatty acid esters). Peroxide value, p-anisidine value, carbonyl value, polar compounds and polymerized triacylglycerols are indexes of the deterioration of edible fats and oils. This review describes the analytical methods used to evaluate the quality of edible fats and oils, especially the Standard Methods for Analysis of Fats, Oils and Related Materials edited by the Japan Oil Chemists' Society (the JOCS standard methods), as well as advanced methods.

  5. The Quality of Methods Reporting in Parasitology Experiments

    PubMed Central

    Flórez-Vargas, Oscar; Bramhall, Michael; Noyes, Harry; Cruickshank, Sheena; Stevens, Robert; Brass, Andy

    2014-01-01

    There is a growing concern both inside and outside the scientific community over the lack of reproducibility of experiments. The depth and detail of reported methods are critical to the reproducibility of findings, but also for making it possible to compare and integrate data from different studies. In this study, we evaluated in detail the methods reporting in a comprehensive set of trypanosomiasis experiments that should enable valid reproduction, integration and comparison of research findings. We evaluated a subset of other parasitic (Leishmania, Toxoplasma, Plasmodium, Trichuris and Schistosoma) and non-parasitic (Mycobacterium) experimental infections in order to compare the quality of method reporting more generally. A systematic review using PubMed (2000–2012) of all publications describing gene expression in cells and animals infected with Trypanosoma spp was undertaken based on PRISMA guidelines; 23 papers were identified and included. We defined a checklist of essential parameters that should be reported and have scored the number of those parameters that are reported for each publication. Bibliometric parameters (impact factor, citations and h-index) were used to look for association between Journal and Author status and the quality of method reporting. Trichuriasis experiments achieved the highest scores and included the only paper to score 100% in all criteria. The mean of scores achieved by Trypanosoma articles through the checklist was 65.5% (range 32–90%). Bibliometric parameters were not correlated with the quality of method reporting (Spearman's rank correlation coefficient <−0.5; p>0.05). Our results indicate that the quality of methods reporting in experimental parasitology is a cause for concern and it has not improved over time, despite there being evidence that most of the assessed parameters do influence the results. We propose that our set of parameters be used as guidelines to improve the quality of the reporting of experimental infection models as a pre-requisite for integrating and comparing sets of data. PMID:25076044
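
    The scoring and correlation analysis described above can be sketched compactly; the checklist items, per-paper reporting sets and impact factors below are invented stand-ins (the actual checklist is given in the paper).

        from scipy.stats import spearmanr

        CHECKLIST = {"strain", "sex", "age", "dose", "route", "timepoint", "housing"}

        def method_score(reported):
            """Percentage of checklist parameters reported by one paper."""
            return 100.0 * len(set(reported) & CHECKLIST) / len(CHECKLIST)

        papers = [{"strain", "dose"}, {"strain", "dose", "route", "age"},
                  {"strain"}, CHECKLIST, {"dose", "route", "sex", "age", "timepoint"}]
        scores = [method_score(p) for p in papers]
        impact_factors = [2.1, 4.5, 9.8, 3.2, 6.0]
        rho, p = spearmanr(scores, impact_factors)  # association with journal status
        print(scores, round(rho, 2), round(p, 2))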

  6. The quality of methods reporting in parasitology experiments.

    PubMed

    Flórez-Vargas, Oscar; Bramhall, Michael; Noyes, Harry; Cruickshank, Sheena; Stevens, Robert; Brass, Andy

    2014-01-01

    There is a growing concern both inside and outside the scientific community over the lack of reproducibility of experiments. The depth and detail of reported methods are critical to the reproducibility of findings, but also for making it possible to compare and integrate data from different studies. In this study, we evaluated in detail the methods reporting in a comprehensive set of trypanosomiasis experiments that should enable valid reproduction, integration and comparison of research findings. We evaluated a subset of other parasitic (Leishmania, Toxoplasma, Plasmodium, Trichuris and Schistosoma) and non-parasitic (Mycobacterium) experimental infections in order to compare the quality of method reporting more generally. A systematic review using PubMed (2000-2012) of all publications describing gene expression in cells and animals infected with Trypanosoma spp was undertaken based on PRISMA guidelines; 23 papers were identified and included. We defined a checklist of essential parameters that should be reported and have scored the number of those parameters that are reported for each publication. Bibliometric parameters (impact factor, citations and h-index) were used to look for association between Journal and Author status and the quality of method reporting. Trichuriasis experiments achieved the highest scores and included the only paper to score 100% in all criteria. The mean of scores achieved by Trypanosoma articles through the checklist was 65.5% (range 32-90%). Bibliometric parameters were not correlated with the quality of method reporting (Spearman's rank correlation coefficient <-0.5; p>0.05). Our results indicate that the quality of methods reporting in experimental parasitology is a cause for concern and it has not improved over time, despite there being evidence that most of the assessed parameters do influence the results. We propose that our set of parameters be used as guidelines to improve the quality of the reporting of experimental infection models as a pre-requisite for integrating and comparing sets of data.

  7. The quadrant method measuring four points is as reliable and accurate as the quadrant method in the evaluation after anatomical double-bundle ACL reconstruction.

    PubMed

    Mochizuki, Yuta; Kaneko, Takao; Kawahara, Keisuke; Toyoda, Shinya; Kono, Norihiko; Hada, Masaru; Ikegami, Hiroyasu; Musha, Yoshiro

    2017-11-20

    The quadrant method was described by Bernard et al., and it has been widely used for postoperative evaluation of anterior cruciate ligament (ACL) reconstruction. The purpose of this research is to further develop the quadrant method by measuring four points, which we named the four-point quadrant method, and to compare it with the quadrant method. Three-dimensional computed tomography (3D-CT) analyses were performed in 25 patients who underwent double-bundle ACL reconstruction using the outside-in technique. The four points in this study's quadrant method were defined as point 1 (highest), point 2 (deepest), point 3 (lowest), and point 4 (shallowest) in the femoral tunnel position. The depth and height values at each point were measured. In this four-point quadrant method, the antero-medial (AM) tunnel is (depth1, height2) and the postero-lateral (PL) tunnel is (depth3, height4). The 3D-CT images were evaluated independently by 2 orthopaedic surgeons. A second measurement was performed by both observers after a 4-week interval. Intra- and inter-observer reliability was calculated by means of the intra-class correlation coefficient (ICC). Also, the accuracy of the method was evaluated against the quadrant method. Intra-observer reliability was almost perfect for both the AM and PL tunnel (ICC > 0.81). Inter-observer reliability of the AM tunnel was substantial (ICC > 0.61) and that of the PL tunnel was almost perfect (ICC > 0.81). The AM tunnel position was 0.13% deeper and 0.58% higher, and the PL tunnel position was 0.01% shallower and 0.13% lower, compared to the quadrant method. The four-point quadrant method was found to have high intra- and inter-observer reliability and accuracy. This method can evaluate the tunnel position regardless of the shape and morphology of the bone tunnel aperture and can provide measurements that can be compared across various reconstruction methods. The four-point quadrant method of this study is considered to have clinical relevance in that it is a detailed and accurate tool for evaluating femoral tunnel position after ACL reconstruction. Case series, Level IV.
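
    The coordinate convention above -- four measured points, with the AM tunnel read as (depth1, height2) and the PL tunnel as (depth3, height4) -- can be captured in a small data structure. The numeric values below are invented grid fractions, not study data.

        from dataclasses import dataclass

        @dataclass
        class TunnelAperture:
            depth1: float; height1: float   # point 1: highest
            depth2: float; height2: float   # point 2: deepest
            depth3: float; height3: float   # point 3: lowest
            depth4: float; height4: float   # point 4: shallowest

            def am_tunnel(self):            # antero-medial tunnel position
                return (self.depth1, self.height2)

            def pl_tunnel(self):            # postero-lateral tunnel position
                return (self.depth3, self.height4)

        ap = TunnelAperture(0.25, 0.10, 0.30, 0.25, 0.29, 0.48, 0.21, 0.35)
        print(ap.am_tunnel(), ap.pl_tunnel())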

  8. International funding agencies: potential leaders of impact evaluation in protected areas?

    PubMed Central

    Craigie, Ian D.; Barnes, Megan D.; Geldmann, Jonas; Woodley, Stephen

    2015-01-01

    Globally, protected areas are the most commonly used tools to halt biodiversity loss. Yet, some are failing to adequately conserve the biodiversity they contain. There is an urgent need for knowledge on how to make them function more effectively. Impact evaluation methods provide a set of tools that could yield this knowledge. However, rigorous outcome-focused impact evaluation is not yet used as extensively as it could be in protected area management. We examine the role of international protected area funding agencies in facilitating the use of impact evaluation. These agencies are influential stakeholders as they allocate hundreds of millions of dollars annually to support protected areas, creating a unique opportunity to shape how the conservation funds are spent globally. We identify key barriers to the use of impact evaluation, detail how large funders are uniquely placed to overcome many of these, and highlight the potential benefits if impact evaluation is used more extensively. PMID:26460135

  9. Transdisciplinary Research and Evaluation for Community Health Initiatives

    PubMed Central

    Harper, Gary W.; Neubauer, Leah C.; Bangi, Audrey K.; Francisco, Vincent T.

    2010-01-01

    Transdisciplinary research and evaluation projects provide valuable opportunities to collaborate on interventions to improve the health and well-being of individuals and communities. Given team members’ diverse backgrounds and roles or responsibilities in such projects, members’ perspectives are significant in strengthening a project’s infrastructure and improving its organizational functioning. This article presents an evaluation mechanism that allows team members to express the successes and challenges incurred throughout their involvement in a multisite transdisciplinary research project. Furthermore, their feedback is used to promote future sustainability and growth. Guided by a framework known as organizational development, the evaluative process was conducted by a neutral entity, the Quality Assurance Team. A mixed-methods approach was utilized to garner feedback and clarify how the research project goals could be achieved more effectively and efficiently. The multiple benefits gained by those involved in this evaluation and implications for utilizing transdisciplinary research and evaluation teams for health initiatives are detailed. PMID:18936267

  10. Assessing agreement between preclinical magnetic resonance imaging and histology: An evaluation of their image qualities and quantitative results

    PubMed Central

    Elschner, Cindy; Korn, Paula; Hauptstock, Maria; Schulz, Matthias C.; Range, Ursula; Jünger, Diana; Scheler, Ulrich

    2017-01-01

    One consequence of demographic change is the increasing demand for biocompatible materials for use in implants and prostheses. This is accompanied by a growing number of experimental animals because the interactions between new biomaterials and their host tissue have to be investigated. To evaluate novel materials and engineered tissues, the use of non-destructive imaging modalities has been identified as a strategic priority. This provides the opportunity to study interactions repeatedly in individual animals, along with the advantages of reduced biological variability and a decreased number of laboratory animals. However, histological techniques are still the gold standard in preclinical biomaterial research. The present article demonstrates a detailed method comparison between histology and magnetic resonance imaging. This includes the presentation of their image qualities as well as the detailed statistical analysis for assessing agreement between quantitative measures. As an example, the bony ingrowth of tissue-engineered bone substitutes for treatment of a cleft-like maxillary bone defect was evaluated. Using a graphical concordance analysis, the mean difference between MRI results and histomorphometrical measures was examined. The analysis revealed a slight but significant bias in the case of bone volume (bias_Histo−MRI (bone volume) = 2.40%, p < 0.005) and a clearly significant deviation for the remaining defect width (bias_Histo−MRI (defect width) = −6.73%, p ≪ 0.005). However, the study also showed a considerable effect of the analyzed section position on the quantitative result. It could be shown that the bias between the data sets originated less from the imaging modalities than from the evaluation of different slice positions. The article demonstrates that method comparisons do not always require an additional, independent animal study. PMID:28666026
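
    The graphical concordance (Bland-Altman-style) analysis mentioned above amounts to examining the mean paired difference and its limits of agreement. The sketch below uses simulated paired measurements with a built-in −2.4% offset to mimic the reported bone volume bias.

        import numpy as np

        rng = np.random.default_rng(2)
        histo = rng.uniform(20, 60, 30)               # bone volume (%) by histology
        mri = histo - 2.4 + rng.normal(0, 1.5, 30)    # MRI reading ~2.4% lower

        diff = histo - mri
        bias = diff.mean()
        half = 1.96 * diff.std(ddof=1)                # 95% limits of agreement
        print(f"bias = {bias:.2f}%, LoA = [{bias - half:.2f}, {bias + half:.2f}]%")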

  11. Analysis of the reliability and reproducibility of goniometry compared to hand photogrammetry

    PubMed Central

    de Carvalho, Rosana Martins Ferreira; Mazzer, Nilton; Barbieri, Claudio Henrique

    2012-01-01

    Objective: To evaluate the intra- and inter-examiner reliability and reproducibility of goniometry in relation to photogrammetry of the hand, comparing the angles of thumb abduction, PIP joint flexion of the II finger and MCP joint flexion of the V finger. Methods: The study included 30 volunteers, who were divided into three groups: one group of 10 physiotherapy students, one group of 10 physiotherapists, and a third group of 10 hand therapists. Each examiner performed the measurements on the same hand mold, using the goniometer followed by two photogrammetry software programs, CorelDraw® and ALCimagem®. Results: The results revealed that the groups and the methods proposed presented inter-examiner reliability generally rated as excellent (ICC 0.998; 95% CI 0.995-0.999). In the intra-examiner evaluation, an excellent level of reliability was found across the three groups. In the comparison between groups for each angle and each method, no significant differences were found between the groups for most of the measurements. Conclusion: Goniometry and photogrammetry are reliable and reproducible methods for evaluating measurements of the hand. However, due to the lack of similar references, detailed studies are needed to define the normal parameters between the methods in the joints of the hand. Level of Evidence II, Diagnostic Study. PMID:24453594

  12. Methods for semi-automated indexing for high precision information retrieval

    NASA Technical Reports Server (NTRS)

    Berrios, Daniel C.; Cucina, Russell J.; Fagan, Lawrence M.

    2002-01-01

    OBJECTIVE: To evaluate a new system, ISAID (Internet-based Semi-automated Indexing of Documents), and to generate textbook indexes that are more detailed and more useful to readers. DESIGN: Pilot evaluation: simple, nonrandomized trial comparing ISAID with manual indexing methods. Methods evaluation: randomized, cross-over trial comparing three versions of ISAID and usability survey. PARTICIPANTS: Pilot evaluation: two physicians. Methods evaluation: twelve physicians, each of whom used three different versions of the system for a total of 36 indexing sessions. MEASUREMENTS: Total index term tuples generated per document per minute (TPM), with and without adjustment for concordance with other subjects; inter-indexer consistency; ratings of the usability of the ISAID indexing system. RESULTS: Compared with manual methods, ISAID decreased indexing times greatly. Using three versions of ISAID, inter-indexer consistency ranged from 15% to 65% with a mean of 41%, 31%, and 40% for each of three documents. Subjects using the full version of ISAID were faster (average TPM: 5.6) and had higher rates of concordant index generation. There were substantial learning effects, despite our use of a training/run-in phase. Subjects using the full version of ISAID were much faster by the third indexing session (average TPM: 9.1). There was a statistically significant increase in three-subject concordant indexing rate using the full version of ISAID during the second indexing session (p < 0.05). SUMMARY: Users of the ISAID indexing system create complex, precise, and accurate indexing for full-text documents much faster than users of manual methods. Furthermore, the natural language processing methods that ISAID uses to suggest indexes contribute substantially to increased indexing speed and accuracy.

  13. Methods for Semi-automated Indexing for High Precision Information Retrieval

    PubMed Central

    Berrios, Daniel C.; Cucina, Russell J.; Fagan, Lawrence M.

    2002-01-01

    Objective. To evaluate a new system, ISAID (Internet-based Semi-automated Indexing of Documents), and to generate textbook indexes that are more detailed and more useful to readers. Design. Pilot evaluation: simple, nonrandomized trial comparing ISAID with manual indexing methods. Methods evaluation: randomized, cross-over trial comparing three versions of ISAID and usability survey. Participants. Pilot evaluation: two physicians. Methods evaluation: twelve physicians, each of whom used three different versions of the system for a total of 36 indexing sessions. Measurements. Total index term tuples generated per document per minute (TPM), with and without adjustment for concordance with other subjects; inter-indexer consistency; ratings of the usability of the ISAID indexing system. Results. Compared with manual methods, ISAID decreased indexing times greatly. Using three versions of ISAID, inter-indexer consistency ranged from 15% to 65% with a mean of 41%, 31%, and 40% for each of three documents. Subjects using the full version of ISAID were faster (average TPM: 5.6) and had higher rates of concordant index generation. There were substantial learning effects, despite our use of a training/run-in phase. Subjects using the full version of ISAID were much faster by the third indexing session (average TPM: 9.1). There was a statistically significant increase in three-subject concordant indexing rate using the full version of ISAID during the second indexing session (p < 0.05). Summary. Users of the ISAID indexing system create complex, precise, and accurate indexing for full-text documents much faster than users of manual methods. Furthermore, the natural language processing methods that ISAID uses to suggest indexes contribute substantially to increased indexing speed and accuracy. PMID:12386114

  14. Nuclear Data Activities in Support of the DOE Nuclear Criticality Safety Program

    NASA Astrophysics Data System (ADS)

    Westfall, R. M.; McKnight, R. D.

    2005-05-01

    The DOE Nuclear Criticality Safety Program (NCSP) maintains the technical infrastructure for the technologies applied in the evaluation and performance of safe fissionable-material operations in the DOE complex. These technologies include an Analytical Methods element for neutron transport and the development of sensitivity/uncertainty methods, the performance of Critical Experiments, the evaluation and qualification of experiments as Benchmarks, and a comprehensive Nuclear Data program coordinated by the NCSP Nuclear Data Advisory Group (NDAG). The NDAG gathers and evaluates differential and integral nuclear data, identifies deficiencies, and recommends priorities for meeting DOE criticality safety needs to the NCSP Criticality Safety Support Group (CSSG). The NDAG then identifies the resources and unique capabilities required to meet these needs, not only for performing measurements but also for data evaluation with nuclear model codes and for data processing for criticality safety applications. The NDAG coordinates its efforts with the leadership of the National Nuclear Data Center, the Cross Section Evaluation Working Group (CSEWG), and the Working Party on International Evaluation Cooperation (WPEC) of the OECD/NEA Nuclear Science Committee. The overall objective is to expedite the issuance of new data and methods to the DOE criticality safety user. This paper describes these activities in detail, with examples based upon special studies performed in support of criticality safety for a variety of DOE operations.

  15. Elemental composition of airborne particulates and source identification - An extensive one-year survey in Cleveland, OH

    NASA Technical Reports Server (NTRS)

    King, R. B.; Fordyce, J. S.; Antoine, A. C.; Leibecki, H. F.; Neustadter, H. E.; Sidik, S. M.

    1976-01-01

    Concentrations of 60 chemical elements in airborne particulate matter were measured at 16 sites in Cleveland, OH over a one-year period during 1971 and 1972 (45 to 50 sampling days). Analytical methods used included instrumental neutron activation, emission spectroscopy, and combustion techniques. Uncertainties in the concentrations associated with the sampling procedures, the analytical methods, the use of several analytical facilities, and samples with concentrations below the detection limits are evaluated in detail. The data are discussed in relation to other studies and source origins. The trace-constituent concentrations as a function of wind direction are used to suggest a practical method for air pollution source identification.

  16. A weighted variational gradient-based fusion method for high-fidelity thin cloud removal of Landsat images

    NASA Astrophysics Data System (ADS)

    Huang, Wei; Chen, Xiu; Wang, Yueyun

    2018-03-01

    Landsat data are widely used in earth observation, but cloud cover interferes with many applications of the images. This paper proposes a weighted variational gradient-based fusion (WVGBF) method for high-fidelity thin cloud removal from Landsat images, an improvement on the variational gradient-based fusion (VGBF) method. VGBF integrates gradient information from a reference band into the visible bands of a cloudy image to enhance spatial detail and remove thin clouds, but it applies the same gradient constraint to the entire image, which causes color distortion in cloudless areas. In our method, a weight coefficient is introduced into the gradient-approximation term to preserve image fidelity. The distribution of the weight coefficient follows a cloud thickness map, which is built with Independent Component Analysis (ICA) using multi-temporal Landsat images. Quantitatively, we use the R value to evaluate fidelity in cloudless regions and the metric Q to evaluate clarity in cloud areas. The experimental results indicate that the proposed method removes thin cloud more effectively while achieving high fidelity.
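
    The gradient-fusion idea above can be sketched numerically. A minimal illustration follows, assuming a single cloudy band, a co-registered reference band, and a precomputed cloud-thickness map scaled to [0, 1]; the quadratic energy, the plain gradient descent, and all names are illustrative simplifications, not the authors' implementation (which builds the thickness map via ICA on multi-temporal images).

      import numpy as np

      def grad(img):
          """Forward-difference gradients (gx, gy), edges replicated."""
          gx = np.diff(img, axis=1, append=img[:, -1:])
          gy = np.diff(img, axis=0, append=img[-1:, :])
          return gx, gy

      def weighted_gradient_fusion(cloudy, reference, thickness,
                                   iters=300, lr=0.1, lam=1.0):
          """Blend reference-band gradients into a cloudy band, weighted
          per pixel by cloud thickness (thick cloud -> trust reference)."""
          w = np.clip(thickness, 0.0, 1.0)
          gx_c, gy_c = grad(cloudy)
          gx_r, gy_r = grad(reference)
          tx = (1 - w) * gx_c + w * gx_r      # target gradient field
          ty = (1 - w) * gy_c + w * gy_r
          out = cloudy.astype(float).copy()
          for _ in range(iters):
              gx, gy = grad(out)
              rx, ry = gx - tx, gy - ty       # gradient residual
              # Divergence of the residual (backward differences).
              div = (rx - np.roll(rx, 1, axis=1)) + (ry - np.roll(ry, 1, axis=0))
              # Data fidelity holds the result to the input where it is clear.
              out -= lr * (lam * (1 - w) * (out - cloudy) - div)
          return out

      # Toy usage with random arrays standing in for Landsat bands.
      rng = np.random.default_rng(0)
      cloudy = rng.random((64, 64)); ref = rng.random((64, 64))
      thick = rng.random((64, 64))
      result = weighted_gradient_fusion(cloudy, ref, thick)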

  17. Information Flow Integrity for Systems of Independently-Developed Components

    DTIC Science & Technology

    2015-06-22

    We also examined three programs (Apache, MySQL, and PHP) in detail to evaluate the efficacy of using the provided package test suites to generate... method are just as effective as hooks that were manually placed over the course of years while greatly reducing the burden on programmers... to validate optimizations of real-world, mature applications: the Apache software suite, the Mozilla Suite, and the MySQL database.

  18. Field Collection Methods for an EPA Pilot Study Evaluating Personal, Housing, and Community Factors Influencing Children’s Potential Exposures to Indoor Contaminants at Various Lifestages (EPA Pilot Study Add-On to the GreenHousing Study)

    EPA Science Inventory

    This compilation of field collection standard operating procedures (SOPs) was assembled for the U.S. Environmental Protection Agency’s (EPA) Pilot Study add-on to the Green Housing Study (GHS). A detailed description of this add-on study can be found in the peer reviewed research...

  19. Studies of aerothermal loads generated in regions of shock/shock interaction in hypersonic flow

    NASA Technical Reports Server (NTRS)

    Holden, Michael S.; Moselle, John R.; Lee, Jinho

    1991-01-01

    Experimental studies were conducted to examine the aerothermal characteristics of shock/shock/boundary-layer interaction regions generated by single and multiple incident shocks. The experiments were conducted over a Mach number range from 6 to 19 and a range of Reynolds numbers to obtain both laminar and turbulent interaction regions. Detailed heat transfer and pressure measurements were made over a transverse cylinder for a range of interaction types and incident shock strengths, with emphasis on Type III and Type IV interaction regions. The measurements were compared with the simple models of Edney, Keyes, and Hains for a range of interaction configurations and freestream conditions. The complex flowfields and aerothermal loads generated by multiple-shock impingement, while not producing as large peak loads, provide important test cases for code prediction. The detailed heat transfer and pressure measurements provided a good basis for evaluating the accuracy of simple prediction methods and detailed numerical solutions for laminar and transitional regions of shock/shock interaction.

  20. Global Environmental Data for Mapping Infectious Disease Distribution

    PubMed Central

    Hay, S.I.; Tatem, A.J.; Graham, A.J.; Goetz, S.J.; Rogers, D.J.

    2011-01-01

    This contribution documents the satellite data archives, data processing methods and temporal Fourier analysis (TFA) techniques used to create the remotely sensed datasets on the DVD distributed with this volume. The aim is to provide a detailed reference guide to the genesis of the data, rather than a standard review. These remotely sensed data cover the entire globe at either 1 × 1 or 8 × 8 km spatial resolution. We briefly evaluate the relationships between the 1 × 1 and 8 × 8 km global TFA products to explore their inter-compatibility. The 8 × 8 km TFA surfaces are used in the mapping procedures detailed in the subsequent disease mapping reviews, since the 1 × 1 km products have been validated less widely. Details are also provided on additional, current and planned sensors that should be able to provide continuity with these environmental variable surfaces, as well as other sources of global data that may be used for mapping infectious disease. PMID:16647967

  1. Spaceflight and Immune Responses of Rhesus Monkeys

    NASA Technical Reports Server (NTRS)

    Sonnenfeld, Gerald

    1997-01-01

    In the grant period, we perfected techniques for determination of interleukin production and leukocyte subset analysis of rhesus monkeys. These results are outlined in detail in publication number 2, appended to this report. Additionally, we participated in the ARRT restraint test to determine if restraint conditions for flight in the Space Shuttle could contribute to any effects of space flight on immune responses. All immunological parameters listed in the methods section were tested. Evaluation of the data suggests that the restraint conditions had minimal effects on the results observed, but handling of the monkeys could have had some effect. These results are outlined in detail in manuscript number 3, appended to this report. Additionally, to help us develop our rhesus monkey immunology studies, we carried out preliminary studies in mice to determine the effects of stressors on immunological parameters. We were able to show that there were gender-based differences in the response of immunological parameters to a stressor. These results are outlined in detail in manuscript number 4, appended to this report.

  2. Land cover mapping at sub-pixel scales

    NASA Astrophysics Data System (ADS)

    Makido, Yasuyo Kato

    One of the biggest drawbacks of land cover mapping from remotely sensed images relates to spatial resolution, which determines the level of spatial detail depicted in an image. Fine spatial resolution images from satellite sensors such as IKONOS and QuickBird are now available. However, these images are not suitable for large-area studies, since a single image covers only a small area, making large-area studies costly. Much research has focused on attempting to extract land cover types at the sub-pixel scale, but little research has been conducted concerning the spatial allocation of land cover types within a pixel. This study is devoted to the development of new algorithms for predicting land cover distribution from remotely sensed imagery at the sub-pixel level. The "pixel-swapping" optimization algorithm, proposed by Atkinson for predicting sub-pixel land cover distribution, is investigated in this study. Two limitations of this method, the arbitrary spatial range value and the arbitrary exponential model of spatial autocorrelation, are assessed. Various weighting functions, as alternatives to the exponential model, are evaluated in order to derive the optimum weighting function. Two different simulation models were employed to develop spatially autocorrelated binary class maps. In all tested weighting models (Gaussian, exponential, and inverse-distance), the pixel-swapping method improved classification accuracy compared with the initial random allocation of sub-pixels. However, the results suggested that equal weights could be used to increase accuracy and sub-pixel spatial autocorrelation instead of these more complex models of spatial structure. New algorithms for modeling the spatial distribution of multiple land cover classes at sub-pixel scales are developed and evaluated. Three methods are examined: sequential categorical swapping, simultaneous categorical swapping, and simulated annealing. These three methods are applied to classified Landsat ETM+ data that has been resampled to 210 meters. The results suggested that the simultaneous method can be considered the optimum method in terms of accuracy and computation time. The case study employs remote sensing imagery at the following sites: tropical forests in Brazil and a temperate multiple-land-cover mosaic in East China. Sub-areas of both sites are used to examine how the characteristics of the landscape affect the performance of the optimum technique. Three landscape metrics, Moran's I, mean patch size (MPS), and patch-size standard deviation (STDEV), are used to characterize the landscape. All results suggested that this technique can increase classification accuracy more than traditional hard classification. The methods developed in this study can benefit researchers who employ coarse remote sensing imagery but are interested in detailed landscape information. In many cases, a satellite sensor that provides large spatial coverage has insufficient spatial detail to identify landscape patterns. Application of the super-resolution technique described in this dissertation could potentially solve this problem by providing detailed land cover predictions from coarse-resolution satellite sensor imagery.
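
    A minimal sketch of the basic pixel-swapping procedure referred to above, assuming a binary class map, an inverse-distance neighbourhood weighting, and swaps confined to coarse-pixel blocks so per-cell class proportions are preserved; all names and parameters are illustrative, not the dissertation's code.

      import numpy as np

      def attractiveness(binmap, radius=2):
          """Distance-weighted count of neighbouring 1s for each pixel
          (edges wrap via np.roll; adequate for a sketch)."""
          att = np.zeros(binmap.shape, dtype=float)
          for dy in range(-radius, radius + 1):
              for dx in range(-radius, radius + 1):
                  if dx == 0 and dy == 0:
                      continue
                  wgt = 1.0 / np.hypot(dx, dy)          # inverse distance
                  att += wgt * np.roll(np.roll(binmap, dy, axis=0), dx, axis=1)
          return att

      def pixel_swap(binmap, block=7, iters=50, radius=2):
          """Swap sub-pixels within each coarse cell to raise spatial
          autocorrelation while preserving per-cell class proportions.
          Attractiveness is refreshed once per sweep for simplicity."""
          out = binmap.astype(float).copy()
          h, w = out.shape
          for _ in range(iters):
              att = attractiveness(out, radius)
              swapped = False
              for by in range(0, h, block):
                  for bx in range(0, w, block):
                      cell = np.s_[by:by + block, bx:bx + block]
                      sub, a = out[cell], att[cell]
                      ones = np.argwhere(sub == 1)
                      zeros = np.argwhere(sub == 0)
                      if len(ones) == 0 or len(zeros) == 0:
                          continue
                      worst1 = ones[np.argmin(a[tuple(ones.T)])]
                      best0 = zeros[np.argmax(a[tuple(zeros.T)])]
                      if a[tuple(best0)] > a[tuple(worst1)]:
                          sub[tuple(worst1)], sub[tuple(best0)] = 0.0, 1.0
                          swapped = True
              if not swapped:
                  break
          return out

      # Toy usage: 30% class cover on a 28 x 28 fine grid.
      rng = np.random.default_rng(0)
      result = pixel_swap((rng.random((28, 28)) < 0.3).astype(float))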

  3. Methods for computing water-quality loads at sites in the U.S. Geological Survey National Water Quality Network

    USGS Publications Warehouse

    Lee, Casey J.; Murphy, Jennifer C.; Crawford, Charles G.; Deacon, Jeffrey R.

    2017-10-24

    The U.S. Geological Survey publishes information on concentrations and loads of water-quality constituents at 111 sites across the United States as part of the U.S. Geological Survey National Water Quality Network (NWQN). This report details historical and updated methods for computing water-quality loads at NWQN sites. The primary updates to historical load estimation methods include (1) an adaptation to methods for computing loads to the Gulf of Mexico; (2) the inclusion of loads computed using the Weighted Regressions on Time, Discharge, and Season (WRTDS) method; and (3) the inclusion of loads computed using continuous water-quality data. Loads computed using WRTDS and continuous water-quality data are provided along with those computed using historical methods. Various aspects of method updates are evaluated in this report to help users of water-quality loading data determine which estimation methods best suit their particular application.

  4. Antibacterial activity of graphene layers

    NASA Astrophysics Data System (ADS)

    Dybowska-Sarapuk, Ł.; Kotela, A.; Krzemiński, J.; Janczak, D.; Wróblewska, M.; Marchel, H.; Łegorz, P.; Jakubowska, M.

    2016-09-01

    The bacterial biofilm is a direct cause of complications in the management of various medical conditions. There is an ongoing search for a feasible method to prevent its growth, as an alternative to antibiotics, which are often ineffective against biofilms. The aim of the study was to prepare and evaluate a detailed algorithm for the production of graphene coatings, using economically efficient methods of printed electronics (such as ink-jet printing or spray coating), and to assess their antibacterial properties. Based on the preliminary results of our work, we suggest that graphene coating may inhibit the formation of microbial biofilms. Further research is needed to verify the antibacterial properties of graphene coatings and their future applications in the prevention of biofilm-related infections, e.g. by coating surgical instruments, catheters or tracheostomy tubes. In addition, we propose a series of hypotheses to be evaluated in further work.

  5. Mechanoluminescence assisting agile optimization of processing design on surgical epiphysis plates

    NASA Astrophysics Data System (ADS)

    Terasaki, Nao; Toyomasu, Takashi; Sonohata, Motoki

    2018-04-01

    We propose a novel method for agile optimization of processing design through visualization of mechanoluminescence. To demonstrate the effect of the new method, epiphysis plates were processed to form dots (diameters: 1 and 1.5 mm) and the mechanical information was evaluated. As a result, the appearance of new strain concentrations was successfully visualized on the basis of mechanoluminescence, and the complex mechanical information could be intuitively understood by the surgeons acting as designers. In addition, mechanoluminescence analysis clarified that small dots do not have serious mechanical effects such as strength reduction. Such detailed mechanical information, evaluated on the basis of mechanoluminescence, was successfully applied to judging the validity of the processing design. This clearly demonstrates the effectiveness of the new mechanoluminescence-based methodology for assisting agile optimization of processing design.

  6. Hamiltonian Markov Chain Monte Carlo Methods for the CUORE Neutrinoless Double Beta Decay Sensitivity

    NASA Astrophysics Data System (ADS)

    Graham, Eleanor; Cuore Collaboration

    2017-09-01

    The CUORE experiment is a large-scale bolometric detector seeking to observe the never-before-seen process of neutrinoless double beta decay. Predictions for CUORE's sensitivity to neutrinoless double beta decay allow for an understanding of the half-life ranges that the detector can probe and for an evaluation of the relative importance of different detector parameters. Currently, CUORE uses a Bayesian analysis based on BAT, which uses Metropolis-Hastings Markov Chain Monte Carlo, for its sensitivity studies. My work evaluates the viability and potential improvements of switching the Bayesian analysis to Hamiltonian Monte Carlo, realized through the program Stan and its Morpho interface. I demonstrate that the BAT study can be successfully recreated in Stan and perform a detailed comparison between the results and computation times of the two methods.
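
    For readers unfamiliar with the sampler named above, a self-contained sketch of Hamiltonian Monte Carlo with a leapfrog integrator on a toy Gaussian target; it illustrates the generic algorithm only and is unrelated to the CUORE analysis, BAT, Stan, or Morpho.

      import numpy as np

      def hmc_sample(logp_grad, x0, n_samples=1000, eps=0.1, n_leap=20, seed=0):
          """Hamiltonian Monte Carlo with a leapfrog integrator.
          logp_grad(x) must return (log p(x), grad log p(x))."""
          rng = np.random.default_rng(seed)
          x = np.asarray(x0, dtype=float)
          logp, grad = logp_grad(x)
          samples = []
          for _ in range(n_samples):
              p = rng.standard_normal(x.shape)          # fresh momentum
              x_new, logp_new, grad_new = x.copy(), logp, grad
              h_old = -logp + 0.5 * p @ p               # Hamiltonian
              p = p + 0.5 * eps * grad_new              # half momentum step
              for i in range(n_leap):
                  x_new = x_new + eps * p               # full position step
                  logp_new, grad_new = logp_grad(x_new)
                  if i < n_leap - 1:
                      p = p + eps * grad_new            # full momentum step
              p = p + 0.5 * eps * grad_new              # final half step
              h_new = -logp_new + 0.5 * p @ p
              if rng.random() < np.exp(min(0.0, h_old - h_new)):
                  x, logp, grad = x_new, logp_new, grad_new
              samples.append(x.copy())
          return np.array(samples)

      # Toy target: standard 2-D Gaussian, log p(x) = -0.5 * |x|^2.
      draws = hmc_sample(lambda x: (-0.5 * x @ x, -x), x0=np.zeros(2))
      print(draws.mean(axis=0), draws.std(axis=0))      # ~[0 0], ~[1 1]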

  7. Mining Adverse Drug Reactions in Social Media with Named Entity Recognition and Semantic Methods.

    PubMed

    Chen, Xiaoyi; Deldossi, Myrtille; Aboukhamis, Rim; Faviez, Carole; Dahamna, Badisse; Karapetiantz, Pierre; Guenegou-Arnoux, Armelle; Girardeau, Yannick; Guillemin-Lanne, Sylvie; Lillo-Le-Louët, Agnès; Texier, Nathalie; Burgun, Anita; Katsahian, Sandrine

    2017-01-01

    Suspected adverse drug reactions (ADRs) reported by patients through social media can be a complementary source to current pharmacovigilance systems. However, the performance of text mining tools applied to social media text data to discover ADRs needs to be evaluated. In this paper, we introduce the approach developed to mine ADRs from French social media. An evaluation protocol is presented, which includes a detailed sample-size determination and the constitution of an evaluation corpus. Our text mining approach provided very encouraging preliminary results, with F-measures of 0.94 and 0.81 for recognition of drugs and symptoms, respectively, and an F-measure of 0.70 for ADR detection. This approach is therefore promising for downstream pharmacovigilance analysis.
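
    For reference, the F-measure quoted above is the harmonic mean of precision and recall; a minimal illustration with invented counts:

      def f_measure(tp, fp, fn):
          """F1: harmonic mean of precision and recall."""
          precision = tp / (tp + fp)
          recall = tp / (tp + fn)
          return 2 * precision * recall / (precision + recall)

      # Invented counts for a hypothetical entity-recognition run.
      print(round(f_measure(tp=85, fp=10, fn=13), 2))  # -> 0.88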

  8. Vadose zone transport field study: Detailed test plan for simulated leak tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    AL Ward; GW Gee

    2000-06-23

    The US Department of Energy (DOE) Groundwater/Vadose Zone Integration Project Science and Technology initiative was created in FY 1999 to reduce the uncertainty associated with vadose zone transport processes beneath waste sites at DOE's Hanford Site near Richland, Washington. This information is needed not only to evaluate the risks from transport, but also to support the adoption of measures for minimizing impacts to the groundwater and surrounding environment. The principal uncertainties in vadose zone transport are the current distribution of source contaminants and the natural heterogeneity of the soil in which the contaminants reside. Oversimplified conceptual models resulting from these uncertainties, together with limited use of hydrologic characterization and monitoring technologies, have hampered the understanding of contaminant migration through Hanford's vadose zone. Essential prerequisites for reducing vadose zone transport uncertainty include the development of accurate conceptual models and the development or adoption of monitoring techniques capable of delineating the current distributions of source contaminants and characterizing natural site heterogeneity. The Vadose Zone Transport Field Study (VZTFS) was conceived as part of the initiative to address the major uncertainties confronting vadose zone fate and transport predictions at the Hanford Site and to overcome the limitations of previous characterization attempts. Pacific Northwest National Laboratory (PNNL) is managing the VZTFS for DOE. The VZTFS will conduct field investigations that will improve the understanding of field-scale transport and lead to the development or identification of efficient and cost-effective characterization methods. Ideally, these methods will capture the extent of contaminant plumes using existing infrastructure (i.e., more than 1,300 steel-cased boreholes). The objectives of the VZTFS are to conduct controlled transport experiments at well-instrumented field sites at Hanford to: identify mechanisms controlling transport processes in soils typical of the hydrogeologic conditions of Hanford's waste disposal sites; reduce uncertainty in conceptual models; develop a detailed and accurate database of hydraulic and transport parameters for validation of three-dimensional numerical models; and identify and evaluate advanced, cost-effective characterization methods with the potential to assess changing conditions in the vadose zone, particularly as surrogates of currently undetectable high-risk contaminants. This plan provides details for conducting field tests during FY 2000 to accomplish these objectives. Details of additional testing during FY 2001 and FY 2002 will be developed as part of the work planning process implemented by the Integration Project.

  9. Effects of Different Viewing Conditions on Radiographic Interpretation

    PubMed Central

    Moshfeghi, Mahkameh; Shahbazian, Majid; Sajadi, Soodabeh Sadat; Sajadi, Sepideh; Ansari, Hossein

    2015-01-01

    Objectives: Optimum viewing conditions facilitate identification of radiographic details and decrease the need for retakes, patients' costs and radiation dose. This study sought to evaluate the effects of different viewing conditions on radiographic interpretation. Materials and Methods: This diagnostic study was performed by evaluating a radiograph of a 7-mm-thick aluminum block, in which 10 holes of 2 mm diameter were randomly drilled with depths ranging from 0.05 mm to 0.50 mm. The radiograph was viewed by four oral radiologists independently under four viewing conditions: a white-light viewing box in a lit room, a yellow-light viewing box in a lit room, a white-light viewing box in a dark room and a yellow-light viewing box in a dark room. The number of circular shadows observed on the film was recorded. The data were analyzed by two-way ANOVA. Results: The mean number of detected circular shadows was 6.75, 7.5, 7.25 and 7.75 for the white-light viewing box in a lit room, white-light viewing box in a dark room, yellow-light viewing box in a lit room and yellow-light viewing box in a dark room, respectively. Although the surrounding illumination had a statistically significant effect on detection of radiographic details (P≤0.03), the light color of the viewing box had no significant effect on visibility of the radiographic details. Conclusion: White and yellow light in the viewing box had no significant effect on visibility of radiographic details, but more information was obtained in a dark room. PMID:27507997

  10. Accuracy of Implant Position Transfer and Surface Detail Reproduction with Different Impression Materials and Techniques

    PubMed Central

    Alikhasi, Marzieh; Siadat, Hakimeh; Kharazifard, Mohammad Javad

    2015-01-01

    Objectives: The purpose of this study was to compare the accuracy of implant position transfer and surface detail reproduction using two impression techniques and materials. Materials and Methods: A metal model with two implants and three grooves of 0.25, 0.50 and 0.75 mm in depth on the flat superior surface of a die was fabricated. Ten regular-body polyether (PE) and ten regular-body polyvinyl siloxane (PVS) impressions with square and conical transfer copings using open-tray and closed-tray techniques were made for each group. Impressions were poured with type IV stone, and linear and angular displacements of the replica heads were evaluated using a coordinate measuring machine (CMM). Accurate reproduction of the grooves was also evaluated by a video measuring machine (VMM). These measurements were compared with the measurements calculated on the reference model that served as control, and the data were analyzed with two-way ANOVA and t-test at P = 0.05. Results: There was less linear displacement for PVS and less angular displacement for PE in the closed-tray technique, and less linear displacement for PE in the open-tray technique (P<0.001). The open-tray technique also showed less angular displacement with the use of PVS impression material. Detail reproduction accuracy was the same in all the groups (P>0.05). Conclusion: The open-tray technique was more accurate using PE, and both closed-tray and open-tray techniques had acceptable results with the use of PVS. The choice of impression material and technique made no significant difference in surface detail reproduction. PMID:27252761

  11. 3D cinematic rendering of the calvarium, maxillofacial structures, and skull base: preliminary observations.

    PubMed

    Rowe, Steven P; Zinreich, S James; Fishman, Elliot K

    2018-06-01

    Three-dimensional (3D) visualizations of volumetric data from CT have gained widespread clinical acceptance and are an important method for evaluating complex anatomy and pathology. Recently, cinematic rendering (CR), a new 3D visualization methodology, has become available. CR utilizes a lighting model that allows for the production of photorealistic images from isotropic voxel data. Given how new this technique is, studies to evaluate its clinical utility and any potential advantages or disadvantages relative to other 3D methods such as volume rendering have yet to be published. In this pictorial review, we provide examples of normal calvarial, maxillofacial, and skull base anatomy and pathological conditions that highlight the potential for CR images to aid in patient evaluation and treatment planning. The highly detailed images and nuanced shadowing that are intrinsic to CR are well suited to the display of the complex anatomy in this region of the body. We look forward to studies with CR that will ascertain the ultimate value of this methodology to evaluate calvarium, maxillofacial, and skull base morphology as well as other complex anatomic structures.

  12. Optimized Heart Sampling and Systematic Evaluation of Cardiac Therapies in Mouse Models of Ischemic Injury: Assessment of Cardiac Remodeling and Semi-Automated Quantification of Myocardial Infarct Size.

    PubMed

    Valente, Mariana; Araújo, Ana; Esteves, Tiago; Laundos, Tiago L; Freire, Ana G; Quelhas, Pedro; Pinto-do-Ó, Perpétua; Nascimento, Diana S

    2015-12-02

    Cardiac therapies are commonly tested preclinically in small-animal models of myocardial infarction. Following functional evaluation, post-mortem histological analysis is essential to assess morphological and molecular alterations underlying the effectiveness of treatment. However, non-methodical and inadequate sampling of the left ventricle often leads to misinterpretations and variability, making direct study comparisons unreliable. Protocols are provided for representative sampling of the ischemic mouse heart followed by morphometric analysis of the left ventricle. Extending the use of this sampling to other types of in situ analysis is also illustrated through the assessment of neovascularization and cellular engraftment in a cell-based therapy setting. This is of interest to the general cardiovascular research community as it details methods for standardization and simplification of histo-morphometric evaluation of emergent heart therapies. Copyright © 2015 John Wiley & Sons, Inc.

  13. [An improved medical image fusion algorithm and quality evaluation].

    PubMed

    Chen, Meiling; Tao, Ling; Qian, Zhiyu

    2009-08-01

    Medical image fusion is of great value for application in medical image analysis and diagnosis. In this paper, the conventional method of wavelet fusion is improved and a new algorithm for medical image fusion is presented, in which the high-frequency and low-frequency coefficients are handled separately. When high-frequency coefficients are chosen, the regional edge intensities of each sub-image are calculated to realize adaptive fusion. The choice of low-frequency coefficients is based on the edges of the images, so that the fused image preserves all useful information and appears more distinct. We apply the conventional and the improved wavelet-transform-based fusion algorithms to fuse two images of the human body and evaluate the fusion results with a quality evaluation method. Experimental results show that this algorithm can effectively retain the detailed information of the original images and enhance their edge and texture features. The new algorithm is better than the conventional fusion algorithm based on the wavelet transform.
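
    A minimal sketch of wavelet-domain fusion in the spirit of the algorithm above, using PyWavelets; the regional edge intensity is approximated here by local coefficient energy, and the low-frequency rule is simplified to averaging (the paper's rule is edge-based), so the code is illustrative rather than a reproduction.

      import numpy as np
      import pywt
      from scipy.ndimage import uniform_filter

      def local_energy(c, size=3):
          """Regional energy of a sub-band (proxy for edge intensity)."""
          return uniform_filter(c * c, size=size)

      def wavelet_fuse(img_a, img_b, wavelet="db2", level=2):
          """Fuse two registered, equal-shape images in the wavelet domain."""
          ca = pywt.wavedec2(img_a, wavelet, level=level)
          cb = pywt.wavedec2(img_b, wavelet, level=level)
          # Low-frequency band: plain average (the paper uses an
          # edge-based rule here instead).
          fused = [0.5 * (ca[0] + cb[0])]
          # High-frequency bands: keep the coefficient whose local
          # region carries more energy.
          for (ha, va, da), (hb, vb, db) in zip(ca[1:], cb[1:]):
              fused.append(tuple(
                  np.where(local_energy(a) >= local_energy(b), a, b)
                  for a, b in ((ha, hb), (va, vb), (da, db))))
          return pywt.waverec2(fused, wavelet)

      # Toy usage with random arrays standing in for registered scans.
      a = np.random.rand(64, 64); b = np.random.rand(64, 64)
      fused = wavelet_fuse(a, b)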

  14. Review of health information technology usability study methodologies

    PubMed Central

    Bakken, Suzanne

    2011-01-01

    Usability factors are a major obstacle to health information technology (IT) adoption. The purpose of this paper is to review and categorize health IT usability study methods and to provide practical guidance on health IT usability evaluation. A total of 2025 references, published from 2003 to 2009 and evaluating health IT used by clinicians, were initially retrieved from the Medline database. Titles and abstracts were first reviewed for inclusion. Full-text articles were then examined to identify the final set of eligible studies. 629 studies were categorized into the five stages of an integrated usability specification and evaluation framework that was based on a usability model and the system development life cycle (SDLC)-associated stages of evaluation. Theoretical and methodological aspects of 319 studies were extracted in greater detail; studies that focused on system validation (SDLC stage 2) were not assessed further. The number of studies by stage was: stage 1, task-based or user–task interaction, n=42; stage 2, system–task interaction, n=310; stage 3, user–task–system interaction, n=69; stage 4, user–task–system–environment interaction, n=54; and stage 5, user–task–system–environment interaction in routine use, n=199. The studies applied a variety of quantitative and qualitative approaches. Methodological issues included lack of a theoretical framework/model, lack of detail regarding qualitative study approaches, a single evaluation focus, environmental factors not evaluated in the early stages, and guideline adherence as the primary outcome for decision support system evaluations. Based on the findings, a three-level stratified view of health IT usability evaluation is proposed and methodological guidance is offered based upon the type of interaction that is of primary interest in the evaluation. PMID:21828224

  15. Teaching professionalism in graduate medical education: What is the role of simulation?

    PubMed

    Wali, Eisha; Pinto, Jayant M; Cappaert, Melissa; Lambrix, Marcie; Blood, Angela D; Blair, Elizabeth A; Small, Stephen D

    2016-09-01

    We systematically reviewed the literature concerning simulation-based teaching and assessment of the Accreditation Council for Graduate Medical Education professionalism competencies to elucidate best practices and facilitate further research. A systematic review of the English literature for "professionalism" and "simulation(s)" yielded 697 abstracts. Two independent raters chose abstracts that (1) focused on graduate medical education, (2) described the simulation method, and (3) used simulation to train or assess professionalism. Fifty abstracts met the criteria, and seven were excluded for lack of relevant information. The raters, 6 professionals with medical education, simulation, and clinical experience, discussed 5 of these articles as a group; they calibrated coding and applied further refinements, resulting in a final, iteratively developed evaluation form. The raters then divided into 2 teams to read and assess the remaining articles. Overall, 15 articles were eliminated, and 28 articles underwent final analysis. Papers addressed a heterogeneous range of professionalism content via multiple methods. Common specialties represented were surgery (46.4%), pediatrics (17.9%), and emergency medicine (14.3%). Sixteen articles (57%) referenced a professionalism framework; 14 (50%) incorporated an assessment tool; and 17 (60.7%) reported debriefing participants, though in limited detail. Twenty-three (82.1%) articles evaluated programs, mostly using subjective trainee reports. Despite early innovation, reporting of simulation-based professionalism training and assessment is nonstandardized in methods and terminology and lacks the details required for replication. We offer minimum standards for reporting of future professionalism-focused simulation training and assessment, as well as a basic framework for better mapping appropriate simulation methods to the targeted domain of professionalism. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Planning riparian restoration in the context of tamarix control in Western North America

    USGS Publications Warehouse

    Shafroth, P.B.; Beauchamp, Vanessa B.; Briggs, M.K.; Lair, K.; Scott, M.L.; Sher, A.A.

    2008-01-01

    Throughout the world, the condition of many riparian ecosystems has declined due to numerous factors, including encroachment of non-native species. In the western United States, millions of dollars are spent annually to control invasions of Tamarix spp., introduced small trees or shrubs from Eurasia that have colonized bottomland ecosystems along many rivers. Resource managers seek to control Tamarix in attempts to meet various objectives, such as increasing water yield and improving wildlife habitat. Often, riparian restoration is an implicit goal, but there has been little emphasis on a process or principles to effectively plan restoration activities, and many Tamarix removal projects are unsuccessful at restoring native vegetation. We propose and summarize the key steps in a planning process aimed at developing effective restoration projects in Tamarix-dominated areas. We discuss in greater detail the biotic and abiotic factors central to the evaluation of potential restoration sites and summarize information about plant communities likely to replace Tamarix under various conditions. Although many projects begin with implementation, which includes the actual removal of Tamarix, we stress the importance of pre-project planning that includes: (1) clearly identifying project goals; (2) developing realistic project objectives based on a detailed evaluation of site conditions; (3) prioritizing and selecting Tamarix control sites with the best chance of ecological recovery; and (4) developing a detailed tactical plan before Tamarix is removed. After removal, monitoring and maintenance as part of an adaptive management approach are crucial for evaluating project success and determining the most effective methods for restoring these challenging sites. ?? 2008 Society for Ecological Restoration International.

  17. Reporting of adverse drug reactions in randomised controlled trials – a systematic survey

    PubMed Central

    Loke, Yoon Kong; Derry, Sheena

    2001-01-01

    Background Decisions on treatment are guided, not only by the potential for benefit, but also by the nature and severity of adverse drug reactions. However, some researchers have found numerous deficiencies in trial reports of adverse effects. We sought to confirm these findings by evaluating trials of drug therapy published in seven eminent medical journals in 1997. Methods Literature review to determine whether the definition, recording and reporting of adverse drug reactions in clinical trials were in accordance with published recommendations on structured reporting. Results Of the 185 trials reviewed, 25 (14%) made no mention of adverse drug reactions. Data in a further 60 (32%) could not be fully evaluated, either because numbers were not given for each treatment arm (31 trials), or because a generic statement was made without full details (29 trials). When adverse drug reactions such as clinical events or patient symptoms were mentioned in the reports, details on how they had been recorded were given in only 14/95 (15%) and 18/104 (17%) trials respectively. Of the 86 trials that mentioned severity of adverse drug reactions, only 42 (49%) stated how severity had been defined. The median amount of space used for safety data in the Results and Discussion sections was 5.8%. Conclusions Trial reports often failed to provide details on how adverse drug reactions were defined or recorded. The absence of such methodological information makes comparative evaluation of adverse reaction rates potentially unreliable. Authors and journals should adopt recommendations on the structured reporting of adverse effects. PMID:11591227

  18. Abort Region Determinator (ARD) module feasibility report. Mission planning, mission analysis and software formulation

    NASA Technical Reports Server (NTRS)

    Draeger, B. G.; Joyner, J. A.

    1976-01-01

    A detailed performance evaluation of the Abort Region Determinator (ARD) module design was provided in support of OFT-1 ascent and OFT-1 intact launch aborts. The evaluation compared ARD results against results obtained using the full-up Space Vehicle Dynamic Simulations program under the same conditions. Results were presented for each of the three major ARD math models: (1) the ascent numerical integrator, (2) the mass model, and (3) the second-stage predictor, as well as for the total ARD module. These results demonstrate that the baselined ARD module meets all design objectives for mission control center orbital flight test launch/abort support.

  19. Design, Kinematic Optimization, and Evaluation of a Teleoperated System for Middle Ear Microsurgery

    PubMed Central

    Miroir, Mathieu; Nguyen, Yann; Szewczyk, Jérôme; Sterkers, Olivier; Bozorg Grayeli, Alexis

    2012-01-01

    Middle ear surgery involves the smallest and the most fragile bones of the human body. Since microsurgical gestures and a submillimetric precision are required in these procedures, the outcome can be potentially improved by robotic assistance. Today, there is no commercially available device in this field. Here, we describe a method to design a teleoperated assistance robotic system dedicated to the middle ear surgery. Determination of design specifications, the kinematic structure, and its optimization are detailed. The robot-surgeon interface and the command modes are provided. Finally, the system is evaluated by realistic tasks in experimental dedicated settings and in human temporal bone specimens. PMID:22927789

  20. A review method for UML requirements analysis model employing system-side prototyping.

    PubMed

    Ogata, Shinpei; Matsuura, Saeko

    2013-12-01

    User interface prototyping is an effective method for users to validate the requirements defined by analysts at an early stage of software development. However, a user interface prototype system offers weak support for analysts to verify the consistency of specifications about internal aspects of a system, such as business logic. The resulting inconsistency causes a lot of rework cost, because it often makes it impossible for the developers to realize the system based on the specifications. Functional prototyping is an effective method for analysts to verify such consistency, but it requires considerable cost and more detailed specifications. In this paper, we propose a review method by which analysts can efficiently verify the consistency among several different kinds of UML diagrams by employing system-side prototyping without a detailed model. The system-side prototype system does not implement any business logic, but visualizes the results of the integration among the UML diagrams as Web pages. The usefulness of our proposal was evaluated by applying it to the development of a Library Management System (LMS) for a laboratory, conducted by a group. As a result, our proposal was useful for discovering serious inconsistencies caused by misunderstandings among the members of the group.

  1. Comparative evaluation of RetCam vs. gonioscopy images in congenital glaucoma

    PubMed Central

    Azad, Raj V; Chandra, Parijat; Chandra, Anuradha; Gupta, Aparna; Gupta, Viney; Sihota, Ramanjit

    2014-01-01

    Purpose: To compare clarity, exposure and quality of anterior chamber angle visualization in congenital glaucoma patients, using RetCam and indirect gonioscopy images. Design: Cross-sectional study. Participants: Congenital glaucoma patients over the age of 5 years. Materials and Methods: A prospective consecutive pilot study was done in congenital glaucoma patients older than 5 years, using indirect gonioscopy and RetCam imaging. Clarity of the image, extent of angle visible and details of angle structures seen were graded for both methods, on digitally recorded images, in each eye, by two masked observers. Outcome Measures: Image clarity, interobserver agreement. Results: 40 eyes of 25 congenital glaucoma patients were studied. RetCam images had excellent clarity in 77.5% of patients versus 47.5% for gonioscopy. The extent of angle seen was similar with both methods. Agreement between RetCam and gonioscopy images regarding details of angle structures was 72.50% for observer 1 and 65.00% for observer 2. Conclusions: There was good agreement between RetCam and indirect gonioscopy images in detecting angle structures in congenital glaucoma patients. However, RetCam provided greater clarity, better quality, and higher magnification images. RetCam can be a useful alternative to gonioscopy in infants and small children without the need for general anesthesia. PMID:24008788

  2. A combination of HPLC and automated data analysis for monitoring the efficiency of high-pressure homogenization.

    PubMed

    Eggenreich, Britta; Rajamanickam, Vignesh; Wurm, David Johannes; Fricke, Jens; Herwig, Christoph; Spadiut, Oliver

    2017-08-01

    Cell disruption is a key unit operation for making valuable, intracellular target products accessible to further downstream unit operations. Independent of the applied cell disruption method, each cell disruption process must be evaluated with respect to disruption efficiency and potential product loss. Current state-of-the-art methods, like measuring the total amount of released protein and plating-out assays, are usually time-delayed and involve manual intervention, making them error-prone. An automated method to monitor cell disruption efficiency at-line is not available to date. In the current study we implemented a methodology, originally developed to monitor E. coli cell integrity during bioreactor cultivations, to automatically monitor and evaluate cell disruption of a recombinant E. coli strain by high-pressure homogenization. We compared our tool with a library of state-of-the-art methods, analyzed the effect of freezing the biomass before high-pressure homogenization, and finally investigated this unit operation in more detail by a multivariate approach. The combination of HPLC and automated data analysis constitutes a valuable, novel tool to monitor and evaluate cell disruption processes. Our methodology, which can be used in both upstream (USP) and downstream processing (DSP), is valuable because it can be implemented at-line, gives results within minutes of sampling, and requires no manual intervention.

  3. Nakagami-based total variation method for speckle reduction in thyroid ultrasound images.

    PubMed

    Koundal, Deepika; Gupta, Savita; Singh, Sukhwinder

    2016-02-01

    A good statistical model is necessary for the reduction in speckle noise. The Nakagami model is more general than the Rayleigh distribution for statistical modeling of speckle in ultrasound images. In this article, the Nakagami-based noise removal method is presented to enhance thyroid ultrasound images and to improve clinical diagnosis. The statistics of log-compressed image are derived from the Nakagami distribution following a maximum a posteriori estimation framework. The minimization problem is solved by optimizing an augmented Lagrange and Chambolle's projection method. The proposed method is evaluated on both artificial speckle-simulated and real ultrasound images. The experimental findings reveal the superiority of the proposed method both quantitatively and qualitatively in comparison with other speckle reduction methods reported in the literature. The proposed method yields an average signal-to-noise ratio gain of more than 2.16 dB over the non-convex regularizer-based speckle noise removal method, 3.83 dB over the Aubert-Aujol model, 1.71 dB over the Shi-Osher model and 3.21 dB over the Rudin-Lions-Osher model on speckle-simulated synthetic images. Furthermore, visual evaluation of the despeckled images shows that the proposed method suppresses speckle noise well while preserving the textures and fine details. © IMechE 2015.
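
    The total-variation core of such despeckling methods can be sketched with Chambolle's projection algorithm; the Nakagami-based fidelity term of the paper is simplified here to a quadratic one, so this is a generic TV denoiser, not the authors' method, and all names are illustrative.

      import numpy as np

      def grad2(u):
          """Forward differences with Neumann boundary."""
          gx = np.zeros_like(u); gy = np.zeros_like(u)
          gx[:, :-1] = u[:, 1:] - u[:, :-1]
          gy[:-1, :] = u[1:, :] - u[:-1, :]
          return gx, gy

      def div2(px, py):
          """Divergence, the negative adjoint of grad2."""
          dx = np.zeros_like(px); dy = np.zeros_like(py)
          dx[:, 0] = px[:, 0]
          dx[:, 1:-1] = px[:, 1:-1] - px[:, :-2]
          dx[:, -1] = -px[:, -2]
          dy[0, :] = py[0, :]
          dy[1:-1, :] = py[1:-1, :] - py[:-2, :]
          dy[-1, :] = -py[-2, :]
          return dx + dy

      def tv_denoise(f, lam=10.0, tau=0.125, iters=100):
          """Chambolle's dual projection for
          min_u ||u - f||^2 / (2 * lam) + TV(u)."""
          px = np.zeros_like(f); py = np.zeros_like(f)
          for _ in range(iters):
              gx, gy = grad2(div2(px, py) - f / lam)
              norm = 1.0 + tau * np.hypot(gx, gy)
              px = (px + tau * gx) / norm
              py = (py + tau * gy) / norm
          return f - lam * div2(px, py)

      # Toy usage on a noisy array standing in for a log-compressed image.
      noisy = np.random.rand(64, 64)
      smooth = tv_denoise(noisy, lam=0.5)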

  4. Global resilience analysis of water distribution systems.

    PubMed

    Diao, Kegong; Sweetapple, Chris; Farmani, Raziyeh; Fu, Guangtao; Ward, Sarah; Butler, David

    2016-12-01

    Evaluating and enhancing resilience in water infrastructure is a crucial step towards more sustainable urban water management. As a prerequisite to enhancing resilience, a detailed understanding is required of the inherent resilience of the underlying system. Differing from traditional risk analysis, here we propose a global resilience analysis (GRA) approach that shifts the objective from analysing multiple and unknown threats to analysing the more identifiable and measurable system responses to extreme conditions, i.e. potential failure modes. GRA aims to evaluate a system's resilience to a possible failure mode regardless of the causal threat(s) (known or unknown, external or internal). The method is applied to test the resilience of four water distribution systems (WDSs) with various features to three typical failure modes (pipe failure, excess demand, and substance intrusion). The study reveals that GRA provides an overview of a water system's resilience to various failure modes. For each failure mode, it identifies the range of corresponding failure impacts and reveals extreme scenarios (e.g. the complete loss of water supply with only 5% pipe failure, or still meeting 80% of demand despite over 70% of pipes failing). GRA also reveals that increased resilience to one failure mode may decrease resilience to another, and that increasing system capacity may delay the system's recovery in some situations. It is also shown that selecting an appropriate level of detail for hydraulic models is of great importance in resilience analysis. The method can be used as a comprehensive diagnostic framework to evaluate a range of interventions for improving system resilience in future studies. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
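
    The GRA loop described above can be skeletonized as a sweep over stress magnitudes with random scenario sampling at each level; the hydraulic response below is a toy stand-in for a real simulation, and every name and number is invented for illustration.

      import random

      rng = random.Random(1)
      # Toy network: 40 pipes, each carrying a random share of supply.
      PIPES = [f"p{i}" for i in range(40)]
      SHARE = {p: rng.uniform(0.0, 0.05) for p in PIPES}

      def demand_met(failed):
          """Stand-in for a hydraulic simulation: fraction of demand met."""
          return max(0.0, 1.0 - sum(SHARE[p] for p in failed))

      def global_resilience_analysis(levels=10, samples=200):
          """Sweep the pipe-failure mode over increasing magnitude and
          record the impact distribution at each stress level."""
          results = {}
          for level in range(1, levels + 1):
              n_failed = round(level / levels * len(PIPES))
              impacts = [1.0 - demand_met(rng.sample(PIPES, n_failed))
                         for _ in range(samples)]
              results[n_failed] = (min(impacts),
                                   sum(impacts) / samples,
                                   max(impacts))
          return results

      for n, (lo, mean, hi) in global_resilience_analysis().items():
          print(f"{n:2d} failed -> impact min {lo:.2f} mean {mean:.2f} max {hi:.2f}")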

  5. Accuracy and Spatial Variability in GPS Surveying for Landslide Mapping on Road Inventories at a Semi-Detailed Scale: the Case in Colombia

    NASA Astrophysics Data System (ADS)

    Murillo Feo, C. A.; Martínez Martínez, L. J.; Correa Muñoz, N. A.

    2016-06-01

    The accuracy of locating attributes on topographic surfaces using GPS in mountainous areas is affected by obstacles to wave propagation. As part of this research on the semi-automatic detection of landslides, we evaluate the accuracy and spatial distribution of the horizontal error of GPS positioning in the tertiary road network of six municipalities located in mountainous areas of the department of Cauca, Colombia, using geo-referencing with GPS mapping equipment and static-fast and pseudo-kinematic methods. We obtained quality parameters for the GPS surveys with differential correction, using a post-processing method. The consolidated database underwent exploratory analyses to determine the statistical distribution, a multivariate analysis to establish relationships and associations between the variables, and an analysis of the spatial variability and accuracy, considering the effect of non-Gaussian error distributions. The evaluation of the internal validity of the data provided metrics with a confidence level of 95% between 1.24 and 2.45 m in the static-fast mode and between 0.86 and 4.2 m in the pseudo-kinematic mode. The external validity had an absolute error of 4.69 m, indicating that this descriptor is more critical than precision. Based on the ASPRS standard, the scale obtained with the evaluated equipment was on the order of 1:20,000, the level of detail expected in the landslide-mapping project. Modelling the spatial variability of the horizontal errors from the empirical semivariogram analysis showed prediction errors close to the external validity of the devices.
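
    A minimal sketch of the empirical (Matheron) semivariogram computation underlying the spatial-variability modelling mentioned above, assuming 2-D point coordinates with a scalar horizontal-error value at each point; the binning rule and all names are illustrative.

      import numpy as np

      def empirical_semivariogram(coords, values, n_bins=12, max_lag=None):
          """Matheron estimator: gamma(h) = half the mean squared
          difference between pairs separated by lag ~h."""
          coords = np.asarray(coords, float)
          values = np.asarray(values, float)
          d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
          sq = (values[:, None] - values[None, :]) ** 2
          iu = np.triu_indices(len(values), k=1)     # each pair once
          lags, sqdiff = d[iu], sq[iu]
          if max_lag is None:
              max_lag = lags.max() / 2               # usual rule of thumb
          edges = np.linspace(0.0, max_lag, n_bins + 1)
          centers, gamma = [], []
          for lo, hi in zip(edges[:-1], edges[1:]):
              m = (lags >= lo) & (lags < hi)
              if m.any():
                  centers.append(0.5 * (lo + hi))
                  gamma.append(0.5 * sqdiff[m].mean())
          return np.array(centers), np.array(gamma)

      # Invented horizontal-error samples at 200 survey points (metres).
      rng = np.random.default_rng(0)
      pts = rng.uniform(0, 1000, size=(200, 2))
      err = 1.5 + 0.002 * pts[:, 0] + rng.normal(0, 0.5, 200)
      centers, gamma = empirical_semivariogram(pts, err)
      print(np.round(centers), np.round(gamma, 3))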

  6. Use of a multi-level mixed methods approach to study the effectiveness of a primary care progressive return to activity protocol after acute mild traumatic brain injury/concussion in the military.

    PubMed

    Gregory, Emma; West, Therese A; Cole, Wesley R; Bailie, Jason M; McCulloch, Karen L; Ettenhofer, Mark L; Cecchini, Amy; Qashu, Felicia M

    2017-01-01

    The large number of U.S. service members diagnosed with concussion/mild traumatic brain injury each year underscores the necessity for clear and effective clinical guidance for managing concussion. Relevant research continues to emerge supporting a gradual return to pre-injury activity levels without aggravating symptoms; however, available guidance does not provide detailed standards for this return to activity process. To fill this gap, the Defense and Veterans Brain Injury Center released a recommendation for primary care providers detailing a step-wise return to unrestricted activity during the acute phase of concussion. This guidance was developed in collaboration with an interdisciplinary group of clinical, military, and academic subject matter experts using an evidence-based approach. Systematic evaluation of the guidance is critical to ensure positive patient outcomes, to discover barriers to implementation by providers, and to identify ways to improve the recommendation. Here we describe a multi-level, mixed-methods approach to evaluate the recommendation incorporating outcomes from both patients and providers. Procedures were developed to implement the study within complex but ecologically-valid settings at multiple military treatment facilities and operational medical units. Special consideration was given to anticipated challenges such as the frequent movement of military personnel, selection of appropriate design and measures, study implementation at multiple sites, and involvement of multiple service branches (Army, Navy, and Marine Corps). We conclude by emphasizing the need to consider contemporary approaches for evaluating the effectiveness of clinical guidance. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. A New Method to Retrieve the Data Requirements of the Remote Sensing Community – Exemplarily Demonstrated for Hyperspectral User Needs

    PubMed Central

    Nieke, Jens; Reusen, Ils

    2007-01-01

    User-driven requirements for remote sensing data are difficult to define, especially details of geometric, spectral and radiometric parameters. Even more difficult is a decent assessment of the required degrees of processing and the corresponding data quality. It is therefore a real challenge to appropriately assess the data costs and services to be provided. In 2006, the HYRESSA project was initiated within the Framework 6 Programme of the European Commission to analyze the user needs of the hyperspectral remote sensing community. Special focus was given to finding an answer to the key question, "What are the individual user requirements for hyperspectral imagery and its related data products?". A Value-Benefit Analysis (VBA) was performed to retrieve user needs and address open items accordingly. The VBA is an established tool for systematic problem solving that supports the comparison of competing projects or solutions. It enables evaluation on the basis of a multidimensional objective model and can be augmented with experts' preferences. After the VBA, a scaling method (e.g., the law of comparative judgment) was applied to obtain the desired ranking judgments. The result, the relative value of projects with respect to a well-defined main objective, can therefore be produced analytically with a VBA. A multidimensional objective model adhering to VBA methodology was established. Thereafter, end users and experts were requested to fill out a Questionnaire of User Needs (QUN) at the highest level of detail - the value-indicator level. Each end user was additionally requested to report personal preferences for his or her particular research field. In the end, the results of the experts' evaluation and the results of a sensor data survey can be compared in order to understand user needs and the drawbacks of existing data products. The investigation, which focused on the needs of the hyperspectral user community in Europe, showed that a VBA is a suitable method for analyzing the needs of hyperspectral data users and supporting the sensor/data specification-building process. The VBA has the advantage of being easy to handle, resulting in a comprehensive evaluation. The primary disadvantage is the large effort involved in carrying out such an analysis, because the level of detail is extremely high.
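
    The core of a Value-Benefit Analysis is a weighted aggregation of scores over an objective model; a minimal single-level sketch follows (the QUN model is multidimensional and augmented with expert preferences), with invented criteria, weights, and scores.

      # Invented criteria weights (summing to 1) and 1-10 option scores.
      WEIGHTS = {"spectral": 0.35, "geometric": 0.25,
                 "radiometric": 0.20, "cost": 0.20}
      OPTIONS = {
          "sensor_A": {"spectral": 8, "geometric": 6, "radiometric": 7, "cost": 4},
          "sensor_B": {"spectral": 6, "geometric": 8, "radiometric": 6, "cost": 7},
      }

      def value_benefit(scores, weights):
          """Weighted sum over the objective model's criteria."""
          return sum(weights[c] * scores[c] for c in weights)

      for name, scores in sorted(OPTIONS.items()):
          print(name, round(value_benefit(scores, WEIGHTS), 2))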

  8. Effect of Silicon in U-10Mo Alloy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kautz, Elizabeth J.; Devaraj, Arun; Kovarik, Libor

    2017-08-31

    This document details a method for evaluating the effect of silicon impurity content on U-10Mo alloys. Silicon concentration in U-10Mo alloys has been shown to impact the volume fraction of precipitate phases, the effective density of the final alloy, and the U-235 enrichment in the gamma-UMo matrix. This report presents a model for calculating these quantities as a function of silicon concentration, which, along with fuel foil characterization data, will serve as a reference for quality control of the Si content of the U-10Mo final alloy. Additionally, detailed characterization using scanning electron microscope imaging, transmission electron microscope diffraction, and atom probe tomography showed that silicon impurities present in U-10Mo alloys form a Si-rich precipitate phase.

  9. Analytical criteria for performance characteristics of IgE binding methods for evaluating safety of biotech food products.

    PubMed

    Holzhauser, Thomas; Ree, Ronald van; Poulsen, Lars K; Bannon, Gary A

    2008-10-01

    There is detailed guidance on how to perform bioinformatic analyses and enzymatic degradation studies for genetically modified crops under consideration for approval by regulatory agencies; however, there is no consensus in the scientific community on the details of how to perform IgE serum studies. IgE serum studies are an important safety component to acceptance of genetically modified crops when the introduced protein is novel, the introduced protein is similar to known allergens, or the crop is allergenic. In this manuscript, we describe the characteristics of the reagents, validation of assay performance, and data analysis necessary to optimize the information obtained from serum testing of novel proteins and genetically modified (GM) crops and to make results more accurate and comparable between different investigations.

  10. Locally refined block-centred finite-difference groundwater models: Evaluation of parameter sensitivity and the consequences for inverse modelling

    USGS Publications Warehouse

    Mehl, S.; Hill, M.C.

    2002-01-01

    Models with local grid refinement, as often required in groundwater models, pose special problems for model calibration. This work investigates the calculation of sensitivities and the performance of regression methods using two existing and one new method of grid refinement. The existing local grid refinement methods considered are: (a) a variably spaced grid in which the grid spacing becomes smaller near the area of interest and larger where such detail is not needed, and (b) telescopic mesh refinement (TMR), which uses the hydraulic heads or fluxes of a regional model to provide the boundary conditions for a locally refined model. The new method has a feedback between the regional and local grids using shared nodes and thereby, unlike the TMR methods, balances heads and fluxes at the interfacing boundary. Sensitivity results are compared for the three methods, and the effect of the accuracy of the sensitivity calculations is evaluated by comparing inverse modelling results. For the cases tested, results indicate that the inaccuracies of the sensitivities calculated using the TMR approach can cause the inverse model to converge to an incorrect solution.

  11. Locally refined block-centered finite-difference groundwater models: Evaluation of parameter sensitivity and the consequences for inverse modelling and predictions

    USGS Publications Warehouse

    Mehl, S.; Hill, M.C.

    2002-01-01

    Models with local grid refinement, as often required in groundwater models, pose special problems for model calibration. This work investigates the calculation of sensitivities and the performance of regression methods using two existing and one new method of grid refinement. The existing local grid refinement methods considered are (1) a variably spaced grid in which the grid spacing becomes smaller near the area of interest and larger where such detail is not needed and (2) telescopic mesh refinement (TMR), which uses the hydraulic heads or fluxes of a regional model to provide the boundary conditions for a locally refined model. The new method has a feedback between the regional and local grids using shared nodes, and thereby, unlike the TMR methods, balances heads and fluxes at the interfacing boundary. Results for sensitivities are compared for the three methods, and the effect of the accuracy of the sensitivity calculations is evaluated by comparing inverse modelling results. For the cases tested, results indicate that the inaccuracies of the sensitivities calculated using the TMR approach can cause the inverse model to converge to an incorrect solution.

  12. Analytical Strategies Involved in the Detailed Componential Characterization of Biooil Produced from Lignocellulosic Biomass

    PubMed Central

    Li, Guo-Sheng; Wei, Xian-Yong

    2017-01-01

    Elucidation of the chemical composition of biooil is essential for evaluating the process of lignocellulosic biomass (LCBM) conversion and its upgrading, and for suggesting proper value-added utilization such as producing fuel and feedstock for fine chemicals. Although the main components of LCBM are cellulose, hemicelluloses, and lignin, the chemicals derived from LCBM differ significantly due to the various feedstocks and methods used for the decomposition. Biooil, produced from pyrolysis of LCBM, contains hundreds of organic chemicals of various classes. This review covers the methodologies used for the componential analysis of biooil, including pretreatments and instrumental analysis techniques. The use of chromatographic and spectrometric methods is highlighted, covering conventional techniques such as gas chromatography, high performance liquid chromatography, Fourier transform infrared spectroscopy, nuclear magnetic resonance, and mass spectrometry. The combination of preseparation methods and instrumental technologies is a robust pathway for the detailed componential characterization of biooil. The organic species in biooils can be classified into alkanes, alkenes, alkynes, benzene-ring-containing hydrocarbons, ethers, alcohols, phenols, aldehydes, ketones, esters, carboxylic acids, and other heteroatomic organic compounds. The recent development of high-resolution mass spectrometry and multidimensional hyphenated chromatographic and spectrometric techniques has considerably advanced the elucidation of the composition of biooils. PMID:29387086

  13. Methods and basic data from mass-loading studies in American Fork, October 1999, and Mary Ellen Gulch, Utah, September 2000

    USGS Publications Warehouse

    Kimball, Briant A.; Runkel, Robert L.; Gerner, Linda J.

    2009-01-01

    Land-management agencies are faced with decisions about remediation in streams affected by mine drainage. In support of the U.S. Forest Service, for the Uinta National Forest, the U.S. Geological Survey conducted mass-loading studies in American Fork and Mary Ellen Gulch, Utah. Synoptic samples were collected along a 10,000-meter study reach in American Fork and a 4,500-meter reach in Mary Ellen Gulch. Tracer-injection methods were combined with synoptic sampling methods to evaluate discharge and mass loading. This data-series report gives the results of the chemical analyses of these samples and provides the equations used to calculate discharge from tracer concentrations and loads from discharge and concentrations of the constituents. The detailed information from these studies will facilitate the preparation of interpretive reports and discussions with stakeholder groups. Data presented include detailed locations of the sampling sites, results of chemical analyses, and graphs of mass-loading profiles for major and trace elements in American Fork and Mary Ellen Gulch. Ultrafiltration was used to define filtered concentrations, and total-recoverable concentrations were measured on unfiltered samples.
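
    The abstract does not reproduce the report's equations; studies of this kind conventionally rest on tracer-dilution relations of the following general form (the symbols here are generic, not the report's notation):

    ```latex
    % Discharge from a constant-rate tracer injection at plateau, and
    % constituent load from discharge and concentration (generic form):
    Q = \frac{Q_{\mathrm{inj}}\, C_{\mathrm{inj}}}{C_{s} - C_{b}},
    \qquad
    L = Q\,C
    ```

    Here Q is stream discharge, Q_inj and C_inj are the injection rate and injectate tracer concentration, C_s and C_b are the plateau and background tracer concentrations in the stream, and L is the load of a constituent present at concentration C.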

  14. Preliminary experiments on surface flow visualization in the cryogenic wind tunnel by use of condensing or freezing gases

    NASA Technical Reports Server (NTRS)

    Goodyer, M. J.

    1988-01-01

    Cryogenic wind tunnel users must have available surface flow visualization techniques to satisfy a variety of needs. While the ideal technique from an aerodynamic standpoint would be non-intrusive, until an economical technique is developed there will be occasions when the user will be prepared to resort to an intrusive method. One such method is proposed, followed by preliminary evaluation experiments carried out in environments representative of the cryogenic nitrogen tunnel. The technique uses substances which are gases at normal temperature and pressure but liquid or solid at cryogenic temperatures. These are deposited on the model in localized regions, the patterns of the deposits and their subsequent melting or evaporation revealing details of the surface flow. The gases were chosen because of the likelihood that they will not permanently contaminate the model or tunnel. Twenty-four gases were identified as possibly suitable, and four of these were tested, from which it was concluded that surface flow direction can be shown by the method. Other flow details might also be detectable. The cryogenic wind tunnel used was insulated on the outside and did not show signs of contamination.

  15. Differentiation and Exploration of Model MACP for HE VER 1.0 on Prototype Performance Measurement Application for Higher Education

    NASA Astrophysics Data System (ADS)

    El Akbar, R. Reza; Anshary, Muhammad Adi Khairul; Hariadi, Dennis

    2018-02-01

    Model MACP for HE Ver.1 is a model that describes how to perform measurement and monitoring of performance for higher education. Based on a review of the research related to the model, several components of the model merit development in further research, so this research has four main objectives. The first objective is differentiation of the CSF (critical success factor) components of the previous model; the second is exploration of the KPIs (key performance indicators) in the previous model; the third, building on the previous two objectives, is the design of a new and more detailed model. The final, fourth goal is the design of a prototype application for performance measurement in higher education, based on the new model created. The method used is the explorative research method, with application design using the prototype method. The results of this study are, first, the formation of a more detailed new model for measurement and monitoring of performance in higher education through differentiation and exploration of the Model MACP for HE Ver.1. The second result is a dictionary of college performance measurement compiled by re-evaluating the existing indicators. The third result is the design of a prototype application for performance measurement in higher education.

  16. Applying CBR to machine tool product configuration design oriented to customer requirements

    NASA Astrophysics Data System (ADS)

    Wang, Pengjia; Gong, Yadong; Xie, Hualong; Liu, Yongxian; Nee, Andrew Yehching

    2017-01-01

    Product customization is a trend in the current market-oriented manufacturing environment. However, deduction from customer requirements to design results and evaluation of design alternatives are still heavily reliant on the designer's experience and knowledge. To solve the problem of fuzziness and uncertainty of customer requirements in product configuration, an analysis method based on the grey rough model is presented, by which customer requirements can be converted into technical characteristics effectively. In addition, an optimization decision model for product planning is established to help enterprises select, under the constraints of cost and time, the key technical characteristics that serve the customer with maximal satisfaction. A new case retrieval approach that combines the self-organizing map and the fuzzy similarity priority ratio method is proposed for case-based design. The self-organizing map can reduce the retrieval range and increase the retrieval efficiency, and the fuzzy similarity priority ratio method can evaluate the similarity of cases comprehensively. To ensure that the final case has the best overall performance, an evaluation method for similar cases based on grey correlation analysis is proposed to select the most suitable case. Furthermore, a computer-aided system is developed using MATLAB GUI to assist the product configuration design. An actual example and the results for an ETC series machine tool product show that the proposed method is effective, rapid and accurate in the process of product configuration. The proposed methodology provides detailed instruction for product configuration design oriented to customer requirements.

  17. Estimating flood hydrographs and volumes for Alabama streams

    USGS Publications Warehouse

    Olin, D.A.; Atkins, J.B.

    1988-01-01

    The hydraulic design of highway drainage structures involves an evaluation of the effect of the proposed highway structures on lives, property, and stream stability. Flood hydrographs and associated flood volumes are useful tools in evaluating these effects. For design purposes, the Alabama Highway Department needs information on flood hydrographs and volumes associated with flood peaks of specific recurrence intervals (design floods) at proposed or existing bridge crossings. This report will provide the engineer with a method to estimate flood hydrographs, volumes, and lagtimes for rural and urban streams in Alabama with drainage areas less than 500 sq mi. Existing computer programs and methods to estimate flood hydrographs and volumes for ungaged streams have been developed in Georgia. These computer programs and methods were applied to streams in Alabama. The report gives detailed instructions on how to estimate flood hydrographs for ungaged rural or urban streams in Alabama with drainage areas less than 500 sq mi, without significant in-channel storage or regulations. (USGS)

  18. The methodology of preparing the end faces of cylindrical waveguide of polydimethylsiloxane

    NASA Astrophysics Data System (ADS)

    Novak, M.; Nedoma, J.; Jargus, J.; Bednarek, L.; Cvejn, D.; Vasinek, V.

    2017-10-01

    Polydimethylsiloxane (PDMS) can be used for its optical properties, and its composition offers the possibility of use in dangerous environments. The authors of this article therefore focused on more detailed work with this material. The article describes the methodology of preparing the end faces of a cylindrical waveguide of the polymer polydimethylsiloxane (PDMS) to minimize losses during joining. The first method of preparing the end faces of the cylindrical waveguide is based on polishing the surface with sandpaper of three different grain sizes. The second method uses so-called heat smoothing, and the third method aligns the end faces with a new layer of polydimethylsiloxane. The outcome of the study is an evaluation of the quality of the end faces of the cylindrical waveguide based on the measured attenuation. For this experiment, a total of 140 samples was created. The attenuation was determined from both sides of the created samples for three different wavelengths of the visible spectrum.

  19. Evaluation of forensic DNA mixture evidence: protocol for evaluation, interpretation, and statistical calculations using the combined probability of inclusion.

    PubMed

    Bieber, Frederick R; Buckleton, John S; Budowle, Bruce; Butler, John M; Coble, Michael D

    2016-08-31

    The evaluation and interpretation of forensic DNA mixture evidence faces greater interpretational challenges due to increasingly complex mixture evidence. Such challenges include: casework involving low quantity or degraded evidence leading to allele and locus dropout; allele sharing of contributors leading to allele stacking; and differentiation of PCR stutter artifacts from true alleles. There is variation in statistical approaches used to evaluate the strength of the evidence when inclusion of a specific known individual(s) is determined, and the approaches used must be supportable. There are concerns that methods utilized for interpretation of complex forensic DNA mixtures may not be implemented properly in some casework. Similar questions are being raised in a number of U.S. jurisdictions, leading to some confusion about mixture interpretation for current and previous casework. Key elements necessary for the interpretation and statistical evaluation of forensic DNA mixtures are described. The most common method for statistical evaluation of DNA mixtures in many parts of the world, including the USA, is the Combined Probability of Inclusion/Exclusion (CPI/CPE); exposition and elucidation of this method, and a protocol for its use, are the focus of this article. Formulae and other supporting materials are provided. Guidance and details of a DNA mixture interpretation protocol are provided for application of the CPI/CPE method in the analysis of more complex forensic DNA mixtures. This description, in turn, should help reduce the variability of interpretation with application of this methodology and thereby improve the quality of DNA mixture interpretation throughout the forensic community.
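
    As a rough illustration of the statistic itself (not the full protocol, which adds safeguards for dropout, stutter, and stochastic effects before the calculation is permitted), the CPI multiplies per-locus inclusion probabilities computed from population allele frequencies; the frequencies below are hypothetical.

    ```python
    # Minimal Combined Probability of Inclusion (CPI) sketch, assuming the
    # interpretation protocol has already vetted the loci used.

    def cpi(loci_allele_freqs):
        """loci_allele_freqs: one list of population frequencies for the
        alleles observed in the mixture at each locus."""
        total = 1.0
        for freqs in loci_allele_freqs:
            p = sum(freqs)       # chance a random allele is among those observed
            total *= p ** 2      # probability of inclusion at this locus
        return total

    # Two hypothetical loci with three and two observed alleles.
    print(cpi([[0.10, 0.20, 0.05], [0.15, 0.25]]))
    ```

    The combined probability of exclusion then follows as CPE = 1 - CPI.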

  20. Sonographic evaluation of unexplained pleural exudate: a prospective case series.

    PubMed

    Marcun, Robert; Sustic, Alan

    2009-01-01

    Thoracic ultrasound may be helpful in differentiating between malignant and tuberculosis-associated pleural exudate. This study aimed to evaluate its utility in patients with unexplained pleural exudate. Consecutive patients were screened and pleural effusion was found in 278 patients. Pleural exudate was present in 106 patients and remained undiagnosed after biochemical and cytological evaluation in 40 patients (median age 58 years, 67% men) who then underwent detailed thoracic ultrasound for the presence of complex (septated or fibrous) or anechoic patterns. Pleural needle biopsy or thoracoscopy with histological evaluation were used for definitive diagnosis. History, clinical characteristics and routine procedures including cytology were not helpful in differential diagnosis. Pleural specimens for histological evaluation were obtained from all 40 patients and confirmed tuberculosis in 12 patients, cancer in nine and nonspecific pleuritis in 19. Sonographic finding of a complex septal pattern was present only in patients with tuberculosis (positive predictive value 100%); anechoic appearance was suggestive of nonspecific pleuritis (positive predictive value 65%). Thoracic ultrasound is a useful bedside method for differentiation of the etiology of pleural exudate. When a complex septal pattern is found, pleural needle biopsy should be the next diagnostic procedure, whereas with less complex pleural sonography findings other methods should be pursued.

  1. Probabilistic evaluation of damage potential in earthquake-induced liquefaction in a 3-D soil deposit

    NASA Astrophysics Data System (ADS)

    Halder, A.; Miller, F. J.

    1982-03-01

    A probabilistic model to evaluate the risk of liquefaction at a site, and to limit or eliminate damage during earthquake-induced liquefaction, is proposed. The model is extended to consider three-dimensional nonhomogeneous soil properties. The parameters relevant to the liquefaction phenomenon are identified, including: (1) soil parameters; (2) parameters required to consider laboratory test and sampling effects; and (3) loading parameters. The fundamentals of risk-based design concepts pertinent to liquefaction are reviewed. A detailed statistical evaluation of the soil parameters in the proposed liquefaction model is provided, and the uncertainty associated with the estimation of in situ relative density is evaluated for both direct and indirect methods. It is found that, in evaluating liquefaction potential, the uncertainties in the load parameters could be higher than those in the resistance parameters.

  2. Materials development and evaluation for the ceramic helical expander

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Landingham, R.L.; Taylor, R.W.

    The supporting role of the materials program for the ceramic helical expander program is described. The materials problems for this rotary expander in an extremely severe environment, a direct coal-fired Brayton topping cycle, are defined. Readily available materials and methods for possible solution of these materials problems were evaluated, and some longer-range studies to improve reliability were initiated. A preliminary screening of materials in hot coal-fired environments was made to select candidate materials and coatings. More detailed evaluations of these candidate materials, the reaction-bonded silicon nitride (RBSN) and Si--Al--O--N (Sialon) systems, and coatings, chemical-vapor-deposited silicon nitride (CVD-Si/sub 3/N/sub 4/) and CVD-Sialon, need to be performed. Termination of the helical expander program abruptly stopped the materials program during this evaluation.

  3. Materials development and evaluation for the ceramic helical expander

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Landingham, R.L.; Taylor, R.W.

    The supporting role of the materials program for the ceramic helical expander program is described. The materials problems for this rotary expander in an extremely severe environment, a direct coal-fired Brayton topping cycle, are defined. Readily available materials and methods are evaluated for possible solution of these materials problems, and some longer-range studies to improve reliability are initiated. A preliminary screening of materials in hot coal-fired environments to select candidate materials and coatings was made, but there is a need to perform more detailed evaluations of these candidate materials, the reaction-bonded silicon nitride (RBSN) and Si--Al--O--N (Sialon) systems, and coatings, chemical-vapor-deposited silicon nitride (CVD-Si/sub 3/N/sub 4/) and CVD-Sialon. Termination of the helical expander program abruptly stopped the materials program during this evaluation.

  4. Emlen funnel experiments revisited: methods update for studying compass orientation in songbirds.

    PubMed

    Bianco, Giuseppe; Ilieva, Mihaela; Veibäck, Clas; Öfjäll, Kristoffer; Gadomska, Alicja; Hendeby, Gustaf; Felsberg, Michael; Gustafsson, Fredrik; Åkesson, Susanne

    2016-10-01

    Migratory songbirds carry an inherited capacity to migrate several thousand kilometers each year, crossing continental landmasses and barriers between distant breeding sites and wintering areas. How individual songbirds manage to find their way with extreme precision is still largely unknown. The functional characteristics of biological compasses used by songbird migrants have mainly been investigated by recording the birds' directed migratory activity in circular cages, so-called Emlen funnels. This method is 50 years old and has not received major updates over the past decades. The aim of this work was to compare the results from newly developed digital methods with the established manual methods to evaluate songbird migratory activity and orientation in circular cages. We performed orientation experiments on the European robin (Erithacus rubecula) using modified Emlen funnels equipped with thermal paper, and simultaneously recorded the songbird movements from above. We evaluated and compared the results obtained with five different methods. Two methods have been commonly used in songbird orientation experiments; the other three methods were developed for this study and were based either on evaluation of the thermal paper using automated image analysis or on the analysis of videos recorded during the experiment. The methods used to evaluate scratches produced by the claws of birds on the thermal papers presented some differences compared with the video analyses. These differences were caused mainly by differences in scatter, as any movement of the bird along the sloping walls of the funnel was recorded on the thermal paper, whereas video evaluations allowed us to detect single takeoff attempts by the birds and to consider only this behavior in the orientation analyses. Using computer vision, we were also able to identify and separately evaluate different behaviors that were impossible to record with the thermal paper. The traditional Emlen funnel is still the most used method to investigate compass orientation in songbirds under controlled conditions. However, new numerical image analysis techniques provide a much higher level of detail of songbirds' migratory behavior and will provide an increasing number of possibilities to evaluate and quantify specific behaviors as new algorithms are developed.
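
    Orientation data from funnel experiments are typically summarized with circular statistics; the sketch below is illustrative only (it is not the authors' analysis code) and computes a mean direction, mean vector length, and Rayleigh statistic from hypothetical takeoff bearings.

    ```python
    # Circular statistics for orientation bearings (illustrative sketch).
    import math

    def mean_vector(angles_deg):
        """Mean direction (deg) and mean vector length r of bearings."""
        n = len(angles_deg)
        sx = sum(math.cos(math.radians(a)) for a in angles_deg)
        sy = sum(math.sin(math.radians(a)) for a in angles_deg)
        r = math.hypot(sx, sy) / n
        mean = math.degrees(math.atan2(sy, sx)) % 360
        return mean, r

    angles = [350, 10, 15, 355, 20, 5]     # hypothetical takeoff bearings (deg)
    mean, r = mean_vector(angles)
    z = len(angles) * r ** 2               # Rayleigh statistic; large z = directed
    print(round(mean, 1), round(r, 3), round(z, 2))
    ```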

  5. Reporting of methodological features in observational studies of pre-harvest food safety.

    PubMed

    Sargeant, Jan M; O'Connor, Annette M; Renter, David G; Kelton, David F; Snedeker, Kate; Wisener, Lee V; Leonard, Erin K; Guthrie, Alessia D; Faires, Meredith

    2011-02-01

    Observational studies in pre-harvest food safety may be useful for identifying risk factors and for evaluating potential mitigation strategies to reduce foodborne pathogens. However, there are no structured reporting guidelines for these types of study designs in livestock species. Our objective was to evaluate the reporting of observational studies in the pre-harvest food safety literature using guidelines modified from the human healthcare literature. We identified 100 pre-harvest food safety studies published between 1999 and 2009. Each study was evaluated independently by two reviewers using a structured checklist. Of the 38 studies that explicitly stated the observational study design, 27 were described as cross-sectional studies, eight as case-control studies, and three as cohort studies. Study features reported in over 75% of the selected studies included: description of the geographic location of the studies, definitions and sources of data for outcomes, organizational level and source of data for independent variables, description of statistical methods and results, number of herds enrolled in the study and included in the analysis, and sources of study funding. However, other features were not consistently reported, including details related to eligibility criteria for groups (such as barn, room, or pen) and individuals, numbers of groups and individuals included in various stages of the study, identification of primary outcomes, the distinction between putative risk factors and confounding variables, the identification of a primary exposure variable, the referent level for evaluation of categorical variable associations, methods of controlling confounding variables and missing variables, model fit, details of subset analysis, demographic information at the sampling unit level, and generalizability of the study results. Improvement in reporting of observational studies of pre-harvest food safety will aid research readers and reviewers in interpreting and evaluating the results of such studies. Copyright © 2010 Elsevier B.V. All rights reserved.

  6. Comparative Study of Nonlinear Time Warping Techniques in Isolated Word Speech Recognition Systems

    DTIC Science & Technology

    1981-06-17

    All modules are loaded under a flexible, research-oriented supervisor, "Cicada". Cicada allows for the integration of experimental ideas, extensions... evaluate alternate recognition methods. More detailed information about Cicada can be found in [7]. In the following we limit our discussion to the design of... [The remainder of this DTIC extract is table and figure residue: recognition-rate percentages and a figure captioned "Cicada - a flexible research oriented supervisor" showing reference templates, a front end, matching, and digital signal blocks.]

  7. An a priori study of different tabulation methods for turbulent pulverised coal combustion

    NASA Astrophysics Data System (ADS)

    Luo, Yujuan; Wen, Xu; Wang, Haiou; Luo, Kun; Jin, Hanhui; Fan, Jianren

    2018-05-01

    In many practical pulverised coal combustion systems, different oxidiser streams exist, e.g. the primary- and secondary-air streams in power plant boilers, which makes the modelling of these systems challenging. In this work, three tabulation methods for modelling pulverised coal combustion are evaluated through an a priori study. Pulverised coal flames stabilised in a three-dimensional turbulent counterflow, consisting of different oxidiser streams, are first simulated with detailed chemistry. Then, the thermo-chemical quantities calculated with the different tabulation methods are compared to those from the detailed chemistry solutions. The comparison shows that the conventional two-stream flamelet model with a fixed oxidiser temperature cannot predict the flame temperature correctly. The conventional two-stream flamelet model is then modified to set the oxidiser temperature equal to the fuel temperature, both of which are varied in the flamelets. By this means, the variations of oxidiser temperature can be considered. It is found that this modified tabulation method performs very well in predicting the flame temperature. The third tabulation method is an extended three-stream flamelet model that was initially proposed for gaseous combustion. The results show that the reference gaseous temperature profile can be reproduced overall by the extended three-stream flamelet model. Interestingly, it is found that the predictions of major species mass fractions are not sensitive to the oxidiser temperature boundary conditions for the flamelet equations in the a priori analyses.

  8. Optimizing ACS NSQIP modeling for evaluation of surgical quality and risk: patient risk adjustment, procedure mix adjustment, shrinkage adjustment, and surgical focus.

    PubMed

    Cohen, Mark E; Ko, Clifford Y; Bilimoria, Karl Y; Zhou, Lynn; Huffman, Kristopher; Wang, Xue; Liu, Yaoming; Kraemer, Kari; Meng, Xiangju; Merkow, Ryan; Chow, Warren; Matel, Brian; Richards, Karen; Hart, Amy J; Dimick, Justin B; Hall, Bruce L

    2013-08-01

    The American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) collects detailed clinical data from participating hospitals using standardized data definitions, analyzes these data, and provides participating hospitals with reports that permit risk-adjusted comparisons with a surgical quality standard. Since its inception, the ACS NSQIP has worked to refine surgical outcomes measurements and enhance statistical methods to improve the reliability and validity of this hospital profiling. From an original focus on controlling for between-hospital differences in patient risk factors with logistic regression, ACS NSQIP has added a variable to better adjust for the complexity and risk profile of surgical procedures (procedure mix adjustment) and stabilized estimates derived from small samples by using a hierarchical model with shrinkage adjustment. New models have been developed focusing on specific surgical procedures (eg, "Procedure Targeted" models), which provide opportunities to incorporate indication and other procedure-specific variables and outcomes to improve risk adjustment. In addition, comparative benchmark reports given to participating hospitals have been expanded considerably to allow more detailed evaluations of performance. Finally, procedures have been developed to estimate surgical risk for individual patients. This article describes the development of, and justification for, these new statistical methods and reporting strategies in ACS NSQIP. Copyright © 2013 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  9. Comparative study of resist stabilization techniques for metal etch processing

    NASA Astrophysics Data System (ADS)

    Becker, Gerry; Ross, Matthew F.; Wong, Selmer S.; Minter, Jason P.; Marlowe, Trey; Livesay, William R.

    1999-06-01

    This study investigates resist stabilization techniques as they are applied to a metal etch application. The techniques that are compared are conventional deep-UV/thermal stabilization, or UV bake, and electron beam stabilization. The electron beam tool used in this study, an ElectronCure system from AlliedSignal Inc., Electron Vision Group, utilizes a flood electron source and a non-thermal process. These stabilization techniques are compared with respect to a metal etch process. In this study, two types of resist are considered for stabilization and etch: a g/i-line resist, Shipley SPR-3012, and an advanced i-line, Shipley SPR 955-Cm. For each of these resists, the effects of stabilization on resist features are evaluated by post-stabilization SEM analysis. Etch selectivity in all cases is evaluated by using a timed metal etch and measuring the resist remaining relative to the total metal thickness etched. Etch selectivity is presented as a function of stabilization condition. Analyses of the effects of the type of stabilization on this method of selectivity measurement are also presented. SEM analysis was also performed on the features after a complete etch process and is detailed as a function of stabilization condition. Post-etch cleaning is also an important factor impacted by pre-etch resist stabilization. Results of post-etch cleaning are presented for both stabilization methods. SEM inspection is also detailed for the metal features after resist removal processing.

  10. Ethosomal nanocarriers: the impact of constituents and formulation techniques on ethosomal properties, in vivo studies, and clinical trials.

    PubMed

    Abdulbaqi, Ibrahim M; Darwis, Yusrida; Khan, Nurzalina Abdul Karim; Assi, Reem Abou; Khan, Arshad A

    2016-01-01

    Ethosomal systems are novel lipid vesicular carriers containing a relatively high percentage of ethanol. These nanocarriers are especially designed for the efficient delivery of therapeutic agents with different physicochemical properties into deep skin layers and across the skin. Ethosomes have undergone extensive research since they were invented in 1996; new compounds were added to their initial formula, which led to the production of new types of ethosomal systems. Different preparation techniques are used in the preparation of these novel carriers. For ease of application and stability, ethosomal dispersions are incorporated into gels, patches, and creams. Highly diverse in vivo models are used to evaluate their efficacy in dermal/transdermal delivery, in addition to clinical trials. This article provides a detailed review of the ethosomal systems and categorizes them on the basis of their constituents to classical ethosomes, binary ethosomes, and transethosomes. The differences among these systems are discussed from several perspectives, including the formulation, size, ζ-potential (zeta potential), entrapment efficiency, skin-permeation properties, and stability. This paper gives a detailed review on the effects of ethosomal system constituents, preparation methods, and their significant roles in determining the final properties of these nanocarriers. Furthermore, the novel pharmaceutical dosage forms of ethosomal gels, patches, and creams are highlighted. The article also provides detailed information regarding the in vivo studies and clinical trials conducted for the evaluation of these vesicular systems.

  11. A software engineering approach to expert system design and verification

    NASA Technical Reports Server (NTRS)

    Bochsler, Daniel C.; Goodwin, Mary Ann

    1988-01-01

    Software engineering design and verification methods for developing expert systems are not yet well defined. Integration of expert system technology into software production environments will require effective software engineering methodologies to support the entire life cycle of expert systems. The software engineering methods used to design and verify an expert system, RENEX, are discussed. RENEX demonstrates autonomous rendezvous and proximity operations, including replanning trajectory events and subsystem fault detection, onboard a space vehicle during flight. The RENEX designers utilized a number of software engineering methodologies to deal with the complex problems inherent in this system. An overview is presented of the methods utilized. Details of the verification process receive special emphasis. The benefits and weaknesses of the methods for supporting the development life cycle of expert systems are evaluated, and recommendations are made based on the overall experiences with the methods.

  12. Separation and quantification of monoclonal-antibody aggregates by hollow-fiber-flow field-flow fractionation.

    PubMed

    Fukuda, Jun; Iwura, Takafumi; Yanagihara, Shigehiro; Kano, Kenji

    2014-10-01

    Hollow-fiber-flow field-flow fractionation (HF5) separates protein molecules on the basis of the difference in the diffusion coefficient, and can evaluate the aggregation ratio of proteins. However, HF5 is still a minor technique because information on the separation conditions is limited. We examined in detail the effect of different settings, including the main-flow rate, the cross-flow rate, the focus point, the injection amount, and the ionic strength of the mobile phase, on fractographic characteristics. On the basis of the results, we proposed optimized conditions of the HF5 method for quantification of monoclonal antibody in sample solutions. The HF5 method was qualified regarding the precision, accuracy, linearity of the main peak, and quantitation limit. In addition, the HF5 method was applied to non-heated Mab A and heat-induced-antibody-aggregate-containing samples to evaluate the aggregation ratio and the distribution extent. The separation performance was comparable with or better than that of conventional methods including analytical ultracentrifugation-sedimentation velocity and asymmetric-flow field-flow fractionation.

  13. The Need for Precise and Well-documented Experimental Data on Prompt Fission Neutron Spectra from Neutron-induced Fission of {sup 239}Pu

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neudecker, D., E-mail: dneudecker@lanl.gov; Taddeucci, T.N.; Haight, R.C.

    2016-01-15

    The spectrum of neutrons emitted promptly after {sup 239}Pu(n,f)—a so-called prompt fission neutron spectrum (PFNS)—is a quantity of high interest, for instance, for reactor physics and global security. However, there are only a few experimental data sets available that are suitable for evaluations. In addition, some of those data sets differ by more than their 1-σ uncertainty boundaries. We present the results of MCNP studies indicating that these differences are partly caused by underestimated multiple scattering contributions, over-corrected background, and inconsistent deconvolution methods. A detailed uncertainty quantification for suitable experimental data was undertaken including these effects, and test-evaluations were performed with the improved uncertainty information. The test-evaluations illustrate that the inadequately estimated effects and detailed uncertainty quantification have an impact on the evaluated PFNS and associated uncertainties as well as the neutron multiplicity of selected critical assemblies. A summary of data and documentation needs to improve the quality of the experimental database is provided based on the results of simulations and test-evaluations. Given the possibly substantial distortion of the PFNS by multiple scattering and background effects, special care should be taken to reduce these effects in future measurements, e.g., by measuring the {sup 239}Pu PFNS as a ratio to either the {sup 235}U or {sup 252}Cf PFNS.

  14. Comparison of software and human observers in reading images of the CDMAM test object to assess digital mammography systems

    NASA Astrophysics Data System (ADS)

    Young, Kenneth C.; Cook, James J. H.; Oduko, Jennifer M.; Bosmans, Hilde

    2006-03-01

    European Guidelines for quality control in digital mammography specify minimum and achievable standards of image quality in terms of threshold contrast, based on readings of images of the CDMAM test object by human observers. However, this is time-consuming and has large inter-observer error. To overcome these problems, a software program (CDCOM) is available to automatically read CDMAM images, but the optimal method of interpreting its output is not defined. This study evaluates methods of determining threshold contrast from the program, and compares these to human readings for a variety of mammography systems. The methods considered are (A) simple thresholding, (B) psychometric curve fitting, (C) smoothing and interpolation, and (D) smoothing and psychometric curve fitting. Each method leads to similar threshold contrasts but with different reproducibility. Method (A) had relatively poor reproducibility, with a standard error in threshold contrast of 18.1 +/- 0.7%. This was reduced to 8.4% by using a contrast-detail curve fitting procedure. Method (D) had the best reproducibility, with an error of 6.7%, reducing to 5.1% with curve fitting. A panel of 3 human observers had an error of 4.4%, reduced to 2.9% by curve fitting. All automatic methods led to threshold contrasts that were lower than for humans. The ratio of human to program threshold contrasts varied with detail diameter and was 1.50 +/- .04 (sem) at 0.1 mm and 1.82 +/- .06 at 0.25 mm for method (D). There were good correlations between the threshold contrasts determined by humans and the automated methods.
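
    The psychometric-curve methods can be pictured as fitting a sigmoid to the proportion of correctly detected discs versus contrast and reading a threshold off the fit. The sketch below is a loose illustration: the functional form, guessing rate, and data are assumptions, not the CDCOM specification or the study's fitting procedure.

    ```python
    # Hypothetical psychometric fit: proportion detected vs. contrast.
    import numpy as np
    from scipy.optimize import curve_fit

    def psychometric(c, c_t, s):
        # Sigmoid rising from a 0.5 guessing level to 1; c_t gives p = 0.75.
        return 0.5 + 0.5 / (1.0 + np.exp(-(c - c_t) / s))

    contrast = np.array([0.05, 0.08, 0.12, 0.18, 0.27, 0.40])   # assumed data
    p_detect = np.array([0.52, 0.60, 0.71, 0.83, 0.93, 0.98])

    (c_t, s), _ = curve_fit(psychometric, contrast, p_detect, p0=[0.15, 0.05])
    print("threshold contrast (p = 0.75):", round(c_t, 3))
    ```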

  15. A new quantitative evaluation method for age-related changes of individual pigmented spots in facial skin.

    PubMed

    Kikuchi, K; Masuda, Y; Yamashita, T; Sato, K; Katagiri, C; Hirao, T; Mizokami, Y; Yaguchi, H

    2016-08-01

    Facial skin pigmentation is one of the most prominent visible features of skin aging and often affects perception of health and beauty. To date, facial pigmentation has been evaluated using various image analysis methods developed for the cosmetic and esthetic fields. However, existing methods cannot provide precise information on pigmented spots, such as variations in size, color shade, and distribution pattern. The purpose of this study is the development of image evaluation methods to analyze individual pigmented spots and acquire detailed information on their age-related changes. To characterize the individual pigmented spots within a cheek image, we established a simple object-counting algorithm. First, we captured cheek images using an original imaging system equipped with an illumination unit and a high-resolution digital camera. The acquired images were converted into melanin concentration images using compensation formulae. Next, the melanin images were converted into binary images. The binary images were then subjected to noise reduction. Finally, we calculated parameters such as the melanin concentration, quantity, and size of individual pigmented spots using a connected-components labeling algorithm, which assigns a unique label to each separate group of connected pixels. The cheek image analysis was evaluated on 643 female Japanese subjects. We confirmed that the proposed method was sufficiently sensitive to measure the melanin concentration, and the numbers and sizes of individual pigmented spots through manual evaluation of the cheek images. The image analysis results for the 643 Japanese women indicated clear relationships between age and the changes in the pigmented spots. We developed a new quantitative evaluation method for individual pigmented spots in facial skin. This method facilitates the analysis of the characteristics of various pigmented facial spots and is directly applicable to the fields of dermatology, pharmacology, and esthetic cosmetology. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
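
    The counting step described above maps directly onto standard image-processing primitives. In the sketch below, scipy's connected-components labeling stands in for the paper's labeling algorithm, and the random input, threshold, and opening step are placeholders for the calibrated melanin-concentration images and noise reduction described in the abstract.

    ```python
    # Spot counting via connected-components labeling (illustrative sketch).
    import numpy as np
    from scipy import ndimage

    melanin = np.random.rand(256, 256)        # placeholder for a real melanin map
    binary = melanin > 0.8                    # threshold to candidate spots
    binary = ndimage.binary_opening(binary)   # simple noise reduction

    labels, n_spots = ndimage.label(binary)   # unique label per connected group
    sizes = ndimage.sum(binary, labels, range(1, n_spots + 1))  # areas in pixels
    print(n_spots, sizes[:5])
    ```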

  16. Detailed seismic evaluation of bridges along I-24 in Western Kentucky.

    DOT National Transportation Integrated Search

    2006-09-01

    This report presents a seismic rating system and a detailed evaluation procedure for selected highway bridges on/over I-24 in Western Kentucky near the New Madrid Seismic Zone (NMSZ). The rating system, based upon structural vulnerability, seismic an...

  17. A benchmark for comparison of dental radiography analysis algorithms.

    PubMed

    Wang, Ching-Wei; Huang, Cheng-Ta; Lee, Jia-Hong; Li, Chung-Hsing; Chang, Sheng-Wei; Siao, Ming-Jhih; Lai, Tat-Ming; Ibragimov, Bulat; Vrtovec, Tomaž; Ronneberger, Olaf; Fischer, Philipp; Cootes, Tim F; Lindner, Claudia

    2016-07-01

    Dental radiography plays an important role in clinical diagnosis, treatment and surgery. In recent years, efforts have been made on developing computerized dental X-ray image analysis systems for clinical usages. A novel framework for objective evaluation of automatic dental radiography analysis algorithms has been established under the auspices of the IEEE International Symposium on Biomedical Imaging 2015 Bitewing Radiography Caries Detection Challenge and Cephalometric X-ray Image Analysis Challenge. In this article, we present the datasets, methods and results of the challenge and lay down the principles for future uses of this benchmark. The main contributions of the challenge include the creation of the dental anatomy data repository of bitewing radiographs, the creation of the anatomical abnormality classification data repository of cephalometric radiographs, and the definition of objective quantitative evaluation for comparison and ranking of the algorithms. With this benchmark, seven automatic methods for analysing cephalometric X-ray images and two automatic methods for detecting bitewing radiography caries have been compared, and detailed quantitative evaluation results are presented in this paper. Based on the quantitative evaluation results, we believe automatic dental radiography analysis is still a challenging and unsolved problem. The datasets and the evaluation software will be made available to the research community, further encouraging future developments in this field. (http://www-o.ntust.edu.tw/~cweiwang/ISBI2015/). Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  18. Variations in definition and method of retrieval of complications influence outcomes statistics after pancreatoduodenectomy: comparison of NSQIP with non-NSQIP methods.

    PubMed

    Sanford, Dominic E; Woolsey, Cheryl A; Hall, Bruce L; Linehan, David C; Hawkins, William G; Fields, Ryan C; Strasberg, Steven M

    2014-09-01

    NSQIP and the Accordion Severity Grading System have recently been used to develop quantitative methods for measuring the burden of postoperative complications. However, other audit methods such as chart reviews and prospective institutional databases are commonly used to gather postoperative complications. The purpose of this study was to evaluate discordance between different audit methods in pancreatoduodenectomy--a common major surgical procedure. The chief aim was to determine how these different methods could affect quantitative evaluations of postoperative complications. Three common audit methods were compared with NSQIP in 84 patients who underwent pancreatoduodenectomy. The methods were use of a prospective database, a chart review based on discharge summaries only, and a detailed retrospective chart review. The methods were evaluated for discordance with NSQIP and among themselves. Severity grading was performed using the Modified Accordion System. Fifty-three complications were listed by NSQIP and 31 complications were identified that were not listed by NSQIP. There was poor agreement for NSQIP-type complications between NSQIP and the other audit methods for mild and moderate complications (kappa 0.381 to 0.744), but excellent agreement for severe complications (kappa 0.953 to 1.00). Discordance was usually due to variations in definition of the complications in non-NSQIP methods. There was good agreement among non-NSQIP methods for non-NSQIP complications for moderate and severe complications, but not for mild complications. There are important differences in perceived surgical outcomes based on the method of complication retrieval. The non-NSQIP methods used in this study could not be substituted for NSQIP in a quantitative analysis unless that analysis was limited to severe complications. Copyright © 2014 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
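
    The kappa values quoted above are chance-corrected agreement coefficients. For a binary outcome (a complication recorded or not by each audit method), Cohen's kappa reduces to a few lines; the counts in the example are illustrative, not the study's data.

    ```python
    # Cohen's kappa for two raters on a binary outcome (illustrative counts).

    def cohens_kappa(a, b, c, d):
        """2x2 table: a = both yes, b = A yes/B no, c = A no/B yes, d = both no."""
        n = a + b + c + d
        po = (a + d) / n                                     # observed agreement
        pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2  # chance agreement
        return (po - pe) / (1 - pe)

    print(round(cohens_kappa(40, 5, 3, 36), 3))
    ```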

  19. Evaluation of European Domestic Violence Perpetrator Programmes: Toward a Model for Designing and Reporting Evaluations Related to Perpetrator Treatment Interventions.

    PubMed

    Lilley-Walker, Sarah-Jane; Hester, Marianne; Turner, William

    2018-03-01

    This article is based on a review of 60 evaluations (published and unpublished) relating to European domestic violence perpetrator programmes, involving 7,212 programme participants across 12 countries. The purpose of the review, part of the "IMPACT: Evaluation of European Perpetrator Programmes" project funded by the European Commission (Daphne III Programme), was to provide detailed knowledge about the range of European evaluation studies with particular emphasis on the design, methods, input, output, and outcome measures used in order to identify the possibilities and challenges of a multicountry, Europe-wide evaluation methodology that could be used to assess perpetrator programmes in the future. We provide a model to standardise the reporting of evaluation studies and to ensure attention is paid to what information is being collected at different time points so as to understand what and how the behaviour and attitudes of perpetrators might change throughout the course of the programme.

  20. An Integrated Low-Speed Performance and Noise Prediction Methodology for Subsonic Aircraft

    NASA Technical Reports Server (NTRS)

    Olson, E. D.; Mavris, D. N.

    2000-01-01

    An integrated methodology has been assembled to compute the engine performance, takeoff and landing trajectories, and community noise levels for a subsonic commercial aircraft. Where feasible, physics-based noise analysis methods have been used to make the results more applicable to newer, revolutionary designs and to allow for a more direct evaluation of new technologies. The methodology is intended to be used with approximation methods and risk analysis techniques to allow for the analysis of a greater number of variable combinations while retaining the advantages of physics-based analysis. Details of the methodology are described and limited results are presented for a representative subsonic commercial aircraft.

  1. Electron Capture in Slow Collisions of Si4+ With Atomic Hydrogen

    NASA Astrophysics Data System (ADS)

    Joseph, D. C.; Gu, J. P.; Saha, B. C.

    2009-10-01

    In recent years, charge transfer involving Si4+ and H at low energies has drawn considerable attention both theoretically and experimentally due to its importance not only in astronomical environments but also in the modern semiconductor industry. Accurate information regarding the molecular structures and interactions is essential to understand the low-energy collision dynamics. Ab initio calculations are performed using the multireference single- and double-excitation configuration-interaction (MRD-CI) method to evaluate potential energies. State-selective cross sections are calculated using fully quantum and semi-classical molecular-orbital close coupling (MOCC) methods in the adiabatic representation. Detailed results will be presented at the conference.

  2. Construction and In Vivo Testing of Prokaryotic Riboregulators.

    PubMed

    Green, Alexander A

    2017-01-01

    RNAs that are transcribed and self-assemble within living cells are valuable tools for regulating and organizing cellular activities. Riboregulators, in particular, have been widely used for modulating translation and transcription in response to cognate transactivating or trigger RNAs, enabling cells to evaluate logic operations and to respond to environmental cues. Herein we detail a set of methods for the rapid construction and testing of prokaryotic riboregulators in Escherichia coli. These methods enable construction of dozens of riboregulator plasmids at the same time without the use of restriction enzymes. Furthermore, they facilitate rapid screening of devices and can be applied to a variety of other self-assembling in vivo RNA systems.

  3. Graphical evaluation of complexometric titration curves.

    PubMed

    Guinon, J L

    1985-04-01

    A graphical method, based on logarithmic concentration diagrams, for construction, without any calculations, of complexometric titration curves is examined. The titration curves obtained for different kinds of unidentate, bidentate and quadridentate ligands clearly show why only chelating ligands are usually used in titrimetric analysis. The method has also been applied to two practical cases where unidentate ligands are used: (a) the complexometric determination of mercury(II) with halides and (b) the determination of cyanide with silver, which involves both a complexation and a precipitation system; for this purpose construction of the diagrams for the HgCl(2)/HgCl(+)/Hg(2+) and Ag(CN)(2)(-)/AgCN/CN(-) systems is considered in detail.

  4. New high resolution Random Telegraph Noise (RTN) characterization method for resistive RAM

    NASA Astrophysics Data System (ADS)

    Maestro, M.; Diaz, J.; Crespo-Yepes, A.; Gonzalez, M. B.; Martin-Martinez, J.; Rodriguez, R.; Nafria, M.; Campabadal, F.; Aymerich, X.

    2016-01-01

    Random Telegraph Noise (RTN) is one of the main reliability problems of resistive switching-based memories. To understand the physics behind RTN, a complete and accurate RTN characterization is required. The standard equipment used to analyse RTN has a typical time resolution of ∼2 ms, which prevents evaluating fast phenomena. In this work, a new RTN measurement procedure, which increases the measurement time resolution to 2 μs, is proposed. The experimental set-up, together with the recently proposed Weighted Time Lag (W-LT) method for the analysis of RTN signals, allows obtaining more detailed and precise information about the RTN phenomenon.
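
    A time-lag plot, the representation underlying the Weighted Time Lag method mentioned above, is simple to construct: each sample of the RTN trace is paired with the next, and the pairs are binned by density so that discrete current levels appear as clusters on the diagonal. The trace below is synthetic and the density weighting is only sketched.

    ```python
    # Lag-1 time-lag plot of a synthetic two-level RTN trace (sketch).
    import numpy as np

    rng = np.random.default_rng(0)
    levels = rng.integers(0, 2, 5000)                  # random two-level switching
    current = 1.0 + 0.2 * levels + 0.01 * rng.standard_normal(5000)

    x, y = current[:-1], current[1:]                   # consecutive-sample pairs
    hist, _, _ = np.histogram2d(x, y, bins=100)        # density used as the weight
    print(hist.max(), int(hist.sum()))
    ```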

  5. Evaluation of experimental methods for determining dynamic stiffness and damping of composite materials

    NASA Technical Reports Server (NTRS)

    Bert, C. W.; Clary, R. R.

    1974-01-01

    Various methods potentially usable for determining dynamic stiffness and damping of composite materials are reviewed. Of these, the following most widely used techniques are singled out for more detailed discussion: free vibration, pulse propagation, and forced vibration response. To illustrate the usefulness and validity of dynamic property data, their application in dynamic analyses and comparison with measured structural response are described for the following composite-material structures: free-free sandwich beam with glass-epoxy facings, clamped-edge sandwich plate with similar facings, free-end sandwich conical shell with similar facings, and boron-epoxy free plate with layers arranged at various orientations.

  6. The Heritage of Radiotracers for PET

    DOE R&D Accomplishments Database

    Fowler, J. S.; Wolf, A. P.

    1988-05-01

    The history of PET research clearly demonstrates that it is advances in chemistry, coupled with a detailed examination of the biochemistry of new radiotracers, which have allowed the PET method to be applied to new areas of biology and medicine. Radiotracers whose regional distribution reflects glucose metabolism, neurotransmitter activity and enzyme activity have all required the development of rapid synthetic methods for the radiotracers themselves and the characterization of their biochemical behavior. This article traces some of the advances in the production of labeled precursors and in radiotracer synthesis and evaluation which have shaped the rapidly expanding application of PET to problems in the neurosciences, in cardiology and in oncology.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Donnelly, H.; Fullwood, R.; Glancy, J.

    This is the second volume of a two-volume report on the VISA method for evaluating safeguards at fixed-site facilities. This volume contains appendices that support the description of the VISA concept and the initial working version of the method, VISA-1, presented in Volume I. The information is separated into four appendices, each describing details of one of the four analysis modules that comprise the analysis sections of the method. The first appendix discusses Path Analysis methodology, applies it to a Model Fuel Facility, and describes the computer codes that are being used; introductory material on Path Analysis is given in Chapters 3.2.1 and 4.2.1 of Volume I. The second appendix deals with Detection Analysis, specifically the schemes used in VISA-1 for classifying adversaries and the methods proposed for evaluating individual detection mechanisms in order to build the data base required for detection analysis. Examples of evaluations on identity-access systems, SNM portal monitors, and intrusion devices are provided. The third appendix describes the Containment Analysis overt-segment path ranking, the Monte Carlo engagement model, the network simulation code, the delay mechanism data base, and the results of a sensitivity analysis. The last appendix presents general equations used in Interruption Analysis for combining covert-overt segments and compares them with equations given in Volume I, Chapter 3.

  8. An analysis of the symmetry issue in the ℓ-distribution method of gas radiation in non-uniform gaseous media

    NASA Astrophysics Data System (ADS)

    André, Frédéric

    2017-03-01

    The recently proposed ℓ-distribution/ICE (Iterative Copula Evaluation) method of gas radiation suffers from symmetry issues when applied in highly non-isothermal and non-homogeneous gaseous media. This problem is studied in a detailed theoretical way. The objectives of the present paper are: (1) to provide a mathematical analysis of this problem of symmetry and (2) to suggest a decisive factor, defined in terms of the ratio between the narrow-band Planck and Rosseland mean absorption coefficients, to handle this issue. Comparisons of model predictions with reference LBL calculations show that the proposed criterion improves the accuracy of the intuitive ICE method for applications in highly non-uniform gases at high temperatures.
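
    For reference, the two narrow-band means whose ratio forms the proposed criterion have the standard definitions (B_ν is the Planck function and Δν the narrow band):

    ```latex
    % Narrow-band Planck and Rosseland mean absorption coefficients:
    \kappa_P = \frac{\int_{\Delta\nu} \kappa_\nu \, B_\nu(T)\, d\nu}
                    {\int_{\Delta\nu} B_\nu(T)\, d\nu},
    \qquad
    \frac{1}{\kappa_R} = \frac{\int_{\Delta\nu} \kappa_\nu^{-1}\,
                          \partial B_\nu / \partial T \, d\nu}
                         {\int_{\Delta\nu} \partial B_\nu / \partial T \, d\nu}
    ```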

  9. Comparison of reversible methods for data compression

    NASA Astrophysics Data System (ADS)

    Heer, Volker K.; Reinfelder, Hans-Erich

    1990-07-01

    Widely differing methods for data compression described in the ACR-NEMA draft are used in medical imaging. In our contribution we review various methods briefly and discuss their relevant advantages and disadvantages. In detail, we evaluate 1st-order DPCM, pyramid transformation, and S transformation. As coding algorithms, we compare both fixed and adaptive Huffman coding and Lempel-Ziv coding. Our comparison is performed on typical medical images from CT, MR, DSA and DLR (Digital Luminescence Radiography). Apart from the achieved compression factors, we take into account the CPU time required and the main memory requirement, both for compression and for decompression. For a realistic comparison we have implemented the mentioned algorithms in the C programming language on a MicroVAX II and a SPARCstation 1.
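
    As an illustration of the simplest scheme in the comparison, a 1st-order DPCM stage codes each pixel as its difference from the left neighbor and hands the residuals to an entropy coder. In the sketch below, zlib's LZ77-based coder stands in for the fixed/adaptive Huffman and Lempel-Ziv coders evaluated in the paper, and the input image is synthetic.

    ```python
    # 1st-order (horizontal) DPCM followed by a stand-in entropy coder.
    import numpy as np
    import zlib

    x = np.arange(64, dtype=np.uint8)
    image = np.add.outer(x, x)               # smooth synthetic 64x64 image

    # Residuals: first column is kept as-is, the rest are left-neighbor diffs.
    residual = np.diff(image.astype(np.int16), axis=1, prepend=0)
    packed = (residual + 256).astype(np.uint16).tobytes()   # shift into 0..511

    compressed = zlib.compress(packed, level=9)
    print(len(packed), "->", len(compressed), "bytes")

    # Lossless: cumulative summation along rows inverts the DPCM stage.
    restored = np.cumsum(residual, axis=1).astype(np.uint8)
    assert np.array_equal(restored, image)
    ```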

  10. Chapter 12: Survey Design and Implementation for Estimating Gross Savings Cross-Cutting Protocol. The Uniform Methods Project: Methods for Determining Energy Efficiency Savings for Specific Measures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurnik, Charles W; Baumgartner, Robert

    This chapter presents an overview of best practices for designing and executing survey research to estimate gross energy savings in energy efficiency evaluations. A detailed description of the specific techniques and strategies for designing questions, implementing a survey, and analyzing and reporting the survey procedures and results is beyond the scope of this chapter, so for each topic covered below, readers are encouraged to consult articles and books cited in References, as well as other sources that cover the specific topics in greater depth. This chapter focuses on the use of survey methods to collect data for estimating gross savings from energy efficiency programs.

  11. Evaluation of aircraft crash hazard at Los Alamos National Laboratory facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Selvage, R.D.

    This report selects a method for use in calculating the frequency of an aircraft crash occurring at selected facilities at the Los Alamos National Laboratory (the Laboratory). The Solomon method was chosen to determine these probabilities. Each variable in the Solomon method is defined and a value for each variable is selected for fourteen facilities at the Laboratory. These values and calculated probabilities are to be used in all safety analysis reports and hazards analyses for the facilities addressed in this report. This report also gives detailed directions to perform aircraft-crash frequency calculations for other facilities. This will ensure that future aircraft-crash frequency calculations are consistent with calculations in this report.
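    The report's variable definitions are not reproduced in this abstract. As a hedged sketch only, crash-frequency models of this general family typically take a product form; the names and numbers below are illustrative and are not the Solomon method's actual parameters:

        def crash_frequency(n_ops_per_year: float,
                            crash_rate_per_op_per_sq_mile: float,
                            effective_area_sq_miles: float) -> float:
            """Generic product-form estimate F = N * P * A_eff.

            n_ops_per_year: annual aircraft operations affecting the site
            crash_rate_per_op_per_sq_mile: crash probability per operation
                per square mile at the facility location
            effective_area_sq_miles: facility target area (e.g. including
                skid and shadow effects)
            All values used below are invented for illustration.
            """
            return (n_ops_per_year * crash_rate_per_op_per_sq_mile
                    * effective_area_sq_miles)

        print(f"{crash_frequency(5.0e4, 3.0e-9, 0.02):.2e} crashes/year")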

  12. Multivariate analysis: A statistical approach for computations

    NASA Astrophysics Data System (ADS)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, evaluating clusters in finance, and, more recently, in the health-related professions. The objective of the paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis provides an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks such as DDoS attacks and network scanning.
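    A minimal sketch of the correlation-matrix idea, assuming anomalies are scored by how far a traffic window's feature-correlation matrix drifts from a baseline learned on normal traffic (features, scoring norm, and data are all illustrative):

        import numpy as np

        def correlation_anomaly_score(window: np.ndarray,
                                      baseline_corr: np.ndarray) -> float:
            """Frobenius-norm drift of the window's correlation matrix
            from the normal-traffic baseline."""
            corr = np.corrcoef(window, rowvar=False)
            return float(np.linalg.norm(corr - baseline_corr, ord="fro"))

        rng = np.random.default_rng(0)
        normal = rng.normal(size=(500, 4))          # 4 traffic features
        baseline = np.corrcoef(normal, rowvar=False)
        attack = normal.copy()
        attack[:, 0] = attack[:, 1] * 5.0           # DDoS-like coupled surge
        print(correlation_anomaly_score(normal[:100], baseline))
        print(correlation_anomaly_score(attack[:100], baseline))  # larger score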

  13. Initial Results in Using a Self-Coherence Method for Detecting Sustained Oscillations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Ning; Dagle, Jeffery E.

    2015-01-01

    This paper develops a self-coherence method for detecting sustained oscillations using phasor measurement unit (PMU) data. Sustained oscillations decrease system performance and introduce potential reliability issues. Timely detection of the oscillations at an early stage provides the opportunity for taking remedial action. Using high-speed time-synchronized PMU data, this paper details a self-coherence method for detecting sustained oscillation, even when the oscillation amplitude is lower than the ambient noise. Simulation and field measurement data are used to evaluate the proposed method's performance. It is shown that the proposed method can detect sustained oscillations and estimate oscillation frequencies at a low signal-to-noise ratio. Comparison with a power spectral density method also shows that the proposed self-coherence method performs better. Index terms: coherence, power spectral density, phasor measurement unit (PMU), oscillations, power system dynamics.
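    One way to read "self-coherence" is the spectral coherence between a PMU channel and a time-delayed copy of itself: a persistently forced frequency stays coherent across the delay while ambient noise decorrelates. A sketch under that assumption (reporting rate, delay, and amplitudes are illustrative, not the paper's):

        import numpy as np
        from scipy.signal import coherence

        fs = 30.0                      # typical PMU reporting rate, Hz
        t = np.arange(0, 600, 1 / fs)  # 10 minutes of data
        rng = np.random.default_rng(1)
        # weak 0.8 Hz forced oscillation buried in ambient noise (SNR < 1)
        x = (0.05 * np.sin(2 * np.pi * 0.8 * t)
             + rng.normal(scale=0.2, size=t.size))

        delay = int(60 * fs)           # compare the signal with itself 60 s later
        f, cxy = coherence(x[:-delay], x[delay:], fs=fs, nperseg=1024)
        print(f[np.argmax(cxy)])       # peaks near 0.8 Hz for a sustained oscillation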

  14. Method for Smoke Spread Testing of Large Premises

    NASA Astrophysics Data System (ADS)

    Walmerdahl, P.; Werling, P.

    2001-11-01

    A method for performing non-destructive smoke spread tests has been developed, tested and applied to several existing buildings. The heat source is generated by burning methanol in water-cooled steel trays of various sizes; the available tray sizes cover fire sources up to nearly 1 MW. The smoke is supplied by a suitable number of smoke generators that produce a non-toxic aerosol. The advantage of the method is that it provides a means for performing non-destructive tests in existing buildings and other installations in order to evaluate the functionality and design of active fire protection measures such as smoke extraction systems. In the report, the method is described in detail, experimental data from its try-out are presented, and its applicability and flexibility are discussed.

  15. Comparative evaluation of the powder and compression properties of various grades and brands of microcrystalline cellulose by multivariate methods.

    PubMed

    Haware, Rahul V; Bauer-Brandl, Annette; Tho, Ingunn

    2010-01-01

    The present work challenges a newly developed approach to tablet formulation development by using chemically identical materials (grades and brands of microcrystalline cellulose). Tablet properties with respect to process and formulation parameters (e.g. compression speed, added lubricant and Emcompress fractions) were evaluated by 2^3 factorial designs. Tablets of constant true volume were prepared on a compaction simulator at constant pressure (approximately 100 MPa). The highly repeatable and accurate force-displacement data obtained were evaluated by the simple 'in-die' Heckel method and by work descriptors. Relationships and interactions between formulation, process and tablet parameters were identified and quantified by the multivariate analysis techniques of principal component analysis (PCA) and partial least squares regression (PLS). The method proved able to distinguish between different grades of MCC and even between two different brands of the same grade (Avicel PH 101 and Vivapur 101). One example of an interaction was studied in more detail by a mixed-level design: the interaction effect of lubricant and Emcompress on the elastic recovery of Avicel PH 102 was demonstrated to be complex and non-linear using the development tool under investigation.
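    For reference, the 'in-die' Heckel analysis mentioned above fits the compact's relative density D against compaction pressure P with the standard Heckel relation, whose slope yields the mean yield pressure of the material:

        \ln\!\left(\frac{1}{1-D}\right) = KP + A, \qquad P_y = \frac{1}{K}

    Materials that densify readily by plastic deformation, such as microcrystalline cellulose, show a steep slope K and hence a low mean yield pressure P_y.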

  16. Emergency medicine clerkship curriculum in a high-income developing country: methods for development and application.

    PubMed

    Cevik, Arif Alper; Cakal, Elif Dilek; Abu-Zidan, Fikri M

    2018-06-07

    The published recommendations for international emergency medicine curricula cover the content but exclude teaching and learning methods, assessment, and evaluation. We aim to provide an overview of available emergency medicine clerkship curricula and to report the development and application experience of our own curriculum. Our curriculum is outcome-based, enriched by e-learning and various up-to-date pedagogical principles. Teaching and learning methods, assessment, and evaluation are described. The theory behind our practice is discussed in the light of recent literature, aiming to give colleagues from developing countries a clear map for developing and tailoring their own curricula to their needs. The details of our emergency medicine clerkship will serve as an example for developing and developed countries with immature undergraduate emergency medicine clerkship curricula. However, these recommendations will differ across settings depending on available resources. The main concept of curriculum development is to create a curriculum with learning outcomes and content relevant to the local context, and then align the teaching and learning activities, assessments, and evaluations to be in harmony. This can assure a favorable educational outcome even in resource-limited settings.

  17. Nurse manager succession planning: synthesis of the evidence.

    PubMed

    Titzer, Jennifer; Phillips, Tracy; Tooley, Stephanie; Hall, Norma; Shirey, Maria

    2013-10-01

    The literature supporting nurse manager succession planning is reviewed and synthesised to discover best practice for identifying and developing future nurse managers. Healthcare succession planning practices are lacking. Nurse managers are historically selected based on clinical skills and lack formal leadership preparation. A systematic literature search appraises and summarises the current literature supporting nurse manager succession planning. Multiple reviewers were used to increase the reliability and validity of article selection and analysis. New nurse managers require months to adapt to their positions. Deliberate nurse manager succession planning should be integrated in the organisation's strategic plan and provide a proactive method for identifying and developing potential leaders. Organisations that identify and develop internal human capital can improve role transition, reduce nurse manager turnover rates and decrease replacement costs. Despite the clear benefits of succession planning, studies show that resource allocation for proactive, deliberate development of current and future nurse leaders is lacking. Additionally, systematic evaluation of succession planning is limited. Deliberate succession planning efforts and appropriate resource allocation require strategic planning and evaluation methods. Detailed evaluation methods demonstrating a positive return on investment utilising a cost-benefit analysis and empirical outcomes are necessary. © 2013 John Wiley & Sons Ltd.

  18. Methods of international health technology assessment agencies for economic evaluations--a comparative analysis.

    PubMed

    Mathes, Tim; Jacobs, Esther; Morfeld, Jana-Carina; Pieper, Dawid

    2013-09-30

    The number of Health Technology Assessment (HTA) agencies is increasing. One component of HTAs is the economic aspect, which is commonly incorporated through economic evaluations. A convergence of the methods recommended for health economic evaluations by international HTA agencies would facilitate the adaptation of results to different settings and avoid unnecessary expense. A first step in this direction is a detailed analysis of existing similarities and differences in recommendations to identify potential for harmonization. The objective is to provide an overview and comparison of the methodological recommendations of international HTA agencies for economic evaluations. The webpages of 127 international HTA agencies were searched for guidelines containing recommendations on methods for the preparation of economic evaluations. Additionally, the HTA agencies were asked for information on their methods for economic evaluations. Recommendations of the included guidelines were extracted into standardized tables covering 13 methodological aspects. All process steps were performed independently by two reviewers. Finally, 25 publications of 14 HTA agencies were included in the analysis. Methods for economic evaluations vary widely. The greatest accordance was found for the type of analysis and the comparator: cost-utility analyses or cost-effectiveness analyses are recommended, and the comparator should consistently be usual care. The greatest differences were found in the recommendations on the measurement and sources of effects, on discounting, and on sensitivity analysis. The main difference regarding effects is the focus on either efficacy or effectiveness. Recommended discounting rates range from 1.5%-5% for effects and 3%-5% for costs, whereby it is mostly recommended to use the same rate for costs and effects. With respect to sensitivity analysis, the main difference is that often either the probabilistic or the deterministic approach is recommended exclusively. Methods for modeling are described only vaguely, mainly with the rationale that the "appropriate model" depends on the decision problem. For all other aspects a comparison is challenging, as recommendations vary in detailedness and in the issues addressed. There is considerable unexplained variance in the recommendations. Further effort is needed to harmonize the methods for preparing economic evaluations.
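    The discounting recommendations compared above all instantiate the same present-value identity; the guidelines differ only in the rates r_c (costs) and r_e (effects) and in whether r_c = r_e:

        \mathrm{PV}_{\text{costs}} = \sum_{t=0}^{T} \frac{c_t}{(1+r_c)^{t}}, \qquad
        \mathrm{PV}_{\text{effects}} = \sum_{t=0}^{T} \frac{e_t}{(1+r_e)^{t}},
        \qquad r_e \in [0.015,\, 0.05],\; r_c \in [0.03,\, 0.05].

    Because the discount factor compounds, even the difference between a 1.5% and a 5% rate changes year-10 weights from about 0.86 to about 0.61, which is one reason harmonization matters for cross-setting comparability.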

  19. BOOK REVIEW: Practical Density Measurement and Hydrometry

    NASA Astrophysics Data System (ADS)

    Gupta, S. V.

    2003-01-01

    Density determinations are very important not only for science and production but also in everyday life, since very often a product is sold by mass but the content of the package is measured by volume (or vice versa), so that the density is needed to convert the values. In production processes the density serves as a measure of mixing ratios and other properties. In science, the determination of Avogadro's constant using silicon single crystals and the potential replacement of the kilogram prototype boost density determination to an extremely low relative uncertainty of 10⁻⁷ or less. The book by S V Gupta explains in detail the foundations of any density measurement, namely the volume determination of solid artefacts in terms of the SI base unit of length and the density of water and mercury. Both the history and the actual state of science are reported. For practical density measurements, these chapters contain very useful formulae and tables. Water is treated in detail since it is most widely used as a standard not only for density determination but also to gravimetrically calibrate the capacity of volumetric glassware. Two thirds of the book are devoted to the practical density measurement of solids and liquids, mainly using classical instruments like pycnometers and hydrometers. Methods using free flotation of samples in a liquid without suspension are especially useful for small samples. Also, density determinations of powders and granular or porous samples are explained. Unfortunately, modern density meters of the oscillation type are dealt with in only a few pages. The book is clearly written and easy to understand. It contains many evaluated formulae that are presented in detailed tables for practical measurements. Methods and measurement procedures are described in detail, including the calculation of uncertainty. Listings of the advantages and disadvantages of the different methods are very helpful. S V Gupta has written a book that will be a great help for scientists, students and practitioners. The book fills a gap since there is no modern book that describes density measurements and hydrometry in such detail. Horst Bettin

  20. Evaluation of Graphite Fiber/Polyimide PMCs from Hot Melt vs Solution Prepreg

    NASA Technical Reports Server (NTRS)

    Shin, E. Eugene; Sutter, James K.; Eakin, Howard; Inghram, Linda; McCorkle, Linda; Scheiman, Dan; Papadopoulos, Demetrios; Thesken, John; Fink, Jeffrey E.

    2002-01-01

    Carbon fiber reinforced high temperature polymer matrix composites (PMC) have been extensively investigated as potential weight reduction replacements of various metallic components in next generation high performance propulsion rocket engines. The initial phase involves development of comprehensive composite material-process-structure-design-property-in-service performance correlations and database, especially for a high stiffness facesheet of various sandwich structures. Overview of the program plan, technical approaches and current multi-team efforts will be presented. During composite fabrication, it was found that the two large volume commercial prepregging methods (hot-melt vs. solution) resulted in considerably different composite cure behavior. Details of the process-induced physical and chemical modifications in the prepregs, their effects on composite processing, and systematic cure cycle optimization studies will be discussed. The combined effects of prepregging method and cure cycle modification on composite properties and isothermal aging performance were also evaluated.

  1. Evaluation of a Consistent LES/PDF Method Using a Series of Experimental Spray Flames

    NASA Astrophysics Data System (ADS)

    Heye, Colin; Raman, Venkat

    2012-11-01

    A consistent method for the evolution of the joint-scalar probability density function (PDF) transport equation is proposed for application to large eddy simulation (LES) of turbulent reacting flows containing evaporating spray droplets. PDF transport equations provide the benefit of including the chemical source term in closed form; however, additional terms describing LES subfilter mixing must be modeled. The recent availability of detailed experimental measurements provides model validation data for the wide range of evaporation rates and combustion regimes that is well known to occur in spray flames. In this work, the experimental data will be used to investigate the impact of droplet mass loading and evaporation rates on the subfilter scalar PDF shape in comparison with conventional flamelet models. In addition, existing model term closures in the PDF transport equations are evaluated with a focus on their validity in the presence of regime changes.
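    The closed-form benefit cited above is the standard PDF-method identity: once the subfilter PDF of the reactive scalars is transported, the filtered chemical source requires no additional closure because the source depends only on the scalar values themselves (generic notation, not necessarily the paper's):

        \widetilde{\dot{\omega}}(\mathbf{x},t)
        \;=\; \int \dot{\omega}(\psi)\,\tilde{P}(\psi;\mathbf{x},t)\,d\psi

    The unclosed burden is shifted instead onto the subfilter mixing terms in the transport equation for \tilde{P}, which is exactly where the evaporation-dependent PDF shapes studied here matter.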

  2. Space Shuttle Redesigned Solid Rocket Motor nozzle natural frequency variations with burn time

    NASA Technical Reports Server (NTRS)

    Lui, C. Y.; Mason, D. R.

    1991-01-01

    The effects of erosion and thermal degradation on the Space Shuttle Redesigned Solid Rocket Motor (RSRM) nozzle's structural dynamic characteristics were analytically evaluated. Also considered was stiffening of the structure due to internal pressurization. A detailed NASTRAN finite element model of the nozzle was developed and used to evaluate the influence of these effects at several discrete times during motor burn. Methods were developed for treating erosion and thermal degradation, and a procedure was developed to account for internal pressure stiffening using differential stiffness matrix techniques. Results were verified using static firing test accelerometer data. Fast Fourier Transform and Maximum Entropy Method techniques were applied to the data to generate waterfall plots which track modal frequencies with burn time. Results indicate that the lower frequency nozzle 'vectoring' modes are only slightly affected by erosion, thermal effects and internal pressurization. The higher frequency shell modes of the nozzle are, however, significantly reduced.
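    A waterfall display of the kind described is essentially a sliding-window FFT of the accelerometer record, with one spectrum row per time slice so that modal frequencies can be tracked against burn time. A generic sketch (synthetic data and illustrative window sizes, not the study's processing chain):

        import numpy as np

        def waterfall(accel: np.ndarray, fs: float, win: int = 1024,
                      hop: int = 512):
            """Sliding-window FFT magnitudes for tracking modal
            frequencies over time."""
            w = np.hanning(win)
            rows = [np.abs(np.fft.rfft(w * accel[i:i + win]))
                    for i in range(0, accel.size - win + 1, hop)]
            freqs = np.fft.rfftfreq(win, d=1 / fs)
            return freqs, np.array(rows)      # shape: (time slices, freq bins)

        fs = 2000.0
        t = np.arange(0, 4, 1 / fs)
        sig = np.sin(2 * np.pi * (40 + 5 * t) * t)   # mode drifting upward
        freqs, spec = waterfall(sig, fs)
        print(spec.shape)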

  3. Fuzzy Sarsa with Focussed Replacing Eligibility Traces for Robust and Accurate Control

    NASA Astrophysics Data System (ADS)

    Kamdem, Sylvain; Ohki, Hidehiro; Sueda, Naomichi

    Several methods of reinforcement learning in continuous state and action spaces that utilize fuzzy logic have been proposed in recent years. This paper introduces Fuzzy Sarsa(λ), an on-policy algorithm for fuzzy learning that relies on a novel way of computing replacing eligibility traces to accelerate policy evaluation. It is tested against several temporal difference learning algorithms: Sarsa(λ), Fuzzy Q(λ), an earlier fuzzy version of Sarsa, and an actor-critic algorithm. We perform detailed evaluations on two benchmark problems: a maze domain and the cart pole. Results of various tests highlight the strengths and weaknesses of these algorithms and show that Fuzzy Sarsa(λ) outperforms all the other algorithms tested for a larger granularity of design and under noisy conditions. It is a highly competitive method of learning in realistic noisy domains where a denser fuzzy design over the state space is needed for more precise control.
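    For orientation, the tabular form of a Sarsa(λ) update with replacing eligibility traces is sketched below; the paper's fuzzy variant generalizes the trace update from table cells to rule firing strengths (this sketch is the classical algorithm, not the paper's extension):

        import numpy as np

        def sarsa_lambda_update(Q, e, s, a, r, s2, a2,
                                alpha=0.1, gamma=0.99, lam=0.9):
            """One tabular Sarsa(lambda) step with replacing traces."""
            delta = r + gamma * Q[s2, a2] - Q[s, a]   # TD error
            e *= gamma * lam                          # decay all traces
            e[s, a] = 1.0                             # replace, don't accumulate
            Q += alpha * delta * e
            return Q, e

        n_states, n_actions = 16, 4
        Q = np.zeros((n_states, n_actions))
        e = np.zeros_like(Q)
        Q, e = sarsa_lambda_update(Q, e, s=0, a=1, r=-1.0, s2=3, a2=2)

    Replacing (rather than accumulating) traces caps each state-action credit at 1, which is what makes the traces robust to revisits in noisy domains.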

  4. Evaluation of Graphite Fiber/Polyimide PMCs from Hot Melt versus Solution Prepreg

    NASA Technical Reports Server (NTRS)

    Shin, Eugene E.; Sutter, James K.; Eakin, Howard; Inghram, Linda; McCorkle, Linda; Scheiman, Dan; Papadopoulos, Demetrios; Thesken, John; Fink, Jeffrey E.; Gray, Hugh R. (Technical Monitor)

    2002-01-01

    Carbon fiber reinforced high temperature polymer matrix composites (PMC) have been extensively investigated as potential weight reduction replacements of various metallic components in next generation high performance propulsion rocket engines. The initial phase involves development of comprehensive composite material-process-structure-design-property in-service performance correlations and database, especially for a high stiffness facesheet of various sandwich structures. Overview of the program plan, technical approaches and current multi-team efforts will be presented. During composite fabrication, it was found that the two large volume commercial prepregging methods (hot-melt vs. solution) resulted in considerably different composite cure behavior. Details of the process-induced physical and chemical modifications in the prepregs, their effects on composite processing, and systematic cure cycle optimization studies will be discussed. The combined effects of prepregging method and cure cycle modification on composite properties and isothermal aging performance were also evaluated.

  5. A Gold Standards Approach to Training Instructors to Evaluate Crew Performance

    NASA Technical Reports Server (NTRS)

    Baker, David P.; Dismukes, R. Key

    2003-01-01

    The Advanced Qualification Program requires that airlines evaluate crew performance in Line Oriented Simulation. For this evaluation to be meaningful, instructors must observe relevant crew behaviors and evaluate those behaviors consistently and accurately against standards established by the airline. The airline industry has largely settled on an approach in which instructors evaluate crew performance on a series of event sets, using standardized grade sheets on which the behaviors specific to each event set are listed. Typically, new instructors are given a class in which they learn to use the grade sheets and practice evaluating crew performance observed on videotapes. These classes emphasize reliability, providing detailed instruction and practice in scoring so that all instructors within a given class will give similar scores to similar performance. This approach has value but also has important limitations: (1) ratings within one class of new instructors may differ from those of other classes; (2) ratings may not be driven primarily by the specific behaviors on which the company wanted the crews to be scored; and (3) ratings may not be calibrated to company standards for the level of performance skill required. In this paper we provide a way to extend the existing method of training instructors to address these three limitations. We call this the "gold standards" approach because it uses ratings from the company's most experienced instructors as the basis for training rater accuracy. This approach ties the training to the specific behaviors on which the experienced instructors based their ratings.

  6. A comparison of the simplified olecranon and digital methods of assessment of skeletal maturity during the pubertal growth spurt.

    PubMed

    Canavese, F; Charles, Y P; Dimeglio, A; Schuller, S; Rousset, M; Samba, A; Pereira, B; Steib, J-P

    2014-11-01

    Assessment of skeletal age is important in children's orthopaedics. We compared two simplified methods used in the assessment of skeletal age. Both methods have been described previously, one based on the appearance of the epiphysis at the olecranon and the other on the digital epiphyses. We also investigated the influence of assessor experience on applying these two methods. Our investigation was based on the anteroposterior left hand and lateral elbow radiographs of 44 boys (mean age 14.4 years; 12.4 to 16.1) and 78 girls (mean age 13.0 years; 11.1 to 14.9) obtained during the pubertal growth spurt. A total of nine observers examined the radiographs, with the observers assigned to three groups based on their experience (experienced, intermediate and novice). These raters were required to determine skeletal ages twice, at six-week intervals. The correlation between the two methods was determined per assessment and per observer group. Intraclass correlation coefficients (ICC) evaluated the reproducibility of the two methods. The overall correlation between the two methods was r = 0.83 for boys and r = 0.84 for girls. The correlation was equal between the first and second assessments and between the observer groups (r ≥ 0.82). There was an equally strong ICC for the assessment effect (ICC ≤ 0.4%) and the observer effect (ICC ≤ 3%) for each method. There was no difference significant at the p < 0.05 level between the levels of experience. The two methods are equally reliable in assessing skeletal maturity. The olecranon method offers detailed information during the pubertal growth spurt, while the digital method is as accurate but less detailed, making it more useful after the pubertal growth spurt, once the olecranon has ossified. ©2014 The British Editorial Society of Bone & Joint Surgery.

  7. Agricultural soil greenhouse gas emissions: a review of national inventory methods.

    PubMed

    Lokupitiya, Erandathie; Paustian, Keith

    2006-01-01

    Parties to the United Nations Framework Convention on Climate Change (UNFCCC) are required to submit national greenhouse gas (GHG) inventories, together with information on methods used in estimating their emissions. Currently agricultural activities contribute a significant portion (approximately 20%) of global anthropogenic GHG emissions, and agricultural soils have been identified as one of the main GHG source categories within the agricultural sector. However, compared to many other GHG sources, inventory methods for soils are relatively more complex and have been implemented only to varying degrees among member countries. This review summarizes and evaluates the methods used by Annex 1 countries in estimating CO2 and N2O emissions in agricultural soils. While most countries utilize the Intergovernmental Panel on Climate Change (IPCC) default methodology, several Annex 1 countries are developing more advanced methods that are tailored for specific country circumstances. Based on the latest national inventory reporting, about 56% of the Annex 1 countries use IPCC Tier 1 methods, about 26% use Tier 2 methods, and about 18% do not estimate or report N2O emissions from agricultural soils. More than 65% of the countries do not report CO2 emissions from the cultivation of mineral soils, organic soils, or liming, and only a handful of countries have used country-specific, Tier 3 methods. Tier 3 methods usually involve process-based models and detailed, geographically specific activity data. Such methods can provide more robust, accurate estimates of emissions and removals but require greater diligence in documentation, transparency, and uncertainty assessment to ensure comparability between countries. Availability of detailed, spatially explicit activity data is a major constraint to implementing higher tiered methods in many countries.

  8. Elevator ride comfort monitoring and evaluation using smartphones

    NASA Astrophysics Data System (ADS)

    Zhang, Yang; Sun, Xiaowei; Zhao, Xuefeng; Su, Wensheng

    2018-05-01

    With rapid urbanization, the demand for elevators is increasing, and their level of safety and ride comfort under vibrating conditions has also aroused interest. It is therefore essential to monitor the ride comfort level of elevators. The traditional method for such monitoring depends largely on regular professional inspections and requires expensive equipment and professional skill. In this regard, a new method for elevator ride comfort monitoring using a smartphone is demonstrated here in detail. A variety of high-precision sensors are installed in a smartphone with strong data processing and telecommunication capabilities. A series of validation tests was designed and completed, and the International Organization for Standardization standard ISO 2631-1997 was applied to evaluate the level of elevator ride comfort. Experimental results indicate that the proposed method is stable and reliable, that its precision meets engineering requirements, and that the elevator ride comfort level can be accurately monitored in various situations. The method is economical and convenient, and makes it possible for the public to participate in elevator ride comfort monitoring. In addition, the method can both provide a wide range of data support and eliminate data errors to a certain extent.
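    The ISO 2631 comfort evaluation is built on the r.m.s. of the frequency-weighted acceleration; a minimal sketch, assuming smartphone accelerometer samples in m/s² and omitting the standard's weighting filters (which would be applied first), with synthetic data in place of a real ride record:

        import numpy as np

        def comfort_rms(accel_ms2: np.ndarray) -> float:
            """Overall r.m.s. acceleration used by ISO 2631-1 comfort
            ratings; the standard's frequency-weighting filters (e.g.
            W_k for vertical vibration) are omitted from this sketch."""
            return float(np.sqrt(np.mean(accel_ms2 ** 2)))

        fs = 100.0                                   # sample rate, Hz
        t = np.arange(0.0, 10.0, 1.0 / fs)
        a_z = 0.3 * np.sin(2.0 * np.pi * 2.0 * t)    # synthetic cabin vibration
        print(f"a_w ~ {comfort_rms(a_z):.3f} m/s^2")
        # ISO 2631-1 guidance then maps a_w to comfort bands, e.g. values
        # below about 0.315 m/s^2 read as "not uncomfortable".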

  9. Intra-oral models to assess cariogenicity: evaluation of oral fluoride and pH.

    PubMed

    Duckworth, R M; Gilbert, R J

    1992-04-01

    The main purpose of this paper is to review the various methods used for evaluation of fluoride retention in saliva, plaque, and enamel following application of topical anti-caries treatments such as F dentifrices and F mouthwashes. Such methods monitor delivery of fluoride to the site of action, the mouth, and so can be regarded as assessing potential for treatment action. It is concluded that intra-oral fluoride measurements are appropriate to support bioequivalence claims for anti-caries treatments, provided that particular chosen methods have been calibrated against clinical data. Studies purporting to show superiority are of interest mechanistically, but links to caries are not sufficiently understood to define superiority claims. A wide variety of methods has been used for determination of the fluoride content of enamel. Of these, well-established methods such as the micro-drill and acid-etch procedures are appropriate for routine comparative testing, whereas sophisticated instrumental techniques such as SIMS are more appropriate for detailed mechanistic studies. Intra-oral pH measurements are also relevant to many topical treatments. Single-site determinations in plaque are preferred, but for comparative studies non-specific determinations may be adequate.

  10. Estimating Temporal Causal Interaction between Spike Trains with Permutation and Transfer Entropy

    PubMed Central

    Li, Zhaohui; Li, Xiaoli

    2013-01-01

    Estimating the causal interaction between neurons is very important for better understanding the functional connectivity in neuronal networks. We propose a method called normalized permutation transfer entropy (NPTE) to evaluate the temporal causal interaction between spike trains; it quantifies the fraction of ordinal information in one neuron that is present in another. The performance of this method is evaluated with spike trains generated by an Izhikevich neuronal model. Results show that the NPTE method can effectively estimate the causal interaction between two neurons regardless of data length. Considering both the precision of the estimated time delay and the robustness of the estimated information flow against neuronal firing rate, the NPTE method is superior to other information-theoretic methods, including normalized transfer entropy, symbolic transfer entropy and permutation conditional mutual information. To test the performance of NPTE on simulated biophysically realistic synapses, an Izhikevich cortical network based on the same neuronal model is employed. The NPTE method is found to exactly characterize mutual interactions and identify spurious causality in a network of three neurons. We conclude that the proposed method allows a more reliable comparison of interactions between different pairs of neurons and is a promising tool to uncover more details of the neural coding. PMID:23940662
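    The "permutation" ingredient of such estimators is the mapping of a series to ordinal-pattern symbols before any entropy is computed; a sketch of that first step (the normalization and transfer-entropy stages of NPTE are not reproduced here):

        import numpy as np
        from itertools import permutations

        def ordinal_symbols(x: np.ndarray, order: int = 3) -> np.ndarray:
            """Map a series to ordinal-pattern symbols of the given order,
            the symbolization step used by permutation-based estimators."""
            lookup = {p: i for i, p in enumerate(permutations(range(order)))}
            return np.array([lookup[tuple(np.argsort(x[i:i + order]))]
                             for i in range(len(x) - order + 1)])

        rates = np.array([0.1, 0.5, 0.3, 0.9, 0.2, 0.4])  # e.g. binned spike counts
        print(ordinal_symbols(rates))  # symbol sequence fed into the TE estimator

    Because only the rank order within each window matters, the symbols are insensitive to amplitude scaling, which is what gives these estimators their robustness to differences in firing rate.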

  11. Evaluation of Fatigue Strength Improvement by CFRP Laminates and Shot Peening onto the Tension Flanges Joining Corrugated Steel Webs

    PubMed Central

    Wang, Zhi-Yu; Wang, Qing-Yuan; Liu, Yong-Jie

    2015-01-01

    Corrugated steel web with inherent high out-of-plane stiffness has a promising application in configuring large-span highway bridge girders. Due to the irregularity of the configuration details, local stress concentration poses a major fatigue problem for the welded flange plates of high-strength low-alloy structural steels. In this work, CFRP laminates and shot peening were applied to the surfaces of the tension flanges with the purpose of improving the fatigue strength of such configuration details. The effectiveness of these methods in improving fatigue strength was examined experimentally. Test results show that shot peening significantly increases hardness and roughness compared with untreated surfaces, and that it enhances fatigue strength when compared against the test data of the joints with CFRP strengthening. The stiffness degradation during loading is compared for each treatment. Incorporating the stress acting on the constituent parts of the CFRP laminates, the mechanism of the retrofit and related influencing factors such as corrosion and economic cost are discussed. This work could enhance the understanding of CFRP and shot peening in repairing such welded details and shed light on the reinforcement design of welded joints between corrugated steel webs and flange plates. PMID:28793509

  12. Radio-guided sentinel lymph node identification by lymphoscintigraphy fused with an anatomical vector profile: clinical applications.

    PubMed

    Niccoli Asabella, A; Antonica, F; Renna, M A; Rubini, D; Notaristefano, A; Nicoletti, A; Rubini, G

    2013-12-01

    To develop a method for fusing lymphoscintigraphic images with an adaptable anatomical vector profile and to evaluate its role in clinical practice. We used Adobe Illustrator CS6 to create different vector profiles and fused those profiles with the patient's lymphoscintigraphic images using Adobe Photoshop CS6. We processed 197 lymphoscintigraphies performed in patients with cutaneous melanoma, breast cancer or delayed lymph drainage. Our models can be adapted to any patient attitude or position and contain different levels of anatomical detail, ranging from external body profiles to internal anatomical structures such as bones, muscles, vessels and lymph nodes. If needed, new anatomical details can be added and embedded in the profile without redrawing, saving considerable time. Details can also be easily hidden, allowing the physician to view only relevant information and structures. Fusion takes about 85 s. The diagnostic confidence of the observers increased significantly. The validation process showed a slight shift (mean 4.9 mm). We have created a new, practical, inexpensive digital technique based on commercial software for fusing lymphoscintigraphic images with built-in anatomical reference profiles. It is easily reproducible and does not alter the original scintigraphic image. Our method allows a more meaningful interpretation of lymphoscintigraphies, easier recognition of the anatomical site and better lymph node dissection planning.

  13. Methods for economic evaluation of a factorial-design cluster randomised controlled trial of a nutrition supplement and an exercise programme among healthy older people living in Santiago, Chile: the CENEX study.

    PubMed

    Walker, Damian G; Aedo, Cristian; Albala, Cecilia; Allen, Elizabeth; Dangour, Alan D; Elbourne, Diana; Grundy, Emily; Uauy, Ricardo

    2009-05-27

    In an effort to promote healthy ageing and preserve health and function, the government of Chile has formulated a package of actions into the Programme for Complementary Food in Older People (Programa de Alimentación Complementaria para el Adulto Mayor - PACAM). The CENEX study was designed to evaluate the impact, cost and cost-effectiveness of the PACAM and a specially designed exercise programme on pneumonia incidence, walking capacity and body mass index in healthy older people living in low- to medium-socio-economic status areas of Santiago. The purpose of this paper is to describe in detail the methods that will be used to estimate the incremental costs and cost-effectiveness of the interventions. The base-case analysis will adopt a societal perspective, including the direct medical and non-medical costs borne by the government and patients. The cost of the interventions will be calculated by the ingredients approach, in which the total quantities of goods and services actually employed in applying the interventions will be estimated, and multiplied by their respective unit prices. Relevant information on costs of interventions will be obtained mainly from administrative records. The costs borne by patients will be collected via exit and telephone interviews. An annual discount rate of 8% will be used, consistent with the rate recommended by the Government of Chile. All costs will be converted from Chilean Peso to US dollars with the 2007 average period exchange rate of US$1 = 522.37 Chilean Peso. To test the robustness of model results, we will vary the assumptions over a plausible range in sensitivity analyses. The protocol described here indicates our intent to conduct an economic evaluation alongside the CENEX study. It provides a detailed and transparent statement of planned data collection methods and analyses. ISRCTN48153354.
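    The discounting and currency-conversion steps stated in the protocol reduce to a few lines of arithmetic; a sketch using the quoted 8% discount rate and the 2007 average exchange rate of 522.37 Chilean Pesos per US dollar (the cost figures themselves are invented for illustration, not study data):

        def present_value_usd(costs_clp_by_year, rate=0.08, clp_per_usd=522.37):
            """Discount a stream of annual costs (Chilean Pesos, year 0
            first) at the 8% rate recommended by the Government of Chile,
            then convert to USD at the 2007 average exchange rate."""
            pv_clp = sum(c / (1 + rate) ** t
                         for t, c in enumerate(costs_clp_by_year))
            return pv_clp / clp_per_usd

        # illustrative figures: three years of intervention costs in CLP
        print(f"US$ {present_value_usd([2.0e8, 1.5e8, 1.5e8]):,.0f}")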

  14. Wavepacket dynamics and the multi-configurational time-dependent Hartree approach

    NASA Astrophysics Data System (ADS)

    Manthe, Uwe

    2017-06-01

    Multi-configurational time-dependent Hartree (MCTDH) based approaches are efficient, accurate, and versatile methods for high-dimensional quantum dynamics simulations. Applications range from detailed investigations of polyatomic reaction processes in the gas phase to high-dimensional simulations studying the dynamics of condensed phase systems described by typical solid state physics model Hamiltonians. The present article presents an overview of the different areas of application and provides a comprehensive review of the underlying theory. The concepts and guiding ideas underlying the MCTDH approach and its multi-mode and multi-layer extensions are discussed in detail. The general structure of the equations of motion is highlighted. The representation of the Hamiltonian and the correlated discrete variable representation (CDVR), which provides an efficient multi-dimensional quadrature in MCTDH calculations, are discussed. Methods which facilitate the calculation of eigenstates, the evaluation of correlation functions, and the efficient representation of thermal ensembles in MCTDH calculations are described. Different schemes for the treatment of indistinguishable particles in MCTDH calculations and recent developments towards a unified multi-layer MCTDH theory for systems including bosons and fermions are discussed.
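    At the core of the approach reviewed here is the MCTDH wavefunction ansatz, a multi-configurational expansion in time-dependent single-particle functions (standard form):

        \Psi(q_1,\dots,q_f,t) \;=\;
        \sum_{j_1=1}^{n_1}\!\cdots\!\sum_{j_f=1}^{n_f}
        A_{j_1\cdots j_f}(t)\,
        \prod_{\kappa=1}^{f} \varphi^{(\kappa)}_{j_\kappa}(q_\kappa,t)

    Both the coefficient tensor A and the single-particle functions φ evolve variationally, and the multi-layer extension re-expands A itself in the same form, which is what makes very high-dimensional simulations tractable.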

  15. Image reconstruction through thin scattering media by simulated annealing algorithm

    NASA Astrophysics Data System (ADS)

    Fang, Longjie; Zuo, Haoyi; Pang, Lin; Yang, Zuogang; Zhang, Xicheng; Zhu, Jianhua

    2018-07-01

    A phase-modulation approach to reconstructing the image of an object behind thin scattering media is proposed. The optimized phase mask is obtained by modulating the scattered light with a simulated annealing algorithm. The correlation coefficient is used as the fitness function to evaluate the quality of the reconstructed image. The images reconstructed by the simulated annealing algorithm and by a genetic algorithm are compared in detail. The experimental results show that the proposed method achieves better definition and higher speed than the genetic algorithm.
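    A minimal sketch of the optimization loop, assuming a discretised phase mask and a fitness callback that returns the image correlation coefficient (segment count, phase levels, and cooling schedule are illustrative; a toy fitness stands in for the optical measurement):

        import numpy as np

        def anneal_phase_mask(fitness, n_segments=64, levels=16,
                              t0=1.0, cooling=0.995, steps=5000, seed=0):
            """Generic simulated annealing over a discretised phase mask.
            `fitness` returns the correlation coefficient between the
            reconstructed and reference images (higher is better)."""
            rng = np.random.default_rng(seed)
            state = rng.integers(0, levels, n_segments)
            f_state, T = fitness(state), t0
            for _ in range(steps):
                trial = state.copy()
                trial[rng.integers(n_segments)] = rng.integers(levels)
                df = fitness(trial) - f_state
                if df > 0 or rng.random() < np.exp(df / T):  # Metropolis rule
                    state, f_state = trial, f_state + df
                T *= cooling
            return state, f_state

        # toy stand-in for the optical measurement: agreement with a
        # hidden target mask plays the role of the image correlation
        target = np.random.default_rng(42).integers(0, 16, 64)
        fitness = lambda m: float(np.mean(m == target))
        print(anneal_phase_mask(fitness)[1])

    The occasional acceptance of worse masks at high temperature is what lets the search escape the local optima that a greedy segment-by-segment update would get stuck in.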

  16. Absolute configuration and crystal packing for three chiral drugs prone to spontaneous resolution: Guaifenesin, methocarbamol and mephenesin

    NASA Astrophysics Data System (ADS)

    Bredikhin, Alexander A.; Gubaidullin, Aidar T.; Bredikhina, Zemfira A.; Krivolapov, Dmitry B.; Pashagin, Alexander V.; Litvinov, Igor A.

    2009-02-01

    The popular chiral drugs guaifenesin, methocarbamol, and mephenesin were investigated by single-crystal X-ray analysis for both enantiopure and racemic samples. The absolute configurations of all substances were established by the Flack parameter method. The conglomerate-forming nature of the compounds was confirmed by the equivalence of the crystal characteristics of the enantiopure and racemic samples. The molecular structures and crystal packing details were evaluated and compared with one another for all three investigated substances.
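    For reference, the Flack parameter x is defined through the standard two-component intensity model for a crystal treated as an inversion twin; x refining near 0 confirms the assigned absolute configuration, near 1 its inverse:

        I(\mathbf{h}) \;\propto\; (1-x)\,\bigl|F(\mathbf{h})\bigr|^{2}
        \;+\; x\,\bigl|F(-\mathbf{h})\bigr|^{2}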

  17. Follow-up Methodology: A Comprehensive Study and Evaluation of Academic, Technical and Vocational Del Mar College Graduates from September 1, 1973, Through August 31, 1975, Including Ways, Means, Instruments, Relationships, and Methods of Follow-up. TEX-SIS FOLLOW-UP SC4.

    ERIC Educational Resources Information Center

    Fite, Ronald S.

    This report details the research activities conducted by Del Mar College, as a subcontractor of Project FOLLOW-UP, in the design, development, and implementation of a graduate follow-up system. The activities included questionnaire design, development of manual and computerized record-keeping systems, student-graduate identification, and…

  18. International funding agencies: potential leaders of impact evaluation in protected areas?

    PubMed

    Craigie, Ian D; Barnes, Megan D; Geldmann, Jonas; Woodley, Stephen

    2015-11-05

    Globally, protected areas are the most commonly used tools to halt biodiversity loss. Yet, some are failing to adequately conserve the biodiversity they contain. There is an urgent need for knowledge on how to make them function more effectively. Impact evaluation methods provide a set of tools that could yield this knowledge. However, rigorous outcome-focused impact evaluation is not yet used as extensively as it could be in protected area management. We examine the role of international protected area funding agencies in facilitating the use of impact evaluation. These agencies are influential stakeholders as they allocate hundreds of millions of dollars annually to support protected areas, creating a unique opportunity to shape how the conservation funds are spent globally. We identify key barriers to the use of impact evaluation, detail how large funders are uniquely placed to overcome many of these, and highlight the potential benefits if impact evaluation is used more extensively. © 2015 The Author(s).

  19. Application of Multi-level Grey Evaluation on Geological Tourism Resources’ Economic Values of Geopark: A Case Study of Huashan Geopark in Shaanxi Province

    NASA Astrophysics Data System (ADS)

    Zhang, Yang; Gong, Xianjie

    2018-01-01

    A geopark gives priority to geological relic landscapes; it offers not only rich geological tourism resources but also very high value for economic development. Taking Huashan Geopark as an example, this paper systematically analyzes the characteristics of the park's geological tourism resources. It applies the method of multi-level grey evaluation to establish an evaluation model for the economic value of the geopark's tourism resources and presents detailed assessment results. The comprehensive evaluation of the economic value of Huashan's geological tourism resources yields an excellent grade, reflecting the park's outstanding natural advantages in geological resources. Moreover, among the single-item evaluations, scientific value scores highest, indicating that the park's geological tourism resources have extraordinary science-popularization value, a significant asset for the development of scientific tourism. The park is thus endowed with excellent prospects for economic development.

  20. Piecemeal deglutition and dysphagia limit in normal subjects and in patients with swallowing disorders.

    PubMed Central

    Ertekin, C; Aydoğdu, I; Yüceyar, N

    1996-01-01

    OBJECTIVE: Before advanced evaluation of deglutition and selection of a treatment method, objective screening methods are necessary for patients with dysphagia. In this study a new electroclinical test was established to evaluate patients with dysphagia. METHODS: The test is based on detecting piecemeal deglutition, a physiological phenomenon in which a bolus of large volume is divided into two or more parts that are swallowed successively. A combined electrophysiological and mechanical method was used during swallowing to record laryngeal movements, detected by a piezoelectric transducer, together with the related submental integrated EMG (SM-EMG) and, in some cases, the EMG of the cricopharyngeal muscle of the upper oesophageal sphincter (CP-EMG). Thirty normal subjects and 66 patients with overt dysphagia of neurogenic origin were investigated after detailed clinical evaluation. Twenty patients with a potential risk of dysphagia, but clinically normal at the time of investigation, were also evaluated to determine the specificity of the test. All subjects were instructed to swallow doses of water gradually increasing from 1 ml to 20 ml, and any recurrence of swallowing-related signals within eight seconds was accepted as a sign of the dysphagia limit. RESULTS: In normal subjects, as well as in the patients without dysphagia, piecemeal deglutition was never seen with less than 20 ml of water; this volume was therefore accepted as the lower limit of piecemeal deglutition. In patients with dysphagia, dysphagia limits were significantly lower than those of normal subjects. CONCLUSION: The method is a highly specific and sensitive test for the objective evaluation of oropharyngeal dysphagia, even in patients with suspected dysphagia of neurogenic origin. It can also be safely and simply applied in any EMG laboratory. PMID:8937344
