Sample records for processing technique validation

  1. Comparing Parameter Estimation Techniques for an Electrical Power Transformer Oil Temperature Prediction Model

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry

    1999-01-01

    This paper examines various sources of error in MIT's improved top oil temperature rise over ambient temperature model and estimation process. The sources of error are the current parameter estimation technique, quantization noise, and post-processing of the transformer data. Results from this paper will show that an output error parameter estimation technique should be selected to replace the current least squares estimation technique. The output error technique obtained accurate predictions of transformer behavior, revealed the best error covariance, obtained consistent parameter estimates, and provided for valid and sensible parameters. This paper will also show that the output error technique should be used to minimize errors attributed to post-processing (decimation) of the transformer data. Models used in this paper are validated using data from a large transformer in service.
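    To make the distinction above concrete, here is a minimal, hypothetical Python sketch (not the MIT top-oil model or its data) contrasting equation-error least squares, where measurement noise enters the regressors, with an output-error fit that simulates the model from the inputs only.

    ```python
    # Hypothetical first-order model theta[k+1] = a*theta[k] + b*u[k];
    # all parameters and data below are invented for illustration.
    import numpy as np
    from scipy.optimize import least_squares

    rng = np.random.default_rng(0)
    a_true, b_true, N = 0.95, 0.8, 500
    u = 1.0 + 0.3 * np.sin(np.linspace(0, 20, N))      # assumed load input
    theta = np.zeros(N)
    for k in range(N - 1):                              # simulate "true" plant
        theta[k + 1] = a_true * theta[k] + b_true * u[k]
    y = theta + rng.normal(0, 0.5, N)                   # noisy measurements

    # Equation-error least squares: regress y[k+1] on measured y[k] and u[k].
    # Noise on the regressor y[k] biases the estimates.
    A = np.column_stack([y[:-1], u[:-1]])
    a_ls, b_ls = np.linalg.lstsq(A, y[1:], rcond=None)[0]

    # Output error: simulate the model from the inputs only and fit the
    # simulated output to the measurements, so regressor noise does not enter.
    def sim(params):
        a, b = params
        th = np.zeros(N)
        for k in range(N - 1):
            th[k + 1] = a * th[k] + b * u[k]
        return th

    a_oe, b_oe = least_squares(lambda p: sim(p) - y, x0=[0.9, 1.0]).x
    print(f"true      a={a_true:.3f} b={b_true:.3f}")
    print(f"eq-error  a={a_ls:.3f} b={b_ls:.3f}")
    print(f"out-error a={a_oe:.3f} b={b_oe:.3f}")
    ```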

  2. Techniques for Down-Sampling a Measured Surface Height Map for Model Validation

    NASA Technical Reports Server (NTRS)

    Sidick, Erkin

    2012-01-01

    This software allows one to down-sample a measured surface height map for model validation, not only without introducing any re-sampling errors, but also while eliminating existing measurement noise and measurement errors. The software tool implementing the two new techniques can be used in all optical model validation processes involving large space optical surfaces.

  3. Using process elicitation and validation to understand and improve chemotherapy ordering and delivery.

    PubMed

    Mertens, Wilson C; Christov, Stefan C; Avrunin, George S; Clarke, Lori A; Osterweil, Leon J; Cassells, Lucinda J; Marquard, Jenna L

    2012-11-01

    Chemotherapy ordering and administration, in which errors have potentially severe consequences, was quantitatively and qualitatively evaluated by employing process formalism (or formal process definition), a technique derived from software engineering, to elicit and rigorously describe the process, after which validation techniques were applied to confirm the accuracy of the described process. The chemotherapy ordering and administration process, including exceptional situations and individuals' recognition of and responses to those situations, was elicited through informal, unstructured interviews with members of an interdisciplinary team. The process description (or process definition), written in a notation developed for software quality assessment purposes, guided process validation (which consisted of direct observations and semistructured interviews to confirm the elicited details for the treatment plan portion of the process). The overall process definition yielded 467 steps; 207 steps (44%) were dedicated to handling 59 exceptional situations. Validation yielded 82 unique process events (35 new expected but not yet described steps, 16 new exceptional situations, and 31 new steps in response to exceptional situations). Process participants actively altered the process as ambiguities and conflicts were discovered by the elicitation and validation components of the study. Chemotherapy error rates declined significantly during and after the project, which was conducted from October 2007 through August 2008. Each elicitation method and the subsequent validation discussions contributed uniquely to understanding the chemotherapy treatment plan review process, supporting rapid adoption of changes, improved communication regarding the process, and ensuing error reduction.

  4. Guidance and Control Systems Simulation and Validation Techniques

    DTIC Science & Technology

    1988-07-01

    AGARDograph No. 273, GUIDANCE AND CONTROL SYSTEMS SIMULATION AND VALIDATION TECHNIQUES, Edited by Dr William P. Albritton, Jr, AMTEC Corporation, 213 Ridgelawn... AND DEVELOPMENT PROCESS FOR TACTICAL GUIDED WEAPONS by Dr W. P. Albritton, Jr, AMTEC Corporation, 213 Ridgelawn Drive, Athens, AL 35611, USA. Summary: A brief

  5. Review of surface steam sterilization for validation purposes.

    PubMed

    van Doornmalen, Joost; Kopinga, Klaas

    2008-03-01

    Sterilization is an essential step in the process of producing sterile medical devices. To guarantee sterility, the process of sterilization must be validated. Because there is no direct way to measure sterility, the techniques applied to validate the sterilization process are based on statistical principles. Steam sterilization is the most frequently applied sterilization method worldwide and can be validated either by indicators (chemical or biological) or physical measurements. The steam sterilization conditions are described in the literature. Starting from these conditions, criteria for the validation of steam sterilization are derived and can be described in terms of physical parameters. Physical validation of steam sterilization appears to be an adequate and efficient validation method that could be considered as an alternative for indicator validation. Moreover, physical validation can be used for effective troubleshooting in steam sterilizing processes.

  6. Think Aloud: Using Cognitive Interviewing to Validate the PISA Assessment of Student Self-Efficacy in Mathematics

    ERIC Educational Resources Information Center

    Pepper, David; Hodgen, Jeremy; Lamesoo, Katri; Kõiv, Pille; Tolboom, Jos

    2018-01-01

    Cognitive interviewing (CI) provides a method of systematically collecting validity evidence of response processes for questionnaire items. CI involves a range of techniques for prompting individuals to verbalise their responses to items. One such technique is concurrent verbalisation, as developed in Think Aloud Protocol (TAP). This article…

  7. Machine learning, medical diagnosis, and biomedical engineering research - commentary.

    PubMed

    Foster, Kenneth R; Koprowski, Robert; Skufca, Joseph D

    2014-07-05

    A large number of papers are appearing in the biomedical engineering literature that describe the use of machine learning techniques to develop classifiers for detection or diagnosis of disease. However, the usefulness of this approach in developing clinically validated diagnostic techniques so far has been limited and the methods are prone to overfitting and other problems which may not be immediately apparent to the investigators. This commentary is intended to help sensitize investigators as well as readers and reviewers of papers to some potential pitfalls in the development of classifiers, and suggests steps that researchers can take to help avoid these problems. Building classifiers should be viewed not simply as an add-on statistical analysis, but as part and parcel of the experimental process. Validation of classifiers for diagnostic applications should be considered as part of a much larger process of establishing the clinical validity of the diagnostic technique.

  8. Validation of the F-18 high alpha research vehicle flight control and avionics systems modifications

    NASA Technical Reports Server (NTRS)

    Chacon, Vince; Pahle, Joseph W.; Regenie, Victoria A.

    1990-01-01

    The verification and validation process is a critical portion of the development of a flight system. Verification, the steps taken to assure the system meets the design specification, has become a reasonably understood and straightforward process. Validation is the method used to ensure that the system design meets the needs of the project. As systems become more integrated and more critical in their functions, the validation process becomes more complex and important. The tests, tools, and techniques which are being used for the validation of the high alpha research vehicle (HARV) turning vane control system (TVCS) are discussed, and the problems and their solutions are documented. The emphasis of this paper is on the validation of integrated systems.

  9. Validation of the F-18 high alpha research vehicle flight control and avionics systems modifications

    NASA Technical Reports Server (NTRS)

    Chacon, Vince; Pahle, Joseph W.; Regenie, Victoria A.

    1990-01-01

    The verification and validation process is a critical portion of the development of a flight system. Verification, the steps taken to assure the system meets the design specification, has become a reasonably understood and straightforward process. Validation is the method used to ensure that the system design meets the needs of the project. As systems become more integrated and more critical in their functions, the validation process becomes more complex and important. The tests, tools, and techniques which are being used for the validation of the high alpha research vehicle (HARV) turning vane control system (TVCS) are discussed, and the problems and their solutions are documented. The emphasis of this paper is on the validation of integrated systems.

  10. Dynamic MRI to quantify musculoskeletal motion: A systematic review of concurrent validity and reliability, and perspectives for evaluation of musculoskeletal disorders.

    PubMed

    Borotikar, Bhushan; Lempereur, Mathieu; Lelievre, Mathieu; Burdin, Valérie; Ben Salem, Douraied; Brochard, Sylvain

    2017-01-01

    To report evidence for the concurrent validity and reliability of dynamic MRI techniques to evaluate in vivo joint and muscle mechanics, and to propose recommendations for their use in the assessment of normal and impaired musculoskeletal function. The search was conducted on articles published in Web of science, PubMed, Scopus, Academic search Premier, and Cochrane Library between 1990 and August 2017. Studies that reported the concurrent validity and/or reliability of dynamic MRI techniques for in vivo evaluation of joint or muscle mechanics were included after assessment by two independent reviewers. Selected articles were assessed using an adapted quality assessment tool and a data extraction process. Results for concurrent validity and reliability were categorized as poor, moderate, or excellent. Twenty articles fulfilled the inclusion criteria with a mean quality assessment score of 66% (±10.4%). Concurrent validity and/or reliability of eight dynamic MRI techniques were reported, with the knee being the most evaluated joint (seven studies). Moderate to excellent concurrent validity and reliability were reported for seven out of eight dynamic MRI techniques. Cine phase contrast and real-time MRI appeared to be the most valid and reliable techniques to evaluate joint motion, and spin tag for muscle motion. Dynamic MRI techniques are promising for the in vivo evaluation of musculoskeletal mechanics; however results should be evaluated with caution since validity and reliability have not been determined for all joints and muscles, nor for many pathological conditions.

  11. Meeting the needs of an ever-demanding market.

    PubMed

    Rigby, Richard

    2002-04-01

    Balancing cost and performance in packaging is critical. This article outlines techniques to assist in this whilst delivering added value and product differentiation. The techniques include a rigorous statistical process capable of delivering cost reduction and improved quality and a computer modelling process that can save time when validating new packaging options.

  12. Skills Acquisition in Plantain Flour Processing Enterprises: A Validation of Training Modules for Senior Secondary Schools

    ERIC Educational Resources Information Center

    Udofia, Nsikak-Abasi; Nlebem, Bernard S.

    2013-01-01

    This study was conducted to validate training modules that can help provide requisite skills for Senior Secondary school students in plantain flour processing enterprises for self-employment and to enable them to pass their examinations. The study covered Rivers State. A purposive sampling technique was used to select a sample size of 205. Two sets of structured…

  13. Dynamic MRI to quantify musculoskeletal motion: A systematic review of concurrent validity and reliability, and perspectives for evaluation of musculoskeletal disorders

    PubMed Central

    Lempereur, Mathieu; Lelievre, Mathieu; Burdin, Valérie; Ben Salem, Douraied; Brochard, Sylvain

    2017-01-01

    Purpose To report evidence for the concurrent validity and reliability of dynamic MRI techniques to evaluate in vivo joint and muscle mechanics, and to propose recommendations for their use in the assessment of normal and impaired musculoskeletal function. Materials and methods The search was conducted on articles published in Web of science, PubMed, Scopus, Academic search Premier, and Cochrane Library between 1990 and August 2017. Studies that reported the concurrent validity and/or reliability of dynamic MRI techniques for in vivo evaluation of joint or muscle mechanics were included after assessment by two independent reviewers. Selected articles were assessed using an adapted quality assessment tool and a data extraction process. Results for concurrent validity and reliability were categorized as poor, moderate, or excellent. Results Twenty articles fulfilled the inclusion criteria with a mean quality assessment score of 66% (±10.4%). Concurrent validity and/or reliability of eight dynamic MRI techniques were reported, with the knee being the most evaluated joint (seven studies). Moderate to excellent concurrent validity and reliability were reported for seven out of eight dynamic MRI techniques. Cine phase contrast and real-time MRI appeared to be the most valid and reliable techniques to evaluate joint motion, and spin tag for muscle motion. Conclusion Dynamic MRI techniques are promising for the in vivo evaluation of musculoskeletal mechanics; however results should be evaluated with caution since validity and reliability have not been determined for all joints and muscles, nor for many pathological conditions. PMID:29232401

  14. Reliability and criterion validity of an observation protocol for working technique assessments in cash register work.

    PubMed

    Palm, Peter; Josephson, Malin; Mathiassen, Svend Erik; Kjellberg, Katarina

    2016-06-01

    We evaluated the intra- and inter-observer reliability and criterion validity of an observation protocol, developed in an iterative process involving practicing ergonomists, for assessment of working technique during cash register work for the purpose of preventing upper extremity symptoms. Two ergonomists independently assessed 17 15-min videos of cash register work on two occasions each, as a basis for examining reliability. Criterion validity was assessed by comparing these assessments with meticulous video-based analyses by researchers. Intra-observer reliability was acceptable (i.e. proportional agreement >0.7 and kappa >0.4) for 10/10 questions. Inter-observer reliability was acceptable for only 3/10 questions. An acceptable inter-observer reliability combined with an acceptable criterion validity was obtained only for one working technique aspect, 'Quality of movements'. Thus, major elements of the cashiers' working technique could not be assessed with an acceptable accuracy from short periods of observations by one observer, such as often desired by practitioners. Practitioner Summary: We examined an observation protocol for assessing working technique in cash register work. It was feasible in use, but inter-observer reliability and criterion validity were generally not acceptable when working technique aspects were assessed from short periods of work. We recommend the protocol to be used for educational purposes only.
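    For readers unfamiliar with the reliability thresholds quoted above (proportional agreement >0.7, kappa >0.4), a minimal sketch with invented observer ratings might look like this:

    ```python
    # Invented binary ratings from two observers on the same set of videos.
    import numpy as np
    from sklearn.metrics import cohen_kappa_score

    obs_a = np.array([1, 0, 1, 1, 0, 1, 0, 1, 1, 1, 0, 1, 1, 0, 1, 1, 0])
    obs_b = np.array([1, 0, 1, 0, 0, 1, 0, 1, 1, 1, 1, 1, 1, 0, 1, 1, 0])

    proportional_agreement = np.mean(obs_a == obs_b)   # acceptable if > 0.7
    kappa = cohen_kappa_score(obs_a, obs_b)            # acceptable if > 0.4
    print(f"agreement={proportional_agreement:.2f}, kappa={kappa:.2f}")
    ```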

  15. Use of a Computer-Mediated Delphi Process to Validate a Mass Casualty Conceptual Model

    PubMed Central

    CULLEY, JOAN M.

    2012-01-01

    Since the original work on the Delphi technique, multiple versions have been developed and used in research and industry; however, very little empirical research has been conducted that evaluates the efficacy of using online computer, Internet, and e-mail applications to facilitate a Delphi method that can be used to validate theoretical models. The purpose of this research was to develop computer, Internet, and e-mail applications to facilitate a modified Delphi technique through which experts provide validation for a proposed conceptual model that describes the information needs for a mass-casualty continuum of care. Extant literature and existing theoretical models provided the basis for model development. Two rounds of the Delphi process were needed to satisfy the criteria for consensus and/or stability related to the constructs, relationships, and indicators in the model. The majority of experts rated the online processes favorably (mean of 6.1 on a seven-point scale). Using online Internet and computer applications to facilitate a modified Delphi process offers much promise for future research involving model building or validation. The online Delphi process provided an effective methodology for identifying and describing the complex series of events and contextual factors that influence the way we respond to disasters. PMID:21076283

  16. Use of a computer-mediated Delphi process to validate a mass casualty conceptual model.

    PubMed

    Culley, Joan M

    2011-05-01

    Since the original work on the Delphi technique, multiple versions have been developed and used in research and industry; however, very little empirical research has been conducted that evaluates the efficacy of using online computer, Internet, and e-mail applications to facilitate a Delphi method that can be used to validate theoretical models. The purpose of this research was to develop computer, Internet, and e-mail applications to facilitate a modified Delphi technique through which experts provide validation for a proposed conceptual model that describes the information needs for a mass-casualty continuum of care. Extant literature and existing theoretical models provided the basis for model development. Two rounds of the Delphi process were needed to satisfy the criteria for consensus and/or stability related to the constructs, relationships, and indicators in the model. The majority of experts rated the online processes favorably (mean of 6.1 on a seven-point scale). Using online Internet and computer applications to facilitate a modified Delphi process offers much promise for future research involving model building or validation. The online Delphi process provided an effective methodology for identifying and describing the complex series of events and contextual factors that influence the way we respond to disasters.
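    The abstract does not state the exact consensus and stability criteria, so the following sketch only illustrates one common operationalization with invented ratings: consensus as the share of experts rating an item at least 5 on a 7-point scale, and stability as the change in an item's median rating between rounds.

    ```python
    # Invented Delphi ratings: rows = model elements, columns = experts.
    import numpy as np

    round1 = np.array([[6, 7, 5, 6, 4, 6],
                       [3, 4, 5, 4, 3, 4]])
    round2 = np.array([[6, 7, 6, 6, 5, 6],
                       [4, 4, 5, 4, 4, 4]])

    consensus = np.mean(round2 >= 5, axis=1)                      # per item
    stability = np.abs(np.median(round2, 1) - np.median(round1, 1))
    for i, (c, s) in enumerate(zip(consensus, stability)):
        print(f"item {i}: consensus={c:.2f}, |median change|={s:.1f}")
    ```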

  17. Managing distribution changes in time series prediction

    NASA Astrophysics Data System (ADS)

    Matias, J. M.; Gonzalez-Manteiga, W.; Taboada, J.; Ordonez, C.

    2006-07-01

    When a problem is modeled statistically, a single distribution model is usually postulated that is assumed to be valid for the entire space. Nonetheless, this practice may be somewhat unrealistic in certain application areas, in which the conditions of the process that generates the data may change; as far as we are aware, however, no techniques have been developed to tackle this problem. This article proposes a technique for modeling and predicting this change in time series with a view to improving estimates and predictions. The technique is applied, among other models, to the recently proposed hypernormal distribution. When tested on real data from a range of stock market indices, the technique produces better results than when a single distribution model is assumed to be valid for the entire period of time studied. Moreover, when a global model is postulated, it is highly recommended to select the hypernormal distribution parameter in the same likelihood maximization process.

  18. Application and Validation of Workload Assessment Techniques

    DTIC Science & Technology

    1993-03-01

    This technical report documents the process and outcome of meeting this objective. Procedure: A series of eight separate studies was conducted using three...development process. The task analysis and simulation technique was shown to have the capability to track empirical workload ratings. More research is...operator workload during the systems acquisition process, and (b) a pamphlet for the managers of Army systems that describes the need and some procedures

  19. Validating a Geographical Image Retrieval System.

    ERIC Educational Resources Information Center

    Zhu, Bin; Chen, Hsinchun

    2000-01-01

    Summarizes a prototype geographical image retrieval system that demonstrates how to integrate image processing and information analysis techniques to support large-scale content-based image retrieval. Describes an experiment to validate the performance of this image retrieval system against that of human subjects by examining similarity analysis…

  20. Demography of Principals' Work and School Improvement: Content Validity of Kentucky's Standards and Indicators for School Improvement (SISI)

    ERIC Educational Resources Information Center

    Lindle, Jane Clark; Stalion, Nancy; Young, Lu

    2005-01-01

    Kentucky's accountability system includes a school-processes audit known as Standards and Indicators for School Improvement (SISI), which is in a nascent stage of validation. Content validity methods include comparison to instruments measuring similar constructs as well as other techniques such as job analysis. This study used a two-phase process…

  1. Expert system verification and validation study. Delivery 3A and 3B: Trip summaries

    NASA Technical Reports Server (NTRS)

    French, Scott

    1991-01-01

    Key results are documented from attending the 4th workshop on verification, validation, and testing. The most interesting part of the workshop was when representatives from the U.S., Japan, and Europe presented surveys of VV&T within their respective regions. Another interesting part focused on current efforts to define industry standards for artificial intelligence and how that might affect approaches to VV&T of expert systems. The next part of the workshop focused on VV&T methods of applying mathematical techniques to verification of rule bases and techniques for capturing information relating to the process of developing software. The final part focused on software tools. A summary is also presented of the EPRI conference on 'Methodologies, Tools, and Standards for Cost Effective Reliable Software Verification and Validation'. The conference was divided into discussion sessions on the following issues: development process, automated tools, software reliability, methods, standards, and cost/benefit considerations.

  2. Development and Validation of Cognitive Screening Instruments.

    ERIC Educational Resources Information Center

    Jarman, Ronald F.

    The author suggests that most research on the early detection of learning disabilities is characterized by an ineffective and atheoretical method of selecting and validating tasks. An alternative technique is proposed, based on a neurological theory of cognitive processes, whereby task analysis is a first step, with empirical analyses as…

  3. Issues Validation: A New Environmental Scanning Technique for Family Life Educators.

    ERIC Educational Resources Information Center

    Weigel, Randy R.; And Others

    1992-01-01

    Three-state study used Issues Validation, environmental scanning process for family life educators that combines literature reviews, professional and public opinion, and survey research to identify issues facing families and youth. Samples of residents, local advisory committees, and community professionals ranked 30 issues facing families and…

  4. Nature of the optical information recorded in speckles

    NASA Astrophysics Data System (ADS)

    Sciammarella, Cesar A.

    1998-09-01

    The process of encoding displacement information in electronic holographic interferometry is reviewed. Procedures to extend the applicability of this technique to large deformations are given. The proposed techniques are applied, and results from these experiments are compared with results obtained by other means. The similarity between the two sets of results illustrates the validity of the new techniques.

  5. Advanced Software V&V for Civil Aviation and Autonomy

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume P.

    2017-01-01

    With the advances in high-performance computing platforms (e.g., advanced graphics processing units or multi-core processors), computationally intensive software techniques such as the ones used in artificial intelligence or formal methods have provided us with an opportunity to further increase safety in the aviation industry. Some of these techniques have facilitated building safety at design time, like in aircraft engines or software verification and validation, and others can introduce safety benefits during operations as long as we adapt our processes. In this talk, I will present how NASA is taking advantage of these new software techniques to build in safety at design time through advanced software verification and validation, which can be applied earlier and earlier in the design life cycle and thus help also reduce the cost of aviation assurance. I will then show how run-time techniques (such as runtime assurance or data analytics) offer us a chance to catch even more complex problems, even in the face of changing and unpredictable environments. These new techniques will be extremely useful as our aviation systems become more complex and more autonomous.

  6. Integration of design and inspection

    NASA Astrophysics Data System (ADS)

    Simmonds, William H.

    1990-08-01

    Developments in advanced computer integrated manufacturing technology, coupled with the emphasis on Total Quality Management, are exposing needs for new techniques to integrate all functions from design through to support of the delivered product. One critical functional area that must be integrated into design is that embracing the measurement, inspection and test activities necessary for validation of the delivered product. This area is being tackled by a collaborative project supported by the UK Government Department of Trade and Industry. The project is aimed at developing techniques for analysing validation needs and for planning validation methods. Within the project an experimental Computer Aided Validation Expert system (CAVE) is being constructed. This operates with a generalised model of the validation process and helps with all design stages: specification of product requirements; analysis of the assurance provided by a proposed design and method of manufacture; development of the inspection and test strategy; and analysis of feedback data. The kernel of the system is a knowledge base containing knowledge of the manufacturing process capabilities and of the available inspection and test facilities. The CAVE system is being integrated into a real life advanced computer integrated manufacturing facility for demonstration and evaluation.

  7. Zr Extrusion – Direct Input for Models & Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cerreta, Ellen Kathleen

    As we examine differences in the high strain rate, high strain tensile response of high-purity, highly textured Zr as a function of loading direction, temperature, and extrusion velocity with primarily post mortem characterization techniques, we have also developed a technique for characterizing the in-situ extrusion process. This particular measurement is useful for partitioning the energy of the system during the extrusion process: friction, kinetic energy, and temperature.

  8. A general software reliability process simulation technique

    NASA Technical Reports Server (NTRS)

    Tausworthe, Robert C.

    1991-01-01

    The structure and rationale of the generalized software reliability process, together with the design and implementation of a computer program that simulates this process are described. Given assumed parameters of a particular project, the users of this program are able to generate simulated status timelines of work products, numbers of injected anomalies, and the progress of testing, fault isolation, repair, validation, and retest. Such timelines are useful in comparison with actual timeline data, for validating the project input parameters, and for providing data for researchers in reliability prediction modeling.

  9. MOSAIC - A space-multiplexing technique for optical processing of large images

    NASA Technical Reports Server (NTRS)

    Athale, Ravindra A.; Astor, Michael E.; Yu, Jeffrey

    1993-01-01

    A technique for Fourier processing of images larger than the space-bandwidth products of conventional or smart spatial light modulators and two-dimensional detector arrays is described. The technique involves a spatial combination of subimages displayed on individual spatial light modulators to form a phase-coherent image, which is subsequently processed with Fourier optical techniques. Because of the technique's similarity with the mosaic technique used in art, the processor used is termed an optical MOSAIC processor. The phase accuracy requirements of this system were studied by computer simulation. It was found that phase errors of less than lambda/8 did not degrade the performance of the system and that the system was relatively insensitive to amplitude nonuniformities. Several schemes for implementing the subimage combination are described. Initial experimental results demonstrating the validity of the mosaic concept are also presented.
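    A rough numerical sketch of the kind of phase-error study described above might look as follows; the tiling, image content, and lambda/8 piston-error bound are assumptions for illustration, not the authors' simulation.

    ```python
    # Tile an image from sub-apertures, add a random piston phase error per
    # tile, and compare the Fourier plane with the error-free case.
    import numpy as np

    rng = np.random.default_rng(1)
    tile, n = 64, 4                                  # tile size, 4x4 mosaic
    img = rng.random((tile * n, tile * n))           # stand-in input image
    field = img.astype(complex).copy()

    max_err = np.pi / 4                              # lambda/8 path error
    for i in range(n):
        for j in range(n):
            phi = rng.uniform(-max_err, max_err)
            field[i*tile:(i+1)*tile, j*tile:(j+1)*tile] *= np.exp(1j * phi)

    F_ideal = np.abs(np.fft.fft2(img))
    F_mosaic = np.abs(np.fft.fft2(field))
    corr = np.corrcoef(F_ideal.ravel(), F_mosaic.ravel())[0, 1]
    print(f"correlation of Fourier magnitudes: {corr:.4f}")
    ```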

  10. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  11. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and Space Station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  12. Relations Between Autonomous Motivation and Leisure-Time Physical Activity Participation: The Mediating Role of Self-Regulation Techniques.

    PubMed

    Nurmi, Johanna; Hagger, Martin S; Haukkala, Ari; Araújo-Soares, Vera; Hankonen, Nelli

    2016-04-01

    This study tested the predictive validity of a multitheory process model in which the effect of autonomous motivation from self-determination theory on physical activity participation is mediated by the adoption of self-regulatory techniques based on control theory. Finnish adolescents (N = 411, aged 17-19) completed a prospective survey including validated measures of the predictors and physical activity, at baseline and after one month (N = 177). A subsample used an accelerometer to objectively measure physical activity and further validate the physical activity self-report assessment tool (n = 44). Autonomous motivation statistically significantly predicted action planning, coping planning, and self-monitoring. Coping planning and self-monitoring mediated the effect of autonomous motivation on physical activity, although self-monitoring was the most prominent. Controlled motivation had no effect on self-regulation techniques or physical activity. Developing interventions that support autonomous motivation for physical activity may foster increased engagement in self-regulation techniques and positively affect physical activity behavior.

  13. Demonstration of automated proximity and docking technologies

    NASA Astrophysics Data System (ADS)

    Anderson, Robert L.; Tsugawa, Roy K.; Bryan, Thomas C.

    An autodock was demonstrated using straightforward techniques and real sensor hardware. A simulation testbed was established and validated. The sensor design was refined with improved optical performance and image processing noise mitigation techniques, and the sensor is ready for production from off-the-shelf components. The autonomous spacecraft architecture is defined. The areas of sensors, docking hardware, propulsion, and avionics are included in the design. The Guidance, Navigation and Control architecture and requirements are developed. Modular structures suitable for automated control are used. The spacecraft system manager functions, including configuration, resource, and redundancy management, are defined. The requirements for an autonomous spacecraft executive are defined. High-level decision making, mission planning, and mission contingency recovery are a part of this. The next step is to do flight demonstrations. After the presentation the following question was asked: How do you define validation? There are two components to the validation definition: software simulation with formal and rigorous validation, and hardware and facility performance validated with respect to software already validated against an analytical profile.

  14. SU-E-T-497: Semi-Automated in Vivo Radiochromic Film Dosimetry Using a Novel Image Processing Algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reyhan, M; Yue, N

    Purpose: To validate an automated image processing algorithm designed to detect the center of radiochromic film used for in vivo film dosimetry against the current gold standard of manual selection. Methods: An image processing algorithm was developed to automatically select the region of interest (ROI) in *.tiff images that contain multiple pieces of radiochromic film (0.5x1.3 cm²). After a user has linked a calibration file to the processing algorithm and selected a *.tiff file for processing, an ROI is automatically detected for all films by a combination of thresholding and erosion, which removes edges and any additional markings for orientation. Calibration is applied to the mean pixel values from the ROIs and a *.tiff image is output displaying the original image with an overlay of the ROIs and the measured doses. Validation of the algorithm was determined by comparing in vivo dose determined using the current gold standard (manually drawn ROIs) versus automated ROIs for n=420 scanned films. Bland-Altman analysis, paired t-test, and linear regression were performed to demonstrate agreement between the processes. Results: The measured doses ranged from 0.2-886.6 cGy. Bland-Altman analysis of the two techniques (automatic minus manual) revealed a bias of -0.28 cGy and a 95% confidence interval of (5.5 cGy, -6.1 cGy). These values demonstrate excellent agreement between the two techniques. Paired t-test results showed no statistical differences between the two techniques, p=0.98. Linear regression with a forced zero intercept demonstrated that Automatic=0.997*Manual, with a Pearson correlation coefficient of 0.999. The minimal differences between the two techniques may be explained by the fact that the hand-drawn ROIs were not identical to the automatically selected ones. The average processing time was 6.7 seconds in Matlab on an Intel Core 2 Duo processor. Conclusion: An automated image processing algorithm has been developed and validated, which will help minimize user interaction and processing time of radiochromic film used for in vivo dosimetry.
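    A hedged sketch of the two reported steps, with an invented scan image, an assumed calibration, and invented paired doses, could look like this: thresholding plus erosion for ROI detection, then Bland-Altman and paired t-test statistics.

    ```python
    import numpy as np
    from scipy import ndimage, stats

    # (1) ROI detection: film pieces are darker than the scanner background.
    scan = np.full((200, 300), 240.0)
    scan[60:120, 80:140] = 90.0                          # one piece of film
    mask = scan < 150                                     # threshold
    mask = ndimage.binary_erosion(mask, iterations=3)     # trim edges/markings
    labels, n_films = ndimage.label(mask)
    mean_pixels = ndimage.mean(scan, labels, range(1, n_films + 1))
    dose_auto = (255.0 - np.asarray(mean_pixels)) * 2.0   # assumed calibration
    print(f"films found: {n_films}, doses: {np.round(dose_auto, 1)} cGy")

    # (2) Agreement statistics between techniques (invented paired doses, cGy).
    manual = np.array([12.1, 55.3, 130.2, 402.7, 886.6])
    auto = np.array([12.4, 54.9, 131.0, 401.5, 887.1])
    diff = auto - manual
    bias, sd = diff.mean(), diff.std(ddof=1)
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)            # limits of agreement
    t, p = stats.ttest_rel(auto, manual)
    print(f"bias={bias:.2f} cGy, LoA=({loa[0]:.2f}, {loa[1]:.2f}) cGy, p={p:.2f}")
    ```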

  15. Validation of nonlinear interferometric vibrational imaging as a molecular OCT technique by the use of Raman microscopy

    NASA Astrophysics Data System (ADS)

    Benalcazar, Wladimir A.; Jiang, Zhi; Marks, Daniel L.; Geddes, Joseph B.; Boppart, Stephen A.

    2009-02-01

    We validate a molecular imaging technique called Nonlinear Interferometric Vibrational Imaging (NIVI) by comparing vibrational spectra with those acquired from Raman microscopy. This broadband coherent anti-Stokes Raman scattering (CARS) technique uses heterodyne detection and OCT acquisition and design principles to interfere a CARS signal generated by a sample with a local oscillator signal generated separately by a four-wave mixing process. These are mixed and demodulated by spectral interferometry. Its confocal configuration allows the acquisition of 3D images based on endogenous molecular signatures. Images from both phantom and mammary tissues have been acquired by this instrument and its spectrum is compared with its spontaneous Raman signatures.

  16. [Wound microbial sampling methods in surgical practice, imprint techniques].

    PubMed

    Chovanec, Z; Veverková, L; Votava, M; Svoboda, J; Peštál, A; Doležel, J; Jedlička, V; Veselý, M; Wechsler, J; Čapov, I

    2012-12-01

    A wound is damage to tissue. The process of healing is influenced by many systemic and local factors. The most crucial and the most discussed local factor of wound healing is infection. Surgical site infection in the wound is caused by micro-organisms. This has been known for many years; however, the conditions leading to the occurrence of an infection have not yet been sufficiently described. Correct sampling technique, correct storage, transportation, evaluation, and valid interpretation of these data are very important in clinical practice. There are many methods for microbiological sampling, but the best one has not yet been identified and validated. We aim to discuss the problem with a focus on the imprint technique.

  17. Quantification of chromatin condensation level by image processing.

    PubMed

    Irianto, Jerome; Lee, David A; Knight, Martin M

    2014-03-01

    The level of chromatin condensation is related to the silencing/activation of chromosomal territories and therefore impacts on gene expression. Chromatin condensation changes during cell cycle progression and differentiation, and is influenced by various physicochemical and epigenetic factors. This study describes a validated experimental technique to quantify chromatin condensation. A novel image processing procedure is developed using Sobel edge detection to quantify the level of chromatin condensation from nuclei images taken by confocal microscopy. The algorithm was developed in MATLAB and used to quantify different levels of chromatin condensation in chondrocyte nuclei achieved through alteration in osmotic pressure. The resulting chromatin condensation parameter (CCP) is in good agreement with independent multi-observer qualitative visual assessment. This image processing technique thereby provides a validated, unbiased parameter for rapid and highly reproducible quantification of the level of chromatin condensation. Copyright © 2013 IPEM. Published by Elsevier Ltd. All rights reserved.
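    The published MATLAB algorithm and the exact CCP definition are not reproduced here; the following Python sketch only illustrates the general idea of a Sobel-based condensation metric on an invented nucleus image, with an assumed edge-strength cut-off.

    ```python
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(2)
    nucleus = ndimage.gaussian_filter(rng.random((128, 128)), 2)  # stand-in image
    mask = nucleus > nucleus.mean()                               # nucleus region

    # Sobel gradient magnitude; strong edges suggest condensed chromatin.
    sx = ndimage.sobel(nucleus, axis=0)
    sy = ndimage.sobel(nucleus, axis=1)
    edges = np.hypot(sx, sy)
    strong = edges > np.percentile(edges[mask], 90)               # assumed cut-off
    ccp_like = strong[mask].sum() / mask.sum()
    print(f"condensation parameter (illustrative): {ccp_like:.3f}")
    ```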

  18. Dynamic Rod Worth Measurement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chao, Y.A.; Chapman, D.M.; Hill, D.J.

    2000-12-15

    The dynamic rod worth measurement (DRWM) technique is a method of quickly validating the predicted bank worth of control rods and shutdown rods. The DRWM analytic method is based on three-dimensional, space-time kinetic simulations of the rapid rod movements. Its measurement data is processed with an advanced digital reactivity computer. DRWM has been used as the method of bank worth validation at numerous plant startups with excellent results. The process and methodology of DRWM are described, and the measurement results of using DRWM are presented.

  19. Monitoring fugitive methane and natural gas emissions, validation of measurement techniques.

    NASA Astrophysics Data System (ADS)

    Robinson, Rod; Innocenti, Fabrizio; Gardiner, Tom; Helmore, Jon; Finlayson, Andrew; Connor, Andy

    2017-04-01

    The detection and quantification of fugitive and diffuse methane emissions has become an increasing priority in recent years. As the requirements for routine measurement to support industry initiatives increase there is a growing requirement to assess and validate the performance of fugitive emission measurement technologies. For reported emissions traceability and comparability of measurements is important. This talk will present recent work addressing these needs. Differential Absorption Lidar (DIAL) is a laser based remote sensing technology, able to map the concentration of gases in the atmosphere and determine emission fluxes for fugitive emissions. A description of the technique and its application for determining fugitive emissions of methane from oil and gas operations and waste management sites will be given. As DIAL has gained acceptance as a powerful tool for the measurement and quantification of fugitive emissions, and given the rich data it produces, it is being increasingly used to assess and validate other measurement approaches. In addition, to support the validation of technologies, we have developed a portable controlled release facility able to simulate the emissions from area sources. This has been used to assess and validate techniques which are used to monitor emissions. The development and capabilities of the controlled release facility will be described. This talk will report on recent studies using DIAL and the controlled release facility to validate fugitive emission measurement techniques. This includes side by side comparisons of two DIAL systems, the application of both the DIAL technique and the controlled release facility in a major study carried out in 2015 by South Coast Air Quality Management District (SCAQMD) in which a number of optical techniques were assessed and the development of a prototype method validation approach for techniques used to measure methane emissions from shale gas sites. In conclusion the talk will provide an update on the current status in the development of a European Standard for the measurement of fugitive emissions of VOCs and the use of validation data in the standardisation process and discuss the application of this to methane measurement.
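    As background, emission fluxes underlying remote-sensing estimates of this kind are commonly computed by integrating the measured concentration over the plume cross-section and multiplying by the wind component normal to that plane; the sketch below uses invented numbers and is not the DIAL system's actual processing chain.

    ```python
    import numpy as np
    from scipy.integrate import trapezoid

    x = np.linspace(0, 200, 101)        # horizontal scan distance, m
    z = np.linspace(0, 60, 31)          # height, m
    X, Z = np.meshgrid(x, z)
    # Invented excess-concentration field (kg/m^3) for a Gaussian-like plume.
    conc = 2e-3 * np.exp(-((X - 100)**2 / 800 + (Z - 20)**2 / 100))

    wind_normal = 3.5                   # m/s, wind component normal to the plane
    flux = wind_normal * trapezoid(trapezoid(conc, x, axis=1), z)   # kg/s
    print(f"estimated emission rate: {flux:.3f} kg/s")
    ```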

  20. On demand processing of climate station sensor data

    NASA Astrophysics Data System (ADS)

    Wöllauer, Stephan; Forteva, Spaska; Nauss, Thomas

    2015-04-01

    Large sets of climate stations with several sensors produce large amounts of fine-grained time series data. To gain value from these data, further processing and aggregation is needed. We present a flexible system to process the raw data on demand. Several aspects need to be considered to process the raw data in a way that scientists can use the processed data conveniently for their specific research interests. First of all, it is not feasible to pre-process the data in advance because of the great variety of ways it can be processed. Therefore, in this approach only the raw measurement data is archived in a database. When a scientist requires some time series, the system processes the required raw data according to the user-defined request. Based on the type of measurement sensor, some data validation is needed, because the climate station sensors may produce erroneous data. Currently, three validation methods are integrated in the on-demand processing system and are optionally selectable. The most basic validation method checks whether measurement values are within a predefined range of possible values. For example, it may be assumed that an air temperature sensor measures values within a range of -40 °C to +60 °C. Values outside of this range are considered a measurement error by this validation method and consequently rejected. Another validation method checks for outliers in the stream of measurement values by defining a maximum change rate between subsequent measurement values. The third validation method compares measurement data to the average values of neighboring stations and rejects measurement values with a high variance. These quality checks are optional, because especially extreme climatic values may be valid but rejected by some quality check method. Another important task is the preparation of measurement data in terms of time. The observed stations measure values in intervals of minutes to hours. Often scientists need a coarser temporal resolution (days, months, years). Therefore, the interval of time aggregation is selectable for the processing. For some use cases it is desirable that the resulting time series are as continuous as possible. To meet these requirements, the processing system includes techniques to fill gaps of missing values by interpolating measurement values with data from adjacent stations, using available contemporaneous measurements from the respective stations as training datasets. Alongside the processing of sensor values, we created interactive visualization techniques to get a quick overview of a large amount of archived time series data.
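    A minimal sketch of the three optional validation checks and the temporal aggregation described above; the thresholds, sample values, and function names are assumptions, not the system's actual configuration.

    ```python
    import numpy as np

    def range_check(values, lo=-40.0, hi=60.0):
        """Reject values outside a physically plausible range."""
        return np.where((values >= lo) & (values <= hi), values, np.nan)

    def step_check(values, max_step=5.0):
        """Reject values whose change from the previous value is too large."""
        out = values.copy()
        jumps = np.abs(np.diff(values)) > max_step
        out[1:][jumps] = np.nan
        return out

    def neighbour_check(values, neighbour_mean, max_dev=8.0):
        """Reject values that deviate strongly from neighbouring stations."""
        return np.where(np.abs(values - neighbour_mean) <= max_dev, values, np.nan)

    temps = np.array([12.1, 12.3, 80.0, 12.8, 25.0, 13.0])   # raw sensor values
    neigh = np.array([12.0, 12.2, 12.5, 12.7, 12.9, 13.1])   # neighbour average
    clean = neighbour_check(step_check(range_check(temps)), neigh)
    daily_mean = np.nanmean(clean)                            # temporal aggregation
    print(clean, daily_mean)
    ```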

  1. Validation of X1 motorcycle model in industrial plant layout by using WITNESSTM simulation software

    NASA Astrophysics Data System (ADS)

    Hamzas, M. F. M. A.; Bareduan, S. A.; Zakaria, M. Z.; Tan, W. J.; Zairi, S.

    2017-09-01

    This paper demonstrates a case study on simulation, modelling and analysis for the X1 motorcycle model. In this research, a motorcycle assembly plant was selected as the main site of the study. Simulation techniques using the WITNESS software were applied to evaluate the performance of the existing manufacturing system. The main objective is to validate the data and find out their significant impact on the overall performance of the system for future improvement. The process of validation starts when the layout of the assembly line is identified. All components are evaluated to validate whether the data are significant for future improvement. Machine and labor statistics are among the parameters that were evaluated for process improvement. The average total cycle time for the given workstations is used as the criterion for comparison of possible variants. From the simulation process, the data used are appropriate and meet the criteria for two-sided assembly line problems.

  2. Specification Reformulation During Specification Validation

    NASA Technical Reports Server (NTRS)

    Benner, Kevin M.

    1992-01-01

    The goal of the ARIES Simulation Component (ASC) is to uncover behavioral errors by 'running' a specification at the earliest possible points during the specification development process. The problems to be overcome are the obvious ones: the specification may be large, incomplete, underconstrained, and/or uncompilable. This paper describes how specification reformulation is used to mitigate these problems. ASC begins by decomposing validation into specific validation questions. Next, the specification is reformulated to abstract out all those features unrelated to the identified validation question, thus creating a new specialized specification. ASC relies on a precise statement of the validation question and a careful application of transformations so as to preserve the essential specification semantics in the resulting specialized specification. This technique is a win if the resulting specialized specification is small enough that the user may easily handle any remaining obstacles to execution. This paper will: (1) describe what a validation question is; (2) outline analysis techniques for identifying what concepts are and are not relevant to a validation question; and (3) identify and apply transformations which remove these less relevant concepts while preserving those which are relevant.

  3. Satellite stratospheric aerosol measurement validation

    NASA Technical Reports Server (NTRS)

    Russell, P. B.; Mccormick, M. P.

    1984-01-01

    The validity of the stratospheric aerosol measurements made by the satellite sensors SAM II and SAGE was tested by comparing their results with each other and with results obtained by other techniques (lidar, dustsonde, filter, and impactor). The latter type of comparison required the development of special techniques that convert the quantity measured by the correlative sensor (e.g. particle backscatter, number, or mass) to that measured by the satellite sensor (extinction) and quantitatively estimate the uncertainty in the conversion process. The results of both types of comparisons show agreement within the measurement and conversion uncertainties. Moreover, the satellite uncertainty is small compared to aerosol natural variability (caused by seasonal changes, volcanoes, sudden warmings, and vortex structure). It was concluded that the satellite measurements are valid.

  4. CMOS array design automation techniques. [metal oxide semiconductors

    NASA Technical Reports Server (NTRS)

    Ramondetta, P.; Feller, A.; Noto, R.; Lombardi, T.

    1975-01-01

    A low cost, quick turnaround technique for generating custom metal oxide semiconductor arrays using the standard cell approach was developed, implemented, tested and validated. Basic cell design topology and guidelines are defined based on an extensive analysis that includes circuit, layout, process, array topology and required performance considerations particularly high circuit speed.

  5. An Integrative Theory-Driven Positive Emotion Regulation Intervention

    PubMed Central

    Weytens, Fanny; Luminet, Olivier; Verhofstadt, Lesley L.; Mikolajczak, Moïra

    2014-01-01

    Over the past fifteen years, positive psychology research has validated a set of happiness enhancing techniques. These techniques are relatively simple exercises that allow happiness seekers to mimic thoughts and behavior of naturally happy people, in order to increase their level of well-being. Because research has shown that the joint use of these exercises increases their effects, practitioners who want to help happiness seekers need validated interventions that combine several of these techniques. To meet this need, we have developed and tested an integrative intervention (Positive Emotion Regulation program – PER program) incorporating a number of validated techniques structured around a theoretical model: the Process Model of Positive Emotion Regulation. To test the effectiveness of this program and to identify its added value relative to existing interventions, 113 undergraduate students were randomly assigned to a 6-week positive emotion regulation pilot program, a loving-kindness meditation training program, or a wait-list control group. Results indicate that fewer participants dropped out from the PER program than from the Loving-Kindness Meditation training. Furthermore, subjects in the PER group showed a significant increase in subjective well-being and life satisfaction and a significant decrease in depression and physical symptoms when compared to controls. Our results suggest that the Process Model of Positive Emotion Regulation can be an effective option to organize and deliver positive integrative interventions. PMID:24759870

  6. Applying Standard Independent Verification and Validation (IVV) Techniques Within an Agile Framework: Is There a Compatibility Issue?

    NASA Technical Reports Server (NTRS)

    Dabney, James B.; Arthur, James Douglas

    2017-01-01

    Agile methods have gained wide acceptance over the past several years, to the point that they are now a standard management and execution approach for small-scale software development projects. While conventional Agile methods are not generally applicable to large multi-year and mission-critical systems, Agile hybrids are now being developed (such as SAFe) to exploit the productivity improvements of Agile while retaining the necessary process rigor and coordination needs of these projects. From the perspective of Independent Verification and Validation (IVV), however, the adoption of these hybrid Agile frameworks is becoming somewhat problematic. Hence, we find it prudent to question the compatibility of conventional IVV techniques with (hybrid) Agile practices. This paper documents our investigation of (a) relevant literature, (b) the modification and adoption of Agile frameworks to accommodate the development of large-scale, mission-critical systems, and (c) the compatibility of standard IVV techniques within hybrid Agile development frameworks. Specific to the latter, we found that the IVV methods employed within a hybrid Agile process can be divided into three groups: (1) early-lifecycle IVV techniques that are fully compatible with the hybrid lifecycles; (2) IVV techniques that focus on tracing requirements, test objectives, etc., which are somewhat incompatible but can be tailored with modest effort; and (3) IVV techniques involving an assessment requiring artifact completeness, which are simply not compatible with hybrid Agile processes, e.g., those that assume complete requirement specification early in the development lifecycle.

  7. Direct lexical control of eye movements in reading: Evidence from a survival analysis of fixation durations

    PubMed Central

    Reingold, Eyal M.; Reichle, Erik D.; Glaholt, Mackenzie G.; Sheridan, Heather

    2013-01-01

    Participants’ eye movements were monitored in an experiment that manipulated the frequency of target words (high vs. low) as well as their availability for parafoveal processing during fixations on the pre-target word (valid vs. invalid preview). The influence of the word-frequency by preview validity manipulation on the distributions of first fixation duration was examined by using ex-Gaussian fitting as well as a novel survival analysis technique which provided precise estimates of the timing of the first discernible influence of word frequency on first fixation duration. Using this technique, we found a significant influence of word frequency on fixation duration in normal reading (valid preview) as early as 145 ms from the start of fixation. We also demonstrated an equally rapid non-lexical influence on first fixation duration as a function of initial landing position (location) on target words. The time-course of frequency effects, but not location effects was strongly influenced by preview validity, demonstrating the crucial role of parafoveal processing in enabling direct lexical control of reading fixation times. Implications for models of eye-movement control are discussed. PMID:22542804
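    As an aside on the ex-Gaussian fitting mentioned above, scipy's exponnorm distribution (shape K = tau/sigma) can recover mu, sigma, and tau from fixation-duration data; the durations below are simulated, not the study's, and the survival-analysis divergence-point estimation is not shown.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    mu, sigma, tau = 200.0, 35.0, 80.0          # ms, assumed "true" parameters
    # Ex-Gaussian = Gaussian + independent exponential component.
    fixations = rng.normal(mu, sigma, 2000) + rng.exponential(tau, 2000)

    K, loc, scale = stats.exponnorm.fit(fixations)
    mu_hat, sigma_hat, tau_hat = loc, scale, K * scale
    print(f"mu={mu_hat:.1f} sigma={sigma_hat:.1f} tau={tau_hat:.1f}")
    ```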

  8. The Interview and Personnel Selection: Is the Process Valid and Reliable?

    ERIC Educational Resources Information Center

    Niece, Richard

    1983-01-01

    Reviews recent literature concerning the job interview. Concludes that such interviews are generally ineffective and proposes that school administrators devise techniques for improving their interviewing systems. (FL)

  9. A semi-automatic method for left ventricle volume estimate: an in vivo validation study

    NASA Technical Reports Server (NTRS)

    Corsi, C.; Lamberti, C.; Sarti, A.; Saracino, G.; Shiota, T.; Thomas, J. D.

    2001-01-01

    This study aims at validating the left ventricular (LV) volume estimates obtained by processing volumetric data utilizing a segmentation model based on the level set technique. The validation has been performed by comparing real-time volumetric echo data (RT3DE) and magnetic resonance (MRI) data. A validation protocol has been defined. The validation protocol was applied to twenty-four estimates (range 61-467 ml) obtained from normal and pathologic subjects who underwent both RT3DE and MRI. A statistical analysis was performed on each estimate and on clinical parameters such as stroke volume (SV) and ejection fraction (EF). Assuming MRI estimates (x) as a reference, an excellent correlation was found with the volume measured by utilizing the segmentation procedure (y) (y=0.89x + 13.78, r=0.98). The mean error on SV was 8 ml and the mean error on EF was 2%. This study demonstrated that the segmentation technique is reliably applicable to human hearts in clinical practice.

  10. Automatic welding detection by an intelligent tool pipe inspection

    NASA Astrophysics Data System (ADS)

    Arizmendi, C. J.; Garcia, W. L.; Quintero, M. A.

    2015-07-01

    This work provides a model based on machine learning techniques for weld recognition, using signals obtained through an in-line inspection tool called a “smart pig” in oil and gas pipelines. The model includes a signal noise reduction phase by means of pre-processing algorithms and attribute-selection techniques. The noise reduction techniques were selected after a literature review and testing with survey data. Subsequently, the model was trained using recognition and classification algorithms, specifically artificial neural networks and support vector machines. Finally, the trained model was validated with different data sets and the performance was measured with cross validation and ROC analysis. The results show that it is possible to identify welds automatically with an efficiency between 90 and 98 percent.
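
    A minimal sketch of the workflow described above (pre-processing, attribute selection, an SVM classifier, cross-validation, and ROC analysis), assuming a hypothetical feature matrix extracted from inspection-signal windows; it is not the authors' actual pipeline or data.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.metrics import roc_auc_score

# Hypothetical in-line inspection signals: rows are signal windows, columns are extracted
# features; labels mark whether the window contains a girth weld.
rng = np.random.default_rng(0)
X = rng.normal(size=(600, 40))
y = (X[:, :3].sum(axis=1) + 0.5 * rng.normal(size=600) > 0).astype(int)

# Pre-processing (scaling), attribute selection, and an SVM classifier in one pipeline.
model = make_pipeline(StandardScaler(), SelectKBest(f_classif, k=10), SVC(probability=True))

# Performance measured with cross-validation ...
print("10-fold CV accuracy:", cross_val_score(model, X, y, cv=10).mean())

# ... and ROC analysis on a held-out split.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
model.fit(X_tr, y_tr)
print("ROC AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
```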

  11. Localized analysis of paint-coat drying using dynamic speckle interferometry

    NASA Astrophysics Data System (ADS)

    Sierra-Sosa, Daniel; Tebaldi, Myrian; Grumel, Eduardo; Rabal, Hector; Elmaghraby, Adel

    2018-07-01

    Paint-coating is part of several industrial processes, including the automotive industry, architectural coatings, machinery, and appliances. These paint-coatings must comply with high quality standards; for this reason, evaluation techniques for paint-coatings are in constant development. One important factor in the paint-coating process is drying, as it influences the quality of the final result. In this work we present an assessment technique based on optical dynamic speckle interferometry; this technique allows for the temporal activity evaluation of the paint-coating drying process, providing localized information about drying. This localized information is relevant for addressing drying homogeneity, optimal drying, and quality control. The technique relies on the definition of a new temporal history of the speckle patterns to obtain the local activity; this information is then clustered to provide a convenient indicator of the different drying process stages. The experimental results presented were validated using gravimetric drying curves.
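
    A minimal sketch of the general idea (a per-pixel temporal activity measure over a stack of speckle frames, followed by clustering into drying stages). The activity measure used here, mean absolute frame-to-frame intensity change, and the k-means clustering are stand-in assumptions rather than the paper's specific temporal-history definition; the frame data are hypothetical.

```python
import numpy as np
from sklearn.cluster import KMeans

def local_activity(frames):
    """Per-pixel temporal activity: mean absolute frame-to-frame intensity change.
    `frames` has shape (n_frames, height, width)."""
    frames = frames.astype(float)
    return np.abs(np.diff(frames, axis=0)).mean(axis=0)

def cluster_drying_stages(activity_map, n_stages=3, seed=0):
    """Group pixels by activity level into regions representing different drying stages."""
    labels = KMeans(n_clusters=n_stages, random_state=seed, n_init=10).fit_predict(
        activity_map.reshape(-1, 1))
    return labels.reshape(activity_map.shape)

# Hypothetical speckle sequence: the left half of the coat is still "wet" (high activity),
# the right half is nearly static (dry).
rng = np.random.default_rng(3)
frames = rng.integers(0, 256, size=(200, 64, 64)).astype(float)
frames[:, :, 32:] = frames[0, :, 32:] + rng.normal(0, 2, size=(200, 64, 32))

activity = local_activity(frames)
stages = cluster_drying_stages(activity)
print("mean activity (wet half, dry half):", activity[:, :32].mean(), activity[:, 32:].mean())
```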

  12. Survey of NASA V and V Processes/Methods

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles; Nelson, Stacy

    2002-01-01

    The purpose of this report is to describe current NASA Verification and Validation (V&V) techniques and to explain how these techniques are applicable to 2nd Generation RLV Integrated Vehicle Health Management (IVHM) software. It also contains recommendations for special V&V requirements for IVHM. This report is divided into the following three sections: 1) Survey - Current NASA V&V Processes/Methods; 2) Applicability of NASA V&V to 2nd Generation RLV IVHM; and 3) Special 2nd Generation RLV IVHM V&V Requirements.

  13. The forensic validity of visual analytics

    NASA Astrophysics Data System (ADS)

    Erbacher, Robert F.

    2008-01-01

    The wider use of visualization and visual analytics in wide-ranging fields has led to the need for visual analytics capabilities to be legally admissible, especially when applied to digital forensics. This brings the need to consider legal implications when performing visual analytics, an issue not traditionally examined in visualization and visual analytics techniques and research. While digital data is generally admissible under the Federal Rules of Evidence [10][21], a comprehensive validation of the digital evidence is considered prudent. A comprehensive validation requires validation of the digital data under rules for authentication, hearsay, the best evidence rule, and privilege. Additional issues with digital data arise when exploring admissibility and the validity of what information was examined, to what extent, and whether the analysis process was sufficiently covered by a search warrant. For instance, a search warrant generally covers very narrow requirements as to what law enforcement is allowed to examine and acquire during an investigation. When searching a hard drive for child pornography, how admissible is evidence of an unrelated crime, e.g., drug dealing? This is further complicated by the concept of "in plain view": when performing an analysis of a hard drive, what would be considered "in plain view"? The purpose of this paper is to discuss the issues of digital forensics as they apply to visual analytics and to identify how visual analytics techniques fit into the digital forensics analysis process, how visual analytics techniques can improve the legal admissibility of digital data, and what research is needed to further improve this process. The goal of this paper is to open up consideration of legal ramifications among the visualization community; the author is not a lawyer and the discussion is not meant to be inclusive of all differences in laws between states and countries.

  14. Vision-based system identification technique for building structures using a motion capture system

    NASA Astrophysics Data System (ADS)

    Oh, Byung Kwan; Hwang, Jin Woo; Kim, Yousok; Cho, Tongjun; Park, Hyo Seon

    2015-11-01

    This paper presents a new vision-based system identification (SI) technique for building structures by using a motion capture system (MCS). The MCS, with outstanding capabilities for dynamic response measurements, can provide gage-free measurements of vibrations through the convenient installation of multiple markers. In this technique, from the dynamic displacement responses measured by the MCS, the dynamic characteristics (natural frequency, mode shape, and damping ratio) of building structures are extracted after converting the displacements from the MCS to accelerations and conducting SI by frequency domain decomposition (FDD). A free vibration experiment on a three-story shear frame was conducted to validate the proposed technique. The SI results from the conventional accelerometer-based method were compared with those from the proposed technique and showed good agreement, which confirms the validity and applicability of the proposed vision-based SI technique for building structures. Furthermore, SI performed by applying the MCS-measured displacements directly to FDD showed results identical to those of the conventional SI method.
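
    A minimal sketch of the displacement-to-acceleration conversion and frequency domain decomposition step described above, assuming a small multi-channel displacement record. The double numerical differentiation and the peak-picking on the first singular value are generic FDD ingredients, not the authors' exact implementation; the three-channel signal is synthetic.

```python
import numpy as np
from scipy.signal import csd

def fdd_first_singular_value(displacements, fs, nperseg=1024):
    """Frequency Domain Decomposition on multi-channel responses.
    `displacements` has shape (n_channels, n_samples); returns frequencies and the
    first singular value of the cross-spectral density matrix at each frequency."""
    # Convert MCS displacements to accelerations by double numerical differentiation.
    acc = np.gradient(np.gradient(displacements, axis=1), axis=1) * fs ** 2

    n_ch = acc.shape[0]
    freqs, _ = csd(acc[0], acc[0], fs=fs, nperseg=nperseg)
    G = np.zeros((freqs.size, n_ch, n_ch), dtype=complex)
    for i in range(n_ch):
        for j in range(n_ch):
            _, G[:, i, j] = csd(acc[i], acc[j], fs=fs, nperseg=nperseg)

    # Peaks of the first singular value indicate natural frequencies; the corresponding
    # singular vectors (not returned here) approximate the mode shapes.
    s1 = np.array([np.linalg.svd(G[k], compute_uv=False)[0] for k in range(freqs.size)])
    return freqs, s1

# Hypothetical three-story free-vibration response: two decaying modes plus noise.
fs, t = 100.0, np.arange(0, 60, 1 / 100.0)
rng = np.random.default_rng(0)
x = np.stack([np.sin(2 * np.pi * 1.2 * t) * np.exp(-0.02 * t) * a
              + 0.3 * np.sin(2 * np.pi * 3.5 * t) * np.exp(-0.05 * t) * b
              + 0.01 * rng.normal(size=t.size)
              for a, b in [(1.0, 1.0), (0.8, -0.5), (0.5, 0.9)]])
freqs, s1 = fdd_first_singular_value(x, fs)
print("dominant spectral peak (Hz):", freqs[np.argmax(s1)])
```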

  15. Multiscale metrologies for process optimization of carbon nanotube polymer composites

    DOE PAGES

    Natarajan, Bharath; Orloff, Nathan D.; Ashkar, Rana; ...

    2016-07-18

    Carbon nanotube (CNT) polymer nanocomposites are attractive multifunctional materials with a growing range of commercial applications. With the increasing demand for these materials, it is imperative to develop and validate methods for on-line quality control and process monitoring during production. In this work, a novel combination of characterization techniques is utilized that facilitates the non-invasive assessment of CNT dispersion in epoxy produced by the scalable process of calendering. First, the structural parameters of these nanocomposites are evaluated across multiple length scales (10^-10 m to 10^-3 m) using scanning gallium-ion microscopy, transmission electron microscopy and small-angle neutron scattering. Then, a non-contact resonant microwave cavity perturbation (RCP) technique is employed to accurately measure the AC electrical conductivity of the nanocomposites. Quantitative correlations between the conductivity and structural parameters find the RCP measurements to be sensitive to CNT mass fraction, spatial organization and, therefore, the processing parameters. These results, and the non-contact nature and speed of RCP measurements, identify this technique as being ideally suited for quality control of CNT nanocomposites in a nanomanufacturing environment. In conclusion, when validated by the multiscale characterization suite, RCP may be broadly applicable in the production of hybrid functional materials, such as graphene, gold nanorod, and carbon black nanocomposites.

  16. Validity of Scientific Based Chemistry Android Module to Empower Science Process Skills (SPS) in Solubility Equilibrium

    NASA Astrophysics Data System (ADS)

    Antrakusuma, B.; Masykuri, M.; Ulfa, M.

    2018-04-01

    Evolution of Android technology can be applied to chemistry learning; one of the complex chemistry concepts is solubility equilibrium, and this concept requires science process skills (SPS). This study aims to: 1) characterize a scientific-based chemistry Android module for empowering SPS, and 2) establish the validity of the module based on content validity and a feasibility test. This research uses a Research and Development (R&D) approach. Research subjects were 135 students and three teachers at three high schools in Boyolali, Central Java. Content validity of the module was tested by seven experts using Aiken's V technique, and the module feasibility was tested with students and teachers in each school. The chemistry module can be accessed using an Android device. The validation of the module contents yielded V = 0.89 (valid), and the feasibility test obtained 81.63% (by students) and 73.98% (by teachers), indicating that the module meets the "good" criteria.
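
    A minimal sketch of the Aiken's V computation used for content validation: V = Σ(r_i − lo) / (n·(c − 1)), where r_i are the expert ratings, lo is the lowest scale category, n the number of experts, and c the number of categories. The seven ratings and the 1-5 scale below are hypothetical, chosen only to illustrate how a value such as V ≈ 0.89 arises.

```python
def aikens_v(ratings, lo=1, hi=5):
    """Aiken's V for one item: ratings are expert scores on a scale from `lo` to `hi`.
    V = sum(r_i - lo) / (n * (hi - lo)), ranging from 0 (no agreement) to 1 (full agreement)."""
    s = sum(r - lo for r in ratings)
    return s / (len(ratings) * (hi - lo))

# Hypothetical scores from seven experts on a 1-5 relevance scale.
print(round(aikens_v([5, 5, 4, 5, 4, 5, 4]), 2))   # values near 1 indicate high content validity
```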

  17. Validity of High School Physic Module With Character Values Using Process Skill Approach In STKIP PGRI West Sumatera

    NASA Astrophysics Data System (ADS)

    Anaperta, M.; Helendra, H.; Zulva, R.

    2018-04-01

    This study aims to describe the validity of a physics module with character-oriented values using a process skills approach for dynamic electricity material in high school (SMA/MA) and vocational school (SMK) physics. The type of research is development research. The module development model follows the model proposed by Plomp, which consists of (1) a preliminary research phase, (2) a prototyping phase, and (3) an assessment phase. This research covers the initial investigation phase and the design phase. Data collection techniques for validation were observation and questionnaires. In the initial investigation phase, curriculum analysis, student analysis, and concept analysis were conducted. In the design and realization phase, the module was designed for SMA/MA and SMK subjects on dynamic electricity material. After that, formative evaluation was carried out, including self-evaluation and prototyping (expert reviews, one-to-one, and small group evaluations). At this stage, validation is performed. The research data were obtained through the module validation sheet, which then yields a valid module.

  18. Joint use of over- and under-sampling techniques and cross-validation for the development and assessment of prediction models.

    PubMed

    Blagus, Rok; Lusa, Lara

    2015-11-04

    Prediction models are used in clinical research to develop rules that can be used to accurately predict the outcome of patients based on some of their characteristics. They represent a valuable tool in the decision-making process of clinicians and health policy makers, as they enable them to estimate the probability that patients have or will develop a disease, will respond to a treatment, or that their disease will recur. The interest devoted to prediction models in the biomedical community has been growing in the last few years. Often the data used to develop the prediction models are class-imbalanced, as only a few patients experience the event (and therefore belong to the minority class). Prediction models developed using class-imbalanced data tend to achieve sub-optimal predictive accuracy in the minority class. This problem can be diminished by using sampling techniques aimed at balancing the class distribution. These techniques include under- and oversampling, where a fraction of the majority class samples are retained in the analysis or new samples from the minority class are generated. The correct assessment of how the prediction model is likely to perform on independent data is of crucial importance; in the absence of an independent data set, cross-validation is normally used. While the importance of correct cross-validation is well documented in the biomedical literature, the challenges posed by the joint use of sampling techniques and cross-validation have not been addressed. We show that care must be taken to ensure that cross-validation is performed correctly on sampled data, and that the risk of overestimating the predictive accuracy is greater when oversampling techniques are used. Examples based on the re-analysis of real datasets and simulation studies are provided. We identify some results from the biomedical literature where cross-validation was performed incorrectly and where we expect that the performance of oversampling techniques was heavily overestimated.
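
    A minimal sketch of the correct ordering the record argues for: the minority class is oversampled inside each training fold only, so no duplicated sample can leak into the corresponding test fold. The logistic regression model, random oversampling by resampling with replacement, and the simulated imbalanced data are illustrative assumptions, not the authors' analysis.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.utils import resample

def cv_auc_with_oversampling(X, y, n_splits=5, seed=0):
    """Cross-validated AUC where minority-class oversampling is applied *inside* each
    training fold only, so no duplicated sample leaks into the test fold."""
    aucs = []
    skf = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=seed)
    for train_idx, test_idx in skf.split(X, y):
        X_tr, y_tr = X[train_idx], y[train_idx]
        X_te, y_te = X[test_idx], y[test_idx]

        # Oversample the minority class of the training fold by sampling with replacement.
        minority = np.flatnonzero(y_tr == 1)
        majority = np.flatnonzero(y_tr == 0)
        boost = resample(minority, replace=True,
                         n_samples=len(majority) - len(minority), random_state=seed)
        idx = np.concatenate([majority, minority, boost])

        model = LogisticRegression(max_iter=1000).fit(X_tr[idx], y_tr[idx])
        aucs.append(roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
    return float(np.mean(aucs))

# Hypothetical class-imbalanced data (roughly 10-15% events).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8))
y = (X[:, 0] + rng.normal(scale=2, size=1000) > 2.5).astype(int)
print("correctly cross-validated AUC:", round(cv_auc_with_oversampling(X, y), 3))
```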

  19. The specification-based validation of reliable multicast protocol: Problem Report. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1995-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this report, we develop formal models for RMP using existing automated verification systems and perform validation on the formal RMP specifications. The validation analysis helped identify some minor specification and design problems. We also use the formal models of RMP to generate a test suite for conformance testing of the implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress of the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  20. Martin Mayman's early memories technique: bridging the gap between personality assessment and psychotherapy.

    PubMed

    Fowler, J C; Hilsenroth, M J; Handler, L

    2000-08-01

    In this article, we describe Martin Mayman's approach to early childhood memories as a projective technique, beginning with his scientific interest in learning theory, coupled with his interest in ego psychology and object relations theory. We describe Mayman's contributions to the use of the early memories technique to inform the psychotherapy process, tying assessment closely to psychotherapy and making assessment more useful in treatment. In this article, we describe a representative sample of research studies that demonstrate the reliability and validity of early memories, followed by case examples in which the early memories informed the therapy process, including issues of transference and countertransference.

  1. FDA 2011 process validation guidance: lifecycle compliance model.

    PubMed

    Campbell, Cliff

    2014-01-01

    This article has been written as a contribution to the industry's efforts in migrating from a document-driven to a data-driven compliance mindset. A combination of target product profile, control engineering, and general sum principle techniques is presented as the basis of a simple but scalable lifecycle compliance model in support of modernized process validation. Unit operations and significant variables occupy pole position within the model, documentation requirements being treated as a derivative or consequence of the modeling process. The quality system is repositioned as a subordinate of system quality, this being defined as the integral of related "system qualities". The article represents a structured interpretation of the U.S. Food and Drug Administration's 2011 Guidance for Industry on Process Validation and is based on the author's educational background and his manufacturing/consulting experience in the validation field. The U.S. Food and Drug Administration's Guidance for Industry on Process Validation (2011) provides a wide-ranging and rigorous outline of compliant drug manufacturing requirements relative to its 20th-century predecessor (1987). Its declared focus is patient safety, and it identifies three inter-related (and obvious) stages of the compliance lifecycle. Firstly, processes must be designed, both from a technical and quality perspective. Secondly, processes must be qualified, providing evidence that the manufacturing facility is fully "roadworthy" and fit for its intended purpose. Thirdly, processes must be verified, meaning that commercial batches must be monitored to ensure that processes remain in a state of control throughout their lifetime.

  2. Parametric techniques for characterizing myocardial tissue by magnetic resonance imaging (part 1): T1 mapping.

    PubMed

    Perea Palazón, R J; Ortiz Pérez, J T; Prat González, S; de Caralt Robira, T M; Cibeira López, M T; Solé Arqués, M

    2016-01-01

    The development of myocardial fibrosis is a common process in the appearance of ventricular dysfunction in many heart diseases. Magnetic resonance imaging makes it possible to accurately evaluate the structure and function of the heart, and its role in the macroscopic characterization of myocardial fibrosis by late enhancement techniques has been widely validated clinically. Recent studies have demonstrated that T1-mapping techniques can quantify diffuse myocardial fibrosis and the expansion of the myocardial extracellular space in absolute terms. However, further studies are necessary to validate the usefulness of this technique in the early detection of tissue remodeling at a time when implementing early treatment would improve a patient's prognosis. This article reviews the state of the art for T1 mapping of the myocardium, its clinical applications, and its limitations. Copyright © 2016 SERAM. Published by Elsevier España, S.L.U. All rights reserved.
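
    A minimal sketch of a per-pixel T1 fit of the kind used in myocardial T1 mapping, assuming signed (polarity-restored) inversion-recovery signal samples. The three-parameter model S(TI) = A − B·exp(−TI/T1*) with the Look-Locker correction T1 = T1*(B/A − 1) is a commonly used approach taken here as an assumption; the inversion times and signal values are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def ir_signal(ti, a, b, t1_star):
    """Three-parameter inversion-recovery model commonly used for T1 mapping."""
    return a - b * np.exp(-ti / t1_star)

def fit_t1(ti, signal):
    """Fit the model for one pixel and apply the Look-Locker correction T1 = T1*(B/A - 1)."""
    p0 = (signal.max(), 2 * signal.max(), 1000.0)
    (a, b, t1_star), _ = curve_fit(ir_signal, ti, signal, p0=p0, maxfev=5000)
    return t1_star * (b / a - 1.0)

# Hypothetical inversion times (ms) and polarity-restored signal for one myocardial pixel.
ti = np.array([100, 180, 260, 900, 1000, 1800, 2600, 3400], dtype=float)
signal = ir_signal(ti, 100.0, 190.0, 700.0) + np.random.default_rng(0).normal(0, 1, ti.size)
print("estimated T1 (ms):", round(fit_t1(ti, signal), 1))
```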

  3. Permanent Scatterer InSAR Analysis and Validation in the Gulf of Corinth.

    PubMed

    Elias, Panagiotis; Kontoes, Charalabos; Papoutsis, Ioannis; Kotsis, Ioannis; Marinou, Aggeliki; Paradissis, Dimitris; Sakellariou, Dimitris

    2009-01-01

    The Permanent Scatterers Interferometric SAR technique (PSInSAR) is a method that accurately estimates the near vertical terrain deformation rates, of the order of ∼1 mm year⁻¹, overcoming the physical and technical restrictions of classic InSAR. In this paper the method is strengthened by creating a robust processing chain, incorporating PSInSAR analysis together with algorithmic adaptations for Permanent Scatterer Candidates (PSCs) and Permanent Scatterers (PSs) selection. The processing chain, called PerSePHONE, was applied and validated in the geophysically active area of the Gulf of Corinth. The analysis indicated a clear subsidence trend in the north-eastern part of the gulf, with the maximum deformation of ∼2.5 mm year⁻¹ occurring in the region north of the Gulf of Alkyonides. The validity of the results was assessed against geophysical/geological and geodetic studies conducted in the area, which include continuous seismic profiling data and GPS height measurements. All these observations converge to the same deformation pattern as the one derived by the PSInSAR technique.

  4. Permanent Scatterer InSAR Analysis and Validation in the Gulf of Corinth

    PubMed Central

    Elias, Panagiotis; Kontoes, Charalabos; Papoutsis, Ioannis; Kotsis, Ioannis; Marinou, Aggeliki; Paradissis, Dimitris; Sakellariou, Dimitris

    2009-01-01

    The Permanent Scatterers Interferometric SAR technique (PSInSAR) is a method that accurately estimates the near vertical terrain deformation rates, of the order of ∼1 mm year⁻¹, overcoming the physical and technical restrictions of classic InSAR. In this paper the method is strengthened by creating a robust processing chain, incorporating PSInSAR analysis together with algorithmic adaptations for Permanent Scatterer Candidates (PSCs) and Permanent Scatterers (PSs) selection. The processing chain, called PerSePHONE, was applied and validated in the geophysically active area of the Gulf of Corinth. The analysis indicated a clear subsidence trend in the north-eastern part of the gulf, with the maximum deformation of ∼2.5 mm year⁻¹ occurring in the region north of the Gulf of Alkyonides. The validity of the results was assessed against geophysical/geological and geodetic studies conducted in the area, which include continuous seismic profiling data and GPS height measurements. All these observations converge to the same deformation pattern as the one derived by the PSInSAR technique. PMID:22389587

  5. Tone and Broadband Noise Separation from Acoustic Data of a Scale-Model Counter-Rotating Open Rotor

    NASA Technical Reports Server (NTRS)

    Sree, David; Stephens, David B.

    2014-01-01

    Renewed interest in contra-rotating open rotor technology for aircraft propulsion application has prompted the development of advanced diagnostic tools for better design and improved acoustical performance. In particular, the determination of tonal and broadband components of open rotor acoustic spectra is essential for properly assessing the noise control parameters and also for validating the open rotor noise simulation codes. The technique of phase averaging has been employed to separate the tone and broadband components from a single rotor, but this method does not work for the two-shaft contra-rotating open rotor. A new signal processing technique was recently developed to process the contra-rotating open rotor acoustic data. The technique was first tested using acoustic data taken of a hobby aircraft open rotor propeller, and reported previously. The intent of the present work is to verify and validate the applicability of the new technique to a realistic one-fifth scale open rotor model which has 12 forward and 10 aft contra-rotating blades operating at realistic forward flight Mach numbers and tip speeds. The results and discussions of that study are presented in this paper.
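
    A minimal sketch of the classical single-rotor phase (synchronous) averaging step mentioned above, not the new contra-rotating technique the paper develops. It assumes the microphone record contains an integer number of samples per shaft revolution; the signal and blade counts are hypothetical.

```python
import numpy as np

def phase_average_separation(pressure, samples_per_rev, n_revs):
    """Classical single-rotor tone/broadband separation by phase averaging:
    average the signal over shaft revolutions to obtain the tonal (shaft-locked) part,
    then subtract it from each revolution to estimate the broadband residue."""
    x = pressure[: samples_per_rev * n_revs].reshape(n_revs, samples_per_rev)
    tonal = x.mean(axis=0)          # shaft-locked (tone) component
    broadband = x - tonal           # what remains after removing the tones
    return tonal, broadband.ravel()

# Hypothetical microphone record: blade-passing tone and harmonic plus broadband noise.
samples_per_rev, n_revs = 256, 400
rng = np.random.default_rng(0)
phase = 2 * np.pi * np.arange(samples_per_rev * n_revs) / samples_per_rev
p = np.sin(12 * phase) + 0.3 * np.sin(24 * phase) + 0.5 * rng.normal(size=phase.size)

tonal, broadband = phase_average_separation(p, samples_per_rev, n_revs)
print("tone RMS:", round(np.std(tonal), 3), " broadband RMS:", round(np.std(broadband), 3))
```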

  6. Tone and Broadband Noise Separation from Acoustic Data of a Scale-Model Contra-Rotating Open Rotor

    NASA Technical Reports Server (NTRS)

    Sree, Dave; Stephens, David B.

    2014-01-01

    Renewed interest in contra-rotating open rotor technology for aircraft propulsion application has prompted the development of advanced diagnostic tools for better design and improved acoustical performance. In particular, the determination of tonal and broadband components of open rotor acoustic spectra is essential for properly assessing the noise control parameters and also for validating the open rotor noise simulation codes. The technique of phase averaging has been employed to separate the tone and broadband components from a single rotor, but this method does not work for the two-shaft contra-rotating open rotor. A new signal processing technique was recently developed to process the contra-rotating open rotor acoustic data. The technique was first tested using acoustic data taken of a hobby aircraft open rotor propeller, and reported previously. The intent of the present work is to verify and validate the applicability of the new technique to a realistic one-fifth scale open rotor model which has 12 forward and 10 aft contra-rotating blades operating at realistic forward flight Mach numbers and tip speeds. The results and discussions of that study are presented in this paper.

  7. A maximum entropy reconstruction technique for tomographic particle image velocimetry

    NASA Astrophysics Data System (ADS)

    Bilsky, A. V.; Lozhkin, V. A.; Markovich, D. M.; Tokarev, M. P.

    2013-04-01

    This paper studies a novel approach for reducing tomographic PIV computational complexity. The proposed approach is an algebraic reconstruction technique, termed MENT (maximum entropy). This technique computes the three-dimensional light intensity distribution several times faster than SMART, using at least ten times less memory. Additionally, the reconstruction quality remains nearly the same as with SMART. This paper presents a theoretical computational performance comparison for MENT, SMART and MART, followed by validation using synthetic particle images. Both the theoretical assessment and the validation on synthetic images demonstrate a significant computational time reduction. The data processing accuracy of MENT was compared to that of SMART in a slot jet experiment. A comparison of the average velocity profiles shows a high level of agreement between the results obtained with MENT and those obtained with SMART.

  8. A diagnostic technique used to obtain cross range radiation centers from antenna patterns

    NASA Technical Reports Server (NTRS)

    Lee, T. H.; Burnside, W. D.

    1988-01-01

    A diagnostic technique to obtain cross range radiation centers based on antenna radiation patterns is presented. This method is similar to the synthetic aperture processing of scattered fields in the radar application. Coherent processing of the radiated fields is used to determine the various radiation centers associated with the far-zone pattern of an antenna for a given radiation direction. This technique can be used to identify an unexpected radiation center that creates an undesired effect in a pattern; on the other hand, it can improve a numerical simulation of the pattern by identifying other significant mechanisms. Cross range results for two 8' reflector antennas are presented to illustrate as well as validate that technique.

  9. Development and Validation of Instruments to Measure Learning of Expert-Like Thinking

    NASA Astrophysics Data System (ADS)

    Adams, Wendy K.; Wieman, Carl E.

    2011-06-01

    This paper describes the process for creating and validating an assessment test that measures the effectiveness of instruction by probing how well that instruction causes students in a class to think like experts about specific areas of science. The design principles and process are laid out and it is shown how these align with professional standards that have been established for educational and psychological testing and the elements of assessment called for in a recent National Research Council study on assessment. The importance of student interviews for creating and validating the test is emphasized, and the appropriate interview procedures are presented. The relevance and use of standard psychometric statistical tests are discussed. Additionally, techniques for effective test administration are presented.

  10. Model Checking Verification and Validation at JPL and the NASA Fairmont IV and V Facility

    NASA Technical Reports Server (NTRS)

    Schneider, Frank; Easterbrook, Steve; Callahan, Jack; Montgomery, Todd

    1999-01-01

    We show how a technology transfer effort was carried out. The successful use of model checking on a pilot JPL flight project demonstrates the usefulness and the efficacy of the approach. The pilot project was used to model a complex spacecraft controller. Software design and implementation validation were carried out successfully. To suggest future applications we also show how the implementation validation step can be automated. The effort was followed by the formal introduction of the modeling technique as a part of the JPL Quality Assurance process.

  11. Using the Internet to Improve HRD Research: The Case of the Web-Based Delphi Research Technique to Achieve Content Validity of an HRD-Oriented Measurement

    ERIC Educational Resources Information Center

    Hatcher, Tim; Colton, Sharon

    2007-01-01

    Purpose: The purpose of this article is to highlight the results of the online Delphi research project; in particular the procedures used to establish an online and innovative process of content validation and obtaining "rich" and descriptive information using the internet and current e-learning technologies. The online Delphi was proven to be an…

  12. Simulation of hypersonic rarefied flows with the immersed-boundary method

    NASA Astrophysics Data System (ADS)

    Bruno, D.; De Palma, P.; de Tullio, M. D.

    2011-05-01

    This paper provides a validation of an immersed boundary method for computing hypersonic rarefied gas flows. The method is based on the solution of the Navier-Stokes equation and is validated versus numerical results obtained by the DSMC approach. The Navier-Stokes solver employs a flexible local grid refinement technique and is implemented on parallel machines using a domain-decomposition approach. Thanks to the efficient grid generation process, based on the ray-tracing technique, and the use of the METIS software, it is possible to obtain the partitioned grids to be assigned to each processor with a minimal effort by the user. This allows one to by-pass the expensive (in terms of time and human resources) classical generation process of a body fitted grid. First-order slip-velocity boundary conditions are employed and tested for taking into account rarefied gas effects.

  13. Applications of integrated human error identification techniques on the chemical cylinder change task.

    PubMed

    Cheng, Ching-Min; Hwang, Sheue-Ling

    2015-03-01

    This paper outlines the human error identification (HEI) techniques that currently exist to assess latent human errors. Many formal error identification techniques have existed for years, but few have been validated to cover latent human error analysis in different domains. This study considers many possible error modes and influential factors, including external error modes, internal error modes, psychological error mechanisms, and performance shaping factors, and integrates several execution procedures and frameworks of HEI techniques. The case study in this research was the operational process of changing chemical cylinders in a factory. In addition, the integrated HEI method was used to assess the operational processes and the system's reliability. It was concluded that the integrated method is a valuable aid to develop much safer operational processes and can be used to predict human error rates on critical tasks in the plant. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  14. Finite element analysis of dental implants with validation: to what extent can we expect the model to predict biological phenomena? A literature review and proposal for classification of a validation process.

    PubMed

    Chang, Yuanhan; Tambe, Abhijit Anil; Maeda, Yoshinobu; Wada, Masahiro; Gonda, Tomoya

    2018-03-08

    A literature review of finite element analysis (FEA) studies of dental implants with their model validation process was performed to establish the criteria for evaluating validation methods with respect to their similarity to biological behavior. An electronic literature search of PubMed was conducted up to January 2017 using the Medical Subject Headings "dental implants" and "finite element analysis." After accessing the full texts, the context of each article was searched using the words "valid" and "validation" and articles in which these words appeared were read to determine whether they met the inclusion criteria for the review. Of 601 articles published from 1997 to 2016, 48 that met the eligibility criteria were selected. The articles were categorized according to their validation method as follows: in vivo experiments in humans (n = 1) and other animals (n = 3), model experiments (n = 32), others' clinical data and past literature (n = 9), and other software (n = 2). Validation techniques with a high level of sufficiency and efficiency are still rare in FEA studies of dental implants. High-level validation, especially using in vivo experiments tied to an accurate finite element method, needs to become an established part of FEA studies. The recognition of a validation process should be considered when judging the practicality of an FEA study.

  15. Analysis of Management Control Techniques for the Data Processing Department at the Navy Finance Center, Cleveland, Ohio.

    DTIC Science & Technology

    1983-03-01

    System are: Order processing coordinators, Order processing management, Credit and collections, Accounts receivable, Support management, Admin management ... or sales secretary, then by order processing (OP). Phone-in orders go directly to OP. The information is next transcribed onto an order entry ... ORDER PROCESSING: The central systems validate the order items and codes by processing them against the customer file, the product or parts file, and

  16. Comparative Study on the Different Testing Techniques in Tree Classification for Detecting the Learning Motivation

    NASA Astrophysics Data System (ADS)

    Juliane, C.; Arman, A. A.; Sastramihardja, H. S.; Supriana, I.

    2017-03-01

    Having the motivation to learn is a requirement for a successful learning process, and it needs to be maintained properly. This study aims to measure learning motivation, especially in the process of electronic learning (e-learning). A data mining approach was chosen as the research method. For the testing process, a comparative study of the accuracy of different testing techniques was conducted, involving cross-validation and percentage split. The best accuracy was generated by the J48 algorithm with the percentage-split technique, reaching 92.19%. This study provides an overview of how to detect the presence of learning motivation in the context of e-learning. It is expected to be a good contribution to education and to alert teachers to the learners they need to motivate.
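
    A minimal sketch of the two testing techniques being compared (k-fold cross-validation vs. a percentage split), using a CART decision tree as a stand-in for Weka's J48 (C4.5) and synthetic learner-activity features; the data, split fraction, and fold count are assumptions for illustration.

```python
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score, train_test_split

# Hypothetical learner-activity features labelled "motivated" / "not motivated".
X, y = make_classification(n_samples=500, n_features=12, n_informative=5, random_state=7)

tree = DecisionTreeClassifier(random_state=7)   # CART, used here in place of J48/C4.5

# Testing technique 1: 10-fold cross-validation.
cv_acc = cross_val_score(tree, X, y, cv=10).mean()

# Testing technique 2: percentage split (e.g., 66% train / 34% test).
X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.66, random_state=7, stratify=y)
split_acc = tree.fit(X_tr, y_tr).score(X_te, y_te)

print(f"cross-validation accuracy: {cv_acc:.3f}, percentage-split accuracy: {split_acc:.3f}")
```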

  17. Improved Concrete Cutting and Excavation Capabilities for Crater Repair Phase 2

    DTIC Science & Technology

    2015-05-01

    production rate and ease of execution. The current ADR techniques, tactics, and procedures (TTPs) indicate cutting of pavement around a small crater ... demonstrations and evaluations were used to create the techniques, tactics, and procedures (TTPs) manual describing the processes and requirements of ... was more difficult when dowels were present. In general, the OUA demonstration validated that the new materials, equipment, and procedures were

  18. Development and validation of a ten-item questionnaire with explanatory illustrations to assess upper extremity disorders: favorable effect of illustrations in the item reduction process.

    PubMed

    Kurimoto, Shigeru; Suzuki, Mikako; Yamamoto, Michiro; Okui, Nobuyuki; Imaeda, Toshihiko; Hirata, Hitoshi

    2011-11-01

    The purpose of this study is to develop a short and valid measure for upper extremity disorders and to assess the effect of attached illustrations on the item reduction of a self-administered disability questionnaire while retaining its psychometric properties. A validated questionnaire used to assess upper extremity disorders, the Hand20, was reduced to ten items using two item-reduction techniques. The psychometric properties of the abbreviated form, the Hand10, were evaluated on a sample independent of the one used for the shortening process. The validity, reliability, and responsiveness of the Hand10 were retained in the item reduction process. It is possible that the explanatory illustrations attached to the Hand10 helped with its reproducibility. The illustrations for the Hand10 promoted text comprehension and motivation to answer the items. These changes resulted in high acceptability; more than 99.3% of patients, including 98.5% of elderly patients, could complete the Hand10 properly. The illustrations had favorable effects on the item reduction process and made it possible to retain the precision of the instrument. The Hand10 is a reliable and valid instrument for individual-level applications, with the advantage of being compact and broadly applicable, even in elderly individuals.

  19. Modern modeling techniques had limited external validity in predicting mortality from traumatic brain injury.

    PubMed

    van der Ploeg, Tjeerd; Nieboer, Daan; Steyerberg, Ewout W

    2016-10-01

    Prediction of medical outcomes may potentially benefit from using modern statistical modeling techniques. We aimed to externally validate modeling strategies for prediction of 6-month mortality of patients suffering from traumatic brain injury (TBI) with predictor sets of increasing complexity. We analyzed individual patient data from 15 different studies including 11,026 TBI patients. We consecutively considered a core set of predictors (age, motor score, and pupillary reactivity), an extended set with computed tomography scan characteristics, and a further extension with two laboratory measurements (glucose and hemoglobin). With each of these sets, we predicted 6-month mortality using default settings with five statistical modeling techniques: logistic regression (LR), classification and regression trees, random forests (RFs), support vector machines (SVMs), and neural nets. For external validation, a model developed on one of the 15 data sets was applied to each of the 14 remaining sets. This process was repeated 15 times for a total of 630 validations. The area under the receiver operating characteristic curve (AUC) was used to assess the discriminative ability of the models. For the most complex predictor set, the LR models performed best (median validated AUC value, 0.757), followed by RF and support vector machine models (median validated AUC value, 0.735 and 0.732, respectively). With each predictor set, the classification and regression trees models showed poor performance (median validated AUC value, <0.7). The variability in performance across the studies was smallest for the RF- and LR-based models (interquartile range for validated AUC values from 0.07 to 0.10). In the area of predicting mortality from TBI, nonlinear and nonadditive effects are not pronounced enough to make modern prediction methods beneficial. Copyright © 2016 Elsevier Inc. All rights reserved.
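
    A minimal sketch of the develop-on-one-study, validate-on-the-others design described above, shown only for a logistic regression on core-style predictors. The simulated "studies", predictor effects, and the number of studies are hypothetical stand-ins, not the TBI data or the full set of modeling techniques.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def cross_study_validation(datasets):
    """Develop a model on each study in turn and validate it on every other study,
    collecting the external-validation AUCs."""
    aucs = []
    for i, (X_dev, y_dev) in enumerate(datasets):
        model = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)
        for j, (X_val, y_val) in enumerate(datasets):
            if i == j:
                continue
            aucs.append(roc_auc_score(y_val, model.predict_proba(X_val)[:, 1]))
    return np.array(aucs)

# Hypothetical stand-in studies: core-style predictors (age, motor score, pupillary
# reactivity) and a binary 6-month mortality label generated from an assumed model.
rng = np.random.default_rng(0)
def make_study(n):
    X = np.column_stack([rng.normal(45, 18, n),      # age
                         rng.integers(1, 7, n),      # motor score
                         rng.integers(0, 3, n)])     # pupillary reactivity
    logit = 0.04 * (X[:, 0] - 45) - 0.5 * (X[:, 1] - 3) - 0.6 * (X[:, 2] - 1) - 1.0
    return X, (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

aucs = cross_study_validation([make_study(400) for _ in range(5)])
print("median external AUC:", round(np.median(aucs), 3),
      "IQR:", round(np.percentile(aucs, 75) - np.percentile(aucs, 25), 3))
```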

  20. A rational approach to legacy data validation when transitioning between electronic health record systems.

    PubMed

    Pageler, Natalie M; Grazier G'Sell, Max Jacob; Chandler, Warren; Mailes, Emily; Yang, Christine; Longhurst, Christopher A

    2016-09-01

    The objective of this project was to use statistical techniques to determine the completeness and accuracy of data migrated during electronic health record conversion. Data validation during migration consists of mapped record testing and validation of a sample of the data for completeness and accuracy. We statistically determined a randomized sample size for each data type based on the desired confidence level and error limits. The only error identified in the post go-live period was a failure to migrate some clinical notes, which was unrelated to the validation process. No errors in the migrated data were found during the 12-month post-implementation period. Compared to the typical industry approach, we have demonstrated that a statistical approach to sample size for data validation can ensure consistent confidence levels while maximizing efficiency of the validation process during a major electronic health record conversion. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
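
    A minimal sketch of a sample-size calculation of the kind described (confidence level and error limit driving the number of records to review per data type). The standard proportion-estimation formula with a finite-population correction is a common approach used here as an assumption, not necessarily the authors' exact method; the record count is hypothetical.

```python
from math import ceil
from scipy.stats import norm

def validation_sample_size(confidence=0.95, error_limit=0.02, expected_error_rate=0.5,
                           population=None):
    """Sample size for estimating a migration-error proportion within +/- error_limit
    at the given confidence level (worst case p = 0.5), with an optional
    finite-population correction for a data type with `population` migrated records."""
    z = norm.ppf(1 - (1 - confidence) / 2)
    n = (z ** 2) * expected_error_rate * (1 - expected_error_rate) / error_limit ** 2
    if population is not None:
        n = n / (1 + (n - 1) / population)   # finite-population correction
    return ceil(n)

# e.g., how many of 50,000 migrated records to review for 95% confidence, +/-2% error.
print(validation_sample_size(0.95, 0.02, population=50_000))
```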

  1. Validating Retinal Fundus Image Analysis Algorithms: Issues and a Proposal

    PubMed Central

    Trucco, Emanuele; Ruggeri, Alfredo; Karnowski, Thomas; Giancardo, Luca; Chaum, Edward; Hubschman, Jean Pierre; al-Diri, Bashir; Cheung, Carol Y.; Wong, Damon; Abràmoff, Michael; Lim, Gilbert; Kumar, Dinesh; Burlina, Philippe; Bressler, Neil M.; Jelinek, Herbert F.; Meriaudeau, Fabrice; Quellec, Gwénolé; MacGillivray, Tom; Dhillon, Bal

    2013-01-01

    This paper concerns the validation of automatic retinal image analysis (ARIA) algorithms. For reasons of space and consistency, we concentrate on the validation of algorithms processing color fundus camera images, currently the largest section of the ARIA literature. We sketch the context (imaging instruments and target tasks) of ARIA validation, summarizing the main image analysis and validation techniques. We then present a list of recommendations focusing on the creation of large repositories of test data created by international consortia, easily accessible via moderated Web sites, including multicenter annotations by multiple experts, specific to clinical tasks, and capable of running submitted software automatically on the data stored, with clear and widely agreed-on performance criteria, to provide a fair comparison. PMID:23794433

  2. Valid and Reliable Science Content Assessments for Science Teachers

    NASA Astrophysics Data System (ADS)

    Tretter, Thomas R.; Brown, Sherri L.; Bush, William S.; Saderholm, Jon C.; Holmes, Vicki-Lynn

    2013-03-01

    Science teachers' content knowledge is an important influence on student learning, highlighting an ongoing need for programs, and assessments of those programs, designed to support teacher learning of science. Valid and reliable assessments of teacher science knowledge are needed for direct measurement of this crucial variable. This paper describes multiple sources of validity and reliability evidence (Cronbach's alpha greater than 0.8) for physical, life, and earth/space science assessments—part of the Diagnostic Teacher Assessments of Mathematics and Science (DTAMS) project. Validity was strengthened by systematic synthesis of relevant documents, extensive use of external reviewers, and field tests with 900 teachers during the assessment development process. Subsequent results from 4,400 teachers, analyzed with Rasch IRT modeling techniques, offer construct and concurrent validity evidence.

  3. An Investigation to Advance the Technology Readiness Level of the Centaur Derived On-orbit Propellant Storage and Transfer System

    NASA Astrophysics Data System (ADS)

    Silvernail, Nathan L.

    This research was carried out in collaboration with the United Launch Alliance (ULA) to advance an innovative Centaur-based on-orbit propellant storage and transfer system that takes advantage of rotational settling to simplify Fluid Management (FM), specifically enabling settled fluid transfer between two tanks and settled pressure control. This research consists of two specific objectives: (1) technique and process validation and (2) computational model development. In order to raise the Technology Readiness Level (TRL) of this technology, the corresponding FM techniques and processes must be validated in a series of experimental tests, including laboratory/ground testing, microgravity flight testing, suborbital flight testing, and orbital testing. Researchers from Embry-Riddle Aeronautical University (ERAU) have joined with the Massachusetts Institute of Technology (MIT) Synchronized Position Hold Engage and Reorient Experimental Satellites (SPHERES) team to develop a prototype FM system for operations aboard the International Space Station (ISS). Testing of the integrated system in a representative environment will raise the FM system to TRL 6. The tests will demonstrate the FM system and provide unique data pertaining to the vehicle's rotational dynamics while undergoing fluid transfer operations. These data sets provide insight into the behavior and physical tendencies of the on-orbit refueling system. Furthermore, they provide a baseline for comparison against the data produced by various computational models, thus verifying the accuracy of the models' output and validating the modeling approach. Once these preliminary models have been validated, the parameters defined by them will provide the basis of development for accurate simulations of full-scale, on-orbit systems. The completion of this project and the models being developed will accelerate the commercialization of on-orbit propellant storage and transfer technologies as well as all in-space technologies that utilize or will utilize similar FM techniques and processes.

  4. Application of High Speed Digital Image Correlation in Rocket Engine Hot Fire Testing

    NASA Technical Reports Server (NTRS)

    Gradl, Paul R.; Schmidt, Tim

    2016-01-01

    Hot fire testing of rocket engine components and rocket engine systems is a critical aspect of the development process to understand performance, reliability, and system interactions. Ground testing provides the opportunity for highly instrumented development testing to validate analytical model predictions and determine necessary design changes and process improvements. To properly obtain discrete measurements for model validation, instrumentation must survive the highly dynamic and extreme-temperature environment of hot fire testing. Digital Image Correlation has been investigated and is being evaluated as a technique to augment traditional instrumentation during component and engine testing, providing further data for additional performance improvements and cost savings. The feasibility of digital image correlation techniques was demonstrated in subscale and full-scale hot fire testing. This incorporated a pair of high-speed cameras, installed and operated under the extreme environments present on the test stand, to measure three-dimensional, real-time displacements and strains. The development process, setup and calibrations, data collection, hot fire test data collection, and post-test analysis and results are presented in this paper.
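
    A minimal sketch of the core Digital Image Correlation step: matching a speckle subset between a reference and a deformed image by maximizing normalized cross-correlation over an integer-pixel search window. Real DIC systems add subpixel interpolation and stereo calibration; the images, subset size, and search range below are hypothetical.

```python
import numpy as np

def ncc(a, b):
    """Zero-normalized cross-correlation between two equally sized image subsets."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def track_subset(ref, cur, top_left, size=21, search=10):
    """Integer-pixel DIC: find the displacement of a reference subset by maximizing
    the normalized cross-correlation over a +/- `search` pixel window."""
    r, c = top_left
    subset = ref[r:r + size, c:c + size]
    best, best_dv = -2.0, (0, 0)
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            cand = cur[r + dr:r + dr + size, c + dc:c + dc + size]
            score = ncc(subset, cand)
            if score > best:
                best, best_dv = score, (dr, dc)
    return best_dv, best

# Hypothetical speckle-painted surface images; the "deformed" image is shifted by (3, -2) pixels.
rng = np.random.default_rng(0)
ref = rng.random((120, 120))
cur = np.roll(np.roll(ref, 3, axis=0), -2, axis=1)
print(track_subset(ref, cur, top_left=(50, 50)))   # expected displacement ≈ (3, -2)
```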

  5. Development of a Prototype H-46 Helicopter Diagnostic Expert System.

    DTIC Science & Technology

    1987-09-01

    SQUADRON MAINTENANCE: CURRENT PROCESS AND CADS INTEGRATION. A. MAINTENANCE DATA SYSTEM ... increase the effectiveness of the maintenance process should enhance the ability of achieving these objectives. Artificial intelligence techniques offer a ... completely validated. G. ORGANIZATION OF STUDY: Chapter II contains a description of the Naval Aviation Maintenance Program's Maintenance Data System (MDS

  6. Evacuation performance evaluation tool.

    PubMed

    Farra, Sharon; Miller, Elaine T; Gneuhs, Matthew; Timm, Nathan; Li, Gengxin; Simon, Ashley; Brady, Whittney

    2016-01-01

    Hospitals conduct evacuation exercises to improve performance during emergency events. An essential aspect in this process is the creation of reliable and valid evaluation tools. The objective of this article is to describe the development and implications of a disaster evacuation performance tool that measures one portion of the very complex process of evacuation. Through the application of the Delphi technique and DeVellis's framework, disaster and neonatal experts provided input in developing this performance evaluation tool. Following development, content validity and reliability of this tool were assessed. Large pediatric hospital and medical center in the Midwest. The tool was pilot tested with an administrative, medical, and nursing leadership group and then implemented with a group of 68 healthcare workers during a disaster exercise of a neonatal intensive care unit (NICU). The tool has demonstrated high content validity with a scale validity index of 0.979 and inter-rater reliability G coefficient (0.984, 95% CI: 0.948-0.9952). The Delphi process based on the conceptual framework of DeVellis yielded a psychometrically sound evacuation performance evaluation tool for a NICU.

  7. Development of single shot 1D-Raman scattering measurements for flames

    NASA Astrophysics Data System (ADS)

    Biase, Amelia; Uddi, Mruthunjaya

    2017-11-01

    The majority of energy consumption in the US comes from burning fossil fuels, which increases the concentration of carbon dioxide in the atmosphere. The increasing concentration of carbon dioxide in the atmosphere has negative impacts on the environment. One solution to this problem is to study the oxy-combustion process. A pure oxygen stream is used instead of air for combustion, so the products contain only carbon dioxide and water. It is easy to separate the water from the carbon dioxide by condensation, and the carbon dioxide can be captured easily. The lower gas volume allows for easier removal of pollutants from the flue gas. The design of a system that studies the oxy-combustion process using advanced laser diagnostic techniques and Raman scattering measurements is presented. The experiments focus on spontaneous Raman scattering. This is one of the few techniques that can provide quantitative measurements of the concentration and temperature of different chemical species in a turbulent flow. The experimental design and the process of validating the design to ensure the data are accurate are described. The Raman data collected form an experimental database that is used for the validation of spontaneous Raman scattering in high-pressure environments for the oxy-combustion process. NSF EEC 1659710.

  8. Soot volume fraction fields in unsteady axis-symmetric flames by continuous laser extinction technique.

    PubMed

    Kashif, Muhammad; Bonnety, Jérôme; Guibert, Philippe; Morin, Céline; Legros, Guillaume

    2012-12-17

    A Laser Extinction Method has been set up to provide the two-dimensional soot volume fraction field time history at a tunable frequency up to 70 Hz inside an axis-symmetric diffusion flame experiencing slow unsteady phenomena that preserve the symmetry. The use of a continuous-wave laser as the light source enables this repetition rate, which is an incremental advance in the laser extinction technique. The technique is shown to allow a fine description of the soot volume fraction field in a flame exhibiting a 12.6 Hz flickering phenomenon. Within this range of repetition rate, the technique and its subsequent post-processing require neither a method for time-domain reconstruction nor a correction for energy intrusion. Possibly complemented by such a reconstruction method, the technique should support further soot volume fraction databases in oscillating flames that exhibit characteristic times relevant to the current efforts in the validation of soot process modeling.
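
    A minimal sketch of the line-of-sight inversion behind laser extinction measurements: the Beer-Lambert law I/I0 = exp(−Ke·fv·L/λ) rearranged to fv = −λ·ln(I/I0)/(Ke·L). The wavelength, the dimensionless extinction coefficient Ke, and the transmission values below are illustrative assumptions, not values from the paper, and no tomographic deconvolution is included.

```python
import numpy as np

def soot_volume_fraction(I, I0, path_length_m, wavelength_m=632.8e-9, Ke=8.6):
    """Invert a line-of-sight laser extinction measurement with the Beer-Lambert law:
    I/I0 = exp(-Ke * fv * L / lambda)  =>  fv = -lambda * ln(I/I0) / (Ke * L).
    Ke is the dimensionless soot extinction coefficient (the value here is an assumption)."""
    tau = np.asarray(I, dtype=float) / np.asarray(I0, dtype=float)
    return -wavelength_m * np.log(tau) / (Ke * path_length_m)

# Hypothetical transmission measurements across a 1 cm flame path.
I0 = 1.0
I = np.array([0.98, 0.90, 0.80, 0.95])
print(soot_volume_fraction(I, I0, path_length_m=0.01) * 1e6, "ppm")
```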

  9. Study of photon correlation techniques for processing of laser velocimeter signals

    NASA Technical Reports Server (NTRS)

    Mayo, W. T., Jr.

    1977-01-01

    The objective was to provide the theory and a system design for a new type of photon counting processor for low-level dual-scatter laser velocimeter (LV) signals that would be capable of both first-order measurements of mean flow and turbulence intensity and second-order time statistics: cross correlation, auto correlation, and related spectra. A general Poisson process model for low-level LV signals and noise, valid from the photon-resolved regime all the way to the limiting case of nonstationary Gaussian noise, was used. Computer simulation algorithms and higher-order statistical moment analysis of Poisson processes were derived and applied to the analysis of photon correlation techniques. A system design using a unique dual correlate-and-subtract frequency discriminator technique is postulated and analyzed. Expectation analysis indicates that the objective measurements are feasible.

  10. Breaking Out of the Lab: Measuring Real-Time Responses to Televised Political Content in Real-World Settings.

    PubMed

    Maier, Jürgen; Hampe, J Felix; Jahn, Nico

    2016-01-01

    Real-time response (RTR) measurement is an important technique for analyzing human processing of electronic media stimuli. Although it has been demonstrated that RTR data are reliable and internally valid, some argue that they lack external validity. The reason for this is that RTR measurement is restricted to a laboratory environment due to its technical requirements. This paper introduces a smartphone app that 1) captures real-time responses using the dial technique and 2) provides a solution for one of the most important problems in RTR measurement, the (automatic) synchronization of RTR data. In addition, it explores the reliability and validity of mobile RTR measurement by comparing the real-time reactions of two samples of young and well-educated voters to the 2013 German televised debate. Whereas the first sample participated in a classical laboratory study, the second sample was equipped with our mobile RTR system and watched the debate at home. Results indicate that the mobile RTR system yields similar results to the lab-based RTR measurement, providing evidence that laboratory studies using RTR are externally valid. In particular, the argument that the artificial reception situation creates artificial results has to be questioned. In addition, we conclude that RTR measurement outside the lab is possible. Hence, mobile RTR opens the door for large-scale studies to better understand the processing and impact of electronic media content.

  11. Breaking Out of the Lab

    PubMed Central

    Maier, Jürgen; Hampe, J. Felix; Jahn, Nico

    2016-01-01

    Real-time response (RTR) measurement is an important technique for analyzing human processing of electronic media stimuli. Although it has been demonstrated that RTR data are reliable and internally valid, some argue that they lack external validity. The reason for this is that RTR measurement is restricted to a laboratory environment due to its technical requirements. This paper introduces a smartphone app that 1) captures real-time responses using the dial technique and 2) provides a solution for one of the most important problems in RTR measurement, the (automatic) synchronization of RTR data. In addition, it explores the reliability and validity of mobile RTR measurement by comparing the real-time reactions of two samples of young and well-educated voters to the 2013 German televised debate. Whereas the first sample participated in a classical laboratory study, the second sample was equipped with our mobile RTR system and watched the debate at home. Results indicate that the mobile RTR system yields similar results to the lab-based RTR measurement, providing evidence that laboratory studies using RTR are externally valid. In particular, the argument that the artificial reception situation creates artificial results has to be questioned. In addition, we conclude that RTR measurement outside the lab is possible. Hence, mobile RTR opens the door for large-scale studies to better understand the processing and impact of electronic media content. PMID:27274577

  12. Bidirectional light-scattering image processing method for high-concentration jet sprays

    NASA Astrophysics Data System (ADS)

    Shimizu, I.; Emori, Y.; Yang, W.-J.; Shimoda, M.; Suzuki, T.

    1985-01-01

    In order to study the distributions of droplet size and volume density in high-concentration jet sprays, a new technique is developed, which combines the forward and backward light scattering method and an image processing method. A pulsed ruby laser is used as the light source. The Mie scattering theory is applied to the results obtained from image processing on the scattering photographs. The time history is obtained for the droplet size and volume density distributions, and the method is demonstrated by diesel fuel sprays under various injecting conditions. The validity of the technique is verified by a good agreement in the injected fuel volume distributions obtained by the present method and by injection rate measurements.

  13. Field validation of speed estimation techniques for air quality conformity analysis.

    DOT National Transportation Integrated Search

    2004-01-01

    The air quality conformity analysis process requires the estimation of speeds for a horizon year on a link-by-link basis where only a few future roadway characteristics, such as forecast volume and capacity, are known. Accordingly, the Virginia Depar...

  14. Multi-gene genetic programming based predictive models for municipal solid waste gasification in a fluidized bed gasifier.

    PubMed

    Pandey, Daya Shankar; Pan, Indranil; Das, Saptarshi; Leahy, James J; Kwapinski, Witold

    2015-03-01

    A multi-gene genetic programming technique is proposed as a new method to predict syngas yield production and the lower heating value for municipal solid waste gasification in a fluidized bed gasifier. The study shows that the predicted outputs of the municipal solid waste gasification process are in good agreement with the experimental dataset and also generalise well to validation (untrained) data. Published experimental datasets are used for model training and validation purposes. The results show the effectiveness of the genetic programming technique for solving complex nonlinear regression problems. The multi-gene genetic programming model is also compared with a single-gene genetic programming model to show the relative merits and demerits of the technique. This study demonstrates that the genetic programming based data-driven modelling strategy can be a good candidate for developing models for other types of fuels as well. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. ISS Logistics Hardware Disposition and Metrics Validation

    NASA Technical Reports Server (NTRS)

    Rogers, Toneka R.

    2010-01-01

    I was assigned to the Logistics Division of the International Space Station (ISS)/Spacecraft Processing Directorate. The Division consists of eight NASA engineers and specialists that oversee the logistics portion of the Checkout, Assembly, and Payload Processing Services (CAPPS) contract. Boeing, their sub-contractors and the Boeing Prime contract out of Johnson Space Center, provide the Integrated Logistics Support for the ISS activities at Kennedy Space Center. Essentially they ensure that spares are available to support flight hardware processing and the associated ground support equipment (GSE). Boeing maintains a Depot for electrical, mechanical and structural modifications and/or repair capability as required. My assigned task was to learn project management techniques utilized by NASA and its contractors to provide an efficient and effective logistics support infrastructure to the ISS program. Within the Space Station Processing Facility (SSPF) I was exposed to Logistics support components, such as the NASA Spacecraft Services Depot (NSSD) capabilities, Mission Processing tools, techniques and Warehouse support issues required for integrating Space Station elements at the Kennedy Space Center. I also supported the identification of near-term ISS Hardware and Ground Support Equipment (GSE) candidates for excessing/disposition prior to October 2010; and the validation of several Logistics Metrics used by the contractor to measure logistics support effectiveness.

  16. Developmental validation of a Nextera XT mitogenome Illumina MiSeq sequencing method for high-quality samples.

    PubMed

    Peck, Michelle A; Sturk-Andreaggi, Kimberly; Thomas, Jacqueline T; Oliver, Robert S; Barritt-Ross, Suzanne; Marshall, Charla

    2018-05-01

    Generating mitochondrial genome (mitogenome) data from reference samples in a rapid and efficient manner is critical to harnessing the greater power of discrimination of the entire mitochondrial DNA (mtDNA) marker. The method of long-range target enrichment, Nextera XT library preparation, and Illumina sequencing on the MiSeq is a well-established technique for generating mitogenome data from high-quality samples. To this end, a validation was conducted for this mitogenome method processing up to 24 samples simultaneously, along with analysis in the CLC Genomics Workbench and use of the AQME (AFDIL-QIAGEN mtDNA Expert) tool to generate forensic profiles. This validation followed the Federal Bureau of Investigation's Quality Assurance Standards (QAS) for forensic DNA testing laboratories and the Scientific Working Group on DNA Analysis Methods (SWGDAM) validation guidelines. The evaluation of control DNA, non-probative samples, blank controls, mixtures, and nonhuman samples demonstrated the validity of this method. Specifically, the sensitivity was established at ≥25 pg of nuclear DNA input for accurate mitogenome profile generation. Unreproducible low-level variants were observed in samples with low amplicon yields. Further, variant quality was shown to be a useful metric for identifying sequencing error and crosstalk. Success of this method was demonstrated with a variety of reference sample substrates and extract types. These studies further demonstrate the advantages of using NGS techniques by highlighting the quantitative nature of heteroplasmy detection. The results presented herein, from more than 175 samples processed in ten sequencing runs, show this mitogenome sequencing method and analysis strategy to be valid for the generation of reference data. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. Quantum-state anomaly detection for arbitrary errors using a machine-learning technique

    NASA Astrophysics Data System (ADS)

    Hara, Satoshi; Ono, Takafumi; Okamoto, Ryo; Washio, Takashi; Takeuchi, Shigeki

    2016-10-01

    The accurate detection of small deviations in given density matrices is important for quantum information processing but is a difficult task because of the intrinsic fluctuation in density matrices reconstructed using a limited number of experiments. We previously proposed a method for decoherence error detection using a machine-learning technique [S. Hara, T. Ono, R. Okamoto, T. Washio, and S. Takeuchi, Phys. Rev. A 89, 022104 (2014), 10.1103/PhysRevA.89.022104]. However, the previous method is not valid when the errors are just changes in phase. Here, we propose a method that is valid for arbitrary errors in density matrices. The performance of the proposed method is verified using both numerical simulation data and real experimental data.
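
    As an illustration of the general idea of anomaly detection on reconstructed density matrices, the sketch below trains a one-class SVM on feature vectors built from simulated single-qubit states; this is a generic stand-in, not the authors' method, and the states and noise model are assumptions.

    ```python
    import numpy as np
    from sklearn.svm import OneClassSVM

    rng = np.random.default_rng(1)

    def noisy_state(p_error):
        """Single-qubit density matrix: |0><0| perturbed by a small Hermitian error."""
        rho0 = np.array([[1.0, 0.0], [0.0, 0.0]], dtype=complex)
        noise = rng.normal(scale=p_error, size=(2, 2)) + 1j * rng.normal(scale=p_error, size=(2, 2))
        rho = rho0 + (noise + noise.conj().T) / 2    # keep it Hermitian
        return rho / np.trace(rho).real              # renormalize the trace

    def features(rho):
        """Flatten real and imaginary parts into a feature vector."""
        return np.concatenate([rho.real.ravel(), rho.imag.ravel()])

    nominal = np.array([features(noisy_state(0.01)) for _ in range(200)])
    anomalous = np.array([features(noisy_state(0.2)) for _ in range(20)])

    detector = OneClassSVM(nu=0.05, gamma="scale").fit(nominal)
    print(detector.predict(anomalous))   # mostly -1, i.e. flagged as anomalous states
    ```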

  18. Advanced Reactors-Intermediate Heat Exchanger (IHX) Coupling: Theoretical Modeling and Experimental Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Utgikar, Vivek; Sun, Xiaodong; Christensen, Richard

    2016-12-29

    The overall goal of the research project was to model the behavior of the advanced reactor-intermediate heat exchange system and to develop advanced control techniques for off-normal conditions. The specific objectives defined for the project were: 1. To develop the steady-state thermal hydraulic design of the intermediate heat exchanger (IHX); 2. To develop mathematical models to describe the advanced nuclear reactor-IHX-chemical process/power generation coupling during normal and off-normal operations, and to simulate the models using multiphysics software; 3. To develop control strategies using genetic algorithm or neural network techniques and couple these techniques with the multiphysics software; 4. To validate the models experimentally. The project objectives were accomplished by defining and executing four different tasks corresponding to these specific objectives. The first task involved selection of IHX candidates and developing steady-state designs for those. The second task involved modeling of the transient and off-normal operation of the reactor-IHX system. The subsequent task dealt with the development of control strategies and involved algorithm development and simulation. The last task involved experimental validation of the thermal hydraulic performance of the two prototype heat exchangers designed and fabricated for the project at steady-state and transient conditions to simulate the coupling of the reactor-IHX-process plant system. The experimental work utilized two test facilities at The Ohio State University (OSU): the existing High-Temperature Helium Test Facility (HTHF) and the newly developed high-temperature molten salt facility.

  19. Neutron capture on short-lived nuclei via the surrogate (d,pγ) reaction

    NASA Astrophysics Data System (ADS)

    Cizewski, Jolie A.; Ratkiewicz, Andrew

    2018-05-01

    Rapid r-process nucleosynthesis is responsible for the creation of about half of the elements heavier than iron. Neutron capture on short-lived nuclei in cold processes or during freeze out from hot processes can have a significant impact on the final observed r-process abundances. We are validating the (d,pγ) reaction as a surrogate for neutron capture with measurements on 95Mo targets and a focus on discrete transitions. The experimental results have been analyzed within the Hauser-Feshbach approach, with non-elastic breakup of the deuteron providing a neutron to be captured. Preliminary results support the (d,pγ) reaction as a valid surrogate for neutron capture. We are poised to measure the (d,pγ) reaction in inverse kinematics with unstable beams following the development of the experimental techniques.

  20. Validation of the NASA Dryden X-31 simulation and evaluation of mechanization techniques

    NASA Technical Reports Server (NTRS)

    Dickes, Edward; Kay, Jacob; Ralston, John

    1994-01-01

    This paper discusses the evaluation of the original Dryden X-31 aerodynamic math model, the processes involved in the justification and creation of the modified database, and time-history comparisons of the model response with flight test data.

  1. Dream Symbol or Dream Process?

    ERIC Educational Resources Information Center

    Himelstein, Philip

    1984-01-01

    Discusses the relationship of the symbolic content of dreams to the theory of the dream in psychoanalysis and Gestalt therapy. Points out that the utility of the dream depends upon the techniques of the therapist and not on the validity of the underlying theory of the dream. (LLL)

  2. Applications of LC-MS in PET Radioligand Development and Metabolic Elucidation

    PubMed Central

    Ma, Ying; Kiesewetter, Dale O.; Lang, Lixin; Gu, Dongyu; Chen, Xiaoyuan

    2013-01-01

    Positron emission tomography (PET) is a very sensitive molecular imaging technique that, when employed with an appropriate radioligand, has the ability to quantitate physiological processes in a non-invasive manner. Since the imaging technique detects all radioactive emissions in the field of view, the presence and biological activity of radiolabeled metabolites must be determined for each radioligand in order to validate the utility of the radiotracer for measuring the desired physiological process. Thus, the identification of metabolic profiles of radiolabeled compounds is an important aspect of design, development, and validation of new radiopharmaceuticals and their applications in drug development and molecular imaging. Metabolite identification for different chemical classes of radiopharmaceuticals allows rational design to minimize the formation and accumulation of metabolites in the target tissue, either through enhanced excretion or minimized metabolism. This review will discuss methods for identifying and quantitating metabolites during the pre-clinical development of radiopharmaceuticals with special emphasis on the application of LC/MS. PMID:20540692

  3. On vital aid: the why, what and how of validation

    PubMed Central

    Kleywegt, Gerard J.

    2009-01-01

    Limitations to the data and subjectivity in the structure-determination process may cause errors in macromolecular crystal structures. Appropriate validation techniques may be used to reveal problems in structures, ideally before they are analysed, published or deposited. Additionally, such techniques may be used a posteriori to assess the (relative) merits of a model by potential users. Weak validation methods and statistics assess how well a model reproduces the information that was used in its construction (i.e. experimental data and prior knowledge). Strong methods and statistics, on the other hand, test how well a model predicts data or information that were not used in the structure-determination process. These may be data that were excluded from the process on purpose, general knowledge about macromolecular structure, information about the biological role and biochemical activity of the molecule under study or its mutants or complexes and predictions that are based on the model and that can be tested experimentally. PMID:19171968

  4. Topographic gravity modeling for global Bouguer maps to degree 2160: Validation of spectral and spatial domain forward modeling techniques at the 10 microGal level

    NASA Astrophysics Data System (ADS)

    Hirt, Christian; Reußner, Elisabeth; Rexer, Moritz; Kuhn, Michael

    2016-09-01

    Over the past years, spectral techniques have become a standard to model Earth's global gravity field to 10 km scales, with the EGM2008 geopotential model being a prominent example. For some geophysical applications of EGM2008, particularly Bouguer gravity computation with spectral techniques, a topographic potential model of adequate resolution is required. However, current topographic potential models have not yet been successfully validated to degree 2160, and notable discrepancies between spectral modeling and Newtonian (numerical) integration well beyond the 10 mGal level have been reported. Here we accurately compute and validate gravity implied by a degree 2160 model of Earth's topographic masses. Our experiments are based on two key strategies, both of which require advanced computational resources. First, we construct a spectrally complete model of the gravity field which is generated by the degree 2160 Earth topography model. This involves expansion of the topographic potential to the 15th integer power of the topography and modeling of short-scale gravity signals to an ultrahigh degree of 21,600, translating into unprecedented fine scales of 1 km. Second, we apply Newtonian integration in the space domain with high spatial resolution to reduce discretization errors. Our numerical study demonstrates excellent agreement (8 μGal RMS) between gravity from both forward modeling techniques and provides insight into the convergence process associated with spectral modeling of gravity signals at very short scales (few km). As a key conclusion, our work successfully validates the spectral domain forward modeling technique for degree 2160 topography and increases the confidence in new high-resolution global Bouguer gravity maps.

  5. A Surrogate Technique for Investigating Deterministic Dynamics in Discrete Human Movement.

    PubMed

    Taylor, Paul G; Small, Michael; Lee, Kwee-Yum; Landeo, Raul; O'Meara, Damien M; Millett, Emma L

    2016-10-01

    Entropy is an effective tool for investigation of human movement variability. However, before applying entropy, it can be beneficial to employ analyses to confirm that observed data are not solely the result of stochastic processes. This can be achieved by contrasting observed data with that produced using surrogate methods. Unlike continuous movement, no appropriate method has been applied to discrete human movement. This article proposes a novel surrogate method for discrete movement data, outlining the processes for determining its critical values. The proposed technique reliably generated surrogates for discrete joint angle time series, destroying fine-scale dynamics of the observed signal, while maintaining macro structural characteristics. Comparison of entropy estimates indicated observed signals had greater regularity than surrogates and were not only the result of stochastic but also deterministic processes. The proposed surrogate method is both a valid and reliable technique to investigate determinism in other discrete human movement time series.
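
    The sketch below illustrates the generic surrogate-data idea on a toy series: shuffle surrogates destroy temporal structure while preserving the value distribution, and a basic sample entropy estimate is compared between the observed signal and its surrogates. It is not the authors' discrete-movement surrogate method; the signal and parameters are assumptions.

    ```python
    import numpy as np

    def sample_entropy(x, m=2, r_factor=0.2):
        """Basic sample entropy (SampEn) estimate of a 1-D series."""
        x = np.asarray(x, float)
        r = r_factor * np.std(x)
        def matches(length):
            templates = np.array([x[i:i + length] for i in range(len(x) - m)])
            count = 0
            for t in templates:
                dist = np.max(np.abs(templates - t), axis=1)
                count += np.sum(dist <= r) - 1        # exclude the self-match
            return count
        b, a = matches(m), matches(m + 1)
        return -np.log(a / b) if a > 0 and b > 0 else np.inf

    rng = np.random.default_rng(0)
    t = np.arange(300)
    observed = np.sin(0.2 * t) + 0.3 * rng.normal(size=t.size)   # structured signal

    # Shuffle surrogates destroy temporal structure but keep the value distribution.
    surrogate_entropies = [sample_entropy(rng.permutation(observed)) for _ in range(20)]

    print(sample_entropy(observed))        # lower: more regular than the surrogates
    print(np.mean(surrogate_entropies))
    ```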

  6. Online measurement of bead geometry in GMAW-based additive manufacturing using passive vision

    NASA Astrophysics Data System (ADS)

    Xiong, Jun; Zhang, Guangjun

    2013-11-01

    Additive manufacturing based on gas metal arc welding is an advanced technique for depositing fully dense components with low cost. Despite this fact, techniques to achieve accurate control and automation of the process have not yet been perfectly developed. The online measurement of the deposited bead geometry is a key problem for reliable control. In this work a passive vision-sensing system, comprising two cameras and composite filtering techniques, was proposed for real-time detection of the bead height and width through deposition of thin walls. The nozzle to the top surface distance was monitored for eliminating accumulated height errors during the multi-layer deposition process. Various image processing algorithms were applied and discussed for extracting feature parameters. A calibration procedure was presented for the monitoring system. Validation experiments confirmed the effectiveness of the online measurement system for bead geometry in layered additive manufacturing.
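
    A minimal sketch of the kind of bead-geometry measurement described (threshold, largest contour, bounding box), assuming OpenCV 4.x and a synthetic frame in place of the camera images; the real system uses two cameras, composite filtering, and a calibration step to convert pixels to millimetres.

    ```python
    import numpy as np
    import cv2

    # Synthetic grayscale frame standing in for a filtered camera image:
    # a bright horizontal band plays the role of the deposited bead.
    frame = np.zeros((200, 320), dtype=np.uint8)
    frame[90:130, 20:300] = 200
    frame = cv2.GaussianBlur(frame, (5, 5), 0)

    # Segment the bright bead region and take the largest contour.
    _, binary = cv2.threshold(frame, 100, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    bead = max(contours, key=cv2.contourArea)

    # Bounding box gives bead width and height in pixels; a camera calibration
    # (mm per pixel) would convert these to physical bead dimensions.
    x, y, w, h = cv2.boundingRect(bead)
    print(f"bead width: {w}px, bead height: {h}px")
    ```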

  7. Key management of the double random-phase-encoding method using public-key encryption

    NASA Astrophysics Data System (ADS)

    Saini, Nirmala; Sinha, Aloka

    2010-03-01

    Public-key encryption has been used to encode the key of the encryption process. In the proposed technique, an input image has been encrypted by using the double random-phase-encoding method using extended fractional Fourier transform. The key of the encryption process has been encoded by using the Rivest-Shamir-Adleman (RSA) public-key encryption algorithm. The encoded key has then been transmitted to the receiver side along with the encrypted image. In the decryption process, first the encoded key has been decrypted using the secret key and then the encrypted image has been decrypted by using the retrieved key parameters. The proposed technique has an advantage over the double random-phase-encoding method because the problem associated with the transmission of the key has been eliminated by using public-key encryption. Computer simulation has been carried out to validate the proposed technique.
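
    A minimal sketch of classic double random-phase encoding using the ordinary Fourier transform (the paper uses the extended fractional Fourier transform, and additionally protects the key parameters with RSA, which is omitted here); the image and random seeds are placeholders.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    img = rng.random((64, 64))          # stand-in for the input image

    # Random phase masks: these (or the seeds that generate them) are the keys.
    phi1 = np.exp(2j * np.pi * rng.random(img.shape))
    phi2 = np.exp(2j * np.pi * rng.random(img.shape))

    # Encryption: mask in the spatial domain, transform, mask in the Fourier domain.
    encrypted = np.fft.ifft2(np.fft.fft2(img * phi1) * phi2)

    # Decryption: undo the Fourier-domain mask, invert, undo the input mask.
    decrypted = np.fft.ifft2(np.fft.fft2(encrypted) * np.conj(phi2)) * np.conj(phi1)
    print(np.allclose(np.abs(decrypted), img))   # True up to numerical error
    ```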

  8. Design, development, testing and validation of a Photonics Virtual Laboratory for the study of LEDs

    NASA Astrophysics Data System (ADS)

    Naranjo, Francisco L.; Martínez, Guadalupe; Pérez, Ángel L.; Pardo, Pedro J.

    2014-07-01

    This work presents the design, development, testing and validation of a Photonics Virtual Laboratory, highlighting the study of LEDs. The study was conducted from a conceptual, experimental and didactic standpoint, using e-learning and m-learning platforms. Specifically, teaching tools have been developed that help ensure that our students achieve meaningful learning. The scientific content, such as the study of LEDs, has been brought together with techniques for the generation and transfer of knowledge through the selection, hierarchization and structuring of information using concept maps. For the validation of the didactic materials developed, procedures with various assessment tools were used for the collection and processing of data, applied in the context of an experimental design. Additionally, a statistical analysis was performed to determine the validity of the materials developed. The assessment was designed to validate the contributions of the new materials over the traditional method of teaching and to quantify the learning achieved by students, in order to draw conclusions that serve as a reference for their application in teaching and learning processes, and to comprehensively validate the work carried out.

  9. Measuring adverse events in helicopter emergency medical services: establishing content validity.

    PubMed

    Patterson, P Daniel; Lave, Judith R; Martin-Gill, Christian; Weaver, Matthew D; Wadas, Richard J; Arnold, Robert M; Roth, Ronald N; Mosesso, Vincent N; Guyette, Francis X; Rittenberger, Jon C; Yealy, Donald M

    2014-01-01

    We sought to create a valid framework for detecting adverse events (AEs) in the high-risk setting of helicopter emergency medical services (HEMS). We assembled a panel of 10 expert clinicians (n = 6 emergency medicine physicians and n = 4 prehospital nurses and flight paramedics) affiliated with a large multistate HEMS organization in the Northeast US. We used a modified Delphi technique to develop a framework for detecting AEs associated with the treatment of critically ill or injured patients. We used a widely applied measure, the content validity index (CVI), to quantify the validity of the framework's content. The expert panel of 10 clinicians reached consensus on a common AE definition and four-step protocol/process for AE detection in HEMS. The consensus-based framework is composed of three main components: (1) a trigger tool, (2) a method for rating proximal cause, and (3) a method for rating AE severity. The CVI findings isolate components of the framework considered content valid. We demonstrate a standardized process for the development of a content-valid framework for AE detection. The framework is a model for the development of a method for AE identification in other settings, including ground-based EMS.
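
    As a hedged illustration of how content validity index values are commonly computed (not the panel's actual ratings), the snippet below derives item-level CVIs as the proportion of experts rating an item 3 or 4 on a 4-point relevance scale, plus the averaged scale-level CVI.

    ```python
    # Hypothetical relevance ratings (1-4 scale) from 10 experts for three items
    ratings = {
        "trigger_tool":    [4, 4, 3, 4, 3, 4, 4, 3, 4, 4],
        "proximal_cause":  [3, 4, 4, 3, 4, 2, 4, 3, 4, 4],
        "severity_rating": [4, 3, 4, 4, 4, 4, 3, 4, 4, 3],
    }

    def item_cvi(scores):
        """I-CVI: proportion of experts rating the item 3 or 4 (relevant)."""
        return sum(s >= 3 for s in scores) / len(scores)

    i_cvis = {item: item_cvi(scores) for item, scores in ratings.items()}
    scale_cvi = sum(i_cvis.values()) / len(i_cvis)   # S-CVI/Ave: mean of the I-CVIs

    print(i_cvis)
    print(f"S-CVI/Ave = {scale_cvi:.2f}")
    ```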

  10. Controlling for confounding variables in MS-omics protocol: why modularity matters.

    PubMed

    Smith, Rob; Ventura, Dan; Prince, John T

    2014-09-01

    As the field of bioinformatics research continues to grow, more and more novel techniques are proposed to meet new challenges and improvements upon solutions to long-standing problems. These include data processing techniques and wet lab protocol techniques. Although the literature is consistently thorough in experimental detail and variable-controlling rigor for wet lab protocol techniques, bioinformatics techniques tend to be less described and less controlled. As the validation or rejection of hypotheses rests on the experiment's ability to isolate and measure a variable of interest, we urge the importance of reducing confounding variables in bioinformatics techniques during mass spectrometry experimentation. © The Author 2013. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  11. Mapping accuracy via spectrally and structurally based filtering techniques: comparisons through visual observations

    NASA Astrophysics Data System (ADS)

    Chockalingam, Letchumanan

    2005-01-01

    LANDSAT data over the Gunung Ledang region of Malaysia are used to map certain hydrogeological features. To map these significant features, image-processing tools such as contrast enhancement and edge detection techniques are employed. The advantages of these techniques over other methods are evaluated in terms of their validity in properly isolating features of hydrogeological interest. Because these techniques exploit only the spectral aspects of the images, they have several limitations in meeting the objectives. To address these limitations, a morphological transformation, which considers the structural rather than the spectral aspects of the image, is applied, providing comparisons between the results derived from spectrally based and structurally based filtering techniques.
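
    To make the spectral-versus-structural contrast concrete, the hedged sketch below applies a Sobel gradient (spectral/edge-based) and a morphological gradient (structural) to a synthetic image standing in for a LANDSAT band; it is illustrative only and not the study's processing chain.

    ```python
    import numpy as np
    import cv2

    # Synthetic 8-bit image standing in for a single LANDSAT band.
    rng = np.random.default_rng(0)
    band = (rng.random((128, 128)) * 60).astype(np.uint8)
    band[40:90, 30:100] += 120          # a bright "feature" of interest

    # Spectral/edge-based filtering: Sobel gradient magnitude.
    gx = cv2.Sobel(band, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(band, cv2.CV_64F, 0, 1, ksize=3)
    edges = cv2.convertScaleAbs(np.hypot(gx, gy))

    # Structural filtering: morphological gradient (dilation minus erosion).
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5))
    morph = cv2.morphologyEx(band, cv2.MORPH_GRADIENT, kernel)

    print(edges.mean(), morph.mean())   # compare the two filter responses
    ```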

  12. The ICA Communication Audit and Perceived Communication Effectiveness Changes in 16 Audited Organizations.

    ERIC Educational Resources Information Center

    Brooks, Keith; And Others

    1979-01-01

    Discusses the benefits of the International Communication Association Communication Audit as a methodology for evaluation of organizational communication processes and outcomes. An "after" survey of 16 audited organizations confirmed the audit as a valid diagnostic methodology and organization development intervention technique which…

  13. Bicarbonate of soda paint stripping process validation and material characterization

    NASA Technical Reports Server (NTRS)

    Haas, Michael N.

    1995-01-01

    The Aircraft Production Division at San Antonio Air Logistics Center has conducted extensive investigation into the replacement of hazardous chemicals in aircraft component cleaning, degreasing, and depainting. One of the most viable solutions is process substitution utilizing abrasive techniques. SA-ALC has incorporated the use of Bicarbonate of Soda Blasting as one such substitution. Previous utilization of methylene chloride based chemical strippers and carbon removal agents has been replaced by a walk-in blast booth in which we remove carbon from engine nozzles and various gas turbine engine parts, depaint cowlings, and perform various other functions on a variety of parts. Prior to implementation of this new process, validation of the process was performed, and materials and waste stream characterization studies were conducted. These characterization studies examined the effects of the blasting process on the integrity of the thin-skinned aluminum substrates, the effects of the process on both air emissions and effluent disposal, and the effects on the personnel exposed to the process.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jordan, Amy B.; Stauffer, Philip H.; Reed, Donald T.

    The primary objective of the experimental effort described here is to aid in understanding the complex nature of liquid, vapor, and solid transport occurring around heated nuclear waste in bedded salt. In order to gain confidence in the predictive capability of numerical models, experimental validation must be performed to ensure that (a) hydrological and physicochemical parameters and (b) processes are correctly simulated. The experiments proposed here are designed to study aspects of the system that have not been satisfactorily quantified in prior work. In addition to exploring the complex coupled physical processes in support of numerical model validation, lessons learned from these experiments will facilitate preparations for larger-scale experiments that may utilize similar instrumentation techniques.

  15. Ultrasonic linear array validation via concrete test blocks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoegh, Kyle, E-mail: hoeg0021@umn.edu; Khazanovich, Lev, E-mail: hoeg0021@umn.edu; Ferraro, Chris

    2015-03-31

    Oak Ridge National Laboratory (ORNL) comparatively evaluated the ability of a number of NDE techniques to generate an image of the volume of 6.5′ X 5.0′ X 10″ concrete specimens fabricated at the Florida Department of Transportation (FDOT) NDE Validation Facility in Gainesville, Florida. These test blocks were fabricated to test the ability of various NDE methods to characterize various placements and sizes of rebar as well as simulated cracking and non-consolidation flaws. The first version of the ultrasonic linear array device, MIRA [version 1], was one of 7 different NDE instruments used to characterize the specimens. This paper deals with the ability of this equipment to determine subsurface characterizations such as reinforcing steel relative size, concrete thickness, irregularities, and inclusions using Kirchhoff-based migration techniques. The ability of individual synthetic aperture focusing technique (SAFT) B-scan cross sections resulting from self-contained scans is compared with various processing, analysis, and interpretation methods using the various features fabricated in the specimens for validation. The performance is detailed, especially with respect to the limitations and implications for evaluation of thicker, more heavily reinforced concrete structures.

  16. Applied photo interpretation for airbrush cartography

    NASA Technical Reports Server (NTRS)

    Inge, J. L.; Bridges, P. M.

    1976-01-01

    New techniques of cartographic portrayal have been developed for the compilation of maps of lunar and planetary surfaces. Conventional photo interpretation methods utilizing size, shape, shadow, tone, pattern, and texture are applied to computer processed satellite television images. The variety of the image data allows the illustrator to interpret image details by inter-comparison and intra-comparison of photographs. Comparative judgements are affected by illumination, resolution, variations in surface coloration, and transmission or processing artifacts. The validity of the interpretation process is tested by making a representational drawing by an airbrush portrayal technique. Production controls insure the consistency of a map series. Photo interpretive cartographic portrayal skills are used to prepare two kinds of map series and are adaptable to map products of different kinds and purposes.

  17. Lagrangian Modeling of Evaporating Sprays at Diesel Engine Conditions: Effects of Multi-Hole Injector Nozzles With JP-8 Surrogates

    DTIC Science & Technology

    2014-05-01

    solver to treat the spray process. An Adaptive Mesh Refinement (AMR) and fixed embedding technique is employed to capture the gas-liquid interface with high fidelity while keeping the cell ... in single and multi-hole nozzle configurations. The models were added to the present CONVERGE liquid fuel database and validated extensively

  18. A simple iterative independent component analysis algorithm for vibration source signal identification of complex structures

    NASA Astrophysics Data System (ADS)

    Lee, Dong-Sup; Cho, Dae-Seung; Kim, Kookhyun; Jeon, Jae-Jin; Jung, Woo-Jin; Kang, Myeng-Hwan; Kim, Jae-Ho

    2015-01-01

    Independent Component Analysis (ICA), one of the blind source separation methods, can be applied for extracting unknown source signals only from received signals. This is accomplished by finding statistical independence of signal mixtures and has been successfully applied to myriad fields such as medical science, image processing, and numerous others. Nevertheless, there are inherent problems that have been reported when using this technique: instability and invalid ordering of separated signals, particularly when using a conventional ICA technique in vibratory source signal identification of complex structures. In this study, a simple iterative algorithm of the conventional ICA has been proposed to mitigate these problems. The proposed method to extract more stable source signals having valid order includes an iterative and reordering process of extracted mixing matrix to reconstruct finally converged source signals, referring to the magnitudes of correlation coefficients between the intermediately separated signals and the signals measured on or nearby sources. In order to review the problems of the conventional ICA technique and to validate the proposed method, numerical analyses have been carried out for a virtual response model and a 30 m class submarine model. Moreover, in order to investigate applicability of the proposed method to real problem of complex structure, an experiment has been carried out for a scaled submarine mockup. The results show that the proposed method could resolve the inherent problems of a conventional ICA technique.
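
    The hedged sketch below illustrates the general idea of ICA separation followed by reordering of the recovered components via correlation with reference signals measured on or near the sources. It uses sklearn's FastICA on synthetic mixtures rather than the authors' iterative algorithm, and the signals are assumptions.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 2000)

    # Two "vibration" sources and a random mixing into three sensor channels.
    sources = np.c_[np.sin(2 * np.pi * 50 * t), np.sign(np.sin(2 * np.pi * 13 * t))]
    mixing = rng.normal(size=(2, 3))
    observations = sources @ mixing + 0.05 * rng.normal(size=(t.size, 3))

    # Blind separation with FastICA.
    ica = FastICA(n_components=2, random_state=0)
    separated = ica.fit_transform(observations)          # shape (n_samples, 2)

    # Reference signals measured on/near the sources (here: the sources + noise).
    references = sources + 0.1 * rng.normal(size=sources.shape)

    # Reorder the separated components so component i best matches reference i,
    # using the magnitude of the correlation coefficient.
    corr = np.abs(np.corrcoef(separated.T, references.T)[:2, 2:])
    order = corr.argmax(axis=0)
    print(order)                      # e.g. [1 0] -> swap to restore a valid order
    separated = separated[:, order]
    ```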

  19. The application of welat latino for creating paes in solo wedding bride

    NASA Astrophysics Data System (ADS)

    Ihsani, Ade Novi Nurul; Krisnawati, Maria; Prasetyaningtyas, Wulansari; Anggraeni, Puput; Bela, Herlina Tria; Zunaedah, Putri Wahyu

    2018-03-01

    The purposes of this research were: 1) to find out the process of creating an innovative welat, and 2) to find out how to use the innovative welat for creating paes for the Solo wedding bride. The method used in the research was research and development (R & D). The sampling technique in this research was purposive sampling, using 13 people as models. The data collection techniques were observation and documentation, and the data were analyzed descriptively. The results of the study showed that 1) the welat design was revised twice during validation, with each product passing through the stages of designing, forming, determining the material and printing, and 2) when using the welat, the dot distance between the cengkorongan of both forms is first determined using the welat according to the existing mold. In conclusion, the innovative welat can produce paes in accordance with the standard and shorten the process.

  20. Simultaneous overpass off nadir (SOON): a method for unified calibration/validation across IEOS and GEOSS system of systems

    NASA Astrophysics Data System (ADS)

    Ardanuy, Philip; Bergen, Bill; Huang, Allen; Kratz, Gene; Puschell, Jeff; Schueler, Carl; Walker, Joe

    2006-08-01

    The US operates a diverse, evolving constellation of research and operational environmental satellites, principally in polar and geosynchronous orbits. Our current and enhanced future domestic remote sensing capability is complemented by the significant capabilities of our current and potential future international partners. In this analysis, we define "success" through the data customers' "eyes": participating in the sufficient and continuously improving satisfaction of their mission responsibilities. To successfully fuse together observations from multiple simultaneous platforms and sensors into a common, self-consistent, operational environment requires that there exist a unified calibration and validation approach. Here, we develop a concept for an integrating framework for absolute accuracy; long-term stability; self-consistency among sensors, platforms, techniques, and observing systems; and validation and characterization of performance. Across all systems, this is a non-trivial problem. Simultaneous Nadir Overpasses, or SNOs, provide a proven intercomparison technique: simultaneous, collocated, co-angular measurements. Many systems have off-nadir elements, or effects, that must be calibrated. For these systems, the nadir technique constrains the process. We define the term "SOON," for simultaneous overpass off nadir. We present a target architecture and sensitivity analysis for the affordable, sustainable implementation of a global SOON calibration/validation network that can deliver the much-needed comprehensive, common, self-consistent operational picture in near-real time, at an affordable cost.

  1. Validation of Computational Models in Biomechanics

    PubMed Central

    Henninger, Heath B.; Reese, Shawn P.; Anderson, Andrew E.; Weiss, Jeffrey A.

    2010-01-01

    The topics of verification and validation (V&V) have increasingly been discussed in the field of computational biomechanics, and many recent articles have applied these concepts in an attempt to build credibility for models of complex biological systems. V&V are evolving techniques that, if used improperly, can lead to false conclusions about a system under study. In basic science these erroneous conclusions may lead to failure of a subsequent hypothesis, but they can have more profound effects if the model is designed to predict patient outcomes. While several authors have reviewed V&V as they pertain to traditional solid and fluid mechanics, it is the intent of this manuscript to present them in the context of computational biomechanics. Specifically, the task of model validation will be discussed with a focus on current techniques. It is hoped that this review will encourage investigators to engage and adopt the V&V process in an effort to increase peer acceptance of computational biomechanics models. PMID:20839648

  2. Conceptual dissonance: evaluating the efficacy of natural language processing techniques for validating translational knowledge constructs.

    PubMed

    Payne, Philip R O; Kwok, Alan; Dhaval, Rakesh; Borlawsky, Tara B

    2009-03-01

    The conduct of large-scale translational studies presents significant challenges related to the storage, management and analysis of integrative data sets. Ideally, the application of methodologies such as conceptual knowledge discovery in databases (CKDD) provides a means for moving beyond intuitive hypothesis discovery and testing in such data sets, and towards the high-throughput generation and evaluation of knowledge-anchored relationships between complex bio-molecular and phenotypic variables. However, the induction of such high-throughput hypotheses is non-trivial, and requires correspondingly high-throughput validation methodologies. In this manuscript, we describe an evaluation of the efficacy of a natural language processing-based approach to validating such hypotheses. As part of this evaluation, we will examine a phenomenon that we have labeled as "Conceptual Dissonance" in which conceptual knowledge derived from two or more sources of comparable scope and granularity cannot be readily integrated or compared using conventional methods and automated tools.

  3. The development of an integrated assessment instrument for measuring analytical thinking and science process skills

    NASA Astrophysics Data System (ADS)

    Irwanto, Rohaeti, Eli; LFX, Endang Widjajanti; Suyanta

    2017-05-01

    This research aims to develop an instrument and determine the characteristics of an integrated assessment instrument. This research uses the 4-D model, which includes the define, design, develop, and disseminate stages. The primary product was validated by expert judgment, tested for readability by students, and assessed for feasibility by chemistry teachers. This research involved 246 students of grade XI of four senior high schools in Yogyakarta, Indonesia. Data collection techniques included interviews, questionnaires, and tests. Data collection instruments included an interview guideline, an item validation sheet, a users' response questionnaire, an instrument readability questionnaire, and an essay test. The results show that the integrated assessment instrument has an Aiken validity value of 0.95. Item reliability was 0.99 and person reliability was 0.69. Teachers' response to the integrated assessment instrument was very good. Therefore, the integrated assessment instrument is feasible for measuring the students' analytical thinking and science process skills.
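
    For reference, the snippet below shows how an Aiken's V coefficient of the kind reported above is commonly computed from expert ratings; the number of raters and the rating scale are hypothetical, not the study's data.

    ```python
    def aikens_v(ratings, lo=1, hi=5):
        """Aiken's V = sum(r_i - lo) / (n * (hi - lo)) for n expert ratings
        on a scale from lo to hi; values near 1 indicate high content validity."""
        n = len(ratings)
        return sum(r - lo for r in ratings) / (n * (hi - lo))

    # Hypothetical ratings of one essay item by five expert validators (1-5 scale)
    print(round(aikens_v([5, 5, 4, 5, 5]), 2))   # 0.95
    ```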

  4. Analyzing psychotherapy process as intersubjective sensemaking: an approach based on discourse analysis and neural networks.

    PubMed

    Nitti, Mariangela; Ciavolino, Enrico; Salvatore, Sergio; Gennaro, Alessandro

    2010-09-01

    The authors propose a method for analyzing the psychotherapy process: discourse flow analysis (DFA). DFA is a technique representing the verbal interaction between therapist and patient as a discourse network, aimed at measuring the therapist-patient discourse ability to generate new meanings through time. DFA assumes that the main function of psychotherapy is to produce semiotic novelty. DFA is applied to the verbatim transcript of the psychotherapy. It defines the main meanings active within the therapeutic discourse by means of the combined use of text analysis and statistical techniques. Subsequently, it represents the dynamic interconnections among these meanings in terms of a "discursive network." The dynamic and structural indexes of the discursive network have been shown to provide a valid representation of the patient-therapist communicative flow as well as an estimation of its clinical quality. Finally, a neural network is designed specifically to identify patterns of functioning of the discursive network and to verify the clinical validity of these patterns in terms of their association with specific phases of the psychotherapy process. An application of the DFA to a case of psychotherapy is provided to illustrate the method and the kinds of results it produces.

  5. Process-based costing.

    PubMed

    Lee, Robert H; Bott, Marjorie J; Forbes, Sarah; Redford, Linda; Swagerty, Daniel L; Taunton, Roma Lee

    2003-01-01

    Understanding how quality improvement affects costs is important. Unfortunately, low-cost, reliable ways of measuring direct costs are scarce. This article builds on the principles of process improvement to develop a costing strategy that meets both criteria. Process-based costing has 4 steps: developing a flowchart, estimating resource use, valuing resources, and calculating direct costs. To illustrate the technique, this article uses it to cost the care planning process in 3 long-term care facilities. We conclude that process-based costing is easy to implement; generates reliable, valid data; and allows nursing managers to assess the costs of new or modified processes.
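
    A toy illustration of the four-step costing logic (flowchart steps, resource use, resource values, direct cost); the step names, minutes, and rates are hypothetical.

    ```python
    # Each flowchart step of the care-planning process with estimated resource
    # use (minutes of staff time) and the unit value of that resource ($/minute).
    steps = [
        {"step": "assessment",      "minutes": 45, "rate_per_min": 0.75},
        {"step": "care conference", "minutes": 30, "rate_per_min": 0.90},
        {"step": "documentation",   "minutes": 20, "rate_per_min": 0.60},
    ]

    direct_cost = sum(s["minutes"] * s["rate_per_min"] for s in steps)
    print(f"direct cost per care plan: ${direct_cost:.2f}")   # $72.75
    ```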

  6. Fostering and Assessing Creativity in Technology Education

    ERIC Educational Resources Information Center

    Buelin-Biesecker, Jennifer Katherine

    2012-01-01

    This study compared the creative outcomes in student work resulting from two pedagogical approaches to creative problem solving activities. A secondary goal was to validate the Consensual Assessment Technique (CAT) as a means of assessing creativity. Linear models for problem solving and design processes serve as the current paradigm in classroom…

  7. A simple enrichment correction factor for improving erosion estimation by rare earth oxide tracers

    USDA-ARS?s Scientific Manuscript database

    Spatially distributed soil erosion data are needed to better understand soil erosion processes and validate distributed erosion models. Rare earth element (REE) oxides were used to generate spatial erosion data. However, a general concern about the accuracy of the technique arose due to selective ...

  8. Development and Validation of a New Technique for Detection of Stress and Pregnancy

    DTIC Science & Technology

    2014-09-30

    of stress hormone levels in a female Steller sea lion (Eumetopias jubatus) pup undergoing rehabilitation. J. Zoo and Wildl. Med. 37(1): 75-78. ... Atkinson, S. 2012. Changes during the rehabilitation process elicit endocrine responses in developing harbor seal (Phoca vitulina) pups. Zoo Biol. 32

  9. Note: Methodology for the analysis of Bluetooth gateways in an implemented scatternet.

    PubMed

    Etxaniz, J; Monje, P M; Aranguren, G

    2014-03-01

    This Note introduces a novel methodology to analyze the time performance of Bluetooth gateways in multi-hop networks, known as scatternets. The methodology is focused on distinguishing between the processing time and the time that each communication between nodes takes along an implemented scatternet. This technique is not only valid for Bluetooth networks but also for other wireless networks that offer access to their middleware in order to include beacons in the operation of the nodes. We show in this Note the results of the tests carried out on a Bluetooth scatternet in order to highlight the reliability and effectiveness of the methodology. The results also validate this technique showing convergence in the results when subtracting the time for the beacons from the delay measurements.
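
    As a hedged illustration of the timing decomposition described above, the toy calculation below separates per-node processing time from inter-node transmission time using beacon-style timestamps; the node names and timestamps are hypothetical, not measurements from the implemented scatternet.

    ```python
    # Hypothetical beacon timestamps (ms) recorded as a packet crosses a
    # three-node Bluetooth scatternet: (node, arrival time, departure time).
    hops = [
        ("master",   0.0,  4.1),
        ("gateway",  9.8, 16.3),
        ("slave",   22.0, 23.5),
    ]

    for i, (name, t_in, t_out) in enumerate(hops):
        print(f"{name}: processing time = {t_out - t_in:.1f} ms")
        if i + 1 < len(hops):
            link = hops[i + 1][1] - t_out
            print(f"{name} -> {hops[i + 1][0]}: transmission time = {link:.1f} ms")
    ```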

  10. Numerical and experimental analysis of a ducted propeller designed by a fully automated optimization process under open water condition

    NASA Astrophysics Data System (ADS)

    Yu, Long; Druckenbrod, Markus; Greve, Martin; Wang, Ke-qi; Abdel-Maksoud, Moustafa

    2015-10-01

    A fully automated optimization process is provided for the design of ducted propellers under open water conditions, including 3D geometry modeling, meshing, optimization algorithms and CFD analysis techniques. The developed process allows the direct integration of a RANSE solver in the design stage. A practical ducted propeller design case study is carried out for validation. Numerical simulations and open water tests were carried out and showed that the optimized ducted propeller improves hydrodynamic performance as predicted.

  11. Detection of delamination defects in CFRP materials using ultrasonic signal processing.

    PubMed

    Benammar, Abdessalem; Drai, Redouane; Guessoum, Abderrezak

    2008-12-01

    In this paper, signal processing techniques are tested for their ability to resolve echoes associated with delaminations in carbon fiber-reinforced polymer multi-layered composite materials (CFRP) detected by ultrasonic methods. These methods include split spectrum processing (SSP) and the expectation-maximization (EM) algorithm. A simulation study on defect detection was performed, and results were validated experimentally on CFRP with and without delamination defects taken from aircraft. Comparison of the methods for their ability to resolve echoes are made.
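
    The hedged sketch below shows one common split spectrum processing variant (a bank of overlapping band-pass filters whose envelopes are recombined with a pointwise minimum) applied to a synthetic A-scan; the band layout and signal are assumptions, and the EM-based deconvolution step is omitted.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def split_spectrum_min(signal, fs, bands):
        """Band-pass the A-scan into overlapping sub-bands and recombine the
        sub-band envelopes with a pointwise minimum, which tends to suppress
        frequency-dependent structural noise relative to flaw echoes."""
        outputs = []
        for lo, hi in bands:
            b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
            outputs.append(np.abs(hilbert(filtfilt(b, a, signal))))
        return np.min(np.vstack(outputs), axis=0)

    # Synthetic A-scan: a 5 MHz echo buried in broadband noise (fs = 50 MHz).
    fs = 50e6
    t = np.arange(0, 20e-6, 1 / fs)
    echo = np.exp(-((t - 8e-6) ** 2) / (0.4e-6) ** 2) * np.sin(2 * np.pi * 5e6 * t)
    ascan = echo + 0.3 * np.random.default_rng(0).normal(size=t.size)

    bands = [(3e6, 5e6), (4e6, 6e6), (5e6, 7e6)]
    processed = split_spectrum_min(ascan, fs, bands)
    print(processed.argmax() / fs)   # time of the strongest (flaw) echo, ~8 us
    ```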

  12. Managing design excellence tools during the development of new orthopaedic implants.

    PubMed

    Défossez, Henri J P; Serhan, Hassan

    2013-11-01

    Design excellence (DEX) tools have been widely used for years in some industries for their potential to facilitate new product development. The medical sector, targeted by cost pressures, has therefore started adopting them. Numerous tools are available; however only appropriate deployment during the new product development stages can optimize the overall process. The primary study objectives were to describe generic tools and illustrate their implementation and management during the development of new orthopaedic implants, and compile a reference package. Secondary objectives were to present the DEX tool investment costs and savings, since the method can require significant resources for which companies must carefully plan. The publicly available DEX method "Define Measure Analyze Design Verify Validate" was adopted and implemented during the development of a new spinal implant. Several tools proved most successful at developing the correct product, addressing clinical needs, and increasing market penetration potential, while reducing design iterations and manufacturing validations. Cost analysis and the Pugh Matrix, coupled with multi-generation planning, enabled the development of a strong rationale to activate the project and set the vision and goals. Improved risk management and a product usage map established a robust technical verification-validation program. Design of experiments and process quantification facilitated design for manufacturing of critical features, as early as the concept phase. Biomechanical testing with analysis of variance provided a validation model with a recognized statistical performance baseline. Within those tools, only certain ones required minimum resources (i.e., business case, multi-generational plan, project value proposition, Pugh Matrix, critical-to-quality process validation techniques), while others required significant investments (i.e., voice of customer, product usage map, improved risk management, design of experiments, biomechanical testing techniques). All used techniques provided savings exceeding investment costs. Some other tools were considered and found less relevant. A matrix summarized the investment costs and the estimated savings generated. Globally, all companies can benefit from using DEX by smartly selecting and estimating those tools with the best return on investment at the start of the project. For this, a good understanding of the available company resources, background and development strategy is needed. In conclusion, it was possible to illustrate that appropriate management of design excellence tools can greatly facilitate the development of new orthopaedic implant systems.

  13. Menstrual blood loss measurement: validation of the alkaline hematin technique for feminine hygiene products containing superabsorbent polymers.

    PubMed

    Magnay, Julia L; Nevatte, Tracy M; Dhingra, Vandana; O'Brien, Shaughn

    2010-12-01

    To validate the alkaline hematin technique for measurement of menstrual blood loss using ultra-thin sanitary towels that contain superabsorbent polymer granules as the absorptive agent. Laboratory study using simulated menstrual fluid (SMF) and Always Ultra Normal, Long, and Night "with wings" sanitary towels. Setting: Keele Menstrual Disorders Laboratory. Patient(s): none. Intervention(s): none. Recovery of blood, linearity, and interassay variation over a range of SMF volumes applied to towels. Because of the variable percentage of blood in menstrual fluid, blood recovery was assessed from SMF constituted as 10%, 25%, 50%, and 100% blood. The lower limit of reliable detection and the effect of storing soiled towels for up to 4 weeks at 15°C-20°C, 4°C, and -20°C before analysis were determined. Ninety percent recovery was reproducibly achieved up to 30 mL applied volume at all tested SMF compositions, except at low volume or high dilution equivalent to <2 mL whole blood. Samples could be stored for 3 weeks at all tested temperatures without loss of recovery. The technique was suitable for processing towels individually or in batches. The alkaline hematin technique is a suitable and validated method for measuring menstrual blood loss from Always Ultra sanitary towels that contain superabsorbent polymers. Copyright © 2010 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  14. Evaluating the statistical performance of less applied algorithms in classification of worldview-3 imagery data in an urbanized landscape

    NASA Astrophysics Data System (ADS)

    Ranaie, Mehrdad; Soffianian, Alireza; Pourmanafi, Saeid; Mirghaffari, Noorollah; Tarkesh, Mostafa

    2018-03-01

    In the recent decade, analyzing remotely sensed imagery has become one of the most common and widely used procedures in environmental studies, and supervised image classification techniques play a central role in it. Using a high-resolution Worldview-3 image over a mixed urbanized landscape in Iran, three less commonly applied image classification methods (bagged CART, stochastic gradient boosting, and neural network with feature extraction) were tested and compared with two prevalent methods: random forest and support vector machine with a linear kernel. Each method was run ten times, and three validation techniques were used to estimate the accuracy statistics: cross-validation, independent validation, and validation with the full training data. Moreover, the statistical significance of differences between the classification methods was assessed using ANOVA and Tukey tests. In general, the results showed that random forest, by a marginal difference over bagged CART and stochastic gradient boosting, is the best-performing method, whereas based on independent validation there was no significant difference between the performances of the classification methods. Finally, the neural network with feature extraction and the linear support vector machine had better processing speed than the others.
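
    A hedged sklearn sketch of the comparison logic (two of the tested classifiers evaluated with repeated cross-validation) is given below; synthetic data stands in for the Worldview-3 pixel samples, and the settings are illustrative rather than those used in the study.

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.svm import SVC
    from sklearn.model_selection import StratifiedKFold, cross_val_score

    # Synthetic stand-in for per-pixel Worldview-3 features and land-cover labels.
    X, y = make_classification(n_samples=1000, n_features=8, n_informative=5,
                               n_classes=4, random_state=0)

    models = {
        "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
        "linear SVM":    SVC(kernel="linear", C=1.0),
    }

    # Ten repetitions of 5-fold cross-validation per classifier, mirroring the
    # idea of repeated runs with accuracy estimated by cross-validation.
    for name, model in models.items():
        scores = []
        for run in range(10):
            cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=run)
            scores.append(cross_val_score(model, X, y, cv=cv).mean())
        print(f"{name}: mean accuracy {np.mean(scores):.3f} +/- {np.std(scores):.3f}")
    ```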

  15. Burn-injured tissue detection for debridement surgery through the combination of non-invasive optical imaging techniques.

    PubMed

    Heredia-Juesas, Juan; Thatcher, Jeffrey E; Lu, Yang; Squiers, John J; King, Darlene; Fan, Wensheng; DiMaio, J Michael; Martinez-Lorenzo, Jose A

    2018-04-01

    The process of burn debridement is a challenging technique requiring significant skill to identify the regions that need excision and their appropriate excision depths. In order to assist surgeons, a machine learning tool is being developed to provide a quantitative assessment of burn-injured tissue. This paper presents three non-invasive optical imaging techniques capable of distinguishing four kinds of tissue (healthy skin, viable wound bed, shallow burn, and deep burn) during serial burn debridement in a porcine model. All combinations of these three techniques have been studied through a k-fold cross-validation method. In terms of global performance, the combination of all three techniques significantly improves the classification accuracy with respect to just one technique, from 0.42 up to more than 0.76. Furthermore, non-linear spatial filtering based on the mode of a small neighborhood has been applied as a post-processing technique, in order to improve the performance of the classification. Using this technique, the global accuracy reaches a value close to 0.78 and, for some particular tissues and combinations of techniques, the accuracy improves by 13%.
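
    The post-processing step can be illustrated with a mode filter over a small neighborhood of a per-pixel label map, as in the hedged sketch below; the toy label map and window size are assumptions, not the porcine-model data.

    ```python
    import numpy as np
    from scipy.ndimage import generic_filter

    # Toy per-pixel classification map (0=healthy, 1=wound bed, 2=shallow, 3=deep)
    rng = np.random.default_rng(0)
    labels = np.full((40, 40), 1, dtype=int)
    labels[10:30, 10:30] = 3                       # a "deep burn" region
    noise = rng.random(labels.shape) < 0.1         # 10% isolated misclassifications
    labels[noise] = rng.integers(0, 4, size=noise.sum())

    # Non-linear spatial filter: replace each pixel by the mode of its 5x5 neighborhood.
    def neighborhood_mode(values):
        return np.bincount(values.astype(int)).argmax()

    smoothed = generic_filter(labels, neighborhood_mode, size=5)
    print((smoothed != labels).sum(), "pixels relabelled by the mode filter")
    ```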

  16. Development and Experimental Validation of Large Eddy Simulation Techniques for the Prediction of Combustion-Dynamic Process in Syngas Combustion: Characterization of Autoignition, Flashback, and Flame-Liftoff at Gas-Turbine Relevant Operating Conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ihme, Matthias; Driscoll, James

    2015-08-31

    The objective of this closely coordinated experimental and computational research effort is the development of simulation techniques for the prediction of combustion processes relevant to the oxidation of syngas and high hydrogen content (HHC) fuels at gas-turbine relevant operating conditions. Specifically, the research goals are (i) the characterization of the sensitivity of syngas ignition processes to hydrodynamic processes and perturbations in temperature and mixture composition in rapid compression machines and flow reactors and (ii) to conduct comprehensive experimental investigations in a swirl-stabilized gas turbine (GT) combustor under realistic high-pressure operating conditions in order (iii) to obtain fundamental understanding about mechanisms controlling unstable flame regimes in HHC combustion.

  17. Validation of New Wind Resource Maps

    NASA Astrophysics Data System (ADS)

    Elliott, D.; Schwartz, M.

    2002-05-01

    The National Renewable Energy Laboratory (NREL) recently led a project to validate updated state wind resource maps for the northwestern United States produced by a private U.S. company, TrueWind Solutions (TWS). The independent validation project was a cooperative activity among NREL, TWS, and meteorological consultants. The independent validation concept originated at a May 2001 technical workshop held at NREL to discuss updating the Wind Energy Resource Atlas of the United States. Part of the workshop, which included more than 20 attendees from the wind resource mapping and consulting community, was dedicated to reviewing the latest techniques for wind resource assessment. It became clear that using a numerical modeling approach for wind resource mapping was rapidly gaining ground as a preferred technique and, if the trend continues, it will soon become the most widely used technique around the world. The numerical modeling approach is a relatively fast application compared to older mapping methods and, in theory, should be quite accurate because it directly estimates the magnitude of boundary-layer processes that affect the wind resource of a particular location. Numerical modeling output combined with high resolution terrain data can produce useful wind resource information at a resolution of 1 km or lower. However, because the use of the numerical modeling approach is new (last 3-5 years) and relatively unproven, meteorological consultants question the accuracy of the approach. It was clear that new state or regional wind maps produced by this method would have to undergo independent validation before the results would be accepted by the wind energy community and developers.

  18. Validation of Reference Genes in mRNA Expression Analysis Applied to the Study of Asthma.

    PubMed

    Segundo-Val, Ignacio San; Sanz-Lozano, Catalina S

    2016-01-01

    The quantitative polymerase chain reaction (qPCR) is the most widely used technique for the study of gene expression. To correct for experimental errors in this technique, it is necessary to normalize the expression results of the gene of interest against those obtained for reference genes. Here, we describe an example of the process of selecting reference genes. In this particular case, we select reference genes for expression studies in the peripheral blood mononuclear cells of asthmatic patients.
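
    As a hedged illustration of the normalization step described here, the snippet below applies the standard 2^-ΔΔCt calculation against a reference gene using hypothetical Ct values.

    ```python
    # Hypothetical qPCR Ct values for a gene of interest and a reference gene
    # in an asthmatic sample and a control sample.
    ct = {
        "asthma":  {"gene_of_interest": 24.1, "reference": 18.0},
        "control": {"gene_of_interest": 26.0, "reference": 18.2},
    }

    # Normalize to the reference gene, then to the control sample (2^-ddCt method).
    d_ct_asthma = ct["asthma"]["gene_of_interest"] - ct["asthma"]["reference"]
    d_ct_control = ct["control"]["gene_of_interest"] - ct["control"]["reference"]
    fold_change = 2 ** -(d_ct_asthma - d_ct_control)

    print(f"relative expression (asthma vs control): {fold_change:.2f}-fold")
    ```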

  19. The design and instrumentation of the Purdue annular cascade facility with initial data acquisition and analysis

    NASA Technical Reports Server (NTRS)

    Stauter, R. C.; Fleeter, S.

    1982-01-01

    Three dimensional aerodynamic data, required to validate and/or indicate necessary refinements to inviscid and viscous analyses of the flow through turbomachine blade rows, are discussed. Instrumentation and capabilities for pressure measurement, probe insertion and traversing, and flow visualization are reviewed. Advanced measurement techniques including Laser Doppler Anemometers, are considered. Data processing is reviewed. Predictions were correlated with the experimental data. A flow visualization technique using helium filled soap bubbles was demonstrated.

  20. Advanced bulk processing of lightweight materials for utilization in the transportation sector

    NASA Astrophysics Data System (ADS)

    Milner, Justin L.

    The overall objective of this research is to develop the microstructure of metallic lightweight materials via multiple advanced processing techniques with potential for industrial utilization on a large scale to meet the demands of the aerospace and automotive sectors. This work focused on (i) refining the grain structure to increase the strength, (ii) controlling the texture to increase formability and (iii) directly reducing the processing/production cost of lightweight material components. Advanced processing is conducted on a bulk scale by several severe plastic deformation techniques including: accumulative roll bonding, isolated shear rolling and friction stir processing to achieve the multiple targets of this research. Development and validation of the processing techniques is achieved through wide-ranging experiments along with detailed mechanical and microstructural examination of the processed material. On a broad level, this research will make advancements in processing of bulk lightweight materials, facilitating industrial-scale implementation. Accumulative roll bonding and isolated shear rolling, which are currently feasible on an industrial scale, process bulk sheet materials capable of replacing more expensive grades of alloys and enabling low-temperature and high-strain-rate formability. Furthermore, friction stir processing to manufacture lightweight tubes, made from magnesium alloys, has the potential to increase the utilization of these materials in the automotive and aerospace sectors for high-strength, high-formability applications. Increased utilization of these advanced processing techniques will significantly reduce the cost associated with lightweight materials for many applications in the transportation sectors.

  1. Generalized ISAR--part II: interferometric techniques for three-dimensional location of scatterers.

    PubMed

    Given, James A; Schmidt, William R

    2005-11-01

    This paper is the second part of a study dedicated to optimizing diagnostic inverse synthetic aperture radar (ISAR) studies of large naval vessels. The method developed here provides accurate determination of the position of important radio-frequency scatterers by combining accurate knowledge of ship position and orientation with specialized signal processing. The method allows for the simultaneous presence of substantial Doppler returns from both change of roll angle and change of aspect angle by introducing generalized ISAR rates. The first paper provides two modes of interpreting ISAR plots, one valid when roll Doppler is dominant, the other valid when the aspect-angle Doppler is dominant. Here, we provide, for each type of ISAR plot technique, a corresponding interferometric ISAR (InSAR) technique. The former, aspect-angle dominated InSAR, is a generalization of standard InSAR; the latter, roll-angle dominated InSAR, appears to be new to this work. Both methods are shown to be efficient at identifying localized scatterers under simulation conditions.

  2. A strategy for selecting data mining techniques in metabolomics.

    PubMed

    Banimustafa, Ahmed Hmaidan; Hardy, Nigel W

    2012-01-01

    There is a general agreement that the development of metabolomics depends not only on advances in chemical analysis techniques but also on advances in computing and data analysis methods. Metabolomics data usually requires intensive pre-processing, analysis, and mining procedures. Selecting and applying such procedures requires attention to issues including justification, traceability, and reproducibility. We describe a strategy for selecting data mining techniques which takes into consideration the goals of data mining techniques on the one hand, and the goals of metabolomics investigations and the nature of the data on the other. The strategy aims to ensure the validity and soundness of results and promote the achievement of the investigation goals.

  3. A formal approach to validation and verification for knowledge-based control systems

    NASA Technical Reports Server (NTRS)

    Castore, Glen

    1987-01-01

    As control systems become more complex in response to desires for greater system flexibility, performance and reliability, the promise is held out that artificial intelligence might provide the means for building such systems. An obstacle to the use of symbolic processing constructs in this domain is the need for verification and validation (V and V) of the systems. Techniques currently in use do not seem appropriate for knowledge-based software. An outline of a formal approach to V and V for knowledge-based control systems is presented.

  4. RTL validation methodology on high complexity wireless microcontroller using OVM technique for fast time to market

    NASA Astrophysics Data System (ADS)

    Zhafirah Muhammad, Nurul; Harun, A.; Hambali, N. A. M. A.; Murad, S. A. Z.; Mohyar, S. N.; Isa, M. N.; Jambek, AB

    2017-11-01

    Increased demand for Internet of Things (IoT) applications has driven a move towards higher-complexity integrated circuits and systems-on-chip (SoC). This increase in complexity calls for correspondingly sophisticated validation strategies, and researchers have responded with a range of methodologies, including dynamic verification, formal verification, and hybrid techniques. It is important to discover bugs early in the SoC verification process in order to reduce effort and achieve a fast time to market. This paper therefore focuses on verification methodology at the Register Transfer Level (RTL) of an SoC based on the AMBA bus design. The Open Verification Methodology (OVM) offers an easier route to RTL validation, not as a replacement for traditional methods but as a means of shortening time to market. Thus, OVM is proposed in this paper as the verification method for large designs, to avoid bottlenecks in the validation platform.

  5. Nonparametric estimation of the heterogeneity of a random medium using compound Poisson process modeling of wave multiple scattering.

    PubMed

    Le Bihan, Nicolas; Margerin, Ludovic

    2009-07-01

    In this paper, we present a nonparametric method to estimate the heterogeneity of a random medium from the angular distribution of intensity of waves transmitted through a slab of random material. Our approach is based on the modeling of forward multiple scattering using compound Poisson processes on compact Lie groups. The estimation technique is validated through numerical simulations based on radiative transfer theory.
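
    To make the compound Poisson idea concrete, the following is a minimal scalar sketch: the number of scattering events across the slab is Poisson distributed, and each event adds a small random deflection. The paper itself works with compound Poisson processes on compact Lie groups; this 1-D angular analogue and its parameter values are illustrative assumptions only.

      # Minimal scalar sketch of a compound Poisson model for forward
      # multiple scattering; parameters are assumed for illustration.
      import numpy as np

      rng = np.random.default_rng(0)

      def transmitted_angles(n_waves, mean_events, kappa=0.2):
          """Simulate exit angles after a Poisson number of small deflections."""
          n_events = rng.poisson(mean_events, size=n_waves)
          return np.array([rng.normal(0.0, kappa, k).sum() for k in n_events])

      angles = transmitted_angles(n_waves=10000, mean_events=5.0)
      print("Angular spread (std, rad):", angles.std())
      # In the inverse problem, the mean number of events (i.e. the slab's
      # scattering mean free path) is estimated from the observed
      # angular distribution of transmitted intensity.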

  6. Modeling biological gradient formation: combining partial differential equations and Petri nets.

    PubMed

    Bertens, Laura M F; Kleijn, Jetty; Hille, Sander C; Heiner, Monika; Koutny, Maciej; Verbeek, Fons J

    2016-01-01

    Both Petri nets and differential equations are important modeling tools for biological processes. In this paper we demonstrate how these two modeling techniques can be combined to describe biological gradient formation. Parameters derived from a partial differential equation describing the process of gradient formation are incorporated into an abstract Petri net model. The quantitative aspects of the resulting model are validated through a case study of gradient formation in the fruit fly.
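
    For orientation, the following is a minimal sketch of the kind of diffusion-degradation PDE that underlies morphogen gradient formation, solved with an explicit finite-difference scheme. The equation form, boundary conditions and parameter values are generic illustrations, not the specific model used in the cited study.

      # Minimal sketch of 1-D gradient formation, dC/dt = D d2C/dx2 - k C,
      # with a constant source at x = 0. Parameter values are assumptions.
      import numpy as np

      D, k = 1.0, 0.5          # diffusion coefficient, degradation rate
      L, nx = 10.0, 101        # domain length, grid points
      dx = L / (nx - 1)
      dt = 0.4 * dx**2 / D     # stable explicit time step
      c = np.zeros(nx)

      for _ in range(20000):
          lap = (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
          c += dt * (D * lap - k * c)
          c[0] = 1.0            # constant source at the left boundary
          c[-1] = c[-2]         # zero-flux boundary on the right

      # Near steady state the profile decays roughly as exp(-x * sqrt(k/D));
      # such derived parameters are what would feed the Petri net layer.
      print(c[:10])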

  7. An intelligent signal processing and pattern recognition technique for defect identification using an active sensor network

    NASA Astrophysics Data System (ADS)

    Su, Zhongqing; Ye, Lin

    2004-08-01

    The practical utilization of elastic waves, e.g. Rayleigh-Lamb waves, in high-performance structural health monitoring techniques is somewhat impeded due to the complicated wave dispersion phenomena, the existence of multiple wave modes, the high susceptibility to diverse interferences, the bulky sampled data and the difficulty in signal interpretation. An intelligent signal processing and pattern recognition (ISPPR) approach using the wavelet transform and artificial neural network algorithms was developed; this was actualized in a signal processing package (SPP). The ISPPR technique comprehensively functions as signal filtration, data compression, characteristic extraction, information mapping and pattern recognition, capable of extracting essential yet concise features from acquired raw wave signals and further assisting in structural health evaluation. For validation, the SPP was applied to the prediction of crack growth in an alloy structural beam and construction of a damage parameter database for defect identification in CF/EP composite structures. It was clearly apparent that the elastic wave propagation-based damage assessment could be dramatically streamlined by introduction of the ISPPR technique.

  8. Estimation of the Scatterer Distribution of the Cirrhotic Liver using Ultrasonic Image

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Tadashi; Hachiya, Hiroyuki

    1998-05-01

    In the B-mode image of the liver obtained by an ultrasonic imaging system, the speckled pattern changes with the progression of a disease such as liver cirrhosis. In this paper we present the statistical characteristics of the echo envelope of the liver, and a technique to extract information on the scatterer distribution from normal and cirrhotic liver images using constant false alarm rate (CFAR) processing. We analyze the relationship between the extracted scatterer distribution and the stage of liver cirrhosis. The ratio of the area in which the amplitude of the processed signal exceeds the threshold to the entire processed image area is related quantitatively to the stage of liver cirrhosis. It is found that the proposed technique is valid for the quantitative diagnosis of liver cirrhosis.
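
    The following is a minimal sketch of 1-D cell-averaging CFAR, the general kind of adaptive thresholding referred to above; the window sizes, scale factor and synthetic envelope are illustrative assumptions, not the authors' settings.

      # Minimal 1-D cell-averaging CFAR sketch; parameters are assumptions.
      import numpy as np

      def ca_cfar(signal, n_train=16, n_guard=2, scale=3.0):
          """Return a boolean mask of cells exceeding the adaptive threshold."""
          n = len(signal)
          detections = np.zeros(n, dtype=bool)
          for i in range(n_train + n_guard, n - n_train - n_guard):
              lead = signal[i - n_guard - n_train : i - n_guard]
              lag = signal[i + n_guard + 1 : i + n_guard + 1 + n_train]
              noise_level = np.mean(np.concatenate([lead, lag]))
              detections[i] = signal[i] > scale * noise_level
          return detections

      rng = np.random.default_rng(1)
      envelope = rng.rayleigh(1.0, 2000)      # speckle-like background
      envelope[500] += 8.0                    # an isolated strong scatterer
      hits = ca_cfar(envelope)
      print("Cells above threshold:", np.flatnonzero(hits))
      # The fraction of cells above threshold plays the role of the area
      # ratio that the authors relate to the stage of cirrhosis.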

  9. Error Estimation and Uncertainty Propagation in Computational Fluid Mechanics

    NASA Technical Reports Server (NTRS)

    Zhu, J. Z.; He, Guowei; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    Numerical simulation has now become an integral part of the engineering design process. Critical design decisions are routinely made based on simulation results and conclusions. Verification and validation of the reliability of numerical simulation is therefore vitally important in the engineering design process. We propose to develop theories and methodologies that can automatically provide quantitative information about the reliability of a numerical simulation by estimating the numerical approximation error, computational model-induced errors, and the uncertainties contained in the mathematical models, so that the reliability of the numerical simulation can be verified and validated. We also propose to develop and implement methodologies and techniques that can control the error and uncertainty during the numerical simulation so that its reliability can be improved.

  10. LipidQC: Method Validation Tool for Visual Comparison to SRM 1950 Using NIST Interlaboratory Comparison Exercise Lipid Consensus Mean Estimate Values.

    PubMed

    Ulmer, Candice Z; Ragland, Jared M; Koelmel, Jeremy P; Heckert, Alan; Jones, Christina M; Garrett, Timothy J; Yost, Richard A; Bowden, John A

    2017-12-19

    As advances in analytical separation techniques, mass spectrometry instrumentation, and data processing platforms continue to spur growth in the lipidomics field, more structurally unique lipid species are detected and annotated. The lipidomics community is in need of benchmark reference values to assess the validity of various lipidomics workflows in providing accurate quantitative measurements across the diverse lipidome. LipidQC addresses the harmonization challenge in lipid quantitation by providing a semiautomated process, independent of analytical platform, for visual comparison of experimental results of National Institute of Standards and Technology Standard Reference Material (SRM) 1950, "Metabolites in Frozen Human Plasma", against benchmark consensus mean concentrations derived from the NIST Lipidomics Interlaboratory Comparison Exercise.

  11. A cross-validation package driving Netica with python

    USGS Publications Warehouse

    Fienen, Michael N.; Plant, Nathaniel G.

    2014-01-01

    Bayesian networks (BNs) are powerful tools for probabilistically simulating natural systems and emulating process models. Cross validation is a technique to avoid overfitting resulting from overly complex BNs. Overfitting reduces predictive skill. Cross-validation for BNs is known but rarely implemented due partly to a lack of software tools designed to work with available BN packages. CVNetica is open-source, written in Python, and extends the Netica software package to perform cross-validation and read, rebuild, and learn BNs from data. Insights gained from cross-validation and implications on prediction versus description are illustrated with: a data-driven oceanographic application; and a model-emulation application. These examples show that overfitting occurs when BNs become more complex than allowed by supporting data and overfitting incurs computational costs as well as causing a reduction in prediction skill. CVNetica evaluates overfitting using several complexity metrics (we used level of discretization) and its impact on performance metrics (we used skill).
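
    The following is a minimal, generic sketch of using k-fold cross-validation to expose overfitting as model complexity grows. It is a scikit-learn illustration of the concept, not the CVNetica/Netica API described in the paper, and the synthetic data and complexity parameter are assumptions.

      # Generic k-fold cross-validation sketch: apparent skill vs. predictive
      # skill as model complexity increases (not the CVNetica API).
      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.tree import DecisionTreeRegressor

      rng = np.random.default_rng(0)
      X = rng.uniform(-3, 3, size=(200, 1))
      y = np.sin(X[:, 0]) + rng.normal(0, 0.3, 200)

      for depth in (2, 5, 10, None):          # increasing model complexity
          model = DecisionTreeRegressor(max_depth=depth, random_state=0)
          train_fit = model.fit(X, y).score(X, y)               # apparent skill
          cv_skill = cross_val_score(model, X, y, cv=5).mean()  # predictive skill
          print(f"max_depth={depth}: train R2={train_fit:.2f}, CV R2={cv_skill:.2f}")
      # A widening gap between training and cross-validated skill signals
      # overfitting, the behaviour CVNetica quantifies for Bayesian networks.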

  12. Measuring Adverse Events in Helicopter Emergency Medical Services: Establishing Content Validity

    PubMed Central

    Patterson, P. Daniel; Lave, Judith R.; Martin-Gill, Christian; Weaver, Matthew D.; Wadas, Richard J.; Arnold, Robert M.; Roth, Ronald N.; Mosesso, Vincent N.; Guyette, Francis X.; Rittenberger, Jon C.; Yealy, Donald M.

    2015-01-01

    Introduction We sought to create a valid framework for detecting Adverse Events (AEs) in the high-risk setting of Helicopter Emergency Medical Services (HEMS). Methods We assembled a panel of 10 expert clinicians (n=6 emergency medicine physicians and n=4 prehospital nurses and flight paramedics) affiliated with a large multi-state HEMS organization in the Northeast U.S. We used a modified Delphi technique to develop a framework for detecting AEs associated with the treatment of critically ill or injured patients. We used a widely applied measure, the Content Validity Index (CVI), to quantify the validity of the framework’s content. Results The expert panel of 10 clinicians reached consensus on a common AE definition and four-step protocol/process for AE detection in HEMS. The consensus-based framework is composed of three main components: 1) a trigger tool, 2) a method for rating proximal cause, and 3) a method for rating AE severity. The CVI findings isolate components of the framework considered content valid. Conclusions We demonstrate a standardized process for the development of a content valid framework for AE detection. The framework is a model for the development of a method for AE identification in other settings, including ground-based EMS. PMID:24003951
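
    For reference, the following is a minimal sketch of how an item-level Content Validity Index (I-CVI) is typically computed: the proportion of experts rating an item 3 or 4 on a 4-point relevance scale. The ratings shown are hypothetical, not the panel's actual data.

      # Minimal Content Validity Index sketch; ratings are hypothetical.
      import numpy as np

      ratings = np.array([
          [4, 3, 4, 4, 3, 4, 4, 2, 4, 3],   # item 1: e.g. a trigger-tool element
          [3, 4, 2, 4, 3, 3, 4, 4, 2, 3],   # item 2: e.g. a proximal-cause rating
      ])

      i_cvi = (ratings >= 3).mean(axis=1)   # proportion of experts rating 3 or 4
      print("I-CVI per item:", i_cvi)       # items >= ~0.78 are commonly retained
      print("S-CVI/Ave:", i_cvi.mean())     # scale-level average CVI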

  13. Estimation et validation des derivees de stabilite et controle du modele dynamique non-lineaire d'un drone a voilure fixe

    NASA Astrophysics Data System (ADS)

    Courchesne, Samuel

    Knowledge of the dynamic characteristics of a fixed-wing UAV is necessary to design flight control laws and to build a high-quality flight simulator. The basic features of a flight mechanics model include the mass and inertia properties and the major aerodynamic terms. They are obtained through a complex process involving various numerical analysis techniques and experimental procedures. This thesis focuses on the analysis of estimation techniques applied to the problem of estimating stability and control derivatives from flight test data provided by an experimental UAV. To achieve this objective, a modern identification methodology (Quad-M) is used to coordinate the processing tasks from multidisciplinary fields, such as parameter estimation modeling, instrumentation, the definition of flight maneuvers, and validation. The system under study is a nonlinear model with six degrees of freedom and a linear aerodynamic model. Time-domain techniques are used for identification of the drone. The first technique, the equation error method, is used to determine the structure of the aerodynamic model. Thereafter, the output error method and the filter error method are used to estimate the values of the aerodynamic coefficients. The Matlab parameter estimation scripts obtained from the American Institute of Aeronautics and Astronautics (AIAA) are used and modified as necessary to achieve the desired results. A considerable effort in this part of the research is devoted to the design of experiments, including attention to the onboard data acquisition system and the definition of flight maneuvers. The flight tests were conducted under stable flight conditions and with low atmospheric disturbance. Nevertheless, the identification results showed that the filter error method is the most effective for estimating the parameters of the drone, due to the presence of process and measurement noise. The aerodynamic coefficients are validated using a numerical vortex-method analysis. In addition, a simulation model incorporating the estimated parameters is used for comparison against the measured state behavior. Finally, good agreement between the results is demonstrated despite a limited number of flight data. Keywords: drone, identification, estimation, nonlinear, flight test, system, aerodynamic coefficient.
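
    The following is a minimal sketch of the output-error idea referred to above: simulate the model output for candidate parameter values and adjust the parameters to minimize the mismatch with the measured output. The first-order roll-rate model, the "true" parameter values and the input shape are hypothetical, not the drone's actual aerodynamics or the AIAA scripts.

      # Minimal output-error estimation sketch on a hypothetical roll-rate model.
      import numpy as np
      from scipy.optimize import least_squares

      dt, n = 0.02, 500
      t = np.arange(n) * dt
      delta_a = np.where((t > 1.0) & (t < 2.0), 0.1, 0.0)   # aileron pulse input

      def simulate(params, u):
          """Roll-rate response p_dot = Lp*p + Lda*delta_a (Euler integration)."""
          Lp, Lda = params
          p = np.zeros_like(u)
          for k in range(1, len(u)):
              p[k] = p[k - 1] + dt * (Lp * p[k - 1] + Lda * u[k - 1])
          return p

      rng = np.random.default_rng(3)
      p_meas = simulate([-4.0, 20.0], delta_a) + rng.normal(0, 0.005, n)  # "flight" data

      fit = least_squares(lambda th: simulate(th, delta_a) - p_meas,
                          x0=[-2.0, 10.0], bounds=([-20.0, 0.0], [0.0, 50.0]))
      print("Estimated Lp, Lda:", fit.x)   # should approach the true [-4.0, 20.0]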

  14. Survey of statistical techniques used in validation studies of air pollution prediction models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bornstein, R D; Anderson, S F

    1979-03-01

    Statistical techniques used by meteorologists to validate predictions made by air pollution models are surveyed. Techniques are divided into the following three groups: graphical, tabular, and summary statistics. Some of the practical problems associated with verification are also discussed. Characteristics desired in any validation program are listed and a suggested combination of techniques that possesses many of these characteristics is presented.
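
    The following is a minimal sketch of the kind of summary statistics such validation programs typically report for paired predictions and observations; the paired values are made-up illustrations.

      # Common validation summary statistics for model predictions vs. observations.
      import numpy as np

      obs = np.array([12.0, 30.0, 25.0, 18.0, 40.0, 22.0])
      pred = np.array([15.0, 28.0, 30.0, 16.0, 35.0, 27.0])

      bias = np.mean(pred - obs)                                   # mean bias
      rmse = np.sqrt(np.mean((pred - obs) ** 2))                   # root-mean-square error
      fb = 2 * (pred.mean() - obs.mean()) / (pred.mean() + obs.mean())  # fractional bias
      r = np.corrcoef(obs, pred)[0, 1]                             # correlation coefficient

      print(f"bias={bias:.2f}  RMSE={rmse:.2f}  FB={fb:.2f}  r={r:.2f}")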

  15. Adaptive vibration control of structures under earthquakes

    NASA Astrophysics Data System (ADS)

    Lew, Jiann-Shiun; Juang, Jer-Nan; Loh, Chin-Hsiung

    2017-04-01

    This paper concerns adaptive control techniques for structural vibration suppression under earthquakes. Various control strategies have been developed to protect structures from natural hazards and improve the comfort of occupants in buildings. However, there has been little development of adaptive building control with the integration of real-time system identification and control design. Generalized predictive control, which combines the process of real-time system identification and the process of predictive control design, has received widespread acceptance and has been successfully applied to various test-beds. This paper presents a formulation of the predictive control scheme for adaptive vibration control of structures under earthquakes. Comprehensive simulations are performed to demonstrate and validate the proposed adaptive control technique for earthquake-induced vibration of a building.

  16. Statistical Validation for Clinical Measures: Repeatability and Agreement of Kinect™-Based Software.

    PubMed

    Lopez, Natalia; Perez, Elisa; Tello, Emanuel; Rodrigo, Alejandro; Valentinuzzi, Max E

    2018-01-01

    The rehabilitation process is a fundamental stage for recovery of people's capabilities. However, the evaluation of the process is performed by physiatrists and medical doctors, mostly based on their observations, that is, a subjective appreciation of the patient's evolution. This paper proposes a tracking platform of the movement made by an individual's upper limb using Kinect sensor(s) to be applied for the patient during the rehabilitation process. The main contribution is the development of quantifying software and the statistical validation of its performance, repeatability, and clinical use in the rehabilitation process. The software determines joint angles and upper limb trajectories for the construction of a specific rehabilitation protocol and quantifies the treatment evolution. In turn, the information is presented via a graphical interface that allows the recording, storage, and report of the patient's data. For clinical purposes, the software information is statistically validated with three different methodologies, comparing the measures with a goniometer in terms of agreement and repeatability. The agreement of joint angles measured with the proposed software and goniometer is evaluated with Bland-Altman plots; all measurements fell well within the limits of agreement, meaning interchangeability of both techniques. Additionally, the results of Bland-Altman analysis of repeatability show 95% confidence. Finally, the physiotherapists' qualitative assessment shows encouraging results for the clinical use. The main conclusion is that the software is capable of offering a clinical history of the patient and is useful for quantification of the rehabilitation success. The simplicity, low cost, and visualization possibilities enhance the use of the software Kinect for rehabilitation and other applications, and the expert's opinion endorses the choice of our approach for clinical practice. Comparison of the new measurement technique with established goniometric methods determines that the proposed software agrees sufficiently to be used interchangeably.
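
    The following is a minimal sketch of the Bland-Altman agreement analysis mentioned above, computing the bias and 95% limits of agreement between two measurement techniques; the paired joint angles are hypothetical, not the study's data.

      # Minimal Bland-Altman sketch for two joint-angle measurement techniques.
      import numpy as np

      kinect = np.array([88.0, 120.5, 45.2, 60.1, 101.3, 75.8])      # hypothetical
      goniometer = np.array([90.0, 118.0, 44.0, 62.5, 99.0, 77.0])   # hypothetical

      diff = kinect - goniometer
      mean_diff = diff.mean()                       # systematic bias
      sd_diff = diff.std(ddof=1)
      loa = (mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff)

      print(f"bias = {mean_diff:.2f} deg, 95% limits of agreement = "
            f"[{loa[0]:.2f}, {loa[1]:.2f}] deg")
      # If clinically acceptable differences fall within these limits, the two
      # techniques can be considered interchangeable, as the paper concludes.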

  17. Interference detection and correction applied to incoherent-scatter radar power spectrum measurement

    NASA Technical Reports Server (NTRS)

    Ying, W. P.; Mathews, J. D.; Rastogi, P. K.

    1986-01-01

    A median filter based interference detection and correction technique is evaluated and the method applied to the Arecibo incoherent scatter radar D-region ionospheric power spectrum is discussed. The method can be extended to other kinds of data when the statistics involved in the process are still valid.
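
    The following is a minimal sketch of median-filter based interference detection and correction: spectral bins far above a running median are flagged and replaced by the local median. The window length, threshold and synthetic spectrum are assumptions, not the Arecibo processing parameters.

      # Median-filter interference detection/correction sketch; parameters assumed.
      import numpy as np
      from scipy.signal import medfilt

      rng = np.random.default_rng(2)
      spectrum = rng.exponential(1.0, 512)      # noise-like power spectrum floor
      spectrum[100] += 40.0                     # narrowband interference spikes
      spectrum[300] += 25.0

      baseline = medfilt(spectrum, kernel_size=11)
      bad = spectrum > 8.0 * baseline              # detection
      cleaned = np.where(bad, baseline, spectrum)  # correction

      print("Flagged bins:", np.flatnonzero(bad))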

  18. Self-Alert Training: Volitional Modulation of Autonomic Arousal Improves Sustained Attention

    ERIC Educational Resources Information Center

    O'Connell, Redmond G.; Bellgrove, Mark A.; Dockree, Paul M.; Lau, Adam; Fitzgerald, Michael; Robertson, Ian H.

    2008-01-01

    The present study examines a new alertness training strategy (Self-Alert Training, SAT) designed to explore the relationship between the top-down control processes governing arousal and sustained attention. In order to maximally target frontal control systems SAT combines a previously validated behavioural self-alerting technique [Robertson, I.…

  19. Towards Application of NASA Standard for Models and Simulations in Aeronautical Design Process

    NASA Astrophysics Data System (ADS)

    Vincent, Luc; Dunyach, Jean-Claude; Huet, Sandrine; Pelissier, Guillaume; Merlet, Joseph

    2012-08-01

    Even powerful computational techniques like simulation have limitations in their validity domain. Consequently, using simulation models requires caution to avoid making biased design decisions for new aeronautical products on the basis of inadequate simulation results. Thus the fidelity, accuracy and validity of simulation models must be monitored in context all along the design phases to build confidence in achievement of the goals of modelling and simulation. In the CRESCENDO project, we adapt the Credibility Assessment Scale method from the NASA standard for models and simulations, developed for the space programme, to aircraft design in order to assess the quality of simulations. The proposed eight quality assurance metrics aggregate information to indicate the levels of confidence in results. They are displayed in a management dashboard and can secure design trade-off decisions at programme milestones. The application of this technique is illustrated in an aircraft design context with a specific thermal Finite Element Analysis. This use case shows how to judge the fitness-for-purpose of simulation as a virtual testing means and then green-light the continuation of the Simulation Lifecycle Management (SLM) process.

  20. GNSS climatology: A summary of findings from the COST Action ES1206 GNSS4SWEC

    NASA Astrophysics Data System (ADS)

    Bock, Olivier; Pacione, Rosa

    2017-04-01

    Working Group 3 of COST Action GNSS4SWEC promoted the coordinated development and assessment of GNSS tropospheric products for climate research. More than 50 researchers from 17 institutions participated in the discussions. The activities were organised in five main topics, each of which led to conclusions and recommendations for a proper production and use of GNSS tropospheric products for climate research. 1) GNSS data processing and validation: an inventory was established listing the main existing reprocessed datasets, and one of them (IGS repro1) was more specifically assessed and used as a community dataset to demonstrate the capacity of GNSS to retrieve decadal trends and variability in zenith tropospheric delay (ZTD). Several groups also performed processing sensitivity studies, producing long-term (15 years or more) solutions and testing the impact of various processing parameters (tropospheric models, cutoff angle…) on the accuracy and stability of the retrieved ZTD estimates. 2) Standards and methods for post-processing: (i) elaborate screening methods have been developed and tested for the detection of outliers in ZTD data; (ii) ZTD to IWV conversion methods and auxiliary datasets have been reviewed and assessed; (iii) the homogeneity of long ZTD and IWV time series has been investigated. Standardised procedures were proposed for the first two points. Inhomogeneities have been identified in all reprocessed GNSS datasets, which are due to equipment changes or changes in the measurement conditions. Significant activity is ongoing on the development of statistical homogenisation techniques that match the GNSS data characteristics. 3) IWV validations: new intercomparisons of GNSS IWV estimates to IWV retrieved from other observational techniques (radiosondes, microwave radiometers, VLBI, DORIS…) have been encouraged to build on past results and contribute to a better evaluation of inter-technique biases and the absolute accuracy of the different IWV sensing techniques. 4) GNSS climatology: as a major goal of this working group, applications have been promoted in collaboration with the climate research community, such as the analysis of global and regional trends and variability, the evaluation of global and regional climate model simulations (IPCC, EC-Earth, CORDEX…) and reanalysis products (ERA-Interim, ERA20C, 20CR…). 5) Databases and data formats: cooperation with IGS and EUREF fostered the specification and development of new database structures and an updated SINEX format for a more efficient and enhanced exchange, use, and validation of GNSS tropospheric data.

  1. Validation of GOES-10 Satellite-derived Cloud and Radiative Properties for the MASRAD ARM Mobile Facility Deployment

    NASA Technical Reports Server (NTRS)

    Khaiyer, M. M.; Doelling, D. R.; Palikonda, R.; Mordeen, M. L.; Minnis, P.

    2007-01-01

    This poster presentation reviews the process used to validate the GOES-10 satellite-derived cloud and radiative properties. The ARM Mobile Facility (AMF) deployment at Pt Reyes, CA, as part of the Marine Stratus Radiation Aerosol and Drizzle experiment (MASRAD), 14 March - 14 September 2005, provided an excellent chance to validate satellite cloud-property retrievals with the AMF's flexible suite of ground-based remote sensing instruments. For this comparison, NASA LaRC GOES-10 satellite retrievals covering this region and period were re-processed using an updated version of the Visible Infrared Solar-Infrared Split-Window Technique (VISST), which uses data taken at 4 wavelengths (0.65, 3.9, 11, and 12 μm) and computes broadband fluxes using improved CERES (Clouds and Earth's Radiant Energy System)-GOES-10 narrowband-to-broadband flux conversion coefficients. To validate MASRAD GOES-10 satellite-derived cloud property data, VISST-derived cloud amounts, heights, and liquid water paths are compared with similar quantities derived from available ARM ground-based instrumentation and with CERES fluxes from Terra.

  2. [Validation of measurement methods and estimation of uncertainty of measurement of chemical agents in the air at workstations].

    PubMed

    Dobecki, Marek

    2012-01-01

    This paper reviews the requirements for methods of measuring chemical agents in the air at workstations. European standards, which have the status of Polish standards, comprise requirements and information on sampling strategy, measuring techniques, types of samplers, sampling pumps, and methods of evaluating occupational exposure in a given technological process. Measurement methods, including air sampling and the analytical procedure in the laboratory, should be appropriately validated before their intended use. In the validation process, selected methods are tested and an uncertainty budget is set up. The validation procedure that should be implemented in the laboratory, together with suitable statistical tools and the major components of uncertainty to be taken into consideration, is presented in this paper. Methods of quality control, covering both sampling and laboratory analyses, are discussed. The relative expanded uncertainty of each measurement, expressed as a percentage, should not exceed limit values set depending on the type of occupational exposure (short-term or long-term) and the magnitude of exposure to chemical agents in the work environment.
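
    The following is a minimal sketch of the usual root-sum-of-squares (GUM-style) combination of standard uncertainties and the resulting relative expanded uncertainty for a workplace air measurement; the component names and values are invented illustrations, not a validated budget.

      # Minimal uncertainty-budget sketch; component values are invented.
      import math

      result = 2.4          # measured concentration, mg/m3 (hypothetical)
      u_components = {      # standard uncertainties as relative fractions
          "sampling_volume": 0.03,
          "analytical_recovery": 0.04,
          "calibration": 0.025,
          "precision": 0.05,
      }

      u_combined_rel = math.sqrt(sum(u ** 2 for u in u_components.values()))
      U_rel = 2 * u_combined_rel        # coverage factor k = 2 (~95% confidence)

      print(f"Combined relative standard uncertainty: {u_combined_rel:.1%}")
      print(f"Relative expanded uncertainty (k=2): {U_rel:.1%}")
      print(f"Result: {result} mg/m3 +/- {U_rel * result:.2f} mg/m3")
      # European standards such as EN 482 set the limits that this expanded
      # uncertainty must not exceed for short-term vs. long-term measurements.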

  3. Material model validation for laser shock peening process simulation

    NASA Astrophysics Data System (ADS)

    Amarchinta, H. K.; Grandhi, R. V.; Langer, K.; Stargel, D. S.

    2009-01-01

    Advanced mechanical surface enhancement techniques have been used successfully to increase the fatigue life of metallic components. These techniques impart deep compressive residual stresses into the component to counter potentially damage-inducing tensile stresses generated under service loading. Laser shock peening (LSP) is an advanced mechanical surface enhancement technique used predominantly in the aircraft industry. To reduce costs and make the technique available on a large-scale basis for industrial applications, simulation of the LSP process is required. Accurate simulation of the LSP process is a challenging task, because the process has many parameters such as laser spot size, pressure profile and material model that must be precisely determined. This work focuses on investigating the appropriate material model that could be used in simulation and design. In the LSP process material is subjected to strain rates of 10^6 s^-1, which is very high compared with conventional strain rates. The importance of an accurate material model increases because the material behaves significantly different at such high strain rates. This work investigates the effect of multiple nonlinear material models for representing the elastic-plastic behavior of materials. Elastic perfectly plastic, Johnson-Cook and Zerilli-Armstrong models are used, and the performance of each model is compared with available experimental results.
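
    As a reference for one of the models named above, the following is a minimal sketch of the Johnson-Cook flow stress relation, sigma = (A + B*eps^n)(1 + C*ln(epsdot*))(1 - T*^m). The material constants used here are generic illustrative values, not the parameters calibrated in the study.

      # Minimal Johnson-Cook flow stress sketch; constants are illustrative.
      import numpy as np

      def johnson_cook(eps_p, eps_rate, T, A=350e6, B=275e6, n=0.36, C=0.022,
                       m=1.0, eps_rate_0=1.0, T_room=293.0, T_melt=1800.0):
          """Flow stress (Pa) vs. plastic strain, strain rate and temperature."""
          T_star = (T - T_room) / (T_melt - T_room)
          return ((A + B * eps_p ** n)
                  * (1.0 + C * np.log(eps_rate / eps_rate_0))
                  * (1.0 - T_star ** m))

      # Flow stress at an LSP-like strain rate of 1e6 1/s
      print(johnson_cook(eps_p=0.05, eps_rate=1e6, T=300.0) / 1e6, "MPa")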

  4. A Validation Study of the Impression Replica Technique.

    PubMed

    Segerström, Sofia; Wiking-Lima de Faria, Johanna; Braian, Michael; Ameri, Arman; Ahlgren, Camilla

    2018-04-17

    The aim of this study was to validate the well-known and often-used impression replica technique for measuring the fit between a preparation and a crown in vitro. The validation consisted of three steps. First, a measuring instrument was validated to elucidate its accuracy. Second, a specimen consisting of male and female counterparts was created and validated with the measuring instrument. Calculations were made for the exact values of three gaps between the male and female parts. Finally, impression replicas were produced of the specimen gaps and sectioned into four pieces. The replicas were then measured with the use of a light microscope. The values obtained from measuring the specimen were then compared with the values obtained from the impression replicas, and the technique was thereby validated. The impression replica technique overestimated all measured gaps. Depending on the location of the three measuring sites, the difference between the specimen and the impression replicas varied from 47 to 130 μm. The impression replica technique overestimates gaps within the range of 2% to 11%. The validation of the replica technique enables the method to be used as a reference when testing other methods for evaluating fit in dentistry. © 2018 by the American College of Prosthodontists.

  5. Computer-assisted Biology Learning Materials: Designing and Developing an Interactive CD on Spermatogenesis

    NASA Astrophysics Data System (ADS)

    Haviz, M.

    2018-04-01

    The purpose of this article is to describe the design and development of an interactive CD on spermatogenesis. This is a research and development study. The development procedure consisted of outlining the media program, creating a flowchart and a storyboard, gathering materials, programming, and finishing. The quantitative data obtained were analyzed with descriptive statistics. Qualitative data were analyzed with Miles and Huberman techniques. The instrument used is a validation sheet. The CD design, implemented with the Macromedia Flash MX program, resulted in 17 slides. The prototype was judged valid after a self-review technique with many revisions, especially to sound and programming. This finding suggests that the process of spermatogenesis can be audio-visualized into a more comprehensive form of learning media. However, this interactive CD product needs further testing to determine its consistency and resistance to revisions.

  6. Research on the technique of large-aperture off-axis parabolic surface processing using tri-station machine and its applicability.

    PubMed

    Zhang, Xin; Luo, Xiao; Hu, Haixiang; Zhang, Xuejun

    2015-09-01

    In order to process large-aperture aspherical mirrors, we designed and constructed a tri-station machine processing center, a three-station device that provides vectored feed motion over up to 10 axes. Based on this processing center, an aspherical mirror-processing model is proposed in which each station implements traversal processing of large-aperture aspherical mirrors using only two axes, while the stations are switchable, thus lowering cost and enhancing processing efficiency. The applicability of the tri-station machine is also analyzed. At the same time, a simple and efficient zero-calibration method for processing is proposed. To validate the processing model, using our processing center we processed an off-axis parabolic SiC mirror with an aperture diameter of 1450 mm. The experimental results indicate that, with a one-step iterative process, the peak-to-valley (PV) and root-mean-square (RMS) errors of the mirror converged from 3.441 and 0.5203 μm to 2.637 and 0.2962 μm, respectively, with the RMS reduced by 43%. The validity and high accuracy of the model are thereby demonstrated.

  7. Active Aeroelastic Wing Aerodynamic Model Development and Validation for a Modified F/A-18A Airplane

    NASA Technical Reports Server (NTRS)

    Cumming, Stephen B.; Diebler, Corey G.

    2005-01-01

    A new aerodynamic model has been developed and validated for a modified F/A-18A airplane used for the Active Aeroelastic Wing (AAW) research program. The goal of the program was to demonstrate the advantages of using the inherent flexibility of an aircraft to enhance its performance. The research airplane was an F/A-18A with wings modified to reduce stiffness and a new control system to increase control authority. There have been two flight phases. Data gathered from the first flight phase were used to create the new aerodynamic model. A maximum-likelihood output-error parameter estimation technique was used to obtain stability and control derivatives. The derivatives were incorporated into the National Aeronautics and Space Administration F-18 simulation, validated, and used to develop new AAW control laws. The second phase of flights was used to evaluate the handling qualities of the AAW airplane and the control law design process, and to further test the accuracy of the new model. The flight test envelope covered Mach numbers between 0.85 and 1.30 and dynamic pressures from 600 to 1250 pound-force per square foot. The results presented in this report demonstrate that a thorough parameter identification analysis can be used to improve upon models that were developed using other means. This report describes the parameter estimation technique used, details the validation techniques, discusses differences between previously existing F/A-18 models, and presents results from the second phase of research flights.

  8. Cognitive Bias in the Verification and Validation of Space Flight Systems

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Cognitive bias is generally recognized as playing a significant role in virtually all domains of human decision making. Insight into this role is informally built into many of the system engineering practices employed in the aerospace industry. The review process, for example, typically has features that help to counteract the effect of bias. This paper presents a discussion of how commonly recognized biases may affect the verification and validation process. Verifying and validating a system is arguably more challenging than development, both technically and cognitively. Whereas there may be a relatively limited number of options available for the design of a particular aspect of a system, there is a virtually unlimited number of potential verification scenarios that may be explored. The probability of any particular scenario occurring in operations is typically very difficult to estimate, which increases reliance on judgment that may be affected by bias. Implementing a verification activity often presents technical challenges that, if they can be overcome at all, often result in a departure from actual flight conditions (e.g., 1-g testing, simulation, time compression, artificial fault injection) that may raise additional questions about the meaningfulness of the results, and create opportunities for the introduction of additional biases. In addition to mitigating the biases it can introduce directly, the verification and validation process must also overcome the cumulative effect of biases introduced during all previous stages of development. A variety of cognitive biases will be described, with research results for illustration. A handful of case studies will be presented that show how cognitive bias may have affected the verification and validation process on recent JPL flight projects, identify areas of strength and weakness, and identify potential changes or additions to commonly used techniques that could provide a more robust verification and validation of future systems.

  9. Image processing developments and applications for water quality monitoring and trophic state determination

    NASA Technical Reports Server (NTRS)

    Blackwell, R. J.

    1982-01-01

    The use of remote sensing data analysis for water quality monitoring is evaluated. Data analysis and image processing techniques are applied to LANDSAT remote sensing data to produce an effective operational tool for lake water quality surveying and monitoring. Digital image processing and analysis techniques were designed, developed, tested, and applied to LANDSAT multispectral scanner (MSS) data and conventional surface-acquired data. Utilization of these techniques facilitates the surveying and monitoring of large numbers of lakes in an operational manner. Supervised multispectral classification, when used in conjunction with surface-acquired water quality indicators, is used to characterize water body trophic status. Unsupervised multispectral classification, when interpreted by lake scientists familiar with a specific water body, yields classifications of equal validity with supervised methods and in a more cost-effective manner. Image data base technology is used to great advantage in characterizing other effects contributing to water quality. These effects include drainage basin configuration, terrain slope, soil, precipitation, and land cover characteristics.
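
    The following is a minimal sketch of unsupervised multispectral classification using k-means clustering, the general approach contrasted with supervised classification above; the synthetic 4-band "scene" and class spectra are purely illustrative assumptions.

      # Unsupervised multispectral classification sketch with k-means.
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(4)
      # Fake 100x100 scene with 4 spectral bands (MSS-like), three cover types
      clear = rng.normal([0.05, 0.04, 0.03, 0.02], 0.01, (4000, 4))
      turbid = rng.normal([0.10, 0.09, 0.08, 0.05], 0.01, (4000, 4))
      algal = rng.normal([0.06, 0.09, 0.05, 0.03], 0.01, (2000, 4))
      pixels = np.vstack([clear, turbid, algal])

      labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(pixels)
      print(np.bincount(labels))   # pixel counts per spectral class
      # A lake scientist would then assign trophic-state meaning to each class,
      # as the abstract describes for the unsupervised approach.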

  10. Expansion of transient operating data

    NASA Astrophysics Data System (ADS)

    Chipman, Christopher; Avitabile, Peter

    2012-08-01

    Real-time operating data are very important for understanding actual system response. Unfortunately, the number of physical data points typically collected is very small, and interpretation of the data is often difficult. Expansion techniques have been developed that use traditional experimental modal data to augment this limited set of data. This expansion process allows for a much improved description of the real-time operating response. This paper presents results from several different structures to show the robustness of the technique. Comparisons are made to a more complete set of measured data to validate the approach. Both analytical simulations and actual experimental data are used to illustrate the usefulness of the technique.
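
    The following is a minimal sketch of a mode-shape based expansion of a sparse set of measured responses to the full set of model degrees of freedom (a pseudo-inverse, SEREP-style projection). The 3-DOF mode shapes, sensor locations and measured values are invented for illustration and are not the structures tested in the paper.

      # Minimal modal expansion sketch; mode shapes and data are invented.
      import numpy as np

      # Mode shapes at all DOFs (columns = modes), e.g. from a model or modal test
      phi_full = np.array([[0.33, 0.74],
                           [0.59, 0.33],
                           [0.74, -0.59]])
      measured_dofs = [0, 2]                     # only two physical sensors
      phi_meas = phi_full[measured_dofs, :]

      u_meas = np.array([0.021, 0.014])          # measured operating response

      # Modal coordinates from the measured DOFs, then expansion to all DOFs
      q = np.linalg.pinv(phi_meas) @ u_meas
      u_expanded = phi_full @ q
      print("Expanded full-field response:", u_expanded)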

  11. Feature tracking cardiac magnetic resonance imaging: A review of a novel non-invasive cardiac imaging technique

    PubMed Central

    Rahman, Zia Ur; Sethi, Pooja; Murtaza, Ghulam; Virk, Hafeez Ul Hassan; Rai, Aitzaz; Mahmod, Masliza; Schoondyke, Jeffrey; Albalbissi, Kais

    2017-01-01

    Cardiovascular disease is a leading cause of morbidity and mortality globally. Early diagnostic markers are gaining popularity for improving patient care and disease outcomes. There is an increasing interest in noninvasive cardiac imaging biomarkers to diagnose subclinical cardiac disease. Feature tracking cardiac magnetic resonance imaging is a novel post-processing technique that is increasingly being employed to assess global and regional myocardial function. This technique has numerous applications in structural and functional diagnostics. It has been validated in multiple studies, although there is still a long way to go for it to become routine standard of care. PMID:28515849

  12. CLAss-Specific Subspace Kernel Representations and Adaptive Margin Slack Minimization for Large Scale Classification.

    PubMed

    Yu, Yinan; Diamantaras, Konstantinos I; McKelvey, Tomas; Kung, Sun-Yuan

    2018-02-01

    In kernel-based classification models, given limited computational power and storage capacity, operations over the full kernel matrix become prohibitive. In this paper, we propose a new supervised learning framework using kernel models for sequential data processing. The framework is based on two components that both aim at enhancing the classification capability with a subset selection scheme. The first part is a subspace projection technique in the reproducing kernel Hilbert space using a CLAss-specific Subspace Kernel representation for kernel approximation. In the second part, we propose a novel structural risk minimization algorithm called adaptive margin slack minimization, which iteratively improves the classification accuracy by adaptive data selection. We motivate each part separately and then integrate them into learning frameworks for large-scale data. We propose two such frameworks: memory-efficient sequential processing for sequential data processing, and parallelized sequential processing for distributed computing with sequential data acquisition. We test our methods on several benchmark data sets and compare them with state-of-the-art techniques to verify the validity of the proposed techniques.

  13. Interprofessional partnerships in chronic illness care: a conceptual model for measuring partnership effectiveness

    PubMed Central

    Butt, Gail; Markle-Reid, Maureen; Browne, Gina

    2008-01-01

    Introduction Interprofessional health and social service partnerships (IHSSP) are internationally acknowledged as integral for comprehensive chronic illness care. However, the evidence-base for partnership effectiveness is lacking. This paper aims to clarify partnership measurement issues, conceptualize IHSSP at the front-line staff level, and identify tools valid for group process measurement. Theory and methods A systematic literature review utilizing three interrelated searches was conducted. Thematic analysis techniques were supported by NVivo 7 software. Complexity theory was used to guide the analysis, ground the new conceptualization and validate the selected measures. Other properties of the measures were critiqued using established criteria. Results There is a need for a convergent view of what constitutes a partnership and its measurement. The salient attributes of IHSSP and their interorganizational context were described and grounded within complexity theory. Two measures were selected and validated for measurement of proximal group outcomes. Conclusion This paper depicts a novel complexity theory-based conceptual model for IHSSP of front-line staff who provide chronic illness care. The conceptualization provides the underpinnings for a comprehensive evaluative framework for partnerships. Two partnership process measurement tools, the PSAT and TCI are valid for IHSSP process measurement with consideration of their strengths and limitations. PMID:18493591

  14. Space - A unique environment for process modeling R&D

    NASA Technical Reports Server (NTRS)

    Overfelt, Tony

    1991-01-01

    Process modeling, the application of advanced computational techniques to simulate real processes as they occur in regular use, e.g., welding, casting and semiconductor crystal growth, is discussed. Using the low-gravity environment of space will accelerate the technical validation of the procedures and enable extremely accurate determinations of the many necessary thermophysical properties. Attention is given to NASA's centers for the commercial development of space; joint ventures of universities, industries, and goverment agencies to study the unique attributes of space that offer potential for applied R&D and eventual commercial exploitation.

  15. Network Security Validation Using Game Theory

    NASA Astrophysics Data System (ADS)

    Papadopoulou, Vicky; Gregoriades, Andreas

    Non-functional requirements (NFR) such as network security have recently gained widespread attention in distributed information systems. Despite their importance, however, there is no systematic approach to validate these requirements given the complexity and uncertainty characterizing modern networks. Traditionally, network security requirements specification has been the result of a reactive process. This, however, limited the immunity property of the distributed systems that depended on these networks. Security requirements specification needs a proactive approach. Networks' infrastructure is constantly under attack by hackers and malicious software that aim to break into computers. To combat these threats, network designers need sophisticated security validation techniques that will guarantee the minimum level of security for their future networks. This paper presents a game-theoretic approach to security requirements validation. An introduction to game theory is presented along with an example that demonstrates the application of the approach.

  16. Evaluation of biologic occupational risk control practices: quality indicators development and validation.

    PubMed

    Takahashi, Renata Ferreira; Gryschek, Anna Luíza F P L; Izumi Nichiata, Lúcia Yasuko; Lacerda, Rúbia Aparecida; Ciosak, Suely Itsuko; Gir, Elucir; Padoveze, Maria Clara

    2010-05-01

    There is growing demand for the adoption of qualification systems for health care practices. This study is aimed at describing the development and validation of indicators for evaluation of biologic occupational risk control programs. The study involved 3 stages: (1) setting up a research team, (2) development of indicators, and (3) validation of the indicators by a team of specialists recruited to validate each attribute of the developed indicators. The content validation method was used for the validation, and a psychometric scale was developed for the specialists' assessment. A consensus technique was used, and every attribute that obtained a Content Validity Index of at least 0.75 was approved. Eight indicators were developed for the evaluation of the biologic occupational risk prevention program, with emphasis on accidents caused by sharp instruments and occupational tuberculosis prevention. The indicators included evaluation of the structure, process, and results at the prevention and biologic risk control levels. The majority of indicators achieved a favorable consensus regarding all validated attributes. The developed indicators were considered validated, and the method used for construction and validation proved to be effective. Copyright (c) 2010 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Mosby, Inc. All rights reserved.

  17. Validation and application of Acoustic Mapping Velocimetry

    NASA Astrophysics Data System (ADS)

    Baranya, Sandor; Muste, Marian

    2016-04-01

    The goal of this paper is to introduce a novel methodology to estimate bedload transport in rivers based on an improved bedform tracking procedure. The measurement technique combines components and processing protocols from two contemporary nonintrusive instruments: acoustic and image-based. The bedform mapping is conducted with acoustic surveys, while the estimation of the velocity of the bedforms is obtained with processing techniques pertaining to image-based velocimetry. The technique is therefore called Acoustic Mapping Velocimetry (AMV). The implementation of this technique produces a whole-field velocity map associated with the multi-directional bedform movement. Based on the calculated two-dimensional bedform migration velocity field, the bedload transport estimation is done using the Exner equation. A proof-of-concept experiment was performed to validate the AMV-based bedload estimation in a laboratory flume at IIHR-Hydroscience & Engineering (IIHR). The bedform migration was analysed at three different flow discharges. Repeated bed geometry mapping, using a multiple transducer array (MTA), provided acoustic maps, which were post-processed with a particle image velocimetry (PIV) method. Bedload transport rates were calculated along longitudinal sections using the streamwise components of the bedform velocity vectors and the measured bedform heights. The bulk transport rates were compared with the results from concurrent direct physical samplings and acceptable agreement was found. As a first field implementation of the AMV, an attempt was made to estimate bedload transport for a section of the Ohio River in the United States, where bed geometry maps resulting from repeated multibeam echo sounder (MBES) surveys served as input data. Cross-sectional distributions of bedload transport rates from the AMV-based method were compared with those obtained from another non-intrusive technique (used due to the lack of direct samplings), ISSDOTv2, developed by the US Army Corps of Engineers. The good agreement between the results from the two different methods is encouraging and suggests further field tests in varying hydro-morphological situations.
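
    The following is a minimal sketch of the two ingredients described above: estimating bedform migration by cross-correlating two bed-elevation profiles taken a known time apart, and converting migration speed and bedform height to a bedload rate with an Exner-based relation of the form q_b ~ (1 - porosity) * V_b * H / 2. The synthetic dune profiles and parameter values are illustrative assumptions, not the flume or Ohio River data.

      # Bedform migration by cross-correlation, then an Exner-based bedload rate.
      import numpy as np

      dx, dt = 0.05, 600.0                    # m per sample, s between surveys
      x = np.arange(0, 40, dx)
      bed_t0 = 0.10 * np.sin(2 * np.pi * x / 4.0)          # dune-like profile, H ~ 0.2 m
      shift_true = 0.30                                    # m of migration
      bed_t1 = 0.10 * np.sin(2 * np.pi * (x - shift_true) / 4.0)

      corr = np.correlate(bed_t1 - bed_t1.mean(), bed_t0 - bed_t0.mean(), mode="full")
      lag = (np.argmax(corr) - (len(x) - 1)) * dx          # displacement in metres
      v_bedform = lag / dt                                 # migration speed, m/s

      porosity, height = 0.4, 0.2
      q_b = (1 - porosity) * v_bedform * height / 2        # m^2/s per unit width
      print(f"migration = {lag:.2f} m, bedload rate = {q_b:.2e} m^2/s")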

  18. Evaluating the sources of water to wells: Three techniques for metamodeling of a groundwater flow model

    USGS Publications Warehouse

    Fienen, Michael N.; Nolan, Bernard T.; Feinstein, Daniel T.

    2016-01-01

    For decision support, the insights and predictive power of numerical process models can be hampered by the insufficient expertise and computational resources required to evaluate system response to new stresses. An alternative is to emulate the process model with a statistical “metamodel.” Built on a dataset of collocated numerical model input and output, a groundwater flow model was emulated using a Bayesian network, an artificial neural network, and a gradient boosted regression tree. The response of interest was surface water depletion expressed as the source of water to wells. The results have application for managing the allocation of groundwater. Each technique was tuned using cross-validation and further evaluated using a held-out dataset. A numerical MODFLOW-USG model of the Lake Michigan Basin, USA, was used for the evaluation. The performance and interpretability of each technique were compared, pointing to the advantages of each technique. The metamodel can extend to unmodeled areas.
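
    The following is a minimal sketch of one of the three metamodeling techniques named above: a gradient boosted regression tree emulating a process model's response, tuned with cross-validation and checked on a held-out set. The synthetic inputs and outputs stand in for the collocated MODFLOW runs and are assumptions, not the study's dataset.

      # Gradient-boosted-tree metamodel sketch with CV tuning and a held-out check.
      import numpy as np
      from sklearn.ensemble import GradientBoostingRegressor
      from sklearn.model_selection import GridSearchCV, train_test_split

      rng = np.random.default_rng(5)
      X = rng.uniform(0, 1, size=(500, 4))             # e.g. well location, rate, ...
      y = X[:, 0] * 0.7 + np.sin(3 * X[:, 1]) * 0.2 + rng.normal(0, 0.02, 500)

      X_train, X_hold, y_train, y_hold = train_test_split(X, y, random_state=0)
      search = GridSearchCV(GradientBoostingRegressor(random_state=0),
                            {"n_estimators": [100, 300], "max_depth": [2, 3]},
                            cv=5)
      search.fit(X_train, y_train)
      print(f"CV R2: {search.best_score_:.3f}, "
            f"held-out R2: {search.best_estimator_.score(X_hold, y_hold):.3f}")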

  19. Link-prediction to tackle the boundary specification problem in social network surveys

    PubMed Central

    De Wilde, Philippe; Buarque de Lima-Neto, Fernando

    2017-01-01

    Diffusion processes in social networks often cause the emergence of global phenomena from individual behavior within a society. The study of those global phenomena and the simulation of those diffusion processes frequently require a good model of the global network. However, survey data and data from online sources are often restricted to single social groups or features, such as age groups, single schools, companies, or interest groups. Hence, a modeling approach is required that extrapolates the locally restricted data to a global network model. We tackle this Missing Data Problem using Link-Prediction techniques from social network research, network generation techniques from the area of Social Simulation, as well as a combination of both. We found that techniques employing less information may be more adequate to solve this problem, especially when data granularity is an issue. We validated the network models created with our techniques on a number of real-world networks, investigating degree distributions as well as the likelihood of links given the geographical distance between two nodes. PMID:28426826
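
    The following is a minimal sketch of a similarity-based link-prediction step of the kind used to extrapolate a partially observed network; the toy graph and the choice of the Jaccard coefficient are illustrative assumptions, not the specific predictors evaluated in the paper.

      # Similarity-based link prediction sketch on a toy graph.
      import networkx as nx

      G = nx.karate_club_graph()                 # stand-in for survey data
      observed = G.copy()
      observed.remove_edge(0, 2)                 # pretend this tie was not surveyed

      # Jaccard coefficient over candidate pairs: pairs with many shared contacts
      scores = nx.jaccard_coefficient(observed, [(0, 2), (0, 9), (15, 20)])
      for u, v, score in sorted(scores, key=lambda t: -t[2]):
          print(f"({u}, {v}) predicted-link score = {score:.2f}")
      # High-scoring non-edges are added when generating the global network model;
      # geographic distance between nodes can serve as an additional predictor.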

  20. Experimental Validation of Advanced Dispersed Fringe Sensing (ADFS) Algorithm Using Advanced Wavefront Sensing and Correction Testbed (AWCT)

    NASA Technical Reports Server (NTRS)

    Wang, Xu; Shi, Fang; Sigrist, Norbert; Seo, Byoung-Joon; Tang, Hong; Bikkannavar, Siddarayappa; Basinger, Scott; Lay, Oliver

    2012-01-01

    Large-aperture telescopes commonly feature segmented mirrors, and a coarse phasing step is needed to bring these individual segments into the fine-phasing capture range. Dispersed Fringe Sensing (DFS) is a powerful coarse phasing technique, and a variant of it is currently being used for JWST. An Advanced Dispersed Fringe Sensing (ADFS) algorithm was recently developed to improve the performance and robustness of previous DFS algorithms, with better accuracy and a unique solution. The first part of the paper introduces the basic ideas and essential features of the ADFS algorithm and presents some algorithm sensitivity study results. The second part of the paper describes the full details of the algorithm validation process through the Advanced Wavefront Sensing and Correction Testbed (AWCT): first, the optimization of the DFS hardware of AWCT to ensure data accuracy and reliability is illustrated. Then, a few carefully designed algorithm validation experiments are implemented, and the corresponding data analysis results are shown. Finally, the fiducial calibration using the Range-Gate-Metrology technique is carried out, and a <10 nm or <1% algorithm accuracy is demonstrated.

  1. Benchmarking the ATLAS software through the Kit Validation engine

    NASA Astrophysics Data System (ADS)

    De Salvo, Alessandro; Brasolin, Franco

    2010-04-01

    The measurement of the experiment software performance is a very important metric in order to choose the most effective resources to be used and to discover the bottlenecks of the code implementation. In this work we present the benchmark techniques used to measure the ATLAS software performance through the ATLAS offline testing engine Kit Validation and the online portal Global Kit Validation. The performance measurements, the data collection, the online analysis and the display of the results will be presented. The results of the measurements on different platforms and architectures will be shown, giving a full report on the CPU power and memory consumption of the Monte Carlo generation, simulation, digitization and reconstruction of the most CPU-intensive channels. The impact of multi-core computing on the ATLAS software performance will also be presented, comparing the behavior of different architectures when increasing the number of concurrent processes. The benchmark techniques described in this paper have been used in the HEPiX group since the beginning of 2008 to help define the performance metrics for High Energy Physics applications, based on the real experiment software.

  2. 41 CFR 60-3.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... validation techniques contemplated by these guidelines. In such circumstances, the user should utilize... a formal and scored selection procedure is used which has an adverse impact, the validation... user cannot or need not follow the validation techniques anticipated by these guidelines, the user...

  3. 29 CFR 1607.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... circumstances in which a user cannot or need not utilize the validation techniques contemplated by these... which has an adverse impact, the validation techniques contemplated by these guidelines usually should be followed if technically feasible. Where the user cannot or need not follow the validation...

  4. 41 CFR 60-3.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... validation techniques contemplated by these guidelines. In such circumstances, the user should utilize... a formal and scored selection procedure is used which has an adverse impact, the validation... user cannot or need not follow the validation techniques anticipated by these guidelines, the user...

  5. 29 CFR 1607.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... circumstances in which a user cannot or need not utilize the validation techniques contemplated by these... which has an adverse impact, the validation techniques contemplated by these guidelines usually should be followed if technically feasible. Where the user cannot or need not follow the validation...

  6. 29 CFR 1607.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... circumstances in which a user cannot or need not utilize the validation techniques contemplated by these... which has an adverse impact, the validation techniques contemplated by these guidelines usually should be followed if technically feasible. Where the user cannot or need not follow the validation...

  7. 41 CFR 60-3.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... validation techniques contemplated by these guidelines. In such circumstances, the user should utilize... a formal and scored selection procedure is used which has an adverse impact, the validation... user cannot or need not follow the validation techniques anticipated by these guidelines, the user...

  8. 29 CFR 1607.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... circumstances in which a user cannot or need not utilize the validation techniques contemplated by these... which has an adverse impact, the validation techniques contemplated by these guidelines usually should be followed if technically feasible. Where the user cannot or need not follow the validation...

  9. 41 CFR 60-3.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... validation techniques contemplated by these guidelines. In such circumstances, the user should utilize... a formal and scored selection procedure is used which has an adverse impact, the validation... user cannot or need not follow the validation techniques anticipated by these guidelines, the user...

  10. 29 CFR 1607.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... circumstances in which a user cannot or need not utilize the validation techniques contemplated by these... which has an adverse impact, the validation techniques contemplated by these guidelines usually should be followed if technically feasible. Where the user cannot or need not follow the validation...

  11. Teaching "Instant Experience" with Graphical Model Validation Techniques

    ERIC Educational Resources Information Center

    Ekstrøm, Claus Thorn

    2014-01-01

    Graphical model validation techniques for linear normal models are often used to check the assumptions underlying a statistical model. We describe an approach to provide "instant experience" in looking at a graphical model validation plot, so that it becomes easier to assess whether any of the underlying assumptions are violated.
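
    One common way to build such instant experience, sketched below in Python (matplotlib and statsmodels are assumed; this is not the article's own code), is to hide the observed residuals-versus-fitted plot among panels simulated under the fitted model, so the reader learns what plots look like when no assumption is violated.

      # Sketch: a "lineup" of residual plots; only one panel shows the real residuals.
      import numpy as np
      import matplotlib.pyplot as plt
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      x = rng.uniform(0, 10, 80)
      y = 1.5 + 0.8 * x + rng.normal(0, 1, 80)          # illustrative data
      fit = sm.OLS(y, sm.add_constant(x)).fit()
      sigma = np.sqrt(fit.scale)                        # residual standard deviation

      fig, axes = plt.subplots(3, 3, figsize=(9, 9), sharex=True, sharey=True)
      true_pos = rng.integers(0, 9)                     # hide the real plot at random
      for i, ax in enumerate(axes.flat):
          if i == true_pos:
              resid, fitted = fit.resid, fit.fittedvalues
          else:                                         # simulate new responses under the fitted model
              y_sim = fit.fittedvalues + rng.normal(0, sigma, len(x))
              sim_fit = sm.OLS(y_sim, sm.add_constant(x)).fit()
              resid, fitted = sim_fit.resid, sim_fit.fittedvalues
          ax.scatter(fitted, resid, s=10)
          ax.axhline(0, color="grey", lw=0.8)
      plt.suptitle("Which panel holds the real residuals?")
      plt.show()
      print("Real data shown in panel", true_pos + 1)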

  12. INcreasing Security and Protection through Infrastructure REsilience: The INSPIRE Project

    NASA Astrophysics Data System (ADS)

    D'Antonio, Salvatore; Romano, Luigi; Khelil, Abdelmajid; Suri, Neeraj

    The INSPIRE project aims at enhancing the European potential in the field of security by ensuring the protection of critical information infrastructures through (a) the identification of their vulnerabilities and (b) the development of innovative techniques for securing networked process control systems. To increase the resilience of such systems, INSPIRE will develop traffic engineering algorithms, diagnostic processes and self-reconfigurable architectures along with recovery techniques. Hence, the core idea of the INSPIRE project is to protect critical information infrastructures by appropriately configuring, managing, and securing the communication network which interconnects the distributed control systems. A working prototype will be implemented as a final demonstrator of selected scenarios. Controls/Communication Experts will support project partners in the validation and demonstration activities. INSPIRE will also contribute to the standardization process in order to foster multi-operator interoperability and coordinated strategies for securing lifeline systems.

  13. A Study on the Data Compression Technology-Based Intelligent Data Acquisition (IDAQ) System for Structural Health Monitoring of Civil Structures

    PubMed Central

    Jeon, Joonryong

    2017-01-01

    In this paper, a data compression technology-based intelligent data acquisition (IDAQ) system was developed for structural health monitoring of civil structures, and its validity was tested using random signals (El-Centro seismic waveform). The IDAQ system was structured to include a high-performance CPU with large dynamic memory for multi-input and output in a radio frequency (RF) manner. In addition, the embedded software technology (EST) has been applied to it to implement diverse logics needed in the process of acquiring, processing and transmitting data. In order to utilize the IDAQ system for the structural health monitoring of civil structures, this study developed an artificial filter bank by which structural dynamic responses (acceleration) were efficiently acquired, and also optimized it on the random El-Centro seismic waveform. All techniques developed in this study have been embedded in our system. The data compression technology-based IDAQ system was shown to acquire valid signals in compressed form. PMID:28704945

  14. A Study on the Data Compression Technology-Based Intelligent Data Acquisition (IDAQ) System for Structural Health Monitoring of Civil Structures.

    PubMed

    Heo, Gwanghee; Jeon, Joonryong

    2017-07-12

    In this paper, a data compression technology-based intelligent data acquisition (IDAQ) system was developed for structural health monitoring of civil structures, and its validity was tested using random signals (El-Centro seismic waveform). The IDAQ system was structured to include a high-performance CPU with large dynamic memory for multi-input and output in a radio frequency (RF) manner. In addition, the embedded software technology (EST) has been applied to it to implement diverse logics needed in the process of acquiring, processing and transmitting data. In order to utilize the IDAQ system for the structural health monitoring of civil structures, this study developed an artificial filter bank by which structural dynamic responses (acceleration) were efficiently acquired, and also optimized it on the random El-Centro seismic waveform. All techniques developed in this study have been embedded in our system. The data compression technology-based IDAQ system was shown to acquire valid signals in compressed form.

  15. Infinite hidden conditional random fields for human behavior analysis.

    PubMed

    Bousmalis, Konstantinos; Zafeiriou, Stefanos; Morency, Louis-Philippe; Pantic, Maja

    2013-01-01

    Hidden conditional random fields (HCRFs) are discriminative latent variable models that have been shown to successfully learn the hidden structure of a given classification problem (provided an appropriate validation of the number of hidden states). In this brief, we present the infinite HCRF (iHCRF), which is a nonparametric model based on hierarchical Dirichlet processes and is capable of automatically learning the optimal number of hidden states for a classification task. We show how we learn the model hyperparameters with an effective Markov-chain Monte Carlo sampling technique, and we explain the process that underlies our iHCRF model with the Restaurant Franchise Rating Agencies analogy. We show that the iHCRF is able to converge to a correct number of represented hidden states, and outperforms the best finite HCRFs, chosen via cross-validation, for the difficult tasks of recognizing instances of agreement, disagreement, and pain. Moreover, the iHCRF manages to achieve this performance in significantly less total training, validation, and testing time.

  16. Formulation and Characterization of Solid Dispersion Prepared by Hot Melt Mixing: A Fast Screening Approach for Polymer Selection

    PubMed Central

    Enose, Arno A.; Dasan, Priya K.; Sivaramakrishnan, H.; Shah, Sanket M.

    2014-01-01

    Solid dispersion is a molecular dispersion of a drug in a polymer matrix, which leads to improved solubility and hence better bioavailability. A solvent evaporation technique was employed to prepare films of different combinations of polymers, plasticizer, and a model drug, sulindac, to narrow down on a few polymer-plasticizer-sulindac combinations. The sulindac-polymer-plasticizer combination that was stable with good film forming properties was processed by hot melt mixing, a technique close to hot melt extrusion, to predict its behavior in a hot melt extrusion process. Hot melt mixing is not a substitute for hot melt extrusion but an aid in predicting the formation of a molecularly dispersed form of a given drug-polymer-plasticizer combination in a hot melt extrusion process. The formulations were characterized by advanced techniques like optical microscopy, differential scanning calorimetry, hot stage microscopy, dynamic vapor sorption, and X-ray diffraction. Subsequently, the best drug-polymer-plasticizer combination obtained by hot melt mixing was subjected to a hot melt extrusion process to validate the usefulness of hot melt mixing as a predictive tool for the hot melt extrusion process. PMID:26556187

  17. The effect of processing on the mechanical properties of self-reinforced composites

    NASA Astrophysics Data System (ADS)

    Hassani, Farzaneh; Martin, Peter J.; Falzon, Brian G.

    2018-05-01

    Hot compaction is one of the most common manufacturing methods for creating recyclable all-thermoplastic composites. The current work investigates the compaction of highly oriented self-reinforced fabrics with three processing methods to study the effect of pressure and temperature on the tensile mechanical properties of the consolidated laminates. Hot-press, calender-roller and vacuum-bag techniques were adopted to consolidate bi-component polypropylene woven fabrics over a range of pressures and compaction temperatures. Hot-pressed samples exhibited the highest quality of compaction. The modulus of the hot-pressed samples initially increased with compaction temperature due to improved interlayer bonding, and decreased after a maximum at 150°C because of partial melting of the reinforcement phase. The calender-roller technique was found to have a smaller processing temperature window, as the pressure is applied only for a short time and the fabrics start to shrink with increasing processing temperature; constraining the fabrics throughout the process is therefore paramount. The vacuum-bag results showed this technique to be the least effective method because of the low compaction pressure. Microscopic images and void content measurements of the consolidated samples further validate the results from tensile testing.

  18. The implementation of portfolio assessment by the educators on the mathematics learning process in senior high school

    NASA Astrophysics Data System (ADS)

    Lestariani, Ida; Sujadi, Imam; Pramudya, Ikrar

    2018-05-01

    Portfolio assessment can show the development of learners' abilities over a period through their work, so that the learning progress of each learner can be monitored. The purpose of this research is to describe the implementation of portfolio assessment in the mathematics learning process, with senior high school mathematics teachers of class X as the subjects, given the importance of applying this assessment to track the learning outcomes of learners. This research is a descriptive qualitative study. Data were collected through observation, interviews and documentation, and were validated using triangulation of these three techniques. Data analysis consisted of data reduction, data presentation and conclusion drawing. The results showed that the steps taken by teachers in applying portfolio assessment focused on learning outcomes. Student learning outcomes include homework and daily tests. It can be concluded that portfolio assessment was implemented only as a means of scoring learning results. Teachers have not yet implemented other portfolio assessment techniques such as student work.

  19. Energy-efficient process-stacking multiplexing access for 60-GHz mm-wave wireless personal area networks.

    PubMed

    Estevez, Claudio; Kailas, Aravind

    2012-01-01

    Millimeter-wave technology shows high potential for future wireless personal area networks, reaching over 1 Gbps transmissions using simple modulation techniques. Current specifications consider dividing the spectrum into effortlessly separable spectrum ranges. These low requirements open a research area in time and space multiplexing techniques for millimeter waves. In this work a process-stacking multiplexing access algorithm is designed for single channel operation. The concept is intuitive, but its implementation is not trivial. The key to stacking single channel events is to operate while simultaneously obtaining and handling a-posteriori time-frame information of scheduled events. This information is used to shift a global time pointer that the wireless access point manages and uses to synchronize all serviced nodes. The performance of the proposed multiplexing access technique is lower bounded by the performance of legacy TDMA and can significantly improve the effective throughput. The work is validated by simulation results.

  20. Large-scale experimental technology with remote sensing in land surface hydrology and meteorology

    NASA Technical Reports Server (NTRS)

    Brutsaert, Wilfried; Schmugge, Thomas J.; Sellers, Piers J.; Hall, Forrest G.

    1988-01-01

    Two field experiments to study atmospheric and land surface processes and their interactions are summarized. The Hydrologic-Atmospheric Pilot Experiment, which tested techniques for measuring evaporation, soil moisture storage, and runoff at scales of about 100 km, was conducted over a 100 X 100 km area in France from mid-1985 to early 1987. The first International Satellite Land Surface Climatology Program field experiment was conducted in 1987 to develop and use relationships between current satellite measurements and hydrologic, climatic, and biophysical variables at the earth's surface and to validate these relationships with ground truth. This experiment also validated surface parameterization methods for simulation models that describe surface processes from the scale of vegetation leaves up to scales appropriate to satellite remote sensing.

  1. Simulation verification techniques study: Simulation performance validation techniques document. [for the space shuttle system

    NASA Technical Reports Server (NTRS)

    Duncan, L. M.; Reddell, J. P.; Schoonmaker, P. B.

    1975-01-01

    Techniques and support software for the efficient performance of simulation validation are discussed. Overall validation software structure, the performance of validation at various levels of simulation integration, guidelines for check case formulation, methods for real time acquisition and formatting of data from an all up operational simulator, and methods and criteria for comparison and evaluation of simulation data are included. Vehicle subsystems modules, module integration, special test requirements, and reference data formats are also described.

  2. Advances in Neutron Radiography: Application to Additive Manufacturing Inconel 718

    DOE PAGES

    Bilheux, Hassina Z; Song, Gian; An, Ke; ...

    2016-01-01

    Reactor-based neutron radiography is a non-destructive, non-invasive characterization technique that has been extensively used for engineering materials such as inspection of components, evaluation of porosity, and in-operando observations of engineering parts. Neutron radiography has flourished at reactor facilities for more than four decades and is relatively new to accelerator-based neutron sources. Recent advances in neutron source and detector technologies, such as the Spallation Neutron Source (SNS) at the Oak Ridge National Laboratory (ORNL) in Oak Ridge, TN, and the microchannel plate (MCP) detector, respectively, enable new contrast mechanisms using the neutron scattering Bragg features for crystalline information such as average lattice strain, crystalline plane orientation, and identification of phases in a neutron radiograph. Additive manufacturing (AM) processes or 3D printing have recently become very popular and have a significant potential to revolutionize the manufacturing of materials by enabling new designs with complex geometries that are not feasible using conventional manufacturing processes. However, the technique lacks standards for process optimization and control compared to conventional processes. Residual stresses are a common occurrence in materials that are machined, rolled, heat treated, welded, etc., and have a significant impact on a component's mechanical behavior and durability. They may also arise during the 3D printing process, and defects such as internal cracks can propagate over time as the component relaxes after being removed from its build plate (the base plate utilized to print materials on). Moreover, since access to the AM material is possible only after the component has been fully manufactured, it is difficult to characterize the material for defects a priori to minimize expensive re-runs. Currently, validation of the AM process and materials is mainly through expensive trial-and-error experiments at the component level, whereas in conventional processes the level of confidence in predictive computational modeling is high enough to allow process and materials optimization through computational approaches. Thus, there is a clear need for non-destructive characterization techniques and for the establishment of processing-microstructure databases that can be used for developing and validating predictive modeling tools for AM.

  3. Strain analysis in CRT candidates using the novel segment length in cine (SLICE) post-processing technique on standard CMR cine images.

    PubMed

    Zweerink, Alwin; Allaart, Cornelis P; Kuijer, Joost P A; Wu, LiNa; Beek, Aernout M; van de Ven, Peter M; Meine, Mathias; Croisille, Pierre; Clarysse, Patrick; van Rossum, Albert C; Nijveldt, Robin

    2017-12-01

    Although myocardial strain analysis is a potential tool to improve patient selection for cardiac resynchronization therapy (CRT), there is currently no validated clinical approach to derive segmental strains. We evaluated the novel segment length in cine (SLICE) technique to derive segmental strains from standard cardiovascular MR (CMR) cine images in CRT candidates. Twenty-seven patients with left bundle branch block underwent CMR examination including cine imaging and myocardial tagging (CMR-TAG). SLICE was performed by measuring segment length between anatomical landmarks throughout all phases on short-axis cines. This measure of frame-to-frame segment length change was compared to CMR-TAG circumferential strain measurements. Subsequently, conventional markers of CRT response were calculated. Segmental strains showed good to excellent agreement between SLICE and CMR-TAG (septum strain, intraclass correlation coefficient (ICC) 0.76; lateral wall strain, ICC 0.66). Conventional markers of CRT response also showed close agreement between both methods (ICC 0.61-0.78). Reproducibility of SLICE was excellent for intra-observer testing (all ICC ≥0.76) and good for interobserver testing (all ICC ≥0.61). The novel SLICE post-processing technique on standard CMR cine images offers both accurate and robust segmental strain measures compared to the 'gold standard' CMR-TAG technique, and has the advantage of being widely available. • Myocardial strain analysis could potentially improve patient selection for CRT. • Currently a well validated clinical approach to derive segmental strains is lacking. • The novel SLICE technique derives segmental strains from standard CMR cine images. • SLICE-derived strain markers of CRT response showed close agreement with CMR-TAG. • Future studies will focus on the prognostic value of SLICE in CRT candidates.
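
    The core arithmetic of the segment-length approach reduces to Lagrangian strain computed from per-frame segment lengths. The Python sketch below assumes the landmark-to-landmark lengths have already been measured on each cine frame; the numbers are invented for illustration and this is not the SLICE software itself.

      # Sketch: segmental Lagrangian strain from segment lengths over the cardiac cycle.
      import numpy as np

      def segmental_strain(lengths, ref_index=0):
          """Strain (%) per frame: (L(t) - L_ref) / L_ref * 100."""
          lengths = np.asarray(lengths, dtype=float)
          l_ref = lengths[ref_index]                   # usually the end-diastolic frame
          return (lengths - l_ref) / l_ref * 100.0

      # Hypothetical septal segment lengths (mm) over a few frames:
      septal = [52.0, 50.1, 47.8, 46.5, 47.9, 50.5, 52.1]
      strain = segmental_strain(septal)
      print("peak systolic strain: %.1f %%" % strain.min())   # shortening gives negative strain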

  4. Validation of Multilevel Constructs: Validation Methods and Empirical Findings for the EDI

    ERIC Educational Resources Information Center

    Forer, Barry; Zumbo, Bruno D.

    2011-01-01

    The purposes of this paper are to highlight the foundations of multilevel construct validation, describe two methodological approaches and associated analytic techniques, and then apply these approaches and techniques to the multilevel construct validation of a widely-used school readiness measure called the Early Development Instrument (EDI;…

  5. Developing material for promoting problem-solving ability through bar modeling technique

    NASA Astrophysics Data System (ADS)

    Widyasari, N.; Rosiyanti, H.

    2018-01-01

    This study aimed at developing material for enhancing problem-solving ability through the bar modeling technique with thematic learning. Polya’s steps of problem-solving were chosen as the basis of the study. The method of the study was research and development. The subjects of this study were fifteen fifth-grade students of the Lab-school FIP UMJ elementary school. Expert review and analysis of students’ responses were used to collect the data. Furthermore, the data were analyzed using qualitative descriptive and quantitative methods. The findings showed that the material in the theme “Selalu Berhemat Energi” was categorized as valid and practical. Validity was measured using the aspects of language, content, and graphics. Based on the expert comments, the materials were easy to implement in the teaching-learning process. In addition, the students’ responses showed that the material was both interesting and easy to understand. Thus, students gained more understanding in learning problem-solving.

  6. [Development and validation of a questionnaire on perception of portfolio by undergraduate medical students].

    PubMed

    Riquelme, Arnoldo; Méndez, Benjamín; de la Fuente, Paloma; Padilla, Oslando; Benaglio, Carla; Sirhan, Marisol; Labarca, Jaime

    2011-01-01

    The portfolio is an innovative instrument that promotes reflection, creativity and professionalism among students. The aim of this work was to describe the development and validation process of a questionnaire to evaluate the use of the portfolio by undergraduate medical students. Focus groups with students and teachers were employed to identify aspects related to the portfolio in undergraduate teaching. The Delphi technique was used to prioritize relevant aspects and construct the questionnaire. The validated questionnaire, consisting of 43 items and 6 factors, was applied to 97 students (response rate of 99.9%) in 2007 and 100 students (99.2%) in 2008. Each question had to be answered using a Likert scale, from 0 (completely disagree) to 4 (completely agree). The validity and reliability of the questionnaire were evaluated. The questionnaire showed a high reliability (Cronbach alpha = 0.9). The mean total scores obtained in 2007 and 2008 were 106.2 ± 21.2 (61.7% of the maximal obtainable score) and 104.6 ± 34.0 (60.8% of the maximal obtainable score), respectively. No significant differences were seen in the analysis by factors. Changes in the portfolio during 2008 showed differences in items related to organization, evaluation and regulation. The questionnaire is a valid and highly reliable instrument, measuring the perceptions of undergraduate medical students about the portfolio. The students perceived an improvement in their creativity and professionalism as one of the strengths of the portfolio. The weaknesses identified during the implementation process helped us to focus changes in organization and evaluation to improve the portfolio as a dynamic process.

  7. Quantitative analysis of packed and compacted granular systems by x-ray microtomography

    NASA Astrophysics Data System (ADS)

    Fu, Xiaowei; Milroy, Georgina E.; Dutt, Meenakshi; Bentham, A. Craig; Hancock, Bruno C.; Elliott, James A.

    2005-04-01

    The packing and compaction of powders are general processes in the pharmaceutical, food, ceramic and powder metallurgy industries. Understanding how particles pack in a confined space and how powders behave during compaction is crucial for producing high quality products. This paper outlines a new technique, based on modern desktop X-ray tomography and image processing, to quantitatively investigate the packing of particles during powder compaction and to provide insights into how powders densify during compaction, relating material properties and processing conditions to tablet manufacture. A variety of powder systems were considered, including glass, sugar and NaCl, with a typical particle size of 200-300 μm, as well as binary mixtures of NaCl-glass spheres. The results are new and have been validated by SEM observation and numerical simulations using discrete element methods (DEM). The research demonstrates that the XMT technique has the potential to support further investigation of pharmaceutical processing and to verify other physical models of complex packing.
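
    A basic quantity extracted from such binarized tomograms is the packing (solid) fraction and its variation through the compact. The Python sketch below uses a randomly generated boolean volume as a stand-in for a thresholded XMT reconstruction; it is an illustration of the arithmetic, not the image-processing pipeline of the paper.

      # Sketch: packing fraction of a binarized 3-D volume (True = solid voxel).
      import numpy as np

      rng = np.random.default_rng(0)
      volume = rng.random((64, 128, 128)) < 0.62      # placeholder for a real binarized scan

      packing_fraction = volume.mean()                # solid voxels / total voxels
      profile = volume.mean(axis=(1, 2))              # packing fraction per horizontal slice
      print(f"overall packing fraction: {packing_fraction:.3f}")
      print("slice-wise range: %.3f - %.3f" % (profile.min(), profile.max()))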

  8. The role of failure modes and effects analysis in showing the benefits of automation in the blood bank.

    PubMed

    Han, Tae Hee; Kim, Moon Jung; Kim, Shinyoung; Kim, Hyun Ok; Lee, Mi Ae; Choi, Ji Seon; Hur, Mina; St John, Andrew

    2013-05-01

    Failure modes and effects analysis (FMEA) is a risk management tool used by the manufacturing industry but now being applied in laboratories. Teams from six South Korean blood banks used this tool to map their manual and automated blood grouping processes and determine the risk priority numbers (RPNs) as a total measure of error risk. The RPNs determined by each of the teams consistently showed that the use of automation dramatically reduced the RPN compared to manual processes. In addition, FMEA showed where the major risks occur in each of the manual processes and where attention should be prioritized to improve the process. Despite no previous experience with FMEA, the teams found the technique relatively easy to use and the subjectivity associated with assigning risk numbers did not affect the validity of the data. FMEA should become a routine technique for improving processes in laboratories. © 2012 American Association of Blood Banks.
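
    The RPN itself is simple arithmetic: each failure mode is scored for severity, occurrence and detectability (commonly on 1-10 scales) and the three scores are multiplied. The Python sketch below illustrates the calculation on invented process steps and scores, not the published blood-bank data.

      # Sketch: risk priority numbers (RPN = severity * occurrence * detectability).
      failure_modes = [
          # (process step, S, O, D) -- illustrative values only
          ("sample labelling",        8, 4, 3),
          ("manual ABO/RhD grouping", 9, 3, 4),
          ("result transcription",    7, 5, 2),
      ]

      scored = sorted(((s * o * d, step) for step, s, o, d in failure_modes), reverse=True)
      total_rpn = sum(rpn for rpn, _ in scored)
      print("total RPN:", total_rpn)
      for rpn, step in scored:                        # highest-risk steps first
          print(f"  {step:25s} RPN = {rpn}")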

  9. Ship Speed Retrieval From Single Channel TerraSAR-X Data

    NASA Astrophysics Data System (ADS)

    Soccorsi, Matteo; Lehner, Susanne

    2010-04-01

    A method to estimate the speed of a moving ship is presented. The technique, introduced in Kirscht (1998), is extended to marine application and validated on TerraSAR-X High-Resolution (HR) data. The generation of a sequence of single-look SAR images from a single-channel image corresponds to an image time series with reduced resolution. This allows change detection techniques to be applied to the time series to evaluate the ship's velocity components in range and azimuth. The evaluation of the displacement vector of a moving target in consecutive images of the sequence allows the estimation of the azimuth velocity component. The range velocity component is estimated by evaluating the variation of the signal amplitude during the sequence. In order to apply the technique to TerraSAR-X Spot Light (SL) data, a further processing step is needed. The phase has to be corrected as presented in Eineder et al. (2009) due to the SL acquisition mode; otherwise the image sequence cannot be generated. The analysis, validated against the Automatic Identification System (AIS) where possible, was performed in the framework of the ESA project MARISS.
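
    The azimuth-velocity step described above amounts to converting a pixel displacement between time-separated sub-look images into metres per second. The Python sketch below shows only that conversion, with invented numbers; the generation of the single-look image sequence itself is not reproduced here.

      # Sketch: azimuth velocity from target displacement between sub-look images.
      azimuth_pixel_spacing = 2.0    # m per pixel (assumed)
      pixel_shift = 3                # displacement between consecutive sub-looks (pixels)
      dt = 0.4                       # time separation of the sub-look images (s)

      v_azimuth = pixel_shift * azimuth_pixel_spacing / dt
      print(f"estimated azimuth velocity: {v_azimuth:.1f} m/s "
            f"({v_azimuth * 3.6 / 1.852:.1f} kn)")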

  10. Text Classification for Organizational Researchers

    PubMed Central

    Kobayashi, Vladimer B.; Mol, Stefan T.; Berkers, Hannah A.; Kismihók, Gábor; Den Hartog, Deanne N.

    2017-01-01

    Organizations are increasingly interested in classifying texts or parts thereof into categories, as this enables more effective use of their information. Manual procedures for text classification work well for up to a few hundred documents. However, when the number of documents is larger, manual procedures become laborious, time-consuming, and potentially unreliable. Techniques from text mining facilitate the automatic assignment of text strings to categories, making classification expedient, fast, and reliable, which creates potential for its application in organizational research. The purpose of this article is to familiarize organizational researchers with text mining techniques from machine learning and statistics. We describe the text classification process in several roughly sequential steps, namely training data preparation, preprocessing, transformation, application of classification techniques, and validation, and provide concrete recommendations at each step. To help researchers develop their own text classifiers, the R code associated with each step is presented in a tutorial. The tutorial draws from our own work on job vacancy mining. We end the article by discussing how researchers can validate a text classification model and the associated output. PMID:29881249
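
    The article's tutorial provides R code; the Python sketch below (using scikit-learn) walks through the same sequential steps on made-up example texts: training data, preprocessing and transformation, application of a classification technique, and validation.

      # Sketch: a minimal text classification pipeline with cross-validated accuracy.
      from sklearn.pipeline import Pipeline
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      texts = ["seeking data analyst with R experience",
               "nurse needed for night shifts",
               "machine learning engineer, Python required",
               "registered nurse, ICU ward",
               "statistician for clinical trials",
               "care assistant for elderly patients"]
      labels = ["tech", "health", "tech", "health", "tech", "health"]

      clf = Pipeline([
          ("tfidf", TfidfVectorizer(lowercase=True, stop_words="english")),  # preprocessing + transformation
          ("model", LogisticRegression(max_iter=1000)),                      # classification technique
      ])
      scores = cross_val_score(clf, texts, labels, cv=3)                     # validation step
      print("cross-validated accuracy:", scores.mean())
      clf.fit(texts, labels)
      print(clf.predict(["experienced ICU nurse wanted"]))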

  11. [Validation and verification of microbiology methods].

    PubMed

    Camaró-Sala, María Luisa; Martínez-García, Rosana; Olmos-Martínez, Piedad; Catalá-Cuenca, Vicente; Ocete-Mochón, María Dolores; Gimeno-Cardona, Concepción

    2015-01-01

    Clinical microbiologists should ensure, to the maximum level allowed by scientific and technical development, the reliability of their results. This implies that, in addition to meeting the technical criteria that ensure their validity, tests must be performed under conditions that allow comparable results to be obtained regardless of the laboratory performing them. In this sense, the use of recognized and accepted reference methods is the most effective tool for providing these guarantees. Activities related to the verification and validation of analytical methods have become very important, given the continuous development and updating of techniques, increasingly complex analytical equipment, and the interest of professionals in ensuring quality processes and results. The definitions of validation and verification are described, along with the different types of validation/verification, the types of methods, and the level of validation necessary depending on the degree of standardization. The situations in which validation/verification is mandatory and/or recommended are discussed, including those particularly related to validation in Microbiology. The importance of promoting the use of reference strains as controls in Microbiology and the use of standard controls is stressed, as well as the importance of participation in External Quality Assessment programs to demonstrate technical competence. Emphasis is placed on how to calculate some of the parameters required for validation/verification, such as accuracy and precision. The development of these concepts can be found in the microbiological procedure SEIMC number 48: «Validation and verification of microbiological methods» www.seimc.org/protocols/microbiology. Copyright © 2013 Elsevier España, S.L.U. y Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.

  12. Validation of an Improved Computer-Assisted Technique for Mining Free-Text Electronic Medical Records.

    PubMed

    Duz, Marco; Marshall, John F; Parkin, Tim

    2017-06-29

    The use of electronic medical records (EMRs) offers opportunity for clinical epidemiological research. With large EMR databases, automated analysis processes are necessary but require thorough validation before they can be routinely used. The aim of this study was to validate a computer-assisted technique using commercially available content analysis software (SimStat-WordStat v.6 (SS/WS), Provalis Research) for mining free-text EMRs. The dataset used for the validation process included life-long EMRs from 335 patients (17,563 rows of data), selected at random from a larger dataset (141,543 patients, ~2.6 million rows of data) and obtained from 10 equine veterinary practices in the United Kingdom. The ability of the computer-assisted technique to detect rows of data (cases) of colic, renal failure, right dorsal colitis, and non-steroidal anti-inflammatory drug (NSAID) use in the population was compared with manual classification. The first step of the computer-assisted analysis process was the definition of inclusion dictionaries to identify cases, including terms identifying a condition of interest. Words in inclusion dictionaries were selected from the list of all words in the dataset obtained in SS/WS. The second step consisted of defining an exclusion dictionary, including combinations of words to remove cases erroneously classified by the inclusion dictionary alone. The third step was the definition of a reinclusion dictionary to reinclude cases that had been erroneously classified by the exclusion dictionary. Finally, cases obtained by the exclusion dictionary were removed from cases obtained by the inclusion dictionary, and cases from the reinclusion dictionary were subsequently reincluded using Rv3.0.2 (R Foundation for Statistical Computing, Vienna, Austria). Manual analysis was performed as a separate process by a single experienced clinician reading through the dataset once and classifying each row of data based on the interpretation of the free-text notes. Validation was performed by comparison of the computer-assisted method with manual analysis, which was used as the gold standard. Sensitivity, specificity, negative predictive values (NPVs), positive predictive values (PPVs), and F values of the computer-assisted process were calculated by comparing them with the manual classification. Lowest sensitivity, specificity, PPVs, NPVs, and F values were 99.82% (1128/1130), 99.88% (16410/16429), 94.6% (223/239), 100.00% (16410/16412), and 99.0% (100×2×0.983×0.998/[0.983+0.998]), respectively. The computer-assisted process required a few seconds to run, although an estimated 30 h were required for dictionary creation. Manual classification required approximately 80 man-hours. The critical step in this work is the creation of accurate and inclusive dictionaries to ensure that no potential cases are missed. It is significantly easier to remove false positive terms from an SS/WS-selected subset of a large database than to search that original database for potential false negatives. The benefits of using this method are proportional to the size of the dataset to be analyzed. ©Marco Duz, John F Marshall, Tim Parkin. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 29.06.2017.
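
    The dictionary logic described above can be summarized as set operations on row indices, with sensitivity and specificity computed against the manual gold standard. The Python sketch below uses invented dictionaries and records, not those of the study, and simple substring matching in place of the SS/WS software.

      # Sketch: inclusion / exclusion / reinclusion dictionaries plus validation metrics.
      def classify(records, include, exclude, reinclude):
          """Return the set of row indices flagged as cases."""
          def hits(text, terms):
              t = text.lower()
              return any(term in t for term in terms)
          included   = {i for i, r in enumerate(records) if hits(r, include)}
          excluded   = {i for i in included if hits(records[i], exclude)}
          reincluded = {i for i in excluded if hits(records[i], reinclude)}
          return (included - excluded) | reincluded

      def sens_spec(predicted, truth, n_rows):
          tp = len(predicted & truth)
          fp = len(predicted - truth)
          fn = len(truth - predicted)
          tn = n_rows - tp - fp - fn
          return tp / (tp + fn), tn / (tn + fp)

      records = ["mild colic signs, resolved with analgesia",
                 "routine dental exam",
                 "no evidence of colic today",
                 "colic, referred for surgery despite no response to NSAID"]
      truth = {0, 3}                                  # rows manually classified as colic cases
      cases = classify(records, include=["colic"], exclude=["no evidence of colic"],
                       reinclude=["referred for surgery"])
      print("flagged rows:", sorted(cases))
      print("sensitivity, specificity:", sens_spec(cases, truth, len(records)))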

  13. Validation of an Improved Computer-Assisted Technique for Mining Free-Text Electronic Medical Records

    PubMed Central

    Marshall, John F; Parkin, Tim

    2017-01-01

    Background The use of electronic medical records (EMRs) offers opportunity for clinical epidemiological research. With large EMR databases, automated analysis processes are necessary but require thorough validation before they can be routinely used. Objective The aim of this study was to validate a computer-assisted technique using commercially available content analysis software (SimStat-WordStat v.6 (SS/WS), Provalis Research) for mining free-text EMRs. Methods The dataset used for the validation process included life-long EMRs from 335 patients (17,563 rows of data), selected at random from a larger dataset (141,543 patients, ~2.6 million rows of data) and obtained from 10 equine veterinary practices in the United Kingdom. The ability of the computer-assisted technique to detect rows of data (cases) of colic, renal failure, right dorsal colitis, and non-steroidal anti-inflammatory drug (NSAID) use in the population was compared with manual classification. The first step of the computer-assisted analysis process was the definition of inclusion dictionaries to identify cases, including terms identifying a condition of interest. Words in inclusion dictionaries were selected from the list of all words in the dataset obtained in SS/WS. The second step consisted of defining an exclusion dictionary, including combinations of words to remove cases erroneously classified by the inclusion dictionary alone. The third step was the definition of a reinclusion dictionary to reinclude cases that had been erroneously classified by the exclusion dictionary. Finally, cases obtained by the exclusion dictionary were removed from cases obtained by the inclusion dictionary, and cases from the reinclusion dictionary were subsequently reincluded using Rv3.0.2 (R Foundation for Statistical Computing, Vienna, Austria). Manual analysis was performed as a separate process by a single experienced clinician reading through the dataset once and classifying each row of data based on the interpretation of the free-text notes. Validation was performed by comparison of the computer-assisted method with manual analysis, which was used as the gold standard. Sensitivity, specificity, negative predictive values (NPVs), positive predictive values (PPVs), and F values of the computer-assisted process were calculated by comparing them with the manual classification. Results Lowest sensitivity, specificity, PPVs, NPVs, and F values were 99.82% (1128/1130), 99.88% (16410/16429), 94.6% (223/239), 100.00% (16410/16412), and 99.0% (100×2×0.983×0.998/[0.983+0.998]), respectively. The computer-assisted process required a few seconds to run, although an estimated 30 h were required for dictionary creation. Manual classification required approximately 80 man-hours. Conclusions The critical step in this work is the creation of accurate and inclusive dictionaries to ensure that no potential cases are missed. It is significantly easier to remove false positive terms from an SS/WS-selected subset of a large database than to search that original database for potential false negatives. The benefits of using this method are proportional to the size of the dataset to be analyzed. PMID:28663163

  14. Independent validation of Swarm Level 2 magnetic field products and `Quick Look' for Level 1b data

    NASA Astrophysics Data System (ADS)

    Beggan, Ciarán D.; Macmillan, Susan; Hamilton, Brian; Thomson, Alan W. P.

    2013-11-01

    Magnetic field models are produced on behalf of the European Space Agency (ESA) by an independent scientific consortium known as the Swarm Satellite Constellation Application and Research Facility (SCARF), through the Level 2 Processor (L2PS). The consortium primarily produces magnetic field models for the core, lithosphere, ionosphere and magnetosphere. Typically, for each magnetic product, two magnetic field models are produced in separate chains using complementary data selection and processing techniques. Hence, the magnetic field models from the complementary processing chains will be similar but not identical. The final step in the overall L2PS therefore involves inspection and validation of the magnetic field models against each other and against data from (semi-) independent sources (e.g. ground observatories). We describe the validation steps for each magnetic field product and the comparison against independent datasets, and we show examples of the output of the validation. In addition, the L2PS also produces a daily set of `Quick Look' output graphics and statistics to monitor the overall quality of Level 1b data issued by ESA. We describe the outputs of the `Quick Look' chain.

  15. Validation of Reverse-Engineered and Additive-Manufactured Microsurgical Instrument Prototype.

    PubMed

    Singh, Ramandeep; Suri, Ashish; Anand, Sneh; Baby, Britty

    2016-12-01

    With advancements in imaging techniques, neurosurgical procedures are becoming highly precise and minimally invasive, thus demanding the development of new, ergonomically aesthetic instruments. Conventionally, neurosurgical instruments are manufactured using subtractive manufacturing methods. Such a process is complex, time-consuming, and impractical for prototype development and validation of new designs. Therefore, an alternative design process has been used utilizing blue light scanning, computer-aided designing, and additive manufacturing direct metal laser sintering (DMLS) for microsurgical instrument prototype development. Deviations of the DMLS-fabricated instrument were studied by superimposing scan data of the fabricated instrument on the computer-aided design model. Content and concurrent validity of the fabricated prototypes were assessed by a group of 15 neurosurgeons performing sciatic nerve anastomosis in small laboratory animals. Comparative scoring was obtained for the control and study instruments. A t test was applied to the individual parameters, and P values for force (P < .0001) and surface roughness (P < .01) were found to be statistically significant. These 2 parameters were further analyzed using objective measures. The results show that additive manufacturing by DMLS provides an effective method for prototype development. However, direct application of these additive-manufactured instruments in the operating room requires further validation. © The Author(s) 2016.

  16. The Naïve Overfitting Index Selection (NOIS): A new method to optimize model complexity for hyperspectral data

    NASA Astrophysics Data System (ADS)

    Rocha, Alby D.; Groen, Thomas A.; Skidmore, Andrew K.; Darvishzadeh, Roshanak; Willemen, Louise

    2017-11-01

    The growing number of narrow spectral bands in hyperspectral remote sensing improves the capacity to describe and predict biological processes in ecosystems. But it also poses a challenge when fitting empirical models to such high-dimensional data, which often contain correlated and noisy predictors. As the sample sizes available to train and validate empirical models do not seem to be increasing at the same rate, overfitting has become a serious concern. Overly complex models lead to overfitting by capturing more than the underlying relationship, and also through fitting random noise in the data. Many regression techniques claim to overcome these problems by using different strategies to constrain complexity, such as limiting the number of terms in the model, creating latent variables or shrinking parameter coefficients. This paper proposes a new method, named Naïve Overfitting Index Selection (NOIS), which makes use of artificially generated spectra to quantify relative model overfitting and to select an optimal model complexity supported by the data. The robustness of this new method is assessed by comparing it to a traditional model selection based on cross-validation. The optimal model complexity is determined for seven different regression techniques, such as partial least squares regression, support vector machines, artificial neural networks and tree-based regressions, using five hyperspectral datasets. The NOIS method selects less complex models, which present accuracies similar to the cross-validation method. The NOIS method reduces the chance of overfitting, thereby avoiding models that produce accurate predictions valid only for the data used and that are too complex to support inferences about the underlying process.
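
    A heavily simplified illustration of the underlying idea (not the published NOIS algorithm) is sketched below in Python: models of increasing complexity are fitted both to the real spectra and to artificially generated spectra that carry no information about the response, and the apparent fit obtained on the artificial data serves as an index of pure overfitting. The data, the PLS model and the selection rule (largest real-minus-artificial margin) are all assumptions chosen for illustration.

      # Sketch: overfitting index from artificial spectra vs. calibration fit on real spectra.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.metrics import r2_score

      rng = np.random.default_rng(0)
      n_samples, n_bands = 60, 200                    # few samples, many correlated bands
      latent = rng.normal(size=(n_samples, 5))
      X = latent @ rng.normal(size=(5, n_bands)) + 0.1 * rng.normal(size=(n_samples, n_bands))
      y = latent[:, 0] + 0.2 * rng.normal(size=n_samples)

      def calibration_r2(X, y, n_comp):
          pls = PLSRegression(n_components=n_comp).fit(X, y)
          return r2_score(y, pls.predict(X).ravel())

      X_art = rng.normal(size=X.shape)                # artificial spectra, unrelated to y
      results = []
      for k in range(1, 11):
          r2_real = calibration_r2(X, y, k)
          r2_art = calibration_r2(X_art, y, k)        # apparent fit explainable by overfitting alone
          results.append((r2_real - r2_art, k))
          print(f"{k:2d} components: R2(real)={r2_real:.3f}  R2(artificial)={r2_art:.3f}")
      best_gap, best_k = max(results)
      print("complexity with the largest real-minus-artificial margin:", best_k)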

  17. Cognitive Support During High-Consequence Episodes of Care in Cardiovascular Surgery.

    PubMed

    Conboy, Heather M; Avrunin, George S; Clarke, Lori A; Osterweil, Leon J; Christov, Stefan C; Goldman, Julian M; Yule, Steven J; Zenati, Marco A

    2017-03-01

    Despite significant efforts to reduce preventable adverse events in medical processes, such events continue to occur at unacceptable rates. This paper describes a computer science approach that uses formal process modeling to provide situationally aware monitoring and management support to medical professionals performing complex processes. These process models represent both normative and non-normative situations, and are validated by rigorous automated techniques such as model checking and fault tree analysis, in addition to careful review by experts. Context-aware Smart Checklists are then generated from the models, providing cognitive support during high-consequence surgical episodes. The approach is illustrated with a case study in cardiovascular surgery.

  18. Cross-Correlation-Based Structural System Identification Using Unmanned Aerial Vehicles

    PubMed Central

    Yoon, Hyungchul; Hoskere, Vedhus; Park, Jong-Woong; Spencer, Billie F.

    2017-01-01

    Computer vision techniques have been employed to characterize dynamic properties of structures, as well as to capture structural motion for system identification purposes. All of these methods leverage image-processing techniques using a stationary camera. This requirement makes finding an effective location for camera installation difficult, because civil infrastructure (e.g., bridges and buildings) is often difficult to access, being constructed over rivers, roads, or other obstacles. This paper seeks to use video from Unmanned Aerial Vehicles (UAVs) to address this problem. As opposed to the traditional way of using stationary cameras, the use of UAVs brings the issue of the camera itself moving; thus, the displacements of the structure obtained by processing UAV video are relative to the UAV camera. Some efforts have been reported to compensate for the camera motion, but they require certain assumptions that may be difficult to satisfy. This paper proposes a new method for structural system identification using the UAV video directly. Several challenges are addressed, including: (1) estimation of an appropriate scale factor; and (2) compensation for the rolling shutter effect. Experiments are carried out to validate the proposed approach. The experimental results demonstrate the efficacy and significant potential of the proposed approach. PMID:28891985

  19. The Novel Nonlinear Adaptive Doppler Shift Estimation Technique and the Coherent Doppler Lidar System Validation Lidar

    NASA Technical Reports Server (NTRS)

    Beyon, Jeffrey Y.; Koch, Grady J.

    2006-01-01

    The signal processing aspect of a 2-m wavelength coherent Doppler lidar system under development at NASA Langley Research Center in Virginia is investigated in this paper. The lidar system is named VALIDAR (validation lidar), and its signal processing program estimates and displays various wind parameters in real time as data acquisition occurs. The goal is to improve the quality of the current estimates such as power, Doppler shift, wind speed, and wind direction, especially in the low signal-to-noise-ratio (SNR) regime. A novel Nonlinear Adaptive Doppler Shift Estimation Technique (NADSET) is developed for this purpose, and its performance is analyzed using wind data acquired over a long period of time by VALIDAR. The quality of Doppler shift and power estimations by conventional Fourier-transform-based spectrum estimation methods deteriorates rapidly as SNR decreases. NADSET compensates for this deterioration in the quality of wind parameter estimates by adaptively utilizing the statistics of Doppler shift estimates in the strong-SNR range and identifying sporadic range bins where good Doppler shift estimates are found. The effectiveness of NADSET is established by comparing the trends of wind parameters with and without NADSET applied to the long-period lidar return data.
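
    The SNR-adaptive idea can be caricatured in a few lines: estimate the Doppler frequency per range bin from a periodogram peak, trust only the high-SNR bins on a first pass, and constrain the search in weak bins to a window around the statistics of the trusted estimates. The Python sketch below is such a caricature with invented parameters; it is not the NADSET implementation.

      # Sketch: periodogram Doppler estimates, refined in weak bins using strong-bin statistics.
      import numpy as np

      fs = 100e6                                      # hypothetical sampling rate (Hz)
      n = 512

      def doppler_estimate(signal, f_lo=None, f_hi=None):
          spec = np.abs(np.fft.rfft(signal)) ** 2
          freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
          mask = np.ones_like(freqs, dtype=bool)
          if f_lo is not None:
              mask &= (freqs >= f_lo) & (freqs <= f_hi)   # constrained search window
          idx = np.argmax(np.where(mask, spec, 0.0))
          return freqs[idx], spec.max() / np.median(spec) # (peak frequency, crude SNR proxy)

      rng = np.random.default_rng(3)
      true_fd = 12.5e6
      bins = [np.cos(2 * np.pi * true_fd / fs * np.arange(n)) * a + rng.normal(0, 1, n)
              for a in (3.0, 2.5, 0.2, 0.15)]         # two strong and two weak range bins

      first_pass = [doppler_estimate(b) for b in bins]
      strong = [f for f, snr in first_pass if snr > 50]   # estimates trusted on their own
      f_ref = np.median(strong)
      refined = [f if snr > 50 else doppler_estimate(b, f_ref - 1e6, f_ref + 1e6)[0]
                 for b, (f, snr) in zip(bins, first_pass)]
      print("reference Doppler from strong bins: %.2f MHz" % (f_ref / 1e6))
      print("refined estimates (MHz):", [round(f / 1e6, 2) for f in refined])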

  20. Applying modern psychometric techniques to melodic discrimination testing: Item response theory, computerised adaptive testing, and automatic item generation.

    PubMed

    Harrison, Peter M C; Collins, Tom; Müllensiefen, Daniel

    2017-06-15

    Modern psychometric theory provides many useful tools for ability testing, such as item response theory, computerised adaptive testing, and automatic item generation. However, these techniques have yet to be integrated into mainstream psychological practice. This is unfortunate, because modern psychometric techniques can bring many benefits, including sophisticated reliability measures, improved construct validity, avoidance of exposure effects, and improved efficiency. In the present research we therefore use these techniques to develop a new test of a well-studied psychological capacity: melodic discrimination, the ability to detect differences between melodies. We calibrate and validate this test in a series of studies. Studies 1 and 2 respectively calibrate and validate an initial test version, while Studies 3 and 4 calibrate and validate an updated test version incorporating additional easy items. The results support the new test's viability, with evidence for strong reliability and construct validity. We discuss how these modern psychometric techniques may also be profitably applied to other areas of music psychology and psychological science in general.
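
    Two of the ingredients named above, the item response function and the item information used by computerised adaptive testing to pick the next item, are compact enough to sketch directly. The Python code below assumes the two-parameter logistic (2PL) model and an invented item bank; it is illustrative and not taken from the published test.

      # Sketch: 2PL item response function, item information, and max-information item selection.
      import numpy as np

      def p_correct(theta, a, b):
          """2PL probability of a correct response for ability theta,
          discrimination a and difficulty b."""
          return 1.0 / (1.0 + np.exp(-a * (theta - b)))

      def item_information(theta, a, b):
          p = p_correct(theta, a, b)
          return a ** 2 * p * (1.0 - p)               # Fisher information of a 2PL item

      # Hypothetical item bank: (discrimination, difficulty)
      bank = np.array([(1.2, -1.0), (0.8, 0.0), (1.5, 0.5), (2.0, 1.2)])
      theta_hat = 0.4                                 # current ability estimate
      info = item_information(theta_hat, bank[:, 0], bank[:, 1])
      print("item information at theta=0.4:", np.round(info, 3))
      print("next item administered:", int(np.argmax(info)) + 1)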

  1. A new technique for the characterization of chaff elements

    NASA Astrophysics Data System (ADS)

    Scholfield, David; Myat, Maung; Dauby, Jason; Fesler, Jonathon; Bright, Jonathan

    2011-07-01

    A new technique for the experimental characterization of electromagnetic chaff based on Inverse Synthetic Aperture Radar is presented. This technique allows for the characterization of as few as one filament of chaff in a controlled anechoic environment, allowing for stability and repeatability of experimental results. This approach allows for a deeper understanding of the fundamental phenomena of electromagnetic scattering from chaff through an incremental analysis approach. Chaff analysis can now begin with a single element and progress through the build-up of particles into pseudo-cloud structures. This controlled incremental approach is supported by an identical incremental modeling and validation process. Additionally, this technique has the potential to produce considerable savings in financial and schedule costs and provides a stable and repeatable experiment to aid model validation.

  2. Signal processing and neural network toolbox and its application to failure diagnosis and prognosis

    NASA Astrophysics Data System (ADS)

    Tu, Fang; Wen, Fang; Willett, Peter K.; Pattipati, Krishna R.; Jordan, Eric H.

    2001-07-01

    Many systems are composed of components equipped with self-testing capability; however, if the system is complex, involves feedback, and the self-testing itself may occasionally be faulty, tracing faults to a single cause or to multiple causes is difficult. Moreover, many sensors are incapable of reliable decision-making on their own. In such cases, a signal processing front-end that can match inference needs will be very helpful. This work is concerned with providing an object-oriented simulation environment for signal processing and neural network-based fault diagnosis and prognosis. In the toolbox, we implemented a wide range of spectral and statistical manipulation methods such as filters, harmonic analyzers, transient detectors, and multi-resolution decomposition to extract features for failure events from data collected by sensors. Then we evaluated multiple learning paradigms for general classification, diagnosis and prognosis. The network models evaluated include Restricted Coulomb Energy (RCE) Neural Network, Learning Vector Quantization (LVQ), Decision Trees (C4.5), Fuzzy Adaptive Resonance Theory (FuzzyArtmap), Linear Discriminant Rule (LDR), Quadratic Discriminant Rule (QDR), Radial Basis Functions (RBF), Multiple Layer Perceptrons (MLP) and Single Layer Perceptrons (SLP). Validation techniques, such as N-fold cross-validation and bootstrap techniques, are employed for evaluating the robustness of the network models. The trained networks are evaluated for their performance using test data on the basis of percent error rates obtained via cross-validation, time efficiency, and generalization ability to unseen faults. Finally, the usage of neural networks for the prediction of residual life of turbine blades with thermal barrier coatings is described and the results are shown. The neural network toolbox has also been applied to fault diagnosis in mixed-signal circuits.
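
    The two evaluation strategies named above, N-fold cross-validation and the bootstrap, are illustrated in the Python sketch below by comparing two of the listed classifier families (a linear discriminant and an MLP) on synthetic "fault feature" data; the data, settings and error rates are placeholders, not those of the toolbox.

      # Sketch: 5-fold cross-validation and out-of-bag bootstrap error rates for two classifiers.
      import numpy as np
      from sklearn.base import clone
      from sklearn.datasets import make_classification
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score
      from sklearn.neural_network import MLPClassifier

      X, y = make_classification(n_samples=300, n_features=12, n_informative=6,
                                 n_classes=3, random_state=0)
      models = {"LDR": LinearDiscriminantAnalysis(),
                "MLP": MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)}
      rng = np.random.default_rng(0)

      for name, model in models.items():
          cv_err = 1.0 - cross_val_score(model, X, y, cv=5).mean()   # 5-fold cross-validation
          boot_err = []
          for _ in range(20):                                        # bootstrap resampling
              idx = rng.integers(0, len(y), len(y))                  # sample with replacement
              oob = np.setdiff1d(np.arange(len(y)), idx)             # out-of-bag test rows
              fitted = clone(model).fit(X[idx], y[idx])
              boot_err.append(1.0 - fitted.score(X[oob], y[oob]))
          print(f"{name}: CV error rate = {cv_err:.3f}, bootstrap error rate = {np.mean(boot_err):.3f}")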

  3. Estimation and evaluation of COSMIC radio occultation excess phase using undifferenced measurements

    NASA Astrophysics Data System (ADS)

    Xia, Pengfei; Ye, Shirong; Jiang, Kecai; Chen, Dezhong

    2017-05-01

    In the GPS radio occultation technique, the atmospheric excess phase (AEP) can be used to derive the refractivity, which is an important quantity in numerical weather prediction. The AEP is conventionally estimated based on GPS double-difference or single-difference techniques. These two techniques, however, rely on the reference data in the data processing, increasing the complexity of computation. In this study, an undifferenced (ND) processing strategy is proposed to estimate the AEP. To begin with, we use PANDA (Positioning and Navigation Data Analyst) software to perform the precise orbit determination (POD) for the purpose of acquiring the position and velocity of the mass centre of the COSMIC (The Constellation Observing System for Meteorology, Ionosphere and Climate) satellites and the corresponding receiver clock offset. The bending angles, refractivity and dry temperature profiles are derived from the estimated AEP using Radio Occultation Processing Package (ROPP) software. The ND method is validated by the COSMIC products in typical rising and setting occultation events. Results indicate that rms (root mean square) errors of relative refractivity differences between undifferenced and atmospheric profiles (atmPrf) provided by UCAR/CDAAC (University Corporation for Atmospheric Research/COSMIC Data Analysis and Archive Centre) are better than 4 and 3 % in rising and setting occultation events respectively. In addition, we also compare the relative refractivity bias between ND-derived methods and atmPrf profiles of globally distributed 200 COSMIC occultation events on 12 December 2013. The statistical results indicate that the average rms relative refractivity deviation between ND-derived and COSMIC profiles is better than 2 % in the rising occultation event and better than 1.7 % in the setting occultation event. Moreover, the observed COSMIC refractivity profiles from ND processing strategy are further validated using European Centre for Medium-Range Weather Forecasts (ECMWF) analysis data, and the results indicate that the undifferenced method reduces the noise level on the excess phase paths in the lower troposphere compared to the single-difference processing strategy.
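
    The comparison statistic quoted above is the rms of the relative refractivity difference between a retrieved profile and a reference profile on a common height grid. The Python sketch below shows that computation on invented, idealised profiles; it is not the ROPP or PANDA processing itself.

      # Sketch: rms relative refractivity deviation between two profiles on a common grid.
      import numpy as np

      def rms_relative_difference(h_test, n_test, h_ref, n_ref):
          """RMS of (N_test - N_ref) / N_ref (in %) on the reference height grid."""
          n_interp = np.interp(h_ref, h_test, n_test)   # put both profiles on one grid
          rel = (n_interp - n_ref) / n_ref * 100.0
          return np.sqrt(np.mean(rel ** 2))

      h_ref = np.linspace(0, 30, 61)                    # height (km)
      n_ref = 300.0 * np.exp(-h_ref / 7.0)              # idealised refractivity (N-units)
      h_test = np.linspace(0, 30, 121)
      noise = np.random.default_rng(2).normal(size=h_test.size)
      n_test = 300.0 * np.exp(-h_test / 7.0) * (1 + 0.01 * noise)
      print("rms relative deviation: %.2f %%" %
            rms_relative_difference(h_test, n_test, h_ref, n_ref))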

  4. Measurement Properties of the Persian Translated Version of Graves Orbitopathy Quality of Life Questionnaire: A Validation Study.

    PubMed

    Kashkouli, Mohsen Bahmani; Karimi, Nasser; Aghamirsalim, Mohamadreza; Abtahi, Mohammad Bagher; Nojomi, Marzieh; Shahrad-Bejestani, Hadi; Salehi, Masoud

    2017-02-01

    To determine the measurement properties of the Persian language version of the Graves orbitopathy quality of life questionnaire (GO-QOL). Following a systematic translation and cultural adaptation process, 141 consecutive unselected thyroid eye disease (TED) patients answered the Persian GO-QOL and underwent complete ophthalmic examination. The questionnaire was again completed by 60 patients on the second visit, 2-4 weeks later. Construct validity (cross-cultural validity, structural validity and hypotheses testing), reliability (internal consistency and test-retest reliability), and floor and ceiling effects of the Persian version of the GO-QOL were evaluated. Furthermore, Rasch analysis was used to assess its psychometric properties. Cross-cultural validity was established by back-translation techniques, committee review and pretesting techniques. Bi-dimensionality of the questionnaire was confirmed by factor analysis. Construct validity was also supported through confirmation of 6 out of 8 predefined hypotheses. Cronbach's α and intraclass correlation coefficient (ICC) were 0.650 and 0.859 for visual functioning and 0.875 and 0.896 for appearance subscale, respectively. Mean quality of life (QOL) scores for visual functioning and appearance were 78.18 (standard deviation, SD, 21.57) and 56.25 (SD 26.87), respectively. Person reliabilities from the Rasch rating scale model for both visual functioning and appearance revealed an acceptable internal consistency for the Persian GO-QOL. The Persian GO-QOL questionnaire is a valid and reliable tool with good psychometric properties in evaluation of Persian-speaking patients with TED. Applying Rasch analysis to future versions of the GO-QOL is recommended in order to perform tests for linearity between the estimated item measures in different versions.
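
    Of the reliability statistics reported above, Cronbach's alpha is straightforward to compute from a respondents-by-items score matrix. The Python sketch below uses invented 0-4 Likert responses, not the study's data.

      # Sketch: Cronbach's alpha from an item-score matrix (rows = respondents, cols = items).
      import numpy as np

      def cronbach_alpha(scores):
          scores = np.asarray(scores, dtype=float)
          k = scores.shape[1]                           # number of items
          item_var = scores.var(axis=0, ddof=1).sum()   # sum of item variances
          total_var = scores.sum(axis=1).var(ddof=1)    # variance of the total score
          return k / (k - 1) * (1.0 - item_var / total_var)

      rng = np.random.default_rng(5)
      latent = rng.normal(size=(40, 1))                 # a common factor shared by all items
      items = np.clip(np.round(2 + latent + rng.normal(0, 0.7, size=(40, 6))), 0, 4)
      print("Cronbach's alpha: %.2f" % cronbach_alpha(items))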

  5. Testing single point incremental forming moulds for rotomoulding operations

    NASA Astrophysics Data System (ADS)

    Afonso, Daniel; de Sousa, Ricardo Alves; Torcato, Ricardo

    2017-10-01

    Low-pressure polymer processes such as thermoforming or rotational moulding use much simpler moulds than high-pressure processes like injection moulding. However, despite the low forces involved, mould manufacturing for these applications is still a very material-, energy- and time-consuming operation. In rotational moulding in particular there is no standard for mould manufacture, and very different techniques are applicable. The goal of this research is to develop and validate a method for manufacturing plastically formed sheet metal moulds by single point incremental forming (SPIF) for rotomoulding and rotocasting operations. A Stewart-platform-based SPIF machine allows the forming of thick metal sheets, granting the required structural stiffness for the mould surface while keeping a short manufacturing lead time and low thermal inertia. The experimental work involves the proposal of a hollow part, the design and fabrication of a sheet metal mould using dieless incremental forming techniques, and testing its operation in the production of prototype parts.

  6. Hierarchical multi-scale approach to validation and uncertainty quantification of hyper-spectral image modeling

    NASA Astrophysics Data System (ADS)

    Engel, Dave W.; Reichardt, Thomas A.; Kulp, Thomas J.; Graff, David L.; Thompson, Sandra E.

    2016-05-01

    Validating predictive models and quantifying uncertainties inherent in the modeling process is a critical component of the HARD Solids Venture program [1]. Our current research focuses on validating physics-based models predicting the optical properties of solid materials for arbitrary surface morphologies and characterizing the uncertainties in these models. We employ a systematic and hierarchical approach by designing physical experiments and comparing the experimental results with the outputs of computational predictive models. We illustrate this approach through an example comparing a micro-scale forward model to an idealized solid-material system and then propagating the results through a system model to the sensor level. Our efforts should enhance detection reliability of the hyper-spectral imaging technique and the confidence in model utilization and model outputs by users and stakeholders.

  7. Acidification of In-Storage-Psychrophilic-Anaerobic-Digestion (ISPAD) process to reduce ammonia volatilization: Model development and validation.

    PubMed

    Madani-Hosseini, Mahsa; Mulligan, Catherine N; Barrington, Suzelle

    2016-06-01

    In-Storage-Psychrophilic-Anaerobic-Digestion (ISPAD) is an ambient temperature treatment system for wastewaters stored for over 100 days under temperate climates, which produces a nitrogen-rich digestate susceptible to ammonia (NH3) volatilization. Present acidification techniques for reducing NH3 volatilization are not only expensive and accompanied by secondary environmental effects, but also do not apply to ISPAD, which relies on batch-to-batch inoculation. The objectives of this study were to identify and validate sequential organic loading (OL) strategies producing imbalances in acidogen and methanogen growth, acidifying the ISPAD content to a pH of 6 one week before emptying while also preserving the inoculation potential. This acidification process is challenging because wastewaters often offer a high buffering capacity and ISPAD operational practices foster low microbial populations. A model simulating the ISPAD pH regime was used to optimize 3 different sequential OLs to decrease the ISPAD pH to 6.0. All 3 strategies were compared in terms of biogas production, volatile fatty acid (VFA) concentration, microbial activity, glucose consumption, and pH decrease. Laboratory validation of the model outputs confirmed that a sequential OL of 13 kg glucose/m3 of ISPAD content over 4 days could indeed reduce the pH to 6.0. Such an OL competes feasibly with present acidification techniques. Nevertheless, more research is required to explain the 3-day lag between the model results and the experimental data. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Automatic Coregistration and orthorectification (ACRO) and subsequent mosaicing of NASA high-resolution imagery over the Mars MC11 quadrangle, using HRSC as a baseline

    NASA Astrophysics Data System (ADS)

    Sidiropoulos, Panagiotis; Muller, Jan-Peter; Watson, Gillian; Michael, Gregory; Walter, Sebastian

    2018-02-01

    This work presents the coregistered, orthorectified and mosaiced high-resolution products of the MC11 quadrangle of Mars, which have been processed using novel, fully automatic techniques. We discuss the development of a pipeline that achieves fully automatic and parameter-independent geometric alignment of high-resolution planetary images, starting from raw input images in NASA PDS format and following all required steps to produce a coregistered geotiff image, a corresponding footprint and useful metadata. Additionally, we describe the development of a radiometric calibration technique that post-processes coregistered images to make them radiometrically consistent. Finally, we present a batch-mode application of the developed techniques over the MC11 quadrangle to validate their potential, as well as to generate end products, which are released to the planetary science community, thus assisting in the analysis of Mars' static and dynamic features. This case study is a step towards the full automation of signal processing tasks that are essential to increase the usability of planetary data but currently require the extensive use of human resources.

  9. Modified signed-digit trinary addition using synthetic wavelet filter

    NASA Astrophysics Data System (ADS)

    Iftekharuddin, K. M.; Razzaque, M. A.

    2000-09-01

    The modified signed-digit (MSD) number system has been a topic of interest because it allows parallel, carry-free addition of two numbers for digital optical computing. In this paper, a harmonic wavelet joint transform (HWJT)-based correlation technique is introduced for the optical implementation of an MSD trinary adder. The realization of carry-propagation-free addition of MSD trinary numerals is demonstrated using a synthetic HWJT correlator model. It is also shown that the proposed synthetic wavelet filter-based correlator delivers high performance in logic processing. Simulation results are presented to validate the performance of the proposed technique.

  10. Comparing interpolation techniques for annual temperature mapping across Xinjiang region

    NASA Astrophysics Data System (ADS)

    Ren-ping, Zhang; Jing, Guo; Tian-gang, Liang; Qi-sheng, Feng; Aimaiti, Yusupujiang

    2016-11-01

    Interpolating climatic variables such as temperature is challenging due to the highly variable nature of meteorological processes and the difficulty in establishing a representative network of stations. In this paper, based on monthly temperature data obtained from 154 official meteorological stations in the Xinjiang region and surrounding areas, we compared five spatial interpolation techniques: inverse distance weighting (IDW), ordinary kriging, cokriging, thin-plate smoothing splines (ANUSPLIN) and empirical Bayesian kriging (EBK). Error metrics were used to validate the interpolations against independent data. Results indicated that ANUSPLIN performed better than the other four interpolation methods.
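    Of the five methods compared, inverse distance weighting is the simplest to express compactly. A minimal sketch of IDW interpolation from scattered station temperatures to arbitrary target points (station coordinates, temperatures and the power parameter are illustrative, not the study's data):

```python
import numpy as np

def idw_interpolate(xy_stations, values, xy_targets, power=2.0, eps=1e-12):
    """Inverse distance weighted interpolation of scattered station values.

    xy_stations : (n_stations, 2) station coordinates
    values      : (n_stations,)   observed values (e.g. annual mean temperature)
    xy_targets  : (n_targets, 2)  points to interpolate to
    """
    xy_stations = np.asarray(xy_stations, float)
    xy_targets = np.asarray(xy_targets, float)
    values = np.asarray(values, float)

    # Pairwise distances between every target point and every station.
    d = np.linalg.norm(xy_targets[:, None, :] - xy_stations[None, :, :], axis=2)
    w = 1.0 / (d + eps) ** power           # inverse-distance weights
    w /= w.sum(axis=1, keepdims=True)      # normalise per target point
    return w @ values

# Illustrative use: 10 synthetic stations interpolated onto 3 target points.
rng = np.random.default_rng(1)
stations = rng.uniform(0, 100, size=(10, 2))
temps = 10.0 + 0.05 * stations[:, 0]       # synthetic temperature field
targets = np.array([[25.0, 25.0], [50.0, 50.0], [75.0, 75.0]])
print(idw_interpolate(stations, temps, targets))
```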

  11. Extended wavelet transformation to digital holographic reconstruction: application to the elliptical, astigmatic Gaussian beams.

    PubMed

    Remacha, Clément; Coëtmellec, Sébastien; Brunel, Marc; Lebrun, Denis

    2013-02-01

    Wavelet analysis provides an efficient tool in numerous signal processing problems and has been implemented in optical processing techniques, such as in-line holography. This paper proposes an improvement of this tool for the case of an elliptical, astigmatic Gaussian (AEG) beam. We show that this mathematical operator allows reconstructing an image of a spherical particle without compression of the reconstructed image, which increases the accuracy of the 3D location of particles and of their size measurement. To validate the performance of this operator we have studied the diffraction pattern produced by a particle illuminated by an AEG beam. This study used mutual intensity propagation, and the particle is defined as a chirped Gaussian sum. The proposed technique was applied and the experimental results are presented.

  12. Real-time spectral characterization of a photon pair source using a chirped supercontinuum seed.

    PubMed

    Erskine, Jennifer; England, Duncan; Kupchak, Connor; Sussman, Benjamin

    2018-02-15

    Photon pair sources have wide ranging applications in a variety of quantum photonic experiments and protocols. Many of these protocols require well controlled spectral correlations between the two output photons. However, due to low cross-sections, measuring the joint spectral properties of photon pair sources has historically been a challenging and time-consuming task. Here, we present an approach for the real-time measurement of the joint spectral properties of a fiber-based four wave mixing source. We seed the four wave mixing process using a broadband chirped pulse, studying the stimulated process to extract information regarding the spontaneous process. In addition, we compare stimulated emission measurements with the spontaneous process to confirm the technique's validity. Joint spectral measurements have taken many hours historically and several minutes with recent techniques. Here, measurements have been demonstrated in 5-30 s depending on resolution, offering substantial improvement. Additional benefits of this approach include flexible resolution, large measurement bandwidth, and reduced experimental overhead.

  13. Radiocardiography in clinical cardiology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pierson, R.N. Jr.; Alam, S.; Kemp, H.G.

    1977-01-01

    Quantitative radiocardiography provides a variety of noninvasive measurements of value in cardiology. A gamma camera and computer processing are required for most of these measurements. The advantages of ease, economy, and safety of these procedures are, in part, offset by the complexity of as yet unstandardized methods and incomplete validation of results. The expansion of these techniques will inevitably be rapid. Their careful performance requires, for the moment, a major and perhaps dedicated effort by at least one member of the professional team, if the pitfalls that lead to unrecognized error are to be avoided. We may anticipate more automated and reliable results with increased experience and validation.

  14. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  15. Single-Shot Scalar-Triplet Measurements in High-Pressure Swirl-Stabilized Flames for Combustion Code Validation

    NASA Technical Reports Server (NTRS)

    Kojima, Jun; Nguyen, Quang-Viet

    2007-01-01

    In support of NASA ARMD's code validation project, we have made significant progress by providing the first quantitative single-shot multi-scalar data from a turbulent elevated-pressure (5 atm), swirl-stabilized, lean direct injection (LDI) type research burner operating on CH4-air, using a spatially resolved pulsed-laser spontaneous Raman diagnostic technique. The Raman diagnostics apparatus and data analysis that we present here were developed over the past 6 years at Glenn Research Center. From the Raman scattering data, we produce spatially mapped probability density functions (PDFs) of the instantaneous temperature, determined using a newly developed low-resolution effective rotational bandwidth (ERB) technique. The measured 3-scalar (triplet) correlations between temperature, CH4, and O2 concentrations, as well as their PDFs, also provide a high level of detail into the nature and extent of the turbulent mixing process and its impact on chemical reactions in a realistic gas turbine injector flame at elevated pressures. The multi-scalar triplet data set presented here provides a good validation case for CFD combustion codes to simulate, by providing both average and statistical values for the 3 measured scalars.

  16. Statistical Validation for Clinical Measures: Repeatability and Agreement of Kinect™-Based Software

    PubMed Central

    Tello, Emanuel; Rodrigo, Alejandro; Valentinuzzi, Max E.

    2018-01-01

    Background The rehabilitation process is a fundamental stage for the recovery of people's capabilities. However, the evaluation of the process is performed by physiatrists and medical doctors, mostly based on their observations, that is, a subjective appreciation of the patient's evolution. This paper proposes a platform for tracking the movement of an individual's upper limb using Kinect sensor(s), to be applied to the patient during the rehabilitation process. The main contribution is the development of quantifying software and the statistical validation of its performance, repeatability, and clinical use in the rehabilitation process. Methods The software determines joint angles and upper limb trajectories for the construction of a specific rehabilitation protocol and quantifies the treatment evolution. In turn, the information is presented via a graphical interface that allows the recording, storage, and reporting of the patient's data. For clinical purposes, the software information is statistically validated with three different methodologies, comparing the measures with a goniometer in terms of agreement and repeatability. Results The agreement of joint angles measured with the proposed software and the goniometer is evaluated with Bland-Altman plots; all measurements fell well within the limits of agreement, meaning interchangeability of both techniques. Additionally, the results of the Bland-Altman analysis of repeatability show 95% confidence. Finally, the physiotherapists' qualitative assessment shows encouraging results for clinical use. Conclusion The main conclusion is that the software is capable of offering a clinical history of the patient and is useful for quantification of rehabilitation success. The simplicity, low cost, and visualization possibilities enhance the use of the Kinect-based software for rehabilitation and other applications, and the expert's opinion endorses the choice of our approach for clinical practice. Comparison of the new measurement technique with established goniometric methods determines that the proposed software agrees sufficiently to be used interchangeably. PMID:29750166
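    The agreement analysis described above rests on Bland-Altman limits of agreement between the Kinect-based software and the goniometer. A compact sketch of that computation (the paired joint-angle values below are synthetic placeholders, not the study's measurements):

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman statistics for two paired measurement methods.

    Returns the mean difference (bias) and the 95% limits of agreement,
    i.e. bias +/- 1.96 * SD of the paired differences."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Placeholder paired joint-angle measurements (degrees).
rng = np.random.default_rng(2)
goniometer = rng.uniform(10, 150, size=30)
kinect = goniometer + rng.normal(0.5, 2.0, size=30)   # small bias + noise
bias, (low, high) = bland_altman(kinect, goniometer)
print(f"bias = {bias:.2f} deg, 95% limits of agreement = [{low:.2f}, {high:.2f}] deg")
```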

  17. Nonlinear ultrasonic fatigue crack detection using a single piezoelectric transducer

    NASA Astrophysics Data System (ADS)

    An, Yun-Kyu; Lee, Dong Jun

    2016-04-01

    This paper proposes a new nonlinear ultrasonic technique for fatigue crack detection using a single piezoelectric transducer (PZT). The proposed technique identifies a fatigue crack using linear (α) and nonlinear (β) parameters obtained from only a single PZT mounted on a target structure. Based on the different physical characteristics of α and β, a fatigue crack-induced feature can be effectively isolated from the inherent nonlinearity of the target structure and data acquisition system. The proposed technique requires a much simpler test setup and lower processing costs than existing nonlinear ultrasonic techniques, while remaining fast and powerful. To validate the proposed technique, a real fatigue crack is created in an aluminum plate, and then false positive and false negative tests are carried out under varying temperature conditions. The experimental results reveal that the fatigue crack is successfully detected, and no false positive alarm is indicated.
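    The abstract does not define α and β explicitly. A common choice for a relative nonlinear parameter in second-harmonic measurements is β ≈ A2/A1², with A1 and A2 the fundamental and second-harmonic spectral amplitudes; the sketch below assumes that textbook definition, which may differ from the paper's:

```python
import numpy as np

def relative_nonlinearity(signal, fs, f0):
    """Relative nonlinear parameter beta ~ A2 / A1**2 estimated from the
    fundamental (f0) and second-harmonic (2*f0) spectral amplitudes.
    (Textbook definition assumed; not necessarily the paper's.)"""
    n = len(signal)
    spec = np.abs(np.fft.rfft(signal * np.hanning(n)))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    a1 = spec[np.argmin(np.abs(freqs - f0))]         # fundamental amplitude
    a2 = spec[np.argmin(np.abs(freqs - 2.0 * f0))]   # second-harmonic amplitude
    return a2 / a1 ** 2

# Synthetic weakly nonlinear response at 200 kHz, sampled at 5 MHz.
fs, f0 = 5e6, 200e3
t = np.arange(0, 2e-3, 1.0 / fs)
x = np.sin(2 * np.pi * f0 * t) + 0.02 * np.sin(2 * np.pi * 2 * f0 * t)
print(f"beta (relative) = {relative_nonlinearity(x, fs, f0):.4f}")
```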

  18. Induction motor broken rotor bar fault location detection through envelope analysis of start-up current using Hilbert transform

    NASA Astrophysics Data System (ADS)

    Abd-el-Malek, Mina; Abdelsalam, Ahmed K.; Hassan, Ola E.

    2017-09-01

    Robustness, low running cost and reduced maintenance have made Induction Motors (IMs) the leading choice in industrial drive systems. Broken rotor bars (BRBs) can be considered an important fault that needs to be assessed early to minimize maintenance cost and labor time. The majority of recent BRB fault diagnostic techniques focus on differentiating between a healthy and a faulty rotor cage. In this paper, a new technique is proposed for detecting the location of the broken bar in the rotor. The proposed technique relies on monitoring certain statistical parameters estimated from the analysis of the start-up stator current envelope. The envelope of the signal is obtained using the Hilbert Transform (HT). The proposed technique offers a non-invasive, computationally fast and accurate location diagnostic process. Various simulation scenarios are presented that validate the effectiveness of the proposed technique.
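    The location feature is extracted from the envelope of the start-up stator current obtained via the Hilbert transform. A minimal sketch of that envelope step using scipy (the synthetic current and its modulation are illustrative only):

```python
import numpy as np
from scipy.signal import hilbert

def current_envelope(current):
    """Envelope of a stator-current record via the analytic signal
    (Hilbert transform)."""
    return np.abs(hilbert(current))

# Synthetic 50 Hz current with a slow amplitude modulation, loosely
# mimicking a broken-rotor-bar signature (illustrative only).
fs = 5000.0
t = np.arange(0, 2.0, 1.0 / fs)
i_a = (1.0 + 0.05 * np.sin(2 * np.pi * 2.0 * t)) * np.sin(2 * np.pi * 50.0 * t)
env = current_envelope(i_a)
print(f"envelope mean = {env.mean():.3f}, envelope std = {env.std():.4f}")
```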

  19. Evaluating the dynamic response of in-flight thrust calculation techniques during throttle transients

    NASA Technical Reports Server (NTRS)

    Ray, Ronald J.

    1994-01-01

    New flight test maneuvers and analysis techniques for evaluating the dynamic response of in-flight thrust models during throttle transients have been developed and validated. The approach is based on the aircraft and engine performance relationship between thrust and drag. Two flight test maneuvers, a throttle step and a throttle frequency sweep, were developed and used in the study. Graphical analysis techniques, including a frequency domain analysis method, were also developed and evaluated. They provide quantitative and qualitative results. Four thrust calculation methods were used to demonstrate and validate the test technique. Flight test applications on two high-performance aircraft confirmed the test methods as valid and accurate. These maneuvers and analysis techniques were easy to implement and use. Flight test results indicate the analysis techniques can identify the combined effects of model error and instrumentation response limitations on the calculated thrust value. The methods developed in this report provide an accurate approach for evaluating, validating, or comparing thrust calculation methods for dynamic flight applications.

  20. Applied digital signal processing systems for vortex flowmeter with digital signal processing.

    PubMed

    Xu, Ke-Jun; Zhu, Zhi-Hai; Zhou, Yang; Wang, Xiao-Fen; Liu, San-Shan; Huang, Yun-Zhi; Chen, Zhi-Yuan

    2009-02-01

    Spectral analysis is combined with digital filtering to process the vortex sensor signal, reducing the effect of low-frequency disturbances from pipe vibrations and increasing the turndown ratio. Using a digital signal processing chip, two kinds of digital signal processing systems are developed to implement these algorithms: one is an integrative system, and the other is a separated system. A limiting amplifier is designed in the input analog conditioning circuit to accommodate the large amplitude variation of the sensor signal. Several technical measures are taken to improve the accuracy of the output pulse, speed up the response time of the meter, and reduce the fluctuation of the output signal. The experimental results demonstrate the validity of the digital signal processing systems.
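    The core idea, band-pass filtering to suppress low-frequency vibration followed by spectral peak picking to recover the shedding frequency, can be sketched as follows (filter order, cut-offs and signal parameters are assumptions, not values from the paper):

```python
import numpy as np
from scipy.signal import butter, filtfilt, periodogram

def vortex_frequency(signal, fs, f_lo=20.0, f_hi=500.0):
    """Band-pass filter the sensor signal to suppress low-frequency pipe
    vibration, then pick the dominant spectral peak as the vortex-shedding
    frequency. Cut-off choices are illustrative."""
    b, a = butter(4, [f_lo, f_hi], btype="band", fs=fs)
    filtered = filtfilt(b, a, signal)
    freqs, pxx = periodogram(filtered, fs=fs)
    return freqs[np.argmax(pxx)]

# Synthetic vortex signal at 120 Hz plus a strong 5 Hz vibration disturbance.
fs = 4000.0
t = np.arange(0, 4.0, 1.0 / fs)
x = np.sin(2 * np.pi * 120.0 * t) + 2.0 * np.sin(2 * np.pi * 5.0 * t)
x += 0.2 * np.random.randn(t.size)
print(f"estimated shedding frequency: {vortex_frequency(x, fs):.1f} Hz")
```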

  1. Application of Cross-Correlation Greens Function Along With FDTD for Fast Computation of Envelope Correlation Coefficient Over Wideband for MIMO Antennas

    NASA Astrophysics Data System (ADS)

    Sarkar, Debdeep; Srivastava, Kumar Vaibhav

    2017-02-01

    In this paper, the concept of cross-correlation Green's functions (CGF) is used in conjunction with the finite difference time domain (FDTD) technique to calculate the envelope correlation coefficient (ECC) of an arbitrary MIMO antenna system over a wide frequency band. Both frequency-domain (FD) and time-domain (TD) post-processing techniques are proposed for possible application with this FDTD-CGF scheme. The FDTD-CGF time-domain (FDTD-CGF-TD) scheme utilizes time-domain signal processing methods and exhibits a significant reduction in ECC computation time compared to the FDTD-CGF frequency-domain (FDTD-CGF-FD) scheme for high frequency-resolution requirements. The proposed FDTD-CGF based schemes can be applied for accurate and fast prediction of the wideband ECC response, instead of the conventional scattering-parameter-based techniques, which have several limitations. Numerical examples of the proposed FDTD-CGF techniques are provided for two-element MIMO systems involving thin-wire half-wavelength dipoles in parallel side-by-side as well as orthogonal arrangements. The results obtained from the FDTD-CGF techniques are compared with results from the commercial electromagnetic solver Ansys HFSS to verify the validity of the proposed approach.
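    For context, the conventional scattering-parameter approach that the paper contrasts with its FDTD-CGF schemes evaluates the ECC of a lossless two-port antenna directly from the S-parameters. A short sketch of that standard formula (not of the paper's CGF method):

```python
import numpy as np

def ecc_from_s_parameters(s11, s21, s12, s22):
    """Envelope correlation coefficient of a two-port antenna from its
    S-parameters (standard lossless-antenna formula; this is the
    conventional approach the paper improves upon, not the CGF method)."""
    num = np.abs(np.conj(s11) * s12 + np.conj(s21) * s22) ** 2
    den = (1 - np.abs(s11) ** 2 - np.abs(s21) ** 2) * \
          (1 - np.abs(s22) ** 2 - np.abs(s12) ** 2)
    return num / den

# Illustrative S-parameters for a weakly coupled symmetric dipole pair.
s11 = s22 = 0.2 * np.exp(1j * np.deg2rad(30.0))
s21 = s12 = 0.1 * np.exp(1j * np.deg2rad(-60.0))
print(f"ECC = {ecc_from_s_parameters(s11, s21, s12, s22):.4f}")
```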

  2. Attenuated total reflectance-FT-IR spectroscopy for gunshot residue analysis: potential for ammunition determination.

    PubMed

    Bueno, Justin; Sikirzhytski, Vitali; Lednev, Igor K

    2013-08-06

    The ability to link a suspect to a particular shooting incident is a principal task for many forensic investigators. Here, we attempt to achieve this goal by analysis of gunshot residue (GSR) through the use of attenuated total reflectance (ATR) Fourier transform infrared spectroscopy (FT-IR) combined with statistical analysis. The firearm discharge process is analogous to a complex chemical process; therefore, the products of this process (GSR) will vary based upon numerous factors, including the specific combination of firearm and ammunition which was discharged. Differentiation of FT-IR data, collected from GSR particles originating from three different firearm-ammunition combinations (0.38 in., 0.40 in., and 9 mm calibers), was achieved using projection to latent structures discriminant analysis (PLS-DA). The technique was cross-validated (leave-one-out), both internally and externally. External validation was achieved via assignment (caliber identification) of unknown FT-IR spectra from unknown GSR particles. The results demonstrate great potential for ATR-FT-IR spectroscopic analysis of GSR for forensic purposes.
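    PLS-DA with leave-one-out cross-validation, as used here to separate the three calibers, can be sketched with scikit-learn's PLSRegression applied to one-hot class labels (the "spectra" below are random placeholders, not GSR data, and the number of latent variables is an assumption):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut

# Placeholder "spectra": 30 samples x 200 wavenumbers, 3 caliber classes.
rng = np.random.default_rng(3)
X = rng.normal(size=(30, 200))
y = np.repeat([0, 1, 2], 10)
Y = np.eye(3)[y]                       # one-hot encoding turns PLS into PLS-DA

correct = 0
for train, test in LeaveOneOut().split(X):
    model = PLSRegression(n_components=5).fit(X[train], Y[train])
    pred = model.predict(X[test])
    correct += int(np.argmax(pred) == y[test][0])   # class = largest predicted response

print(f"leave-one-out classification accuracy: {correct / len(y):.2f}")
```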

  3. A Consensus Approach to Investigate Undergraduate Pharmacy Students’ Experience of Interprofessional Education

    PubMed Central

    Obara, Ilona; Paterson, Alastair; Nazar, Zachariah; Portlock, Jane; Husband, Andrew

    2017-01-01

    Objective. To assess the development of knowledge, attitudes, and behaviors for collaborative practice among first-year pharmacy students following completion of interprofessional education. Methods. A mixed-methods strategy was employed to detect student self-reported change in knowledge, attitudes, and behaviors. Validated survey tools were used to assess student perception and attitudes. The Nominal Group Technique (NGT) was used to capture student reflections and provide peer discussion on the individual IPE sessions. Results. The validated survey tools did not detect any change in students’ attitudes and perceptions. The NGT succeeded in providing a milieu for participating students to reflect on their IPE experiences. The peer review process allowed students to compare their initial perceptions and reactions and renew their reflections on the learning experience. Conclusion. The NGT process has provided the opportunity to assess the student experience through the reflective process that was enriched via peer discussion. Students have demonstrated more positive attitudes and behaviors toward interprofessional working through IPE. PMID:28381886

  4. Students Better Be on Their Best Behavior: How to Prepare for the Most Common Job Interviewing Technique

    ERIC Educational Resources Information Center

    Browning, Blair W.; Cunningham, John R.

    2012-01-01

    Nearly every student will go through the selection interview process to obtain a job in his or her future vocation. Regardless of the major of the student or the profession which they will pursue, the selection interview remains a constant. There has been some attention paid to the validity of the selection interview, and personality constructs…

  5. Experimental Validation Techniques for the Heleeos Off-Axis Laser Propagation Model

    DTIC Science & Technology

    2010-03-01

    Thesis: Experimental Validation Techniques for the HELEEOS Off-Axis Laser Propagation Model, by John Haiducek, 1st Lt, USAF (BS, Physics), AFIT/GAP/ENP/10-M07, March 2010. Approved for public release; distribution unlimited. Abstract: The High Energy Laser End-to-End ...

  6. The Validation of Vapor Phase Hydrogen Peroxide Microbial Reduction for Planetary Protection and a Proposed Vacuum Process Specification

    NASA Technical Reports Server (NTRS)

    Chung, Shirley; Barengoltz, Jack; Kern, Roger; Koukol, Robert; Cash, Howard

    2006-01-01

    The Jet Propulsion Laboratory, in conjunction with the NASA Planetary Protection Officer, has selected the vapor phase hydrogen peroxide sterilization process for continued development as a NASA-approved sterilization technique for spacecraft subsystems and systems. The goal is to include this technique, with an appropriate specification, in NPR 8020.12C as a low-temperature complementary technique to the dry heat sterilization process. To meet microbial reduction requirements for all Mars in-situ life detection and sample return missions, various planetary spacecraft subsystems will have to be exposed to a qualified sterilization process. This process could be the elevated-temperature dry heat sterilization process (115 °C for 40 hours), which was used to sterilize the Viking lander spacecraft. However, with the utilization of such elements as highly sophisticated electronics and sensors in modern spacecraft, this process presents significant materials challenges and is thus an undesirable bioburden reduction method to design engineers. The objective of this work is to introduce vapor hydrogen peroxide (VHP) as an alternative to dry heat microbial reduction to meet planetary protection requirements. The VHP process is widely used by the medical industry to sterilize surgical instruments and biomedical devices, but high doses of VHP may degrade the performance of flight hardware or compromise material properties. Our goal for this study was to determine the minimum VHP process conditions that achieve microbial reduction levels acceptable for planetary protection.

  7. Scratching as a Fracture Process: From Butter to Steel

    NASA Astrophysics Data System (ADS)

    Akono, A.-T.; Reis, P. M.; Ulm, F.-J.

    2011-05-01

    We present results of a hybrid experimental and theoretical investigation of the fracture scaling in scratch tests and show that scratching is a fracture dominated process. Validated for paraffin wax, cement paste, Jurassic limestone and steel, we derive a model that provides a quantitative means to relate quantities measured in scratch tests to fracture properties of materials at multiple scales. The scalability of scratching for different probes and depths opens new venues towards miniaturization of our technique, to extract fracture properties of materials at even smaller length scales.

  8. Monitoring Building Deformation with InSAR: Experiments and Validation.

    PubMed

    Yang, Kui; Yan, Li; Huang, Guoman; Chen, Chu; Wu, Zhengpeng

    2016-12-20

    Synthetic Aperture Radar Interferometry (InSAR) techniques are increasingly applied for monitoring land subsidence. The advantages of InSAR include high accuracy and the ability to cover large areas; nevertheless, research validating the use of InSAR on building deformation is limited. In this paper, we test the monitoring capability of InSAR in experiments on two landmark buildings, the Bohai Building and the China Theater, located in Tianjin, China. They were selected as real examples for comparing InSAR and leveling approaches to building deformation. Ten TerraSAR-X images spanning half a year were used in Permanent Scatterer InSAR processing. The extracted InSAR results were processed considering the diversity in both direction and spatial distribution, and were compared with true leveling values using both Ordinary Least Squares (OLS) regression and measurement-of-error analyses. The detailed experimental results for the Bohai Building and the China Theater showed a high correlation between the InSAR results and the leveling values. At the same time, the two Root Mean Square Error (RMSE) indexes had values of approximately 1 mm. These analyses show that a millimeter level of accuracy can be achieved by means of the InSAR technique when measuring building deformation. We discuss the differences in accuracy between the OLS regression and measurement-of-error analyses, and compare them with the accuracy index of leveling in order to propose InSAR accuracy levels appropriate for monitoring building deformation. After assessing the advantages and limitations of InSAR techniques in monitoring buildings, further applications are evaluated.
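    The OLS regression and RMSE comparison between InSAR-derived deformation and leveling can be sketched in a few lines (the paired deformation values are synthetic placeholders):

```python
import numpy as np

def ols_fit(x, y):
    """Ordinary least squares fit y = a*x + b; returns slope, intercept, R^2."""
    a, b = np.polyfit(x, y, 1)
    y_hat = a * x + b
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return a, b, 1.0 - ss_res / ss_tot

def rmse(x, y):
    """Root mean square error between two paired measurement sets."""
    return np.sqrt(np.mean((np.asarray(x) - np.asarray(y)) ** 2))

# Synthetic paired deformation measurements (mm): leveling vs. PS-InSAR.
rng = np.random.default_rng(4)
leveling = rng.uniform(-10.0, 2.0, size=20)
insar = leveling + rng.normal(0.0, 1.0, size=20)
slope, intercept, r2 = ols_fit(leveling, insar)
print(f"slope = {slope:.2f}, intercept = {intercept:.2f} mm, "
      f"R^2 = {r2:.3f}, RMSE = {rmse(insar, leveling):.2f} mm")
```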

  9. Producibility improvements suggested by a validated process model of seeded CdZnTe vertical Bridgman growth

    NASA Astrophysics Data System (ADS)

    Larson, David J., Jr.; Casagrande, Louis G.; Di Marzio, Don; Levy, Alan; Carlson, Frederick M.; Lee, Taipao; Black, David R.; Wu, Jun; Dudley, Michael

    1994-07-01

    We have successfully validated theoretical models of seeded vertical Bridgman-Stockbarger CdZnTe crystal growth and post-solidification processing, using in-situ thermal monitoring and innovative material characterization techniques. The models predict the thermal gradients, interface shape, fluid flow and solute redistribution during solidification, as well as the distributions of accumulated excess stress that causes defect generation and redistribution. Data from the furnace and ampoule wall have validated predictions from the thermal model. Results are compared to predictions of the thermal and thermo-solutal models. We explain the measured initial, change-of-rate, and terminal compositional transients as well as the macrosegregation. Macro and micro-defect distributions have been imaged on CdZnTe wafers from 40 mm diameter boules. Superposition of topographic defect images and predicted excess stress patterns suggests the origin of some frequently encountered defects, particularly on a macro scale, to result from the applied and accumulated stress fields and the anisotropic nature of the CdZnTe crystal. Implications of these findings with respect to producibility are discussed.

  10. Alzheimer's Disease Assessment: A Review and Illustrations Focusing on Item Response Theory Techniques.

    PubMed

    Balsis, Steve; Choudhury, Tabina K; Geraci, Lisa; Benge, Jared F; Patrick, Christopher J

    2018-04-01

    Alzheimer's disease (AD) affects neurological, cognitive, and behavioral processes. Thus, to accurately assess this disease, researchers and clinicians need to combine and incorporate data across these domains. This presents not only distinct methodological and statistical challenges but also unique opportunities for the development and advancement of psychometric techniques. In this article, we describe relatively recent research using item response theory (IRT) that has been used to make progress in assessing the disease across its various symptomatic and pathological manifestations. We focus on applications of IRT to improve scoring, test development (including cross-validation and adaptation), and linking and calibration. We conclude by describing potential future multidimensional applications of IRT techniques that may improve the precision with which AD is measured.

  11. The conservation physiology toolbox: status and opportunities

    PubMed Central

    Love, Oliver P; Hultine, Kevin R

    2018-01-01

    Abstract For over a century, physiological tools and techniques have been allowing researchers to characterize how organisms respond to changes in their natural environment and how they interact with human activities or infrastructure. Over time, many of these techniques have become part of the conservation physiology toolbox, which is used to monitor, predict, conserve, and restore plant and animal populations under threat. Here, we provide a summary of the tools that currently comprise the conservation physiology toolbox. By assessing patterns in articles that have been published in ‘Conservation Physiology’ over the past 5 years that focus on introducing, refining and validating tools, we provide an overview of where researchers are placing emphasis in terms of taxa and physiological sub-disciplines. Although there is certainly diversity across the toolbox, metrics of stress physiology (particularly glucocorticoids) and studies focusing on mammals have garnered the greatest attention, with both comprising the majority of publications (>45%). We also summarize the types of validations that are actively being completed, including those related to logistics (sample collection, storage and processing), interpretation of variation in physiological traits and relevance for conservation science. Finally, we provide recommendations for future tool refinement, with suggestions for: (i) improving our understanding of the applicability of glucocorticoid physiology; (ii) linking multiple physiological and non-physiological tools; (iii) establishing a framework for plant conservation physiology; (iv) assessing links between environmental disturbance, physiology and fitness; (v) appreciating opportunities for validations in under-represented taxa; and (vi) emphasizing tool validation as a core component of research programmes. Overall, we are confident that conservation physiology will continue to increase its applicability to more taxa, develop more non-invasive techniques, delineate where limitations exist, and identify the contexts necessary for interpretation in captivity and the wild. PMID:29942517

  12. Measurement of conjugated linoleic acid (CLA) in CLA-rich soy oil by attenuated total reflectance-Fourier transform infrared spectroscopy (ATR-FTIR).

    PubMed

    Kadamne, Jeta V; Jain, Vishal P; Saleh, Mohammed; Proctor, Andrew

    2009-11-25

    Conjugated linoleic acid (CLA) isomers in oils are currently measured as fatty acid methyl esters by a gas chromatography-flame ionization detector (GC-FID) technique, which requires approximately 2 h to complete the analysis. Hence, we aim to develop a method to rapidly determine CLA isomers in CLA-rich soy oil. Soy oil with 0.38-25.11% total CLA was obtained by photo-isomerization of 96 soy oil samples for 24 h. A sample was withdrawn at 30 min intervals, with repeated processing using a second batch of oil. Six replicates of GC-FID fatty acid analysis were conducted for each oil sample. The oil samples were scanned using attenuated total reflectance-Fourier transform infrared spectroscopy (ATR-FTIR), and the spectrum was collected. Calibration models were developed using partial least-squares (PLS-1) regression in the Unscrambler software. Models were validated using a full cross-validation technique and tested using samples that were not included in the calibration sample set. Measured and predicted total CLA, trans,trans CLA isomers, total mono trans CLA isomers, trans-10,cis-12 CLA, trans-9,cis-11 CLA and cis-10,trans-12 CLA, and cis-9,trans-11 CLA had cross-validated coefficients of determination (R2v) of 0.97, 0.98, 0.97, 0.98, 0.97, and 0.99 and corresponding root-mean-square errors of validation (RMSEV) of 1.14, 0.69, 0.27, 0.07, 0.14, and 0.07% CLA, respectively. The ATR-FTIR technique is a more rapid and less expensive method than GC-FID for determining CLA isomers in linoleic acid photo-isomerized soy oil.
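    A full (leave-one-out) cross-validation of a PLS-1 calibration model, of the kind summarised here by R2v and RMSEV, can be sketched with scikit-learn (spectra and %CLA values below are synthetic stand-ins for the ATR-FTIR data, and the number of latent variables is an assumption):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

# Synthetic calibration set: 48 "spectra" x 400 wavenumbers, total %CLA response.
rng = np.random.default_rng(5)
X = rng.normal(size=(48, 400))
y = 12.0 + X[:, :10] @ rng.uniform(0.5, 1.5, size=10) + rng.normal(0.0, 0.3, size=48)

pls = PLSRegression(n_components=8)
y_cv = cross_val_predict(pls, X, y, cv=LeaveOneOut()).ravel()   # full cross-validation

r2v = 1.0 - np.sum((y - y_cv) ** 2) / np.sum((y - y.mean()) ** 2)
rmsev = np.sqrt(np.mean((y - y_cv) ** 2))
print(f"R2v = {r2v:.3f}, RMSEV = {rmsev:.2f} %CLA")
```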

  13. Improving Focal Depth Estimates: Studies of Depth Phase Detection at Regional Distances

    NASA Astrophysics Data System (ADS)

    Stroujkova, A.; Reiter, D. T.; Shumway, R. H.

    2006-12-01

    The accurate estimation of the depth of small, regionally recorded events continues to be an important and difficult explosion monitoring research problem. Depth phases (free surface reflections) are the primary tool that seismologists use to constrain the depth of a seismic event. When depth phases from an event are detected, an accurate source depth is easily found by using the delay times of the depth phases relative to the P wave and a velocity profile near the source. Cepstral techniques, including cepstral F-statistics, represent a class of methods designed for depth-phase detection and identification; however, they offer only a moderate level of success at epicentral distances less than 15°. This is due to complexities in the Pn coda, which can lead to numerous false detections in addition to the true phase detection. Therefore, cepstral methods cannot be used independently to reliably identify depth phases. Other evidence, such as apparent velocities, amplitudes and frequency content, must be used to confirm whether the phase is truly a depth phase. In this study we used a variety of array methods to estimate apparent phase velocities and arrival azimuths, including beam-forming, semblance analysis, MUltiple SIgnal Classification (MUSIC) (e.g., Schmidt, 1979), and cross-correlation (e.g., Cansi, 1995; Tibuleac and Herrin, 1997). To facilitate the processing and comparison of results, we developed a MATLAB-based processing tool, which allows application of all of these techniques (i.e., augmented cepstral processing) in a single environment. The main objective of this research was to combine the results of three focal-depth estimation techniques and their associated standard errors into a statistically valid unified depth estimate. The three techniques include: 1. Direct focal depth estimate from the depth-phase arrival times picked via augmented cepstral processing. 2. Hypocenter location from direct and surface-reflected arrivals observed on sparse networks of regional stations using a Grid-search, Multiple-Event Location method (GMEL; Rodi and Toksöz, 2000; 2001). 3. Surface-wave dispersion inversion for event depth and focal mechanism (Herrmann and Ammon, 2002). To validate our approach and provide quality control for our solutions, we applied the techniques to moderate-sized events (mb between 4.5 and 6.0) with known focal mechanisms. We illustrate the techniques using events observed at regional distances from the KSAR (Wonju, South Korea) teleseismic array and other nearby broadband three-component stations. Our results indicate that the techniques can produce excellent agreement between the various depth estimates. In addition, combining the techniques into a "unified" estimate greatly reduced location errors and improved robustness of the solution, even if results from the individual methods yielded large standard errors.
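    The abstract does not spell out how the three depth estimates and their standard errors are combined; one standard choice, assumed in the sketch below, is inverse-variance weighting:

```python
import numpy as np

def combine_estimates(depths, std_errors):
    """Inverse-variance weighted combination of independent depth estimates;
    returns the combined depth and its standard error.
    (The weighting scheme is an assumption, not taken from the abstract.)"""
    depths = np.asarray(depths, float)
    w = 1.0 / np.asarray(std_errors, float) ** 2
    combined = np.sum(w * depths) / np.sum(w)
    return combined, np.sqrt(1.0 / np.sum(w))

# Illustrative cepstral, GMEL and surface-wave depth estimates (km) with errors.
depth, err = combine_estimates([12.0, 14.5, 13.0], [2.0, 3.0, 1.5])
print(f"unified depth = {depth:.1f} +/- {err:.1f} km")
```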

  14. Application of the Delphi technique in healthcare maintenance.

    PubMed

    Njuangang, Stanley; Liyanage, Champika; Akintoye, Akintola

    2017-10-09

    Purpose The purpose of this paper is to examine the research design, issues and considerations in the application of the Delphi technique to identify, refine and rate the critical success factors and performance measures in maintenance-associated infections. Design/methodology/approach In-depth literature review through the application of open and axial coding were applied to formulate the interview and research questions. These were used to conduct an exploratory case study of two healthcare maintenance managers, randomly selected from two National Health Service Foundation Trusts in England. The results of exploratory case study provided the rationale for the application of the Delphi technique in this research. The different processes in the application of the Delphi technique in healthcare research are examined thoroughly. Findings This research demonstrates the need to apply and integrate different research methods to enhance the validity of the Delphi technique. The rationale for the application of the Delphi technique in this research is because some healthcare maintenance managers lack knowledge about basic infection control (IC) principles to make hospitals safe for patient care. The result of first round of the Delphi exercise is a useful contribution in its own rights. It identified a number of salient issues and differences in the opinions of the Delphi participants, noticeably between healthcare maintenance managers and members of the infection control team. It also resulted in useful suggestions and comments to improve the quality and presentation of the second- and third-round Delphi instruments. Practical implications This research provides a research methodology that can be adopted by researchers investigating new and emerging issues in the healthcare sector. As this research demonstrates, the Delphi technique is relevant in soliciting expert knowledge and opinion to identify performance measures to control maintenance-associated infections in hospitals. The methodology provided here could be applied by other researchers elsewhere to probe, investigate and generate rich information about new and emerging healthcare research topics. Originality/value The authors demonstrate how different research methods can be integrated to enhance the validity of the Delphi technique. For example, the results of an exploratory case study provided the rationale for the application of the Delphi technique investigating the key performance measures in maintenance-associated infections. The different processes involved in the application of the Delphi technique are also carefully explored and discussed in depth.

  15. Copernicus POD Service: Ready for Sentinel-3

    NASA Astrophysics Data System (ADS)

    Peter, H.; Fernández, J.; Escobar, D.; Féménias, P.; Flohrer, C.; Otten, M.

    2015-12-01

    The Copernicus POD Service is part of the Copernicus PDGS Ground Segment of the Sentinel missions. A GMV-led consortium is operating the Copernicus POD Service, being in charge of generating precise orbital products and auxiliary data files for their use as part of the processing chains of the respective Sentinel PDGS. The Sentinel-1, -2 & -3 missions have different but very demanding requirements in terms of orbital accuracy and timeliness. Orbital products in Near Real Time (latency: 30 min), Short Time Critical (1.5 days) and Non-Time Critical (20-30 days) are required. The accuracy requirements are very challenging, targeting 5 cm in 3D for Sentinel-1 and 2-3 cm in the radial direction for Sentinel-3. Sentinel-3A carries, in addition to a GPS receiver, a laser retroreflector and a DORIS receiver. On the one hand, the three different techniques GPS, SLR and DORIS make POD more complex but, on the other hand, it is very helpful to have independent techniques available for validation of the orbit results. The successful POD processing for Sentinel-1A is a good preparation for Sentinel-3A due to the similarly demanding orbit accuracy requirements. The Copernicus POD Service is ready for Sentinel-3A: the service will process GPS and SLR data routinely and has the capacity to process DORIS in NTC and reprocessing campaigns. The three independent orbit determination techniques on Sentinel-3 offer great potential for scientific exploitation. Carrying all three techniques together makes the satellite very useful, e.g., for combining all the techniques on the observation level, as could only be done for Jason-2 until now. The Sentinel POD Quality Working Group (QWG), which strongly supports the CPOD Service, delivers additional orbit solutions to validate the CPOD results independently. The recommendations from this body guarantee that the CPOD Service is updated following state-of-the-art algorithms, models and conventions. The QWG also focuses on the scientific exploitation of the Sentinel missions. The current status of the CPOD Service, operating Sentinel-1A and -2A, is presented, together with its readiness for Sentinel-3A. It is shown how the quality and the timeliness of the products are guaranteed. Possibilities for scientific exploitation of the Sentinel-3 mission, also in synergy with other Earth Observation and Sentinel missions, are presented.

  16. Influencing Factors and Workpiece's Microstructure in Laser-Assisted Milling of Titanium

    NASA Astrophysics Data System (ADS)

    Wiedenmann, R.; Liebl, S.; Zaeh, M. F.

    Today's lightweight components have to withstand increasing mechanical and thermal loads. Therefore, advanced materials substitute conventional materials like steel or aluminum alloys. With these high-performance materials, the associated costs become prohibitively high. This paper presents the newest fundamental investigations on the hybrid process 'laser-assisted milling', which is an innovative technique for processing such materials. The focus is on the validation of a numerical database for a CAD/CAM process control unit, which is calculated using simulation. Prior to that, the influencing factors on a laser-assisted milling process are systematically investigated using Design of Experiments (DoE) to identify the main influencing parameters coming from the laser and the milling operation.

  17. Hierarchical Multi-Scale Approach To Validation and Uncertainty Quantification of Hyper-Spectral Image Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engel, David W.; Reichardt, Thomas A.; Kulp, Thomas J.

    Validating predictive models and quantifying uncertainties inherent in the modeling process is a critical component of the HARD Solids Venture program [1]. Our current research focuses on validating physics-based models predicting the optical properties of solid materials for arbitrary surface morphologies and characterizing the uncertainties in these models. We employ a systematic and hierarchical approach by designing physical experiments and comparing the experimental results with the outputs of computational predictive models. We illustrate this approach through an example comparing a micro-scale forward model to an idealized solid-material system and then propagating the results through a system model to the sensor level. Our efforts should enhance detection reliability of the hyper-spectral imaging technique and the confidence in model utilization and model outputs by users and stakeholders.

  18. Validation of the Contrast Attenuation Technique (CAT) for Deducing Dust Densities from Photographic Records Taken during the MILL RACE High Explosive Test.

    DTIC Science & Technology

    1982-02-28

    DNA-TR-81-81 (Technical Report): Validation of the Contrast Attenuation Technique (CAT) for Deducing Dust Densities from Photographic Records Taken during the MILL RACE High Explosive Test. The report includes appendices covering scattering and extinction considerations and data on the films used for the MILL RACE CAT test.

  19. Expert system verification and validation study

    NASA Technical Reports Server (NTRS)

    French, Scott W.; Hamilton, David

    1992-01-01

    Five workshops on verification and validation (V&V) of expert systems (ES) were taught during this recent period of performance. Two key activities, previously performed under this contract, supported these recent workshops: (1) a survey of the state-of-the-practice of V&V of ES and (2) development of the workshop material and the first class. The first activity involved performing an extensive survey of ES developers in order to answer several questions regarding the state-of-the-practice in V&V of ES. These questions related to the amount and type of V&V done and the successfulness of this V&V. The next key activity involved developing an intensive hands-on workshop in V&V of ES. This activity involved surveying a large number of V&V techniques, conventional as well as ES-specific ones. In addition to explaining the techniques, we showed how each technique could be applied to a sample problem. References were included in the workshop material and cross-referenced to techniques, so that students would know where to go to find additional information about each technique. In addition to teaching specific techniques, we included an extensive amount of material on V&V concepts and how to develop a V&V plan for an ES project. We felt this material was necessary so that developers would be prepared to develop an orderly and structured approach to V&V; that is, they would have a process that supported the use of the specific techniques. Finally, to provide hands-on experience, we developed a set of case study exercises. These exercises provided an opportunity for the students to apply all the material (concepts, techniques, and planning material) to a realistic problem.

  20. Fabrication of an interim complete removable dental prosthesis with an in-office digital light processing three-dimensional printer: A proof-of-concept technique.

    PubMed

    Lin, Wei-Shao; Harris, Bryan T; Pellerito, John; Morton, Dean

    2018-04-30

    This report describes a proof of concept for fabricating an interim complete removable dental prosthesis with a digital light processing 3-dimensional (3D) printer. Although an in-office 3D printer can reduce the overall production cost for an interim complete removable dental prosthesis, the process has not been validated with clinical studies. This report provided a preliminary proof of concept in developing a digital workflow for the in-office additively manufactured interim complete removable dental prosthesis. Copyright © 2018 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  1. Fingerprint image enhancement by differential hysteresis processing.

    PubMed

    Blotta, Eduardo; Moler, Emilce

    2004-05-10

    A new method to enhance defective fingerprint images through digital image processing tools is presented in this work. When fingerprints have been taken without any care, blurred and in some cases mostly illegible, as in the case presented here, their classification and comparison become nearly impossible. A combination of spatial-domain filters, including a technique called differential hysteresis processing (DHP), is applied to improve these kinds of images. This set of filtering methods proved to be satisfactory in a wide range of cases by uncovering hidden details that helped to identify persons. Dactyloscopy experts from Policia Federal Argentina and the EAAF have validated these results.

  2. Faster methods for estimating arc centre position during VAR and results from Ti-6Al-4V and INCONEL 718 alloys

    NASA Astrophysics Data System (ADS)

    Nair, B. G.; Winter, N.; Daniel, B.; Ward, R. M.

    2016-07-01

    Direct measurement of the flow of electric current during VAR is extremely difficult due to the aggressive environment, as the arc process itself controls the distribution of current. In previous studies the technique of "magnetic source tomography" was presented; this was shown to be effective, but it used a computationally intensive iterative method to analyse the distribution of arc centre position. In this paper we present faster computational methods, requiring less numerical optimisation, to determine the centre position of a single distributed arc both numerically and experimentally. Numerical validation of the algorithms was done on models and experimental validation on measurements based on titanium and nickel alloys (Ti6Al4V and INCONEL 718). The results are used to comment on the effects of process parameters on arc behaviour during VAR.

  3. Development of Airport Surface Required Navigation Performance (RNP)

    NASA Technical Reports Server (NTRS)

    Cassell, Rick; Smith, Alex; Hicok, Dan

    1999-01-01

    The U.S. and international aviation communities have adopted the Required Navigation Performance (RNP) process for defining aircraft performance when operating in the en-route, approach and landing phases of flight. RNP consists primarily of the following key parameters: accuracy, integrity, continuity, and availability. The processes and analytical techniques employed to define en-route, approach and landing RNP have been applied in the development of RNP for the airport surface. To validate the proposed RNP requirements, several methods were used. Operational and flight demonstration data were analyzed for conformance with the proposed requirements, as were several aircraft flight simulation studies. The pilot failure risk component was analyzed through several hypothetical scenarios. Additional simulator studies are recommended to better quantify crew reactions to failures, as well as additional simulator and field testing to validate achieved accuracy performance. This research was performed in support of the NASA Low Visibility Landing and Surface Operations Programs.

  4. High speed stereovision setup for position and motion estimation of fertilizer particles leaving a centrifugal spreader.

    PubMed

    Hijazi, Bilal; Cool, Simon; Vangeyte, Jürgen; Mertens, Koen C; Cointault, Frédéric; Paindavoine, Michel; Pieters, Jan G

    2014-11-13

    A 3D imaging technique using a high-speed binocular stereovision system was developed, in combination with corresponding image processing algorithms, for accurate determination of the parameters of particles leaving the spinning disks of centrifugal fertilizer spreaders. Validation of the stereo-matching algorithm using a virtual 3D stereovision simulator indicated an error of less than 2 pixels for 90% of the particles. The setup was validated using the cylindrical spread pattern of an experimental spreader. A 2D correlation coefficient of 90% and a relative error of 27% were found between the experimental results and the (simulated) spread pattern obtained with the developed setup. In combination with a ballistic flight model, the developed image acquisition and processing algorithms can enable fast determination and evaluation of the spread pattern, which can be used as a tool for spreader design and precise machine calibration.

  5. Validating (d,p gamma) as a Surrogate for Neutron Capture

    DOE PAGES

    Ratkiewicz, A.; Cizewski, J.A.; Pain, S.D.; ...

    2015-05-28

    The r-process is responsible for creating roughly half of the elements heavier than iron. It has recently become understood that the rates at which neutron capture reactions proceed at late times in the r-process may dramatically affect the final abundance pattern. However, direct measurements of neutron capture reaction rates on exotic nuclei are exceptionally difficult, necessitating the development of indirect approaches such as the surrogate technique. The (d,pγ) reaction at low energies was identified as a promising surrogate for the (n,γ) reaction, as both reactions share many characteristics. We report on a program to validate (d,pγ) as a surrogate for (n,γ) using 95Mo as a target. The experimental campaign includes direct measurements of the γ-ray intensities from the decay of excited states populated in the 95Mo(n,γ) and 95Mo(d,pγ) reactions.

  6. An automatic tooth preparation technique: A preliminary study

    NASA Astrophysics Data System (ADS)

    Yuan, Fusong; Wang, Yong; Zhang, Yaopeng; Sun, Yuchun; Wang, Dangxiao; Lyu, Peijun

    2016-04-01

    The aim of this study is to validate the feasibility and accuracy of a new automatic tooth preparation technique in dental healthcare. An automatic tooth preparation robotic device with three-dimensional motion planning software was developed, which controlled an ultra-short pulse laser (USPL) beam (wavelength 1,064 nm, pulse width 15 ps, output power 30 W, and repetition rate 100 kHz) to complete the tooth preparation process. A total of 15 freshly extracted human intact first molars were collected and fixed into a phantom head, and the target preparation shapes of these molars were designed using customised computer-aided design (CAD) software. The accuracy of tooth preparation was evaluated using the Geomagic Studio and Imageware software, and the preparation time of each tooth was recorded. Compared with the target preparation shape, the average shape error of the 15 prepared molars was 0.05-0.17 mm, the preparation depth error of the occlusal surface was approximately 0.097 mm, and the error of the convergence angle was approximately 1.0°. The average preparation time was 17 minutes. These results validated the accuracy and feasibility of the automatic tooth preparation technique.

  7. An automatic tooth preparation technique: A preliminary study.

    PubMed

    Yuan, Fusong; Wang, Yong; Zhang, Yaopeng; Sun, Yuchun; Wang, Dangxiao; Lyu, Peijun

    2016-04-29

    The aim of this study is to validate the feasibility and accuracy of a new automatic tooth preparation technique in dental healthcare. An automatic tooth preparation robotic device with three-dimensional motion planning software was developed, which controlled an ultra-short pulse laser (USPL) beam (wavelength 1,064 nm, pulse width 15 ps, output power 30 W, and repetition rate 100 kHz) to complete the tooth preparation process. A total of 15 freshly extracted human intact first molars were collected and fixed into a phantom head, and the target preparation shapes of these molars were designed using customised computer-aided design (CAD) software. The accuracy of tooth preparation was evaluated using the Geomagic Studio and Imageware software, and the preparation time of each tooth was recorded. Compared with the target preparation shape, the average shape error of the 15 prepared molars was 0.05-0.17 mm, the preparation depth error of the occlusal surface was approximately 0.097 mm, and the error of the convergence angle was approximately 1.0°. The average preparation time was 17 minutes. These results validated the accuracy and feasibility of the automatic tooth preparation technique.

  8. Detection of small earthquakes with dense array data: example from the San Jacinto fault zone, southern California

    NASA Astrophysics Data System (ADS)

    Meng, Haoran; Ben-Zion, Yehuda

    2018-01-01

    We present a technique to detect small earthquakes not included in standard catalogues using data from a dense seismic array. The technique is illustrated with continuous waveforms recorded during a test day by 1108 vertical geophones in a tight array on the San Jacinto fault zone. Waveforms are first stacked without time-shift in nine non-overlapping subarrays to increase the signal-to-noise ratio. The nine envelope functions of the stacked records are then multiplied with each other to suppress signals associated with sources affecting only some of the nine subarrays. Running a short-term moving average/long-term moving average (STA/LTA) detection algorithm on the product leads to 723 triggers in the test day. Using a local P-wave velocity model derived for the surface layer from Betsy gunshot data, 5 s long waveforms of all sensors around each STA/LTA trigger are beamformed for various incident directions. Of the 723 triggers, 220 are found to have localized energy sources and 103 of these are confirmed as earthquakes by verifying their observation at 4 or more stations of the regional seismic network. This demonstrates the general validity of the method and allows further processing of the validated events using standard techniques. The number of validated events in the test day is >5 times larger than that in the standard catalogue. Using these events as templates can lead to additional detections of many more earthquakes.
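
    The detection chain described above (stack per subarray, take envelopes, multiply the envelopes, run STA/LTA on the product) can be sketched in Python as follows. The synthetic geophone data, window lengths, and trigger threshold are invented placeholders; this is a conceptual sketch, not the authors' processing code.

```python
# Conceptual sketch of the subarray-stack / envelope-product / STA-LTA chain.
import numpy as np
from scipy.signal import hilbert

def sta_lta(x, nsta, nlta):
    """Short-term over long-term moving-average ratio via cumulative sums."""
    csum = np.cumsum(x, dtype=float)
    sta = (csum[nsta:] - csum[:-nsta]) / nsta
    lta = (csum[nlta:] - csum[:-nlta]) / nlta
    n = min(len(sta), len(lta))
    return sta[-n:] / np.maximum(lta[-n:], 1e-12)

rng = np.random.default_rng(1)
fs = 100                                            # samples per second (assumed)
traces = rng.standard_normal((9, 40, fs * 60))      # 9 subarrays x 40 sensors x 60 s (placeholder)
traces[:, :, 3000:3050] += 5.0                      # a small signal common to all subarrays

stacks = traces.mean(axis=1)                        # stack without time shift inside each subarray
envelopes = np.abs(hilbert(stacks, axis=1))         # envelope of each stacked record
product = np.prod(envelopes, axis=0)                # suppresses sources seen by only some subarrays

ratio = sta_lta(product, nsta=fs, nlta=10 * fs)     # 1 s STA over 10 s LTA (assumed windows)
triggers = np.flatnonzero(ratio > 10.0)             # threshold is illustrative
print("trigger samples (relative to ratio array):", triggers[:5])
```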

  9. Text Mining in Organizational Research

    PubMed Central

    Kobayashi, Vladimer B.; Berkers, Hannah A.; Kismihók, Gábor; Den Hartog, Deanne N.

    2017-01-01

    Despite the ubiquity of textual data, so far few researchers have applied text mining to answer organizational research questions. Text mining, which essentially entails a quantitative approach to the analysis of (usually) voluminous textual data, helps accelerate knowledge discovery by radically increasing the amount of data that can be analyzed. This article aims to acquaint organizational researchers with the fundamental logic underpinning text mining, the analytical stages involved, and contemporary techniques that may be used to achieve different types of objectives. The specific analytical techniques reviewed are (a) dimensionality reduction, (b) distance and similarity computing, (c) clustering, (d) topic modeling, and (e) classification. We describe how text mining may extend contemporary organizational research by allowing the testing of existing or new research questions with data that are likely to be rich, contextualized, and ecologically valid. After an exploration of how evidence for the validity of text mining output may be generated, we conclude the article by illustrating the text mining process in a job analysis setting using a dataset composed of job vacancies. PMID:29881248
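
    As a toy illustration of the analytical stages listed above (vectorization, dimensionality reduction, similarity computation, clustering, and classification), the following Python sketch uses scikit-learn on a few invented job-vacancy-like snippets; it is not the article's pipeline or dataset.

```python
# Toy text-mining pipeline: TF-IDF, SVD, cosine similarity, clustering, classification.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

docs = [
    "nurse patient care hospital shift",
    "registered nurse clinical ward experience",
    "software engineer python backend services",
    "developer java cloud microservices",
]
labels = ["health", "health", "tech", "tech"]        # invented labels

X = TfidfVectorizer().fit_transform(docs)            # bag-of-words weighting
X_red = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)  # dimensionality reduction
print(cosine_similarity(X)[0])                       # distance/similarity computing
print(KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_red))  # clustering
clf = LogisticRegression().fit(X_red, labels)        # classification
print(clf.predict(X_red[:1]))
```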

  10. Text Mining in Organizational Research.

    PubMed

    Kobayashi, Vladimer B; Mol, Stefan T; Berkers, Hannah A; Kismihók, Gábor; Den Hartog, Deanne N

    2018-07-01

    Despite the ubiquity of textual data, so far few researchers have applied text mining to answer organizational research questions. Text mining, which essentially entails a quantitative approach to the analysis of (usually) voluminous textual data, helps accelerate knowledge discovery by radically increasing the amount of data that can be analyzed. This article aims to acquaint organizational researchers with the fundamental logic underpinning text mining, the analytical stages involved, and contemporary techniques that may be used to achieve different types of objectives. The specific analytical techniques reviewed are (a) dimensionality reduction, (b) distance and similarity computing, (c) clustering, (d) topic modeling, and (e) classification. We describe how text mining may extend contemporary organizational research by allowing the testing of existing or new research questions with data that are likely to be rich, contextualized, and ecologically valid. After an exploration of how evidence for the validity of text mining output may be generated, we conclude the article by illustrating the text mining process in a job analysis setting using a dataset composed of job vacancies.

  11. Feathering effect detection and artifact agglomeration index-based video deinterlacing technique

    NASA Astrophysics Data System (ADS)

    Martins, André Luis; Rodrigues, Evandro Luis Linhari; de Paiva, Maria Stela Veludo

    2018-03-01

    Several video deinterlacing techniques have been developed, and each one performs better under certain conditions. Occasionally, even the most modern deinterlacing techniques create frames with worse quality than primitive deinterlacing processes. This paper validates that the final image quality can be improved by combining different types of deinterlacing techniques. The proposed strategy is able to select between two types of deinterlaced frames and, if necessary, make local corrections to the defects. This decision is based on an artifact agglomeration index obtained from a feathering effect detection map. Starting from a deinterlaced frame produced by the "interfield average" method, the defective areas are identified and, if deemed appropriate, these areas are replaced by pixels generated through the "edge-based line average" method. Test results show that the proposed technique is able to produce video frames with higher quality than a single deinterlacing technique by combining the strengths of intra- and interfield methods.
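
    A minimal Python sketch of the two base deinterlacers combined above is given below: "interfield average" (weaving the two fields and averaging) and "edge-based line average" (directional spatial interpolation within one field). The per-region selection driven by the feathering/artifact-agglomeration map is not reproduced, and the array shapes are illustrative assumptions.

```python
# Sketch of the two base deinterlacers; the feathering-based selection is omitted.
import numpy as np

def interfield_average(top_field, bottom_field):
    """Weave the two fields into one frame (temporal, interfield information)."""
    h = top_field.shape[0] + bottom_field.shape[0]
    frame = np.zeros((h, top_field.shape[1]), dtype=float)
    frame[0::2] = top_field
    frame[1::2] = bottom_field
    return frame

def ela_interpolate(field):
    """Edge-based line average: fill missing lines from the lines above/below
    along the direction (left, vertical, right) with the smallest difference."""
    h, w = field.shape
    out = np.zeros((2 * h, w), dtype=float)
    out[0::2] = field
    for y in range(h - 1):
        up, down = field[y], field[y + 1]
        for x in range(1, w - 1):
            cands = [(abs(up[x - 1] - down[x + 1]), (up[x - 1] + down[x + 1]) / 2),
                     (abs(up[x]     - down[x]),     (up[x]     + down[x])     / 2),
                     (abs(up[x + 1] - down[x - 1]), (up[x + 1] + down[x - 1]) / 2)]
            out[2 * y + 1, x] = min(cands)[1]          # pick the direction with minimal difference
        out[2 * y + 1, 0], out[2 * y + 1, -1] = (up[0] + down[0]) / 2, (up[-1] + down[-1]) / 2
    out[-1] = out[-2]                                   # duplicate the last available line
    return out

rng = np.random.default_rng(0)
top, bottom = rng.random((120, 160)), rng.random((120, 160))   # placeholder fields
print(interfield_average(top, bottom).shape, ela_interpolate(top).shape)
```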

  12. Validating an Air Traffic Management Concept of Operation Using Statistical Modeling

    NASA Technical Reports Server (NTRS)

    He, Yuning; Davies, Misty Dawn

    2013-01-01

    Validating a concept of operation for a complex, safety-critical system (like the National Airspace System) is challenging because of the high dimensionality of the controllable parameters and the infinite number of states of the system. In this paper, we use statistical modeling techniques to explore the behavior of a conflict detection and resolution algorithm designed for the terminal airspace. These techniques predict the robustness of the system simulation to both nominal and off-nominal behaviors within the overall airspace. They can also be used to evaluate the output of the simulation against recorded airspace data. Additionally, the techniques carry with them a mathematical measure of the worth of each prediction, a statistical uncertainty for any robustness estimate. Uncertainty Quantification (UQ) is the process of quantitative characterization and, ultimately, reduction of uncertainties in complex systems. UQ is important for understanding the influence of uncertainties on the behavior of a system and is therefore valuable for design, analysis, and verification and validation. In this paper, we apply advanced statistical modeling methodologies and techniques to an advanced air traffic management system, namely the Terminal Tactical Separation Assured Flight Environment (T-TSAFE). We show initial results for a parameter analysis and safety boundary (envelope) detection in the high-dimensional parameter space. For our boundary analysis, we developed a new sequential approach based upon the design of computer experiments, allowing us to incorporate knowledge from domain experts into our modeling and to determine the most likely boundary shapes and their parameters. We carried out the analysis on system parameters and describe an initial approach that will allow us to include time-series inputs, such as radar track data, into the analysis.

  13. Analytical Method Development and Validation for the Quantification of Acetone and Isopropyl Alcohol in the Tartaric Acid Base Pellets of Dipyridamole Modified Release Capsules by Using Headspace Gas Chromatographic Technique

    PubMed Central

    2018-01-01

    A simple, sensitive, accurate, robust headspace gas chromatographic method was developed for the quantitative determination of acetone and isopropyl alcohol in tartaric acid-based pellets of dipyridamole modified release capsules. The residual solvents acetone and isopropyl alcohol were used in the manufacturing process of the tartaric acid-based pellets of dipyridamole modified release capsules, considering the solubility of dipyridamole and the excipients in the different manufacturing stages. The method was developed and optimized using a fused silica DB-624 (30 m × 0.32 mm × 1.8 µm) column with a flame ionization detector. The method validation was carried out in accordance with the Q2 guidelines for validation of analytical procedures of the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH). All the validation characteristics met the acceptance criteria. Hence, the developed and validated method can be applied for the intended routine analysis. PMID:29686931

  14. Analytical Method Development and Validation for the Quantification of Acetone and Isopropyl Alcohol in the Tartaric Acid Base Pellets of Dipyridamole Modified Release Capsules by Using Headspace Gas Chromatographic Technique.

    PubMed

    Valavala, Sriram; Seelam, Nareshvarma; Tondepu, Subbaiah; Jagarlapudi, V Shanmukha Kumar; Sundarmurthy, Vivekanandan

    2018-01-01

    A simple, sensitive, accurate, robust headspace gas chromatographic method was developed for the quantitative determination of acetone and isopropyl alcohol in tartaric acid-based pellets of dipyridamole modified release capsules. The residual solvents acetone and isopropyl alcohol were used in the manufacturing process of the tartaric acid-based pellets of dipyridamole modified release capsules, considering the solubility of dipyridamole and the excipients in the different manufacturing stages. The method was developed and optimized using a fused silica DB-624 (30 m × 0.32 mm × 1.8 µm) column with a flame ionization detector. The method validation was carried out in accordance with the Q2 guidelines for validation of analytical procedures of the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH). All the validation characteristics met the acceptance criteria. Hence, the developed and validated method can be applied for the intended routine analysis.

  15. Modeling and validation of heat and mass transfer in individual coffee beans during the coffee roasting process using computational fluid dynamics (CFD).

    PubMed

    Alonso-Torres, Beatriz; Hernández-Pérez, José Alfredo; Sierra-Espinoza, Fernando; Schenker, Stefan; Yeretzian, Chahan

    2013-01-01

    Heat and mass transfer in individual coffee beans during roasting were simulated using computational fluid dynamics (CFD). Numerical equations for heat and mass transfer inside the coffee bean were solved using the finite volume technique in the commercial CFD code Fluent; the software was complemented with specific user-defined functions (UDFs). To experimentally validate the numerical model, a single coffee bean was placed in a cylindrical glass tube and roasted by a hot air flow, using the identical geometrical 3D configuration and hot air flow conditions as the ones used for numerical simulations. Temperature and humidity calculations obtained with the model were compared with experimental data. The model predicts the actual process quite accurately and represents a useful approach to monitor the coffee roasting process in real time. It provides valuable information on time-resolved process variables that are otherwise difficult to obtain experimentally, but critical to a better understanding of the coffee roasting process at the individual bean level. This includes variables such as time-resolved 3D profiles of bean temperature and moisture content, and temperature profiles of the roasting air in the vicinity of the coffee bean.
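
    The full 3D CFD model with user-defined functions is not reproduced here; as a conceptual stand-in, the following Python sketch solves 1D transient heat conduction into a bean-like slab with an explicit finite-volume scheme and a convective boundary facing the hot roasting air. All property values are rough assumptions.

```python
# 1D explicit finite-volume heat conduction sketch (conceptual stand-in for the 3D CFD model).
import numpy as np

L, n = 3e-3, 30                 # half-thickness (m) and number of control volumes (assumed)
dx = L / n
alpha = 1.0e-7                  # assumed thermal diffusivity of the bean (m^2/s)
k, h = 0.2, 50.0                # assumed conductivity (W/m K) and heat transfer coeff (W/m^2 K)
T_air, T0 = 230.0, 25.0         # roasting air and initial bean temperature (deg C)

dt = 0.4 * dx**2 / alpha        # explicit stability limit with margin
T = np.full(n, T0)
for _ in range(int(600 / dt)):                    # 10 minutes of roasting
    Tn = T.copy()
    # interior control volumes: balance of conductive fluxes on both faces
    Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    # symmetry (bean centre) on the left, convective heating on the right
    Tn[0] = T[0] + alpha * dt / dx**2 * (T[1] - T[0])
    q_conv = h * (T_air - T[-1])
    Tn[-1] = T[-1] + dt / dx * (alpha / dx * (T[-2] - T[-1]) + alpha / k * q_conv)
    T = Tn
print(f"centre {T[0]:.1f} C, surface {T[-1]:.1f} C after 10 min")
```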

  16. High-Throughput Platform for Synthesis of Melamine-Formaldehyde Microcapsules.

    PubMed

    Çakir, Seda; Bauters, Erwin; Rivero, Guadalupe; Parasote, Tom; Paul, Johan; Du Prez, Filip E

    2017-07-10

    The synthesis of microcapsules via in situ polymerization is a labor-intensive and time-consuming process, where many composition and process factors affect microcapsule formation and morphology. Herein, we report a novel combinatorial technique for the preparation of melamine-formaldehyde microcapsules, using a custom-made and automated high-throughput platform (HTP). After performing validation experiments to ensure the accuracy and reproducibility of the novel platform, a design of experiments study was performed. The influence of different encapsulation parameters was investigated, such as the surfactant type, surfactant concentration and core/shell ratio. As a result, this HTP platform can be used for the synthesis of different types of microcapsules in an automated and controlled way, allowing different reaction parameters to be screened in a shorter time than with manual synthesis techniques.

  17. Developing an instrument to measure emotional behaviour abilities of meaningful learning through the Delphi technique.

    PubMed

    Cadorin, Lucia; Bagnasco, Annamaria; Tolotti, Angela; Pagnucci, Nicola; Sasso, Loredana

    2017-09-01

    To identify items for a new instrument that measures emotional behaviour abilities of meaningful learning, according to Fink's Taxonomy. Meaningful learning is an active process that promotes a wider and deeper understanding of concepts. It is the result of an interaction between new and previous knowledge and produces a long-term change of knowledge and skills. To measure meaningful learning capability, it is very important in the education of health professionals to identify problems or special learning needs. For this reason, it is necessary to create valid instruments. A Delphi Study technique was implemented in four phases by means of e-mail. The study was conducted from April-September 2015. An expert panel consisting of ten researchers with experience in Fink's Taxonomy was established to identify the items of the instrument. Data were analysed for conceptual description and item characteristics and attributes were rated. Expert consensus was sought in each of these phases. An 87·5% consensus cut-off was established. After four rounds, consensus was obtained for validation of the content of the instrument 'Assessment of Meaningful learning Behavioural and Emotional Abilities'. This instrument consists of 56 items evaluated on a 6-point Likert-type scale. Foundational Knowledge, Application, Integration, Human Dimension, Caring and Learning How to Learn were the six major categories explored. This content validated tool can help educators (teachers, trainers and tutors) to identify and improve the strategies to support students' learning capability, which could increase their awareness of and/or responsibility in the learning process. © 2017 John Wiley & Sons Ltd.

  18. Critical review of norovirus surrogates in food safety research: rationale for considering volunteer studies.

    PubMed

    Richards, Gary P

    2012-03-01

    The inability to propagate human norovirus (NoV) or to clearly differentiate infectious from noninfectious virus particles has led to the use of surrogate viruses, like feline calicivirus (FCV) and murine norovirus-1 (MNV), which can be propagated in cell culture. The use of surrogates is predicated on the assumption that they generally mimic the viruses they represent; however, studies are proving this concept invalid. In direct comparisons between FCV and MNV, their susceptibility to temperatures, environmental and food processing conditions, and disinfectants is dramatically different. Differences have also been noted between the inactivation of NoV and its surrogates, thus questioning the validity of surrogates. Considerable research funding is provided globally each year to conduct surrogate studies on NoVs; however, there is little demonstrated benefit derived from these studies in regard to the development of virus inactivation techniques or food processing strategies. Human challenge studies are needed to determine which processing techniques are effective in reducing NoVs in foods. A major obstacle to clinical trials on NoVs is the perception that such trials are too costly and risky, but in reality, there is far more cost and risk in allowing millions of unsuspecting consumers to contract NoV illness each year, when practical interventions are only a few volunteer studies away. A number of clinical trials have been conducted, providing important insights into NoV inactivation. A shift in research priorities from surrogate research to volunteer studies is essential if we are to identify realistic, practical, and scientifically valid processing approaches to improve food safety.

  19. At-line process analytical technology (PAT) for more efficient scale up of biopharmaceutical microfiltration unit operations.

    PubMed

    Watson, Douglas S; Kerchner, Kristi R; Gant, Sean S; Pedersen, Joseph W; Hamburger, James B; Ortigosa, Allison D; Potgieter, Thomas I

    2016-01-01

    Tangential flow microfiltration (MF) is a cost-effective and robust bioprocess separation technique, but successful full scale implementation is hindered by the empirical, trial-and-error nature of scale-up. We present an integrated approach leveraging at-line process analytical technology (PAT) and mass balance based modeling to de-risk MF scale-up. Chromatography-based PAT was employed to improve the consistency of an MF step that had been a bottleneck in the process used to manufacture a therapeutic protein. A 10-min reverse phase ultra high performance liquid chromatography (RP-UPLC) assay was developed to provide at-line monitoring of protein concentration. The method was successfully validated and method performance was comparable to previously validated methods. The PAT tool revealed areas of divergence from a mass balance-based model, highlighting specific opportunities for process improvement. Adjustment of appropriate process controls led to improved operability and significantly increased yield, providing a successful example of PAT deployment in the downstream purification of a therapeutic protein. The general approach presented here should be broadly applicable to reduce risk during scale-up of filtration processes and should be suitable for feed-forward and feed-back process control. © 2015 American Institute of Chemical Engineers.

  20. Gaussian process regression for tool wear prediction

    NASA Astrophysics Data System (ADS)

    Kong, Dongdong; Chen, Yongjie; Li, Ning

    2018-05-01

    To realize and accelerate the pace of intelligent manufacturing, this paper presents a novel tool wear assessment technique based on integrated radial basis function based kernel principal component analysis (KPCA_IRBF) and Gaussian process regression (GPR) for accurate real-time monitoring of the in-process tool wear parameter (flank wear width). KPCA_IRBF is a new nonlinear dimension-increment technique, proposed here for feature fusion. The tool wear predictive value and the corresponding confidence interval are both provided by the GPR model. GPR also performs better than artificial neural networks (ANN) and support vector machines (SVM) in prediction accuracy, since Gaussian noise can be modeled quantitatively in the GPR model. However, the presence of noise seriously affects the stability of the confidence interval. In this work, the proposed KPCA_IRBF technique helps to remove the noise and weaken its negative effects, so that the confidence interval is greatly compressed and smoothed, which is conducive to accurate tool wear monitoring. Moreover, the kernel parameter in KPCA_IRBF can be selected from a much larger region than in the conventional KPCA_RBF technique, which helps to improve the efficiency of model construction. Ten sets of cutting tests were conducted to validate the effectiveness of the presented tool wear assessment technique. The experimental results show that the in-process flank wear width of tool inserts can be monitored accurately with the presented technique, which is robust under a variety of cutting conditions. This study lays the foundation for tool wear monitoring in real industrial settings.
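
    The GPR step, returning both a wear prediction and a confidence interval, can be sketched with scikit-learn as below. Synthetic features stand in for the KPCA_IRBF-fused signal features, and the kernel choice is an assumption rather than the authors' exact model.

```python
# Sketch of GPR-based tool wear prediction with a confidence interval.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(80, 3))                        # fused features (placeholder)
wear = 0.02 * X[:, 0] + 0.01 * X[:, 1] ** 1.5 + 0.005 * rng.standard_normal(80)  # flank wear (mm)

kernel = 1.0 * RBF(length_scale=2.0) + WhiteKernel(noise_level=1e-4)  # noise modeled explicitly
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, wear)

X_new = rng.uniform(0, 10, size=(5, 3))
mean, std = gpr.predict(X_new, return_std=True)             # prediction + predictive spread
lower, upper = mean - 1.96 * std, mean + 1.96 * std         # ~95% confidence interval
print(np.c_[mean, lower, upper])
```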

  1. Application of Coherent Anti-Stokes Raman Scattering to Combustion Media.

    DTIC Science & Technology

    1981-02-01

    This report compares single-shot CARS thermometry with other optical thermometric techniques for real-time temperature measurement in flames, including a two-line fluorescence system developed for that purpose. The comparisons provide a basis for evaluating the validity and accuracy of the CARS process as a thermometric tool for flames.

  2. Improvement of Computer Software Quality through Software Automated Tools.

    DTIC Science & Technology

    1986-08-31

    The requirement for increased emphasis on software quality assurance has led to the creation of various methods of verification and validation, and with them a vast array of methods, systems, languages and automated tools to assist in the process. Unfortunately, there is no single method, tool or technique that can ensure accurate, reliable and cost-effective software.

  3. Using deep learning for detecting gender in adult chest radiographs

    NASA Astrophysics Data System (ADS)

    Xue, Zhiyun; Antani, Sameer; Long, L. Rodney; Thoma, George R.

    2018-03-01

    In this paper, we present a method for automatically identifying the gender of an imaged person from their frontal chest x-ray images. Our work is motivated by the need to determine missing gender information in some datasets. The proposed method employs convolutional neural network (CNN) based deep learning and transfer learning to overcome the challenge of developing handcrafted features from limited data. Specifically, the method consists of four main steps: pre-processing, CNN feature extraction, feature selection, and classification. The method is tested on a combined dataset obtained from several sources with varying acquisition quality, resulting in different pre-processing steps for each. For feature extraction, we tested and compared four CNN architectures, viz., AlexNet, VggNet, GoogLeNet, and ResNet. We applied a feature selection technique, since the feature length is larger than the number of images. Two popular classifiers, SVM and Random Forest, are used and compared. We evaluated the classification performance by cross-validation and used seven performance measures. The best performer is the VggNet-16 feature extractor with the SVM classifier, with an accuracy of 86.6% and an ROC area of 0.932 under 5-fold cross-validation. We also discuss several misclassified cases and describe future work for performance improvement.
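
    The classification stage only (feature selection plus SVM scored by 5-fold cross-validation) can be sketched in Python as follows, assuming the CNN features have already been extracted for each radiograph; the random feature matrix and labels are placeholders, and VggNet-16 feature extraction itself is not shown.

```python
# Sketch of feature selection + SVM with 5-fold cross-validation on precomputed CNN features.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC
from sklearn.model_selection import cross_validate

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 4096))       # e.g. fc-layer features, one row per image (placeholder)
y = rng.integers(0, 2, size=200)           # 0 = female, 1 = male (placeholder labels)

clf = make_pipeline(StandardScaler(),
                    SelectKBest(f_classif, k=100),   # feature selection (features >> images)
                    SVC(kernel="rbf", C=1.0))
scores = cross_validate(clf, X, y, cv=5, scoring=["accuracy", "roc_auc"])
print(scores["test_accuracy"].mean(), scores["test_roc_auc"].mean())
```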

  4. Integration of Scale Invariant Generator Technique and S-A Technique for Characterizing 2-D Patterns for Information Retrieve

    NASA Astrophysics Data System (ADS)

    Cao, L.; Cheng, Q.

    2004-12-01

    The scale invariant generator technique (SIG) and the spectrum-area analysis technique (S-A) were developed independently in connection with the concept of generalized scale invariance (GSI). The former was developed for characterizing the parameters involved in the GSI for characterizing and simulating multifractal measures, whereas the latter was for identifying scaling breaks for decomposition of superimposed multifractal measures caused by multiple geophysical processes. A natural integration of these two techniques may yield a new technique that serves two purposes: on the one hand, it can enrich the power of S-A by increasing the interpretability of decomposed patterns in some applications of S-A; on the other hand, it can provide a means to test the uniqueness of the multifractality of measures, which is essential for applying the SIG technique in more complicated environments. The proposed technique has been implemented as a Dynamic Link Library (DLL) in Visual C++. The program can readily be used for method validation and application in different fields.

  5. Direct-Write Printing on Three-Dimensional Geometries for Miniaturized Detector and Electronic Assemblies

    NASA Technical Reports Server (NTRS)

    Paquette, Beth; Samuels, Margaret; Chen, Peng

    2017-01-01

    Direct-write printing techniques will enable new detector assemblies that were not previously possible with traditional assembly processes. Detector concepts were manufactured using this technology to validate repeatability. Additional detector applications and printed wires on a 3-dimensional magnetometer bobbin will be designed for printing. This effort focuses on evaluating the performance of direct-write manufacturing techniques on 3-dimensional surfaces. Direct-write manufacturing has the potential to reduce mass and volume in the fabrication and assembly of advanced detector concepts by reducing trace widths down to 10 microns, printing on complex geometries, enabling the production of new electronic concepts, and reducing production times for complex electronics.

  6. Reverberation Chamber Uniformity Validation and Radiated Susceptibility Test Procedures for the NASA High Intensity Radiated Fields Laboratory

    NASA Technical Reports Server (NTRS)

    Koppen, Sandra V.; Nguyen, Truong X.; Mielnik, John J.

    2010-01-01

    The NASA Langley Research Center's High Intensity Radiated Fields Laboratory has developed a capability based on the RTCA/DO-160F Section 20 guidelines for radiated electromagnetic susceptibility testing in reverberation chambers. Phase 1 of the test procedure utilizes mode-tuned stirrer techniques and E-field probe measurements to validate chamber uniformity, determines chamber loading effects, and defines a radiated susceptibility test process. The test procedure is segmented into numbered operations that are largely software controlled. This document is intended as a laboratory test reference and includes diagrams of test setups, equipment lists, as well as test results and analysis. Phase 2 of development is discussed.

  7. Design and Validation of 3D Printed Complex Bone Models with Internal Anatomic Fidelity for Surgical Training and Rehearsal.

    PubMed

    Unger, Bertram J; Kraut, Jay; Rhodes, Charlotte; Hochman, Jordan

    2014-01-01

    Physical models of complex bony structures can be used for surgical skills training. Current models focus on surface rendering but suffer from a lack of internal accuracy due to limitations in the manufacturing process. We describe a technique for generating internally accurate rapid-prototyped anatomical models with solid and hollow structures from clinical and microCT data using a 3D printer. In a face validation experiment, otolaryngology residents drilled a cadaveric bone and its corresponding printed model. The printed bone models were deemed highly realistic representations across all measured parameters and the educational value of the models was strongly appreciated.

  8. Efficient generalized cross-validation with applications to parametric image restoration and resolution enhancement.

    PubMed

    Nguyen, N; Milanfar, P; Golub, G

    2001-01-01

    In many image restoration/resolution enhancement applications, the blurring process, i.e., point spread function (PSF) of the imaging system, is not known or is known only to within a set of parameters. We estimate these PSF parameters for this ill-posed class of inverse problem from raw data, along with the regularization parameters required to stabilize the solution, using the generalized cross-validation method (GCV). We propose efficient approximation techniques based on the Lanczos algorithm and Gauss quadrature theory, reducing the computational complexity of the GCV. Data-driven PSF and regularization parameter estimation experiments with synthetic and real image sequences are presented to demonstrate the effectiveness and robustness of our method.
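
    A minimal Python sketch of generalized cross-validation for a Tikhonov-regularized linear model y = A x + noise is shown below, using a plain SVD instead of the Lanczos/Gauss-quadrature acceleration described in the paper; the blurring matrix and noise level are synthetic assumptions.

```python
# GCV selection of a Tikhonov regularization parameter via SVD (illustrative, small-scale).
import numpy as np

def gcv_score(lam, U, s, y):
    """GCV(lam) = n * ||(I - A A_lam) y||^2 / trace(I - A A_lam)^2 for Tikhonov filtering."""
    n = len(y)
    f = s**2 / (s**2 + lam**2)          # Tikhonov filter factors
    Uty = U.T @ y
    resid = np.sum(((1 - f) * Uty) ** 2) + (y @ y - Uty @ Uty)  # part of y outside range(U)
    return n * resid / (n - np.sum(f)) ** 2

rng = np.random.default_rng(0)
n = 100
A = np.exp(-0.5 * ((np.arange(n)[:, None] - np.arange(n)[None, :]) / 2.0) ** 2)  # Gaussian blur
x_true = (np.abs(np.arange(n) - 50) < 10).astype(float)
y = A @ x_true + 0.01 * rng.standard_normal(n)

U, s, Vt = np.linalg.svd(A)
lams = np.logspace(-6, 1, 50)
best = lams[np.argmin([gcv_score(l, U, s, y) for l in lams])]
x_hat = Vt.T @ (s / (s**2 + best**2) * (U.T @ y))   # regularized solution with the chosen lambda
print("GCV-selected lambda:", best)
```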

  9. Membrane processes

    NASA Astrophysics Data System (ADS)

    Staszak, Katarzyna

    2017-11-01

    Membrane processes have played an important role in industrial separations. These technologies can be found in all industrial areas such as food, beverages, metallurgy, pulp and paper, textiles, pharmaceuticals, automotive, biotechnology and the chemical industry, as well as in water treatment for domestic and industrial applications. Although these processes have been known since the twentieth century, there are still many studies that focus on testing new membrane materials and determining the conditions for optimal selectivity, i.e., the optimum transmembrane pressure (TMP) or permeate flux to minimize fouling. Moreover, researchers have proposed calculation methods to predict membrane process properties. In this article, laboratory-scale experiments on membrane separation techniques, as well as their validation by calculation methods, are presented. Because the membrane is the "heart" of the process, experimental and computational methods for its characterization are also described.

  10. Standardization of chemical analytical techniques for pyrolysis bio-oil: history, challenges, and current status of methods

    DOE PAGES

    Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.; ...

    2016-07-05

    Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods, and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validation of analytical techniques highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods have yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.

  11. Standardization of chemical analytical techniques for pyrolysis bio-oil: history, challenges, and current status of methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.

    Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods, and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validation of analytical techniques highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods have yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.

  12. Micro-Raman Technology to Interrogate Two-Phase Extraction on a Microfluidic Device.

    PubMed

    Nelson, Gilbert L; Asmussen, Susan E; Lines, Amanda M; Casella, Amanda J; Bottenus, Danny R; Clark, Sue B; Bryan, Samuel A

    2018-05-21

    Microfluidic devices provide ideal environments to study solvent extraction. When droplets form and generate plug flow down the microfluidic channel, the device acts as a microreactor in which the kinetics of chemical reactions and interfacial transfer can be examined. Here, we present a methodology that combines chemometric analysis with online micro-Raman spectroscopy to monitor biphasic extractions within a microfluidic device. Among the many benefits of microreactors is the ability to maintain small sample volumes, which is especially important when studying solvent extraction in harsh environments, such as in separations related to the nuclear fuel cycle. In solvent extraction, the efficiency of the process depends on complex formation and rates of transfer in biphasic systems. Thus, it is important to understand the kinetic parameters in an extraction system to maintain a high efficiency and effectiveness of the process. This monitoring provided concentration measurements in both organic and aqueous plugs as they were pumped through the microfluidic channel. The biphasic system studied comprised HNO3 as the aqueous phase and 30% (v/v) tributyl phosphate in n-dodecane as the organic phase, simulating the plutonium uranium reduction extraction (PUREX) process. Using pre-equilibrated solutions (post extraction), the validity of the technique and methodology is illustrated. Following this validation, solutions that were not equilibrated were examined and the kinetics of interfacial mass transfer within the biphasic system were established. Kinetic results of extraction were compared to kinetics already determined on a macro scale to prove the efficacy of the technique.
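
    The chemometric step can be illustrated with a partial least squares calibration relating spectra to concentration, as in the hedged Python sketch below; the Gaussian "nitrate band", concentrations, and noise level are synthetic assumptions, not measured PUREX-system data, and PLS is used here only as a representative chemometric model.

```python
# Representative chemometric calibration: PLS regression from spectra to concentration.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
shift = np.linspace(200, 1800, 400)                   # Raman shift axis (cm^-1), assumed
band = np.exp(-0.5 * ((shift - 1048) / 10) ** 2)      # illustrative nitrate band shape

conc = rng.uniform(0.1, 6.0, size=40)                 # training concentrations (mol/L, assumed)
spectra = conc[:, None] * band + 0.02 * rng.standard_normal((40, 400))

pls = PLSRegression(n_components=2).fit(spectra, conc)
test = 3.0 * band + 0.02 * rng.standard_normal(400)   # a "plug" spectrum of unknown concentration
print("predicted concentration:", float(pls.predict(test[None, :]).ravel()[0]))
```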

  13. Optimization of spectral bands for hyperspectral remote sensing of forest vegetation

    NASA Astrophysics Data System (ADS)

    Dmitriev, Egor V.; Kozoderov, Vladimir V.

    2013-10-01

    Optimization principles for selecting the most informative spectral channels in hyperspectral remote sensing data processing serve to enhance the efficiency of the high-performance computers employed. The problem of pattern recognition of remotely sensed land surface objects, with an emphasis on forests, is outlined from the point of view of optimizing the spectral channels used in processing hyperspectral images. The relevant computational procedures are tested using images obtained by a Russian-built hyperspectral camera installed on a gyro-stabilized platform for airborne flight campaigns. A Bayesian classifier is used for pattern recognition of forests of different tree species and age. The probabilistically optimal algorithm, constructed on the basis of the maximum likelihood principle, is described; it minimizes the probability of misclassification given by this classifier. The classification error is the main measure used to estimate the accuracy of the applied algorithm, assessed by the standard holdout cross-validation method. Details of the related techniques are presented. Results are shown for selecting the spectral channels of the camera while processing the images, taking into account radiometric distortions that diminish the classification accuracy. The spectral channels are selected for the subclasses extracted by the proposed validation techniques, and confusion matrices are constructed that characterize the age composition of the classified pine species as well as the broad age-class recognition for pine and birch species with fully illuminated crowns.
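
    One simple way to realize channel selection of this kind is a greedy forward search scored by a cross-validated Gaussian (Bayesian-style) classifier, sketched below in Python with scikit-learn; the synthetic spectra, class structure, and the choice of quadratic discriminant analysis are illustrative assumptions rather than the authors' algorithm.

```python
# Greedy forward selection of spectral bands scored by cross-validated classification accuracy.
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_bands, n_pix = 60, 300
y = rng.integers(0, 3, size=n_pix)                     # three forest classes (placeholder)
means = rng.standard_normal((3, n_bands))              # class-mean spectra (placeholder)
X = means[y] + 0.8 * rng.standard_normal((n_pix, n_bands))

selected, remaining = [], list(range(n_bands))
for _ in range(5):                                     # pick the 5 most informative bands
    scores = [(cross_val_score(QuadraticDiscriminantAnalysis(),
                               X[:, selected + [b]], y, cv=5).mean(), b)
              for b in remaining]
    best_score, best_band = max(scores)
    selected.append(best_band)
    remaining.remove(best_band)
    print(f"added band {best_band}, CV accuracy {best_score:.3f}")
```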

  14. Droplet-Based Segregation and Extraction of Concentrated Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buie, C R; Buckley, P; Hamilton, J

    2007-02-23

    Microfluidic analysis often requires sample concentration and separation techniques to isolate and detect analytes of interest. Complex or scarce samples may also require an orthogonal separation and detection method or off-chip analysis to confirm results. To perform these additional steps, the concentrated sample plug must be extracted from the primary microfluidic channel with minimal sample loss and dilution. We investigated two extraction techniques: injection of immiscible fluid droplets into the sample stream ("capping") and injection of the sample into an immiscible fluid stream ("extraction"). From our results we conclude that capping is the more effective partitioning technique. Furthermore, this functionality enables additional off-chip post-processing procedures such as DNA/RNA microarray analysis, real-time polymerase chain reaction (RT-PCR), and culture growth to validate chip performance.

  15. Video encryption using chaotic masks in joint transform correlator

    NASA Astrophysics Data System (ADS)

    Saini, Nirmala; Sinha, Aloka

    2015-03-01

    A real-time optical video encryption technique using a chaotic map has been reported. In the proposed technique, each frame of video is encrypted using two different chaotic random phase masks in the joint transform correlator architecture. The different chaotic random phase masks can be obtained either by using different iteration levels or by using different seed values of the chaotic map. The use of different chaotic random phase masks makes the decryption process very complex for an unauthorized person. Optical, as well as digital, methods can be used for video encryption but the decryption is possible only digitally. To further enhance the security of the system, the key parameters of the chaotic map are encoded using RSA (Rivest-Shamir-Adleman) public key encryption. Numerical simulations are carried out to validate the proposed technique.

  16. Damage Evaluation Based on a Wave Energy Flow Map Using Multiple PZT Sensors

    PubMed Central

    Liu, Yaolu; Hu, Ning; Xu, Hong; Yuan, Weifeng; Yan, Cheng; Li, Yuan; Goda, Riu; Alamusi; Qiu, Jinhao; Ning, Huiming; Wu, Liangke

    2014-01-01

    A new wave energy flow (WEF) map concept was proposed in this work. Based on it, an improved technique incorporating the laser scanning method and Betti's reciprocal theorem was developed to evaluate the shape and size of damage as well as to realize visualization of wave propagation. In this technique, a simple signal processing algorithm was proposed to construct the WEF map when waves propagate through an inspection region, and multiple lead zirconate titanate (PZT) sensors were employed to improve inspection reliability. Various damages in aluminum and carbon fiber reinforced plastic laminated plates were experimentally and numerically evaluated to validate this technique. The results show that it can effectively evaluate the shape and size of damage from wave field variations around the damage in the WEF map. PMID:24463430

  17. Multispectral Wavefronts Retrieval in Digital Holographic Three-Dimensional Imaging Spectrometry

    NASA Astrophysics Data System (ADS)

    Yoshimori, Kyu

    2010-04-01

    This paper deals with a recently developed passive interferometric technique for retrieving a set of spectral components of wavefronts propagating from a spatially incoherent, polychromatic object. The technique is based on measurement of a 5-D spatial coherence function using a suitably designed interferometer. By applying signal processing, including aperture synthesis and spectral decomposition, one may obtain a set of wavefronts in different spectral bands. Since each wavefront is equivalent to the complex Fresnel hologram of the polychromatic object at a particular spectral component, application of the conventional Fresnel transform yields a 3-D image for each spectral component. Thus, this technique of multispectral wavefront retrieval provides a new type of 3-D imaging spectrometry based on fully passive interferometry. Experimental results are also shown to demonstrate the validity of the method.

  18. Plasma Processing of Lunar Regolith Simulant for Diverse Applications

    NASA Technical Reports Server (NTRS)

    Schofield, Elizabeth C.; Sen, Subhayu; O'Dell, J. Scott

    2008-01-01

    Versatile manufacturing technologies for extracting resources from the moon are needed to support future space missions. Of particular interest is the production of gases and metals from lunar resources for life support, propulsion, and in-space fabrication. Deposits made from lunar regolith could yield highly emissive coatings and near-net-shaped parts for replacement or repair of critical components. Equally important is the development of high-fidelity lunar simulants for ground-based validation of potential lunar surface operations. Described herein is an innovative plasma processing technique for in situ production of gases, metals, coatings, and deposits from lunar regolith, and for synthesis of a high-fidelity lunar simulant from the NASA-issued lunar simulant JSC-1. Initial plasma reduction trials of the JSC-1 lunar simulant have indicated production of metallic iron and magnesium. Evolution of carbon monoxide has been detected subsequent to reduction of the simulant using the plasma process. Plasma processing of the simulant has also resulted in glassy phases resembling the volcanic glass and agglutinates found in lunar regolith. Complete and partial glassy-phase deposits have been obtained by varying the plasma process variables. Experimental techniques, product characterization, and process gas analysis will be discussed.

  19. The Scientific Status of Projective Techniques.

    PubMed

    Lilienfeld, S O; Wood, J M; Garb, H N

    2000-11-01

    Although projective techniques continue to be widely used in clinical and forensic settings, their scientific status remains highly controversial. In this monograph, we review the current state of the literature concerning the psychometric properties (norms, reliability, validity, incremental validity, treatment utility) of three major projective instruments: Rorschach Inkblot Test, Thematic Apperception Test (TAT), and human figure drawings. We conclude that there is empirical support for the validity of a small number of indexes derived from the Rorschach and TAT. However, the substantial majority of Rorschach and TAT indexes are not empirically supported. The validity evidence for human figure drawings is even more limited. With a few exceptions, projective indexes have not consistently demonstrated incremental validity above and beyond other psychometric data. In addition, we summarize the results of a new meta-analysis intended to examine the capacity of these three instruments to detect child sexual abuse. Although some projective instruments were better than chance at detecting child sexual abuse, there were virtually no replicated findings across independent investigative teams. This meta-analysis also provides the first clear evidence of substantial file drawer effects in the projectives literature, as the effect sizes from published studies markedly exceeded those from unpublished studies. We conclude with recommendations regarding the (a) construction of projective techniques with adequate validity, (b) forensic and clinical use of projective techniques, and (c) education and training of future psychologists regarding projective techniques. © 2000 Association for Psychological Science.

  20. Techniques for optimal crop selection in a controlled ecological life support system

    NASA Technical Reports Server (NTRS)

    Mccormack, Ann; Finn, Cory; Dunsky, Betsy

    1993-01-01

    A Controlled Ecological Life Support System (CELSS) utilizes a plant's natural ability to regenerate air and water while being grown as a food source in a closed life support system. Current plant research is directed toward obtaining quantitative empirical data on the regenerative ability of each species of plant and the system volume and power requirements. Two techniques were adapted to optimize crop species selection while at the same time minimizing the system volume and power requirements. Each allows the level of life support supplied by the plants to be selected, as well as other system parameters. The first technique uses decision analysis in the form of a spreadsheet. The second method, which is used as a comparison with and validation of the first, utilizes standard design optimization techniques. Simple models of plant processes are used in the development of these methods.

  1. Techniques for optimal crop selection in a controlled ecological life support system

    NASA Technical Reports Server (NTRS)

    Mccormack, Ann; Finn, Cory; Dunsky, Betsy

    1992-01-01

    A Controlled Ecological Life Support System (CELSS) utilizes a plant's natural ability to regenerate air and water while being grown as a food source in a closed life support system. Current plant research is directed toward obtaining quantitative empirical data on the regenerative ability of each species of plant and the system volume and power requirements. Two techniques were adapted to optimize crop species selection while at the same time minimizing the system volume and power requirements. Each allows the level of life support supplied by the plants to be selected, as well as other system parameters. The first technique uses decision analysis in the form of a spreadsheet. The second method, which is used as a comparison with and validation of the first, utilizes standard design optimization techniques. Simple models of plant processes are used in the development of these methods.

  2. Nondestructive Evaluation (NDE) for Inspection of Composite Sandwich Structures

    NASA Technical Reports Server (NTRS)

    Zalameda, Joseph N.; Parker, F. Raymond

    2014-01-01

    Composite honeycomb structures are widely used in aerospace applications due to their low weight and high strength advantages. Developing nondestructive evaluation (NDE) inspection methods is essential for their safe performance. Flash thermography is a commonly used technique for composite honeycomb structure inspections due to its large area and rapid inspection capability. Flash thermography is shown to be sensitive for detection of face sheet impact damage and face sheet to core disbond. Data processing techniques, using principal component analysis to improve the defect contrast, are discussed. Limitations to the thermal detection of the core are investigated. In addition to flash thermography, X-ray computed tomography is used. The aluminum honeycomb core provides excellent X-ray contrast compared to the composite face sheet. The X-ray CT technique was used to detect impact damage, core crushing, and skin to core disbonds. Additionally, the X-ray CT technique is used to validate the thermography results.
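
    The principal component processing step mentioned above can be sketched as follows: the thermogram sequence is reshaped to a pixels-by-frames matrix and the leading components are viewed as contrast-enhanced images. The synthetic cooling sequence and defect geometry in this Python sketch are placeholders for real flash-thermography data.

```python
# Principal-component contrast enhancement of a (synthetic) flash thermography sequence.
import numpy as np
from sklearn.decomposition import PCA

frames, h, w = 60, 64, 64
rng = np.random.default_rng(0)
t = np.arange(1, frames + 1)[:, None, None]
cooling = 1.0 / np.sqrt(t)                              # idealized 1/sqrt(t) surface cooling
defect = np.zeros((h, w))
defect[20:30, 20:30] = 0.2                              # slower-cooling disbond region (assumed)
seq = cooling * (1.0 + defect) + 0.01 * rng.standard_normal((frames, h, w))

X = seq.reshape(frames, -1).T                           # rows = pixels, columns = frames
X = (X - X.mean(axis=0)) / X.std(axis=0)                # standardize each frame
pcs = PCA(n_components=3).fit_transform(X)              # pixel scores on leading components
contrast_images = pcs.T.reshape(3, h, w)                # view each component as an image
print(contrast_images.shape)
```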

  3. Alignment of an acoustic manipulation device with cepstral analysis of electronic impedance data.

    PubMed

    Hughes, D A; Qiu, Y; Démoré, C; Weijer, C J; Cochran, S

    2015-02-01

    Acoustic particle manipulation is an emerging technology that uses ultrasonic standing waves to position objects with pressure gradients and acoustic radiation forces. To produce strong standing waves, the transducer and the reflector must be aligned properly such that they are parallel to each other. This can be a difficult process because the ultrasound waves cannot be visualised directly, and as higher frequencies are introduced the alignment requires higher accuracy. In this paper, we present a method for aligning acoustic resonators with cepstral analysis. This is a simple signal processing technique that requires only the electrical impedance measurement data of the resonator, which is usually recorded during the fabrication process of the device. We first introduce the mathematical basis of cepstral analysis and then demonstrate and validate it using a computer simulation of an acoustic resonator. Finally, the technique is demonstrated experimentally to create many parallel linear traps for 10 μm fluorescent beads inside an acoustic resonator. Copyright © 2014 Elsevier B.V. All rights reserved.
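
    The core idea of the cepstral step can be sketched in a few lines of Python: periodic ripple in an impedance magnitude spectrum (caused by cavity resonances) becomes a single peak in the cepstrum, whose quefrency reflects the ripple spacing. The synthetic rippled spectrum below is an assumption standing in for measured impedance data, and the sketch shows only the general principle, not the authors' full alignment procedure.

```python
# Cepstrum of a rippled impedance magnitude spectrum: the ripple period becomes a quefrency peak.
import numpy as np

def cepstrum_of_spectrum(z_mag):
    """Cepstrum of data that are already a spectrum (impedance vs frequency):
    inverse FFT of the log magnitude, giving a quefrency (delay-like) axis."""
    return np.fft.ifft(np.log(np.abs(z_mag) + 1e-12)).real

n = 2048
f = np.linspace(0, 1, n, endpoint=False)                 # normalized frequency axis
ripple_period = 1 / 64                                   # resonance spacing (assumed)
impedance_mag = 1.0 + 0.3 * np.cos(2 * np.pi * f / ripple_period)  # rippled |Z(f)| (synthetic)

ceps = cepstrum_of_spectrum(impedance_mag)
peak = np.argmax(np.abs(ceps[1:n // 2])) + 1             # ignore the zero-quefrency bin
print("dominant quefrency bin:", peak)                   # ~64, the number of ripple cycles
```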

  4. Simultaneous spectrophotometric determination of indacaterol and glycopyrronium in a newly approved pharmaceutical formulation using different signal processing techniques of ratio spectra

    NASA Astrophysics Data System (ADS)

    Abdel Ghany, Maha F.; Hussein, Lobna A.; Magdy, Nancy; Yamani, Hend Z.

    2016-03-01

    Three spectrophotometric methods have been developed and validated for the determination of indacaterol (IND) and glycopyrronium (GLY) in their binary mixtures and in a novel pharmaceutical dosage form. The proposed methods are considered to be the first to determine the investigated drugs simultaneously. The developed methods are based on different signal processing techniques applied to ratio spectra, namely Numerical Differentiation (ND), Savitzky-Golay (SG) and Fourier Transform (FT). The developed methods showed linearity over the concentration ranges of 1-30 and 10-35 μg/mL for IND and GLY, respectively. The accuracy, calculated as percentage recoveries, was in the range of 99.00%-100.49% with low RSD% values (< 1.5%), demonstrating the excellent accuracy of the proposed methods. The developed methods were proved to be specific, sensitive and precise for quality control of the investigated drugs in their pharmaceutical dosage form without the need for any separation process.
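
    The ratio-spectra idea can be sketched as follows: dividing the mixture spectrum by one pure-component spectrum makes that component's contribution constant, and a derivative (Savitzky-Golay or plain numerical differentiation) then removes it. The Gaussian bands, concentrations, and divisor threshold in this Python sketch are synthetic assumptions, not measured IND/GLY spectra.

```python
# Ratio-spectrum derivative: the constant contribution of the divisor component drops out.
import numpy as np
from scipy.signal import savgol_filter

wl = np.linspace(200, 320, 600)                          # wavelength axis (nm), assumed
band = lambda c, w: np.exp(-0.5 * ((wl - c) / w) ** 2)
spec_A, spec_B = band(245, 8), band(260, 10)             # pure-component spectra (normalised, synthetic)

mixture = 12.0 * spec_A + 20.0 * spec_B                  # binary mixture (arbitrary units)
mask = spec_B > 0.05                                     # use only wavelengths where the divisor is significant
ratio = mixture[mask] / spec_B[mask]                     # component B now contributes a constant (20)

deriv_sg = savgol_filter(ratio, window_length=21, polyorder=3, deriv=1)  # Savitzky-Golay first derivative
deriv_nd = np.gradient(ratio)                            # plain numerical differentiation
print("points used:", mask.sum(), "| max |SG derivative|:", np.abs(deriv_sg).max())
```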

  5. A strategy for design and fabrication of low cost microchannel for future reproductivity of bio/chemical lab-on-chip application

    NASA Astrophysics Data System (ADS)

    Humayun, Q.; Hashim, U.; Ruzaidi, C. M.; Noriman, N. Z.

    2017-03-01

    The fabrication and characterization of a sensitive and selective fluid delivery system for nano lab-on-a-chip applications remains a challenging task. This paper is one of the initial attempts to address this challenge using a simple, cost-effective and reproducible technique for patterning microchannel structures on SU-8 resist. The objective of the research is to design, fabricate and characterize a polydimethylsiloxane (PDMS) microchannel. The proposed device mask was designed initially using AutoCAD software, and the design was then transferred to a transparency sheet and to a commercial chrome mask for a better photomasking process. A standard photolithography process coupled with wet chemical etching was used for the fabrication of the proposed microchannel. This is a low-cost fabrication technique for forming microchannel structures in resist. The fabrication process started with formation of the microchannel mold; the structure was then transferred to the PDMS substrate, the cured PDMS microchannel was released from the mold, and the cured part was bonded to a glass substrate by a plasma oxidation bonding process. The surface morphology was characterized with a high-power microscope (HPM) and the structure was characterized with a Hawk 3D surface nanoprofiler. The next part of the research will focus on device testing and validation using real biological samples with a simple manual injection technique.

  6. X-ray digital industrial radiography (DIR) for local liquid velocity (VLL) measurement in trickle bed reactors (TBRs): Validation of the technique

    NASA Astrophysics Data System (ADS)

    Mohd Salleh, Khairul Anuar; Rahman, Mohd Fitri Abdul; Lee, Hyoung Koo; Al Dahhan, Muthanna H.

    2014-06-01

    Local liquid velocity measurements in Trickle Bed Reactors (TBRs) are one of the essential components in its hydrodynamic studies. These measurements are used to effectively determine a reactor's operating condition. This study was conducted to validate a newly developed technique that combines Digital Industrial Radiography (DIR) with Particle Tracking Velocimetry (PTV) to measure the Local Liquid Velocity (VLL) inside TBRs. Three millimeter-sized Expanded Polystyrene (EPS) beads were used as packing material. Three validation procedures were designed to test the newly developed technique. All procedures and statistical approaches provided strong evidence that the technique can be used to measure the VLL within TBRs.

  7. X-ray digital industrial radiography (DIR) for local liquid velocity (V(LL)) measurement in trickle bed reactors (TBRs): validation of the technique.

    PubMed

    Mohd Salleh, Khairul Anuar; Rahman, Mohd Fitri Abdul; Lee, Hyoung Koo; Al Dahhan, Muthanna H

    2014-06-01

    Local liquid velocity measurements in Trickle Bed Reactors (TBRs) are an essential component of their hydrodynamic studies. These measurements are used to effectively determine a reactor's operating condition. This study was conducted to validate a newly developed technique that combines Digital Industrial Radiography (DIR) with Particle Tracking Velocimetry (PTV) to measure the Local Liquid Velocity (V(LL)) inside TBRs. Three-millimeter-sized Expanded Polystyrene (EPS) beads were used as packing material. Three validation procedures were designed to test the newly developed technique. All procedures and statistical approaches provided strong evidence that the technique can be used to measure the V(LL) within TBRs.

  8. Improvement of Storm Forecasts Using Gridded Bayesian Linear Regression for Northeast United States

    NASA Astrophysics Data System (ADS)

    Yang, J.; Astitha, M.; Schwartz, C. S.

    2017-12-01

    Bayesian linear regression (BLR) is a post-processing technique in which regression coefficients are derived and used to correct raw forecasts based on pairs of observation-model values. This study presents the development and application of gridded Bayesian linear regression (GBLR) as a new post-processing technique to improve numerical weather prediction (NWP) of rain and wind storm forecasts over the northeastern United States. Ten controlled variables produced from ten ensemble members of the National Center for Atmospheric Research (NCAR) real-time prediction system are used for the GBLR model. In the GBLR framework, leave-one-storm-out cross-validation is utilized to study the performance of the post-processing technique on a database of 92 storms. To estimate the regression coefficients of the GBLR, optimization procedures that minimize the systematic and random error of predicted atmospheric variables (wind speed, precipitation, etc.) are implemented for the modeled-observed pairs of training storms. The regression coefficients calculated for meteorological stations of the National Weather Service are then interpolated back to the model domain. An analysis of forecast improvements based on error reductions during the storms will demonstrate the value of the GBLR approach. This presentation will also illustrate how the variances are optimized for the training partition in GBLR and discuss the verification strategy for grid points where no observations are available. The new post-processing technique is successful in improving wind speed and precipitation storm forecasts using past event-based data and has the potential to be implemented in real time.
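
    A minimal sketch of the post-processing idea described above, assuming synthetic forecast-observation pairs and storm labels: regression coefficients are fitted on training storms and applied to the held-out storm under leave-one-storm-out cross-validation. The sketch uses scikit-learn's BayesianRidge as a stand-in for the GBLR estimation and is not the NCAR/GBLR implementation.

        # Hedged sketch of Bayesian linear regression post-processing with
        # leave-one-storm-out cross-validation; the forecast/observation arrays and
        # storm labels are synthetic stand-ins for the NCAR ensemble data.
        import numpy as np
        from sklearn.linear_model import BayesianRidge
        from sklearn.model_selection import LeaveOneGroupOut

        rng = np.random.default_rng(0)
        n = 300
        raw_wind = rng.uniform(2.0, 25.0, n)                       # raw model wind speed (m/s)
        obs_wind = 0.85 * raw_wind + 1.5 + rng.normal(0, 1.0, n)   # "observed" wind with bias
        storm_id = rng.integers(0, 20, n)                          # which storm each pair belongs to

        X = raw_wind.reshape(-1, 1)
        errors = []
        for train_idx, test_idx in LeaveOneGroupOut().split(X, obs_wind, groups=storm_id):
            model = BayesianRidge()
            model.fit(X[train_idx], obs_wind[train_idx])      # coefficients from training storms
            corrected = model.predict(X[test_idx])            # corrected forecast for held-out storm
            errors.append(np.sqrt(np.mean((corrected - obs_wind[test_idx]) ** 2)))

        print(f"mean leave-one-storm-out RMSE: {np.mean(errors):.2f} m/s")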

  9. Novel optical scanning cryptography using Fresnel telescope imaging.

    PubMed

    Yan, Aimin; Sun, Jianfeng; Hu, Zhijuan; Zhang, Jingtao; Liu, Liren

    2015-07-13

    We propose a new method, called modified optical scanning cryptography, that uses a Fresnel telescope imaging technique for encryption and decryption of remote objects. An image or object can be optically encrypted on the fly by the Fresnel telescope scanning system together with an encryption key. For image decryption, the encrypted signals are received and processed with an optical coherent heterodyne detection system. The proposed method derives strong performance from secure Fresnel telescope scanning with orthogonally polarized beams and efficient all-optical information processing. The validity of the proposed method is demonstrated by numerical simulations and experimental results.

  10. Highly reliable oxide VCSELs for datacom applications

    NASA Astrophysics Data System (ADS)

    Aeby, Ian; Collins, Doug; Gibson, Brian; Helms, Christopher J.; Hou, Hong Q.; Lou, Wenlin; Bossert, David J.; Wang, Charlie X.

    2003-06-01

    In this paper we describe the processes and procedures that have been developed to ensure high reliability for Emcore's 850 nm oxide-confined GaAs VCSELs. Evidence from ongoing accelerated life testing and other reliability studies confirming that this process yields reliable products will be discussed. We will present the data and analysis techniques used to determine the activation energy and acceleration factors for the dominant wear-out failure mechanisms of our devices, as well as our estimated MTTF of greater than 2 million use hours. We conclude with a summary of internal verification and field return rate validation data.

  11. Traceability validation of a high speed short-pulse testing method used in LED production

    NASA Astrophysics Data System (ADS)

    Revtova, Elena; Vuelban, Edgar Moreno; Zhao, Dongsheng; Brenkman, Jacques; Ulden, Henk

    2017-12-01

    Industrial LED (light-emitting diode) production processes include testing of LED light output performance. Most of these processes are monitored and controlled by measuring LEDs optically, electrically and thermally with high-speed short-pulse measurement methods. However, these methods are not standardized, and so much of the information is proprietary that it is impossible for third parties, such as NMIs, to trace and validate them. These techniques are known to have traceability issues and metrological inadequacies. As a consequence, the claimed performance specifications of LEDs are often overstated, which results in manufacturers experiencing customer dissatisfaction and a large percentage of failures in the daily use of LEDs. In this research a traceable setup is developed to validate one of the high-speed testing techniques, investigate its inadequacies and resolve the traceability issues. A well-characterised short square pulse of 25 ms is applied to chip-on-board (CoB) LED modules to investigate the light output and colour content. We conclude that the short-pulse method is very efficient provided that a well-defined electrical current pulse is applied and the stabilization time of the device is accurately determined a priori. No colour shift is observed. The largest contributors to the measurement uncertainty are a poorly defined current pulse and an inaccurate calibration factor.

  12. Molecular simulation and experimental validation of resorcinol adsorption on Ordered Mesoporous Carbon (OMC).

    PubMed

    Ahmad, Zaki Uddin; Chao, Bing; Konggidinata, Mas Iwan; Lian, Qiyu; Zappi, Mark E; Gang, Daniel Dianchen

    2018-04-27

    Numerous research works have been devoted to adsorption using experimental approaches. These approaches rely on trial-and-error processes and are extremely time consuming. Molecular simulation is a newer tool that can be used to design and predict the performance of an adsorbent. This research proposes a simulation technique that can greatly reduce the time needed to design an adsorbent. In this study, a new Rhombic ordered mesoporous carbon (OMC) model is proposed and constructed with various pore sizes and oxygen contents using the Materials Visualizer Module to optimize the structure of OMC for resorcinol adsorption. The specific surface area, pore volume, small-angle X-ray diffraction pattern, and resorcinol adsorption capacity were calculated with the Forcite and Sorption modules in the Materials Studio package. The simulation results were validated experimentally by synthesizing OMC with different pore sizes and oxygen contents via a hard-template method employing an SBA-15 silica scaffold. Boric acid was used as the pore-expanding reagent to synthesize OMC with different pore sizes (from 4.6 to 11.3 nm) and varying oxygen contents (from 11.9% to 17.8%). Based on the simulation and experimental validation, the optimal pore size for maximum adsorption of resorcinol was found to be 6 nm. Copyright © 2018 Elsevier B.V. All rights reserved.

  13. Concurrent validation of a neurocognitive assessment protocol for clients with mental illness in job matching as shop sales in supported employment.

    PubMed

    Ng, S S W; Lak, D C C; Lee, S C K; Ng, P P K

    2015-03-01

    Occupational therapists play a major role in the assessment and referral of clients with severe mental illness for supported employment. Nonetheless, there is scarce literature on the content and predictive validity of this process. In addition, the criteria for successful job matching have not been analysed, and job supervisors have relied on experience rather than objective standards in recruitment. This study aimed to explore the profile of clients working successfully in 'shop sales' in a supportive environment using a neurocognitive assessment protocol, and to validate the protocol against the 'internal standards' of the job supervisors. This was a concurrent validation study of criterion-related scales for a single job type. The subjective ratings from the supervisors were concurrently validated against the results of neurocognitive assessment of intellectual function and work-related cognitive behaviour. A regression model was established for clients who succeeded or failed in employment using the supervisors' ratings and a cutoff value of 10.5 for the Performance Fitness Rating Scale (R(2) = 0.918, F[41] = 3.794, p = 0.003). A Classification And Regression Tree was also plotted to identify the profile of cases, with an overall accuracy of 0.861 (relative error, 0.26). The use of both inferential statistics and data mining techniques enables the decision tree of neurocognitive assessments to be more readily applied by therapists in vocational rehabilitation, and thus directly improves the efficiency and efficacy of the process.

  14. Fabrication of thermal-resistant gratings for high-temperature measurements using geometric phase analysis.

    PubMed

    Zhang, Q; Liu, Z; Xie, H; Ma, K; Wu, L

    2016-12-01

    Grating fabrication techniques are crucial to the success of grating-based deformation measurement methods because the quality of the grating directly affects the measurement results. Deformation measurements at high temperatures involve heating, which may oxidize the grating, and the contrast of the grating lines may change during the heating process. Thus, the thermal resistance of the grating is a major concern before taking measurements. This study proposes a method that combines a laser-engraving technique with particle spraying and sintering processes for fabricating thermal-resistant gratings. The grating fabrication technique is introduced and discussed in detail. A numerical simulation with geometric phase analysis (GPA) is performed for a homogeneous deformation case, and a scheme for selecting the grating pitch is then suggested. The validity of the proposed technique is verified by fabricating a thermal-resistant grating on a ZrO2 specimen and measuring its thermal strain at high temperatures (up to 1300 °C). Images of the grating before and after deformation are used to obtain the thermal-strain field by GPA and to compare the results with well-established reference data. The experimental results indicate that the proposed technique is feasible and offers good prospects for further applications.

  15. Wab-InSAR: a new wavelet based InSAR time series technique applied to volcanic and tectonic areas

    NASA Astrophysics Data System (ADS)

    Walter, T. R.; Shirzaei, M.; Nankali, H.; Roustaei, M.

    2009-12-01

    Modern geodetic techniques such as InSAR and GPS provide valuable observations of the deformation field. Because of the variety of environmental interferences (e.g., atmosphere, topography distortion) and incompleteness of the models (assumption of the linear model for deformation), those observations are usually tainted by various systematic and random errors. Therefore we develop and test new methods to identify and filter unwanted periodic or episodic artifacts to obtain accurate and precise deformation measurements. Here we present and implement a new wavelet based InSAR (Wab-InSAR) time series approach. Because wavelets are excellent tools for identifying hidden patterns and capturing transient signals, we utilize wavelet functions for reducing the effect of atmospheric delay and digital elevation model inaccuracies. Wab-InSAR is a model free technique, reducing digital elevation model errors in individual interferograms using a 2D spatial Legendre polynomial wavelet filter. Atmospheric delays are reduced using a 3D spatio-temporal wavelet transform algorithm and a novel technique for pixel selection. We apply Wab-InSAR to several targets, including volcano deformation processes at Hawaii Island, and mountain building processes in Iran. Both targets are chosen to investigate large and small amplitude signals, variable and complex topography and atmospheric effects. In this presentation we explain different steps of the technique, validate the results by comparison to other high resolution processing methods (GPS, PS-InSAR, SBAS) and discuss the geophysical results.

  16. Real-time understanding of lignocellulosic bioethanol fermentation by Raman spectroscopy

    PubMed Central

    2013-01-01

    Background A substantial barrier to commercialization of lignocellulosic ethanol production is a lack of process specific sensors and associated control strategies that are essential for economic viability. Current sensors and analytical techniques require lengthy offline analysis or are easily fouled in situ. Raman spectroscopy has the potential to continuously monitor fermentation reactants and products, maximizing efficiency and allowing for improved process control. Results In this paper we show that glucose and ethanol in a lignocellulosic fermentation can be accurately monitored by a 785 nm Raman spectroscopy instrument and novel immersion probe, even in the presence of an elevated background thought to be caused by lignin-derived compounds. Chemometric techniques were used to reduce the background before generating calibration models for glucose and ethanol concentration. The models show very good correlation between the real-time Raman spectra and the offline HPLC validation. Conclusions Our results show that the changing ethanol and glucose concentrations during lignocellulosic fermentation processes can be monitored in real-time, allowing for optimization and control of large scale bioconversion processes. PMID:23425590
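
    The background-reduction step mentioned above could, in spirit, resemble the following sketch: an iterative low-order polynomial baseline is fitted to a synthetic Raman spectrum and subtracted before any calibration model is built. The Raman shift axis, band positions and polynomial degree are assumptions for illustration, not the chemometric procedure used in the study.

        # Hedged sketch of the kind of background reduction used before calibration:
        # fit a low-order polynomial baseline to a (synthetic) Raman spectrum and subtract it.
        import numpy as np

        shift = np.linspace(400.0, 1800.0, 1401)                        # Raman shift axis (cm^-1), hypothetical
        baseline_true = 0.002 * shift + 1e-6 * (shift - 400) ** 2       # broad fluorescence-like background
        ethanol_band = 0.8 * np.exp(-0.5 * ((shift - 880) / 8) ** 2)    # placeholder ethanol band
        glucose_band = 0.5 * np.exp(-0.5 * ((shift - 1125) / 10) ** 2)  # placeholder glucose band
        spectrum = baseline_true + ethanol_band + glucose_band \
                   + np.random.default_rng(1).normal(0, 0.01, shift.size)

        # Iteratively fit a polynomial, clipping points that sit above the fit so that
        # peaks do not drag the baseline upward (a simple polynomial-baseline variant).
        work = spectrum.copy()
        for _ in range(20):
            coeffs = np.polyfit(shift, work, deg=3)
            fit = np.polyval(coeffs, shift)
            work = np.minimum(work, fit)

        corrected = spectrum - np.polyval(coeffs, shift)
        print(f"residual baseline after correction: {corrected.min():.3f} .. {corrected.max():.3f}")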

  17. Assembly processes comparison for a miniaturized laser used for the Exomars European Space Agency mission

    NASA Astrophysics Data System (ADS)

    Ribes-Pleguezuelo, Pol; Inza, Andoni Moral; Basset, Marta Gilaberte; Rodríguez, Pablo; Rodríguez, Gemma; Laudisio, Marco; Galan, Miguel; Hornaff, Marcel; Beckert, Erik; Eberhardt, Ramona; Tünnermann, Andreas

    2016-11-01

    A miniaturized diode-pumped solid-state laser (DPSSL), designed as part of the Raman laser spectrometer (RLS) instrument for the European Space Agency (ESA) ExoMars 2020 mission, was assembled and tested against the mission's purpose and requirements. Two different assembly processes were tried: one based on adhesives, following traditional laser manufacturing processes, and another based on a low-stress, organic-free soldering technique called solderjet bumping. The manufactured devices were subjected to mechanical, thermal-cycling, radiation, and optical functional tests to validate the processes. The comparative analysis showed that the soldered device offered improved reliability of optical performance relative to the adhesive-assembled device.

  18. Monitoring Building Deformation with InSAR: Experiments and Validation

    PubMed Central

    Yang, Kui; Yan, Li; Huang, Guoman; Chen, Chu; Wu, Zhengpeng

    2016-01-01

    Synthetic Aperture Radar Interferometry (InSAR) techniques are increasingly applied to monitoring land subsidence. The advantages of InSAR include high accuracy and the ability to cover large areas; nevertheless, research validating the use of InSAR for building deformation is limited. In this paper, we test the monitoring capability of InSAR in experiments on two landmark buildings, the Bohai Building and the China Theater, located in Tianjin, China. They were selected as real examples for comparing InSAR and leveling approaches to building deformation. Ten TerraSAR-X images spanning half a year were used in Permanent Scatterer InSAR processing. The extracted InSAR results were processed considering the diversity in both direction and spatial distribution, and were compared with true leveling values in both Ordinary Least Squares (OLS) regression and measurement-of-error analyses. The detailed experimental results for the Bohai Building and the China Theater showed a high correlation between the InSAR results and the leveling values. At the same time, the two Root Mean Square Error (RMSE) indexes had values of approximately 1 mm. These analyses show that millimeter-level accuracy can be achieved with the InSAR technique when measuring building deformation. We discuss the differences in accuracy between the OLS regression and measurement-of-error analyses, and compare them with the accuracy index of leveling in order to propose InSAR accuracy levels appropriate for monitoring building deformation. After assessing the advantages and limitations of InSAR techniques in monitoring buildings, further applications are evaluated. PMID:27999403
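
    The OLS-regression and RMSE comparison described above can be illustrated with the short sketch below; the InSAR and leveling values are synthetic placeholders rather than the TerraSAR-X results.

        # Hedged sketch of the OLS-regression and RMSE comparison between InSAR-derived
        # and leveling-derived deformation; the values below are synthetic placeholders.
        import numpy as np

        rng = np.random.default_rng(2)
        leveling = rng.uniform(-15.0, 5.0, 40)                     # "true" deformation at benchmarks (mm)
        insar = leveling + rng.normal(0.0, 1.0, leveling.size)     # InSAR estimates with ~1 mm noise

        # Ordinary least squares fit: insar = a * leveling + b
        A = np.column_stack([leveling, np.ones_like(leveling)])
        (a, b), *_ = np.linalg.lstsq(A, insar, rcond=None)

        rmse = np.sqrt(np.mean((insar - leveling) ** 2))
        r = np.corrcoef(insar, leveling)[0, 1]
        print(f"slope={a:.3f}, intercept={b:.3f} mm, r={r:.3f}, RMSE={rmse:.2f} mm")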

  19. Estimates of air emissions from asphalt storage tanks and truck loading

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trumbore, D.C.

    1999-12-31

    Title V of the 1990 Clean Air Act requires the accurate estimation of emissions from all US manufacturing processes, and places the burden of proof for that estimate on the process owner. This paper is published as a tool to assist in the estimation of air emission from hot asphalt storage tanks and asphalt truck loading operations. Data are presented on asphalt vapor pressure, vapor molecular weight, and the emission split between volatile organic compounds and particulate emissions that can be used with AP-42 calculation techniques to estimate air emissions from asphalt storage tanks and truck loading operations. Since current AP-42 techniques are not valid in asphalt tanks with active fume removal, a different technique for estimation of air emissions in those tanks, based on direct measurement of vapor space combustible gas content, is proposed. Likewise, since AP-42 does not address carbon monoxide or hydrogen sulfide emissions that are known to be present in asphalt operations, this paper proposes techniques for estimation of those emissions. Finally, data are presented on the effectiveness of fiber bed filters in reducing air emissions in asphalt operations.

  20. RRW: repeated random walks on genome-scale protein networks for local cluster discovery

    PubMed Central

    Macropol, Kathy; Can, Tolga; Singh, Ambuj K

    2009-01-01

    Background We propose an efficient and biologically sensitive algorithm based on repeated random walks (RRW) for discovering functional modules, e.g., complexes and pathways, within large-scale protein networks. Compared to existing cluster identification techniques, RRW implicitly makes use of network topology, edge weights, and long range interactions between proteins. Results We apply the proposed technique on a functional network of yeast genes and accurately identify statistically significant clusters of proteins. We validate the biological significance of the results using known complexes in the MIPS complex catalogue database and well-characterized biological processes. We find that 90% of the created clusters have the majority of their catalogued proteins belonging to the same MIPS complex, and about 80% have the majority of their proteins involved in the same biological process. We compare our method to various other clustering techniques, such as the Markov Clustering Algorithm (MCL), and find a significant improvement in the RRW clusters' precision and accuracy values. Conclusion RRW, which is a technique that exploits the topology of the network, is more precise and robust in finding local clusters. In addition, it has the added flexibility of being able to find multi-functional proteins by allowing overlapping clusters. PMID:19740439
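
    As a toy illustration of the underlying primitive, the sketch below runs a single random walk with restart on a small weighted network; the adjacency matrix and restart probability are made up, and RRW itself repeats such walks from many seeds and post-processes the resulting vectors into (possibly overlapping) clusters.

        # Toy sketch of a random walk with restart on a small weighted protein network.
        # The adjacency matrix is illustrative only; RRW repeats such walks from many
        # seeds and turns the stationary vectors into (possibly overlapping) clusters.
        import numpy as np

        W = np.array([                      # symmetric edge weights between 6 hypothetical proteins
            [0, 1, 1, 0, 0, 0],
            [1, 0, 1, 0, 0, 0],
            [1, 1, 0, 1, 0, 0],
            [0, 0, 1, 0, 1, 1],
            [0, 0, 0, 1, 0, 1],
            [0, 0, 0, 1, 1, 0],
        ], dtype=float)

        P = W / W.sum(axis=0, keepdims=True)   # column-stochastic transition matrix

        restart = 0.3
        seed = np.zeros(6)
        seed[0] = 1.0                          # walk restarts at protein 0
        p = seed.copy()
        for _ in range(100):                   # iterate p = (1-c) P p + c seed until convergence
            p_next = (1 - restart) * P @ p + restart * seed
            if np.abs(p_next - p).sum() < 1e-9:
                break
            p = p_next

        print("affinity to the seed:", np.round(p, 3))   # proteins 1 and 2 score highest -> same cluster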

  1. Watermarking and copyright labeling of printed images

    NASA Astrophysics Data System (ADS)

    Hel-Or, Hagit Z.

    2001-07-01

    Digital watermarking is a labeling technique for digital images that embeds a code into the digital data so that the data are marked. Watermarking techniques previously developed deal with on-line digital data and have been designed to withstand digital attacks such as image processing, image compression and geometric transformations. However, one must also consider the readily available attack of printing and scanning; the available watermarking techniques are not reliable under printing and scanning. In fact, one must consider the availability of watermarks for printed images as well as for digital images. An important issue is to intercept and prevent forgery in printed material such as currency notes and bank checks, and to track and validate sensitive and secret printed material. Watermarking in such printed material can be used not only for verification of ownership but also as an indicator of the date and type of transaction or the date and source of the printed data. In this work we propose a method of embedding watermarks in printed images by inherently taking advantage of the printing process. The method is visually unobtrusive in the printed image, and the watermark is easily extracted and is robust to reconstruction errors. The decoding algorithm is automatic given the watermarked image.

  2. Creation of a novel simulator for minimally invasive neurosurgery: fusion of 3D printing and special effects.

    PubMed

    Weinstock, Peter; Rehder, Roberta; Prabhu, Sanjay P; Forbes, Peter W; Roussin, Christopher J; Cohen, Alan R

    2017-07-01

    OBJECTIVE Recent advances in optics and miniaturization have enabled the development of a growing number of minimally invasive procedures, yet innovative training methods for the use of these techniques remain lacking. Conventional teaching models, including cadavers and physical trainers as well as virtual reality platforms, are often expensive and ineffective. Newly developed 3D printing technologies can recreate patient-specific anatomy, but the stiffness of the materials limits fidelity to real-life surgical situations. Hollywood special effects techniques can create ultrarealistic features, including lifelike tactile properties, to enhance accuracy and effectiveness of the surgical models. The authors created a highly realistic model of a pediatric patient with hydrocephalus via a unique combination of 3D printing and special effects techniques and validated the use of this model in training neurosurgery fellows and residents to perform endoscopic third ventriculostomy (ETV), an effective minimally invasive method increasingly used in treating hydrocephalus. METHODS A full-scale reproduction of the head of a 14-year-old adolescent patient with hydrocephalus, including external physical details and internal neuroanatomy, was developed via a unique collaboration of neurosurgeons, simulation engineers, and a group of special effects experts. The model contains "plug-and-play" replaceable components for repetitive practice. The appearance of the training model (face validity) and the reproducibility of the ETV training procedure (content validity) were assessed by neurosurgery fellows and residents of different experience levels based on a 14-item Likert-like questionnaire. The usefulness of the training model for evaluating the performance of the trainees at different levels of experience (construct validity) was measured by blinded observers using the Objective Structured Assessment of Technical Skills (OSATS) scale for the performance of ETV. RESULTS A combination of 3D printing technology and casting processes led to the creation of realistic surgical models that include high-fidelity reproductions of the anatomical features of hydrocephalus and allow for the performance of ETV for training purposes. The models reproduced the pulsations of the basilar artery, ventricles, and cerebrospinal fluid (CSF), thus simulating the experience of performing ETV on an actual patient. The results of the 14-item questionnaire showed limited variability among participants' scores, and the neurosurgery fellows and residents gave the models consistently high ratings for face and content validity. The mean score for the content validity questions (4.88) was higher than the mean score for face validity (4.69) (p = 0.03). On construct validity scores, the blinded observers rated performance of fellows significantly higher than that of residents, indicating that the model provided a means to distinguish between novice and expert surgical skills. CONCLUSIONS A plug-and-play lifelike ETV training model was developed through a combination of 3D printing and special effects techniques, providing both anatomical and haptic accuracy. Such simulators offer opportunities to accelerate the development of expertise with respect to new and novel procedures as well as iterate new surgical approaches and innovations, thus allowing novice neurosurgeons to gain valuable experience in surgical techniques without exposing patients to risk of harm.

  3. High-throughput prediction of eucalypt lignin syringyl/guaiacyl content using multivariate analysis: a comparison between mid-infrared, near-infrared, and Raman spectroscopies for model development

    PubMed Central

    2014-01-01

    Background In order to rapidly and efficiently screen potential biofuel feedstock candidates for quintessential traits, robust high-throughput analytical techniques must be developed and honed. The traditional methods of measuring lignin syringyl/guaiacyl (S/G) ratio can be laborious, involve hazardous reagents, and/or be destructive. Vibrational spectroscopy can furnish high-throughput instrumentation without the limitations of the traditional techniques. Spectral data from mid-infrared, near-infrared, and Raman spectroscopies was combined with S/G ratios, obtained using pyrolysis molecular beam mass spectrometry, from 245 different eucalypt and Acacia trees across 17 species. Iterations of spectral processing allowed the assembly of robust predictive models using partial least squares (PLS). Results The PLS models were rigorously evaluated using three different randomly generated calibration and validation sets for each spectral processing approach. Root mean standard errors of prediction for validation sets were lowest for models comprised of Raman (0.13 to 0.16) and mid-infrared (0.13 to 0.15) spectral data, while near-infrared spectroscopy led to more erroneous predictions (0.18 to 0.21). Correlation coefficients (r) for the validation sets followed a similar pattern: Raman (0.89 to 0.91), mid-infrared (0.87 to 0.91), and near-infrared (0.79 to 0.82). These statistics signify that Raman and mid-infrared spectroscopy led to the most accurate predictions of S/G ratio in a diverse consortium of feedstocks. Conclusion Eucalypts present an attractive option for biofuel and biochemical production. Given the assortment of over 900 different species of Eucalyptus and Corymbia, in addition to various species of Acacia, it is necessary to isolate those possessing ideal biofuel traits. This research has demonstrated the validity of vibrational spectroscopy to efficiently partition different potential biofuel feedstocks according to lignin S/G ratio, significantly reducing experiment and analysis time and expense while providing non-destructive, accurate, global, predictive models encompassing a diverse array of feedstocks. PMID:24955114
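
    A hedged sketch of a PLS calibration of the kind described above is given below; the spectra and S/G ratios are random placeholders standing in for the pre-processed spectral data and the pyrolysis molecular beam mass spectrometry reference values, and the number of latent variables is arbitrary.

        # Hedged sketch of a PLS calibration for lignin S/G ratio from spectral data;
        # the spectra and S/G values are random placeholders, not the study's data.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(3)
        n_samples, n_wavenumbers = 245, 600
        X = rng.normal(size=(n_samples, n_wavenumbers))            # pre-processed spectra (placeholder)
        true_weights = rng.normal(size=n_wavenumbers) / np.sqrt(n_wavenumbers)
        y = 2.0 + X @ true_weights * 0.3 + rng.normal(0, 0.1, n_samples)   # synthetic S/G ratios

        X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

        pls = PLSRegression(n_components=8)                        # latent variables: arbitrary choice
        pls.fit(X_cal, y_cal)
        y_pred = pls.predict(X_val).ravel()

        rmsep = np.sqrt(np.mean((y_pred - y_val) ** 2))            # root mean square error of prediction
        r = np.corrcoef(y_pred, y_val)[0, 1]
        print(f"RMSEP={rmsep:.3f}, r={r:.3f}")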

  4. Validity of the Consensual Assessment Technique--Evidence with Three Groups of Judges and an Elementary School Student Sample

    ERIC Educational Resources Information Center

    Long, Haiying

    2012-01-01

    As one of the most widely used creativity assessment tools, the Consensual Assessment Technique (CAT) has been praised as a valid tool to assess creativity. In Amabile's (1982) seminal work, the inter-rater reliability was defined as construct validity of the CAT. During the past three decades, researchers followed this definition and…

  5. Antibodies Targeting EMT

    DTIC Science & Technology

    2017-10-01

    impact substrate usage in AKRs. The blue residues are variable positions between AKR1C1-4. These residues are near the active site and may play a ... and diagnostic biomarkers. We have developed a new technique allowing for discovery of new antibodies that disrupt a key process in cancer progression

  6. Applicability of SREM to the Verification of Management Information System Software Requirements. Volume I.

    DTIC Science & Technology

    1981-04-30

    However, SREM was not designed to harmonize these kinds of problems. Rather, it is a tool to investigate the logic of the processing specified in the... design. Supporting programs were also conducted to perform basic research into such areas as software reliability, static and dynamic validation techniques... development. o Maintain requirements development independent of the target machine and the eventual software design. o Allow for easy response to

  7. A digital computer simulation and study of a direct-energy-transfer power-conditioning system

    NASA Technical Reports Server (NTRS)

    Burns, W. W., III; Owen, H. A., Jr.; Wilson, T. G.; Rodriguez, G. E.; Paulkovich, J.

    1974-01-01

    A digital computer simulation technique, which can be used to study such composite power-conditioning systems, was applied to a spacecraft direct-energy-transfer power-processing system. The results obtained duplicate actual system performance with considerable accuracy. The validity of the approach and its usefulness in studying various aspects of system performance such as steady-state characteristics and transient responses to severely varying operating conditions are demonstrated experimentally.

  8. Rapid Development of New Protein Biosensors Utilizing Peptides Obtained via Phage Display

    DTIC Science & Technology

    2011-10-01

    removal of loosely bound peptide or the viscosity/density change of solutions. ALT sensor operation: QCM, CV and EIS measurements validated the formation... manuscript, we demonstrate this process from start to finish to create a new biosensor for the detection of ALT. Figure 6. Sensor operation. A) QCM ... peptides, peptide synthesis with a terminal thiol, QCM in-situ monitoring of peptide immobilization, and sensor detection using electrochemical techniques

  9. Infrared imaging - A validation technique for computational fluid dynamics codes used in STOVL applications

    NASA Technical Reports Server (NTRS)

    Hardman, R. R.; Mahan, J. R.; Smith, M. H.; Gelhausen, P. A.; Van Dalsem, W. R.

    1991-01-01

    The need for a validation technique for computational fluid dynamics (CFD) codes in STOVL applications has led to research efforts to apply infrared thermal imaging techniques to visualize gaseous flow fields. Specifically, a heated, free-jet test facility was constructed. The gaseous flow field of the jet exhaust was characterized using an infrared imaging technique in the 2 to 5.6 micron wavelength band as well as conventional pitot tube and thermocouple methods. These infrared images are compared to computer-generated images using the equations of radiative exchange based on the temperature distribution in the jet exhaust measured with the thermocouple traverses. Temperature and velocity measurement techniques, infrared imaging, and the computer model of the infrared imaging technique are presented and discussed. From the study, it is concluded that infrared imaging techniques coupled with the radiative exchange equations applied to CFD models are a valid method to qualitatively verify CFD codes used in STOVL applications.

  10. Experimental validation of pulsed column inventory estimators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beyerlein, A.L.; Geldard, J.F.; Weh, R.

    Near-real-time accounting (NRTA) for reprocessing plants relies on the timely measurement of all transfers through the process area and all inventory in the process. It is difficult to measure the inventory of the solvent contactors; therefore, estimation techniques are considered. We have used experimental data obtained at the TEKO facility in Karlsruhe and have applied computer codes developed at Clemson University to analyze these data. For uranium extraction, the computer predictions agree to within 15% of the measured inventories. We believe this study is significant in demonstrating that using theoretical models with a minimum amount of process data may be an acceptable approach to column inventory estimation for NRTA. 15 refs., 7 figs.

  11. Visual processing speed.

    PubMed

    Owsley, Cynthia

    2013-09-20

    Older adults commonly report difficulties in visual tasks of everyday living that involve visual clutter, secondary task demands, and time sensitive responses. These difficulties often cannot be attributed to visual sensory impairment. Techniques for measuring visual processing speed under divided attention conditions and among visual distractors have been developed and have established construct validity in that those older adults performing poorly in these tests are more likely to exhibit daily visual task performance problems. Research suggests that computer-based training exercises can increase visual processing speed in older adults and that these gains transfer to enhancement of health and functioning and a slowing in functional and health decline as people grow older. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. Predictive Modeling and Optimization of Vibration-assisted AFM Tip-based Nanomachining

    NASA Astrophysics Data System (ADS)

    Kong, Xiangcheng

    The tip-based vibration-assisted nanomachining process offers a low-cost, low-effort technique for fabricating nanometer-scale 2D/3D structures in the sub-100 nm regime. To understand its mechanism and to provide guidelines for process planning and optimization, we systematically studied this nanomachining technique in this work. To understand the mechanism, we first analyzed the interaction between the AFM tip and the workpiece surface during the machining process. A 3D voxel-based numerical algorithm was developed to calculate the material removal rate as well as the contact area between the AFM tip and the workpiece surface. As a critical factor in understanding the mechanism of this nanomachining process, the cutting force was analyzed and modeled. A semi-empirical model was proposed by correlating the cutting force with the material removal rate, and it was validated using experimental data from different machining conditions. With this understanding of the mechanism, we developed guidelines for process planning of the nanomachining technique. To guide parameter selection, the effect of machining parameters on the feature dimensions (depth and width) was analyzed. Based on ANOVA test results, the feature width is controlled only by the XY vibration amplitude, while the feature depth is affected by several machining parameters such as setpoint force and feed rate. A semi-empirical model was first proposed to predict the machined feature depth under a given machining condition. Then, to reduce the computational intensity, linear and nonlinear regression models were also proposed and validated using experimental data. Given the desired feature dimensions, feasible machining parameters can be provided by these predictive feature-dimension models. Because tip wear is unavoidable during machining, the machining precision gradually decreases. To maintain machining quality, a guideline for when to change the tip should be provided. In this study, we developed several metrics to detect tip wear, such as the tip radius and the pull-off force. The effect of machining parameters on the tip wear rate was studied using these metrics, and the machining distance before a tip must be changed was modeled as a function of these machining parameters. Finally, optimization functions were built for unit production time and unit production cost subject to realistic constraints, and the optimal machining parameters can be found by solving these functions.

  13. Identification of modal strains using sub-microstrain FBG data and a novel wavelength-shift detection algorithm

    NASA Astrophysics Data System (ADS)

    Anastasopoulos, Dimitrios; Moretti, Patrizia; Geernaert, Thomas; De Pauw, Ben; Nawrot, Urszula; De Roeck, Guido; Berghmans, Francis; Reynders, Edwin

    2017-03-01

    The presence of damage in a civil structure alters its stiffness and consequently its modal characteristics. The identification of these changes can provide engineers with useful information about the condition of a structure and constitutes the basic principle of the vibration-based structural health monitoring. While eigenfrequencies and mode shapes are the most commonly monitored modal characteristics, their sensitivity to structural damage may be low relative to their sensitivity to environmental influences. Modal strains or curvatures could offer an attractive alternative but current measurement techniques encounter difficulties in capturing the very small strain (sub-microstrain) levels occurring during ambient, or operational excitation, with sufficient accuracy. This paper investigates the ability to obtain sub-microstrain accuracy with standard fiber-optic Bragg gratings using a novel optical signal processing algorithm that identifies the wavelength shift with high accuracy and precision. The novel technique is validated in an extensive experimental modal analysis test on a steel I-beam which is instrumented with FBG sensors at its top and bottom flange. The raw wavelength FBG data are processed into strain values using both a novel correlation-based processing technique and a conventional peak tracking technique. Subsequently, the strain time series are used for identifying the beam's modal characteristics. Finally, the accuracy of both algorithms in identification of modal characteristics is extensively investigated.
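
    A generic cross-correlation shift estimate, not the paper's algorithm, can illustrate the idea of correlation-based wavelength-shift detection; in the sketch below the FBG reflection spectra are modelled as Gaussians on a 1 pm grid and the correlation peak is refined with parabolic interpolation.

        # Hedged sketch of a correlation-based wavelength-shift estimate between two FBG
        # reflection spectra; the Gaussian-shaped spectra and the sub-sample parabolic
        # refinement are illustrative, not the algorithm from the cited paper.
        import numpy as np

        wl = np.arange(1549.0, 1551.0, 0.001)                     # wavelength grid (nm), 1 pm step

        def fbg_peak(center):
            return np.exp(-0.5 * ((wl - center) / 0.05) ** 2)

        ref = fbg_peak(1550.000)                                  # reference spectrum
        meas = fbg_peak(1550.0137) + np.random.default_rng(4).normal(0, 0.01, wl.size)  # shifted + noise

        xcorr = np.correlate(meas - meas.mean(), ref - ref.mean(), mode="full")
        peak = np.argmax(xcorr)
        lag = peak - (len(ref) - 1)                               # integer-sample lag

        # parabolic interpolation around the correlation peak for sub-sample resolution
        y0, y1, y2 = xcorr[peak - 1: peak + 2]
        frac = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)

        shift_nm = (lag + frac) * (wl[1] - wl[0])
        print(f"estimated wavelength shift: {shift_nm * 1e3:.1f} pm (true: 13.7 pm)")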

  14. Using object-oriented analysis techniques to support system testing

    NASA Astrophysics Data System (ADS)

    Zucconi, Lin

    1990-03-01

    Testing of real-time control systems can be greatly facilitated by use of object-oriented and structured analysis modeling techniques. This report describes a project where behavior, process and information models built for a real-time control system were used to augment and aid traditional system testing. The modeling techniques used were an adaptation of the Ward/Mellor method for real-time systems analysis and design (Ward85) for object-oriented development. The models were used to simulate system behavior by means of hand execution of the behavior or state model and the associated process (data and control flow) and information (data) models. The information model, which uses an extended entity-relationship modeling technique, is used to identify application domain objects and their attributes (instance variables). The behavioral model uses state-transition diagrams to describe the state-dependent behavior of the object. The process model uses a transformation schema to describe the operations performed on or by the object. Together, these models provide a means of analyzing and specifying a system in terms of the static and dynamic properties of the objects which it manipulates. The various models were used to simultaneously capture knowledge about both the objects in the application domain and the system implementation. Models were constructed, verified against the software as-built and validated through informal reviews with the developer. These models were then hand-executed.

  15. Computational solution verification and validation applied to a thermal model of a ruggedized instrumentation package

    DOE PAGES

    Scott, Sarah Nicole; Templeton, Jeremy Alan; Hough, Patricia Diane; ...

    2014-01-01

    This study details a methodology for quantification of errors and uncertainties of a finite element heat transfer model applied to a Ruggedized Instrumentation Package (RIP). The proposed verification and validation (V&V) process includes solution verification to examine errors associated with the code's solution techniques, and model validation to assess the model's predictive capability for quantities of interest. The model was subjected to mesh resolution and numerical parameters sensitivity studies to determine reasonable parameter values and to understand how they change the overall model response and performance criteria. To facilitate quantification of the uncertainty associated with the mesh, automatic meshing and mesh refining/coarsening algorithms were created and implemented on the complex geometry of the RIP. Automated software to vary model inputs was also developed to determine the solution's sensitivity to numerical and physical parameters. The model was compared with an experiment to demonstrate its accuracy and determine the importance of both modelled and unmodelled physics in quantifying the results' uncertainty. An emphasis is placed on automating the V&V process to enable uncertainty quantification within tight development schedules.

  16. Analytical validation of an explicit finite element model of a rolling element bearing with a localised line spall

    NASA Astrophysics Data System (ADS)

    Singh, Sarabjeet; Howard, Carl Q.; Hansen, Colin H.; Köpke, Uwe G.

    2018-03-01

    In this paper, numerically modelled vibration response of a rolling element bearing with a localised outer raceway line spall is presented. The results were obtained from a finite element (FE) model of the defective bearing solved using an explicit dynamics FE software package, LS-DYNA. Time domain vibration signals of the bearing obtained directly from the FE modelling were processed further to estimate time-frequency and frequency domain results, such as spectrogram and power spectrum, using standard signal processing techniques pertinent to the vibration-based monitoring of rolling element bearings. A logical approach to analyses of the numerically modelled results was developed with an aim to presenting the analytical validation of the modelled results. While the time and frequency domain analyses of the results show that the FE model generates accurate bearing kinematics and defect frequencies, the time-frequency analysis highlights the simulation of distinct low- and high-frequency characteristic vibration signals associated with the unloading and reloading of the rolling elements as they move in and out of the defect, respectively. Favourable agreement of the numerical and analytical results demonstrates the validation of the results from the explicit FE modelling of the bearing.
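
    The signal-processing step described above (power spectrum and spectrogram of the bearing vibration signal) might look like the following sketch; the sampling rate, defect frequency and resonance are invented for illustration and do not correspond to the FE model in the paper.

        # Hedged sketch of the post-processing applied to a bearing vibration signal:
        # Welch power spectrum and spectrogram of a synthetic signal containing a
        # repetitive defect-impact component (the defect frequency used here is made up).
        import numpy as np
        from scipy.signal import welch, spectrogram

        fs = 50_000                                   # sampling rate (Hz), hypothetical
        t = np.arange(0, 1.0, 1 / fs)
        bpfo = 87.0                                   # assumed outer-race defect frequency (Hz)

        # impacts: a decaying 4 kHz resonance excited every 1/bpfo seconds
        signal = np.zeros_like(t)
        for t0 in np.arange(0, 1.0, 1 / bpfo):
            mask = t >= t0
            signal[mask] += np.exp(-800 * (t[mask] - t0)) * np.sin(2 * np.pi * 4000 * (t[mask] - t0))
        signal += 0.05 * np.random.default_rng(5).normal(size=t.size)

        f_psd, psd = welch(signal, fs=fs, nperseg=8192)                    # power spectrum
        f_spec, t_spec, Sxx = spectrogram(signal, fs=fs, nperseg=1024, noverlap=512)

        print(f"PSD peak near {f_psd[np.argmax(psd)]:.0f} Hz (resonance band)")
        print(f"spectrogram shape: {Sxx.shape} (freq bins x time frames)")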

  17. Integrative change model in psychotherapy: Perspectives from Indian thought.

    PubMed

    Manickam, L S S

    2013-01-01

    Different psychotherapeutic approaches claim positive changes in patients as a result of therapy. Explanations of the change process have led to different change models. Some of these models are experimentally oriented whereas others are theoretical. Apart from the core behavioral, psychodynamic, humanistic, cognitive and spiritually oriented models, there are specific models within psychotherapy that explain the change process. The integrative theory of the person depicted in Indian thought provides a common ground for the integration of various therapies. An integrative model of change based on Indian thought, with specific reference to psychological concepts in the Upanishads, Ayurveda, the Bhagavad Gita and Yoga, is presented. Appropriate psychological tools may be developed to help clinicians choose the techniques that match the problem and the origin of the dimension. Further exploration is needed to develop techniques that are culturally appropriate and clinically useful, and research should be initiated to validate the identified concepts.

  18. Integrative change model in psychotherapy: Perspectives from Indian thought

    PubMed Central

    Manickam, L. S. S

    2013-01-01

    Different psychotherapeutic approaches claim positive changes in patients as a result of therapy. Explanations of the change process have led to different change models. Some of these models are experimentally oriented whereas others are theoretical. Apart from the core behavioral, psychodynamic, humanistic, cognitive and spiritually oriented models, there are specific models within psychotherapy that explain the change process. The integrative theory of the person depicted in Indian thought provides a common ground for the integration of various therapies. An integrative model of change based on Indian thought, with specific reference to psychological concepts in the Upanishads, Ayurveda, the Bhagavad Gita and Yoga, is presented. Appropriate psychological tools may be developed to help clinicians choose the techniques that match the problem and the origin of the dimension. Further exploration is needed to develop techniques that are culturally appropriate and clinically useful, and research should be initiated to validate the identified concepts. PMID:23858275

  19. Ionospheric Correction of D-InSAR Using Split-Spectrum Technique and 3D Ionosphere Model in Deformation Monitoring

    NASA Astrophysics Data System (ADS)

    Chen, Y.; Guo, L.; Wu, J. J.; Chen, Q.; Song, S.

    2014-12-01

    In Differential Interferometric Synthetic Aperture Radar (D-InSAR), atmospheric effects, including the troposphere and the ionosphere, are among the dominant sources of error in most interferograms and greatly reduce the accuracy of deformation monitoring. In recent years, tropospheric correction in InSAR data processing, especially the zenith wet delay (ZWD), has been widely investigated and can be efficiently suppressed. We therefore focused our study on ionospheric correction using two different methods: the split-spectrum technique and the NeQuick model, a three-dimensional electron density model. We processed ALOS PALSAR images of Wenchuan and compared the InSAR surface deformation after ionospheric correction with the two approaches against ground GPS subsidence observations to validate the split-spectrum method and the NeQuick model, and we further discuss the performance and feasibility of using external data versus InSAR itself for eliminating the ionospheric effect.
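
    For reference, the split-spectrum combination commonly given in the InSAR literature separates the dispersive (ionospheric) phase from the non-dispersive phase using two sub-band interferograms; the sketch below applies that standard formula with placeholder frequencies and phases, and is not the processing chain used in the study.

        # Hedged sketch of the standard split-spectrum combination used to isolate the
        # dispersive (ionospheric) phase from two sub-band interferograms. Frequencies
        # roughly match L-band ALOS PALSAR; the sub-band values and phases are placeholders.
        import numpy as np

        f0 = 1.27e9          # full-band centre frequency (Hz)
        fL = 1.2558e9        # lower sub-band centre frequency (Hz), assumed
        fH = 1.2842e9        # upper sub-band centre frequency (Hz), assumed

        phi_L = np.array([2.40, 1.10, -0.35])   # unwrapped sub-band phases (rad), placeholders
        phi_H = np.array([2.31, 1.02, -0.41])

        # dispersive (ionospheric) component at f0
        phi_iono = (fL * fH) / (f0 * (fH**2 - fL**2)) * (phi_L * fH - phi_H * fL)
        # non-dispersive component (deformation + troposphere) at f0
        phi_nondisp = f0 / (fH**2 - fL**2) * (phi_H * fH - phi_L * fL)

        print("ionospheric phase (rad):        ", np.round(phi_iono, 3))
        print("non-dispersive phase (rad):     ", np.round(phi_nondisp, 3))
        print("reconstructed full-band (rad):  ", np.round(phi_iono + phi_nondisp, 3))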

  20. Mathematics learning on geometry for children with autism

    NASA Astrophysics Data System (ADS)

    Widayati, F. E.; Usodo, B.; Pamudya, I.

    2017-12-01

    The purpose of this research is to describe: (1) the mathematics learning process in an inclusion class and (2) the obstacles encountered during mathematics learning in the inclusion class. This is a descriptive qualitative study. The subjects were a mathematics teacher, children with autism, and a teacher assistant. Data were collected through observation and interviews, and validated by triangulation. The results of this research are: (1) The lesson plan is modified for children with autism; the modifications concern the indicators of success, material, time, and assessment. The lesson plan for children with autism is prepared by the mathematics teacher and the teacher assistant. The mathematics teacher does not use any special media for children with autism. (2) The main obstacle is that children with autism have difficulty understanding mathematical concepts; in addition, they easily lose focus.

  1. Accurate Identification of Fatty Liver Disease in Data Warehouse Utilizing Natural Language Processing.

    PubMed

    Redman, Joseph S; Natarajan, Yamini; Hou, Jason K; Wang, Jingqi; Hanif, Muzammil; Feng, Hua; Kramer, Jennifer R; Desiderio, Roxanne; Xu, Hua; El-Serag, Hashem B; Kanwal, Fasiha

    2017-10-01

    Natural language processing is a powerful technique of machine learning capable of maximizing data extraction from complex electronic medical records. We utilized this technique to develop algorithms capable of "reading" full-text radiology reports to accurately identify the presence of fatty liver disease. Abdominal ultrasound, computerized tomography, and magnetic resonance imaging reports were retrieved from the Veterans Affairs Corporate Data Warehouse from a random national sample of 652 patients. Radiographic fatty liver disease was determined by manual review by two physicians and verified with an expert radiologist. A split validation method was utilized for algorithm development. For all three imaging modalities, the algorithms could identify fatty liver disease with >90% recall and precision, with F-measures >90%. These algorithms could be used to rapidly screen patient records to establish a large cohort to facilitate epidemiological and clinical studies and examine the clinic course and outcomes of patients with radiographic hepatic steatosis.
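
    A deliberately simplified, rule-based sketch of the task described above is shown below: keyword and negation patterns flag fatty-liver language in free-text reports, and recall, precision and F-measure are computed against manual labels. The reports, patterns and labels are toy examples, not the algorithms or data of the cited study.

        # Hedged, highly simplified sketch of the idea behind such an algorithm: a
        # rule-based check for fatty-liver language in free-text radiology reports,
        # scored with precision, recall and F-measure against (toy) manual labels.
        import re

        reports = [
            ("Increased hepatic echogenicity consistent with hepatic steatosis.", True),
            ("Diffuse fatty infiltration of the liver. No focal lesion.",          True),
            ("Liver is normal in size and echotexture. No steatosis.",             False),
            ("Unremarkable abdominal ultrasound.",                                 False),
        ]

        positive = re.compile(r"\b(steatosis|fatty (liver|infiltration)|hepatic echogenicity)\b", re.I)
        negation = re.compile(r"\bno (evidence of )?(steatosis|fatty)", re.I)

        def predict(text: str) -> bool:
            return bool(positive.search(text)) and not negation.search(text)

        tp = sum(1 for text, label in reports if predict(text) and label)
        fp = sum(1 for text, label in reports if predict(text) and not label)
        fn = sum(1 for text, label in reports if not predict(text) and label)

        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f_measure = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        print(f"precision={precision:.2f}, recall={recall:.2f}, F={f_measure:.2f}")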

  2. A real time, FEM based optimal control algorithm and its implementation using parallel processing hardware (transputers) in a microprocessor environment

    NASA Technical Reports Server (NTRS)

    Patten, William Neff

    1989-01-01

    There is an evident need to discover a means of establishing reliable, implementable controls for systems that are plagued by nonlinear and/or uncertain model dynamics. The development of a generic controller design tool for tough-to-control systems is reported. The method utilizes a moving grid, time infinite element based solution of the necessary conditions that describe an optimal controller for a system. The technique produces a discrete feedback controller. Real-time laboratory experiments are now being conducted to demonstrate the viability of the method. The resulting algorithm is being implemented in a microprocessor environment. Critical computational tasks are accomplished using low-cost, on-board multiprocessors (INMOS T800 Transputers) and parallel processing. Progress to date validates the methodology presented. Applications of the technique to the control of highly flexible robotic appendages are suggested.

  3. Automated Analysis of Stateflow Models

    NASA Technical Reports Server (NTRS)

    Bourbouh, Hamza; Garoche, Pierre-Loic; Garion, Christophe; Gurfinkel, Arie; Kahsaia, Temesghen; Thirioux, Xavier

    2017-01-01

    Stateflow is a widely used modeling framework for embedded and cyber-physical systems in which control software interacts with physical processes. In this work, we present a framework for fully automated safety verification of Stateflow models. Our approach is two-fold: (i) we faithfully compile Stateflow models into hierarchical state machines, and (ii) we use an automated logic-based verification engine to decide the validity of safety properties. The starting point of our approach is a denotational semantics of Stateflow. We propose a compilation process using continuation-passing style (CPS) denotational semantics. Our compilation technique preserves the structural and modal behavior of the system. The overall approach is implemented as an open-source toolbox that can be integrated into the existing MathWorks Simulink Stateflow modeling framework. We present preliminary experimental evaluations that illustrate the effectiveness of our approach in code generation and safety verification of industrial-scale Stateflow models.

  4. The three-dimensional morphology of growing dendrites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gibbs, J. W.; Mohan, K. A.; Gulsoy, E. B.

    The processes controlling the morphology of dendrites have been of great interest to a wide range of communities, since they are examples of an out-of-equilibrium pattern forming system, there is a clear connection with battery failure processes, and their morphology sets the properties of many metallic alloys. We determine the three-dimensional morphology of free growing metallic dendrites using a novel X-ray tomographic technique that improves the temporal resolution by more than an order of magnitude compared to conventional techniques. These measurements show that the growth morphology of metallic dendrites is surprisingly different from that seen in model systems, the morphology is not self-similar with distance back from the tip, and that this morphology can have an unexpectedly strong influence on solute segregation in castings. As a result, these experiments also provide benchmark data that can be used to validate simulations of free dendritic growth.

  5. The three-dimensional morphology of growing dendrites

    DOE PAGES

    Gibbs, J. W.; Mohan, K. A.; Gulsoy, E. B.; ...

    2015-07-03

    The processes controlling the morphology of dendrites have been of great interest to a wide range of communities, since they are examples of an out-of-equilibrium pattern forming system, there is a clear connection with battery failure processes, and their morphology sets the properties of many metallic alloys. We determine the three-dimensional morphology of free growing metallic dendrites using a novel X-ray tomographic technique that improves the temporal resolution by more than an order of magnitude compared to conventional techniques. These measurements show that the growth morphology of metallic dendrites is surprisingly different from that seen in model systems, the morphology is not self-similar with distance back from the tip, and that this morphology can have an unexpectedly strong influence on solute segregation in castings. As a result, these experiments also provide benchmark data that can be used to validate simulations of free dendritic growth.

  6. CFD Techniques for Propulsion Applications

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The symposium was composed of the following sessions: turbomachinery computations and validations; flow in ducts, intakes, and nozzles; and reacting flows. Forty papers were presented, and they covered full 3-D code validation and numerical techniques; multidimensional reacting flow; and unsteady viscous flow for the entire spectrum of propulsion system components. The capabilities of the various numerical techniques were assessed and significant new developments were identified. The technical evaluation spells out where progress has been made and concludes that the present state of the art has almost reached the level necessary to tackle the comprehensive topic of computational fluid dynamics (CFD) validation for propulsion.

  7. System Identification of a Heaving Point Absorber: Design of Experiment and Device Modeling

    DOE PAGES

    Bacelli, Giorgio; Coe, Ryan; Patterson, David; ...

    2017-04-01

    Empirically based modeling is an essential aspect of design for a wave energy converter. These models are used in structural, mechanical and control design processes, as well as for performance prediction. The design of experiments and the methods used to produce models from collected data have a strong impact on the quality of the model. This study considers the system identification and model validation process based on data collected from a wave tank test of a model-scale wave energy converter. Experimental design and data processing techniques based on general system identification procedures are discussed and compared with the practices often followed for wave tank testing. The general system identification processes are shown to have a number of advantages. The experimental data are then used to produce multiple models for the dynamics of the device. These models are validated and their performance is compared against one another. Furthermore, while most models of wave energy converters use a formulation with wave elevation as an input, this study shows that a model using a hull pressure sensor to incorporate the wave excitation phenomenon has better accuracy.

  8. System Identification of a Heaving Point Absorber: Design of Experiment and Device Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bacelli, Giorgio; Coe, Ryan; Patterson, David

    Empirically based modeling is an essential aspect of design for a wave energy converter. These models are used in structural, mechanical and control design processes, as well as for performance prediction. The design of experiments and the methods used to produce models from collected data have a strong impact on the quality of the model. This study considers the system identification and model validation process based on data collected from a wave tank test of a model-scale wave energy converter. Experimental design and data processing techniques based on general system identification procedures are discussed and compared with the practices often followed for wave tank testing. The general system identification processes are shown to have a number of advantages. The experimental data are then used to produce multiple models for the dynamics of the device. These models are validated and their performance is compared against one another. Furthermore, while most models of wave energy converters use a formulation with wave elevation as an input, this study shows that a model using a hull pressure sensor to incorporate the wave excitation phenomenon has better accuracy.

  9. In-Line Monitoring of a Pharmaceutical Pan Coating Process by Optical Coherence Tomography.

    PubMed

    Markl, Daniel; Hannesschläger, Günther; Sacher, Stephan; Leitner, Michael; Buchsbaum, Andreas; Pescod, Russel; Baele, Thomas; Khinast, Johannes G

    2015-08-01

    This work demonstrates a new in-line measurement technique for monitoring the coating growth of randomly moving tablets in a pan coating process. In-line quality control is performed by an optical coherence tomography (OCT) sensor allowing nondestructive and contact-free acquisition of cross-section images of film coatings in real time. The coating thickness can be determined directly from these OCT images, and no chemometric calibration models are required for quantification. Coating thickness measurements are extracted from the images by a fully automated algorithm. Results of the in-line measurements are validated using off-line OCT images, thickness calculations from tablet dimension measurements, and weight gain measurements. Validation measurements are performed on sample tablets periodically removed from the process during production. Reproducibility of the results is demonstrated by three batches produced under the same process conditions. OCT enables multiple direct measurements of the coating thickness on individual tablets rather than providing only the average coating thickness of a large number of tablets. This gives substantially more information about coating quality, that is, intra- and intertablet coating variability, than standard quality control methods. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.

  10. Unremarked or Unperformed? Systematic Review on Reporting of Validation Efforts of Health Economic Decision Models in Seasonal Influenza and Early Breast Cancer.

    PubMed

    de Boer, Pieter T; Frederix, Geert W J; Feenstra, Talitha L; Vemer, Pepijn

    2016-09-01

    Transparent reporting of validation efforts of health economic models gives stakeholders better insight into the credibility of model outcomes. In this study we reviewed recently published studies on seasonal influenza and early breast cancer in order to gain insight into the reporting of model validation efforts in the overall health economic literature. A literature search was performed in PubMed and Embase to retrieve health economic modelling studies published between 2008 and 2014. Reporting on model validation was evaluated by checking for the word validation, and by using AdViSHE (Assessment of the Validation Status of Health Economic decision models), a tool containing a structured list of relevant items for validation. Additionally, we contacted corresponding authors to ask whether validation efforts had been performed other than those reported in the manuscripts. A total of 53 studies on seasonal influenza and 41 studies on early breast cancer were included in our review. The word validation was used in 16 studies (30%) on seasonal influenza and 23 studies (56%) on early breast cancer; however, in a minority of studies, this referred to a model validation technique. Fifty-seven percent of seasonal influenza studies and 71% of early breast cancer studies reported one or more validation techniques. Cross-validation of study outcomes was found most often. A limited number of studies reported on model validation efforts, although good examples were identified. Author comments indicated that more validation techniques were performed than those reported in the manuscripts. Although validation is deemed important by many researchers, this is not reflected in the reporting habits of health economic modelling studies. Systematic reporting of validation efforts would be desirable to further enhance decision makers' confidence in health economic models and their outcomes.

  11. Age validation of canary rockfish (Sebastes pinniger) using two independent otolith techniques: lead-radium and bomb radiocarbon dating.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrews, A H; Kerr, L A; Cailliet, G M

    2007-11-04

    Canary rockfish (Sebastes pinniger) have long been an important part of recreational and commercial rockfish fishing from southeast Alaska to southern California, but localized stock abundances have declined considerably. Based on age estimates from otoliths and other structures, lifespan estimates vary from about 20 years to over 80 years. For the purpose of monitoring stocks, age composition is routinely estimated by counting growth zones in otoliths; however, age estimation procedures and lifespan estimates remain largely unvalidated. Typical age validation techniques have limited application for canary rockfish because they are deep dwelling and may be long lived. In this study, the unaged otolith of the pair from fish aged at the Department of Fisheries and Oceans Canada was used in one of two age validation techniques: (1) lead-radium dating and (2) bomb radiocarbon (¹⁴C) dating. Age estimate accuracy and the validity of age estimation procedures were evaluated based on the results from each technique. Lead-radium dating proved successful in establishing a minimum lifespan estimate of 53 years and provided support for age estimation procedures up to about 50-60 years. These findings were further supported by Δ¹⁴C data, which indicated a minimum lifespan estimate of 44 ± 3 years. Both techniques validate, to differing degrees, the age estimation procedures and provide support for inferring that canary rockfish can live more than 80 years.

  12. Effect of fabrication processes on mechanical properties of glass fiber reinforced polymer composites for 49 meter (160 foot) recreational yachts

    NASA Astrophysics Data System (ADS)

    Kim, Dave (dea-wook); Hennigan, Daniel John; Beavers, Kevin Daniel

    2010-03-01

    Polymer composite materials offer high strength- and stiffness-to-weight ratios, corrosion resistance, and total life cost reductions that appeal to the marine industry. The advantages of composite construction have led to their incorporation in U.S. yacht hull structures over 46 meters (150 feet) in length. In order to construct even larger hull structures, higher-quality composites with lower-cost production techniques need to be developed. In this study, the effect of composite hull fabrication processes on the mechanical properties of glass fiber reinforced plastic (GFRP) composites is presented. Fabrication techniques used in this study are hand lay-up (HL), vacuum infusion (VI), and hybrid (HL+VI) processes. Mechanical property testing includes tensile, compressive, and ignition loss sample analysis. Results demonstrate that the vacuum pressure applied during composite fabrication has an effect on mechanical properties. The VI-processed GFRP yields improved mechanical properties in tension/compression strengths and tensile modulus. The hybrid GFRP composites, however, failed in a sequential manner due to dissimilar failure modes in the HL- and VI-processed sides. Fractography analysis was conducted to validate the mechanical property testing results.

  13. A centerless grinding unit used for precisely processing ferrules of optical fiber connector

    NASA Astrophysics Data System (ADS)

    Wu, Yongbo; Kondo, Takahiro; Kato, Masana

    2005-02-01

    This paper describes the development of a centerless grinding unit used for precisely processing ferrules, a key component of optical fiber connectors. In the conventional processing procedure, the outer diameter of a ferrule is ground by employing a special machine tool, i.e., a centerless grinder. However, when processing small quantities of ferrules, introducing a centerless grinder leads to high processing cost. Therefore, to address this problem, the present authors propose a new centerless grinding technique in which a compact centerless grinding unit, composed of an ultrasonic elliptic-vibration shoe, a workrest blade, and their respective holders, is installed on a common surface grinder to perform centerless grinding operations for outer-diameter machining of ferrules. In this work, a unit is designed and constructed, and is installed on a surface grinder equipped with a diamond grinding wheel. The performance of the unit is then examined experimentally, followed by grinding tests of the ferrule's outer diameter. As a result, the roundness of the ferrule's outer diameter improved from an original value of around 3 μm to a final value of around 0.5 μm, confirming the validity of the new technique.

  14. Coaching leadership: leaders' and followers' perception assessment questionnaires in nursing

    PubMed Central

    Cardoso, Maria Lúcia Alves Pereira; Ramos, Laís Helena; D'Innocenzo, Maria

    2014-01-01

    ABSTRACT Objective: To describe the development, content analysis, and reliability of two questionnaires to assess the perception of nurse leaders, nurse technicians, and licensed practical nurses coached in the practice of leadership, and the relation with the dimensions of the coaching process. Methods: This was a methodological study with a quantitative and qualitative approach, aimed at the construction and validation of measuring instruments. The design of the instrument propositions was based on the literature on leadership, coaching, and assessment of psychometric properties, and the propositions were subjected to content validation as to clarity, relevance, and applicability through the consensus of judges, using the Delphi technique, in 2010. The final version of the questionnaires was administered to 279 nurses and 608 nurse technicians and licensed practical nurses at two university hospitals and two private hospitals. Results: The Cronbach's alpha value with all items of the self-perception instrument was very high (0.911). The team members' perception instrument showed that, for all determinants and for each dimension of the coaching process, the overall Cronbach's alpha value (0.952) was quite high, pointing to a very strong consistency of the scale. Confirmatory analysis showed that the models were well adjusted. Conclusion: The statistical validation confirmed the possibility of reusing the questionnaires for other study samples, given the evidence of reliability and applicability. PMID:24728249
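    The reliability statistic reported in this record is Cronbach's alpha. Below is a minimal sketch of how that coefficient is computed from an item-level response matrix; the respondent data, item count, and rating scale are hypothetical and are not taken from the study.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    n_items = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # per-item variance
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (n_items / (n_items - 1)) * (1.0 - item_variances.sum() / total_variance)

# Hypothetical example: 6 respondents rating 4 Likert items (1-5)
responses = np.array([
    [4, 5, 4, 4],
    [3, 4, 3, 4],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 4, 5],
    [3, 3, 4, 3],
])
print(f"alpha = {cronbach_alpha(responses):.3f}")
```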

  15. Angular relational signature-based chest radiograph image view classification.

    PubMed

    Santosh, K C; Wendling, Laurent

    2018-01-22

    In a computer-aided diagnosis (CAD) system, especially for chest radiograph or chest X-ray (CXR) screening, CXR image view information is required. Automatically separating frontal and lateral CXR image views can ease the subsequent CXR screening process, since the techniques may not work equally well for both views. We present a novel technique to classify frontal and lateral CXR images, where we introduce an angular relational signature through the force histogram to extract features and apply three different state-of-the-art classifiers (multi-layer perceptron, random forest, and support vector machine) to make a decision. We validated our fully automatic technique on a set of 8100 images hosted by the U.S. National Library of Medicine (NLM), National Institutes of Health (NIH), and achieved an accuracy close to 100%. Our method outperforms the state-of-the-art methods in terms of processing time (less than or close to 2 s for the whole test data) while the accuracies are comparable, and therefore, it justifies its practicality. Graphical Abstract: Interpreting chest X-rays (CXR) through the angular relational signature.
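    The classification stage described in this record (training a multi-layer perceptron, random forest, and support vector machine on per-image feature vectors) can be sketched as below. The force-histogram angular relational signature itself is not reproduced; the feature vectors and labels are synthetic placeholders, so the printed accuracies only demonstrate the workflow, not the paper's results.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Placeholder features: in the paper each CXR image is reduced to an angular
# relational signature; here random vectors stand in for those signatures.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 180))      # 400 images, 180-bin angular signature
y = rng.integers(0, 2, size=400)     # 0 = frontal, 1 = lateral (synthetic labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

for name, clf in [("MLP", MLPClassifier(max_iter=500)),
                  ("Random forest", RandomForestClassifier(n_estimators=100)),
                  ("SVM", SVC(kernel="rbf"))]:
    clf.fit(X_tr, y_tr)                              # train on signature vectors
    print(f"{name}: accuracy = {clf.score(X_te, y_te):.2f}")
```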

  16. Automated Identification and Shape Analysis of Chorus Elements in the Van Allen Radiation Belts

    NASA Astrophysics Data System (ADS)

    Sen Gupta, Ananya; Kletzing, Craig; Howk, Robin; Kurth, William; Matheny, Morgan

    2017-12-01

    An important goal of the Van Allen Probes mission is to understand wave-particle interaction by chorus emissions in the terrestrial Van Allen radiation belts. To test models, statistical characterization of chorus properties, such as amplitude variation and sweep rates, is an important scientific goal. The Electric and Magnetic Field Instrument Suite and Integrated Science (EMFISIS) instrumentation suite provides measurements of wave electric and magnetic fields as well as DC magnetic fields for the Van Allen Probes mission. However, manual inspection across terabytes of EMFISIS data is not feasible and is, moreover, prone to human confirmation bias. We present signal processing techniques for automated identification, shape analysis, and sweep rate characterization of high-amplitude whistler-mode chorus elements in the Van Allen radiation belts. Specifically, we develop signal processing techniques based on the radon transform that disambiguate chorus elements with a dominant sweep rate against hiss-like chorus. We present representative results validating our techniques and also provide statistical characterization of detected chorus elements across a case study of a 6 s epoch.
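    A hedged sketch of one way a Radon transform can pick out a dominant sweep direction in a spectrogram patch, in the spirit of the technique this record describes; the variance-of-projections heuristic, the synthetic patch, and all parameters are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np
from skimage.transform import radon

def dominant_sweep_angle(spectrogram_patch: np.ndarray) -> float:
    """Estimate the dominant ridge orientation (degrees) in a time-frequency patch.

    The projection angle whose Radon profile has the largest variance is the
    angle that integrates along the chorus element's ridge.
    """
    patch = spectrogram_patch - spectrogram_patch.mean()
    angles = np.linspace(0.0, 180.0, 181)
    sinogram = radon(patch, theta=angles, circle=False)   # shape: (n_rho, n_angles)
    return angles[np.argmax(sinogram.var(axis=0))]

# Hypothetical patch: a rising-tone ridge (chorus-like) plus background noise
t = np.linspace(0, 1, 128)
patch = np.random.default_rng(1).normal(scale=0.1, size=(128, 128))
for i, ti in enumerate(t):
    patch[int(ti * 127), i] += 1.0                         # diagonal ridge
print(f"dominant orientation ~ {dominant_sweep_angle(patch):.1f} degrees")
# The sweep rate then follows from this angle and the patch's time/frequency scales.
```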

  17. Application of Eddy Current Techniques for Orbiter Reinforced Carbon-Carbon Structural Health Monitoring

    NASA Technical Reports Server (NTRS)

    Wincheski, Buzz; Simpson, John

    2005-01-01

    The development and application of advanced nondestructive evaluation techniques for the Reinforced Carbon-Carbon (RCC) components of the Space Shuttle Orbiter Leading Edge Structural Subsystem (LESS) was identified as a crucial step toward returning the shuttle fleet to service. In order to help meet this requirement, eddy current techniques have been developed for application to RCC components. Eddy current technology has been found to be particularly useful for measuring the protective coating thickness over the reinforced carbon-carbon and for the identification of near surface cracking and voids in the RCC matrix. Testing has been performed on as manufactured and flown RCC components with both actual and fabricated defects representing impact and oxidation damage. Encouraging initial results have led to the development of two separate eddy current systems for in-situ RCC inspections in the orbiter processing facility. Each of these systems has undergone blind validation testing on a full scale leading edge panel, and recently transitioned to Kennedy Space Center to be applied as a part of a comprehensive RCC inspection strategy to be performed in the orbiter processing facility after each shuttle flight.

  18. High resolution beamforming on large aperture vertical line arrays: Processing synthetic data

    NASA Astrophysics Data System (ADS)

    Tran, Jean-Marie Q.; Hodgkiss, William S.

    1990-09-01

    This technical memorandum studies the beamforming of large aperture line arrays deployed vertically in the water column. The work concentrates on the use of high resolution techniques. Two processing strategies are envisioned: (1) full aperture coherent processing, which offers in theory the best processing gain; and (2) subaperture processing, which consists of extracting subapertures from the array and recombining the angular spectra estimated from these subarrays. The conventional beamformer, the minimum variance distortionless response (MVDR) processor, the multiple signal classification (MUSIC) algorithm and the minimum norm method are used in this study. To validate the various processing techniques, the ATLAS normal mode program is used to generate synthetic data which constitute a realistic signal environment. A deep-water, range-independent sound velocity profile environment, characteristic of the North-East Pacific, is studied for two different 128-sensor arrays: a very long one cut for 30 Hz and operating at 20 Hz, and a shorter one cut for 107 Hz and operating at 100 Hz. The simulated sound source is 5 m deep. The full aperture and subaperture processing are implemented with curved and plane wavefront replica vectors. The beamforming results are examined and compared to the ray-theory results produced by the generic sonar model.
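    Among the processors listed in this record, the MVDR (Capon) beamformer has a compact closed form, w = R⁻¹a / (aᴴR⁻¹a). A minimal plane-wave sketch for a uniform line array follows; the array geometry, snapshot model, and diagonal loading are illustrative assumptions, not the memorandum's configuration, which uses normal-mode replica vectors.

```python
import numpy as np

def steering_vector(n_sensors: int, spacing_wl: float, angle_deg: float) -> np.ndarray:
    """Plane-wave replica vector for a uniform line array (spacing in wavelengths)."""
    phase = 2 * np.pi * spacing_wl * np.sin(np.deg2rad(angle_deg))
    return np.exp(1j * phase * np.arange(n_sensors))

# Hypothetical scenario: 128 sensors, half-wavelength spacing, source at +5 degrees
n, d = 128, 0.5
a_true = steering_vector(n, d, 5.0)
rng = np.random.default_rng(0)
snapshots = (a_true[:, None] * rng.normal(size=(1, 200))                  # source
             + 0.1 * (rng.normal(size=(n, 200)) + 1j * rng.normal(size=(n, 200))))
R = snapshots @ snapshots.conj().T / 200 + 1e-3 * np.eye(n)               # diagonal loading

# Capon spatial spectrum P(theta) = 1 / (a^H R^{-1} a)
scan = np.linspace(-30.0, 30.0, 241)
power = []
for th in scan:
    a = steering_vector(n, d, th)
    power.append(1.0 / np.real(a.conj() @ np.linalg.solve(R, a)))
print(f"Capon spectrum peak at {scan[int(np.argmax(power))]:.2f} degrees")
```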

  19. Individual pore and interconnection size analysis of macroporous ceramic scaffolds using high-resolution X-ray tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jerban, Saeed, E-mail: saeed.jerban@usherbrooke.ca

    2016-08-15

    The pore interconnection size of β-tricalcium phosphate scaffolds plays an essential role in the bone repair process. Although the μCT technique is widely used in the biomaterial community, it is rarely used to measure the interconnection size because of the lack of algorithms. In addition, the discrete nature of μCT introduces large systematic errors due to the convex geometry of interconnections. We proposed, verified and validated a novel pore-level algorithm to accurately characterize individual pores and interconnections. Specifically, pores and interconnections were isolated, labeled, and individually analyzed with high accuracy. The technique was verified thoroughly by visually inspecting and verifying over 3474 properties of randomly selected pores. This extensive verification process passed a one-percent accuracy criterion. Scanning errors inherent in the discretization, which lead to both dummy and significantly overestimated interconnections, were examined using computer-based simulations and additional high-resolution scanning. Accurate correction charts were then developed and used to reduce the scanning errors. Only after these corrections did the μCT- and SEM-based results converge, and the novel algorithm was validated. Using the novel algorithm, materials scientists with access to all geometrical properties of individual pores and interconnections will have a more detailed and accurate description of the substitute architecture and a potentially deeper understanding of the link between scaffold geometry and biological interaction. - Highlights: •An algorithm is developed to analyze all pores and interconnections individually. •After pore isolation, the discretization errors in interconnections were corrected. •Dummy interconnections and overestimated sizes were due to thin material walls. •The isolating algorithm was verified through visual inspection (99% accurate). •After correcting for the systematic errors, the algorithm was validated successfully.
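    A hedged sketch of the connected-component labeling step that any pore-level analysis of a binary micro-CT volume builds on; the volume below is synthetic and the voxel size is assumed. The paper's actual algorithm goes further (isolating individual touching pores and measuring interconnection sizes, with corrections for discretization errors), which this fragment does not attempt to reproduce.

```python
import numpy as np
from scipy import ndimage

# Hypothetical binary micro-CT volume: True = pore space, False = material.
rng = np.random.default_rng(0)
volume = ndimage.binary_opening(rng.random((64, 64, 64)) > 0.6)

# Label connected pore regions (26-connectivity) and count voxels per region.
structure = np.ones((3, 3, 3), dtype=bool)
labels, n_pores = ndimage.label(volume, structure=structure)
sizes = np.bincount(labels.ravel())[1:]          # drop background label 0

voxel_mm = 0.01                                   # assumed 10 um voxel edge length
print(f"{n_pores} connected pore regions; median volume "
      f"{np.median(sizes) * voxel_mm**3:.3e} mm^3")
```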

  20. EPA requirements and programs

    NASA Technical Reports Server (NTRS)

    Koutsandreas, J. D.

    1975-01-01

    The proposed ERTS-DCS system is designed to give EPA the capability to evaluate, through demonstrable hardware, the effectiveness of automated data collection techniques. The total effectiveness of any system is dependent upon many factors, which include equipment cost, installation, maintainability, logistic support, growth potential, flexibility and failure rate. This can best be accomplished by installing the system at an operational environmental control agency (CAMP station) to ensure that valid data are being obtained and processed. Consequently, it is imperative that the equipment interface not compromise the validity of the sensor data, nor should the experimental system affect the present operations of the CAMP station. Since both the system presently in use and the automatic system would be in operation in parallel, confirmation and comparison are readily obtained.

  1. Hyper-resolution monitoring of urban flooding with social media and crowdsourcing data

    NASA Astrophysics Data System (ADS)

    Wang, Ruo-Qian; Mao, Huina; Wang, Yuan; Rae, Chris; Shaw, Wesley

    2018-02-01

    Hyper-resolution datasets for urban flooding are rare. This problem prevents detailed flooding risk analysis, urban flooding control, and the validation of hyper-resolution numerical models. We employed social media and crowdsourcing data to address this issue. Natural Language Processing and Computer Vision techniques are applied to the data collected from Twitter and MyCoast (a crowdsourcing app). We found that these big-data-based flood monitoring approaches can complement the existing means of flood data collection. The extracted information is validated against precipitation data and road closure reports to examine the data quality. The two data collection approaches are compared and the two data mining methods are discussed. A series of suggestions is given to improve the data collection strategy.

  2. Control Performance, Aerodynamic Modeling, and Validation of Coupled Simulation Techniques for Guided Projectile Roll Dynamics

    DTIC Science & Technology

    2014-11-01

    Indexing fragments only: roll control of guided projectiles has been explored in depth in the literature and is of particular interest for this study. Report: Control Performance, Aerodynamic Modeling, and Validation of Coupled Simulation Techniques for Guided Projectile Roll Dynamics, by Jubaraj Sahu, Frank Fresconi, and Karen R. Heavey, Weapons and Materials Research.

  3. Near-infrared spectral image analysis of pork marbling based on Gabor filter and wide line detector techniques.

    PubMed

    Huang, Hui; Liu, Li; Ngadi, Michael O; Gariépy, Claude; Prasher, Shiv O

    2014-01-01

    Marbling is an important quality attribute of pork. Detection of pork marbling usually involves subjective scoring, which raises efficiency costs for the processor. In this study, the ability to predict pork marbling using near-infrared (NIR) hyperspectral imaging (900-1700 nm) and the proper image processing techniques was studied. Near-infrared images were collected from pork after marbling evaluation according to the current standard chart from the National Pork Producers Council. Image analysis techniques (Gabor filter, wide line detector, and spectral averaging) were applied to extract texture, line, and spectral features, respectively, from the NIR images of pork. Samples were grouped into calibration and validation sets. Wavelength selection was performed on the calibration set by a stepwise regression procedure. Prediction models of pork marbling scores were built using multiple linear regression based on derivatives of mean spectra and line features at key wavelengths. The results showed that the derivatives of both texture and spectral features produced good results, with correlation coefficients of validation of 0.90 and 0.86, respectively, using wavelengths of 961, 1186, and 1220 nm. The results revealed the great potential of the Gabor filter for analyzing NIR images of pork for the effective and efficient objective evaluation of pork marbling.
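    A minimal sketch of a Gabor-filter texture-feature stage followed by a linear regression onto marbling scores, assuming hypothetical single-band images and scores; the study's filter bank, wavelength selection, and stepwise regression are not reproduced.

```python
import numpy as np
from skimage.filters import gabor
from sklearn.linear_model import LinearRegression

def gabor_texture_features(image: np.ndarray) -> np.ndarray:
    """Mean Gabor magnitude over a small bank of frequencies and orientations."""
    feats = []
    for frequency in (0.05, 0.15, 0.25):                     # assumed filter bank
        for theta in (0, np.pi / 4, np.pi / 2, 3 * np.pi / 4):
            real, imag = gabor(image, frequency=frequency, theta=theta)
            feats.append(np.hypot(real, imag).mean())        # filter response energy
    return np.array(feats)

# Hypothetical data: 20 single-wavelength NIR images and manual marbling scores
rng = np.random.default_rng(0)
images = rng.random((20, 64, 64))
scores = rng.uniform(1.0, 10.0, size=20)

X = np.stack([gabor_texture_features(img) for img in images])
model = LinearRegression().fit(X, scores)
print("R^2 on training data:", round(model.score(X, scores), 2))
```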

  4. Proof of Concept for an Ultrasensitive Technique to Detect and Localize Sources of Elastic Nonlinearity Using Phononic Crystals.

    PubMed

    Miniaci, M; Gliozzi, A S; Morvan, B; Krushynska, A; Bosia, F; Scalerandi, M; Pugno, N M

    2017-05-26

    The appearance of nonlinear effects in elastic wave propagation is one of the most reliable and sensitive indicators of the onset of material damage. However, these effects are usually very small and can be detected only using cumbersome digital signal processing techniques. Here, we propose and experimentally validate an alternative approach, using the filtering and focusing properties of phononic crystals to naturally select and reflect the higher harmonics generated by nonlinear effects, enabling the realization of time-reversal procedures for nonlinear elastic source detection. The proposed device demonstrates its potential as an efficient, compact, portable, passive apparatus for nonlinear elastic wave sensing and damage detection.

  5. Image encryption using a synchronous permutation-diffusion technique

    NASA Astrophysics Data System (ADS)

    Enayatifar, Rasul; Abdullah, Abdul Hanan; Isnin, Ismail Fauzi; Altameem, Ayman; Lee, Malrey

    2017-03-01

    In the past decade, interest in digital image security has increased among scientists. A synchronous permutation and diffusion technique is designed in order to protect gray-level image content while sending it over the internet. To implement the proposed method, the two-dimensional plain image is converted to one dimension. Afterward, in order to reduce the overall processing time, the permutation and diffusion steps for each pixel are performed at the same time. The permutation step uses a chaotic map and deoxyribonucleic acid (DNA) encoding to permute a pixel, while diffusion employs a DNA sequence and DNA operators to encrypt the pixel. Experimental results and extensive security analyses have been conducted to demonstrate the feasibility and validity of the proposed image encryption method.
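    A simplified permutation-diffusion sketch in the spirit of this record, using a logistic chaotic map to drive the permutation and an XOR chain for the diffusion; the DNA encoding and the simultaneous per-pixel processing of the paper are not reproduced, and the key parameters (x0, r) are illustrative.

```python
import numpy as np

def logistic_sequence(n: int, x0: float = 0.3173, r: float = 3.99) -> np.ndarray:
    """Chaotic logistic-map sequence used to drive permutation and diffusion."""
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1.0 - x[i - 1])
    return x

def encrypt(image: np.ndarray, x0: float = 0.3173) -> np.ndarray:
    flat = image.ravel().astype(np.uint8)
    chaos = logistic_sequence(2 * flat.size, x0)
    perm = np.argsort(chaos[:flat.size])              # permutation step
    keystream = (chaos[flat.size:] * 255).astype(np.uint8)
    permuted = flat[perm]
    cipher = np.empty_like(permuted)
    prev = np.uint8(0)
    for i, p in enumerate(permuted):                  # diffusion step (XOR chain)
        cipher[i] = p ^ keystream[i] ^ prev
        prev = cipher[i]
    return cipher.reshape(image.shape)

img = np.arange(256, dtype=np.uint8).reshape(16, 16)  # hypothetical gray-level image
print(encrypt(img)[:2])                               # first two cipher rows
```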

  6. Automation is an Effective Way to Improve Quality of Verification (Calibration) of Measuring Instruments

    NASA Astrophysics Data System (ADS)

    Golobokov, M.; Danilevich, S.

    2018-04-01

    In order to assess calibration reliability and to automate such assessment, procedures for data collection and a simulation study of a thermal imager calibration procedure have been elaborated. The existing calibration techniques do not always provide high reliability. A new method for analyzing the existing calibration techniques and developing new, efficient ones has been suggested and tested. A type of software has been studied that allows instrument calibration reports to be generated automatically, their proper configuration to be monitored, measurement results to be processed, and instrument validity to be assessed. The use of such software reduces the man-hours spent on finalizing calibration data by a factor of 2 to 5 and eliminates a whole set of typical operator errors.

  7. Exploitation of molecular profiling techniques for GM food safety assessment.

    PubMed

    Kuiper, Harry A; Kok, Esther J; Engel, Karl-Heinz

    2003-04-01

    Several strategies have been developed to identify unintended alterations in the composition of genetically modified (GM) food crops that may occur as a result of the genetic modification process. These include comparative chemical analysis of single compounds in GM food crops and their conventional non-GM counterparts, and profiling methods such as DNA/RNA microarray technologies, proteomics and metabolite profiling. The potential of profiling methods is obvious, but further exploration of specificity, sensitivity and validation is needed. Moreover, the successful application of profiling techniques to the safety evaluation of GM foods will require linked databases to be built that contain information on variations in profiles associated with differences in developmental stages and environmental conditions.

  8. [Validation of plasma creatinine measurement on UniCel DxC 600 according to LAB GTA 04 recommendation].

    PubMed

    Chianea, Denis; Renard, Christophe; Garcia, Carine; Mbongo, Elvire; Monpeurt, Corine; Vest, Philippe

    2010-01-01

    The accreditation process, according to NF EN ISO 15189, implies a prior on-site evaluation of the new reagent for the implementation of each new assay technique. Thus, our new standardized method for the determination of creatinine (non-compensated method) in plasma or serum on the UniCel DxC 600 (Beckman Coulter) has been tested according to the LAB GTA 04 protocol. The reagent meets the quality criteria recommended by the Valtec protocol, except for fidelity with the low-concentration standard (50 micromol/L). In addition, there is no problem of result transferability with the two other techniques used in the laboratory (compensated Jaffe and enzymatic methods performed on the Cobas Integra 800).

  9. Modelling low velocity impact induced damage in composite laminates

    NASA Astrophysics Data System (ADS)

    Shi, Yu; Soutis, Constantinos

    2017-12-01

    The paper presents recent progress on modelling low velocity impact induced damage in fibre reinforced composite laminates. It is important to understand the mechanisms of barely visible impact damage (BVID) and how it affects structural performance. To reduce labour intensive testing, the development of finite element (FE) techniques for simulating impact damage becomes essential, and recent effort by the composites research community is reviewed in this work. The FE-predicted damage initiation and propagation can be validated by Non Destructive Techniques (NDT), which gives confidence in the developed numerical damage models. A reliable damage simulation can assist the design process to optimise laminate configurations, reduce weight and improve the performance of components and structures used in aircraft construction.

  10. Recommendations for accreditation of laboratories in molecular biology of hematologic malignancies.

    PubMed

    Flandrin-Gresta, Pascale; Cornillet, Pascale; Hayette, Sandrine; Gachard, Nathalie; Tondeur, Sylvie; Mauté, Carole; Cayuela, Jean-Michel

    2015-01-01

    Over recent years, the development of molecular biology techniques has improved the diagnosis and follow-up of hematological diseases. Consequently, these techniques are largely used in the biological screening of these diseases; therefore, hemato-oncology molecular diagnostics laboratories must be actively involved in the accreditation process according to the ISO 15189 standard. The French group of molecular biologists (GBMHM) provides requirements for the implementation of quality assurance in medical molecular laboratories. This guideline states the recommendations for the pre-analytical, analytical (method validation procedures, quality controls, reagents), and post-analytical conditions. In addition, we state herein a strategy for internal quality control management. These recommendations will be updated regularly.

  11. Arduino-based noise robust online heart-rate detection.

    PubMed

    Das, Sangita; Pal, Saurabh; Mitra, Madhuchhanda

    2017-04-01

    This paper introduces a noise-robust, real-time heart rate detection system based on electrocardiogram (ECG) data. An online data acquisition system is developed to collect ECG signals from human subjects. Heart rate is detected using a window-based autocorrelation peak localisation technique. A low-cost Arduino UNO board is used to implement the complete automated process. The performance of the system is compared with a PC-based heart rate detection technique. The accuracy of the system is validated using simulated noisy ECG data with various levels of signal-to-noise ratio (SNR). The mean percentage error of the detected heart rate is found to be 0.72% for the noisy database with five different noise levels.
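    A minimal sketch of window-based autocorrelation peak localisation for heart-rate estimation, assuming a synthetic impulse-train "ECG"; the sampling rate, window length, and physiological search range are illustrative choices, not the paper's Arduino implementation.

```python
import numpy as np

def heart_rate_bpm(ecg: np.ndarray, fs: float, window_s: float = 8.0) -> float:
    """Estimate heart rate from the dominant autocorrelation peak of one window."""
    n = int(window_s * fs)
    x = ecg[:n] - np.mean(ecg[:n])
    ac = np.correlate(x, x, mode="full")[n - 1:]      # lags 0 .. n-1
    # Search for the dominant peak within a physiological lag range (40-200 bpm)
    lo, hi = int(fs * 60 / 200), int(fs * 60 / 40)
    lag = lo + int(np.argmax(ac[lo:hi]))
    return 60.0 * fs / lag

# Hypothetical test signal: 72 bpm train of sharp "R waves" plus noise, fs = 250 Hz
fs, bpm = 250.0, 72.0
t = np.arange(0, 10, 1 / fs)
ecg = 0.05 * np.random.default_rng(0).normal(size=t.size)
beat_idx = np.round(np.arange(0.2, 9.8, 60 / bpm) * fs).astype(int)
ecg[beat_idx] += 1.0
print(f"estimated heart rate: {heart_rate_bpm(ecg, fs):.1f} bpm")
```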

  12. In-line UV spectroscopy for the quantification of low-dose active ingredients during the manufacturing of pharmaceutical semi-solid and liquid formulations.

    PubMed

    Bostijn, N; Hellings, M; Van Der Veen, M; Vervaet, C; De Beer, T

    2018-07-12

    Ultraviolet (UV) spectroscopy was evaluated as an innovative Process Analytical Technology (PAT) tool for the in-line and real-time quantitative determination of low-dosed active pharmaceutical ingredients (APIs) in a semi-solid (gel) and a liquid (suspension) pharmaceutical formulation during their batch production process. The performance of this new PAT tool (i.e., UV spectroscopy) was compared with an already more established PAT method based on Raman spectroscopy. In-line UV measurements were carried out with an immersion probe, while a non-contact PhAT probe was used for the Raman measurements. For both studied formulations, an in-line API quantification model was developed and validated per spectroscopic technique. The known API concentrations (Y) were correlated with the corresponding in-line collected preprocessed spectra (X) through a Partial Least Squares (PLS) regression. Each developed quantification method was validated by calculating the accuracy profile on the basis of the validation experiments. Furthermore, the measurement uncertainty was determined based on the data generated for the determination of the accuracy profiles. From the accuracy profiles of the UV- and Raman-based quantification methods for the gel, it was concluded that at the target API concentration of 2% (w/w), 95 out of 100 future routine measurements given by the Raman method will not deviate more than 10% (relative error) from the true API concentration, whereas for the UV method the acceptance limits of 10% were exceeded. For the liquid formulation, the Raman method was not able to quantify the API in the low-dosed suspension (0.09% (w/w) API). In contrast, the in-line UV method was able to adequately quantify the API in the suspension. This study demonstrated that UV spectroscopy can be adopted as a novel in-line PAT technique for low-dose quantification purposes in pharmaceutical processes. Importantly, neither of the two spectroscopic techniques was superior to the other for both formulations: the Raman method was more accurate in quantifying the API in the gel (2% (w/w) API), while the UV method performed better for API quantification in the suspension (0.09% (w/w) API). Copyright © 2018 Elsevier B.V. All rights reserved.
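    The calibration step described here (regressing known API concentrations on preprocessed in-line spectra by PLS) can be sketched as follows; the synthetic Beer-Lambert-like spectra, noise level, and number of latent variables are assumptions for illustration only, not the study's actual calibration data.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Hypothetical calibration set: 60 in-line spectra (300 wavelengths) recorded
# at known API concentrations around a 2% (w/w) target.
rng = np.random.default_rng(0)
concentrations = rng.uniform(1.6, 2.4, size=60)                 # % w/w
pure_component = np.exp(-0.5 * ((np.arange(300) - 150) / 30) ** 2)
spectra = (concentrations[:, None] * pure_component
           + 0.02 * rng.normal(size=(60, 300)))                 # Beer-Lambert-like

pls = PLSRegression(n_components=3)
predicted = cross_val_predict(pls, spectra, concentrations, cv=10).ravel()
rmsecv = np.sqrt(np.mean((predicted - concentrations) ** 2))
print(f"RMSECV = {rmsecv:.3f} % w/w")
# An accuracy-profile validation would then compare prediction errors against
# the +/-10% acceptance limits used in the study.
```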

  13. Multi-rendezvous low-thrust trajectory optimization using costate transforming and homotopic approach

    NASA Astrophysics Data System (ADS)

    Chen, Shiyu; Li, Haiyang; Baoyin, Hexi

    2018-06-01

    This paper investigates a method for optimizing multi-rendezvous low-thrust trajectories using indirect methods. An efficient technique, labeled costate transforming, is proposed to optimize multiple trajectory legs simultaneously rather than optimizing each trajectory leg individually. Complex inner-point constraints and a large number of free variables are one main challenge in optimizing multi-leg transfers via shooting algorithms. This difficulty is reduced by first optimizing each trajectory leg individually. The results may then be utilized as an initial guess in the simultaneous optimization of multiple trajectory legs. In this paper, the limitations of similar techniques in previous research are surpassed, and a homotopic approach is employed to improve the convergence efficiency of the shooting process in multi-rendezvous low-thrust trajectory optimization. Numerical examples demonstrate that the newly introduced techniques are valid and efficient.

  14. Do two machine-learning based prognostic signatures for breast cancer capture the same biological processes?

    PubMed

    Drier, Yotam; Domany, Eytan

    2011-03-14

    The fact that there is very little if any overlap between the genes of different prognostic signatures for early-discovery breast cancer is well documented. The reasons for this apparent discrepancy have been attributed to the limits of simple machine-learning identification and ranking techniques, and the biological relevance and meaning of the prognostic gene lists were questioned. Subsequently, proponents of the prognostic gene lists claimed that different lists do capture similar underlying biological processes and pathways. The present study places under scrutiny the validity of this claim for two important gene lists that are at the focus of current large-scale validation efforts. We performed careful enrichment analysis, controlling the effects of multiple testing in a manner that takes into account the nested, dependent structure of gene ontologies. In contradiction to several previous publications, we find that the only biological process or pathway for which statistically significant concordance can be claimed is cell proliferation, a process whose relevance and prognostic value was well known long before gene expression profiling. We found that the claims reported by others, of wider concordance between the biological processes captured by the two prognostic signatures studied, either lacked statistical rigor or were in fact based on addressing some other question.
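    A hedged sketch of a basic enrichment test (one-sided hypergeometric p-values with a Benjamini-Hochberg correction); note that this plain FDR correction does not implement the ontology-aware, nested-structure-conscious procedure the authors describe, and all gene counts below are hypothetical.

```python
from scipy.stats import hypergeom
from statsmodels.stats.multitest import multipletests

def enrichment_pvalue(n_genome: int, n_in_category: int,
                      n_signature: int, n_overlap: int) -> float:
    """One-sided hypergeometric test: P(overlap >= observed)."""
    return hypergeom.sf(n_overlap - 1, n_genome, n_in_category, n_signature)

# Hypothetical counts for a few GO categories tested against one signature:
# (genome size, genes in category, signature size, overlap)
categories = {
    "cell proliferation": (20000, 1200, 70, 18),
    "immune response":    (20000,  900, 70,  5),
    "lipid metabolism":   (20000,  600, 70,  3),
}
pvals = [enrichment_pvalue(*counts) for counts in categories.values()]
reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
for name, p, padj, sig in zip(categories, pvals, p_adj, reject):
    print(f"{name:20s} p={p:.2e}  adj={padj:.2e}  significant={sig}")
```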

  15. Online monitoring of fermentation processes via non-invasive low-field NMR.

    PubMed

    Kreyenschulte, Dirk; Paciok, Eva; Regestein, Lars; Blümich, Bernhard; Büchs, Jochen

    2015-09-01

    For the development of biotechnological processes in academia as well as in industry, new techniques are required that enable online monitoring for process characterization and control. Nuclear magnetic resonance (NMR) spectroscopy is a promising analytical tool, which has already found broad application in offline process analysis. Its use for online monitoring, however, is often constrained by the high complexity of custom-made NMR bioreactors and the considerable cost of high-field NMR instruments (>US$200,000). Therefore, low-field ¹H NMR was investigated in this study in a bypass system for real-time observation of fermentation processes. The new technique was validated with two microbial systems. For the yeast Hansenula polymorpha, glycerol consumption could be accurately assessed in spite of the presence of high amounts of complex constituents in the medium. During cultivation of the fungal strain Ustilago maydis, which is accompanied by the formation of several by-products, the concentrations of glucose and itaconic acid and the relative amount of glycolipids could be quantified. While low-field spectra are characterized by reduced spectral resolution compared to high-field NMR, the compact design combined with the high temporal resolution of spectrum acquisition (15 s-8 min) allowed online monitoring of the respective processes. Both applications clearly demonstrate that the investigated technique is well suited for reaction monitoring in opaque media while at the same time being highly robust and chemically specific. It can thus be concluded that low-field NMR spectroscopy has great potential for non-invasive online monitoring of biotechnological processes at research and practical industrial scales. © 2015 Wiley Periodicals, Inc.

  16. Validation of the tool assessment of clinical education (AssCE): A study using Delphi method and clinical experts.

    PubMed

    Löfmark, Anna; Mårtensson, Gunilla

    2017-03-01

    The aim of the present study was to establish the validity of the tool Assessment of Clinical Education (AssCE). The tool is widely used in Sweden and some Nordic countries for assessing nursing students' performance in clinical education. It is important that the tools in use be subjected to regular audit and critical reviews. The validation process, performed in two stages, was concluded with a high level of congruence. In the first stage, Delphi technique was used to elaborate the AssCE tool using a group of 35 clinical nurse lecturers. After three rounds, we reached consensus. In the second stage, a group of 46 clinical nurse lecturers representing 12 universities in Sweden and Norway audited the revised version of the AssCE in relation to learning outcomes from the last clinical course at their respective institutions. Validation of the revised AssCE was established with high congruence between the factors in the AssCE and examined learning outcomes. The revised AssCE tool seems to meet its objective to be a validated assessment tool for use in clinical nursing education. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Aeroservoelastic Model Validation and Test Data Analysis of the F/A-18 Active Aeroelastic Wing

    NASA Technical Reports Server (NTRS)

    Brenner, Martin J.; Prazenica, Richard J.

    2003-01-01

    Model validation and flight test data analysis require careful consideration of the effects of uncertainty, noise, and nonlinearity. Uncertainty prevails in the data analysis techniques and results in a composite model uncertainty from unmodeled dynamics, assumptions and mechanics of the estimation procedures, noise, and nonlinearity. A fundamental requirement for reliable and robust model development is an attempt to account for each of these sources of error, in particular, for model validation, robust stability prediction, and flight control system development. This paper is concerned with data processing procedures for uncertainty reduction in model validation for stability estimation and nonlinear identification. F/A-18 Active Aeroelastic Wing (AAW) aircraft data is used to demonstrate signal representation effects on uncertain model development, stability estimation, and nonlinear identification. Data is decomposed using adaptive orthonormal best-basis and wavelet-basis signal decompositions for signal denoising before being passed to linear and nonlinear identification algorithms. Nonlinear identification from a wavelet-based Volterra kernel procedure is used to extract nonlinear dynamics from aeroelastic responses, and to assist model development and uncertainty reduction for model validation and stability prediction by removing a class of nonlinearity from the uncertainty.
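    A minimal sketch of a wavelet-based denoising step of the kind referred to in this record, using soft thresholding with the universal threshold on a synthetic decaying oscillation; the paper's adaptive best-basis selection and Volterra-kernel identification are not reproduced, and the wavelet and decomposition level are assumed.

```python
import numpy as np
import pywt

def wavelet_denoise(signal: np.ndarray, wavelet: str = "db4", level: int = 5) -> np.ndarray:
    """Soft-threshold wavelet denoising using the universal threshold."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate (MAD of finest detail)
    thresh = sigma * np.sqrt(2.0 * np.log(signal.size))
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: signal.size]

# Hypothetical aeroelastic response: decaying oscillation buried in sensor noise
t = np.linspace(0, 4, 2048)
clean = np.exp(-0.5 * t) * np.sin(2 * np.pi * 6 * t)
noisy = clean + 0.3 * np.random.default_rng(0).normal(size=t.size)
denoised = wavelet_denoise(noisy)
print(f"residual std before: {np.std(noisy - clean):.3f}, after: {np.std(denoised - clean):.3f}")
```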

  18. The Myotonometer: Not a Valid Measurement Tool for Active Hamstring Musculotendinous Stiffness.

    PubMed

    Pamukoff, Derek N; Bell, Sarah E; Ryan, Eric D; Blackburn, J Troy

    2016-05-01

    Hamstring musculotendinous stiffness (MTS) is associated with lower-extremity injury risk (i.e., hamstring strain, anterior cruciate ligament injury) and is commonly assessed using the damped oscillatory technique. However, despite a preponderance of studies that measure MTS reliably in laboratory settings, there are no valid clinical measurement tools. A valid clinical measurement technique is needed to assess MTS, permit identification of individuals at heightened risk of injury, and track rehabilitation progress. To determine the validity and reliability of the Myotonometer for measuring active hamstring MTS. Descriptive laboratory study. Laboratory. 33 healthy participants (15 men, age 21.33 ± 2.94 y, height 172.03 ± 16.36 cm, mass 74.21 ± 16.36 kg). Hamstring MTS was assessed using the damped oscillatory technique and the Myotonometer. Intraclass correlations were used to determine the intrasession, intersession, and interrater reliability of the Myotonometer. Criterion validity was assessed via Pearson product-moment correlation between MTS measures obtained from the Myotonometer and from the damped oscillatory technique. The Myotonometer demonstrated good intrasession (ICC(3,1) = .807) and interrater reliability (ICC(2,k) = .830) and moderate intersession reliability (ICC(2,k) = .693). However, it did not provide a valid measurement of MTS compared with the damped oscillatory technique (r = .346, P = .061). The Myotonometer does not provide a valid measure of active hamstring MTS. Although the Myotonometer does not measure active MTS, it possesses good reliability and portability and could be used clinically to measure tissue compliance, muscle tone, or spasticity associated with multiple musculoskeletal disorders. Future research should focus on portable and clinically applicable tools to measure active hamstring MTS in efforts to prevent and monitor injuries.
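    The reliability and validity statistics quoted in this record can be computed in a few lines; below is a hedged sketch of ICC(3,1) (Shrout and Fleiss two-way mixed, single measures) and a Pearson criterion-validity correlation on hypothetical stiffness data, not the study's measurements.

```python
import numpy as np
from scipy.stats import pearsonr

def icc_3_1(ratings: np.ndarray) -> float:
    """ICC(3,1): two-way mixed effects, consistency, single measurement."""
    n, k = ratings.shape
    grand = ratings.mean()
    ss_rows = k * np.sum((ratings.mean(axis=1) - grand) ** 2)   # between-subjects
    ss_cols = n * np.sum((ratings.mean(axis=0) - grand) ** 2)   # between-trials
    ss_total = np.sum((ratings - grand) ** 2)
    ms_rows = ss_rows / (n - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Hypothetical intrasession data: 10 participants, 3 repeated Myotonometer trials
rng = np.random.default_rng(0)
true_stiffness = rng.normal(300, 40, size=(10, 1))
trials = true_stiffness + rng.normal(0, 15, size=(10, 3))
print(f"ICC(3,1) = {icc_3_1(trials):.3f}")

# Criterion validity: correlate device means with damped-oscillation MTS values
criterion = true_stiffness.ravel() + rng.normal(0, 30, size=10)
r, p = pearsonr(trials.mean(axis=1), criterion)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```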

  19. EVLncRNAs: a manually curated database for long non-coding RNAs validated by low-throughput experiments.

    PubMed

    Zhou, Bailing; Zhao, Huiying; Yu, Jiafeng; Guo, Chengang; Dou, Xianghua; Song, Feng; Hu, Guodong; Cao, Zanxia; Qu, Yuanxu; Yang, Yuedong; Zhou, Yaoqi; Wang, Jihua

    2018-01-04

    Long non-coding RNAs (lncRNAs) play important functional roles in various biological processes. Early databases were utilized to deposit all lncRNA candidates produced by high-throughput experimental and/or computational techniques to facilitate classification, assessment and validation. As more lncRNAs are validated by low-throughput experiments, several databases were established for experimentally validated lncRNAs. However, these databases are small in scale (with a few hundred lncRNAs only) and specific in their focuses (plants, diseases or interactions). Thus, it is highly desirable to have a comprehensive dataset for experimentally validated lncRNAs as a central repository for all of their structures, functions and phenotypes. Here, we established EVLncRNAs by curating lncRNAs validated by low-throughput experiments (up to 1 May 2016) and integrating specific databases (lncRNAdb, LncRNADisease, Lnc2Cancer and PLNIncRBase) with additional functional and disease-specific information not covered previously. The current version of EVLncRNAs contains 1543 lncRNAs from 77 species, which is 2.9 times larger than the current largest database for experimentally validated lncRNAs. Seventy-four percent of the lncRNA entries are partially or completely new compared to all existing experimentally validated databases. The established database allows users to browse, search and download as well as to submit experimentally validated lncRNAs. The database is available at http://biophy.dzu.edu.cn/EVLncRNAs. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  20. EVLncRNAs: a manually curated database for long non-coding RNAs validated by low-throughput experiments

    PubMed Central

    Zhao, Huiying; Yu, Jiafeng; Guo, Chengang; Dou, Xianghua; Song, Feng; Hu, Guodong; Cao, Zanxia; Qu, Yuanxu

    2018-01-01

    Abstract Long non-coding RNAs (lncRNAs) play important functional roles in various biological processes. Early databases were utilized to deposit all lncRNA candidates produced by high-throughput experimental and/or computational techniques to facilitate classification, assessment and validation. As more lncRNAs are validated by low-throughput experiments, several databases were established for experimentally validated lncRNAs. However, these databases are small in scale (with a few hundred lncRNAs only) and specific in their focuses (plants, diseases or interactions). Thus, it is highly desirable to have a comprehensive dataset for experimentally validated lncRNAs as a central repository for all of their structures, functions and phenotypes. Here, we established EVLncRNAs by curating lncRNAs validated by low-throughput experiments (up to 1 May 2016) and integrating specific databases (lncRNAdb, LncRNADisease, Lnc2Cancer and PLNIncRBase) with additional functional and disease-specific information not covered previously. The current version of EVLncRNAs contains 1543 lncRNAs from 77 species, which is 2.9 times larger than the current largest database for experimentally validated lncRNAs. Seventy-four percent of the lncRNA entries are partially or completely new compared to all existing experimentally validated databases. The established database allows users to browse, search and download as well as to submit experimentally validated lncRNAs. The database is available at http://biophy.dzu.edu.cn/EVLncRNAs. PMID:28985416

  1. Verification and Validation of Residual Stresses in Bi-Material Composite Rings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, Stacy Michelle; Hanson, Alexander Anthony; Briggs, Timothy

    Process-induced residual stresses commonly occur in composite structures composed of dissimilar materials. These residual stresses form due to differences in the composite materials' coefficients of thermal expansion and the shrinkage upon cure exhibited by polymer matrix materials. Depending upon the specific geometric details of the composite structure and the materials' curing parameters, it is possible that these residual stresses could result in interlaminar delamination or fracture within the composite. Therefore, the consideration of potential residual stresses is important when designing composite parts and their manufacturing processes. However, the experimental determination of residual stresses in prototype parts can be time and cost prohibitive. As an alternative to physical measurement, it is possible for computational tools to be used to quantify potential residual stresses in composite prototype parts. Therefore, the objectives of the presented work are to demonstrate a simplistic method for simulating residual stresses in composite parts, as well as the potential value of sensitivity and uncertainty quantification techniques during analyses for which material property parameters are unknown. Specifically, a simplified residual stress modeling approach, which accounts for coefficient of thermal expansion mismatch and polymer shrinkage, is implemented within the Sandia National Laboratories-developed SIERRA/SolidMechanics code. Concurrent with the model development, two simple bi-material structures composed of a carbon fiber/epoxy composite and aluminum, a flat plate and a cylinder, are fabricated and the residual stresses are quantified through the measurement of deformation. Then, in the process of validating the developed modeling approach with the experimental residual stress data, manufacturing process simulations of the two simple structures are developed and undergo a formal verification and validation process, including a mesh convergence study, sensitivity analysis, and uncertainty quantification. The simulations' final results show adequate agreement with the experimental measurements, indicating the validity of the simple modeling approach, as well as the necessity of including material parameter uncertainty in the final residual stress predictions.

  2. MO-B-BRB-03: 3D Dosimetry in the Clinic: Validating Special Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Juang, T.

    Full three-dimensional (3D) dosimetry using volumetric chemical dosimeters probed by 3D imaging systems has long been a promising technique for the radiation therapy clinic, since it provides a unique methodology for dose measurements in the volume irradiated using complex conformal delivery techniques such as IMRT and VMAT. To date, true 3D dosimetry is still not widely practiced in the community; it has been confined to centres of specialized expertise, especially for quality assurance or commissioning roles where other dosimetry techniques are difficult to implement. The potential for improved clinical applicability has been advanced considerably in the last decade by the development of improved 3D dosimeters (e.g., radiochromic plastics, radiochromic gel dosimeters and normoxic polymer gel systems) and by improved readout protocols using optical computed tomography or magnetic resonance imaging. In this session, established users of some current 3D chemical dosimeters will briefly review the current status of 3D dosimetry, describe several dosimeters and their appropriate imaging for dose readout, present workflow procedures required for good dosimetry, and analyze some limitations for applications in select settings. We will review the application of 3D dosimetry to various clinical situations, describing how 3D approaches can complement other dose delivery validation approaches already available in the clinic. The applications presented will be selected to inform attendees of the unique features provided by full 3D techniques. Learning Objectives: L. John Schreiner (Background and Motivation): understand recent developments enabling clinically practical 3D dosimetry; appreciate 3D dosimetry workflow and dosimetry procedures; and observe select examples from the clinic. Sofie Ceberg (Application to dynamic radiotherapy): observe full dosimetry under dynamic radiotherapy during respiratory motion, and understand how the measurement of high resolution dose data in an irradiated volume can help understand interplay effects during TomoTherapy or VMAT. Titania Juang (Special techniques in the clinic and research): understand the potential for 3D dosimetry in validating dose accumulation in deformable systems, and observe the benefits of high resolution measurements for precision therapy in SRS and in MicroSBRT for small animal irradiators. Geoffrey S. Ibbott (3D Dosimetry in end-to-end dosimetry QA): understand the potential for 3D dosimetry for end-to-end radiation therapy process validation in the in-house and external credentialing setting. Funding and disclosures: Canadian Institutes of Health Research; L. Schreiner, Modus QA, London, ON, Canada; T. Juang, NIH R01CA100835.

  3. MO-B-BRB-02: 3D Dosimetry in the Clinic: IMRT Technique Validation in Sweden

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ceberg, S.

    Full three-dimensional (3D) dosimetry using volumetric chemical dosimeters probed by 3D imaging systems has long been a promising technique for the radiation therapy clinic, since it provides a unique methodology for dose measurements in the volume irradiated using complex conformal delivery techniques such as IMRT and VMAT. To date, true 3D dosimetry is still not widely practiced in the community; it has been confined to centres of specialized expertise, especially for quality assurance or commissioning roles where other dosimetry techniques are difficult to implement. The potential for improved clinical applicability has been advanced considerably in the last decade by the development of improved 3D dosimeters (e.g., radiochromic plastics, radiochromic gel dosimeters and normoxic polymer gel systems) and by improved readout protocols using optical computed tomography or magnetic resonance imaging. In this session, established users of some current 3D chemical dosimeters will briefly review the current status of 3D dosimetry, describe several dosimeters and their appropriate imaging for dose readout, present workflow procedures required for good dosimetry, and analyze some limitations for applications in select settings. We will review the application of 3D dosimetry to various clinical situations, describing how 3D approaches can complement other dose delivery validation approaches already available in the clinic. The applications presented will be selected to inform attendees of the unique features provided by full 3D techniques. Learning Objectives: L. John Schreiner (Background and Motivation): understand recent developments enabling clinically practical 3D dosimetry; appreciate 3D dosimetry workflow and dosimetry procedures; and observe select examples from the clinic. Sofie Ceberg (Application to dynamic radiotherapy): observe full dosimetry under dynamic radiotherapy during respiratory motion, and understand how the measurement of high resolution dose data in an irradiated volume can help understand interplay effects during TomoTherapy or VMAT. Titania Juang (Special techniques in the clinic and research): understand the potential for 3D dosimetry in validating dose accumulation in deformable systems, and observe the benefits of high resolution measurements for precision therapy in SRS and in MicroSBRT for small animal irradiators. Geoffrey S. Ibbott (3D Dosimetry in end-to-end dosimetry QA): understand the potential for 3D dosimetry for end-to-end radiation therapy process validation in the in-house and external credentialing setting. Funding and disclosures: Canadian Institutes of Health Research; L. Schreiner, Modus QA, London, ON, Canada; T. Juang, NIH R01CA100835.

  4. A comparison of computer-assisted and manual wound size measurement.

    PubMed

    Thawer, Habiba A; Houghton, Pamela E; Woodbury, M Gail; Keast, David; Campbell, Karen

    2002-10-01

    Accurate and precise wound measurements are a critical component of every wound assessment. To examine the reliability and validity of a new computerized technique for measuring human and animal wounds, chronic human wounds (N = 45) and surgical animal wounds (N = 38) were assessed using manual and computerized techniques. Using intraclass correlation coefficients, intrarater and interrater reliability of surface area measurements obtained using the computerized technique were compared to those obtained using acetate tracings and planimetry. A single measurement of surface area using either technique produced excellent intrarater and interrater reliability for both human and animal wounds, but the computerized technique was more precise than the manual technique for measuring the surface area of animal wounds. For both types of wounds and measurement techniques, intrarater and interrater reliability improved when the average of three repeated measurements was obtained. The precision of each technique with human wounds and the precision of the manual technique with animal wounds also improved when three repeated measurement results were averaged. Concurrent validity between the two techniques was excellent for human wounds but poor for the smaller animal wounds, regardless of whether single or the average of three repeated surface area measurements was used. The computerized technique permits reliable and valid assessment of the surface area of both human and animal wounds.

  5. Survey of Verification and Validation Techniques for Small Satellite Software Development

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.
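
    The run-time monitoring idea mentioned above can be made concrete with a small sketch: a monitor that checks telemetry against invariants each control cycle and calls a recovery hook on violation. The invariants and thresholds below are hypothetical, not drawn from any specific mission:

        from dataclasses import dataclass
        from typing import Callable, List

        @dataclass
        class Invariant:
            name: str
            check: Callable[[dict], bool]   # returns True when the state is healthy

        class RuntimeMonitor:
            """Checks a telemetry/state dictionary against invariants each cycle
            and triggers a recovery hook when an invariant is violated."""

            def __init__(self, invariants: List[Invariant],
                         on_violation: Callable[[str, dict], None]):
                self.invariants = invariants
                self.on_violation = on_violation

            def step(self, state: dict) -> None:
                for inv in self.invariants:
                    if not inv.check(state):
                        self.on_violation(inv.name, state)

        # Hypothetical invariants for a small-satellite housekeeping loop.
        invariants = [
            Invariant("battery_voltage_in_range", lambda s: 6.0 <= s["batt_v"] <= 8.4),
            Invariant("spin_rate_bounded",        lambda s: abs(s["spin_dps"]) < 5.0),
        ]

        def safe_mode(name: str, state: dict) -> None:
            print(f"violation of '{name}' -> commanding safe mode; state={state}")

        monitor = RuntimeMonitor(invariants, safe_mode)
        monitor.step({"batt_v": 7.4, "spin_dps": 0.3})   # nominal, no action
        monitor.step({"batt_v": 5.2, "spin_dps": 0.3})   # triggers recovery hook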

  6. Electronic Module Design with Scientifically Character-Charged Approach on Kinematics Material Learning to Improve Holistic Competence of High School Students in 10th Grade

    NASA Astrophysics Data System (ADS)

    Anggraini, R.; Darvina, Y.; Amir, H.; Murtiani, M.; Yulkifli, Y.

    2018-04-01

    The availability of learning modules in schools is currently limited, and learners rarely use modules as a resource in the learning process. The 2013 curriculum demands that learning be conducted using a scientific approach, be loaded with character values, and draw on interactive learning resources. The solution proposed here is an interactive module based on a scientific approach and charged with character values. The module can be used by learners inside or outside the classroom, and it covers straight-line motion, parabolic motion and circular motion from the first-semester grade X high school physics syllabus. The purpose of this research is to produce such an interactive module and to determine its validity and practicality. The study follows a Research and Development design and was conducted only up to the validity and practicality tests. The validity test was carried out by three Physics lecturers of FMIPA UNP acting as experts, using validation sheets and worksheets as instruments, and the data were analyzed with a product validity analysis. The object of this research is the electronic module, while the subjects are the three validators.

  7. Validation of the FFM PD count technique for screening personality pathology in later middle-aged and older adults.

    PubMed

    Van den Broeck, Joke; Rossi, Gina; De Clercq, Barbara; Dierckx, Eva; Bastiaansen, Leen

    2013-01-01

    Research on the applicability of the five factor model (FFM) to capture personality pathology coincided with the development of a FFM personality disorder (PD) count technique, which has been validated in adolescent, young, and middle-aged samples. This study extends the literature by validating this technique in an older sample. Five alternative FFM PD counts based upon the Revised NEO Personality Inventory (NEO PI-R) are computed and evaluated in terms of both convergent and divergent validity with the Assessment of DSM-IV Personality Disorders Questionnaire (shortly ADP-IV; DSM-IV, Diagnostic and Statistical Manual of Mental Disorders - Fourth edition). For the best working count for each PD normative data are presented, from which cut-off scores are derived. The validity of these cut-offs and their usefulness as a screening tool is tested against both a categorical (i.e., the DSM-IV - Text Revision), and a dimensional (i.e., the Dimensional Assessment of Personality Pathology; DAPP) measure of personality pathology. All but the Antisocial and Obsessive-Compulsive counts exhibited adequate convergent and divergent validity, supporting the use of this method in older adults. Using the ADP-IV and the DAPP - Short Form as validation criteria, results corroborate the use of the FFM PD count technique to screen for PDs in older adults, in particular for the Paranoid, Borderline, Histrionic, Avoidant, and Dependent PDs. Given the age-neutrality of the NEO PI-R and the considerable lack of valid personality assessment tools, current findings appear to be promising for the assessment of pathology in older adults.
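
    The arithmetic behind an FFM PD count is simply a sum of selected NEO PI-R facet scores compared against a cut-off. A minimal sketch follows; the facet list and the cut-off are illustrative placeholders, not the published prototype facets or the normative cut-offs derived in the study:

        # Minimal sketch of an FFM PD "count" screen: sum the prototype facets for a
        # disorder and flag cases at or above a cut-off. The facet list and cut-off
        # below are illustrative placeholders, NOT the published values.
        BORDERLINE_FACETS = ["anxiety", "angry_hostility", "depression",
                             "impulsiveness", "vulnerability"]
        CUTOFF = 90  # hypothetical screening threshold on summed T-scores

        def pd_count(facet_scores: dict, facets: list) -> int:
            """Sum the T-scores of the facets that define a PD prototype."""
            return sum(facet_scores[f] for f in facets)

        def screens_positive(facet_scores: dict, facets: list, cutoff: int) -> bool:
            return pd_count(facet_scores, facets) >= cutoff

        participant = {"anxiety": 21, "angry_hostility": 18, "depression": 22,
                       "impulsiveness": 17, "vulnerability": 19}
        print(pd_count(participant, BORDERLINE_FACETS),
              screens_positive(participant, BORDERLINE_FACETS, CUTOFF))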

  8. Fabrication and Characterization of Surrogate Glasses Aimed to Validate Nuclear Forensic Techniques

    DTIC Science & Technology

    2017-12-01

    sample is processed while submerged and produces fine sized particles the exposure levels and risk of contamination from the samples is also greatly...induced the partial collapses of the xerogel network strengthened the network while the sample sizes were reduced [22], [26]. As a result the wt...inhomogeneous, making it difficult to clearly determine which features were present in the sample before LDHP and which were caused by it. In this study

  9. Aerosol profiling during the large scale field campaign CINDI-2

    NASA Astrophysics Data System (ADS)

    Apituley, Arnoud; Roozendael, Michel Van; Richter, Andreas; Wagner, Thomas; Friess, Udo; Hendrick, Francois; Kreher, Karin; Tirpitz, Jan-Lukas

    2018-04-01

    For the validation of space-borne observations of NO2 and other trace gases from hyperspectral imagers, ground-based instruments based on the MAXDOAS technique are an excellent choice, since they rely on retrieval techniques similar to those used for the observations from orbit. To ensure proper traceability of the MAXDOAS observations, a thorough validation and intercomparison is mandatory. Advanced MAXDOAS observation and retrieval techniques make it possible to infer the vertical structure of trace gases and aerosols, and these techniques and their results in turn need validation, e.g. by lidar. Properly interpreting the results from passive remote sensing also requires independent observations of the parameters that determine the light paths, i.e. in-situ aerosol measurements of optical and microphysical properties and, in particular, vertical profiles of aerosol optical properties from (Raman) lidar. The approach used in the CINDI-2 campaign held in Cabauw in 2016 is presented in this paper, and the results will be discussed in the conference presentation.

  10. Development of Modal Test Techniques for Validation of a Solar Sail Design

    NASA Technical Reports Server (NTRS)

    Gaspar, James L.; Mann, Troy; Behun, Vaughn; Wilkie, W. Keats; Pappa, Richard

    2004-01-01

    This paper focuses on the development of modal test techniques for validation of a solar sail gossamer space structure design. The major focus is on validating and comparing the capabilities of various excitation techniques for modal testing solar sail components. One triangular shaped quadrant of a solar sail membrane was tested in a 1 Torr vacuum environment using various excitation techniques including, magnetic excitation, and surface-bonded piezoelectric patch actuators. Results from modal tests performed on the sail using piezoelectric patches at different positions are discussed. The excitation methods were evaluated for their applicability to in-vacuum ground testing and to the development of on orbit flight test techniques. The solar sail membrane was tested in the horizontal configuration at various tension levels to assess the variation in frequency with tension in a vacuum environment. A segment of a solar sail mast prototype was also tested in ambient atmospheric conditions using various excitation techniques, and these methods are also assessed for their ground test capabilities and on-orbit flight testing.

  11. The ALADIN System and its canonical model configurations AROME CY41T1 and ALARO CY40T1

    NASA Astrophysics Data System (ADS)

    Termonia, Piet; Fischer, Claude; Bazile, Eric; Bouyssel, François; Brožková, Radmila; Bénard, Pierre; Bochenek, Bogdan; Degrauwe, Daan; Derková, Mariá; El Khatib, Ryad; Hamdi, Rafiq; Mašek, Ján; Pottier, Patricia; Pristov, Neva; Seity, Yann; Smolíková, Petra; Španiel, Oldřich; Tudor, Martina; Wang, Yong; Wittmann, Christoph; Joly, Alain

    2018-01-01

    The ALADIN System is a numerical weather prediction (NWP) system developed by the international ALADIN consortium for operational weather forecasting and research purposes. It is based on a code that is shared with the global model IFS of the ECMWF and the ARPEGE model of Météo-France. Today, this system can be used to provide a multitude of high-resolution limited-area model (LAM) configurations. A few configurations are thoroughly validated and prepared to be used for the operational weather forecasting in the 16 partner institutes of this consortium. These configurations are called the ALADIN canonical model configurations (CMCs). There are currently three CMCs: the ALADIN baseline CMC, the AROME CMC and the ALARO CMC. Other configurations are possible for research, such as process studies and climate simulations. The purpose of this paper is (i) to define the ALADIN System in relation to the global counterparts IFS and ARPEGE, (ii) to explain the notion of the CMCs, (iii) to document their most recent versions, and (iv) to illustrate the process of the validation and the porting of these configurations to the operational forecast suites of the partner institutes of the ALADIN consortium. This paper is restricted to the forecast model only; data assimilation techniques and postprocessing techniques are part of the ALADIN System but they are not discussed here.

  12. Validation of a low field Rheo-NMR instrument and application to shear-induced migration of suspended non-colloidal particles in Couette flow

    NASA Astrophysics Data System (ADS)

    Colbourne, A. A.; Blythe, T. W.; Barua, R.; Lovett, S.; Mitchell, J.; Sederman, A. J.; Gladden, L. F.

    2018-01-01

    Nuclear magnetic resonance rheology (Rheo-NMR) is a valuable tool for studying the transport of suspended non-colloidal particles, important in many commercial processes. The Rheo-NMR imaging technique directly and quantitatively measures fluid displacement as a function of radial position. However, the high field magnets typically used in these experiments are unsuitable for the industrial environment and significantly hinder the measurement of shear stress. We introduce a low field Rheo-NMR instrument (1 H resonance frequency of 10.7MHz), which is portable and suitable as a process monitoring tool. This system is applied to the measurement of steady-state velocity profiles of a Newtonian carrier fluid suspending neutrally-buoyant non-colloidal particles at a range of concentrations. The large particle size (diameter > 200 μm) in the system studied requires a wide-gap Couette geometry and the local rheology was expected to be controlled by shear-induced particle migration. The low-field results are validated against high field Rheo-NMR measurements of consistent samples at matched shear rates. Additionally, it is demonstrated that existing models for particle migration fail to adequately describe the solid volume fractions measured in these systems, highlighting the need for improvement. The low field implementation of Rheo-NMR is complementary to shear stress rheology, such that the two techniques could be combined in a single instrument.

  13. Objective and expert-independent validation of retinal image registration algorithms by a projective imaging distortion model.

    PubMed

    Lee, Sangyeol; Reinhardt, Joseph M; Cattin, Philippe C; Abràmoff, Michael D

    2010-08-01

    Fundus camera imaging of the retina is widely used to diagnose and manage ophthalmologic disorders including diabetic retinopathy, glaucoma, and age-related macular degeneration. Retinal images typically have a limited field of view, and multiple images can be joined together using an image registration technique to form a montage with a larger field of view. A variety of methods for retinal image registration have been proposed, but evaluating such methods objectively is difficult due to the lack of a reference standard for the true alignment of the individual images that make up the montage. A method of generating simulated retinal images by modeling the geometric distortions due to the eye geometry and the image acquisition process is described in this paper. We also present a validation process that can be used for any retinal image registration method by tracing through the distortion path and assessing the geometric misalignment in the coordinate system of the reference standard. The proposed method can be used to perform an accuracy evaluation over the whole image, so that distortion in the non-overlapping regions of the montage components can be easily assessed. We demonstrate the technique by generating test image sets with a variety of overlap conditions and compare the accuracy of several retinal image registration models. Copyright 2010 Elsevier B.V. All rights reserved.
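
    The validation idea above (apply a known distortion as the reference standard, then score the recovered transform by its residual misalignment over the whole image) can be sketched with plain numpy. The transforms and grid below are hypothetical and only illustrate the error metric, not the paper's eye-geometry model:

        import numpy as np

        def apply_homography(H, pts):
            """Map Nx2 points through a 3x3 projective transform."""
            pts_h = np.hstack([pts, np.ones((len(pts), 1))])
            mapped = pts_h @ H.T
            return mapped[:, :2] / mapped[:, 2:3]

        def registration_error(H_true, H_est, pts):
            """RMS geometric misalignment (pixels) between the known simulated
            distortion and the transform recovered by a registration algorithm,
            evaluated over a grid covering the whole image."""
            diff = apply_homography(H_true, pts) - apply_homography(H_est, pts)
            return float(np.sqrt((diff ** 2).sum(axis=1).mean()))

        # Hypothetical ground-truth and estimated transforms.
        H_true = np.array([[1.01, 0.002, 12.0], [-0.003, 0.99, -7.5], [1e-6, 2e-6, 1.0]])
        H_est  = np.array([[1.00, 0.000, 11.4], [ 0.000, 1.00, -7.0], [0.0,  0.0,  1.0]])
        xs, ys = np.meshgrid(np.linspace(0, 2047, 32), np.linspace(0, 2047, 32))
        grid = np.column_stack([xs.ravel(), ys.ravel()])
        print(f"RMS misalignment: {registration_error(H_true, H_est, grid):.2f} px")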

  14. Assessing Requirements Quality through Requirements Coverage

    NASA Technical Reports Server (NTRS)

    Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt

    2008-01-01

    In model-based development, the development effort is centered around a formal description of the proposed software system, the model. This model is derived from some high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. This shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem, that is, determining that the model accurately captures the customer's high-level requirements, has received little attention, and the sufficiency of the validation activities has been largely determined through ad-hoc methods. Since the model serves as the central artifact, its correctness with respect to the users' needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, when given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software requirements and existing model coverage metrics such as the Modified Condition and Decision Coverage (MC/DC) used when testing highly critical software in the avionics industry [8]. Our work is related to Chockler et al. [2], but we base our work on traditional testing techniques as opposed to verification techniques.
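
    A toy Python check of the core MC/DC requirement (for every condition there is a pair of tests that differ only in that condition and flip the decision) may help make the coverage notion concrete; the guard and test vectors below are hypothetical and this is not the metric machinery of [8] or [9]:

        from itertools import combinations

        def mcdc_satisfied(decision, n_conditions, tests):
            """For each condition, report whether the test set contains a pair of
            tests that differ ONLY in that condition and produce different
            decision outcomes (the MC/DC independence requirement)."""
            covered = {}
            for i in range(n_conditions):
                covered[i] = any(
                    all(a[j] == b[j] for j in range(n_conditions) if j != i)
                    and a[i] != b[i]
                    and decision(a) != decision(b)
                    for a, b in combinations(tests, 2)
                )
            return covered

        # Hypothetical guard from a requirements model: (alt_low AND gear_up) OR override
        decision = lambda c: (c[0] and c[1]) or c[2]
        tests = [(True, True, False), (False, True, False),
                 (True, False, False), (True, True, True)]
        # Conditions 0 and 1 are shown to independently affect the outcome;
        # condition 2 (override) is reported as uncovered by this test set.
        print(mcdc_satisfied(decision, 3, tests))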

  15. Final Report 2007: DOE-FG02-87ER60561

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kilbourn, Michael R

    2007-04-26

    This project involved a multi-faceted approach to the improvement of techniques used in Positron Emission Tomography (PET), from radiochemistry to image processing and data analysis. New methods for radiochemical syntheses were examined, new radiochemicals prepared for evaluation and eventual use in human PET studies, and new pre-clinical methods examined for validation of biochemical parameters in animal studies. The value of small animal PET imaging in measuring small changes of in vivo biochemistry was examined and directly compared to traditional tissue sampling techniques. In human imaging studies, the ability to perform single experimental sessions utilizing two overlapping injections of radiopharmaceuticals was tested, and it was shown that valid biochemical measures for both radiotracers can be obtained through careful pharmacokinetic modeling of the PET emission data. Finally, improvements in reconstruction algorithms for PET data from small animal PET scanners were realized, and these have been implemented in commercial releases. Together, the project represented an integrated effort to improve and extend all basic science aspects of PET imaging at both the animal and human level.

  16. A Foot-Mounted Inertial Measurement Unit (IMU) Positioning Algorithm Based on Magnetic Constraint

    PubMed Central

    Zou, Jiaheng

    2018-01-01

    With the development of related applications, indoor positioning techniques have been more and more widely developed. Based on Wi-Fi, Bluetooth low energy (BLE) and geomagnetism, indoor positioning techniques often rely on the physical location of fingerprint information. The focus and difficulty of establishing the fingerprint database are in obtaining a relatively accurate physical location with as little given information as possible. This paper presents a foot-mounted inertial measurement unit (IMU) positioning algorithm under the loop closure constraint based on magnetic information. It can provide relatively reliable position information without maps and geomagnetic information and provides a relatively accurate coordinate for the collection of a fingerprint database. In the experiment, the features extracted by the multi-level Fourier transform method proposed in this paper are validated and the validity of loop closure matching is tested with a RANSAC-based method. Moreover, the loop closure detection results show that the cumulative error of the trajectory processed by the graph optimization algorithm is significantly suppressed, presenting a good accuracy. The average error of the trajectory under loop closure constraint is controlled below 2.15 m. PMID:29494542

  17. A Foot-Mounted Inertial Measurement Unit (IMU) Positioning Algorithm Based on Magnetic Constraint.

    PubMed

    Wang, Yan; Li, Xin; Zou, Jiaheng

    2018-03-01

    With the development of related applications, indoor positioning techniques have been more and more widely developed. Based on Wi-Fi, Bluetooth low energy (BLE) and geomagnetism, indoor positioning techniques often rely on the physical location of fingerprint information. The focus and difficulty of establishing the fingerprint database are in obtaining a relatively accurate physical location with as little given information as possible. This paper presents a foot-mounted inertial measurement unit (IMU) positioning algorithm under the loop closure constraint based on magnetic information. It can provide relatively reliable position information without maps and geomagnetic information and provides a relatively accurate coordinate for the collection of a fingerprint database. In the experiment, the features extracted by the multi-level Fourier transform method proposed in this paper are validated and the validity of loop closure matching is tested with a RANSAC-based method. Moreover, the loop closure detection results show that the cumulative error of the trajectory processed by the graph optimization algorithm is significantly suppressed, presenting a good accuracy. The average error of the trajectory under loop closure constraint is controlled below 2.15 m.
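
    The effect of the loop-closure constraint described above can be illustrated with a deliberately crude stand-in for pose-graph optimization: once a loop closure declares the first and last poses identical, the accumulated drift is spread linearly along the trajectory. The trajectory values are hypothetical and the method is only a sketch, not the paper's RANSAC matching or graph optimizer:

        import numpy as np

        def distribute_loop_closure(traj, start_idx, end_idx):
            """Crude stand-in for pose-graph optimization: when a loop closure says
            poses `start_idx` and `end_idx` are the same physical location, spread
            the accumulated drift linearly over the poses in between."""
            traj = np.asarray(traj, dtype=float).copy()
            drift = traj[end_idx] - traj[start_idx]        # residual that should be ~0
            n = end_idx - start_idx
            for k in range(start_idx, end_idx + 1):
                traj[k] -= drift * (k - start_idx) / n     # proportional correction
            return traj

        # Hypothetical dead-reckoned 2D trajectory (metres) that should close on itself.
        raw = np.array([[0, 0], [1, 0.1], [2, 0.3], [2.1, 1.2],
                        [1.2, 1.4], [0.3, 1.1], [0.4, 0.35]])
        corrected = distribute_loop_closure(raw, start_idx=0, end_idx=len(raw) - 1)
        print("residual before:", np.linalg.norm(raw[-1] - raw[0]))
        print("residual after: ", np.linalg.norm(corrected[-1] - corrected[0]))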

  18. Modified Welding Technique of a Hypo-Eutectic Al-Cu Alloy for Higher Mechanical Properties

    NASA Astrophysics Data System (ADS)

    Ghosh, B. R.; Gupta, R. K.; Biju, S.; Sinha, P. P.

    The GTAW process is used for welding pressure vessels made of the hypo-eutectic Al-Cu alloy AA2219, which contains 6.3% Cu. The as-welded yield strength of the alloy was found to be in the range of 140-150 MPa using the conventional single-pass GTAW technique in both AC and DCSP modes. Interestingly, it was also found that weld strength decreased with increasing thickness of the weld coupons. The welding metallurgy of AA2219 was critically reviewed and the factors responsible for the lower properties were identified. Multipass GTAW in DCSP mode was postulated to improve the weld strength of this alloy. Systematic experimentation on 12 mm thick plates was carried out, and a yield strength of 200 MPa was achieved in the as-welded condition. Thorough characterization, including optical and electron microscopy, was conducted to validate the metallurgical phenomena responsible for the improvement in weld strength. This paper presents the conceptual understanding of the welding metallurgy of AA2219 and its experimental validation, which could lead to better weld properties using multipass GTAW in DCSP mode.

  19. Definition and Demonstration of a Methodology for Validating Aircraft Trajectory Predictors

    NASA Technical Reports Server (NTRS)

    Vivona, Robert A.; Paglione, Mike M.; Cate, Karen T.; Enea, Gabriele

    2010-01-01

    This paper presents a new methodology for validating an aircraft trajectory predictor, inspired by the lessons learned from a number of field trials, flight tests and simulation experiments for the development of trajectory-predictor-based automation. The methodology introduces new techniques and a new multi-staged approach to reduce the effort in identifying and resolving validation failures, avoiding the potentially large costs associated with failures during a single-stage, pass/fail approach. As a case study, the validation effort performed by the Federal Aviation Administration for its En Route Automation Modernization (ERAM) system is analyzed to illustrate the real-world applicability of this methodology. During this validation effort, ERAM initially failed to achieve six of its eight requirements associated with trajectory prediction and conflict probe. The ERAM validation issues have since been addressed, but to illustrate how the methodology could have benefited the FAA effort, additional techniques are presented that could have been used to resolve some of these issues. Using data from the ERAM validation effort, it is demonstrated that these new techniques could have identified trajectory prediction error sources that contributed to several of the unmet ERAM requirements.

  20. Bridging the Gap Between Validation and Implementation of Non-Animal Veterinary Vaccine Potency Testing Methods.

    PubMed

    Dozier, Samantha; Brown, Jeffrey; Currie, Alistair

    2011-11-29

    In recent years, technologically advanced high-throughput techniques have been developed that replace, reduce or refine animal use in vaccine quality control tests. Following validation, these tests are slowly being accepted for use by international regulatory authorities. Because regulatory acceptance itself has not guaranteed that approved humane methods are adopted by manufacturers, various organizations have sought to foster the preferential use of validated non-animal methods by interfacing with industry and regulatory authorities. After noticing this gap between regulation and uptake by industry, we began developing a paradigm that seeks to narrow the gap and quicken implementation of new replacement, refinement or reduction guidance. A systematic analysis of our experience in promoting the transparent implementation of validated non-animal vaccine potency assays has led to the refinement of our paradigmatic process, presented here, by which interested parties can assess the local regulatory acceptance of methods that reduce animal use and integrate them into quality control testing protocols, or ensure the elimination of peripheral barriers to their use, particularly for potency and other tests carried out on production batches.

  1. Caffeine expectancy: instrument development in the Rasch measurement framework.

    PubMed

    Heinz, Adrienne J; Kassel, Jon D; Smith, Everett V

    2009-09-01

    Although caffeine is the most widely consumed psychoactive drug in the world, the mechanisms associated with consumption are not well understood. Nonetheless, outcome expectancies for caffeine use are thought to underlie caffeine's reinforcing properties. To date, however, there is no available, sufficient measure by which to assess caffeine expectancy. Therefore, the current study sought to develop such a measure employing Rasch measurement models. Unlike traditional measurement development techniques, Rasch analyses afford dynamic and interactive control of the analysis process and generate helpful information to guide instrument construction. A 5-stage developmental process is described, ultimately yielding a 37-item Caffeine Expectancy Questionnaire (CEQ) comprised of 4 factors representing "withdrawal symptoms," "positive effects," "acute negative effects," and "mood effects." Initial evaluation of the CEQ yielded sufficient evidence for various aspects of validity. Although additional research with more heterogeneous samples is required to further assess the measure's reliability and validity, the CEQ demonstrates potential with regard to its utility in experimental laboratory research and clinical application. 2009 APA, all rights reserved.

  2. Semi Automatic Ontology Instantiation in the domain of Risk Management

    NASA Astrophysics Data System (ADS)

    Makki, Jawad; Alquier, Anne-Marie; Prince, Violaine

    One of the challenging tasks in the context of Ontological Engineering is to automatically or semi-automatically support the process of Ontology Learning and Ontology Population from semi-structured documents (texts). In this paper we describe a semi-automatic ontology instantiation method from natural language text in the domain of Risk Management. The method is composed of three steps: 1) annotation with part-of-speech tags, 2) semantic relation instance extraction, and 3) ontology instantiation. It is based on combined NLP techniques, with human intervention between steps 2 and 3 for control and validation. Since it relies heavily on linguistic knowledge rather than domain knowledge, it is not domain-dependent, which favours portability between the different fields of risk management application. The proposed methodology uses the ontology of the PRIMA project (supported by the European Community) as a generic domain ontology and populates it from an available corpus. A first validation of the approach is carried out through an experiment with Chemical Fact Sheets from the Environmental Protection Agency.

  3. Experimental and theoretical investigation of radiation and dynamics properties in laser-produced carbon plasmas

    NASA Astrophysics Data System (ADS)

    Min, Qi; Su, Maogen; Wang, Bo; Cao, Shiquan; Sun, Duixiong; Dong, Chenzhong

    2018-05-01

    The radiation and dynamics properties of a laser-produced carbon plasma in vacuum were studied experimentally with the aid of a spatio-temporally resolved emission spectroscopy technique. In addition, a radiation hydrodynamics model based on the fluid dynamic equations and the radiative transfer equation was presented, and calculation of the charge states was performed within a time-dependent collisional radiative model. The detailed temporal and spatial evolution of plasma parameters, such as velocity, electron temperature, charge state distribution, energy level population, and various atomic processes, has been analyzed. At the same time, the effects of different atomic processes on the charge state distribution were examined. Finally, the validity of assuming local thermodynamic equilibrium in the carbon plasma expansion was checked, and the results clearly indicate that the assumption was valid only at the initial (<80 ns) stage of plasma expansion. At longer delay times it was not applicable near the plasma boundary, because of a sharp drop in plasma temperature and electron density.

  4. Flight control system design factors for applying automated testing techniques

    NASA Technical Reports Server (NTRS)

    Sitz, Joel R.; Vernon, Todd H.

    1990-01-01

    Automated validation of flight-critical embedded systems is being done at ARC Dryden Flight Research Facility. The automated testing techniques are being used to perform closed-loop validation of man-rated flight control systems. The principal design features and operational experiences of the X-29 forward-swept-wing aircraft and F-18 High Alpha Research Vehicle (HARV) automated test systems are discussed. Operationally applying automated testing techniques has accentuated flight control system features that either help or hinder the application of these techniques. The paper also discusses flight control system features which foster the use of automated testing techniques.

  5. Simulation verification techniques study. Subsystem simulation validation techniques

    NASA Technical Reports Server (NTRS)

    Duncan, L. M.; Reddell, J. P.; Schoonmaker, P. B.

    1974-01-01

    Techniques for validation of software modules which simulate spacecraft onboard systems are discussed. An overview of the simulation software hierarchy for a shuttle mission simulator is provided. A set of guidelines for the identification of subsystem/module performance parameters and critical performance parameters are presented. Various sources of reference data to serve as standards of performance for simulation validation are identified. Environment, crew station, vehicle configuration, and vehicle dynamics simulation software are briefly discussed from the point of view of their interfaces with subsystem simulation modules. A detailed presentation of results in the area of vehicle subsystems simulation modules is included. A list of references, conclusions and recommendations are also given.

  6. Non-thermal inactivation of Noroviruses in food

    NASA Astrophysics Data System (ADS)

    Velebit, B.; Petronijević, R.; Bošković, T.

    2017-09-01

    An increased incidence of foodborne illnesses caused by Norovirus and consumer demand for fresh, convenient, and safe foods have prompted research into alternative antiviral processing technologies. Chlorine dioxide, UV treatment and thermal processing are standard antinoroviral technologies that have been employed for a while; however, they tend to be non-effective in modern processing due to residue concerns (ClO2), shadowing effects (UV) and low-energy efficiency (heat treatment). Alternative technologies have been validated such as ozone treatment, high pressure processing and pulse electric fields. Although these techniques are promising, none of them individually can deem food free of Norovirus. Further research on the effects on Norovirus in various food matrices is required. Good manufacturing practices and proper sanitation procedures remain the “gold” safety tools in food business.

  7. Image analysis method for the measurement of water saturation in a two-dimensional experimental flow tank

    NASA Astrophysics Data System (ADS)

    Belfort, Benjamin; Weill, Sylvain; Lehmann, François

    2017-07-01

    A novel, non-invasive imaging technique is proposed that determines 2D maps of water content in unsaturated porous media. The method directly relates digitally measured intensities to the water content of the porous medium and requires the classical image analysis steps, i.e., normalization, filtering, background subtraction, scaling and calibration. Its main advantages are that no separate calibration experiment is needed, because the calibration curve relating water content and reflected light intensities is established during the main monitoring phase of each experiment, and that no tracer or dye is injected into the flow tank. The procedure enables effective processing of a large number of photographs and thus produces 2D water content maps at high temporal resolution. A drainage/imbibition experiment in a 2D flow tank with inner dimensions of 40 cm × 14 cm × 6 cm (L × W × D) is carried out to validate the methodology. The accuracy of the proposed approach is assessed within a statistical framework through an error analysis and through numerical simulations with a state-of-the-art computational code that solves the Richards' equation. Comparison of the cumulative mass leaving and entering the flow tank, and of the water content maps produced by the photographic measurement technique and the numerical simulations, demonstrates the efficiency and high accuracy of the proposed method for investigating vadose zone flow processes. Finally, the photometric procedure has been developed expressly with its extension to heterogeneous media in mind; other processes may be investigated through different laboratory experiments, which will serve as benchmarks for the validation of numerical codes.
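
    A minimal Python sketch of the intensity-to-water-content idea, using the dry and fully saturated states recorded during the same experiment as two calibration end members; the linear mapping and the grey levels below are simplifying assumptions, not the paper's calibration curve:

        import numpy as np

        def intensity_to_saturation(img, img_dry, img_sat):
            """Map reflected-light intensities to effective water saturation using
            the dry and fully saturated frames as calibration end members
            (a simple linear sketch of the idea)."""
            img, img_dry, img_sat = (np.asarray(a, dtype=float) for a in (img, img_dry, img_sat))
            sat = (img - img_dry) / (img_sat - img_dry)   # per-pixel normalization
            return np.clip(sat, 0.0, 1.0)

        # Hypothetical 3x3 grey-level patches (0-255) from the flow-tank photographs.
        dry = np.full((3, 3), 180.0)     # dry sand reflects more light here
        wet = np.full((3, 3), 60.0)      # saturated sand appears darker
        frame = np.array([[180, 150, 120], [120, 90, 60], [60, 60, 60]], dtype=float)
        print(intensity_to_saturation(frame, dry, wet))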

  8. An investigation into the effects of excipient particle size, blending techniques and processing parameters on the homogeneity and content uniformity of a blend containing low-dose model drug

    PubMed Central

    Alyami, Hamad; Dahmash, Eman; Bowen, James

    2017-01-01

    Powder blend homogeneity is a critical attribute in formulation development of low dose and potent active pharmaceutical ingredients (API) yet a complex process with multiple contributing factors. Excipient characteristics play key role in efficient blending process and final product quality. In this work the effect of excipient type and properties, blending technique and processing time on content uniformity was investigated. Powder characteristics for three commonly used excipients (starch, pregelatinised starch and microcrystalline cellulose) were initially explored using laser diffraction particle size analyser, angle of repose for flowability, followed by thorough evaluations of surface topography employing scanning electron microscopy and interferometry. Blend homogeneity was evaluated based on content uniformity analysis of the model API, ergocalciferol, using a validated analytical technique. Flowability of powders were directly related to particle size and shape, while surface topography results revealed the relationship between surface roughness and ability of excipient with high surface roughness to lodge fine API particles within surface groves resulting in superior uniformity of content. Of the two blending techniques, geometric blending confirmed the ability to produce homogeneous blends at low dilution when processed for longer durations, whereas manual ordered blending failed to achieve compendial requirement for content uniformity despite mixing for 32 minutes. Employing the novel dry powder hybrid mixer device, developed at Aston University laboratory, results revealed the superiority of the device and enabled the production of homogenous blend irrespective of excipient type and particle size. Lower dilutions of the API (1% and 0.5% w/w) were examined using non-sieved excipients and the dry powder hybrid mixing device enabled the development of successful blends within compendial requirements and low relative standard deviation. PMID:28609454

  9. An investigation into the effects of excipient particle size, blending techniques and processing parameters on the homogeneity and content uniformity of a blend containing low-dose model drug.

    PubMed

    Alyami, Hamad; Dahmash, Eman; Bowen, James; Mohammed, Afzal R

    2017-01-01

    Powder blend homogeneity is a critical attribute in formulation development of low dose and potent active pharmaceutical ingredients (API) yet a complex process with multiple contributing factors. Excipient characteristics play key role in efficient blending process and final product quality. In this work the effect of excipient type and properties, blending technique and processing time on content uniformity was investigated. Powder characteristics for three commonly used excipients (starch, pregelatinised starch and microcrystalline cellulose) were initially explored using laser diffraction particle size analyser, angle of repose for flowability, followed by thorough evaluations of surface topography employing scanning electron microscopy and interferometry. Blend homogeneity was evaluated based on content uniformity analysis of the model API, ergocalciferol, using a validated analytical technique. Flowability of powders were directly related to particle size and shape, while surface topography results revealed the relationship between surface roughness and ability of excipient with high surface roughness to lodge fine API particles within surface groves resulting in superior uniformity of content. Of the two blending techniques, geometric blending confirmed the ability to produce homogeneous blends at low dilution when processed for longer durations, whereas manual ordered blending failed to achieve compendial requirement for content uniformity despite mixing for 32 minutes. Employing the novel dry powder hybrid mixer device, developed at Aston University laboratory, results revealed the superiority of the device and enabled the production of homogenous blend irrespective of excipient type and particle size. Lower dilutions of the API (1% and 0.5% w/w) were examined using non-sieved excipients and the dry powder hybrid mixing device enabled the development of successful blends within compendial requirements and low relative standard deviation.
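
    A small Python sketch of the content-uniformity arithmetic behind the blend assessment above: the mean and relative standard deviation (%RSD) of individual assays expressed as a percentage of label claim. The assay values are hypothetical, and this is only the summary statistic, not the full pharmacopoeial uniformity test:

        import numpy as np

        def content_uniformity_summary(assays_pct_label):
            """Mean and %RSD of individual assay results expressed as % of label
            claim, the basic statistics behind blend/content uniformity judgements."""
            x = np.asarray(assays_pct_label, dtype=float)
            mean = x.mean()
            rsd = 100.0 * x.std(ddof=1) / mean
            return mean, rsd

        # Hypothetical ergocalciferol assays (% of label claim) for ten blend samples.
        assays = [98.2, 101.5, 99.7, 97.8, 102.3, 100.4, 99.1, 98.9, 101.0, 100.6]
        mean, rsd = content_uniformity_summary(assays)
        print(f"mean = {mean:.1f}% of label claim, RSD = {rsd:.2f}%")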

  10. Classification and Validation of Behavioral Subtypes of Learning-Disabled Children.

    ERIC Educational Resources Information Center

    Speece, Deborah L.; And Others

    1985-01-01

    Using the Classroom Behavior Inventory, teachers rated the behaviors of 63 school-identified, learning-disabled first and second graders. Hierarchical cluster analysis techniques identified seven distinct behavioral subtypes. Internal validation techniques indicated that the subtypes were replicable and had profile patterns different from a sample…

  11. Thermal Modeling and Simulation of Electron Beam Melting for Rapid Prototyping on Ti6Al4V Alloys

    NASA Astrophysics Data System (ADS)

    Neira Arce, Alderson

    To be a viable solution for contemporary engineering challenges, the use of titanium alloys in a wider range of applications requires the development of new techniques and processes that are able to decrease production cost and delivery times. As a result, the use of material consolidation in a near-net-shape fashion, using dynamic techniques like additive manufacturing by electron beam selective melting EBSM represents a promising method for part manufacturing. However, a new product material development can be cost prohibitive, requiring the use of computer modeling and simulation as a way to decrease turnaround time. To ensure a proper representation of the EBSM process, a thermophysical material characterization and comparison was first performed on two Ti6Al4V powder feedstock materials prepared by plasma (PREP) and gas atomized (GA) processes. This evaluation comprises an evaluation on particle size distribution, density and powder surface area, collectively with the temperature dependence on properties such as heat capacity, thermal diffusivity, thermal conductivity and surface emissivity. Multiple techniques were employed in this evaluation, including high temperature differential scanning calorimetry (HT-DSC), laser flash analysis (LFA), infrared remote temperature analysis (IR-Thermography), laser diffraction, liquid and gas pycnometry using mercury and krypton adsorption respectively. This study was followed by the review of complementary strategies to simulate the temperature evolution during the EBSM process, using a finite element analysis package called COMSOL Multiphysics. Two alternatives dedicated to representing a moving heat source (electron beam) and the powder bed were developed using a step-by-step approximation initiative. The first method consisted of the depiction of a powder bed discretized on an array of domains, each one representing a static melt pool, where the moving heat source was illustrated by a series of time dependant selective heating and cooling steps. The second method consisted of the solution of a prescribed domain, where each powder layer is discretized by an individual 3D element and the heat source is represented by a 1D element displaced by a temperature-coupling extrapolation routine. Two validation strategies were presented here; the first was used to confirm the accuracy of the proposed model strategy by setting up a controlled experiment; the second was used to validate the post-processing data obtained by the simulation by comparison with in-situ measured EBSM process temperature. Finally, a post-process part evaluation on surface finishing and part porosity was discussed including an assessment of the use of non-destructive inspection techniques such as 3D profilometry by axial chromatism for surface roughness, partial section analysis by serial block-face scanning electron microscopy (SBFSEM) and micro computed tomography (CT-Scan) for pore and inclusion detection.

  12. Strain Rate Dependency of Bronze Metal Matrix Composite Mechanical Properties as a Function of Casting Technique

    NASA Astrophysics Data System (ADS)

    Brown, Lloyd; Joyce, Peter; Radice, Joshua; Gregorian, Dro; Gobble, Michael

    2012-07-01

    Strain rate dependency of mechanical properties of tungsten carbide (WC)-filled bronze castings fabricated by centrifugal and sedimentation-casting techniques are examined, in this study. Both casting techniques are an attempt to produce a functionally graded material with high wear resistance at a chosen surface. Potential applications of such materials include shaft bushings, electrical contact surfaces, and brake rotors. Knowledge of strain rate-dependent mechanical properties is recommended for predicting component response due to dynamic loading or impact events. A brief overview of the casting techniques for the materials considered in this study is followed by an explanation of the test matrix and testing techniques. Hardness testing, density measurement, and determination of the volume fraction of WC particles are performed throughout the castings using both image analysis and optical microscopy. The effects of particle filling on mechanical properties are first evaluated through a microhardness survey of the castings. The volume fraction of WC particles is validated using a thorough density survey and a rule-of-mixtures model. Split Hopkinson Pressure Bar (SHPB) testing of various volume fraction specimens is conducted to determine strain dependence of mechanical properties and to compare the process-property relationships between the two casting techniques. The baseline performances of C95400 bronze are provided for comparison. The results show that the addition of WC particles improves microhardness significantly for the centrifugally cast specimens, and, to a lesser extent, in the sedimentation-cast specimens, largely because the WC particles are more concentrated as a result of the centrifugal-casting process. Both metal matrix composites (MMCs) demonstrate strain rate dependency, with sedimentation casting having a greater, but variable, effects on material response. This difference is attributed to legacy effects from the casting process, namely, porosity and localized WC particle grouping.
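
    The rule-of-mixtures cross-check mentioned above reduces to one line of algebra: invert the composite density for the reinforcement volume fraction. The sketch below uses nominal handbook-style densities, which are assumptions rather than the study's measured values:

        # Minimal rule-of-mixtures sketch: back out the local WC volume fraction from a
        # measured composite density, as used to cross-check image-analysis estimates.
        RHO_WC = 15.6      # g/cm^3, tungsten carbide (nominal)
        RHO_BRONZE = 7.45  # g/cm^3, C95400 aluminum bronze (approximate)

        def wc_volume_fraction(rho_measured):
            """Invert rho = Vf*rho_WC + (1 - Vf)*rho_bronze for the volume fraction Vf."""
            return (rho_measured - RHO_BRONZE) / (RHO_WC - RHO_BRONZE)

        for rho in (7.45, 9.1, 10.9):   # hypothetical density-survey readings
            print(f"rho = {rho:5.2f} g/cm^3 -> Vf(WC) ~ {wc_volume_fraction(rho):.2f}")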

  13. A novel image processing technique for 3D volumetric analysis of severely resorbed alveolar sockets with CBCT.

    PubMed

    Manavella, Valeria; Romano, Federica; Garrone, Federica; Terzini, Mara; Bignardi, Cristina; Aimetti, Mario

    2017-06-01

    The aim of this study was to present and validate a novel procedure for the quantitative volumetric assessment of extraction sockets that combines cone-beam computed tomography (CBCT) and image processing techniques. The CBCT dataset of 9 severely resorbed extraction sockets was analyzed by means of two image processing software packages, ImageJ and Mimics, using manual and automated segmentation techniques. The same methods were also applied to 5-mm spherical aluminum markers of known volume and to a polyvinyl chloride model of one alveolar socket scanned with micro-CT to test their accuracy. Statistical differences in alveolar socket volume were found between the different methods of volumetric analysis (P<0.0001). Automated segmentation using Mimics was the most reliable and accurate method, with a relative error of 1.5%, considerably smaller than the errors of 7% and 10% introduced by the manual method using Mimics and by the automated method using ImageJ, respectively. The proposed automated segmentation protocol for the three-dimensional rendering of alveolar sockets showed more accurate results, excellent inter-observer similarity and increased user friendliness. The clinical application of this method enables a three-dimensional evaluation of extraction socket healing after reconstructive procedures and during follow-up visits.
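
    At its core, any segmentation-based volume estimate is voxel counting times voxel volume. A minimal sketch follows; the global threshold and the synthetic sub-volume are simplifying assumptions and do not reproduce the Mimics workflow:

        import numpy as np

        def segmented_volume_mm3(volume, threshold, voxel_size_mm):
            """Volume of a thresholded region: count voxels at or above a grey-value
            threshold and multiply by the voxel volume."""
            mask = np.asarray(volume) >= threshold
            voxel_volume = float(np.prod(voxel_size_mm))
            return mask.sum() * voxel_volume

        # Hypothetical 8-bit CBCT sub-volume with isotropic 0.3 mm voxels.
        rng = np.random.default_rng(1)
        cbct = rng.integers(0, 255, size=(120, 120, 120))
        vol = segmented_volume_mm3(cbct, threshold=128, voxel_size_mm=(0.3, 0.3, 0.3))
        print(f"{vol:.1f} mm^3")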

  14. Instrumentation and signal processing for the detection of heavy water using the off-axis integrated cavity output spectroscopy technique

    NASA Astrophysics Data System (ADS)

    Gupta, A.; Singh, P. J.; Gaikwad, D. Y.; Udupa, D. V.; Topkar, A.; Sahoo, N. K.

    2018-02-01

    An experimental setup is developed for the trace-level detection of heavy water (HDO) using the off-axis integrated cavity output spectroscopy technique. The absorption spectrum of water samples is recorded in the spectral range 7190.7 cm-1 to 7191.5 cm-1 with a diode laser as the light source. From the recorded water vapor absorption spectrum, the heavy water concentration is determined from the HDO and water lines. The effect of cavity gain nonlinearity with per-pass absorption is studied. A signal processing and data fitting procedure is devised to obtain linear calibration curves by including nonlinear cavity gain effects in the calculation. Initial calibration of the mirror reflectivity is performed by measurements on a natural water sample. The signal processing and data fitting method has been validated by measuring the HDO concentration in water samples over a wide range, from 20 ppm to 2280 ppm, yielding a linear calibration curve. The average measurement time is about 30 s. The experimental technique presented in this paper could be applied to the development of a portable instrument for the fast measurement of water isotopic composition in heavy water plants and for the detection of heavy water leaks in pressurized heavy water reactors.

  15. Micro-computed tomography in murine models of cerebral cavernous malformations as a paradigm for brain disease.

    PubMed

    Girard, Romuald; Zeineddine, Hussein A; Orsbon, Courtney; Tan, Huan; Moore, Thomas; Hobson, Nick; Shenkar, Robert; Lightle, Rhonda; Shi, Changbin; Fam, Maged D; Cao, Ying; Shen, Le; Neander, April I; Rorrer, Autumn; Gallione, Carol; Tang, Alan T; Kahn, Mark L; Marchuk, Douglas A; Luo, Zhe-Xi; Awad, Issam A

    2016-09-15

    Cerebral cavernous malformations (CCMs) are hemorrhagic brain lesions, where murine models allow major mechanistic discoveries, ushering genetic manipulations and preclinical assessment of therapies. Histology for lesion counting and morphometry is essential yet tedious and time consuming. We herein describe the application and validations of X-ray micro-computed tomography (micro-CT), a non-destructive technique allowing three-dimensional CCM lesion count and volumetric measurements, in transgenic murine brains. We hereby describe a new contrast soaking technique not previously applied to murine models of CCM disease. Volumetric segmentation and image processing paradigm allowed for histologic correlations and quantitative validations not previously reported with the micro-CT technique in brain vascular disease. Twenty-two hyper-dense areas on micro-CT images, identified as CCM lesions, were matched by histology. The inter-rater reliability analysis showed strong consistency in the CCM lesion identification and staging (K=0.89, p<0.0001) between the two techniques. Micro-CT revealed a 29% greater CCM lesion detection efficiency, and 80% improved time efficiency. Serial integrated lesional area by histology showed a strong positive correlation with micro-CT estimated volume (r(2)=0.84, p<0.0001). Micro-CT allows high throughput assessment of lesion count and volume in pre-clinical murine models of CCM. This approach complements histology with improved accuracy and efficiency, and can be applied for lesion burden assessment in other brain diseases. Copyright © 2016 Elsevier B.V. All rights reserved.
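
    The inter-rater consistency statistic reported above (kappa) corrects observed agreement for chance agreement. A minimal Python sketch is shown below; the stage calls are hypothetical, not the study's lesion data:

        from collections import Counter

        def cohens_kappa(labels_a, labels_b):
            """Agreement between two raters/techniques on the same items
            (e.g. micro-CT vs. histology staging), corrected for chance."""
            n = len(labels_a)
            observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
            counts_a, counts_b = Counter(labels_a), Counter(labels_b)
            categories = set(labels_a) | set(labels_b)
            expected = sum(counts_a[c] * counts_b[c] for c in categories) / n ** 2
            return (observed - expected) / (1 - expected)

        # Hypothetical stage calls (1 = non-hemorrhagic, 2 = hemorrhagic) for 22 lesions.
        micro_ct  = [1, 1, 2, 2, 1, 2, 1, 1, 2, 2, 1, 1, 2, 1, 2, 2, 1, 1, 1, 2, 2, 1]
        histology = [1, 1, 2, 2, 1, 2, 1, 1, 2, 1, 1, 1, 2, 1, 2, 2, 1, 1, 1, 2, 2, 1]
        print(f"kappa = {cohens_kappa(micro_ct, histology):.2f}")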

  16. Real-Time Measurement of Width and Height of Weld Beads in GMAW Processes.

    PubMed

    Pinto-Lopera, Jesús Emilio; S T Motta, José Mauricio; Absi Alfaro, Sadek Crisostomo

    2016-09-15

    Closely associated with weld quality, the weld bead geometry is one of the most important parameters in welding processes. It is a significant requirement in a welding project, especially in automatic welding systems where a specific width, height, or penetration of the weld bead is needed. This paper presents a novel technique for real-time measurement of the width and height of weld beads in gas metal arc welding (GMAW) using a single high-speed camera and a long-pass optical filter in a passive vision system. The measuring method is based on digital image processing techniques, and the image calibration process is based on projective transformations. The measurement process takes less than 3 milliseconds per image, which allows a transfer rate of more than 300 frames per second. The proposed methodology can be used in any metal transfer mode of a gas metal arc welding process and does not have occlusion problems. The responses of the measurement system presented here are in good agreement with off-line data collected by a common laser-based 3D scanner. Each measurement is compared using a statistical Welch's t-test of the null hypothesis, which in no case is rejected at the significance level α = 0.01, validating the results and the performance of the proposed vision system.
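
    Welch's t-test (the unequal-variance comparison of means mentioned above) is available in SciPy; the bead-width values below are hypothetical and only illustrate how such a comparison against scanner data would be run:

        import numpy as np
        from scipy import stats

        # Hypothetical bead widths (mm): vision system vs. an off-line laser-based
        # 3D scan of the same beads.
        vision  = np.array([6.12, 6.31, 5.98, 6.25, 6.07, 6.18, 6.22, 6.15])
        scanner = np.array([6.10, 6.28, 6.02, 6.21, 6.05, 6.20, 6.19, 6.17])

        # Welch's t-test (unequal variances) of the null hypothesis of equal means.
        t_stat, p_value = stats.ttest_ind(vision, scanner, equal_var=False)
        print(f"t = {t_stat:.3f}, p = {p_value:.3f}, "
              f"reject at alpha=0.01: {p_value < 0.01}")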

  17. Garment Counting in a Textile Warehouse by Means of a Laser Imaging System

    PubMed Central

    Martínez-Sala, Alejandro Santos; Sánchez-Aartnoutse, Juan Carlos; Egea-López, Esteban

    2013-01-01

    Textile logistic warehouses are highly automated mechanized places where control points are needed to count and validate the number of garments in each batch. This paper proposes and describes a low cost and small size automated system designed to count the number of garments by processing an image of the corresponding hanger hooks generated using an array of phototransistors sensors and a linear laser beam. The generated image is processed using computer vision techniques to infer the number of garment units. The system has been tested on two logistic warehouses with a mean error in the estimated number of hangers of 0.13%. PMID:23628760

  18. Garment counting in a textile warehouse by means of a laser imaging system.

    PubMed

    Martínez-Sala, Alejandro Santos; Sánchez-Aartnoutse, Juan Carlos; Egea-López, Esteban

    2013-04-29

    Textile logistic warehouses are highly automated mechanized places where control points are needed to count and validate the number of garments in each batch. This paper proposes and describes a low cost and small size automated system designed to count the number of garments by processing an image of the corresponding hanger hooks generated using an array of phototransistors sensors and a linear laser beam. The generated image is processed using computer vision techniques to infer the number of garment units. The system has been tested on two logistic warehouses with a mean error in the estimated number of hangers of 0.13%.
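
    Reduced to one dimension, the counting step amounts to thresholding an illumination profile and counting bright runs (one per hanger hook). The sketch below uses a hypothetical phototransistor profile and threshold; the real system works on a 2-D laser image with computer vision techniques:

        import numpy as np

        def count_hooks(profile, threshold):
            """Count hanger hooks in a 1-D laser-illumination profile by counting
            rising edges of the thresholded signal (one bright run per hook)."""
            binary = np.asarray(profile) > threshold
            rising_edges = np.count_nonzero(binary[1:] & ~binary[:-1]) + int(binary[0])
            return rising_edges

        # Hypothetical phototransistor-array profile: three hooks over a dark background.
        profile = np.array([3, 4, 60, 62, 5, 4, 58, 61, 59, 6, 3, 55, 57, 4, 3])
        print(count_hooks(profile, threshold=30))   # -> 3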

  19. Comparison of the resulting error in data fusion techniques when used with remote sensing, earth observation, and in-situ data sets for water quality applications

    NASA Astrophysics Data System (ADS)

    Ziemba, Alexander; El Serafy, Ghada

    2016-04-01

    Ecological modeling and water quality investigations are complex processes that can require a high level of parameterization and a multitude of varying data sets in order to properly execute the model in question. Since models are generally complex, their calibration and validation can benefit from the application of data and information fusion techniques. The data applied to ecological models come from a wide range of sources, such as remote sensing, earth observation, and in-situ measurements, resulting in high variability in the temporal and spatial resolution of the various data sets available to water quality investigators. It is proposed that effective fusion into a comprehensive single set will provide a more complete and robust data resource with which models can be calibrated, validated, and driven. Each individual product carries a unique level of error resulting from the method of measurement and the pre-processing techniques applied. The uncertainty and error are further compounded when the data being fused are of varying temporal and spatial resolution. In order to have a reliable fusion-based model and data set, the uncertainty of the results and the confidence interval of the data being reported must be effectively communicated to those who would utilize the data product or model outputs in a decision-making process [2]. Here we review an array of data fusion techniques applied to various remote sensing, earth observation, and in-situ data sets whose domains vary in spatial and temporal resolution. The data sets examined are combined in such a way that the various classes of data (complementary, redundant, and cooperative) are all assessed to determine each classification's impact on the propagation and compounding of error. In order to assess the error of the fused data products, a comparison is conducted with data sets having a known confidence interval and quality rating. We conclude with a quantification of the performance of the data fusion techniques and a recommendation on the feasibility of applying the fused products in operational forecast systems and modeling scenarios. The error bands and confidence intervals derived can be used to clarify the error and confidence of water quality variables produced by prediction and forecasting models. References: [1] F. Castanedo, "A Review of Data Fusion Techniques", The Scientific World Journal, vol. 2013, pp. 1-19, 2013. [2] T. Keenan, M. Carbone, M. Reichstein and A. Richardson, "The model-data fusion pitfall: assuming certainty in an uncertain world", Oecologia, vol. 167, no. 3, pp. 587-597, 2011.
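
    One standard rule for fusing redundant estimates of the same quantity, inverse-variance weighting, also shows how the uncertainty of the fused product follows from the uncertainties of its inputs. The sketch below is generic and the chlorophyll-a values are hypothetical; it does not reproduce the specific techniques reviewed in the paper:

        import numpy as np

        def inverse_variance_fusion(values, std_devs):
            """Fuse redundant estimates of one variable (e.g. remote-sensing and
            in-situ chlorophyll-a) by weighting each with 1/variance; also return
            the standard deviation of the fused estimate."""
            values = np.asarray(values, dtype=float)
            var = np.asarray(std_devs, dtype=float) ** 2
            weights = 1.0 / var
            fused = np.sum(weights * values) / np.sum(weights)
            fused_std = np.sqrt(1.0 / np.sum(weights))
            return fused, fused_std

        # Hypothetical chlorophyll-a estimates (mg/m^3) with 1-sigma uncertainties.
        fused, sigma = inverse_variance_fusion(values=[12.4, 10.8], std_devs=[2.0, 0.8])
        print(f"fused = {fused:.2f} +/- {sigma:.2f} mg/m^3")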

  20. Statistical Approaches to Interpretation of Local, Regional, and National Highway-Runoff and Urban-Stormwater Data

    USGS Publications Warehouse

    Tasker, Gary D.; Granato, Gregory E.

    2000-01-01

    Decision makers need viable methods for the interpretation of local, regional, and national-highway runoff and urban-stormwater data including flows, concentrations and loads of chemical constituents and sediment, potential effects on receiving waters, and the potential effectiveness of various best management practices (BMPs). Valid (useful for intended purposes), current, and technically defensible stormwater-runoff models are needed to interpret data collected in field studies, to support existing highway and urban-runoff planning processes, to meet National Pollutant Discharge Elimination System (NPDES) requirements, and to provide methods for computation of Total Maximum Daily Loads (TMDLs) systematically and economically. Historically, conceptual, simulation, empirical, and statistical models of varying levels of detail, complexity, and uncertainty have been used to meet various data-quality objectives in the decision-making processes necessary for the planning, design, construction, and maintenance of highways and for other land-use applications. Water-quality simulation models attempt a detailed representation of the physical processes and mechanisms at a given site. Empirical and statistical regional water-quality assessment models provide a more general picture of water quality or changes in water quality over a region. All these modeling techniques share one common aspect: their predictive ability is poor without suitable site-specific data for calibration. To properly apply the correct model, one must understand the classification of variables, the unique characteristics of water-resources data, and the concept of population structure and analysis. Classifying the variables being used to analyze data may determine which statistical methods are appropriate for data analysis. An understanding of the characteristics of water-resources data is necessary to evaluate the applicability of different statistical methods, to interpret the results of these techniques, and to use tools and techniques that account for the unique nature of water-resources data sets. Populations of data on stormwater-runoff quantity and quality are often best modeled as logarithmic transformations. Therefore, these factors need to be considered to form valid, current, and technically defensible stormwater-runoff models. Regression analysis is an accepted method for interpretation of water-resources data and for prediction of current or future conditions at sites that fit the input data model. Regression analysis is designed to provide an estimate of the average response of a system as it relates to variation in one or more known variables. To produce valid models, however, regression analysis should include visual analysis of scatterplots, an examination of the regression equation, evaluation of the method design assumptions, and regression diagnostics. A number of statistical techniques are described in the text and in the appendixes to provide information necessary to interpret data by use of appropriate methods. Uncertainty is an important part of any decision-making process. In order to deal with uncertainty problems, the analyst needs to know the severity of the statistical uncertainty of the methods used to predict water quality. Statistical models need to be based on information that is meaningful, representative, complete, precise, accurate, and comparable to be deemed valid, up to date, and technically supportable. To assess uncertainty in the analytical tools, the modeling methods, and the underlying data set, all of these components need to be documented and communicated in an accessible format within project publications.
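
    The report's points about log-transformed populations and regression diagnostics can be illustrated with a short sketch. The snippet below is not taken from the report; the synthetic runoff and load data, variable names, and the particular diagnostics shown are illustrative assumptions only.

    ```python
    # Minimal sketch: log-log regression of a stormwater constituent load
    # against runoff volume, with basic diagnostics. Data are synthetic.
    import numpy as np

    rng = np.random.default_rng(0)
    runoff = rng.lognormal(mean=2.0, sigma=0.8, size=200)         # hypothetical runoff volumes
    load = 0.5 * runoff**1.2 * rng.lognormal(0.0, 0.3, size=200)  # hypothetical constituent loads

    # Log transformation, as suggested for stormwater quantity/quality populations
    x, y = np.log(runoff), np.log(load)

    # Ordinary least squares: y = b0 + b1 * x
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta

    # Simple regression diagnostics: coefficient of determination and residual summary
    ss_res = np.sum(resid**2)
    ss_tot = np.sum((y - y.mean())**2)
    r2 = 1.0 - ss_res / ss_tot
    print(f"slope={beta[1]:.3f}, intercept={beta[0]:.3f}, R^2={r2:.3f}")
    print(f"residual mean={resid.mean():.3e}, residual std={resid.std():.3f}")
    ```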

  1. Correcting the planar perspective projection in geometric structures applied to forensic facial analysis.

    PubMed

    Baldasso, Rosane Pérez; Tinoco, Rachel Lima Ribeiro; Vieira, Cristina Saft Matos; Fernandes, Mário Marques; Oliveira, Rogério Nogueira

    2016-10-01

    The process of forensic facial analysis may be founded on several scientific techniques and imaging modalities, such as digital signal processing, photogrammetry and craniofacial anthropometry. However, one of the main limitations in this analysis is the comparison of images acquired with different angles of incidence. The present study aimed to explore a potential approach for the correction of the planar perspective projection (PPP) in geometric structures traced from the human face. A technique for the correction of the PPP was calibrated within photographs of two geometric structures obtained with angles of incidence distorted at 80°, 60° and 45°. The technique was performed using ImageJ® 1.46r (National Institutes of Health, Bethesda, Maryland). The corrected images were compared with photographs of the same object obtained at 90° (reference). In a second step, the technique was validated in a digital human face created using MakeHuman® 1.0.2 (Free Software Foundation, Massachusetts, USA) and Blender® 2.75 (Blender® Foundation, Amsterdam, the Netherlands) software packages. The images registered with angular distortion presented a gradual decrease in height when compared to the reference. The digital technique for the correction of the PPP is a valuable tool for forensic applications using photographic imaging modalities, such as forensic facial analysis. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
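
    As a rough digital analogue of this kind of correction, a planar homography can be estimated from four corresponding points and used to re-project an obliquely acquired photograph. The sketch below is not the ImageJ procedure used in the study; the landmark coordinates, test pattern, and output size are hypothetical.

    ```python
    import numpy as np
    import cv2

    # Hypothetical positions of four landmarks in an oblique photograph (src)
    # and their target positions in the frontal (90 degree) reference (dst).
    src = np.float32([[112, 80], [430, 95], [445, 510], [98, 500]])
    dst = np.float32([[100, 90], [440, 90], [440, 520], [100, 520]])

    H = cv2.getPerspectiveTransform(src, dst)   # 3x3 planar homography

    # Synthetic stand-in for the distorted photograph (a gradient test pattern)
    img = np.tile(np.linspace(0, 255, 560, dtype=np.uint8), (620, 1))

    # Re-project the image into the frontal reference frame
    corrected = cv2.warpPerspective(img, H, (560, 620))
    print(corrected.shape)
    ```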

  2. Comparison of elevation derived from InSAR data with DEM from topography map in Son Dong, Bac Giang, Viet Nam

    NASA Astrophysics Data System (ADS)

    Nguyen, Duy

    2012-07-01

    Digital Elevation Models (DEMs) are used in many applications in the context of earth sciences, such as topographic mapping, environmental modeling, rainfall-runoff studies, landslide hazard zonation, seismic source modeling, etc. During the last years, a multitude of scientific applications of Synthetic Aperture Radar Interferometry (InSAR) techniques has evolved. It has been shown that InSAR is an established technique for generating high-quality DEMs from spaceborne and airborne data, and that it has advantages over other methods for the generation of large-area DEMs. However, the processing of InSAR data is still a challenging task. This paper describes the InSAR operational steps and processing chain for DEM generation from Single Look Complex (SLC) SAR data and compares a satellite SAR estimate of surface elevation with a digital elevation model (DEM) from a topographic map. The operational steps are performed in three major stages: Data Search, Data Processing, and Product Validation. The Data Processing stage is further divided into five steps: Data Pre-Processing, Co-registration, Interferogram Generation, Phase Unwrapping, and Geocoding. The Data Processing steps have been tested with ERS 1/2 data using the Delft Object-oriented Interferometric (DORIS) InSAR processing software. Results of applying the described processing steps to a real data set are presented.

  3. Impact of varying lidar measurement and data processing techniques in evaluating cirrus cloud and aerosol direct radiative effects

    NASA Astrophysics Data System (ADS)

    Lolli, Simone; Madonna, Fabio; Rosoldi, Marco; Campbell, James R.; Welton, Ellsworth J.; Lewis, Jasper R.; Gu, Yu; Pappalardo, Gelsomina

    2018-03-01

    In the past 2 decades, ground-based lidar networks have drastically increased in scope and relevance, thanks primarily to the advent of lidar observations from space and their need for validation. Lidar observations of aerosol and cloud geometrical, optical and microphysical atmospheric properties are subsequently used to evaluate their direct radiative effects on climate. However, the retrievals are strongly dependent on the lidar instrument measurement technique and subsequent data processing methodologies. In this paper, we evaluate the discrepancies between the use of Raman and elastic lidar measurement techniques and corresponding data processing methods for two aerosol layers in the free troposphere and for two cirrus clouds with different optical depths. Results show that the different lidar techniques are responsible for discrepancies in the model-derived direct radiative effects for biomass burning (0.05 W m-2 at surface and 0.007 W m-2 at top of the atmosphere) and dust aerosol layers (0.7 W m-2 at surface and 0.85 W m-2 at top of the atmosphere). Data processing is further responsible for discrepancies in both thin (0.55 W m-2 at surface and 2.7 W m-2 at top of the atmosphere) and opaque (7.7 W m-2 at surface and 11.8 W m-2 at top of the atmosphere) cirrus clouds. Direct radiative effect discrepancies can be attributed to the larger variability of the lidar ratio for aerosols (20-150 sr) than for clouds (20-35 sr). For this reason, the influence of the applied lidar technique plays a more fundamental role in aerosol monitoring because the lidar ratio must be retrieved with relatively high accuracy. In contrast, for cirrus clouds, with the lidar ratio being much less variable, the data processing is critical because smoothing modifies the vertically resolved aerosol and cloud extinction profile that is used as input to the direct radiative effect calculations.

  4. Simple laser vision sensor calibration for surface profiling applications

    NASA Astrophysics Data System (ADS)

    Abu-Nabah, Bassam A.; ElSoussi, Adnane O.; Al Alami, Abed ElRahman K.

    2016-09-01

    Due to the relatively large structures in the Oil and Gas industry, original equipment manufacturers (OEMs) have been implementing custom-designed laser vision sensor (LVS) surface profiling systems as part of quality control in their manufacturing processes. The rough manufacturing environment and the continuous movement and misalignment of these custom-designed tools adversely affect the accuracy of laser-based vision surface profiling applications. Accordingly, Oil and Gas businesses have been raising the demand from the OEMs to implement practical and robust LVS calibration techniques prior to running any visual inspections. This effort introduces an LVS calibration technique representing a simplified version of two known calibration techniques, which are commonly implemented to obtain a calibrated LVS system for surface profiling applications. Both calibration techniques are implemented virtually and experimentally to scan simulated and three-dimensional (3D) printed features of known profiles, respectively. Scanned data is transformed from the camera frame to points in the world coordinate system and compared with the input profiles to validate the introduced calibration technique capability against the more complex approach and preliminarily assess the measurement technique for weld profiling applications. Moreover, the sensitivity to stand-off distances is analyzed to illustrate the practicality of the presented technique.

  5. Smartphone snapshot mapping of skin chromophores under triple-wavelength laser illumination

    NASA Astrophysics Data System (ADS)

    Spigulis, Janis; Oshina, Ilze; Berzina, Anna; Bykov, Alexander

    2017-09-01

    Chromophore distribution maps are useful tools for skin malformation severity assessment and for monitoring of skin recovery after burns, surgeries, and other interactions. The chromophore maps can be obtained by processing several spectral images of skin, e.g., captured by hyperspectral or multispectral cameras during seconds or even minutes. To avoid motion artifacts and simplify the procedure, a single-snapshot technique for mapping melanin, oxyhemoglobin, and deoxyhemoglobin of in-vivo skin by a smartphone under simultaneous three-wavelength (448-532-659 nm) laser illumination is proposed and examined. Three monochromatic spectral images related to the illumination wavelengths were extracted from the smartphone camera RGB image data set with respect to crosstalk between the RGB detection bands. Spectral images were further processed according to Beer's law in a three-chromophore approximation. Photon absorption path lengths in skin at the exploited wavelengths were estimated by means of Monte Carlo simulations. The technique was validated clinically on three kinds of skin lesions: nevi, hemangiomas, and seborrheic keratosis. Design of the developed add-on laser illumination system, image-processing details, and the results of clinical measurements are presented and discussed.
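
    In a three-chromophore approximation of Beer's law, the per-pixel attenuations at the three laser wavelengths form a small linear system that can be solved for the chromophore contents. The sketch below is purely illustrative: the coefficient matrix and attenuation values are made-up placeholders, not the study's Monte Carlo derived path lengths or published extinction coefficients.

    ```python
    import numpy as np

    # Hypothetical effective absorption coefficients (extinction coefficient times
    # photon path length) for melanin, oxyhemoglobin and deoxyhemoglobin at
    # 448, 532 and 659 nm; real values must come from literature and simulations.
    E = np.array([
        [3.0, 1.8, 2.1],   # 448 nm
        [1.6, 2.4, 2.0],   # 532 nm
        [0.9, 0.3, 1.1],   # 659 nm
    ])

    # Per-pixel attenuation A = -ln(I / I0) measured from the three spectral images
    A = np.array([1.2, 0.9, 0.4])

    # Beer's law in a three-chromophore approximation: A = E @ c, solve for c
    c = np.linalg.solve(E, A)
    print(dict(zip(["melanin", "oxyhemoglobin", "deoxyhemoglobin"], c)))
    ```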

  6. Filament Breakage Monitoring in Fused Deposition Modeling Using Acoustic Emission Technique

    PubMed Central

    Jin, Li; Yan, Youruiling; Mei, Yiming

    2018-01-01

    Polymers are being used in a wide range of Additive Manufacturing (AM) applications and have been shown to have tremendous potential for producing complex, individually customized parts. In order to improve part quality, it is essential to identify and monitor the process malfunctions of polymer-based AM. The present work endeavored to develop an alternative method for filament breakage identification in the Fused Deposition Modeling (FDM) AM process. The Acoustic Emission (AE) technique was applied because of its capability to detect burst-type and weak signals, especially against complex background noise. The mechanism of filament breakage was described thoroughly. The relationship between the process parameters and critical feed rate was obtained. In addition, the framework of filament breakage detection based on the instantaneous skewness and relative similarity of the AE raw waveform was illustrated. Afterwards, we conducted several filament breakage tests to validate its feasibility and effectiveness. Results revealed that the breakage could be successfully identified. Achievements of the present work could be further used to develop a comprehensive in situ FDM monitoring system with moderate cost. PMID:29494559
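
    A minimal sketch of the skewness-based part of such a detector is given below, using a synthetic AE waveform; the window length, threshold, and burst model are hypothetical, and the paper's relative-similarity measure is not reproduced.

    ```python
    import numpy as np
    from scipy.stats import skew

    def breakage_flags(waveform, window=1024, threshold=3.0):
        """Flag windows whose instantaneous skewness exceeds a threshold."""
        n = len(waveform) // window
        flags = []
        for i in range(n):
            seg = waveform[i * window:(i + 1) * window]
            flags.append(abs(skew(seg)) > threshold)
        return np.array(flags)

    # Synthetic example: Gaussian background noise with one asymmetric burst
    rng = np.random.default_rng(1)
    signal = rng.normal(0.0, 1.0, 16384)
    signal[9000:9100] += np.abs(rng.normal(0.0, 8.0, 100))   # hypothetical breakage burst
    print(np.where(breakage_flags(signal))[0])               # index of the flagged window
    ```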

  7. Probing sensorimotor integration during musical performance.

    PubMed

    Furuya, Shinichi; Furukawa, Yuta; Uehara, Kazumasa; Oku, Takanori

    2018-03-10

    An integration of afferent sensory information from the visual, auditory, and proprioceptive systems into execution and update of motor programs plays crucial roles in control and acquisition of skillful sequential movements in musical performance. However, conventional behavioral and neurophysiological techniques that have been applied to study simplistic motor behaviors limit the elucidation of online sensorimotor integration processes underlying skillful musical performance. Here, we propose two novel techniques that were developed to investigate the roles of auditory and proprioceptive feedback in piano performance. First, a closed-loop noninvasive brain stimulation system that consists of transcranial magnetic stimulation, a motion sensor, and a microcomputer enabled assessment of time-varying cortical processes subserving auditory-motor integration during piano playing. Second, a force-field system capable of manipulating the weight of a piano key allowed for characterizing movement adaptation based on the feedback obtained, which can shed light on the formation of an internal representation of the piano. Results of neurophysiological and psychophysics experiments provided evidence validating these systems as effective means for disentangling computational and neural processes of sensorimotor integration in musical performance. © 2018 New York Academy of Sciences.

  8. On the next generation of reliability analysis tools

    NASA Technical Reports Server (NTRS)

    Babcock, Philip S., IV; Leong, Frank; Gai, Eli

    1987-01-01

    The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting efficient exploration of the design space and providing an independent validation of the system's operation. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.
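
    To illustrate the kind of Markov reliability model such a tool would construct automatically, the sketch below solves a small, hypothetical two-unit redundant system; the states, failure rate, repair rate, and mission time are assumptions, not the output of any existing tool.

    ```python
    import numpy as np
    from scipy.linalg import expm

    # Hypothetical 3-state Markov reliability model: state 0 = both units up,
    # state 1 = one unit failed, state 2 = system failure (absorbing).
    lam, mu = 1e-4, 1e-2      # assumed failure and repair rates (per hour)
    Q = np.array([
        [-2 * lam,  2 * lam,      0.0],
        [      mu, -(mu + lam),   lam],
        [     0.0,        0.0,    0.0],
    ])

    t = 1000.0                               # assumed mission time in hours
    p = np.array([1.0, 0.0, 0.0]) @ expm(Q * t)   # state probabilities at time t
    print(f"P(system failure by {t:.0f} h) = {p[2]:.3e}")
    ```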

  9. 21 CFR 820.75 - Process validation.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Process validation. 820.75 Section 820.75 Food and... QUALITY SYSTEM REGULATION Production and Process Controls § 820.75 Process validation. (a) Where the... validated with a high degree of assurance and approved according to established procedures. The validation...

  10. 21 CFR 820.75 - Process validation.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Process validation. 820.75 Section 820.75 Food and... QUALITY SYSTEM REGULATION Production and Process Controls § 820.75 Process validation. (a) Where the... validated with a high degree of assurance and approved according to established procedures. The validation...

  11. 21 CFR 820.75 - Process validation.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Process validation. 820.75 Section 820.75 Food and... QUALITY SYSTEM REGULATION Production and Process Controls § 820.75 Process validation. (a) Where the... validated with a high degree of assurance and approved according to established procedures. The validation...

  12. 21 CFR 820.75 - Process validation.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Process validation. 820.75 Section 820.75 Food and... QUALITY SYSTEM REGULATION Production and Process Controls § 820.75 Process validation. (a) Where the... validated with a high degree of assurance and approved according to established procedures. The validation...

  13. 21 CFR 820.75 - Process validation.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Process validation. 820.75 Section 820.75 Food and... QUALITY SYSTEM REGULATION Production and Process Controls § 820.75 Process validation. (a) Where the... validated with a high degree of assurance and approved according to established procedures. The validation...

  14. Evaluating Quality of Decision-Making Processes in Medicines' Development, Regulatory Review, and Health Technology Assessment: A Systematic Review of the Literature.

    PubMed

    Bujar, Magdalena; McAuslane, Neil; Walker, Stuart R; Salek, Sam

    2017-01-01

    Introduction: Although pharmaceutical companies, regulatory authorities, and health technology assessment (HTA) agencies have been increasingly using decision-making frameworks, it is not certain whether these enable better quality decision making. This could be addressed by formally evaluating the quality of decision-making process within those organizations. The aim of this literature review was to identify current techniques (tools, questionnaires, surveys, and studies) for measuring the quality of the decision-making process across the three stakeholders. Methods: Using MEDLINE, Web of Knowledge, and other Internet-based search engines, a literature review was performed to systematically identify techniques for assessing quality of decision making in medicines development, regulatory review, and HTA. A structured search was applied using key words and a secondary review was carried out. In addition, the measurement properties of each technique were assessed and compared. Ten Quality Decision-Making Practices (QDMPs) developed previously were then used as a framework for the evaluation of techniques identified in the review. Due to the variation in studies identified, meta-analysis was inappropriate. Results: This review identified 13 techniques, where 7 were developed specifically to assess decision making in medicines' development, regulatory review, or HTA; 2 examined corporate decision making, and 4 general decision making. Regarding how closely each technique conformed to the 10 QDMPs, the 13 techniques assessed a median of 6 QDMPs, with a mode of 3 QDMPs. Only 2 techniques evaluated all 10 QDMPs, namely the Organizational IQ and the Quality of Decision Making Orientation Scheme (QoDoS), of which only one technique, QoDoS could be applied to assess decision making of both individuals and organizations, and it possessed generalizability to capture issues relevant to companies as well as regulatory authorities. Conclusion: This review confirmed a general paucity of research in this area, particularly regarding the development and systematic application of techniques for evaluating quality decision making, with no consensus around a gold standard. This review has identified QoDoS as the most promising available technique for assessing decision making in the lifecycle of medicines and the next steps would be to further test its validity, sensitivity, and reliability.

  15. Multi-Objective Optimization of Friction Stir Welding Process Parameters of AA6061-T6 and AA7075-T6 Using a Biogeography Based Optimization Algorithm

    PubMed Central

    Tamjidy, Mehran; Baharudin, B. T. Hang Tuah; Paslar, Shahla; Matori, Khamirul Amin; Sulaiman, Shamsuddin; Fadaeifard, Firouz

    2017-01-01

    The development of Friction Stir Welding (FSW) has provided an alternative approach for producing high-quality welds, in a fast and reliable manner. This study focuses on the mechanical properties of the dissimilar friction stir welding of AA6061-T6 and AA7075-T6 aluminum alloys. The FSW process parameters such as tool rotational speed, tool traverse speed, tilt angle, and tool offset influence the mechanical properties of the friction stir welded joints significantly. A mathematical regression model is developed to determine the empirical relationship between the FSW process parameters and mechanical properties, and the results are validated. In order to obtain the optimal values of process parameters that simultaneously optimize the ultimate tensile strength, elongation, and minimum hardness in the heat affected zone (HAZ), a metaheuristic, multi-objective algorithm based on biogeography-based optimization is proposed. The Pareto optimal frontiers for triple and dual objective functions are obtained and the best optimal solution is selected by using two different decision-making techniques: technique for order of preference by similarity to ideal solution (TOPSIS) and Shannon’s entropy. PMID:28772893
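
    The TOPSIS ranking step mentioned above can be sketched in a few lines; the decision matrix of candidate weld parameter sets and the criterion weights below are invented for illustration and are not the paper's Pareto solutions.

    ```python
    import numpy as np

    # Hypothetical alternatives: columns are ultimate tensile strength (max),
    # elongation (max) and HAZ minimum hardness (max).
    M = np.array([
        [228.0, 7.1, 62.0],
        [241.0, 6.4, 60.0],
        [235.0, 6.9, 58.0],
    ])
    w = np.array([0.5, 0.3, 0.2])          # assumed weights (e.g. from Shannon's entropy)

    R = M / np.linalg.norm(M, axis=0)      # vector-normalize each criterion
    V = R * w                              # weighted normalized matrix
    ideal, anti = V.max(axis=0), V.min(axis=0)   # all criteria treated as benefits

    d_plus = np.linalg.norm(V - ideal, axis=1)   # distance to ideal solution
    d_minus = np.linalg.norm(V - anti, axis=1)   # distance to anti-ideal solution
    closeness = d_minus / (d_plus + d_minus)
    print("best alternative:", int(np.argmax(closeness)), closeness.round(3))
    ```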

  16. Optimization of cladding parameters for resisting corrosion on low carbon steels using simulated annealing algorithm

    NASA Astrophysics Data System (ADS)

    Balan, A. V.; Shivasankaran, N.; Magibalan, S.

    2018-04-01

    Low carbon steels used in chemical industries are frequently affected by corrosion. Cladding is a surfacing process used for depositing a thick layer of filler metal on highly corrosive materials to achieve corrosion resistance. Flux cored arc welding (FCAW) is preferred in the cladding process due to its augmented efficiency and higher deposition rate. In this cladding process, the effect of corrosion can be minimized by controlling the output responses such as minimizing dilution and penetration and maximizing bead width, reinforcement and ferrite number. This paper deals with the multi-objective optimization of flux cored arc welding responses by controlling the process parameters such as wire feed rate, welding speed, nozzle-to-plate distance, and welding gun angle for super duplex stainless steel material using the simulated annealing technique. A regression equation has been developed and validated using the ANOVA technique. The multi-objective optimization of weld bead parameters was carried out using simulated annealing to obtain optimum bead geometry for reducing corrosion. The potentiodynamic polarization test reveals the balanced formation of fine particles of ferrite and austenite content with the desensitized nature of the microstructure in the optimized clad bead.
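
    A generic simulated annealing loop of the kind used for such parameter optimization is sketched below; the objective function, regression coefficients, parameter bounds, and cooling schedule are all invented placeholders, not the paper's validated regression models.

    ```python
    import numpy as np

    def objective(x):
        """Hypothetical composite objective built from made-up regression models:
        penalize dilution and penetration, reward bead width."""
        feed, speed, npd, angle = x
        dilution    = 12.0 + 0.8 * feed - 0.5 * speed
        penetration = 1.5 + 0.05 * feed - 0.02 * npd
        bead_width  = 8.0 + 0.3 * feed - 0.1 * speed + 0.05 * angle
        return dilution + penetration - 0.5 * bead_width

    rng = np.random.default_rng(2)
    lo = np.array([8.0, 20.0, 15.0, 70.0])     # assumed lower bounds of parameters
    hi = np.array([14.0, 40.0, 25.0, 110.0])   # assumed upper bounds

    x = (lo + hi) / 2.0
    best, f_best = x.copy(), objective(x)
    T = 1.0
    for _ in range(5000):
        cand = np.clip(x + rng.normal(0.0, 0.05, 4) * (hi - lo), lo, hi)
        df = objective(cand) - objective(x)
        if df < 0 or rng.random() < np.exp(-df / T):   # Metropolis acceptance rule
            x = cand
            if objective(x) < f_best:
                best, f_best = x.copy(), objective(x)
        T *= 0.999                                     # geometric cooling schedule
    print(best.round(2), round(f_best, 3))
    ```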

  17. Model reduction in integrated controls-structures design

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.

    1993-01-01

    It is the objective of this paper to present a model reduction technique developed for the integrated controls-structures design of flexible structures. Integrated controls-structures design problems are typically posed as nonlinear mathematical programming problems, where the design variables consist of both structural and control parameters. In the solution process, both structural and control design variables are constantly changing; therefore, the dynamic characteristics of the structure are also changing. This presents a problem in obtaining a reduced-order model for active control design and analysis which will be valid for all design points within the design space. In other words, the frequency and number of the significant modes of the structure (modes that should be included) may vary considerably throughout the design process. This is also true as the locations and/or masses of the sensors and actuators change. Moreover, since the number of design evaluations in the integrated design process could easily run into thousands, any feasible order-reduction method should not require model reduction analysis at every design iteration. In this paper a novel and efficient technique for model reduction in the integrated controls-structures design process, which addresses these issues, is presented.

  18. Digital image comparison by subtracting contextual transformations—percentile rank order differentiation

    USGS Publications Warehouse

    Wehde, M. E.

    1995-01-01

    The common method of digital image comparison by subtraction imposes various constraints on the image contents. Precise registration of images is required to assure proper evaluation of surface locations. The attribute being measured and the calibration and scaling of the sensor are also important to the validity and interpretability of the subtraction result. Influences of sensor gains and offsets complicate the subtraction process. The presence of any uniform systematic transformation component in one of two images to be compared distorts the subtraction results and requires analyst intervention to interpret or remove it. A new technique has been developed to overcome these constraints. Images to be compared are first transformed using the cumulative relative frequency as a transfer function. The transformed images represent the contextual relationship of each surface location with respect to all others within the image. The process of differentiating between the transformed images results in a percentile rank ordered difference. This process produces consistent terrain-change information even when the above requirements necessary for subtraction are relaxed. This technique may be valuable to an appropriately designed hierarchical terrain-monitoring methodology because it does not require human participation in the process.
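
    The percentile-rank (cumulative relative frequency) transform and differencing can be sketched as follows; the synthetic image pair and the simulated gain/offset change are assumptions for demonstration only.

    ```python
    import numpy as np
    from scipy.stats import rankdata

    def percentile_rank(image):
        """Transform pixel values to their cumulative relative frequency (0-1)."""
        flat = rankdata(image.ravel(), method="average") / image.size
        return flat.reshape(image.shape)

    # Synthetic "before" and "after" images; the second has a uniform gain/offset
    # change plus a genuine local change in one corner (hypothetical terrain change).
    rng = np.random.default_rng(3)
    before = rng.normal(100.0, 20.0, (64, 64))
    after = 1.3 * before + 15.0
    after[:16, :16] += 60.0

    diff = percentile_rank(after) - percentile_rank(before)
    print("mean |difference| in changed block :", np.abs(diff[:16, :16]).mean().round(3))
    print("mean |difference| elsewhere        :", np.abs(diff[16:, 16:]).mean().round(3))
    ```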

  19. Multi-Objective Optimization of Friction Stir Welding Process Parameters of AA6061-T6 and AA7075-T6 Using a Biogeography Based Optimization Algorithm.

    PubMed

    Tamjidy, Mehran; Baharudin, B T Hang Tuah; Paslar, Shahla; Matori, Khamirul Amin; Sulaiman, Shamsuddin; Fadaeifard, Firouz

    2017-05-15

    The development of Friction Stir Welding (FSW) has provided an alternative approach for producing high-quality welds, in a fast and reliable manner. This study focuses on the mechanical properties of the dissimilar friction stir welding of AA6061-T6 and AA7075-T6 aluminum alloys. The FSW process parameters such as tool rotational speed, tool traverse speed, tilt angle, and tool offset influence the mechanical properties of the friction stir welded joints significantly. A mathematical regression model is developed to determine the empirical relationship between the FSW process parameters and mechanical properties, and the results are validated. In order to obtain the optimal values of process parameters that simultaneously optimize the ultimate tensile strength, elongation, and minimum hardness in the heat affected zone (HAZ), a metaheuristic, multi-objective algorithm based on biogeography-based optimization is proposed. The Pareto optimal frontiers for triple and dual objective functions are obtained and the best optimal solution is selected by using two different decision-making techniques: technique for order of preference by similarity to ideal solution (TOPSIS) and Shannon's entropy.

  20. Natural Fiber Composite Retting, Preform Manufacture and Molding (Project 18988/Agreement 16313)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simmons, Kevin L.; Howe, Daniel T.; Laddha, Sachin

    2009-12-31

    Plant-based natural fibers can be used in place of glass in fiber reinforced automotive composites to reduce weight, cost and provide environmental benefits. Current automotive applications use natural fibers in injection molded thermoplastics for interior, non-structural applications. Compression molded natural fiber reinforced thermosets have the opportunity to extend natural fiber composite applications to structural and semi-structural parts and exterior parts, realizing further vehicle weight savings. The development of low cost molding and fiber processing techniques for large volumes of natural fibers has helped in understanding the barriers of non-aqueous retting. The retting process has a significant effect on the fiber quality and its processing ability, which is related to the natural fiber composite mechanical properties. PNNL has developed a compression molded fiber reinforced composite system which is the basis for future preforming activities and fiber treatment. We are using this process to develop preforming techniques and to validate fiber treatment methods relative to OEM provided application specifications. It is anticipated that next fiscal year will include demonstration of larger quantities of SMC materials and molding of larger, more complex components, with a more complete testing regimen in coordination with Tier suppliers under OEM guidance.

  1. Robust Library Building for Autonomous Classification of Downhole Geophysical Logs Using Gaussian Processes

    NASA Astrophysics Data System (ADS)

    Silversides, Katherine L.; Melkumyan, Arman

    2017-03-01

    Machine learning techniques such as Gaussian Processes can be used to identify stratigraphically important features in geophysical logs. The marker shales in the banded iron formation hosted iron ore deposits of the Hamersley Ranges, Western Australia, form distinctive signatures in the natural gamma logs. The identification of these marker shales is important for stratigraphic identification of unit boundaries for the geological modelling of the deposit. Each machine learning technique has unique properties that will impact the results. For Gaussian Processes (GPs), the output values are inclined towards the mean value, particularly when there is not sufficient information in the library. The impact that these inclinations have on the classification can vary depending on the parameter values selected by the user. Therefore, when applying machine learning techniques, care must be taken to fit the technique to the problem correctly. This study focuses on optimising the settings and choices for training a GPs system to identify a specific marker shale. We show that the final results converge even when different, but equally valid, starting libraries are used for the training. To analyse the impact on feature identification, GP models were trained so that the output was inclined towards a positive, neutral or negative output. For this type of classification, the best results were obtained when the pull was towards a negative output. We also show that the GP output can be adjusted by using a standard deviation coefficient that changes the balance between certainty and accuracy in the results.
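
    A toy version of training a GP on labelled natural-gamma windows, with outputs pulled toward the prior mean (zero) when the library is uninformative, might look like the sketch below; the synthetic windows, kernel choice, and labels are assumptions and do not reflect the deposit data or the GP implementation used in the study.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    # Hypothetical training library: short natural-gamma windows labelled +1 if
    # they contain a marker-shale-like signature and -1 otherwise (synthetic data).
    rng = np.random.default_rng(4)
    def window(marker):
        base = rng.normal(50.0, 5.0, 20)
        if marker:
            base[8:12] += 80.0          # crude stand-in for a marker-shale peak
        return base

    X = np.array([window(i % 2 == 0) for i in range(40)])
    y = np.array([1.0 if i % 2 == 0 else -1.0 for i in range(40)])

    gp = GaussianProcessRegressor(kernel=RBF(length_scale=50.0) + WhiteKernel(1e-2))
    gp.fit(X, y)

    test = np.array([window(True), window(False)])
    mean, std = gp.predict(test, return_std=True)
    print(mean.round(2), std.round(2))   # outputs are pulled toward 0 when uncertain
    ```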

  2. Electronic Two-Transition-Induced Enhancement of Emission Efficiency in Polymer Light-Emitting Diodes

    PubMed Central

    Chen, Ren-Ai; Wang, Cong; Li, Sheng; George, Thomas F.

    2013-01-01

    With the development of experimental techniques, effective injection and transport of electrons has been shown to be a way to obtain polymer light-emitting diodes (PLEDs) with high quantum efficiency. This paper reveals a valid mechanism for the enhancement of quantum efficiency in PLEDs. When an external electric field is applied, the interaction between a negative polaron and a triplet exciton leads to an electronic two-transition process, which induces the exciton to emit light and thus improve the emission efficiency of PLEDs. PMID:28809346

  3. Using Genotype Abundance to Improve Phylogenetic Inference

    PubMed Central

    Mesin, Luka; Victora, Gabriel D; Minin, Vladimir N; Matsen, Frederick A

    2018-01-01

    Modern biological techniques enable very dense genetic sampling of unfolding evolutionary histories, and thus frequently sample some genotypes multiple times. This motivates strategies to incorporate genotype abundance information in phylogenetic inference. In this article, we synthesize a stochastic process model with standard sequence-based phylogenetic optimality, and show that tree estimation is substantially improved by doing so. Our method is validated with extensive simulations and an experimental single-cell lineage tracing study of germinal center B cell receptor affinity maturation. PMID:29474671

  4. Edge enhancement and image equalization by unsharp masking using self-adaptive photochromic filters.

    PubMed

    Ferrari, José A; Flores, Jorge L; Perciante, César D; Frins, Erna

    2009-07-01

    A new method for real-time edge enhancement and image equalization using photochromic filters is presented. The reversible self-adaptive capacity of photochromic materials is used for creating an unsharp mask of the original image. This unsharp mask produces a kind of self-filtering of the original image. Unlike the usual Fourier (coherent) image processing, the technique we propose can also be used with incoherent illumination. Validation experiments with bacteriorhodopsin and photochromic glass are presented.
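
    The paper's scheme is optical, but the underlying unsharp-masking idea is easy to demonstrate digitally; the sketch below uses a Gaussian blur as the unsharp mask on a synthetic image, with the blur width and amount chosen arbitrarily.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def unsharp_mask(image, sigma=3.0, amount=1.0):
        """Digital analogue of the optical scheme: subtract a blurred (unsharp)
        copy of the image from the original to enhance edges."""
        blurred = gaussian_filter(image, sigma=sigma)
        return image + amount * (image - blurred)

    rng = np.random.default_rng(5)
    img = np.zeros((128, 128))
    img[32:96, 32:96] = 1.0                      # synthetic bright square
    img += rng.normal(0.0, 0.02, img.shape)

    sharpened = unsharp_mask(img, sigma=2.0, amount=1.5)
    print("edge contrast before:", round(float(img[32, 40] - img[30, 40]), 3))
    print("edge contrast after :", round(float(sharpened[32, 40] - sharpened[30, 40]), 3))
    ```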

  5. IMAGESEER - IMAGEs for Education and Research

    NASA Technical Reports Server (NTRS)

    Le Moigne, Jacqueline; Grubb, Thomas; Milner, Barbara

    2012-01-01

    IMAGESEER is a new Web portal that brings easy access to NASA image data for non-NASA researchers, educators, and students. The IMAGESEER Web site and database are specifically designed to be utilized by the university community, to enable teaching image processing (IP) techniques on NASA data, as well as to provide reference benchmark data to validate new IP algorithms. Along with the data and a Web user interface front-end, basic knowledge of the application domains, benchmark information, and specific NASA IP challenges (or case studies) are provided.

  6. Edge enhancement of color images using a digital micromirror device.

    PubMed

    Di Martino, J Matías; Flores, Jorge L; Ayubi, Gastón A; Alonso, Julia R; Fernández, Ariel; Ferrari, José A

    2012-06-01

    A method for orientation-selective enhancement of edges in color images is proposed. The method utilizes the capacity of digital micromirror devices to generate a positive and a negative color replica of the image used as input. When both images are slightly displaced and imaged together, one obtains an image with enhanced edges. The proposed technique does not require a coherent light source or precise alignment. The proposed method could potentially be useful for processing large image sequences in real time. Validation experiments are presented.

  7. Modeling Amorphous Microporous Polymers for CO2 Capture and Separations.

    PubMed

    Kupgan, Grit; Abbott, Lauren J; Hart, Kyle E; Colina, Coray M

    2018-06-13

    This review concentrates on the advances of atomistic molecular simulations to design and evaluate amorphous microporous polymeric materials for CO2 capture and separations. A description of atomistic molecular simulations is provided, including simulation techniques, structural generation approaches, relaxation and equilibration methodologies, and considerations needed for validation of simulated samples. The review provides general guidelines and a comprehensive update of the recent literature (since 2007) to promote the acceleration of the discovery and screening of amorphous microporous polymers for CO2 capture and separation processes.

  8. ASDC Advances in the Utilization of Microservices and Hybrid Cloud Environments

    NASA Astrophysics Data System (ADS)

    Baskin, W. E.; Herbert, A.; Mazaika, A.; Walter, J.

    2017-12-01

    The Atmospheric Science Data Center (ASDC) is transitioning many of its software tools and applications to standalone microservices deployable in a hybrid cloud, offering benefits such as scalability and efficient environment management. This presentation features several projects the ASDC staff have implemented leveraging the OpenShift Container Application Platform and OpenStack Hybrid Cloud Environment, focusing on key tools and techniques applied to: Earth Science data processing; Spatial-Temporal metadata generation, validation, repair, and curation; and Archived Data discovery, visualization, and access.

  9. Use of TV in space science activities - Some considerations. [onboard primary experimental data recording

    NASA Technical Reports Server (NTRS)

    Bannister, T. C.

    1977-01-01

    Advantages in the use of TV on board satellites as the primary data-recording system in a manned space laboratory when certain types of experiments are flown are indicated. Real-time or near-real-time validation, elimination of film weight, improved depth of field and low-light sensitivity, and better adaptability to computer and electronic processing of data are spelled out as advantages of TV over photographic techniques, say, in fluid dynamics experiments, and weightlessness studies.

  10. The development of learning materials based on the CORE model to improve students’ learning outcomes in the topic of Chemical Bonding

    NASA Astrophysics Data System (ADS)

    Avianti, R.; Suyatno; Sugiarto, B.

    2018-04-01

    This study aims to create appropriate learning materials based on the CORE (Connecting, Organizing, Reflecting, Extending) model to improve students’ learning achievement in the Chemical Bonding topic. This study used the 4-D model as the research design and a one-group pretest-posttest design for the material treatment. The subjects of the study were teaching materials based on the CORE model, tried out with 30 grade-10 science students. The data collection process involved techniques such as validation, observation, tests, and questionnaires. The findings were that: (1) all the contents were valid, and (2) the practicality and the effectiveness of all the contents were good. The conclusion of this research was that the CORE model is appropriate to improve students’ learning outcomes for studying Chemical Bonding.

  11. Post-image acquisition processing approaches for coherent backscatter validation

    NASA Astrophysics Data System (ADS)

    Smith, Christopher A.; Belichki, Sara B.; Coffaro, Joseph T.; Panich, Michael G.; Andrews, Larry C.; Phillips, Ronald L.

    2014-10-01

    Utilizing a retro-reflector from a target point, the reflected irradiance of a laser beam traveling back toward the transmitting point contains a peak of intensity known as the enhanced backscatter (EBS) phenomenon. EBS is dependent on the strength regime of turbulence currently occurring within the atmosphere as the beam propagates across and back. In order to capture and analyze this phenomenon so that it may be compared to theory, an imaging system is integrated into the optical setup. With proper imaging established, we are able to implement various post-image acquisition techniques to help determine detection and positioning of EBS, which can then be validated with theory by inspection of certain dependent meteorological parameters such as the refractive index structure parameter, Cn2, and wind speed.

  12. Spectroscopic characterization and quantitative determination of atorvastatin calcium impurities by novel HPLC method

    NASA Astrophysics Data System (ADS)

    Gupta, Lokesh Kumar

    2012-11-01

    Seven process-related impurities were identified by LC-MS in the atorvastatin calcium drug substance. The structures of the impurities were confirmed by modern spectroscopic techniques such as 1H NMR and IR, and by physicochemical studies conducted using synthesized authentic reference compounds. The synthesized reference samples of the impurity compounds were used for the quantitative HPLC determination. These impurities were detected by a newly developed gradient, reverse-phase high-performance liquid chromatographic (HPLC) method. The system suitability of HPLC analysis established the validity of the separation. The analytical method was validated according to International Conference on Harmonisation (ICH) guidelines with respect to specificity, precision, accuracy, linearity, robustness and stability of analytical solutions to demonstrate the power of the newly developed HPLC method.

  13. Another dimension to metamorphic phase equilibria: the power of interactive movies for understanding complex phase diagram sections

    NASA Astrophysics Data System (ADS)

    Moulas, E.; Caddick, M. J.; Tisato, N.; Burg, J.-P.

    2012-04-01

    The investigation of metamorphic phase equilibria, using software packages that perform thermodynamic calculations, involves a series of important assumptions whose validity can often be questioned but are difficult to test. For example, potential influences of deformation on phase relations, and modification of effective reactant composition (X) at successive stages of equilibrium may both introduce significant uncertainty into phase diagram calculations. This is generally difficult to model with currently available techniques, and is typically not well quantified. We present here a method to investigate such phenomena along pre-defined Pressure-Temperature (P-T) paths, calculating local equilibrium via Gibbs energy minimization. An automated strategy to investigate complex changes in the effective equilibration composition has been developed. This demonstrates the consequences of specified X modification and, more importantly, permits automated calculation of X changes that are likely along the requested path if considering several specified processes. Here we describe calculations considering two such processes and show an additional example of a metamorphic texture that is difficult to model with current techniques. Firstly, we explore the assumption that although water saturation and bulk-rock equilibrium are generally considered to be valid assumptions in the calculation of phase equilibria, the saturation of thermodynamic components ignores mechanical effects that the fluid/melt phase can impose on the rock, which in turn can modify the effective equilibrium composition. Secondly, we examine how mass fractionation caused by porphyroblast growth at low temperatures or progressive melt extraction at high temperatures successively modifies X out of the plane of the initial diagram, complicating the process of determining best-fit P-T paths for natural samples. In particular, retrograde processes are poorly modeled without careful consideration of prograde fractionation processes. Finally we show how, although the effective composition of symplectite growth is not easy to determine and quantify, it is possible to successfully model by constructing a series of phase equilibria calculations.

  14. A Hardware Model Validation Tool for Use in Complex Space Systems

    NASA Technical Reports Server (NTRS)

    Davies, Misty Dawn; Gundy-Burlet, Karen L.; Limes, Gregory L.

    2010-01-01

    One of the many technological hurdles that must be overcome in future missions is the challenge of validating as-built systems against the models used for design. We propose a technique composed of intelligent parameter exploration in concert with automated failure analysis as a scalable method for the validation of complex space systems. The technique is impervious to discontinuities and linear dependencies in the data, and can handle dimensionalities consisting of hundreds of variables over tens of thousands of experiments.

  15. Novel Hybrid Scheduling Technique for Sensor Nodes with Mixed Criticality Tasks.

    PubMed

    Micea, Mihai-Victor; Stangaciu, Cristina-Sorina; Stangaciu, Valentin; Curiac, Daniel-Ioan

    2017-06-26

    Sensor networks are increasingly becoming a key technology for complex control applications. Their potential use in safety- and time-critical domains has raised the need for task scheduling mechanisms specially adapted to sensor node specific requirements, often materialized in the predictable, jitter-less execution of tasks characterized by different criticality levels. This paper offers an efficient scheduling solution, named Hybrid Hard Real-Time Scheduling (H²RTS), which combines a static, clock driven method with a dynamic, event driven scheduling technique, in order to provide high execution predictability, while keeping a high node Central Processing Unit (CPU) utilization factor. From the detailed, integrated schedulability analysis of the H²RTS, a set of sufficiency tests is introduced and demonstrated based on the processor demand and linear upper bound metrics. The performance and correct behavior of the proposed hybrid scheduling technique have been extensively evaluated and validated both on a simulator and on a sensor mote equipped with an ARM7 microcontroller.
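
    As a generic illustration of a processor-demand style sufficiency test (not the specific H²RTS analysis), the sketch below checks a hypothetical periodic task set with implicit deadlines.

    ```python
    # Generic processor-demand sufficiency check for a periodic task set with
    # implicit deadlines (deadline = period). Illustrative only: the task set
    # (execution time, period) in ms is made up.
    from math import lcm

    tasks = [(1, 10), (2, 15), (3, 30)]

    def demand(t):
        """Total execution time of all jobs with deadlines at or before time t."""
        return sum((t // p) * c for c, p in tasks)

    hyperperiod = lcm(*(p for _, p in tasks))
    feasible = all(demand(t) <= t for t in range(1, hyperperiod + 1))
    utilization = sum(c / p for c, p in tasks)
    print(f"utilization={utilization:.3f}, schedulable (sufficient test): {feasible}")
    ```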

  16. Evaluation of Inversion Methods Applied to Ionospheric RO Observations

    NASA Astrophysics Data System (ADS)

    Rios Caceres, Arq. Estela Alejandra; Rios, Victor Hugo; Guyot, Elia

    The new technique of radio occultation can be used to study the Earth's ionosphere. The retrieval processes of ionospheric profiling from radio occultation observations usually assume spherical symmetry of the electron density distribution at the locality of occultation and use the Abel integral transform to invert the measured total electron content (TEC) values. This paper presents a set of ionospheric profiles obtained from the SAC-C satellite with the Abel inversion technique. The effects of the ionosphere on the GPS signal during occultation, such as bending and scintillation, are examined. Electron density profiles are obtained using the Abel inversion technique. Ionospheric radio occultations are validated using vertical profiles of electron concentration from inverted ionograms, obtained from ionosonde soundings in the vicinity of the occultation. Results indicate that the Abel transform works well in the mid-latitudes during the daytime, but is less accurate during the night-time.
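
    Under the spherical symmetry assumption, the Abel-type inversion can be illustrated with a simple onion-peeling scheme that recovers a layered electron density profile from slant TEC values; the radii, shell discretization, and synthetic profile below are assumptions in arbitrary units, not SAC-C data, and the real retrieval also involves bending-angle processing.

    ```python
    import numpy as np

    # Shell boundaries (km) and a synthetic layered electron density profile
    r = np.linspace(6671.0, 7071.0, 41)
    ne_true = 1e12 * np.exp(-((r[:-1] - 6871.0) / 60.0) ** 2)

    def path_lengths(r, i):
        """Chord lengths through each shell j >= i for a ray tangent at r[i]."""
        rj, rj1 = r[i:-1], r[i + 1:]
        return 2.0 * (np.sqrt(rj1**2 - r[i]**2) - np.sqrt(rj**2 - r[i]**2))

    # Forward model: slant TEC for rays tangent to each shell boundary
    tec = np.array([path_lengths(r, i) @ ne_true[i:] for i in range(len(r) - 1)])

    # Onion peeling: solve from the outermost shell inwards
    ne_est = np.zeros_like(ne_true)
    for i in range(len(r) - 2, -1, -1):
        L = path_lengths(r, i)
        ne_est[i] = (tec[i] - L[1:] @ ne_est[i + 1:]) / L[0]

    print("max relative error:", float(np.max(np.abs(ne_est - ne_true) / ne_true.max())))
    ```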

  17. Fiber-wireless integrated mobile backhaul network based on a hybrid millimeter-wave and free-space-optics architecture with an adaptive diversity combining technique.

    PubMed

    Zhang, Junwen; Wang, Jing; Xu, Yuming; Xu, Mu; Lu, Feng; Cheng, Lin; Yu, Jianjun; Chang, Gee-Kung

    2016-05-01

    We propose and experimentally demonstrate a novel fiber-wireless integrated mobile backhaul network based on a hybrid millimeter-wave (MMW) and free-space-optics (FSO) architecture using an adaptive combining technique. Both 60 GHz MMW and FSO links are demonstrated and fully integrated with optical fibers in a scalable and cost-effective backhaul system setup. Joint signal processing with an adaptive diversity combining technique (ADCT) is utilized at the receiver side based on a maximum ratio combining algorithm. Mobile backhaul transportation of 4-Gb/s 16 quadrature amplitude modulation orthogonal frequency-division multiplexing (QAM-OFDM) data is experimentally demonstrated and tested under various weather conditions synthesized in the lab. Performance improvements in terms of reduced error vector magnitude (EVM) and enhanced link reliability are validated under fog, rain, and turbulence conditions.
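
    A toy illustration of maximum ratio combining of two branches is given below; BPSK symbols and fixed branch SNRs are simplifying assumptions and do not reflect the 16-QAM-OFDM experiment or its adaptive weight estimation.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    symbols = rng.choice([-1.0, 1.0], size=10000)   # BPSK stand-in for simplicity

    snr_mmw, snr_fso = 2.0, 1.0                     # assumed linear branch SNRs
    r_mmw = symbols + rng.normal(0.0, np.sqrt(1.0 / snr_mmw), symbols.size)
    r_fso = symbols + rng.normal(0.0, np.sqrt(1.0 / snr_fso), symbols.size)

    # Maximum ratio combining: weight each branch by its SNR before combining
    combined = (snr_mmw * r_mmw + snr_fso * r_fso) / (snr_mmw + snr_fso)

    def ber(rx):
        """Bit error rate after hard-decision detection."""
        return np.mean(np.sign(rx) != symbols)

    print(f"BER mmw={ber(r_mmw):.4f}  fso={ber(r_fso):.4f}  mrc={ber(combined):.4f}")
    ```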

  18. FEM Techniques for High Stress Detection in Accelerated Fatigue Simulation

    NASA Astrophysics Data System (ADS)

    Veltri, M.

    2016-09-01

    This work presents the theory and a numerical validation study in support of a novel method for a priori identification of fatigue critical regions, with the aim to accelerate durability design in large FEM problems. The investigation is placed in the context of modern full-body structural durability analysis, where a computationally intensive dynamic solution could be required to identify areas with potential for fatigue damage initiation. The early detection of fatigue critical areas can drive a simplification of the problem size, leading to a sensible improvement in solution time and model handling while allowing processing of the critical areas in higher detail. The proposed technique is applied to a real-life industrial case in a comparative assessment with established practices. Synthetic damage prediction quantification and visualization techniques allow for a quick and efficient comparison between methods, outlining potential application benefits and boundaries.

  19. Pressure dependence of thermal conductivity and specific heat in CeRh2Si2 measured by an extended thermal relaxation method

    NASA Astrophysics Data System (ADS)

    Nishigori, Shijo; Seida, Osamu

    2018-05-01

    We have developed a new technique for measuring thermal conductivity and specific heat under pressure by improving a thermal relaxation method. In this technique, a cylindrical sample with a small disc heater is embedded in the pressure-transmitting medium, and the temperature variations of the sample and heater are directly measured by thermocouples during a heating and cooling process. Thermal conductivity and specific heat are estimated by comparing the experimental data with temperature variations simulated by a finite element method. The obtained thermal conductivity and specific heat of the test sample CeRh2Si2 exhibit a small enhancement and a clear peak arising from the antiferromagnetic transition, respectively. The observation of these typical behaviors for magnetic compounds indicates that the technique is valid for the study of thermal properties under pressure.

  20. Tailored Welding Technique for High Strength Al-Cu Alloy for Higher Mechanical Properties

    NASA Astrophysics Data System (ADS)

    Biradar, N. S.; Raman, R.

    AA2014 aluminum alloy, with 4.5% Cu as the major alloying element, offers the highest strength and hardness values in the T6 temper and finds extensive use in aircraft primary structures. However, this alloy is difficult to weld by fusion welding because the dendritic structure formed can seriously affect weld properties. Among the welding processes, the AC-TIG technique is largely used for welding. As-welded yield strength was in the range of 190-195 MPa using the conventional TIG technique. The welding metallurgy of AA2014 was critically reviewed and the factors responsible for lower properties were identified. Square-wave AC TIG with transverse mechanical arc oscillation (TMAO) was postulated to improve the weld strength. Systematic experimentation using 4 mm thick plates produced YS in the range of 230-240 MPa. Thorough characterization, including optical microscopy and SEM/EDX, was conducted to validate the metallurgical phenomena attributable to the improvement in weld properties.

  1. Efficient finite element simulation of slot spirals, slot radomes and microwave structures

    NASA Technical Reports Server (NTRS)

    Gong, J.; Volakis, J. L.

    1995-01-01

    This progress report contains the following two documents: (1) 'Efficient Finite Element Simulation of Slot Antennas using Prismatic Elements' - A hybrid finite element-boundary integral (FE-BI) simulation technique is discussed to treat narrow slot antennas etched on a planar platform. Specifically, the prismatic elements are used to reduce the redundant sampling rates and ease the mesh generation process. Numerical results for an antenna slot and frequency selective surfaces are presented to demonstrate the validity and capability of the technique; and (2) 'Application and Design Guidelines of the PML Absorber for Finite Element Simulations of Microwave Packages' - The recently introduced perfectly matched layer (PML) uniaxial absorber for frequency domain finite element simulations has several advantages. In this paper we present the application of PML for microwave circuit simulations along with design guidelines to obtain a desired level of absorption. Different feeding techniques are also investigated for improved accuracy.

  2. Newmark-Beta-FDTD method for super-resolution analysis of time reversal waves

    NASA Astrophysics Data System (ADS)

    Shi, Sheng-Bing; Shao, Wei; Ma, Jing; Jin, Congjun; Wang, Xiao-Hua

    2017-09-01

    In this work, a new unconditionally stable finite-difference time-domain (FDTD) method with the split-field perfectly matched layer (PML) is proposed for the analysis of time reversal (TR) waves. The proposed method is very suitable for multiscale problems involving microstructures. The spatial and temporal derivatives in this method are discretized by the central difference technique and Newmark-Beta algorithm, respectively, and the derivation results in the calculation of a banded-sparse matrix equation. Since the coefficient matrix keeps unchanged during the whole simulation process, the lower-upper (LU) decomposition of the matrix needs to be performed only once at the beginning of the calculation. Moreover, the reverse Cuthill-Mckee (RCM) technique, an effective preprocessing technique in bandwidth compression of sparse matrices, is used to improve computational efficiency. The super-resolution focusing of TR wave propagation in two- and three-dimensional spaces is included to validate the accuracy and efficiency of the proposed method.

  3. Guided Inquiry with Cognitive Conflict Strategy: Drilling Indonesian High School Students’ Creative Thinking Skills

    NASA Astrophysics Data System (ADS)

    Syadzili, A. F.; Soetjipto; Tukiran

    2018-01-01

    This research aims to produce physics learning materials for Indonesian high schools using guided inquiry with a cognitive conflict strategy to drill students’ creative thinking skills in static fluid learning. This development research used the 4D model with a one-group pre-test and post-test design, implemented with eleventh grade students in the second semester of the 2016/2017 academic year. The data were collected by validation sheets, questionnaires, tests and observations, while the data analysis technique was descriptive quantitative analysis. This research obtained several findings: the learning materials developed had an average validity score in the very valid category; the lesson plan could be implemented very well; the students’ responses toward the learning process were very positive, with students showing interest in following the lessons; and students’ creative thinking skills, which were inadequate before the product was implemented, became very creative afterwards. The results of the research suggest that guided inquiry may stimulate students to think creatively.

  4. The use of concept mapping for scale development and validation in evaluation.

    PubMed

    Rosas, Scott R; Camphausen, Lauren C

    2007-05-01

    Evaluators often make key decisions about what content to include when designing new scales. However, without clear conceptual grounding, there is a risk these decisions may compromise the scale's validity. Techniques such as concept mapping are available to evaluators for the specification of conceptual frameworks, but have not been used as a fully integrated part of scale development. As part of a multi-site evaluation of family support programs, we integrated concept mapping with traditional scale-development processes to strengthen the creation of a scale for inclusion in an evaluation instrument. Using concept mapping, we engaged staff and managers in the development of a framework of intended benefits of program participation and used the information to systematically select the scale's content. The psychometric characteristics of the scale were then formally assessed using a sample of program participants. The implications of the approach for supporting construct validity, inclusion of staff and managers, and theory-driven evaluation are discussed.

  5. Classification of images acquired with colposcopy using artificial neural networks.

    PubMed

    Simões, Priscyla W; Izumi, Narjara B; Casagrande, Ramon S; Venson, Ramon; Veronezi, Carlos D; Moretti, Gustavo P; da Rocha, Edroaldo L; Cechinel, Cristian; Ceretta, Luciane B; Comunello, Eros; Martins, Paulo J; Casagrande, Rogério A; Snoeyer, Maria L; Manenti, Sandra A

    2014-01-01

    To explore the advantages of using artificial neural networks (ANNs) to recognize patterns and classify images in colposcopy. Transversal, descriptive, and analytical study of a quantitative approach with an emphasis on diagnosis. The training, test, and validation sets were composed of images collected from patients who underwent colposcopy. These images were provided by a gynecology clinic located in the city of Criciúma (Brazil). The image database (n = 170) was divided; 48 images were used for the training process, 58 images were used for the tests, and 64 images were used for the validation. A hybrid neural network based on Kohonen self-organizing maps and multilayer perceptron (MLP) networks was used. After 126 cycles, the validation was performed. The best results reached an accuracy of 72.15%, a sensitivity of 69.78%, and a specificity of 68%. Although the preliminary results still exhibit an average efficiency, the present approach is an innovative and promising technique that should be deeply explored in the context of the present study.

  6. Validity Evidence in Scale Development: The Application of Cross Validation and Classification-Sequencing Validation

    ERIC Educational Resources Information Center

    Acar, Tülin

    2014-01-01

    In literature, it has been observed that many enhanced criteria are limited by factor analysis techniques. Besides examinations of statistical structure and/or psychological structure, such validity studies as cross validation and classification-sequencing studies should be performed frequently. The purpose of this study is to examine cross…

  7. Structural exploration for the refinement of anticancer matrix metalloproteinase-2 inhibitor designing approaches through robust validated multi-QSARs

    NASA Astrophysics Data System (ADS)

    Adhikari, Nilanjan; Amin, Sk. Abdul; Saha, Achintya; Jha, Tarun

    2018-03-01

    Matrix metalloproteinase-2 (MMP-2) is a promising pharmacological target for designing potential anticancer drugs. MMP-2 plays critical functions in apoptosis by cleaving the DNA repair enzyme poly (ADP-ribose) polymerase (PARP). Moreover, MMP-2 expression triggers the vascular endothelial growth factor (VEGF), having a positive influence on tumor size, invasion, and angiogenesis. Therefore, there is an urgent need to develop potential MMP-2 inhibitors without any toxicity but with better pharmacokinetic properties. In this article, robust validated multi-quantitative structure-activity relationship (QSAR) modeling approaches were attempted on a dataset of 222 MMP-2 inhibitors to explore the important structural and pharmacophoric requirements for higher MMP-2 inhibition. Different validated regression and classification-based QSARs, pharmacophore mapping and 3D-QSAR techniques were performed. These results were challenged and subjected to further validation to explain 24 in-house MMP-2 inhibitors to judge the reliability of these models further. All these models were individually validated internally as well as externally and were supported and validated by each other. These results were further justified by molecular docking analysis. The modeling techniques adopted here help not only to explore the necessary structural and pharmacophoric requirements but also to support the overall validation and refinement process for designing potential MMP-2 inhibitors.

  8. Three-dimensional modeling and animation of two carpal bones: a technique.

    PubMed

    Green, Jason K; Werner, Frederick W; Wang, Haoyu; Weiner, Marsha M; Sacks, Jonathan M; Short, Walter H

    2004-05-01

    The objectives of this study were to (a) create 3D reconstructions of two carpal bones from single CT data sets and animate these bones with experimental in vitro motion data collected during dynamic loading of the wrist joint, (b) develop a technique to calculate the minimum interbone distance between the two carpal bones, and (c) validate the interbone distance calculation process. This method utilized commercial software to create the animations and an in-house program to interface with three-dimensional CAD software to calculate the minimum distance between the irregular geometries of the bones. This interbone minimum distance provides quantitative information regarding the motion of the bones studied and may help to understand and quantify the effects of ligamentous injury.
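
    The in-house distance program is not described in detail in the abstract; one simple way to approximate a minimum interbone distance, assuming the bone surfaces are available as vertex point clouds and a rigid transform per motion frame, is a nearest-neighbour query against a k-d tree. All geometry below is synthetic.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(2)

    # Placeholder surface vertices for two carpal bones (e.g. reconstructed from CT data).
    scaphoid = rng.random((5000, 3)) * 10.0
    lunate = rng.random((5000, 3)) * 10.0 + np.array([12.0, 0.0, 0.0])

    def min_interbone_distance(verts_a, verts_b, R=np.eye(3), t=np.zeros(3)):
        """Minimum vertex-to-vertex distance after applying a rigid motion (R, t) to bone B."""
        moved_b = verts_b @ R.T + t
        tree = cKDTree(moved_b)
        d, _ = tree.query(verts_a, k=1)      # nearest neighbour in B for every vertex of A
        return d.min()

    # One measured motion frame (rotation about z plus a translation) applied to the second bone.
    theta = np.deg2rad(5.0)
    R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                  [np.sin(theta),  np.cos(theta), 0.0],
                  [0.0,            0.0,           1.0]])
    print("minimum interbone distance (mm):",
          min_interbone_distance(scaphoid, lunate, R, t=np.array([-1.0, 0.0, 0.0])))
    ```

    A vertex-to-vertex query slightly overestimates the true surface-to-surface distance; refining the mesh or querying against triangle faces tightens the estimate.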

  9. Validation and calibration of a TDLAS oxygen sensor for in-line measurement on flow-packed products

    NASA Astrophysics Data System (ADS)

    Cocola, L.; Fedel, M.; Allermann, H.; Landa, S.; Tondello, G.; Bardenstein, A.; Poletto, L.

    2016-05-01

    A device based on Tunable Diode Laser Absorption Spectroscopy has been developed for non-invasive evaluation of gaseous oxygen concentration inside packed food containers. This work has been done in the context of the SAFETYPACK European project in order to enable fully automated product testing on a production line. The chosen samples at the end of the manufacturing process are modified-atmosphere bags of processed mozzarella, in which the target oxygen concentration is required to be below 5%. The spectrometer allows in-line measurement of moving samples passing on a conveyor belt, with an optical layout optimized for bags made of a flexible scattering material, and works by sensing the gas phase in the headspace at the top of the package. A field-applicable method for the calibration of this device has been identified and validated against traditional, industry-standard, invasive measurement techniques. This allows some degrees of freedom for the end user regarding packaging dimensions and shape. After deployment and setup of the instrument at the end user's manufacturing site, performance has been evaluated on a different range of samples in order to validate the choice of electro-optical and geometrical parameters regarding sample handling and measurement timing at the actual measurement conditions.

  10. Gene network biological validity based on gene-gene interaction relevance.

    PubMed

    Gómez-Vela, Francisco; Díaz-Díaz, Norberto

    2014-01-01

    In recent years, gene networks have become one of the most useful tools for modeling biological processes. Many gene network inference algorithms have been developed as techniques for extracting knowledge from gene expression data. Ensuring the reliability of the inferred gene relationships is a crucial task in any study in order to prove that the algorithms used are precise. Usually, this validation process can be carried out using prior biological knowledge. The metabolic pathways stored in KEGG are one of the most widely used knowledge sources for analyzing relationships between genes. This paper introduces a new methodology, GeneNetVal, to assess the biological validity of gene networks based on the relevance of the gene-gene interactions stored in KEGG metabolic pathways. Hence, a complete conversion of KEGG pathways into a gene association network and a new matching distance based on gene-gene interaction relevance are proposed. The performance of GeneNetVal was established with three different experiments. First, our proposal is tested in a comparative ROC analysis. Second, a randomness study is presented to show the behavior of GeneNetVal when the noise is increased in the input network. Finally, the ability of GeneNetVal to detect the biological functionality of the network is shown.
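
    GeneNetVal's actual matching distance is not reproduced here; the sketch below only conveys the general idea of scoring an inferred network against a reference gene association network in which each interaction carries a relevance weight. Gene names, edges, and weights are invented for the example.

    ```python
    # An inferred network and a reference association network derived from a pathway,
    # with a relevance weight per reference interaction (all values are illustrative).
    inferred = {("geneA", "geneB"), ("geneB", "geneC"), ("geneA", "geneD")}
    reference_relevance = {
        ("geneA", "geneB"): 1.0,   # direct interaction in the pathway
        ("geneB", "geneC"): 0.5,   # indirect interaction (one intermediate gene)
        ("geneC", "geneE"): 1.0,
    }

    def undirected(edge):
        """Treat edges as undirected gene-gene associations."""
        return tuple(sorted(edge))

    ref = {undirected(e): w for e, w in reference_relevance.items()}
    inf = {undirected(e) for e in inferred}

    matched = sum(w for e, w in ref.items() if e in inf)
    weighted_recall = matched / sum(ref.values())       # relevance-weighted reference coverage
    precision = len(inf & ref.keys()) / len(inf)        # inferred edges supported by the reference
    print(f"relevance-weighted recall = {weighted_recall:.2f}, precision = {precision:.2f}")
    ```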

  11. In-line monitoring of extraction process of scutellarein from Erigeron breviscapus (vant.) Hand-Mazz based on qualitative and quantitative uses of near-infrared spectroscopy.

    PubMed

    Wu, Yongjiang; Jin, Ye; Ding, Haiying; Luan, Lianjun; Chen, Yong; Liu, Xuesong

    2011-09-01

    The application of near-infrared (NIR) spectroscopy for in-line monitoring of the extraction process of scutellarein from Erigeron breviscapus (vant.) Hand-Mazz was investigated. For NIR measurements, two fiber optic probes designed to transmit NIR radiation through a 2 mm pathlength flow cell were utilized to collect spectra in real time. High performance liquid chromatography (HPLC) was used as a reference method to determine scutellarein in the extract solution. A partial least squares regression (PLSR) calibration model built on Savitzky-Golay-smoothed NIR spectra in the 5450-10,000 cm(-1) region gave satisfactory predictive results for scutellarein. The results showed that the correlation coefficients of calibration and cross validation were 0.9967 and 0.9811, respectively, and the root mean square errors of calibration and cross validation were 0.044 and 0.105, respectively. Furthermore, both the moving block standard deviation (MBSD) method and a conformity test were used to identify the end point of the extraction process, providing real-time data and instant feedback about the extraction course. The results obtained in this study indicated that the NIR spectroscopy technique provides an efficient and environmentally friendly approach for fast determination of scutellarein and end point control of the extraction process. Copyright © 2011 Elsevier B.V. All rights reserved.
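
    A minimal sketch of this kind of chemometric calibration, using synthetic spectra and reference values in place of the NIR/HPLC data; the smoothing window and the number of latent variables are likewise assumptions.

    ```python
    import numpy as np
    from scipy.signal import savgol_filter
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(3)

    # Synthetic stand-ins: NIR absorbance spectra (rows) and HPLC reference values (y).
    wavenumbers = np.linspace(5450, 10000, 500)
    y = rng.uniform(0.1, 2.0, 120)                                   # scutellarein concentration
    peak = np.exp(-((wavenumbers - 7000.0) / 150.0) ** 2)
    spectra = y[:, None] * peak + rng.normal(scale=0.01, size=(120, 500))

    # Savitzky-Golay smoothing along the wavenumber axis before regression.
    spectra_sg = savgol_filter(spectra, window_length=15, polyorder=2, axis=1)

    pls = PLSRegression(n_components=5).fit(spectra_sg, y)
    y_cal = pls.predict(spectra_sg).ravel()
    y_cv = cross_val_predict(PLSRegression(n_components=5), spectra_sg, y, cv=10).ravel()

    rmse = lambda a, b: float(np.sqrt(np.mean((a - b) ** 2)))
    print("r(cal) =", np.corrcoef(y, y_cal)[0, 1], " r(cv) =", np.corrcoef(y, y_cv)[0, 1])
    print("RMSEC  =", rmse(y, y_cal), " RMSECV =", rmse(y, y_cv))
    ```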

  12. Comparison of infrared and 3D digital image correlation techniques applied for mechanical testing of materials

    NASA Astrophysics Data System (ADS)

    Krstulović-Opara, Lovre; Surjak, Martin; Vesenjak, Matej; Tonković, Zdenko; Kodvanj, Janoš; Domazet, Željko

    2015-11-01

    To investigate the applicability of infrared thermography as a tool for acquiring dynamic yielding in metals, a comparison of infrared thermography with three-dimensional digital image correlation has been made. Dynamic tension tests and three-point bending tests of aluminum alloys have been performed to evaluate results obtained by IR thermography in order to determine the capabilities and limits of the two methods. Both approaches detect plastification zone migrations during the yielding process. The results of the tension test and three-point bending test proved the validity of the IR approach as a method for evaluating the dynamic yielding process when used on complex structures such as cellular porous materials. The stability of the yielding process in the three-point bending test, in contrast to the fluctuation of the plastification front in the tension test, is of great importance for the validation of numerical constitutive models. The research proved the strong performance, robustness, and reliability of the IR approach when used to evaluate yielding during dynamic loading processes, while the 3D DIC method proved to be superior in the low-velocity loading regimes. This research, based on two basic tests, confirmed the conclusions and suggestions presented in our previous research on porous materials, where middle-wave infrared thermography was applied.

  13. Statistical methodology: II. Reliability and validity assessment in study design, Part B.

    PubMed

    Karras, D J

    1997-02-01

    Validity measures the correspondence between a test and other purported measures of the same or similar qualities. When a reference standard exists, a criterion-based validity coefficient can be calculated. If no such standard is available, the concepts of content and construct validity may be used, but quantitative analysis may not be possible. The Pearson and Spearman tests of correlation are often used to assess the correspondence between tests, but do not account for measurement biases and may yield misleading results. Techniques that measure intertest differences may be more meaningful in validity assessment, and the kappa statistic is useful for analyzing categorical variables. Questionnaires often can be designed to allow quantitative assessment of reliability and validity, although this may be difficult. Inclusion of homogeneous questions is necessary to assess reliability. Analysis is enhanced by using Likert scales or similar techniques that yield ordinal data. Validity assessment of questionnaires requires careful definition of the scope of the test and comparison with previously validated tools.
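
    For concreteness, the statistics singled out above can be computed as follows; the rater classifications and Likert-scale scores are invented for the example.

    ```python
    from sklearn.metrics import cohen_kappa_score
    from scipy.stats import spearmanr, pearsonr

    # Two raters classifying the same 12 cases into ordered categories (illustrative data).
    rater_a = ["mild", "mild", "severe", "moderate", "mild", "severe",
               "moderate", "mild", "severe", "moderate", "mild", "severe"]
    rater_b = ["mild", "moderate", "severe", "moderate", "mild", "severe",
               "moderate", "mild", "moderate", "moderate", "mild", "severe"]
    print("Cohen's kappa:", cohen_kappa_score(rater_a, rater_b))

    # Likert-scale scores from a new questionnaire and from a previously validated tool.
    new_tool = [1, 2, 2, 3, 4, 4, 5, 5, 3, 2]
    reference = [1, 1, 2, 3, 3, 4, 5, 4, 3, 2]
    print("Spearman rho:", spearmanr(new_tool, reference).correlation)
    print("Pearson r:   ", pearsonr(new_tool, reference)[0])
    ```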

  14. Validation of Immunohistochemical Assays for Integral Biomarkers in the NCI-MATCH EAY131 Clinical Trial.

    PubMed

    Khoury, Joseph D; Wang, Wei-Lien; Prieto, Victor G; Medeiros, L Jeffrey; Kalhor, Neda; Hameed, Meera; Broaddus, Russell; Hamilton, Stanley R

    2018-02-01

    Biomarkers that guide therapy selection are gaining unprecedented importance as targeted therapy options increase in scope and complexity. In conjunction with high-throughput molecular techniques, therapy-guiding biomarker assays based upon immunohistochemistry (IHC) have a critical role in cancer care in that they inform about the expression status of a protein target. Here, we describe the validation procedures for four clinical IHC biomarker assays-PTEN, RB, MLH1, and MSH2-for use as integral biomarkers in the nationwide NCI-Molecular Analysis for Therapy Choice (NCI-MATCH) EAY131 clinical trial. Validation procedures were developed through an iterative process based on collective experience and adaptation of broad guidelines from the FDA. The steps included primary antibody selection; assay optimization; development of assay interpretation criteria incorporating biological considerations; and expected staining patterns, including indeterminate results, orthogonal validation, and tissue validation. Following assay lockdown, patient samples and cell lines were used for analytic and clinical validation. The assays were then approved as laboratory-developed tests and used for clinical trial decisions for treatment selection. Calculations of sensitivity and specificity were undertaken using various definitions of gold-standard references, and external validation was required for the PTEN IHC assay. In conclusion, validation of IHC biomarker assays critical for guiding therapy in clinical trials is feasible using comprehensive preanalytic, analytic, and postanalytic steps. Implementation of standardized guidelines provides a useful framework for validating IHC biomarker assays that allow for reproducibility across institutions for routine clinical use. Clin Cancer Res; 24(3); 521-31. ©2017 AACR.

  15. AdViSHE: A Validation-Assessment Tool of Health-Economic Models for Decision Makers and Model Users.

    PubMed

    Vemer, P; Corro Ramos, I; van Voorn, G A K; Al, M J; Feenstra, T L

    2016-04-01

    A trade-off exists between building confidence in health-economic (HE) decision models and the use of scarce resources. We aimed to create a practical tool providing model users with a structured view into the validation status of HE decision models, to address this trade-off. A Delphi panel was organized, and was completed by a workshop during an international conference. The proposed tool was constructed iteratively based on comments from, and the discussion amongst, panellists. During the Delphi process, comments were solicited on the importance and feasibility of possible validation techniques for modellers, their relevance for decision makers, and the overall structure and formulation in the tool. The panel consisted of 47 experts in HE modelling and HE decision making from various professional and international backgrounds. In addition, 50 discussants actively engaged in the discussion at the conference workshop and returned 19 questionnaires with additional comments. The final version consists of 13 items covering all relevant aspects of HE decision models: the conceptual model, the input data, the implemented software program, and the model outcomes. Assessment of the Validation Status of Health-Economic decision models (AdViSHE) is a validation-assessment tool in which model developers report in a systematic way both on validation efforts performed and on their outcomes. Subsequently, model users can establish whether confidence in the model is justified or whether additional validation efforts should be undertaken. In this way, AdViSHE enhances transparency of the validation status of HE models and supports efficient model validation.

  16. Gas treatment in trickle-bed biofilters: biomass, how much is enough?

    PubMed

    Alonso, C; Suidan, M T; Sorial, G A; Smith, F L; Biswas, P; Smith, P J; Brenner, R C

    1997-06-20

    The objective of this article is to define and validate a mathematical model that describes the physical and biological processes occurring in a trickle-bed air biofilter for waste gas treatment. This model considers a two-phase system, quasi-steady-state processes, uniform bacterial population, and one limiting substrate. The variation of the specific surface area with bacterial growth is included in the model, and its effect on the biofilter performance is analyzed. This analysis leads to the conclusion that excessive accumulation of biomass in the reactor has a negative effect on contaminant removal efficiency. To solve this problem, excess biomass is removed via full media fluidization and backwashing of the biofilter. The backwashing technique is also incorporated in the model as a process variable. Experimental data from the biodegradation of toluene in a pilot system with four packed-bed reactors are used to validate the model. Once the model is calibrated with the estimation of the unknown parameters of the system, it is used to simulate the biofilter performance for different operating conditions. Model predictions are found to be in agreement with experimental data. (c) 1997 John Wiley & Sons, Inc. Biotechnol Bioeng 54: 583-594, 1997.

  17. 76 FR 4360 - Guidance for Industry on Process Validation: General Principles and Practices; Availability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-25

    ... elements of process validation for the manufacture of human and animal drug and biological products... process validation for the manufacture of human and animal drug and biological products, including APIs. This guidance describes process validation activities in three stages: In Stage 1, Process Design, the...

  18. Multiscale GPS tomography during COPS: validation and applications

    NASA Astrophysics Data System (ADS)

    Champollion, Cédric; Flamant, Cyrille; Masson, Frédéric; Gégout, Pascal; Boniface, Karen; Richard, Evelyne

    2010-05-01

    Accurate 3D description of the water vapour field is of interest for process studies such as convection initiation. None of the current techniques (LIDAR, satellite, radio soundings, GPS) can provide an all-weather, continuous 3D field of moisture. The combination of GPS tomography with radio soundings (and/or LIDAR) has been used for such process studies, exploiting both the vertical resolution of the soundings and the high temporal density of GPS measurements. GPS tomography has been used at short scale (10 km horizontal resolution within a 50 km² area) for process studies such as the ESCOMPTE experiment (Bastin et al., 2005) and at larger scale (50 km horizontal resolution) during IHOP_2002, but no extensive statistical validation has been done so far. The overarching goal of the COPS field experiment is to advance the quality of forecasts of orographically induced convective precipitation by four-dimensional observations and modeling of its life cycle, identifying the physical and chemical processes responsible for deficiencies in QPF over low-mountain regions. During the COPS field experiment, a network of about 100 GPS stations operated continuously for three months in an area of 500 km² in the east of France (Vosges Mountains) and the west of Germany (Black Forest). While the mean spacing between the GPS stations is about 50 km, an east-west GPS profile with a spacing of about 10 km is dedicated to high-resolution tomography. One major goal of the GPS COPS experiment is to validate GPS tomography at different spatial resolutions. Validation is based on additional radio soundings and airborne/ground-based LIDAR measurements. The number and the high quality of vertically resolved water vapor observations give a unique data set for GPS tomography validation. Numerous tests have been done on real data to show the types of water vapor structures that can be imaged by GPS tomography, depending on the assimilation of additional data (radio soundings), the resolution of the tomography grid, and the density of the GPS network. Finally, applications to different case studies will be briefly presented.

  19. A geomorphology-based ANFIS model for multi-station modeling of rainfall-runoff process

    NASA Astrophysics Data System (ADS)

    Nourani, Vahid; Komasi, Mehdi

    2013-05-01

    This paper demonstrates the potential use of Artificial Intelligence (AI) techniques for predicting daily runoff at multiple gauging stations. The uncertainty and complexity of the rainfall-runoff process, due to its variability in space and time on the one hand and the lack of historical data on the other, cause difficulties in the spatiotemporal modeling of the process. In this paper, an Integrated Geomorphological Adaptive Neuro-Fuzzy Inference System (IGANFIS) model coupled with the fuzzy C-means clustering algorithm was used for rainfall-runoff modeling at multiple stations of the Eel River watershed, California. The proposed model can be used for predicting runoff at stations with scarce data, or in any sub-basin within the watershed, because it employs the spatial and temporal variables of the sub-basins as model inputs. This ability of the integrated model for spatiotemporal modeling of the process was examined through a cross validation technique for a station. In this way, different ANFIS structures were trained using the Sugeno algorithm in order to estimate daily discharge values at different stations. In order to improve the model efficiency, the input data were then classified into clusters by means of the fuzzy C-means (FCM) method. The goodness-of-fit measures support the gainful use of the IGANFIS and FCM methods in spatiotemporal modeling of hydrological processes.
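
    The ANFIS stage itself is not reproduced here; the sketch below only illustrates the fuzzy C-means partitioning step that routes input records to cluster-specific sub-models. The data, the number of clusters, and the fuzzifier m are illustrative assumptions.

    ```python
    import numpy as np

    def fuzzy_c_means(X, n_clusters=3, m=2.0, n_iter=100, seed=0):
        """Plain NumPy fuzzy C-means: returns cluster centres and the membership matrix U."""
        rng = np.random.default_rng(seed)
        U = rng.random((len(X), n_clusters))
        U /= U.sum(axis=1, keepdims=True)                    # memberships sum to 1 per sample
        for _ in range(n_iter):
            Um = U ** m
            centres = (Um.T @ X) / Um.sum(axis=0)[:, None]   # weighted cluster centres
            d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
            U = d ** (-2 / (m - 1))
            U /= U.sum(axis=1, keepdims=True)                # standard FCM membership update
        return centres, U

    rng = np.random.default_rng(4)
    # Stand-ins for inputs such as rainfall, antecedent discharge and geomorphological descriptors.
    X = np.vstack([rng.normal(loc=c, scale=0.3, size=(50, 3)) for c in (0.0, 2.0, 4.0)])
    centres, U = fuzzy_c_means(X, n_clusters=3)
    labels = U.argmax(axis=1)        # crisp assignment used to route samples to per-cluster models
    print("cluster sizes:", np.bincount(labels))
    ```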

  20. Accelerated numerical processing of electronically recorded holograms with reduced speckle noise.

    PubMed

    Trujillo, Carlos; Garcia-Sucerquia, Jorge

    2013-09-01

    The numerical reconstruction of digitally recorded holograms suffers from speckle noise. An accelerated method that uses general-purpose computing in graphics processing units to reduce that noise is shown. The proposed methodology utilizes parallelized algorithms to record, reconstruct, and superimpose multiple uncorrelated holograms of a static scene. For the best tradeoff between reduction of the speckle noise and processing time, the method records, reconstructs, and superimposes six holograms of 1024 × 1024 pixels in 68 ms; for this case, the methodology reduces the speckle noise by 58% compared with that exhibited by a single hologram. The fully parallelized method running on a commodity graphics processing unit is one order of magnitude faster than the same technique implemented on a regular CPU using its multithreading capabilities. Experimental results are shown to validate the proposal.
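
    The GPU implementation is not shown here; the plain NumPy sketch below illustrates the underlying idea, namely that superimposing several uncorrelated reconstructions of a static scene lowers the speckle contrast roughly as 1/sqrt(N). Simulated speckle fields stand in for real hologram reconstructions.

    ```python
    import numpy as np

    def speckled_intensity(obj, seed):
        """Simulate one reconstruction: the object multiplied by an uncorrelated speckle field."""
        r = np.random.default_rng(seed)
        phase = np.exp(1j * 2 * np.pi * r.random(obj.shape))       # random object-plane phase
        aperture = r.random(obj.shape) > 0.5                        # random frequency-domain mask
        field = np.fft.ifft2(np.fft.fft2(obj * phase) * aperture)
        return np.abs(field) ** 2

    def speckle_contrast(img):
        return img.std() / img.mean()

    obj = np.ones((512, 512))                                       # static, uniform scene
    single = speckled_intensity(obj, 0)
    average = np.mean([speckled_intensity(obj, k) for k in range(6)], axis=0)

    print("speckle contrast, single reconstruction:", round(speckle_contrast(single), 3))
    print("speckle contrast, 6 superimposed       :", round(speckle_contrast(average), 3))
    ```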

  1. Implementation of a Medication Reconciliation Assistive Technology: A Qualitative Analysis

    PubMed Central

    Wright, Theodore B.; Adams, Kathleen; Church, Victoria L.; Ferraro, Mimi; Ragland, Scott; Sayers, Anthony; Tallett, Stephanie; Lovejoy, Travis; Ash, Joan; Holahan, Patricia J.; Lesselroth, Blake J.

    2017-01-01

    Objective: To aid the implementation of a medication reconciliation process within a hybrid primary-specialty care setting by using qualitative techniques to describe the climate of implementation and provide guidance for future projects. Methods: Guided by McMullen et al's Rapid Assessment Process [1], we performed semi-structured interviews prior to and iteratively throughout the implementation. Interviews were coded and analyzed using grounded theory [2] and cross-examined for validity. Results: We identified five barriers and five facilitators that impacted the implementation. Facilitators included process alignment with user values, as well as motivation and clinical champions fostered by the implementation team rather than the administration. Barriers included a perceived limited capacity for change, diverging priorities, and inconsistencies in process standards and role definitions. Discussion: A more complete, qualitative understanding of existing barriers and facilitators helps to guide critical decisions on the design and implementation of a successful medication reconciliation process. PMID:29854251

  2. Processing data base information having nonwhite noise

    DOEpatents

    Gross, Kenneth C.; Morreale, Patricia

    1995-01-01

    A method and system for processing a set of data from an industrial process and/or a sensor. The method and system can include processing data from either real or calculated data related to an industrial process variable. One of the data sets can be an artificial signal data set generated by an autoregressive moving average technique. After obtaining two data sets associated with one physical variable, a difference function data set is obtained by determining the arithmetic difference between the two data sets over time. A frequency domain transformation is made of the difference function data set to obtain Fourier modes describing a composite function data set. A residual function data set is obtained by subtracting the composite function data set from the difference function data set, and the residual function data set (free of nonwhite noise) is analyzed by a statistical probability ratio test to provide a validated data base.
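
    A minimal sketch of that pipeline under stated assumptions (Gaussian noise, a mean-shift sequential probability ratio test on the whitened residual, five retained Fourier modes, illustrative thresholds); it is not the patented implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    n = 4096
    t = np.arange(n)

    # Two data sets tied to one physical variable: a sensor signal and an ARMA-style surrogate.
    sensor = np.sin(2 * np.pi * t / 200) + 0.05 * np.sin(2 * np.pi * t / 37) + rng.normal(0, 0.1, n)
    surrogate = np.sin(2 * np.pi * t / 200) + rng.normal(0, 0.1, n)

    # 1) Difference function between the paired data sets.
    diff = sensor - surrogate

    # 2) Frequency-domain transform: keep the dominant Fourier modes as the "composite" function.
    spec = np.fft.rfft(diff)
    keep = np.argsort(np.abs(spec))[-5:]              # five largest modes (the nonwhite part)
    composite_spec = np.zeros_like(spec)
    composite_spec[keep] = spec[keep]
    composite = np.fft.irfft(composite_spec, n)

    # 3) Residual function: the difference minus the composite, ideally close to white noise.
    residual = diff - composite

    # 4) Sequential probability ratio test on the residual (Gaussian mean-shift hypotheses).
    sigma = residual.std()
    mu1 = 0.5 * sigma                                  # disturbance magnitude to detect (assumed)
    alpha = beta = 0.001                               # false/missed alarm probabilities (assumed)
    A, B = np.log(beta / (1 - alpha)), np.log((1 - beta) / alpha)
    llr = (residual * mu1 - mu1 ** 2 / 2) / sigma ** 2 # per-sample log-likelihood ratio
    s, alarms = 0.0, 0
    for increment in llr:
        s = max(A, min(B, s + increment))              # clamp between the decision thresholds
        if s >= B:                                     # H1 accepted: flag a disturbance, restart
            alarms += 1
            s = 0.0
    print("SPRT alarms on the residual:", alarms)
    ```

    Stripping the dominant Fourier modes removes the serially correlated (nonwhite) part of the difference signal, so the ratio test operates on residuals much closer to the white noise it assumes.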

  3. Detection of brain tumor margins using optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Juarez-Chambi, Ronald M.; Kut, Carmen; Rico-Jimenez, Jesus; Campos-Delgado, Daniel U.; Quinones-Hinojosa, Alfredo; Li, Xingde; Jo, Javier

    2018-02-01

    In brain cancer surgery, it is critical to achieve extensive resection without compromising adjacent healthy, non-cancerous regions. Various technological advances have made major contributions to imaging, including intraoperative magnetic resonance imaging (MRI) and computed tomography (CT). However, these technologies have pros and cons in providing quantitative, real-time and three-dimensional (3D) continuous guidance in brain cancer detection. Optical Coherence Tomography (OCT) is a non-invasive, label-free, cost-effective technique capable of imaging tissue in three dimensions and in real time. The purpose of this study is to reliably and efficiently discriminate between non-cancer and cancer-infiltrated brain regions using OCT images. To this end, a mathematical model for quantitative evaluation known as the Blind End-Member and Abundances Extraction (BEAE) method is used. This BEAE method is a constrained optimization technique which extracts spatial information from volumetric OCT images. Using this method, we are able to discriminate between cancerous and non-cancerous tissues, using logistic regression as a classifier for automatic brain tumor margin detection. With this technique, we achieve excellent performance in an extensive cross-validation of the training dataset (sensitivity 92.91% and specificity 98.15%) and again on an independent, blinded validation dataset (sensitivity 92.91% and specificity 86.36%). In summary, BEAE is well suited to differentiating brain tissue, which could support the process of guiding surgery for tissue resection.
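
    The BEAE optimization itself is not reproduced here; the sketch below only shows the downstream step of cross-validating a logistic-regression classifier and reporting sensitivity and specificity, with synthetic stand-ins for the per-region OCT features.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import StratifiedKFold, cross_val_predict
    from sklearn.metrics import confusion_matrix

    rng = np.random.default_rng(7)

    # Synthetic stand-ins for per-region feature vectors (e.g. abundances extracted from OCT
    # volumes) and labels: 0 = non-cancer, 1 = cancer-infiltrated.
    X = np.vstack([rng.normal(0.0, 1.0, (200, 4)), rng.normal(1.2, 1.0, (200, 4))])
    y = np.r_[np.zeros(200), np.ones(200)]

    clf = LogisticRegression(max_iter=1000)
    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
    y_pred = cross_val_predict(clf, X, y, cv=cv)

    tn, fp, fn, tp = confusion_matrix(y, y_pred).ravel()
    print("cross-validated sensitivity:", tp / (tp + fn))
    print("cross-validated specificity:", tn / (tn + fp))
    ```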

  5. Single-shot pressure-sensitive paint lifetime measurements on fast rotating blades using an optimized double-shutter technique

    NASA Astrophysics Data System (ADS)

    Weiss, Armin; Geisler, Reinhard; Schwermer, Till; Yorita, Daisuke; Henne, Ulrich; Klein, Christian; Raffel, Markus

    2017-09-01

    A pressure-sensitive paint (PSP) system is presented to measure global surface pressures on fast rotating blades. It is dedicated to solving the problem of blurred image data employing the single-shot lifetime method. The efficient blur-reduction capability of an optimized double-shutter imaging technique is demonstrated, omitting error-prone post-processing or laborious de-rotation setups. The system is applied on Mach-scaled DSA-9A helicopter blades in climb at various collective pitch settings and blade tip Mach and chord Reynolds numbers (M_tip = 0.29-0.57; Re_tip = 4.63-9.26 × 10^5). Temperature effects in the PSP are corrected by a theoretical approximation validated against measured temperatures using temperature-sensitive paint (TSP) on a separate blade. Ensemble-averaged PSP results are comparable to pressure-tap data on the same blade to within 250 Pa. The resulting pressure maps on the blade suction side reveal spatially highly resolved flow features such as the leading edge suction peak, footprints of blade-tip vortices, and evidence of laminar-turbulent boundary-layer (BL) transition. The findings are validated by a separately conducted BL transition measurement by means of TSP and by numerical simulations using a 2D coupled Euler/boundary-layer code. Moreover, the principal ability of the single-shot technique to capture unsteady flow phenomena is stressed, revealing three-dimensional pressure fluctuations at stall.

  6. Probabilistic terrain models from waveform airborne LiDAR: AutoProbaDTM project results

    NASA Astrophysics Data System (ADS)

    Jalobeanu, A.; Goncalves, G. R.

    2012-12-01

    The main objective of the AutoProbaDTM project was to develop new methods for automated probabilistic topographic map production using the latest LiDAR scanners. It included algorithmic development, implementation and validation over a 200 km² test area in continental Portugal, representing roughly 100 GB of raw data and half a billion waveforms. We aimed to generate digital terrain models automatically, including ground topography as well as uncertainty maps, using Bayesian inference for model estimation and error propagation, and approaches based on image processing. Here we present the results of the completed project (methodological developments and processing results from the test dataset). In June 2011, the test data were acquired in central Portugal, over an area of geomorphological and ecological interest, using a Riegl LMS-Q680i sensor. We managed to survey 70% of the test area at a satisfactory sampling rate, the angular spacing matching the laser beam divergence and the ground spacing nearly equal to the footprint (almost 4 pts/m² for a 50 cm footprint at 1500 m AGL). This is crucial for correct processing, as aliasing artifacts are significantly reduced. Reverse engineering had to be done because the data were delivered in a proprietary binary format; after that we were able to read the waveforms and the essential parameters. A robust waveform processing method has been implemented and tested, and georeferencing and geometric computations have been coded. Fast gridding and interpolation techniques have been developed. Validation is nearly completed, as well as geometric calibration, IMU error correction, full error propagation and large-scale DEM reconstruction. A probabilistic processing software package has been implemented and code optimization is in progress. This package includes new boresight calibration procedures, robust peak extraction modules, DEM gridding and interpolation methods, and means to visualize the produced uncertain surfaces (topography and accuracy map). Vegetation filtering for bare ground extraction has been left aside, and we wish to explore this research area in the future. A thorough validation of the new techniques and computed models has been conducted, using a large number of ground control points (GCP) acquired with GPS, evenly distributed and classified according to ground cover and terrain characteristics. More than 16,000 GCP have been acquired during field work. The results are now freely accessible online through a web map service (GeoServer), allowing users to visualize the data interactively without having to download the full processed dataset.

  7. Downward longwave surface radiation from sun-synchronous satellite data - Validation of methodology

    NASA Technical Reports Server (NTRS)

    Darnell, W. L.; Gupta, S. K.; Staylor, W. F.

    1986-01-01

    An extensive study has been carried out to validate a satellite technique for estimating downward longwave radiation at the surface. The technique, mostly developed earlier, uses operational sun-synchronous satellite data and a radiative transfer model to provide the surface flux estimates. The satellite-derived fluxes were compared directly with corresponding ground-measured fluxes at four different sites in the United States for a common one-year period. This provided a study of seasonal variations as well as a diversity of meteorological conditions. Dome heating errors in the ground-measured fluxes were also investigated and were corrected prior to the comparisons. Comparison of the monthly averaged fluxes from the satellite and ground sources for all four sites for the entire year showed a correlation coefficient of 0.98 and a standard error of estimate of 10 W/sq m. A brief description of the technique is provided, and the results validating the technique are presented.

  8. Scalability and Validation of Big Data Bioinformatics Software.

    PubMed

    Yang, Andrian; Troup, Michael; Ho, Joshua W K

    2017-01-01

    This review examines two important aspects that are central to modern big data bioinformatics analysis - software scalability and validity. We argue that not only are the issues of scalability and validation common to all big data bioinformatics analyses, they can be tackled by conceptually related methodological approaches, namely divide-and-conquer (scalability) and multiple executions (validation). Scalability is defined as the ability of a program to scale with workload. It has always been an important consideration when developing bioinformatics algorithms and programs. Nonetheless, the surge in the volume and variety of biological and biomedical data has posed new challenges. We discuss how modern cloud computing and big data programming frameworks such as MapReduce and Spark are being used to effectively implement divide-and-conquer in a distributed computing environment. Validation of software is another important issue in big data bioinformatics that is often ignored. Software validation is the process of determining whether the program under test fulfils the task for which it was designed. Determining the correctness of the computational output of big data bioinformatics software is especially difficult due to the large input space and complex algorithms involved. We discuss how state-of-the-art software testing techniques that are based on the idea of multiple executions, such as metamorphic testing, can be used to implement an effective bioinformatics quality assurance strategy. We hope this review will raise awareness of these critical issues in bioinformatics.
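
    As a toy illustration of the multiple-executions idea, a metamorphic test runs a pipeline stage on transformed inputs whose outputs are predictable from the original run, so correctness can be checked without a full test oracle. The function, data, and relations below are invented for the example.

    ```python
    import random

    def count_reads_per_gene(alignments):
        """Toy stand-in for a big data pipeline stage: count aligned reads per gene."""
        counts = {}
        for _read_id, gene in alignments:
            counts[gene] = counts.get(gene, 0) + 1
        return counts

    def test_metamorphic_relations():
        random.seed(0)
        alignments = [(f"read{i}", random.choice(["BRCA1", "TP53", "EGFR"])) for i in range(10000)]
        baseline = count_reads_per_gene(alignments)

        # Relation 1: shuffling the input (as a distributed run might) must not change the counts.
        shuffled = alignments[:]
        random.shuffle(shuffled)
        assert count_reads_per_gene(shuffled) == baseline

        # Relation 2: duplicating the input must exactly double every count.
        doubled = count_reads_per_gene(alignments + alignments)
        assert all(doubled[g] == 2 * c for g, c in baseline.items())

    test_metamorphic_relations()
    print("metamorphic relations hold")
    ```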

  9. Validation and calibration of structural models that combine information from multiple sources.

    PubMed

    Dahabreh, Issa J; Wong, John B; Trikalinos, Thomas A

    2017-02-01

    Mathematical models that attempt to capture structural relationships between their components and combine information from multiple sources are increasingly used in medicine. Areas covered: We provide an overview of methods for model validation and calibration and survey studies comparing alternative approaches. Expert commentary: Model validation entails a confrontation of models with data, background knowledge, and other models, and can inform judgments about model credibility. Calibration involves selecting parameter values to improve the agreement of model outputs with data. When the goal of modeling is quantitative inference on the effects of interventions or forecasting, calibration can be viewed as estimation. This view clarifies issues related to parameter identifiability and facilitates formal model validation and the examination of consistency among different sources of information. In contrast, when the goal of modeling is the generation of qualitative insights about the modeled phenomenon, calibration is a rather informal process for selecting inputs that result in model behavior that roughly reproduces select aspects of the modeled phenomenon and cannot be equated to an estimation procedure. Current empirical research on validation and calibration methods consists primarily of methodological appraisals or case-studies of alternative techniques and cannot address the numerous complex and multifaceted methodological decisions that modelers must make. Further research is needed on different approaches for developing and validating complex models that combine evidence from multiple sources.

  10. Image processing in digital pathology: an opportunity to solve inter-batch variability of immunohistochemical staining

    NASA Astrophysics Data System (ADS)

    van Eycke, Yves-Rémi; Allard, Justine; Salmon, Isabelle; Debeir, Olivier; Decaestecker, Christine

    2017-02-01

    Immunohistochemistry (IHC) is a widely used technique in pathology to evidence protein expression in tissue samples. However, this staining technique is known for presenting inter-batch variations. Whole slide imaging in digital pathology offers a possibility to overcome this problem by means of image normalisation techniques. In the present paper we propose a methodology to objectively evaluate the need of image normalisation and to identify the best way to perform it. This methodology uses tissue microarray (TMA) materials and statistical analyses to evidence the possible variations occurring at colour and intensity levels as well as to evaluate the efficiency of image normalisation methods in correcting them. We applied our methodology to test different methods of image normalisation based on blind colour deconvolution that we adapted for IHC staining. These tests were carried out for different IHC experiments on different tissue types and targeting different proteins with different subcellular localisations. Our methodology enabled us to establish and to validate inter-batch normalization transforms which correct the non-relevant IHC staining variations. The normalised image series were then processed to extract coherent quantitative features characterising the IHC staining patterns.
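
    The normalisation transforms established in the paper are not reproduced here; the sketch below only shows the kind of stain separation such methods typically build on, using scikit-image's haematoxylin-eosin-DAB colour deconvolution on a synthetic tile, followed by a crude, purely illustrative rescaling of the DAB channel.

    ```python
    import numpy as np
    from skimage.color import rgb2hed, hed2rgb

    # Synthetic RGB tile standing in for a patch of an IHC (haematoxylin + DAB) whole-slide image.
    rng = np.random.default_rng(8)
    tile = np.clip(0.5 + 0.4 * rng.random((64, 64, 3)), 0.0, 1.0)

    # Colour deconvolution into haematoxylin, eosin and DAB optical-density channels.
    hed = rgb2hed(tile)
    haematoxylin, dab = hed[..., 0], hed[..., 2]

    # Crude, purely illustrative per-batch rescaling of the DAB channel.
    dab_norm = (dab - dab.mean()) / (dab.std() + 1e-8)

    # Recompose an RGB image with the eosin channel suppressed to inspect the retained stains.
    hed_kept = np.stack([haematoxylin, np.zeros_like(dab), dab], axis=-1)
    rgb_kept = hed2rgb(hed_kept)
    print("DAB channel mean/std after rescaling:", dab_norm.mean(), dab_norm.std())
    print("recomposed image shape:", rgb_kept.shape)
    ```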

  11. Twist Model Development and Results from the Active Aeroelastic Wing F/A-18 Aircraft

    NASA Technical Reports Server (NTRS)

    Lizotte, Andrew M.; Allen, Michael J.

    2007-01-01

    Understanding the wing twist of the active aeroelastic wing (AAW) F/A-18 aircraft is a fundamental research objective for the program and offers numerous benefits. In order to clearly understand the wing flexibility characteristics, a model was created to predict real-time wing twist. A reliable twist model allows the prediction of twist for flight simulation, provides insight into aircraft performance uncertainties, and assists with computational fluid dynamic and aeroelastic issues. The left wing of the aircraft was heavily instrumented during the first phase of the active aeroelastic wing program allowing deflection data collection. Traditional data processing steps were taken to reduce flight data, and twist predictions were made using linear regression techniques. The model predictions determined a consistent linear relationship between the measured twist and aircraft parameters, such as surface positions and aircraft state variables. Error in the original model was reduced in some cases by using a dynamic pressure-based assumption. This technique produced excellent predictions for flight between the standard test points and accounted for nonlinearities in the data. This report discusses data processing techniques and twist prediction validation, and provides illustrative and quantitative results.

  12. Image processing in digital pathology: an opportunity to solve inter-batch variability of immunohistochemical staining

    PubMed Central

    Van Eycke, Yves-Rémi; Allard, Justine; Salmon, Isabelle; Debeir, Olivier; Decaestecker, Christine

    2017-01-01

    Immunohistochemistry (IHC) is a widely used technique in pathology to evidence protein expression in tissue samples. However, this staining technique is known for presenting inter-batch variations. Whole slide imaging in digital pathology offers a possibility to overcome this problem by means of image normalisation techniques. In the present paper we propose a methodology to objectively evaluate the need of image normalisation and to identify the best way to perform it. This methodology uses tissue microarray (TMA) materials and statistical analyses to evidence the possible variations occurring at colour and intensity levels as well as to evaluate the efficiency of image normalisation methods in correcting them. We applied our methodology to test different methods of image normalisation based on blind colour deconvolution that we adapted for IHC staining. These tests were carried out for different IHC experiments on different tissue types and targeting different proteins with different subcellular localisations. Our methodology enabled us to establish and to validate inter-batch normalization transforms which correct the non-relevant IHC staining variations. The normalised image series were then processed to extract coherent quantitative features characterising the IHC staining patterns. PMID:28220842

  13. Damage Detection in Composite Structures with Wavenumber Array Data Processing

    NASA Technical Reports Server (NTRS)

    Tian, Zhenhua; Leckey, Cara; Yu, Lingyu

    2013-01-01

    Guided ultrasonic waves (GUW) have the potential to be an efficient and cost-effective method for rapid damage detection and quantification of large structures. Attractive features include sensitivity to a variety of damage types and the capability of traveling relatively long distances. They have proven to be an efficient approach for crack detection and localization in isotropic materials. However, techniques must be pushed beyond isotropic materials in order to be valid for composite aircraft components. This paper presents our study on GUW propagation and interaction with delamination damage in composite structures using wavenumber array data processing, together with advanced wave propagation simulations. Parallel elastodynamic finite integration technique (EFIT) is used for the example simulations. Multi-dimensional Fourier transform is used to convert time-space wavefield data into frequency-wavenumber domain. Wave propagation in the wavenumber-frequency domain shows clear distinction among the guided wave modes that are present. This allows for extracting a guided wave mode through filtering and reconstruction techniques. Presence of delamination causes spectral change accordingly. Results from 3D CFRP guided wave simulations with delamination damage in flat-plate specimens are used for wave interaction with structural defect study.
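
    A minimal sketch of the wavenumber-domain step on a synthetic wavefield with one space dimension: the time-space data are transformed with a multi-dimensional FFT, and a single wavenumber band is masked and inverse-transformed to isolate one mode. The excitation frequency, wavenumbers, and filter band are invented for the example.

    ```python
    import numpy as np

    # Synthetic wavefield u(t, x): two guided-wave-like modes with different wavenumbers.
    nt, nx, dt, dx = 512, 256, 1e-6, 1e-3
    t = np.arange(nt)[:, None] * dt
    x = np.arange(nx)[None, :] * dx
    f0 = 100e3                                                   # 100 kHz excitation
    u = (np.sin(2 * np.pi * (f0 * t - 100 * x)) +                # mode 1: 100 cycles/m
         0.5 * np.sin(2 * np.pi * (f0 * t - 350 * x)))           # mode 2: 350 cycles/m

    # Multi-dimensional Fourier transform: time-space -> frequency-wavenumber domain.
    U = np.fft.fftshift(np.fft.fft2(u))
    freqs = np.fft.fftshift(np.fft.fftfreq(nt, dt))              # temporal frequencies (Hz)
    ks = np.fft.fftshift(np.fft.fftfreq(nx, dx))                 # spatial frequencies (cycles/m)
    fi, ki = np.unravel_index(np.argmax(np.abs(U)), U.shape)
    print("dominant (f, k):", freqs[fi], ks[ki])

    # Mask a wavenumber band around mode 1 and reconstruct that mode alone.
    K = np.broadcast_to(ks[None, :], U.shape)
    mask = np.abs(np.abs(K) - 100) < 80
    u_mode1 = np.real(np.fft.ifft2(np.fft.ifftshift(U * mask)))
    print("energy fraction kept:", np.sum(u_mode1 ** 2) / np.sum(u ** 2))
    ```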

  14. Twist Model Development and Results From the Active Aeroelastic Wing F/A-18 Aircraft

    NASA Technical Reports Server (NTRS)

    Lizotte, Andrew; Allen, Michael J.

    2005-01-01

    Understanding the wing twist of the active aeroelastic wing F/A-18 aircraft is a fundamental research objective for the program and offers numerous benefits. In order to clearly understand the wing flexibility characteristics, a model was created to predict real-time wing twist. A reliable twist model allows the prediction of twist for flight simulation, provides insight into aircraft performance uncertainties, and assists with computational fluid dynamic and aeroelastic issues. The left wing of the aircraft was heavily instrumented during the first phase of the active aeroelastic wing program allowing deflection data collection. Traditional data processing steps were taken to reduce flight data, and twist predictions were made using linear regression techniques. The model predictions determined a consistent linear relationship between the measured twist and aircraft parameters, such as surface positions and aircraft state variables. Error in the original model was reduced in some cases by using a dynamic pressure-based assumption and by using neural networks. These techniques produced excellent predictions for flight between the standard test points and accounted for nonlinearities in the data. This report discusses data processing techniques and twist prediction validation, and provides illustrative and quantitative results.

  15. CARES: Completely Automated Robust Edge Snapper for carotid ultrasound IMT measurement on a multi-institutional database of 300 images: a two stage system combining an intensity-based feature approach with first order absolute moments

    NASA Astrophysics Data System (ADS)

    Molinari, Filippo; Acharya, Rajendra; Zeng, Guang; Suri, Jasjit S.

    2011-03-01

    The carotid intima-media thickness (IMT) is the most used marker for the progression of atherosclerosis and the onset of cardiovascular diseases. Computer-aided measurements improve accuracy, but usually require user interaction. In this paper we characterized a new and completely automated technique for carotid segmentation and IMT measurement based on the merits of two previously developed techniques. We used an integrated approach of intelligent image feature extraction and line fitting for automatically locating the carotid artery in the image frame, followed by wall interface extraction based on a Gaussian edge operator. We called our system CARES. We validated CARES on a multi-institutional database of 300 carotid ultrasound images. IMT measurement bias was 0.032 +/- 0.141 mm, better than other automated techniques and comparable to that of user-driven methodologies. Our novel CARES approach processed 96% of the images, leading to a figure of merit of 95.7%. CARES ensured complete automation and high accuracy in IMT measurement; hence it could be a suitable clinical tool for processing large datasets in multicenter studies involving atherosclerosis.

  16. Chemometric compositional analysis of phenolic compounds in fermenting samples and wines using different infrared spectroscopy techniques.

    PubMed

    Aleixandre-Tudo, Jose Luis; Nieuwoudt, Helene; Aleixandre, Jose Luis; du Toit, Wessel

    2018-01-01

    The wine industry requires reliable methods for the quantification of phenolic compounds during the winemaking process. Infrared spectroscopy appears as a suitable technique for process control and monitoring. The ability of Fourier transform near infrared (FT-NIR), attenuated total reflectance mid infrared (ATR-MIR) and Fourier transform infrared (FT-IR) spectroscopies to predict compositional phenolic levels during red wine fermentation and aging was investigated. Prediction models containing a large number of samples collected over two vintages from several industrial fermenting tanks as well as wine samples covering a varying number of vintages were validated. FT-NIR appeared as the most accurate technique to predict the phenolic content. Although slightly less accurate models were observed, ATR-MIR and FT-IR can also be used for the prediction of the majority of phenolic measurements. Additionally, the slope and intercept test indicated a systematic error for the three spectroscopies which seems to be slightly more pronounced for HPLC generated phenolics data than for the spectrophotometric parameters. However, the results also showed that the predictions made with the three instruments are statistically comparable. The robustness of the prediction models was also investigated and discussed. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Interactive, graphical processing unit-based evaluation of evacuation scenarios at the state scale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perumalla, Kalyan S; Aaby, Brandon G; Yoginath, Srikanth B

    2011-01-01

    In large-scale scenarios, transportation modeling and simulation is severely constrained by simulation time. For example, few real-time simulators scale to evacuation traffic scenarios at the level of an entire state, such as Louisiana (approximately 1 million links) or Florida (2.5 million links). New simulation approaches are needed to overcome the severe computational demands of conventional (microscopic or mesoscopic) modeling techniques. Here, a new modeling and execution methodology is explored that holds the potential to provide a tradeoff among the level of behavioral detail, the scale of the transportation network, and real-time execution capabilities. A novel, field-based modeling technique and its implementation on graphical processing units are presented. Although additional research with input from domain experts is needed for refining and validating the models, the techniques reported here afford interactive experience at very large scales of multi-million road segments. Illustrative experiments on a few state-scale networks are described based on an implementation of this approach in a software system called GARFIELD. Current modeling capabilities and implementation limitations are described, along with possible use cases and future research.

  18. Unified multiphase modeling for evolving, acoustically coupled systems consisting of acoustic, elastic, poroelastic media and septa

    NASA Astrophysics Data System (ADS)

    Lee, Joong Seok; Kang, Yeon June; Kim, Yoon Young

    2012-12-01

    This paper presents a new modeling technique that can represent acoustically coupled systems in a unified manner. The proposed unified multiphase (UMP) modeling technique uses Biot's equations that are originally derived for poroelastic media to represent not only poroelastic media but also non-poroelastic ones ranging from acoustic and elastic media to septa. To recover the original vibro-acoustic behaviors of non-poroelastic media, material parameters of a base poroelastic medium are adjusted depending on the target media. The real virtue of this UMP technique is that interface coupling conditions between any media can be automatically satisfied, so no medium-dependent interface condition needs to be imposed explicitly. Thereby, the proposed technique can effectively model any acoustically coupled system having locally varying medium phases and evolving interfaces. A typical situation can occur in an iterative design process. Because the proposed UMP modeling technique needs theoretical justifications for further development, this work is mainly focused on how the technique recovers the governing equations of non-poroelastic media and expresses their interface conditions. We also address how to describe various boundary conditions of the media in the technique. Some numerical studies are carried out to demonstrate the validity of the proposed modeling technique.

  19. A Fast MHD Code for Gravitationally Stratified Media using Graphical Processing Units: SMAUG

    NASA Astrophysics Data System (ADS)

    Griffiths, M. K.; Fedun, V.; Erdélyi, R.

    2015-03-01

    Parallelization techniques have been exploited most successfully by the gaming/graphics industry with the adoption of graphical processing units (GPUs), possessing hundreds of processor cores. The opportunity has been recognized by the computational sciences and engineering communities, who have recently harnessed successfully the numerical performance of GPUs. For example, parallel magnetohydrodynamic (MHD) algorithms are important for numerical modelling of highly inhomogeneous solar, astrophysical and geophysical plasmas. Here, we describe the implementation of SMAUG, the Sheffield Magnetohydrodynamics Algorithm Using GPUs. SMAUG is a 1-3D MHD code capable of modelling magnetized and gravitationally stratified plasma. The objective of this paper is to present the numerical methods and techniques used for porting the code to this novel and highly parallel compute architecture. The methods employed are justified by the performance benchmarks and validation results demonstrating that the code successfully simulates the physics for a range of test scenarios including a full 3D realistic model of wave propagation in the solar atmosphere.

  20. Unveiling the Biometric Potential of Finger-Based ECG Signals

    PubMed Central

    Lourenço, André; Silva, Hugo; Fred, Ana

    2011-01-01

    The ECG signal has been shown to contain relevant information for human identification. Even though results validate the potential of these signals, the data acquisition methods and apparatus explored so far compromise user acceptability, requiring the acquisition of the ECG at the chest. In this paper, we propose a finger-based ECG biometric system that uses signals collected at the fingers through a minimally intrusive 1-lead ECG setup, using Ag/AgCl electrodes without gel as the interface with the skin. The collected signal is significantly noisier than ECG acquired at the chest, motivating the application of feature extraction and signal processing techniques to the problem. Time-domain ECG signal processing is performed, comprising the usual steps of filtering, peak detection, heartbeat waveform segmentation, and amplitude normalization, plus an additional step of time normalization. Through a simple minimum distance criterion between the test patterns and the enrollment database, results have revealed this to be a promising technique for biometric applications. PMID:21837235
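
    A minimal sketch of such a time-domain pipeline on a synthetic ECG-like signal: band-pass filtering, R-peak detection, heartbeat segmentation, amplitude and time normalization, then a minimum-distance match against enrolled templates. The sampling rate, filter band, window lengths, and the two enrolled "subjects" are all invented for the illustration.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt, find_peaks, resample

    fs = 500                                           # sampling rate in Hz (assumed)

    def synthetic_ecg(n_beats, width, rr=0.8, seed=0):
        """Crude ECG-like signal: one Gaussian 'R wave' per beat plus noise."""
        rng = np.random.default_rng(seed)
        t = np.arange(int(n_beats * rr * fs)) / fs
        sig = sum(np.exp(-((t - (k + 0.5) * rr) ** 2) / (2 * width ** 2)) for k in range(n_beats))
        return sig + 0.05 * rng.standard_normal(t.size)

    def heartbeat_template(sig, out_len=300):
        b, a = butter(3, [1, 40], btype="bandpass", fs=fs)              # band-pass filtering
        filt = filtfilt(b, a, sig)
        peaks, _ = find_peaks(filt, height=0.5 * filt.max(), distance=int(0.4 * fs))
        beats = []
        for p in peaks:                                                 # heartbeat segmentation
            lo, hi = p - int(0.25 * fs), p + int(0.45 * fs)
            if lo < 0 or hi > len(filt):
                continue
            beat = filt[lo:hi]
            beat = (beat - beat.mean()) / (np.abs(beat).max() + 1e-12)  # amplitude normalization
            beats.append(resample(beat, out_len))                       # time normalization
        return np.mean(beats, axis=0)

    enrolled = {"subject_a": heartbeat_template(synthetic_ecg(20, width=0.010, seed=1)),
                "subject_b": heartbeat_template(synthetic_ecg(20, width=0.018, seed=2))}
    probe = heartbeat_template(synthetic_ecg(5, width=0.010, seed=3))

    # Minimum-distance criterion between the test pattern and the enrollment database.
    identity = min(enrolled, key=lambda name: np.linalg.norm(probe - enrolled[name]))
    print("identified as:", identity)
    ```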
