Sample records for systems validation methods

  1. Fault-tolerant clock synchronization validation methodology [in computer systems]

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Palumbo, Daniel L.; Johnson, Sally C.

    1987-01-01

    A validation method for the synchronization subsystem of a fault-tolerant computer system is presented. The high reliability requirement of flight-crucial systems precludes the use of most traditional validation methods. The method presented utilizes formal design proof to uncover design and coding errors and experimentation to validate the assumptions of the design proof. The experimental method is described and illustrated by validating the clock synchronization system of the Software Implemented Fault Tolerance computer. The design proof of the algorithm includes a theorem that defines the maximum skew between any two nonfaulty clocks in the system in terms of specific system parameters. Most of these parameters are deterministic. One crucial parameter is the upper bound on the clock read error, which is stochastic. The probability that this upper bound is exceeded is calculated from data obtained by the measurement of system parameters. This probability is then included in a detailed reliability analysis of the system.
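
    The exceedance-probability step in this abstract lends itself to a small numerical illustration. The sketch below is not from the paper; the sample data, distribution, and bound value are hypothetical. It estimates, from measured read-error samples, the probability that the stochastic upper bound on the clock read error is exceeded, which is the quantity the authors feed into the reliability analysis.

    ```python
    # Minimal sketch, assuming read-error measurements are available as an
    # array; the Gaussian sample data and epsilon_max value are hypothetical.
    import numpy as np

    def p_exceed(read_errors, epsilon_max):
        """Empirical probability that a clock read error exceeds the bound."""
        read_errors = np.asarray(read_errors)
        return float(np.mean(np.abs(read_errors) > epsilon_max))

    # Synthetic read-error measurements (hypothetical values, in seconds):
    rng = np.random.default_rng(0)
    samples = rng.normal(0.0, 5e-6, size=100_000)
    # This tail probability is what the reliability analysis would consume:
    print(p_exceed(samples, epsilon_max=2e-5))
    ```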

  2. Validation Methods for Fault-Tolerant Avionics and Control Systems, Working Group Meeting 1

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The proceedings of the first working group meeting on validation methods for fault-tolerant computer design are presented. The state of the art in fault-tolerant computer validation was examined in order to provide a framework for future discussions concerning research issues for the validation of fault-tolerant avionics and flight control systems. The positions developed concerning critical aspects of the validation process are given.

  3. Validation of GC and HPLC systems for residue studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, M.

    1995-12-01

    For residue studies, GC and HPLC system performance must be validated prior to and during use. One excellent measure of system performance is the standard curve and associated chromatograms used to construct that curve. The standard curve is a model of system response to an analyte over a specific time period, and is prima facie evidence of system performance beginning at the autosampler and proceeding through the injector, column, detector, electronics, data-capture device, and printer/plotter. This tool measures the performance of the entire chromatographic system; its power negates most of the benefits associated with costly and time-consuming validation of individual system components. Other measures of instrument and method validation will be discussed, including quality control charts and experimental designs for method validation.

  4. [Data validation methods and discussion on Chinese materia medica resource survey].

    PubMed

    Zhang, Yue; Ma, Wei-Feng; Zhang, Xiao-Bo; Zhu, Shou-Dong; Guo, Lan-Ping; Wang, Xing-Xing

    2013-07-01

    Since the beginning of the fourth national survey of Chinese materia medica resources, 22 provinces have conducted pilots. The survey teams have reported an immense amount of data, which places very high demands on the construction of the database system. In order to ensure quality, it is necessary to check and validate the data in the database system. Data validation is an important method of ensuring the validity, integrity and accuracy of census data. This paper comprehensively introduces the data validation system of the fourth national survey of Chinese materia medica resources database system, and further improves the design ideas and programs of data validation. The purpose of this study is to promote the smooth progress of the survey work.

  5. Validation Methods Research for Fault-Tolerant Avionics and Control Systems: Working Group Meeting, 2

    NASA Technical Reports Server (NTRS)

    Gault, J. W. (Editor); Trivedi, K. S. (Editor); Clary, J. B. (Editor)

    1980-01-01

    The validation process comprises the activities required to ensure the agreement of system realization with system specification. A preliminary validation methodology for fault-tolerant systems is documented. A general framework for a validation methodology is presented, along with a set of specific tasks intended for the validation of two specimen systems, SIFT and FTMP. Two major areas of research are identified: first, the activities required to support the ongoing development of the validation process itself, and second, the activities required to support the design, development, and understanding of fault-tolerant systems.

  6. Meeting report: Validation of toxicogenomics-based test systems: ECVAM-ICCVAM/NICEATM considerations for regulatory use.

    PubMed

    Corvi, Raffaella; Ahr, Hans-Jürgen; Albertini, Silvio; Blakey, David H; Clerici, Libero; Coecke, Sandra; Douglas, George R; Gribaldo, Laura; Groten, John P; Haase, Bernd; Hamernik, Karen; Hartung, Thomas; Inoue, Tohru; Indans, Ian; Maurici, Daniela; Orphanides, George; Rembges, Diana; Sansone, Susanna-Assunta; Snape, Jason R; Toda, Eisaku; Tong, Weida; van Delft, Joost H; Weis, Brenda; Schechtman, Leonard M

    2006-03-01

    This is the report of the first workshop "Validation of Toxicogenomics-Based Test Systems" held 11-12 December 2003 in Ispra, Italy. The workshop was hosted by the European Centre for the Validation of Alternative Methods (ECVAM) and organized jointly by ECVAM, the U.S. Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM), and the National Toxicology Program (NTP) Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM). The primary aim of the workshop was for participants to discuss and define principles applicable to the validation of toxicogenomics platforms as well as validation of specific toxicologic test methods that incorporate toxicogenomics technologies. The workshop was viewed as an opportunity for initiating a dialogue between technologic experts, regulators, and the principal validation bodies and for identifying those factors to which the validation process would be applicable. It was felt that to do so now, as the technology is evolving and associated challenges are identified, would be a basis for the future validation of the technology when it reaches the appropriate stage. Because of the complexity of the issue, different aspects of the validation of toxicogenomics-based test methods were covered. The three focus areas include a) biologic validation of toxicogenomics-based test methods for regulatory decision making, b) technical and bioinformatics aspects related to validation, and c) validation issues as they relate to regulatory acceptance and use of toxicogenomics-based test methods. In this report we summarize the discussions and describe in detail the recommendations for future direction and priorities.

  7. Task Validation for the AN/TPQ-36 Radar System

    DTIC Science & Technology

    1978-09-01

    This report presents the method and results of a study to validate personnel task descriptions for the new AN/TPQ-36 radar system. The report covers the introduction, method, results, conclusions, and recommendations of the validation study, including task validation for the 26B MOS; supporting material is contained in the appendixes.

  8. Valid methods: the quality assurance of test method development, validation, approval, and transfer for veterinary testing laboratories.

    PubMed

    Wiegers, Ann L

    2003-07-01

    Third-party accreditation is a valuable tool to demonstrate a laboratory's competence to conduct testing. Accreditation, internationally and in the United States, has been discussed previously. However, accreditation is only one part of establishing data credibility. A validated test method is the first component of a valid measurement system. Validation is defined as confirmation by examination and the provision of objective evidence that the particular requirements for a specific intended use are fulfilled. The international and national standard ISO/IEC 17025 recognizes the importance of validated methods and requires that laboratory-developed methods or methods adopted by the laboratory be appropriate for the intended use. Validated methods are therefore required and their use agreed to by the client (i.e., end users of the test results such as veterinarians, animal health programs, and owners). ISO/IEC 17025 also requires that the introduction of methods developed by the laboratory for its own use be a planned activity conducted by qualified personnel with adequate resources. This article discusses considerations and recommendations for the conduct of veterinary diagnostic test method development, validation, evaluation, approval, and transfer to the user laboratory in the ISO/IEC 17025 environment. These recommendations are based on those of nationally and internationally accepted standards and guidelines, as well as those of reputable and experienced technical bodies. They are also based on the author's experience in the evaluation of method development and transfer projects, validation data, and the implementation of quality management systems in the area of method development.

  9. 40 CFR Table 6 to Subpart BBBB of... - Model Rule-Requirements for Validating Continuous Emission Monitoring Systems (CEMS)

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Continuous Emission Monitoring Systems (CEMS) 6 Table 6 to Subpart BBBB of Part 60 Protection of Environment...—Requirements for Validating Continuous Emission Monitoring Systems (CEMS) For the following continuous emission monitoring systems Use the following methods in appendix A of this part to validate pollutant concentration...

  10. 40 CFR Table 6 to Subpart BBBB of... - Model Rule-Requirements for Validating Continuous Emission Monitoring Systems (CEMS)

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Continuous Emission Monitoring Systems (CEMS) 6 Table 6 to Subpart BBBB of Part 60 Protection of Environment...—Requirements for Validating Continuous Emission Monitoring Systems (CEMS) For the following continuous emission monitoring systems Use the following methods in appendix A of this part to validate pollutant concentration...

  11. Formal methods and digital systems validation for airborne systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1993-01-01

    This report has been prepared to supplement a forthcoming chapter on formal methods in the FAA Digital Systems Validation Handbook. Its purpose is as follows: to outline the technical basis for formal methods in computer science; to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used on board aircraft; and to suggest factors for consideration when formal methods are offered in support of certification. These latter factors assume the context for software development and assurance described in RTCA document DO-178B, 'Software Considerations in Airborne Systems and Equipment Certification,' Dec. 1992.

  12. A novel validation and calibration method for motion capture systems based on micro-triangulation.

    PubMed

    Nagymáté, Gergely; Tuchband, Tamás; Kiss, Rita M

    2018-06-06

    Motion capture systems are widely used to measure human kinematics. Nevertheless, users must consider system errors when evaluating their results. Most validation techniques for these systems are based on relative distance and displacement measurements. In contrast, our study aimed to analyse the absolute volume accuracy of optical motion capture systems by means of engineering surveying reference measurement of the marker coordinates (uncertainty: 0.75 mm). The method is exemplified on an 18-camera OptiTrack Flex13 motion capture system. The absolute accuracy was defined by the root mean square error (RMSE) between the coordinates measured by the camera system and by engineering surveying (micro-triangulation). The original RMSE of 1.82 mm, which was due to a scaling error, was reduced to 0.77 mm, while the correlation of errors with their distance from the origin was reduced from 0.855 to 0.209. A more easily applied but less accurate absolute-accuracy compensation method, using a tape measure over large distances, was also tested; it yielded scaling compensation similar to that of the surveying method and of direct wand-size compensation by a high-precision 3D scanner. The presented validation methods can be less precise in some respects than previous techniques, but they address an error type which has not been and cannot be studied with the previous validation methods. Copyright © 2018 Elsevier Ltd. All rights reserved.
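
    A minimal sketch of the two computations this abstract centres on: the absolute-accuracy RMSE against surveyed reference coordinates, and removal of a global scaling error. The least-squares scale fit about the origin and all coordinate values are illustrative assumptions, not the paper's procedure.

    ```python
    import numpy as np

    def rmse(a, b):
        """Root mean square error between two (N, 3) coordinate sets."""
        return float(np.sqrt(np.mean(np.sum((a - b) ** 2, axis=1))))

    def fit_scale(measured, reference):
        """Least-squares global scale about the origin: min_s ||s*m - r||."""
        return np.sum(measured * reference) / np.sum(measured * measured)

    # measured, reference: marker coordinates in a common frame (values made up)
    measured = np.array([[1000.2, 0.0, 0.1], [0.0, 2001.5, 0.2]])
    reference = np.array([[999.5, 0.0, 0.0], [0.0, 2000.0, 0.0]])
    s = fit_scale(measured, reference)
    print(rmse(measured, reference), rmse(s * measured, reference))
    ```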

  13. The Value of Qualitative Methods in Social Validity Research

    ERIC Educational Resources Information Center

    Leko, Melinda M.

    2014-01-01

    One quality indicator of intervention research is the extent to which the intervention has a high degree of social validity, or practicality. In this study, I drew on Wolf's framework for social validity and used qualitative methods to ascertain five middle school teachers' perceptions of the social validity of System 44®--a phonics-based reading…

  14. Validation of the F-18 high alpha research vehicle flight control and avionics systems modifications

    NASA Technical Reports Server (NTRS)

    Chacon, Vince; Pahle, Joseph W.; Regenie, Victoria A.

    1990-01-01

    The verification and validation process is a critical portion of the development of a flight system. Verification, the steps taken to assure that the system meets the design specification, has become a reasonably well understood and straightforward process. Validation is the method used to ensure that the system design meets the needs of the project. As systems become more integrated and more critical in their functions, the validation process becomes more complex and important. The tests, tools, and techniques which are being used for the validation of the high alpha research vehicle (HARV) turning vane control system (TVCS) are discussed, and the problems and their solutions are documented. The emphasis of this paper is on the validation of integrated systems.

  15. Validation of the F-18 high alpha research vehicle flight control and avionics systems modifications

    NASA Technical Reports Server (NTRS)

    Chacon, Vince; Pahle, Joseph W.; Regenie, Victoria A.

    1990-01-01

    The verification and validation process is a critical portion of the development of a flight system. Verification, the steps taken to assure that the system meets the design specification, has become a reasonably well understood and straightforward process. Validation is the method used to ensure that the system design meets the needs of the project. As systems become more integrated and more critical in their functions, the validation process becomes more complex and important. The tests, tools, and techniques which are being used for the validation of the high alpha research vehicle (HARV) turning vane control system (TVCS) are discussed, and the problems and their solutions are documented. The emphasis of this paper is on the validation of integrated systems.

  16. Directed Design of Experiments for Validating Probability of Detection Capability of a Testing System

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R. (Inventor)

    2012-01-01

    A method of validating a probability of detection (POD) testing system using directed design of experiments (DOE) includes recording an input data set of observed hit and miss or analog data for sample components as a function of size of a flaw in the components. The method also includes processing the input data set to generate an output data set having an optimal class width, assigning a case number to the output data set, and generating validation instructions based on the assigned case number. An apparatus includes a host machine for receiving the input data set from the testing system and an algorithm for executing DOE to validate the test system. The algorithm applies DOE to the input data set to determine a data set having an optimal class width, assigns a case number to that data set, and generates validation instructions based on the case number.
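
    The patent abstract describes processing hit/miss observations, as a function of flaw size, into classes of an optimal width. As a rough illustration only, the sketch below bins hit/miss data with a candidate class width and estimates a probability of detection (POD) per class; the patent's actual rule for selecting the optimal class width and assigning case numbers is not reproduced here, and all data values are hypothetical.

    ```python
    import numpy as np

    def pod_by_class(flaw_sizes, hits, class_width):
        """Bin observations by flaw size; return (class start, POD) pairs."""
        flaw_sizes = np.asarray(flaw_sizes, dtype=float)
        hits = np.asarray(hits, dtype=float)
        edges = np.arange(flaw_sizes.min(), flaw_sizes.max() + class_width,
                          class_width)
        idx = np.digitize(flaw_sizes, edges)
        return [(edges[k - 1], hits[idx == k].mean()) for k in np.unique(idx)]

    sizes = [0.5, 0.7, 0.9, 1.1, 1.3, 1.5]   # flaw sizes (e.g., mm)
    hits  = [0,   0,   1,   0,   1,   1  ]   # 1 = detected, 0 = missed
    print(pod_by_class(sizes, hits, class_width=0.4))
    ```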

  17. Low-cost extrapolation method for maximal LTE radio base station exposure estimation: test and validation.

    PubMed

    Verloock, Leen; Joseph, Wout; Gati, Azeddine; Varsier, Nadège; Flach, Björn; Wiart, Joe; Martens, Luc

    2013-06-01

    An experimental validation of a low-cost method for the extrapolation and estimation of the maximal electromagnetic-field exposure from long-term evolution (LTE) radio base station installations is presented. No knowledge of downlink band occupation or service characteristics is required for the low-cost method. The method is applicable in situ. It requires only a basic spectrum analyser with appropriate field probes, without the need for expensive dedicated LTE decoders. The method is validated both in the laboratory and in situ, for a single-input single-output antenna LTE system and a 2×2 multiple-input multiple-output system, with low deviations in comparison with signals measured using dedicated LTE decoders.

  18. A Complex Systems Approach to Causal Discovery in Psychiatry.

    PubMed

    Saxe, Glenn N; Statnikov, Alexander; Fenyo, David; Ren, Jiwen; Li, Zhiguo; Prasad, Meera; Wall, Dennis; Bergman, Nora; Briggs, Ernestine C; Aliferis, Constantin

    2016-01-01

    Conventional research methodologies and data analytic approaches in psychiatric research are unable to reliably infer causal relations without experimental designs, or to make inferences about the functional properties of the complex systems in which psychiatric disorders are embedded. This article describes a series of studies to validate a novel hybrid computational approach--the Complex Systems-Causal Network (CS-CN) method--designed to integrate causal discovery within a complex systems framework for psychiatric research. The CS-CN method was first applied to an existing dataset on psychopathology in 163 children hospitalized with injuries (validation study). Next, it was applied to a much larger dataset of traumatized children (replication study). Finally, the CS-CN method was applied in a controlled experiment using a 'gold standard' dataset for causal discovery and compared with other methods for accurately detecting causal variables (resimulation controlled experiment). The CS-CN method successfully detected a causal network of 111 variables and 167 bivariate relations in the initial validation study. This causal network had well-defined adaptive properties, and a set of variables was found that disproportionately contributed to these properties. Modeling the removal of these variables resulted in significant loss of adaptive properties. The CS-CN method was successfully applied in the replication study and performed better than traditional statistical methods, and similarly to state-of-the-art causal discovery algorithms in the causal detection experiment. The CS-CN method was validated, replicated, and yielded both novel and previously validated findings related to risk factors and potential treatments of psychiatric disorders. The novel approach yields both fine-grain (micro) and high-level (macro) insights and thus represents a promising approach for complex systems-oriented research in psychiatry.

  19. Design for validation: An approach to systems validation

    NASA Technical Reports Server (NTRS)

    Carter, William C.; Dunham, Janet R.; Laprie, Jean-Claude; Williams, Thomas; Howden, William; Smith, Brian; Lewis, Carl M. (Editor)

    1989-01-01

    Every complex system built is validated in some manner. Computer validation begins with review of the system design. As systems became too complicated for one person to review, validation began to rely on the application of ad hoc methods by many individuals. As the cost of the changes mounted and the expense of failure increased, more organized procedures became essential. Attempts at devising and carrying out those procedures showed that validation is indeed a difficult technical problem. The successful transformation of the validation process into a systematic series of formally sound, integrated steps is necessary if the liability inherent in future digital-system-based avionic and space systems is to be minimized. A suggested framework and timetable for the transformation are presented. Basic working definitions of two pivotal ideas (validation and system life-cycle) are provided, and the interaction of the two concepts is shown. Many examples are given of past and present validation activities by NASA and others. A conceptual framework is presented for the validation process. Finally, important areas are listed for ongoing development of the validation process at NASA Langley Research Center.

  20. System Identification Methods for Aircraft Flight Control Development and Validation

    DOT National Transportation Integrated Search

    1995-10-01

    System-identification methods compose a mathematical model, or series of models, from measurements of inputs and outputs of dynamic systems. This paper discusses the use of frequency-domain system-identification methods for the development and ...

  1. Validation of Digital Systems in Avionics and Flight Control Applications Handbook. Volume 1.

    DTIC Science & Technology

    1983-07-01

    will also be available to Airways Facilities, Systems Research and Development Service, Air Traffic Control Service, and Flight Standards elements...2114, March 12-14, 1979. 3. Validation Methods Research for Fault-Tolerant Avionics and Control Systems -- Working Group Meeting II, NASA...command generation with the multiple methods becoming available for closure of the outer control loop necessitates research on alternative integration

  2. Validation approach for a fast and simple targeted screening method for 75 antibiotics in meat and aquaculture products using LC-MS/MS.

    PubMed

    Dubreil, Estelle; Gautier, Sophie; Fourmond, Marie-Pierre; Bessiral, Mélaine; Gaugain, Murielle; Verdon, Eric; Pessel, Dominique

    2017-04-01

    An approach is described to validate a fast and simple targeted screening method for antibiotic analysis in meat and aquaculture products by LC-MS/MS. The validation strategy was applied to a panel of 75 antibiotics belonging to different families, i.e., penicillins, cephalosporins, sulfonamides, macrolides, quinolones and phenicols. The samples were extracted once with acetonitrile, concentrated by evaporation and injected into the LC-MS/MS system. The approach chosen for the validation was based on the Community Reference Laboratory (CRL) guidelines for the validation of qualitative screening methods. The aim of the validation was to prove sufficient sensitivity of the method to detect all the targeted antibiotics at the level of interest, generally the maximum residue limit (MRL). A robustness study was also performed to test the influence of different factors. The validation showed that the method is valid to detect and identify 73 of the 75 antibiotics studied in meat and aquaculture products at the validation levels.
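
    An illustrative acceptance check in the spirit of the CRL screening-validation approach the abstract cites: per analyte, count the fraction of samples spiked at the level of interest that the method detects. The 95% cutoff, function name, and run counts below are assumptions for illustration, not values taken from the paper or the guideline.

    ```python
    # Hedged sketch: an analyte "passes" screening validation when at least
    # min_rate of spiked samples at the MRL are detected (assumed rule).
    def screening_passes(detections, min_rate=0.95):
        """detections: booleans, one per spiked sample at the MRL."""
        return sum(detections) / len(detections) >= min_rate

    spiked_runs = [True] * 19 + [False]      # 19 of 20 spiked samples detected
    print(screening_passes(spiked_runs))     # True -> sufficient sensitivity
    ```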

  3. Examining Teacher Evaluation Validity and Leadership Decision Making within a Standards-Based Evaluation System

    ERIC Educational Resources Information Center

    Kimball, Steven M.; Milanowski, Anthony

    2009-01-01

    Purpose: The article reports on a study of school leader decision making that examined variation in the validity of teacher evaluation ratings in a school district that has implemented a standards-based teacher evaluation system. Research Methods: Applying mixed methods, the study used teacher evaluation ratings and value-added student achievement…

  4. Improvement of Simulation Method in Validation of Software of the Coordinate Measuring Systems

    NASA Astrophysics Data System (ADS)

    Nieciąg, Halina

    2015-10-01

    Software is used to accomplish various tasks at each stage of the functioning of modern measuring systems. Before metrological confirmation of measuring equipment, the system has to be validated. This paper discusses a method for conducting validation studies of a fragment of software used to calculate the values of measurands. Because of the number and nature of the variables affecting coordinate measurement results and the complex, multi-dimensional character of measurands, the study used the Monte Carlo method of numerical simulation. The article presents an attempt to improve on the results obtained by classic Monte Carlo tools: the LHS (Latin Hypercube Sampling) algorithm was implemented as an alternative to the simple sampling scheme of the classic algorithm.
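
    A minimal sketch of the contrast the abstract draws between simple Monte Carlo sampling and Latin Hypercube Sampling. The stratified-permutation construction below is the standard LHS scheme, not code from the paper.

    ```python
    import numpy as np

    def lhs(n_samples, n_dims, rng):
        """One random point in each of n_samples equal-probability strata
        per dimension, with strata randomly paired across dimensions."""
        u = rng.uniform(size=(n_samples, n_dims))
        perms = np.argsort(rng.uniform(size=(n_samples, n_dims)), axis=0)
        return (perms + u) / n_samples

    rng = np.random.default_rng(1)
    simple = rng.uniform(size=(8, 2))   # classic (simple) Monte Carlo draws
    latin = lhs(8, 2, rng)              # one sample per stratum in each dim
    print(latin)
    ```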

  5. Vacuum decay container closure integrity leak test method development and validation for a lyophilized product-package system.

    PubMed

    Patel, Jayshree; Mulhall, Brian; Wolf, Heinz; Klohr, Steven; Guazzo, Dana Morton

    2011-01-01

    A leak test performed according to ASTM F2338-09 Standard Test Method for Nondestructive Detection of Leaks in Packages by Vacuum Decay Method was developed and validated for container-closure integrity verification of a lyophilized product in a parenteral vial package system. This nondestructive leak test method is intended for use in manufacturing as an in-process package integrity check, and for testing product stored on stability in lieu of sterility tests. Method development and optimization challenge studies incorporated artificially defective packages representing a range of glass vial wall and sealing surface defects, as well as various elastomeric stopper defects. Method validation required 3 days of random-order replicate testing of a test sample population of negative-control, no-defect packages and positive-control, with-defect packages. Positive-control packages were prepared using vials each with a single hole laser-drilled through the glass vial wall. Hole creation and hole size certification was performed by Lenox Laser. Validation study results successfully demonstrated the vacuum decay leak test method's ability to accurately and reliably detect those packages with laser-drilled holes greater than or equal to approximately 5 μm in nominal diameter. All development and validation studies were performed at Whitehouse Analytical Laboratories in Whitehouse, NJ, under the direction of consultant Dana Guazzo of RxPax, LLC, using a VeriPac 455 Micro Leak Test System by Packaging Technologies & Inspection (Tuckahoe, NY). Bristol Myers Squibb (New Brunswick, NJ) fully subsidized all work. A leak test performed according to ASTM F2338-09 Standard Test Method for Nondestructive Detection of Leaks in Packages by Vacuum Decay Method was developed and validated to detect defects in stoppered vial packages containing lyophilized product for injection. This nondestructive leak test method is intended for use in manufacturing as an in-process package integrity check, and for testing product stored on stability in lieu of sterility tests. Test method validation study results proved the method capable of detecting holes laser-drilled through the glass vial wall greater than or equal to 5 μm in nominal diameter. Total test time is less than 1 min per package. All method development and validation studies were performed at Whitehouse Analytical Laboratories in Whitehouse, NJ, under the direction of consultant Dana Guazzo of RxPax, LLC, using a VeriPac 455 Micro Leak Test System by Packaging Technologies & Inspection (Tuckahoe, NY). Bristol Myers Squibb (New Brunswick, NJ) fully subsidized all work.

  6. Automatic control system generation for robot design validation

    NASA Technical Reports Server (NTRS)

    Bacon, James A. (Inventor); English, James D. (Inventor)

    2012-01-01

    The specification and drawings present a new method, system, software product, and apparatus for generating a robotic validation system for a robot design. The robotic validation system for the robot design of a robotic system is automatically generated by converting the robot design into a generic robotic description using a predetermined format, then generating a control system from the generic robotic description, and finally updating the robot design parameters of the robotic system with an analysis tool using both the generic robot description and the control system.

  7. Solar-Diesel Hybrid Power System Optimization and Experimental Validation

    NASA Astrophysics Data System (ADS)

    Jacobus, Headley Stewart

    As of 2008, 1.46 billion people, or 22 percent of the world's population, were without electricity. Many of these people live in remote areas where decentralized generation is the only method of electrification. Most mini-grids are powered by diesel generators, but new hybrid power systems are becoming a reliable method to incorporate renewable energy while also reducing total system cost. This thesis quantifies the measurable operational costs for an experimental hybrid power system in Sierra Leone. Two software programs, Hybrid2 and HOMER, are used during the system design and subsequent analysis. Experimental data from the installed system are used to validate the two programs and to quantify the savings created by each component within the hybrid system. This thesis bridges the gap between design optimization studies, which frequently lack subsequent validation, and experimental hybrid system performance studies.

  8. Comparison of Biophysical Characteristics and Predicted Thermophysiological Responses of Three Prototype Body Armor Systems Versus Baseline U.S. Army Body Armor Systems

    DTIC Science & Technology

    2015-06-19

    Predictive modeling is an effective and scientifically valid method of making comparisons of clothing and equipment changes prior to conducting human research. Three different body armor (BA) plus clothing ensembles were ...

  9. Analytical difficulties facing today's regulatory laboratories: issues in method validation.

    PubMed

    MacNeil, James D

    2012-08-01

    The challenges facing analytical laboratories today are not unlike those faced in the past, although both the degree of complexity and the rate of change have increased. Challenges such as development and maintenance of expertise, maintenance and updating of equipment, and the introduction of new test methods have always been familiar themes for analytical laboratories, but international guidelines for laboratories involved in the import and export testing of food require management of such changes in a context which includes quality assurance, accreditation, and method validation considerations. Decisions as to when a change in a method requires re-validation of the method, or on the design of a validation scheme for a complex multi-residue method, require a well-considered strategy, based on a current knowledge of international guidance documents and regulatory requirements, as well as the laboratory's quality system requirements. Validation demonstrates that a method is 'fit for purpose', so the requirement for validation should be assessed in terms of the intended use of a method and, in the case of change or modification of a method, whether that change or modification may affect a previously validated performance characteristic. In general, method validation involves method scope, calibration-related parameters, method precision, and recovery. Any method change which may affect method scope or any performance parameters will require re-validation. Some typical situations involving change in methods are discussed and a decision process proposed for selection of appropriate validation measures. © 2012 John Wiley & Sons, Ltd.

  10. Towards Validation of an Adaptive Flight Control Simulation Using Statistical Emulation

    NASA Technical Reports Server (NTRS)

    He, Yuning; Lee, Herbert K. H.; Davies, Misty D.

    2012-01-01

    Traditional validation of flight control systems is based primarily upon empirical testing. Empirical testing is sufficient for simple systems in which (a) the behavior is approximately linear and (b) humans are in the loop and responsible for off-nominal flight regimes. A different possible concept of operation is to use adaptive flight control systems with online learning neural networks (OLNNs) in combination with a human pilot for off-nominal flight behavior (such as when a plane has been damaged). Validating these systems is difficult because the controller is changing during the flight in a nonlinear way, and because the pilot and the control system have the potential to co-adapt in adverse ways; traditional empirical methods are unlikely to provide any guarantees in this case. Additionally, the time it takes to find unsafe regions within the flight envelope using empirical testing means that the time between adaptive controller design iterations is large. This paper describes a new concept for validating adaptive control systems using methods based on Bayesian statistics. This validation framework allows the analyst to build nonlinear models with modal behavior, and to have an uncertainty estimate for the difference between the behaviors of the model and the system under test.
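
    The abstract's statistical-emulation idea can be illustrated with a toy Gaussian-process emulator: fit to a few simulator runs, then predict the response elsewhere together with an uncertainty estimate. This is a generic GP sketch under assumed kernel and noise choices, not the paper's framework.

    ```python
    import numpy as np

    def rbf(a, b, ls=0.5):
        """Squared-exponential kernel between 1-D input sets a and b."""
        d = a[:, None] - b[None, :]
        return np.exp(-0.5 * (d / ls) ** 2)

    X = np.array([0.0, 0.3, 0.6, 1.0])        # simulator inputs (runs)
    y = np.sin(2 * np.pi * X)                  # simulator outputs (stand-in)
    Xs = np.linspace(0, 1, 5)                  # points to emulate
    K = rbf(X, X) + 1e-6 * np.eye(len(X))      # jittered covariance
    Ks = rbf(Xs, X)
    mean = Ks @ np.linalg.solve(K, y)          # emulator mean
    var = rbf(Xs, Xs).diagonal() - np.einsum(
        'ij,ji->i', Ks, np.linalg.solve(K, Ks.T))
    print(mean, np.sqrt(np.maximum(var, 0)))   # prediction and uncertainty
    ```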

  11. Development of Advanced Verification and Validation Procedures and Tools for the Certification of Learning Systems in Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen; Schumann, Johann; Gupta, Pramod; Richard, Michael; Guenther, Kurt; Soares, Fola

    2005-01-01

    Adaptive control technologies that incorporate learning algorithms have been proposed to enable automatic flight control and vehicle recovery, autonomous flight, and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments. In order for adaptive control systems to be used in safety-critical aerospace applications, they must be proven to be highly safe and reliable. Rigorous methods for adaptive software verification and validation must be developed to ensure that control system software failures will not occur. Of central importance in this regard is the need to establish reliable methods that guarantee convergent learning, rapid convergence (learning) rate, and algorithm stability. This paper presents the major problems of adaptive control systems that use learning to improve performance. It then presents the major procedures and tools developed, or currently being developed, to enable the verification, validation, and ultimate certification of these adaptive control systems. These technologies include the application of automated program analysis methods, techniques to improve the learning process, analytical methods to verify stability, methods to automatically synthesize code, simulation and test methods, and tools to provide on-line software assurance.

  12. Validation of Skills, Knowledge and Experience in Lifelong Learning in Europe

    ERIC Educational Resources Information Center

    Ogunleye, James

    2012-01-01

    The paper examines systems of validation of skills and experience as well as the main methods/tools currently used for validating skills and knowledge in lifelong learning. The paper uses mixed methods--case study research and content analysis of European Union policy documents and frameworks--as a basis for this research. The selection of the…

  13. Evaluation of passenger health risk assessment of sustainable indoor air quality monitoring in metro systems based on a non-Gaussian dynamic sensor validation method.

    PubMed

    Kim, MinJeong; Liu, Hongbin; Kim, Jeong Tai; Yoo, ChangKyoo

    2014-08-15

    Sensor faults in metro systems provide incorrect information to indoor air quality (IAQ) ventilation systems, resulting in the mis-operation of ventilation systems and adverse effects on passenger health. In this study, a new sensor validation method is proposed to (1) detect, identify and repair sensor faults and (2) evaluate the influence of sensor reliability on passenger health risk. To address the dynamic non-Gaussianity problem of IAQ data, dynamic independent component analysis (DICA) is used. To detect and identify sensor faults, the DICA-based squared prediction error and sensor validity index are used, respectively. To restore the faults to normal measurements, a DICA-based iterative reconstruction algorithm is proposed. The comprehensive indoor air-quality index (CIAI) that evaluates the influence of the current IAQ on passenger health is then compared using the faulty and reconstructed IAQ data sets. Experimental results from a metro station showed that the DICA-based method can produce an improved IAQ level in the metro station and reduce passenger health risk since it more accurately validates sensor faults than do conventional methods. Copyright © 2014 Elsevier B.V. All rights reserved.
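
    A simplified sketch of the squared prediction error (SPE) fault check named in this abstract, using plain PCA as a stand-in for the paper's dynamic ICA (DICA). Only the structure (project onto a latent model, reconstruct, threshold the residual) mirrors the abstract; the model, data, and 99% empirical limit are assumptions.

    ```python
    import numpy as np

    def fit_pca(X, n_comp):
        """Mean and top principal directions of training data X (rows)."""
        Xc = X - X.mean(axis=0)
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        return X.mean(axis=0), Vt[:n_comp]

    def spe(x, mean, P):
        """Squared norm of the residual off the latent-model subspace."""
        r = (x - mean) - P.T @ (P @ (x - mean))
        return float(r @ r)

    X = np.random.default_rng(2).normal(size=(200, 4))   # "normal" IAQ data
    mean, P = fit_pca(X, n_comp=2)
    limit = np.quantile([spe(x, mean, P) for x in X], 0.99)  # control limit
    x_new = np.array([0.1, -0.2, 5.0, 0.0])                  # faulty reading
    print(spe(x_new, mean, P) > limit)                       # True -> flag
    ```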

  14. Effective data validation of high-frequency data: time-point-, time-interval-, and trend-based methods.

    PubMed

    Horn, W; Miksch, S; Egghart, G; Popow, C; Paky, F

    1997-09-01

    Real-time systems for monitoring and therapy planning, which receive their data from on-line monitoring equipment and computer-based patient records, require reliable data. Data validation has to utilize and combine a set of fast methods to detect, eliminate, and repair faulty data, which may lead to life-threatening conclusions. The strength of data validation results from the combination of numerical and knowledge-based methods applied to both continuously-assessed high-frequency data and discontinuously-assessed data. Dealing with high-frequency data, examining single measurements is not sufficient. It is essential to take into account the behavior of parameters over time. We present time-point-, time-interval-, and trend-based methods for validation and repair. These are complemented by time-independent methods for determining an overall reliability of measurements. The data validation benefits from the temporal data-abstraction process, which provides automatically derived qualitative values and patterns. The temporal abstraction is based on a context-sensitive and expectation-guided principle. Additional knowledge derived from domain experts forms an essential part of all of these methods. The methods are applied in the field of artificial ventilation of newborn infants. Examples from the real-time monitoring and therapy-planning system VIE-VENT illustrate the usefulness and effectiveness of the methods.
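
    A hedged sketch of the three check families the abstract names: a time-point range check, a time-interval rate-of-change check, and a trend check on a sliding window. The thresholds, window size, and sample values are illustrative assumptions, not VIE-VENT's actual rules.

    ```python
    import numpy as np

    def validate_series(t, y, lo, hi, max_rate, max_trend, win=5):
        t = np.asarray(t, dtype=float)
        y = np.asarray(y, dtype=float)
        point_ok = (y >= lo) & (y <= hi)              # time-point check
        rate = np.abs(np.diff(y) / np.diff(t))
        interval_ok = np.r_[True, rate <= max_rate]   # time-interval check
        trend_ok = np.ones_like(y, dtype=bool)        # trend check (window)
        for i in range(win, len(y)):
            slope = np.polyfit(t[i - win:i], y[i - win:i], 1)[0]
            trend_ok[i] = abs(slope) <= max_trend
        return point_ok & interval_ok & trend_ok

    t = np.arange(10.0)
    y = [40, 41, 40, 42, 41, 80, 41, 40, 42, 41]   # one implausible spike
    print(validate_series(t, y, lo=20, hi=60, max_rate=10, max_trend=5))
    ```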

  15. Alternative validation practice of an automated faulting measurement method.

    DOT National Transportation Integrated Search

    2010-03-08

    A number of states have adopted profiler-based systems to automatically measure faulting in jointed concrete pavements. However, little published work exists which documents the validation process used for such automated faulting systems. This p...

  16. Theoretical validation for changing magnetic fields of systems of permanent magnets of drum separators

    NASA Astrophysics Data System (ADS)

    Lozovaya, S. Y.; Lozovoy, N. M.; Okunev, A. N.

    2018-03-01

    This article is devoted to the theoretical validation of the changes in the magnetic fields created by the permanent-magnet systems of drum separators. Using the example of a magnetic separator for the enrichment of highly magnetic ores, the article considers a method for the analytical calculation of the magnetic fields of systems of permanent magnets based on the Biot-Savart-Laplace law, the equivalent solenoid method, and the superposition principle of fields.
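
    For reference, the two standard ingredients the abstract names are the Biot-Savart law and field superposition; the equivalent-solenoid method replaces each permanent magnet by an equivalent surface current whose field is then computed from the same law (notation below is generic, not the paper's):

    ```latex
    % Biot--Savart law for a current element, and superposition over the
    % equivalent solenoids representing the individual permanent magnets.
    \[
      \mathbf{B}(\mathbf{r}) \;=\; \frac{\mu_0}{4\pi}
      \int \frac{I\,\mathrm{d}\boldsymbol{\ell} \times (\mathbf{r}-\mathbf{r}')}
                {\lvert \mathbf{r}-\mathbf{r}' \rvert^{3}},
      \qquad
      \mathbf{B}_{\text{total}}(\mathbf{r}) \;=\; \sum_{k} \mathbf{B}_k(\mathbf{r}).
    \]
    ```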

  17. Validation of Alternative In Vitro Methods to Animal Testing: Concepts, Challenges, Processes and Tools.

    PubMed

    Griesinger, Claudius; Desprez, Bertrand; Coecke, Sandra; Casey, Warren; Zuang, Valérie

    This chapter explores the concepts, processes, tools and challenges relating to the validation of alternative methods for toxicity and safety testing. In general terms, validation is the process of assessing the appropriateness and usefulness of a tool for its intended purpose. Validation is routinely used in various contexts in science, technology, the manufacturing and services sectors. It serves to assess the fitness-for-purpose of devices, systems, and software, up to entire methodologies. In the area of toxicity testing, validation plays an indispensable role: "alternative approaches" are increasingly replacing animal models as predictive tools, and it needs to be demonstrated that these novel methods are fit for purpose. Alternative approaches include in vitro test methods and non-testing approaches such as predictive computer models, up to entire testing and assessment strategies composed of method suites, data sources and decision-aiding tools. Data generated with alternative approaches are ultimately used for decision-making on public health and the protection of the environment. It is therefore essential that the underlying methods and methodologies are thoroughly characterised, assessed and transparently documented through validation studies involving impartial actors. Importantly, validation serves as a filter to ensure that only test methods able to produce data that help to address legislative requirements (e.g. EU's REACH legislation) are accepted as official testing tools and, owing to the globalisation of markets, recognised at the international level (e.g. through inclusion in OECD test guidelines). Since validation creates a credible and transparent evidence base on test methods, it provides a quality stamp, supporting companies developing and marketing alternative methods and creating considerable business opportunities. Validation of alternative methods is conducted through scientific studies assessing two key hypotheses, reliability and relevance of the test method for a given purpose. Relevance encapsulates the scientific basis of the test method, its capacity to predict adverse effects in the "target system" (i.e. human health or the environment) as well as its applicability for the intended purpose. In this chapter we focus on the validation of non-animal in vitro alternative testing methods and review the concepts, challenges, processes and tools fundamental to the validation of in vitro methods intended for hazard testing of chemicals. We explore major challenges and peculiarities of validation in this area. Based on the notion that validation per se is a scientific endeavour that needs to adhere to key scientific principles, namely objectivity and appropriate choice of methodology, we examine basic aspects of study design and management, and provide illustrations of statistical approaches to describe predictive performance of validated test methods as well as their reliability.

  18. Theoretical relationship between vibration transmissibility and driving-point response functions of the human body.

    PubMed

    Dong, Ren G; Welcome, Daniel E; McDowell, Thomas W; Wu, John Z

    2013-11-25

    The relationship between the vibration transmissibility and driving-point response functions (DPRFs) of the human body is important for understanding vibration exposures of the system and for developing valid models. This study identified their theoretical relationship and demonstrated that the sum of the DPRFs can be expressed as a linear combination of the transmissibility functions of the individual mass elements distributed throughout the system. The relationship is verified using several human vibration models. This study also clarified the requirements for reliably quantifying transmissibility values used as references for calibrating the system models. As an example application, this study used the developed theory to perform a preliminary analysis of the method for calibrating models using both vibration transmissibility and DPRFs. The results of the analysis show that the combined method can theoretically result in a unique and valid solution of the model parameters, at least for linear systems. However, the validation of the method itself does not guarantee the validation of the calibrated model, because the validation of the calibration also depends on the model structure and the reliability and appropriate representation of the reference functions. The basic theory developed in this study is also applicable to the vibration analyses of other structures.
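
    One hedged reading of the stated relation, with the driving-point response written in apparent-mass form: each mass element contributes to the total driving-point apparent mass through its own transmissibility. The notation below is assumed for illustration and is not taken from the paper:

    ```latex
    % M(omega): driving-point apparent mass; m_k: k-th mass element;
    % T_k(omega) = a_k / a_0: transmissibility of element k relative to the
    % driving point (assumed notation).
    \[
      M(\omega) \;=\; \sum_{k} m_k\, T_k(\omega),
      \qquad
      T_k(\omega) \;=\; \frac{a_k(\omega)}{a_0(\omega)} .
    \]
    ```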

  19. Model-Based Method for Sensor Validation

    NASA Technical Reports Server (NTRS)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. Therefore, these methods can only predict the most probable faulty sensors, subject to the initial probabilities defined for the failures. The method developed in this work is based on a model-based approach and provides the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems where it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concepts of analytical redundancy relations (ARRs).
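
    A hedged illustration of the analytical-redundancy-relation (ARR) idea the abstract builds on: each ARR should evaluate to approximately zero when the sensors it involves are healthy, so a sensor implicated in every violated ARR and in no satisfied one is a logical fault candidate. The tiny example system, single-fault assumption, and exoneration rule are hypothetical, not the paper's algorithm.

    ```python
    def fault_candidates(arrs, readings, tol=1e-3):
        """arrs: list of (set_of_sensor_names, residual_fn) pairs."""
        violated = [s for s, f in arrs if abs(f(readings)) > tol]
        satisfied = [s for s, f in arrs if abs(f(readings)) <= tol]
        if not violated:
            return set()
        cand = set.intersection(*violated)   # appears in every violated ARR
        for s in satisfied:
            cand -= s                        # exonerated by a passing ARR
        return cand

    # Toy system (made up): sensors a, b, c with physics a + b = c and a = 2b
    arrs = [({'a', 'b', 'c'}, lambda r: r['a'] + r['b'] - r['c']),
            ({'a', 'b'},      lambda r: r['a'] - 2 * r['b'])]
    print(fault_candidates(arrs, {'a': 2.0, 'b': 1.0, 'c': 4.5}))  # {'c'}
    ```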

  20. Principles for valid histopathologic scoring in research

    PubMed Central

    Gibson-Corley, Katherine N.; Olivier, Alicia K.; Meyerholz, David K.

    2013-01-01

    Histopathologic scoring is a tool by which semi-quantitative data can be obtained from tissues. Initially, a thorough understanding of the experimental design, study objectives and methods is required to allow the pathologist to appropriately examine tissues and develop lesion scoring approaches. Many principles go into the development of a scoring system, such as tissue examination, lesion identification, scoring definitions and consistency in interpretation. Masking (a.k.a. "blinding") of the pathologist to experimental groups is often necessary to constrain bias, and multiple mechanisms are available. Development of a tissue scoring system requires appreciation of the attributes and limitations of the data (e.g. nominal, ordinal, interval and ratio data) to be evaluated. Incidence, ordinal and rank methods of tissue scoring are demonstrated along with key principles for statistical analyses and reporting. Validation of a scoring system occurs through two principal measures: 1) validation of repeatability and 2) validation of tissue pathobiology. Understanding key principles of tissue scoring can help in the development and/or optimization of scoring systems so as to consistently yield meaningful and valid scoring data. PMID:23558974
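
    The repeatability arm of validation mentioned here is commonly quantified with an agreement statistic between masked raters. As one illustrative choice (the paper does not prescribe this exact statistic), the sketch below computes Cohen's kappa for two pathologists scoring the same slides on an ordinal 0-3 scale; the scores are made up.

    ```python
    import numpy as np

    def cohens_kappa(r1, r2, n_levels):
        """Cohen's kappa from two raters' integer scores in [0, n_levels)."""
        c = np.zeros((n_levels, n_levels))
        for i, j in zip(r1, r2):
            c[i, j] += 1                            # confusion matrix
        po = np.trace(c) / c.sum()                  # observed agreement
        pe = (c.sum(0) @ c.sum(1)) / c.sum() ** 2   # chance agreement
        return (po - pe) / (1 - pe)

    scores_a = [0, 1, 2, 2, 3, 1, 0, 2]   # pathologist A, ordinal 0-3
    scores_b = [0, 1, 2, 3, 3, 1, 1, 2]   # pathologist B, masked re-score
    print(cohens_kappa(scores_a, scores_b, n_levels=4))
    ```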

  1. Validity and reliability of the Myotest accelerometric system for the assessment of vertical jump height.

    PubMed

    Casartelli, Nicola; Müller, Roland; Maffiuletti, Nicola A

    2010-11-01

    The aim of the present study was to verify the validity and reliability of the Myotest accelerometric system (Myotest SA, Sion, Switzerland) for the assessment of vertical jump height. Forty-four male basketball players (age range: 9-25 years) performed series of squat, countermovement and repeated jumps during 2 identical test sessions separated by 2-15 days. Flight height was simultaneously quantified with the Myotest system and validated photoelectric cells (Optojump). Two calculation methods were used to estimate the jump height from Myotest recordings: flight time (Myotest-T) and vertical takeoff velocity (Myotest-V). Concurrent validity was investigated comparing Myotest-T and Myotest-V to the criterion method (Optojump), and test-retest reliability was also examined. As regards validity, Myotest-T overestimated jumping height compared to Optojump (p < 0.001) with a systematic bias of approximately 7 cm, even though random errors were low (2.7 cm) and intraclass correlation coefficients (ICCs) were high (>0.98), that is, excellent validity. Myotest-V overestimated jumping height compared to Optojump (p < 0.001), with high random errors (>12 cm), high limits of agreement ratios (>36%), and low ICCs (<0.75), that is, poor validity. As regards reliability, Myotest-T showed high ICCs (range: 0.92-0.96), whereas Myotest-V showed low ICCs (range: 0.56-0.89), and high random errors (>9 cm). In conclusion, Myotest-T is a valid and reliable method for the assessment of vertical jump height, and its use is legitimate for field-based evaluations, whereas Myotest-V is neither valid nor reliable.
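
    The two estimation methods compared in this study correspond to the standard projectile relations for a vertical jump (these are textbook formulas, not equations quoted from the paper):

    ```latex
    % t: flight time, v: vertical takeoff velocity, g ~ 9.81 m/s^2.
    % Flight-time method (Myotest-T) and takeoff-velocity method (Myotest-V):
    \[
      h_{\text{flight time}} \;=\; \frac{g\,t^{2}}{8},
      \qquad
      h_{\text{takeoff velocity}} \;=\; \frac{v^{2}}{2g}.
    \]
    ```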

  2. Simulation verification techniques study: Simulation performance validation techniques document. [for the space shuttle system

    NASA Technical Reports Server (NTRS)

    Duncan, L. M.; Reddell, J. P.; Schoonmaker, P. B.

    1975-01-01

    Techniques and support software for the efficient performance of simulation validation are discussed. Overall validation software structure, the performance of validation at various levels of simulation integration, guidelines for check case formulation, methods for real-time acquisition and formatting of data from an all-up operational simulator, and methods and criteria for comparison and evaluation of simulation data are included. Vehicle subsystem modules, module integration, special test requirements, and reference data formats are also described.

  3. Validation in Support of Internationally Harmonised OECD Test Guidelines for Assessing the Safety of Chemicals.

    PubMed

    Gourmelon, Anne; Delrue, Nathalie

    Ten years have elapsed since the OECD published the Guidance Document on the validation and international regulatory acceptance of test methods for hazard assessment. Much experience has been gained since then in validation centres, in countries, and at the OECD on a variety of test methods that were subjected to validation studies. This chapter reviews validation principles and highlights common features that appear to be important for further regulatory acceptance across studies. Existing OECD-agreed validation principles will most likely remain generally relevant and applicable to address challenges associated with the validation of future test methods. Some adaptations may be needed to take into account the level of technology introduced in test systems, but demonstration of relevance and reliability will continue to play a central role as a prerequisite for regulatory acceptance. Demonstration of relevance will become more challenging for test methods that form part of a set of predictive tools and methods and that do not stand alone. The OECD is keen on ensuring that, while these concepts evolve, countries can continue to rely on valid methods and harmonised approaches for the efficient testing and assessment of chemicals.

  4. Formal methods and their role in digital systems validation for airborne systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1995-01-01

    This report is based on one prepared as a chapter for the FAA Digital Systems Validation Handbook (a guide to assist FAA certification specialists with advanced technology issues). Its purpose is to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used in critical applications; and to suggest factors for consideration when formal methods are offered in support of certification. The presentation concentrates on the rationale for formal methods and on their contribution to assurance for critical applications within a context such as that provided by DO-178B (the guidelines for software used on board civil aircraft); it is intended as an introduction for those to whom these topics are new.

  5. Meeting Report: Validation of Toxicogenomics-Based Test Systems: ECVAM–ICCVAM/NICEATM Considerations for Regulatory Use

    PubMed Central

    Corvi, Raffaella; Ahr, Hans-Jürgen; Albertini, Silvio; Blakey, David H.; Clerici, Libero; Coecke, Sandra; Douglas, George R.; Gribaldo, Laura; Groten, John P.; Haase, Bernd; Hamernik, Karen; Hartung, Thomas; Inoue, Tohru; Indans, Ian; Maurici, Daniela; Orphanides, George; Rembges, Diana; Sansone, Susanna-Assunta; Snape, Jason R.; Toda, Eisaku; Tong, Weida; van Delft, Joost H.; Weis, Brenda; Schechtman, Leonard M.

    2006-01-01

    This is the report of the first workshop “Validation of Toxicogenomics-Based Test Systems” held 11–12 December 2003 in Ispra, Italy. The workshop was hosted by the European Centre for the Validation of Alternative Methods (ECVAM) and organized jointly by ECVAM, the U.S. Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM), and the National Toxicology Program (NTP) Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM). The primary aim of the workshop was for participants to discuss and define principles applicable to the validation of toxicogenomics platforms as well as validation of specific toxicologic test methods that incorporate toxicogenomics technologies. The workshop was viewed as an opportunity for initiating a dialogue between technologic experts, regulators, and the principal validation bodies and for identifying those factors to which the validation process would be applicable. It was felt that to do so now, as the technology is evolving and associated challenges are identified, would be a basis for the future validation of the technology when it reaches the appropriate stage. Because of the complexity of the issue, different aspects of the validation of toxicogenomics-based test methods were covered. The three focus areas include a) biologic validation of toxicogenomics-based test methods for regulatory decision making, b) technical and bioinformatics aspects related to validation, and c) validation issues as they relate to regulatory acceptance and use of toxicogenomics-based test methods. In this report we summarize the discussions and describe in detail the recommendations for future direction and priorities. PMID:16507466

  6. A Hardware Model Validation Tool for Use in Complex Space Systems

    NASA Technical Reports Server (NTRS)

    Davies, Misty Dawn; Gundy-Burlet, Karen L.; Limes, Gregory L.

    2010-01-01

    One of the many technological hurdles that must be overcome in future missions is the challenge of validating as-built systems against the models used for design. We propose a technique composed of intelligent parameter exploration in concert with automated failure analysis as a scalable method for the validation of complex space systems. The technique is impervious to discontinuities and linear dependencies in the data, and can handle dimensionalities consisting of hundreds of variables over tens of thousands of experiments.

  7. Field-scale moisture estimates using COSMOS sensors: a validation study with temporary networks and leaf-area-indices

    USDA-ARS?s Scientific Manuscript database

    The Cosmic-ray Soil Moisture Observing System (COSMOS) is a new and innovative method for estimating surface and near surface soil moisture at large (~700 m) scales. This system accounts for liquid water within its measurement volume. Many of the sites used in the early validation of the system had...

  8. Screening Systems and Decision Making at the Preschool Level: Application of a Comprehensive Validity Framework

    ERIC Educational Resources Information Center

    Kettler, Ryan J.; Feeney-Kettler, Kelly A.

    2011-01-01

    Universal screening is designed to be an efficient method for identifying preschool students with mental health problems, but prior to use, screening systems must be evaluated to determine their appropriateness within a specific setting. In this article, an evidence-based validity framework is applied to four screening systems for identifying…

  9. Methodologies for Pre-Validation of Biofilters and Wetlands for Stormwater Treatment

    PubMed Central

    Zhang, Kefeng; Randelovic, Anja; Aguiar, Larissa M.; Page, Declan; McCarthy, David T.; Deletic, Ana

    2015-01-01

Background Water Sensitive Urban Design (WSUD) systems are frequently used as part of stormwater harvesting treatment trains (e.g. biofilters (bio-retention systems and rain gardens) and wetlands). However, validation frameworks for such systems do not exist, limiting their adoption for end-uses such as drinking water. The first stage in the validation framework is pre-validation, which prepares information for further validation monitoring. Objectives A pre-validation roadmap, consisting of five steps, is suggested in this paper. Detailed methods for investigating target micropollutants in stormwater, and for determining challenge conditions for biofilters and wetlands, are provided. Methods A literature review was undertaken to identify and quantify micropollutants in stormwater. MUSIC V5.1 was utilized to simulate the behaviour of the systems based on 30-year rainfall data in three distinct climate zones; outputs were evaluated to identify the thresholds of operational variables, including length of dry periods (LDPs) and volume of water treated per event. Results The paper highlights that a number of micropollutants were found in stormwater at levels above various worldwide drinking water guidelines (eight pesticides, benzene, benzo(a)pyrene, pentachlorophenol, di-(2-ethylhexyl)-phthalate and total polychlorinated biphenyls). The 95th percentile LDP was exponentially related to system design area, while the 5th percentile LDP remained within short durations (i.e. 2-8 hours). The 95th percentile volume of water treated per event was exponentially related to system design area as a percentage of the impervious catchment area. Conclusions The outcomes of this study show that pre-validation could be completed through a roadmap consisting of a series of steps; this will help in the validation of stormwater treatment systems. PMID:25955688

  10. In-house validation study of the DuPont Qualicon BAX system Q7 instrument with the BAX system PCR Assay for Salmonella (modification of AOAC Official Method 2003.09 and AOAC Research Institute Performance-Tested Method 100201).

    PubMed

    Tice, George; Andaloro, Bridget; White, H Kirk; Bolton, Lance; Wang, Siqun; Davis, Eugene; Wallace, Morgan

    2009-01-01

    In 2006, DuPont Qualicon introduced the BAX system Q7 instrument for use with its assays. To demonstrate the equivalence of the new and old instruments, a validation study was conducted using the BAX system PCR Assay for Salmonella, AOAC Official Method 2003.09, on three food types. The foods were simultaneously analyzed with the BAX system Q7 instrument and either the U.S. Food and Drug Administration Bacteriological Analytical Manual or the U.S. Department of Agriculture-Food Safety and Inspection Service Microbiology Laboratory Guidebook reference method for detecting Salmonella. Comparable performance between the BAX system and the reference methods was observed. Of the 75 paired samples analyzed, 39 samples were positive by both the BAX system and reference methods, and 36 samples were negative by both the BAX system and reference methods, demonstrating 100% correlation. Inclusivity and exclusivity for the BAX system Q7 instrument were also established by testing 50 Salmonella strains and 20 non-Salmonella isolates. All Salmonella strains returned positive results, and all non-Salmonella isolates returned a negative response.

  11. Validation protocol of analytical procedures for quantification of drugs in polymeric systems for parenteral administration: dexamethasone phosphate disodium microparticles.

    PubMed

    Martín-Sabroso, Cristina; Tavares-Fernandes, Daniel Filipe; Espada-García, Juan Ignacio; Torres-Suárez, Ana Isabel

    2013-12-15

In this work a protocol to validate analytical procedures for the quantification of drug substances formulated in polymeric systems, covering both drug entrapped into the polymeric matrix (assay:content test) and drug released from the systems (assay:dissolution test), is developed. This protocol is applied to the validation of two isocratic HPLC analytical procedures for the analysis of dexamethasone phosphate disodium microparticles for parenteral administration. Preparation of authentic samples and artificially "spiked" and "unspiked" samples is described. Specificity (the ability to quantify dexamethasone phosphate disodium in the presence of constituents of the dissolution medium and other microparticle constituents), linearity, accuracy and precision are evaluated, in the range from 10 to 50 μg/mL in the assay:content test procedure and from 0.25 to 10 μg/mL in the assay:dissolution test procedure. The robustness of the analytical method to extract drug from microparticles is also assessed. The validation protocol developed allows us to conclude that both analytical methods are suitable for their intended purpose, but the lack of proportionality of the assay:dissolution analytical method should be taken into account. The validation protocol designed in this work could be applied to the validation of any analytical procedure for the quantification of drugs formulated in controlled-release polymeric microparticles. Copyright © 2013 Elsevier B.V. All rights reserved.
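
    The validation characteristics named above lend themselves to a simple numerical check. Below is a minimal Python sketch of how linearity, accuracy (recovery) and precision (RSD) might be computed for a calibration range such as the assay:content one; all concentrations and peak areas are invented for illustration and are not data from the study.

        # Minimal sketch of linearity/accuracy/precision checks for an HPLC
        # calibration range (values are illustrative, not from the study).
        import numpy as np

        nominal = np.array([10, 20, 30, 40, 50], dtype=float)       # ug/mL standards
        peak_area = np.array([101.2, 203.5, 304.1, 402.8, 509.0])   # detector response

        # Linearity: least-squares fit and coefficient of determination.
        slope, intercept = np.polyfit(nominal, peak_area, 1)
        predicted = slope * nominal + intercept
        ss_res = np.sum((peak_area - predicted) ** 2)
        ss_tot = np.sum((peak_area - peak_area.mean()) ** 2)
        r_squared = 1 - ss_res / ss_tot

        # Accuracy: recovery of back-calculated concentrations vs. nominal.
        back_calc = (peak_area - intercept) / slope
        recovery_pct = 100 * back_calc / nominal

        # Precision: relative standard deviation of replicate injections.
        replicates = np.array([302.9, 304.4, 305.0, 303.6, 304.8])
        rsd_pct = 100 * replicates.std(ddof=1) / replicates.mean()

        print(f"R^2={r_squared:.4f}, mean recovery={recovery_pct.mean():.1f}%, RSD={rsd_pct:.2f}%")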

  12. Statistically Validated Networks in Bipartite Complex Systems

    PubMed Central

    Tumminello, Michele; Miccichè, Salvatore; Lillo, Fabrizio; Piilo, Jyrki; Mantegna, Rosario N.

    2011-01-01

Many complex systems present an intrinsic bipartite structure where elements of one set link to elements of the second set. In these complex systems, such as the system of actors and movies, elements of one set are qualitatively different from elements of the other set. The properties of these complex systems are typically investigated by constructing and analyzing a projected network on one of the two sets (for example the actor network or the movie network). Complex systems are often very heterogeneous in the number of relationships that the elements of one set establish with the elements of the other set, and this heterogeneity makes it very difficult to discriminate links of the projected network that merely reflect the system's heterogeneity from links relevant to unveiling the properties of the system. Here we introduce an unsupervised method to statistically validate each link of a projected network against a null hypothesis that takes into account system heterogeneity. We apply the method to a biological, an economic and a social complex system. The method we propose is able to detect network structures which are very informative about the organization and specialization of the investigated systems, and identifies those relationships between elements of the projected network that cannot be explained simply by system heterogeneity. We also show that our method applies to bipartite systems in which different relationships might have a different qualitative nature, generating statistically validated networks in which such difference is preserved. PMID:21483858
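
    The published approach tests each co-occurrence in the bipartite network against a null model that accounts for the degrees of both nodes. As a hedged illustration, the sketch below scores a single link with a hypergeometric survival function and a Bonferroni-corrected threshold; all counts are invented, and the full method's multiple-testing treatment may differ in detail.

        # Sketch of validating one projected-network link against a
        # hypergeometric null model, in the spirit of the method above.
        from scipy.stats import hypergeom

        N = 5000          # total elements in the second set (e.g., movies)
        n_a = 120         # degree of node A in the bipartite graph
        n_b = 90          # degree of node B
        n_ab = 12         # observed co-occurrences of A and B
        n_tests = 10**5   # number of node pairs examined

        # P(X >= n_ab) when n_b draws are taken from N with n_a "successes".
        p_value = hypergeom.sf(n_ab - 1, N, n_a, n_b)
        validated = p_value < 0.01 / n_tests   # Bonferroni-corrected threshold
        print(p_value, validated)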

  13. Likelihood ratio data to report the validation of a forensic fingerprint evaluation method.

    PubMed

    Ramos, Daniel; Haraksim, Rudolf; Meuwly, Didier

    2017-02-01

The data to which the authors refer throughout this article are likelihood ratios (LR) computed from the comparison of 5-12 minutiae fingermarks with fingerprints. These LR data are used for the validation of a likelihood ratio (LR) method in forensic evidence evaluation. They present a necessary asset for conducting validation experiments when validating LR methods used in forensic evidence evaluation and for setting up validation reports. These data can also be used as a baseline for comparing fingermark evidence in the same minutiae configuration as presented in (D. Meuwly, D. Ramos, R. Haraksim) [1], although the reader should keep in mind that different feature extraction algorithms and different AFIS systems may produce different LR values. Moreover, these data may serve as a reproducibility exercise, in order to train the generation of validation reports of forensic methods, according to [1]. Alongside the data, a justification and motivation for the use of the methods is given. These methods calculate LRs from the fingerprint/mark data and are subject to a validation procedure. The choice of using real forensic fingerprints in the validation and simulated data in the development is described and justified. Validation criteria are set for the purpose of validation of the LR methods, which are used to calculate the LR values from the data and the validation report. For privacy and data protection reasons, the original fingerprint/mark images cannot be shared, but these images do not constitute the core data for the validation, contrary to the LRs, which are shared.
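
    A common summary metric in LR-method validation is the log-likelihood-ratio cost (Cllr), which penalizes both discrimination and calibration errors. The sketch below computes Cllr from two illustrative sets of LR values (same-source and different-source comparisons); the numbers are invented and not taken from the shared dataset.

        # Sketch of the log-likelihood-ratio cost (Cllr), a standard metric
        # used when validating LR methods; the LR arrays are illustrative.
        import numpy as np

        lr_same = np.array([120.0, 35.0, 8.0, 400.0])   # LRs from same-source pairs
        lr_diff = np.array([0.02, 0.5, 0.001, 0.1])     # LRs from different-source pairs

        cllr = 0.5 * (np.mean(np.log2(1 + 1 / lr_same)) +
                      np.mean(np.log2(1 + lr_diff)))
        print(f"Cllr = {cllr:.3f}")   # values well below 1 indicate a useful method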

  14. Concept Development for Future Domains: A New Method of Knowledge Elicitation

    DTIC Science & Technology

    2005-06-01

Procedure: U.S. Army Research Institute for the Behavioral and Social Sciences (ARI) examined methods to generate, refine, test, and validate new...generate, elaborate, refine, describe, test, and validate new Future Force concepts relating to doctrine, tactics, techniques, procedures, unit and team...System (Harvey, 1993), and the Job Element Method (Primoff & Eyde, 1988). Figure 1 provides a more comprehensive list of task analytic methods. Please see

  15. Validation of the Combined Comorbidity Index of Charlson and Elixhauser to Predict 30-Day Mortality Across ICD-9 and ICD-10.

    PubMed

    Simard, Marc; Sirois, Caroline; Candas, Bernard

    2018-05-01

To validate and compare the performance of an International Classification of Diseases, tenth revision (ICD-10) version of a combined comorbidity index, merging conditions of the Charlson and Elixhauser measures, against the individual measures in the prediction of 30-day mortality; and to select a weight derivation method providing optimal performance across ICD-9 and ICD-10 coding systems. Using 2 adult population-based cohorts of patients with hospital admissions coded in ICD-9 (2005, n=337,367) and ICD-10 (2011, n=348,820), we validated a combined comorbidity index by predicting 30-day mortality with logistic regression. To assess the performance of the Combined index and both individual measures, factors impacting index performance, such as population characteristics and weight derivation methods, were accounted for. We applied 3 scoring methods (Van Walraven, Schneeweiss, and Charlson) and determined which provides the best predictive values. The Combined index (c-statistic: 0.853; 95% confidence interval (CI), 0.848-0.856) performed better than the original Charlson (0.841; 95% CI, 0.835-0.844) or Elixhauser (0.841; 95% CI, 0.837-0.844) measures on the ICD-10 cohort. All weight derivation methods provided similarly high discrimination for the Combined index (Van Walraven: 0.852, Schneeweiss: 0.851, Charlson: 0.849). Results were consistent across both coding systems. The Combined index remains valid with both ICD-9 and ICD-10 coding systems, and the 3 weight derivation methods evaluated provided consistently high performance across those coding systems.
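
    The core validation step, predicting 30-day mortality from a comorbidity score with logistic regression and summarizing discrimination with the c-statistic, can be sketched compactly. The data below are simulated under an assumed score-mortality relationship and are not the study cohorts.

        # Sketch: c-statistic (ROC AUC) of a logistic model predicting
        # 30-day mortality from a comorbidity score. Data are simulated.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        n = 10_000
        score = rng.poisson(3, n)                    # combined comorbidity index
        logit = -4.0 + 0.45 * score                  # assumed true relationship
        died_30d = rng.random(n) < 1 / (1 + np.exp(-logit))

        model = LogisticRegression().fit(score.reshape(-1, 1), died_30d)
        pred = model.predict_proba(score.reshape(-1, 1))[:, 1]
        print(f"c-statistic: {roc_auc_score(died_30d, pred):.3f}")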

  16. TMATS/ IHAL/ DDML Schema Validation

    DTIC Science & Technology

    2017-02-01

The task was to create a method for performing IRIG eXtensible Markup Language (XML) schema validation, as opposed to XML instance document validation (TMATS/IHAL/DDML Schema Validation, RCC 126-17, February 2017).
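
    The distinction the report draws, validating a schema itself as opposed to validating an instance document against it, can be illustrated with lxml: constructing an XMLSchema object checks the schema document, while validate() checks instances. The toy schema below is invented and is not a TMATS/IHAL/DDML schema.

        # Sketch of schema validation vs. instance document validation.
        from lxml import etree

        # Constructing XMLSchema validates the schema document itself and
        # raises XMLSchemaParseError if the XSD is malformed.
        xsd = etree.XML(b"""
        <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
          <xs:element name="display" type="xs:string"/>
        </xs:schema>""")
        schema = etree.XMLSchema(xsd)

        # Instance document validation, by contrast, checks documents
        # against the (already validated) schema.
        good = etree.XML(b"<display>altimeter</display>")
        bad = etree.XML(b"<displays/>")
        print(schema.validate(good))   # True
        print(schema.validate(bad))    # False
        for error in schema.error_log:
            print(error.message)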

  17. Performance Equivalence and Validation of the Soleris Automated System for Quantitative Microbial Content Testing Using Pure Suspension Cultures.

    PubMed

    Limberg, Brian J; Johnstone, Kevin; Filloon, Thomas; Catrenich, Carl

    2016-09-01

Using United States Pharmacopeia-National Formulary (USP-NF) general method <1223> guidance, the Soleris® automated system and reagents (Nonfermenting Total Viable Count for bacteria and Direct Yeast and Mold for yeast and mold) were validated, using a performance equivalence approach, as an alternative to plate counting for total microbial content analysis using five representative microbes: Staphylococcus aureus, Bacillus subtilis, Pseudomonas aeruginosa, Candida albicans, and Aspergillus brasiliensis. Detection times (DTs) in the alternative automated system were linearly correlated to CFU/sample (R² = 0.94-0.97) with ≥70% accuracy per USP General Chapter <1223> guidance. The LOD and LOQ of the automated system were statistically similar to the traditional plate count method. This system was significantly more precise than plate counting (RSD 1.2-2.9% for DT, 7.8-40.6% for plate counts), was statistically comparable to plate counting with respect to variations in analyst, vial lots, and instruments, and was robust when variations in the operating detection thresholds (dTs; ±2 units) were used. The automated system produced accurate results, was more precise and less labor-intensive, and met or exceeded criteria for a valid alternative quantitative method, consistent with USP-NF general method <1223> guidance.
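
    The reported linear correlation between detection time and microbial load can be reproduced in miniature with a simple regression, commonly done against log-transformed counts. The values below are illustrative, not Soleris data.

        # Sketch of the linearity check: regress detection time against
        # log10(CFU/sample) and inspect R^2. Numbers are illustrative.
        import numpy as np
        from scipy.stats import linregress

        log_cfu = np.array([1, 2, 3, 4, 5, 6], dtype=float)          # log10 CFU/sample
        detection_h = np.array([21.5, 18.2, 15.1, 11.8, 8.9, 5.7])   # hours to detection

        fit = linregress(log_cfu, detection_h)
        print(f"DT = {fit.slope:.2f}*log10(CFU) + {fit.intercept:.2f}, "
              f"R^2 = {fit.rvalue**2:.3f}")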

  18. Prognostics of Power Electronics, Methods and Validation Experiments

    NASA Technical Reports Server (NTRS)

    Kulkarni, Chetan S.; Celaya, Jose R.; Biswas, Gautam; Goebel, Kai

    2012-01-01

Failure of electronic devices is a concern for future electric aircraft, which will see an increase in electronics to drive and control safety-critical equipment throughout the aircraft. As a result, investigation of precursors to failure in electronics and prediction of the remaining life of electronic components are of key importance. DC-DC power converters are power electronics systems typically employed as sourcing elements for avionics equipment. Current research efforts in prognostics for these power systems focus on the identification of failure mechanisms and the development of accelerated aging methodologies and systems to accelerate the aging process of test devices, while continuously measuring key electrical and thermal parameters. Preliminary model-based prognostics algorithms have been developed making use of empirical degradation models and physics-inspired degradation models, with focus on key components like electrolytic capacitors and power MOSFETs (metal-oxide-semiconductor field-effect transistors). This paper presents current results on the development of validation methods for prognostics algorithms for power electrolytic capacitors, particularly the use of accelerated aging systems for algorithm validation. Validation of prognostics algorithms presents difficulties in practice due to the lack of run-to-failure experiments in deployed systems. By using accelerated experiments, we circumvent this problem in order to define initial validation activities.
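
    As a hedged illustration of the empirical degradation modeling mentioned above, the sketch below fits an assumed exponential capacitance-loss model to simulated accelerated-aging data and extrapolates to a common electrolytic-capacitor end-of-life criterion (roughly 20% capacitance loss); none of the numbers come from the experiments described.

        # Sketch of an empirical degradation model: fit capacitance loss
        # over aging time and extrapolate to a failure threshold.
        import numpy as np
        from scipy.optimize import curve_fit

        t_hours = np.array([0, 50, 100, 150, 200, 250], dtype=float)
        cap_uF = np.array([2200, 2154, 2110, 2068, 2025, 1984], dtype=float)

        def degradation(t, c0, k):
            return c0 * np.exp(-k * t)        # assumed exponential loss model

        (c0, k), _ = curve_fit(degradation, t_hours, cap_uF, p0=(2200, 1e-4))
        threshold = 0.80 * c0                 # end-of-life criterion
        t_fail = np.log(c0 / threshold) / k   # solve degradation(t) = threshold
        print(f"predicted end of life at ~{t_fail:.0f} aging hours")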

  19. Validation of the Five-Phase Method for Simulating Complex Fenestration Systems with Radiance against Field Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geisler-Moroder, David; Lee, Eleanor S.; Ward, Gregory J.

    2016-08-29

The Five-Phase Method (5-pm) for simulating complex fenestration systems with Radiance is validated against field measurements. The capability of the method to predict workplane illuminances, vertical sensor illuminances, and glare indices derived from captured and rendered high dynamic range (HDR) images is investigated. To be able to accurately represent the direct sun part of the daylight not only in sensor point simulations, but also in renderings of interior scenes, the 5-pm calculation procedure was extended. The validation shows that the 5-pm is superior to the Three-Phase Method for predicting horizontal and vertical illuminance sensor values as well as glare indices derived from rendered images. Even with input data from global and diffuse horizontal irradiance measurements only, daylight glare probability (DGP) values can be predicted within 10% error of measured values for most situations.

  20. On Selecting Commercial Information Systems

    PubMed Central

    Möhr, J.R.; Sawinski, R.; Kluge, A.; Alle, W.

    1984-01-01

As more commercial information systems become available, the methodology for their selection gains importance. An instance is described in which the method employed for the selection of laboratory information systems was multilevel assessment. The method is described and the experience gained in the project is summarized and discussed. Evidence is provided that the employed method is comprehensive, reproducible, valid and economical.

  1. Validation of Safety-Critical Systems for Aircraft Loss-of-Control Prevention and Recovery

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.

    2012-01-01

Validation of technologies developed for loss of control (LOC) prevention and recovery poses significant challenges. Aircraft LOC can result from a wide spectrum of hazards, often occurring in combination, which cannot be fully replicated during evaluation. Technologies developed for LOC prevention and recovery must therefore be effective under a wide variety of hazardous and uncertain conditions, and the validation framework must provide some measure of assurance that the new vehicle safety technologies do no harm (i.e., that they themselves do not introduce new safety risks). This paper summarizes a proposed validation framework for safety-critical systems, provides an overview of validation methods and tools developed by NASA to date within the Vehicle Systems Safety Project, and develops a preliminary set of test scenarios for the validation of technologies for LOC prevention and recovery.

  2. Quantifying Soiling Loss Directly From PV Yield

    DOE PAGES

    Deceglie, Michael G.; Micheli, Leonardo; Muller, Matthew

    2018-01-23

Soiling of photovoltaic (PV) panels is typically quantified through the use of specialized sensors. Here, we describe and validate a method for estimating soiling loss experienced by PV systems directly from system yield without the need for precipitation data. The method, termed the stochastic rate and recovery (SRR) method, automatically detects soiling intervals in a dataset, then stochastically generates a sample of possible soiling profiles based on the observed characteristics of each interval. In this paper, we describe the method, validate it against soiling station measurements, and compare it with other PV-yield-based soiling estimation methods. The broader application of the SRR method will enable fleet-scale assessment of soiling loss to facilitate mitigation planning and risk assessment.
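
    A simplified sketch of the interval logic follows: a daily normalized performance index is split at sharp upward shifts (candidate cleaning events), and a soiling rate is fitted within each interval. This illustrates the idea only and is not the published SRR implementation; the series is simulated.

        # Simplified illustration of soiling-interval detection and
        # per-interval rate fitting on a simulated performance index.
        import numpy as np

        rng = np.random.default_rng(1)
        days = 120
        index = np.ones(days)
        for start in (0, 40, 80):                 # three soiling intervals
            index[start:start + 40] = 1.0 - 0.004 * np.arange(40)
        index += rng.normal(0, 0.003, days)       # measurement noise

        jumps = np.diff(index)
        breaks = np.flatnonzero(jumps > 0.05) + 1  # recovery (cleaning) detection
        bounds = np.concatenate(([0], breaks, [days]))

        for lo, hi in zip(bounds[:-1], bounds[1:]):
            t = np.arange(hi - lo)
            rate = np.polyfit(t, index[lo:hi], 1)[0]   # index units per day
            print(f"days {lo}-{hi - 1}: soiling rate {100 * rate:.2f} %/day")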

  3. Quantifying Soiling Loss Directly From PV Yield

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deceglie, Michael G.; Micheli, Leonardo; Muller, Matthew

Soiling of photovoltaic (PV) panels is typically quantified through the use of specialized sensors. Here, we describe and validate a method for estimating soiling loss experienced by PV systems directly from system yield without the need for precipitation data. The method, termed the stochastic rate and recovery (SRR) method, automatically detects soiling intervals in a dataset, then stochastically generates a sample of possible soiling profiles based on the observed characteristics of each interval. In this paper, we describe the method, validate it against soiling station measurements, and compare it with other PV-yield-based soiling estimation methods. The broader application of the SRR method will enable fleet-scale assessment of soiling loss to facilitate mitigation planning and risk assessment.

  4. Evidence flow graph methods for validation and verification of expert systems

    NASA Technical Reports Server (NTRS)

    Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant

    1989-01-01

    The results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems are given. A translator to convert horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis were developed. These tools were then applied to a simple rule base which contained errors. It was found that the method was capable of identifying a variety of problems, for example that the order of presentation of input data or small changes in critical parameters could affect the output from a set of rules.

  5. Validation of reactive gases and aerosols in the MACC global analysis and forecast system

    NASA Astrophysics Data System (ADS)

    Eskes, H.; Huijnen, V.; Arola, A.; Benedictow, A.; Blechschmidt, A.-M.; Botek, E.; Boucher, O.; Bouarar, I.; Chabrillat, S.; Cuevas, E.; Engelen, R.; Flentje, H.; Gaudel, A.; Griesfeller, J.; Jones, L.; Kapsomenakis, J.; Katragkou, E.; Kinne, S.; Langerock, B.; Razinger, M.; Richter, A.; Schultz, M.; Schulz, M.; Sudarchikova, N.; Thouret, V.; Vrekoussis, M.; Wagner, A.; Zerefos, C.

    2015-02-01

    The European MACC (Monitoring Atmospheric Composition and Climate) project is preparing the operational Copernicus Atmosphere Monitoring Service (CAMS), one of the services of the European Copernicus Programme on Earth observation and environmental services. MACC uses data assimilation to combine in-situ and remote sensing observations with global and regional models of atmospheric reactive gases, aerosols and greenhouse gases, and is based on the Integrated Forecast System of the ECMWF. The global component of the MACC service has a dedicated validation activity to document the quality of the atmospheric composition products. In this paper we discuss the approach to validation that has been developed over the past three years. Topics discussed are the validation requirements, the operational aspects, the measurement data sets used, the structure of the validation reports, the models and assimilation systems validated, the procedure to introduce new upgrades, and the scoring methods. One specific target of the MACC system concerns forecasting special events with high pollution concentrations. Such events receive extra attention in the validation process. Finally, a summary is provided of the results from the validation of the latest set of daily global analysis and forecast products from the MACC system reported in November 2014.

  6. System and method for forward error correction

    NASA Technical Reports Server (NTRS)

    Cole, Robert M. (Inventor); Bishop, James E. (Inventor)

    2006-01-01

    A system and method are provided for transferring a packet across a data link. The packet may include a stream of data symbols which is delimited by one or more framing symbols. Corruptions of the framing symbol which result in valid data symbols may be mapped to invalid symbols. If it is desired to transfer one of the valid data symbols that has been mapped to an invalid symbol, the data symbol may be replaced with an unused symbol. At the receiving end, these unused symbols are replaced with the corresponding valid data symbols. The data stream of the packet may be encoded with forward error correction information to detect and correct errors in the data stream.

  7. System and method for transferring data on a data link

    NASA Technical Reports Server (NTRS)

    Cole, Robert M. (Inventor); Bishop, James E. (Inventor)

    2007-01-01

    A system and method are provided for transferring a packet across a data link. The packet may include a stream of data symbols which is delimited by one or more framing symbols. Corruptions of the framing symbol which result in valid data symbols may be mapped to invalid symbols. If it is desired to transfer one of the valid data symbols that has been mapped to an invalid symbol, the data symbol may be replaced with an unused symbol. At the receiving end, these unused symbols are replaced with the corresponding valid data symbols. The data stream of the packet may be encoded with forward error correction information to detect and correct errors in the data stream.
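
    A toy sketch of the symbol-substitution idea in these two patent abstracts follows: data symbols that could be confused with a corrupted framing symbol are treated as invalid, an unused symbol is substituted on transmit, and the receiver maps it back. The symbol values and the single-bit corruption model are invented for illustration and are not taken from the patents.

        # Toy illustration of mapping confusable data symbols to unused
        # out-of-band codes on transmit and restoring them on receive.
        FRAME = 0x7E
        # Assume single-bit corruptions of FRAME that land on valid data symbols:
        CONFUSABLE = {FRAME ^ (1 << b) for b in range(8)}
        UNUSED = {0x100 + i for i in range(len(CONFUSABLE))}   # out-of-band codes
        tx_map = dict(zip(sorted(CONFUSABLE), sorted(UNUSED)))
        rx_map = {v: k for k, v in tx_map.items()}

        def encode(data):
            return [tx_map.get(s, s) for s in data]

        def decode(symbols):
            return [rx_map.get(s, s) for s in symbols]

        payload = [0x7F, 0x41, 0x5E]   # 0x7F and 0x5E collide with CONFUSABLE
        assert decode(encode(payload)) == payload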

  8. Validation techniques for fault emulation of SRAM-based FPGAs

    DOE PAGES

    Quinn, Heather; Wirthlin, Michael

    2015-08-07

A variety of fault emulation systems have been created to study the effect of single-event effects (SEEs) in static random access memory (SRAM) based field-programmable gate arrays (FPGAs). These systems are useful for augmenting radiation-hardness assurance (RHA) methodologies; for verifying the effectiveness of mitigation techniques; for understanding error signatures and failure modes in FPGAs; and for failure rate estimation. For radiation effects researchers, it is important that these systems properly emulate how SEEs manifest in FPGAs. If the fault emulation system does not mimic the radiation environment, the system will generate erroneous data and incorrect predictions of the behavior of the FPGA in a radiation environment. Validation determines whether the emulated faults are reasonable analogs to the radiation-induced faults. In this study we present methods for validating fault emulation systems and provide several examples of validated FPGA fault emulation systems.

  9. A generic minimization random allocation and blinding system on web.

    PubMed

    Cai, Hongwei; Xia, Jielai; Xu, Dezhong; Gao, Donghuai; Yan, Yongping

    2006-12-01

Minimization is a dynamic randomization method for clinical trials. Although recommended by many researchers, the use of minimization has seldom been reported in randomized trials, mainly because of the controversy surrounding the validity of conventional analyses and its complexity of implementation. However, both the statistical and clinical validity of minimization were demonstrated in recent studies. A minimization random allocation system integrated with a blinding function, which could facilitate the implementation of this method in general clinical trials, has not previously been reported. SYSTEM OVERVIEW: The system is a web-based random allocation system using the Pocock and Simon minimization method. It also supports multiple treatment arms within a trial, multiple simultaneous trials, and blinding without further programming. The system was constructed with a generic database schema design method, the Pocock and Simon minimization method, and a blinding method. It was coded with the Microsoft Visual Basic and Active Server Pages (ASP) programming languages, and all datasets were managed with a Microsoft SQL Server database. Some critical programming codes are also provided. SIMULATIONS AND RESULTS: Two clinical trials were simulated simultaneously to test the system's applicability. Not only balanced groups but also blinded allocation results were achieved in both trials. Practical considerations for the minimization method, and the benefits, general applicability and drawbacks of the technique implemented in this system, are discussed. Promising features of the proposed system are also summarized.
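
    For readers unfamiliar with the allocation algorithm itself, a minimal sketch of Pocock and Simon minimization follows; the range imbalance criterion, the two-arm setup, and the 0.8 biased-coin probability are illustrative choices, not necessarily those of the described system.

        # Minimal sketch of Pocock-Simon minimization: assign each patient
        # to the arm minimizing total imbalance across prognostic factors,
        # with a biased coin to retain randomness.
        import random
        from collections import defaultdict

        ARMS = ("A", "B")

        def imbalance_if(arm, patient, counts):
            # Range criterion: sum over factors of the spread of marginal
            # counts after a hypothetical assignment to `arm`.
            total = 0
            for factor, level in patient.items():
                marginals = [counts[a][(factor, level)] + (1 if a == arm else 0)
                             for a in ARMS]
                total += max(marginals) - min(marginals)
            return total

        def assign(patient, counts, p_best=0.8):
            scores = {arm: imbalance_if(arm, patient, counts) for arm in ARMS}
            best = min(scores, key=scores.get)
            others = [a for a in ARMS if a != best]
            arm = best if random.random() < p_best else random.choice(others)
            for factor, level in patient.items():
                counts[arm][(factor, level)] += 1
            return arm

        counts = {a: defaultdict(int) for a in ARMS}
        for patient in ({"sex": "F", "age": "60+"}, {"sex": "M", "age": "<60"},
                        {"sex": "F", "age": "<60"}):
            print(assign(patient, counts))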

  10. Empirical evaluation of decision support systems: Needs, definitions, potential methods, and an example pertaining to waterfowl management

    USGS Publications Warehouse

    Sojda, R.S.

    2007-01-01

    Decision support systems are often not empirically evaluated, especially the underlying modelling components. This can be attributed to such systems necessarily being designed to handle complex and poorly structured problems and decision making. Nonetheless, evaluation is critical and should be focused on empirical testing whenever possible. Verification and validation, in combination, comprise such evaluation. Verification is ensuring that the system is internally complete, coherent, and logical from a modelling and programming perspective. Validation is examining whether the system is realistic and useful to the user or decision maker, and should answer the question: “Was the system successful at addressing its intended purpose?” A rich literature exists on verification and validation of expert systems and other artificial intelligence methods; however, no single evaluation methodology has emerged as preeminent. At least five approaches to validation are feasible. First, under some conditions, decision support system performance can be tested against a preselected gold standard. Second, real-time and historic data sets can be used for comparison with simulated output. Third, panels of experts can be judiciously used, but often are not an option in some ecological domains. Fourth, sensitivity analysis of system outputs in relation to inputs can be informative. Fifth, when validation of a complete system is impossible, examining major components can be substituted, recognizing the potential pitfalls. I provide an example of evaluation of a decision support system for trumpeter swan (Cygnus buccinator) management that I developed using interacting intelligent agents, expert systems, and a queuing system. Predicted swan distributions over a 13-year period were assessed against observed numbers. Population survey numbers and banding (ringing) studies may provide long term data useful in empirical evaluation of decision support.

  11. Fuzzy Logic Controller Stability Analysis Using a Satisfiability Modulo Theories Approach

    NASA Technical Reports Server (NTRS)

    Arnett, Timothy; Cook, Brandon; Clark, Matthew A.; Rattan, Kuldip

    2017-01-01

While many widely accepted methods and techniques exist for validation and verification of traditional controllers, at this time no solutions have been accepted for Fuzzy Logic Controllers (FLCs). Due to the highly nonlinear nature of such systems, and the fact that developing a valid FLC does not require a mathematical model of the system, it is quite difficult to use conventional techniques to prove controller stability. Since safety-critical systems must be tested and verified to work as expected for all possible circumstances, the fact that FLCs cannot be tested to achieve such requirements poses limitations on the applications for such technology. Therefore, alternative methods for verification and validation of FLCs need to be explored. In this study, a novel approach using formal verification methods to ensure the stability of a FLC is proposed. Main research challenges include specification of requirements for a complex system, conversion of a traditional FLC to a piecewise polynomial representation, and using a formal verification tool in a nonlinear solution space. Using the proposed architecture, the Fuzzy Logic Controller was found to always generate negative feedback, but the results were inconclusive for Lyapunov stability.

  12. Model-based verification and validation of the SMAP uplink processes

    NASA Astrophysics Data System (ADS)

    Khan, M. O.; Dubos, G. F.; Tirona, J.; Standley, S.

Model-Based Systems Engineering (MBSE) is being used increasingly within the spacecraft design community because of its benefits when compared to document-based approaches. As the complexity of projects expands dramatically with continually increasing computational power and technology infusion, the time and effort needed for verification and validation (V&V) increases geometrically. Using simulation to perform design validation with system-level models earlier in the life cycle stands to bridge the gap between design of the system (based on system-level requirements) and verifying those requirements/validating the system as a whole. This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based development efforts.

  13. Evaluation of three different validation procedures regarding the accuracy of template-guided implant placement: an in vitro study.

    PubMed

    Vasak, Christoph; Strbac, Georg D; Huber, Christian D; Lettner, Stefan; Gahleitner, André; Zechner, Werner

    2015-02-01

The study aims to evaluate the accuracy of the NobelGuide™ (Medicim/Nobel Biocare, Göteborg, Sweden) concept while maximally reducing the influence of clinical and surgical parameters. Moreover, the study aimed to compare and validate two validation procedures against a reference method. Overall, 60 implants were placed in 10 artificial edentulous mandibles according to the NobelGuide™ protocol. For merging the pre- and postoperative DICOM data sets, three different fusion methods (Triple Scan Technique, NobelGuide™ Validation software, and AMIRA® software [VSG - Visualization Sciences Group, Burlington, MA, USA] as reference) were applied. Discrepancies between the virtual and the actual implant positions were measured. The mean deviations measured with AMIRA® were 0.49 mm (implant shoulder), 0.69 mm (implant apex), and 1.98° (implant axis). The Triple Scan Technique as well as the NobelGuide™ Validation software revealed similar deviations compared with the reference method. A significant correlation between angular and apical deviations was seen (r = 0.53; p < .001). A greater implant diameter was associated with greater deviations (p = .03). The Triple Scan Technique as a system-independent validation procedure as well as the NobelGuide™ Validation software are in accordance with the AMIRA® software. The NobelGuide™ system showed similar or smaller spatial and angular deviations compared with others. © 2013 Wiley Periodicals, Inc.
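
    The deviation measures reported above reduce to simple vector arithmetic once planned and actual implant positions are expressed in a common coordinate system. The sketch below computes shoulder error, apex error, and axis angle from invented coordinates.

        # Sketch of positional and angular implant deviations; all
        # coordinates (in mm) are invented for illustration.
        import numpy as np

        planned_shoulder, planned_apex = np.array([0.0, 0, 0]), np.array([0, 0, -11.0])
        actual_shoulder, actual_apex = np.array([0.4, 0.2, 0.1]), np.array([0.9, 0.3, -10.8])

        shoulder_err = np.linalg.norm(actual_shoulder - planned_shoulder)
        apex_err = np.linalg.norm(actual_apex - planned_apex)

        # Angle between the planned and actual implant axes.
        u = planned_apex - planned_shoulder
        v = actual_apex - actual_shoulder
        cos_angle = u @ v / (np.linalg.norm(u) * np.linalg.norm(v))
        angle_deg = np.degrees(np.arccos(np.clip(cos_angle, -1, 1)))
        print(f"shoulder {shoulder_err:.2f} mm, apex {apex_err:.2f} mm, axis {angle_deg:.2f} deg")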

  14. Using Lunar Observations to Validate In-Flight Calibrations of Clouds and Earth Radiant Energy System Instruments

    NASA Technical Reports Server (NTRS)

    Daniels, Janet L.; Smith, G. Louis; Priestley, Kory J.; Thomas, Susan

    2014-01-01

The validation of in-orbit instrument performance requires stability in both instrument and calibration source. This paper describes a method of validation using lunar observations scanning near full moon by the Clouds and Earth Radiant Energy System (CERES) instruments. Unlike internal calibrations, the Moon offers an external source whose signal variance is predictable and non-degrading. From 2006 to present, in-orbit observations have become standardized and compiled for Flight Models-1 and -2 aboard the Terra satellite, for Flight Models-3 and -4 aboard the Aqua satellite, and, beginning 2012, for Flight Model-5 aboard Suomi-NPP. Instrument performance parameters that can be gleaned include detector gain, pointing accuracy, and static detector point response function. Lunar observations are used to examine the stability of all three detectors on each of these instruments from 2006 to present. This validation method has yielded results showing trends per CERES data channel of 1.2% per decade or less.

  15. 3D inversion of full gravity gradient tensor data in spherical coordinate system using local north-oriented frame

    NASA Astrophysics Data System (ADS)

    Zhang, Yi; Wu, Yulong; Yan, Jianguo; Wang, Haoran; Rodriguez, J. Alexis P.; Qiu, Yue

    2018-04-01

In this paper, we propose an inverse method for full gravity gradient tensor data in the spherical coordinate system. As opposed to traditional gravity inversion in the Cartesian coordinate system, our proposed method takes the curvature of the Earth, the Moon, or other planets into account, using tesseroid bodies to produce gravity gradient effects in forward modeling. We used both synthetic and observed datasets to test the stability and validity of the proposed method. Our results using synthetic gravity data show that our new method predicts the depth of the density-anomalous body efficiently and accurately. Using observed gravity data for the Mare Smythii area on the Moon, the density distribution of the crust in this area reveals its geological structure. These results validate the proposed method and its potential application for large-area data inversion of planetary geological structures.

  16. Validation of biological activity testing procedure of recombinant human interleukin-7.

    PubMed

    Lutsenko, T N; Kovalenko, M V; Galkin, O Yu

    2017-01-01

A validation procedure for the method of monitoring the biological activity of recombinant human interleukin-7 has been developed and conducted according to the requirements of national and international recommendations. The method is based on the ability of recombinant human interleukin-7 to induce proliferation of T lymphocytes. It has been shown that peripheral blood mononuclear cells (PBMCs) derived from blood, or cell lines, can be used to control the biological activity of recombinant human interleukin-7. The validation characteristics that should be determined depend on the method, the type of product or object of the test/measurement, and the biological test systems used in the research. The validation procedure for the method of control of the biological activity of recombinant human interleukin-7 in peripheral blood mononuclear cells showed satisfactory results on all parameters tested, such as specificity, accuracy, precision and linearity.
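
    Proliferation-based potency assays of this kind are commonly summarized by fitting a four-parameter logistic (4PL) dose-response curve and reading off the EC50. The sketch below shows such a fit on simulated data; it illustrates the general technique, not the article's exact procedure.

        # Sketch of a 4PL dose-response fit for a proliferation readout.
        import numpy as np
        from scipy.optimize import curve_fit

        def four_pl(x, bottom, top, ec50, hill):
            return bottom + (top - bottom) / (1 + (ec50 / x) ** hill)

        dose = np.array([0.01, 0.03, 0.1, 0.3, 1, 3, 10, 30])               # ng/mL IL-7
        resp = np.array([0.11, 0.15, 0.28, 0.55, 0.92, 1.21, 1.34, 1.38])   # OD units

        (bottom, top, ec50, hill), _ = curve_fit(
            four_pl, dose, resp, p0=(0.1, 1.4, 0.5, 1.0))
        print(f"EC50 = {ec50:.2f} ng/mL, Hill slope = {hill:.2f}")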

  17. Evidence flow graph methods for validation and verification of expert systems

    NASA Technical Reports Server (NTRS)

    Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant

    1988-01-01

This final report describes the results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems. This was approached by developing a translator to convert horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis. These tools were then applied to a simple rule base which contained errors. It was found that the method was capable of identifying a variety of problems, for example that the order of presentation of input data or small changes in critical parameters could affect the output from a set of rules.

  18. Integration of design and inspection

    NASA Astrophysics Data System (ADS)

    Simmonds, William H.

    1990-08-01

    Developments in advanced computer integrated manufacturing technology, coupled with the emphasis on Total Quality Management, are exposing needs for new techniques to integrate all functions from design through to support of the delivered product. One critical functional area that must be integrated into design is that embracing the measurement, inspection and test activities necessary for validation of the delivered product. This area is being tackled by a collaborative project supported by the UK Government Department of Trade and Industry. The project is aimed at developing techniques for analysing validation needs and for planning validation methods. Within the project an experimental Computer Aided Validation Expert system (CAVE) is being constructed. This operates with a generalised model of the validation process and helps with all design stages: specification of product requirements; analysis of the assurance provided by a proposed design and method of manufacture; development of the inspection and test strategy; and analysis of feedback data. The kernel of the system is a knowledge base containing knowledge of the manufacturing process capabilities and of the available inspection and test facilities. The CAVE system is being integrated into a real life advanced computer integrated manufacturing facility for demonstration and evaluation.

  19. Relationships between the decoupled and coupled transfer functions: Theoretical studies and experimental validation

    NASA Astrophysics Data System (ADS)

    Wang, Zengwei; Zhu, Ping; Liu, Zhao

    2018-01-01

    A generalized method for predicting the decoupled transfer functions based on in-situ transfer functions is proposed. The method allows predicting the decoupled transfer functions using coupled transfer functions, without disassembling the system. Two ways to derive relationships between the decoupled and coupled transfer functions are presented. Issues related to immeasurability of coupled transfer functions are also discussed. The proposed method is validated by numerical and experimental case studies.

  20. Automatic, semi-automatic and manual validation of urban drainage data.

    PubMed

    Branisavljević, N; Prodanović, D; Pavlović, D

    2010-01-01

Advances in sensor technology and the possibility of automated long-distance data transmission have made continuous measurements the preferable way of monitoring urban drainage processes. Usually, the collected data have to be processed by an expert in order to detect and mark the wrong data, remove them and replace them with interpolated data. In general, the first step, detecting the wrong, anomalous data, is called data quality assessment or data validation. Data validation consists of three parts: data preparation, validation scores generation and scores interpretation. This paper presents the overall framework for the data quality improvement system, suitable for automatic, semi-automatic or manual operation. The first two steps of the validation process are explained in more detail, using several validation methods on the same set of real-case data from the Belgrade sewer system. The final part of the validation process, the scores interpretation, needs to be further investigated on the developed system.
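
    The validation scores generation step can be illustrated by running several independent checks over the same series and keeping the scores side by side for later interpretation. In the sketch below the thresholds, the robust-outlier criterion, and the water-level data are all illustrative.

        # Sketch of generating per-point validation scores from several
        # independent checks; data and thresholds are invented.
        import numpy as np
        import pandas as pd

        level = pd.Series([0.42, 0.44, 0.43, 3.90, 0.45, 0.46, -0.10, 0.47])  # water level, m

        mad = (level - level.median()).abs().median()   # robust spread estimate
        scores = pd.DataFrame({
            "range_ok": level.between(0.0, 2.0),                   # physical limits
            "rate_ok": level.diff().abs().fillna(0) < 0.5,         # max change per step
            "stat_ok": (level - level.median()).abs() < 3.5 * 1.4826 * mad,
        })
        scores["suspect"] = ~scores.all(axis=1)
        print(pd.concat([level.rename("level_m"), scores], axis=1))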

  1. Validation of model-based deformation correction in image-guided liver surgery via tracked intraoperative ultrasound: preliminary method and results

    NASA Astrophysics Data System (ADS)

    Clements, Logan W.; Collins, Jarrod A.; Wu, Yifei; Simpson, Amber L.; Jarnagin, William R.; Miga, Michael I.

    2015-03-01

    Soft tissue deformation represents a significant error source in current surgical navigation systems used for open hepatic procedures. While numerous algorithms have been proposed to rectify the tissue deformation that is encountered during open liver surgery, clinical validation of the proposed methods has been limited to surface based metrics and sub-surface validation has largely been performed via phantom experiments. Tracked intraoperative ultrasound (iUS) provides a means to digitize sub-surface anatomical landmarks during clinical procedures. The proposed method involves the validation of a deformation correction algorithm for open hepatic image-guided surgery systems via sub-surface targets digitized with tracked iUS. Intraoperative surface digitizations were acquired via a laser range scanner and an optically tracked stylus for the purposes of computing the physical-to-image space registration within the guidance system and for use in retrospective deformation correction. Upon completion of surface digitization, the organ was interrogated with a tracked iUS transducer where the iUS images and corresponding tracked locations were recorded. After the procedure, the clinician reviewed the iUS images to delineate contours of anatomical target features for use in the validation procedure. Mean closest point distances between the feature contours delineated in the iUS images and corresponding 3-D anatomical model generated from the preoperative tomograms were computed to quantify the extent to which the deformation correction algorithm improved registration accuracy. The preliminary results for two patients indicate that the deformation correction method resulted in a reduction in target error of approximately 50%.
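
    The target-error metric, the mean closest-point distance from iUS-delineated contour points to the preoperative surface model, is straightforward to compute with a spatial index. The sketch below uses synthetic point sets with an artificial offset.

        # Sketch of a mean closest-point distance between contour points
        # and model vertices; both point sets are synthetic.
        import numpy as np
        from scipy.spatial import cKDTree

        model_vertices = np.random.default_rng(2).uniform(0, 100, (5000, 3))  # mm
        contour_points = model_vertices[:200] + 4.0   # synthetic 4 mm offset

        tree = cKDTree(model_vertices)
        distances, _ = tree.query(contour_points)     # closest vertex per point
        print(f"mean closest-point distance: {distances.mean():.2f} mm")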

  2. Evaluation and validation of a multi-residue method based on biochip technology for the simultaneous screening of six families of antibiotics in muscle and aquaculture products.

    PubMed

    Gaudin, Valérie; Hedou, Celine; Soumet, Christophe; Verdon, Eric

    2016-01-01

The Evidence Investigator™ system (Randox, UK) is a semi-automated biochip system. The microarray kit II (AM II) is capable of detecting several compounds belonging to different families of antibiotics: quinolones, ceftiofur, thiamphenicol, streptomycin, tylosin and tetracyclines. The performance of this innovative system was evaluated for the detection of antibiotic residues in new matrices, in muscle of different animal species and in aquaculture products. The method was validated according to European Decision No. EC/2002/657 and the European guideline for the validation of screening methods, which represents a complete initial validation. The false-positive rate was equal to 0% in muscle and in aquaculture products. The detection capabilities CCβ for 12 validated antibiotics (enrofloxacin, difloxacin, ceftiofur, desfuroyl ceftiofur cysteine disulfide, thiamphenicol, florfenicol, tylosin, tilmicosin, streptomycin, dihydrostreptomycin, tetracycline, doxycycline) were all lower than the respective maximum residue limits (MRLs) in muscle from different animal origins (bovine, ovine, porcine, poultry). No cross-reactions were observed with other antibiotics, neither within the six detected families nor with other families of antibiotics. The AM II kit could be applied to aquaculture products, but with higher detection capabilities than those in muscle. The detection capabilities CCβ in aquaculture products were respectively 0.25, 0.10 and 0.5 of the respective MRL for enrofloxacin, tylosin and oxytetracycline. The performance of the AM II kit was compared with other screening methods and with the performance characteristics previously determined in honey.

  3. Modeling Terrorism Risk to the Air Transportation System: An Independent Assessment of TSA’s Risk Management Analysis Tool and Associated Methods

    DTIC Science & Technology

    2012-01-01

our own work for this discussion. DoD Instruction 5000.61 defines model validation as "the process of determining the degree to which a model and its... determined that RMAT is highly concrete code, potentially leading to redundancies in the code itself and making RMAT more difficult to maintain...system conceptual models valid, and are the data used to support them adequate? (Chapters Two and Three) 2. Are the sources and methods for populating

  4. Pose Self-Calibration of Stereo Vision Systems for Autonomous Vehicle Applications.

    PubMed

    Musleh, Basam; Martín, David; Armingol, José María; de la Escalera, Arturo

    2016-09-14

    Nowadays, intelligent systems applied to vehicles have grown very rapidly; their goal is not only the improvement of safety, but also making autonomous driving possible. Many of these intelligent systems are based on making use of computer vision in order to know the environment and act accordingly. It is of great importance to be able to estimate the pose of the vision system because the measurement matching between the perception system (pixels) and the vehicle environment (meters) depends on the relative position between the perception system and the environment. A new method of camera pose estimation for stereo systems is presented in this paper, whose main contribution regarding the state of the art on the subject is the estimation of the pitch angle without being affected by the roll angle. The validation of the self-calibration method is accomplished by comparing it with relevant methods of camera pose estimation, where a synthetic sequence is used in order to measure the continuous error with a ground truth. This validation is enriched by the experimental results of the method in real traffic environments.

  5. Pose Self-Calibration of Stereo Vision Systems for Autonomous Vehicle Applications

    PubMed Central

    Musleh, Basam; Martín, David; Armingol, José María; de la Escalera, Arturo

    2016-01-01

    Nowadays, intelligent systems applied to vehicles have grown very rapidly; their goal is not only the improvement of safety, but also making autonomous driving possible. Many of these intelligent systems are based on making use of computer vision in order to know the environment and act accordingly. It is of great importance to be able to estimate the pose of the vision system because the measurement matching between the perception system (pixels) and the vehicle environment (meters) depends on the relative position between the perception system and the environment. A new method of camera pose estimation for stereo systems is presented in this paper, whose main contribution regarding the state of the art on the subject is the estimation of the pitch angle without being affected by the roll angle. The validation of the self-calibration method is accomplished by comparing it with relevant methods of camera pose estimation, where a synthetic sequence is used in order to measure the continuous error with a ground truth. This validation is enriched by the experimental results of the method in real traffic environments. PMID:27649178

  6. Validation of reactive gases and aerosols in the MACC global analysis and forecast system

    NASA Astrophysics Data System (ADS)

    Eskes, H.; Huijnen, V.; Arola, A.; Benedictow, A.; Blechschmidt, A.-M.; Botek, E.; Boucher, O.; Bouarar, I.; Chabrillat, S.; Cuevas, E.; Engelen, R.; Flentje, H.; Gaudel, A.; Griesfeller, J.; Jones, L.; Kapsomenakis, J.; Katragkou, E.; Kinne, S.; Langerock, B.; Razinger, M.; Richter, A.; Schultz, M.; Schulz, M.; Sudarchikova, N.; Thouret, V.; Vrekoussis, M.; Wagner, A.; Zerefos, C.

    2015-11-01

    The European MACC (Monitoring Atmospheric Composition and Climate) project is preparing the operational Copernicus Atmosphere Monitoring Service (CAMS), one of the services of the European Copernicus Programme on Earth observation and environmental services. MACC uses data assimilation to combine in situ and remote sensing observations with global and regional models of atmospheric reactive gases, aerosols, and greenhouse gases, and is based on the Integrated Forecasting System of the European Centre for Medium-Range Weather Forecasts (ECMWF). The global component of the MACC service has a dedicated validation activity to document the quality of the atmospheric composition products. In this paper we discuss the approach to validation that has been developed over the past 3 years. Topics discussed are the validation requirements, the operational aspects, the measurement data sets used, the structure of the validation reports, the models and assimilation systems validated, the procedure to introduce new upgrades, and the scoring methods. One specific target of the MACC system concerns forecasting special events with high-pollution concentrations. Such events receive extra attention in the validation process. Finally, a summary is provided of the results from the validation of the latest set of daily global analysis and forecast products from the MACC system reported in November 2014.

  7. Apparatus and method for managing digital resources by passing digital resource tokens between queues

    DOEpatents

    Crawford, H.J.; Lindenstruth, V.

    1999-06-29

    A method of managing digital resources of a digital system includes the step of reserving token values for certain digital resources in the digital system. A selected token value in a free-buffer-queue is then matched to an incoming digital resource request. The selected token value is then moved to a valid-request-queue. The selected token is subsequently removed from the valid-request-queue to allow a digital agent in the digital system to process the incoming digital resource request associated with the selected token. Thereafter, the selected token is returned to the free-buffer-queue. 6 figs.

  8. Apparatus and method for managing digital resources by passing digital resource tokens between queues

    DOEpatents

    Crawford, Henry J.; Lindenstruth, Volker

    1999-01-01

    A method of managing digital resources of a digital system includes the step of reserving token values for certain digital resources in the digital system. A selected token value in a free-buffer-queue is then matched to an incoming digital resource request. The selected token value is then moved to a valid-request-queue. The selected token is subsequently removed from the valid-request-queue to allow a digital agent in the digital system to process the incoming digital resource request associated with the selected token. Thereafter, the selected token is returned to the free-buffer-queue.
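
    A toy sketch of the token-passing scheme in these two patent abstracts follows: tokens circulate between a free-buffer queue and a valid-request queue and are returned once the digital agent has serviced the request. Queue sizes and request strings are invented.

        # Toy illustration of resource tokens passed between two queues.
        from collections import deque

        free_buffer = deque(range(8))        # reserved token values
        valid_request = deque()

        def submit(request):
            if not free_buffer:
                raise RuntimeError("no resource tokens available")
            token = free_buffer.popleft()    # match token to incoming request
            valid_request.append((token, request))
            return token

        def service():
            token, request = valid_request.popleft()   # agent processes request
            print(f"token {token}: handling {request}")
            free_buffer.append(token)        # return token to free-buffer queue

        submit("read block 17")
        submit("write block 4")
        service()
        service()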

  9. RTL validation methodology on high complexity wireless microcontroller using OVM technique for fast time to market

    NASA Astrophysics Data System (ADS)

    Zhafirah Muhammad, Nurul; Harun, A.; Hambali, N. A. M. A.; Murad, S. A. Z.; Mohyar, S. N.; Isa, M. N.; Jambek, AB

    2017-11-01

Increased demand for Internet of Things (IoT) based applications has forced a move towards higher-complexity integrated circuits supporting SoC designs. Such rapid increases in complexity pose correspondingly complicated validation challenges and have led researchers to develop a variety of methodologies to address them; this, in essence, brought about dynamic verification, formal verification and hybrid techniques. Moreover, it is very important to discover bugs early in the SoC verification process in order to reduce time consumed and achieve fast time to market for the system. In this paper we therefore focus on a verification methodology that can be applied at the Register Transfer Level of an SoC based on the AMBA bus design. In addition, the Open Verification Methodology (OVM) offers an easier approach to RTL validation, not as a replacement for the traditional method but as an effort towards fast time to market for the system. Thus, OVM is proposed in this paper as the verification method for larger designs, to avoid bottlenecks in the validation platform.

  10. Calibration of Clinical Audio Recording and Analysis Systems for Sound Intensity Measurement.

    PubMed

    Maryn, Youri; Zarowski, Andrzej

    2015-11-01

    Sound intensity is an important acoustic feature of voice/speech signals. Yet recordings are performed with different microphone, amplifier, and computer configurations, and it is therefore crucial to calibrate sound intensity measures of clinical audio recording and analysis systems on the basis of output of a sound-level meter. This study was designed to evaluate feasibility, validity, and accuracy of calibration methods, including audiometric speech noise signals and human voice signals under typical speech conditions. Calibration consisted of 3 comparisons between data from 29 measurement microphone-and-computer systems and data from the sound-level meter: signal-specific comparison with audiometric speech noise at 5 levels, signal-specific comparison with natural voice at 3 levels, and cross-signal comparison with natural voice at 3 levels. Intensity measures from recording systems were then linearly converted into calibrated data on the basis of these comparisons, and validity and accuracy of calibrated sound intensity were investigated. Very strong correlations and quasisimilarity were found between calibrated data and sound-level meter data across calibration methods and recording systems. Calibration of clinical sound intensity measures according to this method is feasible, valid, accurate, and representative for a heterogeneous set of microphones and data acquisition systems in real-life circumstances with distinct noise contexts.
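
    The calibration itself amounts to a linear conversion fitted between the recording system's uncalibrated intensity measures and the sound-level-meter readings. The sketch below shows the fit and its application to a new measurement; the dB values are illustrative.

        # Sketch of a linear calibration of system intensity measures
        # against sound-level-meter readings; values are invented.
        import numpy as np

        system_dB = np.array([52.1, 60.3, 68.0, 75.8, 83.5])   # recording system
        meter_dB = np.array([55.0, 63.0, 71.0, 79.0, 87.0])    # sound-level meter

        slope, intercept = np.polyfit(system_dB, meter_dB, 1)

        def calibrate(x):
            return slope * x + intercept

        print(f"calibrated 70.0 -> {calibrate(70.0):.1f} dB SPL")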

  11. Validation and evaluation of the advanced aeronautical CFD system SAUNA: A method developer's view

    NASA Astrophysics Data System (ADS)

    Shaw, J. A.; Peace, A. J.; Georgala, J. M.; Childs, P. N.

    1993-09-01

    This paper is concerned with a detailed validation and evaluation of the SAUNA CFD system for complex aircraft configurations. The methodology of the complete system is described in brief, including its unique use of differing grid generation strategies (structured, unstructured or both) depending on the geometric complexity of the configuration. A wide range of configurations and flow conditions are chosen in the validation and evaluation exercise to demonstrate the scope of SAUNA. A detailed description of the results from the method is preceded by a discussion on the philosophy behind the strategy followed in the exercise, in terms of quality assessment and the differing roles of the code developer and the code user. It is considered that SAUNA has grown into a highly usable tool for the aircraft designer, in combining flexibility and accuracy in an efficient manner.

  12. 40 CFR Table 3 of Subpart Aaaa to... - Requirements for Validating Continuous Emission Monitoring Systems (CEMS)

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Emission Monitoring Systems (CEMS) 3 Table 3 of Subpart AAAA to Part 60 Protection of Environment... SOURCES Pt. 60, Subpt. AAAA, Table 3 Table 3 of Subpart AAAA to Part 60—Requirements for Validating... following methods in appendix A of this part to measure oxygen (or carbon dioxide) 1. Nitrogen Oxides (Class...

  13. Design and landing dynamic analysis of reusable landing leg for a near-space manned capsule

    NASA Astrophysics Data System (ADS)

    Yue, Shuai; Nie, Hong; Zhang, Ming; Wei, Xiaohui; Gan, Shengyong

    2018-06-01

    To improve the landing performance of a near-space manned capsule under various landing conditions, a novel landing system is designed that employs double chamber and single chamber dampers in the primary and auxiliary struts, respectively. A dynamic model of the landing system is established, and the damper parameters are determined by employing the design method. A single-leg drop test with different initial pitch angles is then conducted to compare and validate the simulation model. Based on the validated simulation model, seven critical landing conditions regarding nine crucial landing responses are found by combining the radial basis function (RBF) surrogate model and adaptive simulated annealing (ASA) optimization method. Subsequently, the adaptability of the landing system under critical landing conditions is analyzed. The results show that the simulation results effectively match the test results, which validates the accuracy of the dynamic model. In addition, all of the crucial responses under their corresponding critical landing conditions satisfy the design specifications, demonstrating the feasibility of the landing system.

  14. On demand processing of climate station sensor data

    NASA Astrophysics Data System (ADS)

    Wöllauer, Stephan; Forteva, Spaska; Nauss, Thomas

    2015-04-01

    Large sets of climate stations with several sensors produce large amounts of fine-grained time series data. To gain value from these data, further processing and aggregation are needed. We present a flexible system to process the raw data on demand. Several aspects need to be considered so that scientists can use the processed data conveniently for their specific research interests. First of all, it is not feasible to pre-process the data in advance because of the great variety of ways it can be processed. Therefore, in this approach only the raw measurement data is archived in a database. When a scientist requires a time series, the system processes the required raw data according to the user-defined request. Depending on the type of measurement sensor, some data validation is needed, because the climate station sensors may produce erroneous data. Currently, three validation methods are integrated in the on-demand processing system and are optionally selectable. The most basic validation method checks whether measurement values lie within a predefined range of possible values. For example, it may be assumed that an air temperature sensor measures values within a range of -40 °C to +60 °C; values outside this range are considered a measurement error by this validation method and consequently rejected. Another validation method checks for outliers in the stream of measurement values by defining a maximum change rate between subsequent measurement values. The third validation method compares measurement data to the average values of neighboring stations and rejects measurement values with a high variance. These quality checks are optional, because extreme climatic values in particular may be valid yet rejected by some quality-check method. Another important task is the preparation of measurement data in terms of time. The observed stations measure values at intervals of minutes to hours, whereas scientists often need a coarser temporal resolution (days, months, years). Therefore, the interval of time aggregation is selectable for the processing. For some use cases it is desirable that the resulting time series be as continuous as possible. To meet these requirements, the processing system includes techniques to fill gaps of missing values by interpolating measurement values with data from adjacent stations, using available contemporaneous measurements from the respective stations as training datasets. Alongside the processing of sensor values, we created interactive visualization techniques to give a quick overview of large amounts of archived time series data.
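
    The three selectable validation methods lend themselves to a short sketch. The Python below illustrates them with thresholds that are purely illustrative assumptions, not the system's configured values:

        def validate_range(values, lo=-40.0, hi=60.0):
            # Reject values outside the physically plausible sensor range,
            # e.g. -40 °C to +60 °C for an air temperature sensor.
            return [v for v in values if lo <= v <= hi]

        def validate_change_rate(values, max_step=5.0):
            # Reject a value whose jump from the last accepted value exceeds
            # the maximum allowed change between subsequent measurements.
            accepted = [values[0]]
            for v in values[1:]:
                if abs(v - accepted[-1]) <= max_step:
                    accepted.append(v)
            return accepted

        def validate_against_neighbours(value, neighbour_values, max_dev=10.0):
            # Reject a measurement that deviates strongly from the mean of
            # contemporaneous values at neighbouring stations.
            mean = sum(neighbour_values) / len(neighbour_values)
            return value if abs(value - mean) <= max_dev else None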

  15. A validation procedure for a LADAR system radiometric simulation model

    NASA Astrophysics Data System (ADS)

    Leishman, Brad; Budge, Scott; Pack, Robert

    2007-04-01

    The USU LadarSIM software package is a ladar system engineering tool that has recently been enhanced to include the modeling of the radiometry of Ladar beam footprints. This paper will discuss our validation of the radiometric model and present a practical approach to future validation work. In order to validate complicated and interrelated factors affecting radiometry, a systematic approach had to be developed. Data for known parameters were first gathered; unknown parameters of the system were then determined from simulation test scenarios. This was done in a way that isolated as many unknown variables as possible, building on previously obtained results. First, the appropriate voltage threshold levels of the discrimination electronics were set by analyzing the number of false alarms seen in actual data sets. With this threshold set, the system noise was then adjusted to achieve the appropriate number of dropouts. Once a suitable noise level was found, the range errors of the simulated and actual data sets were compared and studied. Predicted errors in range measurements were analyzed using two methods: first by examining the range error of a surface with known reflectivity and second by examining the range errors for specific detectors with known responsivities. This provided insight into the discrimination method and receiver electronics used in the actual system.

  16. Methods for evaluating information in managing the enterprise on the basis of a hybrid three-tier system

    NASA Astrophysics Data System (ADS)

    Vasil'ev, V. A.; Dobrynina, N. V.

    2017-01-01

    The article presents data on the influence of information upon the functioning of complex systems in the process of ensuring their effective management. Ways and methods for evaluating multidimensional information are proposed that reduce the time and resources required and improve the validity of management decisions for the studied systems.

  17. Validation of MCNP6 Version 1.0 with the ENDF/B-VII.1 Cross Section Library for Uranium Metal, Oxide, and Solution Systems on the High Performance Computing Platform Moonlight

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chapman, Bryan Scott; MacQuigg, Michael Robert; Wysong, Andrew Russell

    In this document, the code MCNP is validated with ENDF/B-VII.1 cross section data under the purview of ANSI/ANS-8.24-2007, for use with uranium systems. MCNP is a computer code based on Monte Carlo transport methods. While MCNP has wide-ranging capability in nuclear transport simulation, this validation is limited to the functionality related to neutron transport and calculation of criticality parameters such as keff.

  18. A bibliography on formal methods for system specification, design and validation

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.; Furchtgott, D. G.; Movaghar, A.

    1982-01-01

    Literature on the specification, design, verification, testing, and evaluation of avionics systems was surveyed, providing 655 citations. Journal papers, conference papers, and technical reports are included. Manual and computer-based methods were employed. Keywords used in the online search are listed.

  19. Validation and Improvement of Reliability Methods for Air Force Building Systems

    DTIC Science & Technology

    focusing primarily on HVAC systems. This research used contingency analysis to assess the performance of each model for HVAC systems at six Air Force... probabilistic model produced inflated reliability calculations for HVAC systems. In light of these findings, this research employed a stochastic method, a... Nonhomogeneous Poisson Process (NHPP), in an attempt to produce accurate HVAC system reliability calculations. This effort ultimately concluded that

  20. Validity of Torque-Data Collection at Multiple Sites: A Framework for Collaboration on Clinical-Outcomes Research in Sports Medicine.

    PubMed

    Kuenze, Christopher; Eltouhky, Moataz; Thomas, Abbey; Sutherlin, Mark; Hart, Joseph

    2016-05-01

    Collecting torque data using a multimode dynamometer is common in sports-medicine research. The error in torque measurements across multiple sites and dynamometers has not been established. To assess the validity of 2 calibration protocols across 3 dynamometers and the error associated with torque measurement for each system. Observational study. 3 university laboratories at separate institutions. 2 Biodex System 3 dynamometers and 1 Biodex System 4 dynamometer. System calibration was completed using the manufacturer-recommended single-weight method and an experimental calibration method using a series of progressive weights. Both calibration methods were compared with a manually calculated theoretical torque across a range of applied weights. Relative error, absolute error, and percent error were calculated at each weight. Each outcome variable was compared between systems using 95% confidence intervals across low (0-65 Nm), moderate (66-110 Nm), and high (111-165 Nm) torque categorizations. Calibration coefficients were established for each system using both calibration protocols. However, within each system the calibration coefficients generated using the single-weight (System 4 = 2.42 [0.90], System 3a = 1.37 [1.11], System 3b = -0.96 [1.45]) and experimental calibration protocols (System 4 = 3.95 [1.08], System 3a = -0.79 [1.23], System 3b = 2.31 [1.66]) were similar and displayed acceptable mean relative error compared with calculated theoretical torque values. Overall, percent error was greatest for all 3 systems in low-torque conditions (System 4 = 11.66% [6.39], System 3a = 6.82% [11.98], System 3b = 4.35% [9.49]). The System 4 significantly overestimated torque across all 3 weight increments, and the System 3b overestimated torque over the moderate-torque increment. Conversion of raw voltage to torque values using the single-calibration-weight method is valid and comparable to a more complex multiweight calibration process; however, it is clear that calibration must be done for each individual system to ensure accurate data collection.
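
    The error measures used in this study are straightforward to reproduce. A sketch with hypothetical torque values (Nm), not data from the paper:

        import numpy as np

        def torque_errors(measured, theoretical):
            """Relative, absolute, and percent error against calculated theoretical torque."""
            measured, theoretical = np.asarray(measured), np.asarray(theoretical)
            relative = measured - theoretical
            absolute = np.abs(relative)
            percent = 100.0 * absolute / theoretical
            return relative, absolute, percent

        # Hypothetical torques from one dynamometer across low/moderate/high increments.
        rel, abs_err, pct = torque_errors([33.1, 88.4, 162.0], [32.0, 87.0, 165.0])
        print(rel, abs_err, pct)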

  1. Reliability Validation and Improvement Framework

    DTIC Science & Technology

    2012-11-01

    systems. Steps in that direction include the use of the Architecture Tradeoff Analysis Method® (ATAM®) developed at the Carnegie Mellon... embedded software • cyber-physical systems (CPSs) to indicate that the embedded software interacts with, manages, and controls a physical system [Lee... the use of formal static analysis methods to increase our confidence in system operation beyond testing. However, analysis results

  2. A weight modification sequential method for VSC-MTDC power system state estimation

    NASA Astrophysics Data System (ADS)

    Yang, Xiaonan; Zhang, Hao; Li, Qiang; Guo, Ziming; Zhao, Kun; Li, Xinpeng; Han, Feng

    2017-06-01

    This paper presents an effective sequential approach based on weight modification for VSC-MTDC power system state estimation, called the weight modification sequential method. The proposed approach simplifies the AC/DC system state estimation algorithm by modifying the weight of the state quantity to keep the matrix dimension constant. The weight modification sequential method also makes the VSC-MTDC system state estimation results more accurate and increases the speed of calculation. The effectiveness of the proposed weight modification sequential method is demonstrated and validated on a modified IEEE 14-bus system.
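
    The record gives no implementation detail, but the weighted-least-squares step underlying such estimators, with weights modified rather than matrix dimensions changed, might be sketched as follows (toy system, illustrative weights, not the paper's algorithm):

        import numpy as np

        def wls_estimate(H, z, w):
            """One weighted-least-squares step x = (H^T W H)^-1 H^T W z.

            Modifying the weight vector w, rather than resizing H, is the spirit of
            keeping the matrix dimension constant as DC quantities are folded in.
            """
            W = np.diag(w)
            G = H.T @ W @ H                 # gain matrix
            return np.linalg.solve(G, H.T @ W @ z)

        # Hypothetical 3-measurement, 2-state toy system.
        H = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
        z = np.array([1.02, 0.98, 2.05])
        x = wls_estimate(H, z, w=np.array([1.0, 1.0, 0.5]))
        print(x)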

  3. Validation (not just verification) of Deep Space Missions

    NASA Technical Reports Server (NTRS)

    Duren, Riley M.

    2006-01-01

    Verification & Validation (V&V) is a widely recognized and critical systems engineering function. However, the often used definition 'Verification proves the design is right; validation proves it is the right design' is rather vague. And while Verification is a reasonably well standardized systems engineering process, Validation is a far more abstract concept and the rigor and scope applied to it varies widely between organizations and individuals. This is reflected in the findings in recent Mishap Reports for several NASA missions, in which shortfalls in Validation (not just Verification) were cited as root- or contributing-factors in catastrophic mission loss. Furthermore, although there is strong agreement in the community that Test is the preferred method for V&V, many people equate 'V&V' with 'Test', such that Analysis and Modeling aren't given comparable attention. Another strong motivator is a realization that the rapid growth in complexity of deep-space missions (particularly Planetary Landers and Space Observatories given their inherent unknowns) is placing greater demands on systems engineers to 'get it right' with Validation.

  4. Validation of an in-vivo proton beam range check method in an anthropomorphic pelvic phantom using dose measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bentefour, El H., E-mail: hassan.bentefour@iba-group.com; Prieels, Damien; Tang, Shikui

    Purpose: In-vivo dosimetry and beam range verification in proton therapy could play a significant role in proton treatment validation and improvement. In-vivo beam range verification, in particular, could enable new treatment techniques, one of which could be the use of anterior fields for prostate treatment instead of opposed lateral fields as in current practice. This paper reports a validation study of an in-vivo range verification method which can reduce the range uncertainty to submillimeter levels and potentially allow for in-vivo dosimetry. Methods: An anthropomorphic pelvic phantom is used to validate the clinical potential of the time-resolved dose method for range verification in the case of prostate treatment using range modulated anterior proton beams. The method uses a 3 × 4 matrix of 1 mm diodes mounted in a water balloon which are read by an ADC system at 100 kHz. The method is first validated against beam range measurements by dose extinction measurements. The validation is first completed in a water phantom and then in the pelvic phantom for both open field and treatment field configurations. Later, the beam range results are compared with the water equivalent path length (WEPL) values computed from the treatment planning system XIO. Results: Beam range measurements from both the time-resolved dose method and the dose extinction method agree with submillimeter precision in the water phantom. For the pelvic phantom, when discarding two of the diodes that show signs of significant range mixing, the two methods agree within ±1 mm. A dose of only 7 mGy is sufficient to achieve this result. The comparison to the WEPL computed by the treatment planning system (XIO) shows that XIO underestimates the proton beam range. Quantifying the exact XIO range underestimation depends on the strategy used to evaluate the WEPL results. To our best evaluation, XIO underestimates the treatment beam range by between a minimum of 1.7% and a maximum of 4.1%. Conclusions: The time-resolved dose measurement method satisfies the two basic requirements, WEPL accuracy and minimum dose, necessary for clinical use, and thus shows potential for in-vivo proton range verification. Further development is needed, namely, devising a workflow that takes into account the limits imposed by proton range mixing and the susceptibility of the comparison of measured and expected WEPLs to errors on the detector positions. The methods may also be used for in-vivo dosimetry and could benefit various proton therapy treatments.

  5. Demography of Principals' Work and School Improvement: Content Validity of Kentucky's Standards and Indicators for School Improvement (SISI)

    ERIC Educational Resources Information Center

    Lindle, Jane Clark; Stalion, Nancy; Young, Lu

    2005-01-01

    Kentucky's accountability system includes a school-processes audit known as Standards and Indicators for School Improvement (SISI), which is in a nascent stage of validation. Content validity methods include comparison to instruments measuring similar constructs as well as other techniques such as job analysis. This study used a two-phase process…

  6. Computer-Assisted Update of a Consumer Health Vocabulary Through Mining of Social Network Data

    PubMed Central

    2011-01-01

    Background Consumer health vocabularies (CHVs) have been developed to aid consumer health informatics applications. This purpose is best served if the vocabulary evolves with consumers’ language. Objective Our objective was to create a computer assisted update (CAU) system that works with live corpora to identify new candidate terms for inclusion in the open access and collaborative (OAC) CHV. Methods The CAU system consisted of three main parts: a Web crawler and an HTML parser, a candidate term filter that utilizes natural language processing tools including term recognition methods, and a human review interface. In evaluation, the CAU system was applied to the health-related social network website PatientsLikeMe.com. The system’s utility was assessed by comparing the candidate term list it generated to a list of valid terms hand extracted from the text of the crawled webpages. Results The CAU system identified 88,994 unique 1- to 7-gram terms (“n-grams” are n consecutive words within a sentence) in 300 crawled PatientsLikeMe.com webpages. The manual review of the crawled webpages identified 651 valid terms not yet included in the OAC CHV or the Unified Medical Language System (UMLS) Metathesaurus, a collection of vocabularies amalgamated to form an ontology of medical terms (ie, 1 valid term per 136.7 candidate n-grams). The term filter selected 774 candidate terms, of which 237 were valid terms, that is, 1 valid term among every 3 or 4 candidates reviewed. Conclusion The CAU system is effective for generating a list of candidate terms for human review during CHV development. PMID:21586386
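
    The candidate-generation stage (collecting every 1- to 7-gram from crawled sentences) can be sketched as follows; the NLP filtering and human-review stages are omitted, and the example text is invented:

        import re
        from collections import Counter

        def extract_ngrams(text, n_max=7):
            """Collect all 1- to n_max-gram candidate terms from sentences."""
            counts = Counter()
            for sentence in re.split(r"[.!?]", text.lower()):
                words = re.findall(r"[a-z']+", sentence)
                for n in range(1, n_max + 1):
                    for i in range(len(words) - n + 1):
                        counts[" ".join(words[i:i + n])] += 1
            return counts

        candidates = extract_ngrams("I get brain fog after meals. Brain fog is the worst.")
        print(candidates["brain fog"])  # -> 2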

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quinn, Heather; Wirthlin, Michael

    A variety of fault emulation systems have been created to study the effect of single-event effects (SEEs) in static random access memory (SRAM) based field-programmable gate arrays (FPGAs). These systems are useful for augmenting radiation-hardness assurance (RHA) methodologies for verifying the effectiveness of mitigation techniques; understanding error signatures and failure modes in FPGAs; and failure rate estimation. For radiation effects researchers, it is important that these systems properly emulate how SEEs manifest in FPGAs. If the fault emulation system does not mimic the radiation environment, it will generate erroneous data and incorrect predictions of the behavior of the FPGA in a radiation environment. Validation determines whether the emulated faults are reasonable analogs to the radiation-induced faults. In this study we present methods for validating fault emulation systems and provide several examples of validated FPGA fault emulation systems.

  8. Task-oriented evaluation of electronic medical records systems: development and validation of a questionnaire for physicians

    PubMed Central

    2004-01-01

    Background Evaluation is a challenging but necessary part of the development cycle of clinical information systems like the electronic medical records (EMR) system. It is believed that such evaluations should include multiple perspectives, be comparative and employ both qualitative and quantitative methods. Self-administered questionnaires are frequently used as a quantitative evaluation method in medical informatics, but very few validated questionnaires address clinical use of EMR systems. Methods We have developed a task-oriented questionnaire for evaluating EMR systems from the clinician's perspective. The key feature of the questionnaire is a list of 24 general clinical tasks. It is applicable to physicians of most specialties and covers essential parts of their information-oriented work. The task list appears in two separate sections, about EMR use and task performance using the EMR, respectively. By combining these sections, the evaluator may estimate the potential impact of the EMR system on health care delivery. The results may also be compared across time, site or vendor. This paper describes the development, performance and validation of the questionnaire. Its performance is shown in two demonstration studies (n = 219 and 80). Its content is validated in an interview study (n = 10), and its reliability is investigated in a test-retest study (n = 37) and a scaling study (n = 31). Results In the interviews, the physicians found the general clinical tasks in the questionnaire relevant and comprehensible. The tasks were interpreted concordant to their definitions. However, the physicians found questions about tasks not explicitly or only partially supported by the EMR systems difficult to answer. The two demonstration studies provided unambiguous results and low percentages of missing responses. In addition, criterion validity was demonstrated for a majority of task-oriented questions. Their test-retest reliability was generally high, and the non-standard scale was found symmetric and ordinal. Conclusion This questionnaire is relevant for clinical work and EMR systems, provides reliable and interpretable results, and may be used as part of any evaluation effort involving the clinician's perspective of an EMR system. PMID:15018620

  9. Assessing Primary Representational System (PRS) Preference for Neurolinguistic Programming (NLP) Using Three Methods.

    ERIC Educational Resources Information Center

    Dorn, Fred J.

    1983-01-01

    Considered three methods of identifying Primary Representational System (PRS)--an interview, a word list, and a self-report--in a study of 120 college students. Results suggested the three methods offer little to counselors either collectively or individually. Results did not validate the PRS construct, suggesting the need for further research.…

  10. Dynamic leg length asymmetry during gait is not a valid method for estimating mild anatomic leg length discrepancy.

    PubMed

    Leporace, Gustavo; Batista, Luiz Alberto; Serra Cruz, Raphael; Zeitoune, Gabriel; Cavalin, Gabriel Armondi; Metsavaht, Leonardo

    2018-03-01

    The purpose of this study was to test the validity of dynamic leg length discrepancy (DLLD) during gait as a radiation-free screening method for measuring anatomic leg length discrepancy (ALLD). Thirty-three subjects with mild leg length discrepancy walked along a walkway and DLLD was calculated using a motion analysis system. Pearson correlation and paired Student t-tests were applied to calculate the correlation and compare the differences between DLLD and ALLD (α = 0.05). The results of our study showed DLLD is not a valid method to predict ALLD in subjects with mild limb discrepancy.

  11. Verification and Validation in a Rapid Software Development Process

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Easterbrook, Steve M.

    1997-01-01

    The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in the use of new methods in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing on the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.

  12. Evaluation of the confusion matrix method in the validation of an automated system for measuring feeding behaviour of cattle.

    PubMed

    Ruuska, Salla; Hämäläinen, Wilhelmiina; Kajava, Sari; Mughal, Mikaela; Matilainen, Pekka; Mononen, Jaakko

    2018-03-01

    The aim of the present study was to empirically evaluate the confusion matrix method in device validation. We compared the confusion matrix method to linear regression and error indices in the validation of a device measuring feeding behaviour of dairy cattle. In addition, we studied how to extract additional information on classification errors with confusion probabilities. The data consisted of 12 h behaviour measurements from five dairy cows; feeding and other behaviour were detected simultaneously with a device and from video recordings. The resulting 216 000 pairs of classifications were used to construct confusion matrices and calculate performance measures. In addition, hourly durations of each behaviour were calculated and the accuracy of measurements was evaluated with linear regression and error indices. All three validation methods agreed when the behaviour was detected very accurately or inaccurately. Otherwise, in the intermediate cases, the confusion matrix method and error indices produced relatively concordant results, but the linear regression method often disagreed with them. Our study supports the use of confusion matrix analysis in validation since it is robust to any data distribution and type of relationship, it makes a stringent evaluation of validity, and it offers extra information on the type and sources of errors. Copyright © 2018 Elsevier B.V. All rights reserved.
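
    A confusion matrix and the derived performance measures follow directly from the paired per-second classifications. A minimal Python sketch with invented labels:

        from collections import Counter

        def confusion_matrix(device, video):
            """Tally paired classifications (device output vs. video reference)."""
            return Counter(zip(device, video))

        def performance(cm):
            tp, fp = cm[("feeding", "feeding")], cm[("feeding", "other")]
            fn, tn = cm[("other", "feeding")], cm[("other", "other")]
            sensitivity = tp / (tp + fn)
            specificity = tn / (tn + fp)
            accuracy = (tp + tn) / (tp + fp + fn + tn)
            return sensitivity, specificity, accuracy

        cm = confusion_matrix(
            ["feeding", "feeding", "other", "other", "feeding"],
            ["feeding", "other",   "other", "other", "feeding"])
        print(performance(cm))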

  13. Development and validation of a rapid multi-class method for the confirmation of fourteen prohibited medicinal additives in pig and poultry compound feed by liquid chromatography-tandem mass spectrometry.

    PubMed

    Cronly, Mark; Behan, P; Foley, B; Malone, E; Earley, S; Gallagher, M; Shearan, P; Regan, L

    2010-12-01

    A confirmatory method has been developed to allow for the analysis of fourteen prohibited medicinal additives in pig and poultry compound feed. These compounds are prohibited for use as feed additives although some are still authorised for use in medicated feed. Feed samples are extracted by acetonitrile with addition of sodium sulfate. The extracts undergo a hexane wash to aid with sample purification. The extracts are then evaporated to dryness and reconstituted in initial mobile phase. The samples undergo an ultracentrifugation step prior to injection onto the LC-MS/MS system and are analysed in a run time of 26 min. The LC-MS/MS system is run in MRM mode with both positive and negative electrospray ionisation. The method was validated over three days and is capable of quantitatively analysing for metronidazole, dimetridazole, ronidazole, ipronidazole, chloramphenicol, sulfamethazine, dinitolimide, ethopabate, carbadox and clopidol. The method is also capable of qualitatively analysing for sulfadiazine, tylosin, virginiamycin and avilamycin. A level of 100 microg kg(-1) was used for validation purposes and the method is capable of analysing to this level for all the compounds. Validation criteria of trueness, precision, repeatability and reproducibility along with measurement uncertainty are calculated for all analytes. Copyright (c) 2010 Elsevier B.V. All rights reserved.

  14. A Methodology for Validating Safety Heuristics Using Clinical Simulations: Identifying and Preventing Possible Technology-Induced Errors Related to Using Health Information Systems

    PubMed Central

    Borycki, Elizabeth; Kushniruk, Andre; Carvalho, Christopher

    2013-01-01

    Internationally, health information systems (HIS) safety has emerged as a significant concern for governments. Recently, research has emerged that has documented the ability of HIS to be implicated in the harm and death of patients. Researchers have attempted to develop methods that can be used to prevent or reduce technology-induced errors. Some researchers are developing methods that can be employed prior to systems release. These methods include the development of safety heuristics and clinical simulations. In this paper, we outline our methodology for developing safety heuristics specific to identifying the features or functions of a HIS user interface design that may lead to technology-induced errors. We follow this with a description of a methodological approach to validate these heuristics using clinical simulations. PMID:23606902

  15. Common Criteria related security design patterns--validation on the intelligent sensor example designed for mine environment.

    PubMed

    Bialas, Andrzej

    2010-01-01

    The paper discusses the security issues of intelligent sensors that are able to measure and process data and communicate with other information technology (IT) devices or systems. Such sensors are often used in high-risk applications. To improve their robustness, the sensor systems should be developed in a restricted way that provides them with assurance. One such assurance-creation methodology is Common Criteria (ISO/IEC 15408), used for IT products and systems. The contribution of the paper is a Common Criteria-compliant and pattern-based method for intelligent sensor security development. The paper concisely presents this method and its evaluation on a sensor detecting methane in a mine, focusing on the definition and solution of the intelligent sensor's security problem. The aim of the validation is to evaluate and improve the introduced method.

  16. On Federated and Proof Of Validation Based Consensus Algorithms In Blockchain

    NASA Astrophysics Data System (ADS)

    Ambili, K. N.; Sindhu, M.; Sethumadhavan, M.

    2017-08-01

    Almost all real-world activities have been digitized, and various client-server systems are in place to handle them, all of which rely on trust in third parties. There is an active effort to implement blockchain-based systems, which ensure that IT records are immutable, prevent double spending, and provide cryptographic strength. A successful implementation of blockchain as the backbone of existing information technology systems is bound to eliminate various types of fraud and ensure quicker delivery of traded items. To adapt IT systems to a blockchain architecture, an efficient consensus algorithm needs to be designed. Blockchain based on proof of work first emerged as the backbone of cryptocurrency; since then, several other methods with a variety of interesting features have appeared. In this paper, we survey existing approaches to achieving consensus in blockchains, and a federated consensus method and a proof-of-validation method are compared.
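
    For contrast with the federated and proof-of-validation schemes the paper surveys, the proof-of-work idea it starts from can be shown in a few lines; this is an illustrative toy, not any production consensus implementation:

        import hashlib

        def proof_of_work(block_data: str, difficulty: int = 4):
            """Find a nonce whose SHA-256 digest has `difficulty` leading zero hex digits."""
            prefix = "0" * difficulty
            nonce = 0
            while True:
                digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
                if digest.startswith(prefix):
                    return nonce, digest   # the nonce is the "work"; verification is one hash
                nonce += 1

        print(proof_of_work("block 42: A pays B 10 units"))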

  17. Validation of an entirely in vitro approach for rapid prototyping of DNA regulatory elements for synthetic biology

    PubMed Central

    Chappell, James; Jensen, Kirsten; Freemont, Paul S.

    2013-01-01

    A bottleneck in our capacity to rationally and predictably engineer biological systems is the limited number of well-characterized genetic elements from which to build. Current characterization methods are tied to measurements in living systems, the transformation and culturing of which are inherently time-consuming. To address this, we have validated a completely in vitro approach for the characterization of DNA regulatory elements using Escherichia coli extract cell-free systems. Importantly, we demonstrate that characterization in cell-free systems correlates and is reflective of performance in vivo for the most frequently used DNA regulatory elements. Moreover, we devise a rapid and completely in vitro method to generate DNA templates for cell-free systems, bypassing the need for DNA template generation and amplification from living cells. This in vitro approach is significantly quicker than current characterization methods and is amenable to high-throughput techniques, providing a valuable tool for rapidly prototyping libraries of DNA regulatory elements for synthetic biology. PMID:23371936

  18. Scoring Methods for Building Genotypic Scores: An Application to Didanosine Resistance in a Large Derivation Set

    PubMed Central

    Houssaini, Allal; Assoumou, Lambert; Miller, Veronica; Calvez, Vincent; Marcelin, Anne-Geneviève; Flandre, Philippe

    2013-01-01

    Background Several attempts have been made to determine HIV-1 resistance from genotype resistance testing. We compare scoring methods for building weighted genotyping scores and commonly used systems to determine whether the virus of a HIV-infected patient is resistant. Methods and Principal Findings Three statistical methods (linear discriminant analysis, support vector machine and logistic regression) are used to determine the weight of mutations involved in HIV resistance. We compared these weighted scores with known interpretation systems (ANRS, REGA and Stanford HIV-db) to classify patients as resistant or not. Our methodology is illustrated on the Forum for Collaborative HIV Research didanosine database (N = 1453). The database was divided into four samples according to the country of enrolment (France, USA/Canada, Italy and Spain/UK/Switzerland). The total sample and the four country-based samples allow external validation (one sample is used to estimate a score and the other samples are used to validate it). We used the observed precision to compare the performance of newly derived scores with other interpretation systems. Our results show that newly derived scores performed better than or similar to existing interpretation systems, even with external validation sets. No difference was found between the three methods investigated. Our analysis identified four new mutations associated with didanosine resistance: D123S, Q207K, H208Y and K223Q. Conclusions We explored the potential of three statistical methods to construct weighted scores for didanosine resistance. Our proposed scores performed at least as well as already existing interpretation systems and previously unrecognized didanosine-resistance associated mutations were identified. This approach could be used for building scores of genotypic resistance to other antiretroviral drugs. PMID:23555613

  19. Technical note: validation of a motion analysis system for measuring the relative motion of the intermediate component of a tripolar total hip arthroplasty prosthesis.

    PubMed

    Chen, Qingshan; Lazennec, Jean Yves; Guyen, Olivier; Kinbrum, Amy; Berry, Daniel J; An, Kai-Nan

    2005-07-01

    Tripolar total hip arthroplasty (THA) prostheses have been suggested as a method to reduce the occurrence of hip dislocation and microseparation. Precisely measuring the motion of the intermediate component in vitro would provide fundamental knowledge for understanding its mechanism. The present study validates the accuracy and repeatability of a three-dimensional motion analysis system to quantitatively measure the relative motion of the intermediate component of tripolar total hip arthroplasty prostheses. Static and dynamic validations of the system were made by comparing the measurement to that of a potentiometer. Differences between the mean system-calculated angle and the angle measured by the potentiometer were within ±1°. The mean within-trial variability was less than 1°. The mean slope was 0.9-1.02 for different angular velocities. The dynamic noise was within 1°. The system was then applied to measure the relative motion of an eccentric THA prosthesis. The study shows that this motion analysis system provides an accurate and practical method for measuring the relative motion of the tripolar THA prosthesis in vitro, a necessary first step towards the understanding of its in vivo kinematics.

  20. Validation of Ion Chromatographic Method for Determination of Standard Inorganic Anions in Treated and Untreated Drinking Water

    NASA Astrophysics Data System (ADS)

    Ivanova, V.; Surleva, A.; Koleva, B.

    2018-06-01

    An ion chromatographic method for the determination of fluoride, chloride, nitrate and sulphate in untreated and treated drinking waters is described. An automated Metrohm 850 IC Professional system equipped with a conductivity detector and a Metrosep A Supp 7-250 (250 × 4 mm) column was used. The validation of the method was performed for simultaneous determination of all studied analytes, and the results showed that the validated method meets the requirements of the current water legislation. The main analytical characteristics were estimated for each of the studied analytes: limits of detection, limits of quantification, working and linear ranges, repeatability and intermediate precision, and recovery. The trueness of the method was estimated by analysis of a certified reference material for soft drinking water. A recovery test was performed on spiked drinking water samples. Measurement uncertainty was estimated. The method was applied to the analysis of drinking waters before and after chlorination.

  1. Approach to method development and validation in capillary electrophoresis for enantiomeric purity testing of active basic pharmaceutical ingredients.

    PubMed

    Sokoliess, Torsten; Köller, Gerhard

    2005-06-01

    A chiral capillary electrophoresis system allowing the determination of the enantiomeric purity of an investigational new drug was developed using a generic method development approach for basic analytes. The method was optimized in terms of type and concentration of both cyclodextrin (CD) and electrolyte, buffer pH, temperature, voltage, and rinsing procedure. Optimal chiral separation of the analyte was obtained using an electrolyte with 2.5% carboxymethyl-beta-CD in 25 mM NaH2PO4 (pH 4.0). Interchanging the inlet and outlet vials after each run improved the method's precision. To assure the method's suitability for the control of enantiomeric impurities in pharmaceutical quality control, its specificity, linearity, precision, accuracy, and robustness were validated according to the requirements of the International Conference on Harmonization. The usefulness of our generic method development approach for the validation of robustness was demonstrated.

  2. IMPLEMENTATION AND VALIDATION OF A FULLY IMPLICIT ACCUMULATOR MODEL IN RELAP-7

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Haihua; Zou, Ling; Zhang, Hongbin

    2016-01-01

    This paper presents the implementation and validation of an accumulator model in RELAP-7 under the framework of the preconditioned Jacobian-free Newton Krylov (JFNK) method, based on the similar model used in RELAP5. RELAP-7 is a new nuclear reactor system safety analysis code being developed at the Idaho National Laboratory (INL). RELAP-7 is a fully implicit system code. The JFNK and preconditioning methods used in RELAP-7 are briefly discussed. The slightly modified accumulator model is summarized for completeness. The implemented model was validated with the LOFT L3-1 test and benchmarked against RELAP5 results. RELAP-7 and RELAP5 had almost identical results for the accumulator gas pressure and water level, although there were some minor differences in other parameters such as accumulator gas temperature and tank wall temperature. One advantage of the JFNK method is the ease of maintaining and modifying models, owing to the full separation of numerical methods from physical models. It would be straightforward to extend the current RELAP-7 accumulator model to simulate the advanced accumulator design.
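
    The record gives no code, but the matrix-free Newton-Krylov idea behind JFNK can be illustrated with SciPy's newton_krylov solver on a toy residual; this is a stand-in under stated assumptions, not RELAP-7's solver:

        import numpy as np
        from scipy.optimize import newton_krylov

        # Toy nonlinear residual F(x) = 0 standing in for a discretized balance system;
        # newton_krylov solves it matrix-free, i.e. without ever forming the Jacobian,
        # which is the numerical idea behind JFNK codes.
        def residual(x):
            return np.array([x[0]**2 + x[1] - 2.0,
                             x[0] - x[1]**3])

        x0 = np.array([0.5, 0.5])            # initial guess
        solution = newton_krylov(residual, x0, f_tol=1e-8)
        print(solution)                      # converges to (1, 1)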

  3. Wrestlers' minimal weight: anthropometry, bioimpedance, and hydrostatic weighing compared.

    PubMed

    Oppliger, R A; Nielsen, D H; Vance, C G

    1991-02-01

    The need for accurate assessment of minimal wrestling weight among interscholastic wrestlers has been well documented. Previous research has demonstrated the validity of anthropometric methods for this purpose, but little research has examined the validity of bioelectrical impedance (BIA) measurements, and comparisons between BIA systems have received limited attention. With these two objectives, we compared the prediction of minimal weight (MW) among 57 interscholastic wrestlers using three anthropometric methods (skinfolds (SF) and two skeletal dimensions equations) and three BIA systems (Berkeley Medical Research (BMR), RJL, and Valhalla (VAL)). All methods showed high correlations (r values greater than 0.92) with hydrostatic weighing (HW) and between methods (r values greater than 0.90). The standard errors of estimate (SEE) were relatively small for all methods, especially for SF and the three BIA systems (SEE less than 0.70 kg). The total errors of prediction (E) for RJL and VAL (E = 4.4 and 3.9 kg) were significantly larger than the nonsignificant BMR and SF values (E = 2.3 and 1.8 kg, respectively). Significant mean differences were observed between HW, RJL, VAL, and the two skeletal dimensions equations, but nonsignificant differences were observed between HW, BMR, and SF. BMR differed significantly from the RJL and VAL systems. The results suggest that RJL and VAL have potential application for this subpopulation. Prediction equation refinement with the addition of selected anthropometric measurements or moderating variables may enhance their utility. However, within the scope of our study, SF and BMR BIA appear to be the most valid methods for determining MW in interscholastic wrestlers.

  4. Ab initio analytical Raman intensities for periodic systems through a coupled perturbed Hartree-Fock/Kohn-Sham method in an atomic orbital basis. II. Validation and comparison with experiments

    NASA Astrophysics Data System (ADS)

    Maschio, Lorenzo; Kirtman, Bernard; Rérat, Michel; Orlando, Roberto; Dovesi, Roberto

    2013-10-01

    In this work, we validate a new, fully analytical method for calculating Raman intensities of periodic systems, developed and presented in Paper I [L. Maschio, B. Kirtman, M. Rérat, R. Orlando, and R. Dovesi, J. Chem. Phys. 139, 164101 (2013)]. Our validation of this method and its implementation in the CRYSTAL code is done through several internal checks as well as comparison with experiment. The internal checks include consistency of results when increasing the number of periodic directions (from 0D to 1D, 2D, 3D), comparison with numerical differentiation, and a test of the sum rule for derivatives of the polarizability tensor. The choice of basis set as well as the Hamiltonian is also studied. Simulated Raman spectra of α-quartz and of the UiO-66 Metal-Organic Framework are compared with the experimental data.

  5. Content Validity of National Post Marriage Educational Program Using Mixed Methods

    PubMed Central

    MOHAJER RAHBARI, Masoumeh; SHARIATI, Mohammad; KERAMAT, Afsaneh; YUNESIAN, Masoud; ESLAMI, Mohammad; MOUSAVI, Seyed Abbas; MONTAZERI, Ali

    2015-01-01

    Background: Although the content validity of programs is mostly assessed with qualitative methods, this study used both qualitative and quantitative methods to validate the content of the post-marriage training program provided for newly married couples. Content validation is a preliminary step in obtaining the authorization required to install the program in the country's health care system. Methods: This mixed-method content validation study was carried out in four steps, forming three expert panels. Altogether 24 expert panelists were involved across the qualitative and quantitative panels: 6 in the first, item-development panel; 12 in the second, reduction panel (4 of whom also served on the first); and 10 executive experts in the last panel, organized to evaluate the psychometric properties CVR and CVI and the face validity of 57 educational objectives. Results: The raw content of the post-marriage program had been written by professional experts of the Ministry of Health; using the qualitative expert panel, the content was developed further by generating 3 topics and refining one topic and its respective content. In the second panel, six further objectives were deleted: three for falling below the agreement cut-off point and three by experts' consensus. In the quantitative assessment, the validity of all items was above 0.8 and their content validity indices (0.8–1) were completely appropriate. Conclusion: This study provided good evidence for the validation and accreditation of the national post-marriage program planned for newly married couples in the country's health centers in the near future. PMID:26056672
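
    Assuming the panel used Lawshe's content validity ratio and the common item-level CVI definition (the abstract does not spell out the formulas), the quantitative step can be sketched as:

        def content_validity_ratio(n_essential, n_experts):
            """Lawshe's CVR = (ne - N/2) / (N/2) for one educational objective."""
            half = n_experts / 2.0
            return (n_essential - half) / half

        def item_cvi(relevance_ratings, threshold=3):
            """I-CVI: share of experts rating the item 3 or 4 on a 4-point relevance scale."""
            return sum(r >= threshold for r in relevance_ratings) / len(relevance_ratings)

        print(content_validity_ratio(n_essential=9, n_experts=10))   # 0.8
        print(item_cvi([4, 4, 3, 4, 2, 4, 3, 4, 4, 3]))              # 0.9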

  6. Evaluation of PDA Technical Report No 33. Statistical Testing Recommendations for a Rapid Microbiological Method Case Study.

    PubMed

    Murphy, Thomas; Schwedock, Julie; Nguyen, Kham; Mills, Anna; Jones, David

    2015-01-01

    New recommendations for the validation of rapid microbiological methods have been included in the revised Technical Report 33 release from the PDA. The changes include a more comprehensive review of the statistical methods to be used to analyze data obtained during validation. This case study applies those statistical methods to accuracy, precision, ruggedness, and equivalence data obtained using a rapid microbiological methods system being evaluated for water bioburden testing. Results presented demonstrate that the statistical methods described in the PDA Technical Report 33 chapter can all be successfully applied to the rapid microbiological method data sets and gave the same interpretation for equivalence to the standard method. The rapid microbiological method was in general able to pass the requirements of PDA Technical Report 33, though the study shows that there can be occasional outlying results and that caution should be used when applying statistical methods to low average colony-forming unit values. Prior to use in a quality-controlled environment, any new method or technology has to be shown to work as designed by the manufacturer for the purpose required. For new rapid microbiological methods that detect and enumerate contaminating microorganisms, additional recommendations have been provided in the revised PDA Technical Report No. 33. The changes include a more comprehensive review of the statistical methods to be used to analyze data obtained during validation. This paper applies those statistical methods to analyze accuracy, precision, ruggedness, and equivalence data obtained using a rapid microbiological method system being validated for water bioburden testing. The case study demonstrates that the statistical methods described in the PDA Technical Report No. 33 chapter can be successfully applied to rapid microbiological method data sets and give the same comparability results for similarity or difference as the standard method. © PDA, Inc. 2015.

  7. Scenario-based design: A method for connecting information system design with public health operations and emergency management

    PubMed Central

    Reeder, Blaine; Turner, Anne M

    2011-01-01

    Responding to public health emergencies requires rapid and accurate assessment of workforce availability under adverse and changing circumstances. However, public health information systems to support resource management during both routine and emergency operations are currently lacking. We applied scenario-based design as an approach to engage public health practitioners in the creation and validation of an information design to support routine and emergency public health activities. Methods: Using semi-structured interviews we identified the information needs and activities of senior public health managers of a large municipal health department during routine and emergency operations. Results: Interview analysis identified twenty-five information needs for public health operations management. The identified information needs were used in conjunction with scenario-based design to create twenty-five scenarios of use and a public health manager persona. Scenarios of use and persona were validated and modified based on follow-up surveys with study participants. Scenarios were used to test and gain feedback on a pilot information system. Conclusion: The method of scenario-based design was applied to represent the resource management needs of senior-level public health managers under routine and disaster settings. Scenario-based design can be a useful tool for engaging public health practitioners in the design process and to validate an information system design. PMID:21807120

  8. Validation of the sperm class analyser CASA system for sperm counting in a busy diagnostic semen analysis laboratory.

    PubMed

    Dearing, Chey G; Kilburn, Sally; Lindsay, Kevin S

    2014-03-01

    Sperm counts have been linked to several fertility outcomes making them an essential parameter of semen analysis. It has become increasingly recognised that Computer-Assisted Semen Analysis (CASA) provides improved precision over manual methods but that systems are seldom validated robustly for use. The objective of this study was to gather the evidence to validate or reject the Sperm Class Analyser (SCA) as a tool for routine sperm counting in a busy laboratory setting. The criteria examined were comparison with the Improved Neubauer and Leja 20-μm chambers, within and between field precision, sperm concentration linearity from a stock diluted in semen and media, accuracy against internal and external quality material, assessment of uneven flow effects and a receiver operating characteristic (ROC) analysis to predict fertility in comparison with the Neubauer method. This work demonstrates that SCA CASA technology is not a standalone 'black box', but rather a tool for well-trained staff that allows rapid, high-number sperm counting providing errors are identified and corrected. The system will produce accurate, linear, precise results, with less analytical variance than manual methods that correlate well against the Improved Neubauer chamber. The system provides superior predictive potential for diagnosing fertility problems.

  9. Incremental checking of Master Data Management model based on contextual graphs

    NASA Astrophysics Data System (ADS)

    Lamolle, Myriam; Menet, Ludovic; Le Duc, Chan

    2015-10-01

    The validation of models is a crucial step in distributed heterogeneous systems. In this paper, an incremental validation method is proposed in the scope of a Model Driven Engineering (MDE) approach, which is used to develop a Master Data Management (MDM) field represented by XML Schema models. The MDE approach presented in this paper is based on the definition of an abstraction layer using UML class diagrams. The validation method aims to minimise model errors and to optimise the process of model checking. Therefore, the notion of validation contexts is introduced, allowing the verification of data model views. Description logics specify constraints that the models have to check. An experimentation of the approach is presented through an application developed in the ArgoUML IDE.

  10. A scoring system for ascertainment of incident stroke; the Risk Index Score (RISc).

    PubMed

    Kass-Hout, T A; Moyé, L A; Smith, M A; Morgenstern, L B

    2006-01-01

    The main objective of this study was to develop and validate a computer-based statistical algorithm that could be translated into a simple scoring system in order to ascertain incident stroke cases using hospital admission medical records data. The Risk Index Score (RISc) algorithm was developed using data collected prospectively by the Brain Attack Surveillance in Corpus Christi (BASIC) project, 2000. The validity of RISc was evaluated by estimating the concordance of scoring system stroke ascertainment to stroke ascertainment by physician and/or abstractor review of hospital admission records. RISc was developed on 1718 randomly selected patients (training set) and then statistically validated on an independent sample of 858 patients (validation set). A multivariable logistic model was used to develop RISc and subsequently evaluated by goodness-of-fit and receiver operating characteristic (ROC) analyses. The higher the value of RISc, the higher the patient's risk of potential stroke. The study showed RISc was well calibrated and discriminated those who had potential stroke from those that did not on initial screening. In this study we developed and validated a rapid, easy, efficient, and accurate method to ascertain incident stroke cases from routine hospital admission records for epidemiologic investigations. Validation of this scoring system was achieved statistically; however, clinical validation in a community hospital setting is warranted.
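
    The published RISc weights are not reproduced here, but the general route from logistic-model coefficients to an integer scoring system can be sketched with hypothetical coefficients (illustrative only, not the validated RISc):

        # Hypothetical logistic-model coefficients for admission-record predictors.
        coefficients = {"speech_deficit": 1.6, "focal_weakness": 1.2,
                        "prior_stroke": 0.8, "age_over_65": 0.4}

        # Convert each coefficient to an integer point value relative to the smallest,
        # so the score can be tallied by hand from a chart review.
        base = min(coefficients.values())
        points = {k: round(v / base) for k, v in coefficients.items()}

        def risc_score(record):
            """Sum the points for the findings present in one admission record."""
            return sum(points[k] for k, present in record.items() if present)

        print(points)
        print(risc_score({"speech_deficit": True, "focal_weakness": False,
                          "prior_stroke": True, "age_over_65": True}))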

  11. Validated spectrofluorometric method for determination of gemfibrozil in self nanoemulsifying drug delivery systems (SNEDDS)

    NASA Astrophysics Data System (ADS)

    Sierra Villar, Ana M.; Calpena Campmany, Ana C.; Bellowa, Lyda Halbaut; Trenchs, Monserrat Aróztegui; Naveros, Beatriz Clares

    2013-09-01

    A spectrofluorometric method has been developed and validated for the determination of gemfibrozil. The method is based on the excitation and emission capacities of gemfibrozil, with excitation and emission wavelengths of 276 and 304 nm respectively. This method allows the determination of the drug in a self-nanoemulsifying drug delivery system (SNEDDS) developed to improve its intestinal absorption. Results obtained showed linear relationships with good correlation coefficients (r2 > 0.999) and low limits of detection and quantification (LOD of 0.075 μg mL-1 and LOQ of 0.226 μg mL-1) in the range of 0.2-5 μg mL-1; equally, the method showed good robustness and stability. Thus the amounts of gemfibrozil released from SNEDDS contained in gastro-resistant hard gelatine capsules were analysed, and release studies could be performed satisfactorily.

  12. A Methodological Approach to Small Area Estimation for the Behavioral Risk Factor Surveillance System

    PubMed Central

    Xu, Fang; Wallace, Robyn C.; Garvin, William; Greenlund, Kurt J.; Bartoli, William; Ford, Derek; Eke, Paul; Town, G. Machell

    2016-01-01

    Public health researchers have used a class of statistical methods to calculate prevalence estimates for small geographic areas with few direct observations. Many researchers have used Behavioral Risk Factor Surveillance System (BRFSS) data as a basis for their models. The aims of this study were to 1) describe a new BRFSS small area estimation (SAE) method and 2) investigate the internal and external validity of the BRFSS SAEs it produced. The BRFSS SAE method uses 4 data sets (the BRFSS, the American Community Survey Public Use Microdata Sample, Nielsen Claritas population totals, and the Missouri Census Geographic Equivalency File) to build a single weighted data set. Our findings indicate that internal and external validity tests were successful across many estimates. The BRFSS SAE method is one of several methods that can be used to produce reliable prevalence estimates in small geographic areas. PMID:27418213

  13. Validated spectrofluorometric method for determination of gemfibrozil in self nanoemulsifying drug delivery systems (SNEDDS).

    PubMed

    Sierra Villar, Ana M; Calpena Campmany, Ana C; Bellowa, Lyda Halbaut; Trenchs, Monserrat Aróztegui; Naveros, Beatriz Clares

    2013-09-01

    A spectrofluorometric method has been developed and validated for the determination of gemfibrozil. The method is based on the excitation and emission capacities of gemfibrozil with excitation and emission wavelengths of 276 and 304 nm respectively. This method allows de determination of the drug in a self-nanoemulsifying drug delivery system (SNEDDS) for improve its intestinal absorption. Results obtained showed linear relationships with good correlation coefficients (r(2)>0.999) and low limits of detection and quantification (LOD of 0.075 μg mL(-1) and LOQ of 0.226 μg mL(-1)) in the range of 0.2-5 μg mL(-1), equally this method showed a good robustness and stability. Thus the amounts of gemfibrozil released from SNEDDS contained in gastro resistant hard gelatine capsules were analysed, and release studies could be performed satisfactorily. Copyright © 2013 Elsevier B.V. All rights reserved.

  14. A collaborative design method to support integrated care. An ICT development method containing continuous user validation improves the entire care process and the individual work situation

    PubMed Central

    Scandurra, Isabella; Hägglund, Maria

    2009-01-01

    Introduction Integrated care involves different professionals, belonging to different care provider organizations and requires immediate and ubiquitous access to patient-oriented information, supporting an integrated view on the care process [1]. Purpose To present a method for development of usable and work process-oriented information and communication technology (ICT) systems for integrated care. Theory and method Based on Human-computer Interaction Science and in particular Participatory Design [2], we present a new collaborative design method in the context of health information systems (HIS) development [3]. This method implies a thorough analysis of the entire interdisciplinary cooperative work and a transformation of the results into technical specifications, via user validated scenarios, prototypes and use cases, ultimately leading to the development of appropriate ICT for the variety of occurring work situations for different user groups, or professions, in integrated care. Results and conclusions Application of the method in homecare of the elderly resulted in an HIS that was well adapted to the intended user groups. Conducted in multi-disciplinary seminars, the method captured and validated user needs and system requirements for different professionals, work situations, and environments not only for current work; it also aimed to improve collaboration in future (ICT supported) work processes. A holistic view of the entire care process was obtained and supported through different views of the HIS for different user groups, resulting in improved work in the entire care process as well as for each collaborating profession [4].

  15. Primary central nervous system lymphoma and glioblastoma differentiation based on conventional magnetic resonance imaging by high-throughput SIFT features.

    PubMed

    Chen, Yinsheng; Li, Zeju; Wu, Guoqing; Yu, Jinhua; Wang, Yuanyuan; Lv, Xiaofei; Ju, Xue; Chen, Zhongping

    2018-07-01

    Due to the totally different therapeutic regimens needed for primary central nervous system lymphoma (PCNSL) and glioblastoma (GBM), accurate differentiation of the two diseases by noninvasive imaging techniques is important for clinical decision-making. Thirty cases of PCNSL and 66 cases of GBM with conventional T1-contrast magnetic resonance imaging (MRI) were analyzed in this study. A convolutional neural network was used to segment tumors automatically. A modified scale invariant feature transform (SIFT) method was utilized to extract three-dimensional local voxel arrangement information from segmented tumors. The Fisher vector was proposed to normalize the dimension of the SIFT features. An improved genetic algorithm (GA) was used to extract SIFT features with PCNSL and GBM discrimination ability. The data set was divided into a cross-validation cohort and an independent validation cohort in the ratio of 2:1. A support vector machine with leave-one-out cross-validation based on 20 cases of PCNSL and 44 cases of GBM was employed to build and validate the differentiation model. Among 16,384 high-throughput features, 1356 features showed significant differences between PCNSL and GBM with p < 0.05, and 420 features with p < 0.001. A total of 496 features were finally chosen by the improved GA. The proposed method produces PCNSL vs. GBM differentiation with an area under the curve (AUC) of 99.1% (98.2%), accuracy 95.3% (90.6%), sensitivity 85.0% (80.0%) and specificity 100% (95.5%) on the cross-validation cohort (and independent validation cohort). Owing to the local voxel arrangement characterization provided by the SIFT features, the proposed method produced more competitive PCNSL and GBM differentiation performance using conventional MRI than methods based on advanced MRI.
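
    The final modelling step (a linear SVM with leave-one-out cross-validation, scored by AUC) can be sketched on synthetic stand-in features; the SIFT/Fisher-vector/GA pipeline that produces the 496 selected features is not reproduced here.

```python
# Sketch of the final modelling step only, on synthetic stand-in features.
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n_pcnsl, n_gbm, n_feat = 20, 44, 496        # cross-validation cohort sizes above
X = np.vstack([rng.normal(0.4, 1, (n_pcnsl, n_feat)),
               rng.normal(-0.4, 1, (n_gbm, n_feat))])
y = np.array([1] * n_pcnsl + [0] * n_gbm)   # 1 = PCNSL, 0 = GBM

scores = np.empty(len(y))
for train, test in LeaveOneOut().split(X):
    clf = SVC(kernel="linear").fit(X[train], y[train])
    scores[test] = clf.decision_function(X[test])

print("LOOCV AUC:", round(roc_auc_score(y, scores), 3))
```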

  16. Cumulative query method for influenza surveillance using search engine data.

    PubMed

    Seo, Dong-Woo; Jo, Min-Woo; Sohn, Chang Hwan; Shin, Soo-Yong; Lee, JaeHo; Yu, Maengsoo; Kim, Won Young; Lim, Kyoung Soo; Lee, Sang-Il

    2014-12-16

    Internet search queries have become an important data source in syndromic surveillance systems. However, there is currently no syndromic surveillance system using Internet search query data in South Korea. The objective of this study was to examine correlations between our cumulative query method and national influenza surveillance data. Our study was based on the local search engine, Daum (approximately 25% market share), and influenza-like illness (ILI) data from the Korea Centers for Disease Control and Prevention. A quota sampling survey was conducted with 200 participants to obtain popular queries. We divided the study period into two sets: Set 1 (the 2009/10 epidemiological year for development set 1 and 2010/11 for validation set 1) and Set 2 (2010/11 for development set 2 and 2011/12 for validation set 2). Pearson's correlation coefficients were calculated between the Daum data and the ILI data for the development set. We selected the combined queries for which the correlation coefficients were .7 or higher and listed them in descending order. Then, we created a cumulative query method, with n representing the number of cumulative combined queries in descending order of the correlation coefficient. In validation set 1, 13 cumulative query methods were applied, and 8 had higher correlation coefficients (min=.916, max=.943) than that of the highest single combined query. Further, 11 of 13 cumulative query methods had an r value of ≥.7, whereas only 4 of 13 combined queries did. In validation set 2, 8 of 15 cumulative query methods showed higher correlation coefficients (min=.975, max=.987) than that of the highest single combined query. All 15 cumulative query methods had an r value of ≥.7, whereas only 6 of 15 combined queries did. The cumulative query method showed relatively higher correlation with national influenza surveillance data than the combined queries in both the development and validation sets.
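
    The core of the cumulative query method is simple to express: rank query series by their Pearson correlation with the ILI series, then correlate the running sum of the top-n series. A toy sketch with simulated weekly series (not Daum or KCDC data):

```python
# Toy version of the cumulative query method on simulated weekly series.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
weeks = 52
ili = np.abs(np.sin(np.linspace(0, np.pi, weeks))) * 10 + rng.normal(0, 0.5, weeks)

# five simulated query-volume series with varying signal-to-noise
queries = np.array([ili * s + rng.normal(0, n, weeks)
                    for s, n in [(1.0, 1), (0.8, 2), (0.6, 3), (0.4, 4), (0.2, 5)]])

r_single = np.array([pearsonr(q, ili)[0] for q in queries])
order = np.argsort(r_single)[::-1]          # descending correlation

for n in range(1, len(queries) + 1):
    cumulative = queries[order[:n]].sum(axis=0)
    r, _ = pearsonr(cumulative, ili)
    print(f"top-{n} cumulative query method: r = {r:.3f}")
```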

  17. Accurate prediction of secreted substrates and identification of a conserved putative secretion signal for type III secretion systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samudrala, Ram; Heffron, Fred; McDermott, Jason E.

    2009-04-24

    The type III secretion system is an essential component for virulence in many Gram-negative bacteria. Though components of the secretion system apparatus are conserved, its substrates, effector proteins, are not. We have used a machine learning approach to identify new secreted effectors. The method integrates evolutionary measures, such as the pattern of homologs in a range of other organisms, and sequence-based features, such as G+C content, amino acid composition and the N-terminal 30 residues of the protein sequence. The method was trained on known effectors from Salmonella typhimurium and validated on a corresponding set of effectors from Pseudomonas syringae, after eliminating effectors with detectable sequence similarity. The method was able to identify all of the known effectors in P. syringae with a specificity of 84% and sensitivity of 82%. The reciprocal validation, training on P. syringae and validating on S. typhimurium, gave similar results with a specificity of 86% when the sensitivity level was 87%. These results show that type III effectors in disparate organisms share common features. We found that maximal performance is attained by including an N-terminal sequence of only 30 residues, which agrees with previous studies indicating that this region contains the secretion signal. We then used the method to define the most important residues in this putative secretion signal. Finally, we present novel predictions of secreted effectors in S. typhimurium, some of which have been experimentally validated, and apply the method to predict secreted effectors in the genetically intractable human pathogen Chlamydia trachomatis. This approach is a novel and effective way to identify secreted effectors in a broad range of pathogenic bacteria for further experimental characterization and provides insight into the nature of the type III secretion signal.
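
    Two of the sequence-based features named above (amino acid composition and the N-terminal 30 residues) are easy to sketch; the evolutionary features and the classifier itself are omitted, and the example sequence is arbitrary.

```python
# Hedged sketch of two of the sequence-based features named above.
from collections import Counter

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def effector_features(protein_seq: str, n_term: int = 30) -> list[float]:
    """Amino acid composition of the full protein plus, separately,
    of the first n_term residues (the putative secretion signal region)."""
    def composition(seq: str) -> list[float]:
        counts = Counter(seq)
        return [counts.get(aa, 0) / max(len(seq), 1) for aa in AMINO_ACIDS]
    return composition(protein_seq) + composition(protein_seq[:n_term])

# toy sequence; real input would be a Salmonella or Pseudomonas protein
print(len(effector_features("MSKITLSPQNFRIQKQETTLLKEKSTEKNSLAKSILAVKNHFIELRSKLSE")))
```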

  18. Validation of electronic systems to collect patient-reported outcome (PRO) data-recommendations for clinical trial teams: report of the ISPOR ePRO systems validation good research practices task force.

    PubMed

    Zbrozek, Arthur; Hebert, Joy; Gogates, Gregory; Thorell, Rod; Dell, Christopher; Molsen, Elizabeth; Craig, Gretchen; Grice, Kenneth; Kern, Scottie; Hines, Sheldon

    2013-06-01

    Outcomes research literature has many examples of high-quality, reliable patient-reported outcome (PRO) data entered directly by electronic means (ePRO), compared to data entered from original results on paper. Clinical trial managers are increasingly using ePRO data collection for PRO-based end points. Regulatory review dictates the rules to follow with ePRO data collection for medical label claims. A critical component for regulatory compliance is evidence of the validation of these electronic data collection systems. Validation of electronic systems is a process, rather than a focused activity that finishes at a single point in time. Eight steps need to be described and undertaken to qualify the validation of the data collection software in its target environment: requirements definition, design, coding, testing, tracing, user acceptance testing, installation and configuration, and decommissioning. These elements are consistent with recent regulatory guidance for systems validation. This report was written to explain how the validation process works for sponsors, trial teams, and other users of electronic data collection devices responsible for verifying the quality of the data entered into relational databases from such devices. It is a guide on the requirements and documentation needed from a data collection systems provider to demonstrate systems validation. It is a practical source of information for study teams to ensure that ePRO providers are using system validation and implementation processes that will ensure the systems and services operate reliably when in practical use; produce accurate and complete data and data files; support management control; and comply with any existing regulations. Furthermore, this short report will increase user understanding of the requirements for a technology review, leading to more informed and balanced recommendations or decisions on electronic data collection methods. Copyright © 2013 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  19. Fourth NASA Langley Formal Methods Workshop

    NASA Technical Reports Server (NTRS)

    Holloway, C. Michael (Compiler); Hayhurst, Kelly J. (Compiler)

    1997-01-01

    This publication consists of papers presented at NASA Langley Research Center's fourth workshop on the application of formal methods to the design and verification of life-critical systems. Topics considered include: proving properties of accidents; modeling and validating SAFER in VDM-SL; requirements analysis of real-time control systems using PVS; a tabular language for system design; and automated deductive verification of parallel systems. Also included is a fundamental hardware design in PVS.

  20. A critique of Lilienfeld et al.'s (2000) "The scientific status of projective techniques".

    PubMed

    Hibbard, Stephen

    2003-06-01

    Lilienfeld, Wood, and Garb (2000) published a largely negative critique of the validity and reliability of projective methods, concentrating on the Comprehensive System for the Rorschach (Exner, 1993), 3 systems for coding the Thematic Apperception Test (TAT; Murray, 1943) cards, and human figure drawings. This article is an effort to document and correct what I perceive as errors of omission and commission in the Lilienfeld et al. article. When projective measures are viewed in the light of these corrections, the evidence for the validity and clinical usefulness of the Rorschach and TAT methods is more robust than Lilienfeld et al. represented.

  1. Verifying Data Integrity of Electronically Scanned Pressure Systems at the NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Panek, Joseph W.

    2001-01-01

    The proper operation of the Electronically Scanned Pressure (ESP) System is critical to accomplishing the following goals: acquisition of highly accurate pressure data for the development of aerospace and commercial aviation systems, and continuous confirmation of data quality to avoid costly, unplanned, repeat wind tunnel or turbine testing. Standard automated setup and checkout routines are necessary to accomplish these goals. Data verification and integrity checks occur at three distinct stages: pretest pressure tubing and system checkouts, daily system validation, and in-test confirmation of critical system parameters. This paper will give an overview of the existing hardware, software, and methods used to validate data integrity.

  2. Translating expert system rules into Ada code with validation and verification

    NASA Technical Reports Server (NTRS)

    Becker, Lee; Duckworth, R. James; Green, Peter; Michalson, Bill; Gosselin, Dave; Nainani, Krishan; Pease, Adam

    1991-01-01

    The purpose of this ongoing research and development program is to develop software tools which enable the rapid development, upgrading, and maintenance of embedded real-time artificial intelligence systems. The goals of this phase of the research were to investigate the feasibility of developing software tools which automatically translate expert system rules into Ada code and to develop methods for performing validation and verification testing of the resultant expert system. A prototype system was demonstrated which automatically translated rules from an Air Force expert system and detected errors in the execution of the resultant system. The method and prototype tools for converting AI representations into Ada code, by converting the rules into Ada code modules and then linking them with an Activation Framework based run-time environment to form an executable load module, are discussed. This method is based upon the use of Evidence Flow Graphs, which are a data flow representation for intelligent systems. The development of prototype test generation and evaluation software which was used to test the resultant code is discussed. This testing was performed automatically using Monte Carlo techniques based upon a constraint-based description of the required performance for the system.

  3. The development and validation of a Real Time Location System to reliably monitor everyday activities in natural contexts.

    PubMed

    Judah, Gaby; de Witt Huberts, Jessie; Drassal, Allan; Aunger, Robert

    2017-01-01

    The accurate measurement of behaviour is vitally important to many disciplines and practitioners of various kinds. While different methods have been used (such as observation, diaries, and questionnaires), none are able to accurately monitor behaviour over the long term in the natural context of people's own lives. The aim of this work was therefore to develop and test a reliable system for unobtrusively monitoring various behaviours of multiple individuals within the same household over a period of several months. A commercial Real Time Location System was adapted to meet these requirements and subsequently validated in three households by monitoring various bathroom behaviours. The results indicate that the system is robust, can monitor behaviours over the long term in different households, and can reliably distinguish between individuals. Precision rates were high and consistent. Recall rates were less consistent across households and behaviours, although recall rates improved considerably with practice at set-up of the system. The achieved precision and recall rates were comparable to the rates observed in more controlled environments using more valid methods of ground truthing. These initial findings indicate that the system is a valuable, flexible and robust system for monitoring behaviour in its natural environment that would allow new research questions to be addressed.
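
    Precision and recall here reduce to simple ratios over matched events. A minimal sketch with invented event counts:

```python
# Minimal precision/recall computation of the kind used to validate the
# location system against ground truth; the event counts are invented.
def precision_recall(true_pos: int, false_pos: int, false_neg: int):
    precision = true_pos / (true_pos + false_pos)   # detected events that were real
    recall = true_pos / (true_pos + false_neg)      # real events that were detected
    return precision, recall

# e.g. 92 correctly detected bathroom visits, 4 spurious, 11 missed
p, r = precision_recall(92, 4, 11)
print(f"precision = {p:.2f}, recall = {r:.2f}")
```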

  4. Common Criteria Related Security Design Patterns—Validation on the Intelligent Sensor Example Designed for Mine Environment

    PubMed Central

    Bialas, Andrzej

    2010-01-01

    The paper discusses the security issues of intelligent sensors that are able to measure and process data and communicate with other information technology (IT) devices or systems. Such sensors are often used in high-risk applications. To improve their robustness, the sensor systems should be developed in a restricted way to provide them with assurance. One such assurance creation methodology is the Common Criteria (ISO/IEC 15408), used for IT products and systems. The contribution of the paper is a Common Criteria compliant and pattern-based method for intelligent sensor security development. The paper concisely presents this method and its evaluation for a sensor detecting methane in a mine, focusing on the definition and solution of the intelligent sensor's security problem. The aim of the validation is to evaluate and improve the introduced method. PMID:22399888

  5. Moment analysis method as applied to the 2S --> 2P transition in cryogenic alkali metal/rare gas matrices.

    PubMed

    Terrill Vosbein, Heidi A; Boatz, Jerry A; Kenney, John W

    2005-12-22

    The moment analysis method (MA) has been tested for the case of 2S --> 2P ([core]ns1 --> [core]np1) transitions of alkali metal atoms (M) doped into cryogenic rare gas (Rg) matrices using theoretically validated simulations. Theoretical/computational M/Rg system models are constructed with precisely defined parameters that closely mimic known M/Rg systems. Monte Carlo (MC) techniques are then employed to generate simulated absorption and magnetic circular dichroism (MCD) spectra of the 2S --> 2P M/Rg transition to which the MA method can be applied with the goal of seeing how effective the MA method is in re-extracting the M/Rg system parameters from these known simulated systems. The MA method is summarized in general, and an assessment is made of the use of the MA method in the rigid shift approximation typically used to evaluate M/Rg systems. The MC-MCD simulation technique is summarized, and validating evidence is presented. The simulation results and the assumptions used in applying MA to M/Rg systems are evaluated. The simulation results on Na/Ar demonstrate that the MA method does successfully re-extract the 2P spin-orbit coupling constant and Landé g-factor values initially used to build the simulations. However, assigning physical significance to the cubic and noncubic Jahn-Teller (JT) vibrational mode parameters in cryogenic M/Rg systems is not supported.

  6. 75 FR 57027 - National Toxicology Program (NTP); NTP Interagency Center for the Evaluation of Alternative...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-17

    ... for the Evaluation of Alternative Toxicological Methods (NICEATM); Availability of Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM) Test Method Evaluation Reports: In Vitro Ocular Safety Testing Methods and Strategies, and Routine Use of Topical Anesthetics, Systemic...

  7. A method of measuring three-dimensional scapular attitudes using the optotrak probing system.

    PubMed

    Hébert, L J; Moffet, H; McFadyen, B J; St-Vincent, G

    2000-01-01

    To develop a method to obtain accurate three-dimensional scapular attitudes and to assess their concurrent validity and reliability. In this methodological study, the three-dimensional scapular attitudes were calculated in degrees, using a rotation matrix (cyclic Cardanic sequence), from spatial coordinates obtained by probing three noncollinear landmarks, first on an anatomical model and second on a healthy subject. Although abnormal movement of the scapula is related to shoulder impingement syndrome, it is not clearly understood whether or not scapular motion impairment is a predisposing factor. Characterization of three-dimensional scapular attitudes in planes and at joint angles for which sub-acromial impingement is more likely to occur is not known. The Optotrak probing system was used. An anatomical model of the scapula was built and allowed us to impose scapular attitudes of known direction and magnitude. A local coordinate reference system was defined with three noncollinear anatomical landmarks to assess the accuracy and concurrent validity of the probing method against fixed markers. Axial rotation angles were calculated from a rotation matrix using a cyclic Cardanic sequence of rotations. The same three noncollinear body landmarks were digitized on one healthy subject, and the three-dimensional scapular attitudes obtained were compared between sessions in order to assess reliability. The measure of three-dimensional scapular attitudes calculated from data using the Optotrak probing system was accurate, with means of the differences between imposed and calculated rotation angles ranging from 1.5 degrees to 4.2 degrees. The greatest variations were observed around the third axis of the Cardanic sequence, associated with posterior-anterior transverse rotations. The mean difference between the Optotrak probing system method and fixed markers was 1.73 degrees, showing good concurrent validity. Differences between the two methods were generally very low for one- and two-direction displacements, and the largest discrepancies were observed for imposed displacements combining movement about all three axes. The between-sessions variation of three-dimensional scapular attitudes was less than 10% for most of the arm positions adopted by a healthy subject, suggesting good reliability. The Optotrak probing system used with a standardized protocol leads to accurate, valid and reliable measures of scapular attitudes. Although abnormal range of motion of the scapula is often related to shoulder pathologies, reliable outcome measures to quantify three-dimensional scapular motion on subjects are not available. It is important to establish a standardized protocol to characterize three-dimensional scapular motion on subjects using a method for which the accuracy and validity are known. The method used in the present study has provided such a protocol and will now allow verification of the extent to which scapular motion impairment is linked to the development of specific shoulder pathologies.
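
    The rotation-matrix step can be illustrated as follows: build an orthonormal local frame from three digitized noncollinear landmarks, then extract rotation angles from the relative rotation between two probings. The study used a cyclic Cardanic sequence; for simplicity the common Z-Y-X extraction is shown instead, and the landmark coordinates are invented.

```python
# Illustrative sketch: local frame from three landmarks, then rotation
# angles between two probings. Landmark coordinates are invented.
import numpy as np

def local_frame(p1, p2, p3):
    """Orthonormal frame (columns = axes) from three noncollinear points."""
    x = (p2 - p1) / np.linalg.norm(p2 - p1)
    z = np.cross(x, p3 - p1)
    z /= np.linalg.norm(z)
    y = np.cross(z, x)
    return np.column_stack([x, y, z])

def zyx_angles(R):
    """Cardan angles (deg) for the Z-Y-X sequence of rotation matrix R."""
    beta = -np.arcsin(R[2, 0])
    alpha = np.arctan2(R[1, 0], R[0, 0])
    gamma = np.arctan2(R[2, 1], R[2, 2])
    return np.degrees([alpha, beta, gamma])

ref = local_frame(np.array([0., 0, 0]), np.array([1., 0, 0]), np.array([0., 1, 0]))
cur = local_frame(np.array([0., 0, 0]), np.array([0.98, 0.17, 0]),
                  np.array([-0.15, 0.97, 0.05]))
print(zyx_angles(ref.T @ cur))   # attitude change between the two probings
```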

  8. Complex multidisciplinary system composition for aerospace vehicle conceptual design

    NASA Astrophysics Data System (ADS)

    Gonzalez, Lex

    Although there exists a vast amount of work concerning the analysis, design, and integration of aerospace vehicle systems, there is no standard for how this data and knowledge should be combined in order to create a synthesis system. Each institution creating a synthesis system has in-house vehicle and hardware components they are attempting to model and proprietary methods with which to model them. As a result, synthesis systems begin as one-off creations meant to answer a specific problem. As the scope of the synthesis system grows to encompass more and more problems, so does its size and complexity; in order for a single synthesis system to answer multiple questions, the number of methods and method interfaces must increase. As a means to curtail the requirement that increasing an aircraft synthesis system's capability leads to an increase in its size and complexity, this research effort focuses on the idea that each problem in aerospace requires its own analysis framework. By focusing on the creation of a methodology which centers on matching an analysis framework to the problem being solved, the complexity of the analysis framework is decoupled from the complexity of the system that creates it. The derived methodology allows for the composition of complex multi-disciplinary systems (CMDS) through the automatic creation and implementation of system and disciplinary method interfaces. The CMDS Composition process follows a four-step methodology meant to take a problem definition and progress towards the creation of an analysis framework meant to answer said problem. The unique implementation of the CMDS Composition process takes user-selected disciplinary analysis methods and automatically integrates them together in order to create a syntactically composable analysis framework. As a means of assessing the validity of the CMDS Composition process, a prototype system (AVDDBMS) has been developed. AVDDBMS has been used to model the Generic Hypersonic Vehicle (GHV), an open-source family of hypersonic vehicles originating from the Air Force Research Laboratory. AVDDBMS has been applied in three different ways in order to assess its validity: verification using GHV disciplinary data, validation using selected disciplinary analysis methods, and application of the CMDS Composition process to assess the design solution space for the GHV hardware. The research demonstrates the holistic effect that the selection of individual disciplinary analysis methods has on the structure and integration of the analysis framework.

  9. Development and Validation of High Performance Liquid Chromatography Method for Determination Atorvastatin in Tablet

    NASA Astrophysics Data System (ADS)

    Yugatama, A.; Rohmani, S.; Dewangga, A.

    2018-03-01

    Atorvastatin is the primary choice for dyslipidemia treatment. Due to the patent expiration of atorvastatin, the pharmaceutical industry makes copies of the drug. Therefore, methods for tablet quality tests involving the atorvastatin content of tablets need to be developed. The purpose of this research was to develop and validate a simple HPLC analytical method for atorvastatin tablets. The HPLC system used in this experiment consisted of a Cosmosil C18 column (150 x 4.6 mm, 5 µm) as the reversed-phase stationary phase, a mixture of methanol-water at pH 3 (80:20 v/v) as the mobile phase, a flow rate of 1 mL/min, and UV detection at a wavelength of 245 nm. The validation parameters included selectivity, linearity, accuracy, precision, limit of detection (LOD), and limit of quantitation (LOQ). The results of this study indicate that the developed method showed good selectivity, linearity, accuracy, precision, LOD, and LOQ for the analysis of atorvastatin tablet content. The LOD and LOQ were 0.2 and 0.7 ng/mL, and the linearity range was 20-120 ng/mL.
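
    Two of the listed validation parameters, accuracy (percent recovery) and precision (relative standard deviation), reduce to short calculations; the replicate results below are invented, and the acceptance limits shown are typical rather than taken from the paper.

```python
# Sketch of accuracy (% recovery) and precision (%RSD) calculations;
# replicate measurements are invented.
import numpy as np

spiked = 80.0                                            # nominal conc., ng/mL
found = np.array([79.2, 80.6, 81.1, 78.8, 80.3, 79.9])   # replicate results

recovery = found.mean() / spiked * 100          # accuracy, %
rsd = found.std(ddof=1) / found.mean() * 100    # precision, %RSD

print(f"mean recovery = {recovery:.1f}%  (typical acceptance: 98-102%)")
print(f"precision = {rsd:.2f}% RSD       (typical acceptance: <= 2%)")
```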

  10. Examining construct and predictive validity of the Health-IT Usability Evaluation Scale: confirmatory factor analysis and structural equation modeling results

    PubMed Central

    Yen, Po-Yin; Sousa, Karen H; Bakken, Suzanne

    2014-01-01

    Background In a previous study, we developed the Health Information Technology Usability Evaluation Scale (Health-ITUES), which is designed to support customization at the item level. Such customization matches the specific tasks/expectations of a health IT system while retaining comparability at the construct level, and provides evidence of its factorial validity and internal consistency reliability through exploratory factor analysis. Objective In this study, we advanced the development of Health-ITUES to examine its construct validity and predictive validity. Methods The health IT system studied was a web-based communication system that supported nurse staffing and scheduling. Using Health-ITUES, we conducted a cross-sectional study to evaluate users’ perception toward the web-based communication system after system implementation. We examined Health-ITUES's construct validity through first and second order confirmatory factor analysis (CFA), and its predictive validity via structural equation modeling (SEM). Results The sample comprised 541 staff nurses in two healthcare organizations. The CFA (n=165) showed that a general usability factor accounted for 78.1%, 93.4%, 51.0%, and 39.9% of the explained variance in ‘Quality of Work Life’, ‘Perceived Usefulness’, ‘Perceived Ease of Use’, and ‘User Control’, respectively. The SEM (n=541) supported the predictive validity of Health-ITUES, explaining 64% of the variance in intention for system use. Conclusions The results of CFA and SEM provide additional evidence for the construct and predictive validity of Health-ITUES. The customizability of Health-ITUES has the potential to support comparisons at the construct level, while allowing variation at the item level. We also illustrate application of Health-ITUES across stages of system development. PMID:24567081

  11. An approach to operating system testing

    NASA Technical Reports Server (NTRS)

    Sum, R. N., Jr.; Campbell, R. H.; Kubitz, W. J.

    1984-01-01

    To ensure the reliability and performance of a new system, it must be verified or validated in some manner. Currently, testing is the only reasonable technique available for doing this. Part of this testing process is the high-level system test. System testing is considered with respect to operating systems, and in particular UNIX. This consideration results in the development and presentation of a good method for performing the system test. The method includes derivations from the system specifications and ideas for management of the system testing project. Results of applying the method to the IBM System/9000 XENIX operating system test and the development of a UNIX test suite are presented.

  12. Validation of a Monte Carlo code system for grid evaluation with interference effect on Rayleigh scattering

    NASA Astrophysics Data System (ADS)

    Zhou, Abel; White, Graeme L.; Davidson, Rob

    2018-02-01

    Anti-scatter grids are commonly used in x-ray imaging systems to reduce scatter radiation reaching the image receptor. Anti-scatter grid performance and validation can be simulated through use of Monte Carlo (MC) methods. Our recently reported work has modified existing MC codes, resulting in improved performance when simulating x-ray imaging. The aim of this work is to validate the transmission of x-ray photons in grids from the recently reported new MC codes against experimental results and results previously reported in other literature. The results of this work show that the scatter-to-primary ratio (SPR) and the transmissions of primary (Tp), scatter (Ts), and total (Tt) radiation determined using this new MC code system have strong agreement with the experimental results and the results reported in the literature. Tp, Ts, Tt, and SPR determined in this new MC simulation code system are valid. These results also show that the interference effect on Rayleigh scattering should not be neglected in the evaluation of both mammographic and general grids. Our new MC simulation code system has been shown to be valid and can be used for analysing and evaluating the designs of grids.
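
    The grid figures of merit are simple ratios of photon tallies with and without the grid. A sketch with invented counts:

```python
# Arithmetic behind the grid figures of merit named above, with invented
# photon tallies. Tp, Ts and Tt are ratios of radiation reaching the
# receptor with and without the grid.
primary_no_grid, scatter_no_grid = 1.00e6, 2.50e6     # simulated photon counts
primary_grid, scatter_grid = 0.70e6, 0.25e6

Tp = primary_grid / primary_no_grid                   # primary transmission
Ts = scatter_grid / scatter_no_grid                   # scatter transmission
Tt = (primary_grid + scatter_grid) / (primary_no_grid + scatter_no_grid)
spr_grid = scatter_grid / primary_grid                # SPR with the grid

print(f"Tp = {Tp:.2f}, Ts = {Ts:.2f}, Tt = {Tt:.3f}, SPR = {spr_grid:.2f}")
```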

  13. Design, development and method validation of a novel multi-resonance microwave sensor for moisture measurement.

    PubMed

    Peters, Johanna; Taute, Wolfgang; Bartscher, Kathrin; Döscher, Claas; Höft, Michael; Knöchel, Reinhard; Breitkreutz, Jörg

    2017-04-08

    Microwave sensor systems using resonance technology at a single resonance in the range of 2-3 GHz have been shown to be a rapid and reliable tool for moisture determination in solid materials, including pharmaceutical granules. So far, their application is limited to lower moisture ranges, or limitations above certain moisture contents had to be accepted. The aim of the present study was to develop a novel multi-resonance sensor system in order to expand the measurement range. Therefore, a novel sensor using additional resonances over a wide frequency band was designed and used to investigate inherent limitations of first-generation sensor systems and material-related limits. Using granule samples with different moisture contents, an experimental protocol for calibration and validation of the method was established. Pursuant to this protocol, a multiple linear regression (MLR) prediction model built by correlating microwave moisture values to the moisture determined by Karl Fischer titration was chosen and rated using conventional criteria such as the coefficient of determination (R2) and the root mean square error of calibration (RMSEC). Using different operators, different analysis dates and different ambient conditions, the method was fully validated following the guidance of ICH Q2(R1). The study clearly identified explanations for the measurement uncertainties of first-generation sensor systems, which confirmed the approach of overcoming these by using additional resonances. The established prediction model could be validated in the range of 7.6-19.6%, demonstrating its fitness for its future purpose, moisture content determination during wet granulation. Copyright © 2017 Elsevier B.V. All rights reserved.
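
    The MLR step can be sketched directly: regress the Karl Fischer reference moisture on resonance-derived features and report R2 and RMSEC. The features below are simulated; only the 7.6-19.6% validated range comes from the abstract.

```python
# Hedged sketch of the MLR calibration step on simulated features.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(3)
moisture = rng.uniform(7.6, 19.6, 40)            # % w/w, reference (Karl Fischer)
# two resonance-derived features per sample (e.g. frequency shift, bandwidth;
# hypothetical feature names)
X = np.column_stack([0.8 * moisture + rng.normal(0, 0.3, 40),
                     0.1 * moisture + rng.normal(0, 0.2, 40)])

mlr = LinearRegression().fit(X, moisture)
pred = mlr.predict(X)
rmsec = np.sqrt(mean_squared_error(moisture, pred))
print(f"R2 = {r2_score(moisture, pred):.3f}, RMSEC = {rmsec:.2f} % moisture")
```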

  14. Development and validation of an automated delirium risk assessment system (Auto-DelRAS) implemented in the electronic health record system.

    PubMed

    Moon, Kyoung-Ja; Jin, Yinji; Jin, Taixian; Lee, Sun-Mi

    2018-01-01

    A key component of delirium management is prevention and early detection. To develop an automated delirium risk assessment system (Auto-DelRAS) that automatically alerts health care providers to an intensive care unit (ICU) patient's delirium risk based only on data collected in an electronic health record (EHR) system, and to evaluate the clinical validity of this system. Cohort and system development designs were used. Medical and surgical ICUs in two university hospitals in Seoul, Korea. A total of 3284 patients for the development of Auto-DelRAS, 325 for external validation, and 694 for validation after clinical application. A total of 4211 data items were extracted from the EHR system, and delirium was measured using the CAM-ICU (Confusion Assessment Method for the Intensive Care Unit). Potential predictors were selected and a logistic regression model was established to create a delirium risk scoring algorithm for the Auto-DelRAS. The Auto-DelRAS was evaluated at three months and one year after its application to clinical practice to establish the predictive validity of the system. Eleven predictors were finally included in the logistic regression model. The results of the Auto-DelRAS risk assessment were shown as high/moderate/low risk on a Kardex screen. The predictive validity, analyzed one year after the clinical application of Auto-DelRAS, showed a sensitivity of 0.88, specificity of 0.72, positive predictive value of 0.53, negative predictive value of 0.94, and a Youden index of 0.59. A relatively high level of predictive validity was maintained with the Auto-DelRAS system, even one year after it was applied to clinical practice. Copyright © 2017. Published by Elsevier Ltd.
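
    The reported validation metrics all follow from a single 2x2 confusion matrix (note that the Youden index is sensitivity + specificity - 1). A quick check with invented counts chosen to roughly reproduce the figures above:

```python
# Screening metrics from a 2x2 confusion matrix; counts are invented and
# only chosen to roughly match the values reported in the abstract.
def screening_metrics(tp, fp, fn, tn):
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sens, spec, ppv, npv, sens + spec - 1   # last value: Youden index

sens, spec, ppv, npv, youden = screening_metrics(tp=130, fp=115, fn=18, tn=296)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} "
      f"PPV={ppv:.2f} NPV={npv:.2f} Youden={youden:.2f}")
```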

  15. LAnd surface remote sensing Products VAlidation System (LAPVAS) and its preliminary application

    NASA Astrophysics Data System (ADS)

    Lin, Xingwen; Wen, Jianguang; Tang, Yong; Ma, Mingguo; Dou, Baocheng; Wu, Xiaodan; Meng, Lumin

    2014-11-01

    The long-term record of remote sensing products describes land surface parameters and their spatial and temporal changes, and is widely used to support regional and global scientific research. Remote sensing products derived from different sensors and different algorithms must be validated to ensure high product quality. Investigation of remote sensing product validation shows that it is a complex process, involving both the quality requirements on in-situ data and the methods of precision assessment. A comprehensive validation requires long time series and multiple land surface types. A system named the LAnd surface remote sensing Products VAlidation System (LAPVAS) is therefore designed in this paper to assess the uncertainty of remote sensing products, based on large amounts of in-situ data and established validation techniques. The designed validation system platform consists of three parts: validation databases, a precision analysis subsystem, and the internal and external interfaces of the system. These three parts are built from essential service modules, such as Data-Read, Data-Insert, Data-Associate, Precision-Analysis, and Scale-Change service modules. To run the validation system platform, users can order these service modules and choreograph them interactively, and then complete the validation tasks for remote sensing products (such as LAI, albedo, VI, etc.). A service-oriented architecture (SOA) is taken as the framework of this system. The benefit of this architecture is that the service modules remain independent of any development environment through standards such as the Web Service Description Language (WSDL). C++ and Java are used as the primary programming languages to create the service modules. One of the key land surface parameters, albedo, is selected as an example of the system's application. It is illustrated that LAPVAS performs well in implementing land surface remote sensing product validation.

  16. Evaluation of biologic occupational risk control practices: quality indicators development and validation.

    PubMed

    Takahashi, Renata Ferreira; Gryschek, Anna Luíza F P L; Izumi Nichiata, Lúcia Yasuko; Lacerda, Rúbia Aparecida; Ciosak, Suely Itsuko; Gir, Elucir; Padoveze, Maria Clara

    2010-05-01

    There is growing demand for the adoption of qualification systems for health care practices. This study is aimed at describing the development and validation of indicators for evaluation of biologic occupational risk control programs. The study involved 3 stages: (1) setting up a research team, (2) development of indicators, and (3) validation of the indicators by a team of specialists recruited to validate each attribute of the developed indicators. The content validation method was used for the validation, and a psychometric scale was developed for the specialists' assessment. A consensus technique was used, and every attribute that obtained a Content Validity Index of at least 0.75 was approved. Eight indicators were developed for the evaluation of the biologic occupational risk prevention program, with emphasis on accidents caused by sharp instruments and occupational tuberculosis prevention. The indicators included evaluation of the structure, process, and results at the prevention and biologic risk control levels. The majority of indicators achieved a favorable consensus regarding all validated attributes. The developed indicators were considered validated, and the method used for construction and validation proved to be effective. Copyright (c) 2010 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Mosby, Inc. All rights reserved.
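
    An item-level Content Validity Index of this kind is typically computed as the proportion of specialists rating an attribute relevant (3 or 4 on a 4-point scale), approved here at the study's 0.75 cut-off. A minimal sketch with invented ratings:

```python
# Item-level content validity index (CVI) sketch; ratings are invented.
def item_cvi(ratings: list[int]) -> float:
    """Proportion of specialists rating the item 3 or 4 on a 4-point scale."""
    return sum(r >= 3 for r in ratings) / len(ratings)

panel = [4, 3, 4, 2, 4, 3, 4, 4]      # eight hypothetical specialists
cvi = item_cvi(panel)
print(f"CVI = {cvi:.2f} ->", "approved" if cvi >= 0.75 else "revise item")
```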

  17. A Comprehensive, Multi-modal Evaluation of the Assessment System of an Undergraduate Research Methodology Course: Translating Theory into Practice

    PubMed Central

    Mohammad Abdulghani, Hamza; G. Ponnamperuma, Gominda; Ahmad, Farah; Amin, Zubair

    2014-01-01

    Objective: To evaluate the assessment system of the 'Research Methodology Course' using utility criteria (i.e. validity, reliability, acceptability, educational impact, and cost-effectiveness). This study demonstrates a comprehensive evaluation of an assessment system and suggests a framework for similar courses. Methods: Qualitative and quantitative methods were used for the evaluation of the course assessment components (50 MCQs, 3 short answer questions (SAQs), and a research project) against the utility criteria. Results from multiple evaluation methods for all the assessment components were collected and interpreted together to arrive at holistic judgments, rather than judgments based on individual methods or individual assessments. Results: Face validity, evaluated using a self-administered questionnaire (response rate 88.7%), disclosed that the students perceived an imbalance in the contents covered by the assessment. This was confirmed by the assessment blueprint. Construct validity was affected by the low correlation between MCQ and SAQ scores (r=0.326). There was a higher correlation between the project and MCQ (r=0.466)/SAQ (r=0.463) scores. Construct validity was also affected by the presence of recall-type MCQs (70%; 35/50), item construction flaws, and non-functioning distractors. High discrimination indices (>0.35) were found in MCQs with moderate difficulty indices (0.3-0.7). Reliability of the MCQs was 0.75, which could be improved to 0.8 by increasing the number of MCQs to at least 70. A positive educational impact was found in the form of the research project assessment driving students to present/publish their work in conferences/peer-reviewed journals. The cost per student to complete the course was US$164.50. Conclusions: The multi-modal evaluation of an assessment system is feasible and provides thorough and diagnostic information. The utility of the assessment system could be further improved by modifying the psychometrically inappropriate assessment items. PMID:24772117
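
    The claim that roughly 70 MCQs would lift reliability from 0.75 to 0.80 follows from the Spearman-Brown prophecy formula; a quick check:

```python
# Spearman-Brown prophecy formula: test length needed to reach a target
# reliability from a current reliability with k_now items.
def items_needed(rel_now: float, rel_target: float, k_now: int) -> float:
    n = (rel_target * (1 - rel_now)) / (rel_now * (1 - rel_target))
    return n * k_now

print(f"items needed: {items_needed(0.75, 0.80, 50):.1f}")   # ~66.7 -> round up
```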

  18. A hybrid method for prediction and repositioning of drug Anatomical Therapeutic Chemical classes.

    PubMed

    Chen, Lei; Lu, Jing; Zhang, Ning; Huang, Tao; Cai, Yu-Dong

    2014-04-01

    In the Anatomical Therapeutic Chemical (ATC) classification system, therapeutic drugs are divided into 14 main classes according to the organ or system on which they act and their chemical, pharmacological and therapeutic properties. This system, recommended by the World Health Organization (WHO), provides a global standard for classifying medical substances and serves as a tool for international drug utilization research to improve quality of drug use. In view of this, it is necessary to develop effective computational prediction methods to identify the ATC-class of a given drug, which thereby could facilitate further analysis of this system. In this study, we initiated an attempt to develop a prediction method and to gain insights from it by utilizing ontology information of drug compounds. Since only about one-fourth of drugs in the ATC classification system have ontology information, a hybrid prediction method combining the ontology information, chemical interaction information and chemical structure information of drug compounds was proposed for the prediction of drug ATC-classes. As a result, by using the Jackknife test, the 1st prediction accuracies for identifying the 14 main ATC-classes in the training dataset, the internal validation dataset and the external validation dataset were 75.90%, 75.70% and 66.36%, respectively. Analysis of some samples with false-positive predictions in the internal and external validation datasets indicated that some of them may even have a relationship with the false-positive predicted ATC-class, suggesting novel uses of these drugs. It was conceivable that the proposed method could be used as an efficient tool to identify ATC-classes of novel drugs or to discover novel uses of known drugs.

  19. Direct total and free testosterone measurement by liquid chromatography tandem mass spectrometry across two different platforms.

    PubMed

    Rhea, Jeanne M; French, Deborah; Molinaro, Ross J

    2013-05-01

    To develop and validate liquid chromatography tandem mass spectrometry (LC-MS/MS) methods for the direct measurement of total and free testosterone in patient samples on two different analytical systems. API 4000 and API 5000 triple quadrupoles were used and compared; the former is reported to be 3-5 times less sensitive and was used to set the quantitation limits. Free testosterone was separated from the protein-bound fraction by equilibrium dialysis followed by derivatization. Free or total testosterone and a deuterated internal standard (d3-testosterone) were extracted by liquid-liquid extraction. The validation results were compared to two different clinical laboratories. The use of d2-testosterone was found to be unacceptable for our method. The total testosterone LC-MS/MS methods on both systems were linear over a wide concentration range of 1.5-2000 ng/dL. Free testosterone was measured directly using equilibrium dialysis coupled to LC-MS/MS and was linear over the concentration range of 2.5-2500 pg/mL. Good correlation (total testosterone, R2=0.96; free testosterone, R2=0.98) was observed between our LC-MS/MS systems and the comparator laboratory. However, differences in absolute values for both free and total testosterone measurements were observed, while a comparison to a second published LC-MS/MS method showed excellent correlation. Free and total testosterone measurements correlated well with clinical observations. To our knowledge, this is the first published validation of free and total testosterone methods across two analytical systems of different analytical sensitivities. A less sensitive system does not sacrifice analytical or clinical sensitivity to directly measure free and total testosterone in patient samples. Copyright © 2013 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  20. Full-vectorial finite element method in a cylindrical coordinate system for loss analysis of photonic wire bends

    NASA Astrophysics Data System (ADS)

    Kakihara, Kuniaki; Kono, Naoya; Saitoh, Kunimasa; Koshiba, Masanori

    2006-11-01

    This paper presents a new full-vectorial finite-element method in a local cylindrical coordinate system, to effectively analyze bending losses in photonic wires. The discretization is performed in the cross section of a three-dimensional curved waveguide, using hybrid edge/nodal elements. The solution region is truncated by anisotropic, perfectly matched layers in the cylindrical coordinate system, to deal properly with leaky modes of the waveguide. This approach is used to evaluate bending losses in silicon wire waveguides. The numerical results of the present approach are compared with results calculated with an equivalent straight waveguide approach and with reported experimental data. These comparisons together demonstrate the validity of the present approach based on the cylindrical coordinate system and also clarify the limited validity of the equivalent straight waveguide approximation.

  1. A systematic review of publications assessing reliability and validity of the Behavioral Risk Factor Surveillance System (BRFSS), 2004–2011

    PubMed Central

    2013-01-01

    Background In recent years response rates on telephone surveys have been declining. Rates for the behavioral risk factor surveillance system (BRFSS) have also declined, prompting the use of new methods of weighting and the inclusion of cell phone sampling frames. A number of scholars and researchers have conducted studies of the reliability and validity of the BRFSS estimates in the context of these changes. As the BRFSS makes changes in its methods of sampling and weighting, a review of reliability and validity studies of the BRFSS is needed. Methods In order to assess the reliability and validity of prevalence estimates taken from the BRFSS, scholarship published from 2004–2011 dealing with tests of reliability and validity of BRFSS measures was compiled and presented by topics of health risk behavior. Assessments of the quality of each publication were undertaken using a categorical rubric. Higher rankings were achieved by authors who conducted reliability tests using repeated test/retest measures, or who conducted tests using multiple samples. A similar rubric was used to rank validity assessments. Validity tests which compared the BRFSS to physical measures were ranked higher than those comparing the BRFSS to other self-reported data. Literature which undertook more sophisticated statistical comparisons was also ranked higher. Results Overall findings indicated that BRFSS prevalence rates were comparable to other national surveys which rely on self-reports, although specific differences are noted for some categories of response. BRFSS prevalence rates were less similar to surveys which utilize physical measures in addition to self-reported data. There is very little research on reliability and validity for some health topics, but a great deal of information supporting the validity of the BRFSS data for others. Conclusions Limitations of the examination of the BRFSS were due to question differences among surveys used as comparisons, as well as mode of data collection differences. As the BRFSS moves to incorporating cell phone data and changing weighting methods, a review of reliability and validity research indicated that past BRFSS landline only data were reliable and valid as measured against other surveys. New analyses and comparisons of BRFSS data which include the new methodologies and cell phone data will be needed to ascertain the impact of these changes on estimates in the future. PMID:23522349

  2. An improved adaptive weighting function method for State Estimation in Power Systems with VSC-MTDC

    NASA Astrophysics Data System (ADS)

    Zhao, Kun; Yang, Xiaonan; Lang, Yansheng; Song, Xuri; Wang, Minkun; Luo, Yadi; Wu, Lingyun; Liu, Peng

    2017-04-01

    This paper presents an effective approach for state estimation in power systems that include multi-terminal voltage source converter based high voltage direct current (VSC-MTDC) systems, called the improved adaptive weighting function method. The proposed approach is simplified: the VSC-MTDC system is solved first, followed by the AC system, because the new state estimation method only changes the weights and keeps the matrix dimension unchanged. Accurate and fast convergence of the AC/DC system can be realized by the adaptive weighting function method. This method also provides technical support for the simulation analysis and accurate regulation of AC/DC systems. Both theoretical analysis and numerical tests verify the practicability, validity, and convergence of the new method.
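
    A generic illustration of weighted-least-squares state estimation with an adaptive reweighting pass is given below; this is not the paper's VSC-MTDC formulation, and the measurement model and tuning constant are invented.

```python
# Generic weighted-least-squares state estimation with adaptive reweighting
# (weights shrunk for large residuals); not the paper's AC/DC formulation.
import numpy as np

H = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, -1.0]])   # measurement model z = Hx
z = np.array([1.02, 0.49, 0.55])                       # measurements (one noisy)
W = np.eye(3)                                          # initial equal weights

for _ in range(3):                                     # adaptive weight iterations
    x = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)      # WLS normal equations
    resid = z - H @ x
    W = np.diag(1.0 / (1.0 + (resid / 0.05) ** 2))     # down-weight outliers

print("state estimate:", np.round(x, 4))
```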

  3. Real-time validation of receiver state information in optical space-time block code systems.

    PubMed

    Alamia, John; Kurzweg, Timothy

    2014-06-15

    Free space optical interconnect (FSOI) systems are a promising solution to interconnect bottlenecks in high-speed systems. To overcome some sources of diminished FSOI performance caused by close proximity of multiple optical channels, multiple-input multiple-output (MIMO) systems implementing encoding schemes such as space-time block coding (STBC) have been developed. These schemes utilize information pertaining to the optical channel to reconstruct transmitted data. The STBC system is dependent on accurate channel state information (CSI) for optimal system performance. As a result of dynamic changes in optical channels, a system in operation will need to have updated CSI. Therefore, validation of the CSI during operation is a necessary tool to ensure FSOI systems operate efficiently. In this Letter, we demonstrate a method of validating CSI, in real time, through the use of moving averages of the maximum likelihood decoder data, and its capacity to predict the bit error rate (BER) of the system.
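
    The monitoring idea can be sketched as a windowed mean over a per-symbol decoder metric that raises an alarm when it drifts past a threshold; the metric stream and threshold below are invented.

```python
# Sketch of a moving-average CSI staleness monitor; the decoder metric
# stream and alarm threshold are invented.
import numpy as np

def csi_alarm(metric: np.ndarray, window: int = 64, threshold: float = 0.8):
    """Indices where the windowed mean of the decoder metric drops below
    threshold, suggesting the stored CSI no longer matches the channel."""
    kernel = np.ones(window) / window
    smoothed = np.convolve(metric, kernel, mode="valid")
    return np.nonzero(smoothed < threshold)[0]

rng = np.random.default_rng(4)
metric = np.concatenate([rng.normal(1.0, 0.05, 500),    # channel matches CSI
                         rng.normal(0.7, 0.05, 200)])   # channel has drifted
alarms = csi_alarm(metric)
print("first alarm at symbol:", alarms[0] if alarms.size else "none")
```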

  4. Development of Flight-Test Performance Estimation Techniques for Small Unmanned Aerial Systems

    NASA Astrophysics Data System (ADS)

    McCrink, Matthew Henry

    This dissertation provides a flight-testing framework for assessing the performance of fixed-wing, small-scale unmanned aerial systems (sUAS) by leveraging sub-system models of components unique to these vehicles. The development of the sub-system models, and their links to broader impacts on sUAS performance, is the key contribution of this work. The sub-system modeling and analysis focuses on the vehicle's propulsion, navigation and guidance, and airframe components. Quantification of the uncertainty in the vehicle's power available and control states is essential for assessing the validity of both the methods and results obtained from flight-tests. Therefore, detailed propulsion and navigation system analyses are presented to validate the flight testing methodology. Propulsion system analysis required the development of an analytic model of the propeller in order to predict the power available over a range of flight conditions. The model is based on the blade element momentum (BEM) method. Additional corrections are added to the basic model in order to capture the Reynolds-dependent scale effects unique to sUAS. The model was experimentally validated using a ground based testing apparatus. The BEM predictions and experimental analysis allow for a parameterized model relating the electrical power, measurable during flight, to the power available required for vehicle performance analysis. Navigation system details are presented with a specific focus on the sensors used for state estimation, and the resulting uncertainty in vehicle state. Uncertainty quantification is provided by detailed calibration techniques validated using quasi-static and hardware-in-the-loop (HIL) ground based testing. The HIL methods introduced use a soft real-time flight simulator to provide inertial quality data for assessing overall system performance. Using this tool, the uncertainty in vehicle state estimation based on a range of sensors, and vehicle operational environments is presented. The propulsion and navigation system models are used to evaluate flight-testing methods for evaluating fixed-wing sUAS performance. A brief airframe analysis is presented to provide a foundation for assessing the efficacy of the flight-test methods. The flight-testing presented in this work is focused on validating the aircraft drag polar, zero-lift drag coefficient, and span efficiency factor. Three methods are detailed and evaluated for estimating these design parameters. Specific focus is placed on the influence of propulsion and navigation system uncertainty on the resulting performance data. Performance estimates are used in conjunction with the propulsion model to estimate the impact sensor and measurement uncertainty on the endurance and range of a fixed-wing sUAS. Endurance and range results for a simplistic power available model are compared to the Reynolds-dependent model presented in this work. Additional parameter sensitivity analysis related to state estimation uncertainties encountered in flight-testing are presented. Results from these analyses indicate that the sub-system models introduced in this work are of first-order importance, on the order of 5-10% change in range and endurance, in assessing the performance of a fixed-wing sUAS.
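
    As a highly simplified stand-in for the propulsion link described above, measured electrical power can be mapped to power available through assumed constant motor and propeller efficiencies (the dissertation instead uses a Reynolds-corrected blade element momentum model):

```python
# Constant-efficiency stand-in for the electrical-power-to-power-available
# mapping; the efficiency values are assumptions, not from the dissertation.
def power_available(p_electrical_w: float, eta_motor: float = 0.85,
                    eta_prop: float = 0.60) -> float:
    """Power available (W) under constant motor/propeller efficiencies."""
    return p_electrical_w * eta_motor * eta_prop

print(f"power available: {power_available(180.0):.1f} W from 180 W electrical")
```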

  5. Validation of Milliflex® Quantum for Bioburden Testing of Pharmaceutical Products.

    PubMed

    Gordon, Oliver; Goverde, Marcel; Staerk, Alexandra; Roesti, David

    2017-01-01

    This article reports the validation strategy used to demonstrate that the Milliflex® Quantum yielded non-inferior results to the traditional bioburden method. It was validated according to USP <1223>, European Pharmacopoeia 5.1.6, and Parenteral Drug Association Technical Report No. 33 and comprised the validation parameters robustness, ruggedness, repeatability, specificity, limit of detection and quantification, accuracy, precision, linearity, range, and equivalence in routine operation. For the validation, a combination of pharmacopeial ATCC strains as well as a broad selection of in-house isolates was used. The in-house isolates were used in a stressed state. Results were statistically evaluated against the pharmacopeial acceptance criterion of ≥70% recovery compared to the traditional method. Post-hoc test power calculations verified the appropriateness of the sample size used to detect such a difference. Furthermore, equivalence tests verified non-inferiority of the rapid method as compared to the traditional method. In conclusion, the rapid bioburden test based on the Milliflex® Quantum was successfully validated as an alternative method to the traditional bioburden test. LAY ABSTRACT: Pharmaceutical drug products must fulfill specified quality criteria regarding their microbial content in order to ensure patient safety. Drugs that are delivered into the body via injection, infusion, or implantation must be sterile (i.e., devoid of living microorganisms). Bioburden testing measures the levels of microbes present in the bulk solution of a drug before sterilization, and thus it provides important information for manufacturing a safe product. In general, bioburden testing has to be performed using the methods described in the pharmacopoeias (membrane filtration or plate count). These methods are well established and validated regarding their effectiveness; however, the incubation time required to visually identify microbial colonies is long. Thus, alternative methods that detect microbial contamination faster will improve control over the manufacturing process and speed up product release. Before alternative methods may be used, they must undergo a side-by-side comparison with pharmacopeial methods. In this comparison, referred to as validation, it must be shown in a statistically verified manner that the effectiveness of the alternative method is at least equivalent to that of the pharmacopeial methods. Here we describe the successful validation of an alternative bioburden testing method based on fluorescent staining of growing microorganisms using the Milliflex® Quantum system by MilliporeSigma. © PDA, Inc. 2017.
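
    The pharmacopeial acceptance check itself is a one-line ratio: recovery of the alternative method must be at least 70% of the traditional count. A sketch with invented CFU counts:

```python
# One-line pharmacopeial recovery check; CFU counts are invented.
def recovery_ok(alt_cfu: float, trad_cfu: float, threshold: float = 0.70) -> bool:
    """True if the alternative method recovers >= threshold of the
    traditional plate count."""
    return alt_cfu / trad_cfu >= threshold

print(recovery_ok(alt_cfu=48, trad_cfu=55))   # 48/55 = 87% recovery -> True
```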

  6. Advanced Method of Boundary-Layer Control Based on Localized Plasma Generation

    DTIC Science & Technology

    2009-05-01

    measurements, validation of experiments, wind-tunnel testing of the microwave/plasma generation system, preliminary assessment of energy required ... and design of a microwave generator, electrodynamic and multivibrator systems for experiments in the IHM-NAU wind tunnel: MW generator and its high ... equipped with the microwave-generation and protection systems to study advanced methods of flow control (Kiev)

  7. Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    NASA Technical Reports Server (NTRS)

    Groves, Curtis E.

    2013-01-01

    Spacecraft thermal protection systems are at risk of being damaged by airflow produced from Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This proposal describes an approach to validate the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft. The research described here is absolutely cutting edge. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to the traditional "validation by test only" mentality. This method could be extended to other disciplines and has the potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in the predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. The proposed research project includes the following three objectives and methods. Objective one is to develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT and OpenFOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around Computational Fluid Dynamics predictions. The method accounts for all uncertainty terms from both numerical and input variables. Objective three is to compile a table of uncertainty parameters that could be used to estimate the error in a Computational Fluid Dynamics model of the Environmental Control System/spacecraft system. Previous studies have looked at the uncertainty in a Computational Fluid Dynamics model for a single output variable at a single point, for example the re-attachment length of a backward-facing step. To date, the author is the only person to look at the uncertainty in the entire computational domain. For the flow regime being analyzed (turbulent, three-dimensional, incompressible), the error at a single point can propagate into the solution both via flow physics and numerical methods. Calculating the uncertainty in using Computational Fluid Dynamics to accurately predict airflow speeds around encapsulated spacecraft is imperative to the success of future missions.
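
    The three-grid procedure referenced above is commonly realized as Richardson extrapolation with a grid convergence index (GCI); the sketch below shows that arithmetic on made-up solution values and is a generic illustration, not the proposal's exact implementation.

```python
import math

# Generic three-grid uncertainty sketch in the spirit of grid-convergence
# analysis (Roache's GCI); values are hypothetical, not from the proposal.
f1, f2, f3 = 10.12, 10.44, 11.20   # fine, medium, coarse solutions (e.g., airflow speed, m/s)
r = 2.0                            # constant grid refinement ratio
Fs = 1.25                          # safety factor commonly used for three-grid studies

p = math.log((f3 - f2) / (f2 - f1)) / math.log(r)   # observed order of accuracy
e21 = abs((f2 - f1) / f1)                           # relative error, fine vs. medium
gci_fine = Fs * e21 / (r**p - 1.0)                  # uncertainty band on the fine grid

print(f"observed order p = {p:.2f}")
print(f"GCI (fine grid) = {100 * gci_fine:.2f}% -> f = {f1:.2f} +/- {f1 * gci_fine:.2f}")
```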

  8. Expert system verification and validation study. Delivery 3A and 3B: Trip summaries

    NASA Technical Reports Server (NTRS)

    French, Scott

    1991-01-01

    Key results are documented from attending the 4th workshop on verification, validation, and testing. The most interesting part of the workshop was when representatives from the U.S., Japan, and Europe presented surveys of VV&T within their respective regions. Another interesting part focused on current efforts to define industry standards for artificial intelligence and how that might affect approaches to VV&T of expert systems. The next part of the workshop focused on VV&T methods of applying mathematical techniques to verification of rule bases and techniques for capturing information relating to the process of developing software. The final part focused on software tools. A summary is also presented of the EPRI conference on 'Methodologies, Tools, and Standards for Cost Effective Reliable Software Verification and Validation'. The conference was divided into discussion sessions on the following issues: development process, automated tools, software reliability, methods, standards, and cost/benefit considerations.

  9. Model-Based Verification and Validation of the SMAP Uplink Processes

    NASA Technical Reports Server (NTRS)

    Khan, M. Omair; Dubos, Gregory F.; Tirona, Joseph; Standley, Shaun

    2013-01-01

    This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based V&V development efforts.

  10. Validation of Web-Based Physical Activity Measurement Systems Using Doubly Labeled Water

    PubMed Central

    Yamaguchi, Yukio; Yamada, Yosuke; Tokushima, Satoru; Hatamoto, Yoichi; Sagayama, Hiroyuki; Kimura, Misaka; Higaki, Yasuki; Tanaka, Hiroaki

    2012-01-01

    Background Online or Web-based measurement systems have been proposed as convenient methods for collecting physical activity data. We developed two Web-based physical activity systems—the 24-hour Physical Activity Record Web (24hPAR WEB) and 7 days Recall Web (7daysRecall WEB). Objective To examine the validity of two Web-based physical activity measurement systems using the doubly labeled water (DLW) method. Methods We assessed the validity of the 24hPAR WEB and 7daysRecall WEB in 20 individuals, aged 25 to 61 years. The order of email distribution and subsequent completion of the two Web-based measurement systems was randomized. Each measurement tool was used for a week. The participants’ activity energy expenditure (AEE) and total energy expenditure (TEE) were assessed over each week using the DLW method and compared with the respective energy expenditures estimated using the Web-based systems. Results The mean AEE was 3.90 (SD 1.43) MJ estimated using the 24hPAR WEB and 3.67 (SD 1.48) MJ measured by the DLW method. The Pearson correlation for AEE between the two methods was r = .679 (P < .001). The Bland-Altman 95% limits of agreement ranged from –2.10 to 2.57 MJ between the two methods. The Pearson correlation for TEE between the two methods was r = .874 (P < .001). The mean AEE was 4.29 (SD 1.94) MJ using the 7daysRecall WEB and 3.80 (SD 1.36) MJ by the DLW method. The Pearson correlation for AEE between the two methods was r = .144 (P = .54). The Bland-Altman 95% limits of agreement ranged from –3.83 to 4.81 MJ between the two methods. The Pearson correlation for TEE between the two methods was r = .590 (P = .006). The average input times using terminal devices were 8 minutes and 10 seconds for the 24hPAR WEB and 6 minutes and 38 seconds for the 7daysRecall WEB. Conclusions Both Web-based systems were found to be effective methods for collecting physical activity data and are appropriate for use in epidemiological studies. Because the measurement accuracy of the 24hPAR WEB was moderate to high, it could be suitable for evaluating the effect of interventions on individuals as well as for examining physical activity behavior. PMID:23010345
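
    For readers unfamiliar with the agreement statistics reported above, the following minimal sketch computes a Pearson correlation and Bland-Altman 95% limits of agreement; the energy-expenditure arrays are placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

# Minimal sketch of the agreement statistics used above; arrays are made up.
aee_web = np.array([3.1, 4.2, 5.0, 2.8, 4.6, 3.9, 5.4, 3.3])  # MJ, Web system
aee_dlw = np.array([2.9, 4.0, 4.6, 3.1, 4.1, 3.8, 5.0, 3.6])  # MJ, DLW method

r, p = stats.pearsonr(aee_web, aee_dlw)

diff = aee_web - aee_dlw
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)          # half-width of the 95% limits of agreement

print(f"Pearson r = {r:.3f} (P = {p:.3f})")
print(f"Bland-Altman bias = {bias:.2f} MJ, 95% LoA = [{bias - loa:.2f}, {bias + loa:.2f}] MJ")
```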

  11. Development and validation of a rapid reverse-phase HPLC method for the determination of methotrexate from nanostructured liquid crystalline systems.

    PubMed

    Zuben, E S Von; Oliveira, A G; Chorilli, M; Scarpa, M V

    2018-03-05

    A reversed-phase liquid chromatography (RP-LC) method was successfully developed and validated for the determination of methotrexate in nanostructured liquid crystalline systems composed of polyether functional siloxane and silicone polyether copolymer. The LC method was performed on an RP C18-ODS column, Agilent Zorbax® (4.6 × 250 mm, 5 μm), maintained at room temperature, with a mobile phase consisting of a mixture of 50 mM ammonium acetate buffer (pH 6.0) and methanol (77:23, v/v) at a flow rate of 1.0 mL/min, using ultraviolet detection at 313 nm. The parameters used in the validation process were linearity, specificity, intra- and inter-day precision, accuracy, and robustness. The quantitation and detection limits yielded good results. The calibration plot was linear from 5.0 to 150.0 μg/mL (r2 = 0.9999). Methotrexate was subjected to oxidation, acid, base, and neutral degradation, photolysis, and heat as stress conditions. There were no interfering peaks at or near the retention time of methotrexate. The nanostructured liquid crystalline systems did not interfere with the analysis and the recovery was quantitative. The intra- and inter-day assay relative standard deviations were less than 0.20%. The method developed proved to be simple, sensitive, accurate, precise, and reproducible, and therefore adequate for routine analysis of methotrexate in nanostructured liquid crystalline systems.
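
    A minimal sketch of the linearity and sensitivity arithmetic typical of such validations (the ICH-style formulas LOD = 3.3σ/slope and LOQ = 10σ/slope are an assumption here, since the article does not state which expressions it used); the calibration data are invented.

```python
import numpy as np

# Sketch of calibration-curve linearity, LOD, and LOQ calculations; data are made up.
conc = np.array([5.0, 25.0, 50.0, 75.0, 100.0, 150.0])        # ug/mL standards
area = np.array([61.0, 302.0, 601.0, 905.0, 1203.0, 1801.0])  # peak areas

slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
ss_res = np.sum((area - pred) ** 2)
ss_tot = np.sum((area - area.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot

sigma = np.sqrt(ss_res / (len(conc) - 2))   # residual standard deviation
lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope

print(f"r^2 = {r2:.5f}, LOD = {lod:.2f} ug/mL, LOQ = {loq:.2f} ug/mL")
```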

  12. Development and Validation of a Rapid 13C6-Glucose Isotope Dilution UPLC-MRM Mass Spectrometry Method for Use in Determining System Accuracy and Performance of Blood Glucose Monitoring Devices

    PubMed Central

    Matsunami, Risë K.; Angelides, Kimon; Engler, David A.

    2015-01-01

    Background: There is currently considerable discussion about the accuracy of blood glucose concentrations determined by personal blood glucose monitoring systems (BGMS). To date, the FDA has allowed new BGMS to demonstrate accuracy in reference to other glucose measurement systems that use the same or similar enzymatic-based methods to determine glucose concentration. These types of reference measurement procedures are only comparative in nature and are subject to the same potential sources of measurement error and system perturbations as the device under evaluation. It would be ideal to have a completely orthogonal primary method that could serve as a true standard reference measurement procedure for establishing the accuracy of new BGMS. Methods: An isotope-dilution liquid chromatography/mass spectrometry (ID-UPLC-MRM) assay was developed using 13C6-glucose as a stable isotope analogue to specifically measure glucose concentration in human plasma, and validated for use against NIST standard reference materials and against fresh isolates of whole blood and plasma into which exogenous glucose had been spiked. Assay performance was quantified against NIST-traceable dry-weight measures for both glucose and 13C6-glucose. Results: The newly developed assay method was shown to be rapid, highly specific, sensitive, accurate, and precise for measuring plasma glucose levels. The assay displayed sufficient dynamic range and linearity to measure across the range of both normal and diabetic blood glucose levels. Assay performance was measured to within the same uncertainty levels (<1%) as the NIST definitive method for glucose measurement in human serum. Conclusions: The newly developed ID-UPLC-MRM assay can serve as a validated reference measurement procedure against which new BGMS can be assessed for glucose measurement performance. PMID:25986627

  13. Harmonics analysis of the ITER poloidal field converter based on a piecewise method

    NASA Astrophysics Data System (ADS)

    Xudong, WANG; Liuwei, XU; Peng, FU; Ji, LI; Yanan, WU

    2017-12-01

    Poloidal field (PF) converters provide controlled DC voltage and current to PF coils. The many harmonics generated by the PF converter flow into the power grid and seriously affect power systems and electric equipment. Due to the complexity of the system, the traditional integral operation in Fourier analysis is complicated and inaccurate. This paper presents a piecewise method to calculate the harmonics of the ITER PF converter. The relationship between the grid input current and the DC output current of the ITER PF converter is deduced. The grid current is decomposed into the sum of some simple functions. By calculating simple function harmonics based on the piecewise method, the harmonics of the PF converter under different operation modes are obtained. In order to examine the validity of the method, a simulation model is established based on Matlab/Simulink and a relevant experiment is implemented in the ITER PF integration test platform. Comparative results are given. The calculated results are found to be consistent with simulation and experiment. The piecewise method is proved correct and valid for calculating the system harmonics.
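
    The piecewise idea can be illustrated on an idealized waveform: write the grid current as a sum of simple segments whose Fourier integrals are analytic, and sum the per-segment contributions instead of integrating numerically. The six-pulse bridge current below is a textbook stand-in, not the ITER converter model.

```python
import numpy as np

# Piecewise harmonic sketch: each constant segment of the current waveform
# contributes analytic Fourier-coefficient terms, which are simply summed.
Id = 1.0
segments = [(np.pi / 6, 5 * np.pi / 6, +Id),      # (theta_start, theta_end, value)
            (7 * np.pi / 6, 11 * np.pi / 6, -Id)]  # idealized six-pulse line current

def harmonic(n, segments):
    # a_n, b_n of a piecewise-constant function over one 2*pi period
    a = sum(c * (np.sin(n * t2) - np.sin(n * t1)) / (n * np.pi) for t1, t2, c in segments)
    b = sum(c * (np.cos(n * t1) - np.cos(n * t2)) / (n * np.pi) for t1, t2, c in segments)
    return np.hypot(a, b)

I1 = harmonic(1, segments)
for n in range(1, 14):
    ratio = harmonic(n, segments) / I1
    if ratio > 1e-6:
        print(f"h={n:2d}: {ratio:.3f}")   # expect h = 6k +/- 1 at roughly 1/h
```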

  14. Center of pressure based segment inertial parameters validation

    PubMed Central

    Rezzoug, Nasser; Gorce, Philippe; Isableu, Brice; Venture, Gentiane

    2017-01-01

    By proposing efficient methods for Body Segment Inertial Parameter (BSIP) estimation and validating them with a force plate, it is possible to improve the inverse dynamic computations that are necessary in multiple research areas. A variety of studies have been conducted to improve BSIP estimation, but to our knowledge a real validation has never been completely successful. In this paper, we propose a validation method using both kinematic and kinetic parameters (contact forces) gathered from an optical motion capture system and a force plate, respectively. To compare BSIPs, we used the measured contact forces (force plate) as the ground truth, and reconstructed the displacements of the Center of Pressure (COP) using inverse dynamics from two different estimation techniques. Only minor differences were seen when comparing the estimated segment masses. Their influence on the COP computation, however, is large, and the results show very distinguishable patterns of the COP movements. Improving BSIP techniques is crucial, as deviations from the estimations can result in large errors. This method could be used as a tool to validate BSIP estimation techniques. An advantage of this approach is that it facilitates the comparison between BSIP estimation methods and, more specifically, it shows the accuracy of those parameters. PMID:28662090
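
    A minimal sketch of the comparison idea, assuming the usual force-plate convention with the plate origin at its surface (COPx = -My/Fz, COPy = Mx/Fz); the numbers and the "model" offsets are synthetic placeholders.

```python
import numpy as np

# Reconstruct the centre of pressure (COP) from the force-plate wrench and
# score a model-based COP against it; sign convention and data are assumptions.
Fz = np.array([700.0, 705.0, 698.0, 702.0])    # vertical force [N]
Mx = np.array([35.0, 33.0, 36.0, 34.0])        # plate moments [N m]
My = np.array([-21.0, -20.0, -22.0, -21.5])

cop_x = -My / Fz                               # COP, plate origin at surface
cop_y = Mx / Fz

# A COP trajectory predicted via inverse dynamics from some BSIP estimate
# (placeholder values standing in for the model under validation):
cop_x_model = cop_x + np.array([0.004, -0.003, 0.005, 0.002])

rmse = np.sqrt(np.mean((cop_x_model - cop_x) ** 2))
print(f"COP_x RMSE = {1000 * rmse:.1f} mm")    # smaller = better BSIP estimate
```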

  15. Certification in Structural Health Monitoring Systems

    DTIC Science & Technology

    2011-09-01

    validation [3,8]. This may be accomplished by computing the sum of squares of pure error (SSPE) and its associated squared correlation [3,8]. To compute ... these values, a cross-validation sample must be established. In general, if the SSPE is high, the model does not predict well on independent data ... plethora of cross-validation methods, some of which are more useful for certain models than others [3,8]. When possible, a disclosure of the SSPE

  16. Design and Validation of an Augmented Reality System for Laparoscopic Surgery in a Real Environment

    PubMed Central

    López-Mir, F.; Naranjo, V.; Fuertes, J. J.; Alcañiz, M.; Bueno, J.; Pareja, E.

    2013-01-01

    Purpose. This work presents the protocol carried out in the development and validation of an augmented reality system which was installed in an operating theatre to help surgeons with trocar placement during laparoscopic surgery. The purpose of this validation is to demonstrate the improvements that this system can provide to the field of medicine, particularly surgery. Method. Two experiments that were noninvasive for both the patient and the surgeon were designed. In one of these experiments the augmented reality system was used; the other was the control experiment, in which the system was not used. The type of operation selected for all cases was a cholecystectomy, due to its low degree of complexity and complications before, during, and after the surgery. The technique used in the placement of trocars was the French technique, but the results can be extrapolated to any other technique and operation. Results and Conclusion. Four clinicians and ninety-six measurements obtained from twenty-four patients (randomly assigned to each experiment) were involved in these experiments. The final results show an improvement in accuracy and variability of 33% and 63%, respectively, in comparison to traditional methods, demonstrating that the use of an augmented reality system offers advantages for trocar placement in laparoscopic surgery. PMID:24236293

  17. Fast sweeping methods for hyperbolic systems of conservation laws at steady state II

    NASA Astrophysics Data System (ADS)

    Engquist, Björn; Froese, Brittany D.; Tsai, Yen-Hsi Richard

    2015-04-01

    The idea of using fast sweeping methods for solving stationary systems of conservation laws has previously been proposed for efficiently computing solutions with sharp shocks. We further develop these methods to allow for a more challenging class of problems including problems with sonic points, shocks originating in the interior of the domain, rarefaction waves, and two-dimensional systems. We show that fast sweeping methods can produce higher-order accuracy. Computational results validate the claims of accuracy, sharp shock curves, and optimal computational efficiency.
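
    The paper's solver targets steady systems of conservation laws; as a compact illustration of fast sweeping itself (Gauss-Seidel updates applied in alternating sweep orderings until convergence), the sketch below solves the classical 2D eikonal equation |grad u| = 1 for the distance to a point source. This is the standard textbook scheme, not the paper's method.

```python
import numpy as np

# Classical fast-sweeping solver for |grad u| = 1 on a unit square with a
# point source at the centre; four sweep orderings per round.
n, h = 101, 1.0 / 100
u = np.full((n, n), 1e10)
u[n // 2, n // 2] = 0.0                      # source

for _ in range(4):                            # a few rounds of the 4 sweeps
    for di in (range(n), range(n - 1, -1, -1)):
        for dj in (range(n), range(n - 1, -1, -1)):
            for i in di:
                for j in dj:
                    a = min(u[max(i - 1, 0), j], u[min(i + 1, n - 1), j])
                    b = min(u[i, max(j - 1, 0)], u[i, min(j + 1, n - 1)])
                    if abs(a - b) >= h:       # one-sided update
                        cand = min(a, b) + h
                    else:                     # two-sided (quadratic) update
                        cand = 0.5 * (a + b + np.sqrt(2 * h * h - (a - b) ** 2))
                    u[i, j] = min(u[i, j], cand)

print(f"u at corner ~ {u[0, 0]:.3f} (exact distance {np.hypot(0.5, 0.5):.3f})")
```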

  18. Current HPLC Methods for Assay of Nano Drug Delivery Systems.

    PubMed

    Tekkeli, Serife Evrim Kepekci; Kiziltas, Mustafa Volkan

    2017-01-01

    In nano drug formulations, the release mechanism is a critical process for characterizing controlled and targeted drug delivery systems. In order to achieve high bioavailability and specificity from the drug to reach its therapeutic goal, the active substance must be loaded into the nanoparticles efficiently. Therefore, the amount in biological fluids or tissues and the remaining amount in nano carriers are very important parameters for understanding the potential of nano drug delivery systems. For this aim, suitable and validated quantitation methods are required to determine released drug concentrations from nano pharmaceutical formulations. HPLC (High Performance Liquid Chromatography) is one of the most common techniques used for determining the released drug content of nano drug formulations, under different physical conditions and over different periods of time. Since there are many types of HPLC methods, depending on detector and column type, it is a challenge for researchers to choose a simple, fast, and validated HPLC technique suitable for their nano drug delivery systems. This review's goal is to compare HPLC methods that are currently used in different nano drug delivery systems in order to provide detailed and useful information for researchers. Copyright© Bentham Science Publishers.

  19. Use of an automated learning management system to validate nursing competencies.

    PubMed

    Dumpe, Michelle L; Kanyok, Nancy; Hill, Kristin

    2007-01-01

    Maintaining nurse competencies in a dynamic environment is not an easy task and requires the use of resources already strained. An online learning management system was created, and 24 annual competencies were redesigned for online validation. As a result of this initiative, competencies have been standardized across many disciplines and are completed in a more timely manner, nurses and managers are more satisfied with this method of annual assessments, and cost savings have been realized.

  20. A Tool for Requirements-Based Programming

    NASA Technical Reports Server (NTRS)

    Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis; Erickson, John

    2005-01-01

    Absent a general method for mathematically sound, automated transformation of customer requirements into a formal model of the desired system, developers must resort to either manual application of formal methods or to system testing (either manual or automated). While formal methods have afforded numerous successes, they present serious issues, e.g., costs to gear up to apply them (time, expensive staff), and scalability and reproducibility when standards in the field are not settled. The testing path cannot be walked to the ultimate goal, because exhaustive testing is infeasible for all but trivial systems. So system verification remains problematic. System or requirements validation is similarly problematic. The alternatives available today depend on either having a formal model or pursuing enough testing to enable the customer to be certain that system behavior meets requirements. The testing alternative for non-trivial systems always leaves some system behaviors unconfirmed and therefore is not the answer. Ensuring that a formal model is equivalent to the customer's requirements necessitates that the customer somehow fully understands the formal model, which is not realistic. The predominant view that provably correct system development depends on having a formal model of the system leads to a desire for a mathematically sound method to automate the transformation of customer requirements into a formal model. Such a method, an augmentation of requirements-based programming, is briefly described in this paper, and a prototype tool to support it is described. The method and tool enable both requirements validation and system verification for the class of systems whose behavior can be described as scenarios. An application of the tool to a prototype automated ground control system for a NASA mission is presented.

  1. Scoring and staging systems using cox linear regression modeling and recursive partitioning.

    PubMed

    Lee, J W; Um, S H; Lee, J B; Mun, J; Cho, H

    2006-01-01

    Scoring and staging systems are used to determine the order and class of data according to predictors. Systems used for medical data, such as the Child-Turcotte-Pugh scoring and staging systems for ordering and classifying patients with liver disease, are often derived strictly from physicians' experience and intuition. We construct objective and data-based scoring/staging systems using statistical methods. We consider Cox linear regression modeling and recursive partitioning techniques for censored survival data. In particular, to obtain a target number of stages we propose cross-validation and amalgamation algorithms. We also propose an algorithm for constructing scoring and staging systems by integrating local Cox linear regression models into recursive partitioning, so that we can retain the merits of both methods such as superior predictive accuracy, ease of use, and detection of interactions between predictors. The staging system construction algorithms are compared by cross-validation evaluation of real data. The data-based cross-validation comparison shows that Cox linear regression modeling is somewhat better than recursive partitioning when there are only continuous predictors, while recursive partitioning is better when there are significant categorical predictors. The proposed local Cox linear recursive partitioning has better predictive accuracy than Cox linear modeling and simple recursive partitioning. This study indicates that integrating local linear modeling into recursive partitioning can significantly improve prediction accuracy in constructing scoring and staging systems.
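
    A minimal sketch of the cross-validation evaluation described above, using the lifelines library and its bundled Rossi recidivism dataset as stand-ins (an assumption; the paper's liver-disease data and staging algorithms are not public): fit a Cox model on training folds and score concordance on the held-out fold.

```python
import numpy as np
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi
from lifelines.utils import concordance_index

# Cross-validated concordance of a Cox model; dataset is a stand-in.
df = load_rossi()                       # duration 'week', event 'arrest'
rng = np.random.default_rng(0)
folds = rng.integers(0, 5, len(df))     # 5-fold assignment

scores = []
for k in range(5):
    train, test = df[folds != k], df[folds == k]
    cph = CoxPHFitter().fit(train, duration_col="week", event_col="arrest")
    risk = cph.predict_partial_hazard(test)
    # higher predicted risk should mean earlier events, hence the minus sign
    scores.append(concordance_index(test["week"], -risk, test["arrest"]))

print(f"cross-validated concordance: {np.mean(scores):.3f} +/- {np.std(scores):.3f}")
```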

  2. The Politics and Statistics of Value-Added Modeling for Accountability of Teacher Preparation Programs

    ERIC Educational Resources Information Center

    Lincove, Jane Arnold; Osborne, Cynthia; Dillon, Amanda; Mills, Nicholas

    2014-01-01

    Despite questions about validity and reliability, the use of value-added estimation methods has moved beyond academic research into state accountability systems for teachers, schools, and teacher preparation programs (TPPs). Prior studies of value-added measurement for TPPs test the validity of researcher-designed models and find that measuring…

  3. Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data

    NASA Astrophysics Data System (ADS)

    Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai

    2017-11-01

    Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of the model V&V problem have been addressed by existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine a load model's effectiveness and accuracy. Statistical analysis, instead of visual checks, quantifies the load model's accuracy and provides a confidence level of the developed load model for model users. The analysis results can also be used to calibrate load models. The proposed framework can be used as guidance for utility engineers and researchers to systematically examine load models. The proposed method is demonstrated through analysis of field measurements collected from a utility system.

  4. Validation Methods Research for Fault-Tolerant Avionics and Control Systems Sub-Working Group Meeting. CARE 3 peer review

    NASA Technical Reports Server (NTRS)

    Trivedi, K. S. (Editor); Clary, J. B. (Editor)

    1980-01-01

    A computer aided reliability estimation procedure (CARE 3), developed to model the behavior of ultrareliable systems required by flight-critical avionics and control systems, is evaluated. The mathematical models, numerical method, and fault-tolerant architecture modeling requirements are examined, and the testing and characterization procedures are discussed. Recommendations aimed at enhancing CARE 3 are presented; in particular, the need for a better exposition of the method and the user interface is emphasized.

  5. Validated reversed phase LC method for quantitative analysis of polymethoxyflavones in citrus peel extracts.

    PubMed

    Wang, Zhenyu; Li, Shiming; Ferguson, Stephen; Goodnow, Robert; Ho, Chi-Tang

    2008-01-01

    Polymethoxyflavones (PMFs), which exist exclusively in the citrus genus, have biological activities including anti-inflammatory, anticarcinogenic, and antiatherogenic properties. A validated RPLC method was developed for quantitative analysis of six major PMFs, namely nobiletin, tangeretin, sinensetin, 5,6,7,4'-tetramethoxyflavone, 3,5,6,7,3',4'-hexamethoxyflavone, and 3,5,6,7,8,3',4'-heptamethoxyflavone. The polar embedded LC stationary phase was able to fully resolve the six analogues. The developed method was fully validated in terms of linearity, accuracy, precision, sensitivity, and system suitability. The LOD of the method was calculated as 0.15 μg/mL and the recovery rate was between 97.0 and 105.1%. This analytical method was successfully applied to quantify the individual PMFs in four commercially available citrus peel extracts (CPEs). Each extract shows significant difference in the PMF composition and concentration. This method may provide a simple, rapid, and reliable tool to help reveal the correlation between the bioactivity of the PMF extracts and the individual PMF content.

  6. An automation-assisted generic approach for biological sample preparation and LC-MS/MS method validation.

    PubMed

    Zhang, Jie; Wei, Shimin; Ayres, David W; Smith, Harold T; Tse, Francis L S

    2011-09-01

    Although it is well known that automation can provide significant improvement in the efficiency of biological sample preparation in quantitative LC-MS/MS analysis, it has not been widely implemented in bioanalytical laboratories throughout the industry. This can be attributed to the lack of a sound strategy and practical procedures in working with robotic liquid-handling systems. Several comprehensive automation assisted procedures for biological sample preparation and method validation were developed and qualified using two types of Hamilton Microlab liquid-handling robots. The procedures developed were generic, user-friendly and covered the majority of steps involved in routine sample preparation and method validation. Generic automation procedures were established as a practical approach to widely implement automation into the routine bioanalysis of samples in support of drug-development programs.

  7. Calibration and validation of coarse-grained models of atomic systems: application to semiconductor manufacturing

    NASA Astrophysics Data System (ADS)

    Farrell, Kathryn; Oden, J. Tinsley

    2014-07-01

    Coarse-grained models of atomic systems, created by aggregating groups of atoms into molecules to reduce the number of degrees of freedom, have been used for decades in important scientific and technological applications. In recent years, interest in developing a more rigorous theory for coarse graining and in assessing the predictivity of coarse-grained models has arisen. In this work, Bayesian methods for the calibration and validation of coarse-grained models of atomistic systems in thermodynamic equilibrium are developed. For specificity, only configurational models of systems in canonical ensembles are considered. Among the major challenges in validating coarse-grained models are (1) the development of validation processes that lead to information essential in establishing confidence in the model's ability to predict key quantities of interest and (2), above all, the determination of the coarse-grained model itself; that is, the characterization of the molecular architecture, the choice of interaction potentials and thus parameters, which best fit available data. The all-atom model is treated as the "ground truth," and it provides the basis with respect to which properties of the coarse-grained model are compared. This base all-atom model is characterized by an appropriate statistical mechanics framework, in this work by canonical ensembles involving only configurational energies. The all-atom model thus supplies data for Bayesian calibration and validation methods for the molecular model. To address the first challenge, we develop priors based on the maximum entropy principle and likelihood functions based on Gaussian approximations of the uncertainties in the parameter-to-observation error. To address challenge (2), we introduce the notion of model plausibilities as a means for model selection. This methodology provides a powerful approach toward constructing coarse-grained models which are most plausible for given all-atom data. We demonstrate the theory and methods through applications to representative atomic structures, and we discuss extensions to the validation process for molecular models of polymer structures encountered in certain semiconductor nanomanufacturing processes. The powerful method of model plausibility as a means for selecting interaction potentials for coarse-grained models is discussed in connection with a coarse-grained hexane molecule. Discussions of how all-atom information is used to construct priors are contained in an appendix.
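
    The calibration-plus-plausibility workflow can be miniaturized as follows: two candidate one-parameter models are calibrated against synthetic "all-atom" data using a Gaussian likelihood and a flat (bounded maximum-entropy) prior, and their marginal likelihoods (evidences) are compared. This is a toy analogue of the paper's procedure, not its implementation.

```python
import numpy as np

# Toy Bayesian model-plausibility sketch: compare evidences of two candidate
# models under a Gaussian likelihood and a flat prior. Everything is synthetic.
rng = np.random.default_rng(1)
x = np.linspace(0, 1, 20)
y = 2.0 * x + 0.05 * rng.standard_normal(x.size)   # "ground truth" data
sigma = 0.05                                        # assumed noise level

def evidence(model, theta_grid):
    # flat prior on the grid; evidence ~ mean likelihood over the prior support
    logL = np.array([-0.5 * np.sum((y - model(x, t)) ** 2) / sigma**2
                     for t in theta_grid])
    return np.exp(logL).mean()

thetas = np.linspace(0.0, 4.0, 400)
z_linear = evidence(lambda x, t: t * x, thetas)
z_quad = evidence(lambda x, t: t * x**2, thetas)

# Normalization constants common to both models cancel in the ratio.
print(f"plausibility ratio (linear : quadratic) = {z_linear / z_quad:.3e}")
```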

  8. Methodologies for pre-validation of biofilters and wetlands for stormwater treatment.

    PubMed

    Zhang, Kefeng; Randelovic, Anja; Aguiar, Larissa M; Page, Declan; McCarthy, David T; Deletic, Ana

    2015-01-01

    Water Sensitive Urban Design (WSUD) systems are frequently used as part of stormwater harvesting treatment trains (e.g., biofilters (bio-retention systems and rain gardens) and wetlands). However, validation frameworks for such systems do not exist, limiting their adoption for end-uses such as drinking water. The first stage in the validation framework is pre-validation, which prepares information for further validation monitoring. A pre-validation roadmap, consisting of five steps, is suggested in this paper. Detailed methods for investigating target micropollutants in stormwater, and for determining challenge conditions for biofilters and wetlands, are provided. A literature review was undertaken to identify and quantify micropollutants in stormwater. MUSIC V5.1 was utilized to simulate the behaviour of the systems based on 30-year rainfall data in three distinct climate zones; outputs were evaluated to identify the thresholds of operational variables, including the length of dry periods (LDPs) and the volume of water treated per event. The paper highlights that a number of micropollutants were found in stormwater at levels above various worldwide drinking water guidelines (eight pesticides, benzene, benzo(a)pyrene, pentachlorophenol, di-(2-ethylhexyl)-phthalate, and total polychlorinated biphenyls). The 95th percentile LDP was exponentially related to system design area, while the 5th percentile LDP remained within short durations (i.e., 2-8 hours). The 95th percentile volume of water treated per event was exponentially related to system design area as a percentage of the impervious catchment area. The outcomes of this study show that pre-validation could be completed through a roadmap consisting of a series of steps; this will help in the validation of stormwater treatment systems.

  9. A Validation of an Intelligent Decision-Making Support System for the Nutrition Diagnosis of Bariatric Surgery Patients

    PubMed Central

    Martins, Cristina; Dias, João; Pinto, José S

    2014-01-01

    Background Bariatric surgery is an important method for the treatment of morbid obesity. It is known that significant nutritional deficiencies might occur after surgery, such as calorie-protein malnutrition, iron deficiency anemia, and lack of vitamin B12, thiamine, and folic acid. Objective The objective of our study was to validate a computerized intelligent decision support system that suggests nutritional diagnoses for patients submitted to bariatric surgery. Methods Fifteen clinical cases were developed and sent to three dietitians in order to evaluate and define a nutritional diagnosis. After this step, the cases were sent to four dietitians who were experts in bariatric surgery, aiming to establish a gold standard. The nutritional diagnosis was defined individually, and any disagreements were solved through consensus. The final result was used as the gold standard. Bayesian networks were used to implement the system, and database training was done with Shell Netica. For the system validation, a similar-answer rate was calculated, as well as the specificity and sensitivity. Receiver operating characteristic (ROC) curves were projected for each nutritional diagnosis. Results Among the four experts, the rate of similar answers was 80% (48/60) to 93% (56/60), depending on the nutritional diagnosis. The rate of similar answers of the system, compared to the gold standard, was 100% (60/60). The system sensitivity and specificity were 95.0%. The ROC curve projection showed that the system was able to represent the expert knowledge (gold standard) and to help the experts in their daily tasks. Conclusions The system that was developed was validated for use by health care professionals as decision-making support in the nutritional diagnosis of patients submitted to bariatric surgery. PMID:25601419
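
    The agreement statistics used in this kind of validation reduce to confusion-matrix arithmetic; a toy sketch with invented labels:

```python
# Compare system diagnoses against the experts' gold standard, case by case.
gold   = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]   # 1 = diagnosis present (toy data)
system = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]

tp = sum(g == s == 1 for g, s in zip(gold, system))  # true positives
tn = sum(g == s == 0 for g, s in zip(gold, system))  # true negatives
fp = sum(s == 1 and g == 0 for g, s in zip(gold, system))
fn = sum(s == 0 and g == 1 for g, s in zip(gold, system))

print(f"similar-answer rate = {(tp + tn) / len(gold):.0%}")
print(f"sensitivity = {tp / (tp + fn):.0%}, specificity = {tn / (tn + fp):.0%}")
```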

  10. Determination of total arsenic in fish by hydride-generation atomic absorption spectrometry: method validation, traceability and uncertainty evaluation

    NASA Astrophysics Data System (ADS)

    Nugraha, W. C.; Elishian, C.; Ketrin, R.

    2017-03-01

    Fish containing arsenic compounds are an important indicator of arsenic contamination in water monitoring. High levels of arsenic in fish are due to absorption through the food chain and accumulation in the habitat. Hydride generation (HG) coupled with atomic absorption spectrometric (AAS) detection is one of the most popular techniques employed for arsenic determination in a variety of matrices, including fish. This study aimed to develop a method for the determination of total arsenic in fish by HG-AAS. The sample preparation method from Association of Official Analytical Chemists (AOAC) Method 999.10-2005 was adopted for acid digestion using a microwave digestion system, and AOAC Method 986.15-2005 for dry ashing. The method was developed and validated using the Certified Reference Material DORM-3 Fish Protein for trace metals to ensure the accuracy and traceability of the results. The sources of uncertainty of the method were also evaluated. Using the method, the total arsenic concentration in the fish was found to be 45.6 ± 1.22 mg/kg with a coverage factor equal to 2 at the 95% confidence level. The evaluation of uncertainty was highly influenced by the calibration curve. The result was also traceable to the international measurement system through analysis of Certified Reference Material DORM-3, with 97.5% recovery. In summary, the preparation method and the HG-AAS technique for total arsenic determination in fish were shown to be valid and reliable.
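
    The coverage-factor arithmetic works as follows: standard uncertainty components are combined in quadrature, and the result is expanded with k = 2 for roughly 95% confidence. The component values below are illustrative, not the study's actual budget.

```python
import math

# Combine standard uncertainties in quadrature, then expand with k = 2 (~95%).
u_calibration = 0.45     # mg/kg, dominant term (calibration curve), assumed
u_repeatability = 0.30   # mg/kg, assumed
u_recovery = 0.25        # mg/kg, assumed

u_combined = math.sqrt(u_calibration**2 + u_repeatability**2 + u_recovery**2)
U = 2.0 * u_combined     # expanded uncertainty, coverage factor k = 2

print(f"u_c = {u_combined:.2f} mg/kg, U (k=2) = {U:.2f} mg/kg")
```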

  11. Stability analysis of piecewise non-linear systems and its application to chaotic synchronisation with intermittent control

    NASA Astrophysics Data System (ADS)

    Wang, Qingzhi; Tan, Guanzheng; He, Yong; Wu, Min

    2017-10-01

    This paper considers a stability analysis issue of piecewise non-linear systems and applies it to intermittent synchronisation of chaotic systems. First, based on piecewise Lyapunov function methods, more general and less conservative stability criteria of piecewise non-linear systems in periodic and aperiodic cases are presented, respectively. Next, intermittent synchronisation conditions of chaotic systems are derived which extend existing results. Finally, Chua's circuit is taken as an example to verify the validity of our methods.

  12. Forward ultrasonic model validation using wavefield imaging methods

    NASA Astrophysics Data System (ADS)

    Blackshire, James L.

    2018-04-01

    The validation of forward ultrasonic wave propagation models in a complex titanium polycrystalline material system is accomplished using wavefield imaging methods. An innovative measurement approach is described that permits the visualization and quantitative evaluation of bulk elastic wave propagation and scattering behaviors in the titanium material for a typical focused immersion ultrasound measurement process. Results are provided for the determination and direct comparison of the ultrasonic beam's focal properties, mode-converted shear wave position and angle, and scattering and reflection from millimeter-sized microtexture regions (MTRs) within the titanium material. The approach and results are important with respect to understanding the root-cause backscatter signal responses generated in aerospace engine materials, where model-assisted methods are being used to understand the probabilistic nature of the backscatter signal content. Wavefield imaging methods are shown to be an effective means for corroborating and validating important forward model predictions in a direct manner using time- and spatially-resolved displacement field amplitude measurements.

  13. Global Land Product Validation Protocols: An Initiative of the CEOS Working Group on Calibration and Validation to Evaluate Satellite-derived Essential Climate Variables

    NASA Astrophysics Data System (ADS)

    Guillevic, P. C.; Nickeson, J. E.; Roman, M. O.; camacho De Coca, F.; Wang, Z.; Schaepman-Strub, G.

    2016-12-01

    The Global Climate Observing System (GCOS) has specified the need to systematically produce and validate Essential Climate Variables (ECVs). The Committee on Earth Observation Satellites (CEOS) Working Group on Calibration and Validation (WGCV), and in particular its subgroup on Land Product Validation (LPV), is playing a key coordination role, leveraging the international expertise required to address actions related to the validation of global land ECVs. The primary objective of the LPV subgroup is to set standards for validation methods and reporting in order to provide traceable and reliable uncertainty estimates for scientists and stakeholders. The subgroup comprises 9 focus areas that encompass 10 land surface variables. The activities of each focus area are coordinated by two international co-leads and currently include leaf area index (LAI) and fraction of absorbed photosynthetically active radiation (FAPAR), vegetation phenology, surface albedo, fire disturbance, snow cover, land cover and land use change, soil moisture, and land surface temperature (LST) and emissivity. Recent additions to the focus areas include vegetation indices and biomass. The development of best practice validation protocols is a core activity of CEOS LPV, with the objective to standardize the evaluation of land surface products. LPV has identified four validation levels corresponding to increasing spatial and temporal representativeness of reference samples used to perform validation. Best practice validation protocols (1) provide the definition of variables, ancillary information and uncertainty metrics, (2) describe available data sources and methods to establish reference validation datasets with SI traceability, and (3) describe evaluation methods and reporting. An overview of validation best practice components will be presented based on the LAI and LST protocol efforts to date.

  14. Statistically validated network of portfolio overlaps and systemic risk.

    PubMed

    Gualdi, Stanislao; Cimini, Giulio; Primicerio, Kevin; Di Clemente, Riccardo; Challet, Damien

    2016-12-21

    Common asset holding by financial institutions (portfolio overlap) is nowadays regarded as an important channel for financial contagion with the potential to trigger fire sales and severe losses at the systemic level. We propose a method to assess the statistical significance of the overlap between heterogeneously diversified portfolios, which we use to build a validated network of financial institutions where links indicate potential contagion channels. The method is implemented on a historical database of institutional holdings ranging from 1999 to the end of 2013, but can be applied to any bipartite network. We find that the proportion of validated links (i.e. of significant overlaps) increased steadily before the 2007-2008 financial crisis and reached a maximum when the crisis occurred. We argue that the nature of this measure implies that systemic risk from fire sales liquidation was maximal at that time. After a sharp drop in 2008, systemic risk resumed its growth in 2009, with a notable acceleration in 2013. We finally show that market trends tend to be amplified in the portfolios identified by the algorithm, such that it is possible to have an informative signal about institutions that are about to suffer (enjoy) the most significant losses (gains).
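
    One standard way to formalize "statistical significance of the overlap" (in the spirit of the paper's method, though the exact null model here is an assumption) is a hypergeometric test: if two portfolios of given sizes picked assets at random from the market, their overlap would follow a hypergeometric law, and a small p-value validates the link.

```python
from scipy.stats import hypergeom

# Hypergeometric significance test for the overlap between two portfolios;
# the null model and all numbers are illustrative assumptions.
N = 500          # assets in the market
n1, n2 = 60, 80  # portfolio sizes
observed = 25    # assets held in common (expected under the null: n1*n2/N = 9.6)

p_value = hypergeom.sf(observed - 1, N, n1, n2)   # P(X >= observed)
print(f"p = {p_value:.2e}; validate the link if p is below the (multiple-test-corrected) threshold")
```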

  16. Simulation of fMRI signals to validate dynamic causal modeling estimation

    NASA Astrophysics Data System (ADS)

    Anandwala, Mobin; Siadat, Mohamad-Reza; Hadi, Shamil M.

    2012-03-01

    During cognitive tasks, certain brain areas are activated and also receive increased blood flow. This is modeled through a state system consisting of two separate parts: one that deals with the neural node stimulation and the other with the blood response during that stimulation. The rationale behind using this state system is to validate existing analysis methods such as DCM to see what levels of noise they can handle. Using the forward Euler method, this system was approximated by a series of difference equations. The result was the hemodynamic response for each brain area, which was used to test an analysis tool that estimates functional connectivity between brain areas under a given amount of noise. The importance of modeling this system is not only to have a model for the neural response but also to enable comparison with actual data obtained through functional imaging scans.
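
    A minimal sketch of the two-part state system advanced with forward Euler difference equations, assuming a first-order neural state and a slower first-order blood response (the real hemodynamic model has more states; all constants here are illustrative):

```python
import numpy as np

# Forward Euler integration of a toy neural + blood-response state system
# driven by a boxcar task stimulus; time constants are assumptions.
dt, T = 0.01, 30.0
t = np.arange(0.0, T, dt)
u = ((t % 10) < 2).astype(float)   # task on for 2 s every 10 s

z = 0.0   # neural state
h = 0.0   # blood (hemodynamic) state
z_hist, h_hist = [], []
for uk in u:
    z += dt * (-z + uk)            # z' = -z + u   (neural part)
    h += dt * (z - h) / 2.0        # h' = (z - h)/2 (sluggish blood response)
    z_hist.append(z)
    h_hist.append(h)

print(f"peak neural = {max(z_hist):.2f}, peak hemodynamic = {max(h_hist):.2f} (delayed, smoothed)")
```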

  17. Rolling bearing fault diagnosis and health assessment using EEMD and the adjustment Mahalanobis-Taguchi system

    NASA Astrophysics Data System (ADS)

    Chen, Junxun; Cheng, Longsheng; Yu, Hui; Hu, Shaolin

    2018-01-01

    ABSTRACTSFor the timely identification of the potential faults of a rolling bearing and to observe its health condition intuitively and accurately, a novel fault diagnosis and health assessment model for a rolling bearing based on the ensemble empirical mode decomposition (EEMD) method and the adjustment Mahalanobis-Taguchi system (AMTS) method is proposed. The specific steps are as follows: First, the vibration signal of a rolling bearing is decomposed by EEMD, and the extracted features are used as the input vectors of AMTS. Then, the AMTS method, which is designed to overcome the shortcomings of the traditional Mahalanobis-Taguchi system and to extract the key features, is proposed for fault diagnosis. Finally, a type of HI concept is proposed according to the results of the fault diagnosis to accomplish the health assessment of a bearing in its life cycle. To validate the superiority of the developed method proposed approach, it is compared with other recent method and proposed methodology is successfully validated on a vibration data-set acquired from seeded defects and from an accelerated life test. The results show that this method represents the actual situation well and is able to accurately and effectively identify the fault type.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20020079427','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20020079427"><span>Verification and Validation of Neural Networks for Aerospace Systems</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Mackall, Dale; Nelson, Stacy; Schumman, Johann; Clancy, Daniel (Technical Monitor)</p> <p>2002-01-01</p> <p>The Dryden Flight Research Center V&V working group and NASA Ames Research Center Automated Software Engineering (ASE) group collaborated to prepare this report. The purpose is to describe V&V processes and methods for certification of neural networks for aerospace applications, particularly adaptive flight control systems like Intelligent Flight Control Systems (IFCS) that use neural networks. This report is divided into the following two sections: 1) Overview of Adaptive Systems; and 2) V&V Processes/Methods.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20020070826','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20020070826"><span>Verification and Validation of Neural Networks for Aerospace Systems</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Mackall, Dale; Nelson, Stacy; Schumann, Johann</p> <p>2002-01-01</p> <p>The Dryden Flight Research Center V&V working group and NASA Ames Research Center Automated Software Engineering (ASE) group collaborated to prepare this report. The purpose is to describe V&V processes and methods for certification of neural networks for aerospace applications, particularly adaptive flight control systems like Intelligent Flight Control Systems (IFCS) that use neural networks. 
This report is divided into the following two sections: Overview of Adaptive Systems and V&V Processes/Methods.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://ntrs.nasa.gov/search.jsp?R=19940005465&hterms=knowledge+representation&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D90%26Ntt%3Dknowledge%2Brepresentation','NASA-TRS'); return false;" href="https://ntrs.nasa.gov/search.jsp?R=19940005465&hterms=knowledge+representation&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D90%26Ntt%3Dknowledge%2Brepresentation"><span>Knowledge-based system verification and validation</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Johnson, Sally C.</p> <p>1990-01-01</p> <p>The objective of this task is to develop and evaluate a methodology for verification and validation (V&V) of knowledge-based systems (KBS) for space station applications with high reliability requirements. The approach consists of three interrelated tasks. The first task is to evaluate the effectiveness of various validation methods for space station applications. The second task is to recommend requirements for KBS V&V for Space Station Freedom (SSF). The third task is to recommend modifications to the SSF to support the development of KBS using effectiveness software engineering and validation techniques. To accomplish the first task, three complementary techniques will be evaluated: (1) Sensitivity Analysis (Worchester Polytechnic Institute); (2) Formal Verification of Safety Properties (SRI International); and (3) Consistency and Completeness Checking (Lockheed AI Center). During FY89 and FY90, each contractor will independently demonstrate the user of his technique on the fault detection, isolation, and reconfiguration (FDIR) KBS or the manned maneuvering unit (MMU), a rule-based system implemented in LISP. During FY91, the application of each of the techniques to other knowledge representations and KBS architectures will be addressed. After evaluation of the results of the first task and examination of Space Station Freedom V&V requirements for conventional software, a comprehensive KBS V&V methodology will be developed and documented. Development of highly reliable KBS's cannot be accomplished without effective software engineering methods. 
Using the results of current in-house research to develop and assess software engineering methods for KBS's as well as assessment of techniques being developed elsewhere, an effective software engineering methodology for space station KBS's will be developed, and modification of the SSF to support these tools and methods will be addressed.</p> </li> </ol> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_8");'>8</a></li> <li><a href="#" onclick='return showDiv("page_9");'>9</a></li> <li class="active"><span>10</span></li> <li><a href="#" onclick='return showDiv("page_11");'>11</a></li> <li><a href="#" onclick='return showDiv("page_12");'>12</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_10 --> <div id="page_11" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_9");'>9</a></li> <li><a href="#" onclick='return showDiv("page_10");'>10</a></li> <li class="active"><span>11</span></li> <li><a href="#" onclick='return showDiv("page_12");'>12</a></li> <li><a href="#" onclick='return showDiv("page_13");'>13</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="201"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/15360786','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/15360786"><span>Computerization of guidelines: a knowledge specification method to convert text to detailed decision tree for electronic implementation.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Aguirre-Junco, Angel-Ricardo; Colombet, Isabelle; Zunino, Sylvain; Jaulent, Marie-Christine; Leneveut, Laurence; Chatellier, Gilles</p> <p>2004-01-01</p> <p>The initial step for the computerization of guidelines is the knowledge specification from the prose text of guidelines. We describe a method of knowledge specification based on a structured and systematic analysis of text allowing detailed specification of a decision tree. We use decision tables to validate the decision algorithm and decision trees to specify and represent this algorithm, along with elementary messages of recommendation. Edition tools are also necessary to facilitate the process of validation and workflow between expert physicians who will validate the specified knowledge and computer scientist who will encode the specified knowledge in a guide-line model. Applied to eleven different guidelines issued by an official agency, the method allows a quick and valid computerization and integration in a larger decision support system called EsPeR (Personalized Estimate of Risks). The quality of the text guidelines is however still to be developed further. 
The method used for computerization could help to define a framework usable at the initial step of guideline development in order to produce guidelines ready for electronic implementation.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20090025468','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20090025468"><span>Baseline Assessment and Prioritization Framework for IVHM Integrity Assurance Enabling Capabilities</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Cooper, Eric G.; DiVito, Benedetto L.; Jacklin, Stephen A.; Miner, Paul S.</p> <p>2009-01-01</p> <p>Fundamental to vehicle health management is the deployment of systems incorporating advanced technologies for predicting and detecting anomalous conditions in highly complex and integrated environments. Integrated structural integrity health monitoring, statistical algorithms for detection, estimation, prediction, and fusion, and diagnosis supporting adaptive control are examples of advanced technologies that present considerable verification and validation challenges. These systems necessitate interactions between physical and software-based systems that are highly networked with sensing and actuation subsystems, and incorporate technologies that are, in many respects, different from those employed in civil aviation today. A formidable barrier to deploying these advanced technologies in civil aviation is the lack of enabling verification and validation tools, methods, and technologies. The development of new verification and validation capabilities will not only enable the fielding of advanced vehicle health management systems, but will also provide new assurance capabilities for verification and validation of current generation aviation software which has been implicated in anomalous in-flight behavior. This paper describes the research focused on enabling capabilities for verification and validation underway within NASA s Integrated Vehicle Health Management project, discusses the state of the art of these capabilities, and includes a framework for prioritizing activities.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5494621','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5494621"><span>Real-Time PCR Method for Detection of Salmonella spp. in Environmental Samples</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Drgon, Tomas</p> <p>2017-01-01</p> <p>ABSTRACT The methods currently used for detecting Salmonella in environmental samples require 2 days to produce results and have limited sensitivity. Here, we describe the development and validation of a real-time PCR Salmonella screening method that produces results in 18 to 24 h. Primers and probes specific to the gene invA, group D, and Salmonella enterica serovar Enteritidis organisms were designed and evaluated for inclusivity and exclusivity using a panel of 329 Salmonella isolates representing 126 serovars and 22 non-Salmonella organisms. The invA- and group D-specific sets identified all the isolates accurately. The PCR method had 100% inclusivity and detected 1 to 2 copies of Salmonella DNA per reaction. 
  204. Validation of a low cost computer-based method for quantification of immunohistochemistry-stained sections.

    PubMed

    Montgomery, Jill D; Hensler, Heather R; Jacobson, Lisa P; Jenkins, Frank J

    2008-07-01

    The aim of the present study was to determine whether the Alpha DigiDoc RT system would be an effective method of quantifying immunohistochemical staining compared with a manual counting method, which is considered the gold standard. Two readers counted 31 samples by both methods. A Bland-Altman analysis of concordance found no statistical difference between the two methods. Thus, the Alpha DigiDoc RT system is an effective, low-cost method of quantifying immunohistochemical data.
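    The Bland-Altman concordance check used above is straightforward to compute. The sketch below uses invented counts; it shows the bias and 95% limits of agreement on which a comparison of two counting methods rests.

        import numpy as np

        manual = np.array([120, 98, 143, 110, 87, 132, 101, 95, 125, 140], float)
        digital = manual + np.random.default_rng(1).normal(0, 5, manual.size)

        diff = digital - manual
        bias = diff.mean()
        half_width = 1.96 * diff.std(ddof=1)   # 95% limits of agreement
        print(f"bias = {bias:.1f}, "
              f"limits = [{bias - half_width:.1f}, {bias + half_width:.1f}]")
        # Agreement is supported when the differences show no trend against the
        # pair means and nearly all fall inside the limits.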
  205. Evaluation of System-Integrated Smart Grid Devices Using Software- and Hardware-in-the-Loop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lundstrom, Blake; Chakraborty, Sudipta; Lauss, Georg

    This paper presents a concise description of state-of-the-art real-time simulation-based testing methods and demonstrates how they can be used independently and/or in combination as an integrated development and validation approach for smart grid DERs and systems. A three-part case study demonstrating the application of this integrated approach at the different stages of development and validation of a system-integrated smart photovoltaic (PV) inverter is also presented. Laboratory testing results and perspectives from two international research laboratories are included in the case study.

  206. Reliability and Validity of the Footprint Assessment Method Using Photoshop CS5 Software in Young People with Down Syndrome.

    PubMed

    Gutiérrez-Vilahú, Lourdes; Massó-Ortigosa, Núria; Rey-Abella, Ferran; Costa-Tutusaus, Lluís; Guerra-Balic, Myriam

    2016-05-01

    People with Down syndrome present skeletal abnormalities in their feet that can be analyzed by commonly used gold standard indices (the Hernández-Corvo index, the Chippaux-Smirak index, the Staheli arch index, and the Clarke angle) based on footprint measurements. The use of Photoshop CS5 software (Adobe Systems Software Ireland Ltd, Dublin, Ireland) to measure footprints has been validated in the general population. The present study aimed to assess the reliability and validity of this footprint assessment technique in the population with Down syndrome. Using optical podography and photography, 44 footprints from 22 patients with Down syndrome (11 men [mean ± SD age, 23.82 ± 3.12 years] and 11 women [mean ± SD age, 24.82 ± 6.81 years]) were recorded in a static bipedal standing position. A blinded observer performed the measurements using a validated manual method three times during the 4-month study, with 2 months between measurements. Test-retest was used to check the reliability of the Photoshop CS5 software measurements. Validity and reliability were assessed with the intraclass correlation coefficient (ICC). The reliability test for all of the indices showed very good values for the Photoshop CS5 method (ICC, 0.982-0.995). Validity testing also found no differences between the techniques (ICC, 0.988-0.999). The Photoshop CS5 software method is reliable and valid for the study of footprints in young people with Down syndrome.
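    The intraclass correlation coefficients reported in this and the related footprint study can be computed from an ordinary two-way ANOVA decomposition. The sketch below implements ICC(2,1) (two-way random effects, absolute agreement, single measures) on invented repeated measurements; this is one common ICC variant, and the records do not state which variant the authors used.

        import numpy as np

        def icc_2_1(X):
            """ICC(2,1) for an n_subjects x k_sessions score matrix."""
            n, k = X.shape
            grand = X.mean()
            ms_r = k * ((X.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # subjects
            ms_c = n * ((X.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # sessions
            resid = (X - X.mean(axis=1, keepdims=True)
                       - X.mean(axis=0, keepdims=True) + grand)
            ms_e = (resid ** 2).sum() / ((n - 1) * (k - 1))
            return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

        rng = np.random.default_rng(2)
        truth = rng.normal(30, 4, size=(8, 1))            # 8 participants
        scores = truth + rng.normal(0, 0.5, size=(8, 3))  # 3 repeated sessions
        print(f"ICC(2,1) = {icc_2_1(scores):.3f}")        # near 1 = high reliability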
  207. [Selection of risk and diagnosis in diabetic polyneuropathy. Validation of method of new systems].

    PubMed

    Jurado, Jerónimo; Caula, Jacinto; Pou i Torelló, Josep Maria

    2006-06-30

    In a previous study we developed a specific algorithm, the polyneuropathy selection method (PSM), with 4 parameters (age, HDL-C, HbA1c, and retinopathy), to select patients at risk of diabetic polyneuropathy (DPN). We also developed a simplified method for DPN diagnosis: outpatient polyneuropathy diagnosis (OPD), with 4 variables (symptoms and 3 objective tests). Objectives: to confirm the validity of conventional tests for DPN diagnosis; to validate the discriminatory power of the PSM and the diagnostic value of the OPD by evaluating their relationship to electrodiagnostic studies and objective clinical neurological assessment; and to evaluate the correlation of DPN with pro-inflammatory status. Design: cross-sectional, crossed association for PSM validation; paired samples for OPD validation. Setting: primary care in 3 counties. Participants: a random sample of 75 subjects from the type-2 diabetes census for PSM evaluation, and 30 DPN patients and 30 non-DPN patients (from 2 DM2 subgroups in our earlier study) for OPD evaluation. The gold standard for DPN diagnosis will be established by means of a clinical neurological study (symptoms, physical examination, and sensitivity tests) and electrodiagnostic studies (sensory and motor EMG). Risk of neuropathy, macroangiopathy, and pro-inflammatory status (CRP, TNF soluble fraction, and total TGF-beta1) will be studied in every subject. Electrodiagnostic studies should confirm the validity of conventional tests for DPN diagnosis. PSM and OPD are expected to be valid methods for selecting patients at risk and diagnosing DPN, and a significant relationship between DPN and pro-inflammatory markers is anticipated.

  208. Reliability and Validity of the Footprint Assessment Method Using Photoshop CS5 Software.

    PubMed

    Gutiérrez-Vilahú, Lourdes; Massó-Ortigosa, Núria; Costa-Tutusaus, Lluís; Guerra-Balic, Myriam

    2015-05-01

    Several sophisticated methods of footprint analysis currently exist. However, it is sometimes useful to apply standard measurement methods of recognized evidence with an easy and quick application. We sought to assess the reliability and validity of a new method of footprint assessment in a healthy population using Photoshop CS5 software (Adobe Systems Inc, San Jose, California). Forty-two footprints, corresponding to 21 healthy individuals (11 men with a mean ± SD age of 20.45 ± 2.16 years and 10 women with a mean ± SD age of 20.00 ± 1.70 years), were analyzed. Footprints were recorded in a static bipedal standing position using optical podography and digital photography. Three trials were performed for each participant. The Hernández-Corvo, Chippaux-Smirak, and Staheli indices and the Clarke angle were calculated by a manual method and by a computerized method using Photoshop CS5 software. Test-retest was used to determine reliability. Validity was assessed with the intraclass correlation coefficient (ICC). The reliability test for all of the indices showed high values (ICC, 0.98-0.99). Moreover, the validity test clearly showed no difference between techniques (ICC, 0.99-1). The reliability and validity of a method to measure, assess, and record the podometric indices using Photoshop CS5 software has been demonstrated. This provides a quick and accurate tool useful for the digital recording of morphostatic foot study parameters and their control.
  209. Measuring landscape esthetics: the scenic beauty estimation method

    Treesearch

    Daniel, Terry C.; Boster, Ron S.

    1976-01-01

    The Scenic Beauty Estimation Method (SBE) provides quantitative measures of esthetic preferences for alternative wildland management systems. Extensive experimentation and testing with user, interest, and professional groups validated the method. SBE shows promise as an efficient and objective means for assessing the scenic beauty of public forests and wildlands, and...

  210. 77 FR 27135 - HACCP Systems Validation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-09

    .... Comments may be submitted by either of the following methods: Federal eRulemaking Portal: This Web site... those CCPs and the method of monitoring of them and provides certificates of analysis that specify the sampling method that the supplier uses and the results of that sampling. The receiving establishment should...

  211. Validation of the Behavioral Risk Factor Surveillance System Sleep Questions

    PubMed Central

    Jungquist, Carla R.; Mund, Jaime; Aquilina, Alan T.; Klingman, Karen; Pender, John; Ochs-Balcom, Heather; van Wijngaarden, Edwin; Dickerson, Suzanne S.

    2016-01-01

    Study Objective: Sleep problems may constitute a risk for health problems, including cardiovascular disease, depression, diabetes, poor work performance, and motor vehicle accidents. The primary purpose of this study was to assess the validity of the current Behavioral Risk Factor Surveillance System (BRFSS) sleep questions by establishing their sensitivity and specificity for detection of sleep/wake disturbance. Methods: Repeated cross-sectional assessment of 300 community-dwelling adults over the age of 18 who did not wear CPAP or oxygen during sleep. Reliability and validity testing of the BRFSS sleep questions was performed by comparing BRFSS responses to data from home sleep studies, actigraphy for 14 days, the Insomnia Severity Index, the Epworth Sleepiness Scale, and the PROMIS-57. Results: Only two of the five BRFSS sleep questions were found valid and reliable for determining total sleep time and excessive daytime sleepiness. Conclusions: Refinement of the BRFSS questions is recommended. Citation: Jungquist CR, Mund J, Aquilina AT, Klingman K, Pender J, Ochs-Balcom H, van Wijngaarden E, Dickerson SS. Validation of the behavioral risk factor surveillance system sleep questions. J Clin Sleep Med 2016;12(3):301–310. PMID:26446246
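    Validation of screening questions of this kind ultimately reduces to a 2x2 table against a reference measure. The sketch below, with invented counts, shows the sensitivity and specificity computation that such a validation reports.

        def sens_spec(tp, fp, fn, tn):
            sensitivity = tp / (tp + fn)   # true cases the question detects
            specificity = tn / (tn + fp)   # non-cases it correctly rules out
            return sensitivity, specificity

        sens, spec = sens_spec(tp=62, fp=21, fn=18, tn=199)
        print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")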
  212. Bio-analytical method development and validation of Rasagiline by high performance liquid chromatography tandem mass spectrometry detection and its application to pharmacokinetic study

    PubMed Central

    Konda, Ravi Kumar; Chandu, Babu Rao; Challa, B.R.; Kothapalli, Chandrasekhar B.

    2012-01-01

    A bio-analytical method based on liquid-liquid extraction has been developed and validated for the quantification of Rasagiline in human plasma. Rasagiline-13C3 mesylate was used as the internal standard for Rasagiline. A Zorbax Eclipse Plus C18 (2.1 mm × 50 mm, 3.5 μm) column provided chromatographic separation of the analyte, followed by detection with mass spectrometry. The method involved a simple isocratic chromatographic condition and mass spectrometric detection in the positive ionization mode using an API-4000 system. The total run time was 3.0 min. The proposed method was validated over a linear range of 5-12000 pg/mL for Rasagiline. The intra-run and inter-run precision values were within 1.3%-2.9% and 1.6%-2.2%, respectively. The overall recovery for Rasagiline and the Rasagiline-13C3 mesylate analog was 96.9% and 96.7%, respectively. This validated method was successfully applied to a bioequivalence and pharmacokinetic study in human volunteers under fasting conditions. PMID:29403764

  213. Development and Validation of a Rapid (13)C6-Glucose Isotope Dilution UPLC-MRM Mass Spectrometry Method for Use in Determining System Accuracy and Performance of Blood Glucose Monitoring Devices.

    PubMed

    Matsunami, Risë K; Angelides, Kimon; Engler, David A

    2015-05-18

    There is currently considerable discussion about the accuracy of blood glucose concentrations determined by personal blood glucose monitoring systems (BGMS). To date, the FDA has allowed new BGMS to demonstrate accuracy in reference to other glucose measurement systems that use the same or similar enzymatic-based methods to determine glucose concentration. These types of reference measurement procedures are only comparative in nature and are subject to the same potential sources of measurement error and system perturbations as the device under evaluation. It would be ideal to have a completely orthogonal primary method that could serve as a true standard reference measurement procedure for establishing the accuracy of new BGMS. An isotope-dilution liquid chromatography/mass spectrometry (ID-UPLC-MRM) assay was developed using (13)C6-glucose as a stable isotope analogue to specifically measure glucose concentration in human plasma, and validated for use against NIST standard reference materials and against fresh isolates of whole blood and plasma into which exogenous glucose had been spiked. Assay performance was quantified against NIST-traceable dry weight measures for both glucose and (13)C6-glucose. The newly developed assay method was shown to be rapid, highly specific, sensitive, accurate, and precise for measuring plasma glucose levels. The assay displayed sufficient dynamic range and linearity to measure across the range of both normal and diabetic blood glucose levels. Assay performance was measured to within the same uncertainty levels (<1%) as the NIST definitive method for glucose measurement in human serum. The newly developed ID-UPLC-MRM assay can serve as a validated reference measurement procedure against which new BGMS can be assessed for glucose measurement performance. © 2015 Diabetes Technology Society.
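    Both records above rest on calibration curves and replicate precision. The following sketch shows the two core calculations for an isotope-dilution assay: back-calculating an unknown from a standard curve of analyte/internal-standard response ratios, and the coefficient of variation of replicate QC injections. The numbers are invented, and production assays typically use weighted rather than ordinary least squares.

        import numpy as np

        conc = np.array([5, 50, 500, 5000, 12000], float)    # calibrators, pg/mL
        ratio = 1.9e-4 * conc + 0.001                        # simulated area ratios

        slope, intercept = np.polyfit(conc, ratio, 1)
        unknown_ratio = 0.38
        print(f"back-calculated: {(unknown_ratio - intercept) / slope:.0f} pg/mL")

        qc = np.array([1980, 2040, 2010, 1995, 2025], float) # replicate QC results
        print(f"intra-run CV = {100 * qc.std(ddof=1) / qc.mean():.1f}%")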
  214. Validation of selected analytical methods using accuracy profiles to assess the impact of a Tobacco Heating System on indoor air quality.

    PubMed

    Mottier, Nicolas; Tharin, Manuel; Cluse, Camille; Crudo, Jean-René; Lueso, María Gómez; Goujon-Ginglinger, Catherine G; Jaquier, Anne; Mitova, Maya I; Rouget, Emmanuel G R; Schaller, Mathieu; Solioz, Jennifer

    2016-09-01

    Studies in environmentally controlled rooms have been used over the years to assess the impact of environmental tobacco smoke on indoor air quality. As new tobacco products are developed, it is important to determine their impact on air quality when used indoors. Before such an assessment can take place, it is essential that the analytical methods used to assess indoor air quality are validated and shown to be fit for their intended purpose. Consequently, for this assessment, an environmentally controlled room was built and seven analytical methods, representing eighteen analytes, were validated. The validations were carried out with smoking machines using a matrix-based approach applying the accuracy profile procedure. The performances of the methods were compared for all three matrices under investigation: background air samples, the environmental aerosol of Tobacco Heating System THS 2.2 (a heat-not-burn tobacco product developed by Philip Morris International), and the environmental tobacco smoke of a cigarette. The environmental aerosol generated by the THS 2.2 device did not have any appreciable impact on the performance of the methods. The comparison between the background and THS 2.2 environmental aerosol samples generated by smoking machines showed that only five compounds were higher when THS 2.2 was used in the environmentally controlled room. Regarding environmental tobacco smoke from cigarettes, the yields of all analytes were clearly above those obtained with the other two air sample types. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
  215. Domain Decomposition Algorithms for First-Order System Least Squares Methods

    NASA Technical Reports Server (NTRS)

    Pavarino, Luca F.

    1996-01-01

    Least squares methods based on first-order systems have recently been proposed and analyzed for second-order elliptic equations and systems. They produce symmetric and positive definite discrete systems by using standard finite element spaces, which are not required to satisfy the inf-sup condition. In this paper, several domain decomposition algorithms for these first-order least squares methods are studied. Some representative overlapping and substructuring algorithms are considered in their additive and multiplicative variants. The theoretical and numerical results obtained show that the classical convergence bounds (on the iteration operator) for standard Galerkin discretizations are also valid for least squares methods.

  216. Validation of the Seating and Mobility Script Concordance Test

    ERIC Educational Resources Information Center

    Cohen, Laura J.; Fitzgerald, Shirley G.; Lane, Suzanne; Boninger, Michael L.; Minkel, Jean; McCue, Michael

    2009-01-01

    The purpose of this study was to develop the scoring system for the Seating and Mobility Script Concordance Test (SMSCT), to obtain and appraise internal and external structure evidence, and to assess the validity of the SMSCT. The purpose of the SMSCT is to provide a method for testing knowledge of seating and mobility prescription. A sample of 106 therapists…

  217. Lather (Interior Systems Mechanic). Occupational Analyses Series.

    ERIC Educational Resources Information Center

    Chapman, Mike; Chapman, Carol; MacLean, Margaret

    This analysis covers tasks performed by a lather, an occupational title that some provinces and territories of Canada have also identified as drywall and acoustical mechanic, interior systems installer, and interior systems mechanic. A guide to the analysis discusses development, structure, and validation method; scope of the occupation; trends; and…
  218. A New Time Measurement Method Using a High-End Global Navigation Satellite System to Analyze Alpine Skiing

    ERIC Educational Resources Information Center

    Supej, Matej; Holmberg, Hans-Christer

    2011-01-01

    Accurate time measurement is essential to temporal analysis in sport. This study aimed to (a) develop a new method for time computation from surveyed trajectories using a high-end global navigation satellite system (GNSS), (b) validate its precision by comparing GNSS with photocells, and (c) examine whether gate-to-gate times can provide more…

  219. Unsupervised Learning and Pattern Recognition of Biological Data Structures with Density Functional Theory and Machine Learning.

    PubMed

    Chen, Chien-Chang; Juan, Hung-Hui; Tsai, Meng-Yuan; Lu, Henry Horng-Shing

    2018-01-11

    By introducing methods of machine learning into density functional theory, we made a detour for the construction of the most probable density function, which can be estimated by learning relevant features from the system of interest. Using the properties of the universal functional, the vital core of density functional theory, the most probable cluster number and the corresponding cluster boundaries in a system under study can be determined simultaneously and automatically, with their plausibility grounded in the Hohenberg-Kohn theorems. For method validation and pragmatic applications, interdisciplinary problems from physical to biological systems were examined. The amalgamation of uncharged atomic clusters validated the unsupervised search for cluster numbers, and the corresponding cluster boundaries were exhibited likewise. Highly accurate clustering results on the Fisher iris dataset showed the feasibility and flexibility of the proposed scheme. Brain tumor detection from low-dimensional magnetic resonance imaging datasets and segmentation of high-dimensional neural network imagery in the Brainbow system were also used to inspect the method's practicality. The experimental results exhibit a successful connection between the physical theory and machine learning methods and will benefit clinical diagnoses.
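    The density-functional clustering above determines the number of clusters automatically. As a point of reference only (this is a conventional baseline, not the authors' method), the same model-selection question is often answered on the Fisher iris data by scanning candidate cluster counts with silhouette scores:

        from sklearn.cluster import KMeans
        from sklearn.datasets import load_iris
        from sklearn.metrics import silhouette_score

        X = load_iris().data
        scores = {}
        for k in range(2, 7):
            labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
            scores[k] = silhouette_score(X, labels)
        print(scores, "-> chosen k =", max(scores, key=scores.get))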
  220. Imputation of missing data in time series for air pollutants

    NASA Astrophysics Data System (ADS)

    Junger, W. L.; Ponce de Leon, A.

    2015-02-01

    Missing data are a major concern in epidemiological studies of the health effects of environmental air pollutants. This article presents an imputation-based method that is suitable for multivariate time series data, using the EM algorithm under the assumption of a normal distribution. Different approaches are considered for filtering the temporal component. A simulation study was performed to assess the validity and performance of the proposed method in comparison with some frequently used methods. Simulations showed that, when the amount of missing data was as low as 5%, the complete data analysis yielded satisfactory results regardless of the generating mechanism of the missing data, whereas the validity began to degenerate when the proportion of missing values exceeded 10%. The proposed imputation method exhibited good accuracy and precision in different settings with respect to the patterns of missing observations. Most of the imputations obtained valid results, even under missing not at random. The methods proposed in this study are implemented as a package called mtsdi for the statistical software system R.
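    A much simplified version of the normal-model imputation described above can be written directly: initialize missing cells with column means, then alternate between refitting the Gaussian parameters and replacing missing values with their conditional means. This sketch omits the temporal filtering that mtsdi adds and is not the package's algorithm.

        import numpy as np

        def em_impute(X, n_iter=50):
            """Iteratively impute NaNs with conditional means of a fitted Gaussian."""
            X = X.copy()
            miss = np.isnan(X)
            X[miss] = np.take(np.nanmean(X, axis=0), np.where(miss)[1])  # crude init
            for _ in range(n_iter):
                mu = X.mean(axis=0)
                cov = np.cov(X, rowvar=False)
                for i in np.where(miss.any(axis=1))[0]:
                    m, o = miss[i], ~miss[i]
                    if not o.any():          # row entirely missing: fall back to mean
                        X[i, m] = mu[m]
                        continue
                    # Conditional mean of missing entries given the observed ones
                    beta = cov[np.ix_(m, o)] @ np.linalg.pinv(cov[np.ix_(o, o)])
                    X[i, m] = mu[m] + beta @ (X[i, o] - mu[o])
            return X

        rng = np.random.default_rng(3)
        data = rng.multivariate_normal(
            [0, 0, 0], [[1, .6, .3], [.6, 1, .4], [.3, .4, 1]], 500)
        holes = data.copy()
        holes[rng.random(holes.shape) < 0.05] = np.nan   # ~5% missing, as in the study
        imputed = em_impute(holes)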
  221. Development and validation of a high performance liquid chromatographic method for simultaneous determination of vitamins A and D3 in fluid milk products.

    PubMed

    Chen, Yang; Reddy, Ravinder M; Li, Wenjing; Yettlla, Ramesh R; Lopez, Salvador; Woodman, Michael

    2015-01-01

    An HPLC method for the simultaneous determination of vitamins A and D3 in fluid milk was developed and validated. Saponification and extraction conditions were studied for optimum recovery and simplicity. An RP HPLC system equipped with a C18 column and a diode array detector was used for quantitation. The method was subjected to a single-laboratory validation using skim, 2% fat, and whole milk samples at concentrations of 50, 100, and 200% of the recommended fortification levels for vitamins A and D3 for Grade "A" fluid milk. The method quantitation limits for vitamins A and D3 were 0.0072 and 0.0026 μg/mL, respectively. Average recoveries between 94 and 110% and SD values ranging from 2.7 to 6.9% were obtained for both vitamins. The accuracy of the method was evaluated using a National Institute of Standards and Technology standard reference material (1849a) and proficiency test samples.

  222. Development and Validation of an Extractive Spectrophotometric Method for Miconazole Nitrate Assay in Pharmaceutical Formulations.

    PubMed

    Eticha, Tadele; Kahsay, Getu; Hailu, Teklebrhan; Gebretsadikan, Tesfamichael; Asefa, Fitsum; Gebretsadik, Hailekiros; Thangabalan, Boovizhikannan

    2018-01-01

    A simple extractive spectrophotometric technique has been developed and validated for the determination of miconazole nitrate in pure form and in pharmaceutical formulations. The method is based on the formation of a chloroform-soluble ion-pair complex between the drug and bromocresol green (BCG) dye in an acidic medium. The complex showed an absorption maximum at 422 nm, and the system obeys Beer's law in the concentration range of 1-30 μg/mL with a molar absorptivity of 2.285 × 10^4 L/mol/cm. The composition of the complex was studied by Job's method of continuous variation, and the results revealed that the drug:BCG mole ratio is 1:1. A full factorial design was used to optimize the effects of variable factors, and the method was validated according to the ICH guidelines. The method was applied to the determination of miconazole nitrate in real samples.
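    With the molar absorptivity reported above, a Beer-Lambert back-calculation takes a few lines. The sketch assumes a 1 cm cell and an approximate molecular weight for miconazole nitrate; both are assumptions, not values stated in the record.

        EPSILON = 2.285e4        # L/mol/cm, from the record
        MW = 479.1               # g/mol, miconazole nitrate (approximate, assumed)
        PATH_CM = 1.0            # assumed cell path length

        def conc_ug_per_ml(absorbance):
            molar = absorbance / (EPSILON * PATH_CM)   # A = epsilon * l * c
            return molar * MW * 1e3                    # mol/L -> ug/mL

        print(f"{conc_ug_per_ml(0.48):.1f} ug/mL")     # ~10 ug/mL, inside the linear range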
  223. Validity of the Child Facial Coding System for the Assessment of Acute Pain in Children With Cerebral Palsy.

    PubMed

    Hadden, Kellie L; LeFort, Sandra; O'Brien, Michelle; Coyte, Peter C; Guerriere, Denise N

    2016-04-01

    The purpose of the current study was to examine the concurrent and discriminant validity of the Child Facial Coding System for children with cerebral palsy. Eighty-five children (mean = 8.35 years, SD = 4.72 years) were videotaped during a passive joint stretch with their physiotherapist and during 3 time segments: baseline, passive joint stretch, and recovery. Children's pain responses were rated from videotape using the Numerical Rating Scale and the Child Facial Coding System. Results indicated that Child Facial Coding System scores during the passive joint stretch significantly correlated with Numerical Rating Scale scores (r = .72, P < .01). Child Facial Coding System scores were also significantly higher during the passive joint stretch than during the baseline and recovery segments (P < .001). Facial activity was not significantly correlated with the developmental measures. These findings suggest that the Child Facial Coding System is a valid method of identifying pain in children with cerebral palsy. © The Author(s) 2015.

  224. Transport aircraft loading and balancing system: Using a CLIPS expert system for military aircraft load planning

    NASA Technical Reports Server (NTRS)

    Richardson, J.; Labbe, M.; Belala, Y.; Leduc, Vincent

    1994-01-01

    The requirement for improving aircraft utilization and responsiveness in airlift operations has been recognized for quite some time by the Canadian Forces. To date, the utilization of scarce airlift resources has been planned mainly through the employment of manpower-intensive manual methods in combination with the expertise of highly qualified personnel. In this paper, we address the problem of facilitating the load planning process for military cargo aircraft through the development of a computer-based system. We introduce TALBAS (Transport Aircraft Loading and BAlancing System), a knowledge-based system designed to assist personnel involved in preparing valid load plans for the C130 Hercules aircraft. The main features of this system, which are accessible through a convivial graphical user interface, consist of the automatic generation of valid cargo arrangements given a list of items to be transported, the user-definition of load plans, and the automatic validation of such load plans.

  225. Natural language processing in pathology: a scoping review.

    PubMed

    Burger, Gerard; Abu-Hanna, Ameen; de Keizer, Nicolette; Cornet, Ronald

    2016-07-22

    Encoded pathology data are key for medical registries and analyses, but pathology information is often expressed as free text. We reviewed and assessed the use of NLP (natural language processing) for encoding pathology documents. Papers addressing NLP in pathology were retrieved from PubMed, the Association for Computing Machinery (ACM) Digital Library, and the Association for Computational Linguistics (ACL) Anthology. We reviewed and summarised the study objectives; the NLP methods used and their validation; software implementations; performance on the datasets used; and any reported use in practice. The main objectives of the 38 included papers were encoding and extraction of clinically relevant information from pathology reports. Common approaches were word/phrase matching, probabilistic machine learning, and rule-based systems. Five papers (13%) compared different methods on the same dataset. Four papers did not specify the method(s) used. Eighteen of the 26 studies that reported F-measure, recall, or precision reported values of over 0.9. Proprietary software was the most frequently mentioned category (14 studies); General Architecture for Text Engineering (GATE) was the most applied architecture overall. Practical system use was reported in four papers. Most papers used expert annotation for validation. Different methods are used in NLP research in pathology, and good performances, that is, high precision and recall and high retrieval/removal rates, are reported for all of them. Lack of validation and of shared datasets precludes performance comparison. More comparative analysis and validation are needed to provide better insight into the performance and merits of these methods. Published by the BMJ Publishing Group Limited.
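    The F-measure, recall, and precision figures summarised in the review above are computed against expert annotations. A minimal sketch, with invented codes standing in for extracted pathology concepts:

        def precision_recall_f1(gold, predicted):
            gold, predicted = set(gold), set(predicted)
            tp = len(gold & predicted)
            p = tp / len(predicted) if predicted else 0.0
            r = tp / len(gold) if gold else 0.0
            f1 = 2 * p * r / (p + r) if (p + r) else 0.0
            return p, r, f1

        gold_codes = {"M8140/3", "C50.9", "M8500/3"}     # hypothetical annotations
        extracted = {"M8140/3", "M8500/3", "C34.1"}      # hypothetical system output
        print("P=%.2f R=%.2f F1=%.2f" % precision_recall_f1(gold_codes, extracted))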
  226. Validation of quantitative and qualitative methods for detecting allergenic ingredients in processed foods in Japan.

    PubMed

    Sakai, Shinobu; Adachi, Reiko; Akiyama, Hiroshi; Teshima, Reiko

    2013-06-19

    A labeling system for allergenic food ingredients was established in Japan in April 2002. To monitor the labeling, the Japanese government announced official methods for detecting allergens in processed foods in November 2002. The official methods consist of quantitative screening tests using enzyme-linked immunosorbent assays (ELISAs) and qualitative confirmation tests using Western blotting or the polymerase chain reaction (PCR). In addition, the Japanese government designated 10 μg protein/g food (the weight of soluble protein from the allergenic ingredient per weight of food), determined by ELISA, as the labeling threshold. To standardize the official methods, the criteria for the validation protocol were described in the official guidelines. This paper, which was presented at the Advances in Food Allergen Detection Symposium, ACS National Meeting and Expo, San Diego, CA, Spring 2012, describes the validation protocol outlined in the official Japanese guidelines, the results of interlaboratory studies of the quantitative detection method (ELISA for crustacean proteins) and the qualitative detection method (PCR for shrimp and crab DNA), and the reliability of the detection methods.
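    The screening decision in the Japanese system reduces to interpolating an ELISA reading on a standard curve and comparing the result with the 10 μg protein/g food threshold. The curve points below are invented for illustration.

        import numpy as np

        std_conc = np.array([0.0, 1.0, 5.0, 10.0, 20.0, 50.0])    # ug protein/g food
        std_od = np.array([0.05, 0.12, 0.45, 0.80, 1.35, 2.10])   # absorbance readings

        def allergen_content(od):
            return float(np.interp(od, std_od, std_conc))         # linear interpolation

        content = allergen_content(0.95)
        print(f"{content:.1f} ug/g ->",
              "labeling required" if content >= 10 else "below threshold")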
  227. Validated modified Lycopodium spore method development for standardisation of ingredients of an ayurvedic powdered formulation Shatavaryadi churna.

    PubMed

    Kumar, Puspendra; Jha, Shivesh; Naved, Tanveer

    2013-01-01

    A validated modified Lycopodium spore method has been developed for simple and rapid quantification of powdered herbal drugs. The Lycopodium spore method was applied to the ingredients of Shatavaryadi churna, an ayurvedic formulation used as an immunomodulator, galactagogue, aphrodisiac, and rejuvenator. The diagnostic characters of each ingredient of Shatavaryadi churna were estimated individually. Microscopic determination, counting of identifying structures, and measurement of the area, length, and breadth of identifying characters were performed using a Leica DMLS-2 microscope. The method was validated for intraday precision, linearity, specificity, repeatability, accuracy, and system suitability. The method is simple, precise, sensitive, and accurate, and can be used for routine standardisation of raw materials of herbal drugs. It gives the ratio of individual ingredients in the powdered drug, so that adulteration of the genuine drug with its adulterant can be detected. The method shows very good linearity, with values between 0.988 and 0.999 for the number and the area of identifying characters. The percentage purity of a sample drug can be determined using the linear equation of the standard genuine drug.

  228. Development of the Gross Motor Function Classification System (1997)

    ERIC Educational Resources Information Center

    Morris, Christopher

    2008-01-01

    To address the need for a standardized system to classify the gross motor function of children with cerebral palsy, the authors developed a five-level classification system analogous to the staging and grading systems used in medicine. Nominal group process and Delphi survey consensus methods were used to examine content validity and revise the…

  229. Changing reward systems for team-based systems.

    PubMed

    Barksdale, G T

    1998-12-01

    With the rapidly changing pace of health care, hospitals are struggling to keep costs under control and to remain competitive. Leadership is increasingly convinced that old methods of compensation are no longer valid and is thus turning to innovative approaches to pay and reward systems. This article describes some of the new pay methods, with an emphasis on team rewards, showing that compensation can keep pace with the evolving needs of health care.

  230. Trachomatous Scar Ranking: A Novel Outcome for Trachoma Studies.

    PubMed

    Baldwin, Angela; Ryner, Alexander M; Tadesse, Zerihun; Shiferaw, Ayalew; Callahan, Kelly; Fry, Dionna M; Zhou, Zhaoxia; Lietman, Thomas M; Keenan, Jeremy D

    2017-06-01

    We evaluated a new trachoma scarring ranking system with potential use in clinical research. The upper right tarsal conjunctivas of 427 individuals from Ethiopian villages with hyperendemic trachoma were photographed. An expert grader first assigned a scar grade to each photograph using the 1981 World Health Organization (WHO) grading system. Then, all photographs were ranked from least (rank = 1) to most scarring (rank = 427). Photographic grading found 79 (18.5%) conjunctivae without scarring (C0), 191 (44.7%) with minimal scarring (C1), 105 (24.6%) with moderate scarring (C2), and 52 (12.2%) with severe scarring (C3). The ranking method demonstrated good internal validity, exhibiting a monotonic increase in the median rank across the levels of the 1981 WHO grading system. Intrarater repeatability was better for the ranking method (intraclass correlation coefficient = 0.84, 95% CI = 0.74-0.94). Exhibiting better internal and external validity, this ranking method may be useful for evaluating differences in scarring between groups of individuals.
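    The internal-validity argument in the trachoma record (median rank rising monotonically with WHO grade) can be checked mechanically. A sketch with invented ranks:

        import numpy as np

        ranks_by_grade = {
            "C0": [12, 40, 77, 103],
            "C1": [95, 150, 188, 240],
            "C2": [210, 265, 300, 351],
            "C3": [330, 371, 402, 425],
        }
        medians = [float(np.median(r)) for r in ranks_by_grade.values()]
        print("medians:", medians,
              "monotonic:", all(a < b for a, b in zip(medians, medians[1:])))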
  231. Evaluation of objectivity, reliability and criterion validity of the key indicator method for manual handling operations (KIM-MHO), draft 2007.

    PubMed

    Klußmann, André; Gebhardt, Hansjürgen; Rieger, Monika; Liebers, Falk; Steinberg, Ulf

    2012-01-01

    Upper extremity musculoskeletal symptoms and disorders are common in the working population, and their economic and social impact is considerable. Long-term, dynamic repetitive exposure of the hand-arm system during manual handling operations (MHO), alone or in combination with static and postural effort, is recognised as a cause of musculoskeletal symptoms and disorders. The assessment of these manual work tasks is crucial for estimating the health risks of exposed employees. For these work tasks, a new method for the assessment of working conditions was developed and a validation study was performed. The results suggest satisfactory criterion validity and moderate objectivity of the KIM-MHO draft 2007. The method was modified and evaluated again, and release of a new version of KIM-MHO was planned for spring 2012.

  232. Validating silicon polytrodes with paired juxtacellular recordings: method and dataset

    PubMed Central

    Lopes, Gonçalo; Frazão, João; Nogueira, Joana; Lacerda, Pedro; Baião, Pedro; Aarts, Arno; Andrei, Alexandru; Musa, Silke; Fortunato, Elvira; Barquinha, Pedro; Kampff, Adam R.

    2016-01-01

    Cross-validating new methods for recording neural activity is necessary to accurately interpret and compare the signals they measure. Here we describe a procedure for precisely aligning two probes for in vivo "paired recordings" such that the spiking activity of a single neuron is monitored with both a dense extracellular silicon polytrode and a juxtacellular micropipette. Our new method allows for efficient, reliable, and automated guidance of both probes to the same neural structure with micrometer resolution. We also describe a new dataset of paired recordings, which is available online. We propose that our novel targeting system, and the ever-expanding cross-validation dataset, will be vital to the development of new algorithms for automatically detecting and sorting single units, for characterizing new electrode materials and designs, and for resolving nagging questions regarding the origin and nature of extracellular neural signals. PMID:27306671
  233. A systematic review of the quality of homeopathic clinical trials

    PubMed Central

    Jonas, Wayne B; Anderson, Rachel L; Crawford, Cindy C; Lyons, John S

    2001-01-01

    Background: While a number of reviews of homeopathic clinical trials have been done, all have used methods dependent on allopathic diagnostic classifications foreign to homeopathic practice. In addition, no review has used established and validated quality criteria allowing direct comparison of the allopathic and homeopathic literature. Methods: In a systematic review, we compared the quality of clinical-trial research in homeopathy to a sample of research on conventional therapies using a validated and system-neutral approach. All clinical trials on homeopathic treatments with parallel treatment groups published between 1945 and 1995 in English were selected. All were evaluated with an established set of 33 validity criteria previously validated on a broad range of health interventions across differing medical systems. The criteria covered statistical conclusion, internal, construct, and external validity. Reliability of criteria application is greater than 0.95. Results: 59 studies met the inclusion criteria. Of these, 79% were from peer-reviewed journals, 29% used a placebo control, 51% used random assignment, and 86% failed to consider potentially confounding variables. The main validity problems were in measurement: 96% did not report the proportion of subjects screened, and 64% did not report the attrition rate. 17% of subjects dropped out in studies where this was reported. There was practically no replication of or overlap in the conditions studied, and most studies were relatively small and conducted at a single site. Compared to research on conventional therapies, the overall quality of studies in homeopathy was worse and only slightly improved in more recent years. Conclusions: Clinical homeopathic research is clearly in its infancy, with most studies using poor sampling and measurement techniques, few subjects, single sites, and no replication. Many of these problems are correctable even within a "holistic" paradigm, given sufficient research expertise, support, and methods. PMID:11801202
  234. Toward quantitative estimation of material properties with dynamic mode atomic force microscopy: a comparative study.

    PubMed

    Ghosal, Sayan; Gannepalli, Anil; Salapaka, Murti

    2017-08-11

    In this article, we explore methods that enable the estimation of material properties with dynamic mode atomic force microscopy, suitable for soft matter investigation. The article presents the viewpoint of casting the system, comprising a flexure probe interacting with the sample, as an equivalent cantilever system, and compares a steady-state analysis based method with a recursive estimation technique for determining the parameters of the equivalent cantilever system in real time. The steady-state analysis of the equivalent cantilever model, which has been implicitly assumed in studies on material property determination, is validated analytically and experimentally. We show that the steady-state based technique yields results that quantitatively agree with the recursive method in the domain of its validity. The steady-state technique is considerably simpler to implement, but slower than the recursive technique. The parameters of the equivalent system are utilized to interpret the storage and dissipative properties of the sample. Finally, the article identifies key pitfalls that need to be avoided for quantitative estimation of material properties.

  235. Task-oriented evaluation of electronic medical records systems: development and validation of a questionnaire for physicians.

    PubMed

    Laerum, Hallvard; Faxvaag, Arild

    2004-02-09

    Evaluation is a challenging but necessary part of the development cycle of clinical information systems such as the electronic medical records (EMR) system. It is believed that such evaluations should include multiple perspectives, be comparative, and employ both qualitative and quantitative methods. Self-administered questionnaires are frequently used as a quantitative evaluation method in medical informatics, but very few validated questionnaires address the clinical use of EMR systems. We have developed a task-oriented questionnaire for evaluating EMR systems from the clinician's perspective. The key feature of the questionnaire is a list of 24 general clinical tasks. It is applicable to physicians of most specialties and covers essential parts of their information-oriented work. The task list appears in two separate sections, covering EMR use and task performance using the EMR, respectively. By combining these sections, the evaluator may estimate the potential impact of the EMR system on health care delivery. The results may also be compared across time, site, or vendor. This paper describes the development, performance, and validation of the questionnaire. Its performance is shown in two demonstration studies (n = 219 and 80), its content is validated in an interview study (n = 10), and its reliability is investigated in a test-retest study (n = 37) and a scaling study (n = 31). In the interviews, the physicians found the general clinical tasks in the questionnaire relevant and comprehensible, and the tasks were interpreted in concordance with their definitions. However, the physicians found questions about tasks not explicitly or only partially supported by the EMR systems difficult to answer. The two demonstration studies provided unambiguous results and low percentages of missing responses. In addition, criterion validity was demonstrated for a majority of the task-oriented questions. Their test-retest reliability was generally high, and the non-standard scale was found to be symmetric and ordinal. This questionnaire is relevant to clinical work and EMR systems, provides reliable and interpretable results, and may be used as part of any evaluation effort involving the clinician's perspective of an EMR system.
  236. Study of glass hydrometer calibration by hydrostatic weighing

    NASA Astrophysics Data System (ADS)

    Chen, Chaoyun; Wang, Jintao; Li, Zhihao; Zhang, Peiman

    2016-01-01

    Glass hydrometers are simple but effective instruments for measuring the density of liquids. Based on the Archimedes principle, a glass hydrometer calibration system using the hydrostatic weighing method was designed, with a silicon ring as the solid density reference standard and n-tridecane, chosen for its density stability and low surface tension, as the standard working liquid. The calibration system uses a CCD image measurement system to align the hydrometer scale with the liquid surface, with a positioning accuracy of 0.01 mm. The surface tension of the working liquid is measured with a Wilhelmy plate. From two weighings of the glass hydrometer, in air and in the liquid, the correction value of the current scale mark can be calculated. To verify the validity of the hydrostatic weighing principle of the calibration system, a hydrometer covering the density range of (770-790) kg/m3, with a resolution of 0.2 kg/m3, was measured. The results were compared with those of the Physikalisch-Technische Bundesanstalt (PTB), verifying the validity of the calibration system.
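    The Archimedes calculation behind the hydrometer calibration above is compact: the apparent mass loss of the silicon standard in the liquid equals the mass of displaced liquid. The sketch ignores air buoyancy and uses invented weighings; 2329 kg/m3 is a typical crystalline-silicon density, not a value from the record.

        def liquid_density(m_air_g, m_liquid_g, rho_si=2329.0):
            """Liquid density in kg/m3 from weighings of a silicon standard."""
            volume = (m_air_g / 1000.0) / rho_si             # V = m / rho_silicon
            return ((m_air_g - m_liquid_g) / 1000.0) / volume

        print(f"{liquid_density(100.000, 66.550):.1f} kg/m3")  # ~779, n-tridecane range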
237. Experiences Using Formal Methods for Requirements Modeling

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David

    1996-01-01

    This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, the formal modeling provided a cost-effective enhancement of the existing verification and validation processes. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.

238. Robustness Analysis and Reliable Flight Regime Estimation of an Integrated Resilient Control System for a Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Shin, Jong-Yeob; Belcastro, Christine

    2008-01-01

    Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. As a part of the validation process, this paper describes an analysis method for determining a reliable flight regime in the flight envelope, within which an integrated resilient control system can achieve the desired performance of tracking command signals and detecting additive faults in the presence of parameter uncertainty and unmodeled dynamics. To calculate a reliable flight regime, a structured singular value analysis method is applied to analyze the closed-loop system over the entire flight envelope. To use the structured singular value analysis method, a linear fractional transformation (LFT) model of the longitudinal dynamics of a transport aircraft is developed over the flight envelope using a preliminary LFT modeling software tool developed at the NASA Langley Research Center, which utilizes a matrix-based computational approach. The developed LFT model can capture the original nonlinear dynamics over the flight envelope with the Δ block, which contains the key varying parameters (angle of attack and velocity) and the real parameter uncertainties (aerodynamic coefficient uncertainty and moment of inertia uncertainty). Using the developed LFT model and a formal robustness analysis method, a reliable flight regime is calculated for a transport aircraft closed-loop system.
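    In an LFT model, the nominal plant is a fixed matrix M and everything that varies or is uncertain is collected in a block-diagonal Δ that closes a feedback loop around part of M; structured singular value (μ) analysis then asks how large a structured Δ the loop can tolerate. A numerical sketch of evaluating such a model at a given Δ (the partitioning and the values are illustrative, not the NASA tool's API):

        import numpy as np

        def upper_lft(M, Delta):
            """Evaluate F_u(M, Delta) = M22 + M21 Delta (I - M11 Delta)^-1 M12,
            with the upper-left block of M sized to match the square Delta."""
            n = Delta.shape[0]
            M11, M12 = M[:n, :n], M[:n, n:]
            M21, M22 = M[n:, :n], M[n:, n:]
            closed = np.linalg.solve(np.eye(n) - M11 @ Delta, M12)
            return M22 + M21 @ Delta @ closed

        # One normalized parameter swept over its admissible range [-1, 1]:
        M = np.array([[0.2, 1.0],
                      [0.5, -1.0]])
        for delta in (-1.0, 0.0, 1.0):
            print(delta, upper_lft(M, np.array([[delta]])))

    Robustness then reduces to whether any admissible Δ makes I - M11Δ singular; μ is the reciprocal of the size of the smallest structured Δ that does.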
239. Robust Measurements of Phase Response Curves Realized via Multicycle Weighted Spike-Triggered Averages

    NASA Astrophysics Data System (ADS)

    Imai, Takashi; Ota, Kaiichiro; Aoyagi, Toshio

    2017-02-01

    Phase reduction has been extensively used to study rhythmic phenomena. As a result of phase reduction, the rhythm dynamics of a given system can be described using the phase response curve. Measuring this characteristic curve is an important step toward understanding a system's behavior. Recently, a basic idea for a new measurement method (called the multicycle weighted spike-triggered average method) was proposed. This paper confirms the validity of this method by providing an analytical proof and demonstrates its effectiveness in actual experimental systems by applying the method to an oscillating electric circuit. Some practical tips for using the method are also presented.

240. Getting the most out of RNA-seq data analysis.

    PubMed

    Khang, Tsung Fei; Lau, Ching Yee

    2015-01-01

    Background. A common research goal in transcriptome projects is to find genes that are differentially expressed in different phenotype classes. Biologists might wish to validate such gene candidates experimentally, or use them for downstream systems biology analysis. Producing a coherent differential gene expression analysis from RNA-seq count data requires an understanding of how numerous sources of variation, such as the replicate size, the hypothesized biological effect size, and the specific method for making differential expression calls, interact. We believe an explicit demonstration of such interactions in real RNA-seq data sets is of practical interest to biologists. Results. Using two large public RNA-seq data sets, one representing strong and the other mild biological effect size, we simulated different replicate size scenarios and tested the performance of several commonly used methods for calling differentially expressed genes in each of them. We found that, when biological effect size was mild, RNA-seq experiments should focus on experimental validation of differentially expressed gene candidates. Importantly, at least triplicates must be used, and the differentially expressed genes should be called using methods with high positive predictive value (PPV), such as NOISeq or GFOLD. In contrast, when biological effect size was strong, differentially expressed genes mined from unreplicated experiments using NOISeq, ASC, and GFOLD had between 30 and 50% mean PPV, an increase of more than 30-fold compared to the cases of mild biological effect size. Among methods with good PPV performance, having triplicates or more substantially improved mean PPV to over 90% for GFOLD, 60% for DESeq2, 50% for NOISeq, and 30% for edgeR. At a replicate size of six, we found DESeq2 and edgeR to be reasonable methods for calling differentially expressed genes at systems-level analysis, as their PPV and sensitivity trade-off were superior to the other methods'. Conclusion. When biological effect size is weak, systems-level investigation is not possible using RNA-seq data, and no meaningful result can be obtained in unreplicated experiments. Nonetheless, NOISeq or GFOLD may yield limited numbers of gene candidates with good validation potential when triplicates or more are available. When biological effect size is strong, NOISeq and GFOLD are effective tools for detecting differentially expressed genes in unreplicated RNA-seq experiments for qPCR validation. When triplicates or more are available, GFOLD is a sharp tool for identifying high-confidence differentially expressed genes for targeted qPCR validation; for downstream systems-level analysis, combined results from DESeq2 and edgeR are useful.
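    The two figures of merit used throughout that comparison are easy to pin down: PPV is the fraction of called genes that are truly differentially expressed, and sensitivity is the fraction of truly differentially expressed genes that get called. A toy sketch against a known truth set (the gene names are made up):

        def ppv(called, truth):
            """Fraction of called DE genes that are truly DE."""
            called, truth = set(called), set(truth)
            return len(called & truth) / len(called) if called else float("nan")

        def sensitivity(called, truth):
            """Fraction of truly DE genes that were called."""
            called, truth = set(called), set(truth)
            return len(called & truth) / len(truth) if truth else float("nan")

        truth = {"geneA", "geneB", "geneC", "geneD"}
        called = {"geneA", "geneB", "geneX"}  # output of one hypothetical caller
        print(ppv(called, truth), sensitivity(called, truth))  # 0.667 and 0.5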
241. Measuring physical activity in preschoolers: Reliability and validity of The System for Observing Fitness Instruction Time for Preschoolers (SOFIT-P)

    PubMed Central

    Sharma, Shreela; Chuang, Ru-Jye; Skala, Katherine; Atteberry, Heather

    2012-01-01

    The purpose of this study is to describe the initial feasibility, reliability, and validity of an instrument to measure physical activity in preschoolers using direct observation. The System for Observing Fitness Instruction Time for Preschoolers was developed and tested among 3- to 6-year-old children in fall 2008 for feasibility and reliability (Phase I, n=67) and in fall 2009 for concurrent validity (Phase II, n=27). Phase I showed that preschoolers spent >75% of their active time at preschool in light physical activity. The mean inter-observer agreement scores were ≥.75 for physical activity level and type. Correlation coefficients measuring construct validity between the lesson context and physical activity types and the activity levels were moderately strong. Phase II showed moderately strong correlations, ranging from .50 to .54, between the System for Observing Fitness Instruction Time for Preschoolers and Actigraph accelerometers for physical activity levels. The System for Observing Fitness Instruction Time for Preschoolers shows promising initial results as a new method for measuring physical activity among preschoolers. PMID:22485071
242. Convergence of Health Level Seven Version 2 Messages to Semantic Web Technologies for Software-Intensive Systems in Telemedicine Trauma Care

    PubMed Central

    Cook, Timothy Wayne; Cavalini, Luciana Tricai

    2016-01-01

    Objectives: To present the technical background and the development of a procedure that enriches the semantics of Health Level Seven version 2 (HL7v2) messages for software-intensive systems in telemedicine trauma care. Methods: This study followed a multilevel model-driven approach for the development of semantically interoperable health information systems. The Pre-Hospital Trauma Life Support (PHTLS) ABCDE protocol was adopted as the use case. A prototype application embedded the semantics into an HL7v2 message as an eXtensible Markup Language (XML) file, which was validated against an XML schema that defines constraints on a common reference model. This message was exchanged with a second prototype application, developed on the Mirth middleware, which was also used to parse and validate both the original and the hybrid messages. Results: Both versions of the data instance (one pure XML, one embedded in the HL7v2 message) were equally validated, and the RDF-based semantics were recovered by the receiving side of the prototype from the shared XML schema. Conclusions: This study demonstrated the semantic enrichment of HL7v2 messages for software-intensive telemedicine systems for trauma care, by validating components of extracts generated in various computing environments. The adoption of the method proposed in this study ensures the compliance of the HL7v2 standard with Semantic Web technologies. PMID:26893947
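    The validation step described here, checking an XML extract against a schema derived from the common reference model, can be illustrated in a few lines of lxml. A sketch under assumed file names (both names are hypothetical, standing in for the study's schema and extract):

        from lxml import etree

        # Hypothetical files: the schema derived from the common reference
        # model, and the XML extract carried inside the HL7v2 message.
        schema = etree.XMLSchema(etree.parse("reference_model.xsd"))
        extract = etree.parse("phtls_abcde_extract.xml")

        if schema.validate(extract):
            print("extract conforms to the reference model")
        else:
            for error in schema.error_log:
                print(error.line, error.message)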
243. Design, validation, and use of an evaluation instrument for monitoring systemic reform

    NASA Astrophysics Data System (ADS)

    Scantlebury, Kathryn; Boone, William; Butler Kahle, Jane; Fraser, Barry J.

    2001-08-01

    Over the past decade, state and national policymakers have promoted systemic reform as a way to achieve high-quality science education for all students. However, few instruments are available to measure changes in key dimensions relevant to systemic reform, such as teaching practices, student attitudes, or home and peer support. Furthermore, Rasch methods of analysis are needed to permit valid comparison of different cohorts of students during different years of a reform effort. This article describes the design, development, validation, and use of an instrument that measures student attitudes and several environment dimensions (standards-based teaching, home support, and peer support) using a three-step process that incorporated expert opinion, factor analysis, and item response theory. The instrument was validated with over 8,000 science and mathematics students, taught by more than 1,000 teachers in over 200 schools, as part of a comprehensive assessment of the effectiveness of Ohio's systemic reform initiative. When the new four-factor, 20-item questionnaire was used to explore the relative influence of the class, home, and peer environment on student achievement and attitudes, findings were remarkably consistent across 3 years and different units and methods of analysis. All three environments accounted for unique variance in student attitudes, but only the environment of the class accounted for unique variance in student achievement. However, the class environment (standards-based teaching practices) was the strongest independent predictor of both achievement and attitude, and appreciable amounts of the total variance in attitudes were common to the three environments.

244. A Method for Suppressing Line Overload Phenomena Using NAS Battery Systems

    NASA Astrophysics Data System (ADS)

    Ohtaka, Toshiya; Iwamoto, Shinichi

    In this paper, we focus on the superior operating control function and instantaneous discharging characteristics of NAS battery systems, and propose a method for determining installation planning and operating control schemes of NAS battery systems for suppressing line overload phenomena. In the planning stage, a target contingency is identified, and an optimal allocation and capacity of NAS battery systems and an amount of generation changes are determined for the contingency. In the operation stage, the control strategy of the NAS battery system is determined. Simulations are carried out to verify the validity of the proposed method using the IEEJ 1-machine V system model and an example 2-machine 16-bus system model.
245. Comparison of validation methods for forming simulations

    NASA Astrophysics Data System (ADS)

    Schug, Alexander; Kapphan, Gabriel; Bardl, Georg; Hinterhölzl, Roland; Drechsler, Klaus

    2018-05-01

    The forming simulation of fibre-reinforced thermoplastics could reduce development time and improve forming results. But to take advantage of the full potential of the simulations, it has to be ensured that the predictions of material behaviour are correct. For that reason, a thorough validation of the material model has to be conducted after characterising the material. Relevant aspects for the validation of the simulation are, for example, the outer contour, the occurrence of defects, and the fibre paths. Various methods are available to measure these features. Most relevant, and also most difficult to measure, are the emerging fibre orientations, so the focus of this study was on measuring that feature. The aim was to give an overview of the properties of different measuring systems and to select the most promising systems for a comparison survey. Selected were an optical system, an eddy-current system, and a computer-assisted tomography system, with the focus on measuring the fibre orientations. Different formed 3D parts made of unidirectional glass-fibre- and carbon-fibre-reinforced thermoplastics were measured. Advantages and disadvantages of the tested systems were revealed. Optical measurement systems are easy to use but are limited to the surface plies. With an eddy-current system, lower plies can also be measured, but it is only suitable for carbon fibres. Using a computer-assisted tomography system, all plies can be measured, but the system is limited to small parts and challenging to evaluate.

246. Description and Evaluation of the MBTA Magnetic Card Fare Collection System

    DOT National Transportation Integrated Search

    1982-07-01

    Observation over the years by the Massachusetts Bay Transportation Authority (MBTA) indicated that a significant number of passengers entered the system without using a valid pass. Fraudulent entry was gained using various methods of deceit, the most...

247. Validation of the criteria for initiating the cleaning of heating, ventilation, and air-conditioning (HVAC) ductwork under real conditions.

    PubMed

    Lavoie, Jacques; Marchand, Geneviève; Cloutier, Yves; Lavoué, Jérôme

    2011-08-01

    Dust accumulation in the components of heating, ventilation, and air-conditioning (HVAC) systems is a potential source of contaminants. To date, very little information is available on recognized methods for assessing dust buildup in these systems. The few existing methods are either objective in nature, involving numerical values, or subjective in nature, based on experts' judgments. An earlier project aimed at assessing different methods of sampling dust in ducts was carried out in the laboratories of the Institut de recherche Robert-Sauvé en santé et en sécurité du travail (IRSST). This laboratory study showed that all the sampling methods were practicable, provided that a specific surface-dust cleaning initiation criterion was used for each method. However, these conclusions were reached on the basis of ideal conditions in a laboratory using a reference dust. The objective of the present study was to validate these laboratory results in the field. To this end, the laboratory sampling templates were replicated in real ducts, and the three sampling methods (the IRSST method, the method of the U.S. organization National Air Duct Cleaners Association [NADCA], and that of the French organization Association pour la Prévention et l'Étude de la Contamination [ASPEC]) were used simultaneously in a statistically representative number of systems. The air return and supply ducts were also compared. Cleaning initiation criteria under real conditions were found to be 6.0 mg/100 cm(2) using the IRSST method, 2.0 mg/100 cm(2) using the NADCA method, and 23 mg/100 cm(2) using the ASPEC method. In the laboratory study, the criteria for the same methods were 6.0 for the IRSST method, 2.0 for the NADCA method, and 3.0 for the ASPEC method. The laboratory criteria for the IRSST and NADCA methods were therefore validated in the field; the ASPEC criterion was the only one to change. The ASPEC method therefore allows for the most accurate evaluation of dust accumulation in HVAC ductwork, and we recommend using it to objectively assess dust accumulation levels in HVAC ductwork.
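    Because each sampling method carries its own cleaning-initiation criterion, a measured dust load only means something relative to the method that produced it. A trivial sketch of the method-matched comparison using the field criteria above (the helper function is an illustration, not part of any published protocol):

        # Field cleaning-initiation criteria from the study above, mg/100 cm^2.
        CRITERIA = {"IRSST": 6.0, "NADCA": 2.0, "ASPEC": 23.0}

        def needs_cleaning(method: str, dust_load: float) -> bool:
            """True if a surface-dust measurement (mg/100 cm^2) meets or
            exceeds the criterion for the sampling method that produced it."""
            return dust_load >= CRITERIA[method]

        print(needs_cleaning("NADCA", 2.5))  # True
        print(needs_cleaning("ASPEC", 2.5))  # False: same load, different method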
248. Experiences Using Lightweight Formal Methods for Requirements Modeling

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David

    1997-01-01

    This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, formal methods enhanced the existing verification and validation processes by testing key properties of the evolving requirements and helping to identify weaknesses. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.

249. Development of Servo Motor Trainer for Basic Control System in Laboratory of Electrical Engineering Control System Faculty of Engineering Universitas Negeri Surabaya

    NASA Astrophysics Data System (ADS)

    Endryansyah; Wanarti Rusimamto, Puput; Ridianto, Adam; Sugiarto, Hariyadi

    2018-04-01

    The Department of Electrical Engineering at FT Unesa offers three study programs: S1 Electrical Engineering Education, S1 Electrical Engineering, and D3 Electrical Engineering. The Basic Control System course appears in the curriculum of all three programs. The course's lecturer team pursued a learning innovation focused on the development of a trainer for student practicums in the control systems laboratory. The trainer developed is a servo motor, accompanied by a lab module that contains a wide variety of theory about servo motors and a practicum guide. This is development research using the Research & Development (R&D) method, applied in the following steps: identify the potential and existing problems, gather information and study the literature, design the product, validate the design, revise the design, and conduct a limited trial. The validation results were as follows: the learning-device validation score was 3.64, the servo motor lab module validation score was 3.47, and the student questionnaire response score was 3.73. All of these values lie in the interval from above 3.25 to 4, the "Very Valid" category, so it can be concluded that all instruments have a "Very Valid" level of validity and are suitable for use in further learning.
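    The scoring rule in this record maps a mean expert rating on a 1-4 scale onto a validity category, with scores above 3.25 counting as "Very Valid". A sketch of that mapping; only the top band is quoted in the abstract, so the lower cut-offs are my assumption of a typical four-band scheme, not taken from the paper:

        def validity_category(score: float) -> str:
            """Map a mean 1-4 validation rating onto a validity category.

            Only the top band (>3.25) is quoted in the abstract; the lower
            cut-offs are assumed for illustration."""
            if score > 3.25:
                return "Very Valid"
            if score > 2.50:
                return "Valid"
            if score > 1.75:
                return "Less Valid"
            return "Not Valid"

        for name, score in [("learning device", 3.64),
                            ("servo motor lab module", 3.47),
                            ("student questionnaire", 3.73)]:
            print(name, score, validity_category(score))  # all "Very Valid"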
250. Law's Dilemma: Validating Complementary and Alternative Medicine and the Clash of Evidential Paradigms

    PubMed Central

    Iyioha, Ireh

    2011-01-01

    This paper examines the (in)compatibility between the diagnostic and therapeutic theories of complementary and alternative medicine (CAM) and a science-based regulatory framework. Specifically, the paper investigates the nexus between statutory legitimacy and scientific validation of health systems, with an examination of its impact on the development of complementary and alternative therapies. The paper evaluates competing theories for validating CAM, ranging from the RCT methodology to anthropological perspectives, and contends that while the RCT method might be beneficial in the regulation of many CAM therapies, dogmatic adherence to this paradigm as the exclusive method for legitimizing CAM will be adverse to the independent development of many CAM therapies whose philosophies and mechanisms of action are not scientifically interpretable. Drawing on history and research evidence to support this argument, the paper sues for a regulatory model that is accommodative of different evidential paradigms, in support of a pluralistic healthcare system that balances the imperative of quality assurance with the need to ensure access. PMID:20953428

251. Benchmark tests for a Formula SAE Student car prototyping

    NASA Astrophysics Data System (ADS)

    Mariasiu, Florin

    2011-12-01

    The aerodynamic characteristics of a vehicle are important elements in its design and construction. A low drag coefficient brings significant fuel savings and increased engine power efficiency. In designing and developing vehicles through computer simulation, dedicated CFD (Computational Fluid Dynamics) software packages are used to determine the vehicle's aerodynamic characteristics. However, the results obtained by this faster and cheaper method are validated by wind tunnel experiments, which are expensive and rely on complex testing equipment at relatively high cost. Therefore, the emergence and development of new low-cost testing methods to validate CFD simulation results would bring great economic benefits to the vehicle prototyping process. This paper presents the initial development process of a Formula SAE Student race-car prototype using CFD simulation, and also presents a measurement system, based on low-cost sensors, through which the CFD simulation results were experimentally validated. The CFD software package used for simulation was SolidWorks with the FloXpress add-on, and the experimental measurement system was built using four piezoresistive FlexiForce force sensors.
252. USING CFD TO ANALYZE NUCLEAR SYSTEMS BEHAVIOR: DEFINING THE VALIDATION REQUIREMENTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richard Schultz

    2012-09-01

    A recommended protocol to formulate numeric tool specifications and validation needs, in concert with practices accepted by regulatory agencies for advanced reactors, is described. The protocol is based on the plant type and perceived transient and accident envelopes, which translate to boundary conditions for a process that gives: (a) the key phenomena and figures of merit which must be analyzed to ensure that the advanced plant can be licensed, (b) the specification of the numeric tool capabilities necessary to perform the required analyses, including bounding calculational uncertainties, and (c) the specification of the validation matrices and experiments, including the desired validation data. Applying the process enables a complete program, including costs, to be defined for creating and benchmarking transient and accident analysis methods for advanced reactors. By following a process that is in concert with regulatory agency licensing requirements from start to finish, based on historical acceptance of past licensing submittals, the methods derived and validated have a high probability of regulatory agency acceptance.

253. Validation of a commercial inertial sensor system for spatiotemporal gait measurements in children.

    PubMed

    Lanovaz, Joel L; Oates, Alison R; Treen, Tanner T; Unger, Janelle; Musselman, Kristin E

    2017-01-01

    Although inertial sensor systems are becoming a popular tool for gait analysis in both healthy and pathological adult populations, there are currently no data on the validity of these systems for use with children. The purpose of this study was to validate spatiotemporal data from a commercial inertial sensor system (MobilityLab) in typically developing children. Data from 10 children (5 males; 3.0-8.3 years, mean = 5.1) were collected simultaneously with MobilityLab and 3D motion capture during gait at self-selected and fast walking speeds. Spatiotemporal parameters were compared between the two methods using a Bland-Altman analysis. The results indicate that, while the temporal gait measurements were similar between the two systems, MobilityLab demonstrated a consistent bias in the spatial data (stride length). This error is likely due to differences in relative leg length and gait characteristics in children compared with the MobilityLab adult reference population used to develop the stride length algorithm. A regression-based equation was developed from the current data to correct the MobilityLab stride length output. The correction was based on leg length, stride time, and shank range of motion, each of which was independently associated with stride length. Once the correction was applied, all of the spatiotemporal parameters evaluated showed good agreement. The results of this study indicate that MobilityLab is a valid tool for gait analysis in typically developing children. Further research is needed to determine the efficacy of this system for use in children with pathologies that affect gait mechanics. Copyright © 2016 Elsevier B.V. All rights reserved.
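    Bland-Altman analysis, used above to compare the inertial system against motion capture, reduces to the mean and spread of the paired differences: the mean is the systematic bias, and bias plus or minus 1.96 SD gives the limits of agreement. A minimal sketch with invented stride-length pairs (the numbers are illustrative, not the study's data):

        import numpy as np

        def bland_altman(a, b):
            """Return (bias, lower LoA, upper LoA) for two paired measurement
            sets, with limits at bias +/- 1.96 SD of the differences."""
            diff = np.asarray(a, float) - np.asarray(b, float)
            bias = diff.mean()
            sd = diff.std(ddof=1)
            return bias, bias - 1.96 * sd, bias + 1.96 * sd

        # Invented stride lengths (m): inertial system vs. 3D motion capture.
        imu   = [0.91, 0.84, 0.78, 1.02, 0.95]
        mocap = [0.99, 0.91, 0.86, 1.10, 1.05]
        print(bland_altman(imu, mocap))  # a consistent negative bias, as reported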
254. Dimension from covariance matrices.

    PubMed

    Carroll, T L; Byers, J M

    2017-02-01

    We describe a method to estimate embedding dimension from a time series. This method includes an estimate of the probability that the dimension estimate is valid. Such validity estimates are not common in algorithms for calculating the properties of dynamical systems. The algorithm described here compares the eigenvalues of covariance matrices created from an embedded signal to the eigenvalues for a covariance matrix of a Gaussian random process with the same dimension and number of points. A statistical test gives the probability that the eigenvalues for the embedded signal did not come from the Gaussian random process.
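    In the spirit of that algorithm, the sketch below delay-embeds a signal, takes the eigenvalues of the covariance of the embedded vectors, and counts how many exceed the largest values seen in same-size Gaussian surrogates; the authors' method goes further and attaches a formal probability to the estimate. All parameter choices here are illustrative:

        import numpy as np

        def embed(x, m, tau):
            """Delay-embed a scalar series into m dimensions with lag tau."""
            n = len(x) - (m - 1) * tau
            return np.column_stack([x[i * tau : i * tau + n] for i in range(m)])

        def dimension_estimate(x, m=10, tau=20, trials=50, seed=0):
            """Count covariance eigenvalues above a Gaussian-noise benchmark."""
            rng = np.random.default_rng(seed)
            x = (x - x.mean()) / x.std()
            ev = np.sort(np.linalg.eigvalsh(np.cov(embed(x, m, tau).T)))[::-1]
            surrogate = np.array([
                np.sort(np.linalg.eigvalsh(
                    np.cov(embed(rng.standard_normal(len(x)), m, tau).T)))[::-1]
                for _ in range(trials)])
            # Rank-by-rank comparison against the largest surrogate eigenvalues.
            return int(np.sum(ev > surrogate.max(axis=0)))

        t = np.arange(5000) * 0.05
        x = np.sin(t) + np.sin(0.7 * t)  # two incommensurate tones: expect ~4
        print(dimension_estimate(x))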
255. A Comprehensive, Multi-modal Evaluation of the Assessment System of an Undergraduate Research Methodology Course: Translating Theory into Practice.

    PubMed

    Mohammad Abdulghani, Hamza; G Ponnamperuma, Gominda; Ahmad, Farah; Amin, Zubair

    2014-03-01

    To evaluate the assessment system of the 'Research Methodology Course' using utility criteria (i.e., validity, reliability, acceptability, educational impact, and cost-effectiveness). This study demonstrates a comprehensive evaluation of an assessment system and suggests a framework for similar courses. Qualitative and quantitative methods were used to evaluate the course assessment components (50 MCQs, 3 short answer questions (SAQs), and a research project) against the utility criteria. Results of the multiple evaluation methods for all the assessment components were collected and interpreted together to arrive at holistic judgments, rather than judgments based on individual methods or individual assessments. Face validity, evaluated using a self-administered questionnaire (response rate 88.7%), disclosed that the students perceived an imbalance in the content covered by the assessment; this was confirmed by the assessment blueprint. Construct validity was affected by the low correlation between MCQ and SAQ scores (r=0.326). There was a higher correlation between the project and MCQ (r=0.466) and SAQ (r=0.463) scores. Construct validity was also affected by the presence of recall-type MCQs (70%; 35/50), item construction flaws, and non-functioning distractors. High discrimination indices (>0.35) were found in MCQs with moderate difficulty indices (0.3-0.7). Reliability of the MCQs was 0.75, which could be improved to 0.8 by increasing the number of MCQs to at least 70. A positive educational impact was found in the form of the research project assessment driving students to present and publish their work in conferences and peer-reviewed journals. The cost per student to complete the course was US$164.50. The multi-modal evaluation of an assessment system is feasible and provides thorough, diagnostic information. The utility of the assessment system could be further improved by modifying the psychometrically inappropriate assessment items.
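    The projected gain from lengthening the MCQ paper (reliability 0.75 with 50 items, at least 0.8 with 70) is what the standard Spearman-Brown prophecy formula predicts, assuming the added items behave like the existing ones. A one-line check:

        def spearman_brown(r, k):
            """Predicted reliability when a test is lengthened by a factor k."""
            return k * r / (1 + (k - 1) * r)

        # 50 MCQs at reliability 0.75, lengthened to 70 items (k = 70/50 = 1.4):
        print(round(spearman_brown(0.75, 70 / 50), 3))  # 0.808, i.e. at least 0.8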
256. Explicit Analytical Solution of a Pendulum with Periodically Varying Length

    ERIC Educational Resources Information Center

    Yang, Tianzhi; Fang, Bo; Li, Song; Huang, Wenhu

    2010-01-01

    A pendulum with periodically varying length is an interesting physical system. It has been studied by some researchers using traditional perturbation methods (for example, the averaging method). But due to the limitation of the conventional perturbation methods, the solutions are not valid for long-term prediction of the pendulum. In this paper,…

257. An analytical method for designing low noise helicopter transmissions

    NASA Technical Reports Server (NTRS)

    Bossler, R. B., Jr.; Bowes, M. A.; Royal, A. C.

    1978-01-01

    The development and experimental validation of a method for analytically modeling the noise mechanism in helicopter geared power transmission systems is described. This method can be used within the design process to predict interior noise levels and to investigate the noise-reducing potential of alternative transmission design details. Examples are discussed.

258. Analysis of various quality attributes of sunflower and soybean plants by near infra-red reflectance spectroscopy: Development and validation calibration models

    USDA-ARS Scientific Manuscript database

    Sunflower and soybean are summer annuals that can be grown as an alternative to corn and may be particularly useful in organic production systems. Rapid and low-cost methods of analyzing plant quality would be helpful for crop management. We developed and validated calibration models for Near-infrar...
259. Development and validation of a nursing professionalism evaluation model in a career ladder system.

    PubMed

    Kim, Yeon Hee; Jung, Young Sun; Min, Ja; Song, Eun Young; Ok, Jung Hui; Lim, Changwon; Kim, Kyunghee; Kim, Ji-Su

    2017-01-01

    The clinical ladder system categorizes degrees of nursing professionalism and rewards, and is an important human resource tool for nursing management. We developed a model to evaluate nursing professionalism, which determines the clinical ladder system levels, and verified its validity. Data were collected using a clinical competence tool developed in this study, together with existing methods such as a nursing professionalism evaluation tool, peer reviews, and face-to-face interviews, to evaluate promotions and verify the presented content in a medical institution. Reliability and convergent and discriminant validity of the clinical competence evaluation tool were verified using SmartPLS software. The validity of the model for evaluating overall nursing professionalism was also analyzed. Clinical competence was determined by five dimensions of nursing practice: scientific, technical, ethical, aesthetic, and existential. The structural model explained 66% of the variance. Clinical competence scales, peer reviews, and face-to-face interviews directly determined nursing professionalism levels. The evaluation system can be used for evaluating nurses' professionalism in actual medical institutions from a nursing practice perspective. A conceptual framework for establishing a human resources management system for nurses and a tool for evaluating nursing professionalism at medical institutions are provided.

260. Hybrid particle-field molecular dynamics simulation for polyelectrolyte systems.

    PubMed

    Zhu, You-Liang; Lu, Zhong-Yuan; Milano, Giuseppe; Shi, An-Chang; Sun, Zhao-Yan

    2016-04-14

    To achieve simulations on large spatial and temporal scales with high molecular chemical specificity, a hybrid particle-field method was proposed recently. This method is developed by combining molecular dynamics and self-consistent field theory (MD-SCF). The MD-SCF method has been validated by successfully predicting the experimentally observable properties of several systems. Here we propose an efficient scheme for the inclusion of electrostatic interactions in the MD-SCF framework. In this scheme, charged molecules interact with external fields that are self-consistently determined from the charge densities. This method is validated by comparing the structural properties of polyelectrolytes in solution obtained from MD-SCF and particle-based simulations. Moreover, taking PMMA-b-PEO and LiCF3SO3 as examples, the enhancement of immiscibility between the ion-dissolving block and the inert block by doping lithium salts into the copolymer is examined using the MD-SCF method. By employing GPU acceleration, the high performance of the MD-SCF method with explicit treatment of electrostatics facilitates the simulation study of many problems involving polyelectrolytes.
href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Darvishi, M.; Ahmadi, G.</p> <p>2014-10-01</p> <p>One of the most interesting aspects of modelling and simulation study is to describe the real world phenomena that have specific properties; especially those that are in large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases it is impossible. Therefore, Miniaturization of world phenomena in the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understand the world. Agent-based modelling and simulation (ABMS) is a new modelling method comprising of multiple interacting agent. They have been used in the different areas; for instance, geographic information system (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc) for geospatial modelling is an indication of the growing interest of users to use of special capabilities of ABMS. Since ABMS is inherently similar to human cognition, therefore it could be built easily and applicable to wide range applications than a traditional simulation. But a key challenge about ABMS is difficulty in their validation and verification. Because of frequent emergence patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, attempt to find appropriate validation techniques for ABM seems to be necessary. In this paper, after reviewing on Principles and Concepts of ABM for and its applications, the validation techniques and challenges of ABM validation are discussed.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/27527103','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/27527103"><span>Development and Validation of an Inductively Coupled Plasma Mass Spectrometry (ICP-MS) Method for Quantitative Analysis of Platinum in Plasma, Urine, and Tissues.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Zhang, Ti; Cai, Shuang; Forrest, Wai Chee; Mohr, Eva; Yang, Qiuhong; Forrest, M Laird</p> <p>2016-09-01</p> <p>Cisplatin, a platinum chemotherapeutic, is one of the most commonly used chemotherapeutic agents for many solid tumors. In this work, we developed and validated an inductively coupled plasma mass spectrometry (ICP-MS) method for quantitative determination of platinum levels in rat urine, plasma, and tissue matrices including liver, brain, lungs, kidney, muscle, heart, spleen, bladder, and lymph nodes. The tissues were processed using a microwave accelerated reaction system (MARS) system prior to analysis on an Agilent 7500 ICP-MS. According to the Food and Drug Administration guidance for industry, bioanalytical validation parameters of the method, such as selectivity, accuracy, precision, recovery, and stability were evaluated in rat biological samples. Our data suggested that the method was selective for platinum without interferences caused by other presenting elements, and the lower limit of quantification was 0.5 ppb. 
262. Development and Validation of an Inductively Coupled Plasma Mass Spectrometry (ICP-MS) Method for Quantitative Analysis of Platinum in Plasma, Urine, and Tissues.

    PubMed

    Zhang, Ti; Cai, Shuang; Forrest, Wai Chee; Mohr, Eva; Yang, Qiuhong; Forrest, M Laird

    2016-09-01

    Cisplatin, a platinum chemotherapeutic, is one of the most commonly used chemotherapeutic agents for many solid tumors. In this work, we developed and validated an inductively coupled plasma mass spectrometry (ICP-MS) method for quantitative determination of platinum levels in rat urine, plasma, and tissue matrices including liver, brain, lungs, kidney, muscle, heart, spleen, bladder, and lymph nodes. The tissues were processed using a microwave accelerated reaction system (MARS) prior to analysis on an Agilent 7500 ICP-MS. In accordance with the Food and Drug Administration guidance for industry, bioanalytical validation parameters of the method, such as selectivity, accuracy, precision, recovery, and stability, were evaluated in rat biological samples. Our data suggested that the method was selective for platinum without interference from other elements present, and the lower limit of quantification was 0.5 ppb. The accuracy and precision of the method were within 15% variation, and the recoveries of platinum for all tissue matrices examined were determined to be 85-115% of the theoretical values. The stability of the platinum-containing solutions, including calibration standards, stock solutions, and processed samples in rat biological matrices, was investigated. Results indicated that the samples were stable after three cycles of freeze-thaw and for up to three months. © The Author(s) 2016.

263. Development and Validation of an Inductively Coupled Plasma Mass Spectrometry (ICP-MS) Method for Quantitative Analysis of Platinum in Plasma, Urine, and Tissues

    PubMed Central

    Zhang, Ti; Cai, Shuang; Forrest, Wai Chee; Mohr, Eva; Yang, Qiuhong; Forrest, M. Laird

    2016-01-01

    Cisplatin, a platinum chemotherapeutic, is one of the most commonly used chemotherapeutic agents for many solid tumors. In this work, we developed and validated an inductively coupled plasma mass spectrometry (ICP-MS) method for quantitative determination of platinum levels in rat urine, plasma, and tissue matrices including liver, brain, lungs, kidney, muscle, heart, spleen, bladder, and lymph nodes. The tissues were processed using a microwave accelerated reaction system (MARS) prior to analysis on an Agilent 7500 ICP-MS. In accordance with the Food and Drug Administration guidance for industry, bioanalytical validation parameters of the method, such as selectivity, accuracy, precision, recovery, and stability, were evaluated in rat biological samples. Our data suggested that the method was selective for platinum without interference from other elements present, and the lower limit of quantification was 0.5 ppb. The accuracy and precision of the method were within 15% variation, and the recoveries of platinum for all tissue matrices examined were determined to be 85-115% of the theoretical values. The stability of the platinum-containing solutions, including calibration standards, stock solutions, and processed samples in rat biological matrices, was investigated. Results indicated that the samples were stable after three cycles of freeze-thaw and for up to three months. PMID:27527103
264. A novel computer system for the evaluation of nasolabial morphology, symmetry and aesthetics after cleft lip and palate treatment. Part 1: General concept and validation.

    PubMed

    Pietruski, Piotr; Majak, Marcin; Debski, Tomasz; Antoszewski, Boguslaw

    2017-04-01

    The need for a widely accepted method suitable for multicentre quantitative evaluation of facial aesthetics after surgical treatment of cleft lip and palate (CLP) has been emphasized for years. The aim of this study was to validate a novel computer system, 'Analyse It Doc' (A.I.D.), as a tool for objective anthropometric analysis of the nasolabial region. An indirect anthropometric analysis of facial photographs was conducted with the A.I.D. system and with Adobe Photoshop/ImageJ software. Intra-rater and inter-rater reliability and the time required for the analysis were estimated separately for each method and compared. Analysis with the A.I.D. system was nearly 10-fold faster than with the reference evaluation method. The A.I.D. system provided strong inter-rater and intra-rater correlations for linear, angular, and area measurements of the nasolabial region, as well as significantly higher accuracy and reproducibility of angular measurements in the submental view. No statistically significant inter-method differences were found for the other measurements. The novel computer system presented here is suitable for simple, time-efficient, and reliable multicentre photogrammetric analyses of the nasolabial region in CLP patients and healthy subjects. Copyright © 2017 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.
265. Validation of Gujarati Version of ABILOCO-Kids Questionnaire

    PubMed Central

    Diwan, Jasmin; Patel, Pankaj; Bansal, Ankita B.

    2015-01-01

    Background: ABILOCO-Kids is a measure of locomotion ability for children with cerebral palsy (CP) aged 6 to 15 years and is available in English and French. Aim: To validate the Gujarati version of the ABILOCO-Kids questionnaire for use in clinical research on a Gujarati population. Materials and Methods: The ABILOCO-Kids questionnaire was translated from English into Gujarati using the forward-backward-forward method. To ensure face and content validity of the Gujarati version using the group consensus method, each item was examined by a group of experts with a mean of 24.62 years of experience in the field of paediatrics and paediatric physiotherapy. Each item was analysed for content, meaning, wording, format, ease of administration, and scoring, and was scored by the expert group as accepted, rejected, or accepted with modification. The procedure was continued until 80% consensus was reached for all items. Concurrent validity was examined on 55 children with cerebral palsy (6-15 years) of all Gross Motor Function Classification System (GMFCS) levels and all clinical types by correlating the ABILOCO-Kids score with the Gross Motor Function Measure (GMFM) and GMFCS. Result: In phase 1 of validation, 16 items were accepted as is, 22 items were accepted with modification, and 3 items went to phase 2 validation. For concurrent validity, a highly significant positive correlation was found between the ABILOCO-Kids score and total GMFM (r=0.713, p<0.005), and a highly significant negative correlation with GMFCS (r=-0.778, p<0.005). Conclusion: The Gujarati translated version of the ABILOCO-Kids questionnaire has good face and content validity as well as concurrent validity, and can be used to measure caregiver-reported locomotion ability in children with CP. PMID:26557603

266. Quantitative determination and sampling of azathioprine residues for cleaning validation in production area.

    PubMed

    Fazio, Tatiana Tatit; Singh, Anil Kumar; Kedor-Hackmann, Erika Rosa Maria; Santoro, Maria Inês Rocha Miritello

    2007-03-12

    Cleaning validation is an integral part of current good manufacturing practices in any pharmaceutical industry. Nowadays, azathioprine and several other pharmacologically potent pharmaceuticals are manufactured in the same production area. A carefully designed cleaning validation and its evaluation can ensure that residues of azathioprine will not carry over and cross-contaminate the subsequent product. The aim of this study was to validate a simple analytical method for verification of residual azathioprine on equipment used in the production area and to confirm the efficiency of the cleaning procedure. The HPLC method was validated on an LC system using a Nova-Pak C18 column (3.9 mm x 150 mm, 4 microm) and methanol-water-acetic acid (20:80:1, v/v/v) as mobile phase at a flow rate of 1.0 mL min(-1). UV detection was made at 280 nm. The calibration curve was linear over a concentration range from 2.0 to 22.0 microg mL(-1) with a correlation coefficient of 0.9998. The detection limit (DL) and quantitation limit (QL) were 0.09 and 0.29 microg mL(-1), respectively. The intra-day and inter-day precision, expressed as relative standard deviation (R.S.D.), were below 2.0%. The mean recovery of the method was 99.19%. The mean extraction recovery from manufacturing equipment was 83.5%. The developed UV spectrophotometric method could only be used as a limit method to qualify or reject the cleaning procedure in the production area. Nevertheless, the simplicity of the spectrophotometric method makes it useful for routine analysis of azathioprine residues on cleaned surfaces and as an alternative to the proposed HPLC method.
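    Detection and quantitation limits of the kind quoted above are commonly estimated from the calibration curve as DL = 3.3σ/S and QL = 10σ/S, where S is the slope and σ the residual standard deviation of the fit; the paper's 0.09/0.29 pair matches that 3.3:10 proportion. A sketch with invented calibration points spanning the paper's 2.0-22.0 microg/mL range (the peak areas are made up):

        import numpy as np

        def calibration_limits(conc, response):
            """ICH-style DL = 3.3*sigma/S and QL = 10*sigma/S from a linear fit."""
            conc, response = np.asarray(conc, float), np.asarray(response, float)
            slope, intercept = np.polyfit(conc, response, 1)
            residuals = response - (slope * conc + intercept)
            sigma = residuals.std(ddof=2)  # two fitted parameters
            return 3.3 * sigma / slope, 10 * sigma / slope

        conc = [2.0, 6.0, 10.0, 14.0, 18.0, 22.0]         # microg/mL standards
        area = [41.2, 122.8, 204.9, 287.1, 368.2, 450.6]  # invented peak areas
        dl, ql = calibration_limits(conc, area)
        print(f"DL = {dl:.2f}, QL = {ql:.2f} microg/mL")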
267. Inter-Disciplinary Collaboration in Support of the Post-Standby TREAT Mission

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeHart, Mark; Baker, Benjamin; Ortensi, Javier

    Although analysis methods have advanced significantly in the last two decades, high-fidelity multi-physics methods for reactor systems have been under development for only a few years and are not presently mature nor deployed. Furthermore, very few methods provide the ability to simulate rapid transients in three dimensions. Data for validation of advanced time-dependent multi-physics are sparse; at TREAT, historical data were not collected for the purpose of validating three-dimensional methods, let alone multi-physics simulations. Existing data continue to be collected to attempt to simulate the behavior of experiments and calibration transients, but they will be insufficient for the complete validation of analysis methods used for TREAT transient simulations. Hence, a 2018 restart will most likely occur without the direct application of advanced modeling and simulation methods. At present, the INL modeling and simulation team plans to work with TREAT operations staff in performing reactor simulations with MAMMOTH, in parallel with the software packages currently being used in preparation for core restart (e.g., MCNP5, RELAP5, ABAQUS). The TREAT team has also requested specific measurements to be performed during startup testing, currently scheduled to run from February to August of 2018. These startup measurements will be crucial in validating the new analysis methods in preparation for ultimate application to TREAT operations and experiment design. This document describes the collaboration between modeling and simulation staff and the restart, operations, instrumentation and experiment development teams needed to interact effectively and achieve successful validation work during restart testing.

268. Computational design and experimental validation of new thermal barrier systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Shengmin

    2015-03-31

    The focus of this project is on the development of a reliable and efficient ab initio based computational high-temperature material design method which can be used to assist thermal barrier coating (TBC) bond-coat and top-coat design. Experimental evaluations of the new TBCs are conducted to confirm their properties. Southern University is the subcontractor on this project, with a focus on development of the computational simulation method. We have performed ab initio density functional theory (DFT) and molecular dynamics simulations to screen top coats and bond coats for gas turbine thermal barrier coating design and validation applications. For experimental validation, our focus is on the hot corrosion performance of different TBC systems. For example, for one of the top coatings studied, we examined the thermal stability of TaZr2.75O8 and confirmed its hot corrosion performance.

269. Formal Methods for Automated Diagnosis of Autosub 6000

    NASA Technical Reports Server (NTRS)

    Ernits, Juhan; Dearden, Richard; Pebody, Miles

    2009-01-01

    This is a progress report on applying formal methods in the context of building an automated diagnosis and recovery system for Autosub 6000, an Autonomous Underwater Vehicle (AUV). The diagnosis task involves building abstract models of the control system of the AUV. The diagnosis engine is based on Livingstone 2, a model-based diagnoser originally built for aerospace applications. Large parts of the diagnosis model can be built without concrete knowledge about each mission, but the actual mission scripts and configuration parameters that carry important information for diagnosis change for every mission. Thus we use formal methods to generate the mission control part of the diagnosis model automatically from the mission script, and perform a number of invariant checks to validate the configuration. After the diagnosis model is augmented with the generated mission control component model, it needs to be validated using verification techniques.
270. Modelling a hydropower plant with reservoir with the micropower optimisation model (HOMER)

    NASA Astrophysics Data System (ADS)

    Canales, Fausto A.; Beluco, Alexandre; Mendes, Carlos André B.

    2017-08-01

    Hydropower with water accumulation is an interesting option to consider in hybrid systems, because it helps deal with the intermittent character of renewable energy resources. The software HOMER (version Legacy) is extensively used in research on these systems, but it does not include a specific option for modelling hydro with a reservoir. This paper describes a method for modelling a hydropower plant with reservoir in HOMER by adapting an existing procedure used for modelling pumped storage. An example with two scenarios in southern Brazil is presented to illustrate and validate the method. The results validate the method by showing a direct correspondence between an equivalent battery and the reservoir. The refill of the reservoir, its power output as a function of the flow rate, and the installed hydropower capacity are effectively simulated, indicating that an adequate representation of a hydropower plant with reservoir is possible with HOMER.
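    The equivalent-battery correspondence rests on two textbook relations: turbine output P = eta * rho * g * Q * H, and the potential energy recoverable from a stored volume of water. A minimal sketch of both (generic formulas, not HOMER's internal model; efficiency and the example numbers are illustrative):

        RHO = 1000.0  # water density, kg/m^3
        G = 9.81      # gravitational acceleration, m/s^2

        def hydro_power_kw(flow_m3s, head_m, efficiency=0.85):
            """Turbine output P = eta * rho * g * Q * H, in kW."""
            return efficiency * RHO * G * flow_m3s * head_m / 1e3

        def reservoir_energy_kwh(volume_m3, head_m, efficiency=0.85):
            """Recoverable energy in a stored volume: the capacity an
            'equivalent battery' must hold when the reservoir is
            modelled as storage."""
            return efficiency * RHO * G * volume_m3 * head_m / 3.6e6

        # Example: 2 m^3/s through 25 m of head; a 100,000 m^3 reservoir.
        print(hydro_power_kw(2.0, 25.0))        # ~417 kW
        print(reservoir_energy_kwh(1e5, 25.0))  # ~5,790 kWh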
271. A phase one AR/C system design

    NASA Technical Reports Server (NTRS)

    Kachmar, Peter M.; Polutchko, Robert J.; Matusky, Martin; Chu, William; Jackson, William; Montez, Moises

    1991-01-01

    The Phase One AR&C System Design integrates an evolutionary design based on the legacy of previous mission successes, flight-tested components from manned Rendezvous and Proximity Operations (RPO) space programs, and additional AR&C components validated using proven methods. The Phase One system has a modular, open architecture with the standardized interfaces proposed for the Space Station Freedom system architecture.

272. Hybrid Chaos Synchronization of Four-Scroll Systems via Active Control

    NASA Astrophysics Data System (ADS)

    Karthikeyan, Rajagopal; Sundarapandian, Vaidyanathan

    2014-03-01

    This paper investigates the hybrid chaos synchronization of identical Wang four-scroll systems (Wang, 2009), identical Liu-Chen four-scroll systems (Liu and Chen, 2004) and non-identical Wang and Liu-Chen four-scroll systems. The active control method is adopted to achieve hybrid chaos synchronization of the four-scroll chaotic systems addressed in this paper, and our synchronization results are established using Lyapunov stability theory. Since Lyapunov exponents are not required for these calculations, the active control method is effective and convenient for hybrid synchronization of identical and different Wang and Liu-Chen four-scroll chaotic systems. Numerical simulations are also shown to illustrate and validate the hybrid synchronization results derived in this paper.
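    In hybrid synchronization, part of the slave state tracks the master (synchronization) while the rest tracks its negation (anti-synchronization). With active control, the controller cancels the nonlinear terms and imposes linear, Lyapunov-stable error dynamics. A minimal sketch using the classic Lorenz system as a stand-in for the four-scroll dynamics (the Wang and Liu-Chen equations are not reproduced in the record; the sign pattern S and gain K are illustrative choices):

        import numpy as np
        from scipy.integrate import solve_ivp

        def f(v, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
            """Lorenz vector field, standing in for a four-scroll system."""
            x, y, z = v
            return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

        S = np.diag([1.0, -1.0, 1.0])  # hybrid target: sync states 1 and 3, anti-sync state 2
        K = 5.0                        # linear feedback gain

        def coupled(t, w):
            m, s = w[:3], w[3:]              # master and slave states
            e = s - S @ m                    # hybrid synchronization error
            u = -(f(s) - S @ f(m)) - K * e   # active control law -> de/dt = -K e
            return np.concatenate([f(m), f(s) + u])

        sol = solve_ivp(coupled, (0.0, 10.0), [1, 2, 3, -4, 5, -6], rtol=1e-8)
        e_final = sol.y[3:, -1] - S @ sol.y[:3, -1]
        print(np.abs(e_final).max())  # ~0; V = e.e/2 gives dV/dt = -2*K*V < 0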
273. Design of a Multi-Sensor Cooperation Travel Environment Perception System for Autonomous Vehicle

    PubMed Central

    Chen, Long; Li, Qingquan; Li, Ming; Zhang, Liang; Mao, Qingzhou

    2012-01-01

    This paper describes the environment perception system designed for the intelligent vehicle SmartV-II, which won the 2010 Future Challenge. The system utilizes the cooperation of multiple lasers and cameras to realize several functions necessary for autonomous navigation: road curb detection, lane detection and traffic sign recognition. Multiple single-scan lasers are integrated to detect the road curb based on a Z-variance method. Vision-based lane detection is realized by a two-scans method combined with an image model. A Haar-like feature based method is applied for traffic sign detection, and a SURF matching method is used for sign classification. The results of experiments validate the effectiveness of the proposed algorithms and the whole system.
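    The abstract does not spell out the Z-variance curb detector; a plausible minimal reading is to grid the laser returns in the ground plane and flag cells whose height (z) variance is high, since a flat road surface has near-zero z-variance while a curb edge does not. A hypothetical sketch along those lines (cell size, threshold and minimum point count are illustrative, not from the paper):

        import numpy as np

        def curb_cells(points, cell=0.25, z_var_thresh=4e-4, min_pts=5):
            """Return ground-grid cells whose height variance is high.

            points: (N, 3) array of laser returns (x, y, z) in metres.
            """
            ij = np.floor(points[:, :2] / cell).astype(int)
            flagged = []
            for key in {tuple(k) for k in ij}:
                mask = (ij[:, 0] == key[0]) & (ij[:, 1] == key[1])
                if mask.sum() >= min_pts and points[mask, 2].var() > z_var_thresh:
                    flagged.append(key)   # likely curb or other vertical structure
            return flagged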
    PMID:24349771

274. Femtosecond laser micro-inscription of optical coherence tomography resolution test artifacts.

    PubMed

    Tomlins, Peter H; Smith, Graham N; Woolliams, Peter D; Rasakanthan, Janarthanan; Sugden, Kate

    2011-04-25

    Optical coherence tomography (OCT) systems are becoming more commonly used in biomedical imaging and, to enable continued uptake, a reliable method of characterizing their performance and validating their operation is required. This paper outlines the use of femtosecond laser subsurface micro-inscription techniques to fabricate an OCT test artifact for validating the resolution performance of a commercial OCT system. The key advantage of this approach is that, by utilizing nonlinear absorption, a three-dimensional grid of highly localized point and line defects can be written in clear fused silica substrates.

275. Participation in Decision Making as a Property of Complex Adaptive Systems: Developing and Testing a Measure

    PubMed Central

    Anderson, Ruth A.; Hsieh, Pi-Ching; Su, Hui Fang; Landerman, Lawrence R.; McDaniel, Reuben R.

    2013-01-01

    Objectives. To (1) describe participation in decision making as a systems-level property of complex adaptive systems and (2) present empirical evidence of the reliability and validity of a corresponding measure. Method. Study 1 was a mail survey of a single respondent (administrators or directors of nursing) in each of 197 nursing homes. Study 2 was a field study using a random, proportionally stratified sampling procedure that included 195 organizations with 3,968 respondents. Analysis. In Study 1, we analyzed the data to reduce the number of scale items and establish initial reliability and validity. In Study 2, we strengthened the psychometric test using a large sample. Results. The results demonstrated the validity and reliability of the participation in decision-making instrument (PDMI) while measuring participation of workers in two distinct job categories (RNs and CNAs). We established reliability at the organizational level using aggregated item scores, and validity of the multidimensional properties using convergent and discriminant validity and confirmatory factor analysis. Conclusions. Participation in decision making, when modeled as a systems-level property of organizations, has multiple dimensions and is more complex than is traditionally measured. Managers can use this model to form decision teams that maximize the depth and breadth of expertise needed and to foster connection among them. PMID:24349771

276. Accurate electronic and chemical properties of 3d transition metal oxides using a calculated linear response U and a DFT + U(V) method.

    PubMed

    Xu, Zhongnan; Joshi, Yogesh V.; Raman, Sumathy; Kitchin, John R.

    2015-04-14

    We validate the usage of the calculated, linear response Hubbard U for evaluating accurate electronic and chemical properties of bulk 3d transition metal oxides. We find that calculated values of U lead to improved band gaps. For the evaluation of accurate reaction energies, we first identify and eliminate contributions to the reaction energies of bulk systems due only to changes in U, and we construct a thermodynamic cycle that references the total energies of unique-U systems to a common point using a DFT + U(V) method, which we recast from a recently introduced DFT + U(R) method for molecular systems. We then introduce a semi-empirical method based on weighted DFT/DFT + U cohesive energies to calculate bulk oxidation energies of transition metal oxides using density functional theory and linear response calculated U values. We validate this method by calculating 14 reaction energies involving V, Cr, Mn, Fe, and Co oxides. We find up to an 85% reduction of the mean absolute error (MAE) compared to energies calculated with the Perdew-Burke-Ernzerhof functional. When our method is compared with DFT + U using empirically derived U values and with the HSE06 hybrid functional, we find up to 65% and 39% reductions in the MAE, respectively.
277. Development of a point-kinetic verification scheme for nuclear reactor applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demazière, C.; Dykin, V.; Jareteg, K.

    In this paper, a new method that can be used for checking the proper implementation of time- or frequency-dependent neutron transport models and for verifying their ability to recover some basic reactor physics properties is proposed. This method makes use of the application of a stationary perturbation to the system at a given frequency and the extraction of the point-kinetic component of the system response. Even for strongly heterogeneous systems for which an analytical solution does not exist, the point-kinetic component follows, as a function of frequency, a simple analytical form. The comparison between the extracted point-kinetic component and its expected analytical form provides an opportunity to verify and validate neutron transport solvers. The proposed method is tested on two diffusion-based codes, one working in the time domain and the other working in the frequency domain. As long as the applied perturbation has a non-zero reactivity effect, it is demonstrated that the method can be successfully applied to verify and validate time- or frequency-dependent neutron transport solvers. Although the method is demonstrated in the present paper in a diffusion theory framework, higher-order neutron transport methods could be verified based on the same principles.
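    The "simple analytical form" referred to above is, in standard point kinetics, the zero-power reactor transfer function. As a textbook identity (not quoted from the paper itself), the normalized power response to a small reactivity perturbation at angular frequency omega is

        \[
          \frac{\delta P(\omega)}{\delta \rho(\omega)} = P_0 \, G_0(\omega),
          \qquad
          G_0(\omega) = \frac{1}{i\omega \left( \Lambda + \sum_{k=1}^{6} \frac{\beta_k}{i\omega + \lambda_k} \right)},
        \]

    where P_0 is the steady-state power, Lambda the prompt neutron generation time, and beta_k, lambda_k the delayed-neutron fractions and decay constants. Comparing the extracted point-kinetic component against |G_0(omega)| over frequency is the kind of comparison the record describes.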
278. Control Theory based Shape Design for the Incompressible Navier-Stokes Equations

    NASA Astrophysics Data System (ADS)

    Cowles, G.; Martinelli, L.

    2003-12-01

    A design method for shape optimization in incompressible turbulent viscous flow has been developed and validated for inverse design. The gradient information is determined using a control theory based algorithm. With such an approach, the cost of computing the gradient is negligible: one additional adjoint system must be solved, which requires the cost of a single steady-state flow solution. Thus, this method has an enormous advantage over traditional finite-difference based algorithms. The method of artificial compressibility is utilized to solve both the flow and adjoint systems. An algebraic turbulence model is used to compute the eddy viscosity. The method is validated using several inverse wing design test cases. In each case, the program must modify the shape of the initial wing such that its pressure distribution matches that of the target wing. Results are shown for the inversion of both finite-thickness wings and zero-thickness wings, which can be considered a model of yacht sails.

279. Cluster analysis of molecular simulation trajectories for systems where both conformation and orientation of the sampled states are important.

    PubMed

    Abramyan, Tigran M; Snyder, James A; Thyparambil, Aby A; Stuart, Steven J; Latour, Robert A

    2016-08-05

    Clustering methods have been widely used to group together similar conformational states from molecular simulations of biomolecules in solution. For applications such as the interaction of a protein with a surface, the orientation of the protein relative to the surface is also an important clustering parameter because of its potential effect on adsorbed-state bioactivity. This study presents cluster analysis methods that are specifically designed for systems where both molecular orientation and conformation are important, and the methods are demonstrated using test cases of adsorbed proteins for validation. Additionally, because cluster analysis can be a very subjective process, an objective procedure for identifying both the optimal number of clusters and the best clustering algorithm for a given dataset is presented. The method is demonstrated for several agglomerative hierarchical clustering algorithms used in conjunction with three cluster validation techniques. © 2016 Wiley Periodicals, Inc.
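    The record names neither its three validation indices nor its linkage choices. As one common instance of the same idea, the sketch below scores agglomerative clusterings over a range of cluster counts with the silhouette index and keeps the best; the data are a toy stand-in for combined conformation/orientation descriptors:

        import numpy as np
        from sklearn.cluster import AgglomerativeClustering
        from sklearn.metrics import silhouette_score

        rng = np.random.default_rng(0)
        # Toy feature vectors standing in for conformation/orientation descriptors.
        data = np.vstack([rng.normal(c, 0.3, size=(50, 4)) for c in (0.0, 2.0, 4.0)])

        best_k, best_score = None, -1.0
        for k in range(2, 8):
            labels = AgglomerativeClustering(n_clusters=k,
                                             linkage="average").fit_predict(data)
            score = silhouette_score(data, labels)  # one possible validation index
            if score > best_score:
                best_k, best_score = k, score

        print(best_k, round(best_score, 3))  # expect k = 3 for this toy data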
280. ENFIN--A European network for integrative systems biology.

    PubMed

    Kahlem, Pascal; Clegg, Andrew; Reisinger, Florian; Xenarios, Ioannis; Hermjakob, Henning; Orengo, Christine; Birney, Ewan

    2009-11-01

    Integration of biological data of various types and the development of adapted bioinformatics tools represent critical objectives to enable research at the systems level. The European Network of Excellence ENFIN is engaged in developing an adapted infrastructure to connect databases, and platforms to enable both the generation of new bioinformatics tools and the experimental validation of computational predictions. With the aim of bridging the gap between standard wet laboratories and bioinformatics, the ENFIN Network runs integrative research projects to bring the latest computational techniques to bear directly on questions dedicated to systems biology in the wet laboratory environment. The Network maintains close internal collaboration between experimental and computational research, enabling a permanent cycle of experimental validation and improvement of computational prediction methods.
    The computational work includes the development of a database infrastructure (EnCORE), bioinformatics analysis methods and a novel platform for protein function analysis, FuncNet.

281. Validating silicon polytrodes with paired juxtacellular recordings: method and dataset.

    PubMed

    Neto, Joana P; Lopes, Gonçalo; Frazão, João; Nogueira, Joana; Lacerda, Pedro; Baião, Pedro; Aarts, Arno; Andrei, Alexandru; Musa, Silke; Fortunato, Elvira; Barquinha, Pedro; Kampff, Adam R

    2016-08-01

    Cross-validating new methods for recording neural activity is necessary to accurately interpret and compare the signals they measure. Here we describe a procedure for precisely aligning two probes for in vivo "paired recordings" such that the spiking activity of a single neuron is monitored with both a dense extracellular silicon polytrode and a juxtacellular micropipette. Our new method allows for efficient, reliable, and automated guidance of both probes to the same neural structure with micrometer resolution. We also describe a new dataset of paired recordings, which is available online. We propose that our novel targeting system, and the ever-expanding cross-validation dataset, will be vital to the development of new algorithms for automatically detecting and sorting single units, characterizing new electrode materials and designs, and resolving nagging questions regarding the origin and nature of extracellular neural signals. Copyright © 2016 the American Physiological Society.

282. Test method research on weakening interface strength of steel - concrete under cyclic loading

    NASA Astrophysics Data System (ADS)

    Liu, Ming-wei; Zhang, Fang-hua; Su, Guang-quan

    2018-02-01

    The mechanical properties of the steel-concrete interface under cyclic loading are key factors affecting horizontal load transfer, the calculation of bearing capacity, and cumulative horizontal deformation. The cyclic shear test is an effective method for studying the strength reduction of the steel-concrete interface. A test system composed of a large repeated direct shear apparatus, a hydraulic servo system, a data acquisition system and test control software was independently designed, and a complete test method, covering specimen preparation, instrument preparation and the loading procedure, is put forward. The validity of the test method is verified by a representative set of test results.
    The test system, and the test method based on it, provide a reference for experimental study of the mechanical properties of the steel-concrete interface.

283. A modular assembly method of a feed and thruster system for Cubesats

    NASA Astrophysics Data System (ADS)

    Louwerse, Marcus; Jansen, Henri; Elwenspoek, Miko

    2010-11-01

    A modular assembly method for devices based on micro system technology is presented. The assembly method forms the foundation for a miniaturized feed and thruster system, part of a micro propulsion unit working as a simple blow-down rocket engine. The micro rocket is designed for constellation maintenance of Cubesats, which measure 10 x 10 x 10 cm and have a mass of less than 1 kg. The feed and thruster system contains an active valve, control electronics, a particle filter and an axisymmetric converging-diverging nozzle, all fabricated as separate modules. A novel method is used to integrate these modules by placing them on or in a glass tube package. The assembly method is shown to be valid, but the valve module needs considerable improvement.

284. Reliability and validity of pendulum test measures of spasticity obtained with the Polhemus tracking system from patients with chronic stroke

    PubMed Central

    Bohannon, Richard W; Harrison, Steven; Kinsella-Shaw, Jeffrey

    2009-01-01

    Background: Spasticity is a common impairment accompanying stroke. Spasticity of the quadriceps femoris muscle can be quantified using the pendulum test. The measurement properties of pendular kinematics captured using a magnetic tracking system have not been studied among patients who have experienced a stroke. Therefore, this study describes the test-retest reliability and the known-groups and convergent validity of pendulum test measures obtained with the Polhemus tracking system. Methods: Eight patients with chronic stroke underwent pendulum tests with their affected and unaffected lower limbs, with and without the addition of a 2.2 kg cuff weight at the ankle, using the Polhemus magnetic tracking system. Also measured bilaterally were knee resting angles, Ashworth scores (grades 0-4) of the quadriceps femoris muscles, patellar tendon (knee jerk) reflexes (grades 0-4), and isometric knee extension force. Results: Three measures obtained from pendular traces of the affected side were reliable (intraclass correlation coefficient ≥ .844). Known-groups validity was confirmed by a significant difference in the measurements between sides. Convergent validity was supported by correlations ≥ .57 between pendulum test measures and other measures reflective of spasticity. Conclusion: Pendulum test measures obtained with the Polhemus tracking system from the affected side of patients with stroke have good test-retest reliability and both known-groups and convergent validity. PMID:19642989
285. Validating a Prognostic Scoring System for Postmastectomy Locoregional Recurrence in Breast Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, Skye Hung-Chun (Koo Foundation Sun Yat-Sen Cancer Center, Taipei, Taiwan; Duke University Medical Center, Durham, North Carolina)

    2013-03-15

    Purpose: This study is designed to validate a previously developed locoregional recurrence (LRR) risk scoring system and to further define which groups of patients with breast cancer would benefit from postmastectomy radiation therapy (PMRT). Methods and Materials: An LRR risk scoring system was developed previously at our institution using breast cancer patients initially treated with modified radical mastectomy between 1990 and 2001. The LRR score comprises 4 factors: patient age, lymphovascular invasion, estrogen receptor negativity, and number of involved lymph nodes. We sought to validate the original study by examining a new dataset of 1545 patients treated between 2002 and 2007. Results: The 1545 patients were scored according to the previously developed criteria: 920 (59.6%) were low risk (score 0-1), 493 (31.9%) intermediate risk (score 2-3), and 132 (8.5%) high risk (score ≥4). The 5-year locoregional control rates with and without PMRT in the low-risk, intermediate-risk, and high-risk groups were 98% versus 97% (P=.41), 97% versus 91% (P=.0005), and 89% versus 50% (P=.0002), respectively. Conclusions: This analysis of an additional 1545 patients treated between 2002 and 2007 validates our previously reported LRR scoring system and suggests appropriate patients for whom PMRT will be beneficial. Independent validation of this scoring system by other institutions is recommended.
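    The abstract gives the score bands but not the per-factor point assignment. Assuming one point per adverse factor (an assumption for illustration, not stated in the record), the grouping logic reduces to:

        def lrr_risk_group(score: int) -> str:
            """Map an LRR score to the study's risk bands."""
            if score <= 1:
                return "low"           # 5-yr control ~98% vs 97% with/without PMRT
            if score <= 3:
                return "intermediate"  # ~97% vs 91%
            return "high"              # ~89% vs 50%

        # e.g. a young, node-positive, ER-negative patient with LVI -> "high"
        print(lrr_risk_group(4))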
286. A validation method for near-infrared spectroscopy based tissue oximeters for cerebral and somatic tissue oxygen saturation measurements.

    PubMed

    Benni, Paul B; MacLeod, David; Ikeda, Keita; Lin, Hung-Mo

    2018-04-01

    We describe the validation methodology for the NIRS based FORE-SIGHT ELITE® (CAS Medical Systems, Inc., Branford, CT, USA) tissue oximeter for cerebral and somatic tissue oxygen saturation (StO2) measurements in adult subjects, submitted to the United States Food and Drug Administration (FDA) to obtain clearance for clinical use. This validation methodology evolved from a history of NIRS validations in the literature and the FDA-recommended use of Deming regression and bootstrapping statistical validation methods. For cerebral validation, forehead cerebral StO2 measurements were compared to a weighted 70:30 reference (REF CXB) of co-oximeter internal jugular venous and arterial blood saturation of healthy adult subjects during a controlled hypoxia sequence, with a sensor placed on the forehead. For somatic validation, somatic StO2 measurements were compared to a weighted 70:30 reference (REF CXS) of co-oximetry central venous and arterial saturation values following a similar protocol, with sensors placed on the flank, quadriceps muscle, and calf muscle. With informed consent, 25 subjects successfully completed the cerebral validation study. The bias and precision (1 SD) of cerebral StO2 compared to REF CXB was -0.14 ± 3.07%. With informed consent, 24 subjects successfully completed the somatic validation study. The bias and precision of somatic StO2 compared to REF CXS was 0.04 ± 4.22%, from the average of the flank, quadriceps, and calf StO2 measurements to best represent the global whole-body REF CXS. The NIRS validation methods presented potentially provide a reliable means to test NIRS monitors and qualify them for clinical use.
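    Deming regression is recommended for such method comparisons because, unlike ordinary least squares, it permits measurement error in both the reference and the test method. A minimal sketch with hypothetical paired saturation values (not data from the study); delta is the assumed ratio of the two error variances:

        import numpy as np

        def deming(x, y, delta=1.0):
            """Deming regression slope and intercept (errors in both variables)."""
            mx, my = x.mean(), y.mean()
            sxx = ((x - mx) ** 2).sum()
            syy = ((y - my) ** 2).sum()
            sxy = ((x - mx) * (y - my)).sum()
            slope = ((syy - delta * sxx) + np.sqrt((syy - delta * sxx) ** 2
                     + 4.0 * delta * sxy ** 2)) / (2.0 * sxy)
            return slope, my - slope * mx

        # Hypothetical paired values: co-oximetry reference vs. NIRS StO2 (%).
        ref  = np.array([55.0, 60.0, 65.0, 70.0, 75.0, 80.0])
        sto2 = np.array([54.2, 61.1, 64.5, 70.9, 74.3, 80.8])
        slope, intercept = deming(ref, sto2)
        bias = (sto2 - ref).mean()            # cf. the record's bias figures
        precision = (sto2 - ref).std(ddof=1)  # 1 SD precision
        print(slope, intercept, bias, precision)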
287. Development and Content Validation of the Transition Readiness Inventory Item Pool for Adolescent and Young Adult Survivors of Childhood Cancer.

    PubMed

    Schwartz, Lisa A; Hamilton, Jessica L; Brumley, Lauren D; Barakat, Lamia P; Deatrick, Janet A; Szalda, Dava E; Bevans, Katherine B; Tucker, Carole A; Daniel, Lauren C; Butler, Eliana; Kazak, Anne E; Hobbie, Wendy L; Ginsberg, Jill P; Psihogios, Alexandra M; Ver Hoeve, Elizabeth; Tuchman, Lisa K

    2017-10-01

    The development of the Transition Readiness Inventory (TRI) item pool for adolescent and young adult survivors of childhood cancer is described, aiming both to advance transition research and to provide an example of the application of NIH Patient-Reported Outcomes Measurement Information System methods. Using rigorous measurement development methods, including mixed methods, patient and parent versions of the TRI item pool were created based on the Social-ecological Model of Adolescent and young adult Readiness for Transition (SMART). Each stage informed development and refinement of the item pool. Content validity ratings and cognitive interviews resulted in 81 content-valid items for the patient version and 85 items for the parent version. TRI represents the first multi-informant, rigorously developed transition readiness item pool that comprehensively measures the social-ecological components of transition readiness. The discussion includes clinical implications, the application of TRI and the methods used to develop the item pool to other populations, and next steps for further validation and refinement. © The Author 2017. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

288. Development and validation of RP HPLC method to determine nandrolone phenylpropionate in different pharmaceutical formulations.

    PubMed

    Mukherjee, Jayanti; Das, Ayan; Chakrabarty, Uday Sankar; Sahoo, Bijay Kumar; Dey, Goutam; Choudhury, Hira; Pal, Tapan Kumar

    2011-01-01

    This study describes the development and subsequent validation of a reversed-phase high-performance liquid chromatographic (RP-HPLC) method for the estimation of nandrolone phenylpropionate, an anabolic steroid, in bulk drug, in a conventional parenteral dosage formulation and in a prepared nanoparticle dosage form. The chromatographic system consisted of a Luna Phenomenex CN column (250 mm x 4.6 mm, 5 µm), an isocratic mobile phase comprising 10 mM phosphate buffer and acetonitrile (50:50, v/v), and UV detection at 240 nm. Nandrolone phenylpropionate eluted at about 6.3 min with no interfering peaks from the excipients used in the dosage forms. The method was linear over the range from 0.050 to 25 µg/mL in raw drug (r2 = 0.9994). The intra-day and inter-day precision values were in the ranges of 0.219-0.609% and 0.441-0.875%, respectively. The limits of detection and quantitation were 0.010 µg/mL and 0.050 µg/mL, respectively. The results were validated according to International Conference on Harmonisation (ICH) guidelines in the parenteral and prepared nanoparticle formulations. The validated HPLC method is simple, sensitive, precise, accurate and reproducible.
289. ALHAT System Validation

    NASA Technical Reports Server (NTRS)

    Brady, Tye; Bailey, Erik; Crain, Timothy; Paschall, Stephen

    2011-01-01

    NASA has embarked on a multiyear technology development effort to develop a safe and precise lunar landing capability. The Autonomous Landing and Hazard Avoidance Technology (ALHAT) Project is investigating a range of landing hazard detection methods while developing a hazard avoidance capability to best field test the proper set of relevant autonomous GNC technologies. Ultimately, the advancement of these technologies through the ALHAT Project will provide an ALHAT System capable of enabling next generation lunar lander vehicles to land globally, precisely and safely regardless of lighting condition. This paper provides an overview of the ALHAT System and describes recent validation experiments that have advanced the highly capable GNC architecture.

290. Integrating Symbolic and Statistical Methods for Testing Intelligent Systems: Applications to Machine Learning and Computer Vision

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jha, Sumit Kumar; Pullum, Laura L; Ramanathan, Arvind

    Embedded intelligent systems ranging from tiny implantable biomedical devices to large swarms of autonomous unmanned aerial systems are becoming pervasive in our daily lives. While we depend on the flawless functioning of such intelligent systems, and often take their behavioral correctness and safety for granted, it is notoriously difficult to generate test cases that expose subtle errors in the implementations of machine learning algorithms. Hence, the validation of intelligent systems is usually achieved by studying their behavior on representative data sets, using methods such as cross-validation and bootstrapping. In this paper, we present a new testing methodology for studying the correctness of intelligent systems, using symbolic decision procedures coupled with statistical hypothesis testing. We also use our algorithm to analyze the robustness of a human detection algorithm built using the OpenCV open-source computer vision library. We show that the human detection implementation can fail to detect humans in perturbed video frames even when the perturbations are so small that the corresponding frames look identical to the naked eye.
291. Developing a contributing factor classification scheme for Rasmussen's AcciMap: Reliability and validity evaluation.

    PubMed

    Goode, N.; Salmon, P. M.; Taylor, N. Z.; Lenné, M. G.; Finch, C. F.

    2017-10-01

    One factor potentially limiting the uptake of Rasmussen's (1997) AcciMap method by practitioners is the lack of a contributing factor classification scheme to guide accident analyses. This article evaluates the intra- and inter-rater reliability and criterion-referenced validity of a classification scheme developed to support the use of AcciMap by led outdoor activity (LOA) practitioners. The classification scheme has two levels: the system level describes the actors, artefacts and activity context in terms of 14 codes; the descriptor level breaks the system-level codes down into 107 specific contributing factors. The study involved 11 LOA practitioners using the scheme on two separate occasions to code a pre-determined list of contributing factors identified from four incident reports. Criterion-referenced validity was assessed by comparing the codes selected by the LOA practitioners to those selected by the method creators. Mean intra-rater reliability scores at the system (M = 83.6%) and descriptor (M = 74%) levels were acceptable. Mean inter-rater reliability scores were not consistently acceptable for both coding attempts at the system level (M_T1 = 68.8%; M_T2 = 73.9%), and were poor at the descriptor level (M_T1 = 58.5%; M_T2 = 64.1%). Mean criterion-referenced validity scores at the system level were acceptable (M_T1 = 73.9%; M_T2 = 75.3%), but were not consistently acceptable at the descriptor level (M_T1 = 67.6%; M_T2 = 70.8%). Overall, the results indicate that the classification scheme does not currently satisfy reliability and validity requirements, and that further work is required. The implications for the design and development of contributing factor classification schemes are discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.

292. Reviewing Reliability and Validity of Information for University Educational Evaluation

    NASA Astrophysics Data System (ADS)

    Otsuka, Yusaku

    To better utilize evaluations in higher education, it is necessary to share methods for reviewing the reliability and validity of examination scores and grades, and to accumulate and share data for confirming the results. Before a GPA system is introduced into a university or college, the reliability of examination scores and grades, especially for essay examinations, must be assured. Validity is a complicated concept, so it should be assured in various ways, including professional audits, theoretical models, and statistical data analysis. Because individual students and teachers are continually improving, using evaluations to appraise their progress is not always compatible with using evaluations to appraise the implementation of accountability in various departments or the university overall. To better utilize evaluations and improve higher education, evaluations should be integrated into the current system by sharing the vision of an academic learning community and promoting interaction between students and teachers based on sufficiently reliable and validated evaluation tools.

293. Validating the Use of Performance Risk Indices for System-Level Risk and Maturity Assessments

    NASA Astrophysics Data System (ADS)

    Holloman, Sherrica S.

    With pressure on the U.S. Defense Acquisition System (DAS) to reduce cost overruns and schedule delays, system engineers' performance is only as good as their tools.
    Recent literature details a need for (1) objective, analytical risk quantification methodologies over traditional subjective qualitative methods, such as expert judgment, and (2) mathematically rigorous system-level maturity assessments. The Technology Performance Risk Index (TPRI) of Mahafza, Componation, and Tippett (2005) ties the assessment of technical performance to the quantification of the risk of unmet performance; however, it is structured for component-level data as input. This study's aim is to establish a modified TPRI that takes system-level data as model input, and then to validate the modified index with actual system-level data from the Department of Defense's (DoD) Major Defense Acquisition Programs (MDAPs). This work's contribution is the establishment and validation of the System-level Performance Risk Index (SPRI). With the introduction of the SPRI, system-level metrics are better aligned, allowing for better assessment, trade-off and balance of time, performance and cost constraints. This will ultimately allow system engineers and program managers to make better-informed system-level technical decisions throughout the development phase.

294. The power grid AGC frequency bias coefficient online identification method based on wide area information

    NASA Astrophysics Data System (ADS)

    Wang, Zian; Li, Shiguang; Yu, Ting

    2015-12-01

    This paper proposes an online identification method for the regional frequency bias coefficient used in AGC, based on analysis of the AGC adjustment response mechanism in interconnected grids and on the real-time operating state of generators as measured through PMUs. It analyzes how to optimize the regional frequency bias coefficient under actual power system operating conditions, so as to achieve more accurate and efficient automatic generation control. The validity of the online identification method is verified by establishing a long-term frequency control simulation model of a two-region interconnected power system.
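    The frequency bias coefficient B enters AGC through the area control error. A minimal sketch of the conventional definition (the standard NERC-style formula, not equations taken from the paper; the example numbers are illustrative):

        def area_control_error(dp_tie_mw, df_hz, b_mw_per_0p1hz):
            """ACE = dP_tie + 10*B*df, with B in MW per 0.1 Hz (negative by
            convention). B is the quantity the paper identifies online."""
            return dp_tie_mw + 10.0 * b_mw_per_0p1hz * df_hz

        # Example: exporting 30 MW over schedule while frequency sags 0.02 Hz.
        print(area_control_error(30.0, -0.02, -50.0))  # 30 + 10 = 40 MW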
295. Development and validation of a new method to simultaneously quantify triazoles in plasma spotted on dry sample spot devices and analysed by HPLC-MS.

    PubMed

    Baietto, Lorena; D'Avolio, Antonio; Marra, Cristina; Simiele, Marco; Cusato, Jessica; Pace, Simone; Ariaudo, Alessandra; De Rosa, Francesco Giuseppe; Di Perri, Giovanni

    2012-11-01

    Therapeutic drug monitoring (TDM) of triazoles is widely used in clinical practice to optimize therapy. TDM is limited by technical problems and cost considerations, such as sample storage and dry-ice shipping. We aimed to develop and validate a new method to analyse itraconazole, posaconazole and voriconazole in plasma spotted on dry sample spot devices (DSSDs) and quantified by HPLC with mass spectrometry (HPLC-MS). Extraction from DSSDs was done using n-hexane/ethyl acetate and ammonia solution. Accuracy and precision were assayed by inter- and intra-day validation. The stability of triazoles in plasma spotted on DSSDs was investigated at room temperature for 1 month, and the method was compared with a validated standard HPLC method for quantification of triazoles in human plasma. Mean inter- and intra-day accuracy and precision were <15% for all compounds. Triazoles were stable for 2 weeks at room temperature. The method was linear (r2 > 0.999) in the range 0.031-8 mg/L for itraconazole and posaconazole, and 0.058-15 mg/L for voriconazole. High sensitivity was observed: the limits of detection were 0.008, 0.004 and 0.007 mg/L for itraconazole, posaconazole and voriconazole, respectively. A high degree of correlation (r2 > 0.94) was obtained between the DSSD method and the standard method of analysis. The method that we developed and validated to quantify triazoles in human plasma spotted on DSSDs is accurate and precise. It overcomes problems related to plasma sample storage and shipment, allowing TDM to be performed in a cheaper and safer manner.

296. A wall interference assessment/correction system

    NASA Technical Reports Server (NTRS)

    Lo, Ching F.; Ulbrich, N.; Sickles, W. L.; Qian, Cathy X.

    1992-01-01

    A Wall Signature method, the Hackett method, has been selected to be adapted for the 12-ft Wind Tunnel wall interference assessment/correction (WIAC) system in the present phase. This method uses limited measurements of the static pressure at the wall, in conjunction with the solid wall boundary condition, to determine the strength and distribution of singularities representing the test article. The singularities are used in turn for estimating wall interference at the model location. The Wall Signature method will be formulated for application to the unique geometry of the 12-ft Tunnel. The development and implementation of a working prototype will be completed, delivered and documented with a software manual. The WIAC code will be validated by conducting numerically simulated experiments rather than actual wind tunnel experiments. The simulations will be used to generate both free-air and confined wind-tunnel flow fields for each of the test articles over a range of test configurations. Specifically, the pressure signature at the test section wall will be computed for the tunnel case to provide the simulated "measured" data, which serve as the input for the Wall Signature method. The performance of the WIAC method may then be evaluated by comparing the corrected parameters with those of the free-air simulation; each set of wind tunnel/test article numerical simulations provides data to validate the WIAC method. A numerical wind tunnel test simulation has been initiated to validate the WIAC method developed in this project.
    In the period reported here, the blockage correction has been developed and implemented for a rectangular tunnel as well as the 12-ft Pressure Tunnel. An improved wall interference assessment and correction method for three-dimensional wind tunnel testing is presented in the appendix.

297. Statistical evaluation of forecasts

    NASA Astrophysics Data System (ADS)

    Mader, Malenka; Mader, Wolfgang; Gluckman, Bruce J.; Timmer, Jens; Schelter, Björn

    2014-08-01

    Reliable forecasts of extreme but rare events, such as earthquakes, financial crashes, and epileptic seizures, would render interventions and precautions possible. Therefore, forecasting methods have been developed that intend to raise an alarm if an extreme event is about to occur. In order to statistically validate the performance of a prediction system, it must be compared to the performance of a random predictor, which raises alarms independently of the events. Such a random predictor can be obtained by bootstrapping or analytically. We propose an analytic statistical framework which, in contrast to conventional methods, allows for validating the sensitivity and specificity of a forecasting method independently. Moreover, our method accounts for the periods during which an event has to remain absent or occur after a respective forecast.
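    The chance-level comparison can be made analytic rather than bootstrap-based: a predictor that raises alarms at random, covering a fraction q of the time, hits each event independently with probability q, so the chance-level hit count is binomial. A minimal sketch of that test (a generic construction, not the paper's full framework, which additionally handles post-forecast occurrence periods):

        from scipy.stats import binom

        def better_than_chance(n_events, n_hits, alarm_fraction, alpha=0.05):
            """One-sided binomial test of a forecaster against a random predictor.

            Chance level: hits ~ Binomial(n_events, alarm_fraction).
            """
            p_value = binom.sf(n_hits - 1, n_events, alarm_fraction)
            return p_value, p_value < alpha

        # Example: 40 events, 18 forecast correctly, alarms active 20% of the
        # time (chance level is 8 hits).
        print(better_than_chance(40, 18, 0.20))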
  299. Using historical data to measure transportation infrastructure constraints on land use

    DOT National Transportation Integrated Search

    1998-06-01

    This study had three goals: (1) To develop a method for reversing the planning process, such that we begin with transportation system usage and conclude with an indication of land use; (2) To validate thi...

  300. The aberration characteristics in a misaligned three-mirror anastigmatic (TMA) system

    NASA Astrophysics Data System (ADS)

    Wang, Bin; Wu, Fan; Ye, Yutang

    2016-09-01

    To enable efficient alignment of the TMA system, the aberrations of a misaligned TMA system are analyzed theoretically in this paper. First, based on nodal aberration theory (NAT), the aberration types and characteristics of the misaligned TMA system are derived. Second, a simulation study was carried out to verify the analysis; the simulation results confirm the predicted aberration characteristics. Finally, the alignment procedure is determined from the aberration characteristics: first adjust the axial spacing of the mirrors according to Z9 in the center field of the TMA system; then adjust the decenters and tilts of the mirrors according to Z5-Z8 in the edge field. This method is helpful for the alignment of TMA telescopes.

  301. Dynamic decision-making for reliability and maintenance analysis of manufacturing systems based on failure effects

    NASA Astrophysics Data System (ADS)

    Zhang, Ding; Zhang, Yingjie

    2017-09-01

    A framework for reliability and maintenance analysis of job shop manufacturing systems is proposed in this paper. An efficient preventive maintenance (PM) policy in terms of failure effects analysis (FEA) is proposed. Subsequently, reliability evaluation and component importance measures based on FEA are performed under the PM policy. A job shop manufacturing system is used to validate the reliability evaluation and dynamic maintenance policy; the results are compared with existing methods and the effectiveness is confirmed. Issues that are often vaguely understood during manufacturing-system reliability analysis, such as network modelling, vulnerability identification, evaluation criteria for repairable systems, and the PM policy itself, are elaborated. This framework can support reliability optimisation and the rational allocation of maintenance resources in job shop manufacturing systems.
  302. Vision-based system identification technique for building structures using a motion capture system

    NASA Astrophysics Data System (ADS)

    Oh, Byung Kwan; Hwang, Jin Woo; Kim, Yousok; Cho, Tongjun; Park, Hyo Seon

    2015-11-01

    This paper presents a new vision-based system identification (SI) technique for building structures using a motion capture system (MCS). The MCS, with its outstanding capabilities for dynamic response measurement, can provide gauge-free measurements of vibrations through the convenient installation of multiple markers. In this technique, the dynamic characteristics (natural frequency, mode shape, and damping ratio) of building structures are extracted from the dynamic displacement responses measured by the MCS, after converting the displacements to accelerations and conducting SI by frequency domain decomposition (FDD). A free vibration experiment on a three-story shear frame was conducted to validate the proposed technique. The SI results from the conventional accelerometer-based method were compared with those from the proposed technique and showed good agreement, which confirms the validity and applicability of the proposed vision-based SI technique for building structures. Furthermore, SI applying the MCS-measured displacements directly to FDD yielded results identical to those of the conventional SI method.
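    The processing chain described above (displacement to acceleration, then frequency domain decomposition) can be sketched in a few lines. The example below is a simplified stand-in for the authors' pipeline, with synthetic data: it double-differentiates displacement signals, forms the cross-spectral density matrix, and tracks the first singular value across frequency, whose peaks indicate natural frequencies:

        import numpy as np
        from scipy.signal import csd

        def fdd_first_singular_values(displacements, fs):
            """Frequency domain decomposition on an (n_channels, n_samples) array."""
            # Convert MCS displacements to accelerations by double differentiation.
            acc = np.gradient(np.gradient(displacements, axis=1), axis=1) * fs**2
            n = acc.shape[0]
            # Cross-spectral density matrix G[f, i, j] via Welch-averaged CSD.
            freqs, _ = csd(acc[0], acc[0], fs=fs, nperseg=1024)
            G = np.empty((len(freqs), n, n), dtype=complex)
            for i in range(n):
                for j in range(n):
                    _, G[:, i, j] = csd(acc[i], acc[j], fs=fs, nperseg=1024)
            # First singular value per frequency; peaks ~ natural frequencies.
            s1 = np.array([np.linalg.svd(G[k], compute_uv=False)[0]
                           for k in range(len(freqs))])
            return freqs, s1

        # Demo: three noisy channels sharing a 2.5 Hz mode, sampled at 100 Hz.
        fs = 100.0
        t = np.arange(0, 60, 1 / fs)
        rng = np.random.default_rng(1)
        x = np.vstack([a * np.sin(2 * np.pi * 2.5 * t) + 0.1 * rng.standard_normal(t.size)
                       for a in (1.0, 0.8, 0.5)])
        freqs, s1 = fdd_first_singular_values(x, fs)
        print(f"peak at {freqs[np.argmax(s1)]:.2f} Hz")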
  303. Validation and verification of expert systems

    NASA Technical Reports Server (NTRS)

    Gilstrap, Lewey

    1991-01-01

    Validation and verification (V&V) are procedures used to evaluate system structure or behavior with respect to a set of requirements. Although expert systems are often developed as a series of prototypes without requirements, it is not possible to perform V&V on any system for which requirements have not been prepared. In addition, there are special problems associated with the evaluation of expert systems that do not arise in the evaluation of conventional systems, such as verification of the completeness and accuracy of the knowledge base. The criticality of most NASA missions makes it important to be able to certify the performance of the expert systems used to support these missions. Recommendations for the most appropriate method for integrating V&V into the Expert System Development Methodology (ESDM) and suggestions for the most suitable approaches at each stage of ESDM development are presented.

  304. Quantitative Evaluation Method of Each Generation Margin for Power System Planning

    NASA Astrophysics Data System (ADS)

    Su, Su; Tanaka, Kazuyuki

    As power system deregulation advances, competition among power companies intensifies, and they seek more efficient system planning using existing facilities. Therefore, an efficient system planning method has been expected. This paper proposes a quantitative evaluation method for the (N-1) generation margin considering overload and voltage stability restrictions. Concerning the generation margin related to overload, a fast solution method without recalculation of the (N-1) Y-matrix is proposed. For voltage stability, an efficient method to search for the stability limit is proposed. The IEEE 30-bus model system, composed of 6 generators and 14 load nodes, is employed to validate the proposed method. According to the results, the proposed method can reduce the computational cost of the generation margin related to overload under the (N-1) condition, and can quantify the margin.

  305. A Three-Stage Enhanced Reactive Power and Voltage Optimization Method for High Penetration of Solar

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ke, Xinda; Huang, Renke; Vallem, Mallikarjuna R.

    This paper presents a three-stage enhanced volt/var optimization method to stabilize voltage fluctuations in transmission networks by optimizing the usage of reactive power control devices. In contrast with existing volt/var optimization algorithms, the proposed method optimizes the voltage profiles of the system while keeping the voltage and real power output of the generators as close to the original scheduling values as possible. This allows the method to accommodate realistic power system operation and market scenarios, in which the original generation dispatch schedule is not affected. The proposed method was tested and validated on a modified IEEE 118-bus system with photovoltaic data.
  306. Worldwide Protein Data Bank validation information: usage and trends

    PubMed

    Smart, Oliver S; Horský, Vladimír; Gore, Swanand; Svobodová Vařeková, Radka; Bendová, Veronika; Kleywegt, Gerard J; Velankar, Sameer

    2018-03-01

    Realising the importance of assessing the quality of the biomolecular structures deposited in the Protein Data Bank (PDB), the Worldwide Protein Data Bank (wwPDB) partners established Validation Task Forces to obtain advice on the methods and standards to be used to validate structures determined by X-ray crystallography, nuclear magnetic resonance spectroscopy and three-dimensional electron cryo-microscopy. The resulting wwPDB validation pipeline is an integral part of the wwPDB OneDep deposition, biocuration and validation system. The wwPDB Validation Service webserver (https://validate.wwpdb.org) can be used to perform checks prior to deposition. Here, it is shown how validation metrics can be combined to produce an overall score that allows the ranking of macromolecular structures and domains in search results. The ValTrends DB database provides users with a convenient way to access and analyse validation information and other properties of X-ray crystal structures in the PDB, including investigating trends in and correlations between different structure properties and validation metrics.
  307. Self-homodyne free-space optical communication system based on orthogonally polarized binary phase shift keying

    PubMed

    Cai, Guangyu; Sun, Jianfeng; Li, Guangyuan; Zhang, Guo; Xu, Mengmeng; Zhang, Bo; Yue, Chaolei; Liu, Liren

    2016-06-10

    A self-homodyne laser communication system based on orthogonally polarized binary phase shift keying is demonstrated. The working principles of this method and the structure of a transceiver are described using theoretical calculations. Moreover, the signal-to-noise ratio, sensitivity, and bit error rate are analyzed for the amplifier-noise-limited case. The reported experiment validates the feasibility of the proposed method and demonstrates its advantageous sensitivity as a self-homodyne communication system.

  308. Development of Measurement Methods for Detection of Special Nuclear Materials using D-D Pulsed Neutron Source

    NASA Astrophysics Data System (ADS)

    Misawa, Tsuyoshi; Takahashi, Yoshiyuki; Yagi, Takahiro; Pyeon, Cheol Ho; Kimura, Masaharu; Masuda, Kai; Ohgaki, Hideaki

    2015-10-01

    For the detection of hidden special nuclear materials (SNMs), we have developed an active neutron-based interrogation system combining a D-D fusion pulsed neutron source with a neutron detection system. In the detection scheme, two new measurement techniques are applied simultaneously: neutron noise analysis and neutron energy spectrum analysis. The validity of the neutron noise analysis method was studied experimentally at the Kyoto University Critical Assembly (KUCA), and the method was applied to a cargo-container inspection system in simulation.

  309.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rhinefrank, Kenneth E.; Lenee-Bluhm, Pukha; Prudell, Joseph H.

    The most prudent path to a full-scale design, build and deployment of a wave energy conversion (WEC) system involves the establishment of validated numerical models using physical experiments in a methodical scaling program. This project provides essential additional rounds of wave tank testing at 1:33 scale and ocean/bay testing at 1:7 scale, necessary to validate the numerical modeling that is essential to a utility-scale WEC design and associated certification.
  310. Measuring Mobility Limitations in Children with Cerebral Palsy: Content and Construct Validity of a Mobility Questionnaire (MobQues)

    ERIC Educational Resources Information Center

    Van Ravesteyn, Nicolien T.; Scholtes, Vanessa A.; Becher, Jules G.; Roorda, Leo D.; Verschuren, Olaf; Dallmeijer, Annet J.

    2010-01-01

    Aim: The objective of this study was to assess the validity of a mobility questionnaire (MobQues) that was developed to measure parent-reported mobility limitations in children with cerebral palsy (CP). Method: The parents of 439 children with CP (256 males and 183 females; age range 2-18y; Gross Motor Function Classification System [GMFCS] levels…

  311. Simulating the Daylight Performance of Complex Fenestration Systems Using Bidirectional Scattering Distribution Functions within Radiance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ward, Gregory; Mistrick, Richard; Lee, Eleanor

    2011-01-21

    We describe two methods which rely on bidirectional scattering distribution functions (BSDFs) to model the daylighting performance of complex fenestration systems (CFS), enabling greater flexibility and accuracy in evaluating arbitrary assemblies of glazing, shading, and other optically-complex coplanar window systems. Two tools within Radiance enable a) efficient annual performance evaluations of CFS, and b) accurate renderings of CFS despite the loss of spatial resolution associated with low-resolution BSDF datasets for inhomogeneous systems. Validation, accuracy, and limitations of the methods are discussed.

  312. Generation of Human Induced Pluripotent Stem Cells Using RNA-Based Sendai Virus System and Pluripotency Validation of the Resulting Cell Population

    PubMed

    Chichagova, Valeria; Sanchez-Vera, Irene; Armstrong, Lyle; Steel, David; Lako, Majlinda

    2016-01-01

    Human induced pluripotent stem cells (hiPSCs) provide a platform for studying human disease in vitro, increase our understanding of human embryonic development, and provide clinically relevant cell types for transplantation, drug testing, and toxicology studies. Since their discovery, numerous advances have been made to eliminate issues such as vector integration into the host genome, low reprogramming efficiency, incomplete reprogramming and the acquisition of genomic instabilities. One way to achieve integration-free reprogramming is by using RNA-based Sendai virus. Here we describe a method to generate hiPSCs with Sendai virus in both feeder-free and feeder-dependent culture systems. Additionally, we illustrate methods by which to validate the pluripotency of the resulting stem cell population.
  313. Methods for assessing the quality of data in public health information systems: a critical review

    PubMed

    Chen, Hong; Yu, Ping; Hailey, David; Wang, Ning

    2014-01-01

    The quality of data in public health information systems can be ensured by effective data quality assessment. To conduct effective data quality assessment, measurable data attributes have to be precisely defined, and reliable and valid measurement methods have to be used to measure each attribute. We conducted a systematic review of data quality assessment methods for public health using major databases and well-known institutional websites. 35 studies were eligible for inclusion. A total of 49 attributes of data quality were identified from the literature. Completeness, accuracy and timeliness were the three most frequently assessed attributes. Most studies directly examined data values, complemented by exploring either data users' perceptions or documentation quality. However, current data quality assessment methods have limitations: a lack of consensus on the attributes measured; inconsistent definitions of the data quality attributes; a lack of mixed methods for assessing data quality; and inadequate attention to reliability and validity. Removing these limitations is an opportunity for further improvement.
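    Once defined, the three most frequently assessed attributes reduce to simple measurable quantities computed directly from data values. A minimal sketch, with an invented record schema and thresholds chosen purely for illustration:

        from datetime import date

        records = [  # toy public-health records (hypothetical schema)
            {"id": 1, "onset": date(2024, 1, 3), "reported": date(2024, 1, 5), "age": 34},
            {"id": 2, "onset": date(2024, 1, 4), "reported": date(2024, 1, 20), "age": None},
            {"id": 3, "onset": date(2024, 1, 6), "reported": date(2024, 1, 7), "age": 210},
        ]

        # Completeness: share of records with no missing mandatory values.
        complete = sum(r["age"] is not None for r in records) / len(records)

        # Accuracy (plausibility proxy): share of ages inside a valid range.
        valid = sum(r["age"] is not None and 0 <= r["age"] <= 120
                    for r in records) / len(records)

        # Timeliness: share of cases reported within 7 days of onset.
        timely = sum((r["reported"] - r["onset"]).days <= 7 for r in records) / len(records)

        print(f"completeness={complete:.2f} accuracy={valid:.2f} timeliness={timely:.2f}")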
  314. International ranking systems for universities and institutions: a critical appraisal

    PubMed Central

    Ioannidis, John PA; Patsopoulos, Nikolaos A; Kavvoura, Fotini K; Tatsioni, Athina; Evangelou, Evangelos; Kouri, Ioanna; Contopoulos-Ioannidis, Despina G; Liberopoulos, George

    2007-01-01

    Background: Ranking of universities and institutions has attracted wide attention recently. Several systems have been proposed that attempt to rank academic institutions worldwide. Methods: We review the two most publicly visible ranking systems, the Shanghai Jiao Tong University 'Academic Ranking of World Universities' and the Times Higher Education Supplement 'World University Rankings', and also briefly review other ranking systems that use different criteria. We assess the construct validity for educational and research excellence and the measurement validity of each of the proposed ranking criteria, and try to identify generic challenges in the international ranking of universities and institutions. Results: None of the reviewed criteria for international ranking seems to have very good construct validity for both educational and research excellence, and most do not have very good construct validity even for just one of these two aspects of excellence. Measurement error for many items is also considerable, or cannot be determined due to lack of publication of the relevant data and methodology details. The concordance between the 2006 rankings by Shanghai and Times is modest at best, with only 133 universities shared in their top 200 lists. Examination of the existing international ranking systems suggests that generic challenges include adjustment for institutional size, definition of institutions, implications of average measurements of excellence versus measurements of extremes, adjustments for scientific field, time frame of measurement, and allocation of credit for excellence. Conclusion: Naïve lists of international institutional rankings that do not address these fundamental challenges with transparent methods are misleading and should be abandoned. We make some suggestions on how focused and standardized evaluations of excellence could be improved and placed in proper context.
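    The concordance statistic quoted in the Results (133 universities shared between the two top-200 lists) is a plain set overlap. A sketch with placeholder lists standing in for the 2006 Shanghai and Times rankings:

        def top_k_overlap(list_a, list_b, k=200):
            """Number of institutions appearing in both top-k lists."""
            return len(set(list_a[:k]) & set(list_b[:k]))

        # Placeholder data constructed so the overlap happens to be 133;
        # real inputs would be the two published ranking lists.
        shanghai = [f"univ_{i}" for i in range(200)]
        times = [f"univ_{i}" for i in range(67, 267)]
        print(top_k_overlap(shanghai, times))  # -> 133 shared institutions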
  315. Validation of highly reliable, real-time knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.

    1988-01-01

    Knowledge-based systems have the potential to greatly increase the capabilities of future aircraft and spacecraft and to significantly reduce the support manpower needed for the space station and other space missions. However, a credible validation methodology must be developed before knowledge-based systems can be used for life- or mission-critical applications. Experience with conventional software has shown that the use of good software engineering techniques and static analysis tools can greatly reduce the time needed for testing and simulation of a system. Since exhaustive testing is infeasible, reliability must be built into the software during the design and implementation phases. Unfortunately, many of the software engineering techniques and tools used for conventional software are of little use in the development of knowledge-based systems. Therefore, research at Langley is focused on developing a set of guidelines, methods, and prototype validation tools for building highly reliable, knowledge-based systems. The use of a comprehensive methodology for building highly reliable, knowledge-based systems should significantly decrease the time needed for testing and simulation. A proven record of delivering reliable systems at the beginning of the highly visible testing and simulation phases is crucial to the acceptance of knowledge-based systems in critical applications.

  316. Experimental validation of the intrinsic spatial efficiency method over a wide range of sizes for cylindrical sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ortiz-Ramírez, Pablo; Larroquette, Philippe; Camilla, S.

    The intrinsic spatial efficiency method is a new absolute method to determine the efficiency of a gamma spectroscopy system for any extended source. In the original work the method was experimentally demonstrated and validated for homogeneous cylindrical sources containing ¹³⁷Cs, whose sizes varied over a small range (29.5 mm radius and 15.0 to 25.9 mm height). In this work we present an extension of the validation over a wide range of sizes. The dimensions of the cylindrical sources vary between 10 and 40 mm in height and 8 and 30 mm in radius. The cylindrical sources were prepared using the reference material IAEA-372, which had a specific activity of 11320 Bq/kg in July 2006. The results were better for the sources with 29 mm radius, showing relative bias less than 5%, and for the sources with 10 mm height, showing relative bias less than 6%. In comparison with the results obtained in the work where the method was introduced, the majority of these results show excellent agreement.

  317. Biotransformation of lignan glycoside to its aglycone by Woodfordia fruticosa flowers: quantification of compounds using a validated HPTLC method

    PubMed

    Mishra, Shikha; Aeri, Vidhu

    2017-12-01

    Saraca asoca Linn. (Caesalpiniaceae) is an important traditional remedy for gynaecological disorders and contains lyoniside, an aryl tetralin lignan glycoside. The aglycone of lyoniside, lyoniresinol, possesses structural similarity to enterolignan precursors, which are established phytoestrogens. This work illustrates the biotransformation of lyoniside to lyoniresinol using Woodfordia fruticosa Kurz. (Lythraceae) flowers and the simultaneous quantification of lyoniside and lyoniresinol using a validated HPTLC method. The aqueous extract prepared from S. asoca bark was fermented using W. fruticosa flowers. The substrate and fermented product were analyzed simultaneously using the solvent system toluene:ethyl acetate:formic acid (4:3:0.4) at 254 nm. The method was validated for specificity, accuracy, precision, linearity, sensitivity and robustness as per ICH guidelines. The substrate showed the presence of lyoniside, which decreased as the fermentation proceeded. On the third day, lyoniresinol started appearing in the medium, and within 8 days most of the lyoniside had been converted to lyoniresinol. The developed method was specific for lyoniside and lyoniresinol, which showed linearity in the ranges of 250-3000 and 500-2500 ng, respectively. The method was accurate, with recoveries of 99.84% and 99.83% for lyoniside and lyoniresinol, respectively. The aryl tetralin lignan glycoside lyoniside was successfully transformed into lyoniresinol using W. fruticosa flowers, and their contents were simultaneously analyzed using the developed and validated HPTLC method.
  318. A Greenhouse-Gas Information System: Monitoring and Validating Emissions Reporting and Mitigation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jonietz, Karl K.; Dimotakis, Paul E.; Rotman, Douglas A.

    2011-09-26

    This study and report focus on the attributes of a greenhouse-gas information system (GHGIS) needed to support MRV&V needs. These needs set the function of such a system apart from scientific/research monitoring of GHGs and carbon-cycle systems, and include (not exclusively): the need for a GHGIS that is operational, as required for decision support; the need for a system that meets specifications derived from imposed requirements; the need for rigorous calibration, verification, and validation (CV&V) standards, processes, and records for all measurement and modeling/data-inversion data; the need to develop and adopt an uncertainty-quantification (UQ) regimen for all measurement and modeling data; and the requirement that GHGIS products can be subjected to third-party questioning and scientific scrutiny. This report examines and assesses presently available capabilities that could contribute to a future GHGIS. These capabilities include sensors and measurement technologies; data analysis and data uncertainty quantification (UQ) practices and methods; and model-based data-inversion practices, methods, and their associated UQ. The report further examines the need for traceable calibration, verification, and validation processes and attached metadata; differences between present science-/research-oriented needs and those that would be required for an operational GHGIS; the development, operation, and maintenance of a GHGIS missions-operations center (GMOC); and the complex systems engineering and integration that would be required to develop, operate, and evolve a future GHGIS.
  319. HPTLC Method for the Determination of Paracetamol, Pseudoephedrine and Loratidine in Tablets and Human Plasma

    PubMed Central

    Farid, Nehal Fayez; Abdelaleem, Eglal A.

    2016-01-01

    A sensitive, accurate and selective high performance thin layer chromatography (HPTLC) method was developed and validated for the simultaneous determination of paracetamol (PAR), its toxic impurity 4-aminophenol (4-AP), pseudoephedrine HCl (PSH) and loratidine (LOR). The proposed chromatographic method was developed using HPTLC aluminum plates precoated with silica gel 60 F254, with acetone-hexane-ammonia (4:5:0.1, by volume) as the developing system, followed by densitometric measurement at 254 nm for PAR, 4-AP and LOR, while PSH was scanned at 208 nm. System suitability testing parameters were calculated to ascertain the quality performance of the developed chromatographic method. The method was validated with respect to USP guidelines regarding accuracy, precision and specificity. The method was successfully applied to the determination of PAR, PSH and LOR in ATSHI® tablets. The three drugs were also determined in plasma by applying the proposed method in the ranges of 0.5-6 µg/band, 1.6-12 µg/band and 0.4-2 µg/band for PAR, PSH and LOR, respectively. The results obtained by the proposed method were compared with those obtained by a reported HPLC method, and there was no significant difference between the two methods regarding accuracy and precision.
  320. Matrix method for acoustic levitation simulation

    PubMed

    Andrade, Marco A B; Perez, Nicolas; Buiochi, Flavio; Adamowski, Julio C

    2011-08-01

    A matrix method is presented for simulating acoustic levitators. A typical acoustic levitator consists of an ultrasonic transducer and a reflector. The matrix method is used to determine the potential for the acoustic radiation force that acts on a small sphere in the standing wave field produced by the levitator. The method is based on the Rayleigh integral and takes into account the multiple reflections that occur between the transducer and the reflector. The potential for acoustic radiation force obtained by the matrix method is validated by comparing the matrix method results with those obtained by the finite element method for an axisymmetric model of a single-axis acoustic levitator. After validation, the method is applied to the simulation of a noncontact manipulation system consisting of two 37.9-kHz Langevin-type transducers and a plane reflector. The manipulation system allows control of the horizontal position of a small levitated sphere from -6 mm to 6 mm by changing the phase difference between the two transducers. The horizontal position of the sphere predicted by the matrix method agrees with the horizontal positions measured experimentally with a charge-coupled device camera. The main advantage of the matrix method is that it allows simulation of non-symmetric acoustic levitators without requiring much computational effort.
  321. [Assessment of an Evaluation System for Psychiatry Learning]

    PubMed

    Campo-Cabal, Gerardo

    2012-01-01

    Through the analysis of a teaching evaluation system for a psychiatry course aimed at medical students, the author reviews the basic elements taken into account in a teaching assessment process. An analysis was carried out of the assessment methods used, as well as of the grades obtained by the four groups into which the students were divided. The selected assessment methods are appropriate for evaluating the educational objectives; the contents are selected by means of a specification matrix; and there is a high correlation coefficient between the grades obtained in previous academic periods and those obtained in the course, demonstrating the validity of the results (whether considering the whole exam or just a part of it). Most of the students are on the right side of the grading curve, which means that the majority of them acquire the knowledge expected. The assessment system used in the psychopathology course is fair, valid and reliable, particularly concerning the objective methods used, but the conceptual evaluation should be improved or, preferably, eliminated as a constituent part of the evaluation system.

  322. WaferOptics® mass volume production and reliability

    NASA Astrophysics Data System (ADS)

    Wolterink, E.; Demeyer, K.

    2010-05-01

    The Anteryon WaferOptics® technology platform combines imaging optics designs, materials, and metrologies with wafer-level Semicon & MEMS production methods. WaferOptics® first required completely new system engineering. This system closes the loop between application requirement specifications, Anteryon product specifications, Monte Carlo analysis, process windows, process controls and supply reject criteria. For the Anteryon product Integrated Lens Stack (ILS), new design rules, test methods and control systems were assessed, implemented, validated and released for mass production by customers. This includes novel reflowable materials, the mastering process, replication, bonding, dicing, assembly, metrology, reliability programs and quality assurance systems. Many Design of Experiments studies were performed to assess correlations between optical performance parameters and machine settings of all process steps. Lens metrologies such as FFL, BFL, and MTF were adapted for wafer-level production, and wafer mapping was introduced for yield management. Test methods for screening and validating suitable optical materials were designed. Critical failure modes such as delamination and popcorning were assessed and modeled with FEM. Anteryon successfully integrated the different technologies, progressing from single prototypes to high-yield mass volume production. These parallel efforts resulted in a steep yield increase from 30% to over 90% in an 8-month period.
  323. Computer-assisted update of a consumer health vocabulary through mining of social network data

    PubMed

    Doing-Harris, Kristina M; Zeng-Treitler, Qing

    2011-05-17

    Consumer health vocabularies (CHVs) have been developed to aid consumer health informatics applications. This purpose is best served if the vocabulary evolves with consumers' language. Our objective was to create a computer-assisted update (CAU) system that works with live corpora to identify new candidate terms for inclusion in the open access and collaborative (OAC) CHV. The CAU system consists of three main parts: a Web crawler and an HTML parser; a candidate term filter that utilizes natural language processing tools, including term recognition methods; and a human review interface. In evaluation, the CAU system was applied to the health-related social network website PatientsLikeMe.com. The system's utility was assessed by comparing the candidate term list it generated to a list of valid terms hand-extracted from the text of the crawled webpages. The CAU system identified 88,994 unique 1- to 7-grams ("n-grams" are n consecutive words within a sentence) in 300 crawled PatientsLikeMe.com webpages. The manual review of the crawled webpages identified 651 valid terms not yet included in the OAC CHV or the Unified Medical Language System (UMLS) Metathesaurus, a collection of vocabularies amalgamated to form an ontology of medical terms (i.e., 1 valid term per 136.7 candidate n-grams). The term filter selected 774 candidate terms, of which 237 were valid, that is, 1 valid term among every 3 or 4 candidates reviewed. The CAU system is effective for generating a list of candidate terms for human review during CHV development.
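    The candidate-generation step (unique 1- to 7-grams drawn from sentence text) is straightforward to reproduce in outline. A minimal sketch of that step only, not the CAU system's actual filter:

        import re

        def candidate_ngrams(text, n_max=7):
            """Return unique 1- to n_max-grams from each sentence of `text`."""
            grams = set()
            for sentence in re.split(r"[.!?]+", text.lower()):
                words = re.findall(r"[a-z']+", sentence)
                for n in range(1, n_max + 1):
                    for i in range(len(words) - n + 1):
                        grams.add(" ".join(words[i:i + n]))
            return grams

        text = "I get brain fog after my MS meds. Brain fog is the worst."
        terms = candidate_ngrams(text)
        print(len(terms), "candidate n-grams;", "'brain fog' found:", "brain fog" in terms)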
  324. System and method for islanding detection and prevention in distributed generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhowmik, Shibashis; Mazhari, Iman; Parkhideh, Babak

    Various examples are directed to systems and methods for detecting an islanding condition at an inverter configured to couple a distributed generation system to an electrical grid network. A controller may determine a command frequency and a command frequency variation. The controller may determine that the command frequency variation indicates a potential islanding condition and send the inverter an instruction to disconnect the distributed generation system from the electrical grid network. When the distributed generation system is disconnected from the electrical grid network, the controller may determine whether the grid network is valid.
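    The decision logic summarized above reduces to monitoring the spread of recent command-frequency samples against a tolerance. The sketch below is an illustrative reconstruction of that control flow, not the patented implementation; the window size and threshold are invented for the example:

        from collections import deque

        class IslandingMonitor:
            """Flags a potential islanding condition from command-frequency drift."""
            def __init__(self, window=10, max_variation_hz=0.5):
                self.history = deque(maxlen=window)
                self.max_variation = max_variation_hz

            def update(self, command_frequency_hz):
                self.history.append(command_frequency_hz)
                variation = max(self.history) - min(self.history)
                return variation > self.max_variation  # True -> disconnect inverter

        monitor = IslandingMonitor()
        for f in [60.00, 60.01, 60.02, 60.3, 60.7]:   # drifting command frequency
            if monitor.update(f):
                print(f"potential islanding at {f} Hz: instruct inverter to disconnect")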
  325. Large Aircraft Robotic Paint Stripping (LARPS) system and the high pressure water process

    NASA Astrophysics Data System (ADS)

    See, David W.; Hofacker, Scott A.; Stone, M. Anthony; Harbaugh, Darcy

    1993-03-01

    The aircraft maintenance industry is beset by new Environmental Protection Agency (EPA) guidelines on air emissions, Occupational Safety and Health Administration (OSHA) standards, dwindling labor markets, Federal Aviation Administration (FAA) safety guidelines, and increased operating costs. In light of these factors, the USAF's Wright Laboratory Manufacturing Technology Directorate and the Aircraft Division of the Oklahoma City Air Logistics Center initiated a MANTECH/REPTECH effort to automate an alternative paint removal method and eliminate the current manual methylene chloride chemical stripping methods. This paper presents some of the background and history of the LARPS program, describes the LARPS system, documents the projected operational flow, quantifies some of the projected system benefits, and describes the high pressure water stripping process. Certification of an alternative paint removal method to replace the current chemical process is being performed in two phases: Process Optimization and Process Validation. This paper also presents the results of the Process Optimization for metal substrates. Data on the coating removal rate, residual stresses, surface roughness, preliminary process envelopes, and technical plans for Process Validation testing are discussed.

  326. Validation of an ergonomic method to withdraw [99mTc] radiopharmaceuticals

    PubMed

    Blondeel-Gomes, Sandy; Marie, Solène; Fouque, Julien; Loyeau, Sabrina; Madar, Olivier; Lokiec, François

    2017-11-10

    The main objective of the present work was to ensure the quality of radiopharmaceutical syringes withdrawn with a "spinal needle/obturator In-Stopper" system. Methods: Visual examinations and physicochemical tests were performed at T0 and T+4h for [99mTc]albumin nanocolloid, and at T0 and T+7h for [99mTc]eluate, [99mTc]HydroxyMethylene DiPhosphonate and [99mTc]Human Serum Albumin. Microbiological validation was performed according to the European Pharmacopoeia. Fingertip radiation exposure was evaluated to confirm the safety of the system. Results: The results show stable visual and physicochemical properties. The integrity of the connector was not affected after 30 punctures (no cores). No microbiological contamination was found in the tested syringes. Conclusion: The system could be used 30 times. The stability of syringes drawn with this method is guaranteed up to 4 hours for [99mTc]albumin nanocolloid and 7 hours for [99mTc]eluate, [99mTc]HydroxyMethylene DiPhosphonate and [99mTc]Human Serum Albumin.

  327. Design of a Competency Evaluation Model for Clinical Nursing Practicum, Based on Standardized Language Systems: Psychometric Validation Study

    PubMed

    Iglesias-Parra, Maria Rosa; García-Guerrero, Alfonso; García-Mayor, Silvia; Kaknani-Uttumchandani, Shakira; León-Campos, Álvaro; Morales-Asencio, José Miguel

    2015-07-01

    The objective was to develop an evaluation system of clinical competencies for the practicum of nursing students, based on the Nursing Interventions Classification (NIC). Psychometric validation study: the first two phases addressed definition and content validation, and the third phase consisted of a cross-sectional study analyzing reliability. The study population was undergraduate nursing students and clinical tutors. Through the Delphi technique, 26 competencies and 91 interventions were isolated. Cronbach's α was 0.96. Factor analysis yielded 18 factors that explained 68.82% of the variance. Overall inter-item correlation was 0.26, and item-total correlations ranged between 0.19 and 0.66. A competency system for the nursing practicum, structured on the NIC, is a reliable method for assessing and evaluating clinical competencies. Further evaluations in other contexts are needed. The availability of standardized language systems in the nursing discipline provides an ideal framework for developing nursing curricula.
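    The reported Cronbach's α of 0.96 comes from the standard internal-consistency formula α = k/(k−1) · (1 − Σs²ᵢ/s²ₜ), where s²ᵢ are the item variances and s²ₜ is the variance of the summed scale. A small sketch with simulated rating data (not the study's dataset):

        import numpy as np

        def cronbach_alpha(scores):
            """scores: (n_respondents, k_items) matrix of item ratings."""
            scores = np.asarray(scores, dtype=float)
            k = scores.shape[1]
            item_vars = scores.var(axis=0, ddof=1)        # per-item variances
            total_var = scores.sum(axis=1).var(ddof=1)    # variance of summed scale
            return k / (k - 1) * (1 - item_vars.sum() / total_var)

        rng = np.random.default_rng(2)
        latent = rng.normal(size=(100, 1))                 # shared competence factor
        items = latent + 0.4 * rng.normal(size=(100, 12))  # 12 correlated items
        print(f"alpha = {cronbach_alpha(items):.2f}")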
  328. Protecting a quantum state from environmental noise by an incompatible finite-time measurement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brasil, Carlos Alexandre; Castro, L. A. de; Napolitano, R. d. J.

    We show that measurements of finite duration performed on an open two-state system can protect the initial state from a phase-noisy environment, provided the measured observable does not commute with the perturbing interaction. When the measured observable commutes with the environmental interaction, the finite-duration measurement accelerates the rate of decoherence induced by the phase noise. For the description of the measurement of an observable that is incompatible with the interaction between system and environment, we have found an approximate analytical expression, valid at zero temperature and weak coupling with the measuring device. We have tested the validity of the analytical predictions against an exact numerical approach, based on the superoperator-splitting method, that confirms the protection of the initial state of the system. When the coupling between the system and the measuring apparatus increases beyond the range of validity of the analytical approximation, the initial state is still protected by the finite-time measurement, in agreement with the exact numerical calculations.

  329. A framework to assess management performance in district health systems: a qualitative and quantitative case study in Iran

    PubMed

    Tabrizi, Jafar Sadegh; Gholipour, Kamal; Iezadi, Shabnam; Farahbakhsh, Mostafa; Ghiasi, Akbar

    2018-01-01

    The aim was to design a district health management performance framework for Iran's healthcare system. This mixed-method study was conducted between September 2015 and May 2016 in Tabriz, Iran. The indicators of district health management performance were obtained by analyzing 45 semi-structured surveys of experts in the public health system. In the qualitative part, the content validity of the performance indicators was reviewed and confirmed based on the content validity index (CVI); in the quantitative part, the content validity ratio (CVR) was calculated using data acquired from a survey of 21 experts. Initially, 81 indicators were considered in the district health management performance framework; in the end, 53 indicators were validated and confirmed. These indicators were classified into 11 categories: human resources and organizational creativity; management and leadership; rules and ethics; planning and evaluation; district managing; health resources management and economics; community participation; quality improvement; research in the health system; health information management; and epidemiology and situation analysis. The designed framework can be used to assess district health management and facilitates performance improvement at the district level.
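    The two content-validation statistics named above follow fixed formulas: the item-level CVI is the proportion of experts rating an item relevant (typically 3 or 4 on a 1-4 scale), and Lawshe's CVR is (nₑ − N/2)/(N/2), where nₑ of N experts call the item essential. A minimal sketch with hypothetical panel data:

        def item_cvi(ratings, relevant=(3, 4)):
            """Item-level content validity index on a 1-4 relevance scale."""
            return sum(r in relevant for r in ratings) / len(ratings)

        def lawshe_cvr(n_essential, n_experts):
            """Lawshe's content validity ratio: (n_e - N/2) / (N/2)."""
            return (n_essential - n_experts / 2) / (n_experts / 2)

        ratings = [4, 4, 3, 2, 4, 3, 4, 4, 3, 4]    # hypothetical panel of 10 experts
        print(f"I-CVI = {item_cvi(ratings):.2f}")   # 0.90
        print(f"CVR = {lawshe_cvr(n_essential=17, n_experts=21):.2f}")  # panel of 21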
  330. Machine intelligence and autonomy for aerospace systems

    NASA Technical Reports Server (NTRS)

    Heer, Ewald (Editor); Lum, Henry (Editor)

    1988-01-01

    The present volume discusses progress toward intelligent robot systems in aerospace applications, NASA Space Program automation and robotics efforts, the supervisory control of telerobotics in space, machine intelligence and crew/vehicle interfaces, expert-system terms and building tools, and knowledge acquisition for autonomous systems. Also discussed are methods for the validation of knowledge-based systems, a design methodology for knowledge-based management systems, knowledge-based simulation for aerospace systems, knowledge-based diagnosis, planning and scheduling methods in AI, the treatment of uncertainty in AI, vision-sensing techniques in aerospace applications, image-understanding techniques, tactile sensing for robots, distributed sensor integration, and the control of articulated and deformable space structures.

  331. Calibration of an arbitrarily arranged projection moiré system for 3D shape measurement

    NASA Astrophysics Data System (ADS)

    Tang, Ying; Yao, Jun; Zhou, Yihao; Sun, Chen; Yang, Peng; Miao, Hong; Chen, Jubing

    2018-05-01

    An arbitrarily arranged projection moiré system is presented for three-dimensional shape measurement. We develop a model for the projection moiré system and derive a universal formula expressing the relation between height and the phase variation before and after the object is placed on the reference plane. With so many system parameters involved, a system calibration technique is needed. In this work, we provide a robust and accurate calibration method for an arbitrarily arranged projection moiré system; the system no longer places restrictions on the configuration of the optical setup. Real experiments have been conducted to verify the validity of the method.

  332. System and method for modeling and analyzing complex scenarios

    DOEpatents

    Shevitz, Daniel Wolf

    2013-04-09

    An embodiment of the present invention includes a method for analyzing and solving a possibility tree. A possibility tree having a plurality of programmable nodes is constructed and solved with a solver module executed by a processor element. The solver module executes the programming of the nodes and tracks the state of at least one variable through each branch. When a variable of a branch is out of tolerance with a parameter, the solver disables the remaining nodes of the branch and marks the branch as an invalid solution. The valid solutions are then aggregated and displayed as valid tree solutions.
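    The branch-pruning behaviour described in the patent abstract can be sketched as a depth-first walk that abandons a branch as soon as the tracked variable leaves tolerance. This is an illustrative reconstruction, not the patented implementation; node programming is modeled as simple state-update functions:

        class Node:
            def __init__(self, name, update, children=()):
                self.name = name          # node label
                self.update = update      # programmable step: state -> state
                self.children = children  # child nodes

        def solve(node, state, lo, hi, path=()):
            """Yield root-to-leaf paths whose tracked variable 'x' stays in [lo, hi]."""
            state = node.update(dict(state))
            path = path + (node.name,)
            if not (lo <= state["x"] <= hi):
                return                    # out of tolerance: disable remaining branch
            if not node.children:
                yield path                # valid leaf solution
            for child in node.children:
                yield from solve(child, state, lo, hi, path)

        tree = Node("root", lambda s: s, (
            Node("a", lambda s: {**s, "x": s["x"] + 5},
                 (Node("a1", lambda s: {**s, "x": s["x"] * 3}),)),   # 15 -> pruned
            Node("b", lambda s: {**s, "x": s["x"] - 1},
                 (Node("b1", lambda s: {**s, "x": s["x"] + 2}),)),   # 1 -> valid
        ))
        print(list(solve(tree, {"x": 0}, lo=-10, hi=10)))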
  334. Mediating the Cognitive Walkthrough with Patient Groups to achieve Personalized Health in Chronic Disease Self-Management System Evaluation.

    PubMed

    Georgsson, Mattias; Kushniruk, Andre

    2016-01-01

    The cognitive walkthrough (CW) is a task-based, expert-inspection usability evaluation method with benefits such as cost effectiveness and efficiency. A drawback of the method is that it does not capture the perspective of real users; instead it rests on experts' predictions about the usability of the system and how users will interact with it. In this paper, we propose a way of involving the user in an expert evaluation method by modifying the CW to use patient groups as mediators. This, along with other modifications, includes a dual-domain session facilitator, specific patient groups, and three phases: (1) a preparation phase in which suitable tasks are developed by a panel of experts and patients and validated through the content validity index; (2) a patient user evaluation phase comprising an individual and a collaborative process part; and (3) an analysis and coding phase in which all data are digitized and synthesized using Qualitative Data Analysis Software (QDAS) to determine usability deficiencies. We predict that this way of evaluating will retain the benefits of expert methods while also providing a way of including the patient users of these self-management systems. Results from this prospective study should provide evidence of the usefulness of this method modification.

  335. A correction method for the axial maladjustment of transmission-type optical system based on aberration theory

    NASA Astrophysics Data System (ADS)

    Xu, Chunmei; Huang, Fu-yu; Yin, Jian-ling; Chen, Yu-dan; Mao, Shao-juan

    2016-10-01

    The influence of aberration on the misalignment of an optical system is considered in full, the deficiencies of the Gaussian-optics correction method are pointed out, and a correction method for transmission-type misaligned optical systems is proposed based on aberration theory. The variation of single-lens aberration caused by axial displacement is analyzed, and the aberration effect is defined. On this basis, by calculating the lens adjustment induced by the image-position error and the magnification error, a misalignment correction formula constrained by the aberrations is deduced mathematically. Taking a three-lens collimation system as an example, a test was carried out that validates this method and demonstrates its advantages.

  336. Development and validation of a multiplex quantitative polymerase chain reaction assay for the detection of Mollicutes impurities in human cells, cultured under good manufacturing practice conditions, and following European Pharmacopoeia requirements and the International Conference on Harmonization guidelines.

    PubMed

    Vanni, Irene; Ugolotti, Elisabetta; Raso, Alessandro; Di Marco, Eddi; Melioli, Giovanni; Biassoni, Roberto

    2012-07-01

    In vitro manipulated cultured cells and their precursors are often used in therapeutic trials.
    However, tissue cultures can easily be contaminated by the ubiquitous Mollicutes micro-organisms, which can cause various and severe alterations in cellular function. Methods able to detect and trace Mollicutes impurities contaminating cell cultures are therefore required before any attempt to grow cells under good manufacturing practice (GMP) conditions. We developed a multiplex quantitative polymerase chain reaction (qPCR) assay, specific for the 16S-23S rRNA intergenic spacer regions and for the Tuf and P1 cytoadhesin genes, able to detect contaminant Mollicutes species in a single-tube reaction. The system was validated by analyzing different cell lines, and the positive samples were confirmed by 16S and P1 cytoadhesin gene dideoxy sequencing. Our multiplex qPCR detection system reached a sensitivity, specificity and robustness comparable with the culture method and the indicator cell culture method, as required by the European Pharmacopoeia guidelines. The method was validated following the International Conference on Harmonization (ICH) guidelines as a qualitative limit test for impurities, assessing the validation characteristics of limit of detection and specificity; it also follows the European Pharmacopoeia guidelines and Food and Drug Administration (FDA) requirements.

  337. Advanced Atmospheric Water Vapor DIAL Detection System

    NASA Technical Reports Server (NTRS)

    Refaat, Tamer F.; Elsayed-Ali, Hani E.; DeYoung, Russell J. (Technical Monitor)

    2000-01-01

    Measurement of atmospheric water vapor is very important for understanding the Earth's climate and water cycle. The remote-sensing Differential Absorption Lidar (DIAL) technique is a powerful method for performing such measurements from aircraft and space. This thesis describes a new advanced detection system that incorporates major improvements in sensitivity and size, including a low-noise advanced avalanche photodiode detector, a custom analog circuit, a 14-bit digitizer, a microcontroller for on-board averaging and, finally, a fast computer interface. The thesis describes the design and validation of this new water vapor DIAL detection system, which was integrated onto a small printed circuit board (PCB) with minimal weight and power consumption. The detection system was validated by comparing its measurements against an existing DIAL system for aerosol and water vapor profiling.
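    The record above does not reproduce the retrieval itself, so for context the sketch below implements the textbook two-wavelength DIAL estimate of absorber number density from on-line and off-line return powers. The profile, bin size and differential cross-section are invented, and the formula is the standard one rather than anything specific to this thesis.

        import numpy as np

        def dial_number_density(p_on, p_off, dz, dsigma):
            """n(z) = ln[(P_on(z) P_off(z+dz)) / (P_on(z+dz) P_off(z))] / (2 dsigma dz)."""
            ratio = (p_on[:-1] * p_off[1:]) / (p_on[1:] * p_off[:-1])
            return np.log(ratio) / (2.0 * dsigma * dz)

        z = np.arange(0.0, 3000.0, 30.0)
        p_off = np.exp(-z / 8000.0)                 # toy off-line lidar return
        p_on = np.exp(-z / 8000.0 - 1e-4 * z)       # extra water-vapor absorption
        n = dial_number_density(p_on, p_off, dz=30.0, dsigma=1e-26)
        print(n[:3])    # ~5e21 m^-3, constant for this synthetic profile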
  338. The Physician Recommendation Coding System (PhyReCS): A Reliable and Valid Method to Quantify the Strength of Physician Recommendations During Clinical Encounters

    PubMed Central

    Scherr, Karen A.; Fagerlin, Angela; Williamson, Lillie D.; Davis, J. Kelly; Fridman, Ilona; Atyeo, Natalie; Ubel, Peter A.

    2016-01-01

    Background: Physicians' recommendations affect patients' treatment choices. However, most research relies on physicians' or patients' retrospective reports of recommendations, which offer a limited perspective and suffer from limitations such as recall bias. Objective: To develop a reliable and valid method for measuring the strength of physician recommendations through direct observation of clinical encounters. Methods: Clinical encounters (n = 257) were recorded as part of a larger study of prostate cancer decision making. We used an iterative process to create the 5-point Physician Recommendation Coding System (PhyReCS). To determine reliability, research assistants double-coded 50 transcripts. To establish content validity, we used one-way ANOVAs to determine whether relative treatment recommendation scores differed as a function of which treatment patients received. To establish concurrent validity, we examined whether patients' perceived treatment recommendations matched our coded recommendations. Results: The PhyReCS was highly reliable (Krippendorff's alpha = .89, 95% CI [.86, .91]). The average relative treatment recommendation score for each treatment was higher for individuals who received that particular treatment. For example, the average relative surgery recommendation score was higher for individuals who received surgery versus radiation (mean difference = .98, SE = .18, p < .001) or active surveillance (mean difference = 1.10, SE = .14, p < .001). Patients' perceived recommendations matched coded recommendations 81% of the time. Conclusion: The PhyReCS is a reliable and valid way to capture the strength of physician recommendations. We believe the PhyReCS would be helpful for other researchers who wish to study physician recommendations, an important part of patient decision making. PMID:27343015
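    Both statistics reported for the PhyReCS are easy to reproduce on toy data. The sketch below uses the third-party krippendorff package for the inter-rater reliability check and scipy for the one-way ANOVA used in the content-validity check; the package choice, the ordinal treatment and all scores are illustrative assumptions, not the paper's data or code.

        import krippendorff
        from scipy.stats import f_oneway

        # Two coders' 5-point recommendation scores for the same ten transcripts.
        coder_a = [1, 2, 5, 4, 3, 2, 1, 5, 4, 3]
        coder_b = [1, 2, 4, 4, 3, 2, 1, 5, 5, 3]
        alpha = krippendorff.alpha(reliability_data=[coder_a, coder_b],
                                   level_of_measurement="ordinal")

        # Relative surgery-recommendation scores grouped by treatment received.
        surgery, radiation, surveillance = [4, 5, 4, 5], [2, 3, 2, 1], [1, 2, 1, 2]
        f, p = f_oneway(surgery, radiation, surveillance)
        print(round(alpha, 2), round(f, 1), round(p, 4))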
  339. Compartment Venting Analyses of Ares I First Stage Systems Tunnel

    NASA Technical Reports Server (NTRS)

    Wang, Qunzhen; Arner, Stephen

    2009-01-01

    Compartment venting analyses have been performed for the Ares I first-stage systems tunnel using both the lumped-parameter method and a three-dimensional (3D) transient computational fluid dynamics (CFD) approach. The main objective of venting analyses is to predict the magnitude of the differential pressure across the skin so that the integrity of the solid walls can be evaluated and properly designed for. The lumped-parameter method assumes that the gas pressure and temperature inside the systems tunnel are spatially uniform, which is questionable since the tunnel is about 1,700 in. long and 4 in. wide. Therefore, 3D transient CFD simulations using the commercial CFD code FLUENT were performed to examine the gas pressure and temperature variations inside the tunnel. It was found that the uniform-pressure and uniform-temperature assumptions are valid during ascent. During reentry, the uniform-pressure assumption is also reasonable, but the uniform-temperature assumption is not valid. Predicted pressures and temperatures inside the systems tunnel from CFD are also compared with those from the lumped-parameter method using the NASA code CHCHVENT. In general, the average pressure and temperature inside the systems tunnel from CFD fall between the burst and crush results from CHCHVENT during both ascent and reentry. The skin differential pressure, the tunnel pressure relative to freestream pressure from CHCHVENT, and the velocity vectors and streamlines are also discussed in detail.

  340. Motivating medical information system performance by system quality, service quality, and job satisfaction for evidence-based practice

    PubMed Central

    2012-01-01

    Background: No previous study has addressed the integrated relationships among system quality, service quality, job satisfaction, and system performance; this study attempts to bridge that gap with an evidence-based practice study. Methods: Convenience sampling was applied to the information-system users of three hospitals in southern Taiwan. A total of 500 questionnaires were distributed, and 283 valid copies were returned, a valid response rate of 56.6%. SPSS 17.0 and AMOS 17.0 (structural equation modeling) were used for data analysis and processing. Results: System quality has a positive influence on service quality (γ11 = 0.55), job satisfaction (γ21 = 0.32), and system performance (γ31 = 0.47). Service quality (β31 = 0.38) and job satisfaction (β32 = 0.46) positively influence system performance. Conclusions: It is therefore recommended that hospital information offices and system developers take enhancement of service quality and user satisfaction into consideration, in addition to placing emphasis on system quality and information quality, when designing, developing, or purchasing an information system, in order to improve the benefits and achievements generated by hospital information systems.
    PMID:23171394

  341. Validation of a laboratory and hospital information system in a medical laboratory accredited according to ISO 15189

    PubMed Central

    Biljak, Vanja Radisic; Ozvald, Ivan; Radeljak, Andrea; Majdenic, Kresimir; Lasic, Branka; Siftar, Zoran; Lovrencic, Marijana Vucic; Flegar-Mestric, Zlata

    2012-01-01

    Introduction: The aim of the study was to present a protocol for laboratory information system (LIS) and hospital information system (HIS) validation at the Institute of Clinical Chemistry and Laboratory Medicine of the Merkur University Hospital, Zagreb, Croatia. Materials and methods: The validity of data traceability was checked by entering all test requests for a virtual patient into the HIS/LIS and printing the corresponding barcoded labels, which provided the laboratory analyzers with the information on the requested tests. The original printouts of the test results from the laboratory analyzer(s) were compared with the data obtained from the LIS and entered into the provided template. Transfer of data from the LIS to the HIS was examined by requesting all tests in the HIS and creating real data in a report generated in the LIS. Data obtained from the LIS and the HIS were entered into a corresponding template. The main outcome measure was the accuracy of the transfer from the laboratory analyzers and of the results transferred between the LIS and the HIS, expressed as a percentage (%). Results: The accuracy of data transfer from the laboratory analyzers to the LIS was 99.5%, and that from the LIS to the HIS was 100%. Conclusion: We presented our established validation protocol for a laboratory information system and demonstrated that the system meets its intended purpose.
    PMID:22384522

  342. Validation of a method for real time foot position and orientation tracking with Microsoft Kinect technology for use in virtual reality and treadmill based gait training programs.

    PubMed

    Paolini, Gabriele; Peruzzi, Agnese; Mirelman, Anat; Cereatti, Andrea; Gaukrodger, Stephen; Hausdorff, Jeffrey M; Della Croce, Ugo

    2014-09-01

    The use of virtual reality for the provision of motor-cognitive gait training has been shown to be effective for a variety of patient populations. The interaction between the user and the virtual environment is achieved by tracking the motion of the body parts and replicating it in the virtual environment in real time. In this paper, we present the validation of a novel method for tracking foot position and orientation in real time, based on Microsoft Kinect technology, to be used for gait training combined with virtual reality. The motion-tracking method was validated by comparing its performance against a stereo-photogrammetric system used as the gold standard. Foot position errors were on the order of a few millimeters (average RMSD from 4.9 to 12.1 mm in the medio-lateral and vertical directions, and from 19.4 to 26.5 mm in the anterior-posterior direction); the foot orientation errors were also small (average %RMSD from 5.6% to 8.8% in the medio-lateral and vertical directions, and from 15.5% to 18.6% in the anterior-posterior direction). The results suggest that the proposed method can be used effectively to track foot motion in virtual reality and treadmill-based gait training programs.
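    The agreement statistic quoted in this record is a per-axis RMSD between the Kinect-tracked trajectory and the stereo-photogrammetric one. A minimal sketch, with invented stand-ins for the real recordings:

        import numpy as np

        def rmsd(tracked, reference):
            """Root-mean-square deviation between two equally sampled trajectories."""
            tracked, reference = np.asarray(tracked), np.asarray(reference)
            return np.sqrt(np.mean((tracked - reference) ** 2))

        t = np.linspace(0, 2 * np.pi, 200)
        reference = 100.0 * np.sin(t)                             # gold standard [mm]
        tracked = reference + np.random.normal(0.0, 5.0, t.size)  # noisy estimate
        print(f"RMSD = {rmsd(tracked, reference):.1f} mm")        # ~5 mm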
  343. A method to validate quantitative high-frequency power doppler ultrasound with fluorescence in vivo video microscopy.

    PubMed

    Pinter, Stephen Z; Kim, Dae-Ro; Hague, M Nicole; Chambers, Ann F; MacDonald, Ian C; Lacefield, James C

    2014-08-01

    Flow quantification with high-frequency (>20 MHz) power Doppler ultrasound can be performed objectively by using the wall-filter selection curve (WFSC) method to select the cutoff velocity that yields a best-estimate color pixel density (CPD). An in vivo video microscopy (IVVM) system is combined with high-frequency power Doppler ultrasound to provide a method for validating CPD measurements based on WFSCs in mouse testicular vessels. The ultrasound and IVVM systems are instrumented so that the mouse remains on the same imaging platform when switching between the two modalities. In vivo video microscopy provides gold-standard measurements of vascular diameter with which to validate the power Doppler CPD estimates. Measurements in four image planes from three mice exhibit wide variation in the optimal cutoff velocity and indicate that a predetermined cutoff-velocity setting can introduce significant errors in studies intended to quantify vascularity. Consistent with previously published flow-phantom data, the in vivo WFSCs exhibited three characteristic regions and detectable plateaus. Selecting a cutoff velocity at the right end of the plateau yielded a CPD close to the gold-standard vascular volume fraction estimated using IVVM. An investigator can implement the WFSC method to adapt the cutoff velocity to current blood-flow conditions and thereby improve the accuracy of power Doppler for quantitative microvascular imaging.
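    The cutoff-selection step of the WFSC method lends itself to a short sketch: sweep the wall-filter cutoff velocity, record the CPD, and take the right end of the widest near-flat region as the operating cutoff. The synthetic CPD curve and flatness threshold below are assumptions for illustration, not the paper's data or algorithm.

        import numpy as np

        def plateau_right_end(cutoffs, cpd, slope_tol=0.5):
            """Return the cutoff at the right end of the widest low-slope run."""
            slopes = np.abs(np.diff(cpd) / np.diff(cutoffs))
            flat = slopes < slope_tol            # True where the curve is ~flat
            best_len, best_end, run_start = 0, None, None
            for i, f in enumerate(flat):
                if f and run_start is None:
                    run_start = i
                if (not f or i == len(flat) - 1) and run_start is not None:
                    end = i if f else i - 1
                    if end - run_start + 1 > best_len:
                        best_len, best_end = end - run_start + 1, end + 1
                    run_start = None
            return cutoffs[best_end] if best_end is not None else None

        cutoffs = np.linspace(0.5, 10.0, 20)          # mm/s, synthetic sweep
        cpd = np.r_[np.linspace(40, 22, 6),           # region 1: falling
                    np.full(8, 21.0),                 # region 2: plateau
                    np.linspace(20, 5, 6)]            # region 3: falling
        print(plateau_right_end(cutoffs, cpd))        # 7.0, the plateau's right end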
  344. Intelligent model-based diagnostics for vehicle health management

    NASA Astrophysics Data System (ADS)

    Luo, Jianhui; Tu, Fang; Azam, Mohammad S.; Pattipati, Krishna R.; Willett, Peter K.; Qiao, Liu; Kawamoto, Masayuki

    2003-08-01

    The recent advances in sensor technology, remote communication and computational capabilities, and standardized hardware/software interfaces are creating a dramatic shift in the way the health of vehicles is monitored and managed. These advances facilitate remote monitoring, diagnosis and condition-based maintenance of automotive systems. With the increased sophistication of electronic control systems in vehicles, there is a concomitant increased difficulty in identifying malfunction phenomena. Consequently, the current rule-based diagnostic systems are difficult to develop, validate and maintain. New intelligent, model-based diagnostic methodologies that exploit the advances in sensor, telecommunications, computing and software technologies are needed. In this paper, we investigate hybrid model-based techniques that seamlessly employ quantitative (analytical) models and graph-based dependency models for intelligent diagnosis. Automotive engineers have found quantitative simulation (e.g., MATLAB/SIMULINK) to be a vital tool in the development of advanced control systems. The hybrid method exploits this capability to improve the diagnostic system's accuracy and consistency, utilizes existing validated knowledge on rule-based methods, enables remote diagnosis, and responds to the challenges of increased system complexity. The solution is generic and has the potential for application in a wide range of systems.

  345. Nuclear Energy Knowledge and Validation Center (NEKVaC) Needs Workshop Summary Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gougar, Hans

    2015-02-01

    The Department of Energy (DOE) has made significant progress developing simulation tools to predict the behavior of nuclear systems with greater accuracy and increasing our capability to predict the behavior of these systems outside of the standard range of applications. These analytical tools require a more complex array of validation tests to accurately simulate the physics across multiple length and time scales. Results from modern simulations will allow experiment designers to narrow the range of conditions needed to bound system behavior and to optimize the deployment of instrumentation to limit the breadth and cost of the campaign. Modern validation, verification and uncertainty quantification (VVUQ) techniques enable analysts to extract information from experiments in a systematic manner and provide the users with a quantified uncertainty estimate. Unfortunately, the capability to perform experiments that would enable taking full advantage of the formalisms of these modern codes has progressed relatively little (with some notable exceptions in fuels and thermal-hydraulics); the majority of the experimental data available today is the "historic" data accumulated over the last decades of nuclear systems R&D. A validated code-model is a tool for users; an unvalidated code-model is useful for code developers to gain understanding, publish research results, attract funding, etc. As nuclear analysis codes have become more sophisticated, so have the measurement and validation methods and the challenges that confront them. A successful yet cost-effective validation effort requires expertise possessed only by a few, resources possessed only by the well-capitalized (or a willing collective), and a clear, well-defined objective (validating a code that is developed to satisfy the needs of an actual user). To that end, the Idaho National Laboratory established the Nuclear Energy Knowledge and Validation Center (NEKVaC or the 'Center') to address the challenges of modern code validation and to manage the knowledge from past, current, and future experimental campaigns. By pulling together the best minds involved in code development, experiment design, and validation to establish and disseminate best practices and new techniques, the Center will be a resource for industry, DOE programs, and academia validation efforts.

  346. Challenges in validating the sterilisation dose for processed human amniotic membranes

    NASA Astrophysics Data System (ADS)

    Yusof, Norimah; Hassan, Asnah; Firdaus Abd Rahman, M. N.; Hamid, Suzina A.

    2007-11-01

    Most tissue banks in the Asia-Pacific region have been using ionising radiation at 25 kGy to sterilise human tissues for safe clinical use. Under a tissue-banking quality system, any dose employed for sterilisation has to be validated, and the validation exercise has to be part of the quality documentation. Tissue grafts, unlike medical items, are not produced in large numbers per processing batch, and tissues have a relatively different microbial population. A Code of Practice established by the International Atomic Energy Agency (IAEA) in 2004 offers several validation methods that use smaller numbers of samples than ISO 11137 (1995), which is meant for medical products.
    The methods emphasise bioburden determination, followed by a sterility test on samples after exposure to a verification dose chosen to attain a sterility assurance level (SAL) of 10^-1. This paper describes our experience in using the IAEA Code of Practice to conduct the validation exercise substantiating 25 kGy as the sterilisation dose for both air-dried amnion and amnion preserved in 99% glycerol.
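    For context, verification-dose logic rests on the usual exponential inactivation model: survivors after dose D follow N(D) = N0 x 10^(-D/D10), so the dose needed to reach a given sterility assurance level is D = D10 x log10(N0/SAL). The sketch below states that textbook relation in code; the D10 value and bioburden are invented, and the IAEA Code's actual dose-setting tables are not reproduced here.

        import math

        def dose_for_sal(bioburden, d10, sal):
            """Dose (kGy) to reduce an initial bioburden to the target SAL."""
            return d10 * math.log10(bioburden / sal)

        # A verification-style check at SAL 10^-1 versus a sterilisation SAL of 10^-6:
        print(round(dose_for_sal(bioburden=100.0, d10=2.0, sal=1e-1), 1))  # 6.0 kGy
        print(round(dose_for_sal(bioburden=100.0, d10=2.0, sal=1e-6), 1))  # 16.0 kGy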
  347. Development and validation of a headspace gas chromatographic method for the determination of residual solvents in arterolane (RBx11160) maleate bulk drug

    PubMed Central

    Gupta, Abhishek; Singh, Yogendra; Srinivas, Kona S.; Jain, Garima; Sreekumar, V. B.; Semwal, Vinod Prasad

    2010-01-01

    Objective: Arterolane maleate is an antimalarial drug currently under Phase III clinical evaluation; it offers a simple, economical and scalable synthesis and does not suffer from safety problems. Arterolane maleate is more active than artemisinin and cheap to produce, and it has a longer lifetime in the plasma, so it stays active longer in the body. To provide quality control over the manufacture of any API, it is essential to develop highly selective analytical methods. In the current article we report the development and validation of a rapid and specific headspace gas chromatographic (HSGC) method for the determination of organic volatile impurities (residual solvents) in arterolane maleate bulk drug. Materials and Methods: Method development and validation were performed on a Perkin Elmer gas chromatographic system equipped with a flame ionization detector (FID) and a headspace analyzer. The method involved thermal-gradient elution of the ten residual solvents present in the arterolane maleate salt on an RTx-624, 30 m × 0.32 mm, 1.8 μm column, using nitrogen as the carrier gas at a flow rate of 0.5 ml/min and flame ionization detection. Results: During method validation, parameters such as precision, linearity, accuracy, limits of quantification and detection, and specificity were evaluated and remained within acceptable limits. Conclusions: The method has been successfully applied for quantifying the residual solvents present in arterolane maleate bulk drug, and it presents a simple and reliable solution for their routine quantitative analysis. PMID:21814428

  348. Development of an integrated laboratory system for the monitoring of cyanotoxins in surface and drinking waters.

    PubMed

    Triantis, Theodoros; Tsimeli, Katerina; Kaloudis, Triantafyllos; Thanassoulias, Nicholas; Lytras, Efthymios; Hiskia, Anastasia

    2010-05-01

    A system of analytical processes has been developed to serve as a cost-effective scheme for the quantitative monitoring of cyanobacterial toxins in surface and drinking waters. Five cyclic peptide hepatotoxins (microcystin-LR, -RR, -YR, -LA and nodularin) were chosen as the target compounds. Two different enzyme-linked immunosorbent assays (ELISA) were validated to serve as primary quantitative screening tools. Validation results showed that the ELISA methods are sufficiently specific and sensitive, with limits of detection (LODs) around 0.1 microg/L; however, matrix effects should be considered, especially with surface-water samples or methanolic extracts of bacterial mass. A colorimetric protein phosphatase inhibition assay (PPIA), utilizing protein phosphatase 2A and p-nitrophenyl phosphate as the substrate, was applied in microplate format to serve as a quantitative screening method for detecting the toxic activity associated with cyclic peptide hepatotoxins at concentration levels >0.2 microg/L of MC-LR equivalents. A fast HPLC/PDA method was developed for the determination of microcystins using a short, 50 mm C18 column with 1.8 μm particle size; with this method a 10-fold reduction in sample run time was achieved, and sufficient separation of the microcystins was accomplished in less than 3 min. Finally, the analytical system includes an LC/MS/MS method developed for the determination of the five target compounds after SPE extraction. The method achieves extremely low limits of detection (<0.02 microg/L) in both surface and drinking waters, and it is used for identification and verification purposes as well as for determinations at the ppt level. An analytical protocol that includes the above methods has been designed and validated through the analysis of a number of real samples.
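    Both records above report validated limits of detection. One common way such limits are derived for chromatographic methods is the ICH "3.3 sigma/S" convention: regress the response on concentration, then scale the residual standard deviation by the slope. The calibration data below are invented, and this is the generic convention rather than either paper's specific procedure.

        import numpy as np

        conc = np.array([0.02, 0.05, 0.1, 0.2, 0.5, 1.0])          # microg/L standards
        resp = np.array([410, 1030, 2010, 4080, 10150, 20300.0])   # peak areas

        slope, intercept = np.polyfit(conc, resp, 1)
        sigma = np.std(resp - (slope * conc + intercept), ddof=2)  # residual SD

        lod = 3.3 * sigma / slope    # limit of detection
        loq = 10.0 * sigma / slope   # limit of quantification
        print(f"LOD ~ {lod:.3f} microg/L, LOQ ~ {loq:.3f} microg/L")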
  349. Evaluating the Diagnostic Validity of a Facet-Based Formative Assessment System

    ERIC Educational Resources Information Center

    DeBarger, Angela Haydel; DiBello, Louis; Minstrell, Jim; Feng, Mingyu; Stout, William; Pellegrino, James; Haertel, Geneva; Harris, Christopher; Ructinger, Liliana

    2011-01-01

    This paper describes methods for an alignment study and psychometric analyses of a formative assessment system, Diagnoser Tools for physics. Diagnoser Tools begin with facet clusters as the interpretive framework for designing questions and instructional activities. Thus each question in the diagnostic assessments includes distractors that…

  350. A Method for Aligning Acquisition Strategies and Software Architectures

    DTIC Science & Technology

    2014-09-01

    Only fragments of this record survive: "Want to make sure the system can be readily evolved to use new technology"; "Members of the HR staff (supervisors and those who would use the..."; "References: URLs are valid as of the publication date of this document. [Barbacci 2003] Barbacci, Mario; Ellison, Robert; Lattanze, Anthony; Stafford..."

  351. A Document Analysis of Teacher Evaluation Systems Specific to Physical Education

    ERIC Educational Resources Information Center

    Norris, Jason M.; van der Mars, Hans; Kulinna, Pamela; Kwon, Jayoun; Amrein-Beardsley, Audrey

    2017-01-01

    Purpose: The purpose of this document analysis study was to examine current teacher evaluation systems, understand current practices, and determine whether the instrumentation is a valid measure of teaching quality as reflected in teacher behavior and effectiveness specific to physical education (PE). Method: An interpretive document analysis…

  352. Exploration of Uncertainty in Glacier Modelling

    NASA Technical Reports Server (NTRS)

    Thompson, David E.

    1999-01-01

    There are procedures and methods for the verification of coding algebra and for the validation of models and calculations that are in use in the aerospace computational fluid dynamics (CFD) community. These methods would be efficacious if used by the glacier dynamics modelling community. This paper presents some of those methods and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and their proposed application to glacier modelling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modelling community, and establishes a context for these within overall solution quality assessment. Finally, an information architecture and interactive interface, the Integrated Cryospheric Exploration (ICE) Environment, is introduced and advocated for exploring and managing sources of uncertainty in glacier modelling codes and methods, and for supporting scientific numerical exploration and verification.
    The details and functionality of this Environment are described, based on modifications of a system already developed for CFD modelling and analysis.

  353. Beyond Corroboration: Strengthening Model Validation by Looking for Unexpected Patterns

    PubMed Central

    Chérel, Guillaume; Cottineau, Clémentine; Reuillon, Romain

    2015-01-01

    Models of emergent phenomena are designed to provide an explanation of global-scale phenomena in terms of local-scale processes. Model validation is commonly done by verifying that the model is able to reproduce the patterns to be explained. We argue that robust validation must be based not only on corroboration, but also on attempts to falsify the model, i.e., on making sure that the model behaves soundly for any reasonable input and parameter values. We propose an open-ended evolutionary method based on Novelty Search to look for the diverse patterns a model can produce. The Pattern Space Exploration method was tested on a model of collective motion and compared to three common a priori sampling experiment designs. The method successfully discovered all known qualitatively different kinds of collective motion, and performed much better than the a priori sampling methods. The method was then applied to a case study of city-system dynamics to explore the model's predicted values of city hierarchisation and population growth. This case study showed that the method can provide insights into potential predictive scenarios as well as falsifiers of the model when the simulated dynamics are highly unrealistic. PMID:26368917
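    The core of the Novelty Search loop described above fits in a few lines: candidates are scored not by an objective but by the mean distance from their behaviour descriptor to its nearest neighbours in an archive of behaviours already seen. The toy model, descriptor and novelty threshold below are invented; the actual Pattern Space Exploration method adds an evolutionary loop on top of this idea.

        import math
        import random

        def behavior(params):
            """Toy model: parameters -> 2-D behaviour descriptor (stand-in only)."""
            x, y = params
            return (math.sin(3 * x) * y, math.cos(2 * y) * x)

        def novelty(b, archive, k=5):
            """Mean distance to the k nearest archived behaviours."""
            dists = sorted(math.dist(b, a) for a in archive)
            return sum(dists[:k]) / min(k, len(dists))

        random.seed(1)
        archive = [behavior((random.random(), random.random())) for _ in range(10)]
        for _ in range(200):
            candidate = (random.uniform(0, 5), random.uniform(0, 5))
            b = behavior(candidate)
            if novelty(b, archive) > 0.3:   # archive sufficiently novel behaviours
                archive.append(b)
        print(len(archive), "distinct behaviours archived")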
  354. Development of a Decision Support System for Analysis and Solutions of Prolonged Standing in the Workplace

    PubMed Central

    Halim, Isa; Arep, Hambali; Kamat, Seri Rahayu; Abdullah, Rohana; Omar, Abdul Rahman; Ismail, Ahmad Rasdan

    2014-01-01

    Background: Prolonged standing has been hypothesized to be a major contributor to discomfort and muscle fatigue in the workplace. The objective of this study was to develop a decision support system that could provide systematic analysis and solutions to minimize the discomfort and muscle fatigue associated with prolonged standing. Methods: The integration of object-oriented programming and a Model Oriented Simultaneous Engineering System was used to design the architecture of the decision support system. Results: Validation of the decision support system was carried out in two manufacturing companies, and the validation process showed that it produced reliable results. Conclusion: The decision support system is a reliable advisory tool for providing analysis and solutions to problems related to the discomfort and muscle fatigue associated with prolonged standing. Further testing of the decision support system is suggested before it is used commercially. PMID:25180141

  355. Assessment of abdominal muscle function using the Biodex System-4. Validity and reliability in healthy volunteers and patients with giant ventral hernia.

    PubMed

    Gunnarsson, U; Johansson, M; Strigård, K

    2011-08-01

    The decrease in recurrence rates in ventral hernia surgery has redirected focus towards other important patient-related endpoints, one of which is abdominal wall function. The aim of the present study was to evaluate the reliability and external validity of abdominal wall strength measurement using the Biodex System-4 with a back-abdomen unit. Ten healthy volunteers and ten patients with ventral hernias exceeding 10 cm were recruited. Test-retest reliability, both with and without a girdle, was evaluated by comparing measurements from two test occasions one week apart. Reliability was calculated by the intraclass correlation coefficient (ICC) method. Validity was evaluated by correlation with the well-established International Physical Activity Questionnaire (IPAQ) and with a self-assessment of abdominal wall strength. One person in the healthy group was excluded after the first test because of neck problems following minor trauma. Reliability was excellent (>0.75), with ICC values between 0.92 and 0.97 for the different modalities tested. No differences were seen between testing with and without a girdle. Validity was also excellent, both when calculated as the correlation with self-assessed abdominal wall strength and with the IPAQ, giving Kendall tau values of 0.51 and 0.47, with corresponding P values of 0.002 and 0.004. Measurement of abdominal muscle function using the Biodex System-4 is thus a reliable and valid method for assessing this important patient-related endpoint. Further investigations will explore the potential of this technique in evaluating the results of ventral hernia surgery and in comparing muscle function after different abdominal wall reconstruction techniques.
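    The reliability statistic used in this record, the intraclass correlation coefficient, can be computed directly from a subjects-by-occasions score matrix. The sketch below implements the two-way random-effects ICC(2,1) following the Shrout and Fleiss decomposition; the strength scores are invented, and the paper's exact ICC variant is not stated in the abstract.

        import numpy as np

        def icc_2_1(data):
            """ICC(2,1) for an n_subjects x k_occasions score matrix."""
            data = np.asarray(data, dtype=float)
            n, k = data.shape
            grand = data.mean()
            ms_rows = k * np.sum((data.mean(axis=1) - grand) ** 2) / (n - 1)
            ms_cols = n * np.sum((data.mean(axis=0) - grand) ** 2) / (k - 1)
            resid = (data - data.mean(axis=1, keepdims=True)
                     - data.mean(axis=0, keepdims=True) + grand)
            ms_err = np.sum(resid ** 2) / ((n - 1) * (k - 1))
            return (ms_rows - ms_err) / (
                ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

        scores = [[102, 105], [88, 90], [130, 128], [95, 99], [110, 112],
                  [76, 80], [140, 138], [92, 95], [118, 121], [84, 86]]
        print(round(icc_2_1(scores), 2))   # high test-retest agreement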
  356. Validity and reliability of the abdominal test and evaluation systems tool (ABTEST) to accurately measure abdominal force.

    PubMed

    Glenn, Jordan M; Galey, Madeline; Edwards, Abigail; Rickert, Bradley; Washington, Tyrone A

    2015-07-01

    The ability to generate force from the core musculature is a critical factor for sports and general activities, with insufficiencies predisposing individuals to injury. This study evaluated isometric force production as a valid and reliable method of assessing abdominal force using the abdominal test and evaluation systems tool (ABTEST). A secondary analysis compared the estimated one-repetition maximum on a commercially available abdominal machine to the maximum force and average power on the ABTEST system. The study used a test-retest design on ABTEST to measure reliability, and a comparative analysis against the estimated one-repetition maximum on a commercially available abdominal device to assess validity. Participants applied isometric abdominal force against a transducer, and muscular activation was evaluated by measuring normalized electromyographic activity at the rectus abdominis, rectus femoris and erector spinae. Test-retest force production on ABTEST was significantly correlated (r = 0.84; p < 0.001). Mean electromyographic activity of 72.93% and 75.66% for the rectus abdominis, 6.59% and 6.51% for the rectus femoris, and 6.82% and 5.48% for the erector spinae was observed in trial 1 and trial 2, respectively. Significant correlations with the estimated one-repetition maximum were found for average power (r = 0.70, p = 0.002) and maximum force (r = 0.72, p < 0.001). The data indicate that ABTEST can accurately measure rectus abdominis force in isolation from hip-flexor involvement, and the negligible activation of the erector spinae substantiates the low subjective effort in the lower back. The results suggest that ABTEST is a valid and reliable method of evaluating abdominal force.

  357. An Overview of Prognosis Health Management Research at Glenn Research Center for Gas Turbine Engine Structures With Special Emphasis on Deformation and Damage Modeling

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Goldberg, Robert K.; Lerch, Bradley A.; Saleeb, Atef F.

    2009-01-01

    Herein a general, multimechanism, physics-based viscoelastoplastic model is presented in the context of an integrated diagnosis and prognosis methodology proposed for structural health monitoring, with particular applicability to gas turbine engine structures. In this methodology, diagnostics and prognostics are linked through state-awareness variables. The key technologies that comprise the proposed integrated approach are (1) a diagnostic/detection methodology, (2) a prognosis/lifing methodology, (3) a diagnostic/prognosis linkage, (4) experimental validation, and (5) a material data information management system. A specific prognosis lifing methodology, experimental characterization and validation, and data information management are the focal points of current activities being pursued within this integrated approach. The prognostic lifing methodology is based on an advanced multimechanism viscoelastoplastic model that accounts for both stiffness and/or strength reduction damage variables. Methods to characterize both the reversible and irreversible portions of the model are discussed.
    Once the multiscale model is validated, the intent is to link it to appropriate diagnostic methods to provide a full-featured structural health monitoring system.

  358. An Overview of Prognosis Health Management Research at GRC for Gas Turbine Engine Structures With Special Emphasis on Deformation and Damage Modeling

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Goldberg, Robert K.; Lerch, Bradley A.; Saleeb, Atef F.

    2009-01-01

    Herein a general, multimechanism, physics-based viscoelastoplastic model is presented in the context of an integrated diagnosis and prognosis methodology proposed for structural health monitoring, with particular applicability to gas turbine engine structures. In this methodology, diagnostics and prognostics are linked through state-awareness variables. The key technologies that comprise the proposed integrated approach are (1) a diagnostic/detection methodology, (2) a prognosis/lifing methodology, (3) a diagnostic/prognosis linkage, (4) experimental validation, and (5) a material data information management system. A specific prognosis lifing methodology, experimental characterization and validation, and data information management are the focal points of current activities being pursued within this integrated approach. The prognostic lifing methodology is based on an advanced multimechanism viscoelastoplastic model that accounts for both stiffness and/or strength reduction damage variables. Methods to characterize both the reversible and irreversible portions of the model are discussed. Once the multiscale model is validated, the intent is to link it to appropriate diagnostic methods to provide a full-featured structural health monitoring system.

  359. Preliminary Face and Construct Validation Study of a Virtual Basic Laparoscopic Skill Trainer

    PubMed Central

    Sankaranarayanan, Ganesh; Lin, Henry; Arikatla, Venkata S.; Mulcare, Maureen; Zhang, Likun; Derevianko, Alexandre; Lim, Robert; Fobert, David; Cao, Caroline; Schwaitzberg, Steven D.; Jones, Daniel B.

    2010-01-01

    Background: The Virtual Basic Laparoscopic Skill Trainer (VBLaST™) is a developing virtual-reality-based surgical skill training system that incorporates several of the tasks of the Fundamentals of Laparoscopic Surgery (FLS) training system. This study aimed to evaluate the face and construct validity of the VBLaST™ system. Materials and Methods: Thirty-nine subjects were voluntarily recruited at the Beth Israel Deaconess Medical Center (Boston, MA) and classified into two groups: experts (PGY 5, fellows and practicing surgeons) and novices (PGY 1-4). They were then asked to perform three FLS tasks, consisting of peg transfer, pattern cutting, and endoloop, on both the VBLaST and FLS systems.
    The VBLaST performance scores were computed automatically, while the FLS scores were rated by a trained evaluator. Face validity was assessed using a 5-point Likert scale, ranging from not realistic/useful (1) to very realistic/useful (5). Results: Face-validity scores showed that the VBLaST system was significantly realistic in portraying the three FLS tasks (3.95 ± 0.909), as well as in trocar placement and tool movements (3.67 ± 0.874). Construct-validity results show that VBLaST was able to differentiate between the expert and novice groups (P = 0.015). However, of the two tasks used for evaluating VBLaST, only the peg-transfer task showed a significant difference between the expert and novice groups (P = 0.003). Spearman correlation analysis between the two scores showed a significant correlation for the peg-transfer task (Spearman coefficient 0.364; P = 0.023). Conclusions: VBLaST demonstrated significant face and construct validity. A further set of studies, involving improvements to the current VBLaST system, is needed to demonstrate face and construct validity thoroughly for all the tasks. PMID:20201683
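    A construct-validity check of the kind reported above asks whether expert scores separate from novice scores on a task, with a concurrent correlation between simulator and FLS scores. The sketch below uses a Mann-Whitney U test and a Spearman correlation from scipy; the test choices and all scores are illustrative assumptions, not the paper's analysis.

        from scipy.stats import mannwhitneyu, spearmanr

        experts = [92, 88, 95, 90, 85, 91]    # peg-transfer scores, invented
        novices = [70, 65, 78, 72, 60, 74]
        u, p = mannwhitneyu(experts, novices, alternative="two-sided")
        print(f"U = {u}, p = {p:.4f}")        # small p -> the groups separate

        # Concurrent check between simulator and FLS scores for the same subjects:
        vblast = [70, 75, 62, 88, 90, 66]
        fls = [68, 80, 60, 85, 95, 70]
        rho, p = spearmanr(vblast, fls)
        print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")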
  360. Spectroscopic characterization and quantitative determination of atorvastatin calcium impurities by novel HPLC method

    NASA Astrophysics Data System (ADS)

    Gupta, Lokesh Kumar

    2012-11-01

    Seven process-related impurities in the atorvastatin calcium drug substance were identified by LC-MS. The structures of these impurities were confirmed by modern spectroscopic techniques, such as 1H NMR and IR, and by physicochemical studies conducted using synthesized authentic reference compounds. The synthesized reference samples of the impurity compounds were used for the quantitative HPLC determination. The impurities were detected by a newly developed gradient, reverse-phase high-performance liquid chromatographic (HPLC) method, and the system suitability of the HPLC analysis established the validity of the separation. The analytical method was validated according to the International Conference on Harmonisation (ICH) with respect to specificity, precision, accuracy, linearity, robustness and the stability of analytical solutions, demonstrating the power of the newly developed HPLC method.

  361. The development and validation of using inertial sensors to monitor postural change in resistance exercise.

    PubMed

    Gleadhill, Sam; Lee, James Bruce; James, Daniel

    2016-05-03

    This research presented and validated a method of assessing postural changes during resistance exercise using inertial sensors. A simple lifting task was broken down into a series of well-defined tasks, which could be examined and measured in a controlled environment. The purpose of this research was to determine whether timing measures obtained from inertial-sensor accelerometer outputs can provide accurate, quantifiable information about resistance exercise movement patterns; the aim was to complete a timing-measure validation of the inertial sensor outputs. Eleven participants completed five repetitions of 15 different deadlift variations, monitored with inertial sensors and an infrared three-dimensional motion capture system. Validation was undertaken using a Will Hopkins typical error of the estimate, with a Pearson's correlation and a Bland-Altman limits-of-agreement analysis, measuring the timing agreement during deadlifts between the inertial sensor outputs and the motion capture system. Timing validation results demonstrated a Pearson's correlation of 0.9997, with trivial standardised error (0.026) and standardised bias (0.002). Inertial sensors can now be used in practical settings with as much confidence as motion capture systems for accelerometer timing measurements of resistance exercise. This research provides a foundation for applying inertial sensors to qualitative activity recognition of resistance exercise and safe lifting practices.
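    The Bland-Altman limits-of-agreement analysis named in this record is short enough to state in full: the bias is the mean difference between the two systems, and the limits are the bias plus or minus 1.96 standard deviations of the differences. The event timings below are invented stand-ins for the sensor and motion-capture outputs.

        import numpy as np

        sensor = np.array([0.52, 1.01, 1.48, 2.05, 2.51, 3.02])  # event times [s]
        mocap = np.array([0.50, 1.00, 1.50, 2.00, 2.50, 3.00])

        diff = sensor - mocap
        bias = diff.mean()
        sd = diff.std(ddof=1)
        lower, upper = bias - 1.96 * sd, bias + 1.96 * sd
        print(f"bias = {bias:.3f} s, 95% limits of agreement = "
              f"({lower:.3f} s, {upper:.3f} s)")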
  362. Neuroscience, virtual reality and neurorehabilitation: brain repair as a validation of brain theory.

    PubMed

    Verschure, Paul F M J

    2011-01-01

    This paper argues that basing cybertherapy approaches on a theoretical understanding of the brain has advantages: on the one hand it provides a rational approach to therapy design, while on the other it allows a direct validation of brain theory in the clinic. As an example, this paper discusses how the Distributed Adaptive Control (DAC) architecture, a theory of mind, brain and action, has given rise to a new paradigm in neurorehabilitation called the Rehabilitation Gaming System (RGS) and to novel neuroprosthetic systems. The neuroprosthetic system considered was developed to replace the function of cerebellar micro-circuits, expresses core aspects of the learning systems of DAC, and has been successfully tested in in-vivo experiments. The virtual-reality-based rehabilitation paradigm of RGS has been validated in the treatment of acute and chronic stroke and has been shown to be more effective than existing methods. RGS provides a foundation for integrated at-home therapy systems that can operate largely autonomously when augmented with appropriate physiological monitoring and diagnostic devices. These examples provide first steps towards a science-based medicine.

  363. NetMHCpan, a Method for Quantitative Predictions of Peptide Binding to Any HLA-A and -B Locus Protein of Known Sequence

    PubMed Central

    Nielsen, Morten; Lundegaard, Claus; Blicher, Thomas; Lamberth, Kasper; Harndahl, Mikkel; Justesen, Sune; Røder, Gustav; Peters, Bjoern; Sette, Alessandro; Lund, Ole; Buus, Søren

    2007-01-01

    Background: Binding of peptides to Major Histocompatibility Complex (MHC) molecules is the single most selective step in the recognition of pathogens by the cellular immune system. The human MHC class I system (HLA-I) is extremely polymorphic; the number of registered HLA-I molecules has now surpassed 1500, and characterizing the specificity of each one separately would be a major undertaking. Principal Findings: Here, we have drawn on a large database of known peptide-HLA-I interactions to develop a bioinformatics method which takes both peptide and HLA sequence information into account and generates quantitative predictions of the affinity of any peptide-HLA-I interaction. Prospective experimental validation of peptides predicted to bind to previously untested HLA-I molecules, cross-validation, and retrospective prediction of known HIV immune epitopes and endogenously presented peptides all successfully validate this method.
    We further demonstrate that the method can be applied to perform a clustering analysis of MHC specificities and suggest using this clustering to select particularly informative novel MHC molecules for future biochemical and functional analysis. Conclusions: Encompassing all HLA molecules, this high-throughput computational method lends itself to epitope searches that are not only genome- and pathogen-wide, but also HLA-wide. Thus, it offers a truly global analysis of immune responses supporting rational development of vaccines and immunotherapy. It also promises to provide new basic insights into HLA structure-function relationships. The method is available at http://www.cbs.dtu.dk/services/NetMHCpan. PMID:17726526
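To make the idea of sequence-based affinity prediction concrete, here is a deliberately tiny sketch: one-hot encode peptides and fit a ridge regressor. This is not NetMHCpan's model (which is trained on large curated peptide-MHC datasets); the peptides and target values are invented.

    # Toy illustration only, NOT the NetMHCpan method: a minimal
    # sequence-to-affinity regressor. Peptides and affinities are invented.
    import numpy as np
    from sklearn.linear_model import Ridge

    AA = "ACDEFGHIKLMNPQRSTVWY"
    IDX = {a: i for i, a in enumerate(AA)}

    def one_hot(peptide: str) -> np.ndarray:
        """Flattened one-hot encoding of a 9-mer peptide (9 x 20 features)."""
        x = np.zeros((len(peptide), len(AA)))
        for pos, aa in enumerate(peptide):
            x[pos, IDX[aa]] = 1.0
        return x.ravel()

    peptides = ["SIINFEKLV", "GILGFVFTL", "NLVPMVATV", "KLGGALQAK"]  # hypothetical
    log_affinity = np.array([0.8, 0.6, 0.7, 0.2])                   # invented targets

    X = np.stack([one_hot(p) for p in peptides])
    model = Ridge(alpha=1.0).fit(X, log_affinity)
    print(model.predict(one_hot("SIINFEKLV").reshape(1, -1)))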
364. Zero-G experimental validation of a robotics-based inertia identification algorithm

    NASA Astrophysics Data System (ADS)

    Bruggemann, Jeremy J.; Ferrel, Ivann; Martinez, Gerardo; Xie, Pu; Ma, Ou

    2010-04-01

    The need to efficiently identify the changing inertial properties of on-orbit spacecraft is becoming more critical as satellite on-orbit services, such as refueling and repairing, become increasingly aggressive and complex. This need stems from the fact that a spacecraft's control system relies on knowledge of the spacecraft's inertia parameters. However, the inertia parameters may change during flight for reasons such as fuel usage, payload deployment or retrieval, and docking/capturing operations. New Mexico State University's Dynamics, Controls, and Robotics Research Group has proposed a robotics-based method of identifying unknown spacecraft inertia properties. Previous methods require firing known thrusts and then measuring the thrust and the resulting velocity and acceleration changes. The new method utilizes the concept of momentum conservation, while employing a robotic device powered by renewable energy to excite the state of the satellite. Thus, it requires no fuel usage or force and acceleration measurements. The method has been well studied in theory and demonstrated by simulation. However, its experimental validation is challenging because 6-degree-of-freedom motion in a zero-gravity condition is required. This paper presents an ongoing effort to test the inertia identification method onboard the NASA zero-G aircraft. The design and capability of the test unit are discussed in addition to the flight data. This paper also introduces the design and development of an air-bearing-based test used to partially validate the method, in addition to the approach used to obtain reference values for the test system's inertia parameters that can be used for comparison with the algorithm results.
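The momentum-conservation idea can be shown in a stripped-down, single-axis form: if the robot's own angular momentum profile is known from its commanded motion and the spacecraft body rate is measured, the unknown body inertia falls out of a linear least-squares fit. This is a simplified sketch under those assumptions, not the paper's full 6-DOF algorithm; all numbers are synthetic.

    # Simplified single-axis sketch of momentum-conservation inertia
    # identification: I*w_s(t) + h_r(t) = L0 (constant). Synthetic data.
    import numpy as np

    rng = np.random.default_rng(0)
    I_true, L0_true = 150.0, 30.0            # inertia (kg m^2), total momentum (N m s)

    h_r = np.linspace(0.0, 20.0, 50)         # known robot angular momentum profile
    w_s = (L0_true - h_r) / I_true           # body rate implied by conservation
    w_s += rng.normal(0.0, 1e-4, w_s.size)   # gyro measurement noise

    # Unknowns [I, L0] from I*w_s - L0 = -h_r, i.e. A @ [I, L0] = b
    A = np.column_stack([w_s, -np.ones_like(w_s)])
    b = -h_r
    I_est, L0_est = np.linalg.lstsq(A, b, rcond=None)[0]
    print(f"estimated inertia: {I_est:.1f} kg m^2 (true {I_true})")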
    The results show that this proposal has good scalability, which will help to optimize the physical data storage structure and improve storage performance.

365. Scenario-based design: a method for connecting information system design with public health operations and emergency management.

    PubMed

    Reeder, Blaine; Turner, Anne M

    2011-12-01

    Responding to public health emergencies requires rapid and accurate assessment of workforce availability under adverse and changing circumstances. However, public health information systems to support resource management during both routine and emergency operations are currently lacking. We applied scenario-based design as an approach to engage public health practitioners in the creation and validation of an information design to support routine and emergency public health activities. Using semi-structured interviews, we identified the information needs and activities of senior public health managers of a large municipal health department during routine and emergency operations. Interview analysis identified 25 information needs for public health operations management. The identified information needs were used in conjunction with scenario-based design to create 25 scenarios of use and a public health manager persona. The scenarios of use and persona were validated and modified based on follow-up surveys with study participants. Scenarios were used to test and gain feedback on a pilot information system. The method of scenario-based design was applied to represent the resource management needs of senior-level public health managers under routine and disaster settings. Scenario-based design can be a useful tool for engaging public health practitioners in the design process and for validating an information system design. Copyright © 2011 Elsevier Inc. All rights reserved.

366. Multi-views storage model and access methods of conversation history in converged IP messaging system

    NASA Astrophysics Data System (ADS)

    Lu, Meilian; Yang, Dong; Zhou, Xing

    2013-03-01

    Based on an analysis of the requirements for conversation history storage in the CPM (Converged IP Messaging) system, a multi-views storage model and access methods for conversation history are proposed. The storage model separates logical views from physical storage and divides the storage into a system-managed region and a user-managed region. It simultaneously supports conversation views, system pre-defined views, and user-defined views of storage. The rationality and feasibility of the multi-view presentation, the physical storage model, and the access methods were validated through an implemented prototype.

367. How to test validity in orthodontic research: a mixed dentition analysis example.

    PubMed

    Donatelli, Richard E; Lee, Shin-Jae

    2015-02-01

    The data used to test the validity of a prediction method should be different from the data used to generate the prediction model. In this study, we explored whether an independent data set is mandatory for testing the validity of a new prediction method and how validity can be tested without independent new data. Several validation methods were compared in an example using data from a mixed dentition analysis with a regression model. The validation errors of real mixed dentition analysis data and simulation data were analyzed for increasingly large data sets. The validation results of both the real and the simulation studies demonstrated that the leave-one-out cross-validation method had the smallest errors. The largest errors occurred with the traditional simple validation method. The differences between the validation methods diminished as the sample size increased. The leave-one-out cross-validation method seems to be an optimal validation method for improving prediction accuracy in a data set with a limited sample size. Copyright © 2015 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.
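The leave-one-out procedure favored in the record above is easy to state: fit the model on all observations but one, predict the held-out one, and repeat for every observation. A minimal sketch with synthetic stand-in data (not the study's measurements):

    # Minimal leave-one-out cross-validation sketch; data are synthetic
    # stand-ins for mixed dentition predictors and tooth widths.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import LeaveOneOut, cross_val_score

    rng = np.random.default_rng(1)
    X = rng.normal(size=(30, 2))                     # e.g., incisor-width predictors
    y = X @ np.array([1.5, 0.8]) + rng.normal(0.0, 0.3, 30)

    scores = cross_val_score(LinearRegression(), X, y,
                             cv=LeaveOneOut(),
                             scoring="neg_mean_absolute_error")
    print(f"LOO mean absolute validation error: {-scores.mean():.3f}")

With n observations this costs n model fits, which is why the record notes it pays off mainly for limited sample sizes.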
368. An empirical model of diagnostic x-ray attenuation under narrow-beam geometry

    PubMed Central

    Mathieu, Kelsey B.; Kappadath, S. Cheenu; White, R. Allen; Atkinson, E. Neely; Cody, Dianna D.

    2011-01-01

    Purpose: The purpose of this study was to develop and validate a mathematical model to describe narrow-beam attenuation of kilovoltage x-ray beams for the intended applications of half-value layer (HVL) and quarter-value layer (QVL) estimations, patient organ shielding, and computer modeling. Methods: An empirical model, which uses the Lambert W function and represents a generalized Lambert-Beer law, was developed. To validate this model, transmission of diagnostic energy x-ray beams was measured over a wide range of attenuator thicknesses [0.49–33.03 mm Al on a computed tomography (CT) scanner, 0.09–1.93 mm Al on two mammography systems, and 0.1–0.45 mm Cu and 0.49–14.87 mm Al using general radiography]. Exposure measurements were acquired under narrow-beam geometry using standard methods, including the appropriate ionization chamber, for each radiographic system. Nonlinear regression was used to find the best-fit curve of the proposed Lambert W model to each measured transmission versus attenuator thickness data set. In addition to validating the Lambert W model, we also assessed the performance of two-point Lambert W interpolation compared to traditional methods for estimating the HVL and QVL [i.e., semilogarithmic (exponential) and linear interpolation]. Results: The Lambert W model was validated for modeling attenuation versus attenuator thickness with respect to the data collected in this study (R² > 0.99). Furthermore, Lambert W interpolation was more accurate and less sensitive to the choice of interpolation points used to estimate the HVL and/or QVL than the traditional methods of semilogarithmic and linear interpolation. Conclusions: The proposed Lambert W model accurately describes attenuation of both monoenergetic radiation and (kilovoltage) polyenergetic beams (under narrow-beam geometry). PMID:21928626
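For context, the traditional semilogarithmic interpolation that the Lambert W approach is compared against works like this: assume exponential attenuation between two measurements bracketing 50% transmission and solve for the thickness. The transmission values below are illustrative, not the paper's data.

    # Sketch of traditional semilogarithmic HVL interpolation (the baseline
    # method named in the record above). Transmission values are illustrative.
    import numpy as np

    def hvl_semilog(t1, T1, t2, T2):
        """Thickness at 50% transmission, assuming exponential attenuation
        (ln T linear in thickness) between the two bracketing measurements."""
        slope = (np.log(T2) - np.log(T1)) / (t2 - t1)
        return t1 + (np.log(0.5) - np.log(T1)) / slope

    # two narrow-beam measurements bracketing 50% transmission (mm Al)
    print(f"HVL ~= {hvl_semilog(2.0, 0.58, 4.0, 0.41):.2f} mm Al")

The paper's point is that this implicit exponential assumption breaks down for polyenergetic beams (beam hardening), which the Lambert W form handles more robustly.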
369. Development and validation of a high-performance liquid chromatography-tandem mass spectrometry assay quantifying vemurafenib in human plasma.

    PubMed

    Nijenhuis, C M; Rosing, H; Schellens, J H M; Beijnen, J H

    2014-01-01

    Vemurafenib is an inhibitor of mutated serine/threonine-protein kinase B-Raf (BRAF) and is registered as Zelboraf® for the treatment of adult patients with BRAF V600 mutation-positive unresectable or metastatic melanoma. To support therapeutic drug monitoring (TDM) and clinical trials, we developed and validated a method for the quantification of vemurafenib in human plasma. Additionally, two LC-MS systems with different detectors were tested: the TSQ Quantum Ultra and the API3000. Human plasma samples were collected in the clinic and stored at nominally -20°C. Vemurafenib was isolated from plasma by liquid-liquid extraction, separated on a C18 column with gradient elution, and analysed with triple quadrupole mass spectrometry in positive-ion mode. A stable isotope was used as internal standard for the quantification. Over the range of 1-100 μg/ml the assay was linear, with correlation coefficients (r²) of 0.9985 or better. Inter-assay and intra-assay accuracies were within ±7.6% of the nominal concentration; inter-assay and intra-assay precision were ≤9.3%. All results were within the acceptance criteria of the US FDA and the latest EMA guidelines for method validation for both MS detectors. In conclusion, the presented analytical method for vemurafenib in human plasma was successfully validated, and the performance of the two LC-MS systems for this assay was comparable. The method was successfully applied to the pharmacokinetic quantification of vemurafenib in cancer patients treated with vemurafenib. Copyright © 2013 Elsevier B.V. All rights reserved.

370. Simultaneous overpass off nadir (SOON): a method for unified calibration/validation across IEOS and GEOSS system of systems

    NASA Astrophysics Data System (ADS)

    Ardanuy, Philip; Bergen, Bill; Huang, Allen; Kratz, Gene; Puschell, Jeff; Schueler, Carl; Walker, Joe

    2006-08-01

    The US operates a diverse, evolving constellation of research and operational environmental satellites, principally in polar and geosynchronous orbits. Our current and enhanced future domestic remote sensing capability is complemented by the significant capabilities of our current and potential future international partners.
    In this analysis, we define "success" through the data customers' "eyes": participating in the sufficient and continuously improving satisfaction of their mission responsibilities. To successfully fuse together observations from multiple simultaneous platforms and sensors into a common, self-consistent, operational environment requires that there exist a unified calibration and validation approach. Here, we develop a concept for an integrating framework for absolute accuracy; long-term stability; self-consistency among sensors, platforms, techniques, and observing systems; and validation and characterization of performance. Across all systems, this is a non-trivial problem. Simultaneous nadir overpasses, or SNOs, provide a proven intercomparison technique: simultaneous, collocated, co-angular measurements. Many systems have off-nadir elements, or effects, that must be calibrated. For these systems, the nadir technique constrains the process. We define the term "SOON," for simultaneous overpass off nadir. We present a target architecture and sensitivity analysis for the affordable, sustainable implementation of a global SOON calibration/validation network that can deliver the much-needed comprehensive, common, self-consistent operational picture in near-real time, at an affordable cost.

371. Design and validation of an automated hydrostatic weighing system.

    PubMed

    McClenaghan, B A; Rocchio, L

    1986-08-01

    The purpose of this study was to design and evaluate the validity of an automated technique to assess body density using a computerized hydrostatic weighing system. An existing hydrostatic tank was modified and interfaced with a microcomputer equipped with an analog-to-digital converter. Software was designed to input variables, control the collection of data, calculate selected measurements, and provide a summary of the results of each session. Validity of the data obtained using the automated hydrostatic weighing system was estimated by evaluating the reliability of the transducer/computer interface to measure objects of known underwater weight, comparing the data against a criterion measure, and determining inter-session subject reliability. Values obtained from the automated system were found to be highly correlated with known underwater weights (r = 0.99, SEE = 0.0060 kg). Data concurrently obtained using the automated system and a manual chart recorder were also highly correlated (r = 0.99, SEE = 0.0606 kg). Inter-session subject reliability was determined using data collected on subjects (N = 16) tested on two occasions approximately 24 h apart. Correlations revealed high relationships between measures of underwater weight (r = 0.99, SEE = 0.1399 kg) and body density (r = 0.98, SEE = 0.00244 g·cm⁻³).
    Results indicate that a computerized hydrostatic weighing system is a valid and reliable method for determining underwater weight.
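The calculation such a system automates is the standard hydrostatic densitometry formula: body volume from the air/water mass difference, corrected for residual lung volume. A sketch under that textbook assumption (the study's own software details are not given; all values are illustrative):

    # Sketch of the standard hydrostatic weighing calculation. Residual lung
    # volume is assumed measured or estimated separately; numbers illustrative.
    def body_density(mass_air_kg, mass_water_kg, water_density=0.9957,
                     residual_volume_l=1.2, gi_gas_l=0.1):
        """Body density in g/cm^3 from dry and underwater mass (kg)."""
        body_volume_l = ((mass_air_kg - mass_water_kg) / water_density
                         - residual_volume_l - gi_gas_l)
        return mass_air_kg / body_volume_l   # kg/L is numerically g/cm^3

    print(f"Db = {body_density(75.0, 3.5):.4f} g/cm^3")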
372. Virtual fringe projection system with nonparallel illumination based on iteration

    NASA Astrophysics Data System (ADS)

    Zhou, Duo; Wang, Zhangying; Gao, Nan; Zhang, Zonghua; Jiang, Xiangqian

    2017-06-01

    Fringe projection profilometry has been widely applied in many fields. To set up an ideal measuring system, virtual fringe projection techniques have been studied to assist in the design of hardware configurations. However, existing virtual fringe projection systems use parallel illumination and have a fixed optical framework. This paper presents a virtual fringe projection system with nonparallel illumination. Using an iterative method to calculate intersection points between rays and reference planes or object surfaces, the proposed system can simulate projected fringe patterns and captured images. A new explicit calibration method is presented to validate the precision of the system. Simulated results indicate that the proposed iterative method outperforms previous systems. Our virtual system can be applied to error analysis and algorithm optimization, and can help operators find ideal system parameter settings for actual measurements.

373. The face of pain--a pilot study to validate the measurement of facial pain expression with an improved electromyogram method.

    PubMed

    Wolf, Karsten; Raedler, Thomas; Henke, Kai; Kiefer, Falk; Mass, Reinhard; Quante, Markus; Wiedemann, Klaus

    2005-01-01

    The purpose of this pilot study was to establish the validity of an improved facial electromyogram (EMG) method for the measurement of facial pain expression. Darwin defined pain in connection with fear as a simultaneous occurrence of eye staring, brow contraction and teeth chattering. Prkachin was the first to use the video-based Facial Action Coding System to measure facial expressions while using four different types of pain triggers, identifying a group of facial muscles around the eyes. The activity of nine facial muscles in 10 healthy male subjects was analyzed. Pain was induced through a laser system with a randomized sequence of different intensities. Muscle activity was measured with a new, highly sensitive and selective facial EMG. The results indicate two groups of muscles as key for pain expression, in concordance with Darwin's definition. As in Prkachin's findings, one muscle group is assembled around the orbicularis oculi muscle, initiating eye staring. The second group consists of the mentalis and depressor anguli oris muscles, which trigger mouth movements. The results demonstrate the validity of the facial EMG method for measuring facial pain expression. Further studies with psychometric measurements, a larger sample size and a female test group should be conducted.

374. Reliability and Validity of a New Method for Isometric Back Extensor Strength Evaluation Using a Hand-Held Dynamometer.

    PubMed

    Park, Hee-Won; Baek, Sora; Kim, Hong Young; Park, Jung-Gyoo; Kang, Eun Kyoung

    2017-10-01

    To investigate the reliability and validity of a new method for isometric back extensor strength measurement using a portable dynamometer, a chair equipped with a small portable dynamometer was designed (Power Track II Commander Muscle Tester). A total of 15 men (mean age, 34.8±7.5 years) and 15 women (mean age, 33.1±5.5 years) with no current back problems or previous history of back surgery were recruited. Subjects were asked to push the back of the chair while seated, and their isometric back extensor strength was measured by the portable dynamometer. Test-retest reliability was assessed with the intraclass correlation coefficient (ICC). For the validity assessment, the isometric back extensor strength of all subjects was also measured by a widely used physical performance evaluation instrument, the BTE PrimusRS system, and the limits of agreement (LoA) from the Bland-Altman plot were evaluated between the two methods. The test-retest reliability was excellent (ICC=0.82; 95% confidence interval, 0.65-0.91). The Bland-Altman plots demonstrated acceptable agreement between the two methods: the lower 95% LoA was -63.1 N and the upper 95% LoA was 61.1 N. This study shows that isometric back extensor strength measurement using a portable dynamometer has good reliability and validity.
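The Bland-Altman limits of agreement quoted above are simply the mean paired difference plus or minus 1.96 standard deviations of the differences. A minimal sketch with invented paired strength readings (not the study's data):

    # Minimal Bland-Altman 95% limits-of-agreement computation;
    # the paired readings below are invented.
    import numpy as np

    device = np.array([310.0, 285.0, 402.0, 351.0, 298.0, 377.0])     # N, new method
    reference = np.array([325.0, 280.0, 395.0, 360.0, 310.0, 370.0])  # N, reference

    diff = device - reference
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)
    print(f"bias = {bias:.1f} N, 95% LoA = [{bias - loa:.1f}, {bias + loa:.1f}] N")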
375. SEU System Analysis: Not Just the Sum of All Parts

    NASA Technical Reports Server (NTRS)

    Berg, Melanie D.; Label, Kenneth

    2014-01-01

    Single event upset (SEU) analysis of complex systems is challenging. Currently, system SEU analysis is performed by component-level partitioning and then either the most dominant SEU cross-sections are used in system error rate calculations, or the partition cross-sections are summed to eventually obtain a system error rate. In many cases, system error rates are overestimated because these methods generally overlook system-level derating factors. The problem with overestimating is that it can cause overdesign and consequently negatively affect cost, schedule, functionality, and validation/verification. The scope of this presentation is to discuss the risks involved with the current scheme of SEU analysis for complex systems and to provide alternative methods for improvement.
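The overestimation argument reduces to simple arithmetic: a raw sum of partition upset rates ignores the fraction of upsets that are actually visible at the system level. A toy sketch (cross-sections, flux, and derating factors all invented):

    # Toy arithmetic: summing partition SEU rates without system-level
    # derating overestimates the system error rate. All values invented.
    sigma = [2e-8, 5e-9, 1e-8]   # per-partition SEU cross-sections (cm^2/device)
    flux = 6.0                    # assumed neutron flux (n/cm^2/s)
    derate = [0.3, 0.05, 0.5]     # fraction of upsets visible at system level

    naive = sum(s * flux for s in sigma)                     # simple sum
    derated = sum(s * flux * d for s, d in zip(sigma, derate))
    print(f"naive {naive:.2e} vs derated {derated:.2e} upsets/s")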
376. Research on Flow Field Perception Based on Artificial Lateral Line Sensor System.

    PubMed

    Liu, Guijie; Wang, Mengmeng; Wang, Anyi; Wang, Shirui; Yang, Tingting; Malekian, Reza; Li, Zhixiong

    2018-03-11

    In nature, the lateral line of fish is a peculiar and important organ for sensing the surrounding hydrodynamic environment, preying, escaping from predators and schooling. In this paper, by imitating the mechanism of fish lateral canal neuromasts, we developed an artificial lateral line system composed of micro-pressure sensors. Through hydrodynamic simulations, an optimized sensor structure was obtained, and pressure distribution models of the lateral surface were established in uniform flow and turbulent flow. In a corresponding underwater experiment, the validity of the numerical simulation method was verified by comparison between the experimental data and the simulation results. In addition, a variety of effective research methods are proposed and validated for flow velocity estimation and attitude perception in turbulent flow, and the shape recognition of obstacles is realized by a neural network algorithm.

377. Validity of two alternative systems for measuring vertical jump height.

    PubMed

    Leard, John S; Cirillo, Melissa A; Katsnelson, Eugene; Kimiatek, Deena A; Miller, Tim W; Trebincevic, Kenan; Garbalosa, Juan C

    2007-11-01

    Vertical jump height is frequently used by coaches, health care professionals, and strength and conditioning professionals to objectively measure function. The purpose of this study is to determine the concurrent validity of the jump-and-reach method (Vertec) and the contact mat method (Just Jump) in assessing vertical jump height when compared with the criterion reference 3-camera motion analysis system. Thirty-nine college students, 25 females and 14 males between the ages of 18 and 25 (mean age 20.65 years), were instructed to perform the countermovement jump. Reflective markers were placed at the base of each individual's sacrum for the 3-camera motion analysis system to measure vertical jump height. The subject was then instructed to stand on the Just Jump mat beneath the Vertec and perform the jump. Measurements were recorded from each of the 3 systems simultaneously for each jump. The Pearson r statistic between the video and the jump-and-reach (Vertec) was 0.906. The Pearson r between the video and the contact mat (Just Jump) was 0.967. Both correlations were significant at the 0.01 level. Analysis of variance showed a significant difference among the 3 means, F(2,235) = 5.51, p < 0.05. The post hoc analysis showed a significant difference between the criterion reference (M = 0.4369 m) and the Vertec (M = 0.3937 m, p = 0.005) but not between the criterion reference and the Just Jump system (M = 0.4420 m, p = 0.972). The Just Jump method of measuring vertical jump height is a valid measure when compared with the 3-camera system. The Vertec was found to have a high correlation with the criterion reference, but its mean differed significantly. This study indicates that a higher degree of confidence is warranted when comparing Just Jump results with a 3-camera system study.

378. Adaptive identification and control of structural dynamics systems using recursive lattice filters

    NASA Technical Reports Server (NTRS)

    Sundararajan, N.; Montgomery, R. C.; Williams, J. P.

    1985-01-01

    A new approach for adaptive identification and control of structural dynamics systems using least squares lattice filters, which are widely used in the signal processing area, is presented. Testing procedures for interfacing the lattice filter identification methods with a modal control method for stable closed-loop adaptive control are presented. The methods are illustrated for a free-free beam and for a complex flexible grid, with the basic control objective being vibration suppression. The approach is validated using both simulations and experimental facilities available at the Langley Research Center.

379. Thermalization as an invisibility cloak for fragile quantum superpositions

    NASA Astrophysics Data System (ADS)

    Hahn, Walter; Fine, Boris V.

    2017-07-01

    We propose a method for protecting fragile quantum superpositions in many-particle systems from dephasing by external classical noise. We call superpositions "fragile" if dephasing occurs particularly fast, because the noise couples very differently to the superposed states. The method consists of letting a quantum superposition evolve under the internal thermalization dynamics of the system, followed by a time-reversal manipulation known as a Loschmidt echo. The thermalization dynamics makes the superposed states almost indistinguishable during most of the above procedure.
    We validate the method by applying it to a cluster of spins ½.
380. Integrated analyses of proteins and their glycans in a magnetic bead-based multiplex assay format.

    PubMed

    Li, Danni; Chiu, Hanching; Chen, Jing; Zhang, Hui; Chan, Daniel W

    2013-01-01

    Well-annotated clinical samples are valuable resources for biomarker discovery and validation. Multiplex and integrated methods that simultaneously measure multiple analytes and generate integrated information about these analytes from a single measurement are desirable because they help conserve precious samples. We developed a magnetic bead-based system for multiplex and integrated glycoprotein quantification by immunoassays and glycan detection by lectin immunosorbent assays (LISAs). Magnetic beads coupled with antibodies were used for capturing proteins of interest. Biotinylated antibodies in combination with streptavidin-labeled phycoerythrin were used for protein quantification. In the LISAs, biotinylated detection antibodies were replaced by biotinylated lectins for glycan detection. Using tissue inhibitor of metallopeptidase 1 (TIMP-1), tissue plasminogen activator, membrane metallo-endopeptidase, and dipeptidyl peptidase-IV (DPP-4) as models, we found that the multiplex integrated system was comparable to single immunoassays in protein quantification and to LISAs in glycan detection. The merits of this system were demonstrated when it was applied to well-annotated prostate cancer tissues for validation of biomarkers in aggressive prostate cancer. Because of the system's multiplex ability, we used only 300 ng of tissue protein for the integrated detection of glycans in these proteins. Fucosylated TIMP-1 and DPP-4 offered improved performance over the proteins themselves in distinguishing aggressive and nonaggressive prostate cancer. The multiplex and integrated system conserves samples and is a useful tool for validation of glycoproteins and their glycoforms as biomarkers. © 2012 American Association for Clinical Chemistry

381. Challenges of NDE Simulation Tool

    NASA Technical Reports Server (NTRS)

    Leckey, Cara A. C.; Juarez, Peter D.; Seebo, Jeffrey P.; Frank, Ashley L.

    2015-01-01

    Realistic nondestructive evaluation (NDE) simulation tools enable inspection optimization and predictions of inspectability for new aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of advanced aerospace components, potentially shortening the time from material development to implementation by industry and government. Furthermore, modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided-wave-based structural health monitoring (SHM) systems.
    The current state-of-the-art in ultrasonic NDE/SHM simulation cannot rapidly simulate damage detection techniques for large-scale, complex-geometry composite components and vehicles with realistic damage types. This paper discusses some of the challenges of model development and validation for composites, such as the level of realism and scale of simulation needed for NASA's applications. Ongoing model development work is described along with examples of model validation studies. The paper also discusses examples of the use of simulation tools at NASA to develop new damage characterization methods, and the associated challenges of validating those methods.

382. Application of Petri net theory for modelling and validation of the sucrose breakdown pathway in the potato tuber.

    PubMed

    Koch, Ina; Junker, Björn H; Heiner, Monika

    2005-04-01

    Because of the complexity of metabolic networks and their regulation, formal modelling is a useful method to improve the understanding of these systems. An essential step in network modelling is to validate the network model. Petri net theory provides algorithms and methods which can be applied directly to metabolic network modelling and analysis in order to validate the model. The metabolism between sucrose and starch in the potato tuber is of great research interest. Even though this metabolism is one of the best studied in sink organs, it is not yet fully understood. We provide an approach for model validation of metabolic networks using Petri net theory, which we demonstrate for the sucrose breakdown pathway in the potato tuber. We start with hierarchical modelling of the metabolic network as a Petri net and continue with the analysis of qualitative properties of the network. The results characterize the net structure and give insights into the complex net behaviour.
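One concrete step in such qualitative Petri net validation is computing invariants of the incidence matrix; T-invariants, for instance, are transition firing vectors that return the net to its starting marking and should correspond to meaningful cycles in a metabolic model. A minimal sketch on a tiny invented three-place net (not the sucrose pathway itself):

    # Minimal T-invariant computation for a toy Petri net. The 3-place /
    # 3-transition cycle below is invented for illustration.
    import sympy as sp

    # rows = places, columns = transitions; entry = tokens produced - consumed
    C = sp.Matrix([
        [ 1, -1,  0],   # P1: t1 produces, t2 consumes
        [ 0,  1, -1],   # P2: t2 produces, t3 consumes
        [-1,  0,  1],   # P3: t3 produces, t1 consumes
    ])

    # T-invariants: non-trivial integer solutions of C * x = 0
    for v in C.nullspace():
        v = v * sp.lcm([term.q for term in v])   # scale rationals to integers
        print("T-invariant:", list(v))

Here the single invariant [1, 1, 1] says that firing each transition once reproduces the marking, as expected for a closed cycle.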
383. Tip-tilt disturbance model identification based on non-linear least squares fitting for Linear Quadratic Gaussian control

    NASA Astrophysics Data System (ADS)

    Yang, Kangjian; Yang, Ping; Wang, Shuai; Dong, Lizhi; Xu, Bing

    2018-05-01

    We propose a method to identify the tip-tilt disturbance model for Linear Quadratic Gaussian control. This identification method, based on the Levenberg-Marquardt method, requires little prior information and no auxiliary system, and it is convenient for identifying the tip-tilt disturbance model on-line for real-time control. It thereby makes it easy to run Linear Quadratic Gaussian control efficiently in different adaptive optics systems for vibration mitigation. The validity of Linear Quadratic Gaussian control combined with this tip-tilt disturbance model identification method is verified using experimental data replayed in simulation.
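The flavor of Levenberg-Marquardt identification can be shown with a generic example: fit the parameters of an assumed damped-oscillation vibration model to a measured tip-tilt time series. The model form and data below are assumptions for illustration, not the paper's disturbance model.

    # Hedged sketch: Levenberg-Marquardt fit of an assumed vibration model
    # to synthetic tip-tilt data (not the paper's model).
    import numpy as np
    from scipy.optimize import least_squares

    t = np.linspace(0.0, 1.0, 500)
    true = 0.8 * np.exp(-1.5 * t) * np.sin(2 * np.pi * 30.0 * t)    # 30 Hz vibration
    y = true + np.random.default_rng(2).normal(0.0, 0.02, t.size)   # measured tilt

    def residuals(p):
        amp, damp, freq = p
        return amp * np.exp(-damp * t) * np.sin(2 * np.pi * freq * t) - y

    fit = least_squares(residuals, x0=[1.0, 1.0, 28.0], method="lm")
    print("identified [amp, damping, freq]:", np.round(fit.x, 2))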
    The workshop was divided into three parts: an overview session; three half-day meetings of seven working groups addressing aeronautical and space requirements, system design for validation, failure modes, system modeling, reliable software, and flight test; and a half-day summary of the research issues presented by the working group chairmen. The issues that generated the most consensus across the workshop were: (1) the lack of effective design and validation methods with support tools to enable engineering of highly-integrated, flight-critical digital systems, and (2) the lack of high-quality laboratory and field data on system failures, especially those due to the electromagnetic environment (EME).

386. Dissertation Defense: Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    NASA Technical Reports Server (NTRS)

    Groves, Curtis Edward

    2014-01-01

    Spacecraft thermal protection systems are at risk of being damaged due to airflow produced by Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This paper describes an approach to quantify the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft without the use of test data. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to the traditional "validation by test only" mentality. This method could be extended to other disciplines and has the potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in the predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. This research includes the following three objectives and methods. Objective one is to develop, model, and perform a Computational Fluid Dynamics analysis of three generic, non-proprietary environmental control systems and spacecraft configurations. Several commercially available and open source solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime; the proposed method uses FLUENT, STAR-CCM+, and OpenFOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around Computational Fluid Dynamics predictions, and it accounts for all uncertainty terms from both numerical and input variables. Objective three is to compile a table of uncertainty parameters that could be used to estimate the error in a Computational Fluid Dynamics model of the Environmental Control System/spacecraft system.
    Previous studies have looked at the uncertainty in a Computational Fluid Dynamics model for a single output variable at a single point, for example the reattachment length of a backward-facing step. For the flow regime being analyzed (turbulent, three-dimensional, incompressible), the error at a single point can propagate into the solution both via flow physics and numerical methods. Calculating the uncertainty in using Computational Fluid Dynamics to accurately predict airflow speeds around encapsulated spacecraft is imperative to the success of future missions.
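The three-grid step mentioned above is commonly implemented as Richardson extrapolation with a grid convergence index (GCI): the observed order of convergence is estimated from three systematically refined grids, then used to extrapolate and band the fine-grid solution. A sketch with invented airflow-speed values and a constant refinement ratio (the dissertation's own numbers are not given here):

    # Sketch of the three-grid Richardson-extrapolation / GCI step used in
    # such CFD uncertainty analyses. Values and refinement ratio invented.
    import math

    f_coarse, f_medium, f_fine = 9.12, 8.61, 8.42   # solution on 3 grids (m/s)
    r = 2.0                                          # constant grid refinement ratio
    Fs = 1.25                                        # safety factor for 3-grid studies

    p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)
    f_exact = f_fine + (f_fine - f_medium) / (r**p - 1)    # Richardson extrapolation
    gci_fine = Fs * abs((f_fine - f_medium) / f_fine) / (r**p - 1)

    print(f"observed order p = {p:.2f}, extrapolated = {f_exact:.2f} m/s, "
          f"GCI = {100 * gci_fine:.1f}%")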
387. A novel method for finding the initial structure parameters of optical systems via a genetic algorithm

    NASA Astrophysics Data System (ADS)

    Jun, LIU; Huang, Wei; Hongjie, Fan

    2016-02-01

    A novel method for finding the initial structure parameters of an optical system via a genetic algorithm (GA) is proposed in this research. Usually, optical designers start their designs from commonly used structures in a patent database; however, it is time-consuming to modify the patented structures to meet a specification, and a high-performance design result largely depends on the choice of starting point. Accordingly, it would be highly desirable to be able to calculate the initial structure parameters automatically. In this paper, a method that combines a genetic algorithm and aberration analysis is used to determine an appropriate initial structure for an optical system. We use a three-mirror system as an example to demonstrate the validity and reliability of this method. On-axis and off-axis telecentric three-mirror systems are obtained based on this method.
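For readers unfamiliar with the machinery, the skeleton of a real-coded genetic algorithm looks like the sketch below: selection, blend crossover, and mutation over a population of candidate parameter vectors. The merit function here is a placeholder stand-in, not the paper's aberration analysis.

    # Illustrative GA skeleton only; the merit function is a stand-in for
    # an aberration-based objective. All settings invented.
    import numpy as np

    rng = np.random.default_rng(3)

    def merit(x):                      # lower is better; placeholder objective
        return np.sum((x - np.array([1.0, -2.0, 0.5])) ** 2)

    pop = rng.uniform(-5, 5, size=(40, 3))           # candidate parameter vectors
    for gen in range(100):
        fitness = np.array([merit(x) for x in pop])
        elite = pop[np.argsort(fitness)[:20]]        # selection: keep best half
        parents = elite[rng.integers(0, 20, (40, 2))]
        w = rng.random((40, 1))
        pop = w * parents[:, 0] + (1 - w) * parents[:, 1]   # blend crossover
        pop += rng.normal(0.0, 0.1, pop.shape)              # mutation

    best = pop[np.argmin([merit(x) for x in pop])]
    print("best parameters:", np.round(best, 2))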
388. sNebula, a network-based algorithm to predict binding between human leukocyte antigens and peptides

    PubMed Central

    Luo, Heng; Ye, Hao; Ng, Hui Wen; Sakkiah, Sugunadevi; Mendrick, Donna L.; Hong, Huixiao

    2016-01-01

    Understanding the binding between human leukocyte antigens (HLAs) and peptides is important to understand the functioning of the immune system. Since it is time-consuming and costly to measure the binding between large numbers of HLAs and peptides, computational methods including machine learning models and network approaches have been developed to predict HLA-peptide binding. However, there are several limitations to the existing methods. We developed a network-based algorithm called sNebula to address these limitations. We curated qualitative Class I HLA-peptide binding data and demonstrated the prediction performance of sNebula on this dataset using leave-one-out cross-validation and five-fold cross-validations. This algorithm can predict not only peptides of different lengths and different types of HLAs, but also peptides or HLAs that have no existing binding data. We believe sNebula is an effective method to predict HLA-peptide binding and thus improve our understanding of the immune system. PMID:27558848
  394. Educational Milestone Development in the First 7 Specialties to Enter the Next Accreditation System

    PubMed Central

    Swing, Susan R.; Beeson, Michael S.; Carraccio, Carol; Coburn, Michael; Iobst, William; Selden, Nathan R.; Stern, Peter J.; Vydareny, Kay

    2013-01-01

    Background: The Accreditation Council for Graduate Medical Education (ACGME) Outcome Project introduced 6 general competencies relevant to medical practice but fell short of its goal to create a robust assessment system that would allow program accreditation based on outcomes. In response, the ACGME, the specialty boards, and other stakeholders collaborated to develop educational milestones: observable steps in residents' professional development that describe progress from entry to graduation and beyond. Objectives: We summarize the development of the milestones, focusing on the 7 specialties moving to the next accreditation system in July 2013, and offer evidence of their validity. Methods: Specialty workgroups with broad representation used a 5-level developmental framework and incorporated information from literature reviews, specialty curricula, dialogue with constituents, and pilot testing. Results: The workgroups produced richly diverse sets of milestones that reflect the community's consideration of the attributes of competence relevant to practice in the given specialty. Both their development process and the milestones themselves establish a validity argument when contemporary views of validity for complex performance assessment are used. Conclusions: Initial evidence for validity emerges from the development processes and the resulting milestones. Further advancing a validity argument will require research on the use of milestone data in resident assessment and program accreditation. PMID: 24404235
  395. The efficiency of health care production in OECD countries: A systematic review and meta-analysis of cross-country comparisons

    PubMed

    Varabyova, Yauheniya; Müller, Julia-Maria

    2016-03-01

    There has been ongoing interest in the analysis and comparison of the efficiency of health care systems using nonparametric and parametric applications. The objective of this study was to review the current state of the literature and to synthesize the findings on health system efficiency in OECD countries. We systematically searched five electronic databases through August 2014 and identified 22 studies that analyzed the efficiency of health care production at the country level. We summarized these studies with respect to their samples, methods, and variables. We developed and applied a checklist of 14 items to assess the quality of the reviewed studies along four dimensions: reporting, external validity, bias, and power. Moreover, to examine the internal validity of the findings, we meta-analyzed the efficiency estimates reported in 35 models from ten studies. The qualitative synthesis of the literature indicated large differences in study designs and methods. The meta-analysis revealed low correlations between country rankings, suggesting a lack of internal validity of the efficiency estimates. In conclusion, the methodological problems of existing cross-country comparisons of the efficiency of health care systems draw into question the ability of these comparisons to provide meaningful guidance to policy-makers.
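The internal-validity check described above amounts to correlating the country rankings produced by different efficiency models. A minimal illustration with Spearman's rank correlation; both ranking vectors are invented:

```python
# Correlating country rankings from two hypothetical efficiency models.
from scipy.stats import spearmanr

dea_rank = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]  # hypothetical DEA efficiency ranks
sfa_rank = [3, 1, 6, 2, 9, 4, 10, 5, 8, 7]  # hypothetical SFA ranks, same countries

rho, p = spearmanr(dea_rank, sfa_rank)
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")  # low rho -> weak internal validity
```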
  396. Mobile detection system to evaluate reactive hyperemia using radionuclide plethysmography

    PubMed

    Harel, François; Ngo, Quam; Finnerty, Vincent; Hernandez, Edgar; Khairy, Paul; Dupuis, Jocelyn

    2007-08-01

    We validated a novel mobile detection system to evaluate reactive hyperemia using the radionuclide plethysmography technique. Twenty-six subjects underwent radionuclide plethysmography simultaneously with strain gauge plethysmography. The strain gauge and radionuclide methods showed excellent reproducibility, with intraclass correlation coefficients of 0.96 and 0.89, respectively. There was also a good correlation of flows between the two methods during reactive hyperemia (r = 0.87). We conclude that radionuclide plethysmography using this mobile detection system is a non-invasive alternative for assessing forearm blood flow and its dynamic variations during reactive hyperemia.

  397. Markov Jump-Linear Performance Models for Recoverable Flight Control Computers

    NASA Technical Reports Server (NTRS)

    Zhang, Hong; Gray, W. Steven; Gonzalez, Oscar R.

    2004-01-01

    Single event upsets in digital flight control hardware induced by atmospheric neutrons can reduce system performance and possibly introduce a safety hazard. One method currently under investigation to help mitigate the effects of these upsets is NASA Langley's Recoverable Computer System. In this paper, a Markov jump-linear model is developed for a recoverable flight control system; it will be validated using data from future experiments with simulated and real neutron environments. The method of tracking error analysis and the plan for the experiments are also described.
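A Markov jump-linear model of the kind used in the record above switches the closed-loop dynamics matrix according to a Markov chain over operating modes (nominal versus upset/recovery). A minimal simulation sketch; the matrices and transition probabilities are illustrative, not taken from the paper:

```python
# Minimal Markov jump-linear system simulation: the dynamics matrix jumps
# between a nominal mode and an upset/recovery mode via a Markov chain.
import numpy as np

A = [np.array([[0.95, 0.10], [0.0, 0.90]]),   # mode 0: nominal closed loop
     np.array([[1.01, 0.10], [0.0, 0.98]])]   # mode 1: upset/recovery (less damped)
P = np.array([[0.99, 0.01],                   # P[i, j] = Pr(next mode j | mode i)
              [0.20, 0.80]])

rng = np.random.default_rng(1)
x, mode = np.array([1.0, 0.0]), 0
for k in range(500):
    x = A[mode] @ x                # propagate tracking-error state
    mode = rng.choice(2, p=P[mode])  # jump to the next mode
print("final state norm:", np.linalg.norm(x))
```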
  398. Identification of Curie temperature distributions in magnetic particulate systems

    NASA Astrophysics Data System (ADS)

    Waters, J.; Berger, A.; Kramer, D.; Fangohr, H.; Hovorka, O.

    2017-09-01

    This paper develops a methodology for extracting the Curie temperature distribution from magnetisation-versus-temperature measurements which are realizable by standard laboratory magnetometry. The method is integral in nature, robust against various sources of measurement noise, and can be adapted to a wide range of granular magnetic materials and magnetic particle systems. The validity and practicality of the method are demonstrated using large-scale Monte Carlo simulations of an Ising-like model as a proof of concept, and general conclusions are drawn about its applicability to different classes of systems and experimental conditions.

  399. Red Lesion Detection Using Dynamic Shape Features for Diabetic Retinopathy Screening

    PubMed

    Seoud, Lama; Hurtut, Thomas; Chelbi, Jihed; Cheriet, Farida; Langlois, J. M. Pierre

    2016-04-01

    The development of an automatic telemedicine system for computer-aided screening and grading of diabetic retinopathy depends on reliable detection of retinal lesions in fundus images. In this paper, a novel method for automatic detection of both microaneurysms and hemorrhages in color fundus images is described and validated. The main contribution is a new set of shape features, called dynamic shape features, that do not require precise segmentation of the regions to be classified. These features represent the evolution of the shape during image flooding and allow discrimination between lesions and vessel segments. The method is validated per-lesion and per-image using six databases, four of which are publicly available. It proves to be robust with respect to variability in image resolution, quality, and acquisition system. On the Retinopathy Online Challenge's database, the method achieves a FROC score of 0.420, which ranks it fourth. On the Messidor database, when detecting images with diabetic retinopathy, the proposed method achieves an area under the ROC curve of 0.899, comparable to the score of human experts, and it outperforms state-of-the-art approaches.
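The per-image figure quoted above is an area under the ROC curve; as a reminder of how such a score is computed from detector outputs, a minimal sketch with invented labels and confidences:

```python
# ROC AUC from per-image detector confidences; all values invented.
from sklearn.metrics import roc_auc_score

y_true = [0, 0, 1, 1, 0, 1, 1, 0]                    # 1 = image shows retinopathy
y_score = [0.1, 0.4, 0.8, 0.7, 0.2, 0.9, 0.6, 0.3]   # detector confidence
print("AUC:", roc_auc_score(y_true, y_score))
```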
  400. Final Report - Regulatory Considerations for Adaptive Systems

    NASA Technical Reports Server (NTRS)

    Wilkinson, Chris; Lynch, Jonathan; Bharadwaj, Raj

    2013-01-01

    This report documents the findings of a preliminary research study into new approaches to the software design assurance of adaptive systems. We suggest a methodology to overcome the software validation and verification difficulties posed by the underlying assumption of non-adaptive software in the requirements-based testing verification methods of RTCA/DO-178B and C. An analysis of the relevant RTCA/DO-178B and C objectives is presented, showing the reasons for the difficulties that arise in showing satisfaction of the objectives, together with suggested additional means by which they could be satisfied. We suggest that the software design assurance problem for adaptive systems is principally one of developing correct and complete high-level requirements and system-level constraints that define the necessary system functional and safety properties to assure the safe use of adaptive systems. We show how analytical techniques such as model-based design, mathematical modeling, and formal or formal-like methods can be used both to validate the high-level functional and safety requirements and establish the necessary constraints, and to provide the verification evidence for the satisfaction of requirements and constraints that supplements conventional testing. Finally, the report identifies the follow-on research topics needed to implement this methodology.
  401. High-throughput method for the determination of residues of β-lactam antibiotics in bovine milk by LC-MS/MS

    PubMed

    Jank, Louise; Martins, Magda Targa; Arsand, Juliana Bazzan; Hoff, Rodrigo Barcellos; Barreto, Fabiano; Pizzolato, Tânia Mara

    2015-01-01

    This study describes the development and validation procedures for the scope extension of a method for the determination of β-lactam antibiotic residues (ampicillin, amoxicillin, penicillin G, penicillin V, oxacillin, cloxacillin, dicloxacillin, nafcillin, ceftiofur, cefquinome, cefoperazone, cephapirine, cefalexin and cephalonium) in bovine milk. Sample preparation was performed by liquid-liquid extraction (LLE) followed by two clean-up steps: low temperature purification (LTP) and a solid phase dispersion clean-up. Extracts were analysed using a liquid chromatography-electrospray-tandem mass spectrometry system (LC-ESI-MS/MS). Chromatographic separation was performed on a C18 column, using methanol and water (both with 0.1% formic acid) as the mobile phase. Method validation was performed according to the criteria of Commission Decision 2002/657/EC. The main validation parameters, such as linearity, limit of detection, decision limit (CCα), detection capability (CCβ), accuracy, and repeatability, were determined and shown to be adequate. The method was applied to real samples (more than 250), and two milk samples had levels above the maximum residue limits (MRLs) for cloxacillin (CLX) and cefapirin (CFAP).

  402. Dynamic measurement of speed of sound in n-Heptane by ultrasonics during fuel injections

    PubMed

    Minnetti, Elisa; Pandarese, Giuseppe; Evangelisti, Piersavio; Verdugo, Francisco Rodriguez; Ungaro, Carmine; Bastari, Alessandro; Paone, Nicola

    2017-11-01

    The paper presents a technique to measure the speed of sound in fuels based on pulse-echo ultrasound. The method is applied inside the test chamber of a Zeuch-type instrument used for indirect measurement of the injection rate (Mexus). The paper outlines the pulse-echo method, considering probe installation, ultrasound beam propagation inside the test chamber, and the typical signals obtained, as well as different processing algorithms. The method is validated in static conditions by comparing the experimental results to the NIST database for both water and n-Heptane. The ultrasonic system is synchronized to the injector so that time-resolved samples of the speed of sound can be acquired during a series of injections. Results at different operating conditions in n-Heptane are shown. An uncertainty analysis supports the analysis of the results and allows the method to be validated. Experimental results show that the speed-of-sound variation during an injection event is less than 1%, so the Mexus model assumption that it is constant during the injection is valid.
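The pulse-echo principle behind the n-Heptane record reduces to one relation: the acoustic path is traversed twice, so c = 2L/t. A worked example with assumed numbers (not measurements from the paper):

```python
# Pulse-echo speed of sound: the probe-to-reflector path is traversed twice.
L = 0.025     # probe-to-reflector distance in the chamber, m (assumed)
t = 4.34e-5   # measured round-trip time of flight, s (assumed)
c = 2 * L / t
print(f"speed of sound ~ {c:.0f} m/s")  # ~1152 m/s, plausible for n-Heptane
```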
  403. Development and validation of a stability-indicating gas chromatographic method for quality control of residual solvents in blonanserin: a novel atypical antipsychotic agent

    PubMed

    Peng, Ming; Liu, Jin; Lu, Dan; Yang, Yong-Jian

    2012-09-01

    Blonanserin is a novel atypical antipsychotic agent for the treatment of schizophrenia. Ethyl alcohol, isopropyl alcohol and toluene are utilized in the synthesis route of this bulk drug. A new validated gas chromatographic (GC) method for the simultaneous determination of residual solvents in blonanserin is described in this paper. Blonanserin was dissolved in N,N-dimethylformamide to make a sample solution that was directly injected into a DB-624 column. A post-run oven temperature of 240°C for approximately 2 h after the analysis cycle was used to wash blonanserin residue out of the GC column. Quantitation was performed by external standard analyses, and validation was carried out according to International Conference on Harmonization validation guidelines Q2A and Q2B. The method was shown to be specific (no interference in the blank solution), linear (correlation coefficients ≥0.99998, n = 10), accurate (average recoveries between 94.1 and 101.7%), precise (intra-day and inter-day precision ≤2.6%), sensitive (limit of detection ≤0.2 ng and limit of quantitation ≤0.7 ng), robust (small variations of carrier gas flow, initial oven temperature, temperature ramping rate, and injector and detector temperatures did not significantly affect the system suitability test parameters and peak areas) and stable (reference standard and sample solutions were stable over 48 h). This extensively validated method is ready to be used for the quality control of blonanserin.
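Two of the validation figures quoted above, calibration linearity and average recovery, are simple to reproduce. A sketch on invented calibration data:

```python
# Calibration linearity (correlation coefficient) and spike recovery;
# concentrations and responses are invented for illustration.
import numpy as np

conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])         # standard concentrations
area = np.array([10.2, 20.1, 40.5, 80.3, 160.9])   # detector response

slope, intercept = np.polyfit(conc, area, 1)       # calibration line
r = np.corrcoef(conc, area)[0, 1]                  # linearity

spiked, found = 2.0, 1.93                          # spike level vs. amount found
recovery = 100.0 * found / spiked
print(f"slope = {slope:.2f}, r = {r:.5f}, recovery = {recovery:.1f}%")
```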
  404. Developing and validating a nutrition knowledge questionnaire: key methods and considerations

    PubMed

    Trakman, Gina Louise; Forsyth, Adrienne; Hoye, Russell; Belski, Regina

    2017-10-01

    To outline key statistical considerations and detailed methodologies for the development and evaluation of a valid and reliable nutrition knowledge questionnaire, literature on questionnaire development in a range of fields was reviewed and a set of evidence-based guidelines specific to the creation of a nutrition knowledge questionnaire was developed. The recommendations describe key qualitative methods and statistical considerations, and include relevant examples from previous papers and existing nutrition knowledge questionnaires. Where details have been omitted for the sake of brevity, the reader is directed to suitable references. We recommend an eight-step methodology for nutrition knowledge questionnaire development, as follows: (i) definition of the construct and development of a test plan; (ii) generation of the item pool; (iii) choice of the scoring system and response format; (iv) assessment of content validity; (v) assessment of face validity; (vi) purification of the scale using item analysis, including item characteristics, difficulty and discrimination; (vii) evaluation of the scale, including its factor structure and internal reliability, or Rasch analysis, including assessment of dimensionality and internal reliability; and (viii) gathering of data to re-examine the questionnaire's properties, assess temporal stability and confirm construct validity. Several of these methods have previously been overlooked. The measurement of nutrition knowledge is an important consideration for individuals working in the nutrition field. Improved methods in the development of nutrition knowledge questionnaires, such as the use of factor analysis or Rasch analysis, will enable more confidence in reported measures of nutrition knowledge.
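Step (vi) of the recommended methodology purifies the scale using item difficulty and discrimination. A minimal version of those two statistics on a small invented response matrix (1 = correct answer):

```python
# Item difficulty (proportion correct) and discrimination (item-total
# point-biserial correlation) on an invented response matrix.
import numpy as np

resp = np.array([[1, 1, 0, 1],   # rows: respondents, cols: items
                 [1, 0, 0, 1],
                 [1, 1, 1, 1],
                 [0, 0, 0, 1],
                 [1, 1, 0, 0]])

difficulty = resp.mean(axis=0)   # proportion answering each item correctly
total = resp.sum(axis=1)         # total score per respondent
discrimination = np.array([np.corrcoef(resp[:, j], total)[0, 1]
                           for j in range(resp.shape[1])])
print("difficulty:", difficulty)
print("discrimination:", np.round(discrimination, 2))
```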
  405. Hovering Dual-Spin Vehicle Groundwork for Bias Momentum Sizing Validation Experiment

    NASA Technical Reports Server (NTRS)

    Rothhaar, Paul M.; Moerder, Daniel D.; Lim, Kyong B.

    2008-01-01

    Angular bias momentum offers significant stability augmentation for hovering flight vehicles. The reliance of the vehicle on thrust vectoring for agility and disturbance rejection is greatly reduced with significant levels of stored angular momentum in the system. A methodical procedure for bias momentum sizing has been developed in previous studies. The current study provides groundwork for experimental validation of that method using an experimental vehicle called the Dual-Spin Test Device, a thrust-levitated platform. Using measured data, the vehicle's thrust vectoring units are modeled and a gust environment is designed and characterized. Control design is discussed. Preliminary experimental results for the vehicle constrained to three rotational degrees of freedom are compared to simulation for a case containing no bias momentum, to validate the simulation. A simulation of a bias-momentum-dominant case is presented.

  406. Formulation of the relativistic moment implicit particle-in-cell method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Noguchi, Koichi; Tronci, Cesare; Zuccaro, Gianluca

    2007-04-15

    A new formulation is presented for the implicit moment method applied to the time-dependent relativistic Vlasov-Maxwell system. The new approach is based on a specific formulation of the implicit moment method that allows us to retain the same formalism that is valid in the classical case, despite the formidable complication introduced by the nonlinear nature of the relativistic equations of motion. To demonstrate the validity of the new formulation, an implicit finite difference algorithm is developed to solve the Maxwell equations and equations of motion. A number of benchmark problems are run: two-stream instability, ion acoustic wave damping, Weibel instability, and Poynting flux acceleration. The numerical results are all in agreement with analytical solutions.
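The implicit moment formulation above is beyond a short sketch, but the relativistic equations of motion it must integrate reduce to a momentum update with u = γv. A minimal explicit (not implicit) push for one particle in a constant electric field, with all constants set to 1 for illustration:

```python
# Minimal explicit relativistic particle push (u = gamma * v); this is the
# basic relation underlying relativistic PIC, not the implicit moment scheme.
import numpy as np

q, m, c = 1.0, 1.0, 1.0
E = np.array([0.1, 0.0, 0.0])   # constant electric field, no B field here
u = np.zeros(3)                 # u = gamma * v
x = np.zeros(3)
dt = 0.1

for _ in range(100):
    u = u + (q / m) * E * dt                     # momentum update
    gamma = np.sqrt(1.0 + np.dot(u, u) / c**2)   # Lorentz factor from u
    x = x + (u / gamma) * dt                     # position update with v = u/gamma
print("gamma =", gamma, " x =", x)
```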
  407. Validation of Gujarati Version of ABILOCO-Kids Questionnaire

    PubMed

    Diwan, Shraddha; Diwan, Jasmin; Patel, Pankaj; Bansal, Ankita B.

    2015-10-01

    ABILOCO-Kids is a measure of locomotion ability for children with cerebral palsy (CP) aged 6 to 15 years and is available in English and French. The aim was to validate the Gujarati version of the ABILOCO-Kids questionnaire for use in clinical research on a Gujarati population. The questionnaire was translated from English into Gujarati using the forward-backward-forward method. To ensure face and content validity of the Gujarati version using the group consensus method, each item was examined by a group of experts with a mean experience of 24.62 years in the fields of paediatrics and paediatric physiotherapy. Each item was analysed for content, meaning, wording, format, ease of administration and scoring, and was scored by the expert group as either accepted, rejected or accepted with modification. The procedure was continued until 80% consensus was reached for all items. Concurrent validity was examined on 55 children with cerebral palsy (6-15 years) of all Gross Motor Function Classification System (GMFCS) levels and all clinical types by correlating the ABILOCO-Kids score with the Gross Motor Function Measure (GMFM) and GMFCS. In phase 1 of validation, 16 items were accepted as is, 22 items were accepted with modification, and 3 items went to phase 2 validation. For concurrent validity, a highly significant positive correlation was found between the ABILOCO-Kids score and total GMFM (r = 0.713, p < 0.005) and a highly significant negative correlation with GMFCS (r = -0.778, p < 0.005). The Gujarati translated version of the ABILOCO-Kids questionnaire has good face and content validity as well as concurrent validity, and can be used to measure caregiver-reported locomotion ability in children with CP.

  408. Verification and validation of a Work Domain Analysis with Turing machine task analysis

    PubMed

    Rechard, J.; Bignon, A.; Berruet, P.; Morineau, T.

    2015-03-01

    While the use of Work Domain Analysis as a methodological framework in cognitive engineering is increasing rapidly, verification and validation of work domain models produced by this method are becoming a significant issue. In this article, we propose the use of a method based on Turing machine formalism, named "Turing Machine Task Analysis", to verify and validate work domain models. The application of this method to two work domain analyses, one of car driving, which is an "intentional" domain, and the other of a ship water system, which is a "causal" domain, showed the possibility of highlighting improvements needed by these models. More precisely, the step-by-step analysis of a degraded task scenario in each work domain model pointed out unsatisfactory aspects in the first modelling, such as overspecification, underspecification, omission of work domain affordances, or unsuitable inclusion of objects in the work domain model.

  409. Validation of catchment models for predicting land-use and climate change impacts. 2. Case study for a Mediterranean catchment

    NASA Astrophysics Data System (ADS)

    Parkin, G.; O'Donnell, G.; Ewen, J.; Bathurst, J. C.; O'Connell, P. E.; Lavabre, J.

    1996-02-01

    Validation methods commonly used to test catchment models are not capable of demonstrating a model's fitness for making predictions for catchments where the catchment response is not known (including hypothetical catchments, and future conditions of existing catchments which are subject to land-use or climate change). This paper describes the first use of a new method of validation (Ewen and Parkin, 1996. J. Hydrol., 175: 583-594) designed to address these types of application; the method involves making 'blind' predictions of selected hydrological responses which are considered important for a particular application. SHETRAN (a physically based, distributed catchment modelling system) is tested on a small Mediterranean catchment. The test involves quantification of the uncertainty in four predicted features of the catchment response (continuous hydrograph, peak discharge rates, monthly runoff, and total runoff), and comparison of observations with the predicted ranges for these features. The results of this test are considered encouraging.
  410. Model selection and assessment for multi-species occupancy models

    USGS Publications Warehouse

    Broms, Kristin M.; Hooten, Mevin B.; Fitzpatrick, Ryan M.

    2016-01-01

    While multi-species occupancy models (MSOMs) are emerging as a popular method for analyzing biodiversity data, formal checking and validation approaches for this class of models have lagged behind. Concurrent with the rise in application of MSOMs among ecologists, a quiet regime shift is occurring in Bayesian statistics, where predictive model comparison approaches are experiencing a resurgence. Unlike single-species occupancy models that use integrated likelihoods, MSOMs are usually couched in a Bayesian framework and contain multiple levels. Standard model checking and selection methods are often unreliable in this setting, and there is only limited guidance in the ecological literature for this class of models. We examined several different contemporary Bayesian hierarchical approaches for checking and validating MSOMs and applied these methods to a freshwater aquatic study system in Colorado, USA, to better understand the diversity and distributions of plains fishes. Our findings indicated distinct differences among model selection approaches, with cross-validation techniques performing the best in terms of prediction.

  411. A validated method for rapid determination of dibenzo-p-dioxins/furans (PCDD/Fs), polybrominated diphenyl ethers (PBDEs) and polychlorinated biphenyls (PCBs) in human milk: focus on utility of tandem solid phase extraction (SPE) cleanup

    PubMed

    Lin, Yuanjie; Feng, Chao; Xu, Qian; Lu, Dasheng; Qiu, Xinlei; Jin, Yu'e; Wang, Guoquan; Wang, Dongli; She, Jianwen; Zhou, Zhijun

    2016-07-01

    An improved method based on tandem solid phase extraction (SPE) cleanup and gas chromatography-high resolution mass spectrometry (GC-HRMS) has been validated for rapid determination of dibenzo-p-dioxins/furans (PCDD/Fs), dioxin-like polychlorinated biphenyls (PCBs), marker polychlorinated biphenyls (M-PCBs), and polybrominated diphenyl ethers (PBDEs) using a large volume (50 mL) of human milk. The method was well validated for the measurement of these analytes in human milk from the general population, with low limits of detection (LODs, 0.004-0.12 ng/g lipid), satisfactory accuracy (recoveries of 75-120%), and good precision (relative standard deviations below 10%). To comprehensively evaluate the performance of this method, a well-validated and routinely used method based on an automated sample clean-up system (ASCS, based on commercial acid multilayer silica, basic alumina, and carbon columns) was run in parallel for comparison. Compared with the ASCS method, this method presented comparable specificity. In addition, in contrast to the ASCS method, it greatly reduced solvent consumption (40 mL versus 500 mL), which results in a much lower background in the procedural blank, reduced time, and enhanced sample pretreatment throughput. The method was also applied in a pilot study to measure a batch of human milk samples with satisfactory results.
  412. Validation of a new mixing chamber system for breath-by-breath indirect calorimetry

    PubMed

    Kim, Do-Yeon; Robergs, Robert Andrew

    2012-02-01

    Limited validation research exists for applications of breath-by-breath systems of expired gas analysis indirect calorimetry (EGAIC) during exercise. We developed improved hardware and software for breath-by-breath indirect calorimetry (NEW) and validated this system as well as a commercial system (COM) against two methods: (i) mechanical ventilation with known calibration gas, and (ii) human subjects testing for 5 min each at rest and during cycle ergometer exercise at 100 and 175 W. Mechanical calibration used medical-grade certified calibration gas (4.95% CO2, 12.01% O2, balance N2), room air (20.95% O2, 0.03% CO2, balance N2), and 100% nitrogen, with an air flow turbine calibrated using a 3-L calibration syringe. Ventilation was mimicked manually using complete 3-L calibration syringe maneuvers at a rate of 10 per minute from a Douglas bag reservoir of calibration gas. The testing of human subjects was completed in a counterbalanced sequence based on 5 repeated tests of all conditions for a single subject, with rest periods of 5 and 10 min following the 100 and 175 W conditions, respectively. COM and NEW had similar accuracy when tested with known ventilation and gas fractions. However, during human subjects testing, COM significantly under-measured carbon dioxide gas fractions, over-measured oxygen gas fractions and minute ventilation, and consequently produced errors in each of oxygen uptake, carbon dioxide output, and the respiratory exchange ratio. These discrepant findings reveal that controlled ventilation and gas fractions are insufficient to validate breath-by-breath, and perhaps even time-averaged, systems of EGAIC. The errors of the COM system reveal the need for concern over the validity of commercial systems of EGAIC.
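The quantities the calorimetry record validates (oxygen uptake, carbon dioxide output, respiratory exchange ratio) are derived from minute ventilation and gas fractions, conventionally via the Haldane transformation. A worked sketch with assumed expired fractions:

```python
# Indirect calorimetry from ventilation and gas fractions, using the Haldane
# transformation (nitrogen conservation) to infer inspired volume.
FiO2, FiCO2, FiN2 = 0.2095, 0.0003, 0.7902   # room-air inspired fractions
FeO2, FeCO2 = 0.165, 0.045                   # expired fractions (assumed)
FeN2 = 1.0 - FeO2 - FeCO2
VE = 60.0                                    # expired minute ventilation, L/min

VI = VE * FeN2 / FiN2                        # Haldane: nitrogen is conserved
VO2 = VI * FiO2 - VE * FeO2                  # oxygen uptake, L/min
VCO2 = VE * FeCO2 - VI * FiCO2               # carbon dioxide output, L/min
print(f"VO2 = {VO2:.2f} L/min, VCO2 = {VCO2:.2f} L/min, RER = {VCO2/VO2:.2f}")
```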
  413. Research Methods Tutor: evaluation of a dialogue-based tutoring system in the classroom

    PubMed

    Arnott, Elizabeth; Hastings, Peter; Allbritton, David

    2008-08-01

    Research Methods Tutor (RMT) is a dialogue-based intelligent tutoring system for use in conjunction with undergraduate psychology research methods courses. RMT includes five topics that correspond to the curriculum of introductory research methods courses: ethics, variables, reliability, validity, and experimental design. We evaluated the effectiveness of the RMT system in the classroom using a nonequivalent control group design. Students in three classes (n = 83) used RMT, and students in two classes (n = 53) did not use RMT. Results indicated that the use of RMT yielded strong learning gains of 0.75 standard deviations above classroom instruction alone. Further, the dialogue-based tutoring condition of the system resulted in higher gains than the textbook-style condition (CAI version) of the system. Future directions for RMT include the addition of new topics and tutoring elements.
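The RMT gain is reported in standard-deviation units, i.e., an effect size. A minimal Cohen's d computation with a pooled standard deviation, on invented post-test scores:

```python
# Effect size (Cohen's d with pooled SD) on invented post-test scores.
import numpy as np

tutor = np.array([82, 75, 90, 88, 79, 85])      # hypothetical RMT group scores
control = np.array([70, 72, 78, 74, 69, 77])    # hypothetical control scores

pooled_sd = np.sqrt((tutor.var(ddof=1) + control.var(ddof=1)) / 2)
d = (tutor.mean() - control.mean()) / pooled_sd
print(f"Cohen's d = {d:.2f}")
```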
  414. A portable mid-range localization system using infrared LEDs for visually impaired people

    NASA Astrophysics Data System (ADS)

    Park, Suhyeon; Choi, In-Mook; Kim, Sang-Soo; Kim, Sung-Mok

    2014-11-01

    A versatile indoor/outdoor pedestrian guidance system with good mobility is necessary in order to aid visually impaired pedestrians in indoor and outdoor environments. In this paper, distance estimation methods for portable wireless localization systems are verified. Two systems, each consisting of a fixed active beacon and a receiver, are proposed: one using an ultrasound time-of-flight method and one using a differential infrared intensity method. The infrared localization system was appropriate for the goal of this study; it was possible to use the infrared intensity method to generate a uniform signal field extending beyond 30 m. Valid distance estimations were made within 30 m of coverage indoors and within 20 m outdoors. Also, a pocket-sized receiver which can be adapted to a smartphone was found to be suitable for use as a portable device.

  415. FVCOM one-way and two-way nesting using ESMF: Development and validation

    NASA Astrophysics Data System (ADS)

    Qi, Jianhua; Chen, Changsheng; Beardsley, Robert C.

    2018-04-01

    Built on the Earth System Modeling Framework (ESMF), one-way and two-way nesting methods were implemented into the unstructured-grid Finite-Volume Community Ocean Model (FVCOM). These methods help utilize the unstructured-grid multi-domain nesting of FVCOM, with the aim of resolving multi-scale physical and ecosystem processes. The procedures for implementing FVCOM within ESMF are described in detail. Experiments were conducted to validate and evaluate the performance of the nested-grid FVCOM system. The first treated a wave-current interaction case with two-domain nesting, with an emphasis on the critical need for nesting to resolve a high-resolution feature near the coast and harbor with little loss in computational efficiency. The second was conducted for pseudo river plume cases to examine the differences in model-simulated salinity between the one-way and two-way nesting approaches and to evaluate the performance of the mass-conservative two-way nesting method. The third was carried out for a river plume case in the realistic geometric domain of Mass Bay, supporting the importance of two-way nesting for integrated coastal-estuarine modeling. The nesting method described in this paper has been used in the Northeast Coastal Ocean Forecast System (NECOFS), a global-regional-coastal nesting FVCOM system that has been in end-to-end forecast and hindcast operation since 2007.

  416. Multisensor system for toxic gases detection generated on indoor environments

    NASA Astrophysics Data System (ADS)

    Durán, C. M.; Monsalve, P. A. G.; Mosquera, C. J.

    2016-11-01

    This work describes a wireless multisensor system for the detection of toxic gases generated in indoor environments (e.g., underground coal mines). The multisensor system proposed in this study was developed using a set of six low-cost chemical gas sensors (MQ series) with overlapping sensitivities to detect hazardous gases in the air. A statistical parameter was computed from the data set, and two pattern recognition methods, Principal Component Analysis (PCA) and Discriminant Function Analysis (DFA), were used for feature selection. The toxic gas categories were then classified with a Probabilistic Neural Network (PNN) in order to validate the results previously obtained. Tests were carried out to verify the feasibility of the application through a wireless communication model which allowed the sensor signals to be monitored and stored for analysis. The success rate in discriminating the measurements was 100%, using an artificial neural network with leave-one-out cross-validation.

  417. Information Flow Integrity for Systems of Independently-Developed Components

    DTIC Science & Technology

    2015-06-22

    We also examined three programs (Apache, MySQL, and PHP) in detail to evaluate the efficacy of using the provided package test suites to generate... hooks placed by this method are just as effective as hooks that were manually placed over the course of years, while greatly reducing the burden on programmers. "Leveraging..." ... to validate optimizations of real-world, mature applications: the Apache software suite, the Mozilla Suite, and the MySQL database. "Validating Library..."
  418. Parameter Selection Methods in Inverse Problem Formulation

    DTIC Science & Technology

    2010-11-03

    Examples include a recently developed in-host model for HIV dynamics, which has been successfully validated with clinical data and used for prediction [4, 8], and a model for the reaction of the cardiovascular system to an ergometric workload. Key Words: parameter selection.

  419. Analytical method development of nifedipine and its degradants binary mixture using high performance liquid chromatography through a quality by design approach

    NASA Astrophysics Data System (ADS)

    Choiri, S.; Ainurofiq, A.; Ratri, R.; Zulmi, M. U.

    2018-03-01

    Nifedipine (NIF) is a photo-labile drug that degrades easily when exposed to sunlight. This research aimed to develop an analytical method using high-performance liquid chromatography, implementing a quality-by-design approach to obtain an effective, efficient, and validated analytical method for NIF and its degradants. A 2² full factorial design with a center point was applied to optimize the analytical conditions for NIF and its degradants. Mobile phase composition (MPC) and flow rate (FR) were the factors studied for their effects on the system suitability parameters. The selected condition was validated by cross-validation using a leave-one-out technique. Altering the MPC significantly affected retention time, and an increase in FR reduced the tailing factor. In addition, the interaction of both factors increased the theoretical plate count and the resolution of NIF and its degradants. The selected analytical condition for NIF and its degradants was validated over the range 1-16 µg/mL, showed good linearity, precision and accuracy, and is efficient owing to an analysis time within 10 min.
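The 2² factorial analysis described in the nifedipine record estimates main effects, the interaction, and curvature from only five runs. A sketch with coded factor levels and invented retention times:

```python
# 2x2 full factorial with a center point: main effects, interaction, curvature.
import numpy as np

#                MPC  FR   (coded -1/+1 levels)
runs = np.array([[-1, -1],
                 [+1, -1],
                 [-1, +1],
                 [+1, +1]])
retention = np.array([9.8, 7.1, 8.9, 6.0])   # hypothetical retention times, min

mpc_effect = retention[runs[:, 0] == 1].mean() - retention[runs[:, 0] == -1].mean()
fr_effect = retention[runs[:, 1] == 1].mean() - retention[runs[:, 1] == -1].mean()
interaction = (retention[0] + retention[3] - retention[1] - retention[2]) / 2
center = 8.1                                  # center-point run checks curvature
curvature = center - retention.mean()
print(f"MPC effect {mpc_effect:.2f}, FR effect {fr_effect:.2f}, "
      f"interaction {interaction:.2f}, curvature {curvature:.2f}")
```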
  420. Validation of a Prototype Optical Computed Tomography System

    PubMed Central

    Zakariaee, Seyed Salman; Molazadeh, Mikaeil; Takavar, Abbas; Shirazi, Alireza; Mesbahi, Asghar; Zeinali, Ahad

    2015-01-01

    In radiation cancer treatments, most side effects could be minimized using a proper dosimeter. The gel dosimeter is the only three-dimensional dosimeter, and magnetic resonance imaging (MRI) is the gold standard method for gel dosimeter readout. Because of the limited accessibility and high cost of sample reading by MRI systems, other alternative methods have been developed. Optical computed tomography (OCT) could be considered the most promising alternative and has been studied widely. In the current study, gel dosimeters were scanned using a prototype optical scanner, and the scanner was validated. The optical absorbance of the irradiated gel samples was determined by both a conventional spectrophotometer and the fabricated OCT system at 632 nm. The irradiated vials were also scanned by a 1.5 T MRI. The slope of each response curve was extracted as the dose-response sensitivity. The R2-dose sensitivity measured by the MRI method was 0.1904 and 0.113 for NIPAM and PAGAT gels, respectively. The optical dose sensitivity obtained by the conventional spectrophotometer and the fabricated optical scanner was 0.0453 and 0.0442 for NIPAM gels, and 0.0244 and 0.0242 for PAGAT gels, respectively. The scans of the absorbed dose values showed that the new OCT system and the conventional spectrophotometer were in fair agreement. From the results, it could be concluded that the fabricated system is able to quantify the absorbed dose in polymer gel samples with acceptable accuracy. PMID: 26120572

  421. Many-body optimization using an ab initio Monte Carlo method

    PubMed

    Haubein, Ned C.; McMillan, Scott A.; Broadbelt, Linda J.

    2003-01-01

    Advances in computing power have made it possible to study solvated molecules using ab initio quantum chemistry. Inclusion of discrete solvent molecules is required to determine geometric information about solute/solvent clusters. Monte Carlo methods are well suited to finding minima in many-body systems, and ab initio methods are applicable to the widest range of systems. A first-principles Monte Carlo (FPMC) method was developed to find minima in many-body systems, and emphasis was placed on implementing moves that increase the likelihood of finding minimum-energy structures. Partial optimization and molecular interchange moves aid in finding minima and overcome the incomplete sampling that is unavoidable when using ab initio methods. FPMC was validated by studying the boron trifluoride-water system, and the method was then used to examine the methyl carbenium ion in water to demonstrate its application to solvation problems.
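The FPMC record builds on the Metropolis acceptance rule; a minimal Monte Carlo minimizer using that rule on a toy one-dimensional energy surface (a stand-in for an ab initio energy call):

```python
# Metropolis Monte Carlo minimization on a toy energy surface; energy() is a
# hypothetical stand-in for an ab initio energy evaluation.
import math, random

def energy(x):
    """Hypothetical double-well potential with slightly unequal minima."""
    return (x**2 - 1.0) ** 2 + 0.1 * x

x, kT = 2.0, 0.1
best = (energy(x), x)
for _ in range(5000):
    trial = x + random.uniform(-0.2, 0.2)                # a local "move"
    dE = energy(trial) - energy(x)
    if dE < 0 or random.random() < math.exp(-dE / kT):   # Metropolis criterion
        x = trial
        best = min(best, (energy(x), x))
print("lowest energy found:", best)
```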
  422. A novel transmitter IQ imbalance and phase noise suppression method utilizing pilots in PDM CO-OFDM system

    NASA Astrophysics Data System (ADS)

    Zhang, Haoyuan; Ma, Xiurong; Li, Pengru

    2018-04-01

    In this paper, we develop a novel pilot structure to suppress transmitter in-phase and quadrature (Tx IQ) imbalance, phase noise, and channel distortion for polarization division multiplexed (PDM) coherent optical orthogonal frequency division multiplexing (CO-OFDM) systems. Compared with the conventional approach, our method not only significantly improves the system tolerance of IQ imbalance as well as phase noise, but also provides higher transmission speed. Numerical simulations of a PDM CO-OFDM system are used to validate the theoretical analysis under the following conditions: an amplitude mismatch of 3 dB, a phase mismatch of 15°, a transmission bit rate of 100 Gb/s, and 560 km of standard single-mode fiber transmission. Moreover, the proposed method is 63% less complex than the compared method.

  423. Identifying Stakeholders and Their Preferences about NFR by Comparing Use Case Diagrams of Several Existing Systems

    NASA Astrophysics Data System (ADS)

    Kaiya, Haruhiko; Osada, Akira; Kaijiri, Kenji

    We present a method to identify stakeholders and their preferences about non-functional requirements (NFR) by using the use case diagrams of existing systems. We focus on changes in NFR because such changes help stakeholders to identify their preferences. Comparing different use case diagrams of the same domain helps us to find the changes that can occur. We utilize the Goal-Question-Metric (GQM) method for identifying variables that characterize NFR, so that changes in NFR can be represented systematically using those variables. Use cases that represent system interactions help us to bridge the gap between goals and metrics (variables), allowing measurable NFR to be constructed easily. For validating and evaluating our method, we applied it to the application domain of Mail User Agent (MUA) systems.
  424. Summary: Experimental validation of real-time fault-tolerant systems

    NASA Technical Reports Server (NTRS)

    Iyer, R. K.; Choi, G. S.

    1992-01-01

    Testing and validation of real-time systems is always difficult to perform, since neither the error generation process nor the fault propagation problem is easy to comprehend. There is no better substitute than results based on actual measurements and experimentation; such results are essential for developing a rational basis for the evaluation and validation of real-time systems. However, with physical experimentation, controllability and observability are limited to the external instrumentation that can be hooked up to the system under test, which is a difficult, if not impossible, task for a complex system. Physical hardware must also exist before such measurement experiments can be set up. A simulation approach, on the other hand, allows flexibility that is unequaled by any other existing method for system evaluation. A simulation methodology for system evaluation was successfully developed and implemented, and the environment was demonstrated using existing real-time avionic systems. The research was oriented toward evaluating the impact of permanent and transient faults in aircraft control computers. Results were obtained for the Bendix BDX 930 system and the Hamilton Standard EEC131 jet engine controller. The studies showed that simulated fault injection is valuable, at the design stage, for evaluating the susceptibility of computing systems to different types of failures.
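The record argues that simulated fault injection is valuable at the design stage. One common transient-fault model is a single bit flip in a state variable; a minimal sketch that flips one bit of a 64-bit float:

```python
# Minimal simulated transient fault: flip one bit in a 64-bit float, as one
# might perturb a state variable of a simulated controller.
import struct

def flip_bit(value: float, bit: int) -> float:
    (as_int,) = struct.unpack("<Q", struct.pack("<d", value))
    return struct.unpack("<d", struct.pack("<Q", as_int ^ (1 << bit)))[0]

state = 1.0
print(flip_bit(state, 51))   # flip a mantissa bit: small perturbation (1.5)
print(flip_bit(state, 62))   # flip an exponent bit: value driven to infinity
```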
  425. High-Frequency Observation of Water Spectrum and Its Application in Monitoring of Dynamic Variation of Suspended Materials in the Hangzhou Bay.

    PubMed

    Dai, Qian; Pan, De-lu; He, Xian-qiang; Zhu, Qian-kun; Gong, Fang; Huang, Hai-qing

    2015-11-01

    In situ measurement of the water spectrum is the basis of the validation of ocean color remote sensing. The traditional method of obtaining the water spectrum is shipboard measurement at a limited number of stations, which makes it difficult to meet the validation requirements of ocean color remote sensing in highly dynamic coastal waters. To overcome this limitation, continuously observing systems for the water spectrum have been developed around the world. So far, however, there are still few high-frequency observation systems for the water spectrum in coastal waters, especially in highly turbid and highly dynamic waters. Here, we established a tower-based high-frequency water-spectrum observing system in the Hangzhou Bay. The system measures the water spectrum every 3 minutes, which can fully match satellite observations. In this paper, we primarily developed a data processing method for the tower-based high-frequency water-spectrum data, to realize automatic judgment of clear sky, sun glint, platform shadow, weak illumination, etc., and verified the processing results. The results show that the normalized water-leaving radiance spectra obtained through tower observation are highly consistent with the shipboard measurement results, with a correlation coefficient of more than 0.99 and an average relative error of 9.96%. In addition, the long-term observation capability of the system was evaluated. Although the system has run for one year, the normalized water-leaving radiance it obtains agrees well with synchronous measurements by a portable ASD spectrometer in both spectral shape and value, with a correlation coefficient of more than 0.90 and an average relative error of 6.48%. Moreover, the water spectra from the system's high-frequency observations can be used to effectively monitor the rapid dynamic variation in the concentration of suspended materials with the tide. The system has provided rich in situ spectral data for the validation of ocean color remote sensing in turbid waters, especially for the validation of high temporal-resolution geostationary satellite ocean color remote sensing.

  426. An interval precise integration method for transient unbalance response analysis of rotor system with uncertainty

    NASA Astrophysics Data System (ADS)

    Fu, Chao; Ren, Xingmin; Yang, Yongfeng; Xia, Yebao; Deng, Wangqun

    2018-07-01

    A non-intrusive interval precise integration method (IPIM) is proposed in this paper to analyze the transient unbalance response of uncertain rotor systems. The transfer matrix method (TMM) is used to derive the deterministic equations of motion of a hollow-shaft overhung rotor. The uncertain transient dynamic problem is solved by combining Chebyshev approximation theory with the modified precise integration method (PIM). Transient response bounds are calculated by interval arithmetic on the expansion coefficients. A brief theoretical error analysis of the proposed method is provided, and its accuracy is further validated by comparison with the scanning method in simulations. Numerical results show that the IPIM keeps good accuracy in vibration prediction during the start-up transient process. Furthermore, the proposed method can also provide theoretical guidance for other transient dynamic mechanical systems with uncertainties.
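    The deterministic core of such an approach, the precise integration method, advances the state with the matrix exponential exp(A*dt) computed by repeatedly squaring a Taylor seed over the tiny interval dt/2^N. The sketch below shows only that deterministic core on a hypothetical one-degree-of-freedom oscillator; the paper's Chebyshev/interval layer is not reproduced.

        import numpy as np

        def pim_expm(A, dt, N=20, order=4):
            # Precise integration: exp(A*dt) via 2^N scaling of a Taylor seed.
            # The increment Ta = exp(A*tau) - I is propagated to limit round-off.
            tau = dt / 2**N
            Ta = np.zeros_like(A)
            term = np.eye(A.shape[0])
            for k in range(1, order + 1):
                term = term @ (A * tau) / k
                Ta = Ta + term
            for _ in range(N):                 # (I + Ta)^2 = I + (2*Ta + Ta@Ta)
                Ta = 2*Ta + Ta @ Ta
            return np.eye(A.shape[0]) + Ta

        # Lightly damped oscillator x'' + 2*zeta*w*x' + w^2*x = 0 in state form
        w, zeta = 10.0, 0.02
        A = np.array([[0.0, 1.0], [-w**2, -2*zeta*w]])
        T = pim_expm(A, 0.01)

        x = np.array([1.0, 0.0])               # free response from a unit deflection
        for _ in range(1000):
            x = T @ x
        print(x)                                # agrees with the exact exp(A*10) @ x0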
  427. Developmental validation of a Cannabis sativa STR multiplex system for forensic analysis.

    PubMed

    Howard, Christopher; Gilmore, Simon; Robertson, James; Peakall, Rod

    2008-09-01

    A developmental validation study based on the recommendations of the Scientific Working Group on DNA Analysis Methods (SWGDAM) was conducted on a multiplex system of 10 Cannabis sativa short tandem repeat loci. Amplification of the loci in four multiplex reactions was tested across DNA from dried root, stem, and leaf sources, and DNA from fresh, frozen, and dried leaf tissue, with a template DNA range of 10.0-0.01 ng. The loci were amplified and scored consistently for all DNA sources when the DNA template was in the range of 10.0-1.0 ng. Some allelic dropout and PCR failure occurred in reactions with lower template DNA amounts. Overall, amplification was best using 10.0 ng of template DNA from dried leaf tissue, indicating that this is the optimal source material. Cross-species amplification was observed in Humulus lupulus for three loci, but there was no allelic overlap. This is the first study following SWGDAM validation guidelines to validate short tandem repeat markers for forensic use in plants.

  428. System and method for identifying, validating, weighing and characterizing moving or stationary vehicles and cargo

    DOEpatents

    Beshears, David L.; Batsell, Stephen G.; Abercrombie, Robert K.; Scudiere, Matthew B.; White, Clifford P.

    2007-12-04

    An asset identification and information infrastructure management (AI3M) device having an automated identification technology (AIT) system, a Transportation Coordinators' Automated Information for Movements System II (TC-AIMS II), a weigh-in-motion system (WIM-II), and an Automated Air Load Planning System (AALPS), all in electronic communication, for measuring and calculating actual asset characteristics, either statically or in motion, and further calculating an actual load plan.

  429. General Open Systems Theory and the Substrata-Factor Theory of Reading.

    ERIC Educational Resources Information Center

    Kling, Martin

    This study was designed to extend the generality of the Substrata-Factor Theory by two methods of investigation: (1) theoretically, to establish the validity of the hypothesis that an isomorphic relationship exists between the Substrata-Factor Theory and the General Open Systems Theory, and (2) experimentally, to discover through a series of…
onclick="trackOutboundLink('https://cfpub.epa.gov/si/si_public_record_report.cfm?dirEntryId=159908&keyword=nucleus&actType=&TIMSType=+&TIMSSubTypeID=&DEID=&epaNumber=&ntisID=&archiveStatus=Both&ombCat=Any&dateBeginCreated=&dateEndCreated=&dateBeginPublishedPresented=&dateEndPublishedPresented=&dateBeginUpdated=&dateEndUpdated=&dateBeginCompleted=&dateEndCompleted=&personID=&role=Any&journalID=&publisherID=&sortBy=revisionDate&count=50','EPA-EIMS'); return false;" href="https://cfpub.epa.gov/si/si_public_record_report.cfm?dirEntryId=159908&keyword=nucleus&actType=&TIMSType=+&TIMSSubTypeID=&DEID=&epaNumber=&ntisID=&archiveStatus=Both&ombCat=Any&dateBeginCreated=&dateEndCreated=&dateBeginPublishedPresented=&dateEndPublishedPresented=&dateBeginUpdated=&dateEndUpdated=&dateBeginCompleted=&dateEndCompleted=&personID=&role=Any&journalID=&publisherID=&sortBy=revisionDate&count=50"><span>ASSESSMENT OF CHEMICAL EFFECTS ON NEURONAL DIFFERENTIATION USING THE ARRAYSCAN HIGH CONTENT SCREENING SYSTEM</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://oaspub.epa.gov/eims/query.page">EPA Science Inventory</a></p> <p></p> <p></p> <p>The development of alternative methods for toxicity testing is driven by the need for scientifically valid data that can be obtained in a rapid and cost-efficient manner. In vitro systems provide a model in which chemical effects on cellular events can be examined using technique...</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.ars.usda.gov/research/publications/publication/?seqNo115=290125','TEKTRAN'); return false;" href="http://www.ars.usda.gov/research/publications/publication/?seqNo115=290125"><span>Pasteurization of strawberry puree using a pilot plant pulsed electric fields (PEF) system</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ars.usda.gov/research/publications/find-a-publication/">USDA-ARS?s Scientific Manuscript database</a></p> <p></p> <p></p> <p>The processing of strawberry puree by pulsed electric fields (PEF) in a pilot plant system has never been evaluated. In addition, a method does not exist to validate the exact number and shape of the pulses applied during PEF processing. Both buffered peptone water (BPW) and fresh strawberry puree (...</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/19802755','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/19802755"><span>A new method for motion capture of the scapula using an optoelectronic tracking device: a feasibility study.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Šenk, Miroslav; Chèze, Laurence</p> <p>2010-06-01</p> <p>Optoelectronic tracking systems are rarely used in 3D studies examining shoulder movements including the scapula. Among the reasons is the important slippage of skin markers with respect to scapula. Methods using electromagnetic tracking devices are validated and frequently applied. Thus, the aim of this study was to develop a new method for in vivo optoelectronic scapular capture dealing with the accepted accuracy issues of validated methods. Eleven arm positions in three anatomical planes were examined using five subjects in static mode. The method was based on local optimisation, and recalculation procedures were made using a set of five scapular surface markers. 
  434. Sum-of-Squares-Based Region of Attraction Analysis for Gain-Scheduled Three-Loop Autopilot

    NASA Astrophysics Data System (ADS)

    Seo, Min-Won; Kwon, Hyuck-Hoon; Choi, Han-Lim

    2018-04-01

    A conventional way of designing a missile autopilot is to linearize the original nonlinear dynamics at several trim points, determine linear controllers for each linearized model, and finally implement a gain-scheduling technique. The validation of such a controller is often based on linear analysis of the linear closed-loop system at the trim conditions. Although this type of gain-scheduled linear autopilot works well in practice, validation based solely on linear analysis may not be sufficient to fully characterize the closed-loop system, especially when the aerodynamic coefficients exhibit substantial nonlinearity with respect to the flight condition. The purpose of this paper is to present a methodology for analyzing the stability of a gain-scheduled controller in a setting close to the original nonlinear one. The method is based on sum-of-squares (SOS) optimization, which can be used to characterize the region of attraction of a polynomial system by solving convex optimization problems. The applicability of the proposed SOS-based methodology is verified on a short-period autopilot of a skid-to-turn missile.
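    An SOS certificate proves that a Lyapunov sub-level set {V <= c} lies inside the region of attraction by showing that Vdot < 0 on it. Without an SOS solver, the quantity being certified can still be illustrated by sampling the level sets of a candidate V, here on the classic reversed-time Van der Pol benchmark (an illustrative stand-in for the autopilot dynamics, not the paper's model):

        import numpy as np

        # Reversed-time Van der Pol: stable origin, ROA bounded by a limit cycle
        f = lambda x: np.array([-x[1], x[0] + (x[0]**2 - 1.0)*x[1]])

        # Candidate quadratic Lyapunov function V(x) = x' P x (solves A'P + PA = -I)
        P = np.array([[1.5, -0.5], [-0.5, 1.0]])
        Vdot = lambda x: 2.0 * (x @ P @ f(x))

        # Along each ray, find where Vdot first turns non-negative; the smallest
        # such level c bounds the sub-level set {V <= c} that sampling supports
        rng = np.random.default_rng(0)
        theta = rng.uniform(0.0, 2.0*np.pi, 2000)
        c_best = np.inf
        for d in np.column_stack([np.cos(theta), np.sin(theta)]):
            r1 = 1.0 / np.sqrt(d @ P @ d)          # point with V = 1 on this ray
            for s in np.linspace(0.05, 3.0, 200):  # V(s * r1 * d) = s**2
                if Vdot(s * r1 * d) >= 0.0:
                    c_best = min(c_best, s**2)
                    break
        print("sampled estimate of certifiable level c:", c_best)

    An SOS program replaces this sampling with a polynomial multiplier certificate, so the resulting level c is a guarantee rather than an estimate.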
  435. HPTLC Method for the Determination of Paracetamol, Pseudoephedrine and Loratidine in Tablets and Human Plasma.

    PubMed

    Farid, Nehal Fayez; Abdelaleem, Eglal A

    2016-04-01

    A sensitive, accurate and selective high performance thin layer chromatography (HPTLC) method was developed and validated for the simultaneous determination of paracetamol (PAR), its toxic impurity 4-aminophenol (4-AP), pseudoephedrine HCl (PSH) and loratidine (LOR). The proposed chromatographic method was developed using HPTLC aluminum plates precoated with silica gel 60 F254, with acetone-hexane-ammonia (4:5:0.1, by volume) as the developing system, followed by densitometric measurement at 254 nm for PAR, 4-AP and LOR, while PSH was scanned at 208 nm. System suitability testing parameters were calculated to ascertain the quality performance of the developed chromatographic method. The method was validated with respect to USP guidelines regarding accuracy, precision and specificity, and was successfully applied to the determination of PAR, PSH and LOR in ATSHI(®) tablets. The three drugs were also determined in plasma by applying the proposed method in the ranges of 0.5-6 µg/band, 1.6-12 µg/band and 0.4-2 µg/band for PAR, PSH and LOR, respectively. The results obtained by the proposed method were compared with those obtained by a reported HPLC method, and there was no significant difference between the two methods regarding accuracy and precision.

  436. Dynamical mechanical characteristic simulation and analysis of the low voltage switch under vibration and shock conditions

    NASA Astrophysics Data System (ADS)

    Miao, Xiaodan; Han, Feng

    2017-04-01

    The low voltage switch is widely applied, especially in hostile environments with large vibration and shock loads. To ensure the validity of the switch in such environments, it is necessary to predict its mechanical characteristics. In the traditional approach, a complex and expensive testing system is built to verify validity. This paper presents a method based on finite element analysis to predict the dynamic mechanical characteristics of the switch using the ANSYS software. The simulation can provide a basis for the design and optimization of the switch, shortening the design process and improving product efficiency.
  437. Analysis and optimization of hybrid excitation permanent magnet synchronous generator for stand-alone power system

    NASA Astrophysics Data System (ADS)

    Wang, Huijun; Qu, Zheng; Tang, Shaofei; Pang, Mingqi; Zhang, Mingju

    2017-08-01

    In this paper, the electromagnetic design and permanent magnet shape optimization of a permanent magnet synchronous generator with hybrid excitation are investigated. Based on the generator structure and principle, a design outline is presented for obtaining high efficiency and low voltage fluctuation. In order to realize rapid design, equivalent magnetic circuits for the permanent magnet and iron poles are developed, and finite element analysis is employed. Furthermore, by means of the design of experiments (DOE) method, the permanent magnet is optimized to reduce voltage waveform distortion. Finally, the proposed design methods are validated by analytical and experimental results.

  438. Evolving forecasting classifications and applications in health forecasting

    PubMed Central

    Soyiri, Ireneous N; Reidpath, Daniel D

    2012-01-01

    Health forecasting forewarns the health community about future health situations and disease episodes so that health systems can better allocate resources and manage demand. The tools used for developing and measuring the accuracy and validity of health forecasts commonly are not defined, although they are usually adapted forms of statistical procedures. This review identifies previous typologies used in classifying the forecasting methods commonly used in forecasting health conditions or situations. It then discusses the strengths and weaknesses of these methods and presents the choices available for measuring the accuracy of health-forecasting models, including a note on the discrepancies in the modes of validation. PMID:22615533

  439. Adaptive model reduction for continuous systems via recursive rational interpolation

    NASA Technical Reports Server (NTRS)

    Lilly, John H.

    1994-01-01

    A method for adaptive identification of reduced-order models for continuous stable SISO and MIMO plants is presented. The method recursively finds a model whose transfer function (matrix) matches that of the plant on a set of frequencies chosen by the designer. The algorithm utilizes the Moving Discrete Fourier Transform (MDFT) to continuously monitor the frequency-domain profile of the system input and output signals. The MDFT is an efficient method of monitoring discrete points in the frequency domain of an evolving function of time. The model parameters are estimated from MDFT data using standard recursive parameter estimation techniques. The algorithm has been shown in simulations to be quite robust to additive noise in the inputs and outputs. A significant advantage of the method is that it enables a type of on-line model validation. This is accomplished by simultaneously identifying a number of models and comparing each with the plant in the frequency domain. Simulations of the method applied to an 8th-order SISO plant and a 10-state 2-input 2-output plant are presented. An example of on-line model validation applied to the SISO plant is also presented.
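    The moving (sliding) DFT that the method relies on updates each tracked bin in O(1) per sample via the recurrence X_k(n) = (X_k(n-1) + x(n) - x(n-N)) * exp(j*2*pi*k/N). A small self-contained sketch; the window length and bin indices are arbitrary choices, not those of the paper:

        import numpy as np

        class MovingDFT:
            # Recursively track selected DFT bins of the last N samples:
            # X_k(n) = (X_k(n-1) + x(n) - x(n-N)) * exp(j*2*pi*k/N)
            def __init__(self, N, bins):
                self.N, self.bins = N, np.asarray(bins)
                self.twiddle = np.exp(2j*np.pi*self.bins/N)
                self.X = np.zeros(len(bins), dtype=complex)
                self.buf = np.zeros(N)
                self.i = 0

            def update(self, x):
                old, self.buf[self.i] = self.buf[self.i], x
                self.i = (self.i + 1) % self.N
                self.X = (self.X + x - old) * self.twiddle
                return self.X

        # Track bins 3 and 7 of a 64-point window while a signal streams in
        N = 64
        mdft = MovingDFT(N, bins=[3, 7])
        t = np.arange(512)
        sig = np.sin(2*np.pi*3*t/N) + 0.5*np.sin(2*np.pi*7*t/N)
        for s in sig:
            X = mdft.update(s)
        print(np.abs(X))   # ~[N/2, N/4]: amplitudes of the two tracked lines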
  440. Nonlinear modelling of high-speed catenary based on analytical expressions of cable and truss elements

    NASA Astrophysics Data System (ADS)

    Song, Yang; Liu, Zhigang; Wang, Hongrui; Lu, Xiaobing; Zhang, Jing

    2015-10-01

    Due to the intrinsic nonlinear characteristics and complex structure of the high-speed catenary system, a modelling method is proposed based on the analytical expressions of nonlinear cable and truss elements. A calculation procedure for solving the initial equilibrium state is proposed based on the Newton-Raphson iteration method; the deformed configuration of the catenary system, as well as the initial length of each wire, can be calculated. The accuracy and validity of the computed initial equilibrium state are verified by comparison with the separate model method, the absolute nodal coordinate formulation, and other methods from the previous literature. The proposed model is then combined with a lumped pantograph model, and a dynamic simulation procedure is proposed whose accuracy is guaranteed by multiple iterative calculations in each time step. The dynamic performance of the proposed model is validated by comparison with EN 50318, the results of finite element method software, and a SIEMENS simulation report. Finally, the influence of the catenary design parameters (such as the reserved sag and pre-tension) on the dynamic performance is preliminarily analysed using the proposed model.
target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Nakae, Ken; Ikegaya, Yuji; Ishikawa, Tomoe; Oba, Shigeyuki; Urakubo, Hidetoshi; Koyama, Masanori; Ishii, Shin</p> <p>2014-01-01</p> <p>Crosstalk between neurons and glia may constitute a significant part of information processing in the brain. We present a novel method of statistically identifying interactions in a neuron–glia network. We attempted to identify neuron–glia interactions from neuronal and glial activities via maximum-a-posteriori (MAP)-based parameter estimation by developing a generalized linear model (GLM) of a neuron–glia network. The interactions in our interest included functional connectivity and response functions. We evaluated the cross-validated likelihood of GLMs that resulted from the addition or removal of connections to confirm the existence of specific neuron-to-glia or glia-to-neuron connections. We only accepted addition or removal when the modification improved the cross-validated likelihood. We applied the method to a high-throughput, multicellular in vitro Ca2+ imaging dataset obtained from the CA3 region of a rat hippocampus, and then evaluated the reliability of connectivity estimates using a statistical test based on a surrogate method. Our findings based on the estimated connectivity were in good agreement with currently available physiological knowledge, suggesting our method can elucidate undiscovered functions of neuron–glia systems. PMID:25393874</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20090012067','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20090012067"><span>IEEE/NASA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Margaria, Tiziana (Editor); Steffen, Bernhard (Editor); Hichey, Michael G.</p> <p>2005-01-01</p> <p>This volume contains the Preliminary Proceedings of the 2005 IEEE ISoLA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation, with a special track on the theme of Formal Methods in Human and Robotic Space Exploration. The workshop was held on 23-24 September 2005 at the Loyola College Graduate Center, Columbia, MD, USA. The idea behind the Workshop arose from the experience and feedback of ISoLA 2004, the 1st International Symposium on Leveraging Applications of Formal Methods held in Paphos (Cyprus) last October-November. 
  442. IEEE/NASA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation

    NASA Technical Reports Server (NTRS)

    Margaria, Tiziana (Editor); Steffen, Bernhard (Editor); Hichey, Michael G.

    2005-01-01

    This volume contains the preliminary proceedings of the 2005 IEEE ISoLA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation, with a special track on the theme of formal methods in human and robotic space exploration. The workshop was held on 23-24 September 2005 at the Loyola College Graduate Center, Columbia, MD, USA. The idea behind the workshop arose from the experience and feedback of ISoLA 2004, the 1st International Symposium on Leveraging Applications of Formal Methods, held in Paphos (Cyprus) the previous October-November. ISoLA 2004 served the need for a forum where developers, users, and researchers can discuss issues related to the adoption and use of rigorous tools and methods for the specification, analysis, verification, certification, construction, test, and maintenance of systems, from the point of view of their different application domains.

  443. Strategies for the screening of antibiotic residues in eggs: comparison of the validation of the classical microbiological method with an immunobiosensor method.

    PubMed

    Gaudin, Valérie; Rault, Annie; Hedou, Celine; Soumet, Christophe; Verdon, Eric

    2017-09-01

    Efficient screening methods are needed to control antibiotic residues in eggs. A microbiological kit (the Explorer® 2.0 test; Zeu Inmunotech, Spain) and an immunobiosensor kit (Microarray II (AM® II) on the Evidence Investigator™ system; Randox, UK) were evaluated and validated for the screening of antibiotic residues in eggs, according to European decision EC/2002/657 and to the European guideline for the validation of screening methods. The e-reader™ system, a new automatic incubator/reading system, was coupled to the Explorer 2.0 test. The AM II kit can detect residues of six different families of antibiotics in different matrices, including eggs. For both tests, a different liquid/liquid extraction of eggs had to be developed. The specificities of the Explorer 2.0 and AM II kits were 8% and 0%, respectively. Detection capabilities were determined for 19 antibiotics, with representatives from different families, for Explorer 2.0, and for 12 antibiotics for the AM II kit. For the nine antibiotics having a maximum residue limit (MRL) in eggs, the detection capabilities (CCβ) of Explorer 2.0 were below the MRL for four antibiotics, equal to the MRL for two antibiotics, and between 1 and 1.5 MRLs for the three remaining antibiotics (tetracyclines). For the antibiotics from other families, the detection capabilities were low for beta-lactams and sulfonamides, and satisfactory for dihydrostreptomycin (DHS) and fluoroquinolones, which are usually difficult to detect with microbiological tests. The CCβ values of the AM II kit were much lower than the respective MRLs for three detected antibiotics (tetracycline, oxytetracycline, tylosin). For the nine other antibiotics, the detection capabilities determined were low. The highest CCβ was obtained for streptomycin (100 µg/kg).
  444. Research on simulated infrared image utility evaluation using deep representation

    NASA Astrophysics Data System (ADS)

    Zhang, Ruiheng; Mu, Chengpo; Yang, Yu; Xu, Lixin

    2018-01-01

    Infrared (IR) image simulation is an important data source for various target recognition systems. However, whether simulated IR images can be used as training data for classifiers depends on their fidelity and authenticity. For the evaluation of IR image features, a deep-representation-based algorithm is proposed. Unlike conventional methods, which usually adopt a priori knowledge or manually designed features, the proposed method can extract essential features and quantitatively evaluate the utility of simulated IR images. First, for data preparation, we employ our IR image simulation system to generate large amounts of IR images. Then, we present the evaluation model for simulated IR images, for which an end-to-end IR feature extraction and target detection model based on a deep convolutional neural network is designed. Finally, the experiments illustrate that the proposed method outperforms other verification algorithms in evaluating simulated IR images. Cross-validation, variable-proportion mixed data validation, and simulation process contrast experiments are carried out to evaluate the utility and objectivity of the images generated by our simulation system. The optimum mixing ratio between simulated and real data is 0.2 ≤ γ ≤ 0.3, which makes this an effective data augmentation method for real IR images.
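    The mixing-ratio experiment can be mirrored with a few lines of array code. The sketch below assumes γ is the fraction of simulated images added relative to the real set (the abstract does not spell out the exact definition) and uses placeholder tensors instead of IR frames:

        import numpy as np

        def mix_training_set(real, simulated, gamma, seed=0):
            # Augment real images with simulated ones at ratio gamma = n_sim / n_real
            # (assumed reading of the abstract's mixing ratio; 0.2-0.3 reported best)
            rng = np.random.default_rng(seed)
            n_sim = int(round(gamma * len(real)))
            picks = rng.choice(len(simulated), size=min(n_sim, len(simulated)), replace=False)
            return rng.permutation(np.concatenate([real, simulated[picks]]))

        real = np.zeros((800, 64, 64))    # placeholder for real IR frames
        sim = np.ones((400, 64, 64))      # placeholder for simulated IR frames
        train = mix_training_set(real, sim, gamma=0.25)
        print(train.shape)                 # (1000, 64, 64): 200 simulated frames added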
  445. L(sub 1) Adaptive Flight Control System: Flight Evaluation and Technology Transition

    NASA Technical Reports Server (NTRS)

    Xargay, Enric; Hovakimyan, Naira; Dobrokhodov, Vladimir; Kaminer, Isaac; Gregory, Irene M.; Cao, Chengyu

    2010-01-01

    Certification of adaptive control technologies for both manned and unmanned aircraft represents a major challenge for current verification and validation techniques. A (missing) key step towards flight certification of adaptive flight control systems is the definition and development of analysis tools and methods to support verification and validation for nonlinear systems, similar to the procedures currently used for linear systems. In this paper, we describe and demonstrate the advantages of L(sub 1) adaptive control architectures for closing some of the gaps in certification of adaptive flight control systems, which may facilitate the transition of adaptive control into military and commercial aerospace applications. As illustrative examples, we present the results of a piloted simulation evaluation on the NASA AirSTAR flight test vehicle, and results of an extensive flight test program conducted by the Naval Postgraduate School to demonstrate the advantages of L(sub 1) adaptive control as a verifiable robust adaptive flight control system.

  446. Validity and reliability of Patient-Reported Outcomes Measurement Information System (PROMIS) Instruments in Osteoarthritis

    PubMed Central

    Broderick, Joan E.; Schneider, Stefan; Junghaenel, Doerte U.; Schwartz, Joseph E.; Stone, Arthur A.

    2013-01-01

    Objective: Evaluation of the known-group validity, ecological validity, and test-retest reliability of four domain instruments from the Patient-Reported Outcomes Measurement Information System (PROMIS) in osteoarthritis (OA) patients. Methods: An osteoarthritis sample and a comparison general population (GP) sample were recruited through an Internet survey panel. Pain intensity, pain interference, physical functioning, and fatigue were assessed daily for 4 consecutive weeks with PROMIS short forms and compared with same-domain Computer Adaptive Test (CAT) instruments that use a 7-day recall. Known-group validity (comparison of OA and GP), ecological validity (comparison of aggregated daily measures with CATs), and test-retest reliability were evaluated. Results: The recruited samples matched the demographic characteristics (age, sex, race, ethnicity) of the U.S. sample for arthritis and the 2009 Census for the GP. Compliance with repeated measurements was excellent: > 95%. Known-group validity for CATs was demonstrated with large effect sizes (pain intensity: 1.42, pain interference: 1.25, and fatigue: 0.85). Ecological validity was also established through high correlations between aggregated daily measures and weekly CATs (≥ 0.86). Test-retest reliability (7-day) was very good (≥ 0.80). Conclusion: PROMIS CAT instruments demonstrated known-group and ecological validity in a comparison of osteoarthritis patients with a general population sample. Adequate test-retest reliability was also observed. These data provide encouraging initial evidence of the utility of these PROMIS instruments for clinical and research outcomes in osteoarthritis patients. PMID:23592494

  447. Mathematic models for a ray tracing method and its applications in wireless optical communications.

    PubMed

    Zhang, Minglun; Zhang, Yangan; Yuan, Xueguang; Zhang, Jinnan

    2010-08-16

    This paper presents a new ray tracing method, comprising a complete set of mathematical models, whose validity is verified by simulations. In addition, both theoretical analysis and simulation results show that the computational complexity of the method is much lower than that of previous ones. Therefore, the method can be used to rapidly calculate the impulse response of wireless optical channels for complicated systems.
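    Ray tracing for wireless optical channels ultimately accumulates gain/delay contributions of individual rays. Its simplest special case, the direct line-of-sight term of a Lambertian source, is easy to write down; the sketch uses standard textbook geometry and illustrative parameters, not the paper's models:

        import numpy as np

        C = 3e8   # speed of light, m/s

        def los_impulse(tx, n_tx, rx, n_rx, area=1e-4, m=1, fov=np.deg2rad(85)):
            # Line-of-sight gain and delay from a Lambertian source of order m
            # to a photodetector of area `area` (classic LOS channel term)
            v = rx - tx
            d = np.linalg.norm(v)
            cos_phi = np.dot(n_tx, v) / d       # emission angle at the source
            cos_psi = np.dot(n_rx, -v) / d      # incidence angle at the detector
            if cos_phi <= 0 or cos_psi <= 0 or np.arccos(cos_psi) > fov:
                return 0.0, d / C               # outside the field of view
            h = (m + 1) * area * cos_phi**m * cos_psi / (2 * np.pi * d**2)
            return h, d / C

        # LED on the ceiling pointing down, detector on a desk pointing up
        h, tau = los_impulse(tx=np.array([2.0, 2.0, 3.0]), n_tx=np.array([0, 0, -1.0]),
                             rx=np.array([1.5, 1.0, 0.8]), n_rx=np.array([0, 0, 1.0]))
        print(f"gain = {h:.3e}, delay = {tau*1e9:.2f} ns")

    A full ray tracer repeats this bookkeeping over many reflected ray paths and bins the arrivals in time to build the channel impulse response.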
  448. Inertial Measurement Units for Clinical Movement Analysis: Reliability and Concurrent Validity

    PubMed Central

    Nicholas, Kevin; Sparkes, Valerie; Sheeran, Liba; Davies, Jennifer L

    2018-01-01

    The aim of this study was to investigate the reliability and concurrent validity of a commercially available Xsens MVN BIOMECH inertial-sensor-based motion capture system during clinically relevant functional activities. A clinician with no prior experience of motion capture technologies and an experienced clinical movement scientist each assessed 26 healthy participants, within each of two sessions, using a camera-based motion capture system and the MVN BIOMECH system. Participants performed overground walking, squatting, and jumping. Sessions were separated by 4 ± 3 days. Reliability was evaluated using the intraclass correlation coefficient and the standard error of measurement, and validity was evaluated using the coefficient of multiple correlation and the linear fit method. Day-to-day reliability was generally fair-to-excellent in all three planes for hip, knee, and ankle joint angles in all three tasks. Within-day (between-rater) reliability was fair-to-excellent in all three planes during walking and squatting, and poor-to-high during jumping. Validity was excellent in the sagittal plane for hip, knee, and ankle joint angles in all three tasks, and acceptable in the frontal and transverse planes in the squat and jump activities across joints. Our results suggest that the MVN BIOMECH system can be used by a clinician to quantify lower-limb joint angles in clinically relevant movements. PMID:29495600
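    The day-to-day reliability reported here rests on the intraclass correlation coefficient. A minimal sketch of ICC(2,1) (two-way random effects, absolute agreement, single measurement, following Shrout and Fleiss) on synthetic two-session data; the numbers are invented for illustration:

        import numpy as np

        def icc_2_1(Y):
            # ICC(2,1) from the classic two-way ANOVA mean squares
            n, k = Y.shape                  # subjects x sessions (or raters)
            grand = Y.mean()
            MSR = k * ((Y.mean(1) - grand)**2).sum() / (n - 1)    # subjects
            MSC = n * ((Y.mean(0) - grand)**2).sum() / (k - 1)    # sessions
            SSE = ((Y - Y.mean(1, keepdims=True)
                      - Y.mean(0, keepdims=True) + grand)**2).sum()
            MSE = SSE / ((n - 1) * (k - 1))
            return (MSR - MSE) / (MSR + (k - 1)*MSE + k*(MSC - MSE)/n)

        # 26 participants measured in two sessions (e.g., peak knee flexion, deg)
        rng = np.random.default_rng(0)
        true = rng.normal(60, 8, 26)
        Y = np.column_stack([true + rng.normal(0, 2, 26),
                             true + rng.normal(0, 2, 26)])
        print(round(icc_2_1(Y), 3))   # high value -> good day-to-day reliability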
  449. Software Dependability and Safety Evaluations: ESA's Initiative

    NASA Astrophysics Data System (ADS)

    Hernek, M.

    ESA has allocated funds for an initiative to evaluate dependability and safety methods for software. The objectives of this initiative are more extensive validation of safety and dependability techniques for software, and the provision of valuable results to improve software quality, thus promoting the application of dependability and safety methods and techniques. ESA space systems are developed according to defined product assurance (PA) requirement specifications. These requirements may be implemented through various design concepts, e.g. redundancy and diversity, varying from project to project. Analysis methods (FMECA, FTA, HA, etc.) are frequently used during requirements analysis and design activities to assure the correct implementation of system PA requirements. The criticality level of failures, functions and systems is determined, thereby identifying the critical sub-systems to which dependability and safety techniques are to be applied during development. Proper software development requires a technical specification for the products at the beginning of the life cycle, comprising both functional and non-functional requirements; the non-functional requirements address characteristics of the product such as quality, dependability, safety and maintainability. Software in space systems is increasingly used in critical functions, and the trend towards more frequent use of COTS and reusable components poses new difficulties in assuring reliable and safe systems, so software dependability and safety must be carefully analysed. ESA has identified and documented techniques, methods and procedures to ensure that software dependability and safety requirements are specified and taken into account during the design and development of a software system, and to verify/validate that the implemented software systems comply with these requirements [R1].

  450. Construction of the descriptive system for the assessment of quality of life AQoL-6D utility instrument

    PubMed Central

    2012-01-01

    Background: Multi-attribute utility (MAU) instruments are used to include health-related quality of life (HRQoL) in economic evaluations of health programs. Comparative studies suggest different MAU instruments measure related but different constructs. The objective of this paper is to describe the methods employed to achieve content validity in the descriptive system of the Assessment of Quality of Life (AQoL)-6D MAU instrument. Methods: The AQoL program introduced the use of psychometric methods in the construction of health-related MAU instruments. To develop the AQoL-6D, we selected 112 items from previous research, focus groups and expert judgment and administered them to 316 members of the public and 302 hospital patients. The search for content validity across a broad spectrum of health states required both formative and reflective modelling; we employed Exploratory Factor Analysis and Structural Equation Modelling (SEM) to meet these dual requirements. Results and Discussion: The resulting instrument employs 20 items in a multi-tier descriptive system. Latent dimension variables achieve sensitive descriptions of 6 dimensions which, in turn, combine to form a single latent QoL variable. Diagnostic statistics from the SEM analysis are exceptionally good and confirm the hypothesised structure of the model. Conclusions: The AQoL-6D descriptive system has good psychometric properties, implying that the instrument has achieved construct validity and provides a sensitive description of HRQoL. It may therefore be used with confidence for measuring health-related quality of life, and it is a suitable basis for modelling utilities for inclusion in the economic evaluation of health programs. PMID:22507254
  451. Status of acute systemic toxicity testing requirements and data uses by U.S. regulatory agencies.

    PubMed

    Strickland, Judy; Clippinger, Amy J; Brown, Jeffrey; Allen, David; Jacobs, Abigail; Matheson, Joanna; Lowit, Anna; Reinke, Emily N; Johnson, Mark S; Quinn, Michael J; Mattie, David; Fitzpatrick, Suzanne C; Ahir, Surender; Kleinstreuer, Nicole; Casey, Warren

    2018-04-01

    Acute systemic toxicity data are used by a number of U.S. federal agencies, most commonly for hazard classification and labeling and/or risk assessment for acute chemical exposures. To identify opportunities for the implementation of non-animal approaches to produce these data, the regulatory needs and uses for acute systemic toxicity information must first be clarified. Thus, we reviewed acute systemic toxicity testing requirements for six U.S. agencies (Consumer Product Safety Commission, Department of Defense, Department of Transportation, Environmental Protection Agency, Food and Drug Administration, Occupational Safety and Health Administration) and noted whether there is flexibility in satisfying data needs with methods that replace or reduce animal use. Understanding the current regulatory use and acceptance of non-animal data is a necessary starting point for future method development, optimization, and validation efforts. The current review will inform the development of a national strategy and roadmap for implementing non-animal approaches to assess potential hazards associated with acute exposures to industrial chemicals and medical products. The Acute Toxicity Workgroup of the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM), U.S. agencies, non-governmental organizations, and other stakeholders will work to execute this strategy.

  452. 1D-3D coupling for hydraulic system transient simulations

    NASA Astrophysics Data System (ADS)

    Wang, Chao; Nilsson, Håkan; Yang, Jiandong; Petit, Olivier

    2017-01-01

    This work describes a coupling between the 1D method of characteristics (MOC) and the 3D finite volume method of computational fluid dynamics (CFD). The coupling method is applied to compressible flow in hydraulic systems. The MOC code is implemented as a set of boundary conditions in the OpenFOAM open source CFD software. The coupling is realized by two linear equations originating from the characteristics equation and the Riemann constant equation, respectively. The coupling method is validated using three simple water hammer cases and several coupling configurations. The accuracy and robustness are investigated with respect to the mesh size ratio across the interface and 3D flow features close to the interface. The method is finally applied to the transient flow caused by the closing and opening of a knife valve (gate) in a pipe, where the flow is driven by the difference in free surface elevation between two tanks. A small region surrounding the moving gate is resolved by CFD, using a dynamic mesh library, while the rest of the system is modeled by MOC. Minor losses are included in the 1D region, corresponding to the contraction of the flow from the upstream tank into the pipe, a separate stationary flow regulation valve, and a pipe bend. The results are validated with experimental data. A 1D solution is provided for comparison, using the static gate characteristics obtained from steady-state CFD simulations.
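    The 1D side of such a coupling is the classical method of characteristics for water hammer: along the C+ and C- characteristics, head and flow satisfy two linear relations that are solved at every node each time step. A frictionless single-pipe sketch (reservoir upstream, instantaneous valve closure downstream; all parameters are illustrative, not the paper's test rig):

        import numpy as np

        # Method of characteristics, single pipe: upstream reservoir (fixed head),
        # instantaneous valve closure downstream. Friction neglected for clarity.
        a, g = 1200.0, 9.81                 # wave speed (m/s), gravity (m/s^2)
        L, D = 600.0, 0.5                   # pipe length (m), diameter (m)
        A = np.pi * D**2 / 4
        n = 20                              # computational reaches
        dt = (L / n) / a                    # Courant number = 1
        B = a / (g * A)                     # characteristic impedance

        H = np.full(n + 1, 100.0)           # initial head (m)
        Q = np.full(n + 1, 0.2)             # initial flow (m^3/s)
        for _ in range(350):
            Cp = H[:-1] + B * Q[:-1]        # C+ invariants arriving from the left
            Cm = H[1:] - B * Q[1:]          # C- invariants arriving from the right
            Hn, Qn = np.empty_like(H), np.empty_like(Q)
            Hn[1:-1] = 0.5 * (Cp[:-1] + Cm[1:])         # interior nodes
            Qn[1:-1] = (Cp[:-1] - Cm[1:]) / (2 * B)
            Hn[0], Qn[0] = 100.0, (100.0 - Cm[0]) / B   # reservoir boundary
            Qn[-1], Hn[-1] = 0.0, Cp[-1]                # closed-valve boundary
            H, Q = Hn, Qn
        print(H[-1] - 100.0)   # head rise at the valve ~ Joukowsky a*V/g ~ +125 m

    In the paper's scheme, one end of such a 1D pipe is replaced by the CFD interface, where the same two characteristic relations close the coupled system.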
  453. Moving base simulation of an integrated flight and propulsion control system for an ejector-augmentor STOVL aircraft in hover

    NASA Technical Reports Server (NTRS)

    McNeill, Walter E.; Chung, William W.; Stortz, Michael W.

    1995-01-01

    A piloted motion simulator evaluation, using the NASA Ames Vertical Motion Simulator, was conducted in support of a NASA Lewis contractual study of the integration of flight and propulsion systems of a STOVL aircraft. The objectives of the study were to validate the Design Methods for Integrated Control Systems (DMICS) concept, to evaluate the handling qualities, and to assess control power usage. The E-7D ejector-augmentor STOVL fighter design served as the basis for the simulation. Handling-qualities ratings were obtained during precision hover and shipboard landing tasks; ratings for these tasks ranged from satisfactory to adequate. Further improvement of the design process to fully validate the DMICS concept appears to be warranted.

  454. A new 4π(LS)-γ coincidence counter at NCBJ RC POLATOM with TDCR detector in the beta channel.

    PubMed

    Ziemek, T; Jęczmieniowski, A; Cacko, D; Broda, R; Lech, E

    2016-03-01

    A new 4π(LS)-γ coincidence system (TDCRG) was built at the NCBJ RC POLATOM. The counter consists of a TDCR detector in the beta channel and a scintillation detector with a NaI(Tl) crystal in the gamma channel. The system is equipped with a digital board with an FPGA, which records and analyses coincidences in the TDCR detector and coincidences between the beta and gamma channels. The characteristics of the system and a scheme of the FPGA implementation with behavioral simulation are given. The TDCRG counter was validated by activity measurements on (14)C and (60)Co solutions standardized at RC POLATOM using previously validated methods.
  455. Servo-hydraulic actuator in controllable canonical form: Identification and experimental validation

    NASA Astrophysics Data System (ADS)

    Maghareh, Amin; Silva, Christian E.; Dyke, Shirley J.

    2018-02-01

    Hydraulic actuators have been widely used to experimentally examine structural behavior at multiple scales. Real-time hybrid simulation (RTHS) is one innovative testing method that relies heavily on such servo-hydraulic actuators. In RTHS, interface conditions must be enforced in real time, and controllers are often used to achieve tracking of the desired displacements. Neglecting the dynamics of the hydraulic transfer system may therefore result in system instability or sub-optimal performance. Herein, we propose a nonlinear dynamical model for a servo-hydraulic actuator (a.k.a. hydraulic transfer system) coupled with a nonlinear physical specimen. The nonlinear dynamical model is transformed into controllable canonical form for further tracking control design purposes. The controllable canonical model is validated through a number of experiments.
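    For a linear SISO transfer function, the controllable (phase-variable) canonical form can be built directly from the polynomial coefficients. A sketch with a generic third-order, hydraulic-actuator-like transfer function; the parameter values are invented, and the paper's model is nonlinear rather than this linear special case:

        import numpy as np

        def controllable_canonical(num, den):
            # State-space (A, B, C, D) in controllable canonical form for a SISO
            # transfer function num(s)/den(s), with deg(num) < deg(den) = n
            num, den = np.asarray(num, float), np.asarray(den, float)
            den = den / den[0]                   # normalize to monic
            n = len(den) - 1
            A = np.zeros((n, n))
            A[:-1, 1:] = np.eye(n - 1)           # integrator chain structure
            A[-1, :] = -den[::-1][:-1]           # last row: -a0 ... -a_{n-1}
            B = np.zeros((n, 1)); B[-1, 0] = 1.0
            C = np.zeros((1, n))
            C[0, :len(num)] = num[::-1]          # numerator, lowest order first
            return A, B, C, np.zeros((1, 1))

        # Third-order model G(s) = K / (s * (s^2 + 2*z*w*s + w^2))
        K, w, z = 2.0e4, 80.0, 0.3
        A, B, C, D = controllable_canonical([K], [1.0, 2*z*w, w**2, 0.0])
        print(np.linalg.eigvals(A))   # poles at 0 and -z*w +/- j*w*sqrt(1-z^2)

    In this form the highest state derivative depends linearly on the states and the input, which is what makes the structure convenient for the tracking-control designs the paper targets.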
  456. Bad data packet capture device

    DOEpatents

    Chen, Dong; Gara, Alan; Heidelberger, Philip; Vranas, Pavlos

    2010-04-20

    An apparatus and method for capturing data packets for analysis on a network computing system includes a sending node and a receiving node connected by a bi-directional communication link. The sending node sends a data transmission to the receiving node on the bi-directional communication link, and the receiving node receives the data transmission, verifies it to distinguish valid data from invalid data, and verifies retransmissions of invalid data as corresponding valid data. A memory device communicates with the receiving node to store the invalid data and the corresponding valid data. A computing node communicates with the memory device, receiving and analysing the invalid data and the corresponding valid data from the memory device.

  457. Multiyear Plan for Validation of EnergyPlus Multi-Zone HVAC System Modeling using ORNL's Flexible Research Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Im, Piljae; Bhandari, Mahabir S.; New, Joshua Ryan

    This document describes the Oak Ridge National Laboratory (ORNL) multiyear experimental plan for validation and uncertainty characterization of whole-building energy simulation for a multi-zone research facility using a traditional rooftop unit (RTU) as a baseline heating, ventilating, and air conditioning (HVAC) system. The project's overarching objective is to increase the accuracy of energy simulation tools by enabling empirical validation of key inputs and algorithms. Doing so is required to inform the design of increasingly integrated building systems and to enable accountability for performance gaps between the design and operation of a building. The project will produce documented data sets that can be used to validate key functionality in different energy simulation tools and to identify errors and inadequate assumptions in simulation engines so that developers can correct them. ASHRAE Standard 140, Method of Test for the Evaluation of Building Energy Analysis Computer Programs (ASHRAE 2004), currently consists primarily of tests that compare different simulation programs with one another. This project will generate sets of measured data to enable empirical validation, incorporate these test data sets in an extended version of Standard 140, and apply these tests to the Department of Energy's (DOE) EnergyPlus software (EnergyPlus 2016) to initiate the correction of any significant deficiencies. The fitness-for-purpose of the key algorithms in EnergyPlus will be established and demonstrated, and vendors of other simulation programs will be able to demonstrate the validity of their products. The data set will be equally applicable to the validation of other simulation engines.

  458. A Survey of Formal Methods for Intelligent Swarms

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt; Rash, James; Hinchey, Mike; Rouff, Christopher A.

    2004-01-01

    Swarms of intelligent autonomous spacecraft, involving complex behaviors and interactions, are being proposed for future space exploration missions. Such missions provide greater flexibility and offer the possibility of gathering more science data than traditional single-spacecraft missions. The emergent properties of swarms make these missions powerful, but simultaneously far more difficult to design and to assure that the proper behaviors will emerge.
    These missions are also considerably more complex than previous types of missions, and NASA, like other organizations, has little experience in developing, verifying, and validating them. A significant challenge when verifying and validating swarms of intelligent interacting agents is determining that the possibly exponential interactions and emergent behaviors are producing the desired results. Assuring correct behavior and interactions of swarms will be critical to mission success. The Autonomous Nano Technology Swarm (ANTS) mission is an example of the swarm type of mission NASA is considering. The ANTS mission will use a swarm of picospacecraft that will fly from Earth orbit to the Asteroid Belt. Using an insect colony analogy, ANTS will be composed of specialized workers for asteroid exploration. Exploration would consist of cataloguing the mass, density, morphology, and chemical composition of the asteroids, including any anomalous concentrations of specific minerals. To perform this task, ANTS would carry miniaturized instruments, such as imagers, spectrometers, and detectors. Since ANTS and other similar missions will consist of autonomous spacecraft that may be out of contact with the Earth for extended periods of time, and have low bandwidths due to weight constraints, it will be difficult to observe improper behavior and to correct any errors after launch. Providing V&V (verification and validation) for this type of mission is new to NASA; it represents the cutting edge in system correctness and requires higher levels of assurance than traditional missions that use a single spacecraft or a small number of spacecraft that are deterministic in nature and have near-continuous communication access. One of the highest possible levels of assurance comes from the application of formal methods. Formal methods are mathematics-based tools and techniques for specifying and verifying (software and hardware) systems. They are particularly useful for specifying complex parallel systems, such as the ANTS mission, where the entire system is difficult for a single person to fully understand, a problem that is multiplied with multiple developers. Once written, a formal specification can be used to prove properties of a system (e.g., that the underlying system will go from one state to another, or will not enter a specific state) and to check for particular types of errors (e.g., race or livelock conditions). A formal specification can also be used as input to a model checker for further validation. This report gives the results of a survey of formal methods techniques for the verification and validation of space missions that use swarm technology. Multiple formal methods were evaluated to determine their effectiveness in modeling and assuring the behavior of swarms of spacecraft, using the ANTS mission as an example system. This report is the first result of the project to determine formal approaches that are promising for formally specifying swarm-based systems. From this survey, the most promising approaches were selected and are discussed relative to their possible application to the ANTS mission.
459. Development of Theoretical Foundations for Description and Analysis of Discrete Information Systems

    DTIC Science & Technology

    1975-05-07

    [Only fragments of the abstract survive in the source:] ... on work of M. Hack, M.W. Marean, J.M. Myers, and P.M. Shapiro. What is presented is an introduction to a body of methods related to ... Program FrPPU910, which creates the Account Validation file (FFPFDS20) from input cards without contacting any other files in the system ... FFPFDS20 Account Validation (used to check that charges ...

460. Developing and Validating the Socio-Technical Model in Ontology Engineering

    NASA Astrophysics Data System (ADS)

    Silalahi, Mesnan; Indra Sensuse, Dana; Giri Sucahyo, Yudho; Fadhilah Akmaliah, Izzah; Rahayu, Puji; Cahyaningsih, Elin

    2018-03-01

    This paper describes results from an attempt to develop a model of ontology engineering methodology and a way to validate that model. The approach to methodology in ontology engineering is taken from the point of view of socio-technical system theory. Qualitative research synthesis, by way of meta-ethnography, was used to build the model. To ensure the objectivity of the measurement, an inter-rater reliability method was applied using multi-rater Fleiss' kappa. The results show that the research output accords with the diamond model of socio-technical system theory, as evidenced by the interdependency of the four socio-technical variables: people, technology, structure, and task.
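    Fleiss' kappa, the multi-rater agreement statistic mentioned above, is computed directly from a table of category counts per rated item. A minimal sketch, with an invented ratings matrix:

        import numpy as np

        def fleiss_kappa(counts):
            """counts: (N items x k categories) matrix giving how many raters
            assigned each item to each category; every row must sum to the
            same number of raters n."""
            counts = np.asarray(counts, dtype=float)
            N, _ = counts.shape
            n = counts[0].sum()
            p_j = counts.sum(axis=0) / (N * n)               # category proportions
            P_i = (np.square(counts).sum(axis=1) - n) / (n * (n - 1))
            P_bar, Pe_bar = P_i.mean(), np.square(p_j).sum()
            return (P_bar - Pe_bar) / (1.0 - Pe_bar)

        # Example: 4 items rated by 3 raters into 2 categories -> kappa = 0.625.
        print(fleiss_kappa([[3, 0], [0, 3], [2, 1], [3, 0]]))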
461. An efficient method to determine double Gaussian fluence parameters in the eclipse™ proton pencil beam model

    PubMed

    Shen, Jiajian; Liu, Wei; Stoker, Joshua; Ding, Xiaoning; Anand, Aman; Hu, Yanle; Herman, Michael G; Bues, Martin

    2016-12-01

    To find an efficient method to configure the proton fluence for a commercial proton pencil beam scanning (PBS) treatment planning system (TPS). An in-water dose kernel was developed to mimic the dose kernel of the pencil beam convolution superposition algorithm, which is part of the commercial proton beam therapy planning software eclipse™ (Varian Medical Systems, Palo Alto, CA). The field size factor (FSF) was calculated based on the spot profile reconstructed by the in-house dose kernel. The workflow for using FSFs to find the desirable proton fluence is presented. The in-house derived spot profile and FSF were validated by a direct comparison with those calculated by the eclipse TPS. The validation included 420 comparisons of the FSFs from 14 proton energies, field sizes from 2 to 20 cm, and depths from 20% to 80% of proton range. The relative in-water lateral profiles from the in-house calculation and the eclipse TPS agree very well, even at the 10^-4 level. The FSFs from the in-house calculation and the eclipse TPS also agree well: the maximum deviation is within 0.5%, and the standard deviation is less than 0.1%. The authors' method significantly reduced the time needed to find the desirable proton fluences for the clinical energies. The method is extensively validated and can be applied at any proton center using PBS and the eclipse TPS.
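    A field size factor of the kind described can be approximated by summing a spot fluence model over the spot grid of a finite field and normalizing by a very large field. The sketch below assumes a double-Gaussian spot model with invented parameters; it illustrates the computation, not the authors' configured kernel:

        import numpy as np

        def gauss2d(r2, sigma):
            """Normalized 2-D Gaussian evaluated at squared radius r2."""
            return np.exp(-r2 / (2 * sigma**2)) / (2 * np.pi * sigma**2)

        def field_size_factor(field_cm, spacing_cm, sigma1, sigma2, w):
            """Central dose of a uniform square scanned field relative to a
            very large field, for a double-Gaussian spot model
            f(r) = (1 - w) * G(sigma1) + w * G(sigma2).  Lengths in cm."""
            def central_dose(size):
                xs = np.arange(-size / 2, size / 2 + 1e-9, spacing_cm)
                xx, yy = np.meshgrid(xs, xs)      # spot positions on the grid
                r2 = xx**2 + yy**2                # squared distance to field center
                return ((1 - w) * gauss2d(r2, sigma1)
                        + w * gauss2d(r2, sigma2)).sum()
            # A 10x larger field stands in for the "infinite" reference field.
            return central_dose(field_cm) / central_dose(10 * field_cm)

        # Example: 4 cm field, 0.5 cm spot spacing, broad second Gaussian.
        print(field_size_factor(4.0, 0.5, sigma1=0.4, sigma2=2.0, w=0.1))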
462. The use of children's drawings in the evaluation and treatment of child sexual, emotional, and physical abuse

    PubMed

    Peterson, L W; Hardin, M; Nitsch, M J

    1995-05-01

    Primary care physicians can be instrumental in the initial identification of potential sexual, emotional, and physical abuse of children. We reviewed the use of children's artwork as a method of communicating individual and family functioning. A quantitative method of analyzing children's artwork provides more reliability and validity than some methods used previously. A new scoring system was developed that uses individual human figure drawings and kinetic family drawings. This scoring system was based on research with 842 children (341 positively identified as sexually molested, 252 positively not sexually molested but having emotional or behavioral problems, and 249 "normal" public school children). This system is more comprehensive than previous systems for the assessment of potential abuse.

463. Validation of alternative methods for toxicity testing

    PubMed Central

    Bruner, L H; Carr, G J; Curren, R D; Chamberlain, M

    1998-01-01

    Before nonanimal toxicity tests may be officially accepted by regulatory agencies, it is generally agreed that the validity of the new methods must be demonstrated in an independent, scientifically sound validation program. Validation has been defined as the demonstration of the reliability and relevance of a test method for a particular purpose. This paper provides a brief review of the development of the theoretical aspects of the validation process and updates current thinking about objectively testing the performance of an alternative method in a validation study. Validation of alternative methods for eye irritation testing is a specific example illustrating important concepts. Although the discussion focuses on the validation of alternative methods intended to replace current in vivo toxicity tests, the procedures can be used to assess the performance of alternative methods intended for other uses. PMID: 9599695
464. Children's Behavior in the Postanesthesia Care Unit: The Development of the Child Behavior Coding System-PACU (CBCS-P)

    PubMed Central

    Tan, Edwin T.; Martin, Sarah R.; Fortier, Michelle A.; Kain, Zeev N.

    2012-01-01

    Objective: To develop and validate a behavioral coding measure, the Children's Behavior Coding System-PACU (CBCS-P), for children's distress and nondistress behaviors while in the postanesthesia recovery unit. Methods: A multidisciplinary team examined videotapes of children in the PACU and developed a coding scheme that subsequently underwent a refinement process (CBCS-P). To examine the reliability and validity of the coding system, 121 children and their parents were videotaped during their stay in the PACU. Participants were healthy children undergoing elective, outpatient surgery and general anesthesia. The CBCS-P was utilized, and objective data from medical charts (analgesic consumption and pain scores) were extracted to establish validity. Results: Kappa values indicated good-to-excellent (κ's > .65) interrater reliability of the individual codes. The CBCS-P had good criterion validity when compared with children's analgesic consumption and pain scores. Conclusions: The CBCS-P is a reliable, observational coding method that captures children's distress and nondistress postoperative behaviors. These findings highlight the importance of considering context in both the development and application of observational coding schemes. PMID: 22167123

465. A low-cost contact system to assess load displacement velocity in a resistance training machine

    PubMed

    Buscà, Bernat; Font, Anna

    2011-01-01

    This study sought to determine the validity of a new system, based on the Chronojump System, for assessing displacement and average velocity in machine-based resistance training exercise. The design uses a contact bar and a simple, low-cost mechanism that detects the conductivity of electrical potentials with a precision chronograph. The system allows coaches to assess velocity in order to control the strength training process. A validation study was performed by assessing the concentric-phase parameters of a leg press exercise. Output time data from the Chronojump System, in combination with the pre-established range of movement, were compared with data from a position sensor connected to a Biopac System. A subset of 87 actions from 11 professional tennis players was recorded, and the average velocity and displacement variables for the same action were compared between the two methods. A t-test for dependent samples and a correlation analysis were undertaken. The r value derived from the correlation between the Biopac System and the contact Chronojump System was >0.94 for all measures of displacement and velocity at all loads (p < 0.01). The effect size (ES) was 0.18 for displacement and 0.14 for velocity, ranging from 0.09 to 0.31 and from 0.07 to 0.34, respectively. The magnitude of the difference between the two methods for all parameters, together with the correlation values, provides evidence of the validity of the Chronojump System for assessing the average displacement velocity of loads in a resistance training machine. Key points: the assessment of speed in resistance machines is a valuable source of information for strength training; many commercial systems used to assess velocity, power, and force are expensive, which prevents widespread use by coaches and athletes; the system is intended to be a low-cost device for assessing and controlling the velocity of each repetition in any resistance training machine; and it could easily be adapted to any vertical-displacement barbell exercise.
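    The timing principle behind such a contact system reduces to dividing the known range of movement by the time measured between contact events. A minimal sketch with illustrative numbers:

        def average_velocity(displacement_m, contact_time_s):
            """Average concentric velocity from a known range of movement and
            the time measured by a precision chronograph between contacts."""
            return displacement_m / contact_time_s

        # Example: a 0.35 m range of movement completed in 0.52 s.
        print(f"{average_velocity(0.35, 0.52):.2f} m/s")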
466. Evaluation results for intelligent transportation systems

    DOT National Transportation Integrated Search

    2000-11-09

    This presentation covers the methods of evaluation set out for EC-funded ITS research and demonstration projects, known as the CONVERGE validation quality process, and the lessons learned from that approach. The new approach to appraisal, which is bei...

467. A validation study of a rapid field-based rating system for discriminating among flow permanence classes of headwater streams in South Carolina

    EPA Science Inventory

    Rapid field-based protocols for classifying flow permanence of headwater streams are needed to inform timely regulatory decisions. Such an existing method was developed for and has been used in North Carolina since 1997.
The method uses ordinal scoring of 26 geomorphology, hydr...

468. Performance prediction of a ducted rocket combustor

    NASA Astrophysics Data System (ADS)

    Stowe, Robert

    2001-07-01

    The ducted rocket is a supersonic flight propulsion system that takes the exhaust from a solid fuel gas generator, mixes it with air, and burns it to produce thrust. To develop such systems, the use of numerical models based on Computational Fluid Dynamics (CFD) is increasingly popular, but their application to reacting flow requires specific attention and validation. Through a careful examination of the governing equations and experimental measurements, a CFD-based method was developed to predict the performance of a ducted rocket combustor. It uses an equilibrium-chemistry Probability Density Function (PDF) combustion model, with a gaseous stream and a separate stream of 75 nm diameter carbon spheres representing the fuel. After extensive validation against water tunnel and direct-connect combustion experiments over a wide range of geometries and test conditions, this CFD-based method was able to predict, with a good degree of accuracy, the combustion efficiency of a ducted rocket combustor.

469. Research on Flow Field Perception Based on Artificial Lateral Line Sensor System

    PubMed Central

    Wang, Anyi; Wang, Shirui; Yang, Tingting

    2018-01-01

    In nature, the lateral line of fish is a peculiar and important organ for sensing the surrounding hydrodynamic environment, preying, escaping from predators, and schooling. In this paper, by imitating the mechanism of fish lateral canal neuromasts, we developed an artificial lateral line system composed of micro-pressure sensors. Through hydrodynamic simulations, an optimized sensor structure was obtained, and pressure distribution models of the lateral surface were established in uniform and turbulent flow. In the corresponding underwater experiment, the validity of the numerical simulation method was verified by comparing the experimental data with the simulation results. In addition, several effective methods are proposed and validated for flow velocity estimation and attitude perception in turbulent flow, and the shape recognition of obstacles is realized with a neural network algorithm. PMID: 29534499
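    For pressure-based flow sensing of this kind, a first-order estimate of flow speed can be recovered from the stagnation pressure rise via Bernoulli's relation. The sketch below is a generic illustration, not the paper's calibrated model:

        import math

        def flow_speed(p_stagnation_pa, p_static_pa, rho_kg_m3=1000.0):
            """Estimate flow speed from the pressure rise at a stagnation
            point, u = sqrt(2 * dp / rho), assuming steady incompressible
            flow; the default density is that of water."""
            dp = p_stagnation_pa - p_static_pa
            return math.sqrt(2.0 * max(dp, 0.0) / rho_kg_m3)

        # Example: a 180 Pa stagnation rise in water -> roughly 0.6 m/s.
        print(f"{flow_speed(180.0, 0.0):.2f} m/s")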
470. Population Health Metrics Research Consortium gold standard verbal autopsy validation study: design, implementation, and development of analysis datasets

    PubMed Central

    2011-01-01

    Background: Verbal autopsy methods are critically important for evaluating the leading causes of death in populations without adequate vital registration systems. With a myriad of analytical and data collection approaches, it is essential to create a high-quality validation dataset from different populations to evaluate comparative method performance and make recommendations for future verbal autopsy implementation. This study was undertaken to compile a set of strictly defined gold standard deaths for which verbal autopsies were collected to validate the accuracy of different methods of verbal autopsy cause of death assignment. Methods: Data collection was implemented at six sites in four countries: Andhra Pradesh, India; Bohol, Philippines; Dar es Salaam, Tanzania; Mexico City, Mexico; Pemba Island, Tanzania; and Uttar Pradesh, India. The Population Health Metrics Research Consortium (PHMRC) developed stringent diagnostic criteria, including laboratory, pathology, and medical imaging findings, to identify gold standard deaths in health facilities, as well as an enhanced verbal autopsy instrument based on World Health Organization (WHO) standards. A cause list was constructed based on the WHO Global Burden of Disease estimates of the leading causes of death, the potential to identify unique signs and symptoms, and the likely existence of sufficient medical technology to ascertain gold standard cases. Blinded verbal autopsies were collected for all gold standard deaths. Results: Over 12,000 verbal autopsies on deaths with gold standard diagnoses were collected (7,836 adults, 2,075 children, 1,629 neonates, and 1,002 stillbirths). Difficulties in finding sufficient cases to meet gold standard criteria, as well as problems with misclassification for certain causes, meant that the target list of causes for analysis was reduced to 34 for adults, 21 for children, and 10 for neonates, excluding stillbirths. To ensure strict independence for the validation of methods and assessment of comparative performance, 500 test-train datasets were created from the universe of cases, covering a range of cause-specific compositions. Conclusions: This unique, robust validation dataset will allow scholars to evaluate the performance of different verbal autopsy analytic methods as well as instrument design. The dataset can be used to inform the implementation of verbal autopsies to more reliably ascertain cause of death in national health information systems. PMID: 21816095
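    The construction of many test-train splits with varying cause-specific compositions can be sketched generically as follows. This is an illustration of the idea only, not the PHMRC procedure itself: cause fractions are drawn at random and the test deaths are resampled to match them.

        import random
        from collections import defaultdict

        def make_split(deaths, n_test, rng):
            """One test-train split whose test set follows a randomly drawn
            cause-specific composition; deaths are (cause, record) pairs."""
            by_cause = defaultdict(list)
            for death in deaths:
                by_cause[death[0]].append(death)
            causes = list(by_cause)
            weights = [rng.random() for _ in causes]       # random cause fractions
            total = sum(weights)
            test = []
            for cause, w in zip(causes, weights):
                k = round(n_test * w / total)
                test += rng.choices(by_cause[cause], k=k)  # resample with replacement
            train = [d for d in deaths if d not in test]
            return test, train

        rng = random.Random(0)
        pool = [(c, i) for i, c in enumerate(['stroke', 'sepsis', 'injury'] * 30)]
        test, train = make_split(pool, n_test=30, rng=rng)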
471. Development and validation of an HPLC–MS/MS method to determine clopidogrel in human plasma

    PubMed Central

    Liu, Gangyi; Dong, Chunxia; Shen, Weiwei; Lu, Xiaopei; Zhang, Mengqi; Gui, Yuzhou; Zhou, Qinyi; Yu, Chen

    2015-01-01

    A quantitative method for clopidogrel using online-SPE tandem LC–MS/MS was developed and fully validated according to the well-established FDA guidelines. The method achieves adequate sensitivity for pharmacokinetic studies, with a lower limit of quantification (LLOQ) as low as 10 pg/mL. Chromatographic separations were performed on a reversed-phase Kromasil Eternity-2.5-C18-UHPLC column. Positive electrospray ionization in multiple reaction monitoring (MRM) mode was employed for signal detection, and a deuterated analogue (clopidogrel-d4) was used as the internal standard (IS). Adjustments in sample preparation, including the introduction of an online-SPE system, proved to be the most effective way to address analyte back-conversion in clinical samples. Pooled clinical samples (two levels) were prepared and successfully used as real-sample quality controls (QCs) in the validation of back-conversion testing under different conditions. The results showed that the real samples were stable at room temperature for 24 h. Linearity, precision, extraction recovery, matrix effect on spiked QC samples, and stability tests on both spiked QCs and real-sample QCs stored under different conditions met the acceptance criteria. This online-SPE method was successfully applied to a bioequivalence study of 75 mg single-dose clopidogrel tablets in 48 healthy male subjects. PMID: 26904399

472. Quantification of Rifaximin in Tablets by Spectrophotometric Method Ecofriendly in Ultraviolet Region

    PubMed Central

    2016-01-01

    Rifaximin is an oral nonabsorbable antibiotic that acts locally in the gastrointestinal tract with minimal systemic adverse effects. No ecofriendly ultraviolet-region spectrophotometric method for rifaximin is described in the official compendia or the literature. The analytical techniques reported for the determination of rifaximin require a large amount of time to release results and are significantly onerous. Furthermore, they use reagents that are toxic to both the operator and the environment and therefore cannot be considered environmentally friendly analytical techniques. The objective of this study was to develop and validate an ecofriendly spectrophotometric method in the ultraviolet region to quantify rifaximin in tablets. The method was validated, showing linearity, selectivity, precision, accuracy, and robustness. It was linear over the concentration range of 10-30 mg L-1, with correlation coefficients greater than 0.9999 and limits of detection and quantification of 1.39 and 4.22 mg L-1, respectively. The validated method is useful for the routine quality control of rifaximin, since it is simple, inexpensive, and fast in the release of results, makes efficient use of analysts and equipment, and uses environmentally friendly solvents; it can be considered a green method that harms neither the operator nor the environment. PMID: 27429835
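    Detection and quantification limits of the kind reported here are commonly derived from the calibration line as LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation and S is the slope. A minimal sketch, with invented concentration-absorbance pairs:

        import numpy as np

        def calibration_limits(conc, signal):
            """Fit signal = a + S*conc by least squares and return
            (LOD, LOQ) = (3.3*sigma/S, 10*sigma/S), where sigma is the
            standard deviation of the regression residuals."""
            conc, signal = np.asarray(conc, float), np.asarray(signal, float)
            S, a = np.polyfit(conc, signal, 1)
            residuals = signal - (a + S * conc)
            sigma = residuals.std(ddof=2)          # 2 fitted parameters
            return 3.3 * sigma / S, 10.0 * sigma / S

        lod, loq = calibration_limits([10, 15, 20, 25, 30],
                                      [0.21, 0.31, 0.40, 0.52, 0.61])
        print(f"LOD = {lod:.2f} mg/L, LOQ = {loq:.2f} mg/L")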
473. Cross-validation of bioelectrical impedance analysis of body composition in children and adolescents

    PubMed

    Wu, Y T; Nielsen, D H; Cassady, S L; Cook, J S; Janz, K F; Hansen, J R

    1993-05-01

    The reliability and validity of measurements obtained with two bioelectrical impedance analyzers (BIAs), an RJL Systems model BIA-103 and a Berkeley Medical Research BMR-2000, were investigated using the manufacturers' prediction equations for the assessment of fat-free mass (FFM, in kilograms) in children and adolescents. Forty-seven healthy children and adolescents (23 male, 24 female), ranging in age from 8 to 20 years (mean = 12.1, SD = 2.3), participated. In the context of a repeated-measures design, the data were analyzed according to gender and maturation (Tanner staging). Hydrostatic weighing (HYDRO) and Lohman's Siri age-adjusted body density prediction equation served as the criteria for validating the BIA-obtained measurements. High intraclass correlation coefficients (ICC >= .987) demonstrated good test-retest (between-week) measurement reliability for HYDRO and both BIA methods. Between-method (HYDRO versus BIA) correlation coefficients were high for both boys and girls (r >= .97). The standard errors of estimate (SEEs) for FFM were slightly larger for boys than for girls and were consistently smaller for the RJL system than for the BMR system (RJL SEE = 1.8 kg for boys, 1.3 kg for girls; BMR SEE = 2.4 kg for boys, 1.9 kg for girls). The coefficients of determination were high for both BIA methods (r^2 >= .929). Total prediction errors (TEs) for FFM showed similar between-method trends (RJL TE = 2.1 kg for boys, 1.5 kg for girls; BMR TE = 4.4 kg for boys, 1.9 kg for girls). This study demonstrated that the RJL BIA with the manufacturer's prediction equations can be used to reliably and accurately assess FFM in 8- to 20-year-old children and adolescents. The prediction of FFM by the BMR system was acceptable for girls, but significant overprediction of FFM for boys was noted.
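    The agreement statistics used in this kind of cross-validation are straightforward to compute: the SEE measures scatter about the regression of the criterion on the prediction, while the total error measures deviation from the criterion directly. A minimal sketch, assuming paired arrays of criterion and predicted FFM values:

        import numpy as np

        def see_and_te(criterion, predicted):
            """Standard error of estimate (scatter about the regression of
            criterion on predicted) and total error (RMS deviation of the
            predictions from the criterion values)."""
            y, x = np.asarray(criterion, float), np.asarray(predicted, float)
            slope, intercept = np.polyfit(x, y, 1)
            see = (y - (intercept + slope * x)).std(ddof=2)
            te = np.sqrt(np.mean((x - y) ** 2))
            return see, te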
474. Experimental Investigation of Textile Composite Materials Using Moire Interferometry

    NASA Technical Reports Server (NTRS)

    Ifju, Peter G.

    1995-01-01

    The viability of advanced textile composites as efficient aircraft materials is currently being addressed in the NASA Advanced Composites Technology (ACT) Program. One of the expected milestones of the program is the development of standard test methods for these complex material systems. Current test methods for laminated composites may not be optimal for textile composites, since the architecture of the textile induces nonuniform deformation characteristics on the scale of the smallest repeating unit of the architecture. This smallest repeating unit, also called the unit cell, is often larger than the strain gages used for testing tape composites. As a result, extending laminated composite test practices to textiles can lead to pronounced scatter in material property measurements. It had been speculated that fiber architectures produce significant surface strain nonuniformities; however, the magnitudes were not well understood. Moire interferometry, characterized by full-field information, high displacement sensitivity, and high spatial resolution, is well suited to documenting the surface strain on textile composites. Studies at the NASA Langley Research Center on a variety of textile architectures, including 2-D braids and 3-D weaves, have demonstrated the merits of using moire interferometry to guide test method development for textile composites. Moire was used to support tensile testing by validating instrumentation practices and documenting damage mechanisms. It was used to validate shear test methods by mapping the full-field deformation of shear specimens. It was used to validate open-hole tension experiments, determining the strain concentration and comparing it to numerical predictions. It was also used for through-the-thickness tensile strength test method development, to verify capabilities for testing both 2-D and 3-D material systems.
For all of these examples, moire interferometry provided vision, so that test methods could be developed with less speculation and more documentation.

475. Stability-indicating assay of repaglinide in bulk and optimized nanoemulsion by validated high performance thin layer chromatography technique

    PubMed

    Akhtar, Juber; Fareed, Sheeba; Aqil, Mohd

    2013-07-01

    A sensitive, selective, precise, and stability-indicating high-performance thin-layer chromatographic (HPTLC) method for the analysis of repaglinide, both as a bulk drug and in a nanoemulsion formulation, was developed and validated. The method employed TLC aluminum plates precoated with silica gel 60F-254 as the stationary phase. The solvent system consisted of chloroform/methanol/ammonia/glacial acetic acid (7.5:1.5:0.9:0.1, v/v/v/v). This system was found to give compact spots for repaglinide (Rf value of 0.38 ± 0.02). Repaglinide was subjected to acid and alkali hydrolysis, oxidation, photodegradation, and dry heat treatment, and the degradation products were well separated from the pure drug. Densitometric analysis of repaglinide was carried out in absorbance mode at 240 nm. The linear regression data for the calibration plots showed a good linear relationship, with r^2 = 0.998 ± 0.032 over the concentration range of 50-800 ng. The method was validated for precision, accuracy (as recovery), robustness, and specificity. The limits of detection and quantitation were 0.023 and 0.069 ng per spot, respectively. The drug undergoes degradation under acidic and basic conditions, oxidation, and dry heat treatment. All peaks of the degradation products were resolved from the standard drug with significantly different Rf values. Statistical analysis proves that the method is reproducible and selective for the estimation of the said drug. As the method can effectively separate the drug from its degradation products, it can be employed as a stability-indicating assay. Moreover, the proposed HPTLC method was utilized to investigate the degradation kinetics in 1 M NaOH.

476. An Optimal Control Method for Maximizing the Efficiency of Direct Drive Ocean Wave Energy Extraction System

    PubMed Central

    Chen, Zhongxian; Yu, Haitao; Wen, Cheng

    2014-01-01

    The goal of a direct drive ocean wave energy extraction system is to convert ocean wave energy into electricity. The problem explored in this paper is the design and optimal control of such a system. An optimal control method based on internal model proportion integration differentiation (IM-PID) is proposed, although most ocean wave energy extraction systems are optimized through their structure, weight, and material.
With this control method, the heave speed of the outer heavy buoy of the energy extraction system is brought into resonance with the incident wave, and the system efficiency is largely improved. The validity of the proposed optimal control method is verified in both regular and irregular ocean waves, and it is shown that the IM-PID control method is optimal in that it maximizes the energy conversion efficiency. In addition, the anti-interference ability of the IM-PID control method has been assessed, and the results show that it has good robustness, high precision, and strong anti-interference ability. PMID: 25152913

477. An optimal control method for maximizing the efficiency of direct drive ocean wave energy extraction system

    PubMed

    Chen, Zhongxian; Yu, Haitao; Wen, Cheng

    2014-01-01

    The goal of a direct drive ocean wave energy extraction system is to convert ocean wave energy into electricity. The problem explored in this paper is the design and optimal control of such a system. An optimal control method based on internal model proportion integration differentiation (IM-PID) is proposed, although most ocean wave energy extraction systems are optimized through their structure, weight, and material. With this control method, the heave speed of the outer heavy buoy of the energy extraction system is brought into resonance with the incident wave, and the system efficiency is largely improved. The validity of the proposed optimal control method is verified in both regular and irregular ocean waves, and it is shown that the IM-PID control method is optimal in that it maximizes the energy conversion efficiency. In addition, the anti-interference ability of the IM-PID control method has been assessed, and the results show that it has good robustness, high precision, and strong anti-interference ability.
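    The control idea can be illustrated with a plain discrete-time PID loop driving a measured velocity toward a reference. This is a generic PID sketch, not the paper's internal-model formulation:

        class PID:
            """Discrete-time PID: u = Kp*e + Ki*integral(e) + Kd*de/dt."""
            def __init__(self, kp, ki, kd, dt):
                self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
                self.integral, self.prev_error = 0.0, 0.0

            def update(self, setpoint, measured):
                error = setpoint - measured
                self.integral += error * self.dt
                derivative = (error - self.prev_error) / self.dt
                self.prev_error = error
                return (self.kp * error + self.ki * self.integral
                        + self.kd * derivative)

        # Example: drive the buoy heave speed toward a wave-derived reference.
        pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
        force = pid.update(setpoint=0.8, measured=0.55)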
478. Methodology, Methods, and Metrics for Testing and Evaluating Augmented Cognition Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greitzer, Frank L.

    The augmented cognition research community seeks cognitive neuroscience-based solutions to improve warfighter performance by applying and managing mitigation strategies to reduce workload and improve the throughput and quality of decisions. The focus of augmented cognition mitigation research is to define, demonstrate, and exploit neuroscience and behavioral measures that support inferences about the warfighter's cognitive state and that prescribe the nature and timing of mitigation. A research challenge is to develop valid evaluation methodologies, metrics, and measures to assess the impact of augmented cognition mitigations. Two considerations are external validity, the extent to which the results apply to operational contexts, and internal validity, which reflects the reliability of performance measures and of the conclusions based on analysis of results. The scientific rigor of the research methodology employed in conducting empirical investigations largely affects the validity of the findings. External validity requirements also compel us to demonstrate the operational significance of mitigations; thus it is important to demonstrate the effectiveness of mitigations under specific conditions. This chapter reviews cognitive science and methodological considerations in designing augmented cognition research studies, along with associated human performance metrics and analysis methods for assessing the impact of augmented cognition mitigations.

479. Challenges of NDE simulation tool validation, optimization, and utilization for composites

    NASA Astrophysics Data System (ADS)

    Leckey, Cara A. C.; Seebo, Jeffrey P.; Juarez, Peter

    2016-02-01

    Rapid, realistic nondestructive evaluation (NDE) simulation tools can aid in inspection optimization and in predicting the inspectability of advanced aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of aerospace components, potentially shortening the time from material development to implementation by industry and government. Furthermore, ultrasound modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided-wave-based structural health monitoring (SHM) systems. The current state of the art in ultrasonic NDE/SHM simulation is still far from the goal of rapidly simulating damage detection techniques for large-scale, complex-geometry composite components or vehicles containing realistic damage types. Ongoing work at NASA Langley Research Center is focused on advanced ultrasonic simulation tool development. This paper discusses the challenges of simulation tool validation, optimization, and utilization for composites. Ongoing simulation tool development work is described, along with examples of simulation validation and optimization challenges that apply more broadly to all NDE simulation tools. The paper also discusses examples of simulation tool utilization at NASA to develop new damage characterization methods for composites, and the associated challenges in experimentally validating those methods.

480. Bibliometrics for Social Validation

    PubMed

    Hicks, Daniel J

    2016-01-01

    This paper introduces a bibliometric, citation network-based method for assessing the social validation of novel research, and applies this method to the development of high-throughput toxicology research at the US Environmental Protection Agency. Social validation refers to the acceptance of novel research methods by a relevant scientific community; it is formally independent of the technical validation of methods and is frequently studied in history, philosophy, and social studies of science using qualitative methods. The quantitative methods introduced here find that high-throughput toxicology methods are spread throughout a large and well-connected research community, which suggests high social validation. Further assessments of social validation, involving mixed qualitative and quantitative methods, are discussed in the conclusion.
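    The network evidence cited here (a large, well-connected research community) can be quantified from a citation graph, for example by the share of nodes in its largest connected component. A minimal sketch over a toy edge list:

        from collections import defaultdict

        def largest_component_share(edges):
            """Fraction of nodes in the largest connected component of an
            undirected citation graph given as (u, v) pairs; nodes without
            edges are not represented."""
            adj = defaultdict(set)
            for u, v in edges:
                adj[u].add(v)
                adj[v].add(u)
            seen, best = set(), 0
            for start in adj:
                if start in seen:
                    continue
                stack, size = [start], 0
                seen.add(start)
                while stack:
                    node = stack.pop()
                    size += 1
                    for nxt in adj[node] - seen:
                        seen.add(nxt)
                        stack.append(nxt)
                best = max(best, size)
            return best / len(adj)

        print(largest_component_share([(1, 2), (2, 3), (4, 5)]))  # 0.6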
481. Bibliometrics for Social Validation

    PubMed Central

    2016-01-01

    This paper introduces a bibliometric, citation network-based method for assessing the social validation of novel research, and applies this method to the development of high-throughput toxicology research at the US Environmental Protection Agency. Social validation refers to the acceptance of novel research methods by a relevant scientific community; it is formally independent of the technical validation of methods and is frequently studied in history, philosophy, and social studies of science using qualitative methods. The quantitative methods introduced here find that high-throughput toxicology methods are spread throughout a large and well-connected research community, which suggests high social validation. Further assessments of social validation, involving mixed qualitative and quantitative methods, are discussed in the conclusion. PMID: 28005974
482. Sub-domain decomposition methods and computational controls for multibody dynamical systems [of spacecraft structures]

    NASA Technical Reports Server (NTRS)

    Menon, R. G.; Kurdila, A. J.

    1992-01-01

    This paper presents a concurrent methodology to simulate the dynamics of flexible multibody systems with a large number of degrees of freedom. A general class of open-loop structures is treated, and a redundant coordinate formulation is adopted. A range space method is used in which the constraint forces are calculated with a preconditioned conjugate gradient method. By using a preconditioner motivated by the regular ordering of the directed graph of the structures, it is shown that the method is order N in the total number of coordinates of the system. The overall formulation has the advantage that it permits fine-grained parallelization and does not rely on system topology to induce concurrency. It can be efficiently implemented on the present generation of parallel computers with a large number of processors. Validation of the method is presented via numerical simulations of space structures incorporating a large number of flexible degrees of freedom.
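    The constraint-force solve described above rests on the preconditioned conjugate gradient iteration. The following is a generic dense-matrix sketch with a simple Jacobi (diagonal) preconditioner, chosen for illustration rather than the paper's graph-motivated preconditioner:

        import numpy as np

        def pcg(A, b, M_inv_diag, tol=1e-10, max_iter=200):
            """Preconditioned conjugate gradients for SPD A, with a diagonal
            preconditioner supplied as the inverse diagonal entries."""
            x = np.zeros_like(b)
            r = b - A @ x
            z = M_inv_diag * r
            p = z.copy()
            rz = r @ z
            for _ in range(max_iter):
                Ap = A @ p
                alpha = rz / (p @ Ap)
                x += alpha * p
                r -= alpha * Ap
                if np.linalg.norm(r) < tol:
                    break
                z = M_inv_diag * r
                rz_new = r @ z
                p = z + (rz_new / rz) * p
                rz = rz_new
            return x

        A = np.array([[4.0, 1.0], [1.0, 3.0]])
        b = np.array([1.0, 2.0])
        x = pcg(A, b, 1.0 / np.diag(A))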
483. Considerations regarding the validation of chromatographic mass spectrometric methods for the quantification of endogenous substances in forensics

    PubMed

    Hess, Cornelius; Sydow, Konrad; Kueting, Theresa; Kraemer, Michael; Maas, Alexandra

    2018-02-01

    Correct evaluation of forensic toxicological results in daily routine work and scientific studies requires reliable analytical data based on validated methods. Validation of a method gives the analyst tools to estimate its efficacy and reliability. Without validation, data might be contested in court and lead to unjustified legal consequences for a defendant. Therefore, new analytical methods to be used in forensic toxicology require careful method development and validation of the final method. Until now, there have been no publications on the validation of chromatographic mass spectrometric methods for the detection of endogenous substances, although endogenous analytes can be important in forensic toxicology (alcohol consumption markers, congener alcohols, gamma-hydroxybutyric acid, human insulin and C-peptide, creatinine, postmortem clinical parameters). For these analytes, conventional validation instructions cannot be followed completely. In this paper, important practical considerations in analytical method validation for endogenous substances are discussed, which may be used as guidance by scientists wishing to develop and validate analytical methods for analytes produced naturally in the human body. In particular, the validation parameters calibration model, analytical limits, accuracy (bias and precision), matrix effects, and recovery have to be approached differently, and the highest attention should be paid to selectivity experiments.

484. Feature extraction algorithm for space targets based on fractal theory

    NASA Astrophysics Data System (ADS)

    Tian, Balin; Yuan, Jianping; Yue, Xiaokui; Ning, Xin

    2007-11-01

    To offer the potential of extending the life of satellites and reducing launch and operating costs, satellite servicing (conducting repairs, upgrading, and refueling spacecraft on-orbit) will be performed much more frequently in the future. Future space operations can be executed more economically and reliably using machine vision systems, which can meet the real-time and tracking-reliability requirements for image tracking in a space surveillance system. Machine vision has been applied to research on the relative pose of spacecraft, and feature extraction algorithms are the basis of relative pose estimation. This paper presents a fractal-geometry-based edge extraction algorithm that can be used to determine and track the relative pose of an observed satellite during proximity operations in a machine vision system. The method obtains the fractal dimension distribution of the gray-level image using the Differential Box-Counting (DBC) approach of fractal theory to restrain noise; consecutive edges are then detected using mathematical morphology. The validity of the proposed method is examined by processing and analyzing images of space targets. The edge extraction method not only extracts the outline of the target but also keeps the inner details; meanwhile, edge extraction is performed only in the moving area, greatly reducing computation. Simulation results compare edge detection using the presented method with other detection methods, and they indicate that the presented algorithm is a valid method for solving the relative pose problem for spacecraft.
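    The Differential Box-Counting estimate used in this approach covers the image intensity surface with boxes at several grid scales and fits the slope of log N_r against log(1/r). A minimal sketch (the grid sizes and the random test image are illustrative):

        import numpy as np

        def dbc_fractal_dimension(img, sizes=(2, 4, 8, 16)):
            """Differential Box-Counting estimate of the fractal dimension of
            a square gray-level image with intensities in 0..255."""
            img = np.asarray(img, dtype=float)
            M = img.shape[0]
            G = 256.0
            log_nr, log_inv_r = [], []
            for s in sizes:
                h = G * s / M                    # box height for an s x s cell
                nr = 0
                for i in range(0, M - s + 1, s):
                    for j in range(0, M - s + 1, s):
                        block = img[i:i+s, j:j+s]
                        k_max = int(np.ceil((block.max() + 1) / h))
                        k_min = int(np.ceil((block.min() + 1) / h))
                        nr += k_max - k_min + 1  # boxes covering this cell
                log_nr.append(np.log(nr))
                log_inv_r.append(np.log(M / s))
            slope, _ = np.polyfit(log_inv_r, log_nr, 1)
            return slope

        rng = np.random.default_rng(0)
        print(dbc_fractal_dimension(rng.integers(0, 256, (64, 64))))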
485. Improvement of Computer Software Quality through Software Automated Tools

    DTIC Science & Technology

    1986-08-31

    [Only fragments of the abstract survive in the source:] ... The requirement for increased emphasis on software quality assurance has led to the creation of various methods of verification and validation. Experience ... The result was a vast array of methods, systems, languages, and automated tools to assist in the process. ... Given that the primary role of quality assurance is ... Unfortunately, there is no single method, tool, or technique that can ensure accurate, reliable, and cost-effective software. Therefore, government and industry ...

486. Enhancement of Chemical Entity Identification in Text Using Semantic Similarity Validation

    PubMed Central

    Grego, Tiago; Couto, Francisco M.

    2013-01-01

    With the amount of chemical data being produced and reported in the literature growing at a fast pace, it is increasingly important to retrieve this information efficiently. Text mining tools have been applied to tackle this issue, but despite their good performance they still produce many errors, which we believe can be filtered out using semantic similarity. Thus, this paper proposes a novel method that receives the results of chemical entity identification systems, such as Whatizit, and exploits the semantic relationships in ChEBI to measure the similarity between the entities found in the text. The method assigns a single validation score to each entity based on its similarities with the other entities also identified in the text. Then, given a threshold, the method selects a set of validated entities and a set of outlier entities. We evaluated our method using the results of two state-of-the-art chemical entity identification tools, three semantic similarity measures, and two text window sizes. The method was able to increase precision without filtering out a significant number of correctly identified entities. This means that the method can effectively discriminate correctly identified chemical entities while discarding a significant number of identification errors. For example, selecting a validation set with 75% of all identified entities, we were able to increase the precision by 28% for one of the chemical entity identification tools (Whatizit), while keeping 97% of the correctly identified entities in that subset. Our method can be used directly as an add-on by any state-of-the-art entity identification tool that provides mappings to a database, in order to improve its results. The proposed method is included in a freely accessible web tool at www.lasige.di.fc.ul.pt/webtools/ice/. PMID: 23658791
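    The validation scoring described above can be sketched generically: each identified entity is scored by its similarity to the other entities found in the same text, and a threshold splits validated entities from outliers. Here `similarity` stands in for a ChEBI-based semantic measure supplied by the caller:

        def validate_entities(entities, similarity, threshold):
            """Score each identified entity by its mean semantic similarity to
            the other entities in the text; split the set into validated
            entities and outliers.  `similarity(a, b)` is assumed to return a
            value in [0, 1]."""
            validated, outliers = [], []
            for e in entities:
                others = [x for x in entities if x is not e]
                score = sum(similarity(e, x) for x in others) / max(len(others), 1)
                (validated if score >= threshold else outliers).append((e, score))
            return validated, outliers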
487. Validation of a Rapid and Sensitive UPLC–MS-MS Method Coupled with Protein Precipitation for the Simultaneous Determination of Seven Pyrethroids in 100 µL of Rat Plasma by Using Ammonium Adduct as Precursor Ion

    PubMed Central

    Singh, Sheelendra Pratap; Dwivedi, Nistha; Raju, Kanumuri Siva Rama; Taneja, Isha; Wahajuddin, Mohammad

    2016-01-01

    The United States Environmental Protection Agency has recommended estimating pyrethroids' risk using cumulative exposure. For cumulative risk assessment, it would be useful to have a bioanalytical method for the simultaneous quantification of one or several pyrethroids in a small sample volume to support toxicokinetic studies. Therefore, in the present study, a simple, sensitive, and high-throughput ultraperformance liquid chromatography-tandem mass spectrometry method was developed and validated for the simultaneous analysis of seven pyrethroids (fenvalerate, fenpropathrin, bifenthrin, lambda-cyhalothrin, cyfluthrin, cypermethrin, and deltamethrin) in 100 µL of rat plasma. A simple single-step protein precipitation method was used for the extraction of the target compounds. The total chromatographic run time was 5 min. The chromatographic system used a Supelco C18 column and isocratic elution with a mobile phase consisting of methanol and 5 mM ammonium formate in the ratio of 90:10 (v/v). The mass spectrometer (API 4000) was operated in multiple reaction monitoring positive-ion mode using the electrospray ionization technique. The calibration curves were linear in the range of 7.8-2,000 ng/mL, with correlation coefficients of >=0.99. All validation parameters, such as precision, accuracy, recovery, matrix effect, and stability, met the acceptance criteria according to the regulatory guidelines. The method was successfully applied to a toxicokinetic study of cypermethrin in rats. To the best of our knowledge, this is the first LC-MS-MS method for the simultaneous analysis of pyrethroids in rat plasma. With minimal modification, this validated method can also be utilized for forensic and clinical toxicological applications owing to its simplicity, sensitivity, and rapidity. PMID: 26801239

488. Automatic Generation of Validated Specific Epitope Sets

    PubMed

    Carrasco Pro, Sebastian; Sidney, John; Paul, Sinu; Lindestam Arlehamn, Cecilia; Weiskopf, Daniela; Peters, Bjoern; Sette, Alessandro

    2015-01-01

    Accurate measurement of B and T cell responses is a valuable tool for studying autoimmunity, allergies, immunity to pathogens, and host-pathogen interactions, and it assists in the design and evaluation of T cell vaccines and immunotherapies. In this context, it is desirable to elucidate a method for selecting validated reference sets of epitopes to allow detection of T and B cells. However, the ever-growing information contained in the Immune Epitope Database (IEDB) and the differences in quality and subjects studied between epitope assays make this task complicated. In this study, we develop a novel method to automatically select reference epitope sets according to a categorization system employed by the IEDB. From the sets generated, three epitope sets (EBV, mycobacteria, and dengue) were experimentally validated by detection of T cell reactivity ex vivo in human donors.
Automatic Generation of Validated Specific Epitope Sets.

PubMed

Carrasco Pro, Sebastian; Sidney, John; Paul, Sinu; Lindestam Arlehamn, Cecilia; Weiskopf, Daniela; Peters, Bjoern; Sette, Alessandro

2015-01-01

Accurate measurement of B and T cell responses is a valuable tool to study autoimmunity, allergies, immunity to pathogens, and host-pathogen interactions, and to assist in the design and evaluation of T cell vaccines and immunotherapies. In this context, it is desirable to have a method for selecting validated reference sets of epitopes to allow detection of T and B cells. However, the ever-growing information contained in the Immune Epitope Database (IEDB) and the differences in quality and subjects studied between epitope assays make this task complicated. In this study, we develop a novel method to automatically select reference epitope sets according to a categorization system employed by the IEDB. From the sets generated, three epitope sets (EBV, mycobacteria and dengue) were experimentally validated by detection of T cell reactivity ex vivo in human donors. Furthermore, a web application that will potentially be implemented in the IEDB was created to allow users to generate customized epitope sets.

Method development and validation for simultaneous determination of IEA-R1 reactor's pool water uranium and silicon content by ICP OES

NASA Astrophysics Data System (ADS)

Ulrich, J. C.; Guilhen, S. N.; Cotrim, M. E. B.; Pires, M. A. F.

2018-03-01

IPEN's research reactor, IEA-R1, is an open-pool research reactor moderated and cooled by light water. High-quality water is a key factor in preventing corrosion of the spent fuel stored in the pool. Leaching of radionuclides from corroded fuel cladding can be prevented by an efficient water treatment and purification system. As a safety management policy, however, IPEN has adopted a water chemistry control program that periodically monitors the levels of uranium (U) and silicon (Si) in the reactor's pool, since IEA-R1 employs U3Si2-Al dispersion fuel. An analytical method was developed and validated for the determination of uranium and silicon by ICP OES. This work describes the validation process, in a context of quality assurance, including the parameters selectivity, linearity, quantification limit, precision and recovery.
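Two of the validation parameters named in this record, recovery and precision, have standard definitions that a few lines make concrete. All numbers below are illustrative, not from the study.

```python
# Recovery compares the measured against the spiked concentration;
# precision is commonly expressed as percent relative standard
# deviation (%RSD) of replicate measurements.

import statistics

def recovery_percent(measured: float, spiked: float) -> float:
    return 100.0 * measured / spiked

def rsd_percent(replicates) -> float:
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Hypothetical uranium spike of 10.0 µg/L measured in five replicates:
replicates = [9.7, 10.1, 9.9, 10.2, 9.8]
print(f"recovery  = {recovery_percent(statistics.mean(replicates), 10.0):.1f}%")
print(f"precision = {rsd_percent(replicates):.1f}% RSD")
```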
Ensuring the validity of calculated subcritical limits

DOE Office of Scientific and Technical Information (OSTI.GOV)

Clark, H.K.

1977-01-01

The care taken at the Savannah River Laboratory and Plant to ensure the validity of calculated subcritical limits is described. Close attention is given to ANSI N16.1-1975, "Validation of Calculational Methods for Nuclear Criticality Safety." The computer codes used for criticality safety computations, which are listed and briefly described, have been placed in the SRL JOSHUA system to facilitate calculation and to reduce input errors. A driver module, KOKO, simplifies and standardizes input and links the codes together in various ways. For any criticality safety evaluation, correlations of the calculational methods with experiment are made to establish bias. Occasionally, subcritical experiments are performed expressly to provide benchmarks. Calculated subcritical limits contain an adequate but not excessive margin to allow for uncertainty in the bias. The final step in any criticality safety evaluation is the writing of a report describing the calculations and justifying the margin.
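A schematic of the margin bookkeeping such an evaluation describes: the bias established by correlation with experiment, its uncertainty, and an administrative margin together bound the k-eff a calculation may report. The simple linear form and every number below are illustrative assumptions, not the Savannah River procedure.

```python
# Derive an upper subcritical limit (USL) from a method bias and its
# uncertainty, then check a calculated k-eff against it. Illustrative only.

def upper_subcritical_limit(bias: float, bias_uncertainty: float,
                            admin_margin: float) -> float:
    # bias = (calculated - experimental) k-eff from benchmark correlations;
    # a positive bias is given no credit, a negative bias is penalized.
    return 1.0 + min(bias, 0.0) - bias_uncertainty - admin_margin

usl = upper_subcritical_limit(bias=-0.005, bias_uncertainty=0.010,
                              admin_margin=0.030)
k_calc, k_sigma = 0.930, 0.002
acceptable = k_calc + 2 * k_sigma <= usl
print(f"USL = {usl:.3f}, k+2σ = {k_calc + 2 * k_sigma:.3f}, ok = {acceptable}")
```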
Correction of the heat loss method for calculating clothing real evaporative resistance.

PubMed

Wang, Faming; Zhang, Chengjiao; Lu, Yehu

2015-08-01

In the so-called isothermal condition (i.e., Tair [air temperature] = Tmanikin [manikin temperature] = Tr [radiant temperature]), the actual energy used for moisture evaporation detected by most sweating manikins is underestimated because the fabric 'skin' temperature Tsk,f is uncontrolled (i.e., Tsk,f < Tmanikin). It must therefore be corrected before being used to compute the clothing real evaporative resistance. In this study, a correction to the real evaporative heat loss from the wet fabric 'skin'-clothing system was proposed and experimentally validated on a 'Newton' sweating manikin. The real evaporative resistance of five clothing ensembles and of the nude fabric 'skin' calculated by the corrected heat loss method was also reported and compared with that obtained by the mass loss method. Results revealed that, depending on the type of clothing tested, different amounts of heat were drawn from the ambient environment. In general, a greater amount of heat was drawn from the ambient environment by the wet fabric 'skin'-clothing system in clothing of lower thermal insulation than in clothing of higher insulation. There were no significant differences between clothing real evaporative resistances calculated by the corrected heat loss method and those calculated by the mass loss method. It was therefore concluded that the correction method proposed in this study was successfully validated. Copyright © 2015 Elsevier Ltd. All rights reserved.
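In the heat loss method, an evaporative resistance is the vapour-pressure gradient across the clothing times the manikin area, divided by the evaporative heat flux; the correction discussed above adds back the heat the wet fabric 'skin'-clothing system draws from the ambient air. The function below is a sketch under that reading, with illustrative values; the paper's exact correction procedure is not reproduced.

```python
# Real evaporative resistance from corrected manikin heat loss.

def real_evaporative_resistance(p_skin_fabric: float, p_air: float,
                                area: float, q_measured: float,
                                q_env_correction: float) -> float:
    """
    p_skin_fabric, p_air : water vapour pressures at fabric skin / ambient (kPa)
    area                 : manikin surface area (m^2)
    q_measured           : evaporative heat loss supplied by the manikin (W)
    q_env_correction     : heat drawn from the environment (W), added back
    returns              : evaporative resistance (kPa·m^2/W)
    """
    q_real = q_measured + q_env_correction
    return (p_skin_fabric - p_air) * area / q_real

ret = real_evaporative_resistance(p_skin_fabric=4.2, p_air=1.7, area=1.8,
                                  q_measured=220.0, q_env_correction=35.0)
print(f"Ret ≈ {ret * 1000:.1f} Pa·m²/W")
```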
Spectrofluorimetric methods of stability-indicating assay of certain drugs affecting the cardiovascular system

NASA Astrophysics Data System (ADS)

Moussa, B. A.; Mohamed, M. F.; Youssef, N. F.

2011-01-01

Two stability-indicating spectrofluorimetric methods have been developed for the determination of ezetimibe and olmesartan medoxomil, drugs affecting the cardiovascular system, and validated in the presence of their degradation products. The first method, for ezetimibe, is based on an oxidative coupling reaction of ezetimibe with 3-methylbenzothiazolin-2-one hydrazone hydrochloride in the presence of cerium (IV) ammonium sulfate in an acidic medium. The quenching effect of ezetimibe on the fluorescence of excess cerous ions is measured at the emission wavelength, λem, of 345 nm with the excitation wavelength, λex, of 296 nm. Factors affecting the reaction were carefully studied and optimized. The second method, for olmesartan medoxomil, is based on measuring the native fluorescence intensity of olmesartan medoxomil in methanol at λem = 360 nm with λex = 286 nm. Regression plots revealed good linear relationships over the assay ranges of 10-120 and 8-112 µg/ml for ezetimibe and olmesartan medoxomil, respectively. The validity of the methods was assessed according to the United States Pharmacopeia guidelines. Statistical analysis of the results gave satisfactory Student's t-test and F-ratio values. The methods were successfully applied to the analysis of ezetimibe and olmesartan medoxomil in drug substances and drug products, as well as in the presence of their degradation products.

Specification for Visual Requirements of Work-Centered Software Systems

DTIC Science & Technology

2006-10-01

...work-aiding systems. Based on the design concept for a work-centered support system (WCSS), these software systems support user tasks and goals through both direct and indirect aiding methods within the interface client. In order to ensure the coherent development and delivery of work-centered...

Novel optical scanning cryptography using Fresnel telescope imaging.

PubMed

Yan, Aimin; Sun, Jianfeng; Hu, Zhijuan; Zhang, Jingtao; Liu, Liren

2015-07-13

We propose a new method, called modified optical scanning cryptography, that uses a Fresnel telescope imaging technique for the encryption and decryption of remote objects. An image or object can be optically encrypted on the fly by the Fresnel telescope scanning system together with an encryption key. For image decryption, the encrypted signals are received and processed with an optical coherent heterodyne detection system. The proposed method performs strongly through its use of secure Fresnel telescope scanning with orthogonally polarized beams and efficient all-optical information processing. The validity of the proposed method is demonstrated by numerical simulations and experimental results.

Lattice hydrodynamic model based traffic control: A transportation cyber-physical system approach

NASA Astrophysics Data System (ADS)

Liu, Hui; Sun, Dihua; Liu, Weining

2016-11-01

The lattice hydrodynamic model is a typical continuum traffic flow model that properly describes the jamming transition of traffic flow. Previous studies of the lattice hydrodynamic model have shown that control methods have the potential to improve traffic conditions. In this paper, a new control method is applied to the lattice hydrodynamic model from a transportation cyber-physical system approach, in which only one lattice site needs to be controlled. Simulation verifies the feasibility and validity of the method, which can ensure efficient and smooth operation of the traffic flow.
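For readers unfamiliar with the model class, the sketch below simulates a toy ring-road version of the lattice hydrodynamic model (Nagatani's formulation) and applies a simple proportional flux feedback to a single lattice site. It illustrates only the single-controlled-site idea; the paper's actual control law is not reproduced, and whether this toy controller suppresses a jam depends on the gain and parameters chosen.

```python
# Toy lattice hydrodynamic model on a ring with single-site flux feedback.
# Model equations (Nagatani's formulation):
#   rho_j(t+tau) = rho_j(t) - tau*rho0*(q_j(t) - q_{j-1}(t))   (continuity)
#   q_j(t+tau)   = rho0 * V(rho_{j+1}(t))                      (flux relaxation)

import numpy as np

def V(rho, rho0=0.2, rhoc=0.2, vmax=2.0):
    """Optimal velocity function of the lattice hydrodynamic model."""
    return (vmax / 2.0) * (np.tanh(2.0 / rho0 - rho / rho0**2 - 1.0 / rhoc)
                           + np.tanh(1.0 / rhoc))

def simulate(n=100, steps=3000, tau=0.6, rho0=0.2, gain=0.3, control=True):
    rho = np.full(n, rho0)
    rho[0] += 0.05                      # small local perturbation
    q = rho0 * V(np.roll(rho, -1))      # flux driven by downstream density
    q_star = rho0 * V(rho0)             # steady uniform flux
    for _ in range(steps):
        q_next = rho0 * V(np.roll(rho, -1))
        if control:
            # feedback on one lattice site only: damp its flux deviation
            q_next[0] -= gain * (q[0] - q_star)
        rho = rho + tau * rho0 * (np.roll(q, 1) - q)  # cars are conserved
        q = q_next
    return rho

print("density spread, no control:  ", np.ptp(simulate(control=False)))
print("density spread, with control:", np.ptp(simulate(control=True)))
```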
Uniting statistical and individual-based approaches for animal movement modelling.

PubMed

Latombe, Guillaume; Parrott, Lael; Basille, Mathieu; Fortin, Daniel

2014-01-01

The dynamic nature of animals' internal states and environment directly shapes their spatial behaviours and gives rise to emergent properties at broader scales in natural systems. However, integrating these dynamic features into habitat selection studies remains challenging, because field access to internal states is practically impossible and current statistical models cannot produce dynamic outputs. To address these issues, we developed a robust method that combines statistical and individual-based modelling (IBM). Using a statistical technique for forward modelling of the IBM has the advantage of being faster to parameterize than a pure inverse modelling technique, and it allows robust selection of parameters. Using GPS locations from caribou monitored in Québec, caribou movements were modelled from generative mechanisms accounting for dynamic variables at a low level of emergence. These variables were accessed by replicating real individuals' movements in parallel sub-models, and movement parameters were then empirically parameterized using Step Selection Functions. The final IBM was validated using both k-fold cross-validation and emergent-pattern validation, and was tested for two scenarios with varying hardwood encroachment. Our results highlighted a functional response in habitat selection, which suggests that our method was able to capture the complexity of the natural system and adequately provide projections on future possible states of the system in response to different management plans. This is especially relevant for testing the long-term impact of scenarios corresponding to environmental configurations that have yet to be observed in real systems.

Non-Linear System Identification for Aeroelastic Systems with Application to Experimental Data

NASA Technical Reports Server (NTRS)

Kukreja, Sunil L.

2008-01-01

Representation and identification of a non-linear aeroelastic pitch-plunge system as a model of the NARMAX class is considered. A non-linear difference equation describing this aircraft model is derived theoretically and shown to be of the NARMAX form. Identification methods for NARMAX models are applied to the aeroelastic dynamics and their properties demonstrated via continuous-time simulations of experimental conditions. Simulation results show that (i) the outputs of the NARMAX model closely match those generated using continuous-time methods and (ii) NARMAX identification methods applied to aeroelastic dynamics provide accurate discrete-time parameter estimates. Application of NARMAX identification to experimental pitch-plunge dynamics data gives a high percent fit for cross-validated data.
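Polynomial NARMAX-type models are linear in their parameters, so discrete-time estimates like those reported above can be obtained by least squares once a regressor matrix of lagged terms is built. The sketch below identifies a toy nonlinear difference equation (not the pitch-plunge model) with white equation noise, where plain least squares is adequate; coloured noise would call for extended or orthogonal least squares.

```python
# Least-squares identification of a toy NARX difference equation.

import numpy as np

rng = np.random.default_rng(0)
N = 2000
u = rng.uniform(-1.0, 1.0, N)                 # persistently exciting input
y = np.zeros(N)
theta_true = np.array([0.5, 0.8, -0.3])       # y[k-1], u[k-1], y[k-1]*u[k-1]
for k in range(1, N):
    y[k] = (theta_true[0] * y[k - 1]
            + theta_true[1] * u[k - 1]
            + theta_true[2] * y[k - 1] * u[k - 1]
            + 0.01 * rng.standard_normal())   # white equation noise

# Regressor matrix of candidate model terms (linear in the parameters)
Phi = np.column_stack([y[:-1], u[:-1], y[:-1] * u[:-1]])
theta_hat, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
print("true:", theta_true, "estimated:", np.round(theta_hat, 3))
```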
Research on dynamic characteristics of motor vibration isolation system through mechanical impedance method

NASA Astrophysics Data System (ADS)

Zhao, Xingqian; Xu, Wei; Shuai, Changgeng; Hu, Zechao

2017-12-01

A mechanical impedance model of a coupled motor-shaft-bearing system has been developed to predict its dynamic characteristics and has been partially validated against the finite element method (FEM), including comparisons of the displacement amplitudes in the x and z directions at the two ends of the flexible coupling and of the normalized vertical reaction forces in the z direction at the bearing pedestals. The results demonstrate that the developed model can precisely predict the dynamic characteristics; the main advantage of this method is that it clearly illustrates the vibration behaviour of the motor subsystem, which plays an important role in the design of the isolation system.

An Advanced Computational Approach to System of Systems Analysis & Architecting Using Agent-Based Behavioral Model: Phase 2

DTIC Science & Technology

2013-11-18

...for each valid interface between the systems. The factor is proportional to the count of feasible interfaces in the meta-architecture framework... proportional to the square root of the sector area being covered by each type of system, plus some time for transmitting data to, and double checking by, the... [22] J.-H. Ahn, "An Architecture Description Method for Acknowledged System of Systems Based on Federated Architecture," in Advanced Science and...