Sample records for proposal evaluation process

  1. Evaluation criteria for commercially oriented materials processing in space proposals

    NASA Technical Reports Server (NTRS)

    Moore, W. F.; Mcdowell, J. R.

    1979-01-01

    An approach and criteria for evaluating NASA-funded experiments and demonstrations that have commercial potential were developed. Methods for ensuring quick initial screening of commercial proposals are presented. Recommendations are given for modifying the current evaluation approach. New criteria for evaluating commercially oriented materials processing in space (MPS) proposals are introduced. The process for selecting qualified individuals to evaluate the phases of this approach and criteria is considered, and guidelines are set for its implementation.

  2. Optimization of High-Dimensional Functions through Hypercube Evaluation

    PubMed Central

    Abiyev, Rahib H.; Tunay, Mustafa

    2015-01-01

    A novel learning algorithm for solving global numerical optimization problems is proposed. The proposed learning algorithm is an intense stochastic search method based on the evaluation and optimization of a hypercube, called the hypercube optimization (HO) algorithm. The HO algorithm comprises an initialization and evaluation process, a displacement-shrink process, and a searching space process. The initialization and evaluation process initializes an initial solution and evaluates the solutions in a given hypercube. The displacement-shrink process determines the displacement and evaluates objective functions using new points, and the searching space process determines the next hypercube using certain rules and evaluates the new solutions. The algorithms for these processes have been designed and are presented in the paper. The designed HO algorithm is tested on specific benchmark functions. Simulations of the HO algorithm have been performed for the optimization of functions of 1000, 5000, or even 10000 dimensions. The comparative simulation results with other approaches demonstrate that the proposed algorithm is a potential candidate for the optimization of both low- and high-dimensional functions. PMID:26339237
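
    The abstract only outlines the three HO processes; as a purely illustrative sketch (not the published algorithm), the following minimal Python routine samples candidate solutions in a hypercube, re-centres the cube on the best point found (displacement) and shrinks it each iteration. The parameter values and test function are assumptions.

      import numpy as np

      def hypercube_optimize(f, center, half_width, n_points=200, n_iter=100, shrink=0.9):
          """Minimize f by repeatedly sampling a hypercube, re-centring it on the
          best point found (displacement) and reducing its size (shrink)."""
          center = np.asarray(center, dtype=float)
          best_x, best_f = center.copy(), f(center)
          for _ in range(n_iter):
              # initialization/evaluation: sample and evaluate solutions in the current hypercube
              pts = center + np.random.uniform(-half_width, half_width, size=(n_points, center.size))
              vals = np.apply_along_axis(f, 1, pts)
              i = int(np.argmin(vals))
              if vals[i] < best_f:
                  best_x, best_f = pts[i].copy(), vals[i]
              # displacement-shrink: move the hypercube and narrow the searching space
              center, half_width = best_x, half_width * shrink
          return best_x, best_f

      # toy usage on a 1000-dimensional sphere function
      d = 1000
      _, f_best = hypercube_optimize(lambda v: float(np.sum(v ** 2)), np.full(d, 3.0), 5.0)
      print(round(f_best, 6))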

  3. 7 CFR 1487.6 - What are the criteria for evaluating proposals?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... § 1487.6 What are the criteria for evaluating proposals? (a) Evaluation criteria. FAS will use the... representation. (b) Evaluation process. FAS will review all proposals for eligibility and completeness and will..., and submit the proposals and funding recommendations to appropriate officials within FAS for decision...

  4. 7 CFR 1487.6 - What are the criteria for evaluating proposals?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... § 1487.6 What are the criteria for evaluating proposals? (a) Evaluation criteria. FAS will use the... representation. (b) Evaluation process. FAS will review all proposals for eligibility and completeness and will..., and submit the proposals and funding recommendations to appropriate officials within FAS for decision...

  5. 7 CFR 1487.6 - What are the criteria for evaluating proposals?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... § 1487.6 What are the criteria for evaluating proposals? (a) Evaluation criteria. FAS will use the... representation. (b) Evaluation process. FAS will review all proposals for eligibility and completeness and will..., and submit the proposals and funding recommendations to appropriate officials within FAS for decision...

  6. 7 CFR 3406.14 - Proposal review-teaching.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 15 2010-01-01 2010-01-01 false Proposal review-teaching. 3406.14 Section 3406.14... Review and Evaluation of a Teaching Proposal § 3406.14 Proposal review—teaching. The proposal evaluation process includes both internal staff review and merit evaluation by peer review panels comprised of...

  7. 48 CFR 2015.305 - Proposal evaluation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Source Selection Processes and Techniques 2015.305 Proposal evaluation. The contracting officer may provide offerors' cost proposals and supporting financial...

  8. Focus on Process Improvement: An Evaluation of the Use of the RFP Process in the Distribution of Federal Workforce Education Funds in Minnesota. Perkins-JTPA Evaluation.

    ERIC Educational Resources Information Center

    Feickert, Joan Davis; And Others

    A study evaluated the use of the request for proposal (RFP) process as a method of distributing federal vocational education and job training funds in Minnesota. Thirty-seven employees of Minnesota technical colleges, community-based organizations, service delivery areas, and state agencies who had actually prepared proposals requesting Job…

  9. A method to evaluate process performance by integrating time and resources

    NASA Astrophysics Data System (ADS)

    Wang, Yu; Wei, Qingjie; Jin, Shuang

    2017-06-01

    The purpose of process mining is to improve an enterprise's existing processes, so measuring process performance is particularly important. However, current research on performance evaluation methods is still insufficient: the main evaluation methods rely on time or resources alone, and these basic statistics cannot evaluate process performance very well. In this paper, a method for evaluating process performance based on both the time dimension and the resource dimension is proposed. The method can be used to measure the utilization and redundancy of resources in a process. This paper introduces the design principle and formula of the evaluation algorithm, then presents the design and implementation of the evaluation method. Finally, we use the evaluation method to analyse the event log of a telephone maintenance process and propose an optimization plan.
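
    The evaluation formula itself is not given in the abstract; as a hedged illustration only, per-resource utilization and redundancy could be computed from an event log with start and end timestamps roughly as follows (the log fields and the telephone-maintenance activities are hypothetical):

      from collections import defaultdict

      # hypothetical event log rows: (case_id, activity, resource, start_hour, end_hour)
      log = [
          ("c1", "register fault", "agent_a", 0.0, 0.5),
          ("c1", "repair line",    "tech_b",  1.0, 3.0),
          ("c2", "register fault", "agent_a", 0.5, 1.0),
          ("c2", "repair line",    "tech_b",  3.0, 4.5),
      ]

      def resource_metrics(events):
          """Return {resource: (utilization, redundancy)} over the whole log span.
          Overlapping work items for one resource are not merged in this sketch."""
          span = max(e[4] for e in events) - min(e[3] for e in events)
          busy = defaultdict(float)
          for _, _, resource, start, end in events:
              busy[resource] += end - start
          return {r: (b / span, 1.0 - b / span) for r, b in busy.items()}

      print(resource_metrics(log))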

  10. Optimization evaluation of cutting technology based on mechanical parts

    NASA Astrophysics Data System (ADS)

    Wang, Yu

    2018-04-01

    The relationship between the mechanical manufacturing process and carbon emission is studied on the basis of the mechanical manufacturing process flow. A carbon emission calculation formula suitable for mechanical manufacturing processes is derived. Based on this, a green evaluation method for the cold machining of mechanical parts is proposed. The proposed evaluation method is verified and its data analysed through an example. The results show that there is a strong relationship between mechanical manufacturing process data and carbon emissions.
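
    The derived formula is not reproduced in the abstract; a common simplification (an assumption here, not the paper's formula) converts the electrical energy of each machining step into CO2 through a grid emission factor:

      # hypothetical process data: (step, power_kw, time_h)
      steps = [("rough turning", 6.5, 0.20), ("finish milling", 4.0, 0.15), ("drilling", 2.2, 0.05)]

      GRID_FACTOR_KG_PER_KWH = 0.58  # assumed electricity emission factor, kg CO2 per kWh

      def machining_carbon(process_steps, factor=GRID_FACTOR_KG_PER_KWH):
          """Estimate CO2 (kg) as the summed electrical energy of all steps times the grid factor."""
          return sum(power_kw * time_h for _, power_kw, time_h in process_steps) * factor

      print(f"estimated emissions: {machining_carbon(steps):.2f} kg CO2")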

  11. A framework for evaluating proposals for scientific activities in wilderness

    Treesearch

    Peter Landres

    2000-01-01

    This paper presents a structured framework for evaluating proposals for scientific activities in wilderness. Wilderness managers receive proposals for scientific activities ranging from unobtrusive inventorying of plants and animals to the use of chainsaws and helicopters for collecting information. Currently, there is no consistent process for evaluating proposals,...

  12. 7 CFR 1487.6 - What are the criteria for evaluating proposals?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    .... FAS will use the following criteria in evaluating proposals: (1) The nature of the specific export... producer representation. (b) Evaluation process. FAS will review all proposals for eligibility and... within FAS for decision. FAS may, when appropriate to the subject matter of the proposal, request the...

  13. Proposed Framework for the Evaluation of Standalone Corpora Processing Systems: An Application to Arabic Corpora

    PubMed Central

    Al-Thubaity, Abdulmohsen; Alqifari, Reem

    2014-01-01

    Despite the accessibility of numerous online corpora, students and researchers engaged in the fields of Natural Language Processing (NLP), corpus linguistics, and language learning and teaching may encounter situations in which they need to develop their own corpora. Several commercial and free standalone corpora processing systems are available to process such corpora. In this study, we first propose a framework for the evaluation of standalone corpora processing systems and then use it to evaluate seven freely available systems. The proposed framework considers the usability, functionality, and performance of the evaluated systems while taking into consideration their suitability for Arabic corpora. While the results show that most of the evaluated systems exhibited comparable usability scores, the scores for functionality and performance were substantially different with respect to support for the Arabic language and N-grams profile generation. The results of our evaluation will help potential users of the evaluated systems to choose the system that best meets their needs. More importantly, the results will help the developers of the evaluated systems to enhance their systems and developers of new corpora processing systems by providing them with a reference framework. PMID:25610910

  14. Proposed framework for the evaluation of standalone corpora processing systems: an application to Arabic corpora.

    PubMed

    Al-Thubaity, Abdulmohsen; Al-Khalifa, Hend; Alqifari, Reem; Almazrua, Manal

    2014-01-01

    Despite the accessibility of numerous online corpora, students and researchers engaged in the fields of Natural Language Processing (NLP), corpus linguistics, and language learning and teaching may encounter situations in which they need to develop their own corpora. Several commercial and free standalone corpora processing systems are available to process such corpora. In this study, we first propose a framework for the evaluation of standalone corpora processing systems and then use it to evaluate seven freely available systems. The proposed framework considers the usability, functionality, and performance of the evaluated systems while taking into consideration their suitability for Arabic corpora. While the results show that most of the evaluated systems exhibited comparable usability scores, the scores for functionality and performance were substantially different with respect to support for the Arabic language and N-grams profile generation. The results of our evaluation will help potential users of the evaluated systems to choose the system that best meets their needs. More importantly, the results will help the developers of the evaluated systems to enhance their systems and developers of new corpora processing systems by providing them with a reference framework.

  15. 7 CFR 3406.19 - Proposal review-research.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 15 2013-01-01 2013-01-01 false Proposal review-research. 3406.19 Section 3406.19... AGRICULTURE 1890 INSTITUTION CAPACITY BUILDING GRANTS PROGRAM Review and Evaluation of a Research Proposal § 3406.19 Proposal review—research. The proposal evaluation process includes both internal staff review...

  16. 7 CFR 3406.19 - Proposal review-research.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 15 2012-01-01 2012-01-01 false Proposal review-research. 3406.19 Section 3406.19... AGRICULTURE 1890 INSTITUTION CAPACITY BUILDING GRANTS PROGRAM Review and Evaluation of a Research Proposal § 3406.19 Proposal review—research. The proposal evaluation process includes both internal staff review...

  17. 7 CFR 3406.19 - Proposal review-research.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 15 2011-01-01 2011-01-01 false Proposal review-research. 3406.19 Section 3406.19... AGRICULTURE 1890 INSTITUTION CAPACITY BUILDING GRANTS PROGRAM Review and Evaluation of a Research Proposal § 3406.19 Proposal review—research. The proposal evaluation process includes both internal staff review...

  18. TurboTech Technical Evaluation Automated System

    NASA Technical Reports Server (NTRS)

    Tiffany, Dorothy J.

    2009-01-01

    TurboTech software is a Web-based process that simplifies and semiautomates the technical evaluation of NASA proposals for Contracting Officer's Technical Representatives (COTRs). At the time of this reporting, there were no set standards or systems for training new COTRs in technical evaluations. This new process provides boilerplate text in response to interview-style questions. This text is collected into a Microsoft Word document that can then be further edited to conform to specific cases. By providing technical language and a structured format, TurboTech allows COTRs to concentrate more on the actual evaluation and less on deciding what language would be most appropriate. Since word choice is one of the more time-consuming parts of a COTR's job, this process should allow an increase in the quantity of proposals evaluated. TurboTech is applicable to composing technical evaluations of contractor proposals, task and delivery orders, change order modifications, requests for proposals, new work modifications, task assignments, as well as any changes to existing contracts.

  19. Effectiveness evaluation of double-layered satellite network with laser and microwave hybrid links based on fuzzy analytic hierarchy process

    NASA Astrophysics Data System (ADS)

    Zhang, Wei; Rao, Qiaomeng

    2018-01-01

    To address the demands of satellite communication networks for high speed and large capacity under limited spectrum resources, a double-layered satellite network with global seamless coverage based on laser and microwave hybrid links is proposed in this paper. By analyzing the characteristics of this double-layered hybrid-link network, an effectiveness evaluation index system for the network is established. The fuzzy analytic hierarchy process, which combines the analytic hierarchy process and fuzzy comprehensive evaluation theory, is then used to evaluate the effectiveness of the network, and the evaluation result for the proposed hybrid-link network is obtained by simulation. This effectiveness evaluation process can help to optimize the design of hybrid-link double-layered satellite networks and improve the operating efficiency of the satellite system.
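
    As a rough sketch of how AHP weights can be combined with a fuzzy comprehensive evaluation (the index system, weights, membership degrees and grade values below are illustrative assumptions, not the paper's data):

      import numpy as np

      # assumed AHP-derived weights for three effectiveness indices
      weights = np.array([0.5, 0.3, 0.2])          # e.g. capacity, coverage, link availability

      # assumed fuzzy membership of each index over grades (excellent, good, fair, poor)
      membership = np.array([
          [0.6, 0.3, 0.1, 0.0],
          [0.4, 0.4, 0.2, 0.0],
          [0.2, 0.5, 0.2, 0.1],
      ])

      grade_values = np.array([90, 75, 60, 40])    # assumed numeric value of each grade

      fuzzy_vector = weights @ membership          # weighted fuzzy evaluation vector
      effectiveness = fuzzy_vector @ grade_values  # defuzzified overall effectiveness score
      print(fuzzy_vector.round(3), round(float(effectiveness), 1))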

  20. 78 FR 43205 - Proposed Substances To Be Evaluated for Set 27 Toxicological Profiles

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-19

    .... The Set 27 nomination process includes consideration of all substances on ATSDR's Priority List of... No. ATSDR-2013-0002] Proposed Substances To Be Evaluated for Set 27 Toxicological Profiles AGENCY...). ACTION: Request for comments on the proposed substances to be evaluated for Set 27 toxicological profiles...

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Szoka de Valladares, M.R.; Mack, S.

    The DOE Hydrogen Program needs to develop criteria as part of a systematic evaluation process for proposal identification, evaluation and selection. The H Scan component of this process provides a framework in which a project proposer can fully describe their candidate technology system and its components. The H Scan complements traditional methods of capturing cost and technical information. It consists of a special set of survey forms designed to elicit information so expert reviewers can assess the proposal relative to DOE-specified selection criteria. The Analytic Hierarchy Process (AHP) component of the decision process assembles the management-defined evaluation and selection criteria into a coherent multi-level decision construct by which projects can be evaluated in pair-wise comparisons. The AHP model will reflect management's objectives and will assist in the ranking of individual projects based on the extent to which each contributes to management's objectives. This paper contains a detailed description of the products and activities associated with the planning and evaluation process: the objectives or criteria; the H Scan; and the Analytic Hierarchy Process (AHP).
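
    A minimal sketch of the AHP step described above: a pairwise comparison matrix over selection criteria is reduced to a priority (weight) vector via its principal eigenvector, with a consistency check. The criteria and judgments below are illustrative, not DOE's.

      import numpy as np

      # illustrative pairwise comparisons (Saaty 1-9 scale) among three criteria:
      # cost, technical readiness, hydrogen-program fit
      A = np.array([
          [1.0, 3.0, 0.5],
          [1 / 3, 1.0, 0.25],
          [2.0, 4.0, 1.0],
      ])

      eigvals, eigvecs = np.linalg.eig(A)
      k = int(np.argmax(eigvals.real))
      weights = eigvecs[:, k].real
      weights /= weights.sum()                         # normalized priority vector

      ci = (eigvals[k].real - len(A)) / (len(A) - 1)   # consistency index
      cr = ci / 0.58                                   # random index RI = 0.58 for n = 3
      print("weights:", weights.round(3), "consistency ratio:", round(cr, 3))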

  2. 15 CFR 290.7 - Proposal selection process.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS REGIONAL CENTERS FOR... proposals will be reviewed by NIST to assure compliance with § 290.5 of these procedures. Proposals which... Director of NIST will appoint an evaluation panel to review and evaluate all qualified proposals in...

  3. 15 CFR 290.7 - Proposal selection process.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS REGIONAL CENTERS FOR... proposals will be reviewed by NIST to assure compliance with § 290.5 of these procedures. Proposals which... Director of NIST will appoint an evaluation panel to review and evaluate all qualified proposals in...

  4. 15 CFR 290.7 - Proposal selection process.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS REGIONAL CENTERS FOR... proposals will be reviewed by NIST to assure compliance with § 290.5 of these procedures. Proposals which... Director of NIST will appoint an evaluation panel to review and evaluate all qualified proposals in...

  5. 15 CFR 290.7 - Proposal selection process.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS REGIONAL CENTERS FOR... proposals will be reviewed by NIST to assure compliance with § 290.5 of these procedures. Proposals which... Director of NIST will appoint an evaluation panel to review and evaluate all qualified proposals in...

  6. 15 CFR 290.7 - Proposal selection process.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS REGIONAL CENTERS FOR... proposals will be reviewed by NIST to assure compliance with § 290.5 of these procedures. Proposals which... Director of NIST will appoint an evaluation panel to review and evaluate all qualified proposals in...

  7. Understanding the Federal Proposal Review Process.

    ERIC Educational Resources Information Center

    Cavin, Janis I.

    Information on the peer review process for the evaluation of federal grant proposals is presented to help college grants administrators and faculty develop good proposals. This guidebook provides an overview of the policies and conventions that govern the review and selection of proposals for funding, and details the review procedures of the…

  8. 76 FR 37344 - Technology Evaluation Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-27

    ...-NOA-0039] Technology Evaluation Process AGENCY: Office of Energy Efficiency and Renewable Energy... is an extension of a prior RFI seeking comment on a proposed commercial buildings technology... seeks comments and information related to a commercial buildings technology evaluation process. DOE is...

  9. 48 CFR 315.305 - Proposal evaluation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... following elements: (1) An explanation of the evaluation process and the role of evaluators throughout the... include, at a minimum, the following elements: (1) A list of recommended technical evaluation panel... that the technical evaluation will have in the award decision. (2) The technical evaluation process...

  10. 15 CFR 292.5 - Proposal selection process.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS MANUFACTURING... proposals will be reviewed by NIST to assure compliance with the proposal content and other basic provisions... and selection of finalists. NIST will appoint an evaluation panel to review and evaluate all qualified...

  11. 15 CFR 292.5 - Proposal selection process.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS MANUFACTURING... proposals will be reviewed by NIST to assure compliance with the proposal content and other basic provisions... and selection of finalists. NIST will appoint an evaluation panel to review and evaluate all qualified...

  12. 15 CFR 292.5 - Proposal selection process.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS MANUFACTURING... proposals will be reviewed by NIST to assure compliance with the proposal content and other basic provisions... and selection of finalists. NIST will appoint an evaluation panel to review and evaluate all qualified...

  13. 15 CFR 292.5 - Proposal selection process.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS MANUFACTURING... proposals will be reviewed by NIST to assure compliance with the proposal content and other basic provisions... and selection of finalists. NIST will appoint an evaluation panel to review and evaluate all qualified...

  14. 15 CFR 292.5 - Proposal selection process.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS MANUFACTURING... proposals will be reviewed by NIST to assure compliance with the proposal content and other basic provisions... and selection of finalists. NIST will appoint an evaluation panel to review and evaluate all qualified...

  15. An evaluation and implementation of rule-based Home Energy Management System using the Rete algorithm.

    PubMed

    Kawakami, Tomoya; Fujita, Naotaka; Yoshihisa, Tomoki; Tsukamoto, Masahiko

    2014-01-01

    In recent years, sensors have become popular, and the Home Energy Management System (HEMS) plays an important role in saving energy without decreasing QoL (Quality of Life). Many rule-based HEMSs have been proposed, and almost all of them assume "IF-THEN" rules. The Rete algorithm is a typical pattern matching algorithm for IF-THEN rules. We have proposed a rule-based HEMS using the Rete algorithm. In the proposed system, rules for managing energy are processed by smart taps in the network, and the loads for processing rules and collecting data are distributed among the smart taps. In addition, the number of processes and the amount of collected data are reduced by processing rules based on the Rete algorithm. In this paper, we evaluated the proposed system by simulation. In the simulation environment, rules are processed by the smart tap that relates to the action part of each rule. In addition, we implemented the proposed system as a HEMS using smart taps.
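
    A toy sketch of the IF-THEN rule style such systems assume (a naive matcher that re-evaluates every rule, not the Rete network itself; the sensor facts and rules are hypothetical):

      # working memory of sensor facts, e.g. as reported by smart taps
      facts = {"room.temperature": 29.0, "room.occupied": False, "ac.power": "on"}

      # IF-THEN rules: (condition over the facts, action to take)
      rules = [
          (lambda f: not f["room.occupied"] and f["ac.power"] == "on",
           "turn the air conditioner off"),
          (lambda f: f["room.occupied"] and f["room.temperature"] > 28,
           "turn the air conditioner on"),
      ]

      def fire(facts, rules):
          """Naively re-check every rule; Rete would instead propagate only changed facts."""
          return [action for condition, action in rules if condition(facts)]

      print(fire(facts, rules))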

  16. A Module Experimental Process System Development Unit (MEPSDU)

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The purpose of this program is to demonstrate the technical readiness of a cost-effective process sequence that has the potential for the production of flat plate photovoltaic modules which meet the 1986 price goal of $0.70 or less per peak watt. Program efforts included: preliminary design review; preliminary cell fabrication using the proposed process sequence; verification of sandblasting back cleanup; study of resist parameters; evaluation of pull strength of the proposed metallization; measurement of contact resistance of electroless Ni contacts; optimization of process parameters; design of the MEPSDU module; identification and testing of insulator tapes; development of a lamination process sequence; identification, discussions, demonstrations and visits with candidate equipment vendors; and evaluation of proposals for a tabbing and stringing machine.

  17. Benchmark simulation Model no 2 in Matlab-simulink: towards plant-wide WWTP control strategy evaluation.

    PubMed

    Vreck, D; Gernaey, K V; Rosen, C; Jeppsson, U

    2006-01-01

    In this paper, the implementation of the Benchmark Simulation Model No 2 (BSM2) within Matlab-Simulink is presented. The BSM2 is developed for plant-wide WWTP control strategy evaluation on a long-term basis. It consists of a pre-treatment process, an activated sludge process and sludge treatment processes. Extended evaluation criteria are proposed for plant-wide control strategy assessment. Default open-loop and closed-loop strategies are also proposed to be used as references with which to compare other control strategies. Simulations indicate that the BSM2 is an appropriate tool for plant-wide control strategy evaluation.

  18. 48 CFR 1415.207-71 - Confidentiality of proposal evaluation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... committee appointed to evaluate proposals shall discuss or disclose any information on the number, identity... brief all members and advisors on the sensitivity of the evaluation process and the prohibition against unauthorized disclosure of information. At this meeting each member and advisor shall sign a Confidentiality...

  19. 48 CFR 1415.207-71 - Confidentiality of proposal evaluation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... committee appointed to evaluate proposals shall discuss or disclose any information on the number, identity... brief all members and advisors on the sensitivity of the evaluation process and the prohibition against unauthorized disclosure of information. At this meeting each member and advisor shall sign a Confidentiality...

  20. 23 CFR 636.304 - What process may be used to rate and score proposals?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... ENGINEERING AND TRAFFIC OPERATIONS DESIGN-BUILD CONTRACTING Proposal Evaluation Factors § 636.304 What process... any rating method or combination of methods including color or adjectival ratings, numerical weights...

  1. 32 CFR 206.4 - Proposal development and review.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) The NSEP will use a two-stage review process in order to evaluate a broad range of proposal ideas. In...-stage process, potential grantees are given an opportunity to present their ideas without creating a...

  2. 32 CFR 206.4 - Proposal development and review.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) The NSEP will use a two-stage review process in order to evaluate a broad range of proposal ideas. In...-stage process, potential grantees are given an opportunity to present their ideas without creating a...

  3. Pythagorean fuzzy analytic hierarchy process to multi-criteria decision making

    NASA Astrophysics Data System (ADS)

    Mohd, Wan Rosanisah Wan; Abdullah, Lazim

    2017-11-01

    Numerous approaches have been proposed in the literature to determine criteria weights, which are very significant in the decision-making process. One outstanding approach used to determine the weights of criteria is the analytic hierarchy process (AHP). This method involves decision makers (DMs) in evaluating the decision by forming pairwise comparisons between criteria and alternatives. In classical AHP, the linguistic variables of the pairwise comparison are presented as crisp values. However, this is not appropriate for representing real problems because it ignores the uncertainty in linguistic judgment. For this reason, AHP has been extended by incorporating Pythagorean fuzzy sets. In addition, no work in the literature has proposed how to determine criteria weights using AHP under Pythagorean fuzzy sets. In order to solve the MCDM problem, a Pythagorean fuzzy analytic hierarchy process is proposed to determine the weights of the evaluation criteria. Using linguistic variables, pairwise comparisons of the evaluation criteria are made and converted into criteria weights using Pythagorean fuzzy numbers (PFNs). The proposed method is implemented in an evaluation problem in order to demonstrate its applicability. This study shows that the proposed method provides a useful way and a new direction for solving MCDM problems in a Pythagorean fuzzy context.
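
    For readers unfamiliar with the notation: a Pythagorean fuzzy number (PFN) pairs a membership degree mu and a non-membership degree nu with mu^2 + nu^2 <= 1, and PFNs are often ranked by the score mu^2 - nu^2. The sketch below assumes that common score function, which may differ from the one used in the paper.

      from dataclasses import dataclass

      @dataclass
      class PFN:
          mu: float   # membership degree
          nu: float   # non-membership degree

          def __post_init__(self):
              assert self.mu ** 2 + self.nu ** 2 <= 1.0 + 1e-9, "not a valid PFN"

          def score(self) -> float:
              # a widely used score function for ranking PFNs
              return self.mu ** 2 - self.nu ** 2

      # two linguistic pairwise judgments encoded as PFNs
      a, b = PFN(0.9, 0.3), PFN(0.7, 0.5)
      print(a.score(), b.score(), a.score() > b.score())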

  4. [Analysis of evaluation process of research projects submitted to the Fondo de Investigación Sanitaria, Spain].

    PubMed

    Prieto Carles, C; Gómez-Gerique, J; Gutiérrez Millet, V; Veiga de Cabo, J; Sanz Martul, E; Mendoza Hernández, J L

    2000-10-07

    At present it is clear that strengthening research is both necessary and the right way to develop technological innovation, services and patents. However, such strengthening and the corresponding funding need to rest on a fine and rigorous evaluation process, to which all research projects applying to a public or private call for proposals should be submitted in order to ensure coherence with the investment to be made. To this end, the main aim of this work was to analyse the evaluation process traditionally used by the Fondo de Investigación Sanitaria (FIS) and to propose the most appropriate modifications. A sample of 431 research projects from the 1998 call was analysed. The evaluations from FIS and ANEP (National Evaluation and Prospective Agency) were themselves evaluated and scored (evaluation quality) in their main contents by 3 independent evaluators, and the results were compared between the agencies at the internal (FIS) and external (FIS/ANEP) level. The FIS evaluation had 20 commissions or areas of knowledge. The internal (FIS) analysis clearly showed that evaluation quality was correlated with the assigned commission (F = 3.71; p < 0.001) and with the duration of the proposed research (F = 3.42; p < 0.05), but not with the evaluator. On the other hand, the quality of the ANEP evaluation depended on all three of these factors. Overall, the ANEP evaluation was better than the FIS evaluation for three-year projects, but showed no significant differences for one- or two-year projects. In all cases, evaluations with a negative final result (financing denied) showed a higher average quality than positive evaluations. The results obtained point to the convenience of making some changes in the evaluative structure and of reviewing the set of FIS technical commissions in order to improve the evaluation process.

  5. Crew Transportation Technical Standards and Design Evaluation Criteria

    NASA Technical Reports Server (NTRS)

    Lueders, Kathryn L.; Thomas, Rayelle E. (Compiler)

    2015-01-01

    Crew Transportation Technical Standards and Design Evaluation Criteria contains descriptions of technical, safety, and crew health medical processes and specifications, and the criteria which will be used to evaluate the acceptability of the Commercial Providers' proposed processes and specifications.

  6. Online Deviation Detection for Medical Processes

    PubMed Central

    Christov, Stefan C.; Avrunin, George S.; Clarke, Lori A.

    2014-01-01

    Human errors are a major concern in many medical processes. To help address this problem, we are investigating an approach for automatically detecting when performers of a medical process deviate from the acceptable ways of performing that process as specified by a detailed process model. Such deviations could represent errors and, thus, detecting and reporting deviations as they occur could help catch errors before harm is done. In this paper, we identify important issues related to the feasibility of the proposed approach and empirically evaluate the approach for two medical procedures, chemotherapy and blood transfusion. For the evaluation, we use the process models to generate sample process executions that we then seed with synthetic errors. The process models describe the coordination of activities of different process performers in normal, as well as in exceptional situations. The evaluation results suggest that the proposed approach could be applied in clinical settings to help catch errors before harm is done. PMID:25954343

  7. 76 FR 814 - Self-Regulatory Organizations; NASDAQ OMX PHLX LLC; Order Approving a Proposed Rule Change To...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-06

    ... proposes to establish new minimum performance standards for specialist units.\\12\\ Specifically, new Rule... Streamline the Process for Specialist Evaluations and Clarify the Time Within Which SQTs and RSQTs Must Begin...-4 thereunder,\\2\\ a proposed rule change to update and streamline the process for specialist...

  8. Associating versus Proposing or Associating What We Propose: Comment on Gawronski and Bodenhausen

    ERIC Educational Resources Information Center

    Albarracin, Dolores; Hart, William; McCulloch, Kathleen C.

    2006-01-01

    This commentary on the article by B. Gawronski and G. V. Bodenhausen (see record 2006-10465-003) highlights the strengths of the associative-propositional evaluation model. It then describes problems in proposing a qualitative separation between propositional and associative processes. Propositional processes are instead described as associative.…

  9. A Proposed Process for Managing the First Amendment Aspects of Campus Hate Speech.

    ERIC Educational Resources Information Center

    Kaplan, William A.

    1992-01-01

    A carefully structured process for campus administrative decision making concerning hate speech is proposed and suggestions for implementation are offered. In addition, criteria for evaluating hate speech processes are outlined, and First Amendment principles circumscribing the institution's discretion to regulate hate speech are discussed.…

  10. JPEG XS call for proposals subjective evaluations

    NASA Astrophysics Data System (ADS)

    McNally, David; Bruylants, Tim; Willème, Alexandre; Ebrahimi, Touradj; Schelkens, Peter; Macq, Benoit

    2017-09-01

    In March 2016 the Joint Photographic Experts Group (JPEG), formally known as ISO/IEC SC29 WG1, issued a call for proposals soliciting compression technologies for a low-latency, lightweight and visually transparent video compression scheme. Within the JPEG family of standards, this scheme was denominated JPEG XS. The subjective evaluation of visually lossless compressed video sequences at high resolutions and bit depths poses particular challenges. This paper describes the adopted procedures, the subjective evaluation setup, the evaluation process and summarizes the obtained results which were achieved in the context of the JPEG XS standardization process.

  11. Evaluation of Models of the Reading Process.

    ERIC Educational Resources Information Center

    Balajthy, Ernest

    A variety of reading process models have been proposed and evaluated in reading research. Traditional approaches to model evaluation specify the workings of a system in a simplified fashion to enable organized, systematic study of the system's components. Following are several statistical methods of model evaluation: (1) empirical research on…

  12. Carbon dioxide mineralization process design and evaluation: concepts, case studies, and considerations.

    PubMed

    Yuen, Yeo Tze; Sharratt, Paul N; Jie, Bu

    2016-11-01

    Numerous carbon dioxide mineralization (CM) processes have been proposed to overcome the slow rate of natural weathering of silicate minerals. Ten of these proposals are discussed in this article. The proposals are described in terms of the four major areas of CM process design: pre-treatment, purification, carbonation, and reagent recycling operations. Any known specifics based on probable or representative operating and reaction conditions are listed, and a basic analysis of the strengths and shortcomings associated with the individual process designs is given. The processes typically employ physical or chemical pseudo-catalytic methods to enhance the rate of carbon dioxide mineralization; however, both methods have their own associated advantages and problems. To examine the feasibility of a CM process, three key aspects should be included in the evaluation criteria: energy use, operational considerations, and product value and economics. Recommendations regarding the optimal level of emphasis and the implementation of measures to address these aspects are given; these will depend very much on the desired process objectives. Ultimately, a mix-and-match approach to process design might be required to provide viable and economical proposals for CM processes.

  13. The NASA SARP Software Research Infusion Initiative

    NASA Technical Reports Server (NTRS)

    Hinchey, Mike; Pressburger, Tom; Markosian, Lawrence; Feather, Martin

    2006-01-01

    A viewgraph presentation describing the NASA Software Assurance Research Program (SARP) research infusion projects is shown. The topics include: 1) Background/Motivation; 2) Proposal Solicitation Process; 3) Proposal Evaluation Process; 4) Overview of Some Projects to Date; and 5) Lessons Learned.

  14. Evaluation of engineering foods for closed Ecological Life Support System (CELSS)

    NASA Technical Reports Server (NTRS)

    Karel, M.

    1982-01-01

    A nutritionally adequate and acceptable diet was evaluated and developed. A design for a multipurpose food plant is discussed. The types and amounts of foods needed to be regenerated in a partially closed ecological life support system (PCELSS) were proposed. All steps of food processes to be utilized in the multipurpose food plant of PCELSS were also considered. Equipment specifications, simplification of the proposed processes, and food waste treatment were analyzed.

  15. An Effective Measured Data Preprocessing Method in Electrical Impedance Tomography

    PubMed Central

    Yu, Chenglong; Yue, Shihong; Wang, Jianpei; Wang, Huaxiang

    2014-01-01

    As an advanced process detection technology, electrical impedance tomography (EIT) has received wide attention and study in industrial fields. However, EIT techniques are greatly limited by low spatial resolution. This problem may result from incorrect preprocessing of the measured data and the lack of a general criterion to evaluate different preprocessing procedures. In this paper, an EIT data preprocessing method based on all-rooting the measured data is proposed and is evaluated by two constructed indexes based on the all-rooted EIT measured data. By finding the optima of the two indexes, the proposed method can be applied to improve EIT imaging spatial resolution. For a theoretical model, the optimal rooting times of the two indexes range in [0.23, 0.33] and [0.22, 0.35], respectively. Moreover, the factors that affect the correctness of the proposed method are analysed in general. Measured data preprocessing is necessary and helpful for any imaging process, so the proposed method can be generally and widely used in imaging. Experimental results validate the two proposed indexes. PMID:25165735
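
    A sketch of the rooting idea as the abstract describes it, assuming the reported "rooting times" are root exponents applied to every measurement; the two evaluation indexes themselves are not reproduced, and the data are illustrative.

      import numpy as np

      def root_measurements(voltages, r):
          """All-root the EIT boundary measurements: v -> sign(v) * |v|**r, with 0 < r < 1."""
          v = np.asarray(voltages, dtype=float)
          return np.sign(v) * np.abs(v) ** r

      # illustrative measurement frame; the paper reports optima roughly in [0.22, 0.35]
      frame = np.array([0.012, 0.094, 0.31, 0.0021, 0.047])
      for r in (0.23, 0.28, 0.33):
          print(r, root_measurements(frame, r).round(4))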

  16. The Future of the Space Age or how to Evaluate Innovative Ideas

    NASA Astrophysics Data System (ADS)

    Vollerthun, A.; Fricke, E.

    2002-05-01

    Based on an initiative of the German Aerospace Industry Association to foster more transparent and structured funding of German commercially oriented space projects, a three-phased approach is suggested in this paper to stepwise improve and evaluate proposed concepts for space-related innovations. The objective of this concept was to develop a transparent, structured, and reproducible process for selecting the right innovative project, in terms of political, economic, and technical objectives, for funding by e.g. a governmental agency. A stepwise process and related methods that cover technical as well as economic aspects (and related sensitivities) are proposed. Based on the special needs and requirements of the space industry, the proposals are compared to a set of predefined top-level objectives/requirements. Using an initial trades analysis with the criteria company, technology, product, and market, an initial business case is analyzed. In the third process step, the alternative innovative concepts are subjected to a very detailed analysis. The full economic and technical scale of the projects is evaluated and metrics such as the 'Return on Investment' or 'Break Even Point' are determined to compare the various innovations. Risks related to time, cost, and quality are considered when performing sensitivity analyses by varying the most important factors of the project. Before discussing critical aspects of the proposed process, space-related examples are presented to show how the process could be applied and how different concepts should be evaluated.

  17. 78 FR 21912 - Proposed Information Collection; Comment Request; Processed Products Family of Forms

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-12

    ... Collection; Comment Request; Processed Products Family of Forms AGENCY: National Oceanic and Atmospheric... NOAA in the economic and social analyses developed when proposing and evaluating fishery management... collection). Affected Public: Business or other for-profit organizations. Estimated Number of Respondents...

  18. A Participatory Action Research Approach To Evaluating Inclusive School Programs.

    ERIC Educational Resources Information Center

    Dymond, Stacy K.

    2001-01-01

    This article proposes a model for evaluating inclusive schools. Key elements of the model are inclusion of stakeholders in the evaluation process through a participatory action research approach, analysis of program processes and outcomes, use of multiple methods and measures, and obtaining perceptions from diverse stakeholder groups. (Contains…

  19. Looking Across and Looking Beyond the Knowledge Frontier: Intellectual Distance, Novelty, and Resource Allocation in Science.

    PubMed

    Boudreau, Kevin J; Guinan, Eva C; Lakhani, Karim R; Riedl, Christoph

    2016-10-01

    Selecting among alternative projects is a core management task in all innovating organizations. In this paper, we focus on the evaluation of frontier scientific research projects. We argue that the "intellectual distance" between the knowledge embodied in research proposals and an evaluator's own expertise systematically relates to the evaluations given. To estimate relationships, we designed and executed a grant proposal process at a leading research university in which we randomized the assignment of evaluators and proposals to generate 2,130 evaluator-proposal pairs. We find that evaluators systematically give lower scores to research proposals that are closer to their own areas of expertise and to those that are highly novel. The patterns are consistent with biases associated with boundedly rational evaluation of new ideas. The patterns are inconsistent with intellectual distance simply contributing "noise" or being associated with private interests of evaluators. We discuss implications for policy, managerial intervention, and allocation of resources in the ongoing accumulation of scientific knowledge.

  20. Proposal of Modification Strategy of NC Program in the Virtual Manufacturing Environment

    NASA Astrophysics Data System (ADS)

    Narita, Hirohisa; Chen, Lian-Yi; Fujimoto, Hideo; Shirase, Keiichi; Arai, Eiji

    Virtual manufacturing will be a key technology in process planning because there are no evaluation tools for cutting conditions. Therefore, a virtual machining simulator (VMSim) that can predict end milling processes has been developed. A modification strategy for NC programs using VMSim is proposed in this paper.

  1. Mass Transit: Implementation of FTA’s New Starts Evaluation Process and FY 2001 Funding Proposals

    DTIC Science & Technology

    2000-04-01

    formalize the process. FTA issued a proposed rule on April 7, 1999, and plans to issue final regulations by the summer of 2000. In selecting projects for...commit funds to any more New Starts projects during the last 2 years of TEA-21—through fiscal year 2003. Because there are plans for many more...regional review of alternatives, develop preliminary engineering plans, and meet FTA's approval for the final design. TEA-21 requires that FTA evaluate

  2. An Analysis of Department of Energy Cost Proposal Process and Effectiveness

    DTIC Science & Technology

    2011-10-11

    processes to mitigate and manage risk, rather than derive upfront assessment and quantification of proposal risk (DoE, 2008a). The proposal...2. GM 2 - Enhance the Federal Contract and Project Management Workforce Substantially Complete 3. GM 3 - Improve Project Risk Assessment ...proposal, contract proposal evaluation, risk, cost analysis

  3. Towards a Cloud Based Smart Traffic Management Framework

    NASA Astrophysics Data System (ADS)

    Rahimi, M. M.; Hakimpour, F.

    2017-09-01

    Traffic big data has brought many opportunities for traffic management applications. However, several challenges, such as the heterogeneity, storage, management, processing and analysis of traffic big data, may hinder its efficient and real-time application. All these challenges call for a well-adapted distributed framework for smart traffic management that can efficiently handle big traffic data integration, indexing, query processing, mining and analysis. In this paper, we present a novel, distributed, scalable and efficient framework for traffic management applications. The proposed cloud computing based framework can address the technical challenges of efficient and real-time storage, management, processing and analysis of traffic big data. For evaluation of the framework, we used OpenStreetMap (OSM) real trajectories and road networks in a distributed environment. Our evaluation results indicate that the speed of data import into the framework exceeds 8000 records per second when the size of the dataset is near 5 million. We also evaluated the performance of data retrieval in the proposed framework; the data retrieval speed exceeds 15000 records per second at the same dataset size. Finally, we evaluated the scalability and performance of the proposed framework by parallelising a critical pre-analysis step in transportation applications. The results show that the proposed framework achieves considerable performance and efficiency in traffic management applications.

  4. The comparative evaluation of expanded national immunization policies in Korea using an analytic hierarchy process.

    PubMed

    Shin, Taeksoo; Kim, Chun-Bae; Ahn, Yang-Heui; Kim, Hyo-Youl; Cha, Byung Ho; Uh, Young; Lee, Joo-Heon; Hyun, Sook-Jung; Lee, Dong-Han; Go, Un-Yeong

    2009-01-29

    The purpose of this paper is to propose new evaluation criteria and an analytic hierarchy process (AHP) model to assess the expanded national immunization programs (ENIPs) and to evaluate two alternative health care policies: one in which private clinics and hospitals would offer free vaccination services to children, and one in which public health centers would offer these free services. Our model to evaluate the ENIPs was developed using brainstorming, Delphi techniques, and the AHP. We first used brainstorming and Delphi techniques, as well as literature reviews, to determine 25 criteria with which to evaluate the national immunization policy; we then proposed a hierarchical structure of the AHP model to assess ENIPs. By applying the proposed AHP model to the assessment of ENIPs for Korean immunization policies, we show that free vaccination services should be provided by private clinics and hospitals rather than public health centers.

  5. 77 FR 35665 - Notice of Proposed Information Collection Requests; Office of Planning, Evaluation and Policy...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-14

    ... evaluations may include both formative implementation and process evaluations that evaluate a program as it is unfolding, and summative descriptive evaluations that examine changes in final outcomes in a non-causal...

  6. Investigating the Gap Between Estimated and Actual Energy Efficiency and Conservation Savings for Public Buildings Projects & Programs in United States

    NASA Astrophysics Data System (ADS)

    Qaddus, Muhammad Kamil

    The gap between estimated and actual savings in energy efficiency and conservation (EE&C) projects and programs forms the problem statement of this thesis, within the scope of public and government buildings. This gap is analyzed first at the impact level and then at the process level. At the impact level, the methodology categorizes the gap as a 'realization gap' and views this categorization within the context of past and current narratives linked to the realization gap. At the process level, the methodology analyzes the realization gap on a process-evaluation basis. The resulting process evaluation criterion is then applied to two different programs (DESEU and NYC ACE) linked to the scope of this thesis. Drawing on the synergies of the impact- and process-level analyses, the thesis offers proposals on program development and structure using the process evaluation criterion. An innovative financing and benefits distribution structure is developed as part of the proposal: Restricted Stakeholder Crowd Financing and Risk-Free Incentivized Return are the products of the proposed financing and benefit distribution structures, respectively. These products are complemented by an alternative approach to estimating EE&C savings that advocates estimation based on range allocation rather than the currently used unique estimated-savings approach. The Way Ahead section explores the synergy between financial and engineering ranges of energy savings as a multi-discipline approach for future research, and provides the proposed program structure with risk aversion and incentive allocation under uncertainty. This set of new approaches is believed to better fill the realization gap between estimated and actual energy efficiency savings.

  7. The RFP Challenge: Evaluation of Responses to Requests for Proposals.

    ERIC Educational Resources Information Center

    LeDuc, A. L.

    1993-01-01

    When a college contracts for a product or service, the request for proposal (RFP) may be an appropriate tool. It must clearly define what is needed for fair evaluation. A rating or weighting system for costs and benefits is useful for isolating and quantifying important elements of the decision-making process. (MSE)

  8. New knowledge network evaluation method for design rationale management

    NASA Astrophysics Data System (ADS)

    Jing, Shikai; Zhan, Hongfei; Liu, Jihong; Wang, Kuan; Jiang, Hao; Zhou, Jingtao

    2015-01-01

    Current design rationale (DR) systems have not demonstrated the value of the approach in practice, since little attention has been paid to methods for evaluating DR knowledge. To systematize the knowledge management process for future computer-aided DR applications, a prerequisite is to provide a measure for DR knowledge. In this paper, a new knowledge network evaluation method for DR management is presented. The method characterizes the value of DR knowledge from four perspectives, namely, the design rationale structure scale, association knowledge and reasoning ability, degree of design justification support, and degree of knowledge representation conciseness. A comprehensive value of the DR knowledge is also measured by the proposed method. To validate the method, different styles of DR knowledge networks and the performance of the proposed measures are discussed. The evaluation method has been applied in two realistic design cases and compared with structural measures. The research proposes a DR knowledge evaluation method that can provide objective metrics and a selection basis for DR knowledge reuse during the product design process. In addition, the method is shown to give more effective guidance and support for the application and management of DR knowledge.
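
    The aggregation formula is not given in the abstract; purely as an illustrative assumption, the four perspective scores could be folded into the comprehensive value with a weighted sum such as:

      # hypothetical per-perspective scores for one DR knowledge network, each in [0, 1]
      scores = {
          "structure_scale": 0.72,
          "association_and_reasoning": 0.65,
          "design_justification_support": 0.80,
          "representation_conciseness": 0.58,
      }

      # assumed weights; the paper's own weighting of the four perspectives is not reproduced
      weights = {
          "structure_scale": 0.25,
          "association_and_reasoning": 0.30,
          "design_justification_support": 0.30,
          "representation_conciseness": 0.15,
      }

      comprehensive_value = sum(scores[k] * weights[k] for k in scores)
      print(round(comprehensive_value, 3))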

  9. Construction and Validation of a Holistic Education School Evaluation Tool Using Montessori Erdkinder Principles

    ERIC Educational Resources Information Center

    Setari, Anthony Philip

    2016-01-01

    The purpose of this study was to construct a holistic education school evaluation tool using Montessori Erdkinder principles, and begin the validation process of examining the proposed tool. This study addresses a vital need in the holistic education community for a school evaluation tool. The tool construction process included using Erdkinder…

  10. Toward a Model Framework of Generalized Parallel Componential Processing of Multi-Symbol Numbers

    ERIC Educational Resources Information Center

    Huber, Stefan; Cornelsen, Sonja; Moeller, Korbinian; Nuerk, Hans-Christoph

    2015-01-01

    In this article, we propose and evaluate a new model framework of parallel componential multi-symbol number processing, generalizing the idea of parallel componential processing of multi-digit numbers to the case of negative numbers by considering the polarity signs similar to single digits. In a first step, we evaluated this account by defining…

  11. A data collection and processing procedure for evaluating a research program

    Treesearch

    Giuseppe Rensi; H. Dean Claxton

    1972-01-01

    A set of computer programs compiled for the information processing requirements of a model for evaluating research proposals is described. The programs serve to assemble and store information, periodically update it, and convert it to a form usable for decision-making. Guides for collecting and coding data are explained. The data-processing options available and...

  12. Evaluating the performance of free-formed surface parts using an analytic network process

    NASA Astrophysics Data System (ADS)

    Qian, Xueming; Ma, Yanqiao; Liang, Dezhi

    2018-03-01

    To successfully design parts with free-formed surfaces, a critical issue is how to evaluate and select a favourable evaluation strategy before design. The evaluation of free-formed surface parts is a multiple criteria decision-making (MCDM) problem that requires the consideration of a large number of interdependent factors. The analytic network process (ANP) is a relatively new MCDM method that can systematically deal with all kinds of dependences. In this paper, factors from the product life-cycle that influence the design of free-formed surface parts are proposed. After analysing the interdependence among these factors, a Hybrid ANP (HANP) structure for evaluating the part's curved surface is constructed. A HANP evaluation of an impeller is then presented to illustrate the application of the proposed method.

  13. Development and Implementation of a Telecommuting Evaluation Framework, and Modeling the Executive Telecommuting Adoption Process

    NASA Astrophysics Data System (ADS)

    Vora, V. P.; Mahmassani, H. S.

    2002-02-01

    This work proposes and implements a comprehensive evaluation framework to document the telecommuter, organizational, and societal impacts of telecommuting through telecommuting programs. Evaluation processes and materials within the outlined framework are also proposed and implemented. As the first component of the evaluation process, the executive survey is administered within a public sector agency. The survey data is examined through exploratory analysis and is compared to a previous survey of private sector executives. The ordinal probit, dynamic probit, and dynamic generalized ordinal probit (DGOP) models of telecommuting adoption are calibrated to identify factors which significantly influence executive adoption preferences and to test the robustness of such factors. The public sector DGOP model of executive willingness to support telecommuting under different program scenarios is compared with an equivalent private sector DGOP model. Through the telecommuting program, a case study of telecommuting travel impacts is performed to further substantiate research.

  14. Looking Across and Looking Beyond the Knowledge Frontier: Intellectual Distance, Novelty, and Resource Allocation in Science

    PubMed Central

    Boudreau, Kevin J.; Guinan, Eva C.; Lakhani, Karim R.; Riedl, Christoph

    2016-01-01

    Selecting among alternative projects is a core management task in all innovating organizations. In this paper, we focus on the evaluation of frontier scientific research projects. We argue that the “intellectual distance” between the knowledge embodied in research proposals and an evaluator’s own expertise systematically relates to the evaluations given. To estimate relationships, we designed and executed a grant proposal process at a leading research university in which we randomized the assignment of evaluators and proposals to generate 2,130 evaluator–proposal pairs. We find that evaluators systematically give lower scores to research proposals that are closer to their own areas of expertise and to those that are highly novel. The patterns are consistent with biases associated with boundedly rational evaluation of new ideas. The patterns are inconsistent with intellectual distance simply contributing “noise” or being associated with private interests of evaluators. We discuss implications for policy, managerial intervention, and allocation of resources in the ongoing accumulation of scientific knowledge. PMID:27746512

  15. Software component quality evaluation

    NASA Technical Reports Server (NTRS)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  16. Performance evaluation methodology for historical document image binarization.

    PubMed

    Ntirogiannis, Konstantinos; Gatos, Basilis; Pratikakis, Ioannis

    2013-02-01

    Document image binarization is of great importance in the document image analysis and recognition pipeline since it affects further stages of the recognition process. The evaluation of a binarization method aids in studying its algorithmic behavior, as well as verifying its effectiveness, by providing qualitative and quantitative indication of its performance. This paper addresses a pixel-based binarization evaluation methodology for historical handwritten/machine-printed document images. In the proposed evaluation scheme, the recall and precision evaluation measures are properly modified using a weighting scheme that diminishes any potential evaluation bias. Additional performance metrics of the proposed evaluation scheme consist of the percentage rates of broken and missed text, false alarms, background noise, character enlargement, and merging. Several experiments conducted in comparison with other pixel-based evaluation measures demonstrate the validity of the proposed evaluation scheme.
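
    As a rough illustration of pixel-based binarization scoring of the kind evaluated above, the following sketch computes recall, precision and the F-measure against a ground-truth mask, with an optional per-pixel weight map standing in for the paper's weighting scheme (which is not reproduced here). Python with NumPy is assumed; all names and the toy arrays are illustrative.

```python
# Minimal sketch of pixel-based binarization scoring: recall, precision and
# F-measure against a ground-truth mask. Per-pixel weights can be supplied via
# the optional `weights` argument; by default all pixels count equally.
import numpy as np

def binarization_scores(result, ground_truth, weights=None):
    """result, ground_truth: 2-D arrays with 1 = text pixel, 0 = background."""
    result = np.asarray(result, dtype=bool)
    gt = np.asarray(ground_truth, dtype=bool)
    w = np.ones(gt.shape) if weights is None else np.asarray(weights, dtype=float)

    tp = np.sum(w * (result & gt))          # correctly detected text pixels
    fp = np.sum(w * (result & ~gt))         # false alarms
    fn = np.sum(w * (~result & gt))         # missed text pixels

    recall = tp / (tp + fn) if tp + fn > 0 else 0.0
    precision = tp / (tp + fp) if tp + fp > 0 else 0.0
    f_measure = (2 * recall * precision / (recall + precision)
                 if recall + precision > 0 else 0.0)
    return recall, precision, f_measure

# Toy example: a 3x3 patch with one missed text pixel and one false alarm.
gt = np.array([[1, 1, 0], [0, 1, 0], [0, 0, 0]])
res = np.array([[1, 0, 0], [0, 1, 1], [0, 0, 0]])
print(binarization_scores(res, gt))
```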

  17. A primitive study of voxel feature generation by multiple stacked denoising autoencoders for detecting cerebral aneurysms on MRA

    NASA Astrophysics Data System (ADS)

    Nemoto, Mitsutaka; Hayashi, Naoto; Hanaoka, Shouhei; Nomura, Yukihiro; Miki, Soichiro; Yoshikawa, Takeharu; Ohtomo, Kuni

    2016-03-01

    The purpose of this study is to evaluate the feasibility of a novel feature generation method, based on multiple deep neural networks (DNNs) with boosting, for computer-assisted detection (CADe). Optimizing the hyperparameters of DNNs such as the stacked denoising autoencoder (SdA) is hard and time-consuming. The proposed method allows SdA-based features to be used without the burden of hyperparameter tuning. It was evaluated in an application for detecting cerebral aneurysms on magnetic resonance angiograms (MRA). The baseline CADe process included four components: scaling, candidate area limitation, candidate detection, and candidate classification. The proposed feature generation method was applied to extract optimal features for candidate classification and only required setting the range of the SdA hyperparameters. The optimal feature set was selected from a large pool of SdA-based features produced by multiple SdAs, each trained with a different hyperparameter set, and the feature selection was carried out with the AdaBoost ensemble learning method. Training of the baseline CADe process and of the proposed feature generation used 200 MRA cases, and the evaluation was performed on 100 MRA cases. The proposed method successfully provided SdA-based features given only the range of some SdA hyperparameters. The CADe process using both the previous voxel features and the SdA-based features had the best performance, with an area under the ROC curve of 0.838 and an ANODE score of 0.312. These results show that the proposed method is effective in the application for detecting cerebral aneurysms on MRA.

  18. 32 CFR 206.5 - Final proposal process.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... be evaluated in two basic categories: (1) Proposals that address study abroad infrastructure and (2... foreign cultural competency? In the case of study abroad programs, how will the success and impact of study abroad experiences be assessed. Proposals should not defer the consideration of these issues to a...

  19. 32 CFR 206.5 - Final proposal process.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... be evaluated in two basic categories: (1) Proposals that address study abroad infrastructure and (2... foreign cultural competency? In the case of study abroad programs, how will the success and impact of study abroad experiences be assessed. Proposals should not defer the consideration of these issues to a...

  20. 32 CFR 206.5 - Final proposal process.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... be evaluated in two basic categories: (1) Proposals that address study abroad infrastructure and (2... foreign cultural competency? In the case of study abroad programs, how will the success and impact of study abroad experiences be assessed. Proposals should not defer the consideration of these issues to a...

  1. 32 CFR 206.5 - Final proposal process.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... be evaluated in two basic categories: (1) Proposals that address study abroad infrastructure and (2... foreign cultural competency? In the case of study abroad programs, how will the success and impact of study abroad experiences be assessed. Proposals should not defer the consideration of these issues to a...

  2. 32 CFR 206.5 - Final proposal process.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... be evaluated in two basic categories: (1) Proposals that address study abroad infrastructure and (2... foreign cultural competency? In the case of study abroad programs, how will the success and impact of study abroad experiences be assessed. Proposals should not defer the consideration of these issues to a...

  3. A New Feedback-Based Method for Parameter Adaptation in Image Processing Routines.

    PubMed

    Khan, Arif Ul Maula; Mikut, Ralf; Reischl, Markus

    2016-01-01

    The parametrization of automatic image processing routines is time-consuming if a lot of image processing parameters are involved. An expert can tune parameters sequentially to get desired results. This may not be productive for applications with difficult image analysis tasks, e.g. when high noise and shading levels in an image are present or images vary in their characteristics due to different acquisition conditions. Parameters are required to be tuned simultaneously. We propose a framework to improve standard image segmentation methods by using feedback-based automatic parameter adaptation. Moreover, we compare algorithms by implementing them in a feedforward fashion and then adapting their parameters. This comparison is proposed to be evaluated by a benchmark data set that contains challenging image distortions in an increasing fashion. This promptly enables us to compare different standard image segmentation algorithms in a feedback vs. feedforward implementation by evaluating their segmentation quality and robustness. We also propose an efficient way of performing automatic image analysis when only abstract ground truth is present. Such a framework evaluates robustness of different image processing pipelines using a graded data set. This is useful for both end-users and experts.

  4. A New Feedback-Based Method for Parameter Adaptation in Image Processing Routines

    PubMed Central

    Mikut, Ralf; Reischl, Markus

    2016-01-01

    The parametrization of automatic image processing routines is time-consuming if a lot of image processing parameters are involved. An expert can tune parameters sequentially to get desired results. This may not be productive for applications with difficult image analysis tasks, e.g. when high noise and shading levels in an image are present or images vary in their characteristics due to different acquisition conditions. Parameters are required to be tuned simultaneously. We propose a framework to improve standard image segmentation methods by using feedback-based automatic parameter adaptation. Moreover, we compare algorithms by implementing them in a feedforward fashion and then adapting their parameters. This comparison is proposed to be evaluated by a benchmark data set that contains challenging image distortions in an increasing fashion. This promptly enables us to compare different standard image segmentation algorithms in a feedback vs. feedforward implementation by evaluating their segmentation quality and robustness. We also propose an efficient way of performing automatic image analysis when only abstract ground truth is present. Such a framework evaluates robustness of different image processing pipelines using a graded data set. This is useful for both end-users and experts. PMID:27764213

  5. 77 FR 1761 - Self-Regulatory Organizations; NYSE Arca, Inc.; Order Granting Approval of Proposed Rule Change...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-11

    ... quantitative research and evaluation process that forecasts economic excess sector returns (over/under the... proprietary SectorSAM quantitative research and evaluation process. \\8\\ The following convictions constitute... Allocation Methodology'' (``SectorSAM''), which is a proprietary quantitative analysis, to forecast each...

  6. Evaluation: The Process of Stimulating, Aiding, and Abetting Insightful Action.

    ERIC Educational Resources Information Center

    Guba, Egon G.; Stufflebeam, Daniel L.

    Part 1 of this monograph discusses the status of educational evaluation and describes several problems in carrying out such evaluation: (1) defining the educational setting, (2) defining decision types, (3) designing educational evaluation, (4) designing evaluation systems, and (5) defining criteria for judging evaluation. Part 2 proposes an…

  7. IT vendor selection model by using structural equation model & analytical hierarchy process

    NASA Astrophysics Data System (ADS)

    Maitra, Sarit; Dominic, P. D. D.

    2012-11-01

    Selecting and evaluating the right vendors is imperative for an organization's global marketplace competitiveness. Improper selection and evaluation of potential vendors can dwarf an organization's supply chain performance. Numerous studies have demonstrated that firms consider multiple criteria when selecting key vendors. This research intends to develop a new hybrid model for the vendor selection process with better decision making. The newly proposed model provides a suitable tool for assisting decision makers and managers in making the right decisions and selecting the most suitable vendor. This paper proposes a hybrid model based on the Structural Equation Model (SEM) and the Analytical Hierarchy Process (AHP) for long-term strategic vendor selection problems. The five-step framework of the model was designed after a thorough literature study. The proposed hybrid model will be applied to a real-life case study to assess its effectiveness. In addition, a what-if analysis technique will be used for model validation purposes.
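
    The AHP step of such hybrid models can be illustrated with a small sketch: derive criterion weights from a pairwise comparison matrix via the principal eigenvector and check the consistency ratio. This is a generic AHP computation assuming Python with NumPy, not the paper's SEM-AHP model; the 3x3 comparison matrix is invented.

```python
# Hedged sketch of the AHP step only: derive criterion weights from a pairwise
# comparison matrix via the principal eigenvector and check consistency.
# The SEM part of the proposed hybrid model is not shown; the matrix below is
# a made-up 3-criterion example (e.g. cost, quality, delivery).
import numpy as np

def ahp_weights(pairwise):
    """pairwise: reciprocal comparison matrix on the Saaty 1-9 scale."""
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                     # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                                 # normalised priority vector
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)            # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.9, 5: 1.12}.get(n, 1.12)
    cr = ci / ri if ri > 0 else 0.0                 # consistency ratio
    return w, cr

A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
weights, cr = ahp_weights(A)
print(weights, cr)   # a CR below ~0.1 is usually taken as acceptable
```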

  8. Providing a Science Base for the Evaluation of Tobacco Products

    PubMed Central

    Berman, Micah L.; Connolly, Greg; Cummings, K. Michael; Djordjevic, Mirjana V.; Hatsukami, Dorothy K.; Henningfield, Jack E.; Myers, Matthew; O'Connor, Richard J.; Parascandola, Mark; Rees, Vaughan; Rice, Jerry M.

    2015-01-01

    Objective Evidence-based tobacco regulation requires a comprehensive scientific framework to guide the evaluation of new tobacco products and health-related claims made by product manufacturers. Methods The Tobacco Product Assessment Consortium (TobPRAC) employed an iterative process involving consortia investigators, consultants, a workshop of independent scientists and public health experts, and written reviews in order to develop a conceptual framework for evaluating tobacco products. Results The consortium developed a four-phased framework for the scientific evaluation of tobacco products. The four phases addressed by the framework are: (1) pre-market evaluation, (2) pre-claims evaluation, (3) post-market activities, and (4) monitoring and re-evaluation. For each phase, the framework proposes the use of validated testing procedures that will evaluate potential harms at both the individual and population level. Conclusions While the validation of methods for evaluating tobacco products is an ongoing and necessary process, the proposed framework need not wait for fully validated methods to be used in guiding tobacco product regulation today. PMID:26665160

  9. EVALUATION OF ALTERNATIVE STRONTIUM AND TRANSURANIC SEPARATION PROCESSES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SMALLEY CS

    2011-04-25

    In order to meet contract requirements on the concentrations of strontium-90 and transuranic isotopes in the immobilized low-activity waste, strontium-90 and transuranics must be removed from the supernate of tanks 241-AN-102 and 241-AN-107. The process currently proposed for this application is an in-tank precipitation process using strontium nitrate and sodium permanganate. Development work on the process has not proceeded since 2005. The purpose of the evaluation is to identify whether any promising alternative processes have been developed since this issue was last examined, evaluate the alternatives and the baseline process, and recommend which process should be carried forward.

  10. A methodology to evaluate unplanned proposed transportation projects.

    DOT National Transportation Integrated Search

    2008-01-01

    The Virginia Department of Transportation may be asked to consider proposed transportation projects that have not originated within the transportation planning process. Examples include offers by the private sector to build infrastructure in exchange...

  11. 78 FR 51812 - Urbanized Area Formula Grants; Passenger Ferry Grant Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-21

    ... Sources of Match D. Proposal Submission Process E. Proposal Information F. Proposal Content G. Evaluation... as the market value of in-kind contributions integral to the project may be counted as a contribution... making a resubmission for any reason, include all original attachments regardless of which attachments...

  12. 78 FR 73085 - Mission Compatibility Evaluation Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-05

    ... daily operating hours or the number of days that equipment in the proposed structure would be in use... structure, operating characteristics, or the equipment in the proposed project. (2) Changing the location of... the DoD involve proposals for the construction of structures that may affect navigable air space...

  13. Evaluating Payments for Environmental Services: Methodological Challenges

    PubMed Central

    2016-01-01

    Over the last fifteen years, Payments for Environmental Services (PES) schemes have become very popular environmental policy instruments, but the academic literature has begun to question their additionality. The literature attempts to estimate the causal effect of these programs by applying impact evaluation (IE) techniques. However, PES programs are complex instruments and IE methods cannot be directly applied without adjustments. Based on a systematic review of the literature, this article proposes a framework for the methodological process of designing an IE for PES schemes. It revises and discusses the methodological choices at each step of the process and proposes guidelines for practitioners. PMID:26910850

  14. Theory Building through Praxis Discourse: A Theory- and Practice-Informed Model of Transformative Participatory Evaluation

    ERIC Educational Resources Information Center

    Harnar, Michael A.

    2012-01-01

    Stakeholder participation in evaluation, where the evaluator engages stakeholders in the process, is prevalent in evaluation practice and is an important focus of evaluation research. Cousins and Whitmore proposed a bifurcation of participatory evaluation into the two streams of transformative participatory and practical participatory evaluation…

  15. Crew Transportation Operations Standards

    NASA Technical Reports Server (NTRS)

    Mango, Edward J.; Pearson, Don J. (Compiler)

    2013-01-01

    The Crew Transportation Operations Standards contains descriptions of ground and flight operations processes and specifications and the criteria which will be used to evaluate the acceptability of Commercial Providers' proposed processes and specifications.

  16. Evaluation of animal models of neurobehavioral disorders

    PubMed Central

    van der Staay, F Josef; Arndt, Saskia S; Nordquist, Rebecca E

    2009-01-01

    Animal models play a central role in all areas of biomedical research. The process of animal model building, development and evaluation has rarely been addressed systematically, despite the long history of using animal models in the investigation of neuropsychiatric disorders and behavioral dysfunctions. An iterative, multi-stage trajectory for developing animal models and assessing their quality is proposed. The process starts with defining the purpose(s) of the model, preferentially based on hypotheses about brain-behavior relationships. Then, the model is developed and tested. The evaluation of the model takes scientific and ethical criteria into consideration. Model development requires a multidisciplinary approach. Preclinical and clinical experts should establish a set of scientific criteria, which a model must meet. The scientific evaluation consists of assessing the replicability/reliability, predictive, construct and external validity/generalizability, and relevance of the model. We emphasize the role of (systematic and extended) replications in the course of the validation process. One may apply a multiple-tiered 'replication battery' to estimate the reliability/replicability, validity, and generalizability of results. Compromised welfare is inherent in many deficiency models in animals. Unfortunately, 'animal welfare' is a vaguely defined concept, making it difficult to establish exact evaluation criteria. Weighing the animal's welfare and considerations as to whether action is indicated to reduce the discomfort must accompany the scientific evaluation at any stage of the model building and evaluation process. Animal model building should be discontinued if the model does not meet the preset scientific criteria, or when animal welfare is severely compromised. The application of the evaluation procedure is exemplified using the rat with neonatal hippocampal lesion as a proposed model of schizophrenia. In a manner congruent with that for improving animal models, guided by the procedure expounded upon in this paper, the developmental and evaluation procedure itself may be improved by careful definition of the purpose(s) of a model and by defining better evaluation criteria, based on the proposed use of the model. PMID:19243583

  17. New agreement measures based on survival processes

    PubMed Central

    Guo, Ying; Li, Ruosha; Peng, Limin; Manatunga, Amita K.

    2013-01-01

    Summary The need to assess agreement arises in many scenarios in biomedical sciences when measurements were taken by different methods on the same subjects. When the endpoints are survival outcomes, the study of agreement becomes more challenging given the special characteristics of time-to-event data. In this paper, we propose a new framework for assessing agreement based on survival processes that can be viewed as a natural representation of time-to-event outcomes. Our new agreement measure is formulated as the chance-corrected concordance between survival processes. It provides a new perspective for studying the relationship between correlated survival outcomes and offers an appealing interpretation as the agreement between survival times on the absolute distance scale. We provide a multivariate extension of the proposed agreement measure for multiple methods. Furthermore, the new framework enables a natural extension to evaluate time-dependent agreement structure. We develop nonparametric estimation of the proposed new agreement measures. Our estimators are shown to be strongly consistent and asymptotically normal. We evaluate the performance of the proposed estimators through simulation studies and then illustrate the methods using a prostate cancer data example. PMID:23844617

  18. NASA wide electronic publishing system: Electronic printing and duplicating. Stage 3 evaluation report

    NASA Technical Reports Server (NTRS)

    Tuey, Richard C.; Moore, Fred W.; Ryan, Christine A.

    1995-01-01

    The report is presented in four sections: The Introduction describes the duplicating configuration under evaluation and the Background contains a chronological description of the evaluation segmented by phases 1 and 2. This section includes the evaluation schedule, printing and duplicating requirements, storage and communication requirements, electronic publishing system configuration, existing processes and proposed processes, billing rates, costs and productivity analysis, and the return on investment based upon the data gathered to date. The third section contains the phase 1 comparative cost and productivity analysis. This analysis demonstrated that LaRC should proceed with a 90-day evaluation of the DocuTech and follow with a phase 2 cycle to actually demonstrate that the proposed system would meet the needs of LaRC's printing and duplicating requirements, benchmark results, cost comparisons, benchmark observations, and recommendations. These are documented after the recommendations.

  19. A Pitch Extraction Method with High Frequency Resolution for Singing Evaluation

    NASA Astrophysics Data System (ADS)

    Takeuchi, Hideyo; Hoguro, Masahiro; Umezaki, Taizo

    This paper proposes a pitch estimation method suitable for singing evaluation that can be incorporated into karaoke machines. Professional singers and musicians have sharp hearing for music and the singing voice: they can tell whether a singer's pitch is “a little off key” or “in tune”. A pitch estimation method with correspondingly high frequency resolution is therefore necessary in order to evaluate singing. This paper proposes such a method, utilizing the harmonic characteristics of the autocorrelation function. The proposed method can estimate the fundamental frequency in the range 50-1700 Hz with a resolution finer than 3.6 cents at a light processing load.
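
    A minimal sketch of the general idea, assuming Python with NumPy: a basic autocorrelation pitch estimator with parabolic interpolation of the peak to obtain sub-sample (higher-resolution) frequency estimates. The harmonic refinement described in the paper is not reproduced; the 50-1700 Hz search range matches the figures reported above.

```python
# Illustrative sketch only: a basic autocorrelation pitch estimator with
# parabolic interpolation of the peak for fractional-lag (higher-resolution)
# frequency estimates. The harmonic refinement of the paper is not reproduced;
# the search range roughly matches the reported 50-1700 Hz.
import numpy as np

def autocorr_pitch(frame, fs, fmin=50.0, fmax=1700.0):
    frame = frame - np.mean(frame)
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lag_min = int(fs / fmax)
    lag_max = min(int(fs / fmin), len(ac) - 2)
    lag = lag_min + np.argmax(ac[lag_min:lag_max])
    # parabolic interpolation around the peak for fractional-lag resolution
    y0, y1, y2 = ac[lag - 1], ac[lag], ac[lag + 1]
    denom = y0 - 2 * y1 + y2
    delta = 0.5 * (y0 - y2) / denom if denom != 0 else 0.0
    return fs / (lag + delta)

# Toy check with a 440 Hz sine sampled at 16 kHz.
fs = 16000
t = np.arange(0, 0.05, 1 / fs)
print(autocorr_pitch(np.sin(2 * np.pi * 440 * t), fs))
```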

  20. Quantitative evaluation method of the bubble structure of sponge cake by using morphology image processing

    NASA Astrophysics Data System (ADS)

    Tatebe, Hironobu; Kato, Kunihito; Yamamoto, Kazuhiko; Katsuta, Yukio; Nonaka, Masahiko

    2005-12-01

    Nowadays, many evaluation methods based on image processing are proposed for the food industry. These methods are becoming new evaluation tools alongside the sensory tests and solid-state measurements used for quality evaluation. An advantage of image processing is that it allows objective evaluation. The goal of our research is the structural evaluation of sponge cake using image processing. In this paper, we propose a feature extraction method for the bubble structure of sponge cake. Analysis of the bubble structure is one of the important steps in understanding the characteristics of the cake from an image. To acquire the cake images, we first cut the cakes and scanned their surfaces with a CIS scanner. Because the depth of field of this type of scanner is very shallow, the bubble regions of the surface have low gray-scale values and appear blurred. We extracted bubble regions from the surface images based on these features: the input image is binarized, and the bubble features are extracted by morphological analysis. To evaluate the feature extraction results, we computed the correlation with the "size of the bubble" ratings from the sensory test. The results show that bubble extraction by morphological analysis gives good correlation, indicating that our method performs as well as the subjective evaluation.
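
    The kind of pipeline described, thresholding followed by morphology and bubble measurement, can be sketched as follows, assuming Python with NumPy and SciPy. The threshold, structuring-element size and toy image are illustrative assumptions, not values from the paper.

```python
# Rough sketch of the described pipeline: threshold the scanned cake surface
# (bubbles appear dark/blurred), clean the mask with morphological opening,
# then label connected components and measure bubble sizes.
import numpy as np
from scipy import ndimage

def bubble_features(gray, threshold=100, min_size=5):
    """gray: 2-D uint8 image of the cut cake surface."""
    mask = gray < threshold                              # dark regions = bubbles
    mask = ndimage.binary_opening(mask, structure=np.ones((3, 3)))
    labels, n = ndimage.label(mask)
    sizes = np.asarray(ndimage.sum(mask, labels, index=np.arange(1, n + 1)))
    sizes = sizes[sizes >= min_size]                     # drop tiny specks
    return {"count": int(len(sizes)),
            "mean_area": float(sizes.mean()) if len(sizes) else 0.0,
            "area_fraction": float(mask.sum()) / mask.size}

# Toy image: uniform bright surface with two dark "bubbles".
img = np.full((64, 64), 200, dtype=np.uint8)
img[10:20, 10:20] = 50
img[40:48, 30:42] = 60
print(bubble_features(img))
```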

  1. Walsh-Hadamard transform kernel-based feature vector for shot boundary detection.

    PubMed

    Lakshmi, Priya G G; Domnic, S

    2014-12-01

    Video shot boundary detection (SBD) is the first step of video analysis, summarization, indexing, and retrieval. In the SBD process, videos are segmented into basic units called shots. In this paper, a new SBD method is proposed using color, edge, texture, and motion strength as a vector of features (feature vector). Features are extracted by projecting the frames onto selected basis vectors of the Walsh-Hadamard transform (WHT) kernel and the WHT matrix. After the features are extracted, weights are calculated based on their significance. The weighted features are combined to form a single continuity signal, which is used as input to a Procedure Based shot transition Identification process (PBI). Using this procedure, shot transitions are classified into abrupt and gradual transitions. Experimental results are examined using the large-scale test sets provided by TRECVID 2007, which evaluated hard-cut and gradual transition detection. A system evaluation is performed to assess the robustness of the proposed method, which yields an F1-score of 97.4% for cuts, 78% for gradual transitions, and 96.1% for transitions overall. We have also evaluated the proposed feature vector with a support vector machine classifier. The results show that WHT-based features perform better than other existing methods. In addition, a few more video sequences are taken from the Open Video Project and the performance of the proposed method is compared with a recent existing SBD method.
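
    Purely as an illustration, the sketch below projects grayscale frames onto an orthonormal Walsh-Hadamard kernel and turns frame-to-frame feature distances into a continuity signal; a cut shows up as a spike. Python with NumPy and SciPy is assumed; the frame size, number of kept coefficients and distance measure are assumptions, and the paper's colour/edge/texture/motion fusion and PBI classification are not reproduced.

```python
# Simplified illustration of projecting frames onto Walsh-Hadamard basis
# vectors and turning frame-to-frame feature distances into a continuity
# signal. The full fusion and classification steps of the paper are omitted.
import numpy as np
from scipy.linalg import hadamard

def wht_features(frame, n=64, keep=16):
    """frame: 2-D grayscale array; crudely fitted to n x n by cropping/padding."""
    f = np.zeros((n, n))
    h, w = frame.shape
    f[:min(h, n), :min(w, n)] = frame[:n, :n]
    H = hadamard(n) / np.sqrt(n)          # orthonormal WHT kernel
    coeffs = H @ f @ H.T                  # 2-D transform
    return coeffs.flatten()[:keep]        # keep a few low-order coefficients

def continuity_signal(frames):
    feats = [wht_features(fr) for fr in frames]
    return np.array([np.linalg.norm(feats[i + 1] - feats[i])
                     for i in range(len(feats) - 1)])

# Toy sequence: constant frames with an abrupt change (a "cut") in the middle.
rng = np.random.default_rng(0)
a, b = rng.uniform(size=(64, 64)), rng.uniform(size=(64, 64))
frames = [a] * 5 + [b] * 5
print(continuity_signal(frames))          # spike at the transition index
```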

  2. Multiple attribute decision making model and application to food safety risk evaluation.

    PubMed

    Ma, Lihua; Chen, Hong; Yan, Huizhe; Yang, Lifeng; Wu, Lifeng

    2017-01-01

    Supermarket food purchase decisions are characterized by network relationships. This paper analyzes factors that influence supermarket food selection and proposes a supplier evaluation index system based on the whole process of food production. The authors established an intuitive interval-valued fuzzy set evaluation model based on the characteristics of the network relationships among decision makers, and validated it with a multiple attribute decision making case study. The proposed model thus provides a reliable, accurate method for multiple attribute decision making.

  3. New Developments in the Technology Readiness Assessment Process in US DOE-EM - 13247

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krahn, Steven; Sutter, Herbert; Johnson, Hoyt

    2013-07-01

    A Technology Readiness Assessment (TRA) is a systematic, metric-based process and accompanying report that evaluates the maturity of the technologies used in systems; it is designed to measure technology maturity using the Technology Readiness Level (TRL) scale pioneered by the National Aeronautics and Space Administration (NASA) in the 1980s. More recently, DoD has adopted and provided systematic guidance for performing TRAs and determining TRLs. In 2007 the GAO recommended that the DOE adopt the NASA/DoD methodology for evaluating technology maturity. Earlier, in 2006-2007, DOE-EM had conducted pilot TRAs on a number of projects at Hanford and Savannah River. In March 2008, DOE-EM issued a process guide, which established TRAs as an integral part of DOE-EM's Project Management Critical Decision Process. Since the development of its detailed TRA guidance in 2008, DOE-EM has continued to accumulate experience in the conduct of TRAs and the process for evaluating technology maturity. DOE has developed guidance on TRAs applicable department-wide. DOE-EM's experience with the TRA process, the evaluations that led to recently proposed revisions to the DOE-EM TRA/TMP Guide, and the content of the proposed changes that incorporate the lessons learned and insights above are described. (authors)

  4. Information processing capacity in psychopathy: Effects of anomalous attention.

    PubMed

    Hamilton, Rachel K B; Newman, Joseph P

    2018-03-01

    Hamilton and colleagues (2015) recently proposed that an integrative deficit in psychopathy restricts simultaneous processing, thereby leaving fewer resources available for information encoding, narrowing the scope of attention, and undermining associative processing. The current study evaluated this parallel processing deficit proposal using the Simultaneous-Sequential paradigm. This investigation marks the first a priori test of Hamilton et al.'s theoretical framework. We predicted that psychopathy would be associated with inferior performance (as indexed by lower accuracy and longer response time) on trials requiring simultaneous processing of visual information relative to trials necessitating sequential processing. Results were consistent with these predictions, supporting the proposal that psychopathy is characterized by a reduced capacity to process multicomponent perceptual information concurrently. We discuss the potential implications of impaired simultaneous processing for the conceptualization of the psychopathic deficit. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  5. Formative Evaluation of ETV Programmes.

    ERIC Educational Resources Information Center

    Duby, Aliza

    A process of formative evaluation which considers moral, scientific, and social values as its criteria and which is conducted by project staff is proposed for the evaluation of educational television (ETV) programs produced by the South African Broadcasting Corporation. The theoretical framework of ETV evaluation is outlined in the first section,…

  6. Standards for Title VII Evaluations: Accommodation for Reality Constraints.

    ERIC Educational Resources Information Center

    Yap, Kim Onn

    Two separate sets of minimum standards designed to guide the evaluation of bilingual projects are proposed. The first set relates to the process in which the evaluation activities are conducted. They include: validity of assessment procedures, validity and reliability of evaluation instruments, representativeness of findings, use of procedures for…

  7. 7 CFR 3430.608 - Review criteria.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...) Evaluation criteria. NIFA shall evaluate project proposals according to the following factors: (1) Relevancy.... (5) The adequacy of plans for the participatory evaluation process, outcome-based reporting, and the communication of findings and results beyond the immediate target audience. (6) Other appropriate factors, as...

  8. 7 CFR 3430.608 - Review criteria.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...) Evaluation criteria. NIFA shall evaluate project proposals according to the following factors: (1) Relevancy.... (5) The adequacy of plans for the participatory evaluation process, outcome-based reporting, and the communication of findings and results beyond the immediate target audience. (6) Other appropriate factors, as...

  9. 7 CFR 3430.608 - Review criteria.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...) Evaluation criteria. NIFA shall evaluate project proposals according to the following factors: (1) Relevancy.... (5) The adequacy of plans for the participatory evaluation process, outcome-based reporting, and the communication of findings and results beyond the immediate target audience. (6) Other appropriate factors, as...

  10. Experience Based Career Education at Wichita East High School: A Third-Party Evaluation for Year Two, 1977-78.

    ERIC Educational Resources Information Center

    Crawford, George; Miskel, Cecil

    A third-party evaluation was conducted to assess the second year's operation of the Experience Based Career Education (EBCE) program at Wichita (Kansas) High School East. The program proposal contained fourteen process objectives and twelve outcome objectives. The status of the process objective achievement was determined by interviewing program…

  11. Peer review of health research funding proposals: A systematic map and systematic review of innovations for effectiveness and efficiency.

    PubMed

    Shepherd, Jonathan; Frampton, Geoff K; Pickett, Karen; Wyatt, Jeremy C

    2018-01-01

    To investigate methods and processes for timely, efficient and good quality peer review of research funding proposals in health. A two-stage evidence synthesis: (1) a systematic map to describe the key characteristics of the evidence base, followed by (2) a systematic review of the studies stakeholders prioritised as relevant from the map on the effectiveness and efficiency of peer review 'innovations'. Standard processes included literature searching, duplicate inclusion criteria screening, study keyword coding, data extraction, critical appraisal and study synthesis. A total of 83 studies from 15 countries were included in the systematic map. The evidence base is diverse, investigating many aspects of the systems for, and processes of, peer review. The systematic review included eight studies from Australia, Canada, and the USA, evaluating a broad range of peer review innovations. These studies showed that simplifying the process by shortening proposal forms, using smaller reviewer panels, or expediting processes can speed up the review process and reduce costs, but this might come at the expense of peer review quality, a key aspect that has not been assessed. Virtual peer review using videoconferencing or teleconferencing appears promising for reducing costs by avoiding the need for reviewers to travel, but again any consequences for quality have not been adequately assessed. There is increasing international research activity into the peer review of health research funding. The studies reviewed had methodological limitations and variable generalisability to research funders. Given these limitations it is not currently possible to recommend immediate implementation of these innovations. However, many appear promising based on existing evidence, and could be adapted as necessary by funders and evaluated. Where feasible, experimental evaluation, including randomised controlled trials, should be conducted, evaluating impact on effectiveness, efficiency and quality.

  12. Design and Evaluation of a Scalable and Reconfigurable Multi-Platform System for Acoustic Imaging

    PubMed Central

    Izquierdo, Alberto; Villacorta, Juan José; del Val Puente, Lara; Suárez, Luis

    2016-01-01

    This paper proposes a scalable and multi-platform framework for signal acquisition and processing, which allows for the generation of acoustic images using planar arrays of MEMS (Micro-Electro-Mechanical Systems) microphones with low development and deployment costs. Acoustic characterization of MEMS sensors was performed, and the beam pattern of a module, based on an 8 × 8 planar array and of several clusters of modules, was obtained. A flexible framework, formed by an FPGA, an embedded processor, a computer desktop, and a graphic processing unit, was defined. The processing times of the algorithms used to obtain the acoustic images, including signal processing and wideband beamforming via FFT, were evaluated in each subsystem of the framework. Based on this analysis, three frameworks are proposed, defined by the specific subsystems used and the algorithms shared. Finally, a set of acoustic images obtained from sound reflected from a person are presented as a case study in the field of biometric identification. These results reveal the feasibility of the proposed system. PMID:27727174

  13. Research on the Environmental Performance Evaluation of Electronic Waste Reverse Logistics Enterprise

    NASA Astrophysics Data System (ADS)

    Yang, Yu-Xiang; Chen, Fei-Yang; Tong, Tong

    Based on the characteristics of e-waste reverse logistics, an environmental performance evaluation system for electronic waste reverse logistics enterprises is proposed. We use a fuzzy analytic hierarchy process method to evaluate the system. In addition, this paper analyzes enterprise X as an example to illustrate the evaluation method. The analysis points out the attributes and indexes that should be strengthened in the e-waste reverse logistics process and provides guidance to domestic e-waste reverse logistics enterprises.

  14. Development of a Rating Form to Evaluate Grant Applications to the Hogg Foundation for Mental Health

    ERIC Educational Resources Information Center

    Whaley, Arthur L.; Rodriguez, Reymundo; Alexander, Laurel A.

    2006-01-01

    Reliance on subjective grant proposal review methods leads private philanthropies to underfund mental health programs, even when foundations have mental health focuses. This article describes a private mental health foundation's efforts to increase the objectivity of its proposal review process by developing a reliable, valid proposal rating form.…

  15. Green material selection for sustainability: A hybrid MCDM approach.

    PubMed

    Zhang, Honghao; Peng, Yong; Tian, Guangdong; Wang, Danqi; Xie, Pengpeng

    2017-01-01

    Green material selection is a crucial step for the material industry to comprehensively improve material properties and promote sustainable development. However, because of the subjectivity and conflicting evaluation criteria in its process, green material selection, as a multi-criteria decision making (MCDM) problem, has been a widespread concern to the relevant experts. Thus, this study proposes a hybrid MCDM approach that combines decision making and evaluation laboratory (DEMATEL), analytical network process (ANP), grey relational analysis (GRA) and technique for order performance by similarity to ideal solution (TOPSIS) to select the optimal green material for sustainability based on the product's needs. A nonlinear programming model with constraints was proposed to obtain the integrated closeness index. Subsequently, an empirical application of rubbish bins was used to illustrate the proposed method. In addition, a sensitivity analysis and a comparison with existing methods were employed to validate the accuracy and stability of the obtained final results. We found that this method provides a more accurate and effective decision support tool for alternative evaluation or strategy selection.
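
    Of the four techniques combined in the hybrid approach, TOPSIS is the easiest to sketch. The example below, assuming Python with NumPy, ranks three hypothetical materials on three criteria by closeness to the ideal solution; the numbers, weights and criterion types are invented, and the DEMATEL/ANP/GRA stages are not shown.

```python
# Sketch of the TOPSIS step only (the DEMATEL/ANP weighting and GRA parts of
# the hybrid approach are not reproduced). Decision matrix rows = candidate
# materials, columns = criteria; `benefit` marks larger-is-better criteria.
import numpy as np

def topsis(matrix, weights, benefit):
    X = np.asarray(matrix, dtype=float)
    w = np.asarray(weights, dtype=float)
    norm = X / np.sqrt((X ** 2).sum(axis=0))          # vector normalisation
    V = norm * w                                      # weighted normalised matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.sqrt(((V - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((V - anti) ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)                    # closeness coefficient

matrix = [[0.7, 120, 3.2],     # material A: recyclability, cost, toxicity
          [0.9, 150, 2.1],     # material B
          [0.6,  90, 4.0]]     # material C
weights = [0.5, 0.3, 0.2]
benefit = np.array([True, False, False])              # only recyclability is benefit-type
print(topsis(matrix, weights, benefit))               # higher = preferred
```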

  16. Green material selection for sustainability: A hybrid MCDM approach

    PubMed Central

    Zhang, Honghao; Peng, Yong; Tian, Guangdong; Wang, Danqi; Xie, Pengpeng

    2017-01-01

    Green material selection is a crucial step for the material industry to comprehensively improve material properties and promote sustainable development. However, because of the subjectivity and conflicting evaluation criteria in its process, green material selection, as a multi-criteria decision making (MCDM) problem, has been a widespread concern to the relevant experts. Thus, this study proposes a hybrid MCDM approach that combines decision making and evaluation laboratory (DEMATEL), analytical network process (ANP), grey relational analysis (GRA) and technique for order performance by similarity to ideal solution (TOPSIS) to select the optimal green material for sustainability based on the product's needs. A nonlinear programming model with constraints was proposed to obtain the integrated closeness index. Subsequently, an empirical application of rubbish bins was used to illustrate the proposed method. In addition, a sensitivity analysis and a comparison with existing methods were employed to validate the accuracy and stability of the obtained final results. We found that this method provides a more accurate and effective decision support tool for alternative evaluation or strategy selection. PMID:28498864

  17. Using a fuzzy comprehensive evaluation method to determine product usability: A proposed theoretical framework.

    PubMed

    Zhou, Ronggang; Chan, Alan H S

    2017-01-01

    In order to compare existing usability data to ideal goals or to data for other products, usability practitioners have tried to develop a framework for deriving an integrated metric. However, most current usability methods with this aim rely heavily on human judgment about the various attributes of a product, but often fail to take into account the inherent uncertainties in these judgments during the evaluation process. This paper presents a universal method of usability evaluation that combines the analytic hierarchy process (AHP) and the fuzzy evaluation method. By integrating multiple sources of uncertain information during product usability evaluation, the proposed method aims to derive an index that is structured hierarchically in terms of the three usability components of a product: effectiveness, efficiency, and user satisfaction. With consideration of the theoretical basis of fuzzy evaluation, a two-layer comprehensive evaluation index was first constructed. After the membership functions were determined by an expert panel, the evaluation appraisals were computed using the fuzzy comprehensive evaluation technique to characterize fuzzy human judgments. Then, with the use of AHP, the weights of the usability components were elicited from these experts. Compared to traditional usability evaluation methods, the major strength of the fuzzy method is that it captures the fuzziness and uncertainty in human judgments and provides an integrated framework that combines the vague judgments from multiple stages of a product evaluation process.
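
    A minimal sketch of the weighted-average form of fuzzy comprehensive evaluation, assuming Python with NumPy: AHP-style component weights are combined with a membership matrix over appraisal grades and then defuzzified into a single usability index. The weights, membership degrees and grade scores below are invented for illustration, not taken from the paper.

```python
# Minimal sketch of a fuzzy comprehensive evaluation step: component weights
# (e.g. elicited via AHP) are combined with a membership matrix
# (rows = usability components, columns = appraisal grades) using the
# weighted-average operator, then defuzzified with grade scores.
import numpy as np

# appraisal grades: poor, fair, good, excellent (illustrative values)
membership = np.array([
    [0.05, 0.25, 0.50, 0.20],   # effectiveness
    [0.10, 0.30, 0.40, 0.20],   # efficiency
    [0.00, 0.20, 0.45, 0.35],   # satisfaction
])
weights = np.array([0.40, 0.25, 0.35])   # component weights, e.g. from AHP

evaluation = weights @ membership        # fuzzy comprehensive evaluation vector
evaluation /= evaluation.sum()           # normalise
grade_scores = np.array([25, 50, 75, 100])
print(evaluation)                        # membership in each appraisal grade
print(float(evaluation @ grade_scores))  # defuzzified usability index
```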

  18. Status report: Data management program algorithm evaluation activity at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Jayroe, R. R., Jr.

    1977-01-01

    An algorithm evaluation activity was initiated to study the problems associated with image processing by assessing the independent and interdependent effects of registration, compression, and classification techniques on LANDSAT data for several discipline applications. The objective of the activity was to make recommendations on selected applicable image processing algorithms in terms of accuracy, cost, and timeliness or to propose alternative ways of processing the data. As a means of accomplishing this objective, an Image Coding Panel was established. The conduct of the algorithm evaluation is described.

  19. Application of system thinking concepts in health system strengthening in low-income settings: a proposed conceptual framework for the evaluation of a complex health system intervention: the case of the BHOMA intervention in Zambia.

    PubMed

    Mutale, Wilbroad; Balabanova, Dina; Chintu, Namwinga; Mwanamwenge, Margaret Tembo; Ayles, Helen

    2016-02-01

    The current drive to strengthen health systems provides an opportunity to develop new strategies that will enable countries to achieve targets for millennium development goals. In this paper, we present a proposed framework for evaluating a new health system strengthening intervention in Zambia known as Better Health Outcomes through Mentoring and Assessment. We briefly describe the intervention design and focus on the proposed evaluation approach through the lens of systems thinking. In this paper, we present a proposed framework to evaluate a complex health system intervention applying systems thinking concepts. We hope that lessons learnt from this process will help to adapt the intervention and limit unintended negative consequences while promoting positive effects. Emphasis will be paid to interaction and interdependence between health system building blocks, context and the community. © 2014 The Authors. Journal of Evaluation in Clinical Practice published by John Wiley & Sons, Ltd.

  20. A Decentralized Compositional Framework for Dependable Decision Process in Self-Managed Cyber Physical Systems

    PubMed Central

    Hou, Kun-Mean; Zhang, Zhan

    2017-01-01

    Cyber Physical Systems (CPSs) need to interact with the changeable environment under various interferences. To provide continuous and high quality services, a self-managed CPS should automatically reconstruct itself to adapt to these changes and recover from failures. Such dynamic adaptation behavior introduces systemic challenges for CPS design, advice evaluation and decision process arrangement. In this paper, a formal compositional framework is proposed to systematically improve the dependability of the decision process. To guarantee the consistent observation of event orders for causal reasoning, this work first proposes a relative time-based method to improve the composability and compositionality of the timing property of events. Based on the relative time solution, a formal reference framework is introduced for self-managed CPSs, which includes a compositional FSM-based actor model (subsystems of CPS), actor-based advice and runtime decomposable decisions. To simplify self-management, a self-similar recursive actor interface is proposed for decision (actor) composition. We provide constraints and seven patterns for the composition of reliability and process time requirements. Further, two decentralized decision process strategies are proposed based on our framework, and we compare the reliability with the static strategy and the centralized processing strategy. The simulation results show that the one-order feedback strategy has high reliability, scalability and stability against the complexity of decision and random failure. This paper also shows a way to simplify the evaluation for dynamic system by improving the composability and compositionality of the subsystem. PMID:29120357

  1. A Decentralized Compositional Framework for Dependable Decision Process in Self-Managed Cyber Physical Systems.

    PubMed

    Zhou, Peng; Zuo, Decheng; Hou, Kun-Mean; Zhang, Zhan

    2017-11-09

    Cyber Physical Systems (CPSs) need to interact with the changeable environment under various interferences. To provide continuous and high quality services, a self-managed CPS should automatically reconstruct itself to adapt to these changes and recover from failures. Such dynamic adaptation behavior introduces systemic challenges for CPS design, advice evaluation and decision process arrangement. In this paper, a formal compositional framework is proposed to systematically improve the dependability of the decision process. To guarantee the consistent observation of event orders for causal reasoning, this work first proposes a relative time-based method to improve the composability and compositionality of the timing property of events. Based on the relative time solution, a formal reference framework is introduced for self-managed CPSs, which includes a compositional FSM-based actor model (subsystems of CPS), actor-based advice and runtime decomposable decisions. To simplify self-management, a self-similar recursive actor interface is proposed for decision (actor) composition. We provide constraints and seven patterns for the composition of reliability and process time requirements. Further, two decentralized decision process strategies are proposed based on our framework, and we compare the reliability with the static strategy and the centralized processing strategy. The simulation results show that the one-order feedback strategy has high reliability, scalability and stability against the complexity of decision and random failure. This paper also shows a way to simplify the evaluation for dynamic system by improving the composability and compositionality of the subsystem.

  2. A General Accelerated Degradation Model Based on the Wiener Process.

    PubMed

    Liu, Le; Li, Xiaoyang; Sun, Fuqiang; Wang, Ning

    2016-12-06

    Accelerated degradation testing (ADT) is an efficient tool to conduct material service reliability and safety evaluations by analyzing performance degradation data. Traditional stochastic process models are mainly for linear or linearization degradation paths. However, those methods are not applicable for the situations where the degradation processes cannot be linearized. Hence, in this paper, a general ADT model based on the Wiener process is proposed to solve the problem for accelerated degradation data analysis. The general model can consider the unit-to-unit variation and temporal variation of the degradation process, and is suitable for both linear and nonlinear ADT analyses with single or multiple acceleration variables. The statistical inference is given to estimate the unknown parameters in both constant stress and step stress ADT. The simulation example and two real applications demonstrate that the proposed method can yield reliable lifetime evaluation results compared with the existing linear and time-scale transformation Wiener processes in both linear and nonlinear ADT analyses.
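
    As a toy illustration of the linear special case, the sketch below simulates Wiener degradation increments of the form Y(t) = mu*t + sigma*B(t), recovers mu and sigma from the increments, and derives a mean first-passage (failure) time for a threshold. Python with NumPy is assumed; the acceleration-variable link functions and nonlinear time scales of the general model are not reproduced.

```python
# Toy sketch of a linear-drift Wiener degradation model: simulate increments,
# then recover the drift mu and diffusion sigma by maximum likelihood.
import numpy as np

rng = np.random.default_rng(1)
mu_true, sigma_true = 0.8, 0.3
dt, n_steps, n_units = 0.1, 200, 10

# degradation increments for several units: N(mu*dt, sigma^2*dt)
increments = mu_true * dt + sigma_true * np.sqrt(dt) * rng.standard_normal((n_units, n_steps))
paths = increments.cumsum(axis=1)      # cumulative degradation paths (not used further)

# maximum-likelihood estimates from the pooled increments
mu_hat = increments.mean() / dt
sigma2_hat = ((increments - mu_hat * dt) ** 2).mean() / dt
print(mu_hat, np.sqrt(sigma2_hat))     # close to 0.8 and 0.3

# mean first-passage time to a failure threshold D (inverse-Gaussian mean D/mu)
D = 50.0
print(D / mu_hat)
```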

  3. A General Accelerated Degradation Model Based on the Wiener Process

    PubMed Central

    Liu, Le; Li, Xiaoyang; Sun, Fuqiang; Wang, Ning

    2016-01-01

    Accelerated degradation testing (ADT) is an efficient tool to conduct material service reliability and safety evaluations by analyzing performance degradation data. Traditional stochastic process models are mainly for linear or linearization degradation paths. However, those methods are not applicable for the situations where the degradation processes cannot be linearized. Hence, in this paper, a general ADT model based on the Wiener process is proposed to solve the problem for accelerated degradation data analysis. The general model can consider the unit-to-unit variation and temporal variation of the degradation process, and is suitable for both linear and nonlinear ADT analyses with single or multiple acceleration variables. The statistical inference is given to estimate the unknown parameters in both constant stress and step stress ADT. The simulation example and two real applications demonstrate that the proposed method can yield reliable lifetime evaluation results compared with the existing linear and time-scale transformation Wiener processes in both linear and nonlinear ADT analyses. PMID:28774107

  4. On the evaluation of segmentation editing tools

    PubMed Central

    Heckel, Frank; Moltz, Jan H.; Meine, Hans; Geisler, Benjamin; Kießling, Andreas; D’Anastasi, Melvin; dos Santos, Daniel Pinto; Theruvath, Ashok Joseph; Hahn, Horst K.

    2014-01-01

    Efficient segmentation editing tools are important components in the segmentation process, as no automatic methods exist that always generate sufficient results. Evaluating segmentation editing algorithms is challenging, because their quality depends on the user’s subjective impression. So far, no established methods for an objective, comprehensive evaluation of such tools exist and, particularly, intermediate segmentation results are not taken into account. We discuss the evaluation of editing algorithms in the context of tumor segmentation in computed tomography. We propose a rating scheme to qualitatively measure the accuracy and efficiency of editing tools in user studies. In order to objectively summarize the overall quality, we propose two scores based on the subjective rating and the quantified segmentation quality over time. Finally, a simulation-based evaluation approach is discussed, which allows a more reproducible evaluation without the need for human input. This automated evaluation complements user studies, allowing a more convincing evaluation, particularly during development, where frequent user studies are not possible. The proposed methods have been used to evaluate two dedicated editing algorithms on 131 representative tumor segmentations. We show how the comparison of editing algorithms benefits from the proposed methods. Our results also show the correlation of the suggested quality score with the qualitative ratings. PMID:26158063

  5. Usability Evaluation of a Web-Based Learning System

    ERIC Educational Resources Information Center

    Nguyen, Thao

    2012-01-01

    The paper proposes a contingent, learner-centred usability evaluation method and a prototype tool of such systems. This is a new usability evaluation method for web-based learning systems using a set of empirically-supported usability factors and can be done effectively with limited resources. During the evaluation process, the method allows for…

  6. Physical Education Resources, Class Management, and Student Physical Activity Levels: A Structure-Process-Outcome Approach to Evaluating Physical Education Effectiveness

    ERIC Educational Resources Information Center

    Bevans, Katherine B.; Fitzpatrick, Leslie-Anne; Sanchez, Betty M.; Riley, Anne W.; Forrest, Christopher

    2010-01-01

    Background: This study was conducted to empirically evaluate specific human, curricular, and material resources that maximize student opportunities for physical activity during physical education (PE) class time. A structure-process-outcome model was proposed to identify the resources that influence the frequency of PE and intensity of physical…

  7. Participation versus Privacy in the Training of Group Counselors.

    ERIC Educational Resources Information Center

    Pierce, Keith A.; Baldwin, Cynthia

    1990-01-01

    Examines the process of requiring and evaluating personal growth group participation for students in counselor education programs. Discusses the key components in the dilemma of protecting privacy while evaluating competencies, including ethical practices and program alternatives to avoid evaluation. Proposes a model that will enable participation…

  8. Research on assessment and improvement method of remote sensing image reconstruction

    NASA Astrophysics Data System (ADS)

    Sun, Li; Hua, Nian; Yu, Yanbo; Zhao, Zhanping

    2018-01-01

    Remote sensing image quality assessment and improvement is an important part of image processing. Generally, the use of compressive sampling theory in remote sensing imaging systems can compress images while sampling, which improves efficiency. In this paper, a two-dimensional principal component analysis (2DPCA) method is proposed to reconstruct the remote sensing image and improve the quality of the compressed image; the reconstruction retains the useful information of the image and suppresses noise. Then, the factors that influence remote sensing image quality are analyzed, and parameters for quantitative evaluation are introduced. On this basis, the quality of the reconstructed images is evaluated and the influence of the different factors on the reconstruction is analyzed, providing meaningful reference data for enhancing the quality of remote sensing images. The experimental results show that the evaluation results fit human visual perception, and the proposed method has good application value in the field of remote sensing image processing.
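
    A hedged sketch of a 2DPCA-style reconstruction on a single image, assuming Python with NumPy: image rows are treated as samples, the column covariance matrix is built, the leading eigenvectors are kept, and the image is reconstructed by projecting back. The number of retained components is an arbitrary assumption, and the compressive-sampling front end of the paper is not modelled.

```python
# Hedged sketch of 2DPCA-style reconstruction: build the column covariance
# matrix from the image rows, keep the leading eigenvectors, and reconstruct.
import numpy as np

def pca2d_reconstruct(img, n_components=8):
    X = np.asarray(img, dtype=float)
    mean_row = X.mean(axis=0)
    Xc = X - mean_row
    G = Xc.T @ Xc / X.shape[0]                   # column (image) covariance
    eigvals, eigvecs = np.linalg.eigh(G)         # eigenvalues in ascending order
    W = eigvecs[:, -n_components:]               # leading eigenvectors
    Y = Xc @ W                                   # projected features
    return Y @ W.T + mean_row                    # reconstruction

# Toy example: a smooth gradient image corrupted by noise.
rng = np.random.default_rng(0)
clean = np.outer(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
noisy = clean + 0.1 * rng.standard_normal((64, 64))
rec = pca2d_reconstruct(noisy)
print(np.mean((noisy - clean) ** 2), np.mean((rec - clean) ** 2))  # MSE drops
```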

  9. Artificial intelligence and signal processing for infrastructure assessment

    NASA Astrophysics Data System (ADS)

    Assaleh, Khaled; Shanableh, Tamer; Yehia, Sherif

    2015-04-01

    Ground Penetrating Radar (GPR) is recognized as an effective nondestructive evaluation technique for improving the inspection process. However, data interpretation and the complexity of the results impose some limitations on the practicality of this technique, mainly because a trained, experienced person is needed to interpret the images obtained by the GPR system. In this paper, an algorithm that classifies and assesses the condition of infrastructure using image processing and pattern recognition techniques is discussed. Features extracted from a dataset of images of defective and healthy slabs are used to train a computer-vision-based system, while another dataset is used to evaluate the proposed algorithm. Initial results show that the proposed algorithm is able to detect the existence of defects with about a 77% success rate.

  10. Decision-making impairments in Parkinson's disease as a by-product of defective cost-benefit analysis and feedback processing.

    PubMed

    Ryterska, Agata; Jahanshahi, Marjan; Osman, Magda

    2014-01-01

    Studies examining decision-making in people with Parkinson's disease (PD) show impaired performance on a variety of tasks. However, there are also demonstrations that patients with PD can make optimal decisions just like healthy age-matched controls. We propose that the reason for these mixed findings is that PD does not produce a generalized impairment of decision-making, but rather affects sub-components of this process. In this review we evaluate this hypothesis by considering the empirical evidence examining decision-making in PD. We suggest that of the various stages of the decision-making process, the most affected in PD are (1) the cost-benefit analysis stage and (2) the outcome evaluation stage. We consider the implications of this proposal for research in this area.

  11. Evaluation of arterial propagation velocity based on the automated analysis of the Pulse Wave Shape

    NASA Astrophysics Data System (ADS)

    Clara, F. M.; Scandurra, A. G.; Meschino, G. J.; Passoni, L. I.

    2011-12-01

    This paper proposes the automatic estimation of the arterial propagation velocity from raw pulse wave records measured in the region of the radial artery. A fully automatic process is proposed to select and analyze typical pulse cycles from the raw data. An adaptive neuro-fuzzy inference system, together with a heuristic search, is used to find a functional approximation of the pulse wave. The propagation velocity is then estimated by analyzing the functional approximation obtained with the fuzzy model. Analysis of the pulse wave records with the proposed methodology showed only small differences compared with the method used so far, which relies on strong interaction with the user. To evaluate the proposed methodology, we estimated the propagation velocity in a population of healthy men spanning a wide range of ages. These studies found that propagation velocity increases linearly with age and shows considerable dispersion among healthy individuals. We conclude that this process could be used to indirectly evaluate the propagation velocity of the aorta, which is related to physiological age in healthy individuals and to life expectancy in cardiovascular patients.

  12. 15 CFR 291.5 - Proposal selection process.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS MANUFACTURING... reviewed by NIST to assure compliance with the proposal content and other basic provisions of this notice... finalists. NIST will appoint an evaluation panel composed of NIST and in some cases other federal employees...

  13. 15 CFR 291.5 - Proposal selection process.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS MANUFACTURING... reviewed by NIST to assure compliance with the proposal content and other basic provisions of this notice... finalists. NIST will appoint an evaluation panel composed of NIST and in some cases other federal employees...

  14. 15 CFR 291.5 - Proposal selection process.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS MANUFACTURING... reviewed by NIST to assure compliance with the proposal content and other basic provisions of this notice... finalists. NIST will appoint an evaluation panel composed of NIST and in some cases other federal employees...

  15. 15 CFR 291.5 - Proposal selection process.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS MANUFACTURING... reviewed by NIST to assure compliance with the proposal content and other basic provisions of this notice... finalists. NIST will appoint an evaluation panel composed of NIST and in some cases other federal employees...

  16. 15 CFR 291.5 - Proposal selection process.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... INSTITUTE OF STANDARDS AND TECHNOLOGY, DEPARTMENT OF COMMERCE NIST EXTRAMURAL PROGRAMS MANUFACTURING... reviewed by NIST to assure compliance with the proposal content and other basic provisions of this notice... finalists. NIST will appoint an evaluation panel composed of NIST and in some cases other federal employees...

  17. Comprehensive Mass Analysis for Chemical Processes, a Case Study on L-Dopa Manufacture

    EPA Science Inventory

    To evaluate the “greenness” of chemical processes in route selection and process development, we propose a comprehensive mass analysis to inform the stakeholders from different fields. This is carried out by characterizing the mass intensity for each contributing chemical or wast...

  18. Selecting automation for the clinical chemistry laboratory.

    PubMed

    Melanson, Stacy E F; Lindeman, Neal I; Jarolim, Petr

    2007-07-01

    Laboratory automation proposes to improve the quality and efficiency of laboratory operations, and may provide a solution to the quality demands and staff shortages faced by today's clinical laboratories. Several vendors offer automation systems in the United States, with both subtle and obvious differences. Arriving at a decision to automate, and the ensuing evaluation of available products, can be time-consuming and challenging. Although considerable discussion concerning the decision to automate has been published, relatively little attention has been paid to the process of evaluating and selecting automation systems. To outline a process for evaluating and selecting automation systems as a reference for laboratories contemplating laboratory automation. Our Clinical Chemistry Laboratory staff recently evaluated all major laboratory automation systems in the United States, with their respective chemistry and immunochemistry analyzers. Our experience is described and organized according to the selection process, the important considerations in clinical chemistry automation, decisions and implementation, and we give conclusions pertaining to this experience. Including the formation of a committee, workflow analysis, submitting a request for proposal, site visits, and making a final decision, the process of selecting chemistry automation took approximately 14 months. We outline important considerations in automation design, preanalytical processing, analyzer selection, postanalytical storage, and data management. Selecting clinical chemistry laboratory automation is a complex, time-consuming process. Laboratories considering laboratory automation may benefit from the concise overview and narrative and tabular suggestions provided.

  19. Are there two processes in reasoning? The dimensionality of inductive and deductive inferences.

    PubMed

    Stephens, Rachel G; Dunn, John C; Hayes, Brett K

    2018-03-01

    Single-process accounts of reasoning propose that the same cognitive mechanisms underlie inductive and deductive inferences. In contrast, dual-process accounts propose that these inferences depend upon 2 qualitatively different mechanisms. To distinguish between these accounts, we derived a set of single-process and dual-process models based on an overarching signal detection framework. We then used signed difference analysis to test each model against data from an argument evaluation task, in which induction and deduction judgments are elicited for sets of valid and invalid arguments. Three data sets were analyzed: data from Singmann and Klauer (2011), a database of argument evaluation studies, and the results of an experiment designed to test model predictions. Of the large set of testable models, we found that almost all could be rejected, including all 2-dimensional models. The only testable model able to account for all 3 data sets was a model with 1 dimension of argument strength and independent decision criteria for induction and deduction judgments. We conclude that despite the popularity of dual-process accounts, current results from the argument evaluation task are best explained by a single-process account that incorporates separate decision thresholds for inductive and deductive inferences. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  20. The performance evaluation model of mining project founded on the weight optimization entropy value method

    NASA Astrophysics Data System (ADS)

    Mao, Chao; Chen, Shou

    2017-01-01

    Because the traditional entropy value method still yields low accuracy when evaluating the performance of mining projects, a performance evaluation model for mining projects founded on an improved entropy value method is proposed. First, a new weight assignment model is established, founded on the compatibility matrix analysis of the analytic hierarchy process (AHP) and the entropy value method: once the compatibility matrix analysis meets the consistency requirements, any differences between the subjective and objective weights are reconciled by moderately adjusting their proportions, and on this basis a fuzzy evaluation matrix is built for the performance evaluation. The simulation experiments show that, compared with the traditional entropy value and compatibility matrix analysis methods, the proposed performance evaluation model based on the improved entropy value method achieves higher assessment accuracy.
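
    A minimal sketch of the weighting step, assuming a decision matrix of positive, larger-is-better indicator values and an AHP pairwise-comparison matrix; the blending factor alpha and all names are illustrative assumptions rather than the paper's exact adjustment rule.

        import numpy as np

        def entropy_weights(data):
            """Objective weights from the entropy value method.
            data: (n_projects x m_indicators), positive, larger-is-better."""
            P = data / data.sum(axis=0)
            n = data.shape[0]
            logP = np.where(P > 0, np.log(P), 0.0)
            e = -(P * logP).sum(axis=0) / np.log(n)   # entropy per indicator
            d = 1.0 - e                               # divergence degree
            return d / d.sum()

        def ahp_weights(pairwise):
            """Subjective weights: normalised principal eigenvector of the AHP matrix."""
            vals, vecs = np.linalg.eig(pairwise)
            w = np.abs(vecs[:, np.argmax(vals.real)].real)
            return w / w.sum()

        def combined_weights(w_subjective, w_objective, alpha=0.5):
            """Moderately adjust the proportions of subjective and objective weights."""
            w = alpha * w_subjective + (1.0 - alpha) * w_objective
            return w / w.sum()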

  1. Maximizing Total QoS-Provisioning of Image Streams with Limited Energy Budget

    NASA Astrophysics Data System (ADS)

    Lee, Wan Yeon; Kim, Kyong Hoon; Ko, Young Woong

    To fully utilize the limited battery energy of mobile electronic devices, we propose an adaptive adjustment method of processing quality for multiple image stream tasks whose execution times vary widely. The adjustment method completes the worst-case executions of the tasks within a given energy budget, and maximizes the total reward value of processing quality obtained during their executions by exploiting the probability distribution of task execution times. The proposed method derives the maximum reward value for tasks that can execute with arbitrary processing quality, and a near-maximum value for tasks restricted to a finite number of processing qualities. Our evaluation on a prototype system shows that the proposed method achieves larger reward values, by up to 57%, than the previous method.

  2. Pretreatment of empty fruit bunch from oil palm for fuel ethanol production and proposed biorefinery process.

    PubMed

    Tan, Liping; Yu, Yongcheng; Li, Xuezhi; Zhao, Jian; Qu, Yinbo; Choo, Yuen May; Loh, Soh Kheang

    2013-05-01

    This study evaluates the effects of some pretreatment processes to improve the enzymatic hydrolysis of oil palm empty fruit bunch (EFB) for ethanol production. The experimental results show that the bisulfite pretreatment was practical for EFB pretreatment. Moreover, the optimum pretreatment conditions of the bisulfite pretreatment (180 °C, 30 min, 8% NaHSO3, 1% H2SO4) were identified. In the experiments, a biorefinery process of EFB was proposed to produce ethanol, xylose products, and lignosulfonates. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. Evaluating Cloud Computing in the Proposed NASA DESDynI Ground Data System

    NASA Technical Reports Server (NTRS)

    Tran, John J.; Cinquini, Luca; Mattmann, Chris A.; Zimdars, Paul A.; Cuddy, David T.; Leung, Kon S.; Kwoun, Oh-Ig; Crichton, Dan; Freeborn, Dana

    2011-01-01

    The proposed NASA Deformation, Ecosystem Structure and Dynamics of Ice (DESDynI) mission would be a first-of-breed endeavor that would fundamentally change the paradigm by which Earth Science data systems at NASA are built. DESDynI is evaluating a distributed architecture where expert science nodes around the country all engage in some form of mission processing and data archiving. This is compared to the traditional NASA Earth Science missions where the science processing is typically centralized. What's more, DESDynI is poised to profoundly increase the amount of data collection and processing well into the 5 terabyte/day and tens of thousands of job range, both of which comprise a tremendous challenge to DESDynI's proposed distributed data system architecture. In this paper, we report on a set of architectural trade studies and benchmarks meant to inform the DESDynI mission and the broader community of the impacts of these unprecedented requirements. In particular, we evaluate the benefits of cloud computing and its integration with our existing NASA ground data system software called Apache Object Oriented Data Technology (OODT). The preliminary conclusions of our study suggest that the use of the cloud and OODT together synergistically form an effective, efficient and extensible combination that could meet the challenges of NASA science missions requiring DESDynI-like data collection and processing volumes at reduced costs.

  4. An automated process for building reliable and optimal in vitro/in vivo correlation models based on Monte Carlo simulations.

    PubMed

    Sutton, Steven C; Hu, Mingxiu

    2006-05-05

    Many mathematical models have been proposed for establishing an in vitro/in vivo correlation (IVIVC). The traditional IVIVC model building process consists of 5 steps: deconvolution, model fitting, convolution, prediction error evaluation, and cross-validation. This is a time-consuming process, and typically only a few models at most are tested for any given data set. The objectives of this work were to (1) propose a statistical tool to screen models for further development of an IVIVC, (2) evaluate the performance of each model under different circumstances, and (3) investigate the effectiveness of common statistical model selection criteria for choosing IVIVC models. A computer program was developed to explore which model(s) would be most likely to work well with a random variation from the original formulation. The process used Monte Carlo simulation techniques to build IVIVC models. Data-based model selection criteria (the Akaike Information Criterion [AIC] and R2) and the probability of passing the Food and Drug Administration "prediction error" requirement were calculated. Several real data sets representing a broad range of release profiles are used to illustrate the process and to demonstrate the advantages of this automated process over the traditional approach. The Hixson-Crowell and Weibull models were often preferred over the linear model. When evaluating whether a Level A IVIVC model was possible, the AIC generally selected the best model. We believe that the proposed approach may be a rapid tool to determine which IVIVC model (if any) is the most applicable.
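
    The model-screening idea can be illustrated with a short sketch that fits a few candidate release models to an observed dissolution profile and ranks them by AIC; the candidate equations, starting values and the Gaussian-error AIC formula are generic assumptions and do not reproduce the full deconvolution/convolution IVIVC workflow.

        import numpy as np
        from scipy.optimize import curve_fit

        def weibull(t, a, b):
            return 1.0 - np.exp(-(t / a) ** b)

        def hixson_crowell(t, k):
            return 1.0 - np.maximum(1.0 - k * t, 0.0) ** 3

        def linear(t, k):
            return np.clip(k * t, 0.0, 1.0)

        def aic(y, y_hat, n_params):
            """Gaussian-error AIC computed from the residual sum of squares."""
            n = len(y)
            rss = float(np.sum((y - y_hat) ** 2))
            return n * np.log(rss / n) + 2 * n_params

        def select_release_model(t, fraction_released):
            candidates = {"weibull": (weibull, [1.0, 1.0]),
                          "hixson_crowell": (hixson_crowell, [0.1]),
                          "linear": (linear, [0.1])}
            scores = {}
            for name, (model, p0) in candidates.items():
                popt, _ = curve_fit(model, t, fraction_released, p0=p0, maxfev=10000)
                scores[name] = aic(fraction_released, model(t, *popt), len(popt))
            return min(scores, key=scores.get), scores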

  5. Method and Application for Dynamic Comprehensive Evaluation with Subjective and Objective Information

    PubMed Central

    Liu, Dinglin; Zhao, Xianglian

    2013-01-01

    In an effort to deal with more complicated evaluation situations, scientists have focused their efforts on dynamic comprehensive evaluation research, and how to make full use of subjective and objective information has become a noteworthy issue. In this paper, a dynamic comprehensive evaluation method with subjective and objective information is proposed. We use a combination weighting method to determine the index weights: the analytic hierarchy process (AHP) method is applied to handle the subjective information, and the criteria importance through intercriteria correlation (CRITIC) method is used to handle the objective information. For the time weight determination, we consider both time distance and information size to embody the principle of esteeming the present over the past. A linear weighted average model is then constructed to make the evaluation process more practicable. Finally, an example is presented to illustrate the effectiveness of this method. Overall, the results suggest that the proposed method is reasonable and effective. PMID:24386176
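
    A compact sketch of the aggregation described above, assuming normalised indicator scores per time period; the CRITIC weighting, the geometric time weights favouring recent periods, and the decay constant are illustrative assumptions rather than the paper's exact formulas.

        import numpy as np

        def critic_weights(data):
            """Objective CRITIC weights: contrast (std) times conflict (1 - correlation)."""
            std = data.std(axis=0, ddof=1)
            corr = np.corrcoef(data, rowvar=False)
            info = std * (1.0 - corr).sum(axis=0)
            return info / info.sum()

        def time_weights(n_periods, decay=0.7):
            """Geometric weights that esteem the present over the past (decay assumed)."""
            w = decay ** np.arange(n_periods - 1, -1, -1)
            return w / w.sum()

        def dynamic_score(scores_by_period, index_weights, t_weights):
            """scores_by_period: (T x m) normalised indicator scores of one alternative."""
            per_period = scores_by_period @ index_weights   # linear weighted average
            return float(per_period @ t_weights)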

  6. A study for high accuracy measurement of residual stress by deep hole drilling technique

    NASA Astrophysics Data System (ADS)

    Kitano, Houichi; Okano, Shigetaka; Mochizuki, Masahito

    2012-08-01

    The deep hole drilling (DHD) technique has received much attention in recent years as a method for measuring through-thickness residual stresses. However, some accuracy problems occur when residual stress evaluation is performed with the DHD technique. One reason is that the traditional DHD evaluation formula applies to the plane stress condition. The second is that the effects of the plastic deformation produced in the drilling process and the deformation produced in the trepanning process are ignored. In this study, a modified evaluation formula that applies to the plane strain condition is proposed. In addition, a new procedure is proposed that can account for the effects of the deformation produced in the DHD process, by investigating these effects in detail with finite element (FE) analysis. The evaluation results obtained by the new procedure are then compared with those obtained by the traditional DHD procedure using FE analysis. As a result, the new procedure evaluates the residual stress fields better than the traditional DHD procedure when the measured object is thick enough that the stress condition can be assumed to be plane strain, as in the model used in this study.

  7. Cedar Project---Original goals and progress to date

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cybenko, G.; Kuck, D.; Padua, D.

    1990-11-28

    This work encompasses a broad attack on high speed parallel processing. Hardware, software, applications development, and performance evaluation and visualization as well as research topics are proposed. Our goal is to develop practical parallel processing for the 1990's.

  8. 42 CFR 410.143 - Requirements for approved accreditation organizations.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...) Notice of any proposed changes in its accreditation standards and requirements or evaluation process. If... enforcement of its standards to a set of quality standards (described in § 410.144) and processes when any of the following conditions exist: (i) CMS imposes new requirements or changes its process for approving...

  9. An Underwater Color Image Quality Evaluation Metric.

    PubMed

    Yang, Miao; Sowmya, Arcot

    2015-12-01

    Quality evaluation of underwater images is a key goal of underwater video image retrieval and intelligent processing. To date, no metric has been proposed for underwater color image quality evaluation (UCIQE). The special absorption and scattering characteristics of the water medium do not allow direct application of natural color image quality metrics, especially across different underwater environments. In this paper, subjective testing for underwater image quality has been organized. The statistical distribution of underwater image pixels in the CIELab color space, related to the subjective evaluation, indicates that the sharpness and colorfulness factors correlate well with subjective image quality perception. Based on these, a new UCIQE metric, which is a linear combination of chroma, saturation, and contrast, is proposed to quantify the non-uniform color cast, blurring, and low contrast that characterize underwater engineering and monitoring images. Experiments are conducted to illustrate the performance of the proposed UCIQE metric and its capability to measure underwater image enhancement results. They show that the proposed metric has comparable performance to the leading natural color image quality metrics and the underwater grayscale image quality metrics available in the literature, and can predict with higher accuracy the relative amount of degradation for similar image content in underwater environments. Importantly, UCIQE is a simple and fast solution for real-time underwater video processing. The effectiveness of the presented measure is also demonstrated by subjective evaluation. The results show better correlation between the UCIQE and the subjective mean opinion score.
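
    The linear-combination structure of the metric can be sketched as follows, assuming scikit-image for the CIELab conversion; the chroma, contrast and saturation statistics and the coefficients below are the values commonly reported for UCIQE and should be treated as assumptions rather than the paper's exact constants.

        import numpy as np
        from skimage.color import rgb2lab
        from skimage.util import img_as_float

        def uciqe_like(rgb):
            """UCIQE-style score from chroma spread, luminance contrast, mean saturation."""
            lab = rgb2lab(img_as_float(rgb))
            L, a, b = lab[..., 0], lab[..., 1], lab[..., 2]
            chroma = np.sqrt(a ** 2 + b ** 2)
            sigma_c = chroma.std()                                  # chroma variability
            con_l = np.percentile(L, 99) - np.percentile(L, 1)      # luminance contrast
            saturation = chroma / (np.sqrt(chroma ** 2 + L ** 2) + 1e-12)
            mu_s = saturation.mean()                                # average saturation
            c1, c2, c3 = 0.4680, 0.2745, 0.2576                     # assumed coefficients
            return c1 * sigma_c + c2 * con_l + c3 * mu_s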

  10. Evaluation of Ultrasonic Fiber Structure Extraction Technique Using Autopsy Specimens of Liver

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Tadashi; Hirai, Kazuki; Yamada, Hiroyuki; Ebara, Masaaki; Hachiya, Hiroyuki

    2005-06-01

    It is very important to diagnose liver cirrhosis noninvasively and correctly. In our previous studies, we proposed a processing technique to detect changes in liver tissue in vivo. In this paper, we propose evaluating the relationship between liver disease and echo information using autopsy specimens of a human liver in vitro. In vitro experiments make it possible to clearly verify the role of a processing parameter and to compare the processing result with the actual human liver tissue structure. With our processing technique, information that does not obey a Rayleigh distribution was extracted from the echo signals of the autopsy liver specimens, depending on changes in a particular processing parameter. The fiber tissue structure of the same specimens was extracted from a number of histological images of stained tissue. We constructed 3D structures from the information extracted from the echo signal and from the fiber structure of the stained tissue, and compared the two. By comparing the 3D structures, it is possible to evaluate the relationship between the non-Rayleigh information in the echo signal and the fibrosis structure.

  11. Validating the ACE Model for Evaluating Student Performance Using a Teaching-Learning Process Based on Computational Modeling Systems

    ERIC Educational Resources Information Center

    Louzada, Alexandre Neves; Elia, Marcos da Fonseca; Sampaio, Fábio Ferrentini; Vidal, Andre Luiz Pestana

    2014-01-01

    The aim of this work is to adapt and test, in a Brazilian public school, the ACE model proposed by Borkulo for evaluating student performance as a teaching-learning process based on computational modeling systems. The ACE model is based on different types of reasoning involving three dimensions. In addition to adapting the model and introducing…

  12. Design and evaluation of a parametric model for cardiac sounds.

    PubMed

    Ibarra-Hernández, Roilhi F; Alonso-Arévalo, Miguel A; Cruz-Gutiérrez, Alejandro; Licona-Chávez, Ana L; Villarreal-Reyes, Salvador

    2017-10-01

    Heart sound analysis plays an important role in the auscultative diagnosis process to detect the presence of cardiovascular diseases. In this paper we propose a novel parametric heart sound model that accurately represents normal and pathological cardiac audio signals, also known as phonocardiograms (PCG). The proposed model considers that the PCG signal is formed by the sum of two parts: one of them is deterministic and the other one is stochastic. The first part contains most of the acoustic energy. This part is modeled by the Matching Pursuit (MP) algorithm, which performs an analysis-synthesis procedure to represent the PCG signal as a linear combination of elementary waveforms. The second part, also called the residual, is obtained after subtracting the deterministic signal from the original heart sound recording and can be accurately represented as an autoregressive process using the Linear Predictive Coding (LPC) technique. We evaluate the proposed heart sound model by performing subjective and objective tests using signals corresponding to different pathological cardiac sounds. The results of the objective evaluation show an average Percentage of Root-Mean-Square Difference of approximately 5% between the original heart sound and the reconstructed signal. For the subjective test we conducted a formal methodology for perceptual evaluation of audio quality with the assistance of medical experts. Statistical results of the subjective evaluation show that our model provides a highly accurate approximation of real heart sound signals. We are not aware of any previous heart sound model that has been evaluated as rigorously as our proposal. Copyright © 2017 Elsevier Ltd. All rights reserved.
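
    The deterministic-plus-stochastic decomposition can be sketched in a few functions: a greedy matching pursuit over a small Gabor dictionary for the deterministic part, and autocorrelation-method LPC (Levinson-Durbin) for the residual; the dictionary grid and all parameter values are assumptions for illustration, not the dictionary used by the authors.

        import numpy as np

        def gabor_dictionary(n, scales, freqs, step):
            """Unit-norm Gabor atoms on a coarse grid of scales, frequencies, centres."""
            t = np.arange(n)
            atoms = []
            for s in scales:
                for f in freqs:
                    for c in range(0, n, step):
                        g = np.exp(-0.5 * ((t - c) / s) ** 2) * np.cos(2 * np.pi * f * (t - c))
                        atoms.append(g / np.linalg.norm(g))
            return np.column_stack(atoms)

        def matching_pursuit(x, D, n_atoms):
            """Greedy MP: returns the deterministic approximation and the residual."""
            residual = x.astype(float).copy()
            approx = np.zeros_like(residual)
            for _ in range(n_atoms):
                corr = D.T @ residual
                k = int(np.argmax(np.abs(corr)))
                approx += corr[k] * D[:, k]
                residual -= corr[k] * D[:, k]
            return approx, residual

        def lpc(residual, order):
            """Autocorrelation-method LPC via the Levinson-Durbin recursion."""
            r = np.correlate(residual, residual, mode="full")[len(residual) - 1:][:order + 1]
            a = np.zeros(order + 1); a[0] = 1.0
            e = r[0]
            for i in range(1, order + 1):
                acc = r[i] + np.dot(a[1:i], r[i - 1:0:-1])
                k = -acc / e
                prev = a.copy()
                a[1:i] = prev[1:i] + k * prev[i - 1:0:-1]
                a[i] = k
                e *= (1.0 - k * k)
            return a, e          # prediction coefficients and residual error power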

  13. Using a fuzzy comprehensive evaluation method to determine product usability: A proposed theoretical framework

    PubMed Central

    Zhou, Ronggang; Chan, Alan H. S.

    2016-01-01

    BACKGROUND: In order to compare existing usability data to ideal goals or to that for other products, usability practitioners have tried to develop a framework for deriving an integrated metric. However, most current usability methods with this aim rely heavily on human judgment about the various attributes of a product, but often fail to take into account the inherent uncertainties in these judgments in the evaluation process. OBJECTIVE: This paper presents a universal method of usability evaluation by combining the analytic hierarchy process (AHP) and the fuzzy evaluation method. By integrating multiple sources of uncertain information during product usability evaluation, the method proposed here aims to derive an index that is structured hierarchically in terms of the three usability components of effectiveness, efficiency, and user satisfaction of a product. METHODS: With consideration of the theoretical basis of fuzzy evaluation, a two-layer comprehensive evaluation index was first constructed. After the membership functions were determined by an expert panel, the evaluation appraisals were computed using the fuzzy comprehensive evaluation model to characterize fuzzy human judgments. Then, with the use of AHP, the weights of the usability components were elicited from these experts. RESULTS AND CONCLUSIONS: Compared to traditional usability evaluation methods, the major strength of the fuzzy method is that it captures the fuzziness and uncertainties in human judgments and provides an integrated framework that combines the vague judgments from multiple stages of a product evaluation process. PMID:28035943
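
    A small sketch of the two ingredients, assuming an expert-derived membership matrix R (items x grades) and an AHP pairwise-comparison matrix for the item weights; the numbers shown are hypothetical, and the weighted-average composition operator is one common choice among several.

        import numpy as np

        def ahp_weights(pairwise):
            """Item weights: normalised principal eigenvector of the AHP matrix."""
            vals, vecs = np.linalg.eig(pairwise)
            w = np.abs(vecs[:, np.argmax(vals.real)].real)
            return w / w.sum()

        def fuzzy_comprehensive_eval(R, w):
            """Weighted-average fuzzy composition B = w . R, normalised to sum to 1."""
            B = w @ R
            return B / B.sum()

        # hypothetical example: three usability items rated over four grades
        R = np.array([[0.1, 0.3, 0.4, 0.2],     # effectiveness
                      [0.0, 0.2, 0.5, 0.3],     # efficiency
                      [0.2, 0.4, 0.3, 0.1]])    # satisfaction
        pairwise = np.array([[1.0, 2.0, 3.0],
                             [0.5, 1.0, 2.0],
                             [1/3, 0.5, 1.0]])
        grade_membership = fuzzy_comprehensive_eval(R, ahp_weights(pairwise))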

  14. Environmental Assessment for the Strategic Petroleum Reserve West Hackberry Facility Raw Water Intake Pipeline Replacement Cameron and Calcasieu Parishes, Louisiana

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    N /A

    The proposed action and three alternatives, including a No Build alternative, were evaluated along the existing RWIPL alignment to accommodate the placement of the proposed RWIPL. Construction feasibility, reasonableness and potential environmental impacts were considered during the evaluation of the four actions (and action alternatives) for the proposed RWIPL activities. Reasonable actions were identified as those actions which were considered to be supported by common sense and sound technical principles. Feasible actions were those actions which were considered to be capable of being accomplished, practicable and non-excessive in terms of cost. The evaluation process considered the following design specifications, which were determined to be important to the feasibility of the overall project. The proposed RWIPL replacement project must therefore: (1) Comply with the existing design basis and criteria, (2) Maintain continuity of operation of the facility during construction, (3) Provide the required service life, (4) Be cost effective, (5) Improve the operation and maintenance of the pipeline, and (6) Maintain minimal environmental impact while meeting the performance requirements. Sizing of the pipe, piping construction materials, construction method (e.g., open-cut trench, directional drilling, etc.) and the acquisition of new Right-of-Way (ROW) were additionally evaluated in the preliminary alternative identification, selection and screening process.

  15. On the anatomy of a chain shift

    PubMed Central

    Dinnsen, Daniel A.; Green, Christopher R.; Gierut, Judith A.; Morrisette, Michele L.

    2012-01-01

    Phonological chain shifts have been the focus of many theoretical, developmental, and clinical concerns. This paper considers an overlooked property of the problem by focusing on the typological properties of the widely attested ‘s > θ > f’ chain shift involving the processes of Labialization and Dentalization in early phonological development. Findings are reported from a cross-sectional study of 234 children (ages 3;0–7;9) with functional (nonorganic) phonological delays. The results reveal some unexpected gaps in the predicted interactions of these processes and are brought to bear on the evaluation of recent optimality theoretic proposals for the characterization of phonological interactions. A developmental modification to the theory is proposed that has the desired effect of precluding certain early-stage grammars. The proposal is further evaluated against the facts of another widely cited developmental chain shift known as the ‘puzzle > puddle > pickle’ problem (Smith 1973). PMID:22389522

  16. 77 FR 60746 - Proposed Information Collection (VA/DOD Joint Disability Evaluation Board Claim) Activity...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-04

    ... War on Terror Heroes, VA and the Department of Defense (DOD) have agreed to develop a joint process in which Global War on Terror (GWOT) service members are evaluated to assign disability ratings, which will...

  17. Understanding Peer Review of Scientific Research

    ERIC Educational Resources Information Center

    Association of American Universities, 2011

    2011-01-01

    An important factor in the success of America's national research system is that federal funds for university-based research are awarded primarily through peer review, which uses panels of scientific experts, or "peers," to evaluate the quality of grant proposals. In this competitive process, proposals compete for resources based on their…

  18. Bilingual Program Application for Continuation Proposal: Compton Unified School District.

    ERIC Educational Resources Information Center

    Compton City Schools, CA.

    This document contains the continuation proposal for the fourth grade Compton bilingual education program. A review of the third year is included with details on process evaluation, project personnel and duties, new vocabulary developed by the project for lexical references, and inservice training of teachers. Information concerning the proposed…

  19. Adaptive neuro-heuristic hybrid model for fruit peel defects detection.

    PubMed

    Woźniak, Marcin; Połap, Dawid

    2018-02-01

    Fusion of machine learning methods benefits decision support systems, because composing approaches makes it possible to combine their most effective features in one solution. In this article we present an adaptive method based on the fusion of a proposed novel neural architecture and a heuristic search into one co-working solution. We propose a neural network architecture that adapts to the processed input, co-working with a heuristic method used to precisely detect areas of interest. Input images are first decomposed into segments. This makes processing easier, since in smaller images (decomposed segments) the developed Adaptive Artificial Neural Network (AANN) processes less information, which makes the numerical calculations more precise. For each segment a descriptor vector is composed and presented to the proposed AANN architecture. Evaluation is run adaptively, with the developed AANN adapting its architecture to the inputs and their features. After evaluation, selected segments are forwarded to the heuristic search, which detects the areas of interest. As a result the system returns the image with the pixels located over peel damage marked. Experimental results for the developed solution are discussed and compared with other commonly used methods to validate the efficacy of the proposed fusion within the system structure and training process and its impact on classification results. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Motor unit action potential conduction velocity estimated from surface electromyographic signals using image processing techniques.

    PubMed

    Soares, Fabiano Araujo; Carvalho, João Luiz Azevedo; Miosso, Cristiano Jacques; de Andrade, Marcelino Monteiro; da Rocha, Adson Ferreira

    2015-09-17

    In surface electromyography (surface EMG, or S-EMG), conduction velocity (CV) refers to the velocity at which the motor unit action potentials (MUAPs) propagate along the muscle fibers during contractions. The CV is related to the type and diameter of the muscle fibers, ion concentration, pH, and firing rate of the motor units (MUs). The CV can be used in the evaluation of contractile properties of MUs and of muscle fatigue. The most popular methods for CV estimation are those based on maximum likelihood estimation (MLE). This work proposes an algorithm for estimating CV from S-EMG signals using digital image processing techniques. The proposed approach is demonstrated and evaluated using both simulated and experimentally acquired multichannel S-EMG signals. We show that the proposed algorithm is as precise and accurate as the MLE method in typical conditions of noise and CV. The proposed method is not susceptible to errors associated with MUAP propagation direction or inadequate initialization parameters, which are common with the MLE algorithm. Image-processing-based approaches may be useful in S-EMG analysis to extract different physiological parameters from multichannel S-EMG signals. Other new methods based on image processing could also be developed to help solve other tasks in EMG analysis, such as estimation of the CV for individual MUs, localization and tracking of innervation zones, and study of MU recruitment strategies.
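
    For context, the quantity being estimated can be illustrated with the classical two-channel delay estimate below (the cross-correlation peak between adjacent electrodes divided into the inter-electrode distance); this is a simplified baseline and is not the image-processing algorithm proposed in the paper, nor the multichannel MLE method.

        import numpy as np

        def conduction_velocity(ch1, ch2, fs_hz, electrode_distance_m):
            """CV estimate from the delay between two S-EMG channels along the fibre."""
            x = ch1 - np.mean(ch1)
            y = ch2 - np.mean(ch2)
            xcorr = np.correlate(y, x, mode="full")
            lag_samples = int(np.argmax(xcorr)) - (len(x) - 1)   # delay of ch2 vs ch1
            if lag_samples == 0:
                raise ValueError("no measurable delay between channels")
            delay_s = lag_samples / fs_hz
            return electrode_distance_m / delay_s                # metres per second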

  1. Standardizing an approach to the evaluation of implementation science proposals.

    PubMed

    Crable, Erika L; Biancarelli, Dea; Walkey, Allan J; Allen, Caitlin G; Proctor, Enola K; Drainoni, Mari-Lynn

    2018-05-29

    The fields of implementation and improvement sciences have experienced rapid growth in recent years. However, research that seeks to inform health care change may have difficulty translating core components of implementation and improvement sciences within the traditional paradigms used to evaluate efficacy and effectiveness research. A review of implementation and improvement sciences grant proposals within an academic medical center using a traditional National Institutes of Health framework highlighted the need for tools that could assist investigators and reviewers in describing and evaluating proposed implementation and improvement sciences research. We operationalized existing recommendations for writing implementation science proposals as the ImplemeNtation and Improvement Science Proposals Evaluation CriTeria (INSPECT) scoring system. The resulting system was applied to pilot grants submitted to a call for implementation and improvement science proposals at an academic medical center. We evaluated the reliability of the INSPECT system using Krippendorff's alpha coefficients and explored the utility of the INSPECT system to characterize common deficiencies in implementation research proposals. We scored 30 research proposals using the INSPECT system. Proposals received a median cumulative score of 7 out of a possible score of 30. Across individual elements of INSPECT, proposals scored highest for criteria rating evidence of a care or quality gap. Proposals generally performed poorly on all other criteria. Most proposals received scores of 0 for criteria identifying an evidence-based practice or treatment (50%), conceptual model and theoretical justification (70%), setting's readiness to adopt new services/treatment/programs (54%), implementation strategy/process (67%), and measurement and analysis (70%). Inter-coder reliability testing showed excellent reliability (Krippendorff's alpha coefficient 0.88) for the application of the scoring system overall and demonstrated reliability scores ranging from 0.77 to 0.99 for individual elements. The INSPECT scoring system presents a new set of scoring criteria with a high degree of inter-rater reliability and utility for evaluating the quality of implementation and improvement sciences grant proposals.

  2. An enhanced fast scanning algorithm for image segmentation

    NASA Astrophysics Data System (ADS)

    Ismael, Ahmed Naser; Yusof, Yuhanis binti

    2015-12-01

    Segmentation is an essential and important process that separates an image into regions that have similar characteristics or features, transforming the image for better analysis and evaluation. An important benefit of segmentation is the identification of the region of interest in a particular image. Various algorithms have been proposed for image segmentation, including the Fast Scanning algorithm, which has been employed on food, sport and medical images. It scans all pixels in the image and clusters each pixel according to the upper and left neighbor pixels. The clustering process in the Fast Scanning algorithm is performed by merging pixels with similar neighbors based on an identified threshold. Such an approach can lead to weak reliability and shape matching of the produced segments. This paper proposes an adaptive threshold function to be used in the clustering process of the Fast Scanning algorithm. The function uses the gray values of the image's pixels and their variance; levels of the image above the threshold are converted into intensity values between 0 and 1, while the other values are set to an intensity of zero. The proposed enhanced Fast Scanning algorithm is applied to images of public and private transportation in Iraq. Evaluation is then made by comparing the images produced by the proposed algorithm and the standard Fast Scanning algorithm. The results showed that the proposed algorithm is faster than the standard Fast Scanning algorithm.
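
    A simplified sketch of the fast-scanning clustering idea is shown below: each pixel is compared with its upper and left neighbours, and merged clusters are tracked with a union-find structure; the fixed similarity threshold stands in for the adaptive, variance-based threshold function proposed in the paper, and all names are illustrative.

        import numpy as np

        def fast_scan_segment(gray, threshold):
            """Cluster pixels by similarity to the upper and left neighbour pixels."""
            h, w = gray.shape
            labels = -np.ones((h, w), dtype=int)
            parent = []                       # union-find forest over cluster labels

            def find(i):
                while parent[i] != i:
                    parent[i] = parent[parent[i]]
                    i = parent[i]
                return i

            def union(i, j):
                ri, rj = find(i), find(j)
                if ri != rj:
                    parent[rj] = ri

            for y in range(h):
                for x in range(w):
                    val = float(gray[y, x])
                    up = labels[y - 1, x] if y > 0 else -1
                    left = labels[y, x - 1] if x > 0 else -1
                    near_up = up >= 0 and abs(val - float(gray[y - 1, x])) <= threshold
                    near_left = left >= 0 and abs(val - float(gray[y, x - 1])) <= threshold
                    if near_up and near_left:
                        labels[y, x] = find(up)
                        union(up, left)       # upper and left clusters become one region
                    elif near_up:
                        labels[y, x] = find(up)
                    elif near_left:
                        labels[y, x] = find(left)
                    else:
                        labels[y, x] = len(parent)
                        parent.append(len(parent))
            # final pass so merged clusters share a single representative label
            for y in range(h):
                for x in range(w):
                    labels[y, x] = find(labels[y, x])
            return labels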

  3. Evaluation of spray drift using low speed wind tunnel measurements and dispersion modeling

    USDA-ARS?s Scientific Manuscript database

    The objective of this work was to evaluate the EPA’s proposed Test Plan for the validation testing of pesticide spray drift reduction technologies (DRTs) for row and field crops, focusing on the evaluation of ground application systems using the low-speed wind tunnel protocols and processing the dat...

  4. Identifying and Evaluating External Validity Evidence for Passing Scores

    ERIC Educational Resources Information Center

    Davis-Becker, Susan L.; Buckendahl, Chad W.

    2013-01-01

    A critical component of the standard setting process is collecting evidence to evaluate the recommended cut scores and their use for making decisions and classifying students based on test performance. Kane (1994, 2001) proposed a framework by which practitioners can identify and evaluate evidence of the results of the standard setting from (1)…

  5. Evaluation of extreme temperature events in northern Spain based on process control charts

    NASA Astrophysics Data System (ADS)

    Villeta, M.; Valencia, J. L.; Saá, A.; Tarquis, A. M.

    2018-02-01

    Extreme climate events have recently attracted the attention of a growing number of researchers because these events impose a large cost on agriculture and associated insurance planning. This study focuses on extreme temperature events and proposes a new method for their evaluation based on statistical process control tools, which are unusual in climate studies. A series of minimum and maximum daily temperatures for 12 geographical areas of a Spanish region between 1931 and 2009 were evaluated by applying statistical process control charts to statistically test whether evidence existed for an increase or a decrease of extreme temperature events. Specification limits were determined for each geographical area and used to define four types of extreme anomalies: lower and upper extremes for the minimum and maximum anomalies. A new binomial Markov extended process that considers the autocorrelation between extreme temperature events was generated for each geographical area and extreme anomaly type to establish the attribute control charts for the annual fraction of extreme days and to monitor the occurrence of annual extreme days. This method was used to assess the significance of changes and trends of extreme temperature events in the analysed region. The results demonstrate the effectiveness of an attribute control chart for evaluating extreme temperature events. For example, the evaluation of extreme maximum temperature events using the proposed statistical process control charts was consistent with the evidence of an increase in maximum temperatures during the last decades of the last century.
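
    The attribute-chart idea can be sketched as a standard p chart for the annual fraction of extreme days; note that this simple version assumes independent days, whereas the paper extends it with a binomial Markov model to account for autocorrelation between extreme events.

        import numpy as np

        def p_chart(extreme_days_per_year, days_per_year=365):
            """Centre line and 3-sigma limits for the annual fraction of extreme days."""
            p = np.asarray(extreme_days_per_year, dtype=float) / days_per_year
            p_bar = p.mean()                                        # centre line
            sigma = np.sqrt(p_bar * (1.0 - p_bar) / days_per_year)
            ucl = p_bar + 3.0 * sigma
            lcl = max(p_bar - 3.0 * sigma, 0.0)
            out_of_control = (p > ucl) | (p < lcl)                  # years flagged
            return p_bar, lcl, ucl, out_of_control

        # usage sketch with hypothetical yearly counts of extreme-maximum days
        # p_bar, lcl, ucl, flags = p_chart([4, 6, 3, 9, 14, 12])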

  6. Fulfilling Schmidt Ocean Institute's commitment to open sharing of information, data, and research outcomes: Successes and Lessons Learned from Proposal Evaluation to Public Repositories to Lasting Achievements

    NASA Astrophysics Data System (ADS)

    Miller, A.; Zykov, V.

    2016-02-01

    Schmidt Ocean Institute's vision is that the world's ocean be understood through technological advancement, intelligent observation, and open sharing of information. As such, making data collected aboard R/V Falkor available to the general public is a key pillar of the organization and a major strategic focus. Schmidt Ocean Institute supports open sharing of information about the ocean to stimulate the growth of its applications and user community, and amplify further exploration, discovery, and deeper understanding of our environment. These efforts are supported through partnerships with data management experts in the oceanographic community to enable standards-compliant sharing of scientific information and data collected during research cruises. To properly fulfill the commitment, proponents' data management plans are evaluated as part of the proposal process when applying for ship time. We request that a thorough data management plan be submitted, and expert reviewers evaluate the plan as part of the review process. Once a project is successfully selected, the chief scientist signs an agreement stating delivery dates for post-cruise data deliverables in a timely manner, R/V Falkor underway and meteorological data are shared via public repositories, and links and reports are posted on the cruise webpage. This allows many more creative minds and thinkers to analyze, process, and study the data collected in the world ocean rather than privileging one scientist with the proprietary information, driving international and national scientific progress. This presentation will include the Institute's mission, vision, and strategy for sharing data, based on our Founders' passions, the process for evaluating proposed data management plans, and our partnering efforts to make data publicly available in fulfillment of our commitment. Recent achievements and successes in data sharing, as well as future plans to improve our efforts, will also be discussed.

  7. Development of a Nationally Coordinated Evaluation Plan for the Ghana National Strategy for Key Populations

    PubMed Central

    Reynolds, Heidi W; Atuahene, Kyeremeh; Sutherland, Elizabeth; Amenyah, Richard; Kwao, Isaiah Doe; Larbi, Emmanuel Tettey

    2015-01-01

    Objective Just as HIV prevention programs need to be tailored to the local epidemic, so should evaluations be country-owned and country-led to ensure use of those results in decision making and policy. The objective of this paper is to describe the process undertaken in Ghana to develop a national evaluation plan for the Ghana national strategy for key populations. Methods This was a participatory process that involved meetings between the Ghana AIDS Commission (GAC), other partners in Ghana working to prevent HIV among key populations, and MEASURE Evaluation. The process included three two-day, highly structured yet participatory meetings over the course of 12 months during which participants shared information about on-going and planned data and identified research questions and methods. Results An evaluation plan was prepared to inform stakeholders about which data collection activities need to be prioritized for funding, who would implement the study, the timing of data collection, the research question the data will help answer, and the analysis methods. The plan discusses various methods that can be used including the recommendation for the study design using multiple data sources. It has an evaluation conceptual model, proposed analyses, proposed definition of independent variables, estimated costs for filling data gaps, roles and responsibilities of stakeholders to carry out the plan, and considerations for ethics, data sharing and authorship. Conclusion The experience demonstrates that it is possible to design an evaluation responsive to national strategies and priorities with country leadership, regardless of stakeholders' experiences with evaluations. This process may be replicable elsewhere, where stakeholders want to plan and implement an evaluation of a large-scale program at the national or subnational level that is responsive to national priorities and part of a comprehensive monitoring and evaluation system. PMID:26120495

  8. CCRS proposal for evaluating LANDSAT-D MSS and TM data

    NASA Technical Reports Server (NTRS)

    Strome, W. M.; Cihlar, J.; Goodenough, D. G.; Guertin, F. E. (Principal Investigator); Collins, A. B.

    1983-01-01

    Accomplishments in the evaluation of LANDSAT 4 data are reported. The objectives of the Canadian proposal are: (1) to quantify the LANDSAT-4 sensors and system performance for the purpose of updating the radiometric and geometric correction algorithms for MSS and for developing and evaluating new correction algorithms to be used for TM data processing; (2) to compare and assess the degree to which LANDSAT-4 MSS data can be integrated with MSS imagery acquired from earlier LANDSAT missions; and (3) to apply image analysis and information extraction techniques for specific user applications such as forestry or agriculture.

  9. Evaluation of hyperspectral reflectance for estimating dry matter and sugar concentration in processing potatoes

    USDA-ARS?s Scientific Manuscript database

    The measurement of sugar concentration and dry matter in processing potatoes is a time and resource intensive activity, cannot be performed in the field, and does not easily measure within tuber variation. A proposed method to improve the phenotyping of processing potatoes is to employ hyperspectral...

  10. On-board Attitude Determination System (OADS). [for advanced spacecraft missions

    NASA Technical Reports Server (NTRS)

    Carney, P.; Milillo, M.; Tate, V.; Wilson, J.; Yong, K.

    1978-01-01

    The requirements, capabilities and system design for an on-board attitude determination system (OADS) to be flown on advanced spacecraft missions were determined. Based upon the OADS requirements and system performance evaluation, a preliminary on-board attitude determination system is proposed. The proposed OADS system consists of one NASA Standard IRU (DRIRU-2) as the primary attitude determination sensor, two improved NASA Standard star trackers (SST) for periodic updates of attitude information, a GPS receiver to provide on-board space vehicle position and velocity vector information, and a multiple microcomputer system for data processing and attitude determination functions. The functional block diagram of the proposed OADS system is shown. The computational requirements are evaluated based upon this proposed OADS system.

  11. Practicing universal design to actual hand tool design process.

    PubMed

    Lin, Kai-Chieh; Wu, Chih-Fu

    2015-09-01

    UD evaluation principles are difficult to implement in product design. This study proposes a methodology for implementing UD in the design process through user participation. The original UD principles and user experience are used to develop the evaluation items, with differences between product types taken into consideration. Factor analysis and Quantification Theory Type I were used to eliminate evaluation items considered inappropriate and to examine the relationship between the evaluation items and product design factors. Product design specifications were established for verification. The results showed that converting user evaluations into crucial design verification factors through a generalized evaluation scale based on product attributes, and applying these design factors in product design, can improve users' UD evaluation. The design process of this study is expected to contribute to user-centered UD application. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  12. 75 FR 26200 - Proposed Information Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-11

    ... School-Based Learn and Serve America Teacher Recruitment Process. The Teacher Recruitment Process will identify and recruit teachers for participation in the National Evaluation of School-Based Learn and Serve... are implementing Learn and Serve America funded service-learning programs. Teachers identified by...

  13. Real-time Medical Emergency Response System: Exploiting IoT and Big Data for Public Health.

    PubMed

    Rathore, M Mazhar; Ahmad, Awais; Paul, Anand; Wan, Jiafu; Zhang, Daqiang

    2016-12-01

    Healthy people are important for any nation's development. Use of the Internet of Things (IoT)-based body area networks (BANs) is increasing for continuous monitoring and medical healthcare in order to perform real-time actions in case of emergencies. However, in the case of monitoring the health of all citizens or people in a country, the millions of sensors attached to human bodies generate a massive volume of heterogeneous data, called "Big Data." Processing Big Data and performing real-time actions in critical situations is a challenging task. Therefore, in order to address such issues, we propose a Real-time Medical Emergency Response System that involves IoT-based medical sensors deployed on the human body. Moreover, the proposed system includes the data analysis building, called the "Intelligent Building," depicted by the proposed layered architecture and implementation model, which is responsible for analysis and decision-making. The data collected from millions of body-attached sensors is forwarded to the Intelligent Building for processing and for performing necessary actions using various units, such as the collection unit, the Hadoop Processing Unit (HPU), and the analysis and decision unit. The feasibility and efficiency of the proposed system are evaluated by implementing the system on Hadoop using an Ubuntu 14.04 LTS Core i5 machine. Various medical sensory datasets and real-time network traffic are considered for evaluating the efficiency of the system. The results show that the proposed system has the capability of efficiently processing WBAN sensory data from millions of users in order to perform real-time responses in case of emergencies.

  14. The qualitative research proposal.

    PubMed

    Klopper, H

    2008-12-01

    Qualitative research in the health sciences has had to overcome many prejudices and a number of misunderstandings, but today qualitative research is as acceptable as quantitative research designs and is widely funded and published. Writing the proposal of a qualitative study, however, can be a challenging feat, due to the emergent nature of the qualitative research design and the description of the methodology as a process. Even today, many sub-standard proposals are still seen, both at post-graduate evaluation committees and among applications submitted for funding. This problem has led the researcher to develop a framework to guide the qualitative researcher in writing the proposal of a qualitative study, based on the following research questions: (i) What is the process of writing a qualitative research proposal? and (ii) What does the structure and layout of a qualitative proposal look like? The purpose of this article is to discuss the process of writing the qualitative research proposal, as well as to describe the structure and layout of a qualitative research proposal. The process of writing a qualitative research proposal is discussed with regard to the most important questions that need to be answered in the proposal, with consideration of the guidelines of being practical, being persuasive, making broader links, aiming for crystal clarity and planning before you write. The structure and layout of the qualitative research proposal are then described with regard to its key sections, namely the cover page, abstract, introduction, review of the literature, research problem and research questions, research purpose and objectives, research paradigm, research design, research method, ethical considerations, dissemination plan, budget and appendices.

  15. Hue-preserving and saturation-improved color histogram equalization algorithm.

    PubMed

    Song, Ki Sun; Kang, Hee; Kang, Moon Gi

    2016-06-01

    In this paper, an algorithm is proposed to improve contrast and saturation without color degradation. The local histogram equalization (HE) method offers better performance than the global HE method, but it sometimes produces undesirable results due to its block-based processing. The proposed contrast-enhancement (CE) algorithm reflects the characteristics of the global HE method within the local HE method to avoid such artifacts, while both global and local contrast are enhanced. There are two ways to apply the proposed CE algorithm to color images: processing only the luminance channel, or processing each color channel independently. However, these approaches incur excessive or reduced saturation and color degradation problems. The proposed algorithm solves these problems by using channel-adaptive equalization and the similarity of the ratios between the channels. Experimental results show that the proposed algorithm enhances contrast and saturation while preserving the hue, and produces better performance than existing methods in terms of objective evaluation metrics.
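
    The hue-preservation idea (keeping the ratios between R, G and B unchanged) can be sketched with a generic global scheme: equalize an intensity channel and rescale all three colour channels by the same per-pixel ratio; this is not the paper's channel-adaptive algorithm, and the final clipping can still disturb the hue of near-saturated pixels.

        import numpy as np

        def hue_preserving_equalize(rgb):
            """Equalize a luminance proxy, then scale R, G, B by one common ratio."""
            img = rgb.astype(float)
            intensity = img.mean(axis=2)                       # simple luminance proxy
            hist, bin_edges = np.histogram(intensity.ravel(), bins=256, range=(0.0, 255.0))
            cdf = hist.cumsum() / hist.sum()                   # mapping of the global HE
            equalized = np.interp(intensity.ravel(), bin_edges[:-1], cdf * 255.0)
            equalized = equalized.reshape(intensity.shape)
            ratio = equalized / (intensity + 1e-12)
            out = img * ratio[..., None]                       # identical scaling per channel
            return np.clip(out, 0, 255).astype(np.uint8)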

  16. Peer review of health research funding proposals: A systematic map and systematic review of innovations for effectiveness and efficiency

    PubMed Central

    Frampton, Geoff K.; Pickett, Karen; Wyatt, Jeremy C.

    2018-01-01

    Objective To investigate methods and processes for timely, efficient and good quality peer review of research funding proposals in health. Methods A two-stage evidence synthesis: (1) a systematic map to describe the key characteristics of the evidence base, followed by (2) a systematic review of the studies stakeholders prioritised as relevant from the map on the effectiveness and efficiency of peer review ‘innovations’. Standard processes included literature searching, duplicate inclusion criteria screening, study keyword coding, data extraction, critical appraisal and study synthesis. Results A total of 83 studies from 15 countries were included in the systematic map. The evidence base is diverse, investigating many aspects of the systems for, and processes of, peer review. The systematic review included eight studies from Australia, Canada, and the USA, evaluating a broad range of peer review innovations. These studies showed that simplifying the process by shortening proposal forms, using smaller reviewer panels, or expediting processes can speed up the review process and reduce costs, but this might come at the expense of peer review quality, a key aspect that has not been assessed. Virtual peer review using videoconferencing or teleconferencing appears promising for reducing costs by avoiding the need for reviewers to travel, but again any consequences for quality have not been adequately assessed. Conclusions There is increasing international research activity into the peer review of health research funding. The studies reviewed had methodological limitations and variable generalisability to research funders. Given these limitations it is not currently possible to recommend immediate implementation of these innovations. However, many appear promising based on existing evidence, and could be adapted as necessary by funders and evaluated. Where feasible, experimental evaluation, including randomised controlled trials, should be conducted, evaluating impact on effectiveness, efficiency and quality. PMID:29750807

  17. A Survey and Analysis of Frameworks and Framework Issues for Information Fusion Applications

    NASA Astrophysics Data System (ADS)

    Llinas, James

    This paper was stimulated by the proposed project for the Santander Bank-sponsored "Chairs of Excellence" program in Spain, of which the author is a recipient. That project involves research on characterizing a robust, problem-domain-agnostic framework in which Information Fusion (IF) processes of all descriptions, including artificial intelligence processes and techniques, could be developed. The paper describes the IF process and its requirements, a literature survey on IF frameworks, and a new proposed framework that will be implemented and evaluated at Universidad Carlos III de Madrid, Colmenarejo Campus.

  18. Adaptive nonlinear L2 and L3 filters for speckled image processing

    NASA Astrophysics Data System (ADS)

    Lukin, Vladimir V.; Melnik, Vladimir P.; Chemerovsky, Victor I.; Astola, Jaakko T.

    1997-04-01

    Here we propose adaptive nonlinear filters based on the calculation and analysis of two or three order statistics in a scanning window. They are designed for processing images corrupted by severe speckle noise with non-symmetrical (Rayleigh or one-sided exponential) distribution laws; impulsive noise can also be present. The proposed filtering algorithms provide a trade-off between efficient speckle noise suppression, robustness, good edge/detail preservation, low computational complexity, and preservation of the average level in homogeneous regions of images. Quantitative evaluations of the characteristics of the proposed filters are presented, as well as the results of their application to real synthetic aperture radar and ultrasound medical images.
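
    As a rough sketch of the underlying idea, the following toy L-filter computes each output pixel as a fixed weighted sum of a few order statistics from the scanning window; the paper's adaptive selection of the weights and its speckle-specific tuning are not reproduced:

```python
import numpy as np

def l_filter(image, window=5, ranks=(0.25, 0.5, 0.75), weights=(0.25, 0.5, 0.25)):
    """Minimal sketch of an L-filter: the output pixel is a weighted sum of a few
    order statistics taken from the scanning window. The adaptive weight selection
    of the paper is not reproduced; fixed rank fractions and weights are used."""
    pad = window // 2
    padded = np.pad(image.astype(np.float64), pad, mode="reflect")
    out = np.empty_like(image, dtype=np.float64)
    n = window * window
    idx = [int(r * (n - 1)) for r in ranks]          # positions in the sorted window
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            w = np.sort(padded[i:i + window, j:j + window], axis=None)
            out[i, j] = sum(wt * w[k] for wt, k in zip(weights, idx))
    return out
```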

  19. Development of Chemical Process Design and Control for ...

    EPA Pesticide Factsheets

    This contribution describes a novel process systems engineering framework that couples advanced control with sustainability evaluation and decision making for the optimization of process operations to minimize environmental impacts associated with products, materials, and energy. The implemented control strategy combines a biologically inspired method with optimal control concepts for finding more sustainable operating trajectories. The sustainability assessment of process operating points is carried out by using the U.S. E.P.A.’s Gauging Reaction Effectiveness for the ENvironmental Sustainability of Chemistries with a multi-Objective Process Evaluator (GREENSCOPE) tool, which provides scores for the selected indicators in the economic, material efficiency, environmental and energy areas. The indicator scores describe process performance on a sustainability measurement scale, effectively determining which operating point is more sustainable when more than one steady state exists for manufacturing a specific product. Through comparisons between a representative benchmark and the optimal steady states obtained through implementation of the proposed controller, a systematic decision can be made as to whether the implementation of the controller is moving the process towards a more sustainable operation. The effectiveness of the proposed framework is illustrated through a case study of a continuous fermentation process for fuel production, whose materi

  20. A New Automated Design Method Based on Machine Learning for CMOS Analog Circuits

    NASA Astrophysics Data System (ADS)

    Moradi, Behzad; Mirzaei, Abdolreza

    2016-11-01

    A new simulation-based automated CMOS analog circuit design method, which applies a multi-objective non-Darwinian-type evolutionary algorithm based on the Learnable Evolution Model (LEM), is proposed in this article. The multi-objective property of this automated design of CMOS analog circuits is governed by a modified Strength Pareto Evolutionary Algorithm (SPEA) incorporated in the LEM algorithm presented here. LEM includes a machine learning method, such as decision trees, that distinguishes between high- and low-fitness areas in the design space. The learning process can detect promising directions of evolution and take large steps in the evolution of the individuals. The learning phase shortens the evolution process and remarkably reduces the number of individual evaluations. The expert designer's knowledge of the circuit is applied in the design process in order to reduce the design space as well as the design time. The circuit evaluation is performed by the HSPICE simulator. In order to improve the design accuracy, the bsim3v3 CMOS transistor model is adopted in this proposed design method. The proposed design method is tested on three different operational amplifier circuits, and its performance is verified by comparing it with the evolutionary strategy algorithm and other similar methods.
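
    The core LEM idea, learning where high-fitness designs live and sampling new candidates there, can be sketched as follows for a single objective; the paper's SPEA-based multi-objective handling and its HSPICE circuit evaluation are assumed to exist outside this toy step:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def lem_step(population, fitness, bounds, n_new=50, top_frac=0.3, rng=None):
    """Toy single-objective sketch of a Learnable Evolution Model step (the paper
    combines LEM with a modified SPEA and HSPICE-evaluated circuits, not shown here):
    a decision tree learns to separate high- from low-fitness designs, and new
    candidates are kept only if the tree predicts they fall in the high-fitness region."""
    rng = np.random.default_rng() if rng is None else rng
    cut = np.quantile(fitness, 1.0 - top_frac)
    labels = (fitness >= cut).astype(int)            # 1 = high-fitness area

    tree = DecisionTreeClassifier(max_depth=4).fit(population, labels)

    lo, hi = bounds
    candidates = rng.uniform(lo, hi, size=(n_new * 5, population.shape[1]))
    keep = candidates[tree.predict(candidates) == 1]
    return keep[:n_new]                              # next-generation candidates
```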

  1. An Approach to a Comprehensive Test Framework for Analysis and Evaluation of Text Line Segmentation Algorithms

    PubMed Central

    Brodic, Darko; Milivojevic, Dragan R.; Milivojevic, Zoran N.

    2011-01-01

    The paper introduces a testing framework for the evaluation and validation of text line segmentation algorithms. Text line segmentation represents the key action for correct optical character recognition. Many of the tests for the evaluation of text line segmentation algorithms deal with text databases as reference templates. Because of this mismatch, a reliable testing framework is required. Hence, a new approach to a comprehensive experimental framework for the evaluation of text line segmentation algorithms is proposed. It consists of synthetic multi-line text samples as well as real handwritten text. Although the tests are mutually independent, the results are cross-linked. The proposed method can be used for different types of scripts and languages. Furthermore, two different procedures for the evaluation of algorithm efficiency, based on the obtained error type classification, are proposed. The first is based on the segmentation line error description, while the second one incorporates well-known signal detection theory. Each of them has different capabilities and conveniences, but they can be used as supplements to make the evaluation process efficient. Overall, the proposed procedure based on the segmentation line error description has some advantages, characterized by five measures that describe the measurement procedure. PMID:22164106

  2. An approach to a comprehensive test framework for analysis and evaluation of text line segmentation algorithms.

    PubMed

    Brodic, Darko; Milivojevic, Dragan R; Milivojevic, Zoran N

    2011-01-01

    The paper introduces a testing framework for the evaluation and validation of text line segmentation algorithms. Text line segmentation represents the key action for correct optical character recognition. Many of the tests for the evaluation of text line segmentation algorithms deal with text databases as reference templates. Because of this mismatch, a reliable testing framework is required. Hence, a new approach to a comprehensive experimental framework for the evaluation of text line segmentation algorithms is proposed. It consists of synthetic multi-line text samples as well as real handwritten text. Although the tests are mutually independent, the results are cross-linked. The proposed method can be used for different types of scripts and languages. Furthermore, two different procedures for the evaluation of algorithm efficiency, based on the obtained error type classification, are proposed. The first is based on the segmentation line error description, while the second one incorporates well-known signal detection theory. Each of them has different capabilities and conveniences, but they can be used as supplements to make the evaluation process efficient. Overall, the proposed procedure based on the segmentation line error description has some advantages, characterized by five measures that describe the measurement procedure.

  3. Resource and Performance Evaluations of Fixed Point QRD-RLS Systolic Array through FPGA Implementation

    NASA Astrophysics Data System (ADS)

    Yokoyama, Yoshiaki; Kim, Minseok; Arai, Hiroyuki

    At present, when using space-time processing techniques with multiple antennas for mobile radio communication, real-time weight adaptation is necessary. Due to the progress of integrated circuit technology, dedicated processor implementation with an ASIC or FPGA can be employed to realize various wireless applications. This paper presents a resource and performance evaluation of a QRD-RLS systolic array processor based on a fixed-point CORDIC algorithm with an FPGA. To save hardware resources, we propose a shared architecture for a complex CORDIC processor. The required precision of the internal calculation, the circuit area for the number of antenna elements and wordlength, and the processing speed are evaluated. The resource estimation provides a possible processor configuration with a current FPGA on the market. Computer simulations assuming a fading channel show a fast convergence property with a finite number of training symbols. The proposed architecture has also been implemented, and its operation was verified by a beamforming evaluation through a radio propagation experiment.
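
    For readers unfamiliar with CORDIC, the following floating-point rotation-mode sketch shows the shift-and-add recurrence that the fixed-point complex CORDIC processor implements in hardware; the fixed-point wordlength and the shared complex architecture of the paper are not modeled:

```python
import math

def cordic_rotate(x, y, angle, iterations=16):
    """Minimal rotation-mode CORDIC sketch (floating point, not the fixed-point
    complex CORDIC of the shared systolic-array processor): rotates (x, y) by
    `angle` using only add/subtract steps driven by an arctangent table."""
    # Pre-computed arctan(2**-i) table and the accumulated CORDIC gain.
    atan_table = [math.atan(2.0 ** -i) for i in range(iterations)]
    gain = 1.0
    for i in range(iterations):
        gain *= math.sqrt(1.0 + 2.0 ** (-2 * i))

    z = angle
    for i in range(iterations):
        d = 1.0 if z >= 0 else -1.0
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * atan_table[i]
    return x / gain, y / gain                        # remove the accumulated gain

# Example: rotating (1, 0) by 30 degrees approximately yields (0.866, 0.5).
print(cordic_rotate(1.0, 0.0, math.radians(30)))
```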

  4. Restoration of color in a remote sensing image and its quality evaluation

    NASA Astrophysics Data System (ADS)

    Zhang, Zuxun; Li, Zhijiang; Zhang, Jianqing; Wang, Zhihe

    2003-09-01

    This paper is focused on the restoration of color remote sensing images (including airborne photos). A complete approach is recommended, proposing that two main aspects should be addressed in restoring a remote sensing image: the restoration of spatial information and the restoration of photometric information. In this proposal, the restoration of spatial information is performed by using the modulation transfer function (MTF) as the degradation function, where the MTF is obtained by measuring the edge curve of the original image. The restoration of photometric information is performed by an improved local maximum entropy algorithm. Furthermore, a valid approach to processing color remote sensing images is recommended: the color image is split into three monochromatic images corresponding to the three visible light bands, and the three images are synthesized after being processed separately with a psychological color vision restriction. Finally, three novel evaluation variables based on image restoration are introduced to evaluate restoration quality in terms of both spatial and photometric restoration. An evaluation is provided at the end.
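
    A hedged sketch of the spatial-restoration step, assuming the measured MTF is available as a frequency-domain array on the image's FFT grid and using a simple Wiener filter with an assumed noise-to-signal ratio; the paper's exact restoration formulation may differ:

```python
import numpy as np

def wiener_restore(image, mtf, nsr=0.01):
    """Illustrative Wiener deconvolution using a measured MTF as the degradation
    function. The `mtf` array is assumed to be the system's frequency response
    sampled on the same grid as the image's FFT; `nsr` is an assumed
    noise-to-signal ratio."""
    G = np.fft.fft2(image.astype(np.float64))
    H = mtf.astype(np.complex128)
    # Wiener filter: conj(H) / (|H|^2 + NSR), applied per frequency component.
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifft2(G * W))
```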

  5. 75 FR 382 - Proposed Collection; Comment Request; Process Evaluation of the NIH's Roadmap Interdisciplinary...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-05

    ... submitted to the Office of Management and Budget (OMB) for review and approval. Proposed Collection: The... Investigators, 1; Trainees, 1; Average burden hours per response: 30 minutes; and Estimated total annual burden hours requested: 250 hours. The total annualized cost to respondents (calculated as the number of...

  6. 78 FR 2038 - Notice of Availability of Proposed New Starts and Small Starts Policy Guidance

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-09

    ... policy guidance will accompany the final rule for Major Capital Investment Projects published elsewhere... policy guidance on the review and evaluation process and criteria for major capital investment projects... capital investment program authorized at 49 U.S.C. 5309. Both the new regulation and the proposed policy...

  7. 77 FR 35408 - Proposed Collection; Comment Request: Process Evaluation of the Early Independence Award (EIA...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-13

    ...), will publish periodic summaries of proposed projects to be submitted to the Office of Management and... Independence Principal Investigators, and (5) assess the support provided by the Host Institutions to the Early Independence Principal Investigators. The findings will provide valuable information concerning (1) aspects of...

  8. Field Calibration of Wind Direction Sensor to the True North and Its Application to the Daegwanryung Wind Turbine Test Sites

    PubMed Central

    Lee, Jeong Wan

    2008-01-01

    This paper proposes a field calibration technique for aligning a wind direction sensor to the true north. The proposed technique uses synchronized measurements of images captured by a camera and the output voltage of a wind direction sensor. The true wind direction was estimated from the captured picture of the sensor using image processing techniques in the least-squares sense. The estimated true value was then compared with the measured output voltage of the sensor. This technique solves the discordance problem of the wind direction sensor that arises in the process of installing a meteorological mast. For the proposed technique, uncertainty analyses are presented and the calibration accuracy is discussed. Finally, the proposed technique was applied to the real meteorological mast at the Daegwanryung test site, and statistical analysis of the experimental testing estimated the values of the stable misalignment and the uncertainty level. In a strict sense, it is confirmed that the error range of the misalignment from true north can be expected to decrease within the credibility level. PMID:27873957
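
    A small sketch of the alignment step under simplifying assumptions: given paired image-derived and sensor-reported directions, a constant misalignment is estimated with a circular (wrap-aware) average, playing the role of the least-squares fit described above; the data and variable names are hypothetical:

```python
import numpy as np

def estimate_misalignment(true_deg, sensor_deg):
    """Illustrative estimate of a constant sensor misalignment from paired
    directions (image-derived reference vs. sensor output), using a circular
    mean so that wrap-around at 360 degrees is handled; hypothetical data format."""
    diff = np.radians(np.asarray(sensor_deg) - np.asarray(true_deg))
    offset = np.arctan2(np.mean(np.sin(diff)), np.mean(np.cos(diff)))
    residuals = np.degrees(np.angle(np.exp(1j * (diff - offset))))
    return np.degrees(offset), residuals.std(ddof=1)  # misalignment and its spread

# Example with a simulated 5-degree misalignment plus noise.
rng = np.random.default_rng(0)
ref = rng.uniform(0, 360, 100)
print(estimate_misalignment(ref, ref + 5 + rng.normal(0, 1, 100)))
```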

  9. Analysis on critical success factors for agile manufacturing evaluation in original equipment manufacturing industry-an AHP approach

    NASA Astrophysics Data System (ADS)

    Ajay Guru Dev, C.; Senthil Kumar, V. S.

    2016-09-01

    Manufacturing industries are facing challenges in the implementation of agile manufacturing in their products and processes. Agility is widely accepted as a new competitive concept in the manufacturing sector for fulfilling varying customer demand. Thus, the evaluation of agile manufacturing in industries has become a necessity. The success of an organisation depends on its ability to identify the critical success factors and give them special and continued attention in order to bring about high performance. This paper proposes a set of critical success factors (CSFs) for evaluating agile manufacturing considered appropriate for the manufacturing sector. The analytic hierarchy process (AHP) method is applied to prioritize the success factors by summarizing the opinions of experts. It is believed that the proposed CSFs enable and assist manufacturing industries to achieve a higher performance in agile manufacturing so as to increase competitiveness.
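
    A compact sketch of the AHP prioritization step with hypothetical expert judgements: the priority weights are the normalized principal eigenvector of the pairwise-comparison matrix, together with a consistency check:

```python
import numpy as np

def ahp_weights(pairwise):
    """Minimal AHP sketch: derive priority weights of criteria from a reciprocal
    pairwise-comparison matrix via its principal eigenvector, and report the
    consistency ratio (Saaty's random index for n=4, RI = 0.90, is assumed)."""
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                     # normalized priority vector

    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)             # consistency index
    cr = ci / 0.90                                   # consistency ratio with RI(4)
    return w, cr

# Hypothetical expert judgements for four critical success factors.
M = [[1, 3, 5, 2], [1/3, 1, 3, 1], [1/5, 1/3, 1, 1/2], [1/2, 1, 2, 1]]
print(ahp_weights(M))
```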

  10. A new state evaluation method of oil pump unit based on AHP and FCE

    NASA Astrophysics Data System (ADS)

    Lin, Yang; Liang, Wei; Qiu, Zeyang; Zhang, Meng; Lu, Wenqing

    2017-05-01

    In order to make an accurate state evaluation of oil pump unit, a comprehensive evaluation index should be established. A multi-parameters state evaluation method of oil pump unit is proposed in this paper. The oil pump unit is analyzed by Failure Mode and Effect Analysis (FMEA), so evaluation index can be obtained based on FMEA conclusions. The weights of different parameters in evaluation index are discussed using Analytic Hierarchy Process (AHP) with expert experience. According to the evaluation index and the weight of each parameter, the state evaluation is carried out by Fuzzy Comprehensive Evaluation (FCE) and the state is divided into five levels depending on status value, which is inspired by human body health. In order to verify the effectiveness and feasibility of the proposed method, a state evaluation of oil pump used in a pump station is taken as an example.
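
    A minimal numerical sketch of the FCE step, assuming AHP-derived weights and a hypothetical membership matrix for three parameters over five state levels; the actual index, weights and levels of the paper are not reproduced:

```python
import numpy as np

# Minimal FCE sketch with hypothetical numbers: AHP-style weights for three
# monitored parameters of a pump unit, and a membership matrix R whose rows give
# each parameter's degree of membership in five state levels (good ... failure).
weights = np.array([0.5, 0.3, 0.2])                  # from an AHP step, assumed
R = np.array([
    [0.6, 0.3, 0.1, 0.0, 0.0],                       # vibration
    [0.2, 0.5, 0.2, 0.1, 0.0],                       # bearing temperature
    [0.1, 0.2, 0.4, 0.2, 0.1],                       # discharge pressure
])

membership = weights @ R                             # weighted-average FCE operator
levels = np.array([100, 80, 60, 40, 20])             # scores assigned to the levels
status_value = membership @ levels                   # defuzzified health score
print(membership, status_value)
```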

  11. Exploring Learning-Oriented Assessment Processes

    ERIC Educational Resources Information Center

    Carless, David

    2015-01-01

    This paper proposes a model of learning-oriented assessment to inform assessment theory and practice. The model focuses on three interrelated processes: the assessment tasks which students undertake; students' development of self-evaluative capacities; and student engagement with feedback. These three strands are explored through the analysis of…

  12. Evaluation of the clinical process in a critical care information system using the Lean method: a case study

    PubMed Central

    2012-01-01

    Background There are numerous applications for Health Information Systems (HIS) that support specific tasks in the clinical workflow. The Lean method has been used increasingly to optimize clinical workflows, by removing waste and shortening the delivery cycle time. There are a limited number of studies on Lean applications related to HIS. Therefore, we applied the Lean method to evaluate the clinical processes related to HIS, in order to evaluate its efficiency in removing waste and optimizing the process flow. This paper presents the evaluation findings of these clinical processes, with regards to a critical care information system (CCIS), known as IntelliVue Clinical Information Portfolio (ICIP), and recommends solutions to the problems that were identified during the study. Methods We conducted a case study under actual clinical settings, to investigate how the Lean method can be used to improve the clinical process. We used observations, interviews, and document analysis, to achieve our stated goal. We also applied two tools from the Lean methodology, namely the Value Stream Mapping and the A3 problem-solving tools. We used eVSM software to plot the Value Stream Map and A3 reports. Results We identified a number of problems related to inefficiency and waste in the clinical process, and proposed an improved process model. Conclusions The case study findings show that the Value Stream Mapping and the A3 reports can be used as tools to identify waste and integrate the process steps more efficiently. We also proposed a standardized and improved clinical process model and suggested an integrated information system that combines database and software applications to reduce waste and data redundancy. PMID:23259846

  13. Evaluation of the clinical process in a critical care information system using the Lean method: a case study.

    PubMed

    Yusof, Maryati Mohd; Khodambashi, Soudabeh; Mokhtar, Ariffin Marzuki

    2012-12-21

    There are numerous applications for Health Information Systems (HIS) that support specific tasks in the clinical workflow. The Lean method has been used increasingly to optimize clinical workflows, by removing waste and shortening the delivery cycle time. There are a limited number of studies on Lean applications related to HIS. Therefore, we applied the Lean method to evaluate the clinical processes related to HIS, in order to evaluate its efficiency in removing waste and optimizing the process flow. This paper presents the evaluation findings of these clinical processes, with regards to a critical care information system (CCIS), known as IntelliVue Clinical Information Portfolio (ICIP), and recommends solutions to the problems that were identified during the study. We conducted a case study under actual clinical settings, to investigate how the Lean method can be used to improve the clinical process. We used observations, interviews, and document analysis, to achieve our stated goal. We also applied two tools from the Lean methodology, namely the Value Stream Mapping and the A3 problem-solving tools. We used eVSM software to plot the Value Stream Map and A3 reports. We identified a number of problems related to inefficiency and waste in the clinical process, and proposed an improved process model. The case study findings show that the Value Stream Mapping and the A3 reports can be used as tools to identify waste and integrate the process steps more efficiently. We also proposed a standardized and improved clinical process model and suggested an integrated information system that combines database and software applications to reduce waste and data redundancy.

  14. Evaluation of the traffic parameters in a metropolitan area by fusing visual perceptions and CNN processing of webcam images.

    PubMed

    Faro, Alberto; Giordano, Daniela; Spampinato, Concetto

    2008-06-01

    This paper proposes a traffic monitoring architecture based on a high-speed communication network whose nodes are equipped with fuzzy processors and cellular neural network (CNN) embedded systems. It implements a real-time mobility information system where visual human perceptions sent by people working on the territory and video-sequences of traffic taken from webcams are jointly processed to evaluate the fundamental traffic parameters for every street of a metropolitan area. This paper presents the whole methodology for data collection and analysis and compares the accuracy and the processing time of the proposed soft computing techniques with other existing algorithms. Moreover, this paper discusses when and why it is recommended to fuse the visual perceptions of the traffic with the automated measurements taken from the webcams to compute the maximum traveling time that is likely needed to reach any destination in the traffic network.

  15. A trust region approach with multivariate Padé model for optimal circuit design

    NASA Astrophysics Data System (ADS)

    Abdel-Malek, Hany L.; Ebid, Shaimaa E. K.; Mohamed, Ahmed S. A.

    2017-11-01

    Since the optimization process requires a significant number of consecutive function evaluations, it is recommended to replace the function by an easily evaluated approximation model during the optimization process. The model suggested in this article is based on a multivariate Padé approximation. This model is constructed using data points of ?, where ? is the number of parameters. The model is updated over a sequence of trust regions. This model avoids the slow convergence of linear models of ? and has features of quadratic models that need interpolation data points of ?. The proposed approach is tested by applying it to several benchmark problems. Yield optimization using such a direct method is applied to some practical circuit examples. Minimax solution leads to a suitable initial point to carry out the yield optimization process. The yield is optimized by the proposed derivative-free method for active and passive filter examples.

  16. Development of flow systems by direct-milling on poly(methyl methacrylate) substrates using UV-photopolymerization as sealing process.

    PubMed

    Rodrigues, Eunice R G O; Lapa, Rui A S

    2009-03-01

    An alternative process for the design and construction of fluidic devices is presented. Several sealing processes were studied, as well as the hydrodynamic characteristics of the proposed fluidic devices. Manifolds were imprinted on polymeric substrates by direct-write milling, according to Computer Assisted Design (CAD) data. Poly(methyl methacrylate) (PMMA) was used as substrate due to its physical and chemical properties. Different bonding approaches for the imprinted channels were evaluated and UV-photopolymerization of acrylic acid (AA) was selected. The hydrodynamic characteristics of the proposed flow devices were assessed and compared to those obtained in similar flow systems using PTFE reactors and micro-pumps as propulsion units (multi-pumping approach). The applicability of the imprinted reactors was evaluated in the sequential determination of calcium and magnesium in water samples. Results obtained were in good agreement with those obtained by the reference procedure.

  17. A Verification Method of Inter-Task Cooperation in Embedded Real-time Systems and its Evaluation

    NASA Astrophysics Data System (ADS)

    Yoshida, Toshio

    In the software development process for embedded real-time systems, the design of the task cooperation process is very important. The cooperating process of such tasks is specified by task cooperation patterns. Adoption of unsuitable task cooperation patterns has a fatal influence on system performance, quality, and extendibility. In order to prevent repetitive work caused by a shortage of task cooperation performance, it is necessary to verify task cooperation patterns at an early software development stage. However, it is very difficult to verify task cooperation patterns at an early development stage where the task program codes are not yet complete. Therefore, we propose a verification method using task skeleton program codes and a real-time kernel that has a function for recording all events during software execution, such as system calls issued by task program codes, external interrupts, and timer interrupts. In order to evaluate the proposed verification method, we applied it to the software development process of a mechatronics control system.

  18. Formal implementation of a performance evaluation model for the face recognition system.

    PubMed

    Shin, Yong-Nyuo; Kim, Jason; Lee, Yong-Jun; Shin, Woochang; Choi, Jin-Young

    2008-01-01

    Due to its usability features, practical applications, and lack of intrusiveness, face recognition technology, based on information derived from individuals' facial features, has been attracting considerable attention recently. Reported recognition rates of commercialized face recognition systems cannot be taken as official recognition rates, as they are based on assumptions that are beneficial to the specific system and face database. Therefore, performance evaluation methods and tools are necessary to objectively measure the accuracy and performance of any face recognition system. In this paper, we propose and formalize a performance evaluation model for the biometric recognition system, implementing an evaluation tool for face recognition systems based on the proposed model. Furthermore, we performed objective evaluations by providing guidelines for the design and implementation of a performance evaluation system, formalizing the performance test process.

  19. Comparative Evaluation of Financing Programs: Insights From California’s Experience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deason, Jeff

    Berkeley Lab examines criteria for a comparative assessment of multiple financing programs for energy efficiency, developed through a statewide public process in California. The state legislature directed the California Alternative Energy and Advanced Transportation Financing Authority (CAEATFA) to develop these criteria. CAEATFA's report to the legislature, an invaluable reference for other jurisdictions considering these topics, discusses the proposed criteria and the rationales behind them in detail. Berkeley Lab's brief focuses on several salient issues that emerged during the criteria development and discussion process. Many of these issues are likely to arise in other states that plan to evaluate the impacts of energy efficiency financing programs, whether for a single program or multiple programs. Issues discussed in the brief include: the stakeholder process to develop the proposed assessment criteria; attribution of outcomes, such as energy savings, to financing programs vs. other drivers; choosing the outcome metric of primary interest (program take-up levels vs. savings); the use of net benefits vs. benefit-cost ratios for cost-effectiveness evaluation; non-energy factors; consumer protection factors; market transformation impacts; accommodating varying program goals in a multi-program evaluation; accounting for costs and risks borne by various parties, including taxpayers and utility customers, in cost-effectiveness analysis; and how to account for potential synergies among programs in a multi-program evaluation.

  20. Content-based quality evaluation of color images: overview and proposals

    NASA Astrophysics Data System (ADS)

    Tremeau, Alain; Richard, Noel; Colantoni, Philippe; Fernandez-Maloigne, Christine

    2003-12-01

    The automatic prediction of perceived quality from image data in general, and the assessment of particular image characteristics or attributes that may need improvement in particular, is becoming an increasingly important part of intelligent imaging systems. The purpose of this paper is to propose that the color imaging community develop a software package, available on the internet, to help users select which of these approaches is best suited to a given application. The ultimate goal of this project is to propose, and then implement, an open and unified color imaging system to set up a favourable context for the evaluation and analysis of color imaging processes. Many different methods for measuring the performance of a process have been proposed by different researchers. In this paper, we discuss the advantages and shortcomings of most of the main analysis criteria and performance measures currently used. The aim is not to establish a harsh competition between algorithms or processes, but rather to test and compare the efficiency of methodologies, firstly to highlight the strengths and weaknesses of a given algorithm or methodology on a given image type, and secondly to make these results publicly available. This paper is focused on two important unsolved problems. Why is it so difficult to select a color space that gives better results than another one? Why is it so difficult to select an image quality metric that gives better results than another one, with respect to the judgment of the Human Visual System? Several methods used either in color imaging or in image quality are thus discussed. Proposals for content-based image measures and means of developing a standard test suite are then presented. The above reference advocates an evaluation protocol based on an automated procedure. This is the ultimate goal of our proposal.

  1. A concept taxonomy and an instrument hierarchy: tools for establishing and evaluating the conceptual framework of a patient-reported outcome (PRO) instrument as applied to product labeling claims.

    PubMed

    Erickson, Pennifer; Willke, Richard; Burke, Laurie

    2009-01-01

    To facilitate the development and evaluation of a PRO instrument conceptual framework, we propose two tools: a PRO concept taxonomy and a PRO instrument hierarchy. FDA's draft guidance on patient-reported outcome (PRO) measures states that a clear description of the conceptual framework of an instrument is useful for evaluating its adequacy to support a treatment benefit claim for use in product labeling. The draft guidance, however, does not propose tools for establishing or evaluating a PRO instrument's conceptual framework. We draw from our review of PRO concepts and instruments that appear in prescription drug labeling approved in the United States from 1997 to 2007. We propose taxonomy terms that define relationships between PRO concepts, including "family," "compound concept," and "singular concept." Based on the range of complexity represented by the concepts, as defined by the taxonomy, we propose nine instrument orders for PRO measurement. The nine orders range from individual event counts to multi-item, multiscale instruments. This analysis of PRO concepts and instruments illustrates that the taxonomy and hierarchy are applicable to PRO concepts across a wide range of therapeutic areas and provide a basis for defining instrument conceptual framework complexity. Although the utility of these tools in the drug development, review, and approval processes has not yet been demonstrated, these tools could be useful for improving communication and enhancing efficiency in the instrument development and review process.

  2. Method of evaluation of process of red blood cell sedimentation based on photometry of droplet samples.

    PubMed

    Aristov, Alexander; Nosova, Ekaterina

    2017-04-01

    The paper focuses on research aimed at creating and testing a new approach to evaluating the processes of aggregation and sedimentation of red blood cells for use in clinical laboratory diagnostics. The proposed method is based on photometric analysis of a blood sample formed as a sessile drop. The results of the clinical approbation of this method are given in the paper. The processes occurring in the sessile-drop sample during blood cell sedimentation are analyzed and described. The results of experimental studies evaluating the effect of the droplet sample's focusing properties on light transmittance are presented. It is shown that this method significantly reduces the sample volume and provides sufficiently high sensitivity to the studied processes.

  3. Novel shortcut estimation method for regeneration energy of amine solvents in an absorption-based carbon capture process.

    PubMed

    Kim, Huiyong; Hwang, Sung June; Lee, Kwang Soon

    2015-02-03

    Among various CO2 capture processes, the aqueous amine-based absorption process is considered the most promising for near-term deployment. However, the performance evaluation of newly developed solvents still requires complex and time-consuming procedures, such as pilot plant tests or the development of a rigorous simulator. The absence of accurate and simple methods for calculating the energy performance at an early stage of process development has lengthened, and increased the expense of, the development of economically feasible CO2 capture processes. In this paper, a novel but simple method to reliably calculate the regeneration energy in a standard amine-based carbon capture process is proposed. Careful examination of stripper behavior and exploitation of energy balance equations around the stripper allow the regeneration energy to be calculated using only vapor-liquid equilibrium and caloric data. The reliability of the proposed method was confirmed by comparison with rigorous simulations for two well-known solvents, monoethanolamine (MEA) and piperazine (PZ). The proposed method can predict the regeneration energy at various operating conditions with greater simplicity, greater speed, and higher accuracy than those proposed in previous studies. This enables faster and more precise screening of various solvents and faster optimization of process variables, and can eventually accelerate the development of economically deployable CO2 capture processes.

  4. A Proposal for Evaluating Cognition in Assertiveness

    ERIC Educational Resources Information Center

    Vagos, Paula; Pereira, Anabela

    2010-01-01

    This article presents the development process and initial psychometric features of an instrument for evaluating cognition in assertiveness. This is an essential social skill for adolescent development and seems to encompass emotional, behavioral, and cognitive aspects. The instrument was created by combining both empirical and theoretical methods…

  5. College Students' Instructional Expectations and Evaluations.

    ERIC Educational Resources Information Center

    Calista, Donald J.

    Typical end-of-course faculty ratings were questioned for their inability to measure actual classroom interaction. Extending the concept of these evaluations to include the student instructional expectations dimension, the study proposed that the classroom experience be related to the process and systems approaches, more dependent upon monitoring…

  6. 76 FR 59420 - Proposed Information Collection; Alaska Guide Service Evaluation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-26

    ... Office of Management and Budget (OMB) to approve the information collection (IC) described below. As... lands, we issue permits for commercial guide services, including big game hunting, sport fishing... information during the competitive selection process for big game and sport fishing guide permits to evaluate...

  7. Twitter Micro-Blogging Based Mobile Learning Approach to Enhance the Agriculture Education Process

    ERIC Educational Resources Information Center

    Dissanayeke, Uvasara; Hewagamage, K. P.; Ramberg, Robert; Wikramanayake, G. N.

    2013-01-01

    The study intends to see how to introduce mobile learning within the domain of agriculture so as to enhance the agriculture education process. We propose to use the Activity theory together with other methodologies such as participatory methods to design, implement, and evaluate mLearning activities. The study explores the process of introducing…

  8. Analysis and modeling of wafer-level process variability in 28 nm FD-SOI using split C-V measurements

    NASA Astrophysics Data System (ADS)

    Pradeep, Krishna; Poiroux, Thierry; Scheer, Patrick; Juge, André; Gouget, Gilles; Ghibaudo, Gérard

    2018-07-01

    This work details the analysis of wafer level global process variability in 28 nm FD-SOI using split C-V measurements. The proposed approach initially evaluates the native on wafer process variability using efficient extraction methods on split C-V measurements. The on-wafer threshold voltage (VT) variability is first studied and modeled using a simple analytical model. Then, a statistical model based on the Leti-UTSOI compact model is proposed to describe the total C-V variability in different bias conditions. This statistical model is finally used to study the contribution of each process parameter to the total C-V variability.

  9. Small Business Innovation Research. Program solicitation. Closing date: July 22, 1988

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The sixth annual Small Business Innovation Research (SBIR) solicitation by NASA, describes the program, identifies eligibility requirements, outlines proposal preparation and submission requirements, describes the proposal evaluation and award selection process, and provides other information to assist those interested in participating in the SBIR program. It also identifies in Section 8.0 and Appendix D, the specific technical topics and subtopics in which SBIR Phase 1 proposals are solicited in 1988.

  10. Implementation of an innovative teaching project in a Chemical Process Design course at the University of Cantabria, Spain

    NASA Astrophysics Data System (ADS)

    Galan, Berta; Muñoz, Iciar; Viguri, Javier R.

    2016-09-01

    This paper shows the planning, the teaching activities and the evaluation of the learning and teaching process implemented in the Chemical Process Design course at the University of Cantabria, Spain. Educational methods to address the knowledge, skills and attitudes that students who complete the course are expected to acquire are proposed and discussed. Undergraduate and graduate engineers' perceptions of the methodology used are evaluated by means of a questionnaire. Results of the teaching activities and the strengths and weaknesses of the proposed case study are discussed in relation to the course characteristics. The findings of the empirical evaluation show that the excessive time students had to dedicate to the case study project and dealing with limited information were the most negative aspects, whereas an increase in the students' self-confidence and the practical application of the methodology were the most positive aspects. Finally, improvements are discussed in order to extend the application of the methodology to other courses offered as part of the chemical engineering degree.

  11. Design and information requirements for travel and tourism needs on scenic byways.

    DOT National Transportation Integrated Search

    1994-01-01

    The purpose of this study was to develop a system design and information evaluation process that could be used to review proposed or designated scenic byways. The process was intended to ensure that the geometric and traffic design of these roads wer...

  12. Presidential Primaries: Front-Loaded Fiascoes?

    ERIC Educational Resources Information Center

    Gans, Curtis

    1996-01-01

    Criticizes the presidential primary process as possibly leading to the destruction of the two-party system. Claims that the current process limits the competition to the rich and famous, enhances the worst aspects of campaigning, and evaluates candidates purely on their political skills. Briefly discusses some reform proposals. (MJP)

  13. A Computational Evaluation of Sentence Processing Deficits in Aphasia

    ERIC Educational Resources Information Center

    Patil, Umesh; Hanne, Sandra; Burchert, Frank; De Bleser, Ria; Vasishth, Shravan

    2016-01-01

    Individuals with agrammatic Broca's aphasia experience difficulty when processing reversible non-canonical sentences. Different accounts have been proposed to explain this phenomenon. The Trace Deletion account (Grodzinsky, 1995, 2000, 2006) attributes this deficit to an impairment in syntactic representations, whereas others (e.g., Caplan,…

  14. A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine.

    PubMed

    Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun

    2017-02-06

    In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic-based motion synthesis algorithm and biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human-machine-environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines.
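
    A toy sketch of probabilistic motion synthesis by Gaussian process regression, using scikit-learn and hypothetical inputs (barbell load and squat depth) mapped to a single motion parameter; the paper's actual independent variables and full-motion output are not reproduced:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Toy sketch of Gaussian-process-regression motion synthesis: map two
# hypothetical inputs (barbell load in kg, squat depth as a fraction) to a
# single hypothetical motion parameter (peak knee flexion in degrees).
X = np.array([[20, 0.8], [40, 0.9], [60, 1.0], [80, 1.1]])   # training conditions
Y = np.array([[95.0], [102.0], [110.0], [121.0]])            # observed motion values

gp = GaussianProcessRegressor(kernel=RBF(length_scale=[30.0, 0.2]) + WhiteKernel(1.0),
                              normalize_y=True).fit(X, Y)

mean, std = gp.predict(np.array([[50, 0.95]]), return_std=True)
print(mean, std)                                     # synthesized value + uncertainty
```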

  15. A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine

    PubMed Central

    Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun

    2017-01-01

    In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic-based motion synthesis algorithm and biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human–machine–environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines. PMID:28178184

  16. An adaptive semantic based mediation system for data interoperability among Health Information Systems.

    PubMed

    Khan, Wajahat Ali; Khattak, Asad Masood; Hussain, Maqbool; Amin, Muhammad Bilal; Afzal, Muhammad; Nugent, Christopher; Lee, Sungyoung

    2014-08-01

    Heterogeneity in the management of complex medical data obstructs the attainment of data-level interoperability among Health Information Systems (HIS). This diversity is dependent on the compliance of HISs with different healthcare standards. Its solution demands a mediation system for the accurate interpretation of data in different heterogeneous formats to achieve data interoperability. We propose an adaptive AdapteR Interoperability ENgine mediation system, called ARIEN, that arbitrates between HISs compliant with different healthcare standards to achieve accurate and seamless information exchange and thus data interoperability. ARIEN stores the semantic mapping information between different standards in the Mediation Bridge Ontology (MBO) using ontology matching techniques. These mappings are provided by our System for Parallel Heterogeneity (SPHeRe) matching system and the Personalized-Detailed Clinical Model (P-DCM) approach to guarantee the accuracy of the mappings. The effectiveness of the mappings stored in the MBO is realized by evaluating the accuracy of the transformation process among different standard formats. We evaluated our proposed system on the transformation of medical records between the Clinical Document Architecture (CDA) and Virtual Medical Record (vMR) standards. The transformation process achieved over 90% accuracy in the conversion between the CDA and vMR standards using a pattern-oriented approach from the MBO. The proposed mediation system improves the overall communication process between HISs. It provides accurate and seamless medical information exchange to ensure data interoperability and timely healthcare services to patients.

  17. Synthesis method from low-coherence digital holograms for improvement of image quality in holographic display.

    PubMed

    Mori, Yutaka; Nomura, Takanori

    2013-06-01

    In holographic displays, it is undesirable to observe speckle noise in the reconstructed images. A method for improving reconstructed image quality by synthesizing low-coherence digital holograms is proposed. Low-coherence digital holography makes it possible to obtain speckleless reconstruction of holograms. An image sensor records low-coherence digital holograms, and the holograms are synthesized by computational calculation. Two approaches, the threshold-processing and the picking-a-peak methods, are proposed in order to reduce the random noise of low-coherence digital holograms. The reconstructed image quality obtained by the proposed methods is compared with the case of high-coherence digital holography. Quantitative evaluation is given to confirm the proposed methods. In addition, a visual evaluation by 15 people is also shown.

  18. Two-structured solid particle model for predicting and analyzing supercritical extraction performance.

    PubMed

    Samadi, Sara; Vaziri, Behrooz Mahmoodzadeh

    2017-07-14

    Solid extraction using a supercritical fluid is a modern science and technology that has come into vogue owing to its considerable advantages. In the present article, a new and comprehensive model is presented for predicting the performance and separation yield of the supercritical extraction process. The basis of the process model is partial differential mass balances. In the proposed model, the solid particles are considered twofold: (a) particles with an intact structure, and (b) particles with a destructed structure. A distinct mass transfer coefficient is used for extraction from each part of the solid particles to express the different extraction regimes and to evaluate the process accurately (an internal mass transfer coefficient for the intact-structure particles and an external mass transfer coefficient for the destructed-structure particles). In order to evaluate and validate the proposed model, the simulation results were compared with two series of available experimental data for the extraction of chamomile extract with supercritical carbon dioxide, showing excellent agreement. This indicates the high capability of the model to predict the extraction process precisely. The effect of major parameters on the supercritical extraction process, such as pressure, temperature, supercritical fluid flow rate, and the size of solid particles, was then evaluated. The model can be used as a superb starting point for scientific and experimental applications. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. An Empirical Investigation of Entrepreneurship Intensity in Iranian State Universities

    ERIC Educational Resources Information Center

    Mazdeh, Mohammad Mahdavi; Razavi, Seyed-Mostafa; Hesamamiri, Roozbeh; Zahedi, Mohammad-Reza; Elahi, Behin

    2013-01-01

    The purpose of this study is to propose a framework to evaluate the entrepreneurship intensity (EI) of Iranian state universities. In order to determine EI, a hybrid multi-method framework consisting of Delphi, Analytic Network Process (ANP), and VIKOR is proposed. The Delphi method is used to localize and reduce the number of criteria extracted…

  20. The Impact of a Scaffolded Assessment Intervention on Students' Academic Achievement in Web-Based Peer Assessment Activities

    ERIC Educational Resources Information Center

    Lee, Chien-I; Yang, Ya-Fei; Mai, Shin-Yi

    2016-01-01

    Web-based peer assessment has been considered an important process for learning. However, students may not offer constructive feedback due to lack of expertise knowledge. Therefore, this study proposed a scaffolded assessment approach accordingly. To evaluate the effectiveness of the proposed approach, the quasi-experimental design was employed to…

  1. 77 FR 19391 - Notice of Proposed Intelligent Mail Indicia Performance Criteria With Request for Comments

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-30

    ... products designed to meet new customer needs for access to postage. In addition, changes within the United... opportunities for PES providers to propose new concepts, methods, and processes to enable customers to print pre... support the USPS PES Test and Evaluation Program (the ``Program''). The intent is for the volumes to fully...

  2. Using Photo-Interviewing as Tool for Research and Evaluation.

    ERIC Educational Resources Information Center

    Dempsey, John V.; Tucker, Susan A.

    Arguing that photo-interviewing yields richer data than that usually obtained from verbal interviewing procedures alone, it is proposed that this method of data collection be added to "standard" methodologies in instructional development research and evaluation. The process, as described in this paper, consists of using photographs of…

  3. Designing, Teaching, and Evaluating Two Complementary Mixed Methods Research Courses

    ERIC Educational Resources Information Center

    Christ, Thomas W.

    2009-01-01

    Teaching mixed methods research is difficult. This longitudinal explanatory study examined how two classes were designed, taught, and evaluated. Curriculum, Research, and Teaching (EDCS-606) and Mixed Methods Research (EDCS-780) used a research proposal generation process to highlight the importance of the purpose, research question and…

  4. Low-power, high-speed 1-bit inexact Full Adder cell designs applicable to low-energy image processing

    NASA Astrophysics Data System (ADS)

    Zareei, Zahra; Navi, Keivan; Keshavarziyan, Peiman

    2018-03-01

    In this paper, three novel low-power and high-speed 1-bit inexact Full Adder cell designs are presented, based on current-mode logic in 32 nm carbon nanotube field effect transistor technology, for the first time. The circuit-level figures of merit, i.e. power, delay and power-delay product, as well as the application-level metric of error distance, are considered to assess the efficiency of the proposed cells over their counterparts. The effect of voltage scaling and temperature variation on the proposed cells is studied using the HSPICE tool. Moreover, using the MATLAB tool, the peak signal-to-noise ratio of the proposed cells is evaluated in an image-processing application, a motion detector. Simulation results confirm the efficiency of the proposed cells.
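
    The application-level evaluation relies on the standard peak signal-to-noise ratio; a small reference implementation (assuming 8-bit grayscale images of identical shape) is sketched below:

```python
import numpy as np

def psnr(reference, approximate, peak=255.0):
    """Standard peak signal-to-noise ratio used to judge how much an inexact
    (approximate) adder degrades an image-processing result; inputs are assumed
    to be 8-bit grayscale arrays of identical shape."""
    err = np.mean((reference.astype(np.float64) - approximate.astype(np.float64)) ** 2)
    if err == 0:
        return float("inf")                          # identical images
    return 10.0 * np.log10(peak ** 2 / err)
```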

  5. Cache-Oblivious parallel SIMD Viterbi decoding for sequence search in HMMER.

    PubMed

    Ferreira, Miguel; Roma, Nuno; Russo, Luis M S

    2014-05-30

    HMMER is a commonly used bioinformatics tool based on Hidden Markov Models (HMMs) to analyze and process biological sequences. One of its main homology engines is based on the Viterbi decoding algorithm, which was already highly parallelized and optimized using Farrar's striped processing pattern with Intel SSE2 instruction set extension. A new SIMD vectorization of the Viterbi decoding algorithm is proposed, based on an SSE2 inter-task parallelization approach similar to the DNA alignment algorithm proposed by Rognes. Besides this alternative vectorization scheme, the proposed implementation also introduces a new partitioning of the Markov model that allows a significantly more efficient exploitation of the cache locality. Such optimization, together with an improved loading of the emission scores, allows the achievement of a constant processing throughput, regardless of the innermost-cache size and of the dimension of the considered model. The proposed optimized vectorization of the Viterbi decoding algorithm was extensively evaluated and compared with the HMMER3 decoder to process DNA and protein datasets, proving to be a rather competitive alternative implementation. Being always faster than the already highly optimized ViterbiFilter implementation of HMMER3, the proposed Cache-Oblivious Parallel SIMD Viterbi (COPS) implementation provides a constant throughput and offers a processing speedup as high as two times faster, depending on the model's size.
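
    For orientation, a plain textbook Viterbi decoder in log space is sketched below; it is only a scalar reference point, not the striped SSE2 or cache-oblivious COPS implementation discussed in the abstract:

```python
import numpy as np

def viterbi(obs, log_start, log_trans, log_emit):
    """Plain textbook Viterbi decoder in log space (a scalar reference point only;
    the COPS work vectorizes this recurrence across tasks with SSE2 and a
    cache-oblivious model partitioning). Returns the most likely state path."""
    n_states = log_start.shape[0]
    T = len(obs)
    dp = np.full((T, n_states), -np.inf)
    back = np.zeros((T, n_states), dtype=int)

    dp[0] = log_start + log_emit[:, obs[0]]
    for t in range(1, T):
        scores = dp[t - 1][:, None] + log_trans      # all previous-state transitions
        back[t] = np.argmax(scores, axis=0)
        dp[t] = scores[back[t], np.arange(n_states)] + log_emit[:, obs[t]]

    path = [int(np.argmax(dp[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```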

  6. An Approach for Integrating the Prioritization of Functional and Nonfunctional Requirements

    PubMed Central

    Dabbagh, Mohammad; Lee, Sai Peck

    2014-01-01

    Due to the budgetary deadlines and time to market constraints, it is essential to prioritize software requirements. The outcome of requirements prioritization is an ordering of requirements which need to be considered first during the software development process. To achieve a high quality software system, both functional and nonfunctional requirements must be taken into consideration during the prioritization process. Although several requirements prioritization methods have been proposed so far, no particular method or approach is presented to consider both functional and nonfunctional requirements during the prioritization stage. In this paper, we propose an approach which aims to integrate the process of prioritizing functional and nonfunctional requirements. The outcome of applying the proposed approach produces two separate prioritized lists of functional and non-functional requirements. The effectiveness of the proposed approach has been evaluated through an empirical experiment aimed at comparing the approach with the two state-of-the-art-based approaches, analytic hierarchy process (AHP) and hybrid assessment method (HAM). Results show that our proposed approach outperforms AHP and HAM in terms of actual time-consumption while preserving the quality of the results obtained by our proposed approach at a high level of agreement in comparison with the results produced by the other two approaches. PMID:24982987

  7. An approach for integrating the prioritization of functional and nonfunctional requirements.

    PubMed

    Dabbagh, Mohammad; Lee, Sai Peck

    2014-01-01

    Due to the budgetary deadlines and time to market constraints, it is essential to prioritize software requirements. The outcome of requirements prioritization is an ordering of requirements which need to be considered first during the software development process. To achieve a high quality software system, both functional and nonfunctional requirements must be taken into consideration during the prioritization process. Although several requirements prioritization methods have been proposed so far, no particular method or approach is presented to consider both functional and nonfunctional requirements during the prioritization stage. In this paper, we propose an approach which aims to integrate the process of prioritizing functional and nonfunctional requirements. The outcome of applying the proposed approach produces two separate prioritized lists of functional and non-functional requirements. The effectiveness of the proposed approach has been evaluated through an empirical experiment aimed at comparing the approach with the two state-of-the-art-based approaches, analytic hierarchy process (AHP) and hybrid assessment method (HAM). Results show that our proposed approach outperforms AHP and HAM in terms of actual time-consumption while preserving the quality of the results obtained by our proposed approach at a high level of agreement in comparison with the results produced by the other two approaches.

  8. Large Scale Frequent Pattern Mining using MPI One-Sided Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vishnu, Abhinav; Agarwal, Khushbu

    In this paper, we propose a work-stealing runtime, the Library for Work Stealing (LibWS), using the MPI one-sided model for designing scalable FP-Growth, the de facto frequent pattern mining algorithm, on large scale systems. LibWS provides locality-efficient and highly scalable work-stealing techniques for load balancing on a variety of data distributions. We also propose a novel communication algorithm for the FP-Growth data exchange phase, which reduces the communication complexity from the state-of-the-art O(p) to O(f + p/f) for p processes and f frequent attribute-ids. FP-Growth is implemented using LibWS and evaluated on several work distributions and support counts. An experimental evaluation of FP-Growth on LibWS using 4096 processes on an InfiniBand cluster demonstrates excellent efficiency for several work distributions (87% efficiency for Power-law and 91% for Poisson). The proposed distributed FP-Tree merging algorithm provides a 38x communication speedup on 4096 cores.
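
    As a small single-process illustration of the algorithm this record parallelizes, the sketch below shows only the first phase of FP-Growth: counting frequent items and reordering each transaction by descending support before FP-tree construction. It does not reproduce LibWS, the MPI one-sided work stealing, or the O(f + p/f) exchange scheme.

    ```python
    from collections import Counter

    def first_scan(transactions, min_support):
        """First pass of FP-Growth: find frequent items and reorder
        each transaction by descending support (ties broken lexically)."""
        counts = Counter(item for t in transactions for item in t)
        frequent = {i: c for i, c in counts.items() if c >= min_support}
        order = sorted(frequent, key=lambda i: (-frequent[i], i))
        rank = {item: r for r, item in enumerate(order)}
        reordered = [sorted((i for i in t if i in rank), key=rank.get)
                     for t in transactions]
        return frequent, reordered

    transactions = [["a", "b", "d"], ["b", "c"], ["a", "b", "c"], ["a", "c"]]
    print(first_scan(transactions, min_support=2))
    ```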

  9. A Collaborative Secure Localization Algorithm Based on Trust Model in Underwater Wireless Sensor Networks

    PubMed Central

    Han, Guangjie; Liu, Li; Jiang, Jinfang; Shu, Lei; Rodrigues, Joel J.P.C.

    2016-01-01

    Localization is one of the hottest research topics in Underwater Wireless Sensor Networks (UWSNs), since many important applications of UWSNs, e.g., event sensing, target tracking and monitoring, require location information of sensor nodes. Nowadays, a large number of localization algorithms have been proposed for UWSNs. How to improve location accuracy has been well studied. However, few of them take location reliability or security into consideration. In this paper, we propose a Collaborative Secure Localization algorithm based on Trust model (CSLT) for UWSNs to ensure location security. Based on the trust model, the secure localization process can be divided into the following five sub-processes: trust evaluation of anchor nodes, initial localization of unknown nodes, trust evaluation of reference nodes, selection of reference nodes, and secondary localization of unknown nodes. Simulation results demonstrate that the proposed CSLT algorithm performs better than the compared related works in terms of location security, average localization accuracy and localization ratio. PMID:26891300
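
    The localization sub-processes above ultimately position an unknown node from distances to trusted anchor or reference nodes. One plausible realization of that step, trust-weighted linear least-squares trilateration, is sketched below; the weighting scheme and the trust values are assumptions for illustration and are not the CSLT trust model itself.

    ```python
    import numpy as np

    def weighted_trilateration(anchors, dists, trust):
        """Estimate a 2-D position from anchor positions and range
        measurements, weighting each anchor by a trust score in [0, 1].

        Linearizes the range equations against the last anchor and
        solves a weighted least-squares problem."""
        anchors = np.asarray(anchors, float)
        d = np.asarray(dists, float)
        w = np.asarray(trust, float)
        xn, yn = anchors[-1]
        dn = d[-1]
        A, b, weights = [], [], []
        for (xi, yi), di, wi in zip(anchors[:-1], d[:-1], w[:-1]):
            A.append([2 * (xn - xi), 2 * (yn - yi)])
            b.append(di**2 - dn**2 - xi**2 - yi**2 + xn**2 + yn**2)
            weights.append(np.sqrt(wi * w[-1]))
        A = np.asarray(A) * np.asarray(weights)[:, None]
        b = np.asarray(b) * np.asarray(weights)
        pos, *_ = np.linalg.lstsq(A, b, rcond=None)
        return pos

    anchors = [(0, 0), (10, 0), (0, 10), (10, 10)]
    true = np.array([3.0, 4.0])
    dists = [np.linalg.norm(true - a) for a in anchors]
    print(weighted_trilateration(anchors, dists, trust=[0.9, 0.8, 0.95, 0.7]))
    ```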

  10. Automated grain extraction and classification by combining improved region growing segmentation and shape descriptors in electromagnetic mill classification system

    NASA Astrophysics Data System (ADS)

    Budzan, Sebastian

    2018-04-01

    In this paper, the automatic method of grain detection and classification has been presented. As input, it uses a single digital image obtained from the milling process of copper ore with a high-quality digital camera. The grinding process is extremely energy- and cost-consuming, thus the granularity evaluation process should be performed with high efficiency and low time consumption. The method proposed in this paper is based on three-stage image processing. First, all grains are detected using Seeded Region Growing (SRG) segmentation with the proposed adaptive thresholding based on the calculation of the Relative Standard Deviation (RSD). In the next step, the detection results are improved using information about the shape of the detected grains obtained from a distance map. Finally, each grain in the sample is classified into one of the predefined granularity classes. The quality of the proposed method has been assessed by using nominal granularity samples and by comparison with other methods.
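
    A compact sketch of the segmentation idea described above (seeded region growing with an adaptive tolerance derived from the relative standard deviation) follows. The neighborhood, seed handling, and the exact way the RSD sets the tolerance are illustrative assumptions rather than the paper's tuned procedure.

    ```python
    import numpy as np
    from collections import deque

    def rsd(values):
        """Relative standard deviation (%) of a set of intensities."""
        values = np.asarray(values, float)
        return 100.0 * values.std() / (values.mean() + 1e-12)

    def region_grow(img, seed, scale=0.5):
        """Seeded region growing: a 4-connected pixel joins the region if it
        lies within an adaptive intensity tolerance of the seed. The tolerance
        is scale * RSD of a 3x3 patch around the seed (an illustrative rule)."""
        h, w = img.shape
        sy, sx = seed
        patch = img[max(sy - 1, 0):sy + 2, max(sx - 1, 0):sx + 2]
        tol = scale * rsd(patch) / 100.0 * img[sy, sx]
        mask = np.zeros_like(img, dtype=bool)
        queue = deque([seed])
        mask[seed] = True
        while queue:
            y, x = queue.popleft()
            for ny, nx in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
                if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx]:
                    if abs(float(img[ny, nx]) - float(img[sy, sx])) <= tol:
                        mask[ny, nx] = True
                        queue.append((ny, nx))
        return mask

    img = np.zeros((32, 32)) + 200.0
    img[8:20, 8:20] = 80.0            # a dark "grain" on a bright background
    print(region_grow(img, seed=(12, 12)).sum())   # pixels assigned to the grain
    ```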

  11. Establishment and assessment of a novel cleaner production process of corn grain fuel ethanol.

    PubMed

    Wang, Ke; Zhang, Jianhua; Tang, Lei; Zhang, Hongjian; Zhang, Guiying; Yang, Xizhao; Liu, Pei; Mao, Zhonggui

    2013-11-01

    An integrated corn ethanol-methane fermentation system was proposed to solve the problem of stillage handling, where thin stillage was treated by anaerobic digestion and then reused to make mash for the following ethanol fermentation. This system was evaluated at laboratory and pilot scale. Anaerobic digestion of thin stillage ran steadily with total chemical oxygen demand removal efficiency of 98% at laboratory scale and 97% at pilot scale. Ethanol production was not influenced by recycling anaerobic digestion effluent at laboratory and pilot scale. Compared with dried distillers' grains with solubles produced in conventional process, dried distillers' grains in the proposed system exhibited higher quality because of increased protein concentration and decreased salts concentration. Energetic assessment indicated that application of this novel process enhanced the net energy balance ratio from 1.26 (conventional process) to 1.76. In conclusion, the proposed system possessed technical advantage over the conventional process for corn fuel ethanol production. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. How can systems engineering inform the methods of programme evaluation in health professions education?

    PubMed

    Rojas, David; Grierson, Lawrence; Mylopoulos, Maria; Trbovich, Patricia; Bagli, Darius; Brydges, Ryan

    2018-04-01

    We evaluate programmes in health professions education (HPE) to determine their effectiveness and value. Programme evaluation has evolved from use of reductionist frameworks to those addressing the complex interactions between programme factors. Researchers in HPE have recently suggested a 'holistic programme evaluation' aiming to better describe and understand the implications of 'emergent processes and outcomes'. We propose a programme evaluation framework informed by principles and tools from systems engineering. Systems engineers conceptualise complexity and emergent elements in unique ways that may complement and extend contemporary programme evaluations in HPE. We demonstrate how the abstract decomposition space (ADS), an engineering knowledge elicitation tool, provides the foundation for a systems engineering informed programme evaluation designed to capture both planned and emergent programme elements. We translate the ADS tool to use education-oriented language, and describe how evaluators can use it to create a programme-specific ADS through iterative refinement. We provide a conceptualisation of emergent elements and an equation that evaluators can use to identify the emergent elements in their programme. Using our framework, evaluators can analyse programmes not as isolated units with planned processes and planned outcomes, but as unfolding, complex interactive systems that will exhibit emergent processes and emergent outcomes. Subsequent analysis of these emergent elements will inform the evaluator as they seek to optimise and improve the programme. Our proposed systems engineering informed programme evaluation framework provides principles and tools for analysing the implications of planned and emergent elements, as well as their potential interactions. We acknowledge that our framework is preliminary and will require application and constant refinement. We suggest that our framework will also advance our understanding of the construct of 'emergence' in HPE research. © 2017 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  13. Lazy checkpoint coordination for bounding rollback propagation

    NASA Technical Reports Server (NTRS)

    Wang, Yi-Min; Fuchs, W. Kent

    1992-01-01

    Independent checkpointing allows maximum process autonomy but suffers from potential domino effects. Coordinated checkpointing eliminates the domino effect by sacrificing a certain degree of process autonomy. In this paper, we propose the technique of lazy checkpoint coordination which preserves process autonomy while employing communication-induced checkpoint coordination for bounding rollback propagation. The introduction of the notion of laziness allows a flexible trade-off between the cost for checkpoint coordination and the average rollback distance. Worst-case overhead analysis provides a means for estimating the extra checkpoint overhead. Communication trace-driven simulation for several parallel programs is used to evaluate the benefits of the proposed scheme for real applications.

  14. The philosophy of benchmark testing a standards-based picture archiving and communications system.

    PubMed

    Richardson, N E; Thomas, J A; Lyche, D K; Romlein, J; Norton, G S; Dolecek, Q E

    1999-05-01

    The Department of Defense issued its requirements for a Digital Imaging Network-Picture Archiving and Communications System (DIN-PACS) in a Request for Proposals (RFP) to industry in January 1997, with subsequent contracts being awarded in November 1997 to the Agfa Division of Bayer and IBM Global Government Industry. The Government's technical evaluation process consisted of evaluating a written technical proposal as well as conducting a benchmark test of each proposed system at the vendor's test facility. The purpose of benchmark testing was to evaluate the performance of the fully integrated system in a simulated operational environment. The benchmark test procedures and test equipment were developed through a joint effort between the Government, academic institutions, and private consultants. Herein the authors discuss the resources required and the methods used to benchmark test a standards-based PACS.

  15. Moving from Cognition to Behavior: What the Research Says

    ERIC Educational Resources Information Center

    Johnson, Russell E.; Chang, Chu-Hsiang; Lord, Robert G.

    2006-01-01

    In 1994, R. G. Lord and P. E. Levy proposed a variant of control theory that incorporated human information processing principles. The current article evaluates the empirical evidence for their propositions and updates the theory by considering contemporary research on information processing. Considerable support drawing from diverse literatures…

  16. Fuzzy Relational Databases: Representational Issues and Reduction Using Similarity Measures.

    ERIC Educational Resources Information Center

    Prade, Henri; Testemale, Claudette

    1987-01-01

    Compares and expands upon two approaches to dealing with fuzzy relational databases. The proposed similarity measure is based on a fuzzy Hausdorff distance and estimates the mismatch between two possibility distributions using a reduction process. The consequences of the reduction process on query evaluation are studied. (Author/EM)

  17. 48 CFR 915.207-70 - Handling proposals and information during evaluation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... information (data) before decision as to the award of a contract, or the transfer of valuable and sensitive information between competing offerors during the competitive phase of the acquisition process, would seriously disrupt the Government's decision-making process and undermine the integrity of the competitive...

  18. 47 CFR 17.4 - Antenna structure registration.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... provisions of the relevant local zoning process. The local notice shall contain all of the descriptive... not adequately evaluate the potentially significant environmental effects of the proposal. The Request...

  19. Community Currency Trading Method through Partial Transaction Intermediary Process

    NASA Astrophysics Data System (ADS)

    Kido, Kunihiko; Hasegawa, Seiichi; Komoda, Norihisa

    A community currency is local money issued by local governments or Non-Profit Organizations (NPOs) to support social services. The purpose of introducing community currencies is to regenerate communities by fostering mutual aid among community members. In this paper, we propose a community currency trading method based on a partial transaction intermediary process, for operational environments in which coordinators are not present all the time. In this method, coordinators mediate between service users and service providers during the first several months of transactions. After this coordination period, participants spontaneously make transactions based on their trust area and on a trust evaluation method that uses the number of provided services and complaint information. This method is especially effective for communities with close social networks and low trustworthiness. The proposed method is evaluated through multi-agent simulation.
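
    The trust evaluation after the coordination period is based on counts of provided services and complaints, but the abstract does not give the formula; the sketch below therefore uses a deliberately simple, hypothetical score (a smoothed success ratio) only to illustrate the shape of such a rule.

    ```python
    def trust_score(services_provided, complaints, prior=1.0):
        """Hypothetical trust score in (0, 1): a Laplace-smoothed ratio of
        satisfactory services to all recorded interactions."""
        return (services_provided + prior) / (services_provided + complaints + 2 * prior)

    for provided, complaints in [(0, 0), (10, 1), (10, 6)]:
        print(provided, complaints, round(trust_score(provided, complaints), 2))
    ```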

  20. An Experience Oriented-Convergence Improved Gravitational Search Algorithm for Minimum Variance Distortionless Response Beamforming Optimum.

    PubMed

    Darzi, Soodabeh; Tiong, Sieh Kiong; Tariqul Islam, Mohammad; Rezai Soleymanpour, Hassan; Kibria, Salehin

    2016-01-01

    An experience oriented-convergence improved gravitational search algorithm (ECGSA) based on two new modifications, searching through the best experiences and using a dynamic gravitational damping coefficient (α), is introduced in this paper. ECGSA saves its best fitness function evaluations and uses those as the agents' positions in the searching process. In this way, the optimal found trajectories are retained and the search starts from these trajectories, which allows the algorithm to avoid local optima. Also, the agents can move faster in the search space to obtain better exploration during the first stage of the searching process, and they can converge rapidly to the optimal solution at the final stage of the search by means of the proposed dynamic gravitational damping coefficient. The performance of ECGSA has been evaluated by applying it to eight standard benchmark functions along with six complicated composite test functions. It is also applied to the adaptive beamforming problem as a practical issue, to improve the weight vectors computed by the minimum variance distortionless response (MVDR) beamforming technique. The results of the implementation of the proposed algorithm are compared with those of some well-known heuristic methods and verify the proposed method in terms of both reaching optimal solutions and robustness.
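
    For context on the damping coefficient discussed above, a minimal standard gravitational search algorithm is sketched below, with the gravitational constant decayed as G(t) = G0·exp(-alpha·t/T). The experience memory and the dynamic alpha that distinguish ECGSA are not reproduced, and all parameter values are illustrative.

    ```python
    import numpy as np

    def gsa(f, dim, bounds, n_agents=20, iters=200, g0=100.0, alpha=20.0, seed=0):
        """Minimal standard gravitational search algorithm (minimization)."""
        rng = np.random.default_rng(seed)
        lo, hi = bounds
        X = rng.uniform(lo, hi, (n_agents, dim))
        V = np.zeros_like(X)
        best_x, best_f = None, np.inf
        for t in range(iters):
            fit = np.array([f(x) for x in X])
            if fit.min() < best_f:
                best_f, best_x = fit.min(), X[fit.argmin()].copy()
            # Masses from normalized fitness (smaller fitness -> larger mass).
            m = (fit.max() - fit + 1e-12) / (fit.max() - fit.min() + 1e-12)
            M = m / m.sum()
            G = g0 * np.exp(-alpha * t / iters)   # exponentially damped constant
            # Acceleration of each agent from the pull of every other agent.
            acc = np.zeros_like(X)
            for i in range(n_agents):
                diff = X - X[i]
                dist = np.linalg.norm(diff, axis=1) + 1e-12
                coeff = G * M / dist * rng.random(n_agents)
                acc[i] = (coeff[:, None] * diff).sum(axis=0)
            V = rng.random(X.shape) * V + acc
            X = np.clip(X + V, lo, hi)
        return best_x, best_f

    sphere = lambda x: float(np.sum(x ** 2))
    print(gsa(sphere, dim=5, bounds=(-5.0, 5.0)))
    ```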

  1. Automated hierarchical time gain compensation for in-vivo ultrasound imaging

    NASA Astrophysics Data System (ADS)

    Moshavegh, Ramin; Hemmsen, Martin C.; Martins, Bo; Brandt, Andreas H.; Hansen, Kristoffer L.; Nielsen, Michael B.; Jensen, Jørgen A.

    2015-03-01

    Time gain compensation (TGC) is essential to ensure the optimal image quality of clinical ultrasound scans. When large fluid collections are present within the scan plane, the attenuation distribution changes drastically and TGC compensation becomes challenging. This paper presents an automated hierarchical TGC (AHTGC) algorithm that accurately adapts to the large attenuation variation between different types of tissues and structures. The algorithm relies on estimates of tissue attenuation, scattering strength, and noise level to gain a more quantitative understanding of the underlying tissue and the ultrasound signal strength. The proposed algorithm was applied to a set of 44 in vivo abdominal movie sequences, each containing 15 frames. Matching pairs of in vivo sequences, unprocessed and processed with the proposed AHTGC, were visualized side by side and evaluated by two radiologists in terms of image quality. The Wilcoxon signed-rank test was used to evaluate whether radiologists preferred the processed sequences or the unprocessed data. The results indicate that the average visual analogue scale (VAS) is positive (p-value: 2.34 × 10⁻¹³) and estimated to be 1.01 (95% CI: 0.85; 1.16), favoring the data processed with the proposed AHTGC algorithm.
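
    As background to the adaptive algorithm above, a very simple depth-dependent gain correction is sketched below: it fits a line to the mean log-envelope versus depth and subtracts the fitted attenuation trend. This is only the naive baseline that an adaptive method such as AHTGC improves upon; the attenuation, scattering and noise estimation of AHTGC itself is not reproduced.

    ```python
    import numpy as np

    def simple_tgc(envelope_db):
        """Naive time gain compensation: remove the average linear decay of the
        log-envelope with depth (rows = depth samples, cols = scan lines)."""
        env = np.asarray(envelope_db, float)
        depth = np.arange(env.shape[0])
        mean_db = env.mean(axis=1)                  # average brightness per depth
        slope, intercept = np.polyfit(depth, mean_db, 1)
        gain = -(slope * depth + intercept - mean_db[0])   # dB gain to add per depth
        return env + gain[:, None]

    # Synthetic log-envelope: uniform tissue attenuated by 0.05 dB per sample.
    rng = np.random.default_rng(1)
    depth_samples, lines = 256, 64
    env = -0.05 * np.arange(depth_samples)[:, None] + rng.normal(0, 1, (depth_samples, lines))
    corrected = simple_tgc(env)
    print(corrected.mean(axis=1)[[0, 128, 255]].round(2))   # roughly flat after TGC
    ```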

  2. DE-CERTS: A Decision Support System for a Comparative Evaluation Method for Risk Management Methodologies and Tools

    DTIC Science & Technology

    1991-09-01

    Table-of-contents fragments: III. The Analytic Hierarchy Process (A. Introduction; B. The AHP Process); Implementation of CERTS using AHP (1. Consistency; 2. User Interface). Abstract fragment: ...the proposed technique into a Decision Support System. Expert Choice implements the Analytic Hierarchy Process (AHP), an approach to multi-criteria...

  3. Implementation of an Innovative Teaching Project in a Chemical Process Design Course at the University of Cantabria, Spain

    ERIC Educational Resources Information Center

    Galan, Berta; Muñoz, Iciar; Viguri, Javier R.

    2016-01-01

    This paper shows the planning, the teaching activities and the evaluation of the learning and teaching process implemented in the Chemical Process Design course at the University of Cantabria, Spain. Educational methods to address the knowledge, skills and attitudes that students who complete the course are expected to acquire are proposed and…

  4. Linking automatic evaluation to mood and information processing style: consequences for experienced affect, impression formation, and stereotyping.

    PubMed

    Chartrand, Tanya L; van Baaren, Rick B; Bargh, John A

    2006-02-01

    According to the feelings-as-information account, a person's mood state signals to him or her the valence of the current environment (N. Schwarz & G. Clore, 1983). However, the ways in which the environment automatically influences mood in the first place remain to be explored. The authors propose that one mechanism by which the environment influences affect is automatic evaluation, the nonconscious evaluation of environmental stimuli as good or bad. A first experiment demonstrated that repeated brief exposure to positive or negative stimuli (which leads to automatic evaluation) induces a corresponding mood in participants. In 3 additional studies, the authors showed that automatic evaluation affects information processing style. Experiment 4 showed that participants' mood mediates the effect of valenced brief primes on information processing. ((c) 2006 APA, all rights reserved).

  5. Linking Automatic Evaluation to Mood and Information Processing Style: Consequences for Experienced Affect, Impression Formation, and Stereotyping

    PubMed Central

    Chartrand, Tanya L.; van Baaren, Rick B.; Bargh, John A.

    2009-01-01

    According to the feelings-as-information account, a person’s mood state signals to him or her the valence of the current environment (N. Schwarz & G. Clore, 1983). However, the ways in which the environment automatically influences mood in the first place remain to be explored. The authors propose that one mechanism by which the environment influences affect is automatic evaluation, the nonconscious evaluation of environmental stimuli as good or bad. A first experiment demonstrated that repeated brief exposure to positive or negative stimuli (which leads to automatic evaluation) induces a corresponding mood in participants. In 3 additional studies, the authors showed that automatic evaluation affects information processing style. Experiment 4 showed that participants’ mood mediates the effect of valenced brief primes on information processing. PMID:16478316

  6. Line-width roughness of advanced semiconductor features by using FIB and planar-TEM as reference metrology

    NASA Astrophysics Data System (ADS)

    Takamasu, Kiyoshi; Takahashi, Satoru; Kawada, Hiroki; Ikota, Masami

    2018-03-01

    LER (Line Edge Roughness) and LWR (Line Width Roughness) of semiconductor devices are important measures of device performance. Conventionally, LER and LWR are evaluated from CD-SEM (Critical Dimension Scanning Electron Microscope) images. However, CD-SEM measurement suffers from large high-frequency random noise, and its resolution is not sufficiently high. Several techniques have been proposed to handle the random noise of CD-SEM measurement. In these methods, it is necessary to set parameters for the model and the processing, and to verify the correctness of these parameters using a reference metrology. We have already proposed a novel reference metrology using a FIB (Focused Ion Beam) process and a planar-TEM (Transmission Electron Microscope) method. In this study, we applied the proposed method to three new samples: a SAQP (Self-Aligned Quadruple Patterning) FinFET device, a conventional EUV (Extreme Ultraviolet Lithography) resist, and a new-material EUV resist. LWR and the PSD (Power Spectral Density) of LWR are calculated from the edge positions on planar-TEM images. We confirmed that LWR and the PSD of LWR can be measured with high accuracy, and evaluated the differences between samples with the proposed method. Furthermore, from comparisons with the PSD of the same sample measured by CD-SEM, the validity of PSD and LWR measurement by CD-SEM can be verified.

  7. Processing uncertain RFID data in traceability supply chains.

    PubMed

    Xie, Dong; Xiao, Jie; Guo, Guangjun; Jiang, Tong

    2014-01-01

    Radio Frequency Identification (RFID) is widely used to track and trace objects in traceability supply chains. However, massive uncertain data produced by RFID readers are not effective and efficient to be used in RFID application systems. Following the analysis of key features of RFID objects, this paper proposes a new framework for effectively and efficiently processing uncertain RFID data, and supporting a variety of queries for tracking and tracing RFID objects. We adjust different smoothing windows according to different rates of uncertain data, employ different strategies to process uncertain readings, and distinguish ghost, missing, and incomplete data according to their apparent positions. We propose a comprehensive data model which is suitable for different application scenarios. In addition, a path coding scheme is proposed to significantly compress massive data by aggregating the path sequence, the position, and the time intervals. The scheme is suitable for cyclic or long paths. Moreover, we further propose a processing algorithm for group and independent objects. Experimental evaluations show that our approach is effective and efficient in terms of the compression and traceability queries.
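
    The path coding scheme mentioned above aggregates the path sequence, positions and time intervals into a compact representation; the abstract does not detail the exact encoding, so the sketch below shows only a hypothetical run-length-style aggregation of raw (location, timestamp) readings.

    ```python
    def encode_path(readings):
        """Collapse raw (location, timestamp) RFID readings into a compact path
        of (location, t_start, t_end) segments (hypothetical aggregation)."""
        path = []
        for loc, ts in sorted(readings, key=lambda r: r[1]):
            if path and path[-1][0] == loc:
                path[-1] = (loc, path[-1][1], ts)        # extend current segment
            else:
                path.append((loc, ts, ts))               # start a new segment
        return path

    readings = [("dock", 1), ("dock", 2), ("dock", 5), ("truck", 7),
                ("truck", 9), ("store", 12)]
    print(encode_path(readings))
    # [('dock', 1, 5), ('truck', 7, 9), ('store', 12, 12)]
    ```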

  8. Processing Uncertain RFID Data in Traceability Supply Chains

    PubMed Central

    Xie, Dong; Xiao, Jie

    2014-01-01

    Radio Frequency Identification (RFID) is widely used to track and trace objects in traceability supply chains. However, massive uncertain data produced by RFID readers are not effective and efficient to be used in RFID application systems. Following the analysis of key features of RFID objects, this paper proposes a new framework for effectively and efficiently processing uncertain RFID data, and supporting a variety of queries for tracking and tracing RFID objects. We adjust different smoothing windows according to different rates of uncertain data, employ different strategies to process uncertain readings, and distinguish ghost, missing, and incomplete data according to their apparent positions. We propose a comprehensive data model which is suitable for different application scenarios. In addition, a path coding scheme is proposed to significantly compress massive data by aggregating the path sequence, the position, and the time intervals. The scheme is suitable for cyclic or long paths. Moreover, we further propose a processing algorithm for group and independent objects. Experimental evaluations show that our approach is effective and efficient in terms of the compression and traceability queries. PMID:24737978

  9. Process service quality evaluation based on Dempster-Shafer theory and support vector machine.

    PubMed

    Pei, Feng-Que; Li, Dong-Bo; Tong, Yi-Fei; He, Fei

    2017-01-01

    Human involvement influences traditional service quality evaluations, leading to low accuracy, poor reliability and weak predictability. This paper proposes a method, called SVMs-DS, that employs support vector machines (SVMs) and Dempster-Shafer evidence theory to evaluate the service quality of a production process while handling a high number of input features with a small sample data set. Features that can affect production quality are extracted by a large number of sensors. Preprocessing steps such as feature simplification and normalization are reduced. Based on three individual SVM models, basic probability assignments (BPAs) are constructed, which support the evaluation in both a qualitative and a quantitative way. The process service quality evaluation results are validated by Dempster's rules; the decision threshold used to resolve conflicting results is generated from the three SVM models. A case study is presented to demonstrate the effectiveness of the SVMs-DS method.
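
    The fusion step in SVMs-DS combines basic probability assignments with Dempster's rule; that combination rule itself is standard and is sketched below for a two-element frame {good, poor}. The BPA values are placeholders for what would, in the paper, be derived from the three SVM models.

    ```python
    from itertools import product

    def dempster_combine(m1, m2):
        """Dempster's rule of combination for BPAs given as dicts mapping
        frozenset focal elements to masses (including the full frame)."""
        combined, conflict = {}, 0.0
        for (a, wa), (b, wb) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
        if conflict >= 1.0:
            raise ValueError("total conflict, BPAs cannot be combined")
        return {k: v / (1.0 - conflict) for k, v in combined.items()}

    GOOD, POOR = frozenset({"good"}), frozenset({"poor"})
    FRAME = GOOD | POOR
    # Placeholder BPAs, e.g. derived from two SVM models' class probabilities.
    m1 = {GOOD: 0.6, POOR: 0.3, FRAME: 0.1}
    m2 = {GOOD: 0.7, POOR: 0.2, FRAME: 0.1}
    print(dempster_combine(m1, m2))
    ```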

  10. Evolution of evaluation criteria in the College of American Pathologists Surveys.

    PubMed

    Ross, J W

    1988-04-01

    This review of the evolution of evaluation criteria in the College of American Pathologists Survey and of theoretical grounds proposed for evaluation criteria explores the complex nature of the evaluation process. Survey professionals balance multiple variables to seek relevant and meaningful evaluations. These include the state of the art, the reliability of target values, the nature of available control materials, the perceived medical "nonusefulness" of the extremes of performance (good or poor), the extent of laboratory services provided, and the availability of scientific data and theory by which clinically relevant criteria of medical usefulness may be established. The evaluation process has consistently sought peer consensus, to stimulate improvement in the state of the art, to increase medical usefulness, and to monitor the state of the art. Recent factors that are likely to promote change from peer group evaluation to fixed criteria evaluation are the high degree of proficiency in the state of the art for many analytes, accurate target values, increased knowledge of biologic variation, and the availability of statistical modeling techniques simulating biologic and diagnostic processes as well as analytic processes.

  11. Dynamic deformation image de-blurring and image processing for digital imaging correlation measurement

    NASA Astrophysics Data System (ADS)

    Guo, X.; Li, Y.; Suo, T.; Liu, H.; Zhang, C.

    2017-11-01

    This paper proposes a method for de-blurring images captured during the dynamic deformation of materials. De-blurring is achieved with a dynamics-based approach, which is used to estimate the Point Spread Function (PSF) during the camera exposure window. The deconvolution process, which involves iterative matrix calculations over the pixels, is then performed on the GPU to decrease the time cost. Compared with the Gauss method and the Lucy-Richardson method, the proposed method gives the best image restoration results. The proposed method has been evaluated using the Hopkinson bar loading system. In comparison with the blurry image, the proposed method successfully restored the image. It is also demonstrated, through image processing applications, that the de-blurring method can improve the accuracy and the stability of digital imaging correlation measurement.
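
    This record compares against the Lucy-Richardson method; for reference, a minimal Richardson-Lucy deconvolution loop (the classical known-PSF baseline, not the proposed dynamic PSF estimation) is sketched below with an assumed horizontal motion-blur PSF.

    ```python
    import numpy as np
    from scipy.signal import convolve2d

    def richardson_lucy(blurred, psf, iters=30):
        """Classical Richardson-Lucy deconvolution (non-blind, known PSF)."""
        estimate = np.full_like(blurred, blurred.mean(), dtype=float)
        psf_mirror = psf[::-1, ::-1]
        for _ in range(iters):
            reblurred = convolve2d(estimate, psf, mode="same", boundary="symm")
            ratio = blurred / (reblurred + 1e-12)
            estimate *= convolve2d(ratio, psf_mirror, mode="same", boundary="symm")
        return estimate

    # Assumed horizontal motion-blur PSF of length 7.
    psf = np.zeros((1, 7)); psf[0, :] = 1.0 / 7.0
    rng = np.random.default_rng(0)
    sharp = rng.random((64, 64))
    blurred = convolve2d(sharp, psf, mode="same", boundary="symm")
    restored = richardson_lucy(blurred, psf)
    print("blurred MSE:", round(float(np.mean((blurred - sharp) ** 2)), 4),
          "restored MSE:", round(float(np.mean((restored - sharp) ** 2)), 4))
    ```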

  12. How to convince your manager to invest in an HIS preimplementation methodology for appraisal of material, process and human costs and benefits.

    PubMed Central

    Bossard, B.; Renard, J. M.; Capelle, P.; Paradis, P.; Beuscart, M. C.

    2000-01-01

    Investing in information technology has become a crucial process in hospital management today. Medical and administrative managers are faced with difficulties in measuring medical information technology costs and benefits due to the complexity of the domain. This paper proposes a preimplementation methodology for evaluating and appraising material, process and human costs and benefits. Based on users' needs and organizational process analysis, the methodology provides an evaluative set of financial and non-financial indicators which can be integrated into a decision-making and investment evaluation process. We describe the first results obtained after a few months of operation for the Computer-Based Patient Record (CPR) project. Its full acceptance, in spite of some difficulties, encourages us to diffuse the method for the entire project. PMID:11079851

  13. SEIPS-based process modeling in primary care.

    PubMed

    Wooldridge, Abigail R; Carayon, Pascale; Hundt, Ann Schoofs; Hoonakker, Peter L T

    2017-04-01

    Process mapping, often used as part of the human factors and systems engineering approach to improve care delivery and outcomes, should be expanded to represent the complex, interconnected sociotechnical aspects of health care. Here, we propose a new sociotechnical process modeling method to describe and evaluate processes, using the SEIPS model as the conceptual framework. The method produces a process map and supplementary table, which identify work system barriers and facilitators. In this paper, we present a case study applying this method to three primary care processes. We used purposeful sampling to select staff (care managers, providers, nurses, administrators and patient access representatives) from two clinics to observe and interview. We show the proposed method can be used to understand and analyze healthcare processes systematically and identify specific areas of improvement. Future work is needed to assess usability and usefulness of the SEIPS-based process modeling method and further refine it. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. SEIPS-Based Process Modeling in Primary Care

    PubMed Central

    Wooldridge, Abigail R.; Carayon, Pascale; Hundt, Ann Schoofs; Hoonakker, Peter

    2016-01-01

    Process mapping, often used as part of the human factors and systems engineering approach to improve care delivery and outcomes, should be expanded to represent the complex, interconnected sociotechnical aspects of health care. Here, we propose a new sociotechnical process modeling method to describe and evaluate processes, using the SEIPS model as the conceptual framework. The method produces a process map and supplementary table, which identify work system barriers and facilitators. In this paper, we present a case study applying this method to three primary care processes. We used purposeful sampling to select staff (care managers, providers, nurses, administrators and patient access representatives) from two clinics to observe and interview. We show the proposed method can be used to understand and analyze healthcare processes systematically and identify specific areas of improvement. Future work is needed to assess usability and usefulness of the SEIPS-based process modeling method and further refine it. PMID:28166883

  15. 75 FR 42375 - Missoula County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-21

    ... RAC legislation and RAC mission, establish a process for project proposal evaluation and decision making, set future meeting dates and receive public comment on the meeting subjects and proceedings...

  16. NASA's Earth Science Data Systems Standards Process Experiences

    NASA Technical Reports Server (NTRS)

    Ullman, Richard E.; Enloe, Yonsook

    2007-01-01

    NASA has impaneled several internal working groups to provide recommendations to NASA management on ways to evolve and improve Earth Science Data Systems. One of these working groups is the Standards Process Group (SPG). The SPG is drawn from NASA-funded Earth Science Data Systems stakeholders, and it directs a process of community review and evaluation of proposed NASA standards. The working group's goal is to promote interoperability and interuse of NASA Earth Science data through broader use of standards that have proven implementation and operational benefit to NASA Earth science by facilitating the NASA management endorsement of proposed standards. The SPG now has two years of experience with this approach to identification of standards. We will discuss real examples of the different types of candidate standards that have been proposed to NASA's Standards Process Group, such as OPeNDAP's Data Access Protocol, the Hierarchical Data Format, and Open Geospatial Consortium's Web Map Server. Each of the three types of proposals requires a different set of criteria for understanding the broad concepts of "proven implementation" and "operational benefit" in the context of NASA Earth Science data systems. We will discuss how our Standards Process has evolved with our experiences with the three candidate standards.

  17. Evaluation of the dermal carcinogenicity of lubricant base oils by the mouse skin painting bioassay and other proposed methods.

    PubMed

    Chasey, K L; McKee, R H

    1993-01-01

    Lubricant base oils are petroleum products that are predominantly derived from the vacuum distillation of crude oil. Various types of refinement can be employed during the manufacturing process, and evidence suggests that certain of the associated process streams produce skin cancer. Polycyclic aromatic compounds (PACs), some of which are considered as the causative agents, are removed, concentrated or chemically converted during the refinement process. In order to understand the effects of various types of refinement processes on carcinogenic potential, 94 oils were evaluated in the mouse epidermal cancer bioassay. This Exxon database is unique, because of the wide range of crude oils and processing histories represented. Seven processing history classifications are described, and conclusions concerning the impacts of each refinement process on dermal carcinogenicity are discussed. This research also included an evaluation of selected biological and chemical test methods for predicting carcinogenic potential. These included a modified version of the Ames test for mutagenicity, as well as analytical characterizations of the polycyclic aromatic structures in the oils. For classification purposes, a sample was considered to be carcinogenic if it resulted in the production of two or more tumor-bearing animals (in test groups of either 40 or 50 animals). The modified Ames test was considered to be positive if the mutagenicity index was > or = 2.0, and PAC analyses were similarly designated as positive or negative according to proposed guidelines. All of the alternative test methods showed similar agreement with dermal carcinogenicity bioassay data; concordance values were > or = 80%. However, each test was incorrect in ca. 10%-20% of the cases evaluated.(ABSTRACT TRUNCATED AT 250 WORDS)

  18. Assessment Methodology for Process Validation Lifecycle Stage 3A.

    PubMed

    Sayeed-Desta, Naheed; Pazhayattil, Ajay Babu; Collins, Jordan; Chen, Shu; Ingram, Marzena; Spes, Jana

    2017-07-01

    The paper introduces evaluation methodologies and associated statistical approaches for process validation lifecycle Stage 3A. The assessment tools proposed can be applied to newly developed and launched small molecule as well as bio-pharma products, where substantial process and product knowledge has been gathered. The following elements may be included in Stage 3A: number of 3A batch determination; evaluation of critical material attributes, critical process parameters, critical quality attributes; in vivo in vitro correlation; estimation of inherent process variability (IPV) and PaCS index; process capability and quality dashboard (PCQd); and enhanced control strategy. US FDA guidance on Process Validation: General Principles and Practices, January 2011 encourages applying previous credible experience with suitably similar products and processes. A complete Stage 3A evaluation is a valuable resource for product development and future risk mitigation of similar products and processes. Elements of 3A assessment were developed to address industry and regulatory guidance requirements. The conclusions made provide sufficient information to make a scientific and risk-based decision on product robustness.
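
    Among the Stage 3A elements listed above is process capability; the standard Cp/Cpk calculation that such an assessment would typically report is sketched below with illustrative specification limits and batch data. The paper's own IPV, PaCS index and PCQd definitions are not reproduced here.

    ```python
    import numpy as np

    def capability(samples, lsl, usl):
        """Standard short-term process capability indices Cp and Cpk."""
        x = np.asarray(samples, float)
        mu, sigma = x.mean(), x.std(ddof=1)
        cp = (usl - lsl) / (6 * sigma)
        cpk = min(usl - mu, mu - lsl) / (3 * sigma)
        return cp, cpk

    # Illustrative assay results (%) for validation batches, spec 95.0-105.0.
    assay = [99.2, 100.1, 99.8, 100.4, 99.5, 100.0, 99.9, 100.6, 99.7, 100.2]
    cp, cpk = capability(assay, lsl=95.0, usl=105.0)
    print(round(cp, 2), round(cpk, 2))
    ```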

  19. Gaussian Process Interpolation for Uncertainty Estimation in Image Registration

    PubMed Central

    Wachinger, Christian; Golland, Polina; Reuter, Martin; Wells, William

    2014-01-01

    Intensity-based image registration requires resampling images on a common grid to evaluate the similarity function. The uncertainty of interpolation varies across the image, depending on the location of resampled points relative to the base grid. We propose to perform Bayesian inference with Gaussian processes, where the covariance matrix of the Gaussian process posterior distribution estimates the uncertainty in interpolation. The Gaussian process replaces a single image with a distribution over images that we integrate into a generative model for registration. Marginalization over resampled images leads to a new similarity measure that includes the uncertainty of the interpolation. We demonstrate that our approach increases the registration accuracy and propose an efficient approximation scheme that enables seamless integration with existing registration methods. PMID:25333127
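
    The registration approach above uses the Gaussian process posterior covariance as an estimate of interpolation uncertainty; a minimal 1-D GP regression sketch showing how that posterior mean and variance are computed, with an assumed RBF kernel and noise level, is given below.

    ```python
    import numpy as np

    def rbf(a, b, length=1.0, var=1.0):
        """Squared-exponential (RBF) covariance between two sets of 1-D points."""
        d = a[:, None] - b[None, :]
        return var * np.exp(-0.5 * (d / length) ** 2)

    def gp_posterior(x_train, y_train, x_test, noise=1e-2):
        """GP regression posterior mean and variance at test points."""
        K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
        K_s = rbf(x_train, x_test)
        K_ss = rbf(x_test, x_test)
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
        mean = K_s.T @ alpha
        v = np.linalg.solve(L, K_s)
        cov = K_ss - v.T @ v          # posterior covariance = interpolation uncertainty
        return mean, np.diag(cov)

    x_train = np.array([0.0, 1.0, 2.0, 3.5])       # "base grid" samples
    y_train = np.sin(x_train)
    x_test = np.linspace(0.0, 3.5, 8)              # resampled points
    mean, var = gp_posterior(x_train, y_train, x_test)
    print(np.round(var, 4))   # variance grows between/away from grid points
    ```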

  20. A Framework for the Development of Automatic DFA Method to Minimize the Number of Components and Assembly Reorientations

    NASA Astrophysics Data System (ADS)

    Alfadhlani; Samadhi, T. M. A. Ari; Ma’ruf, Anas; Setiasyah Toha, Isa

    2018-03-01

    Assembly is a part of the manufacturing process that must be considered at the product design stage. Design for Assembly (DFA) is a method to evaluate a product design in order to make it simpler, easier and quicker to assemble, so that assembly cost is reduced. This article discusses a framework for developing a computer-based DFA method. The method is expected to aid product designers in extracting data, evaluating the assembly process, and providing recommendations for product design improvement. Ideally, these three tasks are performed without an interactive process or user intervention, so that the product design evaluation can be done automatically. The input to the proposed framework is a 3D solid engineering drawing. Product design evaluation is performed by: minimizing the number of components; generating assembly sequence alternatives; selecting the best assembly sequence based on the minimum number of assembly reorientations; and providing suggestions for design improvement.

  1. A combined disease management and process modeling approach for assessing and improving care processes: a fall management case-study.

    PubMed

    Askari, Marjan; Westerhof, Richard; Eslami, Saied; Medlock, Stephanie; de Rooij, Sophia E; Abu-Hanna, Ameen

    2013-10-01

    To propose a combined disease management and process modeling approach for evaluating and improving care processes, and demonstrate its usability and usefulness in a real-world fall management case study. We identified essential disease management related concepts and mapped them into explicit questions meant to expose areas for improvement in the respective care processes. We applied the disease management oriented questions to a process model of a comprehensive real world fall prevention and treatment program covering primary and secondary care. We relied on interviews and observations to complete the process models, which were captured in UML activity diagrams. A preliminary evaluation of the usability of our approach by gauging the experience of the modeler and an external validator was conducted, and the usefulness of the method was evaluated by gathering feedback from stakeholders at an invitational conference of 75 attendees. The process model of the fall management program was organized around the clinical tasks of case finding, risk profiling, decision making, coordination and interventions. Applying the disease management questions to the process models exposed weaknesses in the process including: absence of program ownership, under-detection of falls in primary care, and lack of efficient communication among stakeholders due to missing awareness about other stakeholders' workflow. The modelers experienced the approach as usable and the attendees of the invitational conference found the analysis results to be valid. The proposed disease management view of process modeling was usable and useful for systematically identifying areas of improvement in a fall management program. Although specifically applied to fall management, we believe our case study is characteristic of various disease management settings, suggesting the wider applicability of the approach. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  2. LAC indicators: an evaluation of progress and list of proposed indicators

    Treesearch

    Alan E. Watson; David N. Cole

    1992-01-01

    One of the most critical, and difficult, steps in the Limits of Acceptable Change (LAC) process is the selection of indicators. To help with this step, this paper (1) briefly reviews some desirable characteristics of indicators and (2) lists indicators that have been proposed or adopted in LAC plans. From a comparison of this list of indicators and desirable...

  3. Proposed new industry code on unhealthy food marketing to children and young people: will it make a difference?

    PubMed

    Swinburn, Boyd; Vandevijvere, Stefanie; Woodward, Alistair; Hornblow, Andrew; Richardson, Ann; Burlingame, Barbara; Borman, Barry; Taylor, Barry; Breier, Bernhard; Arroll, Bruce; Drummond, Bernadette; Grant, Cameron; Bullen, Chris; Wall, Clare; Mhurchu, Cliona Ni; Cameron-Smith, David; Menkes, David; Murdoch, David; Mangin, Dee; Lennon, Diana; Sarfati, Diana; Sellman, Doug; Rush, Elaine; Sopoaga, Faafetai; Thomson, George; Devlin, Gerry; Abel, Gillian; White, Harvey; Coad, Jane; Hoek, Janet; Connor, Jennie; Krebs, Jeremy; Douwes, Jeroen; Mann, Jim; McCall, John; Broughton, John; Potter, John D; Toop, Les; McCowan, Lesley; Signal, Louise; Beckert, Lutz; Elwood, Mark; Kruger, Marlena; Farella, Mauro; Baker, Michael; Keall, Michael; Skeaff, Murray; Thomson, Murray; Wilson, Nick; Chandler, Nicholas; Reid, Papaarangi; Priest, Patricia; Brunton, Paul; Crampton, Peter; Davis, Peter; Gendall, Philip; Howden-Chapman, Philippa; Taylor, Rachael; Edwards, Richard; Beaglehole, Robert; Doughty, Robert; Scragg, Robert; Gauld, Robin; McGee, Robert; Jackson, Rod; Hughes, Roger; Mulder, Roger; Bonita, Ruth; Kruger, Rozanne; Casswell, Sally; Derrett, Sarah; Ameratunga, Shanthi; Denny, Simon; Hales, Simon; Pullon, Sue; Wells, Susan; Cundy, Tim; Blakely, Tony

    2017-02-17

    Reducing the exposure of children and young people to the marketing of unhealthy foods is a core strategy for reducing the high overweight and obesity prevalence in this population. The Advertising Standards Authority (ASA) has recently reviewed its self-regulatory codes and proposed a revised single code on advertising to children. This article evaluates the proposed code against eight criteria for an effective code, which were included in a submission to the ASA review process from over 70 New Zealand health professors. The evaluation found that the proposed code largely represents no change or uncertain change from the existing codes, and cannot be expected to provide substantial protection for children and young people from the marketing of unhealthy foods. Government regulations will be needed to achieve this important outcome.

  4. Multi-time scale energy management of wind farms based on comprehensive evaluation technology

    NASA Astrophysics Data System (ADS)

    Xu, Y. P.; Huang, Y. H.; Liu, Z. J.; Wang, Y. F.; Li, Z. Y.; Guo, L.

    2017-11-01

    A novel energy management approach for wind farms is proposed in this paper. Firstly, a novel comprehensive evaluation system is proposed to quantify the economic properties of each wind farm, making the energy management more economical and reasonable. Then, a combined multi-time-scale scheduling method is proposed to implement the energy management. The day-ahead schedule optimizes the unit commitment of thermal power generators. The intraday schedule optimizes the power generation plan for all thermal power generating units, hydroelectric generating sets and wind power plants. Finally, the power generation plan can be revised in a timely manner during on-line scheduling. The paper concludes with simulations conducted on a real provincial integrated energy system in northeast China. The simulation results validate the proposed model and the corresponding solution algorithms.

  5. A foundational methodology for determining system static complexity using notional lunar oxygen production processes

    NASA Astrophysics Data System (ADS)

    Long, Nicholas James

    This thesis serves to develop a preliminary foundational methodology for evaluating the static complexity of future lunar oxygen production systems when extensive information is not yet available about the various systems under consideration. Evaluating static complexity, as part of an overall system complexity analysis, is an important consideration in ultimately selecting a process to be used in a lunar base. When system complexity is higher, there is generally an overall increase in risk which could impact the safety of astronauts and the economic performance of the mission. To evaluate static complexity in lunar oxygen production, static complexity is simplified and defined in terms of its essential components. First, three essential dimensions of static complexity are investigated, including interconnective complexity, strength of connections, and complexity in variety. Then a set of methods is developed upon which to separately evaluate each dimension. Q-connectivity analysis is proposed as a means to evaluate interconnective complexity and strength of connections. The law of requisite variety originating from cybernetic theory is suggested to interpret complexity in variety. Secondly, a means to aggregate the results of each analysis is proposed to create a holistic measurement of static complexity using the Single Multi-Attribute Ranking Technique (SMART).

  6. Bubble structure evaluation method of sponge cake by using image morphology

    NASA Astrophysics Data System (ADS)

    Kato, Kunihito; Yamamoto, Kazuhiko; Nonaka, Masahiko; Katsuta, Yukiyo; Kasamatsu, Chinatsu

    2007-01-01

    Nowadays, many evaluation methods based on image processing have been proposed for the food industry. These methods are becoming a new means of evaluation alongside the sensory tests and solid-state measurements that have traditionally been used for quality evaluation. The goal of our research is the structure evaluation of sponge cake by using image processing. In this paper, we propose a feature extraction method for the bubble structure of sponge cake. Analysis of the bubble structure is important for understanding the characteristics of the cake from the image. To acquire the cake image, we first cut the cakes and scanned their surfaces with a CIS scanner, whose depth of field is very shallow. As a result, the bubble regions of the surface have low gray-scale values and appear blurred. We extracted bubble regions from the surface images based on these features: the input image is binarized, and the bubble features are extracted by morphology analysis. To evaluate the result of the feature extraction, we examined its correlation with the "size of the bubble" scores from the sensory test. The results show that bubble extraction using morphology analysis gives good correlation, indicating that our method agrees well with the subjective evaluation.
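
    A compact sketch of the kind of pipeline described above (binarize the dark, blurred bubble regions, clean up with a morphological opening, then label and measure them) follows, using scipy.ndimage. The threshold, structuring element and synthetic image are illustrative assumptions, not the paper's settings.

    ```python
    import numpy as np
    from scipy import ndimage

    def bubble_features(gray, threshold):
        """Binarize dark (blurred) bubble regions, denoise with a morphological
        opening, then label connected components and return their sizes."""
        bubbles = gray < threshold                       # bubbles are darker than crumb
        opened = ndimage.binary_opening(bubbles, structure=np.ones((3, 3)))
        labels, n = ndimage.label(opened)
        sizes = ndimage.sum(opened, labels, index=range(1, n + 1))
        return n, sizes

    # Synthetic "cake surface": bright crumb with a few dark circular bubbles.
    yy, xx = np.mgrid[0:128, 0:128]
    img = np.full((128, 128), 200.0)
    for cy, cx, r in [(30, 30, 6), (70, 90, 10), (100, 40, 4)]:
        img[(yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2] = 60.0
    count, sizes = bubble_features(img, threshold=128)
    print(count, sorted(sizes))
    ```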

  7. Model performance evaluation (validation and calibration) in model-based studies of therapeutic interventions for cardiovascular diseases: a review and suggested reporting framework.

    PubMed

    Haji Ali Afzali, Hossein; Gray, Jodi; Karnon, Jonathan

    2013-04-01

    Decision analytic models play an increasingly important role in the economic evaluation of health technologies. Given uncertainties around the assumptions used to develop such models, several guidelines have been published to identify and assess 'best practice' in the model development process, including general modelling approach (e.g., time horizon), model structure, input data and model performance evaluation. This paper focuses on model performance evaluation. In the absence of a sufficient level of detail around model performance evaluation, concerns regarding the accuracy of model outputs, and hence the credibility of such models, are frequently raised. Following presentation of its components, a review of the application and reporting of model performance evaluation is presented. Taking cardiovascular disease as an illustrative example, the review investigates the use of face validity, internal validity, external validity, and cross model validity. As a part of the performance evaluation process, model calibration is also discussed and its use in applied studies investigated. The review found that the application and reporting of model performance evaluation across 81 studies of treatment for cardiovascular disease was variable. Cross-model validation was reported in 55 % of the reviewed studies, though the level of detail provided varied considerably. We found that very few studies documented other types of validity, and only 6 % of the reviewed articles reported a calibration process. Considering the above findings, we propose a comprehensive model performance evaluation framework (checklist), informed by a review of best-practice guidelines. This framework provides a basis for more accurate and consistent documentation of model performance evaluation. This will improve the peer review process and the comparability of modelling studies. Recognising the fundamental role of decision analytic models in informing public funding decisions, the proposed framework should usefully inform guidelines for preparing submissions to reimbursement bodies.

  8. An approach for environmental risk assessment of engineered nanomaterials using Analytical Hierarchy Process (AHP) and fuzzy inference rules.

    PubMed

    Topuz, Emel; van Gestel, Cornelis A M

    2016-01-01

    The usage of Engineered Nanoparticles (ENPs) in consumer products is relatively new, and there is a need to conduct environmental risk assessment (ERA) to evaluate their impacts on the environment. However, alternative approaches are required for the ERA of ENPs because of the huge gap in data and knowledge compared with conventional pollutants, and because their unique properties make it difficult to apply existing approaches. This study proposes an ERA approach for ENPs that integrates the Analytical Hierarchy Process (AHP) and fuzzy inference models, which provide a systematic evaluation of risk factors and reduce uncertainty about the data and information, respectively. Risk is assumed to be the combination of occurrence likelihood, exposure potential and toxic effects in the environment. A hierarchy was established to evaluate the sub-factors of these components. The evaluation was made with fuzzy numbers to reduce uncertainty and to incorporate expert judgements. The overall score of each component was combined with fuzzy inference rules by using expert judgements. The proposed approach reports the risk class and its membership degree, such as Minor (0.7), so the results are precise and helpful for determining risk management strategies. Moreover, priority weights, calculated by comparing the risk factors according to their importance, enable users to understand which factors contribute most to the risk. The proposed approach was applied to Ag (two nanoparticles with different coatings) and TiO2 nanoparticles in different case studies. The results confirmed the proposed benefits of the approach. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Automatic and user-centric approaches to video summary evaluation

    NASA Astrophysics Data System (ADS)

    Taskiran, Cuneyt M.; Bentley, Frank

    2007-01-01

    Automatic video summarization has become an active research topic in content-based video processing. However, not much emphasis has been placed on developing rigorous summary evaluation methods and on developing summarization systems based on a clear understanding of user needs, obtained through user-centered design. In this paper we address these two topics and propose an automatic video summary evaluation algorithm adapted from the text summarization domain.

  10. Methods for evaluating information in managing the enterprise on the basis of a hybrid three-tier system

    NASA Astrophysics Data System (ADS)

    Vasil'ev, V. A.; Dobrynina, N. V.

    2017-01-01

    The article presents data on the influence of information upon the functioning of complex systems in the process of ensuring their effective management. Ways and methods for evaluating multidimensional information are proposed that reduce the time and resources required and improve the validity of management decisions for the system under study.

  11. Evaluating Attitudes, Skill, and Performance in a Learning-Enhanced Quantitative Methods Course: A Structural Modeling Approach.

    ERIC Educational Resources Information Center

    Harlow, Lisa L.; Burkholder, Gary J.; Morrow, Jennifer A.

    2002-01-01

    Used a structural modeling approach to evaluate relations among attitudes, initial skills, and performance in a Quantitative Methods course that involved students in active learning. Results largely confirmed hypotheses offering support for educational reform efforts that propose actively involving students in the learning process, especially in…

  12. Automatic Assessment of 3D Modeling Exams

    ERIC Educational Resources Information Center

    Sanna, A.; Lamberti, F.; Paravati, G.; Demartini, C.

    2012-01-01

    Computer-based assessment of exams provides teachers and students with two main benefits: fairness and effectiveness in the evaluation process. This paper proposes a fully automatic evaluation tool for the Graphic and Virtual Design (GVD) curriculum at the First School of Architecture of the Politecnico di Torino, Italy. In particular, the tool is…

  13. 76 FR 72474 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing of Proposed Rule Change To List...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-23

    ... has developed a proprietary SectorSAM(TM) quantitative research and evaluation process that forecasts... and short portfolios as dictated by its proprietary SectorSAM quantitative research and evaluation... a proprietary quantitative analysis, to forecast each sector's excess return within a specific time...

  14. A framework for the direct evaluation of large deviations in non-Markovian processes

    NASA Astrophysics Data System (ADS)

    Cavallaro, Massimo; Harris, Rosemary J.

    2016-11-01

    We propose a general framework to simulate stochastic trajectories with arbitrarily long memory dependence and efficiently evaluate large deviation functions associated to time-extensive observables. This extends the ‘cloning’ procedure of Giardiná et al (2006 Phys. Rev. Lett. 96 120603) to non-Markovian systems. We demonstrate the validity of this method by testing non-Markovian variants of an ion-channel model and the totally asymmetric exclusion process, recovering results obtainable by other means.

  15. Statistical analysis for validating ACO-KNN algorithm as feature selection in sentiment analysis

    NASA Astrophysics Data System (ADS)

    Ahmad, Siti Rohaidah; Yusop, Nurhafizah Moziyana Mohd; Bakar, Azuraliza Abu; Yaakub, Mohd Ridzwan

    2017-10-01

    This research paper proposes a hybrid of the ant colony optimization (ACO) and k-nearest neighbor (KNN) algorithms as a feature selection method for selecting relevant features from customer review datasets. Information gain (IG), genetic algorithm (GA), and rough set attribute reduction (RSAR) were used as baseline algorithms in a performance comparison with the proposed algorithm. This paper also discusses the significance test, which was used to evaluate the performance differences between the ACO-KNN, IG-GA, and IG-RSAR algorithms. This study evaluated the performance of the ACO-KNN algorithm using precision, recall, and F-score, which were validated using parametric statistical significance tests. The evaluation process has statistically shown that the ACO-KNN algorithm is significantly improved compared with the baseline algorithms. In addition, the experimental results have shown that ACO-KNN can be used as a feature selection technique in sentiment analysis to obtain a quality, optimal feature subset that represents the actual data in customer review datasets.

  16. Risk Assessment of Groundwater Contamination: A Multilevel Fuzzy Comprehensive Evaluation Approach Based on DRASTIC Model

    PubMed Central

    Zhang, Yan; Zhong, Ming

    2013-01-01

    Groundwater contamination is a serious threat to water supply. Risk assessment of groundwater contamination is an effective way to protect the safety of groundwater resources. Groundwater is a complex and fuzzy system with many uncertainties, impacted by different geological and hydrological factors. In order to deal with the uncertainty in the risk assessment of groundwater contamination, we propose an approach that integrates the analytic hierarchy process with fuzzy comprehensive evaluation. First, the risk factors of groundwater contamination are identified by the source-pathway-receptor-consequence method, and a corresponding index system of risk assessment based on the DRASTIC model is established. Because of the complexity of the transitions between possible pollution risks and the uncertainties of the factors, the analytic hierarchy process is applied to determine the weight of each factor, and fuzzy set theory is adopted to calculate the membership degree of each factor. Finally, a case study is presented to illustrate and test this methodology. It is concluded that the proposed approach integrates the advantages of both the analytic hierarchy process and fuzzy comprehensive evaluation, providing a more flexible and reliable way to deal with the linguistic and mechanism uncertainty in groundwater contamination without losing important information. PMID:24453883
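
    As a minimal illustration of the AHP-plus-fuzzy-composition mechanics summarized above (not the DRASTIC index system itself), the sketch below derives factor weights from an assumed pairwise comparison matrix and synthesizes assumed membership grades into an overall risk grade.

    ```python
    # Hedged sketch of AHP weighting followed by fuzzy comprehensive evaluation.
    # The comparison matrix, membership matrix, and grade labels are toy values.
    import numpy as np

    # AHP step: principal eigenvector of a pairwise comparison matrix gives weights.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])
    eigvals, eigvecs = np.linalg.eig(A)
    w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
    w = w / w.sum()                              # normalized factor weights

    # Fuzzy step: membership of each factor (rows) in each risk grade (columns).
    R = np.array([[0.6, 0.3, 0.1],
                  [0.2, 0.5, 0.3],
                  [0.1, 0.4, 0.5]])

    b = w @ R                                    # weighted synthesis of memberships
    b = b / b.sum()
    grades = ["low", "medium", "high"]
    print("overall risk grade:", grades[int(np.argmax(b))], b.round(3))
    ```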

  17. Best practice recommendations for the development, implementation, and evaluation of online knowledge translation resources in rehabilitation.

    PubMed

    Levac, Danielle; Glegg, Stephanie M N; Camden, Chantal; Rivard, Lisa M; Missiuna, Cheryl

    2015-04-01

    The knowledge-to-practice gap in rehabilitation has spurred knowledge translation (KT) initiatives aimed at promoting clinician behavior change and improving patient care. Online KT resources for physical therapists and other rehabilitation clinicians are appealing because of their potential to reach large numbers of individuals through self-paced, self-directed learning. This article proposes best practice recommendations for developing online KT resources that are designed to translate evidence into practice. Four recommendations are proposed with specific steps in the development, implementation, and evaluation process: (1) develop evidence-based, user-centered content; (2) tailor content to online format; (3) evaluate impact; and (4) share results and disseminate knowledge. Based on KT evidence and instructional design principles, concrete examples are provided along with insights gained from experiences in creating and evaluating online KT resources for physical therapists. In proposing these recommendations, the next steps for research are suggested, and others are invited to contribute to the discussion. © 2015 American Physical Therapy Association.

  18. Clinical Decision Support Alert Appropriateness: A Review and Proposal for Improvement

    PubMed Central

    McCoy, Allison B.; Thomas, Eric J.; Krousel-Wood, Marie; Sittig, Dean F.

    2014-01-01

    Background Many healthcare providers are adopting clinical decision support (CDS) systems to improve patient safety and meet meaningful use requirements. Computerized alerts that prompt clinicians about drug-allergy, drug-drug, and drug-disease warnings or provide dosing guidance are most commonly implemented. Alert overrides, which occur when clinicians do not follow the guidance presented by the alert, can hinder improved patient outcomes. Methods We present a review of CDS alerts and describe a proposal to develop novel methods for evaluating and improving CDS alerts that builds upon traditional informatics approaches. Our proposal incorporates previously described models for predicting alert overrides that utilize retrospective chart review to determine which alerts are clinically relevant and which overrides are justifiable. Results Despite increasing implementations of CDS alerts, detailed evaluations rarely occur because of the extensive labor involved in manual chart reviews to determine alert and response appropriateness. Further, most studies have solely evaluated alert overrides that are appropriate or justifiable. Our proposal expands the use of web-based monitoring tools with an interactive dashboard for evaluating CDS alert and response appropriateness that incorporates the predictive models. The dashboard provides 2 views, an alert detail view and a patient detail view, to provide a full history of alerts and help put the patient's events in context. Conclusion The proposed research introduces several innovations to address the challenges and gaps in alert evaluations. This research can transform alert evaluation processes across healthcare settings, leading to improved CDS, reduced alert fatigue, and increased patient safety. PMID:24940129

  19. 76 FR 56357 - Expedited Vocational Assessment Under the Sequential Evaluation Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-13

    ... work. This proposed new process would not disadvantage any claimant or change the ultimate conclusion... demands of unskilled work. If any of these rules would indicate that the claimant may be disabled or... to them, and an explanation of how we will apply the new rules.

  20. A Guide to the Selection of Cost-Effective Wastewater Treatment Systems. Technical Report.

    ERIC Educational Resources Information Center

    Van Note, Robert H.; And Others

    The data within this publication provide guidelines for planners, engineers and decision-makers at all governmental levels to evaluate cost-effectiveness of alternative wastewater treatment proposals. The processes described include conventional and advanced treatment units as well as most sludge handling and processing units. Flow sheets, cost…

  1. 76 FR 13648 - Proposed Collection; Comment Request; Process Evaluation of the NIH Roadmap Epigenomics Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-14

    ... assess program process and progress, is non-experimental. The assessment is based on secondary source... programs of the Agency. To reduce response bias and to make the survey as accessible as possible to busy principal investigators, the survey will be Web-based. Frequency of Response: Once. Affected Public...

  2. ATMOSPHERIC AMMONIA EMISSIONS FROM THE LIVESTOCK SECTOR: DEVELOPMENT AND EVALUATION OF A PROCESS-BASED MODELING APPROACH

    EPA Science Inventory

    We propose multi-faceted research to enhance our understanding of NH3 emissions from livestock feeding operations. A process-based emissions modeling approach will be used, and we will investigate ammonia emissions from the scale of the individual farm out to impacts on region...

  3. Typical uses of NASTRAN in a petrochemical industry

    NASA Technical Reports Server (NTRS)

    Winter, J. R.

    1978-01-01

    NASTRAN was principally used to perform failure analysis and to redesign process equipment. It was also employed in the evaluation of vendor designs and of proposed design modifications to existing process equipment. Stress analyses of forced draft fans, distillation trays, metal stacks, jacketed pipes, heat exchangers, large centrifugal fans, and agitator support structures are described.

  4. Integrating Vocational & Academic Education. A Handbook Featuring Four Demonstration Sites Including Students from Special Populations.

    ERIC Educational Resources Information Center

    Tindall, Lloyd W.; And Others

    This handbook describes the processes and techniques used to develop, implement, and evaluate four integrated vocational and academic learning programs in Wisconsin that included students from special populations. The handbook contains seven chapters. Chapter 1 presents an overview of the project, including the request for proposal process and…

  5. Titan Science Return Quantification

    NASA Technical Reports Server (NTRS)

    Weisbin, Charles R.; Lincoln, William

    2014-01-01

    Each proposal for a NASA mission concept includes a Science Traceability Matrix (STM), intended to show that what is being proposed would contribute to satisfying one or more of the agency's top-level science goals. But the information traditionally provided cannot be used directly to quantitatively compare anticipated science return. We added numerical elements to NASA's STM and developed a software tool to process the data. We then applied this methodology to evaluate a group of competing concepts for a proposed mission to Saturn's moon, Titan.

  6. Data warehouse model for monitoring key performance indicators (KPIs) using goal oriented approach

    NASA Astrophysics Data System (ADS)

    Abdullah, Mohammed Thajeel; Ta'a, Azman; Bakar, Muhamad Shahbani Abu

    2016-08-01

    The growth and development of universities, as with other organizations, depends on their ability to strategically plan and implement development blueprints that are in line with their vision and mission statements. The actualization of these statements, which are often designed into goals and sub-goals linked to their respective actors, is better measured by defining key performance indicators (KPIs) for the university. This study proposes ReGADaK, an extension of the GRAnD approach, which highlights the facts, dimensions, attributes, measures, and KPIs of the organization. The measures from the goal analysis serve as the basis for developing the related university KPIs. The proposed data warehouse schema is evaluated through expert review, prototyping, and usability evaluation. The findings from these evaluation processes suggest that the proposed data warehouse schema is suitable for monitoring the university's KPIs.

  7. EVALUATION OF DATA OBTAINED ON "MANUFACTURING PROCESS" DEVELOPMENT BUNDLES PD 1 THROUGH 5 PRIOR TO MACHINING OPERATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frankhouser, W.L.; Eyler, J.H.

    1956-07-24

    Five reference fuel rod bundles were welded and evaluated dimensionally. Dimensional data are presented for the as-welded condition and for the annealed bundle with spacer strips removed (prior to the final machining operations). The welding sequence developed for Core Manufacturing should provide "A" bundles with respect to rod spacing measurements. It will probably not be possible to meet the same requirements for water channel averages, because the design tolerances are not consistent with some factors inherent to the production process. A method to improve this situation is presented. The data presented were evaluated in a fashion similar to that which would be used in the proposed scheme. Rods tended to bow, resulting in a slightly "barrel-shaped" bundle. It is believed this condition can be overcome by providing special bundle peripheral clamps during annealing. Rod distortion should also be reduced by a redesign and relocation of the strip spacers. The new design is proposed. (auth)

  8. Business process architectures: overview, comparison and framework

    NASA Astrophysics Data System (ADS)

    Dijkman, Remco; Vanderfeesten, Irene; Reijers, Hajo A.

    2016-02-01

    With the uptake of business process modelling in practice, the demand grows for guidelines that lead to consistent and integrated collections of process models. The notion of a business process architecture has been explicitly proposed to address this. This paper provides an overview of the prevailing approaches to design a business process architecture. Furthermore, it includes evaluations of the usability and use of the identified approaches. Finally, it presents a framework for business process architecture design that can be used to develop a concrete architecture. The use and usability were evaluated in two ways. First, a survey was conducted among 39 practitioners, in which the opinion of the practitioners on the use and usefulness of the approaches was evaluated. Second, four case studies were conducted, in which process architectures from practice were analysed to determine the approaches or elements of approaches that were used in their design. Both evaluations showed that practitioners have a preference for using approaches that are based on reference models and approaches that are based on the identification of business functions or business objects. At the same time, the evaluations showed that practitioners use these approaches in combination, rather than selecting a single approach.

  9. A dynamic vulnerability evaluation model to smart grid for the emergency response

    NASA Astrophysics Data System (ADS)

    Yu, Zhen; Wu, Xiaowei; Fang, Diange

    2018-01-01

    The smart grid shows significant vulnerability to natural disasters and external destruction. Based on how important facilities are affected by typical kinds of natural disaster and external destruction, this paper builds a vulnerability evaluation index system for important smart grid facilities covering eight typical natural disasters, with three levels of static and dynamic indicators, forty indicators in total. A smart grid vulnerability evaluation method is then proposed based on this index system, including determining the value range of each index, classifying the evaluation grade standards, and giving the evaluation process and integrated index calculation rules. The proposed evaluation model can identify the most vulnerable parts of the smart grid and thus help in adopting targeted emergency response measures, developing emergency plans, and increasing the grid's capacity for disaster prevention and mitigation, guaranteeing its safe and stable operation.

  10. Why (and how) should we study the interplay between emotional arousal, Theory of Mind, and inhibitory control to understand moral cognition?

    PubMed

    Buon, Marine; Seara-Cardoso, Ana; Viding, Essi

    2016-12-01

    Findings in the field of experimental psychology and cognitive neuroscience have shed new light on our understanding of the psychological and biological bases of morality. Although a lot of attention has been devoted to understanding the processes that underlie complex moral dilemmas, attempts to represent the way in which individuals generate moral judgments when processing basic harmful actions are rare. Here, we will outline a model of morality which proposes that the evaluation of basic harmful actions relies on complex interactions between emotional arousal, Theory of Mind (ToM) capacities, and inhibitory control resources. This model makes clear predictions regarding the cognitive processes underlying the development of and ability to generate moral judgments. We draw on data from developmental and cognitive psychology, cognitive neuroscience, and psychopathology research to evaluate the model and propose several conceptual and methodological improvements that are needed to further advance our understanding of moral cognition and its development.

  11. Flood inundation extent mapping based on block compressed tracing

    NASA Astrophysics Data System (ADS)

    Shen, Dingtao; Rui, Yikang; Wang, Jiechen; Zhang, Yu; Cheng, Liang

    2015-07-01

    Flood inundation extent, depth, and duration are important factors affecting flood hazard evaluation. At present, flood inundation analysis is based mainly on a seeded region-growing algorithm, which is inefficient because it requires excessive recursive computation and cannot process massive datasets. To address this problem, we propose a block compressed tracing algorithm for mapping the flood inundation extent, which reads the DEM data in blocks before transferring them to raster compression storage. This allows a smaller computer memory to process a larger amount of data, which solves the problem of the regular seeded region-growing algorithm. In addition, the use of a raster boundary tracing technique allows the algorithm to avoid the time-consuming computations required by seeded region-growing. Finally, we conduct a comparative evaluation in the Chin-sha River basin; the results show that the proposed method solves the problem of flood inundation extent mapping on massive DEM datasets with higher computational efficiency than the original method, making it suitable for practical applications.
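
    For orientation only, the sketch below shows plain queue-based (non-recursive) inundation region growing on a toy DEM; it illustrates the recursion problem that the block compressed tracing algorithm is designed to avoid, but it is not the authors' block-compression or boundary-tracing scheme, and the DEM, seed, and water level are assumed values.

    ```python
    # Hedged sketch: queue-based inundation mapping on a toy DEM (plain region
    # growing, shown for contrast with the block compressed tracing described above).
    from collections import deque
    import numpy as np

    dem = np.array([[3, 3, 4, 5],
                    [2, 1, 2, 5],
                    [2, 1, 1, 4],
                    [3, 2, 1, 3]], dtype=float)
    water_level = 2.0
    seed = (1, 1)                                   # a known wet cell

    inundated = np.zeros_like(dem, dtype=bool)
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        if inundated[r, c] or dem[r, c] > water_level:
            continue
        inundated[r, c] = True                     # mark cell wet, then visit 4-neighbours
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < dem.shape[0] and 0 <= nc < dem.shape[1] and not inundated[nr, nc]:
                queue.append((nr, nc))
    print(inundated.astype(int))
    ```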

  12. Evaluation Criteria for Solid Waste Processing Research and Technology Development

    NASA Technical Reports Server (NTRS)

    Levri, Julie A.; Hogan, J. A.; Alazraki, M. P.

    2001-01-01

    A preliminary list of criteria is proposed for the evaluation of solid waste processing technologies for research and technology development (R&TD) in the Advanced Life Support (ALS) Program. Completion of the proposed list by current and prospective ALS technology developers, with regard to specific missions of interest, may enable identification of appropriate technologies (or the lack thereof) and guide future development efforts for the ALS Program solid waste processing area. An attempt is made to include criteria that capture information about the technology of interest as well as its system-wide impacts. Some of the criteria in the list are mission-independent, while the majority are mission-specific. In order for technology developers to respond to mission-specific criteria, critical information must be available on the quantity, composition, and state of the waste stream, the waste processing requirements, as well as top-level mission scenario information (e.g. safety, resource recovery, planetary protection issues, and ESM equivalencies). The technology readiness level (TRL) determines the degree to which a technology developer is able to accurately report on the list of criteria. Thus, a criteria-specific minimum TRL for mandatory reporting has been identified for each criterion in the list. Although this list has been developed to define criteria needed to direct funding of solid waste processing technologies, it possesses significant overlap with criteria required for technology selection for inclusion in specific tests or missions. Additionally, this approach to technology evaluation may be adapted to other ALS subsystems.

  13. Use of social media in health promotion: purposes, key performance indicators, and evaluation metrics.

    PubMed

    Neiger, Brad L; Thackeray, Rosemary; Van Wagenen, Sarah A; Hanson, Carl L; West, Joshua H; Barnes, Michael D; Fagen, Michael C

    2012-03-01

    Despite the expanding use of social media, little has been published about its appropriate role in health promotion, and even less has been written about evaluation. The purpose of this article is threefold: (a) outline purposes for social media in health promotion, (b) identify potential key performance indicators associated with these purposes, and (c) propose evaluation metrics for social media related to the key performance indicators. Process evaluation is presented in this article as an overarching evaluation strategy for social media.

  14. Equity Theory Ratios as Causal Schemas.

    PubMed

    Arvanitis, Alexios; Hantzi, Alexandra

    2016-01-01

    Equity theory approaches justice evaluations based on ratios of exchange inputs to exchange outcomes. Situations are evaluated as just if ratios are equal and unjust if unequal. We suggest that equity ratios serve a more fundamental cognitive function than the evaluation of justice. More particularly, we propose that they serve as causal schemas for exchange outcomes, that is, they assist in determining whether certain outcomes are caused by inputs of other people in the context of an exchange process. Equality or inequality of ratios in this sense points to an exchange process. Indeed, Study 1 shows that different exchange situations, such as disproportional or balanced proportional situations, create perceptions of give-and-take on the basis of equity ratios. Study 2 shows that perceptions of justice are based more on communicatively accepted rules of interaction than equity-based evaluations, thereby offering a distinction between an attribution and an evaluation cognitive process for exchange outcomes.

  15. Equity Theory Ratios as Causal Schemas

    PubMed Central

    Arvanitis, Alexios; Hantzi, Alexandra

    2016-01-01

    Equity theory approaches justice evaluations based on ratios of exchange inputs to exchange outcomes. Situations are evaluated as just if ratios are equal and unjust if unequal. We suggest that equity ratios serve a more fundamental cognitive function than the evaluation of justice. More particularly, we propose that they serve as causal schemas for exchange outcomes, that is, they assist in determining whether certain outcomes are caused by inputs of other people in the context of an exchange process. Equality or inequality of ratios in this sense points to an exchange process. Indeed, Study 1 shows that different exchange situations, such as disproportional or balanced proportional situations, create perceptions of give-and-take on the basis of equity ratios. Study 2 shows that perceptions of justice are based more on communicatively accepted rules of interaction than equity-based evaluations, thereby offering a distinction between an attribution and an evaluation cognitive process for exchange outcomes. PMID:27594846
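
    The ratio comparison at the core of both records above can be stated compactly; here O and I are assumed shorthand for a person's exchange outcomes and inputs (notation chosen for illustration, not quoted from the article):

    ```latex
    \[
    \frac{O_A}{I_A} = \frac{O_B}{I_B} \;\Rightarrow\; \text{equity (evaluated as just)},
    \qquad
    \frac{O_A}{I_A} \neq \frac{O_B}{I_B} \;\Rightarrow\; \text{inequity (evaluated as unjust)}.
    \]
    ```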

  16. Fuzzy Evaluating Customer Satisfaction of Jet Fuel Companies

    NASA Astrophysics Data System (ADS)

    Cheng, Haiying; Fang, Guoyi

    Based on the market characteristics of jet fuel companies, the paper proposes an evaluation index system for jet fuel company customer satisfaction along five dimensions: time, business, security, fee, and service. A multi-level fuzzy evaluation model combining the analytic hierarchy process and fuzzy evaluation approaches is then given. Finally, a case of customer satisfaction evaluation at one jet fuel company is studied; the evaluation results reflect the perceptions of the company's customers, showing that the fuzzy evaluation model is effective and efficient.

  17. Automatic visual monitoring of welding procedure in stainless steel kegs

    NASA Astrophysics Data System (ADS)

    Leo, Marco; Del Coco, Marco; Carcagnì, Pierluigi; Spagnolo, Paolo; Mazzeo, Pier Luigi; Distante, Cosimo; Zecca, Raffaele

    2018-05-01

    In this paper, a system for automatic visual monitoring of the welding process in dry stainless steel kegs for food storage is proposed. In the considered manufacturing process, the upper and lower skirts are welded to the vessel by means of Tungsten Inert Gas (TIG) welding. During the process several problems can arise: (1) residuals on the bottom, (2) darker weld, (3) excessive/poor penetration, and (4) outgrowths. The proposed system deals with all four of the aforementioned problems, and its inspection performance has been evaluated using a large set of kegs, demonstrating both its reliability in terms of defect detection and its suitability for introduction into the manufacturing system in terms of computational costs.

  18. Preliminary assessment of the aquatic impacts of a proposed defense waste processing facility at the Savannah River Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mackey, H.E. Jr.

    1979-01-01

    A review of the literature indicates that a significant body of descriptive information exists concerning the aquatic ecology of Upper Three Runs Creek and Four Mile Creek of the Savannah River Plant south of Aiken, South Carolina. This information is adequate for preparation of an environmental document evaluating these streams. These streams will be impacted by construction and operation of a proposed Defense Waste Processing Facility for solidification of high level defense waste. Potential impacts include (1) construction runoff, erosion, and siltation, (2) effluents from a chemical and industrial waste treatment facility, and (3) radionuclide releases. In order to better evaluate potential impacts, recommend mitigation methods, and comply with NEPA requirements, additional quantitative biological information should be obtained through implementation of an aquatic baseline program.

  19. Techno-economic analysis of extraction-based separation systems for acetone, butanol, and ethanol recovery and purification.

    PubMed

    Grisales Díaz, Víctor Hugo; Olivar Tost, Gerard

    2017-01-01

    Dual extraction, high-temperature extraction, mixture extraction, and oleyl alcohol extraction have been proposed in the literature for acetone, butanol, and ethanol (ABE) production. However, energy and economic evaluation of extraction-based separation systems under similar assumptions is necessary. Hence, the new process proposed in this work, direct steam distillation (DSD), for regeneration of high-boiling extractants, was compared with several extraction-based separation systems. The evaluation was performed under similar assumptions through simulation in Aspen Plus V7.3 ® software. Two end distillation systems (with between 70 and 80 non-ideal stages) were studied. Heat integration and vacuum operation of some units were proposed, reducing the energy requirements. The energy requirement of the hybrid processes, at a substrate concentration of 200 g/l, was between 6.4 and 8.3 MJ-fuel/kg-ABE. Under ideal assumptions, with the water concentration in the substrate set equivalent to the extractant selectivity, the minimum energy requirements of the extraction-based separation systems were between 2.6 and 3.5 MJ-fuel/kg-ABE. The efficiencies of the recovery systems for the baseline case and the ideal evaluation were 0.53-0.57 and 0.81-0.84, respectively. The main advantages of DSD were the operation of the regeneration column at atmospheric pressure, the utilization of low-pressure steam, and the low energy requirements for preheating. The in situ recovery processes, DSD, and mixture extraction with conventional regeneration were the approaches with the lowest energy requirements and total annualized costs.

  20. Cache-Oblivious parallel SIMD Viterbi decoding for sequence search in HMMER

    PubMed Central

    2014-01-01

    Background HMMER is a commonly used bioinformatics tool based on Hidden Markov Models (HMMs) to analyze and process biological sequences. One of its main homology engines is based on the Viterbi decoding algorithm, which was already highly parallelized and optimized using Farrar’s striped processing pattern with Intel SSE2 instruction set extension. Results A new SIMD vectorization of the Viterbi decoding algorithm is proposed, based on an SSE2 inter-task parallelization approach similar to the DNA alignment algorithm proposed by Rognes. Besides this alternative vectorization scheme, the proposed implementation also introduces a new partitioning of the Markov model that allows a significantly more efficient exploitation of the cache locality. Such optimization, together with an improved loading of the emission scores, allows the achievement of a constant processing throughput, regardless of the innermost-cache size and of the dimension of the considered model. Conclusions The proposed optimized vectorization of the Viterbi decoding algorithm was extensively evaluated and compared with the HMMER3 decoder to process DNA and protein datasets, proving to be a rather competitive alternative implementation. Being always faster than the already highly optimized ViterbiFilter implementation of HMMER3, the proposed Cache-Oblivious Parallel SIMD Viterbi (COPS) implementation provides a constant throughput and offers a processing speedup as high as two times faster, depending on the model’s size. PMID:24884826

  1. A novel shape-changing haptic table-top display

    NASA Astrophysics Data System (ADS)

    Wang, Jiabin; Zhao, Lu; Liu, Yue; Wang, Yongtian; Cai, Yi

    2018-01-01

    A shape-changing table-top display with haptic feedback allows its users to perceive 3D visual and texture displays interactively. Since few existing devices are developed as accurate displays with regulated haptic feedback, a novel attentive and immersive shape-changing mechanical interface (SCMI) consisting of an image processing unit and a transformation unit was proposed in this paper. In order to support a precise 3D table-top display with an offset of less than 2 mm, a custom-made mechanism was developed to form a precise surface and regulate the feedback force. The proposed image processing unit was capable of extracting texture data from a 2D picture for rendering the shape-changing surface and realizing 3D modeling. The preliminary evaluation result proved the feasibility of the proposed system.

  2. A fuzzy MCDM framework based on fuzzy measure and fuzzy integral for agile supplier evaluation

    NASA Astrophysics Data System (ADS)

    Dursun, Mehtap

    2017-06-01

    Supply chains need to be agile in order to respond quickly to the changes in today's competitive environment. The success of an agile supply chain depends on the firm's ability to select the most appropriate suppliers. This study proposes a multi-criteria decision making technique for conducting an analysis based on a multi-level hierarchical structure and fuzzy logic for the evaluation of agile suppliers. The ideal and anti-ideal solutions are taken into consideration simultaneously in the developed approach. The proposed decision approach enables the decision-makers to use linguistic terms, and thus reduces their cognitive burden in the evaluation process. Furthermore, a hierarchy of evaluation criteria and their related sub-criteria is employed in the presented approach in order to conduct a more effective analysis.

  3. Damage Evaluation Based on a Wave Energy Flow Map Using Multiple PZT Sensors

    PubMed Central

    Liu, Yaolu; Hu, Ning; Xu, Hong; Yuan, Weifeng; Yan, Cheng; Li, Yuan; Goda, Riu; Alamusi; Qiu, Jinhao; Ning, Huiming; Wu, Liangke

    2014-01-01

    A new wave energy flow (WEF) map concept was proposed in this work. Based on it, an improved technique incorporating the laser scanning method and Betti's reciprocal theorem was developed to evaluate the shape and size of damage as well as to realize visualization of wave propagation. In this technique, a simple signal processing algorithm was proposed to construct the WEF map when waves propagate through an inspection region, and multiple lead zirconate titanate (PZT) sensors were employed to improve inspection reliability. Various damages in aluminum and carbon fiber reinforced plastic laminated plates were experimentally and numerically evaluated to validate this technique. The results show that it can effectively evaluate the shape and size of damage from wave field variations around the damage in the WEF map. PMID:24463430

  4. Aid effectiveness and programmatic effectiveness: a proposed framework for comparative evaluation of different aid interventions in a particular health system.

    PubMed

    Haque, Hasibul; Hill, Philip C; Gauld, Robin

    2017-01-01

    Against a backdrop of changing concepts of aid effectiveness, development effectiveness, health systems strengthening, and increasing emphasis on impact evaluation, this article proposes a theory-driven impact evaluation framework to gauge the effect of aid effectiveness principles on programmatic outcomes of different aid funded programs in the health sector of a particular country. The foundation and step-by-step process of implementing the framework are described. With empirical evidence from the field, the steps involve analysis of context, program designs, implementation mechanisms, outcomes, synthesis, and interpretation of findings through the programs' underlying program theories and interactions with the state context and health system. The framework can be useful for comparatively evaluating different aid interventions both in fragile and non-fragile state contexts.

  5. A citrus waste-based biorefinery as a source of renewable energy: technical advances and analysis of engineering challenges.

    PubMed

    Rivas-Cantu, Raul C; Jones, Kim D; Mills, Patrick L

    2013-04-01

    An assessment of recent technical advances on pretreatment processes and its effects on enzymatic hydrolysis as the main steps of a proposed citrus processing waste (CPW) biorefinery is presented. Engineering challenges and relevant gaps in scientific and technical information for reliable design, modeling and scale up of a CPW biorefinery are also discussed. Some integrated physico-chemical pretreatments are proposed for testing for CPW, including high speed knife-grinding and simultaneous caustic addition. These new proposed processes and the effect of parameters such as particle size, surface area and morphology, pore volume and chemical composition of the diverse fractions resulting from pretreatment and enzymatic hydrolysis need to be evaluated and compared for pretreated and untreated samples of grapefruit processing waste. This assessment suggests the potential for filling the data gaps, and preliminary results demonstrate that the reduction of particle size and the increased surface area for the CPW will result in higher reaction rates and monosaccharide yields for the pretreated waste material.

  6. Detection and quantification of flow consistency in business process models.

    PubMed

    Burattin, Andrea; Bernstein, Vered; Neurauter, Manuel; Soffer, Pnina; Weber, Barbara

    2018-01-01

    Business process models abstract complex business processes by representing them as graphical models. Their layout, as determined by the modeler, may have an effect when these models are used. However, this effect is currently not fully understood. In order to systematically study this effect, a basic set of measurable key visual features is proposed, depicting the layout properties that are meaningful to the human user. The aim of this research is thus twofold: first, to empirically identify key visual features of business process models which are perceived as meaningful to the user and second, to show how such features can be quantified into computational metrics, which are applicable to business process models. We focus on one particular feature, consistency of flow direction, and show the challenges that arise when transforming it into a precise metric. We propose three different metrics addressing these challenges, each following a different view of flow consistency. We then report the results of an empirical evaluation, which indicates which metric is more effective in predicting the human perception of this feature. Moreover, two other automatic evaluations describing the performance and the computational capabilities of our metrics are reported as well.
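
    As a hedged illustration of how a layout feature can be turned into a computable metric, the sketch below scores flow consistency as the share of edges drawn left-to-right, given node coordinates; the layout, the left-to-right convention, and the metric itself are assumptions and not necessarily one of the three metrics proposed in the paper.

    ```python
    # Hedged sketch: one possible flow-consistency metric for a laid-out process model.
    from typing import Dict, List, Tuple

    def flow_consistency(pos: Dict[str, Tuple[float, float]],
                         edges: List[Tuple[str, str]]) -> float:
        """Fraction of edges pointing left-to-right (assumed dominant flow direction)."""
        if not edges:
            return 1.0
        rightward = sum(1 for a, b in edges if pos[b][0] > pos[a][0])
        return rightward / len(edges)

    # Toy layout: one edge runs against the assumed left-to-right reading direction.
    pos = {"start": (0, 0), "check": (1, 0), "approve": (2, 0), "archive": (1, 1)}
    edges = [("start", "check"), ("check", "approve"), ("approve", "archive")]
    print(flow_consistency(pos, edges))   # 2/3
    ```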

  7. Small business innovation research: Program solicitation

    NASA Technical Reports Server (NTRS)

    1989-01-01

    This, the seventh annual SBIR solicitation by NASA, describes the program, identifies eligibility requirements, outlines the required proposal format and content, states proposal preparation and submission requirements, describes the proposal evaluation and award selection process, and provides other information to assist those interested in participating in NASA's SBIR program. It also identifies the Technical Topics and Subtopics in which SBIR Phase 1 proposals are solicited in 1989. These Topics and Subtopics cover a broad range of current NASA interests, but do not necessarily include all areas in which NASA plans or currently conducts research. High-risk, high-payoff innovations are desired.

  8. Graphical Technique to Support the Teaching/Learning Process of Software Process Reference Models

    NASA Astrophysics Data System (ADS)

    Espinosa-Curiel, Ismael Edrein; Rodríguez-Jacobo, Josefina; Fernández-Zepeda, José Alberto

    In this paper, we propose a set of diagrams to visualize software process reference models (PRM). The diagrams, called dimods, are the combination of some visual and process modeling techniques such as rich pictures, mind maps, IDEF and RAD diagrams. We show the use of this technique by designing a set of dimods for the Mexican Software Industry Process Model (MoProSoft). Additionally, we perform an evaluation of the usefulness of dimods. The result of the evaluation shows that dimods may be a support tool that facilitates the understanding, memorization, and learning of software PRMs in both, software development organizations and universities. The results also show that dimods may have advantages over the traditional description methods for these types of models.

  9. Evaluation of soil erosion risk using Analytic Network Process and GIS: a case study from Spanish mountain olive plantations.

    PubMed

    Nekhay, Olexandr; Arriaza, Manuel; Boerboom, Luc

    2009-07-01

    The study presents an approach that combined objective information such as sampling or experimental data with subjective information such as expert opinions. This combined approach was based on the Analytic Network Process method. It was applied to evaluate soil erosion risk and overcomes one of the drawbacks of USLE/RUSLE soil erosion models, namely that they do not consider interactions among soil erosion factors. Another advantage of this method is that it can be used if there are insufficient experimental data. The lack of experimental data can be compensated for through the use of expert evaluations. As an example of the proposed approach, the risk of soil erosion was evaluated in olive groves in Southern Spain, showing the potential of the ANP method for modelling a complex physical process like soil erosion.

  10. Merit Evaluation Of Competitors In Debate And Recitation Competitions By Fuzzy Approach

    NASA Astrophysics Data System (ADS)

    Mukherjee, Supratim; Bhattacharyya, Rupak; Chatterjee, Amitava; Kar, Samarjit

    2010-10-01

    Co-curricular activities have great importance in students' lives, especially for developing their personality and communication skills. The processes used to evaluate competitors in such competitions generally rely on crisp techniques. In this paper, we introduce a new fuzzy set theory based method for evaluating competitors in co-curricular activities such as debate and recitation competitions. The proposed method is illustrated by two examples.

  11. School Self-Evaluation at an Embryonic Stage: Depicting Teachers' Experiences with a Participative Project

    ERIC Educational Resources Information Center

    Karagiorgi, Yiasemina

    2012-01-01

    This case study aims to enquire into the journey of a Greek-Cypriot primary school through a self-evaluating process, in accordance to the respective guidelines proposed in the national educational reform documents. The article outlines the phases involved, beginning from the collection of information, moving to the formulation of a school…

  12. Covert Auditory Spatial Orienting: An Evaluation of the Spatial Relevance Hypothesis

    ERIC Educational Resources Information Center

    Roberts, Katherine L.; Summerfield, A. Quentin; Hall, Deborah A.

    2009-01-01

    The spatial relevance hypothesis (J. J. McDonald & L. M. Ward, 1999) proposes that covert auditory spatial orienting can only be beneficial to auditory processing when task stimuli are encoded spatially. We present a series of experiments that evaluate 2 key aspects of the hypothesis: (a) that "reflexive activation of location-sensitive neurons is…

  13. Toward an agenda for evaluation of qualitative research.

    PubMed

    Stige, Brynjulf; Malterud, Kirsti; Midtgarden, Torjus

    2009-10-01

    Evaluation is essential for research quality and development, but the diversity of traditions that characterize qualitative research suggests that general checklists or shared criteria for evaluation are problematic. We propose an approach to research evaluation that encourages reflexive dialogue through use of an evaluation agenda. In proposing an evaluation agenda we shift attention from rule-based judgment to reflexive dialogue. Unlike criteria, an agenda may embrace pluralism, and does not request consensus on ontological, epistemological, and methodological issues, only consensus on what themes warrant discussion. We suggest an evaluation agenda, EPICURE, with two dimensions communicated through the use of two acronyms. The first, EPIC, refers to the challenge of producing rich and substantive accounts based on engagement, processing, interpretation, and (self-)critique. The second, CURE, refers to the challenge of dealing with preconditions and consequences of research, with a focus on (social) critique, usefulness, relevance, and ethics. The seven items of the composite agenda EPICURE are presented and exemplified. Features and implications of the agenda approach to research evaluation are then discussed.

  14. Risk analysis within environmental impact assessment of proposed construction activity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeleňáková, Martina; Zvijáková, Lenka

    Environmental impact assessment is an important process, prior to approval of the investment plan, providing a detailed examination of the likely and foreseeable impacts of a proposed construction activity on the environment. The objective of this paper is to develop a specific methodology for the analysis and evaluation of the environmental impacts of selected constructions – flood protection structures – using risk analysis methods. The application of a methodology designed for the environmental impact assessment process will develop assumptions for further improvements or more effective implementation and performance of this process. The main objective of the paper is to improve the implementation of the environmental impact assessment process. Through the use of risk analysis methods in the environmental impact assessment process, the set objective has been achieved. - Highlights: This paper is informed by an effort to develop research with the aim of: • Improving existing qualitative and quantitative methods for assessing the impacts • A better understanding of relations between probabilities and consequences • Methodology for the EIA of flood protection constructions based on risk analysis • Creative approaches in the search for environmentally friendly proposed activities.

  15. Research environments that promote integrity.

    PubMed

    Jeffers, Brenda Recchia; Whittemore, Robin

    2005-01-01

    The body of empirical knowledge about research integrity and the factors that promote research integrity in nursing research environments remains small. To propose an internal control model as an innovative framework for the design and structure of nursing research environments that promote integrity. An internal control model is adapted to illustrate its use for conceptualizing and designing research environments that promote integrity. The internal control model integrates both the organizational elements necessary to promote research integrity and the processes needed to assess research environments. The model provides five interrelated process components within which any number of research integrity variables and processes may be used and studied: internal control environment, risk assessment, internal control activities, monitoring, and information and communication. The components of the proposed research integrity internal control model comprise an integrated conceptualization of the processes that provide reasonable assurance that research integrity will be promoted within the nursing research environment. Schools of nursing can use the model to design, implement, and evaluate systems that promote research integrity. The model process components need further exploration to substantiate the use of the model in nursing research environments.

  16. Evaluation protocol for amusia: Portuguese sample.

    PubMed

    Peixoto, Maria Conceição; Martins, Jorge; Teixeira, Pedro; Alves, Marisa; Bastos, José; Ribeiro, Carlos

    2012-12-01

    Amusia is a disorder that affects the processing of music. Part of this processing happens in the primary auditory cortex. The study of this condition allows us to evaluate the central auditory pathways. To explore the diagnostic evaluation tests for amusia. The authors propose an evaluation protocol for patients with suspected amusia (after brain injury or complaints of poor musical perception), in parallel with the assessment of central auditory processing already implemented in the department. The Montreal Battery of Evaluation of Amusia was the basis for the selection of the tests. From this comprehensive battery of tests we selected some of the musical examples to evaluate different musical aspects, including memory and perception of music and the ability to recognize and discriminate music. In terms of memory, there is a test for assessing delayed memory, adapted to Portuguese culture. Prospective study. Although still experimental, with the possibility of adjustments in the assessment, we believe that this assessment, combined with the study of central auditory processing, will allow us to understand some central lesions and congenital or acquired limitations of hearing perception.

  17. 78 FR 60287 - Agency Information Collection Activities; Proposed Collection; Comment Request; Testing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-01

    ... processes will provide the better understanding of target audiences that FDA needs to design effective... reaction to the messages in either individual or group settings. Third, as evaluative research, it will...

  18. A colored petri nets based workload evaluation model and its validation through Multi-Attribute Task Battery-II.

    PubMed

    Wang, Peng; Fang, Weining; Guo, Beiyuan

    2017-04-01

    This paper proposes a colored Petri net based workload evaluation model. A formal interpretation of workload is first introduced, based on the mapping of Petri net components to the task. A Petri net based description of Multiple Resources theory is given, interpreting the theory from a new angle. A new application of the VACP rating scales, named the V/A-C-P unit, and a definition of colored transitions are proposed to build a model of the task process. The calculation of workload has the following four main steps: determine the tokens' initial positions and values; calculate the weights of the directed arcs on the basis of the proposed rules; calculate the workload from the different transitions; and correct for the influence of repetitive behaviors. Verification experiments were carried out based on the Multi-Attribute Task Battery-II software. Our results show that there is a strong correlation between the model values and NASA Task Load Index scores (r=0.9513). In addition, this method can also distinguish behavior characteristics between different people. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. An Experience Oriented-Convergence Improved Gravitational Search Algorithm for Minimum Variance Distortionless Response Beamforming Optimum

    PubMed Central

    Darzi, Soodabeh; Tiong, Sieh Kiong; Tariqul Islam, Mohammad; Rezai Soleymanpour, Hassan; Kibria, Salehin

    2016-01-01

    An experience oriented-convergence improved gravitational search algorithm (ECGSA), based on two new modifications, searching through the best experiments and use of a dynamic gravitational damping coefficient (α), is introduced in this paper. ECGSA saves its best fitness function evaluations and uses those as the agents' positions in the searching process. In this way, the optimal trajectories found are retained and the search starts from these trajectories, which allows the algorithm to avoid local optima. Also, the agents can move faster in the search space to obtain better exploration during the first stage of the searching process, and they can converge rapidly to the optimal solution at the final stage of the search by means of the proposed dynamic gravitational damping coefficient. The performance of ECGSA has been evaluated by applying it to eight standard benchmark functions along with six complicated composite test functions. It is also applied to the adaptive beamforming problem as a practical issue, to improve the weight vectors computed by the minimum variance distortionless response (MVDR) beamforming technique. The results of the proposed algorithm are compared with some well-known heuristic methods, verifying the proposed method in terms of both reaching optimal solutions and robustness. PMID:27399904
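
    A compressed sketch of a gravitational-search-style update loop with a time-varying damping coefficient α, in the spirit of the modification described above; the α schedule, the constants, and the sphere test function are assumptions rather than the ECGSA settings, and the experience-memory mechanism is omitted.

    ```python
    # Hedged sketch: basic gravitational search iteration with a dynamic damping
    # coefficient; constants and the alpha schedule are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(1)

    def sphere(x):                                  # benchmark objective (to minimize)
        return np.sum(x**2, axis=1)

    n_agents, dim, iters, G0 = 20, 5, 100, 100.0
    X = rng.uniform(-10, 10, size=(n_agents, dim))
    V = np.zeros_like(X)
    best_x, best_f = None, np.inf

    for t in range(iters):
        f = sphere(X)
        if f.min() < best_f:
            best_f, best_x = f.min(), X[f.argmin()].copy()
        alpha = 10 + 10 * t / iters                 # assumed dynamic damping schedule
        G = G0 * np.exp(-alpha * t / iters)         # gravitational "constant" decays over time
        m = (f.max() - f) / (f.max() - f.min() + 1e-12)
        M = m / (m.sum() + 1e-12)                   # normalized agent masses
        acc = np.zeros_like(X)
        for i in range(n_agents):
            diff = X - X[i]
            dist = np.linalg.norm(diff, axis=1, keepdims=True) + 1e-12
            acc[i] = np.sum(rng.random((n_agents, 1)) * G * M[:, None] * diff / dist, axis=0)
        V = rng.random(X.shape) * V + acc
        X = np.clip(X + V, -10, 10)

    print("best objective found:", round(best_f, 6))
    ```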

  20. Capital Budgeting Guidelines: How to Decide Whether to Fund a New Dorm or an Upgraded Computer Lab.

    ERIC Educational Resources Information Center

    Swiger, John; Klaus, Allen

    1996-01-01

    A process for college and university decision making and budgeting for capital outlays that focuses on evaluating the qualitative and quantitative benefits of each proposed project is described and illustrated. The process provides a means to solicit suggestions from those involved and provide detailed information for cost-benefit analysis. (MSE)

  1. Evaluating the Turkish Higher Education Law and Proposals in the Light of ERASMUS Goals

    ERIC Educational Resources Information Center

    Dolasir, Semiyha; Tuncel, Fehmi

    2006-01-01

    Educational unity among European Community countries is very important in the process of unifying Europe. Hence, with the aim of strengthening a stable, determined, and democratic society, the education ministries of 29 European countries started the process of unifying education by signing the Bologna Declaration on June 19, 1999. SOCRATES and…

  2. Analysis and evaluation in the production process and equipment area of the low-cost solar array project. [including modifying gaseous diffusion and using ion implantation]

    NASA Technical Reports Server (NTRS)

    Goldman, H.; Wolf, M.

    1979-01-01

    The manufacturing methods for photovoltaic solar energy utilization are assessed. Economic and technical data on the current front junction formation processes of gaseous diffusion and ion implantation are presented. Future proposals to decrease the cost of junction formation, including modifying gaseous diffusion and using ion implantation, are studied. Technology developments in current processes and an economic evaluation of the processes are included.

  3. A Scientific Workflow System for Satellite Data Processing with Real-Time Monitoring

    NASA Astrophysics Data System (ADS)

    Nguyen, Minh Duc

    2018-02-01

    This paper provides a case study on satellite data processing, storage, and distribution in the space weather domain by introducing the Satellite Data Downloading System (SDDS). The approach proposed in this paper was evaluated through real-world scenarios and addresses the challenges related to the specific field. Although SDDS is used for satellite data processing, it can potentially be adapted to a wide range of data processing scenarios in other fields of physics.

  4. Using a fuzzy comprehensive evaluation method to determine product usability: A test case

    PubMed Central

    Zhou, Ronggang; Chan, Alan H. S.

    2016-01-01

    BACKGROUND: In order to take into account the inherent uncertainties during product usability evaluation, Zhou and Chan [1] proposed a comprehensive method of usability evaluation for products by combining the analytic hierarchy process (AHP) and fuzzy evaluation methods for synthesizing performance data and subjective response data. This method was designed to provide an integrated framework combining the inevitable vague judgments from the multiple stages of the product evaluation process. OBJECTIVE AND METHODS: In order to illustrate the effectiveness of the model, this study used a summative usability test case to assess the application and strength of the general fuzzy usability framework. To test the proposed fuzzy usability evaluation framework [1], a standard summative usability test was conducted to benchmark the overall usability of a specific network management software. Based on the test data, the fuzzy method was applied to incorporate both the usability scores and uncertainties involved in the multiple components of the evaluation. Then, with Monte Carlo simulation procedures, confidence intervals were used to compare the reliabilities among the fuzzy approach and two typical conventional methods combining metrics based on percentages. RESULTS AND CONCLUSIONS: This case study showed that the fuzzy evaluation technique can be applied successfully for combining summative usability testing data to achieve an overall usability quality for the network software evaluated. Greater differences of confidence interval widths between the method of averaging equally percentage and weighted evaluation method, including the method of weighted percentage averages, verified the strength of the fuzzy method. PMID:28035942
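
    A minimal sketch of the combination-plus-uncertainty idea discussed above: per-component usability scores are merged with assumed weights and a bootstrap (Monte Carlo) confidence interval is computed for the overall score; the scores, weights, and resample count are illustrative and do not reproduce the paper's fuzzy framework.

    ```python
    # Hedged sketch: weighted overall usability score with a bootstrap confidence interval.
    import numpy as np

    rng = np.random.default_rng(7)
    # Per-participant scores (0-1) for three components: effectiveness, efficiency, satisfaction.
    scores = np.array([[0.9, 0.7, 0.8],
                       [0.6, 0.8, 0.7],
                       [1.0, 0.6, 0.9],
                       [0.8, 0.9, 0.6],
                       [0.7, 0.7, 0.8]])
    weights = np.array([0.5, 0.3, 0.2])             # assumed component weights

    def overall(sample):
        """Weighted mean of component means for one (re)sample of participants."""
        return float(sample.mean(axis=0) @ weights)

    boot = np.array([overall(scores[rng.integers(0, len(scores), len(scores))])
                     for _ in range(5000)])
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"overall usability: {overall(scores):.3f}  95% CI: [{lo:.3f}, {hi:.3f}]")
    ```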

  5. Using a fuzzy comprehensive evaluation method to determine product usability: A test case.

    PubMed

    Zhou, Ronggang; Chan, Alan H S

    2017-01-01

    In order to take into account the inherent uncertainties during product usability evaluation, Zhou and Chan [1] proposed a comprehensive method of usability evaluation for products by combining the analytic hierarchy process (AHP) and fuzzy evaluation methods for synthesizing performance data and subjective response data. This method was designed to provide an integrated framework combining the inevitable vague judgments from the multiple stages of the product evaluation process. In order to illustrate the effectiveness of the model, this study used a summative usability test case to assess the application and strength of the general fuzzy usability framework. To test the proposed fuzzy usability evaluation framework [1], a standard summative usability test was conducted to benchmark the overall usability of a specific network management software. Based on the test data, the fuzzy method was applied to incorporate both the usability scores and uncertainties involved in the multiple components of the evaluation. Then, with Monte Carlo simulation procedures, confidence intervals were used to compare the reliabilities among the fuzzy approach and two typical conventional methods combining metrics based on percentages. This case study showed that the fuzzy evaluation technique can be applied successfully for combining summative usability testing data to achieve an overall usability quality for the network software evaluated. Greater differences of confidence interval widths between the method of averaging equally percentage and weighted evaluation method, including the method of weighted percentage averages, verified the strength of the fuzzy method.

  6. A nonlinear quality-related fault detection approach based on modified kernel partial least squares.

    PubMed

    Jiao, Jianfang; Zhao, Ning; Wang, Guang; Yin, Shen

    2017-01-01

    In this paper, a new nonlinear quality-related fault detection method is proposed based on a kernel partial least squares (KPLS) model. To deal with the nonlinear characteristics among process variables, the proposed method maps the original variables into a feature space in which the linear relationship between the kernel matrix and the output matrix is realized by means of KPLS. The kernel matrix is then decomposed into two orthogonal parts by singular value decomposition (SVD), and the statistics for each part are determined appropriately for the purpose of quality-related fault detection. Compared with relevant existing nonlinear approaches, the proposed method has the advantages of simple diagnosis logic and stable performance. A widely used literature example and an industrial process are used for the performance evaluation of the proposed method. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
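
    A heavily simplified, linear stand-in for the quality-related/unrelated decomposition described above: fit a (linear) PLS model, split the input space into the span of the regression coefficients and its orthogonal complement via SVD, and monitor a squared projection norm in each part. The kernelization, the proper T² scaling, and the control limits from the paper are omitted; the data and the injected fault are toy assumptions.

    ```python
    # Hedged sketch: linear PLS + SVD split into quality-related and quality-unrelated
    # subspaces, monitoring squared projection norms (a simplification of the KPLS scheme).
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(2)
    X = rng.normal(size=(300, 6))
    y = X[:, :2] @ np.array([1.0, -0.5]) + 0.05 * rng.normal(size=300)   # quality variable

    pls = PLSRegression(n_components=3).fit(X, y)
    B = pls.coef_.reshape(-1, 1)             # regression coefficients span the quality-related directions
    U, s, Vt = np.linalg.svd(B @ B.T)
    k = 1                                    # rank of the quality-related subspace (single output)
    P_y, P_o = U[:, :k], U[:, k:]            # quality-related / quality-unrelated projections

    def stat(x, P):
        """Squared norm of the projection of sample x onto subspace P (no covariance scaling)."""
        z = x @ P
        return float(z @ z)

    x_new = X[0] + np.array([2.0, 0, 0, 0, 0, 0])    # fault injected on a quality-related variable
    print("quality-related stat:", round(stat(x_new, P_y), 3),
          "quality-unrelated stat:", round(stat(x_new, P_o), 3))
    ```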

  7. Evaluation of Moving Object Detection Based on Various Input Noise Using Fixed Camera

    NASA Astrophysics Data System (ADS)

    Kiaee, N.; Hashemizadeh, E.; Zarrinpanjeh, N.

    2017-09-01

    Detecting and tracking objects in video has been a research area of interest in the fields of image processing and computer vision. This paper evaluates the performance of a novel object detection method for video sequences, which helps clarify the advantages of the method being used. The proposed framework compares the percentages of correct and incorrect detections of the algorithm. The method was evaluated with data collected in the field of urban transport, which include cars and pedestrians in a fixed-camera setting. The results show that the accuracy of the algorithm decreases as image resolution is reduced.

  8. Proposed center for advanced industrial processes. Washington State University, College of Engineering and Architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-03-01

    The DOE proposes to authorize Washington State University (WSU) to proceed with the detailed design, construction, and equipping of the proposed Center for Advanced Industrial Processes (CAIP). The proposed project would involve construction of a three story building containing laboratories, classrooms, seminar rooms, and graduate student and administrative office space. Existing buildings would be demolished. The proposed facility would house research in thermal/fluid sciences, bioengineering, manufacturing processes, and materials processing. Under the "no-action" alternative, DOE would not authorize WSU to proceed with construction under the grant. WSU would then need to consider alternatives for proceeding without DOE funds. Such alternatives (including delaying or scaling back the project) would result in a postponement or slight reduction in the minor adverse environmental, safety and health impacts of the project evaluated in this assessment. More importantly, these alternatives would affect the important environmental, safety, health, and programmatic benefits of the projects. The surrounding area is fully urbanized and the campus is intensely developed around the proposed site. The buildings scheduled for demolition do not meet State energy codes, are not air conditioned, and lack handicapped access. Sensitive resources (historical/archeological, protected species/critical habitats, wetlands/floodplains, national forests/parks/trails, prime farmland and special sources of water) would not be affected as they do not occur on or near the proposed site. Cumulative impacts would be small. The proposed action is not related to other actions being considered under other NEPA reviews. There is no conflict between the proposed action and any applicable Federal, State, regional or local land use plans and policies.

  9. A fast fusion scheme for infrared and visible light images in NSCT domain

    NASA Astrophysics Data System (ADS)

    Zhao, Chunhui; Guo, Yunting; Wang, Yulei

    2015-09-01

    Fusion of infrared and visible light images is an effective way to obtain a simultaneous visualization of the background details provided by the visible light image and the hidden target information provided by the infrared image, which is more suitable for browsing and further processing. Two crucial goals for infrared and visible light image fusion are improving fusion performance and reducing the computational burden. In this paper, a novel fusion algorithm named pixel information estimation is proposed, which determines the fusion weights by evaluating the information content of each pixel and is well suited to visible light and infrared image fusion, giving better fusion quality with lower time consumption. In addition, a fast realization of the non-subsampled contourlet transform is also proposed in this paper to improve computational efficiency. To verify the advantage of the proposed method, this paper compares it with several popular methods on six evaluation metrics over four different image groups. Experimental results show that the proposed algorithm produces a more effective result in much less time and performs well in both subjective evaluation and objective indicators.

  10. Gaussian process regression for sensor networks under localization uncertainty

    USGS Publications Warehouse

    Jadaliha, M.; Xu, Yunfei; Choi, Jongeun; Johnson, N.S.; Li, Weiming

    2013-01-01

    In this paper, we formulate Gaussian process regression with observations subject to the localization uncertainty that arises in resource-constrained sensor networks. In our formulation, the effects of observations, measurement noise, localization uncertainty, and prior distributions are all correctly incorporated in the posterior predictive statistics. The analytically intractable posterior predictive statistics are approximated by two techniques, viz., Monte Carlo sampling and Laplace's method. These approximation techniques have been carefully tailored to our problems, and their approximation error and complexity are analyzed. A simulation study demonstrates that the proposed approaches perform much better than approaches that do not properly account for the localization uncertainty. Finally, we have applied the proposed approaches to experimentally collected real data from a dye concentration field over a section of a river and a temperature field of an outdoor swimming pool to provide proof-of-concept tests and evaluate the proposed schemes in real situations. In both simulation and experimental results, the proposed methods outperform the quick-and-dirty solutions often used in practice.
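    The Monte Carlo approximation mentioned above can be sketched for a one-dimensional field: sample plausible sensor locations around the reported ones, compute the standard GP posterior mean for each sample, and average. The kernel, noise levels, and the Gaussian location model below are assumptions for illustration, not the paper's exact formulation.

    ```python
    import numpy as np

    def rbf(a, b, ell=1.0, sf=1.0):
        d2 = (a[:, None] - b[None, :]) ** 2
        return sf ** 2 * np.exp(-0.5 * d2 / ell ** 2)

    rng = np.random.default_rng(1)
    x_true = np.sort(rng.uniform(0, 10, 15))            # true (unknown) sensor positions
    sigma_loc = 0.3                                      # assumed localization std
    x_rep = x_true + sigma_loc * rng.normal(size=x_true.size)   # reported positions
    y = np.sin(x_true) + 0.1 * rng.normal(size=x_true.size)     # measurements

    x_star = np.linspace(0, 10, 100)                     # prediction grid
    sigma_n, M = 0.1, 500
    mu_acc = np.zeros_like(x_star)
    for _ in range(M):
        # One hypothesis of the true locations, drawn around the reported ones
        x_s = x_rep + sigma_loc * rng.normal(size=x_rep.size)
        K = rbf(x_s, x_s) + sigma_n ** 2 * np.eye(x_s.size)
        mu_acc += rbf(x_star, x_s) @ np.linalg.solve(K, y)
    mu_mc = mu_acc / M                                   # MC posterior predictive mean
    print(mu_mc[:5].round(2))
    ```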

  11. Distributed processing of a GPS receiver network for a regional ionosphere map

    NASA Astrophysics Data System (ADS)

    Choi, Kwang Ho; Hoo Lim, Joon; Yoo, Won Jae; Lee, Hyung Keun

    2018-01-01

    This paper proposes a distributed processing method applicable to GPS receivers in a network to generate a regional ionosphere map accurately and reliably. For accuracy, the proposed method is operated by multiple local Kalman filters and Kriging estimators. Each local Kalman filter is applied to a dual-frequency receiver to estimate the receiver’s differential code bias and vertical ionospheric delays (VIDs) at different ionospheric pierce points. The Kriging estimator selects and combines several VID estimates provided by the local Kalman filters to generate the VID estimate at each ionospheric grid point. For reliability, the proposed method uses receiver fault detectors and satellite fault detectors. Each receiver fault detector compares the VID estimates of the same local area provided by different local Kalman filters. Each satellite fault detector compares the VID estimate of each local area with that projected from the other local areas. Compared with the traditional centralized processing method, the proposed method is advantageous in that it considerably reduces the computational burden of each single Kalman filter and enables flexible fault detection, isolation, and reconfiguration capability. To evaluate the performance of the proposed method, several experiments with field collected measurements were performed.
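    The Kriging step of the scheme can be illustrated in isolation: given vertical ionospheric delay (VID) estimates at a few pierce points, ordinary Kriging interpolates the VID at a grid point by solving a small linear system. The spherical variogram model and all numbers below are assumptions; the local Kalman filters that would produce the VID estimates are not shown.

    ```python
    import numpy as np

    def variogram(h, sill=1.0, rng_=5.0):
        """Spherical variogram model (an assumed choice, not the paper's)."""
        h = np.minimum(h, rng_)
        return sill * (1.5 * h / rng_ - 0.5 * (h / rng_) ** 3)

    def ordinary_kriging(xy, z, xy0):
        n = len(z)
        d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
        A = np.ones((n + 1, n + 1))          # Kriging system with Lagrange multiplier
        A[:n, :n] = variogram(d)
        A[n, n] = 0.0
        b = np.ones(n + 1)
        b[:n] = variogram(np.linalg.norm(xy - xy0, axis=-1))
        w = np.linalg.solve(A, b)
        return w[:n] @ z                     # interpolated value at xy0

    # Hypothetical VID estimates (metres) at four pierce points, interpolated to a grid point
    xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.5, 1.2]])
    z = np.array([2.1, 2.4, 1.9, 2.6])
    print(round(ordinary_kriging(xy, z, np.array([0.7, 0.6])), 2))
    ```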

  12. Liquid rocket booster integration study. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The impacts of introducing liquid rocket booster engines (LRB) into the Space Transportation System (STS)/Kennedy Space Center (KSC) launch environment are identified and evaluated. Proposed ground systems configurations are presented along with a launch site requirements summary. Prelaunch processing scenarios are described and the required facility modifications and new facility requirements are analyzed. Flight vehicle design recommendations to enhance launch processing are discussed. Processing approaches to integrate LRB with existing STS launch operations are evaluated. The key features and significance of launch site transition to a new STS configuration in parallel with ongoing launch activities are enumerated. This volume is the executive summary of the five volume series.

  13. Liquid rocket booster integration study. Volume 5, part 1: Appendices

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The impacts of introducing liquid rocket booster engines (LRB) into the Space Transportation System (STS)/Kennedy Space Center (KSC) launch environment are identified and evaluated. Proposed ground systems configurations are presented along with a launch site requirements summary. Prelaunch processing scenarios are described and the required facility modifications and new facility requirements are analyzed. Flight vehicle design recommendations to enhance launch processing are discussed. Processing approaches to integrate LRB with existing STS launch operations are evaluated. The key features and significance of launch site transition to a new STS configuration in parallel with ongoing launch activities are enumerated. This volume is the appendices of the five volume series.

  14. Liquid Rocket Booster Integration Study. Volume 2: Study synopsis

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The impacts of introducing liquid rocket booster engines (LRB) into the Space Transportation System (STS)/Kennedy Space Center (KSC) launch environment are identified and evaluated. Proposed ground systems configurations are presented along with a launch site requirements summary. Prelaunch processing scenarios are described and the required facility modifications and new facility requirements are analyzed. Flight vehicle design recommendations to enhance launch processing are discussed. Processing approaches to integrate LRB with existing STS launch operations are evaluated. The key features and significance of launch site transition to a new STS configuration in parallel with ongoing launch activities are enumerated. This volume is the study summary of the five volume series.

  15. Methodology for the Evaluation of the Algorithms for Text Line Segmentation Based on Extended Binary Classification

    NASA Astrophysics Data System (ADS)

    Brodic, D.

    2011-01-01

    Text line segmentation represents the key element in the optical character recognition process. Hence, testing of text line segmentation algorithms has substantial relevance. All previously proposed testing methods deal mainly with a text database used as a template, which serves both for testing and for the evaluation of the text segmentation algorithm. In this manuscript, a methodology for the evaluation of text segmentation algorithms based on extended binary classification is proposed. It is established on various multiline text samples linked with text segmentation, whose results are distributed according to a binary classification. The final result is obtained by comparative analysis of the cross-linked data. Its suitability for different types of scripts represents its main advantage.

  16. The fast iris image clarity evaluation based on Tenengrad and ROI selection

    NASA Astrophysics Data System (ADS)

    Gao, Shuqin; Han, Min; Cheng, Xu

    2018-04-01

    In an iris recognition system, the clarity of the iris image is an important factor that influences recognition performance. During recognition, a blurred image may be rejected by the automatic iris recognition system, which leads to a failure of identification. Therefore it is necessary to evaluate the definition of the iris image before recognition. Considering the existing evaluation methods for iris image definition, we propose a fast algorithm to evaluate the definition of iris images in this paper. In our algorithm, the ROI (Region of Interest) is first extracted around a reference point determined from the light spots within the pupil, and the Tenengrad operator is then used to evaluate the definition of the iris image. Experimental results show that the proposed definition algorithm accurately distinguishes iris images of different clarity, with low computational complexity and good effectiveness.
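    A minimal sketch of the Tenengrad definition measure on an already-extracted ROI is shown below; the Sobel-based formulation and the synthetic images are assumptions for illustration only.

    ```python
    import numpy as np
    from scipy import ndimage

    def tenengrad(img, threshold=0.0):
        """Sum of squared Sobel gradient magnitudes above a threshold (higher = sharper)."""
        gx = ndimage.sobel(img.astype(float), axis=1)
        gy = ndimage.sobel(img.astype(float), axis=0)
        g2 = gx ** 2 + gy ** 2
        return g2[g2 > threshold ** 2].sum()

    # The ROI is assumed to be already cropped around the pupil reference point
    rng = np.random.default_rng(0)
    sharp_roi = rng.random((120, 160))
    blurred_roi = ndimage.gaussian_filter(sharp_roi, sigma=2)
    print(tenengrad(sharp_roi) > tenengrad(blurred_roi))   # True: the sharper ROI scores higher
    ```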

  17. Using experts feedback in clinical case resolution and arbitration as accuracy diagnosis methodology.

    PubMed

    Rodríguez-González, Alejandro; Torres-Niño, Javier; Valencia-Garcia, Rafael; Mayer, Miguel A; Alor-Hernandez, Giner

    2013-09-01

    This paper proposes a new methodology for assessing the efficiency of medical diagnostic systems and clinical decision support systems by using the feedback/opinions of medical experts. The methodology behind this work is based on a comparison between the expert feedback that has helped solve different clinical cases and the expert system that has evaluated these same cases. Once the results are returned, an arbitration process is carried out in order to ensure the correctness of the results provided by both methods. Once this process has been completed, the results are analyzed using Precision, Recall, Accuracy, Specificity and Matthews Correlation Coefficient (MCC) (PRAS-M) metrics. When the methodology is applied, the results obtained from a real diagnostic system allow researchers to establish the accuracy of the system based on objective facts. The methodology returns enough information to analyze the system's behavior for each disease in the knowledge base or across the entire knowledge base. It also returns data on the efficiency of the different assessors involved in the evaluation process, analyzing their behavior in the diagnostic process. The proposed work facilitates the evaluation of medical diagnostic systems, having a reliable process based on objective facts. The methodology presented in this research makes it possible to identify the main characteristics that define a medical diagnostic system and their values, allowing for system improvement. A good example of the results provided by the application of the methodology is shown in this paper. A diagnosis system was evaluated by means of this methodology, yielding positive results (statistically significant) when comparing the system with the assessors that participated in the evaluation process of the system through metrics such as recall (+27.54%) and MCC (+32.19%). These results demonstrate the real applicability of the methodology used. Copyright © 2013 Elsevier Ltd. All rights reserved.
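    For reference, the PRAS-M metrics named above can be computed directly from a binary confusion matrix, as in the short sketch below; the counts are hypothetical, and a per-disease evaluation would repeat this per class.

    ```python
    import math

    def prasm(tp, fp, tn, fn):
        """Precision, Recall, Accuracy, Specificity and MCC from confusion-matrix counts."""
        precision   = tp / (tp + fp)
        recall      = tp / (tp + fn)                  # sensitivity
        accuracy    = (tp + tn) / (tp + fp + tn + fn)
        specificity = tn / (tn + fp)
        mcc = (tp * tn - fp * fn) / math.sqrt(
            (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
        return dict(precision=precision, recall=recall, accuracy=accuracy,
                    specificity=specificity, mcc=mcc)

    print(prasm(tp=80, fp=10, tn=95, fn=15))
    ```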

  18. Report on inspection of concerns regarding DOE's evaluation of Chevron USA's unsolicited proposal for the Elk Hills Naval Petroleum Reserve

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1997-11-17

    An allegation was made to the Office of Inspector General (OIG) that the integrity of the Department of Energy's (DOE) unsolicited proposal review process may have been compromised by the actions of a former Deputy Secretary of Energy and his Executive Assistant during the review of an unsolicited proposal received from Chevron U.S.A. Production Company (Chevron) in May 1993. The Chevron unsolicited proposal was for the management and operation of DOE's Elk Hills Naval Petroleum Reserve (Elk Hills), located near Bakersfield, California. Chevron submitted the unsolicited proposal on May 19, 1993. DOE formally rejected Chevron's unsolicited proposal in May 1995. Although Chevron's unsolicited proposal was eventually rejected by DOE, the complainant specifically alleged that the "sanctity, integrity, and sensitivity" of the unsolicited proposal review process had been breached in meetings during the Fall of 1993 between Chevron officials, the Deputy Secretary of Energy (Deputy Secretary), and his Executive Assistant. Based on our review of the allegation, we identified the following issue as the focus of our inspection.

  19. Grey Comprehensive Evaluation of Biomass Power Generation Project Based on Group Judgement

    NASA Astrophysics Data System (ADS)

    Xia, Huicong; Niu, Dongxiao

    2017-06-01

    The comprehensive evaluation of benefits is an important task that needs to be carried out at all stages of biomass power generation projects. This paper proposes an improved grey comprehensive evaluation method based on the triangular whitenization function. To improve the objectivity of the weights obtained from the single reference-comparison judgment method, group judgment is introduced into the weighting process. In the grey comprehensive evaluation process, a number of experts were invited to estimate the benefit level of the projects, and their basic estimates were optimized according to the minimum-variance principle to improve the accuracy of the evaluation result. Taking a biomass power generation project as an example, the grey comprehensive evaluation showed that the benefit level of this project was good. This example demonstrates the feasibility of the grey comprehensive evaluation method based on group judgment for benefit evaluation of biomass power generation projects.
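    A generic triangular whitenization weight function, the building block of this kind of grey evaluation, can be sketched as below; the grey class boundaries, indicator scores, and weights are all assumed and do not reproduce the paper's exact construction.

    ```python
    import numpy as np

    def tri_whiten(x, a, b, c):
        """Triangular whitenization weight: 0 outside (a, c), peaking at 1 when x == b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    # Assumed grey classes on a 0-10 benefit scale
    classes = {"poor": (-1, 2, 4), "fair": (2, 4, 6), "good": (4, 6, 8), "excellent": (6, 8, 11)}

    # Hypothetical expert scores for three indicators and their (group-judged) weights
    scores = np.array([6.5, 7.0, 5.5])
    weights = np.array([0.5, 0.3, 0.2])

    # Weighted membership of the project in each grey class; the largest decides the grade
    membership = {name: sum(w * tri_whiten(s, *abc) for s, w in zip(scores, weights))
                  for name, abc in classes.items()}
    print(max(membership, key=membership.get))   # e.g. "good"
    ```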

  20. Steps toward improving ethical evaluation in health technology assessment: a proposed framework.

    PubMed

    Assasi, Nazila; Tarride, Jean-Eric; O'Reilly, Daria; Schwartz, Lisa

    2016-06-06

    While evaluation of ethical aspects in health technology assessment (HTA) has gained much attention during the past years, the integration of ethics in HTA practice still presents many challenges. In response to the increasing demand for expansion of health technology assessment (HTA) methodology to include ethical issues more systematically, this article reports on a multi-stage study that aimed at construction of a framework for improving the integration of ethics in HTA. The framework was developed through the following phases: 1) a systematic review and content analysis of guidance documents for ethics in HTA; 2) identification of factors influencing the integration of ethical considerations in HTA; 3) preparation of an action-oriented framework based on the key elements of the existing guidance documents and identified barriers to and facilitators of their implementation; and 4) expert consultation and revision of the framework. The proposed framework consists of three main components: an algorithmic flowchart, which exhibits the different steps of an ethical inquiry throughout the HTA process, including: defining the objectives and scope of the evaluation, stakeholder analysis, assessing organizational capacity, framing ethical evaluation questions, ethical analysis, deliberation, and knowledge translation; a stepwise guide, which focuses on the task objectives and potential questions that are required to be addressed at each step; and a list of some commonly recommended or used tools to help facilitate the evaluation process. The proposed framework can be used to support and promote good practice in integration of ethics into HTA. However, further validation of the framework through case studies and expert consultation is required to establish its utility for HTA practice.

  1. Optimization of digital image processing to determine quantum dots' height and density from atomic force microscopy.

    PubMed

    Ruiz, J E; Paciornik, S; Pinto, L D; Ptak, F; Pires, M P; Souza, P L

    2018-01-01

    An optimized method of digital image processing to interpret quantum dots' height measurements obtained by atomic force microscopy is presented. The method was developed by combining well-known digital image processing techniques and particle recognition algorithms. The properties of quantum dot structures strongly depend on the dots' height, among other features. Determination of their height is sensitive to small variations in the digital image processing parameters, which can generate misleading results. Comparing the results obtained with two image processing techniques - a conventional method and the new method proposed herein - against the data obtained by determining the height of quantum dots one by one within a fixed area showed that the optimized method leads to more accurate results. Moreover, the log-normal distribution, which is often used to represent natural processes, shows a better fit to the quantum dots' height histogram obtained with the proposed method. Finally, the quantum dots' heights obtained were used to calculate the predicted photoluminescence peak energies, which were compared with the experimental data. Again, a better match was observed when using the proposed method to evaluate the quantum dots' height. Copyright © 2017 Elsevier B.V. All rights reserved.
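    The log-normal versus normal comparison mentioned above is easy to reproduce on a height histogram; the sketch below uses synthetic heights and scipy's maximum-likelihood fits, so the numbers are illustrative only.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    # Hypothetical quantum-dot heights (nm) extracted from AFM image processing
    heights = rng.lognormal(mean=1.6, sigma=0.25, size=400)

    # Fit a log-normal (location fixed at zero) and a normal, then compare log-likelihoods
    shape, loc, scale = stats.lognorm.fit(heights, floc=0)
    ll_lognorm = stats.lognorm.logpdf(heights, shape, loc, scale).sum()
    mu, sd = stats.norm.fit(heights)
    ll_norm = stats.norm.logpdf(heights, mu, sd).sum()
    print(ll_lognorm > ll_norm)   # True here: the log-normal fits these data better
    ```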

  2. Notes from the beginning of time

    PubMed Central

    Sidman, Murray

    2002-01-01

    Some remembrances of things past, and their possible relevance to things now. These remembrances include notes about informality, research as a social process, student training and evaluation, research grants, thesis and dissertation proposals, and interdisciplinary collaboration. PMID:22478373

  3. A Management Information System Model for Program Management. Ph.D. Thesis - Oklahoma State Univ.; [Computerized Systems Analysis

    NASA Technical Reports Server (NTRS)

    Shipman, D. L.

    1972-01-01

    The development of a model to simulate the information system of a program management type of organization is reported. The model statistically determines the following parameters: type of messages, destinations, delivery durations, type of processing, processing durations, communication channels, outgoing messages, and priorities. The total management information system of the program management organization is considered, including formal and informal information flows and both facilities and equipment. The model is written in the General Purpose System Simulation 2 computer programming language for use on the Univac 1108, Executive 8 computer. The model is simulated on a daily basis and collects queue and resource utilization statistics for each decision point. The statistics are then used by management to evaluate proposed resource allocations, to evaluate proposed changes to the system, and to identify potential problem areas. The model employs both empirical and theoretical distributions which are adjusted to simulate the information flow being studied.

  4. Feature and contrast enhancement of mammographic image based on multiscale analysis and morphology.

    PubMed

    Wu, Shibin; Yu, Shaode; Yang, Yuhan; Xie, Yaoqin

    2013-01-01

    A new algorithm for feature and contrast enhancement of mammographic images is proposed in this paper. The approach is based on multiscale transform and mathematical morphology. First, the Laplacian Gaussian pyramid operator is applied to transform the mammogram into subband images at different scales. The detail, or high-frequency, subimages are then equalized by contrast limited adaptive histogram equalization (CLAHE), while the low-pass subimages are processed by mathematical morphology. Finally, the feature- and contrast-enhanced image is reconstructed from the Laplacian Gaussian pyramid coefficients modified at one or more levels by contrast limited adaptive histogram equalization and mathematical morphology, respectively. The enhanced image is then processed by a global nonlinear operator. The experimental results show that the presented algorithm is effective for feature and contrast enhancement of mammograms. The performance of the proposed algorithm is measured by a contrast evaluation criterion for images, the signal-to-noise ratio (SNR), and the contrast improvement index (CII).

  5. Feature and Contrast Enhancement of Mammographic Image Based on Multiscale Analysis and Morphology

    PubMed Central

    Wu, Shibin; Xie, Yaoqin

    2013-01-01

    A new algorithm for feature and contrast enhancement of mammographic images is proposed in this paper. The approach is based on multiscale transform and mathematical morphology. First, the Laplacian Gaussian pyramid operator is applied to transform the mammogram into subband images at different scales. The detail, or high-frequency, subimages are then equalized by contrast limited adaptive histogram equalization (CLAHE), while the low-pass subimages are processed by mathematical morphology. Finally, the feature- and contrast-enhanced image is reconstructed from the Laplacian Gaussian pyramid coefficients modified at one or more levels by contrast limited adaptive histogram equalization and mathematical morphology, respectively. The enhanced image is then processed by a global nonlinear operator. The experimental results show that the presented algorithm is effective for feature and contrast enhancement of mammograms. The performance of the proposed algorithm is measured by a contrast evaluation criterion for images, the signal-to-noise ratio (SNR), and the contrast improvement index (CII). PMID:24416072

  6. Evaluating stakeholder management performance using a stakeholder report card: the next step in theory and practice.

    PubMed

    Malvey, Donna; Fottler, Myron D; Slovensky, Donna J

    2002-01-01

    In the highly competitive health care environment, the survival of an organization may depend on how well powerful stakeholders are managed. Yet, the existing strategic stakeholder management process does not include evaluation of stakeholder management performance. To address this critical gap, this paper proposes a systematic method for evaluation using a stakeholder report card. An example of a physician report card based on this methodology is presented.

  7. Sustainability in Health care by Allocating Resources Effectively (SHARE) 7: supporting staff in evidence-based decision-making, implementation and evaluation in a local healthcare setting.

    PubMed

    Harris, Claire; Allen, Kelly; Waller, Cara; Dyer, Tim; Brooke, Vanessa; Garrubba, Marie; Melder, Angela; Voutier, Catherine; Gust, Anthony; Farjou, Dina

    2017-06-21

    This is the seventh in a series of papers reporting Sustainability in Health care by Allocating Resources Effectively (SHARE) in a local healthcare setting. The SHARE Program was a systematic, integrated, evidence-based program for resource allocation within a large Australian health service. It aimed to facilitate proactive use of evidence from research and local data; evidence-based decision-making for resource allocation including disinvestment; and development, implementation and evaluation of disinvestment projects. From the literature and responses of local stakeholders it was clear that provision of expertise and education, training and support of health service staff would be required to achieve these aims. Four support services were proposed. This paper is a detailed case report of the development, implementation and evaluation of a Data Service, Capacity Building Service and Project Support Service. An Evidence Service is reported separately. Literature reviews, surveys, interviews, consultation and workshops were used to capture and process the relevant information. Existing theoretical frameworks were adapted for evaluation and explication of processes and outcomes. Surveys and interviews identified current practice in use of evidence in decision-making, implementation and evaluation; staff needs for evidence-based practice; nature, type and availability of local health service data; and preferred formats for education and training. The Capacity Building and Project Support Services were successful in achieving short term objectives; but long term outcomes were not evaluated due to reduced funding. The Data Service was not implemented at all. Factors influencing the processes and outcomes are discussed. Health service staff need access to education, training, expertise and support to enable evidence-based decision-making and to implement and evaluate the changes arising from those decisions. Three support services were proposed based on research evidence and local findings. Local factors, some unanticipated and some unavoidable, were the main barriers to successful implementation. All three proposed support services hold promise as facilitators of EBP in the local healthcare setting. The findings from this study will inform further exploration.

  8. An Objective Rating Form to Evaluate Grant Proposals to the Hogg Foundation for Mental Health: A Pilot Study of Implementation

    ERIC Educational Resources Information Center

    Whaley, Arthur L.

    2006-01-01

    The lack of support for mental health-related projects by private philanthropy, even among those that express an interest in mental health, is due in large part to the subjectivity of the grant review process. To address this problem, Whaley, Rodriguez, and Alexander developed the Grant Proposal Rating Form (GPRF) to make the grant review process…

  9. Adapting Document Similarity Measures for Ligand-Based Virtual Screening.

    PubMed

    Himmat, Mubarak; Salim, Naomie; Al-Dabbagh, Mohammed Mumtaz; Saeed, Faisal; Ahmed, Ali

    2016-04-13

    Quantifying the similarity of molecules is considered one of the major tasks in virtual screening. Many similarity measures have been proposed for this purpose, some of which were derived from document and text retrieval; such measures often give good results in document retrieval and can also achieve good results in virtual screening. In this work, we propose a similarity measure for ligand-based virtual screening that is derived from a text-processing similarity measure and adapted to be suitable for virtual screening; we call this proposed measure the Adapted Similarity Measure of Text Processing (ASMTP). To evaluate and test the proposed ASMTP we conducted several experiments on two different benchmark datasets: the Maximum Unbiased Validation (MUV) and the MDL Drug Data Report (MDDR). The experiments were conducted by randomly choosing 10 reference structures from each class as queries and evaluating them by recall at cut-offs of 1% and 5%. The overall results are compared with several similarity methods, including the Tanimoto coefficient, which is considered the conventional and standard similarity coefficient for fingerprint-based similarity calculations. The achieved results show that the proposed measure improves the performance of ligand-based virtual screening and outperforms the Tanimoto coefficient and the other methods.
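    The Tanimoto coefficient used as the baseline above has a very compact form for binary fingerprints, sketched below with toy vectors (real fingerprints would be far longer, e.g. 1024 bits).

    ```python
    import numpy as np

    def tanimoto(a, b):
        """Tanimoto coefficient of two binary fingerprints: |a & b| / (|a| + |b| - |a & b|)."""
        a, b = np.asarray(a, bool), np.asarray(b, bool)
        c = np.logical_and(a, b).sum()
        return c / (a.sum() + b.sum() - c)

    query   = np.array([1, 0, 1, 1, 0, 0, 1, 0])
    library = np.array([1, 0, 1, 0, 0, 1, 1, 0])
    print(tanimoto(query, library))   # 0.6
    ```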

  10. Comprehensive evaluation of impacts of distributed generation integration in distribution network

    NASA Astrophysics Data System (ADS)

    Peng, Sujiang; Zhou, Erbiao; Ji, Fengkun; Cao, Xinhui; Liu, Lingshuang; Liu, Zifa; Wang, Xuyang; Cai, Xiaoyu

    2018-04-01

    Distributed generation (DG), as a supplement to centralized renewable energy utilization, is becoming a focus of renewable energy development. With the increasing proportion of DG in the distribution network, the network power structure, power flow distribution, operation plans and protection are all affected to some extent. According to the main impacts of DG, a comprehensive evaluation model for a distribution network with DG is proposed in this paper. A comprehensive evaluation index system covering 7 aspects, along with the corresponding index calculation methods, is established for quantitative analysis. The indices under different DG access capacities in the distribution network are calculated based on the IEEE RBTS-Bus 6 system, and the evaluation result is obtained by the analytic hierarchy process (AHP). The proposed model and method are verified to be effective and valid through the case study.
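    The AHP weighting step referred to above is commonly implemented via the principal eigenvector of a pairwise comparison matrix, as sketched below; the 3x3 comparison matrix and the criterion names are hypothetical, not the paper's 7-aspect index system.

    ```python
    import numpy as np

    def ahp_weights(A):
        """Priority vector and consistency ratio of a pairwise comparison matrix."""
        vals, vecs = np.linalg.eig(A)
        k = np.argmax(vals.real)
        w = np.abs(vecs[:, k].real)
        w /= w.sum()
        n = A.shape[0]
        ci = (vals.real[k] - n) / (n - 1)                   # consistency index
        ri = {1: 0, 2: 0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]
        return w, (ci / ri if ri else 0.0)

    # Hypothetical comparisons of three criteria (e.g. reliability, economy, environment)
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])
    w, cr = ahp_weights(A)
    print(w.round(3), cr < 0.1)   # weights and the usual CR < 0.1 consistency check
    ```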

  11. Measuring the emulsification dynamics and stability of self-emulsifying drug delivery systems.

    PubMed

    Vasconcelos, Teófilo; Marques, Sara; Sarmento, Bruno

    2018-02-01

    Self-emulsifying drug delivery systems (SEDDS) are one of the most promising technologies in the drug delivery field, particularly for addressing solubility and bioavailability issues of drugs. The development of these drug carriers relies excessively on visual observations and indirect determinations. The present manuscript describes a method able to follow the emulsification of SEDDS, both micro- and nano-emulsions, measure the droplet size, and evaluate the physical stability of these formulations. Additionally, a new process to evaluate the physical stability of SEDDS after emulsification is also proposed, based on a cycle of mechanical stress followed by a resting period. The use of a continuous multiparameter evaluation during the emulsification process and stability testing was of utmost value for understanding the SEDDS emulsification process. Based on this method, SEDDS were classified as fast or slow emulsifiers. Moreover, the emulsification process and the stabilization of the emulsion were examined with respect to the composition of SEDDS as the major factor affecting stability under physical stress, and with respect to the use of multiple components with different properties to develop a stable and robust SEDDS formulation. The drug loading level is suggested here to impact the droplet size of SEDDS after dispersion and the stability of SEDDS under stress conditions. The proposed protocol allows an online measurement of SEDDS droplet size during emulsification and a rational selection of excipients based on their emulsification and stabilization performance. Copyright © 2017. Published by Elsevier B.V.

  12. Adsorption detection for polylysine biomolecules based on high-Q silica capillary whispering gallery mode microresonator

    NASA Astrophysics Data System (ADS)

    Wu, Jixuan; Liu, Bo; Zhang, Hao; Song, Binbin

    2017-11-01

    A silica-capillary-based whispering gallery mode (WGM) microresonator has been proposed and experimentally demonstrated for the real-time monitoring of the polylysine adsorption process. The spectral characteristics of the WGM resonance dips with high quality factor and good wavelength selectivity have been investigated to evaluate the dynamic process for the binding of polylysine with a capillary surface. The WGM transmission spectrum shows a regular shift with increments of observation time, which could be exploited for the analysis of the polylysine adsorption process. The proposed WGM microresonator system possesses desirable qualities such as high sensitivity, fast response, label-free method, high detection resolution and compactness, which could find promising applications in histology and related bioengineering areas.

  13. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and / or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  14. Investigation of Proposed Process Sequence for the Array Automated Assembly Task, Phase 2. [low cost silicon solar array fabrication

    NASA Technical Reports Server (NTRS)

    Mardesich, N.; Garcia, A.; Bunyan, S.; Pepe, A.

    1979-01-01

    The technological readiness of the proposed process sequence was reviewed. Process steps evaluated include: (1) plasma etching to establish a standard surface; (2) forming junctions by diffusion from an N-type polymeric spray-on source; (3) forming a p+ back contact by firing a screen printed aluminum paste; (4) forming screen printed front contacts after cleaning the back aluminum and removing the diffusion oxide; (5) cleaning the junction by a laser scribe operation; (6) forming an antireflection coating by baking a polymeric spray-on film; (7) ultrasonically tin padding the cells; and (8) assembling cell strings into solar circuits using ethylene vinyl acetate as an encapsulant and laminating medium.

  15. Data Storing Proposal from Heterogeneous Systems into a Specialized Repository

    NASA Astrophysics Data System (ADS)

    Václavová, Andrea; Tanuška, Pavol; Jánošík, Ján

    2016-12-01

    The aim of this paper is to analyze and propose an appropriate system for processing and simultaneously storing a vast volume of structured and unstructured data. The paper consists of three parts. The first part addresses the issue of structured and unstructured data. The second part provides a detailed analysis of data repositories and a subsequent evaluation indicating which system would be optimal for the given type and volume of data. The third part focuses on the use of the gathered information to transfer data to the proposed repository.

  16. Emergence of Joint Attention through Bootstrap Learning based on the Mechanisms of Visual Attention and Learning with Self-evaluation

    NASA Astrophysics Data System (ADS)

    Nagai, Yukie; Hosoda, Koh; Morita, Akio; Asada, Minoru

    This study examines how human infants acquire the ability of joint attention through interactions with their caregivers, from the viewpoint of cognitive developmental robotics. In this paper, a mechanism by which a robot acquires sensorimotor coordination for joint attention through bootstrap learning is described. Bootstrap learning is a process by which a learner acquires higher capabilities through interactions with its environment based on embedded lower capabilities, even if the learner does not receive any external evaluation and the environment is not controlled. The proposed mechanism for bootstrap learning of joint attention consists of two mechanisms embedded in the robot: visual attention and learning with self-evaluation. The former finds and attends to a salient object in the robot's field of view, and the latter evaluates the success of visual attention, not joint attention, and then learns the sensorimotor coordination. Since the object which the robot looks at based on visual attention does not always correspond to the object which the caregiver is looking at in an environment including multiple objects, the robot may encounter incorrect learning situations for joint attention as well as correct ones. However, the robot is expected to statistically discard the learning data from the incorrect situations as outliers, because their correlation between sensor input and motor output is weaker than that of the correct ones, and consequently to acquire appropriate sensorimotor coordination for joint attention even if the caregiver does not provide any task evaluation to the robot. The experimental results show the validity of the proposed mechanism. It is suggested that the proposed mechanism could explain the developmental mechanism of infants' joint attention, because the learning process of the robot's joint attention can be regarded as equivalent to the developmental process of the infants'.

  17. Design and Control of Integrated Systems for Hydrogen Production and Power Generation

    NASA Astrophysics Data System (ADS)

    Georgis, Dimitrios

    Growing concerns about CO2 emissions have led to the development of highly efficient power plants. Options for increased energy efficiencies include alternative energy conversion pathways, energy integration and process intensification. Solid oxide fuel cells (SOFC) constitute a promising alternative for power generation since they convert chemical energy directly to electricity electrochemically. Their high operating temperature shows potential for energy integration with energy intensive units (e.g. steam reforming reactors). Although energy integration is an essential tool for increased efficiencies, it leads to highly complex process schemes with rich dynamic behavior, which are challenging to control. Furthermore, the use of process intensification for increased energy efficiency imposes an additional control challenge. This dissertation identifies and proposes solutions to design, operational and control challenges of integrated systems for hydrogen production and power generation. Initially, a study on energy integrated SOFC systems is presented. Design alternatives are identified, control strategies are proposed for each alternative and their validity is evaluated under different operational scenarios. The operational range of the proposed control strategies is also analyzed. Next, thermal management of water gas shift membrane reactors, which are a typical application of process intensification, is considered. Design and operational objectives are identified and a control strategy is proposed employing advanced control algorithms. The performance of the proposed control strategy is evaluated and compared with classical control strategies. Finally, SOFC systems for combined heat and power applications are considered. Multiple recycle loops are placed to increase design flexibility. Different operational objectives are identified and a nonlinear optimization problem is formulated. Optimal designs are obtained and their features are discussed and compared. The results of the dissertation provide a deeper understanding of the design, operational and control challenges of the above systems and can potentially guide further commercialization efforts. In addition to this, the results can be generalized and used for applications from the transportation and residential sector to large-scale power plants.

  18. Development and Evaluation of Science and Technology Education Program Using Interferometric SAR

    NASA Astrophysics Data System (ADS)

    Ito, Y.; Ikemitsu, H.; Nango, K.

    2016-06-01

    This paper proposes a science and technology education program to teach junior high school students to measure terrain changes by using interferometric synthetic aperture radar (SAR). The objectives of the proposed program are to evaluate and use information technology by performing SAR data processing in order to measure ground deformation, and to incorporate an understanding of Earth sciences by analyzing interferometric SAR processing results. To draft the teaching guidance plan for the developed education program, this study considers both science and technology education. The education program was used in a Japanese junior high school. An educational SAR processor developed by the authors and the customized Delft object-oriented radar interferometric software package were employed. Earthquakes as diastrophism events were chosen as practical teaching materials. The selected events indicate clear ground deformation in differential interferograms with high coherence levels. The learners were able to investigate the ground deformations and disasters caused by the events. They interactively used computers and became skilled at recognizing the knowledge and techniques of information technology, and then they evaluated the technology. Based on the results of pre- and post-questionnaire surveys and self-evaluation by the learners, it was clarified that the proposed program was applicable for junior high school education, and the learners recognized the usefulness of Earth observation technology by using interferometric SAR. The usefulness of the teaching materials in the learning activities was also shown through the practical teaching experience.

  19. A Developmental Perspective on Peer Rejection, Deviant Peer Affiliation, and Conduct Problems Among Youth.

    PubMed

    Chen, Diane; Drabick, Deborah A G; Burgers, Darcy E

    2015-12-01

    Peer rejection and deviant peer affiliation are linked consistently to the development and maintenance of conduct problems. Two proposed models may account for longitudinal relations among these peer processes and conduct problems: the (a) sequential mediation model, in which peer rejection in childhood and deviant peer affiliation in adolescence mediate the link between early externalizing behaviors and more serious adolescent conduct problems; and (b) parallel process model, in which peer rejection and deviant peer affiliation are considered independent processes that operate simultaneously to increment risk for conduct problems. In this review, we evaluate theoretical models and evidence for associations among conduct problems and (a) peer rejection and (b) deviant peer affiliation. We then consider support for the sequential mediation and parallel process models. Next, we propose an integrated model incorporating both the sequential mediation and parallel process models. Future research directions and implications for prevention and intervention efforts are discussed.

  20. A Developmental Perspective on Peer Rejection, Deviant Peer Affiliation, and Conduct Problems among Youth

    PubMed Central

    Chen, Diane; Drabick, Deborah A. G.; Burgers, Darcy E.

    2015-01-01

    Peer rejection and deviant peer affiliation are linked consistently to the development and maintenance of conduct problems. Two proposed models may account for longitudinal relations among these peer processes and conduct problems: the (a) sequential mediation model, in which peer rejection in childhood and deviant peer affiliation in adolescence mediate the link between early externalizing behaviors and more serious adolescent conduct problems; and (b) parallel process model, in which peer rejection and deviant peer affiliation are considered independent processes that operate simultaneously to increment risk for conduct problems. In this review, we evaluate theoretical models and evidence for associations among conduct problems and (a) peer rejection and (b) deviant peer affiliation. We then consider support for the sequential mediation and parallel process models. Next, we propose an integrated model incorporating both the sequential mediation and parallel process models. Future research directions and implications for prevention and intervention efforts are discussed. PMID:25410430

  1. Developing a framework for evaluating proposals for research in wilderness: Science to protect and learn from parks

    Treesearch

    Lewis C. Sharman; Peter Landres; Susan Boudreau

    2007-01-01

    In designated park wilderness, the requirements for scientific research often conflict with requirements designed to protect wilderness resources and values. Managers who wish to realize the benefits of scientific research must have a process by which to evaluate those benefits as well as their associated wilderness impacts. Glacier Bay National Park and Preserve, in...

  2. Developing a Self-Report-Based Sequential Analysis Method for Educational Technology Systems: A Process-Based Usability Evaluation

    ERIC Educational Resources Information Center

    Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse

    2015-01-01

    The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…

  3. Request for OMB Clearance with Supporting Documents for the Evaluation of the State Capacity Building Program in Dissemination. Revised.

    ERIC Educational Resources Information Center

    NTS Research Corp., Durham, NC.

    This document presents the research proposal and supporting documents for a study to evaluate the process of developing a comprehensive dissemination capacity in State Education Agencies (SEAs) and to assess the impact of the capacity building activities in the states, addressing two questions: Is dissemination capacity being built and, if so,…

  4. Sound Arguments and Power in Evaluation Research and Policy-Making: A Measuring Instrument and Its Application.

    ERIC Educational Resources Information Center

    Propper, Igno M. A. M.

    1993-01-01

    Proposes an instrument for assessing the extent to which either sound arguments or power are found in scientific and political discussions. Empirical research is described that investigated the relation between the quality of evaluation research and the quality of discussion in policy-making processes in which the research is used. (Contains 47…

  5. An Assessment of Statistical Process Control-Based Approaches for Charting Student Evaluation Scores

    ERIC Educational Resources Information Center

    Ding, Xin; Wardell, Don; Verma, Rohit

    2006-01-01

    We compare three control charts for monitoring data from student evaluations of teaching (SET) with the goal of improving student satisfaction with teaching performance. The two charts that we propose are a modified "p" chart and a z-score chart. We show that these charts overcome some of the shortcomings of the more traditional charts…
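    For orientation, the limits of a standard p chart (which the article modifies) can be computed as below from per-course counts of satisfied respondents; the counts and group size are hypothetical, and the article's modified p chart and z-score chart are not reproduced here.

    ```python
    import numpy as np

    def p_chart_limits(counts, n):
        """Center line and 3-sigma limits of a classical p chart for proportions."""
        p = np.asarray(counts) / n
        pbar = p.mean()
        sigma = np.sqrt(pbar * (1 - pbar) / n)
        return pbar, max(pbar - 3 * sigma, 0.0), min(pbar + 3 * sigma, 1.0)

    # Hypothetical counts of "satisfied" ratings out of n = 40 respondents per course
    counts = [31, 28, 33, 30, 27, 35, 29, 32, 26, 34]
    center, lcl, ucl = p_chart_limits(counts, n=40)
    print(round(center, 3), round(lcl, 3), round(ucl, 3))
    ```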

  6. Evaluating the compatibility of multi-functional and intensive urban land uses

    NASA Astrophysics Data System (ADS)

    Taleai, M.; Sharifi, A.; Sliuzas, R.; Mesgari, M.

    2007-12-01

    This research is aimed at developing a model for assessing land use compatibility in densely built-up urban areas. In this process, a new model was developed through the combination of a suite of existing methods and tools: geographical information systems, Delphi methods, and spatial decision support tools, namely multi-criteria evaluation analysis, the analytical hierarchy process and the ordered weighted average method. The developed model has the potential to calculate land use compatibility in both horizontal and vertical directions. Furthermore, the compatibility between the use of each floor in a building and its neighboring land uses can be evaluated. The method was tested in a built-up urban area located in Tehran, the capital city of Iran. The results show that the model is robust in clarifying different levels of physical compatibility between neighboring land uses. This paper describes the various steps and processes of developing the proposed land use compatibility evaluation model (CEM).

  7. Evaluation of a hospital-wide PACS: costs and benefits of the Hammersmith PACS installation

    NASA Astrophysics Data System (ADS)

    Bryan, Stirling; Keen, Justin; Buxton, Martin J.; Weatherburn, Gwyneth C.

    1992-07-01

    The unusual nature of sites chosen for hospital-wide PACS implementations and the very small number of proposed implementations make evaluation a complex task. The UK Department of Health is funding both the evaluation and implementation of a hospital-wide PACS. The Brunel University evaluation of the Hammersmith Hospital PACS has two main components: an economic evaluation of the costs and benefits of hospital-wide PACS installations and an exercise in monitoring the implementation process. This paper concentrates on the economic component.

  8. Prediction of composites behavior undergoing an ATP process through data-mining

    NASA Astrophysics Data System (ADS)

    Martin, Clara Argerich; Collado, Angel Leon; Pinillo, Rubén Ibañez; Barasinski, Anaïs; Abisset-Chavanne, Emmanuelle; Chinesta, Francisco

    2018-05-01

    The need to characterize composite surfaces for distinct mechanical or physical processes leads to different ways of evaluating the state of the surface. During many manufacturing processes deformation occurs, which hinders the classification of composites for fabrication processes. In this work we focus on the challenge of identifying the surfaces' behavior a priori in order to optimize manufacturing. We propose and validate the curvature of the surface as a reliable parameter and develop a tool that allows the prediction of the surface behavior.

  9. Investigation of proposed process sequence for the array automated assembly task, phases 1 and 2

    NASA Technical Reports Server (NTRS)

    Mardesich, N.; Garcia, A.; Eskenas, K.

    1980-01-01

    Progress was made on the process sequence for module fabrication. A shift from bonding with a conformal coating to laminating with ethylene vinyl acetate and a glass superstrate is recommended for further module fabrication. The processes that were retained for the selected process sequence, spin-on diffusion, print and fire aluminum p+ back, clean, print and fire silver front contact and apply tin pad to aluminum back, were evaluated for their cost contribution.

  10. Analysis of the times involved in processing and communication in a lower limb simulation system controlled by SEMG

    NASA Astrophysics Data System (ADS)

    Profumieri, A.; Bonell, C.; Catalfamo, P.; Cherniz, A.

    2016-04-01

    Virtual reality has been proposed for different applications, including the evaluation of new control strategies and training protocols for upper limb prostheses and for the study of new rehabilitation programs. In this study, a lower limb simulation environment commanded by surface electromyography signals is evaluated. The time delays generated by the acquisition and processing stages for the signals that would command the knee joint, were measured and different acquisition windows were analysed. The subjective perception of the quality of simulation was also evaluated when extra delays were added to the process. The results showed that the acquisition window is responsible for the longest delay. Also, the basic implemented processes allowed for the acquisition of three signal channels for commanding the simulation. Finally, the communication between different applications is arguably efficient, although it depends on the amount of data to be sent.

  11. Plan delivery quality assurance for CyberKnife: Statistical process control analysis of 350 film-based patient-specific QAs.

    PubMed

    Bellec, J; Delaby, N; Jouyaux, F; Perdrieux, M; Bouvier, J; Sorel, S; Henry, O; Lafond, C

    2017-07-01

    Robotic radiosurgery requires plan delivery quality assurance (DQA) but there has never been a published comprehensive analysis of a patient-specific DQA process in a clinic. We proposed to evaluate 350 consecutive film-based patient-specific DQAs using statistical process control. We evaluated the performance of the process to propose achievable tolerance criteria for DQA validation and we sought to identify suboptimal DQA using control charts. DQAs were performed on a CyberKnife-M6 using Gafchromic-EBT3 films. The signal-to-dose conversion was performed using a multichannel-correction and a scanning protocol that combined measurement and calibration in a single scan. The DQA analysis comprised a gamma-index analysis at 3%/1.5mm and a separate evaluation of spatial and dosimetric accuracy of the plan delivery. Each parameter was plotted on a control chart and control limits were calculated. A capability index (Cpm) was calculated to evaluate the ability of the process to produce results within specifications. The analysis of capability showed that a gamma pass rate of 85% at 3%/1.5mm was highly achievable as acceptance criteria for DQA validation using a film-based protocol (Cpm>1.33). 3.4% of DQA were outside a control limit of 88% for gamma pass-rate. The analysis of the out-of-control DQA helped identify a dosimetric error in our institute for a specific treatment type. We have defined initial tolerance criteria for DQA validations. We have shown that the implementation of a film-based patient-specific DQA protocol with the use of control charts is an effective method to improve patient treatment safety on CyberKnife. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
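    A generic statistical-process-control reading of such DQA data is sketched below: an individuals/moving-range (X-mR) chart for consecutive gamma pass rates plus a simple one-sided capability index against the acceptance threshold. The pass rates are invented and this is not the paper's exact Cpm calculation.

    ```python
    import numpy as np

    # Hypothetical gamma pass rates (%) at 3%/1.5 mm for consecutive patient-specific DQAs
    rates = np.array([96.1, 93.4, 97.8, 91.2, 95.0, 98.3, 94.7, 92.6, 96.9, 95.5])

    # Individuals / moving-range (X-mR) control limits
    mr = np.abs(np.diff(rates)).mean()                 # mean moving range
    center = rates.mean()
    lcl, ucl = center - 2.66 * mr, center + 2.66 * mr  # 2.66 = 3 / d2 for n = 2

    # One-sided capability against a lower specification limit (e.g. 85% pass rate)
    lsl = 85.0
    cpl = (center - lsl) / (3 * rates.std(ddof=1))
    print(round(lcl, 1), round(ucl, 1), round(cpl, 2))
    ```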

  12. Automatic evaluation of hypernasality based on a cleft palate speech database.

    PubMed

    He, Ling; Zhang, Jing; Liu, Qi; Yin, Heng; Lech, Margaret; Huang, Yunzhi

    2015-05-01

    Hypernasality is one of the most typical characteristics of cleft palate (CP) speech. The outcome of hypernasality grading determines the necessity of follow-up surgery. Currently, the evaluation of CP speech is carried out by experienced speech therapists; however, the result strongly depends on their clinical experience and subjective judgment. This work proposes an automatic evaluation system for hypernasality grading in CP speech. The database tested in this work was collected by the Hospital of Stomatology, Sichuan University, which has the largest number of CP patients in China. Based on the production process of hypernasality, source sound pulse and vocal tract filter features are presented. These features include pitch, the first and second energy-amplified frequency bands, cepstrum-based features, MFCC, and short-time energy in sub-bands. These features, combined with a KNN classifier, are applied to automatically classify four grades of hypernasality: normal, mild, moderate and severe. The experimental results show that the proposed system achieves good performance, with classification rates for the four hypernasality grades reaching up to 80.4%. The sensitivity of the proposed features to gender is also discussed.
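    A toy version of the MFCC-plus-KNN pipeline is sketched below with synthetic signals standing in for labelled cleft-palate recordings; librosa and scikit-learn are assumed to be available, and the feature summary (mean MFCC vector per utterance) is a simplification of the feature set listed above.

    ```python
    import numpy as np
    import librosa
    from sklearn.neighbors import KNeighborsClassifier

    def utterance_features(y, sr=16000, n_mfcc=13):
        """Fixed-length summary of one utterance: the mean MFCC vector."""
        return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).mean(axis=1)

    # Synthetic stand-ins for two classes; class 1 gets extra low-frequency energy
    rng = np.random.default_rng(0)
    sr = 16000
    t = np.linspace(0, 1, sr, endpoint=False)

    def make(cls):
        base = np.sin(2 * np.pi * 220 * t) + 0.3 * rng.normal(size=t.size)
        return base + (0.8 * np.sin(2 * np.pi * 90 * t) if cls else 0.0)

    labels = np.array([0, 1] * 20)
    X = np.stack([utterance_features(make(c), sr) for c in labels])
    clf = KNeighborsClassifier(n_neighbors=5).fit(X[:30], labels[:30])
    print(clf.score(X[30:], labels[30:]))   # held-out classification rate
    ```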

  13. Ugliness as the fourth wall-breaker. Comment on "Move me, astonish me... delight my eyes and brain: The Vienna Integrated Model of top-down and bottom-up processes in Art Perception (VIMAP) and corresponding affective, evaluative, and neurophysiological correlates" by Matthew Pelowski et al.

    NASA Astrophysics Data System (ADS)

    Ishizu, Tomohiro; Sakamoto, Yasuhiro

    2017-07-01

    In this extensive and valuable theoretical article, Pelowski et al. propose a psychological architecture of art appreciation by introducing the concepts of early/bottom-up and relatively late/top-down stages. The former is described as automatic processing of the perceptual features of visual images, while the latter comprises cognitive and evaluative processes in which modulations from acquired knowledge and memories come into play through recurrent loops to form the final experience; the authors also identify brain areas/networks that possibly play a role in each processing component [9].

  14. Investigation of dimensional variation in parts manufactured by fused deposition modeling using Gauge Repeatability and Reproducibility

    NASA Astrophysics Data System (ADS)

    Mohamed, Omar Ahmed; Hasan Masood, Syed; Lal Bhowmik, Jahar

    2018-02-01

    In the additive manufacturing (AM) market, industry and AM users frequently ask how reproducible and repeatable the fused deposition modeling (FDM) process is in providing good dimensional accuracy. This paper investigates and evaluates the repeatability and reproducibility of the FDM process through a systematic approach to answer this question. A case study based on the statistical gage repeatability and reproducibility (gage R&R) technique is proposed to investigate the dimensional variations in FDM-printed parts. After running the simulation and analyzing the data, the FDM process capability is evaluated, which helps industry better understand the performance of FDM technology.
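
    For readers unfamiliar with the gage R&R calculation itself, the following sketch implements the usual ANOVA decomposition into repeatability, reproducibility and part-to-part variance components. The measurement array, the 10 parts x 3 operators x 3 replicates layout, and the acceptance guideline quoted in the comments are illustrative assumptions, not the study's actual data or design.

    import numpy as np

    def gage_rr(x):
        """ANOVA-based gage R&R. x has shape (parts, operators, replicates),
        e.g. repeated dimensional measurements of FDM-printed parts."""
        p, o, r = x.shape
        grand = x.mean()
        part_m, oper_m, cell_m = x.mean(axis=(1, 2)), x.mean(axis=(0, 2)), x.mean(axis=2)
        ss_part = o * r * ((part_m - grand) ** 2).sum()
        ss_oper = p * r * ((oper_m - grand) ** 2).sum()
        ss_po = r * ((cell_m - part_m[:, None] - oper_m[None, :] + grand) ** 2).sum()
        ss_tot = ((x - grand) ** 2).sum()
        ms_part = ss_part / (p - 1)
        ms_oper = ss_oper / (o - 1)
        ms_po = ss_po / ((p - 1) * (o - 1))
        ms_e = (ss_tot - ss_part - ss_oper - ss_po) / (p * o * (r - 1))
        repeatability = ms_e
        reproducibility = max(0.0, (ms_oper - ms_po) / (p * r)) + max(0.0, (ms_po - ms_e) / r)
        part_var = max(0.0, (ms_part - ms_po) / (o * r))
        grr = repeatability + reproducibility
        return 100 * np.sqrt(grr / (grr + part_var))   # %GRR of total variation

    rng = np.random.default_rng(1)
    parts = rng.normal(20.0, 0.05, size=(10, 1, 1))        # true part dimensions (mm), illustrative
    meas = parts + rng.normal(0, 0.01, size=(10, 3, 3))    # 3 operators x 3 replicates
    print(f"%GRR = {gage_rr(meas):.1f}%")   # <10% usually rated acceptable, 10-30% marginal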

  15. Integrated Response Time Evaluation Methodology for the Nuclear Safety Instrumentation System

    NASA Astrophysics Data System (ADS)

    Lee, Chang Jae; Yun, Jae Hee

    2017-06-01

    Safety analysis for a nuclear power plant establishes not only an analytical limit (AL) in terms of a measured or calculated variable but also an analytical response time (ART) required to complete protective action after the AL is reached. If the two constraints are met, the safety limit selected to maintain the integrity of physical barriers used for preventing uncontrolled radioactivity release will not be exceeded during anticipated operational occurrences and postulated accidents. Setpoint determination methodologies have actively been developed to ensure that the protective action is initiated before the process conditions reach the AL. However, regarding the ART for a nuclear safety instrumentation system, an integrated evaluation methodology considering the whole design process has not been systematically studied. In order to assure the safety of nuclear power plants, this paper proposes a systematic and integrated response time evaluation methodology that covers safety analyses, system designs, response time analyses, and response time tests. This methodology is applied to safety instrumentation systems for the advanced power reactor 1400 and the optimized power reactor 1000 nuclear power plants in South Korea. The quantitative evaluation results are provided herein. The evaluation results using the proposed methodology demonstrate that the nuclear safety instrumentation systems fully satisfy corresponding requirements of the ART.

  16. Dry deposition models for radionuclides dispersed in air: a new approach for deposition velocity evaluation schema

    NASA Astrophysics Data System (ADS)

    Giardina, M.; Buffa, P.; Cervone, A.; De Rosa, F.; Lombardo, C.; Casamirra, M.

    2017-11-01

    In the framework of a National Research Program funded by the Italian Ministry of Economic Development, the Department of Energy, Information Engineering and Mathematical Models (DEIM) of Palermo University and the ENEA Research Centre of Bologna, Italy, are performing several research activities to study physical models and mathematical approaches aimed at investigating dry deposition mechanisms of radioactive pollutants. On the basis of these studies, a new approach to evaluating the dry deposition velocity for particles is proposed. Comparisons with experimental data from the literature show that the proposed dry deposition scheme successfully captures the main phenomena involved in the dry deposition process.

  17. An ontological case base engineering methodology for diabetes management.

    PubMed

    El-Sappagh, Shaker H; El-Masri, Samir; Elmogy, Mohammed; Riad, A M; Saddik, Basema

    2014-08-01

    Ontology engineering covers issues related to ontology development and use. In Case Based Reasoning (CBR) system, ontology plays two main roles; the first as case base and the second as domain ontology. However, the ontology engineering literature does not provide adequate guidance on how to build, evaluate, and maintain ontologies. This paper proposes an ontology engineering methodology to generate case bases in the medical domain. It mainly focuses on the research of case representation in the form of ontology to support the case semantic retrieval and enhance all knowledge intensive CBR processes. A case study on diabetes diagnosis case base will be provided to evaluate the proposed methodology.

  18. Simple performance evaluation of pulsed spontaneous parametric down-conversion sources for quantum communications.

    PubMed

    Smirr, Jean-Loup; Guilbaud, Sylvain; Ghalbouni, Joe; Frey, Robert; Diamanti, Eleni; Alléaume, Romain; Zaquine, Isabelle

    2011-01-17

    Fast characterization of pulsed spontaneous parametric down conversion (SPDC) sources is important for applications in quantum information processing and communications. We propose a simple method to perform this task, which only requires measuring the counts on the two output channels and the coincidences between them, as well as modeling the filter used to reduce the source bandwidth. The proposed method is experimentally tested and used for a complete evaluation of SPDC sources (pair emission probability, total losses, and fidelity) of various bandwidths. This method can find applications in the setting up of SPDC sources and in the continuous verification of the quality of quantum communication links.
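
    The counts-and-coincidences idea can be made concrete with the usual first-order relations for a heralded pair source (neglecting dark counts, accidental coincidences and multi-pair emission); this is a textbook approximation offered for orientation and is not necessarily the exact model fitted in the paper.

    \[
    S_1 \approx R\,\mu\,\eta_1, \qquad S_2 \approx R\,\mu\,\eta_2, \qquad C \approx R\,\mu\,\eta_1\eta_2
    \;\Longrightarrow\;
    \eta_1 \approx \frac{C}{S_2}, \quad \eta_2 \approx \frac{C}{S_1}, \quad \mu \approx \frac{S_1 S_2}{R\,C},
    \]

    where \(R\) is the pulse rate, \(\mu\) the pair-emission probability per pulse, \(\eta_{1,2}\) the total channel transmissions, \(S_{1,2}\) the singles rates and \(C\) the coincidence rate.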

  19. Implementation of a burn scar assessment system by ultrasound techniques.

    PubMed

    Du, Yi-Chun; Lin, Chih-Ming; Chen, Yung-Fu; Chen, Chung-Lin; Chen, Tainsong

    2006-01-01

    Tissue injury and the ensuing healing process cause scar formation. In addition to physical disability, the disfigurements that follow burns often have negative psychological impacts on survivors. Scar hypertrophy and contracture limit the joint motion and body function of the patient. With the rapid development of available scar therapies, not only the wound-healing process but also the cosmetic and functional outcomes need to be emphasized. Proper evaluation and assessment of the healing process toward a scar-free status is therefore highly recommended. However, the tools currently employed for scar evaluation are mostly subjective. For example, the Vancouver General Hospital (VGH) scar index uses color, pigmentation, vascularity, pliability, and depth of the scar as dependent variables for scar evaluation. These parameters only describe the superficial surface of the scar and cannot evaluate the deeper tissue within the dermis. Ultrasound is a safe, inexpensive, and multifunctional technique for probing tissue characteristics, and its resolution is not inferior to that of other measurement techniques. Although 3D ultrasound is available in clinical practice, it is still not widely used for scar evaluation because of its high cost. In this study, we propose a system for scar assessment using the B-mode ultrasonic technique. By utilizing reconstruction methods to search the scar border, characteristic parameters including depth, area and volume can be estimated. The proposed method helps the clinician to evaluate the treatment effect and to plan further therapeutic strategy more objectively. In this report, the quantitative assessment system was used to evaluate the scar of a seriously burned patient. In order to verify the reliability of the systematic reconstruction method, we constructed a phantom to imitate scar tissue. The results show that the system can achieve more than 90% accuracy.

  20. Re-Evaluating Split-Fovea Processing in Word Recognition: A Critical Assessment of Recent Research

    ERIC Educational Resources Information Center

    Jordan, Timothy R.; Paterson, Kevin B.

    2009-01-01

    In recent years, some researchers have proposed that a fundamental component of the word recognition process is that each fovea is divided precisely at its vertical midline and that information either side of this midline projects to different, contralateral hemispheres. Thus, when a word is fixated, all letters to the left of the point of…

  1. Evaluation Model for Applying an E-Learning System in a Course: An Analytic Hierarchy Process-Multi-Choice Goal Programming Approach

    ERIC Educational Resources Information Center

    Lin, Teng-Chiao; Ho, Hui-Ping; Chang, Ching-Ter

    2014-01-01

    With the widespread use of the Internet, adopting e-learning systems in courses has gradually become more and more important in universities in Taiwan. However, because of limitations of teachers' time, selecting suitable online IT tools has become very important. This study proposes an analytic hierarchy process (AHP)-multi-choice goal…

  2. A practical framework for data management processes and their evaluation in population-based medical registries.

    PubMed

    Sariyar, M; Borg, A; Heidinger, O; Pommerening, K

    2013-03-01

    We present a framework for data management processes in population-based medical registries. Existing guidelines lack the concreteness we deem necessary for them to be of practical use, especially concerning the establishment of new registries. Therefore, we propose adjustments and concretisations with regard to data quality, data privacy, data security and registry purposes. First, we separately elaborate on the issues to be included into the framework and present proposals for their improvements. Thereafter, we provide a framework for medical registries based on quasi-standard-operation procedures. The main result is a concise and scientifically based framework that tries to be both broad and concrete. Within that framework, we distinguish between data acquisition, data storage and data presentation as sub-headings. We use the framework to categorise and evaluate the data management processes of a German cancer registry. The standardisation of data management processes in medical registries is important to guarantee high quality of the registered data, to enhance the realisation of purposes, to increase efficiency and to enable comparisons between registries. Our framework is destined to show how one central impediment for such standardisations - lack of practicality - can be addressed on scientific grounds.

  3. Using formal methods for content validation of medical procedure documents.

    PubMed

    Cota, Érika; Ribeiro, Leila; Bezerra, Jonas Santos; Costa, Andrei; da Silva, Rosiana Estefane; Cota, Gláucia

    2017-08-01

    We propose the use of a formal approach to support content validation of a standard operating procedure (SOP) for a therapeutic intervention. Such an approach provides a useful tool to identify ambiguities, omissions and inconsistencies, and improves the applicability and efficacy of documents in health settings. We apply and evaluate a methodology originally proposed for the verification of software specification documents on a specific SOP. The verification methodology uses the graph formalism to model the document. Semi-automatic analysis identifies possible problems in the model and in the original document. The verification is an iterative process that identifies possible faults in the original text that should be revised by its authors and/or specialists. The proposed method was able to identify 23 possible issues in the original document (ambiguities, omissions, redundant information, and inaccuracies, among others). The formal verification process helped the specialists to consider a wider range of usage scenarios and to identify which instructions form the kernel of the proposed SOP and which ones represent additional or required knowledge that is mandatory for the correct application of the medical document. By using the proposed verification process, a simpler and yet more complete SOP could be produced. As a consequence, during the validation process the experts received a more mature document and could focus on the technical aspects of the procedure itself. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Evaluating Simultaneous Integrals

    ERIC Educational Resources Information Center

    Kwong, Harris

    2012-01-01

    Many integrals require two successive applications of integration by parts. During the process, another integral of similar type is often invoked. We propose a method which can integrate these two integrals simultaneously. All we need is to solve a linear system of equations.
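
    A standard textbook pair illustrates the idea (this particular example is ours, not necessarily the one used in the article): integrating each of the two integrals by parts once expresses each in terms of the other, giving a linear system.

    With \(I=\int e^{x}\cos x\,dx\) and \(J=\int e^{x}\sin x\,dx\), one integration by parts in each gives
    \[
    I = e^{x}\cos x + J, \qquad J = e^{x}\sin x - I .
    \]
    Solving the two equations simultaneously yields
    \[
    I = \tfrac{1}{2}e^{x}(\sin x + \cos x) + C, \qquad J = \tfrac{1}{2}e^{x}(\sin x - \cos x) + C .
    \]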

  5. 77 FR 56634 - Notice of Proposed Information Collection Requests; Office of Planning, Evaluation and Policy...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-13

    ... data at the center of ED's policy, management, and budget decision-making processes for all K-12... collection and OMB Control Number when making your request. Individuals who use a telecommunications device...

  6. HIGH-TEMPERATURE AND HIGH-PRESSURE PARTICULATE CONTROL REQUIREMENTS

    EPA Science Inventory

    The report reviews and evaluates high-temperature and high-pressure particulate cleanup requirements of existing and proposed energy processes. The study's aims are to define specific high-temperature and high-pressure particle removal problems, to indicate potential solutions, a...

  7. Multicriteria Personnel Selection by the Modified Fuzzy VIKOR Method

    PubMed Central

    Alguliyev, Rasim M.; Aliguliyev, Ramiz M.; Mahmudova, Rasmiyya S.

    2015-01-01

    Personnel evaluation is an important process in human resource management. Its multicriteria nature and the presence of both qualitative and quantitative factors make it considerably more complex. In this study, a fuzzy hybrid multicriteria decision-making (MCDM) model is proposed for personnel evaluation. This model solves the personnel evaluation problem in a fuzzy environment where both criteria and weights can be fuzzy sets. Triangular fuzzy numbers are used to evaluate the suitability of personnel and the approximate reasoning of linguistic values. For evaluation, we selected five information culture criteria. The weights of the criteria were calculated using the worst-case method. After that, a modified fuzzy VIKOR is proposed to rank the alternatives. The outcome of this research is the ranking and selection of the best alternative with the help of the fuzzy VIKOR and modified fuzzy VIKOR techniques. A comparative analysis of the results of the fuzzy VIKOR and modified fuzzy VIKOR methods is presented. Experiments showed that the proposed modified fuzzy VIKOR method has some advantages over the fuzzy VIKOR method. Firstly, from a computational complexity point of view, the presented model is effective. Secondly, it offers a higher acceptability advantage than the fuzzy VIKOR method. PMID:26516634
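
    To make the ranking step concrete, the sketch below implements the crisp core of VIKOR (group utility S, individual regret R, compromise index Q). The fuzzy-number arithmetic, the worst-case weighting and the paper's modification are deliberately omitted, and the candidate scores and weights are invented for illustration only.

    import numpy as np

    def vikor(X, w, benefit, v=0.5):
        """Crisp VIKOR ranking: alternatives with the smallest Q come first."""
        X = np.asarray(X, float)
        f_best = np.where(benefit, X.max(axis=0), X.min(axis=0))
        f_worst = np.where(benefit, X.min(axis=0), X.max(axis=0))
        d = (f_best - X) / (f_best - f_worst)          # normalised regret per criterion
        S = (w * d).sum(axis=1)                        # group utility
        R = (w * d).max(axis=1)                        # individual regret
        Q = v * (S - S.min()) / (S.max() - S.min()) + \
            (1 - v) * (R - R.min()) / (R.max() - R.min())
        return np.argsort(Q)                           # best alternative first

    # Illustrative scores of 4 candidates on 5 information-culture criteria.
    X = [[7, 8, 6, 9, 7], [8, 6, 7, 7, 8], [6, 9, 8, 6, 7], [9, 7, 7, 8, 6]]
    w = np.array([0.3, 0.2, 0.2, 0.15, 0.15])
    print(vikor(X, w, benefit=np.array([True] * 5)))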

  8. Health state evaluation of shield tunnel SHM using fuzzy cluster method

    NASA Astrophysics Data System (ADS)

    Zhou, Fa; Zhang, Wei; Sun, Ke; Shi, Bin

    2015-04-01

    Shield tunnel SHM is currently developing rapidly, while processing massive monitoring data and grading health quantitatively remain real challenges, since multiple sensors of different types are employed in an SHM system. This paper addresses a fuzzy cluster method based on the fuzzy equivalence relationship for the health evaluation of shield tunnel SHM. The method was optimized by exporting the FSV map to automatically generate the threshold value. A new holistic health score (HHS) was proposed and its effectiveness was validated by conducting a pilot test. A case study on the Nanjing Yangtze River Tunnel is presented to apply this method. Three types of indicators, namely soil pressure, pore pressure and steel strain, were used to develop the evaluation set U. The clustering results were verified by analyzing the engineering geological conditions, and the applicability and validity of the proposed method were demonstrated. Besides, the advantage of multi-factor evaluation over the single-factor model is discussed by using the proposed HHS. This investigation indicates that the fuzzy cluster method and the HHS are capable of characterizing the fuzziness of tunnel health, and that they are beneficial for clarifying the uncertainties in tunnel health evaluation.
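
    The fuzzy-equivalence clustering step can be sketched as follows: a fuzzy similarity matrix is made max-min transitive by repeated composition and then cut at a threshold lambda to yield crisp clusters. The similarity values and the lambda used here are invented; the paper additionally derives the threshold automatically from the FSV map and feeds the clusters into the holistic health score.

    import numpy as np

    def maxmin_compose(A, B):
        """Max-min composition of two fuzzy relations."""
        return np.max(np.minimum(A[:, :, None], B[None, :, :]), axis=1)

    def transitive_closure(R):
        """Iterate R <- max(R, R o R) until the fuzzy similarity relation
        becomes a fuzzy equivalence relation (max-min transitive)."""
        while True:
            R2 = np.maximum(R, maxmin_compose(R, R))
            if np.allclose(R2, R):
                return R2
            R = R2

    def lambda_cut_clusters(R, lam):
        """Group indices whose closure similarity exceeds the threshold lambda.
        Row-wise assignment is safe because R is an equivalence relation."""
        n = R.shape[0]
        labels, current = -np.ones(n, int), 0
        for i in range(n):
            if labels[i] < 0:
                labels[R[i] >= lam] = current
                current += 1
        return labels

    # Illustrative similarity matrix for 5 monitored tunnel sections (not real data).
    R = np.array([[1.0, 0.8, 0.4, 0.5, 0.3],
                  [0.8, 1.0, 0.4, 0.5, 0.3],
                  [0.4, 0.4, 1.0, 0.4, 0.3],
                  [0.5, 0.5, 0.4, 1.0, 0.3],
                  [0.3, 0.3, 0.3, 0.3, 1.0]])
    print(lambda_cut_clusters(transitive_closure(R), lam=0.5))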

  9. Community of Practice: A Path to Strategic Learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nancy M. Carlson

    2003-04-01

    To explore the concept of community of practice, the research initially concentrates on a strategic business process in a research and applied engineering laboratory discovering essential communication tools and processes needed to cultivate a high functioning cross-disciplinary team engaged in proposal preparation. Qualitative research in the human ecology of the proposal process blends topic-oriented ethnography and grounded theory and includes an innovative addition to qualitative interviewing, called meta-inquiry. Meta-inquiry uses an initial interview protocol with a homogeneous pool of informants to enhance the researcher's sensitivity to the unique cultures involved in the proposal process before developing a formal interview protocol. In this study the preanalysis process uses data from editors, graphic artists, text processors, and production coordinators to assess, modify, enhance, and focus the formal interview protocol with scientists, engineers, and technical managers - the heterogeneous informants. Thus this human ecology-based interview protocol values homogeneous and heterogeneous informant data and acquires data from which concepts, categories, properties, and both substantive and formal theory emerges. The research discovers the five essential processes of owning, visioning, reviewing, producing, and contributing for strategic learning to occur in a proposal community of practice. The apprenticeship, developmental, and nurturing perspectives of adult learning provide the proposal community of practice with cohesion, interdependence, and caring, while core and boundary practices provide insight into the tacit and explicit dimensions of the proposal process. By making these dimensions explicit, the necessary competencies, absorptive capacity, and capabilities needed for strategic learning are discovered. Substantive theory emerges and provides insight into the ability of the proposal community of practice to evolve, flourish, and adapt to the strategic advantage of the laboratory. The substantive theory explores the dimensions of owning, visioning, reviewing, producing, and contributing and their interrelationship to community learning dynamics. Through dialogue, creative tension, and imagination, the proposal community of practice focuses on actionable goals linked by proactively participating in practice, creating possibilities, evaluating and enhancing potential, producing a valued product, and confirming strategic value. Lastly, a formal theory emerges linking competency-capacity-capability, cohesion, interdependence, and caring as essential attributes of strategic learning communities.

  10. Deliberation before determination: the definition and evaluation of good decision making.

    PubMed

    Elwyn, Glyn; Miron-Shatz, Talya

    2010-06-01

    In this article, we examine definitions of suggested approaches to measuring the concept of good decisions, highlight the ways in which they converge, and explain why we have concerns about their emphasis on post-hoc estimations and post-decisional outcomes, their prescriptive concept of knowledge, and their lack of distinction between the process of deliberation and the act of decision determination. There has been a steady trend to involve patients in decision-making tasks in clinical practice, part of a shift away from paternalism towards the concept of informed choice. An increased understanding of the uncertainties that exist in medicine, arising from a weak evidence base and, in addition, the stochastic nature of outcomes at the individual level, has contributed to shifting the responsibility for decision making from physicians to patients. This has led to increasing use of decision support and communication methods, with the ultimate aim of improving decision making by patients. Interest has therefore developed in attempting to define good decision making and in the development of measurement approaches. We ask whether decisions can be judged good or not and, if so, how this goodness might be evaluated. We hypothesize that decisions cannot be measured by reference to their outcomes and offer an alternative means of assessment, which emphasizes the deliberation process rather than the decision's end results. We propose that decision making comprises a pre-decisional process and an act of decision determination, and consider how this model of decision making serves to develop a new approach to evaluating what constitutes a good decision-making process. We proceed to offer an alternative, which parses decisions into the pre-decisional deliberation process, the act of determination, and post-decisional outcomes. Evaluating the deliberation process, we propose, should comprise a subjective sufficiency of knowledge, as well as emotional processing and affective forecasting of the alternatives. This should form the basis for a good act of determination.

  11. The role of laser technology in materials processing and nondestructive testing in the 21st century

    NASA Astrophysics Data System (ADS)

    Sheinberg, B. M.

    Some of the potential applications of laser technology in the 21st century are explored, and the proposed role of this technology in relation to materials processing, nondestructive testing, and quality control is discussed. Examples illustrating the implementation of this technology include the proposed construction of vehicles and platforms in near and deep space, and the construction of underwater platforms. The direction in which today's technology should evolve to pursue such goals is indicated. Included in the discussion is an evaluation of laser, robotics, and fiber optics technologies with respect to their ability to achieve a synergistic level of operation.

  12. Nutrition in primary health care: using a Delphi process to design new interdisciplinary services.

    PubMed

    Brauer, Paula; Dietrich, Linda; Davidson, Bridget

    2006-01-01

    A modified Delphi process was used to identify key features of interdisciplinary nutrition services, including provider roles and responsibilities for Ontario Family Health Networks (FHNs), a family physician-based type of primary care. Twenty-three representatives from interested professional organizations, including three FHN demonstration sites, completed a modified Delphi process. Participants reviewed evidence from a systematic literature review, a patient survey, a costing analysis, and key informant interview results before undertaking the Delphi process. Statements describing various options for services were developed at an in-person meeting, which was followed by two rounds of e-mail questionnaires. Teleconference discussions were held between rounds. An interdisciplinary model with differing and complementary roles for health care providers emerged from the process. Additional key features addressing screening for nutrition problems, health promotion and disease prevention, team collaboration, planning and evaluation, administrative support, access to care, and medical directives/delegated acts were identified. Under the proposed model, the registered dietitian is the team member responsible for managing all aspects of nutrition services, from needs assessment to program delivery, as well as for supporting all providers' nutrition services. The proposed interdisciplinary nutrition services model merits evaluation of cost, effectiveness, applicability, and sustainability in team-based primary care service settings.

  13. Electromagnetic pulsed thermography for natural cracks inspection

    NASA Astrophysics Data System (ADS)

    Gao, Yunlai; Tian, Gui Yun; Wang, Ping; Wang, Haitao; Gao, Bin; Woo, Wai Lok; Li, Kongjing

    2017-02-01

    Emerging integrated sensing and monitoring of material degradation and cracks is increasingly required for characterizing the structural integrity and safety of infrastructure. However, most conventional nondestructive evaluation (NDE) methods are based on single-modality sensing, which is not adequate for evaluating structural integrity and natural cracks. This paper proposes electromagnetic pulsed thermography for fast and comprehensive defect characterization. It hybridizes multiple physical phenomena, i.e. magnetic flux leakage, induced eddy currents and induction heating, together with signal processing algorithms, to provide abundant information on material properties and defects. New features based on the first derivative, reflecting multiphysics spatial and temporal behaviors, are proposed to enhance the detection of cracks with different orientations. Promising results, robust to lift-off changes and with invariant features for the detection of artificial and natural cracks, demonstrate that the proposed method significantly improves defect detectability. It opens up multiphysics sensing and integrated NDE with potential impact on the understanding and better quantitative evaluation of natural cracks, including stress corrosion cracking (SCC) and rolling contact fatigue (RCF).

  14. Enhancement of Micropollutant Degradation at the Outlet of Small Wastewater Treatment Plants

    PubMed Central

    Rossi, Luca; Queloz, Pierre; Brovelli, Alessandro; Margot, Jonas; Barry, D. A.

    2013-01-01

    The aim of this work was to evaluate low-cost and easy-to-operate engineering solutions that can be added as a polishing step to small wastewater treatment plants to reduce the micropollutant load to water bodies. The proposed design combines a sand filter/constructed wetland with additional and more advanced treatment technologies (UV degradation, enhanced adsorption to the solid phase, e.g., an engineered substrate) to increase the elimination of recalcitrant compounds. The removal of five micropollutants with different physico-chemical characteristics (three pharmaceuticals: diclofenac, carbamazepine, sulfamethoxazole, one pesticide: mecoprop, and one corrosion inhibitor: benzotriazole) was studied to evaluate the feasibility of the proposed system. Separate batch experiments were conducted to assess the removal efficiency of UV degradation and adsorption. The efficiency of each individual process was substance-specific. No process was effective on all the compounds tested, although elimination rates over 80% using light expanded clay aggregate (an engineered material) were observed. A laboratory-scale flow-through setup was used to evaluate interactions when removal processes were combined. Four of the studied compounds were partially eliminated, with poor removal of the fifth (benzotriazole). The energy requirements for a field-scale installation were estimated to be the same order of magnitude as those of ozonation and powdered activated carbon treatments. PMID:23484055

  15. A Combination of Extended Fuzzy AHP and Fuzzy GRA for Government E-Tendering in Hybrid Fuzzy Environment

    PubMed Central

    Wang, Yan; Xi, Chengyu; Zhang, Shuai; Yu, Dejian; Zhang, Wenyu; Li, Yong

    2014-01-01

    The recent government tendering process being conducted in an electronic way is becoming an inevitable affair for numerous governmental agencies to further exploit the superiorities of conventional tendering. Thus, developing an effective web-based bid evaluation methodology so as to realize an efficient and effective government E-tendering (GeT) system is imperative. This paper firstly investigates the potentiality of employing fuzzy analytic hierarchy process (AHP) along with fuzzy gray relational analysis (GRA) for optimal selection of candidate tenderers in GeT process with consideration of a hybrid fuzzy environment with incomplete weight information. We proposed a novel hybrid fuzzy AHP-GRA (HFAHP-GRA) method that combines an extended fuzzy AHP with a modified fuzzy GRA. The extended fuzzy AHP which combines typical AHP with interval AHP is proposed to obtain the exact weight information, and the modified fuzzy GRA is applied to aggregate different types of evaluation information so as to identify the optimal candidate tenderers. Finally, a prototype system is built and validated with an illustrative example for GeT to confirm the feasibility of our approach. PMID:25057506

  16. A combination of extended fuzzy AHP and fuzzy GRA for government E-tendering in hybrid fuzzy environment.

    PubMed

    Wang, Yan; Xi, Chengyu; Zhang, Shuai; Yu, Dejian; Zhang, Wenyu; Li, Yong

    2014-01-01

    The recent government tendering process being conducted in an electronic way is becoming an inevitable affair for numerous governmental agencies to further exploit the superiorities of conventional tendering. Thus, developing an effective web-based bid evaluation methodology so as to realize an efficient and effective government E-tendering (GeT) system is imperative. This paper firstly investigates the potentiality of employing fuzzy analytic hierarchy process (AHP) along with fuzzy gray relational analysis (GRA) for optimal selection of candidate tenderers in GeT process with consideration of a hybrid fuzzy environment with incomplete weight information. We proposed a novel hybrid fuzzy AHP-GRA (HFAHP-GRA) method that combines an extended fuzzy AHP with a modified fuzzy GRA. The extended fuzzy AHP which combines typical AHP with interval AHP is proposed to obtain the exact weight information, and the modified fuzzy GRA is applied to aggregate different types of evaluation information so as to identify the optimal candidate tenderers. Finally, a prototype system is built and validated with an illustrative example for GeT to confirm the feasibility of our approach.
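
    To make the aggregation step concrete, the sketch below implements classical (crisp) grey relational analysis, which underlies the modified fuzzy GRA used for ranking tenderers. The scores, weights and resolving coefficient (zeta = 0.5) are illustrative assumptions; the fuzzy extensions and the AHP-derived weights of the paper are not reproduced here.

    import numpy as np

    def grey_relational_grade(X, w, benefit, zeta=0.5):
        """Classical grey relational analysis (GRA): normalise the decision
        matrix, measure the distance to the ideal sequence, and aggregate the
        grey relational coefficients with the criterion weights."""
        X = np.asarray(X, float)
        lo, hi = X.min(axis=0), X.max(axis=0)
        norm = np.where(benefit, (X - lo) / (hi - lo), (hi - X) / (hi - lo))
        delta = np.abs(1.0 - norm)                     # deviation from the ideal sequence
        coef = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
        return (w * coef).sum(axis=1)                  # higher grade = better tenderer

    # Illustrative scores of 3 candidate tenderers on 4 criteria (the last is a cost).
    X = [[82, 7.5, 90, 3.2], [88, 6.8, 85, 2.9], [79, 8.1, 88, 3.5]]
    w = np.array([0.4, 0.2, 0.2, 0.2])                 # e.g. weights from the fuzzy AHP step
    benefit = np.array([True, True, True, False])
    print(grey_relational_grade(X, w, benefit))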

  17. Evaluating Core Quality for a Mars Sample Return Mission

    NASA Technical Reports Server (NTRS)

    Weiss, D. K.; Budney, C.; Shiraishi, L.; Klein, K.

    2012-01-01

    Sample return missions, including the proposed Mars Sample Return (MSR) mission, propose to collect core samples from scientifically valuable sites on Mars. These core samples would undergo extreme forces during the drilling process, and during the reentry process if the EEV (Earth Entry Vehicle) performed a hard landing on Earth. Because of the foreseen damage to the stratigraphy of the cores, it is important to evaluate each core for rock quality. However, because no core sample return mission has yet been conducted to another planetary body, it remains unclear as to how to assess the cores for rock quality. In this report, we describe the development of a metric designed to quantitatively assess the mechanical quality of any rock cores returned from Mars (or other planetary bodies). We report on the process by which we tested the metric on core samples of Mars analogue materials, and the effectiveness of the core assessment metric (CAM) in assessing rock core quality before and after the cores were subjected to shocking (g forces representative of an EEV landing).

  18. What Counts is not Falling … but Landing

    PubMed Central

    BROUSSELLE, ASTRID

    2012-01-01

    Implementation evaluations, also called process evaluations, involve studying the development of programmes, and identifying and understanding their strengths and weaknesses. Undertaking an implementation evaluation offers insights into evaluation objectives, but does not help the researcher develop a research strategy. During the implementation analysis of the UNAIDS drug access initiative in Chile, the strategic analysis model developed by Crozier and Friedberg was used. However, a major incompatibility was noted between the procedure put forward by Crozier and Friedberg and the specific characteristics of the programme being evaluated. In this article, an adapted strategic analysis model for programme evaluation is proposed. PMID:23526306

  19. Transportation systems evaluation methodology development and applications, phase 3

    NASA Technical Reports Server (NTRS)

    Kuhlthau, A. R.; Jacobson, I. D.; Richards, L. C.

    1981-01-01

    Transportation systems or proposed changes in current systems are evaluated. Four principal evaluation criteria are incorporated in the process: operating performance characteristics as viewed by potential users; decisions based on the perceived impacts of the system; estimating what is required to reduce the system to practice; and predicting the ability of the concept to attract financial support. A series of matrix multiplications, in which the various matrices represent evaluations in a logical sequence of the various discrete steps in a management decision process, is used. One or more alternatives are compared with the current situation, and the result provides a numerical rating which determines the desirability of each alternative relative to the norm and to each other. The steps in the decision process are isolated so that the contributions of each to the final result are readily analyzed. The ability to protect against bias on the part of the evaluators, and the fact that system parameters which are basically qualitative in nature can be easily included, are advantageous.

  20. National Security Technology Incubator Evaluation Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None, None

    This report describes the process by which the National Security Technology Incubator (NSTI) will be evaluated. The technology incubator is being developed as part of the National Security Preparedness Project (NSPP), funded by a Department of Energy (DOE)/National Nuclear Security Administration (NNSA) grant. This report includes a brief description of the components, steps, and measures of the proposed evaluation process. The purpose of the NSPP is to promote national security technologies through business incubation, technology demonstration and validation, and workforce development. The NSTI will focus on serving businesses with national security technology applications by nurturing them through critical stages of early development. An effective evaluation process of the NSTI is an important step as it can provide qualitative and quantitative information on incubator performance over a given period. The vision of the NSTI is to be a successful incubator of technologies and private enterprise that assist the NNSA in meeting new challenges in national safety and security. The mission of the NSTI is to identify, incubate, and accelerate technologies with national security applications at various stages of development by providing hands-on mentoring and business assistance to small businesses and emerging or growing companies. To achieve success for both incubator businesses and the NSTI program, an evaluation process is essential to effectively measure results and implement corrective processes in the incubation design if needed. The evaluation process design will collect and analyze qualitative and quantitative data through a performance evaluation system.

  1. Development, implementation and evaluation of an evidence-based program for introduction of new health technologies and clinical practices in a local healthcare setting.

    PubMed

    Harris, Claire; Garrubba, Marie; Allen, Kelly; King, Richard; Kelly, Cate; Thiagarajan, Malar; Castleman, Beverley; Ramsey, Wayne; Farjou, Dina

    2015-12-28

    This paper reports the process of establishing a transparent, accountable, evidence-based program for introduction of new technologies and clinical practices (TCPs) in a large Australian healthcare network. Many countries have robust evidence-based processes for assessment of new TCPs at national level. However many decisions are made by local health services where the resources and expertise to undertake health technology assessment (HTA) are limited and a lack of structure, process and transparency has been reported. An evidence-based model for process change was used to establish the program. Evidence from research and local data, experience of health service staff and consumer perspectives were incorporated at each of four steps: identifying the need for change, developing a proposal, implementation and evaluation. Checklists assessing characteristics of success, factors for sustainability and barriers and enablers were applied and implementation strategies were based on these findings. Quantitative and qualitative methods were used for process and outcome evaluation. An action research approach underpinned ongoing refinement to systems, processes and resources. A Best Practice Guide developed from the literature and stakeholder consultation identified seven program components: Governance, Decision-Making, Application Process, Monitoring and Reporting, Resources, Administration, and Evaluation and Quality Improvement. The aims of transparency and accountability were achieved. The processes are explicit, decisions published, outcomes recorded and activities reported. The aim of ascertaining rigorous evidence-based information for decision-making was not achieved in all cases. Applicants proposing new TCPs provided the evidence from research literature and local data however the information was often incorrect or inadequate, overestimating benefits and underestimating costs. Due to these limitations the initial application process was replaced by an Expression of Interest from applicants followed by a rigorous HTA by independent in-house experts. The program is generalisable to most health care organisations. With one exception, the components would be achievable with minimal additional resources; the lack of skills and resources required for HTA will limit effective application in many settings. A toolkit containing details of the processes and sample materials is provided to facilitate replication or local adaptation by those wishing to establish a similar program.

  2. q-Gaussian distributions and multiplicative stochastic processes for analysis of multiple financial time series

    NASA Astrophysics Data System (ADS)

    Sato, Aki-Hiro

    2010-12-01

    This study considers q-Gaussian distributions and stochastic differential equations with both multiplicative and additive noises. In the M-dimensional case, a q-Gaussian distribution can be theoretically derived as the stationary probability distribution of a stochastic differential equation with mutually independent multiplicative and additive noises. Based on this stochastic differential equation, a method to evaluate a default probability under a given risk buffer is proposed.
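
    A one-dimensional numerical sketch of such a process is given below: an Euler-Maruyama simulation of a linear SDE with independent multiplicative and additive noises, whose stationary density is heavy-tailed (q-Gaussian-like). The drift/noise parameterisation is a common textbook form chosen for illustration and is not necessarily the M-dimensional equation analysed in the paper.

    import numpy as np

    def simulate_mult_add_sde(gamma=1.0, m=0.3, a=0.5, dt=1e-3, n_steps=100_000, seed=0):
        """Euler-Maruyama simulation of
            dx = -gamma * x dt + sqrt(2*m) * x dW1 + sqrt(2*a) * dW2
        with independent Wiener processes W1 (multiplicative) and W2 (additive)."""
        rng = np.random.default_rng(seed)
        x = np.empty(n_steps)
        x[0] = 0.0
        for t in range(1, n_steps):
            dw1, dw2 = rng.normal(0, np.sqrt(dt), size=2)
            x[t] = x[t-1] - gamma * x[t-1] * dt \
                   + np.sqrt(2*m) * x[t-1] * dw1 + np.sqrt(2*a) * dw2
        return x

    x = simulate_mult_add_sde()
    tail = x[20_000:]                              # discard the transient
    print(np.mean(np.abs(tail) > 3 * tail.std()))  # compare with ~0.0027 for a Gaussian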

  3. [The external evaluation of study quality: the role in maintaining the reliability of laboratory information].

    PubMed

    Men'shikov, V V

    2013-08-01

    External quality evaluation of clinical laboratory examinations was gradually introduced into USSR medical laboratories beginning in the 1970s. In Russia, in the mid-1990s, a unified national system of external quality evaluation was organized, known as the Federal Center of External Quality Evaluation, on the basis of a laboratory of the State Research Center of Preventive Medicine. The main policy positions in this area were clearly formulated in the guidance documents of the Ministry of Health. Nowadays, the center of external quality evaluation offers more than 100 types of control studies and continuously extends their spectrum, starting from the interests of the different disciplines of clinical medicine. The consistent participation of laboratories in the cycles of external quality evaluation promotes improvement in the trueness and precision of analysis results and increases the reliability of laboratory information. However, a significant percentage of laboratories does not participate at all in external quality evaluation, or takes part in the control process irregularly and for a limited number of tests. The managers of a number of medical organizations disregard the proposed possibilities for increasing the reliability of laboratory information and limit the financing of quality control studies. The article proposes adopting a national standard on the basis of ISO 17043 "Conformity assessment. General requirements for proficiency testing".

  4. RISMA: A Rule-based Interval State Machine Algorithm for Alerts Generation, Performance Analysis and Monitoring Real-Time Data Processing

    NASA Astrophysics Data System (ADS)

    Laban, Shaban; El-Desouky, Aly

    2013-04-01

    The monitoring of real-time systems is a challenging and complicated process, so there is a continuous need to improve it through the use of new intelligent techniques and algorithms for detecting exceptions and anomalous behaviours and for generating the necessary alerts during workflow monitoring of such systems. Interval-based or period-based theorems have been discussed, analysed, and used by many researchers in Artificial Intelligence (AI), philosophy, and linguistics. As explained by Allen, there are 13 relations between any two intervals, and there have been many studies of interval-based temporal reasoning and logics over the past decades. Interval-based theorems can be used for monitoring real-time interval-based data processing. However, increasing the number of processed intervals makes the implementation of such theorems a complex and time-consuming process, as the number of relationships between intervals grows rapidly. To overcome this problem, this paper presents a Rule-based Interval State Machine Algorithm (RISMA) for processing, monitoring, and analysing the behaviour of interval-based data received from real-time sensors. The proposed intelligent algorithm uses the Interval State Machine (ISM) approach to model any number of interval-based data into well-defined states, as well as to infer them. An interval-based state transition model and methodology are presented to identify the relationships between the different states of the proposed algorithm. By using such a model, the unlimited number of relationships between arbitrarily large numbers of intervals can be reduced to only 18 direct relationships using the proposed well-defined states. For testing the proposed algorithm, the necessary inference rules and code have been designed and applied to the continuous data received in near real time from the stations of the International Monitoring System (IMS) by the International Data Centre (IDC) of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). The CLIPS expert system shell has been used as the main rule engine for implementing the algorithm rules. The Python programming language and the module "PyCLIPS" are used for building the necessary code for the algorithm implementation. More than 1.7 million intervals constituting the Concise List of Frames (CLF) from 20 different seismic stations have been used to evaluate the proposed algorithm and to assess station behaviour and performance. The initial results showed that the proposed algorithm can help in better understanding the operation and performance of those stations. Important information, such as alerts and some station performance parameters, can be derived from the proposed algorithm. For IMS interval-based data and at any period of time, it is possible to analyse station behaviour, determine missing data, generate necessary alerts, and measure some station performance attributes. The details of the proposed algorithm, the methodology, its implementation, experimental results, advantages, and limitations of this research are presented. Finally, future directions and recommendations are discussed.
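
    Since the algorithm builds on Allen's interval relations, a minimal sketch of how two intervals can be classified into one of the 13 relations is given below. The Interval type and the station-data example are illustrative only; RISMA's own 18-relationship, state-based model is not reproduced here.

    from dataclasses import dataclass

    @dataclass
    class Interval:
        start: float
        end: float

    def allen_relation(a: Interval, b: Interval) -> str:
        """Return one of Allen's 13 qualitative relations between intervals a and b
        (both assumed well-formed, i.e. start < end)."""
        if a.end < b.start:   return "before"
        if b.end < a.start:   return "after"
        if a.end == b.start:  return "meets"
        if b.end == a.start:  return "met-by"
        if a.start == b.start and a.end == b.end: return "equals"
        if a.start == b.start: return "starts" if a.end < b.end else "started-by"
        if a.end == b.end:     return "finishes" if a.start > b.start else "finished-by"
        if b.start < a.start and a.end < b.end:   return "during"
        if a.start < b.start and b.end < a.end:   return "contains"
        return "overlaps" if a.start < b.start else "overlapped-by"

    # Two illustrative data-availability intervals from a monitoring station.
    print(allen_relation(Interval(0, 10), Interval(5, 20)))   # -> "overlaps"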

  5. Liquid rocket booster integration study. Volume 3: Study products. Part 2: Sections 8-19

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The impacts of introducing liquid rocket booster engines (LRB) into the Space Transportation System (STS)/Kennedy Space Center (KSC) launch environment are identified and evaluated. Proposed ground systems configurations are presented along with a launch site requirements summary. Prelaunch processing scenarios are described and the required facility modifications and new facility requirements are analyzed. Flight vehicle design recommendations to enhance launch processing are discussed. Processing approaches to integrate LRB with existing STS launch operations are evaluated. The key features and significance of launch site transition to a new STS configuration in parallel with ongoing launch activities are enumerated. This volume is part two of the study products section of the five volume series.

  6. Liquid rocket booster integration study. Volume 3, part 1: Study products

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The impacts of introducing liquid rocket booster engines (LRB) into the Space Transportation System (STS)/Kennedy Space Center (KSC) launch environment are identified and evaluated. Proposed ground systems configurations are presented along with a launch site requirements summary. Prelaunch processing scenarios are described and the required facility modifications and new facility requirements are analyzed. Flight vehicle design recommendations to enhance launch processing are discussed. Processing approaches to integrate LRB with existing STS launch operations are evaluated. The key features and significance of launch site transition to a new STS configuration in parallel with ongoing launch activities are enumerated. This volume is part one of the study products section of the five volume series.

  7. Real-time multiple objects tracking on Raspberry-Pi-based smart embedded camera

    NASA Astrophysics Data System (ADS)

    Dziri, Aziz; Duranton, Marc; Chapuis, Roland

    2016-07-01

    Multiple-object tracking constitutes a major step in several computer vision applications, such as surveillance, advanced driver assistance systems, and automatic traffic monitoring. Because of the number of cameras used to cover a large area, these applications are constrained by the cost of each node, the power consumption, the robustness of the tracking, the processing time, and the ease of deployment of the system. To meet these challenges, the use of low-power and low-cost embedded vision platforms to achieve reliable tracking becomes essential in networks of cameras. We propose a tracking pipeline that is designed for fixed smart cameras and which can handle occlusions between objects. We show that the proposed pipeline reaches real-time processing on a low-cost embedded smart camera composed of a Raspberry-Pi board and a RaspiCam camera. The tracking quality and the processing speed obtained with the proposed pipeline are evaluated on publicly available datasets and compared to the state-of-the-art methods.

  8. Research on simulated infrared image utility evaluation using deep representation

    NASA Astrophysics Data System (ADS)

    Zhang, Ruiheng; Mu, Chengpo; Yang, Yu; Xu, Lixin

    2018-01-01

    Infrared (IR) image simulation is an important data source for various target recognition systems. However, whether simulated IR images can be used as training data for classifiers depends on their fidelity and authenticity. For the evaluation of IR image features, a deep-representation-based algorithm is proposed. Unlike conventional methods, which usually rely on a priori knowledge or manually designed features, the proposed method can extract essential features and quantitatively evaluate the utility of simulated IR images. First, for data preparation, we employ our IR image simulation system to generate large numbers of IR images. Then, we present the evaluation model for simulated IR images, for which an end-to-end IR feature extraction and target detection model based on a deep convolutional neural network is designed. Finally, the experiments illustrate that our proposed method outperforms other verification algorithms in evaluating simulated IR images. Cross-validation, variable-proportion mixed-data validation, and simulation process contrast experiments are carried out to evaluate the utility and objectivity of the images generated by our simulation system. The optimum mixing ratio between simulated and real data is 0.2≤γ≤0.3, which makes this an effective data augmentation method for real IR images.

  9. Hybrid clustering based fuzzy structure for vibration control - Part 1: A novel algorithm for building neuro-fuzzy system

    NASA Astrophysics Data System (ADS)

    Nguyen, Sy Dzung; Nguyen, Quoc Hung; Choi, Seung-Bok

    2015-01-01

    This paper presents a new algorithm, called B-ANFIS, for building an adaptive neuro-fuzzy inference system (ANFIS) from a training data set. In order to increase the accuracy of the model, the following steps are executed. Firstly, a data merging rule is proposed to build and perform a data-clustering strategy. Subsequently, a combination of clustering processes in the input data space and in the joint input-output data space is presented. The crucial reason for this is to overcome problems related to initialization and contradictory fuzzy rules, which usually arise when building an ANFIS. The clustering process in the input data space is accomplished with a proposed merging-possibilistic clustering (MPC) algorithm. The effectiveness of this process is evaluated before resuming the clustering process in the joint input-output data space. The optimal parameters obtained after completion of the clustering process are used to build the ANFIS. Simulations based on a numerical data set, 'Daily Data of Stock A', and measured data sets from a smart damper are performed to analyze and estimate accuracy. In addition, the convergence and robustness of the proposed algorithm are investigated using both theoretical and testing approaches.

  10. MEMBRANE-MEDIATED EXTRACTION AND BIODEGRADATION OF VOCS FROM AIR

    EPA Science Inventory

    The paper discusses a project designed to evaluate the feasibility of using a membrane-supported extraction and biotreatment process to meet the National Emissions Standard for Hazardous Air Pollutants (NESHAP) for aircraft painting and depainting facilities. The proposed system...

  11. COAL CONVERSION CONTROL TECHNOLOGY. VOLUME I. ENVIRONMENTAL REGULATIONS; LIQUID EFFLUENTS

    EPA Science Inventory

    This volume is the product of an information-gathering effort relating to coal conversion process streams. Available and developing control technology has been evaluated in view of the requirements of present and proposed federal, state, regional, and international environmental ...

  12. COAL CONVERSION CONTROL TECHNOLOGY. VOLUME II. GASEOUS EMISSIONS; SOLID WASTES

    EPA Science Inventory

    This volume is the product of an information-gathering effort relating to coal conversion process streams. Available and developing control technology has been evaluated in view of the requirements of present and proposed federal, state, regional, and international environmental ...

  13. 78 FR 21150 - Notice of Lodging of Proposed Amendment to Consent Decree Under the Clean Water Act

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-09

    ... forth a phased sequence and schedule for the decision-making process of HRSD and the Localities as they... \\1\\ are evaluating the potential benefits and feasibility of regionalization and consolidation of the...

  14. Evaluating Accuracy of the Sunnova Pro Platform Shade Measurement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sunnova's new solar energy design platform, Sunnova Pro, automatically generates a 3D model of a building and surrounding shading objects. The product is designed to automate the process of engineering a system, sizing batteries and preparing sales proposals.

  15. Mediators of Stereotype Threat Induced by Diagnostic Testing by Stereotype Relevant Out-Group Evaluators: The Roles of Evaluator's Racial Fairness and Support on Performance Outcome

    ERIC Educational Resources Information Center

    Wilburn, Grady Akile

    2010-01-01

    There is an abundance of research that examines the social-psychological phenomenon called stereotype threat. There is not, however, a conclusive understanding of the processes and mechanisms that operate within stereotype threat that produce reduced performance in a negative stereotype relevant area. This dissertation proposes that there are…

  16. Required Delivery Date, an Alternative to Procurement Administrative Lead Time

    DTIC Science & Technology

    1993-12-01

    quality services we may be shut out by the customer. He mentioned two changes in the area of small purchase that may impact the process, the proposed... sweatshop mentality, do more vendor quality evaluations, and overall make the process more responsive to the customer. When questioned about RDD, he said...larger, more economical quantities, a habit that would also cut down on the number of requisitions to be processed. This may, however, have some

  17. Validation Process Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewis, John E.; English, Christine M.; Gesick, Joshua C.

    This report documents the validation process as applied to projects awarded through Funding Opportunity Announcements (FOAs) within the U.S. Department of Energy Bioenergy Technologies Office (DOE-BETO). It describes the procedures used to protect and verify project data, as well as the systematic framework used to evaluate and track performance metrics throughout the life of the project. This report also describes the procedures used to validate the proposed process design, cost data, analysis methodologies, and supporting documentation provided by the recipients.

  18. A new frequency approach for light flicker evaluation in electric power systems

    NASA Astrophysics Data System (ADS)

    Feola, Luigi; Langella, Roberto; Testa, Alfredo

    2015-12-01

    In this paper, a new analytical estimator for light flicker in the frequency domain is proposed which is able to take into account the frequency components neglected by the classical methods in the literature. The analytical solutions apply to any generic stationary signal affected by interharmonic distortion. The estimator is applied to numerous numerical case studies with the goal of showing i) the correctness of the analytical approach and its improvements with respect to other methods in the literature, and ii) the accuracy of the results compared to those obtained by means of the classical International Electrotechnical Commission (IEC) flickermeter. The usefulness of the proposed analytical approach is that it can be included in signal processing tools for interharmonic penetration studies supporting the integration of renewable energy sources in future smart grids.

  19. A new approach to the concept of "relevance" in information retrieval (IR).

    PubMed

    Kagolovsky, Y; Möhr, J R

    2001-01-01

    The concept of "relevance" is the fundamental concept of information science in general and information retrieval, in particular. Although "relevance" is extensively used in evaluation of information retrieval, there are considerable problems associated with reaching an agreement on its definition, meaning, evaluation, and application in information retrieval. There are a number of different views on "relevance" and its use for evaluation. Based on a review of the literature the main problems associated with the concept of "relevance" in information retrieval are identified. The authors argue that the proposal for the solution of the problems can be based on the conceptual IR framework built using a systems analytic approach to IR. Using this framework different kinds of "relevance" relationships in the IR process are identified, and a methodology for evaluation of "relevance" based on methods of semantics capturing and comparison is proposed.

  20. Using hybrid method to evaluate the green performance in uncertainty.

    PubMed

    Tseng, Ming-Lang; Lan, Lawrence W; Wang, Ray; Chiu, Anthony; Cheng, Hui-Ping

    2011-04-01

    Green performance measurement is vital for enterprises making continuous improvements to maintain sustainable competitive advantages. Evaluation of green performance, however, is a challenging task due to the dependence among aspects and criteria and to the linguistic vagueness of some qualitative information mixed with quantitative data. To deal with this issue, this study proposes a novel approach to evaluate the dependent aspects and criteria of a firm's green performance. The rationale of the proposed approach, namely the green network balanced scorecard, is to use the balanced scorecard to combine fuzzy set theory with the analytical network process (ANP) and importance-performance analysis (IPA), wherein fuzzy set theory accounts for the linguistic vagueness of qualitative criteria and ANP converts the relations among the dependent aspects and criteria into an intelligible structural model used in IPA. For the empirical case study, four dependent aspects and 34 green performance criteria for PCB firms in Taiwan were evaluated. The managerial implications are discussed.

  1. Psychometric evaluation of the Swedish adaptation of the Inventory for Assessing the Process of Cultural Competence Among Healthcare Professionals--Revised (IAPCC-R).

    PubMed

    Olt, Helen; Jirwe, Maria; Gustavsson, Petter; Emami, Azita

    2010-01-01

    The purpose of this study was to describe the translation, adaptation, and psychometric evaluation, in terms of validity and reliability, of the Swedish version of the instrument Inventory for Assessing the Process of Cultural Competence Among Healthcare Professionals-Revised (IAPCC-R). Validity tests were conducted on the response processes (N = 15), the content (N = 7), and the internal structure of the instrument (N = 334). Reliability (alpha = .65 for the total scale, varying between -.01 and .65 for the different subscales) was evaluated in terms of internal consistency. Results indicated weak validity and reliability, though it is difficult to conclude whether this is related to adaptation issues or to the original construction. The testing of the response process identified problems in respondents' conceptualization of cultural competence. The test of the content identified weak correspondence between the items and the underlying model. In addition, a confirmatory factor analysis did not confirm the proposed structure of the instrument. This study concludes that the instrument is not valid and reliable for use with a Swedish population of practicing nurses or nursing students.

  2. Development of evaluation technique of GMAW welding quality based on statistical analysis

    NASA Astrophysics Data System (ADS)

    Feng, Shengqiang; Terasaki, Hidenri; Komizo, Yuichi; Hu, Shengsun; Chen, Donggao; Ma, Zhihua

    2014-11-01

    Nondestructive techniques for appraising gas metal arc welding (GMAW) faults play a very important role in on-line quality controllability and prediction of the GMAW process. On-line welding quality controllability and prediction have several disadvantages, such as high cost, low efficiency, complication, and strong sensitivity to the environment. An enhanced, efficient evaluation technique for evaluating welding faults based on Mahalanobis distance (MD) and the normal distribution is presented. In addition, a new piece of equipment, designated the weld quality tester (WQT), is developed based on the proposed evaluation technique. MD is superior to other multidimensional distances such as Euclidean distance because the covariance matrix used for calculating MD takes into account correlations in the data and scaling. The values of MD obtained from welding current and arc voltage are assumed to follow a normal distribution. The normal distribution has two parameters: the mean µ and standard deviation σ of the data. In the proposed evaluation technique used by the WQT, values of MD located in the range from zero to µ+3σ are regarded as "good". Two experiments, which involve changing the flow of shielding gas and smearing paint on the surface of the substrate, are conducted in order to verify the sensitivity of the proposed evaluation technique and the feasibility of using the WQT. The experimental results demonstrate the usefulness of the WQT for evaluating welding quality. The proposed technique can be applied to implement on-line welding quality controllability and prediction, which is of great importance for designing novel equipment for weld quality detection.
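
    As a rough illustration of the evaluation rule described above, the NumPy sketch below fits Mahalanobis-distance statistics on a set of reference (good-weld) current and voltage samples and flags new samples whose distance exceeds the µ+3σ threshold. The synthetic numbers and feature choice are assumptions for illustration only, not the WQT implementation.

```python
import numpy as np

def fit_reference(current, voltage):
    """Fit reference (good-weld) statistics: feature mean, inverse covariance,
    and the mean/std of the Mahalanobis distances themselves."""
    X = np.column_stack([current, voltage])            # samples x 2 features
    mu = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    diff = X - mu
    d = np.sqrt(np.einsum("ij,jk,ik->i", diff, cov_inv, diff))
    return {"mu": mu, "cov_inv": cov_inv, "d_mean": d.mean(), "d_std": d.std()}

def classify(current, voltage, ref):
    """Label each sample 'good' if its Mahalanobis distance is below the
    mu + 3*sigma threshold estimated from the reference welds."""
    diff = np.column_stack([current, voltage]) - ref["mu"]
    d = np.sqrt(np.einsum("ij,jk,ik->i", diff, ref["cov_inv"], diff))
    threshold = ref["d_mean"] + 3.0 * ref["d_std"]
    return np.where(d <= threshold, "good", "fault"), d

# Synthetic illustration (values are made up, not real GMAW data)
rng = np.random.default_rng(0)
ref = fit_reference(rng.normal(200, 5, 1000), rng.normal(24, 0.5, 1000))
labels, dist = classify(np.array([201.0, 240.0]), np.array([24.1, 30.0]), ref)
print(labels)   # expected: ['good' 'fault']
```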

  3. Quad-phased data mining modeling for dementia diagnosis.

    PubMed

    Bang, Sunjoo; Son, Sangjoon; Roh, Hyunwoong; Lee, Jihye; Bae, Sungyun; Lee, Kyungwon; Hong, Changhyung; Shin, Hyunjung

    2017-05-18

    The number of people with dementia is increasing along with the worldwide ageing of the population. Therefore, there is a variety of research aimed at improving the dementia diagnosis process in the field of computer-aided diagnosis (CAD) technology. The most significant issue is that the evaluation processes performed by physicians, which are based on patients' medical information and questionnaires completed by their guardians, are time consuming, subjective, and prone to error. This problem can be addressed by an overall data mining model that supports intuitive decision-making by clinicians. Therefore, in this paper we propose a quad-phased data mining model consisting of 4 modules. In the Proposer Module, significant diagnostic criteria that are effective for diagnosis are selected. Then, in the Predictor Module, a model is constructed to predict and diagnose dementia based on a machine learning algorithm. To help clinical physicians better understand the results of the predictive model, the Descriptor Module interprets the causes of the diagnoses by profiling patient groups. Lastly, the Visualization Module provides visualization to effectively explore the characteristics of patient groups. The proposed model is applied to the CREDOS study, which contains clinical data collected from 37 university-affiliated hospitals in the Republic of Korea from 2005 to 2013. This research provides an intelligent system enabling intuitive collaboration between the CAD system and physicians. In addition, the improved evaluation process can effectively reduce the time and cost burden for clinicians and patients.
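
    A minimal sketch of the first two stages (criterion selection and prediction), assuming synthetic data and scikit-learn components in place of the study's actual clinical variables and modelling choices:

```python
# Sketch of the "Proposer" (criterion selection) and "Predictor" (diagnosis
# model) stages; the feature data and labels are synthetic, not CREDOS data.
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 20))            # 20 candidate diagnostic criteria
y = (X[:, 3] + X[:, 7] > 0).astype(int)   # synthetic dementia label

# Proposer: keep the criteria most informative about the diagnosis
selector = SelectKBest(mutual_info_classif, k=5).fit(X, y)
X_sel = selector.transform(X)

# Predictor: train and evaluate a classifier on the selected criteria
X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, model.predict(X_te)))
```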

  4. Waveguide-type optical circuits for recognition of optical 8QAM-coded label

    NASA Astrophysics Data System (ADS)

    Surenkhorol, Tumendemberel; Kishikawa, Hiroki; Goto, Nobuo; Gonchigsumlaa, Khishigjargal

    2017-10-01

    Optical signal processing is expected to be applied in network nodes. In photonic routers, label recognition is one of the important functions. We have studied different kinds of label recognition methods so far for on-off keying, binary phase-shift keying, quadrature phase-shift keying, and 16 quadrature amplitude modulation-coded labels. We propose a method based on waveguide circuits to recognize an optical eight quadrature amplitude modulation (8QAM)-coded label by simple passive optical signal processing. The recognition performance of the proposed method is theoretically analyzed and numerically simulated by the finite-difference beam propagation method. The noise tolerance is discussed, and the bit-error rate against optical signal-to-noise ratio is evaluated. The scalability of the proposed method is also discussed theoretically for two-symbol length 8QAM-coded labels.

  5. A Generalized Precharging Strategy for Soft Startup Process of the Modular Multilevel Converter-Based HVDC Systems

    DOE PAGES

    Zhang, Lei; Qin, Jiangchao; Wu, Xiajie; ...

    2017-01-01

    The modular multilevel converter (MMC) has become one of the most promising converter technologies for medium/high-power applications, specifically for high-voltage direct current (HVDC) transmission systems. One of the technical challenges associated with the operation and control of the MMC-based system is to precharge the submodule (SM) capacitors to their nominal voltage during the startup process. In this paper, considering various SM circuits, a generalized precharging strategy is proposed for MMC-based systems, which can implement soft startup from the dc or ac side. Furthermore, the proposed precharging strategy is applicable to various SM circuits and MMC configurations. The proposed startup strategy does not require extra measurements and/or auxiliary power supplies. The charging current is controlled by adjusting the changing rate of the number of blocked and bypassed SM capacitors. Based on the proposed startup strategy, the startup processes of MMC/MMC-HVDC systems based on various SM circuits are analyzed and a generalized startup procedure for various MMC-HVDC systems is proposed. In addition, the uncontrollable steady-state SM capacitor voltages of various MMC-based systems are analyzed and determined, which is potentially useful in SM design. The performance of the proposed strategy for various MMC-HVDC systems is evaluated based on time-domain simulation studies in the PSCAD/EMTDC software environment and experimental results based on a scaled-down prototype.

  6. A Generalized Precharging Strategy for Soft Startup Process of the Modular Multilevel Converter-Based HVDC Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Lei; Qin, Jiangchao; Wu, Xiajie

    The modular multilevel converter (MMC) has become one of the most promising converter technologies for medium/high-power applications, specifically for high-voltage direct current (HVDC) transmission systems. One of the technical challenges associated with the operation and control of the MMC-based system is to precharge the submodule (SM) capacitors to their nominal voltage during the startup process. In this paper, considering various SM circuits, a generalized precharging strategy is proposed for MMC-based systems, which can implement soft startup from the dc or ac side. Furthermore, the proposed precharging strategy is applicable to various SM circuits and MMC configurations. The proposed startup strategy does not require extra measurements and/or auxiliary power supplies. The charging current is controlled by adjusting the changing rate of the number of blocked and bypassed SM capacitors. Based on the proposed startup strategy, the startup processes of MMC/MMC-HVDC systems based on various SM circuits are analyzed and a generalized startup procedure for various MMC-HVDC systems is proposed. In addition, the uncontrollable steady-state SM capacitor voltages of various MMC-based systems are analyzed and determined, which is potentially useful in SM design. The performance of the proposed strategy for various MMC-HVDC systems is evaluated based on time-domain simulation studies in the PSCAD/EMTDC software environment and experimental results based on a scaled-down prototype.

  7. Yogic exercises and health--a psycho-neuro immunological approach.

    PubMed

    Kulkarni, D D; Bera, T K

    2009-01-01

    Relaxation potential of yogic exercises seems to play a vital role in establishing psycho-physical health in reversing the psycho-immunology of emotions under stress based on breath and body awareness. However, mechanism of yogic exercises for restoring health and fitness components operating through psycho-neuro-immunological pathways is unknown. Therefore, a hybrid model of human information processing-psycho-neuroendocrine (HIP-PNE) network has been proposed to reveal the importance of yogic information processing. This study focuses on two major pathways of information processing involving cortical and hypothalamo-pituitary-adrenal axis (HPA) interactions with a deep reach molecular action on cellular, neuro-humoral and immune system in reversing stress mediated diseases. Further, the proposed HIP-PNE model has ample of experimental potential for objective evaluation of yogic view of health and fitness.

  8. Accuracy-energy configurable sensor processor and IoT device for long-term activity monitoring in rare-event sensing applications.

    PubMed

    Park, Daejin; Cho, Jeonghun

    2014-01-01

    A specially designed sensor processor, used as the main processor in an IoT (internet-of-things) device for rare-event sensing applications, is proposed. The IoT device including the proposed sensor processor performs event-driven sensor data processing based on accuracy-energy configurable event quantization at the architectural level. The received sensor signal is converted into a sequence of atomic events, which is extracted by the signal-to-atomic-event generator (AEG). Using an event signal processing unit (EPU) as an accelerator, the extracted atomic events are analyzed to build the final event. Instead of transmitting the sampled raw data via the internet, the proposed method delays communication with a host system until a semantic pattern of the signal is identified as a final event. The proposed processor is implemented on a single chip, which is tightly coupled at the bus connection level with a microcontroller using a 0.18 μm CMOS embedded-flash process. For the experimental results, we evaluated the proposed sensor processor by using an IR (infrared)-based signal reflection and sensor signal acquisition system. We successfully demonstrated that the expected power consumption is in the range of 20% to 50% of the baseline result when a 10% accuracy error is allowed.
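
    The idea of deferring communication until a semantic event is recognized can be sketched in software as follows; the thresholds, the atomic-event alphabet, and the final-event pattern are illustrative assumptions, not the chip's actual configuration.

```python
# Software analogue of event quantization: raw samples are reduced to atomic
# events (threshold crossings), and a "transmission" is triggered only when a
# final event pattern appears. Thresholds and the pattern are assumptions.
from typing import List

def to_atomic_events(samples: List[float], high: float, low: float) -> List[str]:
    events, state = [], "LOW"
    for s in samples:
        if state == "LOW" and s > high:
            events.append("RISE")
            state = "HIGH"
        elif state == "HIGH" and s < low:
            events.append("FALL")
            state = "LOW"
    return events

def detect_final_event(events: List[str], pattern=("RISE", "FALL", "RISE")) -> bool:
    # A final event is declared when the atomic-event sequence ends with the
    # expected pattern; only then would the host be notified.
    return tuple(events[-len(pattern):]) == pattern

signal = [0.1, 0.2, 1.5, 1.4, 0.1, 0.0, 1.6, 1.7]
atoms = to_atomic_events(signal, high=1.0, low=0.3)
print(atoms, "-> notify host" if detect_final_event(atoms) else "-> stay silent")
```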

  9. A New FPGA Architecture of FAST and BRIEF Algorithm for On-Board Corner Detection and Matching.

    PubMed

    Huang, Jingjin; Zhou, Guoqing; Zhou, Xiang; Zhang, Rongting

    2018-03-28

    Although some researchers have proposed Field Programmable Gate Array (FPGA) architectures for the Features from Accelerated Segment Test (FAST) and Binary Robust Independent Elementary Features (BRIEF) algorithms, these traditional architectures do not consider image data storage, with the result that no image data can be reused by follow-up algorithms. This paper proposes a new FPGA architecture that considers the reuse of sub-image data. In the proposed architecture, a remainder-based method is first designed for reading the sub-image, and a FAST detector and a BRIEF descriptor are combined for corner detection and matching. Six pairs of satellite images with different textures, located in the Mentougou district, Beijing, China, are used to evaluate the performance of the proposed architecture. The ModelSim simulation results show that: (i) the proposed architecture is effective for sub-image reading from DDR3 at minimum cost; (ii) the FPGA implementation is correct and efficient for corner detection and matching; for example, the average matching rates for natural and artificial areas are approximately 67% and 83%, respectively, which are close to those obtained on a PC, and the FPGA processing speed is approximately 31 and 2.5 times faster than PC processing and GPU processing, respectively.
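
    For reference, the corner detection and matching pipeline that the architecture implements in hardware looks roughly like this in software, assuming OpenCV with the contrib modules (cv2.xfeatures2d) is installed and that the two image files named below exist; this is not the paper's FPGA design.

```python
# FAST corner detection + BRIEF description + Hamming-distance matching,
# as a software reference for the on-board pipeline. File names are
# hypothetical placeholders.
import cv2

img1 = cv2.imread("tile_a.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("tile_b.png", cv2.IMREAD_GRAYSCALE)

fast = cv2.FastFeatureDetector_create(threshold=25)
brief = cv2.xfeatures2d.BriefDescriptorExtractor_create()

kp1 = fast.detect(img1, None)
kp2 = fast.detect(img2, None)
kp1, des1 = brief.compute(img1, kp1)       # binary descriptors per corner
kp2, des2 = brief.compute(img2, kp2)

# Brute-force matching on Hamming distance, as in the on-board matcher
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = matcher.match(des1, des2)
print(f"{len(matches)} matches from {len(kp1)} and {len(kp2)} corners")
```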

  10. An image compression survey and algorithm switching based on scene activity

    NASA Technical Reports Server (NTRS)

    Hart, M. M.

    1985-01-01

    Data compression techniques are presented. A description of these techniques is provided along with a performance evaluation. The complexity of the hardware resulting from their implementation is also addressed. The compression effect on channel distortion and the applicability of these algorithms to real-time processing are presented. Also included is a proposed new direction for an adaptive compression technique for real-time processing.

  11. An expert panel process to evaluate habitat restoration actions in the Columbia River estuary.

    PubMed

    Krueger, Kirk L; Bottom, Daniel L; Hood, W Gregory; Johnson, Gary E; Jones, Kim K; Thom, Ronald M

    2017-03-01

    We describe a process for evaluating proposed ecosystem restoration projects intended to improve survival of juvenile salmon in the Columbia River estuary (CRE). Changes in the Columbia River basin (northwestern USA), including hydropower development, have contributed to the listing of 13 salmon stocks as endangered or threatened under the U.S. Endangered Species Act. Habitat restoration in the CRE, from Bonneville Dam to the ocean, is part of a basin-wide, legally mandated effort to mitigate federal hydropower impacts on salmon survival. An Expert Regional Technical Group (ERTG) was established in 2009 to improve and implement a process for assessing and assigning "survival benefit units" (SBUs) to restoration actions. The SBU concept assumes site-specific restoration projects will increase juvenile salmon survival during migration through the 234 km CRE. Assigned SBUs are used to inform selection of restoration projects and gauge mitigation progress. The ERTG standardized the SBU assessment process to improve its scientific integrity, repeatability, and transparency. In lieu of experimental data to quantify the survival benefits of individual restoration actions, the ERTG adopted a conceptual model composed of three assessment criteria (certainty of success, fish opportunity improvements, and habitat capacity improvements) to evaluate restoration projects. Based on these criteria, an algorithm assigned SBUs by integrating potential fish density as an indicator of salmon performance. Between 2009 and 2014, the ERTG assessed SBUs for 55 proposed projects involving a total of 181 restoration actions located across 8 of 9 reaches of the CRE, largely relying on information provided in a project template based on the conceptual model, presentations, discussions with project sponsors, and site visits. Most projects restored tidal inundation to emergent wetlands, improved riparian function, and removed invasive vegetation. The scientific relationship of geomorphic and salmonid responses to restoration actions remains the foremost concern. Although not designed to establish a broad strategy for estuary restoration, the scoring process has adaptively influenced the types, designs, and locations of restoration proposals. The ERTG process may be a useful model for others who have unique ecosystem restoration goals and share some of our common challenges. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. A note on evaluating VAN earthquake predictions

    NASA Astrophysics Data System (ADS)

    Tselentis, G.-Akis; Melis, Nicos S.

    The evaluation of the success level of an earthquake prediction method should not be based on approaches that apply generalized strict statistical laws and avoid the specific nature of the earthquake phenomenon. Fault rupture processes cannot be compared to gambling processes. The outcome of the present note is that even an ideal earthquake prediction method is still shown to be a matter of a “chancy” association between precursors and earthquakes if we apply the same procedure proposed by Mulargia and Gasperini [1992] in evaluating VAN earthquake predictions. Each individual VAN prediction has to be evaluated separately, taking always into account the specific circumstances and information available. The success level of epicenter prediction should depend on the earthquake magnitude, and magnitude and time predictions may depend on earthquake clustering and the tectonic regime respectively.

  13. A health impact assessment of proposed public transportation service cuts and fare increases in Boston, Massachusetts (U.S.A.).

    PubMed

    James, Peter; Ito, Kate; Buonocore, Jonathan J; Levy, Jonathan I; Arcaya, Mariana C

    2014-08-07

    Transportation decisions have health consequences that are often not incorporated into policy-making processes. Health Impact Assessment (HIA) is a process that can be used to evaluate health effects of transportation policy. We present a rapid HIA, conducted over eight weeks, evaluating health and economic effects of proposed fare increases and service cuts to Boston, Massachusetts' public transportation system. We used transportation modeling in concert with tools allowing for quantification and monetization of multiple pathways. We estimated health and economic costs of proposed public transportation system changes to be hundreds of millions of dollars per year, exceeding the budget gap the public transportation authority was required to close. Significant health pathways included crashes, air pollution, and physical activity. The HIA enabled stakeholders to advocate for more modest fare increases and service cuts, which were eventually adopted by decision makers. This HIA was among the first to quantify and monetize multiple pathways linking transportation decisions with health and economic outcomes, using approaches that could be applied in different settings. Including health costs in transportation decisions can lead to policy choices with both economic and public health benefits.

  14. A Health Impact Assessment of Proposed Public Transportation Service Cuts and Fare Increases in Boston, Massachusetts (U.S.A.)

    PubMed Central

    James, Peter; Ito, Kate; Buonocore, Jonathan J.; Levy, Jonathan I.; Arcaya, Mariana C.

    2014-01-01

    Transportation decisions have health consequences that are often not incorporated into policy-making processes. Health Impact Assessment (HIA) is a process that can be used to evaluate health effects of transportation policy. We present a rapid HIA, conducted over eight weeks, evaluating health and economic effects of proposed fare increases and service cuts to Boston, Massachusetts’ public transportation system. We used transportation modeling in concert with tools allowing for quantification and monetization of multiple pathways. We estimated health and economic costs of proposed public transportation system changes to be hundreds of millions of dollars per year, exceeding the budget gap the public transportation authority was required to close. Significant health pathways included crashes, air pollution, and physical activity. The HIA enabled stakeholders to advocate for more modest fare increases and service cuts, which were eventually adopted by decision makers. This HIA was among the first to quantify and monetize multiple pathways linking transportation decisions with health and economic outcomes, using approaches that could be applied in different settings. Including health costs in transportation decisions can lead to policy choices with both economic and public health benefits. PMID:25105550

  15. An adaptive Gaussian process-based iterative ensemble smoother for data assimilation

    NASA Astrophysics Data System (ADS)

    Ju, Lei; Zhang, Jiangjiang; Meng, Long; Wu, Laosheng; Zeng, Lingzao

    2018-05-01

    Accurate characterization of subsurface hydraulic conductivity is vital for modeling of subsurface flow and transport. The iterative ensemble smoother (IES) has been proposed to estimate the heterogeneous parameter field. As a Monte Carlo-based method, IES requires a relatively large ensemble size to guarantee its performance. To improve the computational efficiency, we propose an adaptive Gaussian process (GP)-based iterative ensemble smoother (GPIES) in this study. At each iteration, the GP surrogate is adaptively refined by adding a few new base points chosen from the updated parameter realizations. Then the sensitivity information between model parameters and measurements is calculated from a large number of realizations generated by the GP surrogate with virtually no computational cost. Since the original model evaluations are only required for base points, whose number is much smaller than the ensemble size, the computational cost is significantly reduced. The applicability of GPIES in estimating heterogeneous conductivity is evaluated on saturated and unsaturated flow problems. Without sacrificing estimation accuracy, GPIES achieves about an order of magnitude of speed-up compared with the standard IES. Although subsurface flow problems are considered in this study, the proposed method can be equally applied to other hydrological models.
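
    The core idea, replacing most forward-model runs with an adaptively refined GP surrogate, can be sketched as follows; the one-dimensional toy forward model and the refinement rule (adding the point of largest predictive standard deviation) are simplifying assumptions, not the GPIES algorithm itself.

```python
# Adaptively refined Gaussian-process surrogate standing in for expensive
# forward-model runs; the "forward model" and refinement rule are toy choices.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def forward_model(k):                 # stand-in for an expensive flow model
    return np.sin(3.0 * k) + 0.5 * k

rng = np.random.default_rng(1)
base = rng.uniform(0, 2, size=4)      # initial base points (expensive runs)
for _ in range(3):                    # adaptive refinement iterations
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-6)
    gp.fit(base.reshape(-1, 1), forward_model(base))
    ensemble = rng.uniform(0, 2, size=1000)            # cheap surrogate calls
    mean, std = gp.predict(ensemble.reshape(-1, 1), return_std=True)
    base = np.append(base, ensemble[np.argmax(std)])   # add a new base point

print(f"{base.size} expensive runs supported {3 * 1000} surrogate evaluations")
```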

  16. Evaluation of high-efficiency gas-liquid contactors for natural gas processing. Second semiannual technical progress report, April 1, 1993--September 30, 1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1993-12-01

    The objective of this proposed program is to evaluate the potential of rotating gas-liquid contactors for natural gas processing by expanding the currently available database. This expansion will focus on applying this technology to environments representative of those typically encountered in natural gas processing plants. Operational and reliability concerns will be addressed while generating pertinent engineering data relating to the mass-transfer process. Work to be performed during this reporting period includes: completing all negotiations and processing of agreements; completing assembly, modifications, and shakedown, and conducting fluid dynamic studies using the plastic rotary contactor unit; confirming the project test matrix; and locating and transporting an amine plant and a dehydration plant. Accomplishments for this period are presented.

  17. Mechanoluminescence assisting agile optimization of processing design on surgical epiphysis plates

    NASA Astrophysics Data System (ADS)

    Terasaki, Nao; Toyomasu, Takashi; Sonohata, Motoki

    2018-04-01

    We propose a novel method for agile optimization of processing design based on the visualization of mechanoluminescence. To demonstrate the effect of the new method, epiphysis plates were processed to form dots (diameters: 1 and 1.5 mm) and the mechanical information was evaluated. As a result, the appearance of new strain concentrations was successfully visualized on the basis of mechanoluminescence, and the complex mechanical information was intuitively understood by the surgeons acting as designers. In addition, mechanoluminescence analysis clarified that small dots do not have serious mechanical effects such as strength reduction. Such detailed mechanical information evaluated on the basis of mechanoluminescence was successfully applied to judging the validity of the processing design. This clearly proves the effectiveness of the new methodology using mechanoluminescence for assisting agile optimization of processing design.

  18. Hybrid surrogate-model-based multi-fidelity efficient global optimization applied to helicopter blade design

    NASA Astrophysics Data System (ADS)

    Ariyarit, Atthaphon; Sugiura, Masahiko; Tanabe, Yasutada; Kanazaki, Masahiro

    2018-06-01

    A multi-fidelity optimization technique using an efficient global optimization process with a hybrid surrogate model is investigated for solving real-world design problems. The model constructs the local deviation using the kriging method and the global model using a radial basis function. The expected improvement is computed to decide on additional samples that can improve the model. The approach was first investigated by solving mathematical test problems. The results were compared with optimization results from an ordinary kriging method and a co-kriging method, and the proposed method produced the best solution. The proposed method was also applied to aerodynamic design optimization of helicopter blades to obtain the maximum blade efficiency. The optimal shape obtained by the proposed method achieved performance almost equivalent to that obtained using single-fidelity optimization based solely on high-fidelity evaluations. Comparing all three methods, the proposed method required the lowest total number of high-fidelity evaluation runs to obtain a converged solution.
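
    A minimal sketch of the expected-improvement loop at the heart of efficient global optimization, with an ordinary GP surrogate standing in for the paper's hybrid kriging/RBF model and a made-up one-dimensional objective:

```python
# Expected-improvement (EI) acquisition in an EGO-style loop; the surrogate
# and objective below are illustrative, not the helicopter-blade problem.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expected_improvement(x, gp, f_best):
    mu, sigma = gp.predict(x.reshape(-1, 1), return_std=True)
    sigma = np.maximum(sigma, 1e-12)
    z = (f_best - mu) / sigma                 # minimization convention
    return (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

objective = lambda x: (x - 1.3) ** 2 + 0.1 * np.sin(10 * x)
X = np.array([0.0, 0.5, 1.0, 2.0])            # initial (expensive) samples
for _ in range(10):                           # fit, maximize EI, evaluate
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X.reshape(-1, 1), objective(X))
    cand = np.linspace(0.0, 2.0, 400)
    ei = expected_improvement(cand, gp, objective(X).min())
    X = np.append(X, cand[np.argmax(ei)])

print("best x found:", X[np.argmin(objective(X))])
```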

  19. A risk assessment methodology using intuitionistic fuzzy set in FMEA

    NASA Astrophysics Data System (ADS)

    Chang, Kuei-Hu; Cheng, Ching-Hsue

    2010-12-01

    Most current risk assessment methods use the risk priority number (RPN) value to evaluate the risk of failure. However, conventional RPN methodology has been criticised for five main shortcomings: (1) the assumption that the RPN elements are equally weighted leads to over-simplification; (2) the RPN scale itself has some non-intuitive statistical properties; (3) the RPN elements produce many duplicate numbers; (4) the RPN is derived from only three factors, mainly in terms of safety; and (5) the conventional RPN method does not consider indirect relations between components. To address the above issues, an efficient and comprehensive algorithm to evaluate the risk of failure is needed. This article proposes an innovative approach that integrates the intuitionistic fuzzy set (IFS) and the decision-making trial and evaluation laboratory (DEMATEL) approach for risk assessment. The proposed approach resolves some of the shortcomings of the conventional RPN method. A case study, which assesses the risk of a 0.15 µm DRAM etching process, is used to demonstrate the effectiveness of the proposed approach. Finally, the result of the proposed method is compared with those of existing risk assessment approaches.
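
    For context, the conventional RPN criticized here is simply the product of severity, occurrence, and detection ratings; the sketch below contrasts it with a weighted variant to illustrate the equal-weighting problem. The failure modes, ratings, and weights are hypothetical, and the IFS/DEMATEL method itself is not reproduced.

```python
# Conventional RPN = S * O * D versus a weighted variant; all numbers are
# hypothetical illustrations, not results from the 0.15 µm DRAM case study.
failure_modes = {
    #                  severity, occurrence, detection (1-10 scales)
    "over-etch":        (8, 3, 4),
    "particle defect":  (5, 6, 5),
    "mask misalign":    (7, 4, 2),
}

def rpn(s, o, d):
    return s * o * d                          # classic equally weighted product

def weighted_rpn(s, o, d, w=(0.5, 0.3, 0.2)):
    # Geometric weighting lets safety-critical severity dominate the ranking.
    return (s ** w[0]) * (o ** w[1]) * (d ** w[2])

for name, (s, o, d) in failure_modes.items():
    print(f"{name:16s} RPN={rpn(s, o, d):4d}  weighted={weighted_rpn(s, o, d):.2f}")
```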

  20. What Counts is not Falling … but Landing: Strategic Analysis: An Adapted Model for Implementation Evaluation.

    PubMed

    Brousselle, Astrid

    2004-04-01

    Implementation evaluations, also called process evaluations, involve studying the development of programmes, and identifying and understanding their strengths and weaknesses. Undertaking an implementation evaluation offers insights into evaluation objectives, but does not help the researcher develop a research strategy. During the implementation analysis of the UNAIDS drug access initiative in Chile, the strategic analysis model developed by Crozier and Friedberg was used. However, a major incompatibility was noted between the procedure put forward by Crozier and Friedberg and the specific characteristics of the programme being evaluated. In this article, an adapted strategic analysis model for programme evaluation is proposed.

  1. The role (or not) of economic evaluation at the micro level: can Bourdieu's theory provide a way forward for clinical decision-making?

    PubMed

    Lessard, Chantale; Contandriopoulos, André-Pierre; Beaulieu, Marie-Dominique

    2010-06-01

    Despite increasing interest in health economic evaluation, investigations have shown limited use by micro (clinical) level decision-makers. A considerable amount of health decisions take place daily at the point of the clinical encounter; especially in primary care. Since every decision has an opportunity cost, ignoring economic information in family physicians' (FPs) decision-making may have a broad impact on health care efficiency. Knowledge translation of economic evaluation is often based on taken-for-granted assumptions about actors' interests and interactions, neglecting much of the complexity of social reality. Health economics literature frequently assumes a rational and linear decision-making process. Clinical decision-making is in fact a complex social, dynamic, multifaceted process, involving relationships and contextual embeddedness. FPs are embedded in complex social networks that have a significant impact on skills, attitudes, knowledge, practices, and on the information being used. Because of their socially constructed nature, understanding preferences, professional culture, practices, and knowledge translation requires serious attention to social reality. There has been little exploration by health economists of whether the problem may be more fundamental and reside in a misunderstanding of the process of decision-making. There is a need to enhance our understanding of the role of economic evaluation in decision-making from a disciplinary perspective different than health economics. This paper argues for a different conceptualization of the role of economic evaluation in FPs' decision-making, and proposes Bourdieu's sociological theory as a research framework. Bourdieu's theory of practice illustrates how the context-sensitive nature of practice must be understood as a socially constituted practical knowledge. The proposed approach could substantially contribute to a more complex understanding of the role of economic evaluation in FPs' decision-making. Copyright 2010 Elsevier Ltd. All rights reserved.

  2. SOMA: A Proposed Framework for Trend Mining in Large UK Diabetic Retinopathy Temporal Databases

    NASA Astrophysics Data System (ADS)

    Somaraki, Vassiliki; Harding, Simon; Broadbent, Deborah; Coenen, Frans

    In this paper, we present SOMA, a new trend mining framework, and Aretaeus, the associated trend mining algorithm. The proposed framework is able to detect different kinds of trends within longitudinal datasets. The prototype trends are defined mathematically so that they can be mapped onto the temporal patterns. Trends are defined and generated in terms of the frequency of occurrence of pattern changes over time. To evaluate the proposed framework, the process was applied to a large collection of medical records forming part of the diabetic retinopathy screening programme at the Royal Liverpool University Hospital.

  3. Revisiting a theory of negotiation: the utility of Markiewicz (2005) proposed six principles.

    PubMed

    McDonald, Diane

    2008-08-01

    People invited to participate in an evaluation process will inevitably come from a variety of personal backgrounds and hold different views based on their own lived experience. However, evaluators are in a privileged position because they have access to information from a wide range of sources and can play an important role in helping stakeholders to hear and appreciate one another's opinions and ideas. Indeed, in some cases a difference in perspective can be utilised by an evaluator to engage key stakeholders in fruitful discussion that can add value to the evaluation outcome. In other instances the evaluator finds that the task of facilitating positive interaction between multiple stakeholders is just 'an uphill battle' and so conflict, rather than consensus, occurs as the evaluation findings emerge and are debated. As noted by Owen [(2006) PROGRAM EVALUATION: Forms and approaches (3rd ed.). St. Leonards, NSW: Allen & Unwin] and other eminent evaluators before him [Fetterman, D. M. (1996). Empowerment evaluation: An introduction to theory and practice. In D. M. Fetterman, S. J. Kaftarian, & A. Wandersman (Eds.), Empowerment evaluation: Knowledge and tools for self-assessment and accountability (pp. 3-46). Thousand Oaks, CA: Sage Publications; Patton, M. Q. (1997). Utilization-focused evaluation (3rd ed.). Thousand Oaks, CA: Sage Publications; Stake, R. A. (1983). Stakeholder influence in the evaluation of cities-in-schools. New Directions for Program Evaluation, 17, 15-30], conflict in an evaluation process is not unexpected. The challenge is for evaluators to facilitate dialogue between people who hold strongly opposing views, with the aim of helping them to achieve a common understanding of the best way forward. However, this does not imply that consensus will be reached [Guba, E. G., & Lincoln, Y. S. (1989). Fourth generation evaluation. Newbury Park, CA: Sage]. What is essential is that the evaluator assists the various stakeholders to recognise and accept their differences and be willing to move on. But the problem is that evaluators are not necessarily equipped with the technical or personal skills required for effective negotiation. In addition, the time and effort that are required to undertake this mediating role are often not sufficiently understood by those who commission a review. With such issues in mind Markiewicz, A. [(2005). A balancing act: Resolving multiple stakeholder interests in program evaluation. Evaluation Journal of Australasia, 4(1-2), 13-21] has proposed six principles upon which to build a case for negotiation to be integrated into the evaluation process. This paper critiques each of these principles in the context of an evaluation undertaken of a youth program. In doing so it challenges the view that stakeholder consensus is always possible if program improvement is to be achieved. This has led to some refinement and further extension of the proposed theory of negotiation that is seen to be instrumental to the role of an evaluator.

  4. Fingerprint pattern restoration by digital image processing techniques.

    PubMed

    Wen, Che-Yen; Yu, Chiu-Chung

    2003-09-01

    Fingerprint evidence plays an important role in solving criminal problems. However, defective (lacking information needed for completeness) or contaminated (undesirable information included) fingerprint patterns make the identifying and recognizing processes difficult. Unfortunately, this is the usual case. In the recognizing process (enhancement of patterns, or elimination of "false alarms" so that a fingerprint pattern can be searched in the Automated Fingerprint Identification System (AFIS)), chemical and physical techniques have been proposed to improve pattern legibility. In the identifying process, a fingerprint examiner can enhance contaminated (but not defective) fingerprint patterns under guidelines provided by the Scientific Working Group on Friction Ridge Analysis, Study and Technology (SWGFAST), the Scientific Working Group on Imaging Technology (SWGIT), and an AFIS working group within the National Institute of Justice. Recently, image processing techniques have been successfully applied in forensic science. For example, we have applied image enhancement methods to improve the legibility of digital images such as fingerprints and vehicle plate numbers. In this paper, we propose a novel digital image restoration technique based on the AM (amplitude modulation)-FM (frequency modulation) reaction-diffusion method to restore defective or contaminated fingerprint patterns. This method shows potential for application to fingerprint pattern enhancement in the recognizing process (but not the identifying process). Synthetic and real images are used to show the capability of the proposed method. The results of enhancing fingerprint patterns by the manual process and by our method are evaluated and compared.

  5. Automatic small target detection in synthetic infrared images

    NASA Astrophysics Data System (ADS)

    Yardımcı, Ozan; Ulusoy, İlkay

    2017-05-01

    Automatic detection of targets from far distances is a very challenging problem. Background clutter and small target size are the main difficulties that must be overcome to reach high detection performance at a low computational load. The pre-processing, detection and post-processing approaches strongly affect the final results. In this study, first of all, various methods in the literature were evaluated separately for each of these stages using simulated test scenarios. Then, a full detection system was constructed from the available solutions that gave the best detection performance. However, although a precision rate of 100% was reached, the recall values stayed low, around 25-45%. Finally, a post-processing method was proposed which increased the recall value while keeping the precision at 100%. The proposed post-processing method, which is based on local operations, increased the recall value to 65-95% in all test scenarios.

  6. Revision and extension of Eco-LCA metrics for sustainability assessment of the energy and chemical processes.

    PubMed

    Yang, Shiying; Yang, Siyu; Kraslawski, Andrzej; Qian, Yu

    2013-12-17

    Ecologically based life cycle assessment (Eco-LCA) is an appealing approach for the evaluation of resources utilization and environmental impacts of the process industries from an ecological scale. However, the aggregated metrics of Eco-LCA suffer from some drawbacks: the environmental impact metric has limited applicability; the resource utilization metric ignores indirect consumption; the renewability metric fails to address the quantitative distinction of resources availability; the productivity metric seems self-contradictory. In this paper, the existing Eco-LCA metrics are revised and extended for sustainability assessment of the energy and chemical processes. A new Eco-LCA metrics system is proposed, including four independent dimensions: environmental impact, resource utilization, resource availability, and economic effectiveness. An illustrative example of comparing assessment between a gas boiler and a solar boiler process provides insight into the features of the proposed approach.

  7. Breast histopathology image segmentation using spatio-colour-texture based graph partition method.

    PubMed

    Belsare, A D; Mushrif, M M; Pangarkar, M A; Meshram, N

    2016-06-01

    This paper proposes a novel integrated spatio-colour-texture based graph partitioning method for segmentation of nuclear arrangements in tubules with a lumen, or in solid islands without a lumen, from digitized Hematoxylin-Eosin stained breast histology images, in order to automate histology breast image analysis and assist pathologists. We propose a new similarity-based superpixel generation method and integrate it with texton representation to form the spatio-colour-texture map of the breast histology image. A new weighted-distance-based similarity measure is then used for graph generation, and the final segmentation is obtained using the normalized cuts method. The extensive experiments carried out show that the proposed algorithm can segment nuclear arrangements in normal as well as malignant ducts in breast histology tissue images. For evaluation of the proposed method, a ground-truth image database of 100 malignant and nonmalignant breast histology images was created with the help of two expert pathologists, and quantitative evaluation of the proposed breast histology image segmentation was performed. It shows that the proposed method outperforms other methods. © 2015 The Authors Journal of Microscopy © 2015 Royal Microscopical Society.

  8. A spatial scan statistic for multiple clusters.

    PubMed

    Li, Xiao-Zhou; Wang, Jin-Feng; Yang, Wei-Zhong; Li, Zhong-Jie; Lai, Sheng-Jie

    2011-10-01

    Spatial scan statistics are commonly used for geographical disease surveillance and cluster detection. When multiple clusters coexist in the study area, they become difficult to detect because of their shadowing effect on each other. The recently proposed sequential method showed better power for detecting the second, weaker cluster, but did not improve the ability to detect the first, stronger cluster, which is more important than the second one. We propose a new extension of the spatial scan statistic which can be used to detect multiple clusters. By constructing two or more clusters in the alternative hypothesis, our proposed method accounts for other coexisting clusters in the detection and evaluation process. The performance of the proposed method is compared to the sequential method through an intensive simulation study, in which our proposed method shows better power in terms of both rejecting the null hypothesis and accurately detecting the coexisting clusters. In a real study of hand-foot-mouth disease data in Pingdu city, a true cluster town is successfully detected by our proposed method; it could not be evaluated as statistically significant by the standard method due to another cluster's shadowing effect. Copyright © 2011 Elsevier Inc. All rights reserved.
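
    The single-cluster Poisson scan statistic that the proposed multi-cluster extension builds on can be sketched as below; the regions, populations, and case counts are synthetic, and the multi-cluster alternative hypothesis itself is not implemented here.

```python
# Kulldorff-style Poisson scan over circular zones: the single-cluster
# statistic that the multi-cluster extension generalizes. All data are synthetic.
import numpy as np

def log_likelihood_ratio(c_in, e_in, c_tot):
    """Poisson log likelihood ratio for a candidate zone (elevated risk only)."""
    if c_in <= e_in:
        return 0.0
    c_out, e_out = c_tot - c_in, c_tot - e_in
    return c_in * np.log(c_in / e_in) + c_out * np.log(c_out / e_out)

rng = np.random.default_rng(3)
xy = rng.uniform(0, 10, size=(60, 2))          # region centroids
pop = rng.integers(500, 2000, size=60)         # populations at risk
cases = rng.poisson(0.01 * pop)                # background disease counts
cases[:5] += 15                                # plant one outbreak cluster

c_tot = cases.sum()
expected = pop / pop.sum() * c_tot             # expected counts under the null

best_llr, best_zone = -1.0, None
for i in range(len(xy)):                       # scan circles of several radii
    for radius in (1.0, 2.0, 3.0):
        in_zone = np.linalg.norm(xy - xy[i], axis=1) <= radius
        llr = log_likelihood_ratio(cases[in_zone].sum(),
                                   expected[in_zone].sum(), c_tot)
        if llr > best_llr:
            best_llr, best_zone = llr, (i, radius)

print(f"most likely cluster: centre region {best_zone[0]}, radius {best_zone[1]}")
```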

  9. A quality-refinement process for medical imaging applications.

    PubMed

    Neuhaus, J; Maleike, D; Nolden, M; Kenngott, H-G; Meinzer, H-P; Wolf, I

    2009-01-01

    To introduce and evaluate a process for refinement of software quality that is suitable for research groups. In order to avoid constraining researchers too much, the quality improvement process has to be designed carefully. The scope of this paper is to present and evaluate a process to advance quality aspects of existing research prototypes in order to make them ready for initial clinical studies. The proposed process is tailored for research environments and is therefore more lightweight than traditional quality management processes. It focuses on quality criteria that are important at the given stage of the software life cycle, and the use of tools that automate aspects of the process is emphasized. To evaluate the additional effort that comes along with the process, it was applied as an example to eight prototypical software modules for medical image processing. The introduced process has been applied to improve the quality of all prototypes so that they could be successfully used in clinical studies. The quality refinement required an average of 13 person-days of additional effort per project. Overall, 107 bugs were found and resolved by applying the process. Careful selection of quality criteria and the use of automated process tools lead to a lightweight quality refinement process suitable for scientific research groups that can be applied to ensure a successful transfer of technical software prototypes into clinical research workflows.

  10. E-Services quality assessment framework for collaborative networks

    NASA Astrophysics Data System (ADS)

    Stegaru, Georgiana; Danila, Cristian; Sacala, Ioan Stefan; Moisescu, Mihnea; Mihai Stanescu, Aurelian

    2015-08-01

    In a globalised networked economy, collaborative networks (CNs) are formed to take advantage of new business opportunities. Collaboration involves shared resources and capabilities, such as e-Services that can be dynamically composed to automate CN participants' business processes. Quality is essential for the success of business process automation. Current approaches mostly focus on quality of service (QoS)-based service selection and ranking algorithms, overlooking the process of service composition which requires interoperable, adaptable and secure e-Services to ensure seamless collaboration, data confidentiality and integrity. Lack of assessment of these quality attributes can result in e-Service composition failure. The quality of e-Service composition relies on the quality of each e-Service and on the quality of the composition process. Therefore, there is the need for a framework that addresses quality from both views: product and process. We propose a quality of e-Service composition (QoESC) framework for quality assessment of e-Service composition for CNs which comprises of a quality model for e-Service evaluation and guidelines for quality of e-Service composition process. We implemented a prototype considering a simplified telemedicine use case which involves a CN in e-Healthcare domain. To validate the proposed quality-driven framework, we analysed service composition reliability with and without using the proposed framework.

  11. Implications of the Institute of Medicine Report: Evaluation of Biomarkers and Surrogate Endpoints in Chronic Disease.

    PubMed

    Wagner, J A; Ball, J R

    2015-07-01

    The Institute of Medicine (IOM) released a groundbreaking 2010 report, Evaluation of Biomarkers and Surrogate Endpoints in Chronic Disease. Key recommendations included a harmonized scientific process and a general framework for biomarker evaluation with three interrelated steps: (1) Analytical validation -- is the biomarker measurement accurate? (2) Qualification -- is the biomarker associated with the clinical endpoint of concern? (3) Utilization -- what is the specific context of the proposed use? © 2015 American Society for Clinical Pharmacology and Therapeutics.

  12. Tailored program evaluation: Past, present, future.

    PubMed

    Suggs, L Suzanne; Cowdery, Joan E; Carroll, Jennifer B

    2006-11-01

    This paper discusses measurement issues related to the evaluation of computer-tailored health behavior change programs. As the first generation of commercially available tailored products is utilized in health promotion programming, programmers and researchers are becoming aware of the unique challenges that the evaluation of these programs presents. A project is presented that used an online tailored health behavior assessment (HBA) in a worksite setting. Process and outcome evaluation methods are described and include the challenges faced, and strategies proposed and implemented, for meeting them. Implications for future research in tailored program development, implementation, and evaluation are also discussed.

  13. Toward an optimisation technique for dynamically monitored environment

    NASA Astrophysics Data System (ADS)

    Shurrab, Orabi M.

    2016-10-01

    The data fusion community has introduced multiple procedures for situational assessment; this is to facilitate timely responses to emerging situations. More directly, the process refinement function of the Joint Directors of Laboratories (JDL) model is a meta-process to assess and improve the data fusion task during real-time operation. In other words, it is an optimisation technique to verify overall data fusion performance and enhance it toward the top goals of the decision-making resources. This paper discusses the theoretical concept of prioritisation, where the analyst team is required to keep up to date with a dynamically changing environment concerning different domains such as air, sea, land, space and cyberspace. Furthermore, it demonstrates an illustrative example of how various tracking activities are ranked simultaneously into a predetermined order. Specifically, it presents a modelling scheme for a case-study-based scenario, where the real-time system reports different classes of prioritised events, followed by a performance metric for evaluating the prioritisation process in the situational awareness (SWA) domain. The proposed performance metric has been designed and evaluated using an analytical approach. The modelling scheme represents the situational awareness system outputs mathematically, in the form of a list of activities. Such methods allowed the evaluation process to conduct a rigorous analysis of the prioritisation process, despite any constraints related to a domain-specific configuration. After conducting three levels of assessment over three separate scenarios, the Prioritisation Capability Score (PCS) provided an appropriate scoring scheme for different ranking instances. Indeed, from the data fusion perspective, the proposed metric assessed real-time system performance adequately, and it is capable of supporting a verification process to direct the operator's attention to any issue concerning the prioritisation capability of the situational awareness domain.

  14. Sustainability in Health care by Allocating Resources Effectively (SHARE) 6: investigating methods to identify, prioritise, implement and evaluate disinvestment projects in a local healthcare setting.

    PubMed

    Harris, Claire; Allen, Kelly; Brooke, Vanessa; Dyer, Tim; Waller, Cara; King, Richard; Ramsey, Wayne; Mortimer, Duncan

    2017-05-25

    This is the sixth in a series of papers reporting Sustainability in Health care by Allocating Resources Effectively (SHARE) in a local healthcare setting. The SHARE program was established to investigate a systematic, integrated, evidence-based approach to disinvestment within a large Australian health service. This paper describes the methods employed in undertaking pilot disinvestment projects. It draws a number of lessons regarding the strengths and weaknesses of these methods; particularly regarding the crucial first step of identifying targets for disinvestment. Literature reviews, survey, interviews, consultation and workshops were used to capture and process the relevant information. A theoretical framework was adapted for evaluation and explication of disinvestment projects, including a taxonomy for the determinants of effectiveness, process of change and outcome measures. Implementation, evaluation and costing plans were developed. Four literature reviews were completed, surveys were received from 15 external experts, 65 interviews were conducted, 18 senior decision-makers attended a data gathering workshop, 22 experts and local informants were consulted, and four decision-making workshops were undertaken. Mechanisms to identify disinvestment targets and criteria for prioritisation and decision-making were investigated. A catalogue containing 184 evidence-based opportunities for disinvestment and an algorithm to identify disinvestment projects were developed. An Expression of Interest process identified two potential disinvestment projects. Seventeen additional projects were proposed through a non-systematic nomination process. Four of the 19 proposals were selected as pilot projects but only one reached the implementation stage. Factors with potential influence on the outcomes of disinvestment projects are discussed and barriers and enablers in the pilot projects are summarised. This study provides an in-depth insight into the experience of disinvestment in one local healthcare service. To our knowledge, this is the first paper to report the process of disinvestment from identification, through prioritisation and decision-making, to implementation and evaluation, and finally explication of the processes and outcomes.

  15. Sensitivity analysis for simulating pesticide impacts on honey bee colonies

    EPA Science Inventory

    Background/Question/Methods Regulatory agencies assess risks to honey bees from pesticides through a tiered process that includes predictive modeling with empirical toxicity and chemical data of pesticides as a line of evidence. We evaluate the Varroapop colony model, proposed by...

  16. Diffusion of Defaults Among Financial Institutions

    NASA Astrophysics Data System (ADS)

    Demange, Gabrielle

    The paper proposes a simple unified model for the diffusion of defaults across financial institutions and presents some measures for evaluating the risk imposed by a bank on the system. Standard contagion processes, as used so far, may not incorporate some important features of financial contagion.
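
    A minimal threshold-contagion sketch of this kind of default diffusion, with the number of downstream failures as a crude measure of the risk a bank imposes on the system; the exposure matrix and capital buffers are invented and do not reflect the paper's model.

```python
# Threshold contagion on an interbank exposure network: a bank defaults once
# credit losses from already-defaulted counterparties exceed its capital
# buffer. Exposures and buffers are made-up illustrative values.
import numpy as np

# exposure[i, j]: amount bank i lent to bank j (loss to i if j defaults)
exposure = np.array([
    [ 0, 40, 10,  0],
    [ 5,  0, 30, 10],
    [20,  5,  0, 25],
    [ 0, 15,  5,  0],
], dtype=float)
capital = np.array([30.0, 25.0, 35.0, 12.0])   # capital buffers

def cascade(seed):
    """Propagate defaults until no bank's losses exceed its buffer."""
    defaulted = np.zeros(len(capital), dtype=bool)
    defaulted[seed] = True
    while True:
        losses = exposure[:, defaulted].sum(axis=1)
        newly = (~defaulted) & (losses >= capital)
        if not newly.any():
            return defaulted
        defaulted |= newly

# Risk a bank imposes on the system: how many defaults its failure triggers
for bank in range(len(capital)):
    print(f"bank {bank} fails -> {int(cascade(bank).sum()) - 1} further defaults")
```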

  17. 78 FR 49253 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-13

    ... and evaluating fishery management actions. Affected Public: Business or other for-profit organizations... submit to the Office of Management and Budget (OMB) for clearance the following proposal for collection...: National Oceanic and Atmospheric Administration (NOAA). Title: Processed Products Family of Forms. OMB...

  18. 23 CFR 172.5 - Methods of procurement.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... proposal solicitation (project, task, or service) process shall be by public announcement, advertisement... ENGINEERING AND DESIGN RELATED SERVICE CONTRACTS § 172.5 Methods of procurement. (a) Procurement. The procurement of Federal-aid highway contracts for engineering and design related services shall be evaluated...

  19. Evaluation of the late merge work zone traffic control strategy.

    DOT National Transportation Integrated Search

    2004-01-01

    Several alternative lane merge strategies have been proposed in recent years to process vehicles through work zone lane closures more safely and efficiently. Among these is the late merge. With the late merge, drivers are instructed to use all lanes ...

  20. Use of Flowsheet Monitoring to Perform Environmental Evaluation of Chemical Process Flowsheets

    EPA Science Inventory

    Flowsheet monitoring interfaces have been proposed to the Cape-Open Laboratories Network to enable development of applications that access to multiple parts of the flowsheet or its thermodynamic models, without interfering with the flowsheet itself. These flowsheet monitoring app...

  1. Single Wall Carbon Nanotube Alignment Mechanisms for Non-Destructive Evaluation

    NASA Technical Reports Server (NTRS)

    Hong, Seunghun

    2002-01-01

    As proposed in our original proposal, we developed an innovative method to assemble millions of single wall carbon nanotube (SWCNT)-based circuit components as fast as conventional microfabrication processes. This method is based on a surface template assembly strategy. The new method solves one of the major bottlenecks in carbon nanotube-based electrical applications and, potentially, may allow us to mass-produce a large number of SWCNT-based integrated devices of critical interest to NASA.

  2. Undergraduate medical education programme renewal: a longitudinal context, input, process and product evaluation study.

    PubMed

    Mirzazadeh, Azim; Gandomkar, Roghayeh; Hejri, Sara Mortaz; Hassanzadeh, Gholamreza; Koochak, Hamid Emadi; Golestani, Abolfazl; Jafarian, Ali; Jalili, Mohammad; Nayeri, Fatemeh; Saleh, Narges; Shahi, Farhad; Razavi, Seyed Hasan Emami

    2016-02-01

    The purpose of this study was to utilize the Context, Input, Process and Product (CIPP) evaluation model as a comprehensive framework to guide initiating, planning, implementing and evaluating a revised undergraduate medical education programme. The eight-year longitudinal evaluation study consisted of four phases compatible with the four components of the CIPP model. In the first phase, we explored the strengths and weaknesses of the traditional programme as well as contextual needs, assets, and resources. For the second phase, we proposed a model for the programme considering contextual features. During the process phase, we provided formative information for revisions and adjustments. Finally, in the fourth phase, we evaluated the outcomes of the new undergraduate medical education programme in the basic sciences phase. Information was collected from different sources such as medical students, faculty members, administrators, and graduates, using various qualitative and quantitative methods including focus groups, questionnaires, and performance measures. The CIPP model has the potential to guide policy makers to systematically collect evaluation data and to manage stakeholders' reactions at each stage of the reform in order to make informed decisions. However, the model may result in evaluation burden and fail to address some unplanned evaluation questions.

  3. Progress on BN and Doped-BN Coatings on Woven Fabrics

    NASA Technical Reports Server (NTRS)

    Hurwitz, Frances I.; Scott, John M.; Chayka, Paul V.

    2001-01-01

    A novel, multistep process for applying interface coatings to woven structures using a pulsed CVD process is being evaluated. Borazine (B3N3H6), a neat liquid, and several Si precursors are used in the process to produce BN and SiBN coatings on Hi- Nicalon fabrics and preforms. A three variable, two level, full factorial matrix is proposed to define the influence of processing parameters. Coating morphology, uniformity and chemistry are characterized by field emission scanning electron microscopy (FESEM), energy dispersive (EDS) and Auger spectroscopies.

  4. On the use of multi-agent systems for the monitoring of industrial systems

    NASA Astrophysics Data System (ADS)

    Rezki, Nafissa; Kazar, Okba; Mouss, Leila Hayet; Kahloul, Laid; Rezki, Djamil

    2016-03-01

    The objective of the current paper is to present an intelligent system for complex process monitoring, based on artificial intelligence technologies. This system aims to carry out all the complex process monitoring tasks: detection, diagnosis, identification and reconfiguration. For this purpose, the development of a multi-agent system that combines multiple techniques, such as multivariate control charts, neural networks, Bayesian networks and expert systems, has become a necessity. The proposed system is evaluated on the monitoring of the Tennessee Eastman benchmark process.
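    Of the techniques listed, a multivariate control chart is the most self-contained to illustrate. The sketch below, a minimal example rather than the paper's agent implementation, fits a Hotelling T² chart to in-control reference data and flags a shifted observation; the synthetic data, the four process variables and the alpha level are assumptions.

```python
import numpy as np
from scipy.stats import f

def hotelling_t2_limits(X_ref, alpha=0.01):
    """Fit a Hotelling T^2 chart from in-control reference data X_ref (n x p)."""
    n, p = X_ref.shape
    mean = X_ref.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X_ref, rowvar=False))
    # Common Phase II upper control limit for individual future observations.
    ucl = p * (n + 1) * (n - 1) / (n * (n - p)) * f.ppf(1 - alpha, p, n - p)
    return mean, cov_inv, ucl

def t2_score(x, mean, cov_inv):
    d = x - mean
    return float(d @ cov_inv @ d)

rng = np.random.default_rng(0)
X_ref = rng.normal(size=(200, 4))             # in-control operating data, 4 variables
mean, cov_inv, ucl = hotelling_t2_limits(X_ref)
faulty = rng.normal(loc=2.5, size=4)          # observation with a mean shift
print(t2_score(faulty, mean, cov_inv) > ucl)  # likely True: the shift is detected
```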

  5. A hybrid method for evaluating enterprise architecture implementation.

    PubMed

    Nikpay, Fatemeh; Ahmad, Rodina; Yin Kia, Chiam

    2017-02-01

    Enterprise Architecture (EA) implementation evaluation provides a set of methods and practices for evaluating the EA implementation artefacts within an EA implementation project. There are insufficient practices in existing EA evaluation models in terms of considering all EA functions and processes, using structured methods in developing EA implementation, employing matured practices, and using appropriate metrics to achieve proper evaluation. The aim of this research is to develop a hybrid evaluation method that supports achieving the objectives of EA implementation. To attain this aim, the first step is to identify EA implementation evaluation practices. To this end, a Systematic Literature Review (SLR) was conducted. Second, the proposed hybrid method was developed based on the foundation and information extracted from the SLR, semi-structured interviews with EA practitioners, program theory evaluation and Information Systems (ISs) evaluation. Finally, the proposed method was validated by means of a case study and expert reviews. This research provides a suitable foundation for researchers who wish to extend and continue this research topic with further analysis and exploration, and for practitioners who would like to employ an effective and lightweight evaluation method for EA projects. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Scientific Process Flowchart Assessment (SPFA): A Method for Evaluating Changes in Understanding and Visualization of the Scientific Process in a Multidisciplinary Student Population

    PubMed Central

    Wilson, Kristy J.; Rigakos, Bessie

    2016-01-01

    The scientific process is nonlinear, unpredictable, and ongoing. Assessing the nature of science is difficult with methods that rely on Likert-scale or multiple-choice questions. This study evaluated conceptions about the scientific process using student-created visual representations that we term “flowcharts.” The methodology, Scientific Process Flowchart Assessment (SPFA), consisted of a prompt and rubric that was designed to assess students’ understanding of the scientific process. Forty flowcharts representing a multidisciplinary group without intervention and 26 flowcharts representing pre- and postinstruction were evaluated over five dimensions: connections, experimental design, reasons for doing science, nature of science, and interconnectivity. Pre to post flowcharts showed a statistically significant improvement in the number of items and ratings for the dimensions. Comparison of the terms used and connections between terms on student flowcharts revealed an enhanced and more nuanced understanding of the scientific process, especially in the areas of application to society and communication within the scientific community. We propose that SPFA can be used in a variety of circumstances, including in the determination of what curricula or interventions would be useful in a course or program, in the assessment of curriculum, or in the evaluation of students performing research projects. PMID:27856551

  7. [Potentialities of the vegetative resonance test for diagnostics of hyperplastic processes in vocal folds].

    PubMed

    Ukhankova, N I; Sotskaia, T Iu

    2010-01-01

    The objective of the present study was to evaluate the potential of the vegetative resonance test (VRT) for elucidating metabolic aspects of the inflammatory process in different forms of chronic vocal fold hyperplasia. The proposed diagnostic criteria characterize the inflammatory process in the larynx, specific features of metabolism in patients presenting with catarrhal and oedematopolypous laryngitis, and characteristic changes in oedematofibrous and fibrous polyps. The use of VRT allowed diagnostic criteria for precarcinogenic conditions in the larynx to be developed.

  8. Experimental Evaluation of Performance Feedback Using the Dismounted Infantry Virtual After Action Review System. Long Range Navy and Marine Corps Science and Technology Program

    DTIC Science & Technology

    2007-11-14

    Artificial intelligence and education, Volume 1: Learning environments and tutoring systems. Hillsdale, NJ: Erlbaum. Wickens, C.D. (1984). Processing... and how to use it to best optimize the learning process. Some researchers (see Loftin & Savely, 1991) have proposed adding intelligent systems to the... is experienced as the cognitive centers in an individual's brain process visual, tactile, kinesthetic, olfactory, proprioceptive, and auditory

  9. An effective and optimal quality control approach for green energy manufacturing using design of experiments framework and evolutionary algorithm

    NASA Astrophysics Data System (ADS)

    Saavedra, Juan Alejandro

    Quality Control (QC) and Quality Assurance (QA) strategies vary significantly across industries in the manufacturing sector depending on the product being built. Such strategies range from simple statistical analysis and process controls to decision-making processes for reworking, repairing, or scrapping defective product. This study proposes an optimal QC methodology that includes rework stations in the manufacturing process by identifying the number and location of these workstations. The factors considered in optimizing these stations are cost, cycle time, reworkability and rework benefit. The goal is to minimize the cost and cycle time of the process while increasing the reworkability and rework benefit. The specific objectives of this study are: (1) to propose a cost estimation model that includes energy consumption, and (2) to propose an optimal QC methodology to identify the quantity and location of rework workstations. The cost estimation model includes energy consumption as part of the product direct cost, and it allows the user to calculate product direct cost as the quality sigma level of the process changes. This provides a benefit because a complete cost estimation calculation does not need to be performed every time the process yield changes. This cost estimation model is then used for the QC strategy optimization process. In order to propose a methodology that provides an optimal QC strategy, the possible factors that affect QC were evaluated. A screening Design of Experiments (DOE) was performed on seven initial factors and identified three significant factors; it also showed that one response variable was not required for the optimization process. A full factorial DOE was then performed to verify the significant factors obtained previously. The QC strategy optimization is performed through a Genetic Algorithm (GA), which allows the evaluation of many candidate solutions in order to obtain feasible optimal solutions. The GA evaluates possible solutions based on cost, cycle time, reworkability and rework benefit. Because this is a multi-objective optimization problem, it provides several possible solutions, presented as chromosomes that clearly state the number and location of the rework stations. The user analyzes these solutions and selects one by deciding which of the four factors is most important for the product being manufactured or the company's objective. The major contribution of this study is a methodology for identifying an effective and optimal QC strategy that incorporates the number and location of rework substations in order to minimize direct product cost and cycle time, and maximize reworkability and rework benefit.
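    The following sketch shows the general shape of such a genetic-algorithm search: binary chromosomes mark which workstations receive a rework station, and a fitness function trades off cost, cycle time and recovered rework benefit. The per-station figures, the scalarised (weighted-sum) fitness and the GA settings are hypothetical simplifications of the study's multi-objective formulation.

```python
import random

# Illustrative per-workstation figures (hypothetical, not from the study):
# adding a rework station at position i costs COST[i], adds TIME[i] to the
# cycle, and recovers BENEFIT[i] worth of otherwise-scrapped product.
COST    = [3.0, 2.0, 4.0, 1.5, 2.5]
TIME    = [1.0, 0.8, 1.2, 0.5, 0.9]
BENEFIT = [5.0, 1.0, 6.0, 1.0, 4.0]

def fitness(chrom, w_cost=1.0, w_time=1.0, w_benefit=2.0):
    """Scalarised multi-objective fitness: reward recovered value,
    penalise added cost and cycle time."""
    cost = sum(c for c, g in zip(COST, chrom) if g)
    time = sum(t for t, g in zip(TIME, chrom) if g)
    benefit = sum(b for b, g in zip(BENEFIT, chrom) if g)
    return w_benefit * benefit - w_cost * cost - w_time * time

def evolve(pop_size=30, generations=50, p_mut=0.1):
    n = len(COST)
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, n)
            child = a[:cut] + b[cut:]                                 # one-point crossover
            child = [g ^ (random.random() < p_mut) for g in child]   # bit-flip mutation
            children.append(child)
        pop = survivors + children
    best = max(pop, key=fitness)
    return best, fitness(best)

random.seed(1)
print(evolve())  # chromosome: 1 = place a rework station at that workstation
```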

  10. Anomaly Detection in Nanofibrous Materials by CNN-Based Self-Similarity.

    PubMed

    Napoletano, Paolo; Piccoli, Flavio; Schettini, Raimondo

    2018-01-12

    Automatic detection and localization of anomalies in nanofibrous materials help to reduce the cost of the production process and the time of the post-production visual inspection process. Amongst all the monitoring methods, those exploiting Scanning Electron Microscope (SEM) imaging are the most effective. In this paper, we propose a region-based method for the detection and localization of anomalies in SEM images, based on Convolutional Neural Networks (CNNs) and self-similarity. The method evaluates the degree of abnormality of each subregion of an image under consideration by computing a CNN-based visual similarity with respect to a dictionary of anomaly-free subregions belonging to a training set. The proposed method outperforms the state of the art.
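    A minimal sketch of the self-similarity idea is given below: each test subregion is scored by its distance to the nearest feature in a dictionary built from anomaly-free subregions. To stay dependency-free, the feature extractor here is a raw-pixel stand-in rather than the pretrained CNN used in the paper, and the image sizes and patch grid are assumptions.

```python
import numpy as np

def cnn_features(patch):
    """Stand-in for the CNN feature extractor: the paper uses features from a
    pretrained network; flattening the raw patch keeps this sketch dependency-free."""
    return patch.astype(float).ravel()

def anomaly_scores(image, dictionary_feats, patch=16, stride=16):
    """Score each subregion by its distance to the closest anomaly-free
    dictionary feature; large distances flag likely anomalies."""
    h, w = image.shape
    scores = {}
    for y in range(0, h - patch + 1, stride):
        for x in range(0, w - patch + 1, stride):
            feat = cnn_features(image[y:y + patch, x:x + patch])
            scores[(y, x)] = np.linalg.norm(dictionary_feats - feat, axis=1).min()
    return scores

rng = np.random.default_rng(0)
normal = rng.normal(0.5, 0.05, size=(64, 64))        # anomaly-free training image
dictionary = np.array([cnn_features(normal[y:y + 16, x:x + 16])
                       for y in range(0, 49, 16) for x in range(0, 49, 16)])
test = rng.normal(0.5, 0.05, size=(64, 64))
test[16:32, 16:32] += 0.8                            # injected defect
scores = anomaly_scores(test, dictionary)
print(max(scores, key=scores.get))                   # highest score is at (16, 16)
```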

  11. Integrating Personalized and Community Services for Mobile Travel Planning and Management

    NASA Astrophysics Data System (ADS)

    Yu, Chien-Chih

    Personalized and community services have been noted as keys to enhance and facilitate e-tourism as well as mobile applications. This paper aims at proposing an integrated service framework for combining personalized and community functions to support mobile travel planning and management. Major mobile tourism related planning and decision support functions specified include personalized profile management, information search and notification, evaluation and recommendation, do-it-yourself planning and design, community and collaboration management, auction and negotiation, transaction and payment, as well as trip tracking and quality control. A system implementation process with an example prototype is also presented for illustrating the feasibility and effectiveness of the proposed system framework, process model, and development methodology.

  12. Choosing Between Public and Private Providers of Depot Maintenance: A Proposed New Approach

    DTIC Science & Technology

    1997-09-01

    Appendix A Mathematical Form of the Model Appendix B Assumed Distributions for Evaluation Factors vm Contents Appendix C Trial Evaluation Workbook ...Figure 4-2. Revised Factor Scale Anchors 4-5 Figure 4-3. Workbook Display Establishing Relevance of Factor 4-5 Figure 4-4. Comparing Results of...Introduction process. To fill voids we conducted additional research in the areas of classical microeconomics , transaction cost economics, public

  13. MEMS-based system and image processing strategy for epiretinal prosthesis.

    PubMed

    Xia, Peng; Hu, Jie; Qi, Jin; Gu, Chaochen; Peng, Yinghong

    2015-01-01

    Retinal prostheses have the potential to restore some level of visual function to patients suffering from retinal degeneration. In this paper, an epiretinal approach with active stimulation devices is presented. The MEMS-based processing system consists of an external micro-camera, an information processor, an implanted electrical stimulator and a microelectrode array. An image processing strategy combining image clustering and enhancement techniques was proposed and evaluated through psychophysical experiments. The results indicated that the image processing strategy improved visual performance compared with directly merging pixels to a low resolution. The image processing methods assist the epiretinal prosthesis in restoring vision.
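    As a rough illustration of the trade-off described, the sketch below contrasts direct pixel merging (block averaging down to an electrode-array resolution) with a simple clustering of intensities into a few levels before merging. The output grid size, level count and quantile-based clustering are assumptions and do not reproduce the paper's processing pipeline.

```python
import numpy as np

def downsample(img, out=(32, 32)):
    """Direct pixel merging: average blocks down to the electrode-array resolution."""
    h, w = img.shape
    bh, bw = h // out[0], w // out[1]
    return img[:out[0] * bh, :out[1] * bw].reshape(out[0], bh, out[1], bw).mean(axis=(1, 3))

def cluster_then_downsample(img, levels=4, out=(32, 32)):
    """Cluster intensities into a few levels first (a simple stand-in for the
    clustering step), then merge pixels."""
    edges = np.quantile(img, np.linspace(0, 1, levels + 1)[1:-1])
    labels = np.digitize(img, edges)          # 0 .. levels-1
    quantised = labels / (levels - 1)         # map back to [0, 1]
    return downsample(quantised, out)

img = np.clip(np.random.default_rng(0).normal(0.5, 0.2, (256, 256)), 0, 1)
print(downsample(img).shape, cluster_then_downsample(img).shape)  # (32, 32) (32, 32)
```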

  14. A risk-based approach to management of leachables utilizing statistical analysis of extractables.

    PubMed

    Stults, Cheryl L M; Mikl, Jaromir; Whelehan, Oliver; Morrical, Bradley; Duffield, William; Nagao, Lee M

    2015-04-01

    To incorporate quality by design concepts into the management of leachables, an emphasis is often put on understanding the extractable profile for the materials of construction for manufacturing disposables, container-closure, or delivery systems. Component manufacturing processes may also impact the extractable profile. An approach was developed to (1) identify critical components that may be sources of leachables, (2) enable an understanding of manufacturing process factors that affect extractable profiles, (3) determine if quantitative models can be developed that predict the effect of those key factors, and (4) evaluate the practical impact of the key factors on the product. A risk evaluation for an inhalation product identified injection molding as a key process. Designed experiments were performed to evaluate the impact of molding process parameters on the extractable profile from an ABS inhaler component. Statistical analysis of the resulting GC chromatographic profiles identified processing factors that were correlated with peak levels in the extractable profiles. The combination of statistically significant molding process parameters was different for different types of extractable compounds. ANOVA models were used to obtain optimal process settings and predict extractable levels for a selected number of compounds. The proposed paradigm may be applied to evaluate the impact of material composition and processing parameters on extractable profiles and utilized to manage product leachables early in the development process and throughout the product lifecycle.
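    A minimal version of the designed-experiment analysis can be sketched with statsmodels: fit an ANOVA model of extractable level against two molding factors and predict the level at chosen settings. The factors (melt temperature, hold time), their levels and the response data are invented for illustration and are not the study's measurements.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Hypothetical two-level designed experiment: two molding parameters and the
# measured level of one extractable compound (arbitrary units).
rng = np.random.default_rng(3)
df = pd.DataFrame({
    "melt_temp": np.repeat([230, 250], 8),             # deg C, coded DOE levels
    "hold_time": np.tile(np.repeat([5, 15], 4), 2),    # seconds
})
df["extractable"] = (0.02 * df["melt_temp"] + 0.10 * df["hold_time"]
                     + rng.normal(0, 0.2, len(df)))

# Fit a two-factor model with interaction and test which parameters matter.
model = smf.ols("extractable ~ C(melt_temp) * C(hold_time)", data=df).fit()
print(anova_lm(model, typ=2))

# Predict the extractable level at the milder (lower) settings.
best = pd.DataFrame({"melt_temp": [230], "hold_time": [5]})
print(model.predict(best))
```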

  15. Innovation through developing consumers communities. Part II: Digitalizing the innovation processes

    NASA Astrophysics Data System (ADS)

    Avasilcai, S.; Galateanu (Avram), E.

    2015-11-01

    The current research recognises innovation as the main driver of organisational growth and profitability. Companies seek new ways to engage consumers and customers in value co-creation through the product design, development and distribution processes, and the main concern is finding new and creative ways of customising products based on consumers' requirements and needs. The need for innovative virtual instruments thus arose as the demand from social communities for personalised products or services increased. Companies should develop their own innovation platforms, where consumers can contribute ideas, concepts or other relevant input and interact with designers or engineers during product development. This paper presents the most important features of platform development within BMW Group, both as a concept and as an innovation instrument. To this end, the company's past experience with co-creation projects is examined, and the dual role of consumers as co-creators and co-evaluators is highlighted, based on their involvement in the proposed and developed projects and in the platform structure. The diversity of the company's Research & Development and innovation activities has a significant impact on how the platform functions. The platform structure, the main proposed themes and the evaluation process are therefore assessed. The main outcome is to highlight the significance of platform development as an innovative tool for building consumer communities. Based on an analysis of the “BMW Co-Creation Lab”, the main consumer concerns regarding safety, comfort and appearance of the products are revealed. It is also important to understand the evaluation process for gathered ideas and the intellectual property policy. The importance of platform development and implementation is underlined by the company's results in terms of Research & Development investments and by the future projects that BMW Group will propose, assess and implement to demonstrate responsibility for its products and consumers.

  16. Factors that affect micro-tooling features created by direct printing approach

    NASA Astrophysics Data System (ADS)

    Kumbhani, Mayur N.

    The current market requires faster-paced production of smaller, better and improved products in shorter amounts of time. Traditional high-rate manufacturing processes such as hot embossing, injection molding and compression molding use tooling to replicate features on products. Miniaturization of many products in the biomedical, electronics, optical and microfluidic fields is occurring on a daily basis, so there is a constant need to produce cheaper and faster tooling that can be utilized by existing manufacturing processes. Traditionally, micron-size tooling features are manufactured with processes such as micro-machining and Electrical Discharge Machining (EDM). Because of the greater difficulty of producing smaller features and the longer production cycle times, various additive manufacturing approaches have been proposed, e.g. selective laser sintering (SLS), inkjet printing (3DP) and fused deposition modeling (FDM). Most of these approaches can produce net-shaped products from different materials such as metals, ceramics or polymers. Several attempts have been made to produce tooling features with additive manufacturing, but most of the resulting tooling was not cost effective and its life cycle was reported to be short. In this research, a method to produce tooling features using a direct printing approach is investigated, in which highly filled feedstock is dispensed onto a substrate. Different natural binders, such as guar gum, xanthan gum and sodium carboxymethyl cellulose (NaCMC), and their combinations were evaluated. The best binder combination was then used to evaluate the effect of different metal particle sizes (316L stainless steel (3 μm), 316 stainless steel (45 μm) and 304 stainless steel (45 μm)) on feature quality. Finally, the effect of direct printing process variables, such as dispensing tip internal diameter (500 μm and 333 μm) at different printing speeds, was evaluated.

  17. A methodology for the evaluation of the human-bioclimatic performance of open spaces

    NASA Astrophysics Data System (ADS)

    Charalampopoulos, Ioannis; Tsiros, Ioannis; Chronopoulou-Sereli, Aik.; Matzarakis, Andreas

    2017-05-01

    The purpose of this paper is to present a simple methodology to improve the evaluation of the human-biometeorological benefits of open spaces. It is based on two groups of new indices built on the well-known PET index. This simple methodology, along with the accompanying indices, allows a qualitative and quantitative evaluation of the climatic behavior of the selected sites. The proposed methodology was applied in a human-biometeorological study in the city of Athens, Greece. The results of this study are in line with those of other related studies, indicating the considerable influence of the sky view factor (SVF), the presence of vegetation and the building materials on human-biometeorological conditions. The proposed methodology may provide new insights into the decision-making process related to the best configuration of urban open spaces.

  18. Degraded document image enhancement

    NASA Astrophysics Data System (ADS)

    Agam, G.; Bal, G.; Frieder, G.; Frieder, O.

    2007-01-01

    Poor quality documents are obtained in various situations such as historical document collections, legal archives, security investigations, and documents found in clandestine locations. Such documents are often scanned for automated analysis, further processing, and archiving. Due to the nature of such documents, degraded document images are often hard to read, have low contrast, and are corrupted by various artifacts. We describe a novel approach for the enhancement of such documents based on probabilistic models which increases the contrast, and thus, readability of such documents under various degradations. The enhancement produced by the proposed approach can be viewed under different viewing conditions if desired. The proposed approach was evaluated qualitatively and compared to standard enhancement techniques on a subset of historical documents obtained from the Yad Vashem Holocaust museum. In addition, quantitative performance was evaluated based on synthetically generated data corrupted under various degradation models. Preliminary results demonstrate the effectiveness of the proposed approach.

  19. Probabilistic retinal vessel segmentation

    NASA Astrophysics Data System (ADS)

    Wu, Chang-Hua; Agam, Gady

    2007-03-01

    Optic fundus assessment is widely used for diagnosing vascular and non-vascular pathology. Inspection of the retinal vasculature may reveal hypertension, diabetes, arteriosclerosis, cardiovascular disease and stroke. Due to various imaging conditions retinal images may be degraded. Consequently, the enhancement of such images and vessels in them is an important task with direct clinical applications. We propose a novel technique for vessel enhancement in retinal images that is capable of enhancing vessel junctions in addition to linear vessel segments. This is an extension of vessel filters we have previously developed for vessel enhancement in thoracic CT scans. The proposed approach is based on probabilistic models which can discern vessels and junctions. Evaluation shows the proposed filter is better than several known techniques and is comparable to the state of the art when evaluated on a standard dataset. A ridge-based vessel tracking process is applied on the enhanced image to demonstrate the effectiveness of the enhancement filter.

  20. On improving IED object detection by exploiting scene geometry using stereo processing

    NASA Astrophysics Data System (ADS)

    van de Wouw, Dennis W. J. M.; Dubbelman, Gijs; de With, Peter H. N.

    2015-03-01

    Detecting changes in the environment with respect to an earlier data acquisition is important for several applications, such as finding Improvised Explosive Devices (IEDs). We explore and evaluate the benefit of depth sensing in the context of automatic change detection, where an existing monocular system is extended with a second camera in a fixed stereo setup. We then propose an alternative frame registration that exploits scene geometry, in particular the ground plane. Furthermore, change characterization is applied to localized depth maps to distinguish between 3D physical changes and shadows, which solves one of the main challenges of a monocular system. The proposed system is evaluated on real-world acquisitions, containing geo-tagged test objects of 18 × 18 × 9 cm up to a distance of 60 meters. The proposed extensions lead to a significant reduction of the false-alarm rate by a factor of 3, while simultaneously improving the detection score by 5%.

  1. Naive Bayes as opinion classifier to evaluate students satisfaction based on student sentiment in Twitter Social Media

    NASA Astrophysics Data System (ADS)

    Candra Permana, Fahmi; Rosmansyah, Yusep; Setiawan Abdullah, Atje

    2017-10-01

    Students' activity on social media can provide implicit knowledge and new perspectives for an educational system. Sentiment analysis is a part of text mining that can help to analyze and classify opinion data. This research uses text mining with the naive Bayes method as an opinion classifier, offering an alternative way for an educational institution to evaluate student satisfaction. Based on the test results, the system can classify opinions in Bahasa Indonesia using naive Bayes with an accuracy of 84%, and the comparison between the existing system and the proposed system for evaluating student satisfaction with the learning process shows a difference of only 16.49%.
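    A minimal sketch of such a classifier, using scikit-learn's CountVectorizer and MultinomialNB, is shown below. The training tweets and labels are placeholders, not the study's corpus; a real evaluation would use a labelled set of student tweets.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny illustrative training set of (tweet, label) pairs in Bahasa Indonesia.
tweets = [
    "dosen menjelaskan dengan sangat baik",     # positive
    "materi kuliah mudah dipahami",             # positive
    "jadwal kuliah kacau dan membingungkan",    # negative
    "tugas terlalu banyak dan melelahkan",      # negative
]
labels = ["positive", "positive", "negative", "negative"]

# Bag-of-words features feeding a multinomial naive Bayes classifier.
clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(tweets, labels)
print(clf.predict(["tugas dan jadwal kuliah membingungkan"]))  # -> ['negative']
```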

  2. A framework to observe and evaluate the sustainability of human-natural systems in a complex dynamic context.

    PubMed

    Satanarachchi, Niranji; Mino, Takashi

    2014-01-01

    This paper aims to explore the prominent implications of the process of observing complex dynamics linked to sustainability in human-natural systems and to propose a framework for sustainability evaluation by introducing the concept of sustainability boundaries. Arguing that both observing and evaluating sustainability should engage awareness of complex dynamics from the outset, we try to embody this idea in the framework by two complementary methods, namely the layer view- and dimensional view-based methods, which support the understanding of a reflexive and iterative sustainability process. The framework enables the observation of complex dynamic sustainability contexts, which we call observation metastructures, and enables us to map these contexts to sustainability boundaries.

  3. New method of contour image processing based on the formalism of spiral light beams

    NASA Astrophysics Data System (ADS)

    Volostnikov, Vladimir G.; Kishkin, S. A.; Kotova, S. P.

    2013-07-01

    The possibility of applying the mathematical formalism of spiral light beams to the problems of contour image recognition is theoretically studied. The advantages and disadvantages of the proposed approach are evaluated; the results of numerical modelling are presented.

  4. 39 CFR 775.9 - Environmental evaluation process.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... upon the environment. (2) Findings of no significant impact. If an environmental assessment indicates that there is no significant impact of a proposed action on the environment, an environmental impact... significant effect on the human environment and states that an environmental impact statement will not be...

  5. 39 CFR 775.9 - Environmental evaluation process.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... upon the environment. (2) Findings of no significant impact. If an environmental assessment indicates that there is no significant impact of a proposed action on the environment, an environmental impact... significant effect on the human environment and states that an environmental impact statement will not be...

  6. Plasticity in Memorial Networks

    ERIC Educational Resources Information Center

    Hayes-Roth, Barbara; Hayes-Roth, Frederick

    1975-01-01

    An adaptive network model is proposed to represent the structure and processing of knowledge. Accessibility of subjects' stored information was measured. Relationships exist among (a) frequency of verifying a test relation, (b) other relations involving concepts used to evaluate test relation, (c) frequency of verifying those relations. (CHK)

  7. Mass transit : implementation of FTA's new starts evaluation process and FY 2001 funding proposals

    DOT National Transportation Integrated Search

    2000-04-01

    Since the early 1970s, the federal government has provided a large share of the nation's capital investment in urban mass transportation. Much of this investment has come through the Federal Transit Administration's (FTA) New Starts program, which he...

  8. Evaluation of Research Ethics Committees: Criteria for the Ethical Quality of the Review Process.

    PubMed

    Scherzinger, Gregor; Bobbert, Monika

    2017-01-01

    The adequacy, performance and quality of Ethics Committees that oversee medical research trials are repeatedly discussed. Although they play a crucial role in reviewing medical research and protecting human subjects, it is far from clear to what degree they fulfill the tasks they have been assigned. This has led to calls for an evaluation of their activity and, in some places, to the establishment of accreditation schemes. At the same time, IRBs have become the subject of detailed legislation in the process of the ongoing global juridification of medical research. Unsurprisingly, there is a tendency to understand the evaluation of RECs as a question of controlling their legal compliance. This paper discusses the need for a quality evaluation of IRBs from an ethical point of view and, by systematically reviewing the major ethical guidelines for IRBs, proposes a system of criteria that should orientate any evaluation of IRBs.

  9. A unified architecture for biomedical search engines based on semantic web technologies.

    PubMed

    Jalali, Vahid; Matash Borujerdi, Mohammad Reza

    2011-04-01

    There has been huge growth in the volume of published biomedical research in recent years. Many medical search engines have been designed and developed to address the ever-growing information needs of biomedical experts and curators. Significant progress has been made in utilizing the knowledge embedded in medical ontologies and controlled vocabularies to assist these engines. However, the lack of a common architecture for the ontologies used and for the overall retrieval process hampers the evaluation of different search engines, and the interoperability between them, under unified conditions. In this paper, a unified architecture for medical search engines is introduced. The proposed model contains standard schemas, declared in semantic web languages, for the ontologies and documents used by search engines. Unified models for the annotation and retrieval processes are other parts of the introduced architecture. A sample search engine is also designed and implemented based on the proposed architecture. The search engine is evaluated using two test collections, and results are reported in terms of precision vs. recall and mean average precision for the different approaches used by this search engine.
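    The retrieval metrics mentioned are straightforward to compute directly; the sketch below evaluates average precision per query and mean average precision over ranked result lists. The document identifiers and relevance judgements are hypothetical.

```python
def average_precision(ranked_ids, relevant_ids):
    """Average precision for one query: mean of precision@k at each rank k
    where a relevant document is retrieved."""
    hits, precisions = 0, []
    for k, doc in enumerate(ranked_ids, start=1):
        if doc in relevant_ids:
            hits += 1
            precisions.append(hits / k)
    return sum(precisions) / max(len(relevant_ids), 1)

def mean_average_precision(runs):
    """runs: list of (ranked result list, set of relevant ids), one per query."""
    return sum(average_precision(r, rel) for r, rel in runs) / len(runs)

# Two hypothetical queries against a test collection.
runs = [
    (["d3", "d1", "d7", "d2"], {"d1", "d2"}),   # AP = (1/2 + 2/4) / 2 = 0.5
    (["d5", "d4"],             {"d5"}),         # AP = 1.0
]
print(mean_average_precision(runs))             # 0.75
```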

  10. Statistical and clustering analysis for disturbances: A case study of voltage dips in wind farms

    DOE PAGES

    Garcia-Sanchez, Tania; Gomez-Lazaro, Emilio; Muljadi, Eduard; ...

    2016-01-28

    This study proposes and evaluates an alternative statistical methodology to analyze a large number of voltage dips. For a given voltage dip, a set of lengths is first identified to characterize the root mean square (rms) voltage evolution along the disturbance, deduced from partial linearized time intervals and trajectories. Principal component analysis and K-means clustering are then applied to identify rms-voltage patterns and propose a reduced number of representative rms-voltage profiles from the linearized trajectories. This reduced group of averaged rms-voltage profiles enables the representation of a large number of disturbances and offers a visual and graphical representation of their evolution along the events, an aspect not previously considered in other contributions. The complete process is evaluated on real voltage dips collected in intensive field-measurement campaigns carried out in a wind farm in Spain over several years. The results are included in this paper.
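    The core of the methodology, dimensionality reduction followed by clustering of linearised rms-voltage trajectories, can be sketched with scikit-learn as below. The synthetic dip shapes, the number of principal components and the number of clusters are assumptions standing in for the field-measurement data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Synthetic stand-ins for linearised rms-voltage trajectories (one row per dip,
# sampled at fixed points along the event); real data would come from the
# wind-farm measurement campaigns.
rng = np.random.default_rng(7)
t = np.linspace(0, 1, 40)
shallow = 1 - 0.2 * np.exp(-((t - 0.3) / 0.1) ** 2)   # shallow, short dip
deep    = 1 - 0.6 * np.exp(-((t - 0.5) / 0.2) ** 2)   # deep, long dip
dips = np.vstack([shallow + rng.normal(0, 0.02, (30, t.size)),
                  deep    + rng.normal(0, 0.02, (30, t.size))])

# Reduce dimensionality, then group the dips into representative patterns.
scores = PCA(n_components=3).fit_transform(dips)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)

# One averaged rms-voltage profile per cluster represents many individual dips.
profiles = np.array([dips[labels == k].mean(axis=0) for k in range(2)])
print(profiles.shape)   # (2, 40)
```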

  11. Vibrational monitor of early demineralization in tooth enamel after in vitro exposure to phosphoridic liquid

    NASA Astrophysics Data System (ADS)

    Pezzotti, Giuseppe; Adachi, Tetsuya; Gasparutti, Isabella; Vincini, Giulio; Zhu, Wenliang; Boffelli, Marco; Rondinella, Alfredo; Marin, Elia; Ichioka, Hiroaki; Yamamoto, Toshiro; Marunaka, Yoshinori; Kanamura, Narisato

    2017-02-01

    The Raman spectroscopic method has been applied to quantitatively assess the in vitro degree of demineralization in healthy human teeth. Based on previous evaluations of Raman selection rules (empowered by an orientation distribution function (ODF) statistical algorithm) and on a newly proposed analysis of phonon density of states (PDOS) for selected vibrational modes of the hexagonal structure of hydroxyapatite, a molecular-scale evaluation of the demineralization process upon in vitro exposure to a highly acidic beverage (i.e., CocaCola™ Classic, pH = 2.5) could be obtained. The Raman method proved quite sensitive and spectroscopic features could be directly related to an increase in off-stoichiometry of the enamel surface structure since the very early stage of the demineralization process (i.e., when yet invisible to other conventional analytical techniques). The proposed Raman spectroscopic algorithm might possess some generality for caries risk assessment, allowing a prompt non-contact diagnostic practice in dentistry.

  12. Certifying leaders? high-quality management practices and healthy organisations: an ISO-9000 based standardisation approach

    PubMed Central

    MONTANO, Diego

    2016-01-01

    The present study proposes a set of quality requirements to management practices by taking into account the empirical evidence on their potential effects on health, the systemic nature of social organisations, and the current conceptualisations of management functions within the framework of comprehensive quality management systems. Systematic reviews and meta-analyses focusing on the associations between leadership and/or supervision and health in occupational settings are evaluated, and the core elements of an ISO 9001 standardisation approach are presented. Six major occupational health requirements to high-quality management practices are identified pertaining to communication processes, organisational justice, role clarity, decision making, social influence processes and management support. It is concluded that the quality of management practices may be improved by developing a quality management system of management practices that ensures not only conformity to product but also to occupational safety and health requirements. Further research may evaluate the practicability of the proposed approach. PMID:26860787

  13. Certifying leaders? high-quality management practices and healthy organisations: an ISO-9000 based standardisation approach.

    PubMed

    Montano, Diego

    2016-08-05

    The present study proposes a set of quality requirements to management practices by taking into account the empirical evidence on their potential effects on health, the systemic nature of social organisations, and the current conceptualisations of management functions within the framework of comprehensive quality management systems. Systematic reviews and meta-analyses focusing on the associations between leadership and/or supervision and health in occupational settings are evaluated, and the core elements of an ISO 9001 standardisation approach are presented. Six major occupational health requirements to high-quality management practices are identified pertaining to communication processes, organisational justice, role clarity, decision making, social influence processes and management support. It is concluded that the quality of management practices may be improved by developing a quality management system of management practices that ensures not only conformity to product but also to occupational safety and health requirements. Further research may evaluate the practicability of the proposed approach.

  14. Tinnitus: a management model.

    PubMed

    Stephens, S D; Hallam, R S; Jakes, S C

    1986-08-01

    A comprehensive model of tinnitus management is proposed. As it is rarely possible to abolish the symptom, management of the tinnitus patient must aim at precipitating the habituation process. The model is split into 'evaluation' and 'remediation' sections. In each section the various aspects of management are discussed. Together with traditional factors, the importance of psychological processes is stressed. The role of the expectations of the patient in limiting remedial possibilities is also discussed.

  15. A proposed analytic framework for determining the impact of an antimicrobial resistance intervention.

    PubMed

    Grohn, Yrjo T; Carson, Carolee; Lanzas, Cristina; Pullum, Laura; Stanhope, Michael; Volkova, Victoriya

    2017-06-01

    Antimicrobial use (AMU) is increasingly threatened by antimicrobial resistance (AMR). The FDA is implementing risk mitigation measures promoting prudent AMU in food animals. Their evaluation is crucial: the AMU/AMR relationship is complex; a suitable framework to analyze interventions is unavailable. Systems science analysis, depicting variables and their associations, would help integrate mathematics/epidemiology to evaluate the relationship. This would identify informative data and models to evaluate interventions. This National Institute for Mathematical and Biological Synthesis AMR Working Group's report proposes a system framework to address the methodological gap linking livestock AMU and AMR in foodborne bacteria. It could evaluate how AMU (and interventions) impact AMR. We will evaluate pharmacokinetic/dynamic modeling techniques for projecting AMR selection pressure on enteric bacteria. We study two methods to model phenotypic AMR changes in bacteria in the food supply and evolutionary genotypic analyses determining molecular changes in phenotypic AMR. Systems science analysis integrates the methods, showing how resistance in the food supply is explained by AMU and concurrent factors influencing the whole system. This process is updated with data and techniques to improve prediction and inform improvements for AMU/AMR surveillance. Our proposed framework reflects both the AMR system's complexity, and desire for simple, reliable conclusions.

  16. Active magnetic force microscopy of Sr-ferrite magnet by stimulating magnetization under an AC magnetic field: Direct observation of reversible and irreversible magnetization processes

    NASA Astrophysics Data System (ADS)

    Cao, Yongze; Kumar, Pawan; Zhao, Yue; Yoshimura, Satoru; Saito, Hitoshi

    2018-05-01

    Understanding the dynamic magnetization process of magnetic materials is crucial to improving their fundamental properties and technological applications. Here, we propose active magnetic force microscopy for observing reversible and irreversible magnetization processes by stimulating magnetization with an AC magnetic field, based on alternating magnetic force microscopy with a sensitive superparamagnetic tip. This approach simultaneously measures the sample's DC and AC magnetic fields. We applied this microscopy approach to an anisotropic Sr-ferrite (SrF) sintered magnet, a single-domain-type magnet in which magnetization mainly changes via magnetic rotation. The proposed method can directly observe the reversible and irreversible magnetization processes of SrF and clearly reveal the magnetic domain evolution of SrF (from no stimulated magnetization, to stimulated reversible magnetization, to stimulated irreversible magnetization switching) by slowly increasing the amplitude of the external AC magnetic field. This microscopy approach can evaluate magnetic inhomogeneity and explain the local magnetic process within the permanent magnet.

  17. Generalized species sampling priors with latent Beta reinforcements

    PubMed Central

    Airoldi, Edoardo M.; Costa, Thiago; Bassetti, Federico; Leisen, Fabrizio; Guindani, Michele

    2014-01-01

    Many popular Bayesian nonparametric priors can be characterized in terms of exchangeable species sampling sequences. However, in some applications, exchangeability may not be appropriate. We introduce a novel and probabilistically coherent family of non-exchangeable species sampling sequences characterized by a tractable predictive probability function with weights driven by a sequence of independent Beta random variables. We compare their theoretical clustering properties with those of the Dirichlet process and the two-parameter Poisson-Dirichlet process. The proposed construction provides a complete characterization of the joint process, differently from existing work. We then propose the use of such a process as a prior distribution in a hierarchical Bayes modeling framework, and we describe a Markov Chain Monte Carlo sampler for posterior inference. We evaluate the performance of the prior and the robustness of the resulting inference in a simulation study, providing a comparison with popular Dirichlet process mixtures and Hidden Markov Models. Finally, we develop an application to the detection of chromosomal aberrations in breast cancer by leveraging array CGH data. PMID:25870462
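    As a loose illustration only (the paper's exact predictive probability function is not reproduced here), the sketch below simulates a non-exchangeable species sampling sequence in which an independent Beta draw at each step both governs the chance of opening a new species and reinforces the weight of the species that is chosen.

```python
import random

def beta_reinforced_sequence(n, a=1.0, b=2.0, seed=0):
    """Illustrative non-exchangeable species sampling sequence: at step i an
    independent Beta(a, b) draw w_i decides whether a new species appears;
    otherwise an existing species is re-drawn in proportion to its
    accumulated (reinforced) weight."""
    rng = random.Random(seed)
    weights = []                      # one accumulated weight per species
    labels = []
    for _ in range(n):
        w = rng.betavariate(a, b)     # latent Beta reinforcement for this step
        if not weights or rng.random() < w:
            weights.append(w)         # open a new species
            labels.append(len(weights) - 1)
        else:
            total = sum(weights)
            r, acc, k = rng.random() * total, 0.0, 0
            for k, wk in enumerate(weights):
                acc += wk
                if r <= acc:
                    break
            weights[k] += w           # reinforce the chosen species
            labels.append(k)
    return labels

print(beta_reinforced_sequence(20))   # cluster label of each observation
```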

  18. Implementation of an Antenna Array Signal Processing Breadboard for the Deep Space Network

    NASA Technical Reports Server (NTRS)

    Navarro, Robert

    2006-01-01

    The Deep Space Network Large Array will replace/augment 34 and 70 meter antenna assets. The array will mainly be used to support NASA's deep space telemetry, radio science, and navigation requirements. The array project will deploy three complexes, in the western U.S., Australia, and at a European longitude, each with 400 12-m downlink antennas, plus a DSN central facility at JPL. This facility will remotely conduct all real-time monitoring and control for the network. Signal processing objectives include: providing a means to evaluate the performance of the Breadboard Array's antenna subsystem; designing and building prototype hardware; demonstrating and evaluating proposed signal processing techniques; and gaining experience with various technologies that may be used in the Large Array. Results are summarized.

  19. Evaluation method on steering for the shape-shifting robot in different configurations

    NASA Astrophysics Data System (ADS)

    Chang, Jian; Li, Bin; Wang, Chong; Zheng, Huaibing; Li, Zhiqiang

    2016-01-01

    Existing methods for evaluating steering are qualitative, which makes the results inaccurate and fuzzy and reduces the efficiency of process execution. A quantitative evaluation method for the shape-shifting robot in different configurations is therefore proposed. In contrast to the traditional evaluation method, the most important aspects that influence the steering ability of the robot in different configurations are studied in detail, including energy, angular velocity, time and space. To improve the robustness of the system, both ideal and slippage conditions are considered in the mathematical model. In contrast to the traditional weight-determination method, the degree of robot steering ability is evaluated by combining subjective and objective weighting methods. The subjective weighting method reflects the preferences of the experts and is based on a five-grade scale; the objective weighting method determines the factor weights using information entropy. Sensors fixed on the robot provide the contact force between the track grousers and the ground as well as the intrinsic motion characteristics of the robot, and experiments were performed to verify the proposed algorithm with the robot in different common configurations. The proposed method resolves the fuzziness and inaccuracy of the evaluation, so operators can choose the most suitable configuration of the robot to fulfil different tasks more quickly and simply.
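    The objective, entropy-based part of the weighting can be sketched directly: criteria whose scores vary more across configurations receive larger weights. The steering scores below are hypothetical, and in the full method these objective weights would be combined with the experts' five-grade subjective weights.

```python
import numpy as np

def entropy_weights(scores):
    """Objective weights via the information-entropy method: criteria whose
    values vary more across alternatives receive larger weights.
    scores: (n_alternatives x n_criteria), larger = better for every criterion."""
    p = scores / scores.sum(axis=0)                        # normalise each criterion column
    n = scores.shape[0]
    entropy = -(p * np.log(p + 1e-12)).sum(axis=0) / np.log(n)
    diversity = 1 - entropy
    return diversity / diversity.sum()

# Hypothetical steering scores for three robot configurations over four
# criteria (energy, angular velocity, time, space), scaled so larger is better.
scores = np.array([[0.9, 0.4, 0.7, 0.6],
                   [0.5, 0.8, 0.6, 0.6],
                   [0.2, 0.7, 0.9, 0.6]])
w = entropy_weights(scores)
print(w)               # near-constant criteria (e.g. the last one) get ~0 weight
print(scores @ w)      # overall steering score per configuration
```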

  20. Small business innovation research program solicitation: Closing date July 16, 1990

    NASA Technical Reports Server (NTRS)

    1990-01-01

    This is the eighth annual solicitation by NASA addressed to small business firms, inviting them to submit proposals for research, or research and development, activities in some of the science and engineering areas of interest to NASA. The solicitation describes the Small Business Innovative Research (SBIR) program, identifies eligibility requirements, outlines the required proposal format and content, states proposal preparation and submission requirements, describes the proposal evaluation and award selection process, and provides other information to assist those interested in participating in NASA's SBIR program. It also identifies the technical topics and subtopics for which SBIR proposals are solicited. These cover a broad range of current NASA interests, but do not necessarily include all areas in which NASA plans or currently conducts research. High-risk high pay-off innovations are desired.
