Sample records for techniques critical analysis

  1. Teaching for Art Criticism: Incorporating Feldman's Critical Analysis Learning Model in Students' Studio Practice

    ERIC Educational Resources Information Center

    Subramaniam, Maithreyi; Hanafi, Jaffri; Putih, Abu Talib

    2016-01-01

    This study examined 30 first-year graphic design students' artwork through critical analysis using Feldman's model of art criticism. Data were analyzed quantitatively using descriptive statistical techniques; scores were examined as means and frequencies to determine students' performance in critical ability.…

  2. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    ERIC Educational Resources Information Center

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques: one philosophical, the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT, raised in a recent media article, to demonstrate their ability to develop the ethical analysis skills of…

  3. Determining the Number of Factors in P-Technique Factor Analysis

    ERIC Educational Resources Information Center

    Lo, Lawrence L.; Molenaar, Peter C. M.; Rovine, Michael

    2017-01-01

    Determining the number of factors is a critical first step in exploratory factor analysis. Although various criteria and methods for determining the number of factors have been evaluated in the usual between-subjects R-technique factor analysis, there is still the question of how these methods perform in within-subjects P-technique factor analysis. A…
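
    The factor-retention question this abstract raises can be illustrated with a small sketch. The following Python example (synthetic data; not the authors' procedure) applies two common retention criteria, the Kaiser eigenvalue-greater-than-one rule and a simple parallel analysis, to a single subject's occasions-by-variables time series, the data layout on which P-technique factor analysis operates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical P-technique data: one subject measured on p variables at T occasions,
# generated here from two latent factors plus noise.
T, p = 200, 8
common = rng.normal(size=(T, 2))
loadings = rng.normal(size=(2, p))
X = common @ loadings + rng.normal(scale=0.8, size=(T, p))

R = np.corrcoef(X, rowvar=False)                  # p x p correlation matrix
eig = np.sort(np.linalg.eigvalsh(R))[::-1]        # eigenvalues, descending

# Kaiser criterion: retain factors with eigenvalue > 1.
kaiser = int(np.sum(eig > 1.0))

# Simple parallel analysis: compare against eigenvalues of random data of the same shape.
sims = []
for _ in range(200):
    Xr = rng.normal(size=(T, p))
    sims.append(np.sort(np.linalg.eigvalsh(np.corrcoef(Xr, rowvar=False)))[::-1])
thresh = np.percentile(np.array(sims), 95, axis=0)
parallel = int(np.sum(eig > thresh))

print(f"Kaiser rule: {kaiser} factors; parallel analysis: {parallel} factors")
```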

  4. Risk analysis of hematopoietic stem cell transplant process: failure mode, effect, and criticality analysis and hazard analysis critical control point methods integration based on guidelines to good manufacturing practice for medicinal product ANNEX 20 (February 2008).

    PubMed

    Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F

    2010-01-01

    The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. An integrated quality risk management approach can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, frequency, and detectability, seeking to prevent the most harmful hazards. The Hazard Analysis Critical Control Point method, recognized as the most appropriate technique to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished product specifications and limits of acceptability, identifying all off-specifications, defining activities that can cause them, and finally establishing both a monitoring system for each Critical Control Point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis was performed with this technique on the HPC manipulation process performed at our blood center. The data analysis showed that the hazards with the highest RPN values and the greatest impact on the process are loss of dose and loss of tracking; the technical skills of operators and manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while other critical parameters can be mitigated with the use of continuous control systems. The blood center management software was complemented by a labeling system, with forms designed to comply with the standards in force, and by starting implementation of a cryopreservation management module. Copyright 2010 Elsevier Inc. All rights reserved.
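
    As a concrete illustration of the RPN scoring described above: the sketch below ranks a few hypothetical hazards (the names and 1-10 scale values are invented for illustration, not taken from the study) using the FMECA convention RPN = severity × occurrence × detectability.

```python
# Minimal FMECA-style RPN ranking sketch (hypothetical hazards and 1-10 scores).
hazards = [
    # (hazard, severity, occurrence, detectability) -- higher = worse
    ("loss of dose",               9, 4, 6),
    ("loss of tracking",           8, 3, 7),
    ("manual transcription error", 6, 6, 5),
    ("labeling mix-up",            9, 2, 4),
]

# RPN = severity x occurrence x detectability; rank hazards from worst to least.
ranked = sorted(
    ((name, s * o * d) for name, s, o, d in hazards),
    key=lambda item: item[1],
    reverse=True,
)

for name, rpn in ranked:
    print(f"RPN {rpn:4d}  {name}")
```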

  5. Critical evaluation of sample pretreatment techniques.

    PubMed

    Hyötyläinen, Tuulia

    2009-06-01

    Sample preparation before chromatographic separation is the most time-consuming and error-prone part of the analytical procedure. Therefore, selecting and optimizing an appropriate sample preparation scheme is a key factor in the final success of the analysis, and the judicious choice of an appropriate procedure greatly influences the reliability and accuracy of a given analysis. The main objective of this review is to critically evaluate the applicability, disadvantages, and advantages of various sample preparation techniques. Particular emphasis is placed on extraction techniques suitable for both liquid and solid samples.

  6. Describing Function Techniques for the Non-Linear Analysis of the Dynamics of a Rail Vehicle Wheelset

    DOT National Transportation Integrated Search

    1975-07-01

    The describing function method of analysis is applied to investigate the influence of parametric variations on wheelset critical velocity. In addition, the relationship between the amplitude of sustained lateral oscillations and critical speed is der...

  7. Matters of Care in Alberta's "Inspiring Education" Policy: A Critical Feminist Discourse Analysis

    ERIC Educational Resources Information Center

    Bohachyk, Laura

    2016-01-01

    Using the ethics of care as a theoretical lens, alongside the techniques of discourse analysis, I critically analyze texts from Alberta's Inspiring Education policies. On the basis of this analysis, I identify two discourses: the sentimental treatment of care and the "facilitator discourse." I argue that a caring teacher-student…

  8. The Critical Incident Technique: An Effective Tool for Gathering Experience from Practicing Engineers

    ERIC Educational Resources Information Center

    Hanson, James H.; Brophy, Patrick D.

    2012-01-01

    Not all knowledge and skills that educators want to pass to students exist yet in textbooks. Some still reside only in the experiences of practicing engineers (e.g., how engineers create new products, how designers identify errors in calculations). The critical incident technique, CIT, is an established method for cognitive task analysis. It is…

  9. Rotary-Balance Testing for Aircraft Dynamics (Les Essais sur Balance Rotative pour l’Etude de la Dynamique du Vol de l’Avion)

    DTIC Science & Technology

    1990-01-01

    critical examination of the rotary-balance techniques used in the AGARD community for the analysis of high-angle-of-attack dynamic behavior of aircraft. It was felt that such a critical examination should encompass both the experimental techniques used to obtain rotary-flow aerodynamic data and the…monitor the vibrational and critical structural characteristics of the apparatus and tunnel support system. Many of these systems are integrated directly

  10. Fostering Deeper Critical Inquiry with Causal Layered Analysis

    ERIC Educational Resources Information Center

    Haigh, Martin

    2016-01-01

    Causal layered analysis (CLA) is a technique that enables deeper critical inquiry through a structured exploration of four layers of causation. CLA's layers reach down from the surface litany of media understanding, through the layer of systemic causes identified by conventional research, to underpinning worldviews, ideologies and philosophies,…

  11. Measurement of Responsibility: A Critical Evaluation of Level of Work Measurement by Time-Span of Discretion.

    ERIC Educational Resources Information Center

    Laner, S.; And Others

    This report is a critical evaluation based on extended field trials and theoretical analysis of the time-span technique of measuring level of work in organizational hierarchies. It is broadly concluded that the technique does possess many of the desirable features claimed by its originator, but that earlier, less highly structured versions based…

  12. Active Student Involvement Focusing on Critical Analysis of Commercial Television.

    ERIC Educational Resources Information Center

    Notar, Ellen E.; Robinson, Rhonda S.

    This paper examines classroom techniques for stimulating students' critical faculties in viewing commercial television. The thrust is not only to increase students' critical viewing judgments, but also to heighten their knowledge of the literary elements of television. Television literacy may be developed by attention to the artistry of the television…

  13. Critical Thinking, Parenting, and the Dance of Adolescence.

    ERIC Educational Resources Information Center

    Sargant, Hope

    2002-01-01

    A parent of a gifted preteen discusses how parents can promote critical thinking in their gifted adolescents. Parents are urged to focus on the three levels of cognition where critical thinking is believed to take place: analysis, synthesis, and evaluation. Examples of positive interactions and questioning techniques are provided. (Contains 1 reference.)…

  14. Decision Analysis Techniques for Adult Learners: Application to Leadership

    ERIC Educational Resources Information Center

    Toosi, Farah

    2017-01-01

    Most decision analysis techniques are not taught at higher education institutions. Leaders, project managers and procurement agents in industry have strong technical knowledge, and it is crucial for them to apply this knowledge at the right time to make critical decisions. There are uncertainties, problems, and risks involved in business…

  15. Respondent Techniques for Reduction of Emotions Limiting School Adjustment: A Quantitative Review and Methodological Critique.

    ERIC Educational Resources Information Center

    Misra, Anjali; Schloss, Patrick J.

    1989-01-01

    The critical analysis of 23 studies using respondent techniques for the reduction of excessive emotional reactions in school children focuses on research design, dependent variables, independent variables, component analysis, and demonstrations of generalization and maintenance. Results indicate widespread methodological flaws that limit the…

  16. Using Combined SFTA and SFMECA Techniques for Space Critical Software

    NASA Astrophysics Data System (ADS)

    Nicodemos, F. G.; Lahoz, C. H. N.; Abdala, M. A. D.; Saotome, O.

    2012-01-01

    This work addresses the combined Software Fault Tree Analysis (SFTA) and Software Failure Modes, Effects and Criticality Analysis (SFMECA) techniques applied to space critical software of satellite launch vehicles. The combined approach is under research as part of the Verification and Validation (V&V) efforts to increase software dependability, and for future application in other projects under development at Instituto de Aeronáutica e Espaço (IAE). The applicability of such an approach was evaluated on a system software specification and applied to a case study based on the Brazilian Satellite Launcher (VLS). The main goal is to identify possible failure causes and obtain compensating provisions that lead to the inclusion of new functional and non-functional system software requirements.

  17. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool makes it possible to find the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events from residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, this technique is challenging to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the method procedural steps with the highest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the best alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.

  18. Multidimensional chromatography in food analysis.

    PubMed

    Herrero, Miguel; Ibáñez, Elena; Cifuentes, Alejandro; Bernal, Jose

    2009-10-23

    In this work, the main developments and applications of multidimensional chromatographic techniques in food analysis are reviewed. Different aspects related to the existing couplings involving chromatographic techniques are examined. These couplings include multidimensional GC, multidimensional LC, multidimensional SFC as well as all their possible combinations. Main advantages and drawbacks of each coupling are critically discussed and their key applications in food analysis described.

  19. An analysis of pilot error-related aircraft accidents

    NASA Technical Reports Server (NTRS)

    Kowalsky, N. B.; Masters, R. L.; Stone, R. B.; Babcock, G. L.; Rypka, E. W.

    1974-01-01

    A multidisciplinary team approach to pilot error-related U.S. air carrier jet aircraft accident investigation records successfully reclaimed hidden human error information not shown in statistical studies. New analytic techniques were developed and applied to the data to discover and identify multiple elements of commonality and shared characteristics within this group of accidents. Three techniques of analysis were used: critical element analysis, which demonstrated the importance of a subjective, qualitative approach to raw accident data and surfaced information heretofore unavailable; cluster analysis, an exploratory research tool that will lead to increased understanding, improved organization of facts, the discovery of new meaning in large data sets, and the generation of explanatory hypotheses; and pattern recognition, by which accidents can be categorized by pattern conformity after critical element identification by cluster analysis.
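
    A minimal sketch of the cluster-analysis step described above, on entirely hypothetical data: each accident is coded as a binary vector of critical elements, and k-means groups accidents that share characteristics. This illustrates the general technique only, not the study's actual algorithm or data.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Hypothetical accident-by-element matrix: 1 = critical element present.
elements = ["fatigue", "comm_breakdown", "weather", "checklist_skip", "training_gap"]
accidents = rng.integers(0, 2, size=(30, len(elements)))

# Group accidents into clusters of shared characteristics.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(accidents)

# Inspect which elements dominate each cluster centroid.
for k, centroid in enumerate(km.cluster_centers_):
    top = [elements[i] for i in np.argsort(centroid)[::-1][:2]]
    print(f"cluster {k}: n={np.sum(km.labels_ == k)}, dominant elements: {top}")
```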

  20. A PERT/CPM of the Computer Assisted Completion of The Ministry September Report. Research Report.

    ERIC Educational Resources Information Center

    Feeney, J. D.

    Using two statistical analysis techniques (the Program Evaluation and Review Technique and the Critical Path Method), this study analyzed procedures for compiling the required yearly report of the Metropolitan Separate School Board (Catholic) of Toronto, Canada. The computer-assisted analysis organized the process of completing the report more…
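
    The Critical Path Method mentioned above reduces to a forward and backward pass over the task network. Below is a minimal sketch on a hypothetical task set (not the Board's actual report workflow), computing earliest and latest times and the zero-slack critical path.

```python
# Minimal CPM sketch: earliest/latest times and the critical (zero-slack) path.
# Tasks: name -> (duration, list of predecessors). Hypothetical example data.
tasks = {
    "collect_enrolments": (5, []),
    "verify_staffing":    (3, []),
    "compile_tables":     (4, ["collect_enrolments", "verify_staffing"]),
    "review_draft":       (2, ["compile_tables"]),
    "submit_report":      (1, ["review_draft"]),
}

# Forward pass: earliest finish of each task (assumes each task is listed
# after its predecessors; otherwise sort topologically first).
ef = {}
for name, (dur, preds) in tasks.items():
    ef[name] = max((ef[p] for p in preds), default=0) + dur

project_end = max(ef.values())

# Backward pass: latest finish, then slack = LF - EF.
lf = {name: project_end for name in tasks}
for name in reversed(list(tasks)):
    for p in tasks[name][1]:
        lf[p] = min(lf[p], lf[name] - tasks[name][0])

critical = [n for n in tasks if lf[n] - ef[n] == 0]
print("project duration:", project_end)
print("critical path tasks:", critical)
```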

  1. CPM and PERT in Library Management.

    ERIC Educational Resources Information Center

    Main, Linda

    1989-01-01

    Discusses two techniques of systems analysis--Critical Path Method (CPM) and Program Evaluation Review Techniques (PERT)--and their place in library management. An overview of CPM and PERT charting procedures is provided. (11 references) (Author/MES)

  2. Analysis of critical thinking ability of VII grade students based on the mathematical anxiety level through learning cycle 7E model

    NASA Astrophysics Data System (ADS)

    Widyaningsih, E.; Waluya, S. B.; Kurniasih, A. W.

    2018-03-01

    This study aims to assess students' mastery learning of critical thinking ability under the learning cycle 7E model, to determine whether their critical thinking ability is better than that of students taught with an expository model, and to describe students' critical thinking phases by mathematical anxiety level. The method is a mixed method with a concurrent embedded design. The population is VII grade students of SMP Negeri 3 Kebumen in the 2016/2017 academic year. Subjects were determined by purposive sampling, with two students selected from each level of mathematical anxiety. Data collection techniques included tests, questionnaires, interviews, and documentation. Quantitative data analysis techniques included a mean test, a proportion test, a difference test of two means, and a difference test of two proportions; for qualitative data, the Miles and Huberman model was used. The results show that: (1) students' critical thinking ability with the learning cycle 7E model achieves mastery learning; (2) students' critical thinking ability with the learning cycle 7E model is better than that of students taught with the expository model; and (3) the lower the students' mathematical anxiety level, the more fully they fulfil the indicators of the clarification, assessment, inference, and strategies phases of critical thinking.

  3. Technologies for Clinical Diagnosis Using Expired Human Breath Analysis

    PubMed Central

    Mathew, Thalakkotur Lazar; Pownraj, Prabhahari; Abdulla, Sukhananazerin; Pullithadathil, Biji

    2015-01-01

    This review elucidates the technologies in the field of exhaled breath analysis. Exhaled breath gas analysis offers an inexpensive, noninvasive and rapid method for detecting a large number of compounds under various conditions for health and disease states. There are various techniques to analyze some exhaled breath gases, including spectrometry, gas chromatography and spectroscopy. This review places emphasis on some of the critical biomarkers present in exhaled human breath, and their related effects. Additionally, various medical monitoring techniques used for breath analysis have been discussed. It also includes the current scenario of breath analysis with nanotechnology-oriented techniques. PMID:26854142

  4. Approaches to answering critical CER questions.

    PubMed

    Kinnier, Christine V; Chung, Jeanette W; Bilimoria, Karl Y

    2015-01-01

    While randomized controlled trials (RCTs) are the gold standard for research, many research questions cannot be ethically and practically answered using an RCT. Comparative effectiveness research (CER) techniques are often better suited than RCTs to address the effects of an intervention under routine care conditions, an outcome otherwise known as effectiveness. CER research techniques covered in this section include: effectiveness-oriented experimental studies such as pragmatic trials and cluster randomized trials, treatment response heterogeneity, observational and database studies including adjustment techniques such as sensitivity analysis and propensity score analysis, systematic reviews and meta-analysis, decision analysis, and cost effectiveness analysis. Each section describes the technique and covers the strengths and weaknesses of the approach.

  5. Improving skill development: an exploratory study comparing a philosophical and an applied ethical analysis technique

    NASA Astrophysics Data System (ADS)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-09-01

    This exploratory study compares and contrasts two types of critical thinking techniques: one philosophical, the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT, raised in a recent media article, to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. In particular, the skills focused on include: recognising ethical challenges and formulating coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating ethical decisions, to name a few.

  6. Developing Flanagan's critical incident technique to elicit indicators of high and low quality nursing care from patients and their nurses.

    PubMed

    Norman, I J; Redfern, S J; Tomalin, D A; Oliver, S

    1992-05-01

    This paper discusses a development of Flanagan's critical incident technique (CIT) to elicit indicators of high and low quality nursing from patients and their nurses on medical, surgical and elderly care wards. Stages in undertaking the CIT are identified, along with the presuppositions held by most researchers about the nature of the technique. The paper describes how the authors moved to a different set of presuppositions during the course of the study. Preliminary analysis of interview transcripts revealed that critical incidents need not always be demarcated scenes with a clear beginning and end, but may arise from respondents summarizing their overall experience within their description of one incident. Characteristically, respondents were unable to give a detailed account of such incidents, but validity may be established by the fact that respondents appear to recount what actually happened as they saw it, and what they said was clearly important to them. The researchers found that the most appropriate basic unit of analysis was not the incident itself but 'happenings' revealed by incidents that are 'critical' by virtue of being important to respondents with respect to the quality of nursing care. The importance of CIT researchers achieving an understanding of the 'meaning' of critical happenings to respondents is emphasized. Analysis of the interview transcripts is facilitated by the use of INGRES, a relational database computer program, which should enable a 'personal theory' of quality nursing for each respondent, both patients and nurses, to be described. The study suggests that the CIT is a flexible technique which may be adapted to meet the demands of nursing research. If carefully applied, the CIT seems capable of capitalizing on respondents' own stories and avoids the loss of information which occurs when complex narratives are reduced to simple descriptive categories. Patients and nurses have unique perspectives on nursing and their views are of primary importance in understanding what quality means with respect to the interpersonal processes that are integral to nursing care. This paper discusses the identification of indicators of quality nursing from interviews with patients and nurses using the authors' development of Flanagan's critical incident technique.

  7. Failure Analysis by Statistical Techniques (FAST). Volume 1. User’s Manual

    DTIC Science & Technology

    1974-10-31

    FAILURE ANALYSIS BY STATISTICAL TECHNIQUES (FAST), Volume I, User's Manual (report number DNA 3336F-1). …SS2), and a facility (SS7). The other three diagrams break down the three critical subsystems. The median probability of survival of the

  8. Geometric parameter analysis to predetermine optimal radiosurgery technique for the treatment of arteriovenous malformation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mestrovic, Ante; Clark, Brenda G.; Department of Medical Physics, British Columbia Cancer Agency, Vancouver, British Columbia

    2005-11-01

    Purpose: To develop a method of predicting the values of dose distribution parameters of different radiosurgery techniques for treatment of arteriovenous malformation (AVM) based on internal geometric parameters. Methods and Materials: For each of 18 previously treated AVM patients, four treatment plans were created: circular collimator arcs, dynamic conformal arcs, fixed conformal fields, and intensity-modulated radiosurgery. An algorithm was developed to characterize the target and critical structure shape complexity and the position of the critical structures with respect to the target. Multiple regression was employed to establish the correlation between the internal geometric parameters and the dose distribution for different treatment techniques. The results from the model were applied to predict the dosimetric outcomes of different radiosurgery techniques and select the optimal radiosurgery technique for a number of AVM patients. Results: Several internal geometric parameters showing statistically significant correlation (p < 0.05) with the treatment planning results for each technique were identified. The target volume and the average minimum distance between the target and the critical structures were the most effective predictors for normal tissue dose distribution. The structure overlap volume with the target and the mean distance between the target and the critical structure were the most effective predictors for critical structure dose distribution. The predicted values of dose distribution parameters of different radiosurgery techniques were in close agreement with the original data. Conclusions: A statistical model has been described that successfully predicts the values of dose distribution parameters of different radiosurgery techniques and may be used to predetermine the optimal technique on a patient-to-patient basis.
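
    A minimal sketch of the multiple-regression idea in this abstract, on entirely synthetic data: geometric predictors (target volume, mean target-to-structure distance, overlap volume) are regressed against a dose-distribution outcome. The variable names and coefficients below are hypothetical illustrations, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 18  # the study used 18 AVM patients; the numbers below are synthetic

# Synthetic geometric predictors.
volume   = rng.uniform(1, 20, n)    # target volume (cc)
distance = rng.uniform(2, 30, n)    # mean target-to-structure distance (mm)
overlap  = rng.uniform(0, 2, n)     # structure overlap volume (cc)

# Synthetic outcome: a normal-tissue dose metric driven by geometry plus noise.
dose = 3.0 * volume - 1.5 * distance + 4.0 * overlap + rng.normal(0, 2, n)

# Multiple regression via least squares: dose ~ 1 + volume + distance + overlap.
X = np.column_stack([np.ones(n), volume, distance, overlap])
coef, *_ = np.linalg.lstsq(X, dose, rcond=None)

pred = X @ coef
r2 = 1 - np.sum((dose - pred) ** 2) / np.sum((dose - dose.mean()) ** 2)
print("intercept and coefficients:", np.round(coef, 2), " R^2:", round(r2, 3))
```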

  9. Critical incident technique: a user's guide for nurse researchers.

    PubMed

    Schluter, Jessica; Seaton, Philippa; Chaboyer, Wendy

    2008-01-01

    This paper is a description of the development and processes of the critical incident technique and its applicability to nursing research, using a recently-conducted study of the Australian nursing workforce as an exemplar. Issues are raised for consideration prior to the technique being put into practice. Since 1954, the critical incident technique has been used to study people's activities in a variety of professions. This five-step technique can be modified for specific settings and research questions. The fruitfulness of a study using the technique relies on gaining three important pieces of information: first, participants' complete and rich descriptions of the situation or event to be explored; second, the specific actions of the person(s) involved in the event, to aid understanding of why certain decisions were made; and third, the outcome of the event, to ascertain the effectiveness of the behaviour. As in other qualitative methodologies, an inductive analysis process can be used with the critical incident technique. Rich contextual information can be obtained using this technique. It generates information and uncovers tacit knowledge by assisting participants to describe their thought processes and actions during the event. Use of probing questions that determine how participants take part in certain events, or act in the ways they do, greatly enhances the outcome. A full interpretation of the event can only occur when all its aspects are provided. The critical incident technique is a practical method that allows researchers to understand the complexities of the nursing role and function, and the interactions between nurses and other clinicians.

  10. Laser power conversion system analysis, volume 1

    NASA Technical Reports Server (NTRS)

    Jones, W. S.; Morgan, L. L.; Forsyth, J. B.; Skratt, J. P.

    1979-01-01

    The orbit-to-orbit laser energy conversion system analysis established a mission model of satellites with various orbital parameters and average electrical power requirements ranging from 1 to 300 kW. The system analysis evaluated various conversion techniques, power system deployment parameters, power system electrical supplies, and other critical subsystems relative to various combinations of the mission model. The analysis shows that the laser power system would not be competitive with current satellite power systems from weight, cost, and development risk standpoints.

  11. The Rational-Emotive Approach: A Critique

    ERIC Educational Resources Information Center

    Morris, G. Barry

    1976-01-01

    The critique of Rational-Emotive Therapy aims criticism at Ellis' concept of irrationality, analysis of human behavior and therapeutic techniques. Ellis suggests that his critic's claims lack the support of experimental evidence. He further suggests that an "existential" bias pervades which differs from his own brand of…

  12. Practical semen analysis: from A to Z

    PubMed Central

    Brazil, Charlene

    2010-01-01

    Accurate semen analysis is critical for decisions about patient care, as well as for studies addressing overall changes in semen quality, contraceptive efficacy and effects of toxicant exposure. The standardization of semen analysis is very difficult for many reasons, including the use of subjective techniques with no standards for comparison, poor technician training, problems with proficiency testing and a reluctance to change techniques. The World Health Organization (WHO) Semen handbook (2010) offers a vastly improved set of standardized procedures, all at a level of detail that will preclude most misinterpretations. However, there is a limit to what can be learned from words and pictures alone. A WHO-produced DVD that offers complete demonstrations of each technique along with quality assurance standards for motility, morphology and concentration assessments would enhance the effectiveness of the manual. However, neither the manual nor a DVD will help unless there is general acknowledgement of the critical need to standardize techniques and rigorously pursue quality control to ensure that laboratories actually perform techniques 'according to WHO' instead of merely reporting that they have done so. Unless improvements are made, patient results will continue to be compromised and comparison between studies and laboratories will have limited merit. PMID:20111076

  13. New Flutter Analysis Technique for Time-Domain Computational Aeroelasticity

    NASA Technical Reports Server (NTRS)

    Pak, Chan-Gi; Lung, Shun-Fat

    2017-01-01

    A new time-domain approach for computing flutter speed is presented. Based on the time-history results of aeroelastic simulation, the unknown unsteady aerodynamics model is estimated using a system identification technique. The full aeroelastic model is generated by coupling the estimated unsteady aerodynamic model with the known linear structure model. The critical dynamic pressure is computed and used in the subsequent simulation, iterating until the critical dynamic pressure converges. The proposed method is applied to a benchmark cantilevered rectangular wing.

  14. Eye Movement Desensitization and Reprocessing: A Critical Analysis.

    ERIC Educational Resources Information Center

    Erwin, Terry McVannel

    Since Shapiro's introduction of Eye Movement Desensitization and Reprocessing (EMDR) in 1989, it has been a highly controversial therapeutic technique. Critical reviews of Shapiro's initial study have highlighted many methodological shortcomings in her work. And early empirical research that followed Shapiro's original study has been criticized…

  15. Factors influencing patient compliance with therapeutic regimens in chronic heart failure: A critical incident technique analysis.

    PubMed

    Strömberg, A; Broström, A; Dahlström, U; Fridlund, B

    1999-01-01

    The aim of this study was to identify factors influencing compliance with prescribed treatment in patients with chronic heart failure. A qualitative design with a critical incident technique was used. Incidents were collected through interviews with 25 patients with heart failure strategically selected from a primary health care clinic, a medical ward, and a specialist clinic. Two hundred sixty critical incidents were identified in the interviews and 2 main areas emerged in the analysis: inward factors and outward factors. The inward factors described how compliance was influenced by the personality of the patient, the disease, and the treatment. The outward factors described how compliance was influenced by social activities, social relationships, and health care professionals. By identifying the inward and outward factors influencing patients with chronic heart failure, health care professionals can assess whether intervention is needed to increase compliance.

  16. Asian International Student Transition to High School in Canada

    ERIC Educational Resources Information Center

    Popadiuk, Natalee

    2010-01-01

    There is a paucity of studies conducted with unaccompanied adolescent international students. In this qualitative inquiry, I present a thematic analysis of the critical incidents that Chinese, Japanese, and Korean participants reported as either facilitating or hindering their transition to Canada. Using the Critical Incident Technique, I…

  17. The Construction of Pro-Science and Technology Discourse in Chinese Language Textbooks

    ERIC Educational Resources Information Center

    Liu, Yongbing

    2005-01-01

    This paper examines the pro-science and technology discourse constructed in Chinese language textbooks currently used for primary school students nationwide in China. By applying analytical techniques of critical discourse analysis (CDA), the paper critically investigates how the discourse is constructed and what ideological forces are manifested…

  18. Knowledge Management and Higher Education: A Critical Analysis

    ERIC Educational Resources Information Center

    Metcalfe, Amy

    2006-01-01

    Rather than focusing on functional issues relating to implementation of knowledge management (KM) techniques, this book addresses the social aspects of KM. Using various social science perspectives, the volume provides critical analyses of KM in higher education, with an emphasis on unintended consequences and future implications. Fifteen chapters…

  19. A Critical Appraisal of Techniques, Software Packages, and Standards for Quantitative Proteomic Analysis

    PubMed Central

    Lawless, Craig; Hubbard, Simon J.; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R.

    2012-01-01

    New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we focus on the review of some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards to support the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and a guide to the selection and use of different pipelines to perform quantitative proteomics data analysis. We have produced a web-based tool (http://www.proteosuite.org/?q=other_resources) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616

  20. A review of costing methodologies in critical care studies.

    PubMed

    Pines, Jesse M; Fager, Samuel S; Milzman, David P

    2002-09-01

    Clinical decision making in critical care has traditionally been based on clinical outcome measures such as mortality and morbidity. Over the past few decades, however, increasing competition in the health care marketplace has made it necessary to consider costs when making clinical and managerial decisions in critical care. Sophisticated costing methodologies have been developed to aid this decision-making process. We performed a narrative review of published costing studies in critical care during the past 6 years. A total of 282 articles were found, of which 68 met our search criteria. They involved a mean of 508 patients (range, 20-13,907). A total of 92.6% of the studies (63 of 68) used traditional cost analysis, whereas the remaining 7.4% (5 of 68) used cost-effectiveness analysis. None (0 of 68) used cost-benefit analysis or cost-utility analysis. A total of 36.7% (25 of 68) used hospital charges as a surrogate for actual costs. Of the 43 articles that actually counted costs, 37.2% (16 of 43) counted physician costs, 27.9% (12 of 43) counted facility costs, 34.9% (15 of 43) counted nursing costs, 9.3% (4 of 43) counted societal costs, and 90.7% (39 of 43) counted laboratory, equipment, and pharmacy costs. Our conclusion is that despite considerable progress in costing methodologies, critical care studies have not adequately implemented these techniques. Given the importance of financial implications in medicine, it would be prudent for critical care studies to use these more advanced techniques. Copyright 2002, Elsevier Science (USA). All rights reserved.

  1. 40 CFR 85.2120 - Maintenance and submittal of records.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... testing program, including all production part sampling techniques used to verify compliance of the... subsequent analyses of that data; (7) A description of all the methodology, analysis, testing and/or sampling techniques used to ascertain the emission critical parameter specifications of the original equipment part...

  2. The Impact of Cooperative Learning on Critical Thinking Test Scores of Associate's Degree Graduates in Southwest Virginia

    ERIC Educational Resources Information Center

    Hodges, James Gregory

    2013-01-01

    This study examined the impact that the teaching technique known as cooperative learning had on the changes between pre- and post-test scores on all sub-categories ("induction, deduction, analysis, evaluation, inference", and "total composite") associated with the "California Critical Thinking Skills Test" (CCTST) for…

  3. Teaching Research Methodology Using a Project-Based Three Course Sequence Critical Reflections on Practice

    ERIC Educational Resources Information Center

    Braguglia, Kay H.; Jackson, Kanata A.

    2012-01-01

    This article presents a reflective analysis of teaching research methodology through a three course sequence using a project-based approach. The authors reflect critically on their experiences in teaching research methods courses in an undergraduate business management program. The introduction of a range of specific techniques including student…

  4. Application of Person-Centered Approaches to Critical Quantitative Research: Exploring Inequities in College Financing Strategies

    ERIC Educational Resources Information Center

    Malcom-Piqueux, Lindsey

    2014-01-01

    This chapter discusses the utility of person-centered approaches to critical quantitative researchers. These techniques, which identify groups of individuals who share similar attributes, experiences, or outcomes, are contrasted with more commonly used variable-centered approaches. An illustrative example of a latent class analysis of the college…

  5. Analysis of Critical Thinking Skills on The Topic of Static Fluid

    NASA Astrophysics Data System (ADS)

    Puspita, I.; Kaniawati, I.; Suwarma, I. R.

    2017-09-01

    This study aimed to profile the critical thinking skills of senior high school students. A descriptive design was used to analyse the critical thinking skill test results of 40 grade XI students in one senior high school in Bogor District. The method is survey research, with the sample determined by a purposive sampling technique. The instrument is a critical thinking skill test covering 5 indicators on static fluid topics; it consists of 11 questions, developed by the researchers and validated by experts. The results showed that students' critical thinking skills are still low: almost every indicator reaches less than 30%, with 28% for elementary clarification, 10% for basic support for decisions, 6% for inference, 6% for advanced clarification, and 4% for strategies and tactics.

  6. Dynamics of aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Schmidt, David K.

    1991-01-01

    The focus of this research was to address the modeling, including model reduction, of flexible aerospace vehicles, with special emphasis on models used in dynamic analysis and/or guidance and control system design. In the modeling, it is critical that the key aspects of the system being modeled be captured in the model. In this work, therefore, aspects of the vehicle dynamics critical to control design were important. In this regard, fundamental contributions were made in the areas of stability robustness analysis techniques, model reduction techniques, and literal approximations for key dynamic characteristics of flexible vehicles. All these areas are related. In the development of a model, approximations are always involved, so control systems designed using these models must be robust against uncertainties in these models.

  7. Novel Hybrid Scheduling Technique for Sensor Nodes with Mixed Criticality Tasks.

    PubMed

    Micea, Mihai-Victor; Stangaciu, Cristina-Sorina; Stangaciu, Valentin; Curiac, Daniel-Ioan

    2017-06-26

    Sensor networks are becoming increasingly a key technology for complex control applications. Their potential use in safety- and time-critical domains has raised the need for task scheduling mechanisms specially adapted to sensor-node-specific requirements, often materialized as predictable, jitter-less execution of tasks characterized by different criticality levels. This paper offers an efficient scheduling solution, named Hybrid Hard Real-Time Scheduling (H²RTS), which combines a static, clock-driven method with a dynamic, event-driven scheduling technique, in order to provide high execution predictability while keeping a high node Central Processing Unit (CPU) utilization factor. From the detailed, integrated schedulability analysis of the H²RTS, a set of sufficiency tests is introduced and demonstrated based on the processor demand and linear upper bound metrics. The performance and correct behavior of the proposed hybrid scheduling technique have been extensively evaluated and validated both on a simulator and on a sensor mote equipped with an ARM7 microcontroller.
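
    The processor-demand analysis mentioned in this abstract can be illustrated with the textbook demand bound function test for EDF-style scheduling. The task set below is hypothetical, and this is the standard test rather than the paper's specific H²RTS sufficiency conditions.

```python
import math

# Hypothetical periodic task set: (C = worst-case execution, D = deadline, T = period).
tasks = [(1, 4, 4), (2, 6, 8), (3, 12, 12)]

def dbf(t):
    """Demand bound function: total execution that must complete by time t."""
    return sum(
        max(0, math.floor((t - D) / T) + 1) * C
        for C, D, T in tasks
    )

# Check dbf(t) <= t at every absolute deadline up to the hyperperiod.
hyperperiod = math.lcm(*(T for _, _, T in tasks))
deadlines = sorted({k * T + D for _, D, T in tasks for k in range(hyperperiod // T)})

schedulable = all(dbf(t) <= t for t in deadlines)
print("utilization:", sum(C / T for C, _, T in tasks))
print("EDF schedulable (processor demand test):", schedulable)
```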

  8. FEM Techniques for High Stress Detection in Accelerated Fatigue Simulation

    NASA Astrophysics Data System (ADS)

    Veltri, M.

    2016-09-01

    This work presents the theory and a numerical validation study in support of a novel method for a priori identification of fatigue critical regions, with the aim of accelerating durability design in large FEM problems. The investigation is placed in the context of modern full-body structural durability analysis, where a computationally intensive dynamic solution could be required to identify areas with potential for fatigue damage initiation. The early detection of fatigue critical areas can drive a simplification of the problem size, leading to appreciable improvement in solution time and model handling while allowing the critical areas to be processed in greater detail. The proposed technique is applied to a real-life industrial case in a comparative assessment with established practices. Synthetic damage prediction quantification and visualization techniques allow for a quick and efficient comparison between methods, outlining potential application benefits and boundaries.

  9. Analysis of Power Laws, Shape Collapses, and Neural Complexity: New Techniques and MATLAB Support via the NCC Toolbox

    PubMed Central

    Marshall, Najja; Timme, Nicholas M.; Bennett, Nicholas; Ripp, Monica; Lautzenhiser, Edward; Beggs, John M.

    2016-01-01

    Neural systems include interactions that occur across many scales. Two divergent methods for characterizing such interactions have drawn on the physical analysis of critical phenomena and the mathematical study of information. Inferring criticality in neural systems has traditionally rested on fitting power laws to the property distributions of “neural avalanches” (contiguous bursts of activity), but the fractal nature of avalanche shapes has recently emerged as another signature of criticality. On the other hand, neural complexity, an information theoretic measure, has been used to capture the interplay between the functional localization of brain regions and their integration for higher cognitive functions. Unfortunately, treatments of all three methods—power-law fitting, avalanche shape collapse, and neural complexity—have suffered from shortcomings. Empirical data often contain biases that introduce deviations from true power law in the tail and head of the distribution, but deviations in the tail have often been unconsidered; avalanche shape collapse has required manual parameter tuning; and the estimation of neural complexity has relied on small data sets or statistical assumptions for the sake of computational efficiency. In this paper we present technical advancements in the analysis of criticality and complexity in neural systems. We use maximum-likelihood estimation to automatically fit power laws with left and right cutoffs, present the first automated shape collapse algorithm, and describe new techniques to account for large numbers of neural variables and small data sets in the calculation of neural complexity. In order to facilitate future research in criticality and complexity, we have made the software utilized in this analysis freely available online in the MATLAB NCC (Neural Complexity and Criticality) Toolbox. PMID:27445842
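
    The maximum-likelihood power-law fit that the toolbox automates has a simple closed form when only a lower cutoff is used; with both left and right cutoffs, as in the NCC Toolbox, the likelihood must be maximized numerically. A minimal sketch of the lower-cutoff case on synthetic data, using the Clauset, Shalizi, and Newman (2009) estimator:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic continuous power-law sample: p(x) ~ x^(-alpha) for x >= xmin,
# drawn by inverse-CDF sampling.
alpha_true, xmin, n = 2.5, 1.0, 5000
x = xmin * (1 - rng.random(n)) ** (-1 / (alpha_true - 1))

# Closed-form MLE for the exponent given the lower cutoff xmin:
# alpha = 1 + n / sum(ln(x / xmin)).
tail = x[x >= xmin]
alpha_hat = 1 + len(tail) / np.sum(np.log(tail / xmin))
stderr = (alpha_hat - 1) / np.sqrt(len(tail))

print(f"true alpha = {alpha_true}, MLE alpha = {alpha_hat:.3f} +/- {stderr:.3f}")
```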

  10. Analysis of Power Laws, Shape Collapses, and Neural Complexity: New Techniques and MATLAB Support via the NCC Toolbox.

    PubMed

    Marshall, Najja; Timme, Nicholas M; Bennett, Nicholas; Ripp, Monica; Lautzenhiser, Edward; Beggs, John M

    2016-01-01

    Neural systems include interactions that occur across many scales. Two divergent methods for characterizing such interactions have drawn on the physical analysis of critical phenomena and the mathematical study of information. Inferring criticality in neural systems has traditionally rested on fitting power laws to the property distributions of "neural avalanches" (contiguous bursts of activity), but the fractal nature of avalanche shapes has recently emerged as another signature of criticality. On the other hand, neural complexity, an information theoretic measure, has been used to capture the interplay between the functional localization of brain regions and their integration for higher cognitive functions. Unfortunately, treatments of all three methods-power-law fitting, avalanche shape collapse, and neural complexity-have suffered from shortcomings. Empirical data often contain biases that introduce deviations from true power law in the tail and head of the distribution, but deviations in the tail have often been unconsidered; avalanche shape collapse has required manual parameter tuning; and the estimation of neural complexity has relied on small data sets or statistical assumptions for the sake of computational efficiency. In this paper we present technical advancements in the analysis of criticality and complexity in neural systems. We use maximum-likelihood estimation to automatically fit power laws with left and right cutoffs, present the first automated shape collapse algorithm, and describe new techniques to account for large numbers of neural variables and small data sets in the calculation of neural complexity. In order to facilitate future research in criticality and complexity, we have made the software utilized in this analysis freely available online in the MATLAB NCC (Neural Complexity and Criticality) Toolbox.

  11. Ionic liquids in chromatographic and electrophoretic techniques: toward additional improvements in the separation of natural compounds

    PubMed Central

    Freire, Carmen S. R.; Coutinho, João A. P.; Silvestre, Armando J. D.; Freire, Mara G.

    2016-01-01

    Due to their unique properties, in recent years, ionic liquids (ILs) have been largely investigated in the field of analytical chemistry. Particularly during the last sixteen years, they have been successfully applied in the chromatographic and electrophoretic analysis of value-added compounds extracted from biomass. Considering the growing interest in the use of ILs in this field, this critical review provides a comprehensive overview on the improvements achieved using ILs as constituents of mobile or stationary phases in analytical techniques, namely in capillary electrophoresis and its different modes, in high performance liquid chromatography, and in gas chromatography, for the separation and analysis of natural compounds. The impact of the IL chemical structure and the influence of secondary parameters, such as the IL concentration, temperature, pH, voltage and analysis time (when applied), are also critically addressed regarding the achieved separation improvements. Major conclusions on the role of ILs in the separation mechanisms and the performance of these techniques in terms of efficiency, resolution and selectivity are provided. Based on a critical analysis of all published results, some target-oriented ILs are suggested. Finally, current drawbacks and future challenges in the field are highlighted. In particular, the design and use of more benign and effective ILs as well as the development of integrated (and thus more sustainable) extraction–separation processes using IL aqueous solutions are suggested within a green chemistry perspective. PMID:27667965

  12. Longitudinal Analysis Technique to Assist School Leaders in Making Critical Curriculum and Instruction Decisions for School Improvement

    ERIC Educational Resources Information Center

    Bigham, Gary D.; Riney, Mark R.

    2017-01-01

    To meet the constantly changing needs of schools and diverse learners, educators must frequently monitor student learning, revise curricula, and improve instruction. Consequently, it is critical that careful analyses of student performance data are ongoing components of curriculum decision-making processes. The primary purpose of this study is to…

  13. Methodological Issues in the Collection, Analysis, and Reporting of Granular Data in Asian American Populations: Historical Challenges and Potential Solutions

    PubMed Central

    Islam, Nadia Shilpi; Khan, Suhaila; Kwon, Simona; Jang, Deeana; Ro, Marguerite; Trinh-Shevrin, Chau

    2011-01-01

    There are close to 15 million Asian Americans living in the United States, and they represent the fastest growing population in the country. By the year 2050, there will be an estimated 33.4 million Asian Americans living in the country. However, their health needs remain poorly understood and there is a critical lack of data disaggregated by Asian American ethnic subgroups, primary language, and geography. This paper examines methodological issues, challenges, and potential solutions related to the collection, analysis, and reporting of disaggregated (or granular) data on Asian Americans. The article explores emerging efforts to increase granular data through the use of innovative study design and analysis techniques. Concerted efforts to implement these techniques will be critical to the future development of sound research, health programs, and policy efforts targeting this and other minority populations. PMID:21099084

  14. Maintaining the Health of Software Monitors

    NASA Technical Reports Server (NTRS)

    Person, Suzette; Rungta, Neha

    2013-01-01

    Software health management (SWHM) techniques complement the rigorous verification and validation processes that are applied to safety-critical systems prior to their deployment. These techniques are used to monitor deployed software in its execution environment, serving as the last line of defense against the effects of a critical fault. SWHM monitors use information from the specification and implementation of the monitored software to detect violations, predict possible failures, and help the system recover from faults. Changes to the monitored software, such as adding new functionality or fixing defects, therefore, have the potential to impact the correctness of both the monitored software and the SWHM monitor. In this work, we describe how the results of a software change impact analysis technique, Directed Incremental Symbolic Execution (DiSE), can be applied to monitored software to identify the potential impact of the changes on the SWHM monitor software. The results of DiSE can then be used by other analysis techniques, e.g., testing, debugging, to help preserve and improve the integrity of the SWHM monitor as the monitored software evolves.

  15. Neural robust stabilization via event-triggering mechanism and adaptive learning technique.

    PubMed

    Wang, Ding; Liu, Derong

    2018-06-01

    The robust control synthesis of continuous-time nonlinear systems with an uncertain term is investigated via an event-triggering mechanism and an adaptive critic learning technique. We mainly focus on combining the event-triggering mechanism with adaptive critic designs, so as to solve the nonlinear robust control problem. This can not only make better use of computation and communication resources, but also conduct controller design from the view of intelligent optimization. Through theoretical analysis, nonlinear robust stabilization can be achieved by obtaining an event-triggered optimal control law of the nominal system with a newly defined cost function and a certain triggering condition. The adaptive critic technique is employed to facilitate the event-triggered control design, where a neural network is introduced as an approximator of the learning phase. The performance of the event-triggered robust control scheme is validated via simulation studies and comparisons. The present method extends the application domain of both event-triggered control and adaptive critic control to nonlinear systems possessing dynamical uncertainties. Copyright © 2018 Elsevier Ltd. All rights reserved.

  16. A guide to understanding meta-analysis.

    PubMed

    Israel, Heidi; Richter, Randy R

    2011-07-01

    With the focus on evidence-based practice in healthcare, a well-conducted systematic review that includes a meta-analysis where indicated represents a high level of evidence for treatment effectiveness. The purpose of this commentary is to assist clinicians in understanding meta-analysis as a statistical tool, using both published articles and explanations of components of the technique. We describe what meta-analysis is; what heterogeneity is and how it affects meta-analysis; effect size; the modeling techniques of meta-analysis; and the strengths and weaknesses of meta-analysis. Common components such as forest plot interpretation and available software are covered, along with special cases of meta-analysis (subgroup analysis, individual patient data, and meta-regression) and a discussion of criticisms.
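
    As a concrete illustration of the pooling and heterogeneity concepts this commentary covers, here is a minimal fixed-effect (inverse-variance) meta-analysis sketch on hypothetical study effect sizes, including Cochran's Q and the I² heterogeneity statistic:

```python
import numpy as np

# Hypothetical per-study effect sizes and their variances (not real data).
effects   = np.array([0.30, 0.45, 0.10, 0.52, 0.25])
variances = np.array([0.02, 0.05, 0.03, 0.04, 0.01])

# Fixed-effect model: weight each study by the inverse of its variance.
w = 1.0 / variances
pooled = np.sum(w * effects) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))

# Heterogeneity: Cochran's Q and I^2 (share of variability beyond chance).
Q = np.sum(w * (effects - pooled) ** 2)
df = len(effects) - 1
I2 = max(0.0, (Q - df) / Q) * 100

print(f"pooled effect = {pooled:.3f} (SE {pooled_se:.3f})")
print(f"Q = {Q:.2f} on {df} df, I^2 = {I2:.1f}%")
```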

  17. The Need For Dedicated Bifurcation Stents: A Critical Analysis

    PubMed Central

    Lesiak, Maciej

    2016-01-01

    There is growing evidence that optimally performed two-stent techniques may provide similar or better results compared with the simple techniques for bifurcation lesions, with an observed trend towards improvements in clinical and/or angiographic outcomes with a two-stent strategy. Yet, provisional stenting remains the treatment of choice. Here, the author discusses the evidence – and controversies – concerning when and how to use complex techniques. PMID:29588719

  18. Continuous EEG monitoring in the intensive care unit.

    PubMed

    Scheuer, Mark L

    2002-01-01

    Continuous EEG (CEEG) monitoring allows uninterrupted assessment of cerebral cortical activity with good spatial resolution and excellent temporal resolution. Thus, this procedure provides a means of constantly assessing brain function in critically ill obtunded and comatose patients. Recent advances in digital EEG acquisition, storage, quantitative analysis, and transmission have made CEEG monitoring in the intensive care unit (ICU) technically feasible and useful. This article summarizes the indications and methodology of CEEG monitoring in the ICU, and discusses the role of some quantitative EEG analysis techniques in near real-time remote observation of CEEG recordings. Clinical examples of CEEG use, including monitoring of status epilepticus, assessment of ongoing therapy for treatment of seizures in critically ill patients, and monitoring for cerebral ischemia, are presented. Areas requiring further development of CEEG monitoring techniques and indications are discussed.

  19. Objective Assessment of Patient Inhaler User Technique Using an Audio-Based Classification Approach.

    PubMed

    Taylor, Terence E; Zigel, Yaniv; Egan, Clarice; Hughes, Fintan; Costello, Richard W; Reilly, Richard B

    2018-02-01

    Many patients make critical user technique errors when using pressurised metered dose inhalers (pMDIs) which reduce the clinical efficacy of respiratory medication. Such critical errors include poor actuation coordination (poor timing of medication release during inhalation) and inhaling too fast (peak inspiratory flow rate over 90 L/min). Here, we present a novel audio-based method that objectively assesses patient pMDI user technique. The Inhaler Compliance Assessment device was employed to record inhaler audio signals from 62 respiratory patients as they used a pMDI with an In-Check Flo-Tone device attached to the inhaler mouthpiece. Using a quadratic discriminant analysis approach, the audio-based method generated a total frame-by-frame accuracy of 88.2% in classifying sound events (actuation, inhalation and exhalation). The audio-based method estimated the peak inspiratory flow rate and volume of inhalations with an accuracy of 88.2% and 83.94% respectively. It was detected that 89% of patients made at least one critical user technique error even after tuition from an expert clinical reviewer. This method provides a more clinically accurate assessment of patient inhaler user technique than standard checklist methods.
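
    The classification step itself is standard; a hedged sketch of frame-wise quadratic discriminant analysis with scikit-learn follows. The synthetic Gaussian feature vectors merely stand in for real audio-frame features, which the paper does not reduce to this simple form:

```python
# Sketch of frame-by-frame sound-event classification with QDA.
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
classes = {0: "actuation", 1: "inhalation", 2: "exhalation"}
# Three Gaussian clusters in a 4-D feature space stand in for audio frames.
X = np.vstack([rng.normal(loc=c, scale=1.0, size=(200, 4)) for c in classes])
y = np.repeat(list(classes), 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
qda = QuadraticDiscriminantAnalysis().fit(X_tr, y_tr)
print(f"frame classification accuracy: {qda.score(X_te, y_te):.3f}")
```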

  20. The use of artificial intelligence techniques to improve the multiple payload integration process

    NASA Technical Reports Server (NTRS)

    Cutts, Dannie E.; Widgren, Brian K.

    1992-01-01

    A maximum return of science and products with a minimum expenditure of time and resources is a major goal of mission payload integration. A critical component then, in successful mission payload integration is the acquisition and analysis of experiment requirements from the principal investigator and payload element developer teams. One effort to use artificial intelligence techniques to improve the acquisition and analysis of experiment requirements within the payload integration process is described.

  1. Reachability analysis of real-time systems using time Petri nets.

    PubMed

    Wang, J; Deng, Y; Xu, G

    2000-01-01

    Time Petri nets (TPNs) are a popular Petri net model for specification and verification of real-time systems. A fundamental and most widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification because one cannot derive end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using the technique. In this paper, we present a new reachability based analysis technique for TPNs for timing property analysis and verification that effectively addresses the problem. Our technique is based on a concept called clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing technique. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.
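
    The end-to-end delay bookkeeping that CS-classes enable can be illustrated with a much simpler toy: if each step of a task chain fires within a [min, max] interval, delay bounds propagate through a DAG as a shortest path of minima and a longest path of maxima. The sketch below shows only this bookkeeping idea, not the state-class construction itself (graph structure and delays invented):

```python
# Delay-interval propagation over a DAG in topological order.
edges = {                       # node -> list of (successor, (d_min, d_max))
    "start": [("a", (1, 2)), ("b", (2, 3))],
    "a": [("end", (4, 6))],
    "b": [("end", (1, 5))],
    "end": [],
}
order = ["start", "a", "b", "end"]   # topological order of the DAG

INF = float("inf")
earliest = {n: INF for n in order}; earliest["start"] = 0
latest = {n: -INF for n in order}; latest["start"] = 0
for n in order:
    for succ, (lo, hi) in edges[n]:
        earliest[succ] = min(earliest[succ], earliest[n] + lo)
        latest[succ] = max(latest[succ], latest[n] + hi)

print(f"end-to-end delay in [{earliest['end']}, {latest['end']}]")  # [3, 8]
```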

  2. Combination of ray-tracing and the method of moments for electromagnetic radiation analysis using reduced meshes

    NASA Astrophysics Data System (ADS)

    Delgado, Carlos; Cátedra, Manuel Felipe

    2018-05-01

    This work presents a technique that allows a very noticeable relaxation of the computational requirements for full-wave electromagnetic simulations based on the Method of Moments. A ray-tracing analysis of the geometry is performed in order to extract the critical points with significant contributions. These points are then used to generate a reduced mesh, considering the regions of the geometry that surround each critical point and taking into account the electrical path followed from the source. The electromagnetic analysis of the reduced mesh produces very accurate results, requiring a fraction of the resources that the conventional analysis would utilize.

  3. Multi-intelligence critical rating assessment of fusion techniques (MiCRAFT)

    NASA Astrophysics Data System (ADS)

    Blasch, Erik

    2015-06-01

    Assessment of multi-intelligence fusion techniques includes credibility of algorithm performance, quality of results against mission needs, and usability in a work-domain context. Situation awareness (SAW) brings together low-level information fusion (tracking and identification), high-level information fusion (threat and scenario-based assessment), and information fusion level 5 user refinement (physical, cognitive, and information tasks). To measure SAW, we discuss the SAGAT (Situational Awareness Global Assessment Technique) technique for a multi-intelligence fusion (MIF) system assessment that focuses on the advantages of MIF over single intelligence sources. Building on the NASA TLX (Task Load Index), SAGAT probes, SART (Situational Awareness Rating Technique) questionnaires, and CDM (Critical Decision Method) decision points, we highlight these tools for use in a Multi-Intelligence Critical Rating Assessment of Fusion Techniques (MiCRAFT). The focus is to measure user refinement of a situation over the information fusion quality of service (QoS) metrics: timeliness, accuracy, confidence, workload (cost), and attention (throughput). A key component of any user analysis includes correlation, association, and summarization of data, so we also seek measures of product quality and QuEST of information. Building a notion of product quality from multi-intelligence tools is typically subjective and needs to be aligned with objective machine metrics.

  4. Failure mode and effects analysis (FMEA) for the Space Shuttle solid rocket motor

    NASA Technical Reports Server (NTRS)

    Russell, D. L.; Blacklock, K.; Langhenry, M. T.

    1988-01-01

    The recertification of the Space Shuttle Solid Rocket Booster (SRB) and Solid Rocket Motor (SRM) has included an extensive rewriting of the Failure Mode and Effects Analysis (FMEA) and Critical Items List (CIL). The evolution of the groundrules and methodology used in the analysis is discussed and compared to standard FMEA techniques. Aspects of the FMEA/CIL that are unique to the analysis of an SRM are especially highlighted. The criticality category definitions are presented, along with the rationale for assigning criticality. The various data required by the CIL, and the contribution of these data to the retention rationale, are also presented. As an example, the FMEA and CIL for the SRM nozzle assembly is discussed in detail. This highlights some of the difficulties associated with the analysis of a system with the unique mission requirements of the Space Shuttle.

  5. Review and classification of variability analysis techniques with clinical applications.

    PubMed

    Bravi, Andrea; Longtin, André; Seely, Andrew J E

    2011-10-10

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis.

  6. Review and classification of variability analysis techniques with clinical applications

    PubMed Central

    2011-01-01

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis. PMID:21985357

  7. Critical thinking skills profile of senior high school students in Biology learning

    NASA Astrophysics Data System (ADS)

    Saputri, A. C.; Sajidan; Rinanto, Y.

    2018-04-01

    Critical thinking is an important and necessary skill for confronting the challenges of the 21st century. Critical thinking skills accommodate activities that can improve high-order thinking skills. This study aims to determine senior high school students' critical thinking skills in Biology learning. This is descriptive research using instruments developed from the core aspects of critical thinking skills according to Facione, which include interpretation, analysis, evaluation, explanation, conclusion, and self-regulation. The subjects were 297 students in grade 12 of a senior high school in Surakarta, selected through a purposive sampling technique. The results showed that the students' critical thinking skills in evaluation and self-regulation meet the good criterion (78% and 66%, respectively), while interpretation (52%), analysis (56%), conclusion (52%), and explanation (42%) indicate only sufficient criteria. The conclusion is that the students' critical thinking skills were, overall, only in the sufficient category, so ways to enhance them on several indicators are needed.

  8. Assessment of the transportation route of oversize and excessive loads in relation to the load-bearing capacity of existing bridges

    NASA Astrophysics Data System (ADS)

    Doležel, Jiří; Novák, Drahomír; Petrů, Jan

    2017-09-01

    Transportation routes of oversize and excessive loads are currently planned to ensure the transit of a vehicle through critical points on the road, such as level intersections and bridges. This article presents a comprehensive procedure to determine the reliability and load-bearing capacity level of existing bridges on highways and roads using advanced methods of reliability analysis based on Monte Carlo-type simulation techniques in combination with nonlinear finite element analysis. The safety index is considered the main criterion of the reliability level of existing structures and is described in current structural design standards, e.g., ISO and Eurocode. As an example, the load-bearing capacity of a 60-year-old single-span slab bridge made of precast prestressed concrete girders is determined for both the ultimate and serviceability limit states. The structure's design load capacity was estimated by fully probabilistic nonlinear FEM analysis using the Latin Hypercube Sampling (LHS) simulation technique. Load-bearing capacity values based on the fully probabilistic analysis are compared with the load-bearing capacity levels estimated by deterministic methods for a critical section of the most heavily loaded girders.
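
    The sampling core of such an analysis can be sketched with a two-variable limit state g = R - S and a Latin Hypercube design; the distributions and the Cornell safety index below are illustrative assumptions, not the paper's bridge model:

```python
# Sketch of a probabilistic limit-state check using Latin Hypercube Sampling.
import numpy as np
from scipy.stats import qmc, norm

n = 10_000
sampler = qmc.LatinHypercube(d=2, seed=1)
u = sampler.random(n)                           # LHS design on [0,1)^2
R = norm.ppf(u[:, 0], loc=1200.0, scale=120.0)  # resistance (kN, illustrative)
S = norm.ppf(u[:, 1], loc=800.0, scale=160.0)   # load effect (kN, illustrative)

g = R - S                           # limit state: failure when g < 0
beta = g.mean() / g.std()           # Cornell safety (reliability) index
pf = float(np.mean(g < 0))          # sampled failure probability
print(f"safety index beta = {beta:.2f}, sampled P_f = {pf:.4f}")
```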

  9. A comparative critical study between FMEA and FTA risk analysis methods

    NASA Astrophysics Data System (ADS)

    Cristea, G.; Constantinescu, DM

    2017-10-01

    An overwhelming number of different risk analysis techniques are in use today, with acronyms such as: FMEA (Failure Modes and Effects Analysis) and its extension FMECA (Failure Mode, Effects, and Criticality Analysis), DRBFM (Design Review by Failure Mode), FTA (Fault Tree Analysis) and its extension ETA (Event Tree Analysis), HAZOP (Hazard & Operability Studies), HACCP (Hazard Analysis and Critical Control Points), and What-if/Checklist. However, the most used analysis techniques in the mechanical and electrical industry are FMEA and FTA. In FMEA, which is an inductive method, information about the consequences and effects of failures is usually collected through interviews with experienced people with different knowledge, i.e., cross-functional groups. The FMEA is used to capture potential failures/risks and impacts and to prioritize them on a numeric scale called the Risk Priority Number (RPN), which ranges from 1 to 1000. FTA is a deductive method, i.e., a general system state is decomposed into chains of more basic events of components. The logical interrelationship of how such basic events depend on and affect each other is often described analytically in a reliability structure which can be visualized as a tree. Both methods are very time-consuming to apply thoroughly, and for this reason they often are not, so possible failure modes may not be identified. To address these shortcomings, a combination of FTA and FMEA is proposed.
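
    The RPN arithmetic mentioned above is simple to make concrete: severity, occurrence, and detection are each rated 1-10 and multiplied, giving the 1-1000 scale. A short sketch with invented failure modes:

```python
# Minimal FMEA Risk Priority Number scoring: RPN = severity x occurrence x
# detection, each rated 1-10 (failure modes and ratings are illustrative).
failure_modes = [
    # (name, severity, occurrence, detection)
    ("connector corrosion", 7, 4, 6),
    ("software watchdog miss", 9, 2, 8),
    ("seal fatigue", 8, 5, 3),
]

ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, s, o, d in ranked:
    print(f"{name:25s} RPN = {s * o * d}")
```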

  10. Analysis, Design and Implementation of a Proof-of-Concept Prototype to Support Large-Scale Military Experimentation

    DTIC Science & Technology

    2013-09-01

    Result Analysis In this phase, users and analysts check all the results per objective-question. Then, they consolidate all these results to form...the CRUD technique. By using both the CRUD and the user goal techniques, we identified all the use cases the iFRE system must perform. Table 3...corresponding Focus Area or Critical Operation Issue to simplify the user tasks, and exempts the user from remembering the identifying codes/numbers of

  11. Analysing uncertainties of supply and demand in the future use of hydrogen as an energy vector

    NASA Astrophysics Data System (ADS)

    Lenel, U. R.; Davies, D. G. S.; Moore, M. A.

    An analytical technique (Analysis with Uncertain Quantities), developed at Fulmer, is used to examine the sensitivity of the outcome to uncertainties in input quantities, in order to highlight which input quantities critically affect the potential role of hydrogen. The work presented here includes an outline of the model and the analysis technique, along with basic considerations of the input quantities to the model (demand, supply, and constraints). Some examples are given of probabilistic estimates of input quantities.

  12. Initial postbuckling analysis of elastoplastic thin-shell structures

    NASA Technical Reports Server (NTRS)

    Carnoy, E. G.; Panosyan, G.

    1984-01-01

    The design of thin shell structures with respect to elastoplastic buckling requires an extended analysis of the influence of initial imperfections. For conservative design, the most critical defect should be assumed with the maximum allowable magnitude. This defect is closely related to the initial postbuckling behavior. An algorithm is given for the quasi-static analysis of the postbuckling behavior of structures that exhibit multiple buckling points. The algorithm, based upon an energy criterion, allows the computation of the critical perturbation, which is then employed to define the critical defect. For computational efficiency, the algorithm uses the reduced basis technique with automatic updating of the modal basis. The method is applied to the axisymmetric buckling of cylindrical shells under axial compression, and conclusions are given for future research.

  13. PRO-Elicere: A Study for Create a New Process of Dependability Analysis of Space Computer Systems

    NASA Astrophysics Data System (ADS)

    da Silva, Glauco; Netto Lahoz, Carlos Henrique

    2013-09-01

    This paper presents a new approach to computer system dependability analysis, called PRO-ELICERE, which introduces data mining concepts and intelligent decision-support mechanisms to analyze the potential hazards and failures of a critical computer system. Some techniques and tools that support traditional dependability analysis are also presented, and the concept of knowledge discovery and intelligent databases for critical computer systems is briefly discussed. After that, the PRO-ELICERE process is introduced: an intelligent approach to automate ELICERE, a process created to extract non-functional requirements for critical computer systems. PRO-ELICERE can be used in the V&V activities of projects of the Institute of Aeronautics and Space, such as the Brazilian Satellite Launcher (VLS-1).

  14. Novel Hybrid Scheduling Technique for Sensor Nodes with Mixed Criticality Tasks

    PubMed Central

    Micea, Mihai-Victor; Stangaciu, Cristina-Sorina; Stangaciu, Valentin; Curiac, Daniel-Ioan

    2017-01-01

    Sensor networks are increasingly becoming a key technology for complex control applications. Their potential use in safety- and time-critical domains has raised the need for task scheduling mechanisms specially adapted to sensor node specific requirements, often materialized as predictable, jitter-less execution of tasks characterized by different criticality levels. This paper offers an efficient scheduling solution, named Hybrid Hard Real-Time Scheduling (H2RTS), which combines a static, clock-driven method with a dynamic, event-driven scheduling technique in order to provide high execution predictability while keeping a high node Central Processing Unit (CPU) utilization factor. From the detailed, integrated schedulability analysis of the H2RTS, a set of sufficiency tests is introduced and demonstrated based on the processor demand and linear upper bound metrics. The performance and correct behavior of the proposed hybrid scheduling technique have been extensively evaluated and validated both on a simulator and on a sensor mote equipped with an ARM7 microcontroller. PMID:28672856
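
    The processor-demand side of such a sufficiency test can be sketched with the classic demand-bound criterion for periodic tasks; this is a generic EDF-style check, not the paper's full H2RTS analysis (the task set is invented):

```python
# Processor-demand schedulability check for periodic tasks
# (C = execution time, D = relative deadline, T = period).
from math import floor, gcd
from functools import reduce

tasks = [(1, 4, 5), (2, 6, 8), (1, 10, 10)]   # (C, D, T), illustrative

def dbf(t):
    # Demand bound function: work that must complete by time t.
    return sum(max(0, floor((t - D) / T) + 1) * C for C, D, T in tasks)

hyper = reduce(lambda a, b: a * b // gcd(a, b), (T for _, _, T in tasks))
checkpoints = sorted({k * T + D for _, D, T in tasks
                      for k in range(hyper // T + 1) if k * T + D <= hyper})

ok = all(dbf(t) <= t for t in checkpoints)    # sufficiency: demand never exceeds time
print(f"hyperperiod {hyper}; processor-demand test {'passed' if ok else 'failed'}")
```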

  15. What If They Just Want To Write?

    ERIC Educational Resources Information Center

    Gilmar, Sybil

    1979-01-01

    Writing workshops are held for gifted students (7 to 15 years old) and include journalism, guidebook, and fiction work with critical analysis of each other's writing. Sample exercises and brainstorming techniques are discussed. (CL)

  16. Deep Learning Nuclei Detection in Digitized Histology Images by Superpixels.

    PubMed

    Sornapudi, Sudhir; Stanley, Ronald Joe; Stoecker, William V; Almubarak, Haidar; Long, Rodney; Antani, Sameer; Thoma, George; Zuna, Rosemary; Frazier, Shelliane R

    2018-01-01

    Advances in image analysis and computational techniques have facilitated automatic detection of critical features in histopathology images. Detection of nuclei is critical for squamous epithelium cervical intraepithelial neoplasia (CIN) classification into normal, CIN1, CIN2, and CIN3 grades. In this study, a deep learning (DL)-based nuclei segmentation approach is investigated, based on gathering localized information through the generation of superpixels using a simple linear iterative clustering algorithm and training with a convolutional neural network. The proposed approach was evaluated on a dataset of 133 digitized histology images and achieved an overall nuclei detection (object-based) accuracy of 95.97%, with demonstrated improvement over imaging-based and clustering-based benchmark techniques. The proposed DL-based nuclei segmentation method with superpixel analysis has shown improved segmentation results in comparison to state-of-the-art methods.
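
    The superpixel stage is readily reproduced with standard tools; a hedged sketch using scikit-image's SLIC implementation follows, with a sample image standing in for a digitized histology slide and simple patch extraction standing in for the CNN training input step:

```python
# SLIC superpixels plus centroid-centered patch extraction.
from skimage import data, segmentation, measure

image = data.astronaut()                       # stand-in RGB image
segments = segmentation.slic(image, n_segments=400, compactness=10,
                             start_label=1)

# One candidate training patch per superpixel, centered on its centroid.
patches = []
half = 16                                      # 32x32 patches (illustrative)
for region in measure.regionprops(segments):
    r, c = map(int, region.centroid)
    patch = image[max(0, r - half):r + half, max(0, c - half):c + half]
    if patch.shape[:2] == (2 * half, 2 * half):    # skip clipped border patches
        patches.append(patch)

print(f"{segments.max()} superpixels -> {len(patches)} CNN-ready patches")
```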

  17. Internet-Based Delphi Research: Case Based Discussion

    PubMed Central

    Donohoe, Holly M.; Stellefson, Michael L.

    2013-01-01

    The interactive capacity of the Internet offers benefits that are intimately linked with contemporary research innovation in the natural resource and environmental studies domains. However, e-research methodologies, such as the e-Delphi technique, have yet to undergo critical review. This study advances methodological discourse on the e-Delphi technique by critically assessing an e-Delphi case study. The analysis suggests that the benefits of using e-Delphi are noteworthy but the authors acknowledge that researchers are likely to face challenges that could potentially compromise research validity and reliability. To ensure that these issues are sufficiently considered when planning and designing an e-Delphi, important facets of the technique are discussed and recommendations are offered to help the environmental researcher avoid potential pitfalls associated with coordinating e-Delphi research. PMID:23288149

  18. Using Movies to Analyse Gene Circuit Dynamics in Single Cells

    PubMed Central

    Locke, James CW; Elowitz, Michael B

    2010-01-01

    Many bacterial systems rely on dynamic genetic circuits to control critical processes. A major goal of systems biology is to understand these behaviours in terms of individual genes and their interactions. However, traditional techniques based on population averages wash out critical dynamics that are either unsynchronized between cells or driven by fluctuations, or ‘noise,’ in cellular components. Recently, the combination of time-lapse microscopy, quantitative image analysis, and fluorescent protein reporters has enabled direct observation of multiple cellular components over time in individual cells. In conjunction with mathematical modelling, these techniques are now providing powerful insights into genetic circuit behaviour in diverse microbial systems. PMID:19369953

  19. Sampling methods for microbiological analysis of red meat and poultry carcasses.

    PubMed

    Capita, Rosa; Prieto, Miguel; Alonso-Calleja, Carlos

    2004-06-01

    Microbiological analysis of carcasses at slaughterhouses is required in the European Union for evaluating the hygienic performance of carcass production processes as required for effective hazard analysis critical control point implementation. The European Union microbial performance standards refer exclusively to the excision method, even though swabbing using the wet/dry technique is also permitted when correlation between both destructive and nondestructive methods can be established. For practical and economic reasons, the swab technique is the most extensively used carcass surface-sampling method. The main characteristics, advantages, and limitations of the common excision and swabbing methods are described here.

  20. State of the art in on-line techniques coupled to flow injection analysis FIA/on-line- a critical review

    PubMed Central

    Puchades, R.; Maquieira, A.; Atienza, J.; Herrero, M. A.

    1990-01-01

    Flow injection analysis (FIA) has emerged as an increasingly used laboratory tool in chemical analysis. Employment of the technique for on-line sample treatment and on-line measurement in chemical process control is a growing trend. This article reviews the recent applications of FIA. Most papers refer to on-line sample treatment. Although FIA is very well suited to continuous on-line process monitoring, few examples have been found in this area; most of them have been applied to water treatment or fermentation processes. PMID:18925271

  1. StreamCat and LakeCat: An overview of algorithms, data, and models developed at the US EPA Western Ecology Division to facilitate and advance watershed prediction in the conterminous US.

    EPA Science Inventory

    Geospatial data and techniques have long been critical to advancing the analysis and management of freshwater ecosystems. However, these data and techniques have often been limited to specific sample sites or regional analyses because of the difficulty associated with generating ...

  2. A real time study on condition monitoring of distribution transformer using thermal imager

    NASA Astrophysics Data System (ADS)

    Mariprasath, T.; Kirubakaran, V.

    2018-05-01

    The transformer is one of the most critical apparatus in the power system; even a few minutes of outage severely affects the power system. Hence, preventive maintenance techniques are essential. Continuous condition monitoring technology significantly increases the life span of the transformer and reduces maintenance costs, so monitoring the transformer's temperature is essential. In this paper, a critical review is made of various condition monitoring techniques. Furthermore, a new method, the hot spot indication technique, is discussed, and the transformer's operating condition is monitored using a thermal imager. From the thermal analysis, it is inferred that major hotspots appear at the connection lead-outs and that the bushing is the hottest spot on the transformer, so monitoring the oil level is essential. In addition, real-time power quality analysis was carried out using a power analyzer. It shows that industrial drives inject current harmonics into the distribution network, causing power quality problems on the grid. Moreover, the current harmonics exceed the IEEE standard limit. Hence, an adequate harmonic suppression technique is urgently needed.
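
    The harmonic check described above boils down to comparing harmonic content against the fundamental. A hedged sketch of a total harmonic distortion (THD) estimate from an FFT, on a synthetic 50 Hz current with drive-like 5th and 7th harmonics (all parameters invented):

```python
# Estimate THD of a current waveform from its single-sided FFT spectrum.
import numpy as np

fs, f0, T = 10_000, 50, 1.0                     # sample rate, fundamental, 1 s
t = np.arange(0, T, 1 / fs)
i = (10 * np.sin(2 * np.pi * f0 * t)
     + 1.2 * np.sin(2 * np.pi * 5 * f0 * t)     # 5th harmonic
     + 0.8 * np.sin(2 * np.pi * 7 * f0 * t))    # 7th harmonic

spec = np.abs(np.fft.rfft(i)) / len(t) * 2      # single-sided amplitudes
freqs = np.fft.rfftfreq(len(t), 1 / fs)
fund = spec[np.argmin(np.abs(freqs - f0))]
harmonics = [spec[np.argmin(np.abs(freqs - k * f0))] for k in range(2, 40)]

thd = np.sqrt(np.sum(np.square(harmonics))) / fund
print(f"THD = {100 * thd:.2f}%")                # ~ sqrt(1.2^2 + 0.8^2) / 10
```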

  3. Flight test derived heating math models for critical locations on the orbiter during reentry

    NASA Technical Reports Server (NTRS)

    Hertzler, E. K.; Phillips, P. W.

    1983-01-01

    An analysis technique was developed for expanding the aerothermodynamic envelope of the Space Shuttle without subjecting the vehicle to sustained flight at more stressing heating conditions. A transient analysis program was developed to take advantage of the transient maneuvers that were flown as part of this analysis technique. Heat rates were derived from flight test data for various locations on the orbiter. The flight derived heat rates were used to update heating models based on predicted data. Future missions were then analyzed based on these flight adjusted models. A technique for comparing flight and predicted heating rate data and the extrapolation of the data to predict the aerothermodynamic environment of future missions is presented.

  4. Studies of EGRET sources with a novel image restoration technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tajima, Hiroyasu; Cohen-Tanugi, Johann; Kamae, Tuneyoshi

    2007-07-12

    We have developed an image restoration technique based on the Richardson-Lucy algorithm optimized for GLAST-LAT image analysis. Our algorithm is original in that it utilizes the PSF (point spread function) calculated for each event. This is critical for EGRET and GLAST-LAT image analysis since the PSF depends on the energy and angle of incident gamma-rays and varies by more than one order of magnitude. EGRET and GLAST-LAT image analysis also faces Poisson noise due to low photon statistics. Our technique incorporates wavelet filtering to minimize noise effects. We present studies of EGRET sources using this novel image restoration technique for possible identification of extended gamma-ray sources.
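
    Richardson-Lucy deconvolution itself is widely available; a hedged sketch with scikit-image follows. It uses a single Gaussian PSF for the whole image, whereas the analysis described above computes a PSF per event, and the test image merely stands in for a gamma-ray counts map:

```python
# Richardson-Lucy deconvolution of a blurred, noisy test image.
import numpy as np
from scipy.signal import convolve2d
from skimage import data, restoration

image = data.camera().astype(float) / 255.0    # stand-in for a gamma-ray map

x, y = np.mgrid[-7:8, -7:8]                    # 15x15 Gaussian PSF (illustrative)
psf = np.exp(-(x**2 + y**2) / (2 * 2.0**2))
psf /= psf.sum()

blurred = convolve2d(image, psf, mode="same", boundary="symm")
rng = np.random.default_rng(0)
noisy = np.clip(blurred + rng.normal(0, 0.01, blurred.shape), 0, 1)

restored = restoration.richardson_lucy(noisy, psf, 30)  # 30 iterations
print(f"restored image range: [{restored.min():.3f}, {restored.max():.3f}]")
```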

  5. Predicting the Influence of Nano-Scale Material Structure on the In-Plane Buckling of Orthotropic Plates

    NASA Technical Reports Server (NTRS)

    Gates, Thomas S.; Odegard, Gregory M.; Nemeth, Michael P.; Frankland, Sarah-Jane V.

    2004-01-01

    A multi-scale analysis of the structural stability of a carbon nanotube-polymer composite material is developed. The influence of intrinsic molecular structure, such as nanotube length, volume fraction, orientation and chemical functionalization, is investigated by assessing the relative change in critical, in-plane buckling loads. The analysis method relies on elastic properties predicted using the hierarchical, constitutive equations developed from the equivalent-continuum modeling technique applied to the buckling analysis of an orthotropic plate. The results indicate that for the specific composite materials considered in this study, a composite with randomly orientated carbon nanotubes consistently provides the highest values of critical buckling load and that for low volume fraction composites, the non-functionalized nanotube material provides an increase in critical buckling stability with respect to the functionalized system.
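
    For orientation, the critical loads being compared ultimately derive from classical laminated-plate theory; for a simply supported specially orthotropic plate of planform a x b under uniaxial compression, the standard textbook result (not the paper's multi-scale derivation) for m and n buckle half-waves is

```latex
N_{x,\mathrm{cr}} = \pi^2 \,
\frac{D_{11}\left(\tfrac{m}{a}\right)^4
    + 2\left(D_{12} + 2 D_{66}\right)\left(\tfrac{m}{a}\right)^2\left(\tfrac{n}{b}\right)^2
    + D_{22}\left(\tfrac{n}{b}\right)^4}
  {\left(\tfrac{m}{a}\right)^2}
```

    with the critical load taken as the minimum over m at n = 1; the nano-scale structure enters through its effect on the bending stiffnesses D_ij.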

  6. Application of Critical Classroom Discourse Analysis (CCDA) in Analyzing Classroom Interaction

    ERIC Educational Resources Information Center

    Sadeghi, Sima; Ketabi, Saeed; Tavakoli, Mansoor; Sadeghi, Moslem

    2012-01-01

    As an area of classroom research, Interaction Analysis developed from the need and desire to investigate the process of classroom teaching and learning in terms of action-reaction between individuals and their socio-cultural context (Biddle, 1967). However, sole reliance on quantitative techniques could be problematic, since they conceal more than…

  7. Cross-cultural perspectives on critical thinking.

    PubMed

    Jenkins, Sheryl Daun

    2011-05-01

    The purpose of this cross-cultural study was to explore critical thinking among nurse scholars in Thailand and the United States. The study used qualitative methodology to examine how nurse scholars describe critical thinking in nursing. Nurse educators in Thailand and the United States were questioned concerning the following aspects of critical thinking: essential components; teaching and evaluation techniques; characteristics of critical thinkers; and the importance of a consensus definition for critical thinking in nursing. Their statements, which revealed both common and specific cultural aspects of critical thinking, were subjected to content analysis. Certain themes emerged that have not been widely discussed in the literature, including the link between staying calm and thinking critically, the assertion that happiness is an essential component of critical thinking, and the participants' nearly unanimous support for coming to a consensus definition of critical thinking for nursing. Copyright 2011, SLACK Incorporated.

  8. Evidence of Magnetic Breakdown on the Defects With Thermally Suppressed Critical Field in High Gradient SRF Cavities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eremeev, Grigory; Palczewski, Ari

    2013-09-01

    At SRF 2011 we presented a study of quenches in high gradient SRF cavities using a dual mode excitation technique. The data differed from measurements done in the 1980s, which indicated a thermal breakdown nature of quenches in SRF cavities. In this contribution we present an analysis indicating that our recent data for high gradient quenches are consistent with magnetic breakdown on defects with a thermally suppressed critical field. From the parametric fits derived within the model we estimate the critical breakdown fields.

  9. Response Time Analysis and Test of Protection System Instrument Channels for APR1400 and OPR1000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Chang Jae; Han, Seung; Yun, Jae Hee

    2015-07-01

    Safety limits are required to maintain the integrity of physical barriers designed to prevent the uncontrolled release of radioactive materials in nuclear power plants. The safety analysis establishes two critical constraints: an analytical limit in terms of a measured or calculated variable, and a specific time after the analytical limit is reached to begin protective action. In keeping with the nuclear regulations and industry standards, satisfying these two requirements will ensure that the safety limit is not exceeded during a design basis event, either an anticipated operational occurrence or a postulated accident. Various studies on the setpoint determination methodology for safety-related instrumentation have been actively performed to ensure that the requirement of the analytical limit is satisfied. In particular, the protection setpoint methodology for the advanced power reactor 1400 (APR1400) and the optimized power reactor 1000 (OPR1000) has recently been developed to cover both the design basis event and the beyond design basis event. The developed setpoint methodology has also been quantitatively validated using specific computer programs and setpoint calculations. However, the safety of nuclear power plants cannot be fully guaranteed by satisfying the requirement of the analytical limit alone. In spite of the response time verification requirements of nuclear regulations and industry standards, it is hard to find studies on a systematically integrated methodology for response time evaluation. In the cases of APR1400 and OPR1000, the response time analysis for the plant protection system is partially included in the setpoint calculation, and the response time test is performed separately via a specific plant procedure. The test technique has a drawback: it is difficult to demonstrate the completeness of the timing test. The analysis technique has the demerit of yielding extreme times that are not actually possible. Thus, a systematic response time evaluation methodology is needed to justify conformance to the response time requirement used in the safety analysis. This paper proposes a response time evaluation methodology for APR1400 and OPR1000 using a combined analysis and test technique to confirm that the plant protection system can meet the analytical response time assumed in the safety analysis. In addition, the results of the quantitative evaluation performed for APR1400 and OPR1000 are presented. The proposed response time analysis technique consists of defining the response time requirement, determining the critical signal path for the trip parameter, allocating an individual response time to each component on the signal path, and analyzing the total response time for the trip parameter, demonstrating that the total analyzed response time does not exceed the requirement. The proposed response time test technique is composed of defining the response time requirement, determining the critical signal path for the trip parameter, determining the test method for each component on the signal path, and performing the response time test, demonstrating that the total test result does not exceed the requirement. The total response time should be tested in a single test that covers from the sensor to the final actuation device on the instrument channel.
When the total channel is not tested in a single test, separate tests on groups of components or single components covering the total instrument channel shall be combined to verify the total channel response. For APR1400 and OPR1000, the ramp test technique is used for the pressure and differential pressure transmitters, and the step function test technique is applied to the signal processing equipment and final actuation device. As a result, it can be demonstrated that the response time requirement is satisfied by the combined analysis and test technique. Therefore, the proposed methodology plays a crucial role in guaranteeing the safety of nuclear power plants by systematically satisfying one of the two critical requirements from the safety analysis. (authors)
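
    The analysis half of the combined technique reduces to budget arithmetic along the critical signal path; a minimal sketch (component names and allocations are illustrative, not APR1400/OPR1000 design values):

```python
# Compare summed component response-time allocations with the requirement.
path = [                      # component, allocated response time (ms)
    ("pressure transmitter", 300),
    ("signal conditioning", 50),
    ("bistable/trip logic", 100),
    ("actuation relay", 150),
]
requirement_ms = 900          # analytical response time from the safety analysis

total = sum(t for _, t in path)
margin = requirement_ms - total
print(f"analyzed total {total} ms vs requirement {requirement_ms} ms "
      f"({'OK' if margin >= 0 else 'VIOLATED'}, margin {margin} ms)")
```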

  10. Sensitivity-Uncertainty Techniques for Nuclear Criticality Safety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise

    2017-08-07

    The sensitivity and uncertainty analysis course will introduce students to k-eff sensitivity data, cross-section uncertainty data, how k-eff sensitivity data and k-eff uncertainty data are generated, and how they can be used. Discussion will include how sensitivity/uncertainty data can be used to select applicable critical experiments, to quantify a defensible margin to cover validation gaps and weaknesses, and in the development of upper subcritical limits.
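
    The link between sensitivity and uncertainty data is the first-order "sandwich rule", var(k) = S^T C S, which a short sketch makes concrete (the sensitivity vector and covariance matrix are illustrative stand-ins for real sensitivity profiles and covariance libraries):

```python
# First-order propagation of cross-section uncertainty to k-eff.
import numpy as np

S = np.array([0.35, -0.12, 0.08])      # dk/k per dsigma/sigma (illustrative)
C = np.array([[4.0e-4, 1.0e-4, 0.0],   # relative covariance matrix (illustrative)
              [1.0e-4, 9.0e-4, 0.0],
              [0.0,    0.0,    2.5e-4]])

var_k = S @ C @ S                      # sandwich rule: var(k) = S^T C S
print(f"uncertainty in k-eff from nuclear data: {np.sqrt(var_k) * 1e5:.0f} pcm")
```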

  11. State-of-the-Art Methods for Skeletal Muscle Glycogen Analysis in Athletes-The Need for Novel Non-Invasive Techniques.

    PubMed

    Greene, Jacob; Louis, Julien; Korostynska, Olga; Mason, Alex

    2017-02-23

    Muscle glycogen levels have a profound impact on an athlete's sporting performance, thus measurement is vital. Carbohydrate manipulation is a fundamental component in an athlete's lifestyle and is a critical part of elite performance, since it can provide necessary training adaptations. This paper provides a critical review of the current invasive and non-invasive methods for measuring skeletal muscle glycogen levels. These include the gold standard muscle biopsy, histochemical analysis, magnetic resonance spectroscopy, and musculoskeletal high frequency ultrasound, as well as pursuing future application of electromagnetic sensors in the pursuit of portable non-invasive quantification of muscle glycogen. This paper will be of interest to researchers who wish to understand the current and most appropriate techniques in measuring skeletal muscle glycogen. This will have applications both in the lab and in the field by improving the accuracy of research protocols and following the physiological adaptations to exercise.

  12. Learning from an Unsuccessful Study Idea: Reflection and Application of Innovative Techniques to Prevent Future Failures.

    PubMed

    Fujihara, Yuki; Saito, Taichi; Huetteman, Helen E; Sterbenz, Jennifer M; Chung, Kevin C

    2018-04-01

    A well-organized, thoughtful study design is essential for creating an impactful study. However, pressures promoting high output from researchers can lead to rushed study proposals that overlook critical weaknesses in the study design that can affect the validity of the conclusions. Researchers can benefit from thorough review of past failed proposals when crafting new research ideas. Conceptual frameworks and root cause analysis are two innovative techniques that can be used during study development to identify flaws and prevent study failures. In addition, conceptual frameworks and root cause analysis can be combined to complement each other to provide both a big picture and detailed view of a study proposal. This article describes these two common analytical methods and provides an example of how they can be used to evaluate and improve a study design by critically examining a previous failed research idea.

  13. Deep Learning Nuclei Detection in Digitized Histology Images by Superpixels

    PubMed Central

    Sornapudi, Sudhir; Stanley, Ronald Joe; Stoecker, William V.; Almubarak, Haidar; Long, Rodney; Antani, Sameer; Thoma, George; Zuna, Rosemary; Frazier, Shelliane R.

    2018-01-01

    Background: Advances in image analysis and computational techniques have facilitated automatic detection of critical features in histopathology images. Detection of nuclei is critical for squamous epithelium cervical intraepithelial neoplasia (CIN) classification into normal, CIN1, CIN2, and CIN3 grades. Methods: In this study, a deep learning (DL)-based nuclei segmentation approach is investigated, based on gathering localized information through the generation of superpixels using a simple linear iterative clustering algorithm and training with a convolutional neural network. Results: The proposed approach was evaluated on a dataset of 133 digitized histology images and achieved an overall nuclei detection (object-based) accuracy of 95.97%, with demonstrated improvement over imaging-based and clustering-based benchmark techniques. Conclusions: The proposed DL-based nuclei segmentation method with superpixel analysis has shown improved segmentation results in comparison to state-of-the-art methods. PMID:29619277

  14. Evaluation of undergraduate clinical learning experiences in the subject of pediatric dentistry using critical incident technique.

    PubMed

    Vyawahare, S; Banda, N R; Choubey, S; Parvekar, P; Barodiya, A; Dutta, S

    2013-01-01

    In pediatric dentistry, the experiences of dental students may help dental educators better prepare graduates to treat children. Research suggests that students' perceptions should be considered in any discussion of their education, but there has been no systematic examination of India's undergraduate dental students' learning experiences. This qualitative investigation aimed to gather and analyze information about experiences in pediatric dentistry from the students' viewpoint using the critical incident technique (CIT). The sample group comprised all 240 third- and fourth-year dental students from the four dental colleges in Indore. Using CIT, participants were asked to describe at least one positive and one negative experience in detail. They described 308 positive and 359 negative experiences related to the pediatric dentistry clinic. Analysis of the data resulted in the identification of four key factors related to their experiences: 1) the instructor; 2) the patient; 3) the learning process; and 4) the learning environment. The CIT is a useful data collection and analysis technique that provides rich, useful data and has many potential uses in dental education.

  15. Correlative visualization techniques for multidimensional data

    NASA Technical Reports Server (NTRS)

    Treinish, Lloyd A.; Goettsche, Craig

    1989-01-01

    Critical to the understanding of data is the ability to provide pictorial or visual representations of those data, particularly in support of correlative data analysis. Despite the advancement of visualization techniques for scientific data over the last several years, there are still significant problems in bringing today's hardware and software technology into the hands of the typical scientist. For example, computer science domains outside of computer graphics, such as data management, are required to make visualization effective. Well-defined, flexible mechanisms for data access and management must be combined with rendering algorithms, data transformation, etc., to form a generic visualization pipeline. A generalized approach to data visualization is critical for the correlative analysis of distinct, complex, multidimensional data sets in the space and Earth sciences. Different classes of data representation techniques must be used within such a framework, ranging from simple, static two- and three-dimensional line plots to animation, surface rendering, and volumetric imaging. Static examples of actual data analyses illustrate the importance of an effective pipeline in a data visualization system.

  16. Splash evaluation of SRB designs

    NASA Technical Reports Server (NTRS)

    Counter, D. N.

    1974-01-01

    A technique is developed to optimize the shuttle solid rocket booster (SRB) design for water impact loads. The SRB is dropped by parachute and recovered at sea for reuse. Loads experienced at water impact are design critical. The probability of each water impact load is determined using a Monte Carlo technique and an aerodynamic analysis of the SRB parachute system. Meteorological effects are included and four configurations are evaluated.

  17. Scheduling: A guide for program managers

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The following topics are discussed concerning scheduling: (1) milestone scheduling; (2) network scheduling; (3) program evaluation and review technique; (4) critical path method; (5) developing a network; (6) converting an ugly duckling to a swan; (7) network scheduling problem; (9) network scheduling when resources are limited; (10) multi-program considerations; (11) influence on program performance; (12) line-of-balance technique; (13) time management; (14) recapitulation; and (15) analysis. A sketch of the critical path method in item (4) follows.
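
    The critical path method is short enough to sketch directly: a forward pass yields earliest start/finish times, a backward pass yields latest ones, and zero-slack activities form the critical path (the activity network below is invented for illustration):

```python
# Critical path method: forward and backward passes over an activity network.
activities = {          # name: (duration, predecessors)
    "A": (3, []), "B": (5, ["A"]), "C": (2, ["A"]),
    "D": (4, ["B", "C"]), "E": (1, ["D"]),
}

order = ["A", "B", "C", "D", "E"]        # topological order
es, ef = {}, {}
for a in order:                          # forward pass: earliest start/finish
    dur, preds = activities[a]
    es[a] = max((ef[p] for p in preds), default=0)
    ef[a] = es[a] + dur

project_end = max(ef.values())
lf, ls = {}, {}
for a in reversed(order):                # backward pass: latest start/finish
    dur, _ = activities[a]
    succs = [s for s in order if a in activities[s][1]]
    lf[a] = min((ls[s] for s in succs), default=project_end)
    ls[a] = lf[a] - dur

critical = [a for a in order if es[a] == ls[a]]   # zero-slack activities
print(f"duration {project_end}, critical path {' -> '.join(critical)}")
```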

  18. Reliability/safety analysis of a fly-by-wire system

    NASA Technical Reports Server (NTRS)

    Brock, L. D.; Goodman, H. A.

    1980-01-01

    An analysis technique has been developed to estimate the reliability of a very complex, safety-critical system by constructing a diagram of the reliability equations for the total system. This diagram has many of the characteristics of a fault-tree or success-path diagram, but is much easier to construct for complex redundant systems. The diagram provides insight into system failure characteristics and identifies the most likely failure modes. A computer program aids in the construction of the diagram and the computation of reliability. Analysis of the NASA F-8 Digital Fly-by-Wire Flight Control System is used to illustrate the technique.
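
    The kind of arithmetic such a reliability diagram encodes can be illustrated with the simplest redundant arrangement, a 2-out-of-3 voting channel (a generic textbook case; the F-8 system itself is far more elaborate):

```python
# Success probability of a 2-out-of-3 voting arrangement from a
# single-channel reliability R (values illustrative).
def tmr_reliability(r: float) -> float:
    # System works if all three channels work, or exactly two of three do.
    return r**3 + 3 * r**2 * (1 - r)

for r in (0.90, 0.99, 0.999):
    print(f"single channel {r} -> 2-of-3 system {tmr_reliability(r):.6f}")
```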

  19. Mixed-venous oxygen tension by nitrogen rebreathing - A critical, theoretical analysis.

    NASA Technical Reports Server (NTRS)

    Kelman, G. R.

    1972-01-01

    There is dispute about the validity of the nitrogen rebreathing technique for the determination of mixed-venous oxygen tension. This theoretical analysis examines the circumstances under which the technique is likely to be applicable. When the plateau method is used, the probable error in mixed-venous oxygen tension is plus or minus 2.5 mm Hg at rest, and of the order of plus or minus 1 mm Hg during exercise. Provided that the rebreathing bag size is reasonably chosen, Denison's (1967) extrapolation technique gives results at least as accurate as those obtained by the plateau method. At rest, however, extrapolation should be to 30 rather than to 20 sec.

  20. Early Oscillation Detection for Hybrid DC/DC Converter Fault Diagnosis

    NASA Technical Reports Server (NTRS)

    Wang, Bright L.

    2011-01-01

    This paper describes a novel fault detection technique for hybrid DC/DC converter oscillation diagnosis. The technique is based on principles of feedback control loop oscillation and RF signal modulation, and is realized by using signal spectral analysis. Real-circuit simulation and analytical study reveal critical factors of the oscillation and indicate significant correlations between the spectral analysis method and the gain/phase margin method. A stability diagnosis index (SDI) is developed as a quantitative measure to accurately assign a degree of stability to the DC/DC converter. This technique is capable of detecting oscillation at an early stage without interfering with the DC/DC converter's normal operation and without the limitations of probing the converter.
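
    The spectral signature being detected is essentially amplitude modulation: a loop oscillation modulates the switching ripple and produces sidebands around the switching frequency. A hedged sketch of sideband detection on a synthetic waveform (frequencies and modulation depth invented; this is not the paper's SDI definition):

```python
# Detect oscillation sidebands around the switching frequency via FFT.
import numpy as np

fs, f_sw, f_osc = 1_000_000, 50_000, 3_000     # sample rate, switching, oscillation
t = np.arange(0, 0.02, 1 / fs)
ripple = (1 + 0.2 * np.sin(2 * np.pi * f_osc * t)) * np.sin(2 * np.pi * f_sw * t)

spec = np.abs(np.fft.rfft(ripple))
freqs = np.fft.rfftfreq(len(t), 1 / fs)

def level(f):
    # Spectral magnitude at the bin nearest frequency f.
    return spec[np.argmin(np.abs(freqs - f))]

carrier = level(f_sw)
sidebands = level(f_sw - f_osc) + level(f_sw + f_osc)
print(f"sideband-to-carrier ratio: {sidebands / carrier:.3f}")  # ~0.2 when oscillating
```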

  1. Developing critical consciousness or justifying the system? A qualitative analysis of attributions for poverty and wealth among low-income racial/ethnic minority and immigrant women

    PubMed Central

    Godfrey, Erin B.; Wolf, Sharon

    2015-01-01

    Objectives Economic inequality is a growing concern in the United States and globally. The current study uses qualitative techniques to (1) explore the attributions low-income racial/ethnic minority and immigrant women make for poverty and wealth in the U.S., and (2) clarify important links between attributions, critical consciousness development and system justification theory. Methods In-depth interview transcripts from 19 low-income immigrant Dominican and Mexican and native African-American mothers in a large Northeastern city were analyzed using open coding techniques. Interview topics included perceptions of current economic inequality and mobility and experiences of daily economic hardships. Results Almost all respondents attributed economic inequality to individual factors (character flaws, lack of hard work). Structural explanations for poverty and wealth were expressed by less than half the sample and almost always paired with individual explanations. Moreover, individual attributions included system-justifying beliefs such as the belief in meritocracy and equality of opportunity and structural attributions represented varying levels of critical consciousness. Conclusions Our analysis sheds new light on how and why individuals simultaneously hold individual and structural attributions and highlights key links between system justification and critical consciousness. It shows that critical consciousness and system justification do not represent opposite stances along a single underlying continuum, but are distinct belief systems and motivations. It also suggests that the motive to justify the system is a key psychological process impeding the development of critical consciousness. Implications for scholarship and intervention are discussed. PMID:25915116

  2. Developing critical consciousness or justifying the system? A qualitative analysis of attributions for poverty and wealth among low-income racial/ethnic minority and immigrant women.

    PubMed

    Godfrey, Erin B; Wolf, Sharon

    2016-01-01

    Economic inequality is a growing concern in the United States and globally. The current study uses qualitative techniques to (a) explore the attributions low-income racial/ethnic minority and immigrant women make for poverty and wealth in the U.S., and (b) clarify important links between attributions, critical consciousness development, and system justification theory. In-depth interview transcripts from 19 low-income immigrant Dominican and Mexican and native African American mothers in a large Northeastern city were analyzed using open coding techniques. Interview topics included perceptions of current economic inequality and mobility and experiences of daily economic hardships. Almost all respondents attributed economic inequality to individual factors (character flaws, lack of hard work). Structural explanations for poverty and wealth were expressed by fewer than half the sample and almost always paired with individual explanations. Moreover, individual attributions included system-justifying beliefs such as the belief in meritocracy and equality of opportunity and structural attributions represented varying levels of critical consciousness. Our analysis sheds new light on how and why individuals simultaneously hold individual and structural attributions and highlights key links between system justification and critical consciousness. It shows that critical consciousness and system justification do not represent opposite stances along a single underlying continuum, but are distinct belief systems and motivations. It also suggests that the motive to justify the system is a key psychological process impeding the development of critical consciousness. Implications for scholarship and intervention are discussed. (c) 2016 APA, all rights reserved.

  3. Thyroid Radiofrequency Ablation: Updates on Innovative Devices and Techniques

    PubMed Central

    Park, Hye Sun; Park, Auh Whan; Chung, Sae Rom; Choi, Young Jun; Lee, Jeong Hyun

    2017-01-01

    Radiofrequency ablation (RFA) is a well-known, effective, and safe method for treating benign thyroid nodules and recurrent thyroid cancers. Thyroid-dedicated devices and basic techniques for thyroid RFA were introduced by the Korean Society of Thyroid Radiology (KSThR) in 2012. Thyroid RFA has now been adopted worldwide, with subsequent advances in devices and techniques. To optimize the treatment efficacy and patient safety, understanding the basic and advanced RFA techniques and selecting the optimal treatment strategy are critical. The goal of this review is to therefore provide updates and analysis of current devices and advanced techniques for RFA treatment of benign thyroid nodules and recurrent thyroid cancers. PMID:28670156

  4. [Functional magnetic resonance imaging: a critical analysis of its technical, statistical and theoretical implications in human neuroscience].

    PubMed

    González-García, C; Tudela, P; Ruz, M

    2014-04-01

    The use of functional magnetic resonance imaging (fMRI) has represented an important step forward for the neurosciences. Nevertheless, it has also been the subject of considerable criticism. Our aim is to review the most widespread criticisms of fMRI, so that researchers who are starting to use it may know the different elements that must be taken into account to approach this technique appropriately. The fact that fMRI allows brain activity to be observed makes it a very attractive and useful tool, and its use has grown exponentially since the last decade of the 20th century. At the same time, criticism of its use has become especially fierce. Most of this scepticism can be classified into aspects related to the technique and physiology, the analysis of data, and their theoretical interpretation. In this study we review the main arguments defended in each of these three areas, and examine whether they are well founded. Additionally, this work is intended as a reference for novel researchers when it comes to identifying the elements that must be taken into account as they approach fMRI. Despite the fact that fMRI is one of the most interesting options for observing the brain available today, its correct utilisation requires a great deal of control and knowledge. Even so, most of the criticism it receives today no longer has any solid foundation on which to stand.

  5. Comparison of Two Variants Of a Kata Technique (Unsu): The Neuromechanical Point of View

    PubMed Central

    Camomilla, Valentina; Sbriccoli, Paola; Mario, Alberto Di; Arpante, Alessandro; Felici, Francesco

    2009-01-01

    The objective of this work was to characterize, from a neuromechanical point of view, a jump performed within the sequence of Kata Unsu by international top-level karateka. A modified jumping technique was proposed to improve the already acquired technique. The neuromechanical evaluation, paralleled by a refereeing judgment, was then used to compare the modified and classic techniques to test whether the modification could lead to a better performance capacity, e.g., a higher score during an official competition. To this purpose, four high-ranked karateka were recruited and instructed to perform the two jumps. Surface electromyographic signals were recorded in a bipolar mode from the vastus lateralis, rectus femoris, biceps femoris, gluteus maximus, and gastrocnemius muscles of both lower limbs. Mechanical data were collected by means of a stereophotogrammetric system and force platforms. Performance was associated with parameters characterizing the initial conditions of the aerial phase and with the CoM maximal height. The most critical elements having a negative influence on the arbitral evaluation were associated with quantitative error indicators. 3D reconstruction of the movement and videos were used to obtain the referee scores. The Unsu jump was divided into five phases (preparation, take-off, ascending flight, descending flight, and landing) and the critical elements were highlighted. When comparing the techniques, no difference was found in the pattern of sEMG activation of the throwing leg muscles, while the push leg showed an earlier activation of the RF and GA muscles at the beginning of the modified technique. The only significant improvement associated with the modified technique was evidenced at the beginning of the aerial phase, while there was no significant improvement in the referee score. Nevertheless, the proposed neuromechanical analysis, aimed at correlating technique features with core performance indicators, is new in the field and is a promising tool for further analyses. Key Points A quantitative phase analysis, highlighting the critical features of the technique, was provided for the jump executed during the Kata Unsu. Kinematics and neuromuscular activity can be assessed during the Kata Unsu jump performed by top level karateka. Neuromechanical parameters change during different Kata Unsu jump techniques. Appropriate performance capacity indicators based on the neuromechanical evaluation can describe changes due to a modification of the technique. PMID:24474884

  6. Inverse analysis of aerodynamic loads from strain information using structural models and neural networks

    NASA Astrophysics Data System (ADS)

    Wada, Daichi; Sugimoto, Yohei

    2017-04-01

    Aerodynamic loads on aircraft wings are among the key parameters to be monitored for reliable and effective aircraft operation and management. Flight data on the aerodynamic loads would be used onboard to control the aircraft, and accumulated data would be used for condition-based maintenance and as feedback for fatigue and critical load modeling. Effective sensing techniques such as fiber optic distributed sensing have been developed and have demonstrated a promising capability for monitoring structural responses, i.e., strains on the surface of the aircraft wings. Building on these techniques, load identification methods for structural health monitoring are expected to be established. The typical inverse analysis for load identification using strains calculates the loads in a discrete form of concentrated forces; however, the distributed form of the loads is essential for accurate and reliable estimation of the critical stress at structural parts. In this study, we demonstrate an inverse analysis to identify distributed loads from measured strain information. The introduced technique calculates aerodynamic loads not in a discrete but in a distributed manner, based on a finite element model. To verify the technique through numerical simulations, we apply static aerodynamic loads to a flat panel model and conduct the inverse identification of the load distributions. We take two approaches to building the inverse system between loads and strains: the first uses structural models and the second uses neural networks. We compare the performance of the two approaches and discuss the effect of the amount of strain sensing information.
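
    As a concrete illustration of the two approaches, the sketch below recovers a distributed load from noisy strains on a toy one-dimensional problem. The influence matrix, sensor count, noise level, and load field are invented for illustration and stand in for the authors' finite element model; it is a sketch of the idea, not their implementation.

      # Toy illustration (not the paper's models): identify a distributed load
      # from strains via (1) a structural model with regularized least squares
      # and (2) a neural network trained on simulated strain-load pairs.
      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(0)
      n_load, n_strain = 20, 12            # hypothetical load stations / sensors

      # Hypothetical influence matrix: strain response per unit load at each
      # station (in practice this would come from a finite element model).
      x = np.linspace(0, 1, n_strain)[:, None]
      p = np.linspace(0, 1, n_load)[None, :]
      A = np.exp(-((x - p) ** 2) / 0.05)   # smooth, ill-conditioned mapping

      q_true = np.sin(np.pi * np.linspace(0, 1, n_load))          # "true" load
      eps = A @ q_true + 0.01 * rng.standard_normal(n_strain)     # measured strains

      # Approach 1: structural model + Tikhonov-regularized least squares
      lam = 1e-3
      q_ls = np.linalg.solve(A.T @ A + lam * np.eye(n_load), A.T @ eps)

      # Approach 2: neural network trained on randomly generated load fields
      Q = rng.standard_normal((2000, n_load)).cumsum(axis=1)
      Q /= np.abs(Q).max(axis=1, keepdims=True)                   # smooth-ish loads
      E = Q @ A.T + 0.01 * rng.standard_normal((2000, n_strain))
      net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
      net.fit(E, Q)
      q_nn = net.predict(eps[None, :])[0]

      print("LS error:", np.linalg.norm(q_ls - q_true) / np.linalg.norm(q_true))
      print("NN error:", np.linalg.norm(q_nn - q_true) / np.linalg.norm(q_true))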

  7. A computer technique for detailed analysis of mission radius and maneuverability characteristics of fighter aircraft

    NASA Technical Reports Server (NTRS)

    Foss, W. E., Jr.

    1981-01-01

    A computer technique to determine the mission radius and maneuverability characteristics of combat aircraft was developed. The technique was used to determine critical operational requirements and the areas in which research programs would be expected to yield the most beneficial results. In turn, the results of research efforts were evaluated in terms of aircraft performance on selected mission segments and for complete mission profiles. Extensive use of the technique in evaluation studies indicates that the calculated performance is essentially the same as that obtained by the proprietary programs in use throughout the aircraft industry.

  8. Recent mass spectrometry-based techniques and considerations for disulfide bond characterization in proteins.

    PubMed

    Lakbub, Jude C; Shipman, Joshua T; Desaire, Heather

    2018-04-01

    Disulfide bonds are important structural moieties of proteins: they ensure proper folding, provide stability, and enable proper function. With the increasing use of proteins for biotherapeutics, particularly monoclonal antibodies, which are highly disulfide bonded, it is now important to confirm the correct disulfide bond connectivity and to verify the presence, or absence, of disulfide bond variants in the protein therapeutics. These studies help to ensure safety and efficacy. Hence, disulfide bonds are among the critical quality attributes of proteins that have to be monitored closely during the development of biotherapeutics. However, disulfide bond analysis is challenging because of the complexity of the biomolecules. Mass spectrometry (MS) has been the go-to analytical tool for the characterization of such complex biomolecules, and several methods have been reported to meet the challenging task of mapping disulfide bonds in proteins. In this review, we describe the relevant, recent MS-based techniques and provide important considerations needed for efficient disulfide bond analysis in proteins. The review focuses on methods for proper sample preparation, fragmentation techniques for disulfide bond analysis, recent disulfide bond mapping methods based on the fragmentation techniques, and automated algorithms designed for rapid analysis of disulfide bonds from liquid chromatography-MS/MS data. Researchers involved in method development for protein characterization can use the information herein to facilitate development of new MS-based methods for protein disulfide bond analysis. In addition, individuals characterizing biotherapeutics, especially by disulfide bond mapping in antibodies, can use this review to choose the best strategies for disulfide bond assignment of their biologic products. Graphical Abstract This review, describing characterization methods for disulfide bonds in proteins, focuses on three critical components: sample preparation, mass spectrometry data, and software tools.

  9. CrossTalk: The Journal of Defense Software Engineering. Volume 27, Number 1, January/February 2014

    DTIC Science & Technology

    2014-02-01

    deficit in trustworthiness and will permit analysis on how this deficit needs to be overcome. This analysis will help identify adaptations that are...approaches to trustworthy analysis split into two categories: product-based and process-based. Product-based techniques [9] identify factors that...Criticalities may also be assigned to decompositions and contributions. 5. Evaluation and analysis : in this task the propagation rules of the NFR

  10. Challenges of assessing critical thinking and clinical judgment in nurse practitioner students.

    PubMed

    Gorton, Karen L; Hayes, Janice

    2014-03-01

    The purpose of this study was to determine whether there was a relationship between critical thinking skills and clinical judgment in nurse practitioner students. The study used a convenience, nonprobability sampling technique, engaging participants from across the United States. Correlational analysis demonstrated no statistically significant relationship between critical thinking skills and examination-style questions, between critical thinking skills and scores on the evaluation and reevaluation of consequences subscale of the Clinical Decision Making in Nursing Scale, or between critical thinking skills and the preceptor evaluation tool. The study found no statistically significant relationships between critical thinking skills and clinical judgment. Educators and practitioners could consider further research in these areas to gain insight into how critical thinking is and could be measured, into the clinical decision-making skills of nurse practitioner students, and into the development and measurement of critical thinking skills in advanced practice educational programs. Copyright 2014, SLACK Incorporated.

  11. Atomic spectrometry methods for wine analysis: a critical evaluation and discussion of recent applications.

    PubMed

    Grindlay, Guillermo; Mora, Juan; Gras, Luis; de Loos-Vollebregt, Margaretha T C

    2011-04-08

    The analysis of wine is of great importance since wine components strongly determine its stability and its organoleptic and nutritional characteristics. In addition, wine analysis is also important to prevent fraud and to assess toxicological issues. Among the different analytical techniques described in the literature, atomic spectrometry has traditionally been employed for elemental wine analysis due to its simplicity and good analytical figures of merit. The scope of this review is to summarize the main advantages and drawbacks of various atomic spectrometry techniques for elemental wine analysis. Special attention is paid to interferences (i.e. matrix effects) affecting the analysis, as well as the strategies available to mitigate them. Finally, the latest studies on wine speciation are briefly discussed. Copyright © 2011 Elsevier B.V. All rights reserved.

  12. Critical frontier of the triangular Ising antiferromagnet in a field

    NASA Astrophysics Data System (ADS)

    Qian, Xiaofeng; Wegewijs, Maarten; Blöte, Henk W.

    2004-03-01

    We study the critical line of the triangular Ising antiferromagnet in an external magnetic field by means of a finite-size analysis of results obtained by transfer-matrix and Monte Carlo techniques. We compare the shape of the critical line with the predictions of two different theoretical scenarios. Both scenarios, while plausible, involve assumptions. The first is based on the generalization of the model to a vertex model, and on the assumption that the exact analytic form of the critical manifold of this vertex model is determined by the zeroes of an O(2) gauge-invariant polynomial in the vertex weights. However, it proved impossible to fit the coefficients of such polynomials, up to order 10, so as to reproduce the numerical data for the critical points. The second prediction is based on the assumption that a renormalization mapping of the Ising model onto the Coulomb gas exists, and on an analysis of the resulting renormalization equations. It leads to a shape of the critical line that is inconsistent with the first prediction, but consistent with the numerical data.
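
    For readers new to the numerical side, the following is a minimal Metropolis Monte Carlo sketch of the model itself, with energy E = J Σ s_i s_j − H Σ s_i and J > 0 (antiferromagnetic). The lattice size, temperature, and field are illustrative choices, and none of the paper's transfer-matrix or finite-size-scaling machinery is reproduced here.

      # Minimal Metropolis sketch of the triangular Ising antiferromagnet in a
      # field (illustrative only; no finite-size scaling or transfer matrices).
      import numpy as np

      rng = np.random.default_rng(1)
      L, J, H, T = 24, 1.0, 1.0, 0.5       # size, AF coupling, field, temperature
      s = rng.choice([-1, 1], size=(L, L))

      # Triangular lattice mapped onto a square array with one extra diagonal:
      # neighbours of (i, j) are (i±1, j), (i, j±1), (i+1, j+1), (i-1, j-1).
      def local_sum(i, j):
          return (s[(i + 1) % L, j] + s[(i - 1) % L, j] +
                  s[i, (j + 1) % L] + s[i, (j - 1) % L] +
                  s[(i + 1) % L, (j + 1) % L] + s[(i - 1) % L, (j - 1) % L])

      def sweep():
          for _ in range(L * L):
              i, j = rng.integers(L), rng.integers(L)
              # E = J*sum_<kl> s_k*s_l - H*sum_k s_k with J > 0; flipping
              # s[i, j] changes the energy by 2*s[i, j]*(H - J*local_sum).
              dE = 2 * s[i, j] * (H - J * local_sum(i, j))
              if dE <= 0 or rng.random() < np.exp(-dE / T):
                  s[i, j] = -s[i, j]

      for _ in range(500):                 # equilibration
          sweep()
      m = []
      for _ in range(500):                 # measurement
          sweep()
          m.append(s.mean())
      print(f"mean magnetization per spin at T={T}, H={H}: {np.mean(m):.3f}")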

  13. SU-F-T-246: Evaluation of Healthcare Failure Mode And Effect Analysis For Risk Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harry, T; University of California, San Diego, La Jolla, CA; Manger, R

    Purpose: To evaluate the differences between the Veterans Affairs Healthcare Failure Mode and Effect Analysis (HFMEA) and the AAPM Task Group 100 Failure Mode and Effects Analysis (FMEA) risk assessment techniques in the setting of a stereotactic radiosurgery (SRS) procedure. Understanding the differences in the techniques' methodologies and outcomes will provide further insight into the applicability and utility of risk assessment exercises in radiation therapy. Methods: An HFMEA risk assessment was performed on a stereotactic radiosurgery procedure. A previous study from our institution completed an FMEA of our SRS procedure, and the process map generated from this work was used for the HFMEA. The process of performing the HFMEA scoring was analyzed, and the results from both analyses were compared. Results: The key differences between the two risk assessments are the scoring criteria for failure modes and the identification of critical failure modes for potential hazards. The general consensus among the team performing the analyses was that scoring for the HFMEA was simpler and more intuitive than for the FMEA. The FMEA identified 25 critical failure modes while the HFMEA identified 39. Seven of the FMEA critical failure modes were not identified by the HFMEA, and 21 of the HFMEA critical failure modes were not identified by the FMEA. HFMEA, as described by the Veterans Affairs, provides guidelines on which failure modes to address first. Conclusion: HFMEA is a more efficient model for identifying gross risks in a process than FMEA. Clinics with minimal staff, time and resources can benefit from this type of risk assessment to eliminate or mitigate high-risk hazards with nominal effort. FMEA can provide more in-depth detail, but at the cost of elevated effort.
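
    The scoring difference is easy to make concrete. The sketch below uses invented failure modes, scores, and thresholds (not the study's data): FMEA ranks failure modes by a risk priority number, RPN = severity × occurrence × detectability, while HFMEA combines a hazard score of severity × probability with decision rules.

      # Illustrative scoring sketch (all values invented, not the study's data):
      # FMEA uses RPN = severity * occurrence * detectability; HFMEA uses a
      # hazard score = severity * probability plus decision rules.
      failure_modes = [
          # (name, FMEA severity 1-10, occurrence 1-10, detectability 1-10,
          #  HFMEA severity 1-4, HFMEA probability 1-4)
          ("wrong isocenter coordinates", 9, 3, 6, 4, 2),
          ("incorrect collimator size",   7, 2, 4, 3, 2),
          ("patient mis-identification", 10, 1, 2, 4, 1),
      ]

      RPN_THRESHOLD, HFMEA_THRESHOLD = 100, 8   # illustrative cut-offs only

      for name, s, o, d, hs, hp in failure_modes:
          rpn = s * o * d              # FMEA risk priority number
          hazard = hs * hp             # HFMEA hazard score
          print(f"{name:30s} RPN={rpn:4d}"
                f" ({'critical' if rpn >= RPN_THRESHOLD else 'ok'})"
                f"  hazard={hazard:2d}"
                f" ({'critical' if hazard >= HFMEA_THRESHOLD else 'ok'})")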

  14. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all of the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability, because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include the variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
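
    The underlying idea can be illustrated with a toy Monte Carlo propagation. SPACE itself is not public, so the solar-array power model and every parameter value below are invented; the point is only how sampling uncertain inputs turns a single-valued prediction into a distribution.

      # Toy Monte Carlo uncertainty propagation (model and values invented):
      # sample uncertain inputs, push them through the model, and report the
      # resulting distribution of power capability instead of a single value.
      import numpy as np

      rng = np.random.default_rng(42)
      N = 100_000

      area = rng.normal(100.0, 1.0, N)        # array area, m^2
      eff  = rng.normal(0.14, 0.005, N)       # cell efficiency
      irr  = rng.normal(1367.0, 10.0, N)      # solar irradiance, W/m^2
      degr = rng.uniform(0.95, 1.00, N)       # degradation factor
      line = rng.normal(0.97, 0.01, N)        # distribution efficiency

      power = area * eff * irr * degr * line  # delivered power, W

      print(f"mean power: {power.mean() / 1e3:.2f} kW")
      print(f"5th-percentile capability: {np.percentile(power, 5) / 1e3:.2f} kW")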

  15. Benchmarking criticality analysis of TRIGA fuel storage racks.

    PubMed

    Robinson, Matthew Loren; DeBey, Timothy M; Higginbotham, Jack F

    2017-01-01

    A criticality analysis was benchmarked against sub-criticality measurements of the hexagonal fuel storage racks at the United States Geological Survey TRIGA MARK I reactor in Denver. These racks, which hold up to 19 fuel elements each, are arranged at 0.61 m (2 ft) spacing around the outer edge of the reactor. A three-dimensional model of the racks was created using MCNP5, and the model was verified experimentally by comparison with subcritical multiplication data collected during an approach-to-critical loading of two of the racks. The validated model was then used to show that, in the extreme condition where the entire circumference of the pool is lined with racks loaded with used fuel, the storage array is subcritical, with a k value of about 0.71, well below the regulatory limit of 0.8. A model was also constructed of the rectangular 2×10 fuel storage array used in many other TRIGA reactors, to validate the technique against the original TRIGA licensing subcritical analysis performed in 1966. The fuel used in this study was standard 20% enriched (LEU) aluminum- or stainless-steel-clad TRIGA fuel. Copyright © 2016. Published by Elsevier Ltd.
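
    The subcritical multiplication measurements mentioned above follow the standard 1/M approach-to-critical method, which the sketch below illustrates with invented detector counts: the inverse multiplication 1/M = C0/C is plotted against the loading and extrapolated toward zero.

      # Sketch of a standard 1/M approach-to-critical extrapolation (detector
      # counts below are invented; a rack like this one stays subcritical, so
      # the extrapolated loading is hypothetical and beyond its capacity).
      import numpy as np

      n_elements = np.array([0, 4, 8, 12, 16, 19])              # elements loaded
      counts     = np.array([100, 118, 145, 190, 270, 340.0])   # detector counts

      inv_M = counts[0] / counts            # inverse multiplication, 1/M = C0/C

      # Fit the last few points and extrapolate 1/M -> 0.
      a, b = np.polyfit(n_elements[-3:], inv_M[-3:], 1)
      print(f"extrapolated critical loading: {-b / a:.1f} elements")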

  16. A comparative analysis of conventional cytopreparatory and liquid based cytological techniques (Sure Path) in evaluation of serous effusion fluids.

    PubMed

    Dadhich, Hrishikesh; Toi, Pampa Ch; Siddaraju, Neelaiah; Sevvanthi, Kalidas

    2016-11-01

    Clinically, the detection of malignant cells in serous body fluids is critical, as their presence implies upstaging of the disease. Cytology of body cavity fluids serves as an important tool when other diagnostic tests cannot be performed. In most laboratories, effusion fluid samples are currently analysed chiefly by the conventional cytopreparatory (CCP) technique. Although there are several studies comparing liquid-based cytology (LBC) with the CCP technique in the field of cervicovaginal cytology, the literature on such comparisons with respect to serous body fluid examination is sparse. One hundred samples of serous body fluids were processed by both the CCP and LBC techniques. Slides prepared by these techniques were studied using six parameters. A comparative analysis of the advantages and disadvantages of the techniques in the detection of malignant cells was carried out with appropriate statistical tests. The samples comprised 52 pleural, 44 peritoneal and four pericardial fluids. No statistically significant difference was noted with respect to cellularity (P = 0.22), cell distribution (P = 0.39) or diagnosis of malignancy (P = 0.20). As for the remaining parameters, LBC provided a significantly clearer smear background (P < 0.0001) and shorter screening time (P < 0.0001), while the CCP technique provided significantly better staining quality (P = 0.01) and sharper cytomorphologic features (P = 0.05). Although a reduced screening time and clearer smear background are the two major advantages of LBC, the CCP technique provides better staining quality and sharper cytomorphologic features, which are more critical from the point of view of cytologic interpretation. Diagn. Cytopathol. 2016;44:874-879. © 2016 Wiley Periodicals, Inc.

  17. Critical care physician cognitive task analysis: an exploratory study

    PubMed Central

    Fackler, James C; Watts, Charles; Grome, Anna; Miller, Thomas; Crandall, Beth; Pronovost, Peter

    2009-01-01

    Introduction For better or worse, the imposition of work-hour limitations on house-staff has imperiled continuity and/or improved decision-making. Regardless, the workflow of every physician team in every academic medical centre has been irrevocably altered. We explored the use of cognitive task analysis (CTA) techniques, most commonly used in other high-stress and time-sensitive environments, to analyse key cognitive activities in critical care medicine. The study objective was to assess the usefulness of CTA as an analytical tool in order that physician cognitive tasks may be understood and redistributed within the work-hour limited medical decision-making teams. Methods After approval from each Institutional Review Board, two intensive care units (ICUs) within major university teaching hospitals served as data collection sites for CTA observations and interviews of critical care providers. Results Five broad categories of cognitive activities were identified: pattern recognition; uncertainty management; strategic vs. tactical thinking; team coordination and maintenance of common ground; and creation and transfer of meaning through stories. Conclusions CTA within the framework of Naturalistic Decision Making is a useful tool to understand the critical care process of decision-making and communication. The separation of strategic and tactical thinking has implications for workflow redesign. Given the global push for work-hour limitations, such workflow redesign is occurring. Further work with CTA techniques will provide important insights toward rational, rather than random, workflow changes. PMID:19265517

  18. Critical care physician cognitive task analysis: an exploratory study.

    PubMed

    Fackler, James C; Watts, Charles; Grome, Anna; Miller, Thomas; Crandall, Beth; Pronovost, Peter

    2009-01-01

    For better or worse, the imposition of work-hour limitations on house-staff has imperiled continuity and/or improved decision-making. Regardless, the workflow of every physician team in every academic medical centre has been irrevocably altered. We explored the use of cognitive task analysis (CTA) techniques, most commonly used in other high-stress and time-sensitive environments, to analyse key cognitive activities in critical care medicine. The study objective was to assess the usefulness of CTA as an analytical tool in order that physician cognitive tasks may be understood and redistributed within the work-hour limited medical decision-making teams. After approval from each Institutional Review Board, two intensive care units (ICUs) within major university teaching hospitals served as data collection sites for CTA observations and interviews of critical care providers. Five broad categories of cognitive activities were identified: pattern recognition; uncertainty management; strategic vs. tactical thinking; team coordination and maintenance of common ground; and creation and transfer of meaning through stories. CTA within the framework of Naturalistic Decision Making is a useful tool to understand the critical care process of decision-making and communication. The separation of strategic and tactical thinking has implications for workflow redesign. Given the global push for work-hour limitations, such workflow redesign is occurring. Further work with CTA techniques will provide important insights toward rational, rather than random, workflow changes.

  19. Beethoven recordings reviewed: a systematic method for mapping the content of music performance criticism

    PubMed Central

    Alessandri, Elena; Williamson, Victoria J.; Eiholzer, Hubert; Williamon, Aaron

    2015-01-01

    Critical reviews offer rich data that can be used to investigate how musical experiences are conceptualized by expert listeners. However, these data also present significant challenges in terms of organization, analysis, and interpretation. This study presents a new systematic method for examining written responses to music, tested on a substantial corpus of music criticism. One hundred critical reviews of Beethoven’s piano sonata recordings, published in the Gramophone between August 1934 and July 2010, were selected using in-depth data reduction (qualitative/quantitative approach). The texts were then examined using thematic analysis in order to generate a visual descriptive model of expert critical review. This model reveals how the concept of evaluation permeates critical review. It also distinguishes between two types of descriptors. The first characterizes the performance in terms of specific actions or features of the musical sound (musical parameters, technique, and energy); the second appeals to higher-order properties (artistic style, character and emotion, musical structure, communicativeness) or assumed performer qualities (understanding, intentionality, spontaneity, sensibility, control, and care). The new model provides a methodological guide and conceptual basis for future studies of critical review in any genre. PMID:25741295

  20. Beethoven recordings reviewed: a systematic method for mapping the content of music performance criticism.

    PubMed

    Alessandri, Elena; Williamson, Victoria J; Eiholzer, Hubert; Williamon, Aaron

    2015-01-01

    Critical reviews offer rich data that can be used to investigate how musical experiences are conceptualized by expert listeners. However, these data also present significant challenges in terms of organization, analysis, and interpretation. This study presents a new systematic method for examining written responses to music, tested on a substantial corpus of music criticism. One hundred critical reviews of Beethoven's piano sonata recordings, published in the Gramophone between August 1934 and July 2010, were selected using in-depth data reduction (qualitative/quantitative approach). The texts were then examined using thematic analysis in order to generate a visual descriptive model of expert critical review. This model reveals how the concept of evaluation permeates critical review. It also distinguishes between two types of descriptors. The first characterizes the performance in terms of specific actions or features of the musical sound (musical parameters, technique, and energy); the second appeals to higher-order properties (artistic style, character and emotion, musical structure, communicativeness) or assumed performer qualities (understanding, intentionality, spontaneity, sensibility, control, and care). The new model provides a methodological guide and conceptual basis for future studies of critical review in any genre.

  1. Geospatial methods and data analysis for assessing distribution of grazing livestock

    USDA-ARS?s Scientific Manuscript database

    Free-ranging livestock research must begin with a well conceived problem statement and employ appropriate data acquisition tools and analytical techniques to accomplish the research objective. These requirements are especially critical in addressing animal distribution. Tools and statistics used t...

  2. Techniques and Technology to Revise Content Delivery and Model Critical Thinking in the Neuroscience Classroom

    PubMed Central

    Illig, Kurt R.

    2015-01-01

    Undergraduate neuroscience courses typically involve highly interdisciplinary material, and it is often necessary to use class time to review how principles of chemistry, math and biology apply to neuroscience. Lecturing and Socratic discussion can work well to deliver information to students, but these techniques can lead students to feel more like spectators than participants in a class, and do not actively engage students in the critical analysis and application of experimental evidence. If one goal of undergraduate neuroscience education is to foster critical thinking skills, then the classroom should be a place where students and instructors can work together to develop them. Students learn how to think critically by directly engaging with course material, and by discussing evidence with their peers, but taking classroom time for these activities requires that an instructor find a way to provide course materials outside of class. Using technology as an on-demand provider of course materials can give instructors the freedom to restructure classroom time, allowing students to work together in small groups and to have discussions that foster critical thinking, and allowing the instructor to model these skills. In this paper, I provide a rationale for reducing the use of traditional lectures in favor of more student-centered activities, I present several methods that can be used to deliver course materials outside of class and discuss their use, and I provide a few examples of how these techniques and technologies can help improve learning outcomes. PMID:26240525

  3. Techniques and Technology to Revise Content Delivery and Model Critical Thinking in the Neuroscience Classroom.

    PubMed

    Illig, Kurt R

    2015-01-01

    Undergraduate neuroscience courses typically involve highly interdisciplinary material, and it is often necessary to use class time to review how principles of chemistry, math and biology apply to neuroscience. Lecturing and Socratic discussion can work well to deliver information to students, but these techniques can lead students to feel more like spectators than participants in a class, and do not actively engage students in the critical analysis and application of experimental evidence. If one goal of undergraduate neuroscience education is to foster critical thinking skills, then the classroom should be a place where students and instructors can work together to develop them. Students learn how to think critically by directly engaging with course material, and by discussing evidence with their peers, but taking classroom time for these activities requires that an instructor find a way to provide course materials outside of class. Using technology as an on-demand provider of course materials can give instructors the freedom to restructure classroom time, allowing students to work together in small groups and to have discussions that foster critical thinking, and allowing the instructor to model these skills. In this paper, I provide a rationale for reducing the use of traditional lectures in favor of more student-centered activities, I present several methods that can be used to deliver course materials outside of class and discuss their use, and I provide a few examples of how these techniques and technologies can help improve learning outcomes.

  4. Optimisation of Critical Infrastructure Protection: The SiVe Project on Airport Security

    NASA Astrophysics Data System (ADS)

    Breiing, Marcus; Cole, Mara; D'Avanzo, John; Geiger, Gebhard; Goldner, Sascha; Kuhlmann, Andreas; Lorenz, Claudia; Papproth, Alf; Petzel, Erhard; Schwetje, Oliver

    This paper outlines the scientific goals, ongoing work and first results of the SiVe research project on critical infrastructure security. The methodology is generic, while the pilot studies are chosen from airport security. The outline proceeds in three major steps: (1) building a threat scenario, (2) development of simulation models as scenario refinements, and (3) assessment of alternatives. Advanced techniques of systems analysis and simulation are employed to model relevant airport structures and processes as well as offences. Computer experiments are carried out to compare and optimise alternative solutions. The optimality analyses draw on approaches to quantitative risk assessment recently developed in the operational sciences. To exploit the advantages of the various techniques, an integrated simulation workbench is built up in the project.

  5. Verification of Orthogrid Finite Element Modeling Techniques

    NASA Technical Reports Server (NTRS)

    Steeve, B. E.

    1996-01-01

    The stress analysis of orthogrid structures, specifically with I-beam sections, is regularly performed using finite elements. Various modeling techniques are often used to simplify the modeling process but still adequately capture the actual hardware behavior. The accuracy of such 'short cuts' is sometimes in question. This report compares three modeling techniques to actual test results from a loaded orthogrid panel. The finite element models include a beam model, a shell model, and a mixed beam-and-shell element model. Results show that the shell element model performs best, but that the simpler beam and mixed beam-and-shell element models provide reasonable to conservative results for a stress analysis. When deflection and stiffness are critical, it is important to capture the effect of the orthogrid nodes in the model.

  6. Report of geomagnetic pulsation indices for space weather applications

    USGS Publications Warehouse

    Xu, Z.; Gannon, Jennifer L.; Rigler, Erin J.

    2013-01-01

    The phenomenon of ultra-low frequency geomagnetic pulsations was first observed in the ground-based measurements of the 1859 Carrington Event and has been studied for over 100 years. A pulsation frequency is considered "ultra" low when it is lower than the natural frequencies of the plasma, such as the ion gyrofrequency. Ultra-low frequency pulsations are considered a source of noise in some geophysical analysis techniques, such as aeromagnetic surveys and transient electromagnetics, so it is critical to develop near real-time space weather products to monitor them. Proper spectral analysis of magnetometer data, for example using wavelet techniques, can also be important for geomagnetically induced current risk assessment.
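
    A minimal sketch of the kind of spectral monitoring described, assuming a synthetic one-sample-per-second magnetometer record with a Pc5-band (millihertz) pulsation switched on halfway through; the actual USGS products are considerably more involved, and a short-time spectrogram is used here as a stand-in for the wavelet analysis mentioned above.

      # Detect a synthetic ULF pulsation in magnetometer-like data with a
      # spectrogram (illustrative stand-in for the operational products).
      import numpy as np
      from scipy.signal import spectrogram

      fs = 1.0                                   # 1 Hz sampling
      t = np.arange(0, 7200, 1 / fs)             # two hours of data
      x = 0.5 * np.sin(2 * np.pi * 5e-3 * t) * (t > 3600)   # 5 mHz, 2nd hour
      x += 0.1 * np.random.default_rng(0).standard_normal(t.size)

      f, tt, Sxx = spectrogram(x, fs=fs, nperseg=1024, noverlap=512)
      band = (f >= 2e-3) & (f <= 7e-3)           # Pc5 band, roughly 2-7 mHz
      power = Sxx[band].sum(axis=0)
      ratio = power[tt > 3600].mean() / power[tt < 3600].mean()
      print(f"Pc5 band power rises by a factor of {ratio:.1f} in the 2nd hour")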

  7. Determination of the critical bending speeds of a multi-rotor shaft from the vibration signal analysis

    NASA Astrophysics Data System (ADS)

    Crâştiu, I.; Nyaguly, E.; Deac, S.; Gozman-Pop, C.; Bârgău, A.; Bereteu, L.

    2018-01-01

    The purpose of this paper is the development and validation of an impulse excitation technique to determine the flexural critical speeds of single-rotor and multi-rotor shafts. The experimental measurement of the vibroacoustic response is carried out using a condenser microphone as a transducer. By means of modal analysis using the finite element method (FEM), the natural frequencies and mode shapes of one-rotor and three-rotor specimens are determined. The vibration responses of the specimens, under simply supported conditions, are analysed using algorithms based on the fast Fourier transform (FFT). To validate the modal parameters estimated using finite element analysis (FEA), they are compared with the experimental ones.
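
    The signal-processing core of the approach can be sketched on a synthetic impulse response: natural frequencies appear as peaks in the FFT magnitude of the decaying record. The two modal frequencies, damping values, and sampling rate below are invented for illustration.

      # Pick natural frequencies as FFT peaks of a (synthetic) impulse response.
      import numpy as np
      from scipy.signal import find_peaks

      fs = 8192                                  # sampling rate, Hz (invented)
      t = np.arange(0, 2.0, 1 / fs)
      # Synthetic vibroacoustic response: two decaying modes (invented values)
      x = (np.exp(-3 * t) * np.sin(2 * np.pi * 180 * t) +
           0.5 * np.exp(-5 * t) * np.sin(2 * np.pi * 460 * t))

      X = np.abs(np.fft.rfft(x))                 # magnitude spectrum
      f = np.fft.rfftfreq(x.size, 1 / fs)
      peaks, _ = find_peaks(X, height=X.max() * 0.1)
      print("estimated natural frequencies [Hz]:", f[peaks])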

  8. Software Dependability and Safety Evaluations ESA's Initiative

    NASA Astrophysics Data System (ADS)

    Hernek, M.

    ESA has allocated funds for an initiative to evaluate dependability and safety methods for software. The objectives of this initiative are: more extensive validation of safety and dependability techniques for software, and the provision of valuable results to improve software quality, thus promoting the application of dependability and safety methods and techniques. ESA space systems are developed according to defined PA requirement specifications. These requirements may be implemented through various design concepts, e.g. redundancy, diversity, etc., varying from project to project. Analysis methods (FMECA, FTA, HA, etc.) are frequently used during requirements analysis and design activities to assure the correct implementation of system PA requirements. The criticality level of failures, functions and systems is determined, and by doing so the critical sub-systems, on which dependability and safety techniques are to be applied during development, are identified. Proper performance of the software development requires a technical specification for the products at the beginning of the life cycle. Such a technical specification comprises both functional and non-functional requirements. The non-functional requirements address characteristics of the product such as quality, dependability, safety and maintainability. Software in space systems is used more and more in critical functions. In addition, the trend towards more frequent use of COTS and reusable components poses new difficulties in terms of assuring reliable and safe systems. Because of this, software dependability and safety must be carefully analysed. ESA has identified and documented techniques, methods and procedures to ensure that software dependability and safety requirements are specified and taken into account during the design and development of a software system, and to verify/validate that the implemented software systems comply with these requirements [R1].

  9. Aligning Goals, Assessments, and Activities: An Approach to Teaching PCR and Gel Electrophoresis

    PubMed Central

    Robertson, Amber L.; Batzli, Janet; Harris, Michelle; Miller, Sarah

    2008-01-01

    Polymerase chain reaction (PCR) and gel electrophoresis have become common techniques used in undergraduate molecular and cell biology labs. Although students enjoy learning these techniques, they often cannot fully comprehend and analyze the outcomes of their experiments because of a disconnect between concepts taught in lecture and experiments done in lab. Here we report the development and implementation of novel exercises that integrate the biological concepts of DNA structure and replication with the techniques of PCR and gel electrophoresis. Learning goals were defined based on concepts taught throughout the cell biology lab course and learning objectives specific to the PCR and gel electrophoresis lab. Exercises developed to promote critical thinking and target the underlying concepts of PCR, primer design, gel analysis, and troubleshooting were incorporated into an existing lab unit based on the detection of genetically modified organisms. Evaluative assessments for each exercise were aligned with the learning goals and used to measure student learning achievements. Our analysis found that the exercises were effective in enhancing student understanding of these concepts as shown by student performance across all learning goals. The new materials were particularly helpful in acquiring relevant knowledge, fostering critical-thinking skills, and uncovering prevalent misconceptions. PMID:18316813

  10. Swedish Defence Research Abstracts 80/81-2 (Froe Forsvars Forsknings Referat 80/81-2).

    DTIC Science & Technology

    1981-06-01

    Introduction of critical path analysis in calculations of the effects on aerial targets E CONDUCT OF WAR - INFORMATION AND COMMAND TECHNIQUE (63) Optical...including signal interception and technical intelligence (72) The mechanism of a Baltic earthquake source obtained by analysis of recordings of surface...work - a concluding discussion H7 Testing and job analysis (91) Cardiopulmonary responses to arm exercise performed in various ways (in English) (92

  11. Lessons Learned from Application of System and Software Level RAMS Analysis to a Space Control System

    NASA Astrophysics Data System (ADS)

    Silva, N.; Esper, A.

    2012-01-01

    The work presented in this article represents the results of applying RAMS analysis to a critical space control system, at both system and software level. The system-level RAMS analysis allowed criticalities to be assigned to the high-level components, and this was further refined by a tailored software-level RAMS analysis. The importance of the software-level RAMS analysis in identifying new failure modes, and its impact on the system-level RAMS analysis, is discussed. Changes to the software architecture have also been recommended in order to reduce the criticality of the SW components to an acceptable minimum. The dependability analysis was performed in accordance with ECSS-Q-ST-80, which had to be tailored and complemented in some aspects. This tailoring is also detailed in the article, and lessons learned from its application are shared, underlining its importance for space system safety evaluations. The paper presents the applied techniques, the relevant results obtained, the effort required to perform the tasks and the planned strategy for ROI estimation, as well as the soft skills required and acquired during these activities.

  12. Concrete pavement mixture design and analysis (MDA) : application of a portable x-ray fluorescence technique to assess concrete mix proportions.

    DOT National Transportation Integrated Search

    2012-03-01

    Any transportation infrastructure system is inherently concerned with durability and performance issues. The proportioning and : uniformity control of concrete mixtures are critical factors that directly affect the longevity and performance of the po...

  13. Automated Student Model Improvement

    ERIC Educational Resources Information Center

    Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.

    2012-01-01

    Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…

  14. State-of-the-Art Methods for Skeletal Muscle Glycogen Analysis in Athletes—The Need for Novel Non-Invasive Techniques

    PubMed Central

    Greene, Jacob; Louis, Julien; Korostynska, Olga; Mason, Alex

    2017-01-01

    Muscle glycogen levels have a profound impact on an athlete’s sporting performance, thus measurement is vital. Carbohydrate manipulation is a fundamental component in an athlete’s lifestyle and is a critical part of elite performance, since it can provide necessary training adaptations. This paper provides a critical review of the current invasive and non-invasive methods for measuring skeletal muscle glycogen levels. These include the gold standard muscle biopsy, histochemical analysis, magnetic resonance spectroscopy, and musculoskeletal high frequency ultrasound, as well as pursuing future application of electromagnetic sensors in the pursuit of portable non-invasive quantification of muscle glycogen. This paper will be of interest to researchers who wish to understand the current and most appropriate techniques in measuring skeletal muscle glycogen. This will have applications both in the lab and in the field by improving the accuracy of research protocols and following the physiological adaptations to exercise. PMID:28241495

  15. Critical Analysis of Dual-Probe Heat-Pulse Technique Applied to Measuring Thermal Diffusivity

    NASA Astrophysics Data System (ADS)

    Bovesecchi, G.; Coppa, P.; Corasaniti, S.; Potenza, M.

    2018-07-01

    The paper presents an analysis of the experimental parameters involved in the application of the dual-probe heat-pulse technique, followed by a critical review of methods for processing thermal response data (e.g., maximum detection and nonlinear least squares regression) and the consequently obtainable uncertainty. Glycerol was selected as the test liquid, and its thermal diffusivity was evaluated over the temperature range from -20 °C to 60 °C. In addition, Monte Carlo simulation was used to assess the uncertainty propagation for maximum detection. It was concluded that the maximum detection approach to processing thermal response data gives results closest to the reference data, inasmuch as the nonlinear regression results are affected by major uncertainties due to partial correlation between the evaluated parameters. Moreover, interpolating the temperature data with a polynomial to find the maximum leads to a systematic difference between measured and reference data, as evidenced by the Monte Carlo simulations; through its correction, this systematic error can be reduced to a negligible value, about 0.8 %.
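
    As a rough illustration of maximum detection, the sketch below assumes the idealized instantaneous line-source model, for which the temperature rise at probe spacing r peaks at t_m = r^2/(4*alpha), so alpha = r^2/(4*t_m); a real pulsed source requires a correction to this relation, and the probe spacing, diffusivity value, noise level, and fitting window are all invented.

      # Maximum detection under the idealized instantaneous line-source model:
      # fit a local polynomial around the peak, read t_m, get alpha = r^2/(4 t_m).
      import numpy as np

      r = 6e-3                                  # probe spacing, m (invented)
      alpha_true = 9.5e-8                       # glycerol-like diffusivity, m^2/s
      t = np.linspace(1, 300, 3000)             # time, s
      temp = (1 / t) * np.exp(-r**2 / (4 * alpha_true * t))  # rise shape
      temp /= temp.max()
      temp += 0.005 * np.random.default_rng(3).standard_normal(t.size)  # noise

      i = temp.argmax()                         # raw maximum
      w = slice(max(i - 100, 0), i + 100)       # arbitrary fitting window
      c = np.polyfit(t[w], temp[w], 2)          # local parabola
      t_m = -c[1] / (2 * c[0])                  # vertex = estimated peak time
      print(f"alpha estimate: {r**2 / (4 * t_m):.2e}  vs true {alpha_true:.2e}")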

  16. Formal Methods for Life-Critical Software

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Johnson, Sally C.

    1993-01-01

    The use of computer software in life-critical applications, such as for civil air transports, demands the use of rigorous formal mathematical verification procedures. This paper demonstrates how to apply formal methods to the development and verification of software by leading the reader step-by-step through requirements analysis, design, implementation, and verification of an electronic phone book application. The current maturity and limitations of formal methods tools and techniques are then discussed, and a number of examples of the successful use of formal methods by industry are cited.

  17. Assessing Spontaneous Combustion Instability with Nonlinear Time Series Analysis

    NASA Technical Reports Server (NTRS)

    Eberhart, C. J.; Casiano, M. J.

    2015-01-01

    Considerable interest lies in the ability to characterize the onset of spontaneous instabilities within liquid propellant rocket engine (LPRE) combustion devices. Linear techniques, such as fast Fourier transforms, various correlation parameters, and critical damping parameters, have been used at great length for over fifty years. Recently, nonlinear time series methods have been applied to deduce information pertaining to instability incipiency hidden in seemingly stochastic combustion noise. A technique commonly used in the biological sciences, known as Multifractal Detrended Fluctuation Analysis, has been extended to the combustion dynamics field and is introduced here as a data analysis approach complementary to linear ones. Building on this, a modified technique is leveraged to extract artifacts of impending combustion instability that present themselves prior to growth to limit cycle amplitudes. The analysis is demonstrated on data from J-2X gas generator testing during which a distinct spontaneous instability was observed. Comparisons are made to previous work wherein the data were characterized using linear approaches. Verification of the technique is performed by examining idealized signals and comparing two separate, independently developed tools.
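
    A compact version of the MFDFA procedure is sketched below on a synthetic white-noise record (not the J-2X data): build the profile, detrend it segment-wise with local polynomials, form the q-th order fluctuation functions, and read the generalized Hurst exponents h(q) from log-log slopes. Scales and q values are arbitrary choices.

      # Compact multifractal detrended fluctuation analysis (MFDFA) sketch.
      import numpy as np

      def mfdfa(x, scales, q_list, order=1):
          """Return F_q(s); slopes of log F_q vs log s give h(q)."""
          y = np.cumsum(x - x.mean())                    # profile
          F = np.zeros((len(q_list), len(scales)))
          for j, s in enumerate(scales):
              n_seg = y.size // s
              segs = y[:n_seg * s].reshape(n_seg, s)
              t = np.arange(s)
              # variance of each segment about its local polynomial trend
              f2 = np.array([np.mean((g - np.polyval(np.polyfit(t, g, order), t))**2)
                             for g in segs])
              for i, q in enumerate(q_list):
                  if q == 0:                             # q -> 0 limit
                      F[i, j] = np.exp(0.5 * np.mean(np.log(f2)))
                  else:
                      F[i, j] = np.mean(f2 ** (q / 2)) ** (1 / q)
          return F

      rng = np.random.default_rng(7)
      x = rng.standard_normal(2**14)                     # white-noise test signal
      scales = np.array([16, 32, 64, 128, 256, 512])
      q_list = [-2, 2]
      F = mfdfa(x, scales, q_list)
      for q, Fq in zip(q_list, F):
          h = np.polyfit(np.log(scales), np.log(Fq), 1)[0]
          print(f"h(q={q:+d}) ~ {h:.2f}   (white noise: ~0.5 for all q)")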

  18. [Professional divers: analysis of critical issues and proposal of a health protocol for work fitness].

    PubMed

    Pedata, Paola; Corvino, Anna Rita; Napolitano, Raffaele Carmine; Garzillo, Elpidio Maria; Furfaro, Ciro; Lamberti, Monica

    2016-01-20

    For many years now, thanks to the development of modern diving techniques, there has been a rapid spread of diving activities everywhere. In fact, divers are ever more numerous, both among the Armed Forces and among civilians who dive for work, such as fishing, biological research and archaeology. The aim of the study was to propose a health protocol for the work fitness of professional divers, keeping in mind the peculiarities of this work activity, the existing Italian legislation, which is almost out of date, and the technical and scientific evolution in this occupational field. We performed an analysis of the diseases occurring most frequently among professional divers and of the clinical investigation and imaging techniques used for their work fitness assessment. From the analysis of the health protocol recommended by D.M. 13 January 1979 (Ministerial Decree), the one most used by occupational health physicians, several critical issues emerged. Very often the clinical investigation and imaging techniques still used are almost obsolete, while simple and inexpensive investigations that are more useful for work fitness assessment are ignored. Considering the out-dated legislation concerning diving disciplines, it is necessary to draw up a common health protocol that takes into account the clinical and scientific knowledge and skills acquired in this area. The aim of this protocol is to offer a useful tool for occupational health physicians who work in this sector.

  19. Multivariate Quantitative Chemical Analysis

    NASA Technical Reports Server (NTRS)

    Kinchen, David G.; Capezza, Mary

    1995-01-01

    A technique of multivariate quantitative chemical analysis was devised for use in determining the relative proportions of two components mixed and sprayed together onto an object to form thermally insulating foam. It is potentially adaptable to other materials, especially in process-monitoring applications in which it is necessary to know and control critical properties of products via quantitative chemical analyses. In addition to chemical composition, it can also be used to determine such physical properties as densities and strengths.
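
    One common way to realize such multivariate calibration, used here purely as an illustration since the brief does not name a specific algorithm, is partial least squares regression on synthetic two-component "spectra"; all band shapes and mixture ratios below are invented.

      # Illustrative multivariate calibration (not the NASA method): estimate
      # two components' proportions from synthetic spectra via PLS regression.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(5)
      wavel = np.linspace(0, 1, 200)
      compA = np.exp(-((wavel - 0.3) / 0.05) ** 2)   # invented pure-component bands
      compB = np.exp(-((wavel - 0.7) / 0.08) ** 2)

      ratios = rng.uniform(0, 1, 60)                 # fraction of component A
      spectra = (ratios[:, None] * compA + (1 - ratios)[:, None] * compB
                 + 0.01 * rng.standard_normal((60, 200)))

      pls = PLSRegression(n_components=2)
      pls.fit(spectra[:40], ratios[:40])             # calibrate on 40 mixtures
      pred = pls.predict(spectra[40:]).ravel()       # predict held-out mixtures
      rmse = np.sqrt(np.mean((pred - ratios[40:]) ** 2))
      print(f"RMS error of predicted proportions: {rmse:.3f}")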

  20. Changes in Social Capital and Networks: A Study of Community-Based Environmental Management through a School-Centered Research Program

    ERIC Educational Resources Information Center

    Thornton, Teresa; Leahy, Jessica

    2012-01-01

    Social network analysis (SNA) is a social science research tool that has not been applied to educational programs. This analysis is critical to documenting the changes in social capital and networks that result from community-based K-12 educational collaborations. We review SNA and show an application of this technique in a school-centered,…

  1. An artificial bee colony algorithm for locating the critical slip surface in slope stability analysis

    NASA Astrophysics Data System (ADS)

    Kang, Fei; Li, Junjie; Ma, Zhenyue

    2013-02-01

    Determination of the critical slip surface with the minimum factor of safety of a slope is a difficult constrained global optimization problem. In this article, an artificial bee colony algorithm with a multi-slice adjustment method is proposed for locating the critical slip surfaces of soil slopes, and the Spencer method is employed to calculate the factor of safety. Six benchmark examples are presented to illustrate the reliability and efficiency of the proposed technique, and it is also compared with some well-known or recent algorithms for the problem. The results show that the new algorithm is promising in terms of accuracy and efficiency.
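
    A minimal ABC sketch is given below on a stand-in objective; the article's actual objective, the Spencer factor of safety evaluated over slip-surface parameters with the multi-slice adjustment, is not reproduced, and all population sizes and limits are arbitrary choices.

      # Minimal artificial bee colony (ABC) sketch: employed bees, onlookers,
      # and scouts minimizing a stand-in objective (here a simple sphere).
      import numpy as np

      rng = np.random.default_rng(0)

      def objective(x):              # stand-in for the factor-of-safety call
          return np.sum(x ** 2)

      dim, n_food, limit, iters = 5, 20, 30, 200
      lo, hi = -5.0, 5.0
      foods = rng.uniform(lo, hi, (n_food, dim))       # candidate solutions
      fits = np.array([objective(x) for x in foods])
      trials = np.zeros(n_food, dtype=int)             # stagnation counters

      def try_neighbour(i):
          """Perturb one dimension toward a random partner; greedy selection."""
          k = rng.integers(n_food)
          while k == i:
              k = rng.integers(n_food)
          d = rng.integers(dim)
          cand = foods[i].copy()
          cand[d] += rng.uniform(-1, 1) * (foods[i, d] - foods[k, d])
          cand[d] = np.clip(cand[d], lo, hi)
          f = objective(cand)
          if f < fits[i]:
              foods[i], fits[i], trials[i] = cand, f, 0
          else:
              trials[i] += 1

      for _ in range(iters):
          for i in range(n_food):                      # employed bee phase
              try_neighbour(i)
          p = 1 / (1 + fits)                           # onlooker phase:
          p /= p.sum()                                 # fitness-weighted picks
          for i in rng.choice(n_food, size=n_food, p=p):
              try_neighbour(i)
          for i in np.flatnonzero(trials > limit):     # scout phase
              foods[i] = rng.uniform(lo, hi, dim)
              fits[i] = objective(foods[i])
              trials[i] = 0

      best = fits.argmin()
      print("best objective:", fits[best], "at", foods[best])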

  2. Liquid chromatographic separation of terpenoid pigments in foods and food products.

    PubMed

    Cserháti, T; Forgács, E

    2001-11-30

    The newest achievements in the use of various liquid chromatographic techniques, such as adsorption and reversed-phase thin-layer chromatography and HPLC, for the separation and quantitative determination of terpenoid-based color substances in foods and food products are reviewed. The techniques applied for the analysis of individual pigments and pigment classes are surveyed and critically evaluated. Future trends in the separation and identification of pigments in foods and food products are delineated.

  3. Effect of load eccentricity on the buckling of thin-walled laminated C-columns

    NASA Astrophysics Data System (ADS)

    Wysmulski, Pawel; Teter, Andrzej; Debski, Hubert

    2018-01-01

    The study investigates the behaviour of short, thin-walled laminated C-columns under eccentric compression. The tested columns are simply supported. The effect of load inaccuracy on the critical and post-critical (local buckling) states is examined. A numerical analysis by the finite element method and experimental tests on a test stand are performed. The samples were produced from a carbon-epoxy prepreg by the autoclave technique. The experimental tests rest on the assumption that the compressive loads are 1.5 times the theoretical critical force. Numerical modelling is performed using the commercial software package ABAQUS®. The critical load is determined by solving an eigenproblem using the Subspace algorithm. The experimental critical loads are determined from post-buckling paths. The numerical and experimental results show high agreement, demonstrating a significant effect of load inaccuracy on the critical load corresponding to the column's local buckling.

  4. Radiotelemetry; techniques and analysis

    Treesearch

    Sybill K. Amelon; David C. Dalton; Joshua J. Millspaugh; Sandy A. Wolf

    2009-01-01

    Radiotelemetry has become an important tool in studies of animal behavior, ecology, management, and conservation. In the first decades following the introduction of radio transmitters, radiotelemetry emerged as a prominent and critically important tool in wildlife science for the study of physiology, animal movements (migration, dispersal, and home range), survival...

  5. Are Effective Counselors Made or Born? A Critical Review.

    ERIC Educational Resources Information Center

    DeCsipkes, Robert A.; And Others

    The purpose of this review was to investigate the relationship between counselor characteristics and reports of effectiveness. The theoretical position appears to focus on two opposing views. The humanists emphasize the influence of intuition, genuineness, and spontaneity, while the behaviorists place importance on technique, analysis of…

  6. An assessment of finite-element modeling techniques for thick-solid/thin-shell joints analysis

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Androlake, S. G.

    1993-01-01

    The subject of finite-element modeling has long been of critical importance to the practicing designer/analyst, who is often faced with obtaining an accurate and cost-effective structural analysis of a particular design. Typically, these two goals are in conflict. The purpose here is to discuss finite-element modeling of solid/shell connections (joints), which are significant for the practicing modeler. Several approaches are currently in use, but various assumptions frequently restrict their application. Techniques currently used in practical applications were tested, especially to see which is best suited to the computer-aided design (CAD) environment. Some basic thoughts regarding each technique are also discussed. As a consequence, some suggestions based on the results are given for obtaining reliable results in geometrically complex joints where the deformation and stress behavior are complicated.

  7. Dryland pasture and crop conditions as seen by HCMM. [Washita Watershed, Oklahoma

    NASA Technical Reports Server (NTRS)

    Harlan, J. C. (Principal Investigator); Rosenthal, W. D.; Blanchard, B. J.

    1981-01-01

    Techniques developed from aircraft flights over the Washita watershed in central Oklahoma were applied to HCMM data analysis. Results show that (1) canopy temperatures were accurately measured remotely; (2) pasture surface temperature differences detected relative soil moisture differences; (3) pasture surface temperature differences were related to stress in nearby wheat fields; and (4) no relationship was developed between final yield differences, thermal infrared data, and soil moisture stress at critical growth stages due to a lack of satellite thermal data at critical growth stages. The HCMM thermal data proved to be quite adequate in detecting relative moisture differences; however, with a 16 day day/night overpass frequency, more frequent overpasses are required to analyze more cases within a 7 day period after the storm. Better normalization techniques are also required.

  8. Flexible aircraft dynamic modeling for dynamic analysis and control synthesis

    NASA Technical Reports Server (NTRS)

    Schmidt, David K.

    1989-01-01

    The linearization and simplification of a nonlinear, literal model for flexible aircraft is highlighted. Areas of model fidelity that are critical if the model is to be used for control system synthesis are developed and several simplification techniques that can deliver the necessary model fidelity are discussed. These techniques include both numerical and analytical approaches. An analytical approach, based on first-order sensitivity theory is shown to lead not only to excellent numerical results, but also to closed-form analytical expressions for key system dynamic properties such as the pole/zero factors of the vehicle transfer-function matrix. The analytical results are expressed in terms of vehicle mass properties, vibrational characteristics, and rigid-body and aeroelastic stability derivatives, thus leading to the underlying causes for critical dynamic characteristics.

  9. Preparing data for analysis using microsoft Excel.

    PubMed

    Elliott, Alan C; Hynan, Linda S; Reisch, Joan S; Smith, Janet P

    2006-09-01

    A critical component essential to good research is the accurate and efficient collection and preparation of data for analysis. Most medical researchers have little or no training in data management, often causing not only excessive time spent cleaning data but also a risk that the data set contains collection or recording errors. The implementation of simple guidelines based on techniques used by professional data management teams will save researchers time and money and result in a data set better suited to answer research questions. Because Microsoft Excel is often used by researchers to collect data, specific techniques that can be implemented in Excel are presented.
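
    The flavor of such guidelines can be illustrated with simple automated range and completeness checks. The column names and plausible ranges below are invented, and in practice the frame would come from pd.read_excel on the collection sheet.

      # Illustrative data-cleaning checks (columns and ranges invented):
      # flag out-of-range and missing values before analysis.
      import pandas as pd

      # In practice: df = pd.read_excel("study_data.xlsx")  (hypothetical file)
      df = pd.DataFrame({
          "age":    [34, 61, 17, None, 52],
          "sbp":    [120, 310, 135, 128, 90],     # systolic BP, mmHg
          "weight": [70, 82, 65, 58, None],       # kg
      })

      rules = {"age": (18, 95), "sbp": (60, 260), "weight": (30, 250)}

      for col, (lo, hi) in rules.items():
          bad = df[(df[col] < lo) | (df[col] > hi)]
          if not bad.empty:
              print(f"{col}: {len(bad)} out-of-range value(s), rows {list(bad.index)}")
          n_missing = df[col].isna().sum()
          if n_missing:
              print(f"{col}: {n_missing} missing value(s)")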

  10. Optimization of binary thermodynamic and phase diagram data

    NASA Astrophysics Data System (ADS)

    Bale, Christopher W.; Pelton, A. D.

    1983-03-01

    An optimization technique based upon least squares regression is presented to permit the simultaneous analysis of diverse experimental binary thermodynamic and phase diagram data. Coefficients of polynomial expansions for the enthalpy and excess entropy of binary solutions are obtained which can subsequently be used to calculate the thermodynamic properties or the phase diagram. In an interactive computer-assisted analysis employing this technique, one can critically analyze a large number of diverse data in a binary system rapidly, in a manner which is fully self-consistent thermodynamically. Examples of applications to the Bi-Zn, Cd-Pb, PbCl2-KCl, LiCl-FeCl2, and Au-Ni binary systems are given.
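
    The idea can be sketched generically: with a Redlich-Kister-type expansion h_E = x(1-x) Σ_k a_k (1-2x)^k (one common choice of polynomial basis; the paper's exact basis may differ), the coefficients follow from linear least squares. The data and coefficient values below are invented, and the sketch fits a single property rather than the simultaneous thermodynamic/phase-diagram optimization described.

      # Fit excess-enthalpy "measurements" of a binary solution to a
      # Redlich-Kister-type polynomial by linear least squares.
      import numpy as np

      rng = np.random.default_rng(2)
      x = np.linspace(0.05, 0.95, 19)                   # mole fraction
      a_true = np.array([-8000.0, 1500.0, -300.0])      # J/mol, invented
      basis = np.stack([x * (1 - x) * (1 - 2 * x) ** k for k in range(3)], axis=1)
      h_E = basis @ a_true + 20 * rng.standard_normal(x.size)   # noisy data

      a_fit, *_ = np.linalg.lstsq(basis, h_E, rcond=None)
      print("recovered coefficients [J/mol]:", np.round(a_fit, 1))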

  11. Safety considerations in the design and operation of large wind turbines

    NASA Technical Reports Server (NTRS)

    Reilly, D. H.

    1979-01-01

    The engineering and safety techniques used to assure the reliable and safe operation of large wind turbine generators are described, using the Mod 2 Wind Turbine System Program as an example. The techniques involve a careful definition of the wind turbine's natural and operating environments, the use of proven structural design criteria and analysis techniques, an evaluation of potential failure modes and hazards, and the use of a fail-safe and redundant-component engineering philosophy. The role of an effective quality assurance program, tailored to specific hardware criticality, and the checkout and validation program developed to assure system integrity are described.

  12. New Teaching Techniques to Improve Critical Thinking. The Diaprove Methodology

    ERIC Educational Resources Information Center

    Saiz, Carlos; Rivas, Silvia F.

    2016-01-01

    The objective of this research is to ascertain whether new instructional techniques can improve critical thinking. To achieve this goal, two different instruction techniques (ARDESOS--group 1--and DIAPROVE--group 2--) were studied and a pre-post assessment of critical thinking in various dimensions such as argumentation, inductive reasoning,…

  13. Calcium supplementation improves clinical outcome in intensive care unit patients: a propensity score matched analysis of a large clinical database MIMIC-II.

    PubMed

    Zhang, Zhongheng; Chen, Kun; Ni, Hongying

    2015-01-01

    Observational studies have linked hypocalcemia with adverse clinical outcomes in critically ill patients. However, calcium supplementation has never been formally investigated for its beneficial effect in critically ill patients. Our objective was to investigate whether calcium supplementation can improve 28-day survival in adult critically ill patients. A secondary analysis of a large clinical database consisting of over 30,000 critically ill patients was performed. Multivariable analysis was performed to examine the independent association of calcium supplementation with 28-day mortality. Furthermore, a propensity score matching technique was employed to investigate the role of calcium supplementation in improving survival. There were no interventions. The primary outcome was 28-day mortality; 90-day mortality was used as the secondary outcome. A total of 32,551 adult patients, including 28,062 survivors and 4489 non-survivors (28-day mortality rate: 13.8 %), were included. Calcium supplementation was independently associated with improved 28-day mortality after adjusting for confounding variables (hazard ratio: 0.51; 95 % CI 0.47-0.56). Propensity score matching was performed, and the after-matching cohort showed well-balanced covariates. The results showed that calcium supplementation was associated with improved 28- and 90-day mortality (p < 0.05 for both log-rank tests). In adult critically ill patients, calcium supplementation during the ICU stay improved 28-day survival. This finding supports the use of calcium supplementation in critically ill patients.
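
    A minimal propensity-score-matching sketch on synthetic data is shown below (MIMIC-II itself is access-controlled, and all covariates, effect sizes, and sample sizes are invented): fit a logistic propensity model, match each treated patient to the nearest-propensity control, and compare outcomes in the matched cohort.

      # Propensity score matching on synthetic data (illustrative only).
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      n = 5000
      X = rng.standard_normal((n, 3))                    # invented covariates
      p_treat = 1 / (1 + np.exp(-(X[:, 0] - 0.5 * X[:, 1])))
      treated = rng.random(n) < p_treat                  # confounded "treatment"
      # Invented outcome: covariates raise risk, treatment lowers it
      p_death = 1 / (1 + np.exp(-(-1.5 + X.sum(axis=1) - 0.7 * treated)))
      died = rng.random(n) < p_death

      ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

      # 1:1 nearest-neighbour matching on the propensity score (with replacement)
      t_idx, c_idx = np.flatnonzero(treated), np.flatnonzero(~treated)
      dist = np.abs(ps[c_idx][None, :] - ps[t_idx][:, None])
      matches = c_idx[dist.argmin(axis=1)]

      print("naive mortality diff:  ", died[treated].mean() - died[~treated].mean())
      print("matched mortality diff:", died[t_idx].mean() - died[matches].mean())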

  14. Recent trends in analytical methods and separation techniques for drugs of abuse in hair.

    PubMed

    Baciu, T; Borrull, F; Aguilar, C; Calull, M

    2015-01-26

    Hair analysis for drugs of abuse has been a subject of growing interest from clinical, social and forensic perspectives for years because of its broad detection time window after intake in comparison to urine and blood analysis. Over the last few years, hair analysis has gained increasing attention and recognition for the retrospective investigation of drug abuse in a wide variety of contexts, as shown by the large number of applications developed. This review aims to provide an overview of the state of the art and the latest trends reported in the literature from 2005 to the present in the analysis of drugs of abuse in hair, with a special focus on separation analytical techniques and their hyphenation with mass spectrometry detection. The most recently introduced sample preparation techniques are also addressed in this paper. The main strengths and weaknesses of all of these approaches are critically discussed by means of relevant applications. Copyright © 2014 Elsevier B.V. All rights reserved.

  15. Development of critical attitude in fundamentals of professional care discipline: a case study.

    PubMed

    Waterkemper, Roberta; do Prado, Marta Lenise; Medina, Jose Luis Moya; Reibnitz, Kenya Schmidt

    2014-04-01

    This is a qualitative case study to identify the contributions of a critical pedagogical technique to developing critical attitudes in graduating nursing students in Brazil. Fourteen students participated in this study. Data were collected from March to August 2010 using triangulation of non-participant observation, interviews and document analysis. The collected data were transcribed to Word documents, which were subsequently imported into ATLAS.ti, version 6.2, for organisation and qualitative data analysis. The analysis was based on the work of Minayo (2010). The following three thematic analysis units were constructed: feeling free (seeking the liberty to learn to admire); admiring by curiosity; and reflecting about the admired object. The results of the thematic categories reveal that the students understand that they are free to have an active role in their education, and the teacher facilitates this process; thus, the students have a raison d'être, or reason for being, free. Feeling free, the students can exercise their curiosity when facing the given situations and topics, which challenges them to make decisions based on their awareness of the world. © 2013.

  16. Eigenvalue Contributon Estimator for Sensitivity Calculations with TSUNAMI-3D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Williams, Mark L

    2007-01-01

    Since the release of the Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) codes in SCALE [1], the use of sensitivity and uncertainty analysis techniques for criticality safety applications has greatly increased within the user community. In general, sensitivity and uncertainty analysis is transitioning from a technique used only by specialists to a practical tool in routine use. With the desire to use the tool more routinely comes the need to improve the solution methodology to reduce the input and computational burden on the user. This paper reviews the current solution methodology of the Monte Carlo eigenvalue sensitivity analysis sequence TSUNAMI-3D, describes an alternative approach, and presents results from both methodologies.

  17. Statistical analysis of RHIC beam position monitors performance

    NASA Astrophysics Data System (ADS)

    Calaga, R.; Tomás, R.

    2004-04-01

    A detailed statistical analysis of beam position monitors (BPM) performance at RHIC is a critical factor in improving regular operations and future runs. Robust identification of malfunctioning BPMs plays an important role in any orbit or turn-by-turn analysis. Singular value decomposition and Fourier transform methods, which have evolved as powerful numerical techniques in signal processing, will aid in such identification from BPM data. This is the first attempt at RHIC to use a large set of data to statistically enhance the capability of these two techniques and determine BPM performance. A comparison from run 2003 data shows striking agreement between the two methods and hence can be used to improve BPM functioning at RHIC and possibly other accelerators.
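
    As a schematic of the SVD step (synthetic turn-by-turn data; the threshold is illustrative, not the RHIC procedure's exact criterion), a malfunctioning BPM can be flagged when a spatial singular vector is dominated by a single monitor:

      import numpy as np

      rng = np.random.default_rng(1)
      n_turns, n_bpms = 1024, 160
      turn = np.arange(n_turns)

      # Synthetic turn-by-turn data: one coherent betatron oscillation seen by
      # all BPMs (with varying amplitude) plus small instrumental noise.
      tune = 0.22
      data = np.outer(np.sin(2 * np.pi * tune * turn),
                      rng.normal(1.0, 0.1, n_bpms))
      data += 0.01 * rng.normal(size=(n_turns, n_bpms))
      data[:, 17] = rng.normal(scale=5.0, size=n_turns)  # a faulty, noisy BPM

      # SVD of the column-centered data matrix.
      u, s, vt = np.linalg.svd(data - data.mean(axis=0), full_matrices=False)

      # A spatial singular vector concentrated on a single BPM flags a monitor
      # whose signal is uncorrelated with the beam, i.e. a likely malfunction.
      localization = np.abs(vt).max(axis=1)      # per-mode peak BPM weight
      bad_modes = np.where(localization > 0.9)[0]
      suspects = sorted({int(np.abs(vt[m]).argmax()) for m in bad_modes})
      print("suspect BPM indices:", suspects)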

  18. Modern Computational Techniques for the HMMER Sequence Analysis

    PubMed Central

    2013-01-01

    This paper focuses on the latest research and critical reviews on modern computing architectures, software and hardware accelerated algorithms for bioinformatics data analysis, with an emphasis on one of the most important sequence analysis applications: hidden Markov models (HMM). We show a detailed performance comparison of sequence analysis tools on various computing platforms recently developed in the bioinformatics community. The characteristics of sequence analysis, such as its data- and compute-intensive nature, make it very attractive to optimize and parallelize using both traditional software approaches and innovative hardware acceleration technologies. PMID:25937944
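
    HMMER's accelerated kernels are far more elaborate, but the recurrence they optimize is the classic forward pass; a toy NumPy version (hypothetical two-state model over a three-symbol alphabet) for reference:

      import numpy as np

      # Toy HMM: transition matrix A, emission matrix B, initial distribution pi.
      A = np.array([[0.9, 0.1],
                    [0.2, 0.8]])
      B = np.array([[0.7, 0.2, 0.1],   # state 0 emission probabilities
                    [0.1, 0.3, 0.6]])  # state 1 emission probabilities
      pi = np.array([0.5, 0.5])

      def forward_loglik(obs):
          """Sequence log-likelihood via the scaled forward algorithm."""
          alpha = pi * B[:, obs[0]]
          c = alpha.sum()
          alpha /= c
          loglik = np.log(c)
          for o in obs[1:]:
              alpha = (alpha @ A) * B[:, o]  # propagate, then weight by emission
              c = alpha.sum()
              alpha /= c                     # rescale to avoid underflow
              loglik += np.log(c)
          return loglik

      print(forward_loglik([0, 0, 1, 2, 2, 1, 0]))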

  19. Get it together: Issues that facilitate collaboration in teams of learners in intensive care.

    PubMed

    Conte, Helen; Jirwe, Maria; Scheja, Max; Hjelmqvist, Hans

    2016-05-01

    The study describes issues that facilitate collaboration in teams of learners in an interprofessional education unit in intensive care. A descriptive qualitative study design was applied, using semi-structured interviews based on the critical incident technique and qualitative content analysis. Nineteen participants in Sweden (eight learners in their specialist training, nine supervisors and two head supervisors) identified 47 incidents. Teams of learners having control was the core issue. Motivation, time, experiences and reflection were central issues for facilitating collaboration. Training teams to collaborate efficiently requires that learners have control while acting on their common understanding, and that supervisors take a facilitating role, supporting teams to take control of their critical analysis.

  20. Systemic Analysis Approaches for Air Transportation

    NASA Technical Reports Server (NTRS)

    Conway, Sheila

    2005-01-01

    Air transportation system designers have had only limited success using traditional operations research and parametric modeling approaches in their analyses of innovations. They need a systemic methodology for modeling safety-critical infrastructure that is comprehensive, objective, and sufficiently concrete, yet simple enough to be used with reasonable investment. The methodology must also be amenable to quantitative analysis so that issues of system safety and stability can be rigorously addressed. However, air transportation has proven itself an extensive, complex system whose behavior is difficult to describe, much less predict. There is a wide range of system analysis techniques available, but some are more appropriate for certain applications than others. Specifically in the area of complex system analysis, the literature suggests that both agent-based models and network analysis techniques may be useful. This paper discusses the theoretical basis for each approach in these applications, and explores their historic and potential further use for air transportation analysis.

  1. Issues and Problems Related to the Research on Teenage Fathers: A Critical Analysis.

    ERIC Educational Resources Information Center

    Robinson, Bryan E.; Barret, Robert L.

    1982-01-01

    Research on teenage pregnancies has usually neglected young fathers. Most studies ignore fathers entirely, or use data confused or confounded by biased reporting techniques. Retrospective and post hoc analyses often produce inaccurate conclusions, and subject samples often are unrepresentative. A closer examination of adolescent fathers is…

  2. The Time-Out and Seclusion Continuum: A Systematic Analysis of the Case Law

    ERIC Educational Resources Information Center

    Bon, Susan C.; Zirkel, Perry A.

    2014-01-01

    During the past two decades, scholars, educators, and special interest organizations, including advocacy groups, have critically examined and debated the ethical and legal use of aversive interventions with individuals with disabilities. These interventions comprise a broad spectrum of behavior management techniques including but not at all…

  3. A Writing-Intensive, Methods-Based Laboratory Course for Undergraduates

    ERIC Educational Resources Information Center

    Colabroy, Keri L.

    2011-01-01

    Engaging undergraduate students in designing and executing original research should not only be accompanied by technique training but also intentional instruction in the critical analysis and writing of scientific literature. The course described here takes a rigorous approach to scientific reading and writing using primary literature as the model…

  4. Quantified Academic Selves: The Gamification of Research through Social Networking Services

    ERIC Educational Resources Information Center

    Hammarfelt, Björn; de Rijcke, Sarah; Rushforth, Alexander D.

    2016-01-01

    Introduction: Our study critically engages with techniques of self-quantification in contemporary academia, by demonstrating how social networking services enact research and scholarly communication as a "game". Method: The empirical part of the study involves an analysis of two leading platforms: Impactstory and ResearchGate. Observed…

  5. Peer Coaching as a Technique To Foster Professional Development in Clinical Ambulatory Settings.

    ERIC Educational Resources Information Center

    Sekerka, Leslie E.; Chao, Jason

    2003-01-01

    Thematic analysis of critical incident interviews with 13 physician coaches yielded two orientations to coaching: reflection/teaching coaches focused on others and described positive encounters experienced in coaching; personal learning and change coaches identified more personal benefits from the experience. (Contains 31 references.) (SK)

  6. New Pathways for Teaching Chemistry: Reflective Judgment in Science.

    ERIC Educational Resources Information Center

    Finster, David C.

    1992-01-01

    The reflective judgment model offers a rich context for analysis of science and science teaching. It provides deeper understanding of the scientific process and its critical thinking and reveals fundamental connections between science and the other liberal arts. Classroom techniques from a college chemistry course illustrate the utility of the…

  7. Here We Go Round the M25

    ERIC Educational Resources Information Center

    McCartney, Mark; Walsh, Ian

    2006-01-01

    A simple model for how traffic moves around a closed loop of road is introduced. The consequent analysis of the model can be used as an application of techniques taught at first year undergraduate level, and as a motivator to encourage students to think critically about model formulation and interpretation.

  8. The Use of Research Libraries: A Comment about the Pittsburgh Study and Its Critics.

    ERIC Educational Resources Information Center

    Peat, W. Leslie

    1981-01-01

    Reviews the controversy surrounding the Pittsburgh study of library circulation and collection usage and proposes the use of citation analysis techniques as an acceptable method for measuring research use of a research library which will complement circulation studies. Five references are listed. (RAA)

  9. Shifting Discourses in Teacher Education: Performing the Advocate Bilingual Teacher

    ERIC Educational Resources Information Center

    Caldas, Blanca

    2017-01-01

    This article analyses the co-construction of the Bilingual teacher as advocate among preservice Bilingual teachers, through the use of narratives drawn from actual stories of Bilingual teachers, by means of drama-based pedagogy inspired by Theater of the Oppressed techniques. This study uses critical discourse analysis and Bakhtinian…

  10. Contour metrology using critical dimension atomic force microscopy

    NASA Astrophysics Data System (ADS)

    Orji, Ndubuisi G.; Dixson, Ronald G.; Vladár, András E.; Ming, Bin; Postek, Michael T.

    2012-03-01

    The critical dimension atomic force microscope (CD-AFM), which is used as a reference instrument in lithography metrology, has been proposed as a complementary instrument for contour measurement and verification. Although data from CD-AFM are inherently three dimensional, the planar two-dimensional data required for contour metrology are not easily extracted from the top-down CD-AFM data. This is largely due to the limitations of the CD-AFM method for controlling the tip position and scanning. We describe scanning techniques and profile extraction methods to obtain contours from CD-AFM data. We also describe how we validated our technique and explain some of its limitations. Potential sources of error for this approach are described, and a rigorous uncertainty model is presented. Our objective is to show which data acquisition and analysis methods could yield optimum contour information while preserving some of the strengths of CD-AFM metrology. We present a comparison of contours extracted using our technique with those obtained from the scanning electron microscope (SEM) and the helium ion microscope (HIM).

  11. Assessing the validity of discourse analysis: transdisciplinary convergence

    NASA Astrophysics Data System (ADS)

    Jaipal-Jamani, Kamini

    2014-12-01

    Research studies using discourse analysis approaches make claims about phenomena or issues based on the interpretation of written or spoken text, which includes images and gestures. How are findings/interpretations from discourse analysis validated? This paper proposes transdisciplinary convergence as a way to validate discourse analysis approaches to research. The argument is made that discourse analysis explicitly grounded in semiotics, systemic functional linguistics, and critical theory offers a credible research methodology. The underlying assumptions, constructs, and techniques of analysis of these three theoretical disciplines can be drawn on to show convergence of data at multiple levels, validating interpretations from text analysis.

  12. Analysis of small crack behavior for airframe applications

    NASA Technical Reports Server (NTRS)

    Mcclung, R. C.; Chan, K. S.; Hudak, S. J., Jr.; Davidson, D. L.

    1994-01-01

    The small fatigue crack problem is critically reviewed from the perspective of airframe applications. Different types of small cracks (microstructural, mechanical, and chemical) are carefully defined and relevant mechanisms identified. Appropriate analysis techniques, including both rigorous scientific and practical engineering treatments, are briefly described. Important materials data issues are addressed, including increased scatter in small crack data and recommended small crack test methods. Key problems requiring further study are highlighted.

  13. Technologies for Arsenic Removal from Water: Current Status and Future Perspectives

    PubMed Central

    Nicomel, Nina Ricci; Leus, Karen; Folens, Karel; Van Der Voort, Pascal; Du Laing, Gijs

    2015-01-01

    This review paper presents an overview of the technologies available nowadays for the removal of arsenic species from water. Conventionally applied techniques to remove arsenic species include oxidation, coagulation-flocculation, and membrane techniques. In addition, progress has recently been made on the utility of various nanoparticles for the remediation of contaminated water. A critical analysis of the most widely investigated nanoparticles is presented, and promising future research on novel porous materials, such as metal organic frameworks, is suggested. PMID:26703687

  14. Technologies for Arsenic Removal from Water: Current Status and Future Perspectives.

    PubMed

    Nicomel, Nina Ricci; Leus, Karen; Folens, Karel; Van Der Voort, Pascal; Du Laing, Gijs

    2015-12-22

    This review paper presents an overview of the technologies available nowadays for the removal of arsenic species from water. Conventionally applied techniques to remove arsenic species include oxidation, coagulation-flocculation, and membrane techniques. In addition, progress has recently been made on the utility of various nanoparticles for the remediation of contaminated water. A critical analysis of the most widely investigated nanoparticles is presented, and promising future research on novel porous materials, such as metal organic frameworks, is suggested.

  15. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    NASA Astrophysics Data System (ADS)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, used for the design, analysis, and optimization of process plants, has features that accommodate these special requirements and can therefore be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of various components such as heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for maximizing the liquefaction yield of the plant subject to constraints on other parameters. The analysis results give a clear idea of appropriate parameter values before implementation of the actual plant in the field, as well as of the productivity and profitability of the given plant configuration, leading to the design of an efficient, productive plant.
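
    Outside a full flowsheet simulator, the headline quantity of a Linde-Hampson liquefier can be estimated from a textbook energy balance; a small sketch with placeholder enthalpies (illustrative numbers, not property data or HYSYS output):

      # Energy balance around the heat exchanger + valve + separator of an ideal
      # Linde-Hampson cycle: y = (h1 - h2) / (h1 - hf), where
      #   h1 = enthalpy of low-pressure gas at the warm end of the exchanger,
      #   h2 = enthalpy of high-pressure gas at the warm end of the exchanger,
      #   hf = enthalpy of the saturated liquid drawn off.
      def linde_liquid_yield(h1, h2, hf):
          return (h1 - h2) / (h1 - hf)

      # Placeholder enthalpies in kJ/kg (illustrative only, not property data):
      y = linde_liquid_yield(h1=460.0, h2=430.0, hf=29.0)
      print(f"liquid yield per kg of gas compressed: {y:.3f}")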

  16. The role of student’s critical asking question in developing student’s critical thinking skills

    NASA Astrophysics Data System (ADS)

    Santoso, T.; Yuanita, L.; Erman, E.

    2018-01-01

    Questioning means thinking, and thinking is manifested in the form of questions. Research that studies the relationship between questioning and students' critical thinking skills is scarce. The aim of this study is to examine how students' questioning skill correlates with their critical thinking skills in the learning of chemistry. The research design used was a one-group pretest-posttest design. The participants were 94 students, all in their last semester, from Chemistry Education at Tadulako University. A pre-test was administered to check participants' ability to ask critical questions and their critical thinking skills in learning chemistry. Then, the students were taught using a questioning technique. After completing the lesson, a post-test was given to evaluate their progress. The data obtained were analyzed using paired-samples t-tests and correlation methods. The results show that question level plays an important role in critical thinking skills, in particular questions at the predictive, analysis, evaluation and inference levels.
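
    A minimal sketch of the pre/post comparison described here (synthetic scores for 94 students, not the study's data) using a paired-samples t-test and a correlation in SciPy:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(42)
      pre = rng.normal(60, 10, size=94)        # synthetic pre-test scores
      post = pre + rng.normal(5, 6, size=94)   # synthetic post-test scores

      t, p = stats.ttest_rel(post, pre)        # paired-samples t-test
      r, p_r = stats.pearsonr(pre, post)       # correlation between measures
      print(f"t = {t:.2f}, p = {p:.4f}; r = {r:.2f}")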

  17. Surface contamination analysis technology team overview

    NASA Technical Reports Server (NTRS)

    Burns, H. Dewitt

    1995-01-01

    A team was established which consisted of representatives from NASA (Marshall Space Flight Center and Langley Research Center), Thiokol Corporation, the University of Alabama in Huntsville, AC Engineering, SAIC, Martin Marietta, and Aerojet. The team's purpose was to bring together the appropriate personnel to determine what surface inspection techniques were applicable to multiprogram bonding surface cleanliness inspection. In order to identify appropriate techniques and their sensitivity to various contaminant families, calibration standards were developed. Producing standards included development of consistent low level contamination application techniques. Oxidation was also considered for effect on inspection equipment response. Ellipsometry was used for oxidation characterization. Verification testing was then accomplished to show that selected inspection techniques could detect subject contaminants at levels found to be detrimental to critical bond systems of interest. Once feasibility of identified techniques was shown, selected techniques and instrumentation could then be incorporated into a multipurpose inspection head and integrated with a robot for critical surface inspection. Inspection techniques currently being evaluated include optically stimulated electron emission (OSEE); near infrared (NIR) spectroscopy utilizing fiber optics; Fourier transform infrared (FTIR) spectroscopy; and ultraviolet (UV) fluorescence. Current plans are to demonstrate an integrated system in MSFC's Productivity Enhancement Complex within five years from initiation of this effort in 1992 assuming appropriate funding levels are maintained. This paper gives an overview of work accomplished by the team and future plans.

  18. The Potential of Sequential Extraction in the Characterisation and Management of Wastes from Steel Processing: A Prospective Review

    PubMed Central

    Rodgers, Kiri J.; Hursthouse, Andrew; Cuthbert, Simon

    2015-01-01

    As waste management regulations become more stringent, yet demand for resources continues to increase, there is a pressing need for innovative management techniques and more sophisticated supporting analysis techniques. Sequential extraction (SE) analysis, a technique previously applied to soils and sediments, offers the potential to gain a better understanding of the composition of solid wastes. SE attempts to classify potentially toxic elements (PTEs) by their associations with phases or fractions in waste, with the aim of improving resource use and reducing negative environmental impacts. In this review we explain how SE can be applied to steel wastes. These present challenges due to differences in sample characteristics compared with the materials to which SE has traditionally been applied, specifically chemical composition, particle size and pH buffering capacity, which are critical when identifying a suitable SE method. We highlight the importance of delineating iron-rich phases, and find that the commonly applied BCR (the Community Bureau of Reference) extraction method is problematic due to difficulties with zinc speciation (a critical steel waste constituent); hence a substantially modified sequential extraction procedure (SEP) is necessary to deal with the particular characteristics of steel wastes. Successful development of SE for steel wastes could have wider implications, e.g., for the sustainable management of fly ash and mining wastes. PMID:26393631

  19. The Potential of Sequential Extraction in the Characterisation and Management of Wastes from Steel Processing: A Prospective Review.

    PubMed

    Rodgers, Kiri J; Hursthouse, Andrew; Cuthbert, Simon

    2015-09-18

    As waste management regulations become more stringent, yet demand for resources continues to increase, there is a pressing need for innovative management techniques and more sophisticated supporting analysis techniques. Sequential extraction (SE) analysis, a technique previously applied to soils and sediments, offers the potential to gain a better understanding of the composition of solid wastes. SE attempts to classify potentially toxic elements (PTEs) by their associations with phases or fractions in waste, with the aim of improving resource use and reducing negative environmental impacts. In this review we explain how SE can be applied to steel wastes. These present challenges due to differences in sample characteristics compared with the materials to which SE has traditionally been applied, specifically chemical composition, particle size and pH buffering capacity, which are critical when identifying a suitable SE method. We highlight the importance of delineating iron-rich phases, and find that the commonly applied BCR (the Community Bureau of Reference) extraction method is problematic due to difficulties with zinc speciation (a critical steel waste constituent); hence a substantially modified sequential extraction procedure (SEP) is necessary to deal with the particular characteristics of steel wastes. Successful development of SE for steel wastes could have wider implications, e.g., for the sustainable management of fly ash and mining wastes.

  20. Development of Measurement Methods for Detection of Special Nuclear Materials using D-D Pulsed Neutron Source

    NASA Astrophysics Data System (ADS)

    Misawa, Tsuyoshi; Takahashi, Yoshiyuki; Yagi, Takahiro; Pyeon, Cheol Ho; Kimura, Masaharu; Masuda, Kai; Ohgaki, Hideaki

    2015-10-01

    For the detection of hidden special nuclear materials (SNMs), we have developed an active neutron-based interrogation system combining a D-D fusion pulsed neutron source and a neutron detection system. In the detection scheme, we have adopted two measurement techniques simultaneously: neutron noise analysis and neutron energy spectrum analysis. The validity of the neutron noise analysis method has been experimentally studied in the Kyoto University Critical Assembly (KUCA) and was applied to a cargo container inspection system by simulation.

  1. Fault Tree Analysis Application for Safety and Reliability

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

    Many commercial software tools exist for fault tree analysis (FTA), an accepted method for mitigating risk in systems. The method embedded in the tools identifies root causes in system components, but when software is identified as a root cause, it does not build trees into the software component. No commercial software tools have been built specifically for the development and analysis of software fault trees. Research indicates that the methods of FTA could be applied to software, but the method is not practical without automated tool support. With appropriate automated tool support, software fault tree analysis (SFTA) may be a practical technique for identifying the underlying cause of software faults that may lead to critical system failures. We strive to demonstrate that existing commercial tools for FTA can be adapted for use with SFTA, and that, applied to a safety-critical system, SFTA can be used to identify serious potential problems long before integration and system testing.

  2. Giant increase in critical current density of KxFe2-ySe₂ single crystals

    DOE PAGES

    Lei, Hechang; Petrovic, C.

    2011-12-28

    The critical current density Jc^ab of KxFe2-ySe₂ single crystals can be enhanced by more than one order of magnitude, up to ~2.1×10⁴ A/cm², by a post-annealing and quenching technique. A scaling analysis reveals universal behavior of the normalized pinning force as a function of the reduced field for all temperatures, indicating the presence of a single vortex pinning mechanism. The main pinning sources are three-dimensional (3D) point-like normal cores. The dominant vortex interaction with pinning centers is via spatial variations in the critical temperature Tc ("δTc pinning").

  3. Quantitative analysis and feature recognition in 3-D microstructural data sets

    NASA Astrophysics Data System (ADS)

    Lewis, A. C.; Suh, C.; Stukowski, M.; Geltmacher, A. B.; Spanos, G.; Rajan, K.

    2006-12-01

    A three-dimensional (3-D) reconstruction of an austenitic stainless-steel microstructure was used as input for an image-based finite-element model to simulate the anisotropic elastic mechanical response of the microstructure. The quantitative data-mining and data-warehousing techniques used to correlate regions of high stress with critical microstructural features are discussed. Initial analysis of elastic stresses near grain boundaries due to mechanical loading revealed a low overall correlation between stress and boundary location in the microstructure. However, the use of data-mining and feature-tracking techniques to identify high-stress outliers revealed that many of these high-stress points are generated near grain boundaries and grain edges (triple junctions). These techniques also allowed for differentiation between high stresses due to the boundary conditions of the finite volume reconstructed and those due to 3-D microstructural features.
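
    A schematic of the outlier-mining step (synthetic stress and distance fields; the three-sigma cut is illustrative, not the study's criterion):

      import numpy as np

      rng = np.random.default_rng(7)
      n = 200_000
      # Synthetic per-element distance to the nearest grain boundary (um) and a
      # stress field mildly elevated near boundaries, plus scatter.
      dist_to_gb = rng.exponential(5.0, n)
      stress = (rng.normal(300.0, 30.0, n)
                + 120.0 * np.exp(-dist_to_gb / 2.0) * rng.random(n))

      # Flag high-stress outliers above mean + 3 sigma, then compare their
      # boundary distances with the population as a whole.
      cut = stress.mean() + 3.0 * stress.std()
      outliers = stress > cut
      print(f"{outliers.sum()} outliers above {cut:.0f} MPa")
      print("median distance to boundary (outliers vs. all):",
            np.median(dist_to_gb[outliers]), np.median(dist_to_gb))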

  4. V and V of ISHM Software for Space Exploration

    NASA Technical Reports Server (NTRS)

    Markosian, Lawrence; Feather, Martin S.; Brinza, David; Figueroa, F.

    2005-01-01

    NASA has established a far-reaching and long-term program for robotic and manned exploration of the solar system, beginning with missions to the moon and Mars. The Crew Transportation System (CTS), a key system for space exploration, imposes four requirements that ISHM addresses. These requirements have a wide range of implications for V&V and certification of ISHM. There is a range of time-criticality for ISHM actions, from prognostication, which is often (but not always) non-time-critical, to time-critical state estimation and system management under off-nominal emergency conditions. These are externally imposed requirements on ISHM that are subject to V&V. In addition, a range of techniques is needed to implement an ISHM. The approaches to ISHM are described elsewhere. These approaches range from well-understood algorithms for low-level data analysis, validation and reporting, to AI techniques for state estimation and planning. The range of techniques, and specifically the use of AI techniques such as reasoning under uncertainty and mission planning (and re-planning), implies that several V&V approaches may be required. Depending on the ISHM architecture, traditional testing approaches may be adequate for some ISHM functionality. The AI-based approaches to reasoning under uncertainty, model-based reasoning, and planning share characteristics typical of other complex software systems, but they also have characteristics that set them apart and challenge standard V&V techniques. The range of possible solutions to the overall ISHM problem imposes internal challenges for V&V. The V&V challenges increase when hard real-time constraints are imposed for time-critical functionality. For example, there is an external requirement that impending catastrophic failure of the Launch Vehicle (LV) at launch time be detected and life-saving action be taken within two seconds. In this paper we outline the challenges for ISHM V&V, existing approaches and analogs in other software application areas, and possible new approaches to the V&V challenges for space exploration ISHM.

  5. Techniques of Neutralising Wildlife Crime in Rural England and Wales

    ERIC Educational Resources Information Center

    Enticott, Gareth

    2011-01-01

    Within rural studies there have been few attempts to critically analyse crimes against nature. This paper addresses this gap by providing an analysis of farmers' reasons for illegally culling badgers in the United Kingdom. Drawing on Sykes and Matza's (1957) concepts of neutralisation and drift, the paper shows how farmers rationalise this…

  6. The Critical Success Factor Method: Establishing a Foundation for Enterprise Security Management

    DTIC Science & Technology

    2004-07-01

    SWOT analysis is a commonly used strategic planning technique; it identifies... [Recovered from the report's front matter: Figure 11, Relationship Between Enterprise and Operational Unit CSFs; Figure 12, Affinity Analysis for Determining ISRM Scope; Figure 13, Affinity Analysis for Determining Critical Assets.]

  7. Critical assessment of inverse gas chromatography as means of assessing surface free energy and acid-base interaction of pharmaceutical powders.

    PubMed

    Telko, Martin J; Hickey, Anthony J

    2007-10-01

    Inverse gas chromatography (IGC) has been employed as a research tool for decades. Despite this record of use and proven utility in a variety of applications, the technique is not routinely used in pharmaceutical research. In other fields the technique has flourished. IGC is experimentally relatively straightforward, but analysis requires that certain theoretical assumptions are satisfied. The assumptions made to acquire some of the recently reported data are somewhat modified compared with the initial reports. Most publications in the pharmaceutical literature have made use of a simplified equation for the determination of acid/base surface properties, resulting in parameter values that are inconsistent with prior methods. In comparing the surface properties of different batches of alpha-lactose monohydrate, new data have been generated and compared with the literature to allow critical analysis of the theoretical assumptions and their importance to the interpretation of the data. The commonly used (simplified) approach was compared with the more rigorous approach originally outlined in the surface chemistry literature. (c) 2007 Wiley-Liss, Inc.

  8. Testing of Safety-Critical Software Embedded in an Artificial Heart

    NASA Astrophysics Data System (ADS)

    Cha, Sungdeok; Jeong, Sehun; Yoo, Junbeom; Kim, Young-Gab

    Software is being used more frequently to control medical devices such as artificial hearts or robotic surgery systems. While many of the software safety issues in such systems are similar to those in other safety-critical systems (e.g., nuclear power plants), domain-specific properties may warrant the development of customized techniques to demonstrate fitness of the system for use on patients. In this paper, we report results of a preliminary analysis done on software controlling a Hybrid Ventricular Assist Device (H-VAD) developed by the Korea Artificial Organ Centre (KAOC). It is a state-of-the-art artificial heart which has completed its animal testing phase. We performed software testing in in-vitro experiments and animal experiments. An abnormal behaviour, never detected during extensive in-vitro analysis and animal testing, was found.

  9. Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders

    NASA Technical Reports Server (NTRS)

    Lovejoy, Andrew E.; Schultz, Marc R.

    2012-01-01

    Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.
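
    The transverse-shear effect highlighted here can be illustrated with the classical first-order shear correction to the Euler column load (Engesser form); the section properties below are placeholders, not those of the test article:

      import math

      # Euler buckling load of a pinned-pinned column, then the first-order
      # correction for transverse-shear flexibility:
      #   P_euler = pi^2 * E * I / L^2
      #   P_shear = P_euler / (1 + P_euler / (k * G * A))
      def buckling_with_shear(E, I, L, k, G, A):
          p_euler = math.pi**2 * E * I / L**2
          return p_euler, p_euler / (1 + p_euler / (k * G * A))

      # Placeholder properties for a shear-flexible sandwich-like section:
      p_e, p_s = buckling_with_shear(E=70e9, I=2e-4, L=5.0,
                                     k=0.5, G=5e8, A=3e-3)
      print(f"Euler: {p_e/1e6:.2f} MN, with shear correction: {p_s/1e6:.2f} MN")

    Even with invented numbers, the sketch makes the abstract's point: ignoring transverse shear can overpredict the global buckling load by a large margin.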

  10. Computational techniques for ECG analysis and interpretation in light of their contribution to medical advances

    PubMed Central

    Mincholé, Ana; Martínez, Juan Pablo; Laguna, Pablo; Rodriguez, Blanca

    2018-01-01

    Widely developed for clinical screening, electrocardiogram (ECG) recordings capture the cardiac electrical activity from the body surface. ECG analysis can therefore be a crucial first step to help diagnose, understand and predict cardiovascular disorders responsible for 30% of deaths worldwide. Computational techniques, and more specifically machine learning techniques and computational modelling are powerful tools for classification, clustering and simulation, and they have recently been applied to address the analysis of medical data, especially ECG data. This review describes the computational methods in use for ECG analysis, with a focus on machine learning and 3D computer simulations, as well as their accuracy, clinical implications and contributions to medical advances. The first section focuses on heartbeat classification and the techniques developed to extract and classify abnormal from regular beats. The second section focuses on patient diagnosis from whole recordings, applied to different diseases. The third section presents real-time diagnosis and applications to wearable devices. The fourth section highlights the recent field of personalized ECG computer simulations and their interpretation. Finally, the discussion section outlines the challenges of ECG analysis and provides a critical assessment of the methods presented. The computational methods reported in this review are a strong asset for medical discoveries and their translation to the clinical world may lead to promising advances. PMID:29321268
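
    As a hedged sketch of the signal-processing layer that heartbeat classification builds on (crude synthetic ECG; the filter band and thresholds are illustrative), R-peaks can be located with SciPy:

      import numpy as np
      from scipy.signal import butter, filtfilt, find_peaks

      fs = 250.0                                  # sampling rate, Hz
      t = np.arange(0, 10, 1 / fs)

      # Crude synthetic ECG: periodic sharp "R waves" + baseline wander + noise.
      rr = 0.8                                    # beat interval, s
      ecg = sum(np.exp(-((t - k * rr) ** 2) / (2 * 0.006**2))
                for k in range(1, 13))
      ecg += 0.2 * np.sin(2 * np.pi * 0.3 * t)
      ecg += 0.02 * np.random.default_rng(3).normal(size=t.size)

      # A 5-15 Hz band-pass emphasizes the QRS complex over wander and T waves.
      b, a = butter(3, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
      filt = filtfilt(b, a, ecg)

      peaks, _ = find_peaks(filt, height=0.3 * filt.max(), distance=int(0.3 * fs))
      print("detected beats:", len(peaks), "mean HR (bpm):",
            60 * fs / np.mean(np.diff(peaks)))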

  11. The effect of ion plated silver and sliding friction on tensile stress-induced cracking in aluminum oxide

    NASA Technical Reports Server (NTRS)

    Sliney, Harold E.; Spalvins, Talivaldis

    1991-01-01

    A Hertzian analysis of the effect of sliding friction on contact stresses in alumina is used to predict the critical load for crack generation. The results for uncoated alumina and alumina coated with ion-plated silver are compared. Friction coefficient inputs to the analysis are determined experimentally with a scratch test instrument employing a 0.2 mm radius diamond stylus. A series of scratches were made at constant load increments on coated and uncoated flat alumina surfaces. Critical loads for cracking are detected by microscopic examination of cross sections of scratches made at various loads and friction coefficients. Acoustic emission (AE) and friction trends were also evaluated as experimental techniques for determining critical loads for cracking. Analytical predictions correlate well with micrographic evidence and with the lowest load at which AE is detected in multiple scratch tests. Friction/load trends are not good indicators of early crack formation. Lubrication with silver films reduced friction and thereby increased the critical load for crack initiation, in agreement with analytical predictions.

  12. The effect of ion-plated silver and sliding friction on tensile stress-induced cracking in aluminum oxide

    NASA Technical Reports Server (NTRS)

    Sliney, Harold E.; Spalvins, Talivaldis

    1993-01-01

    A Hertzian analysis of the effect of sliding friction on contact stresses in alumina is used to predict the critical load for crack generation. The results for uncoated alumina and alumina coated with ion-plated silver are compared. Friction coefficient inputs to the analysis are determined experimentally with a scratch test instrument employing a 0.2 mm radius diamond stylus. A series of scratches were made at constant load increments on coated and uncoated flat alumina surfaces. Critical loads for cracking are detected by microscopic examination of cross sections of scratches made at various loads and friction coefficients. Acoustic emission (AE) and friction trends were also evaluated as experimental techniques for determining critical loads for cracking. Analytical predictions correlate well with micrographic evidence and with the lowest load at which AE is detected in multiple scratch tests. Friction/load trends are not good indicators of early crack formation. Lubrication with silver films reduced friction and thereby increased the critical load for crack initiation, in agreement with analytical predictions.

  13. Implementation of quality by design principles in the development of microsponges as drug delivery carriers: Identification and optimization of critical factors using multivariate statistical analyses and design of experiments studies.

    PubMed

    Simonoska Crcarevska, Maja; Dimitrovska, Aneta; Sibinovska, Nadica; Mladenovska, Kristina; Slavevska Raicki, Renata; Glavas Dodov, Marija

    2015-07-15

    A microsponge drug delivery carrier (MDDC) was prepared by a double emulsion solvent diffusion technique using rotor-stator homogenization. The quality by design (QbD) concept was implemented for the development of an MDDC with the potential to be incorporated into a semisolid dosage form (gel). The quality target product profile (QTPP) and critical quality attributes (CQA) were defined and identified accordingly. Critical material attributes (CMA) and critical process parameters (CPP) were identified using the quality risk management (QRM) tool failure mode, effects and criticality analysis (FMECA). CMA and CPP were identified based on results obtained from principal component analysis (PCA-X&Y) and partial least squares (PLS) statistical analyses, along with literature data and product and process knowledge and understanding. FMECA identified the amounts of ethylcellulose, chitosan, acetone, dichloromethane, span 80 and tween 80 and the water ratio in the primary/multiple emulsions as CMA, and the rotation speed and stirrer type used for organic solvent removal as CPP. The relationship between the identified CPP and particle size as a CQA was described in the design space using the design of experiments one-factor response surface method. The results obtained from the statistically designed experiments enabled the establishment of mathematical models and equations that were used for detailed characterization of the influence of the identified CPP upon MDDC particle size and particle size distribution, and for their subsequent optimization. Copyright © 2015 Elsevier B.V. All rights reserved.
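
    A minimal sketch of the one-factor response-surface step (invented numbers, not the study's measurements): fit a quadratic model of particle size versus homogenization speed and locate its stationary point:

      import numpy as np

      # Hypothetical DoE runs: rotation speed (rpm) vs. mean particle size (um).
      speed = np.array([4000, 6000, 8000, 10000, 12000])
      size = np.array([48.2, 31.5, 24.9, 23.8, 27.6])

      # Second-order (quadratic) response surface in one factor.
      c2, c1, c0 = np.polyfit(speed, size, deg=2)
      optimum = -c1 / (2 * c2)           # stationary point of the parabola
      print(f"size(s) = {c2:.3e}*s^2 + {c1:.3e}*s + {c0:.1f}")
      print(f"predicted minimum particle size near {optimum:.0f} rpm")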

  14. Vapor-liquid equilibrium and critical asymmetry of square well and short square well chain fluids.

    PubMed

    Li, Liyan; Sun, Fangfang; Chen, Zhitong; Wang, Long; Cai, Jun

    2014-08-07

    The critical behavior of square well fluids with variable interaction ranges and of short square well chain fluids has been investigated by grand canonical ensemble Monte Carlo simulations. The critical temperatures and densities were estimated by a finite-size scaling analysis with the help of the histogram reweighting technique. The vapor-liquid coexistence curve in the near-critical region was determined using hyper-parallel tempering Monte Carlo simulations. The simulation results for coexistence diameters show that the contribution of |t|^(1-α) to the coexistence diameter dominates the singular behavior in all systems investigated. The contribution of |t|^(2β) to the coexistence diameter is larger for systems with a smaller interaction range λ, while for short square well chain fluids, the longer the chain length, the larger the contribution of |t|^(2β). The molecular configuration greatly influences the critical asymmetry: a short soft chain fluid shows weaker critical asymmetry than a stiff chain fluid of the same chain length.
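
    A hedged sketch of the diameter analysis (synthetic coexistence data with invented amplitudes; exponents fixed at 3D Ising values): fit the scaling form of the coexistence diameter by linear least squares in the amplitudes:

      import numpy as np

      # Coexistence diameter scaling near the critical point:
      #   rho_d(t) = rho_c + A*|t|^(2*beta) + B*|t|^(1-alpha) + C*|t|
      alpha, beta = 0.110, 0.326          # 3D Ising exponents
      rho_c_true = 0.31
      t = -np.logspace(-3, -1, 40)        # reduced temperatures below Tc

      rng = np.random.default_rng(5)
      rho_d = (rho_c_true + 0.8 * np.abs(t) ** (2 * beta)
               + 0.5 * np.abs(t) ** (1 - alpha) + 0.2 * np.abs(t)
               + 1e-4 * rng.normal(size=t.size))   # synthetic "data"

      # Linear least squares in the amplitudes (exponents held fixed).
      basis = np.column_stack([np.ones_like(t), np.abs(t) ** (2 * beta),
                               np.abs(t) ** (1 - alpha), np.abs(t)])
      coef, *_ = np.linalg.lstsq(basis, rho_d, rcond=None)
      print("rho_c, A, B, C =", np.round(coef, 3))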

  15. Probing the critical zone using passive- and active-source estimates of subsurface shear-wave velocities

    NASA Astrophysics Data System (ADS)

    Callahan, R. P.; Taylor, N. J.; Pasquet, S.; Dueker, K. G.; Riebe, C. S.; Holbrook, W. S.

    2016-12-01

    Geophysical imaging is rapidly becoming popular for quantifying subsurface critical zone (CZ) architecture. However, a diverse array of measurements and measurement techniques are available, raising the question of which are appropriate for specific study goals. Here we compare two techniques for measuring S-wave velocities (Vs) in the near surface. The first approach quantifies Vs in three dimensions using a passive source and an iterative residual least-squares tomographic inversion. The second approach uses a more traditional active-source seismic survey to quantify Vs in two dimensions via a Monte Carlo surface-wave dispersion inversion. Our analysis focuses on three 0.01 km² study plots on weathered granitic bedrock in the Southern Sierra Critical Zone Observatory. Preliminary results indicate that depth-averaged velocities from the two methods agree over the scales of resolution of the techniques. While the passive- and active-source techniques both quantify Vs, each method has distinct advantages and disadvantages during data acquisition and analysis. The passive-source method has the advantage of generating a three-dimensional distribution of subsurface Vs structure across a broad area. Because this method relies on the ambient seismic field as a source, which varies unpredictably across space and time, data quality and depth of investigation are outside the control of the user. Meanwhile, traditional active-source surveys can be designed around a desired depth of investigation; however, they only generate a two-dimensional image of Vs structure. Whereas traditional active-source surveys can be inverted quickly on a personal computer in the field, passive-source surveys require significantly more computation and are best conducted in a high-performance computing environment. We use data from our study sites to compare these methods across different scales and to explore how these methods can be used to better understand subsurface CZ architecture.

  16. Propagating Resource Constraints Using Mutual Exclusion Reasoning

    NASA Technical Reports Server (NTRS)

    Frank, Jeremy; Sanchez, Romeo; Do, Minh B.; Clancy, Daniel (Technical Monitor)

    2001-01-01

    One of the most recent techniques for propagating resource constraints in constraint-based scheduling is the Energy Constraint. This technique focuses on precedence-based scheduling, where precedence relations are taken into account rather than the absolute positions of activities. Although this particular technique has proved efficient on discrete unary resources, it provides only loose bounds for jobs using discrete multi-capacity resources. In this paper we show how mutual exclusion reasoning can be used to propagate time bounds for activities using discrete resources. We show that our technique, based on critical path analysis and mutex reasoning, is just as effective on unary resources and more effective on multi-capacity resources, through both examples and an empirical study.
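
    As a small illustration of the critical path component (a made-up five-activity network), earliest start and finish times propagate through the precedence DAG in topological order:

      # Hypothetical activities: durations and precedence constraints.
      duration = {"A": 3, "B": 2, "C": 4, "D": 1, "E": 3}
      preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"], "E": ["D"]}

      # Propagate earliest start (est) and finish (eft) times in a valid
      # topological order of the precedence graph.
      order = ["A", "B", "C", "D", "E"]
      est, eft = {}, {}
      for act in order:
          est[act] = max((eft[p] for p in preds[act]), default=0)
          eft[act] = est[act] + duration[act]

      print("earliest starts:", est)
      print("makespan (critical path length):", max(eft.values()))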

  17. Application of largest Lyapunov exponent analysis on the studies of dynamics under external forces

    NASA Astrophysics Data System (ADS)

    Odavić, Jovan; Mali, Petar; Tekić, Jasmina; Pantić, Milan; Pavkov-Hrvojević, Milica

    2017-06-01

    The dynamics of the driven dissipative Frenkel-Kontorova model is examined using the largest-Lyapunov-exponent computational technique. The results show that, besides the usual approach of studying the system's behavior in the presence of external forces by analyzing its dynamical response function, largest-Lyapunov-exponent analysis can be a very convenient tool for examining system dynamics. In dc-driven systems, the critical depinning force for a particular structure can be estimated by computing the largest Lyapunov exponent. In dc+ac driven systems, if the substrate potential is the standard sinusoidal one, calculation of the largest Lyapunov exponent offers a more sensitive way to detect the presence of Shapiro steps. When the amplitude of the ac force is varied, the behavior of the largest Lyapunov exponent in the pinned regime completely reflects the behavior of the Shapiro steps and the critical depinning force; in particular, it represents the mirror image of the amplitude dependence of the critical depinning force. This points out an advantage of this technique, since by calculating the largest Lyapunov exponent in the pinned regime we can get insight into the dynamics of the system when driving forces are applied. Additionally, the system is shown to be non-chaotic even in the case of incommensurate structures and large amplitudes of the external force, which is a consequence of the overdamped nature of the model and Middleton's no-passing rule.
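
    A minimal Benettin-style sketch of the largest-Lyapunov-exponent computation, applied here to a driven damped pendulum as a stand-in (the Frenkel-Kontorova equations and parameters of the paper are not reproduced):

      import numpy as np

      # Driven damped pendulum: x'' = -gamma*x' - sin(x) + F*cos(omega*t)
      gamma, F, omega = 0.5, 1.2, 0.667

      def step(state, t, dt):
          """One RK4 step of the equations of motion."""
          def f(s, tt):
              x, v = s
              return np.array([v, -gamma * v - np.sin(x) + F * np.cos(omega * tt)])
          k1 = f(state, t)
          k2 = f(state + 0.5 * dt * k1, t + 0.5 * dt)
          k3 = f(state + 0.5 * dt * k2, t + 0.5 * dt)
          k4 = f(state + dt * k3, t + dt)
          return state + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

      dt, n_steps, d0 = 0.01, 50_000, 1e-8
      a = np.array([0.1, 0.0])
      b = a + np.array([d0, 0.0])      # nearby shadow trajectory
      log_sum, t = 0.0, 0.0
      for _ in range(n_steps):
          a, b = step(a, t, dt), step(b, t, dt)
          t += dt
          d = np.linalg.norm(b - a)
          log_sum += np.log(d / d0)
          b = a + (b - a) * (d0 / d)   # renormalize the separation
      print("largest Lyapunov exponent ~", log_sum / (n_steps * dt))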

  18. Experience report: Using formal methods for requirements analysis of critical spacecraft software

    NASA Technical Reports Server (NTRS)

    Lutz, Robyn R.; Ampo, Yoko

    1994-01-01

    Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.

  19. Eddy current technique for predicting burst pressure

    DOEpatents

    Petri, Mark C.; Kupperman, David S.; Morman, James A.; Reifman, Jaques; Wei, Thomas Y. C.

    2003-01-01

    A signal processing technique which correlates eddy current inspection data from a tube having a critical tubing defect with a range of predicted burst pressures for the tube is provided. The method can directly correlate the raw eddy current inspection data representing the critical tubing defect with the range of burst pressures using a regression technique, preferably an artificial neural network. Alternatively, the technique deconvolves the raw eddy current inspection data into a set of undistorted signals, each of which represents a separate defect of the tube. The undistorted defect signal which represents the critical tubing defect is related to a range of burst pressures utilizing a regression technique.
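
    A hedged sketch of the regression idea (synthetic defect features and burst pressures; not the patented processing chain or a trained plant model), using a small neural network in scikit-learn:

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(11)
      n = 1500
      # Synthetic eddy-current features: defect depth fraction, axial length
      # (mm), and signal phase angle (degrees).
      X = np.column_stack([rng.uniform(0.0, 0.8, n),
                           rng.uniform(1.0, 20.0, n),
                           rng.uniform(0.0, 180.0, n)])
      # Synthetic burst pressure (MPa): weakens with deeper, longer defects.
      y = 60.0 * (1.0 - X[:, 0]) ** 1.5 - 0.4 * X[:, 1] + rng.normal(0, 1.5, n)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      model = make_pipeline(StandardScaler(),
                            MLPRegressor(hidden_layer_sizes=(32, 16),
                                         max_iter=2000, random_state=0))
      model.fit(X_tr, y_tr)
      print("R^2 on held-out tubes:", round(model.score(X_te, y_te), 3))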

  20. Tutorial: Advanced fault tree applications using HARP

    NASA Technical Reports Server (NTRS)

    Dugan, Joanne Bechta; Bavuso, Salvatore J.; Boyd, Mark A.

    1993-01-01

    Reliability analysis of fault tolerant computer systems for critical applications is complicated by several factors. These modeling difficulties are discussed and dynamic fault tree modeling techniques for handling them are described and demonstrated. Several advanced fault tolerant computer systems are described, and fault tree models for their analysis are presented. HARP (Hybrid Automated Reliability Predictor) is a software package developed at Duke University and NASA Langley Research Center that is capable of solving the fault tree models presented.

  1. A community assessment of privacy preserving techniques for human genomes

    PubMed Central

    2014-01-01

    To answer the need for the rigorous protection of biomedical data, we organized the Critical Assessment of Data Privacy and Protection initiative as a community effort to evaluate privacy-preserving dissemination techniques for biomedical data. We focused on the challenge of sharing aggregate human genomic data (e.g., allele frequencies) in a way that preserves the privacy of the data donors, without undermining the utility of genome-wide association studies (GWAS) or impeding their dissemination. Specifically, we designed two problems for disseminating the raw data and the analysis outcome, respectively, based on publicly available data from HapMap and from the Personal Genome Project. A total of six teams participated in the challenges. The final results were presented at a workshop of the iDASH (integrating Data for Analysis, 'anonymization,' and SHaring) National Center for Biomedical Computing. We report the results of the challenge and our findings about the current genome privacy protection techniques. PMID:25521230

  2. A community assessment of privacy preserving techniques for human genomes.

    PubMed

    Jiang, Xiaoqian; Zhao, Yongan; Wang, Xiaofeng; Malin, Bradley; Wang, Shuang; Ohno-Machado, Lucila; Tang, Haixu

    2014-01-01

    To answer the need for the rigorous protection of biomedical data, we organized the Critical Assessment of Data Privacy and Protection initiative as a community effort to evaluate privacy-preserving dissemination techniques for biomedical data. We focused on the challenge of sharing aggregate human genomic data (e.g., allele frequencies) in a way that preserves the privacy of the data donors, without undermining the utility of genome-wide association studies (GWAS) or impeding their dissemination. Specifically, we designed two problems for disseminating the raw data and the analysis outcome, respectively, based on publicly available data from HapMap and from the Personal Genome Project. A total of six teams participated in the challenges. The final results were presented at a workshop of the iDASH (integrating Data for Analysis, 'anonymization,' and SHaring) National Center for Biomedical Computing. We report the results of the challenge and our findings about the current genome privacy protection techniques.

  3. Critical Assessment of the Evidence for Striped Nanoparticles

    PubMed Central

    Stirling, Julian; Lekkas, Ioannis; Sweetman, Adam; Djuranovic, Predrag; Guo, Quanmin; Pauw, Brian; Granwehr, Josef; Lévy, Raphaël; Moriarty, Philip

    2014-01-01

    There is now a significant body of literature which reports that stripes form in the ligand shell of suitably functionalised Au nanoparticles. This stripe morphology has been proposed to strongly affect the physicochemical and biochemical properties of the particles. We critique the published evidence for striped nanoparticles in detail, with a particular focus on the interpretation of scanning tunnelling microscopy (STM) data (as this is the only technique which ostensibly provides direct evidence for the presence of stripes). Through a combination of an exhaustive re-analysis of the original data, in addition to new experimental measurements of a simple control sample comprising entirely unfunctionalised particles, we show that all of the STM evidence for striped nanoparticles published to date can instead be explained by a combination of well-known instrumental artefacts, or by issues with data acquisition/analysis protocols. We also critically re-examine the evidence for the presence of ligand stripes which has been claimed to have been found from transmission electron microscopy, nuclear magnetic resonance spectroscopy, small angle neutron scattering experiments, and computer simulations. Although these data can indeed be interpreted in terms of stripe formation, we show that the reported results can alternatively be explained as arising from a combination of instrumental artefacts and inadequate data analysis techniques. PMID:25402426

  4. Using AVIRIS data and multiple-masking techniques to map urban forest trees species

    Treesearch

    Q. Xiao; S.L. Ustin; E.G. McPherson

    2004-01-01

    Tree type and species information are critical parameters for urban forest management, benefit cost analysis and urban planning. However, traditionally, these parameters have been derived based on limited field samples in urban forest management practice. In this study we used high-resolution Airborne Visible Infrared Imaging Spectrometer (AVIRIS) data and multiple-...

  5. The Socratic Method: analyzing ethical issues in health administration.

    PubMed

    Gac, E J; Boerstler, H; Ruhnka, J C

    1998-01-01

    The Socratic Method has long been recognized by the legal profession as an effective tool for promoting critical thinking and analysis in the law. This article describes ways the technique can be used in health administration education to help future administrators develop the "ethical rudder" they will need for effective leadership. An illustrative dialogue is provided.

  6. Finding the Spirit Within: A Critical Analysis of Film Techniques in "Spirited Away"

    ERIC Educational Resources Information Center

    Cooper, Damon

    2010-01-01

    In 2008 the New South Wales Board of Studies included Hayao Miyazaki's film "Spirited Away" as the prescribed text for the Higher School Certificate Japanese Extension course. A study of the film in this context requires students to engage with the text in three distinct ways: through language, cultural symbolism and relevance, and…

  7. Funding Mechanisms, Cost Drivers, and the Distribution of Education Funds in Alberta: A Case Study.

    ERIC Educational Resources Information Center

    Neu, Dean; Taylor, Alison

    2000-01-01

    Critical analysis of historical financial data of the Calgary Board of Education (CBE) examined the impact of Alberta's 1994 funding changes on the CBE and the distribution of Alberta's education funding. Findings illustrate how funding mechanisms are used to govern from a distance and how seemingly neutral accounting/funding techniques function…

  8. Problems in Staff and Educational Development Leadership: Solving, Framing, and Avoiding

    ERIC Educational Resources Information Center

    Blackmore, Paul; Wilson, Andrew

    2005-01-01

    Analysis of interviews using critical incident technique with a sample of leaders in staff and educational development in higher education institutions reveals a limited use of classical problem-solving approaches. However, many leaders are able to articulate ways in which they frame problems. Framing has to do with goals, which may be complex,…

  9. Assessing tree and stand biomass: a review with examples and critical comparisons

    Treesearch

    Bernard R. Parresol

    1999-01-01

    There is considerable interest today in estimating the biomass of trees and forests for both practical forestry issues and scientific purposes. New techniques and procedures are brought together along with the more traditional approaches to estimating woody biomass. General model forms and weighted analysis are reviewed, along with statistics for evaluating and...

  10. Has your ancient stamp been regummed with synthetic glue? A FT-NIR and FT-Raman study.

    PubMed

    Simonetti, Remo; Oliveri, Paolo; Henry, Adrien; Duponchel, Ludovic; Lanteri, Silvia

    2016-01-01

    The potential of FT-NIR and FT-Raman spectroscopies to characterise the gum applied on the backside of ancient stamps was investigated for the first time. This represents a very critical issue for the collectors' market, since gum conditions heavily influence stamp quotations, and fraudulent application of synthetic gum onto damaged stamp backsides to increase their desirability is a well-documented practice. Spectral data were processed by exploratory pattern recognition tools. In particular, application of principal component analysis (PCA) revealed that both of the spectroscopic techniques provide information useful to characterise stamp gum. Examination of PCA loadings and their chemical interpretation confirmed the robustness of the outcomes. Fusion of FT-NIR and FT-Raman spectral data was performed, following both a low-level and a mid-level procedure. The results were critically compared with those obtained separately for the two spectroscopic techniques. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Optimizing Hybrid Metrology: Rigorous Implementation of Bayesian and Combined Regression.

    PubMed

    Henn, Mark-Alexander; Silver, Richard M; Villarrubia, John S; Zhang, Nien Fan; Zhou, Hui; Barnes, Bryan M; Ming, Bin; Vladár, András E

    2015-01-01

    Hybrid metrology, e.g., the combination of several measurement techniques to determine critical dimensions, is an increasingly important approach to meet the needs of the semiconductor industry. A proper use of hybrid metrology may yield not only more reliable estimates for the quantitative characterization of 3-D structures but also a more realistic estimation of the corresponding uncertainties. Recent developments at the National Institute of Standards and Technology (NIST) feature the combination of optical critical dimension (OCD) measurements and scanning electron microscope (SEM) results. The hybrid methodology offers the potential to make measurements of essential 3-D attributes that may not be otherwise feasible. However, combining techniques gives rise to essential challenges in error analysis and comparing results from different instrument models, especially the effect of systematic and highly correlated errors in the measurement on the χ² function that is minimized. Both hypothetical examples and measurement data are used to illustrate solutions to these challenges.
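
    As a rough illustration of the combined-regression step described in this abstract, the sketch below merges two instrument readings of the same critical dimension by generalized least squares, with a covariance matrix carrying a shared systematic term. The two-reading setup and all numerical values are illustrative assumptions, not the NIST implementation.

      import numpy as np

      # Two measurements (nm) of the same linewidth from two techniques
      # (OCD-like and SEM-like). Values are invented for illustration.
      y = np.array([45.2, 44.6])

      # Assumed covariance: per-tool random variances plus a shared
      # systematic term that correlates the two results.
      C = np.diag([0.30**2, 0.50**2]) + 0.20**2

      # The GLS estimate of the common value x minimizes
      #   chi2(x) = (y - x)^T C^{-1} (y - x).
      Cinv = np.linalg.inv(C)
      ones = np.ones_like(y)
      x_hat = (ones @ Cinv @ y) / (ones @ Cinv @ ones)
      var_hat = 1.0 / (ones @ Cinv @ ones)

      print(f"combined estimate: {x_hat:.2f} +/- {np.sqrt(var_hat):.2f} nm")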

  12. Critical current density measurement of striated multifilament-coated conductors using a scanning Hall probe microscope

    NASA Astrophysics Data System (ADS)

    Li, Xiao-Fen; Kochat, Mehdi; Majkic, Goran; Selvamanickam, Venkat

    2016-08-01

    In this paper the authors measured the critical current density (Jc) of multifilament-coated conductors (CCs) with filaments as narrow as 0.25 mm using the scanning Hall probe microscope (SHPM) technique. A new iterative method of data analysis is developed to make the calculation of Jc for thin filaments possible, even without a very small scan distance. The authors also discuss in detail the advantages and limitations of the iterative method using both simulation and experimental results. The results of the new method correspond well with the traditional fast Fourier transform method where the latter is still applicable. However, the new method is applicable to filamentized CCs under much wider measurement conditions, such as thin filaments and a large scan distance, thus overcoming the barrier to applying the SHPM technique to Jc measurement of long filamentized CCs with narrow filaments.

  13. In vitro and in vivo mapping of the Prunus necrotic ringspot virus coat protein C-terminal dimerization domain by bimolecular fluorescence complementation.

    PubMed

    Aparicio, Frederic; Sánchez-Navarro, Jesús A; Pallás, Vicente

    2006-06-01

    Interactions between viral proteins are critical for virus viability. Bimolecular fluorescent complementation (BiFC) technique determines protein interactions in real-time under almost normal physiological conditions. The coat protein (CP) of Prunus necrotic ringspot virus is required for multiple functions in its replication cycle. In this study, the region involved in CP dimerization has been mapped by BiFC in both bacteria and plant tissue. Full-length and C-terminal deleted forms of the CP gene were fused in-frame to the N- and C-terminal fragments of the yellow fluorescent protein. The BiFC analysis showed that a domain located between residues 9 and 27 from the C-end plays a critical role in dimerization. The importance of this C-terminal region in dimer formation and the applicability of the BiFC technique to analyse viral protein interactions are discussed.

  14. Influences of geological parameters to probabilistic assessment of slope stability of embankment

    NASA Astrophysics Data System (ADS)

    Nguyen, Qui T.; Le, Tuan D.; Konečný, Petr

    2018-04-01

    This article considers the influence of geological parameters on the slope stability of an embankment in a probabilistic analysis using the SLOPE/W computational system. Stability of a simple slope is evaluated with and without pore-water pressure on the basis of the variation of soil properties. Normal distributions of unit weight, cohesion and internal friction angle are assumed. The Monte Carlo simulation technique is employed to analyse the critical slip surface. A sensitivity analysis is performed to observe the variation of the geological parameters and their effects on the safety factor of the slope.
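
    The sketch below illustrates the general Monte Carlo idea in a minimal form, assuming the classical infinite-slope factor-of-safety expression in place of SLOPE/W's method-of-slices search; all soil parameters are hypothetical.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000

      beta = np.radians(30.0)   # slope angle
      z = 5.0                   # depth of the slip surface (m)
      u = 0.0                   # pore-water pressure (kPa); set > 0 to include it

      # Normally distributed soil properties (mean, standard deviation).
      gamma = rng.normal(18.0, 1.0, n)            # unit weight (kN/m^3)
      c = rng.normal(10.0, 2.0, n)                # cohesion (kPa)
      phi = np.radians(rng.normal(28.0, 3.0, n))  # friction angle (deg)

      # Infinite-slope factor of safety for each random realization.
      fs = (c + (gamma * z * np.cos(beta)**2 - u) * np.tan(phi)) \
           / (gamma * z * np.sin(beta) * np.cos(beta))

      print("mean factor of safety:", round(float(fs.mean()), 3))
      print("probability of failure, P(FS < 1):", float((fs < 1.0).mean()))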

  15. Complex systems and the technology of variability analysis

    PubMed Central

    Seely, Andrew JE; Macklem, Peter T

    2004-01-01

    Characteristic patterns of variation over time, namely rhythms, represent a defining feature of complex systems, one that is synonymous with life. Despite the intrinsic dynamic, interdependent and nonlinear relationships of their parts, complex biological systems exhibit robust systemic stability. Applied to critical care, it is the systemic properties of the host response to a physiological insult that manifest as health or illness and determine outcome in our patients. Variability analysis provides a novel technology with which to evaluate the overall properties of a complex system. This review highlights the means by which we scientifically measure variation, including analyses of overall variation (time domain analysis, frequency distribution, spectral power), frequency contribution (spectral analysis), scale invariant (fractal) behaviour (detrended fluctuation and power law analysis) and regularity (approximate and multiscale entropy). Each technique is presented with a definition, interpretation, clinical application, advantages, limitations and summary of its calculation. The ubiquitous association between altered variability and illness is highlighted, followed by an analysis of how variability analysis may significantly improve prognostication of severity of illness and guide therapeutic intervention in critically ill patients. PMID:15566580
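
    As a concrete example of one regularity measure named in the review, the following is a minimal sample entropy (SampEn) sketch with the common parameter choices m = 2 and r = 0.2 times the standard deviation; it is illustrative, not the authors' code.

      import numpy as np

      def sample_entropy(x, m=2, r_factor=0.2):
          """SampEn of a 1-D series with tolerance r = r_factor * SD."""
          x = np.asarray(x, dtype=float)
          r = r_factor * x.std()
          n = len(x)
          num = n - m  # same template count for lengths m and m + 1

          def count_matches(mm):
              templates = np.array([x[i:i + mm] for i in range(num)])
              count = 0
              for i in range(num - 1):
                  # Chebyshev distance; self-matches are excluded.
                  d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                  count += int(np.sum(d <= r))
              return count

          return -np.log(count_matches(m + 1) / count_matches(m))

      rng = np.random.default_rng(1)
      regular = np.sin(np.linspace(0, 20 * np.pi, 500))
      noisy = rng.normal(size=500)
      print("SampEn, sine :", round(sample_entropy(regular), 3))  # low: regular
      print("SampEn, noise:", round(sample_entropy(noisy), 3))    # high: irregular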

  16. Predicting critical transitions in dynamical systems from time series using nonstationary probability density modeling.

    PubMed

    Kwasniok, Frank

    2013-11-01

    A time series analysis method for predicting the probability density of a dynamical system is proposed. A nonstationary parametric model of the probability density is estimated from data within a maximum likelihood framework and then extrapolated to forecast the future probability density and explore the system for critical transitions or tipping points. A full systematic account of parameter uncertainty is taken. The technique is generic, independent of the underlying dynamics of the system. The method is verified on simulated data and then applied to prediction of Arctic sea-ice extent.
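
    A minimal sketch of the underlying idea, assuming a deliberately simple nonstationary model (a Gaussian whose mean drifts linearly in time) fitted by maximum likelihood and then extrapolated; the paper's full method and its treatment of parameter uncertainty are not reproduced here.

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(2)
      t = np.linspace(0.0, 1.0, 400)
      x = 2.0 - 1.5 * t + rng.normal(0.0, 0.3, t.size)  # series with a trend

      def neg_log_likelihood(params):
          mu0, mu1, log_sigma = params
          sigma = np.exp(log_sigma)        # keeps sigma positive
          mu = mu0 + mu1 * t               # linearly drifting mean
          return np.sum(0.5 * ((x - mu) / sigma) ** 2 + np.log(sigma))

      fit = minimize(neg_log_likelihood, x0=[0.0, 0.0, 0.0])
      mu0, mu1, log_sigma = fit.x

      # Extrapolate the fitted density beyond the observed record.
      t_future = 1.5
      print("forecast mean at t = 1.5:", round(mu0 + mu1 * t_future, 3))
      print("fitted sigma:", round(float(np.exp(log_sigma)), 3))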

  17. Crystallization of calcia-gallia-silica glasses

    NASA Technical Reports Server (NTRS)

    Ray, C. S.; Day, D. E.

    1984-01-01

    A thermal imaging furnace is presently used to study the critical cooling rate for glass formation, and the kinetics of crystallization, of the compositions 18.4CaO-(81.6-X)Ga2O3-XSiO2, where X = 3, 6, 9, and 13.8. Crystallization was studied nonisothermally, and the data were analyzed in light of the Avrami (1939) equation. Critical cooling rate and crystallization activation energy are both found to decrease with increasing silica content, and the results obtained by the present technique are noted to agree with those obtained on the basis of differential thermal analysis measurements.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lei, Hechang; Petrovic, C.

    The in-plane critical current density Jc of KxFe2-ySe2 single crystals can be enhanced by more than one order of magnitude, up to ~2.1×10⁴ A/cm², by the post-annealing and quenching technique. A scaling analysis reveals the universal behavior of the normalized pinning force as a function of the reduced field for all temperatures, indicating the presence of a single vortex pinning mechanism. The main pinning sources are three-dimensional (3D) point-like normal cores. The dominant vortex interaction with pinning centers is via spatial variations in critical temperature Tc (“δTc pinning”).
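
    The scaling analysis mentioned above can be illustrated with a small sketch: the pinning force Fp = Jc·B at each temperature is normalized by its maximum and plotted against the reduced field b = B/Birr. The synthetic Jc data below follow an assumed Dew-Hughes-type law fp ∝ b(1-b)², whose collapsed curve peaks near b = 1/3 regardless of temperature; all values are invented for illustration.

      import numpy as np

      def jc_synthetic(B, Birr, scale):
          """Synthetic Jc(B) from an assumed fp ~ b(1 - b)^2 pinning law."""
          b = np.clip(B / Birr, 1e-9, 1.0)
          return scale * b * (1.0 - b) ** 2 / B   # Jc = Fp / B

      for T, Birr, scale in [(5, 40.0, 1e6), (10, 30.0, 6e5), (15, 20.0, 3e5)]:
          B = np.linspace(0.5, 0.99 * Birr, 400)
          fp = jc_synthetic(B, Birr, scale) * B   # pinning force Fp = Jc * B
          b_peak = (B / Birr)[np.argmax(fp / fp.max())]
          # For b(1 - b)^2 the collapsed curve peaks at b = 1/3 at every T.
          print(f"T = {T:2d} K: normalized Fp peaks at b = {b_peak:.3f}")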

  19. Critical analysis and systematization of rat pancreatectomy terminology.

    PubMed

    Eulálio, José Marcus Raso; Bon-Habib, Assad Charbel Chequer; Soares, Daiane de Oliveira; Corrêa, Paulo Guilherme Antunes; Pineschi, Giovana Penna Firme; Diniz, Victor Senna; Manso, José Eduardo Ferreira; Schanaider, Alberto

    2016-10-01

    To critically analyze and standardize the variant nomenclatures for rat pancreatectomy. A review was performed of manuscripts indexed in PUBMED from 01/01/1945 to 31/12/2015 with the combined keywords "rat pancreatectomy" and "rat pancreas resection". The following parameters were considered: A. Frequency of publications; B. Purpose of the pancreatectomy in each article; C. Bibliographic references; D. Nomenclature of techniques according to the percentage of pancreatic parenchyma resected. Among the 468 manuscripts retrieved, the main objectives were to surgically induce diabetes and to study gene regulation and expression. Five references on rat pancreatectomy technique received 15 or more citations. Twenty different terminologies were identified for pancreas resection: according to the percentage of resected parenchyma (30 to 95%); to the procedure type (total, subtotal and partial); or based on the selected anatomical region (distal, longitudinal and segmental). A nomenclature systematization was achieved by cross-checking information between the main surgical techniques, the anatomical descriptions and the percentages of resected parenchyma. The subtotal pancreatectomy nomenclature for parenchymal resection between 80 and 95% establishes a surgical parameter that also defines the limits of total and partial pancreatectomy and standardizes these surgical procedures in rats.

  20. The measurement of linear frequency drift in oscillators

    NASA Astrophysics Data System (ADS)

    Barnes, J. A.

    1985-04-01

    A linear drift in frequency is an important element in most stochastic models of oscillator performance. Quartz crystal oscillators often have drifts in excess of a part in 10¹⁰ per day. Even commercial cesium beam devices often show drifts of a few parts in 10¹³ per year. There are many ways to estimate the drift rates from data samples (e.g., regress the phase on a quadratic; regress the frequency on a linear function; compute the simple mean of the first difference of frequency; use Kalman filters with a drift term as one element in the state vector; and others). Although most of these estimators are unbiased, they vary in efficiency (i.e., confidence intervals). Further, the estimation of confidence intervals using the standard analysis of variance (typically associated with the specific estimating technique) can give amazingly optimistic results. The source of these problems is not an error in, say, the regression techniques, but rather the problems arise from correlations within the residuals. That is, the oscillator model is often not consistent with constraints on the analysis technique or, in other words, some specific analysis techniques are often inappropriate for the task at hand. The appropriateness of a specific analysis technique is critically dependent on the oscillator model and can often be checked with a simple whiteness test on the residuals.
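
    Two of the estimators listed in this abstract can be compared on simulated phase data, as in the hedged sketch below (white frequency noise only; noise levels and drift are illustrative).

      import numpy as np

      rng = np.random.default_rng(3)
      n, tau = 1000, 1.0            # number of samples, sampling interval (s)
      t = np.arange(n) * tau
      true_drift = 1e-13            # fractional frequency per second

      # Phase (time error): x(t) = D t^2 / 2 plus white frequency noise.
      phase = 0.5 * true_drift * t**2 + np.cumsum(rng.normal(0, 1e-11, n)) * tau

      # Estimator 1: regress the phase on a quadratic; drift D = 2 * a2.
      drift_quad = 2.0 * np.polyfit(t, phase, 2)[0]

      # Estimator 2: simple mean of the first difference of frequency.
      freq = np.diff(phase) / tau
      drift_diff = np.mean(np.diff(freq)) / tau

      # Both are unbiased, but their scatter (efficiency) differs markedly.
      print(f"true drift     : {true_drift:.2e}")
      print(f"quadratic fit  : {drift_quad:.2e}")
      print(f"frequency diff.: {drift_diff:.2e}")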

  1. The Fabrication Technique and Property Analysis of Racetrack-Type High Temperature Superconducting Magnet for High Power Motor

    NASA Astrophysics Data System (ADS)

    Xie, S. F.; Wang, Y.; Wang, D. Y.; Zhang, X. J.; Zhao, B.; Zhang, Y. Y.; Li, L.; Li, Y. N.; Chen, P. M.

    2013-03-01

    The superconducting motor is now the focus of research on the application of high temperature superconducting (HTS) materials. In this manuscript, we mainly introduce recent progress on the fabrication technique and property research of the superconducting motor magnet at the Luoyang Ship Material Research Institute (LSMRI) in China, including the materials, the winding and impregnation technique, and property measurement of the magnet. Several techniques and devices were developed to manufacture the magnet, including the technique of insulation and thermal conduction, the device for winding the racetrack-type magnet, etc. Finally, the superconducting magnet for a MW-class motor was successfully developed; it is the largest superconducting motor magnet in China at present. The critical current of the superconducting magnet exceeds the design value (90 A at 30 K).

  2. Probabilistic Structural Analysis Theory Development

    NASA Technical Reports Server (NTRS)

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and Space Shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem-solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive than the finite element approach.

  3. Risk assessment of component failure modes and human errors using a new FMECA approach: application in the safety analysis of HDR brachytherapy.

    PubMed

    Giardina, M; Castiglia, F; Tomarchio, E

    2014-12-01

    Failure mode, effects and criticality analysis (FMECA) is a safety technique extensively used in many different industrial fields to identify and prevent potential failures. In the application of traditional FMECA, the risk priority number (RPN) is determined to rank the failure modes; however, the method has been criticised for having several weaknesses. Moreover, it is unable to adequately deal with human errors or negligence. In this paper, a new versatile fuzzy rule-based assessment model is proposed to evaluate the RPN index to rank both component failure and human error. The proposed methodology is applied to potential radiological over-exposure of patients during high-dose-rate brachytherapy treatments. The critical analysis of the results can provide recommendations and suggestions regarding safety provisions for the equipment and procedures required to reduce the occurrence of accidental events.
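
    For orientation, the sketch below computes the classical RPN = S x O x D ranking that the paper starts from, with one toy rule-based flag for poorly detectable human-error modes; it is a simplified stand-in, not the authors' fuzzy model, and all failure modes and scores are invented.

      failure_modes = [
          # (description, severity, occurrence, detectability), all 1-10
          ("source stuck in applicator",  9, 3, 4),
          ("wrong treatment plan loaded", 8, 2, 6),
          ("operator skips daily QA",     7, 4, 7),
      ]

      def rank(modes):
          scored = []
          for name, s, o, d in modes:
              rpn = s * o * d                       # classic RPN
              # Toy rule: poorly detectable human-error modes get a flag
              # regardless of their numeric RPN.
              human = "operator" in name or "plan" in name
              scored.append((rpn, d >= 6 and human, name))
          return sorted(scored, reverse=True)

      for rpn, flagged, name in rank(failure_modes):
          note = "  <- review procedures" if flagged else ""
          print(f"RPN {rpn:3d}  {name}{note}")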

  4. Transactional Space: Feedback, Critical Thinking, and Learning Dance Technique

    ERIC Educational Resources Information Center

    Akinleye, Adesola; Payne, Rose

    2016-01-01

    This article explores attitudes about feedback and critical thinking in dance technique classes. The authors discuss an expansion of their teaching practices to include feedback as bidirectional (transactional) and a part of developing critical thinking skills in student dancers. The article was written after the authors undertook research…

  5. 3D Digitization of an Heritage Masterpiece - a Critical Analysis on Quality Assessment

    NASA Astrophysics Data System (ADS)

    Menna, F.; Nocerino, E.; Remondino, F.; Dellepiane, M.; Callieri, M.; Scopigno, R.

    2016-06-01

    Despite being perceived as interchangeable when properly applied, close-range photogrammetry and range imaging have both their pros and limitations that can be overcome using suitable procedures. Even if the two techniques have been frequently cross-compared, critical analyses discussing all sub-phases of a complex digitization project are quite rare. Comparisons taking into account the digitization of a cultural masterpiece, such as the Etruscan Sarcophagus of the Spouses discussed in this paper, are even less common. The final 3D model of the Sarcophagus shows impressive spatial and texture resolution, in the order of tenths of a millimetre for both digitization techniques, making it a large 3D digital model even though the physical size of the artwork is quite limited. The paper presents the survey of the Sarcophagus, a late 6th century BC Etruscan anthropoid sarcophagus. Photogrammetry and laser scanning were used for its 3D digitization at two different times only a few days apart. The very short time available for the digitization was a crucial constraint on the surveying operations (due to conditions imposed on us by the museum curators). Although very high-resolution and detailed 3D models were produced, a metric comparison between the two models shows intrinsic limitations of each technique that should be overcome through suitable onsite metric verification procedures as well as a proper processing workflow.
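
    The kind of metric cross-comparison described here can be sketched as a nearest-neighbour (cloud-to-cloud) distance computation between two point clouds, as below; the clouds are synthetic stand-ins assumed to be already co-registered.

      import numpy as np
      from scipy.spatial import cKDTree

      rng = np.random.default_rng(4)
      # Synthetic stand-ins (coordinates in metres), already co-registered.
      laser = rng.uniform(0.0, 1.0, size=(20_000, 3))            # reference
      photo = laser + rng.normal(0.0, 0.0002, size=laser.shape)  # ~0.2 mm noise

      # Distance from each photogrammetric point to the nearest laser point.
      d, _ = cKDTree(laser).query(photo, k=1)

      print(f"mean deviation : {d.mean() * 1000:.3f} mm")
      print(f"95th percentile: {np.percentile(d, 95) * 1000:.3f} mm")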

  6. Magnetic Flux Leakage and Principal Component Analysis for metal loss approximation in a pipeline

    NASA Astrophysics Data System (ADS)

    Ruiz, M.; Mujica, L. E.; Quintero, M.; Florez, J.; Quintero, S.

    2015-07-01

    Safety and reliability of hydrocarbon transportation pipelines are a critical concern for the Oil and Gas industry. Pipeline failures caused by corrosion, external agents, and other factors can develop into leaks or even rupture, with negative impacts on population, the natural environment, infrastructure and the economy. It is imperative to have accurate inspection tools traveling through the pipeline to diagnose its integrity. Over the last few years, different techniques under the concept of structural health monitoring (SHM) have continuously been in development. This work is based on a hybrid methodology that combines the Magnetic Flux Leakage (MFL) and Principal Component Analysis (PCA) approaches. The MFL technique induces a magnetic field in the pipeline's walls; sensors record the leakage magnetic field in segments with loss of metal caused by cracking, corrosion, and similar damage. The data come from a pipeline with approximately 15 years of operation, which transports gas, has a diameter of 20 inches and a total length of 110 km (with several changes in topography). PCA, in turn, is a well-known technique that compresses the information and extracts the most relevant features, facilitating the detection of damage in several kinds of structures. The goal of this work is to detect and localize critical metal loss in a pipeline currently in operation.
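
    A minimal sketch of the PCA stage, under the assumption that a model trained on healthy-pipe MFL signatures flags damaged segments by their reconstruction error (Q statistic); the data are simulated and the sensor layout is invented.

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(5)
      n_train, n_sensors = 500, 32

      # Baseline MFL profiles: a smooth shape plus correlated sensor noise.
      base = np.sin(np.linspace(0, np.pi, n_sensors))
      train = base + rng.normal(0.0, 0.05, (n_train, n_sensors))

      pca = PCA(n_components=3).fit(train)

      def q_statistic(segments):
          """Squared reconstruction error of each segment (Q statistic)."""
          recon = pca.inverse_transform(pca.transform(segments))
          return np.sum((segments - recon) ** 2, axis=1)

      threshold = np.percentile(q_statistic(train), 99)

      # Test segment with a localized leakage peak (simulated metal loss).
      test = base + rng.normal(0.0, 0.05, n_sensors)
      test[12:15] += 0.8
      q = float(q_statistic(test.reshape(1, -1))[0])
      print(f"Q = {q:.3f}, threshold = {threshold:.3f}, flagged: {q > threshold}")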

  7. A cross-sectional observational study to assess inhaler technique in Saudi hospitalized patients with asthma and chronic obstructive pulmonary disease

    PubMed Central

    Ammari, Maha Al; Sultana, Khizra; Yunus, Faisal; Ghobain, Mohammed Al; Halwan, Shatha M. Al

    2016-01-01

    Objectives: To assess the proportion of critical errors committed while demonstrating the inhaler technique in hospitalized patients diagnosed with asthma and chronic obstructive pulmonary disease (COPD). Methods: This cross-sectional observational study was conducted in 47 asthmatic and COPD patients using inhaler devices. The study took place at King Abdulaziz Medical City, Riyadh, Saudi Arabia between September and December 2013. Two pharmacists independently assessed inhaler technique with a validated checklist. Results: Seventy percent of patients made at least one critical error while demonstrating their inhaler technique, and the mean number of critical errors per patient was 1.6. Most patients used metered dose inhaler (MDI), and 73% of MDI users and 92% of dry powder inhaler users committed at least one critical error. Conclusion: Inhaler technique in hospitalized Saudi patients was inadequate. Health care professionals should understand the importance of reassessing and educating patients on a regular basis for inhaler technique, recommend the use of a spacer when needed, and regularly assess and update their own inhaler technique skills. PMID:27146622

  8. Developing an intelligence analysis process through social network analysis

    NASA Astrophysics Data System (ADS)

    Waskiewicz, Todd; LaMonica, Peter

    2008-04-01

    Intelligence analysts are tasked with making sense of enormous amounts of data and gaining an awareness of a situation that can be acted upon. This process can be extremely difficult and time consuming. Trying to differentiate between important pieces of information and extraneous data only complicates the problem. When dealing with data containing entities and relationships, social network analysis (SNA) techniques can be employed to make this job easier. Applying network measures to social network graphs can identify the most significant nodes (entities) and edges (relationships) and help the analyst further focus on key areas of concern. Strange developed a model that identifies high value targets such as centers of gravity and critical vulnerabilities. SNA lends itself to the discovery of these high value targets and the Air Force Research Laboratory (AFRL) has investigated several network measures such as centrality, betweenness, and grouping to identify centers of gravity and critical vulnerabilities. Using these network measures, a process for the intelligence analyst has been developed to aid analysts in identifying points of tactical emphasis. Organizational Risk Analyzer (ORA) and Terrorist Modus Operandi Discovery System (TMODS) are the two applications used to compute the network measures and identify the points to be acted upon. Therefore, the result of leveraging social network analysis techniques and applications will provide the analyst and the intelligence community with more focused and concentrated analysis results allowing them to more easily exploit key attributes of a network, thus saving time, money, and manpower.
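
    The network-measure step can be sketched in a few lines with standard graph tooling, as below; the toy graph is invented and the code does not represent ORA or TMODS output.

      import networkx as nx

      # Invented entity/relationship graph.
      g = nx.Graph([
          ("leader", "financier"), ("leader", "recruiter"),
          ("financier", "supplier"), ("recruiter", "cell_a"),
          ("recruiter", "cell_b"), ("cell_a", "cell_b"),
          ("supplier", "cell_a"),
      ])

      betweenness = nx.betweenness_centrality(g)
      degree = nx.degree_centrality(g)

      # Nodes with high betweenness are candidate centers of gravity /
      # critical vulnerabilities in Strange's sense.
      for node in sorted(g, key=betweenness.get, reverse=True):
          print(f"{node:10s} betweenness={betweenness[node]:.3f} "
                f"degree={degree[node]:.3f}")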

  9. Computational intelligence techniques for biological data mining: An overview

    NASA Astrophysics Data System (ADS)

    Faye, Ibrahima; Iqbal, Muhammad Javed; Said, Abas Md; Samir, Brahim Belhaouari

    2014-10-01

    Computational techniques have been successfully utilized for highly accurate analysis and modeling of multifaceted and raw biological data gathered from various genome sequencing projects. These techniques are proving much more effective at overcoming the limitations of traditional in-vitro experiments on the constantly increasing sequence data. The most critical problems that have caught the attention of researchers include, but are not limited to: accurate structure and function prediction of unknown proteins, protein subcellular localization prediction, finding protein-protein interactions, protein fold recognition, analysis of microarray gene expression data, etc. To solve these problems, various classification and clustering techniques using machine learning have been extensively used in the published literature. These techniques include neural network algorithms, genetic algorithms, fuzzy ARTMAP, K-Means, K-NN, SVM, rough set classifiers, decision trees and HMM-based algorithms. Major difficulties in applying the above algorithms include the limitations of previous feature encoding and selection methods in extracting the best features, increasing classification accuracy and decreasing the running time overheads of the learning algorithms. This research is potentially useful in drug design and in the diagnosis of some diseases. This paper presents a concise overview of the well-known protein classification techniques.

  10. Design approach of an aquaculture cage system for deployment in the constructed channel flow environments of a power plant

    PubMed Central

    Lee, Jihoon; Fredriksson, David W.; DeCew, Judson; Drach, Andrew; Yim, Solomon C.

    2018-01-01

    This study provides an engineering approach for designing an aquaculture cage system for use in constructed channel flow environments. As sustainable aquaculture has grown globally, many novel techniques have been introduced such as those implemented in the global Atlantic salmon industry. The advent of several highly sophisticated analysis software systems enables the development of such novel engineering techniques. These software systems commonly include three-dimensional (3D) drafting, computational fluid dynamics, and finite element analysis. In this study, a combination of these analysis tools is applied to evaluate a conceptual aquaculture system for potential deployment in a power plant effluent channel. The channel is supposedly clean; however, it includes elevated water temperatures and strong currents. The first portion of the analysis includes the design of a fish cage system with specific net solidities using 3D drafting techniques. Computational fluid dynamics is then applied to evaluate the flow reduction through the system from the previously generated solid models. Implementing the same solid models, a finite element analysis is performed on the critical components to assess the material stresses produced by the drag force loads that are calculated from the fluid velocities. PMID:29897954

  11. Recommendations for Quantitative Analysis of Small Molecules by Matrix-assisted laser desorption ionization mass spectrometry

    PubMed Central

    Wang, Poguang; Giese, Roger W.

    2017-01-01

    Matrix-assisted laser desorption ionization mass spectrometry (MALDI-MS) has been used for quantitative analysis of small molecules for many years. It is usually preceded by an LC separation step when complex samples are tested. With the development several years ago of “modern MALDI” (automation, high repetition-rate lasers, high-resolution peaks), the ease of use and performance of MALDI as a quantitative technique greatly increased. This review focuses on practical aspects of modern MALDI for quantitation of small molecules conducted in an ordinary way (no special reagents, devices or techniques for the spotting step of MALDI), and includes our ordinary, preferred methods. The review is organized as 18 recommendations with accompanying explanations, criticisms and exceptions. PMID:28118972

  12. Plasmapheresis and other extracorporeal filtration techniques in critical patients.

    PubMed

    Daga Ruiz, D; Fonseca San Miguel, F; González de Molina, F J; Úbeda-Iglesias, A; Navas Pérez, A; Jannone Forés, R

    2017-04-01

    Plasmapheresis is an extracorporeal technique that eliminates macromolecules involved in pathological processes from plasma. A review is made of the technical aspects, main indications in critical care and potential complications of plasmapheresis, as well as of other extracorporeal filtration techniques such as endotoxin-removal columns and other devices designed to eliminate cytokines or modulate the inflammatory immune response in critical patients. Copyright © 2016 Elsevier España, S.L.U. y SEMICYUC. All rights reserved.

  13. Linear Covariance Analysis For Proximity Operations Around Asteroid 2008 EV5

    NASA Technical Reports Server (NTRS)

    Wright, Cinnamon A.; Bhatt, Sagar; Woffinden, David; Strube, Matthew; D'Souza, Chris

    2015-01-01

    The NASA initiative to collect an asteroid, the Asteroid Robotic Redirect Mission (ARRM), is currently investigating the option of retrieving a boulder from an asteroid, demonstrating planetary defense with an enhanced gravity tractor technique, and returning it to a lunar orbit. Techniques for accomplishing this are being investigated by the Satellite Servicing Capabilities Office (SSCO) at NASA GSFC in collaboration with JPL, NASA JSC, LaRC, and Draper Laboratory, Inc. Two critical phases of the mission are the descent to the boulder and the Enhanced Gravity Tractor demonstration. A linear covariance analysis is done for these phases to assess the feasibility of these concepts with the proposed design of the sensor and actuator suite of the Asteroid Redirect Vehicle (ARV). The sensor suite for this analysis includes a wide field of view camera, LiDAR, and an IMU. The proposed asteroid of interest is currently the C-type asteroid 2008 EV5, a carbonaceous chondrite that is of high interest to the scientific community. This paper presents an overview of the linear covariance analysis techniques and simulation tool, provides sensor and actuator models, and addresses the feasibility of descending to the surface of the asteroid within allocated requirements as well as the possibility of maintaining a halo orbit to demonstrate the Enhanced Gravity Tractor technique.
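
    A stripped-down sketch of linear covariance analysis, assuming a two-state position-velocity model with a single range-type measurement: the covariance alone is propagated and updated, with no individual trajectory simulated. Dynamics, noise levels and the measurement model are illustrative, not the ARRM sensor suite.

      import numpy as np

      dt = 1.0
      F = np.array([[1.0, dt],          # position-velocity dynamics
                    [0.0, 1.0]])
      Q = np.diag([1e-6, 1e-4])         # process noise (unmodeled accel.)
      H = np.array([[1.0, 0.0]])        # range-type measurement of position
      R = np.array([[0.25]])            # measurement noise variance

      P = np.diag([100.0, 1.0])         # initial covariance
      for _ in range(60):
          P = F @ P @ F.T + Q                    # time update (propagate)
          S = H @ P @ H.T + R
          K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
          P = (np.eye(2) - K @ H) @ P            # measurement update

      print("final 1-sigma position:", float(np.sqrt(P[0, 0])))
      print("final 1-sigma velocity:", float(np.sqrt(P[1, 1])))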

  14. Towards Understanding Soil Forming in Santa Clotilde Critical Zone Observatory: Modelling Soil Mixing Processes in a Hillslope using Luminescence Techniques

    NASA Astrophysics Data System (ADS)

    Sanchez, A. R.; Laguna, A.; Reimann, T.; Giráldez, J. V.; Peña, A.; Wallinga, J.; Vanwalleghem, T.

    2017-12-01

    Different geomorphological processes, such as bioturbation and erosion-deposition, intervene in soil formation and landscape evolution. These processes alter and degrade the materials that compose the rocks. The degree to which the bedrock is weathered is estimated through the fraction of the bedrock that is mixed into the soil either vertically or laterally. This study presents an analytical solution of the diffusion-advection equation to quantify bioturbation and erosion-deposition rates in profiles along a catena. The model is calibrated with age-depth data obtained from profiles using luminescence dating based on single-grain infrared stimulated luminescence (IRSL). Luminescence techniques contribute a direct measurement of bioturbation and erosion-deposition processes. The single-grain IRSL technique is applied to feldspar minerals of fifteen samples collected from four soil profiles at different depths along a catena in the Santa Clotilde Critical Zone Observatory, Cordoba province, SE Spain. A sensitivity analysis is performed to assess the importance of the parameters in the analytical model, and an uncertainty analysis is carried out to establish the best fit of the parameters to the measured age-depth data. The results indicate a diffusion constant at 20 cm depth of 47 mm²/year in the hill-base profile and 4.8 mm²/year in the hilltop profile. The model has high uncertainty in the estimation of erosion and deposition rates. This study reveals the potential of single-grain luminescence techniques to quantify pedoturbation processes.

  15. The role of chemometrics in single and sequential extraction assays: a review. Part II. Cluster analysis, multiple linear regression, mixture resolution, experimental design and other techniques.

    PubMed

    Giacomino, Agnese; Abollino, Ornella; Malandrino, Mery; Mentasti, Edoardo

    2011-03-04

    Single and sequential extraction procedures are used for studying element mobility and availability in solid matrices, like soils, sediments, sludge, and airborne particulate matter. In the first part of this review we reported an overview on these procedures and described the applications of chemometric uni- and bivariate techniques and of multivariate pattern recognition techniques based on variable reduction to the experimental results obtained. The second part of the review deals with the use of chemometrics not only for the visualization and interpretation of data, but also for the investigation of the effects of experimental conditions on the response, the optimization of their values and the calculation of element fractionation. We will describe the principles of the multivariate chemometric techniques considered, the aims for which they were applied and the key findings obtained. The following topics will be critically addressed: pattern recognition by cluster analysis (CA), linear discriminant analysis (LDA) and other less common techniques; modelling by multiple linear regression (MLR); investigation of spatial distribution of variables by geostatistics; calculation of fractionation patterns by a mixture resolution method (Chemometric Identification of Substrates and Element Distributions, CISED); optimization and characterization of extraction procedures by experimental design; other multivariate techniques less commonly applied. Copyright © 2010 Elsevier B.V. All rights reserved.

  16. Spectroscopic analysis technique for arc-welding process control

    NASA Astrophysics Data System (ADS)

    Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel

    2005-09-01

    The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to monitoring and control of industrial processes. In particular, it has been demonstrated that the analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), an early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementation. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing an automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG welding, using fiber-optic capture of light and a low-cost CCD-based spectrometer, show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
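
    The electronic-temperature estimate at the heart of this technique is commonly obtained from a Boltzmann plot, sketched below: ln(Iλ/(gA)) against upper-level energy is fit with a line whose slope is -1/(kTe). The line data are placeholders, not real species constants.

      import numpy as np

      k_eV = 8.617e-5  # Boltzmann constant (eV/K)

      # Per line: intensity (a.u.), wavelength (nm), g*A (s^-1), E_upper (eV).
      # These are placeholder values, not real atomic constants.
      lines = np.array([
          [1200.0, 400.0, 2.0e8, 3.2],
          [ 800.0, 420.0, 1.5e8, 3.9],
          [ 450.0, 440.0, 1.2e8, 4.6],
          [ 260.0, 460.0, 1.0e8, 5.3],
      ])
      intensity, wavelength, gA, e_up = lines.T

      # Boltzmann plot: ln(I*lambda/(g*A)) = -E_up/(k*Te) + const.
      slope, _ = np.polyfit(e_up, np.log(intensity * wavelength / gA), 1)
      Te = -1.0 / (k_eV * slope)
      print(f"estimated electronic temperature: {Te:.0f} K")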

  17. Do not blame the driver: a systems analysis of the causes of road freight crashes.

    PubMed

    Newnam, Sharon; Goode, Natassia

    2015-03-01

    Although many have advocated a systems approach in road transportation, this view has not meaningfully penetrated road safety research, practice or policy. In this study, a systems theory-based approach, Rasmussen's (1997) risk management framework and the associated Accimap technique, is applied to the analysis of road freight transportation crashes. Twenty-seven highway crash investigation reports were downloaded from the National Transport Safety Bureau website. Thematic analysis was used to identify the complex system of contributory factors, and relationships, identified within the reports. The Accimap technique was then used to represent the linkages and dependencies within and across system levels in the road freight transportation industry and to identify common factors and interactions across multiple crashes. The results demonstrate how a systems approach can increase knowledge in this safety critical domain, while the findings can be used to guide prevention efforts and the development of system-based investigation processes for the heavy vehicle industry. A research agenda for developing an investigation technique to better support the application of the Accimap technique by practitioners in the road freight transportation industry is proposed. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Measurement of fracture toughness by nanoindentation methods: Recent advances and future challenges

    DOE PAGES

    Sebastiani, Marco; Johanns, K. E.; Herbert, Erik G.; ...

    2015-04-30

    In this study, we describe recent advances and developments for the measurement of fracture toughness at small scales by the use of nanoindentation-based methods, including techniques based on micro-cantilever beam bending and micro-pillar splitting. A critical comparison of the techniques is made by testing a selected group of bulk and thin film materials. For pillar splitting, cohesive zone finite element simulations are used to validate a simple relationship between the critical load at failure, the pillar radius, and the fracture toughness for a range of material properties and coating/substrate combinations. The minimum pillar diameter required for nucleation and growth of a crack during indentation is also estimated. An analysis of pillar splitting for a film on a dissimilar substrate material shows that the critical load for splitting is relatively insensitive to the substrate compliance for a large range of material properties. Experimental results from a selected group of materials show good agreement between single cantilever and pillar splitting methods, while a discrepancy of ~25% is found between the pillar splitting technique and double-cantilever testing. It is concluded that both the micro-cantilever and pillar splitting techniques are valuable methods for micro-scale assessment of fracture toughness of brittle ceramics, provided the underlying assumptions can be validated. Although the pillar splitting method has some advantages because of the simplicity of sample preparation and testing, it is not applicable to most metals because their higher toughness prevents splitting, and in this case, micro-cantilever bend testing is preferred.

  19. An ethic of analysis: an argument for critical analysis of research interviews as an ethical practice.

    PubMed

    Cloyes, Kristin Gates

    2006-01-01

    Nursing literature is replete with discussions about the ethics of research interviews. These largely involve questions of method, and how careful study design and data collection technique can render studies more ethical. Analysis, the perennial black box of the research process, is rarely discussed as an ethical practice. In this paper, I introduce the idea that analysis itself is an ethical practice. Specifically, I argue that political discourse analysis of research interviews is an ethical practice. I use examples from my own research in a prison control unit to illustrate what this might look like, and what is at stake.

  20. A rule-based system for real-time analysis of control systems

    NASA Astrophysics Data System (ADS)

    Larson, Richard R.; Millard, D. Edward

    1992-10-01

    An approach to automate the real-time analysis of flight-critical health monitoring and system status is being developed and evaluated at the NASA Dryden Flight Research Facility. A software package was developed in-house and installed as part of the extended aircraft interrogation and display system. This design features a knowledge-base structure in the form of rules to formulate the interpretation and decision logic for real-time data. This technique has been applied to ground verification and validation testing and to flight test monitoring, where quick, real-time, safety-of-flight decisions can be very critical. In many cases post processing and manual analysis of flight system data are not required. The processing of real-time data for analysis is described, along with the output format, which features a message stack display. The development, construction, and testing of the rule-driven knowledge base, along with an application using the X-31A flight test program, are presented.
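
    A toy sketch of a rule-driven monitor in this spirit appears below: rules map frames of real-time parameters to messages pushed onto a display stack. Parameter names, limits and messages are invented for illustration and do not reflect the actual system.

      rules = [
          ("HYD_PRESS_LOW", lambda d: d["hyd_press"] < 2800.0,
           "WARNING: hydraulic pressure below limit"),
          ("EGT_HIGH", lambda d: d["egt"] > 900.0,
           "CRITICAL: exhaust gas temperature exceedance"),
          ("AOA_LIMIT", lambda d: abs(d["aoa"]) > 25.0,
           "CRITICAL: angle of attack outside envelope"),
      ]

      def evaluate(frame, stack):
          """Apply every rule to one frame of real-time data."""
          for name, condition, message in rules:
              if condition(frame):
                  stack.append((frame["t"], name, message))

      message_stack = []
      telemetry = [
          {"t": 0.0, "hyd_press": 3000.0, "egt": 850.0, "aoa": 12.0},
          {"t": 0.1, "hyd_press": 2750.0, "egt": 905.0, "aoa": 14.0},
      ]
      for frame in telemetry:
          evaluate(frame, message_stack)

      for t, name, message in reversed(message_stack):  # newest first
          print(f"t={t:4.1f}s [{name}] {message}")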

  1. [Environmental, social, and roadway vulnerability in accidents involving transportation of hazardous products: a case study of the BR-101 highway between Osório and Torres in Rio Grande do Sul State, Brazil].

    PubMed

    Tinoco, Maria Auxiliadora Cannarozzo; Nodari, Christine Tessele; Pereira, Kimberllyn Rosa da Silva

    2016-09-19

    This study aimed to assess environmental and social vulnerability and identify critical highway stretches for accidents involving the transportation of hazardous products on the BR-101 highway between the cities of Osório and Torres in Rio Grande do Sul State, Brazil. The study's approach consisted of a multiple-criteria analysis combining highway safety analysis and environmental and social vulnerability analysis for accidents with hazardous products, together with cartographic analysis techniques. Thirty-eight kilometers of the highway showed high vulnerability, including 8 kilometers with critical vulnerability, associated with bridges over rivers, water uptake points, a tunnel, environmental preservation areas, and an urban area. These stretches should be prioritized when developing action plans for accident mitigation and public policies for this highway. This approach is unprecedented compared to existing studies and is a potentially useful tool for decision-making in emergency operations.

  2. A rule-based system for real-time analysis of control systems

    NASA Technical Reports Server (NTRS)

    Larson, Richard R.; Millard, D. Edward

    1992-01-01

    An approach to automate the real-time analysis of flight-critical health monitoring and system status is being developed and evaluated at the NASA Dryden Flight Research Facility. A software package was developed in-house and installed as part of the extended aircraft interrogation and display system. This design features a knowledge-base structure in the form of rules to formulate the interpretation and decision logic for real-time data. This technique has been applied to ground verification and validation testing and to flight test monitoring, where quick, real-time, safety-of-flight decisions can be very critical. In many cases post processing and manual analysis of flight system data are not required. The processing of real-time data for analysis is described, along with the output format, which features a message stack display. The development, construction, and testing of the rule-driven knowledge base, along with an application using the X-31A flight test program, are presented.

  3. Glyphosate analysis using sensors and electromigration separation techniques as alternatives to gas or liquid chromatography.

    PubMed

    Gauglitz, Günter; Wimmer, Benedikt; Melzer, Tanja; Huhn, Carolin

    2018-01-01

    Since its introduction in 1974, the herbicide glyphosate has experienced a tremendous increase in use, with about one million tons used annually today. This review focuses on sensors and electromigration separation techniques as alternatives to chromatographic methods for the analysis of glyphosate and its metabolite aminomethylphosphonic acid. Even with the large number of studies published, glyphosate analysis remains challenging. With its polar and, depending on pH, even ionic functional groups, and lacking a chromophore, it is difficult to analyze with chromatographic techniques; its analysis is mostly achieved after derivatization. Its purification from food and environmental samples inevitably results in coextraction of ionic matrix components, with a further impact on analysis and also on derivatization reactions. Its ability to form chelates with metal cations is another obstacle to precise quantification. Lastly, the low limits of detection required by legislation have to be met. These challenges preclude glyphosate from being analyzed together with many other pesticides in common multiresidue (chromatographic) methods. For better monitoring of glyphosate in environmental and food samples, further fast and robust methods are required. In this review, analytical methods are summarized and discussed from the perspective of biosensors and various formats of electromigration separation techniques, including modes such as capillary electrophoresis and micellar electrokinetic chromatography, combined with various detection techniques. These methods are critically discussed with regard to matrix tolerance, limits of detection reached, and selectivity.

  4. Novel Sample-handling Approach for XRD Analysis with Minimal Sample Preparation

    NASA Technical Reports Server (NTRS)

    Sarrazin, P.; Chipera, S.; Bish, D.; Blake, D.; Feldman, S.; Vaniman, D.; Bryson, C.

    2004-01-01

    Sample preparation and sample handling are among the most critical operations associated with X-ray diffraction (XRD) analysis. These operations require attention in a laboratory environment, but they become a major constraint in the deployment of XRD instruments for robotic planetary exploration. We are developing a novel sample handling system that dramatically relaxes the constraints on sample preparation by allowing characterization of coarse-grained material that would normally be impossible to analyze with conventional powder-XRD techniques.

  5. Using the Critical Incident Technique for Triangulation and Elaboration of Communication Management Competencies

    ERIC Educational Resources Information Center

    Brunton, Margaret Ann; Jeffrey, Lynn Maud

    2010-01-01

    This paper presents the findings from research using the critical incident technique to identify the use of key competencies for communication management practitioners. Qualitative data was generated from 202 critical incidents reported by 710 respondents. We also present a brief summary of the quantitative data, which identified two superordinate…

  6. Continuously varying of critical exponents with the bismuth doped in the La0.8Na0.2Mn1-xBixO3 (0 ≤ x ≤ 0.06) manganites

    NASA Astrophysics Data System (ADS)

    Laouyenne, M. R.; Baazaoui, M.; Mahjoub, Sa.; Cheikhrouhou-Koubaa, W.; Farah, Kh.; Oumezzine, M.

    2018-04-01

    A comprehensive analysis of the critical phenomena for the nominal compositions La0.8Na0.2Mn1-xBixO3 (0 ≤ x ≤ 0.06) was carried out. The critical exponent values were calculated by various techniques such as the modified Arrott plot (MAP), the Kouvel-Fisher (KF) method and the critical isotherm (CI). Comparison of the experimental data with the above theoretical models showed that the critical exponents β, γ and δ for the undoped sample are quite well described by the tricritical mean-field model (TMF). Furthermore, the substitution of Mn by Bi ions led to an increase of γ, which approached the 3D-Heisenberg model (γ = 1.325), while β took values similar to those predicted by the TMF model. The validity of the exponent values was confirmed with the scaling hypothesis: the M(T, ε) curves collapse onto two independent universal branches below and above Tc.
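
    The Kouvel-Fisher step can be illustrated with synthetic data, as in the sketch below: for Ms(T) = A(Tc - T)^β, the quantity Ms/(dMs/dT) is linear in T with slope 1/β, so a straight-line fit recovers the exponent. All values are illustrative, not the paper's measurements.

      import numpy as np

      Tc_true, beta_true = 300.0, 0.25   # tricritical mean-field beta = 1/4
      T = np.linspace(250.0, 299.0, 200)
      Ms = 50.0 * (Tc_true - T) ** beta_true   # synthetic magnetization

      X = Ms / np.gradient(Ms, T)        # Kouvel-Fisher variable (T - Tc)/beta
      slope, intercept = np.polyfit(T, X, 1)

      print(f"beta = {1.0 / slope:.3f}")           # slope = 1/beta
      print(f"Tc   = {-intercept / slope:.1f} K")  # X crosses zero at Tc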

  7. Laser-induced breakdown spectroscopy application in environmental monitoring of water quality: a review.

    PubMed

    Yu, Xiaodong; Li, Yang; Gu, Xiaofeng; Bao, Jiming; Yang, Huizhong; Sun, Li

    2014-12-01

    Water quality monitoring is a critical part of environmental management and protection, and being able to qualitatively and quantitatively determine contamination and impurity levels in water is especially important. Compared to the currently available water quality monitoring methods and techniques, laser-induced breakdown spectroscopy (LIBS) has several advantages, including no need for sample preparation, fast and easy operation, and a chemical-free process. Therefore, it is of great importance to understand the fundamentals of aqueous LIBS analysis and effectively apply this technique to environmental monitoring. This article reviews the research conducted on LIBS analysis of liquid samples; its content includes LIBS theory, history and applications, quantitative analysis of metallic species in liquids, LIBS signal enhancement methods and data processing, characteristics of plasma generated by laser in water, and the factors affecting the accuracy of analysis results. Although there have been many research works focusing on aqueous LIBS analysis, the detection limit and stability of this technique still need to be improved to satisfy the requirements of environmental monitoring standards. In addition, determination of nonmetallic species in liquid by LIBS is equally important and needs immediate attention from the community. This comprehensive review will assist readers to better understand the aqueous LIBS technique and help to identify current research needs for environmental monitoring of water quality.

  8. Causal inference in economics and marketing.

    PubMed

    Varian, Hal R

    2016-07-05

    This is an elementary introduction to causal inference in economics written for readers familiar with machine learning methods. The critical step in any causal analysis is estimating the counterfactual, a prediction of what would have happened in the absence of the treatment. The powerful techniques used in machine learning may be useful for developing better estimates of the counterfactual, potentially improving causal inference.
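
    The counterfactual idea can be made concrete with a small sketch: fit a flexible model on untreated units only, predict what treated units would have done without treatment, and average the difference. This is a generic machine-learning-style estimate on simulated data, not a method taken from the article.

      import numpy as np
      from sklearn.ensemble import GradientBoostingRegressor

      rng = np.random.default_rng(6)
      n = 4000
      x = rng.uniform(0.0, 1.0, (n, 3))        # covariates
      treated = rng.random(n) < 0.4
      y = x @ np.array([1.0, -2.0, 0.5]) + 2.0 * treated \
          + rng.normal(0.0, 0.5, n)            # true effect = 2.0

      # Counterfactual model: fit on untreated units only, then predict
      # the no-treatment outcome for the treated units.
      model = GradientBoostingRegressor().fit(x[~treated], y[~treated])
      counterfactual = model.predict(x[treated])

      att = float(np.mean(y[treated] - counterfactual))
      print(f"estimated effect of treatment on the treated: {att:.2f}")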

  9. Critical Trends and Events Affecting the Future of Texas Higher Education. Proceedings of the Texas Association for Institutional Research (TAIR) Preconference Workshop on Environmental Scanning (1995).

    ERIC Educational Resources Information Center

    Morrison, James L.

    This proceedings report describes exercises used in a workshop on environmental scanning, designed to assist institutional research officers to develop competency in establishing and maintaining an external analysis capability on their campuses. The workshop offered an opportunity for participants to experience several techniques used in…

  10. Causal inference in economics and marketing

    PubMed Central

    Varian, Hal R.

    2016-01-01

    This is an elementary introduction to causal inference in economics written for readers familiar with machine learning methods. The critical step in any causal analysis is estimating the counterfactual—a prediction of what would have happened in the absence of the treatment. The powerful techniques used in machine learning may be useful for developing better estimates of the counterfactual, potentially improving causal inference. PMID:27382144

  11. Man-rating of a launch vehicle

    NASA Astrophysics Data System (ADS)

    Soeffker, D.

    Analysis techniques for hazard identification, classification, and control, developed for Spacelab, are presented. Hazards were classified as catastrophic (leading to crew or vehicle loss), critical (could lead to serious injury or damage), and controlled (counteracted by design). All nonmetallic materials were rated for flammability in oxygen-enriched atmospheres, toxic offgassing, and odor. Any element with a capability of fewer than 200 missions was rated life-limited.

  12. A Laboratory Course for Teaching Laboratory Techniques, Experimental Design, Statistical Analysis, and Peer Review Process to Undergraduate Science Students

    ERIC Educational Resources Information Center

    Gliddon, C. M.; Rosengren, R. J.

    2012-01-01

    This article describes a 13-week laboratory course called Human Toxicology taught at the University of Otago, New Zealand. This course used a guided inquiry based laboratory coupled with formative assessment and collaborative learning to develop in undergraduate students the skills of problem solving/critical thinking, data interpretation and…

  13. A Study in Critical Listening Using Eight to Ten Year Olds in an Analysis of Commercial Propaganda Emanating from Television.

    ERIC Educational Resources Information Center

    Cook, Jimmie Ellis

    Selected eight to ten year old Maryland children were used in this study measuring the effect of lessons in becoming aware of propaganda employed by commercial advertisers in television programs. Sixteen 45-minute lessons directed to the propaganda techniques of Band Wagon, Card Stacking, Glittering Generalities, Name Calling, Plain Folks,…

  14. Gender Positioning in Education: A Critical Image Analysis of ESL Texts.

    ERIC Educational Resources Information Center

    Giaschi, Peter

    2000-01-01

    This article is adapted from a project report prepared for the author's Master's degree in Education. The objective is to report the use of an adapted analytical technique for examining the images contained in contemporary English-as-a-Second-Language (ESL) textbooks. The point of departure for the study was the identification of the trend in mass…

  15. An Incremental Life-cycle Assurance Strategy for Critical System Certification

    DTIC Science & Technology

    2014-11-04

    for Safe Aircraft Operation: Embedded software systems introduce a new class of problems not addressed by traditional system modeling & analysis... Platform Runtime Architecture; Application Software; Embedded SW System Engineer; Data Stream Characteristics; latency jitter affects control behavior... Why do system-level failures still occur despite fault tolerance techniques being deployed in systems? Embedded software system as major source of...

  16. An Analysis of the Units "I'm Learning My Past" and "The Place Where We Live" in the Social Studies Textbook Related to Critical Thinking Standards

    ERIC Educational Resources Information Center

    Aybek, Birsel; Aslan, Serkan

    2016-01-01

    Problem Statement: Various studies have been conducted investigating the quality and quantity of textbooks, such as wording, content, design, visuality, physical properties, activities, methods and techniques, questions and experiments, events, misconceptions, organizations, pictures, text selection, end-of-unit questions and assessments, indexes…

  17. Systems modeling and simulation applications for critical care medicine

    PubMed Central

    2012-01-01

    Critical care delivery is a complex, expensive, and error-prone medical specialty and remains the focal point of major improvement efforts in healthcare delivery. Various modeling and simulation techniques offer unique opportunities to better understand the interactions between clinical physiology and care delivery. The novel insights gained from the systems perspective can then be used to develop and test new treatment strategies and make critical care delivery more efficient and effective. However, modeling and simulation applications in critical care remain underutilized. This article provides an overview of major computer-based simulation techniques as applied to critical care medicine. We provide three application examples of different simulation techniques, including a) pathophysiological model of acute lung injury, b) process modeling of critical care delivery, and c) an agent-based model to study interaction between pathophysiology and healthcare delivery. Finally, we identify certain challenges to, and opportunities for, future research in the area. PMID:22703718
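
    As a small flavour of example (b), process modeling of care delivery, the sketch below simulates patient flow through a fixed pool of ICU beds. The arrival rate, length of stay, and bed count are invented for illustration and are not from the article.

      import heapq
      import random

      def simulate_icu(n_beds=10, arrivals_per_day=2.0, mean_los_days=3.5,
                       horizon_days=365, seed=1):
          """Toy discrete-event model of ICU bed occupancy: Poisson arrivals,
          exponential lengths of stay; arrivals finding no free bed are diverted."""
          rng = random.Random(seed)
          discharges = []                 # min-heap of scheduled discharge times
          t, admitted, diverted = 0.0, 0, 0
          while t < horizon_days:
              t += rng.expovariate(arrivals_per_day)       # next arrival time
              while discharges and discharges[0] <= t:     # release finished beds
                  heapq.heappop(discharges)
              if len(discharges) < n_beds:
                  heapq.heappush(discharges, t + rng.expovariate(1.0 / mean_los_days))
                  admitted += 1
              else:
                  diverted += 1
          return admitted, diverted

      admitted, diverted = simulate_icu()
      print(f"admitted={admitted}, diverted={diverted} "
            f"({100 * diverted / (admitted + diverted):.1f}% diversion)")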

  18. Systematic risk assessment methodology for critical infrastructure elements - Oil and Gas subsectors

    NASA Astrophysics Data System (ADS)

    Gheorghiu, A.-D.; Ozunu, A.

    2012-04-01

    The concern for the protection of critical infrastructure has been rapidly growing in the last few years in Europe. The level of knowledge and preparedness in this field is beginning to develop in a legally organized manner, for the identification and designation of critical infrastructure elements of national and European interest. Oil and gas production, refining, treatment, storage, and transmission-by-pipeline facilities are considered European critical infrastructure sectors, as per Annex I of Council Directive 2008/114/EC of 8 December 2008 on the identification and designation of European critical infrastructures and the assessment of the need to improve their protection. Besides identifying European and national critical infrastructure elements, member states also need to perform a risk analysis for these infrastructure items, as stated in Annex II of the above-mentioned Directive. In the field of risk assessment, there is a series of acknowledged and successfully used methods in the world, but not all hazard identification and assessment methods and techniques are suitable for a given site, situation, or type of hazard. As Theoharidou et al. noted (Theoharidou, M., P. Kotzanikolaou, and D. Gritzalis 2009. Risk-Based Criticality Analysis. In Critical Infrastructure Protection III. Proceedings. Third Annual IFIP WG 11.10 International Conference on Critical Infrastructure Protection. Hanover, New Hampshire, USA, March 23-25, 2009: revised selected papers, edited by C. Palmer and S. Shenoi, 35-49. Berlin: Springer.), despite the wealth of knowledge already created, there is a need for simple, feasible, and standardized criticality analyses. The proposed systematic risk assessment methodology includes three basic steps: the first step (preliminary analysis) includes the identification of hazards (including possible natural hazards) for each installation/section within a given site, followed by a criterial analysis step and then a detailed analysis step. The criterial evaluation is used as a ranking system in order to establish the priorities for the detailed risk assessment. This criterial analysis stage is necessary because the total number of installations and sections on a site can be quite large. As not all installations and sections on a site contribute significantly to the risk of a major accident occurring, it is not efficient to include all of them in the detailed risk assessment, which can be time- and resource-consuming. The selected installations are then taken into consideration in the detailed risk assessment, which is the third step of the systematic risk assessment methodology. Following this step, conclusions can be drawn about the overall risk characteristics of the site. The proposed methodology can thus be successfully applied to the assessment of risk related to critical infrastructure elements falling under the energy sector, mainly the oil and gas sub-sectors. Key words: systematic risk assessment, criterial analysis, energy sector critical infrastructure elements

  19. Neural network robust tracking control with adaptive critic framework for uncertain nonlinear systems.

    PubMed

    Wang, Ding; Liu, Derong; Zhang, Yun; Li, Hongyi

    2018-01-01

    In this paper, we aim to tackle the neural robust tracking control problem for a class of nonlinear systems using the adaptive critic technique. The main contribution is that a neural-network-based robust tracking control scheme is established for nonlinear systems involving matched uncertainties. The augmented system considering the tracking error and the reference trajectory is formulated and then addressed under adaptive critic optimal control formulation, where the initial stabilizing controller is not needed. The approximate control law is derived via solving the Hamilton-Jacobi-Bellman equation related to the nominal augmented system, followed by closed-loop stability analysis. The robust tracking control performance is guaranteed theoretically via Lyapunov approach and also verified through simulation illustration. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Safety Analysis of Soybean Processing for Advanced Life Support

    NASA Technical Reports Server (NTRS)

    Hentges, Dawn L.

    1999-01-01

    Soybean (cv. Hoyt) is one of the crops planned for food production within the Advanced Life Support System Integration Testbed (ALSSIT), a proposed habitat simulation for long duration lunar/Mars missions. Soybeans may be processed into a variety of food products, including soymilk, tofu, and tempeh. Due to the closed environmental system and the importance of crew health maintenance, food safety is a primary concern on long duration space missions. Identification of the food safety hazards and critical control points associated with the closed ALSSIT system is essential for the development of safe food processing techniques and equipment. A Hazard Analysis Critical Control Point (HACCP) model was developed to reflect proposed production and processing protocols for ALSSIT soybeans. Soybean processing was placed in the type III risk category. During the processing of ALSSIT-grown soybeans, critical control points were identified to control microbiological hazards, particularly mycotoxins, and chemical hazards from antinutrients. Critical limits were suggested at each CCP. Food safety recommendations regarding the hazards and risks associated with growing, harvesting, and processing soybeans; biomass management; and the use of multifunctional equipment were made in consideration of the limitations and restraints of the closed ALSSIT.

  1. On Mixed Data and Event Driven Design for Adaptive-Critic-Based Nonlinear $H_{\\infty}$ Control.

    PubMed

    Wang, Ding; Mu, Chaoxu; Liu, Derong; Ma, Hongwen

    2018-04-01

    In this paper, based on the adaptive critic learning technique, the control for a class of unknown nonlinear dynamic systems is investigated by adopting a mixed data- and event-driven design approach. The nonlinear control problem is formulated as a two-player zero-sum differential game, and the adaptive critic method is employed to cope with the data-based optimization. The novelty lies in combining the data-driven learning identifier with the event-driven design formulation in order to develop the adaptive critic controller, thereby accomplishing the nonlinear control. The event-driven optimal control law and the time-driven worst-case disturbance law are approximated by constructing and tuning a critic neural network. Applying the event-driven feedback control, the closed-loop system is built with stability analysis. Simulation studies are conducted to verify the theoretical results and illustrate the control performance. It is significant to observe that the present research provides a new avenue for integrating data-based control and event-triggering mechanisms into establishing advanced adaptive critic systems.

  2. Exploring Gigabyte Datasets in Real Time: Architectures, Interfaces and Time-Critical Design

    NASA Technical Reports Server (NTRS)

    Bryson, Steve; Gerald-Yamasaki, Michael (Technical Monitor)

    1998-01-01

    Architectures and Interfaces: The implications of real-time interaction on software architecture design: decoupling of interaction/graphics and computation into asynchronous processes. The performance requirements of graphics and computation for interaction. Time management in such an architecture. Examples of how visualization algorithms must be modified for high performance. Brief survey of interaction techniques and design, including direct manipulation and manipulation via widgets. The talk discusses how human factors considerations drove the design and implementation of the virtual wind tunnel. Time-Critical Design: A survey of time-critical techniques for both computation and rendering. Emphasis on the assignment of a time budget to both the overall visualization environment and to each individual visualization technique in the environment. The estimation of the benefit and cost of an individual technique. Examples of the modification of visualization algorithms to allow time-critical control.

  3. A Regev-type fully homomorphic encryption scheme using modulus switching.

    PubMed

    Chen, Zhigang; Wang, Jian; Chen, Liqun; Song, Xinxia

    2014-01-01

    A critical challenge in a fully homomorphic encryption (FHE) scheme is to manage noise. The modulus switching technique is currently the most efficient noise management technique. When using the modulus switching technique to design and implement an FHE scheme, choosing concrete parameters is an important step, but to the best of our knowledge this step has drawn very little attention in the existing FHE research literature. The contributions of this paper are twofold. On one hand, we propose a function for the lower bound of the dimension value in the modulus switching technique, depending on LWE-specific security levels. On the other hand, as a case study, we modify the Brakerski FHE scheme (in Crypto 2012) by using the modulus switching technique. We recommend concrete parameter values for our proposed scheme and provide a security analysis. Our result shows that the modified FHE scheme is more efficient than the original Brakerski scheme at the same security level.
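
    The heart of modulus switching is a parity-preserving rescaling of ciphertext values from a large modulus q to a smaller q'. The toy below shows only that rescaling step, on a degenerate "ciphertext" with no LWE mask; real schemes apply it coefficient-wise to full LWE ciphertexts, and all values here are illustrative.

      def mod_switch(c, q, q_new):
          """Map c in Z_q to the integer in Z_{q_new} closest to (q_new/q)*c
          that preserves the parity of c, as in BGV-style modulus switching."""
          x = q_new * c / q
          r = round(x)
          if r % 2 != c % 2:            # fix parity by stepping toward x
              r += 1 if x > r else -1
          return r % q_new

      def decrypt(c, q):                # centre mod q, then reduce mod 2
          c = c if c <= q // 2 else c - q
          return c % 2

      q, q_new = 2**20, 2**12
      m, e = 1, 173                     # message bit, small noise term
      c = (m + 2 * e) % q               # degenerate "ciphertext" with no LWE mask
      c_new = mod_switch(c, q, q_new)   # noise magnitude shrinks by roughly q_new/q
      print(decrypt(c, q), decrypt(c_new, q_new))   # both recover m = 1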

  4. Contamination assessment and control in scientific satellites

    NASA Technical Reports Server (NTRS)

    Naumann, R. J.

    1973-01-01

    Techniques for assessment and control of the contamination environment for both particulates and condensible vapors in the vicinity of spacecraft are developed. An analysis of the deposition rate on critical surfaces is made considering sources within the line of sight of the surface in question as well as those obscured from the line of sight. The amount of contamination returned by collision with the surrounding atmosphere is estimated. Scattering and absorption from the induced atmosphere of gases and particulates around the spacecraft are estimated. Finally, design techniques developed for Skylab to reduce the contamination environment to an acceptable level are discussed.

  5. Vision sensing techniques in aeronautics and astronautics

    NASA Technical Reports Server (NTRS)

    Hall, E. L.

    1988-01-01

    The close relationship between sensing and other tasks in orbital space, and the integral role of vision sensing in practical aerospace applications, are illustrated. Typical space mission-vision tasks encompass the docking of space vehicles, the detection of unexpected objects, the diagnosis of spacecraft damage, and the inspection of critical spacecraft components. Attention is presently given to image functions, the 'windowing' of a view, the number of cameras required for inspection tasks, the choice of incoherent or coherent (laser) illumination, three-dimensional-to-two-dimensional model-matching, edge- and region-segmentation techniques, and motion analysis for tracking.

  6. Pilot workload and fatigue: A critical survey of concepts and assessment techniques

    NASA Technical Reports Server (NTRS)

    Gartner, W. B.; Murphy, M. R.

    1976-01-01

    The principal unresolved issues in conceptualizing and measuring pilot workload and fatigue are discussed. These issues are seen as limiting the development of more useful working concepts and techniques and their application to systems engineering and management activities. A conceptual analysis of pilot workload and fatigue, an overview and critique of approaches to the assessment of these phenomena, and a discussion of current trends in the management of unwanted workload and fatigue effects are presented. Refinements and innovations in assessment methods are recommended for enhancing the practical significance of workload and fatigue studies.

  7. Static and Dynamic Verification of Critical Software for Space Applications

    NASA Astrophysics Data System (ADS)

    Moreira, F.; Maia, R.; Costa, D.; Duro, N.; Rodríguez-Dapena, P.; Hjortnaes, K.

    Space technology is no longer used only for highly specialised research activities or for sophisticated manned space missions. Modern society relies more and more on space technology and applications for everyday activities. Worldwide telecommunications, Earth observation, navigation and remote sensing are only a few examples of space applications on which we rely daily. The European-driven global navigation system Galileo and its associated applications, e.g. air traffic management, vessel and car navigation, will significantly expand the already stringent safety requirements for space-based applications. Apart from their usefulness and practical applications, every single piece of onboard software deployed into space represents an enormous investment. With long-lifetime operation, and being extremely difficult to maintain and upgrade, at least when compared with "mainstream" software development, the importance of ensuring their correctness before deployment is immense. Verification & Validation techniques and technologies have a key role in ensuring that onboard software is correct and error free, or at least free from errors that can potentially lead to catastrophic failures. Many RAMS techniques, including both static criticality analysis and dynamic verification techniques, have been used as a means to verify and validate critical software and to ensure its correctness. Traditionally, however, these have been applied in isolation. One of the main reasons is the immaturity of this field as concerns its application to the growing software content of space systems. This paper presents an innovative way of combining both static and dynamic techniques, exploiting their synergy and complementarity for software fault removal. The proposed methodology is based on the combination of Software FMEA and FTA with fault-injection techniques. The case study herein described is implemented with support from two tools: the SoftCare tool for the SFMEA and SFTA, and the Xception tool for fault injection. Keywords: Verification & Validation, RAMS, Onboard software, SFMEA, SFTA, Fault-injection. This work is being performed under the project STADY (Applied Static And Dynamic Verification Of Critical Software), ESA/ESTEC Contract Nr. 15751/02/NL/LvH.

  8. The effect of physics-based scientific learning on the improvement of the student’s critical thinking skills

    NASA Astrophysics Data System (ADS)

    Zaidah, A.; Sukarmin; Sunarno, W.

    2018-04-01

    This study aimed to determine the influence of physics-based scientific learning on the improvement of students' critical thinking skills. This research was quantitative, with conclusions drawn through statistical analysis. It was carried out at MA (Senior High School) Mu'allimat NW Pancor in the second semester of the 2016/2017 academic year with students of class XI. Sampling was done using the purposive sampling technique, with class XI 6 selected. Based on the results of the descriptive analysis, an average pre-test score of 49.17 and an average post-test score of 82.43 were obtained. The results also showed an average gain score of 0.67, in the medium category. The inferential analysis showed a value of t = 22.559, while the ttable at the 5% significance level was 2.04. Thus t > ttable, so Ha is accepted. Therefore, the pre-test and post-test scores differed significantly when the students used scientific-based learning. The results show that physics-based scientific learning improves students' critical thinking skills.
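
    The reported gain statistic can be sanity-checked directly; the sketch below assumes a maximum score of 100, which is an assumption, since the instrument's scale is not stated in the abstract.

      # Hake's normalized gain g = (post - pre) / (max - pre), computed on the
      # class-average scores reported above; a maximum score of 100 is assumed.
      pre, post, max_score = 49.17, 82.43, 100.0
      g = (post - pre) / (max_score - pre)
      print(f"normalized gain g = {g:.2f}")   # ~0.65, i.e. "medium" (0.3 <= g < 0.7)
      # Close to the reported 0.67; averaging per-student gains, as is common,
      # gives slightly different values than the gain of the class averages.
      # The reported t = 22.559 far exceeds the 5% critical value of 2.04.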

  9. Web-based Visual Analytics for Extreme Scale Climate Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Evans, Katherine J; Harney, John F

    In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.

  10. On the convergence of nanotechnology and Big Data analysis for computer-aided diagnosis.

    PubMed

    Rodrigues, Jose F; Paulovich, Fernando V; de Oliveira, Maria Cf; de Oliveira, Osvaldo N

    2016-04-01

    An overview is provided of the challenges involved in building computer-aided diagnosis systems capable of precise medical diagnostics based on integration and interpretation of data from different sources and formats. The availability of massive amounts of data and computational methods associated with the Big Data paradigm has brought hope that such systems may soon be available in routine clinical practices, which is not the case today. We focus on visual and machine learning analysis of medical data acquired with varied nanotech-based techniques and on methods for Big Data infrastructure. Because diagnosis is essentially a classification task, we address the machine learning techniques with supervised and unsupervised classification, making a critical assessment of the progress already made in the medical field and the prospects for the near future. We also advocate that successful computer-aided diagnosis requires a merge of methods and concepts from nanotechnology and Big Data analysis.

  11. Decorrelation correction for nanoparticle tracking analysis of dilute polydisperse suspensions in bulk flow

    NASA Astrophysics Data System (ADS)

    Hartman, John; Kirby, Brian

    2017-03-01

    Nanoparticle tracking analysis, a multiprobe single particle tracking technique, is a widely used method to quickly determine the concentration and size distribution of colloidal particle suspensions. Many popular tools remove non-Brownian components of particle motion by subtracting the ensemble-average displacement at each time step, which is termed dedrifting. Though critical for accurate size measurements, dedrifting is shown here to introduce significant biasing error and can fundamentally limit the dynamic range of particle size that can be measured for dilute heterogeneous suspensions such as biological extracellular vesicles. We report a more accurate estimate of particle mean-square displacement, which we call decorrelation analysis, that accounts for correlations between individual and ensemble particle motion, which are spuriously introduced by dedrifting. Particle tracking simulation and experimental results show that this approach more accurately determines particle diameters for low-concentration polydisperse suspensions when compared with standard dedrifting techniques.
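
    The dedrifting step under discussion is easy to reproduce. The sketch below, with invented diffusivity, drift, and particle counts, subtracts the ensemble-mean displacement at each time step and shows the small downward bias this introduces in the mean-square displacement, which is the effect the decorrelation correction addresses.

      import numpy as np

      rng = np.random.default_rng(0)
      n_particles, n_steps, dt = 50, 200, 0.01
      D = 1.0                                  # diffusivity
      drift = np.array([5.0, 0.0])             # common bulk-flow drift

      # 2-D Brownian steps plus the shared drift, then standard dedrifting:
      # subtract the ensemble-average displacement at every time step.
      steps = (rng.normal(scale=np.sqrt(2 * D * dt),
                          size=(n_particles, n_steps, 2)) + drift * dt)
      steps_dedrifted = steps - steps.mean(axis=0)

      msd_lag1 = np.mean(np.sum(steps_dedrifted**2, axis=2))
      print(f"lag-1 MSD = {msd_lag1:.4f}, ideal 4*D*dt = {4 * D * dt:.4f}, "
            f"dedrifting bias factor 1 - 1/N = {1 - 1 / n_particles:.3f}")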

  12. The integrated analysis capability (IAC Level 2.0)

    NASA Technical Reports Server (NTRS)

    Frisch, Harold P.; Vos, Robert G.

    1988-01-01

    The critical data management issues involved in the development of the integrated analysis capability (IAC), Level 2, to support the design analysis and performance evaluation of large space structures are examined. In particular, attention is given to the advantages and disadvantages of the formalized data base; the merging of the matrix and relational data concepts; data types, query operators, and data handling; sequential versus direct-access files; local versus global data access; programming languages and host machines; and data flow techniques. The discussion also covers system architecture, recent system-level enhancements, executive/user interface capabilities, and technology applications.

  13. Performance criteria for emergency medicine residents: a job analysis.

    PubMed

    Blouin, Danielle; Dagnone, Jeffrey Damon

    2008-11-01

    A major role of admission interviews is to assess a candidate's suitability for a residency program. Structured interviews have greater reliability and validity than do unstructured ones. The development of content for a structured interview is typically based on the dimensions of performance that are perceived as important to succeed in a particular line of work. A formal job analysis is normally conducted to determine these dimensions. The dimensions essential to succeed as an emergency medicine (EM) resident have not yet been studied. We aimed to analyze the work of EM residents to determine these essential dimensions. The "critical incident technique" was used to generate scenarios of poor and excellent resident performance. Two reviewers independently read each scenario and labelled the performance dimensions that were reflected in each. All labels assigned to a particular scenario were pooled and reviewed again until a consensus was reached. Five faculty members (25% of our total faculty) comprised the subject experts. Fifty-one incidents were generated and 50 different labels were applied. Eleven dimensions of performance applied to at least 5 incidents. "Professionalism" was the most valued performance dimension, represented in 56% of the incidents, followed by "self-confidence" (22%), "experience" (20%) and "knowledge" (20%). "Professionalism," "self-confidence," "experience" and "knowledge" were identified as the performance dimensions essential to succeed as an EM resident based on our formal job analysis using the critical incident technique. Performing a formal job analysis may assist training program directors with developing admission interviews.

  14. Evolution and Advances in Satellite Analysis of Volcanoes

    NASA Astrophysics Data System (ADS)

    Dean, K. G.; Dehn, J.; Webley, P.; Bailey, J.

    2008-12-01

    Over the past 20 years, satellite data used for monitoring and analysis of volcanic eruptions have evolved in terms of timeliness, access, distribution, resolution, and understanding of volcanic processes. Initially, satellite data were used for retrospective analysis, but use has since evolved into proactive monitoring systems. Timely acquisition of data and the capability to distribute large data files paralleled advances in computer technology and were critical components for near real-time monitoring. The sharing of these data and the resulting discussions have improved our understanding of eruption processes and, even more importantly, their impact on society. To illustrate this evolution, critical scientific discoveries will be highlighted, including detection of airborne ash and sulfur dioxide, cloud-height estimates, prediction of ash cloud movement, and detection of thermal anomalies as precursor signals to eruptions. AVO has been a leader in implementing many of these advances in an operational setting, such as automated eruption detection, database analysis systems, and remotely accessible web-based analysis systems. Finally, limitations resulting from resolution trade-offs, and how these affect detection techniques and hazard assessments, will be presented.

  15. Improved consolidation of silicon carbide

    NASA Technical Reports Server (NTRS)

    Freedman, M. R.; Millard, M. L.

    1986-01-01

    Alpha silicon carbide powder was consolidated by both dry and wet methods. Dry pressing in a double acting steel die yielded sintered test bars with an average flexural strength of 235.6 MPa with a critical flaw size of approximately 100 µm. An aqueous slurry pressing technique produced sintered test bars with an average flexural strength of 440.8 MPa with a critical flaw size of approximately 25 µm. Image analysis revealed a reduction in both pore area and pore size distribution in the slurry pressed sintered test bars. The improvements in the slurry pressed material properties are discussed in terms of reduced agglomeration and improved particle packing during consolidation.
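
    The two strength/flaw-size pairs are mutually consistent under standard fracture mechanics scaling, assuming (for illustration only) the same fracture toughness and flaw geometry in both materials:

      # Griffith/Irwin scaling: sigma = K_Ic / (Y * sqrt(pi * a)), so for equal
      # toughness and geometry the critical flaw size scales as a ~ 1/sigma^2.
      s_dry, a_dry = 235.6, 100.0        # MPa, µm (dry pressed)
      s_slurry = 440.8                   # MPa (slurry pressed)
      a_slurry = a_dry * (s_dry / s_slurry) ** 2
      print(f"predicted slurry-pressed flaw size = {a_slurry:.0f} µm")  # ~29 vs ~25 reported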

  16. Timoshenko-Type Theory in the Stability Analysis of Corrugated Cylindrical Shells

    NASA Astrophysics Data System (ADS)

    Semenyuk, N. P.; Neskhodovskaya, N. A.

    2002-06-01

    A technique is proposed for stability analysis of longitudinally corrugated shells under axial compression. The technique employs the equations of the Timoshenko-type nonlinear theory of shells. The geometrical parameters of the shells are specified on a discrete set of points and are approximated by segments of Fourier series. Infinite systems of homogeneous algebraic equations are derived from a variational equation written in displacements to determine the critical loads and buckling modes. Specific types of corrugated isotropic metal and fiberglass shells are considered. The calculated results are compared with those obtained within the framework of the classical theory of shells. It is shown that the Timoshenko-type theory significantly extends the possibility of exactly allowing for the geometrical parameters and material properties of corrugated shells compared with the Kirchhoff-Love theory.
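
    The geometry-approximation step can be sketched numerically: sample a corrugated profile at discrete points and least-squares fit a truncated Fourier series. The profile shape, amplitude, and harmonic count below are invented for illustration.

      import numpy as np

      # Corrugated radial profile sampled at discrete points (illustrative shape).
      theta = np.linspace(0, 2 * np.pi, 160, endpoint=False)
      r = 1.0 + 0.05 * np.sign(np.sin(6 * theta))     # 6-wave square corrugation

      # Least-squares fit of r(theta) ~ a0 + sum_k (ak cos k*theta + bk sin k*theta)
      n_harm = 24
      A = np.column_stack([np.ones_like(theta)]
                          + [f(k * theta) for k in range(1, n_harm + 1)
                             for f in (np.cos, np.sin)])
      coef, *_ = np.linalg.lstsq(A, r, rcond=None)
      print(f"max profile fit error: {np.abs(A @ coef - r).max():.4f}")
      # Accuracy depends on how many harmonics are retained; sharp corrugations
      # need more terms than smooth ones.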

  17. Cost collection and analysis for health economic evaluation.

    PubMed

    Smith, Kristine A; Rudmik, Luke

    2013-08-01

    To improve the understanding of common health care cost collection, estimation, analysis, and reporting methodologies. Ovid MEDLINE (1947 to December 2012), Cochrane Central register of Controlled Trials, Database of Systematic Reviews, Health Technology Assessment, and National Health Service Economic Evaluation Database. This article discusses the following cost collection methods: defining relevant resources, quantification of consumed resources, and resource valuation. It outlines the recommendations for cost reporting in economic evaluations and reviews the techniques on how to handle cost data uncertainty. Last, it discusses the controversial topics of future costs and patient productivity losses. Health care cost collection and estimation can be challenging, and an organized approach is required to optimize accuracy of economic evaluation outcomes. Understanding health care cost collection and estimation techniques will improve both critical appraisal and development of future economic evaluations.

  18. Ideal, nonideal, and no-marker variables: The confirmatory factor analysis (CFA) marker technique works when it matters.

    PubMed

    Williams, Larry J; O'Boyle, Ernest H

    2015-09-01

    A persistent concern in the management and applied psychology literature is the effect of common method variance on observed relations among variables. Recent work (i.e., Richardson, Simmering, & Sturman, 2009) evaluated 3 analytical approaches to controlling for common method variance, including the confirmatory factor analysis (CFA) marker technique. Their findings indicated significant problems with this technique, especially with nonideal marker variables (those with theoretical relations with substantive variables). Based on their simulation results, Richardson et al. concluded that not correcting for method variance provides more accurate estimates than using the CFA marker technique. We reexamined the effects of using marker variables in a simulation study and found the degree of error in estimates of a substantive factor correlation was relatively small in most cases, and much smaller than the error associated with making no correction. Further, in instances in which the error was large, the correlations between the marker and substantive scales were higher than those found in organizational research with marker variables. We conclude that in most practical settings, the CFA marker technique yields parameter estimates close to their true values, and the criticisms made by Richardson et al. are overstated. (c) 2015 APA, all rights reserved.
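
    A toy simulation makes the underlying problem concrete: a shared "method" factor inflates the observed correlation between two substantive measures, and partialling out a marker that taps only the method variance moves the estimate back toward the true value. This uses a simple partial-correlation adjustment rather than the full CFA marker technique, and every value below is invented.

      import numpy as np

      rng = np.random.default_rng(0)
      n, rho, lam = 5000, 0.30, 0.5   # sample size, latent correlation, method loading

      f1 = rng.normal(size=n)
      f2 = rho * f1 + np.sqrt(1 - rho**2) * rng.normal(size=n)   # corr(f1, f2) = rho
      m = rng.normal(size=n)                                     # common method factor

      x = f1 + lam * m + 0.5 * rng.normal(size=n)   # substantive measure 1
      y = f2 + lam * m + 0.5 * rng.normal(size=n)   # substantive measure 2
      mk = m + 0.5 * rng.normal(size=n)             # marker: method variance only

      r_xy = np.corrcoef(x, y)[0, 1]                # inflated by shared method variance
      r_xm = np.corrcoef(x, mk)[0, 1]
      r_ym = np.corrcoef(y, mk)[0, 1]
      r_adj = (r_xy - r_xm * r_ym) / np.sqrt((1 - r_xm**2) * (1 - r_ym**2))
      # The adjusted estimate remains slightly attenuated by measurement error.
      print(f"observed r = {r_xy:.3f}, adjusted r = {r_adj:.3f}, latent rho = {rho}")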

  19. Getting the Most Out of Dual-Listed Courses: Involving Undergraduate Students in Discussion Through Active Learning Techniques

    NASA Astrophysics Data System (ADS)

    Tasich, C. M.; Duncan, L. L.; Duncan, B. R.; Burkhardt, B. L.; Benneyworth, L. M.

    2015-12-01

    Dual-listed courses will persist in higher education because of resource limitations. The pedagogical differences between undergraduate and graduate STEM student groups and the underlying distinction in intellectual development levels between the two student groups complicate the inclusion of undergraduates in these courses. Active learning techniques are a possible remedy to the hardships undergraduate students experience in graduate-level courses. Through an analysis of both undergraduate and graduate student experiences while enrolled in a dual-listed course, we implemented a variety of learning techniques used to complement the learning of both student groups and enhance deep discussion. Here, we provide details concerning the implementation of four active learning techniques - role play, game, debate, and small group - that were used to help undergraduate students critically discuss primary literature. Student perceptions were gauged through an anonymous, end-of-course evaluation that contained basic questions comparing the course to other courses at the university and other salient aspects of the course. These were given as a Likert scale on which students rated a variety of statements (1 = strongly disagree, 3 = no opinion, and 5 = strongly agree). Undergraduates found active learning techniques to be preferable to traditional techniques with small-group discussions being rated the highest in both enjoyment and enhanced learning. The graduate student discussion leaders also found active learning techniques to improve discussion. In hindsight, students of all cultures may be better able to take advantage of such approaches and to critically read and discuss primary literature when written assignments are used to guide their reading. Applications of active learning techniques can not only address the gap between differing levels of students, but also serve as a complement to student engagement in any science course design.

  20. Quantitative ultrasonic evaluation of mechanical properties of engineering materials

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1978-01-01

    Current progress in the application of ultrasonic techniques to nondestructive measurement of mechanical strength properties of engineering materials is reviewed. Even where conventional NDE techniques have shown that a part is free of overt defects, advanced NDE techniques should be available to confirm the material properties assumed in the part's design. There are many instances where metallic, composite, or ceramic parts may be free of critical defects while still being susceptible to failure under design loads due to inadequate or degraded mechanical strength. This must be considered in any failure prevention scheme that relies on fracture analysis. This review will discuss the availability of ultrasonic methods that can be applied to actual parts to assess their potential susceptibility to failure under design conditions.

  1. Groundwork for the Concept of Technique in Education: Herbert Marcuse and Technological Society

    ERIC Educational Resources Information Center

    Pierce, Clayton

    2006-01-01

    This article articulates the groundwork for a new understanding of the concept of technique through a critical engagement with Herbert Marcuse's critical theory of technology. To this end, it identifies and engages three expressions of technique in Marcuse's work: mimesis, reified labor, and the happy consciousness. It is argued that this mapping…

  2. Risk assessment techniques with applicability in marine engineering

    NASA Astrophysics Data System (ADS)

    Rudenko, E.; Panaitescu, F. V.; Panaitescu, M.

    2015-11-01

    Nowadays, risk management is a carefully planned process. The task of risk management is organically woven into the general problem of increasing the efficiency of a business. A passive attitude to risk and mere awareness of its existence are replaced by active management techniques. Risk assessment is one of the most important stages of risk management, since to manage risk it is first necessary to analyze and evaluate it. There are many definitions of this notion, but in the general case risk assessment refers to the systematic process of identifying the factors and types of risk and their quantitative assessment; i.e., risk analysis methodology combines mutually complementary quantitative and qualitative approaches. Purpose of the work: In this paper we consider Fault Tree Analysis (FTA) as a risk assessment technique. The objectives are to understand the purpose of FTA, understand and apply the rules of Boolean algebra, analyse a simple system using FTA, and review FTA advantages and disadvantages. Research and methodology: The main purpose is to help identify potential causes of system failures before the failures actually occur, and to evaluate the probability of the top event. The steps of this analysis are: examination of the system from the top down; the use of symbols to represent events; the use of mathematical tools for critical areas; and the use of fault tree logic diagrams to identify the cause of the top event. Results: At the end of the study the following are obtained: critical areas, fault tree logic diagrams, and the probability of the top event. These results can be used for risk assessment analyses.
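
    As a minimal illustration of the evaluation step, the sketch below computes a top-event probability for a small hypothetical tree, assuming independent basic events; all event names and probabilities are invented.

      # AND gates multiply probabilities; OR gates combine as 1 - prod(1 - p).
      from math import prod

      def AND(*ps): return prod(ps)
      def OR(*ps):  return 1 - prod(1 - p for p in ps)

      p_pump_fail   = 0.01
      p_valve_stuck = 0.02
      p_sensor_fail = 0.05
      p_alarm_fail  = 0.10

      # Top event: loss of cooling AND failure to detect it.
      p_no_cooling = OR(p_pump_fail, p_valve_stuck)
      p_no_detect  = AND(p_sensor_fail, p_alarm_fail)   # both detection layers fail
      p_top        = AND(p_no_cooling, p_no_detect)
      print(f"P(top event) = {p_top:.2e}")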

  3. Twenty-five-gauge vitrectomy versus 23-gauge vitrectomy in the management of macular diseases: a comparative analysis through a Health Technology Assessment model.

    PubMed

    Grosso, Andrea; Charrier, Lorena; Lovato, Emanuela; Panico, Claudio; Mariotti, Cesare; Dapavo, Giancarlo; Chiuminatto, Roberto; Siliquini, Roberta; Gianino, Maria Michela

    2014-04-01

    Small-gauge vitreoretinal techniques have been shown to be safe and effective in the management of a wide spectrum of vitreoretinal diseases. However, the costs of the new technologies may represent a critical issue for national health systems. The aim of the study is to plan a Health Technology Assessment (HTA) by performing a comparative analysis between the 23- and 25-gauge techniques in the management of macular diseases (epiretinal membranes, macular holes, vitreo-macular traction syndrome). In this prospective study, 45-80-year-old patients undergoing vitrectomy surgery for macular disease were enrolled at the Torino Eye Hospital. In the HTA model we assessed the safety, clinical effectiveness, and cost and financial evaluation of 23-gauge compared with 25-gauge vitrectomies. Fifty patients entered the study; 14 patients underwent 23-gauge vitrectomy and 36 underwent 25-gauge vitrectomy. There was no statistically significant difference in post-operative visual acuity at 1 year between the two groups. No cases of retinal detachment or endophthalmitis were registered at 1-year follow-up. The 23-gauge technique was slightly more expensive than the 25-gauge: the total surgical costs were EUR1217.70 versus EUR1164.84 (p = 0.351). We provide a financial comparison between new vitreoretinal procedures recently introduced in the market and reimbursed by the Italian National Health System and we also stimulate a critical debate about the expensive technocratic model of medicine.

  4. Modelling, design and stability analysis of an improved SEPIC converter for renewable energy systems

    NASA Astrophysics Data System (ADS)

    G, Dileep; Singh, S. N.; Singh, G. K.

    2017-09-01

    In this paper, a detailed modelling and analysis of a switched-inductor (SI)-based improved single-ended primary inductor converter (SEPIC) is presented. To increase the gain of the conventional SEPIC converter, the input and output side inductors are replaced with SI structures. Design and stability analysis for continuous conduction mode operation of the proposed SI-SEPIC converter is also presented in this paper. The state space averaging technique is used to model the converter and carry out the stability analysis. Performance and stability of the closed-loop configuration are predicted by observing the open-loop behaviour using the Nyquist diagram and Nichols chart. The system was found to be stable and critically damped.

  5. Coxiella Burnetti Vaccine Development: Lipopolysaccharide Structural Analysis

    DTIC Science & Technology

    1989-12-29

    linkage, branching, and sequence, by periodate oxidation, supercritical fluid chromatography, and mass spectrometry. These techniques combine to provide the elements of a global approach to oligosaccharide structure... Supercritical fluid chromatography of a PFBAB-labeled maltodextrin sample prepared as the acetate derivative, on a cyanopropyl SFC column using CO2 as the... The utility of supercritical fluid chromatography for a determination of Lipid-A...

  6. Air Mobility Command’s Total Force Integration: A Critical Analysis

    DTIC Science & Technology

    2012-02-26

    variety of techniques available to positively change culture... Changing the assumptions of in-group collectivism requires increasing breadth and open... there have been issues: command structure, cultural differences, and vague guidance for implementation have plagued Air Mobility Command’s... through the Governor and not the President of the United States. Cultural Considerations: The purpose of this section is to identify and discuss...

  7. Skin prick tests and allergy diagnosis.

    PubMed

    Antunes, João; Borrego, Luís; Romeira, Ana; Pinto, Paula

    2009-01-01

    Skin testing remains an essential diagnostic tool in modern allergy practice. A significant variability has been reported regarding technical procedures, interpretation of results and documentation. This review has the aim of consolidating methodological recommendations through a critical analysis on past and recent data. This will allow a better understanding on skin prick test (SPT) history; technique; (contra-) indications; interpretation of results; diagnostic pitfalls; adverse reactions; and variability factors.

  8. Scalable collaborative risk management technology for complex critical systems

    NASA Technical Reports Server (NTRS)

    Campbell, Scott; Torgerson, Leigh; Burleigh, Scott; Feather, Martin S.; Kiper, James D.

    2004-01-01

    We describe here our project and plans to develop methods, software tools, and infrastructure tools to address challenges relating to geographically distributed software development. Specifically, this work is creating an infrastructure that supports applications working over distributed geographical and organizational domains and is using this infrastructure to develop a tool that supports project development using risk management and analysis techniques where the participants are not collocated.

  9. Microfluidic Devices for Forensic DNA Analysis: A Review.

    PubMed

    Bruijns, Brigitte; van Asten, Arian; Tiggelaar, Roald; Gardeniers, Han

    2016-08-05

    Microfluidic devices may offer various advantages for forensic DNA analysis, such as reduced risk of contamination, shorter analysis time and direct application at the crime scene. Microfluidic chip technology has already proven to be functional and effective within medical applications, such as for point-of-care use. In the forensic field, one may expect microfluidic technology to become particularly relevant for the analysis of biological traces containing human DNA. This would require a number of consecutive steps, including sample work up, DNA amplification and detection, as well as secure storage of the sample. This article provides an extensive overview of microfluidic devices for cell lysis, DNA extraction and purification, DNA amplification and detection and analysis techniques for DNA. Topics to be discussed are polymerase chain reaction (PCR) on-chip, digital PCR (dPCR), isothermal amplification on-chip, chip materials, integrated devices and commercially available techniques. A critical overview of the opportunities and challenges of the use of chips is discussed, and developments made in forensic DNA analysis over the past 10-20 years with microfluidic systems are described. Areas in which further research is needed are indicated in a future outlook.

  10. Muscle mass and physical recovery in ICU: innovations for targeting of nutrition and exercise.

    PubMed

    Wischmeyer, Paul E; Puthucheary, Zudin; San Millán, Iñigo; Butz, Daniel; Grocott, Michael P W

    2017-08-01

    We have significantly improved hospital mortality from sepsis and critical illness in the last 10 years; however, over this same period we have tripled the number of 'ICU survivors' going to rehabilitation. Furthermore, as up to half the deaths in the first year following ICU admission occur post-ICU discharge, it is unclear how many of these patients ever return home or regain a meaningful quality of life. For those who do survive, recent data reveal many 'ICU survivors' will suffer significant functional impairment or post-ICU syndrome (PICS). Thus, new innovative metabolic and exercise interventions to address PICS are urgently needed. These should focus on optimal nutrition and lean body mass (LBM) assessment, targeted nutrition delivery, anabolic/anticatabolic strategies, and personalized exercise intervention techniques, such as those utilized by elite athletes, to optimize preparation for and recovery from critical care. New data are available for novel LBM analysis techniques, such as computerized tomography and ultrasound analysis of LBM, showing that objective measures of LBM are becoming more practical for predicting metabolic reserve and the effectiveness of nutrition/exercise interventions. 13C-breath testing is a novel technique under study to predict infection earlier and to detect over-feeding and under-feeding so as to target nutrition delivery. New technologies utilized routinely by athletes, such as muscle glycogen ultrasound, also show promise. Finally, the role of personalized cardiopulmonary exercise testing to target preoperative exercise optimization and post-ICU recovery is becoming a reality. New innovative techniques are demonstrating promise for targeting recovery from PICS using a combination of objective LBM and metabolic assessment, targeted nutrition interventions, and personalized exercise interventions for prehabilitation and post-ICU recovery. These interventions should provide hope that we will soon begin to create more 'survivors' and fewer victims of post-ICU care.

  11. Reviewing the connection between speech and obstructive sleep apnea.

    PubMed

    Espinoza-Cuadros, Fernando; Fernández-Pozo, Rubén; Toledano, Doroteo T; Alcázar-Ramírez, José D; López-Gonzalo, Eduardo; Hernández-Gómez, Luis A

    2016-02-20

    Obstructive sleep apnea (OSA) is a common sleep disorder characterized by recurring breathing pauses during sleep caused by a blockage of the upper airway (UA). The altered UA structure or function in OSA speakers has led to the hypothesis that automatic analysis of speech could support OSA assessment. In this paper we critically review several approaches using speech analysis and machine learning techniques for OSA detection, and discuss the limitations that can arise when using machine learning techniques for diagnostic applications. A large speech database including 426 male Spanish speakers suspected to suffer from OSA and referred to a sleep disorders unit was used to study the clinical validity of several proposals using machine learning techniques to predict the apnea-hypopnea index (AHI) or classify individuals according to their OSA severity. The AHI describes the severity of a patient's condition. We first evaluate AHI prediction using state-of-the-art speaker recognition technologies: speech spectral information is modelled using supervector or i-vector techniques, and AHI is predicted through support vector regression (SVR). Using the same database we then critically review several OSA classification approaches previously proposed. The influence and possible interference of other clinical variables available for our OSA population (age, height, weight, body mass index, and cervical perimeter) are also studied. The poor results obtained when estimating AHI using supervectors or i-vectors followed by SVR contrast with the positive results reported by previous research. This fact prompted a careful review of these approaches, also testing some reported results on our database. Several methodological limitations and deficiencies were detected that may have led to overoptimistic results. The methodological deficiencies observed after critically reviewing previous research can be relevant examples of potential pitfalls when using machine learning techniques for diagnostic applications. We have found two common limitations that can explain the likelihood of false discovery in previous research: (1) the use of prediction models derived from sources, such as speech, which are also correlated with other patient characteristics (age, height, sex,…) that act as confounding factors; and (2) overfitting of feature selection and validation methods when working with a high number of variables compared to the number of cases. We hope this study will not only be a useful example of relevant issues when using machine learning for medical diagnosis, but will also help guide further research on the connection between speech and OSA.
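
    The second limitation is mechanical enough to demonstrate: with many features and few cases, feature selection must sit inside the cross-validation loop. The sketch below uses synthetic, pure-noise data and shows the honest estimate; selecting features on all the data first and then cross-validating would report spuriously high accuracy on the same noise.

      import numpy as np
      from sklearn.pipeline import Pipeline
      from sklearn.feature_selection import SelectKBest, f_classif
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(120, 500))      # many features, few cases (as in the review)
      y = rng.integers(0, 2, size=120)     # labels with no true relation to X

      # Feature selection inside the pipeline: held-out folds never inform it.
      pipe = Pipeline([("select", SelectKBest(f_classif, k=10)),
                       ("clf", SVC())])
      scores = cross_val_score(pipe, X, y, cv=5)
      print(f"honest CV accuracy: {scores.mean():.2f}")   # near 0.5 (chance), as it should be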

  12. Protein identification and quantification from riverbank grape, Vitis riparia: Comparing SDS-PAGE and FASP-GPF techniques for shotgun proteomic analysis.

    PubMed

    George, Iniga S; Fennell, Anne Y; Haynes, Paul A

    2015-09-01

    Protein sample preparation optimisation is critical for establishing reproducible high throughput proteomic analysis. In this study, two different fractionation sample preparation techniques (in-gel digestion and in-solution digestion) for shotgun proteomics were used to quantitatively compare proteins identified in Vitis riparia leaf samples. The total number of proteins and peptides identified were compared between filter aided sample preparation (FASP) coupled with gas phase fractionation (GPF) and SDS-PAGE methods. There was a 24% increase in the total number of reproducibly identified proteins when FASP-GPF was used. FASP-GPF is more reproducible, less expensive and a better method than SDS-PAGE for shotgun proteomics of grapevine samples as it significantly increases protein identification across biological replicates. Total peptide and protein information from the two fractionation techniques is available in PRIDE with the identifier PXD001399 (http://proteomecentral.proteomexchange.org/dataset/PXD001399). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Measuring masses of large biomolecules and bioparticles using mass spectrometric techniques.

    PubMed

    Peng, Wen-Ping; Chou, Szu-Wei; Patil, Avinash A

    2014-07-21

    Large biomolecules and bioparticles play a vital role in biology, chemistry, biomedical science and physics. Mass is a critical parameter for the characterization of large biomolecules and bioparticles. To achieve mass analysis, choosing a suitable ion source is the first step; the instruments for detecting the ions, namely mass analyzers and detectors, should also be considered. Abundant mass spectrometric techniques have been proposed to determine the masses of large biomolecules and bioparticles, and these techniques can be divided into two categories. The first category measures the mass (or size) of intact particles, including single-particle quadrupole ion trap mass spectrometry, cell mass spectrometry, charge detection mass spectrometry and differential mobility mass analysis; the second category aims to measure the mass and tandem mass of biomolecular ions, including quadrupole ion trap mass spectrometry, time-of-flight mass spectrometry, quadrupole orthogonal time-of-flight mass spectrometry and orbitrap mass spectrometry. Moreover, algorithms for the mass and stoichiometry assignment of electrospray mass spectra have been developed to obtain accurate structure information and subunit combinations.

  14. Debate in the Classroom: An Evaluation of a Critical Thinking Teaching Technique within a Rehabilitation Counseling Course

    ERIC Educational Resources Information Center

    Gervey, Robert; Drout, Mary O'Connor; Wang, Chia-Chiang

    2009-01-01

    The educational setting, in particular the college classroom, is a natural training ground for fostering critical thinking skills for the 21st century worker. In this study, debate is explored as a technique to help students attain mastery of content and critical thinking skills considered key to working in the field of rehabilitation counseling.…

  15. Orbital stability analysis in biomechanics: a systematic review of a nonlinear technique to detect instability of motor tasks.

    PubMed

    Riva, F; Bisi, M C; Stagni, R

    2013-01-01

    Falls represent a heavy economic and clinical burden on society. The identification of individual chronic characteristics associated with falling is of fundamental importance for clinicians; in particular, the stability of daily motor tasks is one of the main factors that clinicians look for during assessment procedures. Various methods for the assessment of stability in human movement are present in the literature, and methods from the stability analysis of nonlinear dynamic systems, applied to biomechanics, have recently shown promise. One of these techniques is orbital stability analysis via Floquet multipliers. This method measures the orbital stability of periodic nonlinear dynamic systems and seems a promising approach for defining a reliable motor stability index that takes into account the whole task-cycle dynamics. Despite these premises, its use in the assessment of fall risk has been deemed controversial. The aim of this systematic review was therefore to provide a critical evaluation of the literature on applications of orbital stability analysis in biomechanics, with particular focus on methodologic aspects. Four electronic databases were searched for articles relative to the topic; 23 articles were selected for review. The quality of the studies in the literature was assessed with a customised quality assessment tool. Overall quality of the literature in the field was found to be high. The most critical aspect was found to be the lack of uniformity in the implementation of the analysis on biomechanical time series, particularly in the choice of state space and the number of cycles to include in the analysis. Copyright © 2012 Elsevier B.V. All rights reserved.
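
    The core computation is compact: sample the state once per cycle (a Poincaré section), fit a linear map to cycle-to-cycle deviations from the average state, and take its eigenvalues as the Floquet multipliers. The sketch below runs on synthetic "gait" data with an assumed stable map.

      import numpy as np

      rng = np.random.default_rng(0)
      n_cycles, d = 200, 3
      J_true = np.diag([0.6, 0.3, 0.1])        # stable: all multipliers inside unit circle

      states = np.zeros((n_cycles, d))
      for k in range(1, n_cycles):
          states[k] = J_true @ states[k - 1] + 0.05 * rng.normal(size=d)

      dev = states - states.mean(axis=0)       # deviations from the limit cycle
      J_hat, *_ = np.linalg.lstsq(dev[:-1], dev[1:], rcond=None)
      multipliers = np.linalg.eigvals(J_hat.T)
      print("max |Floquet multiplier| =", np.abs(multipliers).max())  # < 1 => orbitally stable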

  16. Optimizing Hybrid Metrology: Rigorous Implementation of Bayesian and Combined Regression

    PubMed Central

    Henn, Mark-Alexander; Silver, Richard M.; Villarrubia, John S.; Zhang, Nien Fan; Zhou, Hui; Barnes, Bryan M.; Ming, Bin; Vladár, András E.

    2015-01-01

    Hybrid metrology, e.g., the combination of several measurement techniques to determine critical dimensions, is an increasingly important approach to meet the needs of the semiconductor industry. A proper use of hybrid metrology may yield not only more reliable estimates for the quantitative characterization of 3-D structures but also a more realistic estimation of the corresponding uncertainties. Recent developments at the National Institute of Standards and Technology (NIST) feature the combination of optical critical dimension (OCD) measurements and scanning electron microscope (SEM) results. The hybrid methodology offers the potential to make measurements of essential 3-D attributes that may not be otherwise feasible. However, combining techniques gives rise to essential challenges in error analysis and comparing results from different instrument models, especially the effect of systematic and highly correlated errors in the measurement on the χ2 function that is minimized. Both hypothetical examples and measurement data are used to illustrate solutions to these challenges. PMID:26681991
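
    The χ2 issue mentioned above can be made concrete: with correlated errors the fit metric must use the full error covariance, χ2 = rᵀ Σ⁻¹ r, not a diagonal (independent-error) weighting. All numbers below are illustrative.

      import numpy as np

      y_meas = np.array([10.2, 10.4, 9.9])     # e.g., CD measurements from two tools
      y_model = np.array([10.0, 10.0, 10.0])   # model prediction at candidate parameters
      r = y_meas - y_model

      sigma, corr = 0.3, 0.8                   # shared systematic error across channels
      Sigma = sigma**2 * np.array([[1.0, corr, corr],
                                   [corr, 1.0, corr],
                                   [corr, corr, 1.0]])

      chi2_diag = np.sum(r**2 / sigma**2)             # ignores correlations
      chi2_full = r @ np.linalg.solve(Sigma, r)       # proper generalized chi-squared
      print(f"diagonal chi2 = {chi2_diag:.2f}, full-covariance chi2 = {chi2_full:.2f}")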

  17. Critically appraising qualitative research: a guide for clinicians more familiar with quantitative techniques.

    PubMed

    Kisely, Stephen; Kendall, Elizabeth

    2011-08-01

    Papers using qualitative methods are increasingly common in psychiatric journals. This overview is an introduction to critically appraising a qualitative paper for clinicians who are more familiar with quantitative methods. Qualitative research uses data from interviews (semi-structured or unstructured), focus groups, observations or written materials. Data analysis is inductive, allowing meaning to emerge from the data, rather than the more deductive, hypothesis centred approach of quantitative research. This overview compares and contrasts quantitative and qualitative research methods. Quantitative concepts such as reliability, validity, statistical power, bias and generalisability have qualitative equivalents. These include triangulation, trustworthiness, saturation, reflexivity and applicability. Reflexivity also shares features of transference. Qualitative approaches include: ethnography, action-assessment, grounded theory, case studies and mixed methods. Qualitative research can complement quantitative approaches. An understanding of both is useful in critically appraising the psychiatric literature.

  18. Examining Cybersecurity of Cyberphysical Systems for Critical Infrastructures Through Work Domain Analysis.

    PubMed

    Wang, Hao; Lau, Nathan; Gerdes, Ryan M

    2018-04-01

    The aim of this study was to apply work domain analysis for cybersecurity assessment and design of supervisory control and data acquisition (SCADA) systems. Adoption of information and communication technology in cyberphysical systems (CPSs) for critical infrastructures enables automated and distributed control but introduces cybersecurity risk. Many CPSs employ SCADA industrial control systems that have become the target of cyberattacks, which inflict physical damage without use of force. Given that absolute security is not feasible for complex systems, cyberintrusions that introduce unanticipated events will occur; a proper response will in turn require human adaptive ability. Therefore, analysis techniques that can support security assessment and human factors engineering are invaluable for defending CPSs. We conducted work domain analysis using the abstraction hierarchy (AH) to model a generic SCADA implementation to identify the functional structures and means-ends relations. We then adopted a case study approach examining the Stuxnet cyberattack by developing and integrating AHs for the uranium enrichment process, SCADA implementation, and malware to investigate the interactions between the three aspects of cybersecurity in CPSs. The AHs for modeling a generic SCADA implementation and studying the Stuxnet cyberattack are useful for mapping attack vectors, identifying deficiencies in security processes and features, and evaluating proposed security solutions with respect to system objectives. Work domain analysis is an effective analytical method for studying cybersecurity of CPSs for critical infrastructures in a psychologically relevant manner. Work domain analysis should be applied to assess cybersecurity risk and inform engineering and user interface design.

  19. A critical incident study of general practice trainees in their basic general practice term.

    PubMed

    Diamond, M R; Kamien, M; Sim, M G; Davis, J

    1995-03-20

    To obtain information on the experiences of general practice (GP) trainees during their first GP attachment. Critical incident technique--a qualitative analysis of open-ended interviews about incidents that describe competent or poor professional practice. Thirty-nine Western Australian doctors from the Royal Australian College of General Practitioners' (RACGP) Family Medicine Program who were completing their first six months of general practice in 1992. Doctors reported 180 critical incidents, of which just over 50% involved problems (and sometimes successes) with: difficult patients; paediatrics; the doctor-patient relationship; counselling skills; obstetrics and gynaecology; relationships with other health professionals and practice staff; and cardiovascular disorders. The major skills associated with both positive and negative critical incidents were: the interpersonal skills of rapport and listening; the diagnostic skills of thorough clinical assessment and the appropriate use of investigations; and the management skills of knowing when and how to obtain help from supervisors, hospitals and specialists. Doctors reported high levels of anxiety over difficult management decisions and feelings of guilt over missed diagnoses and inadequate management. The initial GP term is a crucial transition period in the development of the future general practitioner. An analysis of commonly recurring positive and negative critical incidents can be used by the RACGP Training Program to accelerate the learning process of doctors in vocational training and has implications for the planning of undergraduate curricula.

  20. Correlation between central venous pressure and peripheral venous pressure with passive leg raise in patients on mechanical ventilation.

    PubMed

    Kumar, Dharmendra; Ahmed, Syed Moied; Ali, Shahna; Ray, Utpal; Varshney, Ankur; Doley, Kashmiri

    2015-11-01

    Central venous pressure (CVP) assesses the volume status of patients. However, this technique is not without complications. We, therefore, measured peripheral venous pressure (PVP) to see whether it can replace CVP. To evaluate the correlation and agreement between CVP and PVP after passive leg raise (PLR) in critically ill patients on mechanical ventilation. A prospective observational study in an intensive care unit. Fifty critically ill patients on mechanical ventilation were included in the study. CVP and PVP measurements were taken using a water column manometer. Measurements were taken in the supine position and subsequently after a PLR of 45°. Statistical analysis used Pearson's correlation and Bland-Altman analysis. This study showed a fair correlation between CVP and PVP after a PLR of 45° (correlation coefficient, r = 0.479; P = 0.0004) when the CVP was <10 cmH2O. However, the correlation was good when the CVP was >10 cmH2O. Bland-Altman analysis showed 95% limits of agreement of -2.912 to 9.472 cmH2O. PVP can replace CVP for guiding fluid therapy in critically ill patients.
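    For readers less familiar with the two statistical methods named above, the following sketch shows how the Pearson correlation and the Bland-Altman bias and 95% limits of agreement are computed from paired readings; the numbers are synthetic, not the study's data.

        import numpy as np
        from scipy import stats

        # Synthetic paired readings in cmH2O; real values would come from the
        # water column manometer described in the record.
        cvp = np.array([6.0, 8.5, 10.0, 12.5, 15.0, 9.0, 11.0, 7.5])
        pvp = np.array([8.0, 10.0, 11.5, 13.0, 16.5, 12.0, 12.5, 10.0])

        r, p = stats.pearsonr(cvp, pvp)       # correlation between methods

        diff = pvp - cvp                      # Bland-Altman uses differences
        bias = diff.mean()
        half_width = 1.96 * diff.std(ddof=1)  # 95% limits of agreement
        print(f"r = {r:.3f} (p = {p:.4f}), bias = {bias:.2f} cmH2O, "
              f"limits of agreement = [{bias - half_width:.2f}, "
              f"{bias + half_width:.2f}]")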

  1. System safety in Stirling engine development

    NASA Technical Reports Server (NTRS)

    Bankaitis, H.

    1981-01-01

    The DOE/NASA Stirling Engine Project Office has required that contractors make safety considerations an integral part of all phases of the Stirling engine development program. As an integral part of each engine design subtask, analyses are developed to determine possible modes of failure. The accepted system safety analysis techniques (Fault Tree, FMEA, Hazards Analysis, etc.) are applied to varying extents at the system, subsystem and component levels. The primary objectives are to identify critical failure areas, to remove susceptibility to such failures or their effects from the system, and to minimize risk.

  2. Study of process variables associated with manufacturing hermetically-sealed nickel-cadmium cells

    NASA Technical Reports Server (NTRS)

    Miller, L.; Doan, D. J.; Carr, E. S.

    1971-01-01

    A program to determine and study the critical process variables associated with the manufacture of aerospace, hermetically-sealed, nickel-cadmium cells is described. The determination and study of the process variables associated with the positive and negative plaque impregnation/polarization process are emphasized. The experimental data resulting from the implementation of fractional factorial design experiments are analyzed by means of a linear multiple regression analysis technique. This analysis permits the selection of preferred levels for certain process variables to achieve desirable impregnated plaque characteristics.
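    The regression step described here can be illustrated with a minimal sketch; the coded design, factor meanings and responses below are hypothetical, not the study's data. Each process variable enters at coded levels -1/+1, and the sign of its fitted effect points to the preferred level for the plaque characteristic of interest.

        import numpy as np

        # Half-fraction 2^(3-1) design for three process variables, e.g.
        # bath temperature, current density, immersion time (illustrative).
        X = np.array([
            [-1, -1, -1],
            [+1, +1, -1],
            [+1, -1, +1],
            [-1, +1, +1],
        ])
        y = np.array([12.1, 15.3, 14.2, 13.0])  # e.g. active material loading

        # Linear multiple regression with an intercept column.
        A = np.column_stack([np.ones(len(X)), X])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        print("intercept and effects:", coef.round(3))
        # A positive effect suggests running that variable at its high level.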

  3. Applying Failure Modes, Effects, And Criticality Analysis And Human Reliability Analysis Techniques To Improve Safety Design Of Work Process In Singapore Armed Forces

    DTIC Science & Technology

    2016-09-01

    an instituted safety program that utilizes a generic risk assessment method involving the 5-M (Mission, Man, Machine, Medium and Management) factor...the Safety core value is hinged upon three key principles—(1) each soldier has a crucial part to play, by adopting safety as a core value and making...it a way of life in his unit; (2) safety is an integral part of training, operations and mission success, and (3) safety is an individual, team and

  4. Neuroimaging in aphasia treatment research: Consensus and practical guidelines for data analysis

    PubMed Central

    Meinzer, Marcus; Beeson, Pélagie M.; Cappa, Stefano; Crinion, Jenny; Kiran, Swathi; Saur, Dorothee; Parrish, Todd; Crosson, Bruce; Thompson, Cynthia K.

    2012-01-01

    Functional magnetic resonance imaging is the most widely used imaging technique to study treatment-induced recovery in post-stroke aphasia. The longitudinal design of such studies adds to the challenges researchers face when studying patient populations with brain damage in cross-sectional settings. The present review focuses on issues specifically relevant to neuroimaging data analysis in aphasia treatment research identified in discussions among international researchers at the Neuroimaging in Aphasia Treatment Research Workshop held at Northwestern University (Evanston, Illinois, USA). In particular, we aim to provide the reader with a critical review of unique problems related to the pre-processing, statistical modeling and interpretation of such data sets. Despite the fact that data analysis procedures critically depend on specific design features of a given study, we aim to discuss and communicate a basic set of practical guidelines that should be applicable to a wide range of studies and useful as a reference for researchers pursuing this line of research. PMID:22387474

  5. Scheduling Real-Time Mixed-Criticality Jobs

    NASA Astrophysics Data System (ADS)

    Baruah, Sanjoy K.; Bonifaci, Vincenzo; D'Angelo, Gianlorenzo; Li, Haohan; Marchetti-Spaccamela, Alberto; Megow, Nicole; Stougie, Leen

    Many safety-critical embedded systems are subject to certification requirements; some systems may be required to meet multiple sets of certification requirements, from different certification authorities. Certification requirements in such "mixed-criticality" systems give rise to interesting scheduling problems that cannot be satisfactorily addressed using techniques from conventional scheduling theory. In this paper, we study a formal model for representing such mixed-criticality workloads. We first demonstrate the intractability of determining whether a system specified in this model can be scheduled to meet all its certification requirements, even for systems subject to only two sets of certification requirements. Then we quantify, via the metric of processor speedup factor, the effectiveness of two techniques, reservation-based scheduling and priority-based scheduling, that are widely used in scheduling such mixed-criticality systems, showing that the latter is superior to the former. We also show that the speedup factors are tight for these two techniques.
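    As a concrete instance of priority-based scheduling for such workloads, the sketch below implements an Audsley-style lowest-priority-first assignment for mixed-criticality jobs with synchronous release; this is an illustrative reconstruction of a common formulation in this literature, not necessarily the exact algorithm analyzed in the paper.

        from dataclasses import dataclass

        @dataclass
        class Job:
            name: str
            crit: int        # criticality level (1 = LO, 2 = HI)
            wcet: dict       # WCET estimate per level, e.g. {1: 1, 2: 4}
            deadline: float

        def wcet_at(job, level):
            # A job is only specified up to its own criticality level.
            return job.wcet[min(level, job.crit)]

        def assign_priorities(jobs):
            # A job may take the lowest remaining priority if it still meets
            # its deadline when every other remaining job executes for its
            # WCET as estimated at this job's own criticality level.
            remaining = list(jobs)
            lowest_first = []
            while remaining:
                for job in remaining:
                    demand = sum(wcet_at(j, job.crit) for j in remaining)
                    if demand <= job.deadline:
                        lowest_first.append(job)
                        remaining.remove(job)
                        break
                else:
                    return None   # no candidate: not schedulable this way
            return list(reversed(lowest_first))   # highest priority first

        jobs = [Job("J1", 1, {1: 1}, 2.0), Job("J2", 2, {1: 1, 2: 4}, 6.0)]
        order = assign_priorities(jobs)
        print([j.name for j in order] if order else "unschedulable")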

  6. Techniques of Punishment and the Development of Self-Criticism

    ERIC Educational Resources Information Center

    Grusec, Joan E.; Ezrin, Sharyn A.

    1972-01-01

    The results of this study indicate that induction combined with withdrawal of love is no more effective as a punishment technique for the development of at least one aspect of conscience--self-criticism--than withdrawal of material reward. (Authors)

  7. A Coordinated Focused Ion Beam/Ultramicrotomy Technique for Serial Sectioning of Hayabusa Particles and Other Returned Samples

    NASA Technical Reports Server (NTRS)

    Berger, E. L.; Keller, L. P.

    2014-01-01

    Recent sample return missions, such as NASA's Stardust mission to comet 81P/Wild 2 and JAXA's Hayabusa mission to asteroid 25143 Itokawa, have returned particulate samples (typically 5-50 µm) that pose tremendous challenges to coordinated analysis using a variety of nano- and micro-beam techniques. The ability to glean maximal information from individual particles has become increasingly important and depends critically on how the samples are prepared for analysis. This also holds true for other extraterrestrial materials, including interplanetary dust particles, micrometeorites and lunar regolith grains. Traditionally, particulate samples have been prepared using microtomy techniques (e.g., [1]). However, for hard mineral particles ≥20 µm, microtome thin sections are compromised by severe chatter and sample loss. For these difficult samples, we have developed a hybrid technique that combines traditional ultramicrotomy with focused ion beam (FIB) techniques, allowing for the in situ investigation of grain surfaces and interiors. Using this method, we have increased the number of FIB-SEM prepared sections that can be recovered from a particle with dimensions on the order of tens of µm. These sections can be subsequently analyzed using a variety of electron beam techniques. Here, we demonstrate this sample preparation technique on individual lunar regolith grains in order to study their space-weathered surfaces. We plan to extend these efforts to analyses of individual Hayabusa samples.

  8. Molecular cytogenetic analysis of Xq critical regions in premature ovarian failure

    PubMed Central

    2013-01-01

    Background: One of the frequent reasons for unsuccessful conception is premature ovarian failure/primary ovarian insufficiency (POF/POI), defined as the loss of functional follicles below the age of 40 years. Among the genetic causes, the most common ones involve the X chromosome, as in Turner syndrome, partial X deletion and X-autosome translocations. Here we report the case of a 27-year-old female patient referred for genetic counselling because of premature ovarian failure. The aim of this case study was to perform molecular genetic and cytogenetic analyses in order to identify the exact genetic background of the pathogenic phenotype. Results: For diagnostic purposes we analysed the Fragile mental retardation 1 gene using the Southern blot technique and Repeat Primed PCR, in order to establish whether a Fragile mental retardation 1 premutation underlay the premature ovarian failure. In this patient with early-onset premature ovarian failure we detected one normal allele of the Fragile mental retardation 1 gene and could not verify the methylated allele; we therefore performed cytogenetic analyses using G-banding and fluorescent in situ hybridization methods, together with a high-resolution molecular cytogenetic method, the array comparative genomic hybridization technique. Using G-banding, we identified a large deletion on the X chromosome at the critical region (ChrX q21.31-q28) that is associated with the premature ovarian failure phenotype. In order to detect the exact breakpoints, we used a dedicated cytogenetic ISCA plus CGH array and verified a 67.355-Mb loss at the critical region, which includes 795 genes in total. Conclusions: We conclude from this case study that karyotyping is helpful in the evaluation of premature ovarian failure patients for identifying non-submicroscopic chromosomal rearrangements, and that the array CGH technique contributes to the efficient detection and mapping of the exact deletion breakpoints of the deleted Xq region. PMID:24359613

  9. Biomaterial-Stabilized Soft Tissue Healing for Healing of Critical-Sized Bone Defects: the Masquelet Technique.

    PubMed

    Tarchala, Magdalena; Harvey, Edward J; Barralet, Jake

    2016-03-01

    Critical-sized bone defects present a significant burden to the medical community due to their challenging treatment. However, a successful limb-salvaging technique, the Masquelet Technique (MT), has significantly improved the prognosis of many segmental bone defects by helping to restore form and function. Although the Masquelet Technique has proven to be clinically effective, the physiology of the healing it induces is not well understood. Multiple modifiable factors have been implicated by various surgical and research teams, but no single factor has been proven to be critical to the success of the Masquelet Technique. In this review, the most recent clinical and experimental evidence that supports and helps to decipher the traditional Masquelet Technique is discussed, along with the modifiable factors and their effect on the success of the technique. In addition, future developments for integrating the traditional Masquelet Technique with alternative biomaterials, to increase its effectiveness and expand its clinical applicability, are reviewed. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. PSP SAR interferometry monitoring of ground and structure deformations applied to archaeological sites

    NASA Astrophysics Data System (ADS)

    Costantini, Mario; Francioni, Elena; Trillo, Francesco; Minati, Federico; Margottini, Claudio; Spizzichino, Daniele; Trigila, Alessandro; Iadanza, Carla

    2017-04-01

    Archaeological sites and cultural heritage are critical assets for society, representing not only the history of a region or a culture but also contributing to the common identity of the people living in a certain region. In this view, it is becoming increasingly urgent to preserve them from the effects of climate change and, in general, from degradation. These structures are usually just as precious as they are fragile: remote sensing technology can be useful to monitor these treasures. In this work, we focus on ground deformation measurements obtained by satellite SAR interferometry and on the methodology adopted and implemented in order to use the results operatively for conservation policies at an Italian archaeological site. The analysis is based on the processing of COSMO-SkyMed Himage data by the e-GEOS proprietary Persistent Scatterer Pair (PSP) SAR interferometry technology. The PSP technique is a proven SAR interferometry technology characterized by exploiting in the processing only the relative properties between close points (pairs) in order to overcome atmospheric artefacts (one of the main problems of SAR interferometry). Validation analyses [Costantini et al. 2015] established that this technique applied to COSMO-SkyMed Himage data can retrieve very dense (except, of course, on vegetated or cultivated areas) millimetric deformation measurements with sub-metric localization. Considering the limitations of all interferometric techniques, in particular the fact that the measurements are along the line of sight (LOS) and subject to geometric distortions, both ascending and descending geometries have been used in order to obtain the maximum information from the interferometric analysis. The ascending analysis allows selecting measurement points over the top and, approximately, the south-west part of the structures, while the descending one covers the top and the south-east part of the structures. Interferometric techniques need a stack of SAR images to separate the deformation phase contributions from other spurious components (atmospheric, orbital, etc.). Historical/reference analyses of the period 2011-2014 have been performed to obtain such deformations and to provide a starting point for subsequent updates. Starting from the reference analyses, the deformation monitoring has then continued with monthly updates of the PSP analysis using new COSMO-SkyMed acquisitions in both ascending and descending geometry. In addition to this traditional monitoring service, the satellite interferometry analysis has been carried out over specific time frames selected on the basis of important events (damage to structures, collapses, works, etc.), and the analysis has been correlated with additional site information such as weather conditions, critical meteorological events and historical information about the site. The objective is to find a nominal behaviour of the site in response to critical events and/or related to natural degradation of the structures, in order to prevent damage and guide maintenance activities. The first results of this cross-correlated analysis showed that some deformation phenomena are identifiable by SAR satellite interferometric analysis, and it has also been possible to validate them in the field through a direct survey.

  11. Trends in analytical techniques applied to particulate matter characterization: A critical review of fundaments and applications.

    PubMed

    Galvão, Elson Silva; Santos, Jane Meri; Lima, Ana Teresa; Reis, Neyval Costa; Orlando, Marcos Tadeu D'Azeredo; Stuetz, Richard Michael

    2018-05-01

    Epidemiological studies have shown the association of airborne particulate matter (PM) size and chemical composition with health problems affecting the cardiorespiratory and central nervous systems. PM also acts as cloud condensation nuclei (CCN) or ice nuclei (IN), taking part in the cloud formation process, and can therefore impact the climate. Several works have used different analytical techniques for PM chemical and physical characterization to supply information to source apportionment models that help environmental agencies assess accountability for damages. Despite the numerous analytical techniques described in the literature for PM characterization, laboratories are normally limited to the techniques available in-house, which raises the question of whether a given technique is suitable for the purpose of a specific experimental work. This work summarizes the main available technologies for PM characterization, serving as a guide for readers to find the most appropriate technique(s) for their investigation. Elemental analysis techniques, such as atomic spectrometry based and X-ray based techniques, organic and carbonaceous techniques and surface analysis techniques, are discussed, illustrating their main features as well as their advantages and drawbacks. We also discuss the trends in analytical techniques used over the last two decades. The choice among all techniques is a function of a number of parameters, such as the relevant physical properties of the particles, sampling and measuring time, access to available facilities and the costs associated with equipment acquisition, among other considerations. An analytical guide map is presented as a guideline for choosing the most appropriate technique for the analytical information required. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Column-coupling strategies for multidimensional electrophoretic separation techniques.

    PubMed

    Kler, Pablo A; Sydes, Daniel; Huhn, Carolin

    2015-01-01

    Multidimensional electrophoretic separations represent one of the most common strategies for dealing with the analysis of complex samples. In recent years we have witnessed the explosive growth of separation techniques for the analysis of complex samples in applications ranging from the life sciences to industry. Electrophoretic separations offer several strategic advantages, such as excellent separation efficiency, different methods with a broad range of separation mechanisms, and low liquid consumption, generating less waste effluent and lower costs per analysis, among others. Despite their impressive separation efficiency, multidimensional electrophoretic separations present some drawbacks that have delayed their extensive use: the volumes of the columns, and consequently of the injected sample, are significantly smaller compared with other analytical techniques, so the coupling interfaces between two separation components must be very efficient in terms of geometrical precision with low dead volume. Likewise, very sensitive detection systems are required. Additionally, in electrophoretic separation techniques the surface properties of the columns play a fundamental role in electroosmosis as well as in the unwanted adsorption of proteins or other complex biomolecules. The requirements for efficient coupling of electrophoretic separation techniques thus involve several aspects related to microfluidics and to the physicochemical interactions between the electrolyte solutions and the solid capillary walls. It is interesting to see how these multidimensional electrophoretic separation techniques have been used jointly with different detection techniques, for intermediate detection as well as for final identification and quantification, which is particularly important in the case of mass spectrometry. In this work we present a critical review of the different strategies for coupling two or more electrophoretic separation techniques and of the different intermediate and final detection methods implemented for such separations.

  13. Stability of numerical integration techniques for transient rotor dynamics

    NASA Technical Reports Server (NTRS)

    Kascak, A. F.

    1977-01-01

    A finite element model of a rotor bearing system was analyzed to determine the stability limits of the forward, backward, and centered Euler; Runge-Kutta; Milne; and Adams numerical integration techniques. The analysis concludes that the highest frequency mode determines the maximum time step for a stable solution. Thus, the number of mass elements should be minimized. Increasing the damping can sometimes cause numerical instability. For a uniform shaft, with 10 mass elements, operating at approximately the first critical speed, the maximum time step for the Runge-Kutta, Milne, and Adams methods is that which corresponds to approximately 1 degree of shaft movement. This is independent of rotor dimensions.
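    The conclusion that the highest-frequency mode sets the maximum stable time step can be checked with a small sketch (mode frequencies invented for illustration). For the classical fourth-order Runge-Kutta method applied to an undamped mode z' = iωz, the per-step amplification |R(iωh)| exceeds one once ωh passes roughly 2.83, so the stiffest mode dictates the step.

        import numpy as np

        def rk4_growth(z):
            # Stability function of classical RK4 for z' = lam*z, z = lam*h.
            return abs(1 + z + z**2/2 + z**3/6 + z**4/24)

        mode_freqs = [50.0, 400.0, 2500.0]   # rad/s; the highest one governs

        for h in [2e-3, 1e-3, 1e-4]:
            # Undamped modes correspond to purely imaginary eigenvalues.
            worst = max(rk4_growth(1j * w * h) for w in mode_freqs)
            verdict = "stable" if worst <= 1.0 + 1e-12 else "UNSTABLE"
            print(f"h = {h:.0e}: max per-step growth {worst:.3f} -> {verdict}")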

  14. The use of Electronic Speckle Pattern Interferometry (ESPI) in the crack propagation analysis of epoxy resins

    NASA Astrophysics Data System (ADS)

    Herbert, D. P.; Al-Hassani, A. H. M.; Richardson, M. O. W.

    The ESPI (electronic speckle pattern interferometry) technique at high magnification levels is demonstrated to be of considerable value in interpreting the fracture behaviour of epoxy resins. The fracture toughness of a powder coating system at different thicknesses has been measured using a TDCB (tapered double cantilever beam) technique, and the deformation zone at the tip of the moving crack has been monitored. Initial indications are that a mechanistic changeover occurs at a critical bond (coating) thickness and that this coincides with the occurrence of a fracture toughness maximum, which in turn is associated with a deformation zone of specific diameter.

  15. Fluorescence endoscopic imaging for evaluation of gastric mucosal blood flow: a preliminary study

    NASA Astrophysics Data System (ADS)

    Bocquillon, Nicolas; Mordon, Serge R.; Mathieu, D.; Maunoury, Vincent; Marechal, Xavier-Marie; Neviere, Remi; Wattel, Francis; Chopin, Claude

    1999-02-01

    Microcirculatory disorders of the gastrointestinal tract appear to be a major component of the multiple organ dysfunction syndrome secondary to sepsis or septic shock. A better analysis of mucosal hypoperfusion in critically ill patients with sepsis may be helpful for understanding this high-mortality syndrome. Fluorescence endoscopy has been recognized as a non-invasive method for both spatial and temporal evaluation of gastrointestinal mucosal perfusion. We performed this imaging technique during routine gastric endoscopy in patients meeting sepsis criteria. The study included gastric observation and the appearance time of gastric fluorescence after an intravenous 10% sodium fluorescein bolus. Qualitative analysis of high-fluorescence areas was compared with mucosal blood flow measurements by laser-Doppler flowmetry. We concluded that fluorescence endoscopic imaging in critically ill patients with sepsis may reveal spatial and temporal differences in the distribution of the mucosal microcirculation.

  16. Focus control enhancement and on-product focus response analysis methodology

    NASA Astrophysics Data System (ADS)

    Kim, Young Ki; Chen, Yen-Jen; Hao, Xueli; Samudrala, Pavan; Gomez, Juan-Manuel; Mahoney, Mark O.; Kamalizadeh, Ferhad; Hanson, Justin K.; Lee, Shawn; Tian, Ye

    2016-03-01

    With decreasing CDOF (Critical Depth Of Focus) for 20/14nm technology and beyond, focus errors are becoming increasingly critical for on-product performance. Current on-product focus control techniques in high-volume manufacturing are limited; it is difficult to define measurable focus error and to optimize the focus response on product with existing methods, owing to the lack of credible focus measurement methodologies. Next to developments in the imaging and focus control capability of scanners and general tool stability maintenance, on-product focus control improvements are also required to meet on-product imaging specifications. In this paper, we discuss focus monitoring, wafer (edge) fingerprint correction and on-product focus budget analysis using a diffraction-based focus (DBF) measurement methodology. Several examples are presented showing improved focus response and control on product wafers. A method is also discussed for a focus interlock automation system on product in a high-volume manufacturing (HVM) environment.

  17. Analysis of airframe/engine interactions - An integrated control perspective

    NASA Technical Reports Server (NTRS)

    Schmidt, David K.; Schierman, John D.; Garg, Sanjay

    1990-01-01

    Techniques for the analysis of the dynamic interactions between airframe/engine dynamical systems are presented. Critical coupling terms are developed that determine the significance of these interactions with regard to the closed-loop stability and performance of the feedback systems. A conceptual model is first used to indicate the potential sources of the coupling, how the coupling manifests itself, and how the magnitudes of these critical coupling terms are used to quantify the effects of the airframe/engine interactions. A case study is also presented involving an unstable airframe with thrust vectoring for attitude control. It is shown for this system with classical, decentralized control laws that there is little airframe/engine interaction, and the stability and performance with those control laws is not affected. Implications of parameter uncertainty in the coupling dynamics are also discussed, and the effects of these parameter variations are demonstrated to be small for this vehicle configuration.

  18. Mathematics is always invisible, Professor Dowling

    NASA Astrophysics Data System (ADS)

    Cable, John

    2015-09-01

    This article provides a critical evaluation of a technique of analysis, the Social Activity Method, recently offered by Dowling (2013) as a 'gift' to mathematics education. The method is found to be inadequate, firstly, because it employs a dichotomy (between 'expression' and 'content') instead of a finer analysis (into symbols, concepts and setting or phenomena), and, secondly, because the distinction between 'public' and 'esoteric' mathematics, although interesting, is allowed to obscure the structure of the mathematics itself. There is also criticism of what Dowling calls the 'myth of participation', which denies the intimate links between mathematics and the rest of the universe that lie at the heart of mathematical pedagogy. Behind all this lies Dowling's 'essentially linguistic' conception of mathematics, which is criticised on the dual grounds that it ignores the chastening experience of formalism in mathematical philosophy and that linguistics itself has taken a wrong turn and ignores lessons that might be learnt from mathematics education.

  19. Incorporating active-learning techniques and competency assessment into a critical care elective course.

    PubMed

    Malcom, Daniel R; Hibbs, Jennifer L

    2012-09-10

    To design, implement, and measure the effectiveness of a critical care elective course for second-year students in a 3-year accelerated doctor of pharmacy (PharmD) program. A critical care elective course was developed that used active-learning techniques, including cooperative learning and group presentations, to deliver content on critical care topics. Group presentations had to include a disease state overview, practice guidelines, and clinical recommendations, and were evaluated by course faculty members and peers. Students' mean scores on a 20-question critical-care competency assessment administered before and after the course improved by 11% (p < 0.05). Course evaluations and comments were positive. A critical care elective course resulted in significantly improved competency in critical care and was well-received by students.

  20. Microprobe monazite geochronology: new techniques for dating deformation and metamorphism

    NASA Astrophysics Data System (ADS)

    Williams, M.; Jercinovic, M.; Goncalves, P.; Mahan, K.

    2003-04-01

    High-resolution compositional mapping, age mapping, and precise dating of monazite on the electron microprobe are powerful additions to microstructural and petrologic analysis and important tools for tectonic studies. The in-situ nature and high spatial resolution of the technique offer an entirely new level of structurally and texturally specific geochronologic data that can be used to put absolute time constraints on P-T-D paths, constrain the rates of sedimentary, metamorphic, and deformational processes, and provide new links between metamorphism and deformation. New analytical techniques (including background modeling, sample preparation, and interference analysis) have significantly improved the precision and accuracy of the technique and new mapping and image analysis techniques have increased the efficiency and strengthened the correlation with fabrics and textures. Microprobe geochronology is particularly applicable to three persistent microstructural-microtextural problem areas: (1) constraining the chronology of metamorphic assemblages; (2) constraining the timing of deformational fabrics; and (3) interpreting other geochronological results. In addition, authigenic monazite can be used to date sedimentary basins, and detrital monazite can fingerprint sedimentary source areas, both critical for tectonic analysis. Although some monazite generations can be directly tied to metamorphism or deformation, at present, the most common constraints rely on monazite inclusion relations in porphyroblasts that, in turn, can be tied to the deformation and/or metamorphic history. Examples will be presented from deep-crustal rocks of northern Saskatchewan and from mid-crustal rocks from the southwestern USA. Microprobe monazite geochronology has been used in both regions to deconvolute overprinting deformation and metamorphic events and to clarify the interpretation of other geochronologic data. Microprobe mapping and dating are powerful companions to mass spectroscopic dating techniques. They allow geochronology to be incorporated into the microstructural analytical process, resulting in a new level of integration of time (t) into P-T-D histories.

  1. Surface roughness: A review of its measurement at micro-/nano-scale

    NASA Astrophysics Data System (ADS)

    Gong, Yuxuan; Xu, Jian; Buchanan, Relva C.

    2018-01-01

    The measurement of surface roughness at micro-/nano-scale is of great importance to metrological, manufacturing, engineering, and scientific applications given the critical roles of roughness in physical and chemical phenomena. The surface roughness of materials can significantly change the way they interact with light, phonons, molecules, and so forth; surface roughness thus ultimately determines the functionality and properties of materials. In this short review, the techniques of measuring micro-/nano-scale surface roughness are discussed with special focus on the limitations and capabilities of each technique. In addition, the calculations of surface roughness and their theoretical background are discussed to offer readers a better understanding of the importance of post-measurement analysis. Recent progress on fractal analysis of surface roughness is discussed to shed light on future efforts in surface roughness measurement.
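    As a concrete example of the post-measurement calculations the review emphasizes, the common amplitude parameters Ra (arithmetic mean deviation) and Rq (root-mean-square deviation) can be computed from a measured height profile as follows; the profile here is synthetic, standing in for profilometer or AFM data.

        import numpy as np

        rng = np.random.default_rng(0)
        x = np.linspace(0.0, 100.0, 1000)      # scan positions (µm)
        z = 5.0*np.sin(0.3*x) + rng.normal(0.0, 1.0, x.size)   # heights (nm)

        dev = z - z.mean()             # roughness is taken about the mean line
        Ra = np.mean(np.abs(dev))      # arithmetic mean deviation
        Rq = np.sqrt(np.mean(dev**2))  # RMS deviation; Rq >= Ra always
        print(f"Ra = {Ra:.2f} nm, Rq = {Rq:.2f} nm")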

  2. Ultrasonic and metallographic studies on AISI 4140 steel exposed to hydrogen at high pressure and temperature

    NASA Astrophysics Data System (ADS)

    Oruganti, Malavika

    This thesis investigates the effects of hydrogen exposure at high temperature and pressure on the behavior of AISI 4140 steel. A piezoelectric ultrasonic technique was primarily used to evaluate variations in surface longitudinal wave velocity and defect geometry as a function of time after exposure to hydrogen at high temperature and pressure. The critically refracted longitudinal wave technique was used for the former and the pulse-echo technique for the latter. Optical microscopy and scanning electron microscopy were used to correlate the ultrasonic results with the microstructure of the steel and to provide better insight into the steel's behavior. The results of the investigation indicate that frequency analysis of the defect echo, determined using the pulse-echo technique at regular intervals of time, appears to be a promising tool for monitoring defect growth induced by high-temperature, high-pressure hydrogen attack.

  3. Applications of integrated human error identification techniques on the chemical cylinder change task.

    PubMed

    Cheng, Ching-Min; Hwang, Sheue-Ling

    2015-03-01

    This paper outlines the human error identification (HEI) techniques that currently exist to assess latent human errors. Many formal error identification techniques have existed for years, but few have been validated to cover latent human error analysis in different domains. This study considers many possible error modes and influential factors, including external error modes, internal error modes, psychological error mechanisms, and performance shaping factors, and integrates several execution procedures and frameworks of HEI techniques. The case study in this research was the operational process of changing chemical cylinders in a factory. In addition, the integrated HEI method was used to assess the operational processes and the system's reliability. It was concluded that the integrated method is a valuable aid in developing much safer operational processes and can be used to predict human error rates for critical tasks in the plant. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  4. ACOSS Eight (Active Control of Space Structures), Phase 2

    DTIC Science & Technology

    1981-09-01

    A sensitivity analysis technique of selecting critical system parameters is applied to the Draper tetrahedral truss structure (see Section 4-2)... and solar panels are omitted. The precision section is mounted on isolators to an inertially fixed rigid support. The mode frequencies of this

  5. Perspectives on NMR in drug discovery: a technique comes of age

    PubMed Central

    Pellecchia, Maurizio; Bertini, Ivano; Cowburn, David; Dalvit, Claudio; Giralt, Ernest; Jahnke, Wolfgang; James, Thomas L.; Homans, Steve W.; Kessler, Horst; Luchinat, Claudio; Meyer, Bernd; Oschkinat, Hartmut; Peng, Jeff; Schwalbe, Harald; Siegal, Gregg

    2009-01-01

    In the past decade, the potential of harnessing the ability of nuclear magnetic resonance (NMR) spectroscopy to monitor intermolecular interactions as a tool for drug discovery has been increasingly appreciated in academia and industry. In this Perspective, we highlight some of the major applications of NMR in drug discovery, focusing on hit and lead generation, and provide a critical analysis of its current and potential utility. PMID:19172689

  6. How to Define the Mean Square Amplitude of Solar Wind Fluctuations With Respect to the Local Mean Magnetic Field

    NASA Astrophysics Data System (ADS)

    Podesta, John J.

    2017-12-01

    Over the last decade it has become popular to analyze turbulent solar wind fluctuations with respect to a coordinate system aligned with the local mean magnetic field. This useful analysis technique has provided new information and new insights about the nature of solar wind fluctuations and provided some support for phenomenological theories of MHD turbulence based on the ideas of Goldreich and Sridhar. At the same time it has drawn criticism suggesting that the use of a scale-dependent local mean field is somehow inconsistent or irreconcilable with traditional analysis techniques based on second-order structure functions and power spectra that, for stationary time series, are defined with respect to the constant (scale-independent) ensemble average magnetic field. Here it is shown that for fluctuations with power law spectra, such as those observed in solar wind turbulence, it is possible to define the local mean magnetic field in a special way such that the total mean square amplitude (trace amplitude) of turbulent fluctuations is approximately the same, scale by scale, as that obtained using traditional second-order structure functions or power spectra. This fact should dispel criticism concerning the physical validity or practical usefulness of the local mean magnetic field in these applications.
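    For reference, the traditional quantity the abstract compares against, the trace second-order structure function S2(tau) = <|B(t + tau) - B(t)|^2> summed over field components, can be sketched as follows; the three-component signal is synthetic with a power-law spectrum, and the paper's special local-mean-field construction is not reproduced here.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 2**14
        freqs = np.fft.rfftfreq(n, d=1.0)
        # Amplitude ~ f^(-5/6) gives a ~f^(-5/3) power spectrum.
        amp = np.zeros_like(freqs)
        amp[1:] = freqs[1:] ** (-5.0/6.0)
        B = np.stack([
            np.fft.irfft(amp * np.exp(2j*np.pi*rng.random(freqs.size)), n)
            for _ in range(3)   # three synthetic field components
        ])

        def s2(B, lag):
            # Trace (total) second-order structure function at one lag.
            dB = B[:, lag:] - B[:, :-lag]
            return np.mean(np.sum(dB**2, axis=0))

        for lag in [1, 4, 16, 64, 256]:
            print(f"tau = {lag:4d}: S2 = {s2(B, lag):.3e}")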

  7. Overview of PAT process analysers applicable in monitoring of film coating unit operations for manufacturing of solid oral dosage forms.

    PubMed

    Korasa, Klemen; Vrečer, Franc

    2018-01-01

    Over the last two decades, regulatory agencies have demanded better understanding of pharmaceutical products and processes by implementing new technological approaches, such as process analytical technology (PAT). Process analysers represent a key PAT tool, which enables effective process monitoring, and thus improved process control of medicinal product manufacturing. Process analysers applicable in pharmaceutical coating unit operations are comprehensively described in the present article. The review is focused on monitoring of solid oral dosage forms during film coating in the two most commonly used coating systems, i.e. pan and fluid bed coaters. A brief theoretical background and a critical overview of process analysers used for real-time or near real-time (in-, on-, at-line) monitoring of critical quality attributes of film coated dosage forms are presented. Besides well recognized spectroscopic methods (NIR and Raman spectroscopy), other techniques, which have made a significant breakthrough in recent years, are discussed (terahertz pulsed imaging (TPI), chord length distribution (CLD) analysis, and image analysis). The last part of the review is dedicated to novel techniques with high potential to become valuable PAT tools in the future (optical coherence tomography (OCT), acoustic emission (AE), microwave resonance (MR), and laser induced breakdown spectroscopy (LIBS)). Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Online analysis and process control in recombinant protein production (review).

    PubMed

    Palmer, Shane M; Kunji, Edmund R S

    2012-01-01

    Online analysis and control is essential for efficient and reproducible bioprocesses. A key factor in real-time control is the ability to measure critical variables rapidly. Online in situ measurements are the preferred option and minimize the potential loss of sterility. The challenge is to provide sensors with a good lifespan that withstand harsh bioprocess conditions, remain stable for the duration of a process without the need for recalibration, and offer a suitable working range. In recent decades, many new techniques have arisen that promise to extend the possibilities of analysis and control, not only by providing new parameters for analysis but also by improving accepted, well-practised measurements.

  9. A measurement of time-averaged aerosol optical depth using air-showers observed in stereo by HiRes

    NASA Astrophysics Data System (ADS)

    High Resolution Fly's Eye Collaboration; Abbasi, R. U.; Abu-Zayyad, T.; Amann, J. F.; Archbold, G.; Atkins, R.; Belov, K.; Belz, J. W.; Benzvi, S.; Bergman, D. R.; Boyer, J. H.; Cannon, C. T.; Cao, Z.; Connolly, B. M.; Fedorova, Y.; Finley, C. B.; Hanlon, W. F.; Hoffman, C. M.; Holzscheiter, M. H.; Hughes, G. A.; Hüntemeyer, P.; Jui, C. C. H.; Kirn, M. A.; Knapp, B. C.; Loh, E. C.; Manago, N.; Mannel, E. J.; Martens, K.; Matthews, J. A. J.; Matthews, J. N.; O'Neill, A.; Reil, K.; Roberts, M. D.; Schnetzer, S. R.; Seman, M.; Sinnis, G.; Smith, J. D.; Sokolsky, P.; Song, C.; Springer, R. W.; Stokes, B. T.; Thomas, S. B.; Thomson, G. B.; Tupa, D.; Westerhoff, S.; Wiencke, L. R.; Zech, A.

    2006-03-01

    Air fluorescence measurements of cosmic ray energy must be corrected for attenuation of the atmosphere. In this paper, we show that the air-showers themselves can yield a measurement of the aerosol attenuation in terms of optical depth, time-averaged over extended periods. Although the technique lacks statistical power to make the critical hourly measurements that only specialized active instruments can achieve, we note the technique does not depend on absolute calibration of the detector hardware, and requires no additional equipment beyond the fluorescence detectors that observe the air showers. This paper describes the technique, and presents results based on analysis of 1258 air-showers observed in stereo by the High Resolution Fly’s Eye over a four year span.

  10. Comparison of prometaphase chromosome techniques with emphasis on the role of colcemid.

    PubMed

    Wiley, J E; Sargent, L M; Inhorn, S L; Meisner, L F

    1984-12-01

    Six different techniques were evaluated to define better those technical factors that are most critical for obtaining prometaphase cells for banding analysis. Our results demonstrate that: (a) colcemid exposures of 30 min or less have no effect on increasing the yield of prometaphase cells; (b) colcemid exposures of greater than 0.1 microgram/ml can be toxic; (c) methotrexate depresses the mitotic index significantly and seems to increase the incidence of prometaphase cells only because it suppresses later forms; and (d) the optimum number of cytogenetically satisfactory prometaphase cells can be obtained with a 4-h exposure to a combination of low-concentration actinomycin D (0.5 microgram/ml) and colcemid (0.1 microgram/ml). This technique inhibits chromosome condensation while permitting prometaphase cells to accumulate for 4 h.

  11. A FORTRAN technique for correlating a circular environmental variable with a linear physiological variable in the sugar maple.

    PubMed

    Pease, J M; Morselli, M F

    1987-01-01

    This paper describes a computer program adapted to a statistical method for analyzing an unlimited quantity of binary-recorded data of an independent circular variable (e.g. wind direction) and a linear variable (e.g. maple sap flow volume). Circular variables cannot be statistically analyzed with linear methods unless they have been transformed. The program calculates a critical quantity, the acrophase angle (Φ). The technique is adapted from original mathematics [1] and is written in Fortran 77 for easier conversion between computer networks. Correlation or regression analysis can be performed following the program; regression, because of the circular nature of the independent variable, becomes periodic regression. The technique was tested on a file of approximately 4050 data pairs.
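    The periodic-regression idea translates readily into a modern language (Python here rather than the record's Fortran 77; the data and parameter values are invented). The circular predictor enters through its cosine and sine, and the acrophase angle Φ, the direction at which the fitted response peaks, falls out of the two fitted coefficients.

        import numpy as np

        rng = np.random.default_rng(2)
        theta = rng.uniform(0.0, 2*np.pi, 200)   # wind direction (radians)
        # Synthetic sap-flow response peaking at 250 degrees, plus noise.
        y = (10.0 + 3.0*np.cos(theta - np.deg2rad(250.0))
             + rng.normal(0.0, 1.0, theta.size))

        # Linearize y = M + a*cos(theta) + b*sin(theta); solve least squares.
        X = np.column_stack([np.ones_like(theta),
                             np.cos(theta), np.sin(theta)])
        (M, a, b), *_ = np.linalg.lstsq(X, y, rcond=None)

        amplitude = np.hypot(a, b)
        acrophase = np.degrees(np.arctan2(b, a)) % 360.0   # the angle PHI
        print(f"mesor = {M:.2f}, amplitude = {amplitude:.2f}, "
              f"acrophase = {acrophase:.1f} degrees")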

  12. Location estimation in wireless sensor networks using spring-relaxation technique.

    PubMed

    Zhang, Qing; Foh, Chuan Heng; Seet, Boon-Chong; Fong, A C M

    2010-01-01

    Accurate and low-cost autonomous self-localization is a critical requirement of various applications of a large-scale distributed wireless sensor network (WSN). Due to the massive deployment of sensors, explicit measurements based on specialized localization hardware such as the Global Positioning System (GPS) are not practical. In this paper, we propose a low-cost WSN localization solution. Our design uses received signal strength indicators for ranging, lightweight distributed algorithms based on the spring-relaxation technique for location computation, and a cooperative approach to achieve a given location estimation accuracy with a low number of nodes with known locations. We provide analysis to show the suitability of the spring-relaxation technique for WSN localization with the cooperative approach, and perform simulation experiments to illustrate its accuracy in localization.
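    A minimal sketch of the spring-relaxation idea follows; the anchor layout, noise level, and step size are invented for illustration. Each range measurement acts as a spring between two position estimates, and the unknown nodes are nudged along the net spring force until the layout relaxes.

        import numpy as np

        rng = np.random.default_rng(3)
        anchors = {0: (0.0, 0.0), 1: (10.0, 0.0),
                   2: (0.0, 10.0), 3: (10.0, 10.0)}   # known positions
        truth = {4: np.array([3.0, 4.0]), 5: np.array([7.0, 6.0])}

        # Noisy pairwise ranges (in practice derived from signal strength).
        edges = []
        for u in truth:
            for v, apos in anchors.items():
                d = np.linalg.norm(truth[u] - np.array(apos))
                edges.append((u, v, d + rng.normal(0.0, 0.1)))
        edges.append((4, 5, np.linalg.norm(truth[4] - truth[5])
                      + rng.normal(0.0, 0.1)))

        pos = {k: np.array(v) for k, v in anchors.items()}
        pos.update({u: rng.uniform(0.0, 10.0, 2) for u in truth})  # guesses

        for _ in range(2000):
            force = {u: np.zeros(2) for u in truth}
            for u, v, d_meas in edges:
                vec = pos[v] - pos[u]
                d_est = max(np.linalg.norm(vec), 1e-9)
                f = (d_est - d_meas) * vec / d_est   # spring along the edge
                if u in force:
                    force[u] += f
                if v in force:
                    force[v] -= f
            for u in force:                  # anchors never move
                pos[u] += 0.1 * force[u]

        for u in truth:
            print(u, pos[u].round(2), "true:", truth[u])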

  13. Critical analysis of radiologist-patient interaction.

    PubMed

    Morris, K J; Tarico, V S; Smith, W L; Altmaier, E M; Franken, E A

    1987-05-01

    A critical incident interview technique was used to identify features of radiologist-patient interactions considered effective and ineffective by patients. During structured interviews with 35 radiology patients and five patients' parents, three general categories of physician behavior were described: attention to patient comfort, explanation of procedure and results, and interpersonal sensitivity. The findings indicated that patients are sensitive to physicians' interpersonal styles and that they want physicians to explain procedures and results in an understandable manner and to monitor their well-being during procedures. The sample size of the study is small; thus further confirmation is needed. However, the implications for training residents and practicing radiologists in these behaviors are important in the current competitive medical milieu.

  14. Critical Materials Institute

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, Alex

    2013-01-09

    Ames Laboratory Director Alex King talks about the goals of the Critical Materials Institute in diversifying the supply of critical materials, developing substitute materials, developing tools and techniques for recycling critical materials, and forecasting materials needs to avoid future shortages.

  15. Critical Materials Institute

    ScienceCinema

    King, Alex

    2017-12-22

    Ames Laboratory Director Alex King talks about the goals of the Critical Materials Institute in diversifying the supply of critical materials, developing substitute materials, developing tools and techniques for recycling critical materials, and forecasting materials needs to avoid future shortages.

  16. Application of the SCALE TSUNAMI Tools for the Validation of Criticality Safety Calculations Involving 233U

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mueller, Don; Rearden, Bradley T; Hollenbach, Daniel F

    2009-02-01

    The Radiochemical Development Facility at Oak Ridge National Laboratory has been storing solid materials containing 233U for decades. Preparations are under way to process these materials into a form that is inherently safe from a nuclear criticality safety perspective. This will be accomplished by down-blending the 233U materials with depleted or natural uranium. At the request of the U.S. Department of Energy, a study has been performed using the SCALE sensitivity and uncertainty analysis tools to demonstrate how these tools could be used to validate nuclear criticality safety calculations of selected process and storage configurations. ISOTEK nuclear criticality safety staff provided four models that are representative of the criticality safety calculations for which validation will be needed. The SCALE TSUNAMI-1D and TSUNAMI-3D sequences were used to generate energy-dependent keff sensitivity profiles for each nuclide and reaction present in the four safety analysis models, also referred to as the applications, and in a large set of critical experiments. The SCALE TSUNAMI-IP module was used together with the sensitivity profiles and the cross-section uncertainty data contained in the SCALE covariance data files to propagate the cross-section uncertainties (Δσ/σ) to keff uncertainties (Δk/k) for each application model. The SCALE TSUNAMI-IP module was also used to evaluate the similarity of each of the 672 critical experiments with each application. Results of the uncertainty analysis and similarity assessment are presented in this report. A total of 142 experiments were judged to be similar to application 1, and 68 experiments were judged to be similar to application 2. None of the 672 experiments were judged to be adequately similar to applications 3 and 4. Discussion of the uncertainty analysis and similarity assessment is provided for each of the four applications. Example upper subcritical limits (USLs) were generated for application 1 based on trending of the energy of average lethargy of neutrons causing fission, trending of the TSUNAMI similarity parameters, and use of data adjustment techniques.
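    The cross-section-to-keff uncertainty propagation described in this record follows the standard "sandwich rule": with a sensitivity vector S (relative change in keff per relative change in each cross section) and a relative covariance matrix C, the relative keff uncertainty satisfies (Δk/k)² = S·C·Sᵀ. A toy sketch with invented numbers, not SCALE's data or code:

        import numpy as np

        # Sensitivities of k-eff to three hypothetical nuclide-reaction
        # pairs, in units of (dk/k) per (dsigma/sigma).
        S = np.array([0.25, -0.10, 0.05])

        # Relative cross-section covariance matrix (illustrative values).
        C = np.array([
            [4.0e-4, 1.0e-4, 0.0],
            [1.0e-4, 9.0e-4, 0.0],
            [0.0,    0.0,    2.5e-4],
        ])

        dk_over_k = np.sqrt(S @ C @ S)   # sandwich rule
        print(f"relative k-eff uncertainty: {100*dk_over_k:.3f} %")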

  17. Nontraditional teaching techniques and critical thinking in an introductory postsecondary environmental science course

    NASA Astrophysics Data System (ADS)

    Buerdsell, Sherri Lynn

    2009-12-01

    As an institution of higher education and as a Hispanic-serving institution, New Mexico State University has a responsibility to its students to provide the skills and experiences necessary for each and every student to become a responsible, reflective citizen, capable of making informed decisions. Postsecondary science has traditionally been taught through lectures. Traditional lecture classes simply do not meet the needs of diverse groups of students in the modern multicultural student body like New Mexico State University's. However, the implementation of nontraditional pedagogy without evaluation of the results is useless as a step to reform; it is necessary to evaluate the results of in situ nontraditional pedagogy to determine its worth. The purpose of this research is to analyze the development and change in students' critical thinking skills, and critical thinking dispositions in single semester in an introductory Environmental Science course. This study utilized a mixed methods approach. The California Critical Thinking Skills Test and the California Critical Thinking Disposition Inventory were administered in the beginning and at the end of the semester. The pretest was used to provide a baseline for each participant against which the posttest score was compared. In addition, student interviews, field notes, and a survey provided qualitative data, which generated themes regarding the development of student critical thinking in this course. The results indicated there were no significant differences in the critical thinking test scores. However, qualitative analysis indicated that students experienced significant changes in critical thinking. Three themes emerged from the qualitative analysis pertaining to the amount of influence on student learning. These themes are active thinking and learning, dialogue, and professor's influence. Due to the conflict between the quantitative and the qualitative results, it is suggested that the critical thinking tests are not sensitive enough to identify minute but important changes in student critical thinking.

  18. Review on Microstructure Analysis of Metals and Alloys Using Image Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Rekha, Suganthini; Bupesh Raja, V. K.

    2017-05-01

    Metals and alloys find vast application in the engineering and domestic sectors. The mechanical properties of metals and alloys are influenced by their microstructure; hence microstructural investigation is critical. Traditionally the microstructure is studied using an optical microscope after suitable metallurgical preparation. Over the past few decades, computers have been applied to the capture and analysis of optical micrographs. The advent of computer software for digital image processing and computer vision is a boon to the analysis of microstructure. This paper surveys the literature on the various developments in microstructural analysis. The conventional optical microscope is complemented by the scanning electron microscope (SEM) and other high-end equipment.

  19. Multi-Mbar Ramp Compression of Copper

    NASA Astrophysics Data System (ADS)

    Kraus, Rick; Davis, Jean-Paul; Seagle, Christopher; Fratanduono, Dayne; Swift, Damian; Eggert, Jon; Collins, Gilbert

    2015-06-01

    The cold curve is a critical component of equation of state models. Diamond anvil cell measurements can be used to determine isotherms, but these have generally been limited to pressures below 1 Mbar. The cold curve can also be extracted from Hugoniot data, but only with assumptions about the thermal pressure. As the National Ignition Facility will be using copper as an ablator material at pressures in excess of 10 Mbar, we need a better understanding of the high-density equation of state. Here we present ramp-wave compression experiments at the Sandia Z-Machine that we have used to constrain the isentrope of copper to a stress state of nearly 5 Mbar. We use the iterative Lagrangian analysis technique, developed by Rothman and Maw, to determine the stress-strain path. We also present a new iterative forward analysis (IFA) technique coupled to the ARES hydrocode that performs a non-linear optimization over the pressure drive and equation of state in order to match the free-surface velocities. The IFA technique offers an advantage over iterative Lagrangian analysis for experiments with growing shocks or systems with time-dependent strength, which violate the assumptions of iterative Lagrangian analysis. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  20. Recent Advances in Techniques for Hyperspectral Image Processing

    NASA Technical Reports Server (NTRS)

    Plaza, Antonio; Benediktsson, Jon Atli; Boardman, Joseph W.; Brazile, Jason; Bruzzone, Lorenzo; Camps-Valls, Gustavo; Chanussot, Jocelyn; Fauvel, Mathieu; Gamba, Paolo; Gualtieri, Anthony; et al.

    2009-01-01

    Imaging spectroscopy, also known as hyperspectral imaging, has been transformed in less than 30 years from being a sparse research tool into a commodity product available to a broad user community. Currently, there is a need for standardized data processing techniques able to take into account the special properties of hyperspectral data. In this paper, we provide a seminal view on recent advances in techniques for hyperspectral image processing. Our main focus is on the design of techniques able to deal with the high-dimensional nature of the data, and to integrate the spatial and spectral information. Performance of the discussed techniques is evaluated in different analysis scenarios. To satisfy time-critical constraints in specific applications, we also develop efficient parallel implementations of some of the discussed algorithms. Combined, these parts provide an excellent snapshot of the state-of-the-art in those areas, and offer a thoughtful perspective on future potentials and emerging challenges in the design of robust hyperspectral imaging algorithms.

  1. Application of advanced techniques for the assessment of bio-stability of biowaste-derived residues: A minireview.

    PubMed

    Lü, Fan; Shao, Li-Ming; Zhang, Hua; Fu, Wen-Ding; Feng, Shi-Jin; Zhan, Liang-Tong; Chen, Yun-Min; He, Pin-Jing

    2018-01-01

    Bio-stability is a key feature for the utilization and final disposal of biowaste-derived residues, such as aerobic compost or vermicompost of food waste, bio-dried waste, anaerobic digestate or landfilled waste. The present paper reviews conventional methods and advanced techniques used for the assessment of bio-stability. The conventional methods are reclassified into two categories. Advanced techniques, including spectroscopic (fluorescent, ultraviolet-visible, infrared, Raman, nuclear magnetic resonance), thermogravimetric and thermochemolysis analysis, are emphasized for their application to bio-stability assessment in recent years. Their principles, pros and cons are critically discussed. These advanced techniques are found to be convenient in sample preparation and to supply diversified information. However, their viability as potential indicators for bio-stability assessment ultimately lies in establishing the relationship between these advanced techniques and the conventional methods, especially those based on biotic response. Furthermore, some misuses in data interpretation are noted. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Critical analysis of procurement techniques in construction management sectors

    NASA Astrophysics Data System (ADS)

    Tiwari, Suman Tiwari Suresh; Chan, Shiau Wei; Faraz Mubarak, Muhammad

    2018-04-01

    Over the last three decades, numerous procurement techniques have been a highlight of Construction Management (CM), spanning management contracting, project management, and design-and-construct arrangements. Owing to the development and utilization of these techniques, various researchers have explored the criteria for their selection and their performance in terms of time, cost, and quality. Nevertheless, accounts of the relationship between procurement techniques and related emerging issues, for example supply chain, sustainability, innovation and technology development, lean construction, constructability, value management, Building Information Modelling (BIM), and e-procurement, are lacking. Through papers chosen from reputable CM-related academic journals, the specified scopes of these issues are methodically assessed with the objective of exploring the status and trends in procurement-related research. The results of this paper contribute theoretically as well as practically, helping researchers and industrialists recognize and appreciate the development of procurement techniques.

  3. A Regev-Type Fully Homomorphic Encryption Scheme Using Modulus Switching

    PubMed Central

    Chen, Zhigang; Wang, Jian; Song, Xinxia

    2014-01-01

    A critical challenge in a fully homomorphic encryption (FHE) scheme is noise management, and modulus switching is currently the most efficient noise management technique. When using modulus switching to design and implement an FHE scheme, choosing concrete parameters is an important step, but to the best of our knowledge this step has drawn very little attention in the existing FHE literature. The contributions of this paper are twofold. On one hand, we propose a function giving a lower bound on the lattice dimension used in modulus switching, as a function of the desired LWE security level. On the other hand, as a case study, we modify the Brakerski FHE scheme (Crypto 2012) by using the modulus switching technique. We recommend concrete parameter values for our proposed scheme and provide a security analysis. Our results show that the modified FHE scheme is more efficient than the original Brakerski scheme at the same security level. PMID:25093212
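    To make the core operation concrete, here is a minimal sketch of modulus switching on a Regev-style LWE ciphertext: every component is rescaled by q_to/q_from and rounded, which shrinks the absolute noise while preserving the encrypted bit. The parameters are toy-sized, far too small for real security, and purely illustrative.

    ```python
    # Sketch of modulus switching for a Regev-style ciphertext (a, b).
    import numpy as np

    rng = np.random.default_rng(1)
    n, q_big, q_small = 64, 2**20, 2**12
    s = rng.integers(0, 2, n)                      # binary secret key

    def encrypt(bit):
        a = rng.integers(0, q_big, n)
        e = int(rng.integers(-4, 5))               # small noise term
        b = (int(a @ s) + bit * (q_big // 2) + e) % q_big
        return a, b

    def mod_switch(a, b, q_from, q_to):
        # rescale every component by q_to/q_from and round to nearest integer
        a2 = np.rint(a * q_to / q_from).astype(np.int64) % q_to
        b2 = int(np.rint(b * q_to / q_from)) % q_to
        return a2, b2

    def decrypt(a, b, q):
        v = (b - int(a @ s)) % q
        return int(abs(v - q // 2) < q // 4)       # 1 if v is nearer q/2 than 0

    a, b = encrypt(1)
    a2, b2 = mod_switch(a, b, q_big, q_small)
    print(decrypt(a, b, q_big), decrypt(a2, b2, q_small))  # both print 1
    ```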

  4. Lip reposition surgery: A new call in periodontics

    PubMed Central

    Sheth, Tejal; Shah, Shilpi; Shah, Mihir; Shah, Ekta

    2013-01-01

    “Gummy smile” is a major concern for a large number of patients visiting the dentist. Esthetics has now become an integral part of the periodontal treatment plan. This article presents a case of gummy smile in which esthetic correction was achieved through a periodontal plastic surgical procedure, wherein a 10-12 mm partial-thickness flap was dissected apical to the mucogingival junction, followed by approximation of the flaps. This novel technique gave excellent post-operative results with enormous patient satisfaction. This chair-side surgical procedure, one of its kind with outstanding results, is very rarely performed by periodontists; thus, considerable clinical work and literature review of this technique are still required. To make it a routine procedure, the technique could be incorporated as part of periodontal plastic surgery in the standard texts. Hence, we put forward our experience of a case, with critical analysis of the surgical technique including its limitations. PMID:24124310

  5. Current applications of high-resolution mass spectrometry for the analysis of new psychoactive substances: a critical review.

    PubMed

    Pasin, Daniel; Cawley, Adam; Bidny, Sergei; Fu, Shanlin

    2017-10-01

    The proliferation of new psychoactive substances (NPS) in recent years has resulted in the development of numerous analytical methods for the detection and identification of known and unknown NPS derivatives. High-resolution mass spectrometry (HRMS) has been identified as the method of choice for broad screening of NPS in a wide range of analytical contexts because of its ability to measure accurate masses using data-independent acquisition (DIA) techniques. Additionally, it has shown promise for non-targeted screening strategies that have been developed in order to detect and identify novel analogues without the need for certified reference materials (CRMs) or comprehensive mass spectral libraries. This paper reviews the applications of HRMS for the analysis of NPS in forensic drug chemistry and analytical toxicology. It provides an overview of the sample preparation procedures in addition to data acquisition, instrumental analysis, and data processing techniques. Furthermore, it gives an overview of the current state of non-targeted screening strategies with discussion on future directions and perspectives of this technique. Graphical Abstract Missing the bullseye - a graphical representation of non-targeted screening. Image courtesy of Christian Alonzo.

  6. Laser-induced dissociation processes of protonated glucose: dehydration reactions vs cross-ring dissociation

    NASA Astrophysics Data System (ADS)

    Dyakov, Y. A.; Kazaryan, M. A.; Golubkov, M. G.; Gubanova, D. P.; Bulychev, N. A.; Kazaryan, S. M.

    2018-04-01

    Studying the processes occurring in biological systems under irradiation is critically important for understanding the operating principles of biological systems. One of the main problems that stimulates interest in the processes of photo-induced excitation and ionization of biomolecules is the necessity of their identification by various mass spectrometry (MS) methods. While simple analysis of small molecules became a standard MS technique long ago, recognition of large molecules, especially carbohydrates, is still a difficult problem and requires sophisticated techniques and complicated computer analysis. Owing to the large variety of substances in the samples, as well as the complexity of the processes occurring after excitation/ionization of the molecules, the recognition efficiency of MS techniques for carbohydrates is still not high enough. Additional theoretical and experimental analysis of ionization and dissociation processes in various kinds of polysaccharides, beginning from the simplest ones, is necessary. In this work, we extend previous theoretical and experimental studies of saccharides and concentrate our attention on protonated glucose. In this article we pay particular attention to the cross-ring dissociation and water-loss reactions, owing to their importance for identifying various isomers of carbohydrate molecules (for example, distinguishing α- and β-glucose).

  7. The Critical Incident Interview and Ethnoracial Identity.

    ERIC Educational Resources Information Center

    Montalvo, Frank F.

    1999-01-01

    Describes the critical-incident interview, a cross-cultural training technique that helps social work students assess clients' ethnic- and racial-identity development. Uses examples from student interviews to present the steps involved in teaching the technique. Includes guidelines for selecting and interviewing informants, and gives three scales…

  8. "Both Answers Make Sense!"

    ERIC Educational Resources Information Center

    Lockwood, Elise

    2014-01-01

    Formulas, problem types, keywords, and tricky techniques can certainly be valuable tools for successful counters. However, they can easily become substitutes for critical thinking about counting problems and for deep consideration of the set of outcomes. Formulas and techniques should serve as tools for students as they think critically about…

  9. Special Section: A Debate on Research Techniques in Economic Education

    ERIC Educational Resources Information Center

    Dawson, George G.; And Others

    1976-01-01

    Dawson introduces three articles which debate merits of research techniques in undergraduate economic education. William E. Becker criticizes John C. Soper's models, multicollinearity argument, and student incentives in a research project; Soper replies; Robert Highsmith critically analyzes strengths and weaknesses of each argument. (AV)

  10. Management of the second phase of labour: perineum protection techniques.

    PubMed

    Laganà, A S; Burgio, M A; Retto, G; Pizzo, A; Granese, R; Sturlese, E; Ciancimino, L; Chiofalo, B; Retto, A; Triolo, O

    2015-06-01

    The obstetric experience, alongside scientific evidence in the literature, indicates several management techniques for the expulsive period of labour to minimize obstetric complications. Among the various methods that can be used for the protection of the perineum during the expulsive phase, some are performed prepartum (perineal massage), while most are used during childbirth. In the second group, progressively increasing importance is assumed by the manual techniques to protect the perineum (the "hands-on" and "hands-off" approaches) and by episiotomy. These techniques, when used in accordance with the guidelines, may favour the reduction of adverse outcomes for both the mother and the newborn, both immediately after birth and over the longer term. The midwife should be aware of the evidence in the literature so that a critical analysis of the available techniques can be made and put into action during the expulsive phase, in order to protect the mother and the foetus from any unfavourable outcomes. Currently, clinical evidence in the literature is directing obstetric and medical staff towards a careful analysis of the maternal-foetal parameters, in order to achieve a precise assessment of the risk factors for intrapartum and postpartum outcomes. Increasingly, there is a need for close collaboration between the midwife and medical staff to ensure proper personalized assistance based on the particular characteristics of the woman and the fetus.

  11. Measuring protein dynamics in live cells: protocols and practical considerations for fluorescence fluctuation microscopy

    PubMed Central

    Youker, Robert T.; Teng, Haibing

    2014-01-01

    Quantitative analysis of protein complex stoichiometries and mobilities is critical for elucidating the mechanisms that regulate cellular pathways. Fluorescence fluctuation spectroscopy (FFS) techniques can measure protein dynamics, such as diffusion coefficients and formation of complexes, with extraordinary precision and sensitivity. Complete calibration and characterization of the microscope instrument is necessary in order to avoid artifacts during data acquisition and to capitalize on the full capabilities of FFS techniques. We provide an overview of the theory behind FFS techniques, discuss calibration procedures, provide protocols, and give practical considerations for performing FFS experiments. One important parameter recovered from FFS measurements is the relative molecular brightness, which can correlate with oligomerization. Three methods for measuring molecular brightness (fluorescence correlation spectroscopy, photon-counting histogram, and number and brightness analysis) recover similar values when measuring samples under ideal conditions in vitro. However, examples are given illustrating that these different methods for calculating the molecular brightness of fluorescent molecules in cells are not always equivalent. Methods relying on spot measurements are more prone to bleaching and movement artifacts that can lead to underestimation of brightness values. We advocate the use of multiple FFS techniques to study molecular brightness, so as to overcome the limitations of any individual technique. PMID:25260867
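    The number-and-brightness arithmetic mentioned above reduces to moments of a photon-count trace: for an ideal photon-counting detector, the shot-noise-corrected brightness is (variance - mean)/mean and the apparent molecule number is mean²/(variance - mean). The minimal sketch below demonstrates this on simulated counts; the trace and per-molecule rate are illustrative.

    ```python
    # Sketch of number-and-brightness (N&B) style estimation from a count trace.
    import numpy as np

    rng = np.random.default_rng(2)
    n_molecules = rng.poisson(lam=10, size=50_000)    # fluctuating occupancy
    counts = rng.poisson(lam=2.0 * n_molecules)       # 2 photons/molecule/bin

    mean, var = counts.mean(), counts.var()
    brightness = (var - mean) / mean                  # shot-noise-corrected
    number = mean**2 / (var - mean)                   # apparent molecule number

    print(f"brightness ~ {brightness:.2f} (true 2.0), number ~ {number:.1f} (true 10)")
    ```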

  12. Critical thinking in patient centered care.

    PubMed

    Mitchell, Shannon H; Overman, Pamela; Forrest, Jane L

    2014-06-01

    Health care providers can enhance their critical thinking skills, essential to providing patient centered care, by use of motivational interviewing and evidence-based decision making techniques. The need for critical thinking skills to foster optimal patient centered care is being emphasized in educational curricula for health care professions. The theme of this paper is that evidence-based decision making (EBDM) and motivational interviewing (MI) are tools that when taught in health professions educational programs can aid in the development of critical thinking skills. This paper reviews the MI and EBDM literature for evidence regarding these patient-centered care techniques as they relate to improved oral health outcomes. Comparisons between critical thinking and EBDM skills are presented and the EBDM model and the MI technique are briefly described followed by a discussion of the research to date. The evidence suggests that EBDM and MI are valuable tools; however, further studies are needed regarding the effectiveness of EBDM and MI and the ways that health care providers can best develop critical thinking skills to facilitate improved patient care outcomes. Copyright © 2014 Elsevier Inc. All rights reserved.

  13. Three-state Potts model on non-local directed small-world lattices

    NASA Astrophysics Data System (ADS)

    Ferraz, Carlos Handrey Araujo; Lima, José Luiz Sousa

    2017-10-01

    In this paper, we study non-local directed small-world (NLDSW) disorder effects in the three-state Potts model, as a way to capture the essential features shared by real complex systems in which non-locality plays an important role. Using Monte Carlo techniques and finite-size scaling analysis, we estimate the infinite-lattice critical temperatures and the leading critical exponents of this model. In particular, we investigate the first- to second-order phase transition crossover when NLDSW links are inserted. A cluster-flip algorithm was used to reduce the critical slowing-down effect in our simulations. We find that for NLDSW disorder densities p

  14. Entangled communities and spatial synchronization lead to criticality in urban traffic

    PubMed Central

    Petri, Giovanni; Expert, Paul; Jensen, Henrik J.; Polak, John W.

    2013-01-01

    Understanding the relation between patterns of human mobility and the scaling of dynamical features of urban environments is a great importance for today's society. Although recent advancements have shed light on the characteristics of individual mobility, the role and importance of emerging human collective phenomena across time and space are still unclear. In this Article, we show by using two independent data-analysis techniques that the traffic in London is a combination of intertwined clusters, spanning the whole city and effectively behaving as a single correlated unit. This is due to algebraically decaying spatio-temporal correlations, that are akin to those shown by systems near a critical point. We describe these correlations in terms of Taylor's law for fluctuations and interpret them as the emerging result of an underlying spatial synchronisation. Finally, our results provide the first evidence for a large-scale spatial human system reaching a self-organized critical state. PMID:23660823
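    Taylor's law relates the variance of a fluctuating quantity to its mean through a power law, variance ∝ mean^b, so the exponent b falls out of a log-log regression. The minimal sketch below fits it on synthetic traffic-like counts; the data are illustrative, not the London measurements.

    ```python
    # Sketch of a Taylor's-law fit: regress log(variance) on log(mean).
    import numpy as np

    rng = np.random.default_rng(3)
    site_means = np.logspace(0, 3, 40)                  # 40 sites, varying load
    samples = rng.poisson(site_means, size=(500, 40))   # 500 time samples/site

    mu = samples.mean(axis=0)
    var = samples.var(axis=0)
    slope, intercept = np.polyfit(np.log(mu), np.log(var), 1)
    print(f"Taylor exponent b ~ {slope:.2f}")           # ~1 for Poisson noise
    ```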

  15. In-Depth View of the Structure and Growth of SnO2 Nanowires and Nanobrushes.

    PubMed

    Stuckert, Erin P; Geiss, Roy H; Miller, Christopher J; Fisher, Ellen R

    2016-08-31

    Strategic application of an array of complementary imaging and diffraction techniques is critical to determine accurate structural information on nanomaterials, especially when also seeking to elucidate structure-property relationships and their effects on gas sensors. In this work, SnO2 nanowires and nanobrushes grown via chemical vapor deposition (CVD) displayed the same tetragonal SnO2 structure as revealed via powder X-ray diffraction bulk crystallinity data. Additional characterization using a range of electron microscopy imaging and diffraction techniques, however, revealed important structure and morphology distinctions between the nanomaterials. Tailoring scanning transmission electron microscopy (STEM) modes combined with transmission electron backscatter diffraction (t-EBSD) techniques afforded a more detailed view of the SnO2 nanostructures. Indeed, upon deeper analysis of individual wires and brushes, we discovered that, despite a similar bulk structure, wires and brushes grew with different crystal faces and lattice spacings. Had we not utilized multiple STEM diffraction modes in conjunction with t-EBSD, differences in orientation related to bristle density would have been overlooked. Thus, it is only through a methodical combination of several structural analysis techniques that precise structural information can be reliably obtained.

  16. Chemometric applications to assess quality and critical parameters of virgin and extra-virgin olive oil. A review.

    PubMed

    Gómez-Caravaca, Ana M; Maggio, Rubén M; Cerretani, Lorenzo

    2016-03-24

    Today, virgin and extra-virgin olive oil (VOO and EVOO) are foods subject to a large number of analytical tests intended to ensure their quality and genuineness. Almost all official methods demand heavy use of reagents and manpower; because of this, analytical development in this area is continuously evolving. Therefore, this review focuses on analytical methods for EVOO/VOO that use fast and smart approaches based on chemometric techniques in order to reduce analysis time, reagent consumption, high-cost equipment, and manpower. Experimental approaches coupling chemometrics with fast analytical techniques such as UV-Vis spectroscopy, fluorescence, vibrational spectroscopies (NIR, MIR, and Raman), NMR spectroscopy, and other more complex techniques like chromatography, calorimetry, and electrochemical techniques applied to EVOO/VOO production and analysis are discussed throughout this work. The advantages and drawbacks of this association are also highlighted. Chemometrics has proven to be a powerful tool for the oil industry. In fact, it has been shown how chemometrics can be implemented along the different steps of EVOO/VOO production: raw material input control, monitoring during processing, and quality control of the final product. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. The role of critical ethnic awareness and social support in the discrimination-depression relationship among Asian Americans: path analysis.

    PubMed

    Kim, Isok

    2014-01-01

    This study used a path-analytic technique to examine associations among critical ethnic awareness, racial discrimination, social support, and depressive symptoms. Using a convenience sample from an online survey of Asian American adults (N = 405), the study tested 2 main hypotheses: first, based on empowerment theory, critical ethnic awareness would be positively associated with racial discrimination experience; and second, based on the social support deterioration model, social support would partially mediate the relationship between racial discrimination and depressive symptoms. The path analysis showed that the proposed model fit the data well based on global fit indices, χ²(2) = 4.70, p = .10; root mean square error of approximation = 0.06; comparative fit index = 0.97; Tucker-Lewis index = 0.92; and standardized root mean square residual = 0.03. Examination of the study hypotheses demonstrated that critical ethnic awareness was directly associated (b = .11, p < .05) with the racial discrimination experience, whereas social support had a significant indirect effect (b = .48; bias-corrected 95% confidence interval [0.02, 1.26]) between the racial discrimination experience and depressive symptoms. The proposed path model illustrated that both critical ethnic awareness and social support are important mechanisms for explaining the relationship between racial discrimination and depressive symptoms among this sample of Asian Americans. This study highlights the usefulness of the critical ethnic awareness concept as a way to better understand how Asian Americans might perceive and recognize racial discrimination experiences in relation to their mental health consequences.
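    The indirect (mediated) effect reported above is, at its core, the product of two regression paths with a bootstrap confidence interval. The sketch below illustrates the mechanics on synthetic data, using a plain percentile interval rather than the paper's bias-corrected one; all variable names and coefficients are illustrative.

    ```python
    # Sketch of bootstrapping an indirect effect a*b in a simple mediation model.
    import numpy as np

    rng = np.random.default_rng(4)
    n = 405
    x = rng.normal(size=n)                      # e.g. discrimination experience
    m = 0.5 * x + rng.normal(size=n)            # mediator, e.g. social support
    y = 0.4 * m + 0.1 * x + rng.normal(size=n)  # outcome, e.g. depression

    def indirect(x, m, y):
        a = np.polyfit(x, m, 1)[0]                     # path a: x -> m
        X = np.column_stack([x, m, np.ones_like(x)])
        b = np.linalg.lstsq(X, y, rcond=None)[0][1]    # path b: m -> y | x
        return a * b

    boot = [indirect(*(arr[idx] for arr in (x, m, y)))
            for idx in (rng.integers(0, n, n) for _ in range(2000))]
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"indirect effect ~ {indirect(x, m, y):.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
    ```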

  18. A cyclostationary multi-domain analysis of fluid instability in Kaplan turbines

    NASA Astrophysics Data System (ADS)

    Pennacchi, P.; Borghesani, P.; Chatterton, S.

    2015-08-01

    Hydraulic instabilities represent a critical problem for Francis and Kaplan turbines, reducing their useful life due to increased fatigue on the components and cavitation phenomena. Whereas an exhaustive list of publications on computational fluid-dynamic models of hydraulic instability is available, the possibility of applying diagnostic techniques based on vibration measurements has not been investigated sufficiently, partly because hydro turbine units are seldom equipped with the appropriate sensors. The aim of this study is to fill this knowledge gap and to fully exploit the potential of combining cyclostationary analysis tools, able to describe complex dynamics such as those of fluid-structure interactions, with order tracking procedures, which allow domain transformations and consequently the separation of synchronous and non-synchronous components. This paper focuses on experimental data obtained on a full-scale Kaplan turbine unit operating in a real power plant, tackling the issues of adapting such diagnostic tools for the analysis of hydraulic instabilities and proposing techniques and methodologies for a highly automated condition monitoring system.

  19. A Hitchhiker's Guide to Functional Magnetic Resonance Imaging

    PubMed Central

    Soares, José M.; Magalhães, Ricardo; Moreira, Pedro S.; Sousa, Alexandre; Ganz, Edward; Sampaio, Adriana; Alves, Victor; Marques, Paulo; Sousa, Nuno

    2016-01-01

    Functional Magnetic Resonance Imaging (fMRI) studies have become increasingly popular both with clinicians and researchers as they are capable of providing unique insights into brain functions. However, multiple technical considerations (ranging from specifics of paradigm design to imaging artifacts, complex protocol definition, the multitude of processing and analysis methods, and intrinsic methodological limitations) must be considered and addressed in order to optimize fMRI analysis and to arrive at the most accurate and grounded interpretation of the data. In practice, the researcher/clinician must choose, from many available options, the most suitable software tool for each stage of the fMRI analysis pipeline. Herein we provide a straightforward guide designed to address, for each of the major stages, the techniques and tools involved in the process. We have developed this guide both to help those new to the technique overcome the most critical difficulties in its use, and to serve as a resource for the neuroimaging community. PMID:27891073

  20. Computer-Aided Design of Low-Noise Microwave Circuits

    NASA Astrophysics Data System (ADS)

    Wedge, Scott William

    1991-02-01

    Devoid of most natural and manmade noise, microwave frequencies have detection sensitivities limited by internally generated receiver noise. Low-noise amplifiers are therefore critical components in radio astronomical antennas, communications links, radar systems, and even home satellite dishes. A general technique to accurately predict the noise performance of microwave circuits has been lacking. Current noise analysis methods have been limited to specific circuit topologies or neglect correlation, a strong effect in microwave devices. Presented here are generalized methods, developed for computer-aided design implementation, for the analysis of linear noisy microwave circuits comprised of arbitrarily interconnected components. Included are descriptions of efficient algorithms for the simultaneous analysis of noisy and deterministic circuit parameters based on a wave variable approach. The methods are therefore particularly suited to microwave and millimeter-wave circuits. Noise contributions from lossy passive components and active components with electronic noise are considered. Also presented is a new technique for the measurement of device noise characteristics that offers several advantages over current measurement methods.

  1. Using cluster analysis for medical resource decision making.

    PubMed

    Dilts, D; Khamalah, J; Plotkin, A

    1995-01-01

    Escalating costs of health care delivery have in the recent past often made the health care industry investigate, adapt, and apply those management techniques relating to budgeting, resource control, and forecasting that have long been used in the manufacturing sector. A strategy that has contributed much in this direction is the definition and classification of a hospital's output into "products" or groups of patients that impose similar resource or cost demands on the hospital. Existing classification schemes have frequently employed cluster analysis in generating these groupings. Unfortunately, the myriad articles and books on clustering and classification contain few formalized selection methodologies for choosing a technique for solving a particular problem, hence they often leave the novice investigator at a loss. This paper reviews the literature on clustering, particularly as it has been applied in the medical resource-utilization domain, addresses the critical choices facing an investigator in the medical field using cluster analysis, and offers suggestions (using the example of clustering low-vision patients) for how such choices can be made.
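    A minimal sketch of the kind of clustering the paper discusses, assuming k-means on standardized patient resource-use features; the feature set, sample data, and choice of k are illustrative stand-ins, not the paper's low-vision example.

    ```python
    # Sketch of grouping patients into resource-use clusters with k-means.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(5)
    # columns: length of stay (days), number of procedures, total cost (k$)
    patients = np.column_stack([
        rng.gamma(2.0, 2.0, 300),
        rng.poisson(3, 300),
        rng.gamma(3.0, 5.0, 300),
    ])

    features = StandardScaler().fit_transform(patients)  # comparable scales matter
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
    for k in range(3):
        print(f"group {k}: n={np.sum(labels == k)}, "
              f"mean cost ~ {patients[labels == k, 2].mean():.1f} k$")
    ```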

  2. Six Sigma Approach to Improve Stripping Quality of Automotive Electronics Component – a case study

    NASA Astrophysics Data System (ADS)

    Razali, Noraini Mohd; Murni Mohamad Kadri, Siti; Con Ee, Toh

    2018-03-01

    A lack of problem-solving techniques and of cooperation between support groups are two obstacles commonly faced on an actual production line. Inadequate detailed analysis and inappropriate problem-solving techniques can cause recurring issues that affect organizational performance. This study utilizes a well-structured six sigma DMAIC approach, in combination with other problem-solving tools, to solve a product quality problem in the manufacturing of an automotive electronics component. The study concentrates on the stripping process, a critical process step with the highest rejection rate, which contributes to the scrap and rework performance. A detailed analysis is conducted in the analyze phase to identify the actual root cause of the problem. Several improvement activities are then implemented, and the results show that the rejection rate due to stripping defects decreased tremendously, with the process capability index improving from 0.75 to 1.67. These results prove that the six sigma approach used to tackle the quality problem is substantially effective.
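    The capability figures quoted above follow from the standard index Cpk = min(USL - μ, μ - LSL) / (3σ). The sketch below computes it for two simulated runs; the specification limits and data are illustrative, not the study's stripping measurements.

    ```python
    # Sketch of the process-capability index Cpk before and after improvement.
    import numpy as np

    def cpk(samples, lsl, usl):
        mu, sigma = np.mean(samples), np.std(samples, ddof=1)
        return min(usl - mu, mu - lsl) / (3.0 * sigma)

    rng = np.random.default_rng(6)
    before = rng.normal(10.0, 0.40, 200)  # wide spread -> low capability
    after = rng.normal(10.0, 0.18, 200)   # reduced variation after improvement
    print(f"Cpk before ~ {cpk(before, 9.0, 11.0):.2f}")  # ~0.8
    print(f"Cpk after  ~ {cpk(after, 9.0, 11.0):.2f}")   # ~1.8
    ```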

  3. Reliability analysis of the F-8 digital fly-by-wire system

    NASA Technical Reports Server (NTRS)

    Brock, L. D.; Goodman, H. A.

    1981-01-01

    The F-8 Digital Fly-by-Wire (DFBW) flight test program was intended to provide the technology for advanced control systems, giving aircraft enhanced performance and operational capability. A detailed analysis of the experimental system was performed to estimate the probabilities of two significant safety-critical events: (1) loss of the primary flight control function, causing reversion to the analog bypass system; and (2) loss of the aircraft due to failure of the electronic flight control system. The analysis covers appraisal of risks due to random equipment failure, generic faults in the design of the system or its software, and induced failure due to external events. A unique diagrammatic technique was developed that details the combinatorial reliability equations for the entire system, promotes understanding of system failure characteristics, and identifies the most likely failure modes. The technique provides a systematic method of applying basic probability equations and is augmented by a computer program, written in a modular fashion, that duplicates the structure of these equations.
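    The combinatorial reliability equations such a diagram encodes reduce to two composition rules, sketched below under an independence assumption: a redundant (parallel) group is lost only if every channel fails, and a series chain is lost if any element fails. The channel counts and failure probabilities are illustrative, not the F-8 DFBW study's figures.

    ```python
    # Sketch of series/parallel reliability combination (independence assumed).
    def parallel_loss(p_each, n):
        """Probability that all n redundant channels fail."""
        return p_each ** n

    def series_loss(p_list):
        """Probability that at least one element of a series chain fails."""
        survive = 1.0
        for p in p_list:
            survive *= 1.0 - p
        return 1.0 - survive

    p_channel = series_loss([1e-4, 1e-3, 2e-4])   # sensor -> computer -> actuator
    p_primary_lost = parallel_loss(p_channel, 3)  # triplex digital system lost
    p_aircraft_lost = p_primary_lost * 1e-3       # and the analog bypass fails too
    print(f"primary loss ~ {p_primary_lost:.2e}, aircraft loss ~ {p_aircraft_lost:.2e}")
    ```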

  4. High-throughput determination of structural phase diagram and constituent phases using GRENDEL

    NASA Astrophysics Data System (ADS)

    Kusne, A. G.; Keller, D.; Anderson, A.; Zaban, A.; Takeuchi, I.

    2015-11-01

    Advances in high-throughput materials fabrication and characterization techniques have resulted in faster rates of data collection and rapidly growing volumes of experimental data. Converting this mass of information into actionable knowledge of material process-structure-property relationships requires high-throughput data analysis techniques. This work explores the use of the graph-based endmember extraction and labeling (GRENDEL) algorithm as a high-throughput method for analyzing structural data from combinatorial libraries, specifically to determine phase diagrams and constituent phases from both x-ray diffraction and Raman spectral data. The GRENDEL algorithm utilizes a set of physical constraints to optimize results and provides a framework by which additional physics-based constraints can be easily incorporated. GRENDEL also permits the integration of database data, as shown by the use of critically evaluated data from the Inorganic Crystal Structure Database in the x-ray diffraction data analysis. The sunburst radial tree map is also demonstrated as a tool to visualize material structure-property relationships found through graph-based analysis.

  5. Episiotomy: the final cut?

    PubMed

    Steiner, Naama; Weintraub, Adi Y; Wiznitzer, Arnon; Sergienko, Ruslan; Sheiner, Eyal

    2012-12-01

    To investigate whether episiotomy prevents 3rd or 4th degree perineal tears in critical conditions such as shoulder dystocia, instrumental deliveries (vacuum or forceps), persistent occiput-posterior position, fetal macrosomia (>4,000 g), and non-reassuring fetal heart rate (NRFHR) patterns. A retrospective study was performed comparing 3rd and 4th degree perineal tears during vaginal deliveries with or without episiotomy under selected critical conditions. Multiple gestations, preterm deliveries (<37 weeks' gestation) and cesarean deliveries were excluded from the analysis. Stratified analysis (using the Mantel-Haenszel technique) was used to obtain the weighted odds ratio (OR) while controlling for these variables. During the study period, there were 168,077 singleton vaginal deliveries. Of those, 188 (0.1%) had 3rd or 4th degree perineal tears. Vaginal deliveries with episiotomy had significantly higher rates of 3rd or 4th degree perineal tears than those without episiotomy (0.2 vs. 0.1%; P<0.001). The association between episiotomy and severe perineal tears remained significant even in the critical conditions. Stratified analysis revealed that the adjusted ORs for 3rd or 4th degree perineal tears in these critical conditions (macrosomia OR=2.3; instrumental deliveries OR=1.8; NRFHR patterns OR=2.1; occipito-posterior position OR=2.3; and shoulder dystocia OR=2.3) were similar to the crude OR (OR=2.3). Mediolateral episiotomy is an independent risk factor for 3rd or 4th degree perineal tears, even in critical conditions such as shoulder dystocia, instrumental deliveries, occiput-posterior position, fetal macrosomia, and NRFHR. Prophylactic use of episiotomy in these conditions does not appear beneficial for preventing 3rd or 4th degree perineal tears.
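    The Mantel-Haenszel weighted odds ratio used in such stratified analyses is a simple ratio of sums over the per-stratum 2x2 tables: OR_MH = Σ(a·d/n) / Σ(b·c/n). The sketch below computes it for two illustrative strata; the counts are invented, not the study's data.

    ```python
    # Sketch of the Mantel-Haenszel weighted odds ratio across strata.
    # Each stratum is [[a, b], [c, d]] = [[exposed cases, exposed controls],
    #                                     [unexposed cases, unexposed controls]].
    def mantel_haenszel_or(tables):
        num = den = 0.0
        for (a, b), (c, d) in tables:
            n = a + b + c + d
            num += a * d / n
            den += b * c / n
        return num / den

    strata = [
        ((12, 4988), (6, 4994)),   # stratum 1, e.g. macrosomia
        ((20, 9980), (9, 9991)),   # stratum 2, e.g. instrumental delivery
    ]
    print(f"OR_MH ~ {mantel_haenszel_or(strata):.2f}")  # ~2.1 for these tables
    ```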

  6. Analytical Quality by Design in pharmaceutical quality assurance: Development of a capillary electrophoresis method for the analysis of zolmitriptan and its impurities.

    PubMed

    Orlandini, Serena; Pasquini, Benedetta; Caprini, Claudia; Del Bubba, Massimo; Pinzauti, Sergio; Furlanetto, Sandra

    2015-11-01

    A fast and selective CE method for the determination of zolmitriptan (ZOL) and its five potential impurities has been developed applying analytical Quality by Design principles. Voltage, temperature, buffer concentration, and pH were investigated as critical process parameters that can influence the critical quality attributes, represented by critical resolution values between peak pairs, analysis time, and peak efficiency of ZOL-dimer. A symmetric screening matrix was employed for investigating the knowledge space, and a Box-Behnken design was used to evaluate the main, interaction, and quadratic effects of the critical process parameters on the critical quality attributes. Contour plots were drawn highlighting important interactions between buffer concentration and pH, and the information gained was merged into sweet spot plots. The design space (DS) was established by the combined use of response surface methodology and Monte Carlo simulations, introducing a probability concept and thus allowing the quality of the analytical performance to be assured over a defined domain. The working conditions (with the intervals defining the DS) were as follows: BGE, 138 mM (115-150 mM) phosphate buffer pH 2.74 (2.54-2.94); temperature, 25°C (24-25°C); voltage, 30 kV. A control strategy was planned based on method robustness and system suitability criteria. The main advantages of applying the Quality by Design concept were a great increase in knowledge of the analytical system, obtained through multivariate techniques, and the achievement of analytical assurance of quality, derived from the probability-based definition of the DS. The developed method was finally validated and applied to the analysis of ZOL tablets. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Preliminary evaluation of several nondestructive-evaluation techniques for silicon nitride gas-turbine rotors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kupperman, D. S.; Sciammarella, C.; Lapinski, N. P.

    1978-01-01

    Several nondestructive-evaluation (NDE) techniques have been examined to establish their effectiveness for detecting critically sized flaws in silicon nitride gas-turbine rotors. Preliminary results have been obtained for holographic interferometry, acoustic microscopy, dye-enhanced radiography, acoustic emission, and acoustic-impact testing techniques. This report discusses the relative effectiveness of these techniques in terms of their applicability to the rotor geometry and ability to detect critically sized flaws. Where feasible, flaw indications were verified by alternative NDE techniques or destructive examination. This study has indicated that, since the various techniques have different advantages, a reliable interrogation of ceramic rotors may ultimately require the application of several NDE methods.

  8. An Analysis of Measures Used to Evaluate the Air Force Critical Item Program

    DTIC Science & Technology

    1991-09-01

    example of a histogram. Cause & Effect Diagram. The cause and effect diagram was introduced in 1953 by Dr. Kaoru Ishikawa in summarizing the opinions of... Personal Interview. Air Force Institute of Technology, School of Engineering, Wright-Patterson AFB OH, 24 April 1991. 31. Ishikawa, Dr. Kaoru. Guide to... collected. How the data are collected will determine which measurement techniques are appropriate. Ishikawa classifies data collection into five categories

  9. Do network relationships matter? Comparing network and instream habitat variables to explain densities of juvenile coho salmon (Oncorhynchus kisutch) in mid-coastal Oregon, USA

    Treesearch

    Rebecca L. Flitcroft; Kelly M. Burnett; Gordon H. Reeves; Lisa M. Ganio

    2012-01-01

    Aquatic ecologists are working to develop theory and techniques for analysis of dynamic stream processes and communities of organisms. Such work is critical for the development of conservation plans that are relevant at the scale of entire ecosystems. The stream network is the foundation upon which stream systems are organized. Natural and human disturbances in streams...

  10. Digital Watermarking of Autonomous Vehicles Imagery and Video Communication

    DTIC Science & Technology

    2005-10-01

    ...the world’s recent events, the increasing need in homeland security and defense is a critical topic... different domains, those being: spatial, spectral and com-... watermarking schemes benefit in that there is no need for full or partial decompression... security and analysis is also vital, especially when using... are embedded... change with each application. Whether it is secure covert... Arguably, the most widely used technique is spread spectrum watermarking (SS...

  11. Microfluidic Devices for Forensic DNA Analysis: A Review

    PubMed Central

    Bruijns, Brigitte; van Asten, Arian; Tiggelaar, Roald; Gardeniers, Han

    2016-01-01

    Microfluidic devices may offer various advantages for forensic DNA analysis, such as reduced risk of contamination, shorter analysis time and direct application at the crime scene. Microfluidic chip technology has already proven to be functional and effective within medical applications, such as for point-of-care use. In the forensic field, one may expect microfluidic technology to become particularly relevant for the analysis of biological traces containing human DNA. This would require a number of consecutive steps, including sample work up, DNA amplification and detection, as well as secure storage of the sample. This article provides an extensive overview of microfluidic devices for cell lysis, DNA extraction and purification, DNA amplification and detection and analysis techniques for DNA. Topics to be discussed are polymerase chain reaction (PCR) on-chip, digital PCR (dPCR), isothermal amplification on-chip, chip materials, integrated devices and commercially available techniques. A critical overview of the opportunities and challenges of the use of chips is discussed, and developments made in forensic DNA analysis over the past 10–20 years with microfluidic systems are described. Areas in which further research is needed are indicated in a future outlook. PMID:27527231

  12. An evaluation of the individual components and accuracies associated with the determination of impervious area

    USGS Publications Warehouse

    Slonecker, E.T.; Tilley, J.S.

    2004-01-01

    The percentage of impervious surface area in a watershed has been widely recognized as a key indicator of terrestrial and aquatic ecosystem condition. Although the use of the impervious indicator is widespread, there is currently no consistent or mutually accepted method of computing impervious area, and the approaches of various commonly used techniques vary widely. Further, we do not have reliable information on the components of impervious surfaces, which would be critical in any future planning attempts to remediate problems associated with impervious surface coverage. In cooperation with the USGS Geographic Analysis and Monitoring Program (GAM) and The National Map, and the EPA Landscape Ecology Program, this collaborative research project utilized very high resolution imagery and GIS techniques to map and quantify the individual components of total impervious area in six urban/suburban watersheds in different parts of the United States. These data served as ground reference, or "truth," for the evaluation of four techniques used to compute impervious area. The results reveal important aspects of the component make-up of impervious cover and the variability of methods commonly used to compile this critical emerging indicator of ecosystem condition. © 2004 by V. H. Winston and Sons, Inc. All rights reserved.

  13. Quantitation of Surface Coating on Nanoparticles Using Thermogravimetric Analysis.

    PubMed

    Dongargaonkar, Alpana A; Clogston, Jeffrey D

    2018-01-01

    Nanoparticles are critical components in nanomedicine and nanotherapeutic applications. Some nanoparticles, such as metallic nanoparticles, carry a surface coating or surface modification to aid their dispersion and stability. This surface coating may affect the behavior of nanoparticles in a biological environment, so it is important to measure. Thermogravimetric analysis (TGA) can be used to determine the amount of coating on the surface of the nanoparticle. TGA experiments run under an inert atmosphere can also be used to determine the residual metal content of the sample. In this chapter, the TGA technique and experimental method are described.
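    The underlying quantitation is simple weight-loss arithmetic: the organic coating burns off during the heating ramp, so its mass fraction is the relative difference between the initial and residual mass. A minimal sketch, with illustrative masses:

    ```python
    # Sketch of TGA coating quantitation from initial and residual mass.
    def coating_fraction(mass_initial_mg, mass_residual_mg):
        """Mass fraction of coating lost during the heating ramp."""
        return (mass_initial_mg - mass_residual_mg) / mass_initial_mg

    m0, m_res = 5.00, 4.35  # mg before and after the ramp (illustrative)
    print(f"coating ~ {100 * coating_fraction(m0, m_res):.1f} wt%")  # 13.0 wt%
    ```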

  14. Analysis of Nonvolatile Residue (NVR) from Spacecraft Systems

    NASA Technical Reports Server (NTRS)

    Colony, J. A.

    1985-01-01

    Organic contamination on critical spacecraft surfaces can cause electronic problems, serious attenuation of various optical signals, thermal control changes, and adhesion problems. Such contaminants can be detected early by the controlled use of witness mirrors, witness plates, wipe sampling, or direct solvent extraction. Each method requires careful control of variables of technique and materials to attain the ultimate sensitivities inherent to that procedure. Subsequent chemical analysis of the contaminant sample by infrared and mass spectrometry identifies the components, gives semiquantitative estimates of contaminant thickness, indicates possible sources of the nonvolatile residue (NVR), and provides guidance for effective cleanup procedures.

  15. Crises, noise, and tipping in the Hassell population model

    NASA Astrophysics Data System (ADS)

    Bashkirtseva, Irina

    2018-03-01

    We consider the problem of analyzing noise-induced tipping in population systems. To study this phenomenon, we use a Hassell-type system with Allee effect as a conceptual model. A mathematical investigation of tipping is connected with the analysis of crisis bifurcations, both boundary and interior. In the parametric study of abrupt changes in dynamics related to noise-induced extinction and the transition from order to chaos, the stochastic sensitivity function technique and confidence domains are used. The effectiveness of the suggested approach for detecting early warnings of critical stochastic transitions is demonstrated.

  16. Alignment issues, correlation techniques and their assessment for a visible light imaging-based 3D printer quality control system

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2016-05-01

    Quality control is critical to manufacturing. Frequently, techniques are used to define object conformity bounds based on historical quality data. This paper considers techniques for bespoke and small-batch jobs that are not based on statistical models. These techniques also serve jobs where 100% validation is needed due to the mission- or safety-critical nature of particular parts. One issue with this type of system is alignment discrepancies between the generated model and the physical part. This paper discusses and evaluates techniques for characterizing and correcting alignment issues between the projected and perceived data sets, to prevent errors attributable to misalignment.

  17. The Strengths and Weaknesses of Logic Formalisms to Support Mishap Analysis

    NASA Technical Reports Server (NTRS)

    Johnson, C. W.; Holloway, C. M.

    2002-01-01

    The increasing complexity of many safety-critical systems poses new problems for mishap analysis. Techniques developed in the sixties and seventies cannot easily scale up to analyze incidents involving tightly integrated software and hardware components. Similarly, the realization that many failures have systemic causes has widened the scope of many mishap investigations. Organizations, including NASA and the NTSB, have responded by starting research and training initiatives to ensure that their personnel are well equipped to meet these challenges. One strand of research has identified a range of mathematically based techniques that can be used to reason about the causes of complex, adverse events. The proponents of these techniques have argued that they can be used to formally prove that certain events created the necessary and sufficient causes for a mishap to occur. Mathematical proofs can reduce the bias that is often perceived to affect the interpretation of adverse events. Others have opposed the introduction of these techniques by identifying social and political aspects of incident investigation that cannot easily be reconciled with a logic-based approach. Traditional theorem proving mechanisms cannot accurately capture the wealth of inductive, deductive, and statistical forms of inference that investigators routinely use in their analysis of adverse events. This paper summarizes some of the benefits that logics provide, describes their weaknesses, and proposes a number of directions for future research.

  18. SP mountain data analysis

    NASA Technical Reports Server (NTRS)

    Rawson, R. F.; Hamilton, R. E.; Liskow, C. L.; Dias, A. R.; Jackson, P. L.

    1981-01-01

    An analysis of synthetic aperture radar data of SP Mountain was undertaken to demonstrate the use of digital image processing techniques to aid in geologic interpretation of SAR data. These data were collected with the ERIM X- and L-band airborne SAR using like- and cross-polarizations. The resulting signal films were used to produce computer compatible tapes, from which four-channel imagery was generated. Slant range-to-ground range and range-azimuth-scale corrections were made in order to facilitate image registration; intensity corrections were also made. Manual interpretation of the imagery showed that L-band represented the geology of the area better than X-band. Several differences between the various images were also noted. Further digital analysis of the corrected data was done for enhancement purposes. This analysis included application of an MSS differencing routine and development of a routine for removal of relief displacement. It was found that accurate registration of the SAR channels is critical to the effectiveness of the differencing routine. Use of the relief displacement algorithm on the SP Mountain data demonstrated the feasibility of the technique.

  19. Identification and elucidation of anthropogenic source contribution in PM10 pollutant: Insight gain from dispersion and receptor models.

    PubMed

    Roy, Debananda; Singh, Gurdeep; Yadav, Pankaj

    2016-10-01

    A source apportionment study of PM10 (particulate matter) in a critically polluted area of the Jharia coalfield, India, has been carried out using a dispersion model, Principal Component Analysis (PCA), and Chemical Mass Balance (CMB) techniques. The dispersion model AERMOD was introduced to simplify the complexity of sources in the Jharia coalfield. PCA and CMB analysis indicate that monitoring stations near the mining area were mainly affected by emissions from open coal mining and its associated activities, such as coal transportation and the loading and unloading of coal. Mine fire emissions also contributed a considerable amount of particulate matter at the monitoring stations. Locations in the city area were mostly affected by vehicular, Liquid Petroleum Gas (LPG) and Diesel Generator (DG) set emissions, and residential and commercial activities. The experimental data sampling and analysis illustrate how dispersion-model techniques, together with receptor-model concepts, can be strategically used for quantitative analysis of natural and anthropogenic sources of PM10. Copyright © 2016. Published by Elsevier B.V.
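    The CMB step can be sketched as a small non-negative least-squares problem: measured species concentrations at a receptor are modeled as a non-negative mixture of known source profiles, and solving for the mixture weights gives the source contributions. The profiles and values below are invented for illustration.

    ```python
    # Sketch of chemical mass balance via non-negative least squares.
    import numpy as np
    from scipy.optimize import nnls

    # rows: chemical species; columns: sources (e.g. mining, vehicles, mine fire)
    profiles = np.array([
        [0.30, 0.05, 0.20],
        [0.02, 0.25, 0.05],
        [0.10, 0.10, 0.30],
        [0.05, 0.30, 0.02],
    ])
    true_contrib = np.array([40.0, 25.0, 10.0])  # ug/m3 per source
    measured = profiles @ true_contrib           # synthetic receptor data

    contrib, residual = nnls(profiles, measured)
    print("estimated contributions:", np.round(contrib, 1))  # ~[40, 25, 10]
    ```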

  20. Overview of Recent Flight Flutter Testing Research at NASA Dryden

    NASA Technical Reports Server (NTRS)

    Brenner, Martin J.; Lind, Richard C.; Voracek, David F.

    1997-01-01

    In response to the concerns of the aeroelastic community, NASA Dryden Flight Research Center, Edwards, California, is conducting research into improving the flight flutter (including aeroservoelasticity) test process with more accurate and automated techniques for stability boundary prediction. The important elements of this effort so far include the following: (1) excitation mechanisms for enhanced vibration data to reduce uncertainty levels in stability estimates; (2) investigation of a variety of frequency, time, and wavelet analysis techniques for signal processing, stability estimation, and nonlinear identification; and (3) robust flutter boundary prediction to substantially reduce the test matrix for flutter clearance. These are critical research topics addressing the concerns of a recent AGARD Specialists' Meeting on Advanced Aeroservoelastic Testing and Data Analysis. This paper addresses these items using flight test data from the F/A-18 Systems Research Aircraft and the F/A-18 High Alpha Research Vehicle.

  1. Verification, Validation and Sensitivity Studies in Computational Biomechanics

    PubMed Central

    Anderson, Andrew E.; Ellis, Benjamin J.; Weiss, Jeffrey A.

    2012-01-01

    Computational techniques and software for the analysis of problems in mechanics have naturally moved from their origins in the traditional engineering disciplines to the study of cell, tissue and organ biomechanics. Increasingly complex models have been developed to describe and predict the mechanical behavior of such biological systems. While the availability of advanced computational tools has led to exciting research advances in the field, the utility of these models is often the subject of criticism due to inadequate model verification and validation. The objective of this review is to present the concepts of verification, validation and sensitivity studies with regard to the construction, analysis and interpretation of models in computational biomechanics. Specific examples from the field are discussed. It is hoped that this review will serve as a guide to the use of verification and validation principles in the field of computational biomechanics, thereby improving the peer acceptance of studies that use computational modeling techniques. PMID:17558646

  2. UMA/GAN network architecture analysis

    NASA Astrophysics Data System (ADS)

    Yang, Liang; Li, Wensheng; Deng, Chunjian; Lv, Yi

    2009-07-01

    This paper critically analyzes the architecture of UMA, which is one of the Fixed Mobile Convergence (FMC) solutions and is also included by the Third Generation Partnership Project (3GPP). In the UMA/GAN network architecture, the UMA Network Controller (UNC) is the key equipment, connecting the cellular core network with the mobile station (MS). A UMA network can be easily integrated into existing cellular networks without influencing the mobile core network, and provides high-quality mobile services with preferentially priced indoor voice and data usage. This helps to improve the subscriber's experience. On the other hand, the UMA/GAN architecture helps to integrate other radio technologies into the cellular network, including WiFi, Bluetooth, WiMax, and so on. This offers traditional mobile operators an opportunity to integrate WiMax technology into the cellular network. At the end of this article, we also give an analysis of the potential influence of the UMA network on the cellular core network.

  3. Application of real-time digitization techniques in beam measurement for accelerators

    NASA Astrophysics Data System (ADS)

    Zhao, Lei; Zhan, Lin-Song; Gao, Xing-Shun; Liu, Shu-Bin; An, Qi

    2016-04-01

    Beam measurement is very important for accelerators. In this paper, modern digital beam measurement techniques based on IQ (In-phase & Quadrature-phase) analysis are discussed. Based on this method and high-speed high-resolution analog-to-digital conversion, we have completed three beam measurement electronics systems designed for the China Spallation Neutron Source (CSNS), Shanghai Synchrotron Radiation Facility (SSRF), and Accelerator Driven Sub-critical system (ADS). Core techniques of hardware design and real-time system calibration are discussed, and performance test results of these three instruments are also presented. Supported by National Natural Science Foundation of China (11205153, 10875119), Knowledge Innovation Program of the Chinese Academy of Sciences (KJCX2-YW-N27), and the Fundamental Research Funds for the Central Universities (WK2030040029),and the CAS Center for Excellence in Particle Physics (CCEPP).
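    The IQ analysis at the heart of such systems can be sketched in a few lines: mix the sampled pickup signal with reference cosine and sine waves and average, yielding the in-phase and quadrature components, from which amplitude and phase follow. The sample rate, frequency, and signal below are illustrative.

    ```python
    # Sketch of IQ (in-phase/quadrature) demodulation of a sampled signal.
    import numpy as np

    fs, f0, n = 100e6, 5e6, 4000              # sample rate, RF freq, samples
    t = np.arange(n) / fs
    signal = 0.8 * np.cos(2 * np.pi * f0 * t + 0.6)     # pickup signal

    i_comp = 2 * np.mean(signal * np.cos(2 * np.pi * f0 * t))   # in-phase
    q_comp = -2 * np.mean(signal * np.sin(2 * np.pi * f0 * t))  # quadrature

    amplitude = np.hypot(i_comp, q_comp)
    phase = np.arctan2(q_comp, i_comp)
    print(f"amplitude ~ {amplitude:.3f}, phase ~ {phase:.3f} rad")  # ~0.8, ~0.6
    ```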

  4. Artificial chordae for degenerative mitral valve disease: critical analysis of current techniques

    PubMed Central

    Ibrahim, Michael; Rao, Christopher; Athanasiou, Thanos

    2012-01-01

    The surgical repair of degenerative mitral valve disease involves a number of technical points of importance. The use of artificial chordae for the repair of degenerative disease has increased as a part of the move from mitral valve replacement to repair of the mitral valve. The use of artificial chordae provides an alternative to the techniques pioneered by Carpentier (including the quadrangular resection, transfer of native chordae and papillary muscle shortening/plasty), which can be more technically difficult. Despite a growth in their uptake and the indications for their use, a number of challenges remain for the use of artificial chordae in mitral valve repair, particularly in the determination of the correct length to ensure optimal leaflet coaptation. Here, we analyse over 40 techniques described for artificial chordae mitral valve repair in the setting of degenerative disease. PMID:22962321

  5. Three dimensional profile measurement using multi-channel detector MVM-SEM

    NASA Astrophysics Data System (ADS)

    Yoshikawa, Makoto; Harada, Sumito; Ito, Keisuke; Murakawa, Tsutomu; Shida, Soichi; Matsumoto, Jun; Nakamura, Takayuki

    2014-07-01

    In next-generation lithography (NGL) for the 1x nm node and beyond, three-dimensional (3D) shape measurements, such as side wall angle (SWA) and feature height on the photomask, become more critical for process control. To date, AFM (Atomic Force Microscopy), X-SEM (cross-section Scanning Electron Microscopy), and TEM (Transmission Electron Microscopy) tools are normally used for 3D measurements; however, these techniques require time-consuming preparation and observation, and both X-SEM and TEM are destructive measurement techniques. This paper presents a technology for quick and non-destructive 3D shape analysis using the multi-channel detector MVM-SEM (Multi Vision Metrology SEM), and also reports its accuracy and precision.

  6. Signal frequency distribution and natural-time analyses from acoustic emission monitoring of an arched structure in the Castle of Racconigi

    NASA Astrophysics Data System (ADS)

    Niccolini, Gianni; Manuello, Amedeo; Marchis, Elena; Carpinteri, Alberto

    2017-07-01

    The stability of an arch as a structural element in the thermal bath of King Charles Albert (Carlo Alberto) in the Royal Castle of Racconigi (on the UNESCO World Heritage List since 1997) was assessed by the acoustic emission (AE) monitoring technique with application of classical inversion methods to recorded AE data. First, damage source location by means of triangulation techniques and signal frequency analysis were carried out. Then, the recently introduced method of natural-time analysis was preliminarily applied to the AE time series in order to reveal a possible entrance point to a critical state of the monitored structural element. Finally, possible influence of the local seismic and microseismic activity on the stability of the monitored structure was investigated. The criterion for selecting relevant earthquakes was based on the estimation of the size of earthquake preparation zones. The presented results suggest the use of the AE technique as a tool for detecting both ongoing structural damage processes and microseismic activity during preparation stages of seismic events.
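    The natural-time analysis mentioned above has a compact computational core: the k-th of N events is assigned natural time χ_k = k/N and weight p_k equal to its normalized energy, and the variance κ₁ = <χ²> − <χ>² is tracked, with values near 0.070 conventionally read as an approach to criticality. A minimal sketch with illustrative AE energies:

    ```python
    # Sketch of the natural-time variance kappa_1 for an event-energy series.
    import numpy as np

    def kappa1(energies):
        energies = np.asarray(energies, dtype=float)
        n = energies.size
        chi = np.arange(1, n + 1) / n       # natural time of each event
        p = energies / energies.sum()       # normalized event energies
        return np.sum(p * chi**2) - np.sum(p * chi)**2

    rng = np.random.default_rng(7)
    ae_energies = rng.pareto(1.5, 300) + 1.0  # heavy-tailed AE energy series
    print(f"kappa_1 ~ {kappa1(ae_energies):.3f}")
    ```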

  7. A review of cutting mechanics and modeling techniques for biological materials.

    PubMed

    Takabi, Behrouz; Tai, Bruce L

    2017-07-01

    This paper presents a comprehensive survey on the modeling of tissue cutting, including both soft tissue and bone cutting processes. Because tissue cutting is a critical process in surgical operations, meticulous modeling of such processes is important for achieving higher accuracy, in particular for surgical tool development and analysis. This review is focused on the mechanical concepts and modeling techniques used to simulate tissue cutting, such as cutting forces and chip morphology. These models are presented in two major categories, namely soft tissue cutting and bone cutting. Fracture toughness is commonly used to describe tissue cutting, while the Johnson-Cook material model is often adopted for bone cutting in conjunction with finite element analysis (FEA). In each section, the most recent mathematical and computational models are summarized, and the differences and similarities among these models, challenges, novel techniques, and recommendations for future work are discussed. This review aims to provide a broad and in-depth view of the methods suitable for tissue and bone cutting simulations. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.

  8. Advances in carbonate exploration and reservoir analysis

    USGS Publications Warehouse

    Garland, J.; Neilson, J.; Laubach, S.E.; Whidden, Katherine J.

    2012-01-01

    The development of innovative techniques and concepts, and the emergence of new plays in carbonate rocks are creating a resurgence of oil and gas discoveries worldwide. The maturity of a basin and the application of exploration concepts have a fundamental influence on exploration strategies. Exploration success often occurs in underexplored basins by applying existing established geological concepts. This approach is commonly undertaken when new basins ‘open up’ owing to previous political upheavals. The strategy of using new techniques in a proven mature area is particularly appropriate when dealing with unconventional resources (heavy oil, bitumen, stranded gas), while the application of new play concepts (such as lacustrine carbonates) to new areas (i.e. ultra-deep South Atlantic basins) epitomizes frontier exploration. Many low-matrix-porosity hydrocarbon reservoirs are productive because permeability is controlled by fractures and faults. Understanding basic fracture properties is critical in reducing geological risk and therefore reducing well costs and increasing well recovery. The advent of resource plays in carbonate rocks, and the long-standing recognition of naturally fractured carbonate reservoirs means that new fracture and fault analysis and prediction techniques and concepts are essential.

  9. Helping Librarians To Encourage Critical Thinking through Active Learning Techniques in Library Instruction.

    ERIC Educational Resources Information Center

    Swaine, Cynthia Wright

    Encouraging librarians to incorporate critical thinking skills and active learning techniques in their course instruction requires more than talking about it in a department meeting or distributing articles on the topic. At Old Dominion University (Virginia), librarians have tried conducting workshops, had readily-accessible binders of articles…

  10. Leadership and Management Education and Training (LMET) Course Requirements for Recruit Company Commanders and ’A’ School Instructors.

    DTIC Science & Technology

    1983-12-01

    integration of TAEG findings with contractor findings. Critical incident interview techniques, as used by the contractor, were specifically prohibited in order...than the critical incident interview technique were to be explored for use in the identification of leadership competencies. These competencies and

  11. Critical stresses for extension of filament-bridged matrix cracks in ceramic-matrix composites: An assessment with a model composite with tailored interfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Danchaivijit, S.; Shetty, D.K.; Eldridge, J.

    Matrix cracking was studied in a model unidirectional composite of SiC filaments in an epoxy-bonded alumina matrix. The residual clamping stress on the filaments due to the shrinkage of the epoxy was moderated with the addition of the alumina filler, and the filament surface was coated with a releasing agent to produce unbonded frictional interfaces. Uniaxial tension specimens with controlled through-cracks with bridging filaments were fabricated by a two-step casting technique. Critical stresses for extension of the filament-bridged cracks of various lengths were measured in uniaxial tension using a high-sensitivity extensometer. The measured crack-length dependence of the critical stress was in good agreement with the prediction of a stress-intensity analysis that employed a new force-displacement law for the bridging filaments. The analysis required independent experimental evaluation of the matrix fracture toughness, the interfacial sliding friction stress, and the residual tension in the matrix. The matrix-cracking stress for the test specimens without the deliberately introduced cracks was significantly higher than the steady-state cracking stress measured for the long, filament-bridged cracks.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marshall, William BJ J; Rearden, Bradley T

    The validation of neutron transport methods used in nuclear criticality safety analyses is required by consensus American National Standards Institute/American Nuclear Society (ANSI/ANS) standards. In the last decade, there has been an increased interest in correlations among critical experiments used in validation that have shared physical attributes and which impact the independence of each measurement. The statistical methods included in many of the frequently cited guidance documents on performing validation calculations incorporate the assumption that all individual measurements are independent, so little guidance is available to practitioners on the topic. Typical guidance includes recommendations to select experiments from multiple facilities and experiment series in an attempt to minimize the impact of correlations or common-cause errors in experiments. Recent efforts have been made both to determine the magnitude of such correlations between experiments and to develop and apply methods for adjusting the bias and bias uncertainty to account for the correlations. This paper describes recent work performed at Oak Ridge National Laboratory using the Sampler sequence from the SCALE code system to develop experimental correlations using a Monte Carlo sampling technique. Sampler will be available for the first time with the release of SCALE 6.2, and a brief introduction to the methods used to calculate experiment correlations within this new sequence is presented in this paper. Techniques to utilize these correlations in the establishment of upper subcritical limits are the subject of a companion paper and will not be discussed here. Example experimental uncertainties and correlation coefficients are presented for a variety of low-enriched uranium water-moderated lattice experiments selected for use in a benchmark exercise by the Working Party on Nuclear Criticality Safety Subgroup on Uncertainty Analysis in Criticality Safety Analyses. The results include studies on the effect of fuel rod pitch on the correlations, and some observations are also made regarding difficulties in determining experimental correlations using the Monte Carlo sampling technique.
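
    To make the notion of inter-experiment correlation concrete, here is a toy Monte Carlo sketch (not the SCALE/Sampler implementation): a perturbation shared by two notional critical experiments and an independent perturbation for each are sampled, and the Pearson correlation of the resulting keff values is estimated. The standard deviations are invented; with these values the expected correlation is shared^2 / (shared^2 + indep^2) = 0.8.

      import numpy as np

      rng = np.random.default_rng(1)

      def sampled_keff(n_samples, shared_sd=0.002, indep_sd=0.001):
          # A shared perturbation (e.g., a common enrichment uncertainty)
          # enters both experiments; independent perturbations (e.g.,
          # per-experiment pitch measurements) enter separately.
          shared = rng.normal(0.0, shared_sd, n_samples)
          k1 = 1.0 + shared + rng.normal(0.0, indep_sd, n_samples)
          k2 = 1.0 + shared + rng.normal(0.0, indep_sd, n_samples)
          return k1, k2

      k1, k2 = sampled_keff(10_000)
      print(np.corrcoef(k1, k2)[0, 1])   # ~ 0.8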

  13. Approach to the critically ill camelid.

    PubMed

    Bedenice, Daniela

    2009-07-01

    The estimation of fluid deficits in camelids is challenging. However, early recognition and treatment of shock and hypovolemia are instrumental in improving morbidity and mortality in critically ill camelids. Early goal-directed fluid therapy requires specific knowledge of clinical indicators of hypovolemia and assessment of resuscitation endpoints, but may significantly enhance the understanding, monitoring, and safety of intravenous fluid therapy in South American camelids (SAC). It is important to recognize that over-aggressive fluid resuscitation is just as detrimental as under-resuscitation. Nonetheless, a protocol of conservative fluid management is often indicated in the treatment of camelids with pulmonary inflammation, to counteract edema formation. The early recognition of lung dysfunction is often based on advanced diagnostic techniques, including arterial blood gas analysis, diagnostic imaging, and noninvasive pulmonary function testing.

  14. Genome-wide high-resolution aCGH analysis of gestational choriocarcinomas.

    PubMed

    Poaty, Henriette; Coullin, Philippe; Peko, Jean Félix; Dessen, Philippe; Diatta, Ange Lucien; Valent, Alexander; Leguern, Eric; Prévot, Sophie; Gombé-Mbalawa, Charles; Candelier, Jean-Jacques; Picard, Jean-Yves; Bernheim, Alain

    2012-01-01

    Eleven DNA samples from choriocarcinomas were studied by high-resolution 244K array-CGH. They were studied after histopathological confirmation of the diagnosis and of the androgenic etiology, and after microsatellite marker analysis confirming the absence of contamination of tumor DNA by maternal DNA. Three cell lines (BeWo, JAR, and JEG) were also studied by this high-resolution pangenomic technique. According to the aCGH analysis, the de novo choriocarcinomas exhibited simple chromosomal rearrangements or normal profiles. The cell lines showed various and complex chromosomal aberrations. Twenty-three minimal critical regions were defined, which allowed us to list the potentially implicated genes. Among them, unusually high numbers of microRNA clusters and imprinted genes were observed.

  15. Exploring Surface Analysis Techniques for the Detection of Molecular Contaminants on Spacecraft

    NASA Technical Reports Server (NTRS)

    Rutherford, Gugu N.; Seasly, Elaine; Thornblom, Mark; Baughman, James

    2016-01-01

    Molecular contamination is a known area of concern for spacecraft. To mitigate this risk, projects involving space flight hardware set requirements in a contamination control plan that establishes an allocation budget for the exposure of non-volatile residues (NVR) onto critical surfaces. The purpose of this work will focus on non-contact surface analysis and in situ monitoring to mitigate molecular contamination on space flight hardware. By using Scanning Electron Microscopy and Energy Dispersive Spectroscopy (SEM-EDS) with Raman Spectroscopy, an unlikely contaminant was identified on space flight hardware. Using traditional and surface analysis methods provided the broader view of the contamination sources allowing for best fit solutions to prevent any future exposure.

  16. Analyses of exobiological and potential resource materials in the Martian soil.

    PubMed

    Mancinelli, R L; Marshall, J R; White, M R

    1992-01-01

    Potential Martian soil components relevant to exobiology include water, organic matter, evaporites, clays, and oxides. These materials are also resources for human expeditions to Mars. When found in particular combinations, some of these materials constitute diagnostic paleobiomarker suites, allowing insight to be gained into the probability of life originating on Mars. Critically important to exobiology is the method of data analysis and data interpretation. To that end we are investigating methods of analysis of potential biomarker and paleobiomarker compounds and resource materials in soils and rocks pertinent to Martian geology. Differential thermal analysis coupled with gas chromatography is shown to be a highly useful analytical technique for detecting this wide and complex variety of materials.

  17. Analyses of exobiological and potential resource materials in the Martian soil

    NASA Technical Reports Server (NTRS)

    Mancinelli, Rocco L.; Marshall, John R.; White, Melisa R.

    1992-01-01

    Potential Martian soil components relevant to exobiology include water, organic matter, evaporites, clays, and oxides. These materials are also resources for human expeditions to Mars. When found in particular combinations, some of these materials constitute diagnostic paleobiomarker suites, allowing insight to be gained into the probability of life originating on Mars. Critically important to exobiology is the method of data analysis and data interpretation. To that end, methods of analysis of potential biomarker and paleobiomarker compounds and resource materials in soils and rocks pertinent to Martian geology are investigated. Differential thermal analysis coupled with gas chromatography is shown to be a highly useful analytical technique for detecting this wide and complex variety of materials.

  18. A Critical Study of Agglomerated Multigrid Methods for Diffusion

    NASA Technical Reports Server (NTRS)

    Nishikawa, Hiroaki; Diskin, Boris; Thomas, James L.

    2011-01-01

    Agglomerated multigrid techniques used in unstructured-grid methods are studied critically for a model problem representative of laminar diffusion in the incompressible limit. The studied target-grid discretizations and discretizations used on agglomerated grids are typical of current node-centered formulations. Agglomerated multigrid convergence rates are presented using a range of two- and three-dimensional randomly perturbed unstructured grids for simple geometries with isotropic and stretched grids. Two agglomeration techniques are used within an overall topology-preserving agglomeration framework. The results show that multigrid with an inconsistent coarse-grid scheme using only the edge terms (also referred to in the literature as a thin-layer formulation) provides considerable speedup over single-grid methods but its convergence deteriorates on finer grids. Multigrid with a Galerkin coarse-grid discretization using piecewise-constant prolongation and a heuristic correction factor is slower and also grid-dependent. In contrast, grid-independent convergence rates are demonstrated for multigrid with consistent coarse-grid discretizations. Convergence rates of multigrid cycles are verified with quantitative analysis methods in which parts of the two-grid cycle are replaced by their idealized counterparts.

  19. A Critical Study of Agglomerated Multigrid Methods for Diffusion

    NASA Technical Reports Server (NTRS)

    Thomas, James L.; Nishikawa, Hiroaki; Diskin, Boris

    2009-01-01

    Agglomerated multigrid techniques used in unstructured-grid methods are studied critically for a model problem representative of laminar diffusion in the incompressible limit. The studied target-grid discretizations and discretizations used on agglomerated grids are typical of current node-centered formulations. Agglomerated multigrid convergence rates are presented using a range of two- and three-dimensional randomly perturbed unstructured grids for simple geometries with isotropic and highly stretched grids. Two agglomeration techniques are used within an overall topology-preserving agglomeration framework. The results show that multigrid with an inconsistent coarse-grid scheme using only the edge terms (also referred to in the literature as a thin-layer formulation) provides considerable speedup over single-grid methods but its convergence deteriorates on finer grids. Multigrid with a Galerkin coarse-grid discretization using piecewise-constant prolongation and a heuristic correction factor is slower and also grid-dependent. In contrast, grid-independent convergence rates are demonstrated for multigrid with consistent coarse-grid discretizations. Actual cycle results are verified using quantitative analysis methods in which parts of the cycle are replaced by their idealized counterparts.
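
    For readers unfamiliar with the machinery under study, the sketch below is a geometric toy, not the agglomerated unstructured-grid solver of these papers: a two-grid correction scheme for the 1D Poisson problem with damped-Jacobi smoothing, full-weighting restriction, linear prolongation, and a consistent (rediscretized) coarse-grid operator, whose residual contracts by a roughly grid-independent factor per cycle.

      import numpy as np

      def residual(u, f, h):
          # r = f - A*u for -u'' on a uniform grid with zero Dirichlet ends.
          return f - (2*u - np.r_[0.0, u[:-1]] - np.r_[u[1:], 0.0]) / h**2

      def jacobi(u, f, h, sweeps=3, w=2/3):
          for _ in range(sweeps):                     # damped Jacobi smoothing
              u = u + w * (h**2 / 2) * residual(u, f, h)
          return u

      def two_grid(n=64, cycles=8):
          h = 1.0 / n
          f, u = np.ones(n - 1), np.zeros(n - 1)      # interior unknowns
          nc = n // 2
          # Consistent coarse operator: Poisson rediscretized on the 2h grid.
          Ac = (np.diag(2.0*np.ones(nc-1)) - np.diag(np.ones(nc-2), 1)
                - np.diag(np.ones(nc-2), -1)) / (2*h)**2
          for _ in range(cycles):
              u = jacobi(u, f, h)                     # pre-smoothing
              r = residual(u, f, h)
              rc = 0.25 * (r[0:-2:2] + 2*r[1:-1:2] + r[2::2])  # full weighting
              ec = np.linalg.solve(Ac, rc)            # exact coarse solve
              e = np.zeros(n - 1)
              e[1::2] = ec                            # coarse points
              e[0::2] = (np.r_[0.0, ec] + np.r_[ec, 0.0]) / 2  # interpolation
              u = jacobi(u + e, f, h)                 # post-smoothing
              print(np.linalg.norm(residual(u, f, h)))
          return u

      two_grid()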

  20. Arc-to-Arc mini-sling 1999: a critical analysis of concept and technology.

    PubMed

    Palma, Paulo

    2011-01-01

    The aim of this study was to critically review the Arc-to-Arc mini-sling (Palma's technique), a less invasive mid-urethral sling using bovine pericardium as the sling material. The Arc-to-Arc mini-sling, using bovine pericardium, was the subject of the first published report of a mini-sling, in 1999. The technique was identical to the "tension-free tape" operation: midline incision and dissection of the urethra. The ATFP (white line) was identified by blunt dissection, and the mini-sling was sutured to the tendinous arc on both sides with 2 polypropylene 00 sutures. The initial results were encouraging, with 9/10 patients cured at the 6-week post-operative visit. However, infection led to extrusion and removal of the mini-sling, with 5 patients remaining cured at 12 months. The Arc-to-Arc mini-sling was a good concept, but it failed because of the poor technology available at that time. Further research using new materials and better technology has led to new and safer alternatives for the management of stress urinary incontinence.

  1. Proceedings of the Sixth NASA Langley Formal Methods (LFM) Workshop

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Yvonne (Editor)

    2008-01-01

    Today's verification techniques are hard-pressed to scale with the ever-increasing complexity of safety critical systems. Within the field of aeronautics alone, we find the need for verification of algorithms for separation assurance, air traffic control, auto-pilot, Unmanned Aerial Vehicles (UAVs), adaptive avionics, automated decision authority, and much more. Recent advances in formal methods have made verifying more of these problems realistic. Thus we need to continually re-assess what we can solve now and identify the next barriers to overcome. Only through an exchange of ideas between theoreticians and practitioners from academia to industry can we extend formal methods for the verification of ever more challenging problem domains. This volume contains the extended abstracts of the talks presented at LFM 2008: The Sixth NASA Langley Formal Methods Workshop held on April 30 - May 2, 2008 in Newport News, Virginia, USA. The topics of interest that were listed in the call for abstracts were: advances in formal verification techniques; formal models of distributed computing; planning and scheduling; automated air traffic management; fault tolerance; hybrid systems/hybrid automata; embedded systems; safety critical applications; safety cases; accident/safety analysis.

  2. Discovering phases, phase transitions, and crossovers through unsupervised machine learning: A critical examination

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Wenjian; Singh, Rajiv R. P.; Scalettar, Richard T.

    Here, we apply unsupervised machine learning techniques, mainly principal component analysis (PCA), to compare and contrast the phase behavior and phase transitions in several classical spin models - the square and triangular-lattice Ising models, the Blume-Capel model, a highly degenerate biquadratic-exchange spin-one Ising (BSI) model, and the 2D XY model, and examine critically what machine learning is teaching us. We find that quantified principal components from PCA not only allow exploration of different phases and symmetry-breaking, but can distinguish phase transition types and locate critical points. We show that the corresponding weight vectors have a clear physical interpretation, which is particularly interesting in the frustrated models such as the triangular antiferromagnet, where they can point to incipient orders. Unlike the other well-studied models, the properties of the BSI model are less well known. Using both PCA and conventional Monte Carlo analysis, we demonstrate that the BSI model shows an absence of phase transition and macroscopic ground-state degeneracy. The failure to capture the 'charge' correlations (vorticity) in the BSI model (XY model) from raw spin configurations points to some of the limitations of PCA. Finally, we employ a nonlinear unsupervised machine learning procedure, the 'autoencoder method', and demonstrate that it too can be trained to capture phase transitions and critical points.

  3. Discovering phases, phase transitions, and crossovers through unsupervised machine learning: A critical examination

    DOE PAGES

    Hu, Wenjian; Singh, Rajiv R. P.; Scalettar, Richard T.

    2017-06-19

    Here, we apply unsupervised machine learning techniques, mainly principal component analysis (PCA), to compare and contrast the phase behavior and phase transitions in several classical spin models - the square and triangular-lattice Ising models, the Blume-Capel model, a highly degenerate biquadratic-exchange spin-one Ising (BSI) model, and the 2D XY model, and examine critically what machine learning is teaching us. We find that quantified principal components from PCA not only allow exploration of different phases and symmetry-breaking, but can distinguish phase transition types and locate critical points. We show that the corresponding weight vectors have a clear physical interpretation, which is particularly interesting in the frustrated models such as the triangular antiferromagnet, where they can point to incipient orders. Unlike the other well-studied models, the properties of the BSI model are less well known. Using both PCA and conventional Monte Carlo analysis, we demonstrate that the BSI model shows an absence of phase transition and macroscopic ground-state degeneracy. The failure to capture the 'charge' correlations (vorticity) in the BSI model (XY model) from raw spin configurations points to some of the limitations of PCA. Finally, we employ a nonlinear unsupervised machine learning procedure, the 'autoencoder method', and demonstrate that it too can be trained to capture phase transitions and critical points.

  4. Discovering phases, phase transitions, and crossovers through unsupervised machine learning: A critical examination

    NASA Astrophysics Data System (ADS)

    Hu, Wenjian; Singh, Rajiv R. P.; Scalettar, Richard T.

    2017-06-01

    We apply unsupervised machine learning techniques, mainly principal component analysis (PCA), to compare and contrast the phase behavior and phase transitions in several classical spin models—the square- and triangular-lattice Ising models, the Blume-Capel model, a highly degenerate biquadratic-exchange spin-1 Ising (BSI) model, and the two-dimensional X Y model—and we examine critically what machine learning is teaching us. We find that quantified principal components from PCA not only allow the exploration of different phases and symmetry-breaking, but they can distinguish phase-transition types and locate critical points. We show that the corresponding weight vectors have a clear physical interpretation, which is particularly interesting in the frustrated models such as the triangular antiferromagnet, where they can point to incipient orders. Unlike the other well-studied models, the properties of the BSI model are less well known. Using both PCA and conventional Monte Carlo analysis, we demonstrate that the BSI model shows an absence of phase transition and macroscopic ground-state degeneracy. The failure to capture the "charge" correlations (vorticity) in the BSI model (X Y model) from raw spin configurations points to some of the limitations of PCA. Finally, we employ a nonlinear unsupervised machine learning procedure, the "autoencoder method," and we demonstrate that it too can be trained to capture phase transitions and critical points.
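
    As a self-contained illustration of the PCA approach (a small-scale sketch, not the authors' code; lattice size, temperatures, and sweep counts are arbitrary), the script below samples 2D Ising configurations with a single-spin-flip Metropolis sweep across temperatures straddling T_c ~ 2.269 and projects the raw configurations onto the leading principal component, which tracks the magnetization order parameter: large in the ordered phase, collapsing near the critical point.

      import numpy as np

      rng = np.random.default_rng(0)

      def metropolis_ising(L, T, n_sweeps, n_keep):
          # Crude single-spin-flip Metropolis sampler for the 2D Ising model.
          s = np.ones((L, L), dtype=int)
          configs = []
          for sweep in range(n_sweeps):
              for _ in range(L * L):
                  i, j = rng.integers(L, size=2)
                  nb = (s[(i+1) % L, j] + s[(i-1) % L, j]
                        + s[i, (j+1) % L] + s[i, (j-1) % L])
                  dE = 2 * s[i, j] * nb
                  if dE <= 0 or rng.random() < np.exp(-dE / T):
                      s[i, j] *= -1
              if sweep >= n_sweeps - n_keep:
                  configs.append(s.flatten().copy())
          return configs

      L, temps = 16, np.linspace(1.5, 3.5, 9)
      data, labels = [], []
      for T in temps:
          for c in metropolis_ising(L, T, 60, 20):
              data.append(c)
              labels.append(T)
      X = np.array(data, dtype=float)
      X -= X.mean(axis=0)

      # PCA via SVD; PC1 aligns with the magnetization direction.
      U, S, Vt = np.linalg.svd(X, full_matrices=False)
      pc1 = X @ Vt[0]
      for T in temps:
          m = np.abs(pc1[np.array(labels) == T]).mean()
          print(f"T={T:.2f}  <|PC1|> = {m:8.1f}")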

  5. Whirl Motion of a Seal Test Rig with Squeeze-Film Dampers

    NASA Technical Reports Server (NTRS)

    Proctor, Margaret P.; Gunter, Edgar J.

    2007-01-01

    This paper presents the experimental behavior and dynamic analysis of a high-speed test rig with rolling element bearings mounted in squeeze-film oil damper bearings. The test rotor is a double overhung configuration with rolling element ball bearings mounted in uncentered squeeze-film oil dampers. The damper design is similar to that employed with various high-speed aircraft HP gas turbines. The dynamic performance of the test rig with the originally installed dampers, with an effective damper length of 0.23 inch, was unacceptable. The design speed of 40,000 RPM could not be safely achieved, as nonsynchronous whirling at the overhung seal test disk and high-amplitude critical speed response at the drive spline section occurred at 32,000 RPM. In addition to the self-excited stability and critical speed problems, it was later seen from FFT data analysis that a region of supersynchronous dead-band whirling occurs between 10,000 and 15,000 RPM, which can lead to bearing distress and wear. The system was analyzed using both linear and nonlinear techniques. The extended-length damper design resulting from the analysis eliminated the rotor subsynchronous whirling, the high-amplitude critical speed, and the dead-band whirling region, allowing the system to achieve a speed of 45,000 RPM. However, nonlinear analysis shows that damper lockup could occur with high rotor unbalance at 33,000 RPM, even with the extended squeeze-film dampers. The control of damper lockup will be addressed in a future paper.

  6. Producing children in the 21st century: a critical discourse analysis of the science and techniques of monitoring early child development.

    PubMed

    Einboden, Rochelle; Rudge, Trudy; Varcoe, Colleen

    2013-11-01

    The purpose of this article is to identify the implications of commonly held ideologies within theories of child development. Despite critiques of such assumptions, developmental theory assumes that children's bodies are unitary, natural and material. The recent explosion of neuroscience illustrates the significance of historical, social and cultural contexts to portrayals of brain development, offering the opportunity for a critical departure in thinking. Instead, this neuroscience research has been taken up in ways that align with biomedical traditions and neoliberal values. This article uses a critical discursive approach, supported by Haraway's ideas of technoscience, to analyse a population-based early child development research initiative. This initiative organises large-scale surveillance of children's development, operating from the premise that risks to development are best captured early to optimise children's potential. The analysis in this article shows an intermingling of health and economic discourses and clarifies how the child is a figure of significant contemporary social and political interest. In a poignant example of technobiopolitics, the collusion between health research, technologies and the state enrols health professionals in the production of children as subjects of social value, figured as human capital and investments in the future, or alternatively as waste. The analysis shows how practices that participate in what has become a developmental enterprise also participate in the marginalisation of the very children they intend to serve. Hence, there is a need to rethink practices critically and move towards innovative conceptualisations of child development that hold possibilities to resist these figurations.

  7. Application of multi-criteria decision-making to risk prioritisation in tidal energy developments

    NASA Astrophysics Data System (ADS)

    Kolios, Athanasios; Read, George; Ioannou, Anastasia

    2016-01-01

    This paper presents an analytical multi-criteria analysis for the prioritisation of risks in the development of tidal energy projects. After a basic identification of risks throughout the project and of relevant stakeholders in the UK, classified through a political, economic, social, technological, legal and environmental (PESTLE) analysis, questionnaires provided scores for each risk and corresponding weights for each of the different sectors. Employing an extended Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) as well as the weighted sum method on the data obtained, the identified risks are ranked by their criticality, drawing the industry's attention to mitigating those scoring highest. Both methods were modified to take averages at different stages of the analysis in order to observe the effects on the final risk ranking. A sensitivity analysis of the results was also carried out with regard to the weighting factors given to the perceived expertise of participants, with different results obtained depending on whether a linear, squared or square-root regression is used. Results of the study show that academia and industry have conflicting opinions regarding the perception of the most critical risks.
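
    To make the ranking mechanics concrete, here is a minimal TOPSIS sketch (the paper's extended variant with stage-wise averaging and the weighted sum method are not reproduced; the risk scores and weights below are invented).

      import numpy as np

      def topsis(scores, weights, benefit):
          # scores: (alternatives x criteria) decision matrix.
          norm = scores / np.linalg.norm(scores, axis=0)  # vector normalisation
          v = norm * weights
          ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
          anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
          d_best = np.linalg.norm(v - ideal, axis=1)
          d_worst = np.linalg.norm(v - anti, axis=1)
          return d_worst / (d_best + d_worst)   # closeness to the ideal

      # Notional matrix: rows = risks, columns = PESTLE-style criterion scores,
      # where a higher score means the risk rates as more severe there.
      risks = np.array([[7, 4, 6],
                        [3, 8, 5],
                        [9, 6, 2]], dtype=float)
      closeness = topsis(risks, weights=np.array([0.5, 0.3, 0.2]),
                         benefit=np.array([True, True, True]))
      print(np.argsort(closeness)[::-1])   # risk indices, most critical first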

  8. Implementation of Speed Variation in the Structural Dynamic Assessment of Turbomachinery Flow-Path Components

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; Davis, R. Benjamin; DeHaye, Michael

    2013-01-01

    During the design of turbomachinery flow path components, the assessment of possible structural resonant conditions is critical. Higher-frequency modes of these structures are frequently found to be subject to resonance, and in these cases, design criteria require a forced response analysis of the structure with the assumption that the excitation speed exactly equals the resonant frequency. The design becomes problematic if the response analysis shows a violation of the HCF criteria. One possible solution is to perform "finite-life" analysis, where Miner's rule is used to calculate the actual life in seconds in comparison to the required life. In this situation, it is beneficial to incorporate the fact that, for a variety of turbomachinery control reasons, the speed of the rotor does not actually dwell at a single value but instead dithers about a nominal mean speed; during the time that the excitation frequency is not equal to the resonant frequency, the damage accumulated by the structure is diminished significantly. Building on previous investigations into this process, we show that a steady-state assumption of the response is extremely accurate for this typical case, resulting in the ability to quickly account for speed variation in the finite-life analysis of a component whose peak dynamic stress at resonance has previously been calculated. A technique using Monte Carlo simulation is also presented, which can be used when specific speed time histories are not available. The implementation of these techniques can prove critical for successful turbopump design, as the improvement in life when speed variation is considered is shown to be greater than a factor of two.
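
    The following sketch illustrates the finite-life argument under stated assumptions: single-degree-of-freedom amplification about resonance, a Basquin-type S-N curve, and Gaussian speed dither. All numbers (resonant frequency, damping ratio, resonant stress, S-N constants, dither level) are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(2)

      f_n, zeta = 2000.0, 0.005      # resonant frequency (Hz), damping ratio
      sigma_res = 300.0              # alternating stress at resonance (MPa)
      b, C = 8.0, 1e30               # assumed Basquin S-N law: sigma^b * N = C

      def amp_ratio(f):
          # SDOF dynamic amplification, normalized to 1 at resonance.
          r = f / f_n
          h = 1.0 / np.sqrt((1 - r**2)**2 + (2 * zeta * r)**2)
          return h * (2 * zeta)

      def damage_rate(f):
          # Miner damage accumulated per second at excitation frequency f.
          sigma = sigma_res * amp_ratio(f)
          n_fail = C / sigma**b            # cycles to failure at this stress
          return f / n_fail                # (cycles/s) / (cycles to failure)

      dwell = damage_rate(f_n) * 3600                  # 1 h parked on resonance
      f_hist = f_n * (1 + rng.normal(0, 0.005, 3600))  # 1 Hz speed samples, 1 h
      dither = damage_rate(f_hist).sum()
      print(f"damage ratio (dither/dwell) = {dither / dwell:.2f}")  # below 1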

  9. Identification of complex metabolic states in critically injured patients using bioinformatic cluster analysis.

    PubMed

    Cohen, Mitchell J; Grossman, Adam D; Morabito, Diane; Knudson, M Margaret; Butte, Atul J; Manley, Geoffrey T

    2010-01-01

    Advances in technology have made extensive monitoring of patient physiology the standard of care in intensive care units (ICUs). While many systems exist to compile these data, there has been no systematic multivariate analysis and categorization across patient physiological data. The sheer volume and complexity of these data make pattern recognition or identification of patient state difficult. Hierarchical cluster analysis allows visualization of high dimensional data and enables pattern recognition and identification of physiologic patient states. We hypothesized that processing of multivariate data using hierarchical clustering techniques would allow identification of otherwise hidden patient physiologic patterns that would be predictive of outcome. Multivariate physiologic and ventilator data were collected continuously using a multimodal bioinformatics system in the surgical ICU at San Francisco General Hospital. These data were incorporated with non-continuous data and stored on a server in the ICU. A hierarchical clustering algorithm grouped each minute of data into 1 of 10 clusters. Clusters were correlated with outcome measures including incidence of infection, multiple organ failure (MOF), and mortality. We identified 10 clusters, which we defined as distinct patient states. While patients transitioned between states, they spent significant amounts of time in each. Clusters were enriched for our outcome measures: 2 of the 10 states were enriched for infection, 6 of 10 were enriched for MOF, and 3 of 10 were enriched for death. Further analysis of correlations between pairs of variables within each cluster reveals significant differences in physiology between clusters. Here we show for the first time the feasibility of clustering physiological measurements to identify clinically relevant patient states after trauma. These results demonstrate that hierarchical clustering techniques can be useful for visualizing complex multivariate data and may provide new insights for the care of critically injured patients.
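
    A minimal sketch of the clustering step, assuming synthetic vitals in place of the actual ICU data stream: minute-by-minute multivariate records are standardized, and a Ward-linkage tree is cut into up to 10 candidate states, mirroring the 10 clusters reported. The state labels can then be cross-tabulated with outcomes to test for enrichment.

      import numpy as np
      from scipy.cluster.hierarchy import fcluster, linkage

      rng = np.random.default_rng(3)

      # Notional minutes of (heart rate, MAP, SpO2, respiratory rate) drawn
      # from two synthetic regimes, standing in for distinct patient states.
      stable = rng.normal([80, 85, 98, 14], [5, 5, 1, 2], size=(300, 4))
      shock = rng.normal([125, 55, 91, 26], [8, 6, 2, 4], size=(300, 4))
      X = np.vstack([stable, shock])

      # Standardize so no single vital dominates the distance metric.
      Z = (X - X.mean(axis=0)) / X.std(axis=0)

      tree = linkage(Z, method="ward")
      states = fcluster(tree, t=10, criterion="maxclust")  # up to 10 states
      print(np.bincount(states)[1:])   # minutes assigned to each state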

  10. Method for the Collection, Gravimetric and Chemical Analysis of Nonvolatile Residue (NVR) on Surfaces

    NASA Technical Reports Server (NTRS)

    Gordon, Keith; Rutherford, Gugu; Aranda, Denisse

    2017-01-01

    Nonvolatile residue (NVR), sometimes referred to as molecular contamination, is the term used for the total composition of the inorganic and high-boiling-point organic components in particulates and molecular films deposited on critical surfaces surrounding space structures, with the particulate and NVR contamination originating primarily from pre-launch operations. The "nonvolatile" suggestion in the terminology implies that the collected residue will not experience much loss under ambient conditions. NVR has been shown to have a dramatic impact on the ability to perform optical measurements from platforms based in space. Such contaminants can be detected early by the controlled application of various detection techniques and contamination analyses. Contamination analyses are the techniques used to determine whether materials, components, and subsystems can be expected to meet the performance requirements of a system. Of particular concern is the quantity of NVR contaminants that might be deposited on critical payload surfaces from these sources. Subsequent chemical analysis of the contaminant samples by infrared spectroscopy and gas chromatography mass spectrometry identifies the components, gives semi-quantitative estimates of contaminant thickness, indicates possible sources of the NVR, and provides guidance for effective cleanup procedures. In this report, a method for the collection and determination of the mass of NVR was generated by the authors at NASA Langley Research Center. This report describes the method developed and implemented for collecting NVR contaminants, and procedures for gravimetric and chemical analysis of the residue obtained. The result of this NVR analysis collaboration will help pave the way for Langley's ability to certify flight hardware outgassing requirements in support of flight projects such as Stratospheric Aerosol and Gas Experiment III (SAGE III), Clouds and the Earth's Radiant Energy System (CERES), Materials International Space Station Experiment - X (MISSE-X), and Doppler Aerosol Wind Lidar (DAWN).
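
    As a minimal illustration of the gravimetric step only (the collection procedure and the IR/GC-MS chemical analysis are not reproduced; the masses and area below are invented), the residue mass difference is normalized by the sampled surface area.

      def nvr_areal_density(m_tare_mg, m_final_mg, area_cm2):
          # Residue mass per unit sampled area, reported in ug/cm^2.
          return (m_final_mg - m_tare_mg) * 1000.0 / area_cm2

      # Notional example: 0.05 mg of residue recovered from a 1000 cm^2 panel.
      level = nvr_areal_density(152.31, 152.36, 1000.0)
      print(f"NVR = {level:.2f} ug/cm^2")   # 0.05 ug/cm^2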

  11. Improved helicopter aeromechanical stability analysis using segmented constrained layer damping and hybrid optimization

    NASA Astrophysics Data System (ADS)

    Liu, Qiang; Chattopadhyay, Aditi

    2000-06-01

    Aeromechanical stability plays a critical role in helicopter design, and lead-lag damping is crucial to this design. In this paper, the use of segmented constrained damping layer (SCL) treatment and composite tailoring is investigated for improved rotor aeromechanical stability using a formal optimization technique. The principal load-carrying member in the rotor blade is represented by a composite box beam, of arbitrary thickness, with surface-bonded SCLs. A comprehensive theory is used to model the smart box beam. A ground resonance analysis model and an air resonance analysis model are implemented in the rotor blade built around the composite box beam with SCLs. The Pitt-Peters dynamic inflow model is used in air resonance analysis under hover conditions. A hybrid optimization technique is used to investigate the optimum design of the composite box beam with surface-bonded SCLs for improved damping characteristics. Parameters such as the stacking sequence of the composite laminates and the placement of SCLs are used as design variables. Detailed numerical studies are presented for aeromechanical stability analysis. It is shown that the optimum blade design yields a significant increase in rotor lead-lag regressive modal damping compared to the initial system.

  12. A Numerical Study on the Edgewise Compression Strength of Sandwich Structures with Facesheet-Core Disbonds

    NASA Technical Reports Server (NTRS)

    Bergan, Andrew C.

    2017-01-01

    Damage tolerant design approaches require determination of critical damage modes and flaw sizes in order to establish nondestructive evaluation detection requirements. A finite element model is developed to assess the effect of circular facesheet-core disbonds on the strength of sandwich specimens subjected to edgewise compressive loads for the purpose of predicting the critical flaw size for a variety of design parameters. Postbuckling analyses are conducted in which an initial imperfection is seeded using results from a linear buckling analysis. Both the virtual crack closure technique (VCCT) and cohesive elements are considered for modeling disbond growth. Predictions from analyses using the VCCT and analyses using cohesive elements are in good correlation. A series of parametric analyses are conducted to investigate the effect of core thickness and material, facesheet layup, facesheet-core interface properties, and curvature on the criticality of facesheet-core disbonds of various sizes. The results from these analyses provide a basis for determining the critical flaw size for facesheet-core disbonds subjected to edgewise compression loads and, therefore, nondestructive evaluation flaw detection requirements for this configuration.
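
    For orientation, a one-node VCCT sketch of the mode-I energy release rate, G_I = F * dw / (2 * da * b); the nodal force, opening displacement, mesh dimensions, and interface toughness are invented, and the paper's model additionally handles mode mixity and cohesive elements, which this omits.

      def vcct_mode_I(F_z, dw, da, width):
          # One-step VCCT: force at the crack-tip node times the relative
          # opening one element behind the tip, over twice the new crack area.
          return F_z * dw / (2.0 * da * width)   # G_I in J/m^2

      # Notional numbers for a facesheet-core interface node.
      G_I = vcct_mode_I(F_z=12.0, dw=2.0e-6, da=0.5e-3, width=1.0e-3)
      G_Ic = 300.0   # assumed interface toughness, J/m^2
      print(f"G_I = {G_I:.1f} J/m^2 ->", "grows" if G_I >= G_Ic else "arrests")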

  13. A Critical Review of Heat Transfer Enhancement Techniques for Use in Marine Condensers.

    DTIC Science & Technology

    1982-09-01

    horizontal tube, the Nusselt theory predicts that the condensate film is thinnest at the top of the tube, and thickens around the tube until at the...transfer coefficient. As pointed out above, the Nusselt analysis assumes that the condensate film drains from a horizontal tube in a continuous sheet...the condensate falling on the lower tubes does not deteriorate the thermal performance of these tubes because the helically-wrapped wires draw the

  14. Results of a Prospective Echocardiography Trial in International Space Station Crew

    NASA Technical Reports Server (NTRS)

    Hamilton, Douglas R.; Sargsyan, Ashot E.; Martin, David; Garcia, Kathleen M.; Melton, Shannon; Feiverson, Alan; Dulchavsky, Scott A.

    2009-01-01

    In the framework of an operationally oriented investigation, we conducted a prospective trial of a standard clinical echocardiography protocol in a cohort of long-duration crewmembers. The resulting primary and processed data appear to have no precedents. Our tele-echocardiography paradigm, including just-in-time e-training methods, was also assessed. A critical review of the imaging technique, equipment and setting limitations, and quality assurance is provided, as well as the analysis of "space normal" data.

  15. A possible glycosidic benzophenone with full substitution on B-ring from Psidium guajava leaves.

    PubMed

    Venditti, Alessandro; Ukwueze, Stanley E

    2017-04-01

    Two-dimensional NMR analysis can be a useful tool for resolving the structure of chemical compounds, even in mixtures. This letter demonstrates how these techniques can be applied, for example, to the reported identification of a benzophenone glycoside from Psidium guajava. A tentative structure for the secondary component, not yet described, is proposed on the basis of observation and critical review of the available 1D and 2D NMR spectra.

  16. The Joint NASA/Goddard-University of Maryland Research Program in Charged Particle and High Energy Photon Detector Technology

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Having recognized at an early stage the critical importance of maintaining detector capabilities that utilize state-of-the-art techniques, NASA/Goddard and the University of Maryland formulated a joint program. This program has involved coordination of a broad range of efforts and activities, including joint experiments, collaboration in theoretical studies, instrument design, calibrations, and data analysis. Summaries of the progress made to date are presented. A representative bibliography is also included.

  17. Protein purification and analysis: next generation Western blotting techniques.

    PubMed

    Mishra, Manish; Tiwari, Shuchita; Gomes, Aldrin V

    2017-11-01

    Western blotting is one of the most commonly used techniques in molecular biology and proteomics. Since western blotting is a multistep protocol, variations and errors can occur at any step, reducing the reliability and reproducibility of this technique. Recent reports suggest that a few key steps, such as the sample preparation method, the amount and source of primary antibody used, as well as the normalization method utilized, are critical for reproducible western blot results. Areas covered: In this review, improvements in different areas of western blotting, including protein transfer and antibody validation, are summarized. The review discusses the most advanced western blotting techniques available and highlights the relationship between next generation western blotting techniques and their clinical relevance. Expert commentary: Over the last decade, significant improvements have been made in creating more sensitive, automated, and advanced techniques by optimizing various aspects of the western blot protocol. New methods such as single cell-resolution western blot, capillary electrophoresis, DigiWest, automated microfluid western blotting and microchip electrophoresis have all been developed to reduce potential problems associated with the western blotting technique. Innovative developments in instrumentation and increased sensitivity for western blots offer novel possibilities for increasing the clinical implications of western blot.

  18. Digital I&C system upgrade integration technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, H. W.; Shih, C.; Wang, J. R.

    2012-07-01

    This work developed an integration technique for digital I&C system upgrades; with this method, a utility can replace its I&C systems step by step in a systematic way. The Institute of Nuclear Energy Research (INER) developed a digital Instrumentation and Control (I&C) replacement integration technique based on the requirements of the three existing nuclear power plants (NPPs) in Taiwan, which are the Chin-Shan (CS) NPP, Kuo-Sheng (KS) NPP, and Maanshan (MS) NPP, and also developed the related Critical Digital Review (CDR) procedure. The digital I&C replacement integration technique includes: (1) establishment of a nuclear power plant digital replacement integration guideline, (2) preliminary investigation of I&C system digitalization, (3) evaluation of I&C system digitalization, and (4) establishment of I&C system digitalization architectures. These works can serve as a reference for performing I&C system digital replacement integration at the three existing NPPs of the Taiwan Power Company (TPC). A CDR is the review for a critical-system digital I&C replacement. The major reference for this procedure is EPRI TR-1011710 (2005), 'Handbook for Evaluating Critical Digital Equipment and Systems', published by the Electric Power Research Institute (EPRI). With this document, INER developed a TPC-specific CDR procedure. Currently, the CDR is one of the policies for digital I&C replacement at TPC. The contents of this CDR procedure include: scope, responsibility, operation procedure, operation flow chart, and CDR review items. The CDR review items include comparison of the design change, Software Verification and Validation (SV&V), Failure Mode and Effects Analysis (FMEA), evaluation of diversity and defense-in-depth (D3), evaluation of the watchdog timer, evaluation of electromagnetic compatibility (EMC), evaluation of grounding for systems/components, seismic evaluation, witness and inspection, and lessons learnt from digital I&C failure events. A solid review can assure the quality of the digital I&C system replacement. (authors)

  19. Critical Thinking: The Importance of Teaching.

    ERIC Educational Resources Information Center

    Strickland, Glen

    Modern society is full of examples of people's inability to employ techniques of critical thinking in everyday situations. Learning to think critically is important because within this complex society, individuals are constantly placed into situations where difficult choices must be made. An ability to analyze critically available alternatives…

  20. Using cognitive task analysis to identify critical decisions in the laparoscopic environment.

    PubMed

    Craig, Curtis; Klein, Martina I; Griswold, John; Gaitonde, Krishnanath; McGill, Thomas; Halldorsson, Ari

    2012-12-01

    The aim of this study was to identify the critical decisions surgeons need to make regarding laparoscopic surgery, the information these decisions are based on, the strategies employed by surgeons to reach their objectives, and the difficulties experienced by novices. Laparoscopic training focuses on the development of technical skills. However, successful surgical outcomes are also dependent on appropriate decisions made during surgery, which are influenced by critical cues and the use of appropriate strategies. Novices might not be as adept at cue detection and strategy use. Participants were eight attending surgeons. The authors employed task-analytic techniques to identify critical decisions inherent in laparoscopy and the cues, strategies, and novice traps associated with these decisions. The authors used decision requirements tables to organize the data into the key decisions made during the preoperative, operative, and postoperative phases as well as the cues, strategies, and novice traps associated with these decisions. Key decisions identified for the preoperative phase included but were not limited to the decision of performing a laparoscopic versus open surgery, necessity to review the literature, practicing the procedure, and trocar placement. Some key decisions identified for the operative phase included converting to open surgery, performing angiograms, cutting tissue or organs, and reevaluation of the approach. Only one key decision was identified for the postoperative phase: whether the surgeon's technique needs to be evaluated and revised. The laparoscopic environment requires complex decision making, and novices are prone to errors in their decisions. The information elicited in this study is applicable to laparoscopic training.

  1. Teacher Perceptions of High School Student Failure in the Classroom: Identifying Preventive Practices of Failure Using Critical Incident Technique

    ERIC Educational Resources Information Center

    Kalahar, Kory G.

    2011-01-01

    Student failure is a prominent issue in many comprehensive secondary schools nationwide. Researchers studying error, reliability, and performance in organizations have developed and employed a method known as critical incident technique (CIT) for investigating failure. Adopting an action research model, this study involved gathering and analyzing…

  2. The Application of Critical Incident Procedures for an Initial Audit of Organizational Communication.

    ERIC Educational Resources Information Center

    Rutherford, R. Stanley

    This paper discusses the concept of the critical incidents technique, traces its early development in the training of airplane pilots during World War II, sketches the requirements of the typical steps, notes the few studies in communication using this technique, provides an evaluation, and briefly describes a study concerning department chairmen.…

  3. What Helps and Hinders Indigenous Student Success in Higher Education Health Programmes: A Qualitative Study Using the Critical Incident Technique

    ERIC Educational Resources Information Center

    Curtis, Elana; Wikaire, Erena; Kool, Bridget; Honey, Michelle; Kelly, Fiona; Poole, Phillippa; Barrow, Mark; Airini; Ewen, Shaun; Reid, Papaarangi

    2015-01-01

    Tertiary institutions aim to provide high quality teaching and learning that meet the academic needs for an increasingly diverse student body including indigenous students. "Tatou Tatou" is a qualitative research project utilising Kaupapa "Maori" research methodology and the Critical Incident Technique interview method to…

  4. The Role of a Physical Analysis Laboratory in a 300 mm IC Development and Manufacturing Centre

    NASA Astrophysics Data System (ADS)

    Kwakman, L. F. Tz.; Bicais-Lepinay, N.; Courtas, S.; Delille, D.; Juhel, M.; Trouiller, C.; Wyon, C.; de la Bardonnie, M.; Lorut, F.; Ross, R.

    2005-09-01

    To remain competitive, IC manufacturers have to accelerate the development of the most advanced (CMOS) technologies and deliver high-yielding products with the best cycle times at competitive pricing. With the increase in technology complexity, the need for physical characterization support also increases; however, many of the existing techniques are no longer adequate to effectively support 65-45 nm technology node developments. New and improved techniques are definitely needed to better characterize the often marginal processes, but these should not significantly impact fabrication costs or cycle time. Hence, characterization and metrology challenges in state-of-the-art IC manufacturing are both technical and economical in nature. TEM microscopy is needed for high-quality, high-volume analytical support, but several physical and practical hurdles have to be overcome. The success rate of FIB-SEM based failure analysis drops as defects often are too small to be detected and fault isolation becomes more difficult in nano-scale device structures. To remain effective and efficient, SEM and OBIRCH techniques have to be improved or complemented with other more effective methods. Chemical analysis of novel materials and critical interfaces requires improvements in the field of, e.g., SIMS and ToF-SIMS. Techniques that previously were used only sporadically, like EBSD and XRD, have become a 'must' to properly support backend process development. On the bright side, thanks to major technical advances, techniques that previously were practiced only at laboratory level can now be used effectively for at-line fab metrology: Voltage Contrast based defectivity control, XPS-based gate dielectric metrology, and XRD-based control of copper metallization processes are practical examples. In this paper, capabilities and shortcomings of several techniques and corresponding equipment are presented, with practical illustrations of use in our Crolles facilities.

  5. Investigation of historical metal objects using Laser Induced Breakdown Spectroscopy (LIBS) technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abdel-Kareem, O.; Ghoneim, M.; Harith, M. A.

    2011-09-22

    Analysis of metal objects is a necessary step in establishing an appropriate conservation treatment for an object or in following up the results of applying the suggested treatments. The main considerations in selecting a method for the investigation and analysis of metal objects are the diagnostic power, representative sampling, reproducibility, destructive nature/invasiveness of the analysis, and accessibility of the appropriate instrument. This study aims at evaluating the usefulness of the Laser Induced Breakdown Spectroscopy (LIBS) technique for the analysis of historical metal objects. In this study, various historical metal objects collected from different museums and excavations in Egypt were investigated using the LIBS technique. For evaluating the usefulness of the suggested analytical protocol, the same metal objects were investigated by other methods, such as Scanning Electron Microscopy with energy-dispersive X-ray analysis (SEM-EDX) and X-ray Diffraction (XRD). This study confirms that the LIBS technique is very useful and can be applied safely for investigating historical metal objects. LIBS analysis can quickly provide information on the qualitative and semi-quantitative elemental content of different metal objects and support their characterization and classification. It is a practically non-destructive technique with the critical advantage of being applicable in situ, thereby avoiding sampling and sample preparation. It can be a dependable, satisfactory, and effective method for the low-cost study of archaeological and historical metals. However, it must be taken into consideration that the corrosion of metal leads to material alteration and the possible loss of certain metals in the form of soluble salts. Certain corrosion products are known to leach out of the object and, therefore, their low content does not necessarily reflect the composition of the metal at the time of the object's manufacture. Another point to take into consideration is the heterogeneity of a metal alloy object, which often results from poor mixing of the alloy constituents. There is a need for further research to investigate and determine the most appropriate and effective approaches and methods for the conservation of these metal objects.

  6. Assessment of different sample preparation routes for mass spectrometric monitoring and imaging of lipids in bone cells via ToF-SIMS

    PubMed Central

    Schaepe, Kaija; Kokesch-Himmelreich, Julia; Rohnke, Marcus; Wagner, Alena-Svenja; Schaaf, Thimo; Wenisch, Sabine; Janek, Jürgen

    2015-01-01

    In ToF-SIMS analysis, the experimental outcome from cell experiments is to a great extent influenced by the sample preparation routine. In order to better judge this critical influence in the case of lipid analysis, a detailed comparison of different sample preparation routines is performed—aiming at an optimized preparation routine for systematic lipid imaging of cell cultures. For this purpose, human mesenchymal stem cells were analyzed: (a) as chemically fixed, (b) freeze-dried, and (c) frozen-hydrated. For chemical fixation, different fixatives, i.e., glutaraldehyde, paraformaldehyde, and a mixture of both, were tested with different postfixative handling procedures like storage in phosphate buffered saline, water or critical point drying. Furthermore, secondary lipid fixation via osmium tetroxide was taken into account and the effect of an ascending alcohol series with and without this secondary lipid fixation was evaluated. Concerning freeze-drying, three different postprocessing possibilities were examined. One can be considered as a pure cryofixation technique while the other two routes were based on chemical fixation. Cryofixation methods known from literature, i.e., freeze-fracturing and simple frozen-hydrated preparation, were also evaluated to complete the comparison of sample preparation techniques. Subsequent data evaluation of SIMS spectra in both positive and negative ion modes was performed via principal component analysis by use of peak sets representative for lipids. For freeze-fracturing, these experiments revealed poor reproducibility, making this preparation route unsuitable for systematic investigations and statistical data evaluation. Freeze-drying after cryofixation showed improved reproducibility and well-preserved lipid contents while the other freeze-drying procedures showed drawbacks in one of these criteria. In comparison, chemical fixation techniques via glutar- and/or paraformaldehyde proved most suitable in terms of reproducibility and preserved lipid contents, while alcohol and osmium treatment led to the extraction of lipids and are therefore not recommended. PMID:25791294

  7. Assessment of different sample preparation routes for mass spectrometric monitoring and imaging of lipids in bone cells via ToF-SIMS.

    PubMed

    Schaepe, Kaija; Kokesch-Himmelreich, Julia; Rohnke, Marcus; Wagner, Alena-Svenja; Schaaf, Thimo; Wenisch, Sabine; Janek, Jürgen

    2015-03-19

    In ToF-SIMS analysis, the experimental outcome from cell experiments is to a great extent influenced by the sample preparation routine. In order to better judge this critical influence in the case of lipid analysis, a detailed comparison of different sample preparation routines is performed, aiming at an optimized preparation routine for systematic lipid imaging of cell cultures. For this purpose, human mesenchymal stem cells were analyzed: (a) as chemically fixed, (b) freeze-dried, and (c) frozen-hydrated. For chemical fixation, different fixatives, i.e., glutaraldehyde, paraformaldehyde, and a mixture of both, were tested with different postfixative handling procedures such as storage in phosphate buffered saline, water, or critical point drying. Furthermore, secondary lipid fixation via osmium tetroxide was taken into account, and the effect of an ascending alcohol series with and without this secondary lipid fixation was evaluated. Concerning freeze-drying, three different postprocessing possibilities were examined. One can be considered a pure cryofixation technique, while the other two routes were based on chemical fixation. Cryofixation methods known from the literature, i.e., freeze-fracturing and simple frozen-hydrated preparation, were also evaluated to complete the comparison of sample preparation techniques. Subsequent data evaluation of SIMS spectra in both positive and negative ion modes was performed via principal component analysis using peak sets representative of lipids. For freeze-fracturing, these experiments revealed poor reproducibility, making this preparation route unsuitable for systematic investigations and statistical data evaluation. Freeze-drying after cryofixation showed improved reproducibility and well-preserved lipid contents, while the other freeze-drying procedures showed drawbacks in one of these criteria. In comparison, chemical fixation techniques via glutar- and/or paraformaldehyde proved most suitable in terms of reproducibility and preserved lipid contents, whereas alcohol and osmium treatment led to the extraction of lipids and is therefore not recommended.
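
    A minimal sketch of the analysis step named above, assuming synthetic stand-in data rather than the authors' spectra: principal component analysis of lipid peak intensities, with the per-route scatter of replicate scores used as a crude reproducibility check. The route names and noise levels are illustrative assumptions.

        # PCA over hypothetical ToF-SIMS lipid peak sets (sklearn).
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        routes = {"chemfix": 0.1, "freeze_dry": 0.3, "freeze_fracture": 1.0}
        X, labels = [], []
        for name, noise in routes.items():            # 10 replicates x 50 peaks each
            base = rng.lognormal(mean=2.0, sigma=0.5, size=50)
            X.append(base + noise * rng.normal(size=(10, 50)) * base)
            labels += [name] * 10
        X = np.vstack(X)

        scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
        for name in routes:                           # tight clusters = reproducible route
            idx = [i for i, l in enumerate(labels) if l == name]
            print(name, "PC-score spread:", scores[idx].std(axis=0).round(2))

    Routes whose replicates scatter widely in PC space (as reported for freeze-fracturing) would be flagged as unsuitable for systematic, statistics-based comparisons.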

  8. Mode Decomposition Methods for Soil Moisture Prediction

    NASA Astrophysics Data System (ADS)

    Jana, R. B.; Efendiev, Y. R.; Mohanty, B.

    2014-12-01

    Lack of reliable, well-distributed, long-term datasets for model validation is a bottleneck for most exercises in soil moisture analysis and prediction. Understanding what factors drive soil hydrological processes at different scales, and their variability, is critical to furthering our ability to model the various components of the hydrologic cycle accurately. For this, a comprehensive dataset with measurements across scales is necessary. Intensive fine-resolution sampling of soil moisture over extended periods of time is financially and logistically prohibitive. Installation of a few long-term monitoring stations is also expensive, and the stations need to be situated at critical locations. The concept of time-stable locations has been in use for some time now to find locations that reflect the mean soil moisture values across the watershed under all wetness conditions. However, the soil moisture variability across the watershed is lost when measuring at only time-stable locations. We present here a study using techniques such as Dynamic Mode Decomposition (DMD) and the Discrete Empirical Interpolation Method (DEIM) that extends the concept of time-stable locations to arrive at locations that provide not simply the average soil moisture values for the watershed, but also help re-capture the dynamics across all locations in the watershed. As with time stability, the initial analysis depends on an intensive sampling history. The DMD/DEIM method is an application of model reduction techniques for non-linearly related measurements. Using this technique, we are able to determine the number of sampling points required for a given prediction accuracy across the watershed, and the locations of those points. Locations with higher energetics in the basis domain are chosen first. We present case studies across watersheds in the US and India. The technique can easily be applied to other hydro-climates.
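
    A minimal exact-DMD sketch (a standard formulation, assumed here; not the authors' code) on toy snapshot data: the snapshot matrix is factored via a truncated SVD, a reduced linear operator is diagonalized, and locations with large mode amplitudes are read off as candidate monitoring sites in the spirit of the DMD/DEIM selection described above.

        import numpy as np

        def exact_dmd(snapshots, r):
            """snapshots: (n_locations, n_times); r: truncation rank."""
            X, Y = snapshots[:, :-1], snapshots[:, 1:]
            U, s, Vh = np.linalg.svd(X, full_matrices=False)
            Ur, Sinv, Vr = U[:, :r], np.diag(1.0 / s[:r]), Vh[:r].conj().T
            A_tilde = Ur.conj().T @ Y @ Vr @ Sinv      # reduced linear operator
            eigvals, W = np.linalg.eig(A_tilde)
            modes = Y @ Vr @ Sinv @ W                  # exact DMD modes
            return eigvals, modes

        rng = np.random.default_rng(1)
        t = np.linspace(0, 10, 50)
        space = rng.normal(size=(200, 2))              # 200 toy "locations"
        data = space @ np.vstack([np.sin(t), np.cos(2 * t)])
        data += 0.01 * rng.normal(size=data.shape)     # measurement noise

        eigvals, modes = exact_dmd(data, r=2)
        energetic = np.argsort(np.abs(modes).sum(axis=1))[::-1][:5]
        print("DMD eigenvalues:", np.round(eigvals, 3))
        print("most energetic locations:", energetic)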

  9. Model-Based Safety Analysis

    NASA Technical Reports Server (NTRS)

    Joshi, Anjali; Heimdahl, Mats P. E.; Miller, Steven P.; Whalen, Mike W.

    2006-01-01

    System safety analysis techniques are well established and are used extensively during the design of safety-critical systems. Despite this, most of the techniques are highly subjective and dependent on the skill of the practitioner. Since these analyses are usually based on an informal system model, it is unlikely that they will be complete, consistent, and error free. In fact, the lack of precise models of the system architecture and its failure modes often forces the safety analysts to devote much of their effort to gathering architectural details about the system behavior from several sources and embedding this information in the safety artifacts such as the fault trees. This report describes Model-Based Safety Analysis, an approach in which the system and safety engineers share a common system model created using a model-based development process. By extending the system model with a fault model as well as relevant portions of the physical system to be controlled, automated support can be provided for much of the safety analysis. We believe that by using a common model for both system and safety engineering and automating parts of the safety analysis, we can both reduce the cost and improve the quality of the safety analysis. Here we present our vision of model-based safety analysis and discuss the advantages and challenges in making this approach practical.

  10. Switch of Sensitivity Dynamics Revealed with DyGloSA Toolbox for Dynamical Global Sensitivity Analysis as an Early Warning for System's Critical Transition

    PubMed Central

    Baumuratova, Tatiana; Dobre, Simona; Bastogne, Thierry; Sauter, Thomas

    2013-01-01

    Systems with bifurcations may experience abrupt, irreversible and often unwanted shifts in their performance, called critical transitions. For many systems, such as climate, the economy and ecosystems, it is highly desirable to identify indicators serving as early warnings of such regime shifts. Several statistical measures were recently proposed as early warnings of critical transitions, including increased variance, autocorrelation and skewness of experimental or model-generated data. The lack of an automated tool for model-based prediction of critical transitions led to the design of DyGloSA, a MATLAB toolbox for dynamical global parameter sensitivity analysis (GPSA) of ordinary differential equation models. We suggest that the switch in the dynamics of parameter sensitivities revealed by our toolbox is an early warning that a system is approaching a critical transition. We illustrate the efficiency of our toolbox by analyzing several models with bifurcations and predicting the time periods when systems can still avoid going to a critical transition by manipulating certain parameter values, which is not detectable with the existing SA techniques. DyGloSA is based on the SBToolbox2 and contains functions that dynamically compute the global sensitivity indices of the system by applying four main GPSA methods: eFAST, Sobol's ANOVA, PRCC and WALS. It includes parallelized versions of the functions, enabling a significant reduction in computational time (up to 12-fold). DyGloSA is freely available as a set of MATLAB scripts at http://bio.uni.lu/systems_biology/software/dyglosa. It requires installation of MATLAB (version R2008b or later) and the Systems Biology Toolbox2 available at www.sbtoolbox2.org. DyGloSA can be run on Windows and Linux systems, both 32- and 64-bit. PMID:24367574

  11. Switch of sensitivity dynamics revealed with DyGloSA toolbox for dynamical global sensitivity analysis as an early warning for system's critical transition.

    PubMed

    Baumuratova, Tatiana; Dobre, Simona; Bastogne, Thierry; Sauter, Thomas

    2013-01-01

    Systems with bifurcations may experience abrupt, irreversible and often unwanted shifts in their performance, called critical transitions. For many systems, such as climate, the economy and ecosystems, it is highly desirable to identify indicators serving as early warnings of such regime shifts. Several statistical measures were recently proposed as early warnings of critical transitions, including increased variance, autocorrelation and skewness of experimental or model-generated data. The lack of an automated tool for model-based prediction of critical transitions led to the design of DyGloSA, a MATLAB toolbox for dynamical global parameter sensitivity analysis (GPSA) of ordinary differential equation models. We suggest that the switch in the dynamics of parameter sensitivities revealed by our toolbox is an early warning that a system is approaching a critical transition. We illustrate the efficiency of our toolbox by analyzing several models with bifurcations and predicting the time periods when systems can still avoid going to a critical transition by manipulating certain parameter values, which is not detectable with the existing SA techniques. DyGloSA is based on the SBToolbox2 and contains functions that dynamically compute the global sensitivity indices of the system by applying four main GPSA methods: eFAST, Sobol's ANOVA, PRCC and WALS. It includes parallelized versions of the functions, enabling a significant reduction in computational time (up to 12-fold). DyGloSA is freely available as a set of MATLAB scripts at http://bio.uni.lu/systems_biology/software/dyglosa. It requires installation of MATLAB (version R2008b or later) and the Systems Biology Toolbox2 available at www.sbtoolbox2.org. DyGloSA can be run on Windows and Linux systems, both 32- and 64-bit.
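
    A generic sketch of one of the four GPSA methods named above (PRCC), applied time point by time point to a toy ODE. This illustrates the idea of a time-resolved sensitivity trajectory, not DyGloSA itself; the model and parameter ranges are assumptions.

        # Dynamical PRCC: rank-transform sampled parameters and outputs, regress
        # out the other parameters, and correlate the residuals at each time.
        import numpy as np
        from scipy.integrate import odeint
        from scipy.stats import rankdata

        def prcc(params, output):
            """Partial rank correlation of each column of params with output."""
            R = np.column_stack([rankdata(c) for c in params.T])
            y = rankdata(output)
            vals = []
            for j in range(R.shape[1]):
                Z = np.column_stack([np.ones(len(y)), np.delete(R, j, axis=1)])
                rx = R[:, j] - Z @ np.linalg.lstsq(Z, R[:, j], rcond=None)[0]
                ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
                vals.append(np.corrcoef(rx, ry)[0, 1])
            return np.array(vals)

        def model(y, t, a, b):                    # toy ODE with a soft transition
            return a * y - b * y ** 3

        rng = np.random.default_rng(2)
        samples = rng.uniform([0.5, 0.5], [1.5, 1.5], size=(200, 2))
        t = np.linspace(0, 5, 25)
        Y = np.array([odeint(model, 0.1, t, args=tuple(p)).ravel() for p in samples])

        for k in (6, 12, 24):                     # PRCC trajectory over time
            print(f"t={t[k]:.1f}  PRCC(a, b) = {prcc(samples, Y[:, k]).round(2)}")

    A switch in which parameter dominates the PRCC trajectory over time is the kind of early-warning signature the abstract describes.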

  12. Whole-body impedance--what does it measure?

    PubMed

    Foster, K R; Lukaski, H C

    1996-09-01

    Although the bioelectrical impedance technique is widely used in human nutrition and clinical research, an integrated summary of the biophysical and bioelectrical bases of this approach is lacking. We summarize the pertinent electrical phenomena relevant to the application of the impedance technique in vivo and discuss the relations between electrical measurements and biological conductor volumes. Key terms in the derivation of bioelectrical impedance analysis are described and the relation between the electrical properties of tissues and tissue structure is discussed. The relation between the impedance of an object and its geometry, scale, and intrinsic electrical properties is also discussed. Correlations between whole-body impedance measurements and various bioconductor volumes, such as total body water and fat-free mass, are experimentally well established; however, the reason for the success of the impedance technique is much less clear. The bioengineering basis for the technique is critically presented and considerations are proposed that might help to clarify the method and potentially improve its sensitivity.
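
    A worked example of the geometric relation underlying the technique, under the usual single-cylinder idealization (the numbers below are illustrative, not clinical constants): for a uniform conductor of resistivity rho, length L and resistance R, the conductor volume is V = rho * L^2 / R.

        # Cylindrical-conductor relation behind bioelectrical impedance analysis.
        rho = 1.0      # assumed effective resistivity, ohm*m
        L = 1.75       # conductor length (stature), m
        R = 500.0      # measured whole-body resistance, ohm

        V = rho * L**2 / R                           # conductor volume, m^3
        print(f"height^2/R index: {L**2 / R * 1e4:.1f} cm^2/ohm")
        print(f"predicted conductor volume: {V * 1000:.1f} L")

    In practice, total body water or fat-free mass is regressed on the height^2/R index; the abstract's point is precisely that this empirical success outruns the simple single-cylinder theory sketched here.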

  13. Security of Color Image Data Designed by Public-Key Cryptosystem Associated with 2D-DWT

    NASA Astrophysics Data System (ADS)

    Mishra, D. C.; Sharma, R. K.; Kumar, Manish; Kumar, Kuldeep

    2014-08-01

    In present times the security of image data is a major issue. We have therefore proposed a novel technique for securing color image data using a public-key (asymmetric) cryptosystem. In this technique, we secure color image data using the RSA (Rivest-Shamir-Adleman) cryptosystem combined with the two-dimensional discrete wavelet transform (2D-DWT). Earlier schemes for the security of color images were designed on the basis of keys alone, whereas this approach secures color images with the help of both the keys and the correct arrangement of the RSA parameters. If attackers know the exact keys but have no information about the exact arrangement of the RSA parameters, they cannot recover the original information from the encrypted data. Computer simulations based on a standard example critically examine the behavior of the proposed technique. A security analysis and a detailed comparison between earlier schemes for the security of color images and the proposed technique are also presented to demonstrate the robustness of the cryptosystem.
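
    A toy sketch of the pipeline described above, assuming textbook RSA applied to quantized 2D-DWT coefficients; the key parameters are the classic small-prime example and are far too small for real security. It assumes the PyWavelets package is installed; it is an illustration of the idea, not the authors' scheme.

        import numpy as np
        import pywt

        p, q, e = 61, 53, 17                     # toy RSA primes and public exponent
        n, d = p * q, 2753                       # d = e^-1 mod (p-1)(q-1)
        OFFSET = 512                             # shift coefficients into [0, n)

        channel = np.random.default_rng(3).integers(0, 256, (8, 8))
        cA, (cH, cV, cD) = pywt.dwt2(channel.astype(float), "haar")

        enc = [pow(int(round(c)) + OFFSET, e, n) for c in cA.ravel()]  # encrypt cA band
        dec = np.array([pow(c, d, n) - OFFSET for c in enc]).reshape(cA.shape)
        assert np.allclose(dec, np.round(cA))
        print("approximation band recovered:", np.allclose(dec, np.round(cA)))

    Decryption requires both the key pair and knowledge of which subband was encrypted with which parameters, mirroring the dual requirement (keys plus correct arrangement of RSA parameters) highlighted in the abstract.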

  14. Measuring the Impact of Online Evidence Retrieval Systems using Critical Incidents & Journey Mapping.

    PubMed

    Westbrook, Johanna I; Coiera, Enrico W; Braithwaite, Jeffrey

    2005-01-01

    Online evidence retrieval systems are one potential tool in supporting evidence-based practice. We have undertaken a program of research to investigate how hospital-based clinicians (doctors, nurses and allied health professionals) use these systems, factors influencing use, and their impact on decision-making and health care delivery. A central component of this work has been the development and testing of a broad range of evaluation techniques. This paper provides an overview of the results obtained from three stages of this evaluation and details the results derived from the final stage, which sought to test two methods for assessing the integration of an online evidence system and its impact on decision making and patient care. The critical incident and journey mapping techniques were applied. Semi-structured interviews were conducted with 29 clinicians who were experienced users of the online evidence system. Clinicians were asked to describe recent instances in which the information obtained using the online evidence system was especially helpful in their work. A grounded approach to data analysis was taken, producing three categories of impact. The journey mapping technique was adapted as a method to describe and quantify clinicians' integration of CIAP into their practice and the impact of this on patient care. The analogy of a journey is used to capture the many stages in this integration process, from introduction to the system to full integration into everyday clinical practice with measurable outcomes. Transcribed interview accounts of system use were mapped against the journey stages and scored. Clinicians generated 85 critical incidents, and one quarter of these provided specific examples of system use leading to improvements in patient care. The journey mapping technique proved to be a useful method for quantifying the ways and extent to which clinicians had integrated system use into practice, and for providing insights into how information systems can influence organisational culture. Further work is required on this technique to assess its value as an evaluation method. The study demonstrates the strength of a triangulated evidence approach to assessing the use and impact of online clinical evidence systems.

  15. Analysis on the Critical Rainfall Value For Predicting Large Scale Landslides Caused by Heavy Rainfall In Taiwan.

    NASA Astrophysics Data System (ADS)

    Tsai, Kuang-Jung; Chiang, Jie-Lun; Lee, Ming-Hsi; Chen, Yie-Ruey

    2017-04-01

    The accumulated rainfall brought by Typhoon Morakot in August 2009 exceeded 2,900 mm within three consecutive days. Very serious landslides and sediment-related disasters were induced by this heavy rainfall event. The satellite image analysis project conducted by the Soil and Water Conservation Bureau after the Morakot event identified more than 10,904 landslide sites with a total sliding area of 18,113 ha. At the same time, all severe sediment-related disaster areas were characterized based on their disaster type, scale, topography, major bedrock formations and geologic structures during the period of extremely heavy rainfall events that occurred in southern Taiwan. Characteristics and mechanisms of large scale landslides are collected on the basis of field investigation technology integrated with GPS/GIS/RS techniques. In order to decrease the risk of large scale landslides on slope land, a slope land conservation strategy and a critical rainfall database should be established and put into operation as soon as possible. Meanwhile, establishing the critical rainfall value used for predicting large scale landslides induced by heavy rainfall has become an important issue of serious concern to the government and the people of Taiwan. The mechanisms of large scale landslides, rainfall frequency analysis, sediment budget estimation and river hydraulic analysis under the extreme climate conditions of the past 10 years are addressed by this research. Hopefully, the results developed from this research can be used as a warning system for predicting large scale landslides in southern Taiwan. Keywords: Heavy Rainfall, Large Scale Landslides, Critical Rainfall Value

  16. Discrimination of irradiated MOX fuel from UOX fuel by multivariate statistical analysis of simulated activities of gamma-emitting isotopes

    NASA Astrophysics Data System (ADS)

    Åberg Lindell, M.; Andersson, P.; Grape, S.; Hellesen, C.; Håkansson, A.; Thulin, M.

    2018-03-01

    This paper investigates how concentrations of certain fission products and their related gamma-ray emissions can be used to discriminate between uranium oxide (UOX) and mixed oxide (MOX) type fuel. Discrimination of irradiated MOX fuel from irradiated UOX fuel is important in nuclear facilities and for transport of nuclear fuel, for purposes of both criticality safety and nuclear safeguards. Although facility operators keep records on the identity and properties of each fuel, tools for nuclear safeguards inspectors that enable independent verification of the fuel are critical in the recovery of continuity of knowledge, should it be lost. A discrimination methodology for classification of UOX and MOX fuel, based on passive gamma-ray spectroscopy data and multivariate analysis methods, is presented. Nuclear fuels and their gamma-ray emissions were simulated in the Monte Carlo code Serpent, and the resulting data was used as input to train seven different multivariate classification techniques. The trained classifiers were subsequently implemented and evaluated with respect to their capabilities to correctly predict the classes of unknown fuel items. The best results concerning successful discrimination of UOX and MOX-fuel were acquired when using non-linear classification techniques, such as the k nearest neighbors method and the Gaussian kernel support vector machine. For fuel with cooling times up to 20 years, when it is considered that gamma-rays from the isotope 134Cs can still be efficiently measured, success rates of 100% were obtained. A sensitivity analysis indicated that these methods were also robust.
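
    A minimal sketch of the classification step, using two of the best-performing classifiers named above (kNN and a Gaussian-kernel SVM) on synthetic stand-in features; the feature definitions, class shifts and noise levels below are assumptions, not the Serpent simulation data.

        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(4)
        n = 300
        y = rng.integers(0, 2, n)                 # 0 = UOX, 1 = MOX (labels)
        # Hypothetical features: two gamma-emitter activity ratios, shifted
        # between classes and smeared by burnup/cooling-time variation.
        X = rng.normal(loc=np.column_stack([0.8 + 0.4 * y, 1.0 + 0.6 * y]),
                       scale=0.15)

        for name, clf in (("kNN", KNeighborsClassifier(n_neighbors=5)),
                          ("RBF-SVM", SVC(kernel="rbf", gamma="scale"))):
            acc = cross_val_score(make_pipeline(StandardScaler(), clf), X, y, cv=5)
            print(f"{name}: {acc.mean():.2%} +/- {acc.std():.2%}")

    Cross-validated accuracy is the kind of success-rate figure the abstract reports; the non-linear decision boundaries of both classifiers are what allow them to outperform linear methods when class clusters are curved or shifted by cooling time.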

  17. Exposomics research using suspect screening and non ...

    EPA Pesticide Factsheets

    High-resolution mass spectrometry (HRMS) is used for suspect screening analysis (SSA) and non-targeted analysis (NTA) in an attempt to characterize xenobiotic chemicals in various samples broadly and efficiently. These important techniques aid characterization of the exposome, the totality of human exposures, and provide critical information on thousands of chemicals in commerce for which exposure data are lacking. The Environmental Protection Agency (EPA) SSA and NTA capabilities consist of analytical instrumentation [liquid chromatography (LC) with time-of-flight (TOF) and quadrupole-TOF (Q-TOF) HRMS], workflows (feature extraction, formula generation, structure prediction, spectral matching, chemical confirmation), and tools (databases; models for predicting retention time, functional use, media occurrence, and media concentration; and schemes for ranking features and chemicals).

  18. Analysis of stray radiation for infrared optical system

    NASA Astrophysics Data System (ADS)

    Li, Yang; Zhang, Tingcheng; Liao, Zhibo; Mu, Shengbo; Du, Jianxiang; Wang, Xiangdong

    2016-10-01

    Based on the theory of radiation energy transfer in infrared optical systems, two methods are proposed for analyzing the stray radiation caused by interior thermal radiation in an infrared optical system: an importance sampling technique using forward ray tracing, and an integral computation method using reverse ray tracing. The two methods are discussed in detail, and a concrete infrared optical system is provided as an example. LightTools is used to simulate the passage of radiation from the mirrors and mounts, and absolute values of the internal irradiance on the detector are obtained. The results show that the main part of the energy on the detector is due to the critical objects, consistent with the critical objects obtained by reverse ray tracing, with mirror self-emission contributing about 87.5% of the total energy. The irradiances on the detector calculated by the two methods are in good agreement, demonstrating the validity and soundness of both methods.

  19. Technical and investigative support for high density digital satellite recording systems

    NASA Technical Reports Server (NTRS)

    Schultz, R. A.

    1983-01-01

    Recent results of dropout measurements and defect analysis conducted on one reel of Ampex 721 which was submitted for evaluation by the manufacturer are described. The results or status of other tape evaluation activities are also reviewed. Several changes in test interpretations and applications are recommended. In some cases, deficiencies in test methods or equipment became apparent during continued work on this project and other IITRI tape evaluation projects. Techniques and equipment for future tasks such as tape qualification are also recommended and discussed. Project effort and expenditures were kept at a relatively low level. This rate provided added development time and experience with the IITRI Dropout Measurement System, which is approaching its potential as a computer based dropout analysis tool. Another benefit is the expanded data base on critical parameters that can be achieved from tests on different tape types and lots as they become available. More consideration and effort was directed toward identification of critical parameters, development of meaningful repeatable test procedures, and tape procurement strategy.

  20. Critical review of the United Kingdom's "gold standard" survey of public attitudes to science.

    PubMed

    Smith, Benjamin K; Jensen, Eric A

    2016-02-01

    Since 2000, the UK government has funded surveys aimed at understanding the UK public's attitudes toward science, scientists, and science policy. Known as the Public Attitudes to Science series, these surveys and their predecessors have long been used in UK science communication policy, practice, and scholarship as a source of authoritative knowledge about science-related attitudes and behaviors. Given their importance and the significant public funding investment they represent, detailed academic scrutiny of the studies is needed. In this essay, we critically review the most recently published Public Attitudes to Science survey (2014), assessing the robustness of its methods and claims. The review casts doubt on the quality of key elements of the Public Attitudes to Science 2014 survey data and analysis while highlighting the importance of robust quantitative social research methodology. Our analysis comparing the main sample and booster sample for young people demonstrates that quota sampling cannot be assumed equivalent to probability-based sampling techniques. © The Author(s) 2016.

  1. Metabolomic analysis using porcine skin: a pilot study of analytical techniques.

    PubMed

    Wu, Julie; Fiehn, Oliver; Armstrong, April W

    2014-06-15

    Metabolic byproducts serve as indicators of underlying chemical processes and can provide valuable information on pathogenesis by measuring the amplified output. Standardized techniques for metabolome extraction from skin samples would serve as a critical foundation for this field but have not been developed. We sought to determine the optimal cell lysis techniques for skin sample preparation and to compare GC-TOF-MS and UHPLC-QTOF-MS for metabolomic analysis. Using porcine skin samples, we pulverized the skin via various combinations of mechanical techniques for cell lysis. After extraction, the samples were subjected to GC-TOF-MS and/or UHPLC-QTOF-MS. Signal intensities from GC-TOF-MS analysis showed that ultrasonication (2.7x10^7) was most effective for cell lysis when compared to mortar-and-pestle (2.6x10^7), ball mill followed by ultrasonication (1.6x10^7), mortar-and-pestle followed by ultrasonication (1.4x10^7), and homogenization (trial 1: 8.4x10^6; trial 2: 1.6x10^7). Given the similar signal intensities, ultrasonication and mortar-and-pestle were applied to additional samples and subjected to GC-TOF-MS and UHPLC-QTOF-MS. Ultrasonication yielded greater signal intensities than mortar-and-pestle for 92% of detected metabolites following GC-TOF-MS and for 68% of detected metabolites following UHPLC-QTOF-MS. Overall, ultrasonication is the preferred method for efficient cell lysis of skin tissue for both metabolomic platforms. With standardized sample preparation, metabolomic analysis of skin can serve as a powerful tool in elucidating underlying biological processes in dermatological conditions.

  2. Cardiac imaging: working towards fully-automated machine analysis & interpretation.

    PubMed

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-03-01

    Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.

  3. Timing analysis by model checking

    NASA Technical Reports Server (NTRS)

    Naydich, Dimitri; Guaspari, David

    2000-01-01

    The safety of modern avionics relies on high integrity software that can be verified to meet hard real-time requirements. The limits of verification technology therefore determine acceptable engineering practice. To simplify verification problems, safety-critical systems are commonly implemented under the severe constraints of a cyclic executive, which make design an expensive trial-and-error process highly intolerant of change. Important advances in analysis techniques, such as rate monotonic analysis (RMA), have provided a theoretical and practical basis for easing these onerous restrictions. But RMA and its kindred have two limitations: they apply only to verifying the requirement of schedulability (that tasks meet their deadlines) and they cannot be applied to many common programming paradigms. We address both these limitations by applying model checking, a technique with successful industrial applications in hardware design. Model checking algorithms analyze finite state machines, either by explicit state enumeration or by symbolic manipulation. Since quantitative timing properties involve a potentially unbounded state variable (a clock), our first problem is to construct a finite approximation that is conservative for the properties being analyzed: if the approximation satisfies the properties of interest, so does the infinite model. To reduce the potential for state space explosion we must further optimize this finite model. Experiments with some simple optimizations have yielded a hundred-fold efficiency improvement over published techniques.
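
    A minimal sketch of the finite-abstraction idea, assuming a simple fixed-priority periodic task model (an illustration of bounded-clock state exploration, not the authors' tool): because the task set repeats with the hyperperiod, checking every discrete-time state over one hyperperiod suffices to verify the schedulability property "no task misses its deadline".

        import math
        from functools import reduce

        # Assumed task set: (period == deadline, worst-case execution time),
        # listed highest priority first (rate monotonic order).
        tasks = [(5, 1), (10, 3), (20, 6)]

        hyper = reduce(math.lcm, (p for p, _ in tasks))
        remaining = [0] * len(tasks)              # outstanding work per task
        for t in range(hyper):                    # one explicit state per tick
            for i, (p, c) in enumerate(tasks):
                if t % p == 0:
                    if remaining[i] > 0:          # job unfinished at its deadline
                        raise SystemExit(f"deadline miss: task {i} at t={t}")
                    remaining[i] = c
            for i, w in enumerate(remaining):     # run highest-priority ready task
                if w > 0:
                    remaining[i] -= 1
                    break
        assert all(w == 0 for w in remaining)
        print("schedulability holds over one hyperperiod of", hyper, "ticks")

    The unbounded clock is replaced by its value modulo the hyperperiod, which is exactly the kind of conservative finite approximation the abstract motivates.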

  4. Validating an Air Traffic Management Concept of Operation Using Statistical Modeling

    NASA Technical Reports Server (NTRS)

    He, Yuning; Davies, Misty Dawn

    2013-01-01

    Validating a concept of operation for a complex, safety-critical system (like the National Airspace System) is challenging because of the high dimensionality of the controllable parameters and the infinite number of states of the system. In this paper, we use statistical modeling techniques to explore the behavior of a conflict detection and resolution algorithm designed for the terminal airspace. These techniques predict the robustness of the system simulation to both nominal and off-nominal behaviors within the overall airspace. They also can be used to evaluate the output of the simulation against recorded airspace data. Additionally, the techniques carry with them a mathematical measure of the worth of each prediction: a statistical uncertainty for any robustness estimate. Uncertainty Quantification (UQ) is the process of quantitative characterization and, ultimately, reduction of uncertainties in complex systems. UQ is important for understanding the influence of uncertainties on the behavior of a system and is therefore valuable for design, analysis, and verification and validation. In this paper, we apply advanced statistical modeling methodologies and techniques to an advanced air traffic management system, namely the Terminal Tactical Separation Assured Flight Environment (T-TSAFE). We show initial results for a parameter analysis and safety boundary (envelope) detection in the high-dimensional parameter space. For our boundary analysis, we developed a new sequential approach based upon the design of computer experiments, allowing us to incorporate knowledge from domain experts into our modeling and to determine the most likely boundary shapes and their parameters. We carried out the analysis on system parameters and describe an initial approach that will allow us to include time-series inputs, such as the radar track data, into the analysis.
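
    A minimal surrogate-modeling sketch in the spirit described above, assuming a hypothetical toy "simulation" in place of T-TSAFE: a Gaussian process fitted to a small design of computer experiments predicts a robustness metric over the parameter space and attaches a statistical uncertainty to each prediction.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        def simulate(x):            # stand-in for an expensive airspace simulation
            return np.sin(3 * x[:, 0]) * np.cos(2 * x[:, 1])

        rng = np.random.default_rng(5)
        X_train = rng.uniform(0, 1, (40, 2))       # design of computer experiments
        y_train = simulate(X_train)

        gp = GaussianProcessRegressor(kernel=RBF(0.3) + WhiteKernel(1e-4),
                                      normalize_y=True).fit(X_train, y_train)
        X_new = rng.uniform(0, 1, (5, 2))
        mean, std = gp.predict(X_new, return_std=True)
        for x, m, s in zip(X_new, mean, std):
            print(f"x={np.round(x, 2)}  predicted metric {m:+.2f} +/- {s:.2f}")

    The +/- term is the statistical uncertainty attached to each robustness estimate; a safety boundary can then be located by searching for the level set mean(x) = threshold while honoring that uncertainty, e.g. by sampling new simulations where the uncertainty straddles the threshold.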

  5. Discourse analysis: towards an understanding of its place in nursing.

    PubMed

    Crowe, Marie

    2005-07-01

    This paper describes how discourse analysis, and in particular critical discourse analysis, can be used in nursing research, and provides an example to illustrate the techniques involved. Discourse analysis rose to prominence in the 1980s and 1990s in disciplines such as the social sciences, literary theory and cultural studies, and is increasingly used in nursing. This paper investigates discourse analysis as a useful methodology for conducting nursing research. Effective clinical reasoning relies on employing several different kinds of knowledge and research that draw on different perspectives, methodologies and techniques to generate breadth of knowledge and depth of understanding of clinical practices and patients' experiences of those practices. The steps in a discourse analysis include: choosing the text; identifying the explicit purpose of the text; the processes used for claiming authority; connections to other discourses; construction of major concepts; processes of naming and categorizing; construction of subject positions; construction of reality and social relations; and implications for the practice of nursing. The limitations of discourse analysis, its relationship to other qualitative approaches and questions for evaluating the rigour of research using discourse analysis are also explored. The example of discourse analysis shows how a text influences the practice of nursing by shaping knowledge, values and beliefs. Discourse analysis can contribute to the development of nursing knowledge by providing a research strategy to examine dominant discourses that influence nursing practice.

  6. Using Job Analysis Techniques to Understand Training Needs for Promotores de Salud.

    PubMed

    Ospina, Javier H; Langford, Toshiko A; Henry, Kimberly L; Nelson, Tristan Q

    2018-04-01

    Despite the value of community health worker programs, such as Promotores de Salud, for addressing health disparities in the Latino community, little consensus has been reached to formally define the unique roles and duties associated with the job, thereby creating unique job training challenges. Understanding the job tasks and worker attributes central to this work is a critical first step for developing the training and evaluation systems of promotores programs. Here, we present the process and findings of a job analysis conducted for promotores working for Planned Parenthood. We employed a systematic approach, the combination job analysis method, to define the job in terms of its work and worker requirements, identifying key job tasks, as well as the worker attributes necessary to effectively perform them. Our results suggest that the promotores' job encompasses a broad range of activities and requires an equally broad range of personal characteristics to perform. These results played an important role in the development of our training and evaluation protocols. In this article, we introduce the technique of job analysis, provide an overview of the results from our own application of this technique, and discuss how these findings can be used to inform a training and performance evaluation system. This article provides a template for other organizations implementing similar community health worker programs and illustrates the value of conducting a job analysis for clarifying job roles, developing and evaluating job training materials, and selecting qualified job candidates.

  7. EPR dosimetry in a mixed neutron and gamma radiation field.

    PubMed

    Trompier, F; Fattibene, P; Tikunov, D; Bartolotta, A; Carosi, A; Doca, M C

    2004-01-01

    The suitability of Electron Paramagnetic Resonance (EPR) spectroscopy for criticality dosimetry was evaluated for tooth enamel, mannose and alanine pellets during the 'international intercomparison of criticality dosimetry techniques' at the SILENE reactor held at Valduc, France, in June 2002. These three materials were irradiated in neutron and gamma-ray fields of various relative intensities and spectral distributions in order to evaluate their neutron sensitivity. The relative neutron response was found to be around 10% for tooth enamel, 45% for mannose and between 40 and 90% for alanine pellets, according to their type. In line with the IAEA recommendations on the early estimation of criticality-accident absorbed dose, the analyzed results show the potential of EPR and its complementarity with standard criticality dosimetry techniques.
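
    A worked example of how two materials with different relative neutron responses can, in principle, separate the gamma and neutron dose components of a mixed field. The linear response model M = D_gamma + k * D_neutron and the apparent-dose values are assumptions for illustration; only the k values are taken from the abstract.

        import numpy as np

        k_enamel, k_mannose = 0.10, 0.45   # relative neutron responses (abstract)
        M_enamel, M_mannose = 2.5, 3.2     # hypothetical apparent doses, Gy

        # Two equations M_i = D_gamma + k_i * D_neutron in two unknowns.
        A = np.array([[1.0, k_enamel],
                      [1.0, k_mannose]])
        d_gamma, d_neutron = np.linalg.solve(A, [M_enamel, M_mannose])
        print(f"gamma dose   ~ {d_gamma:.2f} Gy")
        print(f"neutron dose ~ {d_neutron:.2f} Gy")

    The larger the spread in k between the dosimeter materials, the better conditioned this system is, which is one reason pairing a low-response material (enamel) with a higher-response one (mannose or alanine) is useful.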

  8. Comparative assessment of bone pose estimation using Point Cluster Technique and OpenSim.

    PubMed

    Lathrop, Rebecca L; Chaudhari, Ajit M W; Siston, Robert A

    2011-11-01

    Estimating the position of the bones from optical motion capture data is a challenge associated with human movement analysis. Bone pose estimation techniques such as the Point Cluster Technique (PCT) and simulations of movement through software packages such as OpenSim are used to minimize soft tissue artifact and estimate skeletal position; however, using different methods for analysis may produce differing kinematic results which could lead to differences in clinical interpretation such as a misclassification of normal or pathological gait. This study evaluated the differences present in knee joint kinematics as a result of calculating joint angles using various techniques. We calculated knee joint kinematics from experimental gait data using the standard PCT, the least squares approach in OpenSim applied to experimental marker data, and the least squares approach in OpenSim applied to the results of the PCT algorithm. Maximum and resultant RMS differences in knee angles were calculated between all techniques. We observed differences in flexion/extension, varus/valgus, and internal/external rotation angles between all approaches. The largest differences were between the PCT results and all results calculated using OpenSim. The RMS differences averaged nearly 5° for flexion/extension angles with maximum differences exceeding 15°. Average RMS differences were relatively small (< 1.08°) between results calculated within OpenSim, suggesting that the choice of marker weighting is not critical to the results of the least squares inverse kinematics calculations. The largest difference between techniques appeared to be a constant offset between the PCT and all OpenSim results, which may be due to differences in the definition of anatomical reference frames, scaling of musculoskeletal models, and/or placement of virtual markers within OpenSim. Different methods for data analysis can produce largely different kinematic results, which could lead to the misclassification of normal or pathological gait. Improved techniques to allow non-uniform scaling of generic models to more accurately reflect subject-specific bone geometries and anatomical reference frames may reduce differences between bone pose estimation techniques and allow for comparison across gait analysis platforms.
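
    A minimal sketch of the comparison metrics used above (maximum and RMS differences between joint-angle time series); the synthetic curves below stand in for PCT and OpenSim outputs and are not the study's data.

        import numpy as np

        t = np.linspace(0, 1, 101)                  # one normalized gait cycle
        angle_pct = 30 * np.sin(np.pi * t) + 5      # hypothetical PCT flexion, deg
        angle_osim = 30 * np.sin(np.pi * t)         # hypothetical OpenSim flexion, deg

        diff = angle_pct - angle_osim
        print(f"max difference: {np.abs(diff).max():.1f} deg")
        print(f"RMS difference: {np.sqrt(np.mean(diff**2)):.1f} deg")

    A near-constant offset like the one built into this toy example is consistent with differing anatomical reference-frame definitions rather than noise, matching the abstract's interpretation of the PCT-versus-OpenSim discrepancy.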

  9. Chemical and biological threat-agent detection using electrophoresis-based lab-on-a-chip devices.

    PubMed

    Borowsky, Joseph; Collins, Greg E

    2007-10-01

    The ability to separate complex mixtures of analytes has made capillary electrophoresis (CE) a powerful analytical tool since its modern configuration was first introduced over 25 years ago. The technique found new utility with its application to the microfluidics based lab-on-a-chip platform (i.e., microchip), which resulted in ever smaller footprints, sample volumes, and analysis times. These features, coupled with the technique's potential for portability, have prompted recent interest in the development of novel analyzers for chemical and biological threat agents. This article will comment on three main areas of microchip CE as applied to the separation and detection of threat agents: detection techniques and their corresponding limits of detection, sampling protocol and preparation time, and system portability. These three areas typify the broad utility of lab-on-a-chip for meeting critical, present-day security, in addition to illustrating areas wherein advances are necessary.

  10. Methodological optimization of tinnitus assessment using prepulse inhibition of the acoustic startle reflex.

    PubMed

    Longenecker, R J; Galazyuk, A V

    2012-11-16

    Recently, prepulse inhibition of the acoustic startle reflex (ASR) has become a popular technique for tinnitus assessment in laboratory animals. This method confers a significant advantage over the previously used, time-consuming behavioral approaches utilizing basic mechanisms of conditioning. Although this technique has been successfully used to assess tinnitus in different laboratory animals, many of the finer details of this methodology have not been described in enough detail to be replicated, yet are critical for tinnitus assessment. Here we provide a detailed description of key procedures and methodological issues, guiding newcomers through the process of learning to apply gap detection techniques correctly for tinnitus assessment in laboratory animals. The major categories of these issues include: refinement of hardware for best performance, optimization of stimulus parameters, behavioral considerations, and identification of optimal strategies for data analysis. This article is part of a Special Issue entitled: Tinnitus Neuroscience. Copyright © 2012. Published by Elsevier B.V.

  11. Monosomy 3 by FISH in uveal melanoma: variability in techniques and results.

    PubMed

    Aronow, Mary; Sun, Yang; Saunthararajah, Yogen; Biscotti, Charles; Tubbs, Raymond; Triozzi, Pierre; Singh, Arun D

    2012-09-01

    Tumor monosomy 3 confers a poor prognosis in patients with uveal melanoma. We critically review the techniques used for fluorescence in situ hybridization (FISH) detection of monosomy 3 in order to assess variability in practice patterns and to explain differences in results. Significant variability that has likely affected reported results was found in tissue sampling methods, selection of FISH probes, number of cells counted, and the cut-off point used to determine monosomy 3 status. Clinical parameters and specific techniques employed to report FISH results should be specified so as to allow meta-analysis of published studies. FISH-based detection of monosomy 3 in uveal melanoma has not been performed in a standardized manner, which limits conclusions regarding its clinical utility. FISH is a widely available, versatile technology, and when performed optimally has the potential to be a valuable tool for determining the prognosis of uveal melanoma. Copyright © 2012 Elsevier Inc. All rights reserved.

  12. X-ray near-field speckle: implementation and critical analysis

    PubMed Central

    Lu, Xinhui; Mochrie, S. G. J.; Narayanan, S.; Sandy, A. R.; Sprung, M.

    2011-01-01

    The newly introduced coherence-based technique of X-ray near-field speckle (XNFS) has been implemented at 8-ID-I at the Advanced Photon Source. In the near-field regime of high-brilliance synchrotron X-rays scattered from a sample of interest, it turns out that, when the scattered radiation and the main beam both impinge upon an X-ray area detector, the measured intensity shows low-contrast speckles, resulting from interference between the incident and scattered beams. A micrometer-resolution XNFS detector with a high numerical aperture microscope objective has been built and its capability for studying static structures and dynamics at longer length scales than traditional far-field X-ray scattering techniques is demonstrated. Specifically, the dynamics of dilute silica and polystyrene colloidal samples are characterized. This study reveals certain limitations of the XNFS technique, especially in the characterization of static structures, which is discussed. PMID:21997906

  13. Analysis of recurrent patterns in toroidal magnetic fields.

    PubMed

    Sanderson, Allen R; Chen, Guoning; Tricoche, Xavier; Pugmire, David; Kruger, Scott; Breslau, Joshua

    2010-01-01

    In the development of magnetic confinement fusion which will potentially be a future source for low cost power, physicists must be able to analyze the magnetic field that confines the burning plasma. While the magnetic field can be described as a vector field, traditional techniques for analyzing the field's topology cannot be used because of its Hamiltonian nature. In this paper we describe a technique developed as a collaboration between physicists and computer scientists that determines the topology of a toroidal magnetic field using fieldlines with near minimal lengths. More specifically, we analyze the Poincaré map of the sampled fieldlines in a Poincaré section including identifying critical points and other topological features of interest to physicists. The technique has been deployed into an interactive parallel visualization tool which physicists are using to gain new insight into simulations of magnetically confined burning plasmas.
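
    A minimal sketch of how a Poincare map is built from traced fieldlines, using a deliberately simple toy field (circular flux surfaces with a small resonant perturbation); the field model is an assumption for illustration and is unrelated to the authors' simulation data.

        import numpy as np
        from scipy.integrate import solve_ivp

        iota, eps = 0.35, 0.02                    # toy transform and perturbation

        def fieldline(phi, y):
            """dy/dphi for y = (r, theta) in a toy perturbed toroidal field."""
            r, th = y
            return [eps * np.sin(2 * th - phi),
                    iota + (eps / max(r, 1e-6)) * np.cos(2 * th - phi)]

        sections = []
        for r0 in np.linspace(0.2, 0.8, 4):       # seed several fieldlines
            sol = solve_ivp(fieldline, (0, 2 * np.pi * 200), [r0, 0.0],
                            t_eval=2 * np.pi * np.arange(200), rtol=1e-8)
            # Record (r, theta) each time the line punctures the phi = 0 plane.
            sections.append(np.column_stack([sol.y[0], sol.y[1] % (2 * np.pi)]))

        print("first three punctures of the innermost line:")
        print(np.round(sections[0][:3], 3))

    Plotted in the section plane, closed puncture curves indicate intact flux surfaces, while island chains and scattered points mark resonances and chaotic regions; identifying such critical points is the topological analysis the abstract describes.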

  14. New efficient optimizing techniques for Kalman filters and numerical weather prediction models

    NASA Astrophysics Data System (ADS)

    Famelis, Ioannis; Galanis, George; Liakatas, Aristotelis

    2016-06-01

    The need for accurate local environmental predictions and simulations beyond classical meteorological forecasts has been increasing in recent years due to the great number of applications that are directly or indirectly affected: renewable energy resource assessment, natural hazard early warning systems, and questions on global warming and climate change can be listed among them. Within this framework, the utilization of numerical weather and wave prediction systems in conjunction with advanced statistical techniques that support the elimination of model bias and the reduction of error variability may successfully address the above issues. In the present work, new optimization methods are studied and tested in selected areas of Greece where the use of renewable energy sources is of critical importance. The added value of the proposed work stems from the solid mathematical background adopted, making use of Information Geometry and statistical techniques, new versions of Kalman filters, and state-of-the-art numerical analysis tools.
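
    A minimal scalar Kalman-filter sketch for forecast bias removal. This is a standard random-walk-bias formulation assumed here for illustration, not the specific filters of the study; the noise variances and data are hypothetical.

        import numpy as np

        def kalman_bias(forecasts, observations, q=0.01, r=1.0):
            """Track a slowly varying bias b so that corrected = forecast - b."""
            b, P = 0.0, 1.0                  # initial bias estimate and variance
            corrected = []
            for f, y in zip(forecasts, observations):
                P += q                       # predict: random-walk bias model
                K = P / (P + r)              # Kalman gain
                b += K * ((f - y) - b)       # update with innovation (f - y) - b
                P *= 1 - K
                corrected.append(f - b)
            return np.array(corrected)

        rng = np.random.default_rng(6)
        truth = 8 + np.sin(np.linspace(0, 6, 120))        # e.g., wind speed, m/s
        fcst = truth + 1.5 + 0.3 * rng.normal(size=120)   # biased model output
        corr = kalman_bias(fcst, truth)
        print("raw bias:", (fcst - truth).mean().round(2),
              " corrected bias:", (corr[20:] - truth[20:]).mean().round(2))

    Because the filter adapts sequentially, it removes the systematic component of the model error while leaving the unpredictable variability to be characterized separately, which is the division of labor the abstract describes.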

  15. Laser-induced fluorescence spectroscopy in tissue local necrosis detection

    NASA Astrophysics Data System (ADS)

    Cip, Ondrej; Buchta, Zdenek; Lesundak, Adam; Randula, Antonin; Mikel, Bretislav; Lazar, Josef; Veverkova, Lenka

    2014-03-01

    Recent efforts have been directed toward reliable imaging techniques that can assist a surgeon during operations. Fluorescence spectroscopy was selected as a very useful online in vivo imaging method for the analysis of organic and biological materials. The presented work focuses on a laser-induced fluorescence spectroscopy technique for detecting local tissue necrosis in small-intestine surgery. In first experiments, we tested a tissue auto-fluorescence technique, but the signal-to-noise ratio did not yield significant results. We then applied a contrast dye, IndoCyanine Green (ICG), which absorbs and emits wavelengths in the near IR. We arranged a pilot experimental setup based on a highly coherent extended cavity diode laser (ECDL) used for stimulating critical areas of the small-intestine tissue injected with ICG dye. We demonstrated the distribution of the ICG dye with a first series of images of rabbit small-intestine tissue captured by a high-sensitivity fluorescence camera.

  16. Electrical failure debug using interlayer profiling method

    NASA Astrophysics Data System (ADS)

    Yang, Thomas; Shen, Yang; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh

    2017-03-01

    It is very well known that as technology nodes move to smaller sizes, the number of design rules increases, design structures become more regular, and the number of process manufacturing steps increases as well. Normal inspection tools can only monitor hard failures on a single layer; electrical failures caused by inter-layer misalignments can only be detected through testing. This paper presents a working flow that uses pattern-analysis interlayer profiling techniques to turn multi-layer physical information into grouped, linked parameter values. Using this data-analysis flow combined with an electrical model allows us to find critical regions in a layout for yield learning.

  17. Navigational Traffic Conflict Technique: A Proactive Approach to Quantitative Measurement of Collision Risks in Port Waters

    NASA Astrophysics Data System (ADS)

    Debnath, Ashim Kumar; Chin, Hoong Chor

    Navigational safety analysis relying on collision statistics is often hampered because of the low number of observations. A promising alternative approach that overcomes this problem is proposed in this paper. By analyzing critical vessel interactions this approach proactively measures collision risk in port waters. The proposed method is illustrated for quantitative measurement of collision risks in Singapore port fairways, and validated by examining correlations between the measured risks with those perceived by pilots. This method is an ethically appealing alternative to the collision-based analysis for fast, reliable and effective safety assessment, thus possessing great potential for managing collision risks in port waters.
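
    A minimal sketch of one common proximity-based conflict indicator (distance and time to the closest point of approach between two vessels). This is a generic kinematic measure assumed for illustration, not necessarily the exact risk model of the paper.

        import numpy as np

        def cpa(p1, v1, p2, v2):
            """Positions (m) and velocities (m/s) -> (TCPA s, DCPA m)."""
            dp, dv = np.subtract(p2, p1), np.subtract(v2, v1)
            speed2 = np.dot(dv, dv)
            t = -np.dot(dp, dv) / speed2 if speed2 > 0 else 0.0
            t = max(t, 0.0)                      # CPA already passed -> use now
            return t, float(np.linalg.norm(dp + t * dv))

        # Own ship heading east at 5 m/s; target converging from the northeast.
        tcpa, dcpa = cpa([0, 0], [5, 0], [4000, 1500], [-4, -2])
        print(f"TCPA = {tcpa:.0f} s, DCPA = {dcpa:.0f} m")

    Interactions with simultaneously small DCPA and small TCPA can be flagged as critical vessel conflicts; counting such events over observed traffic yields a proactive risk measure that does not wait for collisions to accumulate in the statistics.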

  18. Power System Analysis

    NASA Astrophysics Data System (ADS)

    Taniguchi, Haruhito

    Electric power generation relying on a diverse mix of primary energy sources is expected to bring down CO2 emission levels and thereby support the overall strategy to curb global warming. Accordingly, utilities are moving towards integrating more renewable sources for generation, mostly dispersed, and adopting Smart Grid technologies for system control. In order to construct, operate, and maintain power systems stably and economically against this background, a thorough understanding of the characteristics of power systems and their components is essential. This paper presents modeling and simulation techniques available for the analysis of critical aspects such as thermal capacity, stability, voltage stability, and frequency dynamics, vital for the stable operation of power systems.

  19. Adjoint-Based Aerodynamic Design of Complex Aerospace Configurations

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.

    2016-01-01

    An overview of twenty years of adjoint-based aerodynamic design research at NASA Langley Research Center is presented. Adjoint-based algorithms provide a powerful tool for efficient sensitivity analysis of complex large-scale computational fluid dynamics (CFD) simulations. Unlike alternative approaches for which computational expense generally scales with the number of design parameters, adjoint techniques yield sensitivity derivatives of a simulation output with respect to all input parameters at the cost of a single additional simulation. With modern large-scale CFD applications often requiring millions of compute hours for a single analysis, the efficiency afforded by adjoint methods is critical in realizing a computationally tractable design optimization capability for such applications.

  20. Artificial intelligence applied to process signal analysis

    NASA Technical Reports Server (NTRS)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.

  1. RP and RQA Analysis for Floating Potential Fluctuations in a DC Magnetron Sputtering Plasma

    NASA Astrophysics Data System (ADS)

    Sabavath, Gopikishan; Banerjee, I.; Mahapatra, S. K.

    2016-04-01

    The nonlinear dynamics of a direct current magnetron sputtering plasma are visualized using the recurrence plot (RP) technique. RP underpins recurrence quantification analysis (RQA), an efficient method for observing critical regime transitions in dynamics; RQA further provides insight into the system's behavior. We observed the floating potential fluctuations of the plasma as a function of discharge voltage using a Langmuir probe. The system exhibits quasi-periodic-chaotic-quasi-periodic-chaotic transitions. These transitions are quantified from the determinism, Lmax, and entropy measures of RQA. Statistical measures such as kurtosis and skewness were also studied for these transitions and are in good agreement with the RQA results.
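
    A minimal RP/RQA sketch using generic formulas and assumed thresholds (a scalar series without embedding, for brevity; not the authors' settings): a recurrence matrix is built from pairwise distances, and determinism (DET), Lmax and the diagonal-line entropy are extracted from the diagonal line structures.

        import numpy as np

        def rqa(x, eps, lmin=2):
            """DET, Lmax and diagonal-line entropy of a scalar series."""
            R = (np.abs(x[:, None] - x[None, :]) < eps).astype(int)
            n = len(x)
            lengths = []
            for k in range(-(n - 1), n):
                if k == 0:                        # skip the line of identity
                    continue
                run = 0
                for v in np.append(np.diag(R, k), 0):
                    if v:
                        run += 1
                    else:
                        if run >= lmin:
                            lengths.append(run)
                        run = 0
            rec = R.sum() - n                     # recurrence points off the LOI
            det = sum(lengths) / rec if rec else 0.0
            lmax = max(lengths, default=0)
            if lengths:
                _, cnt = np.unique(lengths, return_counts=True)
                p = cnt / cnt.sum()
                ent = float(-(p * np.log(p)).sum())
            else:
                ent = 0.0
            return det, lmax, ent

        t = np.linspace(0, 40, 400)
        quasi = np.sin(t) + np.sin(np.sqrt(2) * t)         # quasi-periodic signal
        noise = np.random.default_rng(7).normal(size=400)  # irregular stand-in

        for name, sig in (("quasi-periodic", quasi), ("irregular", noise)):
            det, lmax, ent = rqa(sig, eps=0.2 * sig.std())
            print(f"{name}: DET={det:.2f}, Lmax={lmax}, entropy={ent:.2f}")

    A drop in DET and Lmax as a control parameter (here, discharge voltage) is varied is the kind of quantified signature used to locate quasi-periodic-to-chaotic transitions.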

  2. An Evaluation of CPRA (Cost Performance Report Analysis) Estimate at Completion Techniques Based Upon AFWAL (Air Force Wright Aeronautical Laboratories) Cost/Schedule Control System Criteria Data

    DTIC Science & Technology

    1985-09-01

    [Fragmentary OCR text. Recoverable content includes table-of-contents entries (C/SCSC Terms and Definitions; Cost Performance Report Analysis (CPRA) Program; Description of CPRA Terms and Formulas) and a general linear test comparing two estimate-at-completion (EAC) regressions, apparently with test statistic F* = [SSE1/(n1 - 2)] / [SSE2/(n2 - 2)] and critical value F(alpha; n1 - 2, n2 - 2); a table of the test for EAC1 and EAC5 reports a mean of 827534.056 and a standard deviation of 1202737.882 over 1630 cases.]
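
    A sketch of the variance-ratio test apparently used in the fragment above, under the reconstructed form of the statistic; the residual sums of squares below are illustrative placeholders, not values from the report.

        from scipy import stats

        # Hypothetical residuals for two estimate-at-completion regressions.
        sse1, n1 = 4.2e11, 1630
        sse2, n2 = 2.9e11, 1630

        f_star = (sse1 / (n1 - 2)) / (sse2 / (n2 - 2))   # ratio of error variances
        p_value = stats.f.sf(f_star, n1 - 2, n2 - 2)     # upper-tail probability
        print(f"F* = {f_star:.3f}, p = {p_value:.4g}")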

  3. Conceptual designs for in situ analysis of Mars soil

    NASA Technical Reports Server (NTRS)

    Mckay, C. P.; Zent, A. P.; Hartman, H.

    1991-01-01

    A goal of this research is to develop conceptual designs for instrumentation to perform in situ measurements of the Martian soil in order to determine the existence and nature of any reactive chemicals. Our approach involves assessment and critical review of the Viking biology results which indicated the presence of a soil oxidant, an investigation of the possible application of standard soil science techniques to the analysis of Martian soil, and a preliminary consideration of non-standard methods that may be necessary for use in the highly oxidizing Martian soil. Based on our preliminary analysis, we have developed strawman concepts for standard soil analysis on Mars, including pH, suitable for use on a Mars rover mission. In addition, we have devised a method for the determination of the possible strong oxidants on Mars.

  4. Systems thinking, the Swiss Cheese Model and accident analysis: a comparative systemic analysis of the Grayrigg train derailment using the ATSB, AcciMap and STAMP models.

    PubMed

    Underwood, Peter; Waterson, Patrick

    2014-07-01

    The Swiss Cheese Model (SCM) is the most popular accident causation model and is widely used throughout various industries. A debate exists in the research literature over whether the SCM remains a viable tool for accident analysis. Critics of the model suggest that it provides a sequential, oversimplified view of accidents. Conversely, proponents suggest that it embodies the concepts of systems theory, as per the contemporary systemic analysis techniques. The aim of this paper was to consider whether the SCM can provide a systems thinking approach and remain a viable option for accident analysis. To achieve this, the train derailment at Grayrigg was analysed with an SCM-based model (the ATSB accident investigation model) and two systemic accident analysis methods (AcciMap and STAMP). The analysis outputs and usage of the techniques were compared. The findings of the study showed that each model applied the systems thinking approach. However, the ATSB model and AcciMap graphically presented their findings in a more succinct manner, whereas STAMP more clearly embodied the concepts of systems theory. The study suggests that, whilst the selection of an analysis method is subject to trade-offs that practitioners and researchers must make, the SCM remains a viable model for accident analysis. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Three-Dimensional Images For Robot Vision

    NASA Astrophysics Data System (ADS)

    McFarland, William D.

    1983-12-01

    Robots are attracting increased attention in the industrial productivity crisis. As one significant approach for this nation to maintain technological leadership, the need for robot vision has become critical. The "blind" robot, while occupying an economical niche at present, is severely limited and job specific, being only one step up from numerically controlled machines. To successfully satisfy robot vision requirements, a three-dimensional representation of a real scene must be provided. Several image acquisition techniques are discussed, with more emphasis on laser-radar-type instruments. The autonomous vehicle is also discussed as a robot form, and the requirements for these applications are considered. The total computer vision system requirement is reviewed, with some discussion of the major techniques in the literature for three-dimensional scene analysis.

  6. Support vector machine for automatic pain recognition

    NASA Astrophysics Data System (ADS)

    Monwar, Md Maruf; Rezaei, Siamak

    2009-02-01

    Facial expressions are a key index of emotion, and the interpretation of such expressions is critical to everyday social functioning. In this paper, we present an efficient video analysis technique for recognition of a specific expression, pain, from human faces. We employ an automatic face detector which detects faces in stored video frames using a skin color modeling technique. For pain recognition, location and shape features of the detected faces are computed. These features are then used as inputs to a support vector machine (SVM) for classification. We compare the results with neural network based and eigenimage based automatic pain recognition systems. The experimental results indicate that using a support vector machine as the classifier can certainly improve the performance of an automatic pain recognition system.
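
    The classification stage described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature matrix is random stand-in data for the location/shape features an upstream face detector would supply.

    ```python
    # Minimal sketch (not the authors' code): classifying face location/shape
    # features as "pain" vs "no pain" with a support vector machine.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 20))        # stand-in location/shape features
    y = rng.integers(0, 2, size=200)      # 1 = pain, 0 = no pain

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                              random_state=0)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    clf.fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))
    ```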

  7. New Techniques for the Generation and Analysis of Tailored Microbial Systems on Surfaces.

    PubMed

    Furst, Ariel L; Smith, Matthew J; Francis, Matthew B

    2018-05-17

    The interactions between microbes and surfaces provide critically important cues that control the behavior and growth of the cells. As our understanding of complex microbial communities improves, there is a growing need for experimental tools that can establish and control the spatial arrangements of these cells in a range of contexts. Recent improvements in methods to attach bacteria and yeast to nonbiological substrates, combined with an expanding set of techniques available to study these cells, position this field for many new discoveries. Improving methods for controlling the immobilization of bacteria provides powerful experimental tools for testing hypotheses regarding microbiome interactions, studying the transfer of nutrients between bacterial species, and developing microbial communities for green energy production and pollution remediation.

  8. Implementation of a finite element analysis procedure for structural analysis of shape memory behaviour of fibre reinforced shape memory polymer composites

    NASA Astrophysics Data System (ADS)

    Azzawi, Wessam Al; Epaarachchi, J. A.; Islam, Mainul; Leng, Jinsong

    2017-12-01

    Shape memory polymers (SMPs) offer a unique ability to undergo substantial shape deformation and subsequently recover the original shape when exposed to a particular external stimulus. Comparatively low mechanical properties are the major drawback limiting the extended use of SMPs in engineering applications; however, the inclusion of reinforcing fibres into SMPs improves mechanical properties significantly while retaining the intrinsic shape memory effect. Implementing shape memory polymer composites (SMPCs) in an engineering application is a demanding task that requires thorough materials and design optimization, yet currently available analytical tools have critical limitations for accurate analysis and simulation of SMPC structures, which slows the transfer of breakthrough research outcomes to real-life applications. Many finite element (FE) models have been presented, but the majority require complicated user subroutines to integrate with standard FE software packages; furthermore, those subroutines are problem specific and difficult to apply across a wider range of SMPC materials and related structures. This paper presents an FE simulation technique for modelling the thermomechanical behaviour of SMPCs using the commercial FE software ABAQUS. The proposed technique incorporates the material's time-dependent viscoelastic behaviour. Its ability to predict shape fixity and shape recovery was evaluated against experimental data acquired from the bending of an SMPC cantilever beam, and the excellent correlation between the experimental and FE simulation results confirms the robustness of the proposed technique.
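
    As one hedged illustration of the time-dependent viscoelastic ingredient such FE material models rely on, the sketch below evaluates a generalized-Maxwell (Prony series) relaxation modulus, the form commonly used for polymers; the coefficients are invented for illustration and this is not the authors' ABAQUS implementation.

    ```python
    # Illustrative sketch: Prony-series relaxation modulus of the kind that
    # typically underlies time-dependent viscoelastic FE material models.
    import numpy as np

    def relaxation_modulus(t, E_inf, terms):
        """E(t) = E_inf + sum_i E_i * exp(-t / tau_i)."""
        return E_inf + sum(E_i * np.exp(-t / tau_i) for E_i, tau_i in terms)

    t = np.linspace(0.0, 100.0, 1001)            # seconds
    E = relaxation_modulus(t, E_inf=50.0,        # MPa, long-term modulus
                           terms=[(500.0, 1.0),  # (E_i in MPa, tau_i in s)
                                  (200.0, 10.0)])

    # Stress under a constant strain step eps0 applied at t = 0 relaxes as
    # sigma(t) = E(t) * eps0 (Boltzmann superposition, single step).
    eps0 = 0.02
    sigma = E * eps0
    print(f"stress at t=0: {sigma[0]:.1f} MPa, at t=100 s: {sigma[-1]:.1f} MPa")
    ```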

  9. Characterization of Metal Powders Used for Additive Manufacturing.

    PubMed

    Slotwinski, J A; Garboczi, E J; Stutzman, P E; Ferraris, C F; Watson, S S; Peltz, M A

    2014-01-01

    Additive manufacturing (AM) techniques can produce complex, high-value metal parts, with potential applications as critical parts such as those found in aerospace components. The production of AM parts with consistent and predictable properties requires input materials (e.g., metal powders) with known and repeatable characteristics, which in turn requires standardized measurement methods for powder properties. First, based on our previous work, we assess the applicability of current standardized methods for characterizing metal AM powders. Then we present the results of systematic studies carried out on two different powder materials used for additive manufacturing: stainless steel and cobalt-chrome. The characterization of these powders is important in NIST efforts to develop appropriate measurements and standards for additive materials and to document the properties of powders used in a NIST-led additive manufacturing material round robin. An extensive array of characterization techniques was applied to these two powders, in both virgin and recycled states. The physical techniques included laser diffraction particle size analysis, X-ray computed tomography for size and shape analysis, and optical and scanning electron microscopy. Techniques sensitive to structure and chemistry were also employed, including X-ray diffraction, energy-dispersive X-ray analysis using the X-rays generated during scanning electron microscopy, and X-ray photoelectron spectroscopy. The results of these analyses show how virgin powder changes after being exposed to and recycled from one or more Direct Metal Laser Sintering (DMLS) additive manufacturing build cycles. In addition, these findings can give insight into the actual additive manufacturing process.

  10. Supplemental Conceptual Design Study of an Integrated Voice/Data Switching and Multiplexing Technique for an Access Area Exchange

    DTIC Science & Technology

    1976-11-11

    exchange. The basis for this choice was derived from several factors. One was a timing analysis that was made for certain basic time-critical software...candidate system designs were developed and examined with respect to their capability to demonstrate the workability of the basic concept and for factors...algorithm requires a bit time completion, while SOF production allows byte timing and the involved SOF correlation procedure may be performed during

  11. Global tropospheric chemistry: Chemical fluxes in the global atmosphere

    NASA Technical Reports Server (NTRS)

    Lenschow, Donald H. (Editor); Hicks, Bruce B. (Editor)

    1989-01-01

    In October 1987, NSF, NASA, and NOAA jointly sponsored a workshop at Columbia University to assess the experimental tools and analysis procedures, in use and under development, for measuring and understanding gas and particle fluxes across the critical air-surface boundary. The results of that workshop are presented here. The report summarizes the present understanding of the various measurement techniques that are available, identifies promising new technological developments for improved measurements, and is intended to stimulate thinking about this important measurement challenge.

  12. Neutron dosimetry of the Little Boy device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pederson, R.A.; Plassmann, E.A.

    1984-01-01

    Neutron dose rates at several angular locations and at distances out to 0.5 mile have been measured during critical operation of the Little Boy replica. We used modified rem meters and thermoluminescent dosimetry techniques for the measurements. The present status of our analysis is presented, including estimates of the neutron-dose relaxation length in air and the variation of the neutron-to-gamma-ray dose ratio with distance from the replica. These results are preliminary and are subject to detector calibration measurements.
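
    To illustrate what a relaxation-length estimate involves, the sketch below assumes the common point-source model D(d) = D0·exp(-d/λ)/d² (1/r² geometry times exponential air attenuation) and solves for λ from two dose-rate measurements; the numbers are invented, not the Little Boy data.

    ```python
    # Hedged worked example: estimating the neutron dose relaxation length in
    # air from two dose-rate measurements under an assumed attenuation model.
    import math

    d1, D1 = 200.0, 5.0e-2   # distance (m), dose rate (arbitrary units)
    d2, D2 = 600.0, 1.5e-3

    # From D(d) = D0 * exp(-d/lam) / d**2, the ratio of the two measurements
    # gives lam = (d2 - d1) / ln((D1 * d1**2) / (D2 * d2**2)).
    lam = (d2 - d1) / math.log((D1 * d1**2) / (D2 * d2**2))
    print(f"estimated relaxation length: {lam:.0f} m")
    ```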

  13. Current and future technology in radial and axial gas turbines

    NASA Technical Reports Server (NTRS)

    Rohlik, H. E.

    1983-01-01

    Design approaches and flow analysis techniques currently employed by aircraft engine manufacturers are assessed. Studies were performed to define the characteristics of aircraft and engines for civil missions of the 1990's and beyond. These studies, coupled with experience in recent years, identified the critical technologies needed to meet long range goals in fuel economy and other operating costs. Study results, recent and current research and development programs, and an estimate of future design and analytic capabilities are discussed.

  14. [The modern approaches to organization of delivery system in Nizhniy Novgorod].

    PubMed

    Ryzhova, N K; Lazarev, V N

    2014-01-01

    The article presents data concerning reproductive demographic processes in Nizhniy Novgorod. The number of women of fertile age and the maternal mortality indicator were selected as the objects of analysis. The structure of causes of maternal mortality is presented, and a corresponding classification was developed on its basis. To prevent maternal losses, the development of specialized centers is proposed, along with the implementation of high-tech blood-preserving techniques. The routing and accompaniment of women in critical ("close to death") conditions are also considered.

  15. A Review of Australian Investigations on Aeronautical Fatigue during the Period April 1985 to March 1987.

    DTIC Science & Technology

    1987-04-01

    to the edge, a process such as cold-expansion needs to be well proven before its adoption in service. Secondly, many Nomad aircraft operate in a...including the third front spar) has included extensive use of the FTI cold-expansion process in the fatigue-critical regions in 89 holes. Testing began...ANALYSIS AND REPAIR 9.4.1 Fatigue Life Enhancement (J.Y. Mann - ARL) Cold expansion of bolt holes was one of the techniques used to improve the

  16. In Situ Warming and Soil Venting to Enhance the Biodegradation of JP-4 in Cold Climates: A Critical Study and Analysis

    DTIC Science & Technology

    1995-12-01

    1178-1180 (1991). Atlas, Ronald M. and Richard Bartha. Microbial Ecology: Fundamentals and Applications. 3d ed. Redwood City, CA: The Benjamin/Cummings...technique called bioventing. In cold climates, in situ bioremediation is limited to the summer when soil temperatures are sufficient to support microbial...actively warmed the soil -- warm water circulation and heat tape; the other passively warmed the plot with insulatory covers. Microbial respiration (O2

  17. Efficiency of unconstrained minimization techniques in nonlinear analysis

    NASA Technical Reports Server (NTRS)

    Kamat, M. P.; Knight, N. F., Jr.

    1978-01-01

    Unconstrained minimization algorithms have been critically evaluated for their effectiveness in solving structural problems involving geometric and material nonlinearities. The algorithms have been categorized as being zeroth, first, or second order depending upon the highest derivative of the function required by the algorithm. The sensitivity of these algorithms to the accuracy of derivatives clearly suggests using analytically derived gradients instead of finite difference approximations. The use of analytic gradients results in better control of the number of minimizations required for convergence to the exact solution.
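
    The practical effect of analytic gradients can be sketched with SciPy. The Rosenbrock function below is an assumed stand-in for a nonlinear structural energy functional; the point is only the contrast between finite-difference and exact gradients.

    ```python
    # Sketch: the same first-order minimizer with finite-difference gradients
    # versus an exact analytic gradient.
    import numpy as np
    from scipy.optimize import minimize, rosen, rosen_der

    x0 = np.full(10, 2.0)

    # BFGS with gradients approximated by finite differences.
    fd = minimize(rosen, x0, method="BFGS")
    # BFGS with the analytic gradient supplied via jac=.
    an = minimize(rosen, x0, jac=rosen_der, method="BFGS")

    print("finite-difference:", fd.nfev, "function evaluations")
    print("analytic gradient:", an.nfev, "function evaluations,",
          an.njev, "gradient evaluations")
    ```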

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthew R. Kumjian; Giangrande, Scott E.; Mishra, Subashree

    Polarimetric radar observations are increasingly used to understand cloud microphysical processes, which is critical for improving their representation in cloud and climate models. In particular, there has been recent focus on improving representations of ice collection processes (e.g., aggregation, riming), as these influence precipitation rate, heating profiles, and ultimately cloud life cycles. However, distinguishing these processes with conventional polarimetric radar observations is difficult, as they produce similar fingerprints. This necessitates improved analysis techniques and the integration of complementary data sources; the Midlatitude Continental Convective Clouds Experiment (MC3E) provided such an opportunity.

  19. Are pediatric critical care medicine fellowships teaching and evaluating communication and professionalism?

    PubMed

    Turner, David A; Mink, Richard B; Lee, K Jane; Winkler, Margaret K; Ross, Sara L; Hornik, Christoph P; Schuette, Jennifer J; Mason, Katherine; Storgion, Stephanie A; Goodman, Denise M

    2013-06-01

    To describe the teaching and evaluation modalities used by pediatric critical care medicine training programs in the areas of professionalism and communication. Cross-sectional national survey. Pediatric critical care medicine fellowship programs. Pediatric critical care medicine program directors. None. Survey response rate was 67% of program directors in the United States, representing educators for 73% of current pediatric critical care medicine fellows. Respondents had a median of 4 years experience, with a median of seven fellows and 12 teaching faculty in their program. Faculty role modeling or direct observation with feedback were the most common modalities used to teach communication. However, six of the eight (75%) required elements of communication evaluated were not specifically taught by all programs. Faculty role modeling was the most commonly used technique to teach professionalism in 44% of the content areas evaluated, and didactics was the technique used in 44% of other professionalism content areas. Thirteen of the 16 required elements of professionalism (81%) were not taught by all programs. Evaluations by members of the healthcare team were used for assessment for both competencies. The use of a specific teaching technique was not related to program size, program director experience, or training in medical education. A wide range of techniques are currently used within pediatric critical care medicine to teach communication and professionalism, but there are a number of required elements that are not specifically taught by fellowship programs. These areas of deficiency represent opportunities for future investigation and improved education in the important competencies of communication and professionalism.

  20. Influence of a Levelness Defect in a Thrust Bearing on the Dynamic Behaviour of AN Elastic Shaft

    NASA Astrophysics Data System (ADS)

    BERGER, S.; BONNEAU, O.; FRÊNE, J.

    2002-01-01

    This paper examines the non-linear dynamic behaviour of a flexible shaft. The shaft is mounted on two journal bearings and the axial load is supported by a defective hydrodynamic thrust bearing at one end. The defect is a levelness defect of the rotor. The thrust bearing behaviour must be considered non-linear because of the effects of the defect. The shaft is modelled with typical beam finite elements, including gyroscopic effects. A modal technique is used to reduce the number of degrees of freedom. Results show that the thrust bearing defect introduces supplementary critical speeds, which a linear approach cannot reveal and which are obtained only by non-linear analysis.
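
    A minimal sketch of the modal-reduction step, assuming a toy 10-degree-of-freedom spring-mass chain rather than the authors' shaft model:

    ```python
    # Sketch: modal truncation of a mass/stiffness system, keeping the 3
    # lowest modes of a 10-DOF spring-mass chain.
    import numpy as np
    from scipy.linalg import eigh

    n, kept = 10, 3
    M = np.eye(n)                                                  # unit masses
    K = (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) * 1e4   # N/m

    # Solve K phi = w^2 M phi; eigh returns eigenvalues in ascending order
    # and M-orthonormal eigenvectors.
    w2, Phi = eigh(K, M)
    T = Phi[:, :kept]                    # modal transformation matrix

    M_r = T.T @ M @ T                    # reduced (kept x kept) matrices
    K_r = T.T @ K @ T
    print("retained natural frequencies (rad/s):",
          np.sqrt(np.diag(K_r) / np.diag(M_r)))
    ```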

  1. Applying formal methods and object-oriented analysis to existing flight software

    NASA Technical Reports Server (NTRS)

    Cheng, Betty H. C.; Auernheimer, Brent

    1993-01-01

    Correctness is paramount for safety-critical software control systems. Critical software failures in medical radiation treatment, communications, and defense are familiar to the public. The significant quantity of software malfunctions regularly reported to the software engineering community, the laws concerning liability, and a recent NRC Aeronautics and Space Engineering Board report additionally motivate the use of error-reducing and defect-detection software development techniques. The benefits of formal methods in requirements-driven software development ('forward engineering') are well documented. One advantage of rigorously engineering software is that formal notations are precise, verifiable, and facilitate automated processing. This paper describes the application of formal methods to reverse engineering, where formal specifications are developed for a portion of the shuttle on-orbit digital autopilot (DAP). The three objectives of the project were to: demonstrate the use of formal methods on a shuttle application, facilitate the incorporation and validation of new requirements for the system, and verify the safety-critical properties to be exhibited by the software.

  2. A Study Combining Criticism and Qualitative Research Techniques for Appraising Classroom Media.

    ERIC Educational Resources Information Center

    Swartz, James D.

    Qualitative criticism is a method of understanding things, actions, and events within a social framework. It is a method of acquiring knowledge to guide decision making based on local knowledge and a synthesis of principles from criticism and qualitative research. The function of qualitative criticism is centered with Richard Rorty's theoretical…

  3. The Use of Argument Mapping to Enhance Critical Thinking Skills in Business Education

    ERIC Educational Resources Information Center

    Kunsch, David W.; Schnarr, Karin; van Tyle, Russell

    2014-01-01

    Complex business problems require enhanced critical thinking skills. In a dedicated, in-person critical thinking class, argument mapping techniques were used in conjunction with business and nonbusiness case studies to build the critical thinking skills of a group of master of business administration students. Results demonstrated that the…

  4. Standoff detection of explosives: critical comparison for ensuing options on Raman spectroscopy-LIBS sensor fusion.

    PubMed

    Moros, J; Lorenzo, J A; Laserna, J J

    2011-07-01

    In general, any standoff sensor for the effective detection of explosives must meet two basic requirements: first, a capacity to detect the response generated from only a small amount of material located at a distance of several meters (high sensitivity) and second, the ability to provide easily distinguishable responses for different materials (high specificity). Raman spectroscopy and laser-induced breakdown spectroscopy (LIBS) are two analytical techniques which share similar instrumentation and, at the same time, generate complementary data. These factors have recently been taken into account in the design of sensors for the detection of explosives, and research on the proper integration of both techniques has been under way for some time. A priori, the different operational conditions required by the two techniques oblige the response of each sensor to be acquired through sequential analysis, which first requires a proper hierarchy of actuation to be defined. However, such an approach does not guarantee that the Raman and LIBS responses obtained relate to each other. Nonetheless, the possible advantages arising from the integration of molecular and elemental spectroscopic information come with an obvious underlying requirement: simultaneous data acquisition. In the present paper, the strong and weak points of Raman spectroscopy and LIBS for solving explosives detection problems, in terms of selectivity, sensitivity, and throughput, are critically examined, discussed, and compared to assess the ensuing options for the fusion of the responses of both sensing technologies.

  5. Applying data fusion techniques for benthic habitat mapping and monitoring in a coral reef ecosystem

    NASA Astrophysics Data System (ADS)

    Zhang, Caiyun

    2015-06-01

    Accurate mapping and effective monitoring of benthic habitat in the Florida Keys are critical in developing management strategies for this valuable coral reef ecosystem. For this study, a framework was designed for automated benthic habitat mapping by combining multiple data sources (hyperspectral, aerial photography, and bathymetry data) and four contemporary image processing techniques (data fusion, Object-based Image Analysis (OBIA), machine learning, and ensemble analysis). In the framework, the 1-m digital aerial photograph was first merged with 17-m hyperspectral imagery and 10-m bathymetry data using a pixel/feature-level fusion strategy. The fused dataset was then preclassified by three machine learning algorithms (Random Forest, Support Vector Machines, and k-Nearest Neighbor). Final object-based habitat maps were produced through ensemble analysis of the outcomes from the three classifiers. The framework was tested for classifying group-level (3-class) and code-level (9-class) habitats in a portion of the Florida Keys. Informative and accurate habitat maps were achieved, with overall accuracies of 88.5% and 83.5% for the group-level and code-level classifications, respectively.
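
    The ensemble step can be sketched with scikit-learn; the randomly generated features below are assumed stand-ins for the fused object-level spectral and bathymetry features, and the three classifiers mirror those named in the abstract.

    ```python
    # Sketch: three classifiers (Random Forest, SVM, k-NN) vote on the
    # habitat class of each image object.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier, VotingClassifier
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.svm import SVC

    rng = np.random.default_rng(1)
    X = rng.normal(size=(300, 12))       # stand-in fused object features
    y = rng.integers(0, 3, size=300)     # 3 group-level habitat classes

    ensemble = VotingClassifier(
        estimators=[("rf", RandomForestClassifier(n_estimators=200,
                                                  random_state=1)),
                    ("svm", SVC(probability=True, random_state=1)),
                    ("knn", KNeighborsClassifier(n_neighbors=5))],
        voting="soft",                   # average predicted class probabilities
    )
    ensemble.fit(X, y)
    print("training accuracy:", ensemble.score(X, y))
    ```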

  6. Orion Exploration Flight Test Post-Flight Inspection and Analysis

    NASA Technical Reports Server (NTRS)

    Miller, J. E.; Berger, E. L.; Bohl, W. E.; Christiansen, E. L.; Davis, B. A.; Deighton, K. D.; Enriquez, P. A.; Garcia, M. A.; Hyde, J. L.; Oliveras, O. M.

    2017-01-01

    The principal mechanism for developing orbital debris environment models is to make observations of larger pieces of debris, in the range of several centimeters and greater, using radar and optical techniques. For particles smaller than this threshold, models of breakup and of particle migration to returned surfaces in lower orbit are relied upon to quantify the flux. This reliance on models to derive the spatial densities of particles that are of critical importance to spacecraft makes the unique nature of EFT-1's returned surface especially valuable. To this end, detailed post-flight inspections of the returned EFT-1 backshell have been performed, and the inspections identified six candidate impact sites that were not present during the pre-flight inspections. This paper describes the post-flight analysis efforts to characterize the EFT-1 mission craters. This effort included ground-based testing to understand small-particle impact craters in the thermal protection material; the pre- and post-flight inspections; crater analysis using optical, X-ray computed tomography (CT) and scanning electron microscope (SEM) techniques; and numerical simulations.

  7. Emerging surface characterization techniques for carbon steel corrosion: a critical brief review.

    PubMed

    Dwivedi, D; Lepkova, K; Becker, T

    2017-03-01

    Carbon steel is a preferred construction material in many industrial and domestic applications, including oil and gas pipelines, where corrosion mitigation using film-forming corrosion inhibitor formulations is a widely accepted method. This review identifies surface analytical techniques that are considered suitable for analysis of thin films at metallic substrates, but are yet to be applied to analysis of carbon steel surfaces in corrosive media or treated with corrosion inhibitors. The reviewed methods include time of flight-secondary ion mass spectrometry, X-ray absorption spectroscopy methods, particle-induced X-ray emission, Rutherford backscatter spectroscopy, Auger electron spectroscopy, electron probe microanalysis, near-edge X-ray absorption fine structure spectroscopy, X-ray photoemission electron microscopy, low-energy electron diffraction, small-angle neutron scattering and neutron reflectometry, and conversion electron Moessbauer spectrometry. Advantages and limitations of the analytical methods in thin-film surface investigations are discussed. Technical parameters of nominated analytical methods are provided to assist in the selection of suitable methods for analysis of metallic substrates deposited with surface films. The challenges associated with the applications of the emerging analytical methods in corrosion science are also addressed.

  8. Emerging surface characterization techniques for carbon steel corrosion: a critical brief review

    NASA Astrophysics Data System (ADS)

    Dwivedi, D.; Lepkova, K.; Becker, T.

    2017-03-01

    Carbon steel is a preferred construction material in many industrial and domestic applications, including oil and gas pipelines, where corrosion mitigation using film-forming corrosion inhibitor formulations is a widely accepted method. This review identifies surface analytical techniques that are considered suitable for analysis of thin films at metallic substrates, but are yet to be applied to analysis of carbon steel surfaces in corrosive media or treated with corrosion inhibitors. The reviewed methods include time of flight-secondary ion mass spectrometry, X-ray absorption spectroscopy methods, particle-induced X-ray emission, Rutherford backscatter spectroscopy, Auger electron spectroscopy, electron probe microanalysis, near-edge X-ray absorption fine structure spectroscopy, X-ray photoemission electron microscopy, low-energy electron diffraction, small-angle neutron scattering and neutron reflectometry, and conversion electron Moessbauer spectrometry. Advantages and limitations of the analytical methods in thin-film surface investigations are discussed. Technical parameters of nominated analytical methods are provided to assist in the selection of suitable methods for analysis of metallic substrates deposited with surface films. The challenges associated with the applications of the emerging analytical methods in corrosion science are also addressed.

  9. Hypersonic vehicle control law development using H infinity and mu-synthesis

    NASA Technical Reports Server (NTRS)

    Gregory, Irene M.; Chowdhry, Rajiv S.; Mcminn, John D.; Shaughnessy, John D.

    1992-01-01

    The applicability and effectiveness of robust control techniques for a single-stage-to-orbit (SSTO) airbreathing hypersonic vehicle on an accelerating ascent path are explored in this paper. An SSTO control system design problem, requiring high-accuracy tracking of velocity and altitude commands while limiting angle-of-attack oscillations, minimizing control power usage, and stabilizing the vehicle, all in the presence of atmospheric turbulence and uncertainty in the system, was formulated to compare the results of control designs using H infinity and mu-synthesis procedures. The math model, an integrated flight/propulsion dynamic model of a conical accelerator class vehicle, was linearized as the vehicle accelerated through Mach 8. Controller analysis was conducted using the singular value technique and the mu-analysis approach. Analysis results were obtained in both the frequency and the time domains. The results clearly demonstrate the inherent advantages of the structured singular value framework for this class of problems. Since payload performance margins are so critical for the SSTO mission, it is crucial that adequate stability margins be provided without sacrificing any payload mass.
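
    A hedged sketch of what an H infinity synthesis call looks like using the python-control package (which relies on the optional slycot backend for this routine); the toy two-state generalized plant below is an assumption for illustration, not the SSTO vehicle model.

    ```python
    # Sketch: H-infinity controller synthesis for a toy generalized plant
    # with inputs [disturbance w, control u] and outputs [performance z,
    # measurement y].
    import numpy as np
    import control

    A = np.array([[-1.0, 1.0],
                  [0.0, -2.0]])
    B = np.array([[1.0, 0.0],    # columns: [w, u]
                  [0.0, 1.0]])
    C = np.array([[1.0, 0.0],    # rows: [z, y]
                  [0.0, 1.0]])
    D = np.array([[0.0, 1.0],    # z = x1 + u, y = x2 + w
                  [1.0, 0.0]])   # (keeps D12/D21 full rank, as required)
    P = control.ss(A, B, C, D)

    # Synthesize a controller: 1 measurement, 1 control input.
    K, CL, gam, rcond = control.hinfsyn(P, 1, 1)
    print("achieved closed-loop H-infinity norm bound:", gam)
    ```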

  10. True Ortho Generation of Urban Area Using High Resolution Aerial Photos

    NASA Astrophysics Data System (ADS)

    Hu, Yong; Stanley, David; Xin, Yubin

    2016-06-01

    The pros and cons of existing methods for true ortho generation are analyzed based on a critical literature review of its two major processing stages: visibility analysis and occlusion compensation. Existing methods process frame and pushbroom images using different algorithms for visibility analysis, due to the need for the perspective centers used by z-buffer (and similar) techniques. For occlusion compensation, the pixel-based approach tends to produce excessive seamlines in the ortho-rectified images because it rates a quality measure pixel by pixel. In this paper, we propose innovative solutions to these problems. For visibility analysis, an elevation buffer technique is introduced that employs plain elevations instead of the distances from perspective centers used by the z-buffer, and it has the advantage of sensor independence. For occlusion compensation, a segment-oriented strategy is developed that evaluates a plain cost measure per segment instead of the tedious quality rating per pixel. The cost measure directly evaluates the imaging geometry characteristics in ground space and is also sensor independent. Experimental results are demonstrated using aerial photos acquired by an UltraCam camera.
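
    A conceptual sketch of the elevation-buffer idea follows, with an assumed toy projection function standing in for a real sensor model: each DSM cell is projected into image space, a per-pixel buffer keeps the highest elevation seen, and any cell lower than the buffered value at its pixel is flagged as occluded.

    ```python
    # Conceptual sketch (assumptions throughout) of elevation-buffer
    # visibility analysis for true-ortho generation.
    import numpy as np

    def visibility_mask(dsm, project):
        """dsm: (H, W) elevations; project: maps (row, col) -> image pixel."""
        buf = {}                                 # image pixel -> max elevation
        for idx, z in np.ndenumerate(dsm):
            p = project(idx)
            if z > buf.get(p, -np.inf):
                buf[p] = z
        visible = np.zeros(dsm.shape, dtype=bool)
        for idx, z in np.ndenumerate(dsm):
            visible[idx] = z >= buf[project(idx)]
        return visible

    # Toy projection that collapses pairs of columns onto one image pixel,
    # so a tall structure hides the lower cell sharing its pixel.
    dsm = np.array([[1.0, 5.0, 1.0, 1.0]])
    mask = visibility_mask(dsm, lambda rc: (rc[0], rc[1] // 2))
    print(mask)    # [[False  True  True  True]]
    ```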

  11. Emerging surface characterization techniques for carbon steel corrosion: a critical brief review

    PubMed Central

    Dwivedi, D.; Becker, T.

    2017-01-01

    Carbon steel is a preferred construction material in many industrial and domestic applications, including oil and gas pipelines, where corrosion mitigation using film-forming corrosion inhibitor formulations is a widely accepted method. This review identifies surface analytical techniques that are considered suitable for analysis of thin films at metallic substrates, but are yet to be applied to analysis of carbon steel surfaces in corrosive media or treated with corrosion inhibitors. The reviewed methods include time of flight-secondary ion mass spectrometry, X-ray absorption spectroscopy methods, particle-induced X-ray emission, Rutherford backscatter spectroscopy, Auger electron spectroscopy, electron probe microanalysis, near-edge X-ray absorption fine structure spectroscopy, X-ray photoemission electron microscopy, low-energy electron diffraction, small-angle neutron scattering and neutron reflectometry, and conversion electron Moessbauer spectrometry. Advantages and limitations of the analytical methods in thin-film surface investigations are discussed. Technical parameters of nominated analytical methods are provided to assist in the selection of suitable methods for analysis of metallic substrates deposited with surface films. The challenges associated with the applications of the emerging analytical methods in corrosion science are also addressed. PMID:28413351

  12. Elemental analysis by IBA and NAA — A critical comparison

    NASA Astrophysics Data System (ADS)

    Watterson, J. I. W.

    1988-12-01

    In this review neutron activation analysis (NAA) and ion beam analysis (IBA) have been compared in the context of the entire field of analytical science using the discipline of scientometrics, as developed by Braun and Lyon. This perspective on the relative achievements of the two methods is modified by considering and comparing their particular attributes and characteristics, particularly in relation to their differing degree of maturity. This assessment shows that NAA, as the more mature method, is the most widely applied nuclear technique, but the special capabilities of IBA give it the ability to provide information about surface composition and elemental distribution that is unique, while it is still relatively immature and it is not yet possible to define its ultimate role with any confidence.

  13. Genome-Wide High-Resolution aCGH Analysis of Gestational Choriocarcinomas

    PubMed Central

    Poaty, Henriette; Coullin, Philippe; Peko, Jean Félix; Dessen, Philippe; Diatta, Ange Lucien; Valent, Alexander; Leguern, Eric; Prévot, Sophie; Gombé-Mbalawa, Charles; Candelier, Jean-Jacques; Picard, Jean-Yves; Bernheim, Alain

    2012-01-01

    Eleven samples of DNA from choriocarcinomas were studied by high-resolution 244K array CGH. They were studied after histopathological confirmation of the diagnosis and of the androgenetic etiology, and after a microsatellite marker analysis confirming the absence of contamination of tumor DNA by maternal DNA. Three cell lines, BeWo, JAR and JEG, were also studied by this high-resolution pangenomic technique. According to the aCGH analysis, the de novo choriocarcinomas exhibited simple chromosomal rearrangements or normal profiles, whereas the cell lines showed various and complex chromosomal aberrations. Twenty-three minimal critical regions were defined, allowing us to list the genes potentially implicated; among them, unusually high numbers of microRNA clusters and imprinted genes were observed. PMID:22253721

  14. NASA Marshall Space Flight Center Controls Systems Design and Analysis Branch

    NASA Technical Reports Server (NTRS)

    Gilligan, Eric

    2014-01-01

    Marshall Space Flight Center maintains a critical national capability in the analysis of launch vehicle flight dynamics and flight certification of GN&C algorithms. MSFC analysts are domain experts in the areas of flexible-body dynamics and control-structure interaction, thrust vector control, sloshing propellant dynamics, and advanced statistical methods. Marshall's modeling and simulation expertise has supported manned spaceflight for over 50 years. Marshall's unparalleled capability in launch vehicle guidance, navigation, and control technology stems from its rich heritage in developing, integrating, and testing launch vehicle GN&C systems dating to the early Mercury-Redstone and Saturn vehicles. The Marshall team is continuously developing novel methods for design, including advanced techniques for large-scale optimization and analysis.

  15. A temporal phase unwrapping algorithm for photoelastic stress analysis

    NASA Astrophysics Data System (ADS)

    Baldi, Antonio; Bertolino, Filippo; Ginesu, Francesco

    2007-05-01

    Photoelastic stress analysis is a full-field optical technique for experimental stress analysis whose automation has received considerable research attention over the last 15 years. The latest developments have been made possible largely by the availability of powerful computers with large memory capacity and of colour, high-resolution cameras. A further stimulus is provided by the photoelastic resins now used for rapid prototyping. However, one critical aspect which still deserves attention is phase unwrapping. The algorithms most commonly used for this purpose were developed in other scientific areas (classical interferometry, profilometry, moiré, etc.) to solve different problems. In this article a new algorithm is proposed for temporal phase unwrapping, which offers several advantages over those used today.
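
    The temporal idea can be sketched in a few lines: each pixel's wrapped phase is unwrapped independently along the load/time axis, so errors cannot propagate spatially across the fringe field. This illustrates the principle only, not the authors' algorithm, and uses synthetic phase data.

    ```python
    # Sketch: temporal phase unwrapping, pixel by pixel along the time axis.
    import numpy as np

    steps, h, w = 12, 4, 4
    true_phase = (np.linspace(0, 6 * np.pi, steps)[:, None, None]
                  * np.ones((h, w)))
    wrapped = np.angle(np.exp(1j * true_phase))   # wrap into (-pi, pi]

    # Unwrap along axis 0 (time); no spatial neighbours are involved.
    unwrapped = np.unwrap(wrapped, axis=0)
    print(np.allclose(unwrapped, true_phase))     # True
    ```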

  16. Time-Series Analysis: A Cautionary Tale

    NASA Technical Reports Server (NTRS)

    Damadeo, Robert

    2015-01-01

    Time-series analysis has often been a useful tool in atmospheric science for deriving long-term trends in various atmospherically important parameters (e.g., temperature or the concentration of trace gas species). In particular, time-series analysis has been repeatedly applied to satellite datasets in order to derive the long-term trends in stratospheric ozone, which is a critical atmospheric constituent. However, many of the potential pitfalls relating to the non-uniform sampling of the datasets were often ignored and the results presented by the scientific community have been unknowingly biased. A newly developed and more robust application of this technique is applied to the Stratospheric Aerosol and Gas Experiment (SAGE) II version 7.0 ozone dataset and the previous biases and newly derived trends are presented.
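
    A hedged sketch of the regression step such trend studies typically use, fitting a linear trend plus an annual harmonic to irregularly sampled synthetic data (not SAGE II values):

    ```python
    # Sketch: ordinary least squares trend fit with a seasonal harmonic on
    # non-uniformly sampled synthetic data.
    import numpy as np

    rng = np.random.default_rng(2)
    t = np.sort(rng.uniform(0.0, 20.0, 500))       # years, non-uniform sampling
    y = (-0.5 * t + 2.0 * np.sin(2 * np.pi * t)    # trend + seasonal cycle
         + rng.normal(0, 0.5, t.size))

    # Design matrix: intercept, linear trend, annual sine/cosine pair.
    X = np.column_stack([np.ones_like(t), t,
                         np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    print(f"fitted trend: {coef[1]:.3f} per year (true: -0.500)")
    ```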

  17. Survival analysis: A consumer-friendly method to estimate the optimum sucrose level in probiotic petit suisse.

    PubMed

    Esmerino, E A; Paixão, J A; Cruz, A G; Garitta, L; Hough, G; Bolini, H M A

    2015-11-01

    For years, just-about-right (JAR) scales have been among the most used techniques to obtain sensory information about consumer perception, but recently, some researchers have harshly criticized the technique. The present study aimed to apply survival analysis to estimate the optimum sucrose concentration in probiotic petit suisse cheese and compare the survival analysis to JAR scales to verify which technique more accurately predicted the optimum sucrose concentration according to consumer acceptability. Two panels of consumers (total=170) performed affective tests to determine the optimal concentration of sucrose in probiotic petit suisse using 2 different methods of analysis: JAR scales (n=85) and survival analysis (n=85). Then an acceptance test was conducted using naïve consumers (n=100) between 18 and 60 yr old, with 2 samples of petit suisse, one with the ideal sucrose determined by JAR scales and the other with the ideal sucrose content determined by survival analysis, to determine which formulation was in accordance with consumer acceptability. The results indicate that the 2 sensory methods were equally effective in predicting the optimum sucrose level in probiotic petit suisse cheese, and no significant differences were detected in any of the characteristics related to liking evaluated. However, survival analysis has important advantages over the JAR scales. Survival analysis has shown the potential to be an advantageous tool for dairy companies because it was able to accurately predict the optimum sucrose content in a consumer-friendly way and was also practical for researchers because experimental sensory work is simpler and has been shown to be more cost effective than JAR scales without losses of consumer acceptability. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
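
    A simplified sketch of the survival-analysis idea as applied to sensory data: each consumer's ideal-sucrose level is known only to an interval between the concentrations tasted, and a parametric distribution is fitted to those interval-censored data by maximum likelihood. The data, the interval design, and the log-normal choice below are all assumptions for illustration, not the study's values.

    ```python
    # Sketch: log-normal MLE on interval-censored "ideal sucrose" data.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    # (low, high) sucrose bounds (%) bracketing each consumer's ideal level.
    intervals = np.array([(4, 6), (6, 8), (4, 6), (8, 10), (6, 8),
                          (6, 8), (8, 10), (4, 6), (6, 8), (6, 8)], float)

    def neg_log_lik(theta):
        mu, sigma = theta[0], np.exp(theta[1])    # sigma > 0 via log-param
        lo, hi = np.log(intervals[:, 0]), np.log(intervals[:, 1])
        p = norm.cdf(hi, mu, sigma) - norm.cdf(lo, mu, sigma)
        return -np.sum(np.log(np.clip(p, 1e-12, None)))

    res = minimize(neg_log_lik, x0=[np.log(7.0), np.log(0.2)])
    print(f"estimated optimum sucrose: {np.exp(res.x[0]):.1f}%")
    ```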

  18. Critical Response Protocol

    ERIC Educational Resources Information Center

    Ellingson, Charlene; Roehrig, Gillian; Bakkum, Kris; Dubinsky, Janet M.

    2016-01-01

    This article introduces the Critical Response Protocol (CRP), an arts-based technique that engages students in equitable critical discourse and aligns with the "Next Generation Science Standards" vision for providing students opportunities for language learning while advancing science learning (NGSS Lead States 2013). CRP helps teachers…

  19. A protocol for the development of a critical thinking assessment tool for nurses using a Delphi technique.

    PubMed

    Jacob, Elisabeth; Duffield, Christine; Jacob, Darren

    2017-08-01

    The aim of this study was to develop an assessment tool to measure the critical thinking ability of nurses. As increasing numbers of complex patients are admitted to hospitals, it is ever more important that nurses recognize changes in health status and pick up on deterioration. Detecting early signs of complication requires critical thinking skills. Registered Nurses are expected to commence their clinical careers with the critical thinking skills necessary to ensure safe nursing practice. Currently, there is no published tool to assess critical thinking skills that is context specific to Australian nurses. A modified Delphi study will be used for the project. This study will develop a series of unfolding case scenarios, using national health data, with multiple-choice questions to assess critical thinking. Face validity of the scenarios will be determined by an expert reference group of clinical and academic nurses. A Delphi study will determine the answers to the scenario questions. Panel members will be expert clinicians and educators from two states in Australia. Rasch analysis of the questionnaire will assess the validity and reliability of the tool. Funding for the study and Research Ethics Committee approval were obtained in March and November 2016, respectively. Patient outcomes and safety are directly linked to nurses' critical thinking skills. This study will develop an assessment tool to provide a standardized method of measuring nurses' critical thinking skills across Australia. This will give healthcare providers greater confidence in the critical thinking level of graduate Registered Nurses. © 2017 John Wiley & Sons Ltd.

  20. Challenges faced by nurses in managing pain in a critical care setting.

    PubMed

    Subramanian, Pathmawathi; Allcock, Nick; James, Veronica; Lathlean, Judith

    2012-05-01

    To explore nurses' challenges in managing pain among ill patients in critical care. Pain can lead to many adverse medical consequences, and providing pain relief is central to caring for ill patients. Effective pain management is vital, since studies show that patients admitted to critical care units still suffer from significant levels of acute pain. The effective delivery of care in clinical areas remains a challenge for nurses involved in the dynamic and constantly changing care of the critically ill. Qualitative prospective exploratory design. This study employed semi-structured interviews with nurses, using the critical incident technique. Twenty-one nurses were selected from critical care settings in a large acute teaching healthcare trust in the UK. A critical incident interview guide was constructed from the literature and used to elicit responses. Framework analysis showed that nurses perceived four main challenges in managing pain, namely a lack of clinical guidelines, the lack of a structured pain assessment tool, limited autonomy in decision making, and the patient's condition itself. Nurses' decision making and pain management can influence the quality of care given to critically ill patients. It is important to overcome the clinical problems faced when dealing with patients' pain experiences. There is a need for nursing education on pain management. Providing up-to-date and practical strategies may help to reduce nurses' challenges in managing pain among critically ill patients. Broader autonomy and effective decision making would benefit nurses, as would clearer and more structured pain management guidelines. © 2011 Blackwell Publishing Ltd.
