Diffraction as a Method of Critical Policy Analysis
ERIC Educational Resources Information Center
Ulmer, Jasmine B.
2016-01-01
Recent developments in critical policy analysis have occurred alongside the new materialisms in qualitative research. These lines of scholarship have unfolded along two separate, but related, tracks. In particular, the new materialist method of "diffraction" aligns with many elements of critical policy analysis. Both involve critical…
A method for identifying EMI critical circuits during development of a large C3
NASA Astrophysics Data System (ADS)
Barr, Douglas H.
The circuit analysis methods and process Boeing Aerospace used on a large, ground-based military command, control, and communications (C3) system are described. This analysis was designed to help identify electromagnetic interference (EMI) critical circuits. The methodology used the MIL-E-6051 equipment criticality categories as the basis for defining critical circuits, relational database technology to help sort through and account for all of the approximately 5000 system signal cables, and Macintosh Plus personal computers to predict critical circuits based on safety margin analysis. The EMI circuit analysis process systematically examined all system circuits to identify which ones were likely to be EMI critical. The process used two separate, sequential safety margin analyses to identify critical circuits (conservative safety margin analysis, and detailed safety margin analysis). These analyses used field-to-wire and wire-to-wire coupling models with both worst-case and detailed circuit parameters (physical and electrical) to predict circuit safety margins. This process identified the predicted critical circuits that could then be verified by test.
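The two-pass screening logic described above can be sketched as follows: circuits whose predicted safety margin falls below a required margin under the conservative worst-case analysis move on to the detailed analysis, and those failing both are flagged as predicted critical. The 6 dB figure and all circuit values are illustrative assumptions, not the MIL-E-6051 categories or Boeing's actual data.

```python
def screen(circuits, required_margin_db=6.0):
    """Two-pass EMI screen: circuits failing the conservative
    (worst-case) margin go on to the detailed analysis; those that
    also fail it are flagged as predicted critical."""
    suspects = [c for c in circuits
                if c["threshold_db"] - c["worst_case_db"] < required_margin_db]
    return [c["name"] for c in suspects
            if c["threshold_db"] - c["detailed_db"] < required_margin_db]

# Hypothetical circuits: susceptibility threshold vs. predicted coupling
circuits = [
    {"name": "A", "threshold_db": 20, "worst_case_db": 18, "detailed_db": 10},
    {"name": "B", "threshold_db": 20, "worst_case_db": 5,  "detailed_db": 3},
    {"name": "C", "threshold_db": 20, "worst_case_db": 16, "detailed_db": 15},
]
print(screen(circuits))   # only "C" fails both screens
```

Circuit A fails the conservative screen but clears the detailed one; circuit B never becomes a suspect, so its detailed analysis is skipped, which is the point of the sequential design.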
Bonan, Brigitte; Martelli, Nicolas; Berhoune, Malik; Maestroni, Marie-Laure; Havard, Laurent; Prognon, Patrice
2009-02-01
To apply the Hazard Analysis and Critical Control Points method to the preparation of anti-cancer drugs. To identify critical control points in our cancer chemotherapy process and to propose control measures and corrective actions to manage these processes. The Hazard Analysis and Critical Control Points application began in January 2004 in our centralized chemotherapy compounding unit. From October 2004 to August 2005, monitoring of the process nonconformities was performed to assess the method. According to the Hazard Analysis and Critical Control Points method, a multidisciplinary team was formed to describe and assess the cancer chemotherapy process. This team listed all of the critical points and calculated their risk indexes according to their frequency of occurrence, their severity and their detectability. The team defined monitoring, control measures and corrective actions for each identified risk. Finally, over a 10-month period, pharmacists reported each nonconformity of the process in a follow-up document. Our team described 11 steps in the cancer chemotherapy process. The team identified 39 critical control points, including 11 of higher importance with a high-risk index. Over 10 months, 16,647 preparations were performed; 1225 nonconformities were reported during this same period. The Hazard Analysis and Critical Control Points method is relevant when it is used to target a specific process such as the preparation of anti-cancer drugs. This method helped us to focus on the production steps that can have a critical influence on product quality, and led us to improve our process.
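The risk-index scoring described above (frequency of occurrence × severity × detectability) can be sketched in a few lines. The 1-5 scales and the example control points are hypothetical illustrations, not the unit's actual scoring grid.

```python
def risk_index(frequency, severity, detectability):
    """Risk index as the product of three ordinal scores (1 = best,
    5 = worst); the 1-5 scales are illustrative, not the unit's grid."""
    for s in (frequency, severity, detectability):
        if not 1 <= s <= 5:
            raise ValueError("scores must lie in 1..5")
    return frequency * severity * detectability

# Hypothetical critical control points in a compounding process
ccps = {
    "dose calculation check": (2, 5, 3),
    "label verification":     (3, 4, 2),
    "aseptic transfer":       (2, 5, 4),
}
ranked = sorted(ccps, key=lambda name: risk_index(*ccps[name]), reverse=True)
print(ranked)   # highest-risk step first
```

Sorting by the index is what lets a team single out the "higher importance" control points for tighter monitoring.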
Rhetorical Analysis in Critical Policy Research
ERIC Educational Resources Information Center
Winton, Sue
2013-01-01
Rhetorical analysis, an approach to critical discourse analysis, is presented as a useful method for critical policy analysis and its effort to understand the role policies play in perpetuating inequality. A rhetorical analysis of Character "Matters!", the character education policy of a school board in Ontario, Canada, provides an…
Accurate estimates of 3D Ising critical exponents using the coherent-anomaly method
NASA Astrophysics Data System (ADS)
Kolesik, Miroslav; Suzuki, Masuo
1995-02-01
An analysis of the critical behavior of the three-dimensional Ising model using the coherent-anomaly method (CAM) is presented. Various sources of errors in CAM estimates of critical exponents are discussed, and an improved scheme for the CAM data analysis is tested. Using a set of mean-field type approximations based on the variational series expansion approach, accuracy comparable to the most precise conventional methods has been achieved. Our results for the critical exponents are given by α = 0.108(5), β = 0.327(4), γ = 1.237(4) and δ = 4.77(5).
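The reported exponents can be checked for internal consistency against the standard scaling relations (Rushbrooke: α + 2β + γ = 2; Griffiths: γ = β(δ − 1)). This quick check is ours, not part of the paper.

```python
# Central values of the reported 3D Ising exponent estimates
alpha, beta, gamma, delta = 0.108, 0.327, 1.237, 4.77

rushbrooke = alpha + 2 * beta + gamma   # scaling predicts exactly 2
griffiths  = beta * (delta - 1)         # scaling predicts gamma

print(f"alpha + 2*beta + gamma = {rushbrooke:.3f}")
print(f"beta * (delta - 1)     = {griffiths:.3f}  (gamma = {gamma})")
```

Both combinations land within the quoted error bars of their scaling-theory values, which is a useful sanity check on any set of exponent estimates.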
Lee, JuHee; Lee, Yoonju; Gong, SaeLom; Bae, Juyeon; Choi, Moonki
2016-09-15
A scientific framework is important in designing curricula and evaluating students in the field of education and clinical practice. The purpose of this study was to examine the effectiveness of non-traditional educational methods on critical thinking skills. A systematic review approach was applied. Studies published in peer-reviewed journals from January 2001 to December 2014 were searched using electronic databases and major education journals. A meta-analysis was performed using Review Manager 5.2. Reviewing the included studies, the California Critical Thinking Dispositions Inventory (CCTDI) and California Critical Thinking Skills Test (CCTST) were used to assess the effectiveness of critical thinking in the meta-analysis. The eight CCTDI datasets showed that non-traditional teaching methods (i.e., no lectures) were more effective compared to control groups (standardized mean difference [SMD]: 0.42, 95 % confidence interval [CI]: 0.26-0.57, p < .00001). The six CCTST datasets showed that the teaching and learning methods in these studies also had significantly greater effects compared to the control groups (SMD: 0.29, 95 % CI: 0.10-0.48, p = 0.003). This research showed that new teaching and learning methods designed to improve critical thinking were generally effective at enhancing critical thinking dispositions.
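Pooled effect sizes like those quoted above are conventionally computed by inverse-variance weighting. The sketch below shows a fixed-effect pooling of standardized mean differences; the study values are invented, and Review Manager may apply different defaults (e.g., a random-effects model).

```python
import math

def pooled_smd(studies, z=1.96):
    """Fixed-effect inverse-variance pooling of standardized mean
    differences; `studies` is a list of (smd, variance) pairs."""
    weights = [1.0 / var for _, var in studies]
    est = sum(w * s for (s, _), w in zip(studies, weights)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return est, (est - z * se, est + z * se)

# Hypothetical per-study SMDs and variances
est, ci = pooled_smd([(0.5, 0.04), (0.3, 0.02), (0.45, 0.05)])
print(f"pooled SMD = {est:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```

Studies with smaller variance (typically larger samples) pull the pooled estimate toward their own effect size, which is why the pooled value here sits closest to the middle study.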
Structured Case Analysis: Developing Critical Thinking Skills in a Marketing Case Course
ERIC Educational Resources Information Center
Klebba, Joanne M.; Hamilton, Janet G.
2007-01-01
Structured case analysis is a hybrid pedagogy that flexibly combines diverse instructional methods with comprehensive case analysis as a mechanism to develop critical thinking skills. An incremental learning framework is proposed that allows instructors to develop and monitor content-specific theory and the corresponding critical thinking skills.…
NASA Astrophysics Data System (ADS)
Klügel, J.
2006-12-01
Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructures such as dams, transport infrastructures, chemical plants and nuclear power plants. For many applications beyond the design of infrastructures, it is of interest to assess the efficiency of the design measures taken. These applications require a method that allows a meaningful quantitative risk analysis. A new method for a probabilistic scenario-based seismic risk analysis has been developed, based on a probabilistic extension of proven deterministic methods such as the MCE methodology. The input data required for the method are entirely based on the information necessary to perform any meaningful seismic hazard analysis. The method follows the probabilistic risk analysis approach common for applications in nuclear technology, developed originally by Kaplan & Garrick (1981). It is based on (1) a classification of earthquake events into different size classes (by magnitude), (2) the evaluation of the frequency of occurrence of events assigned to the different classes (frequency of initiating events), (3) the development of bounding critical scenarios assigned to each class based on the solution of an optimization problem, and (4) the evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). The advantages of the method in comparison with traditional PSHA are (1) its flexibility, allowing different probabilistic models for earthquake occurrence to be used and advanced physical models to be incorporated into the analysis, (2) the mathematically consistent treatment of uncertainties, and (3) the explicit consideration of the lifetime of the critical structure as a criterion for formulating different risk goals.
The method was applied for the evaluation of the risk of production interruption losses of a nuclear power plant during its residual lifetime.
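Steps (2) and (4) above combine into an exceedance rate: the frequency of initiating events in each class times the conditional probability of exceeding a critical design parameter, summed over classes, then projected over the structure's lifetime. The sketch below uses invented frequencies and fragilities and a simple Poisson assumption, not the paper's actual models.

```python
import math

# Illustrative magnitude classes: (annual frequency of initiating
# events, conditional probability of exceeding a critical design
# parameter given an event). All numbers are invented for the sketch.
classes = {
    "M5-6": (1e-2, 0.001),
    "M6-7": (1e-3, 0.05),
    "M7+":  (1e-4, 0.40),
}

annual = sum(f * p for f, p in classes.values())   # annual exceedance rate

lifetime_years = 40                                # residual lifetime
# Poisson assumption: probability of at least one exceedance in service
p_lifetime = 1.0 - math.exp(-annual * lifetime_years)
print(annual, p_lifetime)
```

Making the lifetime explicit, as in step (3) of the stated advantages, is what allows the same annual rate to be translated into different risk goals for structures with different remaining service lives.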
Automatic Analysis of Critical Incident Reports: Requirements and Use Cases.
Denecke, Kerstin
2016-01-01
Increasingly, critical incident reports are used as a means to increase patient safety and quality of care. The full potential of these sources of experiential knowledge often remains unexploited, since retrieval and analysis are difficult and time-consuming, and the reporting systems often do not provide support for these tasks. The objective of this paper is to identify potential use cases for automatic methods that analyse critical incident reports. In more detail, we describe how faceted search could offer an intuitive retrieval of critical incident reports and how text mining could support the analysis of relations among events. To realise an automated analysis, natural language processing needs to be applied. Therefore, we analyse the language of critical incident reports and derive requirements for automatic processing methods. We learned that there is huge potential for an automatic analysis of incident reports, but there are still challenges to be solved.
The fact of ignorance: revisiting the Socratic method as a tool for teaching critical thinking.
Oyler, Douglas R; Romanelli, Frank
2014-09-15
Critical thinking, while highly valued as an ability of health care providers, remains a skill that many educators find difficult to teach. This review provides an analysis examining why current methods of teaching critical thinking to health care students (primarily medical and pharmacy students) often fail and describes a premise and potential utility of the Socratic method as a tool to teach critical thinking in health care education.
A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.
Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco
2018-01-01
One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities and supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the hazard analysis and critical control points tool. This tool makes it possible to find the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated critical control points of the method procedure, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events caused by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization.
However, the technique is a challenging method to implement in a quality control laboratory: it is laborious, time-consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, following the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the method procedural steps with the highest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the best alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.
The Fact of Ignorance: Revisiting the Socratic Method as a Tool for Teaching Critical Thinking
Romanelli, Frank
2014-01-01
Critical thinking, while highly valued as an ability of health care providers, remains a skill that many educators find difficult to teach. This review provides an analysis examining why current methods of teaching critical thinking to health care students (primarily medical and pharmacy students) often fail and describes a premise and potential utility of the Socratic method as a tool to teach critical thinking in health care education. PMID:25258449
Software Safety Progress in NASA
NASA Technical Reports Server (NTRS)
Radley, Charles F.
1995-01-01
NASA has developed guidelines for the development and analysis of safety-critical software. These guidelines have been documented in a Guidebook for Safety Critical Software Development and Analysis. The guidelines represent a practical 'how to' approach to assist software developers and safety analysts in cost-effective methods for software safety. They provide guidance in the implementation of the recent NASA Software Safety Standard NSS-1740.13, which was released as an 'Interim' version in June 1994 and scheduled for formal adoption in late 1995. This paper is a survey of the methods in general use, resulting in the NASA guidelines for safety-critical software development and analysis.
Assessing criticality in seismicity by entropy
NASA Astrophysics Data System (ADS)
Goltz, C.
2003-04-01
There is an ongoing discussion as to whether the Earth's crust is in a critical state and whether this state is permanent or intermittent. Intermittent criticality would, in principle, allow the specification of time-dependent hazard. Analysis of a spatio-temporally evolving synthetic critical point phenomenon and of real seismicity using configurational entropy shows that the method is a suitable approach for the characterisation of critical point dynamics. The results obtained support the notion of intermittent criticality in earthquakes. The statistical significance of the findings is assessed by the method of surrogate data.
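One plausible reading of "configurational entropy" for seismicity is the normalized Shannon entropy of event counts over spatial cells: near its maximum for spatially disordered activity, dropping as events cluster. The definition below is an illustrative assumption, not necessarily the paper's exact estimator.

```python
import math
from collections import Counter

def configurational_entropy(cell_ids):
    """Normalized Shannon entropy of event counts over occupied spatial
    cells: near 1 for spatially disordered seismicity, lower for
    clustering. Normalizing by log(#occupied cells) is a simplification."""
    counts = Counter(cell_ids)
    if len(counts) < 2:
        return 0.0
    total = sum(counts.values())
    h = -sum((c / total) * math.log(c / total) for c in counts.values())
    return h / math.log(len(counts))

uniform   = configurational_entropy(["a", "b", "c", "d"])       # even spread
clustered = configurational_entropy(["a", "a", "a", "a", "b"])  # one hot cell
print(uniform, clustered)
```

Tracking this quantity through time is the kind of measure that could distinguish an ordered (clustered, near-critical) configuration from a disordered one.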
Simulating Mission Command for Planning and Analysis
2015-06-01
mission plan. Subject terms: mission planning, CPM, PERT, simulation, DES, Simkit, triangle distribution, critical path. Acronyms: CO, Company; CPM, Critical Path Method; DES, Discrete Event Simulation; FA BAT, Field Artillery Battalion; FEL, Future Event List. Two project management tools can be utilized to find the critical path in military projects: the Critical Path Method (CPM) and the Program Evaluation and Review Technique (PERT).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skiles, S. K.
1994-12-22
An inductive double-contingency analysis (DCA) method developed by the criticality safety function at the Savannah River Site was applied in Criticality Safety Evaluations (CSEs) of five major plant process systems at the Westinghouse Electric Corporation's Commercial Nuclear Fuel Manufacturing Plant in Columbia, South Carolina (WEC-Cola.). The method emphasizes a thorough evaluation of the controls intended to provide barriers against criticality for postulated initiating events, and has been demonstrated effective at identifying common-mode failure potential and interdependence among multiple controls. A description of the method and an example of its application are provided.
Dai, Sheng-Yun; Xu, Bing; Shi, Xin-Yuan; Xu, Xiang; Sun, Ying-Qiang; Qiao, Yan-Jiang
2017-03-01
This study aimed to propose a continual improvement strategy based on quality by design (QbD). As an example, an ultra high performance liquid chromatography (UPLC) method was developed to accomplish the method transfer from HPLC to UPLC for Panax notoginseng saponins (PNS) and achieve continual improvement of the PNS method based on QbD. Plackett-Burman screening design and Box-Behnken optimization design were employed to further understand the relationship between the critical method parameters (CMPs) and critical method attributes (CMAs). A Bayesian design space was then built. The separation degree of the critical peaks (ginsenoside Rg₁ and ginsenoside Re) was over 2.0 and the analysis time was less than 17 min for a method chosen from the design space with 20% initial concentration of acetonitrile, 10 min of isocratic time and a gradient slope of 6%•min⁻¹. Finally, the optimum method was validated by accuracy profile. Based on the same analytical target profile (ATP), the comparison of HPLC and UPLC, including the chromatographic method, CMA identification, the CMP-CMA model and the system suitability test (SST), indicated that the UPLC method could shorten the analysis time, improve the critical separation and satisfy the requirements of the SST. In all, the HPLC method could be replaced by UPLC for the quantitative analysis of PNS. Copyright© by the Chinese Pharmaceutical Association.
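The "separation degree over 2.0" criterion corresponds to the usual chromatographic resolution between adjacent peaks. A minimal sketch, with hypothetical retention times and peak widths rather than the study's data:

```python
def resolution(t1, t2, w1, w2):
    """Chromatographic resolution between two adjacent peaks from
    retention times t1 < t2 and baseline peak widths w1, w2
    (all in the same units, e.g. minutes)."""
    return 2.0 * (t2 - t1) / (w1 + w2)

# Hypothetical retention data for a critical pair such as Rg1 / Re
rs = resolution(t1=8.2, t2=9.1, w1=0.40, w2=0.45)
print(rs, rs >= 2.0)
```

A resolution of 2.0 or better is a common acceptance criterion for baseline separation, which is why the critical pair drives the choice of method from the design space.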
Using Paradigm Case Analysis To Foster Instructor Development.
ERIC Educational Resources Information Center
Peregrym, Jill; And Others
Paradigm Case Analysis (PCA) is a method of increasing instructor effectiveness through the gathering of narratives of critical teaching incidents and experiences from proficient instructors and their analysis in group discussions. Critical Incidents (CI's) may include those in which the instructor's intervention made a significant difference in…
Methods for improved forewarning of critical events across multiple data channels
Hively, Lee M [Philadelphia, TN
2007-04-24
This disclosed invention concerns improvements in forewarning of critical events via phase-space dissimilarity analysis of data from mechanical devices, electrical devices, biomedical data, and other physical processes. First, a single channel of process-indicative data is selected that can be used in place of multiple data channels without sacrificing consistent forewarning of critical events. Second, the method discards data of inadequate quality via statistical analysis of the raw data, because the analysis of poor-quality data always yields inferior results. Third, two separate filtering operations are used in sequence to remove both high-frequency and low-frequency artifacts using a zero-phase quadratic filter. Fourth, the method constructs phase-space dissimilarity measures (PSDM) by combining multi-channel time-serial data into a multi-channel time-delay phase-space reconstruction. Fifth, the method uses a composite measure of dissimilarity (C.sub.i) to provide a forewarning of failure and an indicator of failure onset.
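The time-delay phase-space reconstruction in the fourth step has a standard single-channel form: each reconstructed point collects lagged samples of the series. This sketch shows only that embedding step, not the patent's multi-channel construction or dissimilarity measures.

```python
def time_delay_embed(series, dim, lag):
    """Time-delay phase-space reconstruction: each reconstructed point
    is (x[t], x[t+lag], ..., x[t+(dim-1)*lag])."""
    n = len(series) - (dim - 1) * lag
    return [tuple(series[t + k * lag] for k in range(dim)) for t in range(n)]

x = [0, 1, 2, 3, 4, 5, 6]
points = time_delay_embed(x, dim=3, lag=2)
print(points)   # [(0, 2, 4), (1, 3, 5), (2, 4, 6)]
```

Dissimilarity measures would then compare the distribution of such reconstructed points between a baseline window and a test window.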
Critical path method applied to research project planning: Fire Economics Evaluation System (FEES)
Earl B. Anderson; R. Stanton Hales
1986-01-01
The critical path method (CPM) of network analysis (a) depicts precedence among the many activities in a project by a network diagram; (b) identifies critical activities by calculating their starting, finishing, and float times; and (c) displays possible schedules by constructing time charts. CPM was applied to the development of the Forest Service's Fire...
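The three CPM steps listed above (precedence network, critical activities via starting/finishing/float times, schedule) can be sketched in miniature. The activity network below is hypothetical, not the FEES project plan.

```python
def critical_path(activities):
    """activities: {name: (duration, [predecessor names])}.
    Returns (project duration, activities with zero float)."""
    es, ef = {}, {}                      # earliest start / finish

    def forward(a):
        if a not in ef:
            dur, preds = activities[a]
            es[a] = max((forward(p) for p in preds), default=0)
            ef[a] = es[a] + dur
        return ef[a]

    for a in activities:
        forward(a)
    end = max(ef.values())

    ls = {}                              # latest start

    def backward(a):
        if a not in ls:
            succs = [b for b, (_, ps) in activities.items() if a in ps]
            lf = min((backward(s) for s in succs), default=end)
            ls[a] = lf - activities[a][0]
        return ls[a]

    critical = [a for a in activities if backward(a) == es[a]]
    return end, critical

# Hypothetical mini-project (durations in days)
acts = {
    "plan":   (3, []),
    "build":  (5, ["plan"]),
    "review": (2, ["plan"]),
    "test":   (4, ["build", "review"]),
}
end, crit = critical_path(acts)
print(end, crit)
```

Activities where the latest start equals the earliest start have zero float: delaying any of them delays the whole project, which is exactly what makes them "critical".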
Historicizing in Critical Policy Analysis: The Production of Cultural Histories and Microhistories
ERIC Educational Resources Information Center
Brewer, Curtis A.
2014-01-01
The practice of critical policy analysis often emphasizes the importance of historicizing the present. However, there is very little guidance for critical policy analysts on the methodical production of histories. In this paper, I meet this need by arguing for the use of methodologies embedded in the production of both cultural histories and…
Methods for Conducting Cognitive Task Analysis for a Decision Making Task.
1996-01-01
Cognitive task analysis (CTA) improves traditional task analysis procedures by analyzing the thought processes of performers while they complete a...for using these methods to conduct a CTA for domains which involve critical decision making tasks in naturalistic settings. The cognitive task analysis methods
Mixed time integration methods for transient thermal analysis of structures
NASA Technical Reports Server (NTRS)
Liu, W. K.
1982-01-01
The computational methods used to predict and optimize the thermal-structural behavior of aerospace vehicle structures are reviewed. In general, two classes of algorithms, implicit and explicit, are used in transient thermal analysis of structures. Each of these two methods has its own merits. Due to the different time scales of the mechanical and thermal responses, the selection of a time integration method can be a difficult yet critical factor in the efficient solution of such problems. Therefore mixed time integration methods for transient thermal analysis of structures are being developed. The computer implementation aspects and numerical evaluation of these mixed time implicit-explicit algorithms in thermal analysis of structures are presented. A computationally useful method of estimating the critical time step for the linear quadrilateral element is also given. Numerical tests confirm the stability criterion and accuracy characteristics of the methods. The superiority of these mixed time methods to the fully implicit method or the fully explicit method is also demonstrated.
Mixed time integration methods for transient thermal analysis of structures
NASA Technical Reports Server (NTRS)
Liu, W. K.
1983-01-01
The computational methods used to predict and optimize the thermal-structural behavior of aerospace vehicle structures are reviewed. In general, two classes of algorithms, implicit and explicit, are used in transient thermal analysis of structures. Each of these two methods has its own merits. Due to the different time scales of the mechanical and thermal responses, the selection of a time integration method can be a difficult yet critical factor in the efficient solution of such problems. Therefore mixed time integration methods for transient thermal analysis of structures are being developed. The computer implementation aspects and numerical evaluation of these mixed time implicit-explicit algorithms in thermal analysis of structures are presented. A computationally useful method of estimating the critical time step for the linear quadrilateral element is also given. Numerical tests confirm the stability criterion and accuracy characteristics of the methods. The superiority of these mixed time methods to the fully implicit method or the fully explicit method is also demonstrated.
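For an explicit (forward Euler) thermal update, the critical time step that such estimates address takes the familiar 1-D form Δt_crit = h²/(2κ). The sketch below is generic textbook material illustrating the conditional stability of the explicit scheme, not the paper's element-level estimate.

```python
kappa, h, n = 1.0, 0.1, 11             # diffusivity, grid spacing, nodes
dt_crit = h * h / (2.0 * kappa)        # standard explicit stability bound

def step(u, dt):
    """One forward-Euler (explicit) update of 1-D heat conduction with
    fixed zero-temperature ends; cheap per step, conditionally stable."""
    new = u[:]
    for i in range(1, n - 1):
        new[i] = u[i] + dt * kappa * (u[i-1] - 2.0*u[i] + u[i+1]) / (h * h)
    return new

u = [0.0] * n
u[n // 2] = 1.0                        # initial hot spot
for _ in range(200):
    u = step(u, 0.9 * dt_crit)         # dt below the bound: stays stable
print(max(u))
```

An implicit scheme has no such bound but costs a linear solve per step; mixing the two lets the explicit update handle the subdomain whose critical time step is not restrictive.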
Huff, Andrew G; Hodges, James S; Kennedy, Shaun P; Kircher, Amy
2015-08-01
To protect and secure food resources for the United States, it is crucial to have a method to compare food systems' criticality. In 2007, the U.S. government funded development of the Food and Agriculture Sector Criticality Assessment Tool (FASCAT) to determine which food and agriculture systems were most critical to the nation. FASCAT was developed in a collaborative process involving government officials and food industry subject matter experts (SMEs). After development, data were collected using FASCAT to quantify threats, vulnerabilities, consequences, and the impacts on the United States from failure of evaluated food and agriculture systems. To examine FASCAT's utility, linear regression models were used to determine: (1) which groups of questions posed in FASCAT were better predictors of cumulative criticality scores; (2) whether the items included in FASCAT's criticality method or the smaller subset of FASCAT items included in DHS's risk analysis method predicted similar criticality scores. Akaike's information criterion was used to determine which regression models best described criticality, and a mixed linear model was used to shrink estimates of criticality for individual food and agriculture systems. The results indicated that: (1) some of the questions used in FASCAT strongly predicted food or agriculture system criticality; (2) the FASCAT criticality formula was a stronger predictor of criticality compared to the DHS risk formula; (3) the cumulative criticality formula predicted criticality more strongly than the weighted criticality formula; and (4) the mixed linear regression model did not change the rank-order of food and agriculture system criticality to a large degree. © 2015 Society for Risk Analysis.
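Model selection by Akaike's information criterion, as used above, trades goodness of fit against parameter count. For least-squares fits a common form is AIC = n·ln(RSS/n) + 2k; the numbers below are invented to show the trade-off, not the study's fits.

```python
import math

def aic(rss, n, k):
    """Akaike's information criterion for a least-squares fit:
    n observations, k fitted parameters, residual sum of squares rss."""
    return n * math.log(rss / n) + 2 * k

# Hypothetical fits of criticality scores: the richer model must lower
# the RSS enough to pay for its extra parameters.
simple = aic(rss=120.0, n=50, k=3)
rich   = aic(rss=100.0, n=50, k=8)
print(simple < rich)   # here the simpler model wins
```

The lower-AIC model is preferred; a richer question set only justifies itself if it reduces residual error by more than the penalty for its extra parameters.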
A concept analysis of critical thinking: A guide for nurse educators.
Von Colln-Appling, Christina; Giuliano, Danielle
2017-02-01
In the research literature, the concept of critical thinking has been widely utilized in nursing education. However, critical thinking has been defined and evaluated using a variety of methods. This paper presents a concept analysis to define and clarify the concept of critical thinking to provide a deeper understanding of how critical thinking can be incorporated into nursing education through the use of simulation exercises. A theoretical definition and sample cases were developed to illuminate the concept, as well as a discussion of the antecedents, consequences, and empirical referents of critical thinking. Copyright © 2016 Elsevier Ltd. All rights reserved.
DOT National Transportation Integrated Search
1975-07-01
The describing function method of analysis is applied to investigate the influence of parametric variations on wheelset critical velocity. In addition, the relationship between the amplitude of sustained lateral oscillations and critical speed is der...
Bowleg, Lisa
2017-10-01
Qualitative methods are not intrinsically progressive. Methods are simply tools to conduct research. Epistemology, the justification of knowledge, shapes methodology and methods, and thus is a vital starting point for a critical health equity research stance, regardless of whether the methods are qualitative, quantitative, or mixed. In line with this premise, I address four themes in this commentary. First, I criticize the ubiquitous and uncritical use of the term health disparities in U.S. public health. Next, I advocate for the increased use of qualitative methodologies-namely, photovoice and critical ethnography-that, pursuant to critical approaches, prioritize dismantling social-structural inequities as a prerequisite to health equity. Thereafter, I discuss epistemological stance and its influence on all aspects of the research process. Finally, I highlight my critical discourse analysis HIV prevention research based on individual interviews and focus groups with Black men, as an example of a critical health equity research approach.
Yue, Meng; Zhang, Meng; Zhang, Chunmei; Jin, Changde
2017-05-01
As an essential skill in daily clinical nursing practice, critical thinking ability has been an important objective in nursing education. Concept mapping enables nursing students to connect new information to existing knowledge and integrates interdisciplinary knowledge. However, there is a lack of evidence related to critical thinking ability and concept mapping in nursing education. The purpose of this systematic review and meta-analysis was to assess the effect of concept mapping on developing critical thinking in nursing education. This systematic review was reported in line with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA). A search was conducted in PubMed, Web of Science, Embase, Cochrane Central Register of Controlled Trials (CENTRAL), Cumulative Index to Nursing and Allied Health (CINAHL) and China National Knowledge Infrastructure (CNKI). Randomized controlled trials (RCTs) comparing concept mapping and traditional teaching methods were retrieved. Data were collected by two reviewers according to the data extraction tables. The methodological quality of the included studies was assessed by two other reviewers. The results of the meta-analysis were presented using the mean difference (MD). Thirteen trials were summarized in the systematic review and eleven trials were included in the meta-analysis. The pooled effect size showed that, compared with traditional methods, concept mapping could improve subjects' critical thinking ability measured by the California Critical Thinking Disposition Inventory (CCTDI), California Critical Thinking Skill Test (CCTST) and Critical Thinking Scale (CTS). The subgroup analyses showed that concept mapping improved the scores of all subscales. The results of this review indicated that concept mapping could affect critical thinking affective dispositions and critical thinking cognitive skills. Further high-quality research using uniform evaluation is required. Copyright © 2017 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Lewis, Cynthia
2006-01-01
Lewis explains why critical discourse analysis (CDA) has become an indispensable method for many researchers trying to understand how ideologies and social structures are reflected in and reified by language. The critical linguistic turn that has occurred in the humanities and social sciences for the last three decade has finally taken hold in the…
Analysis of a boron-carbide-drum-controlled critical reactor experiment
NASA Technical Reports Server (NTRS)
Mayo, W. T.
1972-01-01
In order to validate methods and cross sections used in the neutronic design of compact fast-spectrum reactors for generating electric power in space, an analysis of a boron-carbide-drum-controlled critical reactor was made. For this reactor the transport analysis gave generally satisfactory results. The calculated multiplication factor for the most detailed calculation was only 0.7-percent Delta k too high. Calculated reactivity worth of the control drums was $11.61 compared to measurements of $11.58 by the inverse kinetics methods and $11.98 by the inverse counting method. Calculated radial and axial power distributions were in good agreement with experiment.
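Reactivity worths expressed in dollars, as quoted above, divide a reactivity swing by the effective delayed-neutron fraction. The k values and β_eff below are illustrative assumptions, not the experiment's data.

```python
beta_eff = 0.0065     # illustrative effective delayed-neutron fraction

def reactivity(k):
    """Static reactivity from an effective multiplication factor."""
    return (k - 1.0) / k

def drum_worth_dollars(k_drums_in, k_drums_out):
    """Control-drum worth in dollars: the reactivity swing between the
    two drum positions divided by beta_eff."""
    return (reactivity(k_drums_out) - reactivity(k_drums_in)) / beta_eff

# Hypothetical multiplication factors for drums fully in / fully out
print(round(drum_worth_dollars(0.93, 1.002), 2))
```

Quoting worths in dollars normalizes out the delayed-neutron fraction, which is what allows the calculated value to be compared directly with worths measured by inverse kinetics or inverse counting.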
ERIC Educational Resources Information Center
Kanakis, Ioannis
1997-01-01
Examines the Socratic method through a comparative analysis of early Platonic dialogs with theories of critical rationalism and cognitive theories based on achievement motivation. Presents details of the Socratic strategy of teaching and learning, including critical reflection, conversation, and intellectual honesty; asserts that these methods are…
Time Critical Targeting: Predictive Vs Reactionary Methods An Analysis For The Future
2002-06-01
critical targets. To conduct the analysis, a four-step process is used. First, research is conducted to determine which future aircraft, spacecraft, and… Once the most promising aircraft, spacecraft, and weapons are determined, they are categorized for use in either the reactive or preemptive method. … sensors are electro-optical (EO) sensors, thermal imagers, and signal intelligence…
Research on criticality analysis method of CNC machine tools components under fault rate correlation
NASA Astrophysics Data System (ADS)
Gui-xiang, Shen; Xian-zhuo, Zhao; Zhang, Ying-zhi; Chen-yu, Han
2018-02-01
In order to determine the key components of CNC machine tools under fault rate correlation, a system component criticality analysis method is proposed. Based on fault mechanism analysis, the component fault relations are determined, and an adjacency matrix is introduced to describe them. The fault structure relations are then arranged hierarchically using the interpretive structure model (ISM). Assuming that fault propagation obeys a Markov process, the fault association matrix is described and transformed, and the PageRank algorithm is used to determine the relative influence values; combined with the component fault rates under time correlation, a comprehensive fault rate can be obtained. Based on fault mode frequency and fault influence, the criticality of the components under fault rate correlation is determined, and the key components are identified, providing a sound basis for formulating reliability assurance measures. Finally, taking machining centers as an example, the effectiveness of the method is verified.
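The PageRank step above can be sketched with a plain power iteration over a fault-propagation adjacency structure. This is a generic PageRank, not the paper's exact formulation, and the component names and links are hypothetical:

```python
def pagerank(adj, d=0.85, iters=100):
    """Power-iteration PageRank over a fault-propagation graph.

    adj: {component: [components whose faults it can induce]}.
    Dangling components (no outgoing links) spread their score uniformly.
    """
    nodes = list(adj)
    n = len(nodes)
    pr = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        nxt = {v: (1.0 - d) / n for v in nodes}
        for u in nodes:
            outs = adj[u]
            if outs:
                share = d * pr[u] / len(outs)
                for v in outs:
                    nxt[v] += share
            else:  # dangling node: distribute uniformly
                for v in nodes:
                    nxt[v] += d * pr[u] / n
        pr = nxt
    return pr

# Hypothetical fault-propagation links among three subsystems
links = {
    "hydraulic": ["spindle", "nc_system"],  # hydraulic faults can induce both
    "spindle": ["nc_system"],
    "nc_system": [],
}
scores = pagerank(links)
```

Components that many fault chains feed into accumulate the highest influence values, which is what makes them candidates for key components.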
Experience report: Using formal methods for requirements analysis of critical spacecraft software
NASA Technical Reports Server (NTRS)
Lutz, Robyn R.; Ampo, Yoko
1994-01-01
Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.
Failure Mode, Effects, and Criticality Analysis (FMECA)
1993-04-01
Preliminary Failure Modes, Effects and Criticality Analysis (FMECA) of the Brayton Isotope Power System Ground Demonstration System, Report No. TID 27301… No. TID/SNA-3015, Aerojet Nuclear Systems Co., Sacramento, California: 1970. 95. Taylor, J.R. A Formalization of Failure Mode Analysis of Control… Roskilde, Denmark: 1973. 96. Taylor, J.R. A Semi-Automatic Method for Qualitative Failure Mode Analysis. Report No. RISO-M-1707. Available from a
Perceptions of the use of critical thinking teaching methods.
Kowalczyk, Nina; Hackworth, Ruth; Case-Smith, Jane
2012-01-01
To identify the perceived level of competence in teaching and assessing critical thinking skills and the difficulties facing radiologic science program directors in implementing student-centered teaching methods. A total of 692 program directors received an invitation to complete an electronic survey soliciting information regarding the importance of critical thinking skills, their confidence in applying teaching methods and assessing student performance, and perceived obstacles. Statistical analysis included descriptive statistics, correlation coefficients, and ANOVA. Responses were received from 317 participants, indicating that program directors perceive critical thinking to be an essential element in the education of the student; however, they identified several areas for improvement. A high correlation was identified between the program directors' perceived level of skill and their confidence in critical thinking, and between their perceived level of skill and their ability to assess students' critical thinking. Key barriers to implementing critical thinking teaching strategies were identified. Program directors value the importance of implementing critical thinking teaching methods and perceive a need for professional development in critical thinking educational methods. Regardless of the type of educational institution in which the academic program is located, the level of education held by the program director was a significant factor in perceived confidence in the ability to model critical thinking skills and the ability to assess student critical thinking skills.
ERIC Educational Resources Information Center
Kucan, Linda; Palincsar, Annemarie Sullivan
2018-01-01
This investigation focuses on a tool used in a reading methods course to introduce reading specialist candidates to text analysis as a critical component of planning for text-based discussions. Unlike planning that focuses mainly on important text content or information, a text analysis approach focuses both on content and how that content is…
NASA Technical Reports Server (NTRS)
Wolitz, K.; Brockmann, W.; Fischer, T.
1979-01-01
Acoustic emission analysis as a quasi-nondestructive test method makes it possible to differentiate clearly, in judging the total behavior of fiber-reinforced plastic composites, between critical failure modes (in the case of unidirectional composites fiber fractures) and non-critical failure modes (delamination processes or matrix fractures). A particular advantage is that, for varying pressure demands on the composites, the emitted acoustic pulses can be analyzed with regard to their amplitude distribution. In addition, definite indications as to how the damages occurred can be obtained from the time curves of the emitted acoustic pulses as well as from the particular frequency spectrum. Distinct analogies can be drawn between the various analytical methods with respect to whether the failure modes can be classified as critical or non-critical.
NASA Astrophysics Data System (ADS)
Delgado, Carlos; Cátedra, Manuel Felipe
2018-05-01
This work presents a technique that allows a very noticeable relaxation of the computational requirements for full-wave electromagnetic simulations based on the Method of Moments. A ray-tracing analysis of the geometry is performed in order to extract the critical points with significant contributions. These points are then used to generate a reduced mesh, considering the regions of the geometry that surround each critical point and taking into account the electrical path followed from the source. The electromagnetic analysis of the reduced mesh produces very accurate results, requiring a fraction of the resources that the conventional analysis would utilize.
Eastwood, John G; Jalaludin, Bin B; Kemp, Lynn A
2014-01-01
A recent criticism of social epidemiological studies, and of multi-level studies in particular, has been a paucity of theory. We present here the protocol for a study that aims to build a theory of the social epidemiology of maternal depression. We use a critical realist approach, which is trans-disciplinary, encompassing both quantitative and qualitative traditions, and which assumes both ontological and hierarchical stratification of reality. We describe a critical realist Explanatory Theory Building Method comprising: 1) an emergent phase, 2) a construction phase, and 3) a confirmatory phase. A concurrent triangulated mixed-method multilevel cross-sectional study design is described. The Emergent Phase uses interviews, focus groups, exploratory data analysis, exploratory factor analysis, regression, and multilevel Bayesian spatial data analysis to detect and describe phenomena. Abductive and retroductive reasoning will be applied to categorical principal component analysis, exploratory factor analysis, regression, coding of concepts and categories, constant comparative analysis, drawing of conceptual networks, and situational analysis to generate theoretical concepts. The Theory Construction Phase will include: 1) defining stratified levels; 2) analytic resolution; 3) abductive reasoning; 4) comparative analysis (triangulation); 5) retroduction; 6) postulate and proposition development; 7) comparison and assessment of theories; and 8) conceptual frameworks and model development. The strength of the critical realist methodology described is the extent to which this paradigm is able to support the epistemological, ontological, axiological, methodological and rhetorical positions of both quantitative and qualitative research in the field of social epidemiology.
The extensive multilevel Bayesian studies, intensive qualitative studies, latent variable theory, abductive triangulation, and Inference to Best Explanation provide a strong foundation for Theory Construction. The study will contribute to defining the role that realism and mixed methods can play in explaining the social determinants and developmental origins of health and disease.
ERIC Educational Resources Information Center
Sundararajan, NarayanKripa; Adesope, Olusola; Cavagnetto, Andy
2017-01-01
To develop and nurture critical thinking, students must have opportunities to observe and practice critical thinking in the classroom. In this parallel mixed method classroom study, we investigate the role of collaborative concept mapping in the development of kindergarten learners' critical thinking skills of analysis and interpretation over a…
Hard-Rock Stability Analysis for Span Design in Entry-Type Excavations with Learning Classifiers
García-Gonzalo, Esperanza; Fernández-Muñiz, Zulima; García Nieto, Paulino José; Bernardo Sánchez, Antonio; Menéndez Fernández, Marta
2016-01-01
The mining industry relies heavily on empirical analysis for design and prediction. An empirical design method, called the critical span graph, was developed specifically for rock stability analysis in entry-type excavations, based on an extensive case-history database of cut and fill mining in Canada. This empirical span design chart plots the critical span against rock mass rating for the observed case histories and has been accepted by many mining operations for the initial span design of cut and fill stopes. Different types of analysis have been used to classify the observed cases into stable, potentially unstable and unstable groups. The main purpose of this paper is to present a new method for defining rock stability areas of the critical span graph, which applies machine learning classifiers (support vector machine and extreme learning machine). The results show a reasonable correlation with previous guidelines. These machine learning methods are good tools for developing empirical methods, since they make no assumptions about the regression function. With this software, it is easy to add new field observations to a previous database, improving prediction output with the addition of data that consider the local conditions for each mine. PMID:28773653
Quantitative analysis of single-molecule superresolution images
Coltharp, Carla; Yang, Xinxing; Xiao, Jie
2014-01-01
This review highlights the quantitative capabilities of single-molecule localization-based superresolution imaging methods. In addition to revealing fine structural details, the molecule coordinate lists generated by these methods provide the critical ability to quantify the number, clustering, and colocalization of molecules with 10 – 50 nm resolution. Here we describe typical workflows and precautions for quantitative analysis of single-molecule superresolution images. These guidelines include potential pitfalls and essential control experiments, allowing critical assessment and interpretation of superresolution images. PMID:25179006
NASA Astrophysics Data System (ADS)
Wang, Shuliang; Zhang, Jianhua; Zhao, Mingwei; Min, Xu
2017-05-01
This paper takes the central China power grid (CCPG) as an example and analyzes the vulnerability of power systems under terrorist attacks. To simulate the intelligence of terrorist attacks, a method of critical attack area identification according to community structures is introduced. Meanwhile, three types of vulnerability models and the corresponding vulnerability metrics are given for comparative analysis. On this basis, the influence of terrorist attacks on different critical areas is studied, and the vulnerability of each critical area is identified. At the same time, vulnerabilities of critical areas under different tolerance parameters and different vulnerability models are acquired and compared. Results show that only a few vertex disruptions may cause some critical areas to collapse completely and generate great performance losses for the whole system. Furthermore, the variation of vulnerability values under different scenarios is very large. Critical areas that can cause greater damage under terrorist attacks should be given priority protection to reduce vulnerability. The proposed method can be applied to analyze the vulnerability of other infrastructure systems and can help decision makers identify mitigation actions and optimal protection strategies.
NASA Astrophysics Data System (ADS)
Gu, Wen; Zhu, Zhiwei; Zhu, Wu-Le; Lu, Leyao; To, Suet; Xiao, Gaobo
2018-05-01
An automatic identification method for obtaining the critical depth-of-cut (DoC) of brittle materials with nanometric accuracy and sub-nanometric uncertainty is proposed in this paper. With this method, a two-dimensional (2D) microscopic image of the taper cutting region is captured and further processed by image analysis to extract the margin of generated micro-cracks in the imaging plane. Meanwhile, an analytical model is formulated to describe the theoretical curve of the projected cutting points on the imaging plane with respect to a specified DoC during the whole cutting process. By adopting differential evolution algorithm-based minimization, the critical DoC can be identified by minimizing the deviation between the extracted margin and the theoretical curve. The proposed method is demonstrated through both numerical simulation and experimental analysis. Compared with conventional 2D- and 3D-microscopic-image-based methods, determination of the critical DoC in this study uses the envelope profile rather than the onset point of the generated cracks, providing a more objective approach with smaller uncertainty.
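The identification step above amounts to a one-parameter curve fit: minimize the deviation between the extracted crack margin and a theoretical curve parameterized by the critical depth-of-cut, using differential evolution. The sketch below uses a deliberately simplified, hypothetical taper-cut model (a linear depth ramp with an assumed slope) rather than the paper's analytical projection model, and a tiny hand-rolled differential evolution in place of a library optimizer:

```python
import random

SLOPE = 0.05  # assumed depth-of-cut increase per unit cutting distance

def theoretical_margin(x, doc):
    # Hypothetical model: cracks appear once depth SLOPE*x exceeds doc,
    # and the crack margin grows linearly beyond that point.
    return max(x - doc / SLOPE, 0.0)

def objective(doc, xs, ys):
    # squared deviation between extracted margin and theoretical curve
    return sum((theoretical_margin(x, doc) - y) ** 2 for x, y in zip(xs, ys))

def differential_evolution(f, lo, hi, pop=20, gens=200, F=0.7, CR=0.9, seed=1):
    """Minimal 1-D DE/rand/1 minimizer over the interval [lo, hi]."""
    rng = random.Random(seed)
    P = [rng.uniform(lo, hi) for _ in range(pop)]
    for _ in range(gens):
        for i in range(pop):
            a, b, c = rng.sample([p for j, p in enumerate(P) if j != i], 3)
            trial = a + F * (b - c) if rng.random() < CR else P[i]
            trial = min(max(trial, lo), hi)
            if f(trial) <= f(P[i]):
                P[i] = trial  # greedy selection
    return min(P, key=f)

# Synthetic "extracted margin" generated from a known critical DoC of 0.12
xs = [0.2 * i for i in range(26)]
ys = [theoretical_margin(x, 0.12) for x in xs]
estimate = differential_evolution(lambda d: objective(d, xs, ys), 0.01, 0.5)
```

On this noise-free synthetic data the optimizer recovers the known critical DoC; with real image-derived margins the residual deviation would set the uncertainty of the estimate.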
Defining critical habitats of threatened and endemic reef fishes with a multivariate approach.
Purcell, Steven W; Clarke, K Robert; Rushworth, Kelvin; Dalton, Steven J
2014-12-01
Understanding critical habitats of threatened and endemic animals is essential for mitigating extinction risks, developing recovery plans, and siting reserves, but assessment methods are generally lacking. We evaluated critical habitats of 8 threatened or endemic fish species on coral and rocky reefs of subtropical eastern Australia, by measuring physical and substratum-type variables of habitats at fish sightings. We used nonmetric and metric multidimensional scaling (nMDS, mMDS), Analysis of similarities (ANOSIM), similarity percentages analysis (SIMPER), permutational analysis of multivariate dispersions (PERMDISP), and other multivariate tools to distinguish critical habitats. Niche breadth was widest for 2 endemic wrasses, and reef inclination was important for several species, often found in relatively deep microhabitats. Critical habitats of mainland reef species included small caves or habitat-forming hosts such as gorgonian corals and black coral trees. Hard corals appeared important for reef fishes at Lord Howe Island, and red algae for mainland reef fishes. A wide range of habitat variables are required to assess critical habitats owing to varied affinities of species to different habitat features. We advocate assessments of critical habitats matched to the spatial scale used by the animals and a combination of multivariate methods. Our multivariate approach furnishes a general template for assessing the critical habitats of species, understanding how these vary among species, and determining differences in the degree of habitat specificity. © 2014 Society for Conservation Biology.
ERIC Educational Resources Information Center
Fitchett, Paul G.; Heafner, Tina L.
2013-01-01
In this analysis of promising practice, we demonstrate how social studies methods instructors can incorporate data analysis of the 2010 United States History National Assessment of Educational Progress (NAEP-USH) to facilitate pedagogical aims, engage teacher candidates in critical discourse, and investigate the contexts of teaching and learning.…
Teaching Blended Content Analysis and Critically Vigilant Media Consumption
ERIC Educational Resources Information Center
Harris, Christopher S.
2015-01-01
The semester-long activity described herein uses an integrated instructional approach to media studies to introduce students to the research method of qualitative content analysis and help them become more critically vigilant media consumers. The goal is to increase students' media literacy by guiding them in the design of an exploratory…
Perfetti, Christopher M.; Rearden, Bradley T.
2016-03-01
The sensitivity and uncertainty analysis tools of the ORNL SCALE nuclear modeling and simulation code system that have been developed over the last decade have proven indispensable for numerous application and design studies for nuclear criticality safety and reactor physics. SCALE contains tools for analyzing the uncertainty in the eigenvalue of critical systems, but cannot quantify uncertainty in important neutronic parameters such as multigroup cross sections, fuel fission rates, activation rates, and neutron fluence rates with realistic three-dimensional Monte Carlo simulations. A more complete understanding of the sources of uncertainty in these design-limiting parameters could lead to improvements in process optimization and reactor safety, and help inform regulators when setting operational safety margins. A novel approach for calculating eigenvalue sensitivity coefficients, known as the CLUTCH method, was recently explored as academic research and has been found to accurately and rapidly calculate sensitivity coefficients in criticality safety applications. The work presented here describes a new method, known as the GEAR-MC method, which extends the CLUTCH theory for calculating eigenvalue sensitivity coefficients to enable sensitivity coefficient calculations and uncertainty analysis for a generalized set of neutronic responses using high-fidelity continuous-energy Monte Carlo calculations. Several criticality safety systems were examined to demonstrate proof of principle for the GEAR-MC method, and GEAR-MC was seen to produce response sensitivity coefficients that agreed well with reference direct perturbation sensitivity coefficients.
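Once response sensitivity coefficients are available, propagating nuclear-data covariances to a response uncertainty is the standard "sandwich rule": the relative variance of a response R is S Σ Sᵀ, where S is the vector of relative sensitivity coefficients and Σ the relative covariance matrix. A minimal sketch (the sensitivities and covariance values below are invented for illustration):

```python
def response_std_dev(sens, cov):
    """Sandwich rule: relative variance of a response is S * COV * S^T.

    sens: list of relative sensitivity coefficients (dR/R per dsigma/sigma).
    cov:  relative covariance matrix of the underlying nuclear data.
    Returns the relative standard deviation of the response.
    """
    n = len(sens)
    var = sum(sens[i] * cov[i][j] * sens[j]
              for i in range(n) for j in range(n))
    return var ** 0.5

# Hypothetical sensitivities of a fission-rate response to two cross sections
S = [0.30, -0.10]
COV = [[0.0400, 0.0100],
       [0.0100, 0.0900]]
rel_sigma = response_std_dev(S, COV)
```

Note the off-diagonal covariance partially cancels here because the two sensitivities have opposite signs; correlations can either inflate or shrink the propagated uncertainty.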
Fault Tree Analysis Application for Safety and Reliability
NASA Technical Reports Server (NTRS)
Wallace, Dolores R.
2003-01-01
Many commercial software tools exist for fault tree analysis (FTA), an accepted method for mitigating risk in systems. The method embedded in these tools identifies root causes in system components, but when software is identified as a root cause, the tools do not build trees into the software component. No commercial software tools have been built specifically for the development and analysis of software fault trees. Research indicates that the methods of FTA could be applied to software, but the method is not practical without automated tool support. With appropriate automated tool support, software fault tree analysis (SFTA) may be a practical technique for identifying the underlying causes of software faults that may lead to critical system failures. We strive to demonstrate that existing commercial tools for FTA can be adapted for use with SFTA, and that, applied to a safety-critical system, SFTA can be used to identify serious potential problems long before integration and system testing.
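The core quantitative computation in FTA is evaluating the top-event probability from a tree of AND/OR gates over basic events. A minimal sketch, assuming statistically independent basic events (the tree and probabilities below are hypothetical, not from any tool mentioned above):

```python
def top_event_probability(node):
    """Evaluate a fault tree bottom-up.

    node is one of:
      ("basic", p)        - basic event with probability p
      ("and", [children]) - all children must fail
      ("or",  [children]) - at least one child fails
    Assumes independence of basic events.
    """
    kind = node[0]
    if kind == "basic":
        return node[1]
    probs = [top_event_probability(child) for child in node[1]]
    if kind == "and":
        p = 1.0
        for q in probs:
            p *= q
        return p
    if kind == "or":
        p = 1.0
        for q in probs:
            p *= (1.0 - q)  # probability that no child fails
        return 1.0 - p
    raise ValueError(f"unknown gate kind: {kind}")

# Hypothetical tree: system fails if both redundant sensors fail, or the
# software watchdog fails on its own.
tree = ("or", [
    ("and", [("basic", 0.01), ("basic", 0.02)]),
    ("basic", 0.005),
])
p_top = top_event_probability(tree)
```

The same recursive structure is what would let an SFTA tool extend the tree below a "software failure" leaf instead of stopping there.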
Ouyang, Min; Tian, Hui; Wang, Zhenghua; Hong, Liu; Mao, Zijun
2017-01-17
This article studies a general type of initiating events in critical infrastructures, called spatially localized failures (SLFs), which are defined as the failure of a set of infrastructure components distributed in a spatially localized area due to damage sustained, while other components outside the area do not directly fail. These failures can be regarded as a special type of intentional attack, such as bomb or explosive assault, or a generalized modeling of the impact of localized natural hazards on large-scale systems. This article introduces three SLFs models: node centered SLFs, district-based SLFs, and circle-shaped SLFs, and proposes a SLFs-induced vulnerability analysis method from three aspects: identification of critical locations, comparisons of infrastructure vulnerability to random failures, topologically localized failures and SLFs, and quantification of infrastructure information value. The proposed SLFs-induced vulnerability analysis method is finally applied to the Chinese railway system and can be also easily adapted to analyze other critical infrastructures for valuable protection suggestions. © 2017 Society for Risk Analysis.
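A circle-shaped SLF can be sketched concretely: remove every component within a given radius of an attack center, then measure the vulnerability as the relative loss of the largest connected component. The network below is a hypothetical toy system, not the Chinese railway data, and largest-component size is only one of several possible performance metrics:

```python
import math
from collections import deque

def largest_component(nodes, edges):
    """Size of the largest connected component over the given node set."""
    adj = {n: [] for n in nodes}
    for u, v in edges:
        if u in adj and v in adj:  # ignore edges touching failed nodes
            adj[u].append(v)
            adj[v].append(u)
    seen, best = set(), 0
    for start in nodes:
        if start in seen:
            continue
        queue, size = deque([start]), 0
        seen.add(start)
        while queue:
            x = queue.popleft()
            size += 1
            for y in adj[x]:
                if y not in seen:
                    seen.add(y)
                    queue.append(y)
        best = max(best, size)
    return best

def circle_slf_vulnerability(coords, edges, center, radius):
    """Vulnerability to a circle-shaped spatially localized failure:
    1 - (largest component after removal / largest component before)."""
    survivors = [n for n, (x, y) in coords.items()
                 if math.hypot(x - center[0], y - center[1]) > radius]
    before = largest_component(list(coords), edges)
    after = largest_component(survivors, edges)
    return 1.0 - after / before

# Hypothetical line network of five stations at unit spacing
coords = {i: (float(i), 0.0) for i in range(5)}
edges = [(i, i + 1) for i in range(4)]
v = circle_slf_vulnerability(coords, edges, center=(2.0, 0.0), radius=0.5)
```

Sweeping the center over a grid of candidate locations and keeping the maximum of this metric is one way to identify critical locations in the sense described above.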
Pathway cross-talk network analysis identifies critical pathways in neonatal sepsis.
Meng, Yu-Xiu; Liu, Quan-Hong; Chen, Deng-Hong; Meng, Ying
2017-06-01
Despite advances in neonatal care, sepsis remains a major cause of morbidity and mortality in neonates worldwide. Pathway cross-talk analysis might contribute to inferring the driving forces in bacterial sepsis and facilitate a better understanding of the underlying pathogenesis of neonatal sepsis. This study aimed to explore the critical pathways associated with the progression of neonatal sepsis through pathway cross-talk analysis. By integrating neonatal transcriptome data with known pathway data and protein-protein interaction data, we systematically uncovered the disease pathway cross-talks and constructed a disease pathway cross-talk network for neonatal sepsis. Then, the attract method was employed to explore the dysregulated pathways associated with neonatal sepsis. To determine the critical pathways in neonatal sepsis, the rank product (RP) algorithm, centrality analysis and impact factor (IF) were introduced sequentially, which jointly consider the differential expression of genes and pathways, pathway cross-talks and pathway parameters in the network. The dysregulated pathways with the highest IF values as well as RP<0.01 were defined as critical pathways in neonatal sepsis. After integrating the three kinds of data, only 6919 common genes were included in the pathway cross-talk analysis. By statistical analysis, a total of 1249 significant pathway cross-talks were selected to construct the pathway cross-talk network. Moreover, 47 dysregulated pathways were identified via the attract method, 20 pathways were identified under RP<0.01, and the top 10 pathways with the highest IF were also screened from the pathway cross-talk network. Among them, we selected 8 common pathways, i.e. critical pathways. In this study, we systematically tracked 8 critical pathways involved in neonatal sepsis by integrating the attract method and the pathway cross-talk network.
These pathways might be responsible for the host response in infection, and of great value for advancing diagnosis and therapy of neonatal sepsis. Copyright © 2017 Elsevier Ltd. All rights reserved.
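The rank product step above combines rankings from several evidence sources into one score: each pathway's RP is the geometric mean of its ranks, so pathways consistently near the top of every list get the smallest values. A minimal sketch (the pathway names and scores below are hypothetical, and this omits the permutation-based significance estimation the RP algorithm normally adds):

```python
def rank_products(score_lists):
    """score_lists: one dict per evidence source, {pathway: score},
    higher score = stronger signal. Returns {pathway: rank product},
    the geometric mean of the pathway's ranks (1 = best) across sources."""
    k = len(score_lists)
    rank_maps = []
    for scores in score_lists:
        ordered = sorted(scores, key=scores.get, reverse=True)
        rank_maps.append({p: r + 1 for r, p in enumerate(ordered)})
    rp = {}
    for pathway in score_lists[0]:
        product = 1.0
        for ranks in rank_maps:
            product *= ranks[pathway]
        rp[pathway] = product ** (1.0 / k)
    return rp

# Hypothetical dysregulation scores from two evidence sources
source_a = {"pathway_TLR": 0.9, "pathway_NFkB": 0.7, "pathway_JAK": 0.2}
source_b = {"pathway_TLR": 0.8, "pathway_NFkB": 0.3, "pathway_JAK": 0.5}
rp = rank_products([source_a, source_b])
```

A pathway ranked first in every source gets RP = 1; in the study's pipeline, pathways below an RP significance threshold were then intersected with the centrality and IF criteria.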
ERIC Educational Resources Information Center
Mumovic, Ana
2015-01-01
The paper studies and illuminates Jovan Skerlic's social function and "criticism anatomy" in his "History of Serbian Literature." The object of analysis is the act and actors of Skerlic's engaged criticism and method relying on facts. It is the path taken a hundred years later by Serbian criticism and literature as culture and…
Nilsson, Björn; Håkansson, Petra; Johansson, Mikael; Nelander, Sven; Fioretos, Thoas
2007-01-01
Ontological analysis facilitates the interpretation of microarray data. Here we describe new ontological analysis methods which, unlike existing approaches, are threshold-free and statistically powerful. We perform extensive evaluations and introduce a new concept, detection spectra, to characterize methods. We show that different ontological analysis methods exhibit distinct detection spectra, and that it is critical to account for this diversity. Our results argue strongly against the continued use of existing methods, and provide directions towards an enhanced approach. PMID:17488501
Nucleation Rate Analysis of Methane Hydrate from Molecular Dynamics Simulations
Yuhara, Daisuke; Barnes, Brian C.; Suh, Donguk; ...
2015-01-06
Clathrate hydrates are solid crystalline structures most commonly formed from solutions that have nucleated to form a mixed solid composed of water and gas. Understanding the mechanism of clathrate hydrate nucleation is essential to grasp the fundamental chemistry of these complex structures and their applications. Molecular dynamics (MD) simulation is an ideal method to study nucleation at the molecular level because the size of the critical nucleus and formation rate occur on the nano scale. Moreover, various analysis methods for nucleation have been developed through MD to analyze nucleation. In particular, the mean first-passage time (MFPT) and survival probability (SP) methods have proven to be effective in procuring the nucleation rate and critical nucleus size for monatomic systems. This study assesses the MFPT and SP methods, previously used for monatomic systems, when applied to analyzing clathrate hydrate nucleation. Because clathrate hydrate nucleation is relatively difficult to observe in MD simulations (due to its high free energy barrier), these methods have yet to be applied to clathrate hydrate systems. In this study, we have analyzed the nucleation rate and critical nucleus size of methane hydrate using MFPT and SP methods from data generated by MD simulations at 255 K and 50 MPa. MFPT was modified for clathrate hydrate from the original version by adding the maximum likelihood estimate and growth effect term. The nucleation rates calculated by the MFPT and SP methods agree within 5%; the critical nucleus size estimated by the MFPT method was 50% higher than values obtained through other more rigorous but computationally expensive estimates. These methods can also be extended to the analysis of other clathrate hydrates.
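In its simplest form, the nucleation-rate estimate behind such analyses treats nucleation as a Poisson process: the rate per unit volume is the reciprocal of the mean first-passage (waiting) time multiplied by the system volume. A hedged sketch of that reduction (the waiting times and volume below are invented, and this omits the growth-effect correction the study adds):

```python
from statistics import mean

def nucleation_rate(first_passage_times_ns, volume_nm3):
    """Maximum-likelihood rate for a Poisson nucleation process.

    first_passage_times_ns: waiting time until a critical nucleus first
    appears, one value per independent simulation trajectory (ns).
    volume_nm3: simulation box volume (nm^3).
    Returns the rate in events / (ns * nm^3).
    """
    tau = mean(first_passage_times_ns)  # MLE of the mean waiting time
    return 1.0 / (tau * volume_nm3)

# Hypothetical first-passage times from five trajectories, 1000 nm^3 box
times_ns = [80.0, 120.0, 100.0, 95.0, 105.0]
rate = nucleation_rate(times_ns, volume_nm3=1000.0)
```

The full MFPT method instead fits the whole mean first-passage time curve as a function of nucleus size, which also yields the critical nucleus size; the sketch above captures only the rate part.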
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowman, S.M.
1995-01-01
The requirements of ANSI/ANS 8.1 specify that calculational methods for away-from-reactor criticality safety analyses be validated against experimental measurements. If credit for the negative reactivity of the depleted (or spent) fuel isotopics is desired, it is necessary to benchmark computational methods against spent fuel critical configurations. This report summarizes a portion of the ongoing effort to benchmark away-from-reactor criticality analysis methods using critical configurations from commercial pressurized-water reactors. The analysis methodology selected for all the calculations reported herein is based on the codes and data provided in the SCALE-4 code system. The isotopic densities for the spent fuel assemblies inmore » the critical configurations were calculated using the SAS2H analytical sequence of the SCALE-4 system. The sources of data and the procedures for deriving SAS2H input parameters are described in detail. The SNIKR code module was used to extract the necessary isotopic densities from the SAS2H results and provide the data in the format required by the SCALE criticality analysis modules. The CSASN analytical sequence in SCALE-4 was used to perform resonance processing of the cross sections. The KENO V.a module of SCALE-4 was used to calculate the effective multiplication factor (k{sub eff}) of each case. The SCALE-4 27-group burnup library containing ENDF/B-IV (actinides) and ENDF/B-V (fission products) data was used for all the calculations. This volume of the report documents the SCALE system analysis of three reactor critical configurations for the Sequoyah Unit 2 Cycle 3. This unit and cycle were chosen because of the relevance in spent fuel benchmark applications: (1) the unit had a significantly long downtime of 2.7 years during the middle of cycle (MOC) 3, and (2) the core consisted entirely of burned fuel at the MOC restart. 
The first benchmark critical calculation was the MOC restart at hot, full-power (HFP) critical conditions. The other two benchmark critical calculations were the beginning-of-cycle (BOC) startup at both hot, zero-power (HZP) and HFP critical conditions. These latter calculations were used to check for consistency in the calculated results for different burnups and downtimes. The k{sub eff} results were in the range of 1.00014 to 1.00259 with a standard deviation of less than 0.001.
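As a small illustration of the kind of acceptance check such a benchmark implies, the sketch below tests hypothetical k_eff values (invented, chosen to lie within the reported 1.00014 to 1.00259 range; the case-by-case values are not given here) against near-critical criteria:

```python
import statistics

# Hypothetical k_eff results for the three benchmark cases (illustrative only).
k_eff = [1.00014, 1.00120, 1.00200]

mean_k = statistics.mean(k_eff)
bias = mean_k - 1.0                  # deviation from exact criticality
spread = statistics.pstdev(k_eff)    # scatter across the cases

# Acceptance in the spirit of the benchmark: every case near critical,
# with scatter across cases below 0.001.
acceptable = all(abs(k - 1.0) < 0.005 for k in k_eff) and spread < 0.001
```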
Opinion: Clarifying Two Controversies about Information Mapping's Method.
ERIC Educational Resources Information Center
Horn, Robert E.
1992-01-01
Describes Information Mapping, a methodology for the analysis, organization, sequencing, and presentation of information and explains three major parts of the method: (1) content analysis, (2) project life-cycle synthesis and integration of the content analysis, and (3) sequencing and formatting. Major criticisms of the methodology are addressed.…
Probabilistic Structural Analysis Theory Development
NASA Technical Reports Server (NTRS)
Burnside, O. H.
1985-01-01
The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and Space Shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive relative to the finite element approach.
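The general idea behind a probabilistic structural response calculation (the concept NESSUS implements with probabilistic finite elements, not its actual algorithm) can be sketched with plain Monte Carlo sampling; the load and strength distributions below are invented for illustration:

```python
import random

# Treat applied stress and material strength as random variables and
# estimate the probability that the safety margin (strength - load) < 0.
random.seed(0)
N = 100_000
failures = 0
for _ in range(N):
    load = random.gauss(300.0, 30.0)      # MPa, assumed distribution
    strength = random.gauss(420.0, 25.0)  # MPa, assumed distribution
    if strength - load < 0.0:
        failures += 1

p_fail = failures / N   # roughly Phi(-120/39), on the order of 1e-3 here
```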
ERIC Educational Resources Information Center
Smith, Karen
2018-01-01
Policy texts are representations of practice that both reflect and shape the world around them. There is, however, little higher education research that critically analyses the impact of higher education policy on educational developers and educational development practice. Extending methods from critical discourse analysis by combining textual…
Seismic Hazard Analysis — Quo vadis?
NASA Astrophysics Data System (ADS)
Klügel, Jens-Uwe
2008-05-01
The paper is dedicated to the review of methods of seismic hazard analysis currently in use, analyzing the strengths and weaknesses of different approaches. The review is performed from the perspective of a user of the results of seismic hazard analysis for different applications, such as the design of critical and general (non-critical) civil infrastructures and technical and financial risk analysis. A set of criteria is developed for and applied to an objective assessment of the capabilities of different analysis methods. It is demonstrated that traditional probabilistic seismic hazard analysis (PSHA) methods have significant deficiencies, thus limiting their practical applications. These deficiencies have their roots in the use of inadequate probabilistic models and insufficient understanding of modern concepts of risk analysis, as has been revealed in some recent large scale studies. These deficiencies result in an inability to treat dependencies between physical parameters correctly and, finally, in an incorrect treatment of uncertainties. As a consequence, results of PSHA studies have been found to be unrealistic in comparison with empirical information from the real world. The attempt to compensate for these problems by a systematic use of expert elicitation has, so far, not resulted in any improvement of the situation. It is also shown that scenario-earthquakes developed by disaggregation from the results of a traditional PSHA may not be conservative with respect to energy conservation and should not be used for the design of critical infrastructures without validation. Because the assessment of technical as well as financial risks associated with potential earthquake damage requires a risk analysis, current methods remain based on a probabilistic approach with its unsolved deficiencies.
Traditional deterministic or scenario-based seismic hazard analysis methods provide a reliable and in general robust design basis for applications such as the design of critical infrastructures, especially with systematic sensitivity analyses based on validated phenomenological models. Deterministic seismic hazard analysis incorporates uncertainties in the safety factors. These factors are derived from experience as well as from expert judgment. Deterministic methods associated with high safety factors may lead to too conservative results, especially if applied for generally short-lived civil structures. Scenarios used in deterministic seismic hazard analysis have a clear physical basis. They are related to seismic sources discovered by geological, geomorphologic, geodetic and seismological investigations or derived from historical references. Scenario-based methods can be expanded for risk analysis applications with an extended data analysis providing the frequency of seismic events. Such an extension provides a better informed risk model that is suitable for risk-informed decision making.
Using Superheroes to Visually and Critically Analyze Comics, Stereotypes, and Society
ERIC Educational Resources Information Center
Cook, Mike P.; Frey, Ryle
2017-01-01
The purpose of this article is to provide teachers and students useful methods for utilizing the power of comic books as literacy sponsors in ELA classrooms. Given the continued boom in the popularity of comics in popular culture, this provides a relevant way to introduce students to visual and critical analysis. Engaging in meaningful analysis of…
Critical Development Exploration Based on the Islamic Education in Iranian Higher Education
ERIC Educational Resources Information Center
Taheri, Mohammad Reza; Keshtiaray, Narges; Yousefy, Ali Reza
2015-01-01
The aim of this research is to perform a critical development exploration based on Islamic education in Iranian higher education. In this paper, a qualitative method of logical analysis was used. Information was collected through library studies and the results were analysed. The information collecting tool was note taking and information was…
A comparative critical study between FMEA and FTA risk analysis methods
NASA Astrophysics Data System (ADS)
Cristea, G.; Constantinescu, DM
2017-10-01
Today an overwhelming number of different risk analysis techniques are in use, with acronyms such as: FMEA (Failure Modes and Effects Analysis) and its extension FMECA (Failure Mode, Effects, and Criticality Analysis), DRBFM (Design Review by Failure Mode), FTA (Fault Tree Analysis) and its extension ETA (Event Tree Analysis), HAZOP (Hazard & Operability Studies), HACCP (Hazard Analysis and Critical Control Points) and What-if/Checklist. However, the most used analysis techniques in the mechanical and electrical industry are FMEA and FTA. In FMEA, which is an inductive method, information about the consequences and effects of failures is usually collected through interviews with experienced people with different knowledge, i.e., cross-functional groups. The FMEA is used to capture potential failures/risks and impacts and prioritize them on a numeric scale called the Risk Priority Number (RPN), which ranges from 1 to 1000. FTA is a deductive method, i.e., a general system state is decomposed into chains of more basic events of components. The logical interrelationship of how such basic events depend on and affect each other is often described analytically in a reliability structure which can be visualized as a tree. Both methods are very time-consuming to apply thoroughly, and this is why it is often not done. As a consequence, possible failure modes may not be identified. To address these shortcomings, it is proposed to use a combination of FTA and FMEA.
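The RPN prioritization described above is easy to make concrete; the failure modes and 1-10 ratings below are invented for illustration:

```python
# RPN = Severity x Occurrence x Detection, each rated on a 1-10 scale,
# so RPN ranges from 1 to 1000.
failure_modes = [
    # (name, severity, occurrence, detection) - illustrative ratings
    ("connector corrosion", 7, 4, 6),
    ("solder joint crack", 9, 2, 8),
    ("firmware watchdog miss", 6, 3, 3),
]

def rpn(severity, occurrence, detection):
    for rating in (severity, occurrence, detection):
        assert 1 <= rating <= 10, "FMEA ratings are on a 1-10 scale"
    return severity * occurrence * detection

# FMEA triage: address the highest RPN first.
ranked = sorted(failure_modes, key=lambda m: rpn(*m[1:]), reverse=True)
```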
Diao, K; Farmani, R; Fu, G; Astaraie-Imani, M; Ward, S; Butler, D
2014-01-01
Large water distribution systems (WDSs) are networks with both topological and behavioural complexity. It is therefore usually difficult to identify the key features of the properties of the system, and subsequently all the critical components within the system, for a given purpose of design or control. One way, however, is to visualize the network structure and interactions between components more explicitly by dividing a WDS into a number of clusters (subsystems). Accordingly, this paper introduces a clustering strategy that decomposes WDSs into clusters with stronger internal connections than external connections. The detected cluster layout is very similar to the community structure of the served urban area. As WDSs may expand along with urban development in a community-by-community manner, the correspondingly formed distribution clusters may reveal some crucial configurations of WDSs. For verification, the method is applied to identify all the critical links during firefighting for the vulnerability analysis of a real-world WDS. Moreover, both the most critical pipes and clusters are addressed, given the consequences of pipe failure. Compared with the enumeration method, the method used in this study identifies the same group of the most critical components and provides similar criticality prioritizations of them in substantially less computation time.
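A toy version of the decomposition idea (stronger internal than external connections) can be sketched by thresholding link strengths and taking connected components; the network, weights, and threshold are invented, and the paper's actual clustering algorithm is not reproduced here:

```python
from collections import defaultdict

# Toy water-distribution network: pipes as weighted edges, where the weight
# stands in for hydraulic connection strength. All names are illustrative.
pipes = [
    ("A", "B", 5), ("B", "C", 4), ("A", "C", 5),   # community 1
    ("D", "E", 6), ("E", "F", 5), ("D", "F", 4),   # community 2
    ("C", "D", 1),                                  # weak inter-cluster link
]

def clusters(edges, min_weight):
    """Drop links weaker than min_weight, then take connected components:
    a crude stand-in for 'stronger internal than external connections'."""
    adj = defaultdict(set)
    nodes = set()
    for u, v, w in edges:
        nodes.update((u, v))
        if w >= min_weight:
            adj[u].add(v)
            adj[v].add(u)
    seen, out = set(), []
    for start in sorted(nodes):
        if start in seen:
            continue
        comp, stack = set(), [start]
        while stack:
            n = stack.pop()
            if n in comp:
                continue
            comp.add(n)
            stack.extend(adj[n] - comp)
        seen |= comp
        out.append(comp)
    return out

parts = clusters(pipes, min_weight=2)
```

Here the weak C-D pipe is the single inter-cluster link, exactly the kind of component a vulnerability analysis would flag as critical.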
Granato, Enzo
2008-07-11
Phase coherence and vortex order in a Josephson-junction array at irrational frustration are studied by extensive Monte Carlo simulations using the parallel-tempering method. A scaling analysis of the correlation length of phase variables in the fully equilibrated system shows that the critical temperature vanishes with a power-law divergent correlation length and critical exponent ν_ph, in agreement with recent results from resistivity scaling analysis. A similar scaling analysis for vortex variables reveals a different critical exponent ν_v, suggesting that there are two distinct correlation lengths associated with a decoupled zero-temperature phase transition.
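For a zero-temperature transition, a correlation length diverging as xi ~ T^(-nu) can be extracted from a log-log fit; the sketch below uses noise-free synthetic data with an arbitrary exponent, purely to illustrate the scaling-analysis step (it is not the paper's data or exponent):

```python
import numpy as np

# Synthetic correlation-length data obeying xi ~ T^(-nu), illustrative only.
nu_true = 1.5
T = np.array([0.1, 0.15, 0.2, 0.3, 0.4, 0.6])
xi = T ** (-nu_true)

# The slope of log(xi) vs log(T) gives -nu.
slope, _ = np.polyfit(np.log(T), np.log(xi), 1)
nu_est = -slope
```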
NASA Technical Reports Server (NTRS)
Rajagopal, K. R.
1992-01-01
The technical effort and computer code development are summarized. Several formulations for Probabilistic Finite Element Analysis (PFEA) are described, with emphasis on the selected formulation. The strategies being implemented in the first-version computer code to perform linear, elastic PFEA are described. The results of a series of select Space Shuttle Main Engine (SSME) component surveys are presented. These results identify the critical components and provide the information necessary for probabilistic structural analysis. Volume 2 is a summary of critical SSME components.
NASA Astrophysics Data System (ADS)
Widyaningsih, E.; Waluya, S. B.; Kurniasih, A. W.
2018-03-01
This study aims to determine whether students taught with the Learning Cycle 7E achieve mastery learning in critical thinking ability, whether their critical thinking ability is better than that of students taught with an expository model, and to describe students' critical thinking phases based on mathematical anxiety level. The method is a mixed-methods design with a concurrent embedded strategy. The population is grade VII students of SMP Negeri 3 Kebumen in the 2016/2017 academic year. Subjects were determined by purposive sampling, with two students selected from each level of mathematical anxiety. Data collection techniques included a test, questionnaire, interview, and documentation. Quantitative data analysis techniques included a mean test, proportion test, difference test of two means, and difference test of two proportions; for qualitative data, the Miles and Huberman model was used. The results show that: (1) students' critical thinking ability with Learning Cycle 7E achieves mastery learning; (2) students' critical thinking ability with Learning Cycle 7E is better than that with the expository model; (3) regarding critical thinking phases by mathematical anxiety level, the lower the mathematical anxiety level, the more fully subjects fulfil the indicators of the clarification, assessment, inference, and strategies phases.
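One of the quantitative techniques listed, the difference test of two proportions, can be sketched as a standard pooled two-proportion z-test; the counts below are invented for illustration:

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    # Pooled two-proportion z statistic.
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# e.g. hypothetical mastery counts in two classes of 32 students each
z = two_proportion_z(28, 32, 20, 32)
significant = abs(z) > 1.96   # 5% two-sided level
```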
Determining the Number of Factors in P-Technique Factor Analysis
ERIC Educational Resources Information Center
Lo, Lawrence L.; Molenaar, Peter C. M.; Rovine, Michael
2017-01-01
Determining the number of factors is a critical first step in exploratory factor analysis. Although various criteria and methods for determining the number of factors have been evaluated in the usual between-subjects R-technique factor analysis, there is still question of how these methods perform in within-subjects P-technique factor analysis. A…
Critical analysis of active methods of ozone layer recovery
NASA Astrophysics Data System (ADS)
Bekker, S. Z.; Doronin, A. P.; Kozlov, S. I.
2017-09-01
A critical analysis is given for various methods for recovery of the ozone layer of the Earth: the emission of alkane gases, the destruction of freons by laser IR radiation and with microwave discharge, exposure to laser UV radiation and electric discharge in the atmosphere, the use of solar radiation, laser infrared radiation, and gamma rays, and the creation of an artificial formation at high altitudes that shields the solar radiation dissociating ozone. The optimal methods are discussed in terms of their effectiveness, economic costs, and environmental consequences. These include the use of gamma rays sources, electric discharge in the atmosphere, and microwave breakdown.
Analysis of Mathematics Critical Thinking Students in Junior High School Based on Cognitive Style
NASA Astrophysics Data System (ADS)
Agoestanto, A.; Sukestiyarno, YL; Rochmad
2017-04-01
The purpose of this research was to determine the mathematics critical thinking ability of junior high school students based on the FI and FD cognitive styles. Data were taken from grade VIII students at SMPN 2 Ambarawa. The research used a descriptive qualitative approach. Data were collected with a testing method: critical thinking was measured with the WGCTA, modified with mathematical problems, and cognitive style was measured with the GEFT. The students' test results were analysed, and four students were selected for qualitative analysis: two with the FI cognitive style and two with the FD cognitive style. The results showed that the mathematics critical thinking of students with the FI cognitive style is better than that of students with the FD cognitive style in inference, assumption, deduction, and interpretation, while in argument evaluation, students with the FD cognitive style are slightly better than students with the FI cognitive style.
Challenges of postgraduate critical care nursing program in Iran.
Dehghan Nayeri, Nahid; Shariat, Esmaeil; Tayebi, Zahra; Ghorbanzadeh, Majid
2017-01-01
Background: The main philosophy of postgraduate preparation for working in critical care units is to ensure the safety and quality of patient care. The increasing complexity of technology, decision-making challenges, and the high demand for advanced communication skills necessitate the education of learners. With this aim, a master's degree in critical care nursing has been established in Iran. The current study was designed to collect critical care nursing students' experiences and their feedback on the field of critical care nursing. Methods: This study used qualitative content analysis through in-depth semi-structured interviews. The Graneheim and Lundman method was used for data analysis. Results: The results of the 15 interviews were classified into the following domains: the vision of hope and illusion; shades of grey attitude; inefficient program and planning; inadequacy to run the program; and multiple outcomes: far from effectiveness. Overall, the findings indicated the necessity of reviewing the curriculum and the way the program is implemented. Conclusion: The findings of this study provide valuable information to improve the critical care nursing program. They also facilitate the next review of the program by the authorities.
Formal Methods for Life-Critical Software
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Johnson, Sally C.
1993-01-01
The use of computer software in life-critical applications, such as for civil air transports, demands the use of rigorous formal mathematical verification procedures. This paper demonstrates how to apply formal methods to the development and verification of software by leading the reader step-by-step through requirements analysis, design, implementation, and verification of an electronic phone book application. The current maturity and limitations of formal methods tools and techniques are then discussed, and a number of examples of the successful use of formal methods by industry are cited.
Alessandri, Elena; Williamson, Victoria J.; Eiholzer, Hubert; Williamon, Aaron
2015-01-01
Critical reviews offer rich data that can be used to investigate how musical experiences are conceptualized by expert listeners. However, these data also present significant challenges in terms of organization, analysis, and interpretation. This study presents a new systematic method for examining written responses to music, tested on a substantial corpus of music criticism. One hundred critical reviews of Beethoven’s piano sonata recordings, published in the Gramophone between August 1934 and July 2010, were selected using in-depth data reduction (qualitative/quantitative approach). The texts were then examined using thematic analysis in order to generate a visual descriptive model of expert critical review. This model reveals how the concept of evaluation permeates critical review. It also distinguishes between two types of descriptors. The first characterizes the performance in terms of specific actions or features of the musical sound (musical parameters, technique, and energy); the second appeals to higher-order properties (artistic style, character and emotion, musical structure, communicativeness) or assumed performer qualities (understanding, intentionality, spontaneity, sensibility, control, and care). The new model provides a methodological guide and conceptual basis for future studies of critical review in any genre. PMID:25741295
2014-04-01
Report TR-14-33: A Posteriori Error Analysis and Uncertainty Quantification for Adaptive Multiscale Operator Decomposition Methods for Multiphysics Problems, Donald Estep and Michael Holst, April 2014, grant HDTRA1-09-1-0036. Approved for public release; distribution is unlimited. Related publication: Barrier methods for critical exponent problems in geometric analysis and mathematical physics, J. Erway and M. Holst, submitted for publication.
Estimating erosion risk on forest lands using improved methods of discriminant analysis
J. Lewis; R. M. Rice
1990-01-01
A population of 638 timber harvest areas in northwestern California was sampled for data related to the occurrence of critical amounts of erosion (>153 m3 within 0.81 ha). Separate analyses were done for forest roads and logged areas. Linear discriminant functions were computed in each analysis to contrast site conditions at critical plots with randomly selected...
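A minimal two-class linear discriminant of the kind used in such studies can be sketched with Fisher's rule; the site variables and synthetic data below are invented, not the study's 638-unit dataset:

```python
import numpy as np

# Two site variables (say, slope and road density) for non-critical (0)
# and critical (1) erosion plots; synthetic, well-separated data.
rng = np.random.default_rng(1)
X0 = rng.normal([10.0, 2.0], 1.0, size=(50, 2))
X1 = rng.normal([14.0, 4.0], 1.0, size=(50, 2))

# Fisher's linear discriminant: w = Sw^-1 (mu1 - mu0), midpoint threshold.
mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
w = np.linalg.solve(Sw, mu1 - mu0)
threshold = w @ (mu0 + mu1) / 2.0

def predict(x):
    """1 = predicted critical erosion, 0 = not."""
    return int(w @ np.asarray(x) > threshold)
```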
NASA Technical Reports Server (NTRS)
Liu, A. F.
1974-01-01
A systematic approach for applying methods for fracture control in the structural components of space vehicles consists of four major steps. The first step is to define the primary load-carrying structural elements and the type of load, environment, and design stress levels acting upon them. The second step is to identify the potential fracture-critical parts by means of a selection logic flow diagram. The third step is to evaluate the safe-life and fail-safe capabilities of the specified part. The last step in the sequence is to apply the control procedures that will prevent damage to the fracture-critical parts. The fracture control methods discussed include fatigue design and analysis methods, methods for preventing crack-like defects, fracture mechanics analysis methods, and nondestructive evaluation methods. An example problem is presented for evaluation of the safe-crack-growth capability of the space shuttle crew compartment skin structure.
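The safe-life evaluation step can be illustrated with a Paris-law crack-growth integration; the material constants, geometry factor, and crack sizes below are invented, not values from the Shuttle analysis:

```python
import math

# Paris law: da/dN = C * (dK)^m, with dK = Y * dsigma * sqrt(pi * a).
C, m, Y = 1e-11, 3.0, 1.1      # illustrative Paris constants and geometry factor
dsigma = 100.0                 # stress range per cycle, MPa
a0, a_crit = 1e-3, 1e-2        # initial and critical crack sizes, m

def cycles_to_failure(a0, a_crit, steps=100_000):
    """Numerically integrate dN = da / (C * dK^m) from a0 to a_crit."""
    n, a = 0.0, a0
    da = (a_crit - a0) / steps
    for _ in range(steps):
        dK = Y * dsigma * math.sqrt(math.pi * a)
        n += da / (C * dK ** m)
        a += da
    return n

N_safe = cycles_to_failure(a0, a_crit)   # predicted safe life, cycles
```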
Ghatak, Ishita; Dhat, Vaishali; Tilak, Mona A; Roy, Indranath
2016-08-01
Acid Base Disorders (ABDs) are commonly encountered in critically ill Chronic Kidney Disease (CKD) patients. Timely and correct analysis of Arterial Blood Gases (ABG) is critical for the diagnosis, treatment and prediction of outcome of these patients. The aim was to explore the type and prevalence of ABDs in 31 critically ill CKD patients from a tertiary care hospital in Maharashtra, to compare two methods of analysis, the bedside and systematic approaches, and to clinically correlate the nature of ABDs in these patients. The initial ABG reports of 31 consecutive CKD patients were analysed by the two methods. A Medica EasyStat analyser was used, based on potentiometry with ion-selective electrodes for pH and pCO2 and amperometry for pO2. Serum albumin was also measured by the bromocresol green dye binding method using a Liquixx albumin kit on an Erba XL 300 autoanalyser. The Chi-square test was used for statistical analysis using Epi Info version 3.5.4 and SPSS 14.0 software. The systematic method showed a significantly higher prevalence of mixed disorders (50%) compared to the bedside method (12.9%). The most prevalent disorder by the bedside method was metabolic acidosis, in 15 cases (48.39%). By the systematic method, 3 reports were invalid. As a single category, the most prevalent types were simple respiratory alkalosis and mixed metabolic acidosis with respiratory alkalosis, with 6 of 31 cases each (19.36% each). As a whole, metabolic acidosis (including both High Anion Gap Metabolic Acidosis, HAGMA, and Non Anion Gap Metabolic Acidosis, NAGMA, with 4 cases of each type) was most prevalent: 8 of 31 (25.8%). The systematic approach was more effective in diagnosing mixed acid base disorders, and by the systematic method the findings in most cases could be correlated with the clinical condition and provisional diagnosis. Thus, interpretation of ABDs using a stepwise approach could help clinicians in the early diagnosis and management of these patients.
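The anion-gap step of a systematic ABG approach is simple to script; the patient values below are invented, and the albumin correction (about 2.5 mmol/L per 1 g/dL of albumin below 4 g/dL) is the standard bedside rule:

```python
# Anion gap: AG = Na+ - (Cl- + HCO3-), all in mmol/L.
def anion_gap(na, cl, hco3):
    return na - (cl + hco3)

# Correct for hypoalbuminaemia, common in critically ill CKD patients.
def corrected_anion_gap(ag, albumin_g_dl, normal_albumin=4.0):
    return ag + 2.5 * (normal_albumin - albumin_g_dl)

# Illustrative patient values:
ag = anion_gap(na=138, cl=100, hco3=14)
ag_corr = corrected_anion_gap(ag, albumin_g_dl=2.8)
is_hagma = ag_corr > 12        # high anion gap if above ~12 mmol/L
```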
Critical behavior near the ferromagnetic phase transition in double perovskite Nd2NiMnO6
NASA Astrophysics Data System (ADS)
Ali, Anzar; Sharma, G.; Singh, Yogesh
2018-05-01
The knowledge of critical exponents plays a crucial role in understanding the interaction mechanism near a phase transition. In this report, we present a detailed study of the critical behaviour near the ferromagnetic (FM) transition (TC ≈ 193 K) in Nd2NiMnO6 using temperature and magnetic field dependent isothermal magnetisation measurements. We used various analysis methods such as the Arrott plot, modified Arrott plot, and Kouvel-Fisher plot to estimate the critical parameters. The magnetic critical parameters β = 0.49±0.02 and γ = 1.05±0.04 and the critical isothermal parameter δ = 3.05±0.02 are in excellent agreement with Widom scaling. The critical parameter analysis indicates that a mean-field interaction mechanism drives the FM transition in Nd2NiMnO6.
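The Kouvel-Fisher step can be sketched on synthetic data: below Tc, Ms ~ (Tc - T)^beta, so Y = Ms/(dMs/dT) is linear in T with slope 1/beta and x-intercept Tc. The beta and Tc fed into the synthetic data below simply echo the values reported above; the code is an illustration, not the paper's analysis:

```python
import numpy as np

beta_true, Tc_true = 0.49, 193.0
T = np.linspace(150.0, 190.0, 200)
Ms = (Tc_true - T) ** beta_true           # synthetic spontaneous magnetisation

# Kouvel-Fisher: Y(T) = Ms / (dMs/dT) = (T - Tc)/beta is a straight line.
Y = Ms / np.gradient(Ms, T)
slope, intercept = np.polyfit(T, Y, 1)

beta_est = 1.0 / slope
Tc_est = -intercept / slope               # x-intercept of the fitted line
```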
Multifractality and Network Analysis of Phase Transition
Li, Wei; Yang, Chunbin; Han, Jihui; Su, Zhu; Zou, Yijiang
2017-01-01
Many models and real complex systems possess critical thresholds at which the systems shift dramatically from one state to another. The discovery of early warnings in the vicinity of critical points is of great importance for estimating how far a system is from its critical state. Multifractal detrended fluctuation analysis (MF-DFA) and the visibility graph method have been employed to investigate the multifractal and geometrical properties of the magnetization time series of the two-dimensional Ising model. Multifractality of the time series near the critical point has been uncovered from the generalized Hurst exponents and singularity spectrum. Both long-term correlation and a broad probability density function are identified as the sources of multifractality. The heterogeneous nature of the networks constructed from the magnetization time series validates their fractal properties. The evolution of the topological quantities of the visibility graph, along with the variation of multifractality, serves as a new early warning of phase transition. These methods and results may provide new insights into the analysis of phase transition problems and can be used as early warnings for a variety of complex systems. PMID:28107414
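The natural visibility graph construction is compact enough to sketch directly (the standard Lacasa-style rule; the short series below is illustrative, not Ising data):

```python
# Points i < j are linked if every intermediate sample lies strictly
# below the straight line joining (i, x[i]) and (j, x[j]).
def visibility_edges(x):
    n = len(x)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if all(
                x[k] < x[i] + (x[j] - x[i]) * (k - i) / (j - i)
                for k in range(i + 1, j)
            ):
                edges.add((i, j))
    return edges

series = [1.0, 3.0, 2.0, 4.0, 1.5]
g = visibility_edges(series)
degree = {i: sum(1 for e in g if i in e) for i in range(len(series))}
```

The degree distribution of such graphs is what carries the fractal signature discussed above.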
An Analysis of the Critical Reading Levels of Pre-Service Turkish and Literature Teachers
ERIC Educational Resources Information Center
Maltepe, Sadet
2016-01-01
Problem Statement: Critical reading refers to individuals' thinking about what they read, assessing what they have read, and using their own judgment about what they have read. In order to teach critical reading skills to students, a teacher is expected to have knowledge about text selection, use of appropriate methods, preparation of functional…
Measuring Road Network Vulnerability with Sensitivity Analysis
Jun-qiang, Leng; Long-hai, Yang; Liu, Wei-yi; Zhao, Lin
2017-01-01
This paper focuses on the development of a method for road network vulnerability analysis from the perspective of capacity degradation, which seeks to identify the critical infrastructures in the road network and the operational performance of the whole traffic system. This research involves defining the traffic utility index and modeling the vulnerability of road segments, routes, OD (Origin-Destination) pairs and the road network. A sensitivity analysis method is utilized to calculate the change of the traffic utility index due to capacity degradation. This method, compared to traditional traffic assignment, can improve calculation efficiency and makes the application of vulnerability analysis to large actual road networks possible. Finally, all the above models and the calculation method are applied to an actual road network evaluation to verify their efficiency and utility. This approach can be used as a decision-supporting tool for evaluating the performance of road networks and identifying critical infrastructures in transportation planning and management, especially in resource allocation for mitigation and recovery. PMID:28125706
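The capacity-degradation idea can be sketched on a toy network: raise each link's travel time as its capacity falls (a BPR-style delay function is assumed here) and rank links by the relative increase in OD travel cost when their capacity is halved. The network, flows, and capacities are invented; this is not the paper's traffic utility index:

```python
import heapq

t0 = {("s", "a"): 4.0, ("a", "t"): 4.0, ("s", "b"): 5.0, ("b", "t"): 5.0}
cap = {e: 100.0 for e in t0}    # link capacities (illustrative)
flow = {e: 80.0 for e in t0}    # assumed link flows

def time_on(e, capacity):
    # BPR-style delay: travel time grows as flow approaches capacity.
    return t0[e] * (1.0 + (flow[e] / capacity) ** 2)

def od_cost(degraded=None):
    # Dijkstra shortest travel time from "s" to "t" on the tiny network,
    # optionally with one link's capacity halved.
    adj = {}
    for (u, v) in t0:
        c = cap[(u, v)] * (0.5 if (u, v) == degraded else 1.0)
        adj.setdefault(u, []).append((v, time_on((u, v), c)))
    dist, pq = {"s": 0.0}, [(0.0, "s")]
    while pq:
        d, u = heapq.heappop(pq)
        if u == "t":
            return d
        for v, w in adj.get(u, []):
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(pq, (d + w, v))
    return float("inf")

base = od_cost()
vulnerability = {e: (od_cost(degraded=e) - base) / base for e in t0}
critical_link = max(vulnerability, key=vulnerability.get)
```

Links on the shortest path score a positive vulnerability; links off it score zero, mirroring how capacity-degradation analysis separates critical from non-critical infrastructure.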
Varndell, Wayne; Fry, Margaret; Elliott, Doug
2017-08-01
Many critically ill patients experience moderate to severe acute pain that is frequently undetected and/or undertreated. Acute pain in this patient cohort derives not only from their injury and/or illness, but also from the delivery of care whilst stabilising the patient. Emergency nurses are increasingly responsible for the safety and wellbeing of critically ill patients, which includes assessing, monitoring and managing acute pain. How emergency nurses manage acute pain in critically ill adult patients is unknown. The objective of this study is to explore how emergency nurses manage acute pain in critically ill patients in the Emergency Department. In this paper, we provide a detailed description of the methods and protocol for a multiphase sequential mixed methods study, exploring how emergency nurses assess, monitor and manage acute pain in critically ill adult patients. The objective, method, data collection and analysis of each phase are explained. Justification of each method and data integration is described. Synthesis of findings will generate a comprehensive picture of how emergency nurses perceive and manage acute pain in critically ill adult patients. The results of this study will form a knowledge base to expand theory and inform research and practice.
NASA Astrophysics Data System (ADS)
Simola, Kaisa; Laakso, Kari
1992-01-01
Eight years of operating experiences of 104 motor operated closing valves in different safety systems in nuclear power units were analyzed in a systematic way. The qualitative methods used were Failure Mode and Effect Analysis (FMEA) and Maintenance Effects and Criticality Analysis (MECA). These reliability engineering methods are commonly used in the design stage of equipment. The successful application of these methods for analysis and utilization of operating experiences was demonstrated.
ERIC Educational Resources Information Center
Escobar Alméciga, Wilder Yesid
2013-01-01
This article addresses a critical problem about asymmetrical power relationships and uneven conditions in English language education exerted via identity shaping discourses in the document Educación: "Visión 2019" issued by the Colombian Ministry of National Education. The study follows the critical discourse analysis method. It…
How mental health nurses improve their critical thinking through problem-based learning.
Hung, Tsui-Mei; Tang, Lee-Chun; Ko, Chen-Ju
2015-01-01
Critical thinking has been regarded as one of the most important elements for nurses to improve the quality of patient care. The aim of this study was to use problem-based learning (PBL) as a method in a continuing education program to evaluate nurses' critical thinking skills. A quasi-experimental study design was carried out. The "Critical Thinking Disposition Inventory" in Chinese was used for data collection. The results indicated significant improvement after PBL continuing education, notably in the dimensions of systematic analysis and curiosity. Content analysis extracted four themes: (a) changes in linear thinking were required, (b) logical and systematic thinking performance improved, (c) prior knowledge was integrated with clinical application, and (d) a brainstorming learning strategy was adopted. The study supports PBL as a continuing education strategy for mental health nurses, and shows that systematic analysis and curiosity effectively facilitate the development of critical thinking.
Critical ignition conditions in exothermically reacting systems: first-order reactions
NASA Astrophysics Data System (ADS)
Filimonov, Valeriy Yu.
2017-10-01
In this paper, a comparative analysis of the critical conditions for thermal explosion (TE) was conducted in the temperature-conversion degree and temperature-time planes. It was established that the ignition criteria are almost identical only at relatively small values of the Todes parameter; otherwise, the results of critical-condition analysis in the temperature-conversion degree plane may be wrong. An asymptotic method for calculating critical conditions for first-order reactions was proposed, taking reactant consumption into account, and the conditions for degeneration of TE were determined. Critical conditions were calculated for a specific first-order reaction; the analytical results are in good agreement with numerical calculations and experimental data.
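The asymptotic analysis itself is not reproduced in the abstract, but the qualitative picture it describes (a critical value of a Semenov-type parameter separating stationary behaviour from thermal runaway, degenerating as reactant consumption grows) can be sketched numerically. The dimensionless system and parameter names below are illustrative assumptions, with `gamma` standing in for the Todes parameter; in the consumption-free limit the classical critical Semenov number 1/e is recovered.

```python
import math

def ignites(se, gamma=0.0, theta_max=10.0, tau_max=200.0, dt=0.01):
    """Integrate a common dimensionless Semenov-type model with consumption:
        dtheta/dtau = se*(1-eta)*exp(theta) - theta   (heat balance)
        deta/dtau   = gamma*se*(1-eta)*exp(theta)     (reactant consumption)
    Returns True if thermal runaway occurs (theta exceeds theta_max)."""
    theta, eta, tau = 0.0, 0.0, 0.0
    while tau < tau_max:
        rate = se * (1.0 - eta) * math.exp(theta)
        theta += dt * (rate - theta)
        eta += dt * gamma * rate
        if theta > theta_max:
            return True
        tau += dt
    return False

def critical_semenov(gamma=0.0, lo=0.2, hi=0.6, tol=1e-3):
    """Bisect for the critical Semenov number separating subcritical
    (stationary) behaviour from supercritical (runaway) behaviour."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if ignites(mid, gamma):
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)
```

For `gamma = 0` (no consumption, i.e. the Todes parameter tending to zero) the bisection recovers the classical critical Semenov number 1/e ≈ 0.368; increasing `gamma` illustrates the degeneration of criticality the paper analyzes.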
ERIC Educational Resources Information Center
Alaei, Mahya; Ahangari, Saeideh
2016-01-01
The linguistic study of literature or critical analysis of literary discourse is no different from any other textual description; it is not a new branch or a new level or a new kind of linguistics but the application of existing theories and methods (Halliday, 2002). This study intends to determine how ideology or opinion is expressed in Joseph…
Study of Graphite/Epoxy Composites for Material Flaw Criticality.
1980-11-01
The criticality of disbonds with two-dimensional planforms located in laminated graphite/epoxy composites has been examined. A linear elastic fracture mechanics approach, semi-empirical growth laws, and methods of stress analysis based on a modified laminated plate theory have been studied for assessing growth rates of disbonds in a transverse shear environment. Elastic stability analysis has been utilized for laminates with disbonds subjected to in-plane loading.
NASA Technical Reports Server (NTRS)
1976-01-01
The application of NASTRAN to a wide variety of static and dynamic structural problems is discussed. The following topics are focused upon: (1) methods of analysis; (2) hydroelastic methods; (3) complete analysis of structures; (4) elements and material studies; (5) critical comparisons with other programs; and (6) pre- and post-processor operations.
A MULTI-RESIDUE METHOD FOR THE ANALYSIS OF INSECTICIDES COLLECTED ON COTTON SURFACE WIPES
A method was developed for the extraction, clean-up, and analysis of multiple pesticides from cotton wipe media used in human exposure studies to collect residues from residential hard surfaces. Measurements of pesticides are critical for estimating dermal and indirect ingestion ...
Critical object recognition in millimeter-wave images with robustness to rotation and scale.
Mohammadzade, Hoda; Ghojogh, Benyamin; Faezi, Sina; Shabany, Mahdi
2017-06-01
Locating critical objects is crucial in various security applications and industries. In airport security, for example, such objects may be hidden or covered under shields or secret sheaths. Millimeter-wave imaging can discover and recognize hidden critical objects without any health risk owing to its non-ionizing nature. However, millimeter-wave images usually contain wave artifacts in and around the detected objects, making object recognition difficult; regular image processing and classification methods therefore cannot be used directly, and additional pre-processing and classification methods must be introduced. This paper proposes a novel pre-processing method that cancels rotation and scale using principal component analysis. In addition, a two-layer classification method is introduced and utilized for recognition, and a large dataset of millimeter-wave images is collected and created for the experiments. Experimental results show that a typical classification method such as support vector machines recognizes only 45.5% of one type of critical object at a 34.2% false alarm rate (FAR), a drastically poor result. The same method within the proposed recognition framework achieves a 92.9% recognition rate at 0.43% FAR, a highly significant improvement. The significant contribution of this work is to introduce a new method for analyzing millimeter-wave images based on machine vision and learning approaches, which is not yet widely noted in the field of millimeter-wave image analysis.
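The paper's exact pre-processing pipeline is not reproduced in the abstract; the following is a minimal sketch of the underlying idea of canceling rotation and scale with principal component analysis, applied to a 2-D point cloud (e.g. the coordinates of a segmented object). The function name and all details are assumptions.

```python
import numpy as np

def normalize_rotation_scale(points):
    """Rotate a 2-D point cloud into its principal-axis frame and rescale
    so the variance along each axis is 1 — a hypothetical stand-in for the
    paper's PCA-based rotation/scale cancellation."""
    centered = points - points.mean(axis=0)
    cov = np.cov(centered.T)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns ascending order
    order = np.argsort(eigvals)[::-1]        # put the major axis first
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    rotated = centered @ eigvecs             # align principal axes with x/y
    return rotated / np.sqrt(eigvals)        # cancel per-axis scale
```

Any rotation, uniform scaling, or translation of the input yields the same canonical frame (up to axis sign), which is what makes the downstream classifier robust to those transformations.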
Orlandini, Serena; Pasquini, Benedetta; Caprini, Claudia; Del Bubba, Massimo; Pinzauti, Sergio; Furlanetto, Sandra
2015-11-01
A fast and selective CE method for the determination of zolmitriptan (ZOL) and its five potential impurities has been developed applying the analytical Quality by Design principles. Voltage, temperature, buffer concentration, and pH were investigated as critical process parameters that can influence the critical quality attributes, represented by critical resolution values between peak pairs, analysis time, and peak efficiency of ZOL-dimer. A symmetric screening matrix was employed for investigating the knowledge space, and a Box-Behnken design was used to evaluate the main, interaction, and quadratic effects of the critical process parameters on the critical quality attributes. Contour plots were drawn highlighting important interactions between buffer concentration and pH, and the gained information was merged into the sweet spot plots. Design space (DS) was established by the combined use of response surface methodology and Monte Carlo simulations, introducing a probability concept and thus allowing the quality of the analytical performances to be assured in a defined domain. The working conditions (with the intervals defining the DS) were as follows: BGE, 138 mM (115-150 mM) phosphate buffer pH 2.74 (2.54-2.94); temperature, 25°C (24-25°C); voltage, 30 kV. A control strategy was planned based on method robustness and system suitability criteria. The main advantages of applying the Quality by Design concept consisted of a great increase in knowledge of the analytical system, obtained through multivariate techniques, and of the achievement of analytical assurance of quality, derived from the probability-based definition of the DS. The developed method was finally validated and applied to the analysis of ZOL tablets. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
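The probability-based design space idea can be sketched as a Monte Carlo computation: vary the critical process parameters within their ranges, push them through a fitted response surface, and estimate the probability that a critical quality attribute meets its specification. The quadratic model and every number below are hypothetical stand-ins, not the paper's fitted surface.

```python
import numpy as np

def prob_in_spec(n=20000, seed=1):
    """Monte Carlo probability that a critical quality attribute (here a
    resolution Rs >= 1.5) is met when process parameters vary over their
    ranges. The response model and coefficients are hypothetical."""
    rng = np.random.default_rng(seed)
    conc = rng.uniform(115.0, 150.0, n)   # buffer concentration, mM
    ph = rng.uniform(2.54, 2.94, n)       # buffer pH
    # hypothetical fitted response surface for a critical resolution
    rs = 1.2 + 0.01 * (conc - 115.0) - 0.5 * (ph - 2.74) ** 2
    return np.mean(rs >= 1.5)
```

A design space defined this way guarantees quality with a stated probability rather than only at a nominal point, which is the distinction the authors emphasize.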
Estimating optical imaging system performance for space applications
NASA Technical Reports Server (NTRS)
Sinclair, K. F.
1972-01-01
The critical system elements of an optical imaging system are identified and a method for an initial assessment of system performance is presented. A generalized imaging system is defined. A system analysis is considered, followed by a component analysis. An example of the method is given using a film imaging system.
Analysis of Critical Thinking Skills on The Topic of Static Fluid
NASA Astrophysics Data System (ADS)
Puspita, I.; Kaniawati, I.; Suwarma, I. R.
2017-09-01
This study aimed to profile the critical thinking skills of senior high school students. A descriptive survey design was used to analyze the critical thinking test results of 40 grade XI students in one senior high school in Bogor District, selected by purposive sampling. The instrument was a critical thinking skills test covering five indicators on static fluid topics and consisting of 11 questions, developed by the researchers and validated by experts. The results showed that students' critical thinking skills are still low: almost every indicator reaches less than 30%, namely 28% for elementary clarification, 10% for basic support (the basis for decisions), 6% for inference, 6% for advanced clarification, and 4% for strategies and tactics.
ERIC Educational Resources Information Center
Finn, Jerry; Dillon, Caroline
2007-01-01
This paper describes methods for teaching content analysis as part of the research sequence in social work education. Teaching content analysis is used to develop research skills as well as to promote students' knowledge and critical thinking about new information technology resources that are being increasingly used by the general public. The…
Critical Thinking and the Use of Nontraditional Instructional Methodologies.
Orique, Sabrina B; McCarthy, Mary Ann
2015-08-01
The purpose of this study was to examine the relationship between critical thinking and the use of concept mapping (CM) and problem-based learning (PBL) during care plan development. A quasi-experimental study with a pretest-posttest design was conducted using a convenience sample (n = 49) of first-semester undergraduate baccalaureate nursing students. Critical thinking was measured using the Holistic Critical Thinking Scoring Rubric. Data analysis consisted of a repeated measures analysis of variance with post hoc mean comparison tests using the Bonferroni method. Findings indicated that mean critical thinking at phase 4 (CM and PBL) was significantly higher compared with phase 1 (baseline), phase 2 (PBL), and phase 3 (CM) (p < 0.001). The results support the utilization of nontraditional instructional (CM and PBL) methodologies in undergraduate nursing curricula. Copyright 2015, SLACK Incorporated.
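The repeated-measures ANOVA itself is standard; as an illustration of the Bonferroni post hoc step the study describes, here is a sketch on synthetic scores. The data below are invented (the study's scores are not public), and the helper name is ours.

```python
from itertools import combinations

import numpy as np
from scipy import stats

def bonferroni_posthoc(scores):
    """Pairwise paired t-tests across measurement phases with a Bonferroni
    correction. `scores` is an (n_subjects, n_phases) array; each adjusted
    p-value is the raw p multiplied by the number of comparisons, capped at 1."""
    n_phases = scores.shape[1]
    pairs = list(combinations(range(n_phases), 2))
    adjusted = {}
    for i, j in pairs:
        p = stats.ttest_rel(scores[:, i], scores[:, j]).pvalue
        adjusted[(i, j)] = min(1.0, p * len(pairs))  # Bonferroni adjustment
    return adjusted
```

With four phases there are six pairwise comparisons, so each raw p-value is multiplied by six; this is what keeps the family-wise error rate at the nominal level.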
NASA Astrophysics Data System (ADS)
Potirakis, Stelios M.; Contoyiannis, Yiannis; Asano, Tomokazu; Hayakawa, Masashi
2018-01-01
A network of 8 VLF/LF receivers has recently been established across Japan, receiving subionospheric signals from different transmitters located both in the same and other countries. The primary purpose of this network is to study disturbances in VLF/LF propagation through the lower ionosphere in possible relation to earthquake (EQ) preparation processes. Ionospheric perturbations of possible seismic origin have long been investigated and are considered very promising for short-term EQ prediction. The raw amplitude data received by the above-mentioned network, after being appropriately filtered, were analyzed by means of the method of critical fluctuations (MCF), in analogy to thermal critical systems. The MCF analysis of the VLF/LF propagation revealed that intermittency-induced criticality was reached in the lower ionosphere from 1 week to 3 days prior to the catastrophic 2016 Kumamoto EQs. These fault-type EQs occurred within a two-day period (14 April: MW = 6.2 and MW = 6.0; 15 April: MW = 7.0) at shallow depths (~10 km) and with very close epicenters, while the main event was comparable in size to the 1995 Kobe EQ. The MCF analysis results are compared to those of the conventional nighttime fluctuation method as well as to those of the natural time analysis method obtained for the same dataset, and are found to exhibit remarkable consistency.
NASA Astrophysics Data System (ADS)
Doležel, Jiří; Novák, Drahomír; Petrů, Jan
2017-09-01
Transportation routes for oversize and excessive loads are currently planned so as to ensure the transit of the vehicle through critical points on the road, such as level intersections and bridges. This article presents a comprehensive procedure for determining the reliability and load-bearing capacity of existing bridges on highways and roads using advanced methods of reliability analysis, based on Monte Carlo-type simulation techniques combined with nonlinear finite element method (FEM) analysis. The safety index described in current structural design standards, e.g. ISO and the Eurocodes, is taken as the main criterion of the reliability level of existing structures. As an example, the load-bearing capacity of a 60-year-old single-span slab bridge made of precast prestressed concrete girders is determined for the ultimate limit state and the serviceability limit state. The structure's design load capacity was estimated by fully probabilistic nonlinear FEM analysis using the Latin Hypercube Sampling (LHS) simulation technique. Load-bearing capacity values based on the fully probabilistic analysis are compared with those estimated by deterministic methods applied to the critical section of the most heavily loaded girders.
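As an illustration of the simulation side only, the sketch below estimates a Cornell-type reliability (safety) index for a toy limit state g = R − E using Latin Hypercube Sampling. The distributions, units, and numbers are assumptions for the sketch, not the paper's bridge model.

```python
import numpy as np
from scipy import stats
from scipy.stats import qmc

def reliability_index_lhs(n=4096, seed=0):
    """Cornell reliability index beta = mean(g)/std(g) for a toy limit
    state g = R - E, with R (resistance) and E (load effect) sampled by
    Latin Hypercube Sampling. All distribution parameters are illustrative."""
    sampler = qmc.LatinHypercube(d=2, seed=seed)
    u = sampler.random(n)                               # stratified uniforms
    R = stats.norm.ppf(u[:, 0], loc=12.0, scale=1.5)    # resistance
    E = stats.norm.ppf(u[:, 1], loc=7.0, scale=1.0)     # load effect
    g = R - E                                           # safety margin
    return g.mean() / g.std(ddof=1)
```

For independent normal R and E the index has the closed form (μR − μE)/√(σR² + σE²), so the LHS estimate can be checked against 5/√3.25 ≈ 2.77; in the paper's setting g comes from a nonlinear FEM model instead of a formula, which is why sampling is needed.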
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ivanova, T.; Laville, C.; Dyrda, J.
2012-07-01
The sensitivities of the k_eff eigenvalue to neutron cross sections have become commonly used in similarity studies and as part of the validation algorithm for criticality safety assessments. To test calculations of the sensitivity coefficients, a benchmark study (Phase III) has been established by the OECD-NEA/WPNCS/EG UACSA (Expert Group on Uncertainty Analysis for Criticality Safety Assessment). This paper presents some sensitivity results generated by the benchmark participants using various computational tools based upon different computational methods: SCALE/TSUNAMI-3D and -1D, MONK, APOLLO2-MORET 5, DRAGON-SUSD3D and MMKKENO. The study demonstrates the performance of the tools. It also illustrates how model simplifications impact the sensitivity results and demonstrates the importance of 'implicit' (self-shielding) sensitivities. This work has been a useful step towards verification of the existing and developed sensitivity analysis methods.
Counter-Learning under Oppression
ERIC Educational Resources Information Center
Kucukaydin, Ilhan
2010-01-01
This qualitative study utilized the method of narrative analysis to explore the counter-learning process of an oppressed Kurdish woman from Turkey. Critical constructivism was utilized to analyze counter-learning; Frankfurt School-based Marcusian critical theory was used to analyze the sociopolitical context and its impact on the oppressed. Key…
Using Case Studies: An International Approach
ERIC Educational Resources Information Center
McClam, Tricia; Woodside, Marianne
2005-01-01
Case studies as an instructional strategy have been used in many disciplines, including law, teacher education, science, medicine, and business. Among the benefits of this method of instruction are involving students in learning, developing their critical thinking skills, promoting communication, and engaging in critical analysis. Case studies are…
USDA-ARS?s Scientific Manuscript database
Wort beta-glucan concentration is a critical malting quality parameter used to identify and avoid potential brewhouse filtration problems. ASBC method Wort-18 is widely used in malt analysis laboratories and brewhouses to measure wort beta-glucan levels. However, the chemistry underlying the method...
ERIC Educational Resources Information Center
Yung-Kuan, Chan; Hsieh, Ming-Yuan; Lee, Chin-Feng; Huang, Chih-Cheng; Ho, Li-Chih
2017-01-01
Under the hyper-dynamic education situation, this research, in order to comprehensively explore the interplays between Teacher Competence Demands (TCD) and Learning Organization Requests (LOR), cross-employs the data refinement methods of Descriptive Statistics (DS), Analysis of Variance (ANOVA) and Principal Components Analysis (PCA)…
Culture Representation in Human Reliability Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
David Gertman; Julie Marble; Steven Novack
Understanding human-system response is critical to being able to plan and predict mission success in the modern battlespace. Commonly, human reliability analysis has been used to predict failures of human performance in complex, critical systems. However, most human reliability methods fail to take culture into account. This paper takes an easily understood state of the art human reliability analysis method and extends that method to account for the influence of culture, including acceptance of new technology, upon performance. The cultural parameters used to modify the human reliability analysis were determined from two standard industry approaches to cultural assessment: Hofstede's (1991) cultural factors and Davis' (1989) technology acceptance model (TAM). The result is called the Culture Adjustment Method (CAM). An example is presented that (1) reviews human reliability assessment with and without cultural attributes for a Supervisory Control and Data Acquisition (SCADA) system attack, (2) demonstrates how country specific information can be used to increase the realism of HRA modeling, and (3) discusses the differences in human error probability estimates arising from cultural differences.
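CAM's actual adjustment factors are not reproduced in the abstract. As a minimal sketch of the general mechanism many HRA methods (e.g. SPAR-H) use, and which a cultural extension could plug into, a nominal human error probability is scaled by multiplicative performance shaping factors; the factor names and values below are entirely hypothetical.

```python
def adjusted_hep(base_hep, factors):
    """Scale a nominal human error probability (HEP) by multiplicative
    performance-shaping-factor multipliers, capping the result at 1.0.
    The culture-derived factor names/values used here are hypothetical,
    not CAM's actual parameters."""
    hep = base_hep
    for multiplier in factors.values():
        hep *= multiplier
    return min(hep, 1.0)
```

A multiplier above 1 models a culture-related condition that degrades performance, below 1 one that helps; real methods use composite formulas to keep many simultaneous factors from pushing the probability past 1, which the simple cap only crudely approximates.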
NASA Astrophysics Data System (ADS)
Hananto, R. B.; Kusmayadi, T. A.; Riyadi
2018-05-01
The research aims to identify students' critical thinking processes in solving geometry problems; the problem selected in this study concerned a flat-sided solid (a cube). The critical thinking process was examined across visual, auditory and kinesthetic learning styles in a descriptive qualitative study. The subjects were three students selected by purposive sampling, one for each learning style. Data were collected through tests, interviews, and observation. The results showed that students' critical thinking processes in the identifying and defining steps were similar across learning styles; differences appeared in the enumerate, analyze, list, and self-correct steps. It was also found that the critical thinking process of the student with a kinesthetic learning style was better than those of the students with visual and auditory learning styles.
NASA Astrophysics Data System (ADS)
Gao, Zhiwen; Zhou, Youhe
2015-04-01
A real fundamental solution for the fracture problem of a transversely isotropic high-temperature superconductor (HTS) strip is obtained. The superconductor E-J constitutive law is characterized by the Bean model, in which the critical current density is independent of the flux density. Fracture analysis is performed by the method of singular integral equations, which are solved numerically by the Gauss-Lobatto-Chebyshev collocation method. To guarantee satisfactory accuracy, the convergence behavior of the kernel function is investigated. Numerical results for the fracture parameters are obtained, and the effects of the geometric characteristics, applied magnetic field and critical current density on the stress intensity factors (SIF) are discussed.
Information retrieval for nonstationary data records
NASA Technical Reports Server (NTRS)
Su, M. Y.
1971-01-01
A review and critical discussion of existing methods for the analysis of nonstationary time series are presented, and a new algorithm for splitting nonstationary time series is applied to the analysis of sunspot data.
2002-05-13
alternative: feedback from the environment. This was Darwin's great insight, that an agent can improve its internal models without any paranormal ... identifying the variables of war and establishing their interrelations. Clausewitz and Schneider considered a critical analysis of history as the only ... to separate the enduring principles from the accidental anomalies. This critical analysis of history is the method that comprises Dewey's pattern of
An improved UHPLC-UV method for separation and quantification of carotenoids in vegetable crops.
Maurer, Megan M; Mein, Jonathan R; Chaudhuri, Swapan K; Constant, Howard L
2014-12-15
Carotenoid identification and quantitation are critical for the development of nutritionally improved plant varieties. Industrial analysis of carotenoids is typically carried out on multiple crops, with potentially thousands of samples per crop, placing critical demands on the speed and broad utility of the analytical methods. Current chromatographic methods for carotenoid analysis have had limited industrial application due to their low throughput, requiring up to 60 min for complete separation of all compounds. We have developed an improved UHPLC-UV method that resolves all major carotenoids found in broccoli (Brassica oleracea L. var. italica), carrot (Daucus carota), corn (Zea mays), and tomato (Solanum lycopersicum). The chromatographic method is completed in 13.5 min, allowing for the resolution of the 11 carotenoids of interest, including the structural isomers lutein/zeaxanthin and α-/β-carotene. Additional minor carotenoids have also been separated and identified with this method, demonstrating its utility across major commercial food crops. Copyright © 2014 Elsevier Ltd. All rights reserved.
Segura-Totten, Miriam; Dalman, Nancy E.
2013-01-01
Analysis of the primary literature in the undergraduate curriculum is associated with gains in student learning. In particular, the CREATE (Consider, Read, Elucidate hypotheses, Analyze and interpret the data, and Think of the next Experiment) method is associated with an increase in student critical thinking skills. We adapted the CREATE method within a required cell biology class and compared the learning gains of students using CREATE to those of students involved in less structured literature discussions. We found that while both sets of students had gains in critical thinking, students who used the CREATE method did not show significant improvement over students engaged in a more traditional method for dissecting the literature. Students also reported similar learning gains for both literature discussion methods. Our study suggests that, at least in our educational context, the CREATE method does not lead to higher learning gains than a less structured way of reading primary literature. PMID:24358379
Disrupting the Pipeline: Critical Analyses of Student Pathways through Postsecondary STEM Education
ERIC Educational Resources Information Center
Metcalf, Heather E.
2014-01-01
Critical mixed methods approaches allow us to reflect upon the ways in which we collect, measure, interpret, and analyze data, providing novel alternatives for quantitative analysis. For institutional researchers, whose work influences institutional policies, programs, and practices, the approach has the transformative ability to expose and create…
Theoretical Cognitive Principles Observed in the Social Studies Classroom
ERIC Educational Resources Information Center
Walker, Juan; Langan, Elise; Kemp, Andrew; Pagnotti, John; Russell, William
2016-01-01
Pre-service elementary social studies teachers in the southeastern United States participated in a mixed methods study to determine the degree to which they utilized critical thinking skills. Analyses included their reflections and the Insight Assessment critical thinking skills and dispositions tests. The researchers developed a post survey for…
USDA-ARS?s Scientific Manuscript database
Critical path analysis (CPA) is a method for estimating macroscopic transport coefficients of heterogeneous materials that are highly disordered at the micro-scale. Developed originally to model conduction in semiconductors, numerous researchers have noted that CPA might also have relevance to flow ...
Empowering Discourse: Discourse Analysis as Method and Practice in the Sociology Classroom
ERIC Educational Resources Information Center
Hjelm, Titus
2013-01-01
Collaborative learning and critical pedagogy are widely recognized as "empowering" pedagogies for higher education. Yet, the practical implementation of both has a mixed record. The question, then, is: How could collaborative and critical pedagogies be empowered themselves? This paper makes a primarily theoretical case for discourse…
NASA Astrophysics Data System (ADS)
Ishizawa, Y.; Abe, K.; Shirako, G.; Takai, T.; Kato, H.
The electromagnetic compatibility (EMC) control method, system EMC analysis method, and system test method applied to the components of the MOS-1 satellite are described. The merits and demerits of the problem-solving, specification, and system approaches to EMC control are summarized, as are the data requirements of the SEMCAP (specification and electromagnetic compatibility analysis program) computer program for verifying the EMI safety margin of the components. Examples of EMC design are mentioned, and the EMC design process and the selection method for EMC critical points are shown along with sample EMC test results.
Marshall, Najja; Timme, Nicholas M; Bennett, Nicholas; Ripp, Monica; Lautzenhiser, Edward; Beggs, John M
2016-01-01
Neural systems include interactions that occur across many scales. Two divergent methods for characterizing such interactions have drawn on the physical analysis of critical phenomena and the mathematical study of information. Inferring criticality in neural systems has traditionally rested on fitting power laws to the property distributions of "neural avalanches" (contiguous bursts of activity), but the fractal nature of avalanche shapes has recently emerged as another signature of criticality. On the other hand, neural complexity, an information theoretic measure, has been used to capture the interplay between the functional localization of brain regions and their integration for higher cognitive functions. Unfortunately, treatments of all three methods (power-law fitting, avalanche shape collapse, and neural complexity) have suffered from shortcomings. Empirical data often contain biases that introduce deviations from a true power law in the tail and head of the distribution, but deviations in the tail have often gone unconsidered; avalanche shape collapse has required manual parameter tuning; and the estimation of neural complexity has relied on small data sets or statistical assumptions for the sake of computational efficiency. In this paper we present technical advancements in the analysis of criticality and complexity in neural systems. We use maximum-likelihood estimation to automatically fit power laws with left and right cutoffs, present the first automated shape collapse algorithm, and describe new techniques to account for large numbers of neural variables and small data sets in the calculation of neural complexity. In order to facilitate future research in criticality and complexity, we have made the software utilized in this analysis freely available online in the MATLAB NCC (Neural Complexity and Criticality) Toolbox.
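The NCC Toolbox implements the full fit with left and right cutoffs; as a simplified sketch of the core maximum-likelihood step, here is the standard continuous power-law (Hill/Clauset-style) estimator above a single lower cutoff. The two-sided truncated fit in the paper generalizes this.

```python
import numpy as np

def fit_power_law(x, xmin):
    """Maximum-likelihood exponent for a continuous power law p(x) ~ x^(-alpha)
    on the tail x >= xmin:  alpha_hat = 1 + n / sum(log(x_i / xmin)).
    Returns the estimate and the tail sample size."""
    tail = x[x >= xmin]
    alpha = 1.0 + tail.size / np.sum(np.log(tail / xmin))
    return alpha, tail.size
```

Sampling from a known power law via the inverse CDF, x = (1 − u)^(−1/(α−1)) · xmin, gives a quick self-check of the estimator; choosing the cutoff(s) themselves, which is where head/tail biases enter, is the part the paper automates.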
Implementation of a computer database testing and analysis program.
Rouse, Deborah P
2007-01-01
The author coordinates a computerized database testing and analysis program implemented in an associate degree nursing program. Computer database programs help support the test development and analysis process, and critical thinking is measurable and promoted through their use. The reader of this article will learn what is involved in procuring and implementing a computer database testing and analysis program in an academic nursing program. The use of the computerized database for testing and analysis is approached as a method to promote and evaluate nursing students' critical thinking skills and to prepare them for the National Council Licensure Examination.
Structural dynamic analysis of the Space Shuttle Main Engine
NASA Technical Reports Server (NTRS)
Scott, L. P.; Jamison, G. T.; Mccutcheon, W. A.; Price, J. M.
1981-01-01
This structural dynamic analysis supports development of the SSME by evaluating components subjected to critical dynamic loads, identifying significant parameters, and evaluating solution methods. Engine operating parameters at both rated and full power levels are considered. Detailed structural dynamic analyses of operationally critical and life-limited components support the assessment of engine design modifications and environmental changes. Engine system test results are utilized to verify analytic model simulations. The SSME main chamber injector assembly consists of 600 injector elements, called LOX posts. The overall LOX post analysis procedure is shown.
Systematic risk assessment methodology for critical infrastructure elements - Oil and Gas subsectors
NASA Astrophysics Data System (ADS)
Gheorghiu, A.-D.; Ozunu, A.
2012-04-01
The concern for the protection of critical infrastructure has been rapidly growing in the last few years in Europe. The level of knowledge and preparedness in this field is beginning to develop in a lawfully organized manner, for the identification and designation of critical infrastructure elements of national and European interest. Oil and gas production, refining, treatment, storage and transmission by pipelines facilities, are considered European critical infrastructure sectors, as per Annex I of the Council Directive 2008/114/EC of 8 December 2008 on the identification and designation of European critical infrastructures and the assessment of the need to improve their protection. Besides identifying European and national critical infrastructure elements, member states also need to perform a risk analysis for these infrastructure items, as stated in Annex II of the above mentioned Directive. In the field of risk assessment, there are a series of acknowledged and successfully used methods in the world, but not all hazard identification and assessment methods and techniques are suitable for a given site, situation, or type of hazard. As Theoharidou, M. et al. noted (Theoharidou, M., P. Kotzanikolaou, and D. Gritzalis 2009. Risk-Based Criticality Analysis. In Critical Infrastructure Protection III. Proceedings. Third Annual IFIP WG 11.10 International Conference on Critical Infrastructure Protection. Hanover, New Hampshire, USA, March 23-25, 2009: revised selected papers, edited by C. Palmer and S. Shenoi, 35-49. Berlin: Springer.), despite the wealth of knowledge already created, there is a need for simple, feasible, and standardized criticality analyses. The proposed systematic risk assessment methodology includes three basic steps: the first step (preliminary analysis) includes the identification of hazards (including possible natural hazards) for each installation/section within a given site, followed by a criterial analysis and then a detailed analysis step. 
The criterial evaluation is used as a ranking system in order to establish the priorities for the detailed risk assessment. This criterial analysis stage is necessary because the total number of installations and sections on a site can be quite large. As not all installations and sections on a site contribute significantly to the risk of a major accident occurring, it is not efficient to include all installations and sections in the detailed risk assessment, which can be time and resource consuming. The selected installations are then taken into consideration in the detailed risk assessment, which is the third step of the systematic risk assessment methodology. Following this step, conclusions can be drawn related to the overall risk characteristics of the site. The proposed methodology can as such be successfully applied to the assessment of risk related to critical infrastructure elements falling under the energy sector of Critical Infrastructure, mainly the sub-sectors oil and gas. Key words: Systematic risk assessment, criterial analysis, energy sector critical infrastructure elements
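The criterial (screening) stage described above can be sketched as a simple scoring-and-thresholding step that selects which installations proceed to detailed risk assessment. The criteria, weights, and threshold below are illustrative assumptions, not those of the proposed methodology.

```python
def criterial_screening(sites, threshold):
    """Rank installations by an additive criterial score and keep those at
    or above `threshold` for detailed risk assessment. `sites` maps an
    installation name to a dict of criterion scores (all hypothetical)."""
    scored = sorted(((sum(criteria.values()), name)
                     for name, criteria in sites.items()), reverse=True)
    return [(name, score) for score, name in scored if score >= threshold]
```

Screening this way keeps the detailed (time- and resource-consuming) assessment focused on the installations that dominate the site's major-accident risk, which is the motivation given in the text.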
Fractal analysis of GPS time series for early detection of disastrous seismic events
NASA Astrophysics Data System (ADS)
Filatov, Denis M.; Lyubushin, Alexey A.
2017-03-01
A new method of fractal analysis of time series for estimating the chaoticity of the behaviour of open stochastic dynamical systems is developed. The method is a modification of the conventional detrended fluctuation analysis (DFA) technique. We start by analysing both methods from the physical point of view and demonstrate the difference between them, which results in a higher accuracy of the new method compared to conventional DFA. Then, applying the developed method to estimate the measure of chaoticity of a real dynamical system - the Earth's crust - we reveal that the latter exhibits two distinct mechanisms of transition to a critical state: while the first mechanism is already known from numerous studies of other dynamical systems, the second is new and has not previously been described. Using GPS time series, we demonstrate the efficiency of the developed method in identifying critical states of the Earth's crust. Finally, we employ the method to solve a practically important task: we show how the developed measure of chaoticity can be used for early detection of disastrous seismic events, and we provide a detailed discussion of the numerical results, which are consistent with the outcomes of other research on the topic.
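The abstract does not reproduce the modified algorithm, but the conventional DFA baseline it builds on is standard and can be sketched briefly. In this illustrative Python sketch (the window sizes and the white-noise test signal are our choices, not the paper's), the scaling exponent is the slope of log F(s) versus log s:

```python
import numpy as np

def dfa(x, scales):
    """Conventional detrended fluctuation analysis (DFA-1).

    Returns the fluctuation F(s) for each window size s; the scaling
    exponent is the slope of log F(s) versus log s.
    """
    y = np.cumsum(x - np.mean(x))              # integrated profile
    F = []
    for s in scales:
        n = len(y) // s
        f2 = []
        for i in range(n):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)       # local linear trend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(f2)))
    return np.array(F)

rng = np.random.default_rng(0)
white = rng.standard_normal(4096)
scales = [16, 32, 64, 128, 256]
F = dfa(white, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
# For uncorrelated white noise the DFA exponent is close to 0.5
```

For white noise the exponent is near 0.5, while long-range-correlated (more "ordered", less chaotic) series give larger values, which is what makes such an exponent usable as a measure of chaoticity.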
Sud, Sachin; Cuthbertson, Brian H
2011-10-01
The article reviews the methods of health economic analysis (HEA) in clinical trials of critically ill patients. Emphasis is placed on the usefulness of HEA in the context of both positive and 'no effect' studies, with recent examples. The need to control costs and promote effective spending in caring for the critically ill has garnered considerable attention because of the high cost of critical illness. Many clinical trials focus on short-term mortality, ignore costs and quality of life, and fail to change clinical practice or promote efficient use of resources. Incorporating HEA into clinical trials is a possible solution. Such studies have shown that some interventions, although expensive, provide good value, whereas others should be withdrawn from clinical practice. Incorporating HEA into randomized controlled trials (RCTs) requires careful attention to collecting all relevant costs. Decision trees, modeling assumptions, and methods for collecting costs and measuring outcomes should be planned and published beforehand to minimize bias. Costs and cost-effectiveness are potentially useful outcomes in RCTs of critically ill patients. Future RCTs should incorporate parallel HEA to provide economic outcomes, which are important to the community, alongside patient-centered outcomes, which are important to individuals.
The Increase of Critical Thinking Skills through Mathematical Investigation Approach
NASA Astrophysics Data System (ADS)
Sumarna, N.; Wahyudin; Herman, T.
2017-02-01
Some research findings on the critical thinking skills of prospective elementary teachers show responses that are not optimal. On the other hand, critical thinking skills lead a student through the processes of analysis, evaluation, and synthesis in solving a mathematical problem. This study attempts an alternative solution, focusing on the mathematics learning conditions in the lecture room through a mathematical investigation approach. The research used a quasi-experimental pre-test/post-test design. Data analysis used a mixed method with an embedded design. Subjects were regular students enrolled in 2014 in the primary school teacher education study program: 111 students in total, 56 in the experimental group and 55 in the control group. The results showed that (1) there is a significant difference in the improvement of critical thinking ability between students who received learning through the mathematical investigation approach and students taught through an expository approach, and (2) there is no interaction effect between prior knowledge of mathematics and the learning factor (mathematical investigation vs. expository) on the increase in students' critical thinking skills.
Gaebelein, Claude J.; Grice, Gloria R.; Crannage, Andrew J.; Weck, Margaret A.; Hurd, Peter; Walter, Brenda; Duncan, Wendy
2013-01-01
Objective. To determine the feasibility of using a validated set of assessment rubrics to assess students’ critical-thinking and problem-solving abilities across a doctor of pharmacy (PharmD) curriculum. Methods. Trained faculty assessors used validated rubrics to assess student work samples for critical-thinking and problem-solving abilities. Assessment scores were collected and analyzed to determine student achievement of these 2 ability outcomes across the curriculum. Feasibility of the process was evaluated in terms of time and resources used. Results. One hundred sixty-one samples were assessed for critical thinking, and 159 samples were assessed for problem-solving. Rubric scoring allowed assessors to evaluate four 5- to 7-page work samples per hour. The analysis indicated that overall critical-thinking scores improved over the curriculum. Although low yield for problem-solving samples precluded meaningful data analysis, it was informative for identifying potentially needed curricular improvements. Conclusions. Use of assessment rubrics for program ability outcomes was deemed authentic and feasible. Problem-solving was identified as a curricular area that may need improving. This assessment method has great potential to inform continuous quality improvement of a PharmD program. PMID:24159207
Sampling methods for microbiological analysis of red meat and poultry carcasses.
Capita, Rosa; Prieto, Miguel; Alonso-Calleja, Carlos
2004-06-01
Microbiological analysis of carcasses at slaughterhouses is required in the European Union for evaluating the hygienic performance of carcass production processes as required for effective hazard analysis critical control point implementation. The European Union microbial performance standards refer exclusively to the excision method, even though swabbing using the wet/dry technique is also permitted when correlation between both destructive and nondestructive methods can be established. For practical and economic reasons, the swab technique is the most extensively used carcass surface-sampling method. The main characteristics, advantages, and limitations of the common excision and swabbing methods are described here.
NASA Astrophysics Data System (ADS)
Kolesik, Miroslav; Suzuki, Masuo
1995-02-01
The antiferromagnetic three-state Potts model on the simple-cubic lattice is studied using the coherent-anomaly method (CAM). The CAM analysis provides estimates for the critical exponents that indicate the XY universality class, namely α = -0.011, β = 0.351, γ = 1.309, and δ = 4.73. This observation corroborates the results of recent Monte Carlo simulations and disagrees with the proposal of a new universality class.
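As a quick consistency check (our addition, not part of the abstract), the reported exponents can be tested against the standard scaling relations. With these values the Rushbrooke relation α + 2β + γ = 2 holds essentially exactly, and the Widom relation γ = β(δ − 1) holds to within the quoted precision:

```python
# Exponents reported in the abstract
alpha, beta, gamma, delta = -0.011, 0.351, 1.309, 4.73

rushbrooke = alpha + 2 * beta + gamma   # Rushbrooke: should equal 2
widom = beta * (delta - 1)              # Widom: should equal gamma

print(rushbrooke)   # ≈ 2.000
print(widom)        # ≈ 1.309
```

Such checks are a routine sanity test on any set of estimated critical exponents, since the scaling relations must hold within any single universality class.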
Kisely, Stephen; Kendall, Elizabeth
2011-08-01
Papers using qualitative methods are increasingly common in psychiatric journals. This overview is an introduction to critically appraising a qualitative paper for clinicians who are more familiar with quantitative methods. Qualitative research uses data from interviews (semi-structured or unstructured), focus groups, observations, or written materials. Data analysis is inductive, allowing meaning to emerge from the data, rather than following the more deductive, hypothesis-centred approach of quantitative research. This overview compares and contrasts quantitative and qualitative research methods. Quantitative concepts such as reliability, validity, statistical power, bias and generalisability have qualitative equivalents: triangulation, trustworthiness, saturation, reflexivity and applicability. Reflexivity also shares features of transference. Qualitative approaches include ethnography, action-assessment, grounded theory, case studies and mixed methods. Qualitative research can complement quantitative approaches, and an understanding of both is useful in critically appraising the psychiatric literature.
ERIC Educational Resources Information Center
Kim, Kyung Hi
2014-01-01
This research, based on a case study of vulnerable children in Korea, used a mixed methods transformative approach to explore strategies to support and help disadvantaged children. The methodological approach includes three phases: a mixed methods contextual analysis, a qualitative dominant analysis based on Sen's capability approach and critical…
ERIC Educational Resources Information Center
Welton, Anjale
2011-01-01
This response to ""Buscando la Libertad": Latino Youths in Search of Freedom in School" by Jason G. Irizarry demonstrates how youth participatory action research (YPAR) as an instrument of subverting oppressive school policies and structures is a form of critical policy analysis (CPA). As an evolving method, CPA acknowledges the absent voices in…
NASA Astrophysics Data System (ADS)
Prasetyo, Yudo; Ardi Gunawan, Setyo; Maksum, Zia Ul
2016-11-01
Semarang is the biggest city in Central Java, Indonesia, and is currently undergoing rapid and massive infrastructure development. In order to control water resources and flooding, the local government has built east and west flood canals on the Kaligarang and West Semarang Rivers. One of the main problems in Semarang is the lack of fresh water in the dry season, because the groundwater does not recharge well. Groundwater recharge ability depends on the underground recharge rate and the condition of the catchment area. The objective of this study is to determine the condition and classification of the water catchment area in Semarang. The catchment area condition is determined by five parameters: soil type, land use, slope, groundwater potential, and rainfall intensity. We use three methodological approaches: segmentation classification to derive land use from high-resolution imagery using a nearest-neighborhood algorithm, Interferometric Synthetic Aperture Radar (InSAR) to derive a DTM from SAR imagery, and multi-criteria weighting with spatial analysis using GIS. Three image types (ALOS PRISM, SPOT-6, and ALOS PALSAR) are used to calculate the water catchment condition. The water catchment is divided into six condition classes: good, naturally normal, early critical, a little bit critical, critical, and very critical. The results show that the water catchment area is in early critical condition over about 2607.523 ha (33.17%), naturally normal condition over 1507.674 ha (19.18%), a little bit critical condition over 1452.931 ha (18.48%), good condition over 1157.04 ha (14.72%), critical condition over 1058.639 ha (13.47%), and very critical condition over 75.0387 ha (0.95%). The distribution of catchment conditions in the West and East Flood Canals has an irregular pattern. The northern area of the watershed consists of early critical, naturally normal, and good conditions, while the southern area consists of a little bit critical, critical, and very critical conditions.
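The multi-criteria weighting step described above can be sketched as a weighted overlay: each of the five parameters is rated, the ratings are combined with weights, and the weighted score is mapped to one of the six condition classes. In this minimal Python sketch the weights, the 1-6 rating scale, and the example ratings are all hypothetical illustrations; the paper does not state its actual values:

```python
# Hypothetical weights for the five parameters (not given in the abstract)
weights = {"soil": 0.20, "landuse": 0.25, "slope": 0.20,
           "groundwater": 0.20, "rain": 0.15}

# The six condition classes used in the study, from best to worst
classes = ["good", "naturally normal", "early critical",
           "a little bit critical", "critical", "very critical"]

def classify(ratings):
    """ratings: dict parameter -> rating on an assumed 1..6 scale (1 = best).
    Returns the catchment-condition class from the weighted sum."""
    total = sum(weights[k] * ratings[k] for k in weights)
    idx = min(max(int(round(total)) - 1, 0), len(classes) - 1)
    return classes[idx]

good_cell = classify({"soil": 1, "landuse": 2, "slope": 1,
                      "groundwater": 1, "rain": 2})
worst_cell = classify({"soil": 6, "landuse": 6, "slope": 6,
                       "groundwater": 6, "rain": 6})
```

In a GIS workflow the same weighted sum would be evaluated per raster cell, and the class areas (in ha) reported above would come from counting cells per class.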
Analysis of stray radiation for infrared optical system
NASA Astrophysics Data System (ADS)
Li, Yang; Zhang, Tingcheng; Liao, Zhibo; Mu, Shengbo; Du, Jianxiang; Wang, Xiangdong
2016-10-01
Based on the theory of radiative energy transfer in infrared optical systems, two methods are proposed for analyzing the stray radiation caused by interior thermal radiation: an importance-sampling technique using forward ray tracing, and an integral computation method using reverse ray tracing. The two methods are discussed in detail for a concrete infrared optical system. LightTools is used to simulate the radiation from the mirrors and mounts, yielding absolute values of internal irradiance on the detector. The results show that the main part of the energy on the detector is due to the critical objects, which are consistent with the critical objects obtained by reverse ray tracing; mirror self-emission contributes about 87.5% of the total energy. Correspondingly, the irradiances on the detector calculated by the two methods are in good agreement, confirming the validity of both methods.
NASA Astrophysics Data System (ADS)
Kang, Fei; Li, Junjie; Ma, Zhenyue
2013-02-01
Determination of the critical slip surface with the minimum factor of safety of a slope is a difficult constrained global optimization problem. In this article, an artificial bee colony algorithm with a multi-slice adjustment method is proposed for locating the critical slip surfaces of soil slopes, and the Spencer method is employed to calculate the factor of safety. Six benchmark examples are presented to illustrate the reliability and efficiency of the proposed technique, and it is also compared with some well-known or recent algorithms for the problem. The results show that the new algorithm is promising in terms of accuracy and efficiency.
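The abstract does not detail the search, but the generic artificial bee colony loop it builds on (employed, onlooker, and scout phases over candidate food sources) is well known and can be sketched. In this simplified Python sketch, a smooth two-variable bowl stands in for the Spencer factor-of-safety evaluation over slip-surface parameters, and all parameter values (colony size, trial limit, iteration count) are illustrative assumptions:

```python
import random

def abc_minimize(f, bounds, n_bees=20, limit=10, iters=200, seed=1):
    """Simplified artificial bee colony search for the minimum of f."""
    rng = random.Random(seed)
    dim = len(bounds)
    rand_pos = lambda: [rng.uniform(lo, hi) for lo, hi in bounds]
    pos = [rand_pos() for _ in range(n_bees)]
    fit = [f(p) for p in pos]
    trials = [0] * n_bees
    best = min(zip(fit, pos))
    for _ in range(iters):
        for i in range(n_bees):
            # Employed/onlooker move: perturb one coordinate toward a neighbour
            k, j = rng.randrange(n_bees), rng.randrange(dim)
            cand = pos[i][:]
            cand[j] += rng.uniform(-1, 1) * (pos[i][j] - pos[k][j])
            lo, hi = bounds[j]
            cand[j] = min(max(cand[j], lo), hi)
            fc = f(cand)
            if fc < fit[i]:
                pos[i], fit[i], trials[i] = cand, fc, 0
            else:
                trials[i] += 1
            if trials[i] > limit:          # scout: abandon a stalled source
                pos[i] = rand_pos()
                fit[i] = f(pos[i])
                trials[i] = 0
        best = min(best, min(zip(fit, pos)))
    return best

# Stand-in objective with known minimum at (1, -2); in the paper this role
# is played by the Spencer-method factor of safety of a candidate slip surface.
fs, x = abc_minimize(lambda p: (p[0] - 1) ** 2 + (p[1] + 2) ** 2,
                     bounds=[(-5, 5), (-5, 5)])
```

In the slope-stability setting each candidate position encodes a slip surface (the multi-slice adjustment), and the objective returned by `f` is the factor of safety, so the global minimum located by the colony is the critical slip surface.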
Ideology Awareness Project: An Exercise in Item Unit Content Analysis.
ERIC Educational Resources Information Center
Simon, David R.
1981-01-01
Describes an exercise in the content analysis of political ideologies. Advantages of the exercise include that it teaches students to employ content analysis as a method of research and that it introduces them to the ideological statements of America's leading social critics. (DB)
Methods for consistent forewarning of critical events across multiple data channels
Hively, Lee M.
2006-11-21
This invention teaches further method improvements to forewarn of critical events via phase-space dissimilarity analysis of data from biomedical equipment, mechanical devices, and other physical processes. One improvement involves conversion of time-serial data into equiprobable symbols. A second improvement is a method to maximize the channel-consistent total-true rate of forewarning from a plurality of data channels over multiple data sets from the same patient or process. This total-true rate requires resolution of the forewarning indications into true positives, true negatives, false positives and false negatives. A third improvement is the use of various objective functions, as derived from the phase-space dissimilarity measures, to give the best forewarning indication. A fourth improvement uses various search strategies over the phase-space analysis parameters to maximize said objective functions. A fifth improvement shows the usefulness of the method for various biomedical and machine applications.
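The first improvement above, converting time-serial data into equiprobable symbols, amounts to partitioning the amplitude range at empirical quantiles so that each symbol occurs (nearly) equally often. A minimal Python sketch of that idea (the patent's exact procedure may differ; the symbol count and test signal are illustrative):

```python
import numpy as np

def equiprobable_symbols(x, n_symbols=4):
    """Map a time series to integer symbols with (near-)equal occupancy
    by partitioning the amplitude range at empirical quantiles."""
    edges = np.quantile(x, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.searchsorted(edges, x, side="right")

rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)
s = equiprobable_symbols(x, 4)
counts = np.bincount(s, minlength=4)
# Each of the 4 symbols occupies roughly 25% of the samples
```

Equiprobable symbolization makes the downstream phase-space dissimilarity measures insensitive to the amplitude distribution of the raw channel, which is one reason such schemes are preferred over fixed-width binning.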
Nonlinear dynamics behavior analysis of the spatial configuration of a tendril-bearing plant
NASA Astrophysics Data System (ADS)
Feng, Jingjing; Zhang, Qichang; Wang, Wei; Hao, Shuying
2017-03-01
Tendril-bearing plants take on a spiraling shape when their tendrils climb along a support during growth. The growth characteristics of a tendril-bearer can be simplified to a model of a thin elastic rod with a cylindrical constraint. In this paper, the connection between some typical configuration characteristics of tendrils and complex nonlinear dynamic behavior is qualitatively analyzed. The spatial configuration problem of tendrils can be explained through the study of the nonlinear dynamic behavior of the thin-elastic-rod system equation. The complex non-Z2-symmetric critical orbits of the system equation under critical parameters are presented. A new function transformation method that effectively preserves the critical orbit properties is proposed, and a new system of nonlinear differential equations containing complex nonlinear terms is obtained to describe the cross-section position and direction of a rod during climbing. Numerical simulation reveals that the new system describes the configuration of a rod with reasonable accuracy. To explain the growth regulation of the rod shape, the critical orbit and the rod configuration are connected directly. High-precision analytical expressions for these complex non-Z2-symmetric critical orbits are obtained by introducing a suitable analytical method, and these expressions are used to draw the corresponding three-dimensional configuration figures of an elastic thin rod. Combined with actual tendrils on a live plant, the spatial configuration of the winding knots of a tendril is explained via the concept of a heteroclinic orbit from the perspective of nonlinear dynamics, and the correctness of the theoretical analysis is verified. This analysis method could also be effectively applied to other similar slender structures.
Review of teaching methods and critical thinking skills.
Kowalczyk, Nina
2011-01-01
Information is needed to guide radiation science educators toward successful critical thinking educational strategies. From an evidence-based research perspective, systematic reviews are the most current and highest level of evidence, and analysis at this level is crucial for identifying the teaching methods most appropriate to developing critical thinking skills. The objective was to conduct a systematic literature review to identify teaching methods that demonstrate a positive effect on the development of students' critical thinking skills, and to identify how these strategies can best translate to radiologic science educational programs. A comprehensive literature search yielded 59 full reports for assessment. Nineteen of the 59 reports met the inclusion criteria and were reviewed based on the level of evidence presented. Inclusion criteria were studies conducted in the past 10 years, with sample sizes of 20 or more individuals, demonstrating use of specific teaching interventions for 5 to 36 months in postsecondary health-related educational programs. The majority of the research focused on problem-based learning (PBL) requiring standardized small-group activities. Six of the 19 studies focused on PBL and demonstrated significant differences in student critical thinking scores. PBL, as described in the nursing literature, is an effective teaching method that should be used in radiation science education. ©2011 by the American Society of Radiologic Technologists.
ERIC Educational Resources Information Center
Salinas, Cinthia S.; Fránquiz, María E.; Rodríguez, Noreen Naseem
2016-01-01
This qualitative case study examines the experiences of Latina prospective teachers enrolled in a bilingual social studies methods course that focused attention upon critical historical inquiry. The students built historical narratives that deliberately addressed oft-ignored histories of Communities of Color. The analysis argues however that…
ERIC Educational Resources Information Center
Larkin, Douglas B.; Maloney, Tanya; Perry-Ryder, Gail M.
2016-01-01
This study describes the experiences of two preservice science teachers as they progress through their respective teacher education programs and uses critical race theory to examine the manner in which conceptions about race and its pedagogical implications change over time. Using a longitudinal case study method, participants' conceptual…
ERIC Educational Resources Information Center
Braguglia, Kay H.; Jackson, Kanata A.
2012-01-01
This article presents a reflective analysis of teaching research methodology through a three course sequence using a project-based approach. The authors reflect critically on their experiences in teaching research methods courses in an undergraduate business management program. The introduction of a range of specific techniques including student…
New Design for an Adjustable Cise Space Maintainer
2018-01-01
Objective The aim of this study is to present a new adjustable Cise space maintainer for preventive orthodontic applications. Methods The stainless steel-based new design consists of six main components. To understand the major displacements and stress fields, a structural analysis of the design was carried out using the finite element method. Results Like the major displacement along the y-axis, the critical stresses σx and τxy show a linear distribution with a constant increase. Additionally, strain energy density (SED) plays an important role in determining the critical biting load capacity. Conclusion The structural analysis shows that the space maintainer is stable and can be used for maintaining and/or regaining the space lost through early loss of a molar tooth. PMID:29854764
Approximate N-Player Nonzero-Sum Game Solution for an Uncertain Continuous Nonlinear System.
Johnson, Marcus; Kamalapurkar, Rushikesh; Bhasin, Shubhendu; Dixon, Warren E
2015-08-01
An approximate online equilibrium solution is developed for an N-player nonzero-sum game subject to continuous-time nonlinear unknown dynamics and an infinite-horizon quadratic cost. A novel actor-critic-identifier structure is used, wherein a robust dynamic neural network asymptotically identifies the uncertain system with additive disturbances, and a set of critic and actor neural networks (NNs) approximate the value functions and equilibrium policies, respectively. The weight update laws for the actor NNs are generated using a gradient-descent method, and those for the critic NNs by least-squares regression; both are based on the modified Bellman error, which is independent of the system dynamics. A Lyapunov-based stability analysis shows that uniformly ultimately bounded tracking is achieved, and a convergence analysis demonstrates that the approximate control policies converge to a neighborhood of the optimal solutions. The actor, critic, and identifier structures are implemented in real time, continuously and simultaneously. Simulations of two- and three-player games illustrate the performance of the developed method.
Empirical Investigation of Critical Transitions in Paleoclimate
NASA Astrophysics Data System (ADS)
Loskutov, E. M.; Mukhin, D.; Gavrilov, A.; Feigin, A.
2016-12-01
In this work we apply a new empirical method for the analysis of complex spatially distributed systems to paleoclimate data. The method consists of two general parts: (i) revealing the optimal phase-space variables and (ii) constructing an empirical prognostic model from observed time series. The construction of phase-space variables is based on decomposing the data into nonlinear dynamical modes; this was successfully applied to the global SST field, allowing a clear separation of time scales and revealing a climate shift in the observed data interval [1]. The second part, a Bayesian approach to the optimal reconstruction of the evolution operator from time series, is based on representing the evolution operator as a nonlinear stochastic function modeled by artificial neural networks [2,3]. Here we focus on the investigation of critical transitions - abrupt changes in climate dynamics - on much longer time scales. It is well known that there were a number of critical transitions on different time scales in the past. We demonstrate the first results of applying our empirical methods to the analysis of paleoclimate variability. In particular, we discuss the possibility of detecting, identifying, and predicting such critical transitions by means of nonlinear empirical modeling using paleoclimate record time series. The study is supported by the Government of the Russian Federation (agreement #14.Z50.31.0033 with the Institute of Applied Physics of RAS). 1. Mukhin, D., Gavrilov, A., Feigin, A., Loskutov, E., & Kurths, J. (2015). Principal nonlinear dynamical modes of climate variability. Scientific Reports, 5, 15510. http://doi.org/10.1038/srep15510 2. Molkov, Ya. I., Mukhin, D. N., Loskutov, E. M., & Feigin, A. M. (2012). Random dynamical models from time series. Physical Review E, 85(3). 3. Mukhin, D., Kondrashov, D., Loskutov, E., Gavrilov, A., Feigin, A., & Ghil, M. (2015). Predicting critical transitions in ENSO models. Part II: Spatially dependent models. Journal of Climate, 28(5), 1962-1976. http://doi.org/10.1175/JCLI-D-14-00240.1
Overlapping Modularity at the Critical Point of k-Clique Percolation
NASA Astrophysics Data System (ADS)
Tóth, Bálint; Vicsek, Tamás; Palla, Gergely
2013-05-01
One of the most remarkable social phenomena is the formation of communities in social networks corresponding to families, friendship circles, work teams, etc. Since people usually belong to several different communities at the same time, the induced overlaps result in an extremely complicated web of the communities themselves. Thus, uncovering the intricate community structure of social networks is a non-trivial task with great potential for practical applications, and it has gained notable interest in recent years. The Clique Percolation Method (CPM) is one of the earliest overlapping community finding methods, already used in the analysis of several different social networks. In this approach the communities correspond to k-clique percolation clusters, and the general heuristic for setting the parameters of the method is to tune the system just below the critical point of k-clique percolation. However, this rule is based on simple physical principles, and its validity has never been subject to quantitative analysis. Here we examine the quality of the partitioning in the vicinity of the critical point using recently introduced overlapping modularity measures. According to our results on real social and other networks, the overlapping modularities show a maximum close to the critical point, justifying the original criterion for the optimal parameter settings.
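The CPM construction itself is compact: k-cliques are the community building blocks, two k-cliques are adjacent when they share k − 1 nodes, and a community is the node set of a connected component of this clique adjacency. A self-contained Python sketch on a toy graph (brute-force clique enumeration, so suitable only for very small graphs; real analyses use optimized implementations such as CFinder or NetworkX's `k_clique_communities`):

```python
from itertools import combinations

def k_clique_communities(edges, k):
    """Overlapping communities by k-clique percolation (CPM)."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    nodes = sorted(adj)
    # Brute-force enumeration of all k-cliques
    cliques = [c for c in combinations(nodes, k)
               if all(b in adj[a] for a, b in combinations(c, 2))]
    # Union-find over cliques that overlap in k-1 nodes
    parent = list(range(len(cliques)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i, j in combinations(range(len(cliques)), 2):
        if len(set(cliques[i]) & set(cliques[j])) == k - 1:
            parent[find(i)] = find(j)
    groups = {}
    for i, c in enumerate(cliques):
        groups.setdefault(find(i), set()).update(c)
    return list(groups.values())

# Two triangles sharing a single node: with k = 3 they form two
# communities that overlap in node 2.
edges = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (2, 4)]
comms = k_clique_communities(edges, 3)
# → two communities, {0, 1, 2} and {2, 3, 4}
```

The overlap in the toy example (node 2 belongs to both communities) is exactly the feature that overlapping modularity measures are designed to score.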
NASA Astrophysics Data System (ADS)
Maharani, S.; Suprapto, E.
2018-03-01
Critical thinking is very important in mathematics; it helps students better understand mathematical concepts. Critical thinking is also needed in numerical analysis, yet existing numerical analysis textbooks do not incorporate it. This research aims to develop a group investigation-based book on numerical analysis to increase students' critical thinking ability, and to determine whether the book is valid, practical, and effective. The research method is Research and Development (R&D); the subjects were 30 students of the Department of Mathematics Education at Universitas PGRI Madiun. The development model used is the 4-D model, modified to 3-D up to the development stage. The data are descriptive and qualitative. The instruments used are validation sheets, tests, and questionnaires. The development results indicate that the group investigation-based book on numerical analysis falls in the valid category, with a score of 84.25%. Students' responses to the book were very positive, so the book falls in the practical category, at 86.00%. Use of the book met the classical learning completeness criterion, at 84.32%. Based on these results, the study concluded that the group investigation-based book on numerical analysis is feasible because it meets the criteria of validity, practicality, and effectiveness, so the book can be used by mathematics academics. Future research could develop group investigation-based books for other subjects.
SCALE Continuous-Energy Eigenvalue Sensitivity Coefficient Calculations
Perfetti, Christopher M.; Rearden, Bradley T.; Martin, William R.
2016-02-25
Sensitivity coefficients describe the fractional change in a system response that is induced by changes to system parameters and nuclear data. The Tools for Sensitivity and UNcertainty Analysis Methodology Implementation (TSUNAMI) code within the SCALE code system makes use of eigenvalue sensitivity coefficients for an extensive number of criticality safety applications, including quantifying the data-induced uncertainty in the eigenvalue of critical systems, assessing the neutronic similarity between different critical systems, and guiding nuclear data adjustment studies. The need to model geometrically complex systems with improved fidelity and the desire to extend TSUNAMI analysis to advanced applications have motivated the development of a methodology for calculating sensitivity coefficients in continuous-energy (CE) Monte Carlo applications. The Contributon-Linked eigenvalue sensitivity/Uncertainty estimation via Tracklength importance CHaracterization (CLUTCH) and Iterated Fission Probability (IFP) eigenvalue sensitivity methods were recently implemented in the CE-KENO framework of the SCALE code system to enable TSUNAMI-3D to perform eigenvalue sensitivity calculations using continuous-energy Monte Carlo methods. This work provides a detailed description of the theory behind the CLUTCH method and describes its implementation in detail. It explores the improvements in eigenvalue sensitivity coefficient accuracy that can be gained through the use of continuous-energy sensitivity methods, and compares several sensitivity methods in terms of computational efficiency and memory requirements.
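A sensitivity coefficient in the sense of the opening sentence — the fractional change in a response per fractional change in a parameter, S = (p/R)(dR/dp) — can be illustrated with a simple central-difference estimate. This is only a conceptual sketch of the quantity being defined; TSUNAMI computes these coefficients with adjoint-weighted Monte Carlo methods (CLUTCH, IFP), not finite differences, and the toy response below is an assumption for illustration:

```python
def sensitivity(response, p, rel_step=1e-6):
    """Fractional sensitivity S = (dR/R) / (dp/p), estimated with a
    central difference around parameter value p."""
    h = p * rel_step
    dR = (response(p + h) - response(p - h)) / (2 * h)
    return p * dR / response(p)

# Toy response R(p) = p**2: a power law p**n has constant sensitivity n,
# so here S = 2 for any p.
S = sensitivity(lambda p: p * p, 3.0)
# → 2.0 (to numerical precision)
```

Reading S = 2 as "a 1% change in the parameter induces a 2% change in the response" is exactly how eigenvalue sensitivity coefficients are interpreted in uncertainty quantification.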
Young, Allan; Yatham, Lakshmi; Grunze, Heinz; Vieta, Eduard; Blier, Pierre; Moeller, Hans Jurgen; Kasper, Siegfried
2017-01-01
Abstract Background: This paper includes a short description of the important clinical aspects of Bipolar Disorder with emphasis on issues that are important for the therapeutic considerations, including mixed and psychotic features, predominant polarity, and rapid cycling as well as comorbidity. Methods: The workgroup performed a review and critical analysis of the literature concerning grading methods and methods for the development of guidelines. Results: The workgroup arrived at a consensus to base the development of the guideline on randomized controlled trials and related meta-analyses alone in order to follow a strict evidence-based approach. A critical analysis of the existing methods for the grading of treatment options was followed by the development of a new grading method to arrive at efficacy and recommendation levels after the analysis of 32 distinct scenarios of available data for a given treatment option. Conclusion: The current paper reports details on the design, method, and process for the development of CINP guidelines for the treatment of Bipolar Disorder. The rationale and the method with which all data and opinions are combined in order to produce an evidence-based operationalized but also user-friendly guideline and a specific algorithm are described in detail in this paper. PMID:27815414
Argumentation: A Methodology to Facilitate Critical Thinking.
Makhene, Agnes
2017-06-20
Caring is a demanding nursing activity that involves the complex nature of a human being in need of complex decision-making and problem-solving through the critical thinking process. It is mandatory that critical thinking be facilitated in education generally, and in nursing education particularly, in order to render care in diverse multicultural patient care settings. This paper aims to describe how argumentation can be used to facilitate critical thinking in learners. A qualitative, exploratory, descriptive, and contextual design was used. A purposive sampling method was used to draw the sample, and the Miles and Huberman methodology of qualitative analysis was used to analyse the data. Lincoln and Guba's strategies were employed to ensure trustworthiness, while Dhai and McQuoid-Mason's principles of ethical consideration were applied. Following data analysis, the findings were integrated with the literature, culminating in the formulation of guidelines for using argumentation as a methodology to facilitate critical thinking.
Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F
2010-01-01
The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. An integrated Quality Risk Management approach can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, frequency, and detectability, seeking to prevent the most harmful hazards. The Hazard Analysis Critical Control Point method, recognized as the most appropriate technique to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished-product specifications and limits of acceptability, identifying all off-specification results, defining the activities that can cause them, and finally establishing both a monitoring system for each Critical Control Point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis was performed with this technique on the HSC manipulation process performed at our blood center. The analysis showed that the hazards with the highest RPN values, and thus the greatest impact on the process, are loss of dose and loss of tracking; the technical skills of operators and manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while other critical parameters can be mitigated with the use of continuous control systems. The blood center management software was completed by a labeling system with forms designed to comply with the standards in force, and by starting implementation of a cryopreservation management module. Copyright 2010 Elsevier Inc. All rights reserved.
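The RPN scoring described above multiplies severity, occurrence, and detectability scores and ranks hazards by the product. A minimal sketch follows; the hazard names echo the abstract, but the scale values are hypothetical illustrations, not data from the study.

```python
# Risk Priority Number (RPN) sketch for HACCP-style risk analysis.
# RPN = severity x occurrence x detectability, each scored here on a 1-10
# scale (a higher detectability score means harder to detect). The scores
# below are hypothetical.

def rpn(severity, occurrence, detectability):
    """Return the Risk Priority Number for one hazard."""
    for score in (severity, occurrence, detectability):
        if not 1 <= score <= 10:
            raise ValueError("scores must lie on the 1-10 scale")
    return severity * occurrence * detectability

hazards = {
    "loss of dose":              rpn(9, 4, 7),
    "loss of tracking":          rpn(8, 4, 8),
    "manual data transcription": rpn(6, 6, 6),
}

# Rank hazards so that mitigation targets the highest RPN first.
ranked = sorted(hazards.items(), key=lambda kv: kv[1], reverse=True)
```

Ranking by RPN is what lets a team separate the few "higher importance" critical points from the long tail of minor ones.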
Fission matrix-based Monte Carlo criticality analysis of fuel storage pools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farlotti, M.; Ecole Polytechnique, Palaiseau, F 91128; Larsen, E. W.
2013-07-01
Standard Monte Carlo transport procedures experience difficulties in solving criticality problems in fuel storage pools. Because of the strong neutron absorption between fuel assemblies, source convergence can be very slow, leading to incorrect estimates of the eigenvalue and the eigenfunction. This study examines an alternative fission matrix-based Monte Carlo transport method that takes advantage of the geometry of a storage pool to overcome this difficulty. The method uses Monte Carlo transport to build (essentially) a fission matrix, which is then used to calculate the criticality and the critical flux. This method was tested using a test code on a simple problem containing 8 assemblies in a square pool. The standard Monte Carlo method gave the expected eigenfunction in 5 cases out of 10, while the fission matrix method gave the expected eigenfunction in all 10 cases. In addition, the fission matrix method provides an estimate of the error in the eigenvalue and the eigenfunction, and it allows the user to control this error by running an adequate number of cycles. Because of these advantages, the fission matrix method yields a higher confidence in the results than standard Monte Carlo. We also discuss potential improvements of the method, including the potential for variance reduction techniques. (authors)
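The final step of the fission-matrix approach, extracting the eigenvalue and the fission source from the matrix once Monte Carlo transport has estimated its entries, can be sketched with a simple power iteration. The 4-region coupling matrix below is an illustrative placeholder (weakly coupled, strongly absorbing regions), not data from the study.

```python
# Sketch of the fission-matrix eigenvalue solve: F[i][j] estimates the
# fission neutrons produced in region i per fission neutron born in
# region j. The dominant eigenpair of F gives k_eff and the converged
# fission source. The matrix values below are hypothetical.

def dominant_eigenpair(F, iterations=200):
    n = len(F)
    s = [1.0 / n] * n           # flat initial fission source
    k = 1.0
    for _ in range(iterations):
        s_new = [sum(F[i][j] * s[j] for j in range(n)) for i in range(n)]
        k = sum(s_new)          # eigenvalue estimate (source normalised to 1)
        s = [x / k for x in s_new]
    return k, s

# Weakly coupled "storage pool" regions: strong self-multiplication,
# weak neighbour-to-neighbour coupling, as between pool assemblies.
F = [
    [0.90, 0.02, 0.00, 0.00],
    [0.02, 0.90, 0.02, 0.00],
    [0.00, 0.02, 0.90, 0.02],
    [0.00, 0.00, 0.02, 0.90],
]
k_eff, source = dominant_eigenpair(F)
```

Because the iteration acts on a small matrix rather than on particle histories, it converges even when inter-assembly coupling is weak, which is exactly the regime where standard source iteration stalls.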
Articles Pertinent to the Campus Press: A Selected Annotated Bibliography
ERIC Educational Resources Information Center
Ardoin, Birthney; And Others
1977-01-01
Lists and annotates articles dealing with such topics as advertising, audience analysis, broadcasting, broadcast law, criticism and defense of media, editorial policy and methods, media policy and methods, journalism education, press law, and school publications. (GW)
Reliability analysis and initial requirements for FC systems and stacks
NASA Astrophysics Data System (ADS)
Åström, K.; Fontell, E.; Virtanen, S.
In the year 2000 Wärtsilä Corporation started an R&D program to develop SOFC systems for CHP applications. The program aims to bring to the market highly efficient, clean and cost competitive fuel cell systems with rated power output in the range of 50-250 kW for distributed generation and marine applications. In the program Wärtsilä focuses on system integration and development. System reliability and availability are key issues determining the competitiveness of the SOFC technology. In Wärtsilä, methods have been implemented for analysing the system with respect to reliability and safety as well as for defining reliability requirements for system components. A fault tree representation is used as the basis for reliability prediction analysis. A dynamic simulation technique has been developed to allow for non-static properties in the fault tree logic modelling. Special emphasis has been placed on reliability analysis of the fuel cell stacks in the system. A method for assessing reliability and critical failure predictability requirements for fuel cell stacks in a system consisting of several stacks has been developed. The method is based on a qualitative model of the stack configuration where each stack can be in a functional, partially failed or critically failed state, each of the states having different failure rates and effects on the system behaviour. The main purpose of the method is to understand the effect of stack reliability, critical failure predictability and operating strategy on the system reliability and availability. An example configuration, consisting of 5 × 5 stacks (series of 5 sets of 5 parallel stacks), is analysed with respect to stack reliability requirements as a function of predictability of critical failures and Weibull shape factor of failure rate distributions.
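A baseline for the 5 × 5 example configuration can be sketched by collapsing each stack to a two-state (working/failed) model with independent failures, treating a parallel set as available if at least one of its stacks works. This is a deliberate simplification of the paper's three-state model, and the single-stack reliability figure is hypothetical.

```python
# Reliability sketch for a series arrangement of parallel stack sets.
# Assumes independent, two-state (working/failed) stacks and counts a
# parallel set as available if any member works -- a simplification of
# the paper's three-state stack model. Numbers are hypothetical.

def parallel(reliabilities):
    """A parallel set works if at least one member works."""
    p_all_fail = 1.0
    for r in reliabilities:
        p_all_fail *= (1.0 - r)
    return 1.0 - p_all_fail

def series(reliabilities):
    """A series chain works only if every member works."""
    p = 1.0
    for r in reliabilities:
        p *= r
    return p

r_stack = 0.95                      # hypothetical single-stack reliability
r_set = parallel([r_stack] * 5)     # one set of 5 parallel stacks
r_system = series([r_set] * 5)      # 5 such sets in series
```

Even this crude model shows why the series dimension dominates: the system is only as strong as its weakest set, so redundancy inside each set pays off disproportionately.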
Development of Standard Methods of Testing and Analyzing Fatigue Crack Growth Rate Data
1978-05-01
Low-temperature tests used a liquid-nitrogen-cooled cryostat; high-temperature tests were conducted using resistance heating tapes, with an automatic controller maintaining test temperatures. Surviving reference fragments include: "...Cracking," Int. J. Fracture, Vol. 9, 1973, pp. 63-74; and P. C. Paris and F. Erdogan, "A Critical Analysis of Crack Propagation Laws," Trans. ASME, Ser. D: J. Basic Eng.
2013-06-24
Barrier methods for critical exponent problems in geometric analysis and mathematical physics, J. Erway and M. Holst, submitted for publication. Surviving reference fragments include: C. Lanczos, Linear Differential Operators, Dover Publications, Mineola, NY, 1997; and G. I. Marchuk, Adjoint Equations and Analysis of Complex Systems.
Initial postbuckling analysis of elastoplastic thin-shear structures
NASA Technical Reports Server (NTRS)
Carnoy, E. G.; Panosyan, G.
1984-01-01
The design of thin shell structures with respect to elastoplastic buckling requires an extended analysis of the influence of initial imperfections. For conservative design, the most critical defect should be assumed with the maximum allowable magnitude. This defect is closely related to the initial postbuckling behavior. An algorithm is given for the quasi-static analysis of the postbuckling behavior of structures that exhibit multiple buckling points. The algorithm, based upon an energy criterion, allows the computation of the critical perturbation which will be employed for the definition of the critical defect. For computational efficiency, the algorithm uses the reduced basis technique with automatic update of the modal basis. The method is applied to the axisymmetric buckling of cylindrical shells under axial compression, and conclusions are given for future research.
Analysis of students critical thinking skills in socio-scientific issues of biodiversity subject
NASA Astrophysics Data System (ADS)
Santika, A. R.; Purwianingsih, W.; Nuraeni, E.
2018-05-01
Critical thinking is a skill which students should have in order to face 21st-century demands. Critical thinking skills can help people face their daily problems, especially problems which relate to science. This research aimed to analyze students' critical thinking skills in socio-scientific issues within the biodiversity subject. The method used in this research was the descriptive method. The research subjects were first-grade students in a senior high school. The data were collected by interview and open-ended questions, which were classified based on the framework: (1) question at issue, (2) information, (3) purpose, (4) concepts, (5) assumptions, (6) point of view, (7) interpretation and inference, and (8) implication and consequences, and then assessed using rubrics. The results showed that students' critical thinking skills in socio-scientific issues of the biodiversity subject are in the low and medium categories. Therefore, a learning activity is needed that is able to develop students' critical thinking skills, especially regarding socio-scientific issues.
NASA Technical Reports Server (NTRS)
Aiken, Alexander
2001-01-01
The Scalable Analysis Toolkit (SAT) project aimed to demonstrate that it is feasible and useful to statically detect software bugs in very large systems. The technical focus of the project was on a relatively new class of constraint-based techniques for software analysis, where the desired facts about programs (e.g., the presence of a particular bug) are phrased as constraint problems to be solved. At the beginning of this project, the most successful forms of formal software analysis were limited forms of automatic theorem proving (as exemplified by the analyses used in language type systems and optimizing compilers), semi-automatic theorem proving for full verification, and model checking. With a few notable exceptions these approaches had not been demonstrated to scale to software systems of even 50,000 lines of code. Realistic approaches to large-scale software analysis cannot hope to make every conceivable formal method scale. Thus, the SAT approach is to mix different methods in one application by using coarse and fast but still adequate methods at the largest scales, and reserving the use of more precise but also more expensive methods at smaller scales for critical aspects (that is, aspects critical to the analysis problem under consideration) of a software system. The principled method proposed for combining a heterogeneous collection of formal systems with different scalability characteristics is mixed constraints. This idea had been used previously in small-scale applications with encouraging results: using mostly coarse methods and narrowly targeted precise methods, useful information (meaning the discovery of bugs in real programs) was obtained with excellent scalability.
Critical scaling analysis for displacive-type organic ferroelectrics around ferroelectric transition
NASA Astrophysics Data System (ADS)
Ding, L. J.
2017-04-01
The critical scaling properties of displacive-type organic ferroelectrics, in which the ferroelectric-paraelectric transition is induced by spin-Peierls instability, are investigated by Green's function theory through the modified Arrott plot, critical isothermal and electrocaloric effect (ECE) analysis around the transition temperature TC. It is shown that the electric entropy change -ΔS follows a power-law dependence on the electric field E: -ΔS ∼ E^n, with n satisfying the Franco equation n(TC) = 1 + (β - 1)/(β + γ) = 0.618, wherein the obtained critical exponents β = 0.440 and γ = 1.030 are not only corroborated by the Kouvel-Fisher method, but also confirm the Widom critical relation δ = 1 + γ/β. The self-consistency and reliability of the obtained critical exponents are further verified by the scaling equations. Additionally, a universal curve of -ΔS is constructed by rescaling temperature and electric field, so that one can extrapolate the ECE over a range of temperatures and electric fields, which would be helpful in designing controlled electric refrigeration devices.
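The consistency of the reported exponents can be checked directly: with β = 0.440 and γ = 1.030, the Widom relation fixes δ and the Franco relation fixes n(TC). A minimal check:

```python
# Check the scaling relations quoted in the abstract.
beta, gamma = 0.440, 1.030

delta = 1.0 + gamma / beta                   # Widom relation: delta = 1 + gamma/beta
n_tc = 1.0 + (beta - 1.0) / (beta + gamma)   # Franco relation evaluated at T = TC

# delta is about 3.34, and n(TC) is about 0.619, matching the reported
# 0.618 to within rounding of the exponents.
```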
Selected Judgmental Methods in Defense Analyses. Volume 1. Main Text.
1990-07-01
Prepared under contract No. MDA903-89-C-0003, Task T-6-593, Survey of Qualitative Methods in Military Operations Research. Surviving contents fragments cover: "..., Generalizability, and Reliability: Three Dimensions of Judgment Research"; non-gamble methods; and criticisms, caveats, and replies.
ERIC Educational Resources Information Center
Grimm, Kevin J.
2007-01-01
Recent advances in methods and computer software for longitudinal data analysis have pushed researchers to more critically examine developmental theories. In turn, researchers have also begun to push longitudinal methods by asking more complex developmental questions. One such question involves the relationships between two developmental…
NASA Astrophysics Data System (ADS)
Leherte, L.; Allen, F. H.; Vercauteren, D. P.
1995-04-01
A computational method is described for mapping the volume within the DNA double helix accessible to a groove-binding antibiotic, netropsin. Topological critical point analysis is used to locate maxima in electron density maps reconstructed from crystallographically determined atomic coordinates. The peaks obtained in this way are represented as ellipsoids with axes related to local curvature of the electron density function. Combining the ellipsoids produces a single electron density function which can be probed to estimate effective volumes of the interacting species. Close complementarity between host and ligand in this example shows the method to give a good representation of the electron density function at various resolutions, while at the atomic level the ellipsoid method gives results which are in close agreement with those from the conventional, spherical, van der Waals approach.
NASA Astrophysics Data System (ADS)
Leherte, Laurence; Allen, Frank H.
1994-06-01
A computational method is described for mapping the volume within the DNA double helix accessible to the groove-binding antibiotic netropsin. Topological critical point analysis is used to locate maxima in electron density maps reconstructed from crystallographically determined atomic coordinates. The peaks obtained in this way are represented as ellipsoids with axes related to local curvature of the electron density function. Combining the ellipsoids produces a single electron density function which can be probed to estimate effective volumes of the interacting species. Close complementarity between host and ligand in this example shows the method to give a good representation of the electron density function at various resolutions. At the atomic level, the ellipsoid method gives results which are in close agreement with those from the conventional spherical van der Waals approach.
The secret lives of experiments: methods reporting in the fMRI literature.
Carp, Joshua
2012-10-15
Replication of research findings is critical to the progress of scientific understanding. Accordingly, most scientific journals require authors to report experimental procedures in sufficient detail for independent researchers to replicate their work. To what extent do research reports in the functional neuroimaging literature live up to this standard? The present study evaluated methods reporting and methodological choices across 241 recent fMRI articles. Many studies did not report critical methodological details with regard to experimental design, data acquisition, and analysis. Further, many studies were underpowered to detect any but the largest statistical effects. Finally, data collection and analysis methods were highly flexible across studies, with nearly as many unique analysis pipelines as there were studies in the sample. Because the rate of false positive results is thought to increase with the flexibility of experimental designs, the field of functional neuroimaging may be particularly vulnerable to false positives. In sum, the present study documented significant gaps in methods reporting among fMRI studies. Improved methodological descriptions in research reports would yield significant benefits for the field. Copyright © 2012 Elsevier Inc. All rights reserved.
He, Qing; Hao, Yinping; Liu, Hui; Liu, Wenyi
2018-01-01
Super-critical carbon dioxide energy storage (SC-CCES) is a new type of gas energy-storage technology. This paper used an orthogonal experimental design and variance analysis to perform a significance analysis of the factors that affect the thermodynamic characteristics of the SC-CCES system, and identified the significant factors and interactions in the energy-storage process, the energy-release process, and the whole energy-storage system. The results show that interactions among the components have little influence on the energy-storage process, the energy-release process, and the whole energy-storage process of the SC-CCES system; the significant factors lie mainly in the characteristics of each system component itself. These findings provide a reference for optimizing the thermal properties of the energy-storage system.
He, Qing; Liu, Hui; Liu, Wenyi
2018-01-01
Super-critical carbon dioxide energy storage (SC-CCES) is a new type of gas energy-storage technology. This paper used an orthogonal experimental design and variance analysis to perform a significance analysis of the factors that affect the thermodynamic characteristics of the SC-CCES system, and identified the significant factors and interactions in the energy-storage process, the energy-release process, and the whole energy-storage system. The results show that interactions among the components have little influence on the energy-storage process, the energy-release process, and the whole energy-storage process of the SC-CCES system; the significant factors lie mainly in the characteristics of each system component itself. These findings provide a reference for optimizing the thermal properties of the energy-storage system. PMID:29634742
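The orthogonal-design screening described above can be sketched as follows. The L4(2^3) array, the three factors, and the response values are hypothetical illustrations of the technique, not data from the study.

```python
# Orthogonal-array significance screening sketch: an L4(2^3) array tests
# three two-level factors in four runs, and each factor's importance is
# judged from the range between its level means. Responses are hypothetical.

# L4 orthogonal array: each row is (level of A, level of B, level of C).
l4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
response = [61.0, 63.0, 70.0, 72.0]   # hypothetical system efficiency (%)

def effect_range(factor_index):
    """Range between the mean responses at the factor's two levels."""
    means = []
    for level in (0, 1):
        vals = [y for row, y in zip(l4, response) if row[factor_index] == level]
        means.append(sum(vals) / len(vals))
    return abs(means[1] - means[0])

ranges = [effect_range(i) for i in range(3)]
# Factor A dominates: changing its level shifts the mean response the most,
# while factor C has no detectable effect in this toy data.
```

Variance analysis then formalizes the same comparison, partitioning the total sum of squares among the factors and their interactions.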
The importance of reference materials in doping-control analysis.
Mackay, Lindsey G; Kazlauskas, Rymantas
2011-08-01
Currently a large range of pure substance reference materials are available for calibration of doping-control methods. These materials enable traceability to the International System of Units (SI) for the results generated by World Anti-Doping Agency (WADA)-accredited laboratories. Only a small number of prohibited substances have threshold limits for which quantification is highly important. For these analytes only the highest quality reference materials that are available should be used. Many prohibited substances have no threshold limits and reference materials provide essential identity confirmation. For these reference materials the correct identity is critical and the methods used to assess identity in these cases should be critically evaluated. There is still a lack of certified matrix reference materials to support many aspects of doping analysis. However, in key areas a range of urine matrix materials have been produced for substances with threshold limits, for example 19-norandrosterone and testosterone/epitestosterone (T/E) ratio. These matrix-certified reference materials (CRMs) are an excellent independent means of checking method recovery and bias and will typically be used in method validation and then regularly as quality-control checks. They can be particularly important in the analysis of samples close to threshold limits, in which measurement accuracy becomes critical. Some reference materials for isotope ratio mass spectrometry (IRMS) analysis are available and a matrix material certified for steroid delta values is currently under production. In other new areas, for example the Athlete Biological Passport, peptide hormone testing, designer steroids, and gene doping, reference material needs still need to be thoroughly assessed and prioritised.
Shao, Jingyuan; Cao, Wen; Qu, Haibin; Pan, Jianyang; Gong, Xingchu
2018-01-01
The aim of this study was to present a novel analytical quality by design (AQbD) approach for developing an HPLC method to analyze herbal extracts. In this approach, critical method attributes (CMAs) and critical method parameters (CMPs) of the analytical method were determined using the same data collected from screening experiments. The HPLC-ELSD method for separation and quantification of sugars in Codonopsis Radix extract (CRE) samples and Astragali Radix extract (ARE) samples was developed as an example method with a novel AQbD approach. Potential CMAs and potential CMPs were found with Analytical Target Profile. After the screening experiments, the retention time of the D-glucose peak of CRE samples, the signal-to-noise ratio of the D-glucose peak of CRE samples, and retention time of the sucrose peak in ARE samples were considered CMAs. The initial and final composition of the mobile phase, flow rate, and column temperature were found to be CMPs using a standard partial regression coefficient method. The probability-based design space was calculated using a Monte-Carlo simulation method and verified by experiments. The optimized method was validated to be accurate and precise, and then it was applied in the analysis of CRE and ARE samples. The present AQbD approach is efficient and suitable for analysis objects with complex compositions.
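The probability-based design space mentioned above can be sketched with a Monte Carlo simulation over a simple predictive model. The linear model linking a method parameter pair (CMPs) to a retention-time attribute (CMA), and all of its coefficients and limits, are hypothetical stand-ins, not the regression fitted in the study.

```python
# Sketch of a probability-based design space via Monte Carlo simulation,
# in the spirit of the AQbD workflow above. The regression model, its
# coefficients, and the acceptance limits are hypothetical.
import random

random.seed(0)

def predicted_retention(flow_rate, temperature):
    """Hypothetical model for a peak's retention time (min), with noise."""
    noise = random.gauss(0.0, 0.3)
    return 20.0 - 8.0 * (flow_rate - 1.0) - 0.1 * (temperature - 30.0) + noise

def probability_of_success(flow_rate, temperature, n=5000):
    """P(the CMA meets its criterion) at one point of the CMP grid."""
    hits = sum(1 for _ in range(n)
               if 18.0 <= predicted_retention(flow_rate, temperature) <= 22.0)
    return hits / n

p = probability_of_success(flow_rate=1.0, temperature=30.0)
```

Evaluating `probability_of_success` over a grid of CMP settings, then keeping the region where the probability exceeds a chosen threshold, yields the design space that the study verified experimentally.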
Failure mode and effects analysis: a comparison of two common risk prioritisation methods.
McElroy, Lisa M; Khorzad, Rebeca; Nannicelli, Anna P; Brown, Alexandra R; Ladner, Daniela P; Holl, Jane L
2016-05-01
Failure mode and effects analysis (FMEA) is a method of risk assessment increasingly used in healthcare over the past decade. The traditional method, however, can require substantial time and training resources. The goal of this study is to compare a simplified scoring method with the traditional scoring method to determine the degree of congruence in identifying high-risk failures. An FMEA of the operating room (OR) to intensive care unit (ICU) handoff was conducted. Failures were scored and ranked using both the traditional risk priority number (RPN) and criticality-based method, and a simplified method, which designates failures as 'high', 'medium' or 'low' risk. The degree of congruence was determined by first identifying those failures determined to be critical by the traditional method (RPN≥300), and then calculating the per cent congruence with those failures designated critical by the simplified methods (high risk). In total, 79 process failures among 37 individual steps in the OR to ICU handoff process were identified. The traditional method yielded Criticality Indices (CIs) ranging from 18 to 72 and RPNs ranging from 80 to 504. The simplified method ranked 11 failures as 'low risk', 30 as medium risk and 22 as high risk. The traditional method yielded 24 failures with an RPN ≥300, of which 22 were identified as high risk by the simplified method (92% agreement). The top 20% of CI (≥60) included 12 failures, of which six were designated as high risk by the simplified method (50% agreement). These results suggest that the simplified method of scoring and ranking failures identified by an FMEA can be a useful tool for healthcare organisations with limited access to FMEA expertise. However, the simplified method does not result in the same degree of discrimination in the ranking of failures offered by the traditional method. Published by the BMJ Publishing Group Limited. 
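The congruence figure reported above (e.g., 22 of 24, 92%) is a simple conditional proportion. The sketch below illustrates the calculation; the failure records are hypothetical, not the study's data.

```python
# Sketch of the congruence calculation: what fraction of the failures
# flagged critical by the traditional method (RPN >= 300) are also rated
# 'high' by the simplified method? Records below are hypothetical.

failures = [
    # (RPN from traditional scoring, simplified rating)
    (504, "high"), (450, "high"), (380, "high"), (336, "medium"),
    (280, "medium"), (120, "low"), (80, "low"),
]

critical_traditional = [f for f in failures if f[0] >= 300]
agree = sum(1 for f in critical_traditional if f[1] == "high")
percent_congruence = 100.0 * agree / len(critical_traditional)
# Here 3 of the 4 failures with RPN >= 300 are rated 'high': 75% congruence.
```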
NASA Astrophysics Data System (ADS)
Moosavi, S. Amin; Montakhab, Afshin
2014-05-01
Motivated by recent experiments in neuroscience which indicate that neuronal avalanches exhibit scale invariant behavior similar to self-organized critical systems, we study the role of noisy (nonconservative) local dynamics on the critical behavior of a sandpile model which can be taken to mimic the dynamics of neuronal avalanches. We find that despite the fact that noise breaks the strict local conservation required to attain criticality, our system exhibits true criticality for a wide range of noise in various dimensions, given that conservation is respected on the average. Although the system remains critical, exhibiting finite-size scaling, the values of the critical exponents change depending on the intensity of local noise. Interestingly, for a sufficiently strong noise level, the critical exponents approach and saturate at their mean-field values, consistent with empirical measurements of neuronal avalanches. This is confirmed for both two and three dimensional models. However, the addition of noise does not affect the exponents at the upper critical dimension (D = 4). In addition to an extensive finite-size scaling analysis of our systems, we also employ a useful time-series analysis method to establish true criticality of noisy systems. Finally, we discuss the implications of our work in neuroscience as well as some implications for the general phenomena of criticality in nonequilibrium systems.
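A minimal sketch of the strictly conservative baseline, the 2D BTW sandpile, is below; the paper's noisy, on-average-conservative variant is not reproduced, and the lattice size and drive count are arbitrary choices.

```python
# Minimal 2D Abelian (BTW) sandpile: drive one random site, topple any
# site holding >= 4 grains (one grain to each neighbour, edge grains are
# dissipated), and record avalanche sizes. This is the strictly
# conservative baseline; the paper's noisy variant is not reproduced.
import random

random.seed(1)
L = 10
grid = [[0] * L for _ in range(L)]

def topple():
    """Relax the lattice fully; returns the avalanche size (topplings)."""
    size = 0
    unstable = [(i, j) for i in range(L) for j in range(L) if grid[i][j] >= 4]
    while unstable:
        i, j = unstable.pop()
        if grid[i][j] < 4:        # stale entry, already relaxed
            continue
        grid[i][j] -= 4
        size += 1
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < L and 0 <= nj < L:
                grid[ni][nj] += 1
                if grid[ni][nj] >= 4:
                    unstable.append((ni, nj))
    return size

sizes = []
for _ in range(5000):
    i, j = random.randrange(L), random.randrange(L)
    grid[i][j] += 1
    sizes.append(topple())
# After the transient, avalanche sizes span many scales (heavy-tailed),
# which is the finite-size-scaling behavior the paper analyzes.
```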
ERIC Educational Resources Information Center
Hanson, James H.; Brophy, Patrick D.
2012-01-01
Not all knowledge and skills that educators want to pass to students exists yet in textbooks. Some still resides only in the experiences of practicing engineers (e.g., how engineers create new products, how designers identify errors in calculations). The critical incident technique, CIT, is an established method for cognitive task analysis. It is…
The Development and Validation of a Mechanical Critical Thinking Scale for High School Students
ERIC Educational Resources Information Center
Yu, Kuang-Chao; Lin, Kuen-Yi; Chang, Shu-Fen
2017-01-01
The purpose of this study was to develop a mechanical critical thinking scale for high school students. A stratified random sampling method was used to establish the norms. After pre-tests and item analysis, the scale was determined to have five subtest sections (i.e., recognition of assumptions, induction, deduction, interpretation, and…
Contrasting Cross-Sectional and Longitudinal Early School Leaver Rates in Canada
ERIC Educational Resources Information Center
Timmons, Vianne; Ostridge, Randy
2009-01-01
Data analysis is critical to educational planning. Determining the number of school leavers is crucial for a school board when planning for interventions and supports. In researching the number of early school leavers in the province of Prince Edward Island, the method in which the data were reported affected the rates. Two critical considerations…
ERIC Educational Resources Information Center
Aleman, Enrique, Jr.
2009-01-01
In this article, the author seeks to re-imagine the political and policy roles of educational leaders of color, offering an alternative method for educational leadership, advocacy, and policy analysis. The author uses critical race theory (CRT) and Latina/o critical (LatCrit) theory to problematize the way politically-active Mexican American…
Natural time analysis of critical phenomena: The case of pre-fracture electromagnetic emissions
NASA Astrophysics Data System (ADS)
Potirakis, S. M.; Karadimitrakis, A.; Eftaxias, K.
2013-06-01
Criticality of complex systems reveals itself in various ways. One way to monitor a system at critical state is to analyze its observable manifestations using the recently introduced method of natural time. Pre-fracture electromagnetic (EM) emissions, in agreement with laboratory experiments, have been consistently detected in the MHz band prior to significant earthquakes. It has been proposed that these emissions stem from the fracture of the heterogeneous materials surrounding the strong entities (asperities) distributed along the fault, preventing the relative slipping. It has also been proposed that the fracture of heterogeneous material could be described in analogy to the critical phase transitions in statistical physics. In this work, natural time analysis is applied for the first time to the pre-fracture MHz EM signals, revealing their critical nature. Seismicity and pre-fracture EM emissions should be two sides of the same coin concerning the earthquake generation process. Therefore, we also examine the corresponding foreshock seismic activity, as another manifestation of the same complex system at critical state. We conclude that the foreshock seismicity data present criticality features as well.
Natural time analysis of critical phenomena: the case of pre-fracture electromagnetic emissions.
Potirakis, S M; Karadimitrakis, A; Eftaxias, K
2013-06-01
Criticality of complex systems reveals itself in various ways. One way to monitor a system at critical state is to analyze its observable manifestations using the recently introduced method of natural time. Pre-fracture electromagnetic (EM) emissions, in agreement with laboratory experiments, have been consistently detected in the MHz band prior to significant earthquakes. It has been proposed that these emissions stem from the fracture of the heterogeneous materials surrounding the strong entities (asperities) distributed along the fault, preventing the relative slipping. It has also been proposed that the fracture of heterogeneous material could be described in analogy to the critical phase transitions in statistical physics. In this work, natural time analysis is applied for the first time to the pre-fracture MHz EM signals, revealing their critical nature. Seismicity and pre-fracture EM emissions should be two sides of the same coin concerning the earthquake generation process. Therefore, we also examine the corresponding foreshock seismic activity, as another manifestation of the same complex system at critical state. We conclude that the foreshock seismicity data present criticality features as well.
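In natural time analysis, the k-th of N events is assigned the "natural time" chi_k = k/N, and the variance kappa_1 of chi weighted by the normalised event energies is the criticality measure, with kappa_1 near 0.070 signalling the critical state in this literature. A minimal sketch of the kappa_1 computation follows; the event-energy sequence is a hypothetical illustration.

```python
# Natural time analysis sketch: kappa_1 is the variance of natural time
# chi_k = k/N weighted by normalised event energies p_k = Q_k / sum(Q).
# In this literature kappa_1 approaching ~0.070 is the criticality
# signature. The event energies below are hypothetical.

def kappa1(energies):
    n = len(energies)
    total = sum(energies)
    p = [q / total for q in energies]
    chi = [(k + 1) / n for k in range(n)]
    mean = sum(pk * ck for pk, ck in zip(p, chi))
    mean_sq = sum(pk * ck ** 2 for pk, ck in zip(p, chi))
    return mean_sq - mean ** 2

# Sanity check: equal-energy events reduce kappa_1 to the variance of a
# uniform grid on (0, 1], which tends to 1/12 for large N.
uniform_k1 = kappa1([1.0] * 1000)
```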
Mixed-methods research in nursing - a critical review.
Bressan, Valentina; Bagnasco, Annamaria; Aleo, Giuseppe; Timmins, Fiona; Barisone, Michela; Bianchi, Monica; Pellegrini, Ramona; Sasso, Loredana
2017-10-01
To review the use of mixed-methods research in nursing with a particular focus on the extent to which current practice informs nurse researchers. It also aimed to highlight gaps in current knowledge, understanding and reporting of this type of research. Mixed-methods research is becoming increasingly popular among nurses and healthcare professionals. Emergent findings from this type of research are very useful for nurses in practice. The combination of both quantitative and qualitative methods provides a scientific base for practice but also richness from the qualitative enquiry. However, at the same time mixed-methods research is underdeveloped. This study identified mixed-methods research papers and critically evaluated their usefulness for research practice. To support the analysis, we performed a two-stage search using CINAHL to find papers with titles that included the key term 'mixed method'. An analysis of studies that used mixed-methods research revealed some inconsistencies in application and reporting. Attempts to use two distinct research methods in these studies often meant that one or both aspects had limitations. Overall, methods were applied in a less rigorous way. This has implications for providing somewhat limited direction for novice researchers. There is also potential for application of evidence in healthcare practice that has limited validity. This study highlights current gaps in knowledge, understanding and reporting of mixed-methods research. While these methods are useful to gain insight into clinical problems, nurses lack guidance with this type of research. This study revealed that the guidance provided by current mixed-methods research is inconsistent and incomplete and this compounds the lack of available direction. There is an urgent need to develop robust guidelines for using mixed-methods research so that findings may be critically implemented in practice. © 2016 John Wiley & Sons Ltd.
Molecular characters of melon (Cucumis melo L. "Tacapa") in response to karst critical land
NASA Astrophysics Data System (ADS)
Rachmawati, Yuanita; Daryono, Budi Setiadi; Aristya, Ganies Riza
2017-06-01
The Yogyakarta district has 158,600 ha of critical land spread across three agro-ecosystem zones, two of which are karst critical land. Critical lands, which contain calcium carbonate in high concentrations and suffer water deficits at the upper surface, impose abiotic stress on a wide range of plants. The melon cultivar TACAPA has superior characteristics derived from the parental crossing ♀ Action 434 × ♂ PI 371795 and is a candidate for development on karst critical land. Abscisic acid (ABA) is a phytohormone expressed by plants under abiotic stress, and CmBG1 is a gene that regulates the ABA hormone in melon. The purpose of this research was to examine the molecular character of melon cultivar TACAPA in response to karst critical land through molecular characterization of the CmBG1 gene. Qualitative analysis was done using Reverse Transcriptase-PCR (RT-PCR) and electrophoresis, while quantitative analysis was conducted by measuring absorbance with a spectrophotometer. CmBG1 gene expression was examined using real-time PCR (qPCR). CmBG1 was detected at a size of ±1258 bp, and CmBG1 gene concentrations in melons planted in control media were lower than in melons grown in critical-land media. These results agree with the real-time quantitative analysis. They also reveal that TACAPA is a more promising cultivar than others for development in karst critical land areas.
Anthropometric Accommodation in Space Suit Design
NASA Technical Reports Server (NTRS)
Rajulu, Sudhakar; Thaxton, Sherry
2007-01-01
Design requirements for next-generation hardware are in development at NASA. Anthropometry requirements are given in terms of minimum and maximum sizes for critical dimensions that hardware must accommodate. These dimensions drive vehicle design and suit design, and implicitly have an effect on crew selection and participation. At this stage in the process, stakeholders such as cockpit and suit designers were asked to provide lists of dimensions that will be critical for their design. In addition, they were asked to provide technically feasible minimum and maximum ranges for these dimensions. Using an adjusted 1988 Anthropometric Survey of U.S. Army (ANSUR) database to represent a future astronaut population, the accommodation ranges provided by the suit critical dimensions were calculated. This project involved participation from the Anthropometry and Biomechanics Facility (ABF) as well as suit designers, with suit designers providing expertise about feasible hardware dimensions and the ABF providing accommodation analysis. The initial analysis provided the suit design team with the accommodation levels associated with the critical dimensions provided early in the study. Additional outcomes will include a comparison of principal components analysis as an alternate method for anthropometric analysis.
La2-xSrxCuO4-δ superconducting samples prepared by the wet-chemical method
NASA Astrophysics Data System (ADS)
Loose, A.; Gonzalez, J. L.; Lopez, A.; Borges, H. A.; Baggio-Saitovitch, E.
2009-10-01
In this work, we report on the physical properties of good-quality polycrystalline superconducting samples of La2-xSrxCu1-yZnyO4-δ (y = 0, 0.02) prepared by a wet-chemical method, focusing on the temperature dependence of the critical current. Using the wet-chemical method, we were able to produce samples with improved homogeneity compared to the solid-state method. A complete set of samples with several carrier concentrations, ranging from the underdoped (strontium concentration x ≈ 0.05) to the highly overdoped (x ≈ 0.25) region, was prepared and investigated. The X-ray diffraction analysis, zero-field-cooling magnetization and electrical resistivity measurements were reported on earlier. The structural parameters of the prepared samples seem to be slightly modified by the preparation method, and their critical temperatures were lower than reported in the literature. The temperature dependence of the critical current was explained by a theoretical model which took the granular structure of the samples into account.
Verification and Validation in a Rapid Software Development Process
NASA Technical Reports Server (NTRS)
Callahan, John R.; Easterbrook, Steve M.
1997-01-01
The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in their use in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing on the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.
Electrochemical Biosensors for Rapid Detection of Foodborne Salmonella: A Critical Overview
Cinti, Stefano; Volpe, Giulia; Piermarini, Silvia; Delibato, Elisabetta; Palleschi, Giuseppe
2017-01-01
Salmonella has represented the most common and primary cause of food poisoning in many countries for over 100 years. Its detection is still primarily based on traditional microbiological culture methods, which are labor-intensive, extremely time-consuming, and not suitable for testing a large number of samples. Accordingly, great efforts to develop rapid, sensitive and specific methods that are easy to use and suitable for multi-sample analysis have been made and continue to be made. Biosensor-based technology has all the potential to meet these requirements. In this paper, we review the features of the electrochemical immunosensors, genosensors, aptasensors and phagosensors developed in the last five years for Salmonella detection, focusing on the critical aspects of their application in food analysis. PMID:28820458
Radulescu, Georgeta; Gauld, Ian C.; Ilas, Germina; ...
2014-11-01
This paper describes a depletion code validation approach for criticality safety analysis using burnup credit for actinide and fission product nuclides in spent nuclear fuel (SNF) compositions. The technical basis for determining the uncertainties in the calculated nuclide concentrations is comparison of calculations to available measurements obtained from destructive radiochemical assay of SNF samples. Probability distributions developed for the uncertainties in the calculated nuclide concentrations were applied to the SNF compositions of a criticality safety analysis model by the use of a Monte Carlo uncertainty sampling method to determine bias and bias uncertainty in the effective neutron multiplication factor. Application of the Monte Carlo uncertainty sampling approach is demonstrated for representative criticality safety analysis models of pressurized water reactor spent fuel pool storage racks and transportation packages using burnup-dependent nuclide concentrations calculated with SCALE 6.1 and the ENDF/B-VII nuclear data. Furthermore, the validation approach and results support a recent revision of the U.S. Nuclear Regulatory Commission Interim Staff Guidance 8.
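The Monte Carlo uncertainty sampling step described above can be sketched in miniature. All numbers below (the per-nuclide measurement-to-calculation factors, the sensitivities, and the linear k-eff surrogate) are hypothetical illustrations, not values from the SCALE 6.1 validation work:

```python
import random
import statistics

# Hypothetical measurement-to-calculation factors (mean, std) per nuclide,
# standing in for distributions derived from radiochemical assay comparisons.
nuclide_factors = {
    "U-235":  (1.00, 0.02),
    "Pu-239": (0.98, 0.03),
    "Sm-149": (1.05, 0.08),
}

# Toy linear sensitivity of k-eff to each nuclide concentration.
sensitivity = {"U-235": 0.30, "Pu-239": 0.15, "Sm-149": -0.05}
k_nominal = 0.94

def sample_keff(rng):
    """Perturb each concentration by a sampled factor; propagate linearly."""
    k = k_nominal
    for nuc, (mu, sigma) in nuclide_factors.items():
        factor = rng.gauss(mu, sigma)
        k += sensitivity[nuc] * (factor - 1.0)
    return k

rng = random.Random(42)
samples = [sample_keff(rng) for _ in range(10_000)]
bias = statistics.mean(samples) - k_nominal
bias_uncertainty = statistics.stdev(samples)
print(f"bias = {bias:+.4f}, uncertainty (1 sigma) = {bias_uncertainty:.4f}")
```

A real analysis would replace the linear surrogate with repeated criticality calculations on the sampled compositions; the sketch only shows how sampled concentration uncertainties turn into a bias and a bias uncertainty on the multiplication factor.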
NASA Astrophysics Data System (ADS)
Sulaiman, Tajularipin; a/l Kuppusamy, Suresh Kumar; Ayub, Ahmad Fauzi Mohd; Rahim, Suzieleez Syrene Abdul
2017-01-01
This study aims to assess the level of critical thinking disposition and teaching efficacy among Special Education Integration Programme (SEIP) teachers in Negeri Sembilan, Malaysia. The levels of critical thinking disposition and teaching efficacy in the SEIP were compared based on teaching experience and gender. The study also examined the relationship between critical thinking disposition and teaching efficacy in the SEIP. The research adopted a quantitative survey approach. A total of 190 primary school teachers from the SEIP in Negeri Sembilan were selected using a proportional sampling method. The instrument used in this study comprised three sections: demography, critical thinking disposition and teaching efficacy. Descriptive and inferential statistics were used in the analysis. The analysis shows that the respondents had a moderate level of critical thinking disposition (M = 2.99, S.D. = 0.160), while teaching efficacy (M = 3.01, S.D. = 0.128) was at a high level. For teaching experience, the analysis showed that the thinking disposition of novice teachers (mean = 2.52, SD = .503) is significantly higher than that of experienced teachers (mean = 2.35, SD = .481, t = 2.244, p < .05). There was no significant difference between male and female SEIP teachers in critical thinking disposition and teaching efficacy. Findings also indicated a significant positive moderate relationship (r = .477) between critical thinking disposition and teaching efficacy among SEIP teachers. This study suggests that critical thinking disposition and teaching efficacy play an important role in enhancing the performance of SEIP teachers.
Non-criticality of interaction network over system's crises: A percolation analysis.
Shirazi, Amir Hossein; Saberi, Abbas Ali; Hosseiny, Ali; Amirzadeh, Ehsan; Toranj Simin, Pourya
2017-11-20
Extraction of interaction networks from multivariate time series is a topic of broad interest in complex systems. Although this method has a wide range of applications, most previous analyses have focused on pairwise relations. Here we establish the potential of such a method to elicit the aggregated behavior of the system by making a connection with concepts from percolation theory. We study the dynamical interaction networks of a financial market extracted from the correlation network of indices and build a weighted network. In correspondence with the percolation model, we find that away from financial crises the interaction network behaves like a critical Erdős-Rényi random network, while close to a financial crisis our model deviates from the critical random network and behaves differently at different size scales. We perform further analysis to clarify that our observation is not a simple consequence of the growth in correlations over the crises.
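The percolation baseline the authors compare against can be illustrated with a plain Erdős-Rényi experiment. This is a generic sketch of the giant-component transition at mean degree ⟨k⟩ = 1, not the paper's weighted financial-network construction:

```python
import random

def largest_cluster_fraction(n, mean_degree, rng):
    """Fraction of nodes in the largest component of an Erdos-Renyi G(n, p)."""
    p = mean_degree / (n - 1)
    parent = list(range(n))

    def find(a):
        # Union-find with path halving.
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a

    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                ri, rj = find(i), find(j)
                if ri != rj:
                    parent[ri] = rj

    sizes = {}
    for i in range(n):
        r = find(i)
        sizes[r] = sizes.get(r, 0) + 1
    return max(sizes.values()) / n

rng = random.Random(1)
# Below the critical mean degree <k> = 1 only small clusters exist;
# above it, a giant component spans a finite fraction of the network.
for k in (0.5, 1.0, 2.0):
    frac = largest_cluster_fraction(1200, k, rng)
    print(f"<k> = {k}: largest-cluster fraction = {frac:.3f}")
```

The same largest-cluster diagnostic, applied to thresholded correlation networks instead of random graphs, is the kind of aggregate statistic that distinguishes near-critical from off-critical network states.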
Rock, Adam J.; Coventry, William L.; Morgan, Methuen I.; Loi, Natasha M.
2016-01-01
Generally, academic psychologists are mindful of the fact that, for many students, the study of research methods and statistics is anxiety provoking (Gal et al., 1997). Given the ubiquitous and distributed nature of eLearning systems (Nof et al., 2015), teachers of research methods and statistics need to cultivate an understanding of how to effectively use eLearning tools to inspire psychology students to learn. Consequently, the aim of the present paper is to discuss critically how using eLearning systems might engage psychology students in research methods and statistics. First, we critically appraise definitions of eLearning. Second, we examine numerous important pedagogical principles associated with effectively teaching research methods and statistics using eLearning systems. Subsequently, we provide practical examples of our own eLearning-based class activities designed to engage psychology students to learn statistical concepts such as Factor Analysis and Discriminant Function Analysis. Finally, we discuss general trends in eLearning and possible futures that are pertinent to teachers of research methods and statistics in psychology. PMID:27014147
NASA Technical Reports Server (NTRS)
Gates, Thomas S.; Odegard, Gregory M.; Nemeth, Michael P.; Frankland, Sarah-Jane V.
2004-01-01
A multi-scale analysis of the structural stability of a carbon nanotube-polymer composite material is developed. The influence of intrinsic molecular structure, such as nanotube length, volume fraction, orientation and chemical functionalization, is investigated by assessing the relative change in critical, in-plane buckling loads. The analysis method relies on elastic properties predicted using the hierarchical, constitutive equations developed from the equivalent-continuum modeling technique applied to the buckling analysis of an orthotropic plate. The results indicate that for the specific composite materials considered in this study, a composite with randomly orientated carbon nanotubes consistently provides the highest values of critical buckling load and that for low volume fraction composites, the non-functionalized nanotube material provides an increase in critical buckling stability with respect to the functionalized system.
NASA Astrophysics Data System (ADS)
Tyffani, D. M.; Utomo, S. B.; Rahardjo, S. B.
2018-05-01
This research aimed to determine students' need for a chemistry module based on REACT (Relating, Experiencing, Applying, Cooperating and Transferring) to improve students' critical thinking ability. The subjects were grade XI students at three schools in the even semester of the 2016-2017 academic year: 48 students of Senior High School 2 Bandar Lampung, 38 students of Senior High School 3 Bandar Lampung and 46 students of Senior High School 12 Bandar Lampung. Data were gathered by a non-test method using an open questionnaire with 13 questions. The results showed that 84.84% of students stated that the development of a REACT-based chemistry module on colloid material is needed. The analysis of the students' handbooks used the aspects of critical thinking proposed by Facione (2011): interpretation, analysis, evaluation, inference, and explanation. The analysis of the handbook used at Senior High School 12 Bandar Lampung for critical thinking in colloid material indicated that 50% of the indicators were appropriate, while for the inference and explanation indicators only 16.67% were appropriate, and the analysis and evaluation indicators showed no conformity. These results show that the handbooks in use have not fully empowered critical thinking ability. The development of a chemistry module on colloid material, oriented to the REACT learning model (Relating, Experiencing, Applying, Cooperating, and Transferring), is therefore needed to overcome the shortcomings of the current handbooks.
Ceppi, Marcello; Gallo, Fabio; Bonassi, Stefano
2011-01-01
The most common study design performed in population studies based on the micronucleus (MN) assay is the cross-sectional study, which is largely performed to evaluate the DNA damaging effects of exposure to genotoxic agents in the workplace, in the environment, as well as from diet or lifestyle factors. Sample size is still a critical issue in the design of MN studies since most recent studies considering gene-environment interaction often require a sample size of several hundred subjects, which is in many cases difficult to achieve. The control of confounding is another major threat to the validity of causal inference. The most popular confounders considered in population studies using MN are age, gender and smoking habit. Extensive attention is given to the assessment of effect modification, given the increasing inclusion of biomarkers of genetic susceptibility in the study design. Selected issues concerning the statistical treatment of data have been addressed in this mini-review, starting from data description, which is a critical step of statistical analysis, since it makes it possible to detect errors in the dataset to be analysed and to check the validity of assumptions required for more complex analyses. Basic issues dealing with statistical analysis of biomarkers are extensively evaluated, including methods to explore the dose-response relationship between two continuous variables and inferential analysis. A critical approach to the use of parametric and non-parametric methods is presented before addressing the issue of the most suitable multivariate models to fit MN data. In the last decade, the quality of statistical analysis of MN data has certainly evolved, although even nowadays only a small number of studies apply the Poisson model, which is the most suitable method for the analysis of MN data.
Strain Modal Analysis of Small and Light Pipes Using Distributed Fibre Bragg Grating Sensors
Huang, Jun; Zhou, Zude; Zhang, Lin; Chen, Juntao; Ji, Chunqian; Pham, Duc Truong
2016-01-01
Vibration fatigue failure is a critical problem of hydraulic pipes under severe working conditions. Strain modal testing of small and light pipes is a good option for dynamic characteristic evaluation, structural health monitoring and damage identification. Unique features such as small size, light weight, and high multiplexing capability enable Fibre Bragg Grating (FBG) sensors to measure structural dynamic responses where sensor size and placement are critical. In this paper, experimental strain modal analysis of pipes using distributed FBG sensors is presented. Strain modal analysis and parameter identification methods are introduced. Experimental strain modal testing and finite element analysis for a cantilever pipe have been carried out. The analysis results indicate that the natural frequencies and strain mode shapes of the tested pipe acquired by FBG sensors are in good agreement with the results obtained by a reference accelerometer and simulation outputs. The strain modal parameters of a hydraulic pipe were obtained by the proposed strain modal testing method. FBG sensors have been shown to be useful in the experimental strain modal analysis of small and light pipes in mechanical, aeronautic and aerospace applications. PMID:27681728
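As a loose illustration of extracting one modal parameter from a sensor record, the sketch below picks the dominant frequency of a synthetic decaying response via a brute-force DFT. Real strain modal testing identifies frequencies and mode shapes from frequency response functions; the 12 Hz signal and damping value here are invented:

```python
import cmath
import math

def dominant_frequency(signal, dt):
    """Return the frequency (Hz) of the largest DFT magnitude peak, excluding DC."""
    n = len(signal)
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):
        s = sum(x * cmath.exp(-2j * math.pi * k * i / n)
                for i, x in enumerate(signal))
        if abs(s) > best_mag:
            best_k, best_mag = k, abs(s)
    return best_k / (n * dt)

# Synthetic "strain gauge" record: a decaying 12 Hz mode.
dt = 0.005                          # 200 Hz sampling rate
t = [i * dt for i in range(400)]    # 2 s record -> 0.5 Hz frequency resolution
signal = [math.exp(-1.5 * ti) * math.sin(2 * math.pi * 12.0 * ti) for ti in t]
print(f"identified natural frequency ~ {dominant_frequency(signal, dt):.1f} Hz")
```

In practice an FFT would replace the O(n²) DFT loop, and the peak would be read from the strain/acceleration frequency response rather than a raw time trace.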
Understanding critical health literacy: a concept analysis
2013-01-01
Background Interest in and debates around health literacy have grown over the last two decades and key to the discussions has been the distinction made between basic functional health literacy, communicative/interactive health literacy and critical health literacy. Of these, critical health literacy is the least well developed, and differing interpretations of its constituents and relevance exist. The aim of this study is to rigorously analyse the concept of critical health literacy in order to offer some clarity of definition upon which appropriate theory, well grounded practice and potential measurement tools can be based. Method The study uses a theoretical and colloquial evolutionary concept analysis method to systematically identify the features associated with this concept. A unique characteristic of this method is that it practically combines an analysis of the literature with in-depth interviews undertaken with practitioners and policy makers who have an interest in the field. The study also analyses how the concept is understood across the contexts of time, place, discipline and use by health professionals, policy makers and academics. Results Findings revealed a distinct set of characteristics of advanced personal skills, health knowledge, information skills, effective interaction between service providers and users, informed decision making and empowerment including political action as key features of critical health literacy. The potential consequences of critical health literacy identified are in improving health outcomes, creating more effective use of health services and reducing inequalities in health, thus demonstrating the relevance of this concept to public health and health promotion. Conclusions While critical health literacy is shown to be a unique concept, there remain significant contextual variations in understanding, particularly between academics, practitioners and policy makers.
Key attributes presented as part of this concept when it was first introduced in the literature, particularly those around empowerment, social and political action and the existence of the concept at both an individual and population level, have been lost in more recent representations. This has resulted in critical health literacy becoming restricted to a higher order cognitive individual skill rather than a driver for political and social change. The paper argues that in order to retain the uniqueness and usefulness of the concept in practice efforts should be made to avoid this dilution of meaning. PMID:23419015
Liu, Cong; Kolarik, Barbara; Gunnarsen, Lars; Zhang, Yinping
2015-10-20
Polychlorinated biphenyls (PCBs) have been found to be persistent in the environment and possibly harmful. Many buildings are characterized by high PCB concentrations. Knowledge about partitioning between primary sources and building materials is critical for exposure assessment and practical remediation of PCB contamination. This study develops a C-depth method to determine the diffusion coefficient (D) and partition coefficient (K), two key parameters governing the partitioning process. For concrete, the primary material studied here, relative standard deviations of results among five data sets are 5-22% for K and 42-66% for D. Compared with existing methods, the C-depth method overcomes the inability of nonlinear regression to yield unique estimates and does not require assumed correlations for D and K among congeners. Comparison with a more sophisticated two-term approach implies significant uncertainty for D, and smaller uncertainty for K. However, considering the uncertainties associated with sampling and chemical analysis, and the impact of environmental factors, the results are acceptable for engineering applications. This was supported by good agreement between model prediction and measurement. Sensitivity analysis indicated that effective diffusion distance, contact time of materials with primary sources, and the depth of measured concentrations are critical for determining D, while the PCB concentration in primary sources is critical for K.
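A one-parameter toy version of the depth-profile idea can be sketched as follows, assuming Fickian diffusion from a constant-concentration source into a semi-infinite slab. All parameter values are hypothetical, and the real C-depth method jointly determines D and K from measured concrete profiles rather than recovering D alone from synthetic data:

```python
import math

def profile(x_mm, D_mm2_per_day, t_days, K, c_primary):
    """Concentration at depth x for diffusion from a constant-concentration
    surface into a semi-infinite slab: C = K*C0 * erfc(x / (2*sqrt(D*t)))."""
    return K * c_primary * math.erfc(x_mm / (2.0 * math.sqrt(D_mm2_per_day * t_days)))

# Hypothetical PCB concentrations at several depths in concrete, generated
# here from known parameters so the recovery step can be checked.
K_true, D_true, t, c0 = 40.0, 1e-3, 365 * 30, 1.0   # ~30 years of contact
depths = [1.0, 5.0, 10.0, 20.0, 40.0]               # mm
measured = [profile(x, D_true, t, K_true, c0) for x in depths]

# Brute-force scan over D: pick the diffusion coefficient whose predicted
# profile best matches the depth data in a least-squares sense.
candidates = [d * 1e-5 for d in range(1, 1001)]      # 1e-5 .. 1e-2 mm^2/day
best_D = min(candidates, key=lambda D: sum(
    (profile(x, D, t, K_true, c0) - m) ** 2 for x, m in zip(depths, measured)))
print(f"recovered D = {best_D:.1e} mm^2/day (true {D_true:.1e})")
```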
Deep Learning Nuclei Detection in Digitized Histology Images by Superpixels
Sornapudi, Sudhir; Stanley, Ronald Joe; Stoecker, William V.; Almubarak, Haidar; Long, Rodney; Antani, Sameer; Thoma, George; Zuna, Rosemary; Frazier, Shelliane R.
2018-01-01
Background: Advances in image analysis and computational techniques have facilitated automatic detection of critical features in histopathology images. Detection of nuclei is critical for squamous epithelium cervical intraepithelial neoplasia (CIN) classification into normal, CIN1, CIN2, and CIN3 grades. Methods: In this study, a deep learning (DL)-based nuclei segmentation approach is investigated based on gathering localized information through the generation of superpixels using a simple linear iterative clustering algorithm and training with a convolutional neural network. Results: The proposed approach was evaluated on a dataset of 133 digitized histology images and achieved an overall nuclei detection (object-based) accuracy of 95.97%, with demonstrated improvement over imaging-based and clustering-based benchmark techniques. Conclusions: The proposed DL-based nuclei segmentation method with superpixel analysis has shown improved segmentation results in comparison to state-of-the-art methods. PMID:29619277
NASA Technical Reports Server (NTRS)
Bellmore, C. P.; Reid, R. L.
1980-01-01
Presented herein is a method of including density fluctuations in the equations of turbulent transport. Results of a numerical analysis indicate that the method may be used to predict heat transfer for the case of near-critical para-hydrogen in turbulent upflow inside vertical tubes. Wall temperatures, heat transfer coefficients, and velocities obtained by coupling the equations of turbulent momentum and heat transfer with a perturbed equation of state show good agreement with experiment for inlet reduced pressures of 1.28-5.83.
Recommended Practice for Securing Control System Modems
DOE Office of Scientific and Technical Information (OSTI.GOV)
James R. Davidson; Jason L. Wright
2008-01-01
This paper addresses an often overlooked “backdoor” into critical infrastructure control systems created by modem connections. A modem’s connection to the public telephone system is similar to a corporate network connection to the Internet. By tracing typical attack paths into the system, this paper provides the reader with an analysis of the problem and then guides the reader through methods to evaluate existing modem security. Following the analysis, a series of methods for securing modems is provided. These methods are correlated to well-known networking security methods.
ERIC Educational Resources Information Center
Glenn, Wendy
2008-01-01
This article employs critical discourse analysis methods to (a) apply Marxist and critical literacy theories to recently published young adult novels that feature wealthy New York teens whose privilege grants them lives of leisure and (b) discuss the implications of using these texts in the classroom to encourage students to read (and consume)…
The Speed Reading Is in Disrepute: Advantages of Slow Reading for the Information Equilibrium
ERIC Educational Resources Information Center
Tsvetkova, Milena I.
2017-01-01
The study is dedicated to the impact of speed and acceleration on the preservation of the information equilibrium and the capacity for critical thinking in the active person. Methods of speed reading training are subjected to critical analysis. On the grounds of the theory of information equilibrium and the philosophy of…
Terzić, Jelena; Popović, Igor; Stajić, Ana; Tumpa, Anja; Jančić-Stojanović, Biljana
2016-06-05
This paper deals with the development of a hydrophilic interaction liquid chromatographic (HILIC) method for the analysis of bilastine and its degradation impurities following the Analytical Quality by Design approach. It is the first time that a method for bilastine and its impurities has been proposed. The main objective was to identify conditions where an adequate separation in minimal analysis duration could be achieved within a robust region. Critical process parameters, which have the most influence on method performance, were defined as the acetonitrile content in the mobile phase, the pH of the aqueous phase and the ammonium acetate concentration in the aqueous phase. A Box-Behnken design was applied to establish a relationship between critical process parameters and critical quality attributes. The defined mathematical models and Monte Carlo simulations were used to identify the design space. A fractional factorial design was applied for experimental robustness testing, and the method was validated to verify the adequacy of the selected optimal conditions: Luna® HILIC analytical column (100 mm × 4.6 mm, 5 μm particle size); mobile phase consisting of acetonitrile-aqueous phase (50 mM ammonium acetate, pH adjusted to 5.3 with glacial acetic acid) (90.5:9.5, v/v); column temperature 30 °C; mobile phase flow rate 1 mL min⁻¹; detection wavelength 275 nm. Copyright © 2016 Elsevier B.V. All rights reserved.
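The Box-Behnken design used above can be generated in coded units with a few lines; the mapping of the three coded factors to the paper's actual ranges is omitted here and the centre-point count is an assumption for illustration:

```python
from itertools import combinations

def box_behnken(n_factors, center_points=3):
    """Coded Box-Behnken design: +/-1 on every pair of factors,
    remaining factors at the midpoint, plus replicated centre runs."""
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                run = [0] * n_factors
                run[i], run[j] = a, b
                runs.append(run)
    runs.extend([0] * n_factors for _ in range(center_points))
    return runs

# Three critical process parameters in coded units, e.g. acetonitrile
# content, aqueous-phase pH and ammonium acetate concentration.
design = box_behnken(3)
for run in design:
    print(run)
print(f"{len(design)} runs")   # 12 edge runs + 3 centre runs for 3 factors
```

Each run varies exactly two factors at their extremes while holding the third at its midpoint, which is what lets the quadratic response-surface model be fitted without corner-point runs.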
2016-12-01
chosen rather than complex ones, and responds to the criticism of the DTA approach. Chapter IV provides three separate case studies in defense R&D...defense R&D projects. To this end, the first section describes the case study method and the advantages of using simple models over more complex ones...the analysis lacked empirical data and relied on subjective data, the analysis successfully combined the DTA approach with the case study method and
NASA Astrophysics Data System (ADS)
Misawa, Tsuyoshi; Takahashi, Yoshiyuki; Yagi, Takahiro; Pyeon, Cheol Ho; Kimura, Masaharu; Masuda, Kai; Ohgaki, Hideaki
2015-10-01
For the detection of hidden special nuclear materials (SNMs), we have developed an active neutron-based interrogation system combining a D-D fusion pulsed neutron source and a neutron detection system. In the detection scheme, we have adopted two new measurement techniques simultaneously: neutron noise analysis and neutron energy spectrum analysis. The validity of the neutron noise analysis method has been experimentally studied in the Kyoto University Critical Assembly (KUCA) and was applied to a cargo container inspection system by simulation.
Coding Classroom Interactions for Collective and Individual Engagement
ERIC Educational Resources Information Center
Ryu, Suna; Lombardi, Doug
2015-01-01
This article characterizes "engagement in science learning" from a sociocultural perspective and offers a mixed method approach to measuring engagement that combines critical discourse analysis (CDA) and social network analysis (SNA). Conceptualizing engagement from a sociocultural perspective, the article discusses the advantages of a…
Methods for Estimating the Uncertainty in Emergy Table-Form Models
Emergy studies have suffered criticism due to the lack of uncertainty analysis and this shortcoming may have directly hindered the wider application and acceptance of this methodology. Recently, to fill this gap, the sources of uncertainty in emergy analysis were described and an...
Superintendent Leadership Style: A Gendered Discourse Analysis
ERIC Educational Resources Information Center
Wallin, Dawn C.; Crippen, Carolyn
2007-01-01
Using a blend of social constructionism, critical feminism, and dialogue theory, the discourse of nine Manitoba superintendents is examined to determine if it illustrates particular gendered assumptions regarding superintendents' leadership style. Qualitative inquiry and analysis methods were utilized to identify emerging themes, or topics of…
Manning, Joseph C; Hemingway, Pippa; Redsell, Sarah A
2014-01-01
Introduction: Life-threatening critical illness affects over a quarter of a million children and adolescents (0–18 years old) annually in the USA and the UK. Death from critical illness is rare; however, survivors and their families can be exposed to a complex array of negative physical, psychological and social problems. Currently, within the literature, there is a distinct paucity of child and adolescent survivor self-reports, thus limiting our understanding of how survivors perceive this adversity and subsequently cope and grow in the long term following their critical illness. This study aims to explore and understand the psychosocial well-being and needs of critical illness survivors 6–20 months after paediatric intensive care admission. Methods and analysis: A longitudinal, qualitative approach will provide a platform for a holistic and contextualised exploration of outcomes and mechanisms at an individual level. Up to 80 participants, including 20 childhood critical illness survivors and 60 associated family members or health professionals/teachers, will be recruited. Three interviews, 7–9 weeks apart, will be conducted with critical illness survivors, allowing for the exploration of psychosocial well-being over time. A single interview will be conducted with the other participants, enabling the exploration of contextual information and how psychosocial well-being may inter-relate between critical illness survivors and themselves. A ‘tool box’ of qualitative methods (semi-structured interviews, draw and tell, photo-elicitation, graphic-elicitation) will be used to collect data. Narrative analysis and pattern matching will be used to identify emergent themes across participants. Ethics and dissemination: This study will provide insight into and an understanding of participants’ experiences and perspectives of surviving critical illness in the long term, with specific relation to their psychosocial well-being. Multiple methods will be used to ensure that the findings are effectively disseminated to service users, clinicians, policy and academic audiences. The study has full ethical approval from the East Midlands Research Ethics Committee and has received National Health Service (NHS) governance clearance. PMID:24435896
An approximate methods approach to probabilistic structural analysis
NASA Technical Reports Server (NTRS)
Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.
1989-01-01
A probabilistic structural analysis method (PSAM) is described which makes an approximate calculation of the structural response of a system, including the associated probabilistic distributions, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The method employs the fast probability integration (FPI) algorithm of Wu and Wirsching. Typical solution strategies are illustrated by formulations for a representative critical component chosen from the Space Shuttle Main Engine (SSME) as part of a major NASA-sponsored program on PSAM. Typical results are presented to demonstrate the role of the methodology in engineering design and analysis.
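The core idea above, replacing full Monte Carlo simulation with an approximate probability calculation around a simplified response model, can be illustrated with a mean-value first-order sketch. This is a generic illustration, not the FPI algorithm of Wu and Wirsching; the response function, means, and standard deviations below are hypothetical.

```python
import math

def g(load, area):
    # toy structural response: stress in a bar (not the SSME component model)
    return load / area

mu_load, sd_load = 1000.0, 100.0   # hypothetical load statistics
mu_area, sd_area = 10.0, 0.5       # hypothetical geometry statistics

# Mean-value first-order approximation: linearize g at the mean point
eps = 1e-6
dg_dload = (g(mu_load + eps, mu_area) - g(mu_load - eps, mu_area)) / (2 * eps)
dg_darea = (g(mu_load, mu_area + eps) - g(mu_load, mu_area - eps)) / (2 * eps)
mu_z = g(mu_load, mu_area)
sd_z = math.sqrt((dg_dload * sd_load) ** 2 + (dg_darea * sd_area) ** 2)

def p_exceed(z_crit):
    # P(Z > z_crit) under the approximating normal distribution
    return 0.5 * math.erfc((z_crit - mu_z) / (sd_z * math.sqrt(2)))
```

A single linearization and a normal-CDF evaluation replace thousands of response evaluations, which is the source of the computation-time savings the abstract emphasizes.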
Nonequilibrium critical dynamics of the two-dimensional Ashkin-Teller model at the Baxter line
NASA Astrophysics Data System (ADS)
Fernandes, H. A.; da Silva, R.; Caparica, A. A.; de Felício, J. R. Drugowich
2017-04-01
We investigate the short-time universal behavior of the two-dimensional Ashkin-Teller model at the Baxter line by performing time-dependent Monte Carlo simulations. First, as preparatory results, we obtain the critical parameters by searching for the optimal power-law decay of the magnetization. Then, the dynamic critical exponents θm and θp, related to the magnetic and electric order parameters, as well as the persistence exponent θg, are estimated using heat-bath Monte Carlo simulations. In addition, we estimate the dynamic exponent z and the static critical exponents β and ν for both order parameters. We propose a refined method to estimate the static exponents that considers two different averages: an internal average over several seeds combined with an average over temporal variations in the power laws. Moreover, we also applied the bootstrapping method for a complementary analysis. Our results show that the ratio β/ν exhibits universal behavior along the critical line, corroborating the conjecture for both magnetization and polarization.
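The preparatory step described above, extracting an exponent from the power-law decay of an observable in the short-time regime, amounts to a linear fit in log-log coordinates. A minimal sketch on synthetic data (the time series and the exponent value 0.19 are illustrative, not the paper's results):

```python
import math

def fit_power_law_exponent(ts, ms):
    # least-squares slope of log(M) vs log(t), i.e. theta in M(t) ~ t**theta
    xs = [math.log(t) for t in ts]
    ys = [math.log(m) for m in ms]
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    num = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    den = sum((x - xbar) ** 2 for x in xs)
    return num / den

# synthetic short-time series with a known exponent theta = 0.19
ts = list(range(10, 200))
ms = [0.05 * t ** 0.19 for t in ts]
theta = fit_power_law_exponent(ts, ms)
```

In practice one would scan the control parameter and pick the value giving the straightest log-log decay, which is how the critical point is located in this approach.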
Multiplexing N-glycan analysis by DNA analyzer.
Feng, Hua-Tao; Li, Pingjing; Rui, Guo; Stray, James; Khan, Shaheer; Chen, Shiaw-Min; Li, Sam F Y
2017-07-01
Analysis of N-glycan structures has been gaining attention over the years due to its critical importance to biopharma-based applications and its growing role in biological research. Glycan profiling is also critical to the development of biosimilar drugs. Detailed characterization of N-glycosylation is mandatory because it is a nontemplate-driven process that significantly influences critical properties such as bio-safety and bio-activity. Comprehensively characterizing highly complex mixtures of N-glycans has been analytically challenging because of both structural complexity and time-consuming sample pretreatment procedures. CE-LIF is one of the typical techniques for N-glycan analysis due to its high separation efficiency. In this paper, a 16-capillary DNA analyzer was coupled with a magnetic-bead glycan purification method to accelerate the sample preparation procedure and therefore increase N-glycan assay throughput. Routinely, the labeling dye used for CE-LIF is 8-aminopyrene-1,3,6-trisulfonic acid, while the typical identification method involves matching migration times with database entries. Two new fluorescent dyes were used to either cross-validate and increase glycan identification precision or simplify sample preparation steps. Exoglycosidase studies were carried out using neuraminidase, galactosidase, and fucosidase to confirm the results of the three-dye cross-validation. The optimized method combines the parallel capacity of multiple-capillary separation with three labeling dyes, magnetic-bead-assisted preparation, and exoglycosidase treatment to allow rapid and accurate analysis of N-glycans. These new methods provided enough structural information to permit N-glycan structure elucidation with only one sample injection. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
ERIC Educational Resources Information Center
Baker, Paul; Gabrielatos, Costas; McEnery, Tony
2013-01-01
This article uses methods from corpus linguistics and critical discourse analysis to examine patterns of representation around the word "Muslim" in a 143 million word corpus of British newspaper articles published between 1998 and 2009. Using the analysis tool Sketch Engine, an analysis of noun collocates of "Muslim" found that the following…
A seismic analysis for masonry constructions: The different schematization methods of masonry walls
NASA Astrophysics Data System (ADS)
Olivito, Renato. S.; Codispoti, Rosamaria; Scuro, Carmelo
2017-11-01
Masonry structures are usually analyzed with structural calculation software based on the equivalent-frame method or the macro-element method. In these approaches, the masonry walls are divided into vertical elements (masonry piers) and horizontal elements (so-called spandrel elements), interconnected by rigid nodes. The aim of this work is to make a critical comparison between different schematization methods for masonry walls, underlining the structural importance of the spandrel elements. To implement the methods, two different structural calculation programs were used and an existing masonry building was examined.
Khoshnood, Zohreh; Rayyani, Masoud; Tirgari, Batool
2018-01-13
Background: Analysis of nursing theoretical works and their role in knowledge development is presented as an essential process of critical reflection. The health promotion model (HPM) focuses on helping people achieve higher levels of well-being and identifies background factors that influence health behaviors. Objectives: This paper aims to evaluate and critique the HPM using Barnum's criteria. Methods: The present study reviewed books and articles retrieved from the ProQuest, PubMed, and Blackwell databases. The method of evaluation for this model is based on Barnum's criteria for the analysis, application and evaluation of nursing theories. The criteria selected by Barnum embrace both internal and external criticism. Internal criticism deals with how theory components fit with each other (the internal construction of the theory), and external criticism deals with the way in which the theory relates to the extended world (which considers the theory in its relationships to human beings, nursing, and health). Results: The electronic database search yielded 27,717 titles and abstracts. Following removal of duplicates, 18,963 titles and abstracts were screened using the inclusion criteria and 1278 manuscripts were retrieved. Of these, 80 were specific to the HPM and 23 to the analysis of nursing theories relevant to the aim of this article. After final selection using the inclusion criteria for this review, 28 manuscripts were identified as examining the factors contributing to theory analysis. Evaluation of the health promotion theory showed that its philosophical claims and content are consistent and clear. The HPM has a logical structure and has been applied to diverse age groups from differing cultures with varying health concerns. Conclusion: Among the strategies for theory critique, Barnum's approach is structured and accurate, and considers theory in its relationship to human beings, community psychiatric nursing, and health. According to Pender, nursing assessment, diagnosis, and interventions are utilized to operationalize the HPM through practical application and research.
NASA Astrophysics Data System (ADS)
Vadnala, Sudharshan; Asthana, Saket
2018-01-01
In this study, we have investigated the magnetic behavior, magnetocaloric effect and critical exponent analysis of La0.7-xEuxSr0.3MnO3 (x = 0.0, 0.1, 0.2, 0.3) manganites synthesized through a solid state reaction route. The crystallographic data obtained from refinement of X-ray diffraction patterns reveal that the crystal structure changes from rhombohedral (for x = 0.0) to orthorhombic (for x ≥ 0.1). The average ionic radius of the A-site decreases from 1.384 Å (for x = 0.0) to 1.360 Å (for x = 0.3) with Eu3+ substitution, which in turn decreases the Mn-O-Mn bond angles. Magnetization measurements are performed in the vicinity of TC to determine the magnetocaloric effect (MCE) and the critical field behavior. The maximum magnetic entropy change ΔS_M^max (for μ0ΔH = 6 T) increases with Eu3+ substitution from 3.88 J/kg K (for x = 0.0) to 5.03 J/kg K (for x = 0.3) at the transition temperature. The critical field behavior of the compounds was analyzed using various methods, such as modified Arrott plots, the Kouvel-Fisher method and the critical isotherm, to determine the critical temperature and the critical exponents (β, γ and δ). The obtained critical exponents are in good accordance with the scaling relation. The temperature dependence of the exponent n, for different magnetic fields, is studied using the relation ΔS_M ∝ H^n; the values of n are found to obey the Curie-Weiss law for temperatures above the transition temperature. The rescaled change-in-entropy data for all compounds collapse onto the same universal curve, revealing a second order phase transition.
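The magnetic entropy change discussed above is conventionally obtained from isothermal magnetization curves via the Maxwell relation ΔS_M(T, H) = ∫₀ᴴ (∂M/∂T)_H' dH'. A minimal numerical sketch (the toy magnetization isotherms below are hypothetical, not the La0.7-xEuxSr0.3MnO3 measurements):

```python
def delta_sm(fields, m_low_T, m_high_T, dT):
    # Maxwell relation: ΔS_M = ∫ (∂M/∂T)_H dH, using a finite-difference
    # ∂M/∂T between two neighboring isotherms and trapezoidal integration
    dmdt = [(m2 - m1) / dT for m1, m2 in zip(m_low_T, m_high_T)]
    total = 0.0
    for i in range(len(fields) - 1):
        total += 0.5 * (dmdt[i] + dmdt[i + 1]) * (fields[i + 1] - fields[i])
    return total

# toy isotherms: M(T, H) = (300 - T) * 0.01 * H near a hypothetical TC
fields = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0]      # applied field, in T
m_290 = [(300 - 290) * 0.01 * h for h in fields]  # magnetization at 290 K
m_291 = [(300 - 291) * 0.01 * h for h in fields]  # magnetization at 291 K
ds = delta_sm(fields, m_290, m_291, dT=1.0)       # negative for this toy model
```

Repeating this over a grid of temperatures yields the ΔS_M(T) curves whose peak values and field dependence (ΔS_M ∝ H^n) the abstract reports.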
Environmental analysis of higher brominated diphenyl ethers and decabromodiphenyl ethane.
Kierkegaard, Amelie; Sellström, Ulla; McLachlan, Michael S
2009-01-16
Methods for environmental analysis of higher brominated diphenyl ethers (PBDEs), in particular decabromodiphenyl ether (BDE209), and the recently discovered environmental contaminant decabromodiphenyl ethane (deBDethane) are reviewed. The extensive literature on analysis of BDE209 has identified several critical issues, including contamination of the sample, degradation of the analyte during sample preparation and GC analysis, and the selection of appropriate detection methods and surrogate standards. The limited experience with the analysis of deBDethane suggests that there are many commonalities with BDE209. The experience garnered from the analysis of BDE209 over the last 15 years will greatly facilitate progress in the analysis of deBDethane.
Annalaura, Carducci; Giulia, Davini; Stefano, Ceccanti
2013-01-01
Risk analysis is widely used in the pharmaceutical industry to manage production processes, validation activities, training, and other activities. Several methods of risk analysis are available (for example, failure mode and effects analysis, fault tree analysis), and one or more should be chosen and adapted to the specific field where they will be applied. Among the available methods, hazard analysis and critical control points (HACCP) is a methodology that has been applied since the 1960s, and whose areas of application have expanded over time from food to the pharmaceutical industry. It can be easily and successfully applied to several processes because its main feature is the identification, assessment, and control of hazards. It can also be integrated with other tools, such as the fishbone diagram and flowcharting. The aim of this article is to show how HACCP can be used to manage an analytical process, to propose how to conduct the necessary steps, and to provide the data templates necessary for documentation and for following current good manufacturing practices. In the quality control process, risk analysis is a useful tool for enhancing the uniformity of technical choices and their documented rationale. Accordingly, it allows for more effective and economical laboratory management, is capable of increasing the reliability of analytical results, and enables auditors and authorities to better understand the choices that have been made. Specifically, this article shows how hazard analysis and critical control points can be used to manage bacterial endotoxins testing and other analytical processes in a formal, clear, and detailed manner.
Identification of the students' critical thinking skills through biochemistry laboratory work report
NASA Astrophysics Data System (ADS)
Anwar, Yunita Arian Sani; Senam, Laksono, Endang W.
2017-08-01
This work aims to (1) identify the critical thinking skills of students based on their ability to write laboratory work reports, and (2) analyze the implementation of biochemistry laboratory work. The method of quantitative content analysis was employed. Quantitative data took the form of critical thinking skills assessed from students' laboratory work reports, together with questionnaire data. The Hoyo rubric was used to measure critical thinking skills with 10 indicators, namely clarity, accuracy, precision, consistency, relevance, evidence, reason, depth, breadth, and fairness. The research sample consisted of 105 students (35 male, 70 female) of Mataram University who took a Biochemistry course and 2 lecturers of the Biochemistry course. The results showed that students' critical thinking skills, as reflected in their laboratory work reports, were still weak. Analysis of the questionnaire showed that three issues were the biggest problems during laboratory work: the lecturers' involvement in implementing the laboratory work, the integration of laboratory work with classroom learning, which has not been done optimally, and the use of laboratory work to train critical thinking skills, which is not yet optimal.
Development of direct-inverse 3-D methods for applied transonic aerodynamic wing design and analysis
NASA Technical Reports Server (NTRS)
Carlson, Leland A.
1989-01-01
An inverse wing design method was developed around an existing transonic wing analysis code. The original analysis code, TAWFIVE, has as its core the numerical potential flow solver, FLO30, developed by Jameson and Caughey. Features of the analysis code include a finite-volume formulation; wing and fuselage fitted, curvilinear grid mesh; and a viscous boundary layer correction that also accounts for viscous wake thickness and curvature. The development of the inverse methods as an extension of previous methods existing for design in Cartesian coordinates is presented. Results are shown for inviscid wing design cases in super-critical flow regimes. The test cases selected also demonstrate the versatility of the design method in designing an entire wing or discontinuous sections of a wing.
[Development and application of morphological analysis method in Aspergillus niger fermentation].
Tang, Wenjun; Xia, Jianye; Chu, Ju; Zhuang, Yingping; Zhang, Siliang
2015-02-01
Filamentous fungi are widely used in industrial fermentation. Particular fungal morphologies act as a critical index of a successful fermentation. To break the bottleneck of morphological analysis, we have developed a reliable method for fungal morphological analysis. With this method, we can prepare hundreds of pellet samples simultaneously and quickly obtain quantitative morphological information at large scale. This method can greatly increase the accuracy and reliability of morphological analysis results. On this basis, studies of Aspergillus niger morphology under different oxygen supply and shear rate conditions were carried out. The morphological response patterns of A. niger to these conditions were quantitatively demonstrated, which laid a solid foundation for further scale-up.
Designing pinhole vacancies in graphene towards functionalization: Effects on critical buckling load
NASA Astrophysics Data System (ADS)
Georgantzinos, S. K.; Markolefas, S.; Giannopoulos, G. I.; Katsareas, D. E.; Anifantis, N. K.
2017-03-01
The effect of size and placement of pinhole-type atom vacancies on Euler's critical load on free-standing, monolayer graphene, is investigated. The graphene is modeled by a structural spring-based finite element approach, in which every interatomic interaction is approached as a linear spring. The geometry of graphene and the pinhole size lead to the assembly of the stiffness matrix of the nanostructure. Definition of the boundary conditions of the problem leads to the solution of the eigenvalue problem and consequently to the critical buckling load. Comparison to results found in the literature illustrates the validity and accuracy of the proposed method. Parametric analysis regarding the placement and size of the pinhole-type vacancy, as well as the graphene geometry, depicts the effects on critical buckling load. Non-linear regression analysis leads to empirical-analytical equations for predicting the buckling behavior of graphene, with engineered pinhole-type atom vacancies.
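For orientation, the quantity the eigenvalue analysis above produces is Euler's critical load, which in the classical column limit follows P_cr = π²EI/(KL)²; a vacancy-weakened sheet can be pictured as having a reduced effective bending rigidity. A minimal sketch (the rigidity values and the 15% reduction factor are placeholders, not graphene parameters from the paper):

```python
import math

def euler_critical_load(EI, L, K=1.0):
    # Euler's critical buckling load for an equivalent column / plate strip;
    # EI is the bending rigidity, L the length, K the effective-length factor
    return math.pi ** 2 * EI / (K * L) ** 2

# pristine sheet vs. a vacancy-weakened one (placeholder reduction factor)
p_pristine = euler_critical_load(EI=2.0e-19, L=1.0e-8)
p_damaged = euler_critical_load(EI=2.0e-19 * 0.85, L=1.0e-8)
```

The finite element route in the paper generalizes this: instead of a closed form, the critical load emerges as the smallest eigenvalue of the assembled stiffness problem, which is what makes vacancy placement (not just vacancy count) matter.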
ERIC Educational Resources Information Center
Butler, Brandon M.; Suh, Yonghee; Scott, Wendy
2015-01-01
In this article, the authors investigate the extent to which 9 elementary social studies methods textbooks present the purpose of teaching and learning social studies. Using Stanley's three perspectives of teaching social studies for knowledge transmission, method of intelligence, and social transformation; we analyze how these texts prepare…
a New Method for Fmeca Based on Fuzzy Theory and Expert System
NASA Astrophysics Data System (ADS)
Byeon, Yoong-Tae; Kim, Dong-Jin; Kim, Jin-O.
2008-10-01
Failure Mode, Effects and Criticality Analysis (FMECA) is one of the most widely used methods in modern engineering to investigate potential failure modes and their severity for a system. FMECA evaluates the criticality and severity of each failure mode and visualizes the risk level in a matrix with those indices as the column and row variables, respectively. Generally, these indices are determined subjectively by experts and operators, so the process inevitably involves uncertainty. In this paper, a method for eliciting expert opinions that accounts for this uncertainty is proposed to evaluate criticality and severity. In addition, a fuzzy expert system is constructed to determine a crisp value of risk level for each failure mode. Finally, an illustrative example system is analyzed in a case study. The results are worth considering when deciding the proper policies for each component of the system.
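The crisp risk index that FMECA conventionally assigns, and a simple centroid defuzzification of the kind a fuzzy expert system might end with, can be sketched as follows. The 1-10 scales, the triangular membership function, and all numbers are illustrative assumptions, not the authors' system.

```python
def risk_priority_number(occurrence, severity, detectability):
    # classical crisp FMECA risk index; each factor on a 1-10 scale here
    return occurrence * severity * detectability

def defuzzify_centroid(points):
    # centroid defuzzification of a sampled membership function;
    # points is a list of (x, mu(x)) pairs
    num = sum(x * mu for x, mu in points)
    den = sum(mu for _, mu in points)
    return num / den

# triangular membership for a "high risk" judgment on a 0-1 risk scale
# (shape and numbers are illustrative assumptions)
tri = [(x / 10.0, max(0.0, 1.0 - abs(x / 10.0 - 0.7) / 0.2))
       for x in range(0, 11)]
crisp_risk = defuzzify_centroid(tri)
```

The point of the fuzzy step is that an expert can state a graded judgment ("around 0.7, give or take") instead of a single number, and the defuzzification still yields one crisp risk level per failure mode for the risk matrix.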
Using root cause analysis to promote critical thinking in final year Bachelor of Midwifery students.
Carter, Amanda G; Sidebotham, Mary; Creedy, Debra K; Fenwick, Jennifer; Gamble, Jenny
2014-06-01
Midwives require well developed critical thinking to practice autonomously. However, multiple factors impinge on students' deep learning in the clinical context. Analysis of actual case scenarios using root cause analysis may foster students' critical thinking and the application of 'best practice' principles in complex clinical situations. The aim was to examine the effectiveness of an innovative teaching strategy involving root cause analysis in developing students' perceptions of their critical thinking abilities. A descriptive, mixed methods design was used. Final-year (3rd year) undergraduate midwifery students (n=22) worked in teams to complete and present an assessment item based on root cause analysis. The cases were adapted from coroners' reports. After graduation, 17 (77%) students evaluated the course using a standard university assessment tool. In addition, 12 (54%) students provided specific feedback on the teaching strategy using a 16-item survey tool based on the domain concepts of Educational Acceptability, Educational Impact, and Preparation for Practice. Survey responses were on a 5-point Likert scale and analysed using descriptive statistics. Open-ended responses were analysed using content analysis. The majority of students perceived the course and this teaching strategy positively. The domain mean scores were high for Educational Acceptability (mean=4.3, SD=.49) and Educational Impact (mean=4.19, SD=.75) but slightly lower for Preparation for Practice (mean=3.7, SD=.77). Overall, student responses to each item were positive, with no item mean less than 3.42. Students found the root cause analysis challenging and time consuming but reported development of critical thinking skills about the complexity of practice, clinical governance and risk management principles. Analysing complex real-life clinical cases to determine a root cause enhanced midwifery students' perceptions of their critical thinking.
Teaching and assessment strategies to promote critical thinking need to be made explicit to students in order to foster ongoing development. © 2013.
Fountoulakis, Konstantinos N; Young, Allan; Yatham, Lakshmi; Grunze, Heinz; Vieta, Eduard; Blier, Pierre; Moeller, Hans Jurgen; Kasper, Siegfried
2017-02-01
This paper includes a short description of the important clinical aspects of Bipolar Disorder with emphasis on issues that are important for the therapeutic considerations, including mixed and psychotic features, predominant polarity, and rapid cycling as well as comorbidity. The workgroup performed a review and critical analysis of the literature concerning grading methods and methods for the development of guidelines. The workgroup arrived at a consensus to base the development of the guideline on randomized controlled trials and related meta-analyses alone in order to follow a strict evidence-based approach. A critical analysis of the existing methods for the grading of treatment options was followed by the development of a new grading method to arrive at efficacy and recommendation levels after the analysis of 32 distinct scenarios of available data for a given treatment option. The current paper reports details on the design, method, and process for the development of CINP guidelines for the treatment of Bipolar Disorder. The rationale and the method with which all data and opinions are combined in order to produce an evidence-based operationalized but also user-friendly guideline and a specific algorithm are described in detail in this paper. © The Author 2016. Published by Oxford University Press on behalf of CINP.
Isothermal Titration Calorimetry Can Provide Critical Thinking Opportunities
ERIC Educational Resources Information Center
Moore, Dale E.; Goode, David R.; Seney, Caryn S.; Boatwright, Jennifer M.
2016-01-01
College chemistry faculties might not have considered including isothermal titration calorimetry (ITC) in their majors' curriculum because experimental data from this instrumental method are often analyzed via automation (software). However, the software-based data analysis can be replaced with a spreadsheet-based analysis that is readily…
NASA Astrophysics Data System (ADS)
Citraresmi, A. D. P.; Wahyuni, E. E.
2018-03-01
The aim of this study was to examine the implementation of Hazard Analysis and Critical Control Point (HACCP) for the identification and prevention of potential hazards in the production process of dried anchovy at PT. Kelola Mina Laut (KML), Lobuk unit, Sumenep. Cold storage is needed at each anchovy processing step in order to maintain the product's physical and chemical condition. In addition, a quality assurance system should be implemented to maintain product quality. The research was conducted using a survey method, following the whole process of making anchovy from the receipt of raw materials to the packaging of the final product. The data were analyzed using a descriptive method. Implementation of HACCP at PT. KML, Lobuk unit, Sumenep was conducted by applying Pre-Requisite Programs (PRP) and a preparation stage consisting of 5 initial stages and the 7 principles of HACCP. The results showed that CCPs were found in the boiling step, with the significant hazard of Listeria monocytogenes, and in the final sorting step, with the significant hazard of foreign-material contamination in the product. Actions taken were controlling the boiling temperature at 100 – 105°C for 3 – 5 minutes and training the sorting process employees.
Improved first-order uncertainty method for water-quality modeling
Melching, C.S.; Anmangandla, S.
1992-01-01
Uncertainties are unavoidable in water-quality modeling and the subsequent management decisions. Monte Carlo simulation and first-order uncertainty analysis (involving linearization at central values of the uncertain variables) have frequently been used to estimate probability distributions for water-quality model output because of their simplicity. Each method has its drawbacks: for Monte Carlo simulation, mainly computational time; for first-order analysis, mainly accuracy and representativeness, especially for nonlinear systems and extreme conditions. An improved (advanced) first-order method is presented in which the linearization point varies to match the output level whose exceedance probability is sought. The advanced first-order method is tested on the Streeter-Phelps equation to estimate the probability distributions of the critical dissolved-oxygen deficit and critical dissolved oxygen, using two hypothetical examples from the literature. The advanced first-order method provides a close approximation of the exceedance probabilities estimated by Monte Carlo simulation while using less computer time, by two orders of magnitude, regardless of the probability distributions assumed for the uncertain model parameters.
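The quantity whose exceedance probability is analyzed above, the Streeter-Phelps critical dissolved-oxygen deficit, together with a brute-force Monte Carlo baseline, can be sketched as follows. The rate constants, their assumed distributions, and the 5.0 mg/L threshold are hypothetical, not the paper's examples.

```python
import math
import random

def deficit(t, kd, ka, L0, D0=0.0):
    # Streeter-Phelps dissolved-oxygen deficit at travel time t
    return (kd * L0 / (ka - kd)) * (math.exp(-kd * t) - math.exp(-ka * t)) \
        + D0 * math.exp(-ka * t)

def critical_deficit(kd, ka, L0, D0=0.0):
    # deficit at the critical time tc, where dD/dt = 0
    tc = math.log((ka / kd) * (1 - D0 * (ka - kd) / (kd * L0))) / (ka - kd)
    return deficit(tc, kd, ka, L0, D0)

# crude Monte Carlo baseline over uncertain kd and L0
# (hypothetical rate constants, statistics, and threshold)
random.seed(0)
samples = [critical_deficit(random.gauss(0.3, 0.03), 0.7,
                            random.gauss(20.0, 2.0))
           for _ in range(2000)]
p_exceed = sum(d > 5.0 for d in samples) / len(samples)
```

The advanced first-order method in the abstract targets this same exceedance probability but replaces the thousands of samples with one linearization per output level, which is where the two-orders-of-magnitude time savings come from.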
Tumpa, Anja; Stajić, Ana; Jančić-Stojanović, Biljana; Medenica, Mirjana
2017-02-05
This paper deals with the development of a hydrophilic interaction liquid chromatography (HILIC) method with gradient elution in accordance with Analytical Quality by Design (AQbD) methodology, for the first time. The method is developed for olanzapine and its seven related substances. Following the AQbD methodology step by step, the temperature, the starting content of the aqueous phase and the duration of the linear gradient are first identified as critical process parameters (CPPs), and the separation criterion S for critical pairs of substances is investigated as the critical quality attribute (CQA). A Rechtschaffen design is used to create models that describe the dependence between the CPPs and CQAs. The design space obtained at the end is used to choose the optimal conditions (set point). Finally, the method is fully validated to verify the adequacy of the chosen optimal conditions and applied to real samples. Copyright © 2016 Elsevier B.V. All rights reserved.
Applying formal methods and object-oriented analysis to existing flight software
NASA Technical Reports Server (NTRS)
Cheng, Betty H. C.; Auernheimer, Brent
1993-01-01
Correctness is paramount for safety-critical software control systems. Critical software failures in medical radiation treatment, communications, and defense are familiar to the public. The significant quantity of software malfunctions regularly reported to the software engineering community, the laws concerning liability, and a recent NRC Aeronautics and Space Engineering Board report additionally motivate the use of error-reducing and defect-detection software development techniques. The benefits of formal methods in requirements-driven software development ('forward engineering') are well documented. One advantage of rigorously engineering software is that formal notations are precise, verifiable, and facilitate automated processing. This paper describes the application of formal methods to reverse engineering, in which formal specifications are developed for a portion of the shuttle on-orbit digital autopilot (DAP). Three objectives of the project were to: demonstrate the use of formal methods on a shuttle application, facilitate the incorporation and validation of new requirements for the system, and verify the safety-critical properties to be exhibited by the software.
Flux pinning enhancement in thin films of Y3Ba5Cu8O18.5+δ
NASA Astrophysics Data System (ADS)
Aghabagheri, S.; Mohammadizadeh, M. R.; Kameli, P.; Salamati, H.
2018-06-01
YBa2Cu3O7 (Y123) and Y3Ba5Cu8O18 (Y358) thin films were deposited by the pulsed laser deposition method. XRD analysis shows that both films grow with c-axis orientation. Resistivity versus temperature measurements show that the superconducting transition temperatures were about 91.2 K and 91.5 K, and the transition widths about 0.6 K and 1.6 K, for the Y358 and Y123 films, respectively. Analysis of the temperature dependence of the AC susceptibility near the transition temperature, employing Bean's critical state model, indicates that the intergranular critical current density of the Y358 films is more than twice that of the Y123 films; thus, flux pinning is stronger in the Y358 films. The weak links in both samples are of the superconductor-normal-superconductor (SNS) type, irrespective of stoichiometry.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Victoria; Kishan, Amar U.; Cao, Minsong
2014-03-15
Purpose: To demonstrate a new method of evaluating the dose response of treatment-induced lung radiographic injury after SBRT (stereotactic body radiotherapy) and the discovery of bimodal dose behavior within clinically identified injury volumes. Methods: Follow-up CT scans at 3, 6, and 12 months were acquired from 24 patients treated with SBRT for stage-1 primary lung cancers or oligometastatic lesions. Injury regions in these scans were propagated to the planning CT coordinates by performing deformable registration of the follow-ups to the planning CTs. A bimodal behavior was repeatedly observed in the probability distribution of dose values within the deformed injury regions. Based on a mixture-Gaussian assumption, an Expectation-Maximization (EM) algorithm was used to obtain characteristic parameters for this distribution. Geometric analysis was performed to interpret these parameters and infer the critical dose level that is potentially inductive of post-SBRT lung injury. Results: The Gaussian mixture obtained from the EM algorithm closely approximates the empirical dose histogram within the injury volume with good consistency. The average Kullback-Leibler divergence values between the empirical differential dose volume histogram and the EM-obtained Gaussian mixture distribution were calculated to be 0.069, 0.063, and 0.092 for the 3, 6, and 12 month follow-up groups, respectively. The lower Gaussian component was located at approximately 70% of the prescription dose (35 Gy) for all three follow-up time points. The higher Gaussian component, contributed by the dose received by the planning target volume, was located at around 107% of the prescription dose. Geometrical analysis suggests the mean of the lower Gaussian component, located at 35 Gy, as a possible indicator of a critical dose that induces lung injury after SBRT. Conclusions: An innovative and improved method for analyzing the correspondence between lung radiographic injury and SBRT treatment dose has been demonstrated. Bimodal behavior was observed in the dose distribution of lung injury after SBRT. Novel statistical and geometrical analysis has shown that the systematically quantified low-dose peak at approximately 35 Gy, or 70% of the prescription dose, is a good indication of a critical dose for injury. The determined critical dose of 35 Gy resembles the critical dose volume limit of 30 Gy for the ipsilateral bronchus in RTOG 0618 and results from previous studies. The authors seek to further extend this improved analysis method to a larger cohort to better understand the interpatient variation in radiographic lung injury dose response post-SBRT.
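The core numerical step described above, fitting a two-component Gaussian mixture to the dose values inside an injury region with EM, can be sketched on synthetic data. This is a generic textbook EM, not the authors' implementation, and the 35 Gy / 53 Gy modes and sample counts are invented to mimic the reported bimodal shape.

```python
import math
import random

def em_two_gaussians(xs, iters=200):
    # Plain-vanilla EM for a two-component 1D Gaussian mixture
    # (an illustrative textbook version, not the authors' implementation)
    mu1, mu2 = min(xs), max(xs)            # spread-out initial means
    s1 = s2 = (max(xs) - min(xs)) / 4.0
    w = 0.5                                # mixing weight of component 1
    for _ in range(iters):
        # E-step: responsibility of component 1 for each sample
        r = []
        for x in xs:
            p1 = w * math.exp(-(x - mu1) ** 2 / (2 * s1 ** 2)) / s1
            p2 = (1 - w) * math.exp(-(x - mu2) ** 2 / (2 * s2 ** 2)) / s2
            r.append(p1 / (p1 + p2))
        # M-step: update means, standard deviations, and weight
        n1 = sum(r)
        n2 = len(xs) - n1
        mu1 = sum(ri * x for ri, x in zip(r, xs)) / n1
        mu2 = sum((1 - ri) * x for ri, x in zip(r, xs)) / n2
        s1 = math.sqrt(sum(ri * (x - mu1) ** 2 for ri, x in zip(r, xs)) / n1)
        s2 = math.sqrt(sum((1 - ri) * (x - mu2) ** 2
                           for ri, x in zip(r, xs)) / n2)
        w = n1 / len(xs)
    return (mu1, s1), (mu2, s2), w

random.seed(1)
# synthetic "dose" samples: a low mode near 35 Gy and a high mode near 53 Gy
# (these mimic the reported 70% / 107% peaks; they are not patient data)
xs = [random.gauss(35.0, 3.0) for _ in range(300)] + \
     [random.gauss(53.0, 2.0) for _ in range(300)]
(lo_mean, lo_sd), (hi_mean, hi_sd), weight = em_two_gaussians(xs)
```

The mean of the recovered lower component plays the role of the candidate critical dose in the analysis above; in practice a production fit would use a library implementation with convergence checks rather than a fixed iteration count.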
Using mixed methods in health research
Woodman, Jenny
2013-01-01
Summary Mixed methods research is the use of quantitative and qualitative methods in a single study or series of studies. It is an emergent methodology which is increasingly used by health researchers, especially within health services research. There is a growing literature on the theory, design and critical appraisal of mixed methods research. However, there are few papers that summarize this methodological approach for health practitioners who wish to conduct or critically engage with mixed methods studies. The objective of this paper is to provide an accessible introduction to mixed methods for clinicians and researchers unfamiliar with this approach. We present a synthesis of key methodological literature on mixed methods research, with examples from our own work and that of others, to illustrate the practical applications of this approach within health research. We summarize definitions of mixed methods research, the value of this approach, key aspects of study design and analysis, and discuss the potential challenges of combining quantitative and qualitative methods and data. One of the key challenges within mixed methods research is the successful integration of quantitative and qualitative data during analysis and interpretation. However, the integration of different types of data can generate insights into a research question, resulting in enriched understanding of complex health research problems. PMID:23885291
Liang, Sai; Qu, Shen; Xu, Ming
2016-02-02
To develop industry-specific policies for mitigating environmental pressures, previous studies primarily focus on identifying sectors that directly generate large amounts of environmental pressures (a.k.a. production-based method) or indirectly drive large amounts of environmental pressures through supply chains (e.g., consumption-based method). In addition to those sectors that are important environmental pressure producers or drivers, there exist sectors that are also important to environmental pressure mitigation as transmission centers. Economy-wide environmental pressure mitigation might be achieved by improving the production efficiency of these key transmission sectors, that is, using fewer upstream inputs to produce a unit of output. We develop a betweenness-based method to measure the importance of transmission sectors, borrowing the betweenness concept from network analysis. We quantify the betweenness of sectors by examining supply chain paths, extracted from structural path analysis, that pass through a particular sector. Taking China as an example, we find that the critical transmission sectors identified by the betweenness-based method are not always identifiable by existing methods. This indicates that the betweenness-based method can provide additional insights, unobtainable with existing methods, into the roles individual sectors play in generating economy-wide environmental pressures. The betweenness-based method proposed here can therefore complement existing methods for guiding sector-level environmental pressure mitigation strategies.
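The transmission-center idea behind this abstract can be sketched computationally: enumerate weighted supply-chain paths from a small direct-requirements matrix and credit each path's weight to the sectors it passes through as intermediates. The 3-sector matrix, final demand, and path-length cutoff below are invented toy numbers, not data from the study; the weighting loosely follows the structural-path-analysis expansion the authors describe.

```python
import numpy as np
from itertools import product

# Toy 3-sector direct-requirements matrix: A[i, j] is input from sector i
# needed per unit output of sector j. Values are illustrative only.
A = np.array([[0.1, 0.3, 0.0],
              [0.2, 0.0, 0.4],
              [0.0, 0.2, 0.1]])
f = np.array([100.0, 50.0, 80.0])  # final demand per sector (toy values)

def sector_betweenness(A, f, max_len=4):
    """Sum, for each sector, the weight of all supply-chain paths (as in
    structural path analysis) on which it appears as an intermediate node."""
    n = A.shape[0]
    score = np.zeros(n)
    for length in range(2, max_len + 1):     # paths with >= 1 intermediate
        for path in product(range(n), repeat=length + 1):
            w = f[path[-1]]                   # weight ends at final demand
            for i, j in zip(path[:-1], path[1:]):
                w *= A[i, j]
            for mid in set(path[1:-1]):       # credit intermediates only
                score[mid] += w
    return score

scores = sector_betweenness(A, f)
```

The sector with the largest score acts as the strongest "transmission center" in this toy economy; a production-based or consumption-based ranking would not necessarily surface it.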
Hydrazine reagents as derivatizing agents in environmental analysis--a critical review.
Vogel, M; Büldt, A; Karst, U
2000-04-01
Hydrazine reagents are a well-known group of derivatizing agents for the determination of aldehydes and ketones in liquid and gaseous samples. Within this article, the most important hydrazine reagents are critically summarized, and their major applications in different fields, including environmental analysis, food chemistry and industrial analysis are introduced. As 2,4-dinitrophenylhydrazine (DNPH) is the basic reagent for several international standard procedures, its properties are discussed in detail. Particular focus is directed on the chemistry of the hydrazine reagents, and chemical interferences are considered. Recent methods for the determination of various oxidants using hydrazine reagents are presented as well. Due to limited space, this review does not cover the related field of carbohydrate analysis, although many chemical aspects are similar.
Stingray Failure Mode, Effects and Criticality Analysis: WEC Risk Registers
Ken Rhinefrank
2016-07-25
An analysis method to systematically identify all potential failure modes and their effects on the Stingray WEC system. This analysis is incorporated early in the development cycle so that the identified failure modes can be mitigated cost-effectively and efficiently. The FMECA can begin once there is enough detail to define the functions and failure modes of a given system and its interfaces with other systems. The FMECA occurs concurrently with the design process and is an iterative process, allowing design changes to overcome deficiencies identified in the analysis. Risk Registers for major subsystems were completed according to the methodology described in the "Failure Mode Effects and Criticality Analysis Risk Reduction Program Plan.pdf" document below, in compliance with the DOE Risk Management Framework developed by NREL.
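The ranking that drives an FMECA risk register can be illustrated with the classic risk priority number, the product of occurrence, severity, and detectability scores. The failure modes and 1-10 scores below are hypothetical placeholders, not entries from the Stingray registers.

```python
# Hypothetical failure modes for a wave energy converter subsystem.
# Scores use the common 1-10 FMECA scales; all entries are illustrative.
failure_modes = [
    {"mode": "mooring line fatigue", "occurrence": 4, "severity": 9, "detectability": 6},
    {"mode": "PTO seal leak",        "occurrence": 6, "severity": 5, "detectability": 3},
    {"mode": "hull weld crack",      "occurrence": 2, "severity": 8, "detectability": 7},
]

def risk_priority_number(fm):
    # Classic FMECA risk index: higher means mitigate earlier.
    return fm["occurrence"] * fm["severity"] * fm["detectability"]

# Rank failure modes for the risk register, highest risk first.
ranked = sorted(failure_modes, key=risk_priority_number, reverse=True)
```

Here "mooring line fatigue" (4 × 9 × 6 = 216) tops the register even though it occurs less often than the seal leak, because its severity and poor detectability dominate.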
Vachon, Brigitte; LeBlanc, Jeannette
2011-09-01
Critical incident analysis (CIA) is one of the strategies frequently used to facilitate reflective learning. It involves the thorough description and analysis of an authentic and experienced event within its specific context. However, CIA has also been described as having the potential to expose vulnerabilities, threaten learners' coping mechanisms and increase rather than reduce their anxiety levels. The aim of this study was to compare the analysis of current critical incidents with that of past critical incidents, and to further explore why and how the former is more conducive to reflective learning and practice change than the latter. A collaborative research study was conducted. Eight occupational therapists were recruited to participate in a reflective learning group that convened for 12 meetings held over a 15-month period. The group facilitator planned and adapted the learning strategies to be used to promote reflective learning and guided the group process. Critical incident analysis represented the main activity carried out in the group discussions. The data collected were analysed using the grounded theory method. Three phenomena were found to differentiate between the learning contexts created by the analysis of, respectively, past and current critical incidents: attitudinal disposition; legitimacy of purpose, and the availability of opportunities for experimentation. Analysis of current clinical events was found to improve participants' motivation to self-evaluate, to increase their self-efficacy, and to help them transfer learning into action and to progressively self-regulate. The results of this collaborative research study suggest that the analysis of current clinical events in order to promote reflection offers a safer and more constructive learning environment than does the analysis of incidents that have occurred in the past. This learning strategy is directly grounded in health professional practice. 
The remaining challenge for continuing education providers is that of creating conditions conducive to its use. © Blackwell Publishing Ltd 2011.
Witnessing Deconstruction in Education: Why Quasi-Transcendentalism Matters
ERIC Educational Resources Information Center
Biesta, Gert
2009-01-01
Deconstruction is often depicted as a method of critical analysis aimed at exposing unquestioned metaphysical assumptions and internal contradictions in philosophical and literary language. Starting from Derrida's contention that deconstruction is not a method and cannot be transformed into one, I make a case for a different attitude towards…
Genealogy and Educational Research
ERIC Educational Resources Information Center
Christensen, Gerd
2016-01-01
The aim of this paper was to demonstrate how genealogy can be used as a method for critical education research. As Foucault emphasized, genealogy is a method for identifying the way in which individuals are subjectified through discourse. The genealogical analysis in the article identifies two major tendencies in contemporary Danish pedagogy:…
USDA-ARS?s Scientific Manuscript database
The increasing availability of genomic data and the sophistication of analytical methodology in fungi have elevated the need for functional genomics tools in these organisms. Gene deletion is a critical tool for functional analysis. The targeted deletion of genes requires both a suitable method for the trans...
Acoustic Analysis of Chinese Fricatives and Affricates.
ERIC Educational Resources Information Center
Svantesson, Jan-Olof
1986-01-01
Develops a method of analyzing and describing the acoustic properties of fricatives, which consists of making frequency spectra using the Fast Fourier Transform and then analyzing the spectra in terms of critical bands. The six fricatives of Chinese are analyzed by this method, and comparison with other languages is made. (SED)
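The two-step procedure this abstract describes (an FFT spectrum, then analysis in critical bands) can be sketched as follows. The synthetic noise signal, window, and sample rate are illustrative assumptions; the band mapping uses Traunmüller's approximation of the Bark critical-band scale.

```python
import numpy as np

fs = 16000                       # sample rate (Hz), an illustrative choice
t = np.arange(0, 0.05, 1 / fs)   # 50 ms analysis frame
rng = np.random.default_rng(0)
x = rng.standard_normal(t.size)  # white-noise stand-in for a fricative segment

# Step 1: FFT power spectrum of the windowed frame
spec = np.abs(np.fft.rfft(x * np.hanning(x.size))) ** 2
freqs = np.fft.rfftfreq(x.size, 1 / fs)

def hz_to_bark(f):
    # Traunmüller's approximation of the Bark (critical-band) scale
    return 26.81 * f / (1960.0 + f) - 0.53

# Step 2: pool spectral energy into critical bands
bark = hz_to_bark(freqs)
n_bands = int(np.ceil(bark.max()))
band_energy = np.zeros(n_bands)
for b, p in zip(bark, spec):
    band_energy[min(int(max(b, 0.0)), n_bands - 1)] += p
```

The resulting `band_energy` profile is the kind of compact spectral description on which fricatives from different languages can be compared.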
Good, Bad or Absent: Discourses of Parents with Disabilities in Australian News Media
ERIC Educational Resources Information Center
Fraser, Vikki; Llewellyn, Gwynnyth
2015-01-01
Background: News media frames public perceptions. As such, news media becomes a useful source of analysis to understand the presence (or otherwise) of people with disabilities, particularly intellectual disabilities, within parenting discourses in Australia. Method: Using Critical Discourse Analysis, this article examines major Australian…
25 Years of Self-organized Criticality: Numerical Detection Methods
NASA Astrophysics Data System (ADS)
McAteer, R. T. James; Aschwanden, Markus J.; Dimitropoulou, Michaila; Georgoulis, Manolis K.; Pruessner, Gunnar; Morales, Laura; Ireland, Jack; Abramenko, Valentyna
2016-01-01
The detection and characterization of self-organized criticality (SOC), in both real and simulated data, has undergone many significant revisions over the past 25 years. The explosive advances in the many numerical methods available for detecting, discriminating, and ultimately testing SOC have played a critical role in developing our understanding of how systems experience and exhibit SOC. In this article, methods of detecting SOC are reviewed, from correlations to complexity to critical quantities. A description of the basic autocorrelation method leads into a detailed analysis of application-oriented methods developed in the last 25 years. In the second half of this manuscript, space-based, time-based and spatio-temporal methods are reviewed and the prevalence of power laws in nature is described, with an emphasis on event detection and characterization. The search for numerical methods to clearly and unambiguously detect SOC in data often leads us outside the comfort zone of our own disciplines; the answers to these questions are often obtained by studying the advances made in other fields of study. In addition, numerical detection methods often provide the optimum link between simulations and experiments in scientific research. We seek to explore this boundary where the rubber meets the road, to review this expanding field of research on the numerical detection of SOC systems over the past 25 years, and to iterate forwards so as to provide some foresight and guidance into developing breakthroughs in this subject over the next quarter of a century.
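A standard first step in testing data for SOC-like statistics is fitting a power-law exponent to event sizes by maximum likelihood rather than by least squares on binned log-log data. The sketch below, with synthetic avalanche sizes, uses the continuous maximum-likelihood estimator popularized by Clauset, Shalizi and Newman; it is a generic illustration, not code from the review.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic avalanche sizes drawn from p(s) ~ s^(-alpha), s >= s_min,
# via inverse-CDF sampling: s = s_min * (1 - u)^(-1/(alpha - 1)).
alpha_true, s_min, n = 2.5, 1.0, 20000
u = rng.random(n)
sizes = s_min * (1 - u) ** (-1 / (alpha_true - 1))

def powerlaw_mle(sizes, s_min):
    """Continuous maximum-likelihood estimate of a power-law exponent:
    alpha_hat = 1 + n / sum(ln(s_i / s_min))."""
    s = sizes[sizes >= s_min]
    return 1.0 + s.size / np.log(s / s_min).sum()

alpha_hat = powerlaw_mle(sizes, s_min)
```

On real event catalogs the choice of `s_min` itself must be estimated, which is where much of the methodological subtlety reviewed in the article lies.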
Kwasniok, Frank
2013-11-01
A time series analysis method for predicting the probability density of a dynamical system is proposed. A nonstationary parametric model of the probability density is estimated from data within a maximum likelihood framework and then extrapolated to forecast the future probability density and explore the system for critical transitions or tipping points. Parameter uncertainty is fully and systematically accounted for. The technique is generic and independent of the underlying dynamics of the system. The method is verified on simulated data and then applied to prediction of Arctic sea-ice extent.
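For the simplest nonstationary parametric model, a Gaussian whose mean drifts linearly in time, the maximum likelihood fit reduces to least squares and the fitted density can then be extrapolated to a future time, in the spirit of the approach described. The synthetic series and drift rate below are illustrative assumptions, not the paper's model or data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic nonstationary series: Gaussian with linearly drifting mean
# mu(t) = a + b*t and constant standard deviation.
t = np.arange(200.0)
x = 2.0 + 0.01 * t + 0.5 * rng.standard_normal(t.size)

# For this model the MLE of (a, b) is ordinary least squares, and the
# MLE of sigma is the RMS residual.
A = np.vstack([np.ones_like(t), t]).T
(a, b), *_ = np.linalg.lstsq(A, x, rcond=None)
sigma = np.sqrt(np.mean((x - (a + b * t)) ** 2))

# Extrapolate the fitted density to a future time.
t_future = 300.0
mu_future = a + b * t_future   # forecast density: N(mu_future, sigma^2)
```

A richer model (drifting variance, non-Gaussian shape) would flag an approaching tipping point as a widening or skewing of the forecast density; the full method also propagates the uncertainty in (a, b, sigma) into the forecast.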
Understanding critical health literacy: a concept analysis.
Sykes, Susie; Wills, Jane; Rowlands, Gillian; Popple, Keith
2013-02-18
Interest in and debates around health literacy have grown over the last two decades and key to the discussions has been the distinction made between basic functional health literacy, communicative/interactive health literacy and critical health literacy. Of these, critical health literacy is the least well developed and differing interpretations of its constituents and relevance exist. The aim of this study is to rigorously analyse the concept of critical health literacy in order to offer some clarity of definition upon which appropriate theory, well grounded practice and potential measurement tools can be based. The study uses a theoretical and colloquial evolutionary concept analysis method to systematically identify the features associated with this concept. A unique characteristic of this method is that it practically combines an analysis of the literature with in depth interviews undertaken with practitioners and policy makers who have an interest in the field. The study also analyses how the concept is understood across the contexts of time, place, discipline and use by health professionals, policy makers and academics. Findings revealed a distinct set of characteristics of advanced personal skills, health knowledge, information skills, effective interaction between service providers and users, informed decision making and empowerment including political action as key features of critical health literacy. The potential consequences of critical health literacy identified are in improving health outcomes, creating more effective use of health services and reducing inequalities in health thus demonstrating the relevance of this concept to public health and health promotion. While critical health literacy is shown to be a unique concept, there remain significant contextual variations in understanding particularly between academics, practitioners and policy makers. 
Key attributes presented as part of this concept when it was first introduced in the literature, particularly those around empowerment, social and political action and the existence of the concept at both an individual and population level, have been lost in more recent representations. This has resulted in critical health literacy becoming restricted to a higher order cognitive individual skill rather than a driver for political and social change. The paper argues that in order to retain the uniqueness and usefulness of the concept in practice efforts should be made to avoid this dilution of meaning.
Suggestions for presenting the results of data analyses
Anderson, David R.; Link, William A.; Johnson, Douglas H.; Burnham, Kenneth P.
2001-01-01
We give suggestions for the presentation of research results from frequentist, information-theoretic, and Bayesian analysis paradigms, followed by several general suggestions. The information-theoretic and Bayesian methods offer alternative approaches to data analysis and inference compared to traditionally used methods. Guidance is lacking on the presentation of results under these alternative procedures and on the nontesting aspects of classical frequentist methods of statistical analysis. Null hypothesis testing has come under intense criticism. We recommend less reporting of the results of statistical tests of null hypotheses in cases where the null is surely false anyway, or where the null hypothesis is of little interest to science or management.
Technical Feasibility Assessment of Lunar Base Mission Scenarios
NASA Astrophysics Data System (ADS)
Magelssen, Trygve "Spike"; Sadeh, Eligar
2005-02-01
Investigation of the literature pertaining to lunar base (LB) missions and the technologies required for LB development has revealed an information gap that hinders technical feasibility assessment. This information gap is the absence of technology readiness levels (TRL) (Mankins, 1995) and of information pertaining to the criticality of the critical enabling technologies (CETs) that enable mission success. TRL is a means of identifying the technical readiness stages of a technology. Criticality is defined as the level of influence the CET has on the mission scenario. The hypothesis of this research study is that technical feasibility is a function of technical readiness, and technical readiness is a function of criticality. A newly developed research analysis method is used to identify the technical feasibility of LB mission scenarios. A Delphi is used to ascertain technology readiness levels and CET criticality-to-mission. The research analysis method is applied to the Delphi results to determine the technical feasibility of the LB mission scenarios, which include: observatory, science research, lunar settlement, space exploration gateway, space resource utilization, and space tourism. The CETs identified encompass four major system-level technologies: transportation, life support, structures, and power systems. Results of the technical feasibility assessment show the observatory and science research LB mission scenarios to be the most technically ready of all the scenarios, but all mission scenarios are in very close proximity to each other in regard to criticality and TRL, and no one mission scenario stands out as being absolutely more technically ready than any of the others. What is significant and of value are the Delphi results concerning CET criticality-to-mission and the TRL values, presented in the tables, which can be used by anyone assessing the technical feasibility of LB missions.
A new analysis of the effects of the Asian crisis of 1997 on emergent markets
NASA Astrophysics Data System (ADS)
Mariani, M. C.; Liu, Y.
2007-07-01
This work is devoted to the study of the Asian crisis of 1997 and its consequences for emerging markets. We have done so by means of a phase transition model. We have analyzed the crashes in the leading indices of Hong Kong (HSI), Turkey (XU100), Mexico (MMX), Brazil (BOVESPA) and Argentina (MERVAL). We were able to obtain optimum values for the critical date, corresponding to the most probable date of the crash. The estimate of the critical date was excellent for every index except MERVAL; this improvement over earlier work is due to a prior analysis of the parameters involved. We used only data from before the true crash date in order to obtain the predicted critical date. This article's conclusions are largely obtained via ad hoc empirical methods.
NASA Astrophysics Data System (ADS)
Kamiński, M.; Supeł, Ł.
2016-02-01
It is widely known that lateral-torsional buckling of a member under bending, and the warping restraints of its cross-sections, are crucial in steel structures for estimating their safety and durability. Although engineering codes for steel and aluminum structures support the designer with additional analytical expressions depending on the boundary conditions and internal force diagrams, one may alternatively apply the traditional Finite Element or Finite Difference Methods (FEM, FDM) to determine the so-called critical moment representing this phenomenon. The principal purpose of this work is to compare three different ways of determining the critical moment, also in the context of structural sensitivity analysis with respect to the structural element length. Sensitivity gradients are determined using both the analytical approach and the central finite difference scheme, and are contrasted for the analytical, FEM and FDM approaches. A computational study is provided for the entire family of steel I- and H-beams available to practitioners in this area, and is a basis for further stochastic reliability analysis as well as durability prediction including possible corrosion progress.
Sharma, Teenu; Khurana, Rajneet Kaur; Jain, Atul; Katare, O P; Singh, Bhupinder
2018-05-01
The current research work envisages an analytical quality by design-enabled development of a simple, rapid, sensitive, specific, robust and cost-effective stability-indicating reversed-phase high-performance liquid chromatographic method for determining stress-induced forced-degradation products of sorafenib tosylate (SFN). An Ishikawa fishbone diagram was constructed to embark upon the analytical target profile and critical analytical attributes, i.e. peak area, theoretical plates, retention time and peak tailing. Factor screening using Taguchi orthogonal arrays and quality risk assessment studies carried out using failure mode effect analysis aided the selection of critical method parameters, i.e. mobile phase ratio and flow rate, potentially affecting the chosen critical analytical attributes. Systematic optimization of the chosen critical method parameters using response surface methodology was carried out employing a two-factor, three-level, 13-run face-centered cubic design. A method operable design region was earmarked providing optimum method performance using numerical and graphical optimization. The optimum method employed a mobile phase composition of acetonitrile and water (containing orthophosphoric acid, pH 4.1) at 65:35 v/v at a flow rate of 0.8 mL/min with UV detection at 265 nm using a C18 column. Response surface methodology validation studies confirmed good efficiency and sensitivity of the developed method for analysis of SFN in mobile phase as well as in human plasma matrix. The forced degradation studies were conducted under different recommended stress conditions as per ICH Q1A (R2). Mass spectrometry studies showed that SFN degrades under strongly acidic, alkaline and oxidative hydrolytic conditions at elevated temperature, while the drug was per se found to be photostable. Oxidative hydrolysis using 30% H2O2 showed maximum degradation with products at retention times of 3.35, 3.65, 4.20 and 5.67 min.
The absence of any significant change in the retention time of SFN and degradation products, formed under different stress conditions, ratified selectivity and specificity of the systematically developed method. Copyright © 2017 John Wiley & Sons, Ltd.
Deep Learning Nuclei Detection in Digitized Histology Images by Superpixels.
Sornapudi, Sudhir; Stanley, Ronald Joe; Stoecker, William V; Almubarak, Haidar; Long, Rodney; Antani, Sameer; Thoma, George; Zuna, Rosemary; Frazier, Shelliane R
2018-01-01
Advances in image analysis and computational techniques have facilitated automatic detection of critical features in histopathology images. Detection of nuclei is critical for squamous epithelium cervical intraepithelial neoplasia (CIN) classification into normal, CIN1, CIN2, and CIN3 grades. In this study, a deep learning (DL)-based nuclei segmentation approach is investigated based on gathering localized information through the generation of superpixels using a simple linear iterative clustering algorithm and training with a convolutional neural network. The proposed approach was evaluated on a dataset of 133 digitized histology images and achieved an overall nuclei detection (object-based) accuracy of 95.97%, with demonstrated improvement over imaging-based and clustering-based benchmark techniques. The proposed DL-based nuclei segmentation method with superpixel analysis has shown improved segmentation results in comparison to state-of-the-art methods.
Three-dimensional magnetic critical behavior in CrI3
Liu, Yu; Petrovic, C.
2018-01-18
CrI3 is a promising candidate for van der Waals bonded ferromagnetic devices, since its ferromagnetism is maintained upon exfoliation of bulk crystals down to a single layer. In this work we studied the critical properties of bulk CrI3 single crystals around the paramagnetic-to-ferromagnetic phase transition. Critical exponents β = 0.260(4) with a critical temperature Tc = 60.05(13) K and γ = 1.136(6) with Tc = 60.43(4) K are obtained by the Kouvel-Fisher method, whereas δ = 5.32(2) is obtained by critical isotherm analysis at Tc = 60 K. In conclusion, the critical exponents determined in bulk CrI3 single crystals suggest a three-dimensional long-range magnetic coupling with the exchange distance decaying as J(r) ≈ r^(-4.69).
ERIC Educational Resources Information Center
Rogers, Richard
2004-01-01
Objective: The overriding objective is a critical examination of Munchausen syndrome by proxy (MSBP) and its closely-related alternative, factitious disorder by proxy (FDBP). Beyond issues of diagnostic validity, assessment methods and potential detection strategies are explored. Methods: A painstaking analysis was conducted of the MSBP and FDBP…
The PBL-Evaluator: A Web-Based Tool for Assessment in Tutorials.
ERIC Educational Resources Information Center
Chaves, John F.; Chaves, John A.; Lantz, Marilyn S.
1998-01-01
Describes design and use of the PBL Evaluator, a computer-based method of evaluating dental students' clinical problem-solving skills. Analysis of Indiana University students' self-, peer, and tutor ratings for one iteration of a course in critical thinking and professional behavior shows differences in these ratings. The method is found useful…
ERIC Educational Resources Information Center
Lal, Shalini; Suto, Melinda; Ungar, Michael
2012-01-01
Increasingly, qualitative researchers are combining methods, processes, and principles from two or more methodologies over the course of a research study. Critics charge that researchers adopting combined approaches place too little attention on the historical, epistemological, and theoretical aspects of the research design. Rather than…
Shan Gao; Xiping Wang; Michael C. Wiemann; Brian K. Brashaw; Robert J. Ross; Lihai Wang
2017-01-01
Key message: Field methods for rapid determination of wood density in trees have evolved from increment borer, torsiometer, Pilodyn, and nail withdrawal into sophisticated electronic tools of resistance drilling measurement. A partial resistance drilling approach coupled with knowledge of internal tree density distribution may...
Carbogim, Fábio da Costa; de Oliveira, Larissa Bertacchini; Püschel, Vilanice Alves de Araújo
2016-01-01
Objective: to analyze the concept of critical thinking (CT) in Rodger's evolutionary perspective. Method: documentary research undertaken in the Cinahl, Lilacs, Bdenf and Dedalus databases, using the keywords of 'critical thinking' and 'Nursing', without limitation based on year of publication. The data were analyzed in accordance with the stages of Rodger's conceptual model. The following were included: books and articles in full, published in Portuguese, English or Spanish, which addressed CT in the teaching and practice of Nursing; articles which did not address aspects related to the concept of CT were excluded. Results: the sample was made up of 42 works. As a substitute term, emphasis is placed on 'analytical thinking', and, as a related factor, decision-making. In order, the most frequent preceding and consequent attributes were: ability to analyze, training of the student nurse, and clinical decision-making. As the implications of CT, emphasis is placed on achieving effective results in care for the patient, family and community. Conclusion: CT is a cognitive skill which involves analysis, logical reasoning and clinical judgment, geared towards the resolution of problems, and standing out in the training and practice of the nurse with a view to accurate clinical decision-making and the achieving of effective results. PMID:27598376
Quantitative analysis of red wine tannins using Fourier-transform mid-infrared spectrometry.
Fernandez, Katherina; Agosin, Eduardo
2007-09-05
Tannin content and composition are critical quality components of red wines. No spectroscopic method assessing these phenols in wine has been described so far. We report here a new method using Fourier transform mid-infrared (FT-MIR) spectroscopy and chemometric techniques for the quantitative analysis of red wine tannins. Calibration models were developed using protein precipitation and phloroglucinolysis as analytical reference methods. After spectra preprocessing, six different predictive partial least-squares (PLS) models were evaluated, including the use of interval selection procedures such as iPLS and CSMWPLS. PLS regression with full-range (650-4000 cm^-1) spectra, the second derivative of the spectra and phloroglucinolysis as the reference method gave the most accurate determination for tannin concentration (RMSEC = 2.6%, RMSEP = 9.4%, r = 0.995). The prediction of the mean degree of polymerization (mDP) of the tannins also gave a reasonable prediction (RMSEC = 6.7%, RMSEP = 10.3%, r = 0.958). These results represent the first step in the development of a spectroscopic methodology for the quantification of several phenolic compounds that are critical for wine quality.
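The chemometric core of such a method, PLS regression of a reference concentration on spectra, can be sketched with a minimal NIPALS PLS1 fit on simulated data. The "spectra" below are synthetic and the component count arbitrary; this is a didactic illustration under those assumptions, not the published calibration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "FT-MIR spectra": 120 samples x 300 points whose intensities
# depend linearly on a latent tannin concentration plus noise (toy data).
n_samples, n_points = 120, 300
tannin = rng.uniform(0.5, 4.0, n_samples)          # arbitrary units
loadings = rng.standard_normal(n_points)
X = np.outer(tannin, loadings) + 0.1 * rng.standard_normal((n_samples, n_points))

def pls1_fit(X, y, n_components=3):
    """Minimal NIPALS PLS1. Returns (x_mean, y_mean, B) so that
    y_hat = y_mean + (X_new - x_mean) @ B."""
    x_mean, y_mean = X.mean(0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)          # weight vector
        t = Xc @ w                      # scores
        tt = t @ t
        p = Xc.T @ t / tt               # X loadings
        qk = (yc @ t) / tt              # y loading
        Xc = Xc - np.outer(t, p)        # deflate
        yc = yc - qk * t
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)  # regression vector in X space
    return x_mean, y_mean, B

# Calibrate on the first 100 samples, predict the held-out 20.
x_mean, y_mean, B = pls1_fit(X[:100], tannin[:100])
y_pred = y_mean + (X[100:] - x_mean) @ B
```

In practice the preprocessing (derivatives, interval selection such as iPLS) and the choice of component count via cross-validation matter as much as the regression itself.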
The Sensitivity Analysis for the Flow Past Obstacles Problem with Respect to the Reynolds Number
Ito, Kazufumi; Li, Zhilin; Qiao, Zhonghua
2013-01-01
In this paper, numerical sensitivity analysis with respect to the Reynolds number for the flow past obstacle problem is presented. To carry out such analysis, at each time step we need to solve the incompressible Navier-Stokes equations on irregular domains twice: once for the primary variables and once for the sensitivity variables with homogeneous boundary conditions. The Navier-Stokes solver is the augmented immersed interface method for Navier-Stokes equations on irregular domains. One of the most important contributions of this paper is that our analysis can predict the critical Reynolds number at which vortex shedding begins to develop in the wake of the obstacle. Some interesting experiments are shown to illustrate how the critical Reynolds number varies with different geometric settings. PMID:24910780
Study on vibration characteristics of the shaft system for a dredging pump based on FEM
NASA Astrophysics Data System (ADS)
Zhai, L. M.; Qin, L.; Liu, C. Y.; Liu, X.; He, L. Y.; He, Y.; Wang, Z. W.
2012-11-01
The dynamic characteristics of the shaft system of a dredging pump were studied with the Finite Element Method (FEM) using SAMCEF ROTOR. First, the influence on the lateral critical speeds of the fluid-solid coupling interaction of mud water and impeller, the water sealing, and the pump shaft was analyzed. The results indicated that the mud water must be taken into consideration, while the water sealing need not be. Then the effects of the radial and thrust rolling bearings on the lateral critical speeds were discussed, which showed that the radial bearing close to the impeller has the greatest impact on the 1st-order critical speed. Finally, the upper and lower limits of the critical speeds of lateral, axial and torsional vibration were calculated. The rated speed of the dredging pump was far less than the predicted critical speed, which ensures the safe operation of the unit. Each vibration mode is also shown in this paper. This dynamic analysis method offers a useful reference for research on the vibration and stability of shaft systems in dredging pumps.
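The lateral critical speeds referred to here come from a structural eigenproblem of the form K x = ω² M x. As a minimal sketch, with made-up lumped mass and stiffness values for a two-mass shaft model rather than the paper's full FEM model:

```python
import numpy as np

# Toy lumped two-mass shaft model; masses (kg) and stiffnesses (N/m)
# are illustrative values only, not from the SAMCEF ROTOR analysis.
M = np.diag([120.0, 80.0])
K = np.array([[4.0e7, -1.5e7],
              [-1.5e7, 2.5e7]])

# Generalized eigenproblem K x = w^2 M x  ->  eigenvalues of M^-1 K
w2 = np.linalg.eigvals(np.linalg.solve(M, K))
w = np.sqrt(np.sort(w2.real))          # natural frequencies (rad/s)
rpm_critical = w * 60 / (2 * np.pi)    # lateral critical speeds (rpm)
```

Safe operation requires the rated speed to sit well below `rpm_critical[0]` (or between well-separated criticals, with margin), which is exactly the check the abstract describes.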
Lakbub, Jude C; Shipman, Joshua T; Desaire, Heather
2018-04-01
Disulfide bonds are important structural moieties of proteins: they ensure proper folding, provide stability, and ensure proper function. With the increasing use of proteins for biotherapeutics, particularly monoclonal antibodies, which are highly disulfide bonded, it is now important to confirm the correct disulfide bond connectivity and to verify the presence, or absence, of disulfide bond variants in the protein therapeutics. These studies help to ensure safety and efficacy. Hence, disulfide bonds are among the critical quality attributes of proteins that have to be monitored closely during the development of biotherapeutics. However, disulfide bond analysis is challenging because of the complexity of the biomolecules. Mass spectrometry (MS) has been the go-to analytical tool for the characterization of such complex biomolecules, and several methods have been reported to meet the challenging task of mapping disulfide bonds in proteins. In this review, we describe the relevant, recent MS-based techniques and provide important considerations needed for efficient disulfide bond analysis in proteins. The review focuses on methods for proper sample preparation, fragmentation techniques for disulfide bond analysis, recent disulfide bond mapping methods based on the fragmentation techniques, and automated algorithms designed for rapid analysis of disulfide bonds from liquid chromatography-MS/MS data. Researchers involved in method development for protein characterization can use the information herein to facilitate development of new MS-based methods for protein disulfide bond analysis. In addition, individuals characterizing biotherapeutics, especially by disulfide bond mapping in antibodies, can use this review to choose the best strategies for disulfide bond assignment of their biologic products. 
Graphical Abstract: This review, describing characterization methods for disulfide bonds in proteins, focuses on three critical components: sample preparation, mass spectrometry data, and software tools.
A PERT/CPM of the Computer Assisted Completion of The Ministry September Report. Research Report.
ERIC Educational Resources Information Center
Feeney, J. D.
Using two statistical analysis techniques (the Program Evaluation and Review Technique and the Critical Path Method), this study analyzed procedures for compiling the required yearly report of the Metropolitan Separate School Board (Catholic) of Toronto, Canada. The computer-assisted analysis organized the process of completing the report more…
Online Responses towards Parental Rearing Styles Regarding Hand-Held Devices
ERIC Educational Resources Information Center
Geng, Gretchen; Disney, Leigh
2014-01-01
This article reviewed the literature on parental rearing styles and used responses from an online discussion forum to investigate people's opinions towards parental rearing styles and strategies when children use hand-held devices. Critical discourse analysis (CDA) was used as an analysis method via micro, meso and macro multi-level…
Language and Nutrition (Mis)Information: Food Labels, FDA Policies and Meaning
ERIC Educational Resources Information Center
Taylor, Christy Marie
2013-01-01
In this dissertation, I address the ways in which food manufacturers can exploit the often vague and ambiguous nature of FDA policies concerning language and images used on food labels. Employing qualitative analysis methods (Strauss, 1987; Denzin and Lincoln, 2003; Mackey and Gass, 2005) that drew upon critical discourse analysis (Fairclough,…
Huang, Weiqing; Fan, Hongbo; Qiu, Yongfu; Cheng, Zhiyu; Xu, Pingru; Qian, Yu
2016-05-01
Recently, China has frequently experienced large-scale, severe and persistent haze pollution due to surging urbanization and industrialization and rapid growth in the number of motor vehicles and in energy consumption. Vehicle emissions arising from the consumption of large quantities of fossil fuels are no doubt a critical factor in haze pollution. This work focuses on the causation mechanism of haze pollution related to vehicle emissions in Guangzhou city, employing the Fault Tree Analysis (FTA) method for the first time. With the establishment of the fault tree system "Haze weather-Vehicle exhausts explosive emission", all of the important risk factors are discussed and identified using this deductive FTA method. The qualitative and quantitative assessments of the fault tree system are carried out based on the structure, probability and critical importance degree analysis of the risk factors. The study may provide a new, simple and effective tool/strategy for the causation mechanism analysis and risk management of haze pollution in China. Copyright © 2016 Elsevier Ltd. All rights reserved.
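The probability and critical-importance-degree analysis described above can be sketched for independent basic events under an OR gate. The gate structure and event probabilities below are hypothetical placeholders, not values from the Guangzhou fault tree:

```python
# Illustrative fault-tree calculation: top-event probability and critical
# importance degree for independent basic events combined by an OR gate.
def or_gate(probs):
    """P(top) = 1 - product(1 - p_i) for independent events under OR."""
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def critical_importance(probs, i):
    """Critical importance degree of event i:
    I_i = (dQ/dq_i) * q_i / Q  (Birnbaum importance weighted by q_i/Q)."""
    Q = or_gate(probs)
    others = probs[:i] + probs[i + 1:]
    dQ_dqi = 1.0 - or_gate(others)   # partial derivative for an OR gate
    return dQ_dqi * probs[i] / Q

basic_events = [0.05, 0.10, 0.02]    # hypothetical basic-event probabilities
Q_top = or_gate(basic_events)
ranking = sorted(range(len(basic_events)),
                 key=lambda i: -critical_importance(basic_events, i))
print(round(Q_top, 4), ranking)      # event ranking by critical importance
```

Ranking events by critical importance is what lets an FTA single out the risk factors that most deserve control measures.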
Measuring change in critical thinking skills of dental students educated in a PBL curriculum.
Pardamean, Bens
2012-04-01
This study measured the change in critical thinking skills of dental students educated in a problem-based learning (PBL) pedagogical method. The quantitative analysis focused on measuring students' critical thinking skills achievement from their first through third years of dental education at the University of Southern California. This non-experimental evaluation was based on a volunteer sample of ninety-eight dental students who completed a demographics/academic questionnaire and a psychometric assessment known as the Health Sciences Reasoning Test (HSRT). The HSRT produced the overall critical thinking skills score. Additionally, the HSRT generated five subscale scores: analysis, inference, evaluation, deductive reasoning, and inductive reasoning. This study concluded that the students showed no continuous and significant incremental improvement in their overall critical thinking skills score during their PBL-based dental education. Except for the inductive reasoning score, this result was consistent across the four remaining subscale scores. Moreover, after statistical adjustment of the total score and subscale scores, no significant statistical differences were found among the three student groups. However, the results of this study found some aspects of critical thinking achievement that differed by gender, race, English as first language, and education level.
Ozturk, Candan; Muslu, Gonca Karayagiz; Dicle, Aklime
2008-07-01
Determining the critical thinking (CT) levels of students in undergraduate nursing schools is important for establishing the methods of education that should be used. Although there is some evidence that active learning approaches like problem-based learning are effective in developing CT, the findings are inconclusive. This descriptive analytic study compared levels of critical thinking among senior nursing students (N=147) in two educational programs, one of which used a problem-based learning (PBL) model while the other used a traditional model. The California Critical Thinking Disposition Inventory (CCTDI) was used as the data collection tool. Comparisons between the groups were made using t-test analysis. There was a significant difference (p<0.05) between the critical thinking disposition scores of the seniors in the PBL school and those in the school implementing the traditional model. Analysis of sub-scale scores showed significant differences in truth-seeking and open-mindedness. These findings add to the evidence that the active and self-directed nature of PBL encourages students' ability to think critically, be tolerant of the ideas of others, and evaluate conflicting information before reaching a conclusion.
Construction Of Critical Thinking Skills Test Instrument Related The Concept On Sound Wave
NASA Astrophysics Data System (ADS)
Mabruroh, F.; Suhandi, A.
2017-02-01
This study aimed to construct a test instrument for the critical thinking skills of high school students related to the concept of sound waves. The research used a mixed-methods, sequential exploratory design, consisting of: 1) a preliminary study; and 2) the design and review of the test instrument. The test instrument takes the form of essay questions: 18 questions divided among 5 indicators and 8 sub-indicators of the critical thinking skills described by Ennis, with questions that are qualitative and contextual. The phases of the preliminary study included: a) policy studies; b) a school survey; and c) literature studies. The design and review of the test instrument consisted of two steps. The draft design of the test instrument included: a) analysis of the depth of the teaching materials; b) selection of indicators and sub-indicators of critical thinking skills; c) analysis of those indicators and sub-indicators; d) implementation of the indicators and sub-indicators; and e) writing descriptions of the test instrument. The subsequent review of the test instrument consisted of: a) writing the test instrument; b) validity testing by experts; and c) revision of the test instrument based on the validators' input.
[Concept analysis of reflective thinking].
Van Vuuren, M; Botes, A
1999-09-01
Nursing practice is described as a scientific practice, but also as a practice where caring is important. The purpose of nursing education is to produce competent nursing practitioners. This implies that future practitioners must have both critical-analytical thinking abilities and empathy and moral values. Reflective thinking could probably accommodate these thinking skills. It seems that the facilitation of reflective thinking skills is essential in nursing education. The research question that is relevant in this context is: "What is reflective thinking?" The purpose of this article is to report on a concept analysis of reflective thinking, and in particular on its connotative meaning (critical attributes). The method used to perform the concept analysis is based on the original method of Wilson (1987) as described by Walker & Avant (1995). As part of the concept analysis, the connotations (critical attributes) are identified, reduced and organized into three categories, namely prerequisites, processes and outcomes. A model case is described which confirms the essential critical attributes of reflective thinking. Finally, a theoretical definition of reflective thinking is derived, which reads as follows: Reflective thinking is a cyclic, hierarchical and interactive construction process. It is initiated, extended and continued through personal cognitive-affective interaction (individual dimension) as well as interaction with the social environment (social dimension). To realize reflective thinking, a level of internalization in the cognitive and affective domains is required. The result of reflective thinking is an integrated framework of knowledge (meaningful learning) and an internalized value system, providing a new perspective on and a better understanding of a problem. Reflective thinking further leads to more effective decision-making and problem-solving skills.
Metcalf, Heather
2016-01-01
This research methods Essay details the usefulness of critical theoretical frameworks and critical mixed-methodological approaches for life sciences education research on broadening participation in the life sciences. First, I draw on multidisciplinary research to discuss critical theory and methodologies. Then, I demonstrate the benefits of these approaches for researchers who study diversity and inclusion issues in the life sciences through examples from two critical mixed-methods studies of prominent issues in science, technology, engineering, and mathematics (STEM) participation and recognition. The first study pairs critical discourse analysis of the STEM workforce literature, data, and underlying surveys with quantitative analyses of STEM pathways into the workforce. This example illustrates the necessity of questioning popular models of retention. It also demonstrates the importance of intersecting demographic categories to reveal patterns of experience both within and between groups whose access to and participation in STEM we aim to improve. The second study’s critical approach applies research on inequities in prizes awarded by STEM professional societies toward organizational change. This example uses data from the life sciences professional societies to show the importance of placing data within context to broaden participation and understand challenges in creating sustainable change. PMID:27521238
Yamazaki, Hiroshi; Slingsby, Brian Taylor; Takahashi, Miyako; Hayashi, Yoko; Sugimori, Hiroki; Nakayama, Takeo
2009-12-01
Although qualitative studies have increased since the 1990s, some reports note that relatively few influential journals published them up until 2000. This study critically reviewed the characteristics of qualitative studies published in top tier medical journals since 2000. We assessed full texts of qualitative studies published between 2000 and 2004 in the Annals of Internal Medicine, BMJ, JAMA, Lancet, and New England Journal of Medicine. We found 80 qualitative studies, of which 73 (91%) were published in BMJ. Only 10 studies (13%) combined qualitative and quantitative methods. Sixty-two studies (78%) used only one method of data collection. Interviews dominated the choice of data collection. The median sample size was 36 (range: 9-383). Thirty-three studies (41%) did not specify the type of analysis used but rather described the analytic process in detail. The rest indicated the mode of data analysis, in which the most prevalent methods were the constant comparative method (23%) and the grounded theory approach (22%). Qualitative data analysis software was used by 33 studies (41%). Among influential journals of general medicine, only BMJ consistently published an average of 15 qualitative study reports between 2000 and 2004. These findings lend insight into what qualities and characteristics make a qualitative study worthy of consideration to be published in an influential journal, primarily BMJ.
CPM and PERT in Library Management.
ERIC Educational Resources Information Center
Main, Linda
1989-01-01
Discusses two techniques of systems analysis--Critical Path Method (CPM) and Program Evaluation Review Techniques (PERT)--and their place in library management. An overview of CPM and PERT charting procedures is provided. (11 references) (Author/MES)
Microenterprise in health care and health education.
Edler, A. A.
1998-01-01
Over the last decade, development aid has increasingly used a more collaborative model, with donors and recipients both contributing ideas, methods and goals. Though many examples of collaborative aid projects exist in agriculture, business administration and banking, few have found their way into health care and health education, a typically donor-dominated model. The following case report describes a collaborative project in health care education. This case report analyzes data, including project proposals, personal interviews and project reports, obtained through standard archival research methods. The setting for this joint project was the collaboration between international nongovernmental (NGO) aid foundations and the faculty of the Department of Anesthesia of a major sub-Saharan African medical school. The initial goal of this project was to improve record keeping for all anesthetic records, both in the operating theatres and outside. Analysis of the data was performed using the ethnographic method of constant comparative analysis. The purpose of the analysis was to critically evaluate both the goals and their results in the Department of Anesthesiology. The findings of this analysis suggested that results included not only quality assurance and improvement programs in the department but also advances in the use of critical incidents as teaching tools, hospital-wide drug and equipment utilization information, and the initiation of an outreach program to district hospitals throughout the country for similar projects. PMID:10604789
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Yu; Petrovic, C.
CrI3 is a promising candidate for van der Waals bonded ferromagnetic devices, since its ferromagnetism is maintained upon exfoliation of bulk crystals down to a single layer. In this work we studied the critical properties of bulk CrI3 single crystals around the paramagnetic-to-ferromagnetic phase transition. Critical exponents β = 0.260(4) with a critical temperature Tc = 60.05(13) K and γ = 1.136(6) with Tc = 60.43(4) K are obtained by the Kouvel-Fisher method, whereas δ = 5.32(2) is obtained by a critical isotherm analysis at Tc = 60 K. In conclusion, the critical exponents determined in bulk CrI3 single crystals suggest a three-dimensional long-range magnetic coupling with the exchange interaction decaying as J(r) ≈ r^(-4.69).
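The critical isotherm analysis mentioned above uses the scaling relation M ∝ H^(1/δ) at T = Tc, so a log-log linear fit of M against H yields 1/δ as the slope. The data in this sketch are synthetic, generated from the reported δ = 5.32 rather than measured:

```python
import math

# Sketch of a critical-isotherm analysis: at Tc, M ~ H^(1/delta), so the
# slope of log M vs log H gives 1/delta. Data are synthetic (delta = 5.32).
def slope(x, y):
    """Least-squares slope of y against x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

delta_true = 5.32
H = [0.1 * k for k in range(1, 30)]        # applied field, arbitrary units
M = [h ** (1.0 / delta_true) for h in H]   # isotherm magnetization at Tc

delta_est = 1.0 / slope([math.log(h) for h in H],
                        [math.log(m) for m in M])
print(round(delta_est, 2))                 # recovers 5.32 on noise-free data
```

On real data the fit would be restricted to the asymptotic critical region and carry an uncertainty, as the 5.32(2) notation above indicates.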
Likert scales, levels of measurement and the "laws" of statistics.
Norman, Geoff
2010-12-01
Reviewers of research reports frequently criticize the choice of statistical methods. While some of these criticisms are well-founded, the use of parametric methods such as analysis of variance, regression, and correlation is frequently faulted because: (a) the sample size is too small, (b) the data may not be normally distributed, or (c) the data are from Likert scales, which are ordinal, so parametric statistics cannot be used. In this paper, I dissect these arguments and show that many studies, dating back to the 1930s, consistently show that parametric statistics are robust with respect to violations of these assumptions. Hence, challenges like those above are unfounded, and parametric methods can be utilized without concern for "getting the wrong answer".
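The robustness claim above can be illustrated by simulation: draw Likert responses for two groups from the same ordinal distribution and check that a pooled t-test rejects at roughly the nominal 5% rate under the null. The distribution, sample size, and trial count are arbitrary choices for this sketch:

```python
import random

# Illustrative simulation: a parametric t-test applied to ordinal Likert
# responses keeps approximately its nominal Type I error rate.
random.seed(0)
LIKERT = [1, 2, 3, 4, 5]
WEIGHTS = [0.1, 0.2, 0.4, 0.2, 0.1]   # same distribution for both groups

def t_stat(a, b):
    """Pooled-variance two-sample t statistic."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    return (ma - mb) / (sp2 * (1 / na + 1 / nb)) ** 0.5

trials, rejections = 2000, 0
for _ in range(trials):
    a = random.choices(LIKERT, WEIGHTS, k=30)
    b = random.choices(LIKERT, WEIGHTS, k=30)
    if abs(t_stat(a, b)) > 2.0:       # ~critical value for df = 58, alpha = .05
        rejections += 1
print(rejections / trials)            # close to the nominal 0.05
```

A rejection rate near 0.05 despite the ordinal, non-normal data is exactly the robustness the paper argues for.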
Determination of the design space of the HPLC analysis of water-soluble vitamins.
Wagdy, Hebatallah A; Hanafi, Rasha S; El-Nashar, Rasha M; Aboul-Enein, Hassan Y
2013-06-01
The analysis of water-soluble vitamins has been approached extensively over the last decades. A multitude of HPLC methods have been reported, each with its advantages and shortcomings, yet the design space of the HPLC analysis of these vitamins was not defined in any of these reports. According to the Food and Drug Administration (FDA), implementing the quality-by-design approach for the analysis of commercially available mixtures is hypothesized to enhance the pharmaceutical industry by facilitating the process of analytical method development and approval. This work illustrates a multifactorial optimization of three measured plus seven calculated influential HPLC parameters in the analysis of a mixture containing seven common water-soluble vitamins (B1, B2, B6, B12, C, PABA, and PP). The three measured parameters are gradient time, temperature, and ternary eluent composition (B1/B2), and the seven calculated parameters are flow rate, column length, column internal diameter, dwell volume, extracolumn volume, %B (start), and %B (end). The design is based on 12 experiments in which the multifactorial effects of these 3 + 7 parameters on the critical resolution and selectivity were examined by systematically varying all of the parameters simultaneously. The 12 basic runs comprised two different gradient times, each at two different temperatures, repeated at three different ternary eluent compositions (methanol, acetonitrile, or a mixture of both). Multidimensional robust regions of high critical Rs were defined and graphically verified. The optimum method was selected on the basis of the best-resolution separation in the shortest run time for a synthetic mixture, followed by application to two pharmaceutical preparations available on the market. The predicted retention times of all peaks were found to be in good agreement with the virtual ones.
In conclusion, the presented report offers an accurate determination of the design space for critical resolution in the analysis of water-soluble vitamins by HPLC, which would help the regulatory authorities to judge the validity of presented analytical methods for approval. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
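The 12-run grid described above (two gradient times × two temperatures × three ternary eluent compositions) can be enumerated directly. The factor levels below are hypothetical placeholders, not the study's actual settings:

```python
from itertools import product

# Sketch of the 12-run screening design: 2 gradient times x 2 temperatures
# x 3 ternary eluent compositions. All level values are hypothetical.
gradient_times = [15, 45]                        # tG, min
temperatures = [30, 50]                          # deg C
ternary_compositions = [(100, 0), (0, 100), (50, 50)]  # %MeOH / %MeCN in B

runs = [
    {"tG_min": tg, "T_C": t, "MeOH_pct": b[0], "MeCN_pct": b[1]}
    for b, tg, t in product(ternary_compositions, gradient_times, temperatures)
]
print(len(runs))   # the 12 basic experiments of the design
```

In a quality-by-design workflow, modeling software then interpolates critical resolution over this grid to map the robust regions the abstract mentions.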
Antioxidant Capacity Determination in Plants and Plant-Derived Products: A Review
Pop, Aneta; Cimpeanu, Carmen; Predoi, Gabriel
2016-01-01
The present paper aims at reviewing and commenting on the analytical methods applied to the assessment of antioxidants and antioxidant capacity in plant-derived products. Aspects related to oxidative stress, the influence of reactive oxygen species on key biomolecules, and antioxidant benefits and modes of action are discussed. The oxidant-antioxidant balance is also critically discussed. The conventional and nonconventional extraction procedures applied prior to analysis are presented as well, as the extraction step is of pivotal importance for the isolation and concentration of the compound(s) of interest before analysis. Then, the chromatographic, spectrometric, and electrochemical methods for the determination of antioxidants and antioxidant capacity in plant-derived products are detailed with respect to their principles, characteristics, and specific applications. Peculiarities related to the matrix characteristics and other factors influencing method performance are discussed. Health benefits of plants and derived products are described, as indicated in the original sources. Finally, critical and conclusive aspects are given regarding the choice of a particular extraction procedure and detection method, which should consider the nature of the sample, the prevalent antioxidant class, and the mechanism underlying each technique. Advantages and disadvantages are discussed for each method. PMID:28044094
ERIC Educational Resources Information Center
Aghababaeian, Parinaz; Moghaddam, Shams Aldin Hashemi; Nateghi, Faezeh; Faghihi, Alireza
2017-01-01
This study investigated the changes in public school social studies textbooks in general period of Iran (fourth and fifth grades) based on the emphasis on Facione critical thinking skills in the past three decades. In this study, content analysis of qualitative and quantitative methods was used to evaluate changes in textbook. For this purpose,…
NASA Astrophysics Data System (ADS)
Crâştiu, I.; Nyaguly, E.; Deac, S.; Gozman-Pop, C.; Bârgău, A.; Bereteu, L.
2018-01-01
The purpose of this paper is the development and validation of an impulse excitation technique to determine the flexural critical speeds of single-rotor and multi-rotor shafts. The experimental measurement of the vibroacoustic response is carried out using a condenser microphone as the transducer. By means of modal analysis using the Finite Element Method (FEM), the natural frequencies and mode shapes of one-rotor and three-rotor specimens are determined. The vibration responses of the specimens, under simply supported conditions, are processed using algorithms based on the Fast Fourier Transform (FFT). To validate the modal parameters estimated using Finite Element Analysis (FEA), they are compared with the experimental ones.
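The impulse-excitation workflow above (tap the specimen, record the acoustic response, take its spectrum, read the natural frequency off the dominant peak) can be sketched on a synthetic signal. The 440 Hz "specimen" below is invented for illustration, not one of the paper's rotors:

```python
import cmath, math

# Sketch of impulse-excitation frequency identification on a synthetic
# damped sinusoid standing in for a recorded microphone signal.
fs = 8192                     # sampling rate, Hz
n = 8192                      # one second of signal (power of 2 for the FFT)
f_nat = 440.0                 # true natural frequency of the mock specimen
signal = [math.exp(-3.0 * t / fs) * math.sin(2 * math.pi * f_nat * t / fs)
          for t in range(n)]

def fft(x):
    """Recursive radix-2 Cooley-Tukey FFT (len(x) must be a power of 2)."""
    if len(x) == 1:
        return x
    even, odd = fft(x[0::2]), fft(x[1::2])
    out = [0j] * len(x)
    for k in range(len(x) // 2):
        tw = cmath.exp(-2j * math.pi * k / len(x)) * odd[k]
        out[k] = even[k] + tw
        out[k + len(x) // 2] = even[k] - tw
    return out

spectrum = fft([complex(s) for s in signal])
peak_bin = max(range(n // 2), key=lambda k: abs(spectrum[k]))
print(peak_bin * fs / n)      # estimated natural frequency, Hz
```

With the bin width fs/n = 1 Hz, the dominant peak lands on the 440 Hz mode; the same peak-picking on measured responses is how the critical speeds are identified experimentally.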
Teaching Critical Thinking Using Reflective Journaling in a Nursing Fellowship Program.
Zori, Susan
2016-07-01
Critical thinking (CT) is considered to be foundational for the development of RN clinical reasoning. Reflective journaling has been used as an educational strategy to support the development of CT. This project's purpose was to explore how using reflective journaling about CT dispositions with RNs in a fellowship program might influence RN's use of CT dispositions. This descriptive, qualitative study used content analysis as the method to analyze journal entries focused on seven CT dispositions: inquisitiveness, systematicity, open mindedness, analyticity, truth seeking, CT maturity, and CT confidence written by RNs in the first 7 weeks of their fellowship program. Based on the content analysis of journal entries, two major descriptive themes emerged: Development of Critical Thinking Is a Process That Develops During a Period of Time, and Purposefully Engaging Critical Thinking Dispositions May Help Prevent Negative Patient Outcomes. The purposeful use of CT dispositions as described in the journal entries also helped to guide the RN's individual learning. J Contin Educ Nurs. 2016;47(7):321-329. Copyright 2016, SLACK Incorporated.
Differences in Risk Factors for Rotator Cuff Tears between Elderly Patients and Young Patients.
Watanabe, Akihisa; Ono, Qana; Nishigami, Tomohiko; Hirooka, Takahiko; Machida, Hirohisa
2018-02-01
It has been unclear whether the risk factors for rotator cuff tears are the same at all ages or differ between young and older populations. In this study, we examined the risk factors for rotator cuff tears using classification and regression tree analysis, a method of nonlinear regression analysis. There were 65 patients in the rotator cuff tear group and 45 patients in the intact rotator cuff group. Classification and regression tree analysis was performed to predict rotator cuff tears. The target factor was rotator cuff tears; the explanatory variables were age, sex, trauma, and critical shoulder angle ≥35°. In the classification and regression tree analysis, the tree first split at age 64. For patients aged ≥64, the tree then split on trauma; for patients aged <64, it split on critical shoulder angle ≥35°. The odds ratio for critical shoulder angle ≥35° was significant for all ages (5.89) and for patients aged <64 (10.3), while trauma was a significant factor only for patients aged ≥64 (5.13). Age, trauma, and critical shoulder angle ≥35° were related to rotator cuff tears in this study. However, these risk factors showed different trends according to age group, rather than a linear relationship.
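The odds ratios quoted above come from 2×2 contingency tables: the cross-product ratio of exposed/unexposed cases and controls. The cell counts below are hypothetical, since the abstract reports only the resulting ratios:

```python
# Odds ratio from a 2x2 contingency table (rows: risk factor present/absent,
# columns: tear/intact). Cell counts are hypothetical illustrations.
def odds_ratio(exposed_cases, exposed_controls,
               unexposed_cases, unexposed_controls):
    """Cross-product ratio (a*d)/(b*c) of the 2x2 table."""
    return (exposed_cases * unexposed_controls) / \
           (exposed_controls * unexposed_cases)

# e.g. CSA >= 35 deg vs tear status, with made-up counts:
print(round(odds_ratio(50, 15, 15, 30), 2))
```

Within a classification tree, the same calculation is repeated inside each node (here, the age <64 and age ≥64 subgroups), which is how the age-dependent odds ratios above arise.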
Using Framework Analysis in nursing research: a worked example.
Ward, Deborah J; Furber, Christine; Tierney, Stephanie; Swallow, Veronica
2013-11-01
To demonstrate Framework Analysis using a worked example and to illustrate how criticisms of qualitative data analysis including issues of clarity and transparency can be addressed. Critics of the analysis of qualitative data sometimes cite lack of clarity and transparency about analytical procedures; this can deter nurse researchers from undertaking qualitative studies. Framework Analysis is flexible, systematic, and rigorous, offering clarity, transparency, an audit trail, an option for theme-based and case-based analysis and for readily retrievable data. This paper offers further explanation of the process undertaken which is illustrated with a worked example. Data were collected from 31 nursing students in 2009 using semi-structured interviews. The data collected are not reported directly here but used as a worked example for the five steps of Framework Analysis. Suggestions are provided to guide researchers through essential steps in undertaking Framework Analysis. The benefits and limitations of Framework Analysis are discussed. Nurses increasingly use qualitative research methods and need to use an analysis approach that offers transparency and rigour which Framework Analysis can provide. Nurse researchers may find the detailed critique of Framework Analysis presented in this paper a useful resource when designing and conducting qualitative studies. Qualitative data analysis presents challenges in relation to the volume and complexity of data obtained and the need to present an 'audit trail' for those using the research findings. Framework Analysis is an appropriate, rigorous and systematic method for undertaking qualitative analysis. © 2013 Blackwell Publishing Ltd.
Rationing critical care medicine: recent studies and current trends.
Ward, Nicholas S
2005-12-01
This paper reviews the literature on the rationing of critical care resources. Although much has been written about the concept of rationing, there have been few scientific studies as to its prevalence. A recent meta-analysis reviewed all previously published studies on rationing access to intensive care units but little is known about practices within the intensive care unit. Much literature in the past few years has focused on the growing use of critical care resources and projections for the future. Several authors suggest there may be a crisis in financial or personnel resources if some rationing does not take place. Other papers have argued that the methods of rationing critical care previously proposed, such as limiting the care of dying patients or using cost-effectiveness analysis to determine care, may not be effective or viewed as ethical by some. Finally, several recent papers review how critical care is practiced and allocated in India and Asian countries that already practice open rationing in their health care systems. There is currently no published evidence that overt rationing is taking place in critical care medicine. There is growing evidence that in the future, the need for critical care may outstrip financial resources unless some form of rationing takes place. It is also clear from the literature that choosing how to ration critical care will be a difficult task.
Theoretical analyses of Baroclinic flows
NASA Technical Reports Server (NTRS)
Antar, B.
1982-01-01
A stability analysis of a thin horizontal rotating fluid layer which is subjected to arbitrary horizontal and vertical temperature gradients is presented. The basic state is a nonlinear Hadley cell which contains both Ekman and thermal boundary layers; it is given in closed form. The stability analysis is based on the linearized Navier-Stokes equations, and zonally symmetric perturbations in the form of waves propagating in the meridional direction are considered. Numerical methods were used for the stability problem. It was found that the instability sets in when the Richardson number is close to unity and that the critical Richardson number is a non-monotonic function of the Prandtl number. Further, it was found that the critical Richardson number decreases with increasing Ekman number until a critical value of the Ekman number is reached beyond which the fluid is stable.
Delsignore, Ann Marie; Petrova, Elena; Harper, Amney; Stowe, Angela M; Mu'min, Ameena S; Middleton, Renée A
2010-07-01
An exploratory qualitative analysis of the critical incidents and assistance-seeking behaviors of White mental health psychologists and professional counselors was performed in an effort to examine a theoretical supposition presented within a Person(al)-as-Profession(al) transtheoretical framework (P-A-P). A concurrent nested strategy was used in which both quantitative and qualitative data were collected simultaneously (Creswell, 2003). In this nested strategy, qualitative data was embedded in a predominant (quantitative) method of analysis from an earlier study (see Middleton et al., 2005). Critical incidents categorized as informal (i.e., personal) experiences were cited more often than those characterized as formal (i.e., professional) experiences as influencing the professional perspectives of White mental health practitioners regarding multicultural diversity. Implications for the counseling and psychology professions are discussed.
NASA Technical Reports Server (NTRS)
Kim, Sang-Wook
1987-01-01
Various experimental, analytical, and numerical analysis methods for the flow-solid interaction of a nest of cylinders subjected to cross flows are reviewed. A nest of cylinders subjected to cross flows can be found in numerous engineering applications, including the Space Shuttle Main Engine Main Injector Assembly (SSME-MIA) and nuclear reactor heat exchangers. Despite its extreme importance in engineering applications, understanding of the flow-solid interaction process is quite limited, and the design of tube banks is mostly dependent on experiments and/or experimental correlation equations. For future development of major numerical analysis methods for the flow-solid interaction of a nest of cylinders subjected to cross flow, various turbulence models, nonlinear structural dynamics, and existing laminar flow-solid interaction analysis methods are included.
Critical Thinking Skills in Nursing Students: a Comparison Between Freshmen and Senior Students
Azizi-Fini, Ismail; Hajibagheri, Ali; Adib-Hajbaghery, Mohsen
2015-01-01
Background: Critical thinking is one of the most important concepts in the field of education. Despite studies published on nursing students’ critical thinking skills (CTS), some suggest that there is not enough evidence supporting the relationship between content of nursing education programs and nursing students’ CTS. Objectives: Given the existing discrepancies, this study aimed to compare the critical thinking skills of freshmen and senior nursing students. Patients and Methods: This comparative study was conducted on 150 undergraduate freshmen and senior nursing students in Kashan University of Medical Sciences, during 2012. The students in the first and the last semesters of their study in nursing were entered in the study using the census method. Data were collected using a questionnaire including questions on demographic data and the California Critical Thinking Skills Test, form B. Data analysis was performed using the SPSS v.13 software. Descriptive statistics were calculated. Moreover, independent sample t-test and Spearman and Pearson’s correlation coefficients were used in the data analysis. Results: Both the freshmen and senior nursing students had low CTS. The mean critical thinking scores were 11.79 ± 4.80 and 11.21 ± 3.17 for the freshmen and the senior students, respectively (P = 0.511). Moreover, no significant correlation was found between the students’ score in CTS and their age, gender, high school grade point average (GPA), rank in university entrance examination (RUEE) and interest in the nursing profession. Conclusions: The students were low skilled in critical thinking and their CTS did not significantly change during their nursing degree. Thus it may be concluded that the nursing education program did not affect the CTS of its students. Longitudinal studies are suggested for assessing nursing students’ critical thinking over time. 
Moreover, revising the curriculum and preparing nursing educators for implementing innovative and active teaching strategies are suggested. PMID:25830160
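The two-cohort comparison described in this abstract (independent-samples t-test plus correlation analysis) can be sketched in Python; the cohort means, standard deviations, sample sizes, and ages below are synthetic values that merely echo the abstract, not the study's data:

```python
import numpy as np
from scipy import stats

# Synthetic CCTST-like scores; means/SDs echo the abstract, n = 75 per cohort is illustrative
rng = np.random.default_rng(42)
freshmen = rng.normal(loc=11.79, scale=4.80, size=75)
seniors = rng.normal(loc=11.21, scale=3.17, size=75)

# Welch's independent-samples t-test (no equal-variance assumption)
t_stat, p_value = stats.ttest_ind(freshmen, seniors, equal_var=False)

# Pearson correlation between CTS score and a covariate such as age (synthetic ages)
ages = rng.integers(18, 40, size=75)
r, p_corr = stats.pearsonr(freshmen, ages)
```

A p-value above 0.05 here would mirror the study's finding of no significant difference between cohorts.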
A critical analysis of the literature on the Internet and consumer health information.
Powell, J A; Lowe, P; Griffiths, F E; Thorogood, M
2005-01-01
A critical review of the published literature investigating the Internet and consumer health information was undertaken in order to inform further research and policy. A qualitative, narrative method was used, consisting of a three-stage process of identification and collation, thematic coding, and critical analysis. This analysis identified five main themes in the research in this area: (1) the quality of online health information for consumers; (2) consumer use of the Internet for health information; (3) the effect of e-health on the practitioner-patient relationship; (4) virtual communities and online social support and (5) the electronic delivery of information-based interventions. Analysis of these themes revealed more about the concerns of health professionals than about the effect of the Internet on users. Much of the existing work has concentrated on quantifying characteristics of the Internet: for example, measuring the quality of online information, or describing the numbers of users in different health-care settings. There is a lack of qualitative research that explores how citizens are actually using the Internet for health care.
NASA Astrophysics Data System (ADS)
Li, Xiao-Fen; Kochat, Mehdi; Majkic, Goran; Selvamanickam, Venkat
2016-08-01
In this paper the authors succeeded in measuring the critical current density (J_c) of multifilament-coated conductors (CCs) with filaments as narrow as 0.25 mm using the scanning Hall probe microscope (SHPM) technique. A new iterative method of data analysis is developed that makes the calculation of J_c for thin filaments possible even without a very small scan distance. The authors also discuss in detail the advantages and limitations of the iterative method using both simulation and experimental results. The results of the new method correspond well with those of the traditional fast Fourier transform method where the latter is still applicable. However, the new method is applicable to filamentized CCs under much wider measurement conditions, such as thin filaments and a large scan distance, thus overcoming the barrier to application of the SHPM technique to J_c measurement of long filamentized CCs with narrow filaments.
Henry, C Jeya K; Xin, Janice Lim Wen
2014-06-01
The local manufacture of ready-to-use therapeutic foods (RUTFs) is increasing, and there is a need to develop methods to ensure their safe production. We propose the application of Hazard Analysis Critical Control Point (HACCP) principles to achieve this goal. The basic principles of HACCP in the production of RUTFs are outlined. It is concluded that the implementation of an HACCP system in the manufacture of RUTFs is not only feasible but also attainable. The introduction of good manufacturing practices, coupled with an effective HACCP system, will ensure that RUTFs are produced in a cost-effective, safe, and hygienic manner.
NASA Astrophysics Data System (ADS)
Zarka, Philippe
2011-06-01
Astrology enjoys great success in our societies, from the private to the political sphere as well as in the media, despite the demonstrated inaccuracy of its psychological and operational predictions. We analyse here the relations between astrology and astronomy, as well as the criticisms the latter directs at the former. We show that most of these criticisms are weak. Much stronger ones emerge from an analysis of astrological practice compared with the scientific method, leading us to conclude that astrology is not scientific. We then return to the success of astrology, and from its analysis we propose a renewed (and prophylactic) role for astronomy in society.
SCALE 6.2 Continuous-Energy TSUNAMI-3D Capabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perfetti, Christopher M; Rearden, Bradley T
2015-01-01
The TSUNAMI (Tools for Sensitivity and UNcertainty Analysis Methodology Implementation) capabilities within the SCALE code system make use of sensitivity coefficients for an extensive number of criticality safety applications, such as quantifying the data-induced uncertainty in the eigenvalue of critical systems, assessing the neutronic similarity between different systems, quantifying computational biases, and guiding nuclear data adjustment studies. The need to model geometrically complex systems with improved ease of use and fidelity and the desire to extend TSUNAMI analysis to advanced applications have motivated the development of a SCALE 6.2 module for calculating sensitivity coefficients using three-dimensional (3D) continuous-energy (CE) Monte Carlo methods: CE TSUNAMI-3D. This paper provides an overview of the theory, implementation, and capabilities of the CE TSUNAMI-3D sensitivity analysis methods. CE TSUNAMI contains two methods for calculating sensitivity coefficients in eigenvalue sensitivity applications: (1) the Iterated Fission Probability (IFP) method and (2) the Contributon-Linked eigenvalue sensitivity/Uncertainty estimation via Track length importance CHaracterization (CLUTCH) method. This work also presents the GEneralized Adjoint Response in Monte Carlo method (GEAR-MC), a first-of-its-kind approach for calculating adjoint-weighted, generalized response sensitivity coefficients, such as flux responses or reaction rate ratios, in CE Monte Carlo applications. The accuracy and efficiency of the CE TSUNAMI-3D eigenvalue sensitivity methods are assessed from a user perspective in a companion publication, and the accuracy and features of the CE TSUNAMI-3D GEAR-MC methods are detailed in this paper.
Hoff, Rodrigo Barcellos; Rübensam, Gabriel; Jank, Louise; Barreto, Fabiano; Peralba, Maria do Carmo Ruaro; Pizzolato, Tânia Mara; Silvia Díaz-Cruz, M; Barceló, Damià
2015-01-01
In the residue analysis of veterinary drugs in foodstuffs, matrix effects are one of the most critical points. This work presents a discussion of approaches used to estimate, minimize, and monitor matrix effects in bioanalytical methods. Qualitative and quantitative methods for the estimation of matrix effects, such as post-column infusion, slope-ratio analysis, calibration curves (mathematical and statistical analysis), and control chart monitoring, are discussed using real data. Matrix effects varied widely depending on the analyte and the sample preparation method: pressurized liquid extraction of liver samples showed matrix effects from 15.5 to 59.2%, while ultrasound-assisted extraction gave values from 21.7 to 64.3%. The influence of the matrix was also evaluated: for sulfamethazine analysis, signal losses varied from -37 to -96% for fish and eggs, respectively. Advantages and drawbacks are also discussed in the context of a proposed workflow for matrix effect assessment, applied to real data from sulfonamide residue analysis. Copyright © 2014 Elsevier B.V. All rights reserved.
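The slope-ratio approach mentioned above can be sketched numerically: the matrix effect is commonly expressed as ME% = (slope_matrix / slope_solvent - 1) * 100, where negative values indicate ion suppression. The concentrations, responses, and helper name below are hypothetical:

```python
import numpy as np

def matrix_effect_percent(conc, resp_solvent, resp_matrix):
    """Estimate the matrix effect from the ratio of matrix-matched to
    neat-solvent calibration slopes: ME% = (slope_matrix/slope_solvent - 1) * 100.
    Negative values indicate ion suppression, positive values enhancement."""
    slope_solvent = np.polyfit(conc, resp_solvent, 1)[0]
    slope_matrix = np.polyfit(conc, resp_matrix, 1)[0]
    return (slope_matrix / slope_solvent - 1.0) * 100.0

# Hypothetical calibration data (e.g. ng/mL vs. peak area)
conc = np.array([1.0, 5.0, 10.0, 50.0, 100.0])
resp_solvent = 120.0 * conc + 15.0   # neat-solvent curve
resp_matrix = 75.0 * conc + 20.0     # matrix-matched (e.g. liver extract) curve

me = matrix_effect_percent(conc, resp_solvent, resp_matrix)
print(round(me, 1))  # -37.5, i.e. ~37.5% signal suppression
```

This is a sketch of the slope-ratio estimation only; post-column infusion and control-chart monitoring are separate, complementary techniques.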
NASA Technical Reports Server (NTRS)
Townsend, John S.; Peck, Jeff; Ayala, Samuel
2000-01-01
NASA has funded several major programs (the Probabilistic Structural Analysis Methods Project is an example) to develop probabilistic structural analysis methods and tools for engineers to apply in the design and assessment of aerospace hardware. A probabilistic finite element software code, known as Numerical Evaluation of Stochastic Structures Under Stress, is used to determine the reliability of a critical weld of the Space Shuttle solid rocket booster aft skirt. An external bracket modification to the aft skirt provides a comparison basis for examining the details of the probabilistic analysis and its contributions to the design process. Also, analysis findings are compared with measured Space Shuttle flight data.
NASA Astrophysics Data System (ADS)
Wei, Haoyang
A new critical plane-energy model is proposed in this thesis for multiaxial fatigue life prediction of homogeneous and heterogeneous materials. A brief review of existing methods, especially critical plane-based and energy-based methods, is given first. Special focus is on one critical plane approach which has been shown to work for both brittle and ductile metals. The key idea is to automatically change the critical plane orientation with respect to different materials and stress states. One potential drawback of the developed model is that it needs an empirical calibration parameter for non-proportional multiaxial loadings, since only the strain terms are used and the out-of-phase hardening cannot be considered. The energy-based model using the critical plane concept is proposed with the help of the Mroz-Garud hardening rule to explicitly include the effect of non-proportional hardening under fatigue cyclic loadings. Thus, the empirical calibration for non-proportional loading is not needed, since the out-of-phase hardening is naturally included in the stress calculation. The model predictions are compared with experimental data from the open literature, and it is shown that the proposed model can work for both proportional and non-proportional loadings without the empirical calibration. Next, the model is extended to the fatigue analysis of heterogeneous materials in integration with the finite element method. Fatigue crack initiation in representative volumes of heterogeneous materials is analyzed using the developed critical plane-energy model, with special focus on the effect of microstructure on the multiaxial fatigue life predictions. Several conclusions are drawn and future work is outlined based on the proposed study.
Clinical decision making by nurses when faced with third-space fluid shift. How well do they fare?
Redden, M; Wotton, K
2001-01-01
Nurses' use of knowledge, the connection of this knowledge to treatment decisions, and the information actually used to reach such decisions delineate nurses' level of expertise. Previous research has shown that nurses use the hypothetico-deductive method and intuitive judgment or pattern recognition in their clinical decision-making. This interpretive study explored experienced critical care nurses' (n = 5) and gastrointestinal surgical nurses' (n = 5) clinical decision-making processes by ascertaining their knowledge and understanding of third-space fluid shift in elderly patients undergoing major gastrointestinal surgery. Both groups of nurses, because of their experience with elderly patients undergoing gastrointestinal surgery, were assumed to be experts. Data collection techniques included semi-structured interviews and the use of a think-aloud protocol for clinical scenario analysis. The findings demonstrated that the gastrointestinal surgical nurses used the hypothetico-deductive method to recognize critical cues and the existence of a problem but could not name the problem. The critical care nurses, on the other hand, used a combination of the hypothetico-deductive method and pattern recognition as a basis for identification of critical cues. The critical care nurses also possessed in-depth knowledge of third-space fluid shift and were able to use pivotal cues to identify the actual phenomenon. Ultimately, it would appear that the structure of critical care nurses' work, their increased educational qualifications, and the culture of the critical care unit promote a more proactive approach to reasoning in the physiological domain. The findings have implications for the development of practice guidelines and for curriculum development in both tertiary and continuing nurse education.
Exploiting interfacial water properties for desalination and purification applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Hongwu; Varma, Sameer; Nyman, May Devan
2008-09-01
A molecular-scale interpretation of interfacial processes is often downplayed in the analysis of traditional water treatment methods. However, such an approach is critical for the development of enhanced performance in traditional desalination and water treatments. Water confined between surfaces, within channels, or in pores is ubiquitous in technology and nature. Its physical and chemical properties in such environments are unpredictably different from bulk water. As a result, advances in water desalination and purification methods may be accomplished through an improved analysis of water behavior in these challenging environments using state-of-the-art microscopy, spectroscopy, experimental, and computational methods.
Objective Assessment of Patient Inhaler User Technique Using an Audio-Based Classification Approach.
Taylor, Terence E; Zigel, Yaniv; Egan, Clarice; Hughes, Fintan; Costello, Richard W; Reilly, Richard B
2018-02-01
Many patients make critical user technique errors when using pressurised metered dose inhalers (pMDIs), errors which reduce the clinical efficacy of respiratory medication. Such critical errors include poor actuation coordination (poor timing of medication release during inhalation) and inhaling too fast (peak inspiratory flow rate over 90 L/min). Here, we present a novel audio-based method that objectively assesses patient pMDI user technique. The Inhaler Compliance Assessment device was employed to record inhaler audio signals from 62 respiratory patients as they used a pMDI with an In-Check Flo-Tone device attached to the inhaler mouthpiece. Using a quadratic discriminant analysis approach, the audio-based method achieved a total frame-by-frame accuracy of 88.2% in classifying sound events (actuation, inhalation and exhalation). The audio-based method estimated the peak inspiratory flow rate and volume of inhalations with accuracies of 88.2% and 83.94%, respectively. It was found that 89% of patients made at least one critical user technique error even after tuition from an expert clinical reviewer. This method provides a more clinically accurate assessment of patient inhaler user technique than standard checklist methods.
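A frame-level quadratic discriminant analysis of this kind can be sketched with scikit-learn; the two acoustic features and class cluster locations below are invented for illustration and are not the paper's feature set:

```python
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

rng = np.random.default_rng(0)
n = 200  # frames per class

# Hypothetical 2-D acoustic features per audio frame (e.g. energy, spectral centroid);
# the cluster locations are invented for illustration
actuation = rng.normal([2.0, 0.5], 0.3, (n, 2))
inhalation = rng.normal([0.5, 2.0], 0.3, (n, 2))
exhalation = rng.normal([-1.0, -1.0], 0.3, (n, 2))

X = np.vstack([actuation, inhalation, exhalation])
y = np.repeat(["actuation", "inhalation", "exhalation"], n)

# Fit a quadratic discriminant classifier and score frame-by-frame accuracy
qda = QuadraticDiscriminantAnalysis().fit(X, y)
acc = qda.score(X, y)
```

QDA fits one Gaussian per class with its own covariance, so it can separate sound events whose feature spreads differ, unlike linear discriminant analysis.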
Individual Movement Strategies Revealed through Novel Clustering of Emergent Movement Patterns
NASA Astrophysics Data System (ADS)
Valle, Denis; Cvetojevic, Sreten; Robertson, Ellen P.; Reichert, Brian E.; Hochmair, Hartwig H.; Fletcher, Robert J.
2017-03-01
Understanding movement is critical in several disciplines but analysis methods often neglect key information by adopting each location as sampling unit, rather than each individual. We introduce a novel statistical method that, by focusing on individuals, enables better identification of temporal dynamics of connectivity, traits of individuals that explain emergent movement patterns, and sites that play a critical role in connecting subpopulations. We apply this method to two examples that span movement networks that vary considerably in size and questions: movements of an endangered raptor, the snail kite (Rostrhamus sociabilis plumbeus), and human movement in Florida inferred from Twitter. For snail kites, our method reveals substantial differences in movement strategies for different bird cohorts and temporal changes in connectivity driven by the invasion of an exotic food resource, illustrating the challenge of identifying critical connectivity sites for conservation in the presence of global change. For human movement, our method is able to reliably determine the origin of Florida visitors and identify distinct movement patterns within Florida for visitors from different places, providing near real-time information on the spatial and temporal patterns of tourists. These results emphasize the need to integrate individual variation to generate new insights when modeling movement data.
Evaluation of methods for freeway operational analysis.
DOT National Transportation Integrated Search
2001-10-01
The ability to estimate accurately the operational performance of roadway segments has become increasingly critical as we move from a period of new construction into one of operations, maintenance, and, in some cases, reconstruction. In addition to m...
Interpreting Research on School Resources and Student Achievement: A Rejoinder to Hanushek.
ERIC Educational Resources Information Center
Greenwald, Rob; Hedges, Larry V.; Laine, Richard
1996-01-01
Supports the findings of a meta-analysis that demonstrates that student achievement is related to the availability of resources, disagreeing with criticisms of method and sample selection made by E. Hanushek (1996). (SLD)
METHODS OF ANALYSIS FOR WASTE LOAD ALLOCATION
This research has addressed several unresolved questions concerning the allocation of allowable waste loads among multiple wastewater dischargers within a water quality limited stream segment. First, the traditional assumptions about critical design conditions for waste load allo...
The Socratic Method: analyzing ethical issues in health administration.
Gac, E J; Boerstler, H; Ruhnka, J C
1998-01-01
The Socratic Method has long been recognized by the legal profession as an effective tool for promoting critical thinking and analysis in the law. This article describes ways the technique can be used in health administration education to help future administrators develop the "ethical rudder" they will need for effective leadership. An illustrative dialogue is provided.
A Critical Analysis of the Body of Work Method for Setting Cut-Scores
ERIC Educational Resources Information Center
Radwan, Nizam; Rogers, W. Todd
2006-01-01
The recent increase in the use of constructed-response items in educational assessment and the dissatisfaction with the nature of the decision that the judges must make using traditional standard-setting methods created a need to develop new and effective standard-setting procedures for tests that include both multiple-choice and…
ERIC Educational Resources Information Center
Mellone, James T.
2010-01-01
This study provides a database evaluation method for the practicing bibliographer that is more than a brief review yet less than a controlled experiment. The author establishes evaluation criteria in the context of the bibliographic instruction provided to meet the research requirements of undergraduate sociology majors at Queens College, City…
Benchmark On Sensitivity Calculation (Phase III)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ivanova, Tatiana; Laville, Cedric; Dyrda, James
2012-01-01
The sensitivities of the keff eigenvalue to neutron cross sections have become commonly used in similarity studies and as part of the validation algorithm for criticality safety assessments. To test calculations of the sensitivity coefficients, a benchmark study (Phase III) has been established by the OECD-NEA/WPNCS/EG UACSA (Expert Group on Uncertainty Analysis for Criticality Safety Assessment). This paper presents some sensitivity results generated by the benchmark participants using various computational tools based upon different computational methods: SCALE/TSUNAMI-3D and -1D, MONK, APOLLO2-MORET 5, DRAGON-SUSD3D and MMKKENO. The study demonstrates the performance of the tools. It also illustrates how model simplifications impact the sensitivity results and demonstrates the importance of 'implicit' (self-shielding) sensitivities. This work has been a useful step towards verification of the existing and developed sensitivity analysis methods.
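The sensitivity coefficients in question are relative derivatives, S = (dk/k)/(dσ/σ). A minimal sketch under a toy one-group model follows; the k_inf formula, cross-section values, and function names are illustrative, not any benchmark case:

```python
def k_inf(sigma_f, sigma_c, nu=2.43):
    """Toy one-group infinite-medium multiplication factor (illustrative only)."""
    return nu * sigma_f / (sigma_f + sigma_c)

def rel_sensitivity(f, x0, rel_step=1e-4):
    """Relative sensitivity S = (dk/k)/(dx/x), estimated by central differences."""
    h = x0 * rel_step
    k0 = f(x0)
    return x0 * (f(x0 + h) - f(x0 - h)) / (2.0 * h * k0)

sigma_f, sigma_c = 1.0, 0.5  # hypothetical one-group cross sections

S_f = rel_sensitivity(lambda s: k_inf(s, sigma_c), sigma_f)  # ~ +1/3
S_c = rel_sensitivity(lambda s: k_inf(sigma_f, s), sigma_c)  # ~ -1/3
```

For this toy model the analytic values are S_f = σc/(σf+σc) and S_c = -σc/(σf+σc), which the finite-difference estimates reproduce; production tools such as TSUNAMI compute these coefficients with adjoint-weighted methods rather than direct perturbation.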
Focus control enhancement and on-product focus response analysis methodology
NASA Astrophysics Data System (ADS)
Kim, Young Ki; Chen, Yen-Jen; Hao, Xueli; Samudrala, Pavan; Gomez, Juan-Manuel; Mahoney, Mark O.; Kamalizadeh, Ferhad; Hanson, Justin K.; Lee, Shawn; Tian, Ye
2016-03-01
With decreasing critical depth of focus (CDOF) for 20/14 nm technology and beyond, focus errors are becoming increasingly critical for on-product performance. Current on-product focus control techniques in high-volume manufacturing are limited; with existing methods it is difficult to define a measurable focus error and to optimize the focus response on product, owing to the lack of credible focus measurement methodologies. In addition to developments in scanner imaging and focus control capability and general tool stability maintenance, on-product focus control improvements are also required to meet on-product imaging specifications. In this paper, we discuss focus monitoring, wafer (edge) fingerprint correction, and on-product focus budget analysis using a diffraction-based focus (DBF) measurement methodology. Several examples are presented showing improved focus response and control on product wafers. A method is also discussed for an on-product focus interlock automation system in a high-volume manufacturing (HVM) environment.
Greene, Jacob; Louis, Julien; Korostynska, Olga; Mason, Alex
2017-02-23
Muscle glycogen levels have a profound impact on an athlete's sporting performance, thus their measurement is vital. Carbohydrate manipulation is a fundamental component of an athlete's lifestyle and a critical part of elite performance, since it can provide necessary training adaptations. This paper provides a critical review of the current invasive and non-invasive methods for measuring skeletal muscle glycogen levels. These include the gold-standard muscle biopsy, histochemical analysis, magnetic resonance spectroscopy, and musculoskeletal high-frequency ultrasound, as well as the future application of electromagnetic sensors for portable, non-invasive quantification of muscle glycogen. This paper will be of interest to researchers who wish to understand the current and most appropriate techniques for measuring skeletal muscle glycogen. This will have applications both in the lab and in the field by improving the accuracy of research protocols and the tracking of physiological adaptations to exercise.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baer, Donald R.; Shutthanandan, Vaithiyalingam
Nano-sized objects are increasingly important as biomaterials and their surfaces play critical roles in determining their beneficial or deleterious behaviors in biological systems. Important characteristics of nanomaterials that impact their application in many areas are described with a strong focus on the importance of particle surfaces and surface characterization. Understanding aspects of the inherent nature of nano-objects and the important role that surfaces play in these applications is a universal need for any research or product development using such materials in biological applications. The role of surface analysis methods in collecting critical information about the nature of particle surfaces and physicochemical properties of nano-objects is described along with the importance of including sample history and analysis results in a record of provenance information regarding specific batches of nano-objects.
Mathematics is always invisible, Professor Dowling
NASA Astrophysics Data System (ADS)
Cable, John
2015-09-01
This article provides a critical evaluation of a technique of analysis, the Social Activity Method, recently offered by Dowling (2013) as a `gift' to mathematics education. The method is found to be inadequate, firstly, because it employs a dichotomy (between `expression' and `content') instead of a finer analysis (into symbols, concepts and setting or phenomena), and, secondly, because the distinction between `public' and `esoteric' mathematics, although interesting, is allowed to obscure the structure of the mathematics itself. There is also criticism of what Dowling calls the `myth of participation', which denies the intimate links between mathematics and the rest of the universe that lie at the heart of mathematical pedagogy. Behind all this lies Dowling's `essentially linguistic' conception of mathematics, which is criticised on the dual ground that it ignores the chastening experience of formalism in mathematical philosophy and that linguistics itself has taken a wrong turn and ignores lessons that might be learnt from mathematics education.
Critical consciousness: current status and future directions.
Watts, Roderick J; Diemer, Matthew A; Voight, Adam M
2011-01-01
In this chapter, the authors consider Paulo Freire's construct of critical consciousness (CC) and why it deserves more attention in research and discourse on youth political and civic development. His approach to education and similar ideas by other scholars of liberation aims to foster a critical analysis of society--and one's status within it--using egalitarian, empowering, and interactive methods. The aim is social change as well as learning, which makes these ideas especially relevant to the structural injustice faced by marginalized youth. From their review of these ideas, the authors derive three core CC components: critical reflection, political efficacy, and critical action. They highlight promising research related to these constructs and innovative applied work including youth action-research methodology. Their conclusion offers ideas for closing some of the critical gaps in CC theory and research. Copyright © 2011 Wiley Periodicals, Inc., A Wiley Company.
Effect of load eccentricity on the buckling of thin-walled laminated C-columns
NASA Astrophysics Data System (ADS)
Wysmulski, Pawel; Teter, Andrzej; Debski, Hubert
2018-01-01
The study investigates the behaviour of short, thin-walled laminated C-columns under eccentric compression. The tested columns are simply supported. The effect of load inaccuracy on the critical and post-critical (local buckling) states is examined. A numerical analysis by the finite element method and experimental tests on a test stand are performed. The samples were produced from a carbon-epoxy prepreg by the autoclave technique. The experimental tests assume compressive loads 1.5 times higher than the theoretical critical force. Numerical modelling is performed using the commercial software package ABAQUS®. The critical load is determined by solving an eigenvalue problem using the Subspace algorithm. The experimental critical loads are determined from post-buckling paths. The numerical and experimental results show high agreement, demonstrating a significant effect of load inaccuracy on the critical load corresponding to the column's local buckling.
Computer program for preliminary design analysis of axial-flow turbines
NASA Technical Reports Server (NTRS)
Glassman, A. J.
1972-01-01
The program method is based on a mean-diameter flow analysis. Input design requirements include power or pressure ratio, flow, temperature, pressure, and speed. Turbine designs are generated for any specified number of stages and for any of three types of velocity diagrams (symmetrical, zero exit swirl, or impulse). Exit turning vanes can be included in the design. Program output includes inlet and exit annulus dimensions, exit temperature and pressure, total and static efficiencies, blading angles, and last-stage critical velocity ratios. The report presents the analysis method, a description of input and output with sample cases, and the program listing.
Boussès, Christine; Ferey, Ludivine; Vedrines, Elodie; Gaudin, Karen
2015-11-10
An innovative combination of green chemistry and a quality by design (QbD) approach is presented through the development of a UHPLC method for the analysis of the main degradation products of dextromethorphan hydrobromide. The QbD strategy was integrated into the field of green analytical chemistry to improve method understanding while assuring quality and minimizing environmental impact and analyst exposure. This analytical method was thoroughly evaluated by applying risk assessment and multivariate analysis tools. After a scouting phase aimed at selecting a suitable stationary phase and an organic solvent in accordance with green chemistry principles, quality risk assessment tools were applied to determine the critical process parameters (CPPs). The effects of the CPPs on critical quality attributes (CQAs), i.e., resolutions, efficiencies, and solvent consumption, were further evaluated by means of a screening design. A response surface methodology was then carried out to model the CQAs as functions of the selected CPPs, and the optimal separation conditions were determined through a desirability analysis. The resulting contour plots made it possible to establish the design space (DS) (method operable design region) in which all CQAs fulfilled the requirements. An experimental validation of the DS proved that quality within the DS was guaranteed; therefore no further robustness study was required before validation. Finally, this UHPLC method was validated using the concept of total error and was used to analyze a pharmaceutical drug product. Copyright © 2015 Elsevier B.V. All rights reserved.
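The desirability analysis step can be sketched as follows; the Derringer-Suich "larger is better" transform and geometric-mean aggregation are standard choices, while the CQA values, acceptance limits, and helper names below are hypothetical:

```python
import numpy as np

def desirability_larger_is_better(y, low, high):
    """Derringer-Suich desirability for a larger-is-better response:
    0 below `low`, 1 above `high`, linear in between."""
    return float(np.clip((y - low) / (high - low), 0.0, 1.0))

def overall_desirability(d_values):
    """Overall desirability: geometric mean of the individual desirabilities."""
    d = np.asarray(d_values, dtype=float)
    return float(d.prod() ** (1.0 / len(d)))

# Hypothetical CQA values and acceptance limits at one candidate operating point
d_res = desirability_larger_is_better(2.4, low=1.5, high=3.0)            # resolution
d_eff = desirability_larger_is_better(9000.0, low=5000.0, high=10000.0)  # plate count
d_solv = desirability_larger_is_better(-12.0, low=-20.0, high=-8.0)      # negated solvent use

D = overall_desirability([d_res, d_eff, d_solv])
```

Solvent consumption is negated so that lower consumption maps to higher desirability; the operating point maximizing D over the modelled response surfaces locates the optimum within the design space.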
Analysis on burnup step effect for evaluating reactor criticality and fuel breeding ratio
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saputra, Geby; Purnama, Aditya Rizki; Permana, Sidik
The criticality of a reactor is one of the important factors in evaluating reactor operation, and the nuclear fuel breeding ratio is another factor that indicates nuclear fuel sustainability. This study analyzes the effect of the burnup step and the cycle operation step on the evaluated criticality of the reactor as well as on the nuclear fuel breeding performance, or breeding ratio (BR). The burnup step is performed on a per-day basis and is varied from 10 days up to 800 days, and the cycle operation from 1 cycle up to 8 cycles. In addition, calculation efficiency based on the variation of the computer processors used to run the analysis (time efficiency of the calculation) has also been investigated. An optimization method for reactor design analysis, using a large fast breeder reactor as the reference case, was performed by adopting the established reactor design code JOINT-FR. The results show that the criticality becomes higher for smaller burnup steps (days), while the breeding ratio becomes lower for smaller burnup steps. Some nuclides contribute to better criticality at smaller burnup steps because of their individual half-lives. The calculation time for different burnup steps correlates with the time required for more detailed step calculations, although the computation time is not directly proportional to the number of divisions of the burnup time step.
NASA Astrophysics Data System (ADS)
Mandayam Doddamane, Prabha
2011-12-01
Considerable research, policy, and programmatic efforts have been dedicated to addressing the participation of particular populations in STEM for decades. Each of these efforts claims equity-related goals; yet, they heavily frame the problem, through pervasive STEM pipeline model discourse, in terms of national needs, workforce supply, and competitiveness. This particular framing of the problem may, indeed, be counter to equity goals, especially when paired with policy that largely relies on statistical significance and broad aggregation of data over exploring the identities and experiences of the populations targeted for equitable outcomes in that policy. In this study, I used the mixed-methods approach of critical discourse and critical quantitative analyses to understand how the pipeline model ideology has become embedded within academic discourse, research, and data surrounding STEM education and work and to provide alternatives for quantitative analysis. Using critical theory as a lens, I first conducted a critical discourse analysis of contemporary STEM workforce studies with a particular eye to pipeline ideology. Next, I used that analysis to inform logistic regression analyses of the 2006 SESTAT data. This quantitative analysis compared and contrasted different ways of thinking about identity and retention. Overall, the findings of this study show that many subjective choices are made in the construction of the large-scale datasets used to inform much national science and engineering policy and that these choices greatly influence likelihood of retention outcomes.
The Limits of Functional Analysis in the Study of Mass Communication.
ERIC Educational Resources Information Center
Anderson, James A.; Meyer, Timothy P.
The fundamental limits of the functional approach to the study of mass communication are embodied in two of its criticisms. The first weakness is in its logical structure and the second involves the limits that are set by known methods. Functional analysis has difficulties as a meaningful research perspective because the process of mass…
Hierarchical models and bayesian analysis of bird survey information
John R. Sauer; William A. Link; J. Andrew Royle
2005-01-01
Summary of bird survey information is a critical component of conservation activities, but often our summaries rely on statistical methods that do not accommodate the limitations of the information. Prioritization of species requires ranking and analysis of species by magnitude of population trend, but often magnitude of trend is a misleading measure of actual decline...
A Critical Analysis of the CELF-4: The Responsible Clinician's Guide to the CELF-4
ERIC Educational Resources Information Center
Crowley, Catherine Jane
2010-01-01
Purpose: To provide an analysis of the accuracy and effectiveness of using the Clinical Evaluation of Language Fundamentals-Fourth Edition (CELF-4) to identify students as having language-based disabilities. Method: The CELF-4 is analyzed within the current standards set by the federal law on special education, the available research, preferred…
NASA Technical Reports Server (NTRS)
Cruse, T. A.
1987-01-01
The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.
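A minimal Monte Carlo sketch illustrates a cumulative probability of exceedance with confidence bounds; the response function and input distributions below are invented stand-ins, not the PFEM/PAAM/PBEM formulations of the PSAM packages:

```python
import math, random

# Propagate random load and geometry through a toy response function and
# estimate the probability that a stress threshold is exceeded.
random.seed(1)

def max_stress(load, area):
    return load / area  # trivial stand-in for a structural solver

samples = []
for _ in range(20000):
    load = random.gauss(100.0, 10.0)   # random load
    area = random.gauss(2.0, 0.1)      # random geometry (tolerance)
    samples.append(max_stress(load, area))

threshold = 60.0
exceed = sum(s > threshold for s in samples)
p_hat = exceed / len(samples)

# Normal-approximation 95% confidence bounds on the exceedance probability.
half = 1.96 * math.sqrt(p_hat * (1 - p_hat) / len(samples))
lo, hi = p_hat - half, p_hat + half
assert 0.0 <= lo < p_hat < hi <= 1.0
```

Fast probability integration methods exist precisely because brute-force sampling like this becomes expensive when each response evaluation is a full finite-element solve.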
NASA Technical Reports Server (NTRS)
Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.
1988-01-01
The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.
ERIC Educational Resources Information Center
Vettiveloo, Roshini
2008-01-01
The analysis was carried out as part of a master's thesis and it aimed to analyse the extent to which the Montessori educational philosophy and teaching method incorporated inclusive educational qualities. The Montessori Method was first developed for children who were disadvantaged and considered "idiots", in the slums of Italy's San…
Probabilistic structural analysis methods for improving Space Shuttle engine reliability
NASA Technical Reports Server (NTRS)
Boyce, L.
1989-01-01
Probabilistic structural analysis methods are particularly useful in the design and analysis of critical structural components and systems that operate in very severe and uncertain environments. These methods have recently found application in space propulsion systems to improve the structural reliability of Space Shuttle Main Engine (SSME) components. A computer program, NESSUS, based on a deterministic finite-element program and a method of probabilistic analysis (fast probability integration), provides probabilistic structural analysis for selected SSME components. While remaining computationally efficient, it accommodates both correlated and nonnormal random variables as well as an implicit functional relationship between independent and dependent variables. The program is used to determine the response of a nickel-based superalloy SSME turbopump blade. Results include blade tip displacement statistics due to variability in blade thickness, modulus of elasticity, Poisson's ratio, and density. Modulus of elasticity contributed significantly to blade tip variability, while Poisson's ratio did not. Thus, a rational method for choosing which parameters to model as random is provided.
Analysis of Critical Mass in Threshold Model of Diffusion
NASA Astrophysics Data System (ADS)
Kim, Jeehong; Hur, Wonchang; Kang, Suk-Ho
2012-04-01
Why does diffusion sometimes show cascade phenomena but at other times is impeded? In addressing this question, we considered a threshold model of diffusion, focusing on the formation of a critical mass, which enables diffusion to be self-sustaining. Performing an agent-based simulation, we found that the diffusion model produces only two outcomes: Almost perfect adoption or relatively few adoptions. In order to explain the difference, we considered the various properties of network structures and found that the manner in which thresholds are arrayed over a network is the most critical factor determining the size of a cascade. On the basis of the results, we derived a threshold arrangement method effective for generation of a critical mass and calculated the size required for perfect adoption.
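The threshold mechanism can be sketched as a small agent-based simulation; the network model, threshold range, and seed size below are illustrative assumptions, not the paper's exact setup or its arrangement method:

```python
import random

# Threshold model of diffusion on a random graph: a node adopts once the
# fraction of its adopting neighbors reaches its personal threshold.
random.seed(2)

N, K = 500, 6           # nodes and target average degree
nodes = list(range(N))
neigh = {i: set() for i in nodes}
while sum(len(s) for s in neigh.values()) < N * K:
    a, b = random.sample(nodes, 2)
    neigh[a].add(b); neigh[b].add(a)

thresholds = {i: random.uniform(0.05, 0.4) for i in nodes}
adopted = set(random.sample(nodes, 10))   # initial seed set (critical mass?)

changed = True
while changed:          # sweep repeatedly until no node changes (fixed point)
    changed = False
    for i in nodes:
        if i in adopted or not neigh[i]:
            continue
        frac = sum(j in adopted for j in neigh[i]) / len(neigh[i])
        if frac >= thresholds[i]:
            adopted.add(i)
            changed = True

assert 10 <= len(adopted) <= N
```

Re-running with different seeds or threshold arrangements tends to reproduce the paper's bimodal outcome: the cascade either stalls near the seed set or sweeps most of the network.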
GOMA: functional enrichment analysis tool based on GO modules
Huang, Qiang; Wu, Ling-Yun; Wang, Yong; Zhang, Xiang-Sun
2013-01-01
Analyzing the function of gene sets is a critical step in interpreting the results of high-throughput experiments in systems biology. A variety of enrichment analysis tools have been developed in recent years, but most output a long list of significantly enriched terms that are often redundant, making it difficult to extract the most meaningful functions. In this paper, we present GOMA, a novel enrichment analysis method based on the new concept of enriched functional Gene Ontology (GO) modules. With this method, we systematically revealed functional GO modules, i.e., groups of functionally similar GO terms, via an optimization model and then ranked them by enrichment scores. Our new method simplifies enrichment analysis results by reducing redundancy, thereby preventing inconsistent enrichment results among functionally similar terms and providing more biologically meaningful results. PMID:23237213
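Enrichment analysis of this kind is typically built on the hypergeometric test; the sketch below shows that building block only (GOMA's GO-module detection and ranking are not reproduced here), with invented counts:

```python
from math import comb

# One-sided hypergeometric tail: the standard per-term enrichment p-value.
def enrichment_pvalue(N, K, n, k):
    """P(X >= k) when drawing n genes from a universe of N genes,
    K of which carry the GO term of interest."""
    total = comb(N, n)
    tail = 0
    for i in range(k, min(K, n) + 1):
        tail += comb(K, i) * comb(N - K, n - i)
    return tail / total

# A gene set of 50 containing 10 of the 100 term-annotated genes in a
# 10,000-gene universe is strongly enriched:
p = enrichment_pvalue(10000, 100, 50, 10)
assert p < 1e-6
```

Tools then apply multiple-testing correction across thousands of terms; the redundancy GOMA targets arises because closely related GO terms share annotated genes and therefore tend to be enriched together.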
Lean Keng, Soon; AlQudah, Hani Nawaf Ibrahim
2017-02-01
To raise awareness of critical care nurses' cognitive bias in decision-making, its relationship with leadership styles and its impact on care delivery. The relationship between critical care nurses' decision-making and leadership styles in hospitals has been widely studied, but the influence of cognitive bias on decision-making and leadership styles in critical care environments remains poorly understood, particularly in Jordan. The design was a two-phase mixed-methods sequential explanatory design with grounded theory, set in the critical care unit of Prince Hamza Hospital, Jordan. Sampling was by convenience in Phase 1 (quantitative, n = 96) and purposive in Phase 2 (qualitative, n = 20). A pilot-tested quantitative survey of 96 critical care nurses was conducted in 2012, followed by qualitative in-depth interviews, informed by the quantitative results, with 20 critical care nurses in 2013. Quantitative data were analysed with descriptive statistics and simple linear regression; qualitative data were analysed thematically by constant comparison. Quantitative results showed correlations between rationality and cognitive bias, rationality and task-oriented leadership styles, cognitive bias and democratic communication styles, and cognitive bias and task-oriented leadership styles. Qualitative results identified 'being competent', 'organizational structures', 'feeling self-confident' and 'being supported' in the work environment as key factors influencing critical care nurses' cognitive bias in decision-making and leadership styles, with a two-way (strengthening and weakening) impact of cognitive bias in decision-making and leadership styles on critical care nurses' practice performance. There is a need to heighten critical care nurses' consciousness of cognitive bias in decision-making and leadership styles and its impact, and to develop organization-level strategies to increase non-biased decision-making. © 2016 John Wiley & Sons Ltd.
[Recent advances in sample preparation methods of plant hormones].
Wu, Qian; Wang, Lus; Wu, Dapeng; Duan, Chunfeng; Guan, Yafeng
2014-04-01
Plant hormones are a group of naturally occurring trace substances which play a crucial role in controlling plant development, growth and environmental response. With the development of chromatography and mass spectrometry techniques, chromatographic analysis has become a widely used approach for plant hormone analysis. Among the steps of chromatographic analysis, sample preparation is undoubtedly the most vital one; thus, a highly selective and efficient sample preparation method is critical for accurate identification and quantification of phytohormones. For the three major kinds of plant hormones, namely acidic and basic plant hormones, brassinosteroids, and plant polypeptides, the sample preparation methods, especially recently developed ones, are reviewed in sequence. The review covers novel methods, devices, extraction materials and derivatization reagents for sample preparation in phytohormone analysis, including some related work of our group. Finally, future developments in this field are discussed.
FRA methods and approaches in environmental analysis and critical NEPA procedures
DOT National Transportation Integrated Search
2017-01-27
The Federal Railroad Administration hosted the third FRA Rail Program Delivery Meeting, a 2.5-day conference for grantees, railroad representatives, federal oversight contractors, and staff, focused on rail project implementation. Below is a list of...
The role of student’s critical asking question in developing student’s critical thinking skills
NASA Astrophysics Data System (ADS)
Santoso, T.; Yuanita, L.; Erman, E.
2018-01-01
Questioning means thinking, and thinking is manifested in the form of questions. Little research, if any, has studied the relationship between questioning and students' critical thinking skills. The aim of this study is to examine how students' questioning skills correlate with their critical thinking skills in learning chemistry. The research design used was a one-group pretest-posttest design. The participants were 94 students, all in their last semester, from Chemistry Education at Tadulako University. A pre-test was administered to check participants' ability to ask critical questions and their critical thinking skills in learning chemistry. Then, the students were taught using a questioning technique. After the lesson, a post-test was given to evaluate their progress. The data obtained were analyzed using a paired-samples t-test and correlation methods. The results show that question level plays an important role in critical thinking skills, in particular questions at the levels of prediction, analysis, evaluation, and inference.
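The paired-samples t statistic used in such pre/post comparisons can be computed directly; the scores below are invented for illustration, not the study's data:

```python
import math

# Paired-samples t statistic for pre/post scores of the same participants.
def paired_t(pre, post):
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)   # t with n-1 degrees of freedom

pre  = [55, 60, 48, 70, 65, 58, 62, 50]   # hypothetical pre-test scores
post = [68, 72, 60, 78, 70, 66, 75, 61]   # hypothetical post-test scores
t = paired_t(pre, post)
assert t > 0   # post-test scores exceed pre-test scores on average
```

The t value is then compared against the t distribution with n-1 degrees of freedom to obtain a p-value.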
Aluminum Data Measurements and Evaluation for Criticality Safety Applications
NASA Astrophysics Data System (ADS)
Leal, L. C.; Guber, K. H.; Spencer, R. R.; Derrien, H.; Wright, R. Q.
2002-12-01
The Defense Nuclear Facility Safety Board (DNFSB) Recommendation 93-2 motivated the US Department of Energy (DOE) to develop a comprehensive criticality safety program to maintain and to predict the criticality of systems throughout the DOE complex. To implement the response to the DNFSB Recommendation 93-2, a Nuclear Criticality Safety Program (NCSP) was created including the following tasks: Critical Experiments, Criticality Benchmarks, Training, Analytical Methods, and Nuclear Data. The Nuclear Data portion of the NCSP consists of a variety of differential measurements performed at the Oak Ridge Electron Linear Accelerator (ORELA) at the Oak Ridge National Laboratory (ORNL), data analysis and evaluation using the generalized least-squares fitting code SAMMY in the resolved, unresolved, and high energy ranges, and the development and benchmark testing of complete evaluations for a nuclide for inclusion into the Evaluated Nuclear Data File (ENDF/B). This paper outlines the work performed at ORNL to measure, evaluate, and test the nuclear data for aluminum for applications in criticality safety problems.
Intelligent neural network and fuzzy logic control of industrial and power systems
NASA Astrophysics Data System (ADS)
Kuljaca, Ognjen
The main role played by neural network and fuzzy logic intelligent control algorithms today is to identify and compensate for unknown nonlinear system dynamics. A number of methods have been developed, but stability analysis of neural network and fuzzy control systems has often not been provided. This work addresses that problem for several algorithms. More complicated control algorithms, including backstepping and adaptive critics, are designed. Nonlinear fuzzy control with nonadaptive fuzzy controllers is also analyzed, and an experimental method for determining the describing function of a SISO fuzzy controller is given. An adaptive neural network tracking controller for an autonomous underwater vehicle is analyzed with a novel stability proof, and the implementation of a backstepping neural network controller for coupled motor drives is described. Analysis and synthesis of adaptive critic neural network control are also provided, with novel tuning laws for systems with an action-generating neural network and an adaptive fuzzy critic. Stability proofs are derived for all of these control methods, and it is shown how the algorithms and approaches can be used in practical engineering control. Adaptive fuzzy logic control is analyzed, and a simulation study examines the behavior of the adaptive fuzzy system under different environment changes; a novel stability proof for adaptive fuzzy logic systems is given. In addition, an adaptive elastic fuzzy logic control architecture using a novel membership function is described, analyzed, and proved stable, and is compared with adaptive nonelastic fuzzy logic control. The work described in this dissertation serves as a foundation on which analysis of particular representative industrial systems will be conducted.
Also, it gives a good starting point for analysis of learning abilities of adaptive and neural network control systems, as well as for the analysis of the different algorithms such as elastic fuzzy systems.
2013-01-01
Background This study aimed to perform a structural analysis of determinants of risk of critical incidents in care for women with a low risk profile at the start of pregnancy, with a view to improving patient safety. Methods We included 71 critical incidents in primary midwifery care and subsequent hospital care in case of referral after 36 weeks of pregnancy that were related to substandard care and for that reason were reported to the Health Care Inspectorate in The Netherlands in 36 months (n = 357). We performed a case-by-case analysis, using a previously validated instrument which covered five broad domains: healthcare organization, communication between healthcare providers, patient risk factors, clinical management, and clinical outcomes. Results Determinants that were associated with risk concerned healthcare organization (n = 20 incidents), communication about treatment procedures (n = 39), referral processes (n = 19), risk assessment by telephone triage (n = 10), and clinical management in an out of hours setting (n = 19). The 71 critical incidents included three cases of maternal death, eight cases of severe maternal morbidity, 42 perinatal deaths and 12 critical incidents with severe morbidity for the child. Suboptimal prenatal risk assessment, a delay in availability of health care providers in urgent situations, miscommunication about treatment between care providers, and miscommunication with patients in situations with a language barrier were associated with safety risks. Conclusions Systematic analysis of critical incidents improves insight into determinants of safety risk. The wide variety of determinants of risk of critical incidents implies that there is no single intervention to improve patient safety in the care for pregnant women with initially a low risk profile. PMID:24286376
Marshall, Najja; Timme, Nicholas M.; Bennett, Nicholas; Ripp, Monica; Lautzenhiser, Edward; Beggs, John M.
2016-01-01
Neural systems include interactions that occur across many scales. Two divergent methods for characterizing such interactions have drawn on the physical analysis of critical phenomena and the mathematical study of information. Inferring criticality in neural systems has traditionally rested on fitting power laws to the property distributions of “neural avalanches” (contiguous bursts of activity), but the fractal nature of avalanche shapes has recently emerged as another signature of criticality. On the other hand, neural complexity, an information theoretic measure, has been used to capture the interplay between the functional localization of brain regions and their integration for higher cognitive functions. Unfortunately, treatments of all three methods—power-law fitting, avalanche shape collapse, and neural complexity—have suffered from shortcomings. Empirical data often contain biases that introduce deviations from true power law in the tail and head of the distribution, but deviations in the tail have often been unconsidered; avalanche shape collapse has required manual parameter tuning; and the estimation of neural complexity has relied on small data sets or statistical assumptions for the sake of computational efficiency. In this paper we present technical advancements in the analysis of criticality and complexity in neural systems. We use maximum-likelihood estimation to automatically fit power laws with left and right cutoffs, present the first automated shape collapse algorithm, and describe new techniques to account for large numbers of neural variables and small data sets in the calculation of neural complexity. In order to facilitate future research in criticality and complexity, we have made the software utilized in this analysis freely available online in the MATLAB NCC (Neural Complexity and Criticality) Toolbox. PMID:27445842
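Maximum-likelihood power-law fitting, the first of the three analyses, can be sketched for the simplest case of a continuous distribution with only a lower cutoff; the left/right cutoff handling and discrete fits of the NCC Toolbox are omitted, and the data are synthetic:

```python
import math, random

# MLE for the exponent of a continuous power law above a lower cutoff
# x_min (the Hill estimator).
def fit_alpha(xs, x_min):
    tail = [x for x in xs if x >= x_min]
    return 1.0 + len(tail) / sum(math.log(x / x_min) for x in tail)

# Synthetic power-law data with alpha = 2.5, generated by inverse-CDF
# sampling: P(X > x) = (x / x_min)^-(alpha - 1).
random.seed(3)
alpha, x_min = 2.5, 1.0
xs = [x_min * (1 - random.random()) ** (-1.0 / (alpha - 1.0))
      for _ in range(50000)]

est = fit_alpha(xs, x_min)
assert abs(est - alpha) < 0.05   # estimator recovers the true exponent
```

Real avalanche-size data deviate from a pure power law in the head and tail, which is exactly why the paper fits with explicit left and right cutoffs rather than this idealized form.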
Characterization of the Space Shuttle Ascent Debris using CFD Methods
NASA Technical Reports Server (NTRS)
Murman, Scott M.; Aftosmis, Michael J.; Rogers, Stuart E.
2005-01-01
After video analysis of space shuttle flight STS-107's ascent showed that an object shed from the bipod-ramp region impacted the left wing, a transport analysis was initiated to determine a credible flight path and impact velocity for the piece of debris. This debris transport analysis was performed both during orbit, and after the subsequent re-entry accident. The analysis provided an accurate prediction of the velocity a large piece of foam bipod ramp would have as it impacted the wing leading edge. This prediction was corroborated by video analysis and fully-coupled CFD/six degree of freedom (DOF) simulations. While the prediction of impact velocity was accurate enough to predict critical damage in this case, one of the recommendations of the Columbia Accident Investigation Board (CAIB) for return-to-flight (RTF) was to analyze the complete debris environment experienced by the shuttle stack on ascent. This includes categorizing all possible debris sources, their probable geometric and aerodynamic characteristics, and their potential for damage. This paper is chiefly concerned with predicting the aerodynamic characteristics of a variety of potential debris sources (insulating foam and cork, nose-cone ablator, ice, ...) for the shuttle ascent configuration using CFD methods. These aerodynamic characteristics are used in the debris transport analysis to predict flight path, impact velocity and angle, and provide statistical variation to perform risk analyses where appropriate. The debris aerodynamic characteristics are difficult to determine using traditional methods, such as static or dynamic test data, due to the scaling requirements of simulating a typical debris event. The use of CFD methods has been a critical element for building confidence in the accuracy of the debris transport code by bridging the gap between existing aerodynamic data and the dynamics of full-scale, in-flight events.
Identifying critical transitions and their leading biomolecular networks in complex diseases.
Liu, Rui; Li, Meiyi; Liu, Zhi-Ping; Wu, Jiarui; Chen, Luonan; Aihara, Kazuyuki
2012-01-01
Identifying a critical transition and its leading biomolecular network during the initiation and progression of a complex disease is a challenging task, but holds the key to early diagnosis and further elucidation of the essential mechanisms of disease deterioration at the network level. In this study, we developed a novel computational method for identifying early-warning signals of the critical transition and its leading network during a disease progression, based on high-throughput data using a small number of samples. The leading network makes the first move from the normal state toward the disease state during a transition, and thus is causally related with disease-driving genes or networks. Specifically, we first define a state-transition-based local network entropy (SNE), and prove that SNE can serve as a general early-warning indicator of any imminent transitions, regardless of specific differences among systems. The effectiveness of this method was validated by functional analysis and experimental data.
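A generic early-warning signal can be illustrated by rising fluctuations near a transition (critical slowing down); note this is a standard textbook indicator on a toy dynamical model, not the SNE measure defined in the paper:

```python
import random, statistics

# AR(1)-like relaxation x' = x - r*x + noise; a weaker recovery rate r
# (closer to the transition) yields larger stationary fluctuations.
random.seed(5)

def simulate(recovery_rate, steps=5000):
    x, out = 0.0, []
    for _ in range(steps):
        x += -recovery_rate * x + random.gauss(0, 0.1)
        out.append(x)
    return out

far = statistics.variance(simulate(0.5))    # far from the transition
near = statistics.variance(simulate(0.05))  # near the transition
assert near > far   # variance rises as the transition is approached
```

The paper's contribution is to replace such node-level statistics with a network entropy whose increase flags the leading network before the transition occurs.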
Non-Fermi-liquid superconductivity: Eliashberg approach versus the renormalization group
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Huajia; Raghu, Srinivas; Torroba, Gonzalo
Here, we address the problem of superconductivity for non-Fermi liquids using two commonly adopted, yet apparently distinct, methods: (1) the renormalization group (RG) and (2) Eliashberg theory. The extent to which both methods yield consistent solutions for the low-energy behavior of quantum metals has remained unclear. We show that the perturbative RG beta function for the 4-Fermi coupling can be explicitly derived from the linearized Eliashberg equations, under the assumption that quantum corrections are approximately local across energy scales. We apply our analysis to the test case of phonon-mediated superconductivity and show the consistency of both the Eliashberg and RG treatments. We next study superconductivity near a class of quantum critical points and find a transition between superconductivity and a “naked” metallic quantum critical point with finite, critical BCS couplings. We speculate on the applications of our theory to the phenomenology of unconventional metals.
Non-Fermi-liquid superconductivity: Eliashberg approach versus the renormalization group
Wang, Huajia; Raghu, Srinivas; Torroba, Gonzalo
2017-04-15
Here, we address the problem of superconductivity for non-Fermi liquids using two commonly adopted, yet apparently distinct, methods: (1) the renormalization group (RG) and (2) Eliashberg theory. The extent to which both methods yield consistent solutions for the low-energy behavior of quantum metals has remained unclear. We show that the perturbative RG beta function for the 4-Fermi coupling can be explicitly derived from the linearized Eliashberg equations, under the assumption that quantum corrections are approximately local across energy scales. We apply our analysis to the test case of phonon-mediated superconductivity and show the consistency of both the Eliashberg and RG treatments. We next study superconductivity near a class of quantum critical points and find a transition between superconductivity and a “naked” metallic quantum critical point with finite, critical BCS couplings. We speculate on the applications of our theory to the phenomenology of unconventional metals.
FEM Techniques for High Stress Detection in Accelerated Fatigue Simulation
NASA Astrophysics Data System (ADS)
Veltri, M.
2016-09-01
This work presents the theory and a numerical validation study in support of a novel method for a priori identification of fatigue-critical regions, with the aim of accelerating durability design in large FEM problems. The investigation is placed in the context of modern full-body structural durability analysis, where a computationally intensive dynamic solution could be required to identify areas with potential for fatigue damage initiation. Early detection of fatigue-critical areas can drive a simplification of the problem size, leading to appreciable improvements in solution time and model handling while allowing the critical areas to be processed in higher detail. The proposed technique is applied to a real-life industrial case in a comparative assessment with established practices. Synthetic damage prediction quantification and visualization techniques allow for a quick and efficient comparison between methods, outlining potential application benefits and boundaries.
Comprehensive comparative analysis of 5'-end RNA-sequencing methods.
Adiconis, Xian; Haber, Adam L; Simmons, Sean K; Levy Moonshine, Ami; Ji, Zhe; Busby, Michele A; Shi, Xi; Jacques, Justin; Lancaster, Madeline A; Pan, Jen Q; Regev, Aviv; Levin, Joshua Z
2018-06-04
Specialized RNA-seq methods are required to identify the 5' ends of transcripts, which are critical for studies of gene regulation, but these methods have not been systematically benchmarked. We directly compared six such methods, including the performance of five methods on a single human cellular RNA sample and a new spike-in RNA assay that helps circumvent challenges resulting from uncertainties in annotation and RNA processing. We found that the 'cap analysis of gene expression' (CAGE) method performed best for mRNA and that most of its unannotated peaks were supported by evidence from other genomic methods. We applied CAGE to eight brain-related samples and determined sample-specific transcription start site (TSS) usage, as well as a transcriptome-wide shift in TSS usage between fetal and adult brain.
Simonoska Crcarevska, Maja; Dimitrovska, Aneta; Sibinovska, Nadica; Mladenovska, Kristina; Slavevska Raicki, Renata; Glavas Dodov, Marija
2015-07-15
Microsponges drug delivery system (MDDC) was prepared by double emulsion-solvent-diffusion technique using rotor-stator homogenization. Quality by design (QbD) concept was implemented for the development of MDDC with potential to be incorporated into semisolid dosage form (gel). Quality target product profile (QTPP) and critical quality attributes (CQA) were defined and identified, accordingly. Critical material attributes (CMA) and Critical process parameters (CPP) were identified using quality risk management (QRM) tool, failure mode, effects and criticality analysis (FMECA). CMA and CPP were identified based on results obtained from principal component analysis (PCA-X&Y) and partial least squares (PLS) statistical analysis along with literature data, product and process knowledge and understanding. FMECA identified amount of ethylcellulose, chitosan, acetone, dichloromethane, span 80, tween 80 and water ratio in primary/multiple emulsions as CMA and rotation speed and stirrer type used for organic solvent removal as CPP. The relationship between identified CPP and particle size as CQA was described in the design space using design of experiments - one-factor response surface method. Obtained results from statistically designed experiments enabled establishment of mathematical models and equations that were used for detailed characterization of influence of identified CPP upon MDDC particle size and particle size distribution and their subsequent optimization. Copyright © 2015 Elsevier B.V. All rights reserved.
CRITICAL CURVES AND CAUSTICS OF TRIPLE-LENS MODELS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daněk, Kamil; Heyrovský, David, E-mail: kamil.danek@utf.mff.cuni.cz, E-mail: heyrovsky@utf.mff.cuni.cz
2015-06-10
Among the 25 planetary systems detected up to now by gravitational microlensing, there are two cases of a star with two planets, and two cases of a binary star with a planet. Other, yet undetected types of triple lenses include triple stars or stars with a planet with a moon. The analysis and interpretation of such events is hindered by the lack of understanding of essential characteristics of triple lenses, such as their critical curves and caustics. We present here analytical and numerical methods for mapping the critical-curve topology and caustic cusp number in the parameter space of n-point-mass lenses. We apply the methods to the analysis of four symmetric triple-lens models, and obtain altogether 9 different critical-curve topologies and 32 caustic structures. While these results include various generic types, they represent just a subset of all possible triple-lens critical curves and caustics. Using the analyzed models, we demonstrate interesting features of triple lenses that do not occur in two-point-mass lenses. We show an example of a lens that cannot be described by the Chang–Refsdal model in the wide limit. In the close limit we demonstrate unusual structures of primary and secondary caustic loops, and explain the conditions for their occurrence. In the planetary limit we find that the presence of a planet may lead to a whole sequence of additional caustic metamorphoses. We show that a pair of planets may change the structure of the primary caustic even when placed far from their resonant position at the Einstein radius.
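Critical curves of point-mass lenses can be located numerically as the zero set of the lens-mapping Jacobian determinant; the sketch below uses a two-point-mass lens with arbitrary illustrative masses and positions (in Einstein-radius units; a triple lens would simply add a third mass):

```python
# Locate critical-curve points of a two-point-mass lens by finding sign
# changes of det J for the lens mapping beta = theta - alpha(theta),
# with deflection alpha = sum_i m_i (theta - theta_i) / |theta - theta_i|^2.
masses = [(0.7, (-0.5, 0.0)), (0.3, (0.5, 0.0))]  # (mass, position), illustrative

def det_jacobian(x, y):
    a11 = a22 = 1.0
    a12 = 0.0
    for m, (px, py) in masses:
        dx, dy = x - px, y - py
        r2 = dx * dx + dy * dy
        a11 -= m * (r2 - 2 * dx * dx) / r2 ** 2   # d(beta_x)/dx terms
        a22 -= m * (r2 - 2 * dy * dy) / r2 ** 2   # d(beta_y)/dy terms
        a12 -= m * (-2 * dx * dy) / r2 ** 2       # off-diagonal term
    return a11 * a22 - a12 * a12

# Scan a grid; sign changes of det J along x bracket the critical curve.
critical_pts = []
n, span = 400, 2.0
for j in range(n):
    y = -span + 2 * span * j / (n - 1)
    prev = None
    for i in range(n):
        x = -span + 2 * span * i / (n - 1)
        d = det_jacobian(x, y)
        if prev is not None and prev * d < 0:
            critical_pts.append((x, y))
        prev = d

assert len(critical_pts) > 0  # the binary lens has nonempty critical curves
```

Mapping each bracketed point through the lens equation would trace the corresponding caustics; classifying their topology as the lens parameters vary is the harder problem the paper addresses analytically.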
Grace, Pamela J; Perry, Donna J
2013-01-01
Philosophical inquiry remains critically important for nursing education, practice, and knowledge development. We propose a 3-level taxonomy of philosophical inquiry to guide nursing curricula and research development. Important background information about philosophy and the development of philosophical methods is given. Then philosophical inquiry is linked to the goals of nursing using our proposed taxonomy: level I-cultivating an attitude of "critical consciousness" related to all nursing situations and actions, level II-analysis and application of philosophical perspectives to nursing problems and level III-generating new knowledge for nursing purposes including new theories of practice and research.
Reliably detectable flaw size for NDE methods that use calibration
NASA Astrophysics Data System (ADS)
Koshti, Ajay M.
2017-04-01
Probability of detection (POD) analysis is used in assessing reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. In this paper, POD analysis is applied to an NDE method, such as eddy current testing, where calibration is used. NDE calibration standards have known-size artificial flaws, such as electro-discharge machined (EDM) notches and flat-bottom hole (FBH) reflectors, which are used to set instrument sensitivity for the detection of real flaws. Real flaws such as cracks and crack-like flaws are to be detected using these NDE methods. A reliably detectable crack size is required for safe-life analysis of fracture-critical parts. Therefore, it is important to correlate signal responses from real flaws with signal responses from the artificial flaws used in the calibration process to determine the reliably detectable flaw size.
Reliably Detectable Flaw Size for NDE Methods that Use Calibration
NASA Technical Reports Server (NTRS)
Koshti, Ajay M.
2017-01-01
Probability of detection (POD) analysis is used in assessing reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. In this paper, POD analysis is applied to an NDE method, such as eddy current testing, where calibration is used. NDE calibration standards have known-size artificial flaws, such as electro-discharge machined (EDM) notches and flat-bottom hole (FBH) reflectors, which are used to set instrument sensitivity for the detection of real flaws. Real flaws such as cracks and crack-like flaws are to be detected using these NDE methods. A reliably detectable crack size is required for safe-life analysis of fracture-critical parts. Therefore, it is important to correlate signal responses from real flaws with signal responses from the artificial flaws used in the calibration process to determine the reliably detectable flaw size.
Acevedo-Nuevo, M; González-Gil, M T; Solís-Muñoz, M; Láiz-Díez, N; Toraño-Olivera, M J; Carrasco-Rodríguez-Rey, L F; García-González, S; Velasco-Sanz, T R; Martínez-Álvarez, A; Martin-Rivera, B E
2016-01-01
To identify nursing experiences of physical restraint management in Critical Care Units, and to analyse similarities and differences in those experiences according to the clinical context. A multicentre phenomenological study was carried out in 14 Critical Care Units in Madrid, classified according to physical restraint use: common/systematic use, lacking/personalised use, and mixed use. Five focus groups (23 participants selected by purposeful sampling) were convened, reaching data saturation. Data analysis followed thematic content analysis according to Colaizzi's method. Six main themes emerged: the meaning of physical restraint in Critical Care Units, safety (self-removal of vital devices), contributing factors, feelings, alternatives, and pending issues. Although some themes are common to the three Critical Care Unit types, differences in discourse were found with regard to indication, feelings, and the systematic use of pain and sedation measurement tools. In order to achieve a real reduction of physical restraint in Critical Care Units, a deep understanding of restraint use in the specific clinical context is necessary. As self-removal of vital devices emerges as the central concept, some interventions proposed in other settings may not be effective, and alternatives for critical care patients are required. The variations in discourse across the different Critical Care Unit types could highlight key items that determine the use of, and the different attitudes towards, physical restraint. Copyright © 2015 Elsevier España, S.L.U. y SEEIUC. All rights reserved.
Gleason, Brenda L; Gaebelein, Claude J; Grice, Gloria R; Crannage, Andrew J; Weck, Margaret A; Hurd, Peter; Walter, Brenda; Duncan, Wendy
2013-10-14
To determine the feasibility of using a validated set of assessment rubrics to assess students' critical-thinking and problem-solving abilities across a doctor of pharmacy (PharmD) curriculum. Trained faculty assessors used validated rubrics to assess student work samples for critical-thinking and problem-solving abilities. Assessment scores were collected and analyzed to determine student achievement of these 2 ability outcomes across the curriculum. Feasibility of the process was evaluated in terms of the time and resources used. One hundred sixty-one samples were assessed for critical thinking, and 159 samples were assessed for problem-solving. Rubric scoring allowed assessors to evaluate four 5- to 7-page work samples per hour. The analysis indicated that overall critical-thinking scores improved over the curriculum. Although the low yield of problem-solving samples precluded meaningful data analysis, it was informative for identifying potentially needed curricular improvements. Use of assessment rubrics for program ability outcomes was deemed authentic and feasible. Problem-solving was identified as a curricular area that may need improvement. This assessment method has great potential to inform continuous quality improvement of a PharmD program.
Critical Thinking and Intelligence Analysis
2007-03-01
assess such systems – terrorist networks are but one example. Additionally, as sociologist Emile Durkheim observes, the combinations of elements… References cited include Robert Jervis, System Effects (Princeton University Press, 1997) and Durkheim, Emile, The Rules of Sociological Method (Glencoe, IL: Free Press, 1938).
Implementing Eratosthenes' Discovery in the Classroom: Educational Difficulties Needing Attention
ERIC Educational Resources Information Center
Decamp, Nicolas; de Hosson, Cecile
2012-01-01
This paper presents a critical analysis of the accepted educational use of the method performed by Eratosthenes to measure the circumference of Earth which is often considered as a relevant means of dealing with issues related to the nature of science and its history. This method relies on a number of assumptions among which the parallelism of sun…
Sensitivity-Uncertainty Based Nuclear Criticality Safety Validation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.
2016-09-20
These are slides from a seminar given to the University of New Mexico Nuclear Engineering Department. Whisper is a statistical analysis package developed to support nuclear criticality safety (NCS) validation. It uses the sensitivity profile data for an application, as computed by MCNP6, along with covariance files for the nuclear data to determine a baseline upper subcritical limit for the application. Whisper and its associated benchmark files are developed and maintained as part of MCNP6, and will be distributed with all future releases of MCNP6. Although sensitivity-uncertainty methods for NCS validation have been under development for 20 years, continuous-energy Monte Carlo codes such as MCNP could not determine the required adjoint-weighted tallies for sensitivity profiles. The recent introduction of the iterated fission probability method into MCNP led to the rapid development of sensitivity analysis capabilities for MCNP6 and the development of Whisper. Sensitivity-uncertainty based methods represent the future for NCS validation, making full use of today's computer power to codify past approaches based largely on expert judgment. Validation results are defensible, auditable, and repeatable as needed with different assumptions and process models. The new methods can supplement, support, and extend traditional validation approaches.
Critical behavior of the van der Waals bonded ferromagnet Fe3 -xGeTe2
NASA Astrophysics Data System (ADS)
Liu, Yu; Ivanovski, V. N.; Petrovic, C.
2017-10-01
The critical properties of the single-crystalline van der Waals bonded ferromagnet Fe3-xGeTe2 were investigated by bulk dc magnetization around the paramagnetic to ferromagnetic (FM) phase transition. The Fe3-xGeTe2 single crystals, grown by the self-flux method with Fe deficiency x ≈ 0.36, exhibit bulk FM ordering below Tc = 152 K. Mössbauer spectroscopy was used to provide information on defects and the local atomic environment in the crystals. Critical exponents β = 0.372(4) with a critical temperature Tc = 151.25(5) K and γ = 1.265(15) with Tc = 151.17(12) K are obtained by the Kouvel-Fisher method, whereas δ = 4.50(1) is obtained by critical isotherm analysis at Tc = 151 K. These critical exponents obey the Widom scaling relation δ = 1 + γ/β, indicating self-consistency of the obtained values. With these critical exponents, the isotherm M(H) curves below and above the critical temperature collapse onto two independent universal branches, obeying the single scaling equation m = f±(h), where m and h are the renormalized magnetization and field, respectively. The exponents determined in this study are close to those calculated from renormalization-group results for a heuristic model of three-dimensional Heisenberg (d = 3, n = 3) spins coupled by attractive long-range interactions between spins that decay as J(r) ≈ r^-(3+σ) with σ = 1.89.
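As a quick self-consistency check, the Widom relation δ = 1 + γ/β can be evaluated directly from the exponents quoted above:

```python
# Check the Widom scaling relation delta = 1 + gamma/beta using the
# Kouvel-Fisher exponents reported for Fe3-xGeTe2.
beta, gamma, delta = 0.372, 1.265, 4.50

delta_widom = 1.0 + gamma / beta
print(f"1 + gamma/beta = {delta_widom:.2f}, reported delta = {delta}")
```

The computed value of about 4.40 sits close to the isotherm result δ = 4.50(1), consistent with the stated self-consistency of the exponents to within a few percent.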
ERIC Educational Resources Information Center
Eemeren, F. H. van; Grootendorst, R.
Suitable methods can be developed and instructional devices can be designed for the teaching of argumentation analysis to students of varying interests, ages, and capacities. Until 1950, the study of argumentation in the Netherlands was either purely practical or a continuation of the classic logic and rhetoric traditions. A number of new research…
ERIC Educational Resources Information Center
Mann, David A.
2016-01-01
Drawing on the principles of critical multicultural teacher education, Teaching English to Speakers of Other Languages (TESOL) and bilingual education, this study examined how pre-service teachers were prepared to educate Emerging Bilinguals (EBs) in ESOL-infused teacher education programs in Florida universities. The textual analysis of a…
ERIC Educational Resources Information Center
Liu, Ran; Stamper, John; Davenport, Jodi
2018-01-01
Temporal analyses are critical to understanding learning processes, yet understudied in education research. Data from different sources are often collected at different grain sizes, which are difficult to integrate. Making sense of data at many levels of analysis, including the most detailed levels, is highly time-consuming. In this paper, we…
Guoyi Zhou; Ge Sun; Xu Wang; Chuanyan Zhou; Steven G. McNulty; James M. Vose; Devendra M. Amatya
2008-01-01
It is critical that evapotranspiration (ET) be quantified accurately so that scientists can evaluate the effects of land management and global change on water availability, streamflow, nutrient and sediment loading, and ecosystem productivity in watersheds. The objective of this study was to derive a new semi-empirical ET model using a dimensional analysis method that...
Application of multi-criteria decision-making to risk prioritisation in tidal energy developments
NASA Astrophysics Data System (ADS)
Kolios, Athanasios; Read, George; Ioannou, Anastasia
2016-01-01
This paper presents an analytical multi-criteria analysis for the prioritisation of risks in the development of tidal energy projects. After a baseline identification of risks throughout the project and of relevant stakeholders in the UK, classified through a political, economic, social, technological, legal and environmental (PESTLE) analysis, questionnaires provided scores for each risk and corresponding weights for each of the different sectors. Employing an extended technique for order of preference by similarity to ideal solution (TOPSIS) as well as the weighted sum method on the data obtained, the identified risks are ranked by criticality, drawing the industry's attention to mitigating those that score highest. Both methods were modified to take averages at different stages of the analysis in order to observe the effects on the final risk ranking. A sensitivity analysis of the results was also carried out with regard to the weighting factors assigned to the perceived expertise of participants, with different results obtained depending on whether a linear, squared or square-root regression is used. The results of the study show that academics and industry hold conflicting opinions about which risks are most critical.
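The weighted sum method mentioned above reduces, at its core, to multiplying a risk-by-stakeholder score matrix by a weight vector and sorting. A minimal sketch with invented risk names, scores, and weights (none taken from the paper's questionnaire data):

```python
import numpy as np

# Rows: risks; columns: scores from stakeholder groups (e.g. academia,
# industry, policy), each on a 1-5 scale. All values are illustrative.
risks = ["grid connection delay", "device reliability",
         "consenting/legal", "environmental impact"]
scores = np.array([[4, 5, 3],
                   [5, 4, 4],
                   [3, 4, 5],
                   [2, 3, 4]], dtype=float)
weights = np.array([0.4, 0.4, 0.2])   # relative weight of each group

overall = scores @ weights            # weighted-sum criticality score
ranking = [risks[i] for i in np.argsort(overall)[::-1]]
```

An extended TOPSIS run differs in how scores are aggregated (distances to ideal and anti-ideal solutions) but consumes the same score matrix and weight vector.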
Analysis of the STS-126 Flow Control Valve Structural-Acoustic Coupling Failure
NASA Technical Reports Server (NTRS)
Jones, Trevor M.; Larko, Jeffrey M.; McNelis, Mark E.
2010-01-01
During Space Transportation System mission STS-126, one of the main engine's flow control valves incurred an unexpected failure: a section of the valve broke off during liftoff. It is theorized that an acoustic mode of the flowing fuel coupled with a structural mode of the valve, causing a high-cycle fatigue failure. This report documents the analysis efforts conducted to verify this theory. Hand calculations, computational fluid dynamics, and finite element methods are all implemented, and analyses are performed using steady-state as well as transient methods. The analyses conclude that there is a critical acoustic mode that aligns with a structural mode of the valve.
Optimized Kernel Entropy Components.
Izquierdo-Verdiguier, Emma; Laparra, Valero; Jenssen, Robert; Gomez-Chova, Luis; Camps-Valls, Gustau
2017-06-01
This brief addresses two main issues of the standard kernel entropy component analysis (KECA) algorithm: the optimization of the kernel decomposition and the optimization of the Gaussian kernel parameter. KECA roughly reduces to a sorting of kernel eigenvectors by importance in terms of entropy instead of variance, as in kernel principal component analysis. In this brief, we propose an extension of the KECA method, named optimized KECA (OKECA), that directly extracts the optimal features retaining most of the data entropy by compacting the information into very few features (often just one or two). The proposed method produces features with higher expressive power. In particular, it is based on the independent component analysis framework and introduces an extra rotation of the eigendecomposition, which is optimized via gradient-ascent search. This maximum entropy preservation suggests that OKECA features are more efficient than KECA features for density estimation. In addition, a critical issue in both methods is the selection of the kernel parameter, since it critically affects the resulting performance. Here, we analyze the most common kernel length-scale selection criteria. The results of both methods are illustrated on different synthetic and real problems. Results show that OKECA returns projections with more expressive power than KECA, that the most successful rule for estimating the kernel parameter is based on maximum likelihood, and that OKECA is more robust to the selection of the length-scale parameter in kernel density estimation.
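The core of KECA, sorting kernel eigenvectors by their contribution to a Rényi entropy estimate rather than by eigenvalue, can be sketched in a few lines. This illustrates only the standard KECA selection rule (not the OKECA rotation); the data and kernel width are toy choices.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2))
sigma = 1.0                         # illustrative kernel length-scale

# RBF kernel matrix
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq / (2 * sigma ** 2))

lam, E = np.linalg.eigh(K)          # eigendecomposition of K
# Entropy contribution of component i: lambda_i * (1^T e_i)^2.
# These terms sum to 1^T K 1, the (unnormalized) Renyi entropy estimate.
entropy = lam * (E.sum(axis=0) ** 2)
order = np.argsort(entropy)[::-1]   # sort by entropy, not by variance

k = 2                               # keep the top-k entropy components
features = E[:, order[:k]] * np.sqrt(lam[order[:k]])
```

Note how the ranking can differ from kernel PCA: an eigenvector with a modest eigenvalue but a large projection onto the all-ones vector may carry more of the entropy estimate than the top-variance eigenvector.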
Needs, barriers, and analysis methods for integrated urban freight transportation : final report.
DOT National Transportation Integrated Search
2015-08-01
In this joint project, the University of Maryland, West Virginia University, and Morgan State University worked together to solve critical problems associated with urban freight systems. A review of literature and case studies on freight villages and ...
A Simplified Method of Elastic-Stability Analysis for Thin Cylindrical Shells
NASA Technical Reports Server (NTRS)
Batdorf, S B
1947-01-01
This paper develops a new method for determining the buckling stresses of cylindrical shells under various loading conditions. In part I, the equation for the equilibrium of cylindrical shells introduced by Donnell in NACA report no. 479 to find the critical stresses of cylinders in torsion is applied to find critical stresses for cylinders with simply supported edges under other loading conditions. In part II, a modified form of Donnell's equation for the equilibrium of thin cylindrical shells is derived which is equivalent to Donnell's equation but has certain advantages in physical interpretation and in ease of solution, particularly in the case of shells having clamped edges. The question of implicit boundary conditions is also considered.
iCLIP: Protein–RNA interactions at nucleotide resolution
Huppertz, Ina; Attig, Jan; D’Ambrogio, Andrea; Easton, Laura E.; Sibley, Christopher R.; Sugimoto, Yoichiro; Tajnik, Mojca; König, Julian; Ule, Jernej
2014-01-01
RNA-binding proteins (RBPs) are key players in the post-transcriptional regulation of gene expression. Precise knowledge about their binding sites is therefore critical to unravel their molecular function and to understand their role in development and disease. Individual-nucleotide resolution UV crosslinking and immunoprecipitation (iCLIP) identifies protein–RNA crosslink sites on a genome-wide scale. The high resolution and specificity of this method are achieved by an intramolecular cDNA circularization step that enables analysis of cDNAs that truncate at the protein–RNA crosslink sites. Here, we describe the improved iCLIP protocol and discuss critical optimization and control experiments that are required when applying the method to new RBPs. PMID:24184352
A critical methodological review of discourse and conversation analysis studies of family therapy.
Tseliou, Eleftheria
2013-12-01
Discourse (DA) and conversation (CA) analysis, two qualitative research methods, have been recently suggested as potentially promising for the study of family therapy due to common epistemological adherences and their potential for an in situ study of therapeutic dialog. However, to date, there is no systematic methodological review of the few existing DA and CA studies of family therapy. This study aims at addressing this lack by critically reviewing published DA and CA studies of family therapy on methodological grounds. Twenty-eight articles in total are reviewed in relation to certain methodological axes identified in the relevant literature. These include choice of method, framing of research question(s), data/sampling, type of analysis, epistemological perspective, content/type of knowledge claims, and attendance to criteria for good quality practice. It is argued that the reviewed studies show "glimpses" of the methods' potential for family therapy research despite the identification of certain "shortcomings" regarding their methodological rigor. These include unclearly framed research questions and the predominance of case study designs. They also include inconsistencies between choice of method, stated or unstated epistemological orientations and knowledge claims, and limited attendance to criteria for good quality practice. In conclusion, it is argued that DA and CA can add to the existing quantitative and qualitative methods for family therapy research. They can both offer unique ways for a detailed study of the actual therapeutic dialog, provided that future attempts strive for a methodologically rigorous practice and against their uncritical deployment. © FPI, Inc.
Chemical Fingerprinting of Materials Developed Due to Environmental Issues
NASA Technical Reports Server (NTRS)
Smith, Doris A.; McCool, A. (Technical Monitor)
2000-01-01
Instrumental chemical analysis methods are developed and used to chemically fingerprint new and modified External Tank materials made necessary by changing environmental requirements. Chemical fingerprinting can detect and diagnose variations in material composition. To chemically characterize each material, fingerprint methods are selected from an extensive toolbox based on the material's chemistry and the ability of the specific methods to detect the material's critical ingredients. Fingerprint methods have been developed for a variety of materials including Thermal Protection System foams, adhesives, primers, and composites.
New Flutter Analysis Technique for Time-Domain Computational Aeroelasticity
NASA Technical Reports Server (NTRS)
Pak, Chan-Gi; Lung, Shun-Fat
2017-01-01
A new time-domain approach for computing flutter speed is presented. Based on the time-history output of an aeroelastic simulation, the unknown unsteady aerodynamics model is estimated using a system identification technique. The full aeroelastic model is generated by coupling the estimated unsteady aerodynamic model with the known linear structural model. The critical dynamic pressure is computed and used in subsequent simulations until it converges. The proposed method is applied to a benchmark cantilevered rectangular wing.
NASA Astrophysics Data System (ADS)
Arief, I. S.; Suherman, I. H.; Wardani, A. Y.; Baidowi, A.
2017-05-01
Control and monitoring is a continuous process for securing assets in marine current renewable energy. A control and monitoring system exists for each critical component, as identified through the Failure Mode and Effects Analysis (FMEA) method. The process developed in this paper is built around a sensor matrix: the matrix correlates the critical components with the monitoring system, which is supported by sensors to enable decision-making.
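FMEA commonly prioritizes components by a Risk Priority Number (RPN), the product of occurrence, severity, and detectability ratings. A minimal sketch with invented component names and ratings (not the paper's data):

```python
# FMEA Risk Priority Number: RPN = occurrence x severity x detectability,
# each rated on a 1-10 scale. Component names and ratings are illustrative.
components = {
    # name: (occurrence, severity, detectability)
    "blade bearing": (4, 9, 6),
    "generator":     (3, 8, 4),
    "pitch sensor":  (6, 5, 7),
}

rpn = {name: o * s * d for name, (o, s, d) in components.items()}
critical = max(rpn, key=rpn.get)   # component needing monitoring first
```

Components whose RPN exceeds a chosen threshold would then be assigned sensors in the monitoring matrix.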
Consistency Analysis and Data Consultation of Gas System of Gas-Electricity Network of Latvia
NASA Astrophysics Data System (ADS)
Zemite, L.; Kutjuns, A.; Bode, I.; Kunickis, M.; Zeltins, N.
2018-02-01
In the present research, the main critical points of the gas transmission and storage system of Latvia have been determined to ensure secure and reliable gas supply among the Baltic States and to fulfil the core objectives of the EU energy policies. Technical data on the critical points of the gas transmission and storage system of Latvia have been collected and analysed with the SWOT method, and solutions have been provided to increase the reliability of the regional natural gas system.
The Role of the National Defense Stockpile in the Supply of Strategic and Critical Materials
2008-05-09
Insurance Trust Fund and the Federal Supplementary Medical Trust Fund. Analysis of NDS Operations and Alternatives: The current method of determining…requirements is based upon analysis of military, industrial, and essential civilian materials needs in light of conflict scenarios found in the National…Defense Strategy. The bulk of this analysis is done utilizing computer modeling. First, the model projects the needs for finished products and services
Designing a model for critical thinking development in AJA University of Medical Sciences
MAFAKHERI LALEH, MAHYAR; MOHAMMADIMEHR, MOJGAN; ZARGAR BALAYE JAME, SANAZ
2016-01-01
Introduction: In the new concept of medical education, creativity development is an important goal. The aim of this research was to identify a model for developing critical thinking among students, with a special focus on the learning environment and learning style. Methods: This applied, cross-sectional study was conducted among all students enrolled in undergraduate and professional doctorate programs in the Fall Semester of 2013-2014 at AJA University of Medical Sciences (N=777). The sample consisted of 257 students selected by proportional stratified random sampling. To collect data, three questionnaires were employed: Critical Thinking, Perception of the Learning Environment, and Learning Style. The data were analyzed using Pearson's correlation test and the one-sample t-test. A Structural Equation Model (SEM) was used to test the research model, with SPSS software, version 14, and the LISREL software used for data analysis. Results: Students rated the teaching-learning environment, and the two components of "perception of teachers" and "perception of the emotional-psychological climate", as significantly desirable (p<0.05). Learning style and its two components of "study method" and "motivation for studying" were also rated as significantly desirable (p<0.05). The level of critical thinking among students in the components of "commitment", "creativity" and "cognitive maturity" was relatively desirable (p<0.05). In addition, perception of the learning environment can affect critical thinking through learning style. Conclusion: One of the factors that can significantly improve the quality of the teaching and learning process at AJA University of Medical Sciences is the development of critical thinking among learners. This requires providing a proper situation for teaching and learning critical thinking in the educational environment. PMID:27795968
Social construction of the patient through problems of safety, uninsurance, and unequal treatment.
Trigg, Lisa J
2009-01-01
The purpose of this research was to study how the Institute of Medicine discourse promoting health information technology may reproduce existing social inequalities in healthcare. Social constructionist and critical discourse analysis combined with corpus linguistics methods have been used to study the subject positions constructed for receivers of healthcare across the executive summaries of 3 different Institute of Medicine reports. Data analysis revealed differences in the way receivers of healthcare are constructed through variations of social action through language use in the 3 texts selected for this method's testing.
Automated Parameter Studies Using a Cartesian Method
NASA Technical Reports Server (NTRS)
Murman, Scott M.; Aftosmis, Michael J.; Nemec, Marian
2004-01-01
Computational Fluid Dynamics (CFD) is now routinely used to analyze isolated points in a design space by performing steady-state computations at fixed flight conditions (Mach number, angle of attack, sideslip) for a fixed geometric configuration of interest. This "point analysis" provides detailed information about the flowfield, which aids an engineer in understanding, or correcting, a design. A point analysis is typically performed using high-fidelity methods at a handful of critical design points, e.g., a cruise or landing configuration, or a sample of points along a flight trajectory.
Giacomini, Mita; Cook, Deborah; DeJean, Deirdre
2009-04-01
The objective of this study is to identify and appraise qualitative research evidence on the experience of making life-support decisions in critical care. In six databases and supplementary sources, we sought original research published from January 1990 through June 2008 reporting qualitative empirical studies of the experience of life-support decision making in critical care settings. Fifty-three journal articles and monographs were included. Of these, 25 reported prospective studies and 28 reported retrospective studies. We abstracted methodologic characteristics relevant to the basic critical appraisal of qualitative research (prospective data collection, ethics approval, purposive sampling, iterative data collection and analysis, and any method to corroborate findings). Qualitative research traditions represented include grounded theory (n = 15, 28%), ethnography or naturalistic methods (n = 15, 28%), phenomenology (n = 9, 17%), and other or unspecified approaches (n = 14, 26%). All 53 documents describe the research setting; 97% indicate purposive sampling of participants. Studies vary in their capture of multidisciplinary clinician and family perspectives. Thirty-one (58%) report research ethics board review. Only 49% report iterative data collection and analysis, and eight documents (15%) describe an analytically driven stopping point for data collection. Thirty-two documents (60%) indicated a method for corroborating findings. Qualitative evidence often appears outside of clinical journals, with most research from the United States. Prospective, observation-based studies follow life-support decision making directly. These involve a variety of participants and yield important insights into interactions, communication, and dynamics. 
Retrospective, interview-based studies lack this direct engagement, but focus on the recollections of fewer types of participants (particularly patients and physicians), and typically address specific issues (communication and stress). Both designs can provide useful reflections for improving care. Given the diversity of qualitative research in critical care, room for improvement exists regarding both the quality and transparency of reported methodology.
Critical considerations for the application of environmental DNA methods to detect aquatic species
Goldberg, Caren S.; Turner, Cameron R.; Deiner, Kristy; Klymus, Katy E.; Thomsen, Philip Francis; Murphy, Melanie A.; Spear, Stephen F.; McKee, Anna; Oyler-McCance, Sara J.; Cornman, Robert S.; Laramie, Matthew B.; Mahon, Andrew R.; Lance, Richard F.; Pilliod, David S.; Strickler, Katherine M.; Waits, Lisette P.; Fremier, Alexander K.; Takahara, Teruhiko; Herder, Jelger E.; Taberlet, Pierre
2016-01-01
Species detection using environmental DNA (eDNA) has tremendous potential for contributing to the understanding of the ecology and conservation of aquatic species. Detecting species using eDNA methods, rather than directly sampling the organisms, can reduce impacts on sensitive species and increase the power of field surveys for rare and elusive species. The sensitivity of eDNA methods, however, requires a heightened awareness and attention to quality assurance and quality control protocols. Additionally, the interpretation of eDNA data demands careful consideration of multiple factors. As eDNA methods have grown in application, diverse approaches have been implemented to address these issues. With interest in eDNA continuing to expand, supportive guidelines for undertaking eDNA studies are greatly needed. Environmental DNA researchers from around the world have collaborated to produce this set of guidelines and considerations for implementing eDNA methods to detect aquatic macroorganisms. Critical considerations for study design include preventing contamination in the field and the laboratory, choosing appropriate sample analysis methods, validating assays, testing for sample inhibition, and following minimum reporting guidelines. Critical considerations for inference include temporal and spatial processes, limits of correlation of eDNA with abundance, uncertainty of positive and negative results, and potential sources of allochthonous DNA. We present a synthesis of knowledge at this stage for application of this new and powerful detection method.
Methods of space radiation dose analysis with applications to manned space systems
NASA Technical Reports Server (NTRS)
Langley, R. W.; Billings, M. P.
1972-01-01
The full potential of state-of-the-art space radiation dose analysis for manned missions has not been exploited. Point doses have been overemphasized, and the critical dose to the bone marrow has been only crudely approximated, despite the existence of detailed man models and computer codes for dose integration in complex geometries. The method presented makes it practical to account for the geometrical detail of the astronaut as well as the vehicle. Discussed are the major assumptions involved and the concept of applying the results of detailed proton dose analysis to the real-time interpretation of on-board dosimetric measurements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weston, Louise Marie
2007-09-01
A recent report on criticality accidents in nuclear facilities indicates that human error played a major role in a significant number of incidents with serious consequences and that some of these human errors may be related to the emotional state of the individual. A pre-shift test to detect a deleterious emotional state could reduce the occurrence of such errors in critical operations. The effectiveness of pre-shift testing is a challenge because of the need to gather predictive data in a relatively short test period and the potential occurrence of learning effects due to a requirement for frequent testing. This report reviews the different types of reliability and validity methods, and the testing and statistical analysis procedures, used to validate measures of emotional state. The ultimate value of a validation study depends upon the percentage of human errors in critical operations that are due to the emotional state of the individual. A review of the literature to identify the most promising predictors of emotional state for this application is highly recommended.
Optimizing Probability of Detection Point Estimate Demonstration
NASA Technical Reports Server (NTRS)
Koshti, Ajay M.
2017-01-01
Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. NDE methods are required to reliably detect real flaws such as cracks and crack-like flaws, and a reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper discusses optimizing POD demonstration experiments using the point estimate method, which NASA uses for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false call (POF) while keeping the flaw sizes in the set as small as possible.
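The point-estimate demonstration logic is binomial: with n flawed specimens and at most m allowed misses, the probability of passing the demonstration (PPD) for a given true POD follows directly from the binomial distribution. A minimal sketch (the 29-of-29 case is the familiar one; the function name is ours):

```python
from math import comb

def prob_pass_demo(pod, n=29, max_misses=0):
    """Probability of passing an n-specimen demonstration that allows
    at most `max_misses` missed detections, given the true POD."""
    q = 1.0 - pod
    return sum(comb(n, k) * (q ** k) * (pod ** (n - k))
               for k in range(max_misses + 1))

# With the classic 29-of-29 requirement, an inspection whose true POD
# is exactly 0.90 passes only rarely:
ppd_29_0 = prob_pass_demo(0.90, n=29, max_misses=0)   # 0.90**29, about 0.047
```

This is the trade-off the optimization works against: larger flaws in the set raise the true POD and hence the PPD, while the goal is to keep the demonstrated flaw size as small as possible.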
Potestio, Melissa L.; Boyd, Jamie M.; Bagshaw, Sean M.; Heyland, Daren; Oxland, Peter; Doig, Christopher J.; Zygun, Dave; Stelfox, Henry T.
2015-01-01
Objective To engage the public to understand how to improve the care of critically ill patients. Design A qualitative content analysis of an open community forum (Café Scientifique). Setting Public venue in Calgary, Alberta, Canada. Participants Members of the general public including patients, families of patients, health care providers, and members of the community at large. Methods A panel of researchers, decision-makers, and a family member led a Café Scientifique, an informal dialogue between the populace and experts, over three hours to engage the public to understand how to improve the care of critically ill patients. Conventional qualitative content analysis was used to analyze the data. The inductive analysis occurred in three phases: coding, categorizing, and developing themes. Results Thirty-eight members of the public (former ICU patients, family members of patients, providers, community members) attended. Participants focused the discussion and provided concrete suggestions for improvement around communication (family as surrogate voice, timing of conversations, decision tools) and provider well-being and engagement, as opposed to medical interventions in critical care. Conclusions Café participants believe patient- and family-centered care is important to ensure high-quality care in the ICU. A Café Scientifique is a valuable forum to engage the public to contribute to priority setting areas for research in critical care, as well as a platform to share lived experience. Research stakeholders including health care organizations, governments, and funding organizations should provide more opportunities for the public to engage in meaningful conversations about how to best improve healthcare. PMID:26580406
Zhu, Weidong; Jiang, Libing; Jiang, Shouyin; Ma, Yuefeng; Zhang, Mao
2015-01-23
Stress-induced hyperglycaemia, which has been shown to be associated with an unfavourable prognosis, is common among critically ill patients. Additionally, it has been reported that hypoglycaemia and high glucose variability are also associated with adverse outcomes. Thus, continuous glucose monitoring (CGM) may be the optimal method to detect severe hypoglycaemia and hyperglycaemia and to reduce glucose excursions. However, the overall accuracy and reliability of CGM systems and the effects of CGM systems on glucose control and prognosis in critically ill patients remain inconclusive. Therefore, we will conduct a systematic review and meta-analysis to clarify the associations between CGM systems and clinical outcomes. We will search PubMed, EMBASE and the Cochrane Library from inception to October 2014. Studies comparing CGM systems with any other glucose monitoring methods in critically ill patients will be eligible for our meta-analysis. The primary endpoints include the incidence of hypoglycaemia and hyperglycaemia, mean glucose level, and percentage of time within the target range. The secondary endpoints include intensive care unit (ICU) mortality, hospital mortality, duration of mechanical ventilation, length of ICU and hospital stay, and the Pearson correlation coefficient and the results of error grid analysis. In addition, we will record all complications (eg, acquired infections) in control and intervention groups and local adverse events in intervention groups (eg, bleeding or infections). Ethics approval is not required as this is a protocol for a systematic review. The findings will be disseminated in a peer-reviewed journal and presented at a relevant conference. PROSPERO registration number: CRD42014013488.
The use of decision analysis to examine ethical decision making by critical care nurses.
Hughes, K K; Dvorak, E M
1997-01-01
To examine the extent to which critical care staff nurses make ethical decisions that coincide with those recommended by a decision analytic model. Nonexperimental, ex post facto. Midwestern university-affiliated 500-bed tertiary care medical center. One hundred critical care staff nurses randomly selected from seven critical care units. Complete responses were obtained from 82 nurses (for a final response rate of 82%). The dependent variable--consistent decision making--was measured as staff nurses' abilities to make ethical decisions that coincided with those prescribed by the decision model. Subjects completed two instruments: the Ethical Decision Analytic Model, a computer-administered instrument designed to measure staff nurses' abilities to make consistent decisions about a chemically impaired colleague; and a Background Inventory. The results indicate marked consensus among nurses when informal methods were used. However, there was little consistency between the nurses' informal decisions and those recommended by the decision analytic model. Although 50% (n = 41) of all nurses chose a course of action that coincided with the model's least optimal alternative, few nurses agreed with the model as to the most optimal course of action. The findings also suggest that consistency was unrelated (p > 0.05) to the nurses' educational background or years of clinical experience; that most subjects reported receiving little or no education in decision making during their basic nursing education programs; but that exposure to decision-making strategies was related to years of nursing experience (p < 0.05). The findings differ from related studies that have found a moderate degree of consistency between nurses and decision analytic models for strictly clinical decision tasks, especially when those tasks were less complex.
However, the findings partially coincide with other findings that decision analysis may not be particularly well-suited to the critical care environment. Additional research is needed to determine whether critical care nurses use the same decision-making methods as do other nurses; and to clarify the effects of decision task (clinical versus ethical) on nurses' decision making. It should not be assumed that methods used to study nurses' clinical decision making are applicable for all nurses or all types of decisions, including ethical decisions.
Jovanović, Marko; Rakić, Tijana; Tumpa, Anja; Jančić Stojanović, Biljana
2015-06-10
This study presents the development of a hydrophilic interaction liquid chromatographic method for the analysis of iohexol, its endo-isomer and three impurities following a Quality by Design (QbD) approach. The main objective of the method was to identify the conditions where adequate separation quality in minimal analysis duration could be achieved within a robust region that guarantees the stability of method performance. The relationship between critical process parameters (acetonitrile content in the mobile phase, pH of the water phase and ammonium acetate concentration in the water phase) and critical quality attributes was established applying design of experiments methodology. The defined mathematical models and Monte Carlo simulation were used to evaluate the uncertainty in model predictions and in the adjustment of the process parameters, and to identify the design space. The borders of the design space were experimentally verified, confirming that the quality of the method is preserved in this region. Moreover, Plackett-Burman design was applied for experimental robustness testing and the method was fully validated to verify the adequacy of the selected optimal conditions: analytical column ZIC HILIC (100 mm × 4.6 mm, 5 μm particle size); mobile phase consisting of acetonitrile-water phase (72 mM ammonium acetate, pH adjusted to 6.5 with glacial acetic acid) (86.7:13.3) v/v; column temperature 25 °C; mobile phase flow rate 1 mL min(-1); wavelength of detection 254 nm.
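The design-space step described above can be mimicked with a toy Monte Carlo calculation: given a fitted model for a critical quality attribute and an estimate of its prediction error, sample repeatedly at a candidate operating point and report the probability that the acceptance criterion is met. Everything below (the linear response model, its coefficients, the noise level) is illustrative, not the paper's fitted model:

```python
import random

def resolution_model(acn_pct, ph):
    """Hypothetical fitted response: critical-pair resolution as a
    function of acetonitrile content (%) and water-phase pH."""
    return 1.2 + 0.35 * (acn_pct - 85.0) - 0.30 * abs(ph - 6.5)

def prob_criterion_met(acn_pct, ph, sigma=0.15, rs_min=1.5, n=20000, seed=1):
    """Monte Carlo probability that resolution >= rs_min, with normally
    distributed model-prediction and adjustment uncertainty."""
    rng = random.Random(seed)
    mean = resolution_model(acn_pct, ph)
    hits = sum(1 for _ in range(n) if rng.gauss(mean, sigma) >= rs_min)
    return hits / n

# A point belongs to the design space only if this probability is high:
p_center = prob_criterion_met(86.7, 6.5)   # near the optimum
p_edge = prob_criterion_met(85.0, 6.0)     # outside the robust region
```

Mapping `prob_criterion_met` over a grid of parameter values and thresholding the probability (e.g., at 0.90) traces out the design-space border that is then verified experimentally.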
Djuris, Jelena; Medarevic, Djordje; Krstic, Marko; Djuric, Zorica; Ibric, Svetlana
2013-06-01
This study illustrates the application of experimental design and multivariate data analysis in defining the design space for granulation and tableting processes. According to quality by design concepts, critical quality attributes (CQAs) of granules and tablets, as well as critical parameters of the granulation and tableting processes, were identified and evaluated. Acetaminophen was used as the model drug, and one of the study aims was to investigate the possibility of developing immediate- or extended-release acetaminophen tablets. Granulation experiments were performed in a fluid bed processor using polyethylene oxide polymer as a binder in the direct granulation method. Tablets were compressed in a laboratory eccentric tablet press. The first set of experiments was organized according to a Plackett-Burman design, followed by a full factorial experimental design. Principal component analysis and partial least squares regression were applied as the multivariate analysis techniques. By using these different methods, CQAs and process parameters were identified and quantified. Furthermore, an in-line method was developed to monitor the temperature during the fluidized bed granulation process, to foresee possible defects in granule CQAs. Various control strategies that are based on process understanding and assure the desired quality attributes of the product are proposed.
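A Plackett-Burman screening design of the kind used here can be generated by cyclically shifting a standard generator row and appending a row of low levels; any two factor columns then come out orthogonal. A sketch for the common 12-run case (the generator row is the standard published one; the abstract does not state which run size the authors used):

```python
def plackett_burman_12():
    """12-run Plackett-Burman design for up to 11 two-level factors.

    Rows are runs, columns are factors, levels are coded +1/-1.
    """
    gen = [+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1]  # standard PB12 generator
    rows = [gen[i:] + gen[:i] for i in range(11)]       # 11 cyclic shifts
    rows.append([-1] * 11)                              # closing row of low levels
    return rows

design = plackett_burman_12()
```

Each column of the matrix is assigned to one process parameter (binder amount, inlet air temperature, and so on) at its low/high level; unassigned columns estimate error. The orthogonality of the columns is what lets main effects be screened in only 12 runs.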
Advancing Usability Evaluation through Human Reliability Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronald L. Boring; David I. Gertman
2005-07-01
This paper introduces a novel augmentation to the current heuristic usability evaluation methodology. The SPAR-H human reliability analysis method was developed for categorizing human performance in nuclear power plants. Despite the specialized use of SPAR-H for safety critical scenarios, the method also holds promise for use in commercial off-the-shelf software usability evaluations. The SPAR-H method shares task analysis underpinnings with human-computer interaction, and it can be easily adapted to incorporate usability heuristics as performance shaping factors. By assigning probabilistic modifiers to heuristics, it is possible to arrive at the usability error probability (UEP). This UEP is not a literal probability of error but nonetheless provides a quantitative basis for heuristic evaluation. When combined with a consequence matrix for usability errors, this method affords ready prioritization of usability issues.
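In SPAR-H, a nominal error probability is scaled by the product of performance shaping factor (PSF) multipliers, with a standard adjustment that keeps the result below 1 when the composite multiplier is large. A sketch of that arithmetic with usability heuristics standing in for PSFs, as the paper proposes (the nominal value and multipliers below are illustrative, not the paper's calibration):

```python
def usability_error_probability(nominal_hep, psf_multipliers):
    """Adjust a nominal human error probability by PSF multipliers.

    Uses a SPAR-H-style adjustment, HEP = NHEP*PSF / (NHEP*(PSF-1) + 1),
    which reduces to NHEP*PSF for small products and never exceeds 1.
    """
    composite = 1.0
    for m in psf_multipliers:
        composite *= m
    return nominal_hep * composite / (nominal_hep * (composite - 1.0) + 1.0)

# Hypothetical heuristic violations mapped to multipliers:
# a severe "visibility of system status" problem -> 10x,
# a moderate "consistency and standards" problem -> 2x.
uep = usability_error_probability(0.001, [10.0, 2.0])
```

Multiplying each UEP by a consequence weight then yields the prioritization matrix mentioned above: frequent-and-severe usability errors surface first.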
Wang, Maocai; Dai, Guangming; Choo, Kim-Kwang Raymond; Jayaraman, Prem Prakash; Ranjan, Rajiv
2016-01-01
Information confidentiality is an essential requirement for cyber security in critical infrastructure. Identity-based cryptography, an increasingly popular branch of cryptography, is widely used to protect information confidentiality in the critical infrastructure sector due to the ability to compute a user's public key directly from the user's identity. However, computational requirements complicate the practical application of identity-based cryptography. In order to improve its efficiency, this paper presents an effective method to construct pairing-friendly elliptic curves with low Hamming weight 4 under embedding degree 1. Based on an analysis of the Complex Multiplication (CM) method, the soundness of our method for calculating the characteristic of the finite field is proved. Three related algorithms for constructing pairing-friendly elliptic curves are then put forward. Ten elliptic curves with low Hamming weight 4 under 160 bits are presented to demonstrate the utility of our approach. Finally, the evaluation indicates that it is more efficient to compute the Tate pairing with our curves than with those of Bertoni et al.
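Two of the properties the construction targets are easy to check directly: the Hamming weight of the field characteristic p (fewer set bits means cheaper field reduction), and the embedding degree 1 condition, which requires the pairing group order r to divide p - 1. A toy check with small illustrative numbers (not the paper's 160-bit curve parameters):

```python
def hamming_weight(n):
    """Number of 1 bits in the binary expansion of n."""
    return bin(n).count("1")

def has_embedding_degree_one(p, r):
    """Embedding degree 1 means r divides p - 1, so the r-th roots of
    unity (and hence the Tate pairing values) live in GF(p) itself."""
    return (p - 1) % r == 0

# A characteristic with only 4 set bits keeps modular reduction cheap:
w = hamming_weight(2 ** 16 + 2 ** 5 + 2 ** 2 + 1)   # 4 set bits
ok = has_embedding_degree_one(13, 3)                # 12 is divisible by 3
```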
Why Does a Method That Fails Continue To Be Used: The Answer
Templeton, Alan R.
2009-01-01
It has been claimed that hundreds of researchers use nested clade phylogeographic analysis (NCPA) based on what the method promises rather than requiring objective validation of the method. The supposed failure of NCPA is based upon the argument that validating it by using positive controls ignored type I error, and that computer simulations have shown a high type I error. The first argument is factually incorrect: the previously published validation analysis fully accounted for both type I and type II errors. The simulations that indicate a 75% type I error rate have serious flaws and only evaluate outdated versions of NCPA. These outdated type I error rates fall precipitously when the 2003 version of single-locus NCPA is used or when the 2002 multi-locus version of NCPA is used. It is shown that the treewise type I errors in single-locus NCPA can be corrected to the desired nominal level by a simple statistical procedure, and that multilocus NCPA reconstructs a simulated scenario used to discredit NCPA with 100% accuracy. Hence, NCPA is not a failed method at all, but rather has been validated both by actual data and by simulated data in a manner that satisfies the published criteria given by its critics. The critics have come to different conclusions because they have focused on the pre-2002 versions of NCPA and have failed to take into account the extensive developments in NCPA since 2002. Hence, researchers can choose to use NCPA based upon objective critical validation that shows that NCPA delivers what it promises. PMID:19335340
Geospatial methods and data analysis for assessing distribution of grazing livestock
USDA-ARS?s Scientific Manuscript database
Free-ranging livestock research must begin with a well conceived problem statement and employ appropriate data acquisition tools and analytical techniques to accomplish the research objective. These requirements are especially critical in addressing animal distribution. Tools and statistics used t...
Journal Writing in Health Education.
ERIC Educational Resources Information Center
Gillis, Angela J.
2001-01-01
Notes the growing use of journals in nursing education and health professions continuing education. Describes a three-step method involving critical analysis of clinical practice, peer group discussion, and self-evaluation. Presents practical guidelines for journal writing and ways to use journals to develop competence. (SK)
Scientific computations section monthly report, November 1993
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buckner, M.R.
1993-12-30
This progress report from the Savannah River Technology Center contains abstracts from papers from the computational modeling, applied statistics, applied physics, experimental thermal hydraulics, and packaging and transportation groups. Specific topics covered include: engineering modeling and process simulation, criticality methods and analysis, plutonium disposition.
NASA Astrophysics Data System (ADS)
Sabri, Farhad
Shells of revolution, particularly cylindrical and conical shells, are among the basic structural elements of aerospace structures. With the advent of high-speed aircraft, these shells can show dynamic instabilities when they are exposed to supersonic flow. Aeroelastic analysis of these elements is therefore one of the primary design criteria that aeronautical engineers deal with. This analysis can be done with the finite element method (FEM) coupled with computational fluid dynamics (CFD), or by experimental methods, but these approaches are time consuming and very expensive. The purpose of this dissertation is to develop a numerical tool that performs aeroelastic analysis in a fast and precise way. During the design stage, where different configurations, loadings and boundary conditions may need to be analyzed, this numerical method can be used easily and with a high order of reliability. In this study, structural modeling combines linear Sanders thin-shell theory with the classical finite element method. Based on this hybrid finite element method, the shell displacements are found from the exact solutions of shell theory rather than approximated by polynomial functions as in the traditional finite element method. This leads to precise and fast convergence. Supersonic aerodynamic modeling is based on piston theory and on modified piston theory with a shell curvature term. Stress stiffening due to lateral pressure and axial compression is also taken into account. Fluid-structure interaction in the presence of internal quiescent fluid is modeled based on potential theory. In this method, the fluid is represented by a velocity potential variable at each node of the shell element, and its motion is expressed in terms of the nodal elastic displacements at the fluid-structure interface.
The proposed hybrid finite element method can perform the following analyses: (i) Buckling and vibration of an empty or partially fluid-filled circular cylindrical shell or truncated conical shell subjected to internal/external pressure and axial compression loading. This is typical of the external liquid propellant tanks of space shuttles and re-entry vehicles, which may experience this kind of loading during flight. In the current work, different end boundary conditions of a circular cylindrical shell with different filling ratios were analyzed. To the best of the author's knowledge, this is the first study in which this combination of complex loading and boundary conditions is treated in such an analysis. Only static instability (divergence) was observed, and the results showed that the fluid filling ratio has no effect on the critical buckling pressure or critical axial compression; it only reduces the vibration frequencies. They also revealed that a pressurized shell loses its stability at a higher critical axial load. (ii) Aeroelastic analysis of empty or partially liquid-filled circular cylindrical and conical shells. Different boundary conditions and shell geometries subjected to supersonic air flow are studied here. In all cases the shell loses its stability through coupled-mode flutter. The results showed that internal pressure has a stabilizing effect and increases the critical flutter speed. The value of the critical dynamic pressure changes rapidly and widely as the filling ratio increases from a low value. In addition, as the length ratio increases, the reduction in flutter speed diminishes and eventually vanishes. This rapid change in critical dynamic pressure at low filling ratios, and its almost steady behaviour at large filling ratios, indicates that the fluid near the bottom of the shell is strongly influenced by elastic deformation when a shell is subjected to external subsonic flow.
Based on comparisons with existing numerical, analytical and experimental data, and given the capability of this hybrid finite element method to model different boundary conditions and complex loadings, this FEM package can be used effectively for the design of advanced aerospace structures. It provides results at a lower computational cost compared to commercial FEM software, which imposes some restrictions when such an analysis is performed.
Dynamic analysis and vibration testing of CFRP drive-line system used in heavy-duty machine tool
NASA Astrophysics Data System (ADS)
Yang, Mo; Gui, Lin; Hu, Yefa; Ding, Guoping; Song, Chunsheng
2018-03-01
Low critical rotary speed and large vibration in the metal drive-line system of a heavy-duty machine tool seriously affect machining precision. Replacing the metal drive-line with a CFRP drive-line can effectively solve this problem. Based on composite laminate theory and the transfer matrix method (TMM), this paper puts forward a modified TMM to analyze the dynamic characteristics of a CFRP drive-line system. With this modified TMM, the CFRP drive-line of a heavy vertical miller is analyzed, and a finite element modal analysis model of the shafting is established. The results of the modified TMM and finite element analysis (FEA) show that the modified TMM can effectively predict the critical rotary speed of the CFRP drive-line, and that the critical rotary speed of the CFRP drive-line is 20% higher than that of the original metal drive-line. The vibrations of the CFRP and metal drive-lines were then tested. The test results show that application of the CFRP drive shaft in the drive-line can effectively reduce the vibration of the heavy-duty machine tool.
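The transfer matrix method propagates a state vector (deflection, slope, moment, shear) along the shaft through field matrices for elastic segments and point matrices for mounted masses; a critical speed is a rotation speed at which the boundary-condition determinant vanishes. A minimal isotropic sketch for a simply supported shaft with one central disk (plain Euler-Bernoulli segments, no composite layup, so this illustrates the classical TMM rather than the authors' modified version):

```python
def field_matrix(l, EI):
    """Transfer across a massless elastic shaft segment of length l."""
    return [[1.0, l, l**2 / (2*EI), l**3 / (6*EI)],
            [0.0, 1.0, l / EI, l**2 / (2*EI)],
            [0.0, 0.0, 1.0, l],
            [0.0, 0.0, 0.0, 1.0]]

def point_matrix(m, omega):
    """Transfer across a rigid disk of mass m whirling at speed omega:
    the centrifugal force m*omega^2*y appears as a shear jump."""
    p = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]
    p[3][0] = m * omega**2
    return p

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def boundary_det(omega, L, EI, m):
    """Simply supported ends: y = M = 0 at both ends. The unknowns at the
    left end are slope and shear; require y and M to vanish at the right."""
    t = field_matrix(L / 2, EI)
    t = matmul(point_matrix(m, omega), t)
    t = matmul(field_matrix(L / 2, EI), t)
    return t[0][1] * t[2][3] - t[0][3] * t[2][1]

def critical_speed(L=1.0, EI=100.0, m=10.0):
    """March until the boundary determinant changes sign, then bisect."""
    lo, hi = 1e-3, 1e3
    f_lo = boundary_det(lo, L, EI, m)
    w = lo
    while w < hi and boundary_det(w, L, EI, m) * f_lo > 0:
        w *= 1.05
    a, b = w / 1.05, w
    for _ in range(80):
        mid = 0.5 * (a + b)
        if boundary_det(mid, L, EI, m) * f_lo > 0:
            a = mid
        else:
            b = mid
    return 0.5 * (a + b)
```

For this configuration the lowest root reproduces the textbook result omega = sqrt(48*EI/(m*L^3)); a composite drive-line modifies the segment stiffness EI via the laminate layup, which is the role of the modified TMM in the paper.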
Potency control of modified live viral vaccines for veterinary use.
Terpstra, C; Kroese, A H
1996-04-01
This paper reviews various aspects of efficacy, and methods for assaying the potency of modified live viral vaccines. The pros and cons of parametric versus non-parametric methods for analysis of potency assays are discussed and critical levels of protection, as determined by the target(s) of vaccination, are exemplified. Recommendations are presented for designing potency assays on master virus seeds and vaccine batches.
Improving the quality of parameter estimates obtained from slug tests
Butler, J.J.; McElwee, C.D.; Liu, W.
1996-01-01
The slug test is one of the most commonly used field methods for obtaining in situ estimates of hydraulic conductivity. Despite its prevalence, this method has received criticism from many quarters in the ground-water community. This criticism emphasizes the poor quality of the estimated parameters, a condition that is primarily a product of the somewhat casual approach that is often employed in slug tests. Recently, the Kansas Geological Survey (KGS) has pursued research directed at improving methods for the performance and analysis of slug tests. Based on extensive theoretical and field research, a series of guidelines have been proposed that should enable the quality of parameter estimates to be improved. The most significant of these guidelines are: (1) three or more slug tests should be performed at each well during a given test period; (2) two or more different initial displacements (Ho) should be used at each well during a test period; (3) the method used to initiate a test should enable the slug to be introduced in a near-instantaneous manner and should allow a good estimate of Ho to be obtained; (4) data-acquisition equipment that enables a large quantity of high-quality data to be collected should be employed; (5) if an estimate of the storage parameter is needed, an observation well other than the test well should be employed; (6) the method chosen for analysis of the slug-test data should be appropriate for site conditions; (7) use of pre- and post-analysis plots should be an integral component of the analysis procedure; and (8) appropriate well construction parameters should be employed. Data from slug tests performed at a number of KGS field sites demonstrate the importance of these guidelines.
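One widely used slug-test analysis (offered here only as an illustration, not as the KGS-recommended method for any particular site) is the Hvorslev method: the normalized head decays exponentially, so a straight-line fit of ln(H/H0) versus time gives a basic time lag T0, and hydraulic conductivity follows from the well geometry. A sketch with illustrative geometry parameters:

```python
import math

def hvorslev_k(times, heads, h0, r_casing, r_screen, screen_len):
    """Estimate hydraulic conductivity K from slug-test recovery data.

    Fits ln(H/H0) = -t/T0 by least squares through the origin, then
    applies Hvorslev's formula for a fully submerged screen:
        K = r_c^2 * ln(L_e / R_w) / (2 * L_e * T0)
    """
    # least-squares slope through the origin for y = ln(H/H0) versus t
    y = [math.log(h / h0) for h in heads]
    slope = sum(t * v for t, v in zip(times, y)) / sum(t * t for t in times)
    t0 = -1.0 / slope  # basic time lag: time for H/H0 to fall to 0.37
    return r_casing**2 * math.log(screen_len / r_screen) / (2 * screen_len * t0)
```

The pre- and post-analysis plots urged in guideline (7) are exactly the ln(H/H0)-versus-t plot used here: departures from a straight line flag a non-instantaneous slug introduction or an inappropriate analysis model.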
Muinde, R K; Kiinyukia, C; Rombo, G O; Muoki, M A
2012-12-01
To determine the microbial load in food, examine safety measures and assess the possibility of implementing a Hazard Analysis Critical Control Points (HACCP) system. The target population for this study consisted of restaurant owners in Thika Municipality (n = 30). Simple random samples of restaurants were selected by a systematic sampling method for microbial analysis of cooked food, non-cooked food, raw food and water sanitation in the selected restaurants. Two hundred and ninety eight restaurants within Thika Municipality were identified. Of these, 30 were sampled for microbiological testing. From the study, 221 (74%) of the restaurants were ready-to-eat establishments where food was prepared early enough to hold, and only in 77 (26%) of the restaurants did customers make an order of the food they wanted. 118 (63%) of the restaurant operators/staff had knowledge of quality control in food safety measures, 24 (8%) of the restaurants applied this knowledge, while 256 (86%) of the restaurant staff indicated that food contains ingredients that are hazardous if poorly handled. 238 (80%) of the restaurants used weighing and sorting of food materials, 45 (15%) used preservation methods, and the rest used dry foods as critical control points on food safety measures. The study showed that there was need for implementation of a Hazard Analysis Critical Control Points (HACCP) system to enhance food safety. Knowledge of HACCP was very low, with 89 (30%) of the restaurants applying some quality measures to the food production process. There was contamination with coliforms, Escherichia coli and Staphylococcus aureus, though at a very low level. The mean counts of coliforms, Escherichia coli and Staphylococcus aureus in sampled food were 9.7 x 10^3 CFU/g, 8.2 x 10^3 CFU/g and 5.4 x 10^3 CFU/g respectively, with coliforms having the highest mean.
Advancing our thinking in presence-only and used-available analysis.
Warton, David; Aarts, Geert
2013-11-01
1. The problems of analysing used-available data and presence-only data are equivalent, and this paper uses this equivalence as a platform for exploring opportunities for advancing analysis methodology. 2. We suggest some potential methodological advances in used-available analysis, made possible via lessons learnt in the presence-only literature, for example, using modern methods to improve predictive performance. We also consider the converse - potential advances in presence-only analysis inspired by used-available methodology. 3. Notwithstanding these potential advances in methodology, perhaps a greater opportunity is in advancing our thinking about how to apply a given method to a particular data set. 4. It is shown by example that strikingly different results can be achieved for a single data set by applying a given method of analysis in different ways - hence having chosen a method of analysis, the next step of working out how to apply it is critical to performance. 5. We review some key issues to consider in deciding how to apply an analysis method: apply the method in a manner that reflects the study design; consider data properties; and use diagnostic tools to assess how reasonable a given analysis is for the data at hand. © 2013 The Authors. Journal of Animal Ecology © 2013 British Ecological Society.
Critical point and phase behavior of the pure fluid and a Lennard-Jones mixture
NASA Astrophysics Data System (ADS)
Potoff, Jeffrey J.; Panagiotopoulos, Athanassios Z.
1998-12-01
Monte Carlo simulations in the grand canonical ensemble were used to obtain liquid-vapor coexistence curves and critical points of the pure fluid and a binary mixture of Lennard-Jones particles. Critical parameters were obtained from mixed-field finite-size scaling analysis and subcritical coexistence data from histogram reweighting methods. The critical parameters of the untruncated Lennard-Jones potential were obtained as Tc*=1.3120±0.0007, ρc*=0.316±0.001 and pc*=0.1279±0.0006. Our results for the critical temperature and pressure are not in agreement with the recent study of Caillol [J. Chem. Phys. 109, 4885 (1998)] on a four-dimensional hypersphere. Mixture parameters were ε1=2ε2 and σ1=σ2, with Lorentz-Berthelot combining rules for the unlike-pair interactions. We determined the critical point at T*=1.0 and pressure-composition diagrams at three temperatures. Our results have much smaller statistical uncertainties relative to comparable Gibbs ensemble simulations.
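The Lorentz-Berthelot combining rules used for the unlike-pair interaction are a one-line calculation: geometric mean for the well depth and arithmetic mean for the size parameter. A quick sketch in reduced units with the paper's mixture choice (ε1 = 2ε2, σ1 = σ2):

```python
def lorentz_berthelot(eps1, sig1, eps2, sig2):
    """Unlike-pair Lennard-Jones parameters from the pure-component ones."""
    eps12 = (eps1 * eps2) ** 0.5   # Berthelot: geometric mean of well depths
    sig12 = 0.5 * (sig1 + sig2)    # Lorentz: arithmetic mean of diameters
    return eps12, sig12

def lj_potential(r, eps, sig):
    """Lennard-Jones pair potential u(r) = 4*eps*((sig/r)^12 - (sig/r)^6)."""
    sr6 = (sig / r) ** 6
    return 4.0 * eps * (sr6 * sr6 - sr6)

# For the mixture studied (eps1 = 2*eps2, sig1 = sig2, reduced units):
eps12, sig12 = lorentz_berthelot(2.0, 1.0, 1.0, 1.0)   # eps12 = sqrt(2), sig12 = 1
```

With identical sigmas, the mixture is non-ideal only through the well depths, which is what makes the pressure-composition diagrams at fixed T* informative.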
Methodology or method? A critical review of qualitative case study reports.
Hyett, Nerida; Kenny, Amanda; Dickson-Swift, Virginia
2014-01-01
Despite on-going debate about credibility, and reported limitations in comparison to other approaches, case study is an increasingly popular approach among qualitative researchers. We critically analysed the methodological descriptions of published case studies. Three high-impact qualitative methods journals were searched to locate case studies published in the past 5 years; 34 were selected for analysis. Articles were categorized as health and health services (n=12), social sciences and anthropology (n=7), or methods (n=15) case studies. The articles were reviewed using an adapted version of established criteria to determine whether adequate methodological justification was present, and if study aims, methods, and reported findings were consistent with a qualitative case study approach. Findings were grouped into five themes outlining key methodological issues: case study methodology or method, case of something particular and case selection, contextually bound case study, researcher and case interactions and triangulation, and study design inconsistent with methodology reported. Improved reporting of case studies by qualitative researchers will advance the methodology for the benefit of researchers and practitioners.
Secondary Traumatic Stress in NICU Nurses: A Mixed-Methods Study.
Beck, Cheryl Tatano; Cusson, Regina M; Gable, Robert K
2017-12-01
Secondary traumatic stress is an occupational hazard for healthcare providers who care for patients who have been traumatized. This type of stress has been reported in various specialties of nursing, but no study to date had specifically focused on neonatal intensive care unit (NICU) nurses. (1) To determine the prevalence and severity of secondary traumatic stress in NICU nurses and (2) to explore those quantitative findings in more depth through nurses' qualitative descriptions of their traumatic experiences caring for critically ill infants in the NICU. Members of NANN were sent e-mails with a link to the electronic survey. In this mixed-methods study, a convergent parallel design was used. Neonatal nurses completed the Secondary Traumatic Stress Scale (STSS) and then described their traumatic experiences caring for critically ill infants in the NICU. SPSS version 24 and content analysis were used to analyze the quantitative and qualitative data, respectively. In this sample of 175 NICU nurses, 49% of the nurses' scores on the STSS indicated moderate to severe secondary traumatic stress. Analysis of the qualitative data revealed 5 themes that described NICU nurses' traumatic experiences caring for critically ill infants. NICU nurses need to know the signs of secondary traumatic stress that they may experience caring for their critically ill infants. Avenues for dealing with the stress should be provided. Future research with a higher response rate to increase the external validity of the findings to the population of neonatal nurses is needed.
The Critical Success Factor Method: Establishing a Foundation for Enterprise Security Management
2004-07-01
SWOT analysis is a commonly used strategic planning technique. It identifies … (Figure titles from the report: Relationship Between Enterprise and Operational Unit CSFs; Affinity Analysis for Determining ISRM Scope; Affinity Analysis for Determining Critical Assets.)
NASA Astrophysics Data System (ADS)
Goldgruber, Markus; Shahriari, Shervin; Zenz, Gerald
2015-11-01
To reduce natural hazard risks due to, e.g., earthquake excitation, seismic safety assessments are carried out. Especially under severe loading, due to the maximum credible or so-called safety evaluation earthquake, critical infrastructure such as high dams must not fail. However, under high loading, local failure may be allowed as long as the entire structure does not collapse. Hence, for a dam, the loss of sliding stability during a short time period might be acceptable if the cumulative displacements after an event are below an acceptable value. This criterion applies not only to gravity dams but also to rock blocks, as sliding is even more likely in zones of higher seismic activity. Sliding modes can occur not only in the dam-foundation contact but also in sliding planes formed by geological conditions. This work qualitatively compares the possible and critical displacements for two methods: the well-known Newmark sliding block analysis and a fluid-foundation-structure interaction simulation with the finite element method. Comparing the maximum displacements predicted by the two methods at the end of the seismic event shows that for high friction angles they are fairly close, whereas for low friction angles the results differ more. The conclusion is that the commonly used Newmark sliding block analysis and the finite element simulation are comparable only for high friction angles, where this factor dominates the behaviour of the structure. It is worth mentioning that the proposed simulation methods are also applicable to dynamic rock wedge problems and not only to dams.
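Newmark's sliding block analysis mentioned above can be sketched in a few lines. This is a simplified one-directional version with a synthetic acceleration pulse; the function name, the pulse shape, and all numerical values are illustrative assumptions, not the paper's dam model:

```python
import numpy as np

def newmark_displacement(accel, dt, a_crit):
    """Newmark sliding-block sketch (one-directional sliding).

    The block starts to slide when ground acceleration exceeds the
    critical (yield) acceleration a_crit, and keeps sliding until the
    relative velocity returns to zero; displacement is the integral
    of that relative velocity (rectangular rule).
    """
    v = 0.0   # relative velocity between block and ground
    d = 0.0   # cumulative sliding displacement
    for a in accel:
        if v > 0.0 or a > a_crit:
            v = max(v + (a - a_crit) * dt, 0.0)  # block cannot slide uphill
            d += v * dt
    return d

# Synthetic ground motion: a single smooth acceleration pulse (m/s^2)
dt = 0.01
t = np.arange(0.0, 2.0, dt)
accel = 1.5 * np.exp(-((t - 0.5) / 0.05) ** 2)
print(newmark_displacement(accel, dt, a_crit=1.0))
```

Lowering the critical acceleration (a weaker sliding plane) increases the cumulative displacement, while a critical acceleration above the peak ground acceleration produces no sliding at all.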
11th Annual CMMI Technology Conference and User Group
2011-11-17
Examples of triggers may include: cost performance; schedule performance; results of management reviews; occurrence of the risk. … Preliminary Hazard Analysis (PHA); Method 3: through bottom-up analysis of design data (e.g., flow diagrams, Failure Mode Effects and Criticality Analysis (FMECA)) … Formal reviews, and the setting up of delta or follow-up reviews, can be used to give the organization more places to look at the products.
Finnveden, Göran; Björklund, Anna; Moberg, Asa; Ekvall, Tomas
2007-06-01
A large number of methods and approaches that can be used for supporting waste management decisions at different levels in society have been developed. In this paper an overview of methods is provided and preliminary guidelines for the choice of methods are presented. The methods introduced include: Environmental Impact Assessment, Strategic Environmental Assessment, Life Cycle Assessment, Cost-Benefit Analysis, Cost-effectiveness Analysis, Life-cycle Costing, Risk Assessment, Material Flow Accounting, Substance Flow Analysis, Energy Analysis, Exergy Analysis, Entropy Analysis, Environmental Management Systems, and Environmental Auditing. The characteristics used are the types of impacts included, the objects under study and whether the method is procedural or analytical. The different methods can be described as systems analysis methods. Waste management systems thinking is receiving increasing attention. This is, for example, evidenced by the suggested thematic strategy on waste by the European Commission where life-cycle analysis and life-cycle thinking get prominent positions. Indeed, life-cycle analyses have been shown to provide policy-relevant and consistent results. However, it is also clear that the studies will always be open to criticism since they are simplifications of reality and include uncertainties. This is something all systems analysis methods have in common. Assumptions can be challenged and it may be difficult to generalize from case studies to policies. This suggests that if decisions are going to be made, they are likely to be made on a less than perfect basis.
The influence of winding direction of two-layer HTS DC cable on the critical current
NASA Astrophysics Data System (ADS)
Vyatkin, V. S.; Kashiwagi, K.; Ivanov, Y. V.; Otabe, E. S.; Yamaguchi, S.
2017-09-01
The design of twist pitch and winding direction in multilayer HTS coaxial cables is important. For HTS AC transmission cables, the main design condition on twist pitch is balancing the inductances of each layer to balance the current between layers. In this work, finite element analysis of coaxial cables wound in the same and in opposite directions is used to calculate the magnetic field distribution, and the critical current of the cable is estimated. The critical current of the cable wound in the same direction was found to be about 10 percent higher than that of the cable wound in opposite directions.
Improvement of the cost-benefit analysis algorithm for high-rise construction projects
NASA Astrophysics Data System (ADS)
Gafurov, Andrey; Skotarenko, Oksana; Plotnikov, Vladimir
2018-03-01
The specific nature of high-rise investment projects, entailing long-term construction, high risks, etc., implies a need to improve the standard algorithm of cost-benefit analysis. An improved algorithm is described in the article. For development of the improved algorithm of cost-benefit analysis for high-rise construction projects, the following methods were used: weighted average cost of capital, dynamic cost-benefit analysis of investment projects, risk mapping, scenario analysis, sensitivity analysis of critical ratios, etc. This comprehensive approach helped to adapt the original algorithm to feasibility objectives in high-rise construction. The authors put together the algorithm of cost-benefit analysis for high-rise construction projects on the basis of risk mapping and sensitivity analysis of critical ratios. The suggested project risk management algorithms greatly expand the standard algorithm of cost-benefit analysis in investment projects, namely: the "Project analysis scenario" flowchart, improving the quality and reliability of forecasting reports in investment projects; the main stages of cash flow adjustment based on risk mapping, for better cost-benefit project analysis given the broad range of risks in high-rise construction; and analysis of dynamic cost-benefit values considering project sensitivity to crucial variables, improving flexibility in the implementation of high-rise projects.
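Two of the methods named above, weighted average cost of capital and dynamic (discounted) cost-benefit analysis, can be sketched with the standard textbook formulas. The capital structure and cash-flow figures below are invented purely for illustration:

```python
def wacc(e, d, re, rd, tax):
    """Weighted average cost of capital: E/(E+D)*Re + D/(E+D)*Rd*(1 - tax)."""
    v = e + d
    return e / v * re + d / v * rd * (1.0 - tax)

def npv(cash_flows, rate):
    """Dynamic (discounted) cost-benefit: year-0-first cash flows at `rate`."""
    return sum(cf / (1.0 + rate) ** year for year, cf in enumerate(cash_flows))

r = wacc(e=60.0, d=40.0, re=0.18, rd=0.10, tax=0.20)  # illustrative capital structure
project = [-100.0, 30.0, 40.0, 50.0, 45.0]            # invented yearly cash flows
print(r, npv(project, r))
```

Sensitivity analysis in this framing simply means re-evaluating the NPV while varying one critical ratio (here, the discount rate) and observing how quickly the project value erodes.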
2009-01-01
In high-dimensional studies such as genome-wide association studies, the correction for multiple testing in order to control total type I error results in decreased power to detect modest effects. We present a new analytical approach based on the higher criticism statistic that allows identification of the presence of modest effects. We apply our method to the genome-wide study of rheumatoid arthritis provided in the Genetic Analysis Workshop 16 Problem 1 data set. There is evidence for unknown bias in this study that could be explained by the presence of undetected modest effects. We compared the asymptotic and empirical thresholds for the higher criticism statistic. Using the asymptotic threshold we detected the presence of modest effects genome-wide. We also detected modest effects using the 90th percentile of the empirical null distribution as a threshold; however, there was no such evidence when the 95th and 99th percentiles were used. While the higher criticism method suggests that there is some evidence for modest effects, interpreting individual single-nucleotide polymorphisms with significant higher criticism statistics is of undetermined value. The goal of higher criticism is to alert the researcher that genetic effects remain to be discovered and to promote the use of more targeted and powerful studies to detect the remaining effects. PMID:20018032
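The higher criticism statistic at the heart of this abstract can be illustrated with a short sketch. This is a minimal version of the Donoho-Jin statistic; the function name, the `alpha0` fraction, and the simulated p-values are illustrative assumptions, not the Workshop's actual analysis:

```python
import numpy as np

def higher_criticism(pvals, alpha0=0.5):
    """Donoho-Jin higher criticism statistic (illustrative sketch).

    Compares ordered p-values with their expected uniform quantiles;
    a large value signals the collective presence of modest effects
    even when no single test survives multiple-testing correction.
    """
    p = np.sort(np.asarray(pvals))
    n = len(p)
    i = np.arange(1, n + 1)
    hc = np.sqrt(n) * (i / n - p) / np.sqrt(p * (1 - p))
    k = max(1, int(alpha0 * n))   # maximize over the smallest alpha0 fraction
    return float(hc[:k].max())

rng = np.random.default_rng(0)
null_p = rng.uniform(size=1000)                               # pure noise
mixed_p = np.concatenate([rng.uniform(size=990),
                          rng.uniform(0.0, 1e-3, size=10)])   # plus modest effects
print(higher_criticism(null_p), higher_criticism(mixed_p))
```

A batch of p-values containing even a handful of modest effects yields a markedly larger statistic than pure noise, which is exactly the "alert the researcher" role the abstract describes.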
Kurtz, Martha J.
2007-01-01
Increasingly, national stakeholders express concern that U.S. college graduates cannot adequately solve problems and think critically. As a set of cognitive abilities, critical thinking skills provide students with tangible academic, personal, and professional benefits that may ultimately address these concerns. As an instructional method, writing has long been perceived as a way to improve critical thinking. In the current study, the researchers compared critical thinking performance of students who experienced a laboratory writing treatment with those who experienced traditional quiz-based laboratory in a general education biology course. The effects of writing were determined within the context of multiple covariables. Results indicated that the writing group significantly improved critical thinking skills whereas the nonwriting group did not. Specifically, analysis and inference skills increased significantly in the writing group but not the nonwriting group. Writing students also showed greater gains in evaluation skills; however, these were not significant. In addition to writing, prior critical thinking skill and instructor significantly affected critical thinking performance, whereas other covariables such as gender, ethnicity, and age were not significant. With improved critical thinking skill, general education biology students will be better prepared to solve problems as engaged and productive citizens. PMID:17548876
Earth Observing System (EOS) Advanced Microwave Sounding Unit-A (AMSU-A) schedule plan
NASA Technical Reports Server (NTRS)
1994-01-01
This report describes Aerojet's methods and procedures used to control and administer contractual schedules for the EOS/AMSU-A program. Included are the following: the master, intermediate, and detail schedules; critical path analysis; and the total program logic network diagrams.
Readiness to perform testing : a critical analysis of the concept and current practices.
DOT National Transportation Integrated Search
1993-08-01
Readiness to Perform (RTP) testing has become an increasingly popular alternative to biochemical screening as a method for assessing risk factors (e.g., drug, alcohol, fatigue) in the workplace. The focus of RTP testing is on the assessment of ...
DOT National Transportation Integrated Search
2016-10-01
The study outlined in this report aimed to quantify the available redundancy in pony truss bridge systems : constructed using standard designs and practices in the state of Ohio. A method of conducting refined : three-dimensional nonlinear finite ele...
Cardiorespiratory Variability and Synchronization in Critical Illness
2008-03-08
Sedating drugs routinely prescribed to relieve patient anxiety were discontinued prior to the data collection. All patients were awake and responsive to … (Cited reference: …P.; Tarvainen, M. P.; Ranta-Aho, P. O.; Karjalainen, P. A. Software for advanced HRV analysis. Computer Methods and Programs in Biomedicine, 2004.)
Automated Student Model Improvement
ERIC Educational Resources Information Center
Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.
2012-01-01
Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…
Falls from Height in the Construction Industry: A Critical Review of the Scientific Literature
Nadhim, Evan A.; Hon, Carol; Xia, Bo; Stewart, Ian; Fang, Dongping
2016-01-01
Globally, falls from height (FFH) are a substantial public health hazard and are among the leading causes of serious and fatal injuries for construction workers. A comprehensive understanding of the causal factors in FFH incidents is urgently required; however, the literature appears to lack a scientific review of FFH. In this study, 297 articles that contribute to the topic of fall incidents were reviewed. Seventy-five (75) articles met the criteria for relevance and were aggregated in a database to support a critical review. A synthesis of macro-variables was adopted rather than a structured meta-analysis; such a method of analysis provides the flexibility to combine previous studies' findings. The most common factors associated with FFH are risky activities, individual characteristics, site conditions, organizational characteristics, agents (scaffolds/ladders) and weather conditions. The outcomes help identify the most significant research areas for safety enhancement: improving engineering facilities, behaviour investigations and FFH prevention methods. PMID:27367706
Study design in high-dimensional classification analysis.
Sánchez, Brisa N; Wu, Meihua; Song, Peter X K; Wang, Wen
2016-10-01
Advances in high-throughput technology have accelerated the use of hundreds to millions of biomarkers to construct classifiers that partition patients into different clinical conditions. Prior to classifier development in actual studies, a critical need is to determine the sample size required to reach a specified classification precision. We develop a systematic approach for sample size determination in high-dimensional (large p, small n) classification analysis. Our method utilizes the probability of correct classification (PCC) as the optimization objective function and incorporates the higher criticism thresholding procedure for classifier development. Further, we derive the theoretical bound of maximal PCC gain from feature augmentation (e.g. when molecular and clinical predictors are combined in classifier development). Our methods are motivated and illustrated by a study using proteomics markers to classify post-kidney transplantation patients into stable and rejecting classes. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Expert nurses' clinical reasoning under uncertainty: representation, structure, and process.
Fonteyn, M. E.; Grobe, S. J.
1992-01-01
How do expert nurses reason when planning care and making clinical decisions for a patient who is at risk and whose outcome is uncertain? In this study, a case involving a critically ill elderly woman whose condition deteriorated over time was presented in segments to ten expert critical care nurses. The think-aloud method was used to elicit knowledge from these experts, to provide conceptual information about their knowledge, and to reveal their reasoning processes and problem-solving strategies. The verbatim transcripts were then analyzed using a systematic three-step method that makes analysis easier and adds credibility to study findings by providing a means of retracing and explaining analysis results. Findings revealed information about how patient problems were represented during reasoning, the manner in which expert subjects structured their plan of care, and the reasoning processes and heuristics they used to formulate solutions for resolving the patient's problems and preventing deterioration in the patient's condition. PMID:1482907
Greene, Jacob; Louis, Julien; Korostynska, Olga; Mason, Alex
2017-01-01
Muscle glycogen levels have a profound impact on an athlete's sporting performance, so their measurement is vital. Carbohydrate manipulation is a fundamental component of an athlete's lifestyle and a critical part of elite performance, since it can provide necessary training adaptations. This paper provides a critical review of the current invasive and non-invasive methods for measuring skeletal muscle glycogen levels. These include the gold-standard muscle biopsy, histochemical analysis, magnetic resonance spectroscopy, and musculoskeletal high-frequency ultrasound, as well as the future application of electromagnetic sensors for portable, non-invasive quantification of muscle glycogen. This paper will be of interest to researchers who wish to understand the current and most appropriate techniques for measuring skeletal muscle glycogen. This will have applications both in the lab and in the field by improving the accuracy of research protocols and following the physiological adaptations to exercise. PMID:28241495
Thematic Analysis: How do patient diaries affect survivors' psychological recovery?
Teece, Angela; Baker, John
2017-08-01
This review aims to use thematic analysis to explore and synthesise evidence of the actual or potential effects of diaries on the psychological rehabilitation and recovery of discharged critical care patients. Evidence suggests that whilst admission to critical care may save patients' lives, the psychological aftermath can hinder a patient's recovery, and these needs must be met. Patient diaries are one potential intervention to help patients understand their critical illness and fill memory gaps caused by sedation, thus reducing psychological distress post-discharge. Prospective patient diaries are increasing in popularity amongst critical care units in the United Kingdom; however, there is little evidence to support their use or explain their effects. A literature review using systematic methods was undertaken of studies relating to the effects of diaries on discharged patients. Thematic analysis enabled the generation and synthesis of themes. Three themes arose from the generated codes: 1) Reclaiming ownership of lost time. 2) Emphasising personhood. 3) Fear and frustration. The diary intervention was shown to have a largely positive impact on survivors' psychological rehabilitation. However, caution should be exercised, as recipients could find the contents painful and emotional. Diaries should be embedded within a robust critical care follow-up plan. This review suggests that diaries have the potential to form one aspect of rehabilitation and make a positive impact on patients' recovery. More research is indicated to fully evaluate the effects of diaries on their recipients. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.
Linking stressors and ecological responses
Gentile, J.H.; Solomon, K.R.; Butcher, J.B.; Harrass, M.; Landis, W.G.; Power, M.; Rattner, B.A.; Warren-Hicks, W.J.; Wenger, R.; Foran, Jeffery A.; Ferenc, Susan A.
1999-01-01
To characterize risk, it is necessary to quantify the linkages and interactions between chemical, physical and biological stressors and endpoints in the conceptual framework for ecological risk assessment (ERA). This can present challenges in a multiple stressor analysis, and it will not always be possible to develop a quantitative stressor-response profile. This review commences with a conceptual representation of the problem of developing a linkage analysis for multiple stressors and responses. The remainder of the review surveys a variety of mathematical and statistical methods (e.g., ranking methods, matrix models, multivariate dose-response for mixtures, indices, visualization, simulation modeling and decision-oriented methods) for accomplishing the linkage analysis for multiple stressors. Describing the relationships between multiple stressors and ecological effects is a critical component of 'effects assessment' in the ecological risk assessment framework.
IDHEAS – A NEW APPROACH FOR HUMAN RELIABILITY ANALYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
G. W. Parry; J.A Forester; V.N. Dang
2013-09-01
This paper describes a method, IDHEAS (Integrated Decision-Tree Human Event Analysis System), that has been developed jointly by the US NRC and EPRI as an improved approach to Human Reliability Analysis (HRA) that is based on an understanding of the cognitive mechanisms and performance influencing factors (PIFs) that affect operator responses. The paper describes the various elements of the method, namely the performance of a detailed cognitive task analysis that is documented in a crew response tree (CRT), the development of the associated time-line to identify the critical tasks, i.e. those whose failure results in a human failure event (HFE), and an approach to quantification that is based on explanations of why the HFE might occur.
Back analysis of geomechanical parameters in underground engineering using artificial bee colony.
Zhu, Changxing; Zhao, Hongbo; Zhao, Ming
2014-01-01
Accurate geomechanical parameters are critical in tunnel excavation, design, and support. In this paper, a displacement back analysis based on the artificial bee colony (ABC) algorithm is proposed to identify geomechanical parameters from monitored displacements. For problems with an analytical solution, ABC was used as a global optimization algorithm to search for the unknown geomechanical parameters directly. For problems without an analytical solution, optimal back analysis is time-consuming, so least squares support vector machine (LSSVM) was used to build the relationship between the unknown geomechanical parameters and displacement and to improve the efficiency of the back analysis. The proposed method was applied to a tunnel with an analytical solution and a tunnel without one. The results show the proposed method is feasible.
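A minimal artificial bee colony sketch in the spirit of the paper's back analysis is shown below. The forward model, its "monitored" data, and all parameter values are invented toys; a real back analysis would call a numerical or analytical tunnel model in place of `misfit`:

```python
import numpy as np

rng = np.random.default_rng(3)

def misfit(params):
    """Toy forward model: 'predicted' tunnel displacement for parameters
    (E, K0) against invented monitored data (true E = 4, K0 = 1.5)."""
    E, K0 = params
    return (50.0 / E + 2.0 * K0 - (50.0 / 4.0 + 2.0 * 1.5)) ** 2

def abc_minimize(f, bounds, n_food=20, n_iter=200, limit=20):
    """Minimal artificial bee colony: employed, onlooker, and scout phases."""
    lo, hi = np.array(bounds, dtype=float).T
    dim = len(lo)
    foods = lo + rng.uniform(size=(n_food, dim)) * (hi - lo)
    costs = np.array([f(x) for x in foods])
    trials = np.zeros(n_food, dtype=int)

    def try_neighbor(i):
        k = (i + 1 + rng.integers(n_food - 1)) % n_food   # a partner != i
        j = rng.integers(dim)
        x = foods[i].copy()
        x[j] += rng.uniform(-1.0, 1.0) * (foods[i, j] - foods[k, j])
        x = np.clip(x, lo, hi)
        c = f(x)
        if c < costs[i]:                   # greedy replacement
            foods[i], costs[i], trials[i] = x, c, 0
        else:
            trials[i] += 1

    for _ in range(n_iter):
        for i in range(n_food):            # employed bees: local search
            try_neighbor(i)
        weights = 1.0 / (1.0 + costs)      # onlookers favour good sources
        for _ in range(n_food):
            try_neighbor(rng.choice(n_food, p=weights / weights.sum()))
        for i in range(n_food):            # scouts: abandon stagnant sources
            if trials[i] > limit:
                foods[i] = lo + rng.uniform(size=dim) * (hi - lo)
                costs[i] = f(foods[i])
                trials[i] = 0
    best = int(np.argmin(costs))
    return foods[best], float(costs[best])

params, cost = abc_minimize(misfit, bounds=[(1.0, 10.0), (0.5, 3.0)])
print(params, cost)
```

Note that with a single observed quantity the toy problem is under-determined (many (E, K0) pairs fit equally well), which mirrors a genuine difficulty of back analysis: more monitoring points are needed to pin the parameters down uniquely.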
NASA Astrophysics Data System (ADS)
Jin, Tao; Chen, Yiyang; Flesch, Rodolfo C. C.
2017-11-01
Harmonics pose a great threat to the safe and economical operation of power grids. Therefore, it is critical to detect harmonic parameters accurately in order to design harmonic compensation equipment. The fast Fourier transform (FFT) is widely used for power system harmonic analysis. However, the picket fence effect produced by the algorithm itself and the spectrum leakage caused by asynchronous sampling often degrade the accuracy of harmonic analysis. This paper examines a new approach to harmonic analysis based on deriving correction formulas for frequency, phase angle, and amplitude, utilizing the Nuttall-Kaiser window double-spectrum-line interpolation method, which overcomes the shortcomings of traditional FFT harmonic calculations. The proposed approach is verified numerically and experimentally to be accurate and reliable.
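The double-spectrum-line interpolation idea can be illustrated with the classic Hann-window (Grandke) variant; the paper's Nuttall-Kaiser window follows the same principle with different correction formulas, which are not reproduced here:

```python
import numpy as np

def hann_two_line_freq(x, fs):
    """Two-spectrum-line interpolation, Hann-window (Grandke) variant.

    The peak DFT line and its larger neighbour are combined to correct
    the frequency estimate for the offset between the true frequency
    and the nearest FFT bin (the picket fence effect).
    """
    n = len(x)
    X = np.abs(np.fft.rfft(x * np.hanning(n)))
    k = int(np.argmax(X[1:-1])) + 1          # peak bin, excluding DC/Nyquist
    if X[k + 1] >= X[k - 1]:                 # interpolate toward larger neighbour
        alpha = X[k + 1] / X[k]
        delta = (2.0 * alpha - 1.0) / (alpha + 1.0)
    else:
        alpha = X[k - 1] / X[k]
        delta = -(2.0 * alpha - 1.0) / (alpha + 1.0)
    return (k + delta) * fs / n

fs, n = 1000.0, 1024
t = np.arange(n) / fs
f_true = 50.3                                # deliberately between FFT bins
x = np.sin(2.0 * np.pi * f_true * t)
print(hann_two_line_freq(x, fs))
```

With asynchronous sampling the raw peak-bin estimate can be off by up to half a bin (about 0.49 Hz here); the two-line correction recovers the off-bin frequency far more closely.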
Critical behavior of the van der Waals bonded ferromagnet Fe3-xGeTe2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Yu; Ivanovski, V. N.; Petrovic, C.
2017-10-29
The critical properties of the single-crystalline van der Waals bonded ferromagnet Fe3-xGeTe2 were investigated by bulk dc magnetization around the paramagnetic to ferromagnetic (FM) phase transition. The Fe3-xGeTe2 single crystals, grown by the self-flux method with Fe deficiency x ≈ 0.36, exhibit bulk FM ordering below Tc = 152 K. Mössbauer spectroscopy was used to provide information on defects and the local atomic environment in such crystals. Critical exponents β = 0.372(4) with a critical temperature Tc = 151.25(5) K and γ = 1.265(15) with Tc = 151.17(12) K were obtained by the Kouvel-Fisher method, whereas δ = 4.50(1) was obtained by a critical isotherm analysis at Tc = 151 K. These critical exponents obey the Widom scaling relation δ = 1 + γ/β, indicating self-consistency of the obtained values. With these critical exponents, the isotherm M(H) curves below and above the critical temperature collapse onto two independent universal branches, obeying the single scaling equation m = f±(h), where m and h are the renormalized magnetization and field, respectively. The exponents determined in this study are close to those calculated from the results of the renormalization group approach for a heuristic model of three-dimensional Heisenberg (d = 3, n = 3) spins coupled with attractive long-range interactions between spins that decay as J(r) ≈ r^-(3+σ) with σ = 1.89.
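The Kouvel-Fisher method used here can be sketched on synthetic data: for M_s(T) = M_0 (T_c − T)^β, the ratio M_s / (dM_s/dT) equals (T − T_c)/β, which is linear in T, so a straight-line fit recovers β from the slope and T_c from the intercept. The values below are synthetic, chosen only to mimic the reported exponents:

```python
import numpy as np

# Kouvel-Fisher sketch on synthetic magnetization data (M0 = 1).
beta_true, Tc_true = 0.372, 151.2
T = np.linspace(130.0, 150.0, 200)
Ms = (Tc_true - T) ** beta_true              # M_s(T) = (Tc - T)^beta

Y = Ms / np.gradient(Ms, T)                  # Kouvel-Fisher function = (T - Tc)/beta
slope, intercept = np.polyfit(T, Y, 1)
beta_fit = 1.0 / slope                       # slope = 1/beta
Tc_fit = -intercept / slope                  # x-intercept = Tc
print(beta_fit, Tc_fit)
```

The appeal of the method is that it needs no prior knowledge of T_c: both the exponent and the critical temperature fall out of a single linear fit.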
Interpretation of correlations in clinical research.
Hung, Man; Bounsanga, Jerry; Voss, Maren Wright
2017-11-01
Critically analyzing research is a key skill in evidence-based practice and requires knowledge of research methods, results interpretation, and applications, all of which rely on a foundation in statistics. Evidence-based practice makes high demands on trained medical professionals to interpret an ever-expanding array of research evidence. As clinical training emphasizes medical care rather than statistics, it is useful to review the basics of statistical methods and what they mean for interpreting clinical studies. We reviewed the basic concepts of correlational associations, violations of normality, unobserved variable bias, sample size, and alpha inflation. The foundations of causal inference were discussed and sound statistical analyses were examined. We discuss four ways in which correlational analysis is misused: causal inference overreach, over-reliance on significance, alpha inflation, and sample size bias. Recently published studies in the medical field provide evidence of causal assertion overreach drawn from correlational findings. The findings present a primer on the assumptions and nature of correlational methods of analysis and urge clinicians to exercise appropriate caution as they critically analyze the evidence before them and evaluate evidence that supports practice. Critically analyzing new evidence requires statistical knowledge in addition to clinical knowledge. Studies can overstate relationships, expressing causal assertions when only correlational evidence is available. Failure to account for the effect of sample size in the analyses tends to overstate the importance of predictive variables. It is important not to overemphasize statistical significance without consideration of effect size and whether differences could be considered clinically meaningful.
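The alpha-inflation problem discussed above has a compact numerical illustration: under the null hypothesis p-values are uniform, so the chance that at least one of m tests appears "significant" by chance grows as 1 − (1 − α)^m. The simulation parameters below are arbitrary choices for the demonstration:

```python
import numpy as np

def simulated_fwer(m_tests, alpha=0.05, n_sims=20000, seed=1):
    """Chance of at least one 'significant' result among m null tests.

    Under the null hypothesis p-values are Uniform(0, 1), so the
    family-wise error rate grows as 1 - (1 - alpha)**m.
    """
    rng = np.random.default_rng(seed)
    p = rng.uniform(size=(n_sims, m_tests))
    return float((p.min(axis=1) < alpha).mean())

for m in (1, 10, 50):
    print(m, simulated_fwer(m), 1.0 - (1.0 - 0.05) ** m)
```

At 50 uncorrected tests the family-wise error rate exceeds 90 percent, which is why exploratory correlational studies that run many tests at α = 0.05 so readily "find" spurious relationships.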
NASA Astrophysics Data System (ADS)
Roostaee, M.; Deng, Z.
2017-12-01
State environmental agencies are required by the Clean Water Act to assess all waterbodies and evaluate potential sources of impairment. Spatial and temporal distributions of water quality parameters are critical in identifying Critical Source Areas (CSAs). However, due to limited monetary resources and a large number of waterbodies, available monitoring stations are typically sparse, with intermittent periods of data collection. Hence, scarcity of water quality data is a major obstacle in addressing sources of pollution through management strategies. In this study, the spatiotemporal Bayesian Maximum Entropy (BME) method is employed to model the inherent temporal and spatial variability of measured water quality indicators, such as dissolved oxygen (DO) concentration, for the Turkey Creek Watershed. Turkey Creek is located in northern Louisiana and has been listed in the 303(d) list for DO impairment since 2014 in Louisiana Water Quality Inventory Reports due to agricultural practices. The BME method has been shown to provide more accurate estimates than purely spatial analysis methods by incorporating the space/time distribution and the uncertainty in available measured soft and hard data. This model would then be used to estimate DO concentration at unmonitored locations and times and subsequently to identify CSAs. The USDA's crop-specific land cover data layers of the watershed were then used to determine the practices/changes that led to low DO concentration in the identified CSAs. Preliminary results revealed that cultivation of corn and soybean, as well as urban runoff, are the main contributors to low dissolved oxygen in the Turkey Creek Watershed.
iCLIP: protein-RNA interactions at nucleotide resolution.
Huppertz, Ina; Attig, Jan; D'Ambrogio, Andrea; Easton, Laura E; Sibley, Christopher R; Sugimoto, Yoichiro; Tajnik, Mojca; König, Julian; Ule, Jernej
2014-02-01
RNA-binding proteins (RBPs) are key players in the post-transcriptional regulation of gene expression. Precise knowledge about their binding sites is therefore critical to unravel their molecular function and to understand their role in development and disease. Individual-nucleotide resolution UV crosslinking and immunoprecipitation (iCLIP) identifies protein-RNA crosslink sites on a genome-wide scale. The high resolution and specificity of this method are achieved by an intramolecular cDNA circularization step that enables analysis of cDNAs that truncate at the protein-RNA crosslink sites. Here, we describe the improved iCLIP protocol and discuss critical optimization and control experiments that are required when applying the method to new RBPs. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.
Post graduate ESP curriculum: reading and writing needs.
Dehnad, Afsaneh; Bagherzadeh, Rafat; Bigdeli, Shoaleh; Hatami, Kamran; Hosseini, Agha Fatemeh
2014-01-01
Assessing learners' needs is an integral part of curriculum and course design, including English for specific purposes (ESP) syllabus design, materials development, teaching methods, and testing. The critical approach to needs analysis, which is relatively recent, acknowledges the rights of different stakeholders, including teachers, students and administrators, in the process of needs analysis. However, there has been no formal needs analysis for syllabus design at the postgraduate level in medical universities affiliated with the Ministry of Health in Iran. This study, conducted in 2011, was an attempt to assess the reading and writing needs of postgraduate students in ESP courses on the basis of the critical approach to needs analysis. The study population consisted of 67 people: 56 postgraduate students, 5 heads of departments, 5 ESP instructors and 1 executive manager at the Ministry of Health in Iran. Ethical and demographic forms, needs analysis questionnaires, and a semi-structured interview form were the instruments of the study. According to the findings, there was a discrepancy between students' and instructors' perceptions of learners' needs and the assumed needs appearing in the syllabi prescribed by the Ministry of Health in Iran. This study showed that critical needs analysis, in which the rights of different stakeholders are acknowledged, is necessary for meeting the requirements of ESP classes, especially at the postgraduate level, where instructors and learners are fully aware of learners' needs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Yu; Petrovic, C.
Some critical properties of the single-crystalline semiconducting ferromagnet Cr2Ge2Te6 were investigated by bulk dc magnetization around the paramagnetic-to-ferromagnetic phase transition. Critical exponents β = 0.200 ± 0.003 with a critical temperature Tc = 62.65 ± 0.07 K and γ = 1.28 ± 0.03 with Tc = 62.75 ± 0.06 K are obtained by the Kouvel-Fisher method, whereas δ = 7.96 ± 0.01 is obtained by a critical isotherm analysis at Tc = 62.7 K. These critical exponents obey the Widom scaling relation δ = 1 + γ/β, indicating self-consistency of the obtained values. Furthermore, with these critical exponents the isotherm M(H) curves below and above the critical temperature collapse into two independent universal branches, obeying the single scaling equation m = f±(h), where m and h are the renormalized magnetization and field, respectively. The determined exponents match well with those calculated from the renormalization group approach for a two-dimensional Ising system coupled with a long-range interaction between spins decaying as J(r) ≈ r^-(d+σ) with σ = 1.52.
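The Kouvel-Fisher construction used above can be sketched numerically: assuming the standard power law M_s(T) ∝ (Tc − T)^β below Tc, the quantity Y(T) = M_s (dM_s/dT)⁻¹ = (T − Tc)/β is linear in T, with slope 1/β and root at Tc. The sketch below runs on synthetic data (β and Tc borrowed from the abstract purely for illustration), not on the paper's measurements.

```python
# Minimal Kouvel-Fisher sketch on synthetic magnetization data.
# Assumption: M_s(T) = (Tc - T)^beta exactly; real data are noisy.
def kouvel_fisher(T, M):
    # Y(T) = M / (dM/dT), estimated by central differences
    Y, Tm = [], []
    for i in range(1, len(T) - 1):
        dMdT = (M[i + 1] - M[i - 1]) / (T[i + 1] - T[i - 1])
        Y.append(M[i] / dMdT)
        Tm.append(T[i])
    # least-squares line Y = a*T + b  ->  beta = 1/a, Tc = -b/a
    n = len(Tm)
    sx, sy = sum(Tm), sum(Y)
    sxx = sum(t * t for t in Tm)
    sxy = sum(t * y for t, y in zip(Tm, Y))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return 1.0 / a, -b / a

beta_true, Tc_true = 0.200, 62.65        # illustrative values
T = [Tc_true - 5.0 + 0.05 * i for i in range(90)]   # stays below Tc
M = [(Tc_true - t) ** beta_true for t in T]
beta, Tc = kouvel_fisher(T, M)
```

On clean synthetic data the fit recovers β and Tc to well within the quoted uncertainties; on experimental data the same construction is applied to the measured spontaneous magnetization.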
Li, Guowei; Cook, Deborah J; Levine, Mitchell A H; Guyatt, Gordon; Crowther, Mark; Heels-Ansdell, Diane; Holbrook, Anne; Lamontagne, Francois; Walter, Stephen D; Ferguson, Niall D; Finfer, Simon; Arabi, Yaseen M; Bellomo, Rinaldo; Cooper, D Jamie; Thabane, Lehana
2015-09-01
Failure to recognize the presence of competing risk or to account for it may result in misleading conclusions. We aimed to perform a competing risk analysis to assess the efficacy of the low molecular weight heparin dalteparin versus unfractionated heparin (UFH) in venous thromboembolism (VTE) in medical-surgical critically ill patients, taking death as a competing risk. This was a secondary analysis of a prospective randomized study of the Prophylaxis for Thromboembolism in Critical Care Trial (PROTECT) database. A total of 3746 medical-surgical critically ill patients from 67 intensive care units (ICUs) in 6 countries receiving either subcutaneous UFH 5000 IU twice daily (n = 1873) or dalteparin 5000 IU once daily plus once-daily placebo (n = 1873) were included for analysis. A total of 205 incident proximal leg deep vein thromboses (PLDVT) were reported during follow-up, among which 96 were in the dalteparin group and 109 were in the UFH group. No significant treatment effect of dalteparin on PLDVT compared with UFH was observed in either the competing risk analysis or standard survival analysis (also known as cause-specific analysis) using multivariable models adjusted for APACHE II score, history of VTE, need for vasopressors, and end-stage renal disease: sub-hazard ratio (SHR) = 0.92, 95% confidence interval (CI): 0.70-1.21, P-value = 0.56 for the competing risk analysis; hazard ratio (HR) = 0.92, 95% CI: 0.68-1.23, P-value = 0.57 for cause-specific analysis. Dalteparin was associated with a significant reduction in risk of pulmonary embolism (PE): SHR = 0.54, 95% CI: 0.31-0.94, P-value = 0.02 for the competing risk analysis; HR = 0.51, 95% CI: 0.30-0.88, P-value = 0.01 for the cause-specific analysis.
Two additional sensitivity analyses using the treatment variable as a time-dependent covariate and using as-treated and per-protocol approaches demonstrated similar findings. This competing risk analysis yields no significant treatment effect on PLDVT but a superior effect of dalteparin on PE compared with UFH in medical-surgical critically ill patients. The findings from the competing risk method are in accordance with results from the cause-specific analysis. clinicaltrials.gov Identifier: NCT00182143.
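The core idea of treating death as a competing risk can be illustrated with the standard nonparametric cumulative incidence function (CIF): at each event time, the probability of the event of interest is the all-cause Kaplan-Meier survival just before that time multiplied by the event's hazard at it. The sketch below uses invented toy data, not the PROTECT trial dataset, and omits the regression (sub-hazard) modeling the paper performs.

```python
# Hedged sketch: nonparametric cumulative incidence with a competing risk.
# events: 0 = censored, 1 = event of interest (e.g. DVT), 2 = competing (death)
def cumulative_incidence(times, events, event_of_interest=1):
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0       # all-cause Kaplan-Meier survival just before t
    cif = 0.0
    out = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d1 = d_any = c = 0
        while i < len(data) and data[i][0] == t:   # group tied times
            if data[i][1] == event_of_interest:
                d1 += 1
            if data[i][1] != 0:
                d_any += 1
            else:
                c += 1
            i += 1
        cif += surv * d1 / at_risk        # CIF increment for this time
        surv *= 1.0 - d_any / at_risk     # all-cause survival update
        at_risk -= d_any + c
        out.append((t, cif))
    return out

times  = [2, 3, 3, 5, 8, 10, 12, 15]   # toy follow-up times
events = [1, 2, 0, 1, 2,  1,  0,  0]
curve = cumulative_incidence(times, events)
```

Unlike 1 minus the naive Kaplan-Meier for the event of interest, this estimator does not treat deaths as censored, which is exactly the distinction motivating the analysis above.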
Tremblay, Marie-Claude; Brousselle, Astrid; Richard, Lucie; Beaudet, Nicole
2013-10-01
Program designers and evaluators should make a point of testing the validity of a program's intervention theory before investing either in implementation or in any type of evaluation. In this context, logic analysis can be a particularly useful option, since it can be used to test the plausibility of a program's intervention theory using scientific knowledge. Professional development in public health is one field among several that would truly benefit from logic analysis, as it appears to be generally lacking in theorization and evaluation. This article presents the application of this analysis method to an innovative public health professional development program, the Health Promotion Laboratory. More specifically, this paper aims to (1) define the logic analysis approach and differentiate it from similar evaluative methods; (2) illustrate the application of this method by a concrete example (logic analysis of a professional development program); and (3) reflect on the requirements of each phase of logic analysis, as well as on the advantages and disadvantages of such an evaluation method. Using logic analysis to evaluate the Health Promotion Laboratory showed that, generally speaking, the program's intervention theory appeared to have been well designed. By testing and critically discussing logic analysis, this article also contributes to further improving and clarifying the method. Copyright © 2013 Elsevier Ltd. All rights reserved.
Probabilistic Structural Analysis Program
NASA Technical Reports Server (NTRS)
Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.
2010-01-01
NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and life-prediction methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
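The quantity at the heart of such a program, the probability of failure of a limit state, can be sketched with plain Monte Carlo sampling (NESSUS itself uses far more efficient algorithms such as the advanced mean value method). The limit state g = R − S (strength minus load) and the distributions below are illustrative assumptions, not NESSUS inputs.

```python
# Hedged sketch: Monte Carlo probability of failure for g = R - S.
# Distributions and units are invented for illustration only.
import random

def prob_failure(n=200_000, seed=1):
    rng = random.Random(seed)
    fails = 0
    for _ in range(n):
        R = rng.gauss(500.0, 50.0)   # strength, e.g. MPa
        S = rng.gauss(350.0, 40.0)   # load effect
        if R - S < 0.0:              # limit state violated
            fails += 1
    return fails / n

pf = prob_failure()
```

For these normal inputs the exact answer is Φ(−150/√(50² + 40²)) ≈ 0.0096, so the brute-force estimate needs many samples for rare events; that inefficiency is precisely what variance-reduction methods like adaptive importance sampling address.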
Sequential change detection and monitoring of temporal trends in random-effects meta-analysis.
Dogo, Samson Henry; Clark, Allan; Kulinskaya, Elena
2017-06-01
Temporal changes in magnitude of effect sizes reported in many areas of research are a threat to the credibility of the results and conclusions of meta-analysis. Numerous sequential methods for meta-analysis have been proposed to detect changes and monitor trends in effect sizes so that a meta-analysis can be updated when necessary and interpreted based on the time it was conducted. The difficulties of sequential meta-analysis under the random-effects model are caused by dependencies in increments introduced by the estimation of the heterogeneity parameter τ². In this paper, we propose the use of a retrospective cumulative sum (CUSUM)-type test with bootstrap critical values. This method allows retrospective analysis of the past trajectory of cumulative effects in random-effects meta-analysis and its visualization on a chart similar to a CUSUM chart. Simulation results show that the new method demonstrates good control of Type I error regardless of the number or size of the studies and the amount of heterogeneity. Application of the new method is illustrated on two examples of medical meta-analyses. © 2016 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
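A stripped-down version of the retrospective CUSUM-with-bootstrap idea can be sketched as follows. The toy effect sizes, the plain i.i.d. resampling scheme, and the max-absolute-CUSUM statistic are all illustrative simplifications; the paper's method additionally handles the random-effects τ² dependence that this sketch ignores.

```python
# Hedged sketch: retrospective CUSUM over a sequence of study effect
# sizes, with a bootstrap critical value for the test statistic.
import random

def max_cusum(xs):
    # max absolute cumulative sum of deviations from the overall mean
    mean = sum(xs) / len(xs)
    s, m = 0.0, 0.0
    for x in xs:
        s += x - mean
        m = max(m, abs(s))
    return m

def bootstrap_critical(xs, alpha=0.05, B=2000, seed=7):
    # distribution of the statistic under resampling (no change point)
    rng = random.Random(seed)
    stats = sorted(
        max_cusum([rng.choice(xs) for _ in xs]) for _ in range(B)
    )
    return stats[int((1 - alpha) * B)]

effects = [0.31, 0.28, 0.35, 0.30, 0.33, 0.29, 0.32, 0.34, 0.27, 0.30]
stat = max_cusum(effects)              # observed CUSUM statistic
crit = bootstrap_critical(effects)     # bootstrap critical value
shift_detected = stat > crit
```

Plotting the running cumulative sums against study order gives the CUSUM-style chart the abstract describes; crossing the bootstrap band flags a temporal change in the effect sizes.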
Addendum to the User Manual for NASGRO Elastic-Plastic Fracture Mechanics Software Module
NASA Technical Reports Server (NTRS)
Gregg, M. Wayne (Technical Monitor); Chell, Graham; Gardner, Brian
2003-01-01
The elastic-plastic fracture mechanics modules in NASGRO have been enhanced by the addition of the following: new J-integral solutions based on the reference stress method and finite element solutions; the extension of the critical crack and critical load modules to cracks with two degrees of freedom that tear and fail by ductile instability; the addition of a proof test analysis module that includes safe life analysis, calculates proof loads, and determines the flaw screening capability for a given proof load; the addition of a tear-fatigue module for ductile materials that simultaneously tear and extend by fatigue; and a multiple cycle proof test module for estimating service reliability following a proof test.
NASA Astrophysics Data System (ADS)
Kossakowski, Paweł Grzegorz; Wciślik, Wiktor
2017-10-01
The paper is concerned with the nucleation, growth and coalescence of microdefects in the form of voids in S235JR steel. The material is known to be one of the basic steel grades commonly used in the construction industry. The theory and methods of damage mechanics were applied to determine and describe the failure mechanisms that occur when the material undergoes deformation. Until now, engineers have generally employed the Gurson-Tvergaard-Needleman model. This material model based on damage mechanics is well suited to define and analyze failure processes taking place in the microstructure of S235JR steel. It is particularly important to determine the critical void volume fraction fc, which is one of the basic parameters of the Gurson-Tvergaard-Needleman material model. As the critical void volume fraction fc refers to the failure stage, it is determined from the data collected for the void coalescence phase. A case of multi-axial stresses is considered, taking into account the effects of the spatial stress state. In this study, the parameter of stress triaxiality η was used to describe the failure phenomena. Cylindrical tensile specimens with a circumferential notch were analysed to obtain low values of initial stress triaxiality (η = 0.556) in order to determine the critical void volume fraction fc. It is worth emphasizing how the applied method differs from the more common approaches based on parameter calibration, i.e. curve-fitting methods. The critical void volume fraction fc at void coalescence was established through digital image analysis of surfaces of S235JR steel, which involved studying real, physical results obtained directly from the material tested.
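The stress triaxiality parameter η used above has a simple closed form in terms of the principal stresses: the mean (hydrostatic) stress divided by the von Mises equivalent stress. A minimal sketch with illustrative values (not the S235JR test data):

```python
# Hedged sketch: stress triaxiality eta = sigma_m / sigma_eq
# from principal stresses; numbers are illustrative only.
import math

def triaxiality(s1, s2, s3):
    sm = (s1 + s2 + s3) / 3.0                                  # hydrostatic stress
    seq = math.sqrt(0.5 * ((s1 - s2)**2 + (s2 - s3)**2 + (s3 - s1)**2))
    return sm / seq                                            # von Mises equivalent

eta_uniaxial = triaxiality(300.0, 0.0, 0.0)   # uniaxial tension gives 1/3
```

Smooth tensile bars sit at η = 1/3; a circumferential notch adds transverse constraint and raises η toward values such as the 0.556 reported above.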
NASA Astrophysics Data System (ADS)
Mercaldo, M. T.; Rabuffo, I.; De Cesare, L.; Caramico D'Auria, A.
2016-04-01
In this work we study the quantum phase transition, the phase diagram and the quantum criticality induced by the easy-plane single-ion anisotropy in a d-dimensional quantum spin-1 XY model in the absence of an external longitudinal magnetic field. We employ the two-time Green function method, avoiding the Anderson-Callen decoupling of spin operators at the same sites, which is of doubtful accuracy. Following the original Devlin procedure, we treat exactly the higher order single-site anisotropy Green functions and use Tyablikov-like decouplings for the exchange higher order ones. The related self-consistent equations appear suitable for an analysis of the thermodynamic properties at and around second order phase transition points. Remarkably, the equivalence between the microscopic spin model and the continuous O(2)-vector model with transverse-Ising model (TIM)-like dynamics, characterized by a dynamic critical exponent z=1, emerges at low temperatures close to the quantum critical point with the single-ion anisotropy parameter D as the non-thermal control parameter. The zero-temperature critical anisotropy parameter Dc is obtained for dimensionalities d > 1 as a function of the microscopic exchange coupling parameter, and the related numerical data for different lattices are found to be in reasonable agreement with those obtained by means of alternative analytical and numerical methods. For d > 2, and in particular for d=3, we determine the finite-temperature critical line ending in the quantum critical point and the related TIM-like shift exponent, consistently with recent renormalization group predictions. The main crossover lines between different asymptotic regimes around the quantum critical point are also estimated, providing a global phase diagram and a quantum criticality very similar to the conventional ones.
M-OSCE as a method to measure dental hygiene students' critical thinking: a pilot study.
McComas, Martha J; Wright, Rebecca A; Mann, Nancy K; Cooper, Mary D; Jacks, Mary E
2013-04-01
Educators in all academic disciplines have been encouraged to utilize assessment strategies to evaluate students' critical thinking. The purpose of this study was to assess the viability of the modified objective structured clinical examination (m-OSCE) to evaluate critical thinking in dental hygiene education. This evaluation utilized a convenience sample of senior dental hygiene students. Students participated in the m-OSCE in which portions of a patient case were revealed at four stations. The exam consisted of multiple-choice questions intended to measure students' ability to utilize critical thinking skills. Additionally, there was one fill-in-the-blank question and a treatment plan that was completed at the fifth station. The results of this study revealed that the m-OSCE did not reliably measure dental hygiene students' critical thinking. Statistical analysis found no satisfactory reliability within the multiple-choice questions and moderately reliable results within the treatment planning portion of the examination. In addition, the item analysis found gaps in students' abilities to transfer clinical evidence/data to basic biomedical knowledge as demonstrated through the multiple-choice questioning results. This outcome warrants further investigation of the utility of the m-OSCE, with a focus on modifications to the evaluation questions, grading rubric, and patient case.
Yang, Samuel H; Wang, Jenny; Zhang, Kelly
2017-04-07
Despite the advantages of 2D-LC, there has been little to no work demonstrating the suitability of 2D-LC methods for use in a quality control (QC) environment for good manufacturing practice (GMP) tests. This lack of information becomes more critical as the availability of commercial 2D-LC instrumentation has significantly increased and more testing facilities begin to acquire 2D-LC capabilities. It is increasingly important that the transferability of developed 2D-LC methods be assessed in terms of reproducibility, robustness and performance across different laboratories worldwide. The work presented here focuses on the evaluation of a heart-cutting 2D-LC method used for the analysis of a pharmaceutical material, where a key co-eluting impurity in the first dimension (1D) is resolved from the main peak and analyzed in the second dimension (2D). A design-of-experiments (DOE) approach was taken in the collection of the data, and the results were then modeled in order to evaluate method robustness using statistical modeling software. This quality by design (QbD) approach gives a deeper understanding of the impact of these 2D-LC critical method attributes (CMAs) and how they affect overall method performance. Although multiple parameters may be critical from a method development point of view, a special focus of this work is the evaluation of unique 2D-LC critical method attributes from a method validation perspective that transcend conventional method development and validation. The 2D-LC method attributes are evaluated for their recovery, peak shape, and resolution of the two co-eluting compounds in question on the 2D. Method linearity, accuracy, precision, repeatability, and sensitivity are assessed, along with day-to-day, analyst-to-analyst, and lab-to-lab (instrument-to-instrument) assessments.
The results of this validation study demonstrate that the 2D-LC method is accurate, sensitive, and robust and is ultimately suitable for QC testing with good method transferability. Copyright © 2017 Elsevier B.V. All rights reserved.
Nonlinear analysis of a closed-loop tractor-semitrailer vehicle system with time delay
NASA Astrophysics Data System (ADS)
Liu, Zhaoheng; Hu, Kun; Chung, Kwok-wai
2016-08-01
In this paper, a nonlinear analysis is performed on a closed-loop system of articulated heavy vehicles with driver steering control. The nonlinearity arises from the nonlinear cubic tire force model. An integration method is employed to derive an analytical periodic solution of the system in the neighbourhood of the critical speed. The results show that excellent accuracy can be achieved for the calculation of periodic solutions arising from Hopf bifurcation of the vehicle motion. A criterion is obtained for detecting the Bautin bifurcation which separates branches of supercritical and subcritical Hopf bifurcations. The integration method is compared to the incremental harmonic balance method in both supercritical and subcritical scenarios.
Li, Zhi; Chen, Weidong; Lian, Feiyu; Ge, Hongyi; Guan, Aihong
2017-12-01
Quantitative analysis of component mixtures is an important application of terahertz time-domain spectroscopy (THz-TDS) and has attracted broad interest in recent research. Although the accuracy of quantitative analysis using THz-TDS is affected by a host of factors, wavelength selection from the sample's THz absorption spectrum is the most crucial component. The raw spectrum consists of signals from the sample and scattering and other random disturbances that can critically influence the quantitative accuracy. For precise quantitative analysis using THz-TDS, the signal from the sample needs to be retained while the scattering and other noise sources are eliminated. In this paper, a novel wavelength selection method based on differential evolution (DE) is investigated. By performing quantitative experiments on a series of binary amino acid mixtures using THz-TDS, we demonstrate the efficacy of the DE-based wavelength selection method, which yields an error rate below 5%.
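A bare-bones differential evolution loop of the kind underlying such a wavelength selection scheme can be sketched as follows. Here it minimizes a toy objective over continuous weights rather than a spectral calibration error, and the population size, mutation factor F, and crossover rate CR are illustrative defaults, not the paper's settings.

```python
# Hedged sketch: minimal DE/rand/1/bin differential evolution optimizer.
import random

def de_minimize(f, dim, bounds, pop=20, gens=120, F=0.6, CR=0.9, seed=3):
    rng = random.Random(seed)
    lo, hi = bounds
    P = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    fit = [f(x) for x in P]
    for _ in range(gens):
        for i in range(pop):
            # three distinct donors, none equal to the target
            a, b, c = rng.sample([j for j in range(pop) if j != i], 3)
            j_rand = rng.randrange(dim)   # forces at least one mutant gene
            trial = [
                min(hi, max(lo, P[a][k] + F * (P[b][k] - P[c][k])))
                if (rng.random() < CR or k == j_rand) else P[i][k]
                for k in range(dim)
            ]
            ft = f(trial)
            if ft <= fit[i]:              # greedy selection
                P[i], fit[i] = trial, ft
    best = min(range(pop), key=fit.__getitem__)
    return P[best], fit[best]

sphere = lambda x: sum(v * v for v in x)   # toy objective
x_best, f_best = de_minimize(sphere, dim=5, bounds=(-5.0, 5.0))
```

For wavelength selection, the decision vector would instead encode which spectral points (or weights on them) enter the quantitative model, and the objective would be the resulting prediction error.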
Eastwood, John Graeme; Jalaludin, Bin Badrudin; Kemp, Lynn Ann; Phung, Hai Ngoc
2014-01-01
We have previously reported in this journal on an ecological study of perinatal depressive symptoms in South Western Sydney. In that article, we briefly reported on a factor analysis that was utilized to identify empirical indicators for analysis. In this article, we report on the mixed method approach that was used to identify those latent variables. Social epidemiology has been slow to embrace a latent variable approach to the study of social, political, economic, and cultural structures and mechanisms, partly for philosophical reasons. Critical realist ontology and epistemology have been advocated as an appropriate methodological approach to both theory building and theory testing in the health sciences. We describe here an emergent mixed method approach that uses qualitative methods to identify latent constructs followed by factor analysis using empirical indicators chosen to measure identified qualitative codes. Comparative analysis of the findings is reported together with a limited description of realist approaches to abstract reasoning.
Signal, Matthew; Thomas, Felicity; Shaw, Geoffrey M.; Chase, J. Geoffrey
2013-01-01
Background Critically ill patients often experience high levels of insulin resistance and stress-induced hyperglycemia, which may negatively impact outcomes. However, evidence surrounding the causes of negative outcomes remains inconclusive. Continuous glucose monitoring (CGM) devices allow researchers to investigate glucose complexity, using detrended fluctuation analysis (DFA), to determine whether it is associated with negative outcomes. The aim of this study was to investigate the effects of CGM device type/calibration and CGM sensor location on results from DFA. Methods This study uses CGM data from critically ill patients who were each monitored concurrently using Medtronic iPro2s on the thigh and abdomen and a Medtronic Guardian REAL-Time on the abdomen. This allowed interdevice/calibration type and intersensor site variation to be assessed. Detrended fluctuation analysis is a technique that has previously been used to determine the complexity of CGM data in critically ill patients. Two variants of DFA, monofractal and multifractal, were used to assess the complexity of sensor glucose data as well as the precalibration raw sensor current. Monofractal DFA produces a scaling exponent (H), where H is inversely related to complexity. The results of multifractal DFA are presented graphically by the multifractal spectrum. Results From the 10 patients recruited, 26 CGM devices produced data suitable for analysis. The values of H from abdominal iPro2 data were 0.10 (0.03–0.20) higher than those from Guardian REAL-Time data, indicating consistently lower complexities in iPro2 data. However, repeating the analysis on the raw sensor current showed little or no difference in complexity. Sensor site had little effect on the scaling exponents in this data set. Finally, multifractal DFA revealed no significant associations between the multifractal spectrums and CGM device type/calibration or sensor location. 
Conclusions Monofractal DFA results are dependent on the device/calibration used to obtain CGM data, but sensor location has little impact. Future studies of glucose complexity should consider the findings presented here when designing their investigations. PMID:24351175
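Monofractal DFA, the technique yielding the scaling exponent H discussed above, can be sketched in a few lines: integrate the mean-centered signal, detrend it linearly within windows of varying size, and read H off the log-log slope of the residual fluctuation versus window size. The sketch below runs on synthetic white noise (where H should land near 0.5), not on CGM data, and uses non-overlapping windows for simplicity.

```python
# Hedged sketch: monofractal DFA-1 producing a scaling exponent H.
import math, random

def dfa(signal, scales):
    # integrated, mean-centered profile
    mean = sum(signal) / len(signal)
    prof, s = [], 0.0
    for x in signal:
        s += x - mean
        prof.append(s)
    pts = []
    for n in scales:
        rss, count = 0.0, 0
        for start in range(0, len(prof) - n + 1, n):
            seg = prof[start:start + n]
            # linear detrend via least squares on t = 0..n-1
            t_mean = (n - 1) / 2.0
            y_mean = sum(seg) / n
            num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(seg))
            den = sum((t - t_mean) ** 2 for t in range(n))
            a = num / den
            b = y_mean - a * t_mean
            rss += sum((y - (a * t + b)) ** 2 for t, y in enumerate(seg))
            count += n
        pts.append((math.log(n), math.log(math.sqrt(rss / count))))
    # H = slope of log F(n) versus log n
    m = len(pts)
    sx = sum(p[0] for p in pts); sy = sum(p[1] for p in pts)
    sxx = sum(p[0] ** 2 for p in pts); sxy = sum(p[0] * p[1] for p in pts)
    return (m * sxy - sx * sy) / (m * sxx - sx * sx)

rng = random.Random(0)
noise = [rng.gauss(0.0, 1.0) for _ in range(4096)]
H = dfa(noise, scales=[8, 16, 32, 64, 128])
```

Smoother, more strongly correlated signals push H above 0.5, which is why a calibration step that low-pass filters the sensor current (as the study observes for the iPro2) can lower the apparent complexity of the glucose trace.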
A nationwide analysis of 30-day readmissions related to critical limb ischemia.
Masoomi, Reza; Shah, Zubair; Quint, Clay; Hance, Kirk; Vamanan, Karthik; Prasad, Anand; Hoel, Andrew; Dawn, Buddhadeb; Gupta, Kamal
2018-06-01
Objectives There is a paucity of information regarding critical limb ischemia-related readmission rates in patients admitted with critical limb ischemia. We studied the 30-day critical limb ischemia-related readmission rate, its predictors, and clinical outcomes using a nationwide real-world dataset. Methods We did a secondary analysis of the 2013 Nationwide Readmissions Database. We included all patients with a primary diagnosis of extremity rest pain, ulceration, and gangrene secondary to peripheral arterial disease. From this group, all patients readmitted with a similar diagnosis within 30 days were recorded. Results Of the total 25,111 index hospitalizations for critical limb ischemia, 1270 (5%) were readmitted with a primary diagnosis of critical limb ischemia within 30 days. The readmission rate was highest (9.5%) for the group that did not have any intervention (revascularization or major amputation) and was lowest for the surgical revascularization and major amputation groups (2.6% and 1.3%, P value <0.001 for all groups). Severity of critical limb ischemia at index admission was associated with a significantly higher rate of 30-day readmission. Critical limb ischemia-related readmission was associated with a higher rate of major amputation (29.6% vs. 16.2%, P<0.001), a lower rate of any revascularization procedure (46% vs. 62.6%, P<0.001), and a higher likelihood of discharge to a skilled nursing facility (43.2% vs. 32.2%, P<0.001) compared to index hospitalization. Conclusions In patients with a primary diagnosis of critical limb ischemia, the 30-day critical limb ischemia-related readmission rate was affected by initial management strategy and the severity of critical limb ischemia. Readmission was associated with a significantly higher rate of amputation, increased length of stay, and more frequent discharge to an alternate care facility than index admission and thus may serve as a useful quality of care metric in critical limb ischemia patients.
Data Analysis for the Behavioral Sciences Using SPSS
NASA Astrophysics Data System (ADS)
Lawner Weinberg, Sharon; Knapp Abramowitz, Sarah
2002-04-01
This book is written from the perspective that statistics is an integrated set of tools used together to uncover the story contained in numerical data. Accordingly, the book comes with a disk containing a series of real data sets to motivate discussions of appropriate methods of analysis. The presentation is based on a conceptual approach supported by an understanding of underlying mathematical foundations. Students learn that more than one method of analysis is typically needed and that an ample characterization of results is a critical component of any data analytic plan. The use of real data and SPSS to perform computations and create graphical summaries enables a greater emphasis on conceptual understanding and interpretation.
Analyses of exobiological and potential resource materials in the Martian soil.
Mancinelli, R L; Marshall, J R; White, M R
1992-01-01
Potential Martian soil components relevant to exobiology include water, organic matter, evaporites, clays, and oxides. These materials are also resources for human expeditions to Mars. When found in particular combinations, some of these materials constitute diagnostic paleobiomarker suites, allowing insight to be gained into the probability of life originating on Mars. Critically important to exobiology is the method of data analysis and data interpretation. To that end we are investigating methods of analysis of potential biomarker and paleobiomarker compounds and resource materials in soils and rocks pertinent to Martian geology. Differential thermal analysis coupled with gas chromatography is shown to be a highly useful analytical technique for detecting this wide and complex variety of materials.
Analyses of exobiological and potential resource materials in the Martian soil
NASA Technical Reports Server (NTRS)
Mancinelli, Rocco L.; Marshall, John R.; White, Melisa R.
1992-01-01
Potential Martian soil components relevant to exobiology include water, organic matter, evaporites, clays, and oxides. These materials are also resources for human expeditions to Mars. When found in particular combinations, some of these materials constitute diagnostic paleobiomarker suites, allowing insight to be gained into the probability of life originating on Mars. Critically important to exobiology is the method of data analysis and data interpretation. To that end, methods of analysis of potential biomarker and paleobiomarker compounds and resource materials in soils and rocks pertinent to Martian geology are investigated. Differential thermal analysis coupled with gas chromatography is shown to be a highly useful analytical technique for detecting this wide and complex variety of materials.
ERIC Educational Resources Information Center
Becker, Nicole M.; Rupp, Charlie A.; Brandriet, Alexandra
2017-01-01
Models related to the topic of chemical kinetics are critical for predicting and explaining chemical reactivity. Here we present a qualitative study of 15 general chemistry students' reasoning about a method of initial rates task. We asked students to discuss their understanding of the terms rate law and initial rate, and then analyze rate and…
Fourth NASA Langley Formal Methods Workshop
NASA Technical Reports Server (NTRS)
Holloway, C. Michael (Compiler); Hayhurst, Kelly J. (Compiler)
1997-01-01
This publication consists of papers presented at NASA Langley Research Center's fourth workshop on the application of formal methods to the design and verification of life-critical systems. Topics considered include: proving properties of accidents; modeling and validating SAFER in VDM-SL; requirements analysis of real-time control systems using PVS; a tabular language for system design; and automated deductive verification of parallel systems. Also included is a fundamental hardware design in PVS.
Analysis of critical thinking ability in direct current electrical problems solving
NASA Astrophysics Data System (ADS)
Hartono; Sunarno, Widha; Sarwanto; Arya Nugraha, Dewanta
2017-11-01
This study analyzes students' critical thinking skills on the subject matter of direct current electricity. Samples were taken using purposive random sampling and consisted of 32 grade XI students of Multimedia 1, SMK Negeri 3 Surakarta, in the 2016/2017 academic year. The study used a descriptive quantitative method, with data collected through tests and interviews on the subject matter of direct current electricity. Based on the results, students have difficulty solving problems for indicator 4; the average rate of students' correct answers is 62.8%.
Robinson, César Leyton; Caballero, Andrés Díaz
2007-01-01
This article is an experimental methodological reflection on the use of medical images as useful documents for constructing the history of medicine. A method is used that is based on guidelines or analysis topics that include different ways of viewing documents, from aesthetic, technical, social and political theories to historical and medical thinking. Some exercises are also included that enhance the proposal for the reader: rediscovering the worlds in society that harbor these medical photographical archives to obtain a new theoretical approach to the construction of the history of medical science.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yuan, Chris, E-mail: cyuan@uwm.edu; Wang, Endong; Zhai, Qiang
Temporal homogeneity of inventory data is one of the major problems in life cycle assessment (LCA). Addressing temporal homogeneity of life cycle inventory data is important in reducing the uncertainties and improving the reliability of LCA results. This paper attempts to present a critical review and discussion of the fundamental issues of temporal homogeneity in conventional LCA and to propose a theoretical framework for temporal discounting in LCA. Theoretical perspectives for temporal discounting in life cycle inventory analysis are discussed first, based on the key elements of a scientific mechanism for temporal discounting. Then generic procedures for performing temporal discounting in LCA are derived and proposed based on the nature of the LCA method and the identified key elements of a scientific temporal discounting method. A five-step framework is proposed and reported in detail based on the technical methods and procedures needed to perform temporal discounting in life cycle inventory analysis. Challenges and possible solutions are also identified and discussed for the technical procedure and scientific accomplishment of each step within the framework. - Highlights: • A critical review of the temporal homogeneity problem of life cycle inventory data • A theoretical framework for performing temporal discounting on inventory data • Methods provided to accomplish each step of the temporal discounting framework.
Educational Evaluation: Analysis and Responsibility.
ERIC Educational Resources Information Center
Apple, Michael W., Ed.; And Others
This book presents controversial aspects of evaluation and aims at broadening perspectives and insights in the evaluation field. Chapter 1 criticizes modes of evaluation and the basic rationality behind them and focuses on assumptions that have problematic consequences. Chapter 2 introduces concepts of evaluation and examines methods of grading…
Energy Programming for Buildings.
ERIC Educational Resources Information Center
Parshall, Steven; Diserens, Steven
1982-01-01
Programing is described as a process leading to an explicit statement of an architectural problem. The programing phase is seen as the most critical period in the delivery process in which energy analysis can have an impact on design. A programing method appropriate for standard architectural practice is provided. (MLW)
Tiwari, Vikram; Kumar, Avinash B
2018-01-01
The current system of summative multi-rater evaluations and standardized tests to determine readiness to graduate from critical care fellowships has limitations. We sought to pilot the use of data envelopment analysis (DEA) to assess which aspects of the fellowship program contribute the most to an individual fellow's success. DEA is a nonparametric operations research technique that uses linear programming to determine the technical efficiency of an entity based on its relative usage of resources in producing the outcome. Retrospective cohort study. Critical care fellows (n = 15) in an Accreditation Council for Graduate Medical Education (ACGME) accredited fellowship at a major academic medical center in the United States. After obtaining institutional review board approval for this retrospective study, we analyzed the data of 15 anesthesiology critical care fellows from academic years 2013-2015. The input-oriented DEA model develops a composite score for each fellow based on multiple inputs and outputs. The inputs included the didactic sessions attended and the ratio of clinical duty work hours to the procedures performed (work intensity index); the outputs were the Multidisciplinary Critical Care Knowledge Assessment Program (MCCKAP) score and summative evaluations of the fellows. A DEA efficiency score that ranged from 0 to 1 was generated for each of the fellows. Five fellows were rated as DEA-efficient, and 10 fellows were characterized as DEA-inefficient. The model was able to forecast the level of effort needed for each inefficient fellow to achieve outputs similar to those of the best-performing peers. The model also identified the work intensity index as the key element that characterized the best performers in our fellowship. DEA is a feasible method of objectively evaluating peer performance in a critical care fellowship beyond summative evaluations alone and can potentially be a powerful tool to guide individual performance during the fellowship.
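The input-oriented, constant-returns-to-scale DEA score described in this abstract reduces to a small linear program per unit. A minimal sketch with `scipy.optimize.linprog`; the three-fellow data set is invented for illustration, not the fellowship data:

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(inputs, outputs, o):
    """Input-oriented CCR efficiency of unit o.

    inputs: (n_units, n_inputs), outputs: (n_units, n_outputs).
    Minimise theta such that some peer combination lambda uses at most
    theta * inputs of unit o while producing at least its outputs.
    """
    n, m = inputs.shape
    _, s = outputs.shape
    c = np.zeros(n + 1)           # variables: [theta, lambda_1..lambda_n]
    c[0] = 1.0
    A_ub, b_ub = [], []
    for i in range(m):            # sum_j lam_j x_ij - theta x_io <= 0
        A_ub.append(np.r_[-inputs[o, i], inputs[:, i]])
        b_ub.append(0.0)
    for r in range(s):            # -sum_j lam_j y_rj <= -y_ro
        A_ub.append(np.r_[0.0, -outputs[:, r]])
        b_ub.append(-outputs[o, r])
    bounds = [(0, None)] * (n + 1)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub), bounds=bounds)
    return res.x[0]

# Three hypothetical fellows: one input (duty hours), one output (exam score)
x = np.array([[2.0], [4.0], [8.0]])
y = np.array([[2.0], [2.0], [2.0]])
scores = [dea_efficiency(x, y, o) for o in range(3)]
```

The first fellow achieves the best output-per-input ratio and scores 1.0 (efficient); the others score below 1, with the score indicating how far their input usage would have to shrink to match the best-performing peer.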
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roach, Dennis Patrick; Jauregui, David Villegas; Daumueller, Andrew Nicholas
2012-02-01
Recent structural failures such as that of the I-35W Mississippi River Bridge in Minnesota have underscored the urgent need for improved methods and procedures for evaluating our aging transportation infrastructure. This research seeks to develop a basis for a Structural Health Monitoring (SHM) system that provides quantitative information related to the structural integrity of metallic structures, so that appropriate management decisions can be made and public safety ensured. This research employs advanced structural analysis and nondestructive testing (NDT) methods for an accurate fatigue analysis. Metal railroad bridges in New Mexico will be the focus, since many of these structures are over 100 years old and classified as fracture-critical. The term fracture-critical indicates that failure of a single component may result in complete collapse of the structure, such as the one experienced by the I-35W Bridge. Failure may originate from sources such as loss of section due to corrosion or cracking caused by fatigue loading. Because standard inspection practice is primarily visual, these types of defects can go undetected due to oversight, lack of access to critical areas, or, in riveted members, hidden defects that are beneath fasteners or connection angles. Another issue is that it is difficult to determine the fatigue damage that a structure has experienced and the rate at which damage is accumulating due to uncertain history and load distribution in supporting members. An SHM system has several advantages that can overcome these limitations. SHM allows critical areas of the structure to be monitored more quantitatively under actual loading. The research needed to apply SHM to metallic structures was performed, and a case study was carried out to show the potential of SHM-driven fatigue evaluation to assess the condition of critical transportation infrastructure and to guide inspectors to potential problem areas.
This project combines the expertise in transportation infrastructure at New Mexico State University with the expertise at Sandia National Laboratories in the emerging field of SHM.
Introduction to Concurrent Engineering: Electronic Circuit Design and Production Applications
1992-09-01
STD-1629. Failure mode distribution data for many different types of parts may be found in RAC publication FMD-91. FMEA utilizes inductive logic in a "bottom-up" approach; this contrasts with a Fault Tree Analysis (FTA), which utilizes deductive logic in a "top-down" approach. In FTA, a system failure is assumed and traced down to the contributing component failures. Fault Tree Analysis (FTA) is a graphical method of risk analysis used to identify critical failure modes within a system or equipment, utilizing a pictorial approach.
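The top-down FTA logic described here can be sketched as a recursive evaluation of AND/OR gates over independent basic events. The tree and probabilities below are invented for illustration, not taken from the report:

```python
# Minimal fault-tree evaluator, assuming independent basic events.
# AND gate: product of child probabilities.
# OR gate:  1 - product of (1 - p) over the children.

def p_event(node):
    kind = node["type"]
    if kind == "basic":
        return node["p"]
    child_ps = [p_event(c) for c in node["children"]]
    if kind == "AND":
        out = 1.0
        for p in child_ps:
            out *= p
        return out
    if kind == "OR":
        out = 1.0
        for p in child_ps:
            out *= (1.0 - p)
        return 1.0 - out
    raise ValueError(kind)

# Hypothetical tree: system fails if power fails OR both pumps fail.
tree = {"type": "OR", "children": [
    {"type": "basic", "p": 0.01},                 # power failure
    {"type": "AND", "children": [
        {"type": "basic", "p": 0.05},             # pump A failure
        {"type": "basic", "p": 0.05},             # pump B failure
    ]},
]}
top = p_event(tree)   # 1 - (1 - 0.01) * (1 - 0.05 * 0.05)
```

Basic events whose probability dominates the top-event result are the "critical failure modes" the abstract refers to; a real analysis would also compute importance measures and minimal cut sets.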
NASA Technical Reports Server (NTRS)
1992-01-01
The technical effort and computer code developed during the first year are summarized. Several formulations for Probabilistic Finite Element Analysis (PFEA) are described, with emphasis on the selected formulation. The strategies being implemented in the first-version computer code to perform linear, elastic PFEA are described. The results of a series of select Space Shuttle Main Engine (SSME) component surveys are presented. These results identify the critical components and provide the information necessary for probabilistic structural analysis.
Micromechanical analysis of thermo-inelastic multiphase short-fiber composites
NASA Technical Reports Server (NTRS)
Aboudi, Jacob
1994-01-01
A micromechanical formulation is presented for the prediction of the overall thermo-inelastic behavior of multiphase composites which consist of short fibers. The analysis is an extension of the generalized method of cells that was previously derived for inelastic composites with continuous fibers, and the reliability of which was critically examined in several situations. The resulting three dimensional formulation is extremely general, wherein the analysis of thermo-inelastic composites with continuous fibers as well as particulate and porous inelastic materials are merely special cases.
Analysis of documentary support for environmental restoration programs in Russia
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nechaev, A.F.; Projaev, V.V.
1995-12-31
Taking into account the importance of adequate regulations for ensuring the radiological safety of the biosphere and for the successful implementation of environmental restoration projects, the contents of legislative and methodological documents, as well as their comprehensiveness and substantiation, are subjected to critical analysis. It is shown that there is much scope for further optimization of, and improvements in, the regulatory basis at both the Federal and regional levels.
Wendy A. Kuntz; Peter B. Stacey
1997-01-01
Individual identification, especially in rare species, can provide managers with critical information about demographic processes. Traditionally, banding has been the only effective method of marking individuals. However, banding's drawbacks have led some researchers to suggest vocal analysis as an alternative. We explore this prospect for Mexican Spotted Owls (...
Analysis of small crack behavior for airframe applications
NASA Technical Reports Server (NTRS)
Mcclung, R. C.; Chan, K. S.; Hudak, S. J., Jr.; Davidson, D. L.
1994-01-01
The small fatigue crack problem is critically reviewed from the perspective of airframe applications. Different types of small cracks-microstructural, mechanical, and chemical-are carefully defined and relevant mechanisms identified. Appropriate analysis techniques, including both rigorous scientific and practical engineering treatments, are briefly described. Important materials data issues are addressed, including increased scatter in small crack data and recommended small crack test methods. Key problems requiring further study are highlighted.
Accurate airway centerline extraction based on topological thinning using graph-theoretic analysis.
Bian, Zijian; Tan, Wenjun; Yang, Jinzhu; Liu, Jiren; Zhao, Dazhe
2014-01-01
The quantitative analysis of the airway tree is of critical importance in the CT-based diagnosis and treatment of common pulmonary diseases. The extraction of the airway centerline is a precursor to identifying the airway's hierarchical structure, measuring geometrical parameters, and guiding visualized detection. Traditional methods suffer from extra branches and circles due to incomplete segmentation results, which lead to false analyses in applications. This paper proposes an automatic and robust centerline extraction method for the airway tree. First, the centerline is located based on the topological thinning method; border voxels are deleted symmetrically and iteratively to preserve topological and geometrical properties. Second, the structural information is generated using graph-theoretic analysis. Then inaccurate circles are removed with a distance weighting strategy, and extra branches are pruned according to clinical anatomic knowledge. The centerline region without false appendices is eventually determined after the described phases. Experimental results show that the proposed method identifies more than 96% of branches, keeps consistency across different cases, and achieves a superior circle-free structure and centrality.
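The branch-pruning step described above can be sketched on a voxel skeleton. This is a simplified stand-in for the paper's method: the thinned centerline is assumed to be given as a set of voxel coordinates with 6-connectivity, and the anatomic rules and distance weighting are replaced by a plain length threshold:

```python
def neighbors(v):
    """6-connected neighbours of a voxel."""
    x, y, z = v
    return [(x + 1, y, z), (x - 1, y, z), (x, y + 1, z),
            (x, y - 1, z), (x, y, z + 1), (x, y, z - 1)]

def degree(v, vox):
    return sum(n in vox for n in neighbors(v))

def prune_short_branches(voxels, min_len):
    """Iteratively delete terminal branches shorter than min_len voxels."""
    vox = set(voxels)
    changed = True
    while changed:
        changed = False
        for tip in [v for v in vox if degree(v, vox) == 1]:
            branch, prev, cur = [tip], None, tip
            while True:
                nxt = [n for n in neighbors(cur) if n in vox and n != prev]
                if len(nxt) != 1:
                    break          # dead end reached
                prev, cur = cur, nxt[0]
                if degree(cur, vox) > 2:
                    break          # stop at the junction voxel
                branch.append(cur)
            if len(branch) < min_len:
                vox -= set(branch)
                changed = True
                break              # degrees changed; restart the scan
    return vox

# Main line of 10 voxels with a 2-voxel spur attached at (4, 0, 0)
main = [(i, 0, 0) for i in range(10)]
spur = [(4, 1, 0), (4, 2, 0)]
kept = prune_short_branches(main + spur, min_len=3)
```

The spur is removed while the main line survives intact; a real airway pipeline would drive the threshold and the stopping rules from anatomic knowledge rather than a fixed voxel count.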
NASA Astrophysics Data System (ADS)
Smith, Clint; Edwards, Jarrod; Fisher, Andmorgan
2010-04-01
Rapid detection of biological material is critical for determining the presence or absence of bacterial endospores within various investigative programs. Even more critical is that if material tests positive for Bacillus endospores, the tests should provide data at the species level. Optical detection systems for microbial endospore formers such as Bacillus sp. can be heavy and cumbersome, and may identify only at the genus level. Data provided from this study will aid in the characterization needed by future detection systems for further rapid breakdown analysis, to gain insight into a more positive signature collection of Bacillus sp. The literature has shown that the fluorescence spectra of endospores could be statistically separated from those of other vegetative genera, but could not be separated among one another. Results of this study showed that endospore species separation is possible using laser-induced fluorescence with lifetime decay analysis for Bacillus endospores. Lifetime decays of B. subtilis, B. megaterium, B. coagulans, and B. anthracis Sterne strain were investigated. Using the multi-exponential fit method, the data showed three distinct lifetimes for each species within the following ranges: 0.2-1.3 ns, 2.5-7.0 ns, and 7.5-15.0 ns, when laser-induced at 307 nm. The four endospore species were individually separated using principal component analysis (95% CI).
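The multi-exponential fit mentioned above models the decay as a sum of exponentials with distinct lifetimes. A hedged sketch on synthetic (not measured) data with `scipy.optimize.curve_fit`; the amplitudes, lifetimes, and time base are invented, chosen only to fall in the ranges the abstract reports:

```python
import numpy as np
from scipy.optimize import curve_fit

def tri_exp(t, a1, tau1, a2, tau2, a3, tau3):
    """I(t) = sum_i a_i * exp(-t / tau_i), three lifetime components."""
    return (a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)
            + a3 * np.exp(-t / tau3))

t = np.linspace(0.0, 50.0, 2000)            # time axis in ns
true = (1.0, 1.0, 0.6, 5.0, 0.3, 12.0)      # amplitude, lifetime pairs
signal = tri_exp(t, *true)                  # noiseless synthetic decay

# Initial guesses inside the reported ranges (0.2-1.3, 2.5-7.0, 7.5-15 ns)
p0 = (0.8, 0.8, 0.5, 6.0, 0.2, 10.0)
popt, _ = curve_fit(tri_exp, t, signal, p0=p0, maxfev=20000)
lifetimes = sorted(popt[1::2])              # recovered tau_1 < tau_2 < tau_3
```

With measured, noisy decays the fit is considerably harder (closely spaced lifetimes are poorly conditioned), which is why good initial guesses and range constraints matter in practice.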
Baldelli, Sara; Marrubini, Giorgio; Cattaneo, Dario; Clementi, Emilio; Cerea, Matteo
2017-10-01
The application of Quality by Design (QbD) principles in clinical laboratories can help to develop an analytical method through a systematic approach, providing a significant advance over the traditional heuristic and empirical methodology. In this work, we applied for the first time the QbD concept in the development of a method for drug quantification in human plasma using elvitegravir as the test molecule. The goal of the study was to develop a fast and inexpensive quantification method, with precision and accuracy as requested by the European Medicines Agency guidelines on bioanalytical method validation. The method was divided into operative units, and for each unit critical variables affecting the results were identified. A risk analysis was performed to select critical process parameters that should be introduced in the design of experiments (DoEs). Different DoEs were used depending on the phase of advancement of the study. Protein precipitation and high-performance liquid chromatography-tandem mass spectrometry were selected as the techniques to be investigated. For every operative unit (sample preparation, chromatographic conditions, and detector settings), a model based on factors affecting the responses was developed and optimized. The obtained method was validated and clinically applied with success. To the best of our knowledge, this is the first investigation thoroughly addressing the application of QbD to the analysis of a drug in a biological matrix applied in a clinical laboratory. The extensive optimization process generated a robust method compliant with its intended use. The performance of the method is continuously monitored using control charts.
Evaluating Gene Set Enrichment Analysis Via a Hybrid Data Model
Hua, Jianping; Bittner, Michael L.; Dougherty, Edward R.
2014-01-01
Gene set enrichment analysis (GSA) methods have been widely adopted by biological labs to analyze data and generate hypotheses for validation. Most of the existing comparison studies focus on whether the existing GSA methods can produce accurate P-values; however, practitioners are often more concerned with the correct gene-set ranking generated by the methods. The ranking performance is closely related to two critical goals associated with GSA methods: the ability to reveal biological themes and ensuring reproducibility, especially for small-sample studies. We have conducted a comprehensive simulation study focusing on the ranking performance of seven representative GSA methods. We overcome the limitation on the availability of real data sets by creating hybrid data models from existing large data sets. To build the data model, we pick a master gene from the data set to form the ground truth and artificially generate the phenotype labels. Multiple hybrid data models can be constructed from one data set and multiple data sets of smaller sizes can be generated by resampling the original data set. This approach enables us to generate a large batch of data sets to check the ranking performance of GSA methods. Our simulation study reveals that for the proposed data model, the Q2 type GSA methods have in general better performance than other GSA methods and the global test has the most robust results. The properties of a data set play a critical role in the performance. For the data sets with highly connected genes, all GSA methods suffer significantly in performance. PMID:24558298
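The hybrid data model construction described in this abstract (master gene as ground truth, artificial phenotype labels, resampling of a large real data set) can be sketched as follows. The function name, the noise level, and the median-split labeling rule are illustrative assumptions, not the paper's exact recipe:

```python
import numpy as np

rng = np.random.default_rng(0)

def hybrid_model(expr, master_idx, n_samples, noise=0.1):
    """Build one resampled data set with synthetic phenotype labels.

    expr: (n_genes, n_arrays) expression matrix from a large real study.
    The master gene's (noisy) expression defines the ground-truth labels,
    so gene sets containing the master gene should rank as enriched.
    """
    cols = rng.choice(expr.shape[1], size=n_samples, replace=False)
    sample = expr[:, cols]                  # smaller resampled data set
    master = sample[master_idx] + noise * rng.standard_normal(n_samples)
    labels = (master > np.median(master)).astype(int)   # two phenotypes
    return sample, labels

expr = rng.standard_normal((100, 200))      # stand-in for a real data set
data, labels = hybrid_model(expr, master_idx=7, n_samples=30)
```

Repeating the resampling yields the large batch of small-sample data sets the study uses to compare gene-set rankings across GSA methods.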
Design and landing dynamic analysis of reusable landing leg for a near-space manned capsule
NASA Astrophysics Data System (ADS)
Yue, Shuai; Nie, Hong; Zhang, Ming; Wei, Xiaohui; Gan, Shengyong
2018-06-01
To improve the landing performance of a near-space manned capsule under various landing conditions, a novel landing system is designed that employs double chamber and single chamber dampers in the primary and auxiliary struts, respectively. A dynamic model of the landing system is established, and the damper parameters are determined by employing the design method. A single-leg drop test with different initial pitch angles is then conducted to compare and validate the simulation model. Based on the validated simulation model, seven critical landing conditions regarding nine crucial landing responses are found by combining the radial basis function (RBF) surrogate model and the adaptive simulated annealing (ASA) optimization method. Subsequently, the adaptability of the landing system under critical landing conditions is analyzed. The simulation results effectively match the test results, which validates the accuracy of the dynamic model. In addition, all of the crucial responses under their corresponding critical landing conditions satisfy the design specifications, demonstrating the feasibility of the landing system.
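The surrogate-plus-annealing search described above can be sketched generically: sample the landing-condition space, fit an RBF surrogate to the simulated response, then anneal on the cheap surrogate to locate the worst case. The response function below is an invented stand-in, not the capsule model, and `dual_annealing` is used in place of the paper's ASA implementation:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import dual_annealing

rng = np.random.default_rng(1)

def landing_response(c):
    """Stand-in for one crucial landing response (e.g. peak strut load)
    as a function of two coded landing-condition parameters."""
    x, y = c[..., 0], c[..., 1]
    return np.exp(-((x - 0.3) ** 2 + (y + 0.2) ** 2))

# 1) Sample the landing-condition space and build the RBF surrogate
pts = rng.uniform(-1.0, 1.0, size=(300, 2))
surrogate = RBFInterpolator(pts, landing_response(pts))

# 2) Anneal on the surrogate to find the critical (worst-case) condition
res = dual_annealing(lambda c: -surrogate(c[None, :])[0],
                     bounds=[(-1.0, 1.0), (-1.0, 1.0)], seed=2)
critical_condition = res.x   # should lie near the true peak at (0.3, -0.2)
```

The benefit is that each annealing step queries the surrogate rather than the full multibody drop simulation, which is what makes the seven-condition search over nine responses affordable.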
Aspects of the "Design Space" in high pressure liquid chromatography method development.
Molnár, I; Rieger, H-J; Monks, K E
2010-05-07
The present paper describes a multifactorial optimization of 4 critical HPLC method parameters, i.e. gradient time (t(G)), temperature (T), pH and ternary composition (B(1):B(2)), based on 36 experiments. The effect of these experimental variables on critical resolution and selectivity was investigated in such a way as to systematically vary all four factors simultaneously. The basic element is a gradient time-temperature (t(G)-T) plane, which is repeated at three different pH values of eluent A and at three different ternary compositions of eluent B between methanol and acetonitrile. The so-defined volume enables the investigation of the critical resolution for a part of the Design Space of a given sample. Further improvement of the analysis time, with conservation of the previously optimized selectivity, was possible by reducing the gradient time and increasing the flow rate. Multidimensional robust regions were successfully defined and graphically depicted. Copyright (c) 2010 Elsevier B.V. All rights reserved.
Kaddoura, Mahmoud A
2010-09-01
It is essential for nurses to develop critical thinking skills to ensure their ability to provide safe and effective care to patients with complex and variable needs in ever-changing clinical environments. To date, very few studies have been conducted to examine how nursing orientation programs develop the critical thinking skills of novice critical care nurses. Strikingly, no research studies could be found about the American Association of Critical Care Nurses Essentials of Critical Care Orientation (ECCO) program and, specifically, its effect on the development of nurses' critical thinking skills. This study explored the perceptions of new graduate nurses regarding factors that helped to develop their critical thinking skills throughout their 6-month orientation program in the intensive care unit. A convenience non-probability sample of eight new graduates was selected from a hospital that used the ECCO program. Data were collected with demographic questionnaires and semi-structured interviews. An exploratory qualitative research method with content analysis was used to analyze the data. The study findings showed that new graduate nurses perceived that they developed critical thinking skills that improved throughout the orientation period, although there were some challenges in the ECCO program. This study provides data that could influence the development and implementation of future nursing orientation programs. Copyright 2010, SLACK Incorporated.
Automatic yield-line analysis of slabs using discontinuity layout optimization
Gilbert, Matthew; He, Linwei; Smith, Colin C.; Le, Canh V.
2014-01-01
The yield-line method of analysis is a long established and extremely effective means of estimating the maximum load sustainable by a slab or plate. However, although numerous attempts to automate the process of directly identifying the critical pattern of yield-lines have been made over the past few decades, to date none has proved capable of reliably analysing slabs of arbitrary geometry. Here, it is demonstrated that the discontinuity layout optimization (DLO) procedure can successfully be applied to such problems. The procedure involves discretization of the problem using nodes inter-connected by potential yield-line discontinuities, with the critical layout of these then identified using linear programming. The procedure is applied to various benchmark problems, demonstrating that highly accurate solutions can be obtained, and showing that DLO provides a truly systematic means of directly and reliably automatically identifying yield-line patterns. Finally, since the critical yield-line patterns for many problems are found to be quite complex in form, a means of automatically simplifying these is presented. PMID:25104905
Defining resilience: A preliminary integrative literature review
Wilt, Bonnie; Long, Suzanna K.; Shoberg, Thomas G.
2016-01-01
The term “resilience” is ubiquitous in technical literature; it appears in numerous forms, such as resilience, resiliency, or resilient, and each use may have a different definition depending on the interpretation of the writer. This creates difficulties in understanding what is meant by ‘resilience’ in any given use case, especially in discussions of interdisciplinary research. To better understand this problem, this research constructs a preliminary integrative literature review to map different definitions, applications and calculation methods of resilience invoked within critical infrastructure applications. The preliminary review uses a State-of-the-Art Matrix (SAM) analysis to characterize differences in definition across disciplines and between regions. Qualifying the various usages of resilience will produce a greater precision in the literature and a deeper insight into types of data required for its evaluation, particularly with respect to critical infrastructure calculations and how such data may be analyzed. Results from this SAM analysis will create a framework of key concepts as part of the most common applications for “resilient critical infrastructure” modeling.
Camargo Plazas, Maria del Pilar; Cameron, Brenda L
2015-06-01
Many approaches and efforts have been used to better understand chronic diseases worldwide. Yet, little is known about the meaning of living with chronic illness under the pressures of globalization and neoliberal ideologies. Through Freire's participatory educational method, this article presents an innovative approach to understanding the multiple dimensions of living with chronic illness. In this way, we hope to address the impact of globalization on the daily life of chronically ill people and thus expand the body of knowledge in nursing. This qualitative study follows an interpretive inquiry approach and uses a critical hermeneutic phenomenological method and critical research methodologies. Five participants were recruited for this participatory educational activity. Data collection methods included digitally recorded semi-structured individual interviews and a session of Freire's participatory educational method. Data analysis included thematic analysis. Participants reported lacking adequate access to healthcare services because of insurance policies; a general perception that they were an unwanted burden on the healthcare system; and a general lack of government support, advocacy, and political interest. This research activity assisted participants in gaining a new critical perspective on the condition of others with chronic diseases and thus provided an enlightening opportunity to learn about the illnesses and experiences of others and to realize that others experienced the same oppression from the healthcare system. Participants became agents of change within their own families and communities. Chronic diseases impose many economic and social consequences on those affected.
These findings urge us to move from merely acknowledging the difficulties of people who live with chronic illness in an age of globalization to taking the actions necessary to bring about healthcare, social, and political reform through a process of conscientization and mutual transformation.
Zacharis, Constantinos K; Vastardi, Elli
2018-02-20
In the research presented we report the development of a simple and robust liquid chromatographic method for the quantification of two genotoxic alkyl sulphonate impurities (namely methyl p-toluenesulfonate and isopropyl p-toluenesulfonate) in Aprepitant API substances using the Analytical Quality by Design (AQbD) approach. Following the steps of the AQbD protocol, the selected critical method attributes (CMAs) were the separation criteria between the critical peak pairs, the analysis time, and the peak efficiencies of the analytes. The critical method parameters (CMPs) included the flow rate, the gradient slope, and the acetonitrile content at the first step of the gradient elution program. Multivariate experimental designs, namely Plackett-Burman and Box-Behnken designs, were conducted sequentially for factor screening and optimization of the method parameters. The optimal separation conditions were estimated using the desirability function. The method was fully validated in the range of 10-200% of the target concentration limit of the analytes using the "total error" approach. Accuracy profiles - a graphical decision-making tool - were constructed using the results of the validation procedures. The β-expectation tolerance intervals did not exceed the acceptance criteria of ±10%, meaning that 95% of future results will be included in the defined bias limits. The relative bias ranged from -1.3% to 3.8% for both analytes, while the RSD values for repeatability and intermediate precision were less than 1.9% in all cases. The achieved limit of detection (LOD) and limit of quantification (LOQ) were adequate for the specific purpose and found to be 0.02% (corresponding to 48 μg g-1 in the sample) for both methyl and isopropyl p-toluenesulfonate. As proof of concept, the validated method was successfully applied in the analysis of several Aprepitant batches, indicating that this methodology could be used for routine quality control analyses. Copyright © 2017 Elsevier B.V.
All rights reserved.
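The Box-Behnken design used in the optimization step above can be constructed by hand for three factors (here taken, illustratively, as flow rate, gradient slope, and initial acetonitrile content, all in coded ±1 units). A minimal sketch:

```python
import numpy as np
from itertools import combinations

def box_behnken(k, n_center=3):
    """Three-level Box-Behnken design for k factors in coded units.

    Edge points: each pair of factors at +/-1 with the rest at 0,
    plus n_center replicates of the centre point.
    """
    rows = []
    for i, j in combinations(range(k), 2):
        for a in (-1.0, 1.0):
            for b in (-1.0, 1.0):
                r = [0.0] * k
                r[i], r[j] = a, b
                rows.append(r)
    rows.extend([[0.0] * k] * n_center)
    return np.array(rows)

# 3 factors (flow rate, gradient slope, %ACN): 12 edge runs + 3 centres
X = box_behnken(3)
```

The main-effect columns are mutually orthogonal, which is what lets the subsequent response-surface model estimate each factor's effect independently; the centre replicates provide a pure-error estimate.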
NASA Astrophysics Data System (ADS)
Callahan, R. P.; Taylor, N. J.; Pasquet, S.; Dueker, K. G.; Riebe, C. S.; Holbrook, W. S.
2016-12-01
Geophysical imaging is rapidly becoming popular for quantifying subsurface critical zone (CZ) architecture. However, a diverse array of measurements and measurement techniques are available, raising the question of which are appropriate for specific study goals. Here we compare two techniques for measuring S-wave velocities (Vs) in the near surface. The first approach quantifies Vs in three dimensions using a passive source and an iterative residual least-squares tomographic inversion. The second approach uses a more traditional active-source seismic survey to quantify Vs in two dimensions via a Monte Carlo surface-wave dispersion inversion. Our analysis focuses on three 0.01 km2 study plots on weathered granitic bedrock in the Southern Sierra Critical Zone Observatory. Preliminary results indicate that depth-averaged velocities from the two methods agree over the scales of resolution of the techniques. While the passive- and active-source techniques both quantify Vs, each method has distinct advantages and disadvantages during data acquisition and analysis. The passive-source method has the advantage of generating a three dimensional distribution of subsurface Vs structure across a broad area. Because this method relies on the ambient seismic field as a source, which varies unpredictably across space and time, data quality and depth of investigation are outside the control of the user. Meanwhile, traditional active-source surveys can be designed around a desired depth of investigation. However, they only generate a two dimensional image of Vs structure. Whereas traditional active-source surveys can be inverted quickly on a personal computer in the field, passive source surveys require significantly more computations, and are best conducted in a high-performance computing environment. We use data from our study sites to compare these methods across different scales and to explore how these methods can be used to better understand subsurface CZ architecture.
Research on Buckling State of Prestressed Fiber-Strengthened Steel Pipes
NASA Astrophysics Data System (ADS)
Wang, Ruheng; Lan, Kunchang
2018-01-01
The main restorative methods for damaged oil and gas pipelines include welding reinforcement, fixture reinforcement, and fiber-material reinforcement. Owing to the severe corrosion problems of pipes in practical use, research on the renovation and consolidation of damaged pipes has gained extensive attention from experts and scholars at home and abroad. The analysis of the mechanical behavior of reinforced pressure pipelines, together with further studies focusing on critical buckling and the intensity of pressure pipeline failure, is conducted in this paper, providing a theoretical basis for prestressed fiber-strengthened steel pipes. Deformation coordination equations and buckling control equations of steel pipes under prestress are derived using the Rayleigh-Ritz method, an approximation method based on the stationary-value theorem of potential energy and the principle of minimum potential energy. According to the deformation of prestressed steel pipes, the deflection differential equation of prestressed steel pipes is established, and the critical buckling value under prestress is obtained.
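The Rayleigh-Ritz step can be illustrated on the simplest case: for a pinned-pinned member, the critical load is the stationary value of the ratio of bending strain energy to the work of the axial load, P_cr = ∫EI(w'')²dx / ∫(w')²dx. A sketch with a one-term sine trial function (which happens to be the exact mode, so it reproduces the Euler load π²EI/L²); the stiffness and length values are illustrative only:

```python
import numpy as np

def trapezoid(y, x):
    """Composite trapezoidal rule (avoids NumPy-version differences)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def rayleigh_ritz_pcr(EI, L, n_pts=2001):
    """Critical buckling load from the Rayleigh quotient
    P_cr = int EI*(w'')^2 dx / int (w')^2 dx
    using the trial deflection w(x) = sin(pi * x / L)."""
    x = np.linspace(0.0, L, n_pts)
    w1 = (np.pi / L) * np.cos(np.pi * x / L)          # w'
    w2 = -(np.pi / L) ** 2 * np.sin(np.pi * x / L)    # w''
    return trapezoid(EI * w2 ** 2, x) / trapezoid(w1 ** 2, x)

EI, L = 2.0e6, 5.0          # illustrative bending stiffness and length
p_cr = rayleigh_ritz_pcr(EI, L)
euler = np.pi ** 2 * EI / L ** 2
```

For the prestressed pipe problem, the trial function would satisfy the pipe's boundary conditions and the quotient would include the prestress work term, but the stationary-value machinery is the same.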
Silva, Nádia F D; Magalhães, Júlia M C S; Freire, Cristina; Delerue-Matos, Cristina
2018-01-15
According to recent statistics, Salmonella is still an important public health issue across the whole world. Legislated reference methods, based on plate-counting, are sensitive enough but are inadequate as an effective emergency response tool and are far from being rapid devices that are simple to use outside the laboratory. An overview of the commercially available rapid methods for Salmonella detection is provided, along with a critical discussion of their limitations, benefits, and potential use in a real context. The distinctive potential of electrochemical biosensors for the development of rapid devices is highlighted. The state of the art and the newest technological approaches in electrochemical biosensors for Salmonella detection are presented, and a critical analysis of the literature is made in an attempt to identify the current challenges towards a complete solution for Salmonella detection in microbial food control based on electrochemical biosensors. Copyright © 2017 Elsevier B.V. All rights reserved.
Song, Ruizhuo; Lewis, Frank L; Wei, Qinglai
2017-03-01
This paper establishes an off-policy integral reinforcement learning (IRL) method to solve nonlinear continuous-time (CT) nonzero-sum (NZS) games with unknown system dynamics. The IRL algorithm is presented to obtain the iterative control, and off-policy learning is used to allow the dynamics to be completely unknown. Off-policy IRL performs the policy evaluation and policy improvement steps of the policy iteration algorithm. Critic and action networks are used to obtain the performance index and control for each player, and a gradient descent algorithm updates the critic and action weights simultaneously. Convergence analysis of the weights is given, and the asymptotic stability of the closed-loop system and the existence of a Nash equilibrium are proved. A simulation study demonstrates the effectiveness of the developed method for nonlinear CT NZS games with unknown system dynamics.
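The policy evaluation / policy improvement loop that the critic and action networks approximate can be sketched, under strong simplifying assumptions, on a scalar LQR problem where both steps have closed forms. This model-based sketch is not the paper's off-policy IRL algorithm (whose point is to avoid knowing the dynamics a, b); it only shows the underlying policy iteration that IRL implements from data.

```python
def lqr_policy_iteration(a, b, q, r, k0, iters=50):
    """Policy iteration for the scalar LQR problem
        dx/dt = a*x + b*u,  cost = ∫ (q*x^2 + r*u^2) dt,  u = -k*x.
    Policy evaluation solves the scalar Lyapunov equation for the current
    gain (the 'critic'); policy improvement recomputes the gain from the
    critic (the 'actor')."""
    k = k0
    for _ in range(iters):
        ac = a - b * k                       # closed-loop dynamics under u = -k*x
        assert ac < 0, "gain must remain stabilizing"
        p = -(q + r * k * k) / (2 * ac)      # policy evaluation (critic update)
        k = b * p / r                        # policy improvement (actor update)
    return p, k

# illustrative plant; k0 = 2 is a stabilizing initial policy
p, k = lqr_policy_iteration(a=1.0, b=1.0, q=1.0, r=1.0, k0=2.0)
```

At convergence p satisfies the scalar Riccati equation 2ap - b²p²/r + q = 0 (here p = 1 + √2), which is what the critic network's weights converge to in the function-approximation setting.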
CORSSTOL: Cylinder Optimization of Rings, Skin, and Stringers with Tolerance sensitivity
NASA Technical Reports Server (NTRS)
Finckenor, J.; Bevill, M.
1995-01-01
Cylinder Optimization of Rings, Skin, and Stringers with Tolerance (CORSSTOL) sensitivity is a design optimization program incorporating a method to examine the effects of user-provided manufacturing tolerances on weight and failure. CORSSTOL gives designers a tool to determine tolerances based on need, providing a systematic way to choose the best design among several manufacturing methods with differing capabilities and costs. CORSSTOL initially optimizes a stringer-stiffened cylinder for weight without tolerances: the skin and stringer geometry are varied, subject to stress and buckling constraints. The same analysis and optimization routines are then used to minimize the maximum-material-condition weight subject to the least favorable combination of tolerances. The adjusted optimum dimensions are provided with the weight and constraint sensitivities of each design variable, so the designer can immediately identify critical tolerances. The safety of parts made out of tolerance can also be determined. During design and development of weight-critical systems, design/analysis tools that provide product-oriented results are vital. This program and methodology give designers an effective cost- and weight-saving design tool. The tolerance sensitivity method can be applied to any system defined by a set of deterministic equations.
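The "least favorable combination of tolerances" step can be illustrated by exhaustively evaluating the corners of the tolerance box, which is valid when the metric is monotone in each dimension. The dimensions and the weight-like metric below are hypothetical stand-ins, not CORSSTOL's actual skin/stringer model.

```python
from itertools import product

def worst_case(nominal, tols, metric):
    """Evaluate a design metric at every corner of the tolerance box and
    return the least favorable combination (largest metric value)."""
    worst = None
    for signs in product((-1.0, 1.0), repeat=len(nominal)):
        dims = [x + s * t for x, s, t in zip(nominal, tols, signs)]
        m = metric(dims)
        if worst is None or m > worst[0]:
            worst = (m, dims)
    return worst

# hypothetical dimensions (mm): skin thickness, stringer height, stringer width
nominal = [2.0, 25.0, 40.0]
tols    = [0.1, 0.5, 0.5]
# hypothetical weight-like metric: maximum material condition is worst
weight = lambda d: d[0] * d[1] * d[2]
w_max, dims = worst_case(nominal, tols, weight)
```

With 2^n corners this brute-force search is only feasible for a handful of toleranced dimensions; sensitivity information of the kind CORSSTOL reports is what lets a real tool prune the search.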
Critical behavior of quasi-two-dimensional semiconducting ferromagnet Cr2Ge2Te6
Liu, Yu; Petrovic, C.
2017-08-03
The critical properties of the single-crystalline semiconducting ferromagnet Cr2Ge2Te6 were investigated by bulk dc magnetization around the paramagnetic-to-ferromagnetic phase transition. Critical exponents β = 0.200 ± 0.003 with a critical temperature Tc = 62.65 ± 0.07 K and γ = 1.28 ± 0.03 with Tc = 62.75 ± 0.06 K are obtained by the Kouvel-Fisher method, whereas δ = 7.96 ± 0.01 is obtained by critical isotherm analysis at Tc = 62.7 K. These critical exponents obey the Widom scaling relation δ = 1 + γ/β, indicating self-consistency of the obtained values. Furthermore, with these critical exponents the isotherm M(H) curves below and above the critical temperature collapse onto two independent universal branches, obeying the single scaling equation m = f±(h), where m and h are the renormalized magnetization and field, respectively. The determined exponents match well with those calculated from the renormalization-group approach for a two-dimensional Ising system coupled with a long-range interaction between spins decaying as J(r) ≈ r^-(d+σ) with σ = 1.52.
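The Kouvel-Fisher method used here for β and Tc reduces to a straight-line fit: for spontaneous magnetization M_s ∝ (Tc - T)^β, the quantity M_s/(dM_s/dT) = (T - Tc)/β is linear in T, with slope 1/β and x-intercept Tc. A sketch on synthetic data (the exponent and Tc values are taken from the abstract; the data generation and grid are assumptions):

```python
def kouvel_fisher(Ts, Ms):
    """Kouvel-Fisher estimate of beta and Tc from M_s(T) data below Tc:
    fit y = M_s/(dM_s/dT) versus T with a straight line; slope = 1/beta
    and the intercept gives Tc = -c/slope."""
    n = len(Ts)
    t, y = [], []
    for i in range(1, n - 1):
        dM = (Ms[i + 1] - Ms[i - 1]) / (Ts[i + 1] - Ts[i - 1])  # central difference
        t.append(Ts[i])
        y.append(Ms[i] / dM)
    # least-squares straight line y = slope*t + c
    m_t = sum(t) / len(t)
    m_y = sum(y) / len(y)
    slope = (sum((a - m_t) * (b - m_y) for a, b in zip(t, y))
             / sum((a - m_t) ** 2 for a in t))
    c = m_y - slope * m_t
    return 1.0 / slope, -c / slope   # beta, Tc

# synthetic data with beta = 0.2, Tc = 62.7 K (values from the abstract)
beta0, Tc0 = 0.2, 62.7
Ts = [55.0 + 0.1 * i for i in range(60)]        # temperatures below Tc
Ms = [(Tc0 - T) ** beta0 for T in Ts]
beta, Tc = kouvel_fisher(Ts, Ms)
```

On real magnetization data the same fit is iterated together with Arrott-plot analysis, since M_s(T) itself must first be extracted from the M(H) isotherms.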
Riva, F; Bisi, M C; Stagni, R
2013-01-01
Falls represent a heavy economic and clinical burden on society. Identifying the individual chronic characteristics associated with falling is of fundamental importance for clinicians; in particular, the stability of daily motor tasks is one of the main factors that clinicians look for during assessment. Various methods for assessing the stability of human movement exist in the literature, and methods from the stability analysis of nonlinear dynamic systems applied to biomechanics have recently shown promise. One of these techniques is orbital stability analysis via Floquet multipliers. This method measures the orbital stability of periodic nonlinear dynamic systems and is a promising approach for defining a reliable motor stability index that accounts for the dynamics of the whole task cycle. Despite these premises, its use in fall-risk assessment has been deemed controversial. The aim of this systematic review was therefore to provide a critical evaluation of the literature on applications of orbital stability analysis in biomechanics, with particular focus on methodological aspects. Four electronic databases were searched for articles on the topic; 23 articles were selected for review. The quality of the studies was assessed with a customised quality assessment tool, and the overall quality of the literature in the field was found to be high. The most critical aspect was the lack of uniformity in applying the analysis to biomechanical time series, particularly in the choice of state space and the number of cycles to include in the analysis. Copyright © 2012 Elsevier B.V. All rights reserved.
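A minimal illustration of the Floquet-multiplier idea, reduced to a single state variable sampled once per cycle: fit the cycle-to-cycle map of deviations from the periodic orbit, and read stability off the multiplier's magnitude (|λ| < 1 means deviations die out). Real gait analyses use multidimensional state spaces and the full monodromy matrix, so this scalar version is only a sketch of the principle; the synthetic data and its contraction factor are invented.

```python
import random

def floquet_multiplier_1d(section_states):
    """Estimate a scalar Floquet multiplier from states sampled once per
    cycle on a Poincare section: least-squares fit of x_{k+1} ≈ λ x_k
    for deviations from the fixed point."""
    mean = sum(section_states) / len(section_states)
    d = [s - mean for s in section_states]
    num = sum(d[i] * d[i + 1] for i in range(len(d) - 1))
    den = sum(d[i] ** 2 for i in range(len(d) - 1))
    return num / den

# synthetic gait-like data: per-cycle deviations contract by a factor 0.6,
# continually re-excited by noise
random.seed(0)
x, states = 0.0, []
for _ in range(400):
    x = 0.6 * x + 0.1 * (random.random() - 0.5)
    states.append(x)
lam = floquet_multiplier_1d(states)   # expected near 0.6, i.e. orbitally stable
```

The review's point about state-space choice shows up even here: which variables are sampled on the section, and how many cycles are available, directly change the fitted multiplier.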
Dynamic test/analysis correlation using reduced analytical models
NASA Technical Reports Server (NTRS)
Mcgowan, Paul E.; Angelucci, A. Filippo; Javeed, Mehzad
1992-01-01
Test/analysis correlation is an important aspect of verifying the analysis models used to predict the on-orbit response characteristics of large space structures. This paper presents results of a study using reduced analysis models for dynamic test/analysis correlation. The reduced test-analysis model (TAM) has the same number and orientation of degrees of freedom (DOF) as the test measurements. Two reduction methods, static (Guyan) reduction and the Improved Reduced System (IRS) reduction, are applied to the test/analysis correlation of a laboratory truss structure. Simulated test results and modal test data are used to examine the performance of each method. It is shown that the selection of DOF to be retained in the TAM is critical when large structural masses are involved. In addition, the use of modal test results may introduce difficulties in TAM accuracy even if a large number of DOF are retained in the TAM.
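Static (Guyan) reduction has a compact closed form: partition the DOF into masters m and slaves s, assume the slaves carry no applied load, and condense with T = [I; -K_ss⁻¹ K_sm], giving K_r = TᵀKT and M_r = TᵀMT. A sketch on a hypothetical three-DOF spring-mass chain (not the paper's truss), condensing out one slave DOF so the K_ss inverse is a scalar:

```python
def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def guyan_reduce(K, M, masters):
    """Static (Guyan) reduction of stiffness and mass to the master DOF.
    This sketch handles a single slave DOF, so K_ss inverts as a scalar."""
    n = len(K)
    slaves = [i for i in range(n) if i not in masters]
    assert len(slaves) == 1, "sketch handles one slave DOF"
    s = slaves[0]
    T = []                                  # rows follow the original DOF order
    for i in range(n):
        if i in masters:
            T.append([1.0 if i == j else 0.0 for j in masters])
        else:
            T.append([-K[s][j] / K[s][s] for j in masters])
    Tt = transpose(T)
    Kr = matmul(Tt, matmul(K, T))
    Mr = matmul(Tt, matmul(M, T))
    return Kr, Mr

# 3-DOF chain fixed at one end, equal springs k and masses m (illustrative)
k, m = 1000.0, 1.0
K = [[2 * k, -k, 0.0], [-k, 2 * k, -k], [0.0, -k, k]]
M = [[m, 0.0, 0.0], [0.0, m, 0.0], [0.0, 0.0, m]]
Kr, Mr = guyan_reduce(K, M, masters=[0, 2])
```

The reduced stiffness is exact for static response, but M_r inherits the static shape assumption, which is why mass-heavy slave DOF degrade TAM accuracy and why IRS adds an inertia correction.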
The role of ecological dynamics in analysing performance in team sports.
Vilar, Luís; Araújo, Duarte; Davids, Keith; Button, Chris
2012-01-01
Performance analysis is a subdiscipline of the sports sciences, and one approach, notational analysis, has been used to objectively audit and describe the behaviours of performers during different subphases of play, providing additional information for practitioners to improve future sports performance. Recent criticisms of these methods have suggested the need for a sound theoretical rationale to explain performance behaviours, not just describe them. The aim of this article was to show how ecological dynamics provides a valid theoretical explanation of performance in team sports by explaining the formation of successful and unsuccessful patterns of play, based on symmetry-breaking processes emerging from functional interactions between players and the performance environment. We offer the view that ecological dynamics is an upgrade to more operational methods of performance analysis that merely document statistics of competitive performance. In support of our arguments, we refer to exemplar data on competitive performance in team sports that have revealed functional interpersonal interactions between attackers and defenders, based on variations in the spatial positioning of performers relative to each other in critical performance areas, such as the scoring zones. Implications of this perspective are also considered for practice task design and sport development programmes.
2014-01-01
Background To build research capacity among graduating medical students, the teaching of research and critical analysis was integrated into the University of Wollongong (UoW) new, graduate-entry medical curriculum. This study examined whether the self-perceived research experiences of medical students, and consequent research capability, were influenced by exposure to this innovative research and critical analysis curriculum, which incorporated a 12-month community-based research project and associated assessment tasks. Methods The first three medical student cohorts (N = 221) completed a self-assessment of their research experiences in ten areas of research activity. Their responses were collected before and after they undertook an individual community-based research project within a 12-month regional/rural clinical placement. The research areas investigated by the self-assessment tool were: (i) defining a research question/idea; (ii) writing a research protocol; (iii) finding relevant literature; (iv) critically reviewing the literature; (v) using quantitative research methods; (vi) using qualitative research methods; (vii) analysing and interpreting results; (viii) writing and presenting a research report; (ix) publishing results; and (x) applying for research funding. Results Participation rates of 94% (207/221) pre-placement and 99% (219/221) post-placement were achieved from the three student cohorts. Following the successful completion of the research projects and their assessment tasks, the median responses were significantly higher (p < 0.05) in nine of the ten research areas. The only area of research for which no increase was recorded for any of the three cohorts, or overall, was (x) applying for research funding. This activity was not a component of the UoW research and critical analysis curriculum, and the item was included as a test of internal validity. Significant gains were also seen between cohorts in some key research areas.
Conclusions Improved research capability among medical students was evidenced by increased scores in various areas of research experience in the context of successful completion of relevant assessment tasks. The results suggest that research capability of medical students can be positively influenced by the provision of a research-based integrated medical curriculum and further consolidated by authentic learning experiences, gained through conducting ‘hands-on’ research projects, under the supervision and mentoring of research-qualified academics. PMID:25096817
Software Dependability and Safety Evaluations ESA's Initiative
NASA Astrophysics Data System (ADS)
Hernek, M.
ESA has allocated funds for an initiative to evaluate dependability and safety methods for software. The objectives of this initiative are: more extensive validation of safety and dependability techniques for software; and the provision of valuable results to improve software quality, thereby promoting the application of dependability and safety methods and techniques. ESA space systems are developed according to defined product assurance (PA) requirement specifications. These requirements may be implemented through various design concepts, e.g. redundancy and diversity, varying from project to project. Analysis methods (FMECA, FTA, HA, etc.) are frequently used during requirements analysis and design activities to assure the correct implementation of system PA requirements. The criticality level of failures, functions and systems is determined, thereby identifying the critical sub-systems to which dependability and safety techniques are to be applied during development. Proper software development requires a technical specification for the products at the beginning of the life cycle, comprising both functional and non-functional requirements; the non-functional requirements address product characteristics such as quality, dependability, safety and maintainability. Software in space systems is increasingly used in critical functions, and the trend towards more frequent use of COTS and reusable components poses new difficulties in assuring reliable and safe systems; software dependability and safety must therefore be carefully analysed. ESA identified and documented techniques, methods and procedures to ensure that software dependability and safety requirements are specified and taken into account during the design and development of a software system, and to verify/validate that the implemented software systems comply with these requirements [R1].
In-die photomask registration and overlay metrology with PROVE using 2D correlation methods
NASA Astrophysics Data System (ADS)
Seidel, D.; Arnz, M.; Beyer, D.
2011-11-01
According to the ITRS roadmap, the semiconductor industry is driving 193 nm lithography to its limits, using techniques like double exposure, double patterning, mask-source optimization and inverse lithography. For photomask metrology this translates to full in-die measurement capability for registration and critical dimension, together with challenging specifications for repeatability and accuracy. Overlay in particular is becoming more and more critical and must be ensured on every die. For this, Carl Zeiss SMS has developed the next-generation photomask registration and overlay metrology tool PROVE®, which serves the 32nm node and below and is already well established in the market. PROVE® features highly stable hardware components for the stage and environmental control. To ensure in-die measurement capability, sophisticated image analysis methods based on 2D correlations have been developed. In this paper we demonstrate the in-die capability of PROVE® and present corresponding measurement results for short-term and long-term measurements, as well as the attainable accuracy for feature sizes down to 85nm using different illumination modes and mask types. Standard measurement methods based on threshold criteria are compared with the new 2D correlation methods to demonstrate the performance gain of the latter. In addition, mask-to-mask overlay results of typical box-in-frame structures down to 200nm feature size are presented. It is shown that a reproducibility budget can be derived from overlay measurements that takes into account stage, image analysis and global effects such as mask loading and environmental control. The parts of the budget are quantified from measurement results to identify critical error contributions and to focus the corresponding improvement strategies.
Lenton, T. M.; Livina, V. N.; Dakos, V.; Van Nes, E. H.; Scheffer, M.
2012-01-01
We address whether robust early warning signals can, in principle, be provided before a climate tipping point is reached, focusing on methods that seek to detect critical slowing down as a precursor of bifurcation. As a test bed, six previously analysed datasets are reconsidered, three palaeoclimate records approaching abrupt transitions at the end of the last ice age and three models of varying complexity forced through a collapse of the Atlantic thermohaline circulation. Approaches based on examining the lag-1 autocorrelation function or on detrended fluctuation analysis are applied together and compared. The effects of aggregating the data, detrending method, sliding window length and filtering bandwidth are examined. Robust indicators of critical slowing down are found prior to the abrupt warming event at the end of the Younger Dryas, but the indicators are less clear prior to the Bølling-Allerød warming, or glacial termination in Antarctica. Early warnings of thermohaline circulation collapse can be masked by inter-annual variability driven by atmospheric dynamics. However, rapidly decaying modes can be successfully filtered out by using a long bandwidth or by aggregating data. The two methods have complementary strengths and weaknesses and we recommend applying them together to improve the robustness of early warnings. PMID:22291229
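Detecting critical slowing down via the lag-1 autocorrelation function, as in the study above, can be sketched on synthetic data whose AR(1) coefficient drifts toward 1 as a "tipping point" approaches; rising autocorrelation in sliding windows is the early-warning indicator. The model, parameters and window length below are assumptions for illustration, not the paper's datasets or exact pre-processing (which also involves detrending and filtering).

```python
import random

def lag1_autocorr(x):
    """Lag-1 autocorrelation of one (already detrended) window."""
    m = sum(x) / len(x)
    d = [v - m for v in x]
    num = sum(d[i] * d[i + 1] for i in range(len(d) - 1))
    den = sum(v * v for v in d)
    return num / den

# synthetic series with critical slowing down: the AR(1) coefficient
# drifts from 0.2 toward 0.99 as the transition approaches
random.seed(1)
n = 4000
x, series = 0.0, []
for t in range(n):
    phi = 0.2 + 0.79 * t / n
    x = phi * x + random.gauss(0, 1)
    series.append(x)

window = 500
ac = [lag1_autocorr(series[i:i + window])
      for i in range(0, n - window + 1, window)]
# early-warning indicator: ac rises toward the transition
```

In practice the trend in such an indicator is usually quantified with a rank correlation (e.g. Kendall's tau) over the window sequence, and checked for robustness to the window and bandwidth choices the abstract discusses.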
A Framework for Creating a Function-based Design Tool for Failure Mode Identification
NASA Technical Reports Server (NTRS)
Arunajadai, Srikesh G.; Stone, Robert B.; Tumer, Irem Y.; Clancy, Daniel (Technical Monitor)
2002-01-01
Knowledge of potential failure modes during design is critical for the prevention of failures. Industries currently use procedures such as Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis, or Failure Modes, Effects and Criticality Analysis (FMECA), together with knowledge and experience, to determine potential failure modes. When new products are being developed, there is often insufficient knowledge of potential failure modes and/or insufficient experience to identify all failure modes, so engineers are unable to extract maximum benefit from the above procedures. This work describes a function-based failure identification methodology that acts as a storehouse of information and experience, providing useful information about the potential failure modes for the design under consideration and enhancing the usefulness of procedures like FMEA. As an example, the method is applied to fifteen products and the benefits are illustrated.
Specimen preparation for NanoSIMS analysis of biological materials
NASA Astrophysics Data System (ADS)
Grovenor, C. R. M.; Smart, K. E.; Kilburn, M. R.; Shore, B.; Dilworth, J. R.; Martin, B.; Hawes, C.; Rickaby, R. E. M.
2006-07-01
In order to achieve reliable and reproducible analysis of biological materials by SIMS, it is critical both that the chosen specimen preparation method does not substantially modify the in vivo chemistry that is the focus of the study, and that any chemical information obtained can be calibrated accurately through the selection of appropriate standards. In Oxford, we have been working with our new Cameca NanoSIMS50 on two very distinct classes of biological materials. In the first, human hair, the sample preparation problems are relatively undemanding, but calibration for trace metal analysis is a critical issue. In the second, marine coccoliths and hyperaccumulator plants, reliable specimen preparation by rapid freezing and controlled drying to preserve the distribution of diffusible species is the first and most demanding requirement, although worthwhile experiments on tracking key elements can still be undertaken even when some redistribution of the most diffusible ions has clearly occurred.
Giardina, M; Castiglia, F; Tomarchio, E
2014-12-01
Failure mode, effects and criticality analysis (FMECA) is a safety technique extensively used in many different industrial fields to identify and prevent potential failures. In the application of traditional FMECA, the risk priority number (RPN) is determined to rank the failure modes; however, the method has been criticised for having several weaknesses. Moreover, it is unable to adequately deal with human errors or negligence. In this paper, a new versatile fuzzy rule-based assessment model is proposed to evaluate the RPN index to rank both component failure and human error. The proposed methodology is applied to potential radiological over-exposure of patients during high-dose-rate brachytherapy treatments. The critical analysis of the results can provide recommendations and suggestions regarding safety provisions for the equipment and procedures required to reduce the occurrence of accidental events.
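For reference, the crisp risk priority number that the paper's fuzzy rule base replaces is a simple product of three 1-10 scores. The failure modes and scores below are hypothetical illustrations for a brachytherapy-like process, not values from the study.

```python
def rpn_ranking(failure_modes):
    """Classical FMECA ranking: RPN = occurrence * severity * detectability
    (each scored 1-10), sorted worst-first. The paper replaces this crisp
    product with fuzzy if-then rules to address its known weaknesses
    (e.g. different (O, S, D) triples giving identical RPNs)."""
    scored = [(name, o * s * d) for name, (o, s, d) in failure_modes.items()]
    return sorted(scored, key=lambda t: -t[1])

# hypothetical (occurrence, severity, detectability) scores
modes = {
    "source positioning error": (3, 10, 4),
    "timer failure":            (2, 9, 3),
    "operator data entry":      (6, 7, 5),
}
ranking = rpn_ranking(modes)   # worst first
```

A fuzzy variant keeps the same three inputs but maps linguistic grades (e.g. "remote", "moderate", "high") through a rule base and defuzzification, so that human-error modes can be graded even when crisp frequency data are unavailable.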
Orlandini, S; Pasquini, B; Stocchero, M; Pinzauti, S; Furlanetto, S
2014-04-25
The development of a capillary electrophoresis (CE) method for the assay of almotriptan (ALM) and its main impurities using an integrated Quality by Design and mixture-process variable (MPV) approach is described. A scouting phase was initially carried out by evaluating different CE operative modes, including the addition of pseudostationary phases and additives to the background electrolyte, in order to approach the analytical target profile. This step made it possible to select normal-polarity microemulsion electrokinetic chromatography (MEEKC) as the operative mode, which allowed good selectivity to be achieved in a short analysis time. On the basis of a general Ishikawa diagram for MEEKC methods, a screening asymmetric matrix was applied in order to screen the effects of the process variables (PVs) voltage, temperature, buffer concentration and buffer pH on the critical quality attributes (CQAs), represented by critical separation values and analysis time. A response surface study was then carried out considering all the critical process parameters, including both the PVs and the mixture components (MCs) of the microemulsion (borate buffer, n-heptane as oil, sodium dodecyl sulphate/n-butanol as surfactant/cosurfactant). The values of PVs and MCs were simultaneously changed in an MPV study, making it possible to find significant interaction effects. The design space (DS) was defined as the multidimensional combination of PVs and MCs where the probability for the different considered CQAs to be acceptable was higher than a quality level π = 90%. The DS was identified by risk-of-failure maps, drawn on the basis of Monte-Carlo simulations, and verification points spanning the design space were tested. Robustness testing of the method, performed by a D-optimal design, and system suitability criteria allowed a control strategy to be designed. The optimized method was validated following ICH Guideline Q2(R1) and was applied to a real sample of ALM coated tablets.
Copyright © 2014 Elsevier B.V. All rights reserved.
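The risk-of-failure maps underlying the design space definition can be illustrated with a one-variable toy model: at each candidate operating point, Monte-Carlo draws from an assumed noisy response estimate the probability that the CQA meets its specification, and the design space is where that probability exceeds the 90% quality level. The response function, noise level, spec limit and grid below are invented for the sketch, not taken from the paper.

```python
import random

def ds_probability(setting, spec=1.5, n=2000):
    """Monte-Carlo estimate of P(CQA meets spec) at one operating point,
    under a hypothetical linear response with Gaussian noise."""
    random.seed(42)   # fixed stream so the map is reproducible
    ok = sum(1 for _ in range(n)
             if 1.0 + 0.4 * setting + random.gauss(0, 0.2) >= spec)
    return ok / n

settings = [0.5, 1.0, 1.5, 2.0]                 # candidate operating points
probs = {s: ds_probability(s) for s in settings}
design_space = [s for s in settings if probs[s] >= 0.90]
```

In the real MPV case the map is multidimensional (PVs and MCs varied together) and the acceptance probability is evaluated jointly over several CQAs, but the accept/reject logic per grid point is the same.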
Sample normalization methods in quantitative metabolomics.
Wu, Yiman; Li, Liang
2016-01-22
To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis, including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can differ significantly from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on the quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real-world metabolomics applications. Sample normalization has sometimes been ignored in metabolomics, partly due to the lack of a convenient means of performing it. We show that several methods are now available and that sample normalization should be performed in quantitative metabolomics whenever the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
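Two common normalization schemes of the kind such reviews discuss, total-sum normalization and probabilistic quotient normalization (PQN), can be sketched as follows; the metabolite names and intensities are made up, and this is a generic sketch rather than any specific method from the review.

```python
def total_sum_normalize(sample):
    """Scale each metabolite intensity by the sample's total signal."""
    total = sum(sample.values())
    return {k: v / total for k, v in sample.items()}

def pqn(sample, reference):
    """Probabilistic quotient normalization: divide the sample by the
    median ratio of its intensities to a reference profile, removing a
    uniform dilution of the whole sample while preserving the few
    metabolites that truly changed."""
    quotients = sorted(sample[k] / reference[k] for k in reference)
    n = len(quotients)
    median = (quotients[n // 2] if n % 2
              else 0.5 * (quotients[n // 2 - 1] + quotients[n // 2]))
    return {k: v / median for k, v in sample.items()}

reference = {"glucose": 10.0, "lactate": 5.0, "alanine": 2.0}
diluted = {k: 0.5 * v for k, v in reference.items()}   # sample diluted 2x
diluted["lactate"] *= 1.2                              # one genuine change
restored = pqn(diluted, reference)
```

Note the contrast: total-sum normalization is distorted when a few dominant metabolites change, whereas PQN's median quotient is robust to such changes, which is one of the trade-offs a normalization-method comparison weighs.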
Technologies for Clinical Diagnosis Using Expired Human Breath Analysis
Mathew, Thalakkotur Lazar; Pownraj, Prabhahari; Abdulla, Sukhananazerin; Pullithadathil, Biji
2015-01-01
This review elucidates the technologies in the field of exhaled breath analysis. Exhaled breath gas analysis offers an inexpensive, noninvasive and rapid method for detecting a large number of compounds under various conditions in health and disease states. There are various techniques to analyze exhaled breath gases, including spectrometry, gas chromatography and spectroscopy. This review places emphasis on some of the critical biomarkers present in exhaled human breath and their related effects. Additionally, various medical monitoring techniques used for breath analysis have been discussed. It also includes the current scenario of breath analysis with nanotechnology-oriented techniques. PMID:26854142
Critical Path Method Networks and Their Use in Claims Analysis.
1984-01-01
produced will only be as good as the time invested and the knowledge of the scheduler. A schedule which is based on faulty logic or which contains... fundamentals of putting a schedule together but also how the construction process functions, so that the delays can be accurately inserted. When
49 CFR Appendix F to Part 229 - Recommended Practices for Design and Safety Analysis
Code of Federal Regulations, 2014 CFR
2014-10-01
... expected order of use; (v) Group similar controls together; (vi) Design for high stimulus-response compatibility (geometric and conceptual); (vii) Design safety-critical controls to require more than one... description of all backup methods of operation; and (s) The configuration/revision control measures designed...
49 CFR Appendix F to Part 229 - Recommended Practices for Design and Safety Analysis
Code of Federal Regulations, 2012 CFR
2012-10-01
... expected order of use; (v) Group similar controls together; (vi) Design for high stimulus-response compatibility (geometric and conceptual); (vii) Design safety-critical controls to require more than one... description of all backup methods of operation; and (s) The configuration/revision control measures designed...
49 CFR Appendix F to Part 229 - Recommended Practices for Design and Safety Analysis
Code of Federal Regulations, 2013 CFR
2013-10-01
... expected order of use; (v) Group similar controls together; (vi) Design for high stimulus-response compatibility (geometric and conceptual); (vii) Design safety-critical controls to require more than one... description of all backup methods of operation; and (s) The configuration/revision control measures designed...
ERIC Educational Resources Information Center
Roussakis, Yiannis
2018-01-01
This article attempts a reading of Andreas M. Kazamias's work and method as a persistent and firmly grounded attempt to "go against the tide" of an empirical/instrumentalist comparative education and toward a "modernist 'episteme.'" Kazamias has been explicitly critical of the social-scientific-cum-positivist comparative…
Articles on Mass Communication in U.S. and Foreign Journals.
ERIC Educational Resources Information Center
McKerns, Joseph P.; And Others, Eds.
1984-01-01
Annotates a number of journal articles dealing with a variety of subjects, including (1) advertising, (2) audience and communicatory analysis, (3) broadcasting, (4) communication theory, (5) courts and the law, (6) media criticism, (7) editorial policy and methods, (8) journalism education, (9) government and media, and (10) technology. (FL)
Optimizing Web-Based Instruction: A Case Study Using Poultry Processing Unit Operations
ERIC Educational Resources Information Center
O' Bryan, Corliss A.; Crandall, Philip G.; Shores-Ellis, Katrina; Johnson, Donald M.; Ricke, Steven C.; Marcy, John
2009-01-01
Food companies and supporting industries need inexpensive, revisable training methods for large numbers of hourly employees due to continuing improvements in Hazard Analysis Critical Control Point (HACCP) programs, new processing equipment, and high employee turnover. HACCP-based food safety programs have demonstrated their value by reducing the…
Implementing Service Excellence in Higher Education
ERIC Educational Resources Information Center
Khan, Hina; Matlay, Harry
2009-01-01
Purpose: The purpose of this paper is to provide a critical analysis of the importance of service excellence in higher education. Design/methodology/approach: The research upon which this paper is based employed a phenomenological approach. This method was selected for its focus on respondent perceptions and experiences. Both structured and…
Misrepresenting Chinese Folk Happiness: A Critique of a Study
ERIC Educational Resources Information Center
Ip, Po-Keung
2013-01-01
Discourses on Chinese folk happiness are often based on anecdotal narratives or qualitative analysis. A recent study on Chinese folk happiness using qualitative method seems to provide some empirical findings beyond anecdotal evidence on Chinese folk happiness. This paper critically examines the study's constructed image of Chinese folk happiness,…
A Writing-Intensive, Methods-Based Laboratory Course for Undergraduates
ERIC Educational Resources Information Center
Colabroy, Keri L.
2011-01-01
Engaging undergraduate students in designing and executing original research should not only be accompanied by technique training but also intentional instruction in the critical analysis and writing of scientific literature. The course described here takes a rigorous approach to scientific reading and writing using primary literature as the model…
Teaching Business Programming Using Games: A Critical Analysis
ERIC Educational Resources Information Center
Muganda, Nixon; Joubert, Pieter, Jr.; Toit, Jacques Du; Johnson, Roy
2012-01-01
Introduction: This paper examines the persistent problematic issue of engaging business students in teaching computer programming. Studies continue to document challenges in teaching computer programming and various methods have been proposed with varying degrees of success. From an educator's perspective, the concern is how to engage students to…
Japan's Teacher Acculturation: Critical Analysis through Comparative Ethnographic Narrative
ERIC Educational Resources Information Center
Howe, Edward R.
2005-01-01
Cross-cultural teaching and research in Canada and Japan is reported. Ethnographic narrative methods were used to examine Japan's teacher acculturation. Canada's teachers are largely required to work in isolation, to learn their practice through trial and error. There is little provision for mentorship and insufficient time to reflect. In…