ERIC Educational Resources Information Center
Dyehouse, Jeremiah
2007-01-01
Researchers studying technology development often examine how rhetorical activity contributes to technologies' design, implementation, and stabilization. This article offers a possible methodology for studying one role of rhetorical activity in technology development: knowledge consolidation analysis. Applying this method to an exemplar case, the…
Analysis of Introducing Active Learning Methodologies in a Basic Computer Architecture Course
ERIC Educational Resources Information Center
Arbelaitz, Olatz; Martín, José I.; Muguerza, Javier
2015-01-01
This paper presents an analysis of introducing active methodologies in the Computer Architecture course taught in the second year of the Computer Engineering Bachelor's degree program at the University of the Basque Country (UPV/EHU), Spain. The paper reports the experience from three academic years, 2011-2012, 2012-2013, and 2013-2014, in which…
ERIC Educational Resources Information Center
Becher, Ayelet; Orland-Barak, Lily
2016-01-01
This study suggests an integrative qualitative methodological framework for capturing complexity in mentoring activity. Specifically, the model examines how historical developments of a discipline direct mentors' mediation of professional knowledge through the language that they use. The model integrates social activity theory and a framework of…
Analysis and Design of Fuselage Structures Including Residual Strength Prediction Methodology
NASA Technical Reports Server (NTRS)
Knight, Norman F.
1998-01-01
The goal of this research project is to develop and assess methodologies for the design and analysis of fuselage structures accounting for residual strength. Two primary objectives are included in this research activity: development of a structural analysis methodology for predicting the residual strength of fuselage shell-type structures; and development of accurate, efficient analysis, design, and optimization tools for fuselage shell structures. Assessment of these tools for robustness, efficiency, and usability in a fuselage shell design environment will be integrated with these two primary research objectives.
System Dynamics Modeling for Proactive Intelligence
2010-01-01
Excerpt (table of contents): 4. Modeling Resources as Part of an Integrated Multi-Methodology System; 5. Formalizing Pro-Active…; Figure: Observable Data With and Without Simulation Analysis; Figure 13. Summary of Probe Methodology and Results; …Strategy; Figure 22. Overview of Methodology.
Representation of scientific methodology in secondary science textbooks
NASA Astrophysics Data System (ADS)
Binns, Ian C.
The purpose of this investigation was to assess the representation of scientific methodology in secondary science textbooks. More specifically, this study looked at how textbooks introduced scientific methodology and to what degree the examples from the rest of the textbook, the investigations, and the images were consistent with the text's description of scientific methodology, if at all. The sample included eight secondary science textbooks from two publishers, McGraw-Hill/Glencoe and Harcourt/Holt, Rinehart & Winston. Data consisted of all student text and teacher text that referred to scientific methodology. Second, all investigations in the textbooks were analyzed. Finally, any images that depicted scientists working were also collected and analyzed. The text analysis and activity analysis used the ethnographic content analysis approach developed by Altheide (1996). The rubrics used for the text analysis and activity analysis were initially guided by the Benchmarks (AAAS, 1993), the NSES (NRC, 1996), and the nature of science literature. Preliminary analyses helped to refine each of the rubrics and grounded them in the data. Image analysis used stereotypes identified in the DAST literature. Findings indicated that all eight textbooks presented mixed views of scientific methodology in their initial descriptions. Five textbooks placed more emphasis on the traditional view and three placed more emphasis on the broad view. Results also revealed that the initial descriptions, examples, investigations, and images all emphasized the broad view for Glencoe Biology and the traditional view for Chemistry: Matter and Change. The initial descriptions, examples, investigations, and images in the other six textbooks were not consistent. Overall, the textbook with the most appropriate depiction of scientific methodology was Glencoe Biology and the textbook with the least appropriate depiction of scientific methodology was Physics: Principles and Problems. These findings suggest that compared to earlier investigations, textbooks have begun to improve in how they represent scientific methodology. However, there is still much room for improvement. Future research needs to consider how textbooks impact teachers' and students' understandings of scientific methodology.
Risk analysis within environmental impact assessment of proposed construction activity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeleňáková, Martina; Zvijáková, Lenka
Environmental impact assessment is an important process, prior to approval of the investment plan, providing a detailed examination of the likely and foreseeable impacts of proposed construction activity on the environment. The objective of this paper is to develop a specific methodology for the analysis and evaluation of environmental impacts of selected constructions – flood protection structures – using risk analysis methods. The application of the methodology designed for the environmental impact assessment process will develop assumptions for further improvements or more effective implementation and performance of this process. The main objective of the paper is to improve the implementation of the environmental impact assessment process. Through the use of risk analysis methods in the environmental impact assessment process, the set objective has been achieved. Highlights: This paper is informed by an effort to develop research with the aim of: • Improving existing qualitative and quantitative methods for assessing the impacts • A better understanding of relations between probabilities and consequences • Methodology for the EIA of flood protection constructions based on risk analysis • Creative approaches in the search for environmentally friendly proposed activities.
Reliability Prediction Analysis: Airborne System Results and Best Practices
NASA Astrophysics Data System (ADS)
Silva, Nuno; Lopes, Rui
2013-09-01
This article presents the results of several reliability prediction analyses for aerospace components, made with both methodologies, 217F and 217Plus. Supporting and complementary activities are described, as well as the differences concerning the results and the applications of both methodologies, which are summarized in a set of lessons learned that are very useful for RAMS and safety prediction practitioners. The effort required for these activities is also an important point that is discussed, as are the end results and their interpretation/impact on the system design. The article concludes by positioning these activities and methodologies in an overall process for space and aeronautics equipment/component certification, and by highlighting their advantages. Some good practices have also been summarized and some reuse rules have been laid down.
Semantic integration of gene expression analysis tools and data sources using software connectors
2013-01-01
Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data. PMID:24341380
Semantic integration of gene expression analysis tools and data sources using software connectors.
Miyazaki, Flávia A; Guardia, Gabriela D A; Vêncio, Ricardo Z N; de Farias, Cléver R G
2013-10-25
The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data.
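As an illustration of the connector-with-transformation-rules idea described in the abstract above, here is a minimal Python sketch; the Connector class, the rule functions, and the field names (ID_REF, VALUE) are hypothetical assumptions for illustration, not the API proposed by the authors.

```python
from typing import Any, Callable, Dict, List
import math

# A connector mediates between a gene expression data source and an analysis
# tool by applying transformation rules to each exchanged record.
TransformRule = Callable[[Dict[str, Any]], Dict[str, Any]]

class Connector:
    def __init__(self, rules: List[TransformRule]):
        self.rules = rules

    def transfer(self, record: Dict[str, Any]) -> Dict[str, Any]:
        """Apply every transformation rule in order before the record is
        handed to the target analysis tool."""
        for rule in self.rules:
            record = rule(record)
        return record

def rename_probe_field(record: Dict[str, Any]) -> Dict[str, Any]:
    out = dict(record)
    out["probe_id"] = out.pop("ID_REF")          # align source field name with tool schema
    return out

def log_transform_value(record: Dict[str, Any]) -> Dict[str, Any]:
    out = dict(record)
    out["log2_value"] = math.log2(out["VALUE"])  # scale harmonization between tools
    return out

connector = Connector([rename_probe_field, log_transform_value])
print(connector.transfer({"ID_REF": "1007_s_at", "VALUE": 256.0}))
```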
Systematic analysis of EOS data system for operations
NASA Technical Reports Server (NTRS)
Moe, K. L.; Dasgupta, R.
1985-01-01
A data management analysis methodology is being proposed. The objective of the methodology is to assist mission managers by identifying a series of ordered activities to be systematically followed in order to arrive at an effective ground system design. Existing system engineering tools and concepts have been assembled into a structured framework to facilitate the work of a mission planner. It is intended that this methodology can be gainfully applied (with probable modifications and/or changes) to the EOS payloads and their associated data systems.
Enviroplan—a summary methodology for comprehensive environmental planning and design
Robert Allen Jr.; George Nez; Fred Nicholson; Larry Sutphin
1979-01-01
This paper will discuss a comprehensive environmental assessment methodology that includes a numerical method for visual management and analysis. This methodology employs resource and human activity units as a means to produce a visual form unit which is the fundamental unit of the perceptual environment. The resource unit is based on the ecosystem as the fundamental...
System architectures for telerobotic research
NASA Technical Reports Server (NTRS)
Harrison, F. Wallace
1989-01-01
Several activities are performed related to the definition and creation of telerobotic systems. The effort and investment required to create architectures for these complex systems can be enormous; however, the magnitude of the process can be reduced if structured design techniques are applied. A number of informal methodologies supporting certain aspects of the design process are available. More recently, prototypes of integrated tools supporting all phases of system design, from requirements analysis to code generation and hardware layout, have begun to appear. Activities related to the system architecture of telerobots are described, including current activities designed to provide a methodology for the comparison and quantitative analysis of alternative system architectures.
Neutron activation analysis: trends in developments and applications
NASA Astrophysics Data System (ADS)
de Goeij, J. J.; Bode, P.
1995-03-01
New developments in instrumentation for, and methodology of, Instrumental Neutron Activation Analysis (INAA) may lead to new niches for this method of elemental analysis. This paper describes the possibilities of advanced detectors, automated irradiation and counting stations, and very large sample analysis. An overview is given of some typical new fields of application.
Spatiotemporal Data Mining, Analysis, and Visualization of Human Activity Data
ERIC Educational Resources Information Center
Li, Xun
2012-01-01
This dissertation addresses the research challenge of developing efficient new methods for discovering useful patterns and knowledge in large volumes of electronically collected spatiotemporal activity data. I propose to analyze three types of such spatiotemporal activity data in a methodological framework that integrates spatial analysis, data…
Haegele, Justin A; Hodge, Samuel Russell
2015-10-01
There are basic philosophical and paradigmatic assumptions that guide scholarly research endeavors, including the methods used and the types of questions asked. Through this article, kinesiology faculty and students with interests in adapted physical activity are encouraged to understand the basic assumptions of applied behavior analysis (ABA) methodology for conducting, analyzing, and presenting research of high quality in this paradigm. The purposes of this viewpoint paper are to present information fundamental to understanding the assumptions undergirding research methodology in ABA, describe key aspects of single-subject research designs, and discuss common research designs and data-analysis strategies used in single-subject studies.
2010-04-01
available [11]. Additionally, Table 3 is a guide for the DMAIC methodology, including 29 different methods [12]. Table 3: DMAIC Methodology (5-Phase Methodology): Define, Measure, Analyze, Improve, Control; Project Charter; Prioritization Matrix; 5 Whys Analysis… Methodology scope [13]: DMAIC, PDCA. Develop performance priorities: this is a preliminary stage that precedes specific improvement projects, and the aim…
Passive and semi-active heave compensator: Project design methodology and control strategies.
Cuellar Sanchez, William Humberto; Linhares, Tássio Melo; Neto, André Benine; Fortaleza, Eugênio Libório Feitosa
2017-01-01
A heave compensator is a system that mitigates the transmission of heave movement from a vessel to the equipment on the vessel. In the drilling industry, a heave compensator enables drilling in offshore environments: it attenuates movement transmitted from the vessel to the drill string and drill bit, ensuring the security and efficiency of the offshore drilling process. Common types of heave compensators are passive, active, and semi-active compensators. This article presents four main points. First, a bulk modulus analysis yields a simple condition to determine whether the bulk modulus can be neglected in the design of a hydropneumatic passive heave compensator. Second, a methodology to design passive heave compensators with the desired frequency response is presented. Third, four control methodologies for a semi-active heave compensator are tested and compared numerically. Lastly, we show experimental results obtained from a prototype built with the methodology developed for the passive heave compensator.
Passive and semi-active heave compensator: Project design methodology and control strategies
Cuellar Sanchez, William Humberto; Neto, André Benine; Fortaleza, Eugênio Libório Feitosa
2017-01-01
A heave compensator is a system that mitigates the transmission of heave movement from a vessel to the equipment on the vessel. In the drilling industry, a heave compensator enables drilling in offshore environments: it attenuates movement transmitted from the vessel to the drill string and drill bit, ensuring the security and efficiency of the offshore drilling process. Common types of heave compensators are passive, active, and semi-active compensators. This article presents four main points. First, a bulk modulus analysis yields a simple condition to determine whether the bulk modulus can be neglected in the design of a hydropneumatic passive heave compensator. Second, a methodology to design passive heave compensators with the desired frequency response is presented. Third, four control methodologies for a semi-active heave compensator are tested and compared numerically. Lastly, we show experimental results obtained from a prototype built with the methodology developed for the passive heave compensator. PMID:28813494
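To illustrate the kind of frequency-response reasoning behind passive compensator design, the sketch below evaluates the transmissibility of a simple mass-spring-damper model in Python; the model form and all parameter values are assumptions, not the prototype described in the article.

```python
import numpy as np

# Transmissibility from vessel heave to suspended-load motion for a
# mass-spring-damper isolator: |X_load / X_vessel|(w) = |(c*jw + k) / (m*(jw)^2 + c*jw + k)|.
m = 5.0e4        # suspended mass, kg (assumed)
k = 2.0e5        # effective gas-spring stiffness, N/m (assumed)
c = 3.0e4        # damping coefficient, N*s/m (assumed)

w = 2 * np.pi * np.linspace(0.01, 0.5, 500)        # wave frequencies, rad/s
jw = 1j * w
transmissibility = np.abs((c * jw + k) / (m * jw**2 + c * jw + k))

wn = np.sqrt(k / m)                                # undamped natural frequency, rad/s
print(f"natural period ~ {2*np.pi/wn:.1f} s, "
      f"worst-case transmissibility {transmissibility.max():.2f}")
```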
Analysis of Alternatives for Risk Assessment Methodologies and Tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nachtigal, Noel M.; Fruetel, Julia A.; Gleason, Nathaniel J.
The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.
ERIC Educational Resources Information Center
Wake Forest Univ., Winston Salem, NC. Bowman Gray School of Medicine.
Utilizing a systematic sampling technique, the professional activities of small groups of pediatricians, family practitioners, surgeons, obstetricians, and internists were observed for 4 or 5 days by a medical student who checked a prearranged activity sheet every 30 seconds to: (1) identify those tasks and activities an assistant could be trained…
Towards a Methodology for Identifying Program Constraints During Requirements Analysis
NASA Technical Reports Server (NTRS)
Romo, Lilly; Gates, Ann Q.; Della-Piana, Connie Kubo
1997-01-01
Requirements analysis is the activity that involves determining the needs of the customer, identifying the services that the software system should provide, and understanding the constraints on the solution. The result of this activity is a natural language document, typically referred to as the requirements definition document. Some of the problems that exist in defining requirements in large-scale software projects include synthesizing knowledge from various domain experts and communicating this information across multiple levels of personnel. One approach that addresses part of this problem is called context monitoring and involves identifying the properties of and relationships between objects that the system will manipulate. This paper examines several software development methodologies, discusses the support that each provides for eliciting such information from experts and specifying the information, and suggests refinements to these methodologies.
LSU: The Library Space Utilization Methodology.
ERIC Educational Resources Information Center
Hall, Richard B.
A computerized research technique for measuring the space utilization of public library facilities provides a behavioral activity and occupancy analysis for library planning purposes. The library space utilization (LSU) methodology demonstrates that significant information about the functional requirements of a library can be measured and…
Measuring Circulation Desk Activities Using a Random Alarm Mechanism.
ERIC Educational Resources Information Center
Mosborg, Stella Frank
1980-01-01
Reports a job analysis methodology to gather meaningful data related to circulation desk activity. The technique is designed to give librarians statistical data on actual time expenditures for complex and varying activities. (Author/RAA)
Mesquita, D P; Dias, O; Amaral, A L; Ferreira, E C
2009-04-01
In recent years, a great deal of attention has been focused on the research of activated sludge processes, where the solid-liquid separation phase is frequently considered of critical importance, due to the different problems that severely affect the compaction and the settling of the sludge. Bearing that in mind, in this work, image analysis routines were developed in Matlab environment, allowing the identification and characterization of microbial aggregates and protruding filaments in eight different wastewater treatment plants, for a combined period of 2 years. The monitoring of the activated sludge contents allowed for the detection of bulking events proving that the developed image analysis methodology is adequate for a continuous examination of the morphological changes in microbial aggregates and subsequent estimation of the sludge volume index. In fact, the obtained results proved that the developed image analysis methodology is a feasible method for the continuous monitoring of activated sludge systems and identification of disturbances.
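A minimal Python sketch of the object-characterization step described above, written with scikit-image rather than the authors' Matlab routines; the synthetic test image and the eccentricity-based aggregate/filament rule are illustrative assumptions only.

```python
import numpy as np
from skimage import measure

# Label a binary segmentation of an activated sludge micrograph and derive
# simple morphological descriptors per object.
binary = np.zeros((256, 256), dtype=bool)
rr, cc = np.mgrid[0:256, 0:256]
binary |= (rr - 60) ** 2 + (cc - 60) ** 2 < 20 ** 2   # a compact "aggregate"
binary[120:124, 20:220] = True                        # a long, thin "filament"

labels = measure.label(binary)
for region in measure.regionprops(labels):
    # Highly eccentric objects are treated as protruding filaments (assumed rule).
    kind = "filament" if region.eccentricity > 0.98 else "aggregate"
    print(f"object {region.label}: area={region.area} px, "
          f"eccentricity={region.eccentricity:.3f} -> {kind}")
```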
CACDA Jiffy War Game Technical Manual. Part 1: Methodology
1977-03-01
CACDA Jiffy War Game Technical Manual, Part 1: Methodology, by Timothy J. Bailey and Gerald A. Martin. …Combat Developments Activity (CACDA), Fort Leavenworth, Kansas, for scenario development and force structure evaluation. The Jiffy Game computer…
DOT National Transportation Integrated Search
1999-10-01
The objective of this four-year research effort is to develop and test a methodology to estimate the economic impacts of median design. This report summarizes the activities performed in the third year of this project. The primary task in the third y...
Cooper, Guy Paul; Yeager, Violet; Burkle, Frederick M.; Subbarao, Italo
2015-01-01
Background: This article describes a novel triangulation methodological approach for identifying the Twitter activity of regionally active Twitter users during the 2013 Hattiesburg EF-4 Tornado. Methodology: A data extraction and geographically centered filtration approach was utilized to generate Twitter data for 48 hrs pre- and post-Tornado. The data was further validated using a six sigma approach utilizing GPS data. Results: The regional analysis revealed a total of 81,441 tweets, 10,646 Twitter users, 27,309 retweets, and 2637 tweets with GPS coordinates. Conclusions: Twitter tweet activity increased 5-fold during the response to the Hattiesburg Tornado. Retweeting activity increased 2.2-fold. Tweets with a hashtag increased 1.4-fold. Twitter was an effective disaster risk reduction tool for the Hattiesburg EF-4 Tornado of 2013. PMID:26203396
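A minimal Python sketch of a geographically centered, time-windowed filter of the kind described above; the center coordinates, radius, event time, and tweet record fields are assumptions for illustration, not the study's actual parameters.

```python
from datetime import datetime, timedelta
from math import radians, sin, cos, asin, sqrt

# Hypothetical reference point and event time; the study's actual filter
# settings are not reproduced here.
HATTIESBURG = (31.3271, -89.2903)            # approximate lat/lon, assumed
RADIUS_KM = 50.0                             # assumed search radius
EVENT_TIME = datetime(2013, 2, 10, 17, 0)    # approximate touchdown time, assumed

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in km."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def filter_tweets(tweets):
    """Keep geotagged tweets within RADIUS_KM of the center and within
    48 h before/after the event; tag each as 'pre' or 'post'."""
    window = timedelta(hours=48)
    kept = []
    for t in tweets:  # each tweet: {'time': datetime, 'lat': float, 'lon': float, ...}
        if abs(t["time"] - EVENT_TIME) > window:
            continue
        if haversine_km(t["lat"], t["lon"], *HATTIESBURG) > RADIUS_KM:
            continue
        t["phase"] = "pre" if t["time"] < EVENT_TIME else "post"
        kept.append(t)
    return kept
```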
Global/local methods research using the CSM testbed
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. Hayden, Jr.; Thompson, Danniella M.
1990-01-01
Research activities in global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.
Environmental exposure effects on composite materials for commercial aircraft
NASA Technical Reports Server (NTRS)
Hoffman, D. J.
1978-01-01
Activities reported include completion of the program design tasks, resolution of a high fiber volume problem and resumption of specimen fabrication, fixture fabrication, and progress on the analysis methodology and the definition of the typical aircraft environment. Program design activities, including test specimens, specimen holding fixtures, flap-track fairing tailcones, and ground exposure racks, were completed. The problem experienced in obtaining acceptable fiber volume fraction results on two of the selected graphite epoxy material systems was resolved with an alteration to the bagging procedure called out in BAC 5562. The revised bagging procedure, involving lower numbers of bleeder plies, produces acceptable results. All required laminates for the contract have now been laid up and cured. Progress in the area of analysis methodology has centered on the definition of the environment that a commercial transport aircraft undergoes. The selected methodology is analogous to fatigue life assessment.
Perspectives Do Matter: "Joint Screen", a Promising Methodology for Multimodal Interaction Analysis
ERIC Educational Resources Information Center
Arend, Béatrice; Sunnen, Patrick; Fixmer, Pierre; Sujbert, Monika
2014-01-01
This paper discusses theoretical and methodological issues arising from a video-based research design and the emergent tool "Joint Screen" when grasping joint activity. We share our reflections regarding the combined reading of four synchronised camera perspectives combined in one screen. By these means we reconstruct and analyse…
Márquez, E; González, S; López, T; Giayetto, V; Littvik, A; Cannistraci, R; Pavan, J
2000-01-01
The productive character of scientific thought calls for a methodological approach that demands that students work with information rather than merely receive and reproduce it. It is essential to give students the opportunity to discover the cognitive processes used in the production of scientific knowledge. In this work, we present the results of introducing a workshop dynamic in a basic subject, the students' responses, and the analysis of the subject. In the Chair of Medical Bacteriology and Virology of the School of Medical Sciences, National University of Córdoba, 1700 students attended classes in 1997. The subject was developed with two activities that worked the same contents from two different learning conceptions: (I) a workshop activity, non-obligatory and constructive, and (II) a theoretical-practical activity, obligatory and traditional. Two voluntary and anonymous interviews were conducted about the valuation the students gave to these two activities and its basis, one in the middle and the other at the end of the course. 90.55% classified the traditional activity as positive and 9.45% as negative. Regarding the workshop activity, 60.5% classified it as positive and 39.5% as negative. Developing the same content with two activities made it possible to analyze the impact that the two different methodologies produced on the students. The students' response to the traditional activity was more favorable than to the workshop activity (p < 0.001). The difference in the students' acceptance of the two options reveals the major difficulty an alternative methodological strategy faces in the current educational model.
Rapid development of xylanase assay conditions using Taguchi methodology.
Prasad Uday, Uma Shankar; Bandyopadhyay, Tarun Kanti; Bhunia, Biswanath
2016-11-01
The present investigation is mainly concerned with the rapid development of extracellular xylanase assay conditions by using Taguchi methodology. The extracellular xylanase was produced from Aspergillus niger (KP874102.1), a new strain isolated from a soil sample of the Baramura forest, Tripura West, India. Four physical parameters, namely temperature, pH, buffer concentration, and incubation time, were considered as key factors for xylanase activity and were optimized using Taguchi robust design methodology for enhanced xylanase activity. The main effects, interaction effects, and optimal levels of the process factors were determined using the signal-to-noise (S/N) ratio. The Taguchi method recommends the use of the S/N ratio to measure quality characteristics. Based on analysis of the S/N ratio, optimal levels of the process factors were determined. Analysis of variance (ANOVA) was performed to evaluate statistically significant process factors. ANOVA results showed that temperature contributed the maximum impact (62.58%) on xylanase activity, followed by pH (22.69%), buffer concentration (9.55%), and incubation time (5.16%). Predicted results showed that enhanced xylanase activity (81.47%) can be achieved with pH 2, temperature 50°C, buffer concentration 50 mM, and incubation time 10 min.
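For readers unfamiliar with the Taguchi quantities mentioned above, the following Python sketch computes a larger-is-better S/N ratio and ANOVA-style percent contributions on made-up numbers; none of the values are taken from the study.

```python
import numpy as np

# Hypothetical replicate activity measurements for three runs of an
# orthogonal-array experiment (illustrative values only).
runs = {
    "run1": [72.1, 70.8, 71.5],
    "run2": [80.3, 79.7, 81.0],
    "run3": [65.2, 66.1, 64.8],
}

def sn_larger_is_better(y):
    """Taguchi larger-is-better signal-to-noise ratio, in dB."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

for run, values in runs.items():
    print(run, round(sn_larger_is_better(values), 2))

# Percent contribution of each factor: SS_factor / SS_total * 100
# (sums of squares below are invented for illustration).
sum_of_squares = {"temperature": 120.4, "pH": 43.7, "buffer": 18.4, "time": 9.9}
total_ss = sum(sum_of_squares.values())
for factor, ss in sum_of_squares.items():
    print(f"{factor}: {100.0 * ss / total_ss:.1f}% contribution")
```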
A Software Engineering Environment for the Navy.
1982-03-31
Excerpt (table of contents): The Software Engineering Process; Part II: Description of a Software Engineering Environment; 1. Data Base; …Methodology to Tool; Flow of Management: Activity to Methodology to Tool; Pipelining for Activity-Specific Tools; …testing techniques; Methodologies and Tools: Correctness Analysis…
Ergonomic assessment methodologies in manual handling of loads--opportunities in organizations.
Pires, Claudia
2012-01-01
The present study was developed based on the analysis of workplaces in the engineering industry, particularly in automotive companies. The main objectives of the study were to evaluate workplace activities involving manual handling, using the assessment methodologies NIOSH Ergonomic Equation [1] and Manual Material Handling [2], present in ISO 11228 [3-4], and to consider the possibility of developing musculoskeletal injuries associated with these activities, an issue of great concern in all industrial sectors. The suitability of each method to the task concerned was also shown. The study was conducted in three steps. The first step was to collect images and information about the target tasks. In the second step we proceeded to the analysis, determining the method to use and evaluating the activities. Finally, we reviewed the results obtained and acted on them accordingly. The study identified situations requiring urgent action, according to the methodologies used, and solutions were developed in order to solve the problems identified, eliminating and/or minimizing situations that constrain and harm employees.
2014-01-01
Background The ability to walk independently is a primary goal for rehabilitation after stroke. Gait analysis provides a great amount of valuable information, while functional magnetic resonance imaging (fMRI) offers a powerful approach to define networks involved in motor control. The present study reports a new methodology based on both fMRI and gait analysis outcomes in order to investigate the ability of fMRI to reflect the phases of motor learning before/after electromyographic biofeedback treatment: the preliminary fMRI results of a post-stroke subject's brain activation, during passive and active ankle dorsal/plantarflexion, before and after biofeedback (BFB) rehabilitation are reported and their correlation with gait analysis data investigated. Methods A control subject and a post-stroke patient with chronic hemiparesis were studied. Functional magnetic resonance images were acquired during a block-design protocol on both subjects while performing passive and active ankle dorsal/plantarflexion. fMRI and gait analysis were performed on the patient before and after electromyographic biofeedback rehabilitation treatment during gait activities. Lower limb three-dimensional kinematics, kinetics and surface electromyography were evaluated. Correlation between fMRI and gait analysis categorical variables was assessed: agreement/disagreement was assigned to each variable if the value was inside/outside the normative range (gait analysis), or for presence of normal/diffuse/no activation of motor areas (fMRI). Results Altered fMRI activity was found in the post-stroke patient before biofeedback rehabilitation with respect to the control subject. Meanwhile, the patient showed a diffuse but more limited brain activation after treatment (fewer voxels). The post-stroke gait data showed a trend towards the normal range: speed, stride length, ankle power, and ankle positive work increased. Preliminary correlation analysis revealed that consistent changes were observed both for the fMRI data and the gait analysis data after treatment (R > 0.89): this could be related to the possible effects BFB might have on the central as well as on the peripheral nervous system. Conclusions Our findings showed that this methodology allows evaluation of the relationship between alterations in gait and brain activation of a post-stroke patient. Such a methodology, if applied to a larger sample of subjects, could provide information about the specific motor areas involved in a rehabilitation treatment. PMID:24716475
Towards sustainable mobile systems configurations: Application to a tuna purse seiner.
García Rellán, A; Vázquez Brea, C; Bello Bugallo, P M
2018-08-01
Fishing is one of the most important marine activities. It contributes to both overfishing and marine pollution, the two main threats to the ocean environment. In this context, the aim of this work is to investigate and validate methodologies for the identification of more sustainable operating configurations for a tuna purse seiner. The proposed methodology is based on a previous one applied to secondary industrial systems, taking into account the Integrated Pollution Prevention and Control focus, developed for the most potentially polluting industrial sources. The idea is to apply the same type of methodologies and concepts used for secondary industrial punctual sources to a primary industrial mobile activity. This methodology combines two tools: "Material and Energy Flow Analysis" (a tool from industrial metabolism) and "Best Available Techniques Analysis". The first provides a way to detect "Improvable Flows" in the system, and the second provides a way to define sustainable options to improve them. Five main Improvable Flows have been identified in the selected case study, the activity of a purse seiner, most of them related to energy consumption and air emissions in different stages of the fishing activity. Thirty-one Best Available Techniques candidates for the system have been inventoried that could potentially improve the sustainability of the activity. Seven of them are not yet implemented in the case study. The potential improvements of the system proposed by this work are related to energy efficiency, waste management, and the prevention and control of air emissions. This methodology proves to be a good tool not only for sustainable punctual systems but also for sustainable mobile systems such as the fishing activity in the oceans, as validated here for the tuna purse seiner. The practical application of the identified technologies to fishing systems will contribute to preventing and reducing marine pollution, one of the greatest threats to today's oceans. Copyright © 2017 Elsevier B.V. All rights reserved.
Popes in the Pizza: Analyzing Activity Reports to Create and Sustain a Strategic Plan
ERIC Educational Resources Information Center
Sweet, Charlie; Blythe, Hal; Keeley, E. J.; Forsyth, Ben
2008-01-01
This article presents a practical methodology for creating and sustaining strategic planning, the task analysis. Utilizing our Teaching & Learning Center Strategic Plan as a model, we demonstrate how working with a weekly status report provides a comprehensive listing of detail necessary to analyze and revise the plan. The new methodology is…
Development of Management Methodology for Engineering Production Quality
NASA Astrophysics Data System (ADS)
Gorlenko, O.; Miroshnikov, V.; Borbatc, N.
2016-04-01
The authors of the paper propose four directions for developing a quality management methodology for engineering products that implements the requirements of the new international standard ISO 9001:2015: analysis of the organisational context taking stakeholders into account, the use of risk management, management of in-house knowledge, and assessment of the enterprise's activity according to effectiveness criteria
DOT National Transportation Integrated Search
1995-08-28
The I-44/I-55 Major Transportation Investment Analysis (MTIA) has advanced to the analysis stage, having completed the first round of scoping meetings with Advisory Committees and the public. This report summarizes the results of scoping activities to...
NASA Astrophysics Data System (ADS)
Scaini, C.; Felpeto, A.; Martí, J.; Carniel, R.
2014-05-01
This paper presents a GIS-based methodology to estimate damages produced by volcanic eruptions. The methodology consists of four parts: definition and simulation of eruptive scenarios, exposure analysis, vulnerability assessment, and estimation of expected damages. Multi-hazard eruptive scenarios are defined for the Teide-Pico Viejo active volcanic complex and simulated through the VORIS tool. The exposure analysis identifies the elements exposed to the hazard at stake and focuses on the relevant assets for the study area. The vulnerability analysis is based on previous studies on the built environment and complemented with the analysis of transportation and urban infrastructures. Damage assessment is performed by associating a qualitative damage rating to each combination of hazard and vulnerability. This operation consists of a GIS-based overlay, performed for each hazardous phenomenon considered and for each element. The methodology is then automated into a GIS-based tool using an ArcGIS® program. Given the eruptive scenarios and the characteristics of the exposed elements, the tool produces expected damage maps. The tool is applied to the Icod Valley (North of Tenerife Island), which is likely to be affected by volcanic phenomena in case of eruption from both the Teide-Pico Viejo volcanic complex and the North-West basaltic rift. Results are thematic maps of vulnerability and damage that can be displayed at different levels of detail, depending on user preferences. The aim of the tool is to facilitate territorial planning and risk management in active volcanic areas.
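A minimal sketch of the qualitative hazard-vulnerability overlay step in Python; the class labels and the rating matrix are hypothetical stand-ins for the categories implemented in the GIS tool.

```python
# Each exposed element receives a damage rating from the combination of its
# hazard intensity class and its vulnerability class (illustrative matrix).
DAMAGE_MATRIX = {
    ("low", "low"): "slight",
    ("low", "high"): "moderate",
    ("moderate", "low"): "moderate",
    ("moderate", "high"): "heavy",
    ("high", "low"): "heavy",
    ("high", "high"): "destroyed",
}

def damage_rating(hazard_class: str, vulnerability_class: str) -> str:
    return DAMAGE_MATRIX[(hazard_class, vulnerability_class)]

# Hypothetical exposed elements with per-phenomenon hazard classes.
exposed_elements = [
    {"id": "road_TF-5", "hazard": "moderate", "vulnerability": "high"},
    {"id": "building_021", "hazard": "high", "vulnerability": "low"},
]
for element in exposed_elements:
    element["damage"] = damage_rating(element["hazard"], element["vulnerability"])
    print(element["id"], element["damage"])
```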
A Unit on "Fahrenheit 451" That Uses Cooperative Learning (Resources and Reviews).
ERIC Educational Resources Information Center
Ebbers, Frances A.
1991-01-01
Provides a curriculum unit using the novel "Fahrenheit 451" to provide student-centered activities based on solid pedagogical methodology. Emphasizes value-centered analysis of the novel, comparison of alternative arguments, and integration of cooperative learning activities. (PRA)
Applications of Quantum Cascade Laser Spectroscopy in the Analysis of Pharmaceutical Formulations.
Galán-Freyle, Nataly J; Pacheco-Londoño, Leonardo C; Román-Ospino, Andrés D; Hernandez-Rivera, Samuel P
2016-09-01
Quantum cascade laser spectroscopy was used to quantify active pharmaceutical ingredient content in a model formulation. The analyses were conducted in non-contact mode by mid-infrared diffuse reflectance. Measurements were carried out at a distance of 15 cm, covering the spectral range 1000-1600 cm(-1). Calibrations were generated by applying multivariate analysis using partial least squares models. Among the figures of merit of the proposed methodology are the high analytical sensitivity equivalent to 0.05% active pharmaceutical ingredient in the formulation, high repeatability (2.7%), high reproducibility (5.4%), and low limit of detection (1%). The relatively high power of the quantum-cascade-laser-based spectroscopic system resulted in the design of detection and quantification methodologies for pharmaceutical applications with high accuracy and precision that are comparable to those of methodologies based on near-infrared spectroscopy, attenuated total reflection mid-infrared Fourier transform infrared spectroscopy, and Raman spectroscopy. © The Author(s) 2016.
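A hedged sketch of a partial least squares calibration of the kind described, using scikit-learn on synthetic spectra; the data, number of latent variables, and cross-validation setup are assumptions, not the published method's settings.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Synthetic stand-in data: rows are diffuse-reflectance spectra over the
# 1000-1600 cm-1 range, y is the API concentration (%).
rng = np.random.default_rng(0)
n_samples, n_wavenumbers = 40, 300
y = rng.uniform(0.5, 15.0, n_samples)                  # API content, %
pure_component = np.sin(np.linspace(0, 6, n_wavenumbers))
X = np.outer(y, pure_component) + rng.normal(0, 0.05, (n_samples, n_wavenumbers))

pls = PLSRegression(n_components=4)                    # number of latent variables assumed
y_cv = cross_val_predict(pls, X, y, cv=5).ravel()      # cross-validated predictions
rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
print(f"RMSECV = {rmsecv:.3f} % API")
```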
ERIC Educational Resources Information Center
Park, Mira; Park, Do-Yong; Lee, Robert E.
2009-01-01
The purpose of this study is to investigate in what ways the inquiry task of teaching and learning in earth science textbooks reflect the unique characteristics of earth science inquiry methodology, and how it provides students with opportunities to develop their scientific reasoning skills. This study analyzes a number of inquiry activities in…
Analytical Chemistry Division. Annual progress report for period ending December 31, 1980
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lyon, W.S.
1981-05-01
This report is divided into: analytical methodology; mass and emission spectrometry; technical support; bio/organic analysis; nuclear and radiochemical analysis; quality assurance, safety, and tabulation of analyses; supplementary activities; and presentation of research results. Separate abstracts were prepared for the technical support, bio/organic analysis, and nuclear and radiochemical analysis. (DLC)
Rosenbaum, Simon; Tiedemann, Anne; Sherrington, Catherine; Curtis, Jackie; Ward, Philip B
2014-09-01
To determine effects of physical activity on depressive symptoms (primary objective), symptoms of schizophrenia, anthropometric measures, aerobic capacity, and quality of life (secondary objectives) in people with mental illness and explore between-study heterogeneity. MEDLINE, Cochrane Controlled Trials Register, PsycINFO, CINAHL, Embase, and the Physiotherapy Evidence Database (PEDro) were searched from earliest record to 2013. Randomized controlled trials of adults with a DSM-IV-TR, ICD-10, or clinician-confirmed diagnosis of a mental illness other than dysthymia or eating disorders were selected. Interventions included exercise programs, exercise counseling, lifestyle interventions, tai chi, or physical yoga. Study methodological quality and intervention compliance with American College of Sports Medicine (ACSM) guidelines were also assessed. Two investigators extracted data. Data were pooled using random-effects meta-analysis. Meta-regression was used to examine sources of between-study heterogeneity. Thirty-nine eligible trials were identified. The primary meta-analysis found a large effect of physical activity on depressive symptoms (n = 20; standardized mean difference (SMD) = 0.80). The effect size in trial interventions that met ACSM guidelines for aerobic exercise did not differ significantly from those that did not meet these guidelines. The effect for trials with higher methodological quality was smaller than that observed for trials with lower methodological quality (SMD = 0.39 vs 1.35); however, the difference was not statistically significant. A large effect was found for schizophrenia symptoms (SMD = 1.0), a small effect was found for anthropometry (SMD = 0.24), and moderate effects were found for aerobic capacity (SMD = 0.63) and quality of life (SMD = 0.64). Physical activity reduced depressive symptoms in people with mental illness. Larger effects were seen in studies of poorer methodological quality. Physical activity reduced symptoms of schizophrenia and improved anthropometric measures, aerobic capacity, and quality of life among people with mental illness. PROSPERO registration #CRD42012002012. © Copyright 2014 Physicians Postgraduate Press, Inc.
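As a reminder of how such pooled standardized mean differences are typically obtained, here is a DerSimonian-Laird random-effects sketch in Python on invented trial data; it is not the review's actual analysis or dataset.

```python
import numpy as np

# Per-trial effect sizes (SMDs) and their sampling variances (made-up values).
smd = np.array([0.9, 0.4, 1.2, 0.6])
var = np.array([0.10, 0.08, 0.20, 0.12])

w_fixed = 1.0 / var
mu_fixed = np.sum(w_fixed * smd) / np.sum(w_fixed)
Q = np.sum(w_fixed * (smd - mu_fixed) ** 2)            # Cochran's Q
df = len(smd) - 1
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (Q - df) / c)                          # between-study variance estimate

w_random = 1.0 / (var + tau2)                          # random-effects weights
pooled = np.sum(w_random * smd) / np.sum(w_random)
se = np.sqrt(1.0 / np.sum(w_random))
print(f"pooled SMD = {pooled:.2f} "
      f"(95% CI {pooled - 1.96*se:.2f} to {pooled + 1.96*se:.2f}), tau^2 = {tau2:.3f}")
```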
Interdisciplinary debate in the teaching-learning process on bioethics: academic health experiences.
Campos Daniel, Jéssica; Dias Reis Pessalacia, Juliana; Leite de Andrade, Ana Flávia
2016-06-01
The study aimed to understand health students' experiences of participating in interdisciplinary discussions on bioethics and to identify the contributions of the interdisciplinary methodological resource to the teaching-learning process during their degree programs. Descriptive study with a qualitative approach in a public higher education institution in Divinópolis, Minas Gerais, Brazil. Three categories of analysis were identified: "active methodologies in the training of a critical professional," "interdisciplinary debate as a facilitator of reflection on bioethics," and "feelings and attitudes caused by the interdisciplinary debate." Discussion. There was a lack of approach to bioethical content in the health curriculum, and the adoption of active methodologies provides better reflection on bioethics, but this requires changing the paradigms of teachers and educational institutions.
Infrared Algorithm Development for Ocean Observations with EOS/MODIS
NASA Technical Reports Server (NTRS)
Brown, Otis B.
1997-01-01
Efforts continue under this contract to develop algorithms for the computation of sea surface temperature (SST) from MODIS infrared measurements. This effort includes radiative transfer modeling, comparison of in situ and satellite observations, development and evaluation of processing and networking methodologies for algorithm computation and data accession, evaluation of surface validation approaches for IR radiances, development of experimental instrumentation, and participation in MODIS (project) related activities. Activities in this contract period have focused on radiative transfer modeling, evaluation of atmospheric correction methodologies, field campaigns, analysis of field data, and participation in MODIS meetings.
Development of a Methodology for Assessing Aircrew Workloads.
1981-11-01
Excerpt (contents and index terms): Workload Feasibility Study; Subjects; Equipment; Data Analysis. Index terms: analysis; simulation; standard time systems; switching; synthetic time systems; task activities; task interference; time study; tracking; workload; work sampling… …standard data systems, information content analysis, work sampling and job evaluation. Conventional methods were found to be deficient in accounting…
A methodology for creating greenways through multidisciplinary sustainable landscape planning.
Pena, Selma Beatriz; Abreu, Maria Manuela; Teles, Rui; Espírito-Santo, Maria Dalila
2010-01-01
This research proposes a methodology for defining greenways via sustainable planning. This approach includes the analysis and discussion of culture and natural processes that occur in the landscape. The proposed methodology is structured in three phases: eco-cultural analysis; synthesis and diagnosis; and proposal. An interdisciplinary approach provides an assessment of the relationships between landscape structure and landscape dynamics, which are essential to any landscape management or land use. The landscape eco-cultural analysis provides a biophysical, dynamic (geomorphologic rate), vegetation (habitats from directive 92/43/EEC) and cultural characterisation. The knowledge obtained by this analysis then supports the definition of priority actions to stabilise the landscape and the management measures for the habitats. After the analysis and diagnosis phases, a proposal for the development of sustainable greenways can be achieved. This methodology was applied to a study area of the Azambuja Municipality in the Lisbon Metropolitan Area (Portugal). The application of the proposed methodology to the study area shows that landscape stability is crucial for greenway users in order to appreciate the landscape and its natural and cultural elements in a sustainable and healthy way, both by cycling or by foot. A balanced landscape will increase the value of greenways and in return, they can develop socio-economic activities with benefits for rural communities. Copyright 2009 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Erduran, Sibel; Simon, Shirley; Osborne, Jonathan
2004-11-01
This paper reports some methodological approaches to the analysis of argumentation discourse developed as part of the two-and-a-half-year project titled "Enhancing the Quality of Argument in School Science", supported by the Economic and Social Research Council in the United Kingdom. In this project researchers collaborated with middle-school science teachers to develop models of instructional activities in an effort to make argumentation a component of instruction. We begin the paper with a brief theoretical justification for why we consider argumentation to be of significance to science education. We then contextualize the use of Toulmin's Argument Pattern in the study of argumentation discourse and provide a justification for the methodological outcomes our approach generates. We illustrate how our work refines and develops research methodologies in argumentation analysis. In particular, we present two methodological approaches to the analysis of argumentation in whole-class as well as small-group student discussions. For each approach, we illustrate our coding scheme and some results, as well as how our methodological approach has enabled our inquiry into the quality of argumentation in the classroom. We conclude with some implications for future research on argumentation in science education.
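A small data-structure sketch in Python showing how utterances might be coded with Toulmin's Argument Pattern components; the quality-level rule and the example are illustrative assumptions, not the project's coding scheme.

```python
from dataclasses import dataclass, field
from typing import List

# Toulmin's Argument Pattern components: claim, data, warrant, backing,
# qualifier, rebuttal.
@dataclass
class TAPArgument:
    claim: str
    data: List[str] = field(default_factory=list)
    warrants: List[str] = field(default_factory=list)
    backings: List[str] = field(default_factory=list)
    qualifiers: List[str] = field(default_factory=list)
    rebuttals: List[str] = field(default_factory=list)

    def quality_level(self) -> int:
        """Crude quality indicator: arguments containing rebuttals rank highest."""
        if self.rebuttals:
            return 4
        if self.backings or self.qualifiers:
            return 3
        if self.warrants:
            return 2
        return 1

arg = TAPArgument(
    claim="The bulb will not light",
    data=["the circuit is open"],
    warrants=["current only flows in a closed loop"],
)
print(arg.quality_level())
```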
Nguyen, Hien D; Ullmann, Jeremy F P; McLachlan, Geoffrey J; Voleti, Venkatakaushik; Li, Wenze; Hillman, Elizabeth M C; Reutens, David C; Janke, Andrew L
2018-02-01
Calcium is a ubiquitous messenger in neural signaling events. An increasing number of techniques are enabling visualization of neurological activity in animal models via luminescent proteins that bind to calcium ions. These techniques generate large volumes of spatially correlated time series. A model-based functional data analysis methodology via Gaussian mixtures is proposed for the clustering of data from such visualizations. The methodology is theoretically justified and a computationally efficient approach to estimation is suggested. An example analysis of a zebrafish imaging experiment is presented.
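A minimal Python sketch of the general idea, clustering simulated time series via low-dimensional functional features and a Gaussian mixture; the feature construction (plain PCA), component count, and data are assumptions, not the authors' model.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

# Simulate two families of calcium-like time series (oscillatory vs decaying).
rng = np.random.default_rng(1)
n_timepoints = 200
t = np.linspace(0, 1, n_timepoints)
signals = np.vstack([
    np.sin(2 * np.pi * 3 * t) + 0.3 * rng.standard_normal((250, n_timepoints)),
    np.exp(-5 * t) + 0.3 * rng.standard_normal((250, n_timepoints)),
])

# Reduce each series to a few basis coefficients, then fit a Gaussian mixture.
coeffs = PCA(n_components=5).fit_transform(signals)
gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(coeffs)
labels = gmm.predict(coeffs)
print(np.bincount(labels))   # cluster sizes
```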
Video analysis for insight and coding: Examples from tutorials in introductory physics
NASA Astrophysics Data System (ADS)
Scherr, Rachel E.
2009-12-01
The increasing ease of video recording offers new opportunities to create richly detailed records of classroom activities. These recordings, in turn, call for research methodologies that balance generalizability with interpretive validity. This paper shares methodology for two practices of video analysis: (1) gaining insight into specific brief classroom episodes and (2) developing and applying a systematic observational protocol for a relatively large corpus of video data. These two aspects of analytic practice are illustrated in the context of a particular research interest but are intended to serve as general suggestions.
Nonlinear analysis of EEG in major depression with fractal dimensions.
Akar, Saime A; Kara, Sadik; Agambayev, Sumeyra; Bilgic, Vedat
2015-01-01
Major depressive disorder (MDD) is a psychiatric mood disorder characterized by cognitive and functional impairments in attention, concentration, learning and memory. In order to investigate and understand its underlying neural activities and pathophysiology, EEG methodologies can be used. In this study, we estimated the nonlinearity features of EEG in MDD patients to assess the dynamical properties underlying frontal and parietal brain activity. EEG data were obtained from 16 patients and 15 matched healthy controls. A wavelet-chaos methodology was used for data analysis. First, the EEGs of subjects were decomposed into 5 EEG sub-bands by discrete wavelet transform. Then, both Katz's and Higuchi's fractal dimensions (KFD and HFD) were calculated as complexity measures for full-band and sub-band EEGs. Last, two-way analyses of variance were used to test EEG complexity differences for each fractality measure. As a result, significantly increased complexity was found in both the parietal and frontal regions of MDD patients. This significantly increased complexity was observed not only in full-band activity but also in the beta and gamma sub-bands of the EEG. The findings of the present study indicate the possibility of using the wavelet-chaos methodology to discriminate the EEGs of MDD patients from healthy controls.
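For concreteness, the sketch below implements the two fractal dimensions named above (Katz and Higuchi) for a 1-D signal in Python; the synthetic test signal and the k_max choice are assumptions, and the functions follow standard textbook formulations rather than the authors' exact code.

```python
import numpy as np

def katz_fd(x):
    """Katz fractal dimension of a 1-D signal (unit sample spacing)."""
    x = np.asarray(x, dtype=float)
    dists = np.sqrt(1.0 + np.diff(x) ** 2)            # successive point distances
    L = dists.sum()                                    # total curve length
    d = np.sqrt(np.arange(1, len(x)) ** 2 + (x[1:] - x[0]) ** 2).max()  # planar extent
    n = len(x) - 1
    return np.log10(n) / (np.log10(n) + np.log10(d / L))

def higuchi_fd(x, k_max=10):
    """Higuchi fractal dimension of a 1-D signal."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    lengths = []
    for k in range(1, k_max + 1):
        Lk = []
        for m in range(k):
            idx = np.arange(m, N, k)                   # subsampled series with offset m
            if len(idx) < 2:
                continue
            diff = np.abs(np.diff(x[idx])).sum()
            norm = (N - 1) / ((len(idx) - 1) * k)      # length normalization factor
            Lk.append(diff * norm / k)
        lengths.append(np.mean(Lk))
    # FD is the slope of log(L(k)) against log(1/k).
    slope, _ = np.polyfit(np.log(1.0 / np.arange(1, k_max + 1)), np.log(lengths), 1)
    return slope

# Example on a synthetic "EEG-like" random-walk signal (illustrative only).
rng = np.random.default_rng(2)
eeg = np.cumsum(rng.standard_normal(1024))
print(katz_fd(eeg), higuchi_fd(eeg))
```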
Pinheiro, Rubiane C; Soares, Cleide M F; de Castro, Heizir F; Moraes, Flavio F; Zanin, Gisella M
2008-03-01
The conditions for maximization of the enzymatic activity of lipase entrapped in a sol-gel matrix were determined for different vegetable oils using an experimental design. The effects of pH, temperature, and biocatalyst loading on lipase activity were verified using a central composite experimental design, leading to a set of 13 assays, and response surface analysis. For canola oil and entrapped lipase, statistical analyses showed significant effects for pH and temperature, and also for the interactions between pH and temperature and between temperature and biocatalyst loading. For olive oil and entrapped lipase, it was verified that pH was the only statistically significant variable. This study demonstrated that response surface analysis is an appropriate methodology for the maximization of the percentage of hydrolysis as a function of pH, temperature, and lipase loading.
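A hedged Python sketch of fitting a second-order response surface to coded factors, the kind of model a central composite design supports; the design points and responses are placeholders, not the 13 assays from the study.

```python
import numpy as np

# Second-order model in three coded factors (pH, temperature, loading):
#   y = b0 + b1*x1 + b2*x2 + b3*x3 + b12*x1*x2 + b13*x1*x3 + b23*x2*x3
#       + b11*x1^2 + b22*x2^2 + b33*x3^2
rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(13, 3))                  # placeholder coded factor levels
y = 60 + 8*X[:, 0] - 5*X[:, 1]**2 + 3*X[:, 0]*X[:, 2] + rng.normal(0, 1, 13)

def design_matrix(X):
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1*x2, x1*x3, x2*x3,
                            x1**2, x2**2, x3**2])

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
print(np.round(beta, 2))                               # fitted coefficients

# Predicted hydrolysis at a candidate optimum (coded units).
x_opt = np.array([[1.0, 0.0, 0.5]])
print(design_matrix(x_opt) @ beta)
```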
CURRENT TECHNICAL PROBLEMS IN EMERGY ANALYSIS
Emergy Analysis has been a rapidly evolving assessment methodology for the past 30 years. This process of development was primarily driven by the inquiring mind and ceaseless activity of its founder, H.T. Odum, his students, and colleagues. Historically, as new kinds of proble...
Diagnostic Application of Absolute Neutron Activation Analysis in Hematology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zamboni, C.B.; Oliveira, L.C.; Dalaqua, L. Jr.
2004-10-03
The Absolute Neutron Activation Analysis (ANAA) technique was used to determine the element concentrations of Cl and Na in the blood of a healthy group (male and female blood donors), selected from blood banks in the city of Sao Paulo, to provide information that can help in the diagnosis of patients. This study permitted a discussion of the advantages and limitations of using this nuclear methodology in hematological examinations.
Gupta, Renu; Sharma, Sangeeta; Saxena, Sonal
2018-01-01
Healthcare-associated infections (HAI) are preventable in up to 30% of patients with evidence-based infection prevention and control (IPC) activities. IPC activities require effective surveillance to generate data on HAI rates, define priority areas, identify processes amenable to improvement, and institute interventions to improve patient safety. However, only a uniform, accurate, and standardised surveillance methodology using objective definitions can generate meaningful data for the effective execution of IPC activities. The highly exhaustive, complex, and ever-evolving infection surveillance methodology poses a challenge for effective data capture, analysis, and interpretation by ground-level personnel. The present review addresses the gaps in knowledge and the day-to-day challenges in surveillance faced by infection control teams in the effective implementation of IPC activities.
Recent activities within the Aeroservoelasticity Branch at the NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Noll, Thomas E.; Perry, Boyd, III; Gilbert, Michael G.
1989-01-01
The objective of research in aeroservoelasticity at the NASA Langley Research Center is to enhance the modeling, analysis, and multidisciplinary design methodologies for obtaining multifunction digital control systems for application to flexible flight vehicles. Recent accomplishments are discussed, and a status report on current activities within the Aeroservoelasticity Branch is presented. In the area of modeling, improvements to the Minimum-State Method of approximating unsteady aerodynamics are shown to provide precise, low-order aeroservoelastic models for design and simulation activities. Analytical methods based on Matched Filter Theory and Random Process Theory to provide efficient and direct predictions of the critical gust profile and the time-correlated gust loads for linear structural design considerations are also discussed. Two research projects leading towards improved design methodology are summarized. The first program is developing an integrated structure/control design capability based on hierarchical problem decomposition, multilevel optimization and analytical sensitivities. The second program provides procedures for obtaining low-order, robust digital control laws for aeroelastic applications. In terms of methodology validation and application the current activities associated with the Active Flexible Wing project are reviewed.
Analysis of a Teacher's Pedagogical Arguments Using Toulmin's Model and Argumentation Schemes
ERIC Educational Resources Information Center
Metaxas, N.; Potari, D.; Zachariades, T.
2016-01-01
In this article, we elaborate methodologies to study the argumentation speech of a teacher involved in argumentative activities. The standard tool of analysis of teachers' argumentation concerning pedagogical matters is Toulmin's model. The theory of argumentation schemes offers an alternative perspective on the analysis of arguments. We propose…
Plancoulaine, Benoit; Laurinaviciene, Aida; Herlin, Paulette; Besusparis, Justinas; Meskauskas, Raimundas; Baltrusaityte, Indra; Iqbal, Yasir; Laurinavicius, Arvydas
2015-10-19
Digital image analysis (DIA) enables higher accuracy, reproducibility, and capacity to enumerate cell populations by immunohistochemistry; however, the most unique benefits may be obtained by evaluating the spatial distribution and intra-tissue variance of markers. The proliferative activity of breast cancer tissue, estimated by the Ki67 labeling index (Ki67 LI), is a prognostic and predictive biomarker requiring robust measurement methodologies. We performed DIA on whole-slide images (WSI) of 302 surgically removed Ki67-stained breast cancer specimens; the tumour classifier algorithm was used to automatically detect tumour tissue but was not trained to distinguish between invasive and non-invasive carcinoma cells. The WSI DIA-generated data were subsampled by hexagonal tiling (HexT). Distribution and texture parameters were compared to conventional WSI DIA and pathology report data. Factor analysis of the data set, including total numbers of tumor cells, the Ki67 LI and Ki67 distribution, and texture indicators, extracted 4 factors, identified as entropy, proliferation, bimodality, and cellularity. The factor scores were further utilized in cluster analysis, outlining subcategories of heterogeneous tumors with predominant entropy, bimodality, or both at different levels of proliferative activity. The methodology also allowed the visualization of Ki67 LI heterogeneity in tumors and the automated detection and quantitative evaluation of Ki67 hotspots, based on the upper quintile of the HexT data, conceptualized as the "Pareto hotspot". We conclude that systematic subsampling of DIA-generated data into HexT enables comprehensive Ki67 LI analysis that reflects aspects of intra-tumor heterogeneity and may serve as a methodology to improve digital immunohistochemistry in general.
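The hexagonal-tile subsampling and upper-quintile ("Pareto") hotspot rule can be sketched as follows; the cell coordinates, tile size and the minimum cell count per tile are illustrative assumptions rather than the study's settings.

```python
import numpy as np
from collections import defaultdict

def hex_index(x, y, size):
    """Axial (q, r) index of the pointy-top hexagon of given size containing (x, y)."""
    q = (np.sqrt(3) / 3 * x - y / 3) / size
    r = (2 / 3 * y) / size
    cx, cz = float(q), float(r)          # cube rounding to the nearest hexagon
    cy = -cx - cz
    rx, ry, rz = round(cx), round(cy), round(cz)
    dx, dy, dz = abs(rx - cx), abs(ry - cy), abs(rz - cz)
    if dx > dy and dx > dz:
        rx = -ry - rz
    elif dy > dz:
        ry = -rx - rz
    else:
        rz = -rx - ry
    return int(rx), int(rz)

def hex_ki67(cells, size=500.0, min_cells=50):
    """Per-tile Ki67 LI (%) from (x, y, is_positive) cell records."""
    pos, tot = defaultdict(int), defaultdict(int)
    for x, y, positive in cells:
        tile = hex_index(x, y, size)
        tot[tile] += 1
        pos[tile] += int(positive)
    return {t: 100.0 * pos[t] / tot[t] for t in tot if tot[t] >= min_cells}

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # synthetic cell centroids (microns) with a spatially varying positivity rate
    xy = rng.uniform(0, 10000, size=(20000, 2))
    p = 0.1 + 0.3 * (xy[:, 0] / 10000)            # LI increases left to right
    cells = [(x, y, rng.random() < pi) for (x, y), pi in zip(xy, p)]
    li = hex_ki67(cells)
    values = np.array(list(li.values()))
    hotspot_cut = np.percentile(values, 80)        # upper quintile = "Pareto hotspot"
    print(f"tiles: {len(values)}, median LI: {np.median(values):.1f}%, "
          f"hotspot threshold: {hotspot_cut:.1f}%")
```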
Wada, Yoshinao; Dell, Anne; Haslam, Stuart M; Tissot, Bérangère; Canis, Kévin; Azadi, Parastoo; Bäckström, Malin; Costello, Catherine E; Hansson, Gunnar C; Hiki, Yoshiyuki; Ishihara, Mayumi; Ito, Hiromi; Kakehi, Kazuaki; Karlsson, Niclas; Hayes, Catherine E; Kato, Koichi; Kawasaki, Nana; Khoo, Kay-Hooi; Kobayashi, Kunihiko; Kolarich, Daniel; Kondo, Akihiro; Lebrilla, Carlito; Nakano, Miyako; Narimatsu, Hisashi; Novak, Jan; Novotny, Milos V; Ohno, Erina; Packer, Nicolle H; Palaima, Elizabeth; Renfrow, Matthew B; Tajiri, Michiko; Thomsson, Kristina A; Yagi, Hirokazu; Yu, Shin-Yi; Taniguchi, Naoyuki
2010-04-01
The Human Proteome Organisation Human Disease Glycomics/Proteome Initiative recently coordinated a multi-institutional study that evaluated methodologies that are widely used for defining the N-glycan content in glycoproteins. The study convincingly endorsed mass spectrometry as the technique of choice for glycomic profiling in the discovery phase of diagnostic research. The present study reports the extension of the Human Disease Glycomics/Proteome Initiative's activities to an assessment of the methodologies currently used for O-glycan analysis. Three samples of IgA1 isolated from the serum of patients with multiple myeloma were distributed to 15 laboratories worldwide for O-glycomics analysis. A variety of mass spectrometric and chromatographic procedures representative of current methodologies were used. Similar to the previous N-glycan study, the results convincingly confirmed the pre-eminent performance of MS for O-glycan profiling. Two general strategies were found to give the most reliable data, namely direct MS analysis of mixtures of permethylated reduced glycans in the positive ion mode and analysis of native reduced glycans in the negative ion mode using LC-MS approaches. In addition, mass spectrometric methodologies to analyze O-glycopeptides were also successful.
Analytical Chemistry Division annual progress report for period ending November 30, 1977
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lyon, W.S.
1978-03-01
Activities for the year are summarized in sections on analytical methodology, mass and mass emission spectrometry, analytical services, bio-organic analysis, nuclear and radiochemical analysis, and quality assurance and safety. Presentations of research results in publications and reports are tabulated. (JRD)
AAC Best Practice Using Automated Language Activity Monitoring.
ERIC Educational Resources Information Center
Hill, Katya; Romich, Barry
This brief paper describes automated language activity monitoring (LAM), an augmentative and alternative communication (AAC) methodology for the collection, editing, and analysis of language data in structured or natural situations with people who have severe communication disorders. The LAM function records each language event (letters, words,…
NASA Technical Reports Server (NTRS)
Noll, Thomas E.; Perry, Boyd, III; Tiffany, Sherwood H.; Cole, Stanley R.; Buttrill, Carey S.; Adams, William M., Jr.; Houck, Jacob A.; Srinathkumar, S.; Mukhopadhyay, Vivek; Pototzky, Anthony S.
1989-01-01
The status of the joint NASA/Rockwell Active Flexible Wing Wind-Tunnel Test Program is described. The objectives are to develop and validate the analysis, design, and test methodologies required to apply multifunction active control technology for improving aircraft performance and stability. Major tasks include designing digital multi-input/multi-output flutter-suppression and rolling-maneuver-load alleviation concepts for a flexible full-span wind-tunnel model, obtaining an experimental data base for the basic model and each control concept and providing comparisons between experimental and analytical results to validate the methodologies. The opportunity is provided to improve real-time simulation techniques and to gain practical experience with digital control law implementation procedures.
Application of ion chromatography in pharmaceutical and drug analysis.
Jenke, Dennis
2011-08-01
Ion chromatography (IC) has developed and matured into an important analytical methodology in a number of diverse applications and industries, including pharmaceuticals. This manuscript provides a review of IC applications for the determinations of active and inactive ingredients, excipients, degradation products, and impurities relevant to pharmaceutical analyses and thus serves as a resource for investigators looking for insights into the use of the IC methodology in this field of application.
Moreno-Opo, Rubén; Fernández-Olalla, Mariana; Margalida, Antoni; Arredondo, Ángel; Guil, Francisco
2012-01-01
The application of scientific-based conservation measures requires that sampling methodologies in studies modelling similar ecological aspects produce comparable results, making their interpretation easier. We aimed to show how the choice of different methodological and ecological approaches can affect conclusions in nest-site selection studies along different Palearctic meta-populations of an indicator species. First, a multivariate analysis of the variables affecting nest-site selection in a breeding colony of cinereous vulture (Aegypius monachus) in central Spain was performed. Then, a meta-analysis was applied to establish how methodological and habitat-type factors determine differences and similarities in the results obtained by previous studies that have modelled the forest breeding habitat of the species. Our results revealed patterns in nesting-habitat modelling by the cinereous vulture throughout its whole range: steep and south-facing slopes, great cover of large trees and distance to human activities were generally selected. The ratio and situation of the studied plots (nests/random), the use of plots vs. polygons as sampling units and the number of years of data set determined the variability explained by the model. Moreover, a greater size of the breeding colony implied that ecological and geomorphological variables at the landscape level were more influential. Additionally, human activities affected colonies situated in Mediterranean forests to a greater extent. For the first time, a meta-analysis regarding the factors determining nest-site selection heterogeneity for a single species at broad scale was achieved. It is essential to homogenize and coordinate experimental design in modelling the selection of species' ecological requirements so that differences in results among studies are not merely due to methodological heterogeneity. This would help identify the best conservation and management practices for habitats and species in a global context. PMID:22413023
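A nest-site selection model of this kind is often fitted as a binomial GLM comparing nest plots with random plots. The sketch below uses synthetic covariates (slope, southness of aspect, large-tree cover, distance to human activity) whose names and effect sizes are illustrative assumptions, not the study's data.

```python
import numpy as np
import statsmodels.api as sm

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    n = 400                                   # synthetic nest + random plots
    slope = rng.uniform(0, 45, n)             # slope (degrees)
    south = rng.uniform(0, 1, n)              # "southness" of aspect (0-1)
    big_trees = rng.uniform(0, 60, n)         # % cover of large trees
    dist_human = rng.uniform(0, 5000, n)      # distance to human activity (m)

    # synthetic selection rule loosely following the reported pattern
    lin = -4 + 0.06 * slope + 1.2 * south + 0.04 * big_trees + 0.0005 * dist_human
    y = rng.binomial(1, 1 / (1 + np.exp(-lin)))   # 1 = nest plot, 0 = random plot

    X = sm.add_constant(np.column_stack([slope, south, big_trees, dist_human]))
    model = sm.Logit(y, X).fit(disp=False)
    print(model.params.round(4))              # coefficients: const, slope, south, trees, dist
```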
Rethinking Protocol Analysis from a Cultural Perspective.
ERIC Educational Resources Information Center
Smagorinsky, Peter
2001-01-01
Outlines a cultural-historical activity theory (CHAT) perspective that accounts for protocol analysis along three key dimensions: the relationship between thinking and speech from a representational standpoint; the social role of speech in research methodology; and the influence of speech on thinking and data collection. (Author/VWL)
Factors Influencing Teachers' Engagement in Informal Learning Activities
ERIC Educational Resources Information Center
Lohman, Margaret C.
2006-01-01
Purpose: The purpose of this study is to examine factors influencing the engagement of public school teachers in informal learning activities. Design/methodology/approach: This study used a survey research design. Findings: Analysis of the data found that teachers rely to a greater degree on interactive than on independent informal learning…
Cortical Signal Analysis and Advances in Functional Near-Infrared Spectroscopy Signal: A Review.
Kamran, Muhammad A; Mannan, Malik M Naeem; Jeong, Myung Yung
2016-01-01
Functional near-infrared spectroscopy (fNIRS) is a non-invasive neuroimaging modality that measures the concentration changes of oxy-hemoglobin (HbO) and de-oxy hemoglobin (HbR) at the same time. It is an emerging cortical imaging modality with a good temporal resolution that is acceptable for brain-computer interface applications. Researchers have developed several methods over the last two decades to extract the neuronal-activation-related waveform from the observed fNIRS time series, but there is still no standard method for the analysis of fNIRS data. This article presents a brief review of existing methodologies to model and analyze the activation signal. The purpose of this review is to give a general overview of the variety of existing methodologies to extract useful information from measured fNIRS data, including pre-processing steps, effects of the differential path length factor (DPF), variations and attributes of the hemodynamic response function (HRF), extraction of the evoked response, removal of physiological, instrumentation, and environmental noise, and resting/activation-state functional connectivity. Finally, the challenges in the analysis of fNIRS signals are summarized. PMID:27375458
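Many of the activation-extraction methods surveyed reduce to convolving a task regressor with a canonical hemodynamic response function and fitting it to the HbO series by least squares. The sketch below illustrates that idea on synthetic data; the sampling rate, block timing and double-gamma HRF parameters are assumptions, not the authors' pipeline.

```python
import numpy as np
from scipy.stats import gamma

def canonical_hrf(dt=0.1, duration=30.0):
    """Double-gamma canonical hemodynamic response function."""
    t = np.arange(0, duration, dt)
    peak = gamma.pdf(t, 6)          # main response, peaking around 5-6 s
    undershoot = gamma.pdf(t, 16)   # late undershoot
    hrf = peak - undershoot / 6.0
    return hrf / hrf.max()

def glm_activation(hbo, onsets, task_dur, fs):
    """Fit HbO to an HRF-convolved boxcar; returns (beta, t-like score)."""
    n = len(hbo)
    t = np.arange(n) / fs
    boxcar = np.zeros(n)
    for onset in onsets:
        boxcar[(t >= onset) & (t < onset + task_dur)] = 1.0
    regressor = np.convolve(boxcar, canonical_hrf(dt=1.0 / fs))[:n]
    X = np.column_stack([regressor, np.ones(n)])      # design: task + baseline
    beta, res, *_ = np.linalg.lstsq(X, hbo, rcond=None)
    dof = n - X.shape[1]
    sigma2 = res[0] / dof if res.size else np.var(hbo - X @ beta)
    se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[0, 0])
    return beta[0], beta[0] / se

if __name__ == "__main__":
    fs = 10.0                                         # assumed fNIRS sampling rate (Hz)
    rng = np.random.default_rng(2)
    n = int(300 * fs)
    onsets = np.arange(20, 280, 40.0)                 # assumed block design, 15 s tasks
    t = np.arange(n) / fs
    box = np.zeros(n)
    for o in onsets:
        box[(t >= o) & (t < o + 15)] = 1.0
    hbo = np.convolve(box, canonical_hrf(dt=1 / fs))[:n] * 0.5 \
        + rng.standard_normal(n) * 0.3                # synthetic noisy HbO series
    beta, tval = glm_activation(hbo, onsets, task_dur=15.0, fs=fs)
    print(f"beta = {beta:.3f}, t = {tval:.1f}")
```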
Suplatov, D A; Arzhanik, V K; Svedas, V K
2011-01-01
Comparative bioinformatic analysis is the cornerstone of the study of enzymes' structure-function relationships. However, numerous enzymes that derive from a common ancestor and have undergone substantial functional alterations during natural selection appear not to have a sequence similarity acceptable for a statistically reliable comparative analysis. At the same time, their active site structures, in general, can be conserved, while other parts may largely differ. Therefore, it is both plausible and appealing to implement a comparative analysis of the most functionally important structural elements, the active site structures; that is, the amino acid residues involved in substrate binding and the catalytic mechanism. A computer algorithm has been developed to create a library of enzyme active site structures based on the use of the PDB database, together with programs for structural analysis and identification of functionally important amino acid residues and cavities in the enzyme structure. The proposed methodology has been used to compare some α,β-hydrolase superfamily enzymes. The comparison revealed a high structural similarity of catalytic site areas, including the conservative organization of the catalytic triad and oxyanion hole residues, despite the wide functional diversity among the remote homologues compared. The methodology can be used to compare the structural organization of the catalytic and substrate binding sites of various classes of enzymes, as well as to study enzyme evolution and to create a databank of enzyme active site structures.
ERIC Educational Resources Information Center
Thouvenelle, Suzanne; And Others
The final document in a series on least restrictive environment (LRE) placement for handicapped students summarizes the objectives and findings of the project. Research questions, methodology, and conclusions are reviewed from each of four research activities: state education agency analysis; local education agency analysis; legal analysis; and…
NASA Astrophysics Data System (ADS)
Hendikawati, P.; Arifudin, R.; Zahid, M. Z.
2018-03-01
This study aims to design an Android Statistics Data Analysis application that can be accessed through mobile devices, making it easier for users to access statistical tools. The application covers various basic statistics topics along with parametric statistical data analysis. The output of the system is parametric statistical analysis that can be used by students, lecturers, and other users who need the results of statistical calculations quickly and in an easily understood form. The Android application is developed using the Java programming language. The server programming language uses PHP with the CodeIgniter framework, and the database uses MySQL. The system development methodology used is the Waterfall methodology, with the stages of analysis, design, coding, testing, implementation and system maintenance. This statistical data analysis application is expected to support statistics lecturing activities and make it easier for students to understand statistical analysis on mobile devices.
A comparative analysis of protected area planning and management frameworks
Per Nilsen; Grant Tayler
1997-01-01
A comparative analysis of the Recreation Opportunity Spectrum (ROS), Limits of Acceptable Change (LAC), a Process for Visitor Impact Management (VIM), Visitor Experience and Resource Protection (VERP), and the Management Process for Visitor Activities (known as VAMP) decision frameworks examines their origins; methodology; use of factors, indicators, and standards;...
AN APPROACH TO WATER RESOURCES EVALUATION OF NON-POINT SILVICULTURAL SOURCES (A PROCEDURAL HANDBOOK)
This handbook provides an analysis methodology that can be used to describe and evaluate changes to the water resource resulting from non-point silvicultural activities. This state-of-the-art approach for analysis and prediction of pollution from non point silvicultural activitie...
ERIC Educational Resources Information Center
Darwin, Stephen
2011-01-01
Cultural-historical activity theory (CHAT), founded on the seminal work of Vygotsky and evolving in the subsequent work of Leont'ev and Engestrom, continues to emerge as a robust and increasingly widely used conceptual framework for the research and analysis of the complex social mediation of human learning and development. Yet there remains…
ERIC Educational Resources Information Center
Webb, Sara Jane; Bernier, Raphael; Henderson, Heather A.; Johnson, Mark H.; Jones, Emily J. H.; Lerner, Matthew D.; McPartland, James C.; Nelson, Charles A.; Rojas, Donald C.; Townsend, Jeanne; Westerfield, Marissa
2015-01-01
The EEG reflects the activation of large populations of neurons that act in synchrony and propagate to the scalp surface. This activity reflects both the brain's background electrical activity and when the brain is being challenged by a task. Despite strong theoretical and methodological arguments for the use of EEG in understanding the…
ARO - Terrestrial Research Program, Methodologies and Protocols for Characterization of Geomaterials
2015-05-14
of ice involves melting, digestion, and analysis using inductively coupled plasma – mass spectrometry (ICPMS). ICP-MS analysis established elemental...4] have distinct chemical compositions. Knowledge of the chemical composition of the mineral assemblage present in a rock is critical to...activation analysis (INAA), to inductively-coupled plasma analysis and mass spectrometry (ICP & ICP-MS), mass spectrometry (MS), and laser-ablation
Organizing Language Intervention Relative to the Client's Personal Experience: A Clinical Case Study
ERIC Educational Resources Information Center
Chen, Liang; Whittington, Diane
2006-01-01
An analysis is presented of two different therapeutic activities designed for a profoundly deaf adult with cerebral palsy, DP. The study draws on techniques of qualitative methodology to identify elements that contribute to effective intervention practices for DP. Results indicate that therapeutic materials and activities must first of all be…
John G. Hof; Curtis H. Flather; Tony J. Baltic; Rudy M. King
2004-01-01
This article reports the methodology and results of a data envelopment analysis (DEA) that attempts to identify areas in the country where there is maximum potential for improving the forest and rangeland condition, based on 12 indicator variables. This analysis differs from previous DEA studies in that the primary variables are measures of human activity and...
ERIC Educational Resources Information Center
Sajavaara, Kari, Ed.; Lehtonen, Jaakko, Ed.
The following papers and reports are included: (1) "Prisoners of Code-Centred Privacy: Reflections on Contrastive Analysis and Related Disciplines" by Kari Sajavaara and Jaakko Lehtonen; (2) "The Methodology and Practice of Contrastive Discourse Analysis" by Sajavaara, Lehtonen, and Liisa Korpimies; (3) "Interactional Activities in Discourse…
ERIC Educational Resources Information Center
Rantavuori, Juhana; Engeström, Yrjö; Lipponen, Lasse
2016-01-01
The paper analyzes a collaborative learning process among Finnish pre-service teachers planning their own learning in a self-regulated way. The study builds on cultural-historical activity theory and the theory of expansive learning, integrating for the first time an analysis of learning actions and an analysis of types of interaction. We examine…
Medicine and the humanities--theoretical and methodological issues.
Puustinen, Raimo; Leiman, M; Viljanen, A M
2003-12-01
Engel's biopsychosocial model, Cassell's promotion of the concept "person" in medical thinking and Pellegrino's and Thomasma's philosophy of medicine are attempts to widen current biomedical theory of disease and to approach medicine as a form of human activity in pursuit of healing. To develop this approach further we would like to propose activity theory as a possible means for understanding the nature of medical practice. By "activity theory" we refer to developments which have evolved from Vygotsky's research on socially mediated mental functions and processes. Analysing medicine as activity enforces the joint consideration of target and subject: who is doing what to whom. This requires the use of historical, linguistic, anthropological, and semiotic tools. Therefore, if we analyse medicine as an activity, humanities are both theoretically and methodologically "inbound" (or internal) to the analysis itself. On the other hand, literature studies or anthropological writings provide material for analysing the various forms of medical practices.
Analysis of the Impacts of an Early Start for Compliance with the Kyoto Protocol
1999-01-01
This report describes the Energy Information Administration's analysis of the impacts of an early start, using the same methodology as in Impacts of the Kyoto Protocol on U.S. Energy Markets and Economic Activity, with only those changes in assumptions caused by the early start date.
ERIC Educational Resources Information Center
Salerno, Carlo
2006-01-01
This paper puts forth a data envelopment analysis (DEA) approach to estimating higher education institutions' per-student education costs (PSCs) in an effort to redress a number of methodological problems endemic to such estimations, particularly the allocation of shared expenditures between education and other institutional activities. An example…
How to estimate greenhouse gas (GHG) emissions from an excavator by using CAT's performance chart
NASA Astrophysics Data System (ADS)
Hajji, Apif M.; Lewis, Michael P.
2017-09-01
Construction equipment activities are a major part of many infrastructure projects. This type of equipment typically releases large quantities of greenhouse gas (GHG) emissions, which come mainly from fuel consumption; equipment productivity in turn affects fuel consumption. Thus, an estimating tool based on the construction equipment productivity rate can accurately assess the GHG emissions resulting from equipment activities. This paper proposes a methodology to estimate the environmental impact of a common construction activity, and delivers a sensitivity analysis and a case study for an excavator based on trench excavation. The methodology can be applied as a stand-alone model or as a module integrated with other emissions estimators. GHG emissions are highly correlated with diesel fuel use, at approximately 10.15 kilograms (kg) of CO2 per gallon of diesel fuel. The results showed that the productivity rate model obtained from multiple regression analysis can be used as the basis for estimating GHG emissions, and also as a framework for developing emissions footprints and understanding the environmental impact of construction equipment activities.
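The estimating logic (productivity rate gives working hours, hours give fuel burned, fuel gives CO2 at roughly 10.15 kg per gallon of diesel) can be sketched as below; only the emission factor comes from the text, while the trench volume, productivity and fuel-rate figures are illustrative assumptions rather than CAT performance-chart values.

```python
CO2_PER_GALLON_DIESEL_KG = 10.15   # emission factor quoted in the abstract

def trench_excavation_emissions(trench_volume_m3, productivity_m3_per_hr,
                                fuel_rate_gal_per_hr):
    """Estimate CO2 (kg) for a trenching job from productivity and fuel burn rate."""
    hours = trench_volume_m3 / productivity_m3_per_hr   # working hours needed
    fuel_gal = hours * fuel_rate_gal_per_hr             # diesel burned over those hours
    return fuel_gal * CO2_PER_GALLON_DIESEL_KG

if __name__ == "__main__":
    # Illustrative (assumed) excavator figures
    co2_kg = trench_excavation_emissions(trench_volume_m3=1200,
                                         productivity_m3_per_hr=85,
                                         fuel_rate_gal_per_hr=6.5)
    print(f"estimated emissions: {co2_kg:.0f} kg CO2")
```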
Alvarez-Nemegyei, José; Buenfil-Rello, Fátima Annai; Pacheco-Pantoja, Elda Leonor
2016-01-01
Reports regarding the association between body composition and inflammatory activity in rheumatoid arthritis (RA) have consistently yielded contradictory results. The aim was to perform a systematic review of the association between overweight/obesity and inflammatory activity in RA. FAST approach: article search (Medline, EBSCO, Cochrane Library), followed by abstract retrieval, full text review and blinded assessment of methodological quality for final inclusion. Because of marked heterogeneity in statistical approach and RA activity assessment method, a meta-analysis could not be done; results are presented as a qualitative synthesis. One hundred and nineteen reports were found, 16 of which qualified for full text review. Eleven studies (8,147 patients; n range: 37-5,161) passed the methodological quality filter and were finally included. Interobserver agreement for the methodological quality score (ICC: 0.93; 95% CI: 0.82-0.98; P<.001) and the inclusion/rejection decision (κ = 1.00, P<.001) was excellent. In all reports body composition was assessed by BMI; however, marked heterogeneity was found in the method used for RA activity assessment. A significant association between BMI and RA activity was found in the 6 reports with larger mean sample sizes: 1,274 (range: 140-5,161). On the other hand, this association was not found in the 5 studies with smaller mean sample sizes: 100 (range: 7-150). The modulation of RA clinical status by body fat mass is suggested because a significant association was found between BMI and inflammatory activity in those reports with a trend toward higher statistical power. The relationship between body composition and clinical activity in RA needs to be addressed in further studies of higher methodological quality. Copyright © 2015 Elsevier España, S.L.U. and Sociedad Española de Reumatología y Colegio Mexicano de Reumatología. All rights reserved.
Contaminated site cleanups involving complex activities may benefit from a detailed environmental footprint analysis to inform decision-making about application of suitable best management practices for greener cleanups.
Brattoli, Magda; Cisternino, Ezia; Dambruoso, Paolo Rosario; de Gennaro, Gianluigi; Giungato, Pasquale; Mazzone, Antonio; Palmisani, Jolanda; Tutino, Maria
2013-01-01
The gas chromatography-olfactometry (GC-O) technique couples traditional gas chromatographic analysis with sensory detection in order to study complex mixtures of odorous substances and to identify odor-active compounds. The GC-O technique is already widely used for the evaluation of food aromas, and its application in environmental fields is increasing, thus moving odor emission assessment from solely olfactometric evaluations to the characterization of the volatile components responsible for odor nuisance. The aim of this paper is to describe the state of the art of gas chromatography-olfactometry methodology, considering the different approaches regarding operational conditions and the different methods for evaluating the olfactometric detection of odor compounds. The potential of GC-O is described, highlighting the improvements this methodology offers relative to other conventional approaches used for odor detection, such as sensor-based, sensorial and traditional gas chromatographic methods. The paper also provides an examination of the different fields of application of GC-O, principally related to fragrances and food aromas, odor nuisance produced by anthropic activities, and odorous compounds emitted by materials and medical applications. PMID:24316571
Nonpoint source pollution of urban stormwater runoff: a methodology for source analysis.
Petrucci, Guido; Gromaire, Marie-Christine; Shorshani, Masoud Fallah; Chebbo, Ghassan
2014-09-01
The characterization and control of runoff pollution from nonpoint sources in urban areas are a major issue for the protection of aquatic environments. We propose a methodology to quantify the sources of pollutants in an urban catchment and to analyze the associated uncertainties. After describing the methodology, we illustrate it through an application to the sources of Cu, Pb, Zn, and polycyclic aromatic hydrocarbons (PAH) in a residential catchment (228 ha) in the Paris region. In this application, we suggest several procedures that can be applied to the analysis of other pollutants in different catchments, including an estimation of the total extent of roof accessories (gutters and downspouts, watertight joints and valleys) in a catchment. These accessories emerge as the major source of Pb and an important source of Zn in the example catchment, while activity-related sources (traffic, heating) are dominant for Cu (brake pad wear) and PAH (tire wear, atmospheric deposition).
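A minimal sketch of the source-quantification step follows, assuming hypothetical surfaces, runoff coefficients and event mean concentration ranges; uncertainty is propagated by simple Monte Carlo sampling rather than the authors' specific procedures.

```python
import numpy as np

def source_loads(areas_m2, runoff_coeff, conc_mg_per_L, rain_mm, rng, n=10000):
    """Monte Carlo annual load (kg) per source; concentrations given as (low, high) ranges."""
    loads = {}
    for name in areas_m2:
        volume_L = areas_m2[name] * runoff_coeff[name] * rain_mm  # 1 mm over 1 m2 = 1 L
        low, high = conc_mg_per_L[name]
        conc = rng.uniform(low, high, size=n)                     # assumed uniform uncertainty
        loads[name] = volume_L * conc / 1e6                       # mg -> kg
    return loads

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    # Hypothetical catchment surfaces and Zn event-mean-concentration ranges (mg/L)
    areas = {"roofs": 4.0e5, "roads": 3.0e5, "gutters_joints": 2.0e4}
    coeff = {"roofs": 0.9, "roads": 0.85, "gutters_joints": 0.9}
    conc = {"roofs": (0.1, 0.6), "roads": (0.2, 0.5), "gutters_joints": (2.0, 8.0)}
    loads = source_loads(areas, coeff, conc, rain_mm=650, rng=rng)
    for name, l in loads.items():
        print(f"{name}: median {np.median(l):.0f} kg/yr "
              f"(90% interval {np.percentile(l, 5):.0f}-{np.percentile(l, 95):.0f})")
```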
Argumentation: A Methodology to Facilitate Critical Thinking.
Makhene, Agnes
2017-06-20
Caring is a demanding nursing activity that involves the complex nature of the human being and requires decision-making and problem solving through the critical thinking process. Critical thinking must therefore be facilitated in education generally, and in nursing education particularly, in order to render care in diverse multicultural patient care settings. This paper aims to describe how argumentation can be used to facilitate critical thinking in learners. A qualitative, exploratory, descriptive and contextual design was used. Purposive sampling was used to draw the sample, and Miles and Huberman's methodology of qualitative analysis was used to analyse the data. Lincoln and Guba's strategies were employed to ensure trustworthiness, while Dhai and McQuoid-Mason's principles of ethical consideration were used. Following data analysis, the findings were integrated with the literature, culminating in the formulation of guidelines for using argumentation as a methodology to facilitate critical thinking.
NASA Astrophysics Data System (ADS)
Ahmad, Mohd Azmier; Afandi, Nur Syahidah; Bello, Olugbenga Solomon
2017-05-01
This study investigates the adsorptive removal of malachite green (MG) dye from aqueous solutions using chemically modified lime-peel-based activated carbon (LPAC). The prepared adsorbent was characterized using FTIR, SEM, proximate analysis and BET techniques. Central composite design (CCD) in response surface methodology (RSM) was used to optimize the adsorption process. The effects of three variables, namely activation temperature, activation time and KOH impregnation ratio (IR), on the percentage of dye removal and LPAC yield were investigated. Based on the CCD, quadratic and two-factor interaction (2FI) models were developed correlating the adsorption variables to the two responses. Analysis of variance (ANOVA) was used to judge the adequacy of the models. The optimum conditions for MG dye removal using LPAC were an activation temperature of 796 °C, an activation time of 1.0 h and an impregnation ratio of 2.6. The percentage of MG dye removal obtained was 94.68 %, resulting in a 17.88 % LPAC yield. The error between predicted and experimental results for the removal of MG dye was 0.4 %. Model prediction was in good agreement with experimental results and LPAC was found to be effective in removing MG dye from aqueous solution.
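After the CCD runs are measured, a second-order polynomial is fitted and the optimum located within the coded region. The sketch below uses synthetic responses, so the coefficients and the optimum it finds are illustrative, not the reported LPAC values.

```python
import itertools
import numpy as np

def quadratic_design_matrix(X):
    """Columns: intercept, linear, squared, and two-factor interaction terms."""
    n, k = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(k)] \
         + [X[:, i] ** 2 for i in range(k)] \
         + [X[:, i] * X[:, j] for i, j in itertools.combinations(range(k), 2)]
    return np.column_stack(cols)

def fit_rsm(X, y):
    """Least-squares fit of the second-order response surface model."""
    D = quadratic_design_matrix(X)
    beta, *_ = np.linalg.lstsq(D, y, rcond=None)
    return beta

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    # coded levels for temperature, time, impregnation ratio (assumed runs)
    X = rng.uniform(-1, 1, size=(20, 3))
    # synthetic "% dye removal" response with a known optimum near (0.8, -0.5, 0.3)
    y = 90 - 5 * (X[:, 0] - 0.8) ** 2 - 3 * (X[:, 1] + 0.5) ** 2 \
        - 4 * (X[:, 2] - 0.3) ** 2 + rng.standard_normal(20)
    beta = fit_rsm(X, y)
    # crude grid search for the predicted optimum inside the coded region
    grid = np.array(list(itertools.product(np.linspace(-1, 1, 21), repeat=3)))
    pred = quadratic_design_matrix(grid) @ beta
    best = grid[np.argmax(pred)]
    print("predicted optimum (coded):", best.round(2), "->", pred.max().round(1), "%")
```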
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khaleel, R.; Mehta, S.; Nichols, W. E.
This annual review provides the projected dose estimates of radionuclide inventories disposed of in the active 200 West Area Low-Level Burial Grounds (LLBGs) since September 26, 1988. These estimates are calculated using the original dose methodology developed in the performance assessment (PA) analysis (WHC-EP-0645).
ERIC Educational Resources Information Center
Torrens, Paul M.; Griffin, William A.
2013-01-01
The authors describe an observational and analytic methodology for recording and interpreting dynamic microprocesses that occur during social interaction, making use of space--time data collection techniques, spatial-statistical analysis, and visualization. The scheme has three investigative foci: Structure, Activity Composition, and Clustering.…
Towards a Methodology for Managing Competencies in Virtual Teams - A Systemic Approach
NASA Astrophysics Data System (ADS)
Schumacher, Marinita; Stal-Le Cardinal, Julie; Bocquet, Jean-Claude
Virtual instruments and tools are a growing trend in engineering, responding to the increasing complexity of engineering tasks, the ease of communication and strong collaboration in the international market. Outsourcing, off-shoring, and the globalization of organisations' activities have resulted in the formation of virtual product development teams. Individuals who are working in virtual teams must be equipped with diversified competencies that provide a basis for virtual team building. Using the systemic approach of functional analysis, our paper responds to the need for a competence management methodology to build virtual teams that are active in virtual design projects in the area of New Product Development (NPD).
Impact of Physical Activity Interventions on Blood Pressure in Brazilian Populations
Bento, Vivian Freitas Rezende; Albino, Flávia Barbizan; de Moura, Karen Fernandes; Maftum, Gustavo Jorge; dos Santos, Mauro de Castro; Guarita-Souza, Luiz César; Faria Neto, José Rocha; Baena, Cristina Pellegrino
2015-01-01
Background: High blood pressure is associated with cardiovascular disease, which is the leading cause of mortality in the Brazilian population. Lifestyle changes, including physical activity, are important for lowering blood pressure levels and decreasing the costs associated with outcomes. Objective: To assess the impact of physical activity interventions on blood pressure in Brazilian individuals. Methods: Meta-analysis and systematic review of studies published until May 2014, retrieved from several health sciences databases. Seven studies with 493 participants were included. The analysis included parallel studies of physical activity interventions in adult populations in Brazil with a description of blood pressure (mmHg) before and after the intervention in the control and intervention groups. Results: Of 390 retrieved studies, eight matched the proposed inclusion criteria for the systematic review and seven randomized clinical trials were included in the meta-analysis. Physical activity interventions included aerobic and resistance exercises. There was a reduction of -10.09 mmHg (95% CI: -18.76 to -1.43) in systolic and -7.47 mmHg (95% CI: -11.30 to -3.63) in diastolic blood pressure. Conclusions: Available evidence on the effects of physical activity on blood pressure in the Brazilian population shows a homogeneous and significant effect on both systolic and diastolic blood pressure. However, the strength of the included studies was low and their methodological quality was low to fair. Larger studies with more rigorous methodology are necessary to build robust evidence. PMID:26016783
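The pooled blood-pressure changes reported above are the kind of quantity a random-effects model produces. A minimal DerSimonian-Laird sketch is shown below with made-up study effects and standard errors, not the seven trials of this review.

```python
import numpy as np

def dersimonian_laird(effects, ses):
    """Random-effects pooled estimate and 95% CI using the DerSimonian-Laird tau^2."""
    effects, ses = np.asarray(effects, float), np.asarray(ses, float)
    w = 1.0 / ses**2                               # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)         # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                  # between-study variance
    w_re = 1.0 / (ses**2 + tau2)                   # random-effects weights
    pooled = np.sum(w_re * effects) / np.sum(w_re)
    se_pooled = np.sqrt(1.0 / np.sum(w_re))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

if __name__ == "__main__":
    # hypothetical per-study changes in systolic BP (mmHg) and their standard errors
    effects = [-12.0, -6.5, -15.2, -9.0, -4.8, -11.1, -8.3]
    ses = [4.0, 3.5, 5.0, 2.8, 3.0, 4.5, 3.2]
    pooled, ci = dersimonian_laird(effects, ses)
    print(f"pooled effect: {pooled:.2f} mmHg (95% CI {ci[0]:.2f} to {ci[1]:.2f})")
```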
Nonlinear analysis of human physical activity patterns in health and disease.
Paraschiv-Ionescu, A; Buchser, E; Rutschmann, B; Aminian, K
2008-02-01
The reliable and objective assessment of chronic disease state has been and still is a very significant challenge in clinical medicine. An essential feature of human behavior related to the health status, the functional capacity, and the quality of life is the physical activity during daily life. A common way to assess physical activity is to measure the quantity of body movement. Since human activity is controlled by various factors both extrinsic and intrinsic to the body, quantitative parameters only provide a partial assessment and do not allow for a clear distinction between normal and abnormal activity. In this paper, we propose a methodology for the analysis of human activity pattern based on the definition of different physical activity time series with the appropriate analysis methods. The temporal pattern of postures, movements, and transitions between postures was quantified using fractal analysis and symbolic dynamics statistics. The derived nonlinear metrics were able to discriminate patterns of daily activity generated from healthy and chronic pain states.
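The symbolic-dynamics component of such an analysis can be illustrated by coding postures as symbols and computing the normalized Shannon entropy of fixed-length words; the posture codes, probabilities and word length below are synthetic assumptions, not the authors' parameters.

```python
import numpy as np
from collections import Counter

def word_entropy(symbols, word_len=3):
    """Normalized Shannon entropy of overlapping words in a symbol sequence."""
    words = [tuple(symbols[i:i + word_len]) for i in range(len(symbols) - word_len + 1)]
    counts = np.array(list(Counter(words).values()), float)
    p = counts / counts.sum()
    n_states = len(set(symbols))
    h_max = word_len * np.log2(n_states)          # entropy of a fully random sequence
    return float(-(p * np.log2(p)).sum() / h_max)

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    # synthetic posture codes: 0=lying, 1=sitting, 2=standing, 3=walking
    varied = rng.choice(4, size=2000, p=[0.25, 0.3, 0.25, 0.2])       # varied pattern
    monotonous = rng.choice(4, size=2000, p=[0.6, 0.3, 0.08, 0.02])   # low-activity pattern
    print("entropy (varied):    ", round(word_entropy(varied), 3))
    print("entropy (monotonous):", round(word_entropy(monotonous), 3))
```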
Preliminary Work Domain Analysis for Human Extravehicular Activity
NASA Technical Reports Server (NTRS)
McGuire, Kerry; Miller, Matthew; Feigh, Karen
2015-01-01
A work domain analysis (WDA) of human extravehicular activity (EVA) is presented in this study. A formative methodology such as Cognitive Work Analysis (CWA) offers a new perspective to the knowledge gained from the past 50 years of living and working in space for the development of future EVA support systems. EVA is a vital component of human spaceflight and provides a case study example of applying a work domain analysis (WDA) to a complex sociotechnical system. The WDA presented here illustrates how the physical characteristics of the environment, hardware, and life support systems of the domain guide the potential avenues and functional needs of future EVA decision support system development.
Łoś, Aleksandra; Strachecka, Aneta
2018-05-09
Using insect hemolymph ("blood") and insect body surface elutions, researchers can perform rapid and cheap biochemical analyses to determine the insect's immunological status. The authors describe a detailed methodology for the rapid determination of total protein concentration and evaluation of proteolytic system activity (acid, neutral, and alkaline proteases and protease inhibitors), as well as a methodology for quick "liver" tests in insects: alanine aminotransferase (ALT), aspartate aminotransferase (AST), alkaline phosphatase (ALP), and urea and glucose concentration analyses. The meaning and interpretation of the results obtained with the presented methodology for biochemical parameter determination are described using the honey bee as an example.
ERIC Educational Resources Information Center
Pereira Querol, Marco A.; Suutari, Timo; Seppanen, Laura
2010-01-01
The purpose of this paper is to present theoretical tools for understanding the dynamics of change and learning during the emergence and development of environmental management activities. The methodology consists of a historical analysis of a case of biogas production that took place in the Southwest region of Finland. The theoretical tools used…
Xu, Juan; Sheng, Guo-Ping; Luo, Hong-Wei; Fang, Fang; Li, Wen-Wei; Zeng, Raymond J; Tong, Zhong-Hua; Yu, Han-Qing
2011-01-01
Soluble microbial products (SMPs) represent a major part of the residual chemical oxygen demand (COD) in the effluents from biological wastewater treatment systems, and SMP formation is greatly influenced by a variety of process parameters. In this study, response surface methodology (RSM) coupled with grey relational analysis (GRA) was used to evaluate the effects of substrate concentration, temperature, NH(4)(+)-N concentration and aeration rate on SMP production in batch activated sludge reactors. Carbohydrates were found to be the major component of SMP, and the influential priorities of these factors were: temperature > substrate concentration > aeration rate > NH(4)(+)-N concentration. On the basis of the RSM results, the interactive effects of these factors on SMP formation were evaluated, and the optimal operating conditions for minimum SMP production in such a batch activated sludge system were also identified. These results provide useful information on how to control SMP formation in activated sludge and ensure high-quality bioreactor effluent. Copyright © 2010 Elsevier Ltd. All rights reserved.
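Grey relational analysis ranks alternatives by their grey relational grade against an ideal reference sequence. A minimal sketch follows, with hypothetical run responses and the customary distinguishing coefficient of 0.5; it is not the authors' RSM-coupled implementation.

```python
import numpy as np

def grey_relational_grades(data, larger_is_better=True, zeta=0.5):
    """Grey relational grade of each alternative (row) against the ideal sequence."""
    data = np.asarray(data, float)
    lo, hi = data.min(axis=0), data.max(axis=0)
    # normalize each criterion so that the ideal value maps to 1
    norm = (data - lo) / (hi - lo) if larger_is_better else (hi - data) / (hi - lo)
    delta = np.abs(1.0 - norm)                     # deviation from the ideal
    grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return grc.mean(axis=1)                        # grade = mean coefficient per alternative

if __name__ == "__main__":
    # hypothetical responses per run (e.g. effluent quality indicators); higher = better
    runs = [[0.82, 0.75, 0.90],
            [0.60, 0.95, 0.70],
            [0.91, 0.88, 0.85]]
    grades = grey_relational_grades(runs)
    print("grey relational grades:", grades.round(3), "best run:", int(grades.argmax()) + 1)
```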
Strategies to determine diversity, growth, and activity of ammonia-oxidizing archaea in soil.
Nicol, Graeme W; Prosser, James I
2011-01-01
Ecological studies of soil microorganisms require reliable techniques for assessment of microbial community composition, abundance, growth, and activity. Soil structure and physicochemical properties seriously limit the applicability and value of methods involving direct observation, and ecological studies have focused on communities and populations, rather than single cells or microcolonies. Although ammonia-oxidizing archaea were discovered 5 years ago, there are still no cultured representatives from soil and there remains a lack of knowledge regarding their genomic composition, physiology, or functional diversity. Despite these limitations, however, significant insights into their distribution, growth characteristics, and metabolism have been made through the use of a range of molecular methodologies. As well as the analysis of taxonomic markers such as 16S rRNA genes, the development of PCR primers based on a limited number of (mostly marine) sequences has enabled the analysis of homologues encoding proteins involved in energy and carbon metabolism. This chapter will highlight the range of molecular methodologies available for examining the diversity, growth, and activity of ammonia-oxidizing archaea in the soil environment. Copyright © 2011 Elsevier Inc. All rights reserved.
Development of a system of indicators for sustainable port management.
Peris-Mora, E; Diez Orejas, J M; Subirats, A; Ibáñez, S; Alvarez, P
2005-12-01
The 1998 project ECOPORT, "Towards A Sustainable Transport Network", developed by the Valencia Port Authority (VPA), established the bases for implementing an Environmental Management System (EMS) in industrial harbours. Data and information are always required to develop an efficient EMS. The objective of the present research (INDAPORT) is to propose a system of sustainable environmental management indicators that can be used by any port authority. All activities performed within a port area are analysed for potential environmental impacts and risks. An environmental analysis of port activities has been carried out with the objective of designing the indicator system. Twenty-one corresponding activities have been identified for large industrial ports. The methodology developed to date will subsequently be applied to other Spanish and European ports. The study was developed using an original system and methodology that simultaneously use stage diagrams and systemic models (material and energy flow charts). Multi-criteria analysis techniques were used to evaluate potential impacts (identification of factors and evaluation of impacts).
Analysis and Purification of Bioactive Natural Products: The AnaPurNa Study
2012-01-01
Based on a meta-analysis of data mined from almost 2000 publications on bioactive natural products (NPs) from >80 000 pages of 13 different journals published in 1998–1999, 2004–2005, and 2009–2010, the aim of this systematic review is to provide both a survey of the status quo and a perspective for analytical methodology used for isolation and purity assessment of bioactive NPs. The study provides numerical measures of the common means of sourcing NPs, the chromatographic methodology employed for NP purification, and the role of spectroscopy and purity assessment in NP characterization. A link is proposed between the observed use of various analytical methodologies, the challenges posed by the complexity of metabolomes, and the inescapable residual complexity of purified NPs and their biological assessment. The data provide inspiration for the development of innovative methods for NP analysis as a means of advancing the role of naturally occurring compounds as a viable source of biologically active agents with relevance for human health and global benefit. PMID:22620854
ERIC Educational Resources Information Center
Mason, Robert M.; And Others
This document presents a research effort intended to improve the economic information available for formulating policies and making decisions related to Information Analysis Centers (IACs) and IAC services. The project used a system of IAC information activities to analyze the functional aspects of IAC services, calculate the present value of net…
Methodology for the systems engineering process. Volume 1: System functional activities
NASA Technical Reports Server (NTRS)
Nelson, J. H.
1972-01-01
Systems engineering is examined in terms of functional activities that are performed in the conduct of a system definition/design, and system development is described in a parametric analysis that combines functions, performance, and design variables. Emphasis is placed on identification of activities performed by design organizations, design specialty groups, as well as a central systems engineering organizational element. Identification of specific roles and responsibilities for doing functions, and monitoring and controlling activities within the system development operation are also emphasized.
Geomatics for Maritime Parks and Preserved Areas
NASA Astrophysics Data System (ADS)
Lo Tauro, Agata
2009-11-01
The aim of this research is to use hyperspectral MIVIS data for the protection of sensitive cultural and natural resources, nature reserves and maritime parks. Knowledge of the distribution of submerged vegetation is useful for monitoring the health of ecosystems in coastal areas. The objective of this project was to develop a new methodology within a geomatics environment to facilitate analysis and application by local institutions that are not familiar with spatial analysis software, in order to implement new research activities in this field of study. Field controls may be carried out with the support of accurate in situ analyses in order to determine the training sites for the tested classification. The methodology applied demonstrates that the combination of hyperspectral sensors and ESA Remote Sensing (RS) data can be used for thematic cartography of submerged vegetation and land use analysis for Sustainable Development. This project will be implemented for Innovative Educational and Research Programmes.
Critical Language Awareness in Pedagogic Context
ERIC Educational Resources Information Center
Ali, Shamim
2011-01-01
This study was designed to investigate the significance of developing students' critical language awareness through explicit teaching methodology of some procedures of critical discourse analysis. The researcher integrated critical activities into her teaching and students' learning process. The study was planned prudently to discover the…
2016 Workplace and Gender Relations Survey of Active Duty Members: Statistical Methodology Report
2017-03-01
2016 Workplace and Gender Relations Survey of Active Duty Members: Statistical Methodology Report. Office of People Analytics (OPA), Defense Research, Surveys, and Statistics Center, 4800 Mark Center Drive...
Exercise redox biochemistry: Conceptual, methodological and technical recommendations.
Cobley, James N; Close, Graeme L; Bailey, Damian M; Davison, Gareth W
2017-08-01
Exercise redox biochemistry is of considerable interest owing to its translational value in health and disease. However, unaddressed conceptual, methodological and technical issues complicate attempts to unravel how exercise alters redox homeostasis in health and disease. Conceptual issues relate to misunderstandings that arise when the chemical heterogeneity of redox biology is disregarded, which often complicates attempts to use redox-active compounds and assess redox signalling. Further, that oxidised macromolecule adduct levels reflect both formation and repair is seldom considered. Methodological and technical issues relate to the use of out-dated assays and/or inappropriate sample preparation techniques that confound biochemical redox analysis. After considering each of the aforementioned issues, we outline how each can be resolved and provide a unifying set of recommendations. We specifically recommend that investigators: consider chemical heterogeneity, use redox-active compounds judiciously, abandon flawed assays, carefully prepare samples and assay buffers, consider repair/metabolism, and use multiple biomarkers to assess oxidative damage and redox signalling. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
[Bibliometric analysis of publications by the Mexican Social Security Institute staff].
Valdez-Martínez, E; Garduño-Espinosa, J; Gómez-Delgado, A; Dante Amato-Martínez, J; Morales-Mori, L; Blanco-Favela, F; Muñoz-Hernández, O
2000-01-01
The objective was to describe and analyze the general characteristics and methodology of indexed publications by the health staff of the Mexican Social Security Institute in 1997. Original articles were evaluated. The primary sources included Index Medicus, Current Contents and the Mexican National Council of Science and Technology (CONACYT) index. The following information was gathered for each article: affiliation and chief activity of the first author; impact factor of the journal; research type; field of study; topic of study; and methodological conduction. This latter point included congruence between design and objective, reproducibility of methods, applicability of the analysis, and pertinence of the conclusions. A total of 300 original articles was published, of which 212 (71%) were available for the present study: full-time investigators (FTI) generated 109 articles and investigators with clinical activities (CAI) wrote 103 articles. The median impact factor of the journals in which FTI published was 1.337 (0.341 to 37.297) and for CAI publications, 0.707 (0.400 to 4.237). Biomedical research predominated in the first group (41%) and clinical investigation in the second (66%). Statistically significant differences in methodological conduction were identified between the groups of investigators. Descriptive studies and publications in journals without an impact factor predominated. The FTI group had the highest bibliographic production of original articles in indexed journals with an impact factor.
Moreno-Conde, Alberto; Moner, David; Cruz, Wellington Dimas da; Santos, Marcelo R; Maldonado, José Alberto; Robles, Montserrat; Kalra, Dipak
2015-07-01
This systematic review aims to identify and compare the existing processes and methodologies that have been published in the literature for defining clinical information models (CIMs) that support the semantic interoperability of electronic health record (EHR) systems. Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) methodology, the authors reviewed papers published between 2000 and 2013 that covered the semantic interoperability of EHRs, found by searching the PubMed, IEEE Xplore, and ScienceDirect databases. Additionally, after selection of a final group of articles, an inductive content analysis was done to summarize the steps and methodologies followed in order to build the CIMs described in those articles. Three hundred and seventy-eight articles were screened and thirty-six were selected for full review. The articles selected for full review were analyzed to extract relevant information for the analysis and characterized according to the steps the authors had followed for clinical information modeling. Most of the reviewed papers lack a detailed description of the modeling methodologies used to create CIMs. A representative example is the lack of description related to the definition of terminology bindings and the publication of the generated models. However, this systematic review confirms that most clinical information modeling activities follow very similar steps for the definition of CIMs. Having a robust and shared methodology could improve their correctness, reliability, and quality. Independently of implementation technologies and standards, it is possible to find common patterns in methods for developing CIMs, suggesting the viability of defining a unified good practice methodology to be used by any clinical information modeler. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Biederman, J; Hammerness, P; Sadeh, B; Peremen, Z; Amit, A; Or-Ly, H; Stern, Y; Reches, A; Geva, A; Faraone, S V
2017-05-01
A previous small study suggested that Brain Network Activation (BNA), a novel ERP-based brain network analysis, may have diagnostic utility in attention deficit hyperactivity disorder (ADHD). In this study we examined the diagnostic capability of a new advanced version of the BNA methodology on a larger population of adults with and without ADHD. Subjects were unmedicated right-handed 18- to 55-year-old adults of both sexes with and without a DSM-IV diagnosis of ADHD. We collected EEG while the subjects were performing a response inhibition task (Go/NoGo) and then applied a spatio-temporal Brain Network Activation (BNA) analysis of the EEG data. This analysis produced a display of qualitative measures of brain states (BNA scores) providing information on cortical connectivity. This complex set of scores was then fed into a machine learning algorithm. The BNA analysis of the EEG data recorded during the Go/NoGo task demonstrated a high discriminative capacity between ADHD patients and controls (AUC = 0.92, specificity = 0.95, sensitivity = 0.86 for the Go condition; AUC = 0.84, specificity = 0.91, sensitivity = 0.76 for the NoGo condition). BNA methodology can help differentiate between ADHD and healthy controls based on functional brain connectivity. The data support the utility of the tool to augment clinical examinations by objective evaluation of electrophysiological changes associated with ADHD. Results also support a network-based approach to the study of ADHD.
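The reported discriminative figures (AUC, sensitivity, specificity) are standard classifier metrics; the sketch below computes them from synthetic scores, not from BNA outputs, and the decision threshold is an arbitrary assumption.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def sens_spec(y_true, y_score, threshold=0.5):
    """Sensitivity and specificity of a score-based classifier at a given threshold."""
    y_true = np.asarray(y_true)
    y_pred = (np.asarray(y_score) >= threshold).astype(int)
    tp = np.sum((y_pred == 1) & (y_true == 1))
    tn = np.sum((y_pred == 0) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    return tp / (tp + fn), tn / (tn + fp)

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    y = np.array([1] * 30 + [0] * 30)                      # synthetic patients and controls
    scores = np.concatenate([rng.normal(0.7, 0.15, 30),    # higher scores for patients
                             rng.normal(0.4, 0.15, 30)])
    sens, spec = sens_spec(y, scores)
    print(f"AUC={roc_auc_score(y, scores):.2f}, "
          f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```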
A Cross-Analysis of the Mathematics Teacher's Activity: An Example in a French 10th-Grade Class
ERIC Educational Resources Information Center
Robert, Aline; Rogalski, Janine
2005-01-01
The purpose of this paper is to contribute to the debate about how to tackle the issue of "the teacher in the teaching/learning process", and to propose a methodology for analysing the teacher's activity in the classroom, based on concepts used in the fields of the didactics of mathematics as well as in cognitive ergonomics. This…
NASA Technical Reports Server (NTRS)
Ziebarth, John P.; Meyer, Doug
1992-01-01
The coordination is examined of necessary resources, facilities, and special personnel to provide technical integration activities in the area of computational fluid dynamics applied to propulsion technology. Involved is the coordination of CFD activities between government, industry, and universities. Current geometry modeling, grid generation, and graphical methods are established to use in the analysis of CFD design methodologies.
Let's Move! The Social and Health Contributions from "Pokémon" GO
ERIC Educational Resources Information Center
Finco, Mateus David; Rocha, Richard Santin; Fão, Rafael Wailla; Santos, Fabiana
2018-01-01
The aim of this article was to analyze how players of Pokémon GO could adopt a healthier and active lifestyle meanwhile or after using the game, observing how active they could become in their daily routines. The methodology involved a qualitative analysis involving a sample with players who were invited to complete an online questionnaire to…
75 FR 70966 - Transit Asset Management (TAM) Pilot Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-19
... interoperability between diverse types of information technology systems through use of open data formats and... to develop asset management plans, technical assistance, data collection and a pilot program as... condition assessment methodologies, as well as new data collection and analysis activities. $3 million has...
Space station needs, attributes and architectural options: Mission requirements
NASA Technical Reports Server (NTRS)
1983-01-01
Various mission requirements for the proposed space station are examined. Subjects include modelling methodology, science applications, commercial opportunities, operations analysis, integrated mission requirements, and the role of man in space station functions and activities. The information is presented through the use of graphs.
Transcriptome analysis of blueberry using 454 EST sequencing
USDA-ARS?s Scientific Manuscript database
Blueberry (Vaccinium corymbosum) is a major berry crop in the United States, and one that has great nutritional and economical value. Next generation sequencing methodologies, such as 454, have been demonstrated to be successful and efficient in producing a snap-shot of transcriptional activities du...
Šumić, Zdravko; Vakula, Anita; Tepić, Aleksandra; Čakarević, Jelena; Vitas, Jasmina; Pavlić, Branimir
2016-07-15
Fresh red currants were dried by a vacuum drying process under different drying conditions. A Box-Behnken experimental design with response surface methodology was used for optimization of the drying process in terms of physical (moisture content, water activity, total color change, firmness and rehydration power) and chemical (total phenols, total flavonoids, monomeric anthocyanins and ascorbic acid content and antioxidant activity) properties of the dried samples. Temperature (48-78 °C), pressure (30-330 mbar) and drying time (8-16 h) were investigated as independent variables. Experimental results were fitted to a second-order polynomial model where regression analysis and analysis of variance were used to determine model fitness and optimal drying conditions. The optimal conditions of simultaneously optimized responses were a temperature of 70.2 °C, pressure of 39 mbar and drying time of 8 h. It could be concluded that vacuum drying provides samples with good physico-chemical properties, similar to the lyophilized sample and better than the conventionally dried sample. Copyright © 2016 Elsevier Ltd. All rights reserved.
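A minimal response-surface sketch of the approach described above, assuming synthetic data: a second-order polynomial in temperature, pressure, and time is fitted by least squares and then optimised within the reported design ranges. It is not the authors' fitted model; all response values below are invented.

    # Minimal response-surface sketch, not the authors' fitted model: a
    # second-order polynomial in temperature (T), pressure (P) and time (t)
    # is fitted by least squares and then optimised within the design bounds.
    # The synthetic response values are assumptions for illustration only.
    import numpy as np
    from sklearn.preprocessing import PolynomialFeatures
    from sklearn.linear_model import LinearRegression
    from scipy.optimize import minimize

    rng = np.random.default_rng(1)
    # Box-Behnken-like design points within the reported ranges.
    T = rng.uniform(48, 78, 30)      # deg C
    P = rng.uniform(30, 330, 30)     # mbar
    t = rng.uniform(8, 16, 30)       # h
    X = np.column_stack([T, P, t])
    y = -(T - 70) ** 2 / 50 - (P - 40) ** 2 / 5000 - (t - 8) ** 2 + rng.normal(0, 0.3, 30)

    poly = PolynomialFeatures(degree=2, include_bias=False)
    model = LinearRegression().fit(poly.fit_transform(X), y)

    def neg_response(x):
        # Negative predicted response, so minimising it maximises the response.
        return -model.predict(poly.transform(x.reshape(1, -1)))[0]

    res = minimize(neg_response, x0=[60, 180, 12],
                   bounds=[(48, 78), (30, 330), (8, 16)])
    print("Predicted optimum (T, P, t):", np.round(res.x, 1))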
Coetzee, Tanya; Hoffmann, Willem A; de Roubaix, Malcolm
2015-10-01
The amended research ethics policy at a South African University required the ethics review of undergraduate research projects, prompting the need to explore the content and teaching approach of research ethics education in health science undergraduate programs. Two qualitative data collection strategies were used: document analysis (syllabi and study guides) and semi-structured interviews with research methodology coordinators. Five main themes emerged: (a) timing of research ethics courses, (b) research ethics course content, (c) sub-optimal use of creative classroom activities to facilitate research ethics lectures, (d) understanding the need for undergraduate project research ethics review, and (e) research ethics capacity training for research methodology lecturers and undergraduate project supervisors. © The Author(s) 2015.
ERIC Educational Resources Information Center
Damian, Radu; Grifoll, Josep; Rigbers, Anke
2015-01-01
In this paper the current national legislations, the quality assurance approaches and the activities of impact analysis of three quality assurance agencies from Romania, Spain and Germany are described from a strategic perspective. The analysis shows that the general methodologies (comprising, for example, self-evaluation reports, peer reviews,…
Zanetti-Polzi, Laura; Corni, Stefano; Daidone, Isabella; Amadei, Andrea
2016-07-21
Here, a methodology is proposed to investigate the collective fluctuation modes of an arbitrary set of observables, maximally contributing to the fluctuation of another functionally relevant observable. The methodology, based on the analysis of fully classical molecular dynamics (MD) simulations, exploits the essential dynamics (ED) method, originally developed to analyse the collective motions in proteins. We apply this methodology to identify the residues that are more relevant for determining the reduction potential (E(0)) of a redox-active protein. To this aim, the fluctuation modes of the single-residue electrostatic potentials mostly contributing to the fluctuations of the total electrostatic potential (the main determinant of E(0)) are investigated for wild-type azurin and two of its mutants with a higher E(0). By comparing the results here obtained with a previous study on the same systems [Zanetti-Polzi et al., Org. Biomol. Chem., 2015, 13, 11003] we show that the proposed methodology is able to identify the key sites that determine E(0). This information can be used for a general deeper understanding of the molecular mechanisms on the basis of the redox properties of the proteins under investigation, as well as for the rational design of mutants with a higher or lower E(0). From the results of the present analysis we propose a new azurin mutant that, according to our calculations, shows a further increase of E(0).
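The following sketch illustrates the underlying idea, not the authors' exact essential-dynamics procedure: per-residue electrostatic potentials along a trajectory are ranked by how strongly their fluctuations covary with the fluctuation of the total potential. The trajectory values are synthetic assumptions.

    # Simple sketch of the underlying idea (not the authors' exact ED method):
    # rank per-residue electrostatic potentials by how strongly their
    # fluctuations covary with the fluctuation of the total potential.
    # The trajectory values are synthetic assumptions.
    import numpy as np

    rng = np.random.default_rng(2)
    n_frames, n_residues = 5000, 120
    # Hypothetical per-residue potentials along an MD trajectory (frames x residues).
    v = rng.normal(0.0, 1.0, size=(n_frames, n_residues))
    v[:, 10] += 2.0 * rng.normal(size=n_frames)    # make a few residues dominate
    v[:, 47] += 1.5 * rng.normal(size=n_frames)
    total = v.sum(axis=1)                          # total potential per frame

    dv = v - v.mean(axis=0)
    dtot = total - total.mean()
    # Covariance of each residue potential with the total: its share of Var(total).
    contrib = (dv * dtot[:, None]).mean(axis=0)
    share = contrib / dtot.var()
    top = np.argsort(share)[::-1][:5]
    print("Residues contributing most to the fluctuation of the total potential:", top)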
77 FR 68773 - FIFRA Scientific Advisory Panel; Notice of Public Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-16
... for physical chemical properties that cannot be easily tested in in vitro systems or stable enough for.... Quantitative structural-activity relationship (QSAR) models and estrogen receptor (ER) expert systems development. High-throughput data generation and analysis (expertise focused on how this methodology can be...
Mapping Sustainability Efforts at the Claremont Colleges
ERIC Educational Resources Information Center
Srebotnjak, Tanja; Norgaard, Lee Michelle
2017-01-01
Purpose: The purpose of this study is to map and analyze sustainability activities and relationships at the seven Claremont Colleges and graduate institutions using social network analysis (SNA) to inform sustainability planning and programming. Design/methodology/approach: Online surveys and interviews were conducted among faculty, staff and…
Theorising Critical HRD: A Paradox of Intricacy and Discrepancy
ERIC Educational Resources Information Center
Trehan, Kiran; Rigg, Clare
2011-01-01
Purpose: This paper aims to advance theoretical understanding of the concept of "critical human resource development". Design/methodology/approach: This is a conceptual paper. Findings: Foregrounding questions of power, emotions and political dynamics within the analysis of organisational learning and development activity, critical approaches in…
NMR characterization of polymers: Review and update
USDA-ARS?s Scientific Manuscript database
NMR spectroscopy is a major technique for the characterization and analysis of polymers. A large number of methodologies have been developed in both the liquid and the solid state, and the literature has grown considerably (1-5). The field now covers a broad spectrum of activities, including polym...
Task analysis of autonomous on-road driving
NASA Astrophysics Data System (ADS)
Barbera, Anthony J.; Horst, John A.; Schlenoff, Craig I.; Aha, David W.
2004-12-01
The Real-time Control System (RCS) Methodology has evolved over a number of years as a technique to capture task knowledge and organize it into a framework conducive to implementation in computer control systems. The fundamental premise of this methodology is that the present state of the task activities sets the context that identifies the requirements for all of the support processing. In particular, the task context at any time determines what is to be sensed in the world, what world model states are to be evaluated, which situations are to be analyzed, what plans should be invoked, and which behavior generation knowledge is to be accessed. This methodology concentrates on the task behaviors explored through scenario examples to define a task decomposition tree that clearly represents the branching of tasks into layers of simpler and simpler subtask activities. There is a named branching condition/situation identified for every fork of this task tree. These become the input conditions of the if-then rules of the knowledge set that define how the task is to respond to input state changes. Detailed analysis of each branching condition/situation is used to identify antecedent world states and these, in turn, are further analyzed to identify all of the entities, objects, and attributes that have to be sensed to determine if any of these world states exist. This paper explores the use of this 4D/RCS methodology in some detail for the particular task of autonomous on-road driving, which work was funded under the Defense Advanced Research Project Agency (DARPA) Mobile Autonomous Robot Software (MARS) effort (Doug Gage, Program Manager).
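A hedged sketch of the rule structure described above: named branching conditions of a task tree are encoded as if-then rules that select the next subtask from the sensed world state. The situation names, tests, and subtasks are hypothetical and are not taken from the 4D/RCS reference implementation.

    # Hedged sketch of encoding named branching conditions of a task tree as
    # if-then rules that select the next subtask; the situation names and
    # subtasks below are hypothetical, not taken from 4D/RCS code.
    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class Rule:
        situation: str                       # named branching condition
        test: Callable[[dict], bool]         # evaluates sensed world states
        subtask: str                         # behavior to invoke

    RULES: List[Rule] = [
        Rule("vehicle_ahead_too_close", lambda w: w["gap_m"] < 20.0, "FollowAtSafeDistance"),
        Rule("intersection_stop_sign",  lambda w: w["stop_sign"],     "StopAtIntersection"),
        Rule("lane_clear",              lambda w: True,               "DriveInLane"),
    ]

    def select_subtask(world_state: dict) -> str:
        """Return the first subtask whose branching condition holds."""
        for rule in RULES:
            if rule.test(world_state):
                return rule.subtask
        return "Idle"

    print(select_subtask({"gap_m": 12.0, "stop_sign": False}))  # FollowAtSafeDistance
    print(select_subtask({"gap_m": 45.0, "stop_sign": True}))   # StopAtIntersection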
Zhuang, Yong-liang; Zhao, Xue; Li, Ba-fang
2009-08-01
To optimize the hydrolysis conditions to prepare hydrolysates of jellyfish umbrella collagen with the highest hydroxyl radical scavenging activity, collagen extracted from jellyfish umbrella was hydrolyzed with trypsin, and response surface methodology (RSM) was applied. The optimum conditions obtained from experiments were pH 7.75, temperature (T) 48.77 degrees C, and enzyme-to-substrate ratio ([E]/[S]) 3.50%. The analysis of variance in RSM showed that pH and [E]/[S] were important factors that significantly affected the process (P<0.05 and P<0.01, respectively). The hydrolysates of jellyfish umbrella collagen were fractionated by high performance liquid chromatography (HPLC), and three fractions (HF-1>3000 Da, 1000 Da
CACDA JIFFY III War Game. Volume II. Methodology
1980-09-01
Technical Report TR 6-80, September 1980, US Army Combined Arms Studies and Analysis Activity, Fort Leavenworth, Kansas (distribution unlimited). The report documents a manual war game developed and operated at the USATRADOC Combined Arms Combat Developments Activity (CACDA), Fort Leavenworth, Kansas, for scenario
Tošić, Tamara; Sellers, Kristin K; Fröhlich, Flavio; Fedotenkova, Mariia; Beim Graben, Peter; Hutt, Axel
2015-01-01
For decades, research in neuroscience has supported the hypothesis that brain dynamics exhibits recurrent metastable states connected by transients, which together encode fundamental neural information processing. To understand the system's dynamics it is important to detect such recurrence domains, but it is challenging to extract them from experimental neuroscience datasets due to the large trial-to-trial variability. The proposed methodology extracts recurrent metastable states in univariate time series by transforming datasets into their time-frequency representations and computing recurrence plots based on instantaneous spectral power values in various frequency bands. Additionally, a new statistical inference analysis compares different trial recurrence plots with corresponding surrogates to obtain statistically significant recurrent structures. This combination of methods is validated by applying it to two artificial datasets. In a final study of visually-evoked Local Field Potentials in partially anesthetized ferrets, the methodology is able to reveal recurrence structures of neural responses with trial-to-trial variability. Focusing on different frequency bands, the δ-band activity is much less recurrent than α-band activity. Moreover, α-activity is susceptible to pre-stimuli, while δ-activity is much less sensitive to pre-stimuli. This difference in recurrence structures in different frequency bands indicates diverse underlying information processing steps in the brain.
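A minimal sketch of the recurrence-plot idea applied to band-limited spectral power, assuming a synthetic signal; the surrogate-based significance test described in the abstract is omitted, and the band limits and recurrence threshold are assumptions.

    # Minimal sketch of the recurrence-plot idea applied to band-limited
    # spectral power (surrogate-based significance testing is omitted).
    # Signal, band limits and the recurrence threshold are assumptions.
    import numpy as np
    from scipy.signal import stft

    fs = 250.0                                   # sampling rate (Hz)
    t = np.arange(0, 10, 1 / fs)
    x = np.sin(2 * np.pi * 10 * t) * (t % 3 < 1.5) \
        + 0.3 * np.random.default_rng(3).normal(size=t.size)

    f, tt, Z = stft(x, fs=fs, nperseg=256)       # time-frequency representation
    band = (f >= 8) & (f <= 12)                  # alpha-like band
    power = np.abs(Z[band]).mean(axis=0)         # instantaneous band power

    # Recurrence plot: pairs of time points whose band power is similar.
    d = np.abs(power[:, None] - power[None, :])
    eps = np.percentile(d, 10)                   # recurrence threshold (10th percentile)
    R = (d < eps).astype(int)
    print("Recurrence rate:", R.mean().round(3))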
NASA Astrophysics Data System (ADS)
Paik, Seung Hoon; Kim, Ji Yeon; Shin, Sang Joon; Kim, Seung Jo
2004-07-01
Smart structures incorporating active materials have been designed and analyzed to improve aerospace vehicle performance and its vibration/noise characteristics. Helicopter integral blade actuation is one example of those efforts using embedded anisotropic piezoelectric actuators. To design and analyze such integrally-actuated blades, a beam approach based on homogenization methodology has traditionally been used. Using this approach, the global behavior of the structures is predicted in an averaged sense. However, this approach has intrinsic limitations in describing the local behaviors at the level of the constituents. For example, the failure analysis of the individual active fibers requires knowledge of the local behaviors. A microscopic approach for the analysis of integrally-actuated structures is established in this paper. Piezoelectric fibers and matrices are modeled individually and the finite element method using three-dimensional solid elements is adopted. Due to the huge size of the resulting finite element meshes, high performance computing technology is required in the solution process. The present methodology is referred to as Direct Numerical Simulation (DNS) of the smart structure. As an initial validation effort, the present analytical results are correlated with experiments from a small-scaled integrally-actuated blade, the Active Twist Rotor (ATR). Through DNS, the local stress distribution around the interface of fiber and matrix can be analyzed.
Yeşiller, Gülden; Sezgintürk, Mustafa Kemal
2015-11-10
In this research, a novel enzyme activity analysis methodology is introduced as a new perspective for this area. The activity of elastase, a digestive enzyme mostly found in the digestive system of vertebrates, was determined by an electrochemical device composed of carbon nanotubes and a second enzyme, glucose oxidase, which was used as a signal generator enzyme. In this novel methodology, a complex bioactive layer was constructed by using carbon nanotubes, glucose oxidase and a supporting protein, gelatin, on a solid, conductive substrate. The activity of elastase was determined by monitoring the rate of hydrolysis by elastase in the bioactive layer. As a result of this hydrolysis by elastase, glucose oxidase was dissociated from the bioactive layer, and following this the electrochemical signal due to glucose oxidase decreased. The progressive elastase-catalyzed digestion of the bioactive layer containing glucose oxidase decreased the layer's enzymatic efficiency, resulting in a decrease of the glucose oxidation current as a function of the enzyme activity. The ratio of the decrease was correlated to the elastase activity level. In this study, optimization experiments of the bioactive components and characterization of the resulting new electrochemical device were carried out. A linear calibration range from 0.0303 U/mL to 0.0729 U/mL of elastase was reported. Real sample analyses were also carried out with the new electrochemical device. Copyright © 2015 Elsevier B.V. All rights reserved.
Scrutinizing UML Activity Diagrams
NASA Astrophysics Data System (ADS)
Al-Fedaghi, Sabah
Building an information system involves two processes: conceptual modeling of the “real world domain” and designing the software system. Object-oriented methods and languages (e.g., UML) are typically used for describing the software system. For the system analysis process that produces the conceptual description, object-oriented techniques or semantics extensions are utilized. Specifically, UML activity diagrams are the “flow charts” of object-oriented conceptualization tools. This chapter proposes an alternative to UML activity diagrams through the development of a conceptual modeling methodology based on the notion of flow.
NASA Technical Reports Server (NTRS)
Green, Scott; Kouchakdjian, Ara; Basili, Victor; Weidow, David
1990-01-01
This case study analyzes the application of the cleanroom software development methodology to the development of production software at the NASA/Goddard Space Flight Center. The cleanroom methodology emphasizes human discipline in program verification to produce reliable software products that are right the first time. Preliminary analysis of the cleanroom case study shows that the method can be applied successfully in the FDD environment and may increase staff productivity and product quality. Compared to typical Software Engineering Laboratory (SEL) activities, there is evidence of lower failure rates, a more complete and consistent set of inline code documentation, a different distribution of phase effort activity, and a different growth profile in terms of lines of code developed. The major goals of the study were to: (1) assess the process used in the SEL cleanroom model with respect to team structure, team activities, and effort distribution; (2) analyze the products of the SEL cleanroom model and determine the impact on measures of interest, including reliability, productivity, overall life-cycle cost, and software quality; and (3) analyze the residual products in the application of the SEL cleanroom model, such as fault distribution, error characteristics, system growth, and computer usage.
NASA Astrophysics Data System (ADS)
Swain, Sushree Diptimayee; Ray, Pravat Kumar; Mohanty, K. B.
2016-06-01
This research paper presents the design of a shunt Passive Power Filter (PPF) in a Hybrid Series Active Power Filter (HSAPF) that employs a novel analytic methodology which is superior to FFT analysis. This novel approach consists of the estimation, detection and classification of the signals. The proposed method is applied to estimate, detect and classify power quality (PQ) disturbances such as harmonics. The proposed work deals with three methods: harmonic detection through the wavelet transform method, harmonic estimation by a Kalman filter algorithm, and harmonic classification by a decision tree method. Among the different types of mother wavelets in the wavelet transform method, db8 is selected as the most suitable mother wavelet because of its strength on transient response and low oscillation in the frequency domain. In the harmonic compensation process, the detected harmonic is compensated through the Hybrid Series Active Power Filter (HSAPF) based on Instantaneous Reactive Power Theory (IRPT). The efficacy of the proposed method is verified in the MATLAB/SIMULINK environment as well as with an experimental setup. The obtained results confirm the superiority of the proposed methodology over FFT analysis. This newly proposed PPF is used to make the conventional HSAPF more robust and stable.
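The sketch below illustrates two of the three stages named above (db8 wavelet-based detection features and decision-tree classification) on synthetic current waveforms; the Kalman-filter estimation stage is omitted, and all signals, features, and parameters are assumptions rather than the authors' settings.

    # Hedged sketch of two of the three stages (db8 wavelet detection and
    # decision-tree classification); the Kalman-filter estimation stage is
    # omitted. Signals and features are synthetic assumptions.
    import numpy as np
    import pywt
    from sklearn.tree import DecisionTreeClassifier

    fs = 3200.0
    t = np.arange(0, 0.2, 1 / fs)
    rng = np.random.default_rng(4)

    def make_signal(harmonic: bool) -> np.ndarray:
        x = np.sin(2 * np.pi * 50 * t)
        if harmonic:                              # add 5th and 7th harmonics
            x += 0.2 * np.sin(2 * np.pi * 250 * t) + 0.1 * np.sin(2 * np.pi * 350 * t)
        return x + 0.02 * rng.normal(size=t.size)

    def wavelet_energy_features(x: np.ndarray) -> np.ndarray:
        coeffs = pywt.wavedec(x, "db8", level=4)  # db8 multilevel decomposition
        return np.array([np.sum(c ** 2) for c in coeffs])

    X = np.array([wavelet_energy_features(make_signal(bool(h))) for h in [0, 1] * 40])
    y = np.array([0, 1] * 40)                     # 1 = harmonic disturbance present

    clf = DecisionTreeClassifier(max_depth=3).fit(X, y)
    print("Training accuracy:", clf.score(X, y))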
Concepts of Connectivity and Human Epileptic Activity
Lemieux, Louis; Daunizeau, Jean; Walker, Matthew C.
2011-01-01
This review attempts to place the concept of connectivity from increasingly sophisticated neuroimaging data analysis methodologies within the field of epilepsy research. We introduce the more principled connectivity terminology developed recently in neuroimaging and review some of the key concepts related to the characterization of propagation of epileptic activity using what may be called traditional correlation-based studies based on EEG. We then show how essentially similar methodologies, and more recently models addressing causality, have been used to characterize whole-brain and regional networks using functional MRI data. Following a discussion of our current understanding of the neuronal system aspects of the onset and propagation of epileptic discharges and seizures, we discuss the most advanced and ambitious framework to attempt to fully characterize epileptic networks based on neuroimaging data. PMID:21472027
Dvornikov, M V; Medenkov, A A
2015-04-01
In the current paper the authors discuss problems of marine and aerospace medicine and psychophysiology addressed by Georgii Zarakovskii (1925-2014), a prominent domestic expert in the field of military medicine, psychology and ergonomics. The authors focus on methodological approaches to, and results of, the study of psychophysiological characteristics and human capabilities taken into account for the design of tools and the organization of flight crews, astronauts and military experts. The authors mark his contribution to the creation of a system integrating the psychophysiological features and characteristics of the person necessary for the development, testing and maintenance of aerospace engineering and the organization of its professional activities. The possibilities of using the methodology of psychophysiological activity analysis in order to improve the psychophysiological reliability of military specialists are shown.
Avaliani, S L; Novikov, S M; Shashina, T A; Dodina, N S; Kislitsin, V A; Mishina, A L
2014-01-01
The lack of an adequate legislative and regulatory framework for ensuring minimization of health risks in the field of environmental protection is an obstacle to the application of the risk analysis methodology as a leading tool for administrative activity in Russia. "Principles of the state policy in the sphere of ensuring chemical and biological safety of the Russian Federation for the period up to 2025 and beyond", approved by the President of the Russian Federation on 01 November 2013, No PR-2573, are aimed at legal support for the health risk analysis methodology. The article proposes the main stages of operative control of environmental quality that lead to the reduction of health risk to an acceptable level. The further improvement of the health risk analysis methodology in Russia should contribute to the implementation of the state policy in the sphere of chemical and biological safety through the introduction of complex measures for the neutralization of chemical and biological threats to human health and the environment, as well as evaluation of the economic effectiveness of these measures. The primary step should be legislative securing of a quantitative value for the term "acceptable risk".
Gonzálvez, A; Armenta, S; De La Guardia, M
2008-01-01
A methodology based on inductively coupled plasma optical emission spectrometry (ICP-OES) after microwave-assisted acid digestion was developed to determine the content of trace elements in curry samples from the Spanish market. The methodology was validated in terms of accuracy by the analysis of citrus and tomato leaf reference materials, achieving results comparable with the certified values. The trace metal content of the curry samples was compared with data available from previously published reports concerning Indian samples, especially in terms of heavy metal composition, in order to guarantee the quality of the commercially available spices in the European countries. Values found for the analysis of arsenic, lead and cadmium were significantly lower than the maximum limits allowed by European Union statutory limits for heavy metals and lower than those obtained for Indian curry leaves reported by Indian research teams using neutron activation and γ-ray analysis.
NASA Astrophysics Data System (ADS)
Fusaro, Roberta; Viola, Nicole; Fenoglio, Franco; Santoro, Francesco
2017-03-01
This paper proposes a methodology to derive architectures and operational concepts for future earth-to-orbit and sub-orbital transportation systems. In particular, at first, it describes the activity flow, methods, and tools leading to the generation of a wide range of alternative solutions to meet the established goal. Subsequently, the methodology allows selecting a small number of feasible options among which the optimal solution can be found. For the sake of clarity, the first part of the paper describes the methodology from a theoretical point of view, while the second part proposes the selection of mission concepts and of a proper transportation system aimed at sub-orbital parabolic flights. Starting from a detailed analysis of the stakeholders and their needs, the major objectives of the mission have been derived. Then, following a system engineering approach, functional analysis tools as well as concept of operations techniques allowed the generation of a very high number of possible ways to accomplish the envisaged goals. After a preliminary pruning activity, aimed at defining the feasibility of these concepts, more detailed analyses have been carried out. Proceeding through the procedure, the designer should move from qualitative to quantitative evaluations, and for this reason a purpose-built mission simulation software tool has been exploited to support the trade-off analysis. This support tool aims at estimating major mission drivers (mass, heat loads, manoeuvrability, earth visibility, and volumetric efficiency) as well as proving the feasibility of the concepts. Other crucial and multi-domain mission drivers, such as complexity, innovation level, and safety, have been evaluated through other appropriate analyses. Eventually, one single mission concept has been selected and detailed in terms of layout, systems, and sub-systems, also highlighting logistic, safety, and maintainability aspects.
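An illustrative weighted trade-off sketch of the kind of concept ranking described above, scoring candidate concepts against the mission drivers named in the text; the concept names, weights, and scores are assumptions, not the paper's actual figures.

    # Illustrative weighted trade-off sketch (not the paper's actual figures):
    # candidate concepts are scored against the mission drivers named in the
    # text; all weights and scores below are assumptions.
    drivers = ["mass", "heat_loads", "manoeuvrability", "earth_visibility", "volumetric_efficiency"]
    weights = {"mass": 0.3, "heat_loads": 0.2, "manoeuvrability": 0.2,
               "earth_visibility": 0.15, "volumetric_efficiency": 0.15}

    concepts = {                       # normalised scores in [0, 1], higher is better
        "winged_vehicle": {"mass": 0.6, "heat_loads": 0.5, "manoeuvrability": 0.9,
                           "earth_visibility": 0.8, "volumetric_efficiency": 0.5},
        "capsule":        {"mass": 0.8, "heat_loads": 0.8, "manoeuvrability": 0.4,
                           "earth_visibility": 0.5, "volumetric_efficiency": 0.7},
    }

    def weighted_score(scores: dict) -> float:
        # Simple weighted sum over all mission drivers.
        return sum(weights[d] * scores[d] for d in drivers)

    ranked = sorted(concepts, key=lambda c: weighted_score(concepts[c]), reverse=True)
    for name in ranked:
        print(f"{name}: {weighted_score(concepts[name]):.2f}")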
ERIC Educational Resources Information Center
Aiello, Angelo; And Others
1986-01-01
A form is presented for language teacher self-evaluation concerning attitudes and knowledge about learning theories, general linguistics, sociolinguistics, pragmatics, discourse analysis, teaching methodology, the communicative approach, class activities, class management, instructional support, and evaluation. (MSE)
A Methodological Approach for Training Analysts of Small Business Problems.
ERIC Educational Resources Information Center
Mackness, J. R.
1986-01-01
Steps in a small business analysis are discussed: understand how company activities interact internally and with markets and suppliers; know the relative importance of controllable management variables; understand the social atmosphere within the company; analyze the operations of the company; define main problem areas; identify possible actions…
Activity Theory and Qualitative Research in Digital Domains
ERIC Educational Resources Information Center
Sam, Cecile
2012-01-01
Understanding the interactions between people, computer-mediated communication, and online life requires that researchers appropriate a set of methodological tools that would be best suited for capturing and analyzing the phenomenon. However, these tools are not limited to relevant technological forms of data collections and analysis programs; it…
Classroom Social Signal Analysis
ERIC Educational Resources Information Center
Raca, Mirko; Dillenbourg, Pierre
2014-01-01
We present our efforts towards building an observational system for measuring classroom activity. The goal is to explore visual cues which can be acquired with a system of video cameras and automatically processed to enrich the teacher's perception of the audience. The paper will give a brief overview of our methodology, explored features, and…
Human/Automation Trade Methodology for the Moon, Mars and Beyond
NASA Technical Reports Server (NTRS)
Korsmeyer, David J.
2009-01-01
It is possible to create a consistent trade methodology that can characterize operations model alternatives for crewed exploration missions. For example, a trade-space organized around the objective of maximizing Crew Exploration Vehicle (CEV) independence would take as input a classification of the category of analysis to be conducted or decision to be made, and a commitment to a specific point in a mission profile during which the analysis or decision is to be made. For example, does the decision have to do with crew activity planning, or life support? Is the mission phase trans-Earth injection, cruise, or lunar descent? Different kinds of decision analysis of the trade-space between human and automated decisions will occur at different points in a mission's profile. The necessary objectives at a given point in time during a mission will call for different kinds of response with respect to where and how computers and automation are expected to help provide an accurate, safe, and timely response. In this paper, a consistent methodology for assessing the trades between human and automated decisions on-board is presented and various examples discussed.
CAA Annual Report Fiscal Year 1998.
1998-12-01
Contents include studies and quick reaction analyses and projects; technology research and analysis support (technology research, methodology research); publications, graphics, and reproduction; analytical efforts completed between FY90 and FY98; and an appendix on annual study and work evaluation. Chapter 2 highlights major studies and analysis activities which occurred in FY98. Chapter 3 is the total package of analytical summaries
Afshari, Kasra; Samavati, Vahid; Shahidi, Seyed-Ahmad
2015-03-01
The effects of ultrasonic power, extraction time, extraction temperature, and the water-to-raw material ratio on the extraction yield of crude polysaccharide from the leaf of Hibiscus rosa-sinensis (HRLP) were optimized by statistical analysis using response surface methodology. The response surface methodology (RSM) was used to optimize HRLP extraction yield by implementing the Box-Behnken design (BBD). The experimental data obtained were fitted to a second-order polynomial equation using multiple regression analysis and also analyzed by appropriate statistical methods (ANOVA). Analysis of the results showed that the linear and quadratic terms of these four variables had significant effects. The optimal conditions for the highest extraction yield of HRLP were: ultrasonic power, 93.59 W; extraction time, 25.71 min; extraction temperature, 93.18°C; and the water-to-raw material ratio, 24.3 mL/g. Under these conditions, the experimental yield was 9.66±0.18%, which is in close agreement with the value of 9.526% predicted by the model. The results demonstrated that HRLP had strong scavenging activities in vitro on DPPH and hydroxyl radicals. Copyright © 2014 Elsevier B.V. All rights reserved.
Manganelli, Joe; Threatt, Anthony; Brooks, Johnell O; Healy, Stan; Merino, Jessica; Yanik, Paul; Walker, Ian; Green, Keith
2014-01-01
This article presents the results of a qualitative study that confirmed, classified, and prioritized user needs for the design of a more useful, usable, and actively assistive over-the-bed table. Manganelli et al. (2014) generated a list of 74 needs for use in developing an actively assistive over-the-bed table. This present study assesses the value and importance of those needs. Fourteen healthcare subject matter experts and eight research and design subject matter experts engaged in a participatory and iterative research and design process. A mixed methods qualitative approach used methodological triangulation to confirm the value of the findings and ratings to establish importance. Open and closed card sorts and a Delphi study were used. Data analysis methods included frequency analysis, content analysis, and a modified Kano analysis. A table demonstrating the needs that are of high importance to both groups of subject matter experts and classification of the design challenges each represents was produced. Through this process, the list of 74 needs was refined to the 37 most important need statements for both groups. Designing a more useful, usable, and actively assistive over-the-bed table is primarily about the ability to position it optimally with respect to the user for any task, as well as improving ease of use and usability. It is also important to make explicit and discuss the differences in priorities and perspectives demonstrated between research and design teams and their clients. © 2014 Vendome Group, LLC.
NASA standard: Trend analysis techniques
NASA Technical Reports Server (NTRS)
1990-01-01
Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
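A minimal sketch fitting the three model families named in the standard (linear, quadratic, exponential) to a synthetic time series and comparing their residuals; the data and the goodness-of-fit comparison are illustrative assumptions.

    # Minimal sketch of the model families named in the standard (linear,
    # quadratic, exponential) fitted to a synthetic time series; the data
    # and the goodness-of-fit comparison are illustrative assumptions.
    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(5)
    t = np.arange(24, dtype=float)                 # e.g., months
    y = 5.0 * np.exp(0.08 * t) + rng.normal(0, 1.0, t.size)

    lin = np.polyfit(t, y, 1)                      # linear trend
    quad = np.polyfit(t, y, 2)                     # quadratic trend
    exp_params, _ = curve_fit(lambda x, a, b: a * np.exp(b * x), t, y, p0=(1.0, 0.05))

    def sse(pred):                                 # sum of squared errors
        return float(np.sum((y - pred) ** 2))

    print("linear SSE:     ", round(sse(np.polyval(lin, t)), 1))
    print("quadratic SSE:  ", round(sse(np.polyval(quad, t)), 1))
    print("exponential SSE:", round(sse(exp_params[0] * np.exp(exp_params[1] * t)), 1))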
Prasse, Carsten; Stalter, Daniel; Schulte-Oehlmann, Ulrike; Oehlmann, Jörg; Ternes, Thomas A
2015-12-15
The knowledge we have gained in recent years on the presence and effects of compounds discharged by wastewater treatment plants (WWTPs) brings us to a point where we must question the appropriateness of current water quality evaluation methodologies. An increasing number of anthropogenic chemicals is detected in treated wastewater and there is increasing evidence of adverse environmental effects related to WWTP discharges. It has thus become clear that new strategies are needed to assess overall quality of conventional and advanced treated wastewaters. There is an urgent need for multidisciplinary approaches combining expertise from engineering, analytical and environmental chemistry, (eco)toxicology, and microbiology. This review summarizes the current approaches used to assess treated wastewater quality from the chemical and ecotoxicological perspective. Discussed chemical approaches include target, non-target and suspect analysis, sum parameters, identification and monitoring of transformation products, computational modeling as well as effect directed analysis and toxicity identification evaluation. The discussed ecotoxicological methodologies encompass in vitro testing (cytotoxicity, genotoxicity, mutagenicity, endocrine disruption, adaptive stress response activation, toxicogenomics) and in vivo tests (single and multi species, biomonitoring). We critically discuss the benefits and limitations of the different methodologies reviewed. Additionally, we provide an overview of the current state of research regarding the chemical and ecotoxicological evaluation of conventional as well as the most widely used advanced wastewater treatment technologies, i.e., ozonation, advanced oxidation processes, chlorination, activated carbon, and membrane filtration. In particular, possible directions for future research activities in this area are provided. Copyright © 2015 Elsevier Ltd. All rights reserved.
Hahn, K D; Cooper, G W; Ruiz, C L; Fehl, D L; Chandler, G A; Knapp, P F; Leeper, R J; Nelson, A J; Smelser, R M; Torres, J A
2014-04-01
We present a general methodology to determine the diagnostic sensitivity that is directly applicable to neutron-activation diagnostics fielded on a wide variety of neutron-producing experiments, which include inertial-confinement fusion (ICF), dense plasma focus, and ion beam-driven concepts. This approach includes a combination of several effects: (1) non-isotropic neutron emission; (2) the 1/r² decrease in neutron fluence in the activation material; (3) the spatially distributed neutron scattering, attenuation, and energy losses due to the fielding environment and activation material itself; and (4) temporally varying neutron emission. As an example, we describe the copper-activation diagnostic used to measure secondary deuterium-tritium fusion-neutron yields on ICF experiments conducted on the pulsed-power Z Accelerator at Sandia National Laboratories. Using this methodology along with results from absolute calibrations and Monte Carlo simulations, we find that for the diagnostic configuration on Z, the diagnostic sensitivity is 0.037% ± 17% counts/neutron per cm² and is ∼40% less sensitive than it would be in an ideal geometry due to neutron attenuation, scattering, and energy-loss effects.
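A back-of-the-envelope illustration of how a sensitivity expressed in counts per unit neutron fluence converts measured counts into a fluence and, assuming isotropic emission, a total yield. The counts, distance, and the isotropy assumption are placeholders, not the calibrated configuration of the Z copper-activation diagnostic.

    # Back-of-the-envelope sketch only: converting measured activation counts
    # into a neutron fluence and (assuming isotropic emission) a total yield.
    # The sensitivity, distance and counts below are placeholder assumptions,
    # not the calibrated values for the Z copper-activation diagnostic.
    import math

    sensitivity = 3.7e-4        # counts per (neutron / cm^2), i.e. 0.037%
    counts = 1.2e4              # hypothetical measured activation counts
    r_cm = 400.0                # hypothetical source-to-sample distance (cm)

    fluence = counts / sensitivity                     # neutrons per cm^2 at the sample
    yield_isotropic = fluence * 4.0 * math.pi * r_cm ** 2
    print(f"fluence ~ {fluence:.2e} n/cm^2, inferred yield ~ {yield_isotropic:.2e} n")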
Fair, Damien A; Choi, Alexander H; Dosenbach, Yannic B L; Coalson, Rebecca S; Miezin, Francis M; Petersen, Steven E; Schlaggar, Bradley L
2010-08-01
Children with congenital left hemisphere damage due to perinatal stroke are capable of acquiring relatively normal language functions despite experiencing a cortical insult that in adults often leads to devastating lifetime disabilities. Although this observed phenomenon is accepted, its neurobiological mechanisms are not well characterized. In this paper we examined the functional neuroanatomy of lexical processing in 13 children/adolescents with perinatal left hemispheric damage. In contrast to many previous perinatal infarct fMRI studies, we used an event-related design, which allowed us to isolate trial-related activity and examine correct and error trials separately. Using both group and single subject analysis techniques we attempt to address several methodological factors that may contribute to some discrepancies in the perinatal lesion literature. These methodological factors include making direct statistical comparisons, using common stereotactic space, using both single subject and group analyses, and accounting for performance differences. Our group analysis, investigating correct trial-related activity (separately from error trials), showed very few statistical differences in the non-involved right hemisphere between patients and performance matched controls. The single subject analysis revealed atypical regional activation patterns in several patients; however, the location of these regions identified in individual patients often varied across subjects. These results are consistent with the idea that alternative functional organization of trial-related activity after left hemisphere lesions is in large part unique to the individual. In addition, reported differences between results obtained with event-related designs and blocked designs may suggest diverging organizing principles for sustained and trial-related activity after early childhood brain injuries. 2009 Elsevier Inc. All rights reserved.
The economics of project analysis: Optimal investment criteria and methods of study
NASA Technical Reports Server (NTRS)
Scriven, M. C.
1979-01-01
Insight is provided toward the development of an optimal program for investment analysis of project proposals offering commercial potential and its components. This involves a critique of economic investment criteria viewed in relation to the requirements of engineering economy analysis. An outline for a systems approach to project analysis is given. Application of the Leontief input-output methodology to the analysis of projects involving multiple processes and products is investigated. Effective application of elements of neoclassical economic theory to investment analysis of project components is demonstrated. Patterns of both static and dynamic activity levels are incorporated.
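A short sketch of the Leontief input-output relation x = (I - A)^-1 d underlying the multi-process, multi-product analysis mentioned above; the technical-coefficient matrix and final-demand vector are illustrative assumptions.

    # Sketch of the Leontief input-output relation x = (I - A)^-1 d used for
    # multi-process, multi-product project analysis; the technical-coefficient
    # matrix A and final-demand vector d below are illustrative assumptions.
    import numpy as np

    A = np.array([[0.10, 0.25],      # inputs of product i needed per unit output of product j
                  [0.30, 0.05]])
    d = np.array([100.0, 50.0])      # final demand for each product

    x = np.linalg.solve(np.eye(2) - A, d)   # gross output needed to meet demand
    print("required gross outputs:", np.round(x, 1))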
Spatial pattern recognition of seismic events in South West Colombia
NASA Astrophysics Data System (ADS)
Benítez, Hernán D.; Flórez, Juan F.; Duque, Diana P.; Benavides, Alberto; Lucía Baquero, Olga; Quintero, Jiber
2013-09-01
Recognition of seismogenic zones in geographical regions supports seismic hazard studies. This recognition is usually based on visual, qualitative and subjective analysis of data. Spatial pattern recognition provides a well founded means to obtain relevant information from large amounts of data. The purpose of this work is to identify and classify spatial patterns in instrumental data of the South West Colombian seismic database. In this research, clustering tendency analysis validates whether the seismic database possesses a clustering structure. A non-supervised fuzzy clustering algorithm creates groups of seismic events. Given the sensitivity of fuzzy clustering algorithms to centroid initial positions, we proposed a methodology to initialize centroids that generates stable partitions with respect to centroid initialization. As a result of this work, a public software tool provides the user with the routines developed for the clustering methodology. The analysis of the seismogenic zones obtained reveals meaningful spatial patterns in South-West Colombia. The clustering analysis provides a quantitative location and dispersion of seismogenic zones that facilitates seismological interpretations of seismic activities in South West Colombia.
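A hedged sketch of checking partition stability with respect to centroid initialisation, using k-means as a stand-in for the non-supervised fuzzy clustering used in the paper; the synthetic epicentre coordinates and the agreement measure (adjusted Rand index) are assumptions.

    # Hedged sketch of checking partition stability with respect to centroid
    # initialisation, using k-means as a stand-in for the fuzzy clustering in
    # the paper; the synthetic epicentre coordinates are assumptions.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.metrics import adjusted_rand_score

    rng = np.random.default_rng(6)
    # Hypothetical event locations (lon, lat) drawn from three clusters.
    centers = np.array([[-76.5, 3.5], [-77.0, 1.8], [-75.8, 4.4]])
    X = np.vstack([c + 0.15 * rng.normal(size=(200, 2)) for c in centers])

    # One clustering run per random initialisation; stable partitions should agree.
    labels = [KMeans(n_clusters=3, n_init=1, random_state=s).fit_predict(X) for s in range(5)]
    scores = [adjusted_rand_score(labels[0], l) for l in labels[1:]]
    print("agreement with first partition (ARI):", np.round(scores, 3))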
Brain Activity Unique to Orgasm in Women: An fMRI Analysis.
Wise, Nan J; Frangos, Eleni; Komisaruk, Barry R
2017-11-01
Although the literature on imaging of regional brain activity during sexual arousal in women and men is extensive and largely consistent, that on orgasm is relatively limited and variable, owing in part to the methodologic challenges posed by variability in latency to orgasm in participants and head movement. To compare brain activity at orgasm (self- and partner-induced) with that at the onset of genital stimulation, immediately before the onset of orgasm, and immediately after the cessation of orgasm and to upgrade the methodology for obtaining and analyzing functional magnetic resonance imaging (fMRI) findings. Using fMRI, we sampled equivalent time points across female participants' variable durations of stimulation and orgasm in response to self- and partner-induced clitoral stimulation. The first 20-second epoch of orgasm was contrasted with the 20-second epochs at the beginning of stimulation and immediately before and after orgasm. Separate analyses were conducted for whole-brain and brainstem regions of interest. For a finer-grained analysis of the peri-orgasm phase, we conducted a time-course analysis on regions of interest. Head movement was minimized to a mean less than 1.3 mm using a custom-fitted thermoplastic whole-head and neck brace stabilizer. Ten women experienced orgasm elicited by self- and partner-induced genital stimulation in a Siemens 3-T Trio fMRI scanner. Brain activity gradually increased leading up to orgasm, peaked at orgasm, and then decreased. We found no evidence of deactivation of brain regions leading up to or during orgasm. The activated brain regions included sensory, motor, reward, frontal cortical, and brainstem regions (eg, nucleus accumbens, insula, anterior cingulate cortex, orbitofrontal cortex, operculum, right angular gyrus, paracentral lobule, cerebellum, hippocampus, amygdala, hypothalamus, ventral tegmental area, and dorsal raphe). Insight gained from the present findings could provide guidance toward a rational basis for treatment of orgasmic disorders, including anorgasmia. This is evidently the first fMRI study of orgasm elicited by self- and partner-induced genital stimulation in women. Methodologic solutions to the technical issues posed by excessive head movement and variable latencies to orgasm were successfully applied in the present study, enabling identification of brain regions involved in orgasm. Limitations include the small sample (N = 10), which combined self- and partner-induced stimulation datasets for analysis and which qualify the generalization of our conclusions. Extensive cortical, subcortical, and brainstem regions reach peak levels of activity at orgasm. Wise NJ, Frangos E, Komisaruk BR. Brain Activity Unique to Orgasm in Women: An fMRI Analysis. J Sex Med 2017;14:1380-1391. Copyright © 2017 International Society for Sexual Medicine. Published by Elsevier Inc. All rights reserved.
Edirs, Salamet; Turak, Ablajan; Numonov, Sodik; Xin, Xuelei; Aisa, Haji Akber
2017-01-01
By using extraction yield, total polyphenolic content, antidiabetic activities (PTP-1B and α-glycosidase), and antioxidant activity (ABTS and DPPH) as indicator markers, the extraction conditions of the prescription Kursi Wufarikun Ziyabit (KWZ) were optimized by response surface methodology (RSM). Independent variables were ethanol concentration, extraction temperature, solid-to-solvent ratio, and extraction time. The result of RSM analysis showed that the four variables investigated have a significant effect (p < 0.05) for Y1, Y2, Y3, Y4, and Y5, with R² values of 0.9120, 0.9793, 0.9076, 0.9125, and 0.9709, respectively. Optimal conditions for the highest extraction yield of 39.28%, PTP-1B inhibition rate of 86.21%, α-glycosidase enzyme inhibition rate of 96.56%, and ABTS inhibition rate of 77.38% were derived at ethanol concentration 50.11%, extraction temperature 72.06°C, solid-to-solvent ratio 1:22.73 g/mL, and extraction time 2.93 h. On the basis of a total polyphenol content of 48.44% under these optimal conditions, the effective part of KWZ was characterized quantitatively via a UPLC method; 12 main components were identified by standard compounds, all of them showed good regression within the test ranges, and their total content was 11.18%.
Neuroimaging of Human Balance Control: A Systematic Review
Wittenberg, Ellen; Thompson, Jessica; Nam, Chang S.; Franz, Jason R.
2017-01-01
This review examined 83 articles using neuroimaging modalities to investigate the neural correlates underlying static and dynamic human balance control, with the aim of supporting future mobile neuroimaging research in the balance control domain. Furthermore, this review analyzed the mobility of the neuroimaging hardware and research paradigms as well as the analytical methodology used to identify and remove movement artifact in the acquired brain signal. We found that the majority of static balance control tasks utilized mechanical perturbations to invoke feet-in-place responses (27 out of 38 studies), while cognitive dual-task conditions were commonly used to challenge balance in dynamic balance control tasks (20 out of 32 studies). While frequency analysis and event-related potential characteristics supported enhanced brain activation during static balance control, enhanced activation in dynamic balance control studies was supported by spatial and frequency analysis. Twenty-three of the 50 studies utilizing EEG used independent component analysis to remove movement artifacts from the acquired brain signals. Lastly, only eight studies used truly mobile neuroimaging hardware systems. This review provides evidence to support an increase in brain activation in balance control tasks, regardless of mechanical, cognitive, or sensory challenges. Furthermore, the current body of literature demonstrates the use of advanced signal processing methodologies to analyze brain activity during movement. However, the static nature of neuroimaging hardware and conventional balance control paradigms prevent full mobility and limit our knowledge of neural mechanisms underlying balance control. PMID:28443007
2016-06-01
... characteristics, experimental design techniques, and analysis methodologies that distinguish each phase of the MBSE MEASA. To ensure consistency... methodology. Experimental design selection, simulation analysis, and trade space analysis support the final two stages. Figure 27 segments the MBSE MEASA... rounding has the potential to increase the correlation between columns of the experimental design matrix. The design methodology presented in Vieira
Resting-State Functional Connectivity in the Infant Brain: Methods, Pitfalls, and Potentiality.
Mongerson, Chandler R L; Jennings, Russell W; Borsook, David; Becerra, Lino; Bajic, Dusica
2017-01-01
Early brain development is characterized by rapid growth and perpetual reconfiguration, driven by a dynamic milieu of heterogeneous processes. Postnatal brain plasticity is associated with increased vulnerability to environmental stimuli. However, little is known regarding the ontogeny and temporal manifestations of inter- and intra-regional functional connectivity that comprise functional brain networks. Resting-state functional magnetic resonance imaging (rs-fMRI) has emerged as a promising non-invasive neuroinvestigative tool, measuring spontaneous fluctuations in blood oxygen level dependent (BOLD) signal at rest that reflect baseline neuronal activity. Over the past decade, its application has expanded to infant populations providing unprecedented insight into functional organization of the developing brain, as well as early biomarkers of abnormal states. However, many methodological issues of rs-fMRI analysis need to be resolved prior to standardization of the technique to infant populations. As a primary goal, this methodological manuscript will (1) present a robust methodological protocol to extract and assess resting-state networks in early infancy using independent component analysis (ICA), such that investigators without previous knowledge in the field can implement the analysis and reliably obtain viable results consistent with previous literature; (2) review the current methodological challenges and ethical considerations associated with emerging field of infant rs-fMRI analysis; and (3) discuss the significance of rs-fMRI application in infants for future investigations of neurodevelopment in the context of early life stressors and pathological processes. The overarching goal is to catalyze efforts toward development of robust, infant-specific acquisition, and preprocessing pipelines, as well as promote greater transparency by researchers regarding methods used.
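A minimal ICA sketch using scikit-learn's FastICA on simulated BOLD-like regional time courses, to illustrate only the component-extraction step mentioned above; a real infant rs-fMRI pipeline (motion correction, registration, group ICA, component selection) is far more involved, and all data here are synthetic.

    # Minimal ICA sketch using scikit-learn's FastICA on simulated BOLD-like
    # time courses; a real infant rs-fMRI pipeline (motion correction,
    # registration, group ICA, etc.) is far more involved. Data are synthetic.
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(7)
    n_timepoints, n_regions = 300, 20
    # Two hypothetical "network" time courses mixed into regional signals.
    s1 = np.sin(np.linspace(0, 20, n_timepoints))
    s2 = np.sign(np.sin(np.linspace(0, 9, n_timepoints)))
    mixing = rng.normal(size=(2, n_regions))
    X = np.outer(s1, mixing[0]) + np.outer(s2, mixing[1]) \
        + 0.2 * rng.normal(size=(n_timepoints, n_regions))

    ica = FastICA(n_components=2, random_state=0)
    sources = ica.fit_transform(X)              # estimated component time courses
    print("recovered component time courses:", sources.shape)
    print("estimated mixing (region weights):", ica.mixing_.shape)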
Safety assessment methodology in management of spent sealed sources.
Mahmoud, Narmine Salah
2005-02-14
Environmental hazards can be caused by radioactive waste after its disposal. It is therefore important that safety assessment methodologies be developed and established to study and estimate the possible hazards, and to institute safety methodologies that help prevent the evolution of these hazards. Spent sealed sources are a specific type of radioactive waste. According to the IAEA definition, spent sealed sources are unused sources because of activity decay, damage, misuse, loss, or theft. Accidental exposure of humans to spent sealed sources can occur at the moment they become spent and before their disposal. For that reason, safety assessment methodologies were tailored to suit the management of spent sealed sources. To provide understanding of and confidence in this study, a validation analysis was undertaken by considering the scenario of an accident that occurred in Egypt in June 2000 (the Meet-Halfa accident involving an iridium-192 source). The text of this work includes considerations related to the safety assessment approaches for spent sealed sources, which constitute the assessment context, processes leading an active source to become spent, accident scenarios, mathematical models for dose calculations, and radiological consequences and regulatory criteria. The text also includes a validation study, which was carried out by evaluating a theoretical scenario compared to the real scenario of the Meet-Halfa accident, depending on the clinical assessment of affected individuals.
Nonlinear models for estimating GSFC travel requirements
NASA Technical Reports Server (NTRS)
Buffalano, C.; Hagan, F. J.
1974-01-01
A methodology is presented for estimating travel requirements for a particular period of time. Travel models were generated using nonlinear regression analysis techniques on a data base of FY-72 and FY-73 information from 79 GSFC projects. Although the subject matter relates to GSFC activities, the type of analysis used and the manner of selecting the relevant variables would be of interest to other NASA centers, government agencies, private corporations and, in general, any organization with a significant travel budget. Models were developed for the following types of activity: flight projects (in-house and out-of-house), experiments on non-GSFC projects, international projects, ART/SRT, data analysis, advanced studies, tracking and data, and indirects.
Cunha, Edite; Pinto, Paula C A G; Saraiva, M Lúcia M F S
2015-08-15
An automated methodology is proposed for the evaluation of a set of ionic liquids (ILs) as alternative reaction media for aldolase-based synthetic processes. To this end, the effect of traditionally used organic solvents and ILs on the activity of aldolase was studied by means of a novel automated methodology. The implemented methodology is based on the concept of sequential injection analysis (SIA) and relies on the aldolase-based cleavage of D-fructose-1,6-diphosphate (DFDP) to produce dihydroxyacetone phosphate (DHAP) and D-glyceraldehyde-3-phosphate (G3P). In the presence of FeCl3, 3-methyl-2-benzothiazolinone hydrazone (MBTH) combines with G3P to form a blue cation that can be measured at 670 nm. The influence of several parameters, such as substrate and enzyme concentration, temperature, delay time, and MBTH and FeCl3 concentrations, was studied, and the optimum reaction conditions were subsequently selected. The developed methodology showed good precision, with a relative standard deviation (RSD) not exceeding 7%, and led to low reagent consumption as well as low effluent production. Resorting to this strategy, the activity of the enzyme was studied in strictly aqueous media and in the presence of dimethylformamide, methanol, bmpyr [Cl], hmim [Cl], bmim [BF4], emim [BF4], emim [Ac], bmim [Cl], emim [TfMs], emim [Ms] and Chol [Ac] at up to 50%. The results show that the utilization of ILs as reaction media for aldolase-based organic synthesis might present potential advantages over the tested conventional organic solvents. The least toxic IL found in this study was Chol [Ac], which caused a reduction of enzyme activity of only 2.7% when used at a concentration of 50%. Generally, it can be concluded that ILs based on choline or short-alkyl imidazolium moieties associated with biocompatible anions are the most promising ILs regarding the future inclusion of these solvents in synthetic protocols catalyzed by aldolase. Copyright © 2015 Elsevier B.V. All rights reserved.
Øglund, Guro Pauck; Hildebrand, Maria; Ekelund, Ulf
2015-11-01
The purpose of this systematic review was to explore whether birth weight, early growth and motor development act as determinants of physical activity in children and youth. We performed a systematic literature search on these possible early life determinants. A meta-analysis was performed on the association between birth weight and objectively measured physical activity. We identified 9 studies examining birth weight; none of the studies with objectively measured physical activity observed an association between birth weight and physical activity. The meta-analysis confirmed this result (b=-3.08, 95% CI -10.20, 4.04). The 3 studies examining early growth and physical activity in youth differ in methodology and their results are inconsistent. Two studies suggest an association between earlier motor development and physical activity and sport participation in youth; this was not confirmed in a third study. Our meta-analysis suggests that birth weight is not an important determinant of physical activity in youth. Available data do not allow firm conclusions as to whether early growth and motor development act as determinants of physical activity in youth.
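The pooled estimate above can be reproduced in form (not in value) with a minimal fixed-effect inverse-variance pooling sketch; the study coefficients and standard errors below are hypothetical placeholders, not the review's data.

```python
# Minimal fixed-effect inverse-variance pooling of study regression coefficients (b)
# with a 95% CI. The three studies below are hypothetical; the review's pooled
# estimate was b = -3.08 (95% CI -10.20, 4.04).
import numpy as np

b  = np.array([-2.5, 1.0, -6.0])          # per-study coefficients (hypothetical)
se = np.array([4.0, 5.5, 6.2])            # per-study standard errors (hypothetical)

w = 1.0 / se**2                           # inverse-variance weights
b_pooled = np.sum(w * b) / np.sum(w)
se_pooled = np.sqrt(1.0 / np.sum(w))
ci = (b_pooled - 1.96 * se_pooled, b_pooled + 1.96 * se_pooled)
print(f"pooled b = {b_pooled:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")
```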
EUROBLOCv2: Methodology for the Study of Rockfalls
NASA Astrophysics Data System (ADS)
Torrebadella, Joan; Altimir, Joan; Lopez, Carles; Amigó, Jordi; Ferrer, Pau
2014-05-01
For studies of falling rocks, Euroconsult (Andorra) and Eurogeotecnica (Catalonia) developed in 1998 the methodology known as EUROBLOC. Having worked with it for over 10 years, and having completed numerous studies both in the Principality of Andorra and in Spain, it was considered appropriate to produce an enhanced version of the methodology (EUROBLOCv2) in order to adapt it to the technological advances made in recent years in passive protection techniques (it should be remembered that in 2000 there were only dynamic barriers with a retaining capacity of 1,000 kJ, whereas nowadays barriers of up to 8,000 kJ are already approved and capacities of 10,000 kJ are expected in the near future, in addition to embankments, reinforced earth walls, etc.) and also in active protection systems (direct stabilization of the slope based on wire mesh, or wire mesh combined with high-strength anchors). The EUROBLOCv2 methodology (first used in 2012 in order to incorporate all these improvements in the field of protection) consists of two distinct parts: first, the analysis of rockfalls, and second, the determination of the degree of protection afforded by the protective measures. Today, therefore, we can use a pioneering technique in the field of rockfall studies that considers all kinds of protection available on the market, based on both passive protection and active protection. The new methodology also allows simulation of rockfall volumes of up to 20 m3, instead of the 10 m3 maximum considered to date.
Davis, Jennifer C; Bryan, Stirling; Marra, Carlo A; Hsiung, Ging-Yuek R; Liu-Ambrose, Teresa
2015-10-01
Cognitive decline is one of the most prominent healthcare issues of the 21st century. Within the context of combating cognitive decline through behavioural interventions, physical activity is a promising approach. There is a dearth of health economic data in the area of behavioural interventions for dementia prevention. Yet, economic evaluations are essential for providing information to policy makers for resource allocation. It is essential we first address population- and intervention-specific methodological challenges prior to building a larger evidence base. We use a cost-utility analysis conducted alongside the exercise for cognition and everyday living (EXCEL) study to illustrate methodological challenges specific to assessing the cost-effectiveness of behavioural interventions aimed at older adults at risk of cognitive decline. A cost-utility analysis conducted concurrently with a 6-month, three-arm randomised controlled trial (ie, the EXCEL study) was used as an example to identify and discuss methodological challenges. Both the aerobic training and resistance training interventions were less costly than twice-weekly balance and tone classes. In critically evaluating the economic evaluation of the EXCEL study we identified four category-specific challenges: (1) analysing costs; (2) assessing quality-adjusted life-years; (3) incomplete data; and (4) 'intervention' activities of the control group. Resistance training and aerobic training resulted in healthcare cost saving and were equally effective to balance and tone classes after only 6 months of intervention. To ensure this population is treated fairly in terms of claims on resources, we first need to identify areas for methodological improvement. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Risk-based economic decision analysis of remediation options at a PCE-contaminated site.
Lemming, Gitte; Friis-Hansen, Peter; Bjerg, Poul L
2010-05-01
Remediation methods for contaminated sites cover a wide range of technical solutions with different remedial efficiencies and costs. Additionally, they may vary in their secondary impacts on the environment, i.e. the potential impacts generated by emissions and resource use caused by the remediation activities. Increasing attention is being given to these secondary environmental impacts when evaluating remediation options. This paper presents a methodology for an integrated economic decision analysis which combines assessments of remediation costs, health risk costs and potential environmental costs. The health risk costs are associated with the residual contamination left at the site and its migration to groundwater used for drinking water. A probabilistic exposure model using first- and second-order reliability methods (FORM/SORM) is used to estimate the contaminant concentrations at a downstream groundwater well. Potential environmental impacts on the local, regional and global scales due to the site remediation activities are evaluated using life cycle assessment (LCA). The potential impacts on health and environment are converted to monetary units using a simplified cost model. A case study based upon the developed methodology is presented in which the following remediation scenarios are analyzed and compared: (a) no action, (b) excavation and off-site treatment of soil, (c) soil vapor extraction and (d) thermally enhanced soil vapor extraction by electrical heating of the soil. Ultimately, the developed methodology facilitates societal cost estimations of remediation scenarios which can be used for internal ranking of the analyzed options. Despite the inherent uncertainties of placing a value on health and environmental impacts, the presented methodology is believed to be valuable in supporting decisions on remedial interventions. Copyright 2010 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Ma, Ada W.W.
2013-01-01
In recent research, little attention has been paid to issues of methodology and analysis methods to evaluate the quality of the collaborative learning community. To address such issues, an attempt is made to adopt the Activity System Model as an analytical framework to examine the relationship between computer supported collaborative learning…
ERIC Educational Resources Information Center
Pickup, Austin
2017-01-01
In this article, Austin Pickup centers Foucault's concept of "problematization" as an important methodological tool for displacing neoliberalism from its stable perch atop a perceived absence of other possibilities. According to Pickup, the genealogical analysis envisaged and practiced by Foucault opens up new avenues by indicating not…
An Integrated Model for Effective Knowledge Management in Chinese Organizations
ERIC Educational Resources Information Center
An, Xiaomi; Deng, Hepu; Wang, Yiwen; Chao, Lemen
2013-01-01
Purpose: The purpose of this paper is to provide organizations in the Chinese cultural context with a conceptual model for an integrated adoption of existing knowledge management (KM) methods and to improve the effectiveness of their KM activities. Design/methodology/approach: A comparative analysis is conducted between China and the western…
ERIC Educational Resources Information Center
Cooper, Amanda
2014-01-01
Knowledge mobilisation (KMb) attempts to address research-policy-practice gaps in education. Research brokering organisations (RBOs) are third party, intermediary organisations whose active role between research producers and users is a catalyst for research use in education. Sample: 44 Canadian RBOs in the education sector. Methodology: employed…
ERIC Educational Resources Information Center
Jung, Steven M.; And Others
Survey activities are reported which were designed to provide the foundation for a national evaluation of the effectiveness of programs assisted under the Career Education Incentive Act of 1977 (PL 95-207). The methodology described, called "program evaluability assessment," focuses on detailed analysis of program assumptions in order to…
Second Life as a Support Element for Learning Electronic Related Subjects: A Real Case
ERIC Educational Resources Information Center
Beltran Sierra, Luis M.; Gutierrez, Ronald S.; Garzon-Castro, Claudia L.
2012-01-01
Seeking more active and motivating methodological alternatives from the students' perspective, alternatives that promote analysis and investigation abilities, make the student a more participative agent, and facilitate certain learning processes, a practical study was conducted in the University of La Sabana (Chia, Colombia), in Computing Engineering…
ERIC Educational Resources Information Center
Broussard, Shorna R.; Bliss, John C.
2007-01-01
Purpose: The purpose of this research is to determine institutional commitment to sustainability by examining Natural Resource Extension program inputs, activities, and participation. Design/methodology/approach: A document analysis of Natural Resource Extension planning and reporting documents was conducted to provide contextual and historical…
ERIC Educational Resources Information Center
Stukalina, Yulia
2016-01-01
Purpose: The purpose of this paper is to explore some issues related to enhancing the quality of educational services provided by a university in the agenda of integrating quality assurance activities and strategic management procedures. Design/methodology/approach: Employing multiple regression analysis the author has examined some factors that…
Trainer Interventions as Instructional Strategies in Air Traffic Control Training
ERIC Educational Resources Information Center
Koskela, Inka; Palukka, Hannele
2011-01-01
Purpose: This paper aims to identify methods of guidance and supervision used in air traffic control training. It also aims to show how these methods facilitate trainee participation in core work activities. Design/methodology/approach: The paper applies the tools of conversation analysis and ethnomethodology to explore the ways in which trainers…
ERIC Educational Resources Information Center
Hicks, Catherine
2018-01-01
Purpose: This paper aims to explore predicting employee learning activity via employee characteristics and usage for two online learning tools. Design/methodology/approach: Statistical analysis focused on observational data collected from user logs. Data are analyzed via regression models. Findings: Findings are presented for over 40,000…
Bromelain purification through unconventional aqueous two-phase system (PEG/ammonium sulphate).
Coelho, D F; Silveira, E; Pessoa Junior, A; Tambourgi, E B
2013-02-01
This paper focuses on the feasibility of unconventional aqueous two-phase systems for bromelain purification from pineapple processing waste. The main difference in comparison with conventional systems is the integration of the liquid-liquid extraction technique with fractional precipitation, which can decrease the protein content with no loss of biological activity by removing unwanted molecules. The analysis of the results was based on response surface methodology and revealed that the use of the desirability optimisation methodology (DOM) was necessary to achieve higher purification factor values and greater bromelain recovery. The use of DOM yielded an 11.80-fold purification factor and 66.38% biological activity recovery using poly(ethylene glycol) (PEG) with a molar mass of 4,000, a PEG concentration of 10.86% (m/m) and 36.21% saturation of ammonium sulphate.
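As an illustration of the desirability optimisation methodology (DOM) step, the sketch below computes a Derringer-Suich style composite desirability over candidate operating points; the target ranges and candidate values are hypothetical, not the study's fitted response surfaces.

```python
# Generic Derringer-Suich style desirability sketch for jointly maximizing
# purification factor (PF) and activity recovery (%). Ranges are hypothetical.
import numpy as np

def d_maximize(y, lo, hi, weight=1.0):
    """Desirability for a larger-is-better response, clipped to [0, 1]."""
    return np.clip((y - lo) / (hi - lo), 0.0, 1.0) ** weight

# candidate operating points: (PF, recovery %) predicted by some response model
candidates = np.array([[8.2, 70.0], [11.8, 66.4], [10.1, 60.0]])

d_pf  = d_maximize(candidates[:, 0], lo=1.0,  hi=12.0)
d_rec = d_maximize(candidates[:, 1], lo=40.0, hi=80.0)
overall = np.sqrt(d_pf * d_rec)            # geometric mean = composite desirability

best = int(np.argmax(overall))
print("best candidate:", candidates[best], "D =", round(float(overall[best]), 3))
```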
Røislien, Jo; Winje, Brita
2013-09-20
Clinical studies frequently include repeated measurements of individuals, often for long periods. We present a methodology for extracting common temporal features across a set of individual time series observations. In particular, the methodology explores extreme observations within the time series, such as spikes, as a possible common temporal phenomenon. Wavelet basis functions are attractive in this sense, as they are localized in both time and frequency domains simultaneously, allowing for localized feature extraction from a time-varying signal. We apply wavelet basis function decomposition of individual time series, with corresponding wavelet shrinkage to remove noise. We then extract common temporal features using linear principal component analysis on the wavelet coefficients, before inverse transformation back to the time domain for clinical interpretation. We demonstrate the methodology on a subset of a large fetal activity study aiming to identify temporal patterns in fetal movement (FM) count data in order to explore formal FM counting as a screening tool for identifying fetal compromise and thus preventing adverse birth outcomes. Copyright © 2013 John Wiley & Sons, Ltd.
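A minimal sketch of the described pipeline, wavelet decomposition with soft-threshold shrinkage followed by PCA on the coefficients, is shown below; the signal length, wavelet family, decomposition level, and threshold are illustrative assumptions rather than the authors' settings.

```python
# Per-series wavelet decomposition, soft-threshold shrinkage, then PCA across
# subjects on the wavelet coefficients. Fetal movement counts are simulated.
import numpy as np
import pywt
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n_subjects, n_timepoints = 30, 256
counts = rng.poisson(5, size=(n_subjects, n_timepoints)).astype(float)  # placeholder FM counts

def shrunk_coeffs(x, wavelet="db4", level=4, thr=2.0):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return np.concatenate(coeffs)

W = np.vstack([shrunk_coeffs(row) for row in counts])   # subjects x coefficients

pca = PCA(n_components=3)
scores = pca.fit_transform(W)               # common temporal features per subject
print(pca.explained_variance_ratio_, scores.shape)
```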
Cooper, Guy Paul; Yeager, Violet; Burkle, Frederick M; Subbarao, Italo
2015-06-29
This article describes a novel triangulation methodological approach for identifying the Twitter activity of regionally active Twitter users during the 2013 Hattiesburg EF-4 Tornado. A data extraction and geographically centered filtration approach was utilized to generate Twitter data for 48 hrs pre- and post-Tornado. The data were further validated using a six sigma approach utilizing GPS data. The regional analysis revealed a total of 81,441 tweets, 10,646 Twitter users, 27,309 retweets and 2,637 tweets with GPS coordinates. Twitter tweet activity increased 5-fold during the response to the Hattiesburg Tornado. Retweeting activity increased 2.2-fold. Tweets with a hashtag increased 1.4-fold. Twitter was an effective disaster risk reduction tool during the 2013 Hattiesburg EF-4 Tornado.
FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES
2017-06-01
By Amanda Donnelly. A thesis... This work develops a comparative model for strategic design methodologies, focusing on the primary elements of vision, time, process, communication and collaboration, and risk assessment. My analysis dissects and compares three potential design methodologies, including net assessment, scenarios and...
A Methodology for Loading the Advanced Test Reactor Driver Core for Experiment Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cowherd, Wilson M.; Nielsen, Joseph W.; Choe, Dong O.
In support of experiments in the ATR, a new methodology was devised for loading the ATR Driver Core. This methodology will replace the existing methodology used by the INL Neutronic Analysis group to analyze experiments. Studied in this paper was the as-run analysis for ATR Cycle 152B, specifically comparing measured lobe powers and eigenvalue calculations.
Discourse Analysis and the Study of Educational Leadership
ERIC Educational Resources Information Center
Anderson, Gary; Mungal, Angus Shiva
2015-01-01
Purpose: The purpose of this paper is to provide an overview of the current and past work using discourse analysis in the field of educational administration and of discourse analysis as a methodology. Design/Methodology/Approach: Authors reviewed research in educational leadership that uses discourse analysis as a methodology. Findings: While…
76 FR 30139 - Federal Need Analysis Methodology for the 2012-2013 Award Year
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-24
... DEPARTMENT OF EDUCATION Federal Need Analysis Methodology for the 2012-2013 Award Year AGENCY: Federal Student Aid, Department of Education. ACTION: Notice of revision of the Federal Need Analysis...; 84.268; 84.379]. Federal Need Analysis Methodology for the 2012-2013 award year; Federal Pell Grant...
Das, Dipa; Meikap, Bhim C
2017-10-15
The present research describes the optimal adsorption conditions for methylene blue (MB). The adsorbent used here was monoethanolamine-impregnated activated carbon (MEA-AC) prepared from green coconut shell. Response surface methodology (RSM) is the multivariate statistical technique used for the optimization of the process variables. A central composite design is used to determine the effect of activation temperature, activation time and impregnation ratio on MB removal. The percentage (%) MB adsorption by MEA-AC is evaluated as the response of the system. A quadratic model was developed for the response. From the analysis of variance, the factor most influential on the experimental design response was identified. The optimum conditions for the preparation of MEA-AC from green coconut shells are an activation temperature of 545.6°C, an activation time of 41.64 min and an impregnation ratio of 0.33, achieving a maximum removal efficiency of 98.21%. At the same optimum parameters, the % MB removal from textile-industry effluent was examined and found to be 96.44%.
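The second-order (quadratic) RSM fit can be sketched as follows; the coded design points and responses are hypothetical placeholders rather than the study's central composite design data.

```python
# Second-order response surface fit in coded factors (x1 = activation temperature,
# x2 = activation time, x3 = impregnation ratio). Data are toy placeholders.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(20, 3))             # coded factor settings
y = 90 - 5*X[:, 0]**2 - 3*X[:, 1]**2 + 2*X[:, 2] + rng.normal(0, 0.5, 20)  # toy % removal

quad = PolynomialFeatures(degree=2, include_bias=False)   # linear, interaction, squared terms
model = LinearRegression().fit(quad.fit_transform(X), y)

x_opt = np.array([[0.1, -0.2, 0.3]])             # a candidate optimum (coded, hypothetical)
print("predicted % MB removal:", round(float(model.predict(quad.transform(x_opt))[0]), 2))
```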
Lin, Blossom Yen-Ju; Chao, Te-Hsin; Yao, Yuh; Tu, Shu-Min; Wu, Chun-Ching; Chern, Jin-Yuan; Chao, Shiu-Hsiung; Shaw, Keh-Yuong
2007-04-01
Previous studies have shown the advantages of using activity-based costing (ABC) methodology in the health care industry. The potential value of ABC methodology in health care derives from more accurate cost calculation compared to traditional step-down costing, and from the potential to evaluate the quality or effectiveness of health care based on health care activities. This project used ABC methodology to profile the cost structure of inpatients undergoing surgical procedures at the Department of Colorectal Surgery in a public teaching hospital, and to identify missing or inappropriate clinical procedures. We found that ABC methodology was able to calculate costs accurately and to identify several missing pre- and post-surgical nursing education activities in the course of treatment.
NASA standard: Trend analysis techniques
NASA Technical Reports Server (NTRS)
1988-01-01
This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.
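A minimal sketch of the trend-fitting techniques the Standard describes, comparing linear, quadratic, and exponential models on a synthetic time series, might look like this (model choice here is by residual sum of squares only, for illustration).

```python
# Fit linear, quadratic, and exponential trends to a synthetic monthly series and
# report which has the smallest residual sum of squares.
import numpy as np
from scipy.optimize import curve_fit

t = np.arange(24, dtype=float)                      # e.g., months
y = 5.0 * np.exp(0.08 * t) + np.random.default_rng(3).normal(0, 1.0, t.size)

lin  = np.polyfit(t, y, 1)
quad = np.polyfit(t, y, 2)
(exp_a, exp_b), _ = curve_fit(lambda x, a, b: a * np.exp(b * x), t, y, p0=(1.0, 0.05))

def rss(pred):
    return float(np.sum((y - pred) ** 2))

fits = {
    "linear":      rss(np.polyval(lin, t)),
    "quadratic":   rss(np.polyval(quad, t)),
    "exponential": rss(exp_a * np.exp(exp_b * t)),
}
print(min(fits, key=fits.get), fits)
```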
2011-01-01
Background: Current methodological guidelines provide advice about the assessment of sub-group analysis within RCTs, but do not specify explicit criteria for assessment. Our objective was to provide researchers with a set of criteria that will facilitate the grading of evidence for moderators in systematic reviews. Method: We developed a set of criteria from methodological manuscripts (n = 18) using a snowballing technique and electronic database searches. Criteria were reviewed by an international Delphi panel (n = 21), comprising authors who have published methodological papers in this area and researchers who have been active in the study of sub-group analysis in RCTs. We used the Research ANd Development/University of California Los Angeles appropriateness method to assess consensus on the quantitative data. Free responses were coded for consensus and disagreement. In a subsequent round additional criteria were extracted from the Cochrane Reviewers' Handbook, and the process was repeated. Results: The recommendations are that meta-analysts report both confirmatory and exploratory findings for sub-group analysis. Confirmatory findings must only come from studies in which a specific theory- or evidence-based a-priori statement is made. Exploratory findings may be used to inform future/subsequent trials. However, for inclusion in the meta-analysis of moderators, the following additional criteria should be applied to each study: baseline factors should be measured prior to randomisation, measurement of baseline factors should be of adequate reliability and validity, and a specific test of the interaction between baseline factors and interventions must be presented. Conclusions: There is consensus from a group of 21 international experts that methodological criteria to assess moderators within systematic reviews of RCTs are both timely and necessary. The consensus from the experts resulted in five criteria divided into two groups for synthesising evidence: confirmatory findings to support hypotheses about moderators and exploratory findings to inform future research. These recommendations are discussed in reference to previous recommendations for evaluating and reporting moderator studies. PMID:21281501
Shuttle payload bay dynamic environments: Summary and conclusion report for STS flights 1-5 and 9
NASA Technical Reports Server (NTRS)
Oconnell, M.; Garba, J.; Kern, D.
1984-01-01
The vibration, acoustic and low frequency loads data from the first 5 shuttle flights are presented, together with the engineering analysis of those data. Vibroacoustic data from STS-9 are also presented because they represent the only data taken on a large payload. Payload dynamic environment predictions developed through the participation of various NASA and industrial centers are presented along with a comparison of analytical loads methodology predictions with flight data, including a brief description of the methodologies employed in developing those predictions for payloads. The review of prediction methodologies illustrates how different centers have approached the problem of developing shuttle dynamic environmental predictions and criteria. Ongoing research activities related to the shuttle dynamic environments are also described. Analytical software recently developed for the prediction of payload acoustic and vibration environments is also described.
Diversity of nursing student views about simulation design: a q-methodological study.
Paige, Jane B; Morin, Karen H
2015-05-01
Education of future nurses benefits from well-designed simulation activities. Skillful teaching with simulation requires educators to be constantly aware of how students experience learning and perceive educators' actions. Because revision of simulation activities considers feedback elicited from students, it is crucial to understand the perspective from which students base their response. In a Q-methodological approach, 45 nursing students rank-ordered 60 opinion statements about simulation design into a distribution grid. Factor analysis revealed that nursing students hold five distinct and uniquely personal perspectives: Let Me Show You, Stand By Me, The Agony of Defeat, Let Me Think It Through, and I'm Engaging and So Should You. Results suggest that nurse educators need to reaffirm that students clearly understand the purpose of each simulation activity. Nurse educators should incorporate presimulation assignments to optimize learning and help allay anxiety. The five perspectives discovered in this study can serve as a tool to discern individual students' learning needs. Copyright 2015, SLACK Incorporated.
Jagiello, Karolina; Grzonkowska, Monika; Swirog, Marta; ...
2016-08-29
In this contribution, the advantages and limitations of two computational techniques that can be used for the investigation of nanoparticle activity and toxicity are briefly summarized: classic nano-QSAR (Quantitative Structure–Activity Relationships employed for nanomaterials) and 3D nano-QSAR (three-dimensional Quantitative Structure–Activity Relationships, such as Comparative Molecular Field Analysis (CoMFA) and Comparative Molecular Similarity Indices Analysis (CoMSIA), employed for nanomaterials). Both approaches were compared according to selected criteria, including: efficiency, type of experimental data, class of nanomaterials, time required for calculations and computational cost, and difficulties in interpretation. Taking into account the advantages and limitations of each method, we provide recommendations for nano-QSAR modellers and QSAR model users to be able to determine a proper and efficient methodology to investigate the biological activity of nanoparticles, in order to describe the underlying interactions in the most reliable and useful manner.
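A toy illustration of the classic (non-3D) nano-QSAR workflow, regressing hypothetical nanoparticle descriptors against a toy activity endpoint with cross-validation, is sketched below; descriptor names and values are invented for illustration.

```python
# Toy nano-QSAR-style regression: nanoparticle descriptors vs. an activity endpoint.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
# columns: e.g., core size (nm), zeta potential (mV), cation electronegativity (all hypothetical)
X = rng.normal(size=(17, 3))
y = 1.5 * X[:, 0] - 0.8 * X[:, 1] + rng.normal(0, 0.3, 17)   # toy log(1/EC50)

model = LinearRegression().fit(X, y)
q2 = cross_val_score(model, X, y, cv=5, scoring="r2")        # cross-validated fit quality
print("coefficients:", np.round(model.coef_, 2), "mean CV R2:", round(q2.mean(), 2))
```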
Nonlinear estimation of parameters in biphasic Arrhenius plots.
Puterman, M L; Hrboticky, N; Innis, S M
1988-05-01
This paper presents a formal procedure for the statistical analysis of data on the thermotropic behavior of membrane-bound enzymes generated using the Arrhenius equation and compares the analysis to several alternatives. Data are modeled by a bent hyperbola. Nonlinear regression is used to obtain estimates and standard errors of the intersection of the line segments, defined as the transition temperature, and of the slopes, defined as energies of activation of the enzyme reaction. The methodology allows formal tests of the adequacy of a biphasic model rather than either a single straight line or a curvilinear model. Examples using data on the thermotropic behavior of pig brain synaptosomal acetylcholinesterase are given. The data support the biphasic temperature dependence of this enzyme. The methodology represents a formal procedure for statistical validation of any biphasic data and allows for calculation of all line parameters with estimates of precision.
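A piecewise-linear stand-in for the bent-hyperbola fit can be sketched with nonlinear least squares; the synthetic Arrhenius data, break point, and slopes below are assumptions, and the slopes are converted to apparent activation energies via Ea = -slope x R.

```python
# Segmented Arrhenius fit: two line segments in ln(k) vs 1/T meeting at a breakpoint.
import numpy as np
from scipy.optimize import curve_fit

R = 8.314  # J mol^-1 K^-1

def bent_line(x, a, b1, b2, x0):
    """Two line segments joined at x0 (the transition in 1/T)."""
    return np.where(x < x0, a + b1 * (x - x0), a + b2 * (x - x0))

rng = np.random.default_rng(5)
invT = np.linspace(1 / 320, 1 / 280, 30)                       # 1/T (K^-1)
lnk = bent_line(invT, -2.0, -4000.0, -9000.0, 1 / 300) + rng.normal(0, 0.05, invT.size)

p, _ = curve_fit(bent_line, invT, lnk, p0=(-2.0, -3000.0, -8000.0, 1 / 300))
a, b1, b2, x0 = p
print("transition temperature (K):", round(1 / x0, 1))
print("activation energies (kJ/mol):", round(-b1 * R / 1000, 1), round(-b2 * R / 1000, 1))
```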
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-09
... DEPARTMENT OF JUSTICE Office of Justice Programs [OMB Number 1121-NEW] Agency Information Collection Activities; Proposed Collection; Comment Request: Methodological Research To Support the National... Redesign Research (NCVS-RR) program: Methodological Research to Support the National Crime Victimization...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-07
... DEPARTMENT OF JUSTICE Office of Justice Programs [OMB No. 1121-NEW] Agency Information Collection Activities: Proposed Collection; Comments Requested Methodological Research To Support the National Crime... related to the National Crime Victimization Survey Redesign Research (NCVS-RR) program: Methodological...
Poulsen, Nicklas N; Pedersen, Morten E; Østergaard, Jesper; Petersen, Nickolaj J; Nielsen, Christoffer T; Heegaard, Niels H H; Jensen, Henrik
2016-09-20
Detection of immune responses is important in the diagnosis of many diseases. For example, the detection of circulating autoantibodies against double-stranded DNA (dsDNA) is used in the diagnosis of Systemic Lupus Erythematosus (SLE). It is, however, difficult to reach satisfactory sensitivity, specificity, and accuracy with established assays. Also, existing methodologies for quantification of autoantibodies are challenging to transfer to a point-of-care setting. Here we present the use of flow-induced dispersion analysis (FIDA) for rapid (minutes) measurement of autoantibodies against dsDNA. The assay is based on Taylor dispersion analysis (TDA) and is fully automated with the use of standard capillary electrophoresis (CE) based equipment employing fluorescence detection. It is robust toward matrix effects as demonstrated by the direct analysis of samples composed of up to 85% plasma derived from human blood samples, and it allows for flexible exchange of the DNA sequences used to probe for the autoantibodies. Plasma samples from SLE positive patients were analyzed using the new FIDA methodology as well as by standard indirect immunofluorescence and solid-phase immunoassays. Interestingly, the patient antibodies bound DNA sequences with different affinities, suggesting pronounced heterogeneity among autoantibodies produced in SLE. The FIDA based methodology is a new approach for autoantibody detection and holds promise for being used for patient stratification and monitoring of disease activity.
Co-activation patterns in resting-state fMRI signals.
Liu, Xiao; Zhang, Nanyin; Chang, Catie; Duyn, Jeff H
2018-02-08
The brain is a complex system that integrates and processes information across multiple time scales by dynamically coordinating activities over brain regions and circuits. Correlations in resting-state functional magnetic resonance imaging (rsfMRI) signals have been widely used to infer functional connectivity of the brain, providing a metric of functional associations that reflects a temporal average over an entire scan (typically several minutes or longer). Not until recently was the study of dynamic brain interactions at much shorter time scales (seconds to minutes) considered for inference of functional connectivity. One method proposed for this objective seeks to identify and extract recurring co-activation patterns (CAPs) that represent instantaneous brain configurations at single time points. Here, we review the development and recent advancement of CAP methodology and other closely related approaches, as well as their applications and associated findings. We also discuss the potential neural origins and behavioral relevance of CAPs, along with methodological issues and future research directions in the analysis of fMRI co-activation patterns. Copyright © 2018 Elsevier Inc. All rights reserved.
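A minimal sketch of a CAP-style analysis, selecting supra-threshold frames and clustering them with k-means, is shown below; the data, seed definition, threshold, and number of clusters are illustrative choices, not a specific published pipeline.

```python
# Select time frames with supra-threshold activity in a toy "seed" region, then
# k-means cluster those frames into recurring spatial co-activation patterns.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(6)
n_t, n_vox = 600, 2000
data = rng.standard_normal((n_t, n_vox))           # placeholder z-scored rsfMRI frames
seed = data[:, :50].mean(axis=1)                   # toy seed time course

frames = data[seed > np.percentile(seed, 85)]      # keep the top 15% of frames
caps = KMeans(n_clusters=4, n_init=10, random_state=0).fit(frames)

cap_maps = caps.cluster_centers_                   # 4 spatial co-activation patterns
occurrence = np.bincount(caps.labels_) / len(caps.labels_)
print(cap_maps.shape, np.round(occurrence, 2))
```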
Structural health monitoring apparatus and methodology
NASA Technical Reports Server (NTRS)
Giurgiutiu, Victor (Inventor); Yu, Lingyu (Inventor); Bottai, Giola Santoni (Inventor)
2011-01-01
Disclosed is an apparatus and methodology for structural health monitoring (SHM) in which smart devices interrogate structural components to predict failure, expedite needed repairs, and thus increase the useful life of those components. Piezoelectric wafer active sensors (PWAS) are applied to or integrated with structural components, and various data collected therefrom provide the ability to detect and locate cracking, corrosion, and disbonding through use of pitch-catch, pulse-echo, electro/mechanical impedance, and phased array technology. Stand-alone hardware and an associated software program are provided that allow selection of multiple types of SHM investigations as well as multiple types of data analysis to perform a comprehensive investigation of a structure.
Czolowski, Eliza D; Santoro, Renee L; Srebotnjak, Tanja; Shonkoff, Seth B C
2017-08-23
Higher risk of exposure to environmental health hazards near oil and gas wells has spurred interest in quantifying populations that live in proximity to oil and gas development. The available studies on this topic lack consistent methodology and ignore aspects of oil and gas development of value to public health-relevant assessment and decision-making. We aim to present a methodological framework for oil and gas development proximity studies grounded in an understanding of hydrocarbon geology and development techniques. We geospatially overlay locations of active oil and gas wells in the conterminous United States and Census data to estimate the population living in proximity to hydrocarbon development at the national and state levels. We compare our methods and findings with existing proximity studies. Nationally, we estimate that 17.6 million people live within 1,600m (∼1 mi) of at least one active oil and/or gas well. Three of the eight studies overestimate populations at risk from actively producing oil and gas wells by including wells without evidence of production or drilling completion and/or using inappropriate population allocation methods. The remaining five studies, by omitting conventional wells in regions dominated by historical conventional development, significantly underestimate populations at risk. The well inventory guidelines we present provide an improved methodology for hydrocarbon proximity studies by acknowledging the importance of both conventional and unconventional well counts as well as the relative exposure risks associated with different primary production categories (e.g., oil, wet gas, dry gas) and developmental stages of wells. https://doi.org/10.1289/EHP1535.
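The core overlay step can be illustrated with a toy nearest-well distance query on projected coordinates; the well locations, block centroids, and populations below are random placeholders rather than the study's national data.

```python
# Count population in census block centroids within 1,600 m of at least one active
# well, using projected (meter) coordinates and a k-d tree for nearest-well lookup.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(7)
wells  = rng.uniform(0, 50_000, size=(300, 2))          # active well locations (m)
blocks = rng.uniform(0, 50_000, size=(5_000, 2))        # census block centroids (m)
pop    = rng.integers(0, 400, size=5_000)               # block populations

tree = cKDTree(wells)
dist, _ = tree.query(blocks, k=1)                       # distance to nearest well
within = dist <= 1_600
print("population within 1,600 m of an active well:", int(pop[within].sum()))
```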
Translating Current Bioanalytical Techniques for Studying Corona Activity.
Wang, Chunming; Wang, Zhenzhen; Dong, Lei
2018-07-01
The recent discovery of the biological corona is revolutionising our understanding of the in vivo behaviour of nanomaterials. Accurate analysis of corona bioactivity is essential for predicting the fate of nanomaterials and thereby improving nanomedicine design. Nevertheless, current biotechniques for protein analysis are not readily adaptable for analysing corona proteins, given that their conformation, activity, and interaction may largely differ from those of the native proteins. Here, we introduce and propose tailor-made modifications to five types of mainstream bioanalytical methodologies. We specifically illustrate how these modifications can translate existing techniques for protein analysis into competent tools for dissecting the composition, bioactivity, and interaction (with both nanomaterials and the tissue) of corona formed on specific nanomaterial surfaces. Copyright © 2018 Elsevier Ltd. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-16
... DEPARTMENT OF JUSTICE Office of Justice Programs [OMB Number 1121-NEW] Agency Information Collection Activities; Proposed Collection; Comment Request: Methodological Research to Support the National...: Methodological Research to Support the National Crime Victimization Survey: Self-Report Data on Rape and Sexual...
Laurinavicius, Arvydas; Plancoulaine, Benoit; Rasmusson, Allan; Besusparis, Justinas; Augulis, Renaldas; Meskauskas, Raimundas; Herlin, Paulette; Laurinaviciene, Aida; Abdelhadi Muftah, Abir A; Miligy, Islam; Aleskandarany, Mohammed; Rakha, Emad A; Green, Andrew R; Ellis, Ian O
2016-04-01
Proliferative activity, assessed by Ki67 immunohistochemistry (IHC), is an established prognostic and predictive biomarker of breast cancer (BC). However, it remains under-utilized due to lack of standardized robust measurement methodologies and significant intratumor heterogeneity of expression. A recently proposed methodology for IHC biomarker assessment in whole slide images (WSI), based on systematic subsampling of tissue information extracted by digital image analysis (DIA) into hexagonal tiling arrays, enables computation of a comprehensive set of Ki67 indicators, including intratumor variability. In this study, the tiling methodology was applied to assess Ki67 expression in WSI of 152 surgically removed Ki67-stained (on full-face sections) BC specimens and to test which, if any, Ki67 indicators can predict overall survival (OS). Visual Ki67 IHC estimates and conventional clinico-pathologic parameters were also included in the study. Analysis revealed linearly independent intrinsic factors of the Ki67 IHC variance: proliferation (level of expression), disordered texture (entropy), tumor size and Nottingham Prognostic Index, bimodality, and correlation. All visual and DIA-generated indicators of the level of Ki67 expression provided significant cutoff values as single predictors of OS. However, only bimodality indicators (Ashman's D, in particular) were independent predictors of OS in the context of hormone receptor and HER2 status. From this, we conclude that spatial heterogeneity of proliferative tumor activity, measured by DIA of Ki67 IHC expression and analyzed by the hexagonal tiling approach, can serve as an independent prognostic indicator of OS in BC patients that outperforms the prognostic power of the level of proliferative activity.
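One of the bimodality indicators mentioned, Ashman's D, can be sketched by fitting a two-component Gaussian mixture to per-tile Ki67 values; the tile values below are simulated, and the D > 2 rule of thumb is a common convention rather than the study's cutoff.

```python
# Ashman's D from a two-component Gaussian mixture fitted to per-tile Ki67 indices.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(8)
tiles = np.concatenate([rng.normal(8, 3, 300), rng.normal(35, 6, 150)])  # toy % Ki67 per tile

gm = GaussianMixture(n_components=2, random_state=0).fit(tiles.reshape(-1, 1))
mu = gm.means_.ravel()
var = gm.covariances_.ravel()

# D = sqrt(2) * |mu1 - mu2| / sqrt(sigma1^2 + sigma2^2); D > 2 suggests clear bimodality
ashman_d = np.sqrt(2.0) * abs(mu[0] - mu[1]) / np.sqrt(var[0] + var[1])
print("Ashman's D:", round(float(ashman_d), 2))
```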
Applications of artificial neural network in AIDS research and therapy.
Sardari, S; Sardari, D
2002-01-01
In recent years considerable effort has been devoted to applying pattern recognition techniques to the complex task of data analysis in drug research. Artificial neural network (ANN) methodology is a modeling method with a great ability to adapt to a new situation, or to control an unknown system, using data acquired in previous experiments. In this paper, a brief history of ANN, the basic concepts behind the computing, the mathematical and algorithmic formulation of each of the techniques, and their developmental background are presented. Based on the abilities of ANNs in pattern recognition and estimation of system outputs from known inputs, the neural network can be considered a tool for molecular data analysis and interpretation. Analysis by neural networks improves classification accuracy and data quantification and reduces the number of analogues necessary for correct classification of biologically active compounds. Conformational analysis, quantifying the components in mixtures using NMR spectra, aqueous solubility prediction and structure-activity correlation are among the reported applications of ANN as a new modeling method. Ranging from drug design and discovery to structure and dosage form design, the potential pharmaceutical applications of the ANN methodology are significant. In the areas of clinical monitoring, utilization of molecular simulation and design of bioactive structures, ANN would make it possible to study health and disease status and would bring predicted chemotherapeutic responses closer to reality.
Integrated Response Time Evaluation Methodology for the Nuclear Safety Instrumentation System
NASA Astrophysics Data System (ADS)
Lee, Chang Jae; Yun, Jae Hee
2017-06-01
Safety analysis for a nuclear power plant establishes not only an analytical limit (AL) in terms of a measured or calculated variable but also an analytical response time (ART) required to complete protective action after the AL is reached. If the two constraints are met, the safety limit selected to maintain the integrity of physical barriers used for preventing uncontrolled radioactivity release will not be exceeded during anticipated operational occurrences and postulated accidents. Setpoint determination methodologies have actively been developed to ensure that the protective action is initiated before the process conditions reach the AL. However, regarding the ART for a nuclear safety instrumentation system, an integrated evaluation methodology considering the whole design process has not been systematically studied. In order to assure the safety of nuclear power plants, this paper proposes a systematic and integrated response time evaluation methodology that covers safety analyses, system designs, response time analyses, and response time tests. This methodology is applied to safety instrumentation systems for the advanced power reactor 1400 and the optimized power reactor 1000 nuclear power plants in South Korea. The quantitative evaluation results are provided herein. The evaluation results using the proposed methodology demonstrate that the nuclear safety instrumentation systems fully satisfy corresponding requirements of the ART.
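In the spirit of the integrated evaluation, a response-time budget check simply sums the delays of the instrumentation chain and compares the total against the ART; all delay values and the margin below are hypothetical.

```python
# Response-time budget check: total instrumentation-chain delay plus a test margin
# must not exceed the analytical response time (ART). Numbers are hypothetical.
ART_s = 1.20  # analytical response time assumed in safety analysis (s), hypothetical

chain_delays_s = {
    "sensor/transmitter": 0.35,
    "signal processing":  0.25,
    "trip logic":         0.10,
    "actuation":          0.30,
}
margin_s = 0.10  # allowance for measured test variability, hypothetical

total = sum(chain_delays_s.values()) + margin_s
print(f"total {total:.2f} s vs ART {ART_s:.2f} s ->", "OK" if total <= ART_s else "EXCEEDED")
```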
Igras, Susan; Diakité, Mariam; Lundgren, Rebecka
2017-07-01
In West Africa, social factors influence whether couples with unmet need for family planning act on birth-spacing desires. Tékponon Jikuagou is testing a social network-based intervention to reduce social barriers by diffusing new ideas. Individuals and groups judged socially influential by their communities provide entrée to networks. A participatory social network mapping methodology was designed to identify these diffusion actors. Analysis of monitoring data, in-depth interviews, and evaluation reports assessed the methodology's acceptability to communities and staff and whether it produced valid, reliable data to identify influential individuals and groups who diffuse new ideas through their networks. Results indicated the methodology's acceptability. Communities were actively and equitably engaged. Staff appreciated its ability to yield timely, actionable information. The mapping methodology also provided valid and reliable information by enabling communities to identify highly connected and influential network actors. Consistent with social network theory, this methodology resulted in the selection of informal groups and individuals in both informal and formal positions. In-depth interview data suggest these actors were diffusing new ideas, further confirming their influence/connectivity. The participatory methodology generated insider knowledge of who has social influence, challenging commonly held assumptions. Collecting and displaying information fostered staff and community learning, laying groundwork for social change.
Uncertainties in shoreline position analysis: the role of run-up and tide in a gentle slope beach
NASA Astrophysics Data System (ADS)
Manno, Giorgio; Lo Re, Carlo; Ciraolo, Giuseppe
2017-09-01
In recent decades in the Mediterranean Sea, high anthropic pressure from increasing economic and touristic development has affected several coastal areas. Today the erosion phenomena threaten human activities and existing structures, and interdisciplinary studies are needed to better understand actual coastal dynamics. Beach evolution analysis can be conducted using GIS methodologies, such as the well-known Digital Shoreline Analysis System (DSAS), in which error assessment based on shoreline positioning plays a significant role. In this study, a new approach is proposed to estimate the positioning errors due to tide and wave run-up influence. To improve the assessment of the wave run-up uncertainty, a spectral numerical model was used to propagate waves from deep to intermediate water and a Boussinesq-type model for intermediate water up to the swash zone. Tide effects on the uncertainty of shoreline position were evaluated using data collected by a nearby tide gauge. The proposed methodology was applied to an unprotected, dissipative Sicilian beach far from harbors and subjected to intense human activities over the last 20 years. The results show wave run-up and tide errors ranging from 0.12 to 4.5 m and from 1.20 to 1.39 m, respectively.
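The way vertical tide and run-up excursions translate into horizontal shoreline-position uncertainty on a gentle slope can be sketched as below; the slope, tide, and run-up values are hypothetical, and combining the two terms in quadrature is a common convention rather than necessarily the authors' exact formulation.

```python
# Horizontal shoreline-position uncertainty from vertical water-level excursions on
# a gently sloping beach; tide and run-up contributions combined in quadrature.
import math

beach_slope = math.tan(math.radians(5.0))   # gentle beach slope of ~5 deg (hypothetical)
tide_range_m = 0.10                         # vertical tide excursion (hypothetical)
runup_m = 0.30                              # modeled wave run-up height (hypothetical)

e_tide  = tide_range_m / beach_slope        # horizontal displacement due to tide
e_runup = runup_m / beach_slope             # horizontal displacement due to run-up
e_total = math.sqrt(e_tide**2 + e_runup**2)

print(f"tide: {e_tide:.2f} m, run-up: {e_runup:.2f} m, combined: {e_total:.2f} m")
```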
Active learning on the ward: outcomes from a comparative trial with traditional methods.
Melo Prado, Hegla; Hannois Falbo, Gilliatt; Rodrigues Falbo, Ana; Natal Figueirôa, José
2011-03-01
Academic activity during internship is essentially practical, and ward rounds are traditionally considered the cornerstone of clinical education. However, the efficacy and effectiveness of ward rounds for learning purposes have been under-investigated, and it is necessary to assess alternative educational paradigms for this activity. This study aimed to compare the educational effectiveness of ward rounds conducted with two different learning methodologies. Student subjects were first tested on 30 true/false questions to assess their initial degree of knowledge on pneumonia and diarrhoea. Afterwards, they attended ward rounds conducted using an active and a traditional learning methodology. The participants were given a second test 48 hours later in order to assess knowledge acquisition and were asked to answer two questions about self-directed learning and their opinions on the two learning methodologies used. Seventy-two medical students taking part in a paediatric clinic rotation were enrolled. The active methodology proved to be more effective than the traditional methodology for the three outcomes considered: knowledge acquisition (33 students [45.8%] versus 21 students [29.2%]; p=0.03); self-directed learning (38 students [52.8%] versus 11 students [15.3%]; p<0.001), and student opinion on the methods (61 students [84.7%] versus 38 students [52.8%]; p<0.001). The active methodology produced better results than the traditional methodology in a ward-based context. This study seems to be valuable in terms of the new evidence it provides on learning methodologies in the context of the ward round. © Blackwell Publishing Ltd 2011.
Lê, Gillian; Mirzoev, Tolib; Orgill, Marsha; Erasmus, Ermin; Lehmann, Uta; Okeyo, Stephen; Goudge, Jane; Maluka, Stephen; Uzochukwu, Benjamin; Aikins, Moses; de Savigny, Don; Tomson, Goran; Gilson, Lucy
2014-10-08
The importance of health policy and systems research and analysis (HPSR+A) has been increasingly recognised, but it is still unclear how most effectively to strengthen the capacity of the different organisations involved in this field. Universities are particularly crucial, but the expansive literature on capacity development has little to offer the unique needs of HPSR+A activity within universities, and often overlooks the pivotal contribution of capacity assessments to capacity strengthening. The Consortium for Health Policy and Systems Analysis in Africa 2011-2015 designed and implemented a new framework for capacity assessment for HPSR+A within universities. The methodology is reported in detail. Our reflections on developing and conducting the assessment generated four lessons for colleagues in the field. Notably, there are currently no published capacity assessment methodologies for HPSR+A that focus solely on universities; we report a first for the field to initiate the dialogue and exchange of experiences with others. Second, in HPSR+A the unit of assessment can be a challenge, because HPSR+A groups within universities tend to overlap between academic departments and are embedded in different networks. Third, capacity assessment experience can itself be capacity strengthening, even taking into account that doing such assessments requires capacity. From our experience, we propose that future systematic assessments of HPSR+A capacity focus on both capacity assets and needs and assess capacity at individual, organisational, and systems levels, whilst taking into account the networked nature of HPSR+A activity. A genuine partnership process between evaluators and those participating in an assessment can improve the quality of the assessment and the uptake of results in capacity strengthening.
ERIC Educational Resources Information Center
Dickson, Martina; Ladefoged, Svend Erik
2017-01-01
This article focuses on a teaching methodology project which investigated issues of teaching quality at a technical and vocational education and training (TVET) academy in Kurdistan, Northern Iraq. The academy was established in 2012 to provide unemployed youth with TVET, particularly workplace-relevant training. A needs analysis showed that the…
ERIC Educational Resources Information Center
Storey, Keith
This final report briefly describes activities of a project which developed and evaluated specific natural support intervention procedures to increase the social integration of employees with severe disabilities using single-subject, clique analysis, and social validation methodologies. The project resulted in the publication of 6 journal articles…
The Long and Winding Road: Problems in Developing Capabilities in an Undergraduate Commerce Degree
ERIC Educational Resources Information Center
Calma, Angelito
2017-01-01
Purpose: The purpose of this paper is to provide an analysis of specific learning outcomes in an undergraduate commerce degree in a large research-intensive university in Australia. Design/methodology/approach: It uses data collected from assurance of learning activities as part of Association to Advance Collegiate Schools of Business…
ERIC Educational Resources Information Center
Petrov, Mark G.
2016-01-01
Thermally activated analysis of experimental data allows consideration of the structural features of each material. By modelling the structural heterogeneity of materials by means of rheological models, general and local plastic flows in metals and alloys can be described. Based on physical fundamentals of failure and deformation of materials…
Flame filtering and perimeter localization of wildfires using aerial thermal imagery
NASA Astrophysics Data System (ADS)
Valero, Mario M.; Verstockt, Steven; Rios, Oriol; Pastor, Elsa; Vandecasteele, Florian; Planas, Eulàlia
2017-05-01
Airborne thermal infrared (TIR) imaging systems are being increasingly used for wildfire tactical monitoring since they show important advantages over spaceborne platforms and visible sensors while becoming much more affordable and much lighter than multispectral cameras. However, the analysis of aerial TIR images entails a number of difficulties which have thus far prevented monitoring tasks from being fully automated. One of these issues is the appearance of flame projections during the geo-correction of off-nadir images. Filtering these flames is essential in order to accurately estimate the geographical location of the fuel burning interface. Therefore, we present a methodology which allows the automatic localisation of the active fire contour free of flame projections. The actively burning area is detected in TIR georeferenced images through a combination of intensity thresholding techniques, morphological processing and active contours. Subsequently, flame projections are filtered out by the temporal frequency analysis of the appropriate contour descriptors. The proposed algorithm was tested on footage acquired during three large-scale field experimental burns. Results suggest this methodology may be suitable for automating the acquisition of quantitative data about the fire evolution. As future work, a revision of the low-pass filter implemented for the temporal analysis (currently a median filter) was recommended. The availability of up-to-date information about the fire state would improve situational awareness during an emergency response and may be used to calibrate data-driven simulators capable of emitting short-term accurate forecasts of the subsequent fire evolution.
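The detection step (intensity thresholding plus morphological cleaning) can be sketched as follows; the synthetic frame, temperature threshold, and structuring element are arbitrary illustrative choices, and the active-contour and temporal-filtering stages are omitted.

```python
# Threshold a fake TIR frame, remove small spurious blobs by morphological opening,
# and keep the largest connected hot region as the actively burning area.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(9)
frame = rng.normal(300, 5, size=(240, 320))            # fake TIR frame (K)
frame[100:140, 150:220] += 250                         # synthetic burning area

hot = frame > 450                                      # intensity threshold (K)
clean = ndimage.binary_opening(hot, structure=np.ones((3, 3)))

labels, n = ndimage.label(clean)
sizes = ndimage.sum(clean, labels, index=range(1, n + 1))
fire_mask = labels == (np.argmax(sizes) + 1)           # largest connected hot region
print("active-fire pixels:", int(fire_mask.sum()))
```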
Chen, Li-Ding; Lu, Yi-He; Tian, Hui-Ying; Shi, Qian
2007-03-01
Global ecological security has become increasingly important with intensive human activities. The function of ecological security is influenced by human activities, and in turn, the efficiency of human activities is also affected by the patterns of regional ecological security. Since the 1990s, China has initiated the construction of key projects such as the "Yangtze Three Gorges Dam", "Qinghai-Tibet Railway", "West-to-East Gas Pipeline", "West-to-East Electricity Transmission" and "South-to-North Water Transfer". The interaction between these projects and regional ecological security has particularly attracted the attention of the Chinese government. It is not only important for regional environmental protection, but also of significance for the smooth implementation of the various projects, to develop an ecological rehabilitation system and to design a regional ecological security pattern. This paper made a systematic analysis of the types and characteristics of key project construction and their effects on the environment, and on this basis, brought forward the basic principles and methodology for ecological rehabilitation and security pattern design in such construction. It was considered that the following issues should be addressed in the implementation of a key project: 1) analysis and evaluation of the current regional ecological environment, 2) evaluation of anthropogenic disturbances and their ecological risk, 3) regional ecological rehabilitation and security pattern design, 4) scenario analysis of the environmental benefits of the regional ecological security pattern, 5) re-optimization of the regional ecological system framework, and 6) establishment of a regional ecosystem management plan.
NASA Technical Reports Server (NTRS)
Starnes, James H., Jr.; Newman, James C., Jr.; Harris, Charles E.; Piascik, Robert S.; Young, Richard D.; Rose, Cheryl A.
2003-01-01
Analysis methodologies for predicting fatigue-crack growth from rivet holes in panels subjected to cyclic loads and for predicting the residual strength of aluminum fuselage structures with cracks and subjected to combined internal pressure and mechanical loads are described. The fatigue-crack growth analysis methodology is based on small-crack theory and a plasticity induced crack-closure model, and the effect of a corrosive environment on crack-growth rate is included. The residual strength analysis methodology is based on the critical crack-tip-opening-angle fracture criterion that characterizes the fracture behavior of a material of interest, and a geometric and material nonlinear finite element shell analysis code that performs the structural analysis of the fuselage structure of interest. The methodologies have been verified experimentally for structures ranging from laboratory coupons to full-scale structural components. Analytical and experimental results based on these methodologies are described and compared for laboratory coupons and flat panels, small-scale pressurized shells, and full-scale curved stiffened panels. The residual strength analysis methodology is sufficiently general to include the effects of multiple-site damage on structural behavior.
Inference evaluation in a finite evidence domain
NASA Astrophysics Data System (ADS)
Ratway, Michael J.; Bellomo, Carryn
2000-08-01
Modeling of a target starts with a subject matter expert (SME) analysis of the available sensor(s) data. The SME then forms relationships between the data and known target attributes, called evidence, to support modeling of different types of targets or target activity. Speeds in the interval 10 to 30 knots and ranges less than 30 nautical miles are two examples of target evidence derived from sensor data. Evidence is then organized into sets to define the activities of a target and/or to distinguish different types of targets. For example, near an airport, target activities of takeoff, landing, and holding need to be evaluated in addition to target classification of civilian or commercial aircraft. This paper discusses a method for evaluation of the inferred activities over the finite evidence domain formed from the collection of models under consideration. The methodology accounts for repeated use of evidence in different models. For example, 'near an airport' is a required piece of evidence used repeatedly in the takeoff, landing, and holding models of a wide area sensor. Properties of the activity model evaluator methodology are discussed in terms of model construction, and informal results are presented for a Boolean-evidence type of problem domain.
De Brún, Aoife; McAuliffe, Eilish
2018-03-13
Health systems research recognizes the complexity of healthcare, and the interacting and interdependent nature of components of a health system. To better understand such systems, innovative methods are required to depict and analyze their structures. This paper describes social network analysis as a methodology to depict, diagnose, and evaluate health systems and networks therein. Social network analysis is a set of techniques to map, measure, and analyze social relationships between people, teams, and organizations. Through a case study exploring support relationships among senior managers in a newly established hospital group, this paper illustrates some of the commonly used network- and node-level metrics in social network analysis, and demonstrates the value of these maps and metrics for understanding systems. Network analysis offers a valuable approach for health systems and services researchers, as it provides a means to depict activity relevant to network questions of interest and to identify opinion leaders, influencers, clusters in the network, and the individuals serving as bridges across clusters. The strengths and limitations inherent in the method are discussed, and the applications of social network analysis in health services research are explored.
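As an illustration of the kind of network- and node-level metrics mentioned above, the following is a minimal sketch using the networkx library on a small invented "who seeks support from whom" network; the node names and ties are hypothetical and do not come from the case study.

```python
import networkx as nx

# Hypothetical directed "who seeks support from whom" network among managers.
edges = [("A", "B"), ("A", "C"), ("B", "C"), ("D", "C"),
         ("E", "D"), ("F", "E"), ("F", "D"), ("C", "E")]
G = nx.DiGraph(edges)

# Node-level metrics: who is sought out (in-degree) and who bridges clusters.
in_degree = nx.in_degree_centrality(G)
betweenness = nx.betweenness_centrality(G)

# Network-level metrics: overall density and reciprocity of support ties.
print("density:", nx.density(G))
print("reciprocity:", nx.reciprocity(G))
print("most sought-out supporters:", sorted(in_degree, key=in_degree.get, reverse=True)[:3])
print("key bridgers:", sorted(betweenness, key=betweenness.get, reverse=True)[:3])
```

In practice these metrics would be computed on survey-derived ties and interpreted alongside qualitative knowledge of the organisation.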
Nadine Gobron; Bernard Pinty; Ophélie Aussedat; Jing M. Chen; Warren B. Cohen; Rasmus Fensholt; Valery Gond; Karl Fred Huemmrich; Thomas Lavergne; Frédéric Méline; Jeffrey L. Privette; Inge Sandholt; Malcolm Taberner; David P. Turner; Michael M. Verstraete; Jean-Luc Widlowski
2006-01-01
This paper discusses the quality and the accuracy of the Joint Research Center (JRC) fraction of absorbed photosynthetically active radiation (FAPAR) products generated from an analysis of Sea-viewing Wide Field-of-view Sensor (SeaWiFS) data. The FAPAR value acts as an indicator of the presence and state of the vegetation and it can be estimated from remote sensing...
CNN based approach for activity recognition using a wrist-worn accelerometer.
Panwar, Madhuri; Dyuthi, S Ram; Chandra Prakash, K; Biswas, Dwaipayan; Acharyya, Amit; Maharatna, Koushik; Gautam, Arvind; Naik, Ganesh R
2017-07-01
In recent years, significant advancements have taken place in human activity recognition using various machine learning approaches. However, conventional methods have been dominated by feature engineering, involving the difficult process of optimal feature selection. This problem has been mitigated by using a novel methodology based on a deep learning framework which automatically extracts useful features and reduces the computational cost. As a proof of concept, we have attempted to design a generalized model for recognition of three fundamental movements of the human forearm performed in daily life, where data were collected from four different subjects using a single wrist-worn accelerometer sensor. The validation of the proposed model is done under different pre-processing and noisy data conditions, evaluated using three possible methods. The results show that our proposed methodology achieves an average recognition rate of 99.8% as opposed to conventional methods based on K-means clustering, linear discriminant analysis and support vector machine.
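The abstract does not give the network architecture, so the following is only a hedged sketch of a small 1D convolutional network of the general kind described: it maps a fixed-length window of tri-axial wrist accelerometer samples to one of three movement classes. The window length, layer sizes and training loop are illustrative assumptions (PyTorch is used here).

```python
import torch
import torch.nn as nn

class ActivityCNN(nn.Module):
    """Small 1D CNN: (batch, 3 axes, 128 samples) -> 3 movement classes."""
    def __init__(self, n_channels=3, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 32, 64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# One training step on random stand-in data (real data: windowed wrist accelerometer signals).
model = ActivityCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()
x = torch.randn(8, 3, 128)          # batch of 8 windows, 3 axes, 128 samples
y = torch.randint(0, 3, (8,))       # movement labels
loss = criterion(model(x), y)
loss.backward()
optimizer.step()
print("training loss:", loss.item())
```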
NASA Astrophysics Data System (ADS)
Romero, Jesus Franklin A.; Leite, Patrícia; Mantovani, Gerson L.; Lanfredi, Alexandre J. C.; Martins-Filho, Luiz S.
2011-06-01
This paper describes the experience of an introductory discipline in the engineering curricula at the Brazilian Federal University of ABC (UFABC). The university offers a common basic curriculum that must be completed by every student and can be followed by professionalising courses. The discipline 'Introduction to Engineering' presents the basis of the engineering career, its methods and thinking, together with professional commitments and regulations. The objective is to help students to consciously choose their careers, reducing the problem of having to decide on a professional future prematurely. The discipline methodology includes activities proposed by the TryEngineering website and by Brazilian engineering councils. Lectures with invited professors introduce the UFABC engineering specialities: Aerospace, Bioengineering, Energy, Environmental & Urban, Information, Instrumentation & Automation & Robotics, Management, Materials. This paper reports the proposed activities, the results obtained by the students, a critical analysis of the methodology, and the impact on the subsequent steps of students embracing an engineering career.
NASA Astrophysics Data System (ADS)
Giordan, Daniele; Hayakawa, Yuichi; Nex, Francesco; Remondino, Fabio; Tarolli, Paolo
2018-04-01
The number of scientific studies that consider possible applications of remotely piloted aircraft systems (RPASs) for the management of natural hazard effects and the identification of resulting damage has strongly increased in the last decade. Nowadays, the use of these systems is not a novelty in the scientific community, but a deeper analysis of the literature shows a lack of codified, comprehensive methodologies that can be used not only for scientific experiments but also for routine emergency operations. RPASs can acquire on-demand ultra-high-resolution images that can be used for the identification of active processes such as landslides or volcanic activity, but can also define the effects of earthquakes, wildfires and floods. In this paper, we present a review of published literature that describes experimental methodologies developed for the study and monitoring of natural hazards.
Pena-Pereira, Francisco; Duarte, Regina M B O; Trindade, Tito; Duarte, Armando C
2013-07-19
The development of a novel methodology for extraction and preconcentration of the most commonly used anionic surface active agents (SAAs), linear alkylbenzene sulfonates (LAS), is presented herein. The present method, based on the use of silica-magnetite nanoparticles modified with cationic surfactant aggregates, was developed for determination of C10-C13 LAS homologues. The proposed methodology allowed quantitative recoveries of C10-C13 LAS homologues by using a reduced amount of magnetic nanoparticles. Limits of detection were in the range 0.8-1.9 μg L(-1) for C10-C13 LAS homologues, while the repeatability, expressed as relative standard deviation (RSD), ranged from 2.0 to 3.9% (N=6). Finally, the proposed method was successfully applied to the analysis of a variety of natural water samples. Copyright © 2013 Elsevier B.V. All rights reserved.
Lee, Kian Mun; Hamid, Sharifah Bee Abd
2015-01-19
The performance of advanced photocatalytic degradation of 4-chlorophenoxyacetic acid (4-CPA) strongly depends on photocatalyst dosage, initial concentration and initial pH. In the present study, a simple response surface methodology (RSM) was applied to investigate the interaction between these three independent factors. The photocatalytic degradation of 4-CPA in aqueous medium assisted by an ultraviolet-active ZnO photocatalyst was systematically investigated, with the aim of determining the optimum processing parameters to maximize 4-CPA degradation. Based on the results obtained, a maximum of 91% of 4-CPA was successfully degraded under optimal conditions (0.02 g ZnO dosage, 20.00 mg/L of 4-CPA and pH 7.71). All the experimental data showed good agreement with the predicted results obtained from statistical analysis.
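The core arithmetic of a response surface methodology study of this kind can be sketched as a least-squares fit of a full second-order polynomial in the three factors (photocatalyst dosage, initial 4-CPA concentration, initial pH). The data below are synthetic placeholders rather than the study's measurements, and the quadratic model form is the standard RSM choice, assumed rather than taken from the paper.

```python
import numpy as np

def quadratic_design_matrix(X):
    """Full second-order model in three factors:
    intercept, linear terms, two-way interactions, and squared terms."""
    x1, x2, x3 = X.T
    return np.column_stack([
        np.ones(len(X)), x1, x2, x3,
        x1 * x2, x1 * x3, x2 * x3,
        x1 ** 2, x2 ** 2, x3 ** 2,
    ])

# Synthetic experiments: columns are ZnO dosage (g), 4-CPA (mg/L), pH.
rng = np.random.default_rng(1)
X = rng.uniform([0.01, 10, 5], [0.03, 30, 9], size=(20, 3))
true_response = lambda x: (90 - 2e4 * (x[:, 0] - 0.02) ** 2
                              - 0.05 * (x[:, 1] - 20) ** 2
                              - 3 * (x[:, 2] - 7.7) ** 2)
y = true_response(X) + rng.normal(0, 1, len(X))   # degradation (%)

beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
print("fitted second-order coefficients:", np.round(beta, 3))
```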
Sequencing CYP2D6 for the detection of poor-metabolizers in post-mortem blood samples with tramadol.
Fonseca, Suzana; Amorim, António; Costa, Heloísa Afonso; Franco, João; Porto, Maria João; Santos, Jorge Costa; Dias, Mário
2016-08-01
Tramadol concentrations and analgesic effect are dependent on CYP2D6 enzymatic activity. It is well known that some genetic polymorphisms are responsible for the variability in the expression of this enzyme and in individual drug response. The detection of allelic variants described as non-functional can be useful to explain some circumstances of death in the study of post-mortem cases with tramadol. A Sanger sequencing methodology was developed for the detection of genetic variants that cause absent or reduced CYP2D6 activity, such as the *3, *4, *6, *8, *10 and *12 alleles. This methodology, as well as the GC/MS method for the detection and quantification of tramadol and its main metabolites in blood samples, was fully validated in accordance with international guidelines. Both methodologies were successfully applied to 100 post-mortem blood samples, and the relation between toxicological and genetic results was evaluated. Tramadol metabolism, expressed as its metabolite concentration ratio (N-desmethyltramadol/O-desmethyltramadol), was shown to be correlated with the poor-metabolizer phenotype based on genetic characterization. The importance of identifying enzyme inhibitors in toxicological analysis was also demonstrated. To our knowledge, this is the first study in which a CYP2D6 sequencing methodology is validated and applied to post-mortem samples in Portugal. The developed methodology allows the data collection of post-mortem cases, which is of primordial importance to enhance the application of these genetic tools to forensic toxicology and pathology. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Using a Realist Research Methodology in Policy Analysis
ERIC Educational Resources Information Center
Lourie, Megan; Rata, Elizabeth
2017-01-01
The article describes the usefulness of a realist methodology in linking sociological theory to empirically obtained data through the development of a methodological device. Three layers of analysis were integrated: 1. the findings from a case study about Maori language education in New Zealand; 2. the identification and analysis of contradictions…
Precision-Guided Munitions Effects Representation
2017-01-03
Center for Army Analysis (CAA) by the TRADOC Analysis Center, Monterey (TRAC-MTRY). The focus of the research is to improve the current methodology for representing precision-guided munitions effects. [The remainder of this record is table-of-contents residue: Methodology (A-2), Timeline ... Methodology (C-1), MATLAB Code (C-49), Damage ...]
Wightman, Jade; Julio, Flávia; Virués-Ortega, Javier
2014-05-01
Experimental functional analysis is an assessment methodology to identify the environmental factors that maintain problem behavior in individuals with developmental disabilities and in other populations. Functional analysis provides the basis for the development of reinforcement-based approaches to treatment. This article reviews the procedures, validity, and clinical implementation of the methodological variations of functional analysis and function-based interventions. We present six variations of functional analysis methodology in addition to the typical functional analysis: brief functional analysis, single-function tests, latency-based functional analysis, functional analysis of precursors, and trial-based functional analysis. We also present the three general categories of function-based interventions: extinction, antecedent manipulation, and differential reinforcement. Functional analysis methodology is a valid and efficient approach to the assessment of problem behavior and the selection of treatment strategies.
Probabilistic structural analysis methods for space propulsion system components
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1986-01-01
The development of a three-dimensional inelastic analysis methodology for the Space Shuttle main engine (SSME) structural components is described. The methodology is composed of: (1) composite load spectra, (2) probabilistic structural analysis methods, (3) the probabilistic finite element theory, and (4) probabilistic structural analysis. The methodology has led to significant technical progress in several important aspects of probabilistic structural analysis. The program and accomplishments to date are summarized.
A Review of Citation Analysis Methodologies for Collection Management
ERIC Educational Resources Information Center
Hoffmann, Kristin; Doucette, Lise
2012-01-01
While there is a considerable body of literature that presents the results of citation analysis studies, most researchers do not provide enough detail in their methodology to reproduce the study, nor do they provide rationale for methodological decisions. In this paper, we review the methodologies used in 34 recent articles that present a…
Risk-Informed External Hazards Analysis for Seismic and Flooding Phenomena for a Generic PWR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parisi, Carlo; Prescott, Steve; Ma, Zhegang
This report describes the activities performed during FY2017 for the US-DOE Light Water Reactor Sustainability Risk-Informed Safety Margin Characterization (LWRS-RISMC) program, Industry Application #2. The scope of Industry Application #2 is to deliver a risk-informed external hazards safety analysis for a representative nuclear power plant. Following the advancements made during the previous FYs (toolkit identification, model development), FY2017 focused on increasing the level of realism of the analysis and on improving the tools and the coupling methodologies. In particular, the following objectives were achieved: calculation of building pounding and its effects on component seismic fragility; development of SAPHIRE code PRA models for a 3-loop Westinghouse PWR; set-up of a methodology for performing static-dynamic PRA coupling between the SAPHIRE and EMRALD codes; coupling of RELAP5-3D/RAVEN for performing Best-Estimate Plus Uncertainty analysis and automatic limit surface search; and execution of sample calculations demonstrating the capabilities of the toolkit in performing risk-informed external hazards safety analyses.
Active subspace uncertainty quantification for a polydomain ferroelectric phase-field model
NASA Astrophysics Data System (ADS)
Leon, Lider S.; Smith, Ralph C.; Miles, Paul; Oates, William S.
2018-03-01
Quantum-informed ferroelectric phase-field models capable of predicting material behavior are necessary for facilitating the development and production of many adaptive structures and intelligent systems. Uncertainty is present in these models, given the quantum scale at which calculations take place. A necessary analysis is to determine how the uncertainty in the response can be attributed to the uncertainty in the model inputs or parameters. A second analysis is to identify active subspaces within the original parameter space, which quantify the directions in which the model response varies most dominantly, thus reducing sampling effort and computational cost. In this investigation, we identify an active subspace for a polydomain ferroelectric phase-field model. Using the active variables as our independent variables, we then construct a surrogate model and perform Bayesian inference. Once we quantify the uncertainties in the active variables, we obtain uncertainties for the original parameters via an inverse mapping. The analysis provides insight into how active subspace methodologies can be used to reduce the computational power needed to perform Bayesian inference on model parameters informed by experimental or simulated data.
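A minimal sketch of the standard active subspace construction (eigendecomposition of the averaged outer product of model gradients) is given below, using a simple analytic function as a stand-in for the ferroelectric phase-field model; it is not the authors' code.

```python
import numpy as np

def active_subspace(grad_samples, k=1):
    """Estimate an active subspace from gradient samples.

    grad_samples: (M, m) array of model gradients at M parameter samples.
    Returns the eigenvalues and the first k eigenvectors of
    C = E[grad grad^T], which span the directions of strongest variation."""
    C = grad_samples.T @ grad_samples / grad_samples.shape[0]
    eigvals, eigvecs = np.linalg.eigh(C)
    order = np.argsort(eigvals)[::-1]
    return eigvals[order], eigvecs[:, order[:k]]

# Stand-in model f(x) = exp(a . x), whose gradient always points along a,
# so a one-dimensional active subspace should be recovered.
rng = np.random.default_rng(0)
a = np.array([0.8, 0.2, 0.05, 0.01])
X = rng.uniform(-1, 1, size=(500, 4))
grads = np.exp(X @ a)[:, None] * a          # analytic gradients
eigvals, W1 = active_subspace(grads, k=1)
print("eigenvalue decay:", np.round(eigvals, 4))
print("leading direction (should align with a / ||a||):", np.round(W1.ravel(), 3))
```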
NASA Technical Reports Server (NTRS)
Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe
2008-01-01
NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for Lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks between all three components. Strategic analysis supports strategic decision making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper presents an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.
Pendular behavior of public transport networks
NASA Astrophysics Data System (ADS)
Izawa, Mirian M.; Oliveira, Fernando A.; Cajueiro, Daniel O.; Mello, Bernardo A.
2017-07-01
In this paper, we propose a methodology that bears close resemblance to Fourier analysis of the first harmonic to study networks subject to pendular behavior. In this context, pendular behavior is characterized by people's movement from their homes to work in the morning and their movement in the opposite direction in the afternoon. Pendular behavior is a relevant phenomenon in public transport networks because it may reduce the overall efficiency of the system as a result of the asymmetric utilization of the system in different directions. We apply this methodology to the bus transport system of Brasília, a city whose commercial and residential activities take place in distinct boroughs. We show that this methodology can be used to characterize the pendular behavior of this system, identifying the most critical nodes and the times of day when the system is under the most severe demand.
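The first-harmonic idea can be sketched as follows: for each flow time series, the amplitude and phase of the one-cycle-per-day Fourier component summarize how strongly, and at what time of day, the flow peaks. The hourly series below are synthetic stand-ins for real bus flow data, not values from the Brasília system.

```python
import numpy as np

def first_harmonic(flow, samples_per_day):
    """Amplitude and phase of the one-cycle-per-day Fourier component."""
    coeffs = np.fft.rfft(flow - flow.mean())
    c1 = coeffs[1] * 2.0 / samples_per_day       # first harmonic (1 cycle/day)
    return np.abs(c1), np.angle(c1)

# Synthetic hourly flow: morning peak toward the centre, evening peak outward.
hours = np.arange(24)
inbound = 100 + 80 * np.cos(2 * np.pi * (hours - 8) / 24)    # peaks around 08:00
outbound = 100 + 80 * np.cos(2 * np.pi * (hours - 18) / 24)  # peaks around 18:00

for name, flow in [("inbound", inbound), ("outbound", outbound)]:
    amp, phase = first_harmonic(flow, samples_per_day=24)
    peak_hour = (-phase * 24 / (2 * np.pi)) % 24
    print(f"{name}: first-harmonic amplitude={amp:.1f}, peak at ~{peak_hour:.1f} h")
```

A strongly pendular link shows large first-harmonic amplitudes in both directions with roughly opposite phases, which is exactly the asymmetry the methodology is designed to expose.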
Burns, K E; Haysom, H E; Higgins, A M; Waters, N; Tahiri, R; Rushford, K; Dunstan, T; Saxby, K; Kaplan, Z; Chunilal, S; McQuilten, Z K; Wood, E M
2018-04-10
To describe the methodology to estimate the total cost of administration of a single unit of red blood cells (RBC) in adults with beta thalassaemia major in an Australian specialist haemoglobinopathy centre. Beta thalassaemia major is a genetic disorder of haemoglobin associated with multiple end-organ complications and typically requiring lifelong RBC transfusion therapy. New therapeutic agents are becoming available based on advances in understanding of the disorder and its consequences. Assessment of the true total cost of transfusion, incorporating both product and activity costs, is required in order to evaluate the benefits and costs of these new therapies. We describe the bottom-up, time-driven, activity-based costing methodology used to develop process maps to provide a step-by-step outline of the entire transfusion pathway. Detailed flowcharts for each process are described. Direct observations and timing of the process maps document all activities, resources, staff, equipment and consumables in detail. The analysis will include costs associated with performing these processes, including resources and consumables. Sensitivity analyses will be performed to determine the impact of different staffing levels, timings and probabilities associated with performing different tasks. Thirty-one process maps have been developed, with over 600 individual activities requiring multiple timings. These will be used for future detailed cost analyses. Detailed process maps using bottom-up, time-driven, activity-based costing for determining the cost of RBC transfusion in thalassaemia major have been developed. These could be adapted for wider use to understand and compare the costs and complexities of transfusion in other settings. © 2018 British Blood Transfusion Society.
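A hedged illustration of the time-driven, activity-based costing arithmetic described above: each activity's cost is its timed duration multiplied by the per-minute cost of the staff involved, plus consumables, summed along a process map and added to the product cost. The activities, times and prices below are invented placeholders, not the study's estimates.

```python
# Minimal time-driven activity-based costing sketch (illustrative figures only).
staff_cost_per_min = {"nurse": 1.10, "scientist": 1.30, "haematologist": 2.50}

# Each activity: (staff type, minutes, consumables cost) along one process map.
transfusion_process = [
    ("pre-transfusion sample collection", "nurse", 10, 3.00),
    ("crossmatch and compatibility testing", "scientist", 25, 18.00),
    ("unit issue and bedside checks", "nurse", 15, 1.50),
    ("transfusion administration and monitoring", "nurse", 150, 12.00),
    ("post-transfusion review", "haematologist", 10, 0.00),
]

activity_costs = {
    name: staff_cost_per_min[role] * minutes + consumables
    for name, role, minutes, consumables in transfusion_process
}
rbc_unit_product_cost = 380.00   # placeholder product cost per RBC unit

total = rbc_unit_product_cost + sum(activity_costs.values())
for name, cost in activity_costs.items():
    print(f"{name}: {cost:.2f}")
print(f"total cost per transfused unit: {total:.2f}")
```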
Power-sharing Partnerships: Teachers' Experiences of Participatory Methodology.
Ferreira, Ronél; Ebersöhn, Liesel; Mbongwe, Bathsheba B
2015-01-01
This article reports on the experiences of teachers as coresearchers in a long-term partnership with university researchers, who participated in an asset-based intervention project known as Supportive Teachers, Assets and Resilience (STAR). In an attempt to inform participatory research methodology, the study investigated how coresearchers (teachers) experienced power relations. We utilized Gaventa's power cube as a theoretical framework and participatory research as our methodologic paradigm. Ten teachers of a primary school in the Eastern Cape and five teachers of a secondary school in a remote area in the Mpumalanga Province in South Africa participated (n=15). We employed multiple data generation techniques, namely Participatory Reflection and Action (PRA) activities, observation, focus group discussions, and semistructured interviews, using thematic analysis and categorical aggregation for data analysis. We identified three themes, related to (1) the nature of power in participatory partnerships, (2) coresearchers' meaning making of power and partnerships, and (3) their role in taking agency. Based on these findings, we developed a framework of power-sharing partnerships to extend Gaventa's power cube theory. This framework, and its five interrelated elements (leadership as power, identifying vision and mission, synergy, interdependent role of partners, and determination), provide insight into the way coresearchers shared their experiences of participatory research methodology. We theorise power-sharing partnerships as a complementary platform hosting partners' shared strengths, skills, and experience, creating synergy in collaborative projects.
GIS Methodology for Planning Planetary-Rover Operations
NASA Technical Reports Server (NTRS)
Powell, Mark; Norris, Jeffrey; Fox, Jason; Rabe, Kenneth; Shu, I-Hsiang
2007-01-01
A document describes a methodology for utilizing image data downlinked from cameras aboard a robotic ground vehicle (rover) on a remote planet for analyzing and planning operations of the vehicle and of any associated spacecraft. Traditionally, the cataloging and presentation of large numbers of downlinked planetary-exploration images have been done by use of two organizational methods: temporal organization and correlation between activity plans and images. In contrast, the present methodology involves spatial indexing of image data by use of the computational discipline of geographic information systems (GIS), which has been maturing in terrestrial applications for decades, but, until now, has not been widely used in support of exploration of remote planets. The use of GIS to catalog data products for analysis is intended to increase efficiency and effectiveness in planning rover operations, just as GIS has proven to be a source of powerful computational tools in such terrestrial endeavors as law enforcement, military strategic planning, surveying, political science, and epidemiology. The use of GIS also satisfies the need for a map-based user interface that is intuitive to rover-activity planners, many of whom are deeply familiar with maps and know how to use them effectively in field geology.
Assessment of perceptions of clinical management in courses oriented by competency.
Gomes, Romeu; Padilha, Roberto de Queiroz; Lima, Valéria Vernaschi; Silva, Cosme Marcelo Furtado Passos da
2018-01-01
The study aims to assess perceptions of mastery of clinical management abilities among participants of courses oriented by competency and based on active teaching and learning methodologies, before and after the training process offered. Three conceptual frameworks were utilized: clinical management, the expectation of self-efficacy, and the holistic concept of competency. Methodologically, an electronic instrument adapted to a Likert scale was made available to students of the training courses in two stages: before the courses were undertaken and after their completion. The group of subjects that participated in both stages comprised 825 trainees. Means, standard deviations, and the Wilcoxon test were used in the analysis. In terms of findings, the perception of mastery of clinical management abilities generally increased after the courses, indicating a positive contribution of the training process for the students. Among other aspects of the results, it is concluded that the educational initiatives studied, oriented by competency and based on active teaching and learning methodologies, can increase participants' perception of their mastery of the abilities present in the competency profile, confirming the study's hypothesis.
On sustainable and efficient design of ground-source heat pump systems
NASA Astrophysics Data System (ADS)
Grassi, W.; Conti, P.; Schito, E.; Testi, D.
2015-11-01
This paper is mainly aimed at stressing some fundamental features of GSHP design and is based on broad research we are performing at the University of Pisa. In particular, we focus the discussion on an environmentally sustainable approach based on performance optimization over the entire operational life. The proposed methodology investigates design and management strategies to find the optimal level of exploitation of the ground source, with other technical means used to cover the remaining energy requirements and to modulate the power peaks. The method is holistic, considering the system as a whole rather than focusing only on some components usually regarded as the most important ones. Each subsystem is modeled and coupled to the others in a full set of equations, which is used within an optimization routine to reproduce the operative performance of the overall GSHP system. As a matter of fact, the recommended methodology is a 4-in-1 activity, including sizing of components, lifecycle performance evaluation, optimization, and feasibility analysis. The paper also reviews some previous works concerning possible applications of the proposed methodology. In conclusion, we describe ongoing research activities and the objectives of future work.
Transportation networks : data, analysis, methodology development and visualization.
DOT National Transportation Integrated Search
2007-12-29
This project provides data compilation, analysis methodology and visualization methodology for the current network data assets of the Alabama Department of Transportation (ALDOT). This study finds that ALDOT is faced with a considerable number of...
Radiation protection for manned space activities
NASA Technical Reports Server (NTRS)
Jordan, T. M.
1983-01-01
The Earth's natural radiation environment poses a hazard to manned space activities, directly through biological effects and indirectly through effects on materials and electronics. Standard practices are indicated that address: (1) environment models for all radiation species, including uncertainties and temporal variations; (2) upper-bound and nominal quality factors for biological radiation effects that include dose, dose rate, critical organ, and linear energy transfer variations; (3) particle transport and shielding methodology, including system and man modeling and uncertainty analysis; and (4) mission planning that includes active dosimetry, minimizes exposure during extravehicular activities, subjects every mission to a radiation review, and specifies operational procedures for forecasting, recognizing, and dealing with large solar flares.
1991-01-01
games. A leader with limited rationality will make decisions that bear a reasonable relationship to his objectives and values, but they may be ... possible reasoning of opponents before or during crisis and conflict. The methodology is intended for use in analysis and defense planning, especially ... overconfidence in prediction, failure to hedge, and failure actively to find ways to determine and affect the opponent's reasoning before it is too late
DiNardo, Thomas P.; Jackson, R. Alan
1984-01-01
An analysis of land use change for an area in Boulder County, Colorado, was conducted using digital cartographic data. The authors selected data in the Geographic Information Retrieval and Analysis System (GIRAS) format which is digitized from the 1:250,000-scale land use and land cover map series. The Map Overlay and Statistical System (MOSS) was used as an analytical tool for the study. The authors describe the methodology used in converting the GIRAS file into a MOSS format and the activities associated with the conversion.
1982-08-01
between one that provides for total protection of life and property and one that permits operators to conduct activities in a 'laissez-faire' manner ... Workers. AD-PO00 456, General Risk Analysis Methodological Implications to Explosives Risk Management Systems. AD-PO0O 457, Risk Analysis for Explosives ... The Effects of the Health and Safety at Work Act, 1974, on Military Explosives Safety Management in the United Kingdom ... Air
Payload/orbiter contamination control requirement study: Spacelab configuration contamination study
NASA Technical Reports Server (NTRS)
Bareiss, L. E.; Hetrick, M. A.; Ress, E. B.; Strange, D. A.
1976-01-01
The assessment of the Spacelab carrier-induced contaminant environment was continued, and the ability of Spacelab to meet established contamination control criteria for the space transportation system program was determined. The primary areas considered included: (1) updating, refining, and improving the Spacelab contamination computer model and contamination analysis methodology, (2) establishing the resulting adjusted induced environment predictions for comparison with the applicable criteria, (3) determining the Spacelab design and operational requirements necessary to meet the criteria, (4) conducting mission feasibility analyses of the combined Spacelab/Orbiter contaminant environment for specific proposed mission and payload mixes, and (5) establishing a preliminary Spacelab mission support plan as well as model interface requirements. A summary of those activities conducted to date with respect to the modelling, analysis, and predictions of the induced environment, including any modifications in approach or methodology utilized in the contamination assessment of the Spacelab carrier, was presented.
Suzuki, Masahiko; Mitoma, Hiroshi; Yoneyama, Mitsuru
2017-01-01
Long-term and objective monitoring is necessary for full assessment of the condition of patients with Parkinson's disease (PD). Recent advances in biotechnology have seen the development of various types of wearable (body-worn) sensor systems. By using accelerometers and gyroscopes, these devices can quantify motor abnormalities, including decreased activity and gait disturbances, as well as nonmotor signs, such as sleep disturbances and autonomic dysfunctions in PD. This review discusses methodological problems inherent in wearable devices. Until now, analysis of the mean values of motion-induced signals on a particular day has been widely applied in the clinical management of PD patients. On the other hand, the reliability of these devices to detect various events, such as freezing of gait and dyskinesia, has been less than satisfactory. Quantification of disease-specific changes rather than nonspecific changes is necessary.
Transition Characteristic Analysis of Traffic Evolution Process for Urban Traffic Network
Chen, Hong; Li, Yang
2014-01-01
The characterization of the dynamics of traffic states remains fundamental to seeking solutions for diverse traffic problems. To gain more insight into traffic dynamics in the temporal domain, this paper explores the temporal characteristics and distinct regularities in the traffic evolution process of an urban traffic network. We define traffic state patterns by clustering multidimensional traffic time series using self-organizing maps and construct a pattern transition network model that is appropriate for representing and analyzing the evolution process. The methodology is illustrated by an application to flow-rate data from multiple road sections of the network of Shenzhen's Nanshan District, China. Analysis and numerical results demonstrate that the methodology permits extracting many useful traffic transition characteristics, including stability, preference, activity, and attractiveness. In addition, more information about the relationships between these characteristics was extracted, which should be helpful in understanding the complex behavior of the temporal evolution of traffic patterns. PMID:24982969
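A minimal sketch of the two-step pipeline described above: cluster the multidimensional flow series into discrete state patterns (scikit-learn's k-means is used here as a simple stand-in for the self-organizing map) and build the pattern transition network from consecutive state labels. The flow data are synthetic, not the Nanshan District measurements.

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic "flow rate" vectors for 5 road sections over 7 days of 15-min intervals.
rng = np.random.default_rng(0)
t = np.arange(7 * 96)
daily = 1 + 0.8 * np.sin(2 * np.pi * (t % 96) / 96 - np.pi / 2)
flows = daily[:, None] * rng.uniform(300, 600, size=5) + rng.normal(0, 30, (t.size, 5))

# Step 1: discrete traffic state patterns (k-means standing in for an SOM).
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(flows)

# Step 2: pattern transition network as a matrix of empirical transition probabilities.
k = labels.max() + 1
counts = np.zeros((k, k))
for a, b in zip(labels[:-1], labels[1:]):
    counts[a, b] += 1
row_sums = counts.sum(axis=1, keepdims=True)
transition_probs = counts / np.where(row_sums == 0, 1, row_sums)

# Simple transition characteristics: stability (self-transition probability).
print("state frequencies:", np.bincount(labels, minlength=k))
print("stability per state:", np.round(np.diag(transition_probs), 3))
```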
Tracking problem solving by multivariate pattern analysis and Hidden Markov Model algorithms.
Anderson, John R
2012-03-01
Multivariate pattern analysis can be combined with Hidden Markov Model algorithms to track the second-by-second thinking as people solve complex problems. Two applications of this methodology are illustrated with a data set taken from children as they interacted with an intelligent tutoring system for algebra. The first "mind reading" application involves using fMRI activity to track what students are doing as they solve a sequence of algebra problems. The methodology achieves considerable accuracy at determining both what problem-solving step the students are taking and whether they are performing that step correctly. The second "model discovery" application involves using statistical model evaluation to determine how many substates are involved in performing a step of algebraic problem solving. This research indicates that different steps involve different numbers of substates and these substates are associated with different fluency in algebra problem solving. Copyright © 2011 Elsevier Ltd. All rights reserved.
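The HMM half of this methodology can be sketched with a plain Viterbi decoder: given per-scan class scores from a multivariate pattern classifier (random stand-in values below), decoding recovers the most likely sequence of problem-solving steps. The transition matrix and scores are illustrative assumptions, not values from the study.

```python
import numpy as np

def viterbi(log_emission, log_transition, log_initial):
    """Most likely hidden-state path given per-time-step log emission scores."""
    T, K = log_emission.shape
    score = log_initial + log_emission[0]
    back = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        cand = score[:, None] + log_transition           # (prev state, next state)
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0) + log_emission[t]
    path = np.zeros(T, dtype=int)
    path[-1] = score.argmax()
    for t in range(T - 2, -1, -1):
        path[t] = back[t + 1, path[t + 1]]
    return path

# Stand-in: classifier posteriors for 3 problem-solving steps over 20 scans.
true_steps = np.repeat([0, 1, 2], [7, 6, 7])
posteriors = np.full((20, 3), 0.15)
posteriors[np.arange(20), true_steps] = 0.70             # noisy but informative
transition = np.array([[0.8, 0.2, 0.0], [0.0, 0.8, 0.2], [0.0, 0.0, 1.0]])
path = viterbi(np.log(posteriors), np.log(transition + 1e-12), np.log(np.ones(3) / 3))
print("decoded problem-solving steps:", path)
```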
The methodology of semantic analysis for extracting physical effects
NASA Astrophysics Data System (ADS)
Fomenkova, M. A.; Kamaev, V. A.; Korobkin, D. M.; Fomenkov, S. A.
2017-01-01
The paper represents new methodology of semantic analysis for physical effects extracting. This methodology is based on the Tuzov ontology that formally describes the Russian language. In this paper, semantic patterns were described to extract structural physical information in the form of physical effects. A new algorithm of text analysis was described.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-20
... DEPARTMENT OF EDUCATION Federal Need Analysis Methodology for the 2014-15 Award Year-- Federal Pell Grant, Federal Perkins Loan, Federal Work-Study, Federal Supplemental Educational Opportunity... announces the annual updates to the tables used in the statutory Federal Need Analysis Methodology that...
Identifying treatment effect heterogeneity in clinical trials using subpopulations of events: STEPP.
Lazar, Ann A; Bonetti, Marco; Cole, Bernard F; Yip, Wai-Ki; Gelber, Richard D
2016-04-01
Investigators conducting randomized clinical trials often explore treatment effect heterogeneity to assess whether treatment efficacy varies according to patient characteristics. Identifying heterogeneity is central to making informed personalized healthcare decisions. Treatment effect heterogeneity can be investigated using subpopulation treatment effect pattern plot (STEPP), a non-parametric graphical approach that constructs overlapping patient subpopulations with varying values of a characteristic. Procedures for statistical testing using subpopulation treatment effect pattern plot when the endpoint of interest is survival remain an area of active investigation. A STEPP analysis was used to explore patterns of absolute and relative treatment effects for varying levels of a breast cancer biomarker, Ki-67, in the phase III Breast International Group 1-98 randomized clinical trial, comparing letrozole to tamoxifen as adjuvant therapy for postmenopausal women with hormone receptor-positive breast cancer. Absolute treatment effects were measured by differences in 4-year cumulative incidence of breast cancer recurrence, while relative effects were measured by the subdistribution hazard ratio in the presence of competing risks using O-E (observed-minus-expected) methodology, an intuitive non-parametric method. While estimation of hazard ratio values based on O-E methodology has been shown, a similar development for the subdistribution hazard ratio has not. Furthermore, we observed that the subpopulation treatment effect pattern plot analysis may not produce results, even with 100 patients within each subpopulation. After further investigation through simulation studies, we observed inflation of the type I error rate of the traditional test statistic and sometimes singular variance-covariance matrix estimates that may lead to results not being produced. This is due to the lack of sufficient number of events within the subpopulations, which we refer to as instability of the subpopulation treatment effect pattern plot analysis. We introduce methodology designed to improve stability of the subpopulation treatment effect pattern plot analysis and generalize O-E methodology to the competing risks setting. Simulation studies were designed to assess the type I error rate of the tests for a variety of treatment effect measures, including subdistribution hazard ratio based on O-E estimation. This subpopulation treatment effect pattern plot methodology and standard regression modeling were used to evaluate heterogeneity of Ki-67 in the Breast International Group 1-98 randomized clinical trial. We introduce methodology that generalizes O-E methodology to the competing risks setting and that improves stability of the STEPP analysis by pre-specifying the number of events across subpopulations while controlling the type I error rate. The subpopulation treatment effect pattern plot analysis of the Breast International Group 1-98 randomized clinical trial showed that patients with high Ki-67 percentages may benefit most from letrozole, while heterogeneity was not detected using standard regression modeling. The STEPP methodology can be used to study complex patterns of treatment effect heterogeneity, as illustrated in the Breast International Group 1-98 randomized clinical trial. For the subpopulation treatment effect pattern plot analysis, we recommend a minimum of 20 events within each subpopulation. © The Author(s) 2015.
Tzelepis, Flora; Paul, Christine L; Walsh, Raoul A; McElduff, Patrick; Knight, Jenny
2011-06-22
Systematic reviews demonstrated that proactive telephone counseling increases smoking cessation rates. However, these reviews did not differentiate studies by recruitment channel, did not adequately assess methodological quality, and combined different measures of abstinence. Twenty-four randomized controlled trials published before December 31, 2008, included seven with active recruitment, 16 with passive recruitment, and one with mixed recruitment. We rated methodological quality on selection bias, study design, confounders, blinding, data collection methods, withdrawals, and dropouts, according to the Quality Assessment Tool for Quantitative Studies. We conducted random effects meta-analysis to pool the results according to abstinence type and follow-up time for studies overall and segregated by recruitment channel and by methodological quality. The level of statistical heterogeneity was quantified by I(2). All statistical tests were two-sided. Methodological quality ratings indicated two strong, 10 moderate, and 12 weak studies. Overall, compared with self-help materials or no intervention control groups, proactive telephone counseling had a statistically significantly greater effect on point prevalence abstinence (nonsmoking at follow-up or abstinent for at least 24 hours, 7 days before follow-up) at 6-9 months (relative risk [RR] = 1.26, 95% confidence interval [CI] = 1.11 to 1.43, P < .001, I(2) = 21.4%) but not at 12-15 months after recruitment. This pattern also emerged when studies were segregated by recruitment channel (active, passive) or methodological quality (strong/moderate, weak). Overall, the positive effect on prolonged/continuous abstinence (abstinent for 3 months or longer before follow-up) was also statistically significantly greater at 6-9 months (RR = 1.58, CI = 1.26 to 1.98, P < .001, I(2) = 49.1%) and 12-18 months after recruitment (RR = 1.40, CI = 1.23 to 1.60, P < .001, I(2) = 18.5%). With the exception of point prevalence abstinence in the long term, these data support previous results showing that proactive telephone counseling has a positive impact on smoking cessation. Proactive telephone counseling increased prolonged/continuous abstinence long term for both actively and passively recruited smokers.
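The random-effects pooling and I(2) heterogeneity statistic reported above can be sketched with the standard DerSimonian-Laird estimator applied to log relative risks; the per-trial values below are invented examples, not the review's data.

```python
import numpy as np

def random_effects_meta(log_rr, se):
    """DerSimonian-Laird random-effects pooling of log relative risks."""
    w = 1.0 / se ** 2                                     # fixed-effect weights
    fixed = np.sum(w * log_rr) / np.sum(w)
    Q = np.sum(w * (log_rr - fixed) ** 2)                 # Cochran's Q
    df = len(log_rr) - 1
    tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_re = 1.0 / (se ** 2 + tau2)                         # random-effects weights
    pooled = np.sum(w_re * log_rr) / np.sum(w_re)
    se_pooled = np.sqrt(1.0 / np.sum(w_re))
    i2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0
    ci = (np.exp(pooled - 1.96 * se_pooled), np.exp(pooled + 1.96 * se_pooled))
    return np.exp(pooled), ci, i2

# Invented example: per-trial relative risks and standard errors of log(RR).
rr = np.array([1.10, 1.35, 1.22, 1.50, 1.05])
se = np.array([0.10, 0.15, 0.12, 0.20, 0.09])
pooled_rr, (lo, hi), i2 = random_effects_meta(np.log(rr), se)
print(f"pooled RR = {pooled_rr:.2f} (95% CI {lo:.2f} to {hi:.2f}), I^2 = {i2:.1f}%")
```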
Active Methodologies in a Queueing Systems Course for Telecommunication Engineering Studies
ERIC Educational Resources Information Center
Garcia, J.; Hernandez, A.
2010-01-01
This paper presents the results of a one-year experiment in incorporating active methodologies in a Queueing Systems course as part of the Telecommunication Engineering degree at the University of Zaragoza, Spain, during the period of adaptation to the European Higher Education Area. A problem-based learning methodology has been introduced, and…
Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria
NASA Astrophysics Data System (ADS)
Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong
2017-08-01
In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for reliability analysis and sensitivity analysis of complex components with arbitrary distribution parameters is investigated by using the perturbation method, the response surface method, the Edgeworth series and the sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. By comparison with Monte Carlo simulation, the numerical results demonstrate that the proposed methodology provides an accurate, convergent and computationally efficient method for reliability-analysis-based finite element modeling in engineering practice.
Development of economic consequence methodology for process risk analysis.
Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed
2015-04-01
A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies. © 2014 Society for Risk Analysis.
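The abstract names revised Taguchi and modified inverted normal loss functions without giving their forms, so the sketch below uses a commonly cited inverted normal loss shape (zero loss at the target, saturating at a maximum economic loss as the process deviation grows); the constants and the temperature example are placeholders, not values from the paper.

```python
import numpy as np

def inverted_normal_loss(x, target, max_loss, scale):
    """Commonly used inverted normal loss shape: zero loss at the target,
    approaching max_loss as the deviation grows (scale sets how quickly)."""
    return max_loss * (1.0 - np.exp(-((x - target) ** 2) / (2.0 * scale ** 2)))

# Example: economic loss as a reactor temperature deviates from its set point.
temperature = np.array([350.0, 355.0, 360.0, 370.0, 390.0])   # deg C, illustrative
loss = inverted_normal_loss(temperature, target=350.0, max_loss=1.0e6, scale=15.0)
for t, l in zip(temperature, loss):
    print(f"T = {t:5.1f} C -> expected loss ~ {l:,.0f} $")
```

In a full analysis of the kind described, one such loss function would be assigned to each identified loss type for a scenario and the individual losses integrated into a total economic consequence.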
A methodology for the assessment of flood hazards at the regional scale
NASA Astrophysics Data System (ADS)
Gallina, Valentina; Torresan, Silvia; Critto, Andrea; Zabeo, Alex; Semenzin, Elena; Marcomini, Antonio
2013-04-01
In recent years, the frequency of water-related disasters has increased, and recent flood events in Europe (e.g. 2002 in Central Europe, 2007 in the UK, 2010 in Italy) caused physical-environmental and socio-economic damage. Specifically, floods are the most threatening water-related disaster affecting humans, their lives and properties. Within the KULTURisk project (FP7) a Regional Risk Assessment (RRA) methodology is proposed to evaluate the benefits of risk prevention in terms of reduced environmental risks due to floods. The method is based on the KULTURisk framework and allows the identification and prioritization of targets (i.e. people, buildings, infrastructures, agriculture, natural and semi-natural systems, cultural heritage) and areas at risk from floods in the considered region by comparing the baseline scenario (i.e. the current state) with alternative scenarios (i.e. where different structural and/or non-structural measures are planned). The RRA methodology is flexible and can be adapted to different case studies (i.e. large rivers, alpine/mountain catchments, urban areas and coastal areas) and spatial scales (i.e. from the large river to the urban scale). The final aim of RRA is to help decision-makers in examining the possible environmental risks associated with uncertain future flood hazards and in identifying which prevention scenario could be the most suitable one. The RRA methodology employs Multi-Criteria Decision Analysis (MCDA) functions in order to integrate stakeholder preferences and expert judgments into the analysis. Moreover, Geographic Information Systems (GISs) are used to manage, process, analyze, and map data to facilitate the analysis and information sharing with different experts and stakeholders. In order to characterize flood risks, the proposed methodology integrates the output of hydrodynamic models with the analysis of site-specific bio-geophysical and socio-economic indicators (e.g. slope of the territory, land cover, population density, economic activities) of several case studies in order to develop risk maps that identify and prioritize relative hot-spot areas and targets at risk at the regional scale. The main outputs of the RRA are receptor-based risk maps useful for communicating the potential implications of floods in non-monetary terms to stakeholders and administrations. These maps can be a basis for the management of flood risks, as they can provide information about the indicative number of inhabitants and the types of economic activities, natural systems and cultural heritage potentially affected by flooding. Moreover, they can provide suitable information about flood risk in the considered area in order to define priorities for prevention measures and for land use planning and management. Finally, the outputs of the RRA methodology can be used as data input to the Socio-Economic Regional Risk Assessment methodology for the economic evaluation of different damages (e.g. tangible costs, intangible costs) and for the social assessment considering the benefits of the human dimension of vulnerability (i.e. adaptive and coping capacity). Within the KULTURisk project, the methodology has been applied and validated in several European case studies. Moreover, its generalization to address other types of natural hazards (e.g. earthquakes, forest fires) will be evaluated. The preliminary results of the RRA application in the KULTURisk project are presented and discussed here.
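The MCDA aggregation step can be illustrated with a minimal sketch: normalize site-specific indicators, invert those for which larger values are assumed to reduce risk, and combine them with stakeholder-derived weights. The weighted linear combination below is a stand-in for the project's actual MCDA functions, and all indicator values and weights are invented.

```python
import numpy as np

# Invented indicators for four areas: flood depth (m), population density
# (inhabitants/km^2), share of built-up land cover (0-1), slope (degrees).
indicators = np.array([
    [1.8, 3200, 0.70, 2.0],
    [0.6, 1500, 0.40, 5.0],
    [2.5,  400, 0.10, 1.0],
    [1.0, 2600, 0.55, 3.5],
])
# Higher slope is assumed to reduce relative risk here, so its score is inverted.
benefit = np.array([True, True, True, False])
weights = np.array([0.4, 0.3, 0.2, 0.1])        # stakeholder-derived weights (assumed)

mins, maxs = indicators.min(axis=0), indicators.max(axis=0)
scores = (indicators - mins) / (maxs - mins)    # min-max normalisation to [0, 1]
scores[:, ~benefit] = 1.0 - scores[:, ~benefit]

relative_risk = scores @ weights                # weighted linear MCDA aggregation
for area, risk in zip("ABCD", relative_risk):
    print(f"area {area}: relative risk score = {risk:.2f}")
```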
Research in health sciences library and information science: a quantitative analysis.
Dimitroff, A
1992-01-01
A content analysis of research articles published between 1966 and 1990 in the Bulletin of the Medical Library Association was undertaken. Four specific questions were addressed: What subjects are of interest to health sciences librarians? Who is conducting this research? How do health sciences librarians conduct their research? Do health sciences librarians obtain funding for their research activities? Bibliometric characteristics of the research articles are described and compared to characteristics of research in library and information science as a whole in terms of subject and methodology. General findings were that most research in health sciences librarianship is conducted by librarians affiliated with academic health sciences libraries (51.8%); most deals with an applied (45.7%) or a theoretical (29.2%) topic; survey (41.0%) or observational (20.7%) research methodologies are used; descriptive quantitative analytical techniques are used (83.5%); and over 25% of research is funded. The average number of authors was 1.85, average article length was 7.25 pages, and average number of citations per article was 9.23. These findings are consistent with those reported in the general library and information science literature for the most part, although specific differences do exist in methodological and analytical areas. PMID:1422504
Development of weight and cost estimates for lifting surfaces with active controls
NASA Technical Reports Server (NTRS)
Anderson, R. D.; Flora, C. C.; Nelson, R. M.; Raymond, E. T.; Vincent, J. H.
1976-01-01
Equations and methodology were developed for estimating the weight and cost incrementals due to active controls added to the wing and horizontal tail of a subsonic transport airplane. The methods are sufficiently generalized to be suitable for preliminary design. Supporting methodology and input specifications for the weight and cost equations are provided. The weight and cost equations are structured to be flexible in terms of the active control technology (ACT) flight control system specification. In order to present a self-contained package, methodology is also presented for generating ACT flight control system characteristics for the weight and cost equations. Use of the methodology is illustrated.
C. Mann; J.D. Absher
2007-01-01
The scientific inputs to management of recreation areas in Germany have been largely determined by ecologically oriented quantitative impact and conflict studies with an emphasis on nature protection. Today, however, Germany's recreational situation has changed. New activities and increased participation by people seeking different recreational experiences challenge...
How Do We Know What Is Happening Online?: A Mixed Methods Approach to Analysing Online Activity
ERIC Educational Resources Information Center
Charalampidi, Marina; Hammond, Michael
2016-01-01
Purpose: The purpose of this paper is to discuss the process of analysing online discussion and argue for the merits of mixed methods. Much research of online participation and e-learning has been either message-focused or person-focused. The former covers methodologies such as content and discourse analysis, the latter interviewing and surveys.…
ERIC Educational Resources Information Center
Bombardelli, Olga; Codato, Marta
2017-01-01
Purpose: In the present paper we describe how civic and citizenship education takes place in Italy, trying to identify strengths and weaknesses, with the aims both of understanding the situation and of identifying possible measures for improvement. Methods: The methodology implies an analysis of the official guidelines by the Ministry in this…
NASA Technical Reports Server (NTRS)
Johnson, W. Steven
1990-01-01
A workshop was held to help assess the state of the art in evaluating the long-term durability of polymeric matrix composites (PMCs) and to recommend future activities. Design and evaluation of PMCs at elevated temperatures were discussed. The workshop presentations and the findings of the workshop sessions are briefly summarized.
ERIC Educational Resources Information Center
Frame, Stanley M.
In the Spring of 1969, Bethany Nazarene College started an intensive self evaluation effort, called the Ten-Year Advance Study. Part I of the report, the Study Design, was published in October 1969. This study, Part II, relates the study activities, the methodology, and sources consulted. The effort involved over 120 administrators, faculty,…
Koush, Yury; Ashburner, John; Prilepin, Evgeny; Sladky, Ronald; Zeidman, Peter; Bibikov, Sergei; Scharnowski, Frank; Nikonorov, Artem; De Ville, Dimitri Van
2017-08-01
Neurofeedback based on real-time functional magnetic resonance imaging (rt-fMRI) is a novel and rapidly developing research field. It allows for training of voluntary control over localized brain activity and connectivity and has demonstrated promising clinical applications. Because of the rapid technical developments of MRI techniques and the availability of high-performance computing, new methodological advances in rt-fMRI neurofeedback become possible. Here we outline the core components of a novel open-source neurofeedback framework, termed Open NeuroFeedback Training (OpenNFT), which efficiently integrates these new developments. This framework is implemented using Python and Matlab source code to allow for diverse functionality, high modularity, and rapid extendibility of the software depending on the user's needs. In addition, it provides an easy interface to the functionality of Statistical Parametric Mapping (SPM) that is also open-source and one of the most widely used fMRI data analysis software. We demonstrate the functionality of our new framework by describing case studies that include neurofeedback protocols based on brain activity levels, effective connectivity models, and pattern classification approaches. This open-source initiative provides a suitable framework to actively engage in the development of novel neurofeedback approaches, so that local methodological developments can be easily made accessible to a wider range of users. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Veronesi, G; Bertù, L; Mombelli, S; Cimmino, L; Caravati, G; Conti, M; Abate, T; Ferrario, M M
2011-01-01
We discuss the methodological aspects related to the evaluation of turnover and up-down sizing as indicators of work-related stress in complex organizations such as a university hospital. To estimate the active worker population, we developed an algorithm that integrated several administrative databases. The indicators were standardized to take into account some potential confounders (age, sex, work seniority) when comparing different hospital structures and job tasks. The main advantages of our method include flexibility in the choice of the level of analysis (hospital units, job tasks, or a combination of both) and the possibility of describing trends over time to measure the success of preventive strategies.
Dynamic Palmitoylation and the Role of DHHC Proteins in T Cell Activation and Anergy
Ladygina, Nadejda; Martin, Brent R.; Altman, Amnon
2017-01-01
Although protein S-palmitoylation was first characterized >30 years ago, and is implicated in the function, trafficking, and localization of many proteins, little is known about the regulation and physiological implications of this posttranslational modification. Palmitoylation of various signaling proteins required for TCR-induced T cell activation is also necessary for their proper function. LAT (linker for activation of T cells) is an essential scaffolding protein involved in T cell development and activation, and we found that its palmitoylation is selectively impaired in anergic T cells. The recent discovery of the DHHC family of palmitoyl acyl transferases (PATs) and the establishment of sensitive and quantitative proteomics-based methods for global analysis of the palmitoyl proteome led to significant progress in studying the biology and underlying mechanisms of cellular protein palmitoylation. We are using these approaches to explore the palmitoyl proteome in T lymphocytes and, specifically, the mechanistic basis for the impaired palmitoylation of LAT in anergic T cells. This chapter reviews the history of protein palmitoylation and its role in T cell activation, the DHHC family and new methodologies for global analysis of the palmitoyl proteome, and summarizes our recent work in this area. The new methodologies will accelerate the pace of research and provide a greatly improved mechanistic and molecular understanding of the complex process of protein palmitoylation and its regulation, and the substrate specificity of the novel DHHC family. Reversible protein palmitoylation will likely prove to be an important posttranslational mechanism that regulates cellular responses, similar to protein phosphorylation and ubiquitination. PMID:21569911
Sentinel-1 data exploitation for geohazard activity map generation
NASA Astrophysics Data System (ADS)
Barra, Anna; Solari, Lorenzo; Béjar-Pizarro, Marta; Monserrat, Oriol; Herrera, Gerardo; Bianchini, Silvia; Crosetto, Michele; María Mateos, Rosa; Sarro, Roberto; Moretti, Sandro
2017-04-01
This work is focused on geohazard mapping and monitoring by exploiting Sentinel-1 (A and B) data and DInSAR (Differential Interferometric SAR (Synthetic Aperture Radar)) techniques. The interpretation of DInSAR-derived products (such as the velocity map) can be complex, especially for end users who do not usually work with radar data. The aim of this work is to generate, in a rapid way, a clear product that can be easily exploited by the authorities in geohazard management: intervention planning and prevention activities. Specifically, the presented methodology has been developed in the framework of the European project SAFETY, which is aimed at providing Civil Protection Authorities (CPA) with the capability of periodically evaluating and assessing the potential impact of geohazards (volcanic activity, earthquakes, landslides and subsidence) on urban areas. The methodology has three phases: interferogram generation; activity map generation, in terms of velocity and accumulated deformation (with time series); and Active Deformation Area (ADA) map generation. The last is the final product, derived from the original activity map by analyzing the data in a Geographic Information System (GIS) environment, which isolates only the true deformation areas from the noise. This product can be read more easily by the authorities than the original activity map, i.e. it can be better exploited to integrate other information and analyses. It also permits easy monitoring of the active areas.
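As a rough illustration of the ADA extraction step, the sketch below thresholds a DInSAR velocity map against its per-point noise level and groups neighbouring active points; the 2-sigma rule, 50 m clustering radius and minimum cluster size are assumptions for illustration, not the SAFETY project's actual parameters or code.

```python
# Illustrative sketch: flag measurement points whose deformation velocity exceeds a
# noise threshold and group neighbouring points into Active Deformation Areas (ADA).
import numpy as np
from scipy.spatial import cKDTree

def active_deformation_areas(xy, velocity, sigma, k_sigma=2.0, radius=50.0, min_pts=5):
    active = np.abs(velocity) > k_sigma * sigma           # points moving above the noise level
    pts = xy[active]
    tree = cKDTree(pts)
    labels = -np.ones(len(pts), dtype=int)                # simple region-growing clustering
    current = 0
    for i in range(len(pts)):
        if labels[i] == -1:
            stack = [i]; labels[i] = current
            while stack:
                j = stack.pop()
                for n in tree.query_ball_point(pts[j], radius):
                    if labels[n] == -1:
                        labels[n] = current; stack.append(n)
            current += 1
    sizes = np.bincount(labels) if current else np.array([])
    keep = [c for c in range(current) if sizes[c] >= min_pts]   # discard isolated noisy points
    return pts, labels, keep

rng = np.random.default_rng(0)
xy = rng.uniform(0.0, 1000.0, size=(500, 2))
xy[:30] = rng.uniform(0.0, 40.0, size=(30, 2))            # a spatially compact moving area
vel = rng.normal(0.0, 1.0, 500)
vel[:30] -= 12.0                                          # subsiding at roughly 12 mm/yr
sigma = np.full(500, 1.5)
pts, labels, keep = active_deformation_areas(xy, vel, sigma)
print(len(pts), "active points grouped into", len(keep), "ADA(s)")
```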
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
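The abstract does not give the PFA equations, but the general idea of propagating parameter uncertainty through an engineering model to a failure probability can be sketched with a simple Monte Carlo example; the Basquin-type fatigue model, distributions and required life below are illustrative assumptions, not the documented PFA software.

```python
# Minimal Monte Carlo sketch of the general idea behind probabilistic failure
# assessment: propagate uncertainty in analysis parameters through a simple
# fatigue-life model and estimate the probability that life falls short of the
# required service life. Model form, distributions and values are assumptions.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
required_cycles = 1e7

# Uncertain inputs: stress amplitude (lognormal) and fatigue coefficient.
stress = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n)   # MPa
A = rng.lognormal(mean=np.log(1e17), sigma=0.20, size=n)
m = 4.0                                                          # Basquin-type exponent

cycles_to_failure = A / stress**m                                # N = A * S**(-m)
p_failure = np.mean(cycles_to_failure < required_cycles)
print(f"Estimated failure probability: {p_failure:.4f}")
```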
Czolowski, Eliza D.; Santoro, Renee L.; Srebotnjak, Tanja
2017-01-01
Background: Higher risk of exposure to environmental health hazards near oil and gas wells has spurred interest in quantifying populations that live in proximity to oil and gas development. The available studies on this topic lack consistent methodology and ignore aspects of oil and gas development of value to public health–relevant assessment and decision-making. Objectives: We aim to present a methodological framework for oil and gas development proximity studies grounded in an understanding of hydrocarbon geology and development techniques. Methods: We geospatially overlay locations of active oil and gas wells in the conterminous United States and Census data to estimate the population living in proximity to hydrocarbon development at the national and state levels. We compare our methods and findings with existing proximity studies. Results: Nationally, we estimate that 17.6 million people live within 1,600m (∼1 mi) of at least one active oil and/or gas well. Three of the eight studies overestimate populations at risk from actively producing oil and gas wells by including wells without evidence of production or drilling completion and/or using inappropriate population allocation methods. The remaining five studies, by omitting conventional wells in regions dominated by historical conventional development, significantly underestimate populations at risk. Conclusions: The well inventory guidelines we present provide an improved methodology for hydrocarbon proximity studies by acknowledging the importance of both conventional and unconventional well counts as well as the relative exposure risks associated with different primary production categories (e.g., oil, wet gas, dry gas) and developmental stages of wells. https://doi.org/10.1289/EHP1535 PMID:28858829
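A stripped-down version of the overlay idea, counting the population of census-block centroids within 1,600 m of any active well using a haversine distance, is sketched below; the real study uses proper GIS overlays and population allocation methods, and all coordinates here are made up.

```python
# Hedged sketch of the proximity-counting idea: sum the population of census-block
# centroids lying within 1,600 m of any active well, using a simple haversine distance.
import numpy as np

def haversine_m(lat1, lon1, lat2, lon2):
    r = 6_371_000.0
    p1, p2 = np.radians(lat1), np.radians(lat2)
    dphi, dlmb = np.radians(lat2 - lat1), np.radians(lon2 - lon1)
    a = np.sin(dphi / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dlmb / 2) ** 2
    return 2 * r * np.arcsin(np.sqrt(a))

def population_near_wells(blocks, wells, radius_m=1_600.0):
    """blocks: rows of (lat, lon, population); wells: rows of (lat, lon)."""
    total = 0.0
    for lat, lon, pop in blocks:
        d = haversine_m(lat, lon, wells[:, 0], wells[:, 1])
        if np.any(d <= radius_m):          # block centroid is within ~1 mi of a well
            total += pop
    return total

# Toy example with made-up coordinates
blocks = np.array([[40.000, -105.000, 1200], [40.050, -105.050, 800]])
wells = np.array([[40.001, -105.001], [41.000, -106.000]])
print(population_near_wells(blocks, wells))   # -> 1200.0
```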
De Buck, Emmy; Pauwels, Nele S; Dieltjens, Tessa; Vandekerckhove, Philippe
2014-03-01
As part of its strategy Belgian Red Cross-Flanders underpins all its activities with evidence-based guidelines and systematic reviews. The aim of this publication is to describe in detail the methodology used to achieve this goal within an action-oriented organisation, in a timely and cost-effective way. To demonstrate transparency in our methods, we wrote a methodological charter describing the way in which we develop evidence-based materials to support our activities. Criteria were drawn up for deciding on project priority and the choice of different types of projects (scoping reviews, systematic reviews and evidence-based guidelines). While searching for rigorous and realistically attainable methodological standards, we encountered a wide variety in terminology and methodology used in the field of evidence-based practice. Terminologies currently being used by different organisations and institutions include systematic reviews, systematic literature searches, evidence-based guidelines, rapid reviews, pragmatic systematic reviews, and rapid response service. It is not always clear what the definition and methodology is behind these terms and whether they are used consistently. We therefore describe the terminology and methodology used by Belgian Red Cross-Flanders; criteria for making methodological choices and details on the methodology we use are given. In our search for an appropriate methodology, taking into account time and resource constraints, we encountered an enormous variety of methodological approaches and terminology used for evidence-based materials. In light of this, we recommend that authors of evidence-based guidelines and reviews are transparent and clear about the methodology used. To be transparent about our approach, we developed a methodological charter. This charter may inspire other organisations that want to use evidence-based methodology to support their activities.
Methodologies in Cultural-Historical Activity Theory: The Example of School-Based Development
ERIC Educational Resources Information Center
Postholm, May Britt
2015-01-01
Background and purpose: Relatively little research has been conducted on methodology within Cultural-Historical Activity Theory (CHAT). CHAT is mainly used as a framework for developmental processes. The purpose of this article is to discuss both focuses for research and research questions within CHAT and to outline methodologies that can be used…
NASA Astrophysics Data System (ADS)
Tinoco, Hector A.; Ovalle, Alex M.; Vargas, Carlos A.; Cardona, María J.
2015-09-01
In the context of industrial engineering, predetermined time systems (PTS) play an important role in workplaces because inefficiencies are found in assembly processes that require manual manipulation. In this study, an approach is proposed to analyze time and motions in a manual process using a motion capture system embedded in a virtual environment. The motion capture system tracks passive IR markers located on the hands to record the position of each one. For our purpose, a real workplace is represented virtually by domains, creating a virtual workplace based on basic geometries. Motion capture data are combined with the virtual workplace to simulate the operations carried out in it, and a time and motion analysis is completed by means of an algorithm. To test the analysis methodology, a case study was intentionally designed using and violating the principles of motion economy. In the results, it was possible to observe where the hands never crossed as well as where the hands passed through the same place. In addition, the activities done in each zone were observed, and some known deficiencies in the layout of the workplace were identified through computational analysis. Using a frequency analysis of hand velocities, errors in the chosen assembly method were revealed by differences in the hand velocities. We see an opportunity to quantify aspects that are not easily identified in a traditional time and motion analysis. The automated analysis is considered the main contribution of this study. In the industrial context, a great application is perceived in terms of monitoring the workplace to analyze repeatability, PTS, and the redistribution of the workplace and labor activities using the proposed methodology.
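A minimal sketch of the kind of computation involved, deriving hand speed from sampled marker positions and tallying time spent in workplace zones, is given below; the sampling rate, zone definitions and synthetic trajectory are assumptions, not the authors' algorithm.

```python
# Illustrative sketch: derive hand speed from 3D marker positions sampled at a fixed
# rate and tally the time each hand spends inside axis-aligned workplace zones.
import numpy as np

def hand_speed(positions, fs=120.0):
    """positions: (n_frames, 3) marker trajectory in metres; returns speed in m/s."""
    return np.linalg.norm(np.diff(positions, axis=0), axis=1) * fs

def time_per_zone(positions, zones, fs=120.0):
    """zones: dict name -> (xyz_min, xyz_max); returns seconds spent in each zone."""
    out = {}
    for name, (lo, hi) in zones.items():
        inside = np.all((positions >= lo) & (positions <= hi), axis=1)
        out[name] = inside.sum() / fs
    return out

rng = np.random.default_rng(1)
traj = np.cumsum(rng.normal(0, 0.002, size=(1200, 3)), axis=0)   # 10 s of synthetic motion
zones = {"bin_A": (np.array([-0.1, -0.1, -0.1]), np.array([0.1, 0.1, 0.1]))}
print(hand_speed(traj).mean(), time_per_zone(traj, zones))
```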
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-08
... analysis, survey methodology, geospatial analysis, econometrics, cognitive psychology, and computer science ... expertise in such areas as demography, economics, geography, psychology, statistics, survey methodology, social ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1983-02-01
Work on energy consumption in a large office building is reported, including the following tasks: (1) evaluating and testing the effectiveness of the existing ASHRAE 90-75 and 90-80 standards; (2) evaluating the effectiveness of the BEPS; (3) evaluating the effectiveness of some envelope and lighting design variables towards achieving the BEPS budgets; and (4) comparing the computer energy analysis technique, DOE-2.1, with manual calculation procedures. These tasks are the initial activities in the energy analysis of the Park Plaza Building and will serve as the basis for further understanding the results of ongoing data collection and analysis.
Failure mode effect analysis and fault tree analysis as a combined methodology in risk management
NASA Astrophysics Data System (ADS)
Wessiani, N. A.; Yoshio, F.
2018-04-01
Many studies have reported the implementation of Failure Mode Effect Analysis (FMEA) and Fault Tree Analysis (FTA) as methods in risk management. However, most studies choose only one of these two methods for their risk management methodology. On the other hand, combining the two methods reduces the drawbacks each has when implemented separately. This paper aims to combine the FMEA and FTA methodologies in assessing risk. A case study in a metal company illustrates how this methodology can be implemented. In the case study, the combined methodology assesses the internal risks that occur in the production process. Further, those internal risks should be mitigated based on their level of risk.
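One way the two methods can be chained is sketched below: failure modes are ranked by their Risk Priority Number from FMEA scores, and a small fault tree of independent basic events gives a top-event probability. All scores and probabilities are illustrative, not the case-study data.

```python
# Hedged sketch of chaining FMEA and FTA: rank failure modes by Risk Priority Number
# (severity x occurrence x detection), then estimate the top-event probability of a
# small fault tree with independent basic events. All numbers are made up.
def rpn(severity, occurrence, detection):
    return severity * occurrence * detection

failure_modes = {                      # (S, O, D) on 1-10 scales
    "furnace overheat": (9, 3, 4),
    "conveyor jam":     (5, 6, 3),
    "sensor drift":     (4, 4, 7),
}
ranked = sorted(failure_modes.items(), key=lambda kv: rpn(*kv[1]), reverse=True)

def p_or(*ps):                          # at least one independent event occurs
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

def p_and(*ps):                         # all independent events occur
    q = 1.0
    for p in ps:
        q *= p
    return q

# Top event "unplanned line stop" = (overheat OR jam) AND protection failure
p_top = p_and(p_or(0.02, 0.05), 0.10)
print(ranked, round(p_top, 4))
```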
NASA Technical Reports Server (NTRS)
Barr, B. G.; Martinko, E. A.
1976-01-01
Activities of the Kansas Applied Remote Sensing Program (KARS) designed to establish interactions on cooperative projects with decision makers in Kansas agencies in the development and application of remote sensing procedures are reported. Cooperative demonstration projects undertaken with several different agencies involved three principal areas of effort: Wildlife Habitat and Environmental Analysis; Urban and Regional Analysis; and Agricultural and Rural Analysis. These projects were designed to concentrate remote sensing concepts and methodologies on existing agency problems to ensure the continued relevance of the program and maximize the possibility of immediate operational use. Completed projects are briefly discussed.
An activity-based methodology for operations cost analysis
NASA Technical Reports Server (NTRS)
Korsmeyer, David; Bilby, Curt; Frizzell, R. A.
1991-01-01
This report describes an activity-based cost estimation method, proposed for the Space Exploration Initiative (SEI), as an alternative to NASA's traditional mass-based cost estimation method. A case study demonstrates how the activity-based cost estimation technique can be used to identify the operations that have a significant impact on costs over the life cycle of the SEI. The case study yielded an operations cost of $101 billion for the 20-year span of the lunar surface operations for the Option 5a program architecture. In addition, the results indicated that the support and training costs for the missions were the greatest contributors to the annual cost estimates. A cost-sensitivity analysis of the cultural and architectural drivers determined that the length of training and the amount of support associated with the ground support personnel for mission activities are the most significant cost contributors.
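The core of an activity-based rollup is simple to sketch: each operations activity carries a driver quantity and a unit rate, and costs are summed per year and over the program life. The activities, rates and 20-year span below are placeholders, not the study's figures.

```python
# Minimal sketch of an activity-based cost rollup. Activities, driver quantities,
# unit rates and the 20-year span are illustrative assumptions only.
activities = {                        # driver quantity per year, unit rate ($M per unit)
    "crew training":          (4,  1.5),
    "mission ground support": (12, 0.8),
    "surface logistics":      (6,  2.0),
}

def annual_cost(acts):
    return sum(qty * rate for qty, rate in acts.values())

years = 20
life_cycle_cost = annual_cost(activities) * years
print(f"Annual: ${annual_cost(activities):.1f}M, {years}-year total: ${life_cycle_cost:.1f}M")
```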
A long-term epigenetic memory switch controls bacterial virulence bimodality
Ronin, Irine; Katsowich, Naama; Rosenshine, Ilan; Balaban, Nathalie Q
2017-01-01
When pathogens enter the host, sensing of environmental cues activates the expression of virulence genes. The opposite transition of pathogens, from activating to non-activating conditions, is poorly understood. Interestingly, variability in the expression of virulence genes upon infection enhances colonization. In order to systematically detect the role of phenotypic variability in enteropathogenic E. coli (EPEC), an important human pathogen, in both virulence-activating and non-activating conditions, we employed the ScanLag methodology. The analysis revealed a bimodal growth rate. Mathematical modeling combined with experimental analysis showed that this bimodality is mediated by a hysteretic memory switch that results in the stable co-existence of non-virulent and hyper-virulent subpopulations, even after many generations of growth in non-activating conditions. We identified the per operon as the key component of the hysteretic switch. This unique hysteretic memory switch may result in persistent infection and enhanced host-to-host spreading. DOI: http://dx.doi.org/10.7554/eLife.19599.001 PMID:28178445
Souza, Beatriz C C; De Oliveira, Tiago B; Aquino, Thiago M; de Lima, Maria C A; Pitta, Ivan R; Galdino, Suely L; Lima, Edeltrudes O; Gonçalves-Silva, Teresinha; Militão, Gardênia C G; Scotti, Luciana; Scotti, Marcus T; Mendonça, Francisco J B
2012-06-01
A series of 2-[(arylidene)amino]-cycloalkyl[b]thiophene-3-carbonitriles (2a-x) was synthesized by incorporation of substituted aromatic aldehydes into Gewald adducts (1a-c). The title compounds were screened for their antifungal activity against Candida krusei and Cryptococcus neoformans and for their antiproliferative activity against a panel of 3 human cancer cell lines (HT29, NCI H-292 and HEP). For the antiproliferative activity, the partial least squares (PLS) methodology was applied. Some of the prepared compounds exhibited promising antifungal and antiproliferative properties. The most active compounds for antifungal activity were cyclohexyl[b]thiophene derivatives, and for antiproliferative activity cycloheptyl[b]thiophene derivatives, especially 2-[(1H-indol-2-yl-methylidene)amino]-5,6,7,8-tetrahydro-4H-cyclohepta[b]thiophene-3-carbonitrile (2r), which inhibited more than 97% of the growth of the three cell lines. The PLS discriminant analysis (PLS-DA) applied generated good exploratory and predictive results and showed that descriptors with shape characteristics were strongly correlated with the biological data.
Larson, Tracy A; Normand, Matthew P; Morley, Allison J; Miller, Bryon G
2014-01-01
Inadequate physical activity increases the risks related to several health problems in children; however, increasing physical activity mitigates these risks. In this study, we examined the relations between moderate-to-vigorous physical activity (MVPA) and several environmental conditions (attention, interactive play, alone, escape) with 4 preschool children. We compared the experimental conditions to a control condition and a naturalistic baseline according to a combined multielement and reversal design. Results indicated that all participants were most active in the interactive play condition and that the percentage of MVPA varied across experimental and control conditions. In addition, the frequency and duration of bouts of MVPA were greatest in the interactive play condition. The current study presents a methodology for the identification of environmental contingencies that support increased levels of MVPA in young children, and it holds promise for improving our understanding of the variables related to physical activity. © Society for the Experimental Analysis of Behavior.
NASA Astrophysics Data System (ADS)
Alkasem, Ameen; Liu, Hongwei; Zuo, Decheng; Algarash, Basheer
2018-01-01
The volume of data being collected, analyzed, and stored has exploded in recent years, in particular in relation to activity on the cloud. Large-scale data processing, analysis, and storage platforms such as cloud computing are increasingly widespread. Today, the major challenge is how to monitor and control these massive amounts of data and perform analysis in real time at scale. Traditional methods and model systems are unable to cope with these quantities of data in real time. Here we present a new methodology for constructing a model for optimizing the performance of real-time monitoring of big datasets, which combines machine learning algorithms and Apache Spark Streaming to accomplish fine-grained fault diagnosis and repair of big datasets. As a case study, we use the failure of Virtual Machines (VMs) to start up. The proposed methodology ensures that the most sensible action is carried out during the fine-grained monitoring procedure and generates the most effective and cost-saving fault repair through three control steps: (I) data collection; (II) analysis engine; and (III) decision engine. We found that running this novel methodology can save a considerable amount of time compared to the Hadoop model, without sacrificing classification accuracy or performance. The accuracy of the proposed method (92.13%) is an improvement on traditional approaches.
A methodology to event reconstruction from trace images.
Milliet, Quentin; Delémont, Olivier; Sapin, Eric; Margot, Pierre
2015-03-01
The widespread use of digital imaging devices for surveillance (CCTV) and entertainment (e.g., mobile phones, compact cameras) has increased the number of images recorded and opportunities to consider the images as traces or documentation of criminal activity. The forensic science literature focuses almost exclusively on technical issues and evidence assessment [1]. Earlier steps in the investigation phase have been neglected and must be considered. This article is the first comprehensive description of a methodology for event reconstruction using images. This formal methodology was conceptualised from practical experiences and applied to different contexts and case studies to test and refine it. Based on this practical analysis, we propose a systematic approach that includes a preliminary analysis followed by four main steps. These steps form a sequence in which the results of each step rely on the previous step. However, the methodology is not linear, but a cyclic, iterative progression for obtaining knowledge about an event. The preliminary analysis is a pre-evaluation phase, wherein the potential relevance of images is assessed. In the first step, images are detected and collected as pertinent trace material; the second step involves organising and assessing their quality and informative potential. The third step includes reconstruction using clues about space, time and actions. Finally, in the fourth step, the images are evaluated and selected as evidence. These steps are described and illustrated using practical examples. The paper outlines how images elicit information about persons, objects, space, time and actions throughout the investigation process to reconstruct an event step by step. We emphasise the hypothetico-deductive reasoning framework, which demonstrates the contribution of images to generating, refining or eliminating propositions or hypotheses. This methodology provides a sound basis for extending image use as evidence and, more generally, as clues in investigation and crime reconstruction processes. Copyright © 2015 Forensic Science Society. Published by Elsevier Ireland Ltd. All rights reserved.
Ruffault, Alexis; Czernichow, Sébastien; Hagger, Martin S; Ferrand, Margot; Erichot, Nelly; Carette, Claire; Boujut, Emilie; Flahault, Cécile
The aim of this study was to conduct a comprehensive quantitative synthesis of the effects of mindfulness training interventions on weight loss and health behaviours in adults with overweight and obesity using meta-analytic techniques. Studies included in the analysis (k=12) were randomised controlled trials investigating the effects of any form of mindfulness training on weight loss, impulsive eating, binge eating, or physical activity participation in adults with overweight and obesity. Random effects meta-analysis revealed that mindfulness training had no significant effect on weight loss, but an overall negative effect on impulsive eating (d=-1.13) and binge eating (d=-.90), and a positive effect on physical activity levels (d=.42). Meta-regression analysis showed that methodological features of the included studies accounted for 100% of the statistical heterogeneity of the effects of mindfulness training on weight loss (R² = 1.00). Among methodological features, the only significant predictor of weight loss was follow-up distance from post-intervention (β=1.18; p<.05), suggesting that longer follow-up distances were associated with greater weight loss. Results suggest that mindfulness training has short-term benefits on health-related behaviours. Future studies should explore the effectiveness of mindfulness training on long-term post-intervention weight loss in adults with overweight and obesity. Copyright © 2016 Asia Oceania Association for the Study of Obesity. Published by Elsevier Ltd. All rights reserved.
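For readers who want to see the pooling step, a hedged sketch of a DerSimonian-Laird random-effects meta-analysis of standardized mean differences is given below; the effect sizes and variances are invented, not the review's data, and the authors' exact model may differ.

```python
# Hedged sketch of DerSimonian-Laird random-effects pooling of standardized mean
# differences (d) with their sampling variances. Inputs below are illustrative only.
import numpy as np

def dersimonian_laird(d, v):
    d, v = np.asarray(d, float), np.asarray(v, float)
    w = 1.0 / v
    d_fixed = np.sum(w * d) / np.sum(w)
    q = np.sum(w * (d - d_fixed) ** 2)                      # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(d) - 1)) / c)                 # between-study variance
    w_star = 1.0 / (v + tau2)
    d_pooled = np.sum(w_star * d) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return d_pooled, d_pooled - 1.96 * se, d_pooled + 1.96 * se, tau2

d = [-1.3, -0.9, -1.1, -0.6]        # illustrative per-study effects (not the review's data)
v = [0.10, 0.08, 0.12, 0.09]        # their sampling variances
print(dersimonian_laird(d, v))
```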
Automated Methodologies for the Design of Flow Diagrams for Development and Maintenance Activities
NASA Astrophysics Data System (ADS)
Shivanand M., Handigund; Shweta, Bhat
The Software Requirements Specification (SRS) of an organization is a text document prepared by strategic management incorporating the requirements of the organization. These requirements of the ongoing business/project development process involve the software tools, the hardware devices, the manual procedures, the application programs and the communication commands. These components are appropriately ordered for achieving the mission of the concerned process, both in project development and in ongoing business processes, in different flow diagrams, viz. the activity chart, workflow diagram, activity diagram, component diagram and deployment diagram. This paper proposes two generic, automatic methodologies for the design of the various flow diagrams of (i) project development activities and (ii) the ongoing business process. The methodologies also resolve the ensuing deadlocks in the flow diagrams and determine the critical paths for the activity chart. Though the two methodologies are independent, each complements the other in validating its correctness and completeness.
[Occupational epidemiology: from analysis of the apparent to investigation of the unknown].
Zocchetti, C; Pesatori, Angela; Consonni, D
2003-01-01
This paper, as a contribution for the centenary celebration of the establishment of the "Clinica del Lavoro Luigi Devoto" in Milan (Italy), presents a brief 30 year history of the activities of its Department of Occupational Epidemiology. Studies and methodological contributions that characterized the first decade of activity are presented and grouped under the heading of analysis of known health effects. The second decade was dominated by the studies and activities that originated from the Seveso accident (dioxin), with an initial interest towards molecular epidemiology, which became increasingly relevant during the third decade when we addressed topics like melanoma, lung cancer, and benzene, in addition to dioxin. More traditional occupational approaches were not dismissed and cohort mortality studies are currently under way (textile dyeing and finishing industry, sulfuric acid, tetrafluoroethylene). Pros and cons of the epidemiologic approach are discussed in the context of occupational health and the strength of its methodological apparatus is suggested as a fundamental tool for studying adverse occupational health effects. In contrast, it is stressed how occupational epidemiology has been poorly used in the application of law 626/94. Considering that contemporary epidemiology is much more inclined towards the discovery of new work-related risks (electromagnetic fields, air pollution) than the description of known health effects, the paper suggests that occupational epidemiology enlarge its interests: people and environment outside the factories might be good candidates for study.
Sparse representation of whole-brain fMRI signals for identification of functional networks.
Lv, Jinglei; Jiang, Xi; Li, Xiang; Zhu, Dajiang; Chen, Hanbo; Zhang, Tuo; Zhang, Shu; Hu, Xintao; Han, Junwei; Huang, Heng; Zhang, Jing; Guo, Lei; Liu, Tianming
2015-02-01
There have been several recent studies that used sparse representation for fMRI signal analysis and activation detection based on the assumption that each voxel's fMRI signal is linearly composed of sparse components. Previous studies have employed sparse coding to model functional networks in various modalities and scales. These prior contributions inspired the exploration of whether/how sparse representation can be used to identify functional networks in a voxel-wise way and on the whole brain scale. This paper presents a novel, alternative methodology of identifying multiple functional networks via sparse representation of whole-brain task-based fMRI signals. Our basic idea is that all fMRI signals within the whole brain of one subject are aggregated into a big data matrix, which is then factorized into an over-complete dictionary basis matrix and a reference weight matrix via an effective online dictionary learning algorithm. Our extensive experimental results have shown that this novel methodology can uncover multiple functional networks that can be well characterized and interpreted in spatial, temporal and frequency domains based on current brain science knowledge. Importantly, these well-characterized functional network components are quite reproducible in different brains. In general, our methods offer a novel, effective and unified solution to multiple fMRI data analysis tasks including activation detection, de-activation detection, and functional network identification. Copyright © 2014 Elsevier B.V. All rights reserved.
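A toy sketch of the factorization idea, using scikit-learn's MiniBatchDictionaryLearning as a stand-in for the paper's online dictionary learning, is shown below; the matrix sizes and parameters are arbitrary and the data are synthetic, not fMRI signals.

```python
# Minimal sketch of the sparse-representation idea: rows of X play the role of voxel
# time series, the learned components are temporal bases, and the sparse codes map
# voxels onto network components. Sizes and data are toy values.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

rng = np.random.default_rng(0)
n_voxels, n_timepoints, n_components = 500, 120, 10
X = rng.normal(size=(n_voxels, n_timepoints))         # synthetic stand-in signal matrix

dl = MiniBatchDictionaryLearning(n_components=n_components, alpha=1.0,
                                 batch_size=50, random_state=0)
codes = dl.fit_transform(X)                            # (n_voxels, n_components) loadings
dictionary = dl.components_                            # (n_components, n_timepoints) bases
print(codes.shape, dictionary.shape, np.mean(codes != 0))   # sparsity of the codes
```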
Jaeschke, Anelena; Saldanha, Maria Christine Werba
2012-01-01
This article aims to analyse the activity and the health repercussions of the artisanal fishermen who use rafts at the urban beach of Ponta Negra in Natal, RN (Brazil), with emphasis on the physical demands of hauling the nets from the ocean. The methodology of Work Ergonomic Analysis (WEA) was taken as a reference. Twenty-one fishermen participated in the study (50% of the population). Observational techniques (open and systematic observations and simulations), interactional techniques (socio-economic questionnaire, conversational actions, verbalizations, self-confrontations and collective analysis) and health assessments (static postural assessment; flexion-extension flexibility of the shoulder, hamstring muscles and lumbar region; hand grip strength; the diagram of painful areas; and the Nordic Questionnaire of musculoskeletal symptoms) were applied. The results were submitted to restitution and validation with the raftsmen. Hauling the nets represents a high musculoskeletal risk, combining physical effort, forced postures with rotation of the spine, and movement repetition, mainly flexion-extension of the vertebral spine, aggravated by the strength demanded of the stabilizing muscles of the body to keep balance. The impacts of the activity of the fishermen of Ponta Negra related to postures and strength demands were observed in the postural assessment and are linked to this activity.
Disselhorst-Klug, Catherine; Heinze, Franziska; Breitbach-Faller, Nico; Schmitz-Rode, Thomas; Rau, Günter
2012-04-01
Coordination between perception and action is required to interact with the environment successfully. This is already trained by very young infants who perform spontaneous movements to learn how their body interacts with the environment. The strategies used by the infants for this purpose change with age. Therefore, very early progresses in action control made by the infants can be investigated by monitoring the development of spontaneous motor activity. In this paper, an objective method is introduced, which allows the quantitative evaluation of the development of spontaneous motor activity in newborns. The introduced methodology is based on the acquisition of spontaneous movement trajectories of the feet by 3D movement analysis and subsequent calculation of specific movement parameters from them. With these movement-based parameters, it was possible to provide an objective description of age-dependent developmental steps in healthy newborns younger than 6 months. Furthermore, it has been shown that pathologies like infantile cerebral palsy influence development of motor activity significantly. Since the introduced methodology is objective and quantitative, it is suitable to monitor how newborns train their cognitive processes, which will enable them to cope with their environment by motor interaction.
Scotti, Luca; Genovese, Salvatore; Bucciarelli, Tonino; Martini, Filippo; Epifano, Francesco; Fiorito, Serena; Preziuso, Francesca; Taddeo, Vito Alessandro
2018-05-30
An efficient analytical strategy is reported, based on different extraction methods for the biologically active, naturally occurring oxyprenylated umbelliferone and ferulic acid derivatives 7-isopentenyloxycoumarin, auraptene, umbelliprenin, boropinic acid, and 4'-geranyloxyferulic acid, with quantification by UHPLC with spectrophotometric (UV/Vis) detection from Tea tree oil. Adsorption of the pure oil on Al2O3 (Brockmann activity II), followed by washing the resulting solid with MeOH and treating the latter with CH2Cl2, proved to be the best extraction methodology in terms of yields of oxyprenylated secondary metabolites. Among the five O-prenylphenylpropanoids under investigation, auraptene and umbelliprenin were never detected, while 4'-geranyloxyferulic acid was the most abundant compound in all three extraction methods employed. The UHPLC analytical methodology set up in the present study proved to be an effective and versatile technique for the simultaneous characterization and quantification of prenyloxyphenylpropanoids in Tea tree oil and is applicable to other complex matrices from the plant kingdom. Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Tibin, El Mubarak Musa; Al-Shorgani, Najeeb Kaid Naseer; Abuelhassan, Nawal Noureldaim; Hamid, Aidil Abdul; Kalil, Mohd Sahaid; Yusoff, Wan Mohtar Wan
2013-11-01
Cellulase production by a fungal culture of Aspergillus terreus SUK-1 using sorghum straw as substrate was investigated in solid substrate fermentation (SSF). Optimum CMCase production was achieved by testing the most influential fermentation parameters, namely incubation temperature, pH and moisture content, using Response Surface Methodology (RSM) based on a Central Composite Design (CCD). Carboxymethyl cellulase (CMCase) activity was measured as the defining factor. The results were analysed by analysis of variance (ANOVA) and a quadratic regression model was obtained. The model was found to be significant (p<0.05); the effects of temperature (25-40 °C) and pH (4-7) on CMCase activity were not significant, whereas moisture content was significant under the SSF conditions employed. A high predicted CMCase activity (0.2 U/ml) was obtained under the optimized conditions (temperature 40 °C, pH 5.4 and moisture content of 80%). The model was validated by applying the optimized conditions, and it was found to be valid.
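The response-surface step can be sketched as an ordinary least-squares fit of a second-order polynomial in coded temperature, pH and moisture; the design points and responses below are made up for illustration, not the study's measurements.

```python
# Hedged sketch of the response-surface step: fit a second-order polynomial in
# temperature (T), pH and moisture (M) to CCD runs and predict the response at a
# point near the reported optimum. Design points and responses are invented.
import numpy as np

def quad_design(X):
    T, pH, M = X.T
    return np.column_stack([np.ones(len(X)), T, pH, M, T*pH, T*M, pH*M, T**2, pH**2, M**2])

# Illustrative coded CCD runs: columns are T, pH, moisture; y is CMCase activity (U/ml)
X = np.array([[-1,-1,-1],[1,-1,-1],[-1,1,-1],[1,1,-1],[-1,-1,1],[1,-1,1],[-1,1,1],[1,1,1],
              [0,0,0],[0,0,0],[1.68,0,0],[-1.68,0,0],[0,1.68,0],[0,-1.68,0],[0,0,1.68],[0,0,-1.68]])
y = np.array([0.08,0.11,0.09,0.12,0.13,0.16,0.14,0.17,0.18,0.19,0.15,0.07,0.16,0.10,0.20,0.09])

beta, *_ = np.linalg.lstsq(quad_design(X), y, rcond=None)
optimum = np.array([[1.0, -0.2, 0.9]])                 # assumed coded optimum, illustrative
print("Predicted CMCase at optimum:", (quad_design(optimum) @ beta)[0])
```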
Webb, S. J.; Bernier, R.; Henderson, H. A.; Johnson, M. H.; Jones, E. J. H.; Lerner, M. D.; McPartland, J. C.; Nelson, C. A.; Rojas, D. C.; Townsend, J.; Westerfield, M.
2014-01-01
The EEG reflects the activation of large populations of neurons that act in synchrony and whose activity propagates to the scalp surface. This activity reflects both the brain’s background electrical activity and its activity when challenged by a task. Despite strong theoretical and methodological arguments for the use of EEG in understanding the neural correlates of autism, the practice of collecting, processing and evaluating EEG data is complex. Scientists should take into consideration both the nature of development in autism, given the life-long, pervasive course of the disorder, and the altered or atypical social, communicative, and motor behaviors, all of which require accommodations to traditional EEG environments and paradigms. This paper presents guidelines for recording, analyzing, and interpreting EEG data with participants with autism. The goal is to articulate a set of scientific standards as well as methodological considerations that will increase the general field’s understanding of EEG methods, provide support for collaborative projects, and contribute to the evaluation of results and conclusions. PMID:23975145
NASA Technical Reports Server (NTRS)
Young, Larry A.; Yetter, Jeffrey A.; Guynn, Mark D.
2006-01-01
Maturation of intelligent systems technologies and their incorporation into aerial platforms are dictating the development of new analysis tools and incorporation of such tools into existing system analysis methodologies in order to fully capture the trade-offs of autonomy on vehicle and mission success. A first-order "system analysis of autonomy" methodology is outlined in this paper. Further, this analysis methodology is subsequently applied to notional high-altitude long-endurance (HALE) aerial vehicle missions.
Performance in physiology evaluation: possible improvement by active learning strategies.
Montrezor, Luís H
2016-12-01
The evaluation process is complex and extremely important in the teaching/learning process. Evaluations are constantly employed in the classroom to assist students in the learning process and to help teachers improve the teaching process. The use of active methodologies encourages students to participate in the learning process, encourages interaction with their peers, and stimulates thinking about physiological mechanisms. This study examined the performance of medical students on physiology over four semesters with and without active engagement methodologies. Four activities were used: a puzzle, a board game, a debate, and a video. The results show that engaging in activities with active methodologies before a physiology cognitive monitoring test significantly improved student performance compared with not performing the activities. We integrate the use of these methodologies with classic lectures, and this integration appears to improve the teaching/learning process in the discipline of physiology and improves the integration of physiology with cardiology and neurology. In addition, students enjoy the activities and perform better on their evaluations when they use them. Copyright © 2016 The American Physiological Society.
NASA Technical Reports Server (NTRS)
Mason, Gregory S.; Berg, Martin C.; Mukhopadhyay, Vivek
2002-01-01
To study the effectiveness of various control system design methodologies, the NASA Langley Research Center initiated the Benchmark Active Controls Project. In this project, the various methodologies were applied to design a flutter suppression system for the Benchmark Active Controls Technology (BACT) Wing. This report describes the user's manual and software toolbox developed at the University of Washington to design a multirate flutter suppression control law for the BACT wing.
An Integrated Low-Speed Performance and Noise Prediction Methodology for Subsonic Aircraft
NASA Technical Reports Server (NTRS)
Olson, E. D.; Mavris, D. N.
2000-01-01
An integrated methodology has been assembled to compute the engine performance, takeoff and landing trajectories, and community noise levels for a subsonic commercial aircraft. Where feasible, physics-based noise analysis methods have been used to make the results more applicable to newer, revolutionary designs and to allow for a more direct evaluation of new technologies. The methodology is intended to be used with approximation methods and risk analysis techniques to allow for the analysis of a greater number of variable combinations while retaining the advantages of physics-based analysis. Details of the methodology are described and limited results are presented for a representative subsonic commercial aircraft.
Verification of Ceramic Structures
NASA Astrophysics Data System (ADS)
Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit
2012-07-01
In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
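The statistical transfer at the heart of such a verification can be sketched with the weakest-link Weibull formula, scaling coupon strength data to a full-scale effective volume; the modulus, scale strength and volumes below are assumptions for illustration, not material data from the guideline.

```python
# Hedged sketch of the weakest-link (Weibull) transfer from coupon data to a
# full-scale ceramic part: P_f = 1 - exp(-(V_eff / V_0) * (sigma / sigma_0)**m).
# All parameter values are illustrative assumptions.
import numpy as np

def weibull_failure_probability(sigma, m, sigma_0, v_eff, v_0):
    """Failure probability of a part with effective volume v_eff at stress sigma,
    given coupon Weibull modulus m and scale parameter sigma_0 for volume v_0."""
    return 1.0 - np.exp(-(v_eff / v_0) * (sigma / sigma_0) ** m)

m, sigma_0, v_0 = 10.0, 300.0, 1.0e-6      # coupon parameters (MPa, m^3), assumed
v_eff = 5.0e-5                              # effective volume of the structure, assumed
for sigma in (100.0, 150.0, 200.0):
    print(sigma, "MPa ->", weibull_failure_probability(sigma, m, sigma_0, v_eff=v_eff, v_0=v_0))
```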
Development of policies for Natura 2000 sites: a multi-criteria approach to support decision makers.
Cortina, Carla; Boggia, Antonio
2014-08-01
The aim of this study is to present a methodology to support decision makers in the choice of Natura 2000 sites needing an appropriate management plan to ensure a sustainable socio-economic development. In order to promote sustainable development in the Natura 2000 sites compatible with nature preservation, conservation measures or management plans are necessary. The main issue is to decide when only conservation measures can be applied and when the sites need an appropriate management plan. We present a case study for the Italian Region of Umbria. The methodology is based on a multi-criteria approach to identify the biodiversity index (BI), and on the development of a human activities index (HAI). By crossing the two indexes for each site on a Cartesian plane, four groups of sites were identified. Each group corresponds to a specific need for an appropriate management plan. Sites in the first group with a high level both of biodiversity and human activities have the most urgent need of an appropriate management plan to ensure sustainable development. The proposed methodology and analysis is replicable in other regions or countries by using the data available for each site in the Natura 2000 standard data form. A multi-criteria analysis is especially suitable for supporting decision makers when they deal with a multidimensional decision process. We found the multi-criteria approach particularly sound in this case, due to the concept of biodiversity itself, which is complex and multidimensional, and to the high number of alternatives (Natura 2000 sites) to be assessed. Copyright © 2014 Elsevier Ltd. All rights reserved.
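The crossing of the two indexes can be sketched as a simple quadrant classification; here the cut-off is the median of each index, which is an assumption, and the index values are invented.

```python
# Hedged sketch of the BI/HAI crossing step: assign each site to one of four groups
# depending on whether its biodiversity index (BI) and human activities index (HAI)
# fall above or below a cut-off (the median here, an assumption).
import numpy as np

def classify_sites(bi, hai):
    bi, hai = np.asarray(bi, float), np.asarray(hai, float)
    bi_cut, hai_cut = np.median(bi), np.median(hai)
    groups = []
    for b, h in zip(bi, hai):
        if b >= bi_cut and h >= hai_cut:
            groups.append("1: high BI / high HAI - management plan most urgently needed")
        elif b >= bi_cut:
            groups.append("2: high BI / low HAI")
        elif h >= hai_cut:
            groups.append("3: low BI / high HAI")
        else:
            groups.append("4: low BI / low HAI - conservation measures may suffice")
    return groups

print(classify_sites(bi=[0.8, 0.3, 0.7, 0.2], hai=[0.9, 0.8, 0.2, 0.1]))
```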
Analysis and methodology for aeronautical systems technology program planning
NASA Technical Reports Server (NTRS)
White, M. J.; Gershkoff, I.; Lamkin, S.
1983-01-01
A structured methodology was developed that allows the generation, analysis, and rank-ordering of system concepts by their benefits and costs, indicating the preferred order of implementation. The methodology is supported by a base of data on civil transport aircraft fleet growth projections and data on aircraft performance relating the contribution of each element of the aircraft to overall performance. The performance data are used to assess the benefits of proposed concepts. The methodology includes a computer program for performing the calculations needed to rank-order the concepts and compute their cumulative benefit-to-cost ratio. The use of the methodology and supporting data is illustrated through the analysis of actual system concepts from various sources.
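The rank-ordering computation itself is straightforward to sketch: concepts are sorted by benefit-to-cost ratio and the cumulative ratio is reported as concepts are added in that order. The concept names, benefits and costs below are placeholders, not data from the report.

```python
# Minimal sketch of the rank-ordering computation: sort candidate system concepts by
# benefit-to-cost ratio and report the cumulative ratio in implementation order.
concepts = {                      # name: (benefit, cost) in consistent units, illustrative
    "wingtip devices":      (120.0, 40.0),
    "advanced avionics":    (200.0, 100.0),
    "laminar flow control": (300.0, 250.0),
}

ranked = sorted(concepts.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True)
cum_b = cum_c = 0.0
for name, (b, c) in ranked:
    cum_b, cum_c = cum_b + b, cum_c + c
    print(f"{name:22s} B/C={b/c:4.2f}  cumulative B/C={cum_b/cum_c:4.2f}")
```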
CMOS Active Pixel Sensor Technology and Reliability Characterization Methodology
NASA Technical Reports Server (NTRS)
Chen, Yuan; Guertin, Steven M.; Pain, Bedabrata; Kayaii, Sammy
2006-01-01
This paper describes the technology, design features and reliability characterization methodology of a CMOS Active Pixel Sensor. Both overall chip reliability and pixel reliability are projected for the imagers.
Alonso, Monica; Cerdan, Laura; Godayol, Anna; Anticó, Enriqueta; Sanchez, Juan M
2011-11-11
Combining headspace (HS) sampling with a needle-trap device (NTD) to determine priority volatile organic compounds (VOCs) in water samples results in improved sensitivity and efficiency when compared to conventional static HS sampling. A 22 gauge stainless steel, 51-mm needle packed with Tenax TA and Carboxen 1000 particles is used as the NTD. Three different HS-NTD sampling methodologies are evaluated and all give limits of detection for the target VOCs in the ng L⁻¹ range. Active (purge-and-trap) HS-NTD sampling is found to give the best sensitivity but requires exhaustive control of the sampling conditions. The use of the NTD to collect the headspace gas sample results in a combined adsorption/desorption mechanism. The testing of different temperatures for the HS thermostating reveals a greater desorption effect when the sample is allowed to diffuse, whether passively or actively, through the sorbent particles. The limits of detection obtained in the simplest sampling methodology, static HS-NTD (5 mL aqueous sample in 20 mL HS vials, thermostating at 50 °C for 30 min with agitation), are sufficiently low as to permit its application to the analysis of 18 priority VOCs in natural and waste waters. In all cases compounds were detected below regulated levels. Copyright © 2011 Elsevier B.V. All rights reserved.
Comparison of Methodologies of Activation Barrier Measurements for Reactions with Deactivation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, Zhenhua; Yan, Binhang; Zhang, Li
In this work, methodologies of activation barrier measurements for reactions with deactivation were theoretically analyzed. Reforming of ethane with CO 2 was introduced as an example for reactions with deactivation to experimentally evaluate these methodologies. Both the theoretical and experimental results showed that due to catalyst deactivation, the conventional method would inevitably lead to a much lower activation barrier, compared to the intrinsic value, even though heat and mass transport limitations were excluded. In this work, an optimal method was identified in order to provide a reliable and efficient activation barrier measurement for reactions with deactivation.
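A toy numerical illustration of the bias the authors analyze is sketched below: if the rate is sampled progressively later in time at each successive temperature, a conventional Arrhenius fit returns a much lower apparent barrier, while extrapolating each temperature's rate back to time zero recovers the intrinsic value. The kinetic parameters are made up and the sketch is not the authors' method.

```python
# Hedged illustration of how deactivation biases a conventional Arrhenius measurement.
# A first-order deactivating rate is sampled later in time at each higher temperature;
# fitting those points directly underestimates Ea, while extrapolating ln(rate) back to
# time zero at each temperature recovers the intrinsic barrier. Parameters are invented.
import numpy as np

R, Ea_true, A, kd = 8.314, 120e3, 1e9, 2e-4           # J/mol, 1/s, deactivation constant
T = np.array([823.0, 848.0, 873.0, 898.0])            # K, measured sequentially
t_meas = np.array([0.0, 1800.0, 3600.0, 5400.0])      # s on stream when each T is measured

def rate(T, t):                                        # deactivating Arrhenius rate
    return A * np.exp(-Ea_true / (R * T)) * np.exp(-kd * t)

# Conventional method: one rate per temperature, taken at increasing time on stream
Ea_naive = -R * np.polyfit(1 / T, np.log(rate(T, t_meas)), 1)[0]

# Corrected method: at each T, fit ln(rate) vs absolute time and take the t = 0 intercept
t_grid = np.array([0.0, 300.0, 600.0, 900.0])
r0 = [np.exp(np.polyfit(ti + t_grid, np.log(rate(Ti, ti + t_grid)), 1)[1])
      for Ti, ti in zip(T, t_meas)]
Ea_corr = -R * np.polyfit(1 / T, np.log(r0), 1)[0]
print(f"true {Ea_true/1e3:.0f} kJ/mol, naive {Ea_naive/1e3:.0f}, corrected {Ea_corr/1e3:.0f}")
```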
NASA Technical Reports Server (NTRS)
Mason, Gregory S.; Berg, Martin C.; Mukhopadhyay, Vivek
2002-01-01
To study the effectiveness of various control system design methodologies, the NASA Langley Research Center initiated the Benchmark Active Controls Project. In this project, the various methodologies were applied to design a flutter suppression system for the Benchmark Active Controls Technology (BACT) Wing. This report describes a project at the University of Washington to design a multirate suppression system for the BACT wing. The objective of the project was twofold: first, to develop a methodology for designing robust multirate compensators, and second, to demonstrate the methodology by applying it to the design of a multirate flutter suppression system for the BACT wing.
Val, Jonatan; Pino, María Rosa; Chinarro, David
2018-03-15
Thermal quality in river ecosystems is a fundamental property for the development of biological processes and many of the human activities linked to the aquatic environment. In the future, this property is going to be threatened by global change impacts, and basin managers will need useful tools to evaluate these impacts. Currently, future projections in temperature modelling are based on the historical data for air and water temperatures and their relationship under past temperature scenarios; however, this represents a problem when evaluating future scenarios with new thermal impacts. Here, we analysed the thermal impacts produced by several human activities and linked them to the degree of decoupling of the thermal transfer mechanism from natural systems, measured with frequency-analysis tools (wavelet coherence). Once this relationship was established, we developed a new methodology for simulating different thermal impact scenarios in order to project them into the future. Finally, we validated this methodology using a site that changed its thermal quality during the studied period due to human impacts. Results showed a high correlation (r² = 0.84) between the degree of decoupling of the thermal transfer mechanisms and the quantified human impacts, yielding three thermal impact scenarios. Furthermore, the graphic representation of these thermal scenarios with their wavelet coherence spectra showed the impacts of an extreme drought period and of agricultural management. The inter-conversion between the scenarios gave high morphological similarities in the obtained wavelet coherence spectra, and the validation process clearly showed the high efficiency of the developed model against older methodologies when compared using the Nash-Sutcliffe criterion. Although there is need for further investigation under different climatic and anthropic management conditions, the developed frequency models could be useful in decision-making processes by managers faced with future global change impacts. Copyright © 2017 Elsevier B.V. All rights reserved.
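For reference, the Nash-Sutcliffe efficiency used in the validation can be computed as sketched below; the observed and simulated temperature series are synthetic, not the study's data.

```python
# Small sketch of the Nash-Sutcliffe efficiency (NSE) used for model validation:
# NSE = 1 - sum((obs - sim)**2) / sum((obs - mean(obs))**2). A value of 1 is a perfect
# fit; values <= 0 mean the model is no better than the observed mean. Data are synthetic.
import numpy as np

def nash_sutcliffe(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

t = np.arange(365)
obs = 12 + 8 * np.sin(2 * np.pi * (t - 100) / 365) + np.random.default_rng(3).normal(0, 0.5, 365)
sim = 12 + 8 * np.sin(2 * np.pi * (t - 100) / 365)          # simulated water temperature
print(round(nash_sutcliffe(obs, sim), 3))
```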
Fungistatic activity of heat-treated flaxseed determined by response surface methodology.
Xu, Y; Hall, C; Wolf-Hall, C
2008-08-01
The objective of this study was to evaluate the effect of heat treatment on the fungistatic activity of flaxseed (Linum usitatissimum) in potato dextrose agar (PDA) medium and a fresh noodle system. The radial growth of Penicillium chrysogenum, Aspergillus flavus, and a Penicillium sp. isolated from moldy noodles, as well as the mold count of fresh noodles enriched with heat-treated flaxseed, were used to assess antifungal activity. A central composite design in response surface methodology was used to predict the effect of heating temperature and time on the antifungal activity of flaxseed flour (FF). Statistical analysis determined that the linear terms of both variables (that is, heating temperature and time) and the quadratic terms of the heating temperature had significant (P<0.05) effects on the radial growth of all 3 test fungi and the mold count log-cycle reduction of fresh noodles. The interactions between temperature and time were significant for all dependent variables (P<0.05). Significant reductions in antifungal activity were found when FF was subjected to high temperatures, regardless of heating time. In contrast, prolonging the heating time did not substantially affect the antifungal activity of FF at low temperature. However, 60% of the antifungal activity was retained after FF was heated at 100 °C for 15 min, which suggests a potential use of FF as an antifungal additive in food products subjected to low to mild heat treatments.
NASA Astrophysics Data System (ADS)
Dominguez, Caroline; Nascimento, Maria M.; Payan-Carreira, Rita; Cruz, Gonçalo; Silva, Helena; Lopes, José; Morais, Maria da Felicidade A.; Morais, Eva
2015-09-01
Considering the results of research on the benefits and difficulties of peer review, this paper describes how teaching faculty, interested in fostering the acquisition of communication and critical thinking (CT) skills among engineering students, have been implementing a learning methodology through online peer review activities. When introducing a new methodology, it is important to weigh the advantages found against the conditions that might have restrained the activity outcomes, thereby modulating its overall efficiency. Our results show that several factors are decisive for the success of the methodology: the use of specific and detailed orientation guidelines for CT skills, training students on how to deliver meaningful feedback, the opportunity to counter-argue, the selection of good examples of assignments, and constant monitoring of the activity by the teacher. The results also address other aspects of the methodology, such as the thinking-skills evaluation tools (grades and tests) that best suit our context. An improved methodology is proposed taking into account the limitations encountered, thus offering other interested institutions the possibility to use, test and/or improve it.
Benefit-Cost Analysis of Integrated Paratransit Systems : Volume 6. Technical Appendices.
DOT National Transportation Integrated Search
1979-09-01
This last volume includes five technical appendices which document the methodologies used in the benefit-cost analysis. They are the following: Scenario analysis methodology; Impact estimation; Example of impact estimation; Sensitivity analysis; Agg...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mata, Pedro; Fuente, Rafael de la; Iglesias, Javier
Iberdrola (Spanish utility) and Iberdrola Ingenieria (engineering branch) have been developing during the last two years the 110% Extended Power Up-rate Project (EPU 110%) for Cofrentes BWR-6. IBERDROLA has available an in-house design and licensing reload methodology that has been approved by the Spanish Nuclear Regulatory Authority. This methodology has already been used to perform the nuclear design and the reload licensing analysis for Cofrentes cycles 12 to 14. The methodology has also been applied to develop a significant number of safety analyses of the Cofrentes Extended Power Up-rate including: Reactor Heat Balance, Core and Fuel Performance, Thermal Hydraulic Stability, ECCS LOCA Evaluation, Transient Analysis, Anticipated Transient Without Scram (ATWS) and Station Blackout (SBO). Since the scope of the licensing process of the Cofrentes Extended Power Up-rate exceeds the range of analysis included in the Cofrentes generic reload licensing process, it has been required to extend the applicability of the Cofrentes licensing methodology to the analysis of new transients. This is the case of the TLFW transient. The content of this paper shows the benefits of having an in-house design and licensing methodology, and describes the process to extend the applicability of the methodology to the analysis of new transients. The case of the analysis of Total Loss of Feedwater with the Cofrentes Retran Model is included as an example of this process. (authors)
2007-12-01
significantly affecting American society. Part of the cost is associated with the employment of the Reserve Forces, who are often subject to multiple ...he's been able to quantify that comprise the total cost of the war in Iraq over a decade-long time frame ranging from 2003-2012. Such a methodology...configurations."70 Its genesis flowed from the Army's desire to raise the number of total active divisions without a change in active duty personnel
Odor-active constituents of Cedrus atlantica wood essential oil.
Uehara, Ayaka; Tommis, Basma; Belhassen, Emilie; Satrani, Badr; Ghanmi, Mohamed; Baldovini, Nicolas
2017-12-01
The main odorant constituents of Cedrus atlantica essential oil were characterized by GC-Olfactometry (GC-O), using the Aroma Extract Dilution Analysis (AEDA) methodology with 12 panelists. The two most potent odor-active constituents were vestitenone and 4-acetyl-1-methylcyclohexene. The identification of the odorants was realized by a detailed fractionation of the essential oil by liquid-liquid basic extraction, distillation and column chromatography, followed by the GC-MS and GC-O analyses of some fractions, and the synthesis of some non-commercial reference constituents. Copyright © 2017 Elsevier Ltd. All rights reserved.
Bio-speckle assessment of bruising in fruits
NASA Astrophysics Data System (ADS)
Pajuelo, M.; Baldwin, G.; Rabal, H.; Cap, N.; Arizaga, R.; Trivi, M.
2003-07-01
The dynamic speckle patterns or bio-speckle is a phenomenon produced by laser illumination of active materials, such as a biological tissue. Fruits, even hard peel ones, show a speckle activity that can be related to maturity, turgor, damage, aging, and mechanical properties. In this case, we suggest a bio-speckle technique as a potential methodology for the study of impact on apples and the analysis of bruises produced by them. The aim is to correlate physical properties of apples with quality factors using a non-contact and non-invasive technique.
Gastric Emptying Assessment in Frequency and Time Domain Using Bio-impedance: Preliminary Results
NASA Astrophysics Data System (ADS)
Huerta-Franco, R.; Vargas-Luna, M.; Hernández, E.; Córdova, T.; Sosa, M.; Gutiérrez, G.; Reyes, P.; Mendiola, C.
2006-09-01
Impedance assessment to measure gastric emptying and, in general, gastric activity has been reported since 1985. The physiological interpretation of these measurements is still under research. This technique usually uses a single frequency and the conductivity parameter. The frequency domain and the Fourier analysis of the time-domain behavior of the gastric impedance in different gastric conditions (fasting state, and after food administration) have not been explored in detail. This work presents some insights into the potential of these alternative methodologies to measure gastric activity.
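A minimal sketch of the frequency-domain idea described above, assuming a synthetic impedance trace with a nominal 3 cycles-per-minute gastric component (the sampling rate and signal are hypothetical, not the authors' data):

```python
import numpy as np

fs = 1.0                      # sampling rate in Hz (hypothetical)
t = np.arange(0, 600, 1 / fs) # 10 minutes of bio-impedance samples
# Synthetic impedance trace: a ~3 cycles-per-minute gastric component plus noise
signal = 0.5 * np.sin(2 * np.pi * 0.05 * t) + 0.1 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
dominant = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
print(f"dominant frequency: {dominant * 60:.1f} cycles per minute")
```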
Global/local methods research using a common structural analysis framework
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. H., Jr.; Thompson, Danniella M.
1991-01-01
Methodologies for global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.
NASA Technical Reports Server (NTRS)
Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina
2004-01-01
A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert judgment for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results, is presented. A discussion of possible future steps in this research area is given.
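One common way to aggregate several expert judgments into a single probability distribution is a weighted linear opinion pool. The sketch below only illustrates that general idea; the triangular distributions, weights and parameter are hypothetical, and the report's ten-phase elicitation and calibration procedure is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical elicited judgments for one uncertain input parameter, each
# expert's opinion encoded as a triangular distribution (low, mode, high),
# with equal weights assumed after calibration.
experts = [(0.8, 1.0, 1.3), (0.9, 1.1, 1.5), (0.7, 0.95, 1.2)]
weights = np.array([1 / 3, 1 / 3, 1 / 3])

# Linear opinion pool: sample each expert's distribution in proportion
# to its weight and merge the samples into one aggregate distribution.
n = 100_000
choice = rng.choice(len(experts), size=n, p=weights)
samples = np.array([rng.triangular(*experts[i]) for i in choice])

print("aggregate mean:", samples.mean().round(3))
print("90% interval:", np.percentile(samples, [5, 95]).round(3))
```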
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
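The core of a probabilistic failure assessment of this kind is propagating parameter uncertainty through an engineering failure model to obtain a failure probability. A minimal Monte Carlo sketch under assumed load and capacity distributions (hypothetical numbers, not the PFA software):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# Hypothetical fatigue-type failure model: failure occurs when the driving
# load parameter exceeds the component capacity; both are uncertain and
# described by probability distributions.
capacity = rng.lognormal(mean=np.log(100.0), sigma=0.10, size=n)  # assumed capacity scale
load     = rng.lognormal(mean=np.log(70.0),  sigma=0.20, size=n)  # assumed duty parameter

p_fail = np.mean(load > capacity)
print(f"estimated failure probability per mission: {p_fail:.2e}")
```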
DOE Office of Scientific and Technical Information (OSTI.GOV)
Columbia River System Operation Review
1995-11-01
This Appendix J of the Final Environmental Impact Statement for the Columbia River System discusses impacts on the recreational activities in the region. Major sections include the following: scope and processes; recreation in the Columbia River Basin today - by type, location, participation, user characteristics, factors which affect usage, and managing agencies; recreation analysis procedures and methodology; and alternatives and their impacts.
NASA Technical Reports Server (NTRS)
Hatterick, G. R.
1972-01-01
Activities are documented of the study to determine skills required of on-orbit crew personnel of the space shuttle. The material is presented in four sections that include: (1) methodology for identifying flight experiment task-skill requirements, (2) task-skill analysis of selected flight experiments, (3) study results and conclusions, and (4) new technology.
A New Methodology for Simultaneous Multi-layer Retrievals of Ice and Liquid Water Cloud Properties
NASA Astrophysics Data System (ADS)
Sourdeval, O.; Labonnote, L.; Baran, A. J.; Brogniez, G.
2014-12-01
It is widely recognized that the study of clouds has nowadays become one of the major concerns of the climate research community. Consequently, a multitude of retrieval methodologies have been developed during the last decades in order to obtain accurate retrievals of cloud properties that can be supplied to climate models. Most of the current methodologies have proven to be satisfactory for separately retrieving ice or liquid cloud properties, but very few of them have attempted simultaneous retrievals of these two cloud types. Recent studies nevertheless show that the omission of one of these layers can have strong consequences on the retrievals and their accuracy. In this study, a new methodology that simultaneously retrieves the properties of ice and liquid clouds is presented. The optical thickness and the effective radius of up to two liquid cloud layers and the ice water path of one ice cloud layer are simultaneously retrieved, along with an accurate estimation of their uncertainties. Radiometric measurements ranging from the visible to the thermal infrared are used for performing the retrievals. In order to quantify the capabilities and limitations of our methodology, the results of a theoretical information content analysis are first presented. This analysis provides an a priori understanding of how much information should be expected on each of the retrieval parameters in different atmospheric conditions, and which set of channels is likely to provide this information. After such theoretical considerations, global retrievals corresponding to several months of A-Train data are presented. Comparisons of our retrievals with operational products from active and passive instruments are performed and show good overall agreement. These comparisons are useful for validating our retrievals but also for testing how operational products can be influenced by multi-layer configurations.
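Information content analyses of this kind are often summarized by the degrees of freedom for signal, the trace of the averaging kernel in an optimal-estimation framework. A small sketch with an invented Jacobian and covariance matrices (not the authors' retrieval system):

```python
import numpy as np

# Hypothetical Jacobian K (n_channels x n_state), prior covariance S_a and
# measurement-error covariance S_e for a simplified multi-layer cloud state
# (e.g., ice water path, liquid optical thickness, liquid effective radius).
K = np.array([[0.80, 0.10, 0.02],
              [0.30, 0.90, 0.10],
              [0.10, 0.40, 0.70],
              [0.05, 0.20, 0.50]])
S_a = np.diag([1.0, 1.0, 1.0])           # a priori variances (normalised units)
S_e = np.diag([0.05, 0.05, 0.10, 0.10])  # channel noise variances

S_e_inv = np.linalg.inv(S_e)
S_a_inv = np.linalg.inv(S_a)

# Averaging kernel and degrees of freedom for signal (trace of A):
G = K.T @ S_e_inv @ K
A = np.linalg.solve(G + S_a_inv, G)
print(f"degrees of freedom for signal: {np.trace(A):.2f} (out of {K.shape[1]})")
```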
NASA Astrophysics Data System (ADS)
Huda, J.; Kauneckis, D. L.
2013-12-01
Climate change adaptation represents a number of unique policy-making challenges. Foremost among these is dealing with the range of future climate impacts to a wide scope of inter-related natural systems, their interaction with social and economic systems, and uncertainty resulting from the variety of downscaled climate model scenarios and climate science projections. These cascades of uncertainty have led to a number of new approaches as well as a reexamination of traditional methods for evaluating risk and uncertainty in policy-making. Policy makers are required to make decisions and formulate policy irrespective of the level of uncertainty involved and while a debate continues regarding the level of scientific certainty required in order to make a decision, incremental change in the climate policy continues at multiple governance levels. This project conducts a comparative analysis of the range of methodological approaches that are evolving to address uncertainty in climate change policy. It defines 'methodologies' to include a variety of quantitative and qualitative approaches involving both top-down and bottom-up policy processes that attempt to enable policymakers to synthesize climate information into the policy process. The analysis examines methodological approaches to decision-making in climate policy based on criteria such as sources of policy choice information, sectors to which the methodology has been applied, sources from which climate projections were derived, quantitative and qualitative methods used to deal with uncertainty, and the benefits and limitations of each. A typology is developed to better categorize the variety of approaches and methods, examine the scope of policy activities they are best suited for, and highlight areas for future research and development.
Fieldwork Methodology in South American Maritime Archaeology: A Critical Review
NASA Astrophysics Data System (ADS)
Argüeso, Amaru; Ciarlo, Nicolás C.
2017-12-01
In archaeology, data obtained from the analysis of material evidence (i.e., the archaeological record) from extensive excavations have been a significant means for the ultimate development of interpretations about human life in the past. Therefore, the methodological procedures and tools employed during fieldwork are of crucial importance due to their effect on the information likely to be recovered. In the case of maritime archaeology, the development of rigorous methods and techniques allowed for reaching outcomes as solid as those from the work performed on land. These improvements constituted one of the principal supports—if not the most important pillar—for its acceptance as a scientific field of study. Over time, the growing diversity of sites under study (e.g., shipwrecks, ports, dockyards, and prehistoric settlements) and the underwater environments encountered made it clear that there was a need for the application of specific methodological criteria, in accordance with the particularities of the sites and of each study (e.g., the research aims and the available resources). This article presents some ideas concerning the methodologies used in South American investigations, which have exhibited a strong emphasis on the analysis of historical shipwrecks (the sixteenth to twentieth centuries). Based on a state-of-the-knowledge review of these research projects, in particular where excavations were conducted, the article focuses on the details of the main strategies adopted and results achieved. The ideas proposed in this article can be useful as a starting point for future activities of surveying, recording, and excavating shipwrecks.
Human factors evaluation of teletherapy: Function and task analysis. Volume 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaye, R.D.; Henriksen, K.; Jones, R.
1995-07-01
As a treatment methodology, teletherapy selectively destroys cancerous and other tissue by exposure to an external beam of ionizing radiation. Sources of radiation are either a radioactive isotope, typically Cobalt-60 (Co-60), or a linear accelerator. Records maintained by the NRC have identified instances of teletherapy misadministration where the delivered radiation dose has differed from the radiation prescription (e.g., instances where fractions were delivered to the wrong patient, to the wrong body part, or were too great or too little with respect to the defined treatment volume). Both human error and machine malfunction have led to misadministrations. Effective and safe treatment requires a concern for precision and consistency of human-human and human-machine interactions throughout the course of therapy. The present study is the first part of a series of human factors evaluations for identifying the root causes that lead to human error in the teletherapy environment. The human factors evaluations included: (1) a function and task analysis of teletherapy activities, (2) an evaluation of the human-system interfaces, (3) an evaluation of procedures used by teletherapy staff, (4) an evaluation of the training and qualifications of treatment staff (excluding the oncologists), (5) an evaluation of organizational practices and policies, and (6) an identification of problems and alternative approaches for NRC and industry attention. The present report addresses the function and task analysis of teletherapy activities and provides the foundation for the conduct of the subsequent evaluations. The report includes sections on background, methodology, a description of the function and task analysis, and use of the task analysis findings for the subsequent tasks. The function and task analysis data base also is included.
A knowledge discovery approach to explore some Sun/Earth's climate relationships
NASA Astrophysics Data System (ADS)
Pou, A.; Valdes, J.
2009-09-01
Recent developments in data-driven modeling and analysis, including computational intelligence techniques, may throw new light on the exploration of possible solar activity/Earth's climate relationships. Here we present three different examples of methodologies under development and some preliminary results. a) Multivariate Time Series Model Mining (MVTSMM) analysis [1] and Genetic Programming were applied to Greenland's CRETE Site-E ice core Delta O18/16 values (1721-1983, one-year interval sampling) and to sunspot activity (International Sunspot Number) during the same time span [2]. According to the results (1771 to 1933 period) indicated by the lag importance spectrum obtained with MVTSMM analysis, the sun's activity itself shows high internal variability and is inhomogeneous. The Dalton minimum, a low-activity period usually considered to occur between 1790 and 1830, is shown to be a complex structure beginning about 1778 and ending in 1840. Apparently, the system entered a new state in 1912. In the joint analysis, the analytical tool uses extensively the solar activity data to explain the Delta O18/16 data, showing areas of stable patterns, lag drifts and abrupt pattern disruptions, indicating changes of state in the solar processes of several kinds at different times. b) A similar MVTSMM analysis was conducted on Central England Temperature (CET) and solar activity data using the Group Sunspot Number (GSN), with a useful interpretive span of time from 1771 to 1916. The joint analysis involved a large number of solar activity variables, except for the 1843-1862 and 1877-1889 periods, where the discovered models used much less information from the GSN data. As with the Crete-E/ISN analysis, the lag importance spectrum of CET/GSN shows a number of clear discontinuities. A quarter of them are present in both (1778-1779, 1806, 1860-1862, 1912-1913). These experiments were designed for testing methodologies and not for specific hypothesis testing. However, it seems that the Delta O18/16 data would more readily respond to solar influences. This raises the suspicion that perhaps they reflect not only temperatures but also solar activity, as well as other possible factors not directly related to atmospheric temperatures. These methodologies may be useful as exploratory tools, directing attention to specific areas where further research should be required. This could be the case for the Delta O18/16 data, frequently considered to be a reliable and accurate proxy of temperatures. c) Another experiment was made using daily maximum temperatures from 10 Spanish meteorological stations for the period 1901-2005 [3]. Using a hybrid procedure (Differential Evolution and Fletcher-Reeves classical optimization), it was found that a subset was capable of preserving the 10-dimensional similarity when nonlinearly mapped into 1D. A daily index, F1, was applied to the whole dataset, grouped by years and transformed into a Kolmogorov-Smirnov dissimilarity matrix, space-optimized and clustered, giving the following landmarks: 1911-12, 1919-1920, 1960, 1973 and 1989. A visual comparison with the aa geomagnetic index may suggest a certain coupling with changes in the magnetic field behavior. The complexity of the patterns suggests that the possible relationships between Earth's climate and solar activity may occur in much more complex ways than just irradiance variations and simple linear correlations. REFERENCES: [1] Valdés, J.J., Bonham-Carter, G. "Time Dependent Neural Network Models For Detecting Changes of State in Complex Processes: Applications in Earth Sciences and Astronomy". Neural Networks, vol. 19, (2), pp. 196-207, 2006. [2] Valdés, J., Pou, A. "Greenland Temperatures and Solar Activity: A Computational Intelligence Approach," Proceedings of the 2007 IEEE International Joint Conference on Neural Networks (IJCNN 2007). Orlando, Florida, USA. August 12-17, 2007. [3] Valdés, J., Pou, A., Orchard, B. "Characterization of Climatic Variations in Spain at the Regional Scale: A Computational Intelligence Approach," Proceedings of the IEEE World Congress on Computational Intelligence (WCCI-2008). Hong Kong, China. June 1, 2008.
Comparison between two methodologies for urban drainage decision aid.
Moura, P M; Baptista, M B; Barraud, S
2006-01-01
The objective of the present work is to compare two methodologies based on multicriteria analysis for the evaluation of stormwater systems. The first methodology was developed in Brazil and is based on performance-cost analysis, the second one is ELECTRE III. Both methodologies were applied to a case study. Sensitivity and robustness analyses were then carried out. These analyses demonstrate that both methodologies have equivalent results, and present low sensitivity and high robustness. These results prove that the Brazilian methodology is consistent and can be used safely in order to select a good solution or a small set of good solutions that could be compared with more detailed methods afterwards.
NASA Astrophysics Data System (ADS)
Tesfamichael, Aklilu A.; Caplan, Arthur J.; Kaluarachchi, Jagath J.
2005-05-01
This study provides an improved methodology for investigating the trade-offs between the health risks and economic benefits of using atrazine in the agricultural sector by incorporating public attitude to pesticide management in the analysis. Regression models are developed to predict finished water atrazine concentration in high-risk community water supplies in the United States. The predicted finished water atrazine concentrations are then used in a health risk assessment. The computed health risks are compared with the total economic surplus in the U.S. corn market for different atrazine application rates using estimated demand and supply functions developed in this work. Analysis of different scenarios with consumer price premiums for chemical-free and reduced-chemical corn indicates that if society is willing to pay a price premium, risks can be reduced without a large reduction in the total economic surplus and net benefits may be higher. The results also show that this methodology provides an improved scientific framework for future decision making and policy evaluation in pesticide management.
Violanti, S; Fraschetta, M; Adda, S; Caputo, E
2009-12-01
Within the framework of the Environmental Agencies system's activities, coordinated by ISPRA (the Superior Institute for Environmental Protection and Research), a comparison among measurements was designed and carried out in order to examine measurement problems in depth and to evaluate the magnetic field at power frequencies. These measurements were taken near medium-voltage/low-voltage transformer substations. This project was developed with the contribution of several experts belonging to different Regional Agencies. In three of these regions, substations having specific international standard characteristics were chosen; then a measurement and data analysis protocol was arranged. Data analysis showed a good level of coherence among the results obtained by different laboratories. However, a range of problems emerged, both during the preparation of the protocol and the definition of the data analysis procedure and during the execution of the measurements and data reprocessing, because of the spatial and temporal variability of the magnetic field. These problems represent elements of particular interest in determining a correct measurement methodology whose purpose is comparison with exposure limits, attention values and quality targets.
Brain activation during human male ejaculation revisited.
Georgiadis, Janniko R; Reinders, A A T Simone; Van der Graaf, Ferdinand H C E; Paans, Anne M J; Kortekaas, Rudie
2007-04-16
In a prior [15O]-H2O positron emission tomographic study we reported brain regions involved in human male ejaculation. Here, we used another, more recently acquired data set to evaluate the methodological approach of this previous study, and discovered that part of the reported activation pattern was not related to ejaculation. With a new analysis of these ejaculation data, we now demonstrate ejaculation-related activations in the deep cerebellar nuclei (dentate nucleus), anterior vermis, pons, and ventrolateral thalamus, and, most importantly, ejaculation-related deactivations throughout the prefrontal cortex. This revision offers a new and more accurate insight into the brain regions involved in human male ejaculation.
Quantifying Low Energy Proton Damage in Multijunction Solar Cells
NASA Technical Reports Server (NTRS)
Messenger, Scott R.; Burke, Edward A.; Walters, Robert J.; Warner, Jeffrey H.; Summers, Geoffrey P.; Lorentzen, Justin R.; Morton, Thomas L.; Taylor, Steven J.
2007-01-01
An analysis of the effects of low energy proton irradiation on the electrical performance of triple junction (3J) InGaP2/GaAs/Ge solar cells is presented. The Monte Carlo ion transport code (SRIM) is used to simulate the damage profile induced in a 3J solar cell under the conditions of typical ground testing and that of the space environment. The results are used to present a quantitative analysis of the defect, and hence damage, distribution induced in the cell active region by the different radiation conditions. The modelling results show that, in the space environment, the solar cell will experience a uniform damage distribution through the active region of the cell. Through an application of the displacement damage dose analysis methodology, the implications of this result on mission performance predictions are investigated.
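A hedged sketch of the displacement damage dose bookkeeping referred to above: the dose is the product of an assumed NIEL and the fluence, and the degradation is read off a semi-empirical curve whose parameters here are invented, not fitted to the cells in the paper:

```python
import numpy as np

def remaining_factor(dd, C=0.15, Dx=1e9):
    """Semi-empirical degradation curve commonly used in displacement damage
    dose analysis: P/P0 = 1 - C*log10(1 + Dd/Dx). C and Dx are hypothetical
    fit parameters for a given cell parameter (e.g., maximum power)."""
    return 1.0 - C * np.log10(1.0 + dd / Dx)

# Hypothetical exposure:
niel = 3.5e-3            # MeV cm^2 / g, assumed NIEL for the active layer
fluence = 5e11           # protons / cm^2
dd = niel * fluence      # displacement damage dose, MeV / g
print(f"Dd = {dd:.2e} MeV/g -> P/P0 = {remaining_factor(dd):.3f}")
```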
Trace impurities analysis determined by neutron activation in the PbI2 crystal semiconductor
NASA Astrophysics Data System (ADS)
Hamada, M. M.; Oliveira, I. B.; Armelin, M. J.; Mesquita, C. H.
2003-06-01
In this work, a methodology for impurity analysis of PbI2 was studied to investigate the effectiveness of the purification. Commercial salts were purified by multi-pass zone refining and grown by the Bridgman method. To evaluate the purification efficiency, samples from the bottom, middle and upper sections of the ZR ingot were analyzed after 200, 300 and 500 purification passes by measuring the impurity concentrations with the neutron activation analysis (NAA) technique. There was a significant reduction of the impurities with the number of purification passes. The reduction efficiency was different for each element, namely: Au>Mn>Co≈Ag>K≈Br. The impurity concentrations of the crystals grown after 200, 300 and 500 passes and of the PbI2 starting material were analyzed by NAA and plasma optical emission spectroscopy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harben, P E; Harris, D; Myers, S
Seismic imaging and tracking methods have intelligence and monitoring applications. Current systems, however, do not adequately calibrate or model the unknown geological heterogeneity. Current systems are also not designed for rapid data acquisition and analysis in the field. This project seeks to build the core technological capabilities coupled with innovative deployment, processing, and analysis methodologies to allow seismic methods to be effectively utilized in the applications of seismic imaging and vehicle tracking where rapid (minutes to hours) and real-time analysis is required. The goal of this project is to build capabilities in acquisition system design, utilization and in full 3D finite difference modeling as well as statistical characterization of geological heterogeneity. Such capabilities coupled with a rapid field analysis methodology based on matched field processing are applied to problems associated with surveillance, battlefield management, finding hard and deeply buried targets, and portal monitoring. This project benefits the U.S. military and intelligence community in support of LLNL's national-security mission. FY03 was the final year of this project. In the 2.5 years this project has been active, numerous and varied developments and milestones have been accomplished. A wireless communication module for seismic data was developed to facilitate rapid seismic data acquisition and analysis. The E3D code was enhanced to include topographic effects. Codes were developed to implement the Karhunen-Loeve (K-L) statistical methodology for generating geological heterogeneity that can be utilized in E3D modeling. The matched field processing methodology applied to vehicle tracking and based on a field calibration to characterize geological heterogeneity was tested and successfully demonstrated in a tank tracking experiment at the Nevada Test Site. A 3-seismic-array vehicle tracking testbed was installed on-site at LLNL for testing real-time seismic tracking methods. A field experiment was conducted over a tunnel at the Nevada Site that quantified the tunnel reflection signal and, coupled with modeling, identified key needs and requirements in experimental layout of sensors. A large field experiment was conducted at the Lake Lynn Laboratory, a mine safety research facility in Pennsylvania, over a tunnel complex in realistic, difficult conditions. This experiment gathered the necessary data for a full 3D attempt to apply the methodology. The experiment also collected data to analyze the capabilities to detect and locate in-tunnel explosions for mine safety and other applications.
Establishing Equivalence: Methodological Progress in Group-Matching Design and Analysis
ERIC Educational Resources Information Center
Kover, Sara T.; Atwood, Amy K.
2013-01-01
This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and…
29 CFR 1926.64 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2011 CFR
2011-07-01
... analysis methodology being used. (5) The employer shall establish a system to promptly address the team's... the decision as to the appropriate PHA methodology to use. All PHA methodologies are subject to... be developed in conjunction with the process hazard analysis in sufficient detail to support the...
29 CFR 1926.64 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2010 CFR
2010-07-01
... analysis methodology being used. (5) The employer shall establish a system to promptly address the team's... the decision as to the appropriate PHA methodology to use. All PHA methodologies are subject to... be developed in conjunction with the process hazard analysis in sufficient detail to support the...
When micro meets macro: microbial lipid analysis and ecosystem ecology
NASA Astrophysics Data System (ADS)
Balser, T.; Gutknecht, J.
2008-12-01
There is growing interest in linking soil microbial community composition and activity with large-scale field studies of nutrient cycling or plant community response to disturbances. And while analysis of microbial communities has moved rapidly in the past decade from culture-based to non-culture-based techniques, it must still be asked what we have gained from the move. How well does the necessarily micro scale of microbial analysis allow us to address questions of interest at the macro scale? Several challenges exist in bridging the scales, and foremost is the question of methodological feasibility. Past microbiological methodologies have not been readily adaptable to the large sample sizes necessary for ecosystem-scale research. As a result, it has been difficult to generate compatible microbial and ecosystem data sets. We describe the use of a modified lipid extraction method to generate microbial community data sets that allow us to match landscape-scale or long-term ecological studies with microbial community data. We briefly discuss the challenges and advantages associated with lipid analysis as an approach to addressing ecosystem ecological studies, and provide examples from our research in ecosystem restoration and recovery following disturbance and climate change.
Probabilistic assessment of dynamic system performance. Part 3
DOE Office of Scientific and Technical Information (OSTI.GOV)
Belhadj, Mohamed
1993-01-01
Accurate prediction of dynamic system failure behavior can be important for the reliability and risk analyses of nuclear power plants, as well as for their backfitting to satisfy given constraints on overall system reliability, or optimization of system performance. Global analysis of dynamic systems through investigating the variations in the structure of the attractors of the system and the domains of attraction of these attractors as a function of the system parameters is also important for nuclear technology in order to understand the fault-tolerance as well as the safety margins of the system under consideration and to ensure a safe operation of nuclear reactors. Such a global analysis would be particularly relevant to future reactors with inherent or passive safety features that are expected to rely on natural phenomena rather than active components to achieve and maintain safe shutdown. Conventionally, failure and global analysis of dynamic systems necessitate the utilization of different methodologies which have computational limitations on the system size that can be handled. Using a Chapman-Kolmogorov interpretation of system dynamics, a theoretical basis is developed that unifies these methodologies as special cases and which can be used for a comprehensive safety and reliability analysis of dynamic systems.
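The Chapman-Kolmogorov view of system dynamics can be illustrated with a small discrete-time Markov sketch; the three states and transition rates below are hypothetical and far simpler than the reactor systems discussed:

```python
import numpy as np

# Hypothetical 3-state dynamic system: 0 = nominal, 1 = degraded, 2 = failed.
# Q is a generator matrix of constant transition rates (per hour).
Q = np.array([[-2e-3,  1.5e-3, 0.5e-3],
              [ 1e-3, -4e-3,   3e-3 ],
              [ 0.0,   0.0,    0.0  ]])   # the failed state is absorbing

dt, horizon = 0.1, 1000.0                 # hours
p = np.array([1.0, 0.0, 0.0])             # start in the nominal state

# Discrete-time Chapman-Kolmogorov update: p(t+dt) = p(t) (I + Q dt)
T = np.eye(3) + Q * dt
for _ in range(int(horizon / dt)):
    p = p @ T

print("state probabilities at t = 1000 h:", np.round(p, 4))
```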
Work-based physiological assessment of physically-demanding trades: a methodological overview.
Taylor, Nigel A S; Groeller, Herb
2003-03-01
Technological advances, modified work practices, altered employment strategies, work-related injuries, and the rise in work-related litigation and compensation claims necessitate ongoing trade analysis research. Such research enables the identification and development of gender- and age-neutral skills, physiological attributes and employment standards required to satisfactorily perform critical trade tasks. This paper overviews a methodological approach which may be adopted when seeking to establish trade-specific physiological competencies for physically-demanding trades (occupations). A general template is presented for conducting a trade analysis within physically-demanding trades, such as those encountered within military or emergency service occupations. Two streams of analysis are recommended: the trade analysis and the task analysis. The former involves a progressive dissection of activities and skills into a series of specific tasks (elements), and results in a broad approximation of the types of trade duties, and the links between trade tasks. The latter will lead to the determination of how a task is performed within a trade, and the physiological attributes required to satisfactorily perform that task. The approach described within this paper is designed to provide research outcomes which have high content, criterion-related and construct validities.
NASA Technical Reports Server (NTRS)
Thacker, B. H.; Mcclung, R. C.; Millwater, H. R.
1990-01-01
An eigenvalue analysis of a typical space propulsion system turbopump blade is presented using an approximate probabilistic analysis methodology. The methodology was originally developed to investigate the feasibility of computing probabilistic structural response using closed-form approximate models. This paper extends the methodology to structures for which simple closed-form solutions do not exist. The finite element method will be used for this demonstration, but the concepts apply to any numerical method. The results agree with detailed analysis results and indicate the usefulness of a probabilistic approximate analysis in determining efficient solution strategies.
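As a rough illustration of propagating parameter uncertainty into an eigenvalue-type response (not the blade finite element model of the paper), a single-mode natural frequency with assumed uncertain stiffness and mass can be sampled directly:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000

# Hypothetical single-mode approximation: natural frequency
# f = (1 / 2*pi) * sqrt(k / m), with uncertain stiffness k and mass m.
k = rng.normal(2.0e6, 1.0e5, n)   # N/m (assumed mean and scatter)
m = rng.normal(0.50, 0.02, n)     # kg
f = np.sqrt(k / m) / (2 * np.pi)  # Hz

print(f"mean frequency: {f.mean():.1f} Hz, std: {f.std():.1f} Hz")
print(f"probability f < 300 Hz: {np.mean(f < 300.0):.4f}")
```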
Oliver, Penelope; Cicerale, Sara; Pang, Edwin; Keast, Russell
2018-04-01
Temporal dominance of sensations (TDS) is a rapid descriptive method that offers a different magnitude of information to traditional descriptive analysis methodologies. This methodology considers the dynamic nature of eating, assessing sensory perception of foods as they change throughout the eating event. Limited research has applied the TDS methodology to strawberries and subsequently validated the results against Quantitative Descriptive Analysis (QDA™). The aim of this research is to compare the TDS methodology using an untrained consumer panel to the results obtained via QDA™ with a trained sensory panel. The trained panelists (n = 12, minimum 60 hr each panelist) were provided with six strawberry samples (three cultivars at two maturation levels) and applied QDA™ techniques to profile each strawberry sample. Untrained consumers (n = 103) were provided with six strawberry samples (three cultivars at two maturation levels) and required to use the TDS methodology to assess the dominant sensations for each sample as they change over time. Results revealed moderately comparable product configurations produced via TDS in comparison to QDA™ (RV coefficient = 0.559), as well as similar application of the sweet attribute (correlation coefficient of 0.895 at first bite). The TDS methodology, however, was not in agreement with the QDA™ methodology regarding more complex flavor terms. These findings support the notion that the lack of training on the definition of terms, together with the methodology's limitation of ignoring all attributes other than the dominant ones, provides a different magnitude of information than the QDA™ methodology. A comparison of TDS to traditional descriptive analysis indicates that TDS provides additional information to QDA™ regarding the lingering component of eating. The QDA™ results, however, provide more precise detail regarding singular attributes. Therefore, the TDS methodology has an application in industry when it is important to understand the lingering profile of products. However, this methodology should not be employed as a replacement to traditional descriptive analysis methods. © 2018 Institute of Food Technologists®.
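The agreement between the TDS and QDA product configurations is reported as an RV coefficient; a minimal sketch of that statistic on invented configuration matrices (not the strawberry data):

```python
import numpy as np

def rv_coefficient(X, Y):
    """RV coefficient between two column-centred configuration matrices
    (rows = products, columns = sensory dimensions)."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    Sx, Sy = X @ X.T, Y @ Y.T
    return np.trace(Sx @ Sy) / np.sqrt(np.trace(Sx @ Sx) * np.trace(Sy @ Sy))

# Hypothetical 6 samples scored on 3 summary dimensions by each method
rng = np.random.default_rng(3)
tds_config = rng.random((6, 3))
qda_config = 0.7 * tds_config + 0.3 * rng.random((6, 3))
print(f"RV = {rv_coefficient(tds_config, qda_config):.3f}")
```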
Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.
Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian
2017-09-12
Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology, group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.
Płaszewski, Maciej; Bettany-Saltikov, Josette
2014-01-01
Background: Non-surgical interventions for adolescents with idiopathic scoliosis remain highly controversial. Despite the publication of numerous reviews, no explicit methodological evaluation of papers labeled as, or having the layout of, a systematic review addressing this subject matter is available. Objectives: Analysis and comparison of the content, methodology, and evidence-base from systematic reviews regarding non-surgical interventions for adolescents with idiopathic scoliosis. Design: Systematic overview of systematic reviews. Methods: Articles meeting the minimal criteria for a systematic review, regarding any non-surgical intervention for adolescent idiopathic scoliosis, with any outcomes measured, were included. Multiple general and systematic-review-specific databases, guideline registries, reference lists and websites of institutions were searched. The AMSTAR tool was used to critically appraise the methodology, and the Oxford Centre for Evidence Based Medicine and the Joanna Briggs Institute's hierarchies were applied to analyze the levels of evidence from included reviews. Results: From 469 citations, twenty-one papers were included for analysis. Five reviews assessed the effectiveness of scoliosis-specific exercise treatments, four assessed manual therapies, five evaluated bracing, four assessed different combinations of interventions, and one evaluated usual physical activity. Two reviews addressed the adverse effects of bracing. Two papers were high-quality Cochrane reviews, three were of moderate, and the remaining sixteen were of low or very low methodological quality. The level of evidence of these reviews ranged from 1 or 1+ to 4, and in some reviews, due to their low methodological quality and/or poor reporting, this could not be established. Conclusions: Higher-quality reviews indicate that generally there is insufficient evidence to make a judgment on whether non-surgical interventions in adolescent idiopathic scoliosis are effective. Papers labeled as systematic reviews need to be considered in terms of their methodological rigor; otherwise they may be mistakenly regarded as high-quality sources of evidence. Protocol registry number: CRD42013003538, PROSPERO. PMID:25353954
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramuhalli, Pradeep; Hirt, Evelyn H.; Veeramany, Arun
This research report summarizes the development and evaluation of a prototypic enhanced risk monitor (ERM) methodology (framework) that includes alternative risk metrics and uncertainty analysis. This updated ERM methodology accounts for uncertainty in the equipment condition assessment (ECA), the prognostic result, and the probabilistic risk assessment (PRA) model. It is anticipated that the ability to characterize uncertainty in the estimated risk and update the risk estimates in real time based on equipment condition assessment (ECA) will provide a mechanism for optimizing plant performance while staying within specified safety margins. These results (based on impacting active component O&M using real-time equipment condition information) are a step towards ERMs that, if integrated with AR supervisory plant control systems, can help control O&M costs and improve the affordability of advanced reactors.
Deepak, V; Kalishwaralal, K; Ramkumarpandian, S; Babu, S Venkatesh; Senthilkumar, S R; Sangiliyandi, G
2008-11-01
Response surface methodology and a central composite rotary design (CCRD) were employed to optimize a fermentation medium for the production of Nattokinase by Bacillus subtilis at pH 7.5. The four variables involved in this study were glucose, peptone, CaCl2, and MgSO4. The statistical analysis of the results showed that, in the range studied, only peptone had a significant effect on Nattokinase production. The optimized medium containing (%) glucose: 1, peptone: 5.5, MgSO4: 0.2 and CaCl2: 0.5 resulted in a 2-fold increase in Nattokinase production (3194.25 U/ml) compared to the initial level (1599.09 U/ml) after 10 h of fermentation. Nattokinase production was verified by its fibrinolytic activity.
NASA Astrophysics Data System (ADS)
Verma, Jitender; Khedkar, Vijay M.; Prabhu, Arati S.; Khedkar, Santosh A.; Malde, Alpeshkumar K.; Coutinho, Evans C.
2008-02-01
Quantitative Structure-Activity Relationships (QSAR) have been used for decades for prediction of biological activity, lead optimization, classification, identification and explanation of the mechanisms of drug action, and prediction of novel structural leads in drug discovery. Though the technique has lived up to its expectations in many aspects, much work still needs to be done on problems related to the rational design of peptides. Peptides are the drugs of choice in many situations; however, designing them rationally is a complicated task and the complexity increases with the length of their sequence. In order to deal with the problem of peptide optimization, one of our recently developed QSAR formalisms, CoRIA (Comparative Residue Interaction Analysis), is being expanded and modified as reverse-CoRIA (rCoRIA) and mixed-CoRIA (mCoRIA) approaches. In these methodologies, the peptide is fragmented into individual units and the interaction energies (van der Waals, Coulombic and hydrophobic) of each amino acid in the peptide with the receptor as a whole (rCoRIA) and with individual active site residues in the receptor (mCoRIA) are calculated, which, along with other thermodynamic descriptors, are used as independent variables that are correlated to the biological activity by chemometric methods. As a test case, the three CoRIA methodologies have been validated on a dataset of diverse nonamer peptides that bind to the Class I major histocompatibility complex molecule HLA-A*0201, and for which some structure-activity relationships have already been reported. The different models developed, and validated both internally as well as externally, were found to be robust with statistically significant values of r² (correlation coefficient) and r²pred (predictive r²). These models were able to identify all the structure-activity relationships known for this class of peptides, as well as uncover some new relationships. This means that these methodologies should perform well for other peptide datasets too. The major advantage of these approaches is that they explicitly utilize the 3D structures of small molecules or peptides as well as their macromolecular targets, to extract position-specific information about important interactions between the ligand and receptor, which can assist medicinal and computational chemists in designing new molecules, and biologists in studying the influence of mutations in the target receptor on ligand binding.
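The chemometric step, correlating per-residue interaction energies with activity, is often done with partial least squares regression; the sketch below uses synthetic descriptors and activities, not the HLA-A*0201 dataset or the authors' software:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical CoRIA-style descriptor table: for 40 nonamer peptides, van der
# Waals, Coulombic and hydrophobic interaction energies for each of the 9
# residues (9 x 3 = 27 descriptors), plus a synthetic binding activity.
X = rng.normal(size=(40, 27))
true_w = rng.normal(size=27)
y = X @ true_w * 0.1 + rng.normal(scale=0.5, size=40)

pls = PLSRegression(n_components=3)
q2 = cross_val_score(pls, X, y, cv=5, scoring="r2").mean()  # cross-validated r2
pls.fit(X, y)
print(f"fitted r2 = {pls.score(X, y):.2f}, cross-validated r2 = {q2:.2f}")
```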
D'Onza, Giuseppe; Greco, Giulio; Allegrini, Marco
2016-02-01
Recycling implies additional costs for separated municipal solid waste (MSW) collection. The aim of the present study is to propose and implement a management tool - the full cost accounting (FCA) method - to calculate the full collection costs of different types of waste. Our analysis aims for a better understanding of the difficulties of putting FCA into practice in the MSW sector. We propose a FCA methodology that uses standard cost and actual quantities to calculate the collection costs of separate and undifferentiated waste. Our methodology allows cost efficiency analysis and benchmarking, overcoming problems related to firm-specific accounting choices, earnings management policies and purchase policies. Our methodology allows benchmarking and variance analysis that can be used to identify the causes of off-standards performance and guide managers to deploy resources more efficiently. Our methodology can be implemented by companies lacking a sophisticated management accounting system. Copyright © 2015 Elsevier Ltd. All rights reserved.
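A minimal sketch of the standard-cost logic described above (stream names, costs and quantities are invented): the flexible budget uses standard unit costs with actual quantities, and the gap to actual costs splits into price and quantity variances:

```python
# Hypothetical waste streams with standard and actual unit costs and quantities.
streams = {
    "separated": dict(std_c=180.0, std_q=900,  act_c=195.0, act_q=950),
    "residual":  dict(std_c=110.0, std_q=2000, act_c=108.0, act_q=1900),
}

for name, s in streams.items():
    flexible_budget = s["std_c"] * s["act_q"]                 # standard cost x actual quantity
    price_variance = (s["act_c"] - s["std_c"]) * s["act_q"]   # cost-efficiency effect
    quantity_variance = (s["act_q"] - s["std_q"]) * s["std_c"]  # volume effect
    print(f"{name:10s} flexible budget: {flexible_budget:>9.0f} EUR  "
          f"price var: {price_variance:>+8.0f}  quantity var: {quantity_variance:>+8.0f}")
```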
Stoop, Rahel; Clijsen, Ron; Leoni, Diego; Soldini, Emiliano; Castellini, Greta; Redaelli, Valentina; Barbero, Marco
2017-08-01
The methodological quality of controlled clinical trials (CCTs) of physiotherapeutic treatment modalities for myofascial trigger points (MTrP) has not been investigated yet. To assess the methodological quality of CCTs for physiotherapy treatments of MTrPs and to demonstrate its possible increase over time. Systematic review. A systematic search was conducted in two databases, the Physiotherapy Evidence Database (PEDro) and the Medical Literature Analysis and Retrieval System Online (MEDLINE), using the same keywords and selection procedure corresponding to pre-defined inclusion criteria. The methodological quality, assessed by the 11-item PEDro scale, served as the outcome measure. The CCTs had to compare at least two interventions, where one intervention had to lie within the scope of physiotherapy. Participants had to be diagnosed with myofascial pain syndrome or trigger points (active or latent). A total of n = 230 studies was analysed. The cervico-thoracic region was the most frequently treated body part (n = 143). Electrophysical agent application was the most frequent intervention. The average methodological quality reached 5.5 on the PEDro scale. A total of n = 6 studies scored 9. The average PEDro score increased by 0.7 points per decade between 1978 and 2015. The average PEDro score of CCTs for MTrP treatments does not reach the cut-off of 6 proposed for moderate to high methodological quality. Nevertheless, a promising trend towards an increase in the average methodological quality of CCTs for MTrPs was recorded. More high-quality CCT studies with thorough research procedures are recommended to enhance methodological quality. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
[Methodological aspects of integrated care pathways].
Gomis, R; Mata Cases, M; Mauricio Puente, D; Artola Menéndez, S; Ena Muñoz, J; Mediavilla Bravo, J J; Miranda Fernández-Santos, C; Orozco Beltrán, D; Rodríguez Mañas, L; Sánchez Villalba, C; Martínez, J A
An Integrated Healthcare Pathway (PAI) is a tool whose aim is to increase the effectiveness of clinical performance through greater coordination and to ensure continuity of care. The PAI places the patient as the central focus of the organisation of health services. It is defined as the set of activities carried out by the health care providers in order to increase the level of health and satisfaction of the population receiving services. The development of a PAI requires the analysis of the flow of activities, the inter-relationships between professionals and care teams, and patient expectations. The methodology for the development of a PAI is presented and discussed in this article, as well as the success factors for its definition and its effective implementation. It also explains, as an example, the recent PAI for Hypoglycaemia in patients with Type 2 Diabetes Mellitus, developed by a multidisciplinary team and supported by several scientific societies. Copyright © 2017 SECA. Published by Elsevier España, S.L.U. All rights reserved.
Environmental impact reduction through ecological planning at Bahia Magdalena, Mexico.
Malagrino, Giovanni; Lagunas, Magdalena; Rubio, Alfredo Ortega
2008-03-01
For analyzing basic marine and coastal characteristics, we selected the potential sites where shrimp culture could be developed in a large coastal zone, Bahia Magdalena, Baja California Sur, Mexico. Based on our analysis, 6 sites were preselected, and field stages of work were then developed to assess the precise suitability of each site in order to develop the proposed aquaculture activities. By ranking the suitability, we were able to recommend the most appropriate places to develop shrimp culture in this region. Also, knowing the exact biological, physico-chemical and social environment, we determined the best species to cultivate, the recommended total area and the methodology to be used to lessen the environmental impact and to obtain the maximum profitability. Our methodology could be used not only to select appropriate sites for shrimp culture in other coastal lagoons, but it could also be applied to assess, in a quick and accurate way, the suitability of any other production activity in coastal zones.
Yadav, Kaushlesh K; Garg, Neelima; Kumar, Devendra; Kumar, Sanjay; Singh, Achal; Muthukumar, M
2015-01-01
Polygalacturonase (PG) degrades pectin into D-galacturonic acid monomers and is used widely in the food industry, especially for juice clarification. In the present study, fermentation conditions for polygalacturonase production by Aspergillus niger NAIMCCF-02958, using mango peel as substrate, were optimized using a 2(3) factorial design with a central composite rotatable experimental design (CCRD) of response surface methodology (RSM). The maximum PG activity of 723.66 U g(-1) was achieved at pH 4.0, a temperature of 30 degrees C and 2% inoculum according to the response surface curve. The experimental value of PG activity, 607.65 U g(-1), was higher than the predicted value of 511.75 U g(-1). Under the proposed optimized conditions, the determination coefficient (R2) was equal to 0.66, indicating that the model could explain 66% of the total variation as well as establish the relationship between the variables and the responses. ANOVA analysis and the three-dimensional plots also confirmed interactions among the parameters.
Wavelet methodology to improve single unit isolation in primary motor cortex cells
Ortiz-Rosario, Alexis; Adeli, Hojjat; Buford, John A.
2016-01-01
The proper isolation of action potentials recorded extracellularly from neural tissue is an active area of research in the fields of neuroscience and biomedical signal processing. This paper presents an isolation methodology for neural recordings using the wavelet transform (WT), a statistical thresholding scheme, and the principal component analysis (PCA) algorithm. The effectiveness of five different mother wavelets was investigated: biorthogonal, Daubechies, discrete Meyer, symmetric, and Coifman; along with three different wavelet coefficient thresholding schemes: fixed form threshold, Stein's unbiased estimate of risk, and minimax; and two different thresholding rules: soft and hard thresholding. The signal quality was evaluated using three different statistical measures: mean-squared error, root-mean square, and signal-to-noise ratio. The clustering quality was evaluated using two different statistical measures: isolation distance and L-ratio. This research shows that the selection of the mother wavelet has a strong influence on the clustering and isolation of single-unit neural activity, with the Daubechies 4 wavelet and minimax thresholding scheme performing the best. PMID:25794461
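A hedged sketch of the overall pipeline (wavelet denoising with a fixed-form threshold, PCA features, clustering) on a synthetic trace; it is not the authors' implementation, and the spike shapes, detection threshold and cluster count are assumptions:

```python
import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
fs = 24_000
x = rng.normal(0, 1, fs)                       # hypothetical 1 s extracellular trace

# Inject two synthetic "units" as negative-going Gaussian spikes of different size.
w = np.arange(48)
shape = -np.exp(-0.5 * ((w - 16) / 3.0) ** 2)
for amp, n_spk in [(6.0, 30), (10.0, 30)]:
    for t0 in rng.choice(np.arange(100, fs - 100, 60), size=n_spk, replace=False):
        x[t0:t0 + 48] += amp * shape

# 1) Wavelet denoising: Daubechies-4, fixed-form (universal) threshold, soft rule.
coeffs = pywt.wavedec(x, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745
thr = sigma * np.sqrt(2 * np.log(x.size))
coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
x_den = pywt.waverec(coeffs, "db4")[: x.size]

# 2) Simple threshold detection of local minima and snippet extraction (48 samples).
idx = np.flatnonzero((x_den[1:-1] < -4 * sigma)
                     & (x_den[1:-1] <= x_den[:-2]) & (x_den[1:-1] <= x_den[2:])) + 1
snippets = np.array([x_den[i - 16:i + 32] for i in idx if 16 <= i < x_den.size - 32])

# 3) PCA feature extraction + k-means clustering into putative single units.
feats = PCA(n_components=3).fit_transform(snippets)
labels = KMeans(n_clusters=2, n_init=10).fit_predict(feats)
print("detected spikes:", len(snippets), " per cluster:", np.bincount(labels))
```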
Mkadmini Hammi, Khaoula; Hammami, Majdi; Rihouey, Christophe; Le Cerf, Didier; Ksouri, Riadh; Majdoub, Hatem
2016-12-01
Response surface methodology using a Box-Behnken design was employed to optimize extraction temperature, extraction time and the ratio of water to material to obtain a maximum polysaccharide yield with high uronic acid content and antioxidant properties from edible Zizyphus lotus fruit. The optimal conditions were: extraction time of 3 h 15 min, extraction temperature of 91.2 °C and water-to-solid ratio of 39 mL/g. Under these conditions, the experimental extraction yield, uronic acid content and 2,2-diphenyl-1-picrylhydrazyl scavenging ability (IC50) were 18.88%, 41.89% and 0.518 mg/mL, respectively. Chemical analysis revealed that the extract was composed of 97.92% carbohydrate, of which 41.89% is uronic acid. The extracted polysaccharides, with an average molecular weight of 2720 kDa, are composed of arabinose, rhamnose, glucose, fructose, galactose and xylose. Moreover, the polysaccharides exhibited significant reducing power and anti-lipid peroxidation activities. Copyright © 2016 Elsevier Ltd. All rights reserved.
Shear-wave velocity profiling according to three alternative approaches: A comparative case study
NASA Astrophysics Data System (ADS)
Dal Moro, G.; Keller, L.; Al-Arifi, N. S.; Moustafa, S. S. R.
2016-11-01
The paper compares three different methodologies that can be used to analyze surface-wave propagation and thus obtain the vertical shear-wave velocity (VS) profile. The three presented methods (currently still quite unconventional) are characterized by different field procedures and data processing. The first methodology is an evolution of the classical Multi-channel Analysis of Surface Waves (MASW), here accomplished by jointly considering Rayleigh and Love waves (analyzed according to the Full Velocity Spectrum approach) and the Horizontal-to-Vertical Spectral Ratio (HVSR). The second method is based on the joint analysis of the HVSR curve together with the Rayleigh-wave dispersion determined via Miniature Array Analysis of Microtremors (MAAM), a passive methodology that relies on a small number (4 to 6) of vertical geophones deployed along a small circle (for the common near-surface application the radius usually ranges from 0.6 to 5 m). Finally, the third considered approach is based on active data acquired by a single 3-component geophone and relies on the joint inversion of the group-velocity spectra of the radial and vertical components of the Rayleigh waves, together with the Radial-to-Vertical Spectral Ratio (RVSR). The results of the analyses performed with these approaches (completely different both in terms of field procedures and data analysis) appear extremely consistent, thus mutually validating their performances. Pros and cons of each approach are summarized both in terms of computational aspects and with respect to practical considerations regarding the specific character of the pertinent field procedures.
Hermans, Artur; Kieninger, Clemens; Koskinen, Kalle; Wickberg, Andreas; Solano, Eduardo; Dendooven, Jolien; Kauranen, Martti; Clemmen, Stéphane; Wegener, Martin; Koos, Christian; Baets, Roel
2017-01-01
The determination of the second-order susceptibility (χ(2)) of thin film samples can be a delicate matter since well-established χ(2) measurement methodologies such as the Maker fringe technique are best suited for nonlinear materials with large thicknesses typically ranging from tens of microns to several millimeters. Here we compare two different second-harmonic generation setups and the corresponding measurement methodologies that are especially advantageous for thin film χ(2) characterization. This exercise allows for cross-checking the χ(2) obtained for identical samples and identifying the main sources of error for the respective techniques. The development of photonic integrated circuits makes nonlinear thin films of particular interest, since they can be processed into long waveguides to create efficient nonlinear devices. The investigated samples are ABC-type nanolaminates, which were reported recently by two different research groups. However, the subsequent analysis can be useful for all researchers active in the field of thin film χ(2) characterization. PMID:28317938
Abderrahim, Mohamed; Arribas, Silvia M; Condezo-Hoyos, Luis
2017-05-01
Pyrogallol red (PGR) was identified as a novel optical probe for the detection of hydrogen peroxide (H2O2) based on horseradish peroxidase (HRP)-catalyzed oxidation. Response surface methodology (RSM) was applied as a tool to optimize the concentrations of PGR (100 µmol L(-1)), HRP (1 U mL(-1)) and H2O2 (250 µmol L(-1)) and used to develop a sensitive PGR-based catalase (CAT) activity assay (PGR-CAT assay). N-ethylmaleimide (NEM; 102 mmol L(-1)) was used to avoid interference produced by thiol groups while protecting CAT activity. The incubation time (30 min) for samples or CAT used as standard and H2O2, as well as signal stability (stable between 5 and 60 min), were also evaluated. The PGR-CAT assay was linear within the range of 0-4 U mL(-1) (R2 = 0.993) and very sensitive, with limits of detection (LOD) of 0.005 U mL(-1) and quantitation (LOQ) of 0.01 U mL(-1). The PGR-CAT assay showed adequate intra-day RSD = 0.6-9.5% and inter-day RSD = 2.4-8.9%. Bland-Altman analysis and Passing-Bablok and Pearson correlation analysis showed good agreement between CAT activity as measured by the PGR-CAT assay and the Amplex Red assay. The PGR-CAT assay is more sensitive than all the other colorimetric assays reported, particularly the Amplex Red assay, and the cost of PGR is a small fraction (about 1/1000) of that of an Amplex Red probe, so it can be expected to find wide use among scientists studying CAT activity in biological samples. Copyright © 2017 Elsevier B.V. All rights reserved.
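A minimal sketch of how a linear calibration curve and detection/quantitation limits of the kind reported above might be computed is given below; the standard concentrations, signals and the 3.3σ/slope and 10σ/slope conventions are assumptions for illustration, not the paper's actual data or procedure.

```python
# Linear calibration of a CAT activity assay with LOD/LOQ from regression residuals.
# All data points are illustrative placeholders.
import numpy as np

cat_activity = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 4.0])   # U/mL (standards)
signal = np.array([0.02, 0.11, 0.21, 0.40, 0.62, 0.80])   # optical signal (arbitrary units)

slope, intercept = np.polyfit(cat_activity, signal, 1)
residuals = signal - (slope * cat_activity + intercept)
sigma = residuals.std(ddof=2)                              # SD of residuals (n - 2 degrees of freedom)

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
r2 = 1 - (residuals**2).sum() / ((signal - signal.mean())**2).sum()
print(f"R^2 = {r2:.3f}, LOD = {lod:.3f} U/mL, LOQ = {loq:.3f} U/mL")
```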
Zuberer, Agnieszka; Brandeis, Daniel; Drechsler, Renate
2015-01-01
While issues of efficacy and specificity are crucial for the future of neurofeedback training, there may be alternative designs and control analyses to circumvent the methodological and ethical problems associated with double-blind placebo studies. Surprisingly, most NF studies do not report the most immediate result of their NF training, i.e., whether or not children with ADHD gain control over their brain activity during the training sessions. For the investigation of specificity, however, it seems essential to analyze the learning and adaptation processes that take place in the course of the training and to relate improvements in self-regulated brain activity across training sessions to behavioral, neuropsychological and electrophysiological outcomes. To this aim, a review of studies on neurofeedback training with ADHD patients which include the analysis of learning across training sessions or relate training performance to outcome is presented. Methods on how to evaluate and quantify learning of EEG regulation over time are discussed. “Non-learning” has been reported in a small number of ADHD-studies, but has not been a focus of general methodological discussion so far. For this reason, selected results from the brain-computer interface (BCI) research on the so-called “brain-computer illiteracy”, the inability to gain control over one’s brain activity, are also included. It is concluded that in the discussion on specificity, more attention should be devoted to the analysis of EEG regulation performance in the course of the training and its impact on clinical outcome. It is necessary to improve the knowledge on characteristic cross-session and within-session learning trajectories in ADHD and to provide the best conditions for learning. PMID:25870550
Combining users' activity survey and simulators to evaluate human activity recognition systems.
Azkune, Gorka; Almeida, Aitor; López-de-Ipiña, Diego; Chen, Liming
2015-04-08
Evaluating human activity recognition systems usually implies following expensive and time-consuming methodologies, where experiments with humans are run with the consequent ethical and legal issues. We propose a novel evaluation methodology to overcome the enumerated problems, which is based on surveys for users and a synthetic dataset generator tool. Surveys allow capturing how different users perform activities of daily living, while the synthetic dataset generator is used to create properly labelled activity datasets modelled with the information extracted from surveys. Important aspects, such as sensor noise, varying time lapses and user erratic behaviour, can also be simulated using the tool. The proposed methodology is shown to have very important advantages that allow researchers to carry out their work more efficiently. To evaluate the approach, a synthetic dataset generated following the proposed methodology is compared to a real dataset computing the similarity between sensor occurrence frequencies. It is concluded that the similarity between both datasets is more than significant.
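One plausible way to compare a real and a synthetic activity dataset through sensor occurrence frequencies, as described above, is sketched below. The sensor names, counts and the choice of cosine similarity are illustrative assumptions; the paper does not prescribe this exact metric.

```python
# Compare real and synthetic datasets via the similarity of sensor occurrence frequencies.
import numpy as np

sensors = ["kitchen_door", "fridge", "tap", "stove", "bedroom_motion"]   # hypothetical sensors
real_counts = np.array([120, 95, 80, 60, 140], dtype=float)              # illustrative counts
synthetic_counts = np.array([110, 100, 75, 70, 150], dtype=float)

real_freq = real_counts / real_counts.sum()
synth_freq = synthetic_counts / synthetic_counts.sum()

cosine_similarity = real_freq @ synth_freq / (
    np.linalg.norm(real_freq) * np.linalg.norm(synth_freq)
)
print(f"Similarity between datasets: {cosine_similarity:.3f}")
```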
DESIGN ANALYSIS FOR THE NAVAL SNF WASTE PACKAGE
DOE Office of Scientific and Technical Information (OSTI.GOV)
T.L. Mitchell
2000-05-31
The purpose of this analysis is to demonstrate the design of the naval spent nuclear fuel (SNF) waste package (WP) using the Waste Package Department's (WPD) design methodologies and processes described in the ''Waste Package Design Methodology Report'' (CRWMS M&O [Civilian Radioactive Waste Management System Management and Operating Contractor] 2000b). The calculations that support the design of the naval SNF WP will be discussed; however, only a sub-set of such analyses will be presented and shall be limited to those identified in the ''Waste Package Design Sensitivity Report'' (CRWMS M&O 2000c). The objective of this analysis is to describe the naval SNF WP design method and to show that the design of the naval SNF WP complies with the ''Naval Spent Nuclear Fuel Disposal Container System Description Document'' (CRWMS M&O 1999a) and Interface Control Document (ICD) criteria for Site Recommendation. Additional criteria for the design of the naval SNF WP have been outlined in Section 6.2 of the ''Waste Package Design Sensitivity Report'' (CRWMS M&O 2000c). The scope of this analysis is restricted to the design of the naval long WP containing one naval long SNF canister. This WP is representative of the WPs that will contain both naval short SNF and naval long SNF canisters. The following items are included in the scope of this analysis: (1) Providing a general description of the applicable design criteria; (2) Describing the design methodology to be used; (3) Presenting the design of the naval SNF waste package; and (4) Showing compliance with all applicable design criteria. The intended use of this analysis is to support Site Recommendation reports and assist in the development of WPD drawings. Activities described in this analysis were conducted in accordance with the technical product development plan (TPDP) ''Design Analysis for the Naval SNF Waste Package'' (CRWMS M&O 2000a).
Capannesi, Geraldo; Lopez, Francesco
2013-01-01
Human activities introduce compounds that increase the levels of many species dangerous for the environment and the population. Trace elements in airborne particulate matter have a preeminent position in this respect, owing to the presence of toxic elements that affect biological systems. The main problem is the analytical determination of such species at ultratrace levels: a very specific methodology is necessary with regard to accuracy, precision and contamination problems. Instrumental Neutron Activation Analysis and Instrumental Photon Activation Analysis meet these requirements. A retrospective element analysis of airborne particulate collected over the last 4 decades has been carried out to study their trends. The samples were collected in an urban location in order to determine only effects due to global aerosol circulation; semiannual samples have been used to characterize the summer/winter behavior of elements of natural and artificial origin. The levels of elements of natural origin are higher than those in other countries owing to geological and meteorological factors peculiar to Central Italy. The levels of artificial elements are sometimes lower than those in other countries, suggesting a less polluted general situation for Central Italy. However, for a few elements (e.g., Pb) the levels measured are only slightly lower than those proposed as ambient air standards. PMID:23878525
Persechino, Benedetta; Valenti, Antonio; Ronchetti, Matteo; Rondinone, Bruna Maria; Di Tecco, Cristina; Vitali, Sara; Iavicoli, Sergio
2013-06-01
Work-related stress is one of the major causes of occupational ill health. In line with the regulatory framework on occupational health and safety (OSH), adequate models for assessing and managing risk need to be identified so as to minimize the impact of this stress not only on workers' health, but also on productivity. After close analysis of the Italian and European reference regulatory framework and work-related stress assessment and management models used in some European countries, we adopted the UK Health and Safety Executive's (HSE) Management Standards (MS) approach, adapting it to the Italian context in order to provide a suitable methodological proposal for Italy. We have developed a work-related stress risk assessment strategy, meeting regulatory requirements, now available on a specific web platform that includes software, tutorials, and other tools to assist companies in their assessments. This methodological proposal is new on the Italian work-related stress risk assessment scene. Besides providing an evaluation approach using scientifically validated instruments, it ensures the active participation of occupational health professionals in each company. The assessment tools provided enable companies not only to comply with the law, but also to contribute to a database for monitoring and assessment and give access to a reserved area for data analysis and comparisons.
Persechino, Benedetta; Valenti, Antonio; Ronchetti, Matteo; Rondinone, Bruna Maria; Di Tecco, Cristina; Vitali, Sara; Iavicoli, Sergio
2013-01-01
Background Work-related stress is one of the major causes of occupational ill health. In line with the regulatory framework on occupational health and safety (OSH), adequate models for assessing and managing risk need to be identified so as to minimize the impact of this stress not only on workers' health, but also on productivity. Methods After close analysis of the Italian and European reference regulatory framework and work-related stress assessment and management models used in some European countries, we adopted the UK Health and Safety Executive's (HSE) Management Standards (MS) approach, adapting it to the Italian context in order to provide a suitable methodological proposal for Italy. Results We have developed a work-related stress risk assessment strategy, meeting regulatory requirements, now available on a specific web platform that includes software, tutorials, and other tools to assist companies in their assessments. Conclusion This methodological proposal is new on the Italian work-related stress risk assessment scene. Besides providing an evaluation approach using scientifically validated instruments, it ensures the active participation of occupational health professionals in each company. The assessment tools provided enable companies not only to comply with the law, but also to contribute to a database for monitoring and assessment and give access to a reserved area for data analysis and comparisons. PMID:23961332
D'Orso, M I; Centemeri, R; Oggionni, P; Latocca, R; Crippa, M; Vercellino, R; Riva, M; Cesana, G
2011-01-01
Computerized motion analysis of the upper limb is a valid support in the definition of residual functional capability and of specific work suitability in complex cases. This evaluation methodology is able to correctly and objectively define the three-dimensional ranges of motion of every patient's upper limb. This can be particularly useful for workers coming back to work after a work-related or non-work-related accident, or for handicapped workers at the beginning of a new work activity. We report a study carried out using computerized motion analysis of the upper limbs in 20 engineering workers.
Raut, Savita V; Yadav, Dinkar M
2018-03-28
This paper presents an fMRI signal analysis methodology using geometric mean curve decomposition (GMCD) and a mutual information-based voxel selection framework. Previously, fMRI signal analysis has been conducted using the empirical mean curve decomposition (EMCD) model and voxel selection on the raw fMRI signal. The former methodology loses frequency components, while the latter suffers from signal redundancy. Both challenges are addressed by our methodology, in which the frequency component is retained by decomposing the raw fMRI signal using the geometric mean rather than the arithmetic mean, and the voxels are selected from the EMCD signal using GMCD components rather than the raw fMRI signal. The proposed methodologies are adopted for predicting the neural response. Experiments are conducted on the openly available fMRI data of six subjects, and comparisons are made with existing decomposition models and voxel selection frameworks. Subsequently, the effect of the number of selected voxels and the selection constraints is analyzed. The comparative results and the analysis demonstrate the superiority and reliability of the proposed methodology.
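As a rough sketch of a mutual information-based voxel selection step of the kind mentioned above, the snippet below ranks voxel time courses by their estimated mutual information with a target response. The data shapes, the synthetic target and the use of scikit-learn's estimator are assumptions, not the authors' implementation.

```python
# Rank voxels by mutual information with a response and keep the most informative ones.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(1)
voxels = rng.normal(size=(120, 500))                                   # 120 time points x 500 voxels (placeholder)
response = voxels[:, :5].mean(axis=1) + 0.1 * rng.normal(size=120)     # synthetic target signal

mi = mutual_info_regression(voxels, response, random_state=0)
top_voxels = np.argsort(mi)[::-1][:50]                                 # indices of the 50 most informative voxels
```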
A novel integrated framework and improved methodology of computer-aided drug design.
Chen, Calvin Yu-Chian
2013-01-01
Computer-aided drug design (CADD) is a critical initiating step of drug development, but a single model capable of covering all designing aspects remains to be elucidated. Hence, we developed a drug design modeling framework that integrates multiple approaches, including machine learning based quantitative structure-activity relationship (QSAR) analysis, 3D-QSAR, Bayesian network, pharmacophore modeling, and structure-based docking algorithm. Restrictions for each model were defined for improved individual and overall accuracy. An integration method was applied to join the results from each model to minimize bias and errors. In addition, the integrated model adopts both static and dynamic analysis to validate the intermolecular stabilities of the receptor-ligand conformation. The proposed protocol was applied to identifying HER2 inhibitors from traditional Chinese medicine (TCM) as an example for validating our new protocol. Eight potent leads were identified from six TCM sources. A joint validation system comprised of comparative molecular field analysis, comparative molecular similarity indices analysis, and molecular dynamics simulation further characterized the candidates into three potential binding conformations and validated the binding stability of each protein-ligand complex. The ligand pathway was also performed to predict the ligand "in" and "exit" from the binding site. In summary, we propose a novel systematic CADD methodology for the identification, analysis, and characterization of drug-like candidates.
Day-Ahead Crude Oil Price Forecasting Using a Novel Morphological Component Analysis Based Model
Zhu, Qing; Zou, Yingchao; Lai, Kin Keung
2014-01-01
As a typical nonlinear and dynamic system, the crude oil price movement is difficult to predict and its accurate forecasting remains the subject of intense research activity. Recent empirical evidence suggests that the multiscale data characteristics in the price movement are another important stylized fact. The incorporation of mixture of data characteristics in the time scale domain during the modelling process can lead to significant performance improvement. This paper proposes a novel morphological component analysis based hybrid methodology for modeling the multiscale heterogeneous characteristics of the price movement in the crude oil markets. Empirical studies in two representative benchmark crude oil markets reveal the existence of multiscale heterogeneous microdata structure. The significant performance improvement of the proposed algorithm incorporating the heterogeneous data characteristics, against benchmark random walk, ARMA, and SVR models, is also attributed to the innovative methodology proposed to incorporate this important stylized fact during the modelling process. Meanwhile, work in this paper offers additional insights into the heterogeneous market microstructure with economic viable interpretations. PMID:25061614
[Problems of world outlook and methodology of science integration in biological studies].
Khododova, Iu D
1981-01-01
Problems of world outlook and methodology of natural-science knowledge are considered on the basis of an analysis of tendencies in the development of the membrane theory of cell processes and the use of principles of biological membrane functioning in solving scientific and applied problems pertaining to different branches of chemistry and biology. The notion of scientific knowledge integration is defined as the interpenetration of approaches, methods and ideas of different branches of knowledge and the enrichment of their content on this basis, resulting in the augmentation of knowledge in each field taken separately. These processes are accompanied by the appearance of new branches of knowledge - sciences "on the junction" - and their subsequent differentiation. The analysis of some gnoseological situations shows that the integration of sciences contributes to the coordination and partial agreement of the thinking styles of different specialists and places demands on the personality of the scientist, requiring, in particular, high professional mobility. Problems of the organization of scientific activity that draw the social sciences into the integration processes are considered. The role of philosophy in the integration processes is emphasized.
Understanding Design Tradeoffs for Health Technologies: A Mixed-Methods Approach
O’Leary, Katie; Eschler, Jordan; Kendall, Logan; Vizer, Lisa M.; Ralston, James D.; Pratt, Wanda
2017-01-01
We introduce a mixed-methods approach for determining how people weigh tradeoffs in values related to health and technologies for health self-management. Our approach combines interviews with Q-methodology, a method from psychology uniquely suited to quantifying opinions. We derive the framework for structured data collection and analysis for the Q-methodology from theories of self-management of chronic illness and technology adoption. To illustrate the power of this new approach, we used it in a field study of nine older adults with type 2 diabetes, and nine mothers of children with asthma. Our mixed-methods approach provides three key advantages for health design science in HCI: (1) it provides a structured health sciences theoretical framework to guide data collection and analysis; (2) it enhances the coding of unstructured data with statistical patterns of polarizing and consensus views; and (3) it empowers participants to actively weigh competing values that are most personally significant to them. PMID:28804794
A Monte Carlo–Based Bayesian Approach for Measuring Agreement in a Qualitative Scale
Pérez Sánchez, Carlos Javier
2014-01-01
Agreement analysis has been an active research area whose techniques have been widely applied in psychology and other fields. However, statistical agreement among raters has been mainly considered from a classical statistics point of view. Bayesian methodology is a viable alternative that allows the inclusion of subjective initial information coming from expert opinions, personal judgments, or historical data. A Bayesian approach is proposed by providing a unified Monte Carlo–based framework to estimate all types of measures of agreement in a qualitative scale of response. The approach is conceptually simple and it has a low computational cost. Both informative and non-informative scenarios are considered. In case no initial information is available, the results are in line with the classical methodology, but providing more information on the measures of agreement. For the informative case, some guidelines are presented to elicitate the prior distribution. The approach has been applied to two applications related to schizophrenia diagnosis and sensory analysis. PMID:29881002
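A minimal sketch of a Monte Carlo, Dirichlet-multinomial posterior for one common agreement measure (Cohen's kappa on a 2x2 rating table) is shown below, in the spirit of the unified simulation-based framework described above. The cell counts and the uniform prior are illustrative assumptions.

```python
# Monte Carlo posterior for Cohen's kappa using a Dirichlet-multinomial model.
import numpy as np

counts = np.array([40, 10, 8, 42])        # cells: (yes,yes), (yes,no), (no,yes), (no,no) - illustrative
prior = np.ones(4)                        # non-informative Dirichlet(1,1,1,1) prior
draws = np.random.default_rng(2).dirichlet(counts + prior, size=20000)

p = draws.reshape(-1, 2, 2)
po = p[:, 0, 0] + p[:, 1, 1]                               # observed agreement
pe = (p.sum(axis=2)[:, 0] * p.sum(axis=1)[:, 0]
      + p.sum(axis=2)[:, 1] * p.sum(axis=1)[:, 1])         # chance agreement from the marginals
kappa = (po - pe) / (1 - pe)

print(f"posterior mean kappa = {kappa.mean():.3f}, "
      f"95% interval = ({np.quantile(kappa, 0.025):.3f}, {np.quantile(kappa, 0.975):.3f})")
```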
Wong, John B.; Coates, Paul M.; Russell, Robert M.; Dwyer, Johanna T.; Schuttinga, James A.; Bowman, Barbara A.; Peterson, Sarah A.
2011-01-01
Increased interest in the potential societal benefit of incorporating health economics as a part of clinical translational science, particularly nutrition interventions, led the Office of Dietary Supplements at the National Institutes of Health to sponsor a conference to address key questions about economic analysis of nutrition interventions to enhance communication among health economic methodologists, researchers, reimbursement policy makers, and regulators. Issues discussed included the state of the science, such as what health economic methods are currently used to judge the burden of illness, interventions, or health care policies, and what new research methodologies are available or needed to address knowledge and methodological gaps or barriers. Research applications included existing evidence-based health economic research activities in nutrition that are ongoing or planned at federal agencies. International and U.S. regulatory, policy and clinical practice perspectives included a discussion of how research results can help regulators and policy makers within government make nutrition policy decisions, and how economics affects clinical guideline development. PMID:21884133
Day-ahead crude oil price forecasting using a novel morphological component analysis based model.
Zhu, Qing; He, Kaijian; Zou, Yingchao; Lai, Kin Keung
2014-01-01
As a typical nonlinear and dynamic system, the crude oil price movement is difficult to predict and its accurate forecasting remains the subject of intense research activity. Recent empirical evidence suggests that the multiscale data characteristics in the price movement are another important stylized fact. The incorporation of mixture of data characteristics in the time scale domain during the modelling process can lead to significant performance improvement. This paper proposes a novel morphological component analysis based hybrid methodology for modeling the multiscale heterogeneous characteristics of the price movement in the crude oil markets. Empirical studies in two representative benchmark crude oil markets reveal the existence of multiscale heterogeneous microdata structure. The significant performance improvement of the proposed algorithm incorporating the heterogeneous data characteristics, against benchmark random walk, ARMA, and SVR models, is also attributed to the innovative methodology proposed to incorporate this important stylized fact during the modelling process. Meanwhile, work in this paper offers additional insights into the heterogeneous market microstructure with economic viable interpretations.
ERIC Educational Resources Information Center
Courtillon-Leclercq, Janine; Papo, Eliane
1977-01-01
An attempt to show that a threshold level would furnish conditions for a renewed methodology and greater student creativity. The acquisition of communicative competence would be constructed around two types of activities: analysis of the conditions of speech production and systematization of two levels of grammar. (Text is in French.) (AMH)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neubauer, R.J.; Thebeau, L.; Paul, J.
1994-12-31
The US Army Aberdeen Proving Ground (APG) is a primarily undeveloped installation on the upper Chesapeake Bay in Maryland. The Bush and Gunpowder Rivers are two sub-estuaries that run through the installation before emptying into the Chesapeake Bay. Past activities at EA APG include pilot-scale chemical agent manufacturing, munitions testing, smoke/incendiary manufacturing, domestic and rubble landfilling, and disposal of chemical warfare agents as well as other materials. It was determined that if contamination of the Gunpowder River from these previous activities at EA APG exists, it would most likely be found in the sediments. The initial phase was to conduct a sediment survey of the river to determine the spatial distribution of sediment types and the suitability of the benthos for the proposed methodologies. The second phase was to combine innovative screening-level investigative methodologies as well as sediment chemical and physical analyses into one survey of the benthos and sediments of the Gunpowder River. This phase used the Microtox luminescent bioassay and Daphnia magna IQ Toxicity Test, Surface and Profile Image (SPI) photography, analysis of sediment physical characteristics, and limited chemical analysis to identify locations that warrant a more focused investigation.
Lee-Bates, Benjamin; Billing, Daniel C; Caputi, Peter; Carstairs, Greg L; Linnane, Denise; Middleton, Kane
2017-09-01
The aim of this study was to determine whether perceptions of physically demanding job tasks are biased by employee demographics and employment profile characteristics, including age, sex, experience, length of tenure, rank, and whether respondents completed or supervised a task. Surveys were administered to 427 Royal Australian Navy personnel who characterised 33 tasks in terms of physical effort, importance, frequency, duration and vertical/horizontal distance travelled. Results showed no evidence of bias resulting from participant characteristics; however, participants who were actively involved in both task participation and supervision rated these tasks as more important than those involved only in the supervision of that task. This may indicate a self-serving bias in which participants who are more actively involved in a task have an inflated perception of that task's importance. These results have important implications for the conduct of job task analyses, especially the use of subjective methodologies in the development of scientifically defensible physical employment standards. Practitioner Summary: To examine the presence of systematic bias in subjective job task analysis methodologies, a survey was conducted on a sample of Royal Australian Navy personnel. The relationship between job task descriptions and participants' demographic and job profile characteristics revealed the presence of self-serving bias affecting perceptions of task importance.
Samavati, Vahid
2013-10-01
Microwave-assisted extraction (MAE) technique was employed to extract the hydrocolloid from okra pods (OPH). The optimal conditions for microwave-assisted extraction of OPH were determined by response surface methodology. A central composite rotatable design (CCRD) was applied to evaluate the effects of three independent variables (microwave power (X1: 100-500 W), extraction time (X2: 30-90 min), and extraction temperature (X3: 40-90 °C)) on the extraction yield of OPH. The correlation analysis of the mathematical-regression model indicated that quadratic polynomial model could be employed to optimize the microwave extraction of OPH. The optimal conditions to obtain the highest recovery of OPH (14.911±0.27%) were as follows: microwave power, 395.56 W; extraction time, 67.11 min and extraction temperature, 73.33 °C. Under these optimal conditions, the experimental values agreed with the predicted ones by analysis of variance. It indicated high fitness of the model used and the success of response surface methodology for optimizing OPH extraction. After method development, the DPPH radical scavenging activity of the OPH was evaluated. MAE showed obvious advantages in terms of high extraction efficiency and radical scavenging activity of extract within the shorter extraction time. Copyright © 2013 Elsevier B.V. All rights reserved.
Ab Initio Crystal Field for Lanthanides.
Ungur, Liviu; Chibotaru, Liviu F
2017-03-13
An ab initio methodology for the first-principle derivation of crystal-field (CF) parameters for lanthanides is described. The methodology is applied to the analysis of CF parameters in [Tb(Pc)2]- (Pc = phthalocyanine) and Dy4K2 ([Dy4K2O(OtBu)12]) complexes, and compared with often-used approximate and model descriptions. It is found that the application of geometry symmetrization, and the use of electrostatic point-charge and phenomenological CF models, lead to unacceptably large deviations from predictions based on ab initio calculations for the experimental geometry. It is shown how the predictions of standard CASSCF (Complete Active Space Self-Consistent Field) calculations (with 4f orbitals in the active space) can be systematically improved by including effects of dynamical electronic correlation (CASPT2 step) and by admixing electronic configurations of the 5d shell. This is exemplified for the well-studied Er-trensal complex (H3trensal = 2,2',2''-tris(salicylideneimido)trimethylamine). The electrostatic contributions to CF parameters in this complex, calculated with true charge distributions in the ligands, yield less than half of the total CF splitting, thus pointing to the dominant role of covalent effects. This analysis allows the conclusion that ab initio crystal field is an essential tool for the decent description of lanthanides. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Change Mechanisms of Schema-Centered Group Psychotherapy with Personality Disorder Patients
Tschacher, Wolfgang; Zorn, Peter; Ramseyer, Fabian
2012-01-01
Background This study addressed the temporal properties of personality disorders and their treatment by schema-centered group psychotherapy. It investigated the change mechanisms of psychotherapy using a novel method by which psychotherapy can be modeled explicitly in the temporal domain. Methodology and Findings 69 patients were assigned to a specific schema-centered behavioral group psychotherapy, 26 to social skills training as a control condition. The largest diagnostic subgroups were narcissistic and borderline personality disorder. Both treatments offered 30 group sessions of 100 min duration each, at a frequency of two sessions per week. Therapy process was described by components resulting from principal component analysis of patients' session-reports that were obtained after each session. These patient-assessed components were Clarification, Bond, Rejection, and Emotional Activation. The statistical approach focused on time-lagged associations of components using time-series panel analysis. This method provided a detailed quantitative representation of the therapy process. It was found that Clarification played a core role in schema-centered psychotherapy, reducing rejection and regulating the emotion of patients. This was also a change mechanism linked to therapy outcome. Conclusions/Significance The introduced process-oriented methodology made it possible to highlight the mechanisms by which psychotherapeutic treatment became effective. Additionally, process models depicted the actual patterns that differentiated specific diagnostic subgroups. Time-series analysis explores Granger causality, a non-experimental approximation of causality based on temporal sequences. This methodology, resting upon naturalistic data, can explicate mechanisms of action in psychotherapy research and illustrate the temporal patterns underlying personality disorders. PMID:22745811
Polyphony: superposition independent methods for ensemble-based drug discovery.
Pitt, William R; Montalvão, Rinaldo W; Blundell, Tom L
2014-09-30
Structure-based drug design is an iterative process, following cycles of structural biology, computer-aided design, synthetic chemistry and bioassay. In favorable circumstances, this process can lead to hundreds of protein-ligand crystal structures. In addition, molecular dynamics simulations are increasingly being used to further explore the conformational landscape of these complexes. Currently, methods capable of analyzing ensembles of crystal structures and MD trajectories are limited and usually rely upon least-squares superposition of coordinates. Novel methodologies are described for the analysis of multiple structures of a protein. Statistical approaches that rely upon residue equivalence, but not superposition, are developed. Tasks that can be performed include the identification of hinge regions, allosteric conformational changes and transient binding sites. The approaches are tested on crystal structures of CDK2 and other CMGC protein kinases and a simulation of p38α. Known relationships between interactions and conformational changes are highlighted, and new ones are revealed. A transient but druggable allosteric pocket in CDK2 is predicted to occur under the CMGC insert. Furthermore, an evolutionarily-conserved conformational link from the location of this pocket, via the αEF-αF loop, to phosphorylation sites on the activation loop is discovered. New methodologies are described and validated for the superposition-independent conformational analysis of large collections of structures or simulation snapshots of the same protein. The methodologies are encoded in a Python package called Polyphony, which is released as open source to accompany this paper [http://wrpitt.bitbucket.org/polyphony/].
Methodologies for Evaluating the Impact of Contraceptive Social Marketing Programs.
ERIC Educational Resources Information Center
Bertrand, Jane T.; And Others
1989-01-01
An overview of the evaluation issues associated with contraceptive social marketing programs is provided. Methodologies covered include survey techniques, cost-effectiveness analyses, retail audits of sales data, time series analysis, nested logit analysis, and discriminant analysis. (TJH)
Comprehensive Design Reliability Activities for Aerospace Propulsion Systems
NASA Technical Reports Server (NTRS)
Christenson, R. L.; Whitley, M. R.; Knight, K. C.
2000-01-01
This technical publication describes the methodology, model, software tool, input data, and analysis results that support aerospace design reliability studies. The focus of these activities is on propulsion systems mechanical design reliability. The goal of these activities is to support design from a reliability perspective. Paralleling performance analyses in schedule and method, this requires the proper use of metrics in a validated reliability model useful for design, sensitivity, and trade studies. Design reliability analysis in this view is one of several critical design functions. A design reliability method is detailed and two example analyses are provided - one qualitative and the other quantitative. The use of aerospace and commercial data sources for quantification is discussed and sources are listed. A tool that was developed to support both types of analyses is presented. Finally, special topics discussed include the development of design criteria, issues of reliability quantification, quality control, and reliability verification.
Engineering uses of physics-based ground motion simulations
Baker, Jack W.; Luco, Nicolas; Abrahamson, Norman A.; Graves, Robert W.; Maechling, Phillip J.; Olsen, Kim B.
2014-01-01
This paper summarizes validation methodologies focused on enabling ground motion simulations to be used with confidence in engineering applications such as seismic hazard analysis and dynamic analysis of structural and geotechnical systems. Numerical simulation of ground motion from large earthquakes, utilizing physics-based models of earthquake rupture and wave propagation, is an area of active research in the earth science community. Refinement and validation of these models require collaboration between earthquake scientists and engineering users, and testing/rating methodologies for simulated ground motions to be used with confidence in engineering applications. This paper provides an introduction to this field and an overview of current research activities being coordinated by the Southern California Earthquake Center (SCEC). These activities are related both to advancing the science and computational infrastructure needed to produce ground motion simulations, as well as to engineering validation procedures. Current research areas and anticipated future achievements are also discussed.
Costa, Ana Lúcia Siqueira; Silva, Rodrigo Marques da; Mussi, Fernanda Carneiro; Serrano, Patrícia Maria; Graziano, Eliane da Silva; Batista, Karla de Melo
2018-01-08
To validate a short version of the Instrument for assessment of stress in nursing students for the Brazilian context. Methodological study conducted with 1047 nursing students from five Brazilian institutions, who answered the 30 items initially distributed in eight domains. Data were analyzed in the R Statistical Package and in the latent variable analysis, using exploratory and confirmatory factor analyses, Cronbach's alpha and item-total correlation. The short version of the instrument had 19 items distributed into four domains: Environment, Professional Training, Theoretical Activities and Performance of Practical Activities. The confirmatory analysis showed absolute and parsimony fit to the proposed model with satisfactory residual levels. Alpha values per factor ranged from 0.736 (Environment) to 0.842 (Performance of Practical Activities). The short version of the instrument has construct validity and reliability for application to Brazilian nursing undergraduates at any stage of the course.
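Cronbach's alpha, one of the reliability statistics reported above, can be computed from an item-score matrix as sketched below; the simulated responses and the five-item domain are placeholders, not the instrument's data.

```python
# Cronbach's alpha for one domain of a questionnaire, from a respondents x items matrix.
import numpy as np

rng = np.random.default_rng(3)
items = rng.integers(1, 5, size=(300, 5)).astype(float)    # 300 respondents x 5 items, 1-4 scale (placeholder)

def cronbach_alpha(scores):
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()             # sum of item variances
    total_var = scores.sum(axis=1).var(ddof=1)              # variance of the total score
    return (k / (k - 1)) * (1 - item_var / total_var)

print(f"alpha = {cronbach_alpha(items):.3f}")
```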
Reliability Modeling Methodology for Independent Approaches on Parallel Runways Safety Analysis
NASA Technical Reports Server (NTRS)
Babcock, P.; Schor, A.; Rosch, G.
1998-01-01
This document is an adjunct to the final report, An Integrated Safety Analysis Methodology for Emerging Air Transport Technologies. That report presents the results of our analysis of the problem of simultaneous but independent approaches of two aircraft on parallel runways (independent approaches on parallel runways, or IAPR). This introductory chapter presents a brief overview and perspective of approaches and methodologies for performing safety analyses for complex systems. Ensuing chapters provide the technical details that underlie the approach that we have taken in performing the safety analysis for the IAPR concept.
The 'Direct Attack' Strategy for Poverty Removal: Implementation Methodology.
ERIC Educational Resources Information Center
Sinha, Sanjay
1981-01-01
Discusses elements of an implementation methodology for the removal of poverty in India. Includes background, methodology, aggregation of demands, economics of the strategy, complementary activities and infrastructure, mechanics of implementation, and monitoring. (CT)
Quantifying Volcanic Emissions of Trace Elements to the Atmosphere: Ideas Based on Past Studies
NASA Astrophysics Data System (ADS)
Rose, W. I.
2003-12-01
Extensive data exist from volcanological and geochemical studies about exotic elemental enrichments in volcanic emissions to the atmosphere, but quantitative data are quite rare. Advanced, highly sensitive techniques of analysis are needed to detect low concentrations of some minor elements, especially during major eruptions. I will present data from studies done during low levels of activity (incrustations and silica tube sublimates at high-temperature fumaroles, SEM studies of particle samples collected in volcanic plumes and volcanic clouds, geochemical analysis of volcanic gas condensates, and analysis of treated particle and gas filter packs) and a much smaller number of studies that could reflect explosive activity (fresh ashfall leachate geochemistry and thermodynamic codes modeling volatile emissions from magma). These data describe a highly variable pattern of elemental enrichments which is difficult to quantify, generalize and understand. Sampling in a routine way is difficult, and work in active craters has heightened our awareness of danger, which appropriately inhibits some sampling. There are numerous localized enrichments of minor elements that can be documented, and others can be expected or inferred. There is a lack of systematic tools to measure minor element abundances in volcanic emissions. The careful combination of several of the methodologies listed above at the same volcanic vents can provide redundant data on multiple elements, which could lead to overall quantification of minor element fluxes, but there are challenging issues about detection. For quiescent plumes we can design combinations of measurements to quantify minor element emission rates. Developing a comparable methodology for measuring minor element fluxes during significant eruptions will require new strategies and/or ideas.
Magness, Scott T.; Puthoff, Brent J.; Crissey, Mary Ann; Dunn, James; Henning, Susan J.; Houchen, Courtney; Kaddis, John S.; Kuo, Calvin J.; Li, Linheng; Lynch, John; Martin, Martin G.; May, Randal; Niland, Joyce C.; Olack, Barbara; Qian, Dajun; Stelzner, Matthias; Swain, John R.; Wang, Fengchao; Wang, Jiafang; Wang, Xinwei; Yan, Kelley; Yu, Jian
2013-01-01
Fluorescence-activated cell sorting (FACS) is an essential tool for studies requiring isolation of distinct intestinal epithelial cell populations. Inconsistent reporting, or lack of reporting, of the critical parameters associated with FACS methodologies has complicated interpretation, comparison, and reproduction of important findings. To address this problem, a comprehensive multicenter study was designed to develop guidelines that limit experimental and data reporting variability and provide a foundation for accurate comparison of data between studies. Common methodologies and data reporting protocols for tissue dissociation, cell yield, cell viability, FACS, and postsort purity were established. Seven centers tested the standardized methods by FACS-isolating a specific crypt-based epithelial population (EpCAM+/CD44+) from murine small intestine. Genetic biomarkers for stem/progenitor (Lgr5 and Atoh1) and differentiated cell lineages (lysozyme, mucin2, chromogranin A, and sucrase isomaltase) were interrogated in target and control populations to assess intra- and intercenter variability. Wilcoxon's rank sum test on gene expression levels showed limited intracenter variability between biological replicates. Principal component analysis demonstrated significant intercenter reproducibility among four centers. Analysis of data collected by standardized cell isolation methods and data reporting requirements readily identified methodological problems, indicating that standard reporting parameters facilitate post hoc error identification. These results indicate that the complexity of FACS isolation of target intestinal epithelial populations can be highly reproducible between biological replicates and different institutions by adherence to common cell isolation methods and FACS gating strategies. This study can be considered a foundation for continued method development and a starting point for investigators who are developing cell isolation expertise to study physiology and pathophysiology of the intestinal epithelium. PMID:23928185
Support system for the professional integration of people with disability into the labour market.
Filgueiras, Ernesto; Vilar, Elisângela; Rebelo, Francisco
2015-01-01
Successful cases of professional reintegration were achieved when adequate conditions were created for the adaptation of the worker with disability to the working environment and to the professional activity, allowing them to carry out all their functions without any restriction. In this sense, this paper presents a methodology for the professional integration of people with disability in service companies and industry. Its results include an analysis matrix of a set of observables for the reintegration of people with disability into the labour market, as well as an auxiliary tool for those who work in personnel recruitment. The main objective was to develop a tool (i.e., a software) based on crossing the data obtained from the analysis of individual capacities with the requirements of the job, to optimise the relationship between the worker and the workplace. A series of strategies that can be adopted by the individuals, and possible adaptations of the workplace, were also considered as ways to reduce the handicap in the accomplishment of different activities. The methodology for the development of this study is divided into two phases: Phase I, devoted to the assessment criteria and classification of the indispensable functional characteristics of the individuals; Phase II, related to the assessment criteria of the jobs and the functions that have to be performed. As a result, an evaluation tool was developed to match the individuals' capabilities and the job requirements. A software was created to support the evaluation and to help professionals during the assessment. This methodology, together with the support tool, demonstrated to be a quite inclusive approach, as it considers, as a matter of priority, the capacities of the individuals and the real necessities of the workplaces.
Hamilton, Kyra; White, Katherine M
2010-11-01
Drawing on the belief-based framework of the Theory of Planned Behaviour, this study employs qualitative methodology involving individual and group interviews to examine the beliefs associated with regular physical activity performance among parents of young children (N = 40). The data were analysed using thematic content analysis. A range of advantages (e.g. improves parenting practices), disadvantages (e.g. interferes with commitments), barriers (e.g. time), and facilitators (e.g. social support) to performing physical activity are identified. Normative pressures are also identified as affecting parents' activity behaviour. These identified beliefs can be used to inform interventions to challenge inactivity among this at-risk group.
TRAC Innovative Visualization Techniques
2016-11-14
Therefore, TRAC analysts need a way to analyze the effectiveness of their visualization design choices. Currently, TRAC does not have a methodology to analyze visualizations used to support an analysis story. Our research team developed a visualization design methodology to create effective visualizations that support an analysis story. First, we based our methodology on the latest research on design thinking, cognitive learning, and
Ethics education in research involving human beings in undergraduate medicine curriculum in Brazil.
Novaes, Maria Rita Garbi; Guilhem, Dirce; Barragan, Elena; Mennin, Stewart
2013-12-01
The Brazilian national curriculum guidelines for undergraduate medicine courses inspired and influenced the groundwork for knowledge acquisition, skills development and the perception of ethical values in the context of professional conduct. This study evaluated ethics education in research involving human beings in the undergraduate medicine curriculum in Brazil, both in courses with active learning processes and in those with traditional lecture learning methodologies. Curricula and teaching projects of 175 Brazilian medical schools were analyzed using a retrospective, historical and descriptive exploratory cohort study. Thirty-one medical schools were excluded from the study because of incomplete information or a refusal to participate. An active search for information from institutional sites and documents was guided by terms based on 69 DeCS/MeSH descriptors. Curriculum information was correlated with educational models of learning, such as active learning methodologies (tutorial discussions with an integrated curriculum organized into core modules) and traditional lecture learning methodologies for large classes organized by disciplines, and reviewed by the frequency of occurrence of ethical themes and the average hourly load per semester. Ninety-five medical schools used traditional learning methodologies. The ten most frequent ethical themes were: 1 - ethics in research (26); 2 - ethical procedures and advanced technology (46); 3 - ethical-professional conduct (413). Over 80% of schools using active learning methodologies had between 50 and 100 hours of scheduled curriculum time devoted to ethical themes, whereas more than 60% of traditional learning methodology schools devoted less than 50 hours of curriculum time to ethical themes. The data indicate that medical schools employing more active learning methodologies provide more attention and time to ethical themes than schools with traditional discipline-based methodologies. Given the importance of ethical issues in contemporary medical education, these findings are significant for curriculum change and modification plans in the future of Brazilian medical education. © 2012 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Volmert, Ben; Pantelias, Manuel; Mutnuru, R. K.; Neukaeter, Erwin; Bitterli, Beat
2016-02-01
In this paper, an overview of the Swiss Nuclear Power Plant (NPP) activation methodology is presented and the work towards its validation by in-situ NPP foil irradiation campaigns is outlined. Nuclear Research and consultancy Group (NRG) in The Netherlands has been given the task of performing the corresponding neutron metrology. For this purpose, small aluminium boxes containing a set of circular-shaped neutron activation foils have been prepared. After being irradiated for one complete reactor cycle, the sets have been successfully retrieved, followed by gamma-spectrometric measurements of the individual foils at NRG. Along with the individual activities of the foils, the reaction rates and thermal, intermediate and fast neutron fluence rates at the foil locations have been determined. These determinations include appropriate corrections for gamma self-absorption and neutron self-shielding as well as corresponding measurement uncertainties. The comparison of the NPP Monte Carlo calculations with the results of the foil measurements is done by using an individual generic MCNP model functioning as an interface and allowing the simulation of individual foil activation by predetermined neutron spectra. To summarize, the comparison between calculation and measurement serves as a sound validation of the Swiss NPP activation methodology by demonstrating a satisfying agreement between measurement and calculation. Finally, the validation offers a chance for further improvements of the existing NPP models through ensuing calibration and/or modelling optimizations for key components and structures.
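The link between a measured foil activity, the reaction rate and the fluence rate mentioned above rests on the standard activation relation A = N·σ·φ·(1 − exp(−λ·t_irr))·exp(−λ·t_cool). The sketch below evaluates it for a gold foil; the foil mass, fluence rate and irradiation/cooling times are round illustrative assumptions, not plant data, and self-shielding corrections are ignored.

```python
# Activity of an activation foil after irradiation and cooling (no self-shielding correction).
import numpy as np

N_A = 6.022e23
mass_g, molar_mass = 0.05, 197.0                 # ~50 mg Au-197 foil (illustrative)
sigma_cm2 = 98.65e-24                            # thermal (n,gamma) cross section, ~98.65 barn
phi = 1.0e13                                     # assumed thermal fluence rate, n/cm^2/s
half_life_s = 2.695 * 24 * 3600                  # Au-198 half-life
t_irr, t_cool = 30 * 24 * 3600, 2 * 24 * 3600    # assumed irradiation and cooling times

n_atoms = mass_g / molar_mass * N_A
lam = np.log(2) / half_life_s
reaction_rate = n_atoms * sigma_cm2 * phi        # reactions per second
activity_bq = reaction_rate * (1 - np.exp(-lam * t_irr)) * np.exp(-lam * t_cool)
print(f"Au-198 activity at measurement time: {activity_bq:.3e} Bq")
```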
Chauvin, Christine; Le Bouar, Gilbert; Lardjane, Salim
2017-01-01
Sea fishing is one of the most dangerous occupations. Numerous studies have already sought to evaluate the risk level of this occupation through the analysis of the frequency and seriousness of occupational injuries. The purpose of the present study is to analyse these accidents in terms of two main characteristics of the vessels involved: the fishery type (high seas, offshore, coastal, or inshore fishery) and the fishing activity (use of passive or active gears). Injury rates were calculated for the Brittany region and for the year 2012. A second analysis was carried out on 8,286 reported injuries that occurred in France from 2002 to 2012, while vessels were in the process of fishing. This first analysis shows that the incidence rate is very high (103 per 1,000 full-time equivalent fishermen) and that it depends more on the fishery type than on the fishing activity; the highest rates concern the offshore and the coastal fleets. Results of the second analysis show that the nature of accidents depends more on the fishing activity than on the type of fishery. These findings lead to a discussion of the causes of the highest incidence rate values and the causes of the observed variations. The discussion also involves the methodological difficulties related to the incidence rate calculations.
Structural Optimization Methodology for Rotating Disks of Aircraft Engines
NASA Technical Reports Server (NTRS)
Armand, Sasan C.
1995-01-01
In support of the preliminary evaluation of various engine technologies, a methodology has been developed for structurally designing the rotating disks of an aircraft engine. The structural design methodology, along with a previously derived methodology for predicting low-cycle fatigue life, was implemented in a computer program. An interface computer program was also developed that gathers the required data from a flowpath analysis program (WATE) being used at NASA Lewis. The computer program developed for this study requires minimum interaction with the user, thus allowing engineers with varying backgrounds in aeropropulsion to successfully execute it. The stress analysis portion of the methodology and the computer program were verified by employing the finite element analysis method. The 10th-stage high-pressure-compressor disk of the Energy Efficient Engine Program (E3) engine was used to verify the stress analysis; the differences between the stresses and displacements obtained from the computer program developed for this study and from the finite element analysis were all below 3 percent for the problem solved. The computer program developed for this study was employed to structurally optimize the rotating disks of the E3 high-pressure compressor. The rotating disks designed by the computer program in this study were approximately 26 percent lighter than those calculated from the E3 drawings. The methodology is presented herein.
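For context on the kind of closed-form check such a disk stress analysis can be compared against, the sketch below evaluates the classical uniform-thickness solid rotating disk stresses (Timoshenko); the material properties, speed and radius are illustrative assumptions, not the E3 disk data or the program's actual method.

```python
# Classical radial and hoop stresses in a solid rotating disk of uniform thickness.
import numpy as np

rho = 8.19e3                        # density, kg/m^3 (nickel alloy, illustrative)
nu = 0.30                           # Poisson's ratio
omega = 2 * np.pi * 12000 / 60.0    # 12,000 rpm in rad/s (illustrative)
R = 0.30                            # outer radius, m
r = np.linspace(0.0, R, 50)

sigma_r = rho * omega**2 / 8.0 * (3 + nu) * (R**2 - r**2)                   # radial stress
sigma_t = rho * omega**2 / 8.0 * ((3 + nu) * R**2 - (1 + 3 * nu) * r**2)    # hoop stress

print(f"peak stress at the disk center: {sigma_t[0] / 1e6:.1f} MPa")
```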
ERIC Educational Resources Information Center
Pedersen, Mitra
2013-01-01
This study investigated the rate of success for IT projects using agile and standard project management methodologies. Any successful project requires use of project methodology. Specifically, large projects require formal project management methodologies or models, which establish a blueprint of processes and project planning activities. This…
Neng, N R; Mestre, A S; Carvalho, A P; Nogueira, J M F
2011-09-16
In this contribution, powdered activated carbons (ACs) prepared from cork waste were evaluated as novel adsorbent phases for bar adsorptive micro-extraction (BAμE) for the analysis of polar compounds. By combining this approach with liquid desorption followed by high performance liquid chromatography with diode array detection (BAμE(AC)-LD/HPLC-DAD), good analytical performance was achieved using clofibric acid (CLOF) and ibuprofen (IBU) as model compounds in environmental and biological matrices. Assays performed on 30 mL water samples spiked at the 25.0 μg L⁻¹ level yielded recoveries around 80% for CLOF and 95% for IBU under optimized experimental conditions. The textural and surface chemistry properties of the ACs were correlated with the results obtained. The analytical performance showed good precision (<15%), suitable detection limits (0.24 and 0.78 μg L⁻¹ for CLOF and IBU, respectively) and good linear dynamic ranges (r² > 0.9922) from 1.0 to 600.0 μg L⁻¹. By using the standard addition methodology, the application of the present approach to environmental water and urine matrices allowed remarkable performance at the trace level. The proposed methodology proved to be a viable alternative for the analysis of acidic pharmaceuticals, being easy to implement, reliable, sensitive and requiring low sample volume to monitor these priority compounds in environmental and biological matrices. Copyright © 2011 Elsevier B.V. All rights reserved.
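The standard addition quantification mentioned above reduces to a routine calculation: fit a straight line to detector response versus added analyte concentration and read the unknown concentration from the magnitude of the x-intercept. A minimal sketch with invented calibration points:

    import numpy as np

    # Hypothetical standard-addition series: spiked concentration (ug/L) vs. detector response
    added = np.array([0.0, 10.0, 20.0, 40.0, 80.0])
    signal = np.array([12.4, 17.1, 21.9, 31.5, 50.8])   # placeholder peak areas

    slope, intercept = np.polyfit(added, signal, 1)      # linear least squares
    c_unknown = intercept / slope                        # |x-intercept| = original concentration
    print(f"estimated analyte concentration: {c_unknown:.1f} ug/L")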
Detecting switching and intermittent causalities in time series
NASA Astrophysics Data System (ADS)
Zanin, Massimiliano; Papo, David
2017-04-01
During the last decade, complex network representations have emerged as a powerful instrument for describing the cross-talk between different brain regions both at rest and as subjects are carrying out cognitive tasks, in healthy brains and neurological pathologies. The transient nature of such cross-talk has nevertheless by and large been neglected, mainly due to the inherent limitations of some metrics, e.g., causality ones, which require a long time series in order to yield statistically significant results. Here, we present a methodology to account for intermittent causal coupling in neural activity, based on the identification of non-overlapping windows within the original time series in which the causality is strongest. The result is a less coarse-grained assessment of the time-varying properties of brain interactions, which can be used to create a high temporal resolution time-varying network. We apply the proposed methodology to the analysis of the brain activity of control subjects and alcoholic patients performing an image recognition task. Our results show that short-lived, intermittent, local-scale causality is better at discriminating both groups than global network metrics. These results highlight the importance of the transient nature of brain activity, at least under some pathological conditions.
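The paper's exact windowing criterion is not given here; one way to operationalise "non-overlapping windows in which the causality is strongest" is to score each window with a Granger-style F statistic and keep the top-scoring windows. A hedged sketch of that idea on synthetic signals with intermittent coupling (window length, lag order and the toy data are arbitrary choices, not the authors' settings):

    import numpy as np

    def granger_f(x, y, lag=1):
        """Lag-1 Granger-style F statistic for 'y causes x':
        compare x_t ~ x_{t-1} (restricted) against x_t ~ x_{t-1} + y_{t-1} (full)."""
        xt, xlag, ylag = x[lag:], x[:-lag], y[:-lag]
        Xr = np.column_stack([np.ones_like(xlag), xlag])          # restricted design
        Xf = np.column_stack([np.ones_like(xlag), xlag, ylag])    # full design
        rss = lambda X: np.sum((xt - X @ np.linalg.lstsq(X, xt, rcond=None)[0]) ** 2)
        rss_r, rss_f = rss(Xr), rss(Xf)
        n, q = len(xt), 1
        return ((rss_r - rss_f) / q) / (rss_f / (n - Xf.shape[1]))

    def strongest_windows(x, y, win=128, top=3):
        """Score non-overlapping windows and return the ones with strongest causality."""
        scores = [(i, granger_f(x[i:i + win], y[i:i + win]))
                  for i in range(0, len(x) - win + 1, win)]
        return sorted(scores, key=lambda s: -s[1])[:top]

    # Toy signals: y drives x only in the middle third of the recording
    rng = np.random.default_rng(0)
    y = rng.standard_normal(1536)
    x = rng.standard_normal(1536)
    x[512:1024] += 0.8 * y[511:1023]          # intermittent coupling
    print(strongest_windows(x, y))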
Anthropic Risk Assessment on Biodiversity
NASA Astrophysics Data System (ADS)
Piragnolo, M.; Pirotti, F.; Vettore, A.; Salogni, G.
2013-01-01
This paper presents a methodology for risk assessment of anthropic activities on habitats and species. The method has been developed for the Veneto Region, in order to simplify and improve the quality of the EIA procedure (VINCA). Habitats and species, animals and plants, are protected by European Directives 92/43/EEC and 2009/147/EC, but they are subject to hazards due to pollution produced by human activities. Biodiversity risks may lead to deterioration and disturbance of ecological niches, with the consequence of biodiversity loss. Ecological risk assessment applied to the Natura 2000 network is needed for best-practice management and monitoring of the environment and natural resources. Threats, pressures and activities, stresses and indicators may be managed in a geodatabase and analysed using GIS technology. The method used is classic risk assessment in an ecological context, and it defines the natural hazard as influence, the element of risk as interference, and vulnerability. It also defines a new parameter called pressure. It uses a risk matrix for risk analysis on spatial and temporal scales. The methodology is qualitative and applies the precautionary principle in environmental assessment. The final product is a matrix that allows risk to be excluded and could find application in the development of a territorial information system.
Shimoda, Lori M.N.; Park, Christy; Stokes, Alexander J.; Gomes, Henry Halenani; Turner, Helen
2013-01-01
Kava (‘Awa) is a traditional water-based beverage in Pacific island communities, prepared from the ground root and stems of Piper methysticum. Kava use is associated with an ichthyotic dermatitis and delayed type hypersensitivity reactions. In the current study we collated preparative methodologies from cultural practitioners and recreational kava users in various Pacific communities. We standardized culturally-informed aqueous extraction methods and prepared extracts that were subjected to basic physicochemical analysis. Mast cells exposed to these extracts displayed robust intracellular free calcium responses, and concomitant release of pro-inflammatory mediators. In contrast, mast cells were refractory to single or combinatorial stimulation with kavalactones including methysticin, dihydromethysticin and kavain. Moreover, we reproduced a traditional modification of the kava preparation methodology, pre-mixing with the mucilage of Hibiscus tiliaceus, and observed its potentiating effect on the activity of aqueous extracts in mast cells. Taken together, these data indicate that water-extractable active ingredients may play a role in the physiological and pathophysiological effects of kava, and suggest that mast cell activation may be a mechanistic component of kava-related skin inflammation. PMID:22473598
Ghasemzadeh, Ali; Jaafar, Hawa Z E; Rahmat, Asmah
2015-07-30
Analysis and extraction of plant matrices are important processes for the development, modernization, and quality control of herbal formulations. Response surface methodology is a collection of statistical and mathematical techniques used to optimize the range of variables in various experimental processes, reducing the number of experimental runs, cost, and time compared to other methods. Response surface methodology was applied to optimize reflux extraction conditions for achieving high 6-gingerol and 6-shogaol contents and high antioxidant activity in Zingiber officinale var. rubrum Theilade. A two-factor central composite design was employed to determine the effects of two independent variables, namely extraction temperature (X1: 50-80 °C) and time (X2: 2-4 h), on the properties of the extracts. The 6-gingerol and 6-shogaol contents were measured using ultra-performance liquid chromatography. The antioxidant activity of the rhizome extracts was determined by means of the 1,1-diphenyl-2-picrylhydrazyl assay. Anticancer activity of the optimized extracts against HeLa cancer cell lines was measured using the MTT (3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide) assay. Increasing the extraction temperature and time significantly affected the responses. The optimum extraction condition for all responses was 76.9 °C for 3.4 h. Under the optimum condition, the corresponding predicted response values for 6-gingerol, 6-shogaol, and the antioxidant activity were 2.89 mg/g DW, 1.85 mg/g DW, and 84.3%, respectively. 6-Gingerol and 6-shogaol were extracted under the optimized condition to check the validity of the models; the values were 2.92 and 1.88 mg/g DW, and 84.0%, for 6-gingerol, 6-shogaol, and the antioxidant activity, respectively. The experimental values agreed with those predicted, indicating the suitability of the models employed and the success of RSM in optimizing the extraction condition. By optimizing the reflux extraction, the anticancer activity of the extracts against HeLa cancer cells was enhanced by about 16.8%. The half-maximal inhibitory concentration (IC50) values of the optimized and unoptimized extracts were 20.9 and 38.4 μg/mL, respectively. The optimized extract showed more distinct anticancer activity against HeLa cancer cells at a concentration of 40 μg/mL (P < 0.01) without toxicity to normal cells. The results indicated that the pharmaceutical quality of ginger could be improved significantly by optimizing the extraction process using response surface methodology.
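A central composite design is analysed by fitting a second-order polynomial in the two factors and locating its optimum; the sketch below does this by least squares and a grid search. The design points and responses are placeholders, not the paper's measurements.

    import numpy as np

    def fit_quadratic_surface(x1, x2, y):
        """Least-squares fit of y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2."""
        X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return beta

    def predict(beta, x1, x2):
        return (beta[0] + beta[1]*x1 + beta[2]*x2
                + beta[3]*x1**2 + beta[4]*x2**2 + beta[5]*x1*x2)

    # Placeholder CCD runs: temperature (degC), time (h), response (e.g. 6-gingerol, mg/g DW)
    temp = np.array([50, 80, 50, 80, 44, 86, 65, 65, 65, 65, 65])
    time = np.array([2, 2, 4, 4, 3, 3, 1.6, 4.4, 3, 3, 3])
    resp = np.array([1.9, 2.4, 2.2, 2.8, 1.8, 2.7, 2.0, 2.6, 2.5, 2.5, 2.4])

    beta = fit_quadratic_surface(temp, time, resp)
    T, H = np.meshgrid(np.linspace(50, 80, 61), np.linspace(2, 4, 41))
    Z = predict(beta, T, H)
    i = np.unravel_index(np.argmax(Z), Z.shape)
    print(f"predicted optimum: {T[i]:.1f} degC, {H[i]:.2f} h -> {Z[i]:.2f}")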
Biffi, E; Menegon, A; Regalia, G; Maida, S; Ferrigno, G; Pedrocchi, A
2011-08-15
Modern drug discovery for Central Nervous System pathologies has recently focused its attention on in vitro neuronal networks as models for the study of neuronal activities. Micro Electrode Arrays (MEAs), a widely recognized tool for pharmacological investigations, enable the simultaneous study of the spiking activity of discrete regions of a neuronal culture, providing an insight into the dynamics of networks. Taking advantage of MEA features and making the most of cross-correlation analysis to assess internal parameters of a neuronal system, we provide an efficient method for the evaluation of comprehensive neuronal network activity. We developed an intra-network burst correlation algorithm, evaluated its sensitivity, and explored its potential use in pharmacological studies. Our results demonstrate the high sensitivity of this algorithm and the efficacy of this methodology in pharmacological dose-response studies, with the advantage of analyzing the effect of drugs on the comprehensive correlative properties of integrated neuronal networks. Copyright © 2011 Elsevier B.V. All rights reserved.
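The burst-correlation algorithm itself is not described in the abstract; its basic ingredient, a normalised cross-correlogram between the binned spike counts of two electrodes, can be sketched as follows. Spike times are synthetic, and the aggregation across electrode pairs and bursts that the full method would perform is omitted.

    import numpy as np

    def binned_xcorr(spikes_a, spikes_b, t_max, bin_s=0.01, max_lag_bins=50):
        """Normalised cross-correlation of two spike trains after binning into counts."""
        edges = np.arange(0.0, t_max + bin_s, bin_s)
        a, _ = np.histogram(spikes_a, edges)
        b, _ = np.histogram(spikes_b, edges)
        a = (a - a.mean()) / (a.std() + 1e-12)
        b = (b - b.mean()) / (b.std() + 1e-12)
        lags = np.arange(-max_lag_bins, max_lag_bins + 1)
        cc = np.array([np.mean(a[max(0, -l):len(a) - max(0, l)] *
                               b[max(0, l):len(b) - max(0, -l)]) for l in lags])
        return lags * bin_s, cc

    # Synthetic example: electrode B echoes electrode A with a ~20 ms delay
    rng = np.random.default_rng(1)
    spikes_a = np.sort(rng.uniform(0, 60, 1500))
    spikes_b = np.sort(np.concatenate([spikes_a + 0.02 + 0.003 * rng.standard_normal(1500),
                                       rng.uniform(0, 60, 500)]))
    lags, cc = binned_xcorr(spikes_a, spikes_b, t_max=60)
    print(f"peak correlation {cc.max():.2f} at lag {lags[np.argmax(cc)]*1000:.0f} ms")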
Harinipriya, S; Sangaranarayanan, M V
2006-01-31
The evaluation of the free energy of activation pertaining to the electron-transfer reactions occurring at liquid/liquid interfaces is carried out employing a diffuse boundary model. The interfacial solvation numbers are estimated using a lattice gas model under the quasichemical approximation. The standard reduction potentials of the redox couples, appropriate inner potential differences, dielectric permittivities, as well as the width of the interface are included in the analysis. The methodology is applied to the reaction between [Fe(CN)6](3-/4-) and [Lu(biphthalocyanine)](3+/4+) at water/1,2-dichloroethane interface. The rate-determining step is inferred from the estimated free energy of activation for the constituent processes. The results indicate that the solvent shielding effect and the desolvation of the reactants at the interface play a central role in dictating the free energy of activation. The heterogeneous electron-transfer rate constant is evaluated from the molar reaction volume and the frequency factor.
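For orientation only: once a free energy of activation and a frequency (pre-exponential) factor are in hand, a rate constant follows from the generic transition-state relation k = Z·exp(−ΔG‡/RT). This is the textbook expression, not the diffuse boundary model itself, and the numbers below are placeholders.

    import numpy as np

    R_GAS = 8.314462618          # gas constant, J/(mol*K)

    def rate_constant(delta_g_act_kj, freq_factor, temperature=298.15):
        """Generic transition-state estimate: k = Z * exp(-dG_act / (R*T))."""
        return freq_factor * np.exp(-delta_g_act_kj * 1e3 / (R_GAS * temperature))

    # Placeholder values for an interfacial electron transfer
    print(f"k = {rate_constant(delta_g_act_kj=35.0, freq_factor=1.0e11):.3e} (units of Z)")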
NASA Astrophysics Data System (ADS)
Izquierdo, Mª Teresa; de Yuso, Alicia Martínez; Valenciano, Raquel; Rubio, Begoña; Pino, Mª Rosa
2013-01-01
The objective of this study was to evaluate the adsorption capacity for toluene and hexane of activated carbons prepared according to an experimental design, considering as variables the activation temperature, the impregnation ratio and the activation time. Response surface methodology was applied to optimize the adsorption capacity of the carbons with respect to the preparation conditions that determine the physicochemical characteristics of the activated carbons. The preparation methodology produced activated carbons with surface areas and micropore volumes as high as 1128 m²/g and 0.52 cm³/g, respectively. Moreover, the activated carbons exhibit mesoporosity, with the percentage of microporosity ranging from 64.6% to 89.1%. The surface chemistry was characterized by TPD, FTIR and acid-base titration; the techniques yielded different values for the surface groups because of the limitations of each technique, but similar trends were obtained for the activated carbons studied. The exhaustive characterization of the activated carbons shows that the measured surface area does not explain the adsorption capacity for either toluene or n-hexane. On the other hand, the surface chemistry does not explain the adsorption results either. A compromise between physical and chemical characteristics can be obtained from the appropriate activation conditions, and the response surface methodology gives the optimal activated carbon to maximize adsorption capacity. Low activation temperatures and intermediate impregnation ratios lead to high toluene and n-hexane adsorption capacities depending on the activation time, which is a determining factor in maximizing toluene adsorption.
NASA Astrophysics Data System (ADS)
Ortiz-Jaramillo, B.; Fandiño Toro, H. A.; Benitez-Restrepo, H. D.; Orjuela-Vargas, S. A.; Castellanos-Domínguez, G.; Philips, W.
2012-03-01
Infrared Non-Destructive Testing (INDT) is known as an effective and rapid method for nondestructive inspection. It can detect a broad range of near-surface structural flaws in metallic and composite components. Those flaws are modeled as a smooth contour centered at peaks of stored thermal energy, termed Regions of Interest (ROI). Dedicated methodologies must detect the presence of those ROIs. In this paper, we present a methodology for ROI extraction in INDT tasks. The methodology deals with the difficulties due to non-uniform heating, which affects low spatial frequencies and hinders the detection of relevant points in the image. The proposed methodology performs ROI extraction using multi-resolution analysis and is robust to low ROI contrast and non-uniform heating. It includes local correlation, Gaussian scale analysis and local edge detection. Local correlation between the image and a Gaussian window provides interest points related to ROIs; a Gaussian window is used because thermal behavior is well modeled by Gaussian smooth contours. The Gaussian scale is used to analyze details in the image using multi-resolution analysis, avoiding low contrast, non-uniform heating and the selection of the Gaussian window size. Finally, local edge detection is used to provide a good estimation of the ROI boundaries. Thus, we provide a methodology for ROI extraction based on multi-resolution analysis that performs as well as or better than other dedicated algorithms proposed in the state of the art.
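A stripped-down version of the interest-point step is sketched below: correlate the thermogram with a Gaussian template after removing the slowly varying background, then outline candidate ROIs with an edge detector. It runs on a synthetic image with scipy and omits the multi-scale and contrast-handling parts of the full methodology; kernel sizes and thresholds are arbitrary.

    import numpy as np
    from scipy.signal import fftconvolve
    from scipy import ndimage

    def gaussian_kernel(size=21, sigma=4.0):
        ax = np.arange(size) - size // 2
        xx, yy = np.meshgrid(ax, ax)
        k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
        return k / k.sum()

    # Synthetic thermogram: smooth non-uniform heating plus two Gaussian "defect" hot spots
    rng = np.random.default_rng(2)
    y, x = np.mgrid[0:200, 0:200]
    img = 0.004 * x + 0.3 * rng.standard_normal((200, 200))
    for cy, cx in [(60, 70), (140, 150)]:
        img += 2.0 * np.exp(-((x - cx)**2 + (y - cy)**2) / (2 * 6.0**2))

    # Local correlation with a Gaussian window highlights smooth, peaked regions (candidate ROIs)
    detrended = img - ndimage.uniform_filter(img, 41)        # crude non-uniform heating removal
    score = fftconvolve(detrended, gaussian_kernel(), mode="same")
    mask = score > score.mean() + 3 * score.std()

    # Local edge detection sketches the ROI boundaries
    m = mask.astype(float)
    edges = np.hypot(ndimage.sobel(m, 0), ndimage.sobel(m, 1)) > 0
    labels, n = ndimage.label(mask)
    print(f"detected {n} candidate ROIs; boundary pixels: {int(edges.sum())}")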
NASA Astrophysics Data System (ADS)
Lee, Yu-Cheng; Yen, Tieh-Min; Tsai, Chih-Hung
This study provides an integrated model of Supplier Quality Performance Assessment (SQPA) activity for the semiconductor industry by introducing the ISO 9001 management framework, Importance-Performance Analysis (IPA), and Taguchi's Signal-to-Noise Ratio (S/N) techniques. This integrated model provides an SQPA methodology to create value for all members under mutual cooperation and trust in the supply chain. The method helps organizations build a complete SQPA framework, linking organizational objectives and SQPA activities to optimize rating techniques and promote supplier quality improvement. The techniques used in SQPA activities are easily understood. A case involving a design house is illustrated to show our model.
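The Taguchi signal-to-noise ratio used for rating is a standard formula; for a larger-the-better quality characteristic it is S/N = −10·log10(mean(1/y²)). A small sketch under that assumption, with made-up supplier ratings:

    import numpy as np

    def sn_larger_the_better(scores):
        """Taguchi S/N ratio for a larger-the-better characteristic (e.g. supplier quality scores)."""
        y = np.asarray(scores, dtype=float)
        return -10.0 * np.log10(np.mean(1.0 / y**2))

    # Hypothetical repeated quality ratings for two suppliers (scale 1-10)
    suppliers = {"supplier_A": [8.5, 9.0, 8.8, 9.2], "supplier_B": [9.0, 6.5, 9.5, 7.0]}
    for name, ratings in suppliers.items():
        print(f"{name}: S/N = {sn_larger_the_better(ratings):.2f} dB")

Because the S/N ratio penalises variability as well as low mean performance, supplier_B scores lower despite a comparable average rating, which is the point of using it for supplier rating.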
Okahashi, Nobuyuki; Kohno, Susumu; Kitajima, Shunsuke; Matsuda, Fumio; Takahashi, Chiaki; Shimizu, Hiroshi
2015-12-01
Studying metabolic directions and flow rates in cultured mammalian cells can provide key information for understanding metabolic function in the fields of cancer research, drug discovery, stem cell biology, and antibody production. In this work, metabolic engineering methodologies including medium component analysis, 13C-labeling experiments, and computer-aided simulation analysis were applied to characterize the metabolic phenotype of soft tissue sarcoma cells derived from p53-null mice. Cells were cultured in medium containing [1-13C]glutamine to assess the level of reductive glutamine metabolism via the reverse reaction of isocitrate dehydrogenase (IDH). The specific uptake and production rates of glucose, organic acids, and the 20 amino acids were determined by time-course analysis of cultured media. Gas chromatography-mass spectrometry analysis of the 13C-labeling of citrate, succinate, fumarate, malate, and aspartate confirmed an isotopically steady state of the cultured cells. After removing the effect of naturally occurring isotopes, the direction of the IDH reaction was determined by computer-aided analysis. The results validated that metabolic engineering methodologies are applicable to soft tissue sarcoma cells derived from p53-null mice, and also demonstrated that reductive glutamine metabolism is active in p53-null soft tissue sarcoma cells under normoxia. Copyright © 2015 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
Meyer, Georg F; Spray, Amy; Fairlie, Jo E; Uomini, Natalie T
2014-01-01
Current neuroimaging techniques with high spatial resolution constrain participant motion so that many natural tasks cannot be carried out. The aim of this paper is to show how a time-locked correlation-analysis of cerebral blood flow velocity (CBFV) lateralization data, obtained with functional TransCranial Doppler (fTCD) ultrasound, can be used to infer cerebral activation patterns across tasks. In a first experiment we demonstrate that the proposed analysis method results in data that are comparable with the standard Lateralization Index (LI) for within-task comparisons of CBFV patterns, recorded during cued word generation (CWG) at two difficulty levels. In the main experiment we demonstrate that the proposed analysis method shows correlated blood-flow patterns for two different cognitive tasks that are known to draw on common brain areas, CWG, and Music Synthesis. We show that CBFV patterns for Music and CWG are correlated only for participants with prior musical training. CBFV patterns for tasks that draw on distinct brain areas, the Tower of London and CWG, are not correlated. The proposed methodology extends conventional fTCD analysis by including temporal information in the analysis of cerebral blood-flow patterns to provide a robust, non-invasive method to infer whether common brain areas are used in different cognitive tasks. It complements conventional high resolution imaging techniques.
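A rough sketch of the two quantities being compared follows: a conventional lateralization index taken as the mean left-right CBFV difference in a task window, and the proposed time-locked correlation between the lateralization time courses of two tasks. The signals are synthetic and epoch averaging is greatly simplified relative to standard fTCD processing.

    import numpy as np

    def lateralization_timecourse(left, right, baseline):
        """Percent-change lateralization time course: (dL - dR) relative to baseline levels."""
        dl = 100.0 * (left - left[baseline].mean()) / left[baseline].mean()
        dr = 100.0 * (right - right[baseline].mean()) / right[baseline].mean()
        return dl - dr

    rng = np.random.default_rng(3)
    t = np.arange(0, 30, 0.1)                        # one 30 s epoch sampled at 10 Hz
    activation = np.exp(-0.5 * ((t - 12) / 3) ** 2)  # common activation shape
    baseline = t < 5

    def make_task(gain_left, gain_right):
        left = 60 + gain_left * activation + rng.standard_normal(t.size)
        right = 60 + gain_right * activation + rng.standard_normal(t.size)
        return lateralization_timecourse(left, right, baseline)

    li_word  = make_task(6.0, 2.0)   # cued word generation: left-dominant
    li_music = make_task(5.0, 2.5)   # music synthesis: similar pattern (trained musician)

    # Conventional LI: mean lateralization in a task window; proposed: time-locked correlation
    window = (t > 8) & (t < 18)
    print(f"LI(word) = {li_word[window].mean():.2f}  LI(music) = {li_music[window].mean():.2f}")
    print(f"time-locked correlation = {np.corrcoef(li_word, li_music)[0, 1]:.2f}")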
Analysis of pressure distortion testing
NASA Technical Reports Server (NTRS)
Koch, K. E.; Rees, R. L.
1976-01-01
The development of a distortion methodology, method D, was documented, and its application to steady state and unsteady data was demonstrated. Three methodologies based upon DIDENT, a NASA-LeRC distortion methodology based upon the parallel compressor model, were investigated by applying them to a set of steady state data. The best formulation was then applied to an independent data set. The good correlation achieved with this data set showed that method E, one of the above methodologies, is a viable concept. Unsteady data were analyzed by using the method E methodology. This analysis pointed out that the method E sensitivities are functions of pressure defect level as well as corrected speed and pattern.
Chen, Shasha; Zeng, Zhi; Hu, Na; Bai, Bo; Wang, Honglun; Suo, Yourui
2018-03-01
Lycium ruthenicum Murr. (LR) is a functional food that plays an important role in anti-oxidation due to its high level of phenolic compounds. This study aims to optimize ultrasound-assisted extraction (UAE) of phenolic compounds and the antioxidant activities of the obtained extracts from LR using response surface methodology (RSM). A four-factor, three-level Box-Behnken design (BBD) was employed to examine the following extraction parameters: extraction time (X1), ultrasonic power (X2), solvent to sample ratio (X3) and solvent concentration (X4). The analysis of variance (ANOVA) results revealed that the solvent to sample ratio had a significant influence on all responses, while the extraction time had no statistically significant effect on phenolic compounds. The optimum values for the combination of phenolic compounds and antioxidant activities were obtained at X1 = 30 min, X2 = 100 W, X3 = 40 mL/g, and X4 = 33% (v/v). Five phenolic acids, including chlorogenic acid, caffeic acid, syringic acid, p-coumaric acid and ferulic acid, were analyzed by HPLC. Our results indicated that optimization of the extraction is vital for the quantification of phenolic compounds and antioxidant activity in LR, which may contribute to large-scale industrial applications and future research on pharmacological activities. Copyright © 2017 Elsevier Ltd. All rights reserved.
Barquero, Laura A.; Davis, Nicole; Cutting, Laurie E.
2014-01-01
A growing number of studies examine instructional training and brain activity. The purpose of this paper is to review the literature regarding neuroimaging of reading intervention, with a particular focus on reading difficulties (RD). To locate relevant studies, searches of peer-reviewed literature were conducted using electronic databases to search for studies from the imaging modalities of fMRI and MEG (including MSI) that explored reading intervention. Of the 96 identified studies, 22 met the inclusion criteria for descriptive analysis. A subset of these (8 fMRI experiments with post-intervention data) was subjected to activation likelihood estimate (ALE) meta-analysis to investigate differences in functional activation following reading intervention. Findings from the literature review suggest differences in functional activation of numerous brain regions associated with reading intervention, including bilateral inferior frontal, superior temporal, middle temporal, middle frontal, superior frontal, and postcentral gyri, as well as bilateral occipital cortex, inferior parietal lobules, thalami, and insulae. Findings from the meta-analysis indicate change in functional activation following reading intervention in the left thalamus, right insula/inferior frontal, left inferior frontal, right posterior cingulate, and left middle occipital gyri. Though these findings should be interpreted with caution due to the small number of studies and the disparate methodologies used, this paper is an effort to synthesize across studies and to guide future exploration of neuroimaging and reading intervention. PMID:24427278
A new methodological approach for worldwide beryllium-7 time series analysis
NASA Astrophysics Data System (ADS)
Bianchi, Stefano; Longo, Alessandro; Plastino, Wolfango
2018-07-01
Time series analyses of cosmogenic radionuclide 7Be and 22Na atmospheric activity concentrations and meteorological data observed at twenty-five International Monitoring System (IMS) stations of the Comprehensive Nuclear-Test-Ban Treaty Organisation (CTBTO) have shown great variability in terms of noise structures, harmonic content, cross-correlation patterns and local Hurst exponent behaviour. The noise content and its structure have been extracted and characterised for the two radionuclide time series. It has been found that the yearly component, which is present in most of the time series, is not stationary, but has a percentage weight that varies with time. Analysis of the atmospheric activity concentrations of 7Be measured at IMS stations has shown them to be influenced by distinct meteorological patterns, mainly atmospheric pressure and temperature.
Community mobilization, organizing, and media advocacy. A discussion of methodological issues.
Treno, A J; Holder, H D
1997-04-01
Community mobilization refers to those activities that prepare communities to accept, receive, and support prevention interventions designed to reduce alcohol-involved trauma. Media advocacy refers to the strategic use of media by those seeking to advance a social or public policy initiative. Within the Community Prevention Trial, both of these activities were critical elements. This article presents the evaluation design for community mobilization and media advocacy implemented for the project. The authors argue for the need to include structured and unstructured community monitoring instruments, coding of local alcohol-related news coverage, and surveys of community members about their exposure to alcohol-related problems and their support for project interventions. This article also presents an audience segmentation analysis and discusses the implications of this analysis for media advocacy efforts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ovacik, Meric A.; Sen, Banalata; Euling, Susan Y.
Pathway activity level analysis, the approach pursued in this study, focuses on all genes that are known to be members of metabolic and signaling pathways as defined by the KEGG database. The pathway activity level analysis entails singular value decomposition (SVD) of the expression data of the genes constituting a given pathway. We explore an extension of the pathway activity methodology for application to time-course microarray data. We show that pathway analysis enhances our ability to detect biologically relevant changes in pathway activity using synthetic data. As a case study, we apply the pathway activity level formulation coupled with significance analysis to microarray data from two different rat testes exposed in utero to Dibutyl Phthalate (DBP). In utero DBP exposure in the rat results in developmental toxicity of a number of male reproductive organs, including the testes. One well-characterized mode of action for DBP and the male reproductive developmental effects is the repression of expression of genes involved in cholesterol transport, steroid biosynthesis and testosterone synthesis that lead to a decreased fetal testicular testosterone. Previous analyses of DBP testes microarray data focused on either individual gene expression changes or changes in the expression of specific genes that are hypothesized, or known, to be important in testicular development and testosterone synthesis. However, a pathway analysis may inform whether there are additional affected pathways that could inform additional modes of action linked to DBP developmental toxicity. We show that pathway activity analysis may be considered for a more comprehensive analysis of microarray data.
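The pathway activity level can be pictured as each sample's score on the first singular component of the expression submatrix formed by the pathway's member genes. A minimal sketch of that step on random data (the published method adds normalisation and significance analysis on top of this projection):

    import numpy as np

    def pathway_activity_levels(expr, pathway_genes):
        """SVD-based pathway activity: project samples onto the pathway's first singular vector.
        expr: genes x samples matrix; pathway_genes: row indices of the pathway's members."""
        sub = expr[pathway_genes, :]
        sub = sub - sub.mean(axis=1, keepdims=True)        # center each gene across samples
        u, s, vt = np.linalg.svd(sub, full_matrices=False)
        return s[0] * vt[0]                                # one activity value per sample/time point

    # Toy time-course: 500 genes x 8 time points, with a 20-gene pathway sharing a trend
    rng = np.random.default_rng(4)
    expr = rng.standard_normal((500, 8))
    trend = np.linspace(0, 2, 8)
    pathway = np.arange(20)
    expr[pathway] += trend                                 # coordinated pathway response
    print(np.round(pathway_activity_levels(expr, pathway), 2))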
Modeling energy/economy interactions for conservation and renewable energy-policy analysis
NASA Astrophysics Data System (ADS)
Groncki, P. J.
Energy policy and the implications for policy analysis and the methodological tools are discussed. The evolution of one methodological approach and the combined modeling system of the component models, their evolution in response to changing analytic needs, and the development of the integrated framework are reported. The analyses performed over the past several years are summarized. The current philosophy behind energy policy is discussed and compared to recent history. Implications for current policy analysis and methodological approaches are drawn.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mandelli, Diego; Rabiti, Cristian; Cogliati, Joshua
2014-11-01
Passive systems, structures and components (SSCs) will degrade over their operating life, and this degradation may cause a reduction in the safety margins of a nuclear power plant. In traditional probabilistic risk assessment (PRA) using the event-tree/fault-tree methodology, passive SSC failure rates are generally based on generic plant failure data and the true state of a specific plant is not reflected realistically. To address aging effects of passive SSCs in the traditional PRA methodology, [1] does consider physics-based models that account for the operating conditions in the plant; however, [1] does not include effects of surveillance/inspection. This paper presents an overall methodology for the incorporation of aging modeling of passive components into the RAVEN/RELAP-7 environment, which provides a framework for performing dynamic PRA. Dynamic PRA allows consideration of both epistemic and aleatory uncertainties (including those associated with maintenance activities) in a consistent phenomenological and probabilistic framework and is often needed when there is complex process/hardware/software/firmware/human interaction [2]. Dynamic PRA has gained attention recently due to difficulties in the traditional PRA modeling of aging effects of passive components using physics-based models and also in the modeling of digital instrumentation and control systems. RAVEN (Reactor Analysis and Virtual control Environment) [3] is a software package under development at the Idaho National Laboratory (INL) as an online control logic driver and post-processing tool. It is coupled to the plant transient code RELAP-7 (Reactor Excursion and Leak Analysis Program), also currently under development at INL [3], as well as RELAP5 [4]. The overall methodology aims to:
• Address multiple aging mechanisms involving a large number of components in a computationally feasible manner, where the sequencing of events is conditioned on the physical conditions predicted in a simulation environment such as RELAP-7.
• Identify the risk-significant passive components, their failure modes and anticipated rates of degradation.
• Incorporate surveillance and maintenance activities and their effects into the plant state and into component aging progress.
• Assess aging effects in a dynamic simulation environment.
1. C. L. SMITH, V. N. SHAH, T. KAO, G. APOSTOLAKIS, “Incorporating Ageing Effects into Probabilistic Risk Assessment - A Feasibility Study Utilizing Reliability Physics Models,” NUREG/CR-5632, USNRC, (2001).
2. T. ALDEMIR, “A Survey of Dynamic Methodologies for Probabilistic Safety Assessment of Nuclear Power Plants,” Annals of Nuclear Energy, 52, 113-124, (2013).
3. C. RABITI, A. ALFONSI, J. COGLIATI, D. MANDELLI and R. KINOSHITA, “Reactor Analysis and Virtual Control Environment (RAVEN) FY12 Report,” INL/EXT-12-27351, (2012).
4. D. ANDERS et al., “RELAP-7 Level 2 Milestone Report: Demonstration of a Steady State Single Phase PWR Simulation with RELAP-7,” INL/EXT-12-25924, (2012).
Speed Accuracy Tradeoffs in Human Speech Production
2017-05-01
for considering Fitts’ law in the domain of speech production is elucidated. Methodological challenges in applying Fitts-style analysis are addressed...order to assess whether articulatory kinematics conform to Fitts’ law. A second, associated goal is to address the methodological challenges inherent in...performing Fitts-style analysis on rtMRI data of speech production. Methodological challenges include segmenting continuous speech into specific motor
Tularosa Basin Play Fairway Analysis: Methodology Flow Charts
Adam Brandt
2015-11-15
These images show the comprehensive methodology used for creation of a Play Fairway Analysis to explore the geothermal resource potential of the Tularosa Basin, New Mexico. The deterministic methodology was originated by the petroleum industry, but was custom-modified to function as a knowledge-based geothermal exploration tool. The stochastic PFA flow chart uses weights of evidence, and is data-driven.
Regional Shelter Analysis Methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dillon, Michael B.; Dennison, Deborah; Kane, Jave
2015-08-01
The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
[Problem-based learning in cardiopulmonary resuscitation: basic life support].
Sardo, Pedro Miguel Garcez; Dal Sasso, Grace Terezinha Marcon
2008-12-01
This descriptive and exploratory study aimed to develop an educational practice of Problem-Based Learning (PBL) in CPR/BLS with 24 students in the third stage of the Nursing Undergraduate Course at a university in the southern region of Brazil. The study used the PBL methodology, focused on problem situations of cardiopulmonary arrest, and was approved by the CONEP. The methodological strategies for data collection, such as participative observation and questionnaires to evaluate the learning, the educational practices and their methodology, allowed the results to be grouped into: students' expectations; group activities; individual activities; practical activities; and evaluation of the meetings and their methodology. The study showed that PBL allows the educator to evaluate the academic learning process in several dimensions, functioning as a motivating factor for both the educator and the student, because it allows theoretical-practical integration in an integrated learning process.
A Hierarchical Clustering Methodology for the Estimation of Toxicity
A Quantitative Structure Activity Relationship (QSAR) methodology based on hierarchical clustering was developed to predict toxicological endpoints. This methodology utilizes Ward's method to divide a training set into a series of structurally similar clusters. The structural sim...
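A toy illustration of the clustering step as stated: group training compounds by structural descriptors with Ward's method using scipy. The descriptors, cluster count and routing comment are invented; the published method fits a separate QSAR model within each structurally similar cluster.

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    # Hypothetical structural descriptors for 12 training compounds (rows)
    rng = np.random.default_rng(5)
    descriptors = np.vstack([rng.normal(0.0, 0.3, (6, 4)),      # one structural family
                             rng.normal(2.0, 0.3, (6, 4))])     # another family

    Z = linkage(descriptors, method="ward")                     # Ward's minimum-variance linkage
    clusters = fcluster(Z, t=3, criterion="maxclust")           # cut into at most 3 clusters
    print("cluster assignment per compound:", clusters)
    # In the QSAR setting, a local model would then be fit within each cluster and a query
    # chemical routed to the cluster whose structural centroid it most closely resembles.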
System data communication structures for active-control transport aircraft, volume 1
NASA Technical Reports Server (NTRS)
Hopkins, A. L.; Martin, J. H.; Brock, L. D.; Jansson, D. G.; Serben, S.; Smith, T. B.; Hanley, L. D.
1981-01-01
Candidate data communication techniques are identified, including dedicated links, local buses, broadcast buses, multiplex buses, and mesh networks. The design methodology for mesh networks is then discussed, including network topology and node architecture. Several concepts of power distribution are reviewed, including current limiting and mesh networks for power. The technology issues of packaging, transmission media, and lightning are addressed, and, finally, the analysis tools developed to aid in the communication design process are described. There are special tools to analyze the reliability and connectivity of networks and more general reliability analysis tools for all types of systems.
ERIC Educational Resources Information Center
Tutlys, Vidmantas; Spöttl, Georg
2017-01-01
Purpose: This paper aims to explore methodological and institutional challenges on application of the work-process analysis approach in the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on the review of scientific literature and the analysis of…
The association between physical activity and renal cancer: systematic review and meta-analysis
Behrens, G; Leitzmann, M F
2013-01-01
Background: Physical activity may decrease renal cancer risk by reducing obesity, blood pressure, insulin resistance, and lipid peroxidation. Despite plausible biologic mechanisms linking increased physical activity to decreased risk for renal cancer, few epidemiologic studies have been able to report a clear inverse association between physical activity and renal cancer, and no meta-analysis is available on the topic. Methods: We searched the literature using PubMed and Web of Knowledge to identify published non-ecologic epidemiologic studies quantifying the relationship between physical activity and renal cancer risk in individuals without a cancer history. Following the PRISMA guidelines, we conducted a systematic review and meta-analysis, including information from 19 studies based on a total of 2 327 322 subjects and 10 756 cases. The methodologic quality of the studies was examined using a comprehensive scoring system. Results: Comparing high vs low levels of physical activity, we observed an inverse association between physical activity and renal cancer risk (summary relative risk (RR) from random-effects meta-analysis=0.88; 95% confidence interval (CI)=0.79–0.97). Summarising risk estimates from high-quality studies strengthened the inverse association between physical activity and renal cancer risk (RR=0.78; 95% CI=0.66–0.92). Effect modification by adiposity, hypertension, type 2 diabetes, smoking, gender, or geographic region was not observed. Conclusion: Our comprehensive meta-analysis provides strong support for an inverse relation of physical activity to renal cancer risk. Future high-quality studies are required to discern which specific types, intensities, frequencies, and durations of physical activity are needed for renal cancer risk reduction. PMID:23412105
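For readers unfamiliar with the pooling step behind a summary RR, the sketch below runs a DerSimonian-Laird random-effects meta-analysis on made-up study estimates (log relative risks and standard errors); it is not the authors' dataset.

    import numpy as np

    def dersimonian_laird(log_rr, se):
        """Random-effects pooled estimate of a set of log relative risks."""
        w = 1.0 / se**2
        fixed = np.sum(w * log_rr) / np.sum(w)
        q = np.sum(w * (log_rr - fixed) ** 2)
        tau2 = max(0.0, (q - (len(log_rr) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
        w_star = 1.0 / (se**2 + tau2)                     # weights including between-study variance
        pooled = np.sum(w_star * log_rr) / np.sum(w_star)
        se_pooled = np.sqrt(1.0 / np.sum(w_star))
        ci = np.exp([pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled])
        return np.exp(pooled), ci

    # Placeholder study results (RR < 1 means physical activity is protective)
    rr = np.array([0.85, 0.92, 0.75, 1.02, 0.88, 0.80])
    se = np.array([0.10, 0.08, 0.15, 0.12, 0.09, 0.11])
    summary, ci = dersimonian_laird(np.log(rr), se)
    print(f"summary RR = {summary:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")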
CrossTalk: The Journal of Defense Software Engineering. Volume 19, Number 9
2006-09-01
it does. Several freely downloadable methodologies have emerged to support the developer in modeling threats to applications and other soft...SECURIS. Model-Driven Development and Analysis of Secure Information Systems <www.sintef.no/content/page1_1824.aspx>. 10. The SECURIS Project ...By applying these methods to the SDLC, we can actively reduce the number of known vulnerabilities in software as it is developed.
Depth Cue Integration in an Active Control Paradigm
NASA Technical Reports Server (NTRS)
Kaiser, Mary K.; Sweet, Barbara T.; Shafto, Meredith; Null, Cynthia H. (Technical Monitor)
1995-01-01
Numerous models of depth cue integration have been proposed. Of particular interest is how the visual system processes discrepant cues, as might arise when viewing synthetic displays. A powerful paradigm for examining this integration process can be adapted from manual control research. This methodology introduces independent disturbances in the candidate cues, then performs spectral analysis of subjects' resulting motoric responses (e.g., depth matching). We will describe this technique and present initial findings.
The NBS Energy Model Assessment project: Summary and overview
NASA Astrophysics Data System (ADS)
Gass, S. I.; Hoffman, K. L.; Jackson, R. H. F.; Joel, L. S.; Saunders, P. B.
1980-09-01
The activities and technical reports for the project are summarized. The reports cover: assessment of the documentation of the Midterm Oil and Gas Supply Modeling System; analysis of the model methodology; characteristics of the input and other supporting data; and statistical procedures undergirding construction of the model and the sensitivity of the outputs to variations in input, as well as guidelines and recommendations for the role of these in model building and the development of procedures for their evaluation.
B Owen, Katherine; Smith, Jordan; Lubans, David R; Ng, Johan Y Y; Lonsdale, Chris
2014-10-01
Self-determination theory is used as a framework for examining the relation between motivation and physical activity. The purpose of this review was to systematically review studies that assessed the association between self-determined motivation and physical activity levels in children and adolescents. We searched electronic databases in April 2013. Included studies assessed the relation between motivation (as outlined in self-determination theory) and physical activity in children and adolescents. Forty-six studies (n=15,984 participants) met the inclusion criteria. Meta-analysis indicated that overall levels of self-determined motivation had a weak to moderate, positive associations with physical activity (ρ=.21 to .31). Autonomous forms of motivation (i.e., intrinsic motivation and identified regulation) had moderate, positive associations with physical activity (ρ=.27 to .38), whereas controlled forms of motivation (i.e., introjection and external regulation) had weak, negative associations with physical activity (ρ=-.03 to -.17). Amotivation had a weak, negative association with physical activity (ρ=-.11 to -.21). Evidence provides some support for self-determination theory tenets. However, there was substantial heterogeneity in most associations and many studies had methodological shortcomings. Crown Copyright © 2014. Published by Elsevier Inc. All rights reserved.
Teaching Research Methodology through Active Learning
ERIC Educational Resources Information Center
Lundahl, Brad W.
2008-01-01
To complement traditional learning activities in a masters-level research methodology course, social work students worked on a formal research project which involved: designing the study, constructing measures, selecting a sampling strategy, collecting data, reducing and analyzing data, and finally interpreting and communicating the results. The…
A Methodology for the Hybridization Based in Active Components: The Case of cGA and Scatter Search.
Villagra, Andrea; Alba, Enrique; Leguizamón, Guillermo
2016-01-01
This work presents the results of a new methodology for hybridizing metaheuristics. By first locating the active components (parts) of one algorithm and then inserting them into a second one, we can build efficient and accurate optimization, search, and learning algorithms. This gives a concrete way of constructing new techniques that contrasts with the widespread ad hoc way of hybridizing. In this paper, the enhanced algorithm is a Cellular Genetic Algorithm (cGA), which has been successfully used in the past to find solutions to hard optimization problems. In order to extend and corroborate the use of active components as an emerging hybridization methodology, we propose here the use of active components taken from Scatter Search (SS) to improve the cGA. The results obtained over a varied set of benchmarks are highly satisfactory in efficacy and efficiency when compared with a standard cGA. Moreover, the proposed hybrid approach (i.e., cGA+SS) has shown encouraging results with regard to earlier applications of our methodology.
2017-04-30
practices in latent variable theory, it is not surprising that effective measurement programs present methodological typing and considering of experimental ... 3.3 Methodology ... 8 Revised Enterprise Modeling Methodology ... 9 Conclusions
Biochemical Assays of Cultured Cells
NASA Technical Reports Server (NTRS)
Barlow, G. H.
1985-01-01
Subpopulations of human embryonic kidney cells isolated from continuous flow electrophoresis experiments performed at McDonnell Douglas and on STS-8 have been analyzed. These analyses have included plasminogen activator assays involving an indirect methodology on fibrin plates and a direct methodology using chromogenic substrates. Immunological studies were performed, and the conditioned media were analyzed for erythropoietin activity and human granulocyte colony stimulating factor (HGCSF) activity.
Unified methodology for airport pavement analysis and design. Vol. 1, state of the art
DOT National Transportation Integrated Search
1991-06-01
This report presents an assessment of the state of the art of airport pavement analysis : and design. The objective is to identify those areas in current airport pavement : analysis methodology that need to be substantially improved from the perspect...
DOT National Transportation Integrated Search
2006-11-01
This report discusses data acquisition and analysis for grade crossing risk analysis at the proposed San Joaquin High-Speed Rail Corridor in San Joaquin, California, and documents the data acquisition and analysis methodologies used to collect and an...
[Quantitative data analysis for live imaging of bone].
Seno, Shigeto
Bone is a hard tissue, and it has long been difficult to observe the interior of living bone tissue. With recent progress in microscopy and fluorescent probe technology, it has become possible to observe the various activities of the cells that form bone tissue. On the other hand, the quantitative increase in data and the diversification and complexity of the images make quantitative analysis by visual inspection difficult, and a methodology for processing microscopic images and analyzing the data has therefore been needed. In this article, we introduce the research field of bioimage informatics, which lies at the boundary between biology and information science, and then outline basic image processing technology for the quantitative analysis of live imaging data of bone.
NASA Technical Reports Server (NTRS)
Tamma, Kumar K.; Railkar, Sudhir B.
1988-01-01
This paper describes new and recent advances in the development of a hybrid transfinite element computational methodology for conduction/convection/radiation heat transfer problems. The transfinite element methodology, while retaining the modeling versatility of contemporary finite element formulations, is based on the application of transform techniques in conjunction with classical Galerkin schemes and is a hybrid approach. The purpose of this paper is to provide a viable hybrid computational methodology for general transient thermal analysis. Highlights and features of the methodology are described and developed via generalized formulations and applications to several test problems. The proposed transfinite element methodology provides a viable computational approach, and the numerical test problems validate the proposed developments for conduction/convection/radiation thermal analysis.
Canino-Rodríguez, José M; García-Herrero, Jesús; Besada-Portas, Juan; Ravelo-García, Antonio G; Travieso-González, Carlos; Alonso-Hernández, Jesús B
2015-03-04
The limited efficiency of current air traffic systems will require a next-generation Smart Air Traffic System (SATS) that relies on current technological advances. This challenge means a transition toward a new navigation and air-traffic procedures paradigm, where pilots and air traffic controllers perform and coordinate their activities according to new roles and technological supports. The design of new Human-Computer Interactions (HCI) for performing these activities is a key element of SATS. However, efforts to develop such tools need to be informed by a parallel characterization of hypothetical air traffic scenarios compatible with current ones. This paper is focused on airborne HCI within SATS, where cockpit inputs come from aircraft navigation systems, the surrounding traffic situation, controllers' indications, etc. The HCI is thus intended to enhance situation awareness and decision-making through the pilot cockpit. Our approach considers SATS as a large-scale distributed system operating with uncertainty in a dynamic environment. Therefore, a multi-agent systems based approach is well suited for modeling such an environment. We demonstrate that current methodologies for designing multi-agent systems are a useful tool to characterize HCI. We specifically illustrate how the selected methodological approach provides enough guidelines to obtain a cockpit HCI design that complies with future SATS specifications.
[Education for patients with fibromyalgia. A systematic review of randomised clinical trials].
Elizagaray-Garcia, Ignacio; Muriente-Gonzalez, Jorge; Gil-Martinez, Alfonso
2016-01-16
To analyse the effectiveness of education about pain, quality of life and functionality in patients with fibromyalgia. The search for articles was carried out in electronic databases. The eligibility criteria were: controlled randomised clinical trials (RCT), published in English or Spanish, conducted on patients with fibromyalgia, in which the therapeutic procedure was based on patient education. Two independent reviewers analysed the methodological quality using the PEDro scale. Five RCTs were selected, of which four offered good methodological quality. In three of the studies, patient education, in combination with another intervention based on therapeutic exercise, improved the outcomes on the variables assessing pain and quality of life compared with the same procedures performed separately. Moreover, one RCT with high methodological quality showed that patient education activated inhibitory neural pathways capable of lowering the level of pain. The quantitative analysis yields strong to moderate evidence that patient education, in combination with other therapeutic exercise procedures, offers positive results on the variables pain, quality of life and functionality. Patient education by itself has not proved to be effective for pain, quality of life or functionality in patients with fibromyalgia. There is strong evidence, however, of the effectiveness of combining patient education with exercise and active coping strategies for pain, quality of life and functionality in the short, medium and long term in patients with fibromyalgia.
77 FR 76077 - Agency Information Collection Activities: Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-26
... studies is to use the latest and most appropriate methodology to improve NCSES surveys and evaluate new data collection efforts. Methodological findings may be presented externally in technical papers at... individual survey may represent several methodological improvement projects.
Phytochemistry of Cimicifugic Acids and Associated Bases in Cimicifuga racemosa Root Extracts
Gödecke, Tanja; Nikolic, Dejan; Lankin, David C.; Chen, Shao-Nong; Powell, Sharla L.; Dietz, Birgit; Bolton, Judy L.; Van Breemen, Richard B.; Farnsworth, Norman R.; Pauli, Guido F.
2009-01-01
Introduction Earlier studies reported serotonergic activity for cimicifugic acids (CA) isolated from Cimicifuga racemosa. The discovery of strongly basic alkaloids, cimipronidines, from the active extract partition and evaluation of previously employed work-up procedures has led to the hypothesis of strong acid/base association in the extract. Objective Re-isolation of the CAs was desired to permit further detailed studies. Based on the acid/base association hypothesis, a new separation scheme of the active partition was required, which separates acids from associated bases. Methodology A new 5-HT7 bioassay guided work-up procedure was developed that concentrates activity into one partition. The latter was subjected to a new 2-step centrifugal partitioning chromatography (CPC) method, which applies pH zone refinement gradient (pHZR CPC) to dissociate the acid/base complexes. The resulting CA fraction was subjected to a second CPC step. Fractions and compounds were monitored by 1H NMR using a structure based spin-pattern analysis facilitating dereplication of the known acids. Bioassay results were obtained for the pHZR CPC fractions and for purified CAs. Results A new CA was characterized. While none of the pure CAs was active, the serotonergic activity was concentrated in a single pHZR CPC fraction, which was subsequently shown to contain low levels of the potent 5-HT7 ligand, Nω–methylserotonin. Conclusion This study shows that CAs are not responsible for serotonergic activity in black cohosh. New phytochemical methodology (pHZR CPC) and a sensitive dereplication method (LC-MS) led to the identification of Nω–methylserotonin as serotonergic active principle. PMID:19140115
Plyku, Donika; Loeb, David M.; Prideaux, Andrew R.; Baechler, Sébastien; Wahl, Richard L.; Sgouros, George
2015-01-01
Purpose: Dosimetric accuracy depends directly upon the accuracy of the activity measurements in tumors and organs. The authors present the methods and results of a retrospective tumor dosimetry analysis in 14 patients with a total of 28 tumors treated with high activities of 153Sm-ethylenediaminetetramethylenephosphonate (153Sm-EDTMP) for therapy of metastatic osteosarcoma using planar images and compare the results with three-dimensional dosimetry. Materials and Methods: Analysis of phantom data provided a complete set of parameters for dosimetric calculations, including buildup factor, attenuation coefficient, and camera dead-time compensation. The latter was obtained using a previously developed methodology that accounts for the relative motion of the camera and patient during whole-body (WB) imaging. Tumor activity values calculated from the anterior and posterior views of WB planar images of patients treated with 153Sm-EDTMP for pediatric osteosarcoma were compared with the geometric mean value. The mean activities were integrated over time and tumor-absorbed doses were calculated using the software package OLINDA/EXM. Results: The authors found that it was necessary to employ the dead-time correction algorithm to prevent measured tumor activity half-lives from often exceeding the physical decay half-life of 153Sm. Measured half-lives so long are unquestionably in error. Tumor-absorbed doses varied between 0.0022 and 0.27 cGy/MBq with an average of 0.065 cGy/MBq; however, a comparison with absorbed dose values derived from a three-dimensional analysis for the same tumors showed no correlation; moreover, the ratio of three-dimensional absorbed dose value to planar absorbed dose value was 2.19. From the anterior and posterior activity comparisons, the order of clinical uncertainty for activity and dose calculations from WB planar images, with the present methodology, is hypothesized to be about 70%. Conclusion: The dosimetric results from clinical patient data indicate that absolute planar dosimetry is unreliable and dosimetry using three-dimensional imaging is preferable, particularly for tumors, except perhaps for the most sophisticated planar methods. The relative activity and patient kinetics derived from planar imaging show a greater level of reliability than the dosimetry. PMID:26560193
Coastal zone management with stochastic multi-criteria analysis.
Félix, A; Baquerizo, A; Santiago, J M; Losada, M A
2012-12-15
The methodology for coastal management proposed in this study takes into account the physical processes of the coastal system and the stochastic nature of forcing agents. Simulation techniques are used to assess the uncertainty in the performance of a set of predefined management strategies based on different criteria representing the main concerns of interest groups. This statistical information as well as the distribution function that characterizes the uncertainty regarding the preferences of the decision makers is fed into a stochastic multi-criteria acceptability analysis that provides the probability of alternatives obtaining certain ranks and also calculates the preferences of a typical decision maker who supports an alternative. This methodology was applied as a management solution for Playa Granada in the Guadalfeo River Delta (Granada, Spain), where the construction of a dam in the river basin is causing severe erosion. The analysis of shoreline evolution took into account the coupled action of atmosphere, ocean, and land agents and their intrinsic stochastic character. This study considered five different management strategies. The criteria selected for the analysis were the economic benefits for three interest groups: (i) indirect beneficiaries of tourist activities; (ii) beach homeowners; and (iii) the administration. The strategies were ranked according to their effectiveness, and the relative importance given to each criterion was obtained. Copyright © 2012 Elsevier Ltd. All rights reserved.
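Stochastic multi-criteria acceptability analysis is commonly approximated by Monte Carlo: sample criterion weights, rank the alternatives for each draw, and tally how often each alternative attains each rank. A small sketch with invented benefit scores for five strategies and three interest groups (the weight distribution and payoffs are assumptions, not the study's data):

    import numpy as np

    def smaa_rank_acceptability(values, n_draws=20000, seed=0):
        """values: alternatives x criteria matrix of (already normalised) benefits.
        Returns rank-acceptability indices: the share of random weight vectors for
        which each alternative obtains each rank (column 0 = best)."""
        rng = np.random.default_rng(seed)
        n_alt, n_crit = values.shape
        acceptability = np.zeros((n_alt, n_alt))
        for _ in range(n_draws):
            w = rng.dirichlet(np.ones(n_crit))        # uniform weights on the simplex
            order = np.argsort(-values @ w)           # best-to-worst ranking for this draw
            acceptability[order, np.arange(n_alt)] += 1
        return acceptability / n_draws

    # Hypothetical benefits of 5 management strategies for 3 interest groups (0-1 scale)
    benefits = np.array([[0.9, 0.2, 0.4],
                         [0.6, 0.6, 0.5],
                         [0.3, 0.9, 0.3],
                         [0.5, 0.5, 0.9],
                         [0.4, 0.4, 0.4]])
    print(np.round(smaa_rank_acceptability(benefits), 2))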
Design and analysis of sustainable computer mouse using design for disassembly methodology
NASA Astrophysics Data System (ADS)
Roni Sahroni, Taufik; Fitri Sukarman, Ahmad; Agung Mahardini, Karunia
2017-12-01
This paper presents the design and analysis of a computer mouse using the Design for Disassembly methodology. The existing computer mouse model consists of a number of unnecessary parts that increase assembly and disassembly time in production. The objective of this project is to design a new computer mouse based on the Design for Disassembly (DFD) methodology. The methodology proceeded from sketch generation through concept selection and concept scoring. Based on the design screening, concept B was selected for further analysis. A new computer mouse design using a fastening system is proposed. Furthermore, three materials (ABS, polycarbonate, and high-density PE) were evaluated to determine the environmental impact category. Sustainability analysis was conducted using SolidWorks software. As a result, high-density PE gives the lowest environmental impact while providing a high maximum stress value.
NASA Astrophysics Data System (ADS)
Carneiro, Renato Lajarim; Poppi, Ronei Jesus
2014-01-01
In the present work, the homogeneity of a pharmaceutical formulation presented as a cream was studied using infrared imaging spectroscopy and chemometric methodologies such as principal component analysis (PCA) and multivariate curve resolution with alternating least squares (MCR-ALS). A cream formulation, presented as an emulsion, was prepared using imiquimod as the active pharmaceutical ingredient (API) and the excipients water, vaseline, an emulsifier, and a carboxylic acid to dissolve the API. After exposure at 45 °C for 3 months in an accelerated stability test, the presence of some crystals was observed, indicating homogeneity problems in the formulation. PCA exploratory analysis showed that the crystal composition differed from that of the emulsion, since the score maps revealed crystal structures within the emulsion. MCR-ALS estimated the spectra of the crystals and of the emulsion. The crystals presented amine and C-H bands, suggesting that the precipitate was a salt formed by the carboxylic acid and imiquimod. These results indicate the potential of infrared imaging spectroscopy in conjunction with chemometric methodologies as an analytical tool to ensure the quality of cream formulations in the pharmaceutical industry.
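Chemometric analysis of an imaging dataset of this kind usually starts by unfolding the (x, y, wavenumber) cube into a pixels-by-channels matrix and inspecting score maps. The Python sketch below illustrates that step with scikit-learn PCA on random placeholder data; the cube dimensions and the interpretation comment are assumptions for illustration, not the paper's dataset or its MCR-ALS implementation.

import numpy as np
from sklearn.decomposition import PCA

# Hypothetical IR imaging cube: 64 x 64 pixels, 200 spectral channels
rng = np.random.default_rng(1)
cube = rng.random((64, 64, 200))

ny, nx, n_channels = cube.shape
spectra = cube.reshape(ny * nx, n_channels)   # unfold: pixels x channels

pca = PCA(n_components=3)
scores = pca.fit_transform(spectra)           # pixel scores on each component
score_maps = scores.reshape(ny, nx, 3)        # refold into score images

# Pixels whose scores deviate strongly from the bulk suggest chemically
# distinct domains (e.g. API/carboxylic-acid salt crystals in the emulsion).
print(pca.explained_variance_ratio_)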
The ergonomic process of an automotive company in Brazil: a study case.
Bustos, Carolina; Fischer, Daniela; Ballardin, Lucimara; Nielsen, Rudolf
2012-01-01
The goal of this paper is to present the ergonomic process of an automotive company, whose focus is the adaptation of work conditions to the psychophysiological characteristics of its employees. The planning and development of the ergonomic actions took place in three distinct stages: ergonomic analysis of the work post (stage 1), ergonomic adaptations (stage 2), and the Ergonomic Committee (stage 3). The activities started in June 2006 and have continued to the present, maintaining a permanent improvement process. The procedure adopted was based on the ergonomic analysis methodology proposed by Wisner (1994; 1997) and the stages of the Ergonomic Analysis of Work presented in the NR-17 Regulatory Manual (MET, 2002). The paper's approach focused on the voluntary participation of workers from different areas and different hierarchical levels of the organization throughout all stages of the process. The methodological procedures included descriptive research techniques, exploratory and qualitative research criteria, background and guidelines available in the literature and legislation, as well as company information. Among the main results are employee satisfaction with the improved work conditions, cultural and organizational changes, and the creation of an Ergonomic Committee in the company.
Han, Qiaohong; Wu, Zili; Huang, Bo; Sun, Liangqi; Ding, Chunbang; Yuan, Shu; Zhang, Zhongwei; Chen, Yanger; Hu, Chao; Zhou, Lijun; Liu, Jing; Huang, Yan; Liao, Jinqiu; Yuan, Ming
2016-11-01
Polysaccharides were extracted from Broussonetia papyrifera ((L.) L'Herit. ex Vent.) fruits (BPP), and response surface methodology was used to maximize the extraction yield. The optimum extraction conditions were: ratio of water to solid, 30 mL/g; extraction duration, 50 min; extraction power, 180 W; and extraction temperature, 60 °C. Under these conditions, the yield of BPP was 8.61%. BPP was then purified, and three purified fractions (designated BPP-1, BPP-2 and BPP-3) were obtained for further analysis of physicochemical properties, antioxidant activity, and antibacterial activity. These fractions were mainly composed of glucose, mannose, and arabinose residues; BPP-3 had a significantly higher rhamnose and uronic acid content than BPP-1 and BPP-2. BPP-3 also showed the best hydroxyl radical scavenging activity, ferric reducing antioxidant power (FRAP), antihemolytic activity, and antibacterial activity. Copyright © 2016 Elsevier B.V. All rights reserved.
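Response surface methodology comes down to fitting a low-order polynomial to yield measurements over the design factors and locating its stationary point. The Python sketch below illustrates the idea for a single factor (temperature) with invented yield data; the numbers, the single-factor simplification, and the quadratic form are assumptions for illustration, not the paper's experimental design.

import numpy as np

# Hypothetical single-factor data: extraction temperature (deg C) versus
# BPP yield (%); values are illustrative, not from the paper.
temp = np.array([40., 50., 55., 60., 65., 70., 80.])
yield_pct = np.array([6.1, 7.4, 8.2, 8.6, 8.4, 7.9, 6.8])

# Second-order (response-surface) fit and its stationary point
coef = np.polyfit(temp, yield_pct, deg=2)     # [a, b, c] of a*t^2 + b*t + c
t_opt = -coef[1] / (2 * coef[0])              # vertex of the fitted parabola
print(f"estimated optimum temperature ~ {t_opt:.1f} deg C")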
Harmonizing Automatic Test System Assets, Drivers, and Control Methodologies
1999-07-18
[Only fragments of this report were extracted: a table of organizations of principal interest to ATS, including the 1394 Trade Association (defining the FireWire high-speed bus protocol) and the Active Group (accelerating ActiveX), and a passage noting that one component is a diagonal matrix containing scaling values.]
A New Methodology for Systematic Exploitation of Technology Databases.
ERIC Educational Resources Information Center
Bedecarrax, Chantal; Huot, Charles
1994-01-01
Presents the theoretical aspects of a data analysis methodology that can help transform sequential raw data from a database into useful information, using the statistical analysis of patents as an example. Topics discussed include relational analysis and a technology watch approach. (Contains 17 references.) (LRW)
Antimicrobial activity and mechanism of the human milk-sourced peptide Casein201
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Fan; Department of Endocrinology, Children's Hospital of Nanjing Medical University, Nanjing; Cui, Xianwei
Introduction: Casein201 is one of the human milk-sourced peptides that differ significantly between preterm and full-term mothers. This study was designed to characterize the biological properties, antibacterial activity, and mechanisms of Casein201 against common pathogens in neonatal infection. Methodology: The analysis of biological characteristics was performed by bioinformatics. The disk diffusion method and flow cytometry were used to detect the antimicrobial activity of Casein201. Killing kinetics of Casein201 were measured using a microplate reader. The antimicrobial mechanism of Casein201 was studied by electron microscopy and electrophoresis. Results: Bioinformatics analysis indicates that Casein201 is derived from β-casein and shows significant sequence overlap. Antibacterial assays showed that Casein201 inhibited the growth of Staphylococcus aureus and Yersinia enterocolitica. Ultrastructural analyses revealed that the antibacterial activity of Casein201 operates through disintegration of cytoplasmic structures and alteration of the bacterial cell envelope rather than through binding to DNA. Conclusion: We characterize the antimicrobial activity and mechanism of Casein201. Our data demonstrate that Casein201 has potential therapeutic value for the prevention and treatment of pathogens in neonatal infection.
Ruano, Juan; Aguilar-Luque, Macarena; Gómez-Garcia, Francisco; Alcalde Mellado, Patricia; Gay-Mimbrera, Jesus; Carmona-Fernandez, Pedro J; Maestre-López, Beatriz; Sanz-Cabanillas, Juan Luís; Hernández Romero, José Luís; González-Padilla, Marcelino; Vélez García-Nieto, Antonio; Isla-Tejera, Beatriz
2018-01-01
Researchers are increasingly using online social networks to promote their work. Some authors have suggested that measuring social media activity can predict the impact of a primary study (i.e., whether or not an article will be highly cited). However, the influence of variables such as scientific quality, research disclosures, and journal characteristics on systematic reviews and meta-analyses has not yet been assessed. The present study aims to describe the effect of complex interactions between bibliometric factors and social media activity on the impact of systematic reviews and meta-analyses about psoriasis (PROSPERO 2016: CRD42016053181). Methodological quality was assessed using the Assessing the Methodological Quality of Systematic Reviews (AMSTAR) tool. Altmetrics, which consider Twitter, Facebook, and Google+ mention counts as well as Mendeley and SCOPUS readers, and corresponding article citation counts from Google Scholar were obtained for each article. Metadata and journal-related bibliometric indices were also obtained. One hundred sixty-four reviews with available altmetrics information were included in the final multifactorial analysis, which showed that social media activity and impact factor have less effect than Mendeley and SCOPUS readers on the number of citations in Google Scholar. Although a journal's impact factor predicted the number of tweets (OR, 1.202; 95% CI, 1.087-1.049), the year of publication and the number of Mendeley readers predicted the number of citations in Google Scholar (OR, 1.033; 95% CI, 1.018-1.329). Finally, methodological quality was related neither to bibliometric influence nor to social media activity for systematic reviews. In conclusion, there seems to be a lack of connectivity between scientific quality, social media activity, and article usage; thus, predicting scientific success based on these variables may be inappropriate in the particular case of systematic reviews.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oliveira, P. A.; Santos, J. A. M., E-mail: joao.santos@ipoporto.min-saude.pt; Serviço de Física Médica do Instituto Português de Oncologia do Porto Francisco Gentil, EPE, Porto
2014-07-15
Purpose: An original radionuclide calibrator method for activity determination is presented. The method could be used for intercomparison surveys for short half-life radioactive sources used in Nuclear Medicine, such as {sup 99m}Tc or most positron emission tomography radiopharmaceuticals. Methods: By evaluation of the resulting net optical density (netOD) using a standardized scanning method of irradiated Gafchromic XRQA2 film, the netOD measurement can be compared with a previously determined calibration curve, and the difference between the tested radionuclide calibrator and a radionuclide calibrator used as the reference device can be calculated. To estimate the total expected measurement uncertainties, a careful analysis of the methodology, for the case of {sup 99m}Tc, was performed: reproducibility determination, scanning conditions, and possible fadeout effects. Since every factor of the activity measurement procedure can influence the final result, the method also evaluates correct syringe positioning inside the radionuclide calibrator. Results: As an alternative to using a calibrated source sent to the surveyed site, which requires a relatively long half-life of the nuclide, or sending a portable calibrated radionuclide calibrator, the proposed method uses a source prepared in situ. An indirect activity determination is achieved by the irradiation of a radiochromic film using {sup 99m}Tc under strictly controlled conditions, and by calculation of the cumulated activity from the initial activity and total irradiation time. The irradiated Gafchromic film and the irradiator, without the source, can then be sent to a National Metrology Institute for evaluation of the results. Conclusions: The methodology described in this paper was shown to have good potential for accurate (3%) radionuclide calibrator intercomparison studies for {sup 99m}Tc between Nuclear Medicine centers without source transfer, and it can easily be adapted to other short half-life radionuclides.
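The film readout described above reduces to computing a net optical density from scanner pixel values and inverting a previously established calibration curve. The Python sketch below illustrates that step; the pixel values and the calibration points are invented placeholders, not the paper's calibration data.

import numpy as np

def net_od(pv_unexposed, pv_exposed):
    # netOD = log10(I_unexposed / I_exposed) from mean scanner pixel values
    return np.log10(pv_unexposed / pv_exposed)

# Hypothetical calibration curve (netOD versus cumulated activity in MBq*h),
# e.g. one previously established against the reference radionuclide calibrator.
cal_cumulated_activity = np.array([0., 5., 10., 20., 40.])   # MBq*h
cal_netod = np.array([0.0, 0.035, 0.068, 0.131, 0.249])

measured = net_od(pv_unexposed=52000.0, pv_exposed=44800.0)
# Invert the calibration by interpolation to estimate the cumulated activity
estimated = np.interp(measured, cal_netod, cal_cumulated_activity)
print(round(measured, 4), round(estimated, 2))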
Walia, Gurjot S; Wong, Alison L; Lo, Andrea Y; Mackert, Gina A; Carl, Hannah M; Pedreira, Rachel A; Bello, Ricardo; Aquino, Carla S; Padula, William V; Sacks, Justin M
2016-12-01
To present a systematic review of the literature assessing the efficacy of monitoring devices for reducing the risk of developing pressure injuries. This continuing education activity is intended for physicians, physician assistants, nurse practitioners, and nurses with an interest in skin and wound care. After participating in this educational activity, the participant should be better able to: 1. Explain the methodology of the literature review and its results. 2. Discuss the scope of the problem and the implications of the research. OBJECTIVE: To assess the efficacy of monitoring devices for reducing the risk of developing pressure injuries (PIs). The authors systematically reviewed the literature by searching the PubMed/MEDLINE and CINAHL databases through January 2016. Articles included clinical trials and cohort studies that tested monitoring devices evaluating PI risk factors in patients in acute and skilled nursing settings. The articles were scored using the Methodological Index for Non-randomized Studies. Using a standardized extraction form, the authors extracted patient inclusion/exclusion criteria, care setting, key baseline characteristics, description of the monitoring device and methodology, number of patients included in each group, description of any standard of care, follow-up period, and outcomes. Of the 1866 identified publications, 9 met the inclusion criteria. The high-quality studies averaged Methodological Index for Non-randomized Studies scores of 19.4 for clinical trials and 12.2 for observational studies. These studies evaluated monitoring devices that measured interface pressure, subdermal tissue stress, motion, and moisture. Most studies found a statistically significant decrease in PIs; 2 studies were eligible for meta-analysis, demonstrating that use of monitoring devices was associated with an 88% reduction in the risk of developing PIs (Mantel-Haenszel risk ratio, 0.12; 95% confidence interval, 0.04-0.41; I² = 0%). Pressure injury monitoring devices are associated with a strong reduction in the risk of developing PIs. These devices provide clinicians and patients with critical information to implement prevention guidelines. Randomized controlled trials would help assess which technologies are most effective at reducing the risk of developing PIs.
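The pooled estimate quoted above is the kind produced by a fixed-effect Mantel-Haenszel combination of 2x2 tables. The Python sketch below shows that calculation on invented study counts; the event numbers are placeholders, not the data of the two trials in the review.

# Mantel-Haenszel pooled risk ratio from per-study 2x2 tables
# Hypothetical data: (events_device, n_device, events_control, n_control)
studies = [(3, 120, 14, 118),
           (2, 95, 11, 97)]

num, den = 0.0, 0.0
for a, n1, c, n0 in studies:
    N = n1 + n0
    num += a * n0 / N        # Mantel-Haenszel weight applied to the device arm
    den += c * n1 / N        # ... and to the control arm

rr_mh = num / den
print(f"pooled Mantel-Haenszel risk ratio ~ {rr_mh:.2f}")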
Multimodal hybrid reasoning methodology for personalized wellbeing services.
Ali, Rahman; Afzal, Muhammad; Hussain, Maqbool; Ali, Maqbool; Siddiqi, Muhammad Hameed; Lee, Sungyoung; Ho Kang, Byeong
2016-02-01
A wellness system provides wellbeing recommendations to support experts in promoting a healthier lifestyle and inducing individuals to adopt healthy habits. Adopting physical activity effectively promotes a healthier lifestyle. A physical activity recommendation system assists users in adopting daily routines that form a best practice of life by involving themselves in healthy physical activities. Traditional physical activity recommendation systems focus on general recommendations applicable to a community of users rather than to specific individuals. These recommendations are general in nature and fit the community at a certain level, but they are not relevant to every individual with specific requirements and personal interests. To cover this aspect, we propose a multimodal hybrid reasoning methodology (HRM) that generates personalized physical activity recommendations according to the user's specific needs and personal interests. The methodology integrates the rule-based reasoning (RBR), case-based reasoning (CBR), and preference-based reasoning (PBR) approaches in a linear combination that enables personalization of recommendations. RBR uses explicit knowledge rules from physical activity guidelines, CBR uses implicit knowledge from experts' past experiences, and PBR uses users' personal interests and preferences. To validate the methodology, a weight management scenario is considered and experimented with. The RBR part of the methodology generates goal, weight status, and plan recommendations; the CBR part suggests the top three relevant physical activities for executing the recommended plan; and the PBR part filters out irrelevant recommendations from the suggested ones using the user's personal preferences and interests. To evaluate the methodology, a baseline-RBR system is developed, which is improved first using ranged rules and ultimately using a hybrid-CBR. A comparison of the results of these systems shows that hybrid-CBR outperforms the modified-RBR and baseline-RBR systems. Hybrid-CBR yields a recall of 0.94, a precision of 0.97, an f-score of 0.95, and low Type I and Type II errors. Copyright © 2015 Elsevier Ltd. All rights reserved.
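One way to picture the RBR-CBR-PBR chain described above is as three successive filters: guideline rules produce a plan, nearest past cases suggest concrete activities, and user preferences prune the list. The Python sketch below shows that chaining; all rules, cases, thresholds, and preferences are invented placeholders, not the knowledge base or weighting scheme of the paper.

# Minimal sketch of chaining rule-, case- and preference-based reasoning
def rbr_plan(bmi):
    # Rule-based reasoning: explicit guideline-style rules (invented)
    if bmi >= 30:
        return {"goal": "weight loss", "plan": "moderate aerobic, 5x/week"}
    if bmi >= 25:
        return {"goal": "weight reduction", "plan": "aerobic + strength, 3x/week"}
    return {"goal": "weight maintenance", "plan": "free choice, 3x/week"}

def cbr_activities(user, cases, k=3):
    # Case-based reasoning: retrieve the k nearest past cases by age and BMI
    def dist(c):
        return abs(c["age"] - user["age"]) + abs(c["bmi"] - user["bmi"])
    return [c["activity"] for c in sorted(cases, key=dist)[:k]]

def pbr_filter(activities, dislikes):
    # Preference-based reasoning: drop activities the user dislikes
    return [a for a in activities if a not in dislikes]

cases = [{"age": 34, "bmi": 31, "activity": "brisk walking"},
         {"age": 29, "bmi": 28, "activity": "cycling"},
         {"age": 41, "bmi": 33, "activity": "swimming"},
         {"age": 37, "bmi": 27, "activity": "jogging"}]

user = {"age": 35, "bmi": 31, "dislikes": {"jogging"}}
plan = rbr_plan(user["bmi"])
plan["activities"] = pbr_filter(cbr_activities(user, cases), user["dislikes"])
print(plan)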
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-17
...: Proposed Collection; Comment Request--Generic Clearance to Conduct Methodological Testing, Surveys, Focus... proposed information collection. This information collection will conduct research by methodological... Methodological Testing, Surveys, Focus Groups, and Related Tools to Improve the Management of Federal Nutrition...
Applications of decision analysis and related techniques to industrial engineering problems at KSC
NASA Technical Reports Server (NTRS)
Evans, Gerald W.
1995-01-01
This report provides: (1) a discussion of the origination of decision analysis problems (well-structured problems) from ill-structured problems; (2) a review of the various methodologies and software packages for decision analysis and related problem areas; (3) a discussion of how the characteristics of a decision analysis problem affect the choice of modeling methodologies, thus providing a guide as to when to choose a particular methodology; and (4) examples of applications of decision analysis to particular problems encountered by the IE Group at KSC. With respect to the specific applications at KSC, particular emphasis is placed on the use of the Demos software package (Lumina Decision Systems, 1993).
Muto, Shunsuke; Tatsumi, Kazuyoshi
2017-02-08
Advancements in the field of renewable energy resources have led to a growing demand for the analysis of light elements at the nanometer scale. Detection of lithium is one of the key issues to be resolved for providing guiding principles for the synthesis of cathode active materials and for degradation analysis after repeated use of those materials. We review the techniques currently used for the characterization of light elements, such as high-resolution transmission electron microscopy, scanning transmission electron microscopy (STEM), and electron energy-loss spectroscopy (EELS). In the present study, we introduce a methodology to detect lithium in solid materials, particularly in cathode active materials used in lithium-ion batteries. The chemical states of lithium were isolated and analyzed from overlapping multiple spectral profiles using a suite of STEM, EELS, and hyperspectral image analysis. The method was successfully applied to chemical state analyses of hetero-phases near the surface and grain boundary regions of active material particles formed by chemical reactions between the electrolyte and the active materials. © The Author 2016. Published by Oxford University Press on behalf of The Japanese Society of Microscopy. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
QESA: Quarantine Extraterrestrial Sample Analysis Methodology
NASA Astrophysics Data System (ADS)
Simionovici, A.; Lemelle, L.; Beck, P.; Fihman, F.; Tucoulou, R.; Kiryukhina, K.; Courtade, F.; Viso, M.
2018-04-01
Our nondestructive, nanometer-scale hyperspectral analysis methodology, which combines X-ray, Raman, and IR probes under BSL4 quarantine, renders our patented mini sample holder well suited to detecting extraterrestrial life. Our Stardust and Archean results validate the approach.
Health economic evaluation: important principles and methodology.
Rudmik, Luke; Drummond, Michael
2013-06-01
To discuss health economic evaluation and improve the understanding of common methodology. This article discusses the methodology for the following types of economic evaluations: cost-minimization, cost-effectiveness, cost-utility, cost-benefit, and economic modeling. Topics include health-state utility measures, the quality-adjusted life year (QALY), uncertainty analysis, discounting, decision tree analysis, and Markov modeling. Economic evaluation is the comparative analysis of alternative courses of action in terms of both their costs and consequences. With increasing health care expenditure and limited resources, it is important for physicians to consider the economic impact of their interventions. Understanding common methodology involved in health economic evaluation will improve critical appraisal of the literature and optimize future economic evaluations. Copyright © 2012 The American Laryngological, Rhinological and Otological Society, Inc.
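Two of the building blocks listed above, discounting and the cost-per-QALY comparison at the heart of cost-utility analysis, reduce to simple arithmetic. The Python sketch below illustrates them with invented costs and utility weights; it is a toy calculation, not a full decision tree or Markov model.

# Discounted QALYs and an incremental cost-effectiveness ratio (ICER)
def discounted_total(values_per_year, rate=0.03):
    # Present value of a yearly stream of costs or QALYs
    return sum(v / (1 + rate) ** t for t, v in enumerate(values_per_year))

# Hypothetical intervention versus comparator over a 5-year horizon
qalys_new = discounted_total([0.80] * 5)
qalys_old = discounted_total([0.72] * 5)
cost_new = discounted_total([4200.0] * 5)
cost_old = discounted_total([2500.0] * 5)

icer = (cost_new - cost_old) / (qalys_new - qalys_old)   # cost per QALY gained
print(f"ICER ~ {icer:,.0f} per QALY gained")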
Eigenvalue Contributon Estimator for Sensitivity Calculations with TSUNAMI-3D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T; Williams, Mark L
2007-01-01
Since the release of the Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) codes in SCALE [1], the use of sensitivity and uncertainty analysis techniques for criticality safety applications has greatly increased within the user community. In general, sensitivity and uncertainty analysis is transitioning from a technique used only by specialists to a practical tool in routine use. With the desire to use the tool more routinely comes the need to improve the solution methodology to reduce the input and computational burden on the user. This paper reviews the current solution methodology of the Monte Carlo eigenvalue sensitivity analysis sequence TSUNAMI-3D, describes an alternative approach, and presents results from both methodologies.
Sensor-based activity recognition using extended belief rule-based inference methodology.
Calzada, A; Liu, J; Nugent, C D; Wang, H; Martinez, L
2014-01-01
The recently developed extended belief rule-based inference methodology (RIMER+) recognizes the need to model the different types of information and uncertainty that usually coexist in real environments. A home setting with sensors located in different rooms and on different appliances can be considered a particularly relevant example of such an environment, which brings a range of challenges for sensor-based activity recognition. Although RIMER+ has been designed as a generic decision model that could be applied in a wide range of situations, this paper discusses how the methodology can be adapted to recognize human activities using binary sensors within smart environments. The evaluation of RIMER+ against other state-of-the-art classifiers in terms of accuracy, efficiency, and applicability showed highly favorable results, especially in situations of input data incompleteness; it demonstrates the potential of this methodology and underpins the basis for further research on the topic.
Activities for Engaging Schools in Health Promotion
ERIC Educational Resources Information Center
Bardi, Mohammad; Burbank, Andrea; Choi, Wayne; Chow, Lawrence; Jang, Wesley; Roccamatisi, Dawn; Timberley-Berg, Tonia; Sanghera, Mandeep; Zhang, Margaret; Macnab, Andrew J.
2014-01-01
Purpose: The purpose of this paper is to describe activities used to initiate health promotion in the school setting. Design/Methodology/Approach: Description of successful pilot Health Promoting School (HPS) initiatives in Canada and Uganda and the validated measures central to each program. Evaluation methodologies: quantitative data from the…
Change--how to remove the fear, resentment, and resistance.
Weitz, A J
1995-11-01
This article introduces active learning, which is an innovative education methodology for the workplace classroom. It is used to help people remove their fear, resentment, and resistance to the change process itself. Active learning makes education more effective compared with the predominantly used traditional lecture-type teaching methodology.
Opinion: Clarifying Two Controversies about Information Mapping's Method.
ERIC Educational Resources Information Center
Horn, Robert E.
1992-01-01
Describes Information Mapping, a methodology for the analysis, organization, sequencing, and presentation of information and explains three major parts of the method: (1) content analysis, (2) project life-cycle synthesis and integration of the content analysis, and (3) sequencing and formatting. Major criticisms of the methodology are addressed.…
Measures of outdoor play and independent mobility in children and youth: A methodological review.
Bates, Bree; Stone, Michelle R
2015-09-01
Declines in children's outdoor play have been documented globally, which are partly due to heightened restrictions around children's independent mobility. Literature on outdoor play and children's independent mobility is increasing, yet no paper has summarized the various methodological approaches used. A methodological review could highlight most commonly used measures and comprehensive research designs that could result in more standardized methodological approaches. Methodological review. A standardized protocol guided a methodological review of published research on measures of outdoor play and children's independent mobility in children and youth (0-18 years). Online searches of 8 electronic databases were conducted and studies included if they contained a subjective/objective measure of outdoor play or children's independent mobility. References of included articles were scanned to identify additional articles. Twenty-four studies were included on outdoor play, and twenty-three on children's independent mobility. Study designs were diverse. Common objective measures included accelerometry, global positioning systems and direct observation; questionnaires, surveys and interviews were common subjective measures. Focus groups, activity logs, monitoring sheets, travel/activity diaries, behavioral maps and guided tours were also utilized. Questionnaires were used most frequently, yet few studies used the same questionnaire. Five studies employed comprehensive, mixed-methods designs. Outdoor play and children's independent mobility have been measured using a wide variety of techniques, with only a few studies using similar methodologies. A standardized methodological approach does not exist. Future researchers should consider including both objective measures (accelerometry and global positioning systems) and subjective measures (questionnaires, activity logs, interviews), as more comprehensive designs will enhance understanding of each multidimensional construct. Creating a standardized methodological approach would improve study comparisons. Copyright © 2014 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
Design Methodology of a Dual-Halbach Array Linear Actuator with Thermal-Electromagnetic Coupling
Eckert, Paulo Roberto; Flores Filho, Aly Ferreira; Perondi, Eduardo; Ferri, Jeferson; Goltz, Evandro
2016-01-01
This paper proposes a design methodology for linear actuators, considering thermal and electromagnetic coupling with geometrical and temperature constraints, that maximizes force density and minimizes force ripple. The method allows an actuator to be defined for given specifications in a step-by-step way so that requirements are met and the temperature within the device is maintained at or below its maximum allowed value for continuous operation. According to the proposed method, the electromagnetic and thermal models are built with quasi-static parametric finite element models. The methodology was successfully applied to the design of a linear cylindrical actuator with a dual quasi-Halbach array of permanent magnets and a moving coil. The actuator can produce an axial force of 120 N and a stroke of 80 mm. The paper also presents a comparative analysis between results obtained considering only an electromagnetic model and the thermal-electromagnetic coupled model. This comparison shows that the final designs for the two cases differ significantly, especially regarding their active volume and their electrical and magnetic loading. Although in this paper the methodology was employed to design a specific actuator, its structure can be used to design a wide range of linear devices if the parametric models are adjusted for each particular actuator. PMID:26978370
High-Dimensional Sparse Factor Modeling: Applications in Gene Expression Genomics
Carvalho, Carlos M.; Chang, Jeffrey; Lucas, Joseph E.; Nevins, Joseph R.; Wang, Quanli; West, Mike
2010-01-01
We describe studies in molecular profiling and biological pathway analysis that use sparse latent factor and regression models for microarray gene expression data. We discuss breast cancer applications and key aspects of the modeling and computational methodology. Our case studies aim to investigate and characterize heterogeneity of structure related to specific oncogenic pathways, as well as links between aggregate patterns in gene expression profiles and clinical biomarkers. Based on the metaphor of statistically derived “factors” as representing biological “subpathway” structure, we explore the decomposition of fitted sparse factor models into pathway subcomponents and investigate how these components overlay multiple aspects of known biological activity. Our methodology is based on sparsity modeling of multivariate regression, ANOVA, and latent factor models, as well as a class of models that combines all components. Hierarchical sparsity priors address questions of dimension reduction and multiple comparisons, as well as scalability of the methodology. The models include practically relevant non-Gaussian/nonparametric components for latent structure, underlying often quite complex non-Gaussianity in multivariate expression patterns. Model search and fitting are addressed through stochastic simulation and evolutionary stochastic search methods that are exemplified in the oncogenic pathway studies. Supplementary supporting material provides more details of the applications, as well as examples of the use of freely available software tools for implementing the methodology. PMID:21218139
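A sparse factor decomposition of an expression matrix can be approximated, for illustration, with an off-the-shelf sparse component method: most gene loadings are driven to zero, so each component can be read as a candidate subpathway signature. The Python sketch below uses scikit-learn's SparsePCA on simulated data; the matrix size, the implanted 25-gene block, and the sparsity penalty are invented for illustration, and the method stands in for, rather than reproduces, the authors' Bayesian sparsity-prior models.

import numpy as np
from sklearn.decomposition import SparsePCA

# Hypothetical expression matrix: 100 samples x 500 genes (simulated data)
rng = np.random.default_rng(4)
expr = rng.normal(size=(100, 500))
# Implant a small "pathway" of 25 co-regulated genes driven by one latent factor
factor = rng.normal(size=100)
expr[:, :25] += np.outer(factor, rng.uniform(0.5, 1.5, size=25))

spca = SparsePCA(n_components=5, alpha=1.0, random_state=0)
scores = spca.fit_transform(expr)     # sample scores on each sparse component
loadings = spca.components_           # sparse gene loadings per component

# Sparsity means most genes load on no component, so each component reads
# as a compact candidate "subpathway" gene signature.
print((np.abs(loadings) > 1e-8).sum(axis=1))   # nonzero genes per component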
MODIFIED PATH METHODOLOGY FOR OBTAINING INTERVAL-SCALED POSTURAL ASSESSMENTS OF FARMWORKERS.
Garrison, Emma B; Dropkin, Jonathan; Russell, Rebecca; Jenkins, Paul
2018-01-29
Agricultural workers perform tasks that frequently require awkward and extreme postures that are associated with musculoskeletal disorders (MSDs). The PATH (Posture, Activity, Tools, Handling) system currently provides a sound methodology for quantifying workers' exposure to these awkward postures on an ordinal scale of measurement, which places restrictions on the choice of analytic methods. This study reports a modification of the PATH methodology that instead captures these postures as degrees of flexion, an interval-scaled measurement. Rather than making live observations in the field, as in PATH, the postural assessments were performed on photographs using ImageJ photo analysis software. Capturing the postures in photographs permitted more careful measurement of the degrees of flexion. The current PATH methodology requires that the observer in the field be trained in the use of PATH, whereas the single photographer used in this modification requires only sufficient training to maintain the proper camera angle. Ultimately, these interval-scale measurements could be combined with other quantitative measures, such as those produced by electromyograms (EMGs), to provide more sophisticated estimates of future risk for MSDs. Further, these data can provide a baseline from which the effects of interventions designed to reduce hazardous postures can be calculated with greater precision. Copyright© by the American Society of Agricultural Engineers.
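The interval-scaled measurement described above amounts to computing a joint angle, in degrees, from digitized landmark coordinates on a photograph. The Python sketch below shows that geometry; the landmark names and pixel coordinates are invented examples, not output of the ImageJ workflow used in the study.

import numpy as np

def flexion_angle(proximal, joint, distal):
    """Joint angle (degrees) at `joint` from three landmark pixel coordinates.

    For example, hip-knee-ankle landmarks digitized on a photograph give an
    interval-scaled knee angle; the points below are invented examples.
    """
    v1 = np.asarray(proximal, float) - np.asarray(joint, float)
    v2 = np.asarray(distal, float) - np.asarray(joint, float)
    cos_theta = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

print(round(flexion_angle((412, 180), (430, 335), (395, 498)), 1))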
Molina, Iñigo; Martinez, Estibaliz; Arquero, Agueda; Pajares, Gonzalo; Sanchez, Javier
2012-01-01
Landcover is subject to continuous changes on a wide variety of temporal and spatial scales. Those changes produce significant effects in human and natural activities. Maintaining an updated spatial database of the changes that have occurred allows better monitoring of the Earth's resources and management of the environment. Change detection (CD) techniques using images from different sensors, such as satellite imagery, aerial photographs, etc., have proven to be suitable and reliable data sources from which updated information can be extracted efficiently, so that changes can also be inventoried and monitored. In this paper, a multisource CD methodology for multiresolution datasets is applied. First, different change indices are computed; then, different change/no_change thresholding algorithms are applied to these indices in order to better estimate the statistical parameters of these categories; finally, the indices are integrated into a multisource change detection fusion process, which allows a single CD result to be generated from several combinations of indices. This methodology has been applied to datasets with different spectral and spatial resolution properties. The results obtained are then evaluated by means of a quality control analysis, as well as with complementary graphical representations. The suggested methodology has also proved efficient for identifying the change detection index with the highest contribution. PMID:22737023
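A minimal version of the index-threshold-fuse pipeline described above can be written with NumPy alone: compute two change indices from co-registered bands, threshold each with Otsu's method, and fuse the binary masks at the decision level. Everything in the sketch below, the simulated bands, the choice of indices, and the agreement-based fusion rule, is an illustrative assumption rather than the paper's actual processing chain.

import numpy as np

rng = np.random.default_rng(2)

# Hypothetical co-registered band from two dates (e.g. NIR reflectance)
band_t1 = rng.random((100, 100))
band_t2 = band_t1 + rng.normal(scale=0.05, size=(100, 100))
band_t2[40:60, 40:60] += 0.4          # simulated landcover change patch

# Two simple change indices: absolute difference and absolute log-ratio
idx_diff = np.abs(band_t2 - band_t1)
idx_ratio = np.abs(np.log((band_t2 + 1e-6) / (band_t1 + 1e-6)))

def otsu_threshold(img, bins=256):
    # Minimal Otsu implementation: maximize the between-class variance
    hist, edges = np.histogram(img, bins=bins)
    centers = (edges[:-1] + edges[1:]) / 2
    mass = hist * centers
    w0 = np.cumsum(hist)
    w1 = hist.sum() - w0
    m0 = np.cumsum(mass) / np.maximum(w0, 1)
    m1 = (mass.sum() - np.cumsum(mass)) / np.maximum(w1, 1)
    between = w0 * w1 * (m0 - m1) ** 2
    return centers[np.argmax(between)]

masks = [idx > otsu_threshold(idx) for idx in (idx_diff, idx_ratio)]
# Decision-level fusion: flag change only where the indices agree
change_map = np.logical_and.reduce(masks)
print(change_map.sum(), "pixels flagged as change")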
Response Time Analysis and Test of Protection System Instrument Channels for APR1400 and OPR1000
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Chang Jae; Han, Seung; Yun, Jae Hee
2015-07-01
Safety limits are required to maintain the integrity of physical barriers designed to prevent the uncontrolled release of radioactive materials in nuclear power plants. The safety analysis establishes two critical constraints: an analytical limit in terms of a measured or calculated variable, and a specific time after the analytical limit is reached by which protective action must begin. In keeping with nuclear regulations and industry standards, satisfying these two requirements ensures that the safety limit will not be exceeded during a design basis event, either an anticipated operational occurrence or a postulated accident. Various studies on the setpoint determination methodology for safety-related instrumentation have been actively performed to ensure that the requirement of the analytical limit is satisfied. In particular, the protection setpoint methodology for the advanced power reactor 1400 (APR1400) and the optimized power reactor 1000 (OPR1000) has recently been developed to cover both the design basis event and the beyond-design-basis event. The developed setpoint methodology has also been quantitatively validated using specific computer programs and setpoint calculations. However, the safety of nuclear power plants cannot be fully guaranteed by satisfying the requirement of the analytical limit alone. Despite the response time verification requirements of nuclear regulations and industry standards, studies on a systematically integrated methodology for response time evaluation are hard to find. For APR1400 and OPR1000, the response time analysis for the plant protection system is partially included in the setpoint calculation, and the response time test is separately performed via a specific plant procedure. The test technique has the drawback that it is difficult to demonstrate the completeness of the timing test; the analysis technique has the drawback of yielding extreme times that are not actually possible. Thus, a systematic response time evaluation methodology is needed to justify conformance to the response time requirement used in the safety analysis. This paper proposes a response time evaluation methodology for APR1400 and OPR1000 using a combined analysis and test technique to confirm that the plant protection system can meet the analytical response time assumed in the safety analysis. In addition, the results of the quantitative evaluation performed for APR1400 and OPR1000 are presented. The proposed response time analysis technique consists of defining the response time requirement, determining the critical signal path for the trip parameter, allocating an individual response time to each component on the signal path, and analyzing the total response time for the trip parameter; it demonstrates that the total analyzed response time does not exceed the response time requirement. The proposed response time test technique consists of defining the response time requirement, determining the critical signal path for the trip parameter, determining the test method for each component on the signal path, and performing the response time test; it demonstrates that the total test result does not exceed the response time requirement. The total response time should be tested in a single test that covers the instrument channel from the sensor to the final actuation device.
When the total channel is not tested in a single test, separate tests on groups of components or on single components that together cover the total instrument channel shall be combined to verify the total channel response. For APR1400 and OPR1000, the ramp test technique is used for the pressure and differential pressure transmitters, and the step function testing technique is applied to the signal processing equipment and the final actuation device. As a result, it can be demonstrated that the response time requirement is satisfied by the combined analysis and test technique. Therefore, the methodology proposed in this paper plays a crucial role in guaranteeing the safety of nuclear power plants by systematically satisfying one of the two critical requirements from the safety analysis. (authors)
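The analysis half of the combined technique reduces to a response time budget: each component on the critical signal path receives an allocated (or measured) response time, and the sum is checked against the analytical requirement. The Python sketch below illustrates that bookkeeping; the component names, allocations, and the 1.10 s requirement are invented placeholders, not APR1400/OPR1000 design values.

# Response time budget check along an assumed critical signal path
response_time_requirement_s = 1.10   # assumed analytical limit for the trip parameter

# component on the critical path -> allocated or tested response time (s)
signal_path = {
    "pressure transmitter (ramp-tested)": 0.50,
    "signal conditioning / bistable processor": 0.25,
    "coincidence logic": 0.10,
    "final actuation device (step-tested)": 0.15,
}

total = sum(signal_path.values())
margin = response_time_requirement_s - total
print(f"total analyzed response time = {total:.2f} s, margin = {margin:+.2f} s")
assert total <= response_time_requirement_s, "response time requirement not met"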
Extraction of Organochlorine Pesticides from Plastic Pellets and Plastic Type Analysis.
Pflieger, Marilyne; Makorič, Petra; Kovač Viršek, Manca; Koren, Špela
2017-07-01
Plastic resin pellets, categorized as microplastics (≤5 mm in diameter), are small granules that can be unintentionally released to the environment during manufacturing and transport. Because of their environmental persistence, they are widely distributed in the oceans and on beaches all over the world. They can act as a vector of potentially toxic organic compounds (e.g., polychlorinated biphenyls) and might consequently negatively affect marine organisms. Their possible impacts along the food chain are not yet well understood. In order to assess the hazards associated with the occurrence of plastic pellets in the marine environment, it is necessary to develop methodologies that allow for rapid determination of associated organic contaminant levels. The present protocol describes the different steps required for sampling resin pellets, analyzing adsorbed organochlorine pesticides (OCPs) and identifying the plastic type. The focus is on the extraction of OCPs from plastic pellets by means of a pressurized fluid extractor (PFE) and on the polymer chemical analysis applying Fourier Transform-InfraRed (FT-IR) spectroscopy. The developed methodology focuses on 11 OCPs and related compounds, including dichlorodiphenyltrichloroethane (DDT) and its two main metabolites, lindane and two production isomers, as well as the two biologically active isomers of technical endosulfan. This protocol constitutes a simple and rapid alternative to existing methodology for evaluating the concentration of organic contaminants adsorbed on plastic pieces.
Dent, Andrew W; Asadpour, Ali; Weiland, Tracey J; Paltridge, Debbie
2008-02-01
Fellows of the Australasian College for Emergency Medicine (FACEM) have opportunities to participate in a range of continuing professional development activities. To inform FACEM and assist those involved in planning continuing professional development interventions for FACEM, we undertook a learning needs analysis of emergency physicians. Exploratory study using survey methodology. Following questionnaire development by iterative feedback with emergency physicians and researchers, a mailed survey was distributed to all FACEM. The survey comprised eight items on work and demographic characteristics of FACEM, and 194 items on attitudes to existing learning opportunities, barriers to learning, and perceived learning needs and preferences. Fifty-eight percent (503/854) of all FACEM surveyed responded to the questionnaire, almost half of whom had attained their FACEM after the year 2000. The sample comprised mostly males (72.8%), with a mean age of 41.6 years, similar to the ACEM database. Most respondents reported working in ACEM-accredited hospitals (89%) and major referral hospitals (54%), and practicing with both children and adults (78%). FACEM reported working on average 26.7 clinical hours per week, with those at private hospitals working a greater proportion of clinical hours than those at other hospital types. As the first of six related reports, this paper documents the methodology used, including questionnaire development, and provides the demographics of responding FACEM, including the clinical and non-clinical hours worked and the type of hospital of principal employment.
Shukla, Nagesh; Keast, John E; Ceglarek, Darek
2014-10-01
The modelling of complex workflows is an important problem-solving technique within healthcare settings. However, currently most of the workflow models use a simplified flow chart of patient flow obtained using on-site observations, group-based debates and brainstorming sessions, together with historic patient data. This paper presents a systematic and semi-automatic methodology for knowledge acquisition with detailed process representation using sequential interviews of people in the key roles involved in the service delivery process. The proposed methodology allows the modelling of roles, interactions, actions, and decisions involved in the service delivery process. This approach is based on protocol generation and analysis techniques such as: (i) initial protocol generation based on qualitative interviews of radiology staff, (ii) extraction of key features of the service delivery process, (iii) discovering the relationships among the key features extracted, and, (iv) a graphical representation of the final structured model of the service delivery process. The methodology is demonstrated through a case study of a magnetic resonance (MR) scanning service-delivery process in the radiology department of a large hospital. A set of guidelines is also presented in this paper to visually analyze the resulting process model for identifying process vulnerabilities. A comparative analysis of different workflow models is also conducted. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Transitioning Domain Analysis: An Industry Experience.
1996-06-01
…an academic and industry partnership took feature-oriented domain analysis (FODA) from a methodology that is still being defined to a well-documented…to pilot the use of the Software Engineering Institute (SEI) domain analysis methodology known as feature-oriented domain analysis (FODA). Supported…
Fox, Kieran C R; Dixon, Matthew L; Nijeboer, Savannah; Girn, Manesh; Floman, James L; Lifshitz, Michael; Ellamil, Melissa; Sedlmeier, Peter; Christoff, Kalina
2016-06-01
Meditation is a family of mental practices that encompasses a wide array of techniques employing distinctive mental strategies. We systematically reviewed 78 functional neuroimaging (fMRI and PET) studies of meditation, and used activation likelihood estimation to meta-analyze 257 peak foci from 31 experiments involving 527 participants. We found reliably dissociable patterns of brain activation and deactivation for four common styles of meditation (focused attention, mantra recitation, open monitoring, and compassion/loving-kindness), and suggestive differences for three others (visualization, sense-withdrawal, and non-dual awareness practices). Overall, dissociable activation patterns are congruent with the psychological and behavioral aims of each practice. Some brain areas are recruited consistently across multiple techniques, including the insula, pre/supplementary motor cortices, dorsal anterior cingulate cortex, and frontopolar cortex, but convergence is the exception rather than the rule. A preliminary effect-size meta-analysis found medium effects for both activations (d=0.59) and deactivations (d=-0.74), suggesting potential practical significance. Our meta-analysis supports the neurophysiological dissociability of meditation practices, but also raises many methodological concerns and suggests avenues for future research. Copyright © 2016 Elsevier Ltd. All rights reserved.
Qin, Yao; Guo, Xing Wei; Li, Lei; Wang, Hong Wei; Kim, Wook
2013-06-01
The present study examined, for the first time, the in vitro wound healing potential of a chitosan-green tea polyphenol (CGP) complex based on the activation of transglutaminase (TGM) genes in epidermal morphogenesis. Response surface methodology was applied to determine the optimal processing condition giving maximum extraction of green tea polyphenols. The antioxidant activity, scavenging ability, and chelating ability were studied and expressed as average EC50 values for CGP and the other treatments. TGM sequences were subjected to in silico analysis and gene coexpression network analysis. The temporal expression of TGMs was profiled by semi-quantitative reverse transcription (RT)-PCR within 10 days after wounding and at 2 days postwounding. CGP showed effective antioxidant properties, and histopathological observations showed advanced tissue granulation and epithelialization with CGP treatment. In silico and coexpression analyses confirmed regulation via the TGM gene family in dermatological tissues. RT-PCR demonstrated increased levels of TGM1-3 expression induced by CGP treatment. The efficacy of CGP in wound healing indicated by these results may be ascribed to its antioxidant properties and to activation of the expression of TGMs, and is thus essential for the facilitated repair of skin injury.
Automated analysis of art object surfaces using time-averaged digital speckle pattern interferometry
NASA Astrophysics Data System (ADS)
Lukomski, Michal; Krzemien, Leszek
2013-05-01
Technical development and practical evaluation of a laboratory-built, out-of-plane digital speckle pattern interferometer (DSPI) are reported. The instrument was used for non-invasive, non-contact detection and characterization of early-stage damage, such as fracturing and layer separation, of painted objects of art. A fully automated algorithm was developed for recording and analysis of vibrating objects utilizing continuous-wave laser light. The algorithm uses direct numerical fitting or the Hilbert transformation for an independent, quantitative evaluation of the Bessel function at every point of the investigated surface. The procedure does not require phase modulation and thus can be implemented within any, even the simplest, DSPI apparatus. The proposed deformation analysis is fast and computationally inexpensive. Diagnosis of the physical state of the surface of a panel painting attributed to Nicolaus Haberschrack (a late-mediaeval painter active in Krakow) from the collection of the National Museum in Krakow is presented as an example of an in situ application of the developed methodology. It has allowed the effectiveness of the deformation analysis to be evaluated for the surface of a real painting (heterogeneous colour and texture) in a conservation studio where the vibration level was considerably higher than in the laboratory. It has been established that the methodology, which offers automatic analysis of the interferometric fringe patterns, has considerable potential to facilitate and render more precise the condition surveys of works of art.
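In time-averaged DSPI of a vibrating surface, the fringe brightness at a point is governed by the zero-order Bessel function of the local vibration amplitude, so amplitude can be recovered by numerical fitting, one of the two evaluation routes mentioned above. The Python sketch below fits such a Bessel-modulated model to simulated single-point data with SciPy; the He-Ne wavelength, the linear amplitude-versus-drive assumption, and all numbers are illustrative, not the instrument's calibration.

import numpy as np
from scipy.special import j0
from scipy.optimize import curve_fit

# Time-averaged DSPI: fringe brightness ~ |J0(k*a)|^2, with a the local
# out-of-plane vibration amplitude and k = 4*pi/lambda.
wavelength = 632.8e-9                  # assumed He-Ne illumination, m
k = 4 * np.pi / wavelength

def fringe_model(drive_v, amp_nm_per_volt, offset, scale):
    a = amp_nm_per_volt * 1e-9 * drive_v     # assumed linear amplitude response
    return offset + scale * j0(k * a) ** 2

drive = np.linspace(0.0, 2.0, 40)            # excitation voltage, V
truth = fringe_model(drive, 60.0, 0.05, 0.9)
measured = truth + np.random.default_rng(3).normal(scale=0.02, size=drive.size)

popt, _ = curve_fit(fringe_model, drive, measured, p0=[50.0, 0.0, 1.0])
print(f"estimated amplitude sensitivity ~ {popt[0]:.1f} nm/V")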
Zhu, Bao Ting
2010-01-01
Background Recent studies showed that some of the dietary bioflavonoids can strongly stimulate the catalytic activity of cyclooxygenase (COX) I and II in vitro and in vivo, presumably by facilitating enzyme re-activation. In this study, we sought to understand the structural basis of COX activation by these dietary compounds. Methodology/Principal Findings A combination of molecular modeling studies, biochemical analysis and site-directed mutagenesis assay was used as research tools. Three-dimensional quantitative structure-activity relationship analysis (QSAR/CoMFA) predicted that the ability of bioflavonoids to activate COX I and II depends heavily on their B-ring structure, a moiety known to be associated with strong antioxidant ability. Using the homology modeling and docking approaches, we identified the peroxidase active site of COX I and II as the binding site for bioflavonoids. Upon binding to this site, bioflavonoid can directly interact with hematin of the COX enzyme and facilitate the electron transfer from bioflavonoid to hematin. The docking results were verified by biochemical analysis, which reveals that when the cyclooxygenase activity of COXs is inhibited by covalent modification, myricetin can still stimulate the conversion of PGG2 to PGE2, a reaction selectively catalyzed by the peroxidase activity. Using the site-directed mutagenesis analysis, we confirmed that Q189 at the peroxidase site of COX II is essential for bioflavonoids to bind and re-activate its catalytic activity. Conclusions/Significance These findings provide the structural basis for bioflavonoids to function as high-affinity reducing co-substrates of COXs through binding to the peroxidase active site, facilitating electron transfer and enzyme re-activation. PMID:20808785
Competency-based curriculum and active methodology: perceptions of nursing students.
Paranhos, Vania Daniele; Mendes, Maria Manuela Rino
2010-01-01
This study identifies the perceptions of undergraduate students at the University of São Paulo at Ribeirão Preto, Brazil, College of Nursing (EERP-USP) concerning the teaching-learning process in two courses: Integrated Seminar: Health-Disease/Care Process in Health Services Policies and Organization, offered to first-year students in 2005 and 2006, and Integrality in Health Care I and II, offered to second-year students in 2006. The courses' proposal was to adopt active methodology and a competency-based curriculum. Data were collected from written tests submitted to 62 students at the end of the courses, focusing on the tests' pertinence, development of performance, structure and pedagogical dynamics, organization, and settings. Thematic analysis indicated that students enjoyed the courses, highlighted the role of the professor/facilitator at points of the pedagogical cycle, and valued the learning recorded in their portfolios. Students valued their experience in the Primary Health Care setting, which, since the beginning of the program, has been based on the interlocution between theory and professional practice and on closeness to the principles of the Unified Health System (SUS).
Analytical control test plan and microbiological methods for the water recovery test
NASA Technical Reports Server (NTRS)
Traweek, M. S. (Editor); Tatara, J. D. (Editor)
1994-01-01
Qualitative and quantitative laboratory results are important to the decision-making process. In some cases, they may represent the only basis for deciding between two or more options or processes. It is therefore essential that the handling of laboratory samples and the analytical operations employed be performed with deliberate, conscientious effort. Reporting erroneous results can lead to faulty interpretations and misinformed decisions. This document provides the analytical control specifications which will govern future test procedures related to all Water Recovery Test (WRT) Phase 3 activities to be conducted at the National Aeronautics and Space Administration/Marshall Space Flight Center (NASA/MSFC). It addresses the process which will be used to verify analytical data generated throughout the test period, identifies the responsibilities of key personnel and participating laboratories and the chains of communication to be followed, and ensures that approved methodology and procedures are used during WRT activities. This document does not outline specifics, but provides a minimum guideline by which sampling protocols, analysis methodologies, test site operations, and laboratory operations should be developed.
Wavelet methodology to improve single unit isolation in primary motor cortex cells.
Ortiz-Rosario, Alexis; Adeli, Hojjat; Buford, John A
2015-05-15
The proper isolation of action potentials recorded extracellularly from neural tissue is an active area of research in the fields of neuroscience and biomedical signal processing. This paper presents an isolation methodology for neural recordings using the wavelet transform (WT), a statistical thresholding scheme, and the principal component analysis (PCA) algorithm. The effectiveness of five different mother wavelets was investigated: biorthogonal, Daubechies, discrete Meyer, symmetric, and Coifman, along with three different wavelet coefficient thresholding schemes (fixed form threshold, Stein's unbiased estimate of risk, and minimax) and two different thresholding rules (soft and hard thresholding). Signal quality was evaluated using three statistical measures: mean squared error, root mean square, and signal-to-noise ratio. Clustering quality was evaluated using two statistical measures: isolation distance and L-ratio. This research shows that the selection of the mother wavelet has a strong influence on the clustering and isolation of single-unit neural activity, with the Daubechies 4 wavelet and minimax thresholding scheme performing the best.
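The pipeline summarised above (wavelet-coefficient thresholding followed by PCA-based feature extraction) can be sketched briefly. The snippet below is only an illustration of that general combination, using the Daubechies-4 wavelet, the minimax threshold, and hard thresholding that the study reports as the best-performing choices; the detection threshold, snippet length, and synthetic trace are invented for the example, and this is not the authors' implementation.

```python
"""Minimal sketch of a denoise-then-cluster pipeline: wavelet-threshold the
raw trace (db4 wavelet, minimax threshold, hard thresholding), then project
detected spike snippets with PCA. Uses PyWavelets and scikit-learn."""
import numpy as np
import pywt
from sklearn.decomposition import PCA

def wavelet_denoise(x, wavelet="db4", level=4, mode="hard"):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    # Noise scale from the finest detail band (robust MAD estimate).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    n = x.size
    # Minimax threshold (Donoho & Johnstone); defined as 0 for short signals.
    thr = sigma * (0.3936 + 0.1829 * np.log2(n)) if n > 32 else 0.0
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode=mode) for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[:n]

def extract_snippets(x, threshold, half_width=16):
    """Crude peak detection: one snippet per sample crossing `threshold`."""
    idx = np.where(x > threshold)[0]
    snippets = [x[i - half_width:i + half_width]
                for i in idx if half_width <= i < x.size - half_width]
    return np.array(snippets)

# Synthetic example: noisy trace with embedded spike-like events.
rng = np.random.default_rng(0)
trace = rng.normal(0, 1.0, 20_000)
for i in rng.integers(100, 19_900, 60):
    trace[i - 5:i + 5] += 8.0 * np.hanning(10)        # fake action potentials

clean = wavelet_denoise(trace)
snips = extract_snippets(clean, threshold=4.0)
if len(snips) > 2:
    scores = PCA(n_components=2).fit_transform(snips)  # features for clustering
    print(snips.shape, scores.shape)
```

The PCA scores would then be fed to a clustering step, and cluster quality assessed with measures such as isolation distance and L-ratio, as in the study.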
Keereetaweep, Jantana; Chapman, Kent D.
2016-01-01
The endocannabinoids N-arachidonoylethanolamide (or anandamide, AEA) and 2-arachidonoylglycerol (2-AG) belong to the larger groups of N-acylethanolamines (NAEs) and monoacylglycerol (MAG) lipid classes, respectively. They are biologically active lipid molecules that activate G-protein-coupled cannabinoid receptors found in various organisms. After AEA and 2-AG were discovered in the 1990s, they have been extensively documented to have a broad range of physiological functions. Along with AEA, several NAEs, for example, N-palmitoylethanolamine (PEA), N-stearoylethanolamine (SEA), and N-oleoylethanolamine (OEA) are also present in tissues, usually at much larger concentrations than AEA. Any perturbation that involves the endocannabinoid pathway may subsequently alter basal level or metabolism of these lipid mediators. Further, the altered levels of these molecules often reflect pathological conditions associated with tissue damage. Robust and sensitive methodologies to analyze these lipid mediators are essential to understanding how they act as endocannabinoids. The recent advances in mass spectrometry allow researchers to develop lipidomics approaches and several methodologies have been proposed to quantify endocannabinoids in various biological systems. PMID:26839710
Ceramic matrix composite behavior -- Computational simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chamis, C.C.; Murthy, P.L.N.; Mital, S.K.
Development of analytical modeling and computational capabilities for the prediction of high-temperature ceramic matrix composite behavior has been an ongoing research activity at NASA Lewis Research Center. These research activities have resulted in the development of micromechanics-based methodologies to evaluate different aspects of ceramic matrix composite behavior. The basis of the approach is micromechanics together with a unique fiber substructuring concept. In this new concept, the conventional unit cell (the smallest representative volume element of the composite) of the micromechanics approach has been modified by substructuring the unit cell into several slices and developing the micromechanics-based equations at the slice level. The main advantage of this technique is that it can provide much greater detail in the response of composite behavior than a conventional micromechanics-based analysis while still maintaining very high computational efficiency. This methodology has recently been extended to model plain-weave ceramic composites. The objective of the present paper is to describe the important features of the modeling and simulation and to illustrate them with select examples of laminated as well as woven composites.
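The slice-substructuring idea can be illustrated with a very small numerical sketch: divide the unit cell into thin slices, apply simple micromechanics (rule of mixtures) slice by slice with a slice-dependent fibre fraction, and recombine the slice results. The fibre and matrix properties, the circular-fibre slicing rule, and the recombination assumptions below are placeholders for illustration only, not the NASA methodology itself.

```python
"""Toy sketch of slice-level micromechanics under illustrative assumptions:
Voigt (parallel) estimate along the fibre and Reuss (series) estimate
transverse, evaluated per slice and then recombined across slices."""
import numpy as np

E_FIBER, E_MATRIX = 380e9, 95e9   # Pa, hypothetical SiC fibre / ceramic matrix
FIBER_VOLUME_FRACTION = 0.40
N_SLICES = 20

def slice_fiber_fraction(y: float, vf_total: float) -> float:
    """Local fibre fraction at normalised thickness coordinate y in [0, 1].

    A circular fibre in a unit square cell gives a chord-width profile;
    the point is simply that each slice carries its own fraction."""
    r = np.sqrt(vf_total / np.pi)                    # fibre radius
    dy = abs(y - 0.5)
    return 2.0 * np.sqrt(max(r**2 - dy**2, 0.0))     # chord width = local Vf

ys = (np.arange(N_SLICES) + 0.5) / N_SLICES
E11_slices, E22_slices = [], []
for y in ys:
    vf = slice_fiber_fraction(y, FIBER_VOLUME_FRACTION)
    E11_slices.append(vf * E_FIBER + (1 - vf) * E_MATRIX)          # Voigt
    E22_slices.append(1.0 / (vf / E_FIBER + (1 - vf) / E_MATRIX))  # Reuss

# Recombine slices: parallel along the fibre direction, series transverse.
E11 = np.mean(E11_slices)
E22 = 1.0 / np.mean(1.0 / np.array(E22_slices))
print(f"E11 ~ {E11/1e9:.1f} GPa, E22 ~ {E22/1e9:.1f} GPa")
```

The appeal of working at the slice level is visible even in this toy: local variations in constituent content enter the result directly, yet the cost remains a handful of closed-form evaluations per slice.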
Object-oriented analysis and design: a methodology for modeling the computer-based patient record.
Egyhazy, C J; Eyestone, S M; Martino, J; Hodgson, C L
1998-08-01
The article highlights the importance of an object-oriented analysis and design (OOAD) methodology for the computer-based patient record (CPR) in the military environment. Many OOAD methodologies do not adequately scale up, allow for efficient reuse of their products, or accommodate legacy systems. A methodology that addresses these issues is formulated and used to demonstrate its applicability in a large-scale health care service system. During a period of 6 months, a team of object modelers and domain experts formulated an OOAD methodology tailored to the Department of Defense Military Health System and used it to produce components of an object model for simple order processing. This methodology and the lessons learned during its implementation are described. This approach is necessary to achieve broad interoperability among heterogeneous automated information systems.
The Development of a "Neighborhood in Solidarity" in Switzerland.
Zwygart, Marion; Plattet, Alain; Ammor, Sarah
2017-01-01
This article presents a case study based on the "Neighborhood in Solidarity" (NS) methodology to illustrate its application in a locality of 8,000 inhabitants in Switzerland. This specific project is proposed to exemplify the global aim of the NS methodology: to increase the integration of elderly persons in society in order to improve their quality of life. The case study demonstrates the enhancement of the capacity of older people to remain actively engaged in their neighborhood. The article focuses on the creation of an autonomous community of empowered older people who can resolve their own problems after a 5-year project. The construction of the local community is presented throughout the six steps of the methodology: (1) preliminary analysis, (2) diagnostic, (3) construction, (4) project design, (5) project implementation, and (6) empowerment, with three degrees of involvement (community, participative, and integrative involvement). Performance and output indicators, quality indicators, and social determinants of health assess the development of the local project. The impacts of the project illustrated in this example motivated this publication, which aims to inspire practitioners in other countries.
77 FR 69693 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-20
... collection activities at national conferences will use identical methodologies or otherwise share a common element. Similarly, the BEP's scientific studies will use very similar methodologies or share a common... conducting scientific tests. Using those methodologies, the BEP or its contracted specialists will conduct...
"Could I return to my life?" Integrated Narrative Nursing Model in Education (INNE).
Artioli, Giovanna; Foà, Chiara; Cosentino, Chiara; Sulla, Francesco; Sollami, Alfonso; Taffurelli, Chiara
2018-03-28
The Integrated Narrative Nursing Model (INNM) is an approach that integrates the qualitative methodology typical of the human sciences with the quantitative methodology more often associated with the natural sciences. This complex model, which combines a focus on narrative with quantitative measures, has recently been applied effectively to the assessment of chronic patients. In this study, the model is applied to the planning phase of education (Integrated Narrative Nursing Education, INNE), and proves to be a valid instrument for promoting the current educational paradigm, which is centered on the engagement of both the patient and the caregiver in their own path of care. The aim of this study is therefore to describe the nurse's strategy in planning an educational intervention using the INNE model. The case of a 70-year-old woman with pulmonary neoplasm is described at her first admission to a hospice. Each step conducted by the reference nurse, who uses INNE to record the nurse-patient narrative and collect subsequent questionnaires in order to create a shared educational plan, is also described. The information collected was submitted, using a grounded methodology, to the following four levels of analysis: I. Needs Assessment, II. Narrative Diagnosis, III. Quantitative Outcome, IV. Integrated Outcome. Step IV, derived from the integration of all levels of analysis, allows a nurse to define, even graphically, a conceptual map of the patient's needs, resources and perspectives in a completely tailored manner. The INNE model offers valid methodological support for the professional who intends to educate the patient through an inter-subjective and engaged pathway between the professional, the patient and the socio-relational context. It requires a complex vision that combines processes and methods demanding a steady scientific basis and advanced methodological expertise with active listening and empathy, skills which in turn require emotional intelligence.
Control of maglev vehicles with aerodynamic and guideway disturbances
NASA Technical Reports Server (NTRS)
Flueckiger, Karl; Mark, Steve; Caswell, Ruth; Mccallum, Duncan
1994-01-01
A modeling, analysis, and control design methodology is presented for maglev vehicle ride quality performance improvement as measured by the Pepler Index. Ride quality enhancement is considered through active control of secondary suspension elements and active aerodynamic surfaces mounted on the train. To analyze and quantify the benefits of active control, the authors have developed a five-degree-of-freedom lumped-parameter model suitable for describing a large class of maglev vehicles, including both channel and box-beam guideway configurations. Elements of this modeling capability have recently been employed in studies sponsored by the U.S. Department of Transportation (DOT). A perturbation analysis about an operating point, defined by the vehicle and average crosswind velocities, yields a linearized state-space model suitable for multivariable control system analysis and synthesis. Neglecting passenger compartment noise, the ride quality as quantified by the Pepler Index is readily computed from the system states. A statistical analysis is performed by modeling the crosswind disturbances and guideway variations as filtered white noise, whereby the Pepler Index is established in closed form through the solution of a matrix Lyapunov equation. Data are presented indicating the anticipated ride quality achieved through various closed-loop control arrangements.
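The closed-form statistical step mentioned above can be illustrated compactly: with disturbances modelled as white noise driving a stable linear model, the steady-state state covariance follows from a matrix Lyapunov equation, and RMS outputs (the kind of accelerations that feed a Pepler-type ride-quality index) are read off from it. The sketch below uses small random placeholder matrices, not the study's maglev model or its Pepler weighting.

```python
"""Minimal sketch: steady-state covariance of a linear state-space model
driven by white noise, via the continuous-time Lyapunov equation."""
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

rng = np.random.default_rng(1)

n_states, n_noise = 6, 2
A = rng.normal(size=(n_states, n_states))
A -= (np.max(np.real(np.linalg.eigvals(A))) + 1.0) * np.eye(n_states)  # force stability
B = rng.normal(size=(n_states, n_noise))       # disturbance input matrix
W = np.diag([0.5, 0.2])                        # white-noise intensities

# Steady-state state covariance P solves  A P + P A^T + B W B^T = 0.
P = solve_continuous_lyapunov(A, -B @ W @ B.T)

# Ride-quality outputs (e.g. vertical/lateral accelerations) y = C x:
C = rng.normal(size=(2, n_states))
rms_outputs = np.sqrt(np.diag(C @ P @ C.T))
print("RMS ride-quality outputs:", rms_outputs)
```

In a design loop, the same computation would be repeated for each candidate closed-loop system matrix so that control arrangements can be compared directly on the resulting RMS figures.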
Rat sperm motility analysis: methodologic considerations
The objective of these studies was to optimize conditions for computer-assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodologic issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample c...
An economic analysis methodology for project evaluation and programming.
DOT National Transportation Integrated Search
2013-08-01
Economic analysis is a critical component of a comprehensive project or program evaluation methodology that considers all key : quantitative and qualitative impacts of highway investments. It allows highway agencies to identify, quantify, and value t...
The ARGO Project: assessing NA-TECH risks on off-shore oil platforms
NASA Astrophysics Data System (ADS)
Capuano, Paolo; Basco, Anna; Di Ruocco, Angela; Esposito, Simona; Fusco, Giannetta; Garcia-Aristizabal, Alexander; Mercogliano, Paola; Salzano, Ernesto; Solaro, Giuseppe; Teofilo, Gianvito; Scandone, Paolo; Gasparini, Paolo
2017-04-01
ARGO (Analysis of natural and anthropogenic risks on off-shore oil platforms) is a two-year project funded by the DGS-UNMIG (Directorate General for Safety of Mining and Energy Activities - National Mining Office for Hydrocarbons and Georesources) of the Italian Ministry of Economic Development. The project, coordinated by AMRA (Center for the Analysis and Monitoring of Environmental Risk), aims at providing technical support for the analysis of natural and anthropogenic risks on offshore oil platforms. To achieve this challenging objective, ARGO brings together climate experts, risk management experts, seismologists, geologists, chemical engineers, and earth and coastal observation experts. ARGO has developed methodologies for the probabilistic analysis of industrial accidents triggered by natural events (NA-TECH) on offshore oil platforms in the Italian seas, including extreme events related to climate change. Furthermore, the environmental effects of offshore activities have been investigated, including changes in seismicity and in the evolution of coastal areas close to offshore platforms. A probabilistic multi-risk framework has then been developed for the analysis of NA-TECH events on offshore installations for hydrocarbon extraction.
Spatial Bayesian Latent Factor Regression Modeling of Coordinate-based Meta-analysis Data
Montagna, Silvia; Wager, Tor; Barrett, Lisa Feldman; Johnson, Timothy D.; Nichols, Thomas E.
2017-01-01
Summary: Now over 20 years old, functional MRI (fMRI) has a large and growing literature that is best synthesised with meta-analytic tools. As most authors do not share image data, only the peak activation coordinates (foci) reported in a paper are available for Coordinate-Based Meta-Analysis (CBMA). Neuroimaging meta-analysis is used to (1) identify areas of consistent activation and (2) build a predictive model of task type or cognitive process for new studies (reverse inference). To simultaneously address these aims, we propose a Bayesian point-process hierarchical model for CBMA. We model the foci from each study as a doubly stochastic Poisson process, where the study-specific log intensity function is characterised as a linear combination of a high-dimensional basis set. A sparse representation of the intensities is guaranteed through latent factor modeling of the basis coefficients. Within our framework, it is also possible to account for the effect of study-level covariates (meta-regression), significantly expanding the capabilities of currently available neuroimaging meta-analysis methods. We apply our methodology to synthetic data and neuroimaging meta-analysis datasets. PMID:28498564
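To make the generative side of this model concrete, the sketch below simulates foci from a doubly stochastic Poisson process whose log intensity is a linear combination of basis functions with low-rank (latent-factor) coefficients. It is a 1-D toy under arbitrary choices of basis, factor dimension, and scale; it is not the authors' spatial Bayesian implementation, and it shows only simulation, not inference.

```python
"""Toy simulation of study-specific foci: log intensity = basis @ theta,
with theta generated from a latent-factor (low-rank) model; foci are drawn
from the resulting inhomogeneous Poisson process by thinning."""
import numpy as np

rng = np.random.default_rng(7)

# 1-D spatial grid standing in for the brain volume.
grid = np.linspace(0.0, 100.0, 1_000)

# Gaussian bump basis set b_k(x).
centers = np.linspace(0.0, 100.0, 20)
basis = np.exp(-0.5 * ((grid[:, None] - centers[None, :]) / 5.0) ** 2)  # (1000, 20)

# Latent-factor coefficients: theta_i = Lambda @ f_i + noise (sparse/low-rank).
n_studies, n_factors = 5, 3
Lambda = rng.normal(0, 0.6, size=(20, n_factors))      # factor loadings
factors = rng.normal(size=(n_factors, n_studies))
theta = Lambda @ factors + rng.normal(0, 0.1, size=(20, n_studies))

def sample_foci(log_intensity, grid):
    """Draw foci from an inhomogeneous Poisson process by thinning."""
    lam = np.exp(log_intensity)
    lam_max = lam.max()
    length = grid[-1] - grid[0]
    n_cand = rng.poisson(lam_max * length)
    cand = rng.uniform(grid[0], grid[-1], n_cand)
    accept = rng.uniform(0, lam_max, n_cand) < np.interp(cand, grid, lam)
    return cand[accept]

for i in range(n_studies):
    foci = sample_foci(basis @ theta[:, i] - 4.0, grid)   # -4 keeps counts small
    print(f"study {i}: {foci.size} simulated foci")
```

Meta-regression would enter this picture by letting the latent factors or loadings depend on study-level covariates, which is the extension the abstract highlights.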
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane
2003-09-01
This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.
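As a rough illustration of the core idea, in which probabilistic branching lives inside the objects and scenario likelihoods emerge from sampling rather than from a pre-specified event ordering, the sketch below runs a Monte Carlo over a few toy "objects". The runway-incursion objects, branch names, probabilities, and consequence rule are all invented for the example; this is not the OBEST implementation described in the article.

```python
"""Toy object-based scenario sampling: each object carries its own branch
probabilities; Monte Carlo over object behaviours yields scenario frequencies
and an estimate of the consequence probability."""
import random
from collections import Counter

# Each "object" maps its possible behaviours to branch probabilities.
OBJECTS = {
    "tower":    {"clearance_correct": 0.97, "clearance_error": 0.03},
    "pilot":    {"readback_correct": 0.95, "readback_error": 0.05},
    "aircraft": {"holds_short": 0.99, "crosses_hold_line": 0.01},
}

def sample_branch(branches: dict) -> str:
    r, acc = random.random(), 0.0
    for name, p in branches.items():
        acc += p
        if r < acc:
            return name
    return name  # guard against floating-point rounding

def run_scenario() -> tuple:
    """One Monte Carlo realisation: every object picks a branch independently."""
    return tuple(sample_branch(b) for b in OBJECTS.values())

def is_incursion(scenario: tuple) -> bool:
    # Toy consequence rule: an incursion needs an aircraft error plus at least
    # one upstream communication error.
    tower, pilot, aircraft = scenario
    return aircraft == "crosses_hold_line" and (
        tower == "clearance_error" or pilot == "readback_error")

random.seed(0)
N = 200_000
counts = Counter(run_scenario() for _ in range(N))
p_incursion = sum(c for s, c in counts.items() if is_incursion(s)) / N
print(f"estimated incursion probability ~ {p_incursion:.2e}")
```

A full OBEST-style model would additionally let one object's sampled behaviour condition the branch probabilities of others, which is how event ordering emerges from the object definitions rather than being fixed in advance.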
An Analysis of the Fulton Industrial Area: A Market for Parkway’s Industrial Medicine Program.
1983-05-30
strenuous activity. Executive Examination A rather extensive exam, the executive physical is typically used to detect illness in the early stages among ...to these abnormalities (37:495). Having the basic reasoning for back x-ray exams unsubstantiated, a committee of the American Occupational Medical...good that these two agencies have contributed to the health and safety of the American worker, concern over methodology, enforcement, and the cost for