ERIC Educational Resources Information Center
Iivari, Juhani; Hirschheim, Rudy
1996-01-01
Analyzes and compares eight information systems (IS) development approaches: Information Modelling, Decision Support Systems, the Socio-Technical approach, the Infological approach, the Interactionist approach, the Speech Act-based approach, Soft Systems Methodology, and the Scandinavian Trade Unionist approach. Discusses the organizational roles…
Development Context Driven Change Awareness and Analysis Framework
NASA Technical Reports Server (NTRS)
Sarma, Anita; Branchaud, Josh; Dwyer, Matthew B.; Person, Suzette; Rungta, Neha; Wang, Yurong; Elbaum, Sebastian
2014-01-01
Recent work on workspace monitoring allows conflict prediction early in the development process; however, these approaches mostly use syntactic differencing techniques to compare different program versions. In contrast, traditional change-impact analysis techniques analyze related versions of the program only after the code has been checked into the master repository. We propose a novel approach, DeCAF (Development Context Analysis Framework), that leverages the development context to scope a change-impact analysis technique. The goal is to characterize the impact of each developer on other developers in the team. Various client applications, such as task prioritization, early conflict detection, and providing advice on testing, can benefit from such a characterization. The DeCAF framework leverages information from the development context to bound the iDiSE change-impact analysis technique to analyze only the parts of the code base that are of interest. Bounding the analysis can enable DeCAF to efficiently compute the impact of changes using a combination of program-dependence and symbolic-execution-based approaches.
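As a minimal sketch of the bounding idea only: the function names and dependence graph below are hypothetical, and DeCAF's actual analysis combines program dependence with iDiSE symbolic execution, which this sketch does not reproduce.

```python
# Hypothetical sketch: scoping a change-impact analysis to a development
# context, in the spirit of DeCAF (not the actual iDiSE-based implementation).
from collections import deque

def impacted_methods(dep_graph, changed, context):
    """Breadth-first reachability over a program dependence graph,
    restricted to methods inside the development context of interest.

    dep_graph: dict mapping a method to the methods that depend on it
    changed:   set of methods edited in the developer's workspace
    context:   set of methods considered relevant (e.g., teammates' tasks)
    """
    impacted, queue = set(), deque(changed)
    while queue:
        m = queue.popleft()
        for dependent in dep_graph.get(m, ()):
            # Bounding step: ignore dependents outside the development context.
            if dependent in context and dependent not in impacted:
                impacted.add(dependent)
                queue.append(dependent)
    return impacted

# Toy dependence graph: parse() is used by render() and validate(), etc.
graph = {"parse": ["render", "validate"], "validate": ["save"]}
print(sorted(impacted_methods(graph, {"parse"}, {"render", "save", "validate"})))
```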
A conditional probability analysis (CPA) approach has been developed for identifying biological thresholds of impact for use in the development of geographic-specific water quality criteria for protection of aquatic life. This approach expresses the threshold as the likelihood ...
We advocate an approach to reduce the anticipated increase in stormwater runoff from conventional development by demonstrating a low-impact development that incorporates hydrologic factors into an expanded land suitability analysis. This methodology was applied to a 3 hectare exp...
Towards an Interoperability Ontology for Software Development Tools
2003-03-01
The description of feature models was tied to the introduction of the Feature-Oriented Domain Analysis (FODA) [KANG90] approach in the late eighties…Feature-oriented domain analysis (FODA) is a domain analysis method developed at the Software…these obstacles was to construct a "pilot" ontology that is extensible. We applied the Feature-Oriented Domain Analysis approach to capture the
Forestry sector analysis for developing countries: issues and methods.
R.W. Haynes
1993-01-01
A satellite meeting of the 10th Forestry World Congress focused on the methods used for forest sector analysis and their applications in both developed and developing countries. The results of that meeting are summarized, and a general approach for forest sector modeling is proposed. The approach includes models derived from the existing...
A Corpus-Based Approach to Online Materials Development for Writing Research Articles
ERIC Educational Resources Information Center
Chang, Ching-Fen; Kuo, Chih-Hua
2011-01-01
There has been increasing interest in the possible applications of corpora to both linguistic research and pedagogy. This study takes a corpus-based, genre-analytic approach to discipline-specific materials development. Combining corpus analysis with genre analysis makes it possible to develop teaching materials that are not only authentic but…
An Integrated Approach to Risk Assessment for Concurrent Design
NASA Technical Reports Server (NTRS)
Meshkat, Leila; Voss, Luke; Feather, Martin; Cornford, Steve
2005-01-01
This paper describes an approach to risk assessment and analysis suited to the early-phase, concurrent design of a space mission. The approach integrates an agile, multi-user risk collection tool, a more in-depth risk analysis tool, and repositories of risk information. A JPL-developed tool, named RAP, is used for collecting expert opinions about risk from designers involved in the concurrent design of a space mission. Another in-house-developed risk assessment tool, named DDP, is used for the analysis.
A network approach for distinguishing ethical issues in research and development.
Zwart, Sjoerd D; van de Poel, Ibo; van Mil, Harald; Brumsen, Michiel
2006-10-01
In this paper we report on our experiences with using network analysis to discern and analyse ethical issues in research into, and the development of, a new wastewater treatment technology. Using network analysis, we preliminarily interpreted some of our observations in a Group Decision Room (GDR) session where we invited important stakeholders to think about the risks of this new technology. We show how a network approach is useful for understanding the observations and for suggesting relevant ethical issues. We argue that a network approach is also useful for ethical analysis of issues in other fields of research and development. The abandoning of the overarching rationality assumption, which is central to network approaches, does not have to lead to ethical relativism.
Barzyk, Timothy M.; Wilson, Sacoby; Wilson, Anthony
2015-01-01
Community, state, and federal approaches to conventional and cumulative risk assessment (CRA) were described and compared to assess similarities and differences, and to develop recommendations for a consistent CRA approach, acceptable across each level as a rigorous scientific methodology, including partnership formation and solution development as necessary practices. Community, state, and federal examples were described and then summarized based on their adherence to CRA principles of: (1) planning, scoping, and problem formulation; (2) risk analysis and ranking; and (3) risk characterization, interpretation, and management. While each application shared the common goal of protecting human health and the environment, they adopted different approaches to achieve this. For a specific project-level analysis of a particular place or instance, this may be acceptable, but to ensure long-term applicability and transferability to other projects, recommendations for developing a consistent approach to CRA are provided. This approach would draw from best practices, risk assessment and decision analysis sciences, and historical lessons learned to provide results in a manner understandable to and accepted by all entities. This approach is intended to provide common ground around which to develop CRA methods and approaches that can be followed at all levels. PMID:25918910
NASA Technical Reports Server (NTRS)
Griesel, Martha Ann
1988-01-01
Several Laboratory software development projects that followed nonstandard development processes, which were hybrids of incremental development and prototyping, are being studied. Factors in the project environment leading to the decision to use a nonstandard development process and affecting its success are analyzed. A simple characterization of project environment based on this analysis is proposed, together with software development approaches which have been found effective for each category. These approaches include both documentation and review requirements.
ERIC Educational Resources Information Center
Green, C. Paul; Orsak, Charles G.
Undertaking of a systems approach to curriculum development for solar training led to (1) a feasibility study to determine the role of the community college in solar energy technology, (2) a market analysis to determine the manpower need, and (3) a task analysis for development of a curriculum for training solar energy technicians at Navarro…
Franks, Robert G
2016-01-01
The use of chloral hydrate optical clearing paired with differential interference contrast microscopy allows the analysis of internal structures of developing plant organs without the need for paraffin embedding and sectioning. This approach is appropriate for the analysis of the developing gynoecium or seedpod of the flowering plant Arabidopsis thaliana and many other types of fixed plant material. Early stages of ovule development are observable with this approach.
Naval Research Laboratory Industrial Chemical Analysis and Respiratory Filter Standards Development
2017-09-29
Approved for public release; distribution is unlimited. Thomas E. Sutto, Materials and Systems Branch, Naval Research Laboratory…The approach, developed by NRL, is tested by examining the filter behavior against a number of chemicals to determine if the NRL approach resulted in the
I show that a conditional probability analysis using a stressor-response model based on a logistic regression provides a useful approach for developing candidate water quality criteria from empirical data, such as the Maryland Biological Streams Survey (MBSS) data.
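As an illustration of this kind of conditional probability analysis, the sketch below fits a logistic stressor-response model to simulated data (not the MBSS data) and reads off a candidate criterion at an assumed impairment-probability benchmark of 0.25.

```python
# Illustrative sketch (assumed data and benchmark, not MBSS): deriving a
# candidate water quality criterion from a fitted logistic stressor-response model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
stressor = rng.uniform(0, 10, 500)                 # e.g., a conductivity index
p_true = 1 / (1 + np.exp(-(stressor - 5)))         # hidden dose-response
impaired = rng.random(500) < p_true                # True = biological impairment

model = LogisticRegression().fit(stressor.reshape(-1, 1), impaired)

# Candidate criterion: the stressor level at which the modeled probability
# of impairment first exceeds the chosen benchmark.
grid = np.linspace(0, 10, 1001).reshape(-1, 1)
p_hat = model.predict_proba(grid)[:, 1]
criterion = grid[np.argmax(p_hat >= 0.25), 0]
print(f"candidate criterion: stressor <= {criterion:.2f}")
```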
1985-02-01
Energy Analysis, a branch of dynamic modal analysis developed for analyzing acoustic vibration problems; its present stage of development embodies a…Maximum Entropy Stochastic Modelling and Reduced-Order Design Synthesis is a rigorous new approach to this class of problems. Inspired by Statistical
NASA Technical Reports Server (NTRS)
Hofmann, L. G.; Hoh, R. H.; Jewell, W. F.; Teper, G. L.; Patel, P. D.
1978-01-01
The objective of this effort is to determine IFR approach path and touchdown dispersions for manual and automatic XV-15 tilt rotor landings, and to develop missed approach criteria. Only helicopter mode XV-15 operation is considered. The analysis and design sections develop the automatic and flight director guidance equations for decelerating curved and straight-in approaches into a typical VTOL landing site equipped with an MLS navigation aid. These system designs satisfy all known pilot-centered, guidance and control requirements for this flying task. Performance data, obtained from nonstationary covariance propagation dispersion analysis for the system, are used to develop the approach monitoring criteria. The autoland and flight director guidance equations are programmed for the VSTOLAND 1819B digital computer. The system design dispersion data developed through analysis and the 1819B digital computer program are verified and refined using the fixed-base, man-in-the-loop XV-15 VSTOLAND simulation.
Estimating the cost of major ongoing cost plus hardware development programs
NASA Technical Reports Server (NTRS)
Bush, J. C.
1990-01-01
Approaches are developed for forecasting the cost of major hardware development programs while these programs are in the design and development C/D phase. Three approaches are developed: a schedule assessment technique for bottom-line summary cost estimation, a detailed cost estimation approach, and an intermediate cost element analysis procedure. The schedule assessment technique was developed using historical cost/schedule performance data.
NASA Technical Reports Server (NTRS)
Hoh, R. H.; Klein, R. H.; Johnson, W. A.
1977-01-01
A system analysis method for the development of an integrated configuration management/flight director system for IFR STOL approaches is presented. Curved descending decelerating approach trajectories are considered. Considerable emphasis is placed on satisfying the pilot centered requirements (acceptable workload) as well as the usual guidance and control requirements (acceptable performance). The Augmentor Wing Jet STOL Research Aircraft was utilized to allow illustration by example, and to validate the analysis procedure via manned simulation.
Introducing Systems Approaches
NASA Astrophysics Data System (ADS)
Reynolds, Martin; Holwell, Sue
Systems Approaches to Managing Change brings together five systems approaches to managing complex issues, each having a proven track record of over 25 years. The five approaches are: System Dynamics (SD), developed originally in the late 1950s by Jay Forrester; Viable Systems Model (VSM), developed originally in the late 1960s by Stafford Beer; Strategic Options Development and Analysis (SODA, with cognitive mapping), developed originally in the 1970s by Colin Eden; Soft Systems Methodology (SSM), developed originally in the 1970s by Peter Checkland; and Critical Systems Heuristics (CSH), developed originally in the late 1970s by Werner Ulrich.
A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis
Guardia, Gabriela D. A.; Pires, Luís Ferreira; Vêncio, Ricardo Z. N.; Malmegrim, Kelen C. R.; de Farias, Cléver R. G.
2015-01-01
Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNA-Seq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis. PMID:26207740
Preliminary Evaluation of BIM-based Approaches for Schedule Delay Analysis
NASA Astrophysics Data System (ADS)
Chou, Hui-Yu; Yang, Jyh-Bin
2017-10-01
The problem of schedule delay commonly occurs in construction projects. The quality of delay analysis depends on the availability of schedule-related information and delay evidence. More information used in delay analysis usually produces more accurate and fair analytical results. How to use innovative techniques to improve the quality of schedule delay analysis results has received much attention recently. As the Building Information Modeling (BIM) technique has developed quickly, using BIM and 4D simulation techniques has been proposed and implemented. Obvious benefits have been achieved, especially in identifying and solving construction sequence problems in advance of construction. This study performs an intensive literature review to discuss the problems encountered in schedule delay analysis and the possibility of using BIM as a tool in developing a BIM-based approach for schedule delay analysis. This study believes that most of the identified problems can be dealt with by the BIM technique. Research results could serve as a foundation for developing new approaches to resolving schedule delay disputes.
2017-01-30
dynamic structural time-history response analysis of flexible approach walls founded on clustered pile groups using Impact_Deck. In preparation, ERDC…research (Ebeling et al. 2012) has developed simplified analysis procedures for flexible approach wall systems founded on clustered groups of vertical…history response analysis of flexible approach walls founded on clustered pile groups using Impact_Deck. In preparation, ERDC/ITL TR-16-X. Vicksburg, MS
ERIC Educational Resources Information Center
Tutlys, Vidmantas; Spöttl, Georg
2017-01-01
Purpose: This paper aims to explore methodological and institutional challenges in the application of the work-process analysis approach to the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on the review of scientific literature and the analysis of…
A Systems Analysis Role Play Case: We Sell Stuff, Inc.
ERIC Educational Resources Information Center
Mitri, Michel; Cole, Carey
2007-01-01
Most systems development projects incorporate some sort of life cycle approach in their development. Whether the development methodology involves a traditional life cycle, prototyping, rapid application development, or some other approach, the first step usually involves a system investigation, which includes problem identification, feasibility…
Problem analysis: application in the development of market strategies for health care organizations.
Martin, J
1988-03-01
The problem analysis technique is an approach to understanding salient customer needs that is especially appropriate under complex market conditions. The author demonstrates the use of the approach in segmenting markets and conducting competitive analysis for positioning strategy decisions in health care.
Indic, Premananda; Bloch-Salisbury, Elisabeth; Bednarek, Frank; Brown, Emery N; Paydarfar, David; Barbieri, Riccardo
2011-07-01
Cardio-respiratory interactions are weak at the earliest stages of human development, suggesting that assessment of their presence and integrity may be an important indicator of development in infants. Despite the valuable research devoted to infant development, there is still a need for specifically targeted standards and methods to assess cardiopulmonary functions in the early stages of life. We present a new methodological framework for the analysis of cardiovascular variables in preterm infants. Our approach is based on a set of mathematical tools that have been successful in quantifying important cardiovascular control mechanisms in adult humans, here specifically adapted to reflect the physiology of the developing cardiovascular system. We applied our methodology in a study of cardio-respiratory responses for 11 preterm infants. We quantified cardio-respiratory interactions using specifically tailored multivariate autoregressive analysis and calculated the coherence as well as gain using causal approaches. The significance of the interactions in each subject was determined by surrogate data analysis. The method was tested in control conditions as well as in two different experimental conditions: with and without use of mild mechanosensory intervention. Our multivariate analysis revealed a significantly higher coherence, as confirmed by surrogate data analysis, in the frequency range associated with eupneic breathing compared to the other ranges. Our analysis validates the models behind our new approaches, and our results confirm the presence of cardio-respiratory coupling in early stages of development, particularly during periods of mild mechanosensory intervention, thus encouraging further application of our approach. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
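The paper's analysis rests on causal multivariate autoregressive modeling with surrogate-data testing; as a simplified, non-causal stand-in, the sketch below computes ordinary magnitude-squared coherence between synthetic respiration and heart-rate signals in an assumed eupneic breathing band (all signal parameters are invented).

```python
# Simplified stand-in for the paper's causal MVAR analysis: ordinary
# magnitude-squared coherence between heart-rate and respiration signals.
import numpy as np
from scipy.signal import coherence

fs = 4.0                                   # Hz, assumed resampling rate
t = np.arange(0, 300, 1 / fs)
resp = np.sin(2 * np.pi * 0.6 * t)         # breathing component near 0.6 Hz
hr = 0.5 * np.sin(2 * np.pi * 0.6 * t + 0.8) + 0.3 * np.random.randn(t.size)

f, Cxy = coherence(resp, hr, fs=fs, nperseg=256)
band = (f > 0.4) & (f < 0.8)               # assumed eupneic breathing band
print(f"mean coherence in breathing band: {Cxy[band].mean():.2f}")
```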
ERIC Educational Resources Information Center
Rhode, William E.; And Others
Basic cost estimates for selected instructional media are tabled in this document, Part II (Appendix III) of the report "Analysis and Approach to the Development of an Advanced Multimedia Instructional System" by William E. Rhode and others. Learning materials production costs are given for motion pictures, still visuals, videotapes, live…
Object-oriented productivity metrics
NASA Technical Reports Server (NTRS)
Connell, John L.; Eller, Nancy
1992-01-01
Software productivity metrics are useful for sizing and costing proposed software and for measuring development productivity. Estimating and measuring source lines of code (SLOC) has proven to be a bad idea because it encourages writing more lines of code and using lower level languages. Function Point Analysis is an improved software metric system, but it is not compatible with newer rapid prototyping and object-oriented approaches to software development. A process is presented here for counting object-oriented effort points, based on a preliminary object-oriented analysis. It is proposed that this approach is compatible with object-oriented analysis, design, programming, and rapid prototyping. Statistics gathered on actual projects are presented to validate the approach.
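A minimal sketch of what counting effort points from a preliminary object-oriented analysis might look like; the counted class attributes and the weights are invented for illustration and are not the paper's calibrated values.

```python
# Hypothetical illustration of "object-oriented effort points" counted from
# a preliminary OO analysis; the weights here are assumptions, not the paper's.
from dataclasses import dataclass

@dataclass
class ClassSpec:
    name: str
    attributes: int
    methods: int
    collaborations: int   # associations with other classes

WEIGHTS = {"attributes": 1, "methods": 3, "collaborations": 2}  # assumed

def effort_points(classes):
    return sum(c.attributes * WEIGHTS["attributes"]
               + c.methods * WEIGHTS["methods"]
               + c.collaborations * WEIGHTS["collaborations"]
               for c in classes)

model = [ClassSpec("Order", 5, 7, 3), ClassSpec("Customer", 8, 4, 2)]
print(f"estimated effort points: {effort_points(model)}")
```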
IDHEAS – A NEW APPROACH FOR HUMAN RELIABILITY ANALYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
G. W. Parry; J. A. Forester; V. N. Dang
2013-09-01
This paper describes a method, IDHEAS (Integrated Decision-Tree Human Event Analysis System), that has been developed jointly by the US NRC and EPRI as an improved approach to Human Reliability Analysis (HRA) that is based on an understanding of the cognitive mechanisms and performance influencing factors (PIFs) that affect operator responses. The paper describes the various elements of the method, namely the performance of a detailed cognitive task analysis that is documented in a crew response tree (CRT), the development of the associated time-line to identify the critical tasks, i.e., those whose failure results in a human failure event (HFE), and an approach to quantification that is based on explanations of why the HFE might occur.
NASA Astrophysics Data System (ADS)
Erduran, Sibel; Simon, Shirley; Osborne, Jonathan
2004-11-01
This paper reports some methodological approaches to the analysis of argumentation discourse developed as part of the two-and-a-half-year project titled "Enhancing the Quality of Argument in School Science" supported by the Economic and Social Research Council in the United Kingdom. In this project researchers collaborated with middle-school science teachers to develop models of instructional activities in an effort to make argumentation a component of instruction. We begin the paper with a brief theoretical justification for why we consider argumentation to be of significance to science education. We then contextualize the use of Toulmin's Argument Pattern in the study of argumentation discourse and provide a justification for the methodological outcomes our approach generates. We illustrate how our work refines and develops research methodologies in argumentation analysis. In particular, we present two methodological approaches to the analysis of argumentation in whole-class as well as small-group student discussions. For each approach, we illustrate our coding scheme and some results as well as how our methodological approach has enabled our inquiry into the quality of argumentation in the classroom. We conclude with some implications for future research in argumentation in science education.
Flash Study Analysis and the Music Learning Profiles Project
ERIC Educational Resources Information Center
Cremata, Radio; Pignato, Joseph; Powell, Bryan; Smith, Gareth Dylan
2016-01-01
This paper introduces the Music Learning Profiles Project, and its methodological approach, flash study analysis. Flash study analysis is a method that draws heavily on extant qualitative approaches to education research, to develop broad understandings of music learning in diverse contexts. The Music Learning Profiles Project (MLPP) is an…
Beck, Marcus W.; Vondracek, Bruce C.; Hatch, Lorin K.; Vinje, Jason
2013-01-01
Lake resources can be negatively affected by environmental stressors originating from multiple sources and different spatial scales. Shoreline development, in particular, can negatively affect lake resources through decline in habitat quality, physical disturbance, and impacts on fisheries. The development of remote sensing techniques that efficiently characterize shoreline development in a regional context could greatly improve management approaches for protecting and restoring lake resources. The goal of this study was to develop an approach using high-resolution aerial photographs to quantify and assess docks as indicators of shoreline development. First, we describe a dock analysis workflow that can be used to quantify the spatial extent of docks using aerial images. Our approach incorporates pixel-based classifiers with object-based techniques to effectively analyze high-resolution digital imagery. Second, we apply the analysis workflow to quantify docks for 4261 lakes managed by the Minnesota Department of Natural Resources. Overall accuracy of the analysis results was 98.4% (87.7% based on ) after manual post-processing. The analysis workflow was also 74% more efficient than the time required for manual digitization of docks. These analyses have immediate relevance for resource planning in Minnesota, whereas the dock analysis workflow could be used to quantify shoreline development in other regions with comparable imagery. These data can also be used to better understand the effects of shoreline development on aquatic resources and to evaluate the effects of shoreline development relative to other stressors.
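A toy sketch of the pixel-then-object idea on synthetic data: a pixel-based rule flags candidate pixels, and an object-based step groups them and filters by size. The thresholds and the assumption that the lake area is already masked are illustrative, not the authors' workflow.

```python
# Simplified sketch of a pixel-then-object pipeline (not the authors' exact
# workflow): classify bright pixels, then keep dock-sized connected objects.
import numpy as np
from scipy import ndimage

image = np.random.rand(200, 200)           # stand-in for an aerial photo band
water_mask = np.ones_like(image, bool)     # assume the lake area is known

candidate = (image > 0.95) & water_mask    # pixel-based step: bright pixels
labels, n = ndimage.label(candidate)       # object-based step: group pixels
sizes = ndimage.sum(candidate, labels, range(1, n + 1))

# Keep only objects in a plausible dock size range (thresholds assumed).
dock_ids = [i + 1 for i, s in enumerate(sizes) if 3 <= s <= 40]
print(f"{len(dock_ids)} dock-like objects of {n} candidates")
```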
NASA Astrophysics Data System (ADS)
Qi, Wei
2017-11-01
Cost-benefit analysis is commonly used for engineering planning and design problems in practice. However, previous cost-benefit based design flood estimation rests on a stationarity assumption. This study develops a non-stationary cost-benefit based design flood estimation approach. This approach integrates a non-stationary probability distribution function into cost-benefit analysis, so that the influence of non-stationarity on expected total cost (including flood damage and construction costs) and on design flood estimation can be quantified. To facilitate design flood selection, a 'Risk-Cost' analysis approach is developed, which reveals the nexus of extreme flood risk, expected total cost, and design life periods. Two basins, with 54-year and 104-year flood records respectively, are utilized to illustrate the application. It is found that the developed approach can effectively reveal changes of expected total cost and extreme floods in different design life periods. In addition, trade-offs are found between extreme flood risk and expected total cost, which reflect increases in cost to mitigate risk. Compared with stationary approaches, which generate only one expected total cost curve and therefore only one design flood estimate, the proposed approach generates design flood estimation intervals, and the 'Risk-Cost' approach selects a design flood value from these intervals based on the trade-offs between extreme flood risk and expected total cost. This study provides a new approach towards a better understanding of the influence of non-stationarity on expected total cost and design floods, and could be beneficial to cost-benefit based non-stationary design flood estimation across the world.
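The sketch below illustrates the core calculation under stated assumptions: a GEV flood distribution whose location parameter drifts over the design life, a linear construction-cost model, and a fixed damage per exceedance, all invented. Minimizing expected total cost over candidate design floods mirrors the cost-benefit step; the paper's 'Risk-Cost' trade-off analysis is not reproduced.

```python
# Conceptual sketch (all numbers assumed): expected total cost of a design
# flood under a non-stationary GEV whose location parameter drifts over time.
import numpy as np
from scipy.stats import genextreme

def expected_total_cost(design_flood, years=50, trend=0.5):
    construction = 2.0 * design_flood                 # assumed cost model
    damage_per_event = 100.0                          # assumed flood damage
    expected_damage = 0.0
    for year in range(years):
        loc = 50.0 + trend * year                     # non-stationary location
        p_exceed = genextreme.sf(design_flood, c=-0.1, loc=loc, scale=10.0)
        expected_damage += p_exceed * damage_per_event
    return construction + expected_damage

candidates = np.linspace(60, 160, 101)
costs = [expected_total_cost(q) for q in candidates]
best = candidates[int(np.argmin(costs))]
print(f"cost-minimizing design flood: {best:.1f} (assumed units)")
```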
Moret-Bonillo, Vicente; Alvarez-Estévez, Diego; Fernández-Leal, Angel; Hernández-Pereira, Elena
2014-01-01
This work deals with the development of an intelligent approach for clinical decision making in the diagnosis of the Sleep Apnea/Hypopnea Syndrome, SAHS, from the analysis of respiratory signals and oxygen saturation in arterial blood, SaO2. In order to accomplish the task the proposed approach makes use of different artificial intelligence techniques and reasoning processes being able to deal with imprecise data. These reasoning processes are based on fuzzy logic and on temporal analysis of the information. The developed approach also takes into account the possibility of artifacts in the monitored signals. Detection and characterization of signal artifacts allows detection of false positives. Identification of relevant diagnostic patterns and temporal correlation of events is performed through the implementation of temporal constraints. PMID:25035712
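As a flavor of the fuzzy-reasoning component, here is a minimal sketch that grades a candidate respiratory event from airflow reduction and SaO2 desaturation; the membership shapes and cutoffs are assumptions rather than the authors' rule base, and the temporal-correlation and artifact-detection stages are omitted.

```python
# Minimal sketch of fuzzy event scoring (membership shapes assumed, not the
# authors' rule base): grade an event by airflow drop and SaO2 desaturation.
def ramp(x, lo, hi):
    """Piecewise-linear membership rising from 0 at lo to 1 at hi."""
    return max(0.0, min(1.0, (x - lo) / (hi - lo)))

def apnea_hypopnea_score(airflow_reduction_pct, sao2_drop_pct):
    mu_flow = ramp(airflow_reduction_pct, 30, 90)   # 30-90% flow reduction
    mu_desat = ramp(sao2_drop_pct, 2, 4)            # 2-4% desaturation
    # Fuzzy AND taken as the minimum of the two memberships.
    return min(mu_flow, mu_desat)

print(apnea_hypopnea_score(airflow_reduction_pct=80, sao2_drop_pct=3.5))
```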
Interdisciplinary Investigations in Support of Project DI-MOD
NASA Technical Reports Server (NTRS)
Starks, Scott A. (Principal Investigator)
1996-01-01
Various concepts from time series analysis are used as the basis for the development of algorithms to assist in the analysis and interpretation of remote sensed imagery. An approach to trend detection that is based upon the fractal analysis of power spectrum estimates is presented. Additionally, research was conducted toward the development of a software architecture to support processing tasks associated with databases housing a variety of data. An algorithmic approach which provides for the automation of the state monitoring process is presented.
[Causal analysis approaches in epidemiology].
Dumas, O; Siroux, V; Le Moual, N; Varraso, R
2014-02-01
Epidemiological research is mostly based on observational studies. Whether such studies can provide evidence of causation remains discussed. Several causal analysis methods have been developed in epidemiology. This paper aims at presenting an overview of these methods: graphical models, path analysis and its extensions, and models based on the counterfactual approach, with a special emphasis on marginal structural models. Graphical approaches have been developed to allow synthetic representations of supposed causal relationships in a given problem. They serve as qualitative support in the study of causal relationships. The sufficient-component cause model has been developed to deal with the issue of multicausality raised by the emergence of chronic multifactorial diseases. Directed acyclic graphs are mostly used as a visual tool to identify possible confounding sources in a study. Structural equations models, the main extension of path analysis, combine a system of equations and a path diagram, representing a set of possible causal relationships. They allow quantifying direct and indirect effects in a general model in which several relationships can be tested simultaneously. Dynamic path analysis further takes into account the role of time. The counterfactual approach defines causality by comparing the observed event and the counterfactual event (the event that would have been observed if, contrary to the fact, the subject had received a different exposure than the one he actually received). This theoretical approach has shown limits of traditional methods to address some causality questions. In particular, in longitudinal studies, when there is time-varying confounding, classical methods (regressions) may be biased. Marginal structural models have been developed to address this issue. In conclusion, "causal models", though they were developed partly independently, are based on equivalent logical foundations. A crucial step in the application of these models is the formulation of causal hypotheses, which will be a basis for all methodological choices. Beyond this step, statistical analysis tools recently developed offer new possibilities to delineate complex relationships, in particular in life course epidemiology. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
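To ground the marginal structural model idea, the sketch below simulates a single-time-point confounded treatment and recovers the marginal effect with inverse probability of treatment weighting; real applications target time-varying confounding, which this toy example does not include.

```python
# Toy illustration of a marginal structural model fitted with inverse
# probability of treatment weighting (simulated data, single time point).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5000
confounder = rng.normal(size=n)
treated = rng.random(n) < 1 / (1 + np.exp(-confounder))      # confounded
outcome = 1.0 * treated + 2.0 * confounder + rng.normal(size=n)

# Propensity scores, then stabilized inverse-probability weights.
ps = LogisticRegression().fit(confounder.reshape(-1, 1), treated)
p = ps.predict_proba(confounder.reshape(-1, 1))[:, 1]
w = np.where(treated, treated.mean() / p, (1 - treated.mean()) / (1 - p))

# Weighted difference in means estimates the marginal causal effect (~1.0).
effect = (np.average(outcome[treated], weights=w[treated])
          - np.average(outcome[~treated], weights=w[~treated]))
print(f"IPTW-estimated effect: {effect:.2f}")
```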
ERIC Educational Resources Information Center
Besson, Ugo; Borghi, Lidia; De Ambrosis, Anna; Mascheretti, Paolo
2010-01-01
We have developed a teaching-learning sequence (TLS) on friction based on a preliminary study involving three dimensions: an analysis of didactic research on the topic, an overview of usual approaches, and a critical analysis of the subject, considered also in its historical development. We found that mostly the usual presentations do not take…
Employing Simulation to Evaluate Designs: The APEX Approach
NASA Technical Reports Server (NTRS)
Freed, Michael A.; Shafto, Michael G.; Remington, Roger W.; Null, Cynthia H. (Technical Monitor)
1998-01-01
The key innovations of APEX are its integrated approaches to task analysis, procedure definition, and intelligent, resource-constrained multi-tasking. This paper presents a step-by-step description of how APEX is used, from scenario development through trace analysis.
NASA Technical Reports Server (NTRS)
Buffalano, C.; Fogleman, S.; Gielecki, M.
1976-01-01
A methodology is outlined which can be used to estimate the costs of research and development projects. The approach uses the Delphi technique, a method developed by the Rand Corporation for systematically eliciting and evaluating group judgments in an objective manner. The use of the Delphi technique allows for the integration of expert opinion into the cost-estimating process in a consistent and rigorous fashion. This approach can also signal potential cost-problem areas, a result that can be a useful tool in planning additional cost analysis or in estimating contingency funds. A Monte Carlo approach is also examined.
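A small sketch of how Delphi-elicited ranges and a Monte Carlo step can combine (all estimates invented): each work element's consensus low/most-likely/high costs are sampled as triangular distributions and summed, and the spread between percentiles suggests contingency needs.

```python
# Sketch of the combined idea (made-up numbers): Delphi-style expert ranges
# per work element, aggregated by Monte Carlo into a program cost distribution.
import numpy as np

rng = np.random.default_rng(42)

# Each element: consensus (low, most likely, high) cost in $M from experts.
delphi_estimates = {"structure": (4, 6, 9), "avionics": (10, 14, 22),
                    "integration": (3, 5, 8)}

N = 100_000
total = np.zeros(N)
for low, mode, high in delphi_estimates.values():
    total += rng.triangular(low, mode, high, size=N)

p50, p90 = np.percentile(total, [50, 90])
print(f"median cost ${p50:.1f}M; 90th percentile ${p90:.1f}M "
      f"(the gap suggests a contingency allowance)")
```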
The Design of Curriculum Development Based on Entrepreneurship through Balanced Scorecard Approach
ERIC Educational Resources Information Center
Hidayat, Muhammad; Musa, Chalid Imran; Haerani, Siti; Sudirman, Indrianti
2015-01-01
This research is intended to develop curriculum based on entrepreneurship through balanced scorecard approach at the School of Business or "Sekolah Tinggi Ilmu Ekonomi" (STIE) Nobel Indonesia. In order to develop the curriculum, a need analysis in terms of curriculum development that involves all stakeholders at STIE Nobel in Indonesia…
ERIC Educational Resources Information Center
Chochard, Yves; Davoine, Eric
2011-01-01
In this article, we present the utility analysis approach as an alternative and promising approach to measure the return on investment in managerial training programs. This approach, linking economic value with competencies developed by trainees, enables researchers and decision-makers to compare the return on investment from different programs in…
Discourse Analysis and Development of English Listening for Non-English Majors in China
ERIC Educational Resources Information Center
Ji, Yinxiu
2015-01-01
Traditional approach of listening teaching mainly focuses on the sentence level and regards the listening process in a passive and static way. To compensate for this deficiency, a new listening approach, that is, discourse-oriented approach has been introduced into the listening classroom. Although discourse analysis is a comparatively new field…
We show that a conditional probability analysis that utilizes a stressor-response model based on a logistic regression provides a useful approach for developing candidate water quality criteria from empirical data. The critical step in this approach is transforming the response ...
NASA Astrophysics Data System (ADS)
Argyropoulou, Evangelia
2015-04-01
The current study focused on the seafloor morphology of the North Aegean Basin in Greece, through Object Based Image Analysis (OBIA) using a Digital Elevation Model. The goal was the automatic extraction of morphologic and morphotectonic features, resulting in fault surface extraction. An Object Based Image Analysis approach was developed based on the bathymetric data, and the extracted features, based on morphological criteria, were compared with the corresponding landforms derived through tectonic analysis. A digital elevation model of 150 meters spatial resolution was used. At first, slope, profile curvature, and percentile were extracted from this bathymetry grid. The OBIA approach was developed within the eCognition environment. Four segmentation levels were created, with "level 4" as the target. At level 4, the final classes of geomorphological features were classified: discontinuities, fault-like features, and fault surfaces. On previous levels, additional landforms were also classified, such as the continental platform and continental slope. The results of the developed approach were evaluated by two methods. First, classification stability measures were computed within eCognition. Then, qualitative and quantitative comparison of the results took place against a reference tectonic map that had been created manually based on the analysis of seismic profiles. The results of this comparison were satisfactory, confirming the validity of the developed OBIA approach.
Recent Advances in the Analysis of Spiral Bevel Gears
NASA Technical Reports Server (NTRS)
Handschuh, Robert F.
1997-01-01
Recent progress in the analysis of spiral bevel gears is reviewed. The foundation of this work relies on the description of the gear geometry of face-milled spiral bevel gears via the approach developed by Litvin. This methodology was extended by combining the basic gear design data with the manufactured surfaces using a differential geometry approach, and provides the data necessary for assembling three-dimensional finite element models. The finite element models have been utilized to conduct thermal and structural analysis of the gear system. Examples of the methods developed for thermal and structural/contact analysis are presented.
Model prototype utilization in the analysis of fault tolerant control and data processing systems
NASA Astrophysics Data System (ADS)
Kovalev, I. V.; Tsarev, R. Yu; Gruzenkin, D. V.; Prokopenko, A. V.; Knyazkov, A. N.; Laptenok, V. D.
2016-04-01
A procedure for assessing the profit of implementing a control and data processing system is presented. The rationale for creating and analyzing a model prototype follows from the approach of providing fault tolerance through the inclusion of structural and software redundancy. The developed procedure allows finding the best ratio between the cost of developing and analyzing the model prototype and the earnings from its utilization and the information produced. The suggested approach is illustrated by a model example of profit assessment and analysis for a control and data processing system.
Commercial transport aircraft composite structures
NASA Technical Reports Server (NTRS)
Mccarty, J. E.
1983-01-01
The role that analysis plays in the development, production, and substantiation of aircraft structures is discussed. The types, elements, and applications of analysis that are used and needed; the current application of analysis methods to commercial aircraft advanced composite structures, along with a projection of future needs; and some personal thoughts on analysis development goals and the elements of an approach to analysis development are discussed.
ERIC Educational Resources Information Center
Galey, Sarah; Youngs, Peter
2014-01-01
Scholars have developed a wide range of theories to explain both stability and change in policy subsystems. In recent years, a burgeoning literature has emerged that focuses on the application of network analysis in policy research, more formally known as Policy Network Analysis (PNA). This approach, while still developing, has great potential as…
ERIC Educational Resources Information Center
Poveda, Alexander Cotte
2012-01-01
This paper develops an index to evaluate the level of effectiveness of the control of violence based on the data envelopment analysis approach. The index is used to examine the grade of effectiveness of the control of violence at the level of Colombian departments between 1993 and 2007. Comparing the results across Colombian departments, we find…
Relative risk analysis of the use of radiation-emitting medical devices: A preliminary application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, E.D.
This report describes the development of a risk analysis approach for evaluating the use of radiation-emitting medical devices. This effort was performed by Lawrence Livermore National Laboratory for the US Nuclear Regulatory Commission (NRC). The assessment approach has been applied to understand the risks in using the Gamma Knife, a gamma irradiation therapy device. This effort represents an initial step to evaluate the potential role of risk analysis for developing regulations and quality assurance requirements in the use of nuclear medical devices. The risk approach identifies and assesses the most likely risk contributors and their relative importance for the medical system. The approach uses expert screening techniques and relative risk profiling to incorporate the type, quality, and quantity of data available and to present results in an easily understood form.
1978-10-20
Intelligence Preparation of the Battlefield (IPB) - Phase A: An Automated Approach to Terrain and Mobility Corridor Analysis. Prepared for the Battlefield Systems Integration…series of snapshots developed for Option A. The situation snapshots would be developed in like manner for each option, and stored in an
Cross-Sectional Time Series Designs: A General Transformation Approach.
ERIC Educational Resources Information Center
Velicer, Wayne F.; McDonald, Roderick P.
1991-01-01
The general transformation approach to time series analysis is extended to the analysis of multiple unit data by the development of a patterned transformation matrix. The procedure includes alternatives for special cases and requires only minor revisions in existing computer software. (SLD)
A 21st Century Collaborative Policy Development and Implementation Approach: A Discourse Analysis
ERIC Educational Resources Information Center
Nyoni, J.
2012-01-01
The article used the Unisa Framework for the implementation of a team approach to curriculum and learning development to explore and analyse the views and experiences of academic lecturers and curriculum and learning development experts on the conceptualisation and development of the said framework and its subsequent implementation. I used a…
ERIC Educational Resources Information Center
Kusumaningrum, Indrati; Hidayat, Hendra; Ganefri; Anori, Sartika; Dewy, Mega Silfia
2016-01-01
This article describes the development of a business plan by using a production-based learning approach. In addition, this development also aims to maximize learning outcomes in vocational education. Preliminary analysis of curriculum and learning and the needs of the market and society becomes the basis for business plan development. To produce a…
Face recognition using an enhanced independent component analysis approach.
Kwak, Keun-Chang; Pedrycz, Witold
2007-03-01
This paper is concerned with an enhanced independent component analysis (ICA) and its application to face recognition. Typically, face representations obtained by ICA involve unsupervised learning and high-order statistics. In this paper, we develop an enhancement of the generic ICA by augmenting this method by the Fisher linear discriminant analysis (LDA); hence, its abbreviation, FICA. The FICA is systematically developed and presented along with its underlying architecture. A comparative analysis explores four distance metrics, as well as classification with support vector machines (SVMs). We demonstrate that the FICA approach leads to the formation of well-separated classes in low-dimension subspace and is endowed with a great deal of insensitivity to large variation in illumination and facial expression. The comprehensive experiments are completed for the facial-recognition technology (FERET) face database; a comparative analysis demonstrates that FICA comes with improved classification rates when compared with some other conventional approaches such as eigenface, fisherface, and the ICA itself.
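A rough scikit-learn analogue of the FICA pipeline, ICA features followed by Fisher LDA, run on the Olivetti faces rather than FERET; the component count and dataset are assumptions, and the paper's distance-metric and SVM comparisons are omitted. Note that fetch_olivetti_faces downloads the dataset on first use.

```python
# Rough analogue of the FICA idea: ICA feature extraction followed by Fisher
# LDA (illustrative; not the authors' exact architecture or data).
from sklearn.datasets import fetch_olivetti_faces
from sklearn.decomposition import FastICA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

faces = fetch_olivetti_faces()                       # 400 faces, 40 subjects
X_tr, X_te, y_tr, y_te = train_test_split(
    faces.data, faces.target, stratify=faces.target, random_state=0)

model = make_pipeline(FastICA(n_components=60, random_state=0),
                      LinearDiscriminantAnalysis())
model.fit(X_tr, y_tr)
print(f"test accuracy: {model.score(X_te, y_te):.2f}")
```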
Environmental management strategy: four forces analysis.
Doyle, Martin W; Von Windheim, Jesko
2015-01-01
We develop an analytical approach for more systematically analyzing environmental management problems in order to develop strategic plans. This approach can be deployed by agencies, non-profit organizations, corporations, or other organizations and institutions tasked with improving environmental quality. The analysis relies on assessing the underlying natural processes followed by articulation of the relevant societal forces causing environmental change: (1) science and technology, (2) governance, (3) markets and the economy, and (4) public behavior. The four forces analysis is then used to strategize which types of actions might be most effective at influencing environmental quality. Such strategy has been under-used and under-valued in environmental management outside of the corporate sector, and we suggest that this four forces analysis is a useful analytic to begin developing such strategy.
Reduction method with system analysis for multiobjective optimization-based design
NASA Technical Reports Server (NTRS)
Azarm, S.; Sobieszczanski-Sobieski, J.
1993-01-01
An approach for reducing the number of variables and constraints, which is combined with System Analysis Equations (SAE), for multiobjective optimization-based design is presented. In order to develop a simplified analysis model, the SAE is computed outside an optimization loop and then approximated for use by an operator. Two examples are presented to demonstrate the approach.
Stirling engine - Approach for long-term durability assessment
NASA Technical Reports Server (NTRS)
Tong, Michael T.; Bartolotta, Paul A.; Halford, Gary R.; Freed, Alan D.
1992-01-01
The approach employed by NASA Lewis for the long-term durability assessment of the Stirling engine hot-section components is summarized. The approach consists of: preliminary structural assessment; development of a viscoplastic constitutive model to accurately determine material behavior under high-temperature thermomechanical loads; an experimental program to characterize material constants for the viscoplastic constitutive model; finite-element thermal analysis and structural analysis using a viscoplastic constitutive model to obtain stress/strain/temperature at the critical location of the hot-section components for life assessment; and development of a life prediction model applicable for long-term durability assessment at high temperatures. The approach should aid in the provision of long-term structural durability and reliability of Stirling engines.
An evolutionary approach to the group analysis of global geophysical data
NASA Technical Reports Server (NTRS)
Vette, J. I.
1979-01-01
The coordinated data analysis that developed within the International Magnetospheric Study is presented. A tracing of its development, along with various activities taking place within this framework, is reported.
Telford, Mark; Senior, Emma
2017-06-08
This article describes the experiences of undergraduate healthcare students taking a module adopting a 'flipped classroom' approach. Evidence suggests that the flipped classroom as a pedagogical tool has the potential to enhance student learning and to improve healthcare practice. This innovative approach was implemented within a healthcare curriculum, in a module on public health delivered at the beginning of year two of a 3-year programme. The focus of the evaluation study was on the e-learning resources used in the module and the student experiences of these, with a specific aim of evaluating this element of the flipped classroom approach. A mixed-methods approach was adopted and data collected using questionnaires, which were distributed across a whole cohort, and a focus group involving ten participants. Statistical analysis of the data showed the positive student experience of engaging with e-learning. The thematic analysis identified two key themes: factors influencing a positive learning experience, and the challenges of developing e-learning within a flipped classroom approach. The study provides guidance for further developments and improvements when developing e-learning as part of the flipped classroom approach.
Development of Phased-Array Ultrasonic Testing Acceptability Criteria : (Phase II)
DOT National Transportation Integrated Search
2014-10-01
The preliminary technical approach and scan plans developed during phase I of this research were implemented in testing four butt-weld specimens. The ray path analysis carried out to develop the scan plans and the preliminary data analysis indicated t...
A Disciplined Architectural Approach to Scaling Data Analysis for Massive, Scientific Data
NASA Astrophysics Data System (ADS)
Crichton, D. J.; Braverman, A. J.; Cinquini, L.; Turmon, M.; Lee, H.; Law, E.
2014-12-01
Data collections across remote sensing and ground-based instruments in astronomy, Earth science, and planetary science are outpacing scientists' ability to analyze them. Furthermore, the distribution, structure, and heterogeneity of the measurements themselves pose challenges that limit the scalability of data analysis using traditional approaches. Methods for developing science data processing pipelines, distribution of scientific datasets, and performing analysis will require innovative approaches that integrate cyber-infrastructure, algorithms, and data into more systematic approaches that can more efficiently compute and reduce data, particularly distributed data. This requires the integration of computer science, machine learning, statistics and domain expertise to identify scalable architectures for data analysis. The size of data returned from Earth Science observing satellites and the magnitude of data from climate model output are predicted to grow into the tens of petabytes, challenging current data analysis paradigms. This same kind of growth is present in astronomy and planetary science data. One of the major challenges in data science and related disciplines is defining new approaches to scaling systems and analysis in order to increase scientific productivity and yield. Specific needs include: 1) identification of optimized system architectures for analyzing massive, distributed data sets; 2) algorithms for systematic analysis of massive data sets in distributed environments; and 3) the development of software infrastructures that are capable of performing massive, distributed data analysis across a comprehensive data science framework. NASA/JPL has begun an initiative in data science to address these challenges. Our goal is to evaluate how scientific productivity can be improved through optimized architectural topologies that identify how to deploy and manage the access, distribution, computation, and reduction of massive, distributed data, while managing the uncertainties of scientific conclusions derived from such capabilities. This talk will provide an overview of JPL's efforts in developing a comprehensive architectural approach to data science.
Analysis of Advanced Modular Power Systems (AMPS) for Deep Space Exploration
NASA Technical Reports Server (NTRS)
Oeftering, Richard; Soeder, James F.; Beach, Ray
2014-01-01
The Advanced Modular Power Systems (AMPS) project is developing a modular approach to spacecraft power systems for exploration beyond Earth orbit. AMPS is intended to meet the need of reducing the cost of design development, test and integration and also reducing the operational logistics cost of supporting exploration missions. AMPS seeks to establish modular power building blocks with standardized electrical, mechanical, thermal and data interfaces that can be applied across multiple exploration vehicles. The presentation discusses the results of a cost analysis that compares the cost of the modular approach against a traditional non-modular approach.
A new framework for comprehensive, robust, and efficient global sensitivity analysis: 2. Application
NASA Astrophysics Data System (ADS)
Razavi, Saman; Gupta, Hoshin V.
2016-01-01
Based on the theoretical framework for sensitivity analysis called "Variogram Analysis of Response Surfaces" (VARS), developed in the companion paper, we develop and implement a practical "star-based" sampling strategy (called STAR-VARS) for the application of VARS to real-world problems. We also develop a bootstrap approach to provide confidence level estimates for the VARS sensitivity metrics and to evaluate the reliability of inferred factor rankings. The effectiveness, efficiency, and robustness of STAR-VARS are demonstrated via two real-data hydrological case studies (a 5-parameter conceptual rainfall-runoff model and a 45-parameter land surface scheme hydrology model), and a comparison with the "derivative-based" Morris and "variance-based" Sobol approaches is provided. Our results show that STAR-VARS provides reliable and stable assessments of "global" sensitivity across the full range of scales in the factor space, while being 1-2 orders of magnitude more efficient than the Morris or Sobol approaches.
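The bootstrap step described above generalizes beyond VARS: given repeated samples of model output, any scalar sensitivity metric can be re-estimated on resampled data to obtain confidence intervals and to check the stability of inferred rankings. The sketch below illustrates that idea with a generic percentile bootstrap in Python; the metric and data are hypothetical stand-ins, not the STAR-VARS implementation.

```python
import numpy as np

def bootstrap_ci(samples, metric, n_boot=1000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for a scalar metric."""
    rng = np.random.default_rng(seed)
    n = len(samples)
    stats = np.array([metric(samples[rng.integers(0, n, size=n)])
                      for _ in range(n_boot)])
    lo, hi = np.quantile(stats, [alpha / 2, 1 - alpha / 2])
    return metric(samples), (lo, hi)

# Hypothetical stand-in metric: a variance-reduction sensitivity ratio
# computed from paired model outputs (y, y_with_factor_fixed).
outputs = np.random.default_rng(1).normal(size=(500, 2))
sens = lambda s: 1.0 - np.var(s[:, 1]) / np.var(s[:, 0])

estimate, (lo, hi) = bootstrap_ci(outputs, sens)
print(f"sensitivity = {estimate:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```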
Bogani, Patrizia; Spiriti, Maria Michela; Lazzarano, Stefano; Arcangeli, Annarosa; Buiatti, Marcello; Minunni, Maria
2011-11-01
The World Anti-Doping Agency fears the use of gene doping to enhance athletic performance. Thus, a bioanalytical approach based on end-point PCR for detecting markers of transgenesis traceability was developed. A few sequences from two different vectors were selected, using an animal model, and traced in different tissues and at different times. In particular, the enhanced green fluorescent protein gene and a construct-specific new marker were targeted in the analysis. To make the developed detection approach open to future routine doping analysis, matrices such as urine and tears, as well as blood, were also tested. This study will have impact in evaluating the traceability of vector transgenes for the detection of a gene doping event by non-invasive sampling.
The Portfolio Approach Developed to Underpin the Capital Investment Program Plan Review (CIPPR)
2014-11-06
Basinger, Director, DCI, CFD Scientific Letter. The portfolio approach developed to underpin the Capital Investment Program Plan Review (CIPPR). To better...prepare senior management for meetings about CIPPR in November 2014, this scientific letter has been prepared upon request [1] to clarify some of...Research and Analysis in support of CIPPR was to: 1. Provide scientific support to the development of a traceable and sustainable approach and process by
ERIC Educational Resources Information Center
Shen, Hao-Yu; Shen, Bo; Hardacre, Christopher
2013-01-01
A systematic approach to develop the teaching of instrumental analytical chemistry is discussed, as well as a conceptual framework for organizing and executing lectures and a laboratory course. Three main components are used in this course: theoretical knowledge developed in the classroom, simulations via a virtual laboratory, and practical…
Ebrahim, Shanil; Johnston, Bradley C; Akl, Elie A; Mustafa, Reem A; Sun, Xin; Walter, Stephen D; Heels-Ansdell, Diane; Alonso-Coello, Pablo; Guyatt, Gordon H
2014-05-01
We previously developed an approach to address the impact of missing participant data in meta-analyses of continuous variables in trials that used the same measurement instrument. We extend this approach to meta-analyses including trials that use different instruments to measure the same construct. We reviewed the available literature, conducted an iterative consultative process, and developed an approach involving a complete-case analysis complemented by sensitivity analyses that apply a series of increasingly stringent assumptions about results in patients with missing continuous outcome data. Our approach involves choosing the reference measurement instrument; converting scores from different instruments to the units of the reference instrument; developing four successively more stringent imputation strategies for addressing missing participant data; calculating a pooled mean difference for the complete-case analysis and imputation strategies; calculating the proportion of patients who experienced an important treatment effect; and judging the impact of the imputation strategies on the confidence in the estimate of effect. We applied our approach to an example systematic review of respiratory rehabilitation for chronic obstructive pulmonary disease. Our extended approach provides quantitative guidance for addressing missing participant data in systematic reviews of trials using different instruments to measure the same construct. Copyright © 2014 Elsevier Inc. All rights reserved.
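Two of the steps above, converting scores to the units of a reference instrument and pooling per-trial mean differences, can be illustrated with a short sketch. The linear rescaling by the ratio of standard deviations shown here is one common conversion choice and may differ from the authors' exact procedure; all numbers are invented.

```python
import numpy as np

def to_reference_units(md, sd_source, sd_reference):
    # One common conversion choice: linear rescaling by the ratio of
    # standard deviations (the paper's exact conversion may differ).
    return md * (sd_reference / sd_source)

def pooled_mean_difference(mds, variances):
    # Fixed-effect, inverse-variance pooling of per-trial mean differences.
    w = 1.0 / np.asarray(variances)
    pooled = np.sum(w * np.asarray(mds)) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return pooled, se

# Invented numbers: three rehabilitation trials, one measured on a
# different dyspnea scale and converted to the reference scale first.
mds = [to_reference_units(-0.8, sd_source=2.0, sd_reference=1.5), -0.5, -0.7]
variances = [0.04, 0.09, 0.05]
pooled, se = pooled_mean_difference(mds, variances)
print(f"pooled mean difference = {pooled:.2f} (SE {se:.2f})")
```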
NASA Technical Reports Server (NTRS)
Watters, H.; Steadman, J.
1976-01-01
A modular training approach for Spacelab payload crews is described. Representative missions are defined for training requirements analysis, training hardware, and simulations. Training times are projected for each experiment of each representative flight. A parametric analysis of the various flights defines resource requirements for a modular training facility at different flight frequencies. The modular approach is believed to be more flexible, time saving, and economical than previous single high fidelity trainer concepts. Block diagrams of training programs are shown.
Cost analysis in support of minimum energy standards for clothes washers and dryers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1979-02-02
The results of the cost analysis of energy conservation design options for laundry products are presented. The analysis was conducted using two approaches. The first is directed toward the development of industrial engineering cost estimates for each energy conservation option; this approach results in estimates of manufacturers' costs. The second approach is directed toward determining the market price differential of energy conservation features, and its results are shown. The market cost represents the cost to the consumer; it is the final cost, and therefore includes distribution costs as well as manufacturing costs.
An Evidential Reasoning-Based CREAM to Human Reliability Analysis in Maritime Accident Process.
Wu, Bing; Yan, Xinping; Wang, Yang; Soares, C Guedes
2017-10-01
This article proposes a modified cognitive reliability and error analysis method (CREAM) for estimating the human error probability in the maritime accident process on the basis of an evidential reasoning approach. This modified CREAM is developed to precisely quantify the linguistic variables of the common performance conditions and to overcome the problem of ignoring the uncertainty caused by incomplete information in the existing CREAM models. Moreover, this article views maritime accident development from the sequential perspective, where a scenario- and barrier-based framework is proposed to describe the maritime accident process. This evidential reasoning-based CREAM approach together with the proposed accident development framework are applied to human reliability analysis of a ship capsizing accident. It will facilitate subjective human reliability analysis in different engineering systems where uncertainty exists in practice. © 2017 Society for Risk Analysis.
Decoupled 1D/3D analysis of a hydraulic valve
NASA Astrophysics Data System (ADS)
Mehring, Carsten; Zopeya, Ashok; Latham, Matt; Ihde, Thomas; Massie, Dan
2014-10-01
Analysis approaches during product development of fluid valves and other aircraft fluid delivery components vary greatly depending on the development stage. Traditionally, empirical or simplistic one-dimensional tools are deployed during preliminary design, whereas detailed analyses such as CFD (Computational Fluid Dynamics) are used to refine a selected design during the detailed design stage. In recent years, combined 1D/3D co-simulation has been deployed specifically for system-level simulations requiring an increased level of analysis detail for one or more components. This paper presents a decoupled 1D/3D analysis approach in which 3D CFD analysis results are utilized to enhance the fidelity of a dynamic 1D model, in the context of an aircraft fuel valve.
Critical Factors Analysis for Offshore Software Development Success by Structural Equation Modeling
NASA Astrophysics Data System (ADS)
Wada, Yoshihisa; Tsuji, Hiroshi
To analyze the success/failure factors in offshore software development services using structural equation modeling, this paper proposes combining two approaches: domain-knowledge-based heuristic analysis and factor-analysis-based rational analysis. The former serves to generate and verify hypotheses about factors and causalities. The latter serves to verify factors introduced by theory and to build the model without heuristics. Applying the combined approach to questionnaire responses from skilled project managers, the paper finds that vendor properties have a stronger causal influence on success than software and project properties.
Utilization of the Building-Block Approach in Structural Mechanics Research
NASA Technical Reports Server (NTRS)
Rouse, Marshall; Jegley, Dawn C.; McGowan, David M.; Bush, Harold G.; Waters, W. Allen
2005-01-01
In the last 20 years NASA has worked in collaboration with industry to develop enabling technologies needed to make aircraft safer and more affordable, extend their lifetime, improve their reliability, better understand their behavior, and reduce their weight. To support these efforts, research programs starting with ideas and culminating in full-scale structural testing were conducted at the NASA Langley Research Center. Each program contained development efforts that (a) started with selecting the material system and manufacturing approach; (b) moved on to experimentation and analysis of small samples to characterize the system and quantify behavior in the presence of defects like damage and imperfections; (c) progressed on to examining larger structures to examine buckling behavior, combined loadings, and built-up structures; and (d) finally moved to complicated subcomponents and full-scale components. Each step along the way was supported by detailed analysis, including tool development, to prove that the behavior of these structures was well-understood and predictable. This approach for developing technology became known as the "building-block" approach. In the Advanced Composites Technology Program and the High Speed Research Program the building-block approach was used to develop a true understanding of the response of the structures involved through experimentation and analysis. The philosophy that if the structural response couldn't be accurately predicted, it wasn't really understood, was critical to the progression of these programs. To this end, analytical techniques including closed-form and finite elements were employed and experimentation used to verify assumptions at each step along the way. This paper presents a discussion of the utilization of the building-block approach described previously in structural mechanics research and development programs at NASA Langley Research Center. Specific examples that illustrate the use of this approach are included from recent research and development programs for both subsonic and supersonic transports.
The Dispositions for Culturally Responsive Pedagogy Scale
ERIC Educational Resources Information Center
Whitaker, Manya C.; Valtierra, Kristina Marie
2018-01-01
Purpose: The purpose of this study is to develop and validate the dispositions for culturally responsive pedagogy scale (DCRPS). Design/methodology/approach: Scale development consisted of a six-step process including item development, expert review, exploratory factor analysis, factor interpretation, confirmatory factor analysis and convergent…
A Deliberate Practice Approach to Teaching Phylogenetic Analysis
ERIC Educational Resources Information Center
Hobbs, F. Collin; Johnson, Daniel J.; Kearns, Katherine D.
2013-01-01
One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or "one-shot," in-class activities. Using a deliberate practice instructional approach, we…
Probabilistic Parameter Uncertainty Analysis of Single Input Single Output Control Systems
NASA Technical Reports Server (NTRS)
Smith, Brett A.; Kenny, Sean P.; Crespo, Luis G.
2005-01-01
The current standards for handling uncertainty in control systems use interval bounds for definition of the uncertain parameters. This approach gives no information about the likelihood of system performance, but simply gives the response bounds. When used in design, current methods such as μ-analysis can lead to overly conservative controller designs, because worst-case conditions are weighted equally with the most likely conditions. This research explores a unique approach for probabilistic analysis of control systems. Current reliability methods are examined, showing the strong areas of each in handling probability. A hybrid method is developed using these reliability tools for efficiently propagating probabilistic uncertainty through classical control analysis problems. The method is applied to classical response analysis as well as to analysis methods that explore the effects of the uncertain parameters on stability and performance metrics. The benefits of using this hybrid approach for calculating the means, variances, and cumulative distribution functions of responses are shown. Results of the probabilistic analysis of a missile pitch control system and a non-collocated mass-spring system show the added information provided by this hybrid analysis.
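The contrast drawn above between interval bounds and probabilistic analysis can be made concrete with a plain Monte Carlo sketch (not the hybrid reliability method the authors developed): parameter distributions are sampled and propagated through a classical second-order system, yielding likelihoods of performance rather than worst-case bounds. The distributions below are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Assumed parameter distributions for a mass-spring-damper plant:
m = rng.normal(1.0, 0.05, n)       # mass (kg)
k = rng.normal(400.0, 20.0, n)     # stiffness (N/m)
c = rng.normal(4.0, 0.4, n)        # damping (N*s/m)

wn = np.sqrt(k / m)                # natural frequency (rad/s)
zeta = c / (2.0 * np.sqrt(k * m))  # damping ratio

# Probabilistic statements instead of worst-case interval bounds:
print("P(damping ratio < 0.08):", np.mean(zeta < 0.08))
print("5th-95th percentiles of wn:", np.percentile(wn, [5, 95]))
```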
Tjiam, Irene M; Schout, Barbara M A; Hendrikx, Ad J M; Scherpbier, Albert J J M; Witjes, J Alfred; van Merriënboer, Jeroen J G
2012-01-01
Most studies of simulator-based surgical skills training have focused on the acquisition of psychomotor skills, but surgical procedures are complex tasks requiring both psychomotor and cognitive skills. As skills training is modelled on expert performance consisting partly of unconscious automatic processes that experts are not always able to explicate, simulator developers should collaborate with educational experts and physicians in developing efficient and effective training programmes. This article presents an approach to designing simulator-based skill training comprising cognitive task analysis integrated with instructional design according to the four-component/instructional design model. This theory-driven approach is illustrated by a description of how it was used in the development of simulator-based training for the nephrostomy procedure.
Whole-genome CNV analysis: advances in computational approaches.
Pirooznia, Mehdi; Goes, Fernando S; Zandi, Peter P
2015-01-01
Accumulating evidence indicates that DNA copy number variation (CNV) is likely to make a significant contribution to human diversity and also play an important role in disease susceptibility. Recent advances in genome sequencing technologies have enabled the characterization of a variety of genomic features, including CNVs. This has led to the development of several bioinformatics approaches to detect CNVs from next-generation sequencing data. Here, we review recent advances in CNV detection from whole genome sequencing. We discuss the informatics approaches and current computational tools that have been developed as well as their strengths and limitations. This review will assist researchers and analysts in choosing the most suitable tools for CNV analysis as well as provide suggestions for new directions in future development.
Object-oriented requirements analysis: A quick tour
NASA Technical Reports Server (NTRS)
Berard, Edward V.
1990-01-01
Of all the approaches to software development, an object-oriented approach appears to be both the most beneficial and the most popular. A description of the object-oriented approach is presented in the form of viewgraphs.
Criteria for Developing a Successful Privatization Project
1989-05-01
conceptualization and planning are required when pursuing privatization projects. In fact, privatization project proponents need to know how to...selection of projects for analysis, methods of acquiring information about these projects, and the analysis framework. Chapter IV includes the analysis. A...performed an analysis to determine common conceptual and creative approaches and lessons learned. This analysis was then used to develop criteria for
Kim, Seongho; Carruthers, Nicholas; Lee, Joohyoung; Chinni, Sreenivasa; Stemmer, Paul
2016-12-01
Stable isotope labeling by amino acids in cell culture (SILAC) is a practical and powerful approach for quantitative proteomic analysis. A key advantage of SILAC is the ability to detect the isotopically labeled peptides simultaneously in a single instrument run, guaranteeing relative quantitation for a large number of peptides without introducing variation caused by separate experiments. However, only a few approaches are available for assessing protein ratios, and none of the existing algorithms pays considerable attention to proteins having only one peptide hit. We introduce new quantitative approaches to SILAC protein-level summaries using classification-based methodologies, such as Gaussian mixture models fitted with EM algorithms or a Bayesian approach, as well as K-means clustering. In addition, a new approach is developed using a Gaussian mixture model and a stochastic, metaheuristic global optimization algorithm, particle swarm optimization (PSO), to avoid either premature convergence or getting stuck in a local optimum. Our simulation studies show that the newly developed PSO-based method performs best in terms of F1 score, and the proposed methods further demonstrate the ability to detect potential markers in real SILAC experimental data. The developed approach is applicable regardless of how many peptide hits a protein has, rescuing many proteins that would otherwise be removed. Furthermore, no additional correction for multiple comparisons is necessary for the developed methods, enabling direct interpretation of the analysis outcomes. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
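As a rough illustration of the classification-based idea above, the following sketch clusters simulated log2 SILAC ratios with a Gaussian mixture model (fitted by EM) and with K-means, separating unchanged proteins from up- and down-regulated ones. It uses scikit-learn and invented data, and omits the authors' PSO-based global optimization step.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
log_ratios = np.concatenate([
    rng.normal(0.0, 0.3, 900),   # unchanged proteins
    rng.normal(1.5, 0.4, 60),    # up-regulated
    rng.normal(-1.5, 0.4, 40),   # down-regulated
]).reshape(-1, 1)

# Gaussian mixture fitted by EM; each component is one regulation class.
gmm = GaussianMixture(n_components=3, random_state=0).fit(log_ratios)
labels_gmm = gmm.predict(log_ratios)

# K-means clustering of the same ratios for comparison.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(log_ratios)
labels_km = km.labels_

print("GMM component means:", np.sort(gmm.means_.ravel()))
print("K-means centers:    ", np.sort(km.cluster_centers_.ravel()))
```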
NASA Technical Reports Server (NTRS)
Ryan, R. S.; Bullock, T.; Holland, W. B.; Kross, D. A.; Kiefling, L. A.
1981-01-01
The achievement of an optimized design from the system standpoint under the low-cost, high-risk constraints of the present-day environment was analyzed. The Space Shuttle illustrates the requirement for an analysis approach that considers all major disciplines simultaneously (coupling between structures, control, propulsion, thermal, aeroelastic, and performance characteristics). The Space Shuttle and certain payloads, the Space Telescope and Spacelab, are examined. The requirements for system analysis approaches and criteria, including dynamic modeling requirements, test requirements, control requirements, and the resulting design verification approaches, are illustrated. A survey of the problem, potential approaches available as solutions, implications for future systems, and projected technology development areas are addressed.
SMARTe is being developed to give stakeholders information resources, analytical tools, communication strategies, and a decision analysis approach to be able to make better decisions regarding future uses of property. The development of the communication tools and decision analys...
On analyzing free-response data on location level
NASA Astrophysics Data System (ADS)
Bandos, Andriy I.; Obuchowski, Nancy A.
2017-03-01
Free-response ROC (FROC) data are typically collected when the primary question of interest focuses on the proportions of correct detection-localization of known targets and the frequencies of false-positive responses, which can be multiple per subject (image). These studies are particularly relevant for CAD and related applications. The fundamental tool of location-level FROC analysis is the FROC curve. Although there are many methods of FROC analysis, as we describe in this work, some of the standard and popular approaches, while important, are not suitable for analyzing specifically the location-level FROC performance summarized by the FROC curve. Analysis of the FROC curve, on the other hand, might not be straightforward. Recently, we developed an approach for location-level analysis of FROC data using the well-known tools for clustered ROC analysis. In the current work, based on previously developed concepts and using specific examples, we demonstrate the key reasons why location-level FROC performance cannot be fully addressed by the common approaches, and we illustrate the proposed solution. Specifically, we consider the two most salient FROC approaches, namely JAFROC and the area under the exponentially transformed FROC curve (AFE), and show that clearly superior FROC curves can have lower values for these indices. We describe the specific features that make these approaches inconsistent with FROC curves. This work illustrates some caveats of the common approaches to location-level FROC analysis and provides guidelines for the appropriate assessment or comparison of FROC systems.
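For readers unfamiliar with the FROC curve itself, its empirical operating points follow directly from the definitions: at each score threshold, the lesion-localization fraction (LLF) is the share of lesions correctly marked, and the non-lesion-localization rate (NLF) is the mean number of false marks per image. The sketch below computes these points from a tiny invented set of marks; it is illustrative only, not the authors' clustered-ROC method.

```python
# Each mark is (confidence score, whether it correctly localizes a lesion),
# pooled over all images of a hypothetical study.
marks = [(0.9, True), (0.8, False), (0.7, True), (0.6, False),
         (0.5, True), (0.4, False), (0.3, False)]
n_lesions, n_images = 4, 3   # invented totals

for t in sorted({s for s, _ in marks}, reverse=True):
    kept = [(s, hit) for s, hit in marks if s >= t]
    llf = sum(hit for _, hit in kept) / n_lesions       # lesions localized
    nlf = sum(not hit for _, hit in kept) / n_images    # false marks/image
    print(f"threshold {t:.1f}: LLF={llf:.2f}, NLF={nlf:.2f}")
```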
Developing Problem-Solving Skills through Retrosynthetic Analysis and Clickers in Organic Chemistry
ERIC Educational Resources Information Center
Flynn, Alison B.
2011-01-01
A unique approach to teaching and learning problem-solving and critical-thinking skills in the context of retrosynthetic analysis is described. In this approach, introductory organic chemistry students, who typically see only simple organic structures, undertook partial retrosynthetic analyses of real and complex synthetic targets. Multiple…
Conducting an integrated analysis to evaluate the societal and ecological consequences of environmental management actions requires decisions about data collection, theory development, modeling and valuation. Approaching these decisions in coordinated fashion necessitates a syste...
Qualitative Data Analysis for Health Services Research: Developing Taxonomy, Themes, and Theory
Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J
2007-01-01
Objective: To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. Data Sources and Design: We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. Principal Findings: We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Conclusions: Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines. PMID:17286625
Qualitative data analysis for health services research: developing taxonomy, themes, and theory.
Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J
2007-08-01
To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. DATA SOURCES AND DESIGN: We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines.
Development of Game-Like Simulations for Procedural Knowledge in Healthcare Education
ERIC Educational Resources Information Center
Torrente, Javier; Borro-Escribano, Blanca; Freire, Manuel; del Blanco, Ángel; Marchiori, Eugenio J.; Martinez-Ortiz, Iván; Moreno-Ger, Pablo; Fernández-Manjón, Baltasar
2014-01-01
We present EGDA, an educational game development approach focused on the teaching of procedural knowledge using a cost-effective approach. EGDA proposes four tasks: analysis, design, implementation, and quality assurance, which are subdivided into a total of 12 subtasks. One of the benefits of EGDA is that anyone can apply it to develop a game since…
Computerized Design and Analysis of Face-Milled, Uniform Tooth Height Spiral Bevel Gear Drives
NASA Technical Reports Server (NTRS)
Litvin, Faydor L.; Wang, Anngwo; Handschuh, R. F.
1996-01-01
Face-milled spiral bevel gears with uniform tooth height are considered. An approach is proposed for the design of such gears with low noise and a localized bearing contact. The approach is based on the mismatch of contacting surfaces and permits two types of bearing contact to be obtained, directed either longitudinally or across the surface. A Tooth Contact Analysis (TCA) computer program was developed. This analysis was used to determine the influence of misalignment on the meshing and contact of the spiral bevel gears. A numerical example that illustrates the developed theory is provided.
An Approach towards Ultrasound Kidney Cysts Detection using Vector Graphic Image Analysis
NASA Astrophysics Data System (ADS)
Mahmud, Wan Mahani Hafizah Wan; Supriyanto, Eko
2017-08-01
This study develops a new approach to the detection of kidney cysts in ultrasound images, for both single and multiple cysts. 50 single-cyst images and 25 multiple-cyst images were used to test the developed algorithm. The steps involved in developing this algorithm were vector graphic image formation and analysis, thresholding, binarization, filtering, and a roundness test. Performance evaluation on the 50 single-cyst images gave an accuracy of 92%, while for the 25 multiple-cyst images the accuracy was about 86.89%. This algorithm may be used in developing a computerized system, such as a computer-aided diagnosis system, to help medical experts in the diagnosis of kidney cysts.
NASA Technical Reports Server (NTRS)
Oden, J. Tinsley; Fly, Gerald W.; Mahadevan, L.
1987-01-01
A hybrid stress finite element method is developed for accurate stress and vibration analysis of problems in linear anisotropic elasticity. A modified form of the Hellinger-Reissner principle is formulated for dynamic analysis, and an algorithm for the determination of the anisotropic elastic and compliance constants from experimental data is developed. These schemes were implemented in a finite element program for static and dynamic analysis of linear anisotropic two-dimensional elasticity problems. Specific numerical examples are considered to verify the accuracy of the hybrid stress approach and compare it with that of the standard displacement method, especially for highly anisotropic materials. It is shown that the hybrid stress approach gives much better results than the displacement method. Preliminary work on extensions of this method to three-dimensional elasticity is discussed, and the stress shape functions necessary for this extension are included.
NASA Technical Reports Server (NTRS)
Ray, Ronald J.
1994-01-01
New flight test maneuvers and analysis techniques for evaluating the dynamic response of in-flight thrust models during throttle transients have been developed and validated. The approach is based on the aircraft and engine performance relationship between thrust and drag. Two flight test maneuvers, a throttle step and a throttle frequency sweep, were developed and used in the study. Graphical analysis techniques, including a frequency domain analysis method, were also developed and evaluated. They provide quantitative and qualitative results. Four thrust calculation methods were used to demonstrate and validate the test technique. Flight test applications on two high-performance aircraft confirmed the test methods as valid and accurate. These maneuvers and analysis techniques were easy to implement and use. Flight test results indicate the analysis techniques can identify the combined effects of model error and instrumentation response limitations on the calculated thrust value. The methods developed in this report provide an accurate approach for evaluating, validating, or comparing thrust calculation methods for dynamic flight applications.
A study of concept-based similarity approaches for recommending program examples
NASA Astrophysics Data System (ADS)
Hosseini, Roya; Brusilovsky, Peter
2017-07-01
This paper investigates a range of concept-based example recommendation approaches that we developed to provide example-based problem-solving support in the domain of programming. The goal of these approaches is to offer students a set of the most relevant remedial examples when they have trouble solving a code comprehension problem, in which students examine program code to determine its output or the final value of a variable. In this paper, we use the ideas of semantic-level similarity-based linking developed in the area of intelligent hypertext to generate examples for the given problem. To determine the best-performing approach, we explored two groups of similarity approaches for selecting examples: non-structural approaches focusing on examples that are similar to the problem in terms of concept coverage, and structural approaches focusing on examples that are similar to the problem in the structure of the content. We also explored the value of personalized example recommendation based on the student's knowledge levels and the learning goal of the exercise. The paper presents the concept-based similarity approaches that we developed, explains the data collection studies, and reports the results of a comparative analysis. Our analysis showed the best ranking performance for the personalized structural variant of the cosine similarity approach.
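The non-structural variant described above can be sketched in a few lines: represent the problem and each candidate example as vectors of concept counts and rank examples by cosine similarity. The concept vocabulary and items below are hypothetical; a personalized variant would additionally down-weight concepts the student already knows.

```python
import numpy as np

def concept_vector(concepts, vocabulary):
    # Bag-of-concepts representation of a problem or example.
    v = np.zeros(len(vocabulary))
    for c in concepts:
        v[vocabulary.index(c)] += 1.0
    return v

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

vocab = ["loop", "array", "conditional", "increment", "function"]
problem = concept_vector(["loop", "array", "conditional"], vocab)
examples = {
    "ex1": concept_vector(["loop", "array", "increment"], vocab),
    "ex2": concept_vector(["function", "conditional"], vocab),
}
ranked = sorted(examples, key=lambda e: cosine(problem, examples[e]),
                reverse=True)
print(ranked)  # most relevant remedial example first
```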
Automatic Identification of Character Types from Film Dialogs
Skowron, Marcin; Trapp, Martin; Payr, Sabine; Trappl, Robert
2016-01-01
We study the detection of character types from fictional dialog texts such as screenplays. As approaches based on the analysis of utterances' linguistic properties are not sufficient to identify all fictional character types, we develop an integrative approach that complements linguistic analysis with interactive and communication characteristics, and show that it can improve identification performance. The interactive characteristics of fictional characters are captured by the descriptive analysis of semantic graphs weighted by linguistic markers of expressivity and social role. For this approach, we introduce a new data set of action movie character types with their corresponding sequences of dialogs. The evaluation results demonstrate that the integrated approach outperforms baseline approaches on the presented data set. Comparative in-depth analysis of a single screenplay leads to a discussion of possible limitations of this approach and to directions for future research. PMID:29118463
Theorising Critical HRD: A Paradox of Intricacy and Discrepancy
ERIC Educational Resources Information Center
Trehan, Kiran; Rigg, Clare
2011-01-01
Purpose: This paper aims to advance theoretical understanding of the concept of "critical human resource development". Design/methodology/approach: This is a conceptual paper. Findings: Foregrounding questions of power, emotions and political dynamics within the analysis of organisational learning and development activity, critical approaches in…
A computer-aided approach to nonlinear control synthesis
NASA Technical Reports Server (NTRS)
Wie, Bong; Anthony, Tobin
1988-01-01
The major objective of this project is to develop a computer-aided approach to nonlinear stability analysis and nonlinear control system design. This goal is to be achieved by refining the describing function method as a synthesis tool for nonlinear control design. The interim report outlines the approach taken in this study to meet these goals, including an introduction to the INteractive Controls Analysis (INCA) program, which was instrumental in meeting the study objectives. A single-input describing function (SIDF) design methodology was developed in this study; coupled with the software constructed in this study, the results of this project provide a comprehensive tool for the design and integration of nonlinear control systems.
Development of a Conservative Model Validation Approach for Reliable Analysis
2015-01-01
CIE 2015, August 2-5, 2015, Boston, Massachusetts, USA. [DRAFT] DETC2015-46982: Development of a Conservative Model Validation Approach for Reliable...obtain a conservative simulation model for reliable design even with limited experimental data. Very little research has taken into account the...3, the proposed conservative model validation is briefly compared to the conventional model validation approach. Section 4 describes how to account
ERIC Educational Resources Information Center
Mostertman, L. J.
Because of the uncertainty related to water resources development projects, and because of the multitude of factors influencing their performance, the systems analysis approach is often used as an instrument in the planning and design process. The approach will also yield good results in the programming of the maintenance and management of the…
Team Software Development for Aerothermodynamic and Aerodynamic Analysis and Design
NASA Technical Reports Server (NTRS)
Alexandrov, N.; Atkins, H. L.; Bibb, K. L.; Biedron, R. T.; Carpenter, M. H.; Gnoffo, P. A.; Hammond, D. P.; Jones, W. T.; Kleb, W. L.; Lee-Rausch, E. M.
2003-01-01
A collaborative approach to software development is described. The approach employs the agile development techniques: project retrospectives, Scrum status meetings, and elements of Extreme Programming to efficiently develop a cohesive and extensible software suite. The software product under development is a fluid dynamics simulator for performing aerodynamic and aerothermodynamic analysis and design. The functionality of the software product is achieved both through the merging, with substantial rewrite, of separate legacy codes and the authorship of new routines. Examples of rapid implementation of new functionality demonstrate the benefits obtained with this agile software development process. The appendix contains a discussion of coding issues encountered while porting legacy Fortran 77 code to Fortran 95, software design principles, and a Fortran 95 coding standard.
Wightman, Jade; Julio, Flávia; Virués-Ortega, Javier
2014-05-01
Experimental functional analysis is an assessment methodology to identify the environmental factors that maintain problem behavior in individuals with developmental disabilities and in other populations. Functional analysis provides the basis for the development of reinforcement-based approaches to treatment. This article reviews the procedures, validity, and clinical implementation of the methodological variations of functional analysis and function-based interventions. We present six variations of functional analysis methodology in addition to the typical functional analysis: brief functional analysis, single-function tests, latency-based functional analysis, functional analysis of precursors, and trial-based functional analysis. We also present the three general categories of function-based interventions: extinction, antecedent manipulation, and differential reinforcement. Functional analysis methodology is a valid and efficient approach to the assessment of problem behavior and the selection of treatment strategies.
ERIC Educational Resources Information Center
Mitchell, Jimmy L.; McCormick, Ernest J.
The development and analysis of the Professional and Managerial Position Questionnaire (PMPQ) is reported. PMPQ is intended to serve as a job analysis instrument for higher level occupations than those assessed by the Position Analysis Questionnaire (PAQ). Four approaches to job analysis are described with different emphases on the requirements of…
Fourier analysis and signal processing by use of the Moebius inversion formula
NASA Technical Reports Server (NTRS)
Reed, Irving S.; Yu, Xiaoli; Shih, Ming-Tang; Tufts, Donald W.; Truong, T. K.
1990-01-01
A novel Fourier technique for digital signal processing is developed. This approach to Fourier analysis is based on the number-theoretic method of Moebius inversion of series. The Fourier transform method developed is also shown to yield the convolution of two signals. A computer simulation shows that this method for finding Fourier coefficients is quite suitable for digital signal processing. It competes with the classical FFT (fast Fourier transform) approach in terms of accuracy, complexity, and speed.
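For readers unfamiliar with the number-theoretic tool involved, the sketch below demonstrates Moebius inversion itself, the identity the method builds on: if g(n) is the sum of f(d) over the divisors d of n, then f(n) is recovered as the sum of mu(n/d)·g(d) over the same divisors. This is illustrative only and is not the authors' full Fourier-coefficient algorithm.

```python
def mobius(n):
    """Moebius function mu(n) via trial division."""
    if n == 1:
        return 1
    result, p = 1, 2
    while p * p <= n:
        if n % p == 0:
            n //= p
            if n % p == 0:   # squared prime factor implies mu(n) = 0
                return 0
            result = -result
        p += 1
    if n > 1:                # one remaining prime factor
        result = -result
    return result

def divisors(n):
    return [d for d in range(1, n + 1) if n % d == 0]

# If g(n) = sum_{d | n} f(d), Moebius inversion recovers f from g:
f = {n: n * n for n in range(1, 13)}                 # arbitrary sequence
g = {n: sum(f[d] for d in divisors(n)) for n in f}   # divisor sums
f_back = {n: sum(mobius(n // d) * g[d] for d in divisors(n)) for n in f}
assert f_back == f   # the inversion is exact
print("recovered f(12) =", f_back[12])
```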
A CAD approach to magnetic bearing design
NASA Technical Reports Server (NTRS)
Jeyaseelan, M.; Anand, D. K.; Kirk, J. A.
1988-01-01
A design methodology has been developed at the Magnetic Bearing Research Laboratory for designing magnetic bearings using a CAD approach. This is used in the algorithm of an interactive design software package. The package is a design tool developed to enable the designer to simulate the entire process of design and analysis of the system. Its capabilities include interactive input/modification of geometry, finding any possible saturation at critical sections of the system, and the design and analysis of a control system that stabilizes and maintains magnetic suspension.
IKOS: A Framework for Static Analysis based on Abstract Interpretation (Tool Paper)
NASA Technical Reports Server (NTRS)
Brat, Guillaume P.; Laserna, Jorge A.; Shi, Nija; Venet, Arnaud Jean
2014-01-01
The RTCA standard (DO-178C) for developing avionic software and getting certification credits includes an extension (DO-333) that describes how developers can use static analysis in certification. In this paper, we give an overview of the IKOS static analysis framework, which helps in developing static analyses that are both precise and scalable. IKOS harnesses the power of Abstract Interpretation and makes it accessible to a larger class of static analysis developers by separating concerns such as code parsing, model development, abstract domain management, results management, and analysis strategy. The benefits of the approach are demonstrated by a buffer overflow analysis applied to flight control systems.
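A toy example conveys the kind of reasoning an abstract-interpretation framework like IKOS automates. The code below is not IKOS's API: it is just a minimal interval abstract domain that joins two branch states and proves an array index stays in bounds.

```python
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def join(self, other):   # least upper bound of two intervals
        return Interval(min(self.lo, other.lo), max(self.hi, other.hi))

    def add(self, k):        # abstract transformer for x + k
        return Interval(self.lo + k, self.hi + k)

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

def safe_index(idx, length):
    """Buffer-overflow check: every concrete index must be in range."""
    return 0 <= idx.lo and idx.hi < length

# i starts in [0, 9]; each branch shifts it, then the states are joined.
i = Interval(0, 9)
state = i.add(1).join(i.add(2))   # i+1 or i+2 depending on a branch
print(state, "safe for buf[16]:", safe_index(state, 16))   # [1, 11] True
```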
Recommended approach to software development
NASA Technical Reports Server (NTRS)
Mcgarry, F. E.; Page, J.; Eslinger, S.; Church, V.; Merwarth, P.
1983-01-01
A set of guidelines is presented for an organized, disciplined approach to software development, based on data collected and studied for 46 flight dynamics software development projects. Methods and practices are described for each phase of a software development life cycle that starts with requirements analysis and ends with acceptance testing; maintenance and operation are not addressed. For each defined life cycle phase, guidelines are presented for the development process and its management, and for the products produced and their reviews.
Ada developers' supplement to the recommended approach
NASA Technical Reports Server (NTRS)
Kester, Rush; Landis, Linda
1993-01-01
This document is a collection of guidelines for programmers and managers who are responsible for the development of flight dynamics applications in Ada. It is intended to be used in conjunction with the Recommended Approach to Software Development (SEL-81-305), which describes the software development life cycle, its products, reviews, methods, tools, and measures. The Ada Developers' Supplement provides additional detail on such topics as reuse, object-oriented analysis, and object-oriented design.
A MOOC on Approaches to Machine Translation
ERIC Educational Resources Information Center
Costa-jussà, Mart R.; Formiga, Lluís; Torrillas, Oriol; Petit, Jordi; Fonollosa, José A. R.
2015-01-01
This paper describes the design, development, and analysis of a MOOC entitled "Approaches to Machine Translation: Rule-based, statistical and hybrid", and provides lessons learned and conclusions to be taken into account in the future. The course was developed within the Canvas platform, used by recognized European universities. It…
Social Dialectics and Language: Mother and Child Construct the Discourse
ERIC Educational Resources Information Center
Harris, Adrienne E.
1975-01-01
The child's development of productive control over the adult language system is seen as an outcome of the dynamic social discourse of parent and child. Traditional approaches to child language are reviewed, and a dialectical analysis is developed using concepts from information theory and a general systems approach. (JMB)
Human Capital Development: Comparative Analysis of BRICs
ERIC Educational Resources Information Center
Ardichvili, Alexandre; Zavyalova, Elena; Minina, Vera
2012-01-01
Purpose: The goal of this article is to conduct macro-level analysis of human capital (HC) development strategies, pursued by four countries commonly referred to as BRICs (Brazil, Russia, India, and China). Design/methodology/approach: This analysis is based on comparisons of macro indices of human capital and innovativeness of the economy and a…
Chemical Facility Preparedness: A Comprehensive Approach
2006-09-01
E. SWOT ANALYSIS & STRATEGIC ISSUE DEVELOPMENT. 1. What can DHS do to Improve...be calculated and reviewed. Continuous benchmarking against best practices will be a necessity. E. SWOT ANALYSIS & STRATEGIC ISSUE DEVELOPMENT...In order to identify strategic issues for DHS, a Strength, Weakness, Opportunity and Threat (SWOT) analysis is necessary. Conducting this kind of
Identification of metabolic pathways using pathfinding approaches: a systematic review.
Abd Algfoor, Zeyad; Shahrizal Sunar, Mohd; Abdullah, Afnizanfaizal; Kolivand, Hoshang
2017-03-01
Metabolic pathways have become increasingly available for various microorganisms. Such pathways have spurred the development of a wide array of computational tools, in particular, mathematical pathfinding approaches. This article can facilitate the understanding of computational analysis of metabolic pathways in genomics. Moreover, stoichiometric and pathfinding approaches in metabolic pathway analysis are discussed. Three major types of studies are elaborated: stoichiometric identification models, pathway-based graph analysis, and pathfinding approaches in cellular metabolism. Furthermore, an evaluation of the outcomes of the pathways with mathematical benchmarking metrics is provided. This review should lead to a better comprehension of metabolic behavior in living cells from the perspective of computational pathfinding approaches. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Shell Buckling Design Criteria Based on Manufacturing Imperfection Signatures
NASA Technical Reports Server (NTRS)
Hilburger, Mark W.; Nemeth, Michael P.; Starnes, James H., Jr.
2004-01-01
An analysis-based approach for developing shell-buckling design criteria for laminated-composite cylindrical shells that accurately accounts for the effects of initial geometric imperfections is presented. With this approach, measured initial geometric imperfection data from six graphite-epoxy shells are used to determine a manufacturing-process-specific imperfection signature for these shells. This imperfection signature is then used as input into nonlinear finite-element analyses. The imperfection signature represents a "first-approximation" mean imperfection shape that is suitable for developing preliminary-design data. Comparisons of test data and analytical results obtained by using several different imperfection shapes are presented for selected shells. Overall, the results indicate that the analysis-based approach presented for developing reliable preliminary-design criteria has the potential to provide improved, less conservative buckling-load estimates, and to reduce the weight and cost of developing buckling-resistant shell structures.
Clinical Trials for Predictive Medicine—New Challenges and Paradigms*
Simon, Richard
2014-01-01
Background: Developments in biotechnology and genomics have increased the focus of biostatisticians on prediction problems. This has led to many exciting developments in predictive modeling where the number of variables is larger than the number of cases. The heterogeneity of human diseases and new technology for characterizing them present new opportunities and challenges for the design and analysis of clinical trials. Purpose: In oncology, treatment of broad populations with regimens that do not benefit most patients is less economically sustainable with expensive molecularly targeted therapeutics. The established molecular heterogeneity of human diseases requires the development of new paradigms for the design and analysis of randomized clinical trials as a reliable basis for predictive medicine [1, 2]. Results: We have reviewed prospective designs for the development of new therapeutics with candidate predictive biomarkers. We have also outlined a prediction-based approach to the analysis of randomized clinical trials that both preserves the type I error and provides a reliable, internally validated basis for predicting which patients are most likely or unlikely to benefit from the new regimen. Conclusions: Developing new treatments with predictive biomarkers for identifying the patients who are most likely or least likely to benefit makes drug development more complex. But for many new oncology drugs it is the only science-based approach and should increase the chance of success. It may also lead to more consistency in results among trials, and it has obvious benefits for reducing the number of patients who ultimately receive expensive drugs that expose them to risks of adverse events without benefit. This approach also has great potential value for controlling societal expenditures on health care. Development of treatments with predictive biomarkers requires major changes in the standard paradigms for the design and analysis of clinical trials. Some of the key assumptions upon which current methods are based are no longer valid. In addition to reviewing a variety of new clinical trial designs for co-development of treatments and predictive biomarkers, we have outlined a prediction-based approach to the analysis of randomized clinical trials. This is a very structured approach whose use requires careful prospective planning. It requires further development but may serve as a basis for a new generation of predictive clinical trials that provide the kinds of reliable, individualized information which physicians and patients have long sought, but which has not been available from past use of post-hoc subset analysis. PMID:20338899
Comparative Analysis of Sustainable Approaches and Systems for Scientific Data Stewardship
NASA Astrophysics Data System (ADS)
Downs, R. R.; Chen, R. S.
2012-12-01
Sustainable data systems are critical components of the cyberinfrastructure needed to provide long-term stewardship of scientific data, including Earth science data, throughout their entire life cycle. A variety of approaches may help ensure the sustainability of such systems, but these approaches must be able to survive the demands of competing priorities and decreasing budgets. Analyzing and comparing alternative approaches can identify viable aspects of each approach and inform decisions for developing, managing, and supporting the cyberinfrastructure needed to facilitate discovery, access, and analysis of data by future communities of users. A typology of sustainability approaches is proposed, and example use cases are offered for comparing the approaches over time. These examples demonstrate the potential strengths and weaknesses of each approach under various conditions and with regard to different objectives, e.g., open vs. limited access. By applying the results of these analyses to their particular circumstances, systems stakeholders can assess their options for a sustainable systems approach along with other metrics and identify alternative strategies to ensure the sustainability of the scientific data and information for which they are responsible. In addition, comparing sustainability approaches should inform the design of new systems and the improvement of existing systems to meet the needs for long-term stewardship of scientific data, and support education and workforce development efforts needed to ensure that the appropriate scientific and technical skills are available to operate and further develop sustainable cyberinfrastructure.
de Greef-van der Sandt, I; Newgreen, D; Schaddelee, M; Dorrepaal, C; Martina, R; Ridder, A; van Maanen, R
2016-04-01
A multicriteria decision analysis (MCDA) approach was developed and used to estimate the benefit-risk of solifenacin, mirabegron, and their combination in the treatment of overactive bladder (OAB). The objectives were 1) to develop an MCDA tool to compare drug effects in OAB quantitatively, 2) to establish transparency in the evaluation of the benefit-risk profile of various dose combinations, and 3) to quantify the added value of combination use compared to monotherapies. The MCDA model was developed using efficacy, safety, and tolerability attributes, and the results of a phase II factorial-design combination study were evaluated. Combinations of solifenacin 5 mg with mirabegron 25 mg or mirabegron 50 mg (5+25 and 5+50) scored the highest clinical utility and supported taking the solifenacin and mirabegron combination into phase III clinical development at these dose regimens. This case study underlines the benefit of using a quantitative approach in clinical drug development programs. © 2015 The American Society for Clinical Pharmacology and Therapeutics.
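At its core, a weighted-sum MCDA model of the kind described computes, for each treatment option, a clinical utility equal to the weighted sum of its rescaled attribute scores. The sketch below uses invented attributes, weights, and scores purely to show the mechanics; the published model's inputs differ.

```python
# Attribute weights (sum to 1) and 0-1 value scores: all numbers invented.
weights = {"efficacy": 0.5, "safety": 0.3, "tolerability": 0.2}

treatments = {
    "solifenacin 5 mg":         {"efficacy": 0.55, "safety": 0.80, "tolerability": 0.75},
    "mirabegron 50 mg":         {"efficacy": 0.60, "safety": 0.85, "tolerability": 0.80},
    "combination 5 mg + 50 mg": {"efficacy": 0.75, "safety": 0.70, "tolerability": 0.70},
}

def clinical_utility(scores, weights):
    # Weighted-sum MCDA: utility = sum of weight_i * value_i.
    return sum(weights[a] * scores[a] for a in weights)

for name, scores in treatments.items():
    print(f"{name:26s} utility = {clinical_utility(scores, weights):.3f}")
```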
Sensitivity Analysis for Coupled Aero-structural Systems
NASA Technical Reports Server (NTRS)
Giunta, Anthony A.
1999-01-01
A novel method has been developed for calculating gradients of aerodynamic force and moment coefficients for an aeroelastic aircraft model. This method uses the Global Sensitivity Equations (GSE) to account for the aero-structural coupling, and a reduced-order modal analysis approach to condense the coupling bandwidth between the aerodynamic and structural models. Parallel computing is applied to reduce the computational expense of the numerous high fidelity aerodynamic analyses needed for the coupled aero-structural system. Good agreement is obtained between aerodynamic force and moment gradients computed with the GSE/modal analysis approach and the same quantities computed using brute-force, computationally expensive, finite difference approximations. A comparison between the computational expense of the GSE/modal analysis method and a pure finite difference approach is presented. These results show that the GSE/modal analysis approach is the more computationally efficient technique if sensitivity analysis is to be performed for two or more aircraft design parameters.
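The brute-force baseline mentioned above is simple to state: perturb each design parameter in turn and difference the responses. A central-difference sketch is shown below with a hypothetical stand-in for an aerodynamic coefficient; it is the validation baseline, not the GSE/modal method itself.

```python
import numpy as np

def finite_difference_gradient(f, x, h=1e-6):
    """Central-difference approximation of the gradient of f at x."""
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

# Hypothetical surrogate for a lift coefficient as a function of two
# design parameters (e.g., sweep and thickness); not the paper's model.
cl = lambda p: 0.8 + 0.05 * p[0] - 0.2 * p[1] ** 2
print(finite_difference_gradient(cl, [25.0, 0.12]))
```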
QuickNGS elevates Next-Generation Sequencing data analysis to a new level of automation.
Wagle, Prerana; Nikolić, Miloš; Frommolt, Peter
2015-07-01
Next-Generation Sequencing (NGS) has emerged as a widely used tool in molecular biology. While time and cost for the sequencing itself are decreasing, the analysis of the massive amounts of data remains challenging. Since multiple algorithmic approaches for the basic data analysis have been developed, there is now an increasing need to efficiently use these tools to obtain results in reasonable time. We have developed QuickNGS, a new workflow system for laboratories with the need to analyze data from multiple NGS projects at a time. QuickNGS takes advantage of parallel computing resources, a comprehensive back-end database, and a careful selection of previously published algorithmic approaches to build fully automated data analysis workflows. We demonstrate the efficiency of our new software by a comprehensive analysis of 10 RNA-Seq samples which we can finish in only a few minutes of hands-on time. The approach we have taken is suitable to process even much larger numbers of samples and multiple projects at a time. Our approach considerably reduces the barriers that still limit the usability of the powerful NGS technology and finally decreases the time to be spent before proceeding to further downstream analysis and interpretation of the data.
Sustainable Biofuel Crops Project, Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Juhn, Daniel; Grantham, Hedley
2014-05-28
Over the last six years, the Food and Agriculture Organization of the United Nations (FAO) has developed the Bioenergy and Food Security (BEFS) Approach to help countries design and implement sustainable bioenergy policies and strategies. The BEFS Approach consists of two sets of multidisciplinary, integrated tools and guidance (the BEFS Rapid Appraisal and the BEFS Detailed Analysis) to facilitate better decisions on bioenergy development that foster both food and energy security and contribute to agricultural and rural development. The development of the BEFS Approach was for the most part funded by the German Federal Ministry of Food and Agriculture. Recognizing the need to provide support to countries that wanted an initial assessment of their sustainable bioenergy potential, and of the associated opportunities, risks, and trade-offs, FAO began developing the BEFS RA (Rapid Appraisal). The BEFS RA is a spreadsheet-based assessment and analysis tool designed to outline the country's basic energy, agriculture, and food security context, the natural resources potential, and the bioenergy end-use options, including initial financial and economic implications, and to identify issues that might require fuller investigation with the BEFS Detailed Analysis.
Erhart, M; Hagquist, C; Auquier, P; Rajmil, L; Power, M; Ravens-Sieberer, U
2010-07-01
This study compares item reduction analysis based on classical test theory (maximizing Cronbach's alpha; approach A) with analysis based on Rasch partial credit model item fit (approach B), as applied to children's and adolescents' health-related quality of life (HRQoL) items. The reliability and the structural, cross-cultural, and known-group validity of the measures were examined. Within the European KIDSCREEN project, 3019 children and adolescents (8-18 years) from seven European countries answered 19 HRQoL items of the Physical Well-being dimension of a preliminary KIDSCREEN instrument. Cronbach's alpha and the corrected item-total correlation (approach A) were compared with infit mean squares and the Q-index item fit derived according to a partial credit model (approach B). Cross-cultural differential item functioning (DIF; ordinal logistic regression approach), structural validity (confirmatory factor analysis and residual correlation), and relative validity (RV) for socio-demographic and health-related factors were calculated for approaches (A) and (B). Approach (A) led to the retention of 13 items, compared with 11 items for approach (B). The item overlap was 69% for (A) and 78% for (B). The correlation coefficient of the summated ratings was 0.93. Cronbach's alpha was similar for both versions [0.86 (A); 0.85 (B)]. Both approaches selected some items that are not strictly unidimensional and items displaying DIF. RV ratios favoured (A) with regard to socio-demographic aspects; approach (B) was superior in RV with regard to health-related aspects. Both types of item reduction analysis should be accompanied by additional analyses. Neither of the two approaches was universally superior with regard to cultural, structural, and known-group validity. However, the results support the usability of the Rasch method for developing new HRQoL measures for children and adolescents.
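Approach (A) above can be sketched concretely: compute Cronbach's alpha for the current item set and iteratively drop the item whose removal yields the largest alpha, stopping at the target scale length. The response data below are simulated, not the KIDSCREEN items.

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) matrix of scored responses."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var_sum / total_var)

def reduce_items(items, target_k):
    keep = list(range(items.shape[1]))
    while len(keep) > target_k:
        # Alpha of the scale with each remaining item removed in turn:
        alphas = [cronbach_alpha(items[:, [j for j in keep if j != i]])
                  for i in keep]
        keep.pop(int(np.argmax(alphas)))  # drop the most dispensable item
    return keep

# Simulated responses: one latent trait plus item-specific noise.
rng = np.random.default_rng(3)
latent = rng.normal(size=(300, 1))
noise = rng.normal(size=(300, 19)) * rng.uniform(0.5, 2.0, 19)
responses = latent + noise
print("items retained:", reduce_items(responses, 13))
```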
A joint probability approach for coincidental flood frequency analysis at ungauged basin confluences
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Cheng
2016-03-12
Reliable and accurate flood frequency analysis at the confluence of streams is important. Given that long-term peak flow observations are often unavailable at tributary confluences, at a practical level, this paper presents a joint probability approach (JPA) to address coincidental flood frequency analysis at the ungauged confluence of two streams based on flow rate data from the upstream tributaries. One case study is performed for comparison against several traditional approaches, including the plotting-position formula, univariate flood frequency analysis, and the National Flood Frequency Program developed by the US Geological Survey. It shows that the results generated by the JPA agree well with the floods estimated by the plotting-position and univariate flood frequency analyses based on the observation data.
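A minimal sketch of the joint probability idea, assuming Gumbel marginal fits for the two tributaries and a Gaussian copula for their dependence. All parameter values are invented; the paper's actual marginals, dependence model, and combination rule at the confluence may differ.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 200_000

# Gaussian copula for dependence between the two tributary annual peaks
rho = 0.6                                  # assumed dependence strength
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
u = stats.norm.cdf(z)                      # transform to uniform marginals

# Assumed Gumbel (EV1) marginal fits for the upstream gauges (made-up params)
q1 = stats.gumbel_r.ppf(u[:, 0], loc=400.0, scale=120.0)
q2 = stats.gumbel_r.ppf(u[:, 1], loc=250.0, scale=90.0)

confluence = q1 + q2                       # simplistic superposition at the junction
q100 = np.quantile(confluence, 1 - 1 / 100)
print(f"estimated 100-year coincidental flood: {q100:.0f} m^3/s")
```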
Test Cases for Modeling and Validation of Structures with Piezoelectric Actuators
NASA Technical Reports Server (NTRS)
Reaves, Mercedes C.; Horta, Lucas G.
2001-01-01
A set of benchmark test articles was developed to validate techniques for modeling structures containing piezoelectric actuators using commercially available finite element analysis packages. The paper presents the development, modeling, and testing of two structures: an aluminum plate with surface-mounted patch actuators and a composite box beam with surface-mounted actuators. Three approaches for modeling structures containing piezoelectric actuators using two commercially available packages, MSC/NASTRAN and ANSYS, are presented. The approaches, applications, and limitations are discussed. Data for both test articles are compared in terms of frequency response functions relating deflection and strain to actuator input voltage. Frequency response function results using the three different analysis approaches provided comparable test/analysis results. It is shown that global versus local behavior of the analytical model and test article must be considered when comparing different approaches. Also, improper bonding of actuators greatly reduces the electrical-to-mechanical effectiveness of the actuators, producing anti-resonance errors.
Mapping the Hot Spots: A Zoning Approach to Space Analysis and Design
ERIC Educational Resources Information Center
Bunnell, Adam; Carpenter, Russell; Hensley, Emily; Strong, Kelsey; Williams, ReBecca; Winter, Rachel
2016-01-01
This article examines a preliminary approach to space design developed and implemented in Eastern Kentucky University's Noel Studio for Academic Creativity. The approach discussed here is entitled "hot spots," which has allowed the research team to observe trends in space usage and composing activities among students. This approach has…
Malaria vaccine development and how external forces shape it: an overview.
Lorenz, Veronique; Karanis, Gabriele; Karanis, Panagiotis
2014-06-30
The aim of this paper is to analyse the current status and scientific value of malaria vaccine approaches and to provide a realistic prognosis for future developments. We systematically review previous approaches to malaria vaccination, addressing how vaccine efforts have developed, how their shortcomings may be remedied, and how external forces shape vaccine development. Our analysis provides significant information on the various aspects and external factors that shape malaria vaccine development and reveals the importance of vaccine development in our society.
Bayesian Factor Analysis as a Variable Selection Problem: Alternative Priors and Consequences
Lu, Zhao-Hua; Chow, Sy-Miin; Loken, Eric
2016-01-01
Factor analysis is a popular statistical technique for multivariate data analysis. Developments in the structural equation modeling framework have enabled the use of hybrid confirmatory/exploratory approaches in which factor loading structures can be explored relatively flexibly within a confirmatory factor analysis (CFA) framework. Recently, a Bayesian structural equation modeling (BSEM) approach (Muthén & Asparouhov, 2012) has been proposed as a way to explore the presence of cross-loadings in CFA models. We show that the issue of determining factor loading patterns may be formulated as a Bayesian variable selection problem in which Muthén and Asparouhov’s approach can be regarded as a BSEM approach with ridge regression prior (BSEM-RP). We propose another Bayesian approach, denoted herein as the Bayesian structural equation modeling with spike and slab prior (BSEM-SSP), which serves as a one-stage alternative to the BSEM-RP. We review the theoretical advantages and disadvantages of both approaches and compare their empirical performance relative to two modification indices-based approaches and exploratory factor analysis with target rotation. A teacher stress scale data set (Byrne, 2012; Pettegrew & Wolf, 1982) is used to demonstrate our approach. PMID:27314566
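As a hedged illustration of the spike-and-slab idea (not the BSEM-SSP implementation itself), the sketch below runs a textbook spike-and-slab Gibbs sampler for variable selection in a toy regression, with the error variance fixed for brevity; in BSEM-SSP, analogous machinery is applied to the cross-loadings of a CFA model. All data and hyperparameters are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 200, 6
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, 0.0, 0.0, 0.7, 0.0, 0.0])   # mostly "spike" (zero)
y = X @ beta_true + rng.normal(scale=0.5, size=n)

sigma2, slab_var, pi = 0.25, 1.0, 0.5    # fixed noise, slab variance, prior inclusion
beta = np.zeros(p)
gamma = np.ones(p, dtype=int)
incl = np.zeros(p)
for it in range(2000):
    for j in range(p):
        r = y - X @ beta + X[:, j] * beta[j]            # partial residual
        v = 1.0 / (X[:, j] @ X[:, j] / sigma2 + 1.0 / slab_var)
        m = v * (X[:, j] @ r) / sigma2
        # log Bayes factor of slab (included) vs. spike (excluded)
        log_bf = 0.5 * (np.log(v / slab_var) + m * m / v)
        p_incl = pi / (pi + (1 - pi) * np.exp(-log_bf))
        gamma[j] = rng.random() < p_incl
        beta[j] = rng.normal(m, np.sqrt(v)) if gamma[j] else 0.0
    if it >= 500:                                       # discard burn-in
        incl += gamma
print("posterior inclusion probabilities:", np.round(incl / 1500, 2))
```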
Approach to proliferation risk assessment based on multiple objective analysis framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrianov, A.; Kuptsov, I.; Studgorodok 1, Obninsk, Kaluga region, 249030
2013-07-01
The approach to the assessment of proliferation risk using the methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows the specific features of the national nuclear infrastructure and possible proliferation strategies (motivations, intentions, and capabilities) to be taken into account. Three examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (highly enriched uranium) production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Second, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Third, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.
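A minimal weighted-sum sketch of the multi-criteria scoring idea. The criteria, weights, and scenario scores below are invented stand-ins, since the paper's actual indicator set and aggregation method are not reproduced here.

```python
import numpy as np

# Hypothetical scenarios scored on three invented proliferation-risk criteria:
# material attractiveness, accessibility, concealability (0-1, 1 = higher risk)
weights = np.array([0.5, 0.3, 0.2])        # would be elicited from experts
scores = np.array([
    [0.9, 0.3, 0.8],   # clandestine centrifuge HEU production
    [0.7, 0.6, 0.4],   # theft of material from fuel-cycle facilities
    [0.5, 0.4, 0.5],   # undeclared plutonium from thermal reactors
])
risk_index = scores @ weights              # weighted-sum aggregation
for name, r in zip(["centrifuge HEU", "material theft", "reactor Pu"], risk_index):
    print(f"{name:15s} weighted risk index: {r:.2f}")
```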
Decision support systems in water and wastewater treatment process selection and design: a review.
Hamouda, M A; Anderson, W B; Huck, P M
2009-01-01
The continuously changing drivers of the water treatment industry, embodied by rigorous environmental and health regulations and the challenge of emerging contaminants, necessitate the development of decision support systems for the selection of appropriate treatment trains. This paper explores a systematic approach to developing decision support systems, which includes the analysis of the treatment problem(s), knowledge acquisition and representation, and the identification and evaluation of criteria controlling the selection of optimal treatment systems. The objective of this article is to review approaches and methods used in decision support systems developed to aid in the selection and sequencing of unit processes and the design of drinking water, domestic wastewater, and industrial wastewater treatment systems. Not surprisingly, technical considerations were found to dominate the logic of the developed systems. Most of the existing decision-support tools employ heuristic knowledge. It has been determined that there is a need to develop integrated decision support systems that are generic, usable, and consider a systems analysis approach.
NASA Technical Reports Server (NTRS)
Patrick, Sean; Oliver, Emerson
2018-01-01
One of the SLS Navigation System's key performance requirements is a constraint on the payload system's delta-v allocation to correct for insertion errors due to vehicle state uncertainty at payload separation. The SLS navigation team has developed a Delta-Delta-V analysis approach to assess the effect on trajectory correction maneuver (TCM) design needed to correct for navigation errors. This approach differs from traditional covariance-analysis-based methods and makes no assumptions with regard to the propagation of the state dynamics. This allows for consideration of non-linearity in the propagation of state uncertainties. The Delta-Delta-V analysis approach re-optimizes perturbed SLS mission trajectories by varying key mission states in accordance with an assumed state error. The state error is developed from detailed vehicle 6-DOF Monte Carlo analysis or generated using covariance analysis. These perturbed trajectories are compared to a nominal trajectory to determine the necessary TCM design. To implement this analysis approach, a tool set was developed which combines the functionality of a 3-DOF trajectory optimization tool, Copernicus, and a detailed 6-DOF vehicle simulation tool, the Marshall Aerospace Vehicle Representation in C (MAVERIC). In addition to delta-v allocation constraints on SLS navigation performance, SLS mission requirements dictate successful upper stage disposal. Due to engine and propellant constraints, the SLS Exploration Upper Stage (EUS) must dispose into heliocentric space by means of a lunar fly-by maneuver. As with payload delta-v allocation, upper stage disposal maneuvers must place the EUS on a trajectory that maximizes the probability of achieving a heliocentric orbit post lunar fly-by, considering all sources of vehicle state uncertainty prior to the maneuver. To ensure disposal, the SLS navigation team has developed an analysis approach to derive optimal disposal guidance targets. This approach maximizes the state error covariance prior to the maneuver to develop and re-optimize a nominal disposal maneuver (DM) target that, if achieved, would maximize the potential for successful upper stage disposal. For EUS disposal analysis, a set of two tools was developed. The first considers only the nominal pre-disposal maneuver state, vehicle constraints, and an a priori estimate of the state error covariance; in this analysis, the optimal nominal disposal target is determined. This is performed by re-formulating the trajectory optimization to consider constraints on the eigenvectors of the error ellipse applied to the nominal trajectory. A bisection search methodology is implemented in the tool to refine these dispersions, resulting in the maximum dispersion feasible for successful disposal via lunar fly-by. Success is defined based on the probability that the vehicle will not impact the lunar surface and will achieve a characteristic energy (C3) relative to the Earth such that it is no longer in the Earth-Moon system. The second tool propagates post-disposal maneuver states to determine the success of disposal for provided trajectory achieved states. This is performed using the optimized nominal target within the 6-DOF vehicle simulation. This paper will discuss the application of the Delta-Delta-V analysis approach for performance evaluation as well as trajectory re-optimization, so as to demonstrate the system's capability to meet performance constraints. Further discussion of the disposal analysis assessment will also be provided.
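The following toy sketch conveys the Delta-Delta-V idea without any SLS tooling (no Copernicus or MAVERIC involved): correction impulses for a 1-D double-integrator "trajectory" are re-solved under sampled navigation errors and differenced against the nominal solution. All dynamics, targets, and error magnitudes are invented.

```python
import numpy as np

def impulses(x0, v0, tf, xf=0.0, vf=0.0):
    """Solve two impulses (dv1 at t=0, dv2 at tf/2) that reach (xf, vf) at tf."""
    t1, t2 = tf, tf / 2.0
    # xf = x0 + (v0 + dv1)*tf + dv2*(tf/2) ;  vf = v0 + dv1 + dv2
    A = np.array([[t1, t2], [1.0, 1.0]])
    b = np.array([xf - x0 - v0 * t1, vf - v0])
    return np.linalg.solve(A, b)

rng = np.random.default_rng(7)
tf = 100.0
dv_nom = impulses(x0=1000.0, v0=-12.0, tf=tf)     # nominal correction design

# Monte Carlo navigation errors at separation (assumed 1-sigma values)
errs = rng.normal(scale=[50.0, 0.5], size=(5000, 2))   # position, velocity
ddv = [np.abs(impulses(1000.0 + ex, -12.0 + ev, tf) - dv_nom).sum()
       for ex, ev in errs]                        # re-solve, difference vs. nominal
print(f"99th percentile delta-delta-v: {np.percentile(ddv, 99):.2f} m/s")
```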
Castro, Cecilia; Motto, Mario; Rossi, Vincenzo; Manetti, Cesare
2008-01-01
To shed light on the specific contribution of HDA101 in modulating metabolic pathways in the maize seed, changes in the metabolic profiles of kernels obtained from hda101 mutant plants have been investigated by a metabonomic approach. Dynamic properties of chromatin folding can be mediated by enzymes that modify DNA and histones. The enzymes responsible for the steady-state of histone acetylation are histone acetyltransferase and histone deacetylase (HDA). Therefore, it is interesting to evaluate the effects of up- and down-regulation of an Rpd-3 type HDA on the development of maize seeds in terms of metabolic changes. This was achieved by analysing nuclear magnetic resonance spectra with different chemometric approaches, such as Orthogonal Projection to Latent Structures-Discriminant Analysis, Parallel Factor Analysis, and Multi-way Partial Least Squares-Discriminant Analysis (N-PLS-DA). In particular, the latter approaches were chosen because they explicitly take time into account, organizing data into a set of slices that refer to different steps of the developing process. The results show the good discriminating capabilities of the N-PLS-DA approach, even if the number of samples ought to be increased to obtain better predictive capabilities. However, using this approach, it was possible to show differences in the accumulation of metabolites during development and to highlight the changes occurring in the modified seeds. In particular, the results confirm the role of this gene in cell cycle control. PMID:18836140
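As a simplified stand-in for the discriminant step, the sketch below fits a two-way PLS-DA (scikit-learn's PLSRegression with dummy-coded class labels) on simulated spectra; the multi-way N-PLS-DA used in the study has no standard scikit-learn implementation, and the data and planted metabolite difference here are invented.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)
# Simulated NMR-like spectra: 20 wild-type vs. 20 mutant "kernels", 150 bins
n, bins = 20, 150
wt = rng.normal(size=(n, bins))
mut = rng.normal(size=(n, bins))
mut[:, 40:45] += 1.0                     # invented metabolite difference

X = np.vstack([wt, mut])
y = np.array([0] * n + [1] * n)          # dummy-coded class membership

pls = PLSRegression(n_components=2).fit(X, y)
pred = (pls.predict(X).ravel() > 0.5).astype(int)
print("training accuracy:", (pred == y).mean())
print("top discriminating bins:", np.argsort(np.abs(pls.x_weights_[:, 0]))[-5:])
```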
Planning and Measurement in School to Work Transition.
ERIC Educational Resources Information Center
Kooi, Beverly Y.
An analysis, development, and research (ADR) approach for planning educational research and development programs was used as a model for planning the National Institute of Education's School-To-Work Transition Program. The ADR model is system oriented and utilizes an iterative approach in which research questions are raised as others are answered.…
The development of an episode selection and aggregation approach, designed to support distributional estimation for use with the Models-3 Community Multiscale Air Quality (CMAQ) model, is described. The approach utilized cluster analysis of the 700-hPa east-west and north-south...
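A hedged sketch of the clustering step: k-means applied to concatenated east-west and north-south wind components, one feature vector per day, with a representative day picked per regime. Grid size, day count, and cluster count are invented, and the study's actual procedure may differ.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Invented stand-in for daily 700-hPa wind fields: u (east-west) and
# v (north-south) components at a small grid, one row per day.
days, grid = 730, 25
u = rng.normal(size=(days, grid))
v = rng.normal(size=(days, grid))
features = np.hstack([u, v])             # concatenate wind components per day

km = KMeans(n_clusters=6, n_init=10, random_state=0).fit(features)
# Pick a representative episode: the day closest to each cluster centroid
for k in range(6):
    members = np.where(km.labels_ == k)[0]
    d = np.linalg.norm(features[members] - km.cluster_centers_[k], axis=1)
    print(f"regime {k}: {len(members)} days, representative day {members[d.argmin()]}")
```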
A Cognitive Component Analysis Approach for Developing Game-Based Spatial Learning Tools
ERIC Educational Resources Information Center
Hung, Pi-Hsia; Hwang, Gwo-Jen; Lee, Yueh-Hsun; Su, I-Hsiang
2012-01-01
Spatial ability has been recognized as one of the most important factors affecting the mathematical performance of students. Previous studies on spatial learning have mainly focused on developing strategies to shorten the problem-solving time of learners for very specific learning tasks. Such an approach usually has limited effects on improving…
ERIC Educational Resources Information Center
Meng, Christine
2015-01-01
Research Findings: This study examined whether approaches to learning moderate the association between home literacy environment and English receptive vocabulary development. The Head Start Family and Child Experiences Survey (2003 cohort) was used for analysis. Latent growth curve modeling was utilized to test a quadratic model of English…
ERIC Educational Resources Information Center
Masami, Matoba; Reza, Sarkar Arani M.
2005-01-01
This paper tries to present a careful analysis of current trends and challenges to importing Japanese model of teachers' professional development. The objective is to examine what "we" can learn from Japanese approach to improving instruction, especially "Jugyou Kenkyu" (Lesson Study) as a collaborative research on the…
NASA Technical Reports Server (NTRS)
McComas, David; Stark, Michael; Leake, Stephen; White, Michael; Morisio, Maurizio; Travassos, Guilherme H.; Powers, Edward I. (Technical Monitor)
2000-01-01
The NASA Goddard Space Flight Center Flight Software Branch (FSB) is developing a Guidance, Navigation, and Control (GNC) Flight Software (FSW) product line. The demand for increasingly more complex flight software in less time, while maintaining the same level of quality, has motivated us to look for better FSW development strategies. The GNC FSW product line has been planned to address core GNC FSW functionality that is very similar across many recent low/near-Earth missions of the last ten years. Unfortunately, these missions have not accomplished significant drops in development cost, since a systematic approach towards reuse has not been adopted. In addition, new demands are continually being placed upon the FSW, which means the FSB must become more adept at providing the core of GNC FSW functionality so it can accommodate additional requirements. These domain features, together with engineering concepts, are influencing the specification, description and evaluation of the FSW product line. Domain engineering is the foundation for emerging product line software development approaches. A product line is "a family of products designed to take advantage of their common aspects and predicted variabilities". In our product line approach, domain engineering includes the engineering activities needed to produce reusable artifacts for a domain. Application engineering refers to developing an application in the domain starting from reusable artifacts. The focus of this paper is the software process, lessons learned, and how the GNC FSW product line manages variability. Existing domain engineering approaches do not enforce any specific notation for domain analysis or commonality and variability analysis. Usually, natural language text is the preferred tool. The advantage is the flexibility and adaptability of natural language. However, one has to be ready to also accept its well-known drawbacks, such as ambiguity, inconsistency, and contradictions. While most domain analysis approaches are functionally oriented, the idea of applying the object-oriented approach in domain analysis is not new. Some authors propose to use UML as the notation underlying domain analysis. Our work is based on the same idea of merging UML and domain analysis. Further, we propose a few extensions to UML in order to express variability, and we define their semantics precisely so that a tool can support them. The extensions are designed to be implemented on the API of a popular industrial CASE tool, with obvious advantages in cost and availability of tool support. The paper outlines the product line processes and identifies where variability must be addressed. Then it describes the product line products with respect to how they accommodate variability. The Celestial Body subdomain is used as a working example. Our results to date are summarized and plans for the future are described.
ERIC Educational Resources Information Center
Martinková, Patricia; Drabinová, Adéla; Liaw, Yuan-Ling; Sanders, Elizabeth A.; McFarland, Jenny L.; Price, Rebecca M.
2017-01-01
We provide a tutorial on differential item functioning (DIF) analysis, an analytic method useful for identifying potentially biased items in assessments. After explaining a number of methodological approaches, we test for gender bias in two scenarios that demonstrate why DIF analysis is crucial for developing assessments, particularly because…
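A minimal sketch of one common DIF check, binary logistic regression with a likelihood-ratio test for a uniform-DIF group term, on simulated item responses. The tutorial covers a wider range of methods, and the data and effect size here are invented.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 1000
ability = rng.normal(size=n)
group = rng.integers(0, 2, size=n)               # 0 = reference, 1 = focal
# Simulated item with uniform DIF: harder for the focal group at equal ability
p = 1 / (1 + np.exp(-(ability - 0.5 * group)))
response = (rng.random(n) < p).astype(int)

# Compare nested logistic models: matching variable only vs. adding a group term
X0 = sm.add_constant(ability)
X1 = sm.add_constant(np.column_stack([ability, group]))
m0 = sm.Logit(response, X0).fit(disp=0)
m1 = sm.Logit(response, X1).fit(disp=0)
lr = 2 * (m1.llf - m0.llf)                       # ~ chi-square with 1 df
print(f"LR statistic for uniform DIF: {lr:.1f}")
```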
Closed-loop, pilot/vehicle analysis of the approach and landing task
NASA Technical Reports Server (NTRS)
Anderson, M. R.; Schmidt, D. K.
1986-01-01
In the case of approach and landing, it is universally accepted that the pilot uses more than one vehicle response, or output, to close his control loops. Therefore, to model this task, a multi-loop analysis technique is required. The analysis problem has been in obtaining reasonable analytic estimates of the describing functions representing the pilot's loop compensation. Once these pilot describing functions are obtained, appropriate performance and workload metrics must then be developed for the landing task. The optimal control approach provides a powerful technique for obtaining the necessary describing functions, once the appropriate task objective is defined in terms of a quadratic objective function. An approach is presented through the use of a simple, reasonable objective function and model-based metrics to evaluate loop performance and pilot workload. The results of an analysis of the LAHOS (Landing and Approach of Higher Order Systems) study performed by R.E. Smith are also presented.
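At its core, the optimal control approach reduces, for a linear vehicle model with a quadratic objective, to solving an algebraic Riccati equation for the pilot's feedback gains. A minimal sketch with invented longitudinal dynamics follows; these numbers are not the LAHOS configurations.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Toy longitudinal model near approach (invented numbers): states are
# path-deviation error and its rate; control is an elevator-like command.
A = np.array([[0.0, 1.0],
              [-0.5, -0.8]])
B = np.array([[0.0],
              [1.2]])
Q = np.diag([10.0, 1.0])    # penalize path deviation (performance metric)
R = np.array([[0.5]])       # penalize control activity (workload metric)

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)      # optimal feedback gain, u = -K x
print("pilot-model gains:", K.round(3))
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K).round(3))
```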
Expedited Selection of NMR Chiral Solvating Agents for Determination of Enantiopurity
2016-01-01
The use of NMR chiral solvating agents (CSAs) for the analysis of enantiopurity has been known for decades but has been supplanted in recent years by chromatographic enantioseparation technology. While chromatographic methods for the analysis of enantiopurity are now commonplace and easy to implement, there are still individual compounds and entire classes of analytes for which enantioseparation can prove extremely difficult, notably compounds that are chiral by virtue of very subtle differences such as isotopic substitution or small differences in alkyl chain length. NMR analysis using CSAs can often be useful for such problems, but the traditional approach to selecting an appropriate CSA and developing an NMR-based analysis method often involves trial and error that can be relatively slow and tedious. In this study we describe a high-throughput experimentation approach to the selection of NMR CSAs that employs automation-enabled screening of prepared libraries of CSAs in a systematic fashion. This approach affords excellent results for a standard set of enantioenriched compounds, providing a valuable comparative data set for the effectiveness of CSAs for different classes of compounds. In addition, the technique has been successfully applied to challenging pharmaceutical development problems that are not amenable to chromatographic solutions. Overall, this methodology provides a rapid and powerful approach for investigating enantiopurity that complements and augments conventional chromatographic approaches. PMID:27280168
Description of a user-oriented geographic information system - The resource analysis program
NASA Technical Reports Server (NTRS)
Tilmann, S. E.; Mokma, D. L.
1980-01-01
This paper describes the Resource Analysis Program, an applied geographic information system. Several applications are presented which utilized soil and other natural resource data to develop integrated maps and data analyses. These applications demonstrate the methods of analysis and the philosophy of approach used in the mapping system. The applications are evaluated in reference to four major needs of a functional mapping system: data capture, data libraries, data analysis, and mapping and data display. These four criteria are then used to describe an effort to develop the next generation of applied mapping systems. This approach uses inexpensive microcomputers for field applications and should prove to be a viable entry point for users heretofore unable or unwilling to venture into applied computer mapping.
Instantiating the art of war for effects-based operations
NASA Astrophysics Data System (ADS)
Burns, Carla L.
2002-07-01
Effects-Based Operations (EBO) is a mindset, a philosophy, and an approach for planning, executing and assessing military operations for the effects they produce rather than the targets or even objectives they deal with. An EBO approach strives to provide economy of force, dynamic tasking, and reduced collateral damage. The notion of EBO is not new; military commanders certainly have desired effects in mind when conducting military operations. However, to date EBO has been an art of war that lacks automated techniques and tools that enable effects-based analysis and assessment. Modeling and simulation is at the heart of this challenge. The Air Force Research Laboratory (AFRL) EBO Program is developing modeling techniques and corresponding tool capabilities that can be brought to bear against the challenges presented by effects-based analysis and assessment. Effects-based course-of-action development, center of gravity/target system analysis, and wargaming capabilities are being developed and integrated to help give commanders the information and decision support required to achieve desired national security objectives. This paper presents an introduction to effects-based operations, discusses the benefits of an EBO approach, and focuses on modeling and analysis for effects-based strategy development. An overview of modeling and simulation challenges for EBO is presented, setting the stage for the detailed technical papers in the subject session.
Problem Analysis: Application in Developing Marketing Strategies for Colleges.
ERIC Educational Resources Information Center
Martin, John; Moore, Thomas
1991-01-01
The problem analysis technique can help colleges understand students' salient needs in a competitive market. A preliminary study demonstrates the usefulness of the approach for developing strategies aimed at maintaining student loyalty and improving word-of-mouth promotion to other prospective students. (Author/MSE)
Cleverley, Steve; Chen, Irene; Houle, Jean-François
2010-01-15
Immunoaffinity approaches remain invaluable tools for characterization and quantitation of biopolymers. Their application in separation science is often limited due to the challenges of immunoassay development. Typical end-point immunoassays require time consuming and labor-intensive approaches for optimization. Real-time label-free analysis using diffractive optics technology (dot) helps guide a very effective iterative process for rapid immunoassay development. Both label-free and amplified approaches can be used throughout feasibility testing and ultimately in the final assay, providing a robust platform for biopolymer analysis over a very broad dynamic range. We demonstrate the use of dot in rapidly developing assays for quantitating (1) human IgG in complex media, (2) a fusion protein in production media and (3) protein A contamination in purified immunoglobulin preparations. 2009 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Skersys, Tomas; Butleris, Rimantas; Kapocius, Kestutis
2013-10-01
Approaches for the analysis and specification of business vocabularies and rules are highly relevant topics in both the Business Process Management and Information Systems Development disciplines. However, in common practice of Information Systems Development, business modeling activities are still mostly empirical in nature. In this paper, basic aspects of an approach for the semi-automated extraction of business vocabularies from business process models are presented. The approach is based on the novel business modeling-level OMG standards "Business Process Model and Notation" (BPMN) and "Semantics of Business Vocabulary and Business Rules" (SBVR), thus contributing to OMG's vision of Model-Driven Architecture (MDA) and to model-driven development in general.
Systems Approach to Arms Control Verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allen, K; Neimeyer, I; Listner, C
2015-05-15
Using the decades of experience in developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.
Remote Sensing Technologies and Geospatial Modelling Hierarchy for Smart City Support
NASA Astrophysics Data System (ADS)
Popov, M.; Fedorovsky, O.; Stankevich, S.; Filipovich, V.; Khyzhniak, A.; Piestova, I.; Lubskyi, M.; Svideniuk, M.
2017-12-01
An approach to implementing remote sensing technologies and geospatial modelling for smart city support is presented. The hierarchical structure and basic components of the smart city information support subsystem are considered. Several useful practical developments already available are described. These include city land use planning, urban vegetation analysis, thermal condition forecasting, geohazard detection, and flooding risk assessment. A remote sensing data fusion approach for comprehensive geospatial analysis is discussed. Long-term city development forecasting by the Forrester-Graham system dynamics model is provided over the Kiev urban area.
Development of Phonological Awareness in down Syndrome: A Meta-Analysis and Empirical Study
ERIC Educational Resources Information Center
Naess, Kari-Anne B.
2016-01-01
Phonological awareness (PA) is the knowledge and understanding of the sound structure of language and is believed to be an important skill for the development of reading. This study explored PA skills in children with Down syndrome and matched typically developing (TD) controls using a dual approach: a meta-analysis of the existing international…
An Integrated Approach to Life Cycle Analysis
NASA Technical Reports Server (NTRS)
Chytka, T. M.; Brown, R. W.; Shih, A. T.; Reeves, J. D.; Dempsey, J. A.
2006-01-01
Life Cycle Analysis (LCA) is the evaluation of the impacts that design decisions have on a system and provides a framework for identifying and evaluating design benefits and burdens associated with the life cycles of space transportation systems from a "cradle-to-grave" approach. Sometimes called life cycle assessment, life cycle approach, or "cradle-to-grave analysis", it represents a rapidly emerging family of tools and techniques designed to be a decision support methodology and aid in the development of sustainable systems. The implementation of a Life Cycle Analysis can vary and may take many forms, from global system-level uncertainty-centered analysis to the assessment of individualized discriminatory metrics. This paper will focus on a proven LCA methodology developed by the Systems Analysis and Concepts Directorate (SACD) at NASA Langley Research Center to quantify and assess key LCA discriminatory metrics, in particular affordability, reliability, maintainability, and operability. This paper will address issues inherent in Life Cycle Analysis, including direct impacts, such as system development cost and crew safety, as well as indirect impacts, which often take the form of coupled metrics (e.g., the cost of system unreliability). Since LCA deals with the analysis of space vehicle system conceptual designs, it is imperative to stress that the goal of LCA is not to arrive at the answer but, rather, to provide important inputs to a broader strategic planning process, allowing managers to make risk-informed decisions and increase the likelihood of meeting mission success criteria.
Bao, Weier; Greenwold, Matthew J; Sawyer, Roger H
2017-11-01
Gene co-expression network analysis is a widely used method for systematically exploring gene function and interaction. Applying the Weighted Gene Co-expression Network Analysis (WGCNA) approach to data from a customized 44K microarray transcriptome of chicken epidermal embryogenesis, we constructed a gene co-expression network and identified two distinct modules that are highly correlated with scale or feather development traits. Signaling pathways related to feather development were enriched in the traditional KEGG pathway analysis, and functional terms relating specifically to embryonic epidermal development were also enriched in the Gene Ontology analysis. Significant enrichment annotations were discovered with customized enrichment tools such as the Modular Single-Set Enrichment Test (MSET) and Medical Subject Headings (MeSH). Hub genes in both trait-correlated modules showed strong, specific functional enrichment toward epidermal development. Regulatory elements, such as transcription factors and miRNAs, were also highlighted in the enrichment results. This work highlights the advantage of this methodology for functional prediction of genes not previously associated with scale- and feather-trait-related modules.
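A stripped-down sketch of the WGCNA core, soft-thresholded correlation adjacency followed by hierarchical clustering into modules, on simulated expression data with two planted modules. Full WGCNA additionally uses the topological overlap matrix and dynamic tree cutting, which are omitted here; all sizes and the power value are invented.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)
# Simulated expression: 60 genes x 20 samples, two planted co-expression modules
base1, base2 = rng.normal(size=20), rng.normal(size=20)
expr = np.vstack([base1 + rng.normal(scale=0.4, size=(25, 20)),
                  base2 + rng.normal(scale=0.4, size=(25, 20)),
                  rng.normal(size=(10, 20))])

beta = 6                                          # soft-threshold power
adjacency = np.abs(np.corrcoef(expr)) ** beta     # weighted network
dissim = 1 - adjacency
np.fill_diagonal(dissim, 0.0)

# Hierarchical clustering of the dissimilarity -> module assignments
Z = linkage(dissim[np.triu_indices_from(dissim, k=1)], method="average")
modules = fcluster(Z, t=3, criterion="maxclust")
print("module sizes:", np.bincount(modules)[1:])
```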
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hinman, N.D.; Yancey, M.A.
1997-12-31
One of the main functions of government is to invest taxpayers' dollars in projects, programs, and properties that will result in social benefit. Public programs focused on the development of technology are examples of such opportunities. Selecting these programs requires the same investment analysis approaches that private companies and individuals use. Good use of investment analysis approaches for these programs will minimize our tax costs and maximize the public benefit from tax dollars invested. This article describes the use of the net present value (NPV) analysis approach to select public R&D programs and evaluate expected private sector participation in the programs. 5 refs.
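The NPV computation itself is compact. A sketch with an invented public R&D cash-flow profile, evaluated at several candidate discount rates:

```python
def npv(rate: float, cashflows: list[float]) -> float:
    """Net present value; cashflows[0] occurs now, cashflows[t] at year t."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical program: $10M invested now, social benefits phased in later
program = [-10e6, 0.0, 1e6, 3e6, 5e6, 5e6, 5e6]
for rate in (0.03, 0.07, 0.10):          # candidate social discount rates
    print(f"discount rate {rate:.0%}: NPV = ${npv(rate, program)/1e6:,.1f}M")
```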
Adversarial risk analysis with incomplete information: a level-k approach.
Rothschild, Casey; McLay, Laura; Guikema, Seth
2012-07-01
This article proposes, develops, and illustrates the application of level-k game theory to adversarial risk analysis. Level-k reasoning, which assumes that players play strategically but have bounded rationality, is useful for operationalizing a Bayesian approach to adversarial risk analysis. It can be applied in a broad class of settings, including settings with asynchronous play and partial but incomplete revelation of early moves. Its computational and elicitation requirements are modest. We illustrate the approach with an application to a simple defend-attack model in which the defender's countermeasures are revealed to the attacker with probability less than one before he decides how or whether to attack. © 2011 Society for Risk Analysis.
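A toy sketch of level-k reasoning in a defend-attack setting with partial revelation: a level-2 defender best-responds to a level-1 attacker who observes the countermeasure with probability less than one and otherwise best-responds to a uniform level-0 belief. The payoff matrix and reveal probability are invented and this is not the article's own model.

```python
import numpy as np

# damage[d, a]: expected damage if defender plays d and attacker plays a
damage = np.array([[1.0, 8.0],
                   [7.0, 2.0]])
p_reveal = 0.4                      # chance the defense is seen before attack

def attacker_level1(d_belief):
    """Level-1 attacker: best response to a belief over defender moves."""
    return np.argmax(damage.T @ d_belief)

uniform = np.array([0.5, 0.5])      # level-0 belief: defender plays randomly
exp_damage = np.zeros(2)
for d in range(2):
    seen = np.zeros(2); seen[d] = 1.0
    a_seen = attacker_level1(seen)          # attacker observed the defense
    a_blind = attacker_level1(uniform)      # attacker did not
    exp_damage[d] = p_reveal * damage[d, a_seen] + (1 - p_reveal) * damage[d, a_blind]
print("expected damage by defense option:", exp_damage, "-> pick", exp_damage.argmin())
```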
Streamline Your Project: A Lifecycle Model.
ERIC Educational Resources Information Center
Viren, John
2000-01-01
Discusses one approach to project organization providing a baseline lifecycle model for multimedia/CBT development. This variation of the standard four-phase model of Analysis, Design, Development, and Implementation includes a Pre-Analysis phase, called Definition, and a Post-Implementation phase, known as Maintenance. Each phase is described.…
LES Investigation of Wake Development in a Transonic Fan Stage for Aeroacoustic Analysis
NASA Technical Reports Server (NTRS)
Hah, Chunill; Romeo, Michael
2017-01-01
Detailed development of the rotor wake and its interaction with the stator are investigated with a large eddy simulation (LES). Typical steady and unsteady Navier-Stokes approaches (RANS and URANS) do not calculate wake development accurately and do not provide all the necessary information for an aeroacoustic analysis. It is generally believed that higher fidelity analysis tools are required for an aeroacoustic investigation of transonic fan stages.
The promise of the state space approach to time series analysis for nursing research.
Levy, Janet A; Elser, Heather E; Knobel, Robin B
2012-01-01
Nursing research, particularly related to physiological development, often depends on the collection of time series data. The state space approach to time series analysis has great potential to answer exploratory questions relevant to physiological development but has not been used extensively in nursing. The aim of the study was to introduce the state space approach to time series analysis and demonstrate its potential applicability to neonatal monitoring and physiology. We present a set of univariate state space models, each describing a process that generates a variable of interest over time. Each model is presented algebraically, and a realization of the process is presented graphically from simulated data. This is followed by a discussion of how the model has been or may be used in two nursing projects on neonatal physiological development. The defining feature of the state space approach is the decomposition of the series into components that are functions of time; specifically, a slowly varying level, a faster-varying periodic component, and an irregular component. State space models can thus represent developmental processes where a phenomenon emerges and disappears before stabilizing, where the periodic component may become more regular with time, or where the developmental trajectory of a phenomenon is irregular. The ultimate contribution of this approach to nursing science will require close collaboration and cross-disciplinary education between nurses and statisticians.
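A hedged sketch of the level/periodic/irregular decomposition on simulated data, using the structural (unobserved components) models in statsmodels; this is one possible implementation, not the authors' own code, and all series parameters are invented.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
# Simulated "physiological" series: slowly varying level + periodic + irregular
t = np.arange(480)
level = np.cumsum(rng.normal(scale=0.05, size=t.size))       # slow drift
periodic = 0.8 * np.sin(2 * np.pi * t / 60)                  # assumed cycle
y = 36.5 + level + periodic + rng.normal(scale=0.3, size=t.size)

model = sm.tsa.UnobservedComponents(
    y, level="local level", freq_seasonal=[{"period": 60, "harmonics": 2}])
res = model.fit(disp=0)
print("estimated variance parameters:", res.params.round(4))
print("final smoothed level:", round(res.level.smoothed[-1], 2))
```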
Shao, Jingyuan; Cao, Wen; Qu, Haibin; Pan, Jianyang; Gong, Xingchu
2018-01-01
The aim of this study was to present a novel analytical quality by design (AQbD) approach for developing an HPLC method to analyze herbal extracts. In this approach, critical method attributes (CMAs) and critical method parameters (CMPs) of the analytical method were determined using the same data collected from screening experiments. The HPLC-ELSD method for separation and quantification of sugars in Codonopsis Radix extract (CRE) and Astragali Radix extract (ARE) samples was developed as an example with the novel AQbD approach. Potential CMAs and potential CMPs were identified from the Analytical Target Profile. After the screening experiments, the retention time of the D-glucose peak in CRE samples, the signal-to-noise ratio of the D-glucose peak in CRE samples, and the retention time of the sucrose peak in ARE samples were considered CMAs. The initial and final composition of the mobile phase, the flow rate, and the column temperature were found to be CMPs using a standard partial regression coefficient method. The probability-based design space was calculated using a Monte Carlo simulation method and verified by experiments. The optimized method was validated to be accurate and precise, and it was then applied in the analysis of CRE and ARE samples. The present AQbD approach is efficient and suitable for analytical targets with complex compositions.
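A minimal sketch of the probability-based design space idea: Monte Carlo samples of residual uncertainty are pushed through an assumed CMP-to-CMA response model, and grid points where the CMA meets specification with at least 90% probability are retained. The model coefficients, noise level, and specification window are all invented; in practice the model would be fitted to the screening data.

```python
import numpy as np

rng = np.random.default_rng(6)

# Invented response model linking CMPs (flow, temperature, mobile-phase
# fraction) to a CMA such as a retention time
def retention_time(flow, temp, mobile_b):
    return 12.0 - 3.0 * flow - 0.05 * temp + 2.0 * mobile_b

def prob_in_spec(flow, temp, mobile_b, n=5000):
    """Probability the CMA meets spec under the assumed model plus noise."""
    noise = rng.normal(scale=0.4, size=n)          # residual uncertainty
    rt = retention_time(flow, temp, mobile_b) + noise
    return np.mean((rt > 8.0) & (rt < 11.0))       # invented spec window

# Scan a grid of method parameters; keep points with >= 90% probability
grid = [(f, T, b) for f in np.linspace(0.6, 1.2, 7)
                  for T in (25, 30, 35)
                  for b in np.linspace(0.1, 0.5, 5)]
design_space = [g for g in grid if prob_in_spec(*g) >= 0.90]
print(f"{len(design_space)} of {len(grid)} grid points in the design space")
```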
Gordijn, Sanne J; Korteweg, Fleurisca J; Erwich, Jan Jaap H M; Holm, Jozien P; van Diem, Mariet Th; Bergman, Klasien A; Timmer, Albertus
2009-06-01
Many classification systems for perinatal mortality are available, all with their own strengths and weaknesses; none of them has been universally accepted. We present a systematic multilayered approach for the analysis of perinatal mortality based on information related to the moment of death, the conditions associated with death, and the underlying cause of death, using a combination of representatives of existing classification systems. We compared the existing classification systems regarding their definition of the perinatal period, level of complexity, inclusion of maternal, foetal and/or placental factors, and whether they adopt a clinical or pathological viewpoint. Furthermore, we allocated the classification systems to one of three categories: 'when', 'what' or 'why', dependent on whether the allocation of individual cases of perinatal mortality is based on the moment of death ('when'), the clinical conditions associated with death ('what'), or the underlying cause of death ('why'). A multilayered approach for the analysis and classification of perinatal mortality is possible by using combinations of existing systems; for example the Wigglesworth or Nordic-Baltic ('when'), ReCoDe ('what') and Tulip ('why') classification systems. This approach is useful not only for in-depth analysis of perinatal mortality in the developed world but also for analysis of perinatal mortality in developing countries, where resources to investigate death are often limited.
Who's in and why? A typology of stakeholder analysis methods for natural resource management.
Reed, Mark S; Graves, Anil; Dandy, Norman; Posthumus, Helena; Hubacek, Klaus; Morris, Joe; Prell, Christina; Quinn, Claire H; Stringer, Lindsay C
2009-04-01
Stakeholder analysis means many things to different people. Various methods and approaches have been developed in different fields for different purposes, leading to confusion over the concept and practice of stakeholder analysis. This paper asks how and why stakeholder analysis should be conducted for participatory natural resource management research. This is achieved by reviewing the development of stakeholder analysis in business management, development and natural resource management. The normative and instrumental theoretical basis for stakeholder analysis is discussed, and a stakeholder analysis typology is proposed. This consists of methods for: i) identifying stakeholders; ii) differentiating between and categorising stakeholders; and iii) investigating relationships between stakeholders. The range of methods that can be used to carry out each type of analysis is reviewed. These methods and approaches are then illustrated through a series of case studies funded through the Rural Economy and Land Use (RELU) programme. These case studies show the wide range of participatory and non-participatory methods that can be used, and discuss some of the challenges and limitations of existing methods for stakeholder analysis. The case studies also propose new tools and combinations of methods that can more effectively identify and categorise stakeholders and help understand their inter-relationships.
System analysis in rotorcraft design: The past decade
NASA Technical Reports Server (NTRS)
Galloway, Thomas L.
1988-01-01
Rapid advances in the technology of electronic digital computers and the need for an integrated synthesis approach in developing future rotorcraft programs have led to increased emphasis on system analysis techniques in rotorcraft design. The task in systems analysis is to deal with complex, interdependent, and conflicting requirements in a structured manner so that rational and objective decisions can be made. Whether the results are wisdom or rubbish depends upon the validity and, sometimes more importantly, the consistency of the inputs, the correctness of the analysis, and a sensible choice of measures of effectiveness to draw conclusions. In rotorcraft design this means combining design requirements, technology assessment, sensitivity analysis, and review of techniques currently in use by NASA and Army organizations in developing research programs and vehicle specifications for rotorcraft. These procedures span simple graphical approaches to comprehensive analysis on large mainframe computers. Examples of recent applications to military and civil missions are highlighted.
SLS Navigation Model-Based Design Approach
NASA Technical Reports Server (NTRS)
Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas
2018-01-01
The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center, where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper provides a discussion of the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process versus previous processes used at Marshall Space Flight Center. Instead, the approach takes advantage of overlap between the requirements development and management process and the design and analysis process by efficiently combining the control mechanism (i.e., the requirement) and the design mechanism. The design mechanism is the representation of the component behavior and performance in design and analysis tools. The focus in the early design process shifts from the development and management of design requirements to the development of usable models, model requirements, and model verification and validation efforts. The models themselves are represented in C/C++ code and accompanying data files. Under the idealized process, potential ambiguity in specification is reduced because the model must be implementable, whereas a requirement is not necessarily subject to this constraint. Further, the models are shown to emulate the hardware during validation. For models developed by the Navigation Team, a common interface/standalone environment was developed. The common environment allows for easy implementation in design and analysis tools. Mechanisms such as unit test cases ensure implementation as the developer intended. The model verification and validation process provides a very high level of component design insight. The origin and implementation of the SLS variant of Model-based Design are described from the perspective of the SLS Navigation Team. The format of the models and the requirements is described. The Model-based Design approach has many benefits but is not without potential complications. Key lessons learned associated with the implementation of the Model-based Design approach and process, from infancy to verification and certification, are discussed.
The JPL Cost Risk Analysis Approach that Incorporates Engineering Realism
NASA Technical Reports Server (NTRS)
Harmon, Corey C.; Warfield, Keith R.; Rosenberg, Leigh S.
2006-01-01
This paper discusses the JPL Cost Engineering Group (CEG) cost risk analysis approach that accounts for all three types of cost risk. It will also describe the evaluation of historical cost data upon which this method is based. This investigation is essential in developing a method that is rooted in engineering realism and produces credible, dependable results to aid decision makers.
ERIC Educational Resources Information Center
Mettas, Alexandros; Norman, Eddie
2011-01-01
This paper discusses the establishment of a framework for researching children's decision-making skills in design and technology education through taking a grounded theory approach. Three data sources were used: (1) analysis of available literature; (2) curriculum analysis and interviews with teachers concerning their practice in relation to their…
Contemporary Militant Extremism: A Linguistic Approach to Scale Development
ERIC Educational Resources Information Center
Stankov, Lazar; Higgins, Derrick; Saucier, Gerard; Knezevic, Goran
2010-01-01
In this article, the authors describe procedures used in the development of a new scale of militant extremist mindset. A 2-step approach consisted of (a) linguistic analysis of the texts produced by known terrorist organizations and selection of statements from these texts that reflect the mindset of those belonging to these organizations and (b)…
NASA Technical Reports Server (NTRS)
Karandikar, Harsh M.
1997-01-01
An approach for objective and quantitative technical and cost risk analysis during product development, which is applicable from the earliest stages, is discussed. The approach is supported by a software tool called the Analytical System for Uncertainty and Risk Estimation (ASURE). Details of ASURE, the underlying concepts and its application history, are provided.
ERIC Educational Resources Information Center
Davids, Mogamat Razeen; Chikte, Usuf M. E.; Halperin, Mitchell L.
2011-01-01
This article reports on the development and evaluation of a Web-based application that provides instruction and hands-on practice in managing electrolyte and acid-base disorders. Our teaching approach, which focuses on concepts rather than details, encourages quantitative analysis and a logical problem-solving approach. Identifying any dangers to…
Harmonising Training and Development across an Industry: The Case of Australian Rail
ERIC Educational Resources Information Center
Short, Tom; Harris, Roger McL.
2017-01-01
Purpose: This paper aims to explore why harmonisation, given its potential, is so difficult to achieve. It analyses the issues and challenges in achieving harmonisation of training and development across an industry. Design/methodology/approach: The approach was a meta-analysis of six research projects undertaken in the Australian rail industry.…
ERIC Educational Resources Information Center
Kissack, Heather C.; Callahan, Jamie L.
2010-01-01
Purpose: The purpose of this paper is to demonstrate that training designers can, and should, account for organizational culture during training needs assessments. Design/methodology/approach: Utilizing the approach and arguments in Giddens' structuration theory, the paper conceptually applies these tenets to training and development programs…
ERIC Educational Resources Information Center
Keedy, John L.
2005-01-01
Purpose: The purpose of this paper is to contribute to the international debate over the university as the service provider for school administrator preparation programs from the United States perspective. Design/methodology/approach: The author's approach is that of using historical analysis in developing a conceptual position the author argues…
Three Approaches to Environmental Resources Analysis.
ERIC Educational Resources Information Center
Harvard Univ., Cambridge, MA. Graduate School of Design.
This booklet, the first of a projected series related to the development of methodologies and techniques for environments planning and design, examines three approaches that are currently being used to identify, analyze, and evaluate the natural and man-made resources that comprise the physical environment. One approach by G. Angus Hills uses a…
2011-12-01
therefore a more general approach uses the pseudo-inverse shown in Equation (12) to obtain the commanded gimbal rate, $\dot{\boldsymbol{\delta}} = A^T \left( A A^T \right)^{-1} \dot{\mathbf{h}}_{b/N}$, for the CMG ... gimbal motor. Approaching the problem from this perspective increases the complexity significantly, and the relationship between motor current and ... included in this document confirms the equations that Schaub and Junkins developed. The approaches used in the two derivations are sufficiently
Kadiyala, Akhil; Kaur, Devinder; Kumar, Ashok
2013-02-01
The present study developed a novel approach to modeling the indoor air quality (IAQ) of a public transportation bus through hybrid genetic-algorithm-based neural networks (also known as evolutionary neural networks) with input variables selected using regression trees, referred to as the GART approach. This study validated the applicability of the GART modeling approach in solving complex nonlinear systems by accurately predicting the monitored contaminants of carbon dioxide (CO2), carbon monoxide (CO), nitric oxide (NO), sulfur dioxide (SO2), 0.3-0.4 microm sized particle numbers, 0.4-0.5 microm sized particle numbers, particulate matter (PM) concentrations less than 1.0 microm (PM1.0), and PM concentrations less than 2.5 microm (PM2.5) inside a public transportation bus operating on 20% grade biodiesel in Toledo, OH. First, the important variables affecting each monitored in-bus contaminant were determined using regression trees. Second, analysis of variance was used as a complementary sensitivity analysis to the regression tree results to determine a subset of statistically significant variables affecting each monitored in-bus contaminant. Finally, the identified subsets of statistically significant variables were used as inputs to develop three artificial neural network (ANN) models: a regression tree-based back-propagation network (BPN-RT), a regression tree-based radial basis function network (RBFN-RT), and the GART model. Performance measures were used to validate the predictive capacity of the developed IAQ models. The results from this approach were compared with those obtained from a theoretical approach and a generalized practicable approach to modeling IAQ that included additional independent variables when developing the aforementioned ANN models. The hybrid GART models were able to capture the majority of the variance in the monitored in-bus contaminants, and the genetic-algorithm-based neural network IAQ models outperformed the traditional ANN methods of back-propagation and radial basis function networks. The novelty of this research is the integration of the advanced methods of genetic algorithms, regression trees, and analysis of variance in modeling vehicular indoor air quality for monitored in-vehicle gaseous and particulate matter contaminants, and the comparison of the results with conventional artificial intelligence techniques of back-propagation and radial basis function networks. This study validated the newly developed approach using holdout and threefold cross-validation methods. These results are of great interest to scientists, researchers, and the public in understanding the various aspects of modeling an indoor microenvironment. This methodology can easily be extended to other fields of study.
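A compact sketch of the two GART stages on simulated data: a regression tree screens candidate inputs, then a tiny genetic algorithm searches neural network hyperparameters. Note the GA here evolves hyperparameters rather than network weights, and all data, population sizes, and ranges are invented; the study's actual evolutionary scheme may differ.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(8)
# Simulated bus-cabin data: 8 candidate predictors, CO2 depends on 3 of them
X = rng.normal(size=(600, 8))
y = 400 + 50 * X[:, 0] - 30 * X[:, 3] + 20 * X[:, 5] + rng.normal(scale=10, size=600)

# Stage 1 (stand-in for the regression-tree screen): keep important inputs
tree = DecisionTreeRegressor(max_depth=5, random_state=0).fit(X, y)
keep = np.argsort(tree.feature_importances_)[-3:]
Xtr, Xte, ytr, yte = train_test_split(X[:, keep], y, test_size=0.3, random_state=0)

def fitness(genome):
    hidden, lr = genome
    net = MLPRegressor(hidden_layer_sizes=(int(hidden),), learning_rate_init=lr,
                       max_iter=500, random_state=0).fit(Xtr, ytr)
    return net.score(Xte, yte)                      # R^2 on held-out data

# Stage 2: tiny genetic algorithm over (hidden units, learning rate)
pop = [(rng.integers(4, 64), 10 ** rng.uniform(-4, -1)) for _ in range(8)]
for gen in range(5):
    parents = sorted(pop, key=fitness, reverse=True)[:4]     # selection
    children = [(max(4, int(h + rng.integers(-8, 9))),       # mutation
                 lr * 10 ** rng.uniform(-0.3, 0.3)) for h, lr in parents]
    pop = parents + children
best = max(pop, key=fitness)
print("best genome (hidden units, lr):", best, "R^2:", round(fitness(best), 3))
```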
NASA Technical Reports Server (NTRS)
Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.
1993-01-01
Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by the Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.
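The simplest probabilistic counterpart to a deterministic safety factor is stress-strength interference estimated by Monte Carlo sampling; the sketch below uses assumed lognormal scatter with illustrative values only, not numbers from the report.

```python
import numpy as np

rng = np.random.default_rng(9)
n = 1_000_000

# Stress-strength interference with assumed lognormal scatter (illustrative)
stress = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n)    # MPa
strength = rng.lognormal(mean=np.log(450.0), sigma=0.08, size=n)  # MPa

pf = np.mean(strength <= stress)          # fraction of failed realizations
print(f"central safety factor: {450 / 300:.2f}")
print(f"Monte Carlo probability of failure: {pf:.2e}")
```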
A review of lipidomic technologies applicable to sphingolipidomics and their relevant applications
Han, Xianlin; Jiang, Xuntian
2009-01-01
Sphingolipidomics, a branch of lipidomics, focuses on the large-scale study of cellular sphingolipidomes. In the current review, two main approaches for the analysis of cellular sphingolipidomes (i.e. the LC-MS- or LC-MS/MS-based approach and the shotgun lipidomics-based approach) are briefly discussed. Their advantages, some considerations of these methods, and recent applications of these approaches are summarized. It is the authors' sincere hope that this review article will add to the readers' understanding of the advantages and limitations of each developed method for the analysis of a cellular sphingolipidome. PMID:19690629
NASA Technical Reports Server (NTRS)
Schlater, Nelson J.; Simonds, Charles H.; Ballin, Mark G.
1993-01-01
Applied research and technology development (R&TD) is often characterized by uncertainty, risk, and significant delays before tangible returns are obtained. Given the increased awareness of limitations in resources, effective R&TD today needs a method for up-front assessment of competing technologies to help guide technology investment decisions. Such an assessment approach must account for uncertainties in system performance parameters, mission requirements and architectures, and internal and external events influencing a development program. The methodology known as decision analysis has the potential to address these issues. It was evaluated by performing a case study assessment of alternative carbon dioxide removal technologies for NASA's proposed First Lunar Outpost program. An approach was developed that accounts for the uncertainties in each technology's cost and performance parameters as well as programmatic uncertainties such as mission architecture. Life cycle cost savings relative to a baseline, adjusted for the cost of money, was used as a figure of merit to evaluate each of the alternative carbon dioxide removal technology candidates. The methodology was found to provide a consistent decision-making strategy for development of new life support technology. The case study results provided insight that was not possible from more traditional analysis approaches.
Optimal guidance law development for an advanced launch system
NASA Technical Reports Server (NTRS)
Calise, Anthony J.; Leung, Martin S. K.
1995-01-01
The objective of this research effort was to develop a real-time guidance approach for launch vehicle ascent to orbit injection. Various analytical approaches combined with a variety of model order and model complexity reductions were investigated. Singular perturbation methods were first attempted and found to be unsatisfactory. A second approach based on regular perturbation analysis was subsequently investigated. It also failed because the aerodynamic effects (ignored in the zero-order solution) are too large to be treated as perturbations. The study therefore demonstrates that perturbation methods alone (both regular and singular) are inadequate for developing a guidance algorithm for the atmospheric flight phase of a launch vehicle. During a second phase of the research effort, a hybrid analytic/numerical approach was developed and evaluated. The approach combines the numerical method of collocation with the analytical method of regular perturbations. The concept of choosing intelligent interpolating functions is also introduced. Regular perturbation analysis allows the use of a crude representation for the collocation solution, and intelligent interpolating functions further reduce the number of elements without sacrificing approximation accuracy. As a result, the combined method forms a powerful tool for solving real-time optimal control problems. Details of the approach are illustrated in a fourth-order nonlinear example. The hybrid approach is then applied to the launch vehicle problem. The collocation solution is derived from a bilinear tangent steering law, and results in a guidance solution for the entire flight regime that includes both atmospheric and exoatmospheric flight phases.
ERIC Educational Resources Information Center
MacNeela, Pádraig; Gannon, Niall
2014-01-01
Volunteering among university students is an important expression of civic engagement, but the impact of this experience on the development of emerging adults requires further contextualization. Adopting interpretative phenomenological analysis as a qualitative research approach, we carried out semistructured interviews with 10 students of one…
Meta-Analysis: An Approach to Interview Success.
ERIC Educational Resources Information Center
McCaslin, Mark; Carlson, Nancy M.
An initial research step, developing an effective interview strategy, presents unique challenges for novice and master research alike. To focus qualitative research in the human ecology of the study, the strategy presented in this paper used an initial interview protocol and preanalysis process, called meta-analysis, prior to developing the formal…
Developing comparative criminology and the case of China: an introduction.
Liu, Jianhong
2007-02-01
Although comparative criminology has made significant progress during the past decade or so, systematic empirical research has developed along only a few topics, and comparative criminology has never occupied a central position in criminology. This article analyzes the major theoretical and methodological impediments to the development of comparative criminology. It stresses a need to shift methodology from the conventional approach that uses the nation as the unit of analysis to an in-depth case study method as the primary methodological approach. The article maintains that the case study method can overcome the limitations of its descriptive tradition and become a promising methodological approach for comparative criminology.
The initial instability and finite-amplitude stability of alternate bars in straight channels
Nelson, J.M.
1990-01-01
The initial instability and fully developed stability of alternate bars in straight channels are investigated using linearized and nonlinear analyses. The fundamental instability leading to these features is identified through a linear stability analysis of the equations governing the flow and sediment transport fields. This instability is explained in terms of topographically induced steering of the flow and the associated pattern of erosion and deposition on the bed. While the linear theory is useful for examining the instability mechanism, this approach is shown to yield relatively little information about well-developed alternate bars and, specifically, the linear analysis is shown to yield poor predictions of the fully developed bar wavelength. A fully nonlinear approach is presented that permits computation of the evolution of these bed features from an initial perturbation to their fully developed morphology. This analysis indicates that there is typically substantial elongation of the bar wavelength during the evolution process, a result that is consistent with observations of bar development in flumes and natural channels. The nonlinear approach demonstrates that the eventual stability of these features is a result of the interplay between topographic steering effects, secondary flow production as a result of streamline curvature, and gravitationally induced modifications of sediment fluxes over a sloping bed. © 1990.
Control, responses and modularity of cellular regulatory networks: a control analysis perspective.
Bruggeman, F J; Snoep, J L; Westerhoff, H V
2008-11-01
Cells adapt to changes in environmental conditions through the concerted action of signalling, gene expression and metabolic subsystems. The authors discuss a theoretical framework addressing such integrated systems. This 'hierarchical analysis' was first developed as an extension of metabolic control analysis. It builds on the phenomenon that communication between signalling, gene expression and metabolic subsystems is often almost exclusively via regulatory interactions rather than mass flow interactions. This allows the subsystems to be treated as 'levels' in a hierarchical view of the organisation of the molecular reaction network of cells. A major advantage of such a hierarchical approach is that levels can be analysed conceptually in isolation from each other (from a local intra-level perspective) and integrated at a later stage via their interactions (from a global inter-level perspective). It thereby allows for a modular approach with variable scope. A number of different approaches have been developed for the analysis of hierarchical systems, for example hierarchical control analysis and modular response analysis. The authors review these methods here and illustrate the strength of these types of analyses using a core model of a system with gene expression, metabolic and signal transduction levels.
Sparse models for correlative and integrative analysis of imaging and genetic data
Lin, Dongdong; Cao, Hongbao; Calhoun, Vince D.
2014-01-01
The development of advanced medical imaging technologies and high-throughput genomic measurements has enhanced our ability to understand their interplay as well as their relationship with human behavior by integrating these two types of datasets. However, the high dimensionality and heterogeneity of these datasets present a challenge to conventional statistical methods; there is a high demand for the development of both correlative and integrative analysis approaches. Here, we review our recent work on developing sparse representation based approaches to address this challenge. We show how sparse models are applied to the correlation and integration of imaging and genetic data for biomarker identification. We present examples of how these approaches are used for the detection of risk genes and classification of complex diseases such as schizophrenia. Finally, we discuss future directions on the integration of multiple imaging and genomic datasets, including their interactions such as epistasis. PMID:25218561
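A minimal sparse-regression sketch in the spirit of this review, assuming Lasso as the sparse model; the simulated genotypes and imaging phenotype are illustrative, not the authors' data or pipeline.

import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(3)
n, p = 200, 1000                          # many more variants than subjects
G = rng.binomial(2, 0.3, size=(n, p)).astype(float)            # SNP dosages 0/1/2
beta = np.zeros(p); beta[[10, 250, 777]] = [0.8, -0.6, 0.5]    # 3 causal variants
y = G @ beta + rng.normal(scale=1.0, size=n)                   # imaging phenotype

model = LassoCV(cv=5).fit(G, y)
selected = np.flatnonzero(model.coef_)
print("variants selected:", selected)     # ideally recovers {10, 250, 777}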
A Polyglot Approach to Bioinformatics Data Integration: A Phylogenetic Analysis of HIV-1
Reisman, Steven; Hatzopoulos, Thomas; Läufer, Konstantin; Thiruvathukal, George K.; Putonti, Catherine
2016-01-01
As sequencing technologies continue to drop in price and increase in throughput, new challenges emerge for the management and accessibility of genomic sequence data. We have developed a pipeline for facilitating the storage, retrieval, and subsequent analysis of molecular data, integrating both sequence and metadata. Taking a polyglot approach involving multiple languages, libraries, and persistence mechanisms, sequence data can be aggregated from publicly available and local repositories. Data are exposed in the form of a RESTful web service, formatted for easy querying, and retrieved for downstream analyses. As a proof of concept, we have developed a resource for annotated HIV-1 sequences. Phylogenetic analyses were conducted for >6,000 HIV-1 sequences, revealing that spatial and temporal factors influence the evolution of the individual genes uniquely. Nevertheless, signatures of origin can be extrapolated despite increased globalization. The approach developed here can easily be customized for any species of interest. PMID:26819543
ERIC Educational Resources Information Center
White, Doug
This volume is part of a series of monographs from Australia devoted to outlining an alternative approach, based on neo-Marxist concepts, to policy studies in education. The opening essay in this volume is a historical analysis of federal involvement in Australian educational policy development. After a descriptive overview of the role of…
Decision analysis in clinical cardiology: When is coronary angiography required in aortic stenosis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Georgeson, S.; Meyer, K.B.; Pauker, S.G.
1990-03-15
Decision analysis offers a reproducible, explicit approach to complex clinical decisions. It consists of developing a model, typically a decision tree, that separates choices from chances and that specifies and assigns relative values to outcomes. Sensitivity analysis allows exploration of alternative assumptions. Cost-effectiveness analysis shows the relation between dollars spent and improved health outcomes achieved. In a tutorial format, this approach is applied to the decision whether to perform coronary angiography in a patient who requires aortic valve replacement for critical aortic stenosis.
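A toy decision-tree calculation in this tutorial spirit, with invented probabilities and utilities (not clinical guidance): the expected utility of angiography before valve replacement versus proceeding directly to surgery, with a one-way sensitivity analysis over coronary artery disease prevalence.

import numpy as np

def expected_utility(p_cad, angio_risk=0.001,
                     u_avr=0.90, u_avr_cabg=0.85, u_missed_cad=0.70):
    # Branch 1: angiography first, then surgery tailored to the findings.
    angio = (1 - angio_risk) * (p_cad * u_avr_cabg + (1 - p_cad) * u_avr)
    # Branch 2: straight to AVR; untreated CAD lowers the outcome.
    direct = p_cad * u_missed_cad + (1 - p_cad) * u_avr
    return angio, direct

# One-way sensitivity analysis over CAD prevalence.
for p in np.linspace(0.0, 0.6, 7):
    a, d = expected_utility(p)
    better = "angiography" if a > d else "direct surgery"
    print(f"P(CAD)={p:.1f}  EU(angio)={a:.3f}  EU(direct)={d:.3f}  -> {better}")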
Genome-based approaches to develop vaccines against bacterial pathogens.
Serruto, Davide; Serino, Laura; Masignani, Vega; Pizza, Mariagrazia
2009-05-26
Bacterial infectious diseases remain the single most important threat to health worldwide. Although conventional vaccinology approaches were successful in conferring protection against several diseases, they failed to provide efficacious solutions against many others. The advent of whole-genome sequencing changed the way to think about vaccine development, enabling the targeting of possible vaccine candidates starting from the genomic information of a single bacterial isolate, with a process named reverse vaccinology. As the genomic era progressed, reverse vaccinology has evolved with a pan-genome approach and multi-strain genome analysis became fundamental for the design of universal vaccines. This review describes the applications of genome-based approaches in the development of new vaccines against bacterial pathogens.
Leung, Janet T Y; Shek, Daniel T L
2011-01-01
This paper examines the use of quantitative and qualitative approaches to study the impact of economic disadvantage on family processes and adolescent development. Quantitative research has the merits of objectivity, good predictive and explanatory power, parsimony, precision and sophistication of analysis. Qualitative research, in contrast, provides a detailed, holistic, in-depth understanding of social reality and allows illumination of new insights. Given the pragmatic considerations of methodological appropriateness, design flexibility, and situational responsiveness in responding to the research inquiry, a mixed methods approach offers a way of integrating quantitative and qualitative approaches and an alternative strategy for studying the impact of economic disadvantage on family processes and adolescent development.
NASA Technical Reports Server (NTRS)
Hou, Jean W.
1985-01-01
The thermal analysis and the calculation of thermal sensitivity for a cure cycle in autoclave processing of thick composite laminates were studied. A finite element program for the thermal analysis and for calculating design derivatives of the temperature distribution and the degree of cure was developed and verified. Direct differentiation was found to be the best approach for the thermal design sensitivity analysis. In addition, direct differentiation provides time histories of the design derivatives, which are of great value to cure cycle designers. Direct differentiation is to be used in further study, i.e., optimal cure cycle design.
Data sharing for public health research: A qualitative study of industry and academia.
Saunders, Pamela A; Wilhelm, Erin E; Lee, Sinae; Merkhofer, Elizabeth; Shoulson, Ira
2014-01-01
Data sharing is a key biomedical research theme for the 21st century. Biomedical data sharing is the exchange of data among (non)affiliated parties under mutually agreeable terms to promote scientific advancement and the development of safe and effective medical products. Wide sharing of research data is important for scientific discovery, medical product development, and public health. Data sharing enables improvements in development of medical products, more attention to rare diseases, and cost-efficiencies in biomedical research. We interviewed 11 participants about their attitudes and beliefs about data sharing. Using a qualitative, thematic analysis approach, our analysis revealed a number of themes including: experiences, approaches, perceived challenges, and opportunities for sharing data.
Saramago, Pedro; Woods, Beth; Weatherly, Helen; Manca, Andrea; Sculpher, Mark; Khan, Kamran; Vickers, Andrew J; MacPherson, Hugh
2016-10-06
Network meta-analysis methods, which are an extension of the standard pair-wise synthesis framework, allow for the simultaneous comparison of multiple interventions and consideration of the entire body of evidence in a single statistical model. There are well-established advantages to using individual patient data to perform network meta-analysis, and methods for network meta-analysis of individual patient data have already been developed for dichotomous and time-to-event data. This paper describes appropriate methods for the network meta-analysis of individual patient data on continuous outcomes. It introduces network meta-analysis models for individual patient data using the analysis of covariance framework. Comparisons are made between this approach and the change score and final score only approaches, which are frequently used and have been proposed in the methodological literature. A motivating example on the effectiveness of acupuncture for chronic pain is used to demonstrate the methods. Individual patient data on 28 randomised controlled trials were synthesised. Consistency of endpoints across the evidence base was obtained through standardisation and mapping exercises. The availability of individual patient data avoided the use of non-baseline-adjusted models, allowing instead analysis of covariance models to be applied and thus improving the precision of treatment effect estimates while adjusting for baseline imbalance. The network meta-analysis of individual patient data using the analysis of covariance approach is advocated as the most appropriate modelling approach for network meta-analysis of continuous outcomes, particularly in the presence of baseline imbalance. Further methodological developments are required to address the challenge of analysing aggregate level data in the presence of baseline imbalance.
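A minimal sketch of the ANCOVA versus change-score contrast on simulated individual patient data from a single two-arm trial; a full network meta-analysis would add trial and treatment structure, so this is illustrative only.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 400
baseline = rng.normal(60, 10, n)                  # baseline pain score
treat = rng.integers(0, 2, n)                     # 0 = control, 1 = acupuncture
final = 0.6 * baseline - 5.0 * treat + rng.normal(0, 8, n)
df = pd.DataFrame({"final": final, "base": baseline, "treat": treat,
                   "change": final - baseline})

ancova = smf.ols("final ~ base + treat", data=df).fit()
change = smf.ols("change ~ treat", data=df).fit()
print("ANCOVA effect:       %.2f (SE %.2f)" % (ancova.params["treat"], ancova.bse["treat"]))
print("change-score effect: %.2f (SE %.2f)" % (change.params["treat"], change.bse["treat"]))
# ANCOVA typically gives the smaller standard error, and it adjusts
# for any chance baseline imbalance between the arms.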
The chronnectome: time-varying connectivity networks as the next frontier in fMRI data discovery.
Calhoun, Vince D; Miller, Robyn; Pearlson, Godfrey; Adalı, Tulay
2014-10-22
Recent years have witnessed a rapid growth of interest in moving functional magnetic resonance imaging (fMRI) beyond simple scan-length averages and into approaches that capture time-varying properties of connectivity. In this Perspective we use the term "chronnectome" to describe metrics that allow a dynamic view of coupling. In the chronnectome, coupling refers to possibly time-varying levels of correlated or mutually informed activity between brain regions whose spatial properties may also be temporally evolving. We primarily focus on multivariate approaches developed in our group and review a number of approaches with an emphasis on matrix decompositions such as principal component analysis and independent component analysis. We also discuss the potential these approaches offer to improve characterization and understanding of brain function. There are a number of methodological directions that need to be developed further, but chronnectome approaches already show great promise for the study of both the healthy and the diseased brain.
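One common entry point to such time-varying connectivity is a sliding-window correlation followed by a matrix decomposition; the sketch below uses synthetic series and a plain SVD/PCA, not the authors' pipeline.

import numpy as np

rng = np.random.default_rng(5)
T, R = 600, 10                                # time points, brain regions
ts = rng.normal(size=(T, R))
ts[:300, 1] += 0.8 * ts[:300, 0]              # regions 0-1 couple early on
ts[300:, 2] += 0.8 * ts[300:, 0]              # coupling shifts to 0-2 later

win, step = 60, 10
iu = np.triu_indices(R, k=1)
windows = []
for start in range(0, T - win + 1, step):
    c = np.corrcoef(ts[start:start + win].T)  # windowed correlation matrix
    windows.append(c[iu])                     # vectorize the upper triangle
W = np.array(windows)                         # windows x connections

# PCA over windowed connectivity reveals the dominant time-varying pattern.
W0 = W - W.mean(axis=0)
U, s, Vt = np.linalg.svd(W0, full_matrices=False)
print("variance explained by first component: %.0f%%" % (100 * s[0]**2 / (s**2).sum()))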
Bahm, Sarah M; Karkazis, Katrina; Magnus, David
2013-09-01
To identify and analyze existing posthumous sperm procurement (PSP) protocols in order to outline central themes for institutions to consider when developing future policies. Qualitative content analysis. Large academic institutions across the United States. We performed a literature search and contacted 40 institutions to obtain nine full PSP protocols. We then performed a content analysis on these policies to identify major themes and factors to consider when developing a PSP protocol. Presence of a PSP policy. We identified six components of a thorough PSP protocol: Standard of Evidence, Terms of Eligibility, Sperm Designee, Restrictions on Use in Reproduction, Logistics, and Contraindications. We also identified two different approaches to policy structure. In the Limited Role approach, institutions have stricter consent requirements and limit their involvement to the time of procurement. In the Family-Centered approach, substituted judgment is permitted but a mandatory wait period is enforced before sperm use in reproduction. Institutions seeking to implement a PSP protocol will benefit from considering the six major building blocks of a thorough protocol and where they would like to fall on the spectrum from a Limited Role to a Family-Centered approach. Copyright © 2013 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
The Aeronautical Data Link: Decision Framework for Architecture Analysis
NASA Technical Reports Server (NTRS)
Morris, A. Terry; Goode, Plesent W.
2003-01-01
A decision analytic approach that develops optimal data link architecture configuration and behavior to meet multiple conflicting objectives of concurrent and different airspace operations functions has previously been developed. The approach, premised on a formal taxonomic classification that correlates data link performance with operations requirements, information requirements, and implementing technologies, provides a coherent methodology for data link architectural analysis from top-down and bottom-up perspectives. This paper follows the previous research by providing more specific approaches for mapping and transitioning between the lower levels of the decision framework. The goal of the architectural analysis methodology is to assess the impact of specific architecture configurations and behaviors on the efficiency, capacity, and safety of operations. This necessarily involves understanding the various capabilities, system level performance issues and performance and interface concepts related to the conceptual purpose of the architecture and to the underlying data link technologies. Efficient and goal-directed data link architectural network configuration is conditioned on quantifying the risks and uncertainties associated with complex structural interface decisions. Deterministic and stochastic optimal design approaches will be discussed that maximize the effectiveness of architectural designs.
Panel Discussion on Multi-Disciplinary Analysis
NASA Technical Reports Server (NTRS)
Garcia, Robert
2002-01-01
The Marshall Space Flight Center (MSFC) is hosting the Thermal and Fluids Analysis Workshop (TFAWS) during the week of September 10, 2001. Included in this year's TFAWS is a panel session on Multidisciplinary Analysis techniques. The intent is to provide an opportunity for the users to gain information as to what product may be best suited for their applications environment and to provide feedback to you, the developers, on future desired developments. Potential users of multidisciplinary analysis (MDA) techniques are often overwhelmed by the number of choices available to them via commercial products and by the pace of new developments in this area. The purpose of this panel session is to provide a forum wherein MDA tools available and under development can be discussed, compared, and contrasted. The intent of this panel is to provide the end-user with the information necessary to make educated decisions on how to proceed with selecting their MDA tool. It is anticipated that the discussions this year will focus on MDA techniques that couple discipline codes or algorithms (as opposed to monolithic, unified MDA approaches). The MDA developers will be asked to prepare a product overview presentation addressing specific questions provided by the panel organizers. The purpose of these questions will be to establish the method employed by the particular MDA technique for communication between the discipline codes, to establish the similarities and differences amongst the various approaches, and to establish the range of experience and applications for each particular MDA approach.
ERIC Educational Resources Information Center
Woods, Jeffrey G.
2012-01-01
Purpose: The purpose of this research is to provide an in-depth analysis of the labor market for apprentice training in the US construction industry. Also, the paper analyzes the learning process of apprentices and discusses the role of apprenticeships as a pathway to higher education. Design/methodology/approach: The interdisciplinary approach of…
Integrated transient thermal-structural finite element analysis
NASA Technical Reports Server (NTRS)
Thornton, E. A.; Dechaumphai, P.; Wieting, A. R.; Tamma, K. K.
1981-01-01
An integrated thermal-structural finite element approach for efficient coupling of transient thermal and structural analysis is presented. Integrated thermal-structural rod and one-dimensional axisymmetric elements considering conduction and convection are developed and used in transient thermal-structural applications. The improved accuracy of the integrated approach is illustrated by comparisons with exact transient heat-conduction elasticity solutions and with conventional finite element thermal and finite element structural analyses.
Estimating the Regional Economic Significance of Airports
1992-09-01
following three options for estimating induced impacts: the economic base model, an econometric model, and a regional input-output model. One approach to ... limitations, however, the economic base model has been widely used for regional economic analysis. A second approach is to develop an econometric model of ... analysis is the principal statistical tool used to estimate the economic relationships. Regional econometric models are capable of estimating a single
Modal analysis of untransposed bilateral three-phase lines -- a perturbation approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faria, J.A.B.; Mendes, J.H.B.
1997-01-01
Modal analysis of three-phase power lines exhibiting bilateral symmetry leads to modal transformation matrices that closely resemble Clarke's transformation. The authors develop a perturbation theory approach to justify, interpret, and gain understanding of this well known fact. Further, the authors show how to find new frequency dependent correction terms that, once added to Clarke's transformation, lead to improved accuracy.
ERIC Educational Resources Information Center
Essid, Hedi; Ouellette, Pierre; Vigeant, Stephane
2010-01-01
The objective of this paper is to measure the efficiency of high schools in Tunisia. We use a statistical data envelopment analysis (DEA)-bootstrap approach with quasi-fixed inputs to estimate the precision of our measure. To do so, we developed a statistical model serving as the foundation of the data generation process (DGP). The DGP is…
Design and Development of an E-Learning Environment for the Course of Electrical Circuit Analysis
ERIC Educational Resources Information Center
Deperlioglu, Omer; Kose, Utku; Yildirim, Ramazan
2012-01-01
E-learning is an educational approach that combines different types of multimedia technologies to ensure better education experiences for students and teachers. Today, it is a popular approach among especially teachers and educators. In this sense, this paper describes a web based e-learning system that was designed and developed to be used in the…
ERIC Educational Resources Information Center
Shakuto, Elena A.; Dorozhkin, Evgenij M.; Kozlova, Anastasia A.
2016-01-01
The relevance of the subject under analysis is determined by the lack of theoretical development of the problem of management of teacher scientific-methodical work in vocational educational institutions based upon innovative approaches in the framework of project paradigm. The purpose of the article is to develop and test a science-based…
ERIC Educational Resources Information Center
Zhou, Wenxia; Sun, Jianmin; Guan, Yanjun; Li, Yuhui; Pan, Jingzhou
2013-01-01
The current research aimed to develop a multidimensional measure on the criteria of career success in a Chinese context. Items on the criteria of career success were obtained using a qualitative approach among 30 Chinese employees; exploratory factor analysis was conducted to select items and determine the factor structure among a new sample of…
ERIC Educational Resources Information Center
Nguyen, Huu Cuong; Evers, Colin; Marshall, Stephen
2017-01-01
Purpose: The purpose of this paper is to investigate the development of Viet Nam's approach to higher education quality assurance during the past dozen years since its establishment, focusing on the achievements and challenges. Design/methodology/approach: This is a desktop analysis study. The paper analyses the policies and practices related to…
ERIC Educational Resources Information Center
Lee, Lena; Tu, Xintian
2016-01-01
As digital media devices have been increasingly used in early childhood educational settings, this study examined whether the iPad with a Vygotskian social development approach--namely, More Knowledgeable Other--can be integrated into low-income preschool classrooms to improve science learning. An analysis of variance was used to examine the…
Modeling energy/economy interactions for conservation and renewable energy-policy analysis
NASA Astrophysics Data System (ADS)
Groncki, P. J.
Energy policy and its implications for policy analysis and methodological tools are discussed. The evolution of one methodological approach is reported: the combined modeling system of component models, their evolution in response to changing analytic needs, and the development of the integrated framework. The analyses performed over the past several years are summarized. The current philosophy behind energy policy is discussed and compared to recent history. Implications for current policy analysis and methodological approaches are drawn.
Low-cost digital image processing at the University of Oklahoma
NASA Technical Reports Server (NTRS)
Harrington, J. A., Jr.
1981-01-01
Computer-assisted instruction in remote sensing at the University of Oklahoma involves two separate approaches and is dependent upon initial preprocessing of a LANDSAT computer compatible tape using software developed for an IBM 370/158 computer. In-house generated preprocessing algorithms permit students or researchers to select a subset of a LANDSAT scene for subsequent analysis using either general purpose statistical packages or color graphic image processing software developed for Apple II microcomputers. Procedures for preprocessing the data and for image analysis using either of the two approaches to low-cost LANDSAT data processing are described.
Using student writing assignments to assess critical thinking skills: a holistic approach.
Niedringhaus, L K
2001-04-01
This work offers an example of one school's holistic approach to the evaluation of critical thinking by using student writing assignments. Faculty developed tools to assess achievement of critical thinking competencies, such as analysis, synthesis, insight, reflection, open mindedness, and depth, breadth, and appropriateness of clinical interventions. Faculty created a model for the development of program-specific critical thinking competencies, selected appropriate writing assignments that demonstrate critical thinking, and implemented a holistic assessment plan for data collection and analysis. Holistic assessment involves the identification of shared values and practices, and the use of concepts and language important to nursing.
Bourne, Tom; De Rijdt, Sylvie; Van Holsbeke, Caroline; Sayasneh, Ahmad; Valentin, Lil; Van Calster, Ben; Timmerman, Dirk
2015-01-01
The principal aim of the IOTA project has been to develop approaches to the evaluation of adnexal pathology using ultrasound that can be transferred to all examiners. Creating models that use simple, easily reproducible ultrasound characteristics is one approach. PMID:28191150
ERIC Educational Resources Information Center
Walter, Justin D.; Littlefield, Peter; Delbecq, Scott; Prody, Gerry; Spiegel, P. Clint
2010-01-01
New approaches are currently being developed to expose biochemistry and molecular biology undergraduates to a more interactive learning environment. Here, we propose a unique project-based laboratory module, which incorporates exposure to biophysical chemistry approaches to address problems in protein chemistry. Each of the experiments described…
The Influence of Moral Education on the Personal Worldview of Students
ERIC Educational Resources Information Center
van der Kooij, Jacomijn C.; de Ruyter, Doret J.; Miedema, Siebren
2015-01-01
This article researches whether approaches to moral education aim to influence the development of the personal worldview of students. An example of a Dutch moral education programme is presented and the findings are used to analyse various approaches to moral education. Our analysis demonstrates that every approach aims to influence the personal…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Ba Nghiep; Bapanapalli, Satish K.; Smith, Mark T.
2008-09-01
The objective of our work is to enable the optimum design of lightweight automotive structural components using injection-molded long fiber thermoplastics (LFTs). To this end, an integrated approach that links process modeling to structural analysis with experimental microstructural characterization and validation is developed. First, process models for LFTs are developed and implemented into processing codes (e.g. ORIENT, Moldflow) to predict the microstructure of the as-formed composite (i.e. fiber length and orientation distributions). In parallel, characterization and testing methods are developed to obtain the microstructural data necessary to validate process modeling predictions. Second, the predicted LFT composite microstructure is imported into a structural finite element analysis by ABAQUS to determine the response of the as-formed composite to given boundary conditions. At this stage, constitutive models accounting for the composite microstructure are developed to predict various types of behaviors (i.e. thermoelastic, viscoelastic, elastic-plastic, damage, fatigue, and impact) of LFTs. Experimental methods are also developed to determine material parameters and to validate the constitutive models. Such a process-linked structural modeling approach allows an LFT composite structure to be designed with confidence through numerical simulations. Some recent results of our collaborative research are illustrated to show the usefulness and applications of this integrated approach.
NASA Astrophysics Data System (ADS)
Hanoca, P.; Ramakrishna, H. V.
2018-03-01
This work develops a methodology to model and simulate thermo-elastohydrodynamic (TEHD) lubrication using the sequential application of CFD and CSD. The FSI analyses are carried out using ANSYS Workbench. In this analysis, the steady-state, 3D Navier-Stokes equations are solved along with the energy equation. Liquid properties are introduced in which the viscosity and density are functions of pressure and temperature, and the cavitation phenomenon is included in the analysis. Numerical analyses have been carried out at different speeds and surface temperatures. It was found that hydrodynamic pressures increase with speed. The pressure profile obtained from the Roelands equation is more sensitive to temperature than that from the Barus equation. The stress distributions identify the significant positions in the bearing structure. The developed method is capable of giving new insight into the physics of elastohydrodynamic lubrication.
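The two pressure-viscosity laws mentioned can be compared directly; the sketch below uses the standard Barus law and the isothermal Roelands form with typical parameter values, which are assumptions rather than the paper's inputs (the full Roelands law adds a temperature factor).

import numpy as np

eta0 = 0.04       # Pa.s, ambient viscosity (assumed)
alpha = 2.0e-8    # 1/Pa, Barus pressure-viscosity coefficient (typical)
Z = 0.6           # Roelands pressure index (typical)
p0 = 1.96e8       # Pa, Roelands reference pressure

def barus(p):
    return eta0 * np.exp(alpha * p)

def roelands(p):
    return eta0 * np.exp((np.log(eta0) + 9.67) * ((1 + p / p0) ** Z - 1))

for p in [1e8, 5e8, 1e9]:
    print(f"p={p:.0e} Pa  Barus={barus(p):.3g} Pa.s  Roelands={roelands(p):.3g} Pa.s")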
Visual analytics for aviation safety: A collaborative approach to sensemaking
NASA Astrophysics Data System (ADS)
Wade, Andrew
Visual analytics, the "science of analytical reasoning facilitated by interactive visual interfaces", is more than just visualization. Understanding the human reasoning process is essential for designing effective visualization tools and providing correct analyses. This thesis describes the evolution, application and evaluation of a new method for studying analytical reasoning that we have labeled paired analysis. Paired analysis combines subject matter experts (SMEs) and tool experts (TEs) in an analytic dyad, here used to investigate aircraft maintenance and safety data. The method was developed and evaluated using interviews, pilot studies and analytic sessions during an internship at the Boeing Company. By enabling a collaborative approach to sensemaking that can be captured by researchers, paired analysis yielded rich data on human analytical reasoning that can be used to support analytic tool development and analyst training. Keywords: visual analytics, paired analysis, sensemaking, Boeing, collaborative analysis.
Buckling Load Calculations of the Isotropic Shell A-8 Using a High-Fidelity Hierarchical Approach
NASA Technical Reports Server (NTRS)
Arbocz, Johann; Starnes, James H.
2002-01-01
As a step towards developing a new design philosophy, one that moves away from the traditional empirical approach used in design today towards a science-based design technology approach, a test series of 7 isotropic shells carried out by Arbocz and Babcock at Caltech is used. It is shown how the hierarchical approach to buckling load calculations proposed by Arbocz et al. can be used to perform an approach often called 'high fidelity analysis', where the uncertainties involved in a design are simulated by refined and accurate numerical methods. The Delft Interactive Shell DEsign COde (short, DISDECO) is employed for this hierarchical analysis to provide an accurate prediction of the critical buckling load of the given shell structure. This value is used later as a reference to establish the accuracy of the Level-3 buckling load predictions. As a final step in the hierarchical analysis approach, the critical buckling load and the estimated imperfection sensitivity of the shell are verified by conducting an analysis using a sufficiently refined finite element model with one of the current generation two-dimensional shell analysis codes with the advanced capabilities needed to represent both geometric and material nonlinearities.
On a High-Fidelity Hierarchical Approach to Buckling Load Calculations
NASA Technical Reports Server (NTRS)
Arbocz, Johann; Starnes, James H.; Nemeth, Michael P.
2001-01-01
As a step towards developing a new design philosophy, one that moves away from the traditional empirical approach used today in design towards a science-based design technology approach, a recent test series of 5 composite shells carried out by Waters at NASA Langley Research Center is used. It is shown how the hierarchical approach to buckling load calculations proposed by Arbocz et al. can be used to perform an approach often called "high fidelity analysis", where the uncertainties involved in a design are simulated by refined and accurate numerical methods. The Delft Interactive Shell DEsign COde (short, DISDECO) is employed for this hierarchical analysis to provide an accurate prediction of the critical buckling load of the given shell structure. This value is used later as a reference to establish the accuracy of the Level-3 buckling load predictions. As a final step in the hierarchical analysis approach, the critical buckling load and the estimated imperfection sensitivity of the shell are verified by conducting an analysis using a sufficiently refined finite element model with one of the current generation two-dimensional shell analysis codes with the advanced capabilities needed to represent both geometric and material nonlinearities.
A generalized least-squares framework for rare-variant analysis in family data.
Li, Dalin; Rotter, Jerome I; Guo, Xiuqing
2014-01-01
Rare variants may, in part, explain some of the heritability missing in current genome-wide association studies. Many gene-based rare-variant analysis approaches proposed in recent years are aimed at population-based samples, although analysis strategies for family-based samples are clearly warranted, since the family-based design has the potential to enhance our ability to enrich for rare causal variants. We recently developed the generalized least squares sequence kernel association test, or GLS-SKAT, approach for rare-variant analyses in family samples, in which the kinship matrix computed from high-dimensional genetic data is used to decorrelate the family structure. We then applied the SKAT-O approach for gene-/region-based inference in the decorrelated data. In this study, we applied this GLS-SKAT method to the systolic blood pressure data in the simulated family sample distributed by the Genetic Analysis Workshop 18. We compared the GLS-SKAT approach to the rare-variant analysis approach implemented in family-based association test-v1 and demonstrated that the GLS-SKAT approach provides superior power and good control of the type I error rate.
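A schematic of the decorrelation step described here, assuming a known kinship-derived covariance: whiten the phenotype and genotypes by its Cholesky factor, then form a SKAT-style weighted score. A real analysis would estimate variance components and the null distribution of Q; everything below is simulated.

import numpy as np
from scipy.linalg import cholesky, solve_triangular

rng = np.random.default_rng(6)
n, m = 120, 15                                # subjects, rare variants in a gene
fam = 0.5 * np.ones((4, 4)) + 0.5 * np.eye(4)           # toy within-family kinship
K = np.kron(np.eye(n // 4), fam)                        # families of 4
V = 0.4 * K + 0.6 * np.eye(n)                           # phenotype covariance
G = rng.binomial(2, 0.02, size=(n, m)).astype(float)    # rare genotypes
y = rng.multivariate_normal(0.9 * G[:, 0], V)           # variant 0 is causal

L = cholesky(V, lower=True)
y_w = solve_triangular(L, y - y.mean(), lower=True)     # decorrelated residuals
G_w = solve_triangular(L, G, lower=True)                # decorrelated genotypes

w = 1.0 / np.maximum(G.mean(axis=0) / 2, 1e-3)          # upweight rarer variants
s = G_w.T @ y_w                                         # per-variant scores
Q = float(w @ s**2)                                     # SKAT-style statistic
print(f"Q = {Q:.1f}")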
Designing a ticket to ride with the Cognitive Work Analysis Design Toolkit.
Read, Gemma J M; Salmon, Paul M; Lenné, Michael G; Jenkins, Daniel P
2015-01-01
Cognitive work analysis has been applied in the design of numerous sociotechnical systems. The process used to translate analysis outputs into design concepts, however, is not always clear. Moreover, structured processes for translating the outputs of ergonomics methods into concrete designs are lacking. This paper introduces the Cognitive Work Analysis Design Toolkit (CWA-DT), a design approach which has been developed specifically to provide a structured means of incorporating cognitive work analysis outputs in design using design principles and values derived from sociotechnical systems theory. This paper outlines the CWA-DT and describes its application in a public transport ticketing design case study. Qualitative and quantitative evaluations of the process provide promising early evidence that the toolkit fulfils the evaluation criteria identified for its success, with opportunities for improvement also highlighted. The Cognitive Work Analysis Design Toolkit has been developed to provide ergonomics practitioners with a structured approach for translating the outputs of cognitive work analysis into design solutions. This paper demonstrates an application of the toolkit and provides evaluation findings.
Kawarazuka, Nozomi; Locke, Catherine; McDougall, Cynthia; Kantor, Paula; Morgan, Miranda
2017-03-01
Gender analysis is now increasingly orthodox in natural resource programming, including that for small-scale fisheries. Whilst the analysis of social-ecological resilience has made valuable contributions to integrating social dimensions into research and policy-making on natural resource management, it has so far demonstrated limited success in effectively integrating considerations of gender equity. This paper reviews the challenges in, and opportunities for, bringing a gender analysis together with social-ecological resilience analysis in the context of small-scale fisheries research in developing countries. We conclude that rather than searching for a single unifying framework for gender and resilience analysis, it will be more effective to pursue a plural solution in which closer engagement is fostered between analyses of gender and social-ecological resilience whilst preserving the strengths of each approach. This approach can make an important contribution to developing a better evidence base for small-scale fisheries management and policy.
NASA Astrophysics Data System (ADS)
Rushton, Gregory T.; Lotter, Christine; Singer, Jonathan
2011-02-01
This study investigates the beliefs and practices of seven high school chemistry teachers as a result of their participation in a year-long inquiry professional development (PD) project. Analyses of oral interviews, written reflections, and in-class observations were used to determine the extent to which the PD affected the teachers' beliefs and practice. The data indicated that the teachers developed more complete conceptions of classroom inquiry, valued a "phenomena first" approach to scientific investigations, and viewed inquiry approaches as helpful for facilitating improved student thinking. Analysis of classroom observations with the Reformed Teaching Observation Protocol indicated that features of the PD were observed in the teachers' practice during the academic-year follow-up. Implications for effective science teacher professional development models are discussed.
Ares Project Technology Assessment: Approach and Tools
NASA Technical Reports Server (NTRS)
Hueter, Uwe; Tyson, Richard
2010-01-01
Technology assessments provide a status of the development maturity of specific technologies. Along with benefit analysis, the risks the project assumes can be quantified. Normally, due to budget constraints, the competing technologies are prioritized and decisions are made on which ones to fund. A detailed technology development plan is produced for the selected technologies to provide a roadmap for reaching the desired maturity by the project's critical design review. Technology assessments can be conducted both for technology-only tasks and for product development programs. This paper is primarily biased toward the product development programs. The paper discusses the Ares Project's approach to technology assessment. System benefit analysis, risk assessment, technology prioritization, and technology readiness assessment are addressed. A description of the technology readiness level tool being used is provided.
Policy Analysis for Sustainable Development: The Toolbox for the Environmental Social Scientist
ERIC Educational Resources Information Center
Runhaar, Hens; Dieperink, Carel; Driessen, Peter
2006-01-01
Purpose: The paper seeks to propose the basic competencies of environmental social scientists regarding policy analysis for sustainable development. The ultimate goal is to contribute to an improvement of educational programmes in higher education by suggesting a toolbox that should be integrated in the curriculum. Design/methodology/approach:…
Why It Matters How We Frame "Education" in Education for Sustainable Development
ERIC Educational Resources Information Center
Shephard, Kerry; Dulgar, Pete
2015-01-01
We analyzed two educational frameworks that seek to embed "education for sustainable development" into higher education (HE). Both identify that HE is failing to educate graduates able to address the sustainability needs of society and suggest approaches to remedy the situation. We used discourse analysis and framing analysis to explore…
USDA-ARS?s Scientific Manuscript database
Objective: To examine the risk factors of developing functional decline and make probabilistic predictions by using a tree-based method that allows higher order polynomials and interactions of the risk factors. Methods: The conditional inference tree analysis, a data mining approach, was used to con...
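Conditional inference trees are usually fit in R (e.g. partykit::ctree); as a hedged Python stand-in, scikit-learn's CART below illustrates how a tree yields probabilistic predictions of decline from simulated risk factors. The variables and coefficients are invented, not from the study.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(7)
n = 500
age = rng.uniform(65, 95, n)
grip = rng.normal(30, 8, n)                   # grip strength, kg
comorbid = rng.integers(0, 5, n)              # comorbidity count
risk = 1 / (1 + np.exp(-(0.08 * (age - 75) - 0.05 * (grip - 30) + 0.4 * comorbid - 1)))
decline = rng.random(n) < risk                # simulated functional-decline outcome

X = np.column_stack([age, grip, comorbid])
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=30).fit(X, decline)
new = [[85, 22, 3], [68, 38, 0]]              # two hypothetical patients
print("P(decline):", tree.predict_proba(new)[:, 1])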
ERIC Educational Resources Information Center
Janson, Harald; Mathiesen, Kristin S.
2008-01-01
The authors applied I-States as Objects Analysis (ISOA), a recently proposed person-oriented analytic approach, to the study of temperament development in 921 Norwegian children from a population-based sample. A 5-profile classification based on cluster analysis of standardized mother reports of activity, sociability, emotionality, and shyness at…
Analysis of rocket engine injection combustion processes
NASA Technical Reports Server (NTRS)
Salmon, J. W.; Saltzman, D. H.
1977-01-01
Mixing methodology improvements for the JANNAF DER and CICM injection/combustion analysis computer programs were accomplished. Development of the ZOM plane prediction model was improved for installation into the new standardized DER computer program. An intra-element mixing model development approach was recommended for gas/liquid coaxial injection elements for possible future incorporation into the CICM computer program.
Facilitating Video Analysis for Teacher Development: A Systematic Review of the Research
ERIC Educational Resources Information Center
Baecher, Laura; Kung, Shiao-Chuan; Ward, Sarah Laleman; Kern, Kimberly
2018-01-01
Video analysis of classroom practice as a tool in teacher professional learning has become ever more widely used, with hundreds of articles published on the topic over the past decade. When designing effective professional development for teachers using video, facilitators turn to the literature to identify promising approaches. This article…
ERIC Educational Resources Information Center
Crossley, Michael
2010-01-01
The article argues that greater attention should be paid to contextual factors in educational research and international development cooperation. The analysis draws upon principles that underpin socio-cultural approaches to comparative education, a critical analysis of the political economy of contemporary educational research, and recent research…
Comparison of a Traditional Probabilistic Risk Assessment Approach with Advanced Safety Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Curtis L; Mandelli, Diego; Zhegang Ma
2014-11-01
As part of the Light Water Reactor Sustainability (LWRS) Program [1], the purpose of the Risk Informed Safety Margin Characterization (RISMC) [2] Pathway research and development (R&D) is to support plant decisions for risk-informed margin management, with the aim of improving the economics and reliability, and sustaining the safety, of current NPPs. In this paper, we describe the RISMC analysis process, illustrating how mechanistic and probabilistic approaches are combined in order to estimate a safety margin. We use the scenario of a "station blackout" (SBO), wherein offsite power and onsite power are lost, thereby challenging plant safety systems. We describe the RISMC approach, illustrate the station blackout modeling, and contrast this with traditional risk analysis modeling for this type of accident scenario. We also describe the approach we are using to represent advanced flooding analysis.
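A minimal sketch of the RISMC load-versus-capacity idea for an SBO, with invented distributions standing in for the mechanistic simulations: compare the uncertain time available before core damage (capacity) with the uncertain time to restore power (load).

import numpy as np

rng = np.random.default_rng(8)
n = 200_000
battery_h = rng.triangular(4, 6, 8, n)        # DC battery depletion, hours (assumed)
heatup_h = rng.normal(3.0, 0.5, n)            # time-to-damage after DC loss (assumed)
capacity = battery_h + heatup_h               # hours until core damage
load = rng.lognormal(np.log(5.0), 0.5, n)     # offsite power recovery time (assumed)

margin = capacity - load
print(f"P(recovery before damage) = {np.mean(margin > 0):.3f}")
print(f"5th percentile margin     = {np.percentile(margin, 5):.1f} h")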
A probability-based approach for assessment of roadway safety hardware.
DOT National Transportation Integrated Search
2017-03-14
This report presents a general probability-based approach for assessment of roadway safety hardware (RSH). It was achieved using a reliability analysis method and computational techniques. With the development of high-fidelity finite element (FE) m...
Ethical analysis in HTA of complex health interventions.
Lysdahl, Kristin Bakke; Oortwijn, Wija; van der Wilt, Gert Jan; Refolo, Pietro; Sacchini, Dario; Mozygemba, Kati; Gerhardus, Ansgar; Brereton, Louise; Hofmann, Bjørn
2016-03-22
In the field of health technology assessment (HTA), there are several approaches that can be used for ethical analysis. However, there is a scarcity of literature that critically evaluates and compares the strengths and weaknesses of these approaches when they are applied in practice. In this paper, we analyse the applicability of some selected approaches for addressing ethical issues in HTA in the field of complex health interventions. Complex health interventions have been the focus of methodological attention in HTA. However, the potential methodological challenges for ethical analysis are as yet unknown. Six of the most frequently described and applied ethical approaches in HTA were critically assessed against a set of five characteristics of complex health interventions: multiple and changing perspectives, indeterminate phenomena, uncertain causality, unpredictable outcomes, and ethical complexity. The assessments are based on the literature and the authors' experiences of developing, applying and assessing the approaches. The Interactive, participatory HTA approach is, by its nature and flexibility, applicable across most complexity characteristics. Wide Reflective Equilibrium is also flexible, and its openness to different perspectives makes it better suited for complex health interventions than more rigid conventional approaches, such as Principlism and Casuistry. Approaches developed for HTA purposes, such as the HTA Core Model® and the Socratic approach, are fairly applicable for complex health interventions, which one could expect because they include various ethical perspectives. This study shows how the applicability for addressing ethical issues in HTA of complex health interventions differs between the selected ethical approaches. Knowledge about these differences may be helpful when choosing and applying an approach for ethical analyses in HTA. We believe that the study contributes to increasing awareness of and interest in the ethical aspects of complex health interventions in general.
Software Safety Progress in NASA
NASA Technical Reports Server (NTRS)
Radley, Charles F.
1995-01-01
NASA has developed guidelines for the development and analysis of safety-critical software. These guidelines have been documented in a Guidebook for Safety Critical Software Development and Analysis. The guidelines represent a practical 'how to' approach, to assist software developers and safety analysts with cost-effective methods for software safety. They provide guidance on the implementation of the recent NASA Software Safety Standard NSS-1740.13, which was released as an 'Interim' version in June 1994 and scheduled for formal adoption in late 1995. This paper is a survey of the methods in general use, resulting in the NASA guidelines for safety-critical software development and analysis.
ERIC Educational Resources Information Center
Cathcart, Stephen Michael
2016-01-01
This mixed-methods study examines HRD professionals' decision-making processes when making an organizational purchase of training. The study uses a case approach with a degrees-of-freedom analysis. The data analyzed examine how HRD professionals in manufacturing select outside vendors' human resource development programs for training,…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bush, B.; Melaina, M.; Penev, M.
This report describes the development and analysis of detailed temporal and spatial scenarios for early market hydrogen fueling infrastructure clustering and fuel cell electric vehicle rollout using the Scenario Evaluation, Regionalization and Analysis (SERA) model. The report provides an overview of the SERA scenario development framework and discusses the approach used to develop the nationwide scenario.
Huang, Jun; Kaul, Goldi; Cai, Chunsheng; Chatlapalli, Ramarao; Hernandez-Abad, Pedro; Ghosh, Krishnendu; Nagi, Arwinder
2009-12-01
To facilitate an in-depth process understanding, and offer opportunities for developing control strategies to ensure product quality, a combination of experimental design, optimization and multivariate techniques was integrated into the process development of a drug product. A process DOE was used to evaluate effects of the design factors on manufacturability and final product CQAs, and establish design space to ensure desired CQAs. Two types of analyses were performed to extract maximal information, DOE effect & response surface analysis and multivariate analysis (PCA and PLS). The DOE effect analysis was used to evaluate the interactions and effects of three design factors (water amount, wet massing time and lubrication time), on response variables (blend flow, compressibility and tablet dissolution). The design space was established by the combined use of DOE, optimization and multivariate analysis to ensure desired CQAs. Multivariate analysis of all variables from the DOE batches was conducted to study relationships between the variables and to evaluate the impact of material attributes/process parameters on manufacturability and final product CQAs. The integrated multivariate approach exemplifies application of QbD principles and tools to drug product and process development.
New approaches in GMO detection.
Querci, Maddalena; Van den Bulcke, Marc; Zel, Jana; Van den Eede, Guy; Broll, Hermann
2010-03-01
The steady rate of development and diffusion of genetically modified plants and their increasing diversification of characteristics, genes and genetic control elements poses a challenge in analysis of genetically modified organisms (GMOs). It is expected that in the near future the picture will be even more complex. Traditional approaches, mostly based on the sequential detection of one target at a time, or on a limited multiplexing, allowing only a few targets to be analysed at once, no longer meet the testing requirements. Along with new analytical technologies, new approaches for the detection of GMOs authorized for commercial purposes in various countries have been developed that rely on (1) a smart and accurate strategy for target selection, (2) the use of high-throughput systems or platforms for the detection of multiple targets and (3) algorithms that allow the conversion of analytical results into an indication of the presence of individual GMOs potentially present in an unknown sample. This paper reviews the latest progress made in GMO analysis, taking examples from the most recently developed strategies and tools, and addresses some of the critical aspects related to these approaches.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abdel-Khalik, Hany S.; Zhang, Qiong
2014-05-20
The development of hybrid Monte-Carlo-Deterministic (MC-DT) approaches, taking place over the past few decades, has primarily focused on shielding and detection applications where the analysis requires a small number of responses, i.e. at the detector location(s). This work further develops a recently introduced global variance reduction approach, denoted the SUBSPACE approach, which is designed to allow the use of MC simulation, currently limited to benchmarking calculations, for routine engineering calculations. By way of demonstration, the SUBSPACE approach is applied to assembly-level calculations used to generate the few-group homogenized cross-sections. These models are typically expensive and need to be executed on the order of 10^3 to 10^5 times to properly characterize the few-group cross-sections for downstream core-wide calculations. Applicability to k-eigenvalue core-wide models is also demonstrated in this work. Given the favorable results obtained in this work, we believe the applicability of the MC method for reactor analysis calculations could be realized in the near future.
Bayesian Exploratory Factor Analysis
Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.; Piatek, Rémi
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates from a high dimensional set of psychological measurements. PMID:25431517
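For contrast, a classical analogue of one task the Bayesian procedure automates (choosing the number of factors) can be sketched with maximum-likelihood factor analysis and held-out log-likelihood; the data are simulated and the model-selection rule is illustrative.

import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(12)
n, p, k_true = 500, 12, 3
loadings = rng.normal(size=(p, k_true))
X = rng.normal(size=(n, k_true)) @ loadings.T + rng.normal(scale=0.5, size=(n, p))

for k in range(1, 7):
    fa = FactorAnalysis(n_components=k, random_state=0)
    ll = cross_val_score(fa, X, cv=5).mean()   # average held-out log-likelihood
    print(f"k={k}  held-out loglik per sample = {ll:.2f}")
# The score typically peaks near the true number of factors (here, 3).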
Possibilities of fractal analysis of the competitive dynamics: Approaches and procedures
NASA Astrophysics Data System (ADS)
Zagornaya, T. O.; Medvedeva, M. A.; Panova, V. L.; Isaichik, K. F.; Medvedev, A. N.
2017-11-01
The possibilities of fractal analysis are applied to study the non-linear nature of the competitive dynamics of the market of trading intermediaries. Based on a statistical study of regional retail indicators, an approach to analyzing the characteristics of the competitive behavior of market participants is developed. The authors postulate principles for studying the dynamics of competition as a result of changes in the vector and characteristics of the competitive behavior of market agents.
Investigation of air transportation technology at Massachusetts Institute of Technology, 1985
NASA Technical Reports Server (NTRS)
Simpson, Robert W.
1987-01-01
Two areas of research are discussed: an investigation into runway approach flying with Loran C, and a series of research topics on the development and experimental validation of methodologies to support aircraft icing analysis. Flight tests with Loran C led to the conclusion that it is a suitable system for non-precision approaches, and that time-difference corrections made every eight weeks in the instrument approach plates will produce acceptable errors. In the area of aircraft icing analysis, wind tunnel and flight test results are discussed.
Discrete Fourier Transform Analysis in a Complex Vector Space
NASA Technical Reports Server (NTRS)
Dean, Bruce H.
2009-01-01
Alternative computational strategies for the Discrete Fourier Transform (DFT) have been developed using analysis of geometric manifolds. This approach provides a general framework for performing DFT calculations, and suggests a more efficient implementation of the DFT for applications using iterative transform methods, particularly phase retrieval. The DFT can thus be implemented using fewer operations than the usual formulation requires. The software decreases the run time of the DFT in applications, such as phase retrieval, that iteratively call the DFT function. The algorithm exploits a special computational approach based on analysis of the DFT as a transformation in a complex vector space. As such, this approach has the potential to realize a DFT computation that approaches N operations, versus N log(N) operations for the equivalent Fast Fourier Transform (FFT) calculation.
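For reference, the direct O(N^2) DFT, the baseline against which such strategies are measured, can be written in a few lines and checked against the FFT; the geometric-manifold formulation itself is not reproduced here.

```python
# A minimal sketch of the direct DFT, verified against NumPy's FFT.
import numpy as np

def dft(x):
    # direct evaluation of X_k = sum_n x_n * exp(-2*pi*i*k*n/N)
    N = len(x)
    n = np.arange(N)
    W = np.exp(-2j * np.pi * np.outer(n, n) / N)  # N x N transform matrix
    return W @ x

x = np.random.rand(64)
assert np.allclose(dft(x), np.fft.fft(x))  # same result, O(N^2) vs O(N log N)
```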
RF control at SSCL — an object oriented design approach
NASA Astrophysics Data System (ADS)
Dohan, D. A.; Osberg, E.; Biggs, R.; Bossom, J.; Chillara, K.; Richter, R.; Wade, D.
1994-12-01
The Superconducting Super Collider (SSC) in Texas, the construction of which was stopped in 1994, would have represented a major challenge in accelerator research and development. This paper addresses the issues encountered in the parallel design and construction of the control systems for the RF equipment of the five accelerators comprising the SSC. An extensive analysis of the components of the RF control systems was undertaken, based upon the Shlaer-Mellor object-oriented analysis and design (OOA/OOD) methodology. The RF subsystem components (amplifiers, tubes, power supplies, PID loops, etc.) were analyzed to produce OOA information, behavior and process models. Using these models, OOD was iteratively applied to develop a generic RF control system design. This paper describes the results of this analysis and the development of 'bridges' between the analysis objects and the EPICS-based software and underlying VME-based hardware architectures. The application of this approach to several of the SSCL RF control systems is discussed.
Pawlowski, Roger P.; Phipps, Eric T.; Salinger, Andrew G.; ...
2012-01-01
A template-based generic programming approach was presented in Part I of this series of papers [Sci. Program. 20 (2012), 197–219] that separates the development effort of programming a physical model from that of computing additional quantities, such as derivatives, needed for embedded analysis algorithms. In this paper, we describe the implementation details for using the template-based generic programming approach for simulation and analysis of partial differential equations (PDEs). We detail several of the hurdles that we have encountered, and some of the software infrastructure developed to overcome them. We end with a demonstration where we present shape optimization and uncertainty quantification results for a 3D PDE application.
Using a Knowledge Representations Approach to Cognitive Task Analysis.
ERIC Educational Resources Information Center
Black, John B.; And Others
Task analyses have traditionally been framed in terms of overt behaviors performed in accomplishing tasks and goals. Pioneering work at the Learning Research and Development Center looked at what contribution a cognitive analysis might make to current task analysis procedures, since traditional task analysis methods neither elicit nor capture…
The implementation and use of Ada on distributed systems with high reliability requirements
NASA Technical Reports Server (NTRS)
Knight, J. C.
1987-01-01
Performance analysis was begun on the Ada implementations. The goal is to supply the system designer with tools that will allow a rational decision to be made, early in the design cycle, about whether a particular implementation can support a given application. Primary activities were: analysis of the original approach to recovery in distributed Ada programs using the Advanced Transport Operating System (ATOPS) example; review and assessment of the original approach, which was found to be capable of improvement; preparation and presentation of a paper at the 1987 Washington DC Ada Symposium; development of a refined approach to recovery that is presently being applied to the ATOPS example; and design and development of a performance assessment scheme for Ada programs based on a flexible user-driven benchmarking system.
Platforms for Single-Cell Collection and Analysis.
Valihrach, Lukas; Androvic, Peter; Kubista, Mikael
2018-03-11
Single-cell analysis has become an established method to study cell heterogeneity and for rare cell characterization. Despite the high cost and technical constraints, applications are increasing every year in all fields of biology. Following the trend, there is a tremendous development of tools for single-cell analysis, especially in the RNA sequencing field. Every improvement increases sensitivity and throughput. Collecting a large amount of data also stimulates the development of new approaches for bioinformatic analysis and interpretation. However, the essential requirement for any analysis is the collection of single cells of high quality. The single-cell isolation must be fast, effective, and gentle to maintain the native expression profiles. Classical methods for single-cell isolation are micromanipulation, microdissection, and fluorescence-activated cell sorting (FACS). In the last decade several new and highly efficient approaches have been developed, which not just supplement but may fully replace the traditional ones. These new techniques are based on microfluidic chips, droplets, micro-well plates, and automatic collection of cells using capillaries, magnets, an electric field, or a punching probe. In this review we summarize the current methods and developments in this field. We discuss the advantages of the different commercially available platforms and their applicability, and also provide remarks on future developments.
NASA Technical Reports Server (NTRS)
Skillen, Michael D.; Crossley, William A.
2008-01-01
This report documents a series of investigations to develop an approach for structural sizing of various morphing wing concepts. For the purposes of this report, a morphing wing is one whose planform can make significant shape changes in flight - increasing wing area by 50% or more from the lowest possible area, changing sweep by 30 deg or more, and/or increasing aspect ratio by as much as 200% from the lowest possible value. These significant changes in geometry mean that the underlying load-bearing structure changes geometry. While most finite element analysis packages provide some sort of structural optimization capability, these codes are not amenable to making significant changes in the stiffness matrix to reflect the large morphing wing planform changes. The investigations presented here use a finite element code capable of aeroelastic analysis in three different optimization approaches - a "simultaneous analysis" approach, a "sequential" approach, and an "aggregate" approach.
Qualitative research methods in renal medicine: an introduction.
Bristowe, Katherine; Selman, Lucy; Murtagh, Fliss E M
2015-09-01
Qualitative methodologies are becoming increasingly widely used in health research. However, within some specialties, including renal medicine, qualitative approaches remain under-represented in the high-impact factor journals. Qualitative research can be undertaken: (i) as a stand-alone research method, addressing specific research questions; (ii) as part of a mixed methods approach alongside quantitative approaches or (iii) embedded in clinical trials, or during the development of complex interventions. The aim of this paper is to introduce qualitative research, including the rationale for choosing qualitative approaches, and guidance for ensuring quality when undertaking and reporting qualitative research. In addition, we introduce types of qualitative data (observation, interviews and focus groups) as well as some of the most commonly encountered methodological approaches (case studies, ethnography, phenomenology, grounded theory, thematic analysis, framework analysis and content analysis).
STATISTICAL SAMPLING AND DATA ANALYSIS
Research is being conducted to develop approaches to improve soil and sediment sampling techniques, measurement design and geostatistics, and data analysis via chemometric, environmetric, and robust statistical methods. Improvements in sampling contaminated soil and other hetero...
Global Persistent Attack: A Systems Architecture, Process Modeling, and Risk Analysis Approach
2008-06-01
develop an analysis process for quantifying risk associated with the limitations presented by a fiscally constrained environment. The second step...previous independent analysis of each force structure provided information for quantifying risk associated with the given force presentations, the
An illustrative analysis of technological alternatives for satellite communications
NASA Technical Reports Server (NTRS)
Metcalfe, M. R.; Cazalet, E. G.; North, D. W.
1979-01-01
The demand for satellite communications services in the domestic market is discussed. Two approaches to increasing system capacity are the expansion of service into frequencies presently allocated but not used for satellite communications, and the development of technologies that provide a greater level of service within the currently used frequency bands. The development of economic models and analytic techniques for evaluating capacity expansion alternatives such as these is presented. The satellite orbit-spectrum problem is examined, along with outlines of some suitable analytic approaches. An illustrative analysis of domestic communications satellite technology options for providing increased levels of service is also presented. The analysis illustrates the use of probabilities and decision trees in analyzing alternatives, and provides insight into the important aspects of the orbit-spectrum problem that would warrant inclusion in a larger scale analysis.
Sage Simulation Model for Technology Demonstration Convertor by a Step-by-Step Approach
NASA Technical Reports Server (NTRS)
Demko, Rikako; Penswick, L. Barry
2006-01-01
The development of a Stirling model using the 1-D Sage design code was completed using a step-by-step approach, a method of gradually increasing the complexity of the Sage model while observing the energy balance and energy losses at each step of the development. This step-by-step model development and energy-flow analysis can clarify where the losses occur and what their impact is, and can suggest possible opportunities for design improvement.
Parametric and experimental analysis using a power flow approach
NASA Technical Reports Server (NTRS)
Cuschieri, J. M.
1988-01-01
Building on a previously defined and developed structural power flow approach for the analysis of structure-borne transmission of structural vibrations, the technique is used to analyze the influence of structural parameters on the transmitted energy. As a base for comparison, the parametric analysis is first performed using a Statistical Energy Analysis approach, and the results are compared with those obtained using the power flow approach. The advantages of using structural power flow are thus demonstrated by comparing the type of results obtained by the two methods. Additionally, to demonstrate the advantages of the power flow method and to show that power flow results represent a direct physical parameter that can be measured on a typical structure, an experimental investigation of structural power flow is also presented. Results are presented for an L-shaped beam for which an analytical solution has already been obtained. Furthermore, the various methods available to measure vibrational power flow are compared to investigate the advantages and disadvantages of each method.
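The core quantity behind the power flow approach is the time-averaged vibrational power transmitted through a point, P = <F(t) v(t)>. A minimal sketch with synthetic signals follows; a real measurement would use force and velocity (or integrated acceleration) transducers at the junction of interest.

```python
# A minimal sketch of time-averaged transmitted power from force and velocity.
import numpy as np

fs = 10_000                                    # sampling rate, Hz
t = np.arange(0, 1, 1 / fs)
F = np.sin(2 * np.pi * 200 * t)                # force signal, N (synthetic)
v = 0.02 * np.sin(2 * np.pi * 200 * t - 0.3)   # velocity, m/s, phase-lagged

P = np.mean(F * v)   # time-averaged transmitted power, W
print(P)             # ~ 0.5 * |F| * |v| * cos(phase) = 0.01 * cos(0.3) ≈ 9.6e-3
```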
Using Web-Based Knowledge Extraction Techniques to Support Cultural Modeling
NASA Astrophysics Data System (ADS)
Smart, Paul R.; Sieck, Winston R.; Shadbolt, Nigel R.
The World Wide Web is a potentially valuable source of information about the cognitive characteristics of cultural groups. However, attempts to use the Web in the context of cultural modeling activities are hampered by the large-scale nature of the Web and the current dominance of natural language formats. In this paper, we outline an approach to support the exploitation of the Web for cultural modeling activities. The approach begins with the development of qualitative cultural models (which describe the beliefs, concepts and values of cultural groups), and these models are subsequently used to develop an ontology-based information extraction capability. Our approach represents an attempt to combine conventional approaches to information extraction with epidemiological perspectives of culture and network-based approaches to cultural analysis. The approach can be used, we suggest, to support the development of models providing a better understanding of the cognitive characteristics of particular cultural groups.
Development and Application of an Integrated Approach toward NASA Airspace Systems Research
NASA Technical Reports Server (NTRS)
Barhydt, Richard; Fong, Robert K.; Abramson, Paul D.; Koenke, Ed
2008-01-01
The National Aeronautics and Space Administration's (NASA) Airspace Systems Program is contributing air traffic management research in support of the 2025 Next Generation Air Transportation System (NextGen). Contributions support research and development needs provided by the interagency Joint Planning and Development Office (JPDO). These needs generally call for integrated technical solutions that improve system-level performance and work effectively across multiple domains and planning time horizons. In response, the Airspace Systems Program is pursuing an integrated research approach and has adapted systems engineering best practices for application in a research environment. Systems engineering methods aim to enable researchers to methodically compare different technical approaches, consider system-level performance, and develop compatible solutions. Systems engineering activities are performed iteratively as the research matures. Products of this approach include a demand and needs analysis, system-level descriptions focusing on NASA research contributions, system assessment and design studies, and common system-level metrics, scenarios, and assumptions. Results from the first systems engineering iteration include a preliminary demand and needs analysis; a functional modeling tool; and initial system-level metrics, scenario characteristics, and assumptions. Demand and needs analysis results suggest that several advanced concepts can mitigate demand/capacity imbalances for NextGen, but fall short of enabling three times current-day capacity at the nation's busiest airports and airspace. Current activities focus on standardizing metrics, scenarios, and assumptions, conducting system-level performance assessments of integrated research solutions, and exploring key system design interfaces.
Managed Development Environment Successes for MSFC's VIPA Team
NASA Technical Reports Server (NTRS)
Finckenor, Jeff; Corder, Gary; Owens, James; Meehan, Jim; Tidwell, Paul H.
2005-01-01
This paper outlines the best practices of the Vehicle Design Team for VIPA. The functions of the VIPA Vehicle Design (VVD) discipline team are to maintain the controlled reference geometry and to provide linked, simplified geometry for each of the other discipline analyses. The core of the VVD work, and the approach for VVD's first task of controlling the reference geometry, involves systems engineering, top-down, layout-based CAD modeling within a Product Data Manager (PDM) development environment. The top-down approach allows for simple control of very large, integrated assemblies and greatly enhances the ability to generate trade configurations and reuse data. The second VVD task, model simplification for analysis, is handled within the managed environment through application of the master model concept. In this approach, there is a single controlling, or master, product definition dataset. Connected to this master model are reference datasets with live geometric and expression links. The referenced models can be for drawings, manufacturing, visualization, embedded analysis, or analysis simplification. A discussion of web-based interaction, including visualization, between the design and other disciplines is included. Demonstrated examples are cited, including the Space Launch Initiative development cycle, the Saturn V systems integration and verification cycle, an Orbital Space Plane study, and NASA Exploration Office studies of Shuttle-derived and clean-sheet launch vehicles. The VIPA Team has brought an immense amount of detailed data to bear on program issues. A central piece of that success has been the Managed Development Environment and the VVD Team approach to modeling.
Research in interactive scene analysis
NASA Technical Reports Server (NTRS)
Tenenbaum, J. M.; Garvey, T. D.; Weyl, S. A.; Wolf, H. C.
1975-01-01
An interactive scene interpretation system (ISIS) was developed as a tool for constructing and experimenting with man-machine and automatic scene analysis methods tailored for particular image domains. A recently developed region analysis subsystem based on the paradigm of Brice and Fennema is described. Using this subsystem a series of experiments was conducted to determine good criteria for initially partitioning a scene into atomic regions and for merging these regions into a final partition of the scene along object boundaries. Semantic (problem-dependent) knowledge is essential for complete, correct partitions of complex real-world scenes. An interactive approach to semantic scene segmentation was developed and demonstrated on both landscape and indoor scenes. This approach provides a reasonable methodology for segmenting scenes that cannot be processed completely automatically, and is a promising basis for a future automatic system. A program is described that can automatically generate strategies for finding specific objects in a scene based on manually designated pictorial examples.
Natural selection. VII. History and interpretation of kin selection theory.
Frank, S A
2013-06-01
Kin selection theory is a kind of causal analysis. The initial form of kin selection ascribed cause to costs, benefits and genetic relatedness. The theory then slowly developed a deeper and more sophisticated approach to partitioning the causes of social evolution. Controversy followed because causal analysis inevitably attracts opposing views. It is always possible to separate total effects into different component causes. Alternative causal schemes emphasize different aspects of a problem, reflecting the distinct goals, interests and biases of different perspectives. For example, group selection is a particular causal scheme with certain advantages and significant limitations. Ultimately, to use kin selection theory to analyse natural patterns and to understand the history of debates over different approaches, one must follow the underlying history of causal analysis. This article describes the history of kin selection theory, with emphasis on how the causal perspective improved through the study of key patterns of natural history, such as dispersal and sex ratio, and through a unified approach to demographic and social processes. Independent historical developments in the multivariate analysis of quantitative traits merged with the causal analysis of social evolution by kin selection.
Public health policy research: making the case for a political science approach.
Bernier, Nicole F; Clavier, Carole
2011-03-01
The past few years have seen the emergence of claims that the political determinants of health do not get due consideration and a growing demand for better insights into public policy analysis in the health research field. Several public health and health promotion researchers are calling for better training and a stronger research culture in health policy. The development of these studies tends to be more advanced in health promotion than in other areas of public health research, but researchers are still commonly caught in a naïve, idealistic and narrow view of public policy. This article argues that the political science discipline has developed a specific approach to public policy analysis that can help to open up unexplored levers of influence for public health research and practice and that can contribute to a better understanding of public policy as a determinant of health. It describes and critiques the public health model of policy analysis, analyzes political science's specific approach to public policy analysis, and discusses how the politics of research provides opportunities and barriers to the integration of political science's distinctive contributions to policy analysis in health promotion.
Modeling and analysis of cascade solar cells
NASA Technical Reports Server (NTRS)
Ho, F. D.
1986-01-01
A brief review is given of the present status of the development of cascade solar cells, through which photovoltaic efficiencies can be improved. The designs and calculations of the multijunction cells, however, are quite complicated. The main goal is to find a method that is a compromise between accuracy and simplicity for modeling a cascade solar cell. Three approaches are presently under way: (1) an equivalent-circuit approach, (2) a numerical approach, and (3) an analytical approach. Here, the first and second approaches are discussed. The equivalent-circuit approach, using SPICE (Simulation Program with Integrated Circuit Emphasis), applied to cascade cells and the cascade-cell array is highlighted. The methods of extracting parameters for modeling are discussed.
A review of risk management process in construction projects of developing countries
NASA Astrophysics Data System (ADS)
Bahamid, R. A.; Doh, S. I.
2017-11-01
In the construction industry, risk management remains a less commonly applied technique. The systematic approach to risk management in construction comprises three main stages: a) risk response; b) risk analysis and evaluation; and c) risk identification. The high risk inherent in construction business affects each of its participants, while operational analysis and management of construction-related risks remain an enormous task for practitioners of the industry. This paper reviews the existing literature on construction project risk management in developing countries, specifically the risk management process. The literature lacks an ample risk management process approach capable of capturing the impact of risk on diverse project objectives. This review aims at discovering the techniques most frequently used in risk identification and analysis, at clarifying the different classifications of risk sources in the existing literature of developing countries, and at identifying future research directions on project risk in construction in developing countries.
Development of a User Interface for a Regression Analysis Software Tool
NASA Technical Reports Server (NTRS)
Ulbrich, Norbert Manfred; Volden, Thomas R.
2010-01-01
An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed so that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and the graphical user interface are discussed in order to illustrate the user interface's overall design approach.
NASA Technical Reports Server (NTRS)
Weissenberger, S. (Editor)
1973-01-01
A systems engineering approach is reported for the problem of reducing the number and severity of California's wildland fires. Prevention methodologies are reviewed, and cost-benefit models are developed for making preignition decisions.
A Comprehensive Planning Model
ERIC Educational Resources Information Center
Temkin, Sanford
1972-01-01
Combines elements of the problem solving approach inherent in methods of applied economics and operations research and the structural-functional analysis common in social science modeling to develop an approach for economic planning and resource allocation for schools and other public sector organizations. (Author)
Franco-Trigo, L; Hossain, L N; Durks, D; Fam, D; Inglis, S C; Benrimoj, S I; Sabater-Hernández, D
Participatory approaches involving stakeholders across the health care system can help enhance the development, implementation and evaluation of health services. These approaches may be particularly useful in planning community pharmacy services and so overcome challenges in their implementation into practice. Conducting a stakeholder analysis is a key first step since it allows relevant stakeholders to be identified, as well as providing planners a better understanding of the complexity of the health care system. The main aim of this study was to conduct a stakeholder analysis to identify those individuals and organizations that could be part of a leading planning group for the development of a community pharmacy service (CPS) to prevent cardiovascular disease (CVD) in Australia. An experienced facilitator conducted a workshop with 8 key informants of the Australian health care system. Two structured activities were undertaken. The first explored current needs and gaps in cardiovascular care and the role of community pharmacists. The second was a stakeholder analysis, using both ex-ante and ad-hoc approaches. Identified stakeholders were then classified into three groups according to their relative influence on the development of the pharmacy service. The information gathered was analyzed using qualitative content analysis. The key informants identified 46 stakeholders, including (1) patient/consumers and their representative organizations, (2) health care providers and their professional organizations and (3) institutions and organizations that do not directly interact with patients but organize and manage the health care system, develop and implement health policies, pay for health care, influence funding for health service research or promote new health initiatives. From the 46 stakeholders, a core group of 12 stakeholders was defined. These were considered crucial to the service's development because they held positions that could drive or inhibit progress. Secondary results of the workshop included: a list of needs and gaps in cardiovascular care (n = 6), a list of roles for community pharmacists in cardiovascular prevention (n = 12) and a list of potential factors (n = 7) that can hinder the integration of community pharmacy services into practice. This stakeholder analysis provided a detailed picture of the wide range of stakeholders across the entire health care system that have a stake in the development of a community pharmacy service aimed at preventing CVD. Of these, a core group of key stakeholders, with complementary roles, can then be approached for further planning of the service. The results of this analysis highlight the relevance of establishing multilevel stakeholder groups for CPS planning.
Relational frame theory: A new paradigm for the analysis of social behavior
Roche, Bryan; Barnes-Holmes, Yvonne; Barnes-Holmes, Dermot; Stewart, Ian; O'Hora, Denis
2002-01-01
Recent developments in the analysis of derived relational responding, under the rubric of relational frame theory, have brought several complex language and cognitive phenomena within the empirical reach of the experimental analysis of behavior. The current paper provides an outline of relational frame theory as a new approach to the analysis of language, cognition, and complex behavior more generally. Relational frame theory, it is argued, also provides a suitable paradigm for the analysis of a wide variety of social behavior that is mediated by language. Recent empirical evidence and theoretical interpretations are provided in support of the relational frame approach to social behavior.
Research in interactive scene analysis
NASA Technical Reports Server (NTRS)
Tenenbaum, J. M.; Barrow, H. G.; Weyl, S. A.
1976-01-01
Cooperative (man-machine) scene analysis techniques were developed whereby humans can provide a computer with guidance when completely automated processing is infeasible. An interactive approach promises significant near-term payoffs in analyzing various types of high volume satellite imagery, as well as vehicle-based imagery used in robot planetary exploration. This report summarizes the work accomplished over the duration of the project and describes in detail three major accomplishments: (1) the interactive design of texture classifiers; (2) a new approach for integrating the segmentation and interpretation phases of scene analysis; and (3) the application of interactive scene analysis techniques to cartography.
Analysis of the time structure of synchronization in multidimensional chaotic systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makarenko, A. V., E-mail: avm.science@mail.ru
2015-05-15
A new approach is proposed to the integrated analysis of the time structure of synchronization of multidimensional chaotic systems. The method allows one to diagnose and quantitatively evaluate the intermittency characteristics during synchronization of chaotic oscillations in the T-synchronization mode. A system of two identical logistic mappings with unidirectional coupling that operate in the developed chaos regime is analyzed. It is shown that the widely used approach, in which only synchronization patterns are subjected to analysis while desynchronization areas are considered as a background signal and removed from analysis, should be regarded as methodologically incomplete.
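A minimal sketch of the kind of system analyzed above follows: two identical logistic maps in the developed-chaos regime (r = 4) with unidirectional drive-response coupling. The coupling form and parameter values are assumptions for illustration, not the paper's exact scheme.

```python
# A minimal sketch of unidirectionally coupled logistic maps and a crude
# synchronization measure (fraction of steps where the states nearly agree).
import numpy as np

def coupled_logistic(n_steps=10_000, r=4.0, eps=0.4, seed=1):
    rng = np.random.default_rng(seed)
    x, y = rng.random(2)                      # drive and response states
    xs, ys = np.empty(n_steps), np.empty(n_steps)
    for i in range(n_steps):
        x_new = r * x * (1 - x)               # autonomous drive map
        # response map is perturbed toward the drive's new state
        y_new = (1 - eps) * r * y * (1 - y) + eps * x_new
        x, y = x_new, y_new
        xs[i], ys[i] = x, y
    return xs, ys

xs, ys = coupled_logistic()
print(np.mean(np.abs(xs - ys) < 1e-2))        # crude degree of synchronization
```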
Risk-Based Probabilistic Approach to Aeropropulsion System Assessment
NASA Technical Reports Server (NTRS)
Tong, Michael T.
2002-01-01
In an era of shrinking development budgets and resources, where there is also an emphasis on reducing the product development cycle, the role of system assessment, performed in the early stages of an engine development program, becomes very critical to the successful development of new aeropropulsion systems. A reliable system assessment not only helps to identify the best propulsion system concept among several candidates, it can also identify which technologies are worth pursuing. This is particularly important for advanced aeropropulsion technology development programs, which require an enormous amount of resources. In the current practice of deterministic, or point-design, approaches, the uncertainties of design variables are either unaccounted for or accounted for by safety factors. This could often result in an assessment with unknown and unquantifiable reliability. Consequently, it would fail to provide additional insight into the risks associated with the new technologies, which are often needed by decision makers to determine the feasibility and return-on-investment of a new aircraft engine. In this work, an alternative approach based on the probabilistic method was described for a comprehensive assessment of an aeropropulsion system. The statistical approach quantifies the design uncertainties inherent in a new aeropropulsion system and their influences on engine performance. Because of this, it enhances the reliability of a system assessment. A technical assessment of a wave-rotor-enhanced gas turbine engine was performed to demonstrate the methodology. The assessment used probability distributions to account for the uncertainties that occur in component efficiencies and flows and in mechanical design variables. The approach taken in this effort was to integrate the thermodynamic cycle analysis embedded in the computer code NEPP (NASA Engine Performance Program) and the engine weight analysis embedded in the computer code WATE (Weight Analysis of Turbine Engines) with the fast probability integration technique (FPI). FPI was developed by Southwest Research Institute under contract with the NASA Glenn Research Center. The results were plotted in the form of cumulative distribution functions and sensitivity analyses and were compared with results from the traditional deterministic approach. The comparison showed that the probabilistic approach provides a more realistic and systematic way to assess an aeropropulsion system. The current work addressed the application of the probabilistic approach to assess specific fuel consumption, engine thrust, and weight. Similarly, the approach can be used to assess other aspects of aeropropulsion system performance, such as cost, acoustic noise, and emissions. Additional information is included in the original extended abstract.
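The essence of the approach, propagating assumed input distributions through a performance model and reading off a cumulative distribution, can be sketched in a few lines. The response function below is a toy stand-in, not the NEPP/WATE cycle and weight analyses used in the study, and all distribution parameters are assumed.

```python
# A minimal sketch of Monte Carlo uncertainty propagation to an empirical CDF.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
eta_comp = rng.normal(0.88, 0.01, n)   # compressor efficiency (assumed spread)
eta_turb = rng.normal(0.90, 0.01, n)   # turbine efficiency (assumed spread)

sfc = 0.55 / (eta_comp * eta_turb)     # toy specific-fuel-consumption model

# empirical CDF read-outs: probability of meeting a requirement, and quantiles
print(np.mean(sfc <= 0.70))            # e.g., P(SFC <= 0.70)
for q in (0.05, 0.50, 0.95):
    print(q, np.quantile(sfc, q))
```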
Zhang, Qibin; Ames, Jennifer M.; Smith, Richard D.; Baynes, John W.; Metz, Thomas O.
2009-01-01
The Maillard reaction, starting from the glycation of protein and progressing to the formation of advanced glycation end-products (AGEs), is implicated in the development of complications of diabetes mellitus, as well as in the pathogenesis of cardiovascular, renal, and neurodegenerative diseases. In this perspective review, we provide an overview on the relevance of the Maillard reaction in the pathogenesis of chronic disease and discuss traditional approaches and recent developments in the analysis of glycated proteins by mass spectrometry. We propose that proteomics approaches, particularly bottom-up proteomics, will play a significant role in analyses of clinical samples leading to the identification of new markers of disease development and progression.
Parametric Robust Control and System Identification: Unified Approach
NASA Technical Reports Server (NTRS)
Keel, L. H.
1996-01-01
During the period of this support, a new control system design and analysis method has been studied. This approach deals with control systems containing uncertainties that are represented in terms of their transfer function parameters. Such a representation of the control system is common, and many physical parameter variations fall into this type of uncertainty. The techniques developed here are capable of providing nonconservative analysis of such control systems with parameter variations. We have also developed techniques to deal with control systems when their state space representations are given rather than transfer functions; in this case, the plant parameters appear as entries of the state space matrices. Finally, a system modeling technique to construct such systems from raw input-output frequency-domain data has been developed.
Fragoulakis, Vasilios; Mitropoulou, Christina; van Schaik, Ron H; Maniadakis, Nikolaos; Patrinos, George P
2016-05-01
Genomic Medicine aims to improve therapeutic interventions and diagnostics and the quality of life of patients, but also to rationalize healthcare costs. To reach this goal, careful assessment and identification of evidence gaps for public health genomics priorities are required so that a more efficient healthcare environment is created. Here, we propose a public health genomics-driven approach that adjusts the classical healthcare decision-making process with an alternative methodological approach to cost-effectiveness analysis, which is particularly helpful for genomic medicine interventions. By combining classical cost-effectiveness analysis with budget constraints, social preferences, and patient ethics, we demonstrate the application of this model, the Genome Economics Model (GEM), based on a previously reported genome-guided intervention from a developing-country environment. The model and the attendant rationale provide a practical guide by which all major healthcare stakeholders could ensure the sustainability of funding for genome-guided interventions, their adoption and coverage by health insurance funds, and prioritization of Genomic Medicine research, development, and innovation, given the restriction of budgets, particularly in developing countries and low-income healthcare settings in developed countries. The implications of the GEM for policy makers interested in Genomic Medicine and in new health technology and innovation assessment are also discussed.
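For readers unfamiliar with the underlying arithmetic, the sketch below computes an incremental cost-effectiveness ratio (ICER) and checks it against a willingness-to-pay threshold under a fixed budget. All numbers are hypothetical, and the GEM's treatment of social preferences and patient ethics is not modeled.

```python
# A minimal sketch of the cost-effectiveness core that the GEM builds on.
cost_standard, qaly_standard = 1_000.0, 5.0   # cost and effectiveness (QALYs)
cost_genomic,  qaly_genomic  = 2_200.0, 5.8   # genome-guided intervention

# ICER = incremental cost per incremental unit of effectiveness
icer = (cost_genomic - cost_standard) / (qaly_genomic - qaly_standard)
print(icer)  # cost per QALY gained -> 1500.0

wtp_threshold = 20_000.0                      # willingness to pay per QALY (assumed)
budget = 1_000_000.0                          # fixed program budget (assumed)
print(icer <= wtp_threshold)                  # adopt under the threshold?
print(int(budget // cost_genomic))            # patients coverable within budget
```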
ERIC Educational Resources Information Center
Li, Wei-Ting; Liang, Jyh-Chong; Tsai, Chin-Chung
2013-01-01
The purpose of this research was to examine the relationships between conceptions of learning and approaches to learning in chemistry. Two questionnaires, conceptions of learning chemistry (COLC) and approaches to learning chemistry (ALC), were developed to identify 369 college chemistry-major students' (220 males and 149 females) conceptions of…
Learning Analysis of K-12 Students' Online Problem Solving: A Three-Stage Assessment Approach
ERIC Educational Resources Information Center
Hu, Yiling; Wu, Bian; Gu, Xiaoqing
2017-01-01
Problem solving is considered a fundamental human skill. However, large-scale assessment of problem solving in K-12 education remains a challenging task. Researchers have argued for the development of an enhanced assessment approach through joint effort from multiple disciplines. In this study, a three-stage approach based on an evidence-centered…
Data First: Building Scientific Reasoning in AP Chemistry via the Concept Development Study Approach
ERIC Educational Resources Information Center
Nichol, Carolyn A.; Szymczyk, Amber J.; Hutchinson, John S.
2014-01-01
This article introduces the "Data First" approach and shows how the observation and analysis of scientific data can be used as a scaffold to build conceptual understanding in chemistry through inductive reasoning. The "Data First" approach emulates the scientific process by changing the order by which we introduce data. Rather…
Rexhepaj, Elton; Brennan, Donal J; Holloway, Peter; Kay, Elaine W; McCann, Amanda H; Landberg, Goran; Duffy, Michael J; Jirstrom, Karin; Gallagher, William M
2008-01-01
Manual interpretation of immunohistochemistry (IHC) is a subjective, time-consuming and variable process, with an inherent intra-observer and inter-observer variability. Automated image analysis approaches offer the possibility of developing rapid, uniform indicators of IHC staining. In the present article we describe the development of a novel approach for automatically quantifying oestrogen receptor (ER) and progesterone receptor (PR) protein expression assessed by IHC in primary breast cancer. Two cohorts of breast cancer patients (n = 743) were used in the study. Digital images of breast cancer tissue microarrays were captured using the Aperio ScanScope XT slide scanner (Aperio Technologies, Vista, CA, USA). Image analysis algorithms were developed using MatLab 7 (MathWorks, Apple Hill Drive, MA, USA). A fully automated nuclear algorithm was developed to discriminate tumour from normal tissue and to quantify ER and PR expression in both cohorts. Random forest clustering was employed to identify optimum thresholds for survival analysis. The accuracy of the nuclear algorithm was initially confirmed by a histopathologist, who validated the output in 18 representative images. In these 18 samples, an excellent correlation was evident between the results obtained by manual and automated analysis (Spearman's rho = 0.9, P < 0.001). Optimum thresholds for survival analysis were identified using random forest clustering. This revealed 7% positive tumour cells as the optimum threshold for the ER and 5% positive tumour cells for the PR. Moreover, a 7% cutoff level for the ER predicted a better response to tamoxifen than the currently used 10% threshold. Finally, linear regression was employed to demonstrate a more homogeneous pattern of expression for the ER (R = 0.860) than for the PR (R = 0.681). In summary, we present data on the automated quantification of the ER and the PR in 743 primary breast tumours using a novel unsupervised image analysis algorithm. This novel approach provides a useful tool for the quantification of biomarkers on tissue specimens, as well as for objective identification of appropriate cutoff thresholds for biomarker positivity. It also offers the potential to identify proteins with a homogeneous pattern of expression.
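The final scoring step described above reduces to a percent-positive calculation against the derived cutoffs; a minimal sketch with illustrative counts follows (the image analysis pipeline itself is not reproduced).

```python
# A minimal sketch of percent-positive scoring with the derived cutoffs.
def percent_positive(n_positive: int, n_tumour_nuclei: int) -> float:
    return 100.0 * n_positive / n_tumour_nuclei

er_score = percent_positive(n_positive=120, n_tumour_nuclei=1_500)  # 8.0%
pr_score = percent_positive(n_positive=60,  n_tumour_nuclei=1_500)  # 4.0%

print("ER positive:", er_score >= 7.0)   # True  (7% cutoff from clustering)
print("PR positive:", pr_score >= 5.0)   # False (5% cutoff from clustering)
```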
Cheng, Feon W; Gao, Xiang; Bao, Le; Mitchell, Diane C; Wood, Craig; Sliwinski, Martin J; Smiciklas-Wright, Helen; Still, Christopher D; Rolston, David D K; Jensen, Gordon L
2017-07-01
To examine the risk factors for developing functional decline and to make probabilistic predictions, a tree-based method that allows higher-order polynomials and interactions of the risk factors was used. The conditional inference tree analysis, a data mining approach, was used to construct a risk stratification algorithm for developing functional limitation based on BMI and other potential risk factors for disability in 1,951 older adults without functional limitations at baseline (baseline age 73.1 ± 4.2 y). We also analyzed the data with multivariate stepwise logistic regression and compared the two approaches (e.g., cross-validation). Over a mean of 9.2 ± 1.7 years of follow-up, 221 individuals developed functional limitation. Higher BMI, age, and comorbidity were consistently identified as significant risk factors for functional decline using both methods. Based on these factors, individuals were stratified into four risk groups via the conditional inference tree analysis. Compared to the low-risk group, all other groups had a significantly higher risk of developing functional limitation. The odds ratio comparing the two extreme categories was 9.09 (95% confidence interval: 4.68, 17.6). Higher BMI, age, and comorbid disease were consistently identified as significant risk factors for functional decline among older individuals across all approaches and analyses.
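A minimal sketch of tree-based risk stratification on the three identified factors follows. Conditional inference trees are typically fit with R's partykit; sklearn's CART tree is used below as a stand-in, and the data are simulated rather than the study cohort.

```python
# A minimal sketch of stratifying functional-limitation risk with a shallow tree.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 2_000
X = np.column_stack([rng.normal(27, 4, n),      # BMI
                     rng.uniform(65, 85, n),    # age, years
                     rng.integers(0, 4, n)])    # comorbidity count
logit = -9 + 0.12 * X[:, 0] + 0.05 * X[:, 1] + 0.4 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))    # simulated outcome

tree = DecisionTreeClassifier(max_depth=2, min_samples_leaf=100).fit(X, y)
print(export_text(tree, feature_names=["bmi", "age", "comorbidity"]))
```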
Rosenow, Felix; van Alphen, Natascha; Becker, Albert; Chiocchetti, Andreas; Deichmann, Ralf; Deller, Thomas; Freiman, Thomas; Freitag, Christine M; Gehrig, Johannes; Hermsen, Anke M; Jedlicka, Peter; Kell, Christian; Klein, Karl Martin; Knake, Susanne; Kullmann, Dimitri M; Liebner, Stefan; Norwood, Braxton A; Omigie, Diana; Plate, Karlheinz; Reif, Andreas; Reif, Philipp S; Reiss, Yvonne; Roeper, Jochen; Ronellenfitsch, Michael W; Schorge, Stephanie; Schratt, Gerhard; Schwarzacher, Stephan W; Steinbach, Joachim P; Strzelczyk, Adam; Triesch, Jochen; Wagner, Marlies; Walker, Matthew C; von Wegner, Frederic; Bauer, Sebastian
2017-11-01
Despite the availability of more than 15 new "antiepileptic drugs", the proportion of patients with pharmacoresistant epilepsy has remained constant at about 20-30%. Furthermore, no disease-modifying treatments shown to prevent the development of epilepsy following an initial precipitating brain injury or to reverse established epilepsy have been identified to date. This is likely in part due to the polyetiologic nature of epilepsy, which in turn requires personalized medicine approaches. Recent advances in imaging, pathology, genetics and epigenetics have led to new pathophysiological concepts and the identification of monogenic causes of epilepsy. In the context of these advances, the First International Symposium on Personalized Translational Epilepsy Research (1st ISymPTER) was held in Frankfurt on September 8, 2016, to discuss novel approaches and future perspectives for personalized translational research. These included new developments and ideas in a range of experimental and clinical areas such as deep phenotyping, quantitative brain imaging, EEG/MEG-based analysis of network dysfunction, tissue-based translational studies, innate immunity mechanisms, microRNA as treatment targets, functional characterization of genetic variants in human cell models and rodent organotypic slice cultures, personalized treatment approaches for monogenic epilepsies, blood-brain barrier dysfunction, therapeutic focal tissue modification, computational modeling for target and biomarker identification, and cost analysis in (monogenic) disease and its treatment. This report on the meeting proceedings is aimed at stimulating much needed investments of time and resources in personalized translational epilepsy research. Part I includes the clinical phenotyping and diagnostic methods, EEG network-analysis, biomarkers, and personalized treatment approaches. In Part II, experimental and translational approaches will be discussed (Bauer et al., 2017) [1].
Mannarini, Stefania; Balottin, Laura; Toldo, Irene; Gatta, Michela
2016-10-01
The study, conducted on Italian preadolescents aged 11 to 13 belonging to the general population, aims to investigate the relationship between emotional functioning, namely alexithymia, and the risk of developing behavioral and emotional problems, measured using the Strengths and Difficulties Questionnaire. The latent class analysis approach allowed the identification of two latent variables, one accounting for the internalizing problems (emotional symptoms and difficulties in emotional awareness) and one for the externalizing problems (conduct problems and hyperactivity, problematic relationships with peers, poor prosocial behaviors and externally oriented thinking). The two latent variables featured two latent classes: the difficulty in dealing with problems, and the strength to face problems, which was representative of most of the healthy participants, with specific gender differences. Along with the analysis of psychopathological behaviors, the study of resilience and strengths can prove to be a key step in developing valuable preventive approaches to tackle psychiatric disorders.
Development of an Aeroelastic Analysis Including a Viscous Flow Model
NASA Technical Reports Server (NTRS)
Keith, Theo G., Jr.; Bakhle, Milind A.
2001-01-01
Under this grant, Version 4 of the three-dimensional Navier-Stokes aeroelastic code (TURBO-AE) has been developed and verified. The TURBO-AE Version 4 aeroelastic code allows flutter calculations for a fan, compressor, or turbine blade row. This code models a vibrating three-dimensional bladed-disk configuration and the associated unsteady flow (including shocks and viscous effects) to calculate the aeroelastic instability using a work-per-cycle approach. Phase-lagged (time-shift) periodic boundary conditions are used to model the phase lag between adjacent vibrating blades. The direct-store approach is used for this purpose to reduce the computational domain to a single interblade passage. A disk storage option, implemented using direct-access files, is available to reduce the large memory requirements of the direct-store approach. Other researchers have implemented 3D inlet/exit boundary conditions based on eigen-analysis. Appendix A: Aeroelastic calculations based on three-dimensional Euler analysis. Appendix B: Unsteady aerodynamic modeling of blade vibration using the TURBO-V3.1 code.
Behavior Analysis in Distance Education: A Systems Approach.
ERIC Educational Resources Information Center
Coldeway, Dan O.
1987-01-01
Describes a model of instructional theory relevant to individualized distance education that is based on Keller's Personalized System of Instruction (PSI), behavior analysis, and the instructional systems development model (ISD). Systems theory is emphasized, and ISD and behavior analysis are discussed as cybernetic processes. (LRW)
NASA Technical Reports Server (NTRS)
Fayssal, Safie; Weldon, Danny
2008-01-01
The United States National Aeronautics and Space Administration (NASA) is in the midst of a space exploration program called Constellation to send crew and cargo to the International Space Station, to the moon, and beyond. As part of the Constellation program, a new launch vehicle, Ares I, is being developed by NASA Marshall Space Flight Center. Designing a launch vehicle with high reliability and increased safety requires a significant effort in understanding design variability and design uncertainty at the various levels of the design (system, element, subsystem, component, etc.) and throughout the various design phases (conceptual, preliminary design, etc.). In a previous paper [1] we discussed a probabilistic functional failure analysis approach intended mainly to support system requirements definition, system design, and element design during the early design phases. This paper provides an overview of the application of probabilistic engineering methods to support the detailed subsystem/component design and development as part of the "Design for Reliability and Safety" approach for the new Ares I launch vehicle. Specifically, the paper discusses probabilistic engineering design analysis cases that had major impact on the design and manufacturing of the Space Shuttle hardware. The cases represent important lessons learned from the Space Shuttle Program and clearly demonstrate the significance of probabilistic engineering analysis in better understanding design deficiencies and identifying potential design improvements for Ares I. The paper also discusses the probabilistic functional failure analysis approach applied during the early design phases of Ares I and the forward plans for probabilistic design analysis in the detailed design and development phases.
1992 NASA Life Support Systems Analysis workshop
NASA Technical Reports Server (NTRS)
Evanich, Peggy L.; Crabb, Thomas M.; Gartrell, Charles F.
1992-01-01
The 1992 Life Support Systems Analysis Workshop was sponsored by NASA's Office of Aeronautics and Space Technology (OAST) to integrate the inputs from, disseminate information to, and foster communication among NASA, industry, and academic specialists. The workshop continued discussion and definition of key issues identified in the 1991 workshop, including: (1) modeling and experimental validation; (2) definition of systems analysis evaluation criteria; (3) integration of modeling at multiple levels; and (4) assessment of process control modeling approaches. Through both the 1991 and 1992 workshops, NASA has continued to seek input from industry and university chemical process modeling and analysis experts, and to introduce and apply new systems analysis approaches to life support systems. The workshop included technical presentations, discussions, and interactive planning, with sufficient time allocated for discussion of both technology status and technology development recommendations. Key personnel currently involved with life support technology developments from NASA, industry, and academia provided input to the status and priorities of current and future systems analysis methods and requirements.
EDNA: Expert fault digraph analysis using CLIPS
NASA Technical Reports Server (NTRS)
Dixit, Vishweshwar V.
1990-01-01
Traditionally, fault models are represented by trees. Recently, digraph models have been proposed (Sack). Digraph models closely imitate the real system dependencies and hence are easy to develop, validate and maintain. However, they can also contain directed cycles, and analysis algorithms are hard to find; available algorithms tend to be complicated and slow. On the other hand, tree analysis (VGRH, Tayl) is well understood and rooted in a vast research effort and analytical techniques, and tree analysis algorithms are sophisticated and orders of magnitude faster. Transformation of a (cyclic) digraph into trees (CLP, LP) is a viable approach to blend the advantages of the two representations. Neither digraphs nor trees, however, provide the ability to handle heuristic knowledge; an expert system to capture the engineering knowledge is essential. We propose an approach here, namely expert network analysis, which combines the digraph representation with tree algorithms. The models are augmented by probabilistic and heuristic knowledge. CLIPS, an expert system shell from NASA-JSC, will be used to develop a tool. The technique provides the ability to handle probabilities and heuristic knowledge; mixed analysis, with some nodes carrying probabilities, is possible. The tool provides a graphics interface for input, query, and update. With this combined approach, it is expected to be a valuable tool in the design process as well as in the capture of final design knowledge.
High Fidelity System Simulation of Multiple Components in Support of the UEET Program
NASA Technical Reports Server (NTRS)
Plybon, Ronald C.; VanDeWall, Allan; Sampath, Rajiv; Balasubramaniam, Mahadevan; Mallina, Ramakrishna; Irani, Rohinton
2006-01-01
The High Fidelity System Simulation effort has addressed various important objectives to enable additional capability within the NPSS framework. The scope emphasized the High Pressure Turbine and High Pressure Compressor components. Initial effort was directed at developing and validating an intermediate-fidelity NPSS model using PD geometry, and was extended to a high-fidelity NPSS model by overlaying detailed geometry to validate CFD against rig data. Both "feedback" and "feed-forward" approaches to analysis zooming were employed to enable system simulation capability in NPSS. These approaches have distinct benefits and applicability. "Feedback" zooming allows the flow-up of information from a high-fidelity analysis to update the NPSS model results by forcing the NPSS solver to converge to the high-fidelity analysis predictions. This approach is effective in improving the accuracy of the NPSS model; however, it can only be used in circumstances where there is a clear physics-based strategy to flow up the high-fidelity analysis results to update the NPSS system model. The "feed-forward" zooming approach is more broadly useful in terms of enabling detailed analysis at early stages of design for a specified set of critical operating points and using these analysis results to drive design decisions early in the development process.
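The "feedback" zooming loop described above can be conveyed abstractly. The snippet below is a toy illustration, not the NPSS API; the compressor map, the scalar "adder" correction, and all numbers are invented for demonstration:

```python
# Illustrative sketch of "feedback" zooming (not the NPSS API): a scalar
# "adder" on a low-fidelity component map is iterated until the system
# model reproduces a high-fidelity (e.g., CFD) efficiency prediction.
def low_fidelity_efficiency(pressure_ratio: float, adder: float) -> float:
    """Toy compressor map: baseline efficiency plus a correction term."""
    return 0.88 - 0.01 * (pressure_ratio - 15.0) ** 2 + adder

def feedback_zoom(pressure_ratio: float, cfd_efficiency: float,
                  tol: float = 1e-8, max_iter: int = 50) -> float:
    """Solve for the adder that makes the system model match the CFD value."""
    adder = 0.0
    for _ in range(max_iter):
        residual = cfd_efficiency - low_fidelity_efficiency(pressure_ratio, adder)
        if abs(residual) < tol:
            break
        adder += residual  # fixed-point update; converges since map is affine in adder
    return adder

adder = feedback_zoom(pressure_ratio=16.0, cfd_efficiency=0.871)
print(f"map adder = {adder:+.4f}")
```

This also shows why the abstract notes the approach needs a physics-based flow-up strategy: the loop only makes sense when there is a well-defined map parameter for the high-fidelity result to correct.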
A new metaphor for projection-based visual analysis and data exploration
NASA Astrophysics Data System (ADS)
Schreck, Tobias; Panse, Christian
2007-01-01
In many important application domains such as Business and Finance, Process Monitoring, and Security, huge and quickly increasing volumes of complex data are collected. Strong efforts are underway developing automatic and interactive analysis tools for mining useful information from these data repositories. Many data analysis algorithms require an appropriate definition of similarity (or distance) between data instances to allow meaningful clustering, classification, and retrieval, among other analysis tasks. Projection-based data visualization is highly interesting (a) for visual discrimination analysis of a data set within a given similarity definition, and (b) for comparative analysis of similarity characteristics of a given data set represented by different similarity definitions. We introduce an intuitive and effective novel approach for projection-based similarity visualization for interactive discrimination analysis, data exploration, and visual evaluation of metric space effectiveness. The approach is based on the convex hull metaphor for visually aggregating sets of points in projected space, and it can be used with a variety of different projection techniques. The effectiveness of the approach is demonstrated by application on two well-known data sets. Statistical evidence supporting the validity of the hull metaphor is presented. We advocate the hull-based approach over the standard symbol-based approach to projection visualization, as it allows a more effective perception of similarity relationships and class distribution characteristics.
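The hull metaphor itself is straightforward to prototype. A minimal sketch, assuming scipy and matplotlib and synthetic two-class data already in projected 2D space (the paper's projection techniques and data sets are not reproduced here):

```python
# Minimal sketch of the hull metaphor: aggregate each class's projected
# points by its convex hull instead of drawing per-point symbols.
# Data and class labels are synthetic.
import numpy as np
import matplotlib.pyplot as plt
from scipy.spatial import ConvexHull

rng = np.random.default_rng(0)
classes = {
    "A": rng.normal(loc=(0.0, 0.0), scale=0.5, size=(40, 2)),
    "B": rng.normal(loc=(2.0, 1.0), scale=0.4, size=(40, 2)),
}

fig, ax = plt.subplots()
for label, pts in classes.items():
    hull = ConvexHull(pts)
    cycle = np.append(hull.vertices, hull.vertices[0])  # close the polygon
    ax.fill(pts[cycle, 0], pts[cycle, 1], alpha=0.3, label=label)
    ax.plot(pts[cycle, 0], pts[cycle, 1])
ax.legend()
plt.show()
```

Overlap (or separation) of the filled hulls then reads directly as discrimination quality under the chosen similarity definition, which is the perceptual advantage the authors claim over per-point symbols.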
2008-09-01
gathering and prioritization of their inputs, system development and implementation would become chaotic at best, and the developmental cost ... for shipbuilding. This study investigated current DoD Human Capital Management (HCM) strategies for attracting, developing, retaining and managing ... employed by these stakeholders. The result of the analysis was the development, via a functional analysis, of a notional HCM architecture for the
A qualitative approach to systemic diagnosis of the SSME
NASA Technical Reports Server (NTRS)
Bickmore, Timothy W.; Maul, William A.
1993-01-01
A generic software architecture has been developed for posttest diagnostics of rocket engines, and is presently being applied to the posttest analysis of the SSME. This investigation deals with the Systems Section module of the architecture, which is presently under development. Overviews of the manual SSME systems analysis process and the overall SSME diagnostic system architecture are presented.
ERIC Educational Resources Information Center
Lorenzo-Seva, Urbano; Ferrando, Pere J.
2013-01-01
FACTOR 9.2 was developed for three reasons. First, exploratory factor analysis (FA) is still an active field of research, although most recent developments have not been incorporated into available programs. Second, there is now renewed interest in semiconfirmatory (SC) solutions as suitable approaches to the complex structures that are commonly found…
ERIC Educational Resources Information Center
Brissett, Nigel; Mitter, Radhika
2017-01-01
We conduct a critical discourse analysis of the extent to which Sustainable Development Goal 4, "to ensure inclusive and equitable quality education for all and promote lifelong learning," promotes a utilitarian and/or transformative approach to education. Our findings show that despite transformative language used throughout the Agenda,…
ERIC Educational Resources Information Center
Duke, L. Donald; Schmidt, Diane L.
2011-01-01
The Toxics Geography Exercise was developed as an application-oriented exercise to develop skills in critical analysis in groups of undergraduate students from widely diverse academic backgrounds. Students use publicly available data on industrial activities, history of toxic material disposal, basic chemistry, regulatory approaches of federal and…
NASA Astrophysics Data System (ADS)
Szafranko, Elżbieta
2017-10-01
Assessment of variant solutions developed for a building investment project needs to be made at the planning stage. While considering alternative solutions, the investor defines various criteria, but a direct evaluation of the degree to which developed variant solutions fulfil them can be very difficult. In practice, there are different methods which enable the user to include a large number of parameters in an analysis, but their implementation can be challenging. Some methods require advanced mathematical computations, preceded by complicated input data processing, and the generated results may not lend themselves easily to interpretation. Hence, during her research, the author has developed a systemic approach, which involves several methods and whose goal is to compare their outcomes. The final stage of the proposed method consists of graphic interpretation of results. The method has been tested on a variety of building and development projects.
ERIC Educational Resources Information Center
Ho, Hsuan-Fu; Hung, Chia-Chi
2008-01-01
Purpose: The purpose of this paper is to examine how a graduate institute at National Chiayi University (NCYU), by using a model that integrates analytic hierarchy process, cluster analysis and correspondence analysis, can develop effective marketing strategies. Design/methodology/approach: This is primarily a quantitative study aimed at…
NASA Technical Reports Server (NTRS)
Zoladz, T.; Earhart, E.; Fiorucci, T.
1995-01-01
Utilizing high-frequency data from a highly instrumented rotor assembly, seeded bearing defect signatures are characterized using both conventional linear approaches, such as power spectral density analysis, and recently developed nonlinear techniques such as bicoherence analysis. Traditional low-frequency (less than 20 kHz) analysis and high-frequency envelope analysis of both accelerometer and acoustic emission data are used to recover characteristic bearing distress information buried deeply in acquired data. The successful coupling of newly developed nonlinear signal analysis with recovered wideband envelope data from accelerometers and acoustic emission sensors is the innovative focus of this research.
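The envelope-analysis step described above can be sketched generically. This is a standard high-frequency envelope demodulation illustration, not the authors' processing chain; the sample rate, resonance band, and defect frequency are synthetic assumptions:

```python
# Sketch of high-frequency envelope analysis for bearing defect detection
# (conventional step only; the paper's bicoherence analysis is not shown).
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

fs = 100_000                        # sample rate, Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)
carrier = np.sin(2 * np.pi * 32_000 * t)            # structural resonance
impacts = (np.sin(2 * np.pi * 157.0 * t) > 0.99)    # ~157 Hz defect impacts
signal = carrier * (0.2 + impacts) + 0.05 * np.random.randn(t.size)

# Band-pass around the excited resonance, then demodulate with the Hilbert
# transform: the envelope spectrum should show a line at the defect rate.
b, a = butter(4, [25_000, 40_000], btype="band", fs=fs)
envelope = np.abs(hilbert(filtfilt(b, a, signal)))
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(envelope.size, 1 / fs)
print("dominant envelope line: %.1f Hz" % freqs[spectrum.argmax()])
```

Demodulating the resonance band is what "recovers characteristic bearing distress information buried deeply in acquired data": the low-frequency defect repetition rate rides on the high-frequency carrier and only appears after enveloping.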
Challenges in Visual Analysis of Ensembles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crossno, Patricia
2018-04-12
Modeling physical phenomena through computational simulation increasingly relies on generating a collection of related runs, known as an ensemble. In this paper, we explore the challenges we face in developing analysis and visualization systems for large and complex ensemble data sets, which we seek to understand without having to view the results of every simulation run. Implementing approaches and ideas developed in response to this goal, we demonstrate the analysis of a 15K run material fracturing study using Slycat, our ensemble analysis system.
Zheng, Jie; Gaunt, Tom R; Day, Ian N M
2013-01-01
Genome-Wide Association Studies (GWAS) frequently incorporate meta-analysis within their framework. However, conditional analysis of individual-level data, which is an established approach for fine mapping of causal sites, is often precluded where only group-level summary data are available for analysis. Here, we present a numerical and graphical approach, "sequential sentinel SNP regional association plot" (SSS-RAP), which estimates regression coefficients (beta) with their standard errors using the meta-analysis summary results directly. Under an additive model, typical for genes with small effect, the effect for a sentinel SNP can be transformed to the predicted effect for a possibly dependent SNP through a 2×2 two-SNP haplotype table. The approach assumes Hardy-Weinberg equilibrium for test SNPs. SSS-RAP is available as a Web-tool (http://apps.biocompute.org.uk/sssrap/sssrap.cgi). To develop and illustrate SSS-RAP, we analyzed lipid and ECG traits data from the British Women's Heart and Health Study (BWHHS), evaluated a meta-analysis for an ECG trait, and presented several simulations. We compared results with existing approaches such as model selection methods and conditional analysis. Generally, findings were consistent. SSS-RAP represents a tool for testing independence of SNP association signals using meta-analysis data, and is also a convenient approach based on biological principles for fine mapping in group-level summary data. © 2012 Blackwell Publishing Ltd/University College London.
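The haplotype-table transformation at the core of the approach follows from the standard additive-model identity relating a sentinel SNP's effect to the expected marginal effect at a linked SNP. A minimal sketch of that identity (not the SSS-RAP web tool; the frequencies below are illustrative):

```python
# Sketch of the core transformation (not the SSS-RAP tool): under an
# additive model, the expected marginal effect at a test SNP B linked to
# a sentinel SNP A is beta_A * D / (p_B * (1 - p_B)), where D is the LD
# coefficient derived from the 2x2 two-SNP haplotype table.
def predicted_beta(beta_a: float, p_a: float, p_b: float, p_ab: float) -> float:
    """p_a, p_b: allele frequencies; p_ab: AB haplotype frequency."""
    d = p_ab - p_a * p_b               # LD coefficient from the haplotype table
    return beta_a * d / (p_b * (1.0 - p_b))

# Example: sentinel effect 0.30, moderate LD between A and B.
print(predicted_beta(beta_a=0.30, p_a=0.20, p_b=0.25, p_ab=0.12))
```

Comparing such a predicted beta with the observed meta-analysis beta at the test SNP is one way to judge whether the test SNP's signal is independent of the sentinel, which is the question the tool addresses.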
FEASIBILITY AND APPROACH FOR MAPPING RADON POTENTIALS IN FLORIDA
The report gives results of an analysis of the feasibility and approach for developing statewide maps of radon potentials in Florida. The maps would provide a geographic basis for implementing new radon-protective building construction standards to reduce public health risks from ...
EPA and EFSA approaches for Benchmark Dose modeling
Benchmark dose (BMD) modeling has become the preferred approach in the analysis of toxicological dose-response data for the purpose of deriving human health toxicity values. The software packages most often used are Benchmark Dose Software (BMDS, developed by EPA) and PROAST (de...
A parametric analysis of visual approaches for helicopters
NASA Technical Reports Server (NTRS)
Moen, G. C.; Dicarlo, D. J.; Yenni, K. R.
1976-01-01
A flight investigation was conducted to determine the characteristic shapes of the altitude, ground speed, and deceleration profiles of visual approaches for helicopters. Two hundred thirty-six visual approaches were flown from nine sets of initial conditions with four types of helicopters. Mathematical relationships were developed that describe the characteristic visual deceleration profiles. These mathematical relationships were expanded to develop equations which define the corresponding nominal ground speed, pitch attitude, pitch rate, and pitch acceleration profiles. Results are applicable to improved helicopter handling qualities in terminal area operations.
On the interplay between mathematics and biology. Hallmarks toward a new systems biology
NASA Astrophysics Data System (ADS)
Bellomo, Nicola; Elaiw, Ahmed; Althiabi, Abdullah M.; Alghamdi, Mohammed Ali
2015-03-01
This paper proposes a critical analysis of the existing literature on mathematical tools developed toward systems biology approaches and, out of this overview, develops a new approach whose main features can be briefly summarized as follows: derivation of mathematical structures suitable to capture the complexity of biological, hence living, systems; and modeling, by appropriate mathematical tools, Darwinian-type dynamics, namely mutations followed by selection and evolution. Moreover, multiscale methods to move from genes to cells, and from cells to tissue, are analyzed in view of a new systems biology approach.
Crew Exploration Vehicle (CEV) Avionics Integration Laboratory (CAIL) Independent Analysis
NASA Technical Reports Server (NTRS)
Davis, Mitchell L.; Aguilar, Michael L.; Mora, Victor D.; Regenie, Victoria A.; Ritz, William F.
2009-01-01
Two approaches were compared to the Crew Exploration Vehicle (CEV) Avionics Integration Laboratory (CAIL) approach: the Flat-Sat and the Shuttle Avionics Integration Laboratory (SAIL). The Flat-Sat and CAIL/SAIL approaches are two different tools designed to mitigate different risks. The Flat-Sat approach is designed to develop a mission concept into a flight avionics system and associated ground controller. The SAIL approach is designed to aid in the flight readiness verification of the flight avionics system. The approaches are complementary in addressing both the system development risks and mission verification risks. The following NESC team findings were identified: the CAIL assumption is that the flight subsystems will be matured for system-level verification; the Flat-Sat and SAIL approaches are two different tools designed to mitigate different risks. The following NESC team recommendation was provided: define, document, and manage a detailed interface between the design and development (EDL and other integration labs) and the verification laboratory (CAIL).
Simulating The Change In Agricultural Fruit Patterns In The Context of River Basin Modelling
NASA Astrophysics Data System (ADS)
Kloecking, B.; Laue, K.; Stroebl, B.
A new concept has been developed for the integrated analysis of impacts of Global Change and direct human activities on the environment and the society in mesoscale river basins. The main steps of this approach are: (1) developing a set of regional scenarios of change considering expected changes in climate and in economic, demographic, and social development; (2) identification of indicators of sustainability for the impact assessment; (3) impact analysis of the defined scenarios of development; (4) evaluation of the different scenarios on the basis of the impact analysis to elaborate new strategies in regional development. All steps include consultations with actors and stakeholders. The concept is applied in the western part of Thuringia (7,500 km2), covering the basin of the Unstrut river. This part of the German Elbe river basin is highly suited for food production under the present conditions. Therefore it is a good site for vulnerability studies focused on agriculture. The development of agricultural land-use scenarios for the Unstrut region will be done in the form of a bottom-up approach based on adaptation reactions of example farms within the expected boundary conditions, such as the global food markets and other global economic trends as well as international agreements. Representing the present conditions in Thuringia, a reference land-use scenario was developed, assuming a complete realisation of the AGENDA 2000 resolutions. Impacts of changed land use, in combination with climate change scenarios, on plant production and on the availability and quality of water are being investigated with the help of a spatially distributed river basin model. A GIS-based approach was developed to locate the spatially non-explicit land-use scenarios. This approach makes it possible to reproduce the agricultural fruit patterns of a region in a river basin model without taking into account the real field boundaries. First simulation results for the reference climate and land-use scenario for the Unstrut region will be presented.
Investigation of type-I interferon dysregulation by arenaviruses : a multidisciplinary approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kozina, Carol L.; Moorman, Matthew Wallace; Branda, Catherine
2011-09-01
This report provides a detailed overview of the work performed for project number 130781, 'A Systems Biology Approach to Understanding Viral Hemorrhagic Fever Pathogenesis.' We report progress in five key areas: single cell isolation devices and control systems, fluorescent cytokine and transcription factor reporters, on-chip viral infection assays, molecular virology analysis of Arenavirus nucleoprotein structure-function, and development of computational tools to predict virus-host protein interactions. Although a great deal of work remains from that begun here, we have developed several novel single cell analysis tools and knowledge of Arenavirus biology that will facilitate and inform future publications and funding proposals.
NASA Technical Reports Server (NTRS)
Schlater, Nelson J.; Simonds, Charles H.; Ballin, Mark G.
1993-01-01
Applied research and technology development (R&TD) is often characterized by uncertainty, risk, and significant delays before tangible returns are obtained. Given the increased awareness of limitations in resources, effective R&TD today needs a method for up-front assessment of competing technologies to help guide technology investment decisions. Such an assessment approach must account for uncertainties in system performance parameters, mission requirements and architectures, and internal and external events influencing a development program. The methodology known as decision analysis has the potential to address these issues. It was evaluated by performing a case study assessment of alternative carbon dioxide removal technologies for NASA's proposed First Lunar Outpost program. An approach was developed that accounts for the uncertainties in each technology's cost and performance parameters as well as programmatic uncertainties such as mission architecture. Life cycle cost savings relative to a baseline, adjusted for the cost of money, was used as a figure of merit to evaluate each of the alternative carbon dioxide removal technology candidates. The methodology was found to provide a consistent decision-making strategy for the development of new life support technology. The case study results provided insight that was not possible from more traditional analysis approaches.
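The flavor of such a decision analysis can be conveyed with a small Monte Carlo sketch. Everything below is an invented illustration, not the study's model: the two technology names, the distributions, and the toy figure of merit are all assumptions:

```python
# Hedged sketch of the decision-analysis idea (not the study's model):
# Monte Carlo over uncertain cost/performance parameters to compare the
# life cycle cost savings of two candidate CO2-removal technologies.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

def lifecycle_savings(dev_cost, unit_mass, failures):
    """Toy figure of merit: savings vs. a baseline, in $M (illustrative)."""
    launch_cost = 50.0 * unit_mass          # transport cost driven by mass
    return 1_000.0 - dev_cost - launch_cost - 120.0 * failures

techs = {
    # development cost ($M), delivered mass (kg), expected failure count
    "regenerable sorbent": (rng.normal(300, 60, N),
                            rng.normal(8.0, 1.5, N),
                            rng.poisson(1.0, N)),
    "membrane separator": (rng.normal(220, 90, N),
                           rng.normal(11.0, 2.5, N),
                           rng.poisson(2.0, N)),
}

for name, params in techs.items():
    s = lifecycle_savings(*params)
    print(f"{name:20s} mean={s.mean():8.1f}  P(savings<0)={np.mean(s < 0):.2%}")
```

Reporting both the mean savings and the probability of a loss is what distinguishes this style of analysis from a single-point deterministic comparison.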
Analytical Tools for Space Suit Design
NASA Technical Reports Server (NTRS)
Aitchison, Lindsay
2011-01-01
As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach, with emphasis on building prototypes and then testing with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analysis of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state-of-the-art software and tools are discussed.
Life cycle approaches to sustainable consumption: a critical review.
Hertwich, Edgar G
2005-07-01
The 2002 World Summit for Sustainable Development in Johannesburg called for a comprehensive set of programs focusing on sustainable consumption and production. According to world leaders, these programs should rely on life cycle assessment (LCA) to promote sustainable patterns of production and consumption. Cleaner production is a well-established activity, and it uses LCA. UNEP, the European Union, and a number of national organizations have now begun to work on sustainable consumption. In developing sustainable consumption policies and activities, the use of LCA presents interesting opportunities that are not yet well understood by policy makers. This paper reviews how life cycle approaches, primarily based on input-output analysis, have been used in the area of sustainable consumption: to inform policy making, select areas of action, identify which lifestyles are more sustainable, advise consumers, and evaluate the effectiveness of sustainable consumption measures. Information on consumption patterns usually comes from consumer expenditure surveys. Different study designs and a better integration with consumer research can provide further interesting insights. Life-cycle approaches still need to be developed and tested. Current research is mostly descriptive; policy makers, however, require more strategic analysis addressing their decision options, including scenario analysis and backcasting.
Developing students’ ideas about lens imaging: teaching experiments with an image-based approach
NASA Astrophysics Data System (ADS)
Grusche, Sascha
2017-07-01
Lens imaging is a classic topic in physics education. To guide students from their holistic viewpoint to the scientists’ analytic viewpoint, an image-based approach to lens imaging has recently been proposed. To study the effect of the image-based approach on undergraduate students’ ideas, teaching experiments are performed and evaluated using qualitative content analysis. Some of the students’ ideas have not been reported before, namely those related to blurry lens images, and those developed by the proposed teaching approach. To describe learning pathways systematically, a conception-versus-time coordinate system is introduced, specifying how teaching actions help students advance toward a scientific understanding.
Nanobiocatalysis for protein digestion in proteomic analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Jungbae; Kim, Byoung Chan; Lopez-Ferrer, Daniel
2010-02-01
The process of protein digestion is a critical step for successful protein identification in bottom-up proteomic analysis. To replace the present practice of in-solution protein digestion, which is long, tedious, and difficult to automate, much effort has been dedicated to the development of rapid, recyclable, and automated digestion systems. Recent advances in nanobiocatalytic approaches have improved the performance of protein digestion by using various nanomaterials such as nanoporous materials, magnetic nanoparticles, and polymer nanofibers. Especially, the unprecedented success of trypsin stabilization in the form of trypsin-coated nanofibers, showing no activity decrease under repeated uses for one year and retaining good resistance to proteolysis, has demonstrated its great potential to be employed in the development of automated, high-throughput, and on-line digestion systems. This review discusses recent developments of nanobiocatalytic approaches for the improved performance of protein digestion in speed, detection sensitivity, recyclability, and trypsin stability. In addition, we also introduce protein digestion under unconventional energy inputs for protein denaturation and the development of microfluidic enzyme reactors that can benefit from recent successes of these nanobiocatalytic approaches.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lon N. Haney; David I. Gertman
2003-04-01
Beginning in the 1980s, a primary focus of human reliability analysis was estimation of human error probabilities. However, detailed qualitative modeling with comprehensive representation of contextual variables often was lacking. This was likely due to the lack of comprehensive error and performance shaping factor taxonomies, and the limited data available on observed error rates and their relationship to specific contextual variables. In the mid-1990s, Boeing, America West Airlines, NASA Ames Research Center, and INEEL partnered in a NASA-sponsored Advanced Concepts grant to: assess the state of the art in human error analysis, identify future needs for human error analysis, and develop an approach addressing these needs. Identified needs included the need for a method to identify and prioritize task and contextual characteristics affecting human reliability. Other needs identified included developing comprehensive taxonomies to support detailed qualitative modeling and to structure meaningful data collection efforts across domains. A result was the development of the FRamework Assessing Notorious Contributing Influences for Error (FRANCIE) with a taxonomy for airline maintenance tasks. The assignment of performance shaping factors to generic errors by experts proved to be valuable to qualitative modeling. Performance shaping factors and error types from such detailed approaches can be used to structure error reporting schemes. In a recent NASA Advanced Human Support Technology grant, FRANCIE was refined, and two new taxonomies for use on space missions were developed. The development, sharing, and use of error taxonomies, and the refinement of approaches for increased fidelity of qualitative modeling, is offered as a means to help direct useful data collection strategies.
Development of a High-Order Space-Time Matrix-Free Adjoint Solver
NASA Technical Reports Server (NTRS)
Ceze, Marco A.; Diosady, Laslo T.; Murman, Scott M.
2016-01-01
The growth in computational power and algorithm development in the past few decades has granted the science and engineering community the ability to simulate flows over complex geometries, thus making Computational Fluid Dynamics (CFD) tools indispensable in analysis and design. Currently, one of the pacing items limiting the utility of CFD for general problems is the prediction of unsteady turbulent flows [1-3]. Reynolds-averaged Navier-Stokes (RANS) methods, which predict a time-invariant mean flowfield, struggle to provide consistent predictions when encountering even mild separation, such as the side-of-body separation at a wing-body junction. NASA's Transformative Tools and Technologies project is developing both numerical methods and physical modeling approaches to improve the prediction of separated flows. A major focus of this effort is efficient methods for resolving the unsteady fluctuations occurring in these flows to provide valuable engineering data of the time-accurate flow field for buffet analysis, vortex shedding, etc. This approach encompasses unsteady RANS (URANS), large-eddy simulations (LES), and hybrid LES-RANS approaches such as Detached Eddy Simulations (DES). These unsteady approaches are inherently more expensive than traditional engineering RANS approaches, hence every effort to mitigate this cost must be leveraged. Arguably, the most cost-effective approach to improve the efficiency of unsteady methods is the optimal placement of the spatial and temporal degrees of freedom (DOF) using solution-adaptive methods.
An exchange format for use-cases of hospital information systems.
Masuda, G; Sakamoto, N; Sakai, R; Yamamoto, R
2001-01-01
Object-oriented software development is a powerful methodology for the development of large hospital information systems. We think the use-case driven approach is particularly useful for such development. In the use-case driven approach, use-cases are documented at the first stage of the software development process and are then used throughout all steps in a variety of ways. Therefore, it is important to exchange and share use-cases and make effective use of them through the overall lifecycle of a development process. In this paper, we propose a method of sharing and exchanging use-case models between applications, developers, and projects. We design an XML-based exchange format for use-cases. We then discuss an application of the exchange format to support several software development activities. We preliminarily implemented a support system for object-oriented analysis based on the exchange format. The result shows that using the structural and semantic information in the exchange format enables the support system to assist object-oriented analysis successfully.
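As a rough illustration of what an XML exchange format for use-cases might look like, the following sketch builds one use-case document with Python's standard library; all element and attribute names are hypothetical, since the paper's actual schema is not given in the abstract:

```python
# Illustrative sketch only: element and attribute names are hypothetical,
# not the schema proposed in the paper.
import xml.etree.ElementTree as ET

uc = ET.Element("useCase", id="UC-07", name="Register inpatient")
ET.SubElement(uc, "actor").text = "Admissions clerk"
ET.SubElement(uc, "precondition").text = "Patient record does not yet exist"
flow = ET.SubElement(uc, "basicFlow")
for n, step in enumerate(["Clerk enters patient demographics",
                          "System validates insurance data",
                          "System creates the patient record"], start=1):
    ET.SubElement(flow, "step", number=str(n)).text = step

print(ET.tostring(uc, encoding="unicode"))
```

Structuring the flow as numbered step elements is the kind of "structural and semantic information" the abstract says a downstream analysis-support system can exploit, for example to suggest candidate objects from the nouns in each step.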
Nolan, John P.; Mandy, Francis
2008-01-01
While the term flow cytometry refers to the measurement of cells, the approach of making sensitive multiparameter optical measurements in a flowing sample stream is a very general analytical approach. The past few years have seen an explosion in the application of flow cytometry technology for molecular analysis and measurements using micro-particles as solid supports. While microsphere-based molecular analyses using flow cytometry date back three decades, the need for highly parallel quantitative molecular measurements that has arisen from various genomic and proteomic advances has driven the development of particle encoding technology to enable highly multiplexed assays. Multiplexed particle-based immunoassays are now commonplace, as are new assays to study genes, protein function, and molecular assembly. Numerous efforts are underway to extend the multiplexing capabilities of microparticle-based assays through new approaches to particle encoding and analyte reporting. The impact of these developments will be seen in basic research and clinical laboratories, as well as in drug development. PMID:16604537
Unified Approach to the Biomechanics of Dental Implantology
NASA Technical Reports Server (NTRS)
Grenoble, D. E.; Knoell, A. C.
1973-01-01
The human need for safe and effective dental implants is well-recognized. Although many implant designs have been tested and are in use today, a large number have resulted in clinical failure. These failures appear to be due to biomechanical effects, as well as biocompatibility and surgical factors. A unified approach is proposed using multidisciplinary systems technology, for the study of the biomechanical interactions between dental implants and host tissues. The approach progresses from biomechanical modeling and analysis, supported by experimental investigations, through implant design development, clinical verification, and education of the dental practitioner. The result of the biomechanical modeling, analysis, and experimental phases would be the development of scientific design criteria for implants. Implant designs meeting these criteria would be generated, fabricated, and tested in animals. After design acceptance, these implants would be tested in humans, using efficient and safe surgical and restorative procedures. Finally, educational media and instructional courses would be developed for training dental practitioners in the use of the resulting implants.
Holt, Vernon P; Ladwa, Russ
2009-10-01
Mentoring and coaching, as they are currently practised, are relatively new techniques for working with people. The roots of the current approach can be traced back to the psychotherapist Carl Rogers, who developed a new 'person-centred approach' to counselling and quickly realised that this approach was also appropriate for many types of relationship, from education to family life. Rogers' thinking was deeply influenced by dialogues with his friend, the existentialist philosopher Martin Buber. Developments in psychology building upon this new person-centred approach include transactional analysis (TA) and neurolinguistic programming (NLP). More recently, solutions-focused approaches have been used, and a related approach to leadership in the business environment, strengths-based leadership, has been developed. In recent years, developments in neuroscience have greatly increased understanding not only of how the brain is 'wired up' but also of how it is specifically wired to function as a social organ. The increased understanding in these areas can be considered in the context of emotional and social intelligence. These concepts and knowledge have been drawn together into a more structured discipline with the development of the approach known as positive psychology, the focus of which is on the strengths and virtues that contribute to good performance and authentic happiness.
Planning for population viability on Northern Great Plains national grasslands
Samson, F.B.; Knopf, F.L.; McCarthy, C.W.; Noon, B.R.; Ostlie, W.R.; Rinehart, S.M.; Larson, S.; Plumb, G.E.; Schenbeck, G.L.; Svingen, D.N.; Byer, T.W.
2003-01-01
Broad-scale information in concert with conservation of individual species must be used to develop conservation priorities and a more integrated ecosystem protection strategy. In 1999 the United States Forest Service initiated an approach for the 1.2×10^6 ha of national grasslands in the Northern Great Plains to fulfill the requirement to maintain viable populations of all native and desirable introduced vertebrate and plant species. The challenge was threefold: 1) develop basic building blocks in the conservation planning approach, 2) apply the approach to national grasslands, and 3) overcome differences that may exist in agency-specific legal and policy requirements. Key assessment components in the approach included a bioregional assessment, coarse-filter analysis, and fine-filter analysis aimed at species considered at-risk. A science team of agency, conservation organization, and university personnel was established to develop the guidelines and standards and other formal procedures for implementation of conservation strategies. Conservation strategies included coarse-filter recommendations to restore the tallgrass, mixed, and shortgrass prairies to conditions that approximate historical ecological processes and landscape patterns, and fine-filter recommendations to address viability needs of individual and multiple species of native animals and plants. Results include a cost-effective approach to conservation planning and recommendations for addressing population viability and biodiversity concerns on national grasslands in the Northern Great Plains.
NASA Technical Reports Server (NTRS)
Shackelford, John H.; Saugen, John D.; Wurst, Michael J.; Adler, James
1991-01-01
A generic planar 3-degree-of-freedom simulation was developed that supports hardware-in-the-loop simulations and guidance and control analysis, and can directly generate flight software. This simulation was developed in a small amount of time utilizing rapid prototyping techniques. The approach taken to develop this simulation tool, the benefits seen using this approach to development, and on-going efforts to improve and extend this capability are described. The simulation is composed of three major elements: (1) a docker dynamics model, (2) a dockee dynamics model, and (3) a docker control system. The docker and dockee models are based on simple planar orbital dynamics equations using a spherical Earth gravity model. The docker control system is based on a phase plane approach to error correction.
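The phase plane approach to error correction can be sketched for one translational axis. This is a generic textbook-style illustration, not the simulation's flight software; the switching-line slope, deadband, and thruster acceleration are assumed values:

```python
# Minimal sketch of phase-plane error correction (not the flight code):
# thruster commands come from the sign of a switching function in the
# (position error, velocity error) plane, with a deadband to limit firings.
def phase_plane_command(pos_err: float, vel_err: float,
                        deadband: float = 0.05, slope: float = 2.0) -> int:
    """Return -1, 0, or +1 thruster command for one axis."""
    switch = pos_err + slope * vel_err   # switching line in the phase plane
    if switch > deadband:
        return -1        # fire to drive a positive error back down
    if switch < -deadband:
        return +1
    return 0             # inside the deadband: coast

# One-axis closed-loop toy run of a docker closing on a dockee.
pos, vel, dt, accel = 1.0, 0.0, 0.1, 0.02
for _ in range(600):
    vel += accel * phase_plane_command(pos, vel) * dt
    pos += vel * dt
print(f"final position error = {pos:+.3f} m, velocity = {vel:+.4f} m/s")
```

The deadband trades terminal accuracy against thruster activity, which is the usual design knob in phase-plane docking controllers.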
Recent advances in statistical energy analysis
NASA Technical Reports Server (NTRS)
Heron, K. H.
1992-01-01
Statistical Energy Analysis (SEA) has traditionally been developed using a modal summation and averaging approach, which has led to the need for many restrictive SEA assumptions. The assumption of 'weak coupling' is particularly unacceptable when attempts are made to apply SEA to structural coupling. It is now believed that this assumption is more a function of the modal formulation than a necessary part of SEA. The present analysis ignores this restriction and describes a wave approach to the calculation of plate-plate coupling loss factors. Predictions based on this method are compared with results obtained from experiments using point excitation on one side of an irregular six-sided box structure. Conclusions show that the use and calculation of infinite transmission coefficients is the way forward for the development of a purely predictive SEA code.
Use of application containers and workflows for genomic data analysis.
Schulz, Wade L; Durant, Thomas J S; Siddon, Alexa J; Torres, Richard
2016-01-01
The rapid acquisition of biological data and development of computationally intensive analyses has led to a need for novel approaches to software deployment. In particular, the complexity of common analytic tools for genomics makes them difficult to deploy and decreases the reproducibility of computational experiments. Recent technologies that allow for application virtualization, such as Docker, allow developers and bioinformaticians to isolate these applications and deploy secure, scalable platforms that have the potential to dramatically increase the efficiency of big data processing. While limitations exist, this study demonstrates a successful implementation of a pipeline with several discrete software applications for the analysis of next-generation sequencing (NGS) data. With this approach, we significantly reduced the amount of time needed to perform clonal analysis from NGS data in acute myeloid leukemia.
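The container-per-stage pattern the study describes can be sketched with the Docker CLI driven from Python. Image names, tags, and file paths below are illustrative assumptions, not the paper's pipeline:

```python
# Hedged sketch of chaining containerized NGS tools (image names, tags,
# and file paths are illustrative; not the pipeline from the paper).
import subprocess
from pathlib import Path

DATA = Path("/data/run42").resolve()

def run_in_container(image: str, command: list[str]) -> None:
    """Run one pipeline stage inside a throwaway container."""
    subprocess.run(
        ["docker", "run", "--rm", "-v", f"{DATA}:/work", "-w", "/work",
         image, *command],
        check=True,
    )

# Each stage is isolated in its own image, so tool versions are pinned and
# the whole analysis is reproducible on any host with Docker installed.
run_in_container("biocontainers/fastqc:v0.11.9", ["fastqc", "sample.fastq.gz"])
run_in_container("biocontainers/bwa:v0.7.17",
                 ["bwa", "mem", "-o", "sample.sam", "ref.fa", "sample.fastq.gz"])
```

Pinning an exact image tag per stage is what delivers the reproducibility gain the abstract highlights: rerunning the pipeline months later uses byte-identical tool builds.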
A Practical Engineering Approach to Predicting Fatigue Crack Growth in Riveted Lap Joints
NASA Technical Reports Server (NTRS)
Harris, Charles E.; Piascik, Robert S.; Newman, James C., Jr.
1999-01-01
An extensive experimental database has been assembled from very detailed teardown examinations of fatigue cracks found in rivet holes of fuselage structural components. Based on this experimental database, a comprehensive analysis methodology was developed to predict the onset of widespread fatigue damage in lap joints of fuselage structure. Several computer codes were developed with specialized capabilities to conduct the various analyses that make up the comprehensive methodology. Over the past several years, the authors have interrogated various aspects of the analysis methods to determine the degree of computational rigor required to produce numerical predictions with acceptable engineering accuracy. This study led to the formulation of a practical engineering approach to predicting fatigue crack growth in riveted lap joints. This paper describes the practical engineering approach and compares predictions with the results from several experimental studies.
ATDRS payload technology research and development
NASA Technical Reports Server (NTRS)
Anzic, G.; Connolly, D. J.; Fujikawa, G.; Andro, M.; Kunath, R. R.; Sharp, G. R.
1990-01-01
Four technology development tasks were chosen to reduce (or at least better understand) the technology risks associated with proposed approaches to Advanced Tracking and Data Relay Satellite (ATDRS). The four tasks relate to a Tri-Band Antenna feed system, a Digital Beamforming System for the S Band Multiple Access System (SMA), an SMA Phased Array Antenna, and a Configuration Thermal/Mechanical Analysis task. The objective, approach, and status of each are discussed.
NASA Astrophysics Data System (ADS)
Labate, Demetrio; Negi, Pooran; Ozcan, Burcin; Papadakis, Manos
2015-09-01
As advances in imaging technologies make more and more data available for biomedical applications, there is an increasing need to develop efficient quantitative algorithms for the analysis and processing of imaging data. In this paper, we introduce an innovative multiscale approach called Directional Ratio which is especially effective at distinguishing isotropic from anisotropic structures. This capability is especially useful in the analysis of images of neurons, the main units of the nervous system, which consist of a main cell body called the soma and many elongated processes called neurites. We analyze the theoretical properties of our method on idealized models of neurons and develop a numerical implementation of this approach for the analysis of fluorescent images of cultured neurons. We show that this algorithm is very effective for the detection of somas and the extraction of neurites in images of small circuits of neurons.
NASA Technical Reports Server (NTRS)
Baker, V. R. (Principal Investigator); Holz, R. K.; Hulke, S. D.; Patton, P. C.; Penteado, M. M.
1975-01-01
The author has identified the following significant results. Development of a quantitative hydrogeomorphic approach to flood hazard evaluation was hindered by (1) problems of resolution and definition of the morphometric parameters which have hydrologic significance, and (2) mechanical difficulties in creating the necessary volume of data for meaningful analysis. Measures of network resolution such as drainage density and basin Shreve magnitude indicated that large scale topographic maps offered greater resolution than small scale suborbital imagery and orbital imagery. The disparity in network resolution capabilities between orbital and suborbital imagery formats depends on factors such as rock type, vegetation, and land use. The problem of morphometric data analysis was approached by developing a computer-assisted method for network analysis. The system allows rapid identification of network properties which can then be related to measures of flood response.
Integration of Engine, Plume, and CFD Analyses in Conceptual Design of Low-Boom Supersonic Aircraft
NASA Technical Reports Server (NTRS)
Li, Wu; Campbell, Richard; Geiselhart, Karl; Shields, Elwood; Nayani, Sudheer; Shenoy, Rajiv
2009-01-01
This paper documents an integration of engine, plume, and computational fluid dynamics (CFD) analyses in the conceptual design of low-boom supersonic aircraft, using a variable fidelity approach. In particular, the Numerical Propulsion Simulation System (NPSS) is used for propulsion system cycle analysis and nacelle outer mold line definition, and a low-fidelity plume model is developed for plume shape prediction based on NPSS engine data and nacelle geometry. This model provides a capability for the conceptual design of low-boom supersonic aircraft that accounts for plume effects. Then a newly developed process for automated CFD analysis is presented for CFD-based plume and boom analyses of the conceptual geometry. Five test cases are used to demonstrate the integrated engine, plume, and CFD analysis process based on a variable fidelity approach, as well as the feasibility of the automated CFD plume and boom analysis capability.
Integrative prescreening in analysis of multiple cancer genomic studies
2012-01-01
Background In high throughput cancer genomic studies, results from the analysis of single datasets often suffer from a lack of reproducibility because of small sample sizes. Integrative analysis can effectively pool and analyze multiple datasets and provides a cost effective way to improve reproducibility. In integrative analysis, simultaneously analyzing all genes profiled may incur high computational cost. A computationally affordable remedy is prescreening, which fits marginal models, can be conducted in a parallel manner, and has low computational cost. Results An integrative prescreening approach is developed for the analysis of multiple cancer genomic datasets. Simulation shows that the proposed integrative prescreening has better performance than alternatives, particularly including prescreening with individual datasets, an intensity approach and meta-analysis. We also analyze multiple microarray gene profiling studies on liver and pancreatic cancers using the proposed approach. Conclusions The proposed integrative prescreening provides an effective way to reduce the dimensionality in cancer genomic studies. It can be coupled with existing analysis methods to identify cancer markers. PMID:22799431
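A simplified stand-in for marginal-model prescreening across studies is sketched below; it combines per-dataset p-values with Fisher's method, which is a generic choice rather than the authors' exact integrative statistic, and the data are synthetic:

```python
# Sketch of marginal-model prescreening pooled across studies (simplified
# stand-in for the paper's statistic): per gene, fit a marginal association
# in each dataset and combine evidence with Fisher's method, keeping the
# top-ranked genes for downstream integrative analysis.
import numpy as np
from scipy import stats

def marginal_pvalue(expression: np.ndarray, outcome: np.ndarray) -> float:
    """Two-sample t-test of one gene's expression against a binary outcome."""
    a, b = expression[outcome == 1], expression[outcome == 0]
    return stats.ttest_ind(a, b, equal_var=False).pvalue

def prescreen(datasets, keep: int = 100) -> np.ndarray:
    """datasets: list of (X [samples x genes], y) pairs; returns gene indices."""
    n_genes = datasets[0][0].shape[1]
    chi2 = np.zeros(n_genes)
    for X, y in datasets:                       # embarrassingly parallel
        for g in range(n_genes):
            chi2[g] += -2.0 * np.log(marginal_pvalue(X[:, g], y))
    # Fisher's combined statistic ~ chi-square with 2 * n_datasets d.o.f.
    return np.argsort(chi2)[::-1][:keep]

rng = np.random.default_rng(1)
data = [(rng.normal(size=(60, 500)), rng.integers(0, 2, 60)) for _ in range(3)]
print(prescreen(data, keep=10))
```

Because each gene-by-dataset fit is independent, the loop parallelizes trivially, which is the low-computational-cost property the abstract emphasizes.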
Ice Accretion Modeling using an Eulerian Approach for Droplet Impingement
NASA Technical Reports Server (NTRS)
Kim, Joe Woong; Garza, Dennis P.; Sankar, Lakshmi N.; Kreeger, Richard E.
2012-01-01
A three-dimensional Eulerian analysis has been developed for modeling droplet impingement on lifting bodies. The Eulerian model solves the conservation equations of mass and momentum to obtain the droplet flow field properties on the same mesh used in CFD simulations. For complex configurations such as a full rotorcraft, the Eulerian approach is more efficient because the Lagrangian approach would require a significant amount of seeding for accurate estimates of collection efficiency. Simulations are done for various benchmark cases such as the NACA0012 airfoil, the MS317 airfoil, and an oscillating SC2110 airfoil to illustrate its use. The present results are compared with results from the Lagrangian approach used in an industry-standard analysis called LEWICE.
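For reference, a commonly cited nondimensional form of the Eulerian droplet conservation equations (after Bourgault and co-workers) is shown below; the exact formulation used in the paper may differ:

```latex
% A commonly used form of the Eulerian droplet model (after Bourgault et al.);
% alpha is the droplet volume fraction, u the droplet velocity, u_a the air
% velocity, K a droplet inertia parameter, and Fr the Froude number.
\begin{align}
  \frac{\partial \alpha}{\partial t} + \nabla \cdot (\alpha \mathbf{u}) &= 0,\\
  \frac{\partial (\alpha \mathbf{u})}{\partial t}
    + \nabla \cdot (\alpha \mathbf{u} \otimes \mathbf{u})
  &= \alpha \, \frac{C_D\, \mathrm{Re}_d}{24 K}\,(\mathbf{u}_a - \mathbf{u})
   + \alpha \left(1 - \frac{\rho_a}{\rho_d}\right) \frac{1}{\mathrm{Fr}^2}\, \mathbf{g}.
\end{align}
```

Because the droplet field is a continuum solved on the aerodynamic mesh, collection efficiency follows directly from the flux of the droplet phase through the body surface, with no particle seeding at all.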
Shedding light into the function of the earliest vertebrate skeleton
NASA Astrophysics Data System (ADS)
Martinez-Perez, Carlos; Purnell, Mark; Rayfield, Emily; Donoghue, Philip
2016-04-01
Conodonts are an extinct group of jawless vertebrates, the first in our evolutionary lineage to develop a biomineralized skeleton. As such, the conodont skeleton is of great significance because of the insights it provides concerning the biology and function of the primitive vertebrate skeleton. Conodont function has been debated for a century and a half on the basis of its palaeoecological importance in Palaeozoic ecosystems. However, the lack of close extant representatives and the small size of the conodont elements (under a millimeter in length) strongly limited functional analysis, which was traditionally restricted to analogy. More recently, qualitative approaches have been developed, facilitating tests of element function based on occlusal performance and analysis of microwear and microstructure. In this work we extend these approaches using novel quantitative experimental methods, including Synchrotron Radiation X-ray Tomographic Microscopy and Finite Element Analysis, to test hypotheses of conodont function. The development of high-resolution virtual models of conodont elements, together with biomechanical approaches using Finite Element Analysis informed by occlusal and microwear analyses, provided conclusive support for hypotheses of structural adaptation within the crown tissue microstructure, showing close topological co-variation patterns of compressive and tensile stress distribution with different crystallite orientations. In addition, our computational analyses strongly support a tooth-like function for many conodont species. Above all, our study establishes a framework (experimental approach) in which the functional ecology of conodonts can be read from their rich taxonomy and phylogeny, representing an important attempt to understand the role of this abundant and diverse clade in Phanerozoic marine ecosystems.
NASA Astrophysics Data System (ADS)
McMahon, Kendra
2012-07-01
By developing two case studies of expert teaching in action, this study aimed to develop knowledge of talk in whole-class teaching in UK primary science lessons and understand this in relation to both the teachers' interpretations and sociocultural theoretical frameworks. Lessons were observed and video-recorded and the teachers engaged in video-stimulated-reflective dialogue to capture participants' reflections upon their own pedagogic purposes and interactions in the classroom. The analytic framework was developed at three levels: sequence of lessons, lesson, and episode. For each episode, the 'communicative approach' and teaching purposes were recorded. Transcripts were developed for fine grain analysis of selected episodes and a quantitative analysis was undertaken of the use of communicative approaches. Findings exemplify how different communicative approaches were used by the case-study teachers for different pedagogical purposes at different points in the sequence of lessons, contributing to primary teachers' repertoire for planning and practice. The initial elicitation of children's ideas can be understood as pooling them to enhance multivoicedness and develop a shared resource for future dialogues. Whole-class talk can support univocality by rehearsing procedural knowledge and exploring the meanings of scientific terminology. Identifying salient features of phenomena in the context of the whole-class marks them as significant as shared knowledge but valuing other observations extends the multivoicedness of the discourse.
New Technologies for Rapid Bacterial Identification and Antibiotic Resistance Profiling.
Kelley, Shana O
2017-04-01
Conventional approaches to bacterial identification and drug susceptibility testing typically rely on culture-based approaches that take 2 to 7 days to return results. The long turnaround times contribute to the spread of infectious disease, negative patient outcomes, and the misuse of antibiotics that can contribute to antibiotic resistance. To provide new solutions enabling faster bacterial analysis, a variety of approaches are under development that leverage single-cell analysis, microfluidic concentration and detection strategies, and ultrasensitive readout mechanisms. This review discusses recent advances in this area and the potential of new technologies to enable more effective management of infectious disease.
ERIC Educational Resources Information Center
Lingvay, Mónika; Timofte, Roxana S.; Ciascai, Liliana; Predescu, Constantin
2015-01-01
Development of pupils' deep learning approach is an important goal of education nowadays, considering that a deep learning approach mediates conceptual understanding and transfer. The different performance of Romanian and Hungarian pupils on PISA tests prompted us to commence a study analysing the learning approaches employed by these pupils.…
Improving Cohesion in L2 Writing: A Three-Strand Approach to Building Lexical Cohesion
ERIC Educational Resources Information Center
Johnson, Mark
2017-01-01
This article presents a three-strand approach to help L2 writers in English as a foreign language (EFL) and English as a second language (ESL) instructional contexts achieve greater cohesion in their written work. The approach focuses on (1) the analysis of authentic texts, (2) the development of productive vocabulary, and (3) information…
Integrating host, natural enemy, and other processes in population models of the pine sawfly
A. A. Sharov
1991-01-01
Explanation of population dynamics is one of the main problems in population ecology. There are two main approaches to the explanation: the factor approach and the dynamic approach. According to the first, an explanation is obtained when the effect of various environmental factors on population density is revealed. Such analysis is performed using well developed...
Measuring sustainable development using a multi-criteria model: a case study.
Boggia, Antonio; Cortina, Carla
2010-11-01
This paper shows how Multi-criteria Decision Analysis (MCDA) can help in a complex process such as the assessment of the level of sustainability of a certain area. The paper presents the results of a study in which a model for measuring sustainability was implemented to better aid public policy decisions regarding sustainability. In order to assess sustainability in specific areas, a methodological approach based on multi-criteria analysis has been developed. The aim is to rank areas in order to understand the specific technical and/or financial support that they need to develop sustainable growth. The case study presented is an assessment of the level of sustainability in different areas of an Italian region using the MCDA approach. Our results show that MCDA is a proper approach for sustainability assessment. The results are easy to understand and the evaluation path is clear and transparent. This is what decision makers need to support their decisions. The multi-criteria model for evaluation has been developed respecting sustainable development economic theory, so that the final results have a clear meaning in terms of sustainability. Copyright 2010 Elsevier Ltd. All rights reserved.
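A weighted-sum sketch conveys the basic mechanics of ranking areas under multiple criteria; the criteria, weights, and scores below are invented, and the paper's model is more elaborate than this:

```python
# Minimal weighted-sum sketch of ranking areas by sustainability criteria
# (illustrative only; the paper's multi-criteria model is more elaborate).
import numpy as np

areas = ["Area 1", "Area 2", "Area 3"]
# rows = areas; columns = criteria (e.g., emissions, land use, employment)
scores = np.array([[0.8, 120.0, 45.0],
                   [0.5,  90.0, 60.0],
                   [0.9, 150.0, 30.0]])
benefit = np.array([False, False, True])   # True if "more is better"
weights = np.array([0.5, 0.3, 0.2])        # stakeholder-supplied weights

# Min-max normalize each criterion, flipping cost criteria so that higher
# always means more sustainable, then aggregate into one score per area.
norm = (scores - scores.min(0)) / (scores.max(0) - scores.min(0))
norm[:, ~benefit] = 1.0 - norm[:, ~benefit]
ranking = norm @ weights

for area, s in sorted(zip(areas, ranking), key=lambda p: -p[1]):
    print(f"{area}: {s:.3f}")
```

The resulting ordering is exactly the kind of transparent, easily interpreted ranking the abstract argues decision makers need when targeting technical or financial support.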
Dorofeev, S B; Babenko, A I
2017-01-01
The article deals with analysis of national and international publications concerning methodological aspects of elaborating a systematic approach to the healthy life-style of the population. This scope of inquiry plays a key role in the development of human capital. The costs related to a healthy life-style are to be considered as personal investment into future income due to physical incrementation of human capital. The definitions of healthy life-style, its categories, and supportive factors are to be considered in the process of developing strategies and programs of healthy life-style. The implementation of particular strategies entails application of comprehensive information and educational programs meant for various categories of population. Therefore, different motivation techniques are to be considered for children, adolescents, the able-bodied population, and the elderly. This approach should result in establishing particular responsibilities for national government, territorial administrations, health care administrations, employers, and the population itself. The necessity of complex legislative measures is emphasized. Recent social hygienic studies have focused mostly on particular aspects of development of the healthy life-style of the population. Hence, there is a demand for long-term exploration of the development of organizational and functional models implementing medical preventive measures on the basis of comprehensive information analysis using statistical, sociological, and professional expertise.
Dennis M. May
1998-01-01
Discusses a regional composite approach to managing timber product output data in a relational database. Describes the development and structure of the regional composite database and demonstrates its use in addressing everyday timber product output information needs.
Knowledge, Learning, Analysis System (KLAS)
USDA-ARS?s Scientific Manuscript database
The goal of KLAS is to develop a new scientific approach that takes advantage of the data deluge, defined here as both legacy data and new data acquired from environmental and biotic sensors, complex simulation models, and improved technologies for probing biophysical samples. This approach can be i...
Quantification of Uncertainty in the Flood Frequency Analysis
NASA Astrophysics Data System (ADS)
Kasiapillai Sudalaimuthu, K.; He, J.; Swami, D.
2017-12-01
Flood frequency analysis (FFA) is usually carried out for the planning and design of water resources and hydraulic structures. Owing to variability in sample representation, selection of distribution, and estimation of distribution parameters, the estimation of flood quantiles has always been uncertain. Hence, suitable approaches must be developed to quantify the uncertainty in the form of a prediction interval as an alternative to the deterministic approach. The framework developed in the present study to include uncertainty in FFA uses a multi-objective optimization approach to construct the prediction interval from an ensemble of flood quantiles. Through this approach, an optimal variability of distribution parameters is identified to carry out FFA. To demonstrate the proposed approach, annual maximum flow data from two gauge stations (Bow River at Calgary and at Banff, Canada) are used. A major focus of the present study was to evaluate the changes in the magnitude of flood quantiles due to the recent extreme flood event that occurred during the year 2013. In addition, the efficacy of the proposed method was further verified using standard bootstrap-based sampling approaches, and it was found that the proposed method is reliable in modeling extreme floods as compared to the bootstrap methods.
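For comparison with the proposed approach, a standard parametric-bootstrap prediction interval for a design flood can be sketched as follows; the Gumbel distribution choice and all numbers are illustrative assumptions, not the study's data:

```python
# Sketch of a parametric-bootstrap prediction interval for a flood quantile
# (the baseline the paper compares against; data below are synthetic).
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
annual_max = stats.gumbel_r.rvs(loc=800.0, scale=250.0, size=60,
                                random_state=rng)

loc, scale = stats.gumbel_r.fit(annual_max)
T = 100                                     # return period, years
q_hat = stats.gumbel_r.ppf(1 - 1 / T, loc=loc, scale=scale)

# Parametric bootstrap: resample from the fitted model, refit, re-estimate.
boot = []
for _ in range(2000):
    resample = stats.gumbel_r.rvs(loc=loc, scale=scale,
                                  size=annual_max.size, random_state=rng)
    bl, bs = stats.gumbel_r.fit(resample)
    boot.append(stats.gumbel_r.ppf(1 - 1 / T, loc=bl, scale=bs))

lo, hi = np.percentile(boot, [5, 95])
print(f"100-yr flood: {q_hat:.0f}  (90% interval: {lo:.0f} to {hi:.0f})")
```

The width of this interval makes the parameter-estimation uncertainty visible, which is precisely what a single deterministic quantile estimate hides.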
Analysis of Tube Hydroforming by means of an Inverse Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Ba Nghiep; Johnson, Kenneth I.; Khaleel, Mohammad A.
2003-05-01
This paper presents a computational tool for the analysis of freely hydroformed tubes by means of an inverse approach. The formulation of the inverse method developed by Guo et al. is adopted and extended to tube hydroforming problems in which the initial geometry is a round tube subjected to hydraulic pressure and axial feed at the tube ends (end-feed). A simple criterion based on a forming limit diagram is used to predict the necking regions in the deformed workpiece. Although the developed computational tool is a stand-alone code, it has been linked to the Marc finite element code for meshing and visualization of results. The application of the inverse approach to tube hydroforming is illustrated through analyses of aluminum alloy AA6061-T4 seamless tubes under free hydroforming conditions. The results obtained are in good agreement with those issued from a direct incremental approach. However, the computational time of the inverse procedure is much less than that of the incremental method.
Phase Space Dissimilarity Measures for Structural Health Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bubacz, Jacob A; Chmielewski, Hana T; Pape, Alexander E
A novel method for structural health monitoring (SHM), known as the Phase Space Dissimilarity Measures (PSDM) approach, is proposed and developed. The patented PSDM approach has already been developed and demonstrated for a variety of equipment and biomedical applications. Here, we investigate SHM of bridges via analysis of time serial accelerometer measurements. This work has four aspects. The first is algorithm scalability, which was found to scale linearly from one processing core to four cores. Second, the same data are analyzed to determine how the use of the PSDM approach affects sensor placement. We found that a relatively low-density placement sufficiently captures the dynamics of the structure. Third, the same data are analyzed by unique combinations of accelerometer axes (vertical, longitudinal, and lateral with respect to the bridge) to determine how the choice of axes affects the analysis. The vertical axis is found to provide satisfactory SHM data. Fourth, statistical methods were investigated to validate the PSDM approach for this application, yielding statistically significant results.
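The PSDM method itself is patented and its details are not given in the abstract; the sketch below only illustrates the general phase-space idea it builds on: delay-embed two accelerometer-like series, bin the reconstructed states, and take an L1 distance between the occupation histograms. The embedding parameters and the signals are assumptions.

```python
import numpy as np

def embed(x, dim=3, tau=5):
    """Time-delay embedding of a 1-D series into dim-dimensional state vectors."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def dissimilarity(x, y, dim=3, tau=5, bins=8):
    """L1 distance between normalized phase-space occupation histograms."""
    lo, hi = min(x.min(), y.min()), max(x.max(), y.max())
    edges = np.linspace(lo, hi, bins + 1)
    hx, _ = np.histogramdd(embed(x, dim, tau), bins=[edges] * dim)
    hy, _ = np.histogramdd(embed(y, dim, tau), bins=[edges] * dim)
    return np.abs(hx / hx.sum() - hy / hy.sum()).sum()

t = np.linspace(0, 100, 5000)
baseline = np.sin(2 * np.pi * 0.5 * t)                   # "healthy" response
shifted = np.sin(2 * np.pi * 0.55 * t) + 0.1 * t / 100   # drifted ("damaged") response
print("dissimilarity:", dissimilarity(baseline, shifted))
```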
Using expert judgments to explore robust alternatives for forest management under climate change.
McDaniels, Timothy; Mills, Tamsin; Gregory, Robin; Ohlson, Dan
2012-12-01
We develop and apply a judgment-based approach to selecting robust alternatives, which are defined here as reasonably likely to achieve objectives, over a range of uncertainties. The intent is to develop an approach that is more practical in terms of data and analysis requirements than current approaches, informed by the literature and experience with probability elicitation and judgmental forecasting. The context involves decisions about managing forest lands that have been severely affected by mountain pine beetles in British Columbia, a pest infestation that is climate-exacerbated. A forest management decision was developed as the basis for the context, objectives, and alternatives for land management actions, to frame and condition the judgments. A wide range of climate forecasts, taken to represent the 10-90% levels on cumulative distributions for future climate, were developed to condition judgments. An elicitation instrument was developed, tested, and revised to serve as the basis for eliciting probabilistic three-point distributions regarding the performance of selected alternatives, over a set of relevant objectives, in the short and long term. The elicitations were conducted in a workshop comprising 14 regional forest management specialists. We employed the concept of stochastic dominance to help identify robust alternatives. We used extensive sensitivity analysis to explore the patterns in the judgments, and also considered the preferred alternatives for each individual expert. The results show that two alternatives that are more flexible than the current policies are judged more likely to perform better than the current alternatives on average in terms of stochastic dominance. The results suggest judgmental approaches to robust decision making deserve greater attention and testing. © 2012 Society for Risk Analysis.
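A hedged sketch of the first-order stochastic dominance screen mentioned above, applied to hypothetical elicited performance scores; the alternatives and the numbers are invented, not the workshop's judgments.

```python
import numpy as np

def empirical_cdf(samples, grid):
    """Empirical CDF of the samples evaluated on a score grid."""
    samples = np.sort(samples)
    return np.searchsorted(samples, grid, side="right") / samples.size

def dominates(a, b, grid):
    """True if alternative a first-order stochastically dominates b:
    the CDF of a lies at or below that of b everywhere, strictly somewhere."""
    ca, cb = empirical_cdf(a, grid), empirical_cdf(b, grid)
    return np.all(ca <= cb) and np.any(ca < cb)

flexible = np.array([55, 70, 85, 90, 75])     # elicited performance scores
status_quo = np.array([40, 60, 65, 70, 55])
grid = np.linspace(0, 100, 101)
print("flexible dominates status quo:", dominates(flexible, status_quo, grid))
```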
Gene network analysis: from heart development to cardiac therapy.
Ferrazzi, Fulvia; Bellazzi, Riccardo; Engel, Felix B
2015-03-01
Networks offer a flexible framework to represent and analyse the complex interactions between components of cellular systems. In particular gene networks inferred from expression data can support the identification of novel hypotheses on regulatory processes. In this review we focus on the use of gene network analysis in the study of heart development. Understanding heart development will promote the elucidation of the aetiology of congenital heart disease and thus possibly improve diagnostics. Moreover, it will help to establish cardiac therapies. For example, understanding cardiac differentiation during development will help to guide stem cell differentiation required for cardiac tissue engineering or to enhance endogenous repair mechanisms. We introduce different methodological frameworks to infer networks from expression data such as Boolean and Bayesian networks. Then we present currently available temporal expression data in heart development and discuss the use of network-based approaches in published studies. Collectively, our literature-based analysis indicates that gene network analysis constitutes a promising opportunity to infer therapy-relevant regulatory processes in heart development. However, the use of network-based approaches has so far been limited by the small amount of samples in available datasets. Thus, we propose to acquire high-resolution temporal expression data to improve the mathematical descriptions of regulatory processes obtained with gene network inference methodologies. Especially probabilistic methods that accommodate the intrinsic variability of biological systems have the potential to contribute to a deeper understanding of heart development.
Developing and Refining the Taiwan Birth Cohort Study (TBCS): Five Years of Experience
ERIC Educational Resources Information Center
Lung, For-Wey; Chiang, Tung-Liang; Lin, Shio-Jean; Shu, Bih-Ching; Lee, Meng-Chih
2011-01-01
The Taiwan Birth Cohort Study (TBCS) is the first nationwide birth cohort database in Asia designed to establish national norms of children's development. Several challenges during database development and data analysis were identified. Challenges include sampling methods, instrument development and statistical approach to missing data. The…
Improving Distributed Diagnosis Through Structural Model Decomposition
NASA Technical Reports Server (NTRS)
Bregon, Anibal; Daigle, Matthew John; Roychoudhury, Indranil; Biswas, Gautam; Koutsoukos, Xenofon; Pulido, Belarmino
2011-01-01
Complex engineering systems require efficient fault diagnosis methodologies, but centralized approaches do not scale well, and this motivates the development of distributed solutions. This work presents an event-based approach for distributed diagnosis of abrupt parametric faults in continuous systems, by using the structural model decomposition capabilities provided by Possible Conflicts. We develop a distributed diagnosis algorithm that uses residuals computed by extending Possible Conflicts to build local event-based diagnosers based on global diagnosability analysis. The proposed approach is applied to a multitank system, and results demonstrate an improvement in the design of local diagnosers. Since local diagnosers use only a subset of the residuals, and use subsystem models to compute residuals (instead of the global system model), the local diagnosers are more efficient than previously developed distributed approaches.
TSHIPS : Transportation shipping harmonization and integration planning system
DOT National Transportation Integrated Search
2001-03-01
This report documents the development of the Transportation Shipping Harmonization and Integration Planning System (TSHIPS). The TSHIPS project was developed to advance the state of the art in transportation systems analysis. Existing approaches and ...
Analysis and Development of Management Information Systems for Private Messes Afloat
1988-03-01
In the development phase, emphasis was placed on a three-step approach, starting with an analysis of the requirements as established by... operating the mess, divided by the number of mess members... Total Mess Bill Due: total of old bills, current bill, mess share owed, and special assessment... Transparency: the system behavior is transparent to the user, meaning that the user can develop a consistent model of the system when working...
Mitchell, Jade; Arnot, Jon A.; Jolliet, Olivier; Georgopoulos, Panos G.; Isukapalli, Sastry; Dasgupta, Surajit; Pandian, Muhilan; Wambaugh, John; Egeghy, Peter; Cohen Hubal, Elaine A.; Vallero, Daniel A.
2014-01-01
While only limited data are available to characterize the potential toxicity of over 8 million commercially available chemical substances, there is even less information available on the exposure and use-scenarios that are required to link potential toxicity to human and ecological health outcomes. Recent improvements and advances such as high throughput data gathering, high performance computational capabilities, and predictive chemical inherency methodology make this an opportune time to develop an exposure-based prioritization approach that can systematically utilize and link the asymmetrical bodies of knowledge for hazard and exposure. In response to the US EPA's need to develop novel approaches and tools for rapidly prioritizing chemicals, a "Challenge" was issued to several exposure model developers to aid the understanding of current systems in a broader sense and to assist the US EPA's effort to develop an approach comparable to other international efforts. A common set of chemicals were prioritized under each current approach. The results are presented herein along with a comparative analysis of the rankings of the chemicals based on metrics of exposure potential or actual exposure estimates. The analysis illustrates the similarities and differences across the domains of information incorporated in each modeling approach. The overall findings indicate a need to reconcile exposures from diffuse, indirect sources (far-field) with exposures from directly applied chemicals in consumer products or resulting from the presence of a chemical in a microenvironment like a home or vehicle. Additionally, the exposure scenario, including the mode of entry into the environment (i.e., through air, water, or sediment), appears to be an important determinant of the level of agreement between modeling approaches. PMID:23707726
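One simple way to run the comparative ranking analysis described above is a rank-agreement statistic such as Kendall's tau between two models' priority orderings; the chemical names and ranks below are invented for illustration.

```python
from scipy import stats

chemicals = ["chem_A", "chem_B", "chem_C", "chem_D", "chem_E", "chem_F"]
ranks_model_1 = [1, 2, 3, 4, 5, 6]   # e.g., a far-field fate model's priorities
ranks_model_2 = [2, 1, 3, 6, 4, 5]   # e.g., a consumer-product model's priorities

# tau near 1 means the models largely agree on chemical priority order.
tau, p_value = stats.kendalltau(ranks_model_1, ranks_model_2)
print(f"Kendall tau = {tau:.2f} (p = {p_value:.3f})")
```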
Efficiency Analysis of Public Universities in Thailand
ERIC Educational Resources Information Center
Kantabutra, Saranya; Tang, John C. S.
2010-01-01
This paper examines the performance of Thai public universities in terms of efficiency, using a non-parametric approach called data envelopment analysis. Two efficiency models, the teaching efficiency model and the research efficiency model, are developed and the analysis is conducted at the faculty level. Further statistical analyses are also…
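For readers unfamiliar with data envelopment analysis, the sketch below solves an input-oriented CCR efficiency score with a linear program; the faculty input/output data are made up, and this is a generic DEA formulation rather than the paper's exact teaching and research models.

```python
import numpy as np
from scipy.optimize import linprog

X = np.array([[20.0, 30.0, 40.0],
              [15.0, 25.0, 35.0]])     # inputs: rows = input types, cols = faculties
Y = np.array([[100.0, 120.0, 150.0],
              [30.0, 45.0, 40.0]])     # outputs: e.g., graduates, publications

def ccr_efficiency(o):
    """Input-oriented CCR score for unit o: minimize theta subject to
    X @ lam <= theta * x_o, Y @ lam >= y_o, lam >= 0."""
    m, n = X.shape
    c = np.r_[1.0, np.zeros(n)]                          # variables: [theta, lam_1..lam_n]
    A_ub = np.vstack([np.c_[-X[:, o], X],                # X @ lam - theta * x_o <= 0
                      np.c_[np.zeros(Y.shape[0]), -Y]])  # -Y @ lam <= -y_o
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.fun

for o in range(X.shape[1]):
    print(f"faculty {o}: efficiency = {ccr_efficiency(o):.3f}")
```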
ERIC Educational Resources Information Center
Studdert, Thomas Patrick
2013-01-01
Using an innovative adaptation of the gap analysis approach of Richard Clark and Fred Estes, the collegiate First-Year Experience (FYE) consisting of comprehensive and intentional curricular and co-curricular initiatives was examined. Conceptualization and operationalization of the goal for a FYE program was based on 3 student development theories…
ERIC Educational Resources Information Center
Gao, Ying; Du, Wanyi
2013-01-01
This paper traces 9 non-English major EFL students and collects their oral productions in 4 successive oral exams in 2 years. The canonical correlation analysis approach of SPSS is adopted to study the disfluencies developmental traits under the influence of language acquisition development. We find that as language acquisition develops, the total…
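A brief sketch of canonical correlation analysis (run in SPSS in the study) using scikit-learn; the acquisition and disfluency variables here are simulated stand-ins for the paper's measures.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
n = 36  # e.g., 9 students x 4 oral exams
acquisition = rng.normal(size=(n, 3))   # hypothetical vocabulary/syntax/fluency scores
disfluency = 0.6 * acquisition[:, :2] + rng.normal(scale=0.5, size=(n, 2))  # pauses, repeats

cca = CCA(n_components=2)
U, V = cca.fit_transform(acquisition, disfluency)  # canonical variate scores
for k in range(2):
    r = np.corrcoef(U[:, k], V[:, k])[0, 1]
    print(f"canonical correlation {k + 1}: {r:.2f}")
```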
ERIC Educational Resources Information Center
De Rosa, Marcello; Bartoli, Luca
2017-01-01
Purpose: The aim of the paper is to evaluate how advisory services stimulate the adoption of rural development policies (RDP) aiming at value creation. Design/methodology/approach: By linking the use of agricultural extension services (AES) to policies for value creation, we will put forward an empirical analysis in Italy, with the aim of…
Analysis Matrix of Resilience in the Face of Disability, Old Age and Poverty
ERIC Educational Resources Information Center
Cardenas, Andrea; Lopez, Lucero
2010-01-01
The purpose of this article is to describe the process of the development of the "Resilience Theoretical Analysis Matrix" (RTAM) (or in its Spanish translation: MATR), a tool designed to facilitate a coherent and organised approach to the assessment of a wide spectrum of factors influencing the development of resilience in the face of disability,…
Potential applications of computational fluid dynamics to biofluid analysis
NASA Technical Reports Server (NTRS)
Kwak, D.; Chang, J. L. C.; Rogers, S. E.; Rosenfeld, M.
1988-01-01
Computational fluid dynamics has been developed to the stage where it has become an indispensable part of aerospace research and design. In view of the advances made in aerospace applications, the computational approach can be used for biofluid mechanics research. Several flow simulation methods developed for aerospace problems are briefly discussed for potential applications to biofluids, especially to blood flow analysis.
Effects-based strategy development through center of gravity and target system analysis
NASA Astrophysics Data System (ADS)
White, Christopher M.; Prendergast, Michael; Pioch, Nicholas; Jones, Eric K.; Graham, Stephen
2003-09-01
This paper describes an approach to effects-based planning in which a strategic-theater-level mission is refined into operational-level and ultimately tactical-level tasks and desired effects, informed by models of the expected enemy response at each level of abstraction. We describe a strategy development system that implements this approach and supports human-in-the-loop development of an effects-based plan. This system consists of plan authoring tools tightly integrated with a suite of center of gravity (COG) and target system analysis tools. A human planner employs the plan authoring tools to develop a hierarchy of tasks and desired effects. Upon invocation, the target system analysis tools use reduced-order models of enemy centers of gravity to select appropriate target set options for the achievement of desired effects, together with associated indicators for each option. The COG analysis tools also provide explicit models of the causal mechanisms linking tasks and desired effects to one another, and suggest appropriate observable indicators to guide ISR planning, execution monitoring, and campaign assessment. We are currently implementing the system described here as part of the AFRL-sponsored Effects Based Operations program.
Impact of Instructional Approaches to Teaching Elementary Science on Student Achievement
NASA Astrophysics Data System (ADS)
Kensinger, Seth H.
Strengthening our science education in the United States is essential to the future success of our country in the global marketplace. Immersing our elementary students in research-based, quality science instruction is a critical component to build a strong foundation and motivate our students to become interested in science. The research for this study examined the type of elementary science instruction in relation to academic achievement and gender. Through this study, the researcher answered the following questions: 1. What is the difference in achievement for elementary students who have been taught using one of the three science instructional approaches analyzed in this study: traditional science instruction, inquiry-based science instruction with little or no professional development, and inquiry-based science instruction with high-quality professional development? 2. What is the difference in student achievement between inquiry-based instruction and non-inquiry-based (traditional) instruction? 3. What is the difference in student achievement between inquiry with high-quality professional development and inquiry with little or no professional development? 4. Do the three instructional approaches have differentiated effects across gender? Student achievement was measured using the 2010 fourth grade Pennsylvania System of School Assessment (PSSA) in Science. Data were collected from 15 elementary schools forming three main groupings of similar schools based on the results from the 2009 third grade PSSA in Mathematics and student and community demographics. In addition, five sub-group triads were formed to further analyze the data; each sub-group was composed of schools with matching demographic data. Each triad contained a school using a traditional approach to teaching science, a school utilizing an inquiry science approach with little or no professional development, and a school incorporating inquiry science instruction with high-quality professional development. The five schools which provided their students with inquiry science and high-quality professional development were Science: It's Elementary (SIE) schools, as provided through a grant from the Pennsylvania Department of Education (PDE). The findings of the study indicated that there is evidence to suggest that elementary science achievement improves significantly when teachers have utilized inquiry instruction after receiving high-quality professional development. Specifically, the analysis of the whole group and the majority of the triad sub-groupings revealed a consistent trend supporting science instruction utilizing inquiry with high-quality professional development compared to a traditional approach and an inquiry-based approach with little or no professional development. The gender analysis of this study focused on whether or not girls at the elementary school level would perform better than boys depending upon the method of science instruction. The study revealed no relationship between the approach to teaching science and achievement level based on gender. The whole-group results and sub-group triads produced no significant findings for this part of the data analysis.
NASA Technical Reports Server (NTRS)
Leonard, J. I.; White, R. J.; Rummel, J. A.
1980-01-01
An approach was developed to aid in the integration of many of the biomedical findings of space flight, using systems analysis. The mathematical tools used in accomplishing this task include an automated data base, a biostatistical and data analysis system, and a wide variety of mathematical simulation models of physiological systems. A keystone of this effort was the evaluation of physiological hypotheses using the simulation models and the prediction of the consequences of these hypotheses on many physiological quantities, some of which were not amenable to direct measurement. This approach led to improvements in the model, refinements of the hypotheses, a tentative integrated hypothesis for adaptation to weightlessness, and specific recommendations for new flight experiments.
Interim analysis: A rational approach of decision making in clinical trial.
Kumar, Amal; Chakraborty, Bhaswat S
2016-01-01
Interim analysis, especially of sizeable trials, keeps the decision process free of conflict of interest while considering the cost, resources, and meaningfulness of the project. Whenever necessary, such interim analysis can also call for potential termination or appropriate modification of sample size, study design, and even an early declaration of success. Given the extraordinary size and complexity of trials today, this rational approach helps to analyze and predict the outcomes of a clinical trial by incorporating what is learned during the course of a study or a clinical development program. Such an approach can also fill the gap between unmet medical needs and the interventions currently being tested by directing resources toward relevant and optimized clinical trials rather than fulfilling only business and profit goals.
Development of Multiobjective Optimization Techniques for Sonic Boom Minimization
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Rajadas, John Narayan; Pagaldipti, Naryanan S.
1996-01-01
A discrete, semi-analytical sensitivity analysis procedure has been developed for calculating aerodynamic design sensitivities. The sensitivities of the flow variables and the grid coordinates are numerically calculated using direct differentiation of the respective discretized governing equations. The sensitivity analysis techniques are adapted within a parabolized Navier-Stokes equations solver. Aerodynamic design sensitivities for high-speed wing-body configurations are calculated using the semi-analytical sensitivity analysis procedures. Representative results obtained compare well with those obtained using the finite difference approach and establish the computational efficiency and accuracy of the semi-analytical procedures. Multidisciplinary design optimization procedures have been developed for aerospace applications, namely gas turbine blades and high-speed wing-body configurations. In complex applications, the coupled optimization problems are decomposed into sublevels using multilevel decomposition techniques. In cases with multiple objective functions, formal multiobjective formulations such as the Kreisselmeier-Steinhauser function approach and the modified global criteria approach have been used. Nonlinear programming techniques for continuous design variables and a hybrid optimization technique, based on a simulated annealing algorithm, for discrete design variables have been used for solving the optimization problems. The optimization procedure for gas turbine blades improves the aerodynamic and heat transfer characteristics of the blades. The two-dimensional, blade-to-blade aerodynamic analysis is performed using a panel code. The blade heat transfer analysis is performed using an in-house developed finite element procedure. The optimization procedure yields blade shapes with significantly improved velocity and temperature distributions. The multidisciplinary design optimization procedures for high-speed wing-body configurations simultaneously improve the aerodynamic, sonic boom, and structural characteristics of the aircraft. The flow solution is obtained using a comprehensive parabolized Navier-Stokes solver. Sonic boom analysis is performed using an extrapolation procedure. The aircraft wing load-carrying member is modeled as either an isotropic or a composite box beam. The isotropic box beam is analyzed using thin-wall theory. The composite box beam is analyzed using a finite element procedure. The developed optimization procedures yield significant improvements in all the performance criteria and provide interesting design trade-offs. The semi-analytical sensitivity analysis techniques offer significant computational savings and allow the use of comprehensive analysis procedures within design optimization studies.
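The Kreisselmeier-Steinhauser (KS) function named above folds multiple objectives g_k into one smooth, differentiable envelope, KS = g_max + (1/rho) ln(sum_k exp(rho (g_k - g_max))), which approaches max_k g_k as rho grows. A small worked example follows; the objective values are toy numbers, not the sonic-boom model.

```python
import numpy as np

def ks(values, rho=50.0):
    """Kreisselmeier-Steinhauser smooth max; approaches max(g) as rho grows."""
    g = np.asarray(values, dtype=float)
    gmax = g.max()                          # shift for numerical stability
    return gmax + np.log(np.exp(rho * (g - gmax)).sum()) / rho

g = [0.8, 1.0, 0.95]   # normalized objectives at a candidate design point
for rho in (5.0, 50.0, 500.0):
    print(f"rho={rho:>5}: KS = {ks(g, rho):.4f}  (true max = {max(g)})")
```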
The Practicality of Statistical Physics Handout Based on KKNI and the Constructivist Approach
NASA Astrophysics Data System (ADS)
Sari, S. Y.; Afrizon, R.
2018-04-01
Observation of the statistical physics lecture shows that: 1) the performance of lecturers, the social climate, students' competence, and the soft skills needed at work are only in the adequate category; 2) students find the statistical physics lectures difficult to follow because the material is abstract; 3) 40.72% of students need additional reinforcement in the form of repetition, practice questions, and structured tasks; and 4) the depth of the statistical physics material needs to be improved gradually and in a structured way. This indicates that learning materials aligned with the Indonesian National Qualification Framework, or Kerangka Kualifikasi Nasional Indonesia (KKNI), together with an appropriate learning approach, are needed to help lecturers and students. The author has designed statistical physics handouts that meet the very valid criterion (90.89%) according to expert judgment. In addition, the practicality of the handouts must also be considered so that they are easy to use, interesting, and efficient in lectures. The purpose of this research is to determine the practicality of the statistical physics handout based on KKNI and a constructivist approach. This research is part of a research and development effort using the 4-D model developed by Thiagarajan and has reached the development-testing portion of the Development stage. Data were collected using a questionnaire distributed to lecturers and students and analyzed with descriptive techniques in the form of percentages. The questionnaire analysis shows that the statistical physics handout meets very practical criteria. The conclusion of this study is that statistical physics handouts based on KKNI and a constructivist approach are practical for use in lectures.
The Verification-based Analysis of Reliable Multicast Protocol
NASA Technical Reports Server (NTRS)
Wu, Yunqing
1996-01-01
Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems and perform verification-based analysis on the formal RMP specifications. We also use the formal models of the RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Qibin; Ames, Jennifer M.; Smith, Richard D.
2008-12-18
The Maillard reaction, starting from the glycation of protein and progressing to the formation of advanced glycation end-products (AGEs), is implicated in the development of complications of diabetes mellitus, as well as in the pathogenesis of cardiovascular, renal, and neurodegenerative diseases. In this perspective review, we provide an overview of the relevance of the Maillard reaction in the pathogenesis of chronic disease and discuss traditional approaches and recent developments in the analysis of glycated proteins by mass spectrometry. We propose that proteomics approaches, particularly bottom-up proteomics, will play a significant role in analyses of clinical samples, leading to the identification of new markers of disease development and progression.
NASA Technical Reports Server (NTRS)
Tamma, Kumar K.; Railkar, Sudhir B.
1988-01-01
This paper describes new and recent advances in the development of a hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer problems. The transfinite element methodology, while retaining the modeling versatility of contemporary finite element formulations, is based on application of transform techniques in conjunction with classical Galerkin schemes and is a hybrid approach. The purpose of this paper is to provide a viable hybrid computational methodology for applicability to general transient thermal analysis. Highlights and features of the methodology are described and developed via generalized formulations and applications to several test problems. The proposed transfinite element methodology successfully provides a viable computational approach and numerical test problems validate the proposed developments for conduction/convection/radiation thermal analysis.
NASA Astrophysics Data System (ADS)
Azougagh, Yassine; Benhida, Khalid; Elfezazi, Said
2016-02-01
In this paper, the focus is on studying the performance of complex systems in a supply chain context by developing a structured modelling approach based on the ASDI methodology (Analysis, Specification, Design and Implementation), combining Petri net modelling with simulation using ARENA. The linear approach typically followed for this kind of problem faces modelling difficulties due to the complexity and the number of parameters of concern. The approach used in this work therefore structures the modelling so as to cover all aspects of the performance study. The structured modelling approach is first introduced and then applied to an industrial system in the phosphate field. Results of the performance indicators obtained from the developed models made it possible to test the behaviour and fluctuations of this system and to develop improved models of the current situation. In addition, this paper shows how the Arena software can be adopted to simulate complex systems effectively. The method in this research can be applied to investigate various improvement scenarios and their consequences before implementing them in reality.
Future Issues and Approaches to Health Monitoring and Failure Prevention for Oil-Free Gas Turbines
NASA Technical Reports Server (NTRS)
DellaCorte, Christopher
2004-01-01
Recent technology advances in foil air bearings, high-temperature solid lubricants, and computer-based modeling have enabled the development of small Oil-Free gas turbines. These turbomachines are currently commercialized as small (<100 kW) microturbine generators, and larger machines are being developed. Based upon these successes and the high potential payoffs offered by Oil-Free systems, NASA, industry, and other government entities anticipate that Oil-Free gas turbine propulsion systems will proliferate in future markets. Since an Oil-Free engine has no oil system, traditional approaches to health monitoring and diagnostics, such as chip detection, oil analysis, and possibly vibration signature analyses (e.g., ball pass frequency), will be unavailable. As such, new approaches will need to be considered. These could include shaft orbit analyses, foil bearing temperature measurements, embedded wear sensors, and start-up/coast-down speed analysis. In addition, novel, as yet undeveloped techniques may emerge based upon concurrent developments in MEMS technology. This paper introduces Oil-Free technology, reviews the current state of the art and the potential for future turbomachinery applications, and discusses possible approaches to health monitoring, diagnostics, and failure prevention.
Electrochemical biosensing strategies for DNA methylation analysis.
Hossain, Tanvir; Mahmudunnabi, Golam; Masud, Mostafa Kamal; Islam, Md Nazmul; Ooi, Lezanne; Konstantinov, Konstantin; Hossain, Md Shahriar Al; Martinac, Boris; Alici, Gursel; Nguyen, Nam-Trung; Shiddiky, Muhammad J A
2017-08-15
DNA methylation is one of the key epigenetic modifications of DNA that results from the enzymatic addition of a methyl group at the fifth carbon of the cytosine base. It plays a crucial role in cellular development, genomic stability and gene expression. Aberrant DNA methylation is responsible for the pathogenesis of many diseases including cancers. Over the past several decades, many methodologies have been developed to detect DNA methylation. These methodologies range from classical molecular biology and optical approaches, such as bisulfite sequencing, microarrays, quantitative real-time PCR, colorimetry, Raman spectroscopy to the more recent electrochemical approaches. Among these, electrochemical approaches offer sensitive, simple, specific, rapid, and cost-effective analysis of DNA methylation. Additionally, electrochemical methods are highly amenable to miniaturization and possess the potential to be multiplexed. In recent years, several reviews have provided information on the detection strategies of DNA methylation. However, to date, there is no comprehensive evaluation of electrochemical DNA methylation detection strategies. Herein, we address the recent developments of electrochemical DNA methylation detection approaches. Furthermore, we highlight the major technical and biological challenges involved in these strategies and provide suggestions for the future direction of this important field. Copyright © 2017 Elsevier B.V. All rights reserved.
Beregovykh, V V; Spitskiy, O R
2014-01-01
A risk-based approach is used to examine the impact of different factors on the quality of medicinal products in technology transfer. A general diagram is offered for executing risk analysis in technology transfer from pharmaceutical development to production. When transferring technology to full-scale commercial production, it is necessary to investigate and simulate the production process beforehand under the new, real conditions. The manufacturing process is the core factor for risk analysis, having the greatest impact on the quality attributes of a medicinal product. Further important factors are linked to the materials and products to be handled and to manufacturing environmental conditions such as premises, equipment, and personnel. The use of the risk-based approach in designing a multipurpose production facility for medicinal products is shown, where the quantitative risk analysis tool RAMM (Risk Analysis and Mitigation Matrix) was applied.
Corriveau, H; Arsenault, A B; Dutil, E; Lepage, Y
1992-01-01
An evaluation based on the Bobath approach to treatment has previously been developed and partially validated. The purpose of the present study was to verify the content validity of this evaluation using a statistical approach known as principal components analysis. Thirty-eight hemiplegic subjects participated in the study. Scores on each of six parameters (sensorium, active movements, muscle tone, reflex activity, postural reactions, and pain) were analyzed on three occasions across a 2-month period. Each analysis produced three factors that contained 70% of the variation in the data set. The first component mainly reflected variations in mobility, the second mainly variations in muscle tone, and the third mainly variations in sensorium and pain. The results of such exploratory analysis highlight the fact that some of the parameters are not only important but also interrelated. These results seem to partially support the conceptual framework substantiating the Bobath approach to treatment.
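A compact sketch of the principal components analysis step: standardize the parameter scores and inspect how much variation the leading components retain. The 38-subject data matrix is simulated, not the study's clinical data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
mobility = rng.normal(size=38)                   # shared latent factor
scores = np.column_stack([
    mobility + rng.normal(scale=0.4, size=38),   # active movements
    mobility + rng.normal(scale=0.4, size=38),   # postural reactions
    rng.normal(size=38),                         # muscle tone
    rng.normal(size=38),                         # reflex activity
    rng.normal(size=38),                         # sensorium
    rng.normal(size=38),                         # pain
])

pca = PCA(n_components=3).fit(StandardScaler().fit_transform(scores))
print("variance explained by 3 components:", pca.explained_variance_ratio_.sum().round(2))
```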
Content Analysis for Proactive Protective Intelligence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanfilippo, Antonio P.
The aim of this paper is to outline a plan for developing and validating a Proactive Protective Intelligence approach that prevents targeted violence through the analysis and assessment of threats overtly or covertly expressed in abnormal communications to USSS protectees.
Profiling a Periodicals Collection
ERIC Educational Resources Information Center
Bolgiano, Christina E.; King, Mary Kathryn
1978-01-01
Libraries need solid information upon which to base collection development decisions. Specific evaluative methods for determining scope, access, and usefulness are described. Approaches used for data collection include analysis of interlibrary loan requests, comparison with major bibliographies, and analysis of accessibility through available…
NASA Astrophysics Data System (ADS)
Crowell, Andrew Rippetoe
This dissertation describes model reduction techniques for the computation of aerodynamic heat flux and pressure loads for multi-disciplinary analysis of hypersonic vehicles. NASA and the Department of Defense have expressed renewed interest in the development of responsive, reusable hypersonic cruise vehicles capable of sustained high-speed flight and access to space. However, an extensive set of technical challenges has obstructed the development of such vehicles. These technical challenges are partially due both to the inability to accurately test scaled vehicles in wind tunnels and to the time-intensive nature of high-fidelity computational modeling, particularly for the fluid using Computational Fluid Dynamics (CFD). The aim of this dissertation is to develop efficient and accurate models for the aerodynamic heat flux and pressure loads to replace the need for computationally expensive, high-fidelity CFD during coupled analysis. Furthermore, aerodynamic heating and pressure loads are systematically evaluated for a number of different operating conditions, including simple two-dimensional flow over flat surfaces up to three-dimensional flows over deformed surfaces with shock-shock interaction and shock-boundary layer interaction. An additional focus of this dissertation is the implementation and computation of results using the developed aerodynamic heating and pressure models in complex fluid-thermal-structural simulations. Model reduction is achieved using a two-pronged approach. One prong focuses on developing analytical corrections to isothermal, steady-state CFD flow solutions in order to capture flow effects associated with transient spatially varying surface temperatures and surface pressures (e.g., surface deformation, surface vibration, shock impingements, etc.). The second prong is focused on minimizing the computational expense of computing the steady-state CFD solutions by developing an efficient surrogate CFD model. The developed two-pronged approach is found to exhibit balanced performance in terms of accuracy and computational expense, relative to several existing approaches. This approach enables CFD-based loads to be implemented into long-duration fluid-thermal-structural simulations.
Robust Methods for Moderation Analysis with a Two-Level Regression Model.
Yang, Miao; Yuan, Ke-Hai
2016-01-01
Moderation analysis has many applications in social sciences. Most widely used estimation methods for moderation analysis assume that errors are normally distributed and homoscedastic. When these assumptions are not met, the results from a classical moderation analysis can be misleading. For more reliable moderation analysis, this article proposes two robust methods with a two-level regression model when the predictors do not contain measurement error. One method is based on maximum likelihood with Student's t distribution and the other is based on M-estimators with Huber-type weights. An algorithm for obtaining the robust estimators is developed. Consistent estimates of standard errors of the robust estimators are provided. The robust approaches are compared against normal-distribution-based maximum likelihood (NML) with respect to power and accuracy of parameter estimates through a simulation study. Results show that the robust approaches outperform NML under various distributional conditions. Application of the robust methods is illustrated through a real data example. An R program is developed and documented to facilitate the application of the robust methods.
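A hedged sketch of robust moderation analysis in the spirit of the M-estimation method above: the moderation effect is the x*z interaction coefficient, here estimated with Huber-type weights via statsmodels as a single-level stand-in for the authors' two-level model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 200
x = rng.normal(size=n)                  # predictor
z = rng.normal(size=n)                  # moderator
e = rng.standard_t(df=3, size=n)        # heavy-tailed errors violate normality
y = 0.5 * x + 0.3 * z + 0.4 * x * z + e

# Huber-type M-estimation downweights outlying residuals.
X = sm.add_constant(np.column_stack([x, z, x * z]))
robust = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit()
print("robust interaction estimate:", robust.params[3].round(3))
```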
Utilization of Elementary Siphons of Petri Nets to Solve Deadlocks in Flexible Manufacturing Systems
NASA Astrophysics Data System (ADS)
Abdul-Hussin, Mowafak Hassan
2015-07-01
This article presents an approach to constructing a class of structural analyses of Petri nets in which elementary siphons are used to develop a deadlock control policy for flexible manufacturing systems (FMSs); this approach has been exploited successfully in the design of supervisors for some supervisory control problems. Deadlock-free operation of FMSs is a significant objective of siphon-based analysis in Petri nets. Structural analysis of Petri net models is efficient for the control of FMSs, and different policies can be implemented for deadlock prevention. Petri-net-based deadlock prevention for FMSs has gained considerable interest in the development of control theory and methods for the design, control, operation, and performance evaluation of a special class of Petri nets called S3PR. Both structural analysis and reachability tree analysis are used for the analysis, simulation, and control of Petri nets. Our experimental siphon-based approach is able to resolve deadlocks occurring in Petri nets, as illustrated with an FMS.
Optical and system engineering in the development of a high-quality student telescope kit
NASA Astrophysics Data System (ADS)
Pompea, Stephen M.; Pfisterer, Richard N.; Ellis, Scott; Arion, Douglas N.; Fienberg, Richard Tresch; Smith, Thomas C.
2010-07-01
The Galileoscope student telescope kit was developed by a volunteer team of astronomers, science education experts, and optical engineers in conjunction with the International Year of Astronomy 2009. This refracting telescope is in production, with over 180,000 units produced and distributed and 25,000 units in production. The telescope was designed to be able to resolve the rings of Saturn and to be used in urban areas. The telescope system requirements, performance metrics, and architecture were established after an analysis of current inexpensive telescopes and student telescope kits. The optical design approaches used in the various prototypes and the optical system engineering tradeoffs will be described. Risk analysis, risk management, and change management were critical, as was cost management, since the final product was to cost around $15 (but had to perform as well as $100 telescopes). In the system engineering of the Galileoscope a variety of analysis and testing approaches were used, including stray light design and analysis using the powerful optical analysis program FRED.
STANDARD OPERATING PROCEDURE FOR QUALITY ASSURANCE IN ANALYTICAL CHEMISTRY METHODS DEVELOPMENT
The Environmental Protection Agency's (EPA) Office of Research and Development (ORD) is engaged in the development, demonstration, and validation of new or newly adapted methods of analysis for environmentally related samples. Recognizing that a "one size fits all" approach to qu...
Between practice and theory: Melanie Klein, Anna Freud and the development of child analysis.
Donaldson, G
1996-04-01
An examination of the early history of child analysis in the writings of Melanie Klein and Anna Freud reveals how two different and opposing approaches to child analysis arose at the same time. The two methods of child analysis are rooted in a differential emphasis on psychoanalytic theory and practice. The Kleinian method derives from the application of technique while the Anna Freudian method is driven by theory. Furthermore, by holding to the Freudian theory of child development Anna Freud was forced to limit the scope of child analysis, while Klein's application of Freudian practice has led to new discoveries about the development of the infant psyche.
Thermal performance modeling of NASA's scientific balloons
NASA Astrophysics Data System (ADS)
Franco, H.; Cathey, H.
The flight performance of a scientific balloon is highly dependent on the interaction between the balloon and its environment. The balloon is a thermal vehicle. Modeling a scientific balloon's thermal performance has proven to be a difficult analytical task. Most previous thermal models have attempted these analyses by using either a bulk thermal model approach or simplified representations of the balloon. These approaches to date have provided reasonable, but not very accurate, results. Improvements have been made in recent years using thermal analysis tools developed for the thermal modeling of spacecraft and other sophisticated heat transfer problems. These tools, which now allow for accurate modeling of highly transmissive materials, have been applied to the thermal analysis of NASA's scientific balloons. A research effort has been started that utilizes the "Thermal Desktop" addition to AutoCAD. This paper will discuss the development of thermal models for both conventional and Ultra Long Duration super-pressure balloons. This research effort has focused on incremental stages of analysis development to assess the accuracy of the tool and the model resolution required to produce usable data. The first-stage balloon thermal analyses started with simple spherical balloon models with a limited number of nodes and expanded the number of nodes to determine the required model resolution. These models were then modified to include additional details such as load tapes. The second-stage analyses looked at natural-shaped Zero Pressure balloons. Load tapes were then added to these shapes, again with the goal of determining the required modeling accuracy by varying the number of gores. The third stage, following the same steps as the Zero Pressure balloon efforts, was directed at modeling super-pressure pumpkin-shaped balloons. The results were then used to develop analysis guidelines and an approach for modeling balloons for both simple first-order estimates and detailed full models. The development of the radiative environment and program input files, the development of the modeling techniques for balloons, and the development of appropriate data output handling techniques for both the raw data and data plots will be discussed. A general guideline to match predicted balloon performance with known flight data will also be presented. One long-term goal of this effort is to develop simplified approaches and techniques to include results in performance codes being developed.
DOT National Transportation Integrated Search
2016-02-01
In this study, a computational approach for conducting durability analysis of bridges using detailed finite element models is developed. The underlying approach adopted is based on the hypothesis that the two main factors affecting the life of a brid...
ERIC Educational Resources Information Center
Panza, Carol M.
2001-01-01
Suggests that human performance technologists need to have an analysis approach to support the development of an appropriate set of improvement recommendations for clients and then move to an action plan to help them see results. Presents a performance improvement model and a systematic approach that considers organizational context, ownership,…
Biological Nature of Knowledge in the Learning Organisation
ERIC Educational Resources Information Center
Hall, William P.
2005-01-01
Purpose: To develop a biological approach to the analysis of learning organisations based on complexity theory, autopoiesis, and evolutionary epistemology. Design/methodology/approach: This paper synthesises ideas from disciplines ranging from physics, epistemology and philosophy of science to military affairs, to sketch a scientific framework in…
AN AGGREGATION AND EPISODE SELECTION SCHEME FOR EPA'S MODELS-3 CMAQ
The development of an episode selection and aggregation approach, designed to support distributional estimation for use with the Models-3 Community Multiscale Air Quality (CMAQ) model, is described. The approach utilized cluster analysis of the 700 hPa u and v wind field compo...
An Instructional Design Framework for Fostering Student Engagement in Online Learning Environments
ERIC Educational Resources Information Center
Czerkawski, Betul C.; Lyman, Eugene W.
2016-01-01
Many approaches, models and frameworks exist when designing quality online learning environments. These approaches assist and guide instructional designers through the process of analysis, design, development, implementation and evaluation of instructional processes. Some of these frameworks are concerned with student participation, some with…
Adrenocortical carcinoma: the dawn of a new era of genomic and molecular biology analysis.
Armignacco, R; Cantini, G; Canu, L; Poli, G; Ercolino, T; Mannelli, M; Luconi, M
2018-05-01
Over the last decade, the development of novel and high-penetrance genomic approaches to analyze biological samples has provided new insights into the molecular biology and genetics of tumors. The use of these techniques, consisting of exome sequencing, transcriptome, miRNome, chromosome alteration, genome, and epigenome analysis, has also been successfully applied to adrenocortical carcinoma (ACC). In fact, the analysis of large cohorts of patients has allowed the stratification of ACCs with different patterns of molecular alterations, associated with different outcomes, thus providing a novel molecular classification of the malignancy to be used alongside classical pathological analysis. Improving our knowledge of the molecular features of ACC will result not only in better diagnostic and prognostic accuracy, but also in the identification of more specific therapeutic targets for the development of more effective pharmacological anti-cancer approaches. In particular, the specific molecular alteration profiles identified in ACC may represent targetable events for already developed or newly designed drugs, enabling better and more efficacious management of the ACC patient in the context of the new frontiers of personalized precision medicine.
Engineering design, stress and thermal analysis, and documentation for SATS program
NASA Technical Reports Server (NTRS)
1973-01-01
An in-depth analysis and mechanical design of the solar array stowage and deployment arrangements for use in Small Applications Technology Satellite spacecraft is presented. Alternate approaches for the major elements of work are developed and evaluated. Elements include array stowage and deployment arrangements, the spacecraft and array behavior in the spacecraft despin mode, and the design of the main hinge and segment hinge assemblies. Feasibility calculations are performed and the preferred approach is identified.
A Nonparametric Statistical Approach to the Validation of Computer Simulation Models
1985-11-01
Ballistic Research Laboratory, the Experimental Design and Analysis Branch of the Systems Engineering and Concepts Analysis Division was funded to... 2. Winter, E. M., Wisemiler, D. P., and Ujiharm, J. K., "Verification and Validation of Engineering Simulations with Minimal Data," Proceedings of the 1976 Summer... used by numerous authors. Law has augmented their approach with specific suggestions for each of the three stages: 1. develop high face-validity
NASA Astrophysics Data System (ADS)
Markantonis, Vasileios; Farinosi, Fabio; Dondeynaz, Celine; Ameztoy, Iban; Pastori, Marco; Marletta, Luca; Ali, Abdou; Carmona Moreno, Cesar
2018-05-01
The assessment of natural hazards such as floods and droughts is a complex issue that demands integrated approaches and high-quality data. Especially in African developing countries, where information is limited, the assessment of floods and droughts, though an overarching issue that influences economic and social development, is even more challenging. This paper presents an integrated approach to assessing crucial aspects of floods and droughts in the transboundary Mékrou River basin (a portion of the Niger River basin in West Africa), combining climatic trends analysis and the findings of a household survey. The multivariable trend analysis estimates, at the biophysical level, the climate variability and the occurrence of floods and droughts. These results are coupled with an analysis of household survey data that reveals the behaviour and opinions of local residents regarding the observed climate variability and occurrence of flood and drought events, household mitigation measures, and the impacts of floods and droughts. Based on survey data analysis, the paper provides a per-household cost estimation of floods and droughts that occurred over a 2-year period (2014-2015). Furthermore, two econometric models are set up to identify the factors that influence the costs of floods and droughts to impacted households.
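The abstract's trend analysis could take many forms; below is a minimal sketch of one common choice, a Mann-Kendall test for monotonic trend in an annual series. The rainfall record is synthetic, and the no-ties variance formula is an assumed simplification.

```python
import numpy as np
from scipy import stats

def mann_kendall(x):
    """Return the Mann-Kendall S statistic and a normal-approximation p-value."""
    x = np.asarray(x, dtype=float)
    n = x.size
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0          # variance assuming no ties
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0  # continuity correction
    return s, 2 * stats.norm.sf(abs(z))

rng = np.random.default_rng(3)
rainfall = 800 - 2.5 * np.arange(30) + rng.normal(scale=40, size=30)  # drying trend
print("S, p-value:", mann_kendall(rainfall))
```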
Application of Open Source Technologies for Oceanographic Data Analysis
NASA Astrophysics Data System (ADS)
Huang, T.; Gangl, M.; Quach, N. T.; Wilson, B. D.; Chang, G.; Armstrong, E. M.; Chin, T. M.; Greguska, F.
2015-12-01
NEXUS is a data-intensive analysis solution developed with a new approach for handling science data that enables large-scale data analysis by leveraging open source technologies such as Apache Cassandra, Apache Spark, Apache Solr, and Webification. NEXUS has been selected to provide on-the-fly time-series and histogram generation for the Soil Moisture Active Passive (SMAP) mission for Level 2 and Level 3 Active, Passive, and Active Passive products. It also provides an on-the-fly data subsetting capability. NEXUS is designed to scale horizontally, enabling it to handle massive amounts of data in parallel. It takes a new approach to managing time- and geo-referenced array data by dividing data artifacts into chunks and storing them in an industry-standard, horizontally scaled NoSQL database. This approach enables the development of scalable data analysis services that can infuse and leverage the elastic computing infrastructure of the Cloud. It is equipped with a high-performance geospatial and indexed data search solution, coupled with a high-performance data Webification solution free from file I/O bottlenecks, as well as a high-performance, in-memory data analysis engine. In this talk, we will focus on the recently funded AIST 2014 project that uses NEXUS as the core for an oceanographic anomaly detection service and web portal, which we call OceanXtremes.
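A conceptual sketch of the tiling idea behind NEXUS: divide a geo-referenced array into chunks keyed by their bounding box so a regional subset touches only matching tiles. A plain dict stands in for the Cassandra-backed tile store, and the field and tile size are invented.

```python
import numpy as np

def make_tiles(grid, lats, lons, tile=4):
    """Split a 2-D field into tile x tile chunks keyed by their lat/lon corner."""
    tiles = {}
    for i in range(0, grid.shape[0], tile):
        for j in range(0, grid.shape[1], tile):
            tiles[(lats[i], lons[j])] = grid[i:i + tile, j:j + tile]
    return tiles

rng = np.random.default_rng(5)
lats, lons = np.arange(16), np.arange(16)
sst = rng.normal(loc=290, scale=2, size=(16, 16))     # toy sea-surface temperature
store = make_tiles(sst, lats, lons)

# Subset a region by touching only the tiles whose corner falls inside it.
subset = [t for (la, lo), t in store.items() if la < 8 and lo < 8]
print("regional mean from", len(subset), "tiles:", np.mean(subset).round(2))
```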
Zhu, Zaifang; Chen, Huang; Ren, Jiangtao; Lu, Juan J; Gu, Congying; Lynch, Kyle B; Wu, Si; Wang, Zhe; Cao, Chengxi; Liu, Shaorong
2018-03-01
We develop a new two-dimensional (2D) high performance liquid chromatography (HPLC) approach for intact protein analysis. Development of 2D HPLC has a bottleneck problem - limited second-dimension (second-D) separation speed. We solve this problem by incorporating multiple second-D columns to allow several second-D separations to be proceeded in parallel. To demonstrate the feasibility of using this approach for comprehensive protein analysis, we select ion-exchange chromatography as the first-dimension and reverse-phase chromatography as the second-D. We incorporate three second-D columns in an innovative way so that three reverse-phase separations can be performed simultaneously. We test this system for separating both standard proteins and E. coli lysates and achieve baseline resolutions for eleven standard proteins and obtain more than 500 peaks for E. coli lysates. This is an indication that the sample complexities are greatly reduced. We see less than 10 bands when each fraction of the second-D effluents are analyzed by sodium dodecyl sulfate - polyacrylamide gel electrophoresis (SDS-PAGE), compared to hundreds of SDS-PAGE bands as the original sample is analyzed. This approach could potentially be an excellent and general tool for protein analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
A Review of Multivariate Methods for Multimodal Fusion of Brain Imaging Data
Adali, Tülay; Yu, Qingbao; Calhoun, Vince D.
2011-01-01
The development of various neuroimaging techniques is rapidly improving the measurements of brain function/structure. However, despite improvements in individual modalities, it is becoming increasingly clear that the most effective research approaches will utilize multi-modal fusion, which takes advantage of the fact that each modality provides a limited view of the brain. The goal of multimodal fusion is to capitalize on the strength of each modality in a joint analysis, rather than a separate analysis of each. This is a more complicated endeavor that must be approached carefully, and efficient methods are needed to draw generalized and valid conclusions from high-dimensional data with a limited number of subjects. Numerous research efforts have been reported in the field based on various statistical approaches, e.g. independent component analysis (ICA), canonical correlation analysis (CCA) and partial least squares (PLS). In this review paper, we survey a number of multivariate methods appearing in previous reports, which are performed with or without prior information and may have utility for identifying potential brain illness biomarkers. We also discuss the possible strengths and limitations of each method, and review their applications to brain imaging data. PMID:22108139
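To make the fusion idea concrete, here is a small synthetic example of CCA-based two-modality fusion using scikit-learn; the data, dimensions, and noise levels are invented for illustration:

```python
import numpy as np
from sklearn.cross_decomposition import CCA

# Two modalities (e.g., fMRI features X, EEG features Y) sharing latent sources.
rng = np.random.default_rng(0)
n_subjects = 50
shared = rng.normal(size=(n_subjects, 2))  # latent joint sources
X = shared @ rng.normal(size=(2, 20)) + 0.5 * rng.normal(size=(n_subjects, 20))
Y = shared @ rng.normal(size=(2, 15)) + 0.5 * rng.normal(size=(n_subjects, 15))

cca = CCA(n_components=2)
Xc, Yc = cca.fit_transform(X, Y)  # canonical variates, one pair per component
for k in range(2):
    r = np.corrcoef(Xc[:, k], Yc[:, k])[0, 1]
    print(f"canonical correlation {k}: {r:.2f}")  # high r => shared structure
```

A joint-ICA or PLS variant follows the same pattern: stack or pair the modality matrices, then interpret the shared components.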
Interstage Flammability Analysis Approach
NASA Technical Reports Server (NTRS)
Little, Jeffrey K.; Eppard, William M.
2011-01-01
The Interstage of the Ares I launch platform houses several key components which are on standby during First Stage operation: the Reaction Control System (ReCS), the Upper Stage (US) Thrust Vector Control (TVC) and the J-2X with the Main Propulsion System (MPS) propellant feed system. Therefore, potentially dangerous leaks of propellants could develop. The Interstage leak analysis addresses the concerns of localized mixing of hydrogen and oxygen gases to produce deflagration zones in the Interstage of the Ares I launch vehicle during First Stage operation. This report details the approach taken to accomplish the analysis. Specified leakage profiles and actual flammability results are not presented due to proprietary and security restrictions. The interior volume formed by the Interstage walls, bounding interfaces with the Upper and First Stages, and surrounding the J-2X engine was modeled using Loci-CHEM to assess the potential for flammable gas mixtures to develop during First Stage operations. The transient analysis included a derived flammability indicator based on mixture ratios to maintain achievable simulation times. Validation of results was based on a comparison to Interstage pressure profiles outlined in prior NASA studies. The approach proved useful in bounding flammability risk in support of program hazard reviews.
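The report's actual indicator is not published, but a mixture-ratio flammability flag of the kind it describes can be sketched as follows, using the commonly cited hydrogen-in-air flammability limits of roughly 4-75 vol% (the limits and the scalar-field interface are assumptions, not values from the report):

```python
# Flag a computational cell as potentially flammable when its hydrogen mole
# fraction lies within the assumed H2-in-air flammability band.
H2_LFL, H2_UFL = 0.04, 0.75  # lower/upper flammability limits, mole fraction

def flammable(x_h2: float) -> bool:
    return H2_LFL <= x_h2 <= H2_UFL

print(flammable(0.02), flammable(0.30))  # False True
```

Evaluating such a flag per cell per time step is far cheaper than resolving combustion chemistry, which is why an indicator of this kind keeps transient simulation times achievable.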
Giustacchini, Alice; Thongjuea, Supat; Barkas, Nikolaos; Woll, Petter S; Povinelli, Benjamin J; Booth, Christopher A G; Sopp, Paul; Norfo, Ruggiero; Rodriguez-Meira, Alba; Ashley, Neil; Jamieson, Lauren; Vyas, Paresh; Anderson, Kristina; Segerstolpe, Åsa; Qian, Hong; Olsson-Strömberg, Ulla; Mustjoki, Satu; Sandberg, Rickard; Jacobsen, Sten Eirik W; Mead, Adam J
2017-06-01
Recent advances in single-cell transcriptomics are ideally placed to unravel intratumoral heterogeneity and selective resistance of cancer stem cell (SC) subpopulations to molecularly targeted cancer therapies. However, current single-cell RNA-sequencing approaches lack the sensitivity required to reliably detect somatic mutations. We developed a method that combines high-sensitivity mutation detection with whole-transcriptome analysis of the same single cell. We applied this technique to analyze more than 2,000 SCs from patients with chronic myeloid leukemia (CML) throughout the disease course, revealing heterogeneity of CML-SCs, including the identification of a subgroup of CML-SCs with a distinct molecular signature that selectively persisted during prolonged therapy. Analysis of nonleukemic SCs from patients with CML also provided new insights into cell-extrinsic disruption of hematopoiesis in CML associated with clinical outcome. Furthermore, we used this single-cell approach to identify a blast-crisis-specific SC population, which was also present in a subclone of CML-SCs during the chronic phase in a patient who subsequently developed blast crisis. This approach, which might be broadly applied to any malignancy, illustrates how single-cell analysis can identify subpopulations of therapy-resistant SCs that are not apparent through cell-population analysis.
NASA Astrophysics Data System (ADS)
Bajard, Y.; Draper, M.; Viens, P.
1981-05-01
The paper deals with a comparative analysis of several approaches, both possible and actually used, for joint action of local institutions and foreign aid in the field of water supply and related services such as sanitation to villages and small rural agglomerations (market towns, etc.) in developing countries. This comparative analysis is based on examples of actual programmes in this field. The authors have participated in most of the programmes selected as examples, at various levels and in various capacities, from conception to design, implementation and/or evaluation (i.e. rural development programmes in Ivory Coast, Ghana (upper region), Benin and Ethiopia). The authors were not involved in other examples, such as water supply and/or sanitation to small urban centres in Benin, Ivory Coast, etc. They have, however, witnessed them directly and have obtained, therefore, first-hand information on their organization, execution and results. Several typical examples of actual projects are briefly defined and characterized. The paper then undertakes to compare, in a clinical fashion, the advantages and drawbacks of the approaches taken in the various examples presented. The paper finally proposes a recommendation for a realistic approach to joint action between local/domestic and foreign financing/assistance agencies and executing bodies (consultants, contractors) in the field of rural water supply, sanitation and, more generally, health improvement. This line of approach is defined in terms of a logical framework, i.e. goals, purposes, outputs and inputs at the various stages of the project, up to actual evaluation of execution and impact if possible, with a description of practical indicators for the two types of evaluation. Particular attention is given to the problems of technological choices, in view of the constraints imposed by the natural environment and by human and social patterns, as well as by institutions and the economy. Another important point considered by the paper is the problem of information, education, and support to users for the introduction, implementation, operation and maintenance of technical developments at village level. Conclusions are drawn as to the relative advantages of this approach over the "classical" approach and its replicability.
A Research Roadmap for Computation-Based Human Reliability Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boring, Ronald; Mandelli, Diego; Joe, Jeffrey
2015-08-01
The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.
NASA Astrophysics Data System (ADS)
Wang, Dong; Ding, Hao; Singh, Vijay P.; Shang, Xiaosan; Liu, Dengfeng; Wang, Yuankun; Zeng, Xiankui; Wu, Jichun; Wang, Lachun; Zou, Xinqing
2015-05-01
For scientific and sustainable management of water resources, hydrologic and meteorologic data series often need to be extended. This paper proposes a hybrid approach, named WA-CM (wavelet analysis-cloud model), for data series extension. Wavelet analysis has time-frequency localization features, known as the "mathematics microscope," that can decompose and reconstruct hydrologic and meteorologic series by wavelet transform. The cloud model is a mathematical representation of fuzziness and randomness and has strong robustness for uncertain data. The WA-CM approach first employs the wavelet transform to decompose the measured nonstationary series and then uses the cloud model to develop an extension model for each decomposition-layer series. The final extension is obtained by summing the results of the extension of each layer. Two kinds of meteorologic and hydrologic data sets with different characteristics and different influence of human activity from six (three pairs of) representative stations are used to illustrate the WA-CM approach. The approach is also compared with four other methods: the conventional correlation extension method, the Kendall-Theil robust line method, the artificial neural network method (back propagation, multilayer perceptron, and radial basis function), and the single cloud model method. To evaluate the model performance completely and thoroughly, five measures are used: relative error, mean relative error, standard deviation of relative error, root mean square error, and the Theil inequality coefficient. Results show that the WA-CM approach is effective, feasible, and accurate, and it is found to be better than the other four methods compared. The theory employed and the approach developed here can be applied to extension of data in other areas as well.
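A compact sketch of the WA-CM pipeline using the PyWavelets library: decompose the series into time-domain layers (which sum back to the original by linearity of the reconstruction), extend each layer, and sum the extensions. The cloud-model step is replaced here by a labeled placeholder (a linear trend fitted to the tail of each layer):

```python
import numpy as np
import pywt

def wavelet_layers(x, wavelet="db4", level=3):
    """Time-domain layers, one per decomposition level, that sum back to x."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    layers = []
    for i in range(len(coeffs)):
        iso = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        layers.append(pywt.waverec(iso, wavelet)[: len(x)])
    return layers

def extend_layer(layer, m, tail=50):
    # Placeholder for the cloud-model extension: project a linear trend
    # fitted to the tail of the layer series m steps ahead.
    k = min(tail, len(layer))
    t = np.arange(len(layer))
    a, b = np.polyfit(t[-k:], layer[-k:], 1)
    return a * np.arange(len(layer), len(layer) + m) + b

def wa_cm_extend(x, m):
    """Extend series x by m steps: extend each wavelet layer, then sum."""
    return sum(extend_layer(layer, m) for layer in wavelet_layers(x))

t = np.arange(512)
x = np.sin(2 * np.pi * t / 64) + 0.1 * np.random.randn(512)
print(wa_cm_extend(x, 24)[:5])
```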
A study of software management and guidelines for flight projects
NASA Technical Reports Server (NTRS)
1980-01-01
A survey of present software development policies and practices, and an analysis of these policies and practices are summarized. Background information necessary to assess the adequacy of present NASA flight software development approaches is presented.
NASA Astrophysics Data System (ADS)
Razavi, S.; Gupta, H. V.
2014-12-01
Sensitivity analysis (SA) is an important paradigm in the context of Earth System model development and application, and provides a powerful tool that serves several essential functions in modelling practice, including 1) Uncertainty Apportionment - attribution of total uncertainty to different uncertainty sources, 2) Assessment of Similarity - diagnostic testing and evaluation of similarities between the functioning of the model and the real system, 3) Factor and Model Reduction - identification of non-influential factors and/or insensitive components of model structure, and 4) Factor Interdependence - investigation of the nature and strength of interactions between the factors, and the degree to which factors intensify, cancel, or compensate for the effects of each other. A variety of sensitivity analysis approaches have been proposed, each of which formally characterizes a different "intuitive" understanding of what is meant by the "sensitivity" of one or more model responses to its dependent factors (such as model parameters or forcings). These approaches are based on different philosophies and theoretical definitions of sensitivity, and range from simple local derivatives and one-factor-at-a-time procedures to rigorous variance-based (Sobol-type) approaches. In general, each approach focuses on, and identifies, different features and properties of the model response and may therefore lead to different (even conflicting) conclusions about the underlying sensitivity. This presentation revisits the theoretical basis for sensitivity analysis, and critically evaluates existing approaches so as to demonstrate their flaws and shortcomings. With this background, we discuss several important properties of response surfaces that are associated with the understanding and interpretation of sensitivity. Finally, a new approach towards global sensitivity assessment is developed that is consistent with important properties of Earth System model response surfaces.
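As a concrete contrast between the approaches discussed above, the snippet below computes variance-based (Sobol-type) first- and total-order indices with the SALib library for a toy model containing an interaction term that a one-factor-at-a-time scan around a nominal point would miss; the model and bounds are invented for illustration:

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["x1", "x2", "x3"],
    "bounds": [[-1, 1], [-1, 1], [-1, 1]],
}
X = saltelli.sample(problem, 1024)  # N * (2D + 2) model evaluations
Y = 3.0 * X[:, 0] ** 2 + 0.5 * X[:, 1] + X[:, 0] * X[:, 2]

Si = sobol.analyze(problem, Y)
print(dict(zip(problem["names"], np.round(Si["S1"], 3))))  # first-order indices
print(dict(zip(problem["names"], np.round(Si["ST"], 3))))  # total-order indices
```

The gap between ST and S1 for x1 and x3 exposes their interaction, one of the "Factor Interdependence" questions listed above.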
Groves, Ethan; Palenik, Skip; Palenik, Christopher S
2018-04-18
While color is arguably the most important optical property of evidential fibers, the actual dyestuffs responsible for its expression in them are, in forensic trace evidence examinations, rarely analyzed and still less often identified. This is due, primarily, to the exceedingly small quantities of dye present in a single fiber as well as to the fact that dye identification is a challenging analytical problem, even when large quantities are available for analysis. Among the practical reasons for this are the wide range of dyestuffs available (and the even larger number of trade names), the low total concentration of dyes in the finished product, the limited amount of sample typically available for analysis in forensic cases, and the complexity of the dye mixtures that may exist within a single fiber. Literature on the topic of dye analysis is often limited to a specific method, subset of dyestuffs, or an approach that is not applicable given the constraints of a forensic analysis. Here, we present a generalized approach to dye identification that (1) combines several robust analytical methods, (2) is broadly applicable to a wide range of dye chemistries, application classes, and fiber types, and (3) can be scaled down to forensic casework-sized samples. The approach is based on the development of a reference collection of 300 commercially relevant textile dyes that have been characterized by a variety of microanalytical methods (HPTLC, Raman microspectroscopy, infrared microspectroscopy, UV-Vis spectroscopy, and visible microspectrophotometry). Although there is no single approach that is applicable to all dyes on every type of fiber, a combination of these analytical methods has been applied using a reproducible approach that permits the use of reference libraries to constrain the identity of and, in many cases, identify the dye (or dyes) present in a textile fiber sample.
Evaluating cardiac risk: exposure response analysis in early clinical drug development.
Grenier, Julie; Paglialunga, Sabina; Morimoto, Bruce H; Lester, Robert M
2018-01-01
The assessment of a drug's cardiac liability has undergone considerable metamorphosis by regulators since the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH) E14 guideline was introduced in 2005. Drug developers now have a choice in how proarrhythmia risk can be evaluated; the options include a dedicated thorough QT (TQT) study or exposure response (ER) modeling of intensive electrocardiogram (ECG) data captured in early clinical development. The alternative approach of ER modeling was incorporated into a guidance document in 2015 as a primary analysis tool that can be applied in early-phase dose escalation studies as an alternative to a dedicated TQT trial. This review describes the current state of ER modeling of intensive ECG data collected during early clinical drug development; the requirements with regard to the use of a positive control; and the challenges and opportunities of this alternative approach to assessing QT liability.
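A common primary analysis in this setting is a linear mixed-effects model of baseline-adjusted QTc on plasma concentration with a per-subject random intercept. The sketch below, on simulated data with invented names and values, shows the shape of that analysis using statsmodels; it is not a regulatory-grade implementation:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_subj, per = 24, 8
conc = rng.uniform(0, 500, size=n_subj * per)            # ng/mL (simulated)
subj = np.repeat(np.arange(n_subj), per)
dqtc = (2 + 0.01 * conc                                  # true slope 0.01 ms per ng/mL
        + rng.normal(0, 4, n_subj * per)                 # residual noise
        + rng.normal(0, 3, n_subj)[subj])                # subject random effect
df = pd.DataFrame({"dQTcF": dqtc, "conc": conc, "subject": subj})

fit = smf.mixedlm("dQTcF ~ conc", df, groups=df["subject"]).fit()
lo, hi = fit.conf_int().loc["conc"]
# Regulatory-style question: does the upper bound of predicted dQTcF at the
# high clinical exposure stay below the 10 ms threshold of concern?
print(f"slope CI: [{lo:.4f}, {hi:.4f}] ms per ng/mL; "
      f"upper-bound dQTcF at 500 ng/mL ~ {fit.params['Intercept'] + hi * 500:.1f} ms")
```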
Decision analysis and risk models for land development affecting infrastructure systems.
Thekdi, Shital A; Lambert, James H
2012-07-01
Coordination and layering of models to identify risks in complex systems such as large-scale infrastructure of energy, water, and transportation is of current interest across application domains. Such infrastructures are increasingly vulnerable to adjacent commercial and residential land development. Land development can compromise the performance of essential infrastructure systems and increase the costs of maintaining or increasing performance. A risk-informed approach to this topic would be useful to avoid surprise, regret, and the need for costly remedies. This article develops a layering and coordination of models for risk management of land development affecting infrastructure systems. The layers are: system identification, expert elicitation, predictive modeling, comparison of investment alternatives, and implications of current decisions for future options. The modeling layers share a focus on observable factors that most contribute to volatility of land development and land use. The relevant data and expert evidence include current and forecasted growth in population and employment, conservation and preservation rules, land topography and geometries, real estate assessments, market and economic conditions, and other factors. The approach integrates to a decision framework of strategic considerations based on assessing risk, cost, and opportunity in order to prioritize needs and potential remedies that mitigate impacts of land development to the infrastructure systems. The approach is demonstrated for a 5,700-mile multimodal transportation system adjacent to 60,000 tracts of potential land development. © 2011 Society for Risk Analysis.
Green supplier selection: a new genetic/immune strategy with industrial application
NASA Astrophysics Data System (ADS)
Kumar, Amit; Jain, Vipul; Kumar, Sameer; Chandra, Charu
2016-10-01
With the onset of the 'climate change movement', organisations are striving to include environmental criteria in the supplier selection process. This article hybridises a Green Data Envelopment Analysis (GDEA)-based approach with a new Genetic/Immune Strategy for Data Envelopment Analysis (GIS-DEA). The GIS-DEA approach provides a different view of solving multi-criteria decision making problems with data envelopment analysis (DEA) by treating DEA as a multi-objective optimisation problem, with efficiency as one objective and proximity of the solution to decision makers' preferences as the other. The hybrid approach, called GIS-GDEA, is applied here to a well-known automobile spare parts manufacturer in India and the results are presented. User validation based on a specific set of criteria suggests that the supplier selection process with GIS-GDEA is more practical than other approaches in a current industrial scenario with multiple decision makers.
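For readers unfamiliar with the DEA building block, the classical input-oriented CCR multiplier model can be solved as one small linear program per decision-making unit (DMU). The sketch below uses SciPy on invented input/output data and is the plain CCR model, not the paper's GIS-GDEA hybrid:

```python
import numpy as np
from scipy.optimize import linprog

# Maximize u.y_o subject to v.x_o = 1 and u.y_j - v.x_j <= 0 for all DMUs j.
X = np.array([[4.0, 2.0], [6.0, 3.0], [8.0, 5.0], [5.0, 2.5]])  # inputs per DMU
Y = np.array([[60.0], [70.0], [80.0], [40.0]])                   # outputs per DMU

def ccr_efficiency(o):
    m, s = X.shape[1], Y.shape[1]                        # input/output counts
    c = np.concatenate([-Y[o], np.zeros(m)])             # variables z = [u, v]
    A_ub = np.hstack([Y, -X])                            # u.y_j - v.x_j <= 0
    b_ub = np.zeros(len(X))
    A_eq = np.concatenate([np.zeros(s), X[o]])[None, :]  # v.x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun                                      # efficiency in (0, 1]

print([round(ccr_efficiency(o), 3) for o in range(len(X))])
```

The genetic/immune layer of GIS-GDEA would then search over such solutions with proximity to the decision makers' preferences as a second objective.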
Blurring the Inputs: A Natural Language Approach to Sensitivity Analysis
NASA Technical Reports Server (NTRS)
Kleb, William L.; Thompson, Richard A.; Johnston, Christopher O.
2007-01-01
To document model parameter uncertainties and to automate sensitivity analyses for numerical simulation codes, a natural-language-based method to specify tolerances has been developed. With this new method, uncertainties are expressed in a natural manner, i.e., as one would on an engineering drawing, namely, 5.25 +/- 0.01. This approach is robust and readily adapted to various application domains because it does not rely on parsing the particular structure of input file formats. Instead, tolerances of a standard format are added to existing fields within an input file. As a demonstration of the power of this simple, natural language approach, a Monte Carlo sensitivity analysis is performed for three disparate simulation codes: fluid dynamics (LAURA), radiation (HARA), and ablation (FIAT). Effort required to harness each code for sensitivity analysis was recorded to demonstrate the generality and flexibility of this new approach.
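A re-creation of the core idea (not NASA's actual tool): scan an input deck for `value +/- tolerance` tokens with a regular expression and emit Monte Carlo realizations by replacing each token with a uniform draw. Field names in the sample deck are invented:

```python
import re
import random

TOK = re.compile(
    r"(-?\d+\.?\d*(?:[eE][-+]?\d+)?)\s*\+/-\s*(\d+\.?\d*(?:[eE][-+]?\d+)?)"
)

def realize(text: str, rng: random.Random) -> str:
    """Replace every 'value +/- tol' token with a uniform draw from the band."""
    def draw(match):
        v, tol = float(match.group(1)), float(match.group(2))
        return repr(rng.uniform(v - tol, v + tol))
    return TOK.sub(draw, text)

deck = "wall_temp = 5.25 +/- 0.01\nemissivity = 0.85 +/- 0.05\n"
rng = random.Random(42)
for i in range(3):  # three Monte Carlo realizations of the input deck
    print(f"--- sample {i} ---\n{realize(deck, rng)}")
```

Because the scan is purely lexical, the same harness works unchanged on any code's input format, which is the robustness property claimed above.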
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, H.; Li, G., E-mail: gli@clemson.edu
2014-08-28
An accelerated Finite Element Contact Block Reduction (FECBR) approach is presented for computational analysis of ballistic transport in nanoscale electronic devices with arbitrary geometry and unstructured mesh. A finite element formulation is developed for the theoretical CBR/Poisson model. The FECBR approach is accelerated through eigen-pair reduction, lead mode space projection, and component mode synthesis techniques. The accelerated FECBR is applied to perform quantum mechanical ballistic transport analysis of a DG-MOSFET with taper-shaped extensions and a DG-MOSFET with Si/SiO2 interface roughness. The computed electrical transport properties of the devices obtained from the accelerated FECBR approach and the associated computational cost as a function of system degrees of freedom are compared with those obtained from the original CBR and direct inversion methods. The performance of the accelerated FECBR in both its accuracy and efficiency is demonstrated.
Large-Scale Compute-Intensive Analysis via a Combined In-situ and Co-scheduling Workflow Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Messer, Bronson; Sewell, Christopher; Heitmann, Katrin
2015-01-01
Large-scale simulations can produce tens of terabytes of data per analysis cycle, complicating and limiting the efficiency of workflows. Traditionally, outputs are stored on the file system and analyzed in post-processing. With the rapidly increasing size and complexity of simulations, this approach faces an uncertain future. Trending techniques consist of performing the analysis in situ, utilizing the same resources as the simulation, and/or off-loading subsets of the data to a compute-intensive analysis system. We introduce an analysis framework developed for HACC, a cosmological N-body code, that uses both in situ and co-scheduling approaches for handling Petabyte-size outputs. An initial in situ step is used to reduce the amount of data to be analyzed, and to separate out the data-intensive tasks handled off-line. The analysis routines are implemented using the PISTON/VTK-m framework, allowing a single implementation of an algorithm that simultaneously targets a variety of GPU, multi-core, and many-core architectures.
NASA Astrophysics Data System (ADS)
Kong, Xianyu; Liu, Yanfang; Jian, Huimin; Su, Rongguo; Yao, Qingzhen; Shi, Xiaoyong
2017-10-01
To realize potential cost savings in coastal monitoring programs and provide timely advice for marine management, there is an urgent need for efficient evaluation tools based on easily measured variables for the rapid and timely assessment of estuarine and offshore eutrophication. In this study, using parallel factor analysis (PARAFAC), principal component analysis (PCA), and discriminant function analysis (DFA) with the trophic index (TRIX) for reference, we developed an approach for rapidly assessing the eutrophication status of coastal waters using easy-to-measure parameters, including chromophoric dissolved organic matter (CDOM), fluorescence excitation-emission matrices, CDOM UV-Vis absorbance, and other water-quality parameters (turbidity, chlorophyll a, and dissolved oxygen). First, we decomposed CDOM excitation-emission matrices (EEMs) by PARAFAC to identify three components. Then, we applied PCA to simplify the complexity of the relationships between the water-quality parameters. Finally, we used the PCA score values as independent variables in DFA to develop a eutrophication assessment model. The developed model yielded classification accuracy rates of 97.1%, 80.5%, 90.3%, and 89.1% for good, moderate, and poor water qualities, and for the overall data sets, respectively. Our results suggest that these easy-to-measure parameters could be used to develop a simple approach for rapid in-situ assessment and monitoring of the eutrophication of estuarine and offshore areas.
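A skeleton of the PCA-then-discriminant-analysis chain on synthetic stand-in data, with scikit-learn's LDA standing in for the discriminant function analysis step (the features, labels, and dimensions are invented; PARAFAC decomposition of the EEMs is assumed to have already produced component scores):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 300
X = rng.normal(size=(n, 6))  # stand-ins: turbidity, chl-a, DO, PARAFAC scores...
labels = (X[:, 0] + 0.8 * X[:, 1] + rng.normal(0, 0.5, n) > 0).astype(int)

model = make_pipeline(PCA(n_components=3), LinearDiscriminantAnalysis())
acc = cross_val_score(model, X, labels, cv=5).mean()
print(f"cross-validated classification accuracy: {acc:.2f}")
```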
A global optimization approach to multi-polarity sentiment analysis.
Li, Xinmiao; Li, Jing; Wu, Yukeng
2015-01-01
Following the rapid development of social media, sentiment analysis has become an important social media mining technique. The performance of automatic sentiment analysis primarily depends on feature selection and sentiment classification. While information gain (IG) and support vector machines (SVM) are two important techniques, few studies have optimized both approaches in sentiment analysis. The effectiveness of applying a global optimization approach to sentiment analysis remains unclear. We propose a global optimization-based sentiment analysis (PSOGO-Senti) approach to improve sentiment analysis with IG for feature selection and SVM as the learning engine. The PSOGO-Senti approach utilizes a particle swarm optimization algorithm to obtain a global optimal combination of feature dimensions and parameters in the SVM. We evaluate the PSOGO-Senti model on two datasets from different fields. The experimental results showed that the PSOGO-Senti model can improve binary and multi-polarity Chinese sentiment analysis. We compared the optimal feature subset selected by PSOGO-Senti with the features in the sentiment dictionary. The results of this comparison indicated that PSOGO-Senti can effectively remove redundant and noisy features and can select a domain-specific feature subset with higher explanatory power for a particular sentiment analysis task. Further experiments showed that the PSOGO-Senti approach is effective and robust for sentiment analysis tasks in different domains. By comparing the improvements of two-polarity, three-polarity and five-polarity sentiment analysis results, we found that the five-polarity sentiment analysis delivered the largest improvement, while the improvement of the two-polarity sentiment analysis was the smallest; that is, PSOGO-Senti achieves greater improvement for more complicated sentiment analysis tasks. We also compared the results of PSOGO-Senti with those of the genetic algorithm (GA) and grid search method, and found that PSOGO-Senti is more suitable for improving a difficult multi-polarity sentiment analysis problem.
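A minimal particle swarm search over SVM hyperparameters conveys the core of the method; for brevity this sketch optimizes only (log10 C, log10 gamma) on synthetic data and omits the feature-dimension search that PSOGO-Senti also performs:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=20, random_state=0)
rng = np.random.default_rng(0)
LOW, HIGH = np.array([-2.0, -4.0]), np.array([3.0, 1.0])  # search box (assumed)

def fitness(p):  # cross-validated accuracy of an RBF SVM at this point
    return cross_val_score(SVC(C=10 ** p[0], gamma=10 ** p[1]), X, y, cv=3).mean()

n_part, iters = 10, 15
pos = rng.uniform(LOW, HIGH, size=(n_part, 2))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_f.argmax()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_part, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, LOW, HIGH)
    f = np.array([fitness(p) for p in pos])
    better = f > pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[pbest_f.argmax()].copy()

print("best (log10 C, log10 gamma):", gbest, "accuracy:", round(pbest_f.max(), 3))
```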
Boot, Walter R; Sumner, Anna; Towne, Tyler J; Rodriguez, Paola; Anders Ericsson, K
2017-04-01
Video games are ideal platforms for the study of skill acquisition for a variety of reasons. However, our understanding of the development of skill and the cognitive representations that support skilled performance can be limited by a focus on game scores. We present an alternative approach to the study of skill acquisition in video games based on the tools of the Expert Performance Approach. Our investigation was motivated by a detailed analysis of the behaviors responsible for the superior performance of one of the highest scoring players of the video game Space Fortress (Towne, Boot, & Ericsson). This analysis revealed how certain behaviors contributed to his exceptional performance. In this study, we recruited a participant for a similar training regimen, but we collected concurrent and retrospective verbal protocol data throughout training. Protocol analysis revealed insights into strategies, errors, mental representations, and shifting game priorities. We argue that these insights into the developing representations that guided skilled performance could only easily have been derived from the tools of the Expert Performance Approach. We propose that the described approach could be applied to understand performance and skill acquisition in many different video games (and other short- to medium-term skill acquisition paradigms) and help reveal mechanisms of transfer from gameplay to other measures of laboratory and real-world performance. Copyright © 2016 Cognitive Science Society, Inc.
ERIC Educational Resources Information Center
Kervan, Serdan; Tezci, Erdogan
2018-01-01
The aim of this study is to adapt ICT integration approach scale to Kosovo culture, which measures ICT integration approaches of university faculty to teaching and learning process. The scale developed in Turkish has been translated into Albanian to provide linguistic equivalence. The survey was given to a total of 303 instructors [161 (53.1%)…
Multidisciplinary Design, Analysis, and Optimization Tool Development Using a Genetic Algorithm
NASA Technical Reports Server (NTRS)
Pak, Chan-gi; Li, Wesley
2009-01-01
Multidisciplinary design, analysis, and optimization using a genetic algorithm is being developed at the National Aeronautics and Space Administration Dryden Flight Research Center (Edwards, California) to automate the analysis and design process by leveraging existing tools, enabling true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic, and hypersonic aircraft. This is a promising technology, but it faces many challenges in large-scale, real-world application. This report describes current approaches, recent results, and challenges for multidisciplinary design, analysis, and optimization as demonstrated by experience with the Ikhana fire pod design.
Development of an Aeroelastic Code Based on an Euler/Navier-Stokes Aerodynamic Solver
NASA Technical Reports Server (NTRS)
Bakhle, Milind A.; Srivastava, Rakesh; Keith, Theo G., Jr.; Stefko, George L.; Janus, Mark J.
1996-01-01
This paper describes the development of an aeroelastic code (TURBO-AE) based on an Euler/Navier-Stokes unsteady aerodynamic analysis. A brief review of the relevant research in the area of propulsion aeroelasticity is presented. The paper briefly describes the original Euler/Navier-Stokes code (TURBO) and then details the development of the aeroelastic extensions. The aeroelastic formulation is described. The modeling of the dynamics of the blade using a modal approach is detailed, along with the grid deformation approach used to model the elastic deformation of the blade. The work-per-cycle approach used to evaluate aeroelastic stability is described. Representative results used to verify the code are presented. The paper concludes with an evaluation of the development thus far, and some plans for further development and validation of the TURBO-AE code.
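The work-per-cycle criterion mentioned above reduces to a sign check on the integral of generalized aerodynamic force against modal velocity over one vibration cycle: net positive work fed into the blade means negative aerodynamic damping (flutter). A toy numerical version, with all amplitudes and phases assumed for illustration:

```python
import numpy as np

omega = 2 * np.pi * 100.0                # modal frequency, rad/s (assumed)
t = np.linspace(0, 2 * np.pi / omega, 2000)
qdot = 1e-3 * omega * np.cos(omega * t)  # modal velocity for q = 1e-3*sin(w t)
phi = np.deg2rad(95.0)                   # force phase relative to motion (assumed)
F = 50.0 * np.sin(omega * t + phi)       # generalized unsteady aerodynamic force

W = np.trapz(F * qdot, t)                # work per cycle, J
print(f"work per cycle = {W:+.3e} J ->", "unstable" if W > 0 else "stable")
```

Only the phase of the force relative to the motion determines the sign, which is why the unsteady aerodynamic analysis must resolve that phase accurately.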
Training Needs Analysis: Weaknesses in the Conventional Approach.
ERIC Educational Resources Information Center
Leat, Michael James; Lovel, Murray Jack
1997-01-01
Identification of the training and development needs of administrative support staff is not aided by conventional performance appraisal, which measures summary or comparative effectiveness. Meaningful diagnostic evaluation integrates three levels of analysis (organization, task, and individual), using behavioral expectation scales. (SK)
Integrated Proteomic Approaches for Understanding Toxicity of Environmental Chemicals
To apply quantitative proteomic analysis to the evaluation of toxicity of environmental chemicals, we have developed an integrated proteomic technology platform. This platform has been applied to the analysis of the toxic effects and pathways of many important environmental chemi...
Developments in Sampling and Analysis Instrumentation for Stationary Sources
ERIC Educational Resources Information Center
Nader, John S.
1973-01-01
Instrumentation for the measurement of pollutant emissions is considered including sample-site selection, sample transport, sample treatment, sample analysis, and data reduction, display, and interpretation. Measurement approaches discussed involve sample extraction from within the stack and electro-optical methods. (BL)
Quantitative analysis to guide orphan drug development.
Lesko, L J
2012-08-01
The development of orphan drugs for rare diseases has made impressive strides in the past 10 years. There has been a surge in orphan drug designations, but new drug approvals have not kept up. This article presents a three-pronged hierarchical strategy for quantitative analysis of data at the descriptive, mechanistic, and systems levels of the biological system that could represent a standardized and rational approach to orphan drug development. Examples are provided to illustrate the concept.
ERIC Educational Resources Information Center
Hawkey, R.; And Others
1981-01-01
Describes an English language program for foreign professionals expected to attend graduate courses at British universities under the auspices of the Overseas Development Administration. Explains how this intensive program was based on an analysis of students' communication needs, and uses a teaching approach covering, in turn, academic and…
Development and application of optimum sensitivity analysis of structures
NASA Technical Reports Server (NTRS)
Barthelemy, J. F. M.; Hallauer, W. L., Jr.
1984-01-01
The research focused on developing an algorithm applying optimum sensitivity analysis for multilevel optimization. The research efforts have been devoted to assisting NASA Langley's Interdisciplinary Research Office (IRO) in the development of a mature methodology for a multilevel approach to the design of complex (large and multidisciplinary) engineering systems. An effort was undertaken to identify promising multilevel optimization algorithms. In the current reporting period, the computer program generating baseline single level solutions was completed and tested out.
NASA Technical Reports Server (NTRS)
Neuhaus, Jason R.
2018-01-01
This document describes the heads-up display (HUD) used in a piloted lifting-body entry, approach and landing simulation developed for the simulator facilities of the Simulation Development and Analysis Branch (SDAB) at NASA Langley Research Center. The HUD symbology originated with the piloted simulation evaluations of the HL-20 lifting body concept conducted in 1989 at NASA Langley. The original symbology was roughly based on Shuttle HUD symbology, as interpreted by Langley researchers. This document focuses on the addition of the precision approach path indicator (PAPI) lights to the HUD overlay.
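For readers unfamiliar with PAPI, its logic is simple enough to state in a few lines: each of the four lights shows white when the aircraft's approach angle is above that light's setting angle and red when below. The setting angles below are typical values for a 3-degree glidepath and are assumptions, not values taken from the simulator described above:

```python
PAPI_ANGLES = [2.5, 2.8, 3.2, 3.5]  # degrees, one setting per light (assumed)

def papi(approach_angle_deg: float) -> str:
    """Four-character indication, 'W' = white, 'R' = red."""
    return "".join("W" if approach_angle_deg > a else "R" for a in PAPI_ANGLES)

for g in (2.4, 2.9, 3.0, 3.3, 3.6):
    print(f"{g:.1f} deg -> {papi(g)}")  # 3.0 deg -> WWRR: on glidepath
```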
On the interplay between mathematics and biology: hallmarks toward a new systems biology.
Bellomo, Nicola; Elaiw, Ahmed; Althiabi, Abdullah M; Alghamdi, Mohammed Ali
2015-03-01
This paper proposes a critical analysis of the existing literature on mathematical tools developed toward systems biology approaches and, out of this overview, develops a new approach whose main features can be briefly summarized as follows: derivation of mathematical structures suitable to capture the complexity of biological, hence living, systems; and modeling, by appropriate mathematical tools, of Darwinian-type dynamics, namely mutations followed by selection and evolution. Moreover, multiscale methods to move from genes to cells, and from cells to tissue, are analyzed in view of a new systems biology approach. Copyright © 2014 Elsevier B.V. All rights reserved.
Strategic planning features of subsurface management in Kemerovo Oblast
NASA Astrophysics Data System (ADS)
Romanyuk, V.; Grinkevich, A.; Akhmadeev, K.; Pozdeeva, G.
2016-09-01
The article discusses the strategic planning features of regional development based on production and subsurface management in Kemerovo Oblast. A modern approach, SWOT analysis, was applied to assess the regional development strategy. An estimate of the regional development plan's implementation was given for the foreseeable future.
Managing the "Performance" in Performance Management.
ERIC Educational Resources Information Center
Repinski, Marilyn; Bartsch, Maryjo
1996-01-01
Describes a five-step approach to performance management which includes (1) redefining tasks; (2) identifying skills; (3) determining what development tools are necessary; (4) prioritizing skills development; and (5) developing an action plan. Presents a hiring model that includes job analysis, job description, selection, goal setting, evaluation,…
Agyei, Dominic; Tsopmo, Apollinaire; Udenigwe, Chibuike C
2018-06-01
There are emerging advancements in the strategies used for the discovery and development of food-derived bioactive peptides because of their multiple food and health applications. Bioinformatics and peptidomics are two computational and analytical techniques that have the potential to speed up the development of bioactive peptides from bench to market. Structure-activity relationships observed in peptides form the basis for bioinformatics and in silico prediction of bioactive sequences encrypted in food proteins. Peptidomics, on the other hand, relies on "hyphenated" (liquid chromatography-mass spectrometry-based) techniques for the detection, profiling, and quantitation of peptides. Together, bioinformatics and peptidomics approaches provide a low-cost and effective means of predicting, profiling, and screening bioactive protein hydrolysates and peptides from food. This article discusses the basis, strengths, and limitations of bioinformatics and peptidomics approaches currently used for the discovery and analysis of food-derived bioactive peptides.
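As a flavor of the in silico side of such pipelines, the snippet below performs a minimal virtual proteolysis using the standard trypsin rule (cleave C-terminal to K or R, except before P), a typical first step before scanning fragments against a bioactivity database; the sequence shown is an illustrative excerpt, not a curated example from the article:

```python
import re

def tryptic_peptides(protein: str):
    """Split a protein sequence at tryptic sites (after K/R, not before P)."""
    return re.split(r"(?<=[KR])(?!P)", protein)

seq = "RELEELNVPGEIVESLSSSEESITRINKKIEK"
for pep in tryptic_peptides(seq):
    print(pep)
```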
Stable Scalp EEG Spatiospectral Patterns Across Paradigms Estimated by Group ICA.
Labounek, René; Bridwell, David A; Mareček, Radek; Lamoš, Martin; Mikl, Michal; Slavíček, Tomáš; Bednařík, Petr; Baštinec, Jaromír; Hluštík, Petr; Brázdil, Milan; Jan, Jiří
2018-01-01
Electroencephalography (EEG) oscillations reflect the superposition of different cortical sources with potentially different frequencies. Various blind source separation (BSS) approaches have been developed and implemented in order to decompose these oscillations, and a subset of approaches has been developed for decomposition of multi-subject data. Group independent component analysis (Group ICA) is one such approach, revealing spatiospectral maps at the group level with distinct frequency and spatial characteristics. The reproducibility of these distinct maps across subjects and paradigms is a relatively unexplored domain and the topic of the present study. To address this, we conducted separate group ICA decompositions of EEG spatiospectral patterns on data collected during three different paradigms or tasks (resting-state, semantic decision task and visual oddball task). K-means clustering analysis of back-reconstructed individual subject maps demonstrates that fourteen different independent spatiospectral maps are present across the different paradigms/tasks, i.e. they are generally stable.
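A toy sketch of the group-ICA-plus-clustering idea on synthetic data: temporally concatenate subjects' observations, estimate group-level independent maps with scikit-learn's FastICA, then cluster per-subject maps with k-means to gauge stability. Proper back-reconstruction of subject maps is replaced here by a labeled perturbation of the group maps:

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_subj, n_obs, n_feat, n_comp = 12, 40, 64, 5
true_maps = rng.normal(size=(n_comp, n_feat))
data = [rng.normal(size=(n_obs, n_comp)) @ true_maps
        + 0.3 * rng.normal(size=(n_obs, n_feat)) for _ in range(n_subj)]

ica = FastICA(n_components=n_comp, random_state=0)
ica.fit(np.vstack(data))  # group-level decomposition on concatenated data

# Stand-in for back-reconstruction: jitter the group maps per subject.
subj_maps = np.vstack([ica.components_
                       + 0.1 * rng.normal(size=ica.components_.shape)
                       for _ in range(n_subj)])
labels = KMeans(n_clusters=n_comp, n_init=10, random_state=0).fit_predict(subj_maps)
print(np.bincount(labels))  # stable maps -> one balanced cluster per component
```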
Aichele, Stephen S.
2005-01-01
This apparent contradiction may be caused by the differences in the changes measured in each analysis. The change-through-time approach describes change from a fixed starting point of approximately 1970; the gradient approach describes the cumulative effect of all change up to approximately 2000. These findings indicate that although urbanization in Oakland County results in most of the effects observed in the literature, as evidenced in the gradient approach, relatively few of the anticipated effects have been observed during the past three decades. This relative stability despite rapid land-cover change may be related to efforts to mitigate the effects of development and a general decrease in the density of new residential development. It may also be related to external factors such as climate variability and reduced atmospheric deposition of specific chemicals.
Life Prediction Issues in Thermal/Environmental Barrier Coatings in Ceramic Matrix Composites
NASA Technical Reports Server (NTRS)
Shah, Ashwin R.; Brewer, David N.; Murthy, Pappu L. N.
2001-01-01
Issues and design requirements for environmental barrier coating (EBC)/thermal barrier coating (TBC) life, both general and specific to the NASA Ultra-Efficient Engine Technology (UEET) development program, are described. The current state and trend of the research, the methods in vogue for failure analysis, and the long-term behavior and life prediction of EBC/TBC systems are reported. Also, the perceived failure mechanisms, variables, and related uncertainties governing EBC/TBC system life are summarized. A combined heat transfer and structural analysis approach, based on oxidation kinetics using the Arrhenius theory, is proposed to develop a life prediction model for EBC/TBC systems. A stochastic process-based reliability approach that includes physical variables such as gas pressure, temperature, velocity, moisture content, crack density, oxygen content, etc., is suggested. Benefits of the reliability-based approach are also discussed in the report.
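A minimal version of the Arrhenius-based life estimate described above: assume parabolic oxide growth h^2 = kp t with kp(T) = A exp(-Ea / (R T)), and define life as the time for the oxide to reach a critical thickness. All constants below are placeholders, not measured EBC/TBC properties:

```python
import numpy as np

R = 8.314       # J/(mol K)
A = 1.0e-6      # m^2/s, parabolic pre-exponential (assumed)
Ea = 250e3      # J/mol, activation energy (assumed)
h_crit = 10e-6  # m, critical oxide thickness (assumed)

def life_hours(T_kelvin: float) -> float:
    kp = A * np.exp(-Ea / (R * T_kelvin))  # parabolic rate constant, m^2/s
    return (h_crit ** 2 / kp) / 3600.0     # t = h^2 / kp, in hours

for T in (1400.0, 1500.0, 1600.0):
    print(f"T = {T:.0f} K -> life ~ {life_hours(T):.0f} h")
```

The stochastic reliability approach would then treat quantities such as Ea, h_crit, temperature, and moisture content as random variables and propagate them through this deterministic kernel.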
NASA Astrophysics Data System (ADS)
Anaperta, M.; Helendra, H.; Zulva, R.
2018-04-01
This study aims to describe the validity of a character-values-oriented physics module using a process skills approach for dynamic electricity material in high school (SMA/MA) and vocational school (SMK) physics. This is development research. The module development model follows the model proposed by Plomp, which consists of (1) the preliminary research phase, (2) the prototyping phase, and (3) the assessment phase. This study covers the preliminary investigation and design phases. Data on validity were collected through observation and questionnaires. In the preliminary investigation phase, curriculum analysis, student analysis, and concept analysis were conducted. In the design and realization phase, the module was designed for SMA/MA and SMK subjects covering dynamic electricity material. This was followed by formative evaluation, including self-evaluation and prototyping (expert reviews, one-to-one, and small-group evaluation), during which validity was assessed. The research data were obtained through module validation sheets, which yielded a valid module.
Use of application containers and workflows for genomic data analysis
Schulz, Wade L.; Durant, Thomas J. S.; Siddon, Alexa J.; Torres, Richard
2016-01-01
Background: The rapid acquisition of biological data and development of computationally intensive analyses has led to a need for novel approaches to software deployment. In particular, the complexity of common analytic tools for genomics makes them difficult to deploy and decreases the reproducibility of computational experiments. Methods: Recent technologies that allow for application virtualization, such as Docker, enable developers and bioinformaticians to isolate these applications and deploy secure, scalable platforms that have the potential to dramatically increase the efficiency of big data processing. Results: While limitations exist, this study demonstrates a successful implementation of a pipeline with several discrete software applications for the analysis of next-generation sequencing (NGS) data. Conclusions: With this approach, we significantly reduced the amount of time needed to perform clonal analysis from NGS data in acute myeloid leukemia. PMID:28163975
Galileo environmental test and analysis program summary
NASA Technical Reports Server (NTRS)
Hoffman, A. R.
1991-01-01
This paper presents an overview of the Galileo Project's environmental test and analysis program during the spacecraft development phase, October 1978 through launch in October 1989. After describing the top-level objectives of the program, summaries of the approach, requirements, and margins are provided. Examples of assembly- and system-level test results are given for both the pre-1986 (direct mission) testing and the post-1986 (Venus-Earth-Earth gravity assist mission) testing, including dynamic, thermal, electromagnetic compatibility (EMC), and magnetic. The approaches and results for verifying by analysis that the requirements of certain environments (e.g., radiation, micrometeoroids, and single event upsets) are satisfied are presented. The environmental program implemented on Galileo satisfied the spirit and intent of the requirements imposed by the Project during the spacecraft's development. The lessons learned from the Galileo environmental program are discussed in this paper.
A Systems Approach to Develop Sustainable Water Supply Infrastructure and Management
In a visit to Zhejiang University, China, Dr. Y. Jeffrey Yang will discuss in this presentation the system approach for urban water infrastructure sustainability. Through a system analysis, it becomes clear at an urban scale that the energy and water efficiencies of a water supp...
A Composite Model for Employees' Performance Appraisal and Improvement
ERIC Educational Resources Information Center
Manoharan, T. R.; Muralidharan, C.; Deshmukh, S. G.
2012-01-01
Purpose: The purpose of this paper is to develop an innovative method of performance appraisal that will be useful for designing a structured training programme. Design/methodology/approach: Employees' performance appraisals are conducted using new approaches, namely data envelopment analysis and an integrated fuzzy model. Interpretive structural…
A systematic risk management approach employed on the CloudSat project
NASA Technical Reports Server (NTRS)
Basilio, R. R.; Plourde, K. S.; Lam, T.
2000-01-01
The CloudSat Project has developed a simplified approach for fault tree analysis and probabilistic risk assessment. A system-level fault tree has been constructed to identify credible fault scenarios and failure modes leading up to a potential failure to meet the nominal mission success criteria.
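The gate algebra underlying such a fault tree is compact: for independent basic events, an AND gate multiplies probabilities and an OR gate complements the product of complements. The tree structure and failure rates below are illustrative, not CloudSat's actual model:

```python
from functools import reduce

def p_or(*ps):   # probability that at least one input event occurs
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), ps, 1.0)

def p_and(*ps):  # probability that all input events occur
    return reduce(lambda acc, p: acc * p, ps, 1.0)

battery, regulator = 1e-3, 5e-4      # series power chain (assumed rates)
thruster_a, thruster_b = 2e-3, 2e-3  # redundant pair (assumed rates)
top = p_or(p_or(battery, regulator), p_and(thruster_a, thruster_b))
print(f"P(top event) = {top:.3e}")
```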
One of the approaches for reducing uncertainties in the assessment of human exposure is to better characterize the hazardous wastes that contaminate our environment. A significant limitation to this approach, however, is that sampling and laboratory analysis of contaminated envi...
A Participatory Design Approach for a Mobile App-Based Personal Response System
ERIC Educational Resources Information Center
Song, Donggil; Oh, Eun Young
2016-01-01
This study reports on a participatory design approach including the design, development, implementation, and evaluation of a mobile app-based personal response system (PRS). The first cycle formulated initial design principles through context and needs analysis; the second utilized the collaboration with instructors and experts embodying specific…
Pedagogical Basis of DAS Formalism in Engineering Education
ERIC Educational Resources Information Center
Hiltunen, J.; Heikkinen, E.-P.; Jaako, J.; Ahola, J.
2011-01-01
The paper presents a new approach for a bachelor-level curriculum structure in engineering. The approach is called DAS formalism according to its three phases: description, analysis and synthesis. Although developed specifically for process and environmental engineering, DAS formalism has a generic nature and it could also be used in other…
Notes about COOL: Analysis and Highlights of Complex View in Education
ERIC Educational Resources Information Center
de Oliveira, C. A.
2012-01-01
Purpose: The purpose of this paper is to present principles from the complex approach in education and describe some practical pedagogic experiences enhancing how "real world" perspectives have influenced and contributed to curriculum development. Design/methodology/approach: Necessity of integration in terms of knowledge modeling is an…
A study for hypergolic vapor sensor development
NASA Technical Reports Server (NTRS)
Stetter, J. R.
1977-01-01
The use of an electrochemical technique for MMH and NO2 measurement was investigated. Specific MMH and NO2 electrochemical sensors were developed. Experimental techniques for preparation, handling, and analysis of hydrazine vapor mixtures at ppb and ppm levels were developed. Two approaches to NO2 instrument design were evaluated, including specific adsorption and specific electrochemical reduction. Two approaches to hydrazine monitoring were evaluated, including catalytic conversion to NO with subsequent NO detection, and direct specific electrochemical oxidation. Two engineering prototype MMH/NO2 monitors were designed and constructed.
ERIC Educational Resources Information Center
Showanasai, Parinya; Lu, Jiafang; Hallinger, Philip
2013-01-01
Purpose: The extant literature on school leadership development is dominated by conceptual analysis, descriptive studies of current practice, critiques of current practice, and prescriptions for better ways to approach practice. Relatively few studies have examined impact of leadership development using experimental methods, among which even fewer…
Analysis of rubber supply in Sri Lanka
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hartley, M.J.; Nerlove, M.; Peters, R.K. Jr.
1987-11-01
An analysis of the supply response for perennial crops is undertaken for rubber in Sri Lanka, focusing on the uprooting-replanting decision and disaggregating the typical reduced-form supply response equation into several structural relationships. This approach is compared and contrasted with Dowling's analysis of supply response for rubber in Thailand, which is based upon a sophisticated reduced-form supply function developed by Wickens and Greenfield for Brazilian coffee. Because the uprooting-replanting decision is central to understanding rubber supply response in Sri Lanka and for other perennial crops where replanting activities dominate new planting, the standard approaches do not adequately capture supply response.
System review: a method for investigating medical errors in healthcare settings.
Alexander, G L; Stone, T T
2000-01-01
System analysis is a process of evaluating objectives, resources, structure, and design of businesses. System analysis can be used by leaders to collaboratively identify breakthrough opportunities to improve system processes. In healthcare systems, system analysis can be used to review medical errors (system occurrences) that may place patients at risk for injury, disability, and/or death. This study utilizes a case management approach to identify medical errors. Utilizing an interdisciplinary approach, a System Review Team was developed to identify trends in system occurrences, facilitate communication, and enhance the quality of patient care by reducing medical errors.
An operational approach to high resolution agro-ecological zoning in West-Africa.
Le Page, Y; Vasconcelos, Maria; Palminha, A; Melo, I Q; Pereira, J M C
2017-01-01
The objective of this work is to develop a simple methodology for high resolution crop suitability analysis under current and future climate, easily applicable and useful in Least Developed Countries. The approach addresses both regional planning in the context of climate change projections and pre-emptive short-term rural extension interventions based on same-year agricultural season forecasts, all implemented with off-the-shelf resources. The developed tools are applied operationally in a case study developed in three regions of Guinea-Bissau, and the obtained results, as well as the advantages and limitations of the methods applied, are discussed. In this paper we show how a simple approach can easily generate information on climate vulnerability and how it can be operationally used in rural extension services.
Engineering large-scale agent-based systems with consensus
NASA Technical Reports Server (NTRS)
Bokma, A.; Slade, A.; Kerridge, S.; Johnson, K.
1994-01-01
The paper presents the consensus method for the development of large-scale agent-based systems. Systems can be developed as networks of knowledge based agents (KBA) which engage in a collaborative problem solving effort. The method provides a comprehensive and integrated approach to the development of this type of system. This includes a systematic analysis of user requirements as well as a structured approach to generating a system design which exhibits the desired functionality. There is a direct correspondence between system requirements and design components. The benefits of this approach are that requirements are traceable into design components and code thus facilitating verification. The use of the consensus method with two major test applications showed it to be successful and also provided valuable insight into problems typically associated with the development of large systems.
Analyzing public health policy: three approaches.
Coveney, John
2010-07-01
Policy is an important feature of public and private organizations. Within the field of health as a policy arena, public health has emerged in which policy is vital to decision making and the deployment of resources. Public health practitioners and students need to be able to analyze public health policy, yet many feel daunted by the subject's complexity. This article discusses three approaches that simplify policy analysis: Bacchi's "What's the problem?" approach examines the way that policy represents problems. Colebatch's governmentality approach provides a way of analyzing the implementation of policy. Bridgman and Davis's policy cycle allows for an appraisal of public policy development. Each approach provides an analytical framework from which to rigorously study policy. Practitioners and students of public health gain much in engaging with the politicized nature of policy, and a simple approach to policy analysis can greatly assist one's understanding and involvement in policy work.
Estimating residential price elasticity of demand for water: A contingent valuation approach
NASA Astrophysics Data System (ADS)
Thomas, John F.; Syme, Geoffrey J.
1988-11-01
Residential households in Perth, Western Australia have access to privately extracted groundwater as well as a public mains water supply, which has been charged through a two-part block tariff. A contingent valuation approach is developed to estimate price elasticity of demand for public supply. Results are compared with those of a multivariate time series analysis. Validation tests for the contingent approach are proposed, based on a comparison of predicted behaviors following hypothesised price changes with relevant independent data. Properly conducted, the contingent approach appears to be reliable, applicable where the available data do not favor regression analysis, and a fruitful source of information about social, technical, and behavioral responses to change in the price of water.
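The quantity such a survey is after can be computed with the standard midpoint (arc) elasticity formula; the consumption and price figures below are invented for illustration, not the Perth study's data:

```python
def arc_elasticity(q0, q1, p0, p1):
    """Midpoint arc elasticity: % change in quantity per % change in price."""
    dq = (q1 - q0) / ((q1 + q0) / 2)
    dp = (p1 - p0) / ((p1 + p0) / 2)
    return dq / dp

# A household states it would cut mains use from 500 to 440 kL/yr if the
# marginal price rose from 0.40 to 0.55 $/kL (hypothesised price change).
print(f"elasticity = {arc_elasticity(500, 440, 0.40, 0.55):.2f}")  # about -0.40
```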
ERIC Educational Resources Information Center
Bove, Chiara; Jensen, Bente; Wyslowska, Olga; Iannone, Rosa Lisa; Mantovani, Susanna; Karwowska-Struczyk, Malgorzata
2018-01-01
This article offers insights into what characterises innovative continuous professional development (CPD) in the field of early childhood education and care (ECEC) by analysing similarities and differences from case studies of exemplary approaches to innovative CPD in Denmark, Italy and Poland. The comparative analysis focuses on four features…
ATAC Autocuer Modeling Analysis.
1981-01-01
the analysis of the simple rectangular segmentation (1) is based on detection and estimation theory (2). This approach uses the concept of maximum ... continuous waveforms. In order to develop the principles of maximum likelihood, it is convenient to develop the principles for the "classical" ... the concept of maximum likelihood is significant in that it provides the optimum performance of the detection/estimation problem. With a knowledge of
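For context, the classical maximum likelihood detection result alluded to above: for a known signal in white Gaussian noise, the log-likelihood ratio reduces to correlating the observation with the signal template (a matched filter) and comparing against a threshold. A self-contained toy version:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 256
s = np.sin(2 * np.pi * 8 * np.arange(n) / n)  # known signal template

def detect(x, threshold):
    """ML detector for a known signal in white Gaussian noise: <x, s> vs threshold."""
    return float(x @ s) > threshold

threshold = 0.5 * float(s @ s)  # midpoint between the two hypotheses' means
noise = rng.normal(0.0, 1.0, n)
print("noise only   ->", detect(noise, threshold))      # usually False
print("signal+noise ->", detect(s + noise, threshold))  # usually True
```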
Method for Identifying Probable Archaeological Sites from Remotely Sensed Data
NASA Technical Reports Server (NTRS)
Tilton, James C.; Comer, Douglas C.; Priebe, Carey E.; Sussman, Daniel
2011-01-01
Archaeological sites are being compromised or destroyed at a catastrophic rate in most regions of the world. The best solution to this problem is for archaeologists to find and study these sites before they are compromised or destroyed. One way to facilitate the necessary rapid, wide area surveys needed to find these archaeological sites is through the generation of maps of probable archaeological sites from remotely sensed data. We describe an approach for identifying probable locations of archaeological sites over a wide area based on detecting subtle anomalies in vegetative cover through a statistically based analysis of remotely sensed data from multiple sources. We further developed this approach under a recent NASA ROSES Space Archaeology Program project. Under this project we refined and elaborated this statistical analysis to compensate for potential slight misregistrations between the remote sensing data sources and the archaeological site location data. We also explored data quantization approaches (required by the statistical analysis) and identified a superior data quantization approach based on a unique image segmentation method. In our presentation we will summarize our refined approach and demonstrate the effectiveness of the overall approach with test data from Santa Catalina Island off the southern California coast. Finally, we discuss our future plans for further improving our approach.
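A stand-in for the statistical core of the approach: score each pixel's vegetation index against its local background with a moving-window z-score and flag subtle, spatially coherent low-vigor anomalies. The window size, threshold, and synthetic scene are assumptions for illustration:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_zscore(ndvi, win=15):
    """Per-pixel z-score against a win x win moving-window mean and std."""
    mean = uniform_filter(ndvi, win)
    sq = uniform_filter(ndvi ** 2, win)
    std = np.sqrt(np.maximum(sq - mean ** 2, 1e-12))
    return (ndvi - mean) / std

rng = np.random.default_rng(0)
ndvi = 0.6 + 0.05 * rng.normal(size=(200, 200))
ndvi[90:96, 120:126] -= 0.08  # buried feature slightly suppresses vegetation
z = local_zscore(ndvi)
candidates = np.argwhere(z < -2.5)  # subtle low-vigor anomalies
print(len(candidates), "anomalous pixels")
```

The published approach layers multiple sensors and compensates for misregistration; this sketch shows only the single-band anomaly scoring step.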
Instream-Flow Analysis for the Luquillo Experimental Forest, Puerto Rico: Methods and Analysis
F.N. Scatena; S.L. Johnson
2001-01-01
This study develops two habitat-based approaches for evaluating instream-flow requirements within the Luquillo Experimental Forest in northeastern Puerto Rico. The analysis is restricted to instream-flow requirements in upland streams dominated by the common communities of freshwater decapods. In headwater streams, pool volume was the most consistent factor...
A Conversation Analysis Approach to Researching eTandems--The Challenges of Data Collection
ERIC Educational Resources Information Center
Renner, Julia
2016-01-01
This article deals with the challenges of data collection from a Conversation Analysis (CA) perspective to researching synchronous, audio-visual eTandems. Conversation analysis is a research tradition that developed out of ethnomethodology and is concerned with the question of how social interaction in naturally occurring situations is organized.…
Multidimensional Functional Behaviour Assessment within a Problem Analysis Framework.
ERIC Educational Resources Information Center
Ryba, Ken; Annan, Jean
This paper presents a new approach to contextualized problem analysis developed for use with multimodal Functional Behaviour Assessment (FBA) at Massey University in Auckland, New Zealand. The aim of problem analysis is to simplify complex problems that are difficult to understand. It accomplishes this by providing a high order framework that can…
Reasserting the Fundamentals of Systems Analysis and Design through the Rudiments of Artifacts
ERIC Educational Resources Information Center
Jafar, Musa; Babb, Jeffry
2012-01-01
In this paper we present an artifacts-based approach to teaching a senior level Object-Oriented Analysis and Design course. Regardless of the systems development methodology and process model, and in order to facilitate communication across the business modeling, analysis, design, construction and deployment disciplines, we focus on (1) the…
Investigating the Application of Needs Analysis on EAP Business Administration Materials
ERIC Educational Resources Information Center
Mohammed, Saifalislam Abdalla Hajahmed
2016-01-01
This study is conducted to investigate the application of needs analysis in developing EAP materials for business administration students in two Sudanese universities. The subjects are two heads of English language departments. To collect data, the researcher uses interviews and content analysis. The study adopts the descriptive approach. The data of…
A new practice-driven approach to develop software in a cyber-physical system environment
NASA Astrophysics Data System (ADS)
Jiang, Yiping; Chen, C. L. Philip; Duan, Junwei
2016-02-01
Cyber-physical systems (CPS) are an emerging area that cannot work efficiently without proper software handling of the data and business logic. Software and middleware are the soul of the CPS. The software development of CPS is a critical issue because of its complexity in a large-scale realistic system. Furthermore, the object-oriented approach (OOA) often used to develop CPS software needs some improvements to suit the characteristics of CPS. To develop software in a CPS environment, a new systematic approach is proposed in this paper. It has evolved from practice in software companies. It consists of (A) requirement analysis in an event-oriented way, (B) architecture design in a data-oriented way, (C) detailed design and coding in an object-oriented way and (D) testing in an event-oriented way. It is a new approach based on OOA; the difference compared with OOA is that the proposed approach has different emphases and measures at every stage, making it better suited to the characteristics of event-driven CPS. In CPS software development, one should focus on the events more than on the functions or objects. A case study of a smart home system is designed to reveal the effectiveness of the approach, which also proves easy to operate in practice owing to some simplifications. The running result illustrates the validity of this approach.
Deciphering the Epigenetic Code: An Overview of DNA Methylation Analysis Methods
Umer, Muhammad
2013-01-01
Significance: Methylation of cytosine in DNA is linked with gene regulation, and this has profound implications in development, normal biology, and disease conditions in many eukaryotic organisms. A wide range of methods and approaches exist for its identification, quantification, and mapping within the genome. While the earliest approaches were nonspecific and at best useful for quantifying total methylated cytosine in bulk DNA, this field has seen considerable progress and development over the past decades. Recent Advances: Methods for DNA methylation analysis differ in their coverage and sensitivity, and the method of choice depends on the intended application and desired level of information. Potential results include global methylcytosine content, degree of methylation at specific loci, or genome-wide methylation maps. Introduction of more advanced approaches to DNA methylation analysis, such as microarray platforms and massively parallel sequencing, has brought us closer to unveiling the whole methylome. Critical Issues: Sensitive quantification of DNA methylation from degraded and minute quantities of DNA and high-throughput DNA methylation mapping of single cells still remain a challenge. Future Directions: Developments in DNA sequencing technologies as well as the methods for identification and mapping of 5-hydroxymethylcytosine are expected to augment our current understanding of epigenomics. Here we present an overview of methodologies available for DNA methylation analysis with special focus on recent developments in genome-wide and high-throughput methods. While the application focus relates to cancer research, the methods are equally relevant to broader issues of epigenetics and redox science in this special forum. Antioxid. Redox Signal. 18, 1972–1986. PMID:23121567
Frimpong, Joseph Asamoah; Amo-Addae, Maame Pokuah; Adewuyi, Peter Adebayo; Hall, Casey Daniel; Park, Meeyoung Mattie; Nagbe, Thomas Knue
2017-01-01
The laboratory plays a major role in surveillance, including confirming the start and end of an outbreak. Knowing the causative agent for an outbreak informs the development of response strategies and management plans for a public health event. However, issues and challenges may arise that limit the effectiveness or efficiency of laboratories in surveillance. This case study applies a systematic approach to analyse gaps in laboratory surveillance, thereby improving the ability to mitigate these gaps. Although this case study concentrates on factors resulting in poor feedback from the laboratory, practice of this general approach to problem analysis will confer skills required in analysing most public health issues. This case study was developed based on a report submitted in 2016 by the district surveillance officer in Grand Bassa County, Liberia, a resident of the Liberian Frontline Field Epidemiology Training Program. It will serve as a training tool to reinforce lectures on surveillance problem analysis using the fishbone approach. It is designed for public health training in a classroom setting and can be completed within 2 hours 30 minutes.
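For readers unfamiliar with the fishbone (Ishikawa) technique referenced above, the sketch below encodes a fishbone diagram as a plain data structure. The problem statement echoes the case study, but the cause categories and causes are invented placeholders, not content from the training material.

```python
# A minimal data sketch of a fishbone (Ishikawa) diagram; all causes listed
# here are illustrative placeholders, not the case study's findings.
fishbone = {
    "problem": "Poor feedback from the laboratory to the district",
    "causes": {
        "People":      ["unclear reporting responsibilities"],
        "Processes":   ["no standard turnaround time for results"],
        "Equipment":   ["limited specimen transport capacity"],
        "Environment": ["communication gaps between levels"],
    },
}

# Print each "bone" of the diagram with its contributing causes.
for category, causes in fishbone["causes"].items():
    print(f"{category}: {', '.join(causes)}")
```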
A systematic approach for analysis and design of secure health information systems.
Blobel, B; Roger-France, F
2001-06-01
A toolset using object-oriented techniques, including the now widely used Unified Modelling Language (UML) approach, has been developed to facilitate the different users' views for security analysis and design of health care information systems. The paradigm and concepts used are based on the component architecture of information systems and on a general layered security model. The toolset was developed in 1996/1997 within the ISHTAR project funded by the European Commission as well as through international standardisation activities. By analysing and systematising real health care scenarios, only six use case types could be found in the health-related view and nine in the security-related view. By combining these use case types, the analysis and design of any thinkable system architecture can be simplified significantly. Based on generic schemes, the environment needed for both communication and application security can be established by appropriate sets of security services and mechanisms. Because of the importance and the basic character of electronic health care record (EHCR) systems, the understanding of the approach is facilitated by (incomplete) examples for this application.
Microfluidic approaches to malaria detection
Gascoyne, Peter; Satayavivad, Jutamaad; Ruchirawat, Mathuros
2009-01-01
Microfluidic systems are under development to address a variety of medical problems. Key advantages of micro-total analysis systems based on microfluidic technology are the promise of small size and the integration of sample handling and measurement functions within a single, automated device having low mass-production costs. Here, we review the spectrum of methods currently used to detect malaria, consider their advantages and disadvantages, and discuss their adaptability towards integration into small, automated micro-total analysis systems. Molecular amplification methods emerge as leading candidates for chip-based systems because they offer extremely high sensitivity, the ability to recognize malaria species and strain, and adaptability to the detection of new genotypic signatures that will emerge from current genomic-based research of the disease. Current approaches to the development of chip-based molecular amplification are considered with special emphasis on flow-through PCR, and we present for the first time the method of malaria specimen preparation by dielectrophoretic field-flow fractionation. Although many challenges must be addressed to realize a micro-total analysis system for malaria diagnosis, it is concluded that the potential benefits of the approach are well worth pursuing. PMID:14744562
Motivations and Barriers for Policymakers to Developing State Adaptation Plans
NASA Astrophysics Data System (ADS)
Miller, R.; Sylak-Glassman, E.
2016-12-01
Current approaches for developing high-quality adaptation plans require significant resources. In recent years, communities have grown to embrace adaptation plans in multiple forms, including adaptive capacity assessments, resilience strategies, and vulnerability assessments. Across the United States, as of this writing, 14 states have established adaptation plans, with another 8 states having begun the process. Given the high resource requirements and increasing interest in the development of adaptation plans, we aim to examine patterns behind the establishment of resilience plans at the state level. We examine demographic, financial, political, and physical characteristics associated with different states in an effort to explore the reasoning behind investing in the development of adaptation plans. This analysis considers quantitative and qualitative factors, including recent election outcomes, politicians' climate-related statements and campaign promises, demographics, budgets, and regional climate threats. The analysis aims to identify motivations for state leadership taking action to develop adaptation plans. Results from the analysis seek to identify the primary drivers and barriers associated with state-wide resilience planning. These results could inform the design of scientific communication tools or approaches to aid future adaptation responses to climate change.
Development of a realistic stress analysis for fatigue analysis of notched composite laminates
NASA Technical Reports Server (NTRS)
Humphreys, E. A.; Rosen, B. W.
1979-01-01
A finite element stress analysis consisting of a membrane and interlaminar shear spring analysis was developed. This approach was utilized in order to model physically realistic failure mechanisms while maintaining a high degree of computational economy. The accuracy of the stress analysis predictions is verified through comparisons with other solutions to the composite laminate edge effect problem. The stress analysis model was incorporated into an existing fatigue analysis methodology and the entire procedure computerized. A fatigue analysis is performed on a square laminated composite plate with a circular central hole. A complete description and user's guide for the computer code FLAC (Fatigue of Laminated Composites) is included as an appendix.
Advances in Proteomics Data Analysis and Display Using an Accurate Mass and Time Tag Approach
Zimmer, Jennifer S.D.; Monroe, Matthew E.; Qian, Wei-Jun; Smith, Richard D.
2007-01-01
Proteomics has recently demonstrated utility in understanding cellular processes on the molecular level as a component of systems biology approaches and for identifying potential biomarkers of various disease states. The large amount of data generated by utilizing high efficiency (e.g., chromatographic) separations coupled to high mass accuracy mass spectrometry for high-throughput proteomics analyses presents challenges related to data processing, analysis, and display. This review focuses on recent advances in nanoLC-FTICR-MS-based proteomics approaches and the accompanying data processing tools that have been developed to display and interpret the large volumes of data being produced. PMID:16429408
Marketing approaches for OTC analgesics in Bulgaria
Petkova, Valentina; Valchanova, Velislava; Ibrahim, Adel; Nikolova, Irina; Benbasat, Niko; Dimitrov, Milen
2014-01-01
Marketing management includes the analysis of market opportunities, the selection of target markets, and the planning, development, implementation, monitoring, and control of marketing strategies. The object of the present study was to analyse the marketing approaches applied for non-steroidal anti-inflammatory drugs (NSAIDs) in Bulgaria. The SWOT analysis (a planning method used to evaluate strengths, weaknesses, opportunities, and threats) performed for one of the leading Bulgarian manufacturers outlined the complex corporate strategy for stimulating the sales of NSAIDs. The study results show that the legislative framework in the country provides an opportunity to regulate the NSAID market so that incorrect marketing approaches, such as unfair competition, are avoided. PMID:26019521
Quantile regression in the presence of monotone missingness with sensitivity analysis
Liu, Minzhao; Daniels, Michael J.; Perri, Michael G.
2016-01-01
In this paper, we develop methods for longitudinal quantile regression when there is monotone missingness. In particular, we propose pattern mixture models with a constraint that provides a straightforward interpretation of the marginal quantile regression parameters. Our approach allows sensitivity analysis, which is an essential component in inference for incomplete data. To facilitate computation of the likelihood, we propose a novel way to obtain analytic forms for the required integrals. We conduct simulations to examine the robustness of our approach to modeling assumptions and compare its performance to competing approaches. The model is applied to data from a recent clinical trial on weight management. PMID:26041008
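As background for the approach described above, the sketch below fits ordinary complete-data quantile regressions at several quantiles using statsmodels; the paper's pattern-mixture machinery for monotone missingness builds on, but goes well beyond, this baseline. The simulated data are illustrative only.

```python
import numpy as np
import statsmodels.api as sm

# Ordinary (complete-data) quantile regression as a baseline illustration.
rng = np.random.default_rng(1)
n = 500
x = rng.uniform(0, 10, n)
y = 1.0 + 0.5 * x + rng.normal(0, 1 + 0.2 * x, n)  # heteroscedastic outcome

X = sm.add_constant(x)
for q in (0.25, 0.5, 0.75):
    res = sm.QuantReg(y, X).fit(q=q)
    print(q, res.params)  # intercept and slope differ across quantiles
```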
Marketing approaches for OTC analgesics in Bulgaria.
Petkova, Valentina; Valchanova, Velislava; Ibrahim, Adel; Nikolova, Irina; Benbasat, Niko; Dimitrov, Milen
2014-03-04
The marketing management includes analysis of market opportunities, selection of target markets, planning, developing and implementing of marketing strategies, monitoring and result control. The object of the present study was to analyse the marketing approaches applied for non-steroidal anti-inflammatory drugs (NSAIDs) in Bulgaria. The performed SWOT(planning method used to evaluate the strengths, weaknesses, opportunities, and threats) analysis for one of the leading Bulgarian manufacturers marked the complex corporative strategy for stimulating the sales of NSAIDs. The study results show that the legislation frame in the country gives an opportunity for regulation of the NSAID market in order that incorrect marketing approaches such as disloyal competition are avoided.
SCOS 2: An object oriented software development approach
NASA Technical Reports Server (NTRS)
Symonds, Martin; Lynenskjold, Steen; Mueller, Christian
1994-01-01
The Spacecraft Control and Operations System 2 (SCOS 2) is intended to provide the generic mission control system infrastructure for future ESA missions. It represents a bold step forward, taking advantage of state-of-the-art technology and current practices in the area of software engineering. Key features include: (1) use of object oriented analysis and design techniques; (2) use of UNIX, C++ and a distributed architecture as the enabling implementation technology; (3) the goal of re-use for development, maintenance and mission-specific software implementation; and (4) introduction of the concept of a spacecraft control model. This paper touches upon some of the traditional beliefs surrounding Object Oriented development and describes their relevance to SCOS 2. It gives the rationale for why particular approaches were adopted and others not, and describes the impact of these decisions. The development approach followed is discussed, highlighting the evolutionary nature of the overall process and the iterative nature of the various tasks carried out. The emphasis of this paper is on the process of the development, with the following being covered: (1) the three phases of the SCOS 2 project - prototyping & analysis, design & implementation, and configuration/delivery of mission-specific systems; (2) the close cooperation and continual interaction with the users during the development; (3) the management approach - the split between client staff, industry and some of the required project management activities; (4) the lifecycle adopted, an enhancement of the ESA PSS-05 standard with SCOS 2 specific activities and approaches defined; and (5) an examination of some of the difficulties encountered and the solutions adopted. Finally, the lessons learned from the SCOS 2 experience are highlighted, identifying those issues to be used as feedback into future developments of this nature. This paper does not intend to describe the finished product and its operation; rather, it focuses on the journey to arrive there, concentrating on the process and not the products of the SCOS 2 software development.
A phasor approach analysis of multiphoton FLIM measurements of three-dimensional cell culture models
NASA Astrophysics Data System (ADS)
Lakner, P. H.; Möller, Y.; Olayioye, M. A.; Brucker, S. Y.; Schenke-Layland, K.; Monaghan, M. G.
2016-03-01
Fluorescence lifetime imaging microscopy (FLIM) is a useful approach to obtain information regarding the endogenous fluorophores present in biological samples. The concise evaluation of FLIM data requires the use of robust mathematical algorithms. In this study, we developed a user-friendly phasor approach for analyzing FLIM data and applied this method to three-dimensional (3D) Caco-2 models of polarized epithelial luminal cysts in a supporting extracellular matrix environment. These Caco-2 based models were treated with epidermal growth factor (EGF) to stimulate proliferation, in order to determine whether FLIM could detect such a change in cell behavior. Autofluorescence from nicotinamide adenine dinucleotide (phosphate) (NAD(P)H) in luminal Caco-2 cysts was stimulated by 2-photon laser excitation. Using a phasor approach, the lifetimes of the involved fluorophores and their contributions were calculated with fewer initial assumptions than multiexponential decay fitting requires. The phasor approach simplified FLIM data analysis, making it an interesting tool for non-experts in numerical data analysis. We observed that increased proliferation stimulated by EGF led to a significant shift in fluorescence lifetime and a significant alteration of the phasor data shape. Our data demonstrate that multiphoton FLIM analysis with the phasor approach is a suitable method for the non-invasive analysis of 3D in vitro cell culture models, qualifying this method for monitoring basic cellular features and the effect of external factors.
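A minimal sketch of the generic phasor transform, assuming a single measured decay and first-harmonic analysis; it is not the authors' specific pipeline. For a mono-exponential decay the resulting (g, s) point lies (approximately, given the finite time window) on the universal semicircle.

```python
import numpy as np

# First-harmonic phasor coordinates of a fluorescence decay I(t):
#   g = sum(I * cos(w t)) / sum(I),  s = sum(I * sin(w t)) / sum(I)
def phasor(decay: np.ndarray, t: np.ndarray, omega: float):
    total = decay.sum()
    g = (decay * np.cos(omega * t)).sum() / total
    s = (decay * np.sin(omega * t)).sum() / total
    return g, s

# Synthetic mono-exponential decay; for such a decay the phasor falls near the
# universal semicircle: g = 1/(1+(w*tau)^2), s = w*tau*g.
tau, period = 2.5e-9, 12.5e-9                  # e.g. an 80 MHz repetition rate
t = np.linspace(0, period, 256, endpoint=False)
omega = 2 * np.pi / period
print(phasor(np.exp(-t / tau), t, omega))
```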
Using argument notation to engineer biological simulations with increased confidence
Alden, Kieran; Andrews, Paul S.; Polack, Fiona A. C.; Veiga-Fernandes, Henrique; Coles, Mark C.; Timmis, Jon
2015-01-01
The application of computational and mathematical modelling to explore the mechanics of biological systems is becoming prevalent. To significantly impact biological research, notably in developing novel therapeutics, it is critical that the model adequately represents the captured system. Confidence in adopting in silico approaches can be improved by applying a structured argumentation approach, alongside model development and results analysis. We propose an approach based on argumentation from safety-critical systems engineering, where a system is subjected to a stringent analysis of compliance against identified criteria. We show its use in examining the biological information upon which a model is based, identifying model strengths, highlighting areas requiring additional biological experimentation and providing documentation to support model publication. We demonstrate our use of structured argumentation in the development of a model of lymphoid tissue formation, specifically Peyer's Patches. The argumentation structure is captured using Artoo (www.york.ac.uk/ycil/software/artoo), our Web-based tool for constructing fitness-for-purpose arguments, using a notation based on the safety-critical goal structuring notation. We show how argumentation helps in making the design and structured analysis of a model transparent, capturing the reasoning behind the inclusion or exclusion of each biological feature and recording assumptions, as well as pointing to evidence supporting model-derived conclusions. PMID:25589574
Using argument notation to engineer biological simulations with increased confidence.
Alden, Kieran; Andrews, Paul S; Polack, Fiona A C; Veiga-Fernandes, Henrique; Coles, Mark C; Timmis, Jon
2015-03-06
The application of computational and mathematical modelling to explore the mechanics of biological systems is becoming prevalent. To significantly impact biological research, notably in developing novel therapeutics, it is critical that the model adequately represents the captured system. Confidence in adopting in silico approaches can be improved by applying a structured argumentation approach, alongside model development and results analysis. We propose an approach based on argumentation from safety-critical systems engineering, where a system is subjected to a stringent analysis of compliance against identified criteria. We show its use in examining the biological information upon which a model is based, identifying model strengths, highlighting areas requiring additional biological experimentation and providing documentation to support model publication. We demonstrate our use of structured argumentation in the development of a model of lymphoid tissue formation, specifically Peyer's Patches. The argumentation structure is captured using Artoo (www.york.ac.uk/ycil/software/artoo), our Web-based tool for constructing fitness-for-purpose arguments, using a notation based on the safety-critical goal structuring notation. We show how argumentation helps in making the design and structured analysis of a model transparent, capturing the reasoning behind the inclusion or exclusion of each biological feature and recording assumptions, as well as pointing to evidence supporting model-derived conclusions.
Comparison of Nonlinear Random Response Using Equivalent Linearization and Numerical Simulation
NASA Technical Reports Server (NTRS)
Rizzi, Stephen A.; Muravyov, Alexander A.
2000-01-01
A recently developed finite-element-based equivalent linearization approach for the analysis of random vibrations of geometrically nonlinear multiple degree-of-freedom structures is validated. The validation is based on comparisons with results from a finite-element-based numerical simulation analysis using a numerical integration technique in physical coordinates. In particular, results for the case of a clamped-clamped beam are considered for an extensive load range to establish the limits of validity of the equivalent linearization approach.
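For orientation, the sketch below applies equivalent linearization to the classic single-degree-of-freedom Duffing oscillator under white noise, iterating between the Gaussian-closure equivalent stiffness and the linear-system response variance. It is a textbook illustration under stated assumptions (unit mass, two-sided spectral density S0), not the finite-element multi-degree-of-freedom formulation validated in the paper.

```python
import math

# Fixed-point iteration for equivalent (statistical) linearization of a Duffing
# oscillator x'' + c x' + k (x + eps x^3) = w(t), with w(t) white noise of
# two-sided spectral density S0 and unit mass assumed.
def duffing_response_variance(k=1.0, c=0.05, eps=0.5, S0=0.01,
                              tol=1e-10, max_iter=200):
    var = math.pi * S0 / (c * k)            # linear (eps = 0) starting guess
    for _ in range(max_iter):
        k_eq = k * (1.0 + 3.0 * eps * var)  # Gaussian closure: E[x^4] = 3 var^2
        new_var = math.pi * S0 / (c * k_eq) # variance of equivalent linear system
        if abs(new_var - var) < tol:
            return new_var, k_eq
        var = new_var
    return var, k_eq

var, k_eq = duffing_response_variance()
print(var, k_eq)  # hardening spring: variance below the linear-system value
```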
ERIC Educational Resources Information Center
Muthen, Bengt
This paper investigates methods that avoid using multiple groups to represent the missing data patterns in covariance structure modeling, attempting instead to do a single-group analysis where the only action the analyst has to take is to indicate that data is missing. A new covariance structure approach developed by B. Muthen and G. Arminger is…
Cinner, Joshua E; Bodin, Orjan
2010-08-11
Diverse livelihood portfolios are frequently viewed as a critical component of household economies in developing countries. Within the context of natural resources governance in particular, the capacity of individual households to engage in multiple occupations has been shown to influence important issues such as whether fishers would exit a declining fishery, how people react to policy, the types of resource management systems that may be applicable, and other decisions about natural resource use. This paper uses network analysis to provide a novel methodological framework for detailed systemic analysis of household livelihood portfolios. Paying particular attention to the role of natural resource-based occupations such as fisheries, we use network analyses to map occupations and their interrelationships, which we refer to as 'livelihood landscapes'. This network approach allows for the visualization of complex information about dependence on natural resources that can be aggregated at different scales. We then examine how the role of natural resource-based occupations changes along spectra of socioeconomic development and population density in 27 communities in 5 western Indian Ocean countries. Network statistics, including in- and out-degree centrality, the density of the network, and the level of network centralization, are compared along a multivariate index of community-level socioeconomic development and a gradient of human population density. The combination of network analyses suggests an increase in household-level specialization with development for most occupational sectors, including fishing and farming, but that at the community level, economies remained diversified. The novel modeling approach introduced here provides for various types of livelihood portfolio analyses at different scales of social aggregation. Our livelihood landscapes approach provides insights into communities' dependencies on and usages of natural resources, and shows how patterns of occupational interrelationships relate to socioeconomic development and population density. A key question for future analysis is how the reduction in household occupational diversity, coupled with the maintenance of community-level diversity, that we see with increasing socioeconomic development influences key aspects of societies' vulnerability to environmental change or disasters.
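A minimal sketch of the kind of occupation network the paper analyzes, using networkx to compute the reported statistics (density and in-/out-degree centrality). The occupations and edges are invented placeholders, and the edge semantics here are an assumption, not the paper's coding scheme.

```python
import networkx as nx

# A toy 'livelihood landscape' as a directed occupation network; assume an
# edge u -> v encodes that households in occupation u also engage in v.
G = nx.DiGraph()
G.add_edges_from([
    ("fishing", "farming"), ("fishing", "tourism"),
    ("farming", "fishing"), ("trading", "farming"),
])

print(nx.density(G))               # overall connectance of the portfolio
print(nx.in_degree_centrality(G))  # occupations other sectors combine with
print(nx.out_degree_centrality(G)) # occupations whose households diversify
```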
NASA Supportability Engineering Implementation Utilizing DoD Practices and Processes
NASA Technical Reports Server (NTRS)
Smith, David A.; Smith, John V.
2010-01-01
The Ares I design and development program determined early in the System Design Review phase to utilize the DoD Integrated Logistics Support (ILS) and Logistics Support Analysis (LSA) approach for supportability engineering as an integral part of the system engineering process. This paper provides a review of the overall approach to designing Ares I with an emphasis on a more affordable, supportable, and sustainable launch vehicle. Discussions include requirements development, design influence, support concept alternatives, ILS and LSA planning, logistics support analyses and trades performed, LSA tailoring for the NASA Ares program, support system infrastructure identification, ILS design review documentation, working group coordination, and overall ILS implementation. At the outset, the Ares I Project initiated the development of the Integrated Logistics Support Plan (ILSP) and a Logistics Support Analysis process to provide a path forward for the management of the Ares I ILS program and supportability analysis activities. The ILSP provided the initial planning and coordination between the Ares I Project Elements and the Ground Operations Project. The LSA process provided a system engineering approach for developing the Ares I supportability requirements, influencing the design for supportability, and developing alternative support concepts that satisfy the program operability requirements. The LSA planning and analysis results are documented in the Logistics Support Analysis Report. This document was required during the Ares I System Design Review (SDR) and Preliminary Design Review (PDR) cycles. To help coordinate the LSA process across the Ares I project and between programs, the LSA Report is updated and released quarterly. A system requirements analysis was performed to determine the supportability requirements and technical performance measurements (TPMs). Two working groups were established to support the management and implementation of the Ares I ILS program: the Integrated Logistics Support Working Group (ILSWG) and the Logistics Support Analysis Record Working Group (LSARWG). The Ares I ILSWG was established to assess requirements, conduct and evaluate analyses and trade studies associated with acquisition logistics and supportability processes, and resolve Ares I integrated logistics and supportability issues. It established a strategic collaborative alliance for coordination of Logistics Support Analysis activities in support of the integrated Ares I vehicle design and the development of the logistics support infrastructure. A joint Ares I - Orion LSAR Working Group was established to: 1) guide the development of Ares I and Orion LSAR data and serve as a model for future Constellation programs, 2) develop rules and assumptions that will apply across the Constellation program with regard to the program's LSAR development, and 3) maintain the Constellation LSAR Style Guide.
Analysis of Illumina Microbial Assemblies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clum, Alicia; Foster, Brian; Froula, Jeff
2010-05-28
Since the emergence of second-generation sequencing technologies, the evaluation of different sequencing approaches and their assembly strategies for different types of genomes has become an important undertaking. Next generation sequencing technologies dramatically increase sequence throughput while decreasing cost, making them an attractive tool for whole genome shotgun sequencing. To compare different approaches for de-novo whole genome assembly, appropriate tools and a solid understanding of both quantity and quality of the underlying sequence data are crucial. Here, we performed an in-depth analysis of short-read Illumina sequence assembly strategies for bacterial and archaeal genomes. Different types of Illumina libraries as well as different trim parameters and assemblers were evaluated. Results of the comparative analysis and sequencing platforms will be presented. The goal of this analysis is to develop a cost-effective approach for the increased throughput of the generation of high quality microbial genomes.
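As one example of the summary statistics commonly used when comparing assemblies, the sketch below computes N50; the abstract does not state which metrics the authors used, so this is illustrative background rather than their method.

```python
# N50: a standard assembly summary statistic (illustrative only).
def n50(contig_lengths):
    """Length L such that contigs of length >= L cover half the assembly."""
    lengths = sorted(contig_lengths, reverse=True)
    half = sum(lengths) / 2
    running = 0
    for L in lengths:
        running += L
        if running >= half:
            return L

print(n50([100, 400, 250, 50, 200]))  # -> 250
```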
ERIC Educational Resources Information Center
Moorosi, Pontso
2014-01-01
This article explores the notion of leadership identity construction as it happens through a leadership development programme. Influenced by a conception that leadership development is essentially about facilitating an identity transition, it uses an intersectional approach to explore school leaders' identity construction as it was shaped and…
DOT National Transportation Integrated Search
2006-08-28
Task 5 - Identify Corridor Types, Operational Approaches and Strategies, and Analysis Tools - is part of the overall foundational research to further the understanding of various aspects of Integrated Corridor Management (ICM) and to identify integra...
Improving Conference Skills Through the CCS.
ERIC Educational Resources Information Center
Wilen, William W.; Kindsvatter, Richard
1982-01-01
Presents a Conference Category System (CCS) which will help social studies supervisors develop the skills necessary to conduct a conference effectively. The CCS can be applied using either a shared-analysis or self-analysis approach in conjunction with a video or audio-tape recorder. (RM)
A Taxonomic Approach to the Gestalt Theory of Perls
ERIC Educational Resources Information Center
Raming, Henry E.; Frey, David H.
1974-01-01
This study applied content analysis and cluster analysis to the ideas of Fritz Perls to develop a taxonomy of Gestalt processes and goals. Summaries of the typal groups or clusters were written and the implications of taxonomic research in counseling discussed. (Author)
Educational Cost-Benefit Analysis.
ERIC Educational Resources Information Center
Hough, J. R.
1994-01-01
Educational cost-benefit analysis, as practiced in both industrialized and developing nations, has been much criticized. Manpower planning, the principal alternative, has received even harsher criticism. The two approaches should be combined in empirically based projects that study recent graduates and chart their subsequent employment progress.…
An Empirical Bayes Approach to Mantel-Haenszel DIF Analysis.
ERIC Educational Resources Information Center
Zwick, Rebecca; Thayer, Dorothy T.; Lewis, Charles
1999-01-01
Developed an empirical Bayes enhancement to Mantel-Haenszel (MH) analysis of differential item functioning (DIF) in which it is assumed that the MH statistics are normally distributed and that the prior distribution of underlying DIF parameters is also normal. (Author/SLD)
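A minimal sketch of the normal-normal empirical Bayes shrinkage that such an enhancement implies: MH DIF estimates with known sampling variances are pulled toward a common prior mean. The moment-based prior estimates used here are a simplifying assumption, not necessarily the authors' estimation procedure.

```python
import numpy as np

# Normal-normal empirical Bayes shrinkage of Mantel-Haenszel DIF statistics
# y_i with known sampling variances s2_i and a normal prior on true DIF.
def eb_posterior_means(y, s2):
    y, s2 = np.asarray(y, float), np.asarray(s2, float)
    mu = y.mean()                         # prior mean (method-of-moments guess)
    tau2 = max(y.var() - s2.mean(), 0.0)  # prior variance (moment estimate)
    w = tau2 / (tau2 + s2)                # shrinkage weight per item
    return w * y + (1 - w) * mu

# Three items' MH DIF estimates, each with sampling variance 0.25.
print(eb_posterior_means([-1.2, 0.1, 0.8], [0.25, 0.25, 0.25]))
```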
Development of an integrated BEM approach for hot fluid structure interaction
NASA Technical Reports Server (NTRS)
Dargush, G. F.; Banerjee, P. K.; Shi, Y.
1990-01-01
A comprehensive boundary element method is presented for transient thermoelastic analysis of hot section Earth-to-Orbit engine components. This time-domain formulation requires discretization of only the surface of the component, and thus provides an attractive alternative to finite element analysis for this class of problems. In addition, steep thermal gradients, which often occur near the surface, can be captured more readily since with a boundary element approach there are no shape functions to constrain the solution in the direction normal to the surface. For example, the circular disc analysis indicates the high level of accuracy that can be obtained. In fact, on the basis of reduced modeling effort and improved accuracy, it appears that the present boundary element method should be the preferred approach for general problems of transient thermoelasticity.
Progressive collapse of a two-story reinforced concrete frame with embedded smart aggregates
NASA Astrophysics Data System (ADS)
Laskar, Arghadeep; Gu, Haichang; Mo, Y. L.; Song, Gangbing
2009-07-01
This paper reports the experimental and analytical results of a two-story reinforced concrete frame instrumented with innovative piezoceramic-based smart aggregates (SAs) and subjected to a monotonic lateral load up to failure. A finite element model of the frame is developed and analyzed using a computer program called the Open System for Earthquake Engineering Simulation (OpenSees). The finite element analysis (FEA) is used to predict the load-deformation curve as well as the development of plastic hinges in the frame. The load-deformation curve predicted from the FEA matched well with the experimental results. The sequence of development of plastic hinges in the frame is also studied from the FEA results. The locations of the plastic hinges, as obtained from the analysis, were similar to those observed during the experiment. An SA-based approach is also proposed to evaluate the health status of the concrete frame and identify the development of plastic hinges during the loading procedure. The results of the FEA are used to validate the SA-based approach for detecting the locations and occurrence of the plastic hinges leading to the progressive collapse of the frame. The locations and sequential development of the plastic hinges obtained from the SA-based approach correspond well with the FEA results. The proposed SA-based approach, thus validated using FEA and experimental results, has great potential to be applied in the health monitoring of large-scale civil infrastructures.
Hansen, William B; Dusenbury, Linda; Bishop, Dana; Derzon, James H
2007-06-01
We conducted an analysis of programs listed on the National Registry of Effective Programs and Practices as of 2003. This analysis focused on programs that addressed substance abuse prevention from among those on the effective or model program lists and that had manuals. A total of 48 programs met these inclusion criteria. We coded program manuals for content that was covered based on how much time was devoted to changing targeted mediating variables. The value of this approach is that program content can be judged using an impartial standard that can be applied to a wide range of intervention approaches. On average, programs addressed eight of 23 possible content areas. Our analyses suggested there were seven distinguishable approaches that have been used in substance abuse prevention programs. These include (i) changing access within the environment, (ii) promoting the development of personal and social skills, (iii) promoting positive affiliation, (iv) addressing social influences, (v) providing social support and helping participants develop goals and alternatives, (vi) developing positive schools and (vii) enhancing motivation to avoid substance use. We propose that the field use such analyses as the basis of future theory development.
A practical approach to object based requirements analysis
NASA Technical Reports Server (NTRS)
Drew, Daniel W.; Bishop, Michael
1988-01-01
Presented here is an approach developed at the Unisys Houston Operation Division, which supports the early identification of objects. This domain oriented analysis and development concept is based on entity relationship modeling and object data flow diagrams. These modeling techniques, based on the GOOD methodology developed at the Goddard Space Flight Center, support the translation of requirements into objects which represent the real-world problem domain. The goal is to establish a solid foundation of understanding before design begins, thereby giving greater assurance that the system will do what is desired by the customer. The transition from requirements to object oriented design is also promoted by having requirements described in terms of objects. Presented is a five step process by which objects are identified from the requirements to create a problem definition model. This process involves establishing a base line requirements list from which an object data flow diagram can be created. Entity-relationship modeling is used to facilitate the identification of objects from the requirements. An example is given of how semantic modeling may be used to improve the entity-relationship model and a brief discussion on how this approach might be used in a large scale development effort.
Some aspects of doping and medication control in equine sports.
Houghton, Ed; Maynard, Steve
2010-01-01
This chapter reviews drug and medication control in equestrian sports and addresses the rules of racing, the technological advances that have been made in drug detection and the importance of metabolism studies in the development of effective drug surveillance programmes. Typical approaches to screening and confirmatory analysis are discussed, as are the quality processes that underpin these procedures. The chapter also addresses four specific topics relevant to equestrian sports: substances controlled by threshold values, the approach adopted recently by European racing authorities to control some therapeutic substances, anabolic steroids in the horse and LC-MS analysis in drug testing in animal sports and metabolism studies. The purpose of discussing these specific topics is to emphasise the importance of research and development and collaboration to further global harmonisation and the development and support of international rules.
Building Efficiency Evaluation and Uncertainty Analysis with DOE's Asset Score Preview
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2016-08-12
Building Energy Asset Score Tool, developed by the U.S. Department of Energy (DOE), is a program to encourage energy efficiency improvement by helping building owners and managers assess a building's energy-related systems independent of operations and maintenance. The Asset Score Tool uses a simplified EnergyPlus model to provide an assessment of building systems from minimal user inputs of basic building characteristics. Asset Score Preview is a newly developed option that allows users to assess their building's systems and the potential value of a more in-depth analysis via an even more simplified approach. This methodology provides a preliminary approach to estimating a building's energy efficiency and potential for improvement. This paper provides an overview of the methodology used for the development of Asset Score Preview and the scoring methodology.
2010-01-01
mechanisms for international involvement and oversight. Ensure that development approaches are holistic and reflect ethnic and gender realities and needs. No less important than understanding the long-term nature of the...and opportunity for all groups.
Evaluation of Private Sector Roles in Space Resource Development
NASA Astrophysics Data System (ADS)
Lamassoure, Elisabeth S.; Blair, Brad R.; Diaz, Javier; Oderman, Mark; Duke, Michael B.; Vaucher, Marc; Manvi, Ramachandra; Easter, Robert W.
2003-01-01
An integrated engineering and financial modeling approach has been developed and used to evaluate the potential for private sector investment in space resource development, and to assess possible roles of the public sector in fostering private interest. This paper presents the modeling approach and its results for a transportation service using propellant extracted from lunar regolith. The analysis starts with careful case study definition, including an analysis of the customer base and market requirements, which are the basis for design of a modular, scalable space architecture. The derived non-recurring, recurring and operations costs become inputs for a 'standard' financial model, as used in any commercial business plan. This model generates pro forma financial statements, calculates the amount of capitalization required, and generates return-on-equity calculations using two valuation metrics of direct interest to private investors: market enterprise value and multiples of key financial measures. Use of this model on an architecture to sell transportation services in Earth orbit based on lunar propellants shows how to rapidly test various assumptions and identify interesting architectural options, key areas for investment in exploration and technology, or innovative business approaches that could produce an economically viable industry. The same approach can be used to evaluate any other possible private venture in space, and to draw conclusions on the respective roles of NASA and the private sector in space resource development and solar system exploration.
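As a toy illustration of the financial half of such an integrated model, the sketch below discounts a venture's cash flows and compares net present value under two costs of capital. Every figure is an invented placeholder, not a result from the paper.

```python
# Discounted cash flow for a hypothetical space transportation venture; all
# figures are placeholders for illustration only.
def npv(rate, cash_flows):
    """Net present value; cash_flows[0] occurs at t = 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

capex = -900.0                  # up-front development cost, $M (placeholder)
flows = [capex] + [150.0] * 10  # ten years of service revenue, $M (placeholder)
print(npv(0.12, flows))         # negative at a 12% private discount rate
print(npv(0.05, flows))         # positive with cheaper capital -> a policy lever
```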
Data Intensive Analysis of Biomolecular Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Straatsma, TP; Soares, Thereza A.
2007-12-01
The advances in biomolecular modeling and simulation made possible by the availability of increasingly powerful high-performance computing resources are extending molecular simulations to biologically more relevant system sizes and time scales. At the same time, advances in simulation methodologies are allowing more complex processes to be described more accurately. These developments make a systems approach to computational structural biology feasible, but this will require a focused emphasis on the comparative analysis of the increasing number of molecular simulations that are being carried out for biomolecular systems with more realistic models, multi-component environments, and longer simulation times. Just as in the case of the analysis of the large data sources created by the new high-throughput experimental technologies, biomolecular computer simulations contribute to the progress in biology through comparative analysis. The continuing increase in available protein structures allows the comparative analysis of the role of structure and conformational flexibility in protein function, and is the foundation of the discipline of structural bioinformatics. This creates the opportunity to derive general findings from the comparative analysis of molecular dynamics simulations of a wide range of proteins, protein-protein complexes and other complex biological systems. Because of the importance of protein conformational dynamics for protein function, it is essential that the analysis of molecular trajectories is carried out using a novel, more integrative and systematic approach. We are developing a much needed rigorous computer-science-based framework for the efficient analysis of the increasingly large data sets resulting from molecular simulations. Such a suite of capabilities will also provide the required tools for access and analysis of a distributed library of generated trajectories. Our research is focusing on the following areas: (1) the development of an efficient analysis framework for very large scale trajectories on massively parallel architectures, (2) the development of novel methodologies that allow automated detection of events in these very large data sets, and (3) the efficient comparative analysis of multiple trajectories. The goal of the presented work is the development of new algorithms that will allow biomolecular simulation studies to become an integral tool to address the challenges of post-genomic biological research. The strategy to deliver the required data-intensive computing applications that can effectively deal with the volume of simulation data that will become available is based on taking advantage of the capabilities offered by large globally addressable memory architectures. The first requirement is the design of a flexible underlying data structure for single large trajectories that will form an adaptable framework for a wide range of analysis capabilities. The typical approach to trajectory analysis is to process trajectories sequentially, time frame by time frame. This is the implementation found in molecular simulation codes such as NWChem, and it has been designed in this way to be able to run on workstation computers and other architectures with an aggregate amount of memory that would not allow entire trajectories to be held in core. The consequence of this approach is an I/O-dominated solution that scales very poorly on parallel machines.
We are currently using an approach of developing tools specifically intended for use on large-scale machines with sufficient main memory that entire trajectories can be held in core. This greatly reduces the cost of I/O, as trajectories are read only once during the analysis. In our current Data Intensive Analysis (DIANA) implementation, each processor determines and skips to its own entries within the trajectory, which will typically be available in multiple files, and reads the appropriate frames independently from all other processors.
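A schematic of the frame-partitioning idea described above: each rank computes the contiguous block of trajectory frames it owns and seeks to them independently. The fixed-size frame-record layout is a simplifying assumption; real trajectory formats, and DIANA itself, are more involved.

```python
# Sketch of independent, per-rank trajectory frame access; assumes a
# hypothetical fixed-size binary frame record, unlike real trajectory formats.
def my_frames(n_frames: int, rank: int, n_ranks: int) -> range:
    """Contiguous block of frame indices owned by this rank."""
    per = n_frames // n_ranks
    extra = n_frames % n_ranks
    start = rank * per + min(rank, extra)
    stop = start + per + (1 if rank < extra else 0)
    return range(start, stop)

def read_frame(f, index: int, frame_bytes: int) -> bytes:
    """Seek straight to an owned frame and read it, with no coordination."""
    f.seek(index * frame_bytes)
    return f.read(frame_bytes)

for rank in range(4):
    print(rank, list(my_frames(10, rank, 4)))  # ranks own 3, 3, 2, 2 frames
```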
Data Intensive Analysis of Biomolecular Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Straatsma, TP
2008-03-01
The advances in biomolecular modeling and simulation made possible by the availability of increasingly powerful high-performance computing resources are extending molecular simulations to biologically more relevant system sizes and time scales. At the same time, advances in simulation methodologies are allowing more complex processes to be described more accurately. These developments make a systems approach to computational structural biology feasible, but this will require a focused emphasis on the comparative analysis of the increasing number of molecular simulations that are being carried out for biomolecular systems with more realistic models, multi-component environments, and longer simulation times. Just as in the case of the analysis of the large data sources created by the new high-throughput experimental technologies, biomolecular computer simulations contribute to the progress in biology through comparative analysis. The continuing increase in available protein structures allows the comparative analysis of the role of structure and conformational flexibility in protein function, and is the foundation of the discipline of structural bioinformatics. This creates the opportunity to derive general findings from the comparative analysis of molecular dynamics simulations of a wide range of proteins, protein-protein complexes and other complex biological systems. Because of the importance of protein conformational dynamics for protein function, it is essential that the analysis of molecular trajectories is carried out using a novel, more integrative and systematic approach. We are developing a much needed rigorous computer-science-based framework for the efficient analysis of the increasingly large data sets resulting from molecular simulations. Such a suite of capabilities will also provide the required tools for access and analysis of a distributed library of generated trajectories. Our research is focusing on the following areas: (1) the development of an efficient analysis framework for very large scale trajectories on massively parallel architectures, (2) the development of novel methodologies that allow automated detection of events in these very large data sets, and (3) the efficient comparative analysis of multiple trajectories. The goal of the presented work is the development of new algorithms that will allow biomolecular simulation studies to become an integral tool to address the challenges of post-genomic biological research. The strategy to deliver the required data-intensive computing applications that can effectively deal with the volume of simulation data that will become available is based on taking advantage of the capabilities offered by large globally addressable memory architectures. The first requirement is the design of a flexible underlying data structure for single large trajectories that will form an adaptable framework for a wide range of analysis capabilities. The typical approach to trajectory analysis is to process trajectories sequentially, time frame by time frame. This is the implementation found in molecular simulation codes such as NWChem, and it has been designed in this way to be able to run on workstation computers and other architectures with an aggregate amount of memory that would not allow entire trajectories to be held in core. The consequence of this approach is an I/O-dominated solution that scales very poorly on parallel machines.
We are currently using an approach of developing tools specifically intended for use on large-scale machines with sufficient main memory that entire trajectories can be held in core. This greatly reduces the cost of I/O, as trajectories are read only once during the analysis. In our current Data Intensive Analysis (DIANA) implementation, each processor determines and skips to its own entries within the trajectory, which will typically be available in multiple files, and reads the appropriate frames independently from all other processors.
Systems Toxicology: From Basic Research to Risk Assessment
2014-01-01
Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment. PMID:24446777
A Deliberate Practice Approach to Teaching Phylogenetic Analysis
Hobbs, F. Collin; Johnson, Daniel J.; Kearns, Katherine D.
2013-01-01
One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or “one-shot,” in-class activities. Using a deliberate practice instructional approach, we designed a set of five assignments for a 300-level plant systematics course that incrementally introduces the concepts and skills used in phylogenetic analysis. In our assignments, students learned the process of constructing phylogenetic trees through a series of increasingly difficult tasks; thus, skill development served as a framework for building content knowledge. We present results from 5 yr of final exam scores, pre- and postconcept assessments, and student surveys to assess the impact of our new pedagogical materials on student performance related to constructing and interpreting phylogenetic trees. Students improved in their ability to interpret relationships within trees and improved in several aspects related to between-tree comparisons and tree construction skills. Student feedback indicated that most students believed our approach prepared them to engage in tree construction and gave them confidence in their abilities. Overall, our data confirm that instructional approaches implementing deliberate practice address student misconceptions, improve student experiences, and foster deeper understanding of difficult scientific concepts. PMID:24297294
Systems toxicology: from basic research to risk assessment.
Sturla, Shana J; Boobis, Alan R; FitzGerald, Rex E; Hoeng, Julia; Kavlock, Robert J; Schirmer, Kristin; Whelan, Maurice; Wilks, Martin F; Peitsch, Manuel C
2014-03-17
Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment.
[Recent development of metabonomics and its applications in clinical research].
Li, Hao; Jiang, Ying; He, Fu-Chu
2008-04-01
In the post-genomic era, systems biology is central to the biological sciences. Functional genomics approaches such as transcriptomics and proteomics can simultaneously measure massive gene or protein expression changes following drug treatment or other interventions. However, these changes cannot be coupled directly to changes in biological function. As a result, metabonomics and its many near-synonyms (metabolomics, metabolic profiling, etc.) have exploded onto the scientific scene in the past several years. Metabonomics is a rapidly growing research area and a systems approach for the comprehensive, quantitative analysis of the global metabolites in a biological matrix. Analytical chemistry is essential to the development of comprehensive metabonomics investigations. Fundamentally, there are two types of metabonomics approaches: mass spectrometry (MS)-based and nuclear magnetic resonance (NMR)-based methodologies. Metabonomics measurements provide a wealth of information, and interpretation of these data relies mainly on chemometric approaches for large-scale data analysis and visualization, such as principal and independent component analysis, multidimensional scaling, a variety of clustering techniques, and discriminant function analysis, among many others. In this review, recent developments in the analytical and statistical techniques used in metabonomics are summarized. Major applications of metabonomics relevant to clinical and preclinical studies are then reviewed. Applications of metabonomics to the study of liver diseases, cancers, and other diseases have proved useful both as an experimental tool for research into pathogenesis mechanisms and, ultimately, as a tool for diagnosis and for monitoring treatment response in these diseases. Next, the applications of metabonomics in preclinical toxicology are discussed, and the role that metabonomics might play in pharmaceutical research and development is explained, with special reference to the aims and achievements of the Consortium for Metabonomic Toxicology (COMET); the concept of pharmacometabonomics as a way of predicting an individual's response to treatment is also highlighted. Finally, the role of metabonomics in elucidating the functions of unknown or novel enzymes is mentioned.
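To make the chemometrics step concrete, the following sketch runs the kind of analysis named above (PCA followed by discriminant analysis) on a simulated metabolite feature table; all data, dimensions, and the group structure are invented for illustration.

    # Sketch: chemometric pattern recognition on a metabolite intensity matrix,
    # as commonly used in metabonomics (PCA followed by discriminant analysis).
    # Simulated data; real studies would use NMR or MS feature tables.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)
    # 40 samples (20 control, 20 treated) x 200 metabolite features
    control = rng.normal(0.0, 1.0, size=(20, 200))
    treated = rng.normal(0.0, 1.0, size=(20, 200))
    treated[:, :10] += 2.0                 # treatment shifts a few metabolites
    X = np.vstack([control, treated])
    y = np.array([0] * 20 + [1] * 20)

    scores = PCA(n_components=5).fit_transform(X)   # unsupervised overview
    lda = LinearDiscriminantAnalysis().fit(scores, y)
    print("LDA training accuracy on PCA scores:", lda.score(scores, y))

On real data, the same pipeline would typically be preceded by normalization and scaling, and the discriminant model would be validated on held-out samples.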
Schaefbauer, Chris L; Campbell, Terrance R; Senteio, Charles; Siek, Katie A; Bakken, Suzanne; Veinot, Tiffany C
2016-01-01
Objective: We compare 5 health informatics research projects that applied community-based participatory research (CBPR) approaches, with the goal of extending existing CBPR principles to address issues specific to health informatics research. Materials and Methods: We conducted a cross-case analysis of 5 diverse case studies with 1 common element: the integration of CBPR approaches into health informatics research. After reviewing publications and other case-related materials, all coauthors engaged in collaborative discussions focused on CBPR. Researchers mapped each case to an existing CBPR framework, examined each case individually for success factors and barriers, and identified common patterns across cases. Results: Benefits of applying CBPR approaches to health informatics research across the cases included the following: developing more relevant research with wider impact, greater engagement with diverse populations, improved internal validity, more rapid translation of research into action, and the development of people. Challenges of applying CBPR to health informatics research included the need to develop strong, sustainable academic-community partnerships and mismatches related to cultural and temporal factors. Several technology-related challenges, including the need to define ownership of technology outputs and to build technical capacity with community partners, also emerged from our analysis. Finally, we created several principles that extend an existing CBPR framework to specifically address the requirements of health informatics research. Conclusions: Our cross-case analysis yielded valuable insights regarding CBPR implementation in health informatics research and identified lessons useful for future CBPR-based research. The benefits of applying CBPR approaches can be significant, particularly in engaging populations that are typically underserved by health care and in designing patient-facing technology. PMID:26228766
Peeters, Michael J; Vaidya, Varun A
2016-06-25
Objective. To describe an approach for assessing the Accreditation Council for Pharmacy Education's (ACPE) doctor of pharmacy (PharmD) Standard 4.4, which focuses on students' professional development. Methods. This investigation used mixed methods with triangulation of qualitative and quantitative data to assess professional development. Qualitative data came from an electronic developmental portfolio of professionalism and ethics, completed by PharmD students during their didactic studies. Quantitative confirmation came from the Defining Issues Test (DIT), an assessment of pharmacists' professional development. Results. Qualitatively, students' reflections on their development described growth through this course series. Quantitatively, the 2015 PharmD class's DIT N2-scores illustrated positive development overall; the lower 50% showed a large initial improvement compared to the upper 50%. Subsequently, the 2016 PharmD class confirmed these average initial improvements and also showed further substantial development among students thereafter. Conclusion. Applying an assessment-for-learning approach, triangulation of qualitative and quantitative assessments confirmed that PharmD students developed professionally during this course series.
An object-based approach to weather analysis and its applications
NASA Astrophysics Data System (ADS)
Troemel, Silke; Diederich, Malte; Horvath, Akos; Simmer, Clemens; Kumjian, Matthew
2013-04-01
The research group 'Object-based Analysis and SEamless prediction' (OASE) within the Hans Ertel Centre for Weather Research programme (HErZ) pursues an object-based approach to weather analysis. The object-based tracking approach adopts the Lagrangian perspective by identifying convective events and following their development over the course of their lifetime. Prerequisites of the object-based analysis are a high-resolution observational database and a tracking algorithm. A near-real-time radar and satellite remote sensing-driven 3D observation-microphysics composite covering Germany, currently under development, contains gridded observations and estimated microphysical quantities. A 3D scale-space tracking identifies convective rain events in the dual composite and monitors their development over the course of their lifetime. The OASE group exploits the object-based approach in several fields of application: (1) for a better understanding and analysis of the precipitation processes responsible for extreme weather events, (2) in nowcasting, (3) as a novel approach for the validation of meso-γ atmospheric models, and (4) in data assimilation. Results from the different fields of application will be presented. The basic idea of the object-based approach is to identify a small set of radar- and satellite-derived descriptors that characterize the temporal development of the precipitation systems constituting the objects. Such proxies of the precipitation process include, for example, the temporal change of the bright band, vertically extensive columns of enhanced differential reflectivity (ZDR), and the cloud-top temperatures and heights identified in the 4D field of ground-based radar reflectivities and satellite retrievals generated by a cell during its lifetime. They quantify (micro-)physical differences among rain events and relate to the precipitation yield. Analyses of the information content of ZDR columns as precursors of storm evolution, for example, will be presented to demonstrate the use of such system-oriented predictors for nowcasting. Columns of differential reflectivity measured by polarimetric weather radars are prominent signatures associated with thunderstorm updrafts: since stronger updrafts can loft larger drops and water-coated ice particles to higher altitudes above the environmental freezing level, the integrated ZDR column above the freezing level increases with increasing updraft intensity. Validation of atmospheric models concerning the representation or prediction of precipitation is usually confined to comparisons of precipitation fields or their temporal and spatial statistics. A comparison of rain rates alone, however, does not immediately explain discrepancies between models and observations, because similar rain rates might be produced by different processes. Within the event-based approach to model validation, both observed and modeled rain events are analyzed by means of these proxies of the precipitation process. The two sets of descriptors form the basis for model validation, since differing leading descriptors, in a statistical sense, hint at the process formulations potentially responsible for model failures.
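As a concrete illustration of one such proxy, the following sketch integrates enhanced ZDR above the freezing level to obtain a column-based updraft indicator; the grid, spacing, and thresholds are illustrative assumptions, not the actual configuration of the OASE composite.

    # Sketch: integrate differential reflectivity (ZDR) above the freezing level
    # as a column-based updraft proxy. Synthetic 3D grid; values illustrative.
    import numpy as np

    nz, ny, nx = 40, 100, 100
    dz_km = 0.25                               # vertical grid spacing (km)
    heights = np.arange(nz) * dz_km            # km above ground
    freezing_level_km = 3.0

    zdr = np.random.default_rng(1).normal(0.2, 0.3, size=(nz, ny, nx))  # dB
    zdr[12:24, 40:45, 50:55] += 2.0            # embed a synthetic ZDR column

    above = heights >= freezing_level_km
    positive = np.clip(zdr[above], 0.0, None)  # only enhanced (positive) ZDR counts
    zdr_column = positive.sum(axis=0) * dz_km  # dB km, one value per grid column

    print("max integrated ZDR column [dB km]:", zdr_column.max())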
Bioinformatics/biostatistics: microarray analysis.
Eichler, Gabriel S
2012-01-01
The quantity and complexity of the molecular-level data generated in both research and clinical settings require the use of sophisticated, powerful computational interpretation techniques. It is for this reason that bioinformatic analysis of complex molecular profiling data has become a fundamental technology in the development of personalized medicine. This chapter provides a high-level overview of the field of bioinformatics and outlines several classic bioinformatic approaches. The highlighted approaches can be aptly applied to nearly any sort of high-dimensional genomic, proteomic, or metabolomic experiment. Technologies reviewed in this chapter include traditional clustering analysis, the Gene Expression Dynamics Inspector (GEDI), GoMiner, Gene Set Enrichment Analysis (GSEA), and the Learner of Functional Enrichment (LeFE).
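As a small illustration of the first approach named above, the following sketch applies traditional hierarchical clustering to a simulated expression matrix; the data, group structure, and parameter choices are invented for illustration.

    # Sketch: hierarchical clustering of a gene expression matrix, one of the
    # classic bioinformatic approaches mentioned above. Simulated data.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(42)
    # 30 samples x 500 genes; two synthetic sample groups
    group_a = rng.normal(0.0, 1.0, size=(15, 500))
    group_b = rng.normal(0.0, 1.0, size=(15, 500))
    group_b[:, :25] += 1.5                     # differential expression signal
    expr = np.vstack([group_a, group_b])

    tree = linkage(expr, method="average", metric="correlation")
    labels = fcluster(tree, t=2, criterion="maxclust")
    print("cluster assignments:", labels)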
NASA Astrophysics Data System (ADS)
Cazzani, Antonio; Malagù, Marcello; Turco, Emilio
2016-03-01
We illustrate a numerical tool for analyzing plane arches such as those frequently found in historical masonry heritage. It is based on a refined elastic mechanical model derived from the isogeometric approach. In particular, geometry and displacements are modeled by means of non-uniform rational B-splines (NURBS). After a brief introduction outlining the basic assumptions of this approach and the corresponding modeling choices, several numerical applications to arches typical of masonry structures show the performance of this novel technique. These are discussed in detail to emphasize the advantages and potential developments of isogeometric analysis in the field of structural analysis of historical masonry buildings with complex geometries.
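To make the geometric ingredient concrete: a circular arch segment can be represented exactly by a rational quadratic Bezier curve, the simplest special case of a NURBS. The sketch below evaluates such a quarter-circle arc and verifies that every point lies at unit radius; it illustrates only the geometry side of the approach, not the paper's structural solver.

    # Sketch: exact quarter-circle arch geometry as a rational quadratic Bezier
    # curve, the building block of NURBS-based isogeometric descriptions.
    import numpy as np

    ctrl = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])  # control points
    w = np.array([1.0, np.sqrt(0.5), 1.0])                 # rational weights

    def arch_point(t):
        """Evaluate the rational Bezier curve at parameter t in [0, 1]."""
        b = np.array([(1 - t) ** 2, 2 * t * (1 - t), t ** 2])  # Bernstein basis
        return (b * w) @ ctrl / (b * w).sum()

    for t in (0.0, 0.25, 0.5, 0.75, 1.0):
        x, y = arch_point(t)
        print(f"t={t:.2f}  point=({x:.4f}, {y:.4f})  radius={np.hypot(x, y):.6f}")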
Approach for Estimating Exposures and Incremental Health ...
“Approach for Estimating Exposures and Incremental Health Effects from Lead During Renovation, Repair, and Painting Activities in Public and Commercial Buildings” (Technical Approach Document). Also available for public review and comment are two supplementary documents: the detailed appendices for the Technical Approach Document and a supplementary report entitled “Developing a Concentration-Response Function for Pb Exposure and Cardiovascular Disease-Related Mortality.” Together, these documents describe an analysis for estimating the exposures and incremental health effects created by renovations of public and commercial buildings (P&CBs). This analysis could be used to identify and evaluate hazards from renovation, repair, and painting activities in P&CBs. A general overview of how this analysis can be used to inform EPA's hazard finding is described in the Framework document that was previously made available for public comment (79 FR 31072; FRL9910-44). The analysis can be used in any proposed rulemaking to estimate the reduction in deleterious health effects that would result from proposed regulatory requirements to mitigate exposure from P&CB renovation activities. The Technical Approach Document describes in detail how the analyses under this approach were performed and presents the results: the expected changes in blood lead levels and health effects due to lead exposure from renovation activities.
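The overall shape of such an exposure-to-risk calculation can be sketched as follows; the log-linear functional form and every number below are hypothetical placeholders for illustration and are not values from the EPA documents.

    # Sketch: applying a hypothetical log-linear concentration-response function
    # to an increment in blood lead level (BLL). All coefficients are
    # placeholders, NOT values from the Technical Approach Document.
    import math

    def relative_risk(bll_baseline, bll_increment, beta=0.05):
        """Hypothetical log-linear CR function: log(RR) = beta * log(BLL ratio)."""
        return math.exp(beta * math.log((bll_baseline + bll_increment) / bll_baseline))

    baseline = 1.5    # ug/dL, assumed pre-renovation BLL
    increment = 0.3   # ug/dL, assumed renovation-attributable increase
    print(f"hypothetical relative risk: {relative_risk(baseline, increment):.4f}")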
Ocean wavenumber estimation from wave-resolving time series imagery
Plant, N.G.; Holland, K.T.; Haller, M.C.
2008-01-01
We review several approaches that have been used to estimate ocean surface gravity wavenumbers from wave-resolving remotely sensed image sequences. Two fundamentally different approaches that utilize these data exist. A power spectral density approach identifies wavenumbers where image intensity variance is maximized. Alternatively, a cross-spectral correlation approach identifies wavenumbers where intensity coherence is maximized. We develop a solution to the latter approach based on a tomographic analysis that utilizes a nonlinear inverse method. The solution is tolerant of noise and other forms of sampling deficiency and can be applied to arbitrary sampling patterns as well as to full-frame imagery. The solution includes error predictions that can be used for data-retrieval quality control and for evaluating sample designs. A quantitative analysis of the intrinsic resolution of the method indicates that cross-spectral correlation fitting improves resolution by a factor of about ten compared to the power spectral density fitting approach. The resolution analysis also provides a rule of thumb for nearshore bathymetry retrievals: short-scale cross-shore patterns may be resolved if they are about ten times longer than the average water depth over the pattern. This guidance can be applied to sample design to constrain both the sensor array (image resolution) and the analysis array (tomographic resolution). © 2008 IEEE.
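A toy version of the cross-spectral correlation idea can be written down directly: given intensity time series at two pixels a known distance apart, the phase of the cross-spectrum at the wave frequency yields the wavenumber component along their separation. The sketch below demonstrates this on a synthetic monochromatic wave; it illustrates only the principle, not the paper's nonlinear tomographic inversion.

    # Sketch: estimate a wavenumber from the cross-spectral phase between two
    # pixel time series a known distance apart. Synthetic monochromatic wave.
    import numpy as np

    fs = 2.0                       # Hz, frame rate
    t = np.arange(0, 256) / fs
    f_wave, k_true = 0.125, 0.05   # wave frequency (Hz), wavenumber (rad/m)
    dx = 10.0                      # m, separation between the two pixels

    p1 = np.cos(2 * np.pi * f_wave * t)                # intensity at pixel 1
    p2 = np.cos(2 * np.pi * f_wave * t - k_true * dx)  # phase-lagged at pixel 2

    spec1, spec2 = np.fft.rfft(p1), np.fft.rfft(p2)
    cross = spec1 * np.conj(spec2)
    freqs = np.fft.rfftfreq(len(t), d=1 / fs)
    peak = np.argmax(np.abs(cross[1:])) + 1            # skip the mean (bin 0)
    k_est = np.angle(cross[peak]) / dx                 # rad/m

    print(f"peak frequency {freqs[peak]:.3f} Hz, estimated k {k_est:.4f} rad/m")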