A new window of opportunity to reject process-based biotechnology regulation
Marchant, Gary E; Stevens, Yvonne A
2015-01-01
The question of whether biotechnology regulation should be based on the process or the product has long been debated, with different jurisdictions adopting different approaches. The European Union has adopted a process-based approach, Canada has adopted a product-based approach, and the United States has implemented a hybrid system. With the recent proliferation of new methods of genetic modification, such as gene editing, process-based regulatory systems, which are premised on a binary system of transgenic and conventional approaches, will become increasingly obsolete and unsustainable. To avoid unreasonable, unfair and arbitrary results, nations that have adopted process-based approaches will need to migrate to a product-based approach that considers the novelty and risks of the individual trait, rather than the process by which that trait was produced. This commentary suggests some approaches for the design of such a product-based approach. PMID:26930116
Dynamic Approaches to Language Processing
ERIC Educational Resources Information Center
Srinivasan, Narayanan
2007-01-01
Symbolic rule-based approaches have been a preferred way to study language and cognition. Dissatisfaction with rule-based approaches in the 1980s led to alternative approaches to study language, the most notable being the dynamic approaches to language processing. Dynamic approaches provide a significant alternative by not being rule-based and…
Implicit Schemata and Categories in Memory-Based Language Processing
ERIC Educational Resources Information Center
van den Bosch, Antal; Daelemans, Walter
2013-01-01
Memory-based language processing (MBLP) is an approach to language processing based on exemplar storage during learning and analogical reasoning during processing. From a cognitive perspective, the approach is attractive as a model for human language processing because it does not make any assumptions about the way abstractions are shaped, nor any…
Conceptual information processing: A robust approach to KBS-DBMS integration
NASA Technical Reports Server (NTRS)
Lazzara, Allen V.; Tepfenhart, William; White, Richard C.; Liuzzi, Raymond
1987-01-01
Integrating the respective functionality and architectural features of knowledge base and data base management systems is a topic of considerable interest. Several aspects of this topic and associated issues are addressed. The significance of integration and the problems associated with accomplishing that integration are discussed. The shortcomings of current approaches to integration and the need to fuse the capabilities of both knowledge base and data base management systems motivates the investigation of information processing paradigms. One such paradigm is concept based processing, i.e., processing based on concepts and conceptual relations. An approach to robust knowledge and data base system integration is discussed by addressing progress made in the development of an experimental model for conceptual information processing.
ERIC Educational Resources Information Center
Ngo, Chau M.; Trinh, Lap Q.
2011-01-01
The field of English language education has seen developments in writing pedagogy, moving from product-based to process-based and then to genre-based approaches. In Vietnam, teaching secondary school students how to write in English still lags behind these developments. The product-based approach is commonly seen in English writing…
A KPI-based process monitoring and fault detection framework for large-scale processes.
Zhang, Kai; Shardt, Yuri A W; Chen, Zhiwen; Yang, Xu; Ding, Steven X; Peng, Kaixiang
2017-05-01
Large-scale processes, consisting of multiple interconnected subprocesses, are commonly encountered in industrial systems, whose performance needs to be determined. A common approach to this problem is to use a key performance indicator (KPI)-based approach. However, the different KPI-based approaches are not developed with a coherent and consistent framework. Thus, this paper proposes a framework for KPI-based process monitoring and fault detection (PM-FD) for large-scale industrial processes, which considers the static and dynamic relationships between process and KPI variables. For the static case, a least squares-based approach is developed that provides an explicit link with least-squares regression, which gives better performance than partial least squares. For the dynamic case, using the kernel representation of each subprocess, an instrumental variable is used to reduce the dynamic case to the static case. This framework is applied to the TE benchmark process and the hot strip mill rolling process. The results show that the proposed method can detect faults better than previous methods.
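The static case described here lends itself to a compact sketch. The following Python fragment is a minimal illustration, not the paper's algorithm: it fits the KPI-to-process-variable map by ordinary least squares and flags a fault when the squared prediction residual exceeds an empirical control limit taken from normal operating data (the quantile-based limit and all names are our own choices).

```python
import numpy as np

def fit_kpi_monitor(X, y, alpha=0.99):
    """Least-squares map from process variables X (n x d) to KPI y (static case),
    plus a residual control limit estimated from fault-free data."""
    theta, *_ = np.linalg.lstsq(X, y, rcond=None)  # LS regression coefficients
    spe = (y - X @ theta) ** 2                     # squared prediction errors
    return theta, np.quantile(spe, alpha)          # empirical control limit

def is_faulty(x_new, y_new, theta, limit):
    """Flag a KPI-relevant fault when the squared residual exceeds the limit."""
    return (y_new - x_new @ theta) ** 2 > limit
```

The dynamic case would additionally require the instrumental-variable reduction described in the abstract before such a static monitor applies.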
Process-based models are required to manage ecological systems in a changing world
K. Cuddington; M.-J. Fortin; L.R. Gerber; A. Hastings; A. Liebhold; M. O'Connor; C. Ray
2013-01-01
Several modeling approaches can be used to guide management decisions. However, some approaches are better fitted than others to address the problem of prediction under global change. Process-based models, which are based on a theoretical understanding of relevant ecological processes, provide a useful framework to incorporate specific responses to altered...
The Knowledge-Based Software Assistant: Beyond CASE
NASA Technical Reports Server (NTRS)
Carozzoni, Joseph A.
1993-01-01
This paper will outline the similarities and differences between two paradigms of software development. Both support the whole software life cycle and provide automation for most of the software development process, but have different approaches. The CASE approach is based on a set of tools linked by a central data repository. This tool-based approach is data driven and views software development as a series of sequential steps, each resulting in a product. The Knowledge-Based Software Assistant (KBSA) approach, a radical departure from existing software development practices, is knowledge driven and centers around a formalized software development process. KBSA views software development as an incremental, iterative, and evolutionary process with development occurring at the specification level.
A Systems Approach to Nitrogen Delivery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goins, Bobby
A systems based approach will be used to evaluate the nitrogen delivery process. This approach involves principles found in Lean, Reliability, Systems Thinking, and Requirements. This unique combination of principles and thought process yields a very in depth look into the system to which it is applied. By applying a systems based approach to the nitrogen delivery process there should be improvements in cycle time, efficiency, and a reduction in the required number of personnel needed to sustain the delivery process. This will in turn reduce the amount of demurrage charges that the site incurs. In addition there should be less frustration associated with the delivery process.
Social Models: Blueprints or Processes?
ERIC Educational Resources Information Center
Little, Graham R.
1981-01-01
Discusses the nature and implications of two different models for societal planning: (1) the problem-solving process approach based on Karl Popper; and (2) the goal-setting "blueprint" approach based on Karl Marx. (DC)
A proven knowledge-based approach to prioritizing process information
NASA Technical Reports Server (NTRS)
Corsberg, Daniel R.
1991-01-01
Many space-related processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect is rapid analysis of the changing process information. During a disturbance, this task can overwhelm humans as well as computers. Humans deal with this by applying heuristics in determining significant information. A simple, knowledge-based approach to prioritizing information is described. The approach models those heuristics that humans would use in similar circumstances. The approach described has received two patents and was implemented in the Alarm Filtering System (AFS) at the Idaho National Engineering Laboratory (INEL). AFS was first developed for application in a nuclear reactor control room. It has since been used in chemical processing applications, where it has had a significant impact on control room environments. The approach uses knowledge-based heuristics to analyze data from process instrumentation and respond to that data according to knowledge encapsulated in objects and rules. While AFS cannot perform the complete diagnosis and control task, it has proven to be extremely effective at filtering and prioritizing information. AFS was used for over two years as a first level of analysis for human diagnosticians. Given the approach's proven track record in a wide variety of practical applications, it should be useful in both ground- and space-based systems.
ERIC Educational Resources Information Center
Chia, Robert
2017-01-01
Purpose: This paper aims to articulate a practice-based, non-cognitivist approach to organizational learning. Design/methodology/approach: This paper explores the potential contribution of a process-based "practice turn" in social theory for understanding organizational learning. Findings: In complex, turbulent environments, robust…
ERIC Educational Resources Information Center
Paleeri, Sankaranarayanan
2015-01-01
Transaction methods and approaches of value education have to change from lecturing to process-based methods, in line with the development of the constructivist approach. Process-based methods provide creative interpretation and active participation from the students' side. Teachers have to organize suitable activities to transact values through process…
Business process architectures: overview, comparison and framework
NASA Astrophysics Data System (ADS)
Dijkman, Remco; Vanderfeesten, Irene; Reijers, Hajo A.
2016-02-01
With the uptake of business process modelling in practice, the demand grows for guidelines that lead to consistent and integrated collections of process models. The notion of a business process architecture has been explicitly proposed to address this. This paper provides an overview of the prevailing approaches to design a business process architecture. Furthermore, it includes evaluations of the usability and use of the identified approaches. Finally, it presents a framework for business process architecture design that can be used to develop a concrete architecture. The use and usability were evaluated in two ways. First, a survey was conducted among 39 practitioners, in which the opinion of the practitioners on the use and usefulness of the approaches was evaluated. Second, four case studies were conducted, in which process architectures from practice were analysed to determine the approaches or elements of approaches that were used in their design. Both evaluations showed that practitioners have a preference for using approaches that are based on reference models and approaches that are based on the identification of business functions or business objects. At the same time, the evaluations showed that practitioners use these approaches in combination, rather than selecting a single approach.
Echtermeyer, Alexander; Amar, Yehia; Zakrzewski, Jacek; Lapkin, Alexei
2017-01-01
A recently described C(sp³)–H activation reaction to synthesise aziridines was used as a model reaction to demonstrate the methodology of developing a process model using model-based design of experiments (MBDoE) and self-optimisation approaches in flow. The two approaches are compared in terms of experimental efficiency. The self-optimisation approach required the least number of experiments to reach the specified objectives of cost and product yield, whereas the MBDoE approach enabled a rapid generation of a process model.
Hydrological modelling in forested systems
This chapter provides a brief overview of forest hydrology modelling approaches for answering important global research and management questions. Many hundreds of hydrological models have been applied globally across multiple decades to represent and predict forest hydrological processes. The focus of this chapter is on process-based models and approaches, specifically 'forest hydrology models'; that is, physically based simulation tools that quantify compartments of the forest hydrological cycle. Physically based models can be considered those that describe the conservation of mass, momentum and/or energy.
NASA Astrophysics Data System (ADS)
Skersys, Tomas; Butleris, Rimantas; Kapocius, Kestutis
2013-10-01
Approaches for the analysis and specification of business vocabularies and rules are very relevant topics in both Business Process Management and Information Systems Development disciplines. However, in common practice of Information Systems Development, business modeling activities are still of a mostly empirical nature. In this paper, basic aspects of the approach for business vocabularies' semi-automated extraction from business process models are presented. The approach is based on the novel business modeling-level OMG standards "Business Process Model and Notation" (BPMN) and "Semantics for Business Vocabularies and Business Rules" (SBVR), thus contributing to OMG's vision of Model-Driven Architecture (MDA) and to model-driven development in general.
A Mixed Kijima Model Using the Weibull-Based Generalized Renewal Processes
2015-01-01
Generalized Renewal Processes are useful for approaching the rejuvenation of dynamical systems resulting from planned or unplanned interventions. We present new perspectives for the Generalized Renewal Processes in general and for the Weibull-based Generalized Renewal Processes in particular. In contrast to the existing literature, we present a mixed Generalized Renewal Processes approach involving Kijima Type I and II models, allowing one to infer the impact of distinct interventions on the performance of the system under study. The first and second theoretical moments of this model are introduced as well as its maximum likelihood estimation and random sampling approaches. In order to illustrate the usefulness of the proposed Weibull-based Generalized Renewal Processes model, some real data sets involving improving, stable, and deteriorating systems are used. PMID:26197222
Fault detection and diagnosis using neural network approaches
NASA Technical Reports Server (NTRS)
Kramer, Mark A.
1992-01-01
Neural networks can be used to detect and identify abnormalities in real-time process data. Two basic approaches can be used, the first based on training networks using data representing both normal and abnormal modes of process behavior, and the second based on statistical characterization of the normal mode only. Given data representative of process faults, radial basis function networks can effectively identify failures. This approach is often limited by the lack of fault data, but can be facilitated by process simulation. The second approach employs elliptical and radial basis function neural networks and other models to learn the statistical distributions of process observables under normal conditions. Analytical models of failure modes can then be applied in combination with the neural network models to identify faults. Special methods can be applied to compensate for sensor failures, to produce real-time estimation of missing or failed sensors based on the correlations codified in the neural network.
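The second approach, characterizing only the normal operating mode, can be sketched with an off-the-shelf elliptical estimator. The snippet below substitutes scikit-learn's EllipticEnvelope for the elliptical-unit networks the abstract describes; the simulated data and contamination level are illustrative assumptions.

```python
import numpy as np
from sklearn.covariance import EllipticEnvelope

# Learn the normal operating envelope from fault-free process data only
rng = np.random.default_rng(0)
X_normal = rng.normal(loc=[1.0, 5.0, 0.3], scale=[0.05, 0.2, 0.01], size=(500, 3))
detector = EllipticEnvelope(contamination=0.01).fit(X_normal)

# predict() returns +1 for samples inside the envelope, -1 for departures
X_new = np.array([[1.0, 5.1, 0.30],   # normal-looking observation
                  [1.4, 6.5, 0.35]])  # abnormal observation
print(detector.predict(X_new))        # expected: [ 1 -1 ]
```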
A Hough Transform Global Probabilistic Approach to Multiple-Subject Diffusion MRI Tractography
2010-04-01
A global probabilistic fiber tracking approach based on the voting process provided by the Hough transform is introduced in this work, together with criteria for aligning curves and particularly tracts.
Bridging process-based and empirical approaches to modeling tree growth
Harry T. Valentine; Annikki Makela; Annikki Makela
2005-01-01
The gulf between process-based and empirical approaches to modeling tree growth may be bridged, in part, by the use of a common model. To this end, we have formulated a process-based model of tree growth that can be fitted and applied in an empirical mode. The growth model is grounded in pipe model theory and an optimal control model of crown development. Together, the...
ERIC Educational Resources Information Center
Tutlys, Vidmantas; Spöttl, Georg
2017-01-01
Purpose: This paper aims to explore methodological and institutional challenges on application of the work-process analysis approach in the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on the review of scientific literature and the analysis of…
Towards a model-based cognitive neuroscience of stopping - a neuroimaging perspective.
Sebastian, Alexandra; Forstmann, Birte U; Matzke, Dora
2018-07-01
Our understanding of the neural correlates of response inhibition has greatly advanced over the last decade. Nevertheless the specific function of regions within this stopping network remains controversial. The traditional neuroimaging approach cannot capture many processes affecting stopping performance. Despite the shortcomings of the traditional neuroimaging approach and a great progress in mathematical and computational models of stopping, model-based cognitive neuroscience approaches in human neuroimaging studies are largely lacking. To foster model-based approaches to ultimately gain a deeper understanding of the neural signature of stopping, we outline the most prominent models of response inhibition and recent advances in the field. We highlight how a model-based approach in clinical samples has improved our understanding of altered cognitive functions in these disorders. Moreover, we show how linking evidence-accumulation models and neuroimaging data improves the identification of neural pathways involved in the stopping process and helps to delineate these from neural networks of related but distinct functions. In conclusion, adopting a model-based approach is indispensable to identifying the actual neural processes underlying stopping.
NASA Astrophysics Data System (ADS)
Elag, M.; Goodall, J. L.
2013-12-01
Hydrologic modeling often requires the re-use and integration of models from different disciplines to simulate complex environmental systems. Component-based modeling introduces a flexible approach for integrating physical-based processes across disciplinary boundaries. Several hydrologic-related modeling communities have adopted the component-based approach for simulating complex physical systems by integrating model components across disciplinary boundaries in a workflow. However, it is not always straightforward to create these interdisciplinary models due to the lack of sufficient knowledge about a hydrologic process. This shortcoming is a result of using informal methods for organizing and sharing information about a hydrologic process. A knowledge-based ontology provides such standards and is considered the ideal approach for overcoming this challenge. The aims of this research are to present the methodology used in analyzing the basic hydrologic domain in order to identify hydrologic processes, the ontology itself, and how the proposed ontology is integrated with the Water Resources Component (WRC) ontology. The proposed ontology standardizes the definitions of a hydrologic process, the relationships between hydrologic processes, and their associated scientific equations. The objective of the proposed Hydrologic Process (HP) Ontology is to advance the idea of creating a unified knowledge framework for components' metadata by introducing a domain-level ontology for hydrologic processes. The HP ontology is a step toward an explicit and robust domain knowledge framework that can be evolved through the contribution of domain users. Analysis of the hydrologic domain is accomplished using Formal Concept Analysis (FCA), in which the infiltration process, an important hydrologic process, is examined. Two infiltration methods, the Green-Ampt and Philip's methods, were used to demonstrate the implementation of information in the HP ontology. Furthermore, a SPARQL service is provided for semantic-based querying of the ontology.
How Many Batches Are Needed for Process Validation under the New FDA Guidance?
Yang, Harry
2013-01-01
The newly updated FDA Guidance for Industry on Process Validation: General Principles and Practices ushers in a life cycle approach to process validation. While the guidance no longer considers the use of traditional three-batch validation appropriate, it does not prescribe the number of validation batches for a prospective validation protocol, nor does it provide specific methods to determine it. This potentially could leave manufacturers in a quandary. In this paper, I develop a Bayesian method to address the issue. By combining process knowledge gained from Stage 1 Process Design (PD) with expected outcomes of Stage 2 Process Performance Qualification (PPQ), the number of validation batches for PPQ is determined to provide a high level of assurance that the process will consistently produce future batches meeting quality standards. Several examples based on simulated data are presented to illustrate the use of the Bayesian method in helping manufacturers make risk-based decisions for Stage 2 PPQ, and they highlight the advantages of the method over traditional Frequentist approaches. The discussions in the paper lend support for a life cycle and risk-based approach to process validation recommended in the new FDA guidance.
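As a rough illustration of this Bayesian logic (a Beta-Binomial sketch under assumed conformance probabilities, not the paper's exact model), one can search for the smallest number of PPQ batches that, if all conform, yields a target posterior assurance that the batch success rate exceeds a quality threshold; the Beta prior stands in for Stage 1 process-design knowledge.

```python
from scipy.stats import beta

def ppq_batches(a_prior, b_prior, p0=0.95, assurance=0.90, n_max=50):
    """Smallest n such that, if all n PPQ batches conform, the posterior
    probability that the batch success rate exceeds p0 is at least `assurance`.
    Prior Beta(a_prior, b_prior) encodes Stage 1 knowledge (illustrative only)."""
    for n in range(1, n_max + 1):
        posterior = beta(a_prior + n, b_prior)  # update after n conforming batches
        if posterior.sf(p0) >= assurance:       # P(p > p0 | data)
            return n
    return None

print(ppq_batches(a_prior=40, b_prior=1))  # strong prior knowledge -> 5 batches
```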
Cognition-Based Approaches for High-Precision Text Mining
ERIC Educational Resources Information Center
Shannon, George John
2017-01-01
This research improves the precision of information extraction from free-form text via the use of cognitive-based approaches to natural language processing (NLP). Cognitive-based approaches are an important, and relatively new, area of research in NLP and search, as well as linguistics. Cognitive approaches enable significant improvements in both…
Khandpur, Paramjeet; Gogate, Parag R
2016-03-01
The present work evaluates the performance of ultrasound based sterilization approaches for processing of different fruit and vegetable juices in terms of microbial growth and changes in the quality parameters during the storage. Comparison with the conventional thermal processing has also been presented. A novel approach based on combination of ultrasound with ultraviolet irradiation and crude extract of essential oil from orange peels has been used for the first time. Identification of the microbial growth (total bacteria and yeast content) in the juices during the subsequent storage and assessing the safety for human consumption along with the changes in the quality parameters (Brix, titratable acidity, pH, ORP, salt, conductivity, TSS and TDS) has been investigated in detail. The optimized ultrasound parameters for juice sterilization were established as ultrasound power of 100 W and treatment time of 15 min for the constant frequency operation (20 kHz). It has been established that more than 5 log reduction was achieved using the novel combined approaches based on ultrasound. The treated juices using different approaches based on ultrasound also showed lower microbial growth and improved quality characteristics as compared to the thermally processed juice. Scale up studies were also performed using spinach juice as the test sample with processing at 5 L volume for the first time. The ultrasound treated juice satisfied the microbiological and physiochemical safety limits in refrigerated storage conditions for 20 days for the large scale processing. Overall the present work conclusively established the usefulness of combined treatment approaches based on ultrasound for maintaining the microbiological safety of beverages with enhanced shelf life and excellent quality parameters as compared to the untreated and thermally processed juices.
Model-Driven Useware Engineering
NASA Astrophysics Data System (ADS)
Meixner, Gerrit; Seissler, Marc; Breiner, Kai
User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.
Effect of Inquiry-Based Learning Approach on Student Resistance in a Science and Technology Course
ERIC Educational Resources Information Center
Sever, Demet; Guven, Meral
2014-01-01
The aim of this study was to identify the resistance behaviors of 7th grade students exhibited during their Science and Technology course teaching-learning processes, and to remove the identified resistance behaviors through teaching-learning processes that were constructed based on the inquiry-based learning approach. In the quasi-experimentally…
Liu, Chunbo; Pan, Feng; Li, Yun
2016-07-29
Glutamate is of great importance in food and pharmaceutical industries. There is still a lack of effective statistical approaches for fault diagnosis in the fermentation process of glutamate. To date, the statistical approach based on generalized additive model (GAM) and bootstrap has not been used for fault diagnosis in fermentation processes, much less the fermentation process of glutamate with small sample sets. A combined approach of GAM and bootstrap was developed for the online fault diagnosis in the fermentation process of glutamate with small sample sets. GAM was first used to model the relationship between glutamate production and different fermentation parameters using online data from four normal fermentation experiments of glutamate. The fitted GAM with fermentation time, dissolved oxygen, oxygen uptake rate and carbon dioxide evolution rate captured 99.6 % of the variance of glutamate production during the fermentation process. Bootstrap was then used to quantify the uncertainty of the estimated production of glutamate from the fitted GAM using a 95 % confidence interval. The proposed approach was then used for the online fault diagnosis in the abnormal fermentation processes of glutamate, and a fault was defined as the estimated production of glutamate falling outside the 95 % confidence interval. The online fault diagnosis based on the proposed approach identified not only the start of the fault in the fermentation process, but also the end of the fault when the fermentation conditions were back to normal. The proposed approach only used a small sample set from normal fermentation experiments to establish the approach, and then only required online recorded data on fermentation parameters for fault diagnosis in the fermentation process of glutamate. The proposed approach based on GAM and bootstrap provides a new and effective way for the fault diagnosis in the fermentation process of glutamate with small sample sets.
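A minimal sketch of the GAM-plus-bootstrap idea follows. It assumes the pygam package; the column order (time, dissolved oxygen, oxygen uptake rate, carbon dioxide evolution rate), the number of resamples, and the out-of-band fault rule are illustrative choices rather than the authors' exact procedure.

```python
import numpy as np
from pygam import LinearGAM, s  # assumes the pygam package is installed

def gam_fault_monitor(X, y, n_boot=200, seed=0):
    """Fit a GAM of glutamate production on four fermentation parameters,
    bootstrap the training rows for a 95% band, and flag points outside it."""
    rng = np.random.default_rng(seed)
    gam = LinearGAM(s(0) + s(1) + s(2) + s(3)).fit(X, y)  # time, DO, OUR, CER
    boot_preds = []
    for _ in range(n_boot):                 # resample rows with replacement
        idx = rng.integers(0, len(y), len(y))
        boot_preds.append(LinearGAM(s(0) + s(1) + s(2) + s(3))
                          .fit(X[idx], y[idx]).predict(X))
    lo, hi = np.percentile(boot_preds, [2.5, 97.5], axis=0)
    fault = (y < lo) | (y > hi)             # outside the 95% band -> fault
    return gam, lo, hi, fault
```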
A systems-based approach for integrated design of materials, products and design process chains
NASA Astrophysics Data System (ADS)
Panchal, Jitesh H.; Choi, Hae-Jin; Allen, Janet K.; McDowell, David L.; Mistree, Farrokh
2007-12-01
The concurrent design of materials and products provides designers with flexibility to achieve design objectives that were not previously accessible. However, the improved flexibility comes at a cost of increased complexity of the design process chains and the materials simulation models used for executing the design chains. Efforts to reduce the complexity generally result in increased uncertainty. We contend that a systems based approach is essential for managing both the complexity and the uncertainty in design process chains and simulation models in concurrent material and product design. Our approach is based on simplifying the design process chains systematically such that the resulting uncertainty does not significantly affect the overall system performance. Similarly, instead of striving for accurate models for multiscale systems (that are inherently complex), we rely on making design decisions that are robust to uncertainties in the models. Accordingly, we pursue hierarchical modeling in the context of design of multiscale systems. In this paper our focus is on design process chains. We present a systems based approach, premised on the assumption that complex systems can be designed efficiently by managing the complexity of design process chains. The approach relies on (a) the use of reusable interaction patterns to model design process chains, and (b) consideration of design process decisions using value-of-information based metrics. The approach is illustrated using a Multifunctional Energetic Structural Material (MESM) design example. Energetic materials store considerable energy which can be released through shock-induced detonation; conventionally, they are not engineered for strength properties. The design objectives for the MESM in this paper include both sufficient strength and energy release characteristics. The design is carried out by using models at different length and time scales that simulate different aspects of the system. Finally, by applying the method to the MESM design problem, we show that the integrated design of materials and products can be carried out more efficiently by explicitly accounting for design process decisions with the hierarchy of models.
Accelerated rescaling of single Monte Carlo simulation runs with the Graphics Processing Unit (GPU).
Yang, Owen; Choi, Bernard
2013-01-01
To interpret fiber-based and camera-based measurements of remitted light from biological tissues, researchers typically use analytical models, such as the diffusion approximation to light transport theory, or stochastic models, such as Monte Carlo modeling. To achieve rapid (ideally real-time) measurement of tissue optical properties, especially in clinical situations, there is a critical need to accelerate Monte Carlo simulation runs. In this manuscript, we report on our approach using the Graphics Processing Unit (GPU) to accelerate rescaling of single Monte Carlo runs to calculate rapidly diffuse reflectance values for different sets of tissue optical properties. We selected MATLAB to enable non-specialists in C and CUDA-based programming to use the generated open-source code. We developed a software package with four abstraction layers. To calculate a set of diffuse reflectance values from a simulated tissue with homogeneous optical properties, our rescaling GPU-based approach achieves a reduction in computation time of several orders of magnitude as compared to other GPU-based approaches. Specifically, our GPU-based approach generated a diffuse reflectance value in 0.08ms. The transfer time from CPU to GPU memory currently is a limiting factor with GPU-based calculations. However, for calculation of multiple diffuse reflectance values, our GPU-based approach still can lead to processing that is ~3400 times faster than other GPU-based approaches.
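The rescaling arithmetic at the heart of this approach can be shown on the CPU with NumPy (the paper performs the analogous per-photon operations on the GPU). This sketch applies Beer-Lambert weight rescaling of one baseline simulation to a new absorption coefficient; detector geometry and scattering rescaling are omitted, and the path-length data are mock values.

```python
import numpy as np

def rescale_reflectance(path_lengths, mua_ref, mua_new):
    """Rescale detected-photon weights from a single baseline Monte Carlo run
    (absorption mua_ref) to a new absorption mua_new via Beer-Lambert scaling
    of each photon's total path length; returns the mean weight per photon."""
    weights = np.exp(-(mua_new - mua_ref) * path_lengths)
    return weights.mean()

paths_cm = np.random.default_rng(1).exponential(scale=2.0, size=100_000)  # mock
print(rescale_reflectance(paths_cm, mua_ref=0.0, mua_new=0.1))  # mua in 1/cm
```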
Wu, Bian; Wang, Minhong; Grotzer, Tina A; Liu, Jun; Johnson, Janice M
2016-08-22
Practical experience with clinical cases has played an important role in supporting the learning of clinical reasoning. However, learning through practical experience involves complex processes that are difficult for students to capture. This study aimed to examine the effects of a computer-based cognitive-mapping approach that helps students to externalize the reasoning process and the knowledge underlying the reasoning process when they work with clinical cases. A comparison between the cognitive-mapping approach and the verbal-text approach was made by analyzing their effects on learning outcomes. Fifty-two third-year or higher students from two medical schools participated in the study. Students in the experimental group used the computer-based cognitive-mapping approach, while the control group used the verbal-text approach, to make sense of their thinking and actions when they worked with four simulated cases over 4 weeks. For each case, students in both groups reported their reasoning process (involving data capture, hypotheses formulation, and reasoning with justifications) and the underlying knowledge (involving identified concepts and the relationships between the concepts) using the given approach. The learning products (cognitive maps or verbal text) revealed that students in the cognitive-mapping group outperformed those in the verbal-text group in the reasoning process, but not in making sense of the knowledge underlying the reasoning process. No significant differences were found in a knowledge posttest between the two groups. The computer-based cognitive-mapping approach has shown a promising advantage over the verbal-text approach in improving students' reasoning performance. Further studies are needed to examine the effects of the cognitive-mapping approach in improving the construction of subject-matter knowledge on the basis of practical experience.
Predictive Models for Semiconductor Device Design and Processing
NASA Technical Reports Server (NTRS)
Meyyappan, Meyya; Arnold, James O. (Technical Monitor)
1998-01-01
The device feature size continues to be on a downward trend with a simultaneous upward trend in wafer size to 300 mm. Predictive models are needed more than ever before for this reason. At NASA Ames, a Device and Process Modeling effort has been initiated recently with a view to address these issues. Our activities cover sub-micron device physics, process and equipment modeling, computational chemistry and material science. This talk will outline these efforts and emphasize the interaction among various components. The device physics component is largely based on integrating quantum effects into device simulators. We have two parallel efforts, one based on a quantum mechanics approach and the second, a semiclassical hydrodynamics approach with quantum correction terms. Under the first approach, three different quantum simulators are being developed and compared: a nonequilibrium Green's function (NEGF) approach, a Wigner function approach, and a density matrix approach. In this talk, results using various codes will be presented. Our process modeling work focuses primarily on epitaxy and etching using first-principles models coupling reactor level and wafer level features. For the latter, we are using a novel approach based on Level Set theory. Sample results from this effort will also be presented.
Aligning observed and modelled behaviour based on workflow decomposition
NASA Astrophysics Data System (ADS)
Wang, Lu; Du, YuYue; Liu, Wei
2017-09-01
When business processes are mostly supported by information systems, the availability of event logs generated from these systems, as well as the demand for appropriate process models, increases. Business processes can be discovered, monitored and enhanced by extracting process-related information. However, some events cannot be correctly identified because of the explosion of the amount of event logs. Therefore, a new process mining technique is proposed based on a workflow decomposition method in this paper. Petri nets (PNs) are used to describe business processes, and then conformance checking of event logs and process models is investigated. A decomposition approach is proposed to divide large process models and event logs into several separate parts that can be analysed independently, while an alignment approach based on a state equation method in PN theory enhances the performance of conformance checking. Both approaches are implemented in the process mining framework ProM. The correctness and effectiveness of the proposed methods are illustrated through experiments.
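The authors implement their technique in ProM; as a rough Python analogue of alignment-based conformance checking (not the paper's decomposition method), the pm4py library exposes a similar workflow. The log file name below is hypothetical.

```python
import pm4py  # assumes the pm4py library; the paper's implementation is in ProM

# Load an event log and discover a Petri net to align against
log = pm4py.read_xes("orders.xes")                    # hypothetical log file
net, im, fm = pm4py.discover_petri_net_inductive(log)

# Optimal alignments between observed traces and the model; deviations in an
# alignment pinpoint where the log and the model disagree
alignments = pm4py.conformance_diagnostics_alignments(log, net, im, fm)
print(alignments[0]["fitness"])
```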
Service-based analysis of biological pathways
Zheng, George; Bouguettaya, Athman
2009-01-01
Background Computer-based pathway discovery is concerned with two important objectives: pathway identification and analysis. Conventional mining and modeling approaches aimed at pathway discovery are often effective at achieving either objective, but not both. Such limitations can be effectively tackled leveraging a Web service-based modeling and mining approach. Results Inspired by molecular recognition and drug discovery processes, we developed a Web service mining tool, named PathExplorer, to discover potentially interesting biological pathways linking service models of biological processes. The tool uses an innovative approach to identify useful pathways based on graph-based hints and service-based simulation verifying the user's hypotheses. Conclusion Web service modeling of biological processes allows the easy access and invocation of these processes on the Web. Web service mining techniques described in this paper enable the discovery of biological pathways linking these process service models. Algorithms presented in this paper for automatically highlighting interesting subgraphs within an identified pathway network enable the user to formulate hypotheses, which can be tested using the simulation algorithm also described in this paper. PMID:19796403
Enhancing the Teaching-Learning Process: A Knowledge Management Approach
ERIC Educational Resources Information Center
Bhusry, Mamta; Ranjan, Jayanthi
2012-01-01
Purpose: The purpose of this paper is to emphasize the need for knowledge management (KM) in the teaching-learning process in technical educational institutions (TEIs) in India, and to assert the impact of information technology (IT) based KM intervention in the teaching-learning process. Design/methodology/approach: The approach of the paper is…
NASA Astrophysics Data System (ADS)
Armando, Alessandro; Giunchiglia, Enrico; Ponta, Serena Elisa
We present an approach to the formal specification and automatic analysis of business processes under authorization constraints based on the action language C. The use of C allows for a natural and concise modeling of the business process and the associated security policy and for the automatic analysis of the resulting specification by using the Causal Calculator (CCALC). Our approach improves upon previous work by greatly simplifying the specification step while retaining the ability to perform a fully automatic analysis. To illustrate the effectiveness of the approach we describe its application to a version of a business process taken from the banking domain and use CCALC to determine resource allocation plans complying with the security policy.
The Utility of the Pattern of the Strengths and Weaknesses Approach
ERIC Educational Resources Information Center
Fiorello, Catherine A.; Flanagan, Dawn P.; Hale, James B.
2014-01-01
Unlike ability-achievement discrepancy and response-to-intervention approaches, the processing strengths and weaknesses (PSW) approach is the only empirically based approach that attempts to identify the pattern of deficit in the basic psychological processes that interferes with academic achievement for children with specific learning…
Risk based decision tool for space exploration missions
NASA Technical Reports Server (NTRS)
Meshkat, Leila; Cornford, Steve; Moran, Terrence
2003-01-01
This paper presents an approach and corresponding tool to assess and analyze the risks involved in a mission during the pre-phase A design process. This approach is based on creating a risk template for each subsystem expert involved in the mission design process and defining appropriate interactions between the templates.
DOT National Transportation Integrated Search
2009-04-01
The Wilmington Area Planning Council takes an objectives-driven, performance-based approach to its metropolitan transportation planning, including paying special attention to integrating its Congestion Management Process into its planning efforts. Th...
Strategic Planning: A (Site) Sight-Based Approach to Curriculum and Staff Development.
ERIC Educational Resources Information Center
Johnson, Daniel P.
The purpose of (Colorado's) Clear Creek School District's strategic planning process has been to develop basic district-wide parameters to promote instructional improvement through a process of shared leadership. The approach is termed "sight-based" to indicate the school district's commitment to connecting curriculum and…
Sánchez, José; Guarnes, Miguel Ángel; Dormido, Sebastián
2009-01-01
This paper is an experimental study of the utilization of different event-based strategies for the automatic control of a simple but very representative industrial process: the level control of a tank. In an event-based control approach it is the triggering of a specific event, and not the time, that instructs the sensor to send the current state of the process to the controller, and the controller to compute a new control action and send it to the actuator. In the document, five control strategies based on different event-based sampling techniques are described, compared, and contrasted with a classical time-based control approach and a hybrid one. The common denominator in the time, the hybrid, and the event-based control approaches is the controller: a proportional-integral algorithm with adaptations depending on the selected control approach. To compare and contrast each one of the hybrid and the pure event-based control algorithms with the time-based counterpart, the two tasks that a control strategy must achieve (set-point following and disturbance rejection) are independently analyzed. The experimental study provides new evidence concerning the ability of event-based control strategies to minimize the data exchange among the control agents (sensors, controllers, actuators) when an error-free control of the process is not a hard requirement. PMID:22399975
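One of the simplest event-based sampling schemes of the kind compared in the paper is send-on-delta. The sketch below is a generic illustration, not the authors' exact controller adaptations: the sensor transmits only when the level has moved more than a deadband since the last transmission, and the PI controller integrates over the elapsed inter-event time.

```python
def send_on_delta(level, last_sent, delta=0.01):
    """Event trigger: transmit only when the level has moved more than delta."""
    return abs(level - last_sent) > delta

class EventPI:
    """PI controller recomputed only on events (illustrative sketch)."""
    def __init__(self, kp, ki, setpoint):
        self.kp, self.ki, self.sp = kp, ki, setpoint
        self.integral, self.u = 0.0, 0.0

    def on_event(self, level, dt_since_last_event):
        error = self.sp - level
        self.integral += error * dt_since_last_event  # integrate elapsed time
        self.u = self.kp * error + self.ki * self.integral
        return self.u  # actuator holds this value until the next event
```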
Estimating Missing Unit Process Data in Life Cycle Assessment Using a Similarity-Based Approach.
Hou, Ping; Cai, Jiarui; Qu, Shen; Xu, Ming
2018-05-01
In life cycle assessment (LCA), collecting unit process data from the empirical sources (i.e., meter readings, operation logs/journals) is often costly and time-consuming. We propose a new computational approach to estimate missing unit process data solely relying on limited known data based on a similarity-based link prediction method. The intuition is that similar processes in a unit process network tend to have similar material/energy inputs and waste/emission outputs. We use the ecoinvent 3.1 unit process data sets to test our method in four steps: (1) dividing the data sets into a training set and a test set; (2) randomly removing certain numbers of data in the test set indicated as missing; (3) using similarity-weighted means of various numbers of most similar processes in the training set to estimate the missing data in the test set; and (4) comparing estimated data with the original values to determine the performance of the estimation. The results show that missing data can be accurately estimated when less than 5% data are missing in one process. The estimation performance decreases as the percentage of missing data increases. This study provides a new approach to compile unit process data and demonstrates a promising potential of using computational approaches for LCA data compilation.
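A minimal NumPy sketch of the estimation step follows; cosine similarity computed on jointly observed entries and a fixed k are illustrative stand-ins for the similarity indices from the link-prediction literature that the paper evaluates.

```python
import numpy as np

def estimate_missing(target, processes, k=5):
    """Fill NaN entries of `target` with the similarity-weighted mean of the
    k most similar rows of `processes` (complete unit-process vectors)."""
    obs = ~np.isnan(target)
    sims = np.array([
        np.dot(p[obs], target[obs])
        / (np.linalg.norm(p[obs]) * np.linalg.norm(target[obs]) + 1e-12)
        for p in processes
    ])
    top = np.argsort(sims)[-k:]                     # k most similar processes
    w = sims[top] / sims[top].sum()                 # similarity weights
    filled = target.copy()
    filled[~obs] = (w[:, None] * processes[top][:, ~obs]).sum(axis=0)
    return filled
```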
NASA Technical Reports Server (NTRS)
Li, Zhenlong; Hu, Fei; Schnase, John L.; Duffy, Daniel Q.; Lee, Tsengdar; Bowen, Michael K.; Yang, Chaowei
2016-01-01
Climate observations and model simulations are producing vast amounts of array-based spatiotemporal data. Efficient processing of these data is essential for assessing global challenges such as climate change, natural disasters, and diseases. This is challenging not only because of the large data volume, but also because of the intrinsic high-dimensional nature of geoscience data. To tackle this challenge, we propose a spatiotemporal indexing approach to efficiently manage and process big climate data with MapReduce in a highly scalable environment. Using this approach, big climate data are directly stored in a Hadoop Distributed File System in its original, native file format. A spatiotemporal index is built to bridge the logical array-based data model and the physical data layout, which enables fast data retrieval when performing spatiotemporal queries. Based on the index, a data-partitioning algorithm is applied to enable MapReduce to achieve high data locality, as well as balancing the workload. The proposed indexing approach is evaluated using the National Aeronautics and Space Administration (NASA) Modern-Era Retrospective Analysis for Research and Applications (MERRA) climate reanalysis dataset. The experimental results show that the index can significantly accelerate querying and processing (10× speedup compared to the baseline test using the same computing cluster), while keeping the index-to-data ratio small (0.0328). The applicability of the indexing approach is demonstrated by a climate anomaly detection deployed on a NASA Hadoop cluster. This approach is also able to support efficient processing of general array-based spatiotemporal data in various geoscience domains without special configuration on a Hadoop cluster.
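A toy version of such a spatiotemporal index fits in a few lines: it maps a (variable, time, spatial tile) key to the physical chunk location so that a query, and the MapReduce tasks behind it, read only the relevant bytes. All names, tiles and sizes below are illustrative, not MERRA's actual layout.

```python
# (variable, month, tile) -> (file, byte offset, length) of the array chunk
index = {
    ("T2M", "2000-01", "lat00-30_lon00-30"): ("merra_2000.nc4", 0, 4_194_304),
    ("T2M", "2000-01", "lat30-60_lon00-30"): ("merra_2000.nc4", 4_194_304, 4_194_304),
}

def query(var, month, tiles):
    """Return the physical chunks a MapReduce job must read for a query,
    enabling data locality by scheduling tasks near these offsets."""
    return [index[(var, month, t)] for t in tiles if (var, month, t) in index]

print(query("T2M", "2000-01", ["lat00-30_lon00-30"]))
```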
NASA Astrophysics Data System (ADS)
Kaiser, C.; Roll, K.; Volk, W.
2017-09-01
In the automotive industry, the manufacturing of automotive outer panels requires hemming processes in which two sheet metal parts are joined together by bending the flange of the outer part over the inner part. Because of decreasing development times and the steadily growing number of vehicle derivatives, an efficient digital product and process validation is necessary. Commonly used simulations, which are based on the finite element method, demand significant modelling effort, which results in disadvantages especially in the early product development phase. To increase the efficiency of designing hemming processes this paper presents a hemming-specific metamodel approach. The approach includes a part analysis in which the outline of the automotive outer panels is initially split into individual segments. By doing a parametrization of each of the segments and assigning basic geometric shapes, the outline of the part is approximated. Based on this, the hemming parameters such as flange length, roll-in, wrinkling and plastic strains are calculated for each of the geometric basic shapes by performing a meta-model-based segmental product validation. The metamodel is based on an element similar formulation that includes a reference dataset of various geometric basic shapes. A random automotive outer panel can now be analysed and optimized based on the hemming-specific database. By implementing this approach into a planning system, an efficient optimization of designing hemming processes will be enabled. Furthermore, valuable time and cost benefits can be realized in a vehicle’s development process.
ERIC Educational Resources Information Center
Pandey, Anjali
2012-01-01
This article calls for a rethinking of pure process-based approaches in the teaching of second language writers in the middle school classroom. The author provides evidence from a detailed case study of the writing of a Korean middle school student in a U.S. school setting to make a case for rethinking the efficacy of classic process-based…
A Multi-Systemic School-Based Approach for Addressing Childhood Aggression
ERIC Educational Resources Information Center
Runions, Kevin
2008-01-01
School-based approaches to addressing aggression in the early grades have focused on explicit curriculum addressing social and emotional processes. The current study reviews research on the distinct modes of aggression, the status of current research on social and emotional processing relevant to problems of aggression amongst young children, as…
We propose multi-faceted research to enhance our understanding of NH3 emissions from livestock feeding operations. A process-based emissions modeling approach will be used, and we will investigate ammonia emissions from the scale of the individual farm out to impacts on region...
Process-Based Governance in Public Administrations Using Activity-Based Costing
NASA Astrophysics Data System (ADS)
Becker, Jörg; Bergener, Philipp; Räckers, Michael
Decision- and policy-makers in public administrations currently lack relevant information for sufficient governance. In Germany, the introduction of New Public Management and double-entry accounting gives public administrations the opportunity to use cost-centered accounting mechanisms to establish new governance mechanisms. Process modelling in this case can be a useful instrument to help public administrations' decision- and policy-makers structure their activities and capture relevant information. In combination with approaches like Activity-Based Costing, higher management levels can be supported with a reasonable data base for fruitful and reasonable governance approaches. Therefore, the aim of this article is to combine the public-sector-specific process modelling method PICTURE and the concept of activity-based costing for supporting public administrations in process-based governance.
Sliding mode control: an approach to regulate nonlinear chemical processes
Camacho; Smith
2000-01-01
A new approach for the design of sliding mode controllers, based on a first-order-plus-deadtime model of the process, is developed. This approach results in a fixed-structure controller with a set of tuning equations as a function of the characteristic parameters of the model. The controller performance is judged by simulations on two nonlinear chemical processes.
Truccolo, Wilson
2017-01-01
This review presents a perspective on capturing collective dynamics in recorded neuronal ensembles based on multivariate point process models, inference of low-dimensional dynamics and coarse graining of spatiotemporal measurements. A general probabilistic framework for continuous time point processes is reviewed, with an emphasis on multivariate nonlinear Hawkes processes with exogenous inputs. A point process generalized linear model (PP-GLM) framework for the estimation of discrete time multivariate nonlinear Hawkes processes is described. The approach is illustrated with the modeling of collective dynamics in neocortical neuronal ensembles recorded in human and non-human primates, and prediction of single-neuron spiking. A complementary approach to capture collective dynamics based on low-dimensional dynamics (“order parameters”) inferred via latent state-space models with point process observations is presented. The approach is illustrated by inferring and decoding low-dimensional dynamics in primate motor cortex during naturalistic reach and grasp movements. Finally, we briefly review hypothesis tests based on conditional inference and spatiotemporal coarse graining for assessing collective dynamics in recorded neuronal ensembles. PMID:28336305
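In its simplest linear form, the discrete-time PP-GLM estimation mentioned here reduces to a Poisson regression on spike-history covariates. The sketch below uses statsmodels with simulated spike trains; it is a stand-in for the multivariate nonlinear Hawkes setting of the review, not its exact models.

```python
import numpy as np
import statsmodels.api as sm

def fit_ppglm(spikes, n_lags=10):
    """Discrete-time PP-GLM: spike counts of neuron 0 modeled as Poisson with
    a log rate that is linear in the ensemble's spike-history lags."""
    X = np.hstack([np.roll(spikes, lag, axis=0) for lag in range(1, n_lags + 1)])
    X, y = X[n_lags:], spikes[n_lags:, 0]   # drop rows with wrapped-around lags
    return sm.GLM(y, sm.add_constant(X), family=sm.families.Poisson()).fit()

spikes = (np.random.default_rng(2).random((2000, 4)) < 0.05).astype(float)
print(fit_ppglm(spikes).params[:3])  # baseline plus first history weights
```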
NASA Astrophysics Data System (ADS)
Wang, Xiaodong; Zhang, Xiaoyu; Cai, Hongming; Xu, Boyi
Enacting a supply-chain process involves various partners and different IT systems. REST receives increasing attention for distributed systems with loosely coupled resources. Nevertheless, resource model incompatibilities and conflicts prevent effective process modeling and deployment in a resource-centric Web service environment. In this paper, a Petri-net based framework for supply-chain process integration is proposed. A resource meta-model is constructed to represent the basic information of resources. Then, based on the resource meta-model, XML schemas and documents are derived, which represent resources and their states in the Petri-net. Thereafter, XML-net, a high-level Petri-net, is employed for modeling the control and data flow of the process. From the process model in XML-net, RESTful services and choreography descriptions are deduced. Therefore, a unified resource representation and RESTful service descriptions are proposed for more effective cross-system integration. A case study is given to illustrate the approach and the desirable features of the approach are discussed.
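For readers unfamiliar with the Petri-net machinery underlying the framework, here is a toy token-firing sketch; the place and transition names are invented, and the XML-net and REST layers are omitted.

```python
# Minimal Petri-net sketch: places hold tokens, and a transition fires
# when all its input places are marked, moving tokens to its outputs.
# The net below models a toy order -> ship step; names are invented.

marking = {"order_received": 1, "stock_checked": 1, "shipped": 0}
transitions = {
    "ship": {"inputs": ["order_received", "stock_checked"],
             "outputs": ["shipped"]},
}

def fire(name):
    t = transitions[name]
    if all(marking[p] > 0 for p in t["inputs"]):   # transition enabled?
        for p in t["inputs"]:
            marking[p] -= 1                        # consume input tokens
        for p in t["outputs"]:
            marking[p] += 1                        # produce output tokens
        return True
    return False

print("fired:", fire("ship"), "->", marking)
```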
Kalbar, Pradip P; Karmakar, Subhankar; Asolekar, Shyam R
2013-10-15
The application of multiple-attribute decision-making (MADM) to real life decision problems suggests that avoiding the loss of information through scenario-based approaches and including expert opinions in the decision-making process are two major challenges that require more research efforts. Recently, a wastewater treatment technology selection effort has been made with a 'scenario-based' method of MADM. This paper focuses on a novel approach to incorporate expert opinions into the scenario-based decision-making process, as expert opinions play a major role in the selection of treatment technologies. The sets of criteria and the indicators that are used consist of both qualitative and quantitative criteria. The group decision-making (GDM) approach that is implemented for aggregating expert opinions is based on the analytical hierarchy process (AHP), which is the most widely used MADM method. The pairwise comparison matrices (PCMs) for qualitative criteria are formed based on expert opinions, whereas a novel approach is proposed for generating PCMs for quantitative criteria. It has been determined that the experts largely prefer natural treatment systems because they are more sustainable in any scenario. However, PCMs based on expert opinions suggest that advanced technologies such as the sequencing batch reactor (SBR) can also be appropriate for a given decision scenario. The proposed GDM approach is a rationalized process that will be more appropriate in realistic scenarios where multiple stakeholders with local and regional societal priorities are involved in the selection of treatment technology. Copyright © 2013 Elsevier Ltd. All rights reserved.
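The AHP step at the heart of this GDM approach reduces each PCM to a priority-weight vector, conventionally via the principal eigenvector, together with a consistency check. A small illustration with a made-up 3x3 matrix:

```python
import numpy as np

# AHP priority weights from a pairwise comparison matrix (PCM) via the
# principal eigenvector, with the standard consistency ratio check.
# The 3x3 matrix below is a made-up example for three criteria.

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)          # principal eigenvalue
w = np.abs(vecs[:, k].real)
w /= w.sum()                      # normalized priority weights

n = A.shape[0]
ci = (vals[k].real - n) / (n - 1)    # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random index
print("weights:", np.round(w, 3), "CR:", round(ci / ri, 3))
```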
[GSH fermentation process modeling using entropy-criterion based RBF neural network model].
Tan, Zuoping; Wang, Shitong; Deng, Zhaohong; Du, Guocheng
2008-05-01
The prediction accuracy and generalization of GSH fermentation process modeling are often deteriorated by noise existing in the corresponding experimental data. To avoid this problem, we present a novel RBF neural network modeling approach based on an entropy criterion. Compared with traditional MSE-criterion based parameter learning, it considers the whole distribution structure of the training data set in the parameter learning process, and thus effectively avoids weak generalization and over-learning. The proposed approach is then applied to GSH fermentation process modeling. Our results demonstrate that the proposed method has better prediction accuracy, generalization and robustness, and thus offers potential application merit for GSH fermentation process modeling.
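A sketch of the general idea, assuming the entropy criterion resembles minimum-error-entropy learning: an RBF expansion is initialized by least squares and its weights are then refined by ascending Renyi's quadratic information potential of the errors instead of descending MSE. The paper's exact criterion and learning rule surely differ.

```python
import numpy as np

# RBF fit refined by maximizing the information potential (minimizing
# Renyi's quadratic error entropy) rather than MSE. Kernel sizes,
# widths, and step size are illustrative assumptions.

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 80)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.1, x.size)   # noisy data

centers, width = np.linspace(0, 1, 10), 0.1
Phi = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width**2))

w = np.linalg.lstsq(Phi, y, rcond=None)[0]   # MSE initialization
sigma, lr = 0.5, 0.1                         # Parzen kernel size, step
for _ in range(200):
    e = y - Phi @ w
    d = e[:, None] - e[None, :]              # pairwise error differences
    G = np.exp(-d**2 / (4 * sigma**2))       # Gaussian kernel on diffs
    # Gradient of V(e) = mean_ij G(e_i - e_j) w.r.t. the weights.
    grad = (G[..., None] * d[..., None] *
            (Phi[:, None, :] - Phi[None, :, :])).mean(axis=(0, 1)) / (2 * sigma**2)
    w += lr * grad                           # ascend information potential

e = y - Phi @ w                              # MEE ignores the error mean,
rmse = np.sqrt(np.mean((e - e.mean()) ** 2)) # so center errors first
print("bias-corrected RMSE:", rmse.round(3))
```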
Some contingencies of spelling
Lee, Vicki L.; Sanderson, Gwenda M.
1987-01-01
This paper presents some speculation about the contingencies that might select standard spellings. The speculation is based on a new development in the teaching of spelling—the process writing approach, which lets standard spellings emerge collateral to a high frequency of reading and writing. The paper discusses this approach, contrasts it with behavior-analytic research on spelling, and suggests some new directions for this latter research based on a behavioral interpretation of the process writing approach to spelling. PMID:22477529
Enforcement of entailment constraints in distributed service-based business processes.
Hummer, Waldemar; Gaubatz, Patrick; Strembeck, Mark; Zdun, Uwe; Dustdar, Schahram
2013-11-01
A distributed business process is executed in a distributed computing environment. The service-oriented architecture (SOA) paradigm is a popular option for the integration of software services and execution of distributed business processes. Entailment constraints, such as mutual exclusion and binding constraints, are important means to control process execution. Mutually exclusive tasks result from the division of powerful rights and responsibilities to prevent fraud and abuse. In contrast, binding constraints define that a subject who performed one task must also perform the corresponding bound task(s). We aim to provide a model-driven approach for the specification and enforcement of task-based entailment constraints in distributed service-based business processes. Based on a generic metamodel, we define a domain-specific language (DSL) that maps the different modeling-level artifacts to the implementation level. The DSL integrates elements from role-based access control (RBAC) with the tasks that are performed in a business process. Process definitions are annotated using the DSL, and our software platform uses automated model transformations to produce executable WS-BPEL specifications which enforce the entailment constraints. We evaluate the impact of constraint enforcement on runtime performance for five selected service-based processes from the existing literature. Our evaluation demonstrates that the approach correctly enforces task-based entailment constraints at runtime. The performance experiments illustrate that the runtime enforcement operates with an overhead that scales well up to the order of several tens of thousands of logged invocations. Using our DSL annotations, the user-defined process definition remains declarative and clean of security enforcement code. Our approach decouples the concerns of (non-technical) domain experts from the technical details of entailment constraint enforcement. The developed framework integrates seamlessly with WS-BPEL and the Web services technology stack. Our prototype implementation shows the feasibility of the approach, and the evaluation points to future work and further performance optimizations.
ERIC Educational Resources Information Center
Fernandino, Leonardo; Iacoboni, Marco
2010-01-01
The embodied cognition approach to the study of the mind proposes that higher order mental processes such as concept formation and language are essentially based on perceptual and motor processes. Contrary to the classical approach in cognitive science, in which concepts are viewed as amodal, arbitrary symbols, embodied semantics argues that…
ERIC Educational Resources Information Center
Meyer-Base, U.; Vera, A.; Meyer-Base, A.; Pattichis, M. S.; Perry, R. J.
2010-01-01
In this paper, an innovative educational approach to introducing undergraduates to both digital signal processing (DSP) and field programmable gate array (FPGA)-based design in a one-semester course and laboratory is described. While both DSP and FPGA-based courses are currently present in different curricula, this integrated approach reduces the…
ERIC Educational Resources Information Center
Karacop, Ataman; Diken, Emine Hatun
2017-01-01
The purpose of this study is to investigate the effects of laboratory approach based on jigsaw method with cooperative learning and confirmatory laboratory approach on university students' cognitive process development in Science teaching laboratory applications, and to determine the opinions of the students on applied laboratory methods. The…
Weaknesses in Applying a Process Approach in Industry Enterprises
NASA Astrophysics Data System (ADS)
Kučerová, Marta; Mĺkva, Miroslava; Fidlerová, Helena
2012-12-01
The paper deals with the process approach as one of the main principles of quality management. Quality management systems based on the process approach currently represent one of the proven ways to manage an organization. Sales volumes, costs and profit levels are influenced by the quality of processes and efficient process flow. As the results of the research project showed, there are some weaknesses in applying the process approach in industrial practice: in many organizations in Slovakia it has often been only a formal change from functional management to process management. For efficient process management it is essential that companies pay attention to the way their processes are organized and seek their continuous improvement.
Chen, Xiaodong; Sadineni, Vikram; Maity, Mita; Quan, Yong; Enterline, Matthew; Mantri, Rao V
2015-12-01
Lyophilization is an approach commonly undertaken to formulate drugs that are too unstable to be commercialized as ready-to-use (RTU) solutions. One of the important aspects of commercializing a lyophilized product is to transfer the process parameters that are developed in a lab-scale lyophilizer to commercial scale without a loss in product quality. This is often accomplished by costly engineering runs or through an iterative process at the commercial scale. Here, we highlight a combined computational and experimental approach to predict commercial process parameters for the primary drying phase of lyophilization. Heat and mass transfer coefficients are determined experimentally either by manometric temperature measurement (MTM) or sublimation tests and used as inputs for the finite element model (FEM)-based software called PASSAGE, which computes various primary drying parameters such as primary drying time and product temperature. The heat and mass transfer coefficients will vary at different lyophilization scales; hence, we present an approach to use appropriate factors while scaling up from lab scale to commercial scale. As a result, one can predict commercial-scale primary drying time based on these parameters. Additionally, the model-based approach presented in this study provides a process to monitor pharmaceutical product robustness and accidental process deviations during lyophilization to support commercial supply chain continuity. The approach presented here provides a robust lyophilization scale-up strategy; because of its simple and minimalistic approach, it is also a less capital-intensive path with minimal use of expensive drug substance/active material.
Time-based analysis of total cost of patient episodes: a case study of hip replacement.
Peltokorpi, Antti; Kujala, Jaakko
2006-01-01
Healthcare in the public and private sectors is facing increasing pressure to become more cost-effective. Time-based competition and work-in-progress have been used successfully to measure and improve the efficiency of industrial manufacturing. This paper seeks to address the same issue in healthcare by presenting a framework for time-based management of the total cost of a patient episode and applying it within the six sigma DMAIC process development approach. The framework is used to analyse hip replacement patient episodes in Päijät-Häme Hospital District in Finland, which has a catchment area of 210,000 inhabitants and performs an average of 230 hip replacements per year. The work-in-progress concept is applicable to healthcare, and notably the DMAIC process development approach can be used to analyse the total cost of patient episodes. The paper concludes that a framework which combines the patient-in-process and DMAIC development approaches can be used not only to analyse the total cost of a patient episode but also to improve patient process efficiency.
A manufacturable process integration approach for graphene devices
NASA Astrophysics Data System (ADS)
Vaziri, Sam; Lupina, Grzegorz; Paussa, Alan; Smith, Anderson D.; Henkel, Christoph; Lippert, Gunther; Dabrowski, Jarek; Mehr, Wolfgang; Östling, Mikael; Lemme, Max C.
2013-06-01
In this work, we propose an integration approach for double gate graphene field effect transistors. The approach includes a number of process steps that are key for future integration of graphene in microelectronics: bottom gates with ultra-thin (2 nm) high-quality thermally grown SiO2 dielectrics, shallow trench isolation between devices and atomic layer deposited Al2O3 top gate dielectrics. The complete process flow is demonstrated with fully functional GFET transistors and can be extended to wafer scale processing. We assess, through simulation, the effects of the quantum capacitance and band bending in the silicon substrate on the effective electric fields in the top and bottom gate oxide. The proposed process technology is suitable for other graphene-based devices such as graphene-based hot electron transistors and photodetectors.
Revising a Design Course from a Lecture Approach to a Project-Based Learning Approach
ERIC Educational Resources Information Center
Kunberger, Tanya
2013-01-01
In order to develop the evaluative skills necessary for successful performance of design, a senior, Geotechnical Engineering course was revised to immerse students in the complexity of the design process utilising a project-based learning (PBL) approach to instruction. The student-centred approach stresses self-directed group learning, which…
Bellanger, Martine M; Jourdain, Alain
2004-01-01
This article aims to evaluate the results of two different approaches underlying the attempts to reduce health inequalities in France. In the 'instrumental' approach, resource allocation is based on an indicator to assess the well-being or the quality of life associated with healthcare provision, the argument being that additional resources would respond to needs that could then be treated quickly and efficiently. This governs the distribution of regional hospital budgets. In the second approach, health professionals and users in a given region are involved in a consensus process to define those priorities to be included in programme formulation. This 'procedural' approach is employed in the case of the regional health programmes. In this second approach, the evaluation of the results runs parallel with an analysis of the process using Rawlsian principles, whereas the first approach is based on the classical economic model. At this stage, a pragmatic analysis based on both the comparison of regional hospital budgets during the period 1992-2003 (calculated using a 'RAWP [resource allocation working party]-like' formula) and the evolution of regional health policies through the evaluation of programmes for the prevention of suicide, alcohol-related diseases and cancers provides a partial assessment of the impact of the two types of approaches, the second having a greater effect on the reduction of regional inequalities.
Rule-Based Event Processing and Reaction Rules
NASA Astrophysics Data System (ADS)
Paschke, Adrian; Kozlenkov, Alexander
Reaction rules and event processing technologies play a key role in making business and IT / Internet infrastructures more agile and active. While event processing is concerned with detecting events from large event clouds or streams in almost real-time, reaction rules are concerned with the invocation of actions in response to events and actionable situations. They state the conditions under which actions must be taken. In the last decades various reaction rule and event processing approaches have been developed, which for the most part have been advanced separately. In this paper we survey reaction rule approaches and rule-based event processing systems and languages.
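A minimal event-condition-action sketch of the reaction-rule idea surveyed here: each rule pairs an event type and a condition with an action that is invoked when both match. Event types and actions are invented.

```python
# Tiny event-condition-action (ECA) reaction-rule sketch: rules state
# the conditions under which actions are invoked in response to events.

rules = [
    {"event": "stock_tick",
     "condition": lambda e: e["price"] < 90.0,
     "action": lambda e: print(f"BUY {e['symbol']} at {e['price']}")},
    {"event": "stock_tick",
     "condition": lambda e: e["price"] > 110.0,
     "action": lambda e: print(f"SELL {e['symbol']} at {e['price']}")},
]

def process(event_type, payload):
    for rule in rules:                       # match, test, fire
        if rule["event"] == event_type and rule["condition"](payload):
            rule["action"](payload)

for price in (95.0, 88.5, 112.0):            # simulated event stream
    process("stock_tick", {"symbol": "ACME", "price": price})
```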
Integrating relationship- and research-based approaches in Australian health promotion practice.
Klinner, Christiane; Carter, Stacy M; Rychetnik, Lucie; Li, Vincy; Daley, Michelle; Zask, Avigdor; Lloyd, Beverly
2015-12-01
We examine the perspectives of health promotion practitioners on their approaches to determining health promotion practice, in particular on the role of research and relationships in this process. Using Grounded Theory methods, we analysed 58 semi-structured interviews with 54 health promotion practitioners in New South Wales, Australia. Practitioners differentiated between relationship-based and research-based approaches as two sources of knowledge to guide health promotion practice. We identify several tensions in seeking to combine these approaches in practice and describe the strategies that participants adopted to manage these tensions. The strategies included working in an evidence-informed rather than evidence-based way, creating new evidence about relationship-based processes and outcomes, adopting 'relationship-based' research and evaluation methods, making research and evaluation useful for communities, building research and evaluation skills and improving collaboration between research and evaluation and programme implementation staff. We conclude by highlighting three systemic factors which could further support the integration of research-based and relationship-based health promotion practices: (i) expanding conceptions of health promotion evidence, (ii) developing 'relationship-based' research methods that enable practitioners to measure complex social processes and outcomes and to facilitate community participation and benefit, and (iii) developing organizational capacity. © The Author (2014). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Kinetic Modeling of Microbiological Processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Chongxuan; Fang, Yilin
Kinetic description of microbiological processes is vital for the design and control of microbe-based biotechnologies such as waste water treatment, petroleum oil recovery, and contaminant attenuation and remediation. Various models have been proposed to describe microbiological processes. This editorial article discusses the advantages and limitations of these modeling approaches, including traditional Monod-type models and their derivatives, and recently developed constraint-based approaches. The article also outlines future directions for modeling research best suited to petroleum and environmental biotechnologies.
An adaptive signal-processing approach to online adaptive tutoring.
Bergeron, Bryan; Cline, Andrew
2011-01-01
Conventional intelligent or adaptive tutoring online systems rely on domain-specific models of learner behavior based on rules, deep domain knowledge, and other resource-intensive methods. We have developed and studied a domain-independent methodology of adaptive tutoring based on domain-independent signal-processing approaches that obviate the need for the construction of explicit expert and student models. A key advantage of our method over conventional approaches is a lower barrier to entry for educators who want to develop adaptive online learning materials.
Testing Strategies for Model-Based Development
NASA Technical Reports Server (NTRS)
Heimdahl, Mats P. E.; Whalen, Mike; Rajan, Ajitha; Miller, Steven P.
2006-01-01
This report presents an approach for testing artifacts generated in a model-based development process. This approach divides the traditional testing process into two parts: requirements-based testing (validation testing) which determines whether the model implements the high-level requirements and model-based testing (conformance testing) which determines whether the code generated from a model is behaviorally equivalent to the model. The goals of the two processes differ significantly and this report explores suitable testing metrics and automation strategies for each. To support requirements-based testing, we define novel objective requirements coverage metrics similar to existing specification and code coverage metrics. For model-based testing, we briefly describe automation strategies and examine the fault-finding capability of different structural coverage metrics using tests automatically generated from the model.
A novel surrogate-based approach for optimal design of electromagnetic-based circuits
NASA Astrophysics Data System (ADS)
Hassan, Abdel-Karim S. O.; Mohamed, Ahmed S. A.; Rabie, Azza A.; Etman, Ahmed S.
2016-02-01
A new geometric design centring approach for optimal design of central processing unit-intensive electromagnetic (EM)-based circuits is introduced. The approach uses norms related to the probability distribution of the circuit parameters to find distances from a point to the feasible region boundaries by solving nonlinear optimization problems. Based on these normed distances, the design centring problem is formulated as a max-min optimization problem. A convergent iterative boundary search technique is exploited to find the normed distances. To alleviate the computation cost associated with the EM-based circuits design cycle, space-mapping (SM) surrogates are used to create a sequence of iteratively updated feasible region approximations. In each SM feasible region approximation, the centring process using normed distances is implemented, leading to a better centre point. The process is repeated until a final design centre is attained. Practical examples are given to show the effectiveness of the new design centring method for EM-based circuits.
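A toy version of the normed-distance centring idea: bisection along sampled directions estimates distances from a candidate point to the feasible-region boundary, and a crude search maximizes the minimum distance. The feasible set is made up, and the paper's probability-related norms and space-mapping surrogates are omitted.

```python
import numpy as np

# Toy normed-distance design centring on an invented 2-D feasible set.

rng = np.random.default_rng(2)

def feasible(p):
    return p[0]**2 + p[1]**2 <= 1.0 and p[1] >= -0.2

def boundary_distance(x, d, hi=3.0, iters=40):
    lo = 0.0
    for _ in range(iters):              # bisection along the ray x + t*d
        mid = 0.5 * (lo + hi)
        if feasible(x + mid * d):
            lo = mid
        else:
            hi = mid
    return lo

angles = np.linspace(0, 2 * np.pi, 16, endpoint=False)
dirs = [np.array([np.cos(a), np.sin(a)]) for a in angles]

best, best_r = None, -1.0
for _ in range(500):                    # crude max-min random search
    x = rng.uniform(-1, 1, size=2)
    if not feasible(x):
        continue
    r = min(boundary_distance(x, d) for d in dirs)
    if r > best_r:
        best, best_r = x, r

print("centre ~", np.round(best, 3), "min distance ~", round(best_r, 3))
```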
A new hybrid case-based reasoning approach for medical diagnosis systems.
Sharaf-El-Deen, Dina A; Moawad, Ibrahim F; Khalifa, M E
2014-02-01
Case-Based Reasoning (CBR) has been applied in many different medical applications. Due to the complexities and diversities of this domain, most medical CBR systems become hybrid. Besides, the case adaptation process in CBR is often a challenging issue, as it is traditionally carried out manually by domain experts. In this paper, a new hybrid case-based reasoning approach for medical diagnosis systems is proposed to improve the accuracy of retrieval-only CBR systems. The approach integrates case-based reasoning and rule-based reasoning, and applies the adaptation process automatically by exploiting adaptation rules. Both adaptation rules and reasoning rules are generated from the case base. After solving a new case, the case base is expanded, and both adaptation and reasoning rules are updated. To evaluate the proposed approach, a prototype was implemented and tested on the diagnosis of breast cancer and thyroid diseases. The final results show that the proposed approach increases the diagnostic accuracy of retrieval-only CBR systems and provides reliable accuracy compared to current breast cancer and thyroid diagnosis systems.
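A minimal retrieve-adapt-retain cycle of the kind the paper hybridizes, with invented cases and a hand-written adaptation rule standing in for the rules the paper generates from the case base:

```python
# Minimal hybrid CBR sketch: retrieve the nearest stored case, then
# apply adaptation rules to adjust its solution, then retain the new
# case. Cases, features, and the rule are invented for illustration.

case_base = [
    {"features": {"age": 45, "tumor_size": 2.1}, "diagnosis": "benign"},
    {"features": {"age": 62, "tumor_size": 4.8}, "diagnosis": "malignant"},
]

# Adaptation rules (in the paper these are mined from the case base).
adaptation_rules = [
    (lambda new, old: new["tumor_size"] - old["tumor_size"] > 2.0,
     lambda sol: "malignant"),
]

def similarity(a, b):
    return -sum((a[k] - b[k]) ** 2 for k in a)   # negative distance

def diagnose(new_case):
    best = max(case_base, key=lambda c: similarity(new_case, c["features"]))
    solution = best["diagnosis"]                 # retrieved solution
    for condition, adapt in adaptation_rules:
        if condition(new_case, best["features"]):
            solution = adapt(solution)           # rule-based adaptation
    case_base.append({"features": new_case, "diagnosis": solution})  # retain
    return solution

print(diagnose({"age": 47, "tumor_size": 4.5}))
```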
Commodity-based Approach for Evaluating the Value of Freight Moving on Texas’ Roadway Network
DOT National Transportation Integrated Search
2017-12-10
The researchers took a commodity-based approach to evaluate the value of a list of selected commodities moved on the Texas freight network. This approach takes advantage of commodity-specific data sources and modeling processes. It provides a unique ...
What's Wrong with Learning for the Exam? An Assessment-Based Approach for Student Engagement
ERIC Educational Resources Information Center
Ito, Hiroshi
2014-01-01
It is now widely recognized that assessment and the feedback play key roles in the learning process. However, assessment-based learning approaches are not yet commonly practiced in Japan. This paper provides an example of an assessment-based approach to teaching and learning employed for a course entitled "English as an International…
Van Dyke, Melissa K; Naoom, Sandra F
2016-01-01
Evidence-based approaches only benefit individuals when fully and effectively implemented. Since funding and monitoring alone will not ensure the full and effective implementation of effective strategies, state agencies have the opportunity to assess and modify current roles, functions, and policies to align with the requirements of evidence-based strategies. Based on a growing body of knowledge to guide effective implementation processes, state agencies, or designated partner organizations, can develop the capacity, mechanisms, and infrastructure to effectively implement evidence-based strategies. This article describes a framework that can guide this process. Informed by the literature and shaped by "real-world experience," the Active Implementation Frameworks provide a stage-matched approach to purposeful, active, and effective implementation.
ERIC Educational Resources Information Center
Hall, Mona L.; Vardar-Ulu, Didem
2014-01-01
The laboratory setting is an exciting and gratifying place to teach because you can actively engage the students in the learning process through hands-on activities; it is a dynamic environment amenable to collaborative work, critical thinking, problem-solving and discovery. The guided inquiry-based approach described here guides the students…
Technical Potential Assessment for the Renewable Energy Zone (REZ) Process: A GIS-Based Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Nathan; Roberts, Billy J
Geographic Information Systems (GIS)-based energy resource and technical potential assessments identify areas capable of supporting high levels of renewable energy (RE) development as part of a Renewable Energy Zone (REZ) Transmission Planning process. This document expands on the REZ Process to aid practitioners in conducting GIS-based RE resource and technical potential assessments. The REZ process is an approach to plan, approve, and build transmission infrastructure that connects REZs - geographic areas that have high-quality RE resources, suitable topography and land-use designations, and demonstrated developer interest - to the power system. The REZ process helps to increase the share of solar photovoltaic (PV), wind, and other resources while also maintaining reliability and economics.
NASA Astrophysics Data System (ADS)
Cheng, Jie; Qian, Zhaogang; Irani, Keki B.; Etemad, Hossein; Elta, Michael E.
1991-03-01
To meet the ever-increasing demand of the rapidly growing semiconductor manufacturing industry, it is critical to have a comprehensive methodology integrating techniques for process optimization, real-time monitoring, and adaptive process control. To this end, we have accomplished an integrated knowledge-based approach combining the latest expert system technology, machine learning methods, and traditional statistical process control (SPC) techniques. This knowledge-based approach is advantageous in that it makes it possible for the tasks of process optimization and adaptive control to be performed consistently and predictably. Furthermore, this approach can be used to construct high-level, qualitative descriptions of processes and thus make process behavior easy to monitor, predict, and control. Two software packages, RIST (Rule Induction and Statistical Testing) and KARSM (Knowledge Acquisition from Response Surface Methodology), have been developed and incorporated with two commercially available packages: G2 (a real-time expert system) and ULTRAMAX (a tool for sequential process optimization).
Conceptual design of distillation-based hybrid separation processes.
Skiborowski, Mirko; Harwardt, Andreas; Marquardt, Wolfgang
2013-01-01
Hybrid separation processes combine different separation principles and constitute a promising design option for the separation of complex mixtures. Particularly, the integration of distillation with other unit operations can significantly improve the separation of close-boiling or azeotropic mixtures. Although the design of single-unit operations is well understood and supported by computational methods, the optimal design of flowsheets of hybrid separation processes is still a challenging task. The large number of operational and design degrees of freedom requires a systematic and optimization-based design approach. To this end, a structured approach, the so-called process synthesis framework, is proposed. This article reviews available computational methods for the conceptual design of distillation-based hybrid processes for the separation of liquid mixtures. Open problems are identified that must be addressed to finally establish a structured process synthesis framework for such processes.
NASA Astrophysics Data System (ADS)
Salcedo-Sanz, S.
2016-10-01
Meta-heuristic algorithms are problem-solving methods which try to find good-enough solutions to very hard optimization problems, at a reasonable computation time, where classical approaches fail or cannot even be applied. Many existing meta-heuristic approaches are nature-inspired techniques, which work by simulating or modeling different natural processes in a computer. Historically, many of the most successful meta-heuristic approaches have had a biological inspiration, such as evolutionary computation or swarm intelligence paradigms, but in the last few years new approaches based on modeling nonlinear physics processes have been proposed and applied with success. Nonlinear physics processes, modeled as optimization algorithms, are able to produce completely new search procedures, with extremely effective exploration capabilities in many cases, which are able to outperform existing optimization approaches. In this paper we review the most important optimization algorithms based on nonlinear physics, how they have been constructed from specific modeling of a real phenomenon, and also their novelty in comparison with alternative existing algorithms for optimization. We first review important concepts on optimization problems, search spaces and problem difficulty. Then, the usefulness of heuristic and meta-heuristic approaches to face hard optimization problems is introduced, and some of the main existing classical versions of these algorithms are reviewed. The mathematical framework of different nonlinear physics processes is then introduced as a preparatory step to review in detail the most important meta-heuristics based on them. A discussion on the novelty of these approaches, their main computational implementation and design issues, and the evaluation of a novel meta-heuristic based on Strange Attractors mutation is carried out to complete the review of these techniques. We also describe some of the most important application areas, in a broad sense, of meta-heuristics, and describe freely accessible software frameworks which can be used to ease the implementation of these algorithms.
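As a concrete instance of a physics-process-inspired meta-heuristic of the kind reviewed, here is a compact simulated annealing loop (modelled on thermal annealing with Metropolis acceptance) on a made-up multimodal objective:

```python
import math, random

# Simulated annealing: the classic optimization meta-heuristic modeled
# on a physical process (thermal annealing). Objective function and
# cooling schedule are illustrative choices.

def objective(x):
    return x**2 + 10 * math.sin(3 * x)   # multimodal 1-D test function

random.seed(0)
x = random.uniform(-5, 5)
fx = objective(x)
T = 2.0                                  # initial "temperature"
for _ in range(5000):
    cand = x + random.gauss(0, 0.5)      # random perturbation
    fc = objective(cand)
    # Metropolis acceptance: always accept improvements; accept worse
    # moves with probability exp(-delta/T).
    if fc < fx or random.random() < math.exp(-(fc - fx) / T):
        x, fx = cand, fc
    T = max(1e-3, T * 0.999)             # geometric cooling

print(f"x* ~ {x:.3f}, f(x*) ~ {fx:.3f}")
```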
Effectiveness of a publicly-funded demonstration program to promote management of dryland salinity.
Robertson, M J; Measham, T G; Batchelor, G; George, R; Kingwell, R; Hosking, K
2009-07-01
Community and catchment-based approaches to salinity management continue to attract interest in Australia. In one such approach, Catchment Demonstration Initiative (CDI) projects were established by the Western Australian (WA) Government in 2000 for targeted investment in large-scale catchment-based demonstrations of integrated salinity management practices. The aim was to promote a process for technically-informed salinity management by landholders. This paper offers an evaluation of the effectiveness of one CDI project in the central wheatbelt of WA, covering issues including: its role in fostering adoption of salinity management options, the role of research and the technical requirements for design and implementation of on-ground works, the role of monitoring and evaluation, the identification and measurement of public and private benefits, comparison and identification of the place and value of plant-based and engineering-based options, reliance on social processes and impacts of constraints on capacity, management of governance and administration requirements and an appreciation of the value of group-based approaches. A number of factors may reduce the effectiveness of CDI-type approaches in facilitating landholder action to address salinity; many of these are socially based. Such approaches can create considerable demands on landholders, can be expensive (because of the planning and accountability required) on the basis of dollars per hectare impacted, and can find it difficult to garner ownership from all involved. An additional problem could be that few community groups would have the capacity to run such programs and disseminate the new knowledge so that CDI-type projects can have an impact outside the focus catchment. In common with many publicly-funded approaches to salinity, we found that direct benefits on public assets are smaller than planned and that results from science-based requirements of monitoring and evaluation have long lead times, causing farmers to either wait for the information or act sooner and take risks based on initial results. We also found that often it is a clear outline of the process that is of most importance in decision making, as opposed to the actual results. We identified limitations in regulatory processes and in the capacity of local government to engage in the CDI. The opportunities that CDI-type approaches provide centre around the value of their group-based approach. We conclude that they can overcome knowledge constraints in managing salinity by fostering group-based learning, offer a structured process of trialling options so that the costs and benefits can be clearly and transparently quantified, and avoid the costly mistakes and "learning failures" of the past.
The Metamemory Approach to Confidence: A Test Using Semantic Memory
ERIC Educational Resources Information Center
Brewer, William F.; Sampaio, Cristina
2012-01-01
The metamemory approach to memory confidence was extended and elaborated to deal with semantic memory tasks. The metamemory approach assumes that memory confidence is based on the products and processes of a completed memory task, as well as metamemory beliefs that individuals have about how their memory products and processes relate to memory…
Holt, Cheryl L.; Lee, Crystal; Wright, Katrina
2017-01-01
The purpose of this study was to compare the communication effectiveness of a spiritually-based approach to breast cancer early detection education with a secular approach among African American women, by conducting a cognitive response analysis. A total of 108 women from six Alabama churches were randomly assigned by church to receive a spiritually-based or secular educational booklet discussing breast cancer early detection. Based on the Elaboration Likelihood Model (Petty & Cacioppo, 1981), after reading the booklets participants were asked to complete a thought-listing task, writing down any thoughts they experienced and rating them as positive, negative, or neutral. Two independent coders then used five dimensions to code participants' thoughts. Compared with the secular booklet, the spiritually-based booklet resulted in significantly more thoughts involving personal connection, self-assessment, and spiritually-based responses. These results suggest that a spiritually-based approach to breast cancer awareness may be more effective than the secular one because it caused women to more actively process the message, stimulating central route processing. The incorporation of spiritually-based content into church-based breast cancer education could be a promising health communication approach for African American women. PMID:18443989
Billieux, Joël; Philippot, Pierre; Schmid, Cécile; Maurage, Pierre; De Mol, Jan; Van der Linden, Martial
2015-01-01
Dysfunctional use of the mobile phone has often been conceptualized as a 'behavioural addiction' that shares most features with drug addictions. In the current article, we challenge the clinical utility of the addiction model as applied to mobile phone overuse. We describe the case of a woman who overuses her mobile phone from two distinct approaches: (1) a symptom-based categorical approach inspired from the addiction model of dysfunctional mobile phone use and (2) a process-based approach resulting from an idiosyncratic clinical case conceptualization. In the case depicted here, the addiction model was shown to lead to standardized and non-relevant treatment, whereas the clinical case conceptualization allowed identification of specific psychological processes that can be targeted with specific, empirically based psychological interventions. This finding highlights that conceptualizing excessive behaviours (e.g., gambling and sex) within the addiction model can be a simplification of an individual's psychological functioning, offering only limited clinical relevance. The addiction model, applied to excessive behaviours (e.g., gambling, sex and Internet-related activities) may lead to non-relevant standardized treatments. Clinical case conceptualization allowed identification of specific psychological processes that can be targeted with specific empirically based psychological interventions. The biomedical model might lead to the simplification of an individual's psychological functioning with limited clinical relevance. Copyright © 2014 John Wiley & Sons, Ltd.
Developing the skills required for evidence-based practice.
French, B
1998-01-01
The current health care environment requires practitioners with the skills to find and apply the best currently available evidence for effective health care, to contribute to the development of evidence-based practice protocols, and to evaluate the impact of utilizing validated research findings in practice. Current approaches to teaching research are based mainly on gaining skills by participation in the research process. Emphasis on the requirement for rigour in the process of creating new knowledge is assumed to lead to skill in the process of using research information created by others. This article reflects upon the requirements for evidence-based practice, and the degree to which current approaches to teaching research prepare practitioners who are able to find, evaluate and best use currently available research information. The potential for using the principles of systematic review as a teaching and learning strategy for research is explored, and some of the possible strengths and weaknesses of this approach are highlighted.
An evolution of image source camera attribution approaches.
Jahanirad, Mehdi; Wahab, Ainuddin Wahid Abdul; Anuar, Nor Badrul
2016-05-01
Camera attribution plays an important role in digital image forensics by providing the evidence and distinguishing characteristics of the origin of a digital image. It allows the forensic analyser to find the possible source camera which captured the image under investigation. However, in real-world applications, these approaches have faced many challenges due to the large set of multimedia data publicly available through photo sharing and social network sites, captured under uncontrolled conditions and subjected to a variety of hardware and software post-processing operations. Moreover, the legal system only accepts the forensic analysis of digital image evidence if the applied camera attribution techniques are unbiased, reliable, nondestructive and widely accepted by experts in the field. The aim of this paper is to investigate the evolutionary trend of image source camera attribution approaches from fundamentals to practice, in particular with the application of image processing and data mining techniques. Extracting implicit knowledge from images using intrinsic image artifacts for source camera attribution requires a structured image mining process. In this paper, we attempt to provide an introductory tutorial on the image processing pipeline, to determine the general classification of the features corresponding to different components for source camera attribution. The article also reviews source camera attribution techniques more comprehensively in the domain of image forensics, together with a classification of ongoing developments within the specified area. The classification of the existing source camera attribution approaches is presented based on specific parameters, such as the colour image processing pipeline, hardware- and software-related artifacts and the methods to extract such artifacts. The more recent source camera attribution approaches, which have not yet gained sufficient attention among image forensics researchers, are also critically analysed and further categorised into four different classes, namely, optical aberrations based, sensor camera fingerprints based, processing statistics based and processing regularities based, to present a classification. Furthermore, this paper aims to investigate the challenging problems, and the proposed strategies of such schemes based on the suggested taxonomy, to plot an evolution of the source camera attribution approaches with respect to the subjective optimisation criteria over the last decade. The optimisation criteria were determined based on the strategies proposed to increase the detection accuracy, robustness and computational efficiency of source camera brand, model or device attribution. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
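To make the 'sensor camera fingerprints based' class concrete, a toy PRNU-style sketch follows: a fingerprint is estimated as the mean denoising residual of images from one camera, and attribution correlates a test image's residual against it. The data are synthetic, and a crude Gaussian denoiser stands in for the wavelet denoising and formal statistical tests used in practice.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Toy PRNU-style attribution: the camera multiplies the scene by a
# fixed noise pattern (1 + K); the fingerprint is the mean denoising
# residual of flat images; attribution correlates residuals with it.

rng = np.random.default_rng(5)
prnu = rng.normal(0, 0.01, (64, 64))           # hidden sensor pattern K

def shoot(scene):                              # simplified camera model
    return scene * (1 + prnu) + rng.normal(0, 0.02, prnu.shape)

def residual(img):                             # crude denoising residual
    return img - gaussian_filter(img, 2)

flats = [shoot(np.full((64, 64), 0.5)) for _ in range(20)]
fingerprint = np.mean([residual(f) for f in flats], axis=0)

scene = np.outer(np.linspace(0.3, 0.7, 64), np.ones(64))  # smooth scene
same = shoot(scene)                                       # same camera
other = scene + rng.normal(0, 0.02, prnu.shape)           # no PRNU

def corr(a, b):
    return np.corrcoef(a.ravel(), b.ravel())[0, 1]

print("same camera:", round(corr(residual(same), fingerprint), 3))
print("other source:", round(corr(residual(other), fingerprint), 3))
```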
SLS Navigation Model-Based Design Approach
NASA Technical Reports Server (NTRS)
Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas
2018-01-01
The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper provides a discussion of the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the Navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process versus previous processes used at Marshall Space Flight Center. Instead, the approach takes advantage of overlap between the requirements development and management process, and the design and analysis process by efficiently combining the control (i.e. the requirement) and the design mechanisms. The design mechanism is the representation of the component behavior and performance in design and analysis tools. The focus in the early design process shifts from the development and management of design requirements to the development of usable models, model requirements, and model verification and validation efforts. The models themselves are represented in C/C++ code and accompanying data files. Under the idealized process, potential ambiguity in specification is reduced because the model must be implementable, unlike a requirement, which is not necessarily subject to this constraint. Further, the models are shown to emulate the hardware during validation. For models developed by the Navigation Team, a common interface/standalone environment was developed. The common environment allows for easy implementation in design and analysis tools. Mechanisms such as unit test cases ensure implementation as the developer intended. The model verification and validation process provides a very high level of component design insight. The origin and implementation of the SLS variant of Model-based Design are described from the perspective of the SLS Navigation Team. The format of the models and the requirements are described. The Model-based Design approach has many benefits but is not without potential complications. Key lessons learned associated with the implementation of the Model-based Design approach and process from infancy to verification and certification are discussed.
NASA Astrophysics Data System (ADS)
Musil, Juergen; Schweda, Angelika; Winkler, Dietmar; Biffl, Stefan
Based on our observations of Austrian video game software development (VGSD) practices, we identified a lack of systematic process/method support and inefficient collaboration between the various involved disciplines, i.e. engineers and artists. VGSD includes heterogeneous disciplines, e.g. creative arts, game/content design, and software. Nevertheless, improving team collaboration and process support is an ongoing challenge to enable a comprehensive view on game development projects. Lessons learned from software engineering practices can help game developers to improve game development processes within a heterogeneous environment. Based on a state-of-the-practice survey in the Austrian games industry, this paper (a) presents first results with a focus on process/method support and (b) suggests a candidate flexible process approach based on Scrum to improve VGSD and team collaboration. Results showed (a) a trend to highly flexible software processes involving various disciplines and (b) that the suggested flexible process approach is feasible and useful for project application.
A Science and Risk-Based Pragmatic Methodology for Blend and Content Uniformity Assessment.
Sayeed-Desta, Naheed; Pazhayattil, Ajay Babu; Collins, Jordan; Doshi, Chetan
2018-04-01
This paper describes a pragmatic approach that can be applied in assessing powder blend and unit dosage uniformity of solid dose products at Process Design, Process Performance Qualification, and Continued/Ongoing Process Verification stages of the Process Validation lifecycle. The statistically based sampling, testing, and assessment plan was developed due to the withdrawal of the FDA draft guidance for industry "Powder Blends and Finished Dosage Units-Stratified In-Process Dosage Unit Sampling and Assessment." This paper compares the proposed Grouped Area Variance Estimate (GAVE) method with an alternate approach outlining the practicality and statistical rationalization using traditional sampling and analytical methods. The approach is designed to fit solid dose processes assuring high statistical confidence in both powder blend uniformity and dosage unit uniformity during all three stages of the lifecycle complying with ASTM standards as recommended by the US FDA.
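As a generic illustration of the variance partitioning such sampling plans rest on (not the GAVE procedure itself), the sketch below splits stratified blend assays into between-location and within-location components via one-way ANOVA:

```python
import numpy as np

# Illustrative variance-component check for blend/content uniformity:
# partition assay variance into between-location and within-location
# components from stratified samples. Data and sizes are invented.

rng = np.random.default_rng(4)
n_loc, n_rep = 10, 3                             # locations, replicates
data = 100 + rng.normal(0, 1.0, (n_loc, n_rep))  # assays, % label claim

loc_means = data.mean(axis=1)
grand = data.mean()
ms_between = n_rep * ((loc_means - grand) ** 2).sum() / (n_loc - 1)
ms_within = ((data - loc_means[:, None]) ** 2).sum() / (n_loc * (n_rep - 1))
var_between = max(0.0, (ms_between - ms_within) / n_rep)

print(f"within-location SD:  {np.sqrt(ms_within):.2f}%")
print(f"between-location SD: {np.sqrt(var_between):.2f}%")
```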
Quality management of manufacturing process based on manufacturing execution system
NASA Astrophysics Data System (ADS)
Zhang, Jian; Jiang, Yang; Jiang, Weizhuo
2017-04-01
Quality control elements in the manufacturing process are elaborated, and the approach of quality management of the manufacturing process based on a manufacturing execution system (MES) is discussed. Finally, the functions of MES for a microcircuit production line are introduced.
Investigating an approach to the alliance based on interpersonal defense theory.
Westerman, Michael A; Muran, J Christopher
2017-09-01
Notwithstanding consistent findings of significant relationships between the alliance and outcome, questions remain to be answered about the relatively small magnitude of those correlations, the mechanisms underlying the association, and how to conceptualize the alliance construct. We conducted a preliminary study of an approach to the alliance based on interpersonal defense theory, which is an interpersonal reconceptualization of defense processes, to investigate the promise of this alternative approach as a way to address the outstanding issues. We employed qualitative, theory-building case study methodology, closely examining alliance processes at four time points in the treatment of a case in terms of a case formulation based on interpersonal defense theory. The results suggested that our approach made it possible to recognize key processes in the alliance and that it helps explain how the alliance influences outcome. Our analyses also provided a rich set of concrete illustrations of the alliance phenomena identified by the theory. The findings suggest that an approach to the alliance based on interpersonal defense theory holds promise. However, although the qualitative method we employed has advantages, it also has limitations. We offer suggestions about how future qualitative and quantitative investigations could build on this study.
A Physics-Based Engineering Approach to Predict the Cross Section for Advanced SRAMs
NASA Astrophysics Data System (ADS)
Li, Lei; Zhou, Wanting; Liu, Huihua
2012-12-01
This paper presents a physics-based engineering approach to estimate the heavy-ion-induced upset cross section of 6T SRAM cells from layout and technology parameters. The new approach calculates the effects of radiation with a junction photocurrent, which is derived based on device physics. The new and simple approach handles the problem using simple SPICE simulations. First, the approach uses a standard SPICE program on a typical PC to predict the SPICE-simulated curve of the collected charge vs. its affected distance from the drain-body junction with the derived junction photocurrent. Then, the SPICE-simulated curve is used to calculate the heavy-ion-induced upset cross section with a simple model, which considers that the SEU cross section of a SRAM cell is more related to a “radius of influence” around a heavy ion strike than to the physical size of a diffusion node in the layout for advanced SRAMs in nano-scale process technologies. The calculated upset cross section based on this method is in good agreement with the test results for 6T SRAM cells processed using 90 nm process technology.
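The junction photocurrent injected in such SPICE simulations is commonly modelled as a double-exponential pulse normalized to the collected charge. The paper derives its own expression from device physics, so the sketch below shows only the generic form with assumed constants.

```python
import numpy as np

# Generic double-exponential junction photocurrent pulse, normalized so
# that its integral equals the collected charge Q. Constants are
# assumed values, not the expressions derived in the paper.

Q = 50e-15                        # collected charge: 50 fC (assumed)
tau_r, tau_f = 5e-12, 150e-12     # rise/fall time constants (assumed)

t = np.linspace(0, 1e-9, 2001)
I0 = Q / (tau_f - tau_r)          # normalization: integral of I(t) ~ Q
I = I0 * (np.exp(-t / tau_f) - np.exp(-t / tau_r))

dt = t[1] - t[0]
print(f"peak ~ {I.max() * 1e6:.0f} uA, "
      f"charge ~ {I.sum() * dt * 1e15:.1f} fC")
```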
Niang, Oumar; Thioune, Abdoulaye; El Gueirea, Mouhamed Cheikh; Deléchelle, Eric; Lemoine, Jacques
2012-09-01
The major problem with the empirical mode decomposition (EMD) algorithm is its lack of a theoretical framework. So, it is difficult to characterize and evaluate this approach. In this paper, we propose, in the 2-D case, the use of an alternative implementation to the algorithmic definition of the so-called "sifting process" used in the original Huang's EMD method. This approach, especially based on partial differential equations (PDEs), was presented by Niang in previous works, in 2005 and 2007, and relies on a nonlinear diffusion-based filtering process to solve the mean envelope estimation problem. In the 1-D case, the efficiency of the PDE-based method, compared to the original EMD algorithmic version, was also illustrated in a recent paper. Recently, several 2-D extensions of the EMD method have been proposed. Despite some effort, 2-D versions for EMD appear poorly performing and are very time consuming. So in this paper, an extension to the 2-D space of the PDE-based approach is extensively described. This approach has been applied in cases of both signal and image decomposition. The obtained results confirm the usefulness of the new PDE-based sifting process for the decomposition of various kinds of data. Some results have been provided in the case of image decomposition. The effectiveness of the approach encourages its use in a number of signal and image applications such as denoising, detrending, or texture analysis.
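A 1-D toy of the diffusion-based sifting idea: iterating an explicit heat-equation step estimates the local mean envelope, and subtracting it leaves a fast, intrinsic-mode-like component. The paper's PDE operator is more elaborate (e.g., diffusion controlled by the signal's extrema); coefficients here are illustrative.

```python
import numpy as np

# Diffusion-flavored "sifting" sketch: a heat-type smoothing iteration
# estimates the local mean, which is subtracted to extract a fast
# intrinsic-mode-like component. Coefficients are illustrative.

t = np.linspace(0, 1, 512)
signal = np.sin(2*np.pi*25*t) + 0.8*np.sin(2*np.pi*3*t)   # fast + slow

def diffuse(u, nu=0.45, steps=400):
    u = u.copy()
    for _ in range(steps):                 # explicit heat equation
        u[1:-1] += nu * (u[2:] - 2*u[1:-1] + u[:-2])
        u[0], u[-1] = u[1], u[-2]          # Neumann boundaries
    return u

mean_env = diffuse(signal)                 # estimated local mean
imf1 = signal - mean_env                   # first mode candidate

slow = 0.8*np.sin(2*np.pi*3*t)
print("corr(mean_env, slow component):",
      round(np.corrcoef(mean_env, slow)[0, 1], 3))
```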
Real-time interactive virtual tour on the World Wide Web (WWW)
NASA Astrophysics Data System (ADS)
Yoon, Sanghyuk; Chen, Hai-jung; Hsu, Tom; Yoon, Ilmi
2003-12-01
Web-based Virtual Tour has become a desirable and demanded application, yet challenging due to the nature of web application's running environment such as limited bandwidth and no guarantee of high computation power on the client side. Image-based rendering approach has attractive advantages over traditional 3D rendering approach in such Web Applications. Traditional approach, such as VRML, requires labor-intensive 3D modeling process, high bandwidth and computation power especially for photo-realistic virtual scenes. QuickTime VR and IPIX as examples of image-based approach, use panoramic photos and the virtual scenes that can be generated from photos directly skipping the modeling process. But, these image-based approaches may require special cameras or effort to take panoramic views and provide only one fixed-point look-around and zooming in-out rather than 'walk around', that is a very important feature to provide immersive experience to virtual tourists. The Web-based Virtual Tour using Tour into the Picture employs pseudo 3D geometry with image-based rendering approach to provide viewers with immersive experience of walking around the virtual space with several snap shots of conventional photos.
Madsen, William C
2016-06-01
Across North America, community agencies and state/provincial jurisdictions are embracing family-centered approaches to service delivery that are grounded in strength-based, culturally responsive, accountable partnerships with families. This article details a collaborative consultation process to initiate and sustain organizational change toward this effort. It draws on innovative ideas from narrative theory, organizational development, and implementation science to highlight a three component approach. This approach includes the use of appreciative inquiry focus groups to elicit existing best practices, the provision of clinical training, and ongoing coaching with practice leaders to build on those better moments and develop concrete practice frameworks, and leadership coaching and organizational consultation to develop organizational structures that institutionalize family-centered practice. While the article uses a principle-based practice framework, Collaborative Helping, to illustrate this process, the approach is applicable with a variety of clinical frameworks grounded in family-centered values and principles. © 2016 Family Process Institute.
Process-based tolerance assessment of connecting rod machining process
NASA Astrophysics Data System (ADS)
Sharma, G. V. S. S.; Rao, P. Srinivasa; Surendra Babu, B.
2016-06-01
Process tolerancing based on process capability studies is an optimistic and pragmatic approach to determining manufacturing process tolerances. On adopting the define-measure-analyze-improve-control approach, the process potential capability index (Cp) and the process performance capability index (Cpk) values of identified process characteristics of the connecting rod machining process are achieved to be greater than the industry benchmark of 1.33, i.e., the four sigma level. The tolerance chain diagram methodology is applied to the connecting rod in order to verify the manufacturing process tolerances at various operations of the connecting rod manufacturing process. This paper bridges the gap between the existing dimensional tolerances obtained via tolerance charting and the process capability studies of the connecting rod component. Finally, a process tolerancing comparison is carried out using tolerance capability expert software.
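The two capability indices benchmarked against 1.33 are simple functions of the specification limits and the process mean and standard deviation; a quick illustration with made-up bore measurements:

```python
import numpy as np

# Process capability indices used in the paper's DMAIC assessment:
#   Cp  = (USL - LSL) / (6*sigma)
#   Cpk = min(USL - mean, mean - LSL) / (3*sigma)
# Specification limits and sample data are made-up illustrations.

rng = np.random.default_rng(3)
bore = rng.normal(50.0, 0.0035, size=200)   # measured dimension, mm
LSL, USL = 49.985, 50.015                   # spec limits (assumed)

mu, sigma = bore.mean(), bore.std(ddof=1)
Cp = (USL - LSL) / (6 * sigma)
Cpk = min(USL - mu, mu - LSL) / (3 * sigma)

print(f"Cp = {Cp:.2f}, Cpk = {Cpk:.2f}")    # target: > 1.33 (four sigma)
```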
Araújo, Luciano V; Malkowski, Simon; Braghetto, Kelly R; Passos-Bueno, Maria R; Zatz, Mayana; Pu, Calton; Ferreira, João E
2011-12-22
Recent medical and biological technology advances have stimulated the development of new testing systems that have been providing huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to human genome laboratory. We introduced the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control flow specifications based on process algebra (ACP). The main difference between our approach and related works is that we were able to join two important aspects: 1) process scalability achieved through relational database implementation, and 2) correctness of processes using process algebra. Furthermore, the software allows end users to define genetic testing without requiring any knowledge about business process notation or process algebra. This paper presents the CEGH information system that is a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have proved the feasibility and showed usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing using easy end user interfaces.
Supporting the Knowledge-to-Action Process: A Systems-Thinking Approach
ERIC Educational Resources Information Center
Cherney, Adrian; Head, Brian
2011-01-01
The processes for moving research-based knowledge to the domains of action in social policy and professional practice are complex. Several disciplinary research traditions have illuminated key aspects of these processes. A more holistic approach, drawing on systems thinking, has also been outlined and advocated by recent contributors to…
Stenlund, Mari
2017-08-09
This article clarifies how the freedom of thought as a human right can be understood and promoted as a right of mental health service users, especially people with psychotic disorders, by using Martha Nussbaum's capabilities approach and Fulford's and Fulford et al's values-based practice. According to Nussbaum, freedom of thought seems primarily to protect the capability to think, believe and feel. This capability can be promoted in the context of mental health services by values-based practice. The article points out that both Nussbaum's approach and values-based practice recognise that people's values differ. The idea of involving different actors and service users in mental healthcare is also common to both Nussbaum's approach and values-based practice. However, there are also differences, in that values-based practice relies on a 'good process' in decision-making, whereas the capabilities approach is oriented towards a 'right outcome'. Yet since process and outcome are linked with each other, these two approaches do not necessarily conflict despite this difference. The article suggests that absolute rights are possible within the two approaches. It also recognises that the capabilities approach, values-based practice and the human rights approach lean on liberal values and thus can be combined, at least in liberal societies. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Workplace-Based Assessment: Effects of Rater Expertise
ERIC Educational Resources Information Center
Govaerts, M. J. B.; Schuwirth, L. W. T.; Van der Vleuten, C. P. M.; Muijtjens, A. M. M.
2011-01-01
Traditional psychometric approaches towards assessment tend to focus exclusively on quantitative properties of assessment outcomes. This may limit more meaningful educational approaches towards workplace-based assessment (WBA). Cognition-based models of WBA argue that assessment outcomes are determined by raters' cognitive processes, which are…
Development of Scientific Approach Based on Discovery Learning Module
NASA Astrophysics Data System (ADS)
Ellizar, E.; Hardeli, H.; Beltris, S.; Suharni, R.
2018-04-01
The scientific approach is a learning process designed to make students actively construct their own knowledge through the stages of the scientific method. The scientific approach in the learning process can be implemented by using learning modules. One such learning model is discovery-based learning. Discovery learning is a model in which students discover what is valuable in learning through various activities, such as observation, experience, and reasoning. In practice, however, the students' activity in constructing their own knowledge was not optimal, because the available learning modules were not in line with the scientific approach. The purpose of this study was to develop a scientific-approach, discovery-based learning module on acid-base as well as electrolyte and non-electrolyte solutions. The development process of these chemistry modules used the Plomp model with three main stages: preliminary research, the prototyping stage, and the assessment stage. The subjects of this research were 10th and 11th grade senior high school students (SMAN 2 Padang). Validity was assessed by expert chemistry lecturers and teachers, and the practicality of the modules was tested through questionnaires. Effectiveness was tested through an experimental procedure comparing student achievement between experimental and control groups. Based on the findings, it can be concluded that the developed scientific-approach, discovery-based learning module significantly improved students' learning on acid-base and electrolyte solutions. The data analysis indicated that the chemistry module was valid in content, construct, and presentation; it also had a good level of practicality and accorded with the available time. The module was effective as well, because it helped the students understand the learning material, as shown by student learning results. It can therefore be concluded that the chemistry module based on discovery learning and the scientific approach for electrolyte and non-electrolyte solutions and acid-base, for 10th and 11th grade senior high school students, was valid, practical, and effective.
NASA Astrophysics Data System (ADS)
Hoang, Hanh H.; Jung, Jason J.; Tran, Chi P.
2014-11-01
Based on an in-depth analysis of the existing approaches to applying semantic technologies to business process management (BPM) research from the perspective of cross-enterprise collaboration, or so-called business-to-business integration, we analyse, discuss and compare the methodologies, applications and best practices of the surveyed approaches against the proposed criteria. This article identifies various relevant research directions in semantic BPM (SBPM). Based on the results of our investigation, we summarise the state of the art of SBPM. We also address areas and directions for further research activities.
Image processing based detection of lung cancer on CT scan images
NASA Astrophysics Data System (ADS)
Abdillah, Bariqi; Bustamam, Alhadi; Sarwinda, Devvi
2017-10-01
In this paper, we implement and analyze an image processing method for the detection of lung cancer. Image processing techniques are widely used in several medical problems for picture enhancement in the detection phase to support early medical treatment. In this research we propose a detection method for lung cancer based on image segmentation, one of the intermediate levels in image processing. Marker-controlled watershed and a region growing approach are used to segment the CT scan images. The detection phases comprise image enhancement using a Gabor filter, image segmentation, and feature extraction. The experimental results demonstrate the effectiveness of our approach and show that the best approach for main feature detection is the watershed-with-masking method, which has high accuracy and robustness.
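The pipeline described (Gabor enhancement, then marker-controlled watershed) can be sketched with scikit-image. This is an illustrative reconstruction, not the authors' code; the Gabor frequency and the marker thresholds are assumptions, and `ct_slice` stands for a 2-D grayscale CT image:

```python
# Illustrative pipeline: Gabor enhancement, then marker-controlled watershed.
# Thresholds and the Gabor frequency are assumed; `ct_slice` is a 2-D
# grayscale CT image supplied by the caller.
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import gabor, sobel, threshold_otsu
from skimage.segmentation import watershed

def segment_lung_candidates(ct_slice):
    enhanced, _ = gabor(ct_slice, frequency=0.6)  # real part of Gabor response
    elevation = sobel(enhanced)                   # gradient map for the watershed
    thresh = threshold_otsu(enhanced)
    markers = np.zeros_like(ct_slice, dtype=int)
    markers[enhanced < thresh - 0.1] = 1          # confident background seeds
    markers[enhanced > thresh + 0.1] = 2          # confident candidate seeds
    labels = watershed(elevation, markers)        # flood from the seed markers
    return ndi.label(labels == 2)[0]              # label connected candidates
```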
A Model to Translate Evidence-Based Interventions Into Community Practice
Christiansen, Ann L.; Peterson, Donna J.; Guse, Clare E.; Maurana, Cheryl A.; Brandenburg, Terry
2012-01-01
There is a tension between two alternative approaches to implementing community-based interventions. The evidence-based public health movement emphasizes the scientific basis of prevention by disseminating rigorously evaluated interventions from academic and governmental agencies to local communities. Models used by local health departments to incorporate community input into their planning, such as the community health improvement process (CHIP), emphasize community leadership in identifying health problems and developing and implementing health improvement strategies. Each approach has limitations. Modifying CHIP to formally include consideration of evidence-based interventions in both the planning and evaluation phases leads to an evidence-driven community health improvement process that can serve as a useful framework for uniting the different approaches while emphasizing community ownership, priorities, and wisdom. PMID:22397341
PREDICTIVE MODELING OF LIGHT-INDUCED MORTALITY OF ENTEROCOCCI FAECALIS IN RECREATIONAL WATERS
One approach to predictive modeling of biological contamination of recreational waters involves the application of process-based approaches that consider microbial sources, hydrodynamic transport, and microbial fate. This presentation focuses on one important fate process, light-...
Dependent Neyman type A processes based on common shock Poisson approach
NASA Astrophysics Data System (ADS)
Kadilar, Gamze Özel; Kadilar, Cem
2016-04-01
The Neyman type A process is used for describing clustered data, since the Poisson process is insufficient for clustering of events. In a multivariate setting, there may be dependencies between multivariate Neyman type A processes. In this study, a dependent form of the Neyman type A process is considered under the common shock approach, and the joint probability function is derived for the dependent Neyman type A Poisson processes. An application based on forest fires in Turkey is then given. The results show that the joint probability function of the dependent Neyman type A processes obtained in this study can be a good tool for the probabilistic fit of the total number of burned trees in Turkey.
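For intuition, a Neyman type A count is a Poisson-distributed number of clusters, each contributing a Poisson-distributed count. A small simulation sketch, with a dependent pair built from a shared common-shock component (all rate parameters are illustrative):

```python
# Simulation sketch: a Neyman type A count is a Poisson number of clusters,
# each of Poisson size; a common shock shared by two processes induces
# dependence. All rate parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def neyman_type_a(lam, nu, size):
    clusters = rng.poisson(lam, size)                    # number of clusters
    return np.array([rng.poisson(nu, c).sum() for c in clusters])

common = neyman_type_a(2.0, 3.0, 10_000)   # shared common-shock component
x = neyman_type_a(1.5, 3.0, 10_000) + common
y = neyman_type_a(1.0, 3.0, 10_000) + common
print("induced correlation:", np.corrcoef(x, y)[0, 1].round(2))
```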
Báscolo, Ernesto Pablo; Yavich, Natalia; Denis, Jean-Louis
2016-01-01
Background Primary health care (PHC)-based reforms have had different results in Latin America. Little attention has been paid to the enablers of the collective action capacities required to produce a comprehensive PHC approach. Objective To analyse the enablers of collective action capacities to transform health systems towards a comprehensive PHC approach in Latin American PHC-based reforms. Methods We conducted a longitudinal, retrospective case study of three municipal PHC-based reforms in Bolivia and Argentina. We used multiple data sources and methodologies: document review; interviews with policymakers, managers and practitioners; and household and services surveys. We used temporal bracketing to analyse how the dynamics of interaction between the institutional reform process and the collective action characteristics enabled or hindered the collective action capacities required to produce the envisioned changes. Results The institutional structuring dynamics and collective action capacities were different in each case. In Cochabamba, there was an ‘interrupted’ structuring process that achieved the establishment of a primary level with a selective PHC approach. In Vicente López, there was a ‘path-dependency’ structuring process that permitted the consolidation of a ‘primary care’ approach, but with limited influence in hospitals. In Rosario, there was a ‘dialectic’ structuring process that favoured the development of the capacities needed to consolidate a comprehensive PHC approach that permeates the entire system. Conclusion The institutional change processes achieved the development of a primary health care level with different degrees of consolidation and system-wide influence, given how the characteristics of each collective action enabled or hindered the ‘structuring’ processes. PMID:27209640
Cai, Bin; Altman, Michael B; Garcia-Ramirez, Jose; LaBrash, Jason; Goddu, S Murty; Mutic, Sasa; Parikh, Parag J; Olsen, Jeffrey R; Saad, Nael; Zoberi, Jacqueline E
To develop a safe and robust workflow for yttrium-90 (Y-90) radioembolization procedures in a multidisciplinary team environment, a generalized Define-Measure-Analyze-Improve-Control (DMAIC)-based approach to process improvement was applied to a Y-90 radioembolization workflow. In the first DMAIC cycle, events within the Y-90 workflow were defined and analyzed. To improve the workflow, a web-based interactive electronic white board (EWB) system was adopted as the central communication platform and information processing hub. The EWB-based Y-90 workflow then underwent a second DMAIC cycle. Out of 245 treatments, three misses that went undetected until treatment initiation were recorded over a period of 21 months, and root-cause analysis was performed to determine the causes of each incident and opportunities for improvement. The EWB-based Y-90 process was further improved via new rules to define reliable sources of information as inputs into the planning process, as well as new checkpoints to ensure this information was communicated correctly throughout the process flow. After implementation of the revised EWB-based Y-90 workflow, following two DMAIC-like cycles, there were zero misses out of 153 patient treatments in 1 year. The DMAIC-based approach adopted here allowed the iterative development of a robust workflow to achieve an adaptable, event-minimizing planning process despite a complex setting that requires the participation of multiple teams for Y-90 microspheres therapy. Implementation of such a workflow using the EWB or a similar platform with a DMAIC-based process improvement approach could be expanded to other treatment procedures, especially those requiring multidisciplinary management. Copyright © 2016 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
E-Learning Personalization Using Triple-Factor Approach in Standard-Based Education
NASA Astrophysics Data System (ADS)
Laksitowening, K. A.; Santoso, H. B.; Hasibuan, Z. A.
2017-01-01
E-learning can be a tool for monitoring the learning process and progress towards a targeted competency. Process and progress can differ from one learner to another, since every learner may have a different learning type. Learning type itself can be identified by taking into account learning style, motivation, and knowledge ability. This study explores personalization of learning type based on the Triple-Factor Approach. Considering that the factors in the Triple-Factor Approach are dynamic, the personalization system needs to accommodate the changes that may occur. Motivated by this issue, this study proposes personalization that dynamically guides learner progression through the stages of the learning process. The personalization is implemented in the form of interventions that trigger learners to access learning contents and discussion forums more often, as well as to improve their level of knowledge ability based on their current learning type.
NASA Astrophysics Data System (ADS)
Di Lorenzo, R.; Ingarao, G.; Fonti, V.
2007-05-01
The crucial task in the prevention of ductile fracture is the availability of a tool for predicting such defect occurrence. The technical literature investigates this topic widely, with contributions from many authors following different approaches. The main class of approaches regards the development of fracture criteria: generally, such criteria determine a critical value of a damage function which depends on stress and strain paths, and ductile fracture is assumed to occur when this critical value is reached during the analysed process. There is a relevant drawback related to the utilization of ductile fracture criteria: each criterion usually performs well in predicting fracture for particular stress-strain paths, i.e., it works very well for certain processes but may provide poor results for others. On the other hand, approaches based on damage mechanics formulations are very effective from a theoretical point of view, but they are very complex and their proper calibration is quite difficult. In this paper, two different approaches are investigated to predict fracture occurrence in cold forming operations. The final aim of the proposed method is a tool with general reliability, i.e., one able to predict fracture for different forming processes. The proposed approach represents a step forward within a research project focused on the utilization of innovative predictive tools for ductile fracture. The paper presents a comparison between an artificial neural network design procedure and an approach based on statistical tools, both aimed at predicting fracture occurrence or absence based on a set of stress and strain path data. The proposed approach utilizes experimental data available, for a given material, on fracture occurrence in different processes. In more detail, the approach consists in the analysis of experimental tests in which fracture occurs, followed by numerical simulations of such processes in order to track the stress-strain paths in the workpiece region where fracture is expected. These data are used to build up a proper data set, which is utilized both to train an artificial neural network and to perform a statistical analysis aimed at predicting fracture occurrence. The developed statistical tool is properly designed and optimized and is able to recognize fracture occurrence. The reliability and predictive capability of the statistical method are compared with those obtained from an artificial neural network developed to predict fracture occurrence. Moreover, the approach is also validated on forming processes characterized by complex fracture mechanics.
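As a rough illustration of the comparison the paper describes, one can train a small neural network and a simple statistical classifier on the same fracture/no-fracture data. The features and synthetic labels below are placeholders for the stress-strain path data tracked by the numerical simulations:

```python
# Illustrative comparison: a small neural network vs. a logistic-regression
# ("statistical") classifier predicting fracture occurrence. Features and
# labels are synthetic placeholders for FEM-tracked stress-strain path data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(5)
X = rng.normal(size=(300, 4))   # e.g., triaxiality, peak stress, strain measures
y = (X[:, 0] + 0.5 * X[:, 1] ** 2 + rng.normal(0, 0.3, 300) > 0.8).astype(int)

models = {"ANN": MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000),
          "statistical": LogisticRegression()}
for name, model in models.items():
    print(name, cross_val_score(model, X, y, cv=5).mean().round(2))
```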
NASA Astrophysics Data System (ADS)
Chardon, J.; Mathevet, T.; Le Lay, M.; Gailhard, J.
2012-04-01
In the context of a national energy company (EDF: Electricité de France), hydro-meteorological forecasts are necessary to ensure the safety and security of installations, meet environmental standards and improve water resources management and decision making. Hydrological ensemble forecasts allow a better representation of meteorological and hydrological forecast uncertainties and improve the human expertise of hydrological forecasts, which is essential to synthesize available information coming from different meteorological and hydrological models and human experience. An operational hydrological ensemble forecasting chain has been developed at EDF since 2008 and has been used since 2010 on more than 30 watersheds in France. This ensemble forecasting chain is characterized by ensemble pre-processing (rainfall and temperature) and post-processing (streamflow), where substantial human expertise is solicited. The aim of this paper is to compare two hydrological ensemble post-processing methods developed at EDF in order to improve ensemble forecast reliability (similar to Montanari & Brath, 2004; Schaefli et al., 2007). The aim of the post-processing methods is to dress hydrological ensemble forecasts with hydrological model uncertainties, based on perfect forecasts. The first method (called the empirical approach) is based on statistical modeling of the empirical error of perfect forecasts, using streamflow sub-samples by quantile class and lead time. The second method (called the dynamical approach) is based on streamflow sub-samples by quantile class and streamflow variation, and lead time. On a set of 20 watersheds used for operational forecasts, results show that both approaches are necessary to ensure good post-processing of the hydrological ensemble, allowing a good improvement of the reliability, skill and sharpness of ensemble forecasts. The comparison of the empirical and dynamical approaches shows the limits of the empirical approach, which is not able to take into account hydrological dynamics and processes, i.e., sample heterogeneity. The same streamflow range can correspond to different processes, such as rising limbs or recessions, where uncertainties differ. The dynamical approach improves the reliability, skill and sharpness of forecasts and globally reduces confidence interval width. When compared in detail, the dynamical approach allows a noticeable reduction of confidence intervals during recessions, where uncertainty is relatively lower, and a slight increase of confidence intervals during rising limbs or snowmelt, where uncertainty is greater. The dynamical approach, validated by forecasters' experience (the empirical approach was considered not discriminative enough), improved forecasters' confidence and the communication of uncertainties. Montanari, A. and Brath, A., (2004). A stochastic approach for assessing the uncertainty of rainfall-runoff simulations. Water Resources Research, 40, W01106, doi:10.1029/2003WR002540. Schaefli, B., Balin Talamba, D. and Musy, A., (2007). Quantifying hydrological modeling errors through a mixture of normal distributions. Journal of Hydrology, 332, 303-315.
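The empirical dressing idea can be sketched in a toy form: learn the distribution of past model errors within streamflow quantile classes, then dress a new forecast with errors drawn from its class. This is purely illustrative; the operational method additionally conditions on lead time and, in the dynamical variant, on streamflow variation:

```python
# Toy "empirical" dressing: pool past errors by forecast quantile class,
# then dress a new forecast with errors sampled from its class. Lead-time
# stratification and the dynamical variant are omitted for brevity.
import numpy as np

rng = np.random.default_rng(1)

def fit_error_classes(past_fcst, past_obs, n_classes=5):
    errors = past_obs - past_fcst
    edges = np.quantile(past_fcst, np.linspace(0, 1, n_classes + 1))
    cls = np.clip(np.searchsorted(edges, past_fcst) - 1, 0, n_classes - 1)
    return edges, [errors[cls == k] for k in range(n_classes)]

def dress(forecast, edges, class_errors, n_members=50):
    k = np.clip(np.searchsorted(edges, forecast) - 1, 0, len(class_errors) - 1)
    return forecast + rng.choice(class_errors[k], n_members)

past_fcst = rng.gamma(2.0, 50.0, 5000)               # synthetic streamflow, m3/s
past_obs = past_fcst * rng.lognormal(0, 0.15, 5000)  # synthetic "truth"
edges, errs = fit_error_classes(past_fcst, past_obs)
print(np.percentile(dress(120.0, edges, errs), [10, 50, 90]))
```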
Spatiotemporal video deinterlacing using control grid interpolation
NASA Astrophysics Data System (ADS)
Venkatesan, Ragav; Zwart, Christine M.; Frakes, David H.; Li, Baoxin
2015-03-01
With the advent of progressive-format display and broadcast technologies, video deinterlacing has become an important video-processing technique. Numerous approaches exist in the literature to accomplish deinterlacing. While most earlier methods were simple linear filtering-based approaches, the emergence of faster computing technologies and even dedicated video-processing hardware in display units has allowed higher-quality but more computationally intense deinterlacing algorithms to become practical. Most modern approaches analyze motion and content in video to select different deinterlacing methods for various spatiotemporal regions. We introduce a family of deinterlacers that employ spectral residue to choose between, and weight, spatial and temporal deinterlacing methods based on control grid interpolation. The proposed approaches perform better than the prior state-of-the-art based on peak signal-to-noise ratio, other visual quality metrics, and simple perception-based subjective evaluations conducted by human viewers. We further study the advantages of using soft and hard decision thresholds on visual performance.
Lyapunov-Based Sensor Failure Detection And Recovery For The Reverse Water Gas Shift Process
NASA Technical Reports Server (NTRS)
Haralambous, Michael G.
2001-01-01
Livingstone, a model-based AI software system, is planned for use in the autonomous fault diagnosis, reconfiguration, and control of the oxygen-producing reverse water gas shift (RWGS) process test-bed located in the Applied Chemistry Laboratory at KSC. In this report the RWGS process is first briefly described and an overview of Livingstone is given. Next, a Lyapunov-based approach for detecting and recovering from sensor failures, differing significantly from that used by Livingstone, is presented. In this new method, models used are in terms of the defining differential equations of system components, thus differing from the qualitative, static models used by Livingstone. An easily computed scalar inequality constraint, expressed in terms of sensed system variables, is used to determine the existence of sensor failures. In the event of sensor failure, an observer/estimator is used for determining which sensors have failed. The theory underlying the new approach is developed. Finally, a recommendation is made to use the Lyapunov-based approach to complement the capability of Livingstone and to use this combination in the RWGS process.
Agent-based modeling: a new approach for theory building in social psychology.
Smith, Eliot R; Conrey, Frederica R
2007-02-01
Most social and psychological phenomena occur not as the result of isolated decisions by individuals but rather as the result of repeated interactions between multiple individuals over time. Yet the theory-building and modeling techniques most commonly used in social psychology are less than ideal for understanding such dynamic and interactive processes. This article describes an alternative approach to theory building, agent-based modeling (ABM), which involves simulation of large numbers of autonomous agents that interact with each other and with a simulated environment and the observation of emergent patterns from their interactions. The authors believe that the ABM approach is better able than prevailing approaches in the field, variable-based modeling (VBM) techniques such as causal modeling, to capture types of complex, dynamic, interactive processes so important in the social world. The article elaborates several important contrasts between ABM and VBM and offers specific recommendations for learning more and applying the ABM approach.
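As a flavor of the ABM style the authors advocate, here is a minimal bounded-confidence opinion model (a classic ABM, not one from the article): agents repeatedly interact with neighbors, converge locally, and attitude clusters emerge with no global rule:

```python
# Minimal bounded-confidence opinion ABM: agents on a ring interact with a
# random neighbor and converge when their attitudes are close enough.
# Clusters emerge from repeated local interactions, with no global rule.
import numpy as np

rng = np.random.default_rng(42)
attitudes = rng.uniform(0, 1, 200)           # one continuous attitude per agent

for _ in range(20_000):
    i = rng.integers(200)
    j = (i + rng.choice([-1, 1])) % 200      # a ring neighbor
    if abs(attitudes[i] - attitudes[j]) < 0.2:   # bounded confidence window
        attitudes[i] = attitudes[j] = (attitudes[i] + attitudes[j]) / 2

print(np.unique(np.round(attitudes, 1)))     # a few emergent attitude clusters
```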
The Population Care Coordination Process.
Rushton, Sharron
2015-01-01
The purpose of this article is to outline a population-based approach to providing care coordination. The Population Care Coordination Process provides a framework for each provider and/or organization to provide multilevel care based on population- and patient-centered principles. The Population Care Coordination Process is scalable: it can be utilized on a smaller scale, such as a single provider office, or on a larger scale, such as an accountable care organization. There are many issues within our current health care structure that must be addressed. Care coordination has been identified as a potential solution to address the needs of complex patients within the system. The expansion to consider populations allows for a more targeted and efficient approach. The population care process entails a data-driven approach to care coordination, and the inclusion of populations in the care coordination process provides an opportunity to maximize efforts and improve outcomes.
General purpose graphic processing unit implementation of adaptive pulse compression algorithms
NASA Astrophysics Data System (ADS)
Cai, Jingxiao; Zhang, Yan
2017-07-01
This study introduces a practical approach to implement real-time signal processing algorithms for general surveillance radar based on NVIDIA graphical processing units (GPUs). The pulse compression algorithms are implemented using compute unified device architecture (CUDA) libraries such as CUDA basic linear algebra subroutines and CUDA fast Fourier transform library, which are adopted from open source libraries and optimized for the NVIDIA GPUs. For more advanced, adaptive processing algorithms such as adaptive pulse compression, customized kernel optimization is needed and investigated. A statistical optimization approach is developed for this purpose without needing much knowledge of the physical configurations of the kernels. It was found that the kernel optimization approach can significantly improve the performance. Benchmark performance is compared with the CPU performance in terms of processing accelerations. The proposed implementation framework can be used in various radar systems including ground-based phased array radar, airborne sense and avoid radar, and aerospace surveillance radar.
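The core operation being accelerated, pulse compression, is a matched filter computed in the frequency domain. A sketch using NumPy's FFT; the same calls exist in GPU array libraries such as CuPy (`cupy.fft`), which is one way to approximate the cuFFT-based pipeline described above (the chirp parameters are illustrative):

```python
# Matched filtering (pulse compression) via the frequency domain. NumPy is
# used here; swapping in a GPU array library such as CuPy is one way to
# move the same computation onto an NVIDIA GPU.
import numpy as np

def pulse_compress(received, reference):
    n = len(received) + len(reference) - 1        # linear-correlation length
    r = np.fft.fft(received, n)
    s = np.fft.fft(reference, n)
    return np.fft.ifft(r * np.conj(s))            # cross-correlation by FFT

t = np.linspace(0, 1e-5, 1000)
chirp = np.exp(1j * np.pi * 4e9 * t**2)           # illustrative linear FM pulse
echo = np.concatenate([np.zeros(300), chirp, np.zeros(200)])
echo += 0.1 * (np.random.randn(len(echo)) + 1j * np.random.randn(len(echo)))
print("target delay bin:", np.argmax(np.abs(pulse_compress(echo, chirp))))
```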
A quality by design approach to scale-up of high-shear wet granulation process.
Pandey, Preetanshu; Badawy, Sherif
2016-01-01
High-shear wet granulation is a complex process, which in turn makes scale-up a challenging task. Scale-up of the high-shear wet granulation process has been studied extensively, with various methodologies proposed in the literature. This review article discusses existing scale-up principles and categorizes the various approaches into two main scale-up strategies - parameter-based and attribute-based. With the advent of the quality by design (QbD) principle in the drug product development process, an increased emphasis on the latter approach may be needed to ensure product robustness. In practice, a combination of both scale-up strategies is often utilized. In a QbD paradigm, there is also a need for an increased fundamental and mechanistic understanding of the process. This can be achieved either by increased experimentation, which comes at higher cost, or by using modeling techniques, which are also discussed as part of this review.
Model-Based GN and C Simulation and Flight Software Development for Orion Missions beyond LEO
NASA Technical Reports Server (NTRS)
Odegard, Ryan; Milenkovic, Zoran; Henry, Joel; Buttacoli, Michael
2014-01-01
For Orion missions beyond low Earth orbit (LEO), the Guidance, Navigation, and Control (GN&C) system is being developed using a model-based approach for simulation and flight software. Lessons learned from the development of GN&C algorithms and flight software for the Orion Exploration Flight Test One (EFT-1) vehicle have been applied to the development of further capabilities for Orion GN&C beyond EFT-1. Continuing the use of a Model-Based Development (MBD) approach with the Matlab®/Simulink® tool suite, the process for GN&C development and analysis has been largely improved. Furthermore, a model-based simulation environment in Simulink, rather than an external C-based simulation, greatly eases the process for development of flight algorithms. The benefits seen by employing lessons learned from EFT-1 are described, as well as the approach for implementing additional MBD techniques. Also detailed are the key enablers for improvements to the MBD process, including enhanced configuration management techniques for model-based software systems, automated code and artifact generation, and automated testing and integration.
Detecting determinism from point processes.
Andrzejak, Ralph G; Mormann, Florian; Kreuz, Thomas
2014-12-01
The detection of a nonrandom structure from experimental data can be crucial for the classification, understanding, and interpretation of the generating process. We here introduce a rank-based nonlinear predictability score to detect determinism from point process data. Thanks to its modular nature, this approach can be adapted to whatever signature in the data one considers indicative of deterministic structure. After validating our approach using point process signals from deterministic and stochastic model dynamics, we show an application to neuronal spike trains recorded in the brain of an epilepsy patient. While we illustrate our approach in the context of temporal point processes, it can be readily applied to spatial point processes as well.
Risk management for moisture related effects in dry manufacturing processes: a statistical approach.
Quiroz, Jorge; Strong, John; Zhang, Lanju
2016-03-01
A risk- and science-based approach to controlling quality in pharmaceutical manufacturing includes a full understanding of how product attributes and process parameters relate to product performance through a proactive approach in formulation and process development. For dry manufacturing, where moisture content is not directly manipulated within the process, variability in the moisture of the incoming raw materials can impact both processability and drug product quality attributes. A statistical approach is developed using individual raw material historical lots as a basis for calculating tolerance intervals for drug product moisture content, so that risks associated with excursions in moisture content can be mitigated. The proposed method is based on a model-independent approach that uses available data to estimate parameters of interest describing the population of blend moisture content values, and it does not require knowledge of the individual blend moisture content values. Another advantage of the proposed tolerance intervals is that they do not require the use of tabulated values for tolerance factors. This facilitates implementation in any spreadsheet program, such as Microsoft Excel. A computational example is used to demonstrate the proposed method.
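In the same spirit (a tolerance interval without tabulated tolerance factors), a distribution-free interval can be built from order statistics of the historical lots: the coverage of the (min, max) interval of n i.i.d. lots follows a Beta(n-1, 2) law, so the achieved confidence for a target coverage is computed directly. Note this is a sketch of the general idea, not the paper's specific method:

```python
# Distribution-free tolerance interval from order statistics: the coverage
# of (min, max) of n i.i.d. lots is Beta(n-1, 2), so no tabulated k-factors
# are needed. The moisture data are simulated for illustration.
import numpy as np
from scipy import stats

def nonparametric_tolerance_interval(lots, coverage=0.95):
    lots = np.sort(np.asarray(lots))
    n = len(lots)
    confidence = 1 - stats.beta.cdf(coverage, n - 1, 2)  # P(coverage >= target)
    return lots[0], lots[-1], confidence

moisture = np.random.normal(2.1, 0.15, 60)  # hypothetical lot moisture, % w/w
lo, hi, conf = nonparametric_tolerance_interval(moisture)
print(f"[{lo:.2f}, {hi:.2f}] covers 95% of lots with {conf:.0%} confidence")
```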
Kahn, Jeremy M; Gould, Michael K; Krishnan, Jerry A; Wilson, Kevin C; Au, David H; Cooke, Colin R; Douglas, Ivor S; Feemster, Laura C; Mularski, Richard A; Slatore, Christopher G; Wiener, Renda Soylemez
2014-05-01
Many health care performance measures are either not based on high-quality clinical evidence or not tightly linked to patient-centered outcomes, limiting their usefulness in quality improvement. In this report we summarize the proceedings of an American Thoracic Society workshop convened to address this problem by reviewing current approaches to performance measure development and creating a framework for developing high-quality performance measures by basing them directly on recommendations from well-constructed clinical practice guidelines. Workshop participants concluded that ideally performance measures addressing care processes should be linked to clinical practice guidelines that explicitly rate the quality of evidence and the strength of recommendations, such as the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) process. Under this framework, process-based performance measures would only be developed from strong recommendations based on high- or moderate-quality evidence. This approach would help ensure that clinical processes specified in performance measures are both of clear benefit to patients and supported by strong evidence. Although this approach may result in fewer performance measures, it would substantially increase the likelihood that quality-improvement programs based on these measures actually improve patient care.
Modelling of the mercury loss in fluorescent lamps under the influence of metal oxide coatings
NASA Astrophysics Data System (ADS)
Santos Abreu, A.; Mayer, J.; Lenk, D.; Horn, S.; Konrad, A.; Tidecks, R.
2016-11-01
The mercury transport and loss mechanisms in the metal oxide coatings of mercury low-pressure discharge fluorescent lamps have been investigated. An existing model based on a ballistic process is discussed in the context of experimental mercury loss data. Two different approaches to modeling the mercury loss have been developed. The first is based on mercury transition rates between the plasma, the coating, and the glass, without specifying the underlying physical processes. The second is based on a transport process driven by diffusion and a binding process in which mercury reacts to form mercury oxide inside the layers. Moreover, we extended the diffusion-based model to handle multi-component coatings. All approaches are applied to describe mercury loss experiments under the influence of an Al2O3 coating.
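The second model (diffusion plus binding) has the form of a reaction-diffusion equation, ∂c/∂t = D ∂²c/∂x² − kc, which can be sketched with an explicit finite-difference scheme. The diffusivity, binding rate, layer thickness, and boundary conditions below are assumed for illustration:

```python
# Explicit finite-difference sketch of dc/dt = D d2c/dx2 - k c inside the
# coating: diffusion from the plasma side plus a first-order binding sink.
# D, k, thickness, and boundary conditions are assumed for illustration.
import numpy as np

D, k = 1e-14, 1e-4          # diffusivity (m^2/s) and binding rate (1/s)
L, nx = 1e-6, 100           # layer thickness (m) and number of grid points
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D        # step size within the explicit stability limit

c = np.zeros(nx)
c[0] = 1.0                  # normalized mercury concentration at the plasma side

for _ in range(50_000):     # integrate to a quasi-steady profile
    lap = (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
    c[1:-1] += dt * (D * lap - k * c[1:-1])   # diffusion minus binding
    c[-1] = c[-2]                             # no-flux condition at the glass

print(np.round(c[::10], 3))  # concentration profile through the layer
```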
Process Writing and the Internet: Blogs and Ning Networks in the Classroom
ERIC Educational Resources Information Center
Boas, Isabela Villas
2011-01-01
In contrast to the product approach to writing, which is based on studying and replicating textual models, the process approach involves multiple and repeated steps that compel the writer to closely consider the topic, language, purpose for writing, and social reality of an audience. In addition to discussing the benefits of the process approach…
Focusing light through random photonic layers by four-element division algorithm
NASA Astrophysics Data System (ADS)
Fang, Longjie; Zhang, Xicheng; Zuo, Haoyi; Pang, Lin
2018-02-01
The propagation of waves in turbid media is a fundamental problem of optics with vast applications. Optical phase optimization approaches for focusing light through turbid media using phase control algorithms have been widely studied in recent years due to the rapid development of spatial light modulators. Existing approaches include element-based algorithms - the stepwise sequential algorithm and the continuous sequential algorithm - and whole-element optimization approaches - the partitioning algorithm, the transmission matrix approach, and the genetic algorithm. The advantage of element-based approaches is that the phase contribution of each element is very clear; however, because the intensity contribution of each element to the focal point is small, especially for a large number of elements, determining the optimal phase for a single element is difficult. In other words, the signal-to-noise ratio of the measurement is weak, possibly leading to local maxima during the optimization. In whole-element optimization approaches, all elements are employed for the optimization, so the signal-to-noise ratio during the optimization is improved. However, because more randomness is introduced into the processing, optimization takes more time to converge than in the single-element-based approaches. Building on the advantages of both single-element-based and whole-element optimization approaches, we propose the FEDA approach. Comparisons with the existing approaches show that FEDA takes only one third of the measurement time to converge, which means that FEDA is promising for practical applications such as deep tissue imaging.
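For context, the element-based family that FEDA improves on can be simulated in a few lines: each SLM element's phase is scanned while the others are held fixed, keeping the phase that maximizes focal intensity. The random transmission vector stands in for the turbid layer; FEDA's four-element grouping is not reproduced here:

```python
# Element-wise (continuous sequential) phase optimization through a random
# medium: scan each element's phase and keep the best. The transmission
# vector t is a random stand-in for the turbid layer.
import numpy as np

rng = np.random.default_rng(7)
n = 64
t = rng.standard_normal(n) + 1j * rng.standard_normal(n)  # medium response

def focal_intensity(phases):
    return np.abs(np.sum(t * np.exp(1j * phases))) ** 2

phases = np.zeros(n)
candidates = np.linspace(0, 2 * np.pi, 16, endpoint=False)
for i in range(n):                       # one sequential pass over elements
    trials = []
    for p in candidates:
        phases[i] = p
        trials.append(focal_intensity(phases))
    phases[i] = candidates[int(np.argmax(trials))]

print(focal_intensity(np.zeros(n)), "->", focal_intensity(phases))
```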
Compliance monitoring in business processes: Functionalities, application, and tool-support.
Ly, Linh Thao; Maggi, Fabrizio Maria; Montali, Marco; Rinderle-Ma, Stefanie; van der Aalst, Wil M P
2015-12-01
In recent years, monitoring the compliance of business processes with relevant regulations, constraints, and rules during runtime has evolved as major concern in literature and practice. Monitoring not only refers to continuously observing possible compliance violations, but also includes the ability to provide fine-grained feedback and to predict possible compliance violations in the future. The body of literature on business process compliance is large and approaches specifically addressing process monitoring are hard to identify. Moreover, proper means for the systematic comparison of these approaches are missing. Hence, it is unclear which approaches are suitable for particular scenarios. The goal of this paper is to define a framework for Compliance Monitoring Functionalities (CMF) that enables the systematic comparison of existing and new approaches for monitoring compliance rules over business processes during runtime. To define the scope of the framework, at first, related areas are identified and discussed. The CMFs are harvested based on a systematic literature review and five selected case studies. The appropriateness of the selection of CMFs is demonstrated in two ways: (a) a systematic comparison with pattern-based compliance approaches and (b) a classification of existing compliance monitoring approaches using the CMFs. Moreover, the application of the CMFs is showcased using three existing tools that are applied to two realistic data sets. Overall, the CMF framework provides powerful means to position existing and future compliance monitoring approaches.
Steps in Moving Evidence-Based Health Informatics from Theory to Practice.
Rigby, Michael; Magrabi, Farah; Scott, Philip; Doupi, Persephone; Hypponen, Hannele; Ammenwerth, Elske
2016-10-01
To demonstrate and promote the importance of applying a scientific process to health IT design and implementation, and of basing this on research principles and techniques. A review by international experts linked to the IMIA Working Group on Technology Assessment and Quality Development. Four approaches are presented, linking to the creation of national professional expectations, adherence to research-based standards, quality assurance approaches to ensure safety, and scientific measurement of impact. Solely marketing- and aspiration-based approaches to health informatics applications are no longer ethical or acceptable when scientifically grounded evidence-based approaches are available and in use.
ERIC Educational Resources Information Center
Somba, Anne W.; Obura, Ger; Njuguna, Margaret; Itevete, Boniface; Mulwa, Jones; Wandera, Nooh
2015-01-01
The importance of writing skills in enhancing student performance in language exams and even other subject areas is widely acknowledged. At Jaffery secondary, the teaching of writing has generally drawn on three approaches: a product-based approach with a focus on what the students composed; a process-based approach that is focused on…
Enhancing Users' Participation in Business Process Modeling through Ontology-Based Training
NASA Astrophysics Data System (ADS)
Macris, A.; Malamateniou, F.; Vassilacopoulos, G.
Successful business process design requires the active participation of users who are familiar with organizational activities and business process modelling concepts. Hence, there is a need to provide users with reusable, flexible, agile and adaptable training material in order to enable them to instil their knowledge and expertise into business process design and automation activities. Knowledge reusability is of paramount importance in designing training material on process modelling, since it enables users to participate actively in process design/redesign activities stimulated by the changing business environment. This paper presents a prototype approach for the design and use of training material that provides significant advantages to both the designer (knowledge-content reusability and semantic web enablement) and the user (semantic search, knowledge navigation and knowledge dissemination). The approach is based on externalizing domain knowledge in the form of ontology-based knowledge networks (i.e., training scenarios serving specific training needs) so that it is made reusable.
A new approach towards image based virtual 3D city modeling by using close range photogrammetry
NASA Astrophysics Data System (ADS)
Singh, S. P.; Jain, K.; Mandla, V. R.
2014-05-01
A 3D city model is a digital representation of the Earth's surface and its related objects, such as buildings, trees, vegetation, and other manmade features belonging to an urban area. The demand for 3D city modeling is increasing day by day for various engineering and non-engineering applications. Generally, three main image-based approaches are used for generating virtual 3D city models: sketch-based modeling, procedural grammar-based modeling, and close range photogrammetry-based modeling. The literature shows that, to date, there is no complete solution for creating a full 3D city model from images, and these image-based methods have limitations. This paper gives a new approach towards image-based virtual 3D city modeling using close range photogrammetry. The approach is divided into three sections: the data acquisition process, 3D data processing, and the data combination process. In the data acquisition process, a multi-camera setup was developed and used for video recording of an area; image frames were created from the video data, and the minimum required and suitable video image frames were selected for 3D processing. In the second section, a 3D model of the area was created based on close range photogrammetric principles and computer vision techniques. In the third section, this 3D model was exported for adding and merging other pieces of the larger area; scaling and alignment of the 3D model were done; and after applying texturing and rendering, a final photo-realistic textured 3D model was created and transferred into a walk-through model or movie form. Most of the processing steps are automatic, so this method is cost effective and less laborious, and the accuracy of the model is good. For this research work, the study area is the campus of the Department of Civil Engineering, Indian Institute of Technology, Roorkee, which acts as a prototype for a city. Aerial photography is restricted in many countries and high-resolution satellite images are costly; the proposed method, by contrast, is based only on simple video recording of the area and is thus suitable for 3D city modeling. A photo-realistic, scalable, geo-referenced virtual 3D city model is useful for many kinds of applications, such as planning in navigation, tourism, disaster management, transportation, municipal, urban and environmental management, and the real-estate industry. This study will therefore provide a good roadmap for the geomatics community to create photo-realistic virtual 3D city models using close range photogrammetry.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elter, M.; Schulz-Wendtland, R.; Wittenberg, T.
2007-11-15
Mammography is the most effective method for breast cancer screening available today. However, the low positive predictive value of breast biopsy resulting from mammogram interpretation leads to approximately 70% unnecessary biopsies with benign outcomes. To reduce the high number of unnecessary breast biopsies, several computer-aided diagnosis (CAD) systems have been proposed in the last several years. These systems help physicians in their decision to perform a breast biopsy on a suspicious lesion seen in a mammogram or to perform a short-term follow-up examination instead. We present two novel CAD approaches that both emphasize an intelligible decision process to predict breast biopsy outcomes from BI-RADS findings. An intelligible reasoning process is an important requirement for the acceptance of CAD systems by physicians. The first approach induces a global model based on decision-tree learning. The second approach is based on case-based reasoning and applies an entropic similarity measure. We have evaluated the performance of both CAD approaches on two large publicly available mammography reference databases using receiver operating characteristic (ROC) analysis, bootstrap sampling, and the ANOVA statistical significance test. Both approaches outperform the diagnosis decisions of the physicians. Hence, both systems have the potential to reduce the number of unnecessary breast biopsies in clinical practice. A comparison of the performance of the proposed decision tree and CBR approaches with a state-of-the-art approach based on artificial neural networks (ANN) shows that the CBR approach performs slightly better than the ANN approach, which in turn performs slightly better than the decision-tree approach. The differences are statistically significant (p value <0.001). On 2100 masses extracted from the DDSM database, the CBR approach, for example, resulted in an area under the ROC curve of Az = 0.89 ± 0.01, the decision-tree approach in Az = 0.87 ± 0.01, and the ANN approach in Az = 0.88 ± 0.01.
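Schematically, the decision-tree branch of this work amounts to training a tree on BI-RADS attributes to predict biopsy outcome. The feature set and synthetic data below are hypothetical stand-ins for the reference databases used in the paper:

```python
# Hypothetical decision-tree CAD: predict biopsy outcome from BI-RADS-style
# attributes. Data are synthetic; the real experiments used large databases.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(3)
# columns: BI-RADS assessment (1-5), age, mass shape, margin, density codes
X = np.column_stack([rng.integers(1, 6, 500), rng.integers(30, 85, 500),
                     rng.integers(1, 5, 500), rng.integers(1, 6, 500),
                     rng.integers(1, 5, 500)])
y = ((X[:, 0] >= 4) ^ (rng.random(500) < 0.2)).astype(int)  # noisy label

tree = DecisionTreeClassifier(max_depth=4)
print("AUC:", cross_val_score(tree, X, y, cv=5, scoring="roc_auc").mean().round(2))
```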
Unified modeling language and design of a case-based retrieval system in medical imaging.
LeBozec, C.; Jaulent, M. C.; Zapletal, E.; Degoulet, P.
1998-01-01
One goal of artificial intelligence research into case-based reasoning (CBR) systems is to develop approaches for designing useful and practical interactive case-based environments. Explaining each step of the design of the case base and of the retrieval process is critical for the application of case-based systems to the real world. We describe herein our approach to the design of IDEM--Images and Diagnosis from Examples in Medicine--a medical image case-based retrieval system for pathologists. Our approach is based on the expressiveness of an object-oriented modeling language standard: the Unified Modeling Language (UML). We created a set of diagrams in UML notation illustrating the steps of the CBR methodology we used. The key aspects of this approach were selecting the relevant objects of the system according to user requirements and enabling visualization of cases and of the components of the case retrieval process. Further evaluation of the expressiveness of the design document is required, but UML seems to be a promising formalism for improving communication between developers and users. PMID:9929346
Zhou, Li; Xu, Jin-Di; Zhou, Shan-Shan; Shen, Hong; Mao, Qian; Kong, Ming; Zou, Ye-Ting; Xu, Ya-Yun; Xu, Jun; Li, Song-Lin
2017-12-29
Exploring processing chemistry, in particular the chemical transformation mechanisms involved, is a key step to elucidate the scientific basis in traditional processing of herbal medicines. Previously, taking Rehmanniae Radix (RR) as a case study, the holistic chemome (secondary metabolome and glycome) difference between raw and processed RR was revealed by integrating hyphenated chromatographic techniques-based targeted glycomics and untargeted metabolomics. Nevertheless, the complex chemical transformation mechanisms underpinning the holistic chemome variation in RR processing remain to be extensively clarified. As a continuous study, here a novel strategy by combining chemomics-based marker compounds mining and mimetic processing is proposed for further exploring the chemical mechanisms involved in herbal processing. First, the differential marker compounds between raw and processed herbs were rapidly discovered by untargeted chemomics-based mining approach through multivariate statistical analysis of the chemome data obtained by integrated metabolomics and glycomics analysis. Second, the marker compounds were mimetically processed under the simulated physicochemical conditions as in the herb processing, and the final reaction products were chemically characterized by targeted chemomics-based mining approach. Third, the main chemical transformation mechanisms involved were clarified by linking up the original marker compounds and their mimetic processing products. Using this strategy, a set of differential marker compounds including saccharides, glycosides and furfurals in raw and processed RR was rapidly found, and the major chemical mechanisms involved in RR processing were elucidated as stepwise transformations of saccharides (polysaccharides, oligosaccharides and monosaccharides) and glycosides (iridoid glycosides and phenethylalcohol glycosides) into furfurals (glycosylated/non-glycosylated hydroxymethylfurfurals) by deglycosylation and/or dehydration. The research deliverables indicated that the proposed strategy could advance the understanding of RR processing chemistry, and therefore may be considered a promising approach for delving into the scientific basis in traditional processing of herbal medicines. Copyright © 2017 Elsevier B.V. All rights reserved.
Advanced process control framework initiative
NASA Astrophysics Data System (ADS)
Hill, Tom; Nettles, Steve
1997-01-01
The semiconductor industry, one of the world's most fiercely competitive industries, is driven by increasingly complex process technologies and global competition to improve cycle time, quality, and process flexibility. Due to the complexity of these problems, current process control techniques are generally nonautomated, time-consuming, reactive, nonadaptive, and focused on individual fabrication tools and processes. As the semiconductor industry moves into higher-density processes, radically new approaches are required. To address the need for advanced factory-level process control in this environment, Honeywell, Advanced Micro Devices (AMD), and SEMATECH formed the Advanced Process Control Framework Initiative (APCFI) joint research project. The project defines and demonstrates an Advanced Process Control (APC) approach based on SEMATECH's Computer Integrated Manufacturing (CIM) Framework. Its scope includes the coordination of Manufacturing Execution Systems, process control tools, and wafer fabrication equipment to provide the necessary process control capabilities. Moreover, it takes advantage of the CIM Framework to integrate and coordinate applications from other suppliers that provide services necessary for the overall system to function. This presentation discusses the key concept of model-based process control that differentiates the APC Framework. This major improvement over current methods enables new systematic process control by linking the knowledge of key process settings to desired product characteristics that reside in models created with commercial model development tools. The unique framework-based approach facilitates the integration of commercial tools and the reuse of their data by tying them together in an object-based structure. The presentation also explores the perspective of each organization's involvement in the APCFI project. Each has complementary goals and expertise to contribute: Honeywell represents the supplier viewpoint, AMD represents the user with 'real customer requirements', and SEMATECH provides a consensus-building organization that widely disseminates technology to suppliers and users in the semiconductor industry that face similar equipment and factory control system challenges.
Using fuzzy fractal features of digital images for the material surface analysis
NASA Astrophysics Data System (ADS)
Privezentsev, D. G.; Zhiznyakov, A. L.; Astafiev, A. V.; Pugin, E. V.
2018-01-01
Edge detection is an important task in image processing. There are many approaches in this area, including the Sobel and Canny operators. One of the promising techniques in image processing is the use of fuzzy logic and fuzzy sets theory, which allow processing quality to be increased by representing information in its fuzzy form. Most existing fuzzy image processing methods switch to fuzzy sets at very late stages, which leads to the loss of useful information. In this paper, a novel method of edge detection based on fuzzy image representation and fuzzy pixels is proposed. With this approach, the image is converted to fuzzy form in the first step. Different approaches to this conversion are described. Several membership functions for fuzzy pixel description, and requirements for their shape, are given. A novel approach to edge detection based on the Sobel operator and fuzzy image representation is proposed. Experimental testing of the developed method was performed on remote sensing images.
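As a rough illustration of the idea of fuzzifying the image before edge detection, the sketch below converts pixel intensities to fuzzy "brightness" memberships with a triangular membership function and then applies a crisp Sobel operator to the membership plane; the membership parameters and threshold are our own assumptions, not the authors' published formulation.

```python
import numpy as np

def triangular_membership(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    left = (x - a) / (b - a + 1e-12)
    right = (c - x) / (c - b + 1e-12)
    return np.clip(np.minimum(left, right), 0.0, 1.0)

def sobel_magnitude(img):
    """Plain Sobel gradient magnitude via manual 2-D convolution."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    pad = np.pad(img, 1, mode="edge")
    gx = np.zeros_like(img, float)
    gy = np.zeros_like(img, float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            win = pad[i:i + 3, j:j + 3]
            gx[i, j] = (win * kx).sum()
            gy[i, j] = (win * ky).sum()
    return np.hypot(gx, gy)

def fuzzy_sobel_edges(img, threshold=0.5):
    # Step 1: fuzzify on the *first* step, as the paper advocates.
    mu = triangular_membership(img.astype(float), 0.0, 127.5, 255.0)
    # Step 2: run the crisp Sobel operator on the membership plane.
    grad = sobel_magnitude(mu)
    # Step 3: defuzzify by thresholding the normalized gradient.
    return grad / (grad.max() + 1e-12) > threshold

edges = fuzzy_sobel_edges(np.random.randint(0, 256, (32, 32)))
```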
Application of genetic algorithm in integrated setup planning and operation sequencing
NASA Astrophysics Data System (ADS)
Kafashi, Sajad; Shakeri, Mohsen
2011-01-01
Process planning is an essential component for linking design and the manufacturing process. Setup planning and operation sequencing are two main tasks in process planning. Much research has addressed these two problems separately. Considering that the two functions are complementary, it is necessary to integrate them more tightly so that the performance of a manufacturing system can be improved economically and competitively. This paper presents a generative system and a genetic algorithm (GA) approach to process planning for a given part. The proposed approach and optimization methodology analyze the TAD (tool approach direction), tolerance relations between features and feature precedence relations to generate all possible setups and operations using a workshop resource database. Based on these technological constraints, the GA approach, which adopts a feature-based representation, optimizes the setup plan and the sequence of operations using cost indices. A case study shows that the developed system can generate satisfactory results in optimizing setup planning and operation sequencing simultaneously under feasible conditions.
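The abstract gives no implementation details, but a minimal GA for precedence-constrained operation sequencing might look like the following sketch. The part data, TAD assignments, precedence pairs and the random-key decoding (which guarantees precedence-feasible offspring) are all illustrative assumptions, not the authors' system.

```python
import random

# Hypothetical 6-feature part: each operation has a tool approach direction
# (TAD); a precedence pair (a, b) means operation a must precede b.
TAD = {0: "+Z", 1: "+Z", 2: "-X", 3: "-X", 4: "+Z", 5: "+Y"}
PRECEDENCE = [(0, 2), (1, 3), (2, 4)]

def decode(priorities):
    """Turn a priority vector into a precedence-feasible sequence."""
    remaining = set(TAD)
    seq = []
    while remaining:
        ready = [op for op in remaining
                 if all(a not in remaining for a, b in PRECEDENCE if b == op)]
        op = max(ready, key=lambda o: priorities[o])
        seq.append(op)
        remaining.remove(op)
    return seq

def cost(seq):
    """Setup-change cost: count TAD switches along the sequence."""
    return sum(TAD[a] != TAD[b] for a, b in zip(seq, seq[1:]))

def ga(pop_size=30, generations=100, mutation=0.2):
    pop = [[random.random() for _ in TAD] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: cost(decode(ind)))
        elite = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(elite):
            p1, p2 = random.sample(elite, 2)
            cut = random.randrange(len(TAD))
            child = p1[:cut] + p2[cut:]          # one-point crossover
            if random.random() < mutation:       # perturb one priority
                child[random.randrange(len(TAD))] = random.random()
            children.append(child)
        pop = elite + children
    best = min(pop, key=lambda ind: cost(decode(ind)))
    return decode(best), cost(decode(best))

print(ga())
```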
One Step at a Time: SBM as an Incremental Process.
ERIC Educational Resources Information Center
Conrad, Mark
1995-01-01
Discusses incremental SBM budgeting and answers questions regarding resource equity, bookkeeping requirements, accountability, decision-making processes, and purchasing. Approaching site-based management as an incremental process recognizes that every school system engages in some level of site-based decisions. Implementation can be gradual and…
An analysis of learning process based on scientific approach in physical chemistry experiment
NASA Astrophysics Data System (ADS)
Arlianty, Widinda Normalia; Febriana, Beta Wulan; Diniaty, Artina
2017-03-01
This study aimed to analyze the quality of the learning process based on a scientific approach in the physical chemistry experiment course of Chemistry Education students, Islamic University of Indonesia. The research was descriptive qualitative. The sample comprised second-semester students from the class of 2015. Data on the scientific learning process were collected through observation sheets and documentation of seven experiment titles. The results showed that the achievement of the scientific learning process in observing, questioning, experimenting and associating data was 73.98%, 81.79%, 80.74% and 76.94% respectively, which is categorized as medium. The communicating aspect reached the high category, with an achievement level of 86.11%.
Toward a transport-based analysis of nutrient spiraling and uptake in streams
Runkel, Robert L.
2007-01-01
Nutrient addition experiments are designed to study the cycling of nutrients in stream ecosystems where hydrologic and nonhydrologic processes determine nutrient fate. Because of the importance of hydrologic processes in stream ecosystems, a conceptual model known as nutrient spiraling is frequently employed. A central part of the nutrient spiraling approach is the determination of uptake length (SW), the average distance traveled by dissolved nutrients in the water column before uptake. Although the nutrient spiraling concept has been an invaluable tool in stream ecology, the current practice of estimating uptake length from steady-state nutrient data using linear regression (called here the "SW approach") presents a number of limitations. These limitations are identified by comparing the exponential SW equation with analytical solutions of a stream solute transport model. This comparison indicates that (1) SW is an aggregate measure of uptake that does not distinguish between main channel and storage zone processes, (2) SW is an integrated measure of numerous hydrologic and nonhydrologic processes; this process integration may lead to difficulties in interpretation when comparing estimates of SW, and (3) estimates of uptake velocity and areal uptake rate (Vf and U) based on SW are not independent of system hydrology. Given these findings, a transport-based approach to nutrient spiraling is presented for steady-state and time-series data sets. The transport-based approach for time-series data sets is suggested for future research on nutrient uptake as it provides a number of benefits, including the ability to (1) separately quantify main channel and storage zone uptake, (2) quantify specific hydrologic and nonhydrologic processes using various model parameters (process separation), (3) estimate uptake velocities and areal uptake rates that are independent of hydrologic effects, and (4) use short-term, non-plateau nutrient additions such that the effects of regeneration and mineralization are minimized. In summary, the transport-based, time-series approach provides a means of estimating traditional measures of nutrient uptake (SW, Vf, U) while providing additional information on the location and magnitude of uptake (main channel versus storage zone). Application of the transport-based approach to time-series data from Green Creek, Antarctica, indicates that the bulk of nitrate uptake (≈74% to 100%) occurred within the main channel where benthic uptake by algal mats is a likely process. Substantial uptake (≈26%) also occurred in the storage zone of one reach, where uptake is attributed to the microbial community.
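For context, the exponential SW equation and the derived metrics referred to above take the following standard form in the spiraling literature (the abstract itself does not reproduce them); here C is nutrient concentration, C_0 the concentration at the injection point, x distance downstream, u mean velocity, h depth, Q discharge and w stream width.

```latex
% Standard first-order spiraling relations (stream-solute literature):
\begin{align}
  C(x) &= C_0 \, e^{-x/S_W}                                   \\
  S_W  &= -\,1 \big/ \left(\text{slope of } \ln C(x) \text{ vs. } x\right) \\
  V_f  &= \frac{u\,h}{S_W} = \frac{Q}{w\,S_W}  % uptake velocity \\
  U    &= V_f\,C                               % areal uptake rate
\end{align}
```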
Yoshino, Hiroyuki; Hara, Yuko; Dohi, Masafumi; Yamashita, Kazunari; Hakomori, Tadashi; Kimura, Shin-Ichiro; Iwao, Yasunori; Itai, Shigeru
2018-04-01
Scale-up approaches for the film coating process have been established for each type of film coating equipment from thermodynamic and mechanical analyses over several decades. The objective of the present study was to establish a versatile scale-up approach for the film coating process, applicable to commercial production, that is based on a critical quality attribute (CQA) using the Quality by Design (QbD) approach and is independent of the equipment used. Experiments on a pilot scale using the Design of Experiment (DoE) approach were performed to find a suitable CQA from among surface roughness, contact angle, color difference, and coating film properties measured by terahertz spectroscopy. Surface roughness was determined to be a suitable CQA from a quantitative appearance evaluation. When surface roughness was fixed as the CQA, the water content of the film-coated tablets was determined to be the critical material attribute (CMA), a parameter that does not depend on scale or equipment. Finally, to verify the scale-up approach determined from the pilot scale, experiments on a commercial scale were performed. The good correlation between the surface roughness (CQA) and the water content (CMA) identified at the pilot scale was also retained at the commercial scale, indicating that our proposed method should be useful as a scale-up approach for the film coating process.
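In this QbD setting, verifying the scale-up rule amounts to checking that the pilot-scale CQA-CMA relationship still predicts commercial batches. A minimal sketch with entirely synthetic numbers (not the study's data):

```python
import numpy as np

# Illustrative (synthetic) data only: surface roughness (CQA) vs. tablet
# water content (CMA) at pilot scale, then a check at commercial scale.
water_pilot = np.array([1.2, 1.6, 2.0, 2.4, 2.8])       # % w/w, hypothetical
rough_pilot = np.array([0.95, 0.88, 0.80, 0.74, 0.66])  # roughness, hypothetical

slope, intercept = np.polyfit(water_pilot, rough_pilot, 1)

water_comm = np.array([1.5, 2.1, 2.6])                  # commercial batches
rough_comm = np.array([0.90, 0.78, 0.70])
pred = slope * water_comm + intercept
print("residuals at commercial scale:", rough_comm - pred)
```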
Arts-Based Learning: A New Approach to Nursing Education Using Andragogy.
Nguyen, Megan; Miranda, Joyal; Lapum, Jennifer; Donald, Faith
2016-07-01
Learner-oriented strategies focusing on learning processes are needed to prepare nursing students for complex practice situations. An arts-based learning approach uses art to nurture cognitive and emotional learning. Knowles' theory of andragogy aims to develop the skill of learning and can inform the process of implementing arts-based learning. This article explores the use and evaluation of andragogy-informed arts-based learning for teaching nursing theory at the undergraduate level. Arts-based learning activities were implemented and then evaluated by students and instructors using anonymous questionnaires. Most students reported that the activities promoted learning. All instructors indicated an interest in integrating arts-based learning into the curricula. Facilitators and barriers to mainstreaming arts-based learning were highlighted. Findings stimulate implications for prospective research and education. Findings suggest that arts-based learning approaches enhance learning by supporting deep inquiry and different learning styles. Further exploration of andragogy-informed arts-based learning in nursing and other disciplines is warranted. [J Nurs Educ. 2016;55(7):407-410.]. Copyright 2016, SLACK Incorporated.
Can microbes economically remove sulfur
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fox, J.L.
Researchers have reported that refiners who now rely on costly physicochemical procedures to desulfurize petroleum will soon have an alternative microbial-enzyme-based approach to this process. This new approach is still under development, and a considerable number of chemical engineering problems need to be solved before this process is ready for large-scale use. This paper reviews the several research projects dedicated to solving the problems that keep a biotechnology-based alternative from competing with chemical desulfurization.
Using fuzzy rule-based knowledge model for optimum plating conditions search
NASA Astrophysics Data System (ADS)
Solovjev, D. S.; Solovjeva, I. A.; Litovka, Yu V.; Arzamastsev, A. A.; Glazkov, V. P.; L’vov, A. A.
2018-03-01
The paper discusses existing approaches to plating process modeling aimed at decreasing the unevenness of the thickness distribution of plated surface coatings. However, these approaches do not take into account the experience, knowledge, and intuition of decision-makers when searching for the optimal conditions of the electroplating technological process. An original approach to the search for optimal conditions for applying electroplated coatings is proposed; it uses a rule-based knowledge model and allows one to reduce the unevenness of product thickness distribution. The block diagrams of a conventional control system for a galvanic process, as well as of a system based on the production model of knowledge, are considered. It is shown that the fuzzy production model of knowledge in the control system makes it possible to obtain galvanic coatings of a given thickness unevenness with a high degree of adequacy to the experimental data. The described experimental results confirm the theoretical conclusions.
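A minimal Mamdani-style sketch of what a fuzzy production rule model for plating could look like, with invented membership functions and rules (the paper's actual knowledge base is not reproduced), followed by a brute-force search for the condition minimizing predicted unevenness:

```python
import numpy as np

def tri(x, a, b, c):
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0, 1)

out_grid = np.linspace(0, 1, 201)         # thickness unevenness, normalized

def predict_unevenness(current_density):  # A/dm^2, hypothetical range 1..10
    low = tri(current_density, 0, 2, 5)
    high = tri(current_density, 4, 8, 12)
    # Rule 1: IF density is LOW  THEN unevenness is SMALL
    # Rule 2: IF density is HIGH THEN unevenness is LARGE
    small = np.minimum(tri(out_grid, 0.0, 0.2, 0.5), low)
    large = np.minimum(tri(out_grid, 0.4, 0.8, 1.0), high)
    agg = np.maximum(small, large)        # aggregate rule outputs
    return (out_grid * agg).sum() / agg.sum()   # centroid defuzzification

# Brute-force "optimum conditions search" over the fuzzy model:
densities = np.linspace(1, 10, 91)
best = min(densities, key=predict_unevenness)
print(f"predicted best current density: {best:.1f} A/dm^2")
```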
NASA Astrophysics Data System (ADS)
Lachowicz, Mirosław
2016-03-01
The very stimulating paper [6] discusses an approach to perception and learning in a large population of living agents. The approach is based on a generalization of kinetic theory methods in which the interactions between agents are described in terms of game theory. Such an approach was already discussed in Refs. [2-4] (see also references therein) in various contexts. The processes of perception and learning are based on the interactions between agents, and therefore general kinetic theory is a suitable tool for modeling them. However, the main question that arises is how the perception and learning processes may be treated in mathematical modeling. How may we precisely deliver suitable mathematical structures that are able to capture the various aspects of perception and learning?
Worklist handling in workflow-enabled radiological application systems
NASA Astrophysics Data System (ADS)
Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim; von Berg, Jens
2000-05-01
For the next generation of integrated information systems for health care applications, more emphasis has to be put on systems which, by design, support the reduction of cost, the increase in efficiency and the improvement of the quality of services. A substantial contribution to this will be the modeling, optimization, automation and enactment of processes in health care institutions. One of the perceived key success factors for the system integration of processes will be the application of workflow management, with workflow management systems as key technology components. In this paper we address workflow management in radiology. We focus on an important aspect of workflow management, the generation and handling of worklists, which automatically provide workflow participants with work items that reflect tasks to be performed. The display of worklists and the functions associated with work items are the visible part of an information system using a workflow management approach for the end-users. Appropriate worklist design and implementation will influence the user friendliness of a system and will largely determine work efficiency. Technically, in current imaging department information system environments (modality-PACS-RIS installations), a data-driven approach has been taken: worklists -- if present at all -- are generated from filtered views on application databases. In a future workflow-based approach, worklists will be generated by autonomous workflow services based on explicit process models and organizational models. This process-oriented approach will provide us with an integral view of entire health care processes or sub-processes. The paper describes the basic mechanisms of this approach and summarizes its benefits.
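The contrast between the two worklist-generation styles can be sketched in miniature as follows; the exam data model, statuses and task names are invented placeholders, not part of any DICOM or IHE specification.

```python
from dataclasses import dataclass

@dataclass
class Exam:
    exam_id: str
    status: str          # e.g. "scheduled", "acquired", "reported"

# Data-driven approach: a worklist is just a filtered view on application data.
def data_driven_worklist(exams, wanted_status):
    return [e.exam_id for e in exams if e.status == wanted_status]

# Process-oriented approach: an explicit process model says which task follows
# which state; a workflow service derives work items from the model state.
PROCESS_MODEL = {"scheduled": "acquire images",
                 "acquired": "report findings",
                 "reported": None}

def workflow_worklist(exams, role_tasks):
    items = []
    for e in exams:
        task = PROCESS_MODEL[e.status]
        if task in role_tasks:
            items.append((e.exam_id, task))
    return items

exams = [Exam("E1", "scheduled"), Exam("E2", "acquired")]
print(data_driven_worklist(exams, "acquired"))
print(workflow_worklist(exams, {"report findings"}))
```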
Comparison of Traditional and ADRI Based Teaching Approaches in an Introductory Programming Course
ERIC Educational Resources Information Center
Malik, Sohail Iqbal; Coldwell-Neilson, Jo
2017-01-01
Aim/Purpose: This study introduced a new teaching and learning approach based on an ADRI (Approach, Deployment, Result, Improvement) model in an introductory programming (IP) course. The effectiveness of the new teaching and learning process was determined by collecting feedback from the IP instructors and by analyzing the final exam grades of the…
A Sampled Literature Review of Design-Based Learning Approaches: A Search for Key Characteristics
ERIC Educational Resources Information Center
Gómez Puente, Sonia M.; van Eijck, Michiel; Jochems, Wim
2013-01-01
Design-based learning (DBL) is an educational approach grounded in the processes of inquiry and reasoning towards generating innovative artifacts, systems and solutions. The approach is well characterized in the context of learning natural sciences in secondary education. Less is known, however, of its characteristics in the context of higher…
Success in Sight: A Comprehensive Approach to School Improvement
ERIC Educational Resources Information Center
Cicchinelli, Lou; Dean, Ceri; Galvin, Mike; Goodwin, Bryan; Parsley, Danette
2006-01-01
This document was developed by Mid-continent Research for Education and Learning (McREL). Its approach to improving schools is to balance a prescriptive content approach and a context-driven process approach. "Success in Sight" is based on the "science" of improvement--it provides clear, specific, research-based guidance for what to do in schools. But it also…
Chad D. Pierskalla; Dorothy H. Anderson; David W. Lime
2000-01-01
To manage various recreation opportunities, managers and planners must consider the spatial and temporal scale of social processes when identifying opportunities on base maps. However, analyses of social process and spatial form are often treated as two distinct approaches--sociological and geographical. A sociologist might control for spatial form by adopting...
Shahaf, Goded; Pratt, Hillel
2013-01-01
In this work we demonstrate the principles of a systematic modeling approach to the neurophysiologic processes underlying a behavioral function. The modeling is based upon a flexible simulation tool, which enables parametric specification of the underlying neurophysiologic characteristics. While the impact of selecting specific parameters is of interest, in this work we focus on the insights which emerge from rather accepted assumptions regarding neuronal representation. We show that harnessing even such simple assumptions enables the derivation of significant insights regarding the nature of the neurophysiologic processes underlying behavior. We demonstrate our approach in some detail by modeling the behavioral go/no-go task. We further demonstrate the practical significance of this simplified modeling approach in interpreting experimental data: the manifestation of these processes in the EEG and ERP literature of normal and abnormal (ADHD) function, as well as in a comprehensive analysis of relevant ERP data. In fact, we show that from the model-based spatiotemporal segregation of the processes, it is possible to derive simple yet effective, theory-based EEG markers differentiating normal and ADHD subjects. We summarize by claiming that the neurophysiologic processes modeled for the go/no-go task are part of a limited set of neurophysiologic processes which underlie, in a variety of combinations, any behavioral function with a measurable operational definition. Such neurophysiologic processes could be sampled directly from EEG on the basis of model-based spatiotemporal segregation.
Fluvial Geomorphology and River Restoration: Uneasy Allies (Invited)
NASA Astrophysics Data System (ADS)
Kondolf, G. M.
2009-12-01
A growing body of literature demonstrates that river restoration based on understanding of geomorphic and ecological process is more likely to be sustainable than form-based approaches. In the early days of river ‘restoration’ in North America, most projects involved bank stabilization, habitat structure placement, or construction of rocked meandering channels, at odds with restoration of the dynamic processes we now see as fundamental to effective, sustainable restoration. Recent years have seen a growing body of restoration programs emphasizing restoration of connectivity and geomorphic process. This evolution has been reflected in publications, from the form-based approach advocated in the early 1990s by an NRC panel (which did not include a geomorphologist) to more recent works by interdisciplinary panels emphasizing process restoration. Large-scale river restoration came later to Europe, motivated by the EU Water Framework Directive (2000) requirements that member states implement measures to improve ecological status of degraded rivers. Interestingly, European approaches to restoration have often reflected a more nuanced understanding of process, including deliberate recreation of unstable braided channels, removal of bank protection, and reconnecting floodplains. In part this may reflect a reaction to the more thorough post-war channelization of rivers in western Europe. In part it may also reflect a greater influence of academic and research laboratories upon practitioners than in the US, where a strong anti-intellectual strain, cultural preference for easy fixes, and reluctance to conduct objective post-project assessments have contributed to the adoption of form-based approaches by many public agencies.
Semantic Service Design for Collaborative Business Processes in Internetworked Enterprises
NASA Astrophysics Data System (ADS)
Bianchini, Devis; Cappiello, Cinzia; de Antonellis, Valeria; Pernici, Barbara
Modern collaborating enterprises can be seen as borderless organizations whose processes are dynamically transformed and integrated with the ones of their partners (Internetworked Enterprises, IE), thus enabling the design of collaborative business processes. The adoption of Semantic Web and service-oriented technologies for implementing collaboration in such distributed and heterogeneous environments promises significant benefits. IE can model their own processes independently by using the Software as a Service paradigm (SaaS). Each enterprise maintains a catalog of available services, and these can be shared across IE and reused to build up complex collaborative processes. Moreover, each enterprise can adopt its own terminology and concepts to describe business processes and component services. This brings requirements to manage semantic heterogeneity in process descriptions which are distributed across different enterprise systems. To enable effective service-based collaboration, IEs have to standardize their process descriptions and model them through component services using the same approach and principles. For enabling collaborative business processes across IE, services should be designed following a homogeneous approach, possibly maintaining a uniform level of granularity. In the paper we propose an ontology-based semantic modeling approach intended to enrich and reconcile the semantics of process descriptions, to facilitate process knowledge management and to enable semantic service design (by discovery, reuse and integration of process elements/constructs). The approach brings together Semantic Web technologies, techniques in process modeling, ontology building and semantic matching in order to provide a comprehensive semantic modeling framework.
An Alternative Approach to Zero Tolerance Policies.
ERIC Educational Resources Information Center
Ilg, Timothy J.; Russo, Charles J.
2001-01-01
School officials should adopt no-tolerance policies that require educators' discretion in punishing misbehaving students (based on due process and fundamental fairness), rather than relying on the zero-tolerance approach, which fails to differentiate among different levels of offenses. Even disruptive students deserve due process and appropriate…
Bridging analytical approaches for low-carbon transitions
NASA Astrophysics Data System (ADS)
Geels, Frank W.; Berkhout, Frans; van Vuuren, Detlef P.
2016-06-01
Low-carbon transitions are long-term multi-faceted processes. Although integrated assessment models have many strengths for analysing such transitions, their mathematical representation requires a simplification of the causes, dynamics and scope of such societal transformations. We suggest that integrated assessment model-based analysis should be complemented with insights from socio-technical transition analysis and practice-based action research. We discuss the underlying assumptions, strengths and weaknesses of these three analytical approaches. We argue that full integration of these approaches is not feasible, because of foundational differences in philosophies of science and ontological assumptions. Instead, we suggest that bridging, based on sequential and interactive articulation of different approaches, may generate a more comprehensive and useful chain of assessments to support policy formation and action. We also show how these approaches address knowledge needs of different policymakers (international, national and local), relate to different dimensions of policy processes and speak to different policy-relevant criteria such as cost-effectiveness, socio-political feasibility, social acceptance and legitimacy, and flexibility. A more differentiated set of analytical approaches thus enables a more differentiated approach to climate policy making.
McDermott, Jason E.; Wang, Jing; Mitchell, Hugh; Webb-Robertson, Bobbie-Jo; Hafen, Ryan; Ramey, John; Rodland, Karin D.
2012-01-01
Introduction: The advent of high throughput technologies capable of comprehensive analysis of genes, transcripts, proteins and other significant biological molecules has provided an unprecedented opportunity for the identification of molecular markers of disease processes. However, it has simultaneously complicated the problem of extracting meaningful molecular signatures of biological processes from these complex datasets. The process of biomarker discovery and characterization provides opportunities for more sophisticated approaches to integrating purely statistical and expert knowledge-based approaches. Areas covered: In this review we will present examples of current practices for biomarker discovery from complex omic datasets and the challenges that have been encountered in deriving valid and useful signatures of disease. We will then present a high-level review of data-driven (statistical) and knowledge-based methods applied to biomarker discovery, highlighting some current efforts to combine the two distinct approaches. Expert opinion: Effective, reproducible and objective tools for combining data-driven and knowledge-based approaches to identify predictive signatures of disease are key to future success in the biomarker field. We will describe our recommendations for possible approaches to this problem including metrics for the evaluation of biomarkers. PMID:23335946
A machine-learned computational functional genomics-based approach to drug classification.
Lötsch, Jörn; Ultsch, Alfred
2016-12-01
The public accessibility of "big data" about the molecular targets of drugs and the biological functions of genes allows novel data science-based approaches to pharmacology that link drugs directly with their effects on pathophysiologic processes. This provides a phenotypic path to drug discovery and repurposing. This paper compares the performance of a functional genomics-based criterion to the traditional drug target-based classification. Knowledge discovery in the DrugBank and Gene Ontology databases allowed the construction of a "drug target versus biological process" matrix as a combination of "drug versus genes" and "genes versus biological processes" matrices. As a canonical example, such matrices were constructed for classical analgesic drugs. These matrices were projected onto a toroid grid of 50 × 82 artificial neurons using a self-organizing map (SOM). The distance, and hence cluster, structure of the high-dimensional feature space of the matrices was visualized on top of this SOM using a U-matrix. The cluster structure emerging on the U-matrix provided a correct classification of the analgesics into two main classes of opioid and non-opioid analgesics. The classification was flawless with both the functional genomics and the traditional target-based criterion. The functional genomics approach inherently included the drugs' modulatory effects on biological processes. The main pharmacological actions known from pharmacological science were captured, e.g., actions on lipid signaling for non-opioid analgesics, which comprised many NSAIDs, and actions on neuronal signal transmission for opioid analgesics. Using machine-learned techniques for computational drug classification in a comparative assessment, a functional genomics-based criterion was found to be similarly suitable for drug classification as the traditional target-based criterion. This supports the utility of functional genomics-based approaches to computational systems pharmacology for drug discovery and repurposing.
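The matrix composition at the heart of this approach is straightforward. The sketch below uses a toy vocabulary (invented drug-gene and gene-process annotations standing in for DrugBank and Gene Ontology content) to build the drug-versus-process matrix that would then be projected onto the SOM:

```python
import numpy as np

drugs = ["morphine", "ibuprofen"]
genes = ["OPRM1", "PTGS1", "PTGS2"]
processes = ["neuronal signal transmission", "lipid signaling"]

# drug x gene: 1 if the drug targets the gene (toy annotations)
D_G = np.array([[1, 0, 0],     # morphine  -> OPRM1
                [0, 1, 1]])    # ibuprofen -> PTGS1, PTGS2

# gene x process: 1 if the gene is annotated to the biological process
G_P = np.array([[1, 0],
                [0, 1],
                [0, 1]])

# drug x process = (drug x gene) . (gene x process)
D_P = (D_G @ G_P > 0).astype(int)
for d, row in zip(drugs, D_P):
    hits = [p for p, v in zip(processes, row) if v]
    print(d, "->", hits)
# The resulting feature matrix would then be projected onto a 50 x 82
# (toroidal) SOM and clustered via a U-matrix, as described in the paper.
```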
Prediction of the properties of anhydrite construction mixtures based on a neural network approach
NASA Astrophysics Data System (ADS)
Fedorchuk, Y. M.; Zamyatin, N. V.; Smirnov, G. V.; Rusina, O. N.; Sadenova, M. A.
2017-08-01
The article considers the application of a neural-network-based modeling mechanism for predicting the properties of anhydrite construction mixtures from their components, as part of managing the technological processes for producing fluoranhydrite-based construction products.
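As a generic illustration of this kind of property prediction (the abstract gives neither architecture nor data), a hedged scikit-learn sketch with entirely synthetic mixture compositions:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Placeholder features: e.g., fluoranhydrite, activator and water fractions.
X = rng.uniform(0, 1, size=(200, 3))
# Synthetic "strength" response; not the authors' data.
y = 30 * X[:, 0] - 10 * X[:, 2] + rng.normal(0, 1, 200)

model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000,
                     random_state=0)
model.fit(X[:150], y[:150])
print("R^2 on held-out mixtures:", round(model.score(X[150:], y[150:]), 3))
```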
What will the future of cloud-based astronomical data processing look like?
NASA Astrophysics Data System (ADS)
Green, Andrew W.; Mannering, Elizabeth; Harischandra, Lloyd; Vuong, Minh; O'Toole, Simon; Sealey, Katrina; Hopkins, Andrew M.
2017-06-01
Astronomy is rapidly approaching an impasse: very large datasets require remote or cloud-based parallel processing, yet many astronomers still try to download the data and develop serial code locally. Astronomers understand the need for change, but the hurdles remain high. We are developing a data archive designed from the ground up to simplify and encourage cloud-based parallel processing. While the volume of data we host remains modest by some standards, it is still large enough that download and processing times are measured in days and even weeks. We plan to implement a Python-based, notebook-like interface that automatically parallelises execution. Our goal is to provide an interface sufficiently familiar and user-friendly that it encourages astronomers to run their analysis on our system in the cloud: astroinformatics as a service. We describe how our system addresses the approaching impasse in astronomy, using the SAMI Galaxy Survey as an example.
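In miniature, the kind of notebook-friendly automatic parallelisation described might look like a parallel map. The sketch below runs locally with concurrent.futures, whereas the archive described would dispatch to cloud workers; the function and data names are invented:

```python
from concurrent.futures import ProcessPoolExecutor

def measure_flux(spectrum_id):
    # Placeholder per-object analysis step.
    return spectrum_id, sum(range(spectrum_id)) % 997

if __name__ == "__main__":
    spectra = range(1000)                 # stand-in for survey objects
    with ProcessPoolExecutor() as pool:   # "parallel map" over the dataset
        results = list(pool.map(measure_flux, spectra))
    print(results[:3])
```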
From IHE Audit Trails to XES Event Logs Facilitating Process Mining.
Paster, Ferdinand; Helm, Emmanuel
2015-01-01
Recently, Business Intelligence approaches such as process mining have been applied to the healthcare domain. The goal of process mining is to gain process knowledge, assess compliance and identify room for improvement by investigating recorded event data. Previous approaches focused on process discovery using event data from various specific systems. IHE, as a globally recognized basis for healthcare information systems, defines in its ATNA profile how real-world events must be recorded in centralized event logs. The approach presented here shows how audit trails collected by means of ATNA can be transformed to enable process mining. Using the standardized audit trails provides the ability to apply these methods to all IHE-based information systems.
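A minimal sketch of such a transformation, using simplified placeholder audit fields rather than the full DICOM/IHE audit message schema, and emitting the standard XES concept:name and time:timestamp attributes:

```python
import xml.etree.ElementTree as ET

# Simplified stand-ins for ATNA audit records (real messages are richer).
audit = [
    {"patient": "P1", "event": "Images Viewed", "time": "2015-01-01T10:00:00"},
    {"patient": "P1", "event": "Report Created", "time": "2015-01-01T10:20:00"},
]

log = ET.Element("log", {"xes.version": "1.0"})
traces = {}
for rec in audit:
    # One XES trace per patient/case, created on first sight.
    trace = traces.get(rec["patient"])
    if trace is None:
        trace = ET.SubElement(log, "trace")
        ET.SubElement(trace, "string",
                      {"key": "concept:name", "value": rec["patient"]})
        traces[rec["patient"]] = trace
    event = ET.SubElement(trace, "event")
    ET.SubElement(event, "string",
                  {"key": "concept:name", "value": rec["event"]})
    ET.SubElement(event, "date",
                  {"key": "time:timestamp", "value": rec["time"]})

print(ET.tostring(log, encoding="unicode"))
```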
NASA Astrophysics Data System (ADS)
Barroi, A.; Hermsdorf, J.; Prank, U.; Kaierle, S.
First results of the process development of a novel approach for a high-deposition-rate cladding process with minimal dilution are presented. The approach will combine the enormous melting potential of an electrical arc that burns between two consumable wire electrodes with the precision of a laser process. Separate tests of the plasma melting and of the laser-based surface heating have been performed. A steadily burning arc between the electrodes could be established, and a deposition rate of 10 kg/h could be achieved. The laser was able to apply the desired heat profile needed for the combination of the processes. Process problems were analyzed and solutions proposed.
ERIC Educational Resources Information Center
Desyatov, Tymofiy
2015-01-01
The article analyzes the development of competency-based professional training standards and their implementation into educational process in foreign countries. It determines that the main idea of competency-based approach is competency-and-active learning, which aims at complex acquirement of diverse skills and ways of practice activities via…
A template-based approach for responsibility management in executable business processes
NASA Astrophysics Data System (ADS)
Cabanillas, Cristina; Resinas, Manuel; Ruiz-Cortés, Antonio
2018-05-01
Process-oriented organisations need to manage the different types of responsibilities their employees may have w.r.t. the activities involved in their business processes. Despite several approaches provide support for responsibility modelling, in current Business Process Management Systems (BPMS) the only responsibility considered at runtime is the one related to performing the work required for activity completion. Others like accountability or consultation must be implemented by manually adding activities in the executable process model, which is time-consuming and error-prone. In this paper, we address this limitation by enabling current BPMS to execute processes in which people with different responsibilities interact to complete the activities. We introduce a metamodel based on Responsibility Assignment Matrices (RAM) to model the responsibility assignment for each activity, and a flexible template-based mechanism that automatically transforms such information into BPMN elements, which can be interpreted and executed by a BPMS. Thus, our approach does not enforce any specific behaviour for the different responsibilities but new templates can be modelled to specify the interaction that best suits the activity requirements. Furthermore, libraries of templates can be created and reused in different processes. We provide a reference implementation and build a library of templates for a well-known set of responsibilities.
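A toy rendition of the template mechanism (the RAM entries, responsibility names and template bodies here are ours, not the paper's library): each responsibility assigned to an activity is expanded into BPMN-like elements by a per-responsibility template.

```python
# RAM row: responsibilities assigned to one activity (invented example).
RAM = {"Approve invoice": {"responsible": "Clerk",
                           "accountable": "Manager",
                           "consulted": "Legal"}}

# One template per responsibility; each returns BPMN-like elements.
TEMPLATES = {
    "responsible": lambda act, who: [
        {"type": "userTask", "name": act, "lane": who}],
    "accountable": lambda act, who: [
        {"type": "userTask", "name": f"Review '{act}'", "lane": who}],
    "consulted": lambda act, who: [
        {"type": "sendTask", "name": f"Request advice on '{act}'", "lane": who},
        {"type": "receiveTask", "name": f"Receive advice on '{act}'", "lane": who}],
}

def expand(ram):
    """Transform RAM assignments into executable-model elements."""
    elements = []
    for activity, roles in ram.items():
        for responsibility, person in roles.items():
            elements += TEMPLATES[responsibility](activity, person)
    return elements

for element in expand(RAM):
    print(element)
```

New templates could be registered in TEMPLATES to specify whatever interaction best suits an activity, mirroring the paper's point that no specific behaviour is enforced per responsibility.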
Towards a voxel-based geographic automata for the simulation of geospatial processes
NASA Astrophysics Data System (ADS)
Jjumba, Anthony; Dragićević, Suzana
2016-07-01
Many geographic processes evolve in a three dimensional space and time continuum. However, when they are represented with the aid of geographic information systems (GIS) or geosimulation models they are modelled in a framework of two-dimensional space with an added temporal component. The objective of this study is to propose the design and implementation of voxel-based automata as a methodological approach for representing spatial processes evolving in the four-dimensional (4D) space-time domain. Similar to geographic automata models which are developed to capture and forecast geospatial processes that change in a two-dimensional spatial framework using cells (raster geospatial data), voxel automata rely on the automata theory and use three-dimensional volumetric units (voxels). Transition rules have been developed to represent various spatial processes which range from the movement of an object in 3D to the diffusion of airborne particles and landslide simulation. In addition, the proposed 4D models demonstrate that complex processes can be readily reproduced from simple transition functions without complex methodological approaches. The voxel-based automata approach provides a unique basis to model geospatial processes in 4D for the purpose of improving representation, analysis and understanding their spatiotemporal dynamics. This study contributes to the advancement of the concepts and framework of 4D GIS.
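A minimal voxel-automaton sketch of the airborne-particle diffusion example, with an invented transition rule: each voxel shares a fixed fraction of its "particle load" with its six face neighbours per step (boundaries are periodic for simplicity; the paper's actual transition rules are not reproduced).

```python
import numpy as np

def step(grid, rate=0.1):
    """One synchronous update of the 3-D voxel automaton."""
    out = grid * (1 - rate)            # each voxel keeps most of its load
    share = grid * rate / 6.0          # and splits the rest six ways
    for axis in range(3):
        for shift in (1, -1):
            out += np.roll(share, shift, axis=axis)  # periodic boundaries
    return out

grid = np.zeros((20, 20, 20))
grid[10, 10, 10] = 1.0                 # point release of airborne particles
for _ in range(50):
    grid = step(grid)
print("mass conserved:", np.isclose(grid.sum(), 1.0))
```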
Market-based approaches to tree valuation
Geoffrey H. Donovan; David T. Butry
2008-01-01
A recent four-part series in Arborist News outlined different appraisal processes used to value urban trees. The final article in the series described the three generally accepted approaches to tree valuation: the sales comparison approach, the cost approach, and the income capitalization approach. The author, D. Logan Nelson, noted that the sales comparison approach...
Murphy, Enda; King, Eoin A
2016-08-15
The strategic noise mapping process of the EU has now been ongoing for more than ten years. However, despite the fact that a significant volume of research has been conducted on the process and related issues there has been little change or innovation in how relevant authorities and policymakers are conducting the process since its inception. This paper reports on research undertaken to assess the possibility for smartphone-based noise mapping data to be integrated into the traditional strategic noise mapping process. We compare maps generated using the traditional approach with those generated using smartphone-based measurement data. The advantage of the latter approach is that it has the potential to remove the need for exhaustive input data into the source calculation model for noise prediction. In addition, the study also tests the accuracy of smartphone-based measurements against simultaneous measurements taken using traditional sound level meters in the field. Copyright © 2016 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
de Jager, H. J.; Nieuwenhuis, F. J.
2005-01-01
South Africa has embarked on a process of education renewal by adopting outcomes-based education (OBE). This paper focuses on the linkages between total quality management (TQM) and the outcomes-based approach in an education context. Quality assurance in academic programmes in higher education in South Africa is, in some instances, based on the…
Corrected simulations for one-dimensional diffusion processes with naturally occurring boundaries.
Shafiey, Hassan; Gan, Xinjun; Waxman, David
2017-11-01
To simulate a diffusion process, a usual approach is to discretize the time in the associated stochastic differential equation. This is the approach used in the Euler method. In the present work we consider a one-dimensional diffusion process where the terms occurring, within the stochastic differential equation, prevent the process entering a region. The outcome is a naturally occurring boundary (which may be absorbing or reflecting). A complication occurs in a simulation of this situation. The term involving a random variable, within the discretized stochastic differential equation, may take a trajectory across the boundary into a "forbidden region." The naive way of dealing with this problem, which we refer to as the "standard" approach, is simply to reset the trajectory to the boundary, based on the argument that crossing the boundary actually signifies achieving the boundary. In this work we show, within the framework of the Euler method, that such resetting introduces a spurious force into the original diffusion process. This force may have a significant influence on trajectories that come close to a boundary. We propose a corrected numerical scheme, for simulating one-dimensional diffusion processes with naturally occurring boundaries. This involves correcting the standard approach, so that an exact property of the diffusion process is precisely respected. As a consequence, the proposed scheme does not introduce a spurious force into the dynamics. We present numerical test cases, based on exactly soluble one-dimensional problems with one or two boundaries, which suggest that, for a given value of the discrete time step, the proposed scheme leads to substantially more accurate results than the standard approach. Alternatively, the standard approach needs considerably more computation time to obtain a comparable level of accuracy to the proposed scheme, because the standard approach requires a significantly smaller time step.
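The "standard" resetting scheme the authors critique is easy to reproduce with the Euler (Euler-Maruyama) method. The sketch below uses an illustrative square-root diffusion with a natural boundary at zero and applies the naive reset; the paper's corrected scheme is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(x0=0.5, dt=1e-3, n_steps=5000, a=1.0, b=0.5, sigma=0.4):
    """Euler-Maruyama for dX = a(b - X)dt + sigma*sqrt(X) dW, X >= 0."""
    x = x0
    for _ in range(n_steps):
        drift = a * (b - x)
        diffusion = sigma * np.sqrt(max(x, 0.0))
        x = x + drift * dt + diffusion * np.sqrt(dt) * rng.normal()
        if x < 0.0:
            x = 0.0   # the naive reset; this is what injects a spurious force
    return x

samples = [simulate() for _ in range(2000)]
print("mean of endpoint distribution:", np.mean(samples))
```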
Plastic modulation of episodic memory networks in the aging brain with cognitive decline.
Bai, Feng; Yuan, Yonggui; Yu, Hui; Zhang, Zhijun
2016-07-15
Social-cognitive processing has been posited to underlie general functions such as episodic memory. Episodic memory impairment is a recognized hallmark of amnestic mild cognitive impairment (aMCI), a condition carrying a high risk of dementia. Three canonical networks, for self-referential processing, executive control processing and salience processing, have distinct roles in episodic memory retrieval. It remains unclear whether and how these sub-networks of the episodic memory retrieval system are affected in aMCI. This task-state fMRI study constructed systems-level episodic memory retrieval sub-networks in 28 aMCI patients and 23 controls using two computational approaches: a multiple region-of-interest based approach and a voxel-level functional connectivity-based approach. These approaches produced the remarkably similar findings that the self-referential processing network made critical contributions to episodic memory retrieval in aMCI. More conspicuous alterations in self-referential processing of the episodic memory retrieval network were identified in aMCI. To complete a given episodic memory retrieval task, increased cooperation between the self-referential processing network and other sub-networks was mobilized in aMCI. Self-referential processing mediates the cooperation of the episodic memory retrieval sub-networks; it may help to achieve neural plasticity and may contribute to the prevention and treatment of dementia. Copyright © 2016 Elsevier B.V. All rights reserved.
A continuous quality improvement team approach to adverse drug reaction reporting.
Flowers, P; Dzierba, S; Baker, O
1992-07-01
Cross-functional teams can generate more new ideas, concepts, and possible solutions than a department-based process alone. Working collaboratively, with CQI approaches and appropriate tools, can increase a team's knowledge. CQI produces growth and development at multiple levels, resulting from involvement in the process of incremental improvement.
On Some Stopping Times of Citation Processes. From Theory to Indicators.
ERIC Educational Resources Information Center
Glanzel, Wolfgang
1992-01-01
Proposes a new measure of the citation speed of scientific publications based on a stopping time approach. The stochastic process approach in bibliometrics is described, mean response time (MRT) is discussed, harmonic mean response time (HMRT) is explained, and examples are given. (six references) (LRW)
Qualitative Analysis for Maintenance Process Assessment
NASA Technical Reports Server (NTRS)
Brand, Lionel; Kim, Yong-Mi; Melo, Walcelio; Seaman, Carolyn; Basili, Victor
1996-01-01
In order to improve software maintenance processes, we first need to be able to characterize and assess them. These tasks must be performed in depth and with objectivity since the problems are complex. One approach is to set up a measurement-based software process improvement program specifically aimed at maintenance. However, establishing a measurement program requires that one understands the problems to be addressed by the measurement program and is able to characterize the maintenance environment and processes in order to collect suitable and cost-effective data. Also, enacting such a program and getting usable data sets takes time. A short term substitute is therefore needed. We propose in this paper a characterization process aimed specifically at maintenance and based on a general qualitative analysis methodology. This process is rigorously defined in order to be repeatable and usable by people who are not acquainted with such analysis procedures. A basic feature of our approach is that actual implemented software changes are analyzed in order to understand the flaws in the maintenance process. Guidelines are provided and a case study is shown that demonstrates the usefulness of the approach.
A 2-D process-based model for suspended sediment dynamics: A first step towards ecological modeling
Achete, F. M.; van der Wegen, M.; Roelvink, D.; Jaffe, B.
2015-01-01
In estuaries suspended sediment concentration (SSC) is one of the most important contributors to turbidity, which influences habitat conditions and ecological functions of the system. Sediment dynamics differ depending on sediment supply and hydrodynamic forcing conditions that vary over space and over time. A robust sediment transport model is a first step in developing a chain of models enabling simulations of contaminants, phytoplankton and habitat conditions. This work aims to determine turbidity levels in the complex-geometry delta of the San Francisco estuary using a process-based approach (Delft3D Flexible Mesh software). Our approach includes a detailed calibration against measured SSC levels, a sensitivity analysis on model parameters and the determination of a yearly sediment budget, as well as an assessment of model results in terms of turbidity levels for a single year, water year (WY) 2011. Model results show that our process-based approach is a valuable tool in assessing sediment dynamics and their related ecological parameters over a range of spatial and temporal scales. The model may act as the base model for a chain of ecological models assessing the impact of climate change and management scenarios. Here we present a modeling approach that, with limited data, produces reliable predictions and can be useful for estuaries without a large amount of process data.
Wimmer, Lena; Bellingrath, Silja; von Stockhausen, Lisa
2016-01-01
The present paper reports a pilot study which tested cognitive effects of mindfulness practice in a theory-driven approach. Thirty-four fifth graders received either a mindfulness training which was based on the mindfulness-based stress reduction approach (experimental group), a concentration training (active control group), or no treatment (passive control group). Based on the operational definition of mindfulness by Bishop et al. (2004), effects on sustained attention, cognitive flexibility, cognitive inhibition, and data-driven as opposed to schema-based information processing were predicted. These abilities were assessed in a pre-post design by means of a vigilance test, a reversible figures test, the Wisconsin Card Sorting Test, a Stroop test, a visual search task, and a recognition task of prototypical faces. Results suggest that the mindfulness training specifically improved cognitive inhibition and data-driven information processing. PMID:27462287
A Generic Approach for Pen-Based User Interface Development
NASA Astrophysics Data System (ADS)
Macé, Sébastien; Anquetil, Éric
Pen-based interaction is an intuitive way to create hand-drawn structured documents, but few applications take advantage of it. Indeed, the interpretation of the user's hand-drawn strokes in the context of a document is a complex problem. In this paper, we propose a new generic approach to developing such systems based on three independent components. The first is a set of graphical and editing functions adapted to pen interaction. The second is a rule-based formalism that models structured document composition and the corresponding interpretation process. The last is a hand-drawn stroke analyzer that is able to interpret strokes progressively, directly while the user is drawing. We highlight in particular the human-computer interaction induced by this progressive interpretation process. Thanks to this generic approach, three pen-based system prototypes have already been developed: for musical score editing, for graph editing, and for UML class diagram editing.
Process-based upscaling of surface-atmosphere exchange
NASA Astrophysics Data System (ADS)
Keenan, T. F.; Prentice, I. C.; Canadell, J.; Williams, C. A.; Wang, H.; Raupach, M. R.; Collatz, G. J.; Davis, T.; Stocker, B.; Evans, B. J.
2015-12-01
Empirical upscaling techniques such as machine learning and data mining have proven to be invaluable tools for the global scaling of disparate observations of surface-atmosphere exchange, but they are not based on a theoretical understanding of the key processes involved. This makes spatial and temporal extrapolation outside of the training domain difficult at best. There is therefore a clear need for the incorporation of knowledge of ecosystem function, in combination with the strength of data mining. Here, we present such an approach. We describe a novel diagnostic process-based model of global photosynthesis and ecosystem respiration, which is directly informed by a variety of global datasets relevant to ecosystem state and function. We use the model framework to estimate global carbon cycling both spatially and temporally, with a specific focus on the mechanisms responsible for long-term change. Our results show the importance of incorporating process knowledge into upscaling approaches, and highlight the effect of key processes on the terrestrial carbon cycle.
Collaboration in Global Software Engineering Based on Process Description Integration
NASA Astrophysics Data System (ADS)
Klein, Harald; Rausch, Andreas; Fischer, Edward
Globalization is one of the big trends in software development. Development projects need a variety of different resources with appropriate expert knowledge to be successful. More and more of these resources are nowadays obtained from specialized organizations and countries all over the world, varying in development approaches, processes, and culture. As seen with early outsourcing attempts, collaboration may fail due to these differences. Hence, the major challenge in global software engineering is to streamline collaborating organizations towards a successful conjoint development. Based on typical collaboration scenarios, this paper presents a structured approach to integrate processes in a comprehensible way.
Guidelines for Risk-Based Changeover of Biopharma Multi-Product Facilities.
Lynch, Rob; Barabani, David; Bellorado, Kathy; Canisius, Peter; Heathcote, Doug; Johnson, Alan; Wyman, Ned; Parry, Derek Willison
2018-01-01
In multi-product biopharma facilities, protection from product contamination due to the manufacture of multiple products simultaneously is paramount to assure product quality. To that end, traditional changeover methods (elastomer change-out, full sampling, etc.) have been widely used within the industry and have been accepted by regulatory agencies. However, with the endorsement of Quality Risk Management (1), risk-based approaches may be applied to assess and continuously improve established changeover processes. All processes, including changeover, can be improved with investment (money/resources), parallel activities, equipment design improvements, and standardization. However, processes can also be improved by eliminating waste. For product changeover, waste is any activity not needed for the new process or that does not provide added assurance of the quality of the subsequent product. The application of a risk-based approach to changeover aligns with the principles of Quality Risk Management. Through the use of risk assessments, the appropriate changeover controls can be identified and controlled to assure product quality is maintained. Likewise, risk assessments and risk-based approaches may be used to improve operational efficiency, reduce waste, and permit concurrent manufacturing of products. © PDA, Inc. 2018.
Nested polynomial trends for the improvement of Gaussian process-based predictors
NASA Astrophysics Data System (ADS)
Perrin, G.; Soize, C.; Marque-Pucheu, S.; Garnier, J.
2017-10-01
The role of simulation keeps increasing for the sensitivity analysis and uncertainty quantification of complex systems. Such numerical procedures are generally based on the processing of a huge number of code evaluations. When the computational cost associated with one particular evaluation of the code is high, such direct approaches, based on the computer code only, are not affordable. Surrogate models therefore have to be introduced to interpolate the information given by a fixed set of code evaluations to the whole input space. When confronted with deterministic mappings, Gaussian process regression (GPR), or kriging, presents a good compromise between complexity, efficiency and error control. Such a method considers the quantity of interest of the system as a particular realization of a Gaussian stochastic process, whose mean and covariance functions have to be identified from the available code evaluations. In this context, this work proposes an innovative parametrization of the mean function, based on the composition of two polynomials. This approach is particularly relevant for the approximation of strongly nonlinear quantities of interest from very little information. After presenting the theoretical basis of this method, this work compares its efficiency to alternative approaches on a series of examples.
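To make the role of the trend explicit, here is a generic GP-regression-with-polynomial-mean (universal kriging) sketch in NumPy. It uses an ordinary quadratic trend rather than the paper's nested (composed) polynomial parametrization, and the test function stands in for an expensive code.

```python
import numpy as np

def rbf(a, b, ell=0.3):
    """Squared-exponential covariance between 1-D point sets."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

def fit_predict(x, y, x_star, noise=1e-6):
    F = np.vander(x, 3)                    # quadratic trend basis
    K = rbf(x, x) + noise * np.eye(len(x))
    Ki = np.linalg.inv(K)
    # Generalized least-squares estimate of the trend coefficients.
    beta = np.linalg.solve(F.T @ Ki @ F, F.T @ Ki @ y)
    resid = y - F @ beta
    F_star = np.vander(x_star, 3)
    # Trend prediction plus kriging correction of the residuals.
    return F_star @ beta + rbf(x_star, x) @ Ki @ resid

x = np.linspace(0, 1, 8)                   # a few "code evaluations"
y = np.sin(6 * x) + x ** 2                 # stand-in for an expensive code
x_star = np.linspace(0, 1, 50)
print(fit_predict(x, y, x_star)[:5])
```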
Agent-Based Modeling of Growth Processes
ERIC Educational Resources Information Center
Abraham, Ralph
2014-01-01
Growth processes abound in nature, and are frequently the target of modeling exercises in the sciences. In this article we illustrate an agent-based approach to modeling, in the case of a single example from the social sciences: bullying.
An object-based approach to weather analysis and its applications
NASA Astrophysics Data System (ADS)
Troemel, Silke; Diederich, Malte; Horvath, Akos; Simmer, Clemens; Kumjian, Matthew
2013-04-01
The research group 'Object-based Analysis and SEamless prediction' (OASE) within the Hans Ertel Centre for Weather Research programme (HErZ) pursues an object-based approach to weather analysis. The object-based tracking approach adopts the Lagrangian perspective by identifying convective events and following their development over the course of their lifetime. Prerequisites of the object-based analysis are a highly resolved observational database and a tracking algorithm. A near-real-time radar and satellite remote-sensing-driven 3D observation-microphysics composite covering Germany, currently under development, contains gridded observations and estimated microphysical quantities. A 3D scale-space tracking identifies convective rain events in the dual composite and monitors their development over the course of their lifetime. The OASE group exploits the object-based approach in several fields of application: (1) for a better understanding and analysis of the precipitation processes responsible for extreme weather events, (2) in nowcasting, (3) as a novel approach for the validation of meso-γ atmospheric models, and (4) in data assimilation. Results from the different fields of application will be presented. The basic idea of the object-based approach is to identify a small set of radar- and satellite-derived descriptors which characterize the temporal development of the precipitation systems that constitute the objects. So-called proxies of the precipitation process are, e.g., the temporal change of the brightband, vertically extensive columns of enhanced differential reflectivity ZDR, or the cloud-top temperatures and heights identified in the 4D field of ground-based radar reflectivities and satellite retrievals generated by a cell during its lifetime. They quantify (micro-)physical differences among rain events and relate to the precipitation yield. Analyses of the information content of ZDR columns as precursors of storm evolution, for example, will be presented to demonstrate the use of such system-oriented predictors for nowcasting. Columns of differential reflectivity ZDR measured by polarimetric weather radars are prominent signatures associated with thunderstorm updrafts. Since greater vertical velocities can loft larger drops and water-coated ice particles to higher altitudes above the environmental freezing level, the integrated ZDR column above the freezing level increases with increasing updraft intensity. Validation of atmospheric models concerning precipitation representation or prediction is usually confined to comparisons of precipitation fields or their temporal and spatial statistics. A comparison of rain rates alone, however, does not immediately explain discrepancies between models and observations, because similar rain rates might be produced by different processes. Within the event-based approach to model validation, both observed and modeled rain events are analyzed by means of proxies of the precipitation process. Both sets of descriptors represent the basis for model validation, since different leading descriptors - in a statistical sense - hint at process formulations potentially responsible for model failures.
A compositional approach to building applications in a computational environment
NASA Astrophysics Data System (ADS)
Roslovtsev, V. V.; Shumsky, L. D.; Wolfengagen, V. E.
2014-04-01
The paper presents an approach to creating an applicative computational environment that features the decomposition of computational processes and data, together with a compositional approach to application building. The approach in question is based on the notion of a combinator - both in systems with variable binding (such as λ-calculi) and in those allowing programming without variables (combinatory-logic style). We present a computation decomposition technique based on objects' structural decomposition, with the focus on the decomposition of computations. The computational environment's architecture is based on a network whose nodes play several roles simultaneously.
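As a minimal illustration of variable-free, composition-based application building, here is a generic combinatory-logic sketch in Python; the environment's actual combinator set is not described in the abstract, so the classic S, K, I and B combinators are used as assumed stand-ins.

```python
# Classic combinators encoded as curried Python functions.
S = lambda f: lambda g: lambda x: f(x)(g(x))   # substitution/application
K = lambda x: lambda y: x                      # constant
I = S(K)(K)                                    # identity, derived from S and K

# Function composition as a combinator: B f g x = f (g x)
B = lambda f: lambda g: lambda x: f(g(x))

inc = lambda n: n + 1
dbl = lambda n: 2 * n

# An "application" assembled purely by composition, without variable binding
pipeline = B(inc)(dbl)      # x -> 2*x + 1
assert pipeline(10) == 21
assert I(42) == 42
```

The point of the combinatory style is that larger computations are decomposed into, and rebuilt from, such closed building blocks with no free variables.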
Luo, Xin; You, Zhuhong; Zhou, Mengchu; Li, Shuai; Leung, Hareton; Xia, Yunni; Zhu, Qingsheng
2015-01-09
The comprehensive mapping of protein-protein interactions (PPIs) is highly desired for gaining deep insights into both fundamental cell biology processes and the pathology of diseases. Despite their high accuracy, finely tuned small-scale experiments are both very expensive and too inefficient to identify numerous interactions. High-throughput screening techniques enable efficient identification of PPIs; yet the desire to extract further useful knowledge from these data leads to the problem of binary interactome mapping. Network topology-based approaches prove to be highly efficient in addressing this problem; however, their performance deteriorates significantly on sparse putative PPI networks. Motivated by the success of collaborative filtering (CF)-based approaches to personalized recommendation on large, sparse rating matrices, this work implements a highly efficient CF-based approach to binary interactome mapping. To achieve this, we first propose a CF framework for the task. Under this framework, we model the given data as an interactome weight matrix, from which the feature vectors of the involved proteins are extracted. With them, we design the rescaled cosine coefficient to model the inter-neighborhood similarity among the involved proteins, which drives the mapping process. Experimental results on three large, sparse datasets demonstrate that the proposed approach significantly outperforms several sophisticated topology-based approaches. PMID:25572661
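A neighborhood-based CF pass over an interactome weight matrix might look like the sketch below. The exact form of the paper's "rescaled cosine coefficient" is not given in the abstract, so a plain cosine similarity rescaled by a shared-support factor is assumed here as a stand-in, and the tiny matrix is purely illustrative.

```python
import numpy as np

def cf_scores(W):
    """Neighborhood-based CF over a sparse interactome weight matrix W (n x n).

    Each protein's feature vector is its row of observed interaction weights.
    Similarity is cosine, rescaled by the fraction of co-observed entries
    (an assumed stand-in for the paper's rescaled cosine coefficient).
    """
    observed = (W > 0).astype(float)
    norms = np.linalg.norm(W, axis=1) + 1e-12
    cos = (W @ W.T) / np.outer(norms, norms)        # plain cosine similarity
    shared = observed @ observed.T                   # co-observed counts
    support = observed.sum(axis=1) + 1e-12
    rescale = shared / np.minimum.outer(support, support)
    S = cos * rescale
    np.fill_diagonal(S, 0.0)
    # Predicted interaction score: similarity-weighted average of neighbors
    denom = np.abs(S).sum(axis=1, keepdims=True) + 1e-12
    return (S @ W) / denom

W = np.zeros((5, 5))
W[0, 1] = W[1, 0] = 0.9                              # known interactions
W[1, 2] = W[2, 1] = 0.8
print(np.round(cf_scores(W), 3))                     # scores for unseen pairs
```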
Hogan, Lindsay; García Bengoechea, Enrique; Salsberg, Jon; Jacobs, Judi; King, Morrison; Macaulay, Ann C
2014-12-01
This study is part of a larger community-based participatory research (CBPR) project to develop, implement, and evaluate the physical activity component of a school-based wellness policy. The policy intervention is being carried out by community stakeholders and academic researchers within the Kahnawake Schools Diabetes Prevention Project, a well-established health promotion organization in the Indigenous community of Kahnawake, Quebec. We explored how a group of stakeholders develop a school physical activity policy in a participatory manner, and examined factors serving as facilitators and barriers to the development process. This case study was guided by an interpretive description approach and draws upon data from documentary analysis and participant observation. A CBPR approach allowed academic researchers and community stakeholders to codevelop a physical activity policy that is both evidence-based and contextually appropriate. The development process was influenced by a variety of barriers and facilitators including working within existing structures, securing appropriate stakeholders, and school contextual factors. This research offers a process framework that others developing school-based wellness policies may use with appropriate modifications based on local environments. © 2014, American School Health Association.
Shrestha, Roman; Altice, Frederick; Karki, Pramila; Copenhaver, Michael
2017-01-01
To date, HIV prevention efforts have largely relied on singular strategies (e.g., behavioral or biomedical approaches alone) with modest HIV risk-reduction outcomes for people who use drugs (PWUD), many of whom experience a wide range of neurocognitive impairments (NCI). We report on the process and outcome of our formative research aimed at developing an integrated biobehavioral approach that incorporates innovative strategies to address the HIV prevention and cognitive needs of high-risk PWUD in drug treatment. Our formative work involved first adapting an evidence-based behavioral intervention—guided by the Assessment–Decision–Administration–Production–Topical experts–Integration–Training–Testing model—and then combining the behavioral intervention with an evidence-based biomedical intervention for implementation among the target population. This process involved eliciting data through structured focus groups (FGs) with key stakeholders—members of the target population (n = 20) and treatment providers (n = 10). Analysis of FG data followed a thematic analysis approach utilizing several qualitative data analysis techniques, including inductive analysis and cross-case analysis. Based on all information, we integrated the adapted community-friendly health recovery program—a brief evidence-based HIV prevention behavioral intervention—with the evidence-based biomedical component [i.e., preexposure prophylaxis (PrEP)], an approach that incorporates innovative strategies to accommodate individuals with NCI. This combination approach—now called the biobehavioral community-friendly health recovery program—is designed to address HIV-related risk behaviors and PrEP uptake and adherence as experienced by many PWUD in treatment. This study provides a complete example of the process of selecting, adapting, and integrating the evidence-based interventions—taking into account both empirical evidence and input from target population members and target organization stakeholders. The resultant brief evidence-based biobehavioral approach could significantly advance primary prevention science by cost-effectively optimizing PrEP adherence and HIV risk reduction within common drug treatment settings. PMID:28553295
Vibronic coupling simulations for linear and nonlinear optical processes: Simulation results
NASA Astrophysics Data System (ADS)
Silverstein, Daniel W.; Jensen, Lasse
2012-02-01
A vibronic coupling model based on the time-dependent wavepacket approach is applied to simulate linear optical processes, such as one-photon absorbance and resonance Raman scattering, and nonlinear optical processes, such as two-photon absorbance and resonance hyper-Raman scattering, for a series of small molecules. Simulations employing the long-range-corrected approach in density functional theory and coupled cluster are compared with each other and examined against available experimental data. Although many of the small molecules are prone to anharmonicity in their potential energy surfaces, the harmonic approach performs adequately. A detailed discussion of non-Condon effects is illustrated by the molecules presented in this work. Linear and nonlinear Raman scattering simulations allow for the quantification of the interference between the Franck-Condon and Herzberg-Teller terms for the different molecules.
Zhang, Tianchang; Kim, Christine H J; Cheng, Yingwen; Ma, Yanwen; Zhang, Hongbo; Liu, Jie
2015-02-21
A "top-down" and scalable approach for processing carbon fiber cloth (CFC) into flexible and all-carbon electrodes with remarkable areal capacity and cyclic stability was developed. CFC is commercially available in large quantities but its use as an electrode material in supercapacitors is not satisfactory. The approach demonstrated in this work is based on the sequential treatment of CFC with KOH activation and high temperature annealing that can effectively improve its specific surface area to a remarkable 2780 m(2) g(-1) while at the same time achieving a good electrical conductivity of 320 S m(-1) without sacrificing its intrinsic mechanical strength and flexibility. The processed CFC can be directly used as an electrode for supercapacitors without any binders, conductive additives and current collectors while avoiding elaborate electrode processing steps to deliver a specific capacitance of ∼0.5 F cm(-2) and ∼197 F g(-1) with remarkable rate performance and excellent cyclic stability. The properties of these processed CFCs are comparable or better than graphene and carbon nanotube based electrodes. We further demonstrate symmetric solid-state supercapacitors based on these processed CFCs with very good flexibility. This "top-down" and scalable approach can be readily applied to other types of commercially available carbon materials and therefore can have a substantial significance for high performance supercapacitor devices.
On the Use of Computers for Teaching Fluid Mechanics
NASA Technical Reports Server (NTRS)
Benson, Thomas J.
1994-01-01
Several approaches for improving the teaching of basic fluid mechanics using computers are presented. These approaches have two objectives: to increase the involvement of the student in the learning process and to present information to the student in a variety of forms. Items discussed include: the preparation of educational videos using the results of computational fluid dynamics (CFD) calculations, the analysis of CFD flow solutions using workstation-based post-processing graphics packages, and the development of workstation- or personal-computer-based simulators which behave like desktop wind tunnels. Examples of these approaches are presented along with observations from working with undergraduate co-ops. Possible problems in the implementation of these approaches, as well as solutions to these problems, are also discussed.
Complex Event Processing for Content-Based Text, Image, and Video Retrieval
2016-06-01
Uncertainty Quantification in Simulations of Epidemics Using Polynomial Chaos
Santonja, F.; Chen-Charpentier, B.
2012-01-01
Mathematical models based on ordinary differential equations are a useful tool for studying the processes involved in epidemiology. Many models consider the parameters to be deterministic variables. In practice, however, the transmission parameters present large variability and cannot be determined exactly, so it is necessary to introduce randomness. In this paper, we present an application of the polynomial chaos approach to epidemiological mathematical models based on ordinary differential equations with random coefficients. Taking into account the variability of the transmission parameters of the model, this approach allows us to obtain an auxiliary system of differential equations, which is then integrated numerically to obtain the first- and second-order moments of the output stochastic processes. A sensitivity analysis based on the polynomial chaos approach is also performed to determine which parameters have the greatest influence on the results. As an example, we apply the approach to an obesity epidemic model. PMID:22927889
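A non-intrusive variant of this idea can be sketched in a few lines. The paper derives an auxiliary ODE system for the moments; the stand-in below instead estimates the same first two output moments by Gauss-Hermite collocation over a normally distributed transmission parameter, with the SIS model, parameter values and distribution all assumed for illustration.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss
from scipy.integrate import solve_ivp

def sis_infected(beta, gamma=0.1, i0=0.01, t_end=60.0):
    """Fraction infected at t_end for a toy SIS model di/dt = beta*i*(1-i) - gamma*i."""
    rhs = lambda t, i: beta * i * (1 - i) - gamma * i
    return solve_ivp(rhs, (0.0, t_end), [i0]).y[0, -1]

# Probabilists' Gauss-Hermite nodes/weights for a standard normal parameter
nodes, weights = hermegauss(12)
weights = weights / weights.sum()          # normalize so that E[1] = 1 under N(0,1)

mu_beta, sigma_beta = 0.3, 0.05            # assumed uncertain transmission rate
outputs = np.array([sis_infected(max(mu_beta + sigma_beta * x, 1e-6))
                    for x in nodes])

mean = weights @ outputs                   # first-order moment of the output
var = weights @ outputs**2 - mean**2       # second-order central moment
print(f"E[I(T)] = {mean:.4f}, Var[I(T)] = {var:.6f}")
```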
Holmes, Lisa; Landsverk, John; Ward, Harriet; Rolls-Reutz, Jennifer; Saldana, Lisa; Wulczyn, Fred; Chamberlain, Patricia
2014-04-01
Estimating costs in child welfare services is critical as new service models are incorporated into routine practice. This paper describes a unit costing estimation system developed in England (cost calculator) together with a pilot test of its utility in the United States where unit costs are routinely available for health services but not for child welfare services. The cost calculator approach uses a unified conceptual model that focuses on eight core child welfare processes. Comparison of these core processes in England and in four counties in the United States suggests that the underlying child welfare processes generated from England were perceived as very similar by child welfare staff in California county systems with some exceptions in the review and legal processes. Overall, the adaptation of the cost calculator for use in the United States child welfare systems appears promising. The paper also compares the cost calculator approach to the workload approach widely used in the United States and concludes that there are distinct differences between the two approaches with some possible advantages to the use of the cost calculator approach, especially in the use of this method for estimating child welfare costs in relation to the incorporation of evidence-based interventions into routine practice.
NASA Astrophysics Data System (ADS)
Singh, Rupinder
2018-02-01
Hot chamber (HC) die casting is one of the most widely used commercial processes for casting low-temperature metals and alloys. The process gives near-net-shape products with high dimensional accuracy. However, in the actual field environment the best settings of the input parameters are often conflicting, as the shape and size of the casting change, and one has to trade off among various output parameters such as hardness, dimensional accuracy, casting defects and microstructure. For online inspection of cast-component properties (without affecting the production line), weight measurement has been established as one of the most cost-effective methods in the field environment, since the difference in weight between sound and unsound castings reflects possible casting defects. In the first stage of the present work, the effect of three input process parameters (namely, pressure at the 2nd phase of HC die casting, metal pouring temperature and die opening time) was studied to optimize the cast-component weight W as the output parameter, in the form of a macro model based upon a Taguchi L9 orthogonal array. Buckingham's π approach was then applied to the Taguchi-based macro model to develop a micro model. This study highlights the combined Taguchi-Buckingham approach as a case study (for the conversion of a macro model into a micro model) through the identification of the optimum levels of the input parameters (based on the Taguchi approach) and the development of a mathematical model (based on Buckingham's π approach). The developed mathematical model can then be used to predict W in the HC die casting process with more flexibility. The results of the study yield a second-degree polynomial equation for predicting cast-component weight in HC die casting and suggest that pressure at the 2nd stage is one of the most significant factors controlling the casting defects/weight of the casting.
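For readers unfamiliar with the Taguchi step, a minimal L9(3^4) analysis is sketched below. The factor names follow the abstract, but the replicated weight values and the larger-the-better signal-to-noise definition are illustrative assumptions, not the paper's data.

```python
import numpy as np

# Standard L9 orthogonal array: 9 runs, up to 4 three-level factors (levels 0-2).
L9 = np.array([[0,0,0,0],[0,1,1,1],[0,2,2,2],
               [1,0,1,2],[1,1,2,0],[1,2,0,1],
               [2,0,2,1],[2,1,0,2],[2,2,1,0]])

factors = ["pressure_2nd_phase", "pour_temperature", "die_opening_time"]
# Hypothetical replicated weight measurements (g) for each of the 9 runs
y = np.array([[99.2, 99.5], [100.1, 99.8], [98.7, 98.9],
              [99.9, 100.2], [99.4, 99.1], [100.4, 100.6],
              [98.8, 99.0], [99.6, 99.7], [100.0, 99.9]])

# Larger-the-better S/N ratio: SN = -10*log10(mean(1/y^2))
# (assumes a heavier casting indicates fewer internal defects)
sn = -10 * np.log10(np.mean(1.0 / y**2, axis=1))

# Main effect of each factor = mean S/N at each of its three levels
for j, name in enumerate(factors):
    effects = [sn[L9[:, j] == lvl].mean() for lvl in range(3)]
    best = int(np.argmax(effects))
    print(f"{name}: level means = {np.round(effects, 3)}, best level = {best}")
```

The level with the highest mean S/N for each factor gives the optimum setting; the subsequent Buckingham π step then condenses these factors into dimensionless groups for the micro model.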
Automated and model-based assembly of an anamorphic telescope
NASA Astrophysics Data System (ADS)
Holters, Martin; Dirks, Sebastian; Stollenwerk, Jochen; Loosen, Peter
2018-02-01
Since the first usage of optical glasses there has been an increasing demand for optical systems that are highly customized for a wide field of applications. To meet the challenge of producing so many unique systems, the development of new techniques and approaches has risen in importance. However, the assembly of precision optical systems in lot sizes of one up to a few tens of systems is still dominated by manual labor. In contrast, highly adaptive and model-based approaches may offer a solution for manufacturing with a high degree of automation and high throughput while maintaining high precision. In this work a model-based automated assembly approach based on ray tracing is presented. This process runs autonomously and accounts for a wide range of functionality. It first identifies the sequence for an optimized assembly and, second, generates and matches intermediate figures of merit to predict the overall optical functionality of the optical system. The process also generates a digital twin of the optical system by mapping key performance indicators, such as the first and second moments of intensity, into the optical model. This approach is verified by the automatic assembly of an anamorphic telescope within an assembly cell. By continuously measuring the key performance indicators and mapping them into the optical model, the quality of the digital twin is determined. Moreover, by measuring the optical quality and geometrical parameters of the telescope, the precision of this approach is determined. Finally, the productivity of the process is evaluated by monitoring the speed of the different steps of the process.
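The intensity-moment KPIs mentioned above are straightforward to compute from a camera frame; a minimal sketch follows, in which the synthetic Gaussian spot stands in for a measured beam profile (an assumption for illustration).

```python
import numpy as np

def intensity_moments(img):
    """First moment (centroid) and second central moment (widths) of an intensity image."""
    img = img.astype(float)
    total = img.sum()
    ys, xs = np.indices(img.shape)
    cx, cy = (xs * img).sum() / total, (ys * img).sum() / total   # first moments
    vx = ((xs - cx) ** 2 * img).sum() / total                     # second central moments
    vy = ((ys - cy) ** 2 * img).sum() / total
    return (cx, cy), (np.sqrt(vx), np.sqrt(vy))

# Synthetic Gaussian spot standing in for a measured beam profile
ys, xs = np.indices((64, 64))
spot = np.exp(-((xs - 40) ** 2 + (ys - 20) ** 2) / (2 * 5.0 ** 2))
(cx, cy), (wx, wy) = intensity_moments(spot)
print(f"centroid = ({cx:.1f}, {cy:.1f}), widths = ({wx:.2f}, {wy:.2f})")
```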
Palmero, Paola; Kern, Frank; Sommer, Frank; Lombardi, Mariangela; Gadow, Rainer; Montanaro, Laura
2014-12-30
Ceramic nanocomposites, containing at least one phase of nanometric dimension, have received special interest in recent years. They have demonstrated increased performance, reliability and lifetime with respect to monolithic ceramics. However, a successful approach to the production of tailored composite nanostructures requires the development of innovative concepts at each step of manufacturing, from the synthesis of composite nanopowders to their processing and sintering. This review aims to deepen understanding of some of the critical issues associated with the manufacturing of nanocomposite ceramics, focusing on alumina-based composite systems. Two case studies are presented and briefly discussed. The former illustrates the benefits, in terms of sintered microstructure and related mechanical properties, resulting from the application of an engineering approach to a laboratory-scale protocol for the elaboration of nanocomposites in the alumina-ZrO2-YAG (yttrium aluminium garnet) system. The latter illustrates the manufacturing of alumina-based composites for large-scale applications such as cutting tools, carried out by an injection molding process. The need for an engineering approach in all processing steps is demonstrated in this second case study as well, where a tailored manufacturing process is required to obtain the desired results.
Strengths-Based Nursing: A Process for Implementing a Philosophy Into Practice.
Gottlieb, Laurie N; Gottlieb, Bruce
2017-08-01
Strengths-Based Nursing (SBN) is both a philosophy and a value-driven approach that can guide clinicians, educators, managers/leaders, and researchers. SBN is rooted in principles of person/family-centered care, empowerment, relational care, and innate health and healing. SBN is family nursing, yet not all family nursing models are strengths-based. The challenge is how to translate a philosophy into changed practice. In this article, we describe an implementation process that has evolved organically: a multi-layered and multi-pronged approach that involves patients and families, clinicians, educators, leaders, managers, and researchers as well as key stakeholders including union leaders, opinion leaders, and policy makers from both nursing and other disciplines. There are two phases to the implementation process, namely, Phase 1: pre-commitment/pre-adoption and Phase 2: adoption. Each phase consists of distinct steps with accompanying strategies. These phases occur both sequentially and concurrently. Facilitating factors that enable the implementation process include values that align, readiness to accept SBN, curiosity-courage-commitment on the part of early adopters, a critical mass of early adopters, and making the SBN approach both relevant and context specific.
Malik, Gulzar; McKenna, Lisa; Griffiths, Debra
2017-04-01
The study aimed to explore the processes undertaken by nurse academics when integrating evidence-based practice (EBP) into their teaching and learning practices. This article focuses on the pedagogical approaches employed by academics to influence the integration of evidence-based practice into undergraduate programs across Australian universities. Nursing academics are challenged to incorporate a variety of teaching and learning strategies to teach evidence-based practice and determine their effectiveness. However, the literature suggests that there are few studies available focusing on pedagogical approaches in evidence-based practice education. A constructivist grounded theory methodology, informed by Charmaz, was used for this study. Data were collected during 2014 from 23 nurse academics across Australian universities through semi-structured interviews. Additionally, nine were observed while teaching undergraduate students. Twenty subject outlines were also analysed following Charmaz's approach to data analysis. 'Influencing EBP integration' describes the pedagogical approaches employed by academics to incorporate EBP knowledge and skills into undergraduate curricula. With the use of various teaching and learning strategies, academics attempted to contextualize EBP by engaging students with activities aimed at linking evidence to practice and at the EBP process. Although some strategies appeared to be engaging, others were traditional and seemed to be disengaging for students, owing to the challenges experienced by participants that impeded the use of the most effective teaching methods. The study findings offer valuable insights into teaching practices and identify some key challenges that require the adoption of appropriate strategies to ensure future nurses are well prepared in the paradigm of evidence-based practice. © 2016 John Wiley & Sons Ltd.
ERIC Educational Resources Information Center
Sung, Han-Yu; Hwang, Gwo-Jen
2013-01-01
In this study, a collaborative game-based learning environment is developed by integrating a grid-based Mindtool to facilitate the students to share and organize what they have learned during the game-playing process. To evaluate the effectiveness of the proposed approach, an experiment has been conducted in an elementary school natural science…
Place-based planning: innovations and applications from four western forests.
Jennifer O. Farnum; Linda E. Kruger
2008-01-01
Place-based planning is an emergent method of public lands planning that aims to redefine the scale at which planning occurs, using place meanings and place values to guide planning processes. Despite the approach's growing popularity, there exist few published accounts of place-based approaches. To provide practitioners and researchers with such examples, the...
Health professionals' decision-making in wound management: a grounded theory.
Gillespie, Brigid M; Chaboyer, Wendy; St John, Winsome; Morley, Nicola; Nieuwenhoven, Paul
2015-06-01
To develop a conceptual understanding of the decision-making processes used by healthcare professionals in wound care practice. With the global move towards using an evidence base to standardize wound care practices and the need to reduce hospital wound care costs, it is important to understand health professionals' decision-making in this important yet under-researched area. A grounded theory approach was used to explore the clinical decision-making of healthcare professionals in wound care practice. Interviews were conducted with 20 multidisciplinary participants from nursing, surgery, infection control and wound care who worked at a metropolitan hospital in Australia. Data were collected during 2012-2013. Constant comparative analysis underpinned by Strauss and Corbin's framework was used to identify clinical decision-making processes. The core category was 'balancing practice-based knowledge with evidence-based knowledge'. Participants' clinical practice and actions embedded the following processes: 'utilizing the best available information', 'using a consistent approach in wound assessment' and 'using a multidisciplinary approach'. The substantive theory explains how practice and evidence knowledge were balanced, and the variation in the use of intuitive practice-based knowledge versus evidence-based knowledge. Participants considered patients' needs and preferences, costs, outcomes, technologies, others' expertise and established practices. Participants' decision-making tended to be more heavily weighted towards intuitive practice-based processes. These findings offer a better understanding of the processes used by health professionals in their decision-making in wound care. Such an understanding may inform the development of evidence-based interventions that lead to better patient outcomes. © 2014 John Wiley & Sons Ltd.
Vibronic coupling simulations for linear and nonlinear optical processes: Theory
NASA Astrophysics Data System (ADS)
Silverstein, Daniel W.; Jensen, Lasse
2012-02-01
A comprehensive vibronic coupling model based on the time-dependent wavepacket approach is derived to simulate linear optical processes, such as one-photon absorbance and resonance Raman scattering, and nonlinear optical processes, such as two-photon absorbance and resonance hyper-Raman scattering. This approach is particularly well suited for combination with first-principles calculations. Expressions for the Franck-Condon terms, and non-Condon effects via the Herzberg-Teller coupling approach in the independent-mode displaced harmonic oscillator model are presented. The significance of each contribution to the different spectral types is discussed briefly.
End-of-life conversations and care: an asset-based model for community engagement.
Matthiesen, Mary; Froggatt, Katherine; Owen, Elaine; Ashton, John R
2014-09-01
Public awareness work regarding palliative and end-of-life care is increasingly promoted within national strategies for palliative care. Different approaches to undertaking this work are being used, often based upon broader educational principles, but little is known about how to undertake such initiatives in a way that equally engages both the health and social care sector and local communities. An asset-based community engagement approach has been developed that facilitates community-led awareness initiatives concerning end-of-life conversations and care by identifying and connecting existing skills and expertise. The aims were: (1) to describe the processes and features of an asset-based community engagement approach that facilitates community-led awareness initiatives with a focus on end-of-life conversations and care; and (2) to identify key community-identified priorities for sustainable community engagement processes. An asset-based model of community engagement specific to end-of-life issues using a four-step process is described (getting started, coming together, action planning and implementation). The use of this approach in two regional community engagement programmes, based across rural and urban communities in the northwest of England, is described. The assets identified in the facilitated community engagement process encompassed people's talents and skills, community groups and networks, government and non-government agencies, physical and economic assets, and community values and stories. Five priority areas were addressed to ensure active community engagement work: information, outreach, education, leadership and sustainability. A facilitated, asset-based approach to community engagement for end-of-life conversations and care can catalyse community-led awareness initiatives. This occurs through the involvement of community and local health and social care organisations as co-creators of this change across multiple sectors in a sustainable way. This approach provides a framework for other communities seeking to engage with public awareness of end-of-life issues. Published by the BMJ Publishing Group Limited.
Input-Based Approaches to Teaching Grammar: A Review of Classroom-Oriented Research.
ERIC Educational Resources Information Center
Ellis, Rod
1999-01-01
Examines the theoretical rationales (universal grammar, information-processing theories, skill-learning theories) for input-based grammar teaching and reviews classroom-oriented research (i.e., enriched-input studies, input-processing studies) that has integrated this option. (Author/VWL)
NASA Astrophysics Data System (ADS)
Zhang, Chao; Zhang, Qian; Zheng, Chi; Qiu, Guoping
2018-04-01
Video foreground segmentation is one of the key problems in video processing. In this paper, we propose a novel and fully unsupervised approach for the co-localization and segmentation of foreground objects in unconstrained videos. We first compute both the actual edges and the motion boundaries of the video frames, and then align them via their HOG feature maps. Then, by filling the occlusions generated by the aligned edges, we obtain more precise masks of the foreground object. These motion-based masks serve as the motion-based likelihood. Moreover, a color-based likelihood is adopted for the segmentation process. Experimental results show that our approach outperforms most state-of-the-art algorithms.
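A simplified version of the motion-boundary step can be sketched with OpenCV. The synthetic frames and all thresholds below are assumptions, and the paper's HOG-based alignment and occlusion filling are not reproduced; the sketch only shows how motion boundaries and image edges can be combined as foreground evidence.

```python
import cv2
import numpy as np

# Two synthetic frames: a bright square moving against a static background
prev = np.zeros((128, 128), np.uint8); prev[40:70, 30:60] = 200
curr = np.zeros((128, 128), np.uint8); curr[40:70, 36:66] = 200

# Dense optical flow between consecutive frames
flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)
mag = np.linalg.norm(flow, axis=2)

# Motion boundaries: strong spatial gradients of the flow magnitude
gx = cv2.Sobel(mag, cv2.CV_64F, 1, 0)
gy = cv2.Sobel(mag, cv2.CV_64F, 0, 1)
motion_boundary = np.hypot(gx, gy) > 1.0          # assumed threshold

# Actual image edges, to be combined with the motion boundaries
edges = cv2.Canny(curr, 100, 200) > 0

# Crude foreground evidence: pixels that are both image edges and motion boundaries
mask = motion_boundary & edges
print(f"foreground-evidence pixels: {mask.sum()}")
```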
An outcome-based assessment process for accrediting computing programmes
NASA Astrophysics Data System (ADS)
Harmanani, Haidar M.
2017-11-01
The calls for accountability in higher education have made outcome-based assessment a key accreditation component. Accreditation remains a well-regarded seal of approval on college quality, and requires a programme to set clear, appropriate, and measurable goals and courses to attain them. Furthermore, programmes must demonstrate that the responsibilities associated with those goals are being carried out. Assessment leaders face various challenges, including process design and implementation, faculty buy-in, and resource availability. This paper presents an outcome-based assessment approach that facilitates faculty participation while simplifying the assessment and reporting processes through effective and meaningful visualisation. The proposed approach has been implemented and used for the successful ABET accreditation of a computer science programme, and can be easily adapted to any higher education programme.
Collective learning modeling based on the kinetic theory of active particles
NASA Astrophysics Data System (ADS)
Burini, D.; De Lillo, S.; Gibelli, L.
2016-03-01
This paper proposes a systems approach to the theory of perception and learning in populations composed of many living entities. Starting from a phenomenological description of these processes, a mathematical structure is derived which is deemed to incorporate their complexity features. The modeling is based on a generalization of kinetic theory methods where interactions are described by theoretical tools of game theory. As an application, the proposed approach is used to model the learning processes that take place in a classroom.
ERIC Educational Resources Information Center
Slussareff, Michaela; Bohácková, Petra
2016-01-01
This paper compares two kinds of educational treatment within the location-based game approach: learning by playing a location-based game and learning by designing a location-based game. Two parallel elementary school classes were included in our study (N = 27; age 14-15). The "designers" class took part in the whole process of game design…
Trait-based approaches for understanding microbial biodiversity and ecosystem functioning
Krause, Sascha; Le Roux, Xavier; Niklaus, Pascal A.; Van Bodegom, Peter M.; Lennon, Jay T.; Bertilsson, Stefan; Grossart, Hans-Peter; Philippot, Laurent; Bodelier, Paul L. E.
2014-01-01
In ecology, biodiversity-ecosystem functioning (BEF) research has seen a shift in perspective from taxonomy to function over the last two decades, with the successful application of trait-based approaches. This shift offers opportunities for a deeper mechanistic understanding of the role of biodiversity in maintaining multiple ecosystem processes and services. In this paper, we highlight studies that have focused on the BEF of microbial communities, with an emphasis on integrating trait-based approaches into microbial ecology. In doing so, we explore some of the inherent challenges and opportunities of understanding BEF using microbial systems. For example, microbial biologists characterize communities using gene phylogenies that are often unable to resolve functional traits. Additionally, the experimental designs of existing microbial BEF studies are often inadequate to unravel BEF relationships. We argue that combining eco-physiological studies with contemporary molecular tools in a trait-based framework can reinforce our ability to link microbial diversity to ecosystem processes. We conclude that such trait-based approaches are a promising framework for increasing the understanding of microbial BEF relationships and thus generating systematic principles in microbial ecology and, more generally, in ecology. PMID:24904563
Developing Emotion-Based Case Formulations: A Research-Informed Method.
Pascual-Leone, Antonio; Kramer, Ueli
2017-01-01
New research-informed methods for case conceptualization that cut across traditional therapy approaches are increasingly popular. This paper presents a trans-theoretical approach to case formulation based on research observations of emotion. The sequential model of emotional processing (Pascual-Leone & Greenberg, 2007) is a process research model that provides concrete markers for therapists to observe the emerging emotional development of their clients. We illustrate how this model can be used by clinicians to track change, and how it provides a 'clinical map' by which therapists may orient themselves in-session and plan treatment interventions. Emotional processing thus offers a trans-theoretical framework for therapists who wish to conduct emotion-based case formulations. First, we present criteria for why this research model translates well into practice. Second, two contrasting case studies are presented to demonstrate the method. The model bridges research with practice by using client emotion as an axis of integration. Key Practitioner Message: Process research on emotion can offer a template for therapists to make case formulations while using a range of treatment approaches. The sequential model of emotional processing provides a 'process map' of concrete markers for therapists to (1) observe the emerging emotional development of their clients, and (2) develop a treatment plan. Copyright © 2016 John Wiley & Sons, Ltd.
Christensen, Leif; Karle, Hans; Nystrup, Jørgen
2007-09-01
An outcome-based approach to medical education, as compared with a process/content orientation, is currently being discussed intensively. In this article, the interrelationship of process and outcome in medical education is discussed, with specific emphasis on its relation to the definition of standards in basic medical education. Perceptions of outcome have always been an integrated element of curricular planning. The present debate underlines the need for a stronger focus on learning objectives and outcome assessment in many medical schools around the world. The need to maintain an integrated approach to process/content and outcome is underlined in this paper. A worry is expressed about the taxonomy of learning in purely outcome-based medical education, in which student assessment can become a major determinant of the learning process, leaving control of the medical curriculum to medical examiners. Moreover, curricula which favour reductionism by stating everything in terms of instrumental outcomes or competences run a risk of lowering quality and becoming prey to political interference. Standards based on outcome alone raise unresolved problems in relation to the licensure requirements of medical doctors. It is argued that the alleged dichotomy between process/content and outcome seems artificial, and that the formulation of standards in medical education must follow a comprehensive line in curricular planning.
Modeling winter hydrological processes under differing climatic conditions: Modifying WEPP
NASA Astrophysics Data System (ADS)
Dun, Shuhui
Water erosion is a serious and continuing environmental problem worldwide. In cold regions, soil freeze and thaw have great impacts on infiltration and erosion. Rain or snowmelt on a thawing soil can cause severe water erosion. Of equal importance are snow accumulation and snowmelt, which can be the predominant hydrological processes in areas of mid- to high latitudes and in forested watersheds. Modelers must properly simulate winter processes to adequately represent the overall hydrological outcome and sediment and chemical transport in these areas. Winter hydrology is presently lacking in water erosion models. Most of these models are based on the functional Universal Soil Loss Equation (USLE) or its revised forms, e.g., the Revised USLE (RUSLE). In RUSLE, a seasonally variable soil erodibility factor (K) is used to account for the effects of frozen and thawing soil. Yet the use of this factor requires observational data for calibration, and such a simplified approach cannot represent the complicated transient freeze-thaw processes and their impacts on surface runoff and erosion. The Water Erosion Prediction Project (WEPP) watershed model, physically based erosion prediction software developed by the USDA-ARS, has seen numerous applications within and outside the US. WEPP simulates winter processes, including snow accumulation, snowmelt, and soil freeze-thaw, using an approach based on mass and energy conservation. However, previous studies showed the inadequacy of the winter routines in the WEPP model. Therefore, the objectives of this study were: (1) to adapt a modeling approach for winter hydrology based on mass and energy conservation, and to implement this approach in a physically oriented hydrological model, such as WEPP; and (2) to assess this modeling approach through case applications to different geographic conditions. A new winter routine was developed and its performance was evaluated by incorporating it into WEPP (v2008.9) and then applying WEPP to four study sites at different spatial scales under different climatic conditions, including experimental plots in Pullman, WA and Morris, MN, two agricultural drainages in Pendleton, OR, and a forest watershed in Mica Creek, ID. The model applications showed promising results, indicating the adequacy of the mass- and energy-balance-based approach for winter hydrology simulation.
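To give a flavor of the mass-balance bookkeeping a winter routine performs, here is a minimal degree-day snow accumulation/melt sketch. This is a common temperature-index simplification, not WEPP's actual energy-balance routine; the degree-day factor, snow/rain threshold and synthetic forcing are all assumptions.

```python
import numpy as np

def snow_water_balance(precip_mm, temp_c, t_snow=0.0, ddf=3.0):
    """Daily snowpack SWE and melt via a degree-day scheme.

    precip_mm, temp_c : daily precipitation (mm) and mean air temperature (C)
    t_snow            : temperature below which precipitation falls as snow
    ddf               : degree-day melt factor (mm per degree C per day)
    """
    swe, swe_series, melt_series = 0.0, [], []
    for p, t in zip(precip_mm, temp_c):
        if t <= t_snow:
            swe += p                                  # accumulate snowfall
            melt = 0.0
        else:
            melt = min(swe, ddf * (t - t_snow))       # temperature-index melt
            swe -= melt
        swe_series.append(swe)
        melt_series.append(melt)
    return np.array(swe_series), np.array(melt_series)

rng = np.random.default_rng(0)
p = rng.gamma(2.0, 2.0, 120)                          # synthetic winter forcing
t = 5 * np.sin(np.linspace(-np.pi, np.pi, 120)) - 2 + rng.normal(0, 2, 120)
swe, melt = snow_water_balance(p, t)
print(f"peak SWE = {swe.max():.1f} mm, total melt = {melt.sum():.1f} mm")
```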
Lai, Ying-Hui; Tsao, Yu; Lu, Xugang; Chen, Fei; Su, Yu-Ting; Chen, Kuang-Chao; Chen, Yu-Hsuan; Chen, Li-Ching; Po-Hung Li, Lieber; Lee, Chin-Hui
2018-01-20
We investigate the clinical effectiveness of a novel deep learning-based noise reduction (NR) approach under noisy conditions with challenging noise types at low signal-to-noise ratio (SNR) levels for Mandarin-speaking cochlear implant (CI) recipients. The deep learning-based NR approach used in this study consists of two modules, a noise classifier (NC) and a deep denoising autoencoder (DDAE), and is thus termed NC + DDAE. In a series of comprehensive experiments, we conduct qualitative and quantitative analyses of the NC module and the overall NC + DDAE approach. Moreover, we evaluate the speech recognition performance of the NC + DDAE NR and classical single-microphone NR approaches for Mandarin-speaking CI recipients under different noisy conditions. The testing set contains Mandarin sentences corrupted by two types of maskers, two-talker babble noise and construction jackhammer noise, at 0 and 5 dB SNR levels. Two conventional NR techniques and the proposed deep learning-based approach are used to process the noisy utterances. We qualitatively compare the NR approaches via the amplitude envelope and spectrogram plots of the processed utterances. Quantitative objective measures include (1) the normalized covariance measure, to test the intelligibility of the utterances processed by each of the NR approaches; and (2) speech recognition tests conducted with nine Mandarin-speaking CI recipients. These nine CI recipients used their own clinical speech processors during testing. The experimental results of the objective evaluation and the listening test indicate that under challenging listening conditions, the proposed NC + DDAE NR approach yields higher intelligibility scores than the two classical NR techniques, under both matched and mismatched training-testing conditions. Compared to the two well-known conventional NR techniques under challenging listening conditions, the proposed NC + DDAE NR approach has superior noise suppression capabilities and introduces less distortion of the key speech envelope information, thus improving speech recognition more effectively for Mandarin CI recipients. The results suggest that the proposed deep learning-based NR approach can potentially be integrated into existing CI signal processors to overcome the degradation of speech perception caused by noise.
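A minimal sketch of the DDAE module's shape is given below. The framework, layer sizes, feature dimension and random training pairs are all assumptions; the clinical system's actual architecture and training corpus are not described in the abstract.

```python
import numpy as np
import tensorflow as tf

# Denoising autoencoder mapping noisy log-magnitude spectra to clean ones.
n_bins = 257                                     # e.g., 512-point STFT -> 257 bins
model = tf.keras.Sequential([
    tf.keras.Input(shape=(n_bins,)),
    tf.keras.layers.Dense(1024, activation="relu"),
    tf.keras.layers.Dense(1024, activation="relu"),
    tf.keras.layers.Dense(n_bins)                # linear output: enhanced spectrum
])
model.compile(optimizer="adam", loss="mse")

# Hypothetical paired training data: rows are frames of noisy/clean features
noisy = np.random.rand(1000, n_bins).astype("float32")
clean = np.random.rand(1000, n_bins).astype("float32")
model.fit(noisy, clean, epochs=2, batch_size=64, verbose=0)

# At run time, the separate noise classifier (NC) would first detect the noise
# type and select the matching DDAE before enhancement is applied.
enhanced = model.predict(noisy[:1])
```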
Training for Template Creation: A Performance Improvement Method
ERIC Educational Resources Information Center
Lyons, Paul
2008-01-01
Purpose: There are three purposes to this article: first, to offer a training approach to employee learning and performance improvement that makes use of a step-by-step process of skill/knowledge creation. The process offers follow-up opportunities for skill maintenance and improvement; second, to explain the conceptual bases of the approach; and…
Beyond Competence: An Essay on a Process Approach to Organising and Enacting Vocational Education
ERIC Educational Resources Information Center
Billett, Stephen
2016-01-01
The competency-based approach to vocational education is premised on narrow and dated conceptions of human functioning, performance and development. Its adoption is more driven by administrative concerns about measurable outcomes than educational processes and outcomes. Informed by educational science and earlier debates, this article discusses…
From Career Decision-Making Styles to Career Decision-Making Profiles: A Multidimensional Approach
ERIC Educational Resources Information Center
Gati, Itamar; Landman, Shiri; Davidovitch, Shlomit; Asulin-Peretz, Lisa; Gadassi, Reuma
2010-01-01
Previous research on individual differences in career decision-making processes has often focused on classifying individuals into a few types of decision-making "styles" based on the most dominant trait or characteristic of their approach to the decision process (e.g., rational, intuitive, dependent; Harren, 1979). In this research, an…
An Improved Incremental Learning Approach for KPI Prognosis of Dynamic Fuel Cell System.
Yin, Shen; Xie, Xiaochen; Lam, James; Cheung, Kie Chung; Gao, Huijun
2016-12-01
The key performance indicator (KPI) has important practical value with respect to product quality and economic benefits in modern industry. To cope with the KPI prognosis issue under nonlinear conditions, this paper presents an improved incremental learning approach based on available process measurements. The proposed approach exploits the algorithmic overlap between locally weighted projection regression (LWPR) and partial least squares (PLS), implementing PLS-based prognosis in each locally linear model produced by the incremental learning process of LWPR. The global prognosis results, including KPI prediction and process monitoring, are obtained from the corresponding normalized weighted means of all the local models. The statistical indicators for prognosis are enhanced as well by the design of novel KPI-related and KPI-unrelated statistics with suitable control limits for non-Gaussian data. For application-oriented purposes, process measurements from real datasets of a proton exchange membrane fuel cell system are employed to demonstrate the effectiveness of KPI prognosis. The proposed approach is finally extended to long-term voltage prediction for potential reference in further fuel cell applications.
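The locally weighted flavor of the approach can be sketched with scikit-learn's PLS regression. The cluster count, kernel width and synthetic data are assumptions, and a batch KMeans partition stands in for LWPR's incrementally learned receptive fields; only the normalized-weight combination of local PLS predictions follows the abstract.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 6))                     # process measurements
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=500)  # KPI

# Partition the input space into local regions (stand-in for LWPR receptive fields)
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X)
models = [PLSRegression(n_components=2).fit(X[km.labels_ == k], y[km.labels_ == k])
          for k in range(5)]

def predict(x, width=2.0):
    """Normalized-weight combination of local PLS predictions (Gaussian weights)."""
    d2 = ((km.cluster_centers_ - x) ** 2).sum(axis=1)
    w = np.exp(-d2 / (2 * width ** 2))
    w /= w.sum()
    preds = np.array([m.predict(x.reshape(1, -1)).ravel()[0] for m in models])
    return float(w @ preds)

print(f"predicted KPI = {predict(X[0]):.3f}, actual = {y[0]:.3f}")
```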
NASA Astrophysics Data System (ADS)
Divine, D. V.; Godtliebsen, F.; Rue, H.
2012-01-01
The paper proposes an approach to the assessment of timescale errors in proxy-based series with chronological uncertainties. The method relies on approximating the physical process(es) forming a proxy archive by a random Gamma process. The parameters of the process are partly data-driven and partly determined from prior assumptions. For the particular case of a linear accumulation model and absolutely dated tie points, an analytical solution is found, suggesting a Beta-distributed probability density on age estimates along the length of a proxy archive. In the general situation of uncertainty in the ages of the tie points, the proposed method employs MCMC simulations of age-depth profiles, yielding empirical confidence intervals on the constructed piecewise-linear best-guess timescale. It is suggested that the approach can be further extended to the more general case of a time-varying expected accumulation between the tie points. The approach is illustrated using two ice cores and two lake/marine sediment cores, representing typical examples of paleoproxy archives with age models based on tie points of mixed origin.
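The Monte Carlo step can be illustrated as follows. The Gamma shape/scale, layer count and tie-point ages are assumptions, and the paper's MCMC over uncertain tie-point ages is simplified to direct simulation between two absolutely dated tie points, where conditioning reduces to normalizing each cumulative accumulation path.

```python
import numpy as np

rng = np.random.default_rng(42)
n_layers, n_sims = 100, 5000
age_top, age_bottom = 0.0, 1000.0                   # absolutely dated tie points (yr)

# Accumulation increments per layer as i.i.d. Gamma variables; conditioning on
# the tie points amounts to rescaling each cumulative path to the known age span.
increments = rng.gamma(shape=2.0, scale=1.0, size=(n_sims, n_layers))
paths = np.cumsum(increments, axis=1)
paths = age_top + (age_bottom - age_top) * paths / paths[:, -1:]

best_guess = np.linspace(age_top, age_bottom, n_layers)   # piecewise-linear guess
lo, hi = np.percentile(paths, [2.5, 97.5], axis=0)        # empirical 95% band

mid = n_layers // 2
print(f"mid-core age: best guess {best_guess[mid]:.0f} yr, "
      f"95% interval [{lo[mid]:.0f}, {hi[mid]:.0f}] yr")
```

Consistent with the analytical result quoted above, the normalized age at any interior depth follows a Beta-type distribution, widest mid-way between the tie points.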
Loutfy, Mona; Greene, Saara; Kennedy, V Logan; Lewis, Johanna; Thomas-Pavanel, Jamie; Conway, Tracey; de Pokomandy, Alexandra; O'Brien, Nadia; Carter, Allison; Tharao, Wangari; Nicholson, Valerie; Beaver, Kerrigan; Dubuc, Danièle; Gahagan, Jacqueline; Proulx-Boucher, Karène; Hogg, Robert S; Kaida, Angela
2016-08-19
Community-based research has gained increasing recognition in health research over the last two decades. Such participatory research approaches are lauded for their ability to anchor research in lived experiences, ensuring cultural appropriateness, accessing local knowledge, reaching marginalized communities, building capacity, and facilitating research-to-action. While having these positive attributes, the community-based health research literature is predominantly composed of small projects, using qualitative methods, and set within geographically limited communities. Its use in larger health studies, including clinical trials and cohorts, is limited. We present the Canadian HIV Women's Sexual and Reproductive Health Cohort Study (CHIWOS), a large-scale, multi-site, national, longitudinal quantitative study that has operationalized community-based research in all steps of the research process. Successes, challenges and further considerations are offered. Through the integration of community-based research principles, we have been successful in: facilitating a two-year long formative phase for this study; developing a novel survey instrument with national involvement; training 39 Peer Research Associates (PRAs); offering ongoing comprehensive support to PRAs; and engaging in an ongoing iterative community-based research process. Our community-based research approach within CHIWOS demanded that we be cognizant of challenges managing a large national team, inherent power imbalances and challenges with communication, compensation and volunteering considerations, and extensive delays in institutional processes. It is important to consider the iterative nature of community-based research and to work through tensions that emerge given the diverse perspectives of numerous team members. Community-based research, as an approach to large-scale quantitative health research projects, is an increasingly viable methodological option. Community-based research has several advantages that go hand-in-hand with its obstacles. We offer guidance on implementing this approach, such that the process can be better planned and result in success.
Modeling marine oily wastewater treatment by a probabilistic agent-based approach.
Jing, Liang; Chen, Bing; Zhang, Baiyu; Ye, Xudong
2018-02-01
This study developed a novel probabilistic agent-based approach for the modeling of marine oily wastewater treatment processes. It begins by constructing a probability-based agent simulation model, followed by a global sensitivity analysis and a genetic algorithm-based calibration. The proposed modeling approach was tested through a case study of the removal of naphthalene from marine oily wastewater using UV irradiation. The removal of naphthalene was described by an agent-based simulation model using 8 types of agents and 11 reactions. Each reaction was governed by a probability parameter determining its occurrence. The modeling results showed that the root mean square errors between modeled and observed removal rates were 8.73 and 11.03% for the calibration and validation runs, respectively. Reaction competition was analyzed by comparing agent-based reaction probabilities, while agents' heterogeneity was visualized by plotting their real-time spatial distribution, showing strong potential for reactor design and process optimization. Copyright © 2017 Elsevier Ltd. All rights reserved.
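The probability-gated reaction loop at the heart of such a model might look like the sketch below. The species, the single reaction shown and the probability value are illustrative assumptions; the paper's model has 8 agent types and 11 reactions, with probabilities calibrated by a genetic algorithm.

```python
import random

random.seed(0)

# Agent populations: counts per species (a stand-in for individual spatial agents)
agents = {"naphthalene": 500, "OH_radical": 300, "product": 0}
p_react = 0.004          # per-encounter reaction probability per step
                         # (GA-calibrated in the paper; arbitrary here)

for step in range(200):
    # Each naphthalene-radical encounter fires with probability p_react
    encounters = min(agents["naphthalene"], agents["OH_radical"])
    fired = sum(1 for _ in range(encounters) if random.random() < p_react)
    agents["naphthalene"] -= fired
    agents["OH_radical"] -= fired
    agents["product"] += fired

removal = 1 - agents["naphthalene"] / 500
print(f"naphthalene removal after 200 steps: {removal:.1%}")
```

With several reactions sharing reactant pools, comparing their fitted probabilities directly exposes reaction competition, as described in the abstract.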
Masters, Kevin S; Ross, Kaile M; Hooker, Stephanie A; Wooldridge, Jennalee L
2018-05-18
There has been a notable disconnect between theories of behavior change and behavior change interventions. Because few interventions are both explicitly and adequately theory-based, investigators cannot assess the impact of theory on intervention effectiveness. Theory-based interventions, designed to deliberately engage a theory's proposed mechanisms of change, are needed to adequately test theories. Thus, systematic approaches to theory-based intervention development are needed. This article introduces and discusses the psychometric method of developing theory-based interventions. The psychometric approach to intervention development utilizes basic psychometric principles at each step of the intervention development process in order to build a theoretically driven intervention that is subsequently tested in process (mechanism) and outcome studies. Five stages of intervention development are presented: (i) choice of theory; (ii) identification and characterization of key concepts and expected relations; (iii) intervention construction; (iv) initial testing and revision; and (v) empirical testing of the intervention. Examples of this approach from the Colorado Meaning-Activity Project (COMAP) are presented. Based on self-determination theory integrated with meaning or purpose, and utilizing a motivational interviewing approach, the COMAP intervention is individually based, with an initial interview followed by smartphone-delivered interventions for increasing daily activity. The psychometric approach to intervention development is one method of ensuring careful consideration of theory in all steps of intervention development. This structured approach supports the development of a research culture that endorses deliberate and systematic operationalization of theory into behavior change interventions from the outset of intervention development.
How we flipped the medical classroom.
Sharma, Neel; Lau, C S; Doherty, Iain; Harbutt, Darren
2015-04-01
Flipping the classroom centres on the delivery of print-, audio- or video-based material prior to a lecture or class session. The class session is then dedicated to more active learning processes, with application of knowledge through problem solving or case-based scenarios. The rationale behind this approach is that teachers can spend their face-to-face time supporting students in deeper learning processes. In this paper we provide a background literature review on the flipped classroom, along with a three-step approach to flipping the classroom comprising implementing, enacting and evaluating this form of pedagogy. Our three-step approach is based on actual experience of delivering a flipped classroom at the University of Hong Kong. This initiative was evaluated with positive results. We hope our experience will be transferable to other medical institutions.
NASA Astrophysics Data System (ADS)
Peltola, Olli; Raivonen, Maarit; Li, Xuefei; Vesala, Timo
2018-02-01
Emission via bubbling, i.e. ebullition, is one of the main methane (CH4) emission pathways from wetlands to the atmosphere. Direct measurement of gas bubble formation, growth and release in the peat-water matrix is challenging; as a consequence, these processes remain relatively poorly understood and are coarsely represented in current wetland CH4 emission models. In this study we aimed to evaluate three ebullition modelling approaches and their effect on model performance. This was achieved by implementing the three approaches in one process-based CH4 emission model. All three approaches are threshold-based, triggering ebullition on CH4 pore water concentration (ECT), pressure (EPT) or free-phase gas volume (EBG). The model was run using 4 years of data from a boreal sedge fen and the results were compared with eddy covariance measurements of CH4 fluxes.
Modelled annual CH4 emissions were largely unaffected by the choice of ebullition modelling approach; however, the temporal variability in CH4 emissions varied by an order of magnitude across the approaches. Hence the ebullition modelling approach drives the temporal variability in modelled CH4 emissions and therefore significantly impacts, for instance, high-frequency (daily scale) model comparison and calibration against measurements. The approach based on the most recent knowledge of the ebullition process (volume threshold, EBG) agreed best with the measured fluxes (R2 = 0.63) and hence produced the most reasonable results, although there was a scale mismatch between the measurements (ecosystem scale with heterogeneous ebullition locations) and the model results (a single horizontally homogeneous peat column). This approach should be favoured over the two other, more widely used ebullition modelling approaches, and researchers are encouraged to implement it in their CH4 emission models.
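The volume-threshold (EBG-style) idea can be sketched in a few lines of Python; the threshold and production values below are made-up placeholders, not the calibrated parameters of the cited model.

V_MAX = 0.05      # free-phase gas volume fraction that triggers bubble release

def ebullition_step(gas_volume, ch4_production):
    """One time step: gas accumulates; the excess above the threshold escapes."""
    gas_volume += ch4_production          # gas produced in the peat column
    flux = 0.0
    if gas_volume > V_MAX:                # bubbles released once threshold is hit
        flux = gas_volume - V_MAX
        gas_volume = V_MAX
    return gas_volume, flux

gas, total = 0.0, 0.0
for _ in range(365):                      # e.g. daily production for one year
    gas, flux = ebullition_step(gas, 0.001)
    total += flux
print(f"annual ebullitive release (volume units): {total:.3f}")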
Scalable and responsive event processing in the cloud
Suresh, Visalakshmi; Ezhilchelvan, Paul; Watson, Paul
2013-01-01
Event processing involves continuous evaluation of queries over streams of events. Response-time optimization is traditionally done over a fixed set of nodes and/or by using metrics measured at query-operator levels. Cloud computing makes it easy to acquire and release computing nodes as required. Leveraging this flexibility, we propose a novel, queueing-theory-based approach for meeting specified response-time targets against fluctuating event arrival rates by drawing only the necessary amount of computing resources from a cloud platform. In the proposed approach, the entire processing engine of a distinct query is modelled as an atomic unit for predicting response times. Several such units hosted on a single node are modelled as a multiple class M/G/1 system. These aspects eliminate intrusive, low-level performance measurements at run-time, and also offer portability and scalability. Using model-based predictions, cloud resources are efficiently used to meet response-time targets. The efficacy of the approach is demonstrated through cloud-based experiments. PMID:23230164
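A single query-processing unit of the kind the abstract describes can be approximated by the textbook M/G/1 mean response time (Pollaczek-Khinchine formula). This sketch shows only that building block; the paper's multiple-class extension and cloud-scaling logic are not reproduced.

def mg1_response_time(arrival_rate, mean_service, service_2nd_moment):
    """Mean response time of an M/G/1 queue via Pollaczek-Khinchine."""
    rho = arrival_rate * mean_service          # server utilization
    if rho >= 1.0:
        raise ValueError("unstable: utilization >= 1")
    wait = arrival_rate * service_2nd_moment / (2.0 * (1.0 - rho))
    return wait + mean_service                 # queueing delay + service time

# e.g. 50 events/s, 10 ms mean service, exponential service (E[S^2] = 2*mean^2)
print(mg1_response_time(50.0, 0.010, 2 * 0.010 ** 2))   # -> 0.02 s

A scaling policy of the kind described would compare such predicted response times against the target and acquire or release cloud nodes accordingly.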
Implementation of science process skills using ICT-based approach to facilitate student life skills
NASA Astrophysics Data System (ADS)
Rahayu, Y. S.; Yuliani; Wijaya, B. R.
2018-01-01
The purpose of this study is to describe the results of implementing a teaching-learning package in a Plant Physiology course to improve students' life skills using an ICT-based science process skills approach. The research involved 15 undergraduate Biology Education students in the international class who were enrolled in the Plant Physiology course. The study consisted of two phases, namely a development phase and an implementation phase using a one-shot case study design. Research parameters were the feasibility of the lesson plans and student achievement, including academic skills, thinking skills, and social skills. Data were analyzed descriptively according to the characteristics of the existing data. The results show that the lesson plans are highly feasible and that students' life skills improved, especially their scientific thinking skills. This indicates that the ICT-based science process skills approach can be an effective method for improving students' life skills.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Jiangjiang; Li, Weixuan; Zeng, Lingzao
Surrogate models are commonly used in Bayesian approaches such as Markov Chain Monte Carlo (MCMC) to avoid repetitive CPU-demanding model evaluations. However, the approximation error of a surrogate may lead to biased estimations of the posterior distribution. This bias can be corrected by constructing a very accurate surrogate or by implementing MCMC in a two-stage manner. Since two-stage MCMC requires extra original model evaluations, the computational cost is still high. If the measurement information is incorporated, a locally accurate approximation of the original model can be adaptively constructed at low computational cost. Based on this idea, we propose a Gaussian process (GP) surrogate-based Bayesian experimental design and parameter estimation approach for groundwater contaminant source identification problems. A major advantage of the GP surrogate is that it provides a convenient estimate of the approximation error, which can be incorporated in the Bayesian formula to avoid over-confident estimation of the posterior distribution. The proposed approach is tested with a numerical case study. Without sacrificing estimation accuracy, the new approach achieves a speed-up of about 200 times compared to our previous work using two-stage MCMC.
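The core idea, folding the surrogate's own approximation error into the likelihood so the posterior is not over-confident, can be sketched as follows; the GP object is a hypothetical stand-in, not the authors' implementation.

import numpy as np

class FittedGP:
    """Stand-in surrogate: predict(theta) -> (mean, std of approximation error)."""
    def predict(self, theta):
        return 2.0 * theta, 0.1 * np.ones_like(theta)

def log_likelihood(theta, y_obs, obs_var, gp):
    mu, sigma = gp.predict(theta)
    total_var = obs_var + sigma ** 2        # measurement + surrogate error
    return -0.5 * np.sum((y_obs - mu) ** 2 / total_var
                         + np.log(2.0 * np.pi * total_var))

theta = np.array([1.0, 2.0])
print(log_likelihood(theta, np.array([2.1, 3.9]), 0.05, FittedGP()))

An MCMC sampler would call this inflated likelihood at each proposal, so regions where the surrogate is uncertain are penalized rather than trusted blindly.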
EPA announced the availability of the final report, Considerations for Developing a Dosimetry-Based Cumulative Risk Assessment Approach for Mixtures of Environmental Contaminants. This report describes a process that can be used to determine the potential value of develop...
A Social Neuroscientific Model of Vocational Behavior
ERIC Educational Resources Information Center
Hansen, Jo-Ida C.; Sullivan, Brandon A.; Luciana, Monica
2011-01-01
In this article, the separate literatures of a neurobiologically based approach system and vocational interests are reviewed and integrated into a social neuroscientific model of the processes underlying interests, based upon the idea of selective approach motivation. The authors propose that vocational interests describe the types of stimuli that…
Towards an improved ensemble precipitation forecast: A probabilistic post-processing approach
NASA Astrophysics Data System (ADS)
Khajehei, Sepideh; Moradkhani, Hamid
2017-03-01
Recently, ensemble post-processing (EPP) has become a commonly used approach for reducing the uncertainty in forcing data and hence in hydrologic simulation. The procedure was introduced to build ensemble precipitation forecasts based on the statistical relationship between observations and forecasts. More specifically, the approach relies on a transfer function developed from a bivariate joint distribution of the observations and the simulations in a historical period; the transfer function is then used to post-process the forecast. In this study, we propose a Bayesian EPP approach based on copula functions (COP-EPP) to improve the reliability of the precipitation ensemble forecast. Evaluation of the copula-based method is carried out by comparing the performance of the generated ensemble precipitation with the output of an existing procedure, the mixed-type meta-Gaussian distribution. Monthly precipitation from the Climate Forecast System Reanalysis (CFS) and gridded observations from the Parameter-Elevation Relationships on Independent Slopes Model (PRISM) were employed to generate the post-processed ensemble precipitation. Deterministic and probabilistic verification frameworks are utilized to evaluate the outputs of the proposed technique. The distribution of seasonal precipitation for the ensemble generated by the copula-based technique is compared to the observations and raw forecasts for three sub-basins located in the Western United States. Results show that both techniques are successful in producing reliable and unbiased ensemble forecasts; however, COP-EPP demonstrates considerable improvement in both deterministic and probabilistic verification, in particular in characterizing extreme events in wet seasons.
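A minimal Gaussian-copula post-processing step might look like the following Python sketch: it links the forecast and observation marginals through normal scores and conditions on a new forecast to draw an ensemble. It is illustrative only and does not reproduce the paper's COP-EPP formulation or its copula families.

import numpy as np
from scipy import stats

def cop_epp(hist_fcst, hist_obs, new_fcst, n_members=50, seed=0):
    rng = np.random.default_rng(seed)
    # Empirical marginals -> normal scores (the Gaussian copula step)
    u_f = stats.rankdata(hist_fcst) / (len(hist_fcst) + 1)
    u_o = stats.rankdata(hist_obs) / (len(hist_obs) + 1)
    z_f, z_o = stats.norm.ppf(u_f), stats.norm.ppf(u_o)
    r = np.corrcoef(z_f, z_o)[0, 1]            # copula correlation
    p = np.clip(stats.percentileofscore(hist_fcst, new_fcst) / 100.0, 0.01, 0.99)
    z_new = stats.norm.ppf(p)
    # Conditional normal scores for the observation, then back to obs units
    draws = r * z_new + np.sqrt(1.0 - r ** 2) * rng.standard_normal(n_members)
    return np.quantile(hist_obs, stats.norm.cdf(draws))

rng = np.random.default_rng(1)
hist_f = rng.gamma(2.0, 10.0, 300)             # synthetic historical forecasts
hist_o = 0.8 * hist_f + rng.normal(0.0, 5.0, 300)
print(cop_epp(hist_f, hist_o, 25.0)[:5])       # five post-processed members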
Prabhakar, P.; Sames, William J.; Dehoff, Ryan R.; ...
2015-03-28
Here, a computational modeling approach to simulate residual stress formation during the electron beam melting (EBM) process, within the additive manufacturing (AM) technologies, is presented for Inconel 718. The EBM process has demonstrated a high potential to fabricate components with complex geometries, but the resulting components are influenced by the thermal cycles observed during the manufacturing process. When processing nickel-based superalloys, very high temperatures (approx. 1000 °C) are observed in the powder bed, base plate, and build. These high temperatures, when combined with substrate adherence, can result in warping of the base plate and affect the final component by causing defects. It is important to understand the thermo-mechanical response of the entire system, that is, its mechanical behavior under the thermal loading occurring during the EBM process, prior to manufacturing a component. Therefore, computational models that predict the response of the system during the EBM process will aid in eliminating undesired process conditions, a priori, in order to fabricate the optimum component. Such a comprehensive computational modeling approach is demonstrated to analyze warping of the base plate, stress and plastic strain accumulation within the material, and thermal cycles in the system during different stages of the EBM process.
Improving the learning of clinical reasoning through computer-based cognitive representation
Wu, Bian; Wang, Minhong; Johnson, Janice M.; Grotzer, Tina A.
2014-01-01
Objective: Clinical reasoning is usually taught using a problem-solving approach, which is widely adopted in medical education. However, learning through problem solving is difficult as a result of the contextualization and dynamic aspects of actual problems. Moreover, knowledge acquired from problem-solving practice tends to be inert and fragmented. This study proposed a computer-based cognitive representation approach that externalizes and facilitates the complex processes in learning clinical reasoning. The approach is operationalized in a computer-based cognitive representation tool that involves argument mapping to externalize the problem-solving process and concept mapping to reveal the knowledge constructed from the problems. Methods: Twenty-nine Year 3 or higher students from a medical school in east China participated in the study. Participants used the proposed approach implemented in an e-learning system to complete four learning cases in 4 weeks on an individual basis. For each case, students interacted with the problem to capture critical data, generate and justify hypotheses, make a diagnosis, recall relevant knowledge, and update their conceptual understanding of the problem domain. Meanwhile, students used the computer-based cognitive representation tool to articulate and represent the key elements and their interactions in the learning process. Results: A significant improvement was found in students' learning products from the beginning to the end of the study, consistent with students' reports of close-to-moderate progress in developing problem-solving and knowledge-construction abilities. No significant differences were found between the pretest and posttest scores within the 4-week period. The cognitive representation approach was found to provide more formative assessment. Conclusions: The computer-based cognitive representation approach improved the learning of clinical reasoning in both problem solving and knowledge construction. PMID:25518871
ERIC Educational Resources Information Center
Zhang, Xiaolei; Wong, Jocelyn L. N.
2018-01-01
Studies of professional development have examined the influence of school-based approaches on in-service teacher learning and change but have seldom investigated teachers' job-embedded learning processes. This paper explores the dynamic processes of teacher learning in school-based settings. A qualitative comparative case study based on the…
NASA Astrophysics Data System (ADS)
Müller, M. F.; Thompson, S. E.
2016-02-01
The prediction of flow duration curves (FDCs) in ungauged basins remains an important task for hydrologists given the practical relevance of FDCs for water management and infrastructure design. Predicting FDCs in ungauged basins typically requires spatial interpolation of statistical or model parameters. This task is complicated if climate becomes non-stationary, as the prediction challenge now also requires extrapolation through time. In this context, process-based models for FDCs that mechanistically link the streamflow distribution to climate and landscape factors may have an advantage over purely statistical methods to predict FDCs. This study compares a stochastic (process-based) and statistical method for FDC prediction in both stationary and non-stationary contexts, using Nepal as a case study. Under contemporary conditions, both models perform well in predicting FDCs, with Nash-Sutcliffe coefficients above 0.80 in 75 % of the tested catchments. The main drivers of uncertainty differ between the models: parameter interpolation was the main source of error for the statistical model, while violations of the assumptions of the process-based model represented the main source of its error. The process-based approach performed better than the statistical approach in numerical simulations with non-stationary climate drivers. The predictions of the statistical method under non-stationary rainfall conditions were poor if (i) local runoff coefficients were not accurately determined from the gauge network, or (ii) streamflow variability was strongly affected by changes in rainfall. A Monte Carlo analysis shows that the streamflow regimes in catchments characterized by frequent wet-season runoff and a rapid, strongly non-linear hydrologic response are particularly sensitive to changes in rainfall statistics. In these cases, process-based prediction approaches are favored over statistical models.
ERIC Educational Resources Information Center
Thomson, Jennifer M.; Leong, Victoria; Goswami, Usha
2013-01-01
The purpose of this study was to compare the efficacy of two auditory processing interventions for developmental dyslexia, one based on rhythm and one based on phonetic training. Thirty-three children with dyslexia participated and were assigned to one of three groups (a) a novel rhythmic processing intervention designed to highlight auditory…
Applying SF-Based Genre Approaches to English Writing Class
ERIC Educational Resources Information Center
Wu, Yan; Dong, Hailin
2009-01-01
By exploring genre approaches in systemic functional linguistics and examining the analytic tools that can be applied to the process of English learning and teaching, this paper seeks to find a way of applying genre approaches to English writing class.
A Hospital Is Not Just a Factory, but a Complex Adaptive System-Implications for Perioperative Care.
Mahajan, Aman; Islam, Salim D; Schwartz, Michael J; Cannesson, Maxime
2017-07-01
Many methods used to improve hospital and perioperative services productivity and quality of care have assumed that the hospital is essentially a factory, and therefore, that industrial engineering and manufacturing-derived redesign approaches such as Six Sigma and Lean can be applied to hospitals and perioperative services just as they have been applied in factories. However, a hospital is not merely a factory but also a complex adaptive system (CAS). The hospital CAS has many subsystems, with perioperative care being an important one for which concepts of factory redesign are frequently advocated. In this article, we argue that applying only factory approaches such as lean methodologies or process standardization to complex systems such as perioperative care could account for difficulties and/or failures in improving performance in care delivery. Within perioperative services, only noncomplex/low-variance surgical episodes are amenable to manufacturing-based redesign. On the other hand, complex surgery/high-variance cases and preoperative segmentation (the process of distinguishing between normal and complex cases) can be viewed as CAS-like. These systems tend to self-organize, often resist or react unpredictably to attempts at control, and therefore require application of CAS principles to modify system behavior. We describe 2 examples of perioperative redesign to illustrate the concepts outlined above. These examples present complementary and contrasting cases from 2 leading delivery systems. The Mayo Clinic example illustrates the application of manufacturing-based redesign principles to a factory-like (high-volume, low-risk, and mature practice) clinical program, while the Kaiser Permanente example illustrates the application of both manufacturing-based and self-organization-based approaches to programs and processes that are not factory-like but CAS-like. In this article, we describe how factory-like processes and CAS can coexist within a hospital and how self-organization-based approaches can be used to improve care delivery in many situations where manufacturing-based approaches may not be appropriate.
Agent-based model of angiogenesis simulates capillary sprout initiation in multicellular networks
Walpole, J.; Chappell, J.C.; Cluceru, J.G.; Mac Gabhann, F.; Bautch, V.L.; Peirce, S. M.
2015-01-01
Many biological processes are controlled by both deterministic and stochastic influences. However, efforts to model these systems often rely on either purely stochastic or purely rule-based methods. To better understand the balance between stochasticity and determinism in biological processes a computational approach that incorporates both influences may afford additional insight into underlying biological mechanisms that give rise to emergent system properties. We apply a combined approach to the simulation and study of angiogenesis, the growth of new blood vessels from existing networks. This complex multicellular process begins with selection of an initiating endothelial cell, or tip cell, which sprouts from the parent vessels in response to stimulation by exogenous cues. We have constructed an agent-based model of sprouting angiogenesis to evaluate endothelial cell sprout initiation frequency and location, and we have experimentally validated it using high-resolution time-lapse confocal microscopy. ABM simulations were then compared to a Monte Carlo model, revealing that purely stochastic simulations could not generate sprout locations as accurately as the rule-informed agent-based model. These findings support the use of rule-based approaches for modeling the complex mechanisms underlying sprouting angiogenesis over purely stochastic methods. PMID:26158406
CNES reliability approach for the qualification of MEMS for space
NASA Astrophysics Data System (ADS)
Pressecq, Francis; Lafontan, Xavier; Perez, Guy; Fortea, Jean-Pierre
2001-10-01
This paper describes the reliability approach employed at CNES to evaluate MEMS for space applications. After an introduction and a detailed state of the art on space requirements and on the use of MEMS for space, different approaches for taking MEMS into account in the qualification phases are presented. CNES proposes improvements to these approaches in terms of failure mechanism identification. Our approach is based on a design and test phase tightly linked with a technology study. This workflow is illustrated with an example: the case of a variable capacitor fabricated with the MUMPS process.
ERIC Educational Resources Information Center
Yen, Cheng-Huang; Chen, I-Chuan; Lai, Su-Chun; Chuang, Yea-Ru
2015-01-01
Traces of learning behaviors generally provide insights into learners and the learning processes that they employ. In this article, a learning-analytics-based approach is proposed for managing cognitive load by adjusting the instructional strategies used in online courses. The technology-based learning environment examined in this study involved a…
ERIC Educational Resources Information Center
McLean, Monica; Walker, Melanie
2012-01-01
The education of professionals oriented to poverty reduction and the public good is the focus of the article. Sen's "capability approach" is used to conceptualise university-based professional education as a process of developing public-good professional capabilities. The main output of a research project on professional education in…
Data near processing support for climate data analysis
NASA Astrophysics Data System (ADS)
Kindermann, Stephan; Ehbrecht, Carsten; Hempelmann, Nils
2016-04-01
Climate data repositories grow exponentially in size. Scalable data-near processing capabilities are required to meet future data analysis requirements and to replace current "download and process at home" workflows. On the one hand, these processing capabilities should be accessible via standardized interfaces (e.g. OGC WPS); on the other, a large variety of processing tools, toolboxes and deployment alternatives have to be supported and maintained at the data/processing center. We present a community approach to a modular and flexible system supporting the development, deployment and maintenance of OGC-WPS-based web processing services. This approach is organized as an open source GitHub project (called "bird-house") supporting individual processing services ("birds", e.g. climate index calculations, model data ensemble calculations) which rely on common infrastructural components (e.g. installation and deployment recipes, management of analysis code dependencies). To support easy deployment at data centers as well as at home institutes (e.g. for testing and development), the system supports the management of the often very complex package dependency chains of climate data analysis packages, as well as docker-based packaging and installation. We present a concrete deployment scenario at the German Climate Computing Center (DKRZ). DKRZ hosts both a multi-petabyte climate archive, integrated into the European ENES and worldwide ESGF data infrastructures, and an HPC center supporting (model) data production and data analysis. The deployment scenario also includes openstack-based data cloud services to support data import and data distribution for bird-house-based WPS web processing services. Current challenges for inter-institutional deployments of web processing services supporting the European and international climate modeling community, as well as the climate impact community, are highlighted. Aspects of future WPS-based cross-community usage scenarios supporting data reuse and data provenance are also reflected upon.
A KPI framework for process-based benchmarking of hospital information systems.
Jahn, Franziska; Winter, Alfred
2011-01-01
Benchmarking is a major topic for monitoring, directing and elucidating the performance of hospital information systems (HIS). Current approaches neglect the outcome of the processes that are supported by the HIS and their contribution to the hospital's strategic goals. We suggest to benchmark HIS based on clinical documentation processes and their outcome. A framework consisting of a general process model and outcome criteria for clinical documentation processes is introduced.
Knowledge-Based Object Detection in Laser Scanning Point Clouds
NASA Astrophysics Data System (ADS)
Boochs, F.; Karmacharya, A.; Marbs, A.
2012-07-01
Object identification and object processing in 3D point clouds have always posed challenges in terms of effectiveness and efficiency. In practice, this process is highly dependent on human interpretation of the scene represented by the point cloud data, as well as on the set of modeling tools available for use. Such modeling algorithms are data-driven and concentrate on specific features of the objects that are accessible to numerical models. We present an approach that brings to the machine the human expert's knowledge about the scene, the objects inside it, their representation by the data, and the behavior of algorithms. This "understanding" enables the machine to assist human interpretation of the scene inside the point cloud. Furthermore, it allows the machine to understand the possibilities and limitations of algorithms and to take this into account within the processing chain. This not only assists researchers in defining optimal processing steps, but also provides suggestions when certain changes or new details emerge from the point cloud. Our approach benefits from advances in knowledge technologies within the Semantic Web framework, which have provided a strong base for applications built on knowledge management. In the article we present and describe the knowledge technologies used in our approach: the Web Ontology Language (OWL), used for formulating the knowledge base, and the Semantic Web Rule Language (SWRL) with 3D-processing and topologic built-ins, aiming to combine geometrical analysis of 3D point clouds with specialists' knowledge of the scene and of algorithmic processing.
Modified Redundancy based Technique—a New Approach to Combat Error Propagation Effect of AES
NASA Astrophysics Data System (ADS)
Sarkar, B.; Bhunia, C. T.; Maulik, U.
2012-06-01
Advanced encryption standard (AES) is a great research challenge. It was developed to replace the data encryption standard (DES). AES suffers from a major limitation: the error propagation effect. Two methods are available to tackle this limitation: one is a redundancy-based technique and the other is a bit-based parity technique. The first has the significant advantage over the second of correcting any error deterministically, but at the cost of a higher level of overhead and hence a lower processing speed. In this paper, a new approach based on the redundancy-based technique is proposed that would speed up the process of reliable encryption and hence secure communication.
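The underlying redundancy idea, correcting transmission errors by majority vote over redundant copies of a ciphertext block, can be sketched as follows in Python; this is the generic technique, not the modified scheme the paper proposes.

def majority_decode(copies):
    """Bitwise majority vote over an odd number of received 128-bit copies."""
    assert len(copies) % 2 == 1, "use an odd number of copies"
    out = 0
    for bit in range(128):
        ones = sum((c >> bit) & 1 for c in copies)
        if ones > len(copies) // 2:
            out |= 1 << bit
    return out

block = 0x0123456789ABCDEF0123456789ABCDEF
corrupted = block ^ (1 << 7)                 # one copy arrives with a bit flip
assert majority_decode([block, corrupted, block]) == block

Because each AES ciphertext bit error garbles a whole decrypted block, repairing the ciphertext before decryption in this way stops the error from propagating; the cost is the bandwidth and processing overhead of the extra copies, which is the trade-off the paper targets.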
Using a contextualized sensemaking model for interaction design: A case study of tumor contouring.
Aselmaa, Anet; van Herk, Marcel; Laprie, Anne; Nestle, Ursula; Götz, Irina; Wiedenmann, Nicole; Schimek-Jasch, Tanja; Picaud, Francois; Syrykh, Charlotte; Cagetti, Leonel V; Jolnerovski, Maria; Song, Yu; Goossens, Richard H M
2017-01-01
Sensemaking theories help designers understand the cognitive processes of a user performing a complicated task. This paper introduces a two-step approach to incorporating sensemaking support within the design of health information systems by: (1) modeling the sensemaking process of physicians while performing a task, and (2) identifying software interaction design requirements that support sensemaking based on this model. The two-step approach is presented through a case study of the tumor contouring clinical task for radiotherapy planning. In the first step, a contextualized sensemaking model was developed to describe the sensemaking process based on the goal, the workflow and the context of the task. In the second step, based on a research software prototype, an experiment was conducted in which eight physicians each performed three contouring tasks. Four types of navigation interactions and five types of interaction sequence patterns were identified by analyzing the interaction log data gathered from these twenty-four cases. Further in-depth study of each navigation interaction and interaction sequence pattern in relation to the contextualized sensemaking model revealed five main areas where design improvements could increase sensemaking support. The outcomes of the case study indicate that the proposed two-step approach was beneficial for gaining a deeper understanding of the sensemaking process during the task, as well as for identifying design requirements for better sensemaking support.
A Model-Based Approach to Developing Your Mission Operations System
NASA Technical Reports Server (NTRS)
Smith, Robert R.; Schimmels, Kathryn A.; Lock, Patricia D; Valerio, Charlene P.
2014-01-01
Model-Based System Engineering (MBSE) is an increasingly popular methodology for designing complex engineering systems. As the use of MBSE has grown, it has begun to be applied to systems that are less hardware-based and more people- and process-based. We describe our approach to incorporating MBSE as a way to streamline development, and how to build a model consisting of core resources, such as requirements and interfaces, that can be adapted and used by new and upcoming projects. By comparing traditional Mission Operations System (MOS) system engineering with an MOS designed via a model, we will demonstrate the benefits to be obtained by incorporating MBSE in system engineering design processes.
Shorov, Andrey; Kotenko, Igor
2014-01-01
The paper outlines a bioinspired approach named "network nervous system" and methods of simulation of infrastructure attacks and protection mechanisms based on this approach. The protection mechanisms based on this approach consist of distributed procedures of information collection and processing, which coordinate the activities of the main devices of a computer network, identify attacks, and determine necessary countermeasures. Attacks and protection mechanisms are specified as structural models using a set-theoretic approach. An environment for simulation of protection mechanisms based on the biological metaphor is considered; the experiments demonstrating the effectiveness of the protection mechanisms are described.
Holovachov, Oleksandr
2016-01-01
Metabarcoding is becoming a common tool used to assess and compare the diversity of organisms in environmental samples. Identification of OTUs is one of the critical steps in the process, and several taxonomy assignment methods have been proposed to accomplish this task. This publication evaluates the quality of reference datasets, alongside several alignment and phylogeny inference methods used in one of the taxonomy assignment methods, called the tree-based approach. This approach assigns anonymous OTUs to taxonomic categories based on the relative placements of OTUs and reference sequences on the cladogram and the support that these placements receive. In the tree-based taxonomy assignment approach, reliable identification of anonymous OTUs is based on their placement in monophyletic and highly supported clades together with identified reference taxa. Therefore, it requires a high quality reference dataset. The resolution of phylogenetic trees is strongly affected by the presence of erroneous sequences as well as by the alignment and phylogeny inference methods used in the process. Two preparation steps are essential for the successful application of the tree-based taxonomy assignment approach. First, curated collections of genetic information do include erroneous sequences; these have a detrimental effect on the resolution of cladograms used in the tree-based approach and must be identified and excluded from the reference dataset beforehand. Second, various combinations of multiple sequence alignment and phylogeny inference methods provide cladograms with different topology and bootstrap support; these combinations need to be tested in order to determine the one that gives the highest resolution for the particular reference dataset. Completing the above-mentioned preparation steps is expected to decrease the number of unassigned OTUs and thus improve the results of the tree-based taxonomy assignment approach.
Estimating Function Approaches for Spatial Point Processes
NASA Astrophysics Data System (ADS)
Deng, Chong
Spatial point pattern data consist of locations of events that are often of interest in biological and ecological studies. Such data are commonly viewed as a realization of a stochastic process called a spatial point process. To fit a parametric spatial point process model to such data, likelihood-based methods have been widely studied. However, while maximum likelihood estimation is often too computationally intensive for Cox and cluster processes, pairwise likelihood methods such as composite likelihood and Palm likelihood usually suffer from a loss of information due to ignoring the correlation among pairs. For many types of correlated data other than spatial point processes, when likelihood-based approaches are not desirable, estimating functions have been widely used for model fitting. In this dissertation, we explore estimating function approaches for fitting spatial point process models. These approaches, which are based on asymptotically optimal estimating function theory, can incorporate the correlation among the data and yield more efficient estimators. We conducted a series of studies to demonstrate that these estimating function approaches are good alternatives for balancing the trade-off between computational complexity and estimation efficiency. First, we propose a new estimating procedure that improves the efficiency of the pairwise composite likelihood method in estimating clustering parameters. Our approach combines estimating functions derived from pairwise composite likelihood estimation with estimating functions that account for correlations among the pairwise contributions. Our method can be used to fit a variety of parametric spatial point process models and can yield more efficient estimators of the clustering parameters than pairwise composite likelihood estimation. We demonstrate its efficacy through a simulation study and an application to the longleaf pine data. Second, we further explore the quasi-likelihood approach for fitting the second-order intensity function of spatial point processes. The original second-order quasi-likelihood is barely feasible due to the intense computation and high memory requirement needed to solve a large linear system. Motivated by the existence of geometric regular patterns in stationary point processes, we find a lower-dimensional representation of the optimal weight function and propose a reduced second-order quasi-likelihood approach. Through a simulation study, we show that the proposed method not only demonstrates superior performance in fitting the clustering parameter but also relaxes the constraint on the tuning parameter H. Third, we study the quasi-likelihood-type estimating function that is optimal in a certain class of first-order estimating functions for estimating the regression parameter in spatial point process models. Using a novel spectral representation, we construct an implementation that is computationally much more efficient and can be applied to more general setups than the original quasi-likelihood method.
A hybrid agent-based approach for modeling microbiological systems.
Guo, Zaiyi; Sloot, Peter M A; Tay, Joc Cing
2008-11-21
Models for systems biology commonly adopt differential equation or agent-based modeling approaches for simulating processes as a whole. Models based on differential equations presuppose phenomenological intracellular behavioral mechanisms, while models based on the multi-agent approach often use directly translated, quantitatively less precise if-then logical rule constructs. We propose an extendible systems model based on a hybrid agent-based approach in which biological cells are modeled as individuals (agents) while molecules are represented by quantities. This hybridization in entity representation entails a combined modeling strategy with agent-based behavioral rules and differential equations, thereby balancing the requirements of extendible model granularity with computational tractability. We demonstrate the efficacy of this approach with models of chemotaxis involving an assay of 10^3 cells and 1.2x10^6 molecules. The model produces cell migration patterns that are comparable to laboratory observations.
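The hybrid entity representation can be sketched in Python as follows: cells are discrete agents with behavioral rules, while the molecular quantity is a continuous field updated by a simple rate equation. All values are illustrative placeholders; the paper's chemotaxis assay is not reproduced.

import random

N = 10
attractant = [float(x) for x in range(N)]       # rising attractant gradient
cells = [random.randrange(N) for _ in range(100)]

def step():
    # Agent part: each cell follows a simple gradient-climbing rule
    for i, pos in enumerate(cells):
        left, right = max(pos - 1, 0), min(pos + 1, N - 1)
        cells[i] = right if attractant[right] >= attractant[left] else left
    # Continuous part: first-order attractant decay, forward-Euler step
    for x in range(N):
        attractant[x] *= 0.99

for _ in range(20):
    step()
print(sum(c == N - 1 for c in cells), "cells reached the gradient peak")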
Fixism and conservation science.
Robert, Alexandre; Fontaine, Colin; Veron, Simon; Monnet, Anne-Christine; Legrand, Marine; Clavel, Joanne; Chantepie, Stéphane; Couvet, Denis; Ducarme, Frédéric; Fontaine, Benoît; Jiguet, Frédéric; le Viol, Isabelle; Rolland, Jonathan; Sarrazin, François; Teplitsky, Céline; Mouchet, Maud
2017-08-01
The field of biodiversity conservation has recently been criticized as relying on a fixist view of the living world in which existing species constitute at the same time targets of conservation efforts and static states of reference, which is in apparent disagreement with evolutionary dynamics. We reviewed the prominent role of species as conservation units and the common benchmark approach to conservation that aims to use past biodiversity as a reference to conserve current biodiversity. We found that the species approach is justified by the discrepancy between the time scales of macroevolution and human influence and that biodiversity benchmarks are based on reference processes rather than fixed reference states. Overall, we argue that the ethical and theoretical frameworks underlying conservation research are based on macroevolutionary processes, such as extinction dynamics. Current species, phylogenetic, community, and functional conservation approaches constitute short-term responses to short-term human effects on these reference processes, and these approaches are consistent with evolutionary principles.
Continuity-based model interfacing for plant-wide simulation: a general approach.
Volcke, Eveline I P; van Loosdrecht, Mark C M; Vanrolleghem, Peter A
2006-08-01
In plant-wide simulation studies of wastewater treatment facilities, existing models of different origins often need to be coupled. However, as these submodels are likely to contain different state variables, their coupling is not straightforward. The continuity-based interfacing method (CBIM) provides a general framework for constructing model interfaces for models of wastewater systems, taking conservation principles into account. In this contribution, the CBIM approach is applied to study the effect of sludge digestion reject water treatment with a SHARON-Anammox process on a plant-wide scale. Separate models were available for the SHARON process and for the Anammox process. The Benchmark Simulation Model no. 2 (BSM2) is used to simulate the behaviour of the complete WWTP including sludge digestion. The CBIM approach is followed to develop three different model interfaces. At the same time, the generally applicable CBIM approach was further refined, and particular issues that arise when coupling models in which pH is considered a state variable are pointed out.
Multiobjective optimization approach: thermal food processing.
Abakarov, A; Sushkov, Y; Almonacid, S; Simpson, R
2009-01-01
The objective of this study was to utilize a multiobjective optimization technique for the thermal sterilization of packaged foods. The multiobjective optimization approach used in this study is based on the optimization of well-known aggregating functions by an adaptive random search algorithm. The applicability of the proposed approach was illustrated by solving widely used multiobjective test problems taken from the literature. The numerical results obtained for the multiobjective test problems and for the thermal processing problem show that the proposed approach can be effectively used for solving multiobjective optimization problems arising in the food engineering field.
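An aggregating-function approach of the general kind described can be sketched as a weighted sum of objectives minimized by plain random search; the objectives and weights below are placeholders, and the paper's adaptive algorithm and thermal-sterilization objectives are not reproduced.

import random

def f1(x): return (x - 2.0) ** 2          # e.g. a process-time surrogate
def f2(x): return (x + 1.0) ** 2          # e.g. a quality-loss surrogate

def aggregate(x, w=0.5):
    """Weighted-sum aggregating function: one scalar from two objectives."""
    return w * f1(x) + (1.0 - w) * f2(x)

best_x, best_val = 0.0, aggregate(0.0)
for _ in range(10000):                    # non-adaptive random search
    cand = best_x + random.uniform(-1.0, 1.0)
    val = aggregate(cand)
    if val < best_val:
        best_x, best_val = cand, val
print(best_x, best_val)                   # near the weighted compromise x = 0.5

Sweeping the weight w and re-solving traces out candidate trade-off points, which is how aggregating functions expose the Pareto front to a single-objective optimizer.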
Fielding, J E; Lamirault, I; Nolan, B; Bobrowsky, J
2000-07-01
In 1998, Los Angeles County's Department of Health Services (DHS) embarked on a planning process to expand ambulatory care services for the county's 2.7 million uninsured and otherwise medically indigent residents. This planning process was novel in two ways. First, it used a quantitative, needs-based approach for resource allocation to ensure an equitable distribution of safety-net ambulatory care services across the county. Second, it used a new community-based planning paradigm that took into consideration the specific needs of each of the county's eight geographic service planning areas. Together, the evidence-based approach to planning and the community-based decision-making will ensure that DHS can more equitably provide for the needs of Los Angeles County's medically indigent residents.
ERIC Educational Resources Information Center
Crittenden, Barry D.
1991-01-01
A simple liquid-liquid equilibrium (LLE) system involving a constant partition coefficient based on solute ratios is used to develop an algebraic understanding of multistage contacting in a first-year separation processes course. This algebraic approach to the LLE system is shown to be operable for the introduction of graphical techniques…
ERIC Educational Resources Information Center
Wenjuan, Hao; Rui, Liang
2016-01-01
Teaching is a spiral, rising process. A complete teaching model should be composed of five parts: theoretical basis, goal orientation, operating procedures, implementation conditions and assessment. On the basis of genre knowledge, the content-based approach and the process approach, this text constructs the Teaching Model of College Writing Instruction, in…
The Use of Video Feedback in Teaching Process-Approach EFL Writing
ERIC Educational Resources Information Center
Özkul, Sertaç; Ortaçtepe, Deniz
2017-01-01
This experimental study investigated the use of video feedback as an alternative to feedback with correction codes at an institution where the latter was commonly used for teaching process-approach English as a foreign language (EFL) writing. Over a 5-week period, the control and the experimental groups were provided with feedback based on…
Counseling Families in Poverty: Moving from Paralyzing to Revitalizing
ERIC Educational Resources Information Center
Cholewa, Blaire; Smith-Adcock, Sondra
2013-01-01
Counseling families in poverty can be a daunting process if one only focuses on what is lacking. Taking such a deficit approach is limiting not only to the counselor but also can serve to disempower the clients. This paper presents a strengths-based approach for counseling families living in poverty that emphasizes relational processes and the…
Feature based Weld-Deposition for Additive Manufacturing of Complex Shapes
NASA Astrophysics Data System (ADS)
Panchagnula, Jayaprakash Sharma; Simhambhatla, Suryakumar
2018-06-01
Fabricating functional metal parts using Additive Manufacturing (AM) is a leading trend. However, realizing overhanging features has been a challenge due to the lack of a support mechanism for metals. Powder-bed fusion techniques like Selective Laser Sintering (SLS) employ easily-breakable scaffolds made of the same material to realize overhangs. However, the same approach is not extendible to deposition processes like laser- or arc-based directed energy deposition. Although it is possible to realize small overhangs by exploiting the inherent overhanging capability of the process or by blinding some small features like holes, this cannot be extended to more complex geometries. The current work presents a novel approach for realizing complex overhanging features without the need for support structures. This is made possible by using higher-order kinematics and suitably aligning the overhang with the deposition direction. Feature-based non-uniform slicing and non-uniform area-filling are vital concepts required to realize this, and they are briefly discussed here. This method can be used to fabricate and/or repair fully dense and functional components for various engineering applications. Although this approach has been implemented on a weld-deposition-based system, it can be extended to other directed energy deposition processes as well.
Image search engine with selective filtering and feature-element-based classification
NASA Astrophysics Data System (ADS)
Li, Qing; Zhang, Yujin; Dai, Shengyang
2001-12-01
With the growth of the Internet and of storage capability in recent years, images have become a widespread information format on the World Wide Web. However, it has become increasingly hard to search for images of interest, and effective image search engines for the WWW need to be developed. In this paper we propose a selective filtering process and a novel approach for image classification based on feature elements, both used in the image search engine we developed for the WWW. First, a selective filtering process is embedded in a general web crawler to filter out meaningless GIF-format images; two parameters that can be obtained easily are used in the filtering process. Our classification approach extracts feature elements from images instead of feature vectors. Compared with feature vectors, feature elements can better capture the visual meaning of an image according to the subjective perception of human beings. Unlike traditional image classification methods, our feature-element-based approach does not calculate the distance between two vectors in a feature space; instead, it tries to find associations between feature elements and the class attribute of the image. Experiments are presented to show the efficiency of the proposed approach.
Symbolic Processing Combined with Model-Based Reasoning
NASA Technical Reports Server (NTRS)
James, Mark
2009-01-01
A computer program for the detection of present and prediction of future discrete states of a complex, real-time engineering system utilizes a combination of symbolic processing and numerical model-based reasoning. One of the biggest weaknesses of a purely symbolic approach is that it enables prediction of only future discrete states while missing all unmodeled states or leading to incorrect identification of an unmodeled state as a modeled one. A purely numerical approach is based on a combination of statistical methods and mathematical models of the applicable physics and necessitates development of a complete model to the level of fidelity required for prediction. In addition, a purely numerical approach does not afford the ability to qualify its results without some form of symbolic processing. The present software implements numerical algorithms to detect unmodeled events and symbolic algorithms to predict expected behavior, correlate the expected behavior with the unmodeled events, and interpret the results in order to predict future discrete states. The approach embodied in this software differs from that of the BEAM methodology (aspects of which have been discussed in several prior NASA Tech Briefs articles), which provides for prediction of future measurements in the continuous-data domain.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDermott, Jason E.; Wang, Jing; Mitchell, Hugh D.
2013-01-01
The advent of high throughput technologies capable of comprehensive analysis of genes, transcripts, proteins and other significant biological molecules has provided an unprecedented opportunity for the identification of molecular markers of disease processes. However, it has simultaneously complicated the problem of extracting meaningful signatures of biological processes from these complex datasets. The process of biomarker discovery and characterization provides opportunities both for purely statistical and expert knowledge-based approaches and would benefit from improved integration of the two. Areas covered: In this review we will present examples of current practices for biomarker discovery from complex omic datasets and the challenges that have been encountered. We will then present a high-level review of data-driven (statistical) and knowledge-based methods applied to biomarker discovery, highlighting some current efforts to combine the two distinct approaches. Expert opinion: Effective, reproducible and objective tools for combining data-driven and knowledge-based approaches to biomarker discovery and characterization are key to future success in the biomarker field. We will describe our recommendations of possible approaches to this problem including metrics for the evaluation of biomarkers.
Adaptive Gaussian mixture models for pre-screening in GPR data
NASA Astrophysics Data System (ADS)
Torrione, Peter; Morton, Kenneth, Jr.; Besaw, Lance E.
2011-06-01
Due to the large amount of data generated by vehicle-mounted ground penetrating radar (GPR) antenna arrays, advanced feature extraction and classification can only be performed on a small subset of data during real-time operation. As a result, most GPR-based landmine detection systems implement "pre-screening" algorithms to process all of the data generated by the antenna array and identify locations with anomalous signatures for more advanced processing. These pre-screening algorithms must be computationally efficient and obtain a high probability of detection, but can permit a false alarm rate higher than the total system requirements. Many approaches to pre-screening have previously been proposed, including linear prediction coefficients, the LMS algorithm, and CFAR-based approaches. Similar pre-screening techniques have also been developed in the field of video processing to identify anomalous behavior or anomalous objects. One such algorithm, an online k-means approximation to an adaptive Gaussian mixture model (GMM), is particularly well suited to pre-screening in GPR data due to its computational efficiency, its non-linear nature, and the relevance of the logic underlying the algorithm to GPR processing. In this work we explore the application of an adaptive GMM-based approach for anomaly detection from the video processing literature to pre-screening in GPR data. Results with the ARA Nemesis landmine detection system demonstrate significant pre-screening performance improvements compared to alternative approaches, and indicate that the proposed algorithm is a complementary technique to existing methods.
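The online k-means approximation to an adaptive GMM can be sketched as a Stauffer-Grimson-style per-channel update (component count, learning rate and the 2.5-sigma match test below are illustrative choices, not the fielded system's values):

```python
import numpy as np

class AdaptiveGMM:
    """Online k-means approximation to an adaptive Gaussian mixture,
    maintained independently per sensor channel / depth bin."""
    def __init__(self, k=3, alpha=0.02, init_var=10.0):
        self.k, self.alpha = k, alpha
        self.mu = np.zeros(k)
        self.var = np.full(k, init_var)
        self.w = np.full(k, 1.0 / k)

    def update(self, x, match_sigma=2.5):
        d = np.abs(x - self.mu) / np.sqrt(self.var)
        m = int(np.argmin(d))
        anomalous = d[m] > match_sigma
        if anomalous:
            # No component explains x: replace the weakest with a new one.
            m = int(np.argmin(self.w))
            self.mu[m], self.var[m], self.w[m] = x, 10.0, 0.05
        else:
            # k-means style update of the matched component only.
            self.mu[m] += self.alpha * (x - self.mu[m])
            self.var[m] += self.alpha * ((x - self.mu[m]) ** 2 - self.var[m])
        self.w = (1 - self.alpha) * self.w
        self.w[m] += self.alpha
        self.w /= self.w.sum()
        return anomalous   # pre-screening flag for this sample

# Feed successive GPR samples from one channel:
model = AdaptiveGMM()
flags = [model.update(x) for x in np.random.default_rng(3).normal(0, 1, 1000)]
```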
Monte Carlo based toy model for fission process
NASA Astrophysics Data System (ADS)
Kurniadi, R.; Waris, A.; Viridi, S.
2014-09-01
There are many models and calculation techniques for obtaining a visible image of the fission yield process. In particular, fission yield can be calculated using two approaches, namely a macroscopic approach and a microscopic approach. This work proposes another calculation approach in which the nucleus is treated as a toy model; hence, the process does not completely represent real fission in nature. The toy model is formed by Gaussian distributions of random numbers that randomize distances, such as the distance between a particle and the central point. The scission process is started by smashing the compound nucleus central point into two parts, the left and right central points. These three points have different Gaussian distribution parameters, namely means (μCN, μL, μR) and standard deviations (σCN, σL, σR). By overlaying the three distributions, the numbers of particles (NL, NR) trapped by the central points can be obtained. This process is iterated until (NL, NR) become constant. The smashing process is then repeated with randomly changed σL and σR.
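A schematic reimplementation of the toy model's counting loop (all μ/σ values, the trapping rule and the perturbation step are illustrative assumptions; the paper's actual parameters are not given in this abstract):

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_fission(n_particles=10_000, mu_L=-1.0, mu_R=1.0,
                sigma_CN=1.5, sigma_L=0.8, sigma_R=0.8, n_iter=50):
    # Particles of the compound nucleus, Gaussian around the central point 0.
    particles = rng.normal(0.0, sigma_CN, n_particles)
    for _ in range(n_iter):
        # A particle is "trapped" by the closer fragment centre,
        # with distances normalized by that fragment's Gaussian width.
        zL = np.abs(particles - mu_L) / sigma_L
        zR = np.abs(particles - mu_R) / sigma_R
        N_L = int(np.sum(zL < zR))
        N_R = n_particles - N_L
        # Repeated smashing: randomly perturb the fragment widths.
        sigma_L = abs(sigma_L + rng.normal(0, 0.05))
        sigma_R = abs(sigma_R + rng.normal(0, 0.05))
    return N_L, N_R

print(toy_fission())   # counts trapped by the left and right central points
```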
ERIC Educational Resources Information Center
Haque, Mohammad Mahfujul; Little, David C.; Barman, Benoy K.; Wahab, Md. Abdul
2010-01-01
Purpose: The purpose of the study was to understand the adoption process of ricefield based fish seed production (RBFSP) that has been developed, promoted and established in Northwest Bangladesh. Design/Methodology/Approach: Quantitative investigation based on regression analysis and qualitative investigation using semi-structured interview were…
NASA Technical Reports Server (NTRS)
Westgate, P.; Kohlmann, K.; Hendrickson, R.; Ladisch, M. R.; Mitchell, C. A. (Principal Investigator)
1992-01-01
Two approaches for biomass processing in Controlled Ecological Life Support Systems are compared in a literature survey. The approaches are based on (1) total oxidation of plant matter and (2) the potential of bioregenerative recovery.
NASA Astrophysics Data System (ADS)
Clausing, Eric; Vielhauer, Claus
2014-02-01
Locksmith forensics is an important area of crime scene forensics. Thanks to new optical, contactless sensing technology with nanometer-range resolution, such traces can be captured, digitized and analyzed more easily, allowing a complete digital forensic investigation. In this paper we present a significantly improved approach for the detection and segmentation of toolmarks on the surfaces of locking cylinder components (using the example of the key pin) acquired by a 3D confocal laser scanning microscope. The improved approach builds on our prior work [1], a block-based classification approach with textural features, which achieved a solid detection rate of 75-85% for toolmarks originating from illegal opening methods. Here we improve, expand and fuse this prior approach with additional features from acquired surface topography data, color data and an image processing approach using adapted Gabor filters. In particular, we are able to raise the detection and segmentation rates above 90% on our test set of 20 key pins with approximately 700 single toolmark traces from four different opening methods. We can provide a precise pixel-based segmentation, as opposed to the rather imprecise segmentation of our prior block-based approach, and since the two additional data types (color and especially topography) require specific pre-processing, we furthermore propose an adequate approach for this purpose.
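A sketch of block-wise texture features from a Gabor filter bank, in the spirit of the adapted Gabor filters mentioned above (filter frequencies, orientation count and block size are illustrative assumptions):

```python
import numpy as np
from skimage.filters import gabor

def gabor_block_features(image, frequencies=(0.1, 0.2, 0.4),
                         orientations=4, block=16):
    """Mean Gabor magnitude per block, per (frequency, orientation) pair."""
    h, w = image.shape
    feats = []
    for f in frequencies:
        for k in range(orientations):
            theta = k * np.pi / orientations
            real, imag = gabor(image, frequency=f, theta=theta)
            mag = np.hypot(real, imag)
            # Average the response over non-overlapping blocks.
            blocks = mag[:h - h % block, :w - w % block]
            blocks = blocks.reshape(h // block, block, w // block, block)
            feats.append(blocks.mean(axis=(1, 3)))
    return np.stack(feats, axis=-1)   # (block_rows, block_cols, n_filters)
```

Per-block feature vectors of this kind could then be fed to any classifier to label blocks as toolmark or background.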
Horwood, Christiane M; Youngleson, Michele S; Moses, Edward; Stern, Amy F; Barker, Pierre M
2015-07-01
Achieving long-term retention in HIV care is an important challenge for HIV management and achieving elimination of mother-to-child transmission. Sustainable, affordable strategies are required to achieve this, including strengthening of community-based interventions. Deployment of community-based health workers (CHWs) can improve health outcomes but there is a need to identify systems to support and maintain high-quality performance. Quality-improvement strategies have been successfully implemented to improve quality and coverage of healthcare in facilities and could provide a framework to support community-based interventions. Four community-based quality-improvement projects from South Africa, Malawi and Mozambique are described. Community-based improvement teams linked to the facility-based health system participated in learning networks (modified Breakthrough Series), and used quality-improvement methods to improve process performance. Teams were guided by trained quality mentors who used local data to help nurses and CHWs identify gaps in service provision and test solutions. Learning network participants gathered at intervals to share progress and identify successful strategies for improvement. CHWs demonstrated understanding of quality-improvement concepts, tools and methods, and implemented quality-improvement projects successfully. Challenges of using quality-improvement approaches in community settings included adapting processes, particularly data reporting, to the education level and first language of community members. Quality-improvement techniques can be implemented by CHWs to improve outcomes in community settings but these approaches require adaptation and additional mentoring support to be successful. More research is required to establish the effectiveness of this approach on processes and outcomes of care.
Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N A; Bin Zaheer, Kashif
2015-01-01
Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for VBS systems is highly desirable. The innovative, value-based idea is realized on the basis of stakeholder requirements. The quality of a VBS system depends on a concrete set of valuable requirements, which can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. Existing value-based approaches focus on the design of VBS systems; their focus on valuable stakeholders and requirements is inadequate. Current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for VBS systems: they are time-consuming, complex and inconsistent, which makes the initiation process difficult, and they do not provide low-level implementation details for SIQ initiation or stakeholder metrics for quantification. Hence, in view of these problems, this research contributes a new SIQ framework called 'StakeMeter', which is verified and validated through case studies. Compared with other methods, the proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria and an application procedure. It addresses the issues of stakeholder quantification or prioritization, high time consumption, complexity and process initiation, and helps in selecting highly critical stakeholders for VBS systems with less judgmental error.
Classroom EFL Writing: The Alignment-Oriented Approach
ERIC Educational Resources Information Center
Haiyan, Miao; Rilong, Liu
2016-01-01
This paper outlines the alignment-oriented approach in classroom EFL writing. Based on a review of the characteristics of the written language and comparison between the product-focused approach and the process-focused approach, the paper proposes a practical classroom procedure as to how to teach EFL writing. A follow-up empirical study is…
A concept of volume rendering guided search process to analyze medical data set.
Zhou, Jianlong; Xiao, Chun; Wang, Zhiyan; Takatsuka, Masahiro
2008-03-01
This paper first presents a parallel-coordinates-based parameter control panel (PCP), used to control the parameters of focal region-based volume rendering (FRVR) during data analysis. The PCP uses a parallel-coordinates-style interface: different rendering parameters are represented as nodes on each axis, and renditions based on related parameters are connected by polylines to show the dependencies between renditions and parameters. Based on the PCP, a concept of volume-rendering-guided search is proposed. The search pipeline is divided into four phases, with the different FRVR parameters recorded and modulated in the PCP during the search. The concept shows that volume visualization can guide a search process through the rendition space, helping users efficiently find local structures of interest. The usability of the proposed approach is evaluated to show its effectiveness.
CATS - A process-based model for turbulent turbidite systems at the reservoir scale
NASA Astrophysics Data System (ADS)
Teles, Vanessa; Chauveau, Benoît; Joseph, Philippe; Weill, Pierre; Maktouf, Fakher
2016-09-01
The Cellular Automata for Turbidite systems (CATS) model is intended to simulate the fine architecture and facies distribution of turbidite reservoirs with a multi-event and process-based approach. The main processes of low-density turbulent turbidity flow are modeled: downslope sediment-laden flow, entrainment of ambient water, erosion and deposition of several distinct lithologies. This numerical model, derived from (Salles, 2006; Salles et al., 2007), proposes a new approach based on the Rouse concentration profile to consider the flow capacity to carry the sediment load in suspension. In CATS, the flow distribution on a given topography is modeled with local rules between neighboring cells (cellular automata) based on potential and kinetic energy balance and diffusion concepts. Input parameters are the initial flow parameters and a 3D topography at depositional time. An overview of CATS capabilities in different contexts is presented and discussed.
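For reference, the Rouse suspended-sediment profile that CATS draws on has the standard form C(z) = C_a [((h-z)/z) (z_a/(h-z_a))]^P with P = w_s/(κ u*); a sketch with illustrative parameter values (this is the textbook formula, not CATS's internal implementation):

```python
import numpy as np

def rouse_profile(z, h, z_a, C_a, w_s, u_star, kappa=0.41):
    """Rouse concentration profile.
    z: heights above bed, h: flow depth, z_a: reference height,
    C_a: reference concentration, w_s: settling velocity,
    u_star: shear velocity, kappa: von Karman constant."""
    P = w_s / (kappa * u_star)                  # Rouse number
    return C_a * (((h - z) / z) * (z_a / (h - z_a))) ** P

# Illustrative use: concentration through a 10 m deep flow.
z = np.linspace(0.05, 9.95, 100)
C = rouse_profile(z, h=10.0, z_a=0.05, C_a=1.0, w_s=0.01, u_star=0.05)
```

A small Rouse number (w_s much less than κ u*) keeps sediment in suspension through the whole column, which is the low-density turbulent regime the model targets.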
Advanced Computing Architectures for Cognitive Processing
2009-07-01
[Front-matter residue from the report; recoverable figure titles: "Logic diagram smart block-based neuron", "Naive Grid Potential Kernel". Surviving abstract fragment: "... processing would be helpful for Air Force systems acquisition. Specific cognitive processing approaches addressed herein include global information grid ..."]
Lessard, Jean-Philippe; Weinstein, Ben G; Borregaard, Michael K; Marske, Katharine A; Martin, Danny R; McGuire, Jimmy A; Parra, Juan L; Rahbek, Carsten; Graham, Catherine H
2016-01-01
A persistent challenge in ecology is to tease apart the influence of multiple processes acting simultaneously and interacting in complex ways to shape the structure of species assemblages. We implement a heuristic approach that relies on explicitly defining species pools and permits assessment of the relative influence of the main processes thought to shape assemblage structure: environmental filtering, dispersal limitations, and biotic interactions. We illustrate our approach using data on the assemblage composition and geographic distribution of hummingbirds, a comprehensive phylogeny and morphological traits. The implementation of several process-based species pool definitions in null models suggests that temperature, but not precipitation or dispersal limitation, acts as the main regional filter of assemblage structure. Incorporating this environmental filter directly into the definition of assemblage-specific species pools revealed an otherwise hidden pattern of phylogenetic evenness, indicating that biotic interactions might further influence hummingbird assemblage structure. Such hidden patterns of assemblage structure call for a reexamination of a multitude of phylogenetic- and trait-based studies that did not explicitly consider potentially important processes in their definition of the species pool. Our heuristic approach provides a transparent way to explore patterns and refine interpretations of the underlying causes of assemblage structure.
NASA Astrophysics Data System (ADS)
Bergen, K.; Yoon, C. E.; OReilly, O. J.; Beroza, G. C.
2015-12-01
Recent improvements in computational efficiency for waveform correlation-based detections achieved by new methods such as Fingerprint and Similarity Thresholding (FAST) promise to allow large-scale blind search for similar waveforms in long-duration continuous seismic data. Waveform similarity search applied to datasets of months to years of continuous seismic data will identify significantly more events than traditional detection methods. With the anticipated increase in number of detections and associated increase in false positives, manual inspection of the detection results will become infeasible. This motivates the need for new approaches to process the output of similarity-based detection. We explore data mining techniques for improved detection post-processing. We approach this by considering similarity-detector output as a sparse similarity graph with candidate events as vertices and similarities as weighted edges. Image processing techniques are leveraged to define candidate events and combine results individually processed at multiple stations. Clustering and graph analysis methods are used to identify groups of similar waveforms and assign a confidence score to candidate detections. Anomaly detection and classification are applied to waveform data for additional false detection removal. A comparison of methods will be presented and their performance will be demonstrated on a suspected induced and non-induced earthquake sequence.
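The sparse-similarity-graph view of detector output can be sketched as follows (the input records, edge threshold and degree-based confidence score are illustrative assumptions): candidate detections become vertices, strong pairwise similarities become weighted edges, and connected components yield groups of similar waveforms.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.csgraph import connected_components

def group_detections(pairs, n_candidates, threshold=0.7):
    """pairs: iterable of (i, j, similarity) from a similarity detector.
    Returns a group label per candidate and a degree-based confidence."""
    rows, cols, vals = [], [], []
    for i, j, s in pairs:
        if s >= threshold:          # keep only confident edges
            rows += [i, j]
            cols += [j, i]
            vals += [s, s]
    graph = sp.coo_matrix((vals, (rows, cols)),
                          shape=(n_candidates, n_candidates))
    n_groups, labels = connected_components(graph, directed=False)
    # Confidence score: total similarity incident to each candidate;
    # isolated (zero-degree) candidates are likely false detections.
    degree = np.asarray(graph.sum(axis=1)).ravel()
    return labels, degree
```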
Fleisher, Linda; Ruggieri, Dominique G.; Miller, Suzanne M.; Manne, Sharon; Albrecht, Terrance; Buzaglo, Joanne; Collins, Michael A.; Katz, Michael; Kinzy, Tyler G.; Liu, Tasnuva; Manning, Cheri; Charap, Ellen Specker; Millard, Jennifer; Miller, Dawn M.; Poole, David; Raivitch, Stephanie; Roach, Nancy; Ross, Eric A.; Meropol, Neal J.
2014-01-01
Objective: This article describes the rigorous development process and initial feedback of PRE-ACT (Preparatory Education About Clinical Trials), a web-based intervention designed to improve preparation for decision making in cancer clinical trials. Methods: The multi-step process included stakeholder input, formative research, user testing and feedback. Diverse teams (researchers, advocates and developers) participated in content refinement, identification of actors, and development of video scripts. Patient feedback was provided in the final production period and through a vanguard group (N = 100) from the randomized trial. Results: Patients and advocates confirmed barriers to cancer clinical trial participation, including lack of awareness and knowledge, fear of side effects, logistical concerns, and mistrust. Patients indicated they liked the tool's user-friendly nature, the organized and comprehensive presentation of the subject matter, and the clarity of the videos. Conclusion: The development process serves as an example of operationalizing best-practice approaches and highlights the value of a multi-disciplinary team in developing a theory-based, sophisticated tool that patients found useful in their decision-making process. Practice implications: Best-practice approaches can be addressed and are important to ensure evidence-based tools that are of value to patients; this supports the usefulness of a process map in the development of e-health tools. PMID:24813474
Geerts, Hugo; Hofmann-Apitius, Martin; Anastasio, Thomas J
2017-11-01
Neurodegenerative diseases such as Alzheimer's disease (AD) follow a slowly progressing dysfunctional trajectory, with a large presymptomatic component and many comorbidities. Using preclinical models and large-scale omics studies ranging from genetics to imaging, a large number of processes that might be involved in AD pathology at different stages and levels have been identified. The sheer number of putative hypotheses makes it almost impossible to estimate their contribution to the clinical outcome and to develop a comprehensive view on the pathological processes driving the clinical phenotype. Traditionally, bioinformatics approaches have provided correlations and associations between processes and phenotypes. Focusing on causality, a new breed of advanced and more quantitative modeling approaches that use formalized domain expertise offer new opportunities to integrate these different modalities and outline possible paths toward new therapeutic interventions. This article reviews three different computational approaches and their possible complementarities. Process algebras, implemented using declarative programming languages such as Maude, facilitate simulation and analysis of complicated biological processes on a comprehensive but coarse-grained level. A model-driven Integration of Data and Knowledge, based on the OpenBEL platform and using reverse causative reasoning and network jump analysis, can generate mechanistic knowledge and a new, mechanism-based taxonomy of disease. Finally, Quantitative Systems Pharmacology is based on formalized implementation of domain expertise in a more fine-grained, mechanism-driven, quantitative, and predictive humanized computer model. We propose a strategy to combine the strengths of these individual approaches for developing powerful modeling methodologies that can provide actionable knowledge for rational development of preventive and therapeutic interventions. Development of these computational approaches is likely to be required for further progress in understanding and treating AD.
Selecting concepts for a concept-based curriculum: application of a benchmark approach.
Giddens, Jean Foret; Wright, Mary; Gray, Irene
2012-09-01
In response to a transformational movement in nursing education, faculty across the country are considering changes to curricula and approaches to teaching. As a result, an emerging trend in many nursing programs is the adoption of a concept-based curriculum. As part of the curriculum development process, the selection of concepts, competencies, and exemplars on which to build courses and base content is needed. This article presents a benchmark approach used to validate and finalize concept selection among educators developing a concept-based curriculum for a statewide nursing consortium. These findings are intended to inform other nurse educators who are currently involved with or are considering this curriculum approach.
A Competence-Based Service for Supporting Self-Regulated Learning in Virtual Environments
ERIC Educational Resources Information Center
Nussbaumer, Alexander; Hillemann, Eva-Catherine; Gütl, Christian; Albert, Dietrich
2015-01-01
This paper presents a conceptual approach and a Web-based service that aim at supporting self-regulated learning in virtual environments. The conceptual approach consists of four components: 1) a self-regulated learning model for supporting a learner-centred learning process, 2) a psychological model for facilitating competence-based…
Student's Uncertainty Modeling through a Multimodal Sensor-Based Approach
ERIC Educational Resources Information Center
Jraidi, Imene; Frasson, Claude
2013-01-01
Detecting the student internal state during learning is a key construct in educational environment and particularly in Intelligent Tutoring Systems (ITS). Students' uncertainty is of primary interest as it is deeply rooted in the process of knowledge construction. In this paper we propose a new sensor-based multimodal approach to model…
An Approach Based on Social Network Analysis Applied to a Collaborative Learning Experience
ERIC Educational Resources Information Center
Claros, Iván; Cobos, Ruth; Collazos, César A.
2016-01-01
The Social Network Analysis (SNA) techniques allow modelling and analysing the interaction among individuals based on their attributes and relationships. This approach has been used by several researchers in order to measure the social processes in collaborative learning experiences. But oftentimes such measures were calculated at the final state…
Dynamic Group Formation Based on a Natural Phenomenon
ERIC Educational Resources Information Center
Zedadra, Amina; Lafifi, Yacine; Zedadra, Ouarda
2016-01-01
This paper presents a new approach of learners grouping in collaborative learning systems. This grouping process is based on traces left by learners. The goal is the circular dynamic grouping to achieve collaborative projects. The proposed approach consists of two main algorithms: (1) the circular grouping algorithm and (2) the dynamic grouping…
An Adaptive Approach to Managing Knowledge Development in a Project-Based Learning Environment
ERIC Educational Resources Information Center
Tilchin, Oleg; Kittany, Mohamed
2016-01-01
In this paper we propose an adaptive approach to managing the development of students' knowledge in the comprehensive project-based learning (PBL) environment. Subject study is realized by two-stage PBL. It shapes adaptive knowledge management (KM) process and promotes the correct balance between personalized and collaborative learning. The…
PBL-SEE: An Authentic Assessment Model for PBL-Based Software Engineering Education
ERIC Educational Resources Information Center
dos Santos, Simone C.
2017-01-01
The problem-based learning (PBL) approach has been successfully applied to teaching software engineering thanks to its principles of group work, learning by solving real problems, and learning environments that match the market realities. However, the lack of well-defined methodologies and processes for implementing the PBL approach represents a…
Education Quality in Kazakhstan in the Context of Competence-Based Approach
ERIC Educational Resources Information Center
Nabi, Yskak; Zhaxylykova, Nuriya Ermuhametovna; Kenbaeva, Gulmira Kaparbaevna; Tolbayev, Abdikerim; Bekbaeva, Zeinep Nusipovna
2016-01-01
The background of this paper is to present how education system of Kazakhstan evolved during the last 24 years of independence, highlighting the contemporary transformational processes. We defined the aim to identify the education quality in the context of competence-based approach. Methods: Analysis of references, interviewing, experimental work.…
NASA Astrophysics Data System (ADS)
Hardyanti, R. C.; Hartono; Fianti
2018-03-01
Physics learning under the Curriculum of 2013 is closely tied to the implementation of a scientific approach and authentic assessment. This study aims to analyze the implementation of the scientific approach and of authentic assessment in physics learning, as well as the obstacles to both. Data were collected through questionnaires, observations, interviews, and documentation, and analyzed with percentage techniques and a qualitative descriptive approach. Based on the results, physics learning based on the scientific approach goes well (84.60%), as does learning based on authentic assessment (88%). Since both percentages fall short of 100%, obstacles to implementation remain. Obstacles to the scientific approach include time, a heavy load of material, the input ability of learners, learners' willingness to ask questions, laboratory support, and students' ability to process data. Obstacles to authentic assessment include the limited time for carrying it out, the number of criteria components involved, a lack of discipline in handling the administration, the difficulty of changing habits from traditional to authentic assessment, and the difficulty of processing scores in accordance with the Curriculum of 2013 format.
NASA Technical Reports Server (NTRS)
Lewandowski, Leon; Struckman, Keith
1994-01-01
Microwave Vision (MV), a concept originally developed in 1985, could play a significant role in the solution of robotic vision problems. Originally, our Microwave Vision concept was based on a pattern-matching approach employing computer-based stored-replica correlation processing. Artificial Neural Network (ANN) processor technology offers an attractive alternative to the correlation processing approach, namely the ability to learn and to adapt to changing environments. This paper describes the Microwave Vision concept, some initial ANN-MV experiments, and the design of an ANN-MV system that has led to a second patent disclosure in the robotic vision field.
A New Multi-Agent Approach to Adaptive E-Education
NASA Astrophysics Data System (ADS)
Chen, Jing; Cheng, Peng
Improving the degree of customer satisfaction is important in e-Education. This paper describes a new approach to adaptive e-Education that takes into account the full spectrum of Web service techniques and activities. It presents a multi-agent architecture based on artificial psychology techniques, which makes the e-Education process both adaptable and dynamic, and hence up-to-date. Knowledge-base techniques are used to support the e-Education process, and artificial psychology techniques to deal with user psychology, which makes the e-Education system more effective and satisfying.
Collective learning modeling based on the kinetic theory of active particles.
Burini, D; De Lillo, S; Gibelli, L
2016-03-01
This paper proposes a systems approach to the theory of perception and learning in populations composed of many living entities. Starting from a phenomenological description of these processes, a mathematical structure is derived which is deemed to incorporate their complexity features. The modeling is based on a generalization of kinetic theory methods where interactions are described by theoretical tools of game theory. As an application, the proposed approach is used to model the learning processes that take place in a classroom.
Beregovykh, V V; Spitskiy, O R
2014-01-01
A risk-based approach is used to examine the impact of different factors on the quality of medicinal products during technology transfer. A general diagram is offered for carrying out risk analysis in technology transfer from pharmaceutical development to production. When transferring technology to full-scale commercial production, it is necessary to investigate and simulate the application of the production process beforehand under the new, real conditions. The manufacturing process is the core factor in risk analysis, having the greatest impact on the quality attributes of a medicinal product. Further important factors are linked to the materials and products to be handled and to manufacturing environmental conditions such as premises, equipment and personnel. The use of the risk-based approach in designing a multipurpose production facility for medicinal products is shown, where the quantitative risk analysis tool RAMM (Risk Analysis and Mitigation Matrix) was applied.
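A minimal sketch in the spirit of a quantitative risk matrix such as RAMM (the factor list, 1-5 scales and the severity-times-likelihood score are illustrative assumptions, not the actual tool's scheme):

```python
# Hypothetical factors with (severity, likelihood) scores on a 1-5 scale.
factors = {
    "manufacturing process":  (5, 4),
    "materials and products": (4, 3),
    "premises":               (3, 2),
    "equipment":              (3, 3),
    "personnel":              (4, 2),
}

def rank_risks(factors):
    """Return factors sorted by risk priority number (severity x likelihood)."""
    scored = {name: s * l for name, (s, l) in factors.items()}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

for name, rpn in rank_risks(factors):
    print(f"{name}: RPN={rpn}")   # highest RPN = first mitigation priority
```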
Query Language for Location-Based Services: A Model Checking Approach
NASA Astrophysics Data System (ADS)
Hoareau, Christian; Satoh, Ichiro
We present a model checking approach to the rationale, implementation, and applications of a query language for location-based services. Such query mechanisms are necessary so that users, objects, and/or services can effectively benefit from the location-awareness of their surrounding environment. The underlying data model is founded on a symbolic model of space organized in a tree structure. Once extended to a semantic model for modal logic, we regard location query processing as a model checking problem, and thus define location queries as hybrid logic-based formulas. Our approach is unique among existing research because it explores the connection between location models and query processing in ubiquitous computing systems, relies on a sound theoretical basis, and provides modal logic-based query mechanisms for expressive searches over a decentralized data structure. A prototype implementation is also presented and discussed.
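The symbolic tree-of-locations model can be sketched as follows (the data structure and the containment query are simplified assumptions and do not reproduce the authors' hybrid-logic syntax):

```python
class Location:
    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)
        self.occupants = set()          # entities currently at this node

    def find(self, name):
        if self.name == name:
            return self
        for c in self.children:
            hit = c.find(name)
            if hit:
                return hit
        return None

    def contains(self, entity):
        """Model-checking style query: is entity somewhere in this subtree?"""
        if entity in self.occupants:
            return True
        return any(c.contains(entity) for c in self.children)

# Illustrative use: "is tag-42 somewhere on campus?"
campus = Location("campus", [Location("building-A", [Location("room-101")])])
campus.find("room-101").occupants.add("tag-42")
assert campus.contains("tag-42")
```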
Rütten, A; Wolff, A; Streber, A
2016-03-01
This article discusses 2 current issues in the field of public health research: (i) transfer of scientific knowledge into practice and (ii) sustainable implementation of good practice projects. It also supports integration of scientific and practice-based evidence production. Furthermore, it supports utilisation of interactive models that transcend deductive approaches to the process of knowledge transfer. Existing theoretical approaches, pilot studies and thoughtful conceptual considerations are incorporated into a framework showing the interplay of science, politics and prevention practice, which fosters a more sustainable implementation of health promotion programmes. The framework depicts 4 key processes of interaction between science and prevention practice: interactive knowledge to action, capacity building, programme adaptation and adaptation of the implementation context. Ensuring sustainability of health promotion programmes requires a concentrated process of integrating scientific and practice-based evidence production in the context of implementation. Central to the integration process is the approach of interactive knowledge to action, which especially benefits from capacity building processes that facilitate participation and systematic interaction between relevant stakeholders. Intense cooperation also induces a dynamic interaction between multiple actors and components such as health promotion programmes, target groups, relevant organisations and social, cultural and political contexts. The reciprocal adaptation of programmes and key components of the implementation context can foster effectiveness and sustainability of programmes. Sustainable implementation of evidence-based health promotion programmes requires alternatives to recent deductive models of knowledge transfer. Interactive approaches prove to be promising alternatives. Simultaneously, they change the responsibilities of science, policy and public health practice. Existing boundaries within disciplines and sectors are overcome by arranging transdisciplinary teams as well as by developing common agendas and procedures. Such approaches also require adaptations of the structure of research projects such as extending the length of funding.
Community-Based Participatory Evaluation: The Healthy Start Approach
Braithwaite, Ronald L.; McKenzie, Robetta D.; Pruitt, Vikki; Holden, Kisha B.; Aaron, Katrina; Hollimon, Chavone
2013-01-01
The use of community-based participatory research has gained momentum as a viable approach to academic and community engagement for research over the past 20 years. This article discusses an approach for extending the process with an emphasis on evaluation of a community partnership–driven initiative and thus advances the concept of conducting community-based participatory evaluation (CBPE) through a model used by the Healthy Start project of the Augusta Partnership for Children, Inc., in Augusta, Georgia. Application of the CBPE approach advances the importance of bilateral engagements with consumers and academic evaluators. The CBPE model shows promise as a reliable and credible evaluation approach for community-level assessment of health promotion programs. PMID:22461687
Discrete post-processing of total cloud cover ensemble forecasts
NASA Astrophysics Data System (ADS)
Hemri, Stephan; Haiden, Thomas; Pappenberger, Florian
2017-04-01
This contribution presents an approach to post-process ensemble forecasts for the discrete and bounded weather variable of total cloud cover. Two methods for discrete statistical post-processing of ensemble predictions are tested. The first approach is based on multinomial logistic regression, the second involves a proportional odds logistic regression model. Applying them to total cloud cover raw ensemble forecasts from the European Centre for Medium-Range Weather Forecasts improves forecast skill significantly. Based on station-wise post-processing of raw ensemble total cloud cover forecasts for a global set of 3330 stations over the period from 2007 to early 2014, the more parsimonious proportional odds logistic regression model proved to slightly outperform the multinomial logistic regression model. Reference: Hemri, S., Haiden, T., & Pappenberger, F. (2016). Discrete post-processing of total cloud cover ensemble forecasts. Monthly Weather Review, 144, 2565-2577.
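A sketch of the proportional odds step on synthetic data (statsmodels' OrderedModel is one available implementation, used here purely for illustration; the predictors and data are invented, not ECMWF output):

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel  # statsmodels >= 0.12

rng = np.random.default_rng(1)
n = 500
ens_mean = rng.uniform(0, 8, n)      # ensemble-mean total cloud cover (okta)
ens_std = rng.uniform(0, 2, n)       # ensemble spread
# Synthetic observed okta category (0..8), loosely tracking the ensemble mean.
obs = np.clip(np.round(ens_mean + rng.normal(0, 1.5, n)), 0, 8).astype(int)

X = pd.DataFrame({"ens_mean": ens_mean, "ens_std": ens_std})
y = pd.Series(pd.Categorical(obs, ordered=True))

model = OrderedModel(y, X, distr="logit")    # proportional odds ("logit" link)
result = model.fit(method="bfgs", disp=False)
probs = result.predict(X)                    # per-category probability forecasts
```

The proportional odds model fits one set of slopes plus ordered thresholds, which is why it is more parsimonious than a multinomial model with separate slopes per category.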
[Establishment of design space for production process of traditional Chinese medicine preparation].
Xu, Bing; Shi, Xin-Yuan; Qiao, Yan-Jiang; Wu, Zhi-Sheng; Lin, Zhao-Zhou
2013-03-01
The philosophy of quality by design (QbD) is now leading the change in drug manufacturing from the conventional test-based approach to a science- and risk-based approach that focuses on detailed research into, and understanding of, the production process. As understanding of the manufacturing process deepens, the design space can be determined, and the emphasis of quality control shifts from quality standards to the design space. The establishment of the design space is therefore the core step in implementing QbD, and studying methods for building the design space is of great importance. This essay proposes the concept of a design space for the production process of traditional Chinese medicine (TCM) preparations, gives a systematic introduction to the concept, analyzes the feasibility and significance of building such a design space, proposes study approaches on the basis of examples that comply with the characteristics of TCM preparations, and outlines future study orientations.
An alternative approach to characterize nonlinear site effects
Zhang, R.R.; Hartzell, S.; Liang, J.; Hu, Y.
2005-01-01
This paper examines the rationale of a method of nonstationary processing and analysis, referred to as the Hilbert-Huang transform (HHT), for its application to a recording-based approach in quantifying influences of soil nonlinearity in site response. In particular, this paper first summarizes symptoms of soil nonlinearity shown in earthquake recordings, reviews the Fourier-based approach to characterizing nonlinearity, and offers justifications for the HHT in addressing nonlinearity issues. This study then uses the HHT method to analyze synthetic data and recordings from the 1964 Niigata and 2001 Nisqually earthquakes. In doing so, the HHT-based site response is defined as the ratio of marginal Hilbert amplitude spectra, alternative to the Fourier-based response that is the ratio of Fourier amplitude spectra. With the Fourier-based approach in studies of site response as a reference, this study shows that the alternative HHT-based approach is effective in characterizing soil nonlinearity and nonlinear site response.
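An HHT-style marginal Hilbert amplitude spectrum can be sketched as follows (assuming the PyEMD package for the empirical mode decomposition; the binning choices are illustrative). The HHT-based site response defined above would then be the ratio of such spectra between surface and reference recordings:

```python
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD   # assumes the PyEMD (EMD-signal) package is installed

def marginal_hilbert_spectrum(x, fs, n_bins=128):
    """Sum instantaneous amplitude into frequency bins over all IMFs."""
    imfs = EMD()(x)                      # empirical mode decomposition
    freqs = np.linspace(0, fs / 2, n_bins + 1)
    spectrum = np.zeros(n_bins)
    for imf in imfs:
        analytic = hilbert(imf)
        amp = np.abs(analytic)
        phase = np.unwrap(np.angle(analytic))
        inst_f = np.diff(phase) * fs / (2 * np.pi)   # instantaneous frequency
        idx = np.clip(np.digitize(inst_f, freqs) - 1, 0, n_bins - 1)
        np.add.at(spectrum, idx, amp[:-1])
    return freqs[:-1], spectrum

# HHT-based site response: ratio of surface to reference marginal spectra.
# f, S_surf = marginal_hilbert_spectrum(surface_rec, fs)
# _, S_ref  = marginal_hilbert_spectrum(reference_rec, fs)
# response = S_surf / S_ref
```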
Classification of cancerous cells based on the one-class problem approach
NASA Astrophysics Data System (ADS)
Murshed, Nabeel A.; Bortolozzi, Flavio; Sabourin, Robert
1996-03-01
One of the most important factors in reducing the effect of cancerous diseases is early diagnosis, which requires a good and robust method. With the advancement of computer technologies and digital image processing, the development of a computer-based system has become feasible. In this paper, we introduce a new approach for the detection of cancerous cells, based on the one-class problem approach, through which the classification system need only be trained with patterns of cancerous cells. This reduces the burden of the training task by about 50%. Based on this approach, a computer-based classification system is developed using Fuzzy ARTMAP neural networks. Experiments were performed using a set of 542 patterns taken from a sample of breast cancer. Results show 98% correct identification of cancerous cells and 95% correct identification of non-cancerous cells.
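The one-class training idea in a minimal sketch (scikit-learn's OneClassSVM stands in for the paper's Fuzzy ARTMAP classifier, and the feature vectors are synthetic):

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(7)
# Synthetic "cancerous cell" feature vectors; only this class is used to train.
X_train = rng.normal(loc=0.0, scale=1.0, size=(500, 8))

clf = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(X_train)

# At test time, +1 = resembles the trained (cancerous) class, -1 = outlier.
X_test = np.vstack([rng.normal(0, 1, (10, 8)),      # cancer-like patterns
                    rng.normal(4, 1, (10, 8))])     # non-cancer-like patterns
pred = clf.predict(X_test)
```

Training only on the positive class halves the labeling burden, exactly the property the abstract highlights.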
Flexible End2End Workflow Automation of Hit-Discovery Research.
Holzmüller-Laue, Silke; Göde, Bernd; Thurow, Kerstin
2014-08-01
The article considers a new approach to more complex laboratory automation at the workflow layer: the automation of end2end workflows. The combination of all relevant subprocesses, whether automated or performed manually, independently of where and in which organizational unit they run, results in end2end processes that include all result dependencies. The end2end approach covers not only the classical experiments in synthesis or screening, but also auxiliary processes such as the production and storage of chemicals, cell culturing, and maintenance, as well as preparatory activities and analyses of experiments. Furthermore, the connection of control flow and data flow in the same process model reduces the effort of data transfer between the involved systems, including the necessary data transformations. This end2end laboratory automation can be realized effectively with modern methods of business process management (BPM). The approach is based on the recent standardization of the process-modeling notation, Business Process Model and Notation 2.0. In drug discovery, several scientific disciplines work together with manifold modern methods, technologies, and a wide range of automated instruments for the discovery and design of target-based drugs. The article discusses the novel BPM-based automation concept with an implemented example of a high-throughput screening of previously synthesized compound libraries.
SU-E-J-108: Solving the Chinese Postman Problem for Effective Contour Deformation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, J; Zhang, L; Balter, P
2015-06-15
Purpose: To develop a practical approach for accurate contour deformation when deformable image registration (DIR) is used for atlas-based segmentation or contour propagation in image-guided radiotherapy. Methods: A contour deformation approach was developed on the basis of 3D mesh operations. The 2D contours, represented by a series of points in each slice, were first converted to a 3D triangular mesh, which was deformed by the deformation vectors resulting from DIR. A set of parallel 2D planes then cut through the deformed 3D mesh, generating unordered points and line segments that had to be reorganized into a set of 2D contour points. It was realized that the reorganization problem was equivalent to solving the Chinese Postman Problem (CPP) by traversing a graph built from the unordered points with the least cost. Alternatively, deformation could be applied to a binary mask converted from the original contours, with the deformed binary mask then converted back into contours at the CT slice locations. We performed a qualitative comparison to validate the mesh-based approach against the image-based approach. Results: The DIR could considerably change the 3D mesh, producing complicated 2D contour representations after deformation. CPP was able to effectively reorganize the points in the 2D planes no matter how complicated the 2D contours were. The mesh-based approach did not require post-processing of the contour, thus accurately showing the actual deformation in DIR. It kept some fine details and resulted in smoother contours than the image-based approach, especially for the lung structure. The image-based approach appeared to over-process contours and suffered from image resolution limits. The mesh-based approach was integrated into in-house DIR software for use in routine clinic and research. Conclusion: We developed a practical approach for accurate contour deformation. The efficiency of this approach was demonstrated in both clinic and research applications. This work was partially supported by the Cancer Prevention & Research Institute of Texas (CPRIT) RP110562.
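The CPP step on the cut-plane graph can be sketched with networkx (the eulerize/eulerian_circuit route below is one standard least-cost way to traverse every segment, not necessarily the authors' solver):

```python
import networkx as nx

def order_contour_points(segments):
    """segments: iterable of ((x1, y1), (x2, y2)) line segments produced by
    cutting the deformed 3D mesh with a 2D plane. Returns an ordered walk of
    points that traverses every segment, duplicating as few edges as possible."""
    G = nx.Graph()
    for p, q in segments:
        G.add_edge(p, q)
    # Make the graph Eulerian by duplicating edges between odd-degree nodes,
    # then walk an Eulerian circuit: the Chinese Postman traversal.
    H = nx.eulerize(G)
    start = next(iter(H.nodes))
    return [u for u, v in nx.eulerian_circuit(H, source=start)]

# Illustrative use with a tiny square contour given as unordered segments:
segs = [((0, 0), (1, 0)), ((1, 1), (0, 1)), ((1, 0), (1, 1)), ((0, 1), (0, 0))]
points = order_contour_points(segs)
```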
Passive vibration control: a structure–immittance approach
Zhang, Sara Ying; Jiang, Jason Zheng; Neild, Simon A.
2017-01-01
Linear passive vibration absorbers, such as tuned mass dampers, often contain springs, dampers and masses, although recently there has been a growing trend to employ or supplement the mass elements with inerters. When considering possible configurations with these elements broadly, two approaches are normally used: one structure-based and one immittance-based. Both approaches have their advantages and disadvantages. In this paper, a new approach is proposed: the structure–immittance approach. Using this approach, a full set of possible series–parallel networks with predetermined numbers of each element type can be represented by structural immittances, obtained via a proposed general formulation process. Using the structural immittances, both the ability to investigate a class of absorber possibilities together (advantage of the immittance-based approach), and the ability to control the complexity, topology and element values in resulting absorber configurations (advantages of the structure-based approach) are provided at the same time. The advantages of the proposed approach are demonstrated through two case studies on building vibration suppression and automotive suspension design, respectively. PMID:28588407
A Comprehensive Planning Model.
ERIC Educational Resources Information Center
Rieley, James B.
The key to long-term institutional effectiveness is a comprehensive planning process that identifies a few vital goals that can be measured by an institution. Effective strategic planning involves five key elements: process-based planning, a systemic approach, integration with the budget process, an effective deployment process, and appropriate…
NASA Astrophysics Data System (ADS)
Liu, Likun
2018-01-01
In the field of remote sensing image processing, segmentation is a preliminary step for later analysis, semi-automatic human interpretation, and fully automatic machine recognition and learning. Since 2000, object-oriented remote sensing image processing methods have prevailed, whose core is the Fractal Net Evolution Approach (FNEA) multi-scale segmentation algorithm. This paper focuses on the study and improvement of that algorithm: it analyzes existing segmentation algorithms and selects the watershed algorithm as the optimum initialization. The algorithm is then modified by adjusting an area parameter, and further by combining the area parameter with a heterogeneity parameter. Experiments show that the modified FNEA algorithm yields better segmentation results than a traditional pixel-based method (an FCM algorithm based on neighborhood information) and than the plain combination of FNEA and watershed.
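The FNEA merge decision can be sketched via the standard heterogeneity-increase criterion from the object-oriented segmentation literature (band weights and scale threshold are illustrative, and the pooled-variance step is a simplification that ignores the between-segment mean shift):

```python
import numpy as np

def color_heterogeneity_increase(seg1, seg2, band_weights):
    """seg: dict with 'n' (pixel count) and 'std' (per-band std. dev.).
    Returns the weighted color heterogeneity change caused by merging."""
    n1, n2 = seg1["n"], seg2["n"]
    n = n1 + n2
    h = 0.0
    for b, w in enumerate(band_weights):
        merged_std = np.sqrt((n1 * seg1["std"][b] ** 2 +
                              n2 * seg2["std"][b] ** 2) / n)
        h += w * (n * merged_std - (n1 * seg1["std"][b] + n2 * seg2["std"][b]))
    return h

def should_merge(seg1, seg2, band_weights, scale=10.0):
    """Merge only while the heterogeneity increase stays below the scale."""
    return color_heterogeneity_increase(seg1, seg2, band_weights) < scale ** 2
```

Starting the region-merging from watershed basins, as the paper does, replaces FNEA's usual single-pixel seeds with coherent initial segments.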
Analyzing Hedges in Verbal Communication: An Adaptation-Based Approach
ERIC Educational Resources Information Center
Wang, Yuling
2010-01-01
Based on Adaptation Theory, the article analyzes the production process of hedges. The procedure consists of the continuous making of choices in linguistic forms and communicative strategies. These choices are made just for adaptation to the contextual correlates. Besides, the adaptation process is dynamic, intentional and bidirectional.
A process-based emission model for volatile organic compounds from silage sources on farms
USDA-ARS?s Scientific Manuscript database
Silage on dairy farms can emit large amounts of volatile organic compounds (VOCs), a precursor in the formation of tropospheric ozone. Because of the challenges associated with direct measurements, process-based modeling is another approach for estimating emissions of air pollutants from sources suc...
NASA Astrophysics Data System (ADS)
Wichmann, Volker
2017-09-01
The Gravitational Process Path (GPP) model can be used to simulate the process path and run-out area of gravitational processes based on a digital terrain model (DTM). The conceptual model combines several components (process path, run-out length, sink filling and material deposition) to simulate the movement of a mass point from an initiation site to the deposition area. For each component several modeling approaches are provided, which makes the tool configurable for different processes such as rockfall, debris flows or snow avalanches. The tool can be applied to regional-scale studies such as natural hazard susceptibility mapping but also contains components for scenario-based modeling of single events. Both the modeling approaches and precursor implementations of the tool have proven their applicability in numerous studies, also including geomorphological research questions such as the delineation of sediment cascades or the study of process connectivity. This is the first open-source implementation, completely re-written, extended and improved in many ways. The tool has been committed to the main repository of the System for Automated Geoscientific Analyses (SAGA) and thus will be available with every SAGA release.
On Intelligent Design and Planning Method of Process Route Based on Gun Breech Machining Process
NASA Astrophysics Data System (ADS)
Hongzhi, Zhao; Jian, Zhang
2018-03-01
This paper presents an approach to the intelligent design and planning of process routes, based on the gun breech machining process, to address several problems: the complexity of gun breech machining, the tedium of route design, and the long cycle of the traditional, hard-to-manage process route. Based on the gun breech machining process, an intelligent process-route design and planning system is developed using DEST and VC++. The system includes two functional modules: intelligent process-route design and process-route planning. The intelligent design module analyzes the gun breech machining process and summarizes breech process knowledge to build the knowledge base and inference engine, from which a gun breech process route is output automatically. On the basis of the intelligent design module, the final process route is made, edited and managed in the planning module.
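The knowledge-base-plus-inference-engine pattern can be sketched as simple forward chaining (the rules and feature names below are invented for illustration; the paper's actual breech process knowledge is not given in this abstract):

```python
# Hypothetical rule base: (required_facts, operation_to_add).
RULES = [
    ({"raw_forging"},                 "rough_turning"),
    ({"rough_turning"},               "semi_finish_turning"),
    ({"semi_finish_turning", "bore"}, "deep_hole_drilling"),
    ({"deep_hole_drilling"},          "honing"),
]

def plan_route(features):
    """Simple forward-chaining inference over the rule base."""
    facts, route = set(features), []
    changed = True
    while changed:
        changed = False
        for required, op in RULES:
            if required <= facts and op not in facts:
                facts.add(op)
                route.append(op)
                changed = True
    return route

print(plan_route({"raw_forging", "bore"}))
# -> ['rough_turning', 'semi_finish_turning', 'deep_hole_drilling', 'honing']
```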
NASA Technical Reports Server (NTRS)
Yarrow, Maurice; McCann, Karen M.; Biswas, Rupak; VanderWijngaart, Rob; Yan, Jerry C. (Technical Monitor)
2000-01-01
The creation of parameter study suites has recently become a more challenging problem as the parameter studies have now become multi-tiered and the computational environment has become a supercomputer grid. The parameter spaces are vast, the individual problem sizes are getting larger, and researchers are now seeking to combine several successive stages of parameterization and computation. Simultaneously, grid-based computing offers great resource opportunity but at the expense of great difficulty of use. We present an approach to this problem which stresses intuitive visual design tools for parameter study creation and complex process specification, and also offers programming-free access to grid-based supercomputer resources and process automation.
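The core of parameter-study creation can be sketched as a Cartesian product over parameter ranges (parameter names are illustrative; the visual design tools and grid submission layers described here sit above this step):

```python
import itertools

def parameter_study(**param_ranges):
    """Yield one run specification per point in the parameter space."""
    names = list(param_ranges)
    for values in itertools.product(*param_ranges.values()):
        yield dict(zip(names, values))

# A small multi-tiered study: mesh resolution x flow regime x scheme.
runs = list(parameter_study(mesh_size=[64, 128, 256],
                            reynolds=[1e4, 1e5],
                            scheme=["upwind", "central"]))
print(len(runs))   # 12 runs, one per parameter combination
```

Chaining such generators, with one stage's outputs feeding the next stage's ranges, reproduces the multi-tiered structure the abstract describes.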
A Cognitive-Behavioral Approach to Chronic Pain Management.
ERIC Educational Resources Information Center
Grant, Lynda D.; Haverkamp, Beth E.
1995-01-01
Provides counselors with an introduction to the role of psychosocial processes in the experience of pain and offers assessment and intervention recommendations based on a cognitive-behavioral therapy approach to pain management. (JPS)
ERIC Educational Resources Information Center
Veloo, Arsaythamby; Krishnasamy, Hariharan N.; Harun, Hana Mulyani
2015-01-01
The purpose of this study is to determine gender differences and type of learning approaches among Universiti Utara Malaysia (UUM) undergraduate students in English writing performance. The study involved 241 (32.8% male & 67.2% female) undergraduate students of UUM who were taking the Process Writing course. This study uses a Two-Factor Study…
New approaches to digital transformation of petrochemical production
NASA Astrophysics Data System (ADS)
Andieva, E. Y.; Kapelyuhovskaya, A. A.
2017-08-01
The newest concepts of reference architectures for digital industrial transformation are considered, and the problems of applying them to enterprises whose life cycle includes the processing and marketing of oil products are revealed. A concept of reference architecture is proposed that provides a systematic representation of the fundamental changes in approaches to production management based on the automation of production process control.
The jABC Approach to Rigorous Collaborative Development of SCM Applications
NASA Astrophysics Data System (ADS)
Hörmann, Martina; Margaria, Tiziana; Mender, Thomas; Nagel, Ralf; Steffen, Bernhard; Trinh, Hong
Our approach to the model-driven collaborative design of IKEA's P3 Delivery Management Process uses the jABC [9] for model driven mediation and choreography to complement a RUP-based (Rational Unified Process) development process. jABC is a framework for service development based on Lightweight Process Coordination. Users (product developers and system/software designers) easily develop services and applications by composing reusable building-blocks into (flow-) graph structures that can be animated, analyzed, simulated, verified, executed, and compiled. This way of handling the collaborative design of complex embedded systems has proven to be effective and adequate for the cooperation of non-programmers and non-technical people, which is the focus of this contribution, and it is now being rolled out in the operative practice.
Zhuo, Ming-Peng; Zhang, Ye-Xin; Li, Zhi-Zhou; Shi, Ying-Li; Wang, Xue-Dong; Liao, Liang-Sheng
2018-03-15
The controlled fabrication of organic single-crystalline nanowires (OSCNWs) with a uniform diameter in the nanoscale via the bottom-up approach, which is just based on weak intermolecular interaction, is a great challenge. Herein, we utilize the synergy approach of the bottom-up and the top-down processes to fabricate OSCNWs with diameters of 120 ± 10 nm through stepwise evolution processes. Specifically, the evolution processes vary from the self-assembled organic micro-rods with a quadrangular pyramid-like end-structure bounded with {111}s and {11-1}s crystal planes to the "top-down" synthesized organic micro-rods with the flat cross-sectional {002}s plane, to the organic micro-tubes with a wall thickness of ∼115 nm, and finally to the organic nanowires. Notably, the anisotropic etching process caused by the protic solvent molecules (such as ethanol) is crucial for the evolution of the morphology throughout the whole top-down process. Therefore, our demonstration opens a new avenue for the controlled-fabrication of organic nanowires, and also contributes to the development of nanowire-based organic optoelectronics such as organic nanowire lasers.
Taking Root: a grounded theory on evidence-based nursing implementation in China.
Cheng, L; Broome, M E; Feng, S; Hu, Y
2018-06-01
Evidence-based nursing is widely recognized as the critical foundation for quality care. The aim was to develop a middle-range theory on the process of evidence-based nursing implementation in the Chinese context. A grounded theory study using unstructured in-depth individual interviews was conducted with 56 participants who were involved in 24 evidence-based nursing implementation projects in Mainland China from September 2015 to September 2016. A middle-range grounded theory of 'Taking Root' was developed. The theory describes the evidence implementation process as consisting of four components (driving forces, process, outcome, sustainment/regression), three approaches (top-down, bottom-up and outside-in), four implementation strategies (patient-centred, nurses at the heart of change, reaching agreement, collaboration) and two patterns (transformational and adaptive implementation). Certain perspectives may not have been captured, as the retrospective nature of the interviewing technique did not allow for 'real-time' assessment of the actual implementation process. The transferability of the findings requires further exploration, as few participants with negative experiences were recruited. This is the first study to explore the evidence-based implementation process, strategies, approaches and patterns in the Chinese nursing practice context to inform international nursing and health policymaking. The theory of Taking Root described various approaches to evidence implementation and how implementation can be transformational for the nurses and the settings in which they work. Nursing educators, managers and researchers should work together to improve nurses' readiness for evidence implementation. Healthcare systems need to optimize internal mechanisms and external collaborations to promote nursing practice in line with evidence and achieve clinical outcomes and sustainability. © 2017 International Council of Nurses.
ERIC Educational Resources Information Center
Chen, Jinshi
2017-01-01
Legal case brief writing is pedagogically important yet insufficiently discussed for Chinese EFL learners majoring in law. Based on process genre approach and discourse information theory (DIT), the present study designs a corpus-based analytical model for Chinese EFL learners' autonomy in legal case brief writing and explores the process of case…
The data base management system alternative for computing in the human services.
Sircar, S; Schkade, L L; Schoech, D
1983-01-01
The traditional incremental approach to computerization presents substantial problems as systems develop and grow. The Data Base Management System approach to computerization was developed to overcome the problems resulting from implementing computer applications one at a time. The authors describe the applications approach and the alternative Data Base Management System (DBMS) approach through their developmental history, discuss the technology of DBMS components, and consider the implications of choosing the DBMS alternative. Human service managers need an understanding of the DBMS alternative and its applicability to their agency data processing needs. The basis for a conscious selection of computing alternatives is outlined.
Kotenko, Igor
2014-01-01
The paper outlines a bioinspired approach named “network nervous system” and methods of simulating infrastructure attacks and protection mechanisms based on this approach. The protection mechanisms based on this approach consist of distributed procedures of information collection and processing, which coordinate the activities of the main devices of a computer network, identify attacks, and determine necessary countermeasures. Attacks and protection mechanisms are specified as structural models using a set-theoretic approach. An environment for simulation of protection mechanisms based on the biological metaphor is considered; the experiments demonstrating the effectiveness of the protection mechanisms are described. PMID:25254229
Han, L. F; Plummer, Niel
2016-01-01
Numerous methods have been proposed to estimate the pre-nuclear-detonation 14C content of dissolved inorganic carbon (DIC) recharged to groundwater, corrected/adjusted for geochemical processes in the absence of radioactive decay (14C0) - a quantity that is essential for estimation of the radiocarbon age of DIC in groundwater. The models/approaches most commonly used are grouped as follows: (1) single-sample-based models, (2) a statistical approach based on the observed (curved) relationship between 14C and δ13C data for the aquifer, and (3) the geochemical mass-balance approach that constructs adjustment models accounting for all the geochemical reactions known to occur along a groundwater flow path. This review discusses first the geochemical processes behind each of the single-sample-based models, followed by discussions of the statistical approach and the geochemical mass-balance approach. Finally, the applications, advantages and limitations of the three groups of models/approaches are discussed. The single-sample-based models constitute the prevailing use of 14C data in hydrogeology and hydrological studies. This is in part because the models are applied to an individual water sample to estimate the 14C age; therefore the measurement data are easily available. These models have been shown to provide realistic radiocarbon ages in many studies. However, they usually are limited to simple carbonate aquifers, and the selection of model may have significant effects on 14C0, often resulting in a wide range of estimates of 14C ages. Of the single-sample-based models, four are recommended for the estimation of 14C0 of DIC in groundwater: Pearson's model (Ingerson and Pearson, 1964; Pearson and White, 1967), Han & Plummer's model (Han and Plummer, 2013), the IAEA model (Gonfiantini, 1972; Salem et al., 1980), and Oeschger's model (Geyh, 2000). These four models include all processes considered in single-sample-based models, and can be used in different ranges of δ13C values. In contrast to the single-sample-based models, the extended Gonfiantini & Zuppi model (Gonfiantini and Zuppi, 2003; Han et al., 2014) is a statistical approach. This approach can be used to estimate 14C ages when a curved relationship between the 14C and δ13C values of the DIC data is observed. In addition to estimation of groundwater ages, the relationship between 14C and δ13C data can be used to interpret hydrogeological characteristics of the aquifer, e.g. estimating apparent rates of geochemical reactions and revealing the complexity of the geochemical environment, and to identify samples that are not affected by the same set of reactions/processes as the rest of the dataset. The investigated water samples may have a wide range of ages, and for waters with very low values of 14C, the model based on statistics may give more reliable age estimates than those obtained from single-sample-based models. In the extended Gonfiantini & Zuppi model, a representative system-wide value of the initial 14C content is derived from the 14C and δ13C data of DIC and can differ from that used in single-sample-based models. Therefore, the extended Gonfiantini & Zuppi model usually avoids the effect of modern water components which might retain ‘bomb’ pulse signatures. The geochemical mass-balance approach constructs an adjustment model that accounts for all the geochemical reactions known to occur along an aquifer flow path (Plummer et al., 1983; Wigley et al., 1978; Plummer et al., 1994; Plummer and Glynn, 2013), and includes, in addition to DIC, dissolved organic carbon (DOC) and methane (CH4). If sufficient chemical, mineralogical and isotopic data are available, the geochemical mass-balance method can yield the most accurate estimates of the adjusted radiocarbon age. The main limitation of this approach is that complete chemical, mineralogical and isotopic information is necessary, and these data are often limited. Failure to recognize the limitations and underlying assumptions on which the various models and approaches are based can result in a wide range of estimates of 14C0 and limit the usefulness of radiocarbon as a dating tool for groundwater. In each of the three generalized approaches (single-sample-based models, statistical approach, and geochemical mass-balance approach), successful application depends on scrutiny of the isotopic (14C and δ13C) and chemical data to conceptualize the reactions and processes that affect the 14C content of DIC in aquifers. The recently developed graphical analysis method is shown to aid in determining which approach is most appropriate for the isotopic and chemical data from a groundwater system.
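For orientation, the decay relation and one representative single-sample-based correction can be written compactly. The sketch below uses generic notation (activities in pmc, the 5730-year half-life, and a Pearson-type two-endmember δ13C mixing model); it is a standard formulation, not an excerpt from this review:

```latex
% Radiocarbon age of DIC from measured activity A and adjusted initial
% activity A_0 (both in pmc), using the 5730 yr half-life:
t = -\frac{5730}{\ln 2}\,\ln\!\left(\frac{A}{A_0}\right)

% Pearson-type mixing correction (one single-sample-based model): the
% fraction q of DIC derived from soil-gas CO2 follows from a delta-13C
% mass balance between soil-gas and carbonate endmembers,
q = \frac{\delta^{13}\mathrm{C}_{DIC} - \delta^{13}\mathrm{C}_{carb}}
         {\delta^{13}\mathrm{C}_{soil} - \delta^{13}\mathrm{C}_{carb}},
\qquad
A_0 = q\,A_{soil} + (1 - q)\,A_{carb}
```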
Learning and teaching about the nature of science through process skills
NASA Astrophysics Data System (ADS)
Mulvey, Bridget K.
This dissertation, a three-paper set, explored whether the process skills-based approach to nature of science instruction improves teachers' understandings, intentions to teach, and instructional practice related to the nature of science. The first paper examined the nature of science views of 53 preservice science teachers before and after a year of secondary science methods instruction that incorporated the process skills-based approach. Data consisted of each participant's written and interview responses to the Views of the Nature of Science (VNOS) questionnaire. Systematic data analysis led to the conclusion that participants exhibited statistically significant and practically meaningful improvements in their nature of science views and viewed teaching the nature of science as essential to their future instruction. The second and third papers assessed the outcomes of the process skills-based approach with 25 inservice middle school science teachers. For the second paper, the author collected and analyzed participants' VNOS and interview responses before, after, and 10 months after a 6-day summer professional development. Long-term retention of more aligned nature of science views underpins teachers' ability to teach aligned conceptions to their students, yet it is rarely examined. Participants substantially improved their nature of science views after the professional development, retained those views over 10 months, and attributed their more aligned understandings to the course. The third paper addressed these participants' instructional practices based on participant-created video reflections of their nature of science and inquiry instruction. Two participant interviews and class notes were also analyzed via a constant comparative approach to ascertain if, how, and why the teachers explicitly integrated the nature of science into their instruction. The participants recognized the process skills-based approach as instrumental in the facilitation of their improved views. Additionally, the participants saw the nature of science as an important way to help students to access core science content such as the theory of evolution by natural selection. Most impressively, participants taught the nature of science explicitly and regularly. This instruction was student-centered, involving high levels of student engagement in ways that represented applying, adapting, and innovating on what they learned in the summer professional development.
NASA Astrophysics Data System (ADS)
Liu, Jie; Hu, Youmin; Wang, Yan; Wu, Bo; Fan, Jikai; Hu, Zhongxu
2018-05-01
The diagnosis of complicated fault severity problems in rotating machinery systems is an important issue that affects the productivity and quality of manufacturing processes and industrial applications. However, it usually suffers from several deficiencies. (1) A considerable degree of prior knowledge and expertise is required not only to extract and select specific features from raw sensor signals, but also to choose a suitable fusion of sensor information. (2) Traditional artificial neural networks with shallow architectures are usually adopted, and they have a limited ability to learn complex and variable operating conditions. In multi-sensor-based diagnosis applications in particular, massive high-dimensional and high-volume raw sensor signals need to be processed. In this paper, an integrated multi-sensor fusion-based deep feature learning (IMSFDFL) approach is developed to identify the fault severity in rotating machinery processes. First, traditional statistics and energy spectrum features are extracted from multiple sensors with multiple channels and combined. Then, a fused feature vector is constructed from all of the acquisition channels. Further, deep feature learning with stacked auto-encoders is used to obtain the deep features. Finally, the traditional softmax model is applied to identify the fault severity. The effectiveness of the proposed IMSFDFL approach is primarily verified by a one-stage gearbox experimental platform that uses several accelerometers under different operating conditions. This approach can identify fault severity more effectively than traditional approaches.
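To make the pipeline concrete, here is a minimal sketch of the fusion-then-learn idea. It is not the authors' IMSFDFL code: a generic multilayer perceptron stands in for the stacked auto-encoder plus softmax stage, and the signals, channel count, and labels are synthetic:

```python
# Sketch: per-channel feature extraction, feature-level fusion, then a
# neural classifier with a softmax output over severity classes.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def channel_features(x):
    """Simple statistical and energy-spectrum features for one channel."""
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    return np.array([x.mean(), x.std(), x.max() - x.min(),
                     spectrum.sum(), spectrum.argmax()])

# 200 samples, 4 sensor channels, 1024 points each; 3 severity classes.
X_raw = rng.normal(size=(200, 4, 1024))
y = rng.integers(0, 3, size=200)

# Feature-level fusion: concatenate per-channel features into one vector.
X = np.array([np.concatenate([channel_features(ch) for ch in sample])
              for sample in X_raw])

clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
clf.fit(X, y)                      # softmax output over severity classes
print(clf.score(X, y))
```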
Managing complex processing of medical image sequences by program supervision techniques
NASA Astrophysics Data System (ADS)
Crubezy, Monica; Aubry, Florent; Moisan, Sabine; Chameroy, Virginie; Thonnat, Monique; Di Paola, Robert
1997-05-01
Our objective is to offer clinicians wider access to evolving medical image processing (MIP) techniques, crucial to improve assessment and quantification of physiological processes, but difficult to handle for non-specialists in MIP. Based on artificial intelligence techniques, our approach consists in the development of a knowledge-based program supervision system, automating the management of MIP libraries. It comprises a library of programs, a knowledge base capturing the expertise about programs and data, and a supervision engine. It selects, organizes and executes the appropriate MIP programs given a goal to achieve and a data set, with dynamic feedback based on the results obtained. It also advises users in the development of new procedures chaining MIP programs. We have experimented with the approach in an application of factor analysis of medical image sequences as a means of predicting the response of osteosarcoma to chemotherapy, with both MRI and NM dynamic image sequences. As a result, our program supervision system frees clinical end-users from performing tasks outside their competence, permitting them to concentrate on clinical issues. Therefore our approach enables a better exploitation of the possibilities offered by MIP and higher quality results, both in terms of robustness and reliability.
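A toy sketch of the supervision loop may help: a knowledge base maps a processing goal to a chain of programs, and each result is assessed so a failing step can be retried with adjusted parameters. All names, programs, and the quality criterion below are invented for illustration, not taken from the authors' system:

```python
# Goal-driven program supervision sketch with dynamic feedback.
KNOWLEDGE_BASE = {
    "quantify_kinetics": ["register", "segment", "factor_analysis"],
}

PROGRAMS = {
    "register":        lambda data, p: {**data, "registered": True},
    "segment":         lambda data, p: {**data, "roi": p.get("threshold", 0.5)},
    "factor_analysis": lambda data, p: {**data, "factors": 3},
}

def assess(step, data):
    """Toy quality criterion standing in for result evaluation."""
    return data.get("roi", 1.0) >= 0.3

def supervise(goal, data, params=None, max_retries=2):
    params = dict(params or {})
    for step in KNOWLEDGE_BASE[goal]:          # plan selection
        for attempt in range(max_retries + 1):
            data = PROGRAMS[step](data, params)
            if assess(step, data):             # dynamic feedback
                break
            params["threshold"] = params.get("threshold", 0.5) * 0.8

    return data

print(supervise("quantify_kinetics", {"sequence": "MRI"}))
```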
Yielding physically-interpretable emulators - A Sparse PCA approach
NASA Astrophysics Data System (ADS)
Galelli, S.; Alsahaf, A.; Giuliani, M.; Castelletti, A.
2015-12-01
Projection-based techniques, such as Proper Orthogonal Decomposition (POD), are a common approach to surrogating high-fidelity process-based models with lower-order dynamic emulators. With POD, the dimensionality reduction is achieved by using observations, or 'snapshots' (generated with the high-fidelity model), to project the entire set of input and state variables of this model onto a smaller set of basis functions that account for most of the variability in the data. While the reduction efficiency and variance control of POD techniques are usually very high, the resulting emulators are structurally highly complex and can hardly be given a physically meaningful interpretation, as each basis is a projection of the entire set of inputs and states. In this work, we propose a novel approach based on Sparse Principal Component Analysis (SPCA) that combines the several assets of POD methods with the potential for ex-post interpretation of the emulator structure. SPCA reduces the number of non-zero coefficients in the basis functions by identifying a sparse matrix of coefficients. While the resulting set of basis functions may retain less variance of the snapshots, the presence of a few non-zero coefficients assists in the interpretation of the underlying physical processes. The SPCA approach is tested on the reduction of a 1D hydro-ecological model (DYRESM-CAEDYM) used to describe the main ecological and hydrodynamic processes in Tono Dam, Japan. An experimental comparison against a standard POD approach shows that SPCA achieves the same accuracy in emulating a given output variable, for the same level of dimensionality reduction, while yielding better insights into the main process dynamics.
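The contrast between dense POD loadings and sparse SPCA loadings is easy to demonstrate; a minimal sketch with synthetic snapshots (not the study's DYRESM-CAEDYM data) follows:

```python
# Sparse PCA zeroes many loadings, so each basis function involves only
# a few state variables and is easier to interpret physically.
import numpy as np
from sklearn.decomposition import PCA, SparsePCA

rng = np.random.default_rng(1)
snapshots = rng.normal(size=(300, 20))   # 300 snapshots of 20 variables

pod = PCA(n_components=3).fit(snapshots)
spca = SparsePCA(n_components=3, alpha=2.0, random_state=1).fit(snapshots)

dense_nonzero = np.count_nonzero(pod.components_)
sparse_nonzero = np.count_nonzero(spca.components_)
print(f"POD loadings non-zero:  {dense_nonzero} / {pod.components_.size}")
print(f"SPCA loadings non-zero: {sparse_nonzero} / {spca.components_.size}")
```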
An emerging paradigm: a strength-based approach to exploring mental imagery
MacIntyre, Tadhg E.; Moran, Aidan P.; Collet, Christian; Guillot, Aymeric
2013-01-01
Mental imagery, or the ability to simulate in the mind information that is not currently perceived by the senses, has attracted considerable research interest in psychology since the early 1970's. Within the past two decades, research in this field—as in cognitive psychology more generally—has been dominated by neuroscientific methods that typically involve comparisons between imagery performance of participants from clinical populations with those who exhibit apparently normal cognitive functioning. Although this approach has been valuable in identifying key neural substrates of visual imagery, it has been less successful in understanding the possible mechanisms underlying another simulation process, namely, motor imagery or the mental rehearsal of actions without engaging in the actual movements involved. In order to address this oversight, a “strength-based” approach has been postulated which is concerned with understanding those on the high ability end of the imagery performance spectrum. Guided by the expert performance approach and principles of ecological validity, converging methods have the potential to enable imagery researchers to investigate the neural “signature” of elite performers, for example. Therefore, the purpose of this paper is to explain the origin, nature, and implications of the strength-based approach to mental imagery. Following a brief explanation of the background to this latter approach, we highlight some important theoretical advances yielded by recent research on mental practice, mental travel, and meta-imagery processes in expert athletes and dancers. Next, we consider the methodological implications of using a strength-based approach to investigate imagery processes. The implications for the field of motor cognition are outlined and specific research questions, in dynamic imagery, imagery perspective, measurement, multi-sensory imagery, and metacognition that may benefit from this approach in the future are sketched briefly. PMID:23554591
Shared decision making in chronic care in the context of evidence based practice in nursing.
Friesen-Storms, Jolanda H H M; Bours, Gerrie J J W; van der Weijden, Trudy; Beurskens, Anna J H M
2015-01-01
In the decision-making environment of evidence-based practice, the following three sources of information must be integrated: research evidence of the intervention, clinical expertise, and the patient's values. In reality, evidence-based practice usually focuses on research evidence (which may be translated into clinical practice guidelines) and clinical expertise without considering the individual patient's values. The shared decision-making model seems to be helpful in the integration of the individual patient's values in evidence-based practice. We aim to discuss the relevance of shared decision making in chronic care and to suggest how it can be integrated with evidence-based practice in nursing. We start by describing the following three possible approaches to guide the decision-making process: the paternalistic approach, the informed approach, and the shared decision-making approach. Implementation of shared decision making has gained considerable interest in cases lacking a strong best-treatment recommendation, and when the available treatment options are equivalent to some extent. We discuss that in chronic care it is important to always invite the patient to participate in the decision-making process. We delineate the following six attributes of health care interventions in chronic care that influence the degree of shared decision making: the level of research evidence, the number of available intervention options, the burden of side effects, the impact on lifestyle, the patient group values, and the impact on resources. Furthermore, the patient's willingness to participate in shared decision making, the clinical expertise of the nurse, and the context in which the decision making takes place affect the shared decision-making process. A knowledgeable and skilled nurse with a positive attitude towards shared decision making—integrated with evidence-based practice—can facilitate the shared decision-making process. We conclude that nurses as well as other health care professionals in chronic care should integrate shared decision making with evidence-based practice to deliver patient-centred care. Copyright © 2014 Elsevier Ltd. All rights reserved.
Netchacovitch, L; Thiry, J; De Bleye, C; Dumont, E; Cailletaud, J; Sacré, P-Y; Evrard, B; Hubert, Ph; Ziemons, E
2017-08-15
Since the Food and Drug Administration (FDA) published a guidance based on the Process Analytical Technology (PAT) approach, real-time analyses during manufacturing processes have been expanding rapidly. In this study, in-line Raman spectroscopic analyses were performed during a Hot-Melt Extrusion (HME) process to determine the Active Pharmaceutical Ingredient (API) content in real time. The method was validated based on univariate and multivariate approaches, and the analytical performances of the obtained models were compared. Moreover, on one hand, in-line data were correlated with the real API concentration present in the sample, quantified by a previously validated off-line confocal Raman microspectroscopic method. On the other hand, in-line data were also treated as a function of the concentration based on the weighing of the components in the prepared mixture. The importance of developing quantitative methods based on the use of a reference method was thus highlighted. The method was validated according to the total error approach, fixing the acceptance limits at ±15% and the α risk at 5%. This method meets the requirements of the European Pharmacopoeia norms for the uniformity of content of single-dose preparations. The validation proves that future results will be within the acceptance limits with a previously defined probability. Finally, the in-line validated method was compared with the off-line one to demonstrate its ability to be used in routine analyses. Copyright © 2017 Elsevier B.V. All rights reserved.
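As a rough illustration of the two calibration routes, the sketch below fits a univariate model to a single Raman band and a multivariate PLS model to the full spectrum; the spectra, band position, and concentration range are simulated, not taken from the study:

```python
# Univariate (single band) versus multivariate (PLS) calibration.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(2)
api = rng.uniform(5, 25, size=40)                    # API content, % w/w
spectra = (api[:, None] * np.exp(-0.5 * ((np.arange(500) - 250) / 10) ** 2)
           + rng.normal(scale=0.1, size=(40, 500)))  # one API band + noise

uni = LinearRegression().fit(spectra[:, [250]], api)  # peak intensity only
multi = PLSRegression(n_components=3).fit(spectra, api)

print("univariate R^2:  ", uni.score(spectra[:, [250]], api))
print("multivariate R^2:", multi.score(spectra, api))
```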
NASA Astrophysics Data System (ADS)
Fagbohun, Babatunde Joseph; Olabode, Oluwaseun Franklin; Adebola, Abiodun Olufemi; Akinluyi, Francis Omowonuola
2017-12-01
Identifying landscapes having comparable hydrological characteristics is valuable for the determination of dominant runoff processes (DRPs) and the prediction of floods. The several approaches used for DRP mapping vary in their data and time requirements. Manual approaches, which are based on field investigation and expert knowledge, are time-demanding and difficult to implement at the regional scale. The automatic GIS-based approach, on the other hand, requires simplification of data but is easier to implement and applicable on a regional scale. In this study, a GIS-based automated approach was used to identify the DRPs in the Anambra area. The results showed that Hortonian overland flow (HOF) has the highest coverage of 1508.3 km2 (33.5%), followed by deep percolation (DP) with coverage of 1455.3 km2 (32.3%). Subsurface flow (SSF) is the third dominant runoff process, covering 920.6 km2 (20.4%), while saturated overland flow (SOF) covers the least area, 618.4 km2 (13.7%) of the study area. The results reveal that a considerable amount of precipitated water would infiltrate into the subsurface through the deep percolation process, contributing to groundwater recharge in the study area. However, it is envisaged that HOF and SOF will continue to increase due to the continuous expansion of built-up areas. With the expected increase in HOF and SOF, and the change in rainfall pattern associated with the perpetual problem of climate change, it is paramount that groundwater conservation practices be considered to ensure continued sustainable utilization of groundwater in the study area.
Holistic versus monomeric strategies for hydrological modelling of human-modified hydrosystems
NASA Astrophysics Data System (ADS)
Nalbantis, I.; Efstratiadis, A.; Rozos, E.; Kopsiafti, M.; Koutsoyiannis, D.
2011-03-01
The modelling of human-modified basins that are inadequately measured constitutes a challenge for hydrological science. Often, models for such systems are detailed and hydraulics-based for only one part of the system, while for other parts oversimplified models or rough assumptions are used. This is typically a bottom-up approach, which seeks to exploit knowledge of hydrological processes at the micro-scale for some components of the system. Also, it is a monomeric approach in two ways: first, essential interactions among system components may be poorly represented or even omitted; second, differences in the level of detail of process representation can lead to uncontrolled errors. Additionally, the calibration procedure merely accounts for the reproduction of the observed responses using typical fitting criteria. The paper aims to raise some critical issues regarding the entire modelling approach for such hydrosystems. For this, two alternative modelling strategies are examined that reflect two modelling approaches or philosophies: a dominant bottom-up approach, which is also monomeric and, very often, based on output information, and a top-down and holistic approach based on generalized information. Critical options are examined, which codify the differences between the two strategies: the representation of surface, groundwater and water management processes, the schematization and parameterization concepts and the parameter estimation methodology. The first strategy is based on stand-alone models for surface and groundwater processes and for water management, which are employed sequentially. For each model, a different (detailed or coarse) parameterization is used, which is dictated by the hydrosystem schematization. The second strategy involves model integration for all processes, parsimonious parameterization and hybrid manual-automatic parameter optimization based on multiple objectives. A test case is examined in a hydrosystem in Greece with high complexities, such as extended surface-groundwater interactions, ill-defined boundaries, sinks to the sea and anthropogenic intervention with unmeasured abstractions both from surface water and aquifers. Criteria for comparison are the physical consistency of parameters, the reproduction of runoff hydrographs at multiple sites within the studied basin, the likelihood of uncontrolled model outputs, the required amount of computational effort and the performance within a stochastic simulation setting. Our work allows for investigating the deterioration of model performance in cases where no balanced attention is paid to all components of human-modified hydrosystems and the related information. Also, sources of errors are identified and their combined effect is evaluated.
An Evidential Reasoning-Based CREAM to Human Reliability Analysis in Maritime Accident Process.
Wu, Bing; Yan, Xinping; Wang, Yang; Soares, C Guedes
2017-10-01
This article proposes a modified cognitive reliability and error analysis method (CREAM) for estimating the human error probability in the maritime accident process on the basis of an evidential reasoning approach. This modified CREAM is developed to precisely quantify the linguistic variables of the common performance conditions and to overcome the problem of ignoring the uncertainty caused by incomplete information in the existing CREAM models. Moreover, this article views maritime accident development from the sequential perspective, where a scenario- and barrier-based framework is proposed to describe the maritime accident process. This evidential reasoning-based CREAM approach and the proposed accident development framework are applied to the human reliability analysis of a ship capsizing accident. It will facilitate subjective human reliability analysis in different engineering systems where uncertainty exists in practice. © 2017 Society for Risk Analysis.
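For context, the basic (screening) CREAM step that the article extends can be sketched as follows; the control-mode thresholds here are illustrative rather than Hollnagel's exact diagram, and the HEP intervals are the nominal values commonly quoted in the CREAM literature:

```python
# Toy sketch of basic CREAM screening (not the article's
# evidential-reasoning extension): count common performance conditions
# (CPCs) that reduce or improve reliability, pick a control mode, and
# report its nominal HEP interval.

HEP_INTERVALS = {            # nominal intervals often quoted for CREAM
    "strategic":     (0.5e-5, 1e-2),
    "tactical":      (1e-3, 1e-1),
    "opportunistic": (1e-2, 0.5),
    "scrambled":     (1e-1, 1.0),
}

def control_mode(cpc_effects):
    reduced = sum(1 for e in cpc_effects if e == "reduced")
    improved = sum(1 for e in cpc_effects if e == "improved")
    if reduced >= 6:                 # illustrative thresholds only
        return "scrambled"
    if reduced >= 4:
        return "opportunistic"
    if improved > reduced:
        return "strategic"
    return "tactical"

# Nine CPCs assessed for a capsizing scenario (invented assessments).
effects = ["reduced", "reduced", "not significant", "reduced",
           "improved", "not significant", "reduced", "reduced",
           "not significant"]
mode = control_mode(effects)
print(mode, HEP_INTERVALS[mode])
```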
Global Detection of Live Virtual Machine Migration Based on Cellular Neural Networks
Xie, Kang; Yang, Yixian; Zhang, Ling; Jing, Maohua; Xin, Yang; Li, Zhongxian
2014-01-01
In order to meet the demands of operation monitoring of large-scale, autoscaling, and heterogeneous virtual resources in existing cloud computing, a new live virtual machine (VM) migration detection algorithm based on cellular neural networks (CNNs) is presented. Through analyzing the detection process, the parameter relationship of the CNN is mapped as an optimization problem, in which an improved particle swarm optimization algorithm based on bubble sort is used to solve the problem. Experimental results demonstrate that the proposed method can display the VM migration process intuitively. Compared with the best-fit heuristic algorithm, this approach reduces the processing time, and emerging evidence has indicated that this new approach is amenable to parallelism and analog very large scale integration (VLSI) implementation, allowing the VM migration detection to be performed better. PMID:24959631
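A compact sketch of plain particle swarm optimization illustrates the parameter-tuning step; the paper's bubble-sort-based improvement is not reproduced, and the objective function is a stand-in for the actual detection error:

```python
# Plain PSO tuning two illustrative parameters against a toy objective.
import numpy as np

rng = np.random.default_rng(3)

def objective(p):                    # stand-in for detection error
    return np.sum((p - np.array([1.5, -0.7])) ** 2)

n, dim, iters = 20, 2, 100
x = rng.uniform(-5, 5, (n, dim))     # particle positions
v = np.zeros((n, dim))               # velocities
pbest, pval = x.copy(), np.apply_along_axis(objective, 1, x)
gbest = pbest[pval.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = x + v
    val = np.apply_along_axis(objective, 1, x)
    better = val < pval
    pbest[better], pval[better] = x[better], val[better]
    gbest = pbest[pval.argmin()].copy()

print(gbest)   # should approach [1.5, -0.7]
```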
Schulz, Amy J.; Israel, Barbara A.; Coombe, Chris M.; Gaines, Causandra; Reyes, Angela G.; Rowe, Zachary; Sand, Sharon; Strong, Larkin L.; Weir, Sheryl
2010-01-01
The elimination of persistent health inequities requires the engagement of multiple perspectives, resources and skills. Community-based participatory research is one approach to developing action strategies that promote health equity by addressing contextual as well as individual level factors, and that can contribute to addressing more fundamental factors linked to health inequity. Yet many questions remain about how to implement participatory processes that engage local insights and expertise, are informed by the existing public health knowledge base, and build support across multiple sectors to implement solutions. We describe a CBPR approach used to conduct a community assessment and action planning process, culminating in development of a multilevel intervention to address inequalities in cardiovascular disease in Detroit, Michigan. We consider implications for future efforts to engage communities in developing strategies toward eliminating health inequities. PMID:21873580
Investigation of Low-Reynolds-Number Rocket Nozzle Design Using PNS-Based Optimization Procedure
NASA Technical Reports Server (NTRS)
Hussaini, M. Moin; Korte, John J.
1996-01-01
An optimization approach to rocket nozzle design, based on computational fluid dynamics (CFD) methodology, is investigated for low-Reynolds-number cases. This study is undertaken to determine the benefits of this approach over those of classical design processes such as Rao's method. A CFD-based optimization procedure, using the parabolized Navier-Stokes (PNS) equations, is used to design conical and contoured axisymmetric nozzles. The advantage of this procedure is that it accounts for viscosity during the design process; other processes make an approximated boundary-layer correction after an inviscid design is created. Results showed significant improvement in the nozzle thrust coefficient over that of the baseline case; however, the unusual nozzle design necessitates further investigation of the accuracy of the PNS equations for modeling expanding flows with thick laminar boundary layers.
Chemical processing of glasses
NASA Astrophysics Data System (ADS)
Laine, Richard M.
1990-11-01
The development of chemical processing methods for the fabrication of glass and ceramic shapes for photonic applications is frequently Edisonian in nature. In part, this is because the numerous variables that must be optimized to obtain a given material with a specific shape and particular properties cannot be readily defined based on fundamental principles. In part, the problems arise because the basic chemistry of common chemical processing systems has not been fully delineated. The purpose of this paper is to provide an overview of the basic chemical problems associated with chemical processing. The emphasis will be on sol-gel processing, a major subset of chemical processing. Two alternate approaches to chemical processing of glasses are also briefly discussed. One approach concerns the use of bimetallic alkoxide oligomers and polymers as potential precursors to multimetallic glasses. The second approach describes the utility of metal carboxylate precursors to multimetallic glasses.
Radiology information system: a workflow-based approach.
Zhang, Jinyan; Lu, Xudong; Nie, Hongchao; Huang, Zhengxing; van der Aalst, W M P
2009-09-01
Introducing workflow management technology into healthcare is promising for dealing with the problem that current healthcare information systems cannot sufficiently support process management, although several challenges still exist. The purpose of this paper is to study the method of developing a workflow-based information system, with the radiology department as a use case. First, a workflow model of the typical radiology process was established. Second, based on the model, the system was designed and implemented as a group of loosely coupled components. Each component corresponded to one task in the process and could be assembled by the workflow management system. Legacy systems could be treated as special components, which also corresponded to tasks and were integrated by transforming non-workflow-aware interfaces into standard ones. Finally, a workflow dashboard was designed and implemented to provide an integral view of radiology processes. The workflow-based Radiology Information System was deployed in the radiology department of Zhejiang Chinese Medicine Hospital in China. The results showed that it could be adjusted flexibly in response to the needs of changing processes, and enhanced process management in the department. It also provides a more workflow-aware integration method compared with other methods such as IHE-based ones. The workflow-based approach is a new method of developing radiology information systems with more flexibility, more process-management functionality and more workflow-aware integration. The work of this paper is an initial endeavor toward introducing workflow management technology in healthcare.
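The component idea can be sketched briefly: each task is a loosely coupled component, and a legacy system is wrapped so it exposes the same task interface to the engine. Class and function names below are hypothetical, not those of the deployed system:

```python
# Loosely coupled task components assembled by a minimal workflow engine.
class Task:
    def run(self, order):
        raise NotImplementedError

class Registration(Task):
    def run(self, order):
        order["registered"] = True
        return order

class LegacyPACSAdapter(Task):
    """Wraps a non-workflow-aware legacy interface as a standard task."""
    def run(self, order):
        order["images_archived"] = legacy_store(order["patient"])
        return order

def legacy_store(patient):            # stand-in for the legacy API call
    return f"archived:{patient}"

def engine(tasks, order):
    for task in tasks:                # assembly defined by the process model
        order = task.run(order)
    return order

print(engine([Registration(), LegacyPACSAdapter()], {"patient": "P001"}))
```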
NASA Astrophysics Data System (ADS)
Rodrigues, João Fabrício Mota; Coelho, Marco Túlio Pacheco; Ribeiro, Bruno R.
2018-04-01
Species distribution models (SDMs) have been broadly used in ecology to address theoretical and practical problems. Currently, there are two main approaches to generating SDMs: (i) correlative models, which are based on species occurrences and environmental predictor layers, and (ii) process-based models, which are constructed from species' functional traits and physiological tolerances. The distributions estimated by each approach are based on different components of the species' niche. Predictions of correlative models approach species' realized niches, while predictions of process-based models are more akin to species' fundamental niches. Here, we integrated the predictions of fundamental and realized distributions of the freshwater turtle Trachemys dorbigni. The fundamental distribution was estimated using data on T. dorbigni's egg incubation temperature, and the realized distribution was estimated using species occurrence records. Both types of distribution were estimated using the same regression approaches (logistic regression and support vector machines), considering both macroclimatic and microclimatic temperatures. The realized distribution of T. dorbigni was generally nested in its fundamental distribution, reinforcing the theoretical assumption that a species' realized niche is a subset of its fundamental niche. Both modelling algorithms produced similar results, but microtemperature generated better results than macrotemperature for the incubation model. Finally, our results reinforce the conclusion that species' realized distributions are constrained by factors other than just thermal tolerances.
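A minimal sketch of the correlative side, assuming invented occurrence data and invented incubation limits, shows how a fitted realized range can be compared against fundamental thermal limits:

```python
# Logistic-regression SDM on temperature; the "fundamental" range is a
# hard-coded stand-in for incubation-derived physiological limits.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
temp = rng.uniform(10, 35, size=500)                 # site temperatures, deg C
presence = ((temp > 18) & (temp < 30)).astype(int)   # toy realized niche
presence ^= (rng.random(500) < 0.05).astype(int)     # 5% label noise

X = np.column_stack([temp, temp ** 2])               # allow a unimodal response
model = LogisticRegression(max_iter=1000).fit(X, presence)

grid = np.linspace(10, 35, 200)
prob = model.predict_proba(np.column_stack([grid, grid ** 2]))[:, 1]
realized = grid[prob > 0.5]
fundamental = grid[(grid > 15) & (grid < 33)]        # assumed incubation limits
print(f"realized:    {realized.min():.1f}-{realized.max():.1f} C")
print(f"fundamental: {fundamental.min():.1f}-{fundamental.max():.1f} C")
```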
Monitoring autocorrelated process: A geometric Brownian motion process approach
NASA Astrophysics Data System (ADS)
Li, Lee Siaw; Djauhari, Maman A.
2013-09-01
Autocorrelated process control is common in today's industrial process control practice. The current practice is to eliminate the autocorrelation by using an appropriate model, such as Box-Jenkins models or other models, and then to conduct the process control operation based on the residuals. In this paper we show that many time series are governed by a geometric Brownian motion (GBM) process. In this case, by using the properties of a GBM process, we need only an appropriate transformation and a model of the transformed data to satisfy the conditions required in traditional process control. An industrial example of a cocoa powder production process in a Malaysian company is presented and discussed to illustrate the advantages of the GBM approach.
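The key GBM property is that log-increments are i.i.d. normal, so after a log transformation an ordinary individuals chart applies. A sketch with simulated data (not the cocoa powder series) follows:

```python
# If X_t follows GBM, log(X_t / X_{t-1}) is i.i.d. normal, so a
# Shewhart-style individuals chart works on the transformed series.
import numpy as np

rng = np.random.default_rng(5)
mu, sigma, n = 0.002, 0.01, 500
log_returns = rng.normal(mu - 0.5 * sigma**2, sigma, size=n)
x = 100 * np.exp(np.cumsum(log_returns))      # simulated GBM path

r = np.diff(np.log(x))                        # transformation step
center, sd = r.mean(), r.std(ddof=1)
ucl, lcl = center + 3 * sd, center - 3 * sd   # 3-sigma control limits
out = np.where((r > ucl) | (r < lcl))[0]
print(f"{len(out)} of {len(r)} points outside the 3-sigma limits")
```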
Ji, Xiaonan; Ritter, Alan; Yen, Po-Yin
2017-05-01
Systematic Reviews (SRs) are utilized to summarize evidence from high quality studies and are considered the preferred source of evidence-based practice (EBP). However, conducting SRs can be time and labor intensive due to the high cost of article screening. In previous studies, we demonstrated utilizing established (lexical) article relationships to facilitate the identification of relevant articles in an efficient and effective manner. Here we propose to enhance article relationships with background semantic knowledge derived from Unified Medical Language System (UMLS) concepts and ontologies. We developed a pipelined semantic concepts representation process to represent articles from an SR in an optimized and enriched semantic space of UMLS concepts. Throughout the process, we leveraged concepts and concept relations encoded in biomedical ontologies (SNOMED-CT and MeSH) within the UMLS framework to prompt concept features of each article. Article relationships (similarities) were established and represented as a semantic article network, which was readily applied to assist with the article screening process. We incorporated the concept of active learning to simulate an interactive article recommendation process, and evaluated the performance on 15 completed SRs. We used work saved over sampling at 95% recall (WSS95) as the performance measure. We compared the WSS95 performance of our ontology-based semantic approach to existing lexical feature approaches and corpus-based semantic approaches, and found that we had better WSS95 in most SRs. We also had the highest average WSS95 of 43.81% and the highest total WSS95 of 657.18%. We demonstrated using ontology-based semantics to facilitate the identification of relevant articles for SRs. Effective concepts and concept relations derived from UMLS ontologies can be utilized to establish article semantic relationships. Our approach provided promising performance and can be easily applied to any SR topic in the biomedical domain. Copyright © 2017 Elsevier Inc. All rights reserved.
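The evaluation metric can be made concrete. Given a ranked screening order, work saved over sampling at 95% recall is the fraction of articles left unscreened once 95% of the relevant ones have been found, minus the 5% recall concession; the positions below are invented:

```python
# WSS@95 from a ranked screening order (toy numbers only).
import numpy as np

def wss_at_recall(relevant_positions, n_total, recall=0.95):
    """relevant_positions: sorted 1-based positions of the relevant
    articles in the screening order proposed by the model."""
    n_rel = len(relevant_positions)
    needed = int(np.ceil(recall * n_rel))
    cutoff = relevant_positions[needed - 1]  # articles screened to hit recall
    return (n_total - cutoff) / n_total - (1 - recall)

# 10 relevant articles among 1000; the model surfaces most of them early.
positions = sorted([3, 7, 12, 20, 33, 41, 58, 90, 140, 700])
print(f"WSS@95 = {wss_at_recall(positions, 1000):.2%}")
```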
ERIC Educational Resources Information Center
Chu, Hui-Chun; Chang, Shao-Chen
2014-01-01
Although educational computer games have been recognized as being a promising approach, previous studies have indicated that, without supportive models, students might only show temporary interest during the game-based learning process, and their learning performance is often not as good as expected. Therefore, in this paper, a two-tier test…
ERIC Educational Resources Information Center
Lee, Chien-I; Yang, Ya-Fei; Mai, Shin-Yi
2016-01-01
Web-based peer assessment has been considered an important process for learning. However, students may not offer constructive feedback due to lack of expertise knowledge. Therefore, this study proposed a scaffolded assessment approach accordingly. To evaluate the effectiveness of the proposed approach, the quasi-experimental design was employed to…
ERIC Educational Resources Information Center
Rapp, Brenda; Miozzo, Michele
2011-01-01
The papers in this special issue of "Language and Cognitive Processing" on the neural bases of language production illustrate two general approaches in current cognitive neuroscience. One approach focuses on investigating cognitive issues, making use of the logic of associations/dissociations or the logic of neural markers as key investigative…
ERIC Educational Resources Information Center
Zimmermann, Judith; Brodersen, Kay H.; Heinimann, Hans R.; Buhmann, Joachim M.
2015-01-01
The graduate admissions process is crucial for controlling the quality of higher education, yet, rules-of-thumb and domain-specific experiences often dominate evidence-based approaches. The goal of the present study is to dissect the predictive power of undergraduate performance indicators and their aggregates. We analyze 81 variables in 171…
Multiple constraint analysis of regional land-surface carbon flux
D.P. Turner; M. Göckede; B.E. Law; W.D. Ritts; W.B. Cohen; Z. Yang; T. Hudiburg; R. Kennedy; M. Duane
2011-01-01
We applied and compared bottom-up (process model-based) and top-down (atmospheric inversion-based) scaling approaches to evaluate the spatial and temporal patterns of net ecosystem production (NEP) over a 2.5 × 10^5 km2 area (the state of Oregon) in the western United States. Both approaches indicated a carbon sink over this...
A PetriNet-Based Approach for Supporting Traceability in Cyber-Physical Manufacturing Systems
Huang, Jiwei; Zhu, Yeping; Cheng, Bo; Lin, Chuang; Chen, Junliang
2016-01-01
With the growing popularity of complex dynamic activities in manufacturing processes, traceability of the entire life of every product has drawn significant attention especially for food, clinical materials, and similar items. This paper studies the traceability issue in cyber-physical manufacturing systems from a theoretical viewpoint. Petri net models are generalized for formulating dynamic manufacturing processes, based on which a detailed approach for enabling traceability analysis is presented. Models as well as algorithms are carefully designed, which can trace back the lifecycle of a possibly contaminated item. A practical prototype system for supporting traceability is designed, and a real-life case study of a quality control system for bee products is presented to validate the effectiveness of the approach. PMID:26999141
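The backward-tracing idea can be sketched on a Petri-net-like event log in a few lines; the transition records and item names are invented, and the paper's actual models and algorithms are more general:

```python
# Backward tracing over fired transitions: each record says which input
# batches produced which outputs, so a contaminated item can be traced
# to every upstream item that could have affected it.

TRANSITIONS = [
    # (name, inputs, outputs) of fired transitions, i.e. the event log
    ("blend",  ["honey_lot_A", "honey_lot_B"], ["blend_1"]),
    ("bottle", ["blend_1", "jar_batch_9"],     ["product_77"]),
]

def trace_back(item):
    """Return every upstream item in the lifecycle of `item`."""
    origins = set()
    frontier = [item]
    while frontier:
        current = frontier.pop()
        for _, inputs, outputs in TRANSITIONS:
            if current in outputs:
                for src in inputs:
                    if src not in origins:
                        origins.add(src)
                        frontier.append(src)
    return origins

print(trace_back("product_77"))
# {'blend_1', 'honey_lot_A', 'honey_lot_B', 'jar_batch_9'}
```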
Ground robotic measurement of aeolian processes
USDA-ARS?s Scientific Manuscript database
Models of aeolian processes rely on accurate measurements of the rates of sediment transport by wind, and careful evaluation of the environmental controls of these processes. Existing field approaches typically require intensive, event-based experiments involving dense arrays of instruments. These d...
Application of agent-based system for bioprocess description and process improvement.
Gao, Ying; Kipling, Katie; Glassey, Jarka; Willis, Mark; Montague, Gary; Zhou, Yuhong; Titchener-Hooker, Nigel J
2010-01-01
Modeling plays an important role in bioprocess development for design and scale-up. Predictive models can also be used in biopharmaceutical manufacturing to assist decision-making either to maintain process consistency or to identify optimal operating conditions. To predict the whole bioprocess performance, the strong interactions present in a processing sequence must be adequately modeled. Traditionally, bioprocess modeling considers process units separately, which makes it difficult to capture the interactions between units. In this work, a systematic framework is developed to analyze the bioprocesses based on a whole process understanding and considering the interactions between process operations. An agent-based approach is adopted to provide a flexible infrastructure for the necessary integration of process models. This enables the prediction of overall process behavior, which can then be applied during process development or once manufacturing has commenced, in both cases leading to the capacity for fast evaluation of process improvement options. The multi-agent system comprises a process knowledge base, process models, and a group of functional agents. In this system, agent components co-operate with each other in performing their tasks. These include the description of the whole process behavior, evaluating process operating conditions, monitoring of the operating processes, predicting critical process performance, and providing guidance to decision-making when coping with process deviations. During process development, the system can be used to evaluate the design space for process operation. During manufacture, the system can be applied to identify abnormal process operation events and then to provide suggestions as to how best to cope with the deviations. In all cases, the function of the system is to ensure an efficient manufacturing process. The implementation of the agent-based approach is illustrated via selected application scenarios, which demonstrate how such a framework may enable the better integration of process operations by providing a plant-wide process description to facilitate process improvement. Copyright 2009 American Institute of Chemical Engineers
Time series modeling by a regression approach based on a latent process.
Chamroukhi, Faicel; Samé, Allou; Govaert, Gérard; Aknin, Patrice
2009-01-01
Time series are used in many domains, including finance, engineering, economics and bioinformatics, generally to represent the change of a measurement over time. Modeling techniques may then be used to give a synthetic representation of such data. A new approach for time series modeling is proposed in this paper. It consists of a regression model incorporating a discrete hidden logistic process that allows different polynomial regression models to be activated smoothly or abruptly. The model parameters are estimated by the maximum likelihood method, performed by a dedicated Expectation Maximization (EM) algorithm. The M-step of the EM algorithm uses a multi-class Iterative Reweighted Least-Squares (IRLS) algorithm to estimate the hidden process parameters. To evaluate the proposed approach, an experimental study on simulated data and real-world data was performed using two alternative approaches: a heteroskedastic piecewise regression model using a global optimization algorithm based on dynamic programming, and a Hidden Markov Regression Model whose parameters are estimated by the Baum-Welch algorithm. Finally, in the context of the remote monitoring of components of the French railway infrastructure, and more particularly the switch mechanism, the proposed approach has been applied to modeling and classifying time series representing the condition measurements acquired during switch operations.
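In generic notation (assumed here, not copied from the paper), the model class can be written as K polynomial regimes selected by a hidden logistic process:

```latex
% Hidden logistic process: z_t selects one of K polynomial regimes,
% with time-varying logistic weights governing smooth or abrupt switching
\Pr(z_t = k \mid t; \mathbf{w}) =
  \frac{\exp(w_{k0} + w_{k1}t)}{\sum_{\ell=1}^{K}\exp(w_{\ell 0} + w_{\ell 1}t)}

% Conditional regression given the active regime,
% with r_t = (1, t, \dots, t^p)^\top the polynomial covariate vector:
y_t \mid z_t = k \;\sim\;
  \mathcal{N}\!\left(\boldsymbol{\beta}_k^{\top}\mathbf{r}_t,\; \sigma_k^2\right)
```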
Predicting influent biochemical oxygen demand: Balancing energy demand and risk management.
Zhu, Jun-Jie; Kang, Lulu; Anderson, Paul R
2018-01-01
Ready access to comprehensive influent information can help water reclamation plant (WRP) operators implement better real-time process controls, provide operational reliability and reduce energy consumption. The five-day biochemical oxygen demand (BOD5), a critical parameter for WRP process control, is expensive and difficult to measure using hard-sensors. An alternative approach based on a soft-sensor methodology shows promise, but can be problematic when used to predict high BOD5 values. Underestimating high BOD5 concentrations for process control could result in an insufficient amount of aeration, increasing the risk of an effluent violation. To address this issue, we tested a hierarchical hybrid soft-sensor approach involving multiple linear regression, artificial neural networks (ANN), and compromise programming. While this hybrid approach results in a slight decrease in overall prediction accuracy relative to the approach based on ANN only, the underestimation percentage is substantially lower (37% vs. 61%) for predictions of carbonaceous BOD5 (CBOD5) concentrations higher than the long-term average value. The hybrid approach is also flexible and can be adjusted depending on the relative importance between energy savings and managing the risk of an effluent violation. Copyright © 2017 Elsevier Ltd. All rights reserved.
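A rough sketch of the hybrid idea, with synthetic data and an invented compromise criterion (not the authors' exact hierarchy): blend a linear model and an ANN, choosing the blend weight that balances overall error against the underestimation rate on high-load samples:

```python
# Compromise search over blend weights between a linear model and an ANN.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(6)
X = rng.normal(size=(400, 5))                    # influent covariates
y = 150 + 30 * X[:, 0] + 20 * X[:, 1] ** 2 + rng.normal(0, 10, 400)

lin = LinearRegression().fit(X, y)
ann = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000,
                   random_state=0).fit(X, y)

high = y > y.mean()                              # high-load events
best_w, best_score = None, np.inf
for w in np.linspace(0, 1, 11):                  # compromise search
    pred = w * lin.predict(X) + (1 - w) * ann.predict(X)
    rmse = np.sqrt(np.mean((pred - y) ** 2))
    under = np.mean(pred[high] < y[high])        # underestimation rate
    score = 0.5 * rmse / y.std() + 0.5 * under   # weighted criteria
    if score < best_score:
        best_w, best_score = w, score
print(f"chosen blend weight on linear model: {best_w:.1f}")
```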
NASA Astrophysics Data System (ADS)
Steenhuis, T. S.; Mendoza, G.; Lyon, S. W.; Gerard Marchant, P.; Walter, M. T.; Schneiderman, E.
2003-04-01
Because the traditional Soil Conservation Service Curve Number (SCS-CN) approach continues to be ubiquitously used in GIS-based water quality models, new application methods are needed that are consistent with variable source area (VSA) hydrological processes in the landscape. We developed, within an integrated GIS modeling environment, a distributed approach for applying the traditional SCS-CN equation to watersheds where VSA hydrology is a dominant process. Spatial representation of hydrologic processes is important for watershed planning because restricting potentially polluting activities from runoff source areas is fundamental to controlling non-point source pollution. The methodology presented here uses the traditional SCS-CN method to predict runoff volume and the spatial extent of saturated areas, and uses a topographic index to distribute runoff source areas through watersheds. The resulting distributed CN-VSA method was incorporated in an existing GWLF water quality model and applied to sub-watersheds of the Delaware basin in the Catskill Mountains region of New York State. We found that the distributed CN-VSA approach provided a physically-based method that gives realistic results for watersheds with VSA hydrology.
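For reference, the traditional SCS-CN runoff equation that the distributed CN-VSA method spatially redistributes is the standard form below (with the common initial-abstraction assumption):

```latex
% SCS-CN runoff equation: Q is runoff depth, P is rainfall depth,
% I_a is the initial abstraction (commonly I_a = 0.2S), and S is the
% potential maximum retention derived from the curve number CN:
Q = \frac{(P - I_a)^2}{P - I_a + S}, \qquad P > I_a

S = \frac{1000}{CN} - 10 \quad \text{(inches)}
```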
Advances in Proteomics Data Analysis and Display Using an Accurate Mass and Time Tag Approach
Zimmer, Jennifer S.D.; Monroe, Matthew E.; Qian, Wei-Jun; Smith, Richard D.
2007-01-01
Proteomics has recently demonstrated utility in understanding cellular processes on the molecular level as a component of systems biology approaches and for identifying potential biomarkers of various disease states. The large amount of data generated by utilizing high efficiency (e.g., chromatographic) separations coupled to high mass accuracy mass spectrometry for high-throughput proteomics analyses presents challenges related to data processing, analysis, and display. This review focuses on recent advances in nanoLC-FTICR-MS-based proteomics approaches and the accompanying data processing tools that have been developed to display and interpret the large volumes of data being produced. PMID:16429408
Emergence of Scaffold-free Approaches for Tissue Engineering Musculoskeletal Cartilages
DuRaine, Grayson D.; Brown, Wendy E.; Hu, Jerry C.; Athanasiou, Kyriacos A.
2014-01-01
This review explores scaffold-free methods as an additional paradigm for tissue engineering. Musculoskeletal cartilages (for example, articular cartilage, meniscus, temporomandibular joint disc, and intervertebral disc) are characterized by low vascularity and cellularity, and are amenable to scaffold-free tissue engineering approaches. Scaffold-free approaches, particularly the self-assembling process, mimic elements of developmental processes underlying these tissues. Discussed are various scaffold-free approaches for musculoskeletal cartilage tissue engineering, such as cell sheet engineering, aggregation, and the self-assembling process, as well as the availability and variety of cells used. Immunological considerations are of particular importance as engineered tissues are frequently of allogeneic, if not xenogeneic, origin. Factors that enhance the matrix production and mechanical properties of these engineered cartilages are also reviewed, as the fabrication of biomimetically suitable tissues is necessary to replicate function and ensure graft survival in vivo. The concept of combining scaffold-free and scaffold-based tissue engineering methods to address clinical needs is also discussed. Inasmuch as scaffold-based musculoskeletal tissue engineering approaches have been employed as a paradigm to generate engineered cartilages with appropriate functional properties, scaffold-free approaches are emerging as promising elements of a translational pathway not only for musculoskeletal cartilages but for other tissues as well. PMID:25331099
Mathematizing: An Emergent Math Curriculum Approach for Young Children
ERIC Educational Resources Information Center
Rosales, Allen C.
2015-01-01
Based on years of research with early childhood teachers, author Allen Rosales provides an approach to create an emergent math curriculum that integrates children's interests with math concepts. The mathematizing approach is different from traditional math curriculums, as it immerses children in a process that is designed to develop their…
76 FR 72220 - Incorporation of Risk Management Concepts in Regulatory Programs
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-22
... and support the adoption of improved designs or processes. \1\ A deterministic approach to regulation... longstanding goal to move toward more risk-informed, performance-based approaches in its regulatory programs... regulatory approach that would continue to ensure the safe and secure use of nuclear material. As part of...
Cross-Evaluation of Degree Programmes in Higher Education
ERIC Educational Resources Information Center
Kettunen, Juha
2010-01-01
Purpose: This study seeks to develop and describe the benchmarking approach of enhancement-led evaluation in higher education and to present a cross-evaluation process for degree programmes. Design/methodology/approach: The benchmarking approach produces useful information for the development of degree programmes based on self-evaluation,…
A functional language approach in high-speed digital simulation
NASA Technical Reports Server (NTRS)
Ercegovac, M. D.; Lu, S.-L.
1983-01-01
A functional programming approach for a multi-microprocessor architecture is presented. The language, based on Backus FP, its intermediate form, and the translation process are discussed and illustrated with an example. The approach allows performance analysis to be performed at a high level as an aid in program partitioning.
ERIC Educational Resources Information Center
Smith, Corinne Roth
A multidimensional approach to assessment of children with learning difficulties is examined. The approach explores factors along five dimensions: (1) learner characteristics (motivation, social-emotional maturity, cognitive abilities and styles); (2) task-based contributors (match of tasks to maturational levels and to cognitive style); (3)…
[Optimize dropping process of Ginkgo biloba dropping pills by using design space approach].
Shen, Ji-Chen; Wang, Qing-Qing; Chen, An; Pan, Fang-Lai; Gong, Xing-Chu; Qu, Hai-Bin
2017-07-01
In this paper, a design space approach was applied to optimize the dropping process of Ginkgo biloba dropping pills. Firstly, potential critical process parameters and potential critical quality attributes were determined through literature research and pre-experiments. Secondly, experiments were carried out according to a Box-Behnken design. Then the critical process parameters and critical quality attributes were determined based on the experimental results. Thirdly, second-order polynomial models were used to describe the quantitative relationships between critical process parameters and critical quality attributes. Finally, a probability-based design space was calculated and verified. The verification results showed that efficient production of Ginkgo biloba dropping pills can be guaranteed by operating within the design space. The recommended operating ranges for the critical dropping process parameters were as follows: dropping distance of 5.5-6.7 cm, and dropping speed of 59-60 drops per minute, providing a reference for industrial production of Ginkgo biloba dropping pills. Copyright© by the Chinese Pharmaceutical Association.
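As a rough illustration of what a probability-based design space computation involves, the sketch below fits a second-order polynomial to designed-experiment runs and then uses Monte Carlo draws to map the probability of meeting a specification over the two dropping parameters. All numbers, the response variable, and the specification limit are invented, not the paper's data or model.

```python
# Hypothetical probability-based design space sketch: quadratic response
# surface fit to Box-Behnken-style runs, then P(CQA meets spec) over a grid
# of the two critical process parameters (CPPs).
import numpy as np

rng = np.random.default_rng(0)

# Illustrative runs: (dropping distance cm, drops per minute) -> CQA score
X = np.array([[5.0, 58], [7.0, 58], [5.0, 62], [7.0, 62],
              [5.0, 60], [7.0, 60], [6.0, 58], [6.0, 62], [6.0, 60]])
y = np.array([92.1, 90.3, 93.0, 91.5, 93.5, 92.0, 91.2, 92.8, 94.1])

def quad_features(X):
    d, s = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(d), d, s, d * s, d**2, s**2])

A = quad_features(X)
beta, *_ = np.linalg.lstsq(A, y, rcond=None)          # second-order model
resid_sd = np.sqrt(np.sum((y - A @ beta) ** 2) / (len(y) - A.shape[1]))

# Probability map: P(CQA >= 92) on a CPP grid, via Monte Carlo prediction draws
dist = np.linspace(5.0, 7.0, 41)
speed = np.linspace(58, 62, 41)
D, S = np.meshgrid(dist, speed)
mu = quad_features(np.column_stack([D.ravel(), S.ravel()])) @ beta
draws = rng.normal(mu, resid_sd, size=(2000, mu.size))
p_ok = (draws >= 92.0).mean(axis=0).reshape(D.shape)

design_space = p_ok >= 0.9   # keep settings with >=90% probability of passing
print("fraction of grid inside design space:", design_space.mean())
```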
Novel approach for solid state cryocoolers.
Volpi, Azzurra; Di Lieto, Alberto; Tonelli, Mauro
2015-04-06
Laser cooling in solids is based on anti-Stokes luminescence, in which lattice phonons are annihilated to make up the energy difference between the absorbed photons and the higher-energy emitted ones. Usually the anti-Stokes process is obtained using a rare-earth active ion, such as Yb. In this work we demonstrate a novel approach for optical cooling based not only on the Yb anti-Stokes cycle but also on virtuous energy-transfer processes from the active ion, obtaining an increase of the cooling efficiency of a LiYF(4) (YLF) single crystal doped with 5 at.% Yb and a controlled co-doping of 0.0016% thulium ions. A model for efficiency enhancement based on Yb-Tm energy transfer is also suggested.
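For background, the standard single-ion bookkeeping for anti-Stokes cooling efficiency (textbook optical-refrigeration material, not this paper's Yb-Tm transfer model) is

\[ \eta_c = \eta_{\mathrm{ext}}\,\eta_{\mathrm{abs}}\,\frac{\lambda}{\lambda_f} - 1, \]

where \( \lambda \) is the pump wavelength, \( \lambda_f \) the mean fluorescence wavelength, \( \eta_{\mathrm{ext}} \) the external quantum efficiency and \( \eta_{\mathrm{abs}} \) the absorption efficiency; net cooling requires \( \eta_c > 0 \), i.e. pumping at \( \lambda > \lambda_f \). Co-doping schemes such as the Yb-Tm one described here aim to push the effective efficiency beyond this single-ion limit.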
NASA Astrophysics Data System (ADS)
Priyono; Wena, Made; Rahardjo, Boedi
2017-09-01
Experts and practitioners agree that the quality of higher education in Indonesia needs to be improved significantly and continuously. The low quality of university graduates is caused by many factors, one of which is the poor quality of learning. Today's instruction process tends to place great emphasis only on delivering knowledge. To avoid the pitfalls of such instruction, e.g. passive learning, Civil Engineering students should be given more opportunities to interact with others and actively participate in the learning process. Based on a number of theoretical and empirical studies, one appropriate strategy to overcome the aforementioned problem is to develop and implement an activity-based learning approach.
Neural Network Based Modeling and Analysis of LP Control Surface Allocation
NASA Technical Reports Server (NTRS)
Langari, Reza; Krishnakumar, Kalmanje; Gundy-Burlet, Karen
2003-01-01
This paper presents an approach to interpretive modeling of LP based control allocation in intelligent flight control. The emphasis is placed on a nonlinear interpretation of the LP allocation process as a static map to support analytical study of the resulting closed loop system, albeit in approximate form. The approach makes use of a bi-layer neural network to capture the essential functioning of the LP allocation process. It is further shown via Lyapunov based analysis that under certain relatively mild conditions the resulting closed loop system is stable. Some preliminary conclusions from a study at Ames are stated and directions for further research are given at the conclusion of the paper.
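As a loose sketch of the idea, the snippet below trains a small two-hidden-layer network to approximate a min-effort LP allocation map. The effectiveness matrix, deflection limits and network size are invented, and scipy/sklearn stand in for whatever tooling the authors used.

```python
# Approximating an LP-based control-allocation map with a small neural
# network (illustrative, not the paper's NASA implementation).
import numpy as np
from scipy.optimize import linprog
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
B = np.array([[0.8, -0.6, 0.3, 0.1],      # maps 4 surface deflections to
              [0.2, 0.7, -0.5, 0.4]])     # 2 commanded moments (illustrative)

def lp_allocate(d):
    """Min-effort allocation: minimize sum |u_i| s.t. B u = d, |u_i| <= 1,
    via the standard split u = up - un with up, un in [0, 1]."""
    n = B.shape[1]
    res = linprog(np.ones(2 * n), A_eq=np.hstack([B, -B]), b_eq=d,
                  bounds=[(0, 1.0)] * (2 * n))
    return res.x[:n] - res.x[n:] if res.success else np.zeros(n)

# Sample demanded moments and their LP allocations as training data
D = rng.uniform(-0.5, 0.5, size=(500, 2))
U = np.array([lp_allocate(d) for d in D])

# Two-hidden-layer network as a static approximation of the LP map
net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
net.fit(D, U)
d_test = np.array([[0.2, -0.1]])
print("LP:", lp_allocate(d_test[0]).round(3), "NN:", net.predict(d_test)[0].round(3))
```

A static surrogate like this is what makes Lyapunov-style closed-loop analysis tractable, since the piecewise LP solution is replaced by a smooth map.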
Psychological Processing in Chronic Pain: A Neural Systems Approach
Simons, Laura; Elman, Igor; Borsook, David
2014-01-01
Our understanding of chronic pain involves complex brain circuits that include sensory, emotional, cognitive and interoceptive processing. The feed-forward interactions between physical (e.g., trauma) and emotional pain and the consequences of altered psychological status on the expression of pain have made the evaluation and treatment of chronic pain a challenge in the clinic. By understanding the neural circuits involved in psychological processes, a mechanistic approach to the implementation of psychology-based treatments may be better understood. In this review we evaluate some of the principal processes that may be altered as a consequence of chronic pain in the context of localized and integrated neural networks. These changes are ongoing, vary in magnitude and hierarchical manifestation, may be temporally and sequentially altered by treatments, and all contribute to an overall pain phenotype. Furthermore, we link altered psychological processes to specific evidence-based treatments to put forth a model of pain neuroscience psychology. PMID:24374383
Instances selection algorithm by ensemble margin
NASA Astrophysics Data System (ADS)
Saidi, Meryem; Bechar, Mohammed El Amine; Settouti, Nesma; Chikh, Mohamed Amine
2018-05-01
The main limit of data mining algorithms is their inability to deal with the huge amount of available data in a reasonable processing time. One solution for producing fast and accurate results is instance and feature selection. This process eliminates noisy or redundant data in order to reduce storage and computational cost without performance degradation. In this paper, a new instance selection approach called the Ensemble Margin Instance Selection (EMIS) algorithm is proposed. This approach is based on the ensemble margin. To evaluate our approach, we have conducted several experiments on different real-world classification problems from the UCI Machine Learning Repository. Pixel-based image segmentation is a field where the storage requirements and computational cost of the applied model become high. To address these limitations, we conduct a study based on the application of EMIS and other instance selection techniques to the segmentation and automatic recognition of white blood cells (WBC; nucleus and cytoplasm) in cytological images.
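The core quantity here is the ensemble margin. The sketch below shows one common margin computation and a threshold-based selection step; the paper's EMIS algorithm has more machinery (its own margin variant and selection loop), so treat this as the underlying idea only, with an invented threshold.

```python
# Ensemble-margin-based instance selection, core idea: train a bagging
# ensemble, compute each instance's vote margin, keep high-margin instances.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
ens = BaggingClassifier(DecisionTreeClassifier(), n_estimators=25,
                        random_state=0).fit(X, y)

# Unsupervised ensemble margin: (v1 - v2) / T, where v1 and v2 are the vote
# counts of the most and second-most voted classes over the T members.
votes = np.stack([est.predict(X) for est in ens.estimators_])   # (T, n)
T, n = votes.shape
counts = np.stack([(votes == c).sum(axis=0) for c in np.unique(y)])
top2 = np.sort(counts, axis=0)[-2:]
margin = (top2[1] - top2[0]) / T

keep = margin >= 0.5          # illustrative threshold: drop ambiguous/noisy points
X_sel, y_sel = X[keep], y[keep]
print(f"kept {keep.sum()} of {n} instances")
```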
A Novel Artificial Bee Colony Based Clustering Algorithm for Categorical Data
Ji, Jinchao; Pang, Wei; Zheng, Yanlin; Wang, Zhe; Ma, Zhiqiang
2015-01-01
Data with categorical attributes are ubiquitous in the real world. However, existing partitional clustering algorithms for categorical data are prone to fall into local optima. To address this issue, in this paper we propose a novel clustering algorithm, ABC-K-Modes (Artificial Bee Colony clustering based on K-Modes), based on the traditional k-modes clustering algorithm and the artificial bee colony approach. In our approach, we first introduce a one-step k-modes procedure, and then integrate this procedure with the artificial bee colony approach to deal with categorical data. In the search process performed by scout bees, we adopt the multi-source search inspired by the idea of batch processing to accelerate the convergence of ABC-K-Modes. The performance of ABC-K-Modes is evaluated by a series of experiments in comparison with that of the other popular algorithms for categorical data. PMID:25993469
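For orientation, here is a minimal one-step k-modes pass, the building block the paper embeds inside its artificial-bee-colony search. The data, the number of modes, and the single-pass structure are invented for illustration; this is not the authors' code.

```python
# One assignment + mode-update pass of k-modes for categorical data.
# Distance is the Hamming/mismatch count between category codes.
import numpy as np

def one_step_kmodes(X, modes):
    """X: (n, d) integer-coded categories; modes: (k, d) current modes."""
    # Assign each object to the nearest mode by mismatch distance
    dist = (X[:, None, :] != modes[None, :, :]).sum(axis=2)     # (n, k)
    labels = dist.argmin(axis=1)
    # Update each mode attribute-wise to the most frequent category
    new_modes = modes.copy()
    for j in range(modes.shape[0]):
        members = X[labels == j]
        if len(members):
            for a in range(X.shape[1]):
                vals, cnt = np.unique(members[:, a], return_counts=True)
                new_modes[j, a] = vals[cnt.argmax()]
    return labels, new_modes

rng = np.random.default_rng(0)
X = rng.integers(0, 4, size=(100, 6))             # 100 objects, 6 categorical attrs
modes = X[rng.choice(len(X), 3, replace=False)]   # 3 initial modes ("food sources")
labels, modes = one_step_kmodes(X, modes)
print(np.bincount(labels))
```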
Cárdenas-García, Maura; González-Pérez, Pedro Pablo
2013-04-11
Apoptotic cell death plays a crucial role in development and homeostasis. This process is driven by mitochondrial permeabilization and activation of caspases. In this paper we adopt a tuple spaces-based modelling and simulation approach, and show how it can be applied to the simulation of this intracellular signalling pathway. Specifically, we are working to explore and understand the complex interaction patterns of the apoptotic caspases and the role of the mitochondria. As a first approximation, using the tuple spaces-based in silico approach, we model and simulate both the extrinsic and intrinsic apoptotic signalling pathways and the interactions between them. During apoptosis, mitochondrial proteins released from the mitochondria to the cytosol are decisively involved in the process. Once the decision to die is made, there is normally no return from this point, although cancer cells offer resistance to the mitochondrial induction.
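To illustrate the modelling style (not the authors' simulator), here is a toy Linda-like tuple space in which reaction rules consume and produce tuples; the two rules and all counts are invented stand-ins for caspase-cascade steps.

```python
# Toy tuple-space simulation: a multiset of tuples plus rules that consume
# and produce tuples, loosely mimicking caspase-8 acting on Bid and caspase-3.
import random
from collections import Counter

space = Counter({("casp8", "active"): 5,
                 ("casp3", "inactive"): 20,
                 ("bid",): 10})

rules = [
    # (consumed tuples, produced tuples); caspase-8 persists as a catalyst
    (Counter({("casp8", "active"): 1, ("bid",): 1}),
     Counter({("casp8", "active"): 1, ("tbid",): 1})),
    (Counter({("casp8", "active"): 1, ("casp3", "inactive"): 1}),
     Counter({("casp8", "active"): 1, ("casp3", "active"): 1})),
]

def step(space):
    applicable = [r for r in rules
                  if all(space[t] >= k for t, k in r[0].items())]
    if not applicable:
        return False
    take, put = random.choice(applicable)
    space.subtract(take)     # remove consumed tuples
    space.update(put)        # add produced tuples
    return True

random.seed(0)
while step(space):
    pass
print(space)
```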
Artificial neural networks for document analysis and recognition.
Marinai, Simone; Gori, Marco; Soda, Giovanni; Society, Computer
2005-01-01
Artificial neural networks have been extensively applied to document analysis and recognition. Most efforts have been devoted to the recognition of isolated handwritten and printed characters, with widely recognized successful results. However, many other document processing tasks, like preprocessing, layout analysis, character segmentation, word recognition, and signature verification, have also been effectively addressed, with very promising results. This paper surveys the most significant problems in the area of offline document image processing where connectionist-based approaches have been applied. Similarities and differences between approaches belonging to different categories are discussed. Particular emphasis is given to the crucial role of prior knowledge in the conception of both appropriate architectures and learning algorithms. Finally, the paper provides a critical analysis of the reviewed approaches and depicts the most promising research guidelines in the field. In particular, a second generation of connectionist-based models is foreseen, based on appropriate graphical representations of the learning environment.
Potts, Tavis; O'Higgins, Tim; Hastings, Emily
2012-12-13
The management of European seas is undergoing a process of major reform. Oceans and coastal policy has traditionally evolved in a fragmented and uncoordinated manner, developed by different sector-based agencies and arms of government with competing aims and objectives. Recently, the call for integrated and ecosystem-based approaches has driven the conceptualization of a new approach. At the scale of Europe, through the Integrated Maritime Policy and the Marine Strategy Framework Directive, and in national jurisdictions, such as under the Marine and Coastal Access Act in the United Kingdom, ecosystem-based planning is becoming the norm. There are major challenges to this process, and this paper explores, in particular, the opportunities inherent in building truly integrated approaches that cross different sectors of activity, integrate across scales, incorporate public involvement and build a sense of oceans citizenship.
NASA Technical Reports Server (NTRS)
Pototzky, Anthony S.; Zeiler, Thomas A.; Perry, Boyd, III
1989-01-01
This paper describes and illustrates two ways of performing time-correlated gust-load calculations. The first is based on Matched Filter Theory; the second on Random Process Theory. Both approaches yield theoretically identical results, represent novel applications of the theories, are computationally fast, and may be applied to other dynamic-response problems. A theoretical development and example calculations using both Matched Filter Theory and Random Process Theory approaches are presented.
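For context, the core matched-filter identity behind such calculations (standard linear-systems background, not reproduced from the paper) is: for a load quantity with impulse response \( h(t) \), among unit-energy excitations \( w(t) \) the peak response is maximized by the time-reversed impulse response,

\[ \max_{\|w\|_2 = 1} (h * w)(t^\ast) = \left( \int_0^\infty h^2(\tau)\, d\tau \right)^{1/2}, \qquad w^\ast(t) = \frac{h(t^\ast - t)}{\|h\|_2}, \]

a direct consequence of the Cauchy-Schwarz inequality. Random Process Theory arrives at the same design load through the statistics of the response to stationary Gaussian turbulence, which is consistent with the paper's statement that the two approaches are theoretically identical.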
A comparative psychophysical approach to visual perception in primates.
Matsuno, Toyomi; Fujita, Kazuo
2009-04-01
Studies on the visual processing of primates, which have well-developed visual systems, provide essential information about the perceptual bases of their higher-order cognitive abilities. Although the mechanisms underlying visual processing are largely shared between human and nonhuman primates, differences have also been reported. In this article, we review psychophysical investigations comparing the basic visual processing that operates in human and nonhuman species, and discuss the future contributions that may derive from such comparative psychophysical approaches to primate minds.
Fourier analysis and signal processing by use of the Moebius inversion formula
NASA Technical Reports Server (NTRS)
Reed, Irving S.; Yu, Xiaoli; Shih, Ming-Tang; Tufts, Donald W.; Truong, T. K.
1990-01-01
A novel Fourier technique for digital signal processing is developed. This approach to Fourier analysis is based on the number-theoretic method of the Moebius inversion of series. The Fourier transform method developed is shown also to yield the convolution of two signals. A computer simulation shows that this method for finding Fourier coefficients is quite suitable for digital signal processing. It competes with the classical FFT (fast Fourier transform) approach in terms of accuracy, complexity, and speed.
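The underlying number-theoretic trick can be shown in a few lines. For a zero-mean even periodic signal \( f(t) = \sum_k a_k \cos(2\pi k t) \), the n-point sample average \( S(n) = \frac{1}{n}\sum_j f(j/n) \) collapses to \( \sum_m a_{mn} \), and Moebius inversion recovers \( a_n = \sum_m \mu(m) S(mn) \). The sketch below demonstrates this identity on a toy signal; normalizations and generalizations in the paper differ.

```python
# Fourier-coefficient recovery via Moebius inversion of sample averages
# (illustrative of the number-theoretic idea, not the paper's full method).
import numpy as np

def mobius(n):
    """Moebius function via trial division."""
    mu, p, m = 1, 2, n
    while p * p <= m:
        if m % p == 0:
            m //= p
            if m % p == 0:
                return 0          # squared prime factor -> mu = 0
            mu = -mu
        p += 1
    return -mu if m > 1 else mu

# Test signal with known cosine coefficients a_1..a_4
a_true = {1: 1.0, 2: 0.5, 3: -0.25, 4: 0.125}
f = lambda t: sum(a * np.cos(2 * np.pi * k * t) for k, a in a_true.items())

def S(n):
    j = np.arange(n)
    return f(j / n).mean()        # equals sum over m of a_{m*n}

M = 64                             # truncation of the inversion series
a_est = {k: sum(mobius(m) * S(m * k) for m in range(1, M + 1))
         for k in range(1, 5)}
print(a_est)                       # recovers a_true to rounding error
```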
NASA Technical Reports Server (NTRS)
Pototzky, Anthony S.; Zeiler, Thomas A.; Perry, Boyd, III
1989-01-01
Two ways of performing time-correlated gust-load calculations are described and illustrated. The first is based on Matched Filter Theory; the second on Random Process Theory. Both approaches yield theoretically identical results, represent novel applications of the theories, are computationally fast, and may be applied to other dynamic-response problems. A theoretical development and example calculations using both Matched Filter Theory and Random Process Theory approaches are presented.
Guijarro, María; Pajares, Gonzalo; Herrera, P. Javier
2009-01-01
The increasing technology of high-resolution airborne image sensors, including those on board Unmanned Aerial Vehicles, demands automatic solutions for processing, either on-line or off-line, the huge amounts of image data sensed during the flights. The classification of natural spectral signatures in images is one potential application. The current trend in classification is towards the combination of simple classifiers. In this paper we propose a combined strategy based on the Deterministic Simulated Annealing (DSA) framework. The simple classifiers used are the well-tested supervised parametric Bayesian estimator and Fuzzy Clustering. DSA is an optimization approach which minimizes an energy function. The main contribution of DSA is its ability to avoid local minima during the optimization process thanks to the annealing scheme. It outperforms the simple classifiers used in the combination and several combined strategies, including a scheme based on fuzzy cognitive maps and an optimization approach based on the Hopfield neural network paradigm. PMID:22399989
Process simulation and dynamic control for marine oily wastewater treatment using UV irradiation.
Jing, Liang; Chen, Bing; Zhang, Baiyu; Li, Pu
2015-09-15
UV irradiation and advanced oxidation processes have recently been regarded as promising solutions for removing polycyclic aromatic hydrocarbons (PAHs) from marine oily wastewater. However, such treatment methods are generally not sufficiently understood in terms of reaction mechanisms, process simulation and process control. These deficiencies can drastically hinder their application in the shipping and offshore petroleum industries, which produce bilge/ballast water and produced water as the main streams of marine oily wastewater. In this study, a factorial design of experiment was carried out to investigate the degradation mechanism of a typical PAH, namely naphthalene, under UV irradiation in seawater. Based on the experimental results, a three-layer feed-forward artificial neural network simulation model was developed to simulate the treatment process and to forecast the removal performance. A simulation-based dynamic mixed integer nonlinear programming (SDMINP) approach was then proposed to intelligently control the treatment process by integrating the developed simulation model, genetic algorithm and multi-stage programming. The applicability and effectiveness of the developed approach were further tested through a case study. The experimental results showed that the influences of fluence rate and temperature on the removal of naphthalene were greater than those of salinity and initial concentration. The developed simulation model could well predict the UV-induced removal process under varying conditions. The case study suggested that the SDMINP approach, with the aid of the multi-stage control strategy, was able to significantly reduce treatment cost when compared to traditional single-stage process optimization. The developed approach and its concept/framework have high potential applicability in other environmental fields where a treatment process is involved and experimentation and modeling are used for process simulation and control. Copyright © 2015 Elsevier Ltd. All rights reserved.
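A minimal sketch of the simulation step, under invented data: a three-layer (one hidden layer) feed-forward network mapping the four experimental factors named in the abstract to removal efficiency. The variable ranges, synthetic "truth" and network size below are assumptions for illustration, not the paper's measurements.

```python
# Three-layer feed-forward surrogate for a treatment process (illustrative).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 400
X = np.column_stack([
    rng.uniform(0.5, 2.0, n),    # fluence rate (mW/cm^2, assumed range)
    rng.uniform(5, 25, n),       # temperature (deg C, assumed range)
    rng.uniform(15, 35, n),      # salinity (ppt, assumed range)
    rng.uniform(0.1, 1.0, n),    # initial naphthalene (mg/L, assumed range)
])
# Synthetic target: removal dominated by fluence and temperature, plus noise
y = (40 + 20 * X[:, 0] + 0.8 * X[:, 1] - 0.1 * X[:, 2] - 5 * X[:, 3]
     + rng.normal(0, 2, n))

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X, y)                  # input -> one hidden layer -> output
print("predicted removal (%):", model.predict([[1.5, 20, 30, 0.5]]).round(1))
```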
New approach to gallbladder ultrasonic images analysis and lesions recognition.
Bodzioch, Sławomir; Ogiela, Marek R
2009-03-01
This paper presents a new approach to gallbladder ultrasound image processing and analysis aimed at detecting disease symptoms in processed images. First, a new method of filtering gallbladder contours from USG images is presented. A major stage in this filtration is to segment and section off the areas occupied by the organ; in most cases this procedure is based on filtration, which plays a key role in the process of diagnosing pathological changes. Unfortunately, ultrasound images are among the most troublesome to analyse owing to the echogenic inconsistency of the structures under observation. This paper provides an inventive algorithm for the holistic extraction of gallbladder image contours, based on rank filtration as well as on the analysis of histogram sections of the examined organs. The second part concerns detecting lesion symptoms of the gallbladder. Automating a process of diagnosis always comes down to developing algorithms used to analyze the object of such diagnosis and verify the occurrence of symptoms related to a given affection. Usually the final stage is to make a diagnosis based on the detected symptoms. This last stage can be carried out through either dedicated expert systems or a more classic pattern analysis approach, such as using rules to determine illness based on detected symptoms. This paper discusses the pattern analysis algorithms for gallbladder image interpretation towards classification of the most frequent illness symptoms of this organ.
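An illustrative pre-processing pipeline in the spirit of the described approach is sketched below: rank (median) filtering to suppress speckle, then a histogram-based threshold to section off the organ area. The synthetic image, filter size and threshold rule are assumptions; this is not the authors' algorithm.

```python
# Rank filtration + histogram-section thresholding on a synthetic stand-in
# for an ultrasound image, followed by contour extraction.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
img = rng.normal(0.5, 0.15, (128, 128))           # speckle-like background
img[40:90, 30:80] -= 0.3                          # darker, organ-like region

smooth = ndimage.median_filter(img, size=5)       # rank filtration step
hist, edges = np.histogram(smooth, bins=64)
thr = edges[hist.argmax()] - 0.1                  # crude histogram-based threshold
mask = smooth < thr                               # candidate organ area
mask = ndimage.binary_opening(mask, iterations=2) # remove small false responses

# Contour = boundary pixels of the segmented area
contour = mask ^ ndimage.binary_erosion(mask)
print("organ pixels:", mask.sum(), "contour pixels:", contour.sum())
```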
NASA Astrophysics Data System (ADS)
Cifelli, R.; Mahoney, K. M.; Webb, R. S.; McCormick, B.
2017-12-01
To ensure the structural and operational safety of dams and other water management infrastructure, water resources managers and engineers require information about the potential for heavy precipitation. The methods and data used to estimate extreme rainfall amounts for managing risk are based on 40-year-old science and in need of improvement. The need to evaluate new approaches based on the best science available has led the states of Colorado and New Mexico to engage a body of scientists and engineers in an innovative "ensemble approach" to updating extreme precipitation estimates. NOAA is at the forefront of one of three technical approaches that make up the "ensemble study"; the three approaches are conducted concurrently and in collaboration with each other. One approach is the conventional deterministic, "storm-based" method, another is a risk-based regional precipitation frequency estimation tool, and the third is an experimental approach utilizing NOAA's state-of-the-art High Resolution Rapid Refresh (HRRR) physically-based dynamical weather prediction model. The goal of the overall project is to use the individual strengths of these different methods to define an updated and broadly acceptable state of the practice for the evaluation and design of dam spillways. This talk will highlight the NOAA research and NOAA's role in the overarching goal to better understand and characterize extreme precipitation estimation uncertainty. The research led by NOAA explores a novel high-resolution dataset and post-processing techniques using a super-ensemble of hourly forecasts from the HRRR model. We also investigate how this rich dataset may be combined with statistical methods to optimally cast the data in probabilistic frameworks. NOAA expertise in the physical processes that drive extreme precipitation is also employed to develop careful testing and improved understanding of the limitations of older estimation methods and assumptions. The process of decision making in the midst of uncertainty is a major part of this study. We will speak to how the three approaches may be used in concert with one another to manage risk and enhance resiliency in the midst of uncertainty. Finally, the presentation will also address the implications of including climate change in future extreme precipitation estimation studies.
Gehrlach, Christoph; Güntert, Bernhard
2015-01-01
Patient satisfaction (PS) surveys are frequently used evaluation methods to show performance from the customer's view. This approach has some fundamental deficits, especially with respect to theory, methodology and usage. Because of the significant theoretical value of the expectation confirmation/disconfirmation concept in the development of PS, an expectation-based experience typology has been developed and tested to check whether this approach could be a theoretical and practical alternative to the survey of PS. Due to the mainly cognitive-rational process of comparison between expectations and expectation fulfilment, it is easier to make changes in this stage of the process than in the subsequent stage of the development of PS, which is mainly based on emotional-affective processes. The paper contains a literature review of the common concept of PS and its causal and influencing factors. Based on the theoretical part of this study, an expectation-based experience typology was developed. In the next step, the typology was subjected to exploratory testing, based on two patient surveys. In some parts of the tested typology, exploratory differences between hospitals could be found. Despite its rather more complex and unusual approach, the expectation-based experience typology offers the chance to change conditions not only retrospectively (based on data), but also prospectively, in terms of a "management of expectations". Copyright © 2014. Published by Elsevier GmbH.
Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N. A.; Zaheer, Kashif Bin
2015-01-01
Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for VBS systems is highly desirable. Based on the stakeholder requirements, the innovative or value-based idea is realized. The quality of a VBS system is associated with a concrete set of valuable requirements, and valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of VBS systems; however, their focus on valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for VBS systems. The existing approaches are time-consuming, complex and inconsistent, which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide low-level implementation details for SIQ initiation or stakeholder metrics for quantification. Hence, keeping in view the existing SIQ problems, this research contributes a new SIQ framework called 'StakeMeter'. The StakeMeter framework is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria and an application procedure that the other methods lack. The proposed framework addresses the issues of stakeholder quantification or prioritization, high time consumption, complexity, and process initiation. The framework helps in the selection of highly critical stakeholders for VBS systems with less judgmental error. PMID:25799490
Melendez, Johan H.; Santaus, Tonya M.; Brinsley, Gregory; Kiang, Daniel; Mali, Buddha; Hardick, Justin; Gaydos, Charlotte A.; Geddes, Chris D.
2016-01-01
Nucleic acid-based detection of gonorrhea infections typically requires a two-step process involving isolation of the nucleic acid, followed by detection of the genomic target, often with PCR-based approaches. In an effort to improve on current detection approaches, we have developed a unique two-step microwave-accelerated approach for rapid extraction and detection of Neisseria gonorrhoeae (GC) DNA. Our approach is based on the use of highly-focused microwave radiation to rapidly lyse bacterial cells, release, and subsequently fragment microbial DNA. The DNA target is then detected by a process known as microwave-accelerated metal-enhanced fluorescence (MAMEF), an ultra-sensitive direct DNA detection analytical technique. In the present study, we show that highly focused microwaves at 2.45 GHz, using 12.3 mm gold film equilateral triangles, are able both to rapidly lyse bacterial cells and to fragment DNA in a time- and microwave power-dependent manner. Detection of the extracted DNA can be performed by MAMEF, without the need for DNA amplification, in less than 10 minutes of total time, or by other PCR-based approaches. Collectively, the use of a microwave-accelerated method for the release and detection of DNA represents a significant step toward the development of a point-of-care (POC) platform for the detection of gonorrhea infections. PMID:27325503
Skelton, JA; Buehler, C; Irby, MB; Grzywacz, JG
2014-01-01
Family-based approaches to pediatric obesity treatment are considered the ‘gold-standard,’ and are recommended for facilitating behavior change to improve child weight status and health. If family-based approaches are to be truly rooted in the family, clinicians and researchers must consider family process and function in designing effective interventions. To bring a better understanding of family complexities to family-based treatment, two relevant reviews were conducted and are presented: (1) a review of prominent and established theories of the family that may provide a more comprehensive and in-depth approach for addressing pediatric obesity; and (2) a systematic review of the literature to identify the use of prominent family theories in pediatric obesity research, which found little use of theories in intervention studies. Overlapping concepts across theories include: families are a system, with interdependence of units; the idea that families are goal-directed and seek balance; and the physical and social environment imposes demands on families. Family-focused theories provide valuable insight into the complexities of families. Increased use of these theories in both research and practice may identify key leverage points in family process and function to prevent the development of or more effectively treat obesity. The field of family studies provides an innovative approach to the difficult problem of pediatric obesity, building on the long-established approach of family-based treatment. PMID:22531090
Implementation of a VLSI Level Zero Processing system utilizing the functional component approach
NASA Technical Reports Server (NTRS)
Shi, Jianfei; Horner, Ward P.; Grebowsky, Gerald J.; Chesney, James R.
1991-01-01
A high rate Level Zero Processing system is currently being prototyped at NASA/Goddard Space Flight Center (GSFC). Based on state-of-the-art VLSI technology and the functional component approach, the new system promises capabilities of handling multiple Virtual Channels and Applications with a combined data rate of up to 20 Megabits per second (Mbps) at low cost.
ERIC Educational Resources Information Center
Kyle, William C.; And Others
In anticipation of House Bill 246 (now Texas Administrative Code Chapter 75) which requires an inquiry-based, process-approach to the teaching of science, the Richardson Independent School District established the Elementary Science Pilot Project and adopted the Science Curriculum Improvement Study (SCIS) as part of their new K-6 Science through…
ERIC Educational Resources Information Center
Bogumil, Elizabeth; Capous-Desyllas, Moshoula; Lara, Patricia; Reshetnikov, Aleksey
2017-01-01
This article highlights the ways in which arts-based approaches to research can be used in teaching and learning about the qualitative research process. Specifically, in our qualitative research class graduate students used the arts as a form of reflexivity to highlight various aspects of their research process, including their positionality,…
Automated Detection of a Crossing Contact Based on Its Doppler Shift
2009-03-01
Contacts in passive sonar systems are commonly detected by applying high-gain processing followed by successive classification criteria. The trade-off between false alarm and detection probability is fundamental in radar and sonar (Chevalier, 2002).
Nonlocal approach to nonequilibrium thermodynamics and nonlocal heat diffusion processes
NASA Astrophysics Data System (ADS)
El-Nabulsi, Rami Ahmad
2018-04-01
We study some aspects of nonequilibrium thermodynamics and heat diffusion processes based on Suykens's nonlocal-in-time kinetic energy approach recently introduced in the literature. A number of properties and insights are obtained, in particular the emergence of oscillating entropy and nonlocal diffusion equations, which are relevant to a number of physical and engineering problems. Several features are obtained and discussed in detail.
An Approach for Integrating the Prioritization of Functional and Nonfunctional Requirements
Dabbagh, Mohammad; Lee, Sai Peck
2014-01-01
Due to budgetary deadlines and time-to-market constraints, it is essential to prioritize software requirements. The outcome of requirements prioritization is an ordering of requirements which need to be considered first during the software development process. To achieve a high quality software system, both functional and nonfunctional requirements must be taken into consideration during the prioritization process. Although several requirements prioritization methods have been proposed so far, no particular method or approach is presented to consider both functional and nonfunctional requirements during the prioritization stage. In this paper, we propose an approach which aims to integrate the process of prioritizing functional and nonfunctional requirements. The outcome of applying the proposed approach produces two separate prioritized lists of functional and non-functional requirements. The effectiveness of the proposed approach has been evaluated through an empirical experiment aimed at comparing the approach with two state-of-the-art approaches, the analytic hierarchy process (AHP) and the hybrid assessment method (HAM). Results show that our proposed approach outperforms AHP and HAM in terms of actual time-consumption while preserving the quality of the results at a high level of agreement with the results produced by the other two approaches. PMID:24982987
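For readers unfamiliar with the AHP baseline used in the comparison, the sketch below shows the standard eigenvector computation: priority weights are derived from a Saaty-scale pairwise-comparison matrix, with a consistency check. The matrix values are illustrative, not from the paper.

```python
# Classic AHP step: principal-eigenvector priorities plus consistency ratio.
import numpy as np

# Pairwise comparisons for 4 requirements (A[i, j] = importance of
# requirement i relative to j; reciprocal by construction, invented values).
A = np.array([[1,   3,   5,   7],
              [1/3, 1,   3,   5],
              [1/5, 1/3, 1,   3],
              [1/7, 1/5, 1/3, 1]])

eigvals, eigvecs = np.linalg.eig(A)
k = eigvals.real.argmax()
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                      # priority vector (the requirement ranking)

n = A.shape[0]
ci = (eigvals.real.max() - n) / (n - 1)
cr = ci / 0.90                    # random index RI = 0.90 for n = 4
print("priorities:", w.round(3), "consistency ratio:", round(cr, 3))
```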
ERIC Educational Resources Information Center
Chan, Cecilia K. Y.
2016-01-01
Many educational researchers have established problem-based learning (PBL) as a total approach to education--both a product and a process--from a pedagogical instructional strategy to skills development to assessment. This study provides qualitative evidences from educational practitioners in various professional disciplines, namely, Medicine,…
Implementation of Process Oriented Guided Inquiry Learning (POGIL) in Engineering
ERIC Educational Resources Information Center
Douglas, Elliot P.; Chiu, Chu-Chuan
2013-01-01
This paper describes implementation and testing of an active learning, team-based pedagogical approach to instruction in engineering. This pedagogy has been termed Process Oriented Guided Inquiry Learning (POGIL), and is based upon the learning cycle model. Rather than sitting in traditional lectures, students work in teams to complete worksheets…
Grading Homework to Emphasize Problem-Solving Process Skills
ERIC Educational Resources Information Center
Harper, Kathleen A.
2012-01-01
This article describes a grading approach that encourages students to employ particular problem-solving skills. Some strengths of this method, called "process-based grading," are that it is easy to implement, requires minimal time to grade, and can be used in conjunction with either an online homework delivery system or paper-based homework.
A Tutorial Programme to Enhance Psychiatry Learning Processes within a PBL-Based Course
ERIC Educational Resources Information Center
Hood, Sean; Chapman, Elaine
2011-01-01
This paper describes a tutorial programme developed at the University of Western Australia (UWA) to enhance medical students' learning processes within problem-based learning contexts. The programme encourages students to use more effective learning approaches by scaffolding the development of effective problem-solving strategies, and by reducing…
A Place-Based Process for Reimagining Learning in the Hawaiian Context
ERIC Educational Resources Information Center
Sang, Kau'i; Worchel, Jessica
2017-01-01
What would an educational system centered on core Hawaiian values look like? The Office of Hawaiian Education, established by the Hawai'i Department of Education (HIDOE) in 2015, has been exploring this question through a community-based process that differs significantly from typical Western approaches to policymaking. Often, policymakers use a…
ERIC Educational Resources Information Center
Liou, Hsien-Chin; Chang, Jason S; Chen, Hao-Jan; Lin, Chih-Cheng; Liaw, Meei-Ling; Gao, Zhao-Ming; Jang, Jyh-Shing Roger; Yeh, Yuli; Chuang, Thomas C.; You, Geeng-Neng
2006-01-01
This paper describes the development of an innovative web-based environment for English language learning with advanced data-driven and statistical approaches. The project uses various corpora, including a Chinese-English parallel corpus ("Sinorama") and various natural language processing (NLP) tools to construct effective English…
ERIC Educational Resources Information Center
Perlberg, Arye
1983-01-01
Two explanations of the underlying process in faculty self-evaluation by videotape recording are outlined and integrated into one conceptualization. One theory is based on affect: self-confrontation, dissonance, stress, distress, and eustress. The second is based on a cognitive and information processing approach and includes feedback,…
NASA Astrophysics Data System (ADS)
Vassena, G.; Clerici, A.
2018-05-01
The state of the art of 3D surveying technologies, if correctly applied, allows 3D coloured models of large open-pit mines to be obtained using different technologies, such as terrestrial laser scanning (TLS) with images, combined with UAV-based digital photogrammetry. GNSS and/or total stations are also currently used to georeference the model. The University of Brescia has carried out a project to map in 3D an open-pit mine located in Botticino, a famous marble-extraction site close to Brescia in North Italy. Terrestrial laser scanner 3D point clouds, combined with RGB images and digital photogrammetry from a UAV, have been used to map a large part of the quarry. By rigorous and well-known procedures, a 3D point cloud and a mesh model have been obtained using an easy and rigorous approach. After the description of the combined mapping process, the paper describes the innovative process proposed for the daily/weekly update of the model itself. To realize this task, a SLAM-based approach is described, using an instrument capable of running an automatic localization process and real-time, in-field change detection analysis.
From Physical Process to Economic Cost - Integrated Approaches of Landslide Risk Assessment
NASA Astrophysics Data System (ADS)
Klose, M.; Damm, B.
2014-12-01
The nature of landslides is complex in many respects, with landslide hazard and impact being dependent on a variety of factors. This obviously requires an integrated assessment for fundamental understanding of landslide risk. Integrated risk assessment, according to the approach presented in this contribution, implies combining prediction of future landslide occurrence with analysis of landslide impact in the past. A critical step for assessing landslide risk in integrated perspective is to analyze what types of landslide damage affected people and property in which way and how people contributed and responded to these damage types. In integrated risk assessment, the focus is on systematic identification and monetization of landslide damage, and analytical tools that allow deriving economic costs from physical landslide processes are at the heart of this approach. The broad spectrum of landslide types and process mechanisms as well as nonlinearity between landslide magnitude, damage intensity, and direct costs are some main factors explaining recent challenges in risk assessment. The two prevailing approaches for assessing the impact of landslides in economic terms are cost survey (ex-post) and risk analysis (ex-ante). Both approaches are able to complement each other, but yet a combination of them has not been realized so far. It is common practice today to derive landslide risk without considering landslide process-based cause-effect relationships, since integrated concepts or new modeling tools expanding conventional methods are still widely missing. The approach introduced in this contribution is based on a systematic framework that combines cost survey and GIS-based tools for hazard or cost modeling with methods to assess interactions between land use practices and landslides in historical perspective. Fundamental understanding of landslide risk also requires knowledge about the economic and fiscal relevance of landslide losses, wherefore analysis of their impact on public budgets is a further component of this approach. In integrated risk assessment, combination of methods plays an important role, with the objective of collecting and integrating complex data sets on landslide risk.
A risk-based auditing process for pharmaceutical manufacturers.
Vargo, Susan; Dana, Bob; Rangavajhula, Vijaya; Rönninger, Stephan
2014-01-01
The purpose of this article is to share ideas on developing a risk-based model for the scheduling of audits (both internal and external). Audits are a key element of a manufacturer's quality system and provide an independent means of evaluating the manufacturer's or the supplier/vendor's compliance status. Suggestions for risk-based scheduling approaches are discussed in the article. Pharmaceutical manufacturers are required to establish and implement a quality system. The quality system is an organizational structure defining responsibilities, procedures, processes, and resources that the manufacturer has established to ensure quality throughout the manufacturing process. Audits are a component of the manufacturer's quality system and provide a systematic and an independent means of evaluating the manufacturer's overall quality system and compliance status. Audits are performed at defined intervals for a specified duration. The intention of the audit process is to focus on key areas within the quality system and may not cover all relevant areas during each audit. In this article, the authors provide suggestions for risk-based scheduling approaches to aid pharmaceutical manufacturers in identifying the key focus areas for an audit.
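As a purely hypothetical illustration of the kind of risk-based scheduling logic the article discusses (the factors, weights and audit intervals below are invented, not the authors' model):

```python
# Toy risk-based audit scheduler: score each site on simple risk factors and
# map the score to an audit interval, so higher-risk sites are audited sooner.
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    criticality: int         # impact of the supplied material/process, 1-3
    compliance_history: int  # severity of past findings, 1-3
    years_since_audit: int

def risk_score(s: Site) -> int:
    return s.criticality * s.compliance_history + s.years_since_audit

def audit_interval_months(score: int) -> int:
    if score >= 8:
        return 6             # high risk: audit within 6 months
    if score >= 5:
        return 12
    return 24                # low risk: routine 2-year cycle

sites = [Site("API supplier", 3, 2, 1), Site("packaging vendor", 1, 1, 2)]
for s in sorted(sites, key=risk_score, reverse=True):
    print(s.name, "->", audit_interval_months(risk_score(s)), "months")
```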
Supervised Learning Based Hypothesis Generation from Biomedical Literature.
Sang, Shengtian; Yang, Zhihao; Li, Zongyao; Lin, Hongfei
2015-01-01
Nowadays, the amount of biomedical literature is growing at an explosive speed, and much useful knowledge remains undiscovered in it. Researchers can form biomedical hypotheses by mining this literature. In this paper, we propose a supervised learning based approach to generate hypotheses from biomedical literature. This approach splits the traditional processing of hypothesis generation with the classic ABC model into an AB model and a BC model, which are constructed with supervised learning methods. Compared with concept co-occurrence and grammar engineering-based approaches like SemRep, machine learning based models usually achieve better performance in information extraction (IE) from texts. Then, by combining the two models, the approach reconstructs the ABC model and generates biomedical hypotheses from the literature. The experimental results on the three classic Swanson hypotheses show that our approach outperforms the SemRep system.
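For background, the co-occurrence baseline that the paper's supervised AB/BC models replace is the Swanson-style ABC pattern: A and C are linked via an intermediate B when A-B and B-C co-occur but A-C never do. A toy sketch with invented "documents" (concept sets echoing Swanson's fish-oil example):

```python
# Minimal Swanson-style ABC co-occurrence sketch (the paper replaces this
# heuristic with supervised AB and BC classifiers).
from collections import defaultdict

docs = [
    {"fish_oil", "blood_viscosity"},
    {"blood_viscosity", "raynaud_disease"},
    {"fish_oil", "platelet_aggregation"},
    {"platelet_aggregation", "raynaud_disease"},
    {"aspirin", "headache"},
]

cooc = defaultdict(set)
for d in docs:
    for x in d:
        cooc[x] |= d - {x}

def abc_hypotheses(a):
    """Yield (a, b, c) where a-b and b-c co-occur but a-c never do."""
    for b in cooc[a]:
        for c in cooc[b] - cooc[a] - {a}:
            yield (a, b, c)

print(sorted(abc_hypotheses("fish_oil")))
# -> fish_oil -> blood_viscosity / platelet_aggregation -> raynaud_disease
```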
Pumped shot noise in adiabatically modulated graphene-based double-barrier structures.
Zhu, Rui; Lai, Maoli
2011-11-16
Quantum pumping processes are accompanied by considerable quantum noise. Based on the scattering approach, we investigated the pumped shot noise properties in adiabatically modulated graphene-based double-barrier structures. It is found that compared with the Poisson processes, the pumped shot noise is dramatically enhanced where the dc pumped current changes flow direction, which demonstrates the effect of the Klein paradox.
Vote Stuffing Control in IPTV-based Recommender Systems
NASA Astrophysics Data System (ADS)
Bhatt, Rajen
Vote stuffing is a general problem in the functioning of the content rating-based recommender systems. Currently IPTV viewers browse various contents based on the program ratings. In this paper, we propose a fuzzy clustering-based approach to remove the effects of vote stuffing and consider only the genuine ratings for the programs over multiple genres. The approach requires only one authentic rating, which is generally available from recommendation system administrators or program broadcasters. The entire process is automated using fuzzy c-means clustering. Computational experiments performed over one real-world program rating database shows that the proposed approach is very efficient for controlling vote stuffing.
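A sketch of the core idea under stated assumptions: cluster the per-user ratings for a program with fuzzy c-means (two clusters), then keep the cluster whose centre lies closest to the one trusted "authentic" rating, treating the other cluster as potential vote stuffing. The FCM below is a generic implementation, and the ratings and membership cutoff are invented; this is not the paper's exact pipeline.

```python
# Fuzzy c-means over 1-D ratings, then selection of the "genuine" cluster.
import numpy as np

def fcm(x, c=2, m=2.0, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    u = rng.random((c, len(x)))
    u /= u.sum(axis=0)                                   # fuzzy memberships
    for _ in range(iters):
        um = u ** m
        centers = um @ x / um.sum(axis=1)                # weighted centroids
        d = np.abs(x[None, :] - centers[:, None]) + 1e-9
        u = 1.0 / (d ** (2 / (m - 1)))                   # standard FCM update
        u /= u.sum(axis=0)
    return centers, u

ratings = np.array([4.5, 4.0, 4.2, 4.4, 1.0, 1.2, 0.9, 4.3, 1.1])  # toy data
authentic = 4.3                                  # one trusted anchor rating
centers, u = fcm(ratings)
genuine_cluster = np.argmin(np.abs(centers - authentic))
genuine = ratings[u[genuine_cluster] > 0.5]
print("genuine mean rating:", genuine.mean().round(2))
```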
Learning Based Bidding Strategy for HVAC Systems in Double Auction Retail Energy Markets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Yannan; Somani, Abhishek; Carroll, Thomas E.
In this paper, a bidding strategy is proposed using reinforcement learning for HVAC systems in a double auction market. The bidding strategy does not require a specific model-based representation of behavior, i.e., a functional form to translate indoor house temperatures into bid prices. The results from the reinforcement learning based approach are compared with the HVAC bidding approach used in the AEP gridSMART® smart grid demonstration project, and it is shown that the model-free (learning based) approach tracks well the results from the model-based behavior. Successful use of model-free approaches to represent device-level economic behavior may help develop similar approaches to represent the behavior of more complex devices or groups of diverse devices, such as in a building. Distributed control requires an understanding of the decision making processes of intelligent agents so that appropriate mechanisms may be developed to control and coordinate their responses, and model-free approaches to represent behavior will be extremely useful in that quest.
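A toy sketch of what "learning to bid" can look like, with assumptions throughout (tabular Q-learning, crude one-degree thermal dynamics, a uniform stand-in for the auction clearing price, and an invented comfort/cost reward); it is not the project's controller:

```python
# Tabular Q-learning over discretised indoor-temperature states; the action
# is a bid price, and the bid clears when it meets the market price.
import numpy as np

rng = np.random.default_rng(0)
temps = np.arange(18, 28)                 # discretised indoor temperatures
bids = np.linspace(0.0, 1.0, 11)          # bid-price actions ($/kWh, invented)
Q = np.zeros((len(temps), len(bids)))
alpha, gamma, eps, setpoint = 0.1, 0.95, 0.1, 22

def market_clearing_price():
    return rng.uniform(0.2, 0.8)           # stand-in for the double auction

s = 5                                      # start at temps[5] = 23 C
for _ in range(20000):
    a = rng.integers(len(bids)) if rng.random() < eps else Q[s].argmax()
    price = market_clearing_price()
    run = bids[a] >= price                 # bid clears -> HVAC runs (cooling)
    t_next = temps[s] + (-1 if run else 1) # crude thermal dynamics
    s_next = int(np.clip(t_next - temps[0], 0, len(temps) - 1))
    reward = -abs(temps[s_next] - setpoint) + 2.0 * (-price if run else 0.0)
    Q[s, a] += alpha * (reward + gamma * Q[s_next].max() - Q[s, a])
    s = s_next

print("learned bid by temperature:", bids[Q.argmax(axis=1)].round(2))
```

The learned policy bids high when the house is warm (comfort dominates) and low when it is near the setpoint, which is the qualitative shape a model-based temperature-to-price curve would also produce.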
Liu, Kui; Wei, Sixiao; Chen, Zhijiang; Jia, Bin; Chen, Genshe; Ling, Haibin; Sheaff, Carolyn; Blasch, Erik
2017-01-01
This paper presents the first attempt at combining Cloud with Graphic Processing Units (GPUs) in a complementary manner within the framework of a real-time high performance computation architecture for the application of detecting and tracking multiple moving targets based on Wide Area Motion Imagery (WAMI). More specifically, the GPU and Cloud Moving Target Tracking (GC-MTT) system applied a front-end web based server to perform the interaction with Hadoop and highly parallelized computation functions based on the Compute Unified Device Architecture (CUDA©). The introduced multiple moving target detection and tracking method can be extended to other applications such as pedestrian tracking, group tracking, and Patterns of Life (PoL) analysis. The cloud and GPUs based computing provides an efficient real-time target recognition and tracking approach as compared to methods when the work flow is applied using only central processing units (CPUs). The simultaneous tracking and recognition results demonstrate that a GC-MTT based approach provides drastically improved tracking with low frame rates over realistic conditions. PMID:28208684
Process-based network decomposition reveals backbone motif structure
Wang, Guanyu; Du, Chenghang; Chen, Hao; Simha, Rahul; Rong, Yongwu; Xiao, Yi; Zeng, Chen
2010-01-01
A central challenge in systems biology today is to understand the network of interactions among biomolecules and, especially, the organizing principles underlying such networks. Recent analysis of known networks has identified small motifs that occur ubiquitously, suggesting that larger networks might be constructed in the manner of electronic circuits by assembling groups of these smaller modules. Using a unique process-based approach to analyzing such networks, we show for two cell-cycle networks that each of these networks contains a giant backbone motif spanning all the network nodes that provides the main functional response. The backbone is in fact the smallest network capable of providing the desired functionality. Furthermore, the remaining edges in the network form smaller motifs whose role is to confer stability properties rather than provide function. The process-based approach used in the above analysis has additional benefits: It is scalable, analytic (resulting in a single analyzable expression that describes the behavior), and computationally efficient (all possible minimal networks for a biological process can be identified and enumerated). PMID:20498084
The FoReVer Methodology: A MBSE Framework for Formal Verification
NASA Astrophysics Data System (ADS)
Baracchi, Laura; Mazzini, Silvia; Cimatti, Alessandro; Tonetta, Stefano; Garcia, Gerald
2013-08-01
The need for high level of confidence and operational integrity in critical space (software) systems is well recognized in the Space industry and has been addressed so far through rigorous System and Software Development Processes and stringent Verification and Validation regimes. The Model Based Space System Engineering process (MBSSE) derived in the System and Software Functional Requirement Techniques study (SSFRT) focused on the application of model based engineering technologies to support the space system and software development processes, from mission level requirements to software implementation through model refinements and translations. In this paper we report on our work in the ESA-funded FoReVer project where we aim at developing methodological, theoretical and technological support for a systematic approach to the space avionics system development, in phases 0/A/B/C. FoReVer enriches the MBSSE process with contract-based formal verification of properties, at different stages from system to software, through a step-wise refinement approach, with the support for a Software Reference Architecture.
Green supplier selection: a new genetic/immune strategy with industrial application
NASA Astrophysics Data System (ADS)
Kumar, Amit; Jain, Vipul; Kumar, Sameer; Chandra, Charu
2016-10-01
With the onset of the 'climate change movement', organisations are striving to include environmental criteria in the supplier selection process. This article hybridises a Green Data Envelopment Analysis (GDEA)-based approach with a new Genetic/Immune Strategy for Data Envelopment Analysis (GIS-DEA). The GIS-DEA approach provides a different view of solving multi-criteria decision making problems with data envelopment analysis (DEA) by treating DEA as a multi-objective optimisation problem, with efficiency as one objective and the proximity of a solution to the decision makers' preferences as the other. The hybrid approach, called GIS-GDEA, is applied here to a well-known automobile spare parts manufacturer in India and the results are presented. User validation, based on a specific set of criteria, suggests that the supplier selection process with GIS-GDEA is more practical than other approaches in a current industrial scenario with multiple decision makers.
Lee, Mi Kyung; Coker, David F
2016-08-18
An accurate approach for computing intermolecular and intrachromophore contributions to spectral densities to describe the electronic-nuclear interactions relevant for modeling excitation energy transfer processes in light harvesting systems is presented. The approach is based on molecular dynamics (MD) calculations of classical correlation functions of long-range contributions to excitation energy fluctuations and a separate harmonic analysis and single-point gradient quantum calculations for electron-intrachromophore vibrational couplings. A simple model is also presented that enables detailed analysis of the shortcomings of standard MD-based excitation energy fluctuation correlation function approaches. The method introduced here avoids these problems, and its reliability is demonstrated in accurate predictions for bacteriochlorophyll molecules in the Fenna-Matthews-Olson pigment-protein complex, where excellent agreement with experimental spectral densities is found. This efficient approach can provide instantaneous spectral densities for treating the influence of fluctuations in environmental dissipation on fast electronic relaxation.
Method Engineering: A Service-Oriented Approach
NASA Astrophysics Data System (ADS)
Cauvet, Corine
In the past, a large variety of methods have been published ranging from very generic frameworks to methods for specific information systems. Method Engineering has emerged as a research discipline for designing, constructing and adapting methods for Information Systems development. Several approaches have been proposed as paradigms in method engineering. The meta modeling approach provides means for building methods by instantiation, the component-based approach aims at supporting the development of methods by using modularization constructs such as method fragments, method chunks and method components. This chapter presents an approach (SO2M) for method engineering based on the service paradigm. We consider services as autonomous computational entities that are self-describing, self-configuring and self-adapting. They can be described, published, discovered and dynamically composed for processing a consumer's demand (a developer's requirement). The method service concept is proposed to capture a development process fragment for achieving a goal. Goal orientation in service specification and the principle of service dynamic composition support method construction and method adaptation to different development contexts.
ERIC Educational Resources Information Center
National Centre for Vocational Education Research (NCVER), 2010
2010-01-01
This good practice guide is based on research that looked at how to teach adult literacy and numeracy using a social capital approach. The guide suggests ways vocational education and training (VET) practitioners can adopt a social capital approach to their teaching practice. A social capital approach refers to the process in which networks are…
Poulsen, Signe; Jørgensen, Michael Søgaard
2011-09-01
The aim of this article is to analyse the social shaping of worksite food interventions at two Danish worksites. The overall aims are to contribute first, to the theoretical frameworks for the planning and analysis of food and health interventions at worksites and second, to a foodscape approach to worksite food interventions. The article is based on a case study of the design of a canteen takeaway (CTA) scheme for employees at two Danish hospitals. This was carried out as part of a project to investigate the shaping and impact of schemes that offer employees meals to buy, to take home or to eat at the worksite during irregular working hours. Data collection was carried out through semi-structured interviews with stakeholders within the two change processes. Two focus group interviews were also carried out at one hospital and results from a user survey carried out by other researchers at the other hospital were included. Theoretically, the study was based on the social constitution approach to change processes at worksites and a co-evolution approach to problem-solution complexes as part of change processes. Both interventions were initiated because of the need to improve the food supply for the evening shift and the work-life balance. The shaping of the schemes at the two hospitals became rather different change processes due to the local organizational processes shaped by previously developed norms and values. At one hospital the change process challenged norms and values about food culture and challenged ideas in the canteen kitchen about working hours. At the other hospital, the change was more of a learning process that aimed at finding the best way to offer a CTA scheme. Worksite health promotion practitioners should be aware that the intervention itself is an object of negotiation between different stakeholders at a worksite based on existing norms and values. The social contextual model and the setting approach to worksite health interventions lack reflections about how such norms and values might influence the shaping of the intervention. It is recommended that future planning and analyses of worksite health promotion interventions apply a combination of the social constitution approach to worksites and an integrated food supply and demand perspective based on analyses of the co-evolution of problem-solution complexes.
Local Area Networks (The Printout).
ERIC Educational Resources Information Center
Aron, Helen; Balajthy, Ernest
1989-01-01
Describes the Local Area Network (LAN) project, in which students used LAN-based word processing and electronic mail software as the center of a writing process approach. Discusses the advantages and disadvantages of networking. (MM)
A simulation-based approach for estimating premining water quality: Red Mountain Creek, Colorado
Runkel, Robert L.; Kimball, Briant A; Walton-Day, Katherine; Verplanck, Philip L.
2007-01-01
Regulatory agencies are often charged with the task of setting site-specific numeric water quality standards for impaired streams. This task is particularly difficult for streams draining highly mineralized watersheds with past mining activity. Baseline water quality data obtained prior to mining are often non-existent and application of generic water quality standards developed for unmineralized watersheds is suspect given the geology of most watersheds affected by mining. Various approaches have been used to estimate premining conditions, but none of the existing approaches rigorously consider the physical and geochemical processes that ultimately determine instream water quality. An approach based on simulation modeling is therefore proposed herein. The approach utilizes synoptic data that provide spatially-detailed profiles of concentration, streamflow, and constituent load along the study reach. This field data set is used to calibrate a reactive stream transport model that considers the suite of physical and geochemical processes that affect constituent concentrations during instream transport. A key input to the model is the quality and quantity of waters entering the study reach. This input is based on chemical analyses available from synoptic sampling and observed increases in streamflow along the study reach. Given the calibrated model, additional simulations are conducted to estimate premining conditions. In these simulations, the chemistry of mining-affected sources is replaced with the chemistry of waters that are thought to be unaffected by mining (proximal, premining analogues). The resultant simulations provide estimates of premining water quality that reflect both the reduced loads that were present prior to mining and the processes that affect these loads as they are transported downstream. This simulation-based approach is demonstrated using data from Red Mountain Creek, Colorado, a small stream draining a heavily-mined watershed. Model application to the premining problem for Red Mountain Creek is based on limited field reconnaissance and chemical analyses; additional field work and analyses may be needed to develop definitive, quantitative estimates of premining water quality.
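As a minimal illustration of the load-balance reasoning behind this approach, the Python sketch below mixes an upstream load with a lateral inflow and then swaps the mining-affected inflow chemistry for a premining analogue. All flows and concentrations are invented, and a real reactive-transport model would add the reaction terms (sorption, precipitation) the paper emphasizes.

```python
# Minimal sketch of the load-mixing logic behind a premining estimate.
# Values and the premining analogue concentration are hypothetical.

def mix(q_up, c_up, q_in, c_in):
    """Steady-state mass balance at a lateral inflow (no reactions)."""
    return (q_up * c_up + q_in * c_in) / (q_up + q_in)

# Observed (mining-affected) inflow vs. a proximal premining analogue.
q_stream, c_stream = 50.0, 0.10   # L/s, mg/L zinc
q_adit, c_adit_mined = 5.0, 12.0  # mining-affected adit drainage
c_adit_premining = 0.8            # analogue spring, unaffected by mining

print("current   :", mix(q_stream, c_stream, q_adit, c_adit_mined))
print("premining :", mix(q_stream, c_stream, q_adit, c_adit_premining))
```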
Capel, P.D.; Larson, S.J.
1995-01-01
Minimizing the loss of target organic chemicals from environmental water samples between the time of sample collection and isolation is important to the integrity of an investigation. During this sample holding time, there is a potential for analyte loss through volatilization from the water to the headspace, sorption to the walls and cap of the sample bottle, and transformation through biotic and/or abiotic reactions. This paper presents a chemodynamic-based, generalized approach to estimating the most probable loss processes for individual target organic chemicals. The basic premise is that the investigator must know which loss process(es) are important for a particular analyte, based on its chemodynamic properties, when choosing the appropriate method(s) to prevent loss.
Multifractal Properties of Process Control Variables
NASA Astrophysics Data System (ADS)
Domański, Paweł D.
2017-06-01
A control system is an inevitable element of any industrial installation, and its quality significantly affects overall process performance. Assessing whether a control system needs improvement requires relevant and constructive measures. Various methods exist, such as time-domain measures, minimum variance, Gaussian and non-Gaussian statistical factors, and fractal and entropy indexes. The majority of approaches use time series of control variables and are able to cover many phenomena, but process complexities and human interventions cause effects that are hardly visible to standard measures. It is shown that signals originating from industrial installations have multifractal properties, and that such an analysis may extend the standard approach to further observations. The work is based on industrial and simulation data. The analysis delivers additional insight into the properties of the control system and the process, helping to discover internal dependencies and human factors that are otherwise hardly detectable.
Product, not process! Explaining a basic concept in agricultural biotechnologies and food safety.
Tagliabue, Giovanni
2017-12-01
Most life scientists have relentlessly recommended that any evaluative approach to agri-food products be based on examination of the phenotype, i.e. the actual characteristics of the food, feed and fiber varieties: the effects of any new cultivar (or micro-organism, or animal) on our health do not depend on the process(es), the techniques used to obtain it. The so-called "genetically modified organisms" ("GMOs"), on the other hand, are commonly framed as a group with special properties - most frequently seen as dubious, or even harmful. Some social scientists still believe that considering the process is a correct background for science-based understanding and regulation. To show that such an approach is utterly wrong, and to invite scientists, teachers and science communicators to explain this mistake to students, policy-makers and the public at large, we imagined a dialogue between a social scientist, who believes that a process-based orientation should carry a certain weight in risk assessment, and a few experts who offer plenty of arguments against that view. The discussion focuses on new food safety.
Cao, Xiaobing; Zhi, Lili; Li, Yahui; Fang, Fei; Cui, Xian; Yao, Youwei; Ci, Lijie; Ding, Kongxian; Wei, Jinquan
2017-09-27
High-quality perovskite films can be fabricated from Lewis acid-base adducts through molecule exchange. Substantial work is needed to fully understand the formation mechanism of the perovskite films, which helps to further improve their quality. Here, we study the formation of CH3NH3PbI3 perovskite films by introducing some dimethylacetamide into the PbI2/N,N-dimethylformamide solution. We reveal that there are three key processes during the formation of perovskite films through the Lewis acid-base adduct approach: molecule intercalation of solvent into the PbI2 lattice, molecule exchange between the solvent and CH3NH3I, and dissolution-recrystallization of the perovskite grains during annealing. The Lewis base solvents play multiple functions in the above processes. The properties of the solvent, including Lewis basicity and boiling point, play key roles in forming smooth perovskite films with large grains. We also provide some rules for choosing Lewis base additives to prepare high-quality perovskite films through the Lewis adduct approach.
Dictionary Based Machine Translation from Kannada to Telugu
NASA Astrophysics Data System (ADS)
Sindhu, D. V.; Sagar, B. M.
2017-08-01
Machine translation is the task of translating text from one language into another. For languages with limited linguistic resources, such as Kannada and Telugu, a dictionary-based approach is the most practical. This paper focuses on dictionary-based machine translation from Kannada to Telugu. The proposed methodology uses a dictionary to translate word by word without much correlation of semantics between them. The dictionary-based machine translation process has the following sub-processes: morph analyzer, dictionary, transliteration, transfer grammar and morph generator. As part of this work, a bilingual dictionary with 8000 entries was developed and a suffix mapping table at the tag level was built. The system was tested on children's stories. In the near future the system can be further improved by defining transfer grammar rules.
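A minimal sketch of the word-by-word pipeline described above (morph analysis, dictionary lookup, suffix mapping, generation); the toy entries below stand in for the paper's 8000-entry bilingual dictionary and tag-level suffix table.

```python
# Illustrative word-by-word dictionary translation. The dictionary and
# suffix map are invented stand-ins for the paper's resources.

DICTIONARY = {"mane": "illu", "huduga": "abbayi"}  # Kannada -> Telugu stems
SUFFIX_MAP = {"alli": "lo"}                        # locative case marker

def morph_split(word):
    """Toy morph analyzer: peel off a known suffix, if any."""
    for suffix in SUFFIX_MAP:
        if word.endswith(suffix):
            return word[: -len(suffix)], suffix
    return word, None

def translate(sentence):
    out = []
    for word in sentence.split():
        stem, suffix = morph_split(word)
        target = DICTIONARY.get(stem, stem)  # pass through unknown words
        if suffix:
            target += SUFFIX_MAP[suffix]     # toy morph generator
        out.append(target)
    return " ".join(out)

print(translate("manealli huduga"))  # -> "illulo abbayi"
```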
Complementary Approaches to Existing Target Based Drug Discovery for Identifying Novel Drug Targets.
Vasaikar, Suhas; Bhatia, Pooja; Bhatia, Partap G; Chu Yaiw, Koon
2016-11-21
Over the past decade, the relationship between the number of emerging new molecular entities and the quantum of R&D investment has been unfavorable. There may be numerous reasons, but a few studies stress the introduction of the target-based drug discovery approach as one of the factors. Although a number of drugs have been developed with an emphasis on a single protein target, identification of a valid target is complex. The approach focuses on a single in vitro target, which overlooks the complexity of the cell and makes the process of validating drug targets uncertain. Thus, it is imperative to search for alternatives rather than looking only at the success stories of target-based drug discovery. It would be beneficial if drugs were developed to target multiple components. New approaches like reverse engineering and translational research need to take into account both system- and target-based approaches. This review evaluates the strengths and limitations of known drug discovery approaches and proposes alternative approaches for increasing the efficiency of treatment development.
Networked Workstations and Parallel Processing Utilizing Functional Languages
1993-03-01
...program. This frees the programmer to concentrate on what the program is to do, not how the program is... The traditional 'von Neumann' architecture uses a timer-based (e.g., the program counter), sequentially programmed, single-processor approach to problem...
ERIC Educational Resources Information Center
Wang, Yanqing; Li, Hang; Feng, Yuqiang; Jiang, Yu; Liu, Ying
2012-01-01
The traditional assessment approach, in which one single written examination counts toward a student's total score, no longer meets new demands of programming language education. Based on a peer code review process model, we developed an online assessment system called "EduPCR" and used a novel approach to assess the learning of computer…
National Infrastructure Protection Plan
2006-01-01
...effective and efficient CI/KR protection; and • Provide a system for continuous measurement and improvement of CI/KR... information-based core processes, a top-down system-, network-, or function-based approach may be more appropriate. A bottom-up approach normally... e-commerce, e-mail, and R&D systems. • Control Systems: Cyber systems used within many infrastructures and industries to monitor and
Jinghao Li; John F. Hunt; Shaoqin Gong; Zhiyong Cai
2016-01-01
This paper presents a simplified analytical model and a balanced design approach for modeling lightweight wood-based structural panels in bending. Because many design parameters must be input to a finite element analysis (FEA) model during the preliminary design process and optimization, an equivalent method was developed to analyze the mechanical...
ERIC Educational Resources Information Center
Gammage, David T.
2008-01-01
Purpose: The purpose of this paper is to explore how the process of implementation of school-based management (SBM) has worked within the public school systems in the Australian Capital Territory (ACT) and Victoria in Australia. The period covered was 1976-2006. Design/methodology/approach: The approach adopted was the mixed methodology which…
Karayanidis, Frini; Jamadar, Sharna; Ruge, Hannes; Phillips, Natalie; Heathcote, Andrew; Forstmann, Birte U.
2010-01-01
Recent research has taken advantage of the temporal and spatial resolution of event-related brain potentials (ERPs) and functional magnetic resonance imaging (fMRI) to identify the time course and neural circuitry of preparatory processes required to switch between different tasks. Here we overview some key findings contributing to understanding strategic processes in advance preparation. Findings from these methodologies are compatible with advance preparation conceptualized as a set of processes activated for both switch and repeat trials, but with substantial variability as a function of individual differences and task requirements. We then highlight new approaches that attempt to capitalize on this variability to link behavior and brain activation patterns. One approach examines correlations among behavioral, ERP and fMRI measures. A second “model-based” approach accounts for differences in preparatory processes by estimating quantitative model parameters that reflect latent psychological processes. We argue that integration of behavioral and neuroscientific methodologies is key to understanding the complex nature of advance preparation in task-switching. PMID:21833196
Qureshi, Adil
2005-01-01
Effective intercultural psychotherapy generally has been conceptualized in terms of a specific knowledge and skills base, combined with relevant attention to the practitioner's cultural attitudes and beliefs. Although such an approach continues to be the gold standard in the field, it has yet to be demonstrated that these components are either necessary or sufficient for effective treatment. This paper presents an approach to intercultural therapy based on Gadamer's philosophical hermeneutics. Humans are always in the process of making sense of the world around them, a process which is predicated on culturally given preunderstandings. Cultural difference means that the preunderstandings are rarely mutual, and therefore, communication and psychotherapy are often problematic. These preunderstandings often show up in the form of racial and ethnic prejudice and the therapist is rarely aware of this. Therapist preunderstanding influences all aspects of the psychotherapy process, such as treatment planning, interventions chosen, and the therapeutic relationship. Recommendations are given for improving the intercultural therapy process, and draw strongly on the twin notions of the dialogical relationship and cultural imagination.
Robust PLS approach for KPI-related prediction and diagnosis against outliers and missing data
NASA Astrophysics Data System (ADS)
Yin, Shen; Wang, Guang; Yang, Xu
2014-07-01
In practical industrial applications, key performance indicator (KPI)-related prediction and diagnosis are quite important for product quality and economic benefits. To meet these requirements, many advanced prediction and monitoring approaches have been developed, which can be classified into model-based or data-driven techniques. Among these approaches, partial least squares (PLS) is one of the most popular data-driven methods due to its simplicity and easy implementation in large-scale industrial processes. As PLS is entirely based on measured process data, the characteristics of the process data are critical for its success. Outliers and missing values are two common characteristics of measured data that can severely affect the effectiveness of PLS. To ensure the applicability of PLS in practical industrial applications, this paper introduces a robust version of PLS that deals with outliers and missing values simultaneously. The effectiveness of the proposed method is demonstrated by application results for KPI-related prediction and diagnosis on an industrial benchmark of the Tennessee Eastman process.
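For readers who want the flavor of the idea, here is a minimal stand-in (not the paper's algorithm): make standard PLS usable on data with outliers and missing values via median imputation and a robust z-score filter.

```python
# A simple robustified PLS fit: impute missing values, drop rows with
# extreme robust z-scores, then fit ordinary PLS. This is a sketch of
# the general idea only, not the paper's specific robust PLS method.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.impute import SimpleImputer

def robust_pls_fit(X, y, n_components=2, z_cut=3.5):
    X = SimpleImputer(strategy="median").fit_transform(X)  # missing values
    med = np.median(X, axis=0)
    mad = np.median(np.abs(X - med), axis=0) + 1e-12
    z = 0.6745 * (X - med) / mad                           # robust z-scores
    keep = (np.abs(z) < z_cut).all(axis=1)                 # drop outlier rows
    model = PLSRegression(n_components=n_components)
    model.fit(X[keep], np.asarray(y)[keep])
    return model
```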
ALFA: The new ALICE-FAIR software framework
NASA Astrophysics Data System (ADS)
Al-Turany, M.; Buncic, P.; Hristov, P.; Kollegger, T.; Kouzinopoulos, C.; Lebedev, A.; Lindenstruth, V.; Manafov, A.; Richter, M.; Rybalchenko, A.; Vande Vyvre, P.; Winckler, N.
2015-12-01
The commonalities between the ALICE and FAIR experiments and their computing requirements led to the development of large parts of a common software framework in an experiment-independent way. The FairRoot project has already shown the feasibility of such an approach for the FAIR experiments and of extending it beyond FAIR to experiments at other facilities [1, 2]. The ALFA framework is a joint development between the ALICE Online-Offline (O2) and FairRoot teams. ALFA is designed as a flexible, elastic system which balances reliability and ease of development with performance using multi-processing and multi-threading. A message-based approach has been adopted; such an approach will support the use of the software on different hardware platforms, including heterogeneous systems. Each process in ALFA assumes limited communication and reliance on other processes. Such a design adds horizontal scaling (multiple processes) to the vertical scaling provided by multiple threads to meet computing and throughput demands. ALFA does not dictate any application protocols; potentially, any content-based processor or any source can change the application protocol. The framework supports different serialization standards for data exchange between different hardware and software languages.
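An illustrative Python analogue of this message-based, share-nothing pattern (ALFA itself is C++ built on message queues): independent processes connected only by queues, with an end-of-stream marker instead of shared state.

```python
# Sketch of a two-stage message-passing pipeline: a producer and a
# worker process exchange serialized messages over queues and share
# no memory. Message contents are invented for illustration.
from multiprocessing import Process, Queue

def producer(out_q):
    for i in range(5):
        out_q.put({"event": i, "payload": i * i})  # serialized message
    out_q.put(None)                                # end-of-stream marker

def worker(in_q, out_q):
    while (msg := in_q.get()) is not None:
        msg["payload"] *= 2                        # some processing step
        out_q.put(msg)
    out_q.put(None)

if __name__ == "__main__":
    q1, q2 = Queue(), Queue()
    procs = [Process(target=producer, args=(q1,)),
             Process(target=worker, args=(q1, q2))]
    for p in procs:
        p.start()
    while (msg := q2.get()) is not None:
        print(msg)
    for p in procs:
        p.join()
```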
Machinery health prognostics: A systematic review from data acquisition to RUL prediction
NASA Astrophysics Data System (ADS)
Lei, Yaguo; Li, Naipeng; Guo, Liang; Li, Ningbo; Yan, Tao; Lin, Jing
2018-05-01
Machinery prognostics is one of the major tasks in condition-based maintenance (CBM); it aims to predict the remaining useful life (RUL) of machinery based on condition information. A machinery prognostic program generally consists of four technical processes, i.e., data acquisition, health indicator (HI) construction, health stage (HS) division, and RUL prediction. Over recent years, a significant amount of research work has been undertaken in each of the four processes, and much of the literature has provided excellent overviews of the last process, RUL prediction. However, there has not been a systematic review that covers all four technical processes comprehensively. To fill this gap, this paper provides a review of machinery prognostics following its whole program, i.e., from data acquisition to RUL prediction. First, in data acquisition, several prognostic datasets widely used in the academic literature are introduced systematically. Then, commonly used HI construction approaches and metrics are discussed. After that, the HS division process is summarized by introducing its major tasks and existing approaches. Afterwards, advancements in RUL prediction are reviewed, including the popular approaches and metrics. Finally, the paper discusses the current situation, upcoming challenges and possible future trends for researchers in this field.
Atypical resource allocation may contribute to many aspects of autism
Goldknopf, Emily J.
2013-01-01
Based on a review of the literature and on reports by people with autism, this paper suggests that atypical resource allocation is a factor that contributes to many aspects of autism spectrum conditions, including difficulties with language and social cognition, atypical sensory and attentional experiences, executive and motor challenges, and perceptual and conceptual strengths and weaknesses. Drawing upon resource theoretical approaches that suggest that perception, cognition, and action draw upon multiple pools of resources, the approach hypothesizes that compared with resources in typical cognition, resources in autism are narrowed or reduced, especially in people with strong sensory symptoms. In narrowed attention, resources are restricted to smaller areas and to fewer modalities, stages of processing, and cognitive processes than in typical cognition; narrowed resources may be more intense than in typical cognition. In reduced attentional capacity, overall resources are reduced; resources may be restricted to fewer modalities, stages of processing, and cognitive processes than in typical cognition, or the amount of resources allocated to each area or process may be reduced. Possible neural bases of the hypothesized atypical resource allocation, relations to other approaches, limitations, and tests of the hypotheses are discussed. PMID:24421760
Merilaita, Sami; Scott-Samuel, Nicholas E; Cuthill, Innes C
2017-07-05
For camouflage to succeed, an individual has to pass undetected, unrecognized or untargeted, and hence it is the processing of visual information that needs to be deceived. Camouflage is therefore an adaptation to the perception and cognitive mechanisms of another animal. Although this has been acknowledged for a long time, there has been no unitary account of the link between visual perception and camouflage. Viewing camouflage as a suite of adaptations to reduce the signal-to-noise ratio provides the necessary common framework. We review the main processes in visual perception and how animal camouflage exploits these. We connect the function of established camouflage mechanisms to the analysis of primitive features, edges, surfaces, characteristic features and objects (a standard hierarchy of processing in vision science). Compared to the commonly used research approach based on established camouflage mechanisms, we argue that our approach based on perceptual processes targeted by camouflage has several important benefits: specifically, it enables the formulation of more precise hypotheses and addresses questions that cannot even be identified when investigating camouflage only through the classic approach based on the patterns themselves. It also promotes a shift from the appearance to the mechanistic function of animal coloration. This article is part of the themed issue 'Animal coloration: production, perception, function and application'. © 2017 The Author(s).
Geocenter Coordinates from a Combined Processing of LEO and Ground-based GPS Observations
NASA Astrophysics Data System (ADS)
Männel, Benjamin; Rothacher, Markus
2017-04-01
The GPS observations provided by the global IGS (International GNSS Service) tracking network play an important role for the realization of a unique terrestrial reference frame that is accurate enough to allow the monitoring of the Earth's system. Combining these ground-based data with GPS observations tracked by high-quality dual-frequency receivers on-board Low Earth Orbiters (LEO) might help to further improve the realization of the terrestrial reference frame and the estimation of the geocenter coordinates, GPS satellite orbits and Earth rotation parameters (ERP). To assess the scope of improvement, we processed a network of 50 globally distributed and stable IGS-stations together with four LEOs (GRACE-A, GRACE-B, OSTM/Jason-2 and GOCE) over a time interval of three years (2010-2012). To ensure fully consistent solutions the zero-difference phase observations of the ground stations and LEOs were processed in a common least-square adjustment, estimating GPS orbits, LEO orbits, station coordinates, ERPs, site-specific tropospheric delays, satellite and receiver clocks and ambiguities. We present the significant impact of the individual LEOs and a combination of all four LEOs on geocenter coordinates derived by using a translational approach (also called network shift approach). In addition, we present geocenter coordinates derived from the same set of GPS observations by using a unified approach. This approach combines the translational and the degree-one approach by estimating translations and surface deformations simultaneously. Based on comparisons against each other and against geocenter time series derived by other techniques the effect of the selected approach is assessed.
Introduction to Message-Bus Architectures for Space Systems
NASA Technical Reports Server (NTRS)
Smith, Dan; Gregory, Brian
2005-01-01
This course presents technical and programmatic information on the development of message-based architectures for space mission ground and flight software systems. Message-based architectural approaches provide many significant advantages over the more traditional socket-based, one-of-a-kind integrated system development approaches. The course provides an overview of publish/subscribe concepts, the use of common isolation-layer APIs, approaches to message standardization, and other technical topics. Several examples of currently operational systems are discussed and possible changes to the system development process are presented. Benefits and lessons learned are discussed and time for questions and answers is provided.
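A minimal in-process sketch of the publish/subscribe concept the course covers; real ground-system buses add transport, persistence, and standardized message schemas.

```python
# Toy publish/subscribe bus: publishers address topics, never receivers,
# so components stay decoupled. Topic names and payloads are invented.
from collections import defaultdict

class MessageBus:
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subs[topic].append(handler)

    def publish(self, topic, message):
        for handler in self._subs[topic]:  # publisher never names receivers
            handler(message)

bus = MessageBus()
bus.subscribe("telemetry.power", lambda m: print("EPS monitor:", m))
bus.publish("telemetry.power", {"bus_voltage": 28.1})
```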
An experiment-based comparative study of fuzzy logic control
NASA Technical Reports Server (NTRS)
Berenji, Hamid R.; Chen, Yung-Yaw; Lee, Chuen-Chein; Murugesan, S.; Jang, Jyh-Shing
1989-01-01
An approach is presented to the control of a dynamic physical system through the use of approximate reasoning. The approach has been implemented in a program named POLE, and the authors have successfully built a prototype hardware system to solve the cartpole balancing problem in real-time. The approach provides a complementary alternative to the conventional analytical control methodology and is of substantial use when a precise mathematical model of the process being controlled is not available. A set of criteria for comparing controllers based on approximate reasoning and those based on conventional control schemes is furnished.
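To make the approximate-reasoning idea concrete, here is a toy fuzzy-control fragment (not the POLE implementation); the membership ranges and the three-rule table mapping pole angle to push force are invented.

```python
# Triangular memberships plus weighted-average defuzzification.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_force(angle):
    w_neg = tri(angle, -0.2, -0.1, 0.0)   # NegativeSmall -> push left
    w_zero = tri(angle, -0.1, 0.0, 0.1)   # Zero          -> no push
    w_pos = tri(angle, 0.0, 0.1, 0.2)     # PositiveSmall -> push right
    num = w_neg * -5.0 + w_zero * 0.0 + w_pos * 5.0
    den = w_neg + w_zero + w_pos
    return num / den if den else 0.0      # weighted-average defuzzification

print(fuzzy_force(0.05))  # 2.5: a moderate rightward push
```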
Rinaldi, Fabio; Schneider, Gerold; Kaljurand, Kaarel; Hess, Michael; Andronis, Christos; Konstandi, Ourania; Persidis, Andreas
2007-02-01
The amount of new discoveries (as published in the scientific literature) in the biomedical area is growing at an exponential rate. This growth makes it very difficult to filter the most relevant results, and thus the extraction of the core information becomes very expensive. Therefore, there is growing interest in text processing approaches that can deliver selected information from scientific publications and limit the amount of human intervention normally needed to gather those results. This paper presents and evaluates an approach aimed at automating the extraction of functional relations (e.g. interactions between genes and proteins) from scientific literature in the biomedical domain. The approach, using a novel dependency-based parser, is based on a complete syntactic analysis of the corpus. We have implemented a state-of-the-art text mining system for biomedical literature, based on a deep-linguistic, full-parsing approach. The results are validated on two different corpora: the manually annotated genomics information access (GENIA) corpus and the automatically annotated Arabidopsis thaliana circadian rhythms (ATCR) corpus. We show how a deep-linguistic approach (contrary to common belief) can be used in a real-world text mining application, offering high-precision relation extraction while at the same time retaining sufficient recall.
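A sketch of the general dependency-based extraction idea: the cited system uses its own deep-linguistic parser, whereas this stand-in uses spaCy (assuming the en_core_web_sm model is installed) to pull subject-verb-object triples from parse trees.

```python
# Extract (subject, verb, object) relation triples from dependency parses.
import spacy

nlp = spacy.load("en_core_web_sm")  # assumes this model is installed

def extract_relations(text):
    triples = []
    for sent in nlp(text).sents:
        for tok in sent:
            if tok.pos_ == "VERB":
                subjs = [c for c in tok.children if c.dep_ == "nsubj"]
                objs = [c for c in tok.children if c.dep_ in ("dobj", "obj")]
                for s in subjs:
                    for o in objs:
                        triples.append((s.text, tok.lemma_, o.text))
    return triples

print(extract_relations("CLOCK activates PER1. The kinase phosphorylates BMAL1."))
```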
A proactive approach to sustainable management of mine tailings
NASA Astrophysics Data System (ADS)
Edraki, Mansour; Baumgartl, Thomas
2015-04-01
Reactive strategies for managing mine tailings, i.e. containment of tailings slurries in tailings storage facilities (TSFs) and remediation of tailings solids or tailings seepage water after the decommissioning of those facilities, can be technically inefficient at eliminating environmental risks (e.g. preventing dispersion of contaminants and catastrophic dam wall failures), pose a long-term economic burden for companies, governments and society after mine closure, and often fail to meet community expectations. Most preventive environmental management practices promote proactive, integrated approaches to waste management whereby the sources of environmental issues are identified to help make more informed decisions. They often use life cycle assessment to find the "hot spots" of environmental burdens. This kind of approach is often based on generic data and has rarely been used for tailings. Besides, life cycle assessments are less useful for designing operations or simulating changes in the process and the consequent environmental outcomes. It is evident that an integrated approach to tailings research linked to better processing options is needed. A literature review revealed only a few examples of integrated approaches. The aim of this project is to develop new tailings management models by streamlining orebody characterization, process optimization and rehabilitation. The approach is based on continuous fingerprinting of geochemical processes from orebody to tailings storage facility, and benchmarks the success of such proactive initiatives by evidence of no impacts and no future projected impacts on receiving environments. We present an approach for developing such a framework and preliminary results from a case study in which combined grinding and flotation models, developed using geometallurgical data from the orebody, were constructed to predict the properties of tailings produced under various processing scenarios. The modelling scenarios based on the case study data provide the capacity to predict the composition of tailings and the resulting environmental management implications. For example, the type and content of clay minerals in tailings will affect geotechnical stability and water recovery. Clay content will also influence decisions made about paste or thickened tailings and underground backfilling. Using an integrated assessment framework makes it possible to evaluate more alternatives, including the production of additional saleable and benign streams, alternative tailings treatment and disposal, as well as options for reuse, recycling and pre-processing of existing tailings.
NASA Technical Reports Server (NTRS)
Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.
1995-01-01
Integrated Product and Process Development (IPPD) embodies the simultaneous application of both system and quality engineering methods throughout an iterative design process. The use of IPPD results in the time-conscious, cost-saving development of engineering systems. Georgia Tech has proposed the development of an Integrated Design Engineering Simulator that will merge Integrated Product and Process Development with interdisciplinary analysis techniques and state-of-the-art computational technologies. To implement IPPD, a Decision-Based Design perspective is encapsulated in an approach that focuses on the role of the human designer in product development. The approach has two parts and is outlined in this paper. First, an architecture, called DREAMS, is being developed that facilitates design from a decision-based perspective. Second, a supporting computing infrastructure, called IMAGE, is being designed. The current status of development is given and future directions are outlined.
The effect of individually-induced processes on image-based overlay and diffraction-based overlay
NASA Astrophysics Data System (ADS)
Oh, SeungHwa; Lee, Jeongjin; Lee, Seungyoon; Hwang, Chan; Choi, Gilheyun; Kang, Ho-Kyu; Jung, EunSeung
2014-04-01
In this paper, a set of wafers with separated processes was prepared and overlay measurement results were compared between two methods: image-based overlay (IBO) and diffraction-based overlay (DBO). Based on the experimental results, a theoretical treatment of the relationship between overlay mark deformation and overlay variation is presented. Moreover, overlay reading simulation was used to verify and predict overlay variation due to deformation of overlay marks caused by the induced processes. This study provides an understanding of individual process effects on overlay measurement error, along with a guideline for selecting the proper overlay measurement scheme for a specific layer.
Multiresponse imaging system design for improved resolution
NASA Technical Reports Server (NTRS)
Alter-Gartenberg, Rachel; Fales, Carl L.; Huck, Friedrich O.; Rahman, Zia-Ur; Reichenbach, Stephen E.
1991-01-01
Multiresponse imaging is a process that acquires A images, each with a different optical response, and reassembles them into a single image with an improved resolution that can approach 1/√A times the photodetector-array sampling lattice. Our goals are to optimize the performance of this process in terms of the resolution and fidelity of the restored image and to assess the amount of information required to do so. The theoretical approach is based on the extension of both image restoration and rate-distortion theories from their traditional realm of signal processing to image processing, which includes image gathering and display.
NASA Astrophysics Data System (ADS)
Müller, M. F.; Thompson, S. E.
2015-09-01
The prediction of flow duration curves (FDCs) in ungauged basins remains an important task for hydrologists given the practical relevance of FDCs for water management and infrastructure design. Predicting FDCs in ungauged basins typically requires spatial interpolation of statistical or model parameters. This task is complicated if climate becomes non-stationary, as the prediction challenge now also requires extrapolation through time. In this context, process-based models for FDCs that mechanistically link the streamflow distribution to climate and landscape factors may have an advantage over purely statistical methods for predicting FDCs. This study compares a stochastic (process-based) and a statistical method for FDC prediction in both stationary and non-stationary contexts, using Nepal as a case study. Under contemporary conditions, both models perform well in predicting FDCs, with Nash-Sutcliffe coefficients above 0.80 in 75 % of the tested catchments. The main drivers of uncertainty differ between the models: parameter interpolation was the main source of error for the statistical model, while violations of the assumptions of the process-based model represented the main source of its error. The process-based approach performed better than the statistical approach in numerical simulations with non-stationary climate drivers. The predictions of the statistical method under non-stationary rainfall conditions were poor if (i) local runoff coefficients were not accurately determined from the gauge network, or (ii) streamflow variability was strongly affected by changes in rainfall. A Monte Carlo analysis shows that the streamflow regimes in catchments characterized by strong wet-season runoff and a rapid, strongly non-linear hydrologic response are particularly sensitive to changes in rainfall statistics. In these cases, process-based prediction approaches are strongly favored over statistical models.
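The Nash-Sutcliffe efficiency used to score these predictions is NSE = 1 - Σ(obs - sim)² / Σ(obs - mean(obs))²; a small worked example with hypothetical flow quantiles:

```python
# Nash-Sutcliffe efficiency: 1.0 is a perfect fit, 0.0 means no better
# than predicting the observed mean. Data below are invented.
import numpy as np

def nse(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = [12.0, 8.5, 6.1, 4.0, 2.2]   # hypothetical flow quantiles (m^3/s)
sim = [11.2, 8.9, 5.8, 4.4, 2.0]
print(round(nse(obs, sim), 3))     # the paper reports > 0.80 in 75% of basins
```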
Authoring and verification of clinical guidelines: a model driven approach.
Pérez, Beatriz; Porres, Ivan
2010-08-01
The goal of this research is to provide a framework enabling the authoring and verification of clinical guidelines. The framework is part of a larger research project aimed at improving the representation, quality and application of clinical guidelines in daily clinical practice. The verification process of a guideline is based on (1) model-checking techniques to verify guidelines against semantic errors and inconsistencies in their definition, combined with (2) Model Driven Development (MDD) techniques, which enable us to automatically process manually created guideline specifications and the temporal-logic statements to be checked and verified against these specifications, making the verification process faster and more cost-effective. In particular, we use UML statecharts to represent the dynamics of guidelines and, based on these manually defined guideline specifications, we use an MDD-based tool chain to automatically generate the input model of a model checker. The model checker takes the resulting model together with the specific guideline requirements and verifies whether the guideline fulfils such properties. The overall framework has been implemented as an Eclipse plug-in named GBDSSGenerator which, starting from the UML statechart representing a guideline, allows verification of the guideline against specific requirements. Additionally, we have established a pattern-based approach for defining commonly occurring types of requirements in guidelines. We have successfully validated our overall approach by verifying properties in different clinical guidelines, resulting in the detection of some inconsistencies in their definition. The proposed framework allows (1) the authoring and (2) the verification of clinical guidelines against specific requirements defined using a set of property specification patterns, enabling non-experts to easily write formal specifications and thus easing the verification process. Copyright 2010 Elsevier Inc. All rights reserved.
Yield impact for wafer shape misregistration-based binning for overlay APC diagnostic enhancement
NASA Astrophysics Data System (ADS)
Jayez, David; Jock, Kevin; Zhou, Yue; Govindarajulu, Venugopal; Zhang, Zhen; Anis, Fatima; Tijiwa-Birk, Felipe; Agarwal, Shivam
2018-03-01
The importance of traditionally acceptable sources of variation has become more critical as semiconductor technologies continue to push into smaller technology nodes. New metrology techniques are needed to meet the process uniformity requirements of controllable lithography. Process control for lithography has the advantage of being able to adjust for cross-wafer variability, but this requires that all process tools/chambers be closely matched for each process. When this is not the case, the cumulative line variability creates identifiable groups of wafers [1]. This cumulative shape-based effect impacts overlay measurements and alignment by creating misregistration of the overlay marks. It is necessary to understand what requirements might go into developing a high-volume manufacturing approach that leverages this grouping methodology, its key inputs and outputs, and what can be extracted from such an approach. It will be shown that this line variability can be quantified as a loss of electrical yield, primarily at the edge of the wafer, and a methodology for root-cause identification and improvement is proposed. This paper covers the concept of wafer-shape-based grouping as a diagnostic tool for overlay control and containment, the challenges of implementing it in a manufacturing setting, and the limitations of this approach. This is accomplished by showing that there are identifiable wafer-shape-based signatures, which are shown to be correlated to overlay misregistration, primarily at the edge. It is also shown that by adjusting for this wafer shape signal, improvements can be made to both overlay and electrical yield, with an increase in edge yield and a reduction in yield variability.
DATA QUALITY OBJECTIVES-FOUNDATION OF A SUCCESSFUL MONITORING PROGRAM
The data quality objectives (DQO) process is a fundamental site characterization tool and the foundation of a successful monitoring program. The DQO process is a systematic planning approach based on the scientific method of inquiry. The process identifies the goals of data col...
SensePath: Understanding the Sensemaking Process Through Analytic Provenance.
Nguyen, Phong H; Xu, Kai; Wheat, Ashley; Wong, B L William; Attfield, Simon; Fields, Bob
2016-01-01
Sensemaking is described as the process of comprehension, finding meaning and gaining insight from information, producing new knowledge and informing further action. Understanding the sensemaking process allows building effective visual analytics tools to make sense of large and complex datasets. Currently, comprehending this process is often a manual and time-consuming undertaking: researchers collect observation data, transcribe screen capture videos and think-aloud recordings, identify recurring patterns, and eventually abstract the sensemaking process into a general model. In this paper, we propose a general approach to facilitate such a qualitative analysis process, and introduce a prototype, SensePath, to demonstrate the application of this approach with a focus on browser-based online sensemaking. The approach is based on a study of a number of qualitative research sessions, including observations of users performing sensemaking tasks and post hoc analyses to uncover their sensemaking processes. Based on the study results and a follow-up participatory design session with HCI researchers, we decided to focus on the transcription and coding stages of thematic analysis. SensePath automatically captures the user's sensemaking actions, i.e., analytic provenance, and provides multi-linked views to support their further analysis. A number of other requirements elicited from the design session were also implemented in SensePath, such as easy integration with existing qualitative analysis workflows and non-intrusiveness for participants. The tool was used by an experienced HCI researcher to analyze two sensemaking sessions. The researcher found the tool intuitive and that it considerably reduced analysis time, allowing a better understanding of the sensemaking process.
A disciplined approach to capital: today's healthcare imperative.
Dupuis, Patrick J; Kaufman, Kenneth
2007-07-01
BJC HealthCare's experience exemplifies several basic principles of a finance-based approach to capital. Organizations that adopt this approach look to improve processes first, remove costs second, and spend capital last. Multiyear planning is required to quantitatively identify the profitability and liquidity requirements of strategic initiatives and address essential funding and financing issues.
ERIC Educational Resources Information Center
Cermakova, Lucie; Moneta, Giovanni B.; Spada, Marcantonio M.
2010-01-01
This study investigated how attentional control and study-related dispositional flow influence students' approaches to studying when preparing for academic examinations. Based on information-processing theories, it was hypothesised that attentional control would be positively associated with deep and strategic approaches to studying, and…
Automating Media Centers and Small Libraries: A Microcomputer-Based Approach.
ERIC Educational Resources Information Center
Meghabghab, Dania Bilal
Although the general automation process can be applied to most libraries, small libraries and media centers require a customized approach. Using a systematic approach, this guide covers each step and aspect of automation in a small library setting, and combines the principles of automation with field-tested activities. After discussing needs…
Using Q Methodology in the Literature Review Process: A Mixed Research Approach
ERIC Educational Resources Information Center
Onwuegbuzie, Anthony J.; Frels, Rebecca K.
2015-01-01
Because of the mixed research-based nature of literature reviews, it is surprising, then, that insufficient information has been provided as to how reviewers can incorporate mixed research approaches into their literature reviews. Thus, in this article, we provide a mixed methods research approach--Q methodology--for analyzing information…
Language Learning in Mindbodyworld: A Sociocognitive Approach to Second Language Acquisition
ERIC Educational Resources Information Center
Atkinson, Dwight
2014-01-01
Based on recent research in cognitive science, interaction, and second language acquisition (SLA), I describe a sociocognitive approach to SLA. This approach adopts a "non-cognitivist" view of cognition: Instead of an isolated computational process in which input is extracted from the environment and used to build elaborate internal…
Dake, Gregory R; Fenster, Erik E; Patrick, Brian O
2008-09-05
A synthetic approach to the A-B ring system within the fusicoccane family of diterpenes is presented. Key steps in this approach are a diastereoselective Pauson-Khand reaction, a Norrish 1 photofragmentation, a Charette cyclopropanation, and a ring-closing metathesis process.
A Catalyst-for-Change Approach to Evaluation Capacity Building
ERIC Educational Resources Information Center
Garcia-Iriarte, Edurne; Suarez-Balcazar, Yolanda; Taylor-Ritzler, Tina; Luna, Maria
2011-01-01
Evaluation capacity building (ECB) has become a popular approach for helping community-based organizations (CBOs) to meet their funders' demands for accountability. This case study reports the ECB process with one staff member using a catalyst-for-change approach. The authors analyzed the role of the catalyst in diffusing evaluation knowledge and…
NASA Astrophysics Data System (ADS)
Zan, Tao; Wang, Min; Hu, Jianzhong
2010-12-01
Machining status monitoring with multiple sensors can acquire and analyze machining process information to implement abnormality diagnosis and fault warning. Statistical quality control is normally used to distinguish abnormal fluctuations from normal fluctuations through statistical methods. This paper compares the advantages and disadvantages of the two methods and introduces the necessity and feasibility of their integration and fusion. It then puts forward an approach that integrates multi-sensor status monitoring and statistical process control based on artificial intelligence, internet and database techniques. Based on virtual instrument techniques, the authors developed the machining quality assurance system MoniSysOnline, which has been used to monitor the grinding process. By analyzing the quality data and acoustic emission (AE) signal information from the wheel-dressing process, the cause of machining quality fluctuation was identified. The experimental results indicate that the approach is suitable for the status monitoring and analysis of machining processes.
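A bare-bones fusion of the two ingredients, sketched in Python: a sensor-derived feature stream checked against Shewhart 3-sigma control limits estimated from an in-control baseline (data and thresholds are invented).

```python
# Shewhart control chart over a monitored sensor feature (e.g. AE RMS).
import numpy as np

baseline = np.array([1.02, 0.98, 1.01, 0.99, 1.00, 1.03, 0.97])  # in control
mu, sigma = baseline.mean(), baseline.std(ddof=1)
ucl, lcl = mu + 3 * sigma, mu - 3 * sigma    # 3-sigma control limits

for t, x in enumerate([1.01, 1.04, 1.19]):   # live monitoring samples
    status = "ALARM" if not (lcl <= x <= ucl) else "ok"
    print(f"sample {t}: {x:.2f} ({status})")
```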
Review of Instructional Approaches in Ethics Education.
Mulhearn, Tyler J; Steele, Logan M; Watts, Logan L; Medeiros, Kelsey E; Mumford, Michael D; Connelly, Shane
2017-06-01
Increased investment in ethics education has prompted a variety of instructional objectives and frameworks. Yet, no systematic procedure to classify these varying instructional approaches has been attempted. In the present study, a quantitative clustering procedure was conducted to derive a typology of instruction in ethics education. In total, 330 ethics training programs were included in the cluster analysis. The training programs were appraised with respect to four instructional categories including instructional content, processes, delivery methods, and activities. Eight instructional approaches were identified through this clustering procedure, and these instructional approaches showed different levels of effectiveness. Instructional effectiveness was assessed based on one of nine commonly used ethics criteria. With respect to specific training types, Professional Decision Processes Training (d = 0.50) and Field-Specific Compliance Training (d = 0.46) appear to be viable approaches to ethics training based on Cohen's d effect size estimates. By contrast, two commonly used approaches, General Discussion Training (d = 0.31) and Norm Adherence Training (d = 0.37), were found to be considerably less effective. The implications for instruction in ethics training are discussed.
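The effect sizes quoted above are Cohen's d, i.e., the difference between group means divided by the pooled standard deviation; a minimal computation with hypothetical post-test scores:

```python
# Cohen's d for two independent groups.
import numpy as np

def cohens_d(treat, ctrl):
    treat, ctrl = np.asarray(treat, float), np.asarray(ctrl, float)
    n1, n2 = len(treat), len(ctrl)
    pooled = np.sqrt(((n1 - 1) * treat.var(ddof=1) +
                      (n2 - 1) * ctrl.var(ddof=1)) / (n1 + n2 - 2))
    return (treat.mean() - ctrl.mean()) / pooled

# Hypothetical post-test ethics scores (trained vs. untrained groups):
print(round(cohens_d([78, 82, 75, 80], [72, 74, 70, 76]), 2))
```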
NASA Astrophysics Data System (ADS)
Yoto
2017-09-01
Vocational high schools (Sekolah Menengah Kejuruan / SMK) aim to prepare mid-level skilled workers for industry who are also able to create self-employment opportunities. For those reasons, the SMK curriculum should be based on meeting the needs of industry and should prepare learners to master competences in accordance with the skills program of their choice. A production-based curriculum is one in which the learning process is designed together with the production process, or uses the production process as a learning medium. The primary intention of this approach is to introduce students to a real working environment, not merely simulations. In the production-based curriculum implementation model, students are directly involved in industry through industrial working practices, work in production units at school, and do practical work at school performing jobs as they are done in industry, using industry-standard machines.
Band, Rebecca; Bradbury, Katherine; Morton, Katherine; May, Carl; Michie, Susan; Mair, Frances S; Murray, Elizabeth; McManus, Richard J; Little, Paul; Yardley, Lucy
2017-02-23
This paper describes the intervention planning process for the Home and Online Management and Evaluation of Blood Pressure (HOME BP), a digital intervention to promote hypertension self-management. It illustrates how a Person-Based Approach can be integrated with theory- and evidence-based approaches. The Person-Based Approach to intervention development emphasises the use of qualitative research to ensure that the intervention is acceptable, persuasive, engaging and easy to implement. Our intervention planning process comprised two parallel, integrated work streams, which combined theory-, evidence- and person-based elements. The first work stream involved collating evidence from a mixed methods feasibility study, a systematic review and a synthesis of qualitative research. This evidence was analysed to identify likely barriers and facilitators to uptake and implementation as well as design features that should be incorporated in the HOME BP intervention. The second work stream used three complementary approaches to theoretical modelling: developing brief guiding principles for intervention design, causal modelling to map behaviour change techniques in the intervention onto the Behaviour Change Wheel and Normalisation Process Theory frameworks, and developing a logic model. The different elements of our integrated approach to intervention planning yielded important, complementary insights into how to design the intervention to maximise acceptability and ease of implementation by both patients and health professionals. From the primary and secondary evidence, we identified key barriers to overcome (such as patient and health professional concerns about side effects of escalating medication) and effective intervention ingredients (such as providing in-person support for making healthy behaviour changes). Our guiding principles highlighted unique design features that could address these issues (such as online reassurance and procedures for managing concerns). Causal modelling ensured that all relevant behavioural determinants had been addressed, and provided a complete description of the intervention. Our logic model linked the hypothesised mechanisms of action of our intervention to existing psychological theory. Our integrated approach to intervention development, combining theory-, evidence- and person-based approaches, increased the clarity, comprehensiveness and confidence of our theoretical modelling and enabled us to ground our intervention in an in-depth understanding of the barriers and facilitators most relevant to this specific intervention and user population.
NASA Astrophysics Data System (ADS)
Pathak, Jaideep; Wikner, Alexander; Fussell, Rebeckah; Chandra, Sarthak; Hunt, Brian R.; Girvan, Michelle; Ott, Edward
2018-04-01
A model-based approach to forecasting chaotic dynamical systems utilizes knowledge of the mechanistic processes governing the dynamics to build an approximate mathematical model of the system. In contrast, machine learning techniques have demonstrated promising results for forecasting chaotic systems purely from past time series measurements of system state variables (training data), without prior knowledge of the system dynamics. The motivation for this paper is the potential of machine learning for filling in the gaps in our underlying mechanistic knowledge that cause widely used knowledge-based models to be inaccurate. Thus, we here propose a general method that leverages the advantages of these two approaches by combining a knowledge-based model and a machine learning technique to build a hybrid forecasting scheme. Potential applications for such an approach are numerous (e.g., improving weather forecasting). We demonstrate and test the utility of this approach using a particular illustrative version of machine learning known as reservoir computing, and we apply the resulting hybrid forecaster to a low-dimensional chaotic system, as well as to a high-dimensional spatiotemporal chaotic system. These tests yield extremely promising results in that our hybrid technique is able to accurately predict for a much longer period of time than either its machine-learning component or its model-based component alone.
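A highly simplified stand-in for the hybrid idea: an imperfect mechanistic model plus a learned correction of its residuals. The paper trains a reservoir computer; here random tanh features with a ridge readout (a memoryless cousin of a reservoir) keep the sketch short, and the "dynamics" are a synthetic function rather than a chaotic system.

```python
# Hybrid = knowledge-based model + ML-learned residual correction.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 400)
truth = np.sin(x) + 0.3 * np.sin(3 * x)   # "true" dynamics
model = np.sin(x)                         # imperfect knowledge-based model

# Random tanh features: a crude, memoryless cousin of a reservoir.
W = rng.normal(scale=3.0, size=(50, 1))
b = rng.uniform(0, 2 * np.pi, 50)
Z = np.tanh(x[:, None] @ W.T + b)

idx = rng.permutation(400)
train, test = idx[:300], idx[300:]

# Ridge-regress the model's residual on the random features.
A = Z[train].T @ Z[train] + 1e-6 * np.eye(50)
w = np.linalg.solve(A, Z[train].T @ (truth[train] - model[train]))

hybrid = model + Z @ w
print("model-only MSE:", np.mean((truth[test] - model[test]) ** 2))
print("hybrid MSE    :", np.mean((truth[test] - hybrid[test]) ** 2))
```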
Disentangling sampling and ecological explanations underlying species-area relationships
Cam, E.; Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Alpizar-Jara, R.; Flather, C.H.
2002-01-01
We used a probabilistic approach to address the influence of sampling artifacts on the form of species-area relationships (SARs). We developed a model in which the increase in observed species richness is a function of sampling effort exclusively. We assumed that effort depends on area sampled, and we generated species-area curves under that model. These curves can be realistic looking. We then generated SARs from avian data, comparing SARs based on counts with those based on richness estimates. We used an approach to estimation of species richness that accounts for species detection probability and, hence, for variation in sampling effort. The slopes of SARs based on counts are steeper than those of curves based on estimates of richness, indicating that the former partly reflect failure to account for species detection probability. SARs based on estimates reflect ecological processes exclusively, not sampling processes. This approach permits investigation of ecologically relevant hypotheses. The slope of SARs is not influenced by the slope of the relationship between habitat diversity and area. In situations in which not all of the species are detected during sampling sessions, approaches to estimation of species richness integrating species detection probability should be used to investigate the rate of increase in species richness with area.
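A toy illustration of the sampling artifact: when detection probability rises with area (e.g., through greater effort), SAR slopes fitted to raw counts come out steeper than slopes fitted to corrected estimates. The correction S_obs / p used here is a simplification of the capture-recapture estimators the authors actually use.

```python
# Compare SAR slopes from raw counts vs. detection-corrected estimates.
import numpy as np

rng = np.random.default_rng(1)
areas = np.array([1, 10, 100, 1000])          # hypothetical plot areas
true_S = (20 * areas ** 0.25).astype(int)     # true richness, slope z = 0.25
p = 0.5 + 0.04 * np.log10(areas)              # detection rises with area

obs_S = rng.binomial(true_S, p)               # detected species only
est_S = obs_S / p                             # corrected richness estimate

z_obs = np.polyfit(np.log(areas), np.log(obs_S), 1)[0]
z_est = np.polyfit(np.log(areas), np.log(est_S), 1)[0]
print(f"SAR slope from counts: {z_obs:.3f}, from estimates: {z_est:.3f}")
```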
NASA Astrophysics Data System (ADS)
Tang, Jian; Qiao, Junfei; Wu, ZhiWei; Chai, Tianyou; Zhang, Jian; Yu, Wen
2018-01-01
Frequency spectral data from mechanical vibration and acoustic signals relate to difficult-to-measure production quality and quantity parameters of complex industrial processes. A selective ensemble (SEN) algorithm can be used to build a soft sensor model of these process parameters by selectively fusing valued information from different perspectives. However, a combination of several optimized ensemble sub-models with SEN cannot guarantee the best prediction model. In this study, we use several techniques to construct a data-driven industrial process parameter model from mechanical vibration and acoustic frequency spectra, based on selective fusion of multi-condition samples and multi-source features. A multi-layer SEN (MLSEN) strategy is used to simulate the domain expert's cognitive process. A genetic algorithm and kernel partial least squares are used to construct the inside-layer SEN sub-model based on each mechanical vibration and acoustic frequency spectral feature subset. Branch-and-bound and adaptive weighted fusion algorithms are integrated to select and combine the outputs of the inside-layer SEN sub-models; then the outside-layer SEN is constructed. Thus, "sub-sampling training examples"-based and "manipulating input features"-based ensemble construction methods are integrated, realizing a selective information fusion process based on multi-condition history samples and multi-source input features. This novel approach is applied to a laboratory-scale ball mill grinding process. A comparison with other methods indicates that the proposed MLSEN approach effectively models mechanical vibration and acoustic signals.
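A bare-bones selective-ensemble step in the spirit of the outside layer: from several candidate sub-models, keep the subset whose adaptively weighted fusion minimizes validation error. The GA/KPLS inside layer and branch-and-bound search of the paper are omitted; the exhaustive search below is only workable for a handful of models.

```python
# Selective ensemble via exhaustive subset search + inverse-error weights.
import numpy as np
from itertools import combinations

def fuse(preds, y_val):
    """preds: (n_models, n_samples) validation predictions."""
    best = None
    for k in range(1, len(preds) + 1):
        for subset in combinations(range(len(preds)), k):
            sub = preds[list(subset)]
            err = ((sub - y_val) ** 2).mean(axis=1)
            w = 1 / (err + 1e-12); w /= w.sum()          # adaptive weights
            mse = (((w[:, None] * sub).sum(0) - y_val) ** 2).mean()
            if best is None or mse < best[0]:
                best = (mse, subset, w)
    return best

y = np.array([1.0, 2.0, 3.0, 4.0])
preds = np.array([[1.1, 2.1, 2.9, 4.2],    # decent sub-model
                  [0.9, 1.8, 3.2, 3.9],    # decent sub-model
                  [3.0, 0.5, 5.0, 1.0]])   # poor sub-model, should be dropped
mse, subset, w = fuse(preds, y)
print("selected sub-models:", subset, "fused MSE:", round(mse, 4))
```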
NASA Astrophysics Data System (ADS)
Ghaemi, Z.; Farnaghi, M.; Alimohammadi, A.
2015-12-01
The critical impact of air pollution on human health and the environment on the one hand, and the complexity of pollutant concentration behavior on the other, lead scientists to look for advanced techniques for monitoring and predicting urban air quality. Additionally, recent developments in measurement techniques have led to the collection of various types of data about air quality. Such data is extremely voluminous and, to be useful, it must be processed at high velocity. Due to the complexity of big data analysis, especially for dynamic applications, online forecasting of pollutant concentration trends within a reasonable processing time is still an open problem. The purpose of this paper is to present an online forecasting approach based on Support Vector Machine (SVM) to predict air quality one day in advance. In order to meet the computational requirements of large-scale data analysis, distributed computing based on the Hadoop platform has been employed to leverage the processing power of multiple processing units. The MapReduce programming model is adopted for massive parallel processing in this study. Based on the online algorithm and Hadoop framework, an online forecasting system is designed to predict the air pollution of Tehran for the next 24 hours. The results have been assessed on the basis of processing time and efficiency. Quite accurate predictions of air pollutant indicator levels within an acceptable processing time prove that the presented approach is very suitable for tackling large-scale air pollution prediction problems.
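A single-node sketch of the learning step (the paper distributes this over Hadoop/MapReduce); the features, data, and hyperparameters below are invented for illustration.

```python
# Next-day pollutant prediction with support vector regression.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(7)
# Rows: [today's PM level, temperature, wind speed]; target: tomorrow's PM.
X = rng.uniform([20, -5, 0], [180, 35, 12], size=(500, 3))
y = 0.7 * X[:, 0] - 2.0 * X[:, 2] + rng.normal(0, 5, 500)  # synthetic rule

model = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=1.0))
model.fit(X[:400], y[:400])
print("next-day PM estimate:", model.predict([[120.0, 10.0, 3.0]])[0])
```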
Bidding-based autonomous process planning and scheduling
NASA Astrophysics Data System (ADS)
Gu, Peihua; Balasubramanian, Sivaram; Norrie, Douglas H.
1995-08-01
Improving productivity through computer integrated manufacturing systems (CIMS) and concurrent engineering requires that the islands of automation in an enterprise be completely integrated. The first step in this direction is to integrate design, process planning, and scheduling. This can be achieved through a bidding-based process planning approach. The product is represented in a STEP model with detailed design and administrative information, including design specifications, batch size, and due dates. Upon arrival at the manufacturing facility, the product is registered with the shop floor manager, which is essentially a coordinating agent. The shop floor manager broadcasts the product's requirements to the machines. The shop contains autonomous machines that have knowledge about their functionality, capabilities, tooling, and schedules. Each machine has its own process planner and responds to the product's request in a way consistent with its capabilities and capacity. When more than one machine offers a given process for the same requirements, the machines enter into negotiation. Based on processing time, due date, and cost, one of the machines wins the contract. The successful machine updates its schedule and advises the product to request raw material for processing. The concept was implemented using a multi-agent system, with task decomposition and planning achieved through contract nets. Examples are included to illustrate the approach.
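The toy sketch below illustrates the bidding mechanics only: machine agents with their own capabilities, schedules, and rates bid on a requested process, and the cheapest feasible bid wins and updates its schedule. Class names and values are invented; this is not the authors' multi-agent system.

```python
# A toy contract-net style auction among autonomous machine agents.
from dataclasses import dataclass

@dataclass
class Machine:
    name: str
    processes: set
    busy_until: float = 0.0
    rate: float = 1.0                     # cost per hour

    def bid(self, process, hours, due):
        if process not in self.processes:
            return None                   # cannot offer this process
        finish = self.busy_until + hours
        if finish > due:
            return None                   # cannot meet the due date
        return (finish, hours * self.rate, self)

def award(machines, process, hours, due):
    bids = [b for m in machines if (b := m.bid(process, hours, due))]
    if not bids:
        return None
    finish, cost, winner = min(bids, key=lambda b: (b[1], b[0]))
    winner.busy_until = finish            # winner updates its schedule
    return winner.name, cost

shop = [Machine("mill-1", {"milling", "drilling"}, rate=2.0),
        Machine("mill-2", {"milling"}, rate=1.5),
        Machine("lathe-1", {"turning"}, rate=1.0)]
for proc in ["milling", "drilling", "turning"]:
    print(proc, "->", award(shop, proc, hours=3.0, due=10.0))
```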
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garcia, Humberto E.; Simpson, Michael F.; Lin, Wen-Chiao
In this paper, we apply an advanced safeguards approach and associated methods for process monitoring to a hypothetical nuclear material processing system. The assessment regarding the state of the processing facility is conducted at a system-centric level formulated in a hybrid framework, which utilizes an architecture that integrates both time- and event-driven data and analysis for decision making. While the time-driven layers of the proposed architecture encompass more traditional process monitoring methods based on time-series data and analysis, the event-driven layers encompass operation monitoring methods based on discrete-event data and analysis. By integrating process- and operation-related information and methodologies within a unified framework, the task of anomaly detection is greatly improved, because decision making can benefit not only from known time-series relationships among measured signals but also from known event-sequence relationships among generated events. This available knowledge at both the time-series and discrete-event layers can then be effectively used to synthesize observation solutions that optimally balance sensor and data processing requirements. The proposed approach is applied to an illustrative monitored system based on pyroprocessing, and the results are discussed.
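A minimal sketch of the hybrid idea, with invented data, thresholds, and transition table: a time-driven layer flags deviations of a measured signal from a learned baseline, while an event-driven layer checks the observed operation sequence against allowed transitions.

```python
# Toy hybrid monitoring: combine a time-series test with an event check.
import numpy as np

rng = np.random.default_rng(3)
signal = rng.normal(10.0, 0.5, 500)
signal[300:] += 1.5                       # simulated process drift

# Time-driven layer: z-score against a baseline learned on early data
mu, sigma = signal[:200].mean(), signal[:200].std()
z = np.abs(signal - mu) / sigma
time_alarm = z > 3

# Event-driven layer: check the operation sequence against allowed
# transitions (a discrete-event stand-in for the paper's event analysis)
allowed = {("idle", "load"), ("load", "process"), ("process", "unload"),
           ("unload", "idle")}
events = ["idle", "load", "process", "unload", "idle", "load", "unload"]
event_alarm = any((a, b) not in allowed for a, b in zip(events, events[1:]))

print("time-layer alarms:", int(time_alarm.sum()),
      "event-layer alarm:", event_alarm)
```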
A New Approach to Create Image Control Networks in ISIS
NASA Astrophysics Data System (ADS)
Becker, K. J.; Berry, K. L.; Mapel, J. A.; Walldren, J. C.
2017-06-01
A new approach was used to create a feature-based control point network that required the development of new tools in the Integrated Software for Imagers and Spectrometers (ISIS3) system to process very large datasets.
DOT National Transportation Integrated Search
2010-02-01
This guidebook presents an approach for integrating management and operations (M&O) strategies into the metropolitan transportation planning process that is designed to maximize the performance of the existing and planned transportation system. This ...
Mak, Winnie W S; Chan, Amy T Y; Cheung, Eliza Y L; Lin, Cherry L Y; Ngai, Karin C S
2015-01-19
With increasing evidence demonstrating the effectiveness of Web-based interventions and mindfulness-based training in improving health, delivering mindfulness training online is an attractive proposition. The aim of this study was to evaluate the efficacy of two Internet-based interventions (basic mindfulness and Health Action Process Approach enhanced mindfulness) against a waitlist control. Health Action Process Approach (HAPA) principles were used to enhance participants' efficacy and planning. Participants were recruited online and offline at local universities; 321 university students and staff were randomly assigned to three conditions. The basic and HAPA-enhanced groups completed the 8-week fully automated mindfulness training online. All participants (including control) were asked to complete an online questionnaire pre-program, post-program, and at 3-month follow-up. A significant group-by-time interaction effect was found. The HAPA-enhanced group showed significantly higher levels of mindfulness from pre-intervention to post-intervention, and such improvement was sustained at follow-up. Both the basic and HAPA-enhanced mindfulness groups showed better mental well-being from pre-intervention to post-intervention, and improvement was sustained at 3-month follow-up. Online mindfulness training can improve mental health. An online platform is a viable medium to implement and disseminate evidence-based interventions and is a highly scalable approach to reach the general public. Chinese Clinical Trial Registry (ChiCTR): ChiCTR-TRC-12002954; http://www.chictr.org/en/proj/show.aspx?proj=3904 (Archived by WebCite at http://www.webcitation.org/6VCdG09pA).
Integral processing in beyond-Hartree-Fock calculations
NASA Technical Reports Server (NTRS)
Taylor, P. R.
1986-01-01
The increasing rate at which improvements in processing capacity outstrip improvements in input/output performance of large computers has led to recent attempts to bypass generation of a disk-based integral file. The direct self-consistent field (SCF) method of Almlof and co-workers represents a very successful implementation of this approach. This paper is concerned with the extension of this general approach to configuration interaction (CI) and multiconfiguration-self-consistent field (MCSCF) calculations. After a discussion of the particular types of molecular orbital (MO) integrals for which -- at least for most current generation machines -- disk-based storage seems unavoidable, it is shown how all the necessary integrals can be obtained as matrix elements of Coulomb and exchange operators that can be calculated using a direct approach. Computational implementations of such a scheme are discussed.
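For reference, the Coulomb and exchange operator matrix elements in question take the standard form below, written over a one-particle density matrix D and two-electron integrals in Mulliken notation (textbook conventions, not necessarily the paper's):

```latex
J_{pq} = \sum_{rs} D_{rs}\,(pq|rs), \qquad
K_{pq} = \sum_{rs} D_{rs}\,(pr|qs)
```

In a direct approach, the (pq|rs) are recomputed on the fly and contracted immediately into J and K, so no disk-based integral file is needed.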
Neuroscientific Model of Motivational Process
Kim, Sung-il
2013-01-01
Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment. PMID:23459598
Process-Based Mission Assurance- Knowledge Management System
NASA Astrophysics Data System (ADS)
Kantzes, Zachary S.; Wander, Stephen; Otero, Suzanne; Vantine, William; Stuart, Richard
2005-12-01
The Process-Based Mission Assurance - Knowledge Management System (PBMA-KMS) implemented at the National Aeronautics and Space Administration (NASA) focuses on the practical application of knowledge management (KM) theory and is based on a systems engineering management approach coupled with a continual improvement and risk management philosophy. Rather than an Agency mandate, an intense focus has been placed on grassroots input to the future of the product. By emphasizing both Agency safety and mission success objectives and individual users' needs, the PBMA-KMS team has been able to be both reactive to Agency requirements and proactive to the needs of the community. PBMA-KMS is an excellent case study in how to use new approaches to facilitate and integrate safety into the culture of an organization. Principal discussion topics include: overarching themes, tactical approaches, highlights of key functionalities, and the Agency KM approach of managed Darwinism. PBMA-KMS shows how, by providing top-level guidance along with the necessary tools and support, the organization receives not only immediate value but also the long-range benefits of a more experienced, effective, and engaged workforce.
Falkenmark, Malin
2003-12-29
The paper focuses on water's key functions behind ecosystem dynamics and the water-related balancing involved in a catchment-based ecosystem approach. A conceptual framework is developed to address fundamental trade-offs between humans and ecosystems, paying attention to society's unavoidable landscape modifications and their unavoidable ecological effects mediated by water processes. Because the coevolution of societal and environmental processes indicates resonance rather than a cause-effect relationship, humanity will have to learn to live with change while securing ecosystem resilience. In view of the partial incompatibility of the social imperative of the millennium goals and their environmental sustainability goal, human activities and ecosystems have to be orchestrated for compatibility. To this end, a catchment-based approach has to be taken that integrates water, land use and ecosystems. It is suggested that ecosystem protection has to be thought of at two scales: site-specific biotic landscape components to be protected for their social value, and a catchment-based ecosystem approach to secure a sustainable supply of the crucial ecosystem goods and services on which social and economic development depends.
Job Skills Education Program. Design Specifications
1985-03-01
training approach is supplied in part by research based on the depth-of-processing paradigm (Craik & Lockhart, 1972; Craik & Tulving, 1975), which... discussion here develops a rationale for the approach, which is consistent with research on incidental learning (Craik & Lockhart, 1972; Craik & Tulving, 1975)... this meeting, a plan evolved to integrate available RCA results and contract products into the JSEP design. During the Task 1 in-process review, the
NASA Astrophysics Data System (ADS)
Ranfagni, Anedio; Mugnai, Daniela; Cacciari, Ilaria
2018-05-01
The usefulness of a stochastic approach for determining time scales in tunneling processes (mainly, but not only, in the microwave range) is reconsidered and compared with a different approach to these kinds of processes, based on Feynman's transition elements. The latter method is found to be particularly suitable for interpreting situations in the near field, as emerges from the experimental cases considered here.
ERIC Educational Resources Information Center
Richards, Cameron
2015-01-01
The challenge of better reconciling individual and collective aspects of innovative problem-solving can be productively addressed to enhance the role of PBL as a key focus of the creative process in future higher education. This should involve "active learning" approaches supported by related processes of teaching, assessment and…
Bindings and RESTlets: A Novel Set of CoAP-Based Application Enablers to Build IoT Applications.
Teklemariam, Girum Ketema; Van Den Abeele, Floris; Moerman, Ingrid; Demeester, Piet; Hoebeke, Jeroen
2016-08-02
Sensors and actuators are becoming important components of Internet of Things (IoT) applications. Today, several approaches exist to facilitate communication between sensors and actuators in IoT applications. Most communication goes through often-proprietary gateways, requiring availability of the gateway for each and every interaction between sensors and actuators. Sometimes the gateway does some processing of the sensor data before triggering actuators; other approaches put this processing logic further away, in the cloud. These approaches introduce significant latency and an increased number of packets. In this paper, we introduce a CoAP-based mechanism for direct binding of sensors and actuators. This flexible binding solution is further utilized to build IoT applications through RESTlets. RESTlets are defined to accept inputs and produce outputs after performing some processing tasks. Sensors and actuators can be associated with RESTlets (which can be hosted on any device) through the flexible binding mechanism we introduce. This approach facilitates decentralized IoT application development by placing all or part of the processing logic in Low-Power and Lossy Networks (LLNs). We ran several tests comparing the performance of our solution with existing solutions and found that our solution reduces communication delay and the number of packets in the LLN. PMID:27490554
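As a client-side illustration of gateway-free interaction (not the paper's binding API), the sketch below reads a CoAP sensor and drives a CoAP actuator directly, with a one-line threshold rule playing the role of a RESTlet's processing step. It uses the aiocoap library's standard client calls; the URIs and resource names are made up.

```python
# Direct sensor-to-actuator CoAP interaction with a tiny processing rule.
import asyncio
from aiocoap import Context, Message, GET, PUT

async def main():
    ctx = await Context.create_client_context()

    # Read the sensor resource directly (no gateway in the path)
    resp = await ctx.request(
        Message(code=GET, uri="coap://[fd00::1]/sensors/temp")).response
    temp = float(resp.payload.decode())

    # "RESTlet"-style processing: input -> decision -> output
    state = b"on" if temp > 25.0 else b"off"

    # Push the result to the actuator resource
    await ctx.request(
        Message(code=PUT, uri="coap://[fd00::2]/actuators/fan",
                payload=state)).response

asyncio.run(main())
```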
Toward a model-based cognitive neuroscience of mind wandering.
Hawkins, G E; Mittner, M; Boekel, W; Heathcote, A; Forstmann, B U
2015-12-03
People often "mind wander" during everyday tasks, temporarily losing track of time, place, or current task goals. In laboratory-based tasks, mind wandering is often associated with performance decrements in behavioral variables and changes in neural recordings. Such empirical associations provide descriptive accounts of mind wandering - how it affects ongoing task performance - but fail to provide true explanatory accounts - why it affects task performance. In this perspectives paper, we consider mind wandering as a neural state or process that affects the parameters of quantitative cognitive process models, which in turn affect observed behavioral performance. Our approach thus uses cognitive process models to bridge the explanatory divide between neural and behavioral data. We provide an overview of two general frameworks for developing a model-based cognitive neuroscience of mind wandering. The first approach uses neural data to segment observed performance into a discrete mixture of latent task-related and task-unrelated states, and the second regresses single-trial measures of neural activity onto structured trial-by-trial variation in the parameters of cognitive process models. We discuss the relative merits of the two approaches, and the research questions they can answer, and highlight that both approaches allow neural data to provide additional constraint on the parameters of cognitive models, which will lead to a more precise account of the effect of mind wandering on brain and behavior. We conclude by summarizing prospects for mind wandering as conceived within a model-based cognitive neuroscience framework, highlighting the opportunities for its continued study and the benefits that arise from using well-developed quantitative techniques to study abstract theoretical constructs. Copyright © 2015 The Authors. Published by Elsevier Ltd.. All rights reserved.
Advanced biologically plausible algorithms for low-level image processing
NASA Astrophysics Data System (ADS)
Gusakova, Valentina I.; Podladchikova, Lubov N.; Shaposhnikov, Dmitry G.; Markin, Sergey N.; Golovan, Alexander V.; Lee, Seong-Whan
1999-08-01
At present, in computer vision, the approach based on modeling biological vision mechanisms is being extensively developed. However, up to now, real-world image processing has had no effective solution within either biologically inspired or conventional frameworks. Evidently, new algorithms and system architectures based on advanced biological motivation should be developed to solve the computational problems related to this visual task. A basic problem that must be solved to create an effective artificial visual system for real-world images is the search for new low-level image processing algorithms, which to a great extent determine system performance. In the present paper, the results of psychophysical experiments and several advanced biologically motivated algorithms for low-level processing are presented. These algorithms are based on local space-variant filtering, contextual encoding of the visual information presented at the center of the input window, and automatic detection of perceptually important image fragments. The core of the latter algorithm is the use of local feature conjunctions, such as noncollinear oriented segments, and the formation of composite feature maps. The developed algorithms were integrated into a foveal active vision model, MARR. It is expected that the proposed algorithms may significantly improve model performance in real-world image processing during memorization, search, and recognition.
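As a conventional stand-in for such oriented low-level filtering (not the MARR model itself), the sketch below applies a small bank of Gabor filters and takes the maximum orientation energy as a crude saliency cue; parameters are illustrative.

```python
# A Gabor filter bank as a simple oriented-segment detector.
import numpy as np
from skimage import data
from skimage.filters import gabor

image = data.camera().astype(float) / 255.0
orientations = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]

# Energy map per orientation; the max over orientations gives a crude
# "perceptually important fragment" saliency cue.
energy = [np.hypot(*gabor(image, frequency=0.2, theta=t))
          for t in orientations]
saliency = np.max(energy, axis=0)
print("saliency range:", saliency.min().round(3), saliency.max().round(3))
```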
Sarkar, Sumona; Lund, Steven P; Vyzasatya, Ravi; Vanguri, Padmavathy; Elliott, John T; Plant, Anne L; Lin-Gibson, Sheng
2017-12-01
Cell counting measurements are critical in the research, development and manufacturing of cell-based products, yet determining cell quantity with accuracy and precision remains a challenge. Validating and evaluating a cell counting measurement process can be difficult because of the lack of appropriate reference material. Here we describe an experimental design and statistical analysis approach to evaluate the quality of a cell counting measurement process in the absence of appropriate reference materials or reference methods. The experimental design is based on a dilution series study with replicate samples and observations as well as measurement process controls. The statistical analysis evaluates the precision and proportionality of the cell counting measurement process and can be used to compare the quality of two or more counting methods. As an illustration of this approach, cell counting measurement processes (automated and manual methods) were compared for a human mesenchymal stromal cell (hMSC) preparation. For the hMSC preparation investigated, results indicated that the automated method performed better than the manual counting methods in terms of precision and proportionality. By conducting well controlled dilution series experimental designs coupled with appropriate statistical analysis, quantitative indicators of repeatability and proportionality can be calculated to provide an assessment of cell counting measurement quality. This approach does not rely on the use of a reference material or comparison to "gold standard" methods known to have limited assurance of accuracy and precision. The approach presented here may help the selection, optimization, and/or validation of a cell counting measurement process. Published by Elsevier Inc.
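A simplified rendering of the dilution-series logic, with simulated counts: regress observed counts on the dilution fraction to check proportionality (intercept near zero) and compute the coefficient of variation of replicates as a precision indicator. This compresses the paper's statistical analysis into two crude indicators.

```python
# Dilution-series check of proportionality and precision (simulated data).
import numpy as np

rng = np.random.default_rng(5)
fractions = np.repeat([1.0, 0.75, 0.5, 0.25], 4)     # 4 replicates each
true_conc = 1.0e6                                    # cells/mL in the stock
counts = rng.poisson(true_conc * fractions) * 1.0

# Proportionality: a proportional process gives an intercept near zero
slope, intercept = np.polyfit(fractions, counts, 1)
print(f"slope={slope:.3g}  intercept/slope={intercept / slope:.3%}")

# Precision: coefficient of variation within each dilution level
for f in np.unique(fractions):
    c = counts[fractions == f]
    print(f"fraction {f}: CV = {c.std(ddof=1) / c.mean():.2%}")
```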
Cost Models for MMC Manufacturing Processes
NASA Technical Reports Server (NTRS)
Elzey, Dana M.; Wadley, Haydn N. G.
1996-01-01
Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory, and there is growing interest in their transition to industrial production. However, research conducted to date has focused almost exclusively on overcoming the technical barriers to producing high-quality material, and little attention has been given to the economic feasibility of these laboratory approaches and to process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes, and the resulting predicted quality-cost curves are presented and discussed.
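The sketch below conveys the QCM idea only in caricature: a toy process model maps a process condition (consolidation temperature) to yield, and the cost model lets scrap inflate the cost per good part, exposing a cost-quality trade-off. All functional forms and numbers are invented.

```python
# A caricature of a quantitative cost model: process condition -> yield -> cost.
import numpy as np

def quality_yield(temp):            # toy process model: yield vs. temperature
    return np.exp(-((temp - 900.0) / 60.0) ** 2)

def unit_cost(temp, base=100.0, energy_per_C=0.05):
    energy = energy_per_C * temp    # energy cost rises with temperature
    y = quality_yield(temp)
    return (base + energy) / max(y, 1e-6)   # scrap inflates cost per good part

for t in np.arange(750, 1050, 25):
    print(t, "C ->", round(unit_cost(t), 1), "$/part")
```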
NASA Astrophysics Data System (ADS)
Bieda, Bogusław; Grzesik, Katarzyna
2017-11-01
The study proposes a stochastic approach based on Monte Carlo (MC) simulation for the life cycle assessment (LCA) method, limited to a life cycle inventory (LCI) study of rare earth element (REE) recovery from secondary materials, applied to processes at the New Krankberg Mine in Sweden. The MC method is recognized as an important tool in science and can be considered the most effective quantification approach for uncertainties. The use of a stochastic approach characterizes uncertainties better than a deterministic method. Uncertainty in data can be expressed through a definition of the probability distribution of that data (e.g. through standard deviation or variance). The data used in this study are obtained from: (i) site-specific measured or calculated data, (ii) values based on literature, (iii) the ecoinvent process "rare earth concentrate, 70% REO, from bastnäsite, at beneficiation". Environmental emissions (e.g., particulates, uranium-238, thorium-232), energy and REEs (La, Ce, Nd, Pr, Sm, Dy, Eu, Tb, Y, Sc, Yb, Lu, Tm, Gd) have been inventoried. The study is based on a reference case for the year 2016. The combination of MC analysis with sensitivity analysis is the best way to quantify the uncertainty in the LCI/LCA. The reliability of LCA results may be uncertain to a certain degree, but this uncertainty can be quantified with the help of the MC method.
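A minimal Monte Carlo propagation of LCI uncertainty in the spirit described, with invented flows and characterization factors: each inventory flow is drawn from a lognormal distribution (a common LCA choice) and the distribution of a summed impact score is sampled.

```python
# Monte Carlo propagation of inventory uncertainty to an impact score.
import numpy as np

rng = np.random.default_rng(6)
n = 10_000

# flow: (geometric mean, geometric std dev) per kg of REE concentrate (invented)
flows = {"electricity_kWh": (35.0, 1.2),
         "particulates_g": (4.0, 1.5),
         "thorium232_mg": (0.8, 2.0)}
impact_factor = {"electricity_kWh": 0.5, "particulates_g": 2.0,
                 "thorium232_mg": 10.0}   # invented characterization factors

score = np.zeros(n)
for k, (gm, gsd) in flows.items():
    samples = rng.lognormal(np.log(gm), np.log(gsd), n)
    score += impact_factor[k] * samples

lo, med, hi = np.percentile(score, [2.5, 50, 97.5])
print(f"impact score: median {med:.1f}, 95% interval [{lo:.1f}, {hi:.1f}]")
```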
2017-01-01
Background The use of telemedicine technologies in health care has increased substantially, together with a growing interest in participatory design methods when developing telemedicine approaches. Objective We present lessons learned from a case study involving patients with heart disease and health care professionals in the development of a personalized Web-based health care intervention. Methods We used a participatory design approach inspired by the method for feasibility studies in software development. We collected qualitative data using multiple methods in 3 workshops and analyzed the data using thematic analysis. Participants were 7 patients with diagnosis of heart disease, 2 nurses, 1 physician, 2 systems architects, 3 moderators, and 3 observers. Results We present findings in 2 parts. (1) Outcomes of the participatory design process: users gave valuable feedback on ease of use of the platforms’ tracking tools, platform design, terminology, and insights into patients’ monitoring needs, information and communication technologies skills, and preferences for self-management tools. (2) Experiences from the participatory design process: patients and health care professionals contributed different perspectives, with the patients using an experience-based approach and the health care professionals using a more attitude-based approach. Conclusions The essential lessons learned concern planning and organization of workshops, including the finding that patients engaged actively and willingly in a participatory design process, whereas it was more challenging to include and engage health care professionals. PMID:28526674
Implementation of quality by design toward processing of food products.
Rathore, Anurag S; Kapoor, Gautam
2017-05-28
Quality by design (QbD) is a systematic approach that begins with predefined objectives and emphasizes product and process understanding and process control. It is an approach based on principles of sound science and quality risk management. As the food processing industry continues to embrace the idea of in-line, online, and/or at-line sensors and real-time characterization for process monitoring and control, the existing gaps in our ability to monitor the multiple parameters/variables associated with the manufacturing process will be alleviated over time. Investments made in the development of tools and approaches that facilitate high-throughput analytical and process development, process analytical technology, design of experiments, risk analysis, knowledge management, and enhancement of process/product understanding would pave the way for operational and economic benefits later in the commercialization process and across other product pipelines. This article aims to achieve two major objectives: first, to review the progress made in recent years on QbD implementation in the processing of food products, and second, to present a case study that illustrates the benefits of such QbD implementation.
ERIC Educational Resources Information Center
Cunningham, Debra Jayne
2015-01-01
Using a constructivist grounded theory approach (Charmaz, 2006), this qualitative study examined how eight female senior-level professionals employed at faith-based colleges and universities processed and navigated the experience of involuntary job loss and successfully transitioned to another position. The theoretical framework of psychological…
Onboard FPGA-based SAR processing for future spaceborne systems
NASA Technical Reports Server (NTRS)
Le, Charles; Chan, Samuel; Cheng, Frank; Fang, Winston; Fischman, Mark; Hensley, Scott; Johnson, Robert; Jourdan, Michael; Marina, Miguel; Parham, Bruce;
2004-01-01
We present a real-time, high-performance, fault-tolerant FPGA-based hardware architecture for the processing of synthetic aperture radar (SAR) images in future spaceborne systems. In particular, we discuss the integrated design approach, from top-level algorithm specifications and system requirements, through design methodology, functional verification, and performance validation, down to hardware design and implementation.
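As an indication of the kind of kernel such a processor implements (not the paper's FPGA design), the sketch below performs textbook range compression: matched filtering of a linear chirp via FFT. Chirp and scene parameters are illustrative.

```python
# Textbook SAR range compression: FFT-based matched filtering of a chirp.
import numpy as np

fs, T, B = 100e6, 10e-6, 30e6             # sample rate, pulse length, bandwidth
t = np.arange(0, T, 1 / fs)
chirp = np.exp(1j * np.pi * (B / T) * t**2)

echo = np.zeros(4096, dtype=complex)      # one range line with two targets
echo[500:500 + t.size] += chirp
echo[1200:1200 + t.size] += 0.5 * chirp

H = np.conj(np.fft.fft(chirp, echo.size)) # matched filter in frequency domain
compressed = np.fft.ifft(np.fft.fft(echo) * H)
peaks = np.argsort(np.abs(compressed))[-2:]
print("detected range bins:", sorted(peaks))
```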
An Infinite Game in a Finite Setting: Visualizing Foreign Language Teaching and Learning in America.
ERIC Educational Resources Information Center
Mantero, Miguel
According to contemporary thought and foundational research, this paper presents various elements of the foreign language teaching profession and language learning environment in the United States as either product-driven or process-based. It is argued that a process-based approach to language teaching and learning benefits not only second…
Evaluating crown fire rate of spread predictions from physics-based models
C. M. Hoffman; J. Ziegler; J. Canfield; R. R. Linn; W. Mell; C. H. Sieg; F. Pimont
2015-01-01
Modeling the behavior of crown fires is challenging due to the complex set of coupled processes that drive the characteristics of a spreading wildfire and the large range of spatial and temporal scales over which these processes occur. Detailed physics-based modeling approaches such as FIRETEC and the Wildland Urban Interface Fire Dynamics Simulator (WFDS) simulate...
USDA-ARS?s Scientific Manuscript database
Calibration of process-based hydrologic models is a challenging task in data-poor basins, where monitored hydrologic data are scarce. In this study, we present a novel approach that benefits from remotely sensed evapotranspiration (ET) data to calibrate a complex watershed model, namely the Soil and...
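A schematic of ET-based calibration under heavy simplification: a one-line stand-in replaces the watershed model, synthetic "remotely sensed" ET is generated, and a single parameter is tuned by minimizing the squared ET mismatch. Names and values are invented; this is not SWAT.

```python
# Calibrating a toy model parameter against remotely sensed ET.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(7)
pet = 4.0 + 2.0 * np.sin(np.linspace(0, 2 * np.pi, 36))   # potential ET, mm/d

def simulated_et(crop_coeff):             # toy stand-in for the watershed model
    return crop_coeff * pet

rs_et = simulated_et(0.7) + rng.normal(0, 0.2, pet.size)  # "remote sensing" ET

res = minimize_scalar(lambda k: np.mean((simulated_et(k) - rs_et) ** 2),
                      bounds=(0.1, 1.5), method="bounded")
print(f"calibrated coefficient: {res.x:.2f} (synthetic truth: 0.70)")
```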