This is the first phase of a potentially multi-phase project aimed at identifying scientific methodologies that will lead to the development of innovative analytical tools supporting the analysis of control strategy effectiveness, namely accountability. Significant reductions i...
McKelvey, Maureen
2016-01-01
The main contribution of this paper is a theory-based conceptual framework of innovation spaces, and of how firms must navigate through them to innovate. The concept of innovation systems - at the regional, sectoral and national levels - has been highly influential. Previous literature developing the concept of innovation systems has stressed the importance of institutions, networks and knowledge bases at the regional, sectoral and national levels. This paper primarily draws upon an evolutionary and Schumpeterian economics perspective, in the following three senses. The conceptualization of 'innovation spaces' focuses upon how and why firms' search for innovations is influenced by the opportunities within certain geographical contexts. This means that firms create opportunities and can span different contexts, but they are influenced by the context in terms of the access, flow and co-evolution of ideas, resources, technology, people and knowledge, which help stimulate business innovation in terms of products, processes and services. The paper concludes with an agenda for future research, especially the need to focus on globalization as a process of intensifying linkages across the globe.
McGloughlin, Elizabeth Kate; Anglim, Paul; Keogh, Ivan; Sharif, Faisal
2018-01-01
Clinicians have historically been integral in innovating and developing technology in medicine and surgery. In recent years, however, in an increasingly complex healthcare system, a doctor with innovative ideas is often left behind. Transition from idea to bedside now entails significant hurdles, which often go unrecognised at the outset, particularly for first-time innovators. The BioInnovate Ireland process, based on the Stanford Biodesign Programme (Identify, Invent and Implement), aims to streamline the process of innovation within the MedTech sector. These programmes focus on needs-based innovation and enable multidisciplinary teams to innovate and collaborate more effectively. In this preliminary study, the authors aimed to examine the impact BioInnovate Ireland has had on the clinicians involved and to validate the collaborative process. To date, 13 fellows with backgrounds in clinical medicine have participated in the BioInnovate programme. Ten of these clinicians remain involved in clinical innovation projects, with four working on Enterprise Ireland funded commercialisation grants and one working as chief executive officer of a service-led start-up, Strive. Of these, five also remain engaged in clinical practice on a full- or part-time basis. The clinicians who have returned to full-time clinical practice have used the process and learning of the programme to influence their individual clinical areas and actively seek innovative solutions to meet clinical challenges. Clinicians, in particular, describe gaining value from the BioInnovate programme in the areas of 'Understanding Entrepreneurship' and 'Business Strategy'. Further study is needed into the quantitative impact on the ecosystem and the impact on other stakeholders. PMID:29599999
NASA Astrophysics Data System (ADS)
Philen, Michael
2011-04-01
This manuscript is an overview of the research that is currently being performed as part of a 2009 NSF Office of Emerging Frontiers in Research and Innovation (EFRI) grant on BioSensing and BioActuation (BSBA). The objectives of this multi-university collaborative research are to achieve a greater understanding of the hierarchical organization and structure of the sensory, muscular, and control systems of fish, and to develop advanced biologically-inspired material systems having distributed sensing, actuation, and intelligent control. New experimental apparatus have been developed for performing experiments involving live fish and robotic devices, and new bio-inspired haircell sensors and artificial muscles are being developed using carbonaceous nanomaterials, bio-derived molecules, and composite technology. Results demonstrating flow sensing and actuation are presented.
Jiao, York; Gipson, Keith E; Bonde, Pramod; Mangi, Abeel; Hagberg, Robert; Rosinski, David J; Gross, Jeffrey B; Schonberger, Robert B
Prolonged use of venoarterial extracorporeal membrane oxygenation (VA ECMO) may be complicated by end-organ dysfunction. Although gaseous microemboli (GME) are thought to damage end organs during cardiopulmonary bypass, patient exposures to GME have not been well characterized during VA ECMO. We therefore performed an observational study of GME in adult VA ECMO patients, with correlation to clinical events during routine patient care. After institutional review board (IRB) approval, we used two Doppler probes to detect GME noninvasively in extracorporeal membrane oxygenation (ECMO) circuits on four patients for 15 hours total while also recording patient care events. We then conducted in vitro trials to compare Doppler signals with gold-standard measurements using an Emboli Detection and Classification (EDAC) quantifier (Luna Innovations, Inc., Roanoke, VA; Terumo Cardiovascular, Ann Arbor, MI) during simulated clinical interventions. Correlations between Doppler and EDAC data were used to estimate GME counts and volumes represented by clinical Doppler data. A total of 503 groups of Doppler peaks representing GME showers were observed, including 194 statistically larger showers during patient care activities containing 92% of total Doppler peaks. Intravenous injections accounted for an estimated 68% of GME and 88% of GME volume, whereas care involving movement accounted for an estimated 6% of GME and 3% of volume. An overall estimated embolic rate of 24,000 GME totaling 4 μl/hr rivals reported GME rates during cardiopulmonary bypass. Numerous GME are present in the postmembrane circuit during VA ECMO, raising concern for effects on microcirculation and organ dysfunction. Strategies to detect and minimize GME may be warranted to limit embolic exposures experienced by VA ECMO patients.
System and method for deriving a process-based specification
NASA Technical Reports Server (NTRS)
Hinchey, Michael Gerard (Inventor); Rouff, Christopher A. (Inventor); Rash, James Larry (Inventor)
2009-01-01
A system and method for deriving a process-based specification for a system is disclosed. The process-based specification is mathematically inferred from a trace-based specification. The trace-based specification is derived from a non-empty set of traces or natural language scenarios. The process-based specification is mathematically equivalent to the trace-based specification. Code is generated, if applicable, from the process-based specification. A process, or phases of a process, using the features disclosed can be reversed and repeated to allow for an interactive development and modification of legacy systems. The process is applicable to any class of system, including, but not limited to, biological and physical systems, electrical and electro-mechanical systems in addition to software, hardware and hybrid hardware-software systems.
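As a hedged illustration of the trace-to-process idea (a minimal sketch, not NASA's patented method), the following Python example folds a set of event traces into a prefix tree and prints a CSP-like process expression that accepts exactly those traces. The scenario names are invented.

```python
# Minimal sketch: derive a process-style expression from a set of traces.
def build_tree(traces):
    root = {}
    for trace in traces:
        node = root
        for event in trace:
            node = node.setdefault(event, {})  # extend the prefix tree
    return root

def to_process(node):
    if not node:
        return "SKIP"
    branches = [f"{e} -> {to_process(child)}" for e, child in node.items()]
    return branches[0] if len(branches) == 1 else "(" + " [] ".join(branches) + ")"

# Hypothetical natural-language scenarios reduced to event traces.
traces = [
    ["receive_cmd", "validate", "execute"],
    ["receive_cmd", "validate", "reject"],
]
print(to_process(build_tree(traces)))
# receive_cmd -> validate -> (execute -> SKIP [] reject -> SKIP)
```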
Differentiating location- and distance-based processes in memory for time: an ERP study.
Curran, Tim; Friedman, William J
2003-09-01
Memory for the time of events may benefit from reconstructive, location-based, and distance-based processes, but these processes are difficult to dissociate with behavioral methods. Neuropsychological research has emphasized the contribution of prefrontal brain mechanisms to memory for time but has not clearly differentiated location- from distance-based processing. The present experiment recorded event-related brain potentials (ERPs) while subjects completed two different temporal memory tests, designed to emphasize either location- or distance-based processing. The subjects' reports of location-based versus distance-based strategies and the reaction time pattern validated our experimental manipulation. Late (800-1,800 msec) frontal ERP effects were related to location-based processing. The results provide support for a two-process theory of memory for time and suggest that frontal memory mechanisms are specifically related to reconstructive, location-based processing.
On Intelligent Design and Planning Method of Process Route Based on Gun Breech Machining Process
NASA Astrophysics Data System (ADS)
Hongzhi, Zhao; Jian, Zhang
2018-03-01
The paper presents an approach to the intelligent design and planning of a process route based on the gun breech machining process, addressing several problems such as the complexity of gun breech machining, tedious route design, and the long cycle of the traditional, hard-to-manage process route. Based on the gun breech machining process, an intelligent design and planning system for the process route is developed by virtue of DEST and VC++. The system includes two functional modules: intelligent process route design and process route planning. The intelligent route design module, through analysis of the gun breech machining process, summarizes breech process knowledge to complete the design of the knowledge base and inference engine, and then outputs the gun breech process route intelligently. On the basis of the intelligent route design module, the final process route is made, edited and managed in the process route planning module.
Lau, Nathan; Jamieson, Greg A; Skraaning, Gyrd
2016-07-01
We introduce Process Overview, a situation awareness characterisation of the knowledge derived from monitoring process plants. Process Overview is based on observational studies of process control work in the literature. The characterisation is applied to develop a query-based measure called the Process Overview Measure. The goal of the measure is to improve coupling between situation and awareness according to process plant properties and operator cognitive work. A companion article presents the empirical evaluation of the Process Overview Measure in a realistic process control setting. The Process Overview Measure demonstrated sensitivity and validity by revealing significant effects of experimental manipulations that corroborated with other empirical results. The measure also demonstrated adequate inter-rater reliability and practicality for measuring SA based on data collected by process experts. Practitioner Summary: The Process Overview Measure is a query-based measure for assessing operator situation awareness from monitoring process plants in representative settings.
A new window of opportunity to reject process-based biotechnology regulation
Marchant, Gary E; Stevens, Yvonne A
2015-01-01
The question of whether biotechnology regulation should be based on the process or the product has long been debated, with different jurisdictions adopting different approaches. The European Union has adopted a process-based approach, Canada has adopted a product-based approach, and the United States has implemented a hybrid system. With the recent proliferation of new methods of genetic modification, such as gene editing, process-based regulatory systems, which are premised on a binary system of transgenic and conventional approaches, will become increasingly obsolete and unsustainable. To avoid unreasonable, unfair and arbitrary results, nations that have adopted process-based approaches will need to migrate to a product-based approach that considers the novelty and risks of the individual trait, rather than the process by which that trait was produced. This commentary suggests some approaches for the design of such a product-based approach. PMID:26930116
Implicit Schemata and Categories in Memory-Based Language Processing
ERIC Educational Resources Information Center
van den Bosch, Antal; Daelemans, Walter
2013-01-01
Memory-based language processing (MBLP) is an approach to language processing based on exemplar storage during learning and analogical reasoning during processing. From a cognitive perspective, the approach is attractive as a model for human language processing because it does not make any assumptions about the way abstractions are shaped, nor any…
Models of Quantitative Estimations: Rule-Based and Exemplar-Based Processes Compared
ERIC Educational Resources Information Center
von Helversen, Bettina; Rieskamp, Jorg
2009-01-01
The cognitive processes underlying quantitative estimations vary. Past research has identified task-contingent changes between rule-based and exemplar-based processes (P. Juslin, L. Karlsson, & H. Olsson, 2008). B. von Helversen and J. Rieskamp (2008), however, proposed a simple rule-based model--the mapping model--that outperformed the…
Unified Modeling Language (UML) for hospital-based cancer registration processes.
Shiki, Naomi; Ohno, Yuko; Fujii, Ayumi; Murata, Taizo; Matsumura, Yasushi
2008-01-01
Hospital-based cancer registry involves complex processing steps that span multiple departments. In addition, management techniques and registration procedures differ depending on each medical facility. Establishing processes for a hospital-based cancer registry requires clarifying the specific functions and labor needed. In recent years, the business modeling technique, in which management evaluation is done by clearly spelling out processes and functions, has been applied to business process analysis. However, there are few analytical reports describing the application of these concepts to medical-related work. In this study, we initially sought to model hospital-based cancer registration processes using the Unified Modeling Language (UML), to clarify functions. The object of this study was the cancer registry of Osaka University Hospital. We organized the hospital-based cancer registration processes based on interview and observational surveys, and produced an As-Is model using activity, use-case, and class diagrams. After drafting, each UML model was fed back to practitioners to check its validity and then improved. We were able to define the workflow for each department using activity diagrams. In addition, by using use-case diagrams we were able to classify each department within the hospital as a system, and thereby specify the core processes and staff that were responsible for each department. The class diagrams were effective in systematically organizing the information to be used for hospital-based cancer registries. Using UML modeling, hospital-based cancer registration processes were broadly classified into three separate processes, namely, registration tasks, quality control, and filing data. An additional 14 functions were also extracted. Many tasks take place within the hospital-based cancer registry office, but the process of providing information spans multiple departments. Moreover, additional tasks were required in comparison with a standardized system, because the hospital-based cancer registration system was built on the pre-existing computer system at Osaka University Hospital. The difficulty of utilizing useful information for cancer registration processes was shown to increase the task workload. By using UML, we were able to clarify functions and extract the typical processes for a hospital-based cancer registry. Modeling can provide a basis of process analysis for the establishment of efficient hospital-based cancer registration processes in each institute.
A neuroanatomical model of space-based and object-centered processing in spatial neglect.
Pedrazzini, Elena; Schnider, Armin; Ptak, Radek
2017-11-01
Visual attention can be deployed in space-based or object-centered reference frames. Right-hemisphere damage may lead to distinct deficits of space- or object-based processing, and such dissociations are thought to underlie the heterogeneous nature of spatial neglect. Previous studies have suggested that object-centered processing deficits (such as in copying, reading or line bisection) result from damage to retro-rolandic regions while impaired spatial exploration reflects damage to more anterior regions. However, this evidence is based on small samples and heterogeneous tasks. Here, we tested a theoretical model of neglect that takes into account space- and object-based processing and relates them to neuroanatomical predictors. One hundred and one right-hemisphere-damaged patients were examined with classic neuropsychological tests and structural brain imaging. Relations between neglect measures and damage to the temporal-parietal junction, intraparietal cortex, insula and middle frontal gyrus were examined with two structural equation models by assuming that object-centered processing (involved in line bisection and single-word reading) and space-based processing (involved in cancelation tasks) either represented a unique latent variable or two distinct variables. Of these two models the latter had better explanatory power. Damage to the intraparietal sulcus was a significant predictor of object-centered, but not space-based processing, while damage to the temporal-parietal junction predicted space-based, but not object-centered processing. Space-based processing and object-centered processing were strongly intercorrelated, indicating that they rely on similar, albeit partly dissociated processes. These findings indicate that object-centered and space-based deficits in neglect are partly independent and result from superior parietal and inferior parietal damage, respectively.
Musical rhythm and reading development: does beat processing matter?
Ozernov-Palchik, Ola; Patel, Aniruddh D
2018-05-20
There is mounting evidence for links between musical rhythm processing and reading-related cognitive skills, such as phonological awareness. This may be because music and speech are rhythmic: both involve processing complex sound sequences with systematic patterns of timing, accent, and grouping. Yet, there is a salient difference between musical and speech rhythm: musical rhythm is often beat-based (based on an underlying grid of equal time intervals), while speech rhythm is not. Thus, the role of beat-based processing in the reading-rhythm relationship is not clear. Is there a distinct relation between beat-based processing mechanisms and reading-related language skills, or is the rhythm-reading link entirely due to shared mechanisms for processing nonbeat-based aspects of temporal structure? We discuss recent evidence for a distinct link between beat-based processing and early reading abilities in young children, and suggest experimental designs that would allow one to further methodically investigate this relationship. We propose that beat-based processing taps into a listener's ability to use rich contextual regularities to form predictions, a skill important for reading development. © 2018 New York Academy of Sciences.
Conceptual information processing: A robust approach to KBS-DBMS integration
NASA Technical Reports Server (NTRS)
Lazzara, Allen V.; Tepfenhart, William; White, Richard C.; Liuzzi, Raymond
1987-01-01
Integrating the respective functionality and architectural features of knowledge base and data base management systems is a topic of considerable interest. Several aspects of this topic and associated issues are addressed. The significance of integration and the problems associated with accomplishing that integration are discussed. The shortcomings of current approaches to integration and the need to fuse the capabilities of both knowledge base and data base management systems motivate the investigation of information processing paradigms. One such paradigm is concept-based processing, i.e., processing based on concepts and conceptual relations. An approach to robust knowledge and data base system integration is discussed by addressing progress made in the development of an experimental model for conceptual information processing.
The research on construction and application of machining process knowledge base
NASA Astrophysics Data System (ADS)
Zhao, Tan; Qiao, Lihong; Qie, Yifan; Guo, Kai
2018-03-01
In order to realize the application of knowledge in machining process design, from the perspective of knowledge in computer-aided process planning (CAPP), a hierarchical structure of knowledge classification is established according to the characteristics of the mechanical engineering field. The expression of machining process knowledge is structured by means of production rules and object-oriented methods. Three kinds of knowledge base models are constructed according to the representation of machining process knowledge. In this paper, the definition and classification of machining process knowledge, the knowledge model, and the application flow of process design based on the knowledge base are given, and the main steps of the machine tool design decision are carried out as an application by using the knowledge base.
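Since the abstract describes machining knowledge expressed as production rules driving an inference engine, the following minimal Python sketch illustrates that representation: a small forward-chaining engine over a fact dictionary. All rule contents and thresholds are invented placeholders, not the paper's machining knowledge.

```python
# Minimal production-rule knowledge base with forward chaining.
class Rule:
    def __init__(self, name, condition, conclusion):
        self.name = name              # rule identifier
        self.condition = condition    # predicate over a fact dictionary
        self.conclusion = conclusion  # facts asserted when the rule fires

class InferenceEngine:
    def __init__(self, rules):
        self.rules = rules

    def run(self, facts):
        # Forward-chain until no rule adds new facts.
        changed = True
        while changed:
            changed = False
            for rule in self.rules:
                if rule.condition(facts) and not rule.conclusion.items() <= facts.items():
                    facts.update(rule.conclusion)
                    changed = True
        return facts

rules = [
    Rule("finish-grind",
         lambda f: f.get("tolerance_mm", 1.0) < 0.01,
         {"operation": "grinding"}),
    Rule("rough-mill",
         lambda f: f.get("stock_mm", 0) > 2.0,
         {"pre_operation": "rough milling"}),
]

facts = InferenceEngine(rules).run({"tolerance_mm": 0.005, "stock_mm": 3.0})
print(facts)  # both rules fire, sketching a two-step process route
```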
See, Ya Hui Michelle; Petty, Richard E; Fabrigar, Leandre R
2013-08-01
We proposed that (a) processing interest for affective over cognitive information is captured by meta-bases (i.e., the extent to which people subjectively perceive themselves to rely on affect or cognition in their attitudes) and (b) processing efficiency for affective over cognitive information is captured by structural bases (i.e., the extent to which attitudes are more evaluatively congruent with affect or cognition). Because processing speed can disentangle interest from efficiency by being manifest as longer or shorter reading times, we hypothesized and found that more affective meta-bases predicted longer affective than cognitive reading time when processing efficiency was held constant (Study 1). In contrast, more affective structural bases predicted shorter affective than cognitive reading time when participants were constrained in their ability to allocate resources deliberatively (Study 2). When deliberation was neither encouraged nor constrained, effects for meta-bases and structural bases emerged (Study 3). Implications for affective-cognitive processing and other attitudes-relevant constructs are discussed.
Process-based Cost Estimation for Ramjet/Scramjet Engines
NASA Technical Reports Server (NTRS)
Singh, Brijendra; Torres, Felix; Nesman, Miles; Reynolds, John
2003-01-01
Process-based cost estimation plays a key role in effecting cultural change that integrates distributed science, technology and engineering teams to rapidly create innovative and affordable products. Working together, NASA Glenn Research Center and Boeing Canoga Park have developed a methodology of process-based cost estimation bridging the methodologies of high-level parametric models and detailed bottoms-up estimation. The NASA GRC/Boeing CP process-based cost model provides a probabilistic structure of layered cost drivers. High-level inputs characterize mission requirements, system performance, and relevant economic factors. Design alternatives are extracted from a standard, product-specific work breakdown structure to pre-load lower-level cost driver inputs and generate the cost-risk analysis. As product design progresses and matures, the lower-level, more detailed cost drivers can be re-accessed and the projected variation of input values narrowed, thereby generating a progressively more accurate estimate of cost-risk. Incorporated into the process-based cost model are techniques for decision analysis, specifically the analytic hierarchy process (AHP) and functional utility analysis. Design alternatives may then be evaluated not just on cost-risk, but also on user-defined performance and schedule criteria. This implementation of full trade-study support contributes significantly to the realization of the integrated development environment. The process-based cost estimation model generates development and manufacturing cost estimates. The development team plans to expand the manufacturing process base from approximately 80 manufacturing processes to over 250 processes. Operation and support cost modeling is also envisioned. Process-based estimation considers the materials, resources, and processes in establishing cost-risk and, rather than depending on weight as an input, actually estimates weight along with cost and schedule.
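As a hedged illustration of the analytic hierarchy process named above, the following Python sketch derives priority weights from a pairwise comparison matrix via the standard principal-eigenvector method. The matrix values and the three criteria (cost-risk, performance, schedule) are hypothetical, not taken from the NASA/Boeing model.

```python
import numpy as np

# AHP sketch: weights from a (hypothetical) pairwise comparison matrix
# over the criteria cost-risk, performance, schedule.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)            # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                            # normalized priority weights

# Consistency ratio: CI = (lambda_max - n) / (n - 1), RI = 0.58 for n = 3.
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
print("weights:", w.round(3), "CR:", round(ci / 0.58, 3))
```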
Li, Wen-Long; Qu, Hai-Bin
2016-10-01
In this paper, the principle of NIRS (near infrared spectroscopy)-based process trajectory technology is introduced. The main steps of the technique include: ① in-line collection of the process spectra of different techniques; ② unfolding of the 3-D process spectra; ③ determination of the process trajectories and their normal limits; ④ monitoring of new batches with the established MSPC (multivariate statistical process control) models. Applications of the technology in chemical and biological medicines are reviewed briefly. Through a comprehensive introduction of our feasibility research on monitoring traditional Chinese medicine manufacturing processes using NIRS-based multivariate process trajectories, several important practical problems in need of urgent solutions are identified, and the application prospects of NIRS-based process trajectory technology are discussed. Copyright© by the Chinese Pharmaceutical Association.
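A minimal numpy sketch of steps ② to ④, with synthetic random data standing in for real in-line NIR spectra: batch-wise unfolding of the 3-D array, PCA on normal batches with an empirical Hotelling-T² limit, and monitoring of a new batch. The component count and the 95% limit are illustrative choices, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for in-line spectra: batches x time points x wavelengths.
X = rng.normal(size=(20, 50, 100))

# Step 2: batch-wise unfolding of the 3-D spectra into 2-D, then scaling.
Xu = X.reshape(X.shape[0], -1)
mu, sd = Xu.mean(axis=0), Xu.std(axis=0) + 1e-12
Z = (Xu - mu) / sd

# Step 3: PCA of the normal batches and an empirical T^2 control limit.
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
k = 3
scores = U[:, :k] * S[:k]
svar = scores.var(axis=0)
t2 = ((scores ** 2) / svar).sum(axis=1)
limit = np.percentile(t2, 95)

# Step 4: project a new batch onto the loadings and check it against the limit.
z_new = (rng.normal(size=(1, 50 * 100)) - mu) / sd
t2_new = (((z_new @ Vt[:k].T) ** 2) / svar).sum()
print("batch flagged" if t2_new > limit else "batch within normal trajectory")
```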
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hohimer, J.P.
The use of laser-based analytical methods in nuclear-fuel processing plants is considered. The species and locations for accountability, process control, and effluent control measurements in the Coprocessing, Thorex, and reference Purex fuel processing operations are identified and the conventional analytical methods used for these measurements are summarized. The laser analytical methods based upon Raman, absorption, fluorescence, and nonlinear spectroscopy are reviewed and evaluated for their use in fuel processing plants. After a comparison of the capabilities of the laser-based and conventional analytical methods, the promising areas of application of the laser-based methods in fuel processing plants are identified.
NASA Astrophysics Data System (ADS)
Chi, Xu; Dongming, Guo; Zhuji, Jin; Renke, Kang
2010-12-01
A signal processing method for the friction-based endpoint detection system of a chemical mechanical polishing (CMP) process is presented. The signal processing method uses the wavelet threshold denoising method to reduce the noise contained in the measured original signal, extracts the Kalman filter innovation from the denoised signal as the feature signal, and judges the CMP endpoint based on the features of the Kalman filter innovation sequence during the CMP process. Applying this signal processing method, endpoint detection experiments for the Cu CMP process were carried out. The results show that the signal processing method can judge the endpoint of the Cu CMP process.
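A hedged Python sketch of the two stages, using PyWavelets for the wavelet threshold denoising and a scalar random-walk Kalman filter for the innovation feature. The synthetic friction signal, the wavelet choice ('db4'), and the noise parameters are assumptions, and the endpoint judgment is simplified to locating the innovation spike.

```python
import numpy as np
import pywt  # PyWavelets

rng = np.random.default_rng(1)
# Synthetic stand-in for the friction signal: a level shift near sample 600
# plays the role of the polishing endpoint, buried in noise.
t = np.arange(1000)
signal = 1.0 + 0.5 * (t > 600) + 0.2 * rng.normal(size=t.size)

# 1) Wavelet threshold denoising (universal threshold, soft shrinkage).
coeffs = pywt.wavedec(signal, "db4", level=4)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745
thr = sigma * np.sqrt(2 * np.log(signal.size))
coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")[: signal.size]

# 2) Scalar random-walk Kalman filter; the innovation z - x_pred is the feature.
x, p, q, r = denoised[0], 1.0, 1e-4, 1e-2
innovations = []
for z in denoised:
    p += q                       # predict step
    k_gain = p / (p + r)         # Kalman gain
    innovations.append(z - x)    # innovation before the update
    x += k_gain * (z - x)        # update step
    p *= (1 - k_gain)

# 3) Judge the endpoint where the innovation sequence spikes.
print("endpoint near sample", int(np.argmax(np.abs(innovations))))
```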
Espinoza, Manuel Antonio; Manca, Andrea; Claxton, Karl; Sculpher, Mark
2018-02-01
Evidence about cost-effectiveness is increasingly being used to inform decisions about the funding of new technologies, which are usually implemented as guidelines from centralized decision-making bodies. However, there is also increasing recognition of the role of patients in determining their preferred treatment option. This paper presents a method to estimate the value of implementing a choice-based decision process using the cost-effectiveness analysis toolbox. This value is estimated for 3 alternative scenarios. First, it compares centralized decisions, based on population-average cost-effectiveness, against a decision process based on patient choice. Second, it compares centralized decisions based on patients' subgroups versus an individual choice-based decision process. Third, it compares a centralized process based on average cost-effectiveness against a choice-based process where patients choose according to a different measure of outcome from that used by the centralized decision maker. The methods are applied to a case study for the management of acute coronary syndrome. It is concluded that implementing a choice-based process of treatment allocation may be an option in collectively funded health systems. However, its value will depend on the specific health problem and the social values considered relevant to the health system. Copyright © 2017 John Wiley & Sons, Ltd.
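The following toy Python example sketches the spirit of the first comparison under invented numbers: the value of a choice-based process is the expected net monetary benefit when each subgroup receives its own preferred option, minus the net benefit of the single option a centralized body would fund on population averages. All QALY, cost, and threshold figures are hypothetical.

```python
# Hypothetical two-treatment, two-subgroup comparison of centralized versus
# choice-based allocation. All values are invented for illustration.
threshold = 20000  # willingness to pay per QALY (hypothetical)

# (QALYs, cost) per treatment and subgroup.
outcomes = {
    "A": {"young": (2.0, 15000), "old": (1.2, 15000)},
    "B": {"young": (1.5, 8000),  "old": (1.4, 8000)},
}
share = {"young": 0.5, "old": 0.5}

def nb(q, c):  # net monetary benefit
    return threshold * q - c

# Centralized: one treatment for everyone, chosen on the population average.
avg = {t: sum(share[g] * nb(*outcomes[t][g]) for g in share) for t in outcomes}
central = max(avg.values())

# Choice-based: each subgroup receives its own best option.
choice = sum(share[g] * max(nb(*outcomes[t][g]) for t in outcomes) for g in share)

print(f"value of the choice-based process: {choice - central:.0f} per patient")
```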
Pre- and Post-Processing Tools to Create and Characterize Particle-Based Composite Model Structures
2017-11-01
ARL-TR-8213, US Army Research Laboratory, November 2017.
Introduction to Radar Signal and Data Processing: The Opportunity
2006-09-01
Keywords: radar signal processing, data processing, adaptivity, space-time adaptive processing, knowledge-based systems, CFAR. SUMMARY: This paper introduces the lecture series dedicated to knowledge-based radar signal and data processing. Knowledge-based expert system (KBS) is in the realm of
Karimi, Davood; Ward, Rabab K
2016-10-01
Image models are central to all image processing tasks. The great advancements in digital image processing would not have been made possible without powerful models which, themselves, have evolved over time. In the past decade, "patch-based" models have emerged as one of the most effective models for natural images. Patch-based methods have outperformed other competing methods in many image processing tasks. These developments have come at a time when greater availability of powerful computational resources and growing concerns over the health risks of ionizing radiation encourage research on image processing algorithms for computed tomography (CT). The goal of this paper is to explain the principles of patch-based methods and to review some of their recent applications in CT. We first review the central concepts in patch-based image processing and explain some of the state-of-the-art algorithms, with a focus on aspects that are more relevant to CT. Then, we review some of the recent applications of patch-based methods in CT. Patch-based methods have already transformed the field of image processing, leading to state-of-the-art results in many applications. More recently, several studies have proposed patch-based algorithms for various image processing tasks in CT, from denoising and restoration to iterative reconstruction. Although these studies have reported good results, the true potential of patch-based methods for CT has not been yet appreciated. Patch-based methods can play a central role in image reconstruction and processing for CT. They have the potential to lead to substantial improvements in the current state of the art.
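As a concrete, hedged illustration of the patch-based idea (a simplified instance in the spirit of non-local means, not any specific CT algorithm from the review), the numpy sketch below denoises an image by averaging pixels weighted by patch similarity. The patch size, search window, and bandwidth h are arbitrary choices.

```python
import numpy as np

def patch_denoise(img, patch=5, search=11, h=0.3):
    # Average pixels whose surrounding patches resemble the reference patch.
    pad = patch // 2
    padded = np.pad(img, pad, mode="reflect")
    out = np.zeros_like(img)
    half = search // 2
    rows, cols = img.shape
    for i in range(rows):
        for j in range(cols):
            ref = padded[i:i + patch, j:j + patch]
            weights = total = 0.0
            for di in range(max(i - half, 0), min(i + half + 1, rows)):
                for dj in range(max(j - half, 0), min(j + half + 1, cols)):
                    cand = padded[di:di + patch, dj:dj + patch]
                    w = np.exp(-np.mean((ref - cand) ** 2) / h ** 2)
                    weights += w
                    total += w * img[di, dj]
            out[i, j] = total / weights
    return out

rng = np.random.default_rng(2)
clean = np.zeros((32, 32)); clean[8:24, 8:24] = 1.0
noisy = clean + 0.2 * rng.normal(size=clean.shape)
print("MSE before:", float(np.mean((noisy - clean) ** 2)))
print("MSE after: ", float(np.mean((patch_denoise(noisy) - clean) ** 2)))
```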
76 FR 70878 - Revitalizing Base Closure Communities and Addressing Impacts of Realignment
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-16
... base closure process to conform to the amendment to the Defense Base Closure and Realignment Act of... departments to expedite the EDC process. Closed military bases represent a potential engine of economic... purposes of establishing EDC terms and conditions. It also eliminates the need to establish a process by...
Expert system for web based collaborative CAE
NASA Astrophysics Data System (ADS)
Hou, Liang; Lin, Zusheng
2006-11-01
An expert system for web-based collaborative CAE was developed based on knowledge engineering, a relational database, and commercial FEA (finite element analysis) software. The architecture of the system is illustrated. In this system, the experts' experiences, theories, typical examples and other related knowledge, to be used in the pre-processing stage of FEA, were categorized into analysis-process and object knowledge. Then, an integrated knowledge model based on object-oriented and rule-based methods is described. The integrated reasoning process based on CBR (case-based reasoning) and rule-based reasoning is presented. Finally, the analysis process of this expert system in a web-based CAE application is illustrated, and an analysis example of a machine tool's column is given to prove the validity of the system.
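A minimal Python sketch of the CBR retrieval step described above: cases are stored with feature vectors and a solution, and the nearest case under Euclidean distance is reused. The case base and the mesh-size recommendation are invented placeholders, not the system's actual knowledge; in the full system, rule-based reasoning would then adapt and validate the retrieved solution.

```python
import math

# Toy case-based reasoning retrieval for FEA pre-processing advice.
cases = [
    {"features": {"length_m": 2.0, "load_kN": 10.0}, "mesh_mm": 8},
    {"features": {"length_m": 0.5, "load_kN": 50.0}, "mesh_mm": 2},
    {"features": {"length_m": 1.0, "load_kN": 20.0}, "mesh_mm": 4},
]

def distance(query, case):
    # Euclidean distance over the shared feature keys.
    return math.sqrt(sum((query[k] - case["features"][k]) ** 2 for k in query))

def retrieve(query):
    return min(cases, key=lambda c: distance(query, c))

best = retrieve({"length_m": 0.9, "load_kN": 25.0})
print("reuse mesh size from nearest case:", best["mesh_mm"], "mm")
```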
Hydrothermal Processing of Base Camp Solid Wastes To Allow Onsite Recycling
2008-09-01
ERDC/CERL TR-08-13, September 2008. Hydrothermal Processing of Base Camp Solid Wastes To Allow Onsite Recycling. Gary L. Gerdes, Deborah... ...release; distribution is unlimited. ...a technology to process domestic solid waste using a unique hydrothermal system. The process was successfully demonstrated at Forts Benning and
Valentijn, Pim P; Ruwaard, Dirk; Vrijhoef, Hubertus J M; de Bont, Antoinette; Arends, Rosa Y; Bruijnzeels, Marc A
2015-10-09
Collaborative partnerships are considered an essential strategy for integrating local disjointed health and social services. Currently, little evidence is available on how integrated care arrangements between professionals and organisations are achieved through the evolution of collaboration processes over time. The first aim was to develop a typology of integrated care projects (ICPs) based on the final degree of integration as perceived by multiple stakeholders. The second aim was to study how types of integration differ in changes of collaboration processes over time and in final perceived effectiveness. A longitudinal mixed-methods study design based on two data sources (surveys and interviews) was used to identify the perceived degree of integration and patterns in collaboration among 42 ICPs in primary care in The Netherlands. We used cluster analysis to identify distinct subgroups of ICPs based on the final perceived degree of integration from a professional, organisational and system perspective. With the use of ANOVAs, the subgroups were contrasted based on: 1) changes in collaboration processes over time (shared ambition, interests and mutual gains, relationship dynamics, organisational dynamics and process management) and 2) final perceived effectiveness (i.e. rated success) at the professional, organisational and system levels. The ICPs were classified into three subgroups: 'United Integration Perspectives (UIP)', 'Disunited Integration Perspectives (DIP)' and 'Professional-oriented Integration Perspectives (PIP)'. ICPs within the UIP subgroup made the strongest increase in trust-based (mutual gains and relationship dynamics) as well as control-based (organisational dynamics and process management) collaboration processes and had the highest overall effectiveness rates. On the other hand, ICPs within the DIP subgroup decreased on collaboration processes and had the lowest overall effectiveness rates. ICPs within the PIP subgroup increased in control-based collaboration processes (organisational dynamics and process management) and had the highest effectiveness rates at the professional level. The differences across the three subgroups in terms of the development of collaboration processes and the final perceived effectiveness provide evidence that united stakeholder perspectives are achieved through a constructive collaboration process over time. Disunited perspectives at the professional, organisation and system levels can be aligned by both trust-based and control-based collaboration processes.
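A hedged Python sketch of the analysis pipeline described (cluster analysis to form subgroups, then ANOVA to contrast them), run on synthetic scores standing in for the survey data; the three seeded clusters only loosely mirror the UIP/DIP/PIP pattern, and all numbers are invented.

```python
import numpy as np
from scipy.stats import f_oneway
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
# Synthetic stand-in: per-project perceived integration scores at the
# professional, organisational and system levels for 42 projects.
scores = np.vstack([
    rng.normal([4.0, 4.0, 4.0], 0.3, (14, 3)),   # "united" pattern
    rng.normal([2.0, 2.0, 2.0], 0.3, (14, 3)),   # "disunited" pattern
    rng.normal([4.0, 2.5, 2.5], 0.3, (14, 3)),   # "professional-oriented"
])

# Cluster analysis to identify subgroups of projects.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)

# Contrast the subgroups on an outcome (invented effectiveness ratings).
effectiveness = scores.mean(axis=1) + rng.normal(0, 0.2, 42)
groups = [effectiveness[labels == k] for k in range(3)]
print(f_oneway(*groups))
```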
Lau, Nathan; Jamieson, Greg A; Skraaning, Gyrd
2016-03-01
The Process Overview Measure is a query-based measure developed to assess operator situation awareness (SA) from monitoring process plants. A companion paper describes how the measure has been developed according to process plant properties and operator cognitive work. The Process Overview Measure demonstrated practicality, sensitivity, validity and reliability in two full-scope simulator experiments investigating dramatically different operational concepts. Practicality was assessed based on qualitative feedback of participants and researchers. The Process Overview Measure demonstrated sensitivity and validity by revealing significant effects of experimental manipulations that corroborated with other empirical results. The measure also demonstrated adequate inter-rater reliability and practicality for measuring SA in full-scope simulator settings based on data collected on process experts. Thus, full-scope simulator studies can employ the Process Overview Measure to reveal the impact of new control room technology and operational concepts on monitoring process plants. Practitioner Summary: The Process Overview Measure is a query-based measure that demonstrated practicality, sensitivity, validity and reliability for assessing operator situation awareness (SA) from monitoring process plants in representative settings.
Understanding community-based processes for research ethics review: a national study.
Shore, Nancy; Brazauskas, Ruta; Drew, Elaine; Wong, Kristine A; Moy, Lisa; Baden, Andrea Corage; Cyr, Kirsten; Ulevicus, Jocelyn; Seifer, Sarena D
2011-12-01
Institutional review boards (IRBs), designed to protect individual study participants, do not routinely assess community consent, risks, and benefits. Community groups are establishing ethics review processes to determine whether and how research is conducted in their communities. To strengthen the ethics review of community-engaged research, we sought to identify and describe these processes. In 2008 we conducted an online survey of US-based community groups and community-institutional partnerships involved in human-participants research. We identified 109 respondents who met participation criteria and had ethics review processes in place. The respondents' processes mainly functioned through community-institutional partnerships, community-based organizations, community health centers, and tribal organizations. These processes had been created primarily to ensure that the involved communities were engaged in and directly benefited from research and were protected from research harms. The primary process benefits included giving communities a voice in determining which studies were conducted and ensuring that studies were relevant and feasible, and that they built community capacity. The primary process challenges were the time and resources needed to support the process. Community-based processes for ethics review consider community-level ethical issues that institution-based IRBs often do not.
Chisholm, Joseph D; Kingstone, Alan
2015-10-01
Research has demonstrated that experience with action video games is associated with improvements in a host of cognitive tasks. Evidence from paradigms that assess aspects of attention has suggested that action video game players (AVGPs) possess greater control over the allocation of attentional resources than do non-video-game players (NVGPs). Using a compound search task that teased apart selection- and response-based processes (Duncan, 1985), we required participants to perform an oculomotor capture task in which they made saccades to a uniquely colored target (selection-based process) and then produced a manual directional response based on information within the target (response-based process). We replicated the finding that AVGPs are less susceptible to attentional distraction and, critically, revealed that AVGPs outperform NVGPs on both selection-based and response-based processes. These results not only are consistent with the improved-attentional-control account of AVGP benefits, but they suggest that the benefit of action video game playing extends across the full breadth of attention-mediated stimulus-response processes that impact human performance.
Comparative evaluation of urban storm water quality models
NASA Astrophysics Data System (ADS)
Vaze, J.; Chiew, Francis H. S.
2003-10-01
The estimation of urban storm water pollutant loads is required for the development of mitigation and management strategies to minimize impacts to receiving environments. Event pollutant loads are typically estimated using either regression equations or "process-based" water quality models. The relative merit of using regression models compared to process-based models is not clear. A modeling study is carried out here to evaluate the comparative ability of the regression equations and process-based water quality models to estimate event diffuse pollutant loads from impervious surfaces. The results indicate that, once calibrated, both the regression equations and the process-based model can estimate event pollutant loads satisfactorily. In fact, the loads estimated using the regression equation as a function of rainfall intensity and runoff rate are better than the loads estimated using the process-based model. Therefore, if only estimates of event loads are required, regression models should be used because they are simpler and require less data compared to process-based models.
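A small numpy sketch of the regression-model alternative: fitting a power-law event-load equation L = a·I^b·Q^c by log-linear least squares on synthetic events. The coefficients and data are invented, and the paper's actual regression form may differ.

```python
import numpy as np

rng = np.random.default_rng(4)
# Synthetic events: rainfall intensity I (mm/h), runoff rate Q (L/s), and an
# event load L generated from a power law plus noise (values invented).
I = rng.uniform(2, 50, 80)
Q = rng.uniform(5, 200, 80)
L = 0.4 * I**0.7 * Q**1.1 * np.exp(rng.normal(0, 0.2, 80))

# Log-linear least squares: log L = log a + b log I + c log Q.
X = np.column_stack([np.ones_like(I), np.log(I), np.log(Q)])
coef, *_ = np.linalg.lstsq(X, np.log(L), rcond=None)
a, b, c = np.exp(coef[0]), coef[1], coef[2]
print(f"fitted: L = {a:.2f} * I^{b:.2f} * Q^{c:.2f}")
```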
Anammox-based technologies for nitrogen removal: Advances in process start-up and remaining issues.
Ali, Muhammad; Okabe, Satoshi
2015-12-01
Nitrogen removal from wastewater via anaerobic ammonium oxidation (anammox)-based process has been recognized as efficient, cost-effective and low energy alternative to the conventional nitrification and denitrification processes. To date, more than one hundred full-scale anammox plants have been installed and operated for treatment of NH4(+)-rich wastewater streams around the world, and the number is increasing rapidly. Since the discovery of anammox process, extensive researches have been done to develop various anammox-based technologies. However, there are still some challenges in practical application of anammox-based treatment process at full-scale, e.g., longer start-up period, limited application to mainstream municipal wastewater and poor effluent water quality. This paper aimed to summarize recent status of application of anammox process and researches on technological development for solving these remaining problems. In addition, an integrated system of anammox-based process and microbial fuel cell is proposed for sustainable and energy-positive wastewater treatment. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Eleiwi, Fadi; Laleg-Kirati, Taous Meriem
2018-06-01
An observer-based perturbation extremum seeking control is proposed for a direct-contact membrane distillation (DCMD) process. The process is described with a dynamic model that is based on a 2D advection-diffusion equation model which has pump flow rates as process inputs. The objective of the controller is to optimise the trade-off between the permeate mass flux and the energy consumption by the pumps inside the process. Cases of single and multiple control inputs are considered through the use of only the feed pump flow rate or both the feed and the permeate pump flow rates. A nonlinear Lyapunov-based observer is designed to provide an estimation for the temperature distribution all over the designated domain of the DCMD process. Moreover, control inputs are constrained with an anti-windup technique to be within feasible and physical ranges. Performance of the proposed structure is analysed, and simulations based on real DCMD process parameters for each control input are provided.
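A minimal Python sketch of a perturbation extremum-seeking loop on a toy scalar objective standing in for the flux-versus-pumping-energy trade-off J(u). The Lyapunov-based observer and the anti-windup input constraints from the paper are omitted, and the objective, gains, and dither parameters are all assumptions.

```python
import numpy as np

def J(u):                         # invented static objective, maximum at u = 2
    return -(u - 2.0) ** 2

dt, a, w = 0.01, 0.1, 10.0        # step size, dither amplitude, dither frequency
k, h = 10.0, 1.0                  # integrator gain, washout filter bandwidth
u_hat = 0.0
y_lp = J(u_hat)                   # washout state tracking the mean of y
for n in range(10000):            # 100 s of simulated time
    t = n * dt
    y = J(u_hat + a * np.sin(w * t))      # measured performance
    y_lp += dt * h * (y - y_lp)           # remove the slowly varying part
    xi = (y - y_lp) * np.sin(w * t)       # demodulated gradient information
    u_hat += dt * k * xi                  # ascend the estimated gradient

print("converged input:", round(u_hat, 3))  # approaches the optimum u = 2
```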
Selective aqueous extraction of organics coupled with trapping by membrane separation
van Eikeren, Paul; Brose, Daniel J.; Ray, Roderick J.
1991-01-01
An improvement to processes for the selective extraction of organic solutes from organic solvents by water-based extractants is disclosed, the improvement comprising coupling various membrane separation processes with the organic extraction process, the membrane separation process being utilized to continuously recycle the water-based extractant and at the same time selectively remove or concentrate organic solute from the water-based extractant.
Application of agent-based system for bioprocess description and process improvement.
Gao, Ying; Kipling, Katie; Glassey, Jarka; Willis, Mark; Montague, Gary; Zhou, Yuhong; Titchener-Hooker, Nigel J
2010-01-01
Modeling plays an important role in bioprocess development for design and scale-up. Predictive models can also be used in biopharmaceutical manufacturing to assist decision-making either to maintain process consistency or to identify optimal operating conditions. To predict the whole bioprocess performance, the strong interactions present in a processing sequence must be adequately modeled. Traditionally, bioprocess modeling considers process units separately, which makes it difficult to capture the interactions between units. In this work, a systematic framework is developed to analyze the bioprocesses based on a whole process understanding and considering the interactions between process operations. An agent-based approach is adopted to provide a flexible infrastructure for the necessary integration of process models. This enables the prediction of overall process behavior, which can then be applied during process development or once manufacturing has commenced, in both cases leading to the capacity for fast evaluation of process improvement options. The multi-agent system comprises a process knowledge base, process models, and a group of functional agents. In this system, agent components co-operate with each other in performing their tasks. These include the description of the whole process behavior, evaluating process operating conditions, monitoring of the operating processes, predicting critical process performance, and providing guidance to decision-making when coping with process deviations. During process development, the system can be used to evaluate the design space for process operation. During manufacture, the system can be applied to identify abnormal process operation events and then to provide suggestions as to how best to cope with the deviations. In all cases, the function of the system is to ensure an efficient manufacturing process. The implementation of the agent-based approach is illustrated via selected application scenarios, which demonstrate how such a framework may enable the better integration of process operations by providing a plant-wide process description to facilitate process improvement. Copyright 2009 American Institute of Chemical Engineers
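Purely as a hedged illustration of the agent-based integration idea (not the authors' framework), the sketch below chains unit-model agents over a shared stream description and lets a monitoring agent check a critical performance target; the unit models, yields, and target are invented.

```python
# Toy multi-agent bioprocess sequence with a monitoring agent.
class UnitAgent:
    def __init__(self, name, step_yield):
        self.name, self.step_yield = name, step_yield

    def process(self, stream):
        # Each agent applies its local process model to the shared stream.
        stream["product_g"] *= self.step_yield
        stream["history"].append(self.name)
        return stream

class MonitorAgent:
    def check(self, stream, target_g):
        ok = stream["product_g"] >= target_g
        return "OK" if ok else f"deviation after {stream['history'][-1]}"

sequence = [UnitAgent("fermentation", 1.0),
            UnitAgent("centrifugation", 0.9),
            UnitAgent("chromatography", 0.7)]

stream = {"product_g": 100.0, "history": []}
for agent in sequence:
    stream = agent.process(stream)
print(stream["product_g"], MonitorAgent().check(stream, target_g=60.0))
```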
Net-centric ACT-R-Based Cognitive Architecture with DEVS Unified Process
2011-04-01
effort has been spent in analyzing various forms of requirement specifications, viz., state-based, Natural Language-based, UML-based, Rule-based, BPMN... requirement specifications in one of the chosen formats such as BPMN, DoDAF, Natural Language Processing (NLP)-based, UML-based, DSL, or simply
A KPI-based process monitoring and fault detection framework for large-scale processes.
Zhang, Kai; Shardt, Yuri A W; Chen, Zhiwen; Yang, Xu; Ding, Steven X; Peng, Kaixiang
2017-05-01
Large-scale processes, consisting of multiple interconnected subprocesses, are commonly encountered in industrial systems, whose performance needs to be determined. A common approach to this problem is to use a key performance indicator (KPI)-based approach. However, the different KPI-based approaches are not developed with a coherent and consistent framework. Thus, this paper proposes a framework for KPI-based process monitoring and fault detection (PM-FD) for large-scale industrial processes, which considers the static and dynamic relationships between process and KPI variables. For the static case, a least squares-based approach is developed that provides an explicit link with least-squares regression, which gives better performance than partial least squares. For the dynamic case, using the kernel representation of each subprocess, an instrument variable is used to reduce the dynamic case to the static case. This framework is applied to the TE benchmark process and the hot strip mill rolling process. The results show that the proposed method can detect faults better than previous methods. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
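For the static case, a minimal numpy sketch under synthetic data: a least-squares model links the process variables to the KPI on fault-free training data, and a residual exceeding a simple 3-sigma threshold flags a KPI-relevant fault. The threshold rule is an illustrative stand-in for the paper's test statistics.

```python
import numpy as np

rng = np.random.default_rng(5)
# Static case: KPI y depends linearly on process variables X (synthetic).
X = rng.normal(size=(500, 4))
theta_true = np.array([1.0, -2.0, 0.5, 0.0])
y = X @ theta_true + 0.1 * rng.normal(size=500)

# Least-squares model of the KPI from fault-free training data.
theta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ theta
limit = 3 * resid.std()                  # simple 3-sigma threshold

# Online detection: an additive fault shifts the residual beyond the limit.
x_new = rng.normal(size=4)
y_faulty = x_new @ theta_true + 1.0      # fault on the KPI
print("fault detected:", abs(y_faulty - x_new @ theta) > limit)
```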
Automatic and controlled components of judgment and decision making.
Ferreira, Mario B; Garcia-Marques, Leonel; Sherman, Steven J; Sherman, Jeffrey W
2006-11-01
The categorization of inductive reasoning into largely automatic processes (heuristic reasoning) and controlled analytical processes (rule-based reasoning) put forward by dual-process approaches of judgment under uncertainty (e.g., K. E. Stanovich & R. F. West, 2000) has been primarily a matter of assumption with a scarcity of direct empirical findings supporting it. The present authors use the process dissociation procedure (L. L. Jacoby, 1991) to provide convergent evidence validating a dual-process perspective to judgment under uncertainty based on the independent contributions of heuristic and rule-based reasoning. Process dissociations based on experimental manipulation of variables were derived from the most relevant theoretical properties typically used to contrast the two forms of reasoning. These include processing goals (Experiment 1), cognitive resources (Experiment 2), priming (Experiment 3), and formal training (Experiment 4); the results consistently support the authors' perspective. They conclude that judgment under uncertainty is neither an automatic nor a controlled process but that it reflects both processes, with each making independent contributions.
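The process-dissociation logic can be made concrete with a short sketch. Assuming congruent problems (heuristic and rule agree) and incongruent problems (they oppose), the standard Jacoby equations give independent estimates of the two contributions; the response proportions below are hypothetical, and the paper's exact response mapping may differ.

```python
# Process-dissociation sketch (after Jacoby, 1991), applied to judgment
# under uncertainty. With
#   P(correct | congruent)              = R + H(1 - R)
#   P(heuristic response | incongruent) = H(1 - R)
# the rule-based (R) and heuristic (H) estimates follow directly:
def process_dissociation(p_congruent, p_heuristic_incongruent):
    r = p_congruent - p_heuristic_incongruent
    h = p_heuristic_incongruent / (1 - r) if r < 1 else float("nan")
    return r, h

# Hypothetical response proportions, purely for illustration.
r, h = process_dissociation(0.85, 0.30)
print(f"rule-based R = {r:.2f}, heuristic H = {h:.2f}")
```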
A midas plugin to enable construction of reproducible web-based image processing pipelines
Grauer, Michael; Reynolds, Patrick; Hoogstoel, Marion; Budin, Francois; Styner, Martin A.; Oguz, Ipek
2013-01-01
Image processing is an important quantitative technique for neuroscience researchers, but difficult for those who lack experience in the field. In this paper we present a web-based platform that allows an expert to create a brain image processing pipeline, enabling execution of that pipeline even by those biomedical researchers with limited image processing knowledge. These tools are implemented as a plugin for Midas, an open-source toolkit for creating web based scientific data storage and processing platforms. Using this plugin, an image processing expert can construct a pipeline, create a web-based User Interface, manage jobs, and visualize intermediate results. Pipelines are executed on a grid computing platform using BatchMake and HTCondor. This represents a new capability for biomedical researchers and offers an innovative platform for scientific collaboration. Current tools work well, but can be inaccessible for those lacking image processing expertise. Using this plugin, researchers in collaboration with image processing experts can create workflows with reasonable default settings and streamlined user interfaces, and data can be processed easily from a lab environment without the need for a powerful desktop computer. This platform allows simplified troubleshooting, centralized maintenance, and easy data sharing with collaborators. These capabilities enable reproducible science by sharing datasets and processing pipelines between collaborators. In this paper, we present a description of this innovative Midas plugin, along with results obtained from building and executing several ITK based image processing workflows for diffusion weighted MRI (DW MRI) of rodent brain images, as well as recommendations for building automated image processing pipelines. Although the particular image processing pipelines developed were focused on rodent brain MRI, the presented plugin can be used to support any executable or script-based pipeline. PMID:24416016
Lee, Robert H; Bott, Marjorie J; Forbes, Sarah; Redford, Linda; Swagerty, Daniel L; Taunton, Roma Lee
2003-01-01
Understanding how quality improvement affects costs is important. Unfortunately, low-cost, reliable ways of measuring direct costs are scarce. This article builds on the principles of process improvement to develop a costing strategy that meets both criteria. Process-based costing has 4 steps: developing a flowchart, estimating resource use, valuing resources, and calculating direct costs. To illustrate the technique, this article uses it to cost the care planning process in 3 long-term care facilities. We conclude that process-based costing is easy to implement; generates reliable, valid data; and allows nursing managers to assess the costs of new or modified processes.
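The four steps translate directly into a small calculation; the care-planning steps, times, and wage rates below are hypothetical illustrations, not figures from the study.

```python
# Process-based costing in four steps: (1) flowchart the process,
# (2) estimate resource use per step, (3) value the resources,
# (4) calculate direct costs.
care_planning = [
    # (step, minutes, role) -- a simplified flowchart of the care planning process
    ("review assessments", 20, "RN"),
    ("draft care plan", 45, "RN"),
    ("interdisciplinary review", 30, "MD"),
    ("finalize and file", 15, "clerk"),
]
hourly_wage = {"RN": 38.0, "MD": 95.0, "clerk": 18.0}  # hypothetical rates

direct_cost = sum(minutes / 60 * hourly_wage[role]
                  for _, minutes, role in care_planning)
print(f"direct cost per care plan: ${direct_cost:.2f}")
```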
Knowlden, Adam P; Sharma, Manoj
2014-09-01
Family-and-home-based interventions are an important vehicle for preventing childhood obesity. Systematic process evaluations have not been routinely conducted in assessment of these interventions. The purpose of this study was to plan and conduct a process evaluation of the Enabling Mothers to Prevent Pediatric Obesity Through Web-Based Learning and Reciprocal Determinism (EMPOWER) randomized controlled trial. The trial was composed of two web-based, mother-centered interventions for prevention of obesity in children between 4 and 6 years of age. Process evaluation used the components of program fidelity, dose delivered, dose received, context, reach, and recruitment. Categorical process evaluation data (program fidelity, dose delivered, dose exposure, and context) were assessed using Program Implementation Index (PII) values. Continuous process evaluation variables (dose satisfaction and recruitment) were assessed using ANOVA tests to evaluate mean differences between groups (experimental and control) and sessions (sessions 1 through 5). Process evaluation results found that both groups (experimental and control) were equivalent, and interventions were administered as planned. Analysis of web-based intervention process objectives requires tailoring of process evaluation models for online delivery. Dissemination of process evaluation results can advance best practices for implementing effective online health promotion programs. © 2014 Society for Public Health Education.
Supervisee Art-Based Disclosure in "El Duende" Process Painting
ERIC Educational Resources Information Center
Robb, Megan; Miller, Abbe
2017-01-01
Although art-based supervision often leads to supervisee disclosure, little is known about the experience, process, or contributions of such disclosure. We investigated the phenomenon of supervisee disclosure during "El Duende" Process Painting art-based group supervision using a qualitative study. JoHari's Window was used as a grounding…
Louis R. Iverson; Frank R. Thompson; Stephen Matthews; Matthew Peters; Anantha Prasad; William D. Dijak; Jacob Fraser; Wen J. Wang; Brice Hanberry; Hong He; Maria Janowiak; Patricia Butler; Leslie Brandt; Chris Swanston
2016-01-01
Context. Species distribution models (SDM) establish statistical relationships between the current distribution of species and key attributes whereas process-based models simulate ecosystem and tree species dynamics based on representations of physical and biological processes. TreeAtlas, which uses DISTRIB SDM, and Linkages and LANDIS PRO, process...
NASA Astrophysics Data System (ADS)
Xue, Xiaochun; Yu, Yonggang
2017-04-01
Numerical analyses have been performed to study the influence of fast depressurization on the wake flow field of a base-bleed unit (BBU) with secondary combustion as the base-bleed projectile is propelled out of the muzzle. Two-dimensional axisymmetric Navier-Stokes equations for a multi-component chemically reactive system are solved with a Fortran program to calculate the coupling of the internal and wake flow fields, with consideration of the combustion of the base-bleed propellant and the secondary combustion effect. Based on comparison with experiments, the unsteady variation mechanism and the secondary combustion characteristics of the wake flow field under fast depressurization are obtained numerically. The results show that during fast depressurization, the base pressure of the BBU varies most strongly in the first 0.9 ms, then changes more gradually, and after 1.5 ms remains basically stable. The pressure and temperature of the base-bleed combustion chamber first decrease and then recover. Moreover, after the pressure and temperature reach their lowest points, external gases flow back into the base-bleed combustion chamber. Also, with a decrease in initial pressure, the unsteady process becomes shorter and the temperature gradient in the base-bleed combustion chamber declines under fast depressurization, which benefits the combustion of the base-bleed propellant.
Gnoth, S; Jenzsch, M; Simutis, R; Lübbert, A
2007-10-31
The Process Analytical Technology (PAT) initiative of the FDA is a reaction to the increasing discrepancy between what is currently possible in process supervision and control of pharmaceutical production processes and what is currently applied in industrial manufacturing. With rigid approval practices based on standard operational procedures, adaptation of production reactors to the state of the art was more or less inhibited for many years. Now PAT paves the way for continuous process and product improvement through improved process supervision based on knowledge-based data analysis, "Quality-by-Design" concepts, and, finally, feedback control. Examples of up-to-date implementations of this concept are presented. They are taken from one key group of processes in recombinant pharmaceutical protein manufacturing: the cultivation of genetically modified Escherichia coli bacteria.
Learning-based controller for biotechnology processing, and method of using
Johnson, John A.; Stoner, Daphne L.; Larsen, Eric D.; Miller, Karen S.; Tolle, Charles R.
2004-09-14
The present invention relates to process control where some of the controllable parameters are difficult or impossible to characterize. It relates to process control of such systems in biotechnology, including but not limited to minerals processing. In the inventive method, an application of the present invention manipulates a minerals bioprocess to find local extrema (maxima or minima) for selected output variables/process goals by using a learning-based controller for bioprocess oxidation of minerals during hydrometallurgical processing. The learning-based controller operates with or without human supervision and works to find process optima without previously defined optima, owing to the non-characterized nature of the process being manipulated.
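The abstract's core idea (perturb a manipulable parameter, keep improvements, and refine the search) can be sketched briefly; the objective function and parameter below are hypothetical stand-ins, not the patented controller.

```python
# Learning-based search for a local extremum of a non-characterized process:
# perturb one controllable parameter, keep changes that improve the measured
# output, and shrink the step when no direction improves.
def bioprocess_output(aeration):
    # Hypothetical stand-in for the measured mineral-oxidation rate
    return -(aeration - 3.2) ** 2 + 10.0

x, step = 1.0, 0.5
best = bioprocess_output(x)
for _ in range(100):
    for candidate in (x + step, x - step):
        y = bioprocess_output(candidate)
        if y > best:
            x, best = candidate, y
            break
    else:
        step /= 2  # no improvement in either direction: refine locally
print(f"local maximum near aeration={x:.2f}, output={best:.2f}")
```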
Lubricant base oil and wax processing. [Glossary included
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sequeira, A. Jr.
1994-01-01
This book provides state-of-the-art information on all processes currently used to manufacture lubricant base oils and waxes. It furnishes helpful lists of conversion factors, construction cost data, and process licensors, as well as a glossary of essential petroleum processing terms.
2007-05-01
Environmental Impact Analysis Process, Laughlin Air Force Base, Texas. Agency: 47th Flying Training Wing (FTW), Laughlin Air Force Base (AFB), Texas. [Map figure: Laughlin Air Force Base environmental restoration sites, including AOC01, PS018, WP002, DP008, and WP006.]
NASA Astrophysics Data System (ADS)
Elag, M.; Goodall, J. L.
2013-12-01
Hydrologic modeling often requires the re-use and integration of models from different disciplines to simulate complex environmental systems. Component-based modeling introduces a flexible approach for integrating physical-based processes across disciplinary boundaries. Several hydrologic-related modeling communities have adopted the component-based approach for simulating complex physical systems by integrating model components across disciplinary boundaries in a workflow. However, it is not always straightforward to create these interdisciplinary models due to the lack of sufficient knowledge about a hydrologic process. This shortcoming is a result of using informal methods for organizing and sharing information about a hydrologic process. A knowledge-based ontology provides such standards and is considered the ideal approach for overcoming this challenge. The aims of this research are to present the methodology used in analyzing the basic hydrologic domain in order to identify hydrologic processes, the ontology itself, and how the proposed ontology is integrated with the Water Resources Component (WRC) ontology. The proposed ontology standardizes the definitions of a hydrologic process, the relationships between hydrologic processes, and their associated scientific equations. The objective of the proposed Hydrologic Process (HP) Ontology is to advance the idea of creating a unified knowledge framework for components' metadata by introducing a domain-level ontology for hydrologic processes. The HP ontology is a step toward an explicit and robust domain knowledge framework that can be evolved through the contribution of domain users. Analysis of the hydrologic domain is accomplished using Formal Concept Analysis (FCA), in which the infiltration process, an important hydrologic process, is examined. Two infiltration methods, the Green-Ampt and Philip's methods, were used to demonstrate the implementation of information in the HP ontology. Furthermore, a SPARQL service is provided for semantic-based querying of the ontology.
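Since the abstract mentions a SPARQL service over the HP ontology, a minimal sketch of such a query with Python's rdflib follows; the file name, namespace, and property names are hypothetical, as the ontology's actual schema is not given here.

```python
# Query a (hypothetical) copy of the HP ontology for hydrologic processes
# and their associated scientific equations via SPARQL.
from rdflib import Graph

g = Graph()
g.parse("hp_ontology.owl", format="xml")  # hypothetical file name

q = """
PREFIX hp: <http://example.org/hydrologic-process#>
SELECT ?process ?equation
WHERE {
    ?process a hp:HydrologicProcess ;
             hp:hasEquation ?equation .
}
"""
for row in g.query(q):
    print(row.process, row.equation)
```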
Slofstra, Christien; Eisma, Maarten C; Holmes, Emily A; Bockting, Claudi L H; Nauta, Maaike H
2017-01-01
Ruminative (abstract verbal) processing during recall of aversive autobiographical memories may serve to dampen their short-term affective impact. Experimental studies indeed demonstrate that verbal processing of non-autobiographical material and positive autobiographical memories evokes weaker affective responses than imagery-based processing. In the current study, we hypothesized that abstract verbal or concrete verbal processing of an aversive autobiographical memory would result in weaker affective responses than imagery-based processing. The affective impact of abstract verbal versus concrete verbal versus imagery-based processing during recall of an aversive autobiographical memory was investigated in a non-clinical sample (n = 99) using both an observational and an experimental design. Observationally, it was examined whether spontaneous use of processing modes (both state and trait measures) was associated with impact of aversive autobiographical memory recall on negative and positive affect. Experimentally, the causal relation between processing modes and affective impact was investigated by manipulating the processing mode during retrieval of the same aversive autobiographical memory. Main findings were that higher levels of trait (but not state) measures of both ruminative and imagery-based processing and depressive symptomatology were positively correlated with higher levels of negative affective impact in the observational part of the study. In the experimental part, no main effect of processing modes on affective impact of autobiographical memories was found. However, a significant moderating effect of depressive symptomatology was found. Only for individuals with low levels of depressive symptomatology did concrete verbal (but not abstract verbal) processing of the aversive autobiographical memory result in weaker affective responses, compared to imagery-based processing. These results cast doubt on the hypothesis that ruminative processing of aversive autobiographical memories serves to avoid the negative emotions evoked by such memories. Furthermore, findings suggest that depressive symptomatology is associated with the spontaneous use and the affective impact of processing modes during recall of aversive autobiographical memories. Clinical studies are needed that examine the role of processing modes during aversive autobiographical memory recall in depression, including the potential effectiveness of targeting processing modes in therapy.
NASA Astrophysics Data System (ADS)
Qyyum, Muhammad Abdul; Wei, Feng; Hussain, Arif; Ali, Wahid; Sehee, Oh; Lee, Moonyong
2017-11-01
This research work presents a simple, safe, environment-friendly, and energy-efficient vortex tube-based natural gas liquefaction (LNG) process. A vortex tube was introduced into the popular N2-expander liquefaction process to enhance liquefaction efficiency. The process structure and conditions were modified and optimized to take full advantage of the vortex tube in the natural gas liquefaction cycle. Two commercial simulators, ANSYS® and Aspen HYSYS®, were used to investigate the application of the vortex tube in the refrigeration cycle of the LNG process. A computational fluid dynamics (CFD) model was used to simulate the vortex tube with nitrogen (N2) as the working fluid. Subsequently, the results of the CFD model were embedded in Aspen HYSYS® to validate the proposed LNG liquefaction process. The proposed natural gas liquefaction process was optimized using the knowledge-based optimization (KBO) approach, with overall energy consumption as the objective function. The performance of the proposed liquefaction process was compared with that of the conventional N2-expander liquefaction process. The vortex tube-based LNG process improved energy efficiency by 20% relative to the conventional N2-expander liquefaction process, mainly owing to the isentropic expansion in the vortex tube. The energy efficiency of the vortex tube-based process turned out to depend strongly on the refrigerant cold fraction, the operating conditions, and the refrigerant cycle configuration.
An object-oriented description method of EPMM process
NASA Astrophysics Data System (ADS)
Jiang, Zuo; Yang, Fan
2017-06-01
To use mature object-oriented tools and languages for software process modelling, and to bring software process models closer to industrial standards, it is necessary to study object-oriented modelling of software processes. Based on the formal process definition in EPMM, and considering that the Petri net is mainly a formal modelling tool, this paper combines Petri net modelling with object-oriented modelling ideas and provides an implementation method for converting Petri net-based EPMM models into object models based on object-oriented descriptions.
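The mapping from Petri net elements to objects can be sketched minimally; the class design below is an illustrative assumption, not EPMM's actual object model.

```python
# Map Petri net elements onto object-oriented classes: places hold tokens as
# state, transitions become methods that fire when enabled.
class Place:
    def __init__(self, name, tokens=0):
        self.name, self.tokens = name, tokens

class Transition:
    def __init__(self, name, inputs, outputs):
        self.name, self.inputs, self.outputs = name, inputs, outputs

    def enabled(self):
        return all(p.tokens > 0 for p in self.inputs)

    def fire(self):
        if not self.enabled():
            raise RuntimeError(f"{self.name} not enabled")
        for p in self.inputs:
            p.tokens -= 1
        for p in self.outputs:
            p.tokens += 1

# A two-step software process: design must finish before coding starts
designed, coded = Place("designed", 1), Place("coded")
code = Transition("code", [designed], [coded])
code.fire()
print(coded.tokens)  # -> 1
```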
Automated process control for plasma etching
NASA Astrophysics Data System (ADS)
McGeown, Margaret; Arshak, Khalil I.; Murphy, Eamonn
1992-06-01
This paper discusses the development and implementation of a rule-based system which assists in providing automated process control for plasma etching. The heart of the system is establishing a correspondence between a particular data pattern (sensor or data signals) and one or more modes of failure, i.e., a data-driven monitoring approach. The objective of this rule-based system, PLETCHSY, is to create a program combining statistical process control (SPC) and fault diagnosis to help control a manufacturing process which varies over time. This can be achieved by building a process control system (PCS) with a facility to monitor the performance of the process by obtaining and analyzing data relating to the appropriate process variables. Process sensor/status signals are input into an SPC module. If trends are present, the SPC module outputs the last seven control points, a pattern which is represented by either regression or scoring. The pattern is passed to the rule-based module. When the rule-based system recognizes a pattern, it starts the diagnostic process using that pattern. If the process is considered to be going out of control, advice is provided about actions which should be taken to bring the process back into control.
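The trend-detection step lends itself to a short sketch; the seven-point window follows the abstract, while the signal name and readings below are hypothetical.

```python
# Flag a monotone run across the last seven control points, the pattern the
# SPC module hands to the rule-based diagnosis module.
def trend_alarm(points):
    """Return 'up', 'down', or None for the last seven control points."""
    if len(points) < 7:
        return None
    window = points[-7:]
    diffs = [b - a for a, b in zip(window, window[1:])]
    if all(d > 0 for d in diffs):
        return "up"
    if all(d < 0 for d in diffs):
        return "down"
    return None

# Hypothetical etch-rate readings drifting upward
readings = [98.2, 98.6, 99.1, 99.5, 99.9, 100.4, 101.0]
print(trend_alarm(readings))  # -> 'up'
```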
Lee, Eunjoo; Noh, Hyun Kyung
2016-01-01
To examine the effects of a web-based nursing process documentation system on the stress and anxiety of nursing students during their clinical practice. A quasi-experimental design was employed. The experimental group (n = 110) used a web-based nursing process documentation program for their case reports as part of assignments for a clinical practicum, whereas the control group (n = 106) used traditional paper-based case reports. Stress and anxiety levels were measured with a numeric rating scale before, 2 weeks after, and 4 weeks after using the web-based nursing process documentation program during a clinical practicum. The data were analyzed using descriptive statistics, t tests, chi-square tests, and repeated-measures analyses of variance. Nursing students who used the web-based nursing process documentation program showed significantly lower levels of stress and anxiety than the control group. A web-based nursing process documentation program could be used to reduce the stress and anxiety of nursing students during clinical practicum, which ultimately would benefit nursing students by increasing satisfaction with and effectiveness of clinical practicum. © 2015 NANDA International, Inc.
Process-based principles for restoring river ecosystems
Timothy J. Beechie; David A. Sear; Julian D. Olden; George R. Pess; John M. Buffington; Hamish Moir; Philip Roni; Michael M. Pollock
2010-01-01
Process-based restoration aims to reestablish normative rates and magnitudes of physical, chemical, and biological processes that sustain river and floodplain ecosystems. Ecosystem conditions at any site are governed by hierarchical regional, watershed, and reach-scale processes controlling hydrologic and sediment regimes; floodplain and aquatic habitat...
Newman, Ian R; Gibb, Maia; Thompson, Valerie A
2017-07-01
It is commonly assumed that belief-based reasoning is fast and automatic, whereas rule-based reasoning is slower and more effortful. Dual-Process theories of reasoning rely on this speed-asymmetry explanation to account for a number of reasoning phenomena, such as base-rate neglect and belief-bias. The goal of the current study was to test this hypothesis about the relative speed of belief-based and rule-based processes. Participants solved base-rate problems (Experiment 1) and conditional inferences (Experiment 2) under a challenging deadline; they then gave a second response in free time. We found that fast responses were informed by rules of probability and logical validity, and that slow responses incorporated belief-based information. Implications for Dual-Process theories and future research options for dissociating Type I and Type II processes are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
2018-01-01
ARL-TR-8270, January 2018, US Army Research Laboratory. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform, by Kwok F Tom, Sensors and Electron Devices Directorate. Reporting period: 1 October 2016 to 30 September 2017.
The Acquisition of Integrated Science Process Skills in a Web-Based Learning Environment
ERIC Educational Resources Information Center
Saat, Rohaida Mohd
2004-01-01
Web-based learning is becoming prevalent in science learning. Some use specially designed programs, while others use materials available on the Internet. This qualitative case study examined the process of acquisition of integrated science process skills, particularly the skill of controlling variables, in a web-based learning environment among…
USDA-ARS?s Scientific Manuscript database
Predictions of seedling emergence timing for spring wheat are facilitated by process-based modeling of the microsite environment in the shallow seedling recruitment zone. Hourly temperature and water profiles within the recruitment zone for 60 days after planting were simulated from the process-base...
ERIC Educational Resources Information Center
Bruton, Anthony
2005-01-01
Process writing and communicative-task-based instruction both assume productive tasks that prompt self-expression to motivate students and as the principal engine for developing L2 proficiency in the language classroom. Besides this, process writing and communicative-task-based instruction have much else in common, despite some obvious…
Process correlation analysis model for process improvement identification.
Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong
2014-01-01
Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data. PMID:24977170
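Since the model mines correlations of process elements from assessment results, a minimal sketch follows; the practice names, rating scale, and correlation threshold are hypothetical, and the paper's actual model is CMMI-specific and empirically calibrated.

```python
# Estimate pairwise correlations between CMMI practice ratings across
# assessed projects and flag strongly correlated element pairs for joint
# treatment in the improvement plan.
import pandas as pd

ratings = pd.DataFrame({
    "REQM.SP1.1": [2, 3, 1, 2, 3],   # rows: projects; values: capability ratings
    "PP.SP2.1":   [2, 3, 1, 3, 3],
    "PMC.SP1.6":  [1, 2, 1, 2, 2],
})

corr = ratings.corr()                # Pearson correlation between elements
threshold = 0.8
pairs = [(a, b, round(corr.loc[a, b], 2))
         for i, a in enumerate(corr.columns)
         for b in corr.columns[i + 1:]
         if corr.loc[a, b] >= threshold]
print(pairs)
```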
Enzyme-based solutions for textile processing and dye contaminant biodegradation-a review.
Chatha, Shahzad Ali Shahid; Asgher, Muhammad; Iqbal, Hafiz M N
2017-06-01
The textile industry, a recognized mainstay of the world's economy, is facing serious environmental challenges. In practice, various chemical-based processes, from initial sizing to final washing, raise serious environmental concerns in numerous industries. Some of these chemicals are corrosive to equipment and cause serious damage themselves. Therefore, in the twenty-first century, the chemical and allied industries seek a paradigm shift from traditional chemical-based concepts to greener, sustainable, and environmentally friendlier catalytic alternatives, at both the laboratory and industrial scales. Bio-based catalysis offers numerous benefits in the context of the biotechnological industry and environmental applications. In recent years, bio-based processing has received particular interest among scientists for inter- and multi-disciplinary investigations in the natural and engineering sciences, with applications in the biotechnology sector at large and textile industries in particular. Different enzymatic processes, such as chemical substitution, have been developed or are in development for various textile wet processes. In this context, the present review article summarizes current developments and highlights those areas where environment-friendly enzymatic textile processing might play an increasingly important role in the textile industry. In the first part of the review, a special focus is given to a comparative discussion of the chemical-based "classical/conventional" treatments and the modern enzyme-based treatment processes. Relevant information is also reported to identify the major research gaps to be addressed in future work.
Developing cloud-based Business Process Management (BPM): a survey
NASA Astrophysics Data System (ADS)
Mercia; Gunawan, W.; Fajar, A. N.; Alianto, H.; Inayatulloh
2018-03-01
In today’s highly competitive business environment, modern enterprises are struggling to cut unnecessary costs, eliminate waste, and deliver greater benefits to the organization. Companies are increasingly turning to a more flexible IT environment to help them realize this goal. For this reason, this article applies cloud-based Business Process Management (BPM), which enables a focus on modeling, monitoring, and process management. Cloud-based BPM consists of business processes, business information, and IT resources, which help build real-time intelligence systems based on business management and cloud technology. Cloud computing is a paradigm that involves procuring dynamically measurable resources over the internet as an IT resource service. A cloud-based BPM service addresses common problems faced by traditional BPM, especially in promoting flexible, event-driven business processes that exploit opportunities in the marketplace.
Testing the Digital Thread in Support of Model-Based Manufacturing and Inspection
Hedberg, Thomas; Lubell, Joshua; Fischer, Lyle; Maggiano, Larry; Feeney, Allison Barnard
2016-01-01
A number of manufacturing companies have reported anecdotal evidence describing the benefits of Model-Based Enterprise (MBE). Based on this evidence, major players in industry have embraced a vision to deploy MBE. In our view, the best chance of realizing this vision is the creation of a single “digital thread.” Under MBE, there exists a Model-Based Definition (MBD), created by the Engineering function, that downstream functions reuse to complete Model-Based Manufacturing and Model-Based Inspection activities. The ensemble of data that enables the combination of model-based definition, manufacturing, and inspection defines this digital thread. Such a digital thread would enable real-time design and analysis, collaborative process-flow development, automated artifact creation, and full-process traceability in a seamless real-time collaborative development among project participants. This paper documents the strengths and weaknesses in current industry strategies for implementing MBE. It also identifies gaps in the transition and/or exchange of data between various manufacturing processes. Lastly, this paper presents measured results from a study of model-based processes compared to drawing-based processes and provides evidence to support the anecdotal evidence and vision made by industry. PMID:27325911
On the fractal characterization of Paretian Poisson processes
NASA Astrophysics Data System (ADS)
Eliazar, Iddo I.; Sokolov, Igor M.
2012-06-01
Paretian Poisson processes are Poisson processes which are defined on the positive half-line, have maximal points, and are quantified by power-law intensities. Paretian Poisson processes are elemental in statistical physics, and are the bedrock of a host of power-law statistics ranging from Pareto's law to anomalous diffusion. In this paper we establish evenness-based fractal characterizations of Paretian Poisson processes. Considering an array of socioeconomic evenness-based measures of statistical heterogeneity, we show that: amongst the realm of Poisson processes which are defined on the positive half-line, and have maximal points, Paretian Poisson processes are the unique class of 'fractal processes' exhibiting scale-invariance. The results established in this paper are diametric to previous results asserting that the scale-invariance of Poisson processes-with respect to physical randomness-based measures of statistical heterogeneity-is characterized by exponential Poissonian intensities.
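For concreteness, the defining object can be written down; the symbols below (scale factor c and exponent α) are illustrative notation consistent with the abstract's description, not quoted from the paper.

```latex
% A Paretian Poisson process: a Poisson process on (0, \infty)
% with a power-law intensity, where c > 0 is a scale factor and
% \alpha > 0 is the power-law exponent:
\lambda(x) = \frac{c}{x^{1+\alpha}}, \qquad x > 0.
% The expected number of points above a level l > 0 is finite,
%   \int_l^\infty \lambda(x)\,dx = (c/\alpha)\, l^{-\alpha},
% so realizations possess maximal points, as the abstract requires.
```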
Butt, Muhammad Arif; Akram, Muhammad
2016-01-01
We present a new intuitionistic fuzzy rule-based decision-making system, based on intuitionistic fuzzy sets, for the process scheduler of a batch operating system. Our proposed intuitionistic fuzzy scheduling algorithm inputs the nice value and burst time of all available processes in the ready queue, intuitionistically fuzzifies the input values, triggers the appropriate rules of our intuitionistic fuzzy inference engine, and finally calculates the dynamic priority (dp) of all the processes in the ready queue. Once the dp of every process is calculated, the ready queue is sorted in decreasing order of dp. The process with the maximum dp value is sent to the central processing unit for execution. Finally, we show the complete working of our algorithm on two different data sets and give comparisons with some standard non-preemptive process schedulers.
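The scheduling loop the abstract describes can be sketched briefly; the crisp stand-in for intuitionistic fuzzification and the rule weights below are hypothetical simplifications of the paper's inference engine.

```python
# Compute a dynamic priority (dp) from nice value and burst time, sort the
# ready queue by dp, and dispatch the process with the maximum dp.
def dynamic_priority(nice, burst, max_nice=19, max_burst=100):
    # Crisp stand-in for intuitionistic fuzzification: normalize both inputs
    # to [0, 1], favoring low nice values and short bursts.
    urgency = 1 - (nice + 20) / (max_nice + 20)   # nice in [-20, 19]
    brevity = 1 - min(burst, max_burst) / max_burst
    return 0.6 * urgency + 0.4 * brevity          # hypothetical rule weights

ready_queue = [("p1", 0, 30), ("p2", -5, 80), ("p3", 10, 5)]
ready_queue.sort(key=lambda p: dynamic_priority(p[1], p[2]), reverse=True)
print("dispatch:", ready_queue[0][0])  # process with maximal dp runs first
```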
Reading Remediation Based on Sequential and Simultaneous Processing.
ERIC Educational Resources Information Center
Gunnison, Judy; And Others
1982-01-01
The theory postulating a dichotomy between sequential and simultaneous processing is reviewed and its implications for remediating reading problems are reviewed. Research is cited on sequential-simultaneous processing for early and advanced reading. A list of remedial strategies based on the processing dichotomy addresses decoding and lexical…
The poetics of mourning and faith-based intervention in maladaptive grieving processes in Ethiopia.
Hussein, Jeylan Wolyie
2018-08-01
The paper is an inquiry into the poetics of mourning and faith-based intervention in maladaptive grieving processes in Ethiopia. The paper discusses the ways that loss is signified and analyzes the meanings of ethnocultural and psychospiritual practices employed to deal with maladaptive grief processes and their psychological and emotional after-effects. Hermeneutics provided the methodological framework and informed the analysis. The thesis of the paper is that the poetics of mourning and faith-based social interventions are interactionally based meaning making processes. The paper indicates the limitations of the study and their implications for further inquiry.
A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems
NASA Astrophysics Data System (ADS)
Li, Yu; Oberweis, Andreas
Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.
Process-based organization design and hospital efficiency.
Vera, Antonio; Kuntz, Ludwig
2007-01-01
The central idea of process-based organization design is that organizing a firm around core business processes leads to cost reductions and quality improvements. We investigated theoretically and empirically whether the implementation of a process-based organization design is advisable in hospitals. The data came from a database compiled by the Statistical Office of the German federal state of Rheinland-Pfalz and from a written questionnaire, which was sent to the chief executive officers (CEOs) of all 92 hospitals in this federal state. We used data envelopment analysis (DEA) to measure hospital efficiency, and factor analysis and regression analysis to test our hypothesis. Our principal finding is that a high degree of process-based organization has a moderate but significant positive effect on the efficiency of hospitals. The main implication is that hospitals should implement a process-based organization to improve their efficiency. However, to actually achieve positive effects on efficiency, it is of paramount importance to observe some implementation rules, in particular to mobilize physician participation and to create an adequate organizational culture.
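The efficiency-measurement step can be sketched compactly; the toy inputs and outputs below are hypothetical, not the Rheinland-Pfalz hospital data, and the model shown is the standard input-oriented CCR envelopment form.

```python
# Score each hospital's efficiency with input-oriented CCR DEA, solved as a
# linear program: minimize theta s.t. X'lambda <= theta*x_o, Y'lambda >= y_o.
import numpy as np
from scipy.optimize import linprog

X = np.array([[120.0, 80.0], [150.0, 60.0], [100.0, 90.0]])  # inputs: beds, staff
Y = np.array([[900.0], [850.0], [700.0]])                    # output: treated cases

def ccr_efficiency(o):
    n, m = X.shape            # hospitals, inputs
    s = Y.shape[1]            # outputs
    c = np.r_[1.0, np.zeros(n)]                   # minimize theta
    A_in = np.column_stack([-X[o], X.T])          # sum_j l_j x_ij <= theta x_io
    A_out = np.column_stack([np.zeros(s), -Y.T])  # sum_j l_j y_rj >= y_ro
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[o]],
                  bounds=[(0, None)] * (1 + n))
    return res.x[0]                               # efficiency score in (0, 1]

for o in range(len(X)):
    print(f"hospital {o}: efficiency = {ccr_efficiency(o):.3f}")
```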
Process-Based Governance in Public Administrations Using Activity-Based Costing
NASA Astrophysics Data System (ADS)
Becker, Jörg; Bergener, Philipp; Räckers, Michael
Decision- and policy-makers in public administrations currently lack relevant information for sufficient governance. In Germany, the introduction of New Public Management and double-entry accounting gives public administrations the opportunity to use cost-centered accounting mechanisms to establish new governance mechanisms. Process modelling can be a useful instrument to help decision- and policy-makers in public administrations structure their activities and capture relevant information. In combination with approaches like Activity-Based Costing, higher management levels can be supported with a reasonable data base for fruitful governance approaches. Therefore, the aim of this article is to combine the public-sector domain-specific process modelling method PICTURE with the concept of activity-based costing to support public administrations in process-based governance.
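As a small illustration of the costing side, a hedged activity-based costing sketch follows; the activities, cost pools, and driver volumes are invented for illustration and are not taken from the PICTURE method.

```python
# Trace overhead to activities via cost drivers, then roll the activity rates
# up into the cost of one execution of an administrative process.
activities = {
    # activity: (annual cost pool in EUR, annual driver volume)
    "register application": (120_000, 8_000),
    "check documents":      (200_000, 10_000),
    "issue permit":         (90_000,  6_000),
}

rates = {a: pool / volume for a, (pool, volume) in activities.items()}

# One building-permit process execution consumes these activity counts
consumption = {"register application": 1, "check documents": 2, "issue permit": 1}
process_cost = sum(rates[a] * n for a, n in consumption.items())
print(f"cost per process execution: EUR {process_cost:.2f}")  # EUR 70.00
```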
DEVS Unified Process for Web-Centric Development and Testing of System of Systems
2008-05-20
gathering from the user. Further, methodologies have been developed to generate DEVS models from BPMN/BPEL-based and message-based requirement specifications. 3. BPMN/BPEL-based system specifications: Business Process Modeling Notation (BPMN) or Business Process Execution Language (BPEL) provide a … information is stored in .wsdl and .bpel files for BPEL but in a proprietary format for BPMN. 4. DoDAF-based requirement specifications: Department of …
Problem Based Learning: Cognitive and Metacognitive Processes during Problem Analysis.
ERIC Educational Resources Information Center
De Grave, W. S.; And Others
1996-01-01
To investigate whether problem-based learning leads to conceptual change, the cognitive and metacognitive processes of a group of medical students were studied during the problem analysis phase, and their verbal communication and thinking processes were analyzed. Stimulated recall of the thinking process during the discussion detected a conceptual…
ERIC Educational Resources Information Center
Xiao, Naiqi G.; Quinn, Paul C.; Ge, Liezhong; Lee, Kang
2017-01-01
Although most of the faces we encounter daily are moving ones, much of what we know about face processing and its development is based on studies using static faces that emphasize holistic processing as the hallmark of mature face processing. Here the authors examined the effects of facial movements on face processing developmentally in children…
ERIC Educational Resources Information Center
Spaulding, Trent Joseph
2011-01-01
The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…
Process-based models are required to manage ecological systems in a changing world
K. Cuddington; M.-J. Fortin; L.R. Gerber; A. Hastings; A. Liebhold; M. OConnor; C. Ray
2013-01-01
Several modeling approaches can be used to guide management decisions. However, some approaches are better fitted than others to address the problem of prediction under global change. Process-based models, which are based on a theoretical understanding of relevant ecological processes, provide a useful framework to incorporate specific responses to altered...
Knowledge information management toolkit and method
Hempstead, Antoinette R.; Brown, Kenneth L.
2006-08-15
A system is provided for managing user entry and/or modification of knowledge information into a knowledge base file having an integrator support component and a data source access support component. The system includes processing circuitry, memory, a user interface, and a knowledge base toolkit. The memory communicates with the processing circuitry and is configured to store at least one knowledge base. The user interface communicates with the processing circuitry and is configured for user entry and/or modification of knowledge pieces within a knowledge base. The knowledge base toolkit is configured for converting knowledge in at least one knowledge base from a first knowledge base form into a second knowledge base form. A method is also provided.
Common Workflow Service: Standards Based Solution for Managing Operational Processes
NASA Astrophysics Data System (ADS)
Tinio, A. W.; Hollins, G. A.
2017-06-01
The Common Workflow Service is a collaborative and standards-based solution for managing mission operations processes using techniques from the Business Process Management (BPM) discipline. This presentation describes the CWS and its benefits.
NASA Astrophysics Data System (ADS)
Ariana, I. M.; Bagiada, I. M.
2018-01-01
Development of spreadsheet-based integrated transaction processing systems and financial reporting systems is intended to optimize the capabilities of spreadsheets in accounting data processing. The purposes of this study are: 1) to describe spreadsheet-based integrated transaction processing systems and financial reporting systems; and 2) to test their technical and operational feasibility. This is a research and development study. The main steps were: 1) needs assessment; 2) development of the spreadsheet-based integrated transaction processing systems and financial reporting systems; and 3) testing their feasibility. Technical feasibility covers the ability of the hardware and operating system to run the accounting application, and its simplicity and ease of use. Operational feasibility covers the ability of users to use the accounting application, the ability of the application to produce information, and the controls built into the application. The instrument used to assess technical and operational feasibility was an expert perception questionnaire using a 4-point Likert scale, from 1 (strongly disagree) to 4 (strongly agree). Data were analyzed using percentage analysis, comparing the sum of actual answers for each item with the ideal answer for that item. The spreadsheet-based integrated transaction processing systems and financial reporting systems integrate sales, purchases, and cash transaction processing to produce financial reports (statement of profit or loss and other comprehensive income, statement of changes in equity, statement of financial position, and statement of cash flows) and other reports. The systems are feasible from the technical aspect (87.50%) and the operational aspect (84.17%).
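The percentage analysis reduces to a one-line computation; the response values below are hypothetical.

```python
# Percentage analysis: sum of actual Likert answers for an item relative to
# the ideal (all experts answering 4, 'strongly agree').
responses = [4, 3, 4, 4, 3, 3]             # hypothetical expert ratings
ideal = 4 * len(responses)
feasibility = 100 * sum(responses) / ideal
print(f"feasibility: {feasibility:.2f}%")  # -> 87.50%
```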
Singh, Tarini; Laub, Ruth; Burgard, Jan Pablo; Frings, Christian
2018-05-01
Selective attention refers to the ability to selectively act upon relevant information at the expense of irrelevant information. Yet, in many experimental tasks, what happens to the representation of the irrelevant information is still debated. Typically, 2 approaches to distractor processing have been suggested, namely distractor inhibition and distractor-based retrieval. However, it is also typical that both processes are hard to disentangle. For instance, in the negative priming literature (for a review, see Frings, Schneider, & Fox, 2015) this has been a continuous debate since the early 1980s. In the present study, we attempted to prove that both processes exist, but that they reflect distractor processing at different levels of representation. Distractor inhibition impacts stimulus representation, whereas distractor-based retrieval impacts mainly motor processes. We investigated both processes in a distractor-priming task, which enables an independent measurement of both processes. For our argument that both processes impact different levels of distractor representation, we estimated the exponential parameter (τ) and Gaussian components (μ, σ) of the exponential-Gaussian (ex-Gaussian) reaction-time (RT) distribution, which have previously been used to independently test the effects of cognitive and motor processes (e.g., Moutsopoulou & Waszak, 2012). The distractor-based retrieval effect was evident for the Gaussian component, which is typically discussed as reflecting motor processes, but not for the exponential parameter, whereas the inhibition component was evident for the exponential parameter, which is typically discussed as reflecting cognitive processes, but not for the Gaussian parameter. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
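The parameter-estimation step can be sketched with SciPy, whose exponnorm distribution is the ex-Gaussian with K = τ/σ, loc = μ, and scale = σ; the simulated reaction times below are illustrative, not the study's data.

```python
# Fit ex-Gaussian parameters (mu, sigma, tau) to reaction times.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
rts = rng.normal(450, 40, 2000) + rng.exponential(120, 2000)  # simulated RTs in ms

K, loc, scale = stats.exponnorm.fit(rts)
mu, sigma, tau = loc, scale, K * scale
print(f"mu={mu:.0f} ms, sigma={sigma:.0f} ms, tau={tau:.0f} ms")
```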
A Sensitivity Analysis Method to Study the Behavior of Complex Process-based Models
NASA Astrophysics Data System (ADS)
Brugnach, M.; Neilson, R.; Bolte, J.
2001-12-01
The use of process-based models as a tool for scientific inquiry is becoming increasingly relevant in ecosystem studies. Process-based models are artificial constructs that simulate the system by mechanistically mimicking the functioning of its component processes. Structurally, a process-based model can be characterized, in terms of its processes and the relationships established among them. Each process comprises a set of functional relationships among several model components (e.g., state variables, parameters and input data). While not encoded explicitly, the dynamics of the model emerge from this set of components and interactions organized in terms of processes. It is the task of the modeler to guarantee that the dynamics generated are appropriate and semantically equivalent to the phenomena being modeled. Despite the availability of techniques to characterize and understand model behavior, they do not suffice to completely and easily understand how a complex process-based model operates. For example, sensitivity analysis studies model behavior by determining the rate of change in model output as parameters or input data are varied. One of the problems with this approach is that it considers the model as a "black box", and it focuses on explaining model behavior by analyzing the relationship input-output. Since, these models have a high degree of non-linearity, understanding how the input affects an output can be an extremely difficult task. Operationally, the application of this technique may constitute a challenging task because complex process-based models are generally characterized by a large parameter space. In order to overcome some of these difficulties, we propose a method of sensitivity analysis to be applicable to complex process-based models. This method focuses sensitivity analysis at the process level, and it aims to determine how sensitive the model output is to variations in the processes. Once the processes that exert the major influence in the output are identified, the causes of its variability can be found. Some of the advantages of this approach are that it reduces the dimensionality of the search space, it facilitates the interpretation of the results and it provides information that allows exploration of uncertainty at the process level, and how it might affect model output. We present an example using the vegetation model BIOME-BGC.
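A minimal sketch of the proposed process-level (rather than parameter-level) sensitivity analysis follows; the toy model and process names are stand-ins, since BIOME-BGC itself is far too large to reproduce here.

```python
# Perturb each process's contribution by a common multiplier and record the
# relative change in model output, attributing sensitivity to processes.
def model(scale):
    photosynthesis = 10.0 * scale["photosynthesis"]
    respiration = 4.0 * scale["respiration"]
    allocation = 0.8 * scale["allocation"]
    return (photosynthesis - respiration) * allocation  # net growth (toy)

baseline = {"photosynthesis": 1.0, "respiration": 1.0, "allocation": 1.0}
y0 = model(baseline)

for process in baseline:
    perturbed = dict(baseline, **{process: 1.10})  # +10% on one process
    dy = (model(perturbed) - y0) / y0
    print(f"{process}: {100 * dy:+.1f}% output change per +10% perturbation")
```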
Values-based recruitment in health care.
Miller, Sam Louise
2015-01-27
Values-based recruitment is a process being introduced to student selection for nursing courses and appointment to registered nurse posts. This article discusses the process of values-based recruitment and demonstrates why it is important in health care today. It examines the implications of values-based recruitment for candidates applying to nursing courses and to newly qualified nurses applying for their first posts in England. To ensure the best chance of success, candidates should understand the principles and process of values-based recruitment and how to prepare for this type of interview.
Natural Resource Information System. Volume 1: Overall description
NASA Technical Reports Server (NTRS)
1972-01-01
A prototype computer-based Natural Resource Information System was designed which could store, process, and display data of maximum usefulness to land management decision making. The system includes graphic input and display, the use of remote sensing as a data source, and it is useful at multiple management levels. A survey established current decision making processes and functions, information requirements, and data collection and processing procedures. The applications of remote sensing data and processing requirements were established. Processing software was constructed and a data base established using high-altitude imagery and map coverage of selected areas of SE Arizona. Finally a demonstration of system processing functions was conducted utilizing material from the data base.
Associative recognition: a case of recall-to-reject processing.
Rotello, C M; Heit, E
2000-09-01
Two-process accounts of recognition memory assume that memory judgments are based on both a rapidly available familiarity-based process and a slower, more accurate, recall-based mechanism. Past experiments on the time course of item recognition have not supported the recall-to-reject account of the second process, in which the retrieval of an old item is used to reject a similar foil (Rotello & Heit, 1999). In three new experiments, using analyses similar to those of Rotello and Heit, we found robust evidence for recall-to-reject processing in associative recognition, for word pairs, and for list-discrimination judgments. Put together, these results have implications for two-process accounts of recognition.
Application of advanced structure to multi-tone mask for FPD process
NASA Astrophysics Data System (ADS)
Song, Jin-Han; Jeong, Jin-Woong; Kim, Kyu-Sik; Jeong, Woo-Gun; Yun, Sang-Pil; Lee, Dong-Heok; Choi, Sang-Soo
2017-07-01
In accordance with improvements in FPD technology, masks for particular purposes, such as the phase shift mask (PSM) and the multi-tone mask (MTM), have also been developed. Above all, an MTM with three or more transmittance levels has the substantial advantage of reducing the number of masks required in the FPD fabrication process compared with a normal two-tone mask [1,2]. A chromium (Cr)-based MTM (typically the top type) is widely employed because its all-Cr structure, consisting of a Cr absorber layer and a Cr half-tone layer, makes the etch process convenient. However, the top type of Cr-based MTM demands two Cr sputtering processes, one after each layer's writing and etching process. For this reason, a material different from that of the Cr-based MTM is required to reduce mask fabrication time and cost. In this study, we evaluate an MTM with a structure that combines Cr with molybdenum silicide (MoSi) to resolve the issues mentioned above. MoSi, which has been demonstrated in integrated circuit (IC) processing, is a suitable material for MTM evaluation. This structure can realize multiple transmittance levels, in common with the Cr-based MTM; moreover, it reduces the number of sputtering processes. We investigate an optimized structure in consideration of productivity along with performance, such as the critical dimension (CD) variation and transmittance range of each structure. The transmittance is targeted at the h-line wavelength (405 nm) in the evaluation. The performance of the Cr-/MoSi-based MTMs is compared with that of the Cr-based MTM.
Prefrontal and medial temporal contributions to episodic memory-based reasoning.
Suzuki, Chisato; Tsukiura, Takashi; Mochizuki-Kawai, Hiroko; Shigemune, Yayoi; Iijima, Toshio
2009-03-01
Episodic memory retrieval and reasoning are fundamental psychological components of our daily lives. Although previous studies have investigated the brain regions associated with these processes separately, the neural mechanisms of reasoning based on episodic memory retrieval are largely unknown. Here, we investigated the neural correlates underlying episodic memory-based reasoning using functional magnetic resonance imaging (fMRI). During fMRI scanning, subjects performed three tasks: reasoning, episodic memory retrieval, and episodic memory-based reasoning. We identified dissociable activations related to reasoning, episodic memory retrieval, and linking processes between the two. Regions related to reasoning were identified in the left ventral prefrontal cortices (PFC), and those related to episodic memory retrieval were found in the right medial temporal lobe (MTL) regions. In addition, activations predominant in the linking process between the two were found in the left dorsal and right ventral PFC. These findings suggest that episodic memory-based reasoning is composed of at least three processes, i.e., reasoning, episodic memory retrieval, and linking processes between the two, and that activation of both the PFC and MTL is crucial in episodic memory-based reasoning. These findings are the first to demonstrate that PFC and MTL regions contribute differentially to each process in episodic memory-based reasoning.
Skorich, Daniel P; Mavor, Kenneth I
2013-09-01
In the current paper, we argue that categorization and individuation, as traditionally discussed and as experimentally operationalized, are defined in terms of two confounded underlying dimensions: a person/group dimension and a memory-based/data-driven dimension. In a series of three experiments, we unconfound these dimensions and impose a cognitive load. Across the three experiments, two with laboratory-created targets and one with participants' friends as the target, we demonstrate that cognitive load privileges memory-based over data-driven processing, not group- over person-level processing. We discuss the results in terms of their implications for conceptualizations of the categorization/individuation distinction, for the equivalence of person and group processes, for the ultimate 'purpose' and meaningfulness of group-based perception and, fundamentally, for the process of categorization, broadly defined. © 2012 The British Psychological Society.
ERIC Educational Resources Information Center
Manuel, Carlos J.
2009-01-01
This study assesses reading processes and/or strategies needed to deploy deep processing that could push learners towards syntactic-based constructions in L2 classrooms. Research has found L2 acquisition to present varying degrees of success and/or fossilization (Bley-Vroman 1989, Birdsong 1992 and Sharwood Smith 1994). For example, learners have…
NASA Astrophysics Data System (ADS)
Widyaningrum, E.; Gorte, B. G. H.
2017-05-01
LiDAR data acquisition is recognized as one of the fastest solutions for providing basis data for large-scale topographical base maps worldwide. Automatic LiDAR processing is believed to be one possible scheme for accelerating large-scale topographic base map provision by the Geospatial Information Agency in Indonesia. As a progressively advancing technology, Geographic Information Systems (GIS) open possibilities for automatic geospatial data processing and analysis. Considering further needs for spatial data sharing and integration, one-stop processing of LiDAR data in a GIS environment is considered a powerful and efficient approach for base map provision. The quality of the automated topographic base map is assessed and analysed in terms of its completeness, correctness, and quality, based on the confusion matrix.
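The assessment step uses standard map-accuracy measures derived from the confusion matrix; the counts below are hypothetical, and the formulas shown are the usual completeness/correctness/quality definitions rather than figures from this work.

```python
# Completeness, correctness, and quality from confusion-matrix counts.
tp, fp, fn = 412, 38, 55      # hypothetical detected-feature counts

completeness = tp / (tp + fn) # share of reference features that were extracted
correctness = tp / (tp + fp)  # share of extracted features that are correct
quality = tp / (tp + fp + fn) # combined measure

print(f"completeness={completeness:.2%}, correctness={correctness:.2%}, "
      f"quality={quality:.2%}")
```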
Towards a Web-Based Handbook of Generic, Process-Oriented Learning Designs
ERIC Educational Resources Information Center
Marjanovic, Olivera
2005-01-01
Process-oriented learning designs are innovative learning activities that include a set of inter-related learning tasks and are generic (could be used across disciplines). An example includes a problem-solving process widely used in problem-based learning today. Most of the existing process-oriented learning designs are not documented, let alone…
Visemic Processing in Audiovisual Discrimination of Natural Speech: A Simultaneous fMRI-EEG Study
ERIC Educational Resources Information Center
Dubois, Cyril; Otzenberger, Helene; Gounot, Daniel; Sock, Rudolph; Metz-Lutz, Marie-Noelle
2012-01-01
In a noisy environment, visual perception of articulatory movements improves natural speech intelligibility. Parallel to phonemic processing based on auditory signal, visemic processing constitutes a counterpart based on "visemes", the distinctive visual units of speech. Aiming at investigating the neural substrates of visemic processing in a…
Process-Based Remediation of Decoding in Gifted LD Students: Three Case Studies.
ERIC Educational Resources Information Center
Crawford, Shawn; Snart, Fern
1994-01-01
Three gifted males (ages 10-13) with deficits in successive coding participated in a process-based remedial program which combined global training on tasks requiring successive processing and tasks applying successive processing to decoding in reading, and which utilized verbal mediation. Differences in student improvement were related to entry…
Object-based neglect in number processing
2013-01-01
Recent evidence suggests that neglect patients have particular problems representing relatively smaller numbers corresponding to the left part of the mental number line. However, while this indicates space-based neglect for representational number space, little is known about whether and, if so, how object-based neglect influences number processing. To evaluate influences of object-based neglect in numerical cognition, a group of neglect patients and two control groups had to compare two-digit numbers to an internally represented standard. Conceptualizing two-digit numbers as objects of which the left part (i.e., the tens digit) should be specifically neglected, we were able to evaluate object-based neglect for number magnitude processing. Object-based neglect was indicated by a larger unit-decade compatibility effect, actually reflecting impaired processing of the leftward tens digits. Additionally, faster processing of within- as compared to between-decade items provided further evidence suggesting particular difficulties in integrating tens and units into the place-value structure of the Arabic number system. In summary, the present study indicates that, in addition to the spatial representation of number magnitude, the processing of place-value information of multi-digit numbers also seems specifically impaired in neglect patients. PMID:23343126
Zhou, Li; Xu, Jin-Di; Zhou, Shan-Shan; Shen, Hong; Mao, Qian; Kong, Ming; Zou, Ye-Ting; Xu, Ya-Yun; Xu, Jun; Li, Song-Lin
2017-12-29
Exploring processing chemistry, in particular the chemical transformation mechanisms involved, is a key step to elucidate the scientific basis in traditional processing of herbal medicines. Previously, taking Rehmanniae Radix (RR) as a case study, the holistic chemome (secondary metabolome and glycome) difference between raw and processed RR was revealed by integrating hyphenated chromatographic techniques-based targeted glycomics and untargeted metabolomics. Nevertheless, the complex chemical transformation mechanisms underpinning the holistic chemome variation in RR processing remain to be extensively clarified. As a continuation of that work, a novel strategy combining chemomics-based marker compound mining and mimetic processing is proposed here for further exploring the chemical mechanisms involved in herbal processing. First, the differential marker compounds between raw and processed herbs were rapidly discovered by an untargeted chemomics-based mining approach through multivariate statistical analysis of the chemome data obtained by integrated metabolomics and glycomics analysis. Second, the marker compounds were mimetically processed under simulated physicochemical conditions as in herb processing, and the final reaction products were chemically characterized by a targeted chemomics-based mining approach. Third, the main chemical transformation mechanisms involved were clarified by linking up the original marker compounds and their mimetic processing products. Using this strategy, a set of differential marker compounds including saccharides, glycosides and furfurals in raw and processed RR was rapidly found, and the major chemical mechanisms involved in RR processing were elucidated as stepwise transformations of saccharides (polysaccharides, oligosaccharides and monosaccharides) and glycosides (iridoid glycosides and phenethylalcohol glycosides) into furfurals (glycosylated/non-glycosylated hydroxymethylfurfurals) by deglycosylation and/or dehydration. The research deliverables indicated that the proposed strategy could advance the understanding of RR processing chemistry, and therefore may be considered a promising approach for delving into the scientific basis in traditional processing of herbal medicines. Copyright © 2017 Elsevier B.V. All rights reserved.
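Since the marker-mining step rests on multivariate analysis of chemome peak tables, a minimal sketch follows; the simulated data and the use of PCA loadings as the ranking criterion are illustrative assumptions, not the study's exact workflow.

```python
# Rank candidate marker compounds separating raw from processed samples by
# their loadings on the first principal component of a peak-intensity table.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
raw = rng.normal(1.0, 0.1, (10, 50))        # 10 raw samples, 50 compounds
processed = rng.normal(1.0, 0.1, (10, 50))  # 10 processed samples
processed[:, :5] += 1.0                     # five compounds altered by processing

X = np.vstack([raw, processed])
pca = PCA(n_components=2).fit(X)
loadings = np.abs(pca.components_[0])       # PC1 separates the two groups here
markers = np.argsort(loadings)[::-1][:5]
print("candidate marker compounds (column indices):", sorted(markers))
```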
Business Process-Based Resource Importance Determination
NASA Astrophysics Data System (ADS)
Fenz, Stefan; Ekelhart, Andreas; Neubauer, Thomas
Information security risk management (ISRM) heavily depends on realistic impact values representing the resources’ importance in the overall organizational context. Although a variety of ISRM approaches have been proposed, well-founded methods that provide an answer to the following question are still missing: How can business processes be used to determine resources’ importance in the overall organizational context? We answer this question by measuring the actual importance level of resources based on business processes. Therefore, this paper presents our novel business process-based resource importance determination method, which provides ISRM with an efficient and powerful tool for deriving realistic resource importance figures solely from existing business processes. The conducted evaluation has shown that the calculation results of the developed method comply with the results gained in traditional workshop-based assessments.
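A minimal sketch of the underlying idea, assuming a process-to-resource mapping and per-process importance ratings; the aggregation rule (maximum) and all names and values below are hypothetical illustrations, not the paper's actual method:

```python
# Toy sketch: derive resource importance from the business processes that
# depend on each resource. Here a resource simply inherits the maximum
# importance of any process using it; real methods weight and propagate
# importance more carefully.

processes = {
    "order_fulfillment": {"importance": 0.9, "resources": {"erp_db", "mail_gw"}},
    "invoicing":         {"importance": 0.7, "resources": {"erp_db"}},
    "newsletter":        {"importance": 0.2, "resources": {"mail_gw"}},
}

def resource_importance(procs):
    scores = {}
    for proc in procs.values():
        for res in proc["resources"]:
            scores[res] = max(scores.get(res, 0.0), proc["importance"])
    return scores

print(resource_importance(processes))   # {'erp_db': 0.9, 'mail_gw': 0.9}
```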
Elastic facial movement influences part-based but not holistic processing
Xiao, Naiqi G.; Quinn, Paul C.; Ge, Liezhong; Lee, Kang
2013-01-01
Face processing has been studied for decades. However, most of the empirical investigations have been conducted using static face images as stimuli. Little is known about whether static face processing findings can be generalized to real world contexts, in which faces are constantly moving. The present study investigates the nature of face processing (holistic vs. part-based) in elastic moving faces. Specifically, we focus on whether elastic moving faces, as compared to static ones, can facilitate holistic or part-based face processing. Using the composite paradigm, participants were asked to remember either an elastic moving face (i.e., a face that blinks and chews) or a static face, and then tested with a static composite face. The composite effect was (1) significantly smaller in the dynamic condition than in the static condition, (2) consistently found with different face encoding times (Experiments 1–3), and (3) present for the recognition of both upper and lower face parts (Experiment 4). These results suggest that elastic facial motion facilitates part-based processing, rather than holistic processing. Thus, while previous work with static faces has emphasized an important role for holistic processing, the current work highlights an important role for featural processing with moving faces. PMID:23398253
Low-Cost Aqueous Coal Desulfurization
NASA Technical Reports Server (NTRS)
Kalvinskas, J. J.; Vasilakos, N.; Corcoran, W. H.; Grohmann, K.; Rohatgi, N. K.
1982-01-01
Water-based process for desulfurizing coal not only eliminates need for costly organic solvent but removes sulfur more effectively than an earlier solvent-based process. New process could provide low-cost commercial method for converting high-sulfur coal into environmentally acceptable fuel.
Quality management of manufacturing process based on manufacturing execution system
NASA Astrophysics Data System (ADS)
Zhang, Jian; Jiang, Yang; Jiang, Weizhuo
2017-04-01
Quality control elements in the manufacturing process are elaborated, and an approach to quality management of the manufacturing process based on a manufacturing execution system (MES) is discussed. Finally, the functions of an MES for a microcircuit production line are introduced.
Pe'er, Guy; Zurita, Gustavo A.; Schober, Lucia; Bellocq, Maria I.; Strer, Maximilian; Müller, Michael; Pütz, Sandro
2013-01-01
Landscape simulators are widely applied in landscape ecology for generating landscape patterns. These models can be divided into two categories: pattern-based models that generate spatial patterns irrespective of the processes that shape them, and process-based models that attempt to generate patterns based on the processes that shape them. The latter often tend toward complexity in an attempt to obtain high predictive precision, but are rarely used for generic or theoretical purposes. Here we show that a simple process-based simulator can generate a variety of spatial patterns including realistic ones, typifying landscapes fragmented by anthropogenic activities. The model “G-RaFFe” generates roads and fields to reproduce the processes in which forests are converted into arable lands. For a selected level of habitat cover, three factors dominate its outcomes: the number of roads (accessibility), maximum field size (accounting for land ownership patterns), and maximum field disconnection (which enables fields to be detached from roads). We compared the performance of G-RaFFe to three other models: Simmap (neutral model), Qrule (fractal-based) and Dinamica EGO (with 4 model versions differing in complexity). A PCA-based analysis indicated G-RaFFe and Dinamica version 4 (most complex) to perform best in matching realistic spatial patterns, but an alternative analysis which considers model variability identified G-RaFFe and Qrule as performing best. We also found model performance to be affected by habitat cover and the actual land-uses, the latter reflecting on land ownership patterns. We suggest that simple process-based generators such as G-RaFFe can be used to generate spatial patterns as templates for theoretical analyses, as well as for gaining better understanding of the relation between spatial processes and patterns. We suggest caution in applying neutral or fractal-based approaches, since spatial patterns that typify anthropogenic landscapes are often non-fractal in nature. PMID:23724108
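The road-and-field mechanism lends itself to a compact illustration. The following toy sketch (not the published G-RaFFe code) grows converted patches from roads on a grid until a target cover is reached; the three dominant parameters appear as n_roads, max_field, and, loosely, the growing seed pool that lets later fields start away from roads:

```python
import random

def generate_landscape(size=100, n_roads=3, max_field=200, target_cover=0.3, seed=1):
    """Toy road-and-field landscape sketch (0 = forest, 1 = converted)."""
    rng = random.Random(seed)
    grid = [[0] * size for _ in range(size)]
    seeds = []                        # converted cells from which fields may grow;
    for _ in range(n_roads):          # letting fields start from any converted
        r = rng.randrange(size)       # cell loosely mimics field disconnection
        for c in range(size):
            if grid[r][c] == 0:
                grid[r][c] = 1        # roads: straight horizontal lines
                seeds.append((r, c))
    converted = len(seeds)
    target = int(target_cover * size * size)
    while converted < target:                     # start a new field; a saturated
        frontier = [rng.choice(seeds)]            # seed simply triggers a restart
        budget = rng.randint(1, max_field)        # maximum field size
        while frontier and budget > 0 and converted < target:
            r, c = frontier.pop(rng.randrange(len(frontier)))
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < size and 0 <= nc < size and grid[nr][nc] == 0:
                    grid[nr][nc] = 1
                    seeds.append((nr, nc))
                    frontier.append((nr, nc))
                    converted += 1
                    budget -= 1
                    if budget == 0:
                        break
    return grid
```

Varying n_roads and max_field while holding target_cover fixed would let one explore, in miniature, how accessibility and field size shape the resulting pattern.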
NASA Astrophysics Data System (ADS)
Lu, Mark; Liang, Curtis; King, Dion; Melvin, Lawrence S., III
2005-11-01
Model-based optical proximity correction (OPC) has become an indispensable tool for achieving wafer pattern to design fidelity at current manufacturing process nodes. Most model-based OPC is performed considering the nominal process condition, with limited consideration of through-process manufacturing robustness. This study examines the use of off-target process models - models that represent non-nominal process states such as would occur with a dose or focus variation - to understand and manipulate the final pattern correction into a more process-robust configuration. The study will first examine and validate the procedure for generating an off-target model, then examine the quality of the off-target model. Once the off-target model is proven, it will be used to demonstrate methods of generating process-robust corrections. The concepts are demonstrated using a 0.13 μm logic gate process. Preliminary indications show success in both off-target model production and process-robust corrections. With these off-target models as tools, mask production cycle times can be reduced.
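To illustrate the use of off-target models, the sketch below selects a mask bias by minimizing the worst-case critical-dimension (CD) error over a set of process corners rather than at nominal only. The printed-CD function and every number in it are hypothetical stand-ins for a calibrated lithography model:

```python
# Toy sketch of process-robust correction selection with off-target models.
# printed_cd() is a hypothetical stand-in; the defocus term and all values
# are illustrative only.

def printed_cd(mask_bias_nm, dose_pct=100.0, defocus_um=0.0):
    """Hypothetical printed CD (nm) for a 130 nm line."""
    return 130.0 + 0.8 * mask_bias_nm + 0.5 * (dose_pct - 100.0) - 40.0 * defocus_um ** 2

corners = [(100.0, 0.0), (100.0, 0.2)]   # (dose %, defocus um): nominal + off-target
target = 130.0
biases = [b / 10.0 for b in range(-50, 51)]

nominal_best = min(biases, key=lambda b: abs(printed_cd(b) - target))
robust_best = min(biases, key=lambda b: max(abs(printed_cd(b, d, f) - target)
                                            for d, f in corners))
print(nominal_best, robust_best)   # 0.0 vs 1.0: robust choice pre-biases the mask
```

The robust choice hedges against the CD loss at defocus, which is the kind of process-robust correction the study targets.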
1992-12-21
[Front-matter fragments from a report on testing process models using trace-based protocol analysis (TBPA): reference fragments (O'Reilly, R. C. (1991), X3DNet: An X-Based Neural Network; Foundations of Artificial Intelligence, Cambridge, MA: MIT Press) and table-of-contents entries covering trace-based protocol analysis, tools related to process model testing, and requirements for testing process models using trace-based protocol analysis.]
Article and process for producing an article
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lacy, Benjamin Paul; Jacala, Ariel Caesar Prepena; Kottilingam, Srikanth Chandrudu
An article and a process of producing an article are provided. The article includes a base material, a cooling feature arrangement positioned on the base material, the cooling feature arrangement including an additive-structured material, and a cover material. The cooling feature arrangement is between the base material and the cover material. The process of producing the article includes manufacturing a cooling feature arrangement by an additive manufacturing technique, and then positioning the cooling feature arrangement between a base material and a cover material.
Vergauwe, Evie; Barrouillet, Pierre; Camos, Valérie
2009-07-01
Examinations of interference between visual and spatial materials in working memory have suggested domain- and process-based fractionations of visuo-spatial working memory. The present study examined the role of central time-based resource sharing in visuo-spatial working memory and assessed its role in obtained interference patterns. Visual and spatial storage were combined with both visual and spatial on-line processing components in computer-paced working memory span tasks (Experiment 1) and in a selective interference paradigm (Experiment 2). The cognitive load of the processing components was manipulated to investigate its impact on concurrent maintenance for both within-domain and between-domain combinations of processing and storage components. In contrast to both domain- and process-based fractionations of visuo-spatial working memory, the results revealed that recall performance was determined by the cognitive load induced by the processing of items, rather than by the domain to which those items pertained. These findings are interpreted as evidence for a time-based resource-sharing mechanism in visuo-spatial working memory.
[Definition and stabilization of processes II. Clinical Processes in a Urology Department].
Pascual, Carlos; Luján, Marcos; Mora, José Ramón; Diz, Manuel Ramón; Martín, Carlos; López, María Carmen
2015-01-01
New models of clinical management seek a clinical practice based on quality, efficacy and efficiency, avoiding variability and improvisation. In this paper we develop one of the most frequent clinical processes in our speciality, the process based on DRG 311, transurethral procedures without complications. We describe its components: the stabilization form, the clinical trajectory, cost calculation, and finally the process flowchart.
Zhang, Lei; Yue, Hong-Shui; Ju, Ai-Chun; Ye, Zheng-Liang
2016-10-01
Currently, near infrared spectroscopy (NIRS) is considered an efficient tool for achieving process analytical technology (PAT) in the manufacture of traditional Chinese medicine (TCM) products. In this article, the NIRS-based process analytical system for the production of salvianolic acid for injection is introduced. The design of the process analytical system is described in detail, including the selection of monitored processes and testing mode, and the potential risks that should be avoided. Moreover, the development of the related technologies is also presented, covering the establishment of monitoring methods for the elution steps of the polyamide resin and macroporous resin chromatography processes, as well as a rapid analysis method for finished products. Based on the authors' research and working experience, several issues in the application of NIRS to process monitoring and control in TCM production are then raised, and some potential solutions are discussed. The issues include building a technical team for the process analytical system, the design of the process analytical system in the manufacture of TCM products, standardization of the NIRS-based analytical methods, and improving the management of the process analytical system. Finally, the prospects for the application of NIRS in the TCM industry are put forward. Copyright© by the Chinese Pharmaceutical Association.
Hammerschmidt, Nikolaus; Tscheliessnig, Anne; Sommer, Ralf; Helk, Bernhard; Jungbauer, Alois
2014-06-01
Standard industry processes for recombinant antibody production employ protein A affinity chromatography in combination with other chromatography steps and ultra-/diafiltration. This study compares a generic antibody production process with a recently developed purification process based on a series of selective precipitation steps. The new process makes two of the usual three chromatographic steps obsolete and can be performed in a continuous fashion. Cost of Goods (CoGs) analyses were done for: (i) a generic chromatography-based antibody standard purification; (ii) the continuous precipitation-based purification process coupled to a continuous perfusion production system; and (iii) a hybrid process, coupling the continuous purification process to an upstream batch process. The results of this economic analysis show that the precipitation-based process offers cost reductions at all stages of the life cycle of a therapeutic antibody, (i.e. clinical phase I, II and III, as well as full commercial production). The savings in clinical phase production are largely attributed to the fact that expensive chromatographic resins are omitted. These economic analyses will help to determine the strategies that are best suited for small-scale production in parallel fashion, which is of importance for antibody production in non-privileged countries and for personalized medicine. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
A model of the hierarchy of behaviour, cognition, and consciousness.
Toates, Frederick
2006-03-01
Processes comparable in important respects to those underlying human conscious and non-conscious processing can be identified in a range of species, and it is argued that these reflect evolutionary precursors of the human processes. A distinction is drawn between two types of processing: (1) stimulus-based and (2) higher-order. In 'higher-order' processing in humans, the operations of processing are themselves associated with conscious awareness. Conscious awareness sets the context for stimulus-based processing, and its end-point is accessible to conscious awareness; however, the mechanics of the translation between stimulus and response proceed without conscious control. The paper argues that higher-order processing is an evolutionary addition to stimulus-based processing. The model's value is shown for gaining insight into a range of phenomena and their link with consciousness. These include brain damage, learning, memory, development, vision, emotion, motor control, reasoning, the voluntary versus involuntary debate, and mental disorder.
An assembly process model based on object-oriented hierarchical time Petri Nets
NASA Astrophysics Data System (ADS)
Wang, Jiapeng; Liu, Shaoli; Liu, Jianhua; Du, Zenghui
2017-04-01
In order to improve the versatility, accuracy and integrity of assembly process models of complex products, an assembly process model based on object-oriented hierarchical time Petri Nets is presented. A complete assembly process information model including assembly resources, assembly inspection, time, structure and flexible parts is established, and this model describes the static and dynamic data involved in the assembly process. Through the analysis of three-dimensional assembly process information, the assembly information is hierarchically divided from the whole, through the local, to the details, and subnet models of object-oriented Petri Nets are established at each level. The communication problem between Petri subnets is solved by using a message database, which effectively reduces the complexity of system modeling. Finally, the modeling process is presented, and a five-layer Petri Net model is established based on the hoisting process of the engine compartment of a wheeled armored vehicle.
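A minimal timed Petri net, the building block of such subnet models, can be sketched as follows; the hierarchy, object orientation, and message database of the paper's model are omitted, and the two-transition assembly example is hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Transition:
    name: str
    inputs: dict           # place -> tokens consumed
    outputs: dict          # place -> tokens produced
    duration: float = 0.0  # firing time (the "time" in a timed Petri net)

@dataclass
class PetriNet:
    marking: dict                          # place -> current token count
    transitions: list = field(default_factory=list)
    clock: float = 0.0

    def enabled(self, t):
        return all(self.marking.get(p, 0) >= n for p, n in t.inputs.items())

    def fire(self, t):
        assert self.enabled(t), f"{t.name} not enabled"
        for p, n in t.inputs.items():
            self.marking[p] -= n
        for p, n in t.outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n
        self.clock += t.duration

# Hypothetical two-step assembly: mount a component, then inspect it.
net = PetriNet(marking={"parts": 1, "crane_free": 1})
mount = Transition("mount", {"parts": 1, "crane_free": 1},
                   {"mounted": 1, "crane_free": 1}, duration=45.0)
inspect = Transition("inspect", {"mounted": 1}, {"accepted": 1}, duration=10.0)
net.fire(mount); net.fire(inspect)
print(net.marking, net.clock)   # tokens after assembly, elapsed time 55.0
```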
NASA Astrophysics Data System (ADS)
Musil, Juergen; Schweda, Angelika; Winkler, Dietmar; Biffl, Stefan
Based on our observations of Austrian video game software development (VGSD) practices, we identified a lack of systematic process/method support and inefficient collaboration between the various disciplines involved, i.e. engineers and artists. VGSD includes heterogeneous disciplines, e.g. creative arts, game/content design, and software. Nevertheless, improving team collaboration and process support is an ongoing challenge in enabling a comprehensive view of game development projects. Lessons learned from software engineering practices can help game developers improve game development processes within a heterogeneous environment. Based on a state-of-the-practice survey in the Austrian games industry, this paper presents (a) first results with a focus on process/method support and (b) a candidate flexible process approach based on Scrum to improve VGSD and team collaboration. Results showed (a) a trend toward highly flexible software processes involving various disciplines and (b) that the suggested flexible process approach is feasible and useful for project application.
On the Risk Management and Auditing of SOA Based Business Processes
NASA Astrophysics Data System (ADS)
Orriens, Bart; Heuvel, Willem-Jan V./D.; Papazoglou, Mike
SOA-enabled business processes stretch across many cooperating and coordinated systems, possibly crossing organizational boundaries, and technologies like XML and Web services are used to make system-to-system interactions commonplace. Business processes form the foundation of all organizations and, as such, are impacted by industry regulations. This requires organizations to review their business processes and ensure that they meet the compliance standards set forth in legislation. In this paper we sketch a SOA-based service risk management and auditing methodology, including a compliance enforcement and verification system, that assures verifiable business process compliance. This is done on the basis of a knowledge-based system that allows integration of internal control systems into business processes in conformance with pre-defined compliance rules, monitors both the normal process behavior and that of the control systems during process execution, and logs these behaviors to facilitate retrospective auditing.
Complex Event Processing for Content-Based Text, Image, and Video Retrieval
2016-06-01
Process for strengthening silicon based ceramics
Kim, Hyoun-Ee; Moorhead, A. J.
1993-01-01
A process for strengthening silicon-based ceramic monolithic materials and composite materials that contain silicon-based ceramic reinforcing phases, which requires that the ceramic be exposed to a wet hydrogen atmosphere at about 1400 °C. The process results in a dense, tightly adherent silicon-containing oxide layer that heals, blunts, or otherwise negates the detrimental effect of strength-limiting flaws on the surface of the ceramic body.
ERIC Educational Resources Information Center
Briddell, Andrew
2013-01-01
This study of 1,974 fifth grade students investigated potential relationships between writing process-based instruction practices and higher-order thinking measured by a standardized literacy assessment. Writing process is defined as a highly complex, socio-cognitive process that includes: planning, text production, review, metacognition, writing…
The Problem-Based Learning Process: Reflections of Pre-Service Elementary School Teachers
ERIC Educational Resources Information Center
Baysal, Zeliha Nurdan
2017-01-01
This study aims to identify the benefits acquired by third-year pre-service elementary school teachers participating in a problem-based learning process in social studies education, the issues they encountered in that process and those they are likely to encounter, and their feelings about the process. Semi-structured interviews were used as one…
A cost-effective line-based light-balancing technique using adaptive processing.
Hsia, Shih-Chang; Chen, Ming-Huei; Chen, Yu-Min
2006-09-01
Camera imaging systems are widely used; however, the displayed image often exhibits an uneven light distribution. This paper presents novel light-balancing techniques to compensate for uneven illumination based on adaptive signal processing. For text image processing, we first estimate the background level and then process each pixel with a nonuniform gain. This algorithm can balance the light distribution while keeping high contrast in the image. For graphic image processing, adaptive section control using a piecewise nonlinear gain is proposed to equalize the histogram. Simulations show that the light-balancing performance is better than that of other methods. Moreover, we employ line-based processing to efficiently reduce the memory requirement and computational cost, making the approach applicable to real-time systems.
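A simplified line-based analogue of the text-image branch might look like the following sketch: the background level of each line is estimated from a high percentile, and a per-line gain pulls it toward a target level. The percentile and target values are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

def balance_lines(img, target=200.0, pct=90):
    """Line-based light-balancing sketch: estimate each line's background as a
    high percentile of its values, then apply a per-line gain toward a target
    level. (The paper's algorithm is pixel-adaptive; this is a simplified
    line-wise analogue that keeps only one line in working memory at a time.)"""
    img = img.astype(np.float64)
    out = np.empty_like(img)
    for i, line in enumerate(img):                       # one line at a time
        background = max(np.percentile(line, pct), 1.0)  # avoid divide-by-zero
        out[i] = line * (target / background)            # nonuniform gain
    return np.clip(out, 0, 255).astype(np.uint8)
```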
NASA Astrophysics Data System (ADS)
Basak, Amrita; Acharya, Ranadip; Das, Suman
2016-08-01
This paper focuses on additive manufacturing (AM) of single-crystal (SX) nickel-based superalloy CMSX-4 through scanning laser epitaxy (SLE). SLE, a powder bed fusion-based AM process, was explored for the purpose of producing crack-free, dense deposits of CMSX-4 on top of similar-chemistry investment-cast substrates. Optical microscopy and scanning electron microscopy (SEM) investigations revealed the presence of dendritic microstructures that consisted of fine γ' precipitates within the γ matrix in the deposit region. Computational fluid dynamics (CFD)-based process modeling, statistical design of experiments (DoE), and microstructural characterization techniques were combined to produce metallurgically bonded single-crystal deposits of more than 500 μm height in a single pass along the entire length of the substrate. A customized quantitative metallography-based image analysis technique was employed for automatic extraction of various deposit quality metrics from the digital cross-sectional micrographs. The processing parameters were varied, and optimal processing windows were identified to obtain good-quality deposits. The results reported here represent one of the few successes obtained in producing single-crystal epitaxial deposits through a powder bed fusion-based metal AM process and thus demonstrate the potential of SLE to repair and manufacture single-crystal hot-section components of gas turbine systems from nickel-based superalloy powders.
Quality data collection and management technology of aerospace complex product assembly process
NASA Astrophysics Data System (ADS)
Weng, Gang; Liu, Jianhua; He, Yongxi; Zhuang, Cunbo
2017-04-01
Aiming at the problems of difficult management and poor traceability of discrete assembly process quality data, a data collection and management method is proposed that takes the assembly process and the BOM as its core. The method includes workflow-based data collection, a BOM-based data model, and quality traceability for the assembly process. Finally, an assembly process quality data management system is developed, realizing effective control and management of quality information for the complex product assembly process.
McCaughey, Deirdre; Bruning, Nealia S
2010-05-26
Current healthcare systems have extended the evidence-based medicine (EBM) approach to health policy and delivery decisions, such as access-to-care, healthcare funding and health program continuance, through attempts to integrate valid and reliable evidence into the decision making process. These policy decisions have major impacts on society and have high personal and financial costs associated with those decisions. Decision models such as these function under a shared assumption of rational choice and utility maximization in the decision-making process. We contend that health policy decision makers are generally unable to attain the basic goals of evidence-based decision making (EBDM) and evidence-based policy making (EBPM) because humans make decisions with their naturally limited, faulty, and biased decision-making processes. A cognitive information processing framework is presented to support this argument, and subtle cognitive processing mechanisms are introduced to support the focal thesis: health policy makers' decisions are influenced by the subjective manner in which they individually process decision-relevant information rather than on the objective merits of the evidence alone. As such, subsequent health policy decisions do not necessarily achieve the goals of evidence-based policy making, such as maximizing health outcomes for society based on valid and reliable research evidence. In this era of increasing adoption of evidence-based healthcare models, the rational choice, utility maximizing assumptions in EBDM and EBPM, must be critically evaluated to ensure effective and high-quality health policy decisions. The cognitive information processing framework presented here will aid health policy decision makers by identifying how their decisions might be subtly influenced by non-rational factors. In this paper, we identify some of the biases and potential intervention points and provide some initial suggestions about how the EBDM/EBPM process can be improved. PMID:20504357
Randomized evaluation of a web based interview process for urology resident selection.
Shah, Satyan K; Arora, Sanjeev; Skipper, Betty; Kalishman, Summers; Timm, T Craig; Smith, Anthony Y
2012-04-01
We determined whether a web based interview process for resident selection could effectively replace the traditional on-site interview. For the 2010 to 2011 match cycle, applicants to the University of New Mexico urology residency program were randomized to participate in a web based interview process via Skype or a traditional on-site interview process. Both methods included interviews with the faculty, a tour of facilities and the opportunity to ask current residents any questions. To maintain fairness the applicants were then reinterviewed via the opposite process several weeks later. We assessed comparative effectiveness, cost, convenience and satisfaction using anonymous surveys largely scored on a 5-point Likert scale. Of 39 total participants (33 applicants and 6 faculty) 95% completed the surveys. The web based interview was less costly to applicants (mean $171 vs $364, p=0.05) and required less time away from school (10% missing 1 or more days vs 30%, p=0.04) compared to traditional on-site interview. However, applicants perceived the web based interview process as less effective than traditional on-site interview, with a mean 6-item summative effectiveness score of 21.3 vs 25.6 (p=0.003). Applicants and faculty favored continuing the web based interview process in the future as an adjunct to on-site interviews. Residency interviews can be successfully conducted via the Internet. The web based interview process reduced costs and improved convenience. The findings of this study support the use of videoconferencing as an adjunct to traditional interview methods rather than as a replacement. Copyright © 2012 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
Zhang, Jing; Liang, Lichen; Anderson, Jon R; Gatewood, Lael; Rottenberg, David A; Strother, Stephen C
2008-01-01
As functional magnetic resonance imaging (fMRI) becomes widely used, the demand for evaluation of fMRI processing pipelines and validation of fMRI analysis results is increasing rapidly. The current NPAIRS package, an IDL-based fMRI processing pipeline evaluation framework, lacks system interoperability and the ability to evaluate general linear model (GLM)-based pipelines using prediction metrics. Thus, it cannot fully evaluate fMRI analytical software modules such as FSL.FEAT and NPAIRS.GLM. In order to overcome these limitations, a Java-based fMRI processing pipeline evaluation system was developed. It integrated YALE (a machine learning environment) into Fiswidgets (an fMRI software environment) to obtain system interoperability and applied an algorithm to measure GLM prediction accuracy. The results demonstrated that the system can evaluate fMRI processing pipelines with univariate GLM and multivariate canonical variates analysis (CVA)-based models on real fMRI data based on prediction accuracy (classification accuracy) and statistical parametric image (SPI) reproducibility. In addition, a preliminary study was performed in which four fMRI processing pipelines with GLM and CVA modules such as FSL.FEAT and NPAIRS.CVA were evaluated with the system. The results indicated that (1) the system can compare different fMRI processing pipelines with heterogeneous models (NPAIRS.GLM, NPAIRS.CVA and FSL.FEAT) and rank their performance by automatic performance scoring, and (2) the rank of pipeline performance is highly dependent on the preprocessing operations. These results suggest that the system will be of value for the comparison, validation, standardization and optimization of functional neuroimaging software packages and fMRI processing pipelines.
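The two evaluation metrics can be sketched compactly. The functions below are illustrative stand-ins, not the NPAIRS code, and the combined score is a toy summary (NPAIRS plots prediction against reproducibility rather than averaging them):

```python
import numpy as np

def reproducibility(spi_a, spi_b):
    """Split-half SPI reproducibility: Pearson correlation between the two
    statistical parametric images computed on independent data halves."""
    return np.corrcoef(np.ravel(spi_a), np.ravel(spi_b))[0, 1]

def prediction_accuracy(true_labels, predicted_labels):
    """Fraction of scans whose experimental condition is predicted correctly."""
    t, p = np.asarray(true_labels), np.asarray(predicted_labels)
    return float(np.mean(t == p))

# Toy demonstration with synthetic SPIs and labels:
spi_a = np.random.default_rng(1).normal(size=(8, 8))
spi_b = spi_a + 0.3 * np.random.default_rng(2).normal(size=(8, 8))
print(round(reproducibility(spi_a, spi_b), 2),
      prediction_accuracy([0, 1, 1, 0], [0, 1, 0, 0]))   # e.g. 0.96 0.75
```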
Cai, Bin; Altman, Michael B; Garcia-Ramirez, Jose; LaBrash, Jason; Goddu, S Murty; Mutic, Sasa; Parikh, Parag J; Olsen, Jeffrey R; Saad, Nael; Zoberi, Jacqueline E
To develop a safe and robust workflow for yttrium-90 (Y-90) radioembolization procedures in a multidisciplinary team environment. A generalized Define-Measure-Analyze-Improve-Control (DMAIC)-based approach to process improvement was applied to a Y-90 radioembolization workflow. In the first DMAIC cycle, events with the Y-90 workflow were defined and analyzed. To improve the workflow, a web-based interactive electronic white board (EWB) system was adopted as the central communication platform and information processing hub. The EWB-based Y-90 workflow then underwent a second DMAIC cycle. Out of 245 treatments, three misses that went undetected until treatment initiation were recorded over a period of 21 months, and root-cause analysis was performed to determine the causes of each incident and opportunities for improvement. The EWB-based Y-90 process was further improved via new rules to define reliable sources of information as inputs into the planning process, as well as new check points to ensure this information was communicated correctly throughout the process flow. After implementation of the revised EWB-based Y-90 workflow, after two DMAIC-like cycles, there were zero misses out of 153 patient treatments in 1 year. The DMAIC-based approach adopted here allowed the iterative development of a robust workflow to achieve an adaptable, event-minimizing planning process despite a complex setting which requires the participation of multiple teams for Y-90 microspheres therapy. Implementation of such a workflow using the EWB or similar platform with a DMAIC-based process improvement approach could be expanded to other treatment procedures, especially those requiring multidisciplinary management. Copyright © 2016 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
Production of orthophosphate suspension fertilizers from wet-process acid
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, T.M.; Burnell, J.R.
1984-01-01
For many years, the Tennessee Valley Authority (TVA) has worked toward development of suspension fertilizers. TVA has two plants for production of base suspension fertilizers from wet-process orthophosphoric acid. One is a demonstration-scale plant where a 13-38-0 grade base suspension is produced by a three-stage ammoniation process. The other is a new batch-type pilot plant which is capable of producing high-grade base suspensions of various ratios and grades from wet-process acid. In this batch plant, suspensions and solutions can also be produced from solid intermediates.
Jeong, Kyeong-Min; Kim, Hee-Seung; Hong, Sung-In; Lee, Sung-Keun; Jo, Na-Young; Kim, Yong-Soo; Lim, Hong-Gi; Park, Jae-Hyeung
2012-10-08
Speed enhancement of integral-imaging-based incoherent Fourier hologram capture using a graphics processing unit is reported. The integral-imaging-based method enables exact hologram capture of real-existing three-dimensional objects under regular incoherent illumination. In our implementation, we apply a parallel computation scheme using the graphics processing unit, accelerating the processing speed. Using the enhanced speed of hologram capture, we also implement a pseudo real-time hologram capture and optical reconstruction system. The overall operation speed is measured to be 1 frame per second.
A continuous dual-process model of remember/know judgments.
Wixted, John T; Mickes, Laura
2010-10-01
The dual-process theory of recognition memory holds that recognition decisions can be based on recollection or familiarity, and the remember/know procedure is widely used to investigate those 2 processes. Dual-process theory in general and the remember/know procedure in particular have been challenged by an alternative strength-based interpretation grounded in signal-detection theory, which holds that remember judgments simply reflect stronger memories than do know judgments. Although supported by a considerable body of research, the signal-detection account is difficult to reconcile with G. Mandler's (1980) classic "butcher-on-the-bus" phenomenon (i.e., strong, familiarity-based recognition). In this article, a new signal-detection model is proposed that does not deny either the validity of dual-process theory or the possibility that remember/know judgments can, when used in the right way, help to distinguish memories that are largely recollection based from those that are largely familiarity based. It does, however, agree with all prior signal-detection-based critiques of the remember/know procedure, which hold that, as it is ordinarily used, the procedure mainly distinguishes strong memories from weak memories (not recollection from familiarity).
Gan, Yu-Yan; Zhou, Si-Li; Dai, Xiao; Wu, Han; Xiong, Zi-Yao; Qin, Yuan-Hang; Ma, Jiayu; Yang, Li; Wu, Zai-Kun; Wang, Tie-Lin; Wang, Wei-Guo; Wang, Cun-Wen
2018-06-15
Fenton-based processes with four different iron salts in two different dosing modes were used to pretreat rice straw (RS) samples to increase their enzymatic digestibility. The composition analysis shows that the RS sample pretreated by the dosing mode of iron salt added into H2O2 has a much lower hemicellulose content than that pretreated by the dosing mode of H2O2 added into iron salt, and the RS sample pretreated by the chloride salt-based Fenton process has a much lower lignin content and a slightly lower hemicellulose content than that pretreated by the sulphate salt-based Fenton process. The higher concentration of reducing sugar observed for the RS sample with lower lignin and hemicellulose contents indicates that the Fenton-based process could enhance the enzymatic hydrolysis of RS by removing hemicellulose and lignin and increasing its accessibility to cellulase. FeCl3·6H2O added into H2O2 is the most efficient Fenton-based process for RS pretreatment. Copyright © 2018 Elsevier Ltd. All rights reserved.
Service-based analysis of biological pathways
Zheng, George; Bouguettaya, Athman
2009-01-01
Background Computer-based pathway discovery is concerned with two important objectives: pathway identification and analysis. Conventional mining and modeling approaches aimed at pathway discovery are often effective at achieving either objective, but not both. Such limitations can be effectively tackled by leveraging a Web service-based modeling and mining approach. Results Inspired by molecular recognition and drug discovery processes, we developed a Web service mining tool, named PathExplorer, to discover potentially interesting biological pathways linking service models of biological processes. The tool uses an innovative approach to identify useful pathways based on graph-based hints and service-based simulation verifying the user's hypotheses. Conclusion Web service modeling of biological processes allows the easy access and invocation of these processes on the Web. The Web service mining techniques described in this paper enable the discovery of biological pathways linking these process service models. Algorithms presented in this paper for automatically highlighting interesting subgraphs within an identified pathway network enable the user to formulate hypotheses, which can be tested using our simulation algorithm, also described in this paper. PMID:19796403
Conceptual design of distillation-based hybrid separation processes.
Skiborowski, Mirko; Harwardt, Andreas; Marquardt, Wolfgang
2013-01-01
Hybrid separation processes combine different separation principles and constitute a promising design option for the separation of complex mixtures. Particularly, the integration of distillation with other unit operations can significantly improve the separation of close-boiling or azeotropic mixtures. Although the design of single-unit operations is well understood and supported by computational methods, the optimal design of flowsheets of hybrid separation processes is still a challenging task. The large number of operational and design degrees of freedom requires a systematic and optimization-based design approach. To this end, a structured approach, the so-called process synthesis framework, is proposed. This article reviews available computational methods for the conceptual design of distillation-based hybrid processes for the separation of liquid mixtures. Open problems are identified that must be addressed to finally establish a structured process synthesis framework for such processes.
NASA Astrophysics Data System (ADS)
Zhu, Ming; Liu, Tingting; Wang, Shu; Zhang, Kesheng
2017-08-01
Existing two-frequency reconstructive methods can only capture primary (single) molecular relaxation processes in excitable gases. In this paper, we present a reconstructive method based on the novel decomposition of frequency-dependent acoustic relaxation spectra to capture the entire molecular multimode relaxation process. This decomposition of acoustic relaxation spectra is developed from the frequency-dependent effective specific heat, indicating that a multi-relaxation process is the sum of the interior single-relaxation processes. Based on this decomposition, we can reconstruct the entire multi-relaxation process by capturing the relaxation times and relaxation strengths of N interior single-relaxation processes, using the measurements of acoustic absorption and sound speed at 2N frequencies. Experimental data for the gas mixtures CO2-N2 and CO2-O2 validate our decomposition and reconstruction approach.
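In schematic form (our notation, assuming Debye-type single-relaxation contributions as in the standard effective-specific-heat treatment), the decomposition the method relies on can be written as:

```latex
% Effective isochoric specific heat of an N-mode relaxing gas, decomposed
% into single-relaxation (Debye-type) contributions:
c_v^{\mathrm{eff}}(\omega) \,=\, c_v^{\infty} \,+\, \sum_{i=1}^{N} \frac{c_i}{1 + j\omega\tau_i}
```

Measuring acoustic absorption and sound speed at 2N frequencies then provides enough constraints, in principle, to recover the N relaxation strengths c_i and relaxation times τ_i, which is the reconstruction step described above.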
NASA Technical Reports Server (NTRS)
Bao, Han P.; Samareh, J. A.
2000-01-01
The primary objective of this paper is to demonstrate the use of process-based manufacturing and assembly cost models in a traditional performance-focused multidisciplinary design and optimization process. The use of automated cost-performance analysis is an enabling technology that could bring realistic process-based manufacturing and assembly cost into multidisciplinary design and optimization. In this paper, we present a new methodology for incorporating process costing into a standard multidisciplinary design optimization process. Material, manufacturing process, and assembly process costs could then be used as the objective function for the optimization method. A case study involving forty-six different configurations of a simple wing is presented, indicating that a design based on performance criteria alone may not necessarily be the most affordable as far as manufacturing and assembly cost is concerned.
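The flavor of cost-as-objective optimization can be conveyed with a toy one-variable example; both models below are hypothetical, chosen only so that the performance constraint is active at the optimum:

```python
from scipy.optimize import minimize_scalar

def cost(t):        # hypothetical cost model for skin thickness t (mm):
    return 120.0 * t + 900.0 / t   # material cost grows with t, forming cost falls

def stiffness(t):   # hypothetical stiffness model; requirement: >= 50 units
    return 14.0 * t

t_min = 50.0 / 14.0                          # thinnest design meeting the constraint
res = minimize_scalar(cost, bounds=(t_min, 20.0), method="bounded")
print(round(res.x, 2), round(cost(res.x), 1))  # constraint-active, cost-optimal design
```

The point of the pattern, as in the paper, is that cost replaces the usual performance metric as the objective while performance enters as a constraint.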
Image processing system design for microcantilever-based optical readout infrared arrays
NASA Astrophysics Data System (ADS)
Tong, Qiang; Dong, Liquan; Zhao, Yuejin; Gong, Cheng; Liu, Xiaohua; Yu, Xiaomei; Yang, Lei; Liu, Weiyu
2012-12-01
Compared with traditional infrared imaging technology, the new type of optical-readout uncooled infrared imaging technology based on MEMS has many advantages, such as low cost, small size, and simple fabrication. In addition, theoretical analysis indicates that the technology offers high thermal detection sensitivity. It therefore has very broad application prospects in the field of high-performance infrared detection. This paper mainly focuses on an image capturing and processing system for this new type of optical-readout uncooled infrared imaging technology based on MEMS. The image capturing and processing system consists of software and hardware. We build our image processing core hardware platform on TI's high-performance DSP chip, the TMS320DM642, and then design our image capturing board based on the MT9P031, Micron's high-frame-rate, low-power-consumption CMOS sensor. Finally, we use Intel's LXT971A network transceiver to design the network output board. The software system is built on the real-time operating system DSP/BIOS. We design our video capture driver based on TI's class-mini driver model and our network output program based on the NDK kit for image capturing, processing, and transmitting. Experiments show that the system offers high capture resolution and fast processing speed. The network transmission speed is up to 100 Mbps.
Digital Signal Processing Based Biotelemetry Receivers
NASA Technical Reports Server (NTRS)
Singh, Avtar; Hines, John; Somps, Chris
1997-01-01
This is an attempt to develop a biotelemetry receiver using digital signal processing technology and techniques. The receiver developed in this work is based on recovering signals that have been encoded using either the Pulse Position Modulation (PPM) or Pulse Code Modulation (PCM) technique. A prototype has been developed using state-of-the-art digital signal processing technology. A Printed Circuit Board (PCB) is being developed based on the technique and technology described here. This board is intended to be used in the UCSF Fetal Monitoring system developed at NASA. The board is capable of handling a variety of PPM- and PCM-encoded signals such as ECG, temperature, and pressure. A signal processing program has also been developed to analyze the received ECG signal to determine heart rate. This system provides a base for using digital signal processing in biotelemetry receivers and other similar applications.
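For the ECG heart-rate step, a toy version of the kind of computation described (not the NASA software) might look like this: detect R-peaks by thresholding with a refractory period, then convert the mean R-R interval to beats per minute. The threshold, refractory period, and test signal are illustrative assumptions:

```python
import numpy as np

def heart_rate_bpm(ecg, fs, refractory_s=0.25):
    """Toy R-peak detector: threshold at 60% of the maximum, enforce a
    refractory period, and convert the mean R-R interval to BPM."""
    ecg = np.asarray(ecg, dtype=float)
    thresh = 0.6 * ecg.max()
    min_gap = int(refractory_s * fs)
    peaks, last = [], -min_gap
    for i in range(1, len(ecg) - 1):
        if (ecg[i] >= thresh and ecg[i] >= ecg[i - 1]
                and ecg[i] > ecg[i + 1] and i - last >= min_gap):
            peaks.append(i)
            last = i
    rr = np.diff(peaks) / fs                   # R-R intervals in seconds
    return 60.0 / rr.mean() if len(rr) else float("nan")

# Synthetic test: 1 Hz spikes sampled at 250 Hz -> about 60 BPM.
fs = 250
ecg = np.zeros(10 * fs); ecg[::fs] = 1.0
print(round(heart_rate_bpm(ecg, fs)))          # 60
```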
Individual Differences in Base Rate Neglect: A Fuzzy Processing Preference Index
Wolfe, Christopher R.; Fisher, Christopher R.
2013-01-01
Little is known about individual differences in integrating numeric base-rates and qualitative text in making probability judgments. Fuzzy-Trace Theory predicts a preference for fuzzy processing. We conducted six studies to develop the FPPI, a reliable and valid instrument assessing individual differences in this fuzzy processing preference. It consists of 19 probability estimation items plus 4 "M-Scale" items that distinguish simple pattern matching from "base rate respect." Cronbach's alpha was consistently above 0.90. Validity is suggested by significant correlations between FPPI scores and three other measures: "rule-based" Process Dissociation Procedure scores; the number of conjunction fallacies in joint probability estimation; and logic index scores on syllogistic reasoning. Replicating norms collected in a university study with a web-based study produced negligible differences in FPPI scores, indicating robustness. The predicted relationships between individual differences in base rate respect and both conjunction fallacies and syllogistic reasoning were partially replicated in two web-based studies. PMID:23935255
A KPI framework for process-based benchmarking of hospital information systems.
Jahn, Franziska; Winter, Alfred
2011-01-01
Benchmarking is a major topic for monitoring, directing and elucidating the performance of hospital information systems (HIS). Current approaches neglect the outcome of the processes that are supported by the HIS and their contribution to the hospital's strategic goals. We suggest benchmarking HIS based on clinical documentation processes and their outcomes. A framework consisting of a general process model and outcome criteria for clinical documentation processes is introduced.
Predictors of Processing-Based Task Performance in Bilingual and Monolingual Children
Buac, Milijana; Gross, Megan; Kaushanskaya, Margarita
2016-01-01
In the present study we examined performance of bilingual Spanish-English-speaking and monolingual English-speaking school-age children on a range of processing-based measures within the framework of Baddeley’s working memory model. The processing-based measures included measures of short-term memory, measures of working memory, and a novel word-learning task. Results revealed that monolinguals outperformed bilinguals on the short-term memory tasks but not the working memory and novel word-learning tasks. Further, children’s vocabulary skills and socioeconomic status (SES) were more predictive of processing-based task performance in the bilingual group than the monolingual group. Together, these findings indicate that processing-based tasks that engage verbal working memory rather than short-term memory may be better-suited for diagnostic purposes with bilingual children. However, even verbal working memory measures are sensitive to bilingual children’s language-specific knowledge and demographic characteristics, and therefore may have limited clinical utility. PMID:27179914
NASA Astrophysics Data System (ADS)
Luqman, M.; Rosli, M. U.; Khor, C. Y.; Zambree, Shayfull; Jahidi, H.
2018-03-01
The crank arm is one of the important parts of a bicycle and an expensive product due to the high cost of material and production. This research aims to investigate potential manufacturing processes for fabricating a composite bicycle crank arm and to describe an approach based on the analytical hierarchy process (AHP) that assists decision makers or manufacturing engineers in determining the most suitable process at the early stage of product development, so as to reduce production cost. Four types of processes were considered, namely resin transfer molding (RTM), compression molding (CM), vacuum bag molding and filament winding (FW). The analysis ranks these four processes for their suitability in manufacturing the bicycle crank arm based on five main selection factors and 10 sub-factors. The most suitable process was determined by following the AHP steps, and a consistency test was performed to ensure that the judgements remained consistent during the pairwise comparisons. The results indicated that compression molding was the most appropriate manufacturing process because it obtained the highest priority value (33.6%) among the four manufacturing processes.
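The AHP machinery itself is compact enough to sketch: priority weights come from the principal eigenvector of a pairwise comparison matrix, and Saaty's consistency ratio checks the judgements. The 4x4 matrix below is an illustrative stand-in, not the study's actual judgments:

```python
import numpy as np

def ahp_weights(A):
    """Priority weights and consistency ratio for a reciprocal AHP matrix."""
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)                  # principal eigenvalue
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                              # normalized priority weights
    n = A.shape[0]
    ci = (vals[k].real - n) / (n - 1)         # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]       # Saaty's random index
    return w, ci / ri                         # weights, consistency ratio

A = np.array([[1,   1/3, 2,   1/2],           # hypothetical pairwise judgments
              [3,   1,   4,   2  ],           # for RTM, CM, VBM, FW
              [1/2, 1/4, 1,   1/3],
              [2,   1/2, 3,   1  ]], dtype=float)
w, cr = ahp_weights(A)
print(np.round(w, 3), round(cr, 3))           # CR < 0.1 => acceptably consistent
```

With consistent judgements, the weight vector ranks the candidate processes; in the study, compression molding received the highest priority.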
Software-Based Real-Time Acquisition and Processing of PET Detector Raw Data.
Goldschmidt, Benjamin; Schug, David; Lerche, Christoph W; Salomon, André; Gebhardt, Pierre; Weissler, Bjoern; Wehner, Jakob; Dueppenbecker, Peter M; Kiessling, Fabian; Schulz, Volkmar
2016-02-01
In modern positron emission tomography (PET) readout architectures, the position and energy estimation of scintillation events (singles) and the detection of coincident events (coincidences) are typically carried out on highly integrated, programmable printed circuit boards. The implementation of advanced singles and coincidence processing (SCP) algorithms for these architectures is often limited by the strict constraints of hardware-based data processing. In this paper, we present a software-based data acquisition and processing architecture (DAPA) that offers a high degree of flexibility for advanced SCP algorithms through relaxed real-time constraints and an easily extendible data processing framework. The DAPA is designed to acquire detector raw data from independent (but synchronized) detector modules and process the data for singles and coincidences in real time using a center-of-gravity (COG)-based, a least-squares (LS)-based, or a maximum-likelihood (ML)-based crystal position and energy estimation approach (CPEEA). To test the DAPA, we adapted it to a preclinical PET detector that outputs detector raw data from 60 independent digital silicon photomultiplier (dSiPM)-based detector stacks and evaluated it with an [18F]-fluorodeoxyglucose-filled hot-rod phantom. The DAPA is highly reliable, with less than 0.1% of all detector raw data lost or corrupted. For high validation thresholds (37.1 ± 12.8 photons per pixel) of the dSiPM detector tiles, the DAPA is real-time capable up to 55 MBq for the COG-based CPEEA, up to 31 MBq for the LS-based CPEEA, and up to 28 MBq for the ML-based CPEEA. Compared to the COG-based CPEEA, the rods in the image reconstruction of the hot-rod phantom are only slightly better separable and less blurred for the LS- and ML-based CPEEA. While the coincidence time resolution (∼500 ps) and energy resolution (∼12.3%) are comparable for all three CPEEA, the system sensitivity is up to 2.5× higher for the LS- and ML-based CPEEA.
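The simplest of the three estimators, the COG-based CPEEA, reduces to a weighted centroid of the per-pixel photon counts; a toy version follows (the LS and ML estimators instead fit a light-distribution model, which is omitted here, and the example event is hypothetical):

```python
import numpy as np

def cog_position(hits):
    """Center-of-gravity position estimate for one scintillation event from a
    2-D array of per-pixel photon counts; the summed signal serves as a crude
    energy proxy. Toy analogue of a COG-based CPEEA."""
    hits = np.asarray(hits, dtype=float)
    total = hits.sum()
    ys, xs = np.indices(hits.shape)           # row (y) and column (x) indices
    x = (xs * hits).sum() / total
    y = (ys * hits).sum() / total
    return x, y, total

event = np.array([[0, 1, 0],
                  [2, 9, 3],
                  [0, 2, 1]])
print(cog_position(event))   # position near the central pixel, energy proxy 18
```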
Calcium hydroxide as a processing base in alkali-aided pH-shift protein recovery process.
Paker, Ilgin; Jaczynski, Jacek; Matak, Kristen E
2017-02-01
Protein may be recovered by using pH shifts to solubilize and precipitate protein. Typically, sodium hydroxide is used as the processing base; however, this has been shown to significantly increase sodium in the final recovered protein. Protein was extracted from black bullhead catfish (Ameiurus melas) using a pH-shift method. Protein was solubilized using either sodium hydroxide (NaOH) or calcium hydroxide (Ca(OH)2) and precipitated at pH 5.5 using hydrochloric acid (HCl). Protein solubility was greater when Ca(OH)2 was used compared to NaOH during this process. Using Ca(OH)2 as the processing base yielded the greatest lipid recovery (P < 0.05) at 77 g per 100 g, whereas the greatest (P < 0.05) protein recovery yield was recorded as 53 g per 100 g protein using NaOH. Protein solubilized with Ca(OH)2 had more (P < 0.05) calcium in the protein fraction, whereas using NaOH increased (P < 0.05) sodium content. Results of our study showed that protein solubility was increased and the recovered protein had significantly more calcium when Ca(OH)2 was used as the processing base. Results showed both NaOH and Ca(OH)2 to be effective processing bases for pH-shift protein recovery processes. © 2016 Society of Chemical Industry.
Bridging process-based and empirical approaches to modeling tree growth
Harry T. Valentine; Annikki Makela
2005-01-01
The gulf between process-based and empirical approaches to modeling tree growth may be bridged, in part, by the use of a common model. To this end, we have formulated a process-based model of tree growth that can be fitted and applied in an empirical mode. The growth model is grounded in pipe model theory and an optimal control model of crown development. Together, the...
Comparison of pre-processing methods for multiplex bead-based immunoassays.
Rausch, Tanja K; Schillert, Arne; Ziegler, Andreas; Lüking, Angelika; Zucht, Hans-Dieter; Schulz-Knappe, Peter
2016-08-11
High throughput protein expression studies can be performed using bead-based protein immunoassays, such as the Luminex® xMAP® technology. Technical variability is inherent to these experiments and may lead to systematic bias and reduced power. To reduce technical variability, data pre-processing is performed. However, no recommendations exist for the pre-processing of Luminex® xMAP® data. We compared 37 different data pre-processing combinations of transformation and normalization methods in 42 samples on 384 analytes obtained from a multiplex immunoassay based on the Luminex® xMAP® technology. We evaluated the performance of each pre-processing approach with 6 different performance criteria. Three performance criteria were plots. All plots were evaluated by 15 independent and blinded readers. Four different combinations of transformation and normalization methods performed well as pre-processing procedure for this bead-based protein immunoassay. The following combinations of transformation and normalization were suitable for pre-processing Luminex® xMAP® data in this study: weighted Box-Cox followed by quantile or robust spline normalization (rsn), asinh transformation followed by loess normalization and Box-Cox followed by rsn.
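One of the well-performing combinations reported here, asinh transformation followed by quantile normalization, is easy to sketch; the code below is a toy stand-in with simulated MFI-like data, not the study's pipeline, and it ignores tie handling:

```python
import numpy as np

def asinh_transform(x):
    """Variance-stabilizing transformation for MFI-like intensity data."""
    return np.arcsinh(x)

def quantile_normalize(x):
    """Force every sample (column) onto the same empirical distribution:
    replace each value by the cross-sample mean at its within-column rank."""
    ranks = np.argsort(np.argsort(x, axis=0), axis=0)   # rank within each column
    reference = np.sort(x, axis=0).mean(axis=1)         # mean sorted distribution
    return reference[ranks]

# Simulated data shaped like the study: 384 analytes x 42 samples.
mfi = np.random.default_rng(0).lognormal(mean=5, sigma=1, size=(384, 42))
processed = quantile_normalize(asinh_transform(mfi))
```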
Ratcliffe, M B; Khan, J H; Magee, K M; McElhinney, D B; Hubner, C
2000-06-01
Using a Java-based intranet program (applet), we collected postoperative process data after coronary artery bypass grafting. A Java-based applet was developed and deployed on a hospital intranet. Briefly, the nurse entered patient process data using a point-and-click interface. The applet generated a nursing note, and process data were saved in a Microsoft Access database. In 10 patients, this method was validated by comparison with a retrospective chart review. In 45 consecutive patients, weekly control charts were generated from the data. When aberrations from the pathway occurred, feedback was initiated to restore the goals of the critical pathway. The intranet process data collection method was verified by a manual chart review with 98% sensitivity. The control charts for time to extubation, intensive care unit stay, and hospital stay showed a deviation from critical pathway goals after the first 20 patients. Feedback modulation was associated with a return to critical pathway goals. Java-based applets are inexpensive and can collect accurate postoperative process data, identify critical pathway deviations, and allow timely feedback of process data.
Code of Federal Regulations, 2010 CFR
2016-10-01
...-based payment adjustment under the Home Health Value-Based Purchasing (HHVBP) Model. § 484.330 Section... (HHVBP) Model Components for Competing Home Health Agencies Within State Boundaries § 484.330 Process for determining and applying the value-based payment adjustment under the Home Health Value-Based Purchasing...
Code of Federal Regulations, 2010 CFR
2017-10-01
...-based payment adjustment under the Home Health Value-Based Purchasing (HHVBP) Model. § 484.330 Section... (HHVBP) Model Components for Competing Home Health Agencies Within State Boundaries § 484.330 Process for determining and applying the value-based payment adjustment under the Home Health Value-Based Purchasing...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meyer, Howard
2010-11-30
This project met the objective to further the development of an integrated multi-contaminant removal process in which H2S, NH3, HCl and heavy metals including Hg, As, Se and Cd present in the coal-derived syngas can be removed to specified levels in a single/integrated process step. The process supports the mission and goals of the Department of Energy's Gasification Technologies Program, namely to enhance the performance of gasification systems, thus enabling U.S. industry to improve the competitiveness of gasification-based processes. The gasification program will reduce equipment costs, improve process environmental performance, and increase process reliability and flexibility. Two sulfur conversion concepts were tested in the laboratory under this project, i.e., the solvent-based, high-pressure University of California Sulfur Recovery Process High Pressure (UCSRP-HP) and the catalytic-based, direct oxidation (DO) section of the CrystaSulf-DO process. Each process required a polishing unit to meet the ultra-clean sulfur content goals of <50 ppbv (parts per billion by volume) as may be necessary for fuel cells or chemical production applications. UCSRP-HP was also tested for the removal of trace, non-sulfur contaminants, including ammonia, hydrogen chloride, and heavy metals. A bench-scale unit was commissioned and limited testing was performed with simulated syngas. Aspen-Plus®-based computer simulation models were prepared and the economics of the UCSRP-HP and CrystaSulf-DO processes were evaluated for a nominal 500 MWe, coal-based, IGCC power plant with carbon capture. This report covers the progress on the UCSRP-HP technology development and the CrystaSulf-DO technology.
A framework supporting the development of a Grid portal for analysis based on ROI.
Ichikawa, K; Date, S; Kaishima, T; Shimojo, S
2005-01-01
In our research on brain function analysis, users require two different simultaneous types of processing: interactive processing of a specific part of the data and high-performance batch processing of an entire dataset. The difference between these two types of processing is in whether or not the analysis is for data in the region of interest (ROI). In this study, we propose a Grid portal that has a mechanism to freely assign computing resources to the users on a Grid environment according to the users' two different types of processing requirements. We constructed a Grid portal which integrates interactive processing and batch processing by the following two mechanisms. First, a job steering mechanism controls job execution based on user-tagged priority among organizations with heterogeneous computing resources; interactive jobs are processed in preference to batch jobs by this mechanism. Second, a priority-based result delivery mechanism administrates a ranking of data significance. The portal ensures a turn-around time for interactive processing by the priority-based job controlling mechanism, and provides the users with quality of service (QoS) for interactive processing. The users can access the analysis results of interactive jobs in preference to the analysis results of batch jobs. The Grid portal has also achieved high-performance computation of MEG analysis with batch processing on the Grid environment. The priority-based job controlling mechanism freely assigns computing resources according to the users' requirements. Furthermore, the achievement of high-performance computation contributes greatly to the overall progress of brain science. The portal has thus made it possible for the users to flexibly include large computational power in what they want to analyze.
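A minimal Python sketch of the core idea behind the job steering mechanism, assuming a simple two-level priority queue in which interactive (ROI) jobs always outrank batch jobs; the names and structure are illustrative only.

```python
import heapq
import itertools

INTERACTIVE, BATCH = 0, 1  # lower value = higher priority
_counter = itertools.count()  # tie-breaker preserving submission order
queue = []

def submit(job, kind):
    # interactive (ROI) jobs always outrank batch (whole-dataset) jobs
    heapq.heappush(queue, (kind, next(_counter), job))

def next_job():
    return heapq.heappop(queue)[2] if queue else None

submit("batch: full MEG dataset", BATCH)
submit("interactive: ROI analysis", INTERACTIVE)
assert next_job() == "interactive: ROI analysis"
```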
Durvasula, Raghu; Kelly, Janet; Schleyer, Anneliese; Anawalt, Bradley D; Somani, Shabir; Dellit, Timothy H
2018-04-01
As healthcare costs rise and reimbursements decrease, healthcare organization leadership and clinical providers must collaborate to provide high-value healthcare. Medications are a key driver of the increasing cost of healthcare, largely as a result of the proliferation of expensive specialty drugs, including biologic agents. Such medications contribute significantly to the inpatient diagnosis-related group payment system, often with minimal or unproved benefit over less-expensive therapies. Our objective was to describe a systematic review process to reduce non-evidence-based inpatient use of high-cost medications across a large multihospital academic health system. We created a Pharmacy & Therapeutics subcommittee consisting of clinicians, pharmacists, and an ethics representative. This committee developed a standardized process for a timely review (<48 hours) and approval of high-cost medications based on their clinical effectiveness, safety, and appropriateness. The engagement of clinical experts in the development of the consensus-based guidelines for the use of specific medications facilitated the clinicians' acceptance of the review process. Over a 2-year period, a total of 85 patient-specific requests underwent formal review. All reviews were conducted within 48 hours. This review process has reduced the non-evidence-based use of specialty medications and has resulted in a pharmacy savings of $491,000 in fiscal year 2016, with almost 80% of the savings occurring in the last 2 quarters, as our process has matured. The creation of a collaborative review process to ensure consistent, evidence-based utilization of high-cost medications provides value-based care, while minimizing unnecessary practice variation and reducing the cost of inpatient care.
Frégeau, Chantal J; Lett, C Marc; Fourney, Ron M
2010-10-01
A semi-automated DNA extraction process for casework samples based on the Promega DNA IQ™ system was optimized and validated on TECAN Genesis 150/8 and Freedom EVO robotic liquid handling stations configured with fixed tips and a TECAN TE-Shake™ unit. The use of an orbital shaker during the extraction process promoted efficiency with respect to DNA capture, magnetic bead/DNA complex washes and DNA elution. Validation studies determined the reliability and limitations of this shaker-based process. Reproducibility with regards to DNA yields for the tested robotic workstations proved to be excellent and not significantly different than that offered by the manual phenol/chloroform extraction. DNA extraction of animal:human blood mixtures contaminated with soil demonstrated that a human profile was detectable even in the presence of abundant animal blood. For exhibits containing small amounts of biological material, concordance studies confirmed that DNA yields for this shaker-based extraction process are equivalent or greater to those observed with phenol/chloroform extraction as well as our original validated automated magnetic bead percolation-based extraction process. Our data further supports the increasing use of robotics for the processing of casework samples. Crown Copyright © 2009. Published by Elsevier Ireland Ltd. All rights reserved.
Real-Time, Sensor-Based Computing in the Laboratory.
ERIC Educational Resources Information Center
Badmus, O. O.; And Others
1996-01-01
Demonstrates the importance of Real-Time, Sensor-Based (RTSB) computing and how it can be easily and effectively integrated into university student laboratories. Describes the experimental processes, the process instrumentation and process-computer interface, the computer and communications systems, and typical software. Provides much technical…
NASA Astrophysics Data System (ADS)
Liu, Z.; LU, G.; He, H.; Wu, Z.; He, J.
2017-12-01
Seasonal pluvial-drought transition processes are unique natural phenomena. To explore possible mechanisms, we considered Southwest China (SWC) as the study region and comprehensively investigated the temporal evolution of large-scale and regional atmospheric variables with the simple method of Standardized Anomalies (SA). Some key results include: (1) The net vertical integral of water vapour flux (VIWVF) across the four boundaries may be a feasible indicator of pluvial-drought transition processes over SWC, because its SA-based index is almost consistent with process development. (2) The vertical SA-based patterns of regional horizontal divergence (D) and vertical motion (ω) also coincide well with the pluvial-drought transition processes, and the SA-based index of regional D shows relatively high correlation with the identified processes over SWC. (3) With respect to large-scale anomalies of circulation patterns, a well-organized Eurasian Pattern is one important feature during the pluvial-drought transition over SWC. (4) To explore the possibility of simulating drought development using previous pluvial anomalies, large-scale and regional atmospheric SA-based indices were used. As a whole, when SA-based indices of regional dynamic and water-vapour variables are introduced, drought development simulated with large-scale anomalies alone is considerably improved. (5) Eventually, pluvial-drought transition processes and associated regional atmospheric anomalies over nine Chinese drought study regions were investigated. With respect to regional D, vertically single or double "upper-positive-lower-negative" and "upper-negative-lower-positive" patterns are the most common vertical SA-based patterns during the pluvial and drought parts of transition processes, respectively.
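A minimal sketch of the Standardized Anomalies (SA) computation assumed throughout the abstract, implemented in Python against a series' own climatology; the variable names are illustrative.

```python
import numpy as np

def standardized_anomaly(series, climatology_axis=0):
    """SA index: (x - mean) / std against the series' own climatology.

    series: array of an atmospheric variable (e.g. net VIWVF across the
    region's boundaries), with time along `climatology_axis`.
    """
    x = np.asarray(series, dtype=float)
    mean = x.mean(axis=climatology_axis, keepdims=True)
    std = x.std(axis=climatology_axis, keepdims=True)
    return (x - mean) / std
```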
Druzinec, Damir; Salzig, Denise; Brix, Alexander; Kraume, Matthias; Vilcinskas, Andreas; Kollewe, Christian; Czermak, Peter
2013-01-01
Due to the increasing use of insect cell-based expression systems in research and industrial recombinant protein production, the development of efficient and reproducible production processes remains a challenging task. In this context, the application of online monitoring techniques is intended to ensure high and reproducible product quality already during the early phases of process development. In the following chapter, the most common transient and stable insect cell-based expression systems are briefly introduced. Novel applications of insect cell-based expression systems for the production of insect-derived antimicrobial peptides/proteins (AMPs) are discussed using the example of G. mellonella derived gloverin. Suitable in situ sensor techniques for insect cell culture monitoring in disposable and common bioreactor systems are outlined with respect to optical and capacitive sensor concepts. Since scale-up of production processes is one of the most critical steps in process development, a conclusive overview is given of scale-up aspects for industrial insect cell culture processes.
Development of High Throughput Process for Constructing 454 Titanium and Illumina Libraries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deshpande, Shweta; Hack, Christopher; Tang, Eric
2010-05-28
We have developed two processes with the Biomek FX robot to construct 454 Titanium and Illumina libraries in order to meet increasing library demands. All modifications in the library construction steps were made to enable the adaptation of the entire processes to the 96-well plate format. The key modifications include the shearing of DNA with the Covaris E210 and the enzymatic reaction cleanup and fragment size selection with SPRI beads and magnetic plate holders. The construction of 96 Titanium libraries takes about 8 hours from sheared DNA to ssDNA recovery. The processing of 96 Illumina libraries takes less time than the Titanium library process. Although both processes still require manual transfer of plates from the robot to other work stations such as thermocyclers, these robotic processes represent a 12- to 24-fold increase of library capacity compared to the manual processes. To enable the sequencing of many libraries in parallel, we have also developed sets of molecular barcodes for both library types. The requirements for the 454 library barcodes include 10 bases, 40-60% GC, no consecutive same base, and no less than 3 bases difference between barcodes. We have used 96 of the resulting 270 barcodes to construct libraries and pool to test the ability of accurately assigning reads to the right samples. When allowing 1 base error in the 10-base barcodes, we could assign 99.6% of the total reads, and 100% of them were uniquely assigned. As for the Illumina barcodes, the requirements include 4 bases, balanced GC, and at least 2 bases difference between barcodes. We have begun to assess the ability to assign reads after pooling different numbers of libraries. We will discuss the progress and the challenges of these scale-up processes.
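The stated 454 barcode requirements can be checked mechanically; the following Python sketch validates a candidate barcode set against the length, GC content, homopolymer and pairwise-distance rules, reading "3 bases difference" as Hamming distance (an assumption).

```python
from itertools import combinations

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def valid_454_barcode(bc):
    gc = sum(b in "GC" for b in bc) / len(bc)
    no_runs = all(a != b for a, b in zip(bc, bc[1:]))  # no consecutive same base
    return len(bc) == 10 and 0.4 <= gc <= 0.6 and no_runs

def valid_set(barcodes, min_dist=3):
    return (all(map(valid_454_barcode, barcodes)) and
            all(hamming(a, b) >= min_dist
                for a, b in combinations(barcodes, 2)))

print(valid_set(["ACGTACGTAC", "TGCATGCAGT"]))  # True
```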
Enforcement of entailment constraints in distributed service-based business processes.
Hummer, Waldemar; Gaubatz, Patrick; Strembeck, Mark; Zdun, Uwe; Dustdar, Schahram
2013-11-01
A distributed business process is executed in a distributed computing environment. The service-oriented architecture (SOA) paradigm is a popular option for the integration of software services and execution of distributed business processes. Entailment constraints, such as mutual exclusion and binding constraints, are important means to control process execution. Mutually exclusive tasks result from the division of powerful rights and responsibilities to prevent fraud and abuse. In contrast, binding constraints define that a subject who performed one task must also perform the corresponding bound task(s). We aim to provide a model-driven approach for the specification and enforcement of task-based entailment constraints in distributed service-based business processes. Based on a generic metamodel, we define a domain-specific language (DSL) that maps the different modeling-level artifacts to the implementation level. The DSL integrates elements from role-based access control (RBAC) with the tasks that are performed in a business process. Process definitions are annotated using the DSL, and our software platform uses automated model transformations to produce executable WS-BPEL specifications which enforce the entailment constraints. We evaluate the impact of constraint enforcement on runtime performance for five selected service-based processes from existing literature. Our evaluation demonstrates that the approach correctly enforces task-based entailment constraints at runtime. The performance experiments illustrate that the runtime enforcement operates with an overhead that scales well up to the order of several tens of thousands of logged invocations. Using our DSL annotations, the user-defined process definition remains declarative and clean of security enforcement code. Our approach decouples the concerns of (non-technical) domain experts from technical details of entailment constraint enforcement. The developed framework integrates seamlessly with WS-BPEL and the Web services technology stack. Our prototype implementation shows the feasibility of the approach, and the evaluation points to future work and further performance optimizations.
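A toy Python sketch of runtime enforcement of the two constraint types; the paper's actual enforcement is generated into executable WS-BPEL from the DSL, so the task names and data structures here are purely illustrative.

```python
# Hypothetical constraint sets for one process instance.
mutual_exclusion = {("approve_order", "issue_payment")}   # four-eyes rule
binding = {("enter_order", "confirm_order")}              # same subject required

def check(history, subject, task):
    """history: list of (subject, task) pairs already executed in this instance."""
    for done_subj, done_task in history:
        same_subject = done_subj == subject
        if ((task, done_task) in mutual_exclusion or
                (done_task, task) in mutual_exclusion) and same_subject:
            return False  # same subject on mutually exclusive tasks
        if (done_task, task) in binding and not same_subject:
            return False  # bound task must be performed by the same subject
    return True

hist = [("alice", "approve_order")]
assert not check(hist, "alice", "issue_payment")  # blocked: mutual exclusion
assert check(hist, "bob", "issue_payment")
```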
Prabhakar, P.; Sames, William J.; Dehoff, Ryan R.; ...
2015-03-28
A computational modeling approach to simulate residual stress formation during the electron beam melting (EBM) process within the additive manufacturing (AM) technologies for Inconel 718 is presented in this paper. The EBM process has demonstrated a high potential to fabricate components with complex geometries, but the resulting components are influenced by the thermal cycles observed during the manufacturing process. When processing nickel based superalloys, very high temperatures (approx. 1000 °C) are observed in the powder bed, base plate, and build. These high temperatures, when combined with substrate adherence, can result in warping of the base plate and affect the final component by causing defects. It is important to have an understanding of the thermo-mechanical response of the entire system, that is, its mechanical behavior towards thermal loading occurring during the EBM process, prior to manufacturing a component. Therefore, computational models to predict the response of the system during the EBM process will aid in eliminating the undesired process conditions, a priori, in order to fabricate the optimum component. Such a comprehensive computational modeling approach is demonstrated to analyze warping of the base plate, stress and plastic strain accumulation within the material, and thermal cycles in the system during different stages of the EBM process.
NASA Astrophysics Data System (ADS)
Yan, Li; Liao, Lei; Huang, Wei; Li, Lang-quan
2018-04-01
The analysis of nonlinear characteristics and control of the mode transition process is a crucial issue in enhancing the stability and reliability of the dual-mode scramjet engine. In the current study, the mode transition processes in both a strut-based combustor and a cavity-strut based combustor are numerically studied, and the influence of the cavity on the transition process is analyzed in detail. The simulations are conducted by means of the Reynolds averaged Navier-Stokes (RANS) equations coupled with the renormalization group (RNG) k-ε turbulence model and a single-step chemical reaction mechanism, and this numerical approach is proved valid by comparing the predicted results with the available experimental shadowgraphs in the open literature. During the mode transition process, an obvious nonlinear property is observed, namely the uneven variation of pressure along the combustor. The hysteresis phenomenon is more obvious upstream in the flow field. For the cavity-strut configuration, the whole flow field is more inclined to the supersonic state during the transition process, and it does not readily convert to the ramjet mode. In the scram-to-ram transition, the process would be more stable, and the hysteresis effect would be reduced in the ram-to-scram transition process.
Johnson, Kirsten A.; Farris, Samantha G.; Schmidt, Norman B.; Zvolensky, Michael J.
2012-01-01
Objective: The current study investigated whether emotion dysregulation (ED; difficulties in the self-regulation of affective states) mediated relations between anxiety sensitivity (AS; fear of anxiety and related sensations) and cognitive-based smoking processes. Method: Participants (n = 197; 57.5% male; mean age = 38.0) were daily smokers recruited as part of a randomized control trial for smoking cessation. Results: AS was uniquely associated with all smoking processes. Moreover, ED significantly mediated relations between AS and the smoking processes. Conclusions: Findings suggest that ED is an important construct to consider in relations between AS and cognitive-based smoking processes among adult treatment-seeking smokers. PMID:22540436
[GSH fermentation process modeling using entropy-criterion based RBF neural network model].
Tan, Zuoping; Wang, Shitong; Deng, Zhaohong; Du, Guocheng
2008-05-01
The prediction accuracy and generalization of GSH fermentation process modeling are often degraded by noise in the corresponding experimental data. In order to avoid this problem, we present a novel RBF neural network modeling approach based on an entropy criterion. Compared with traditional MSE-criterion based parameter learning, it considers the whole distribution structure of the training data set in the parameter learning process, and thus effectively avoids weak generalization and over-learning. The proposed approach is then applied to GSH fermentation process modeling. Our results demonstrate that the proposed method has better prediction accuracy, generalization and robustness, such that it offers a potential application merit for GSH fermentation process modeling.
An Extension of SIC Predictions to the Wiener Coactive Model
Houpt, Joseph W.; Townsend, James T.
2011-01-01
The survivor interaction contrasts (SIC) is a powerful measure for distinguishing among candidate models of human information processing. One class of models to which SIC analysis can apply are the coactive, or channel summation, models of human information processing. In general, parametric forms of coactive models assume that responses are made based on the first passage time across a fixed threshold of a sum of stochastic processes. Previous work has shown that the SIC for a coactive model based on the sum of Poisson processes has a distinctive down-up-down form, with an early negative region that is smaller than the later positive region. In this note, we demonstrate that a coactive process based on the sum of two Wiener processes has the same SIC form. PMID:21822333
An Extension of SIC Predictions to the Wiener Coactive Model.
Houpt, Joseph W; Townsend, James T
2011-06-01
The survivor interaction contrasts (SIC) is a powerful measure for distinguishing among candidate models of human information processing. One class of models to which SIC analysis can apply are the coactive, or channel summation, models of human information processing. In general, parametric forms of coactive models assume that responses are made based on the first passage time across a fixed threshold of a sum of stochastic processes. Previous work has shown that the SIC for a coactive model based on the sum of Poisson processes has a distinctive down-up-down form, with an early negative region that is smaller than the later positive region. In this note, we demonstrate that a coactive process based on the sum of two Wiener processes has the same SIC form.
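A simulation sketch in Python of the result: first-passage times of the sum of two Wiener processes under the four salience conditions, from which the SIC is formed as SIC(t) = [S_LL(t) − S_LH(t)] − [S_HL(t) − S_HH(t)]; the drift rates, threshold and step size are assumed values, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def first_passage_times(drift1, drift2, thresh=2.0, dt=0.005, tmax=8.0, n=20000):
    """First-passage times of the SUM of two Wiener processes (coactive channel)."""
    t = np.full(n, np.inf)
    x = np.zeros(n)
    alive = np.ones(n, dtype=bool)
    for i in range(int(tmax / dt)):
        # sum of two unit-diffusion Wiener processes -> increment variance 2*dt
        x[alive] += (drift1 + drift2) * dt + np.sqrt(2 * dt) * rng.standard_normal(alive.sum())
        crossed = alive & (x >= thresh)
        t[crossed] = (i + 1) * dt
        alive &= ~crossed
    return t

def survivor(times, grid):
    return np.array([(times > g).mean() for g in grid])

grid = np.linspace(0, 6, 200)
lo, hi = 0.5, 1.5  # assumed low/high salience drift rates
S = {cond: survivor(first_passage_times(d1, d2), grid)
     for cond, (d1, d2) in {"LL": (lo, lo), "LH": (lo, hi),
                            "HL": (hi, lo), "HH": (hi, hi)}.items()}
sic = (S["LL"] - S["LH"]) - (S["HL"] - S["HH"])  # expected: down-up-down shape
```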
A Mixed Kijima Model Using the Weibull-Based Generalized Renewal Processes
2015-01-01
Generalized Renewal Processes are useful for approaching the rejuvenation of dynamical systems resulting from planned or unplanned interventions. We present new perspectives for the Generalized Renewal Processes in general and for the Weibull-based Generalized Renewal Processes in particular. Departing from the existing literature, we present a mixed Generalized Renewal Processes approach involving Kijima Type I and II models, allowing one to infer the impact of distinct interventions on the performance of the system under study. The first and second theoretical moments of this model are introduced as well as its maximum likelihood estimation and random sampling approaches. In order to illustrate the usefulness of the proposed Weibull-based Generalized Renewal Processes model, some real data sets involving improving, stable, and deteriorating systems are used. PMID:26197222
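For concreteness, the two Kijima virtual-age recursions that the mixed model combines can be written in a few lines of Python; the rejuvenation parameter q and the interarrival times below are illustrative values.

```python
def virtual_ages(interarrival_times, q, kijima_type=1):
    """Virtual age after each repair under Kijima Type I or II.

    q = 0 recovers 'as good as new' (renewal process),
    q = 1 recovers 'as bad as old' (minimal repair).
    """
    v = 0.0
    ages = []
    for x in interarrival_times:
        if kijima_type == 1:
            v = v + q * x      # repair only removes damage of the last sojourn
        else:
            v = q * (v + x)    # repair rejuvenates the whole accumulated age
        ages.append(v)
    return ages

print(virtual_ages([100, 80, 60], q=0.3, kijima_type=1))  # [30.0, 54.0, 72.0]
print(virtual_ages([100, 80, 60], q=0.3, kijima_type=2))  # [30.0, 33.0, 27.9]
```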
A Metadata-Based Approach for Analyzing UAV Datasets for Photogrammetric Applications
NASA Astrophysics Data System (ADS)
Dhanda, A.; Remondino, F.; Santana Quintero, M.
2018-05-01
This paper proposes a methodology for pre-processing and analysing Unmanned Aerial Vehicle (UAV) datasets before photogrammetric processing. In cases where images are gathered without a detailed flight plan and at regular acquisition intervals, the datasets can be quite large and time-consuming to process. This paper proposes a method to calculate the image overlap and filter out images to reduce large block sizes and speed up photogrammetric processing. The Python-based algorithm that implements this methodology leverages the metadata in each image to determine the end and side overlap of grid-based UAV flights. Utilizing user input, the algorithm filters out images that are unneeded for photogrammetric processing. The result is an algorithm that can speed up photogrammetric processing and provide valuable information to the user about the flight path.
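A minimal Python sketch of the overlap computation such an algorithm performs from metadata, assuming nadir imagery, a known focal length and sensor size, and camera-centre spacing taken from the GPS EXIF tags; the numbers are hypothetical.

```python
def end_overlap(altitude_m, focal_mm, sensor_along_mm, base_m):
    """Forward (end) overlap between two consecutive nadir images.

    base_m: along-track distance between camera centres, e.g. from the
    GPS tags in each image's EXIF metadata.
    """
    # ground footprint along-track = sensor dimension * (altitude / focal length)
    footprint_m = (sensor_along_mm / 1000.0) * altitude_m / (focal_mm / 1000.0)
    return max(0.0, 1.0 - base_m / footprint_m)

# hypothetical flight: 60 m altitude, 8.8 mm lens, 8.8 mm sensor dimension
print(f"{end_overlap(60, 8.8, 8.8, base_m=12):.0%}")  # -> 80%
```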
Sun, E; Xu, Feng-Juan; Zhang, Zhen-Hai; Wei, Ying-Jie; Tan, Xiao-Bin; Cheng, Xu-Dong; Jia, Xiao-Bin
2014-02-01
Based on years of practice on the processing mechanism of Epimedium and integrating multidisciplinary theory and technology, this paper initially constructs a research system for the processing mechanism of traditional Chinese medicine based on chemical composition transformation combined with the intestinal absorption barrier, forming an innovative research mode of "chemical composition changes - biological transformation - metabolism in vitro and in vivo - intestinal absorption - pharmacokinetics combined with pharmacodynamics - pharmacodynamic mechanism". Combined with specific examples of the processing mechanisms of Epimedium and other Chinese herbal medicines, this paper also discusses the academic ideas, research methods and key technologies of this research system, which will help to systematically reveal the modern scientific connotation of traditional Chinese medicine processing and enrich the theory of Chinese herbal medicine processing.
How Many Batches Are Needed for Process Validation under the New FDA Guidance?
Yang, Harry
2013-01-01
The newly updated FDA Guidance for Industry on Process Validation: General Principles and Practices ushers in a life cycle approach to process validation. While the guidance no longer considers the use of traditional three-batch validation appropriate, it does not prescribe the number of validation batches for a prospective validation protocol, nor does it provide specific methods to determine it. This potentially could leave manufacturers in a quandary. In this paper, I develop a Bayesian method to address the issue. By combining process knowledge gained from Stage 1 Process Design (PD) with expected outcomes of Stage 2 Process Performance Qualification (PPQ), the number of validation batches for PPQ is determined to provide a high level of assurance that the process will consistently produce future batches meeting quality standards. Several examples based on simulated data are presented to illustrate the use of the Bayesian method in helping manufacturers make risk-based decisions for Stage 2 PPQ, and they highlight the advantages of the method over traditional Frequentist approaches. The discussions in the paper lend support for a life cycle and risk-based approach to process validation recommended in the new FDA guidance.
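One minimal Bayesian criterion of this kind (a sketch, not necessarily the paper's exact formulation): encode Stage 1 knowledge as a Beta prior on the batch success rate and pick the smallest number of conformant PPQ batches that pushes the posterior probability of a high success rate above a target assurance.

```python
from scipy.stats import beta

def batches_needed(prior_a, prior_b, p0=0.90, assurance=0.90, n_max=50):
    """Smallest n such that, if all n PPQ batches conform, the posterior
    probability that the batch success rate exceeds p0 is >= assurance.

    prior_a, prior_b: Beta prior encoding Stage 1 (Process Design) knowledge.
    """
    for n in range(1, n_max + 1):
        posterior = beta(prior_a + n, prior_b)  # update on n conforming batches
        if posterior.sf(p0) >= assurance:
            return n
    return None

# assumed prior: roughly 18 successful, 0 failed development runs -> Beta(19, 1)
print(batches_needed(prior_a=19, prior_b=1))  # -> 3 under these assumptions
```

A weaker prior or a stricter (p0, assurance) pair drives the number of batches up, which is exactly the risk-based behaviour the guidance calls for.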
Research on the processing technology of elongated holes based on rotary ultrasonic drilling
NASA Astrophysics Data System (ADS)
Tong, Yi; Chen, Jianhua; Sun, Lipeng; Yu, Xin; Wang, Xin
2014-08-01
Optical glass is hard, brittle and difficult to process. Based on the method of rotary ultrasonic drilling, a single-factor study of drilling elongated holes in optical glass was conducted. The processing equipment was a DAMA ultrasonic machine, and the machining tools were electroplated with diamond. Through detection and analysis of the processing quality and surface roughness, the process parameters of rotary ultrasonic drilling (spindle speed, amplitude, feed rate) were investigated, and the influence of the processing parameters on surface roughness was obtained, which provides a reference and basis for actual processing.
Fuzzy control of burnout of multilayer ceramic actuators
NASA Astrophysics Data System (ADS)
Ling, Alice V.; Voss, David; Christodoulou, Leo
1996-08-01
To improve the yield and repeatability of the burnout process of multilayer ceramic actuators (MCAs), an intelligent processing of materials (IPM) based control system has been developed for the manufacture of MCAs. IPM involves the active (ultimately adaptive) control of a material process using empirical or analytical models and in situ sensing of critical process states (part features and process parameters) to modify the processing conditions in real time to achieve predefined product goals. Thus, the three enabling technologies for the IPM burnout control system are process modeling, in situ sensing and intelligent control. This paper presents the design of an IPM-based control strategy for the burnout process of MCAs.
Graf, Laura K M; Landwehr, Jan R
2015-11-01
In this article, we develop an account of how aesthetic preferences can be formed as a result of two hierarchical, fluency-based processes. Our model suggests that processing performed immediately upon encountering an aesthetic object is stimulus driven, and aesthetic preferences that accrue from this processing reflect aesthetic evaluations of pleasure or displeasure. When sufficient processing motivation is provided by a perceiver's need for cognitive enrichment and/or the stimulus' processing affordance, elaborate perceiver-driven processing can emerge, which gives rise to fluency-based aesthetic evaluations of interest, boredom, or confusion. Because the positive outcomes in our model are pleasure and interest, we call it the Pleasure-Interest Model of Aesthetic Liking (PIA Model). Theoretically, this model integrates a dual-process perspective and ideas from lay epistemology into processing fluency theory, and it provides a parsimonious framework to embed and unite a wealth of aesthetic phenomena, including contradictory preference patterns for easy versus difficult-to-process aesthetic stimuli. © 2015 by the Society for Personality and Social Psychology, Inc.
The peculiarities of process-based approach realization in transport sector company management
NASA Astrophysics Data System (ADS)
Khripko, Elena; Sidorov, Gennadiy
2017-10-01
In the present article we study the phenomenon of multiple meanings in understanding the process-based management method in the construction of transport infrastructure facilities. The multiple meanings arise from distortions which appear during the reception of the process-management paradigm in the organizational environment of the transport sector. The cause of distortion in process management is organizational resistance. The distortions of management processes appear at the level of diffusion among spheres of responsibility and in collisions between the functional, project and process forms of interaction between the owner of the process and its participants. The level of distortion is affected by the attitude towards the result of work: in management practice, a process understanding of the result is often replaced by a functional one. This transfiguration is a consequence of regressive defensive mechanisms of the organizational environment. Based on the experience of establishing process management in a company constructing transport infrastructure facilities, the issues of diagnosing various forms of organizational resistance and ways of reducing their destructive influence on management processes are reviewed.
An Overview of Ni Base Additive Fabrication Technologies for Aerospace Applications (Preprint)
2011-03-01
fusion welding processes that have the ability to add filler material can be used as additive manufacturing processes. The majority of the work in the... Laser Additive Manufacturing (LAM) The LAM process uses a conventional laser welding heat source (CO2 or solid state laser) combined with a... wrought properties. The LAM process typically has a lower deposition rate (0.5-10 lbs/hr) compared to EB, PTA or TIG based processes, although as
Platform for Post-Processing Waveform-Based NDE
NASA Technical Reports Server (NTRS)
Roth, Don J.
2010-01-01
Signal- and image-processing methods are commonly needed to extract information from the waves, improve resolution of, and highlight defects in an image. Since some similarity exists for all waveform-based nondestructive evaluation (NDE) methods, it would seem that a common software platform containing multiple signal- and image-processing techniques to process the waveforms and images makes sense where multiple techniques, scientists, engineers, and organizations are involved. NDE Wave & Image Processor Version 2.0 software provides a single, integrated signal- and image-processing and analysis environment for total NDE data processing and analysis. It brings some of the most useful algorithms developed for NDE over the past 20 years into a commercial-grade product. The software can import signal/spectroscopic data, image data, and image series data. This software offers the user hundreds of basic and advanced signal- and image-processing capabilities including esoteric 1D and 2D wavelet-based de-noising, de-trending, and filtering. Batch processing is included for signal- and image-processing capability so that an optimized sequence of processing operations can be applied to entire folders of signals, spectra, and images. Additionally, an extensive interactive model-based curve-fitting facility has been included to allow fitting of spectroscopy data such as from Raman spectroscopy. An extensive joint-time frequency module is included for analysis of non-stationary or transient data such as that from acoustic emission, vibration, or earthquake data.
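As a flavour of the 1D wavelet-based de-noising capability described, here is a generic soft-threshold de-noising sketch in Python using PyWavelets; it is a standard universal-threshold recipe, not NDE Wave & Image Processor's actual algorithm.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_denoise(signal, wavelet="db4", level=4):
    """1D wavelet de-noising with a universal soft threshold."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # noise scale estimated from the finest detail coefficients
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(len(signal)))
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

t = np.linspace(0, 1, 1024)
noisy = np.sin(2 * np.pi * 5 * t) + 0.3 * np.random.default_rng(0).standard_normal(1024)
clean = wavelet_denoise(noisy)
```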
Bitumen and heavy oil upgrading in Canada
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chrones, J.; Germain, R.R.
1989-01-01
A review is presented of the heavy oil upgrading industry in Canada. Up to now it has been based on the processing of bitumen extracted from oil sands mining operations at two sites, to produce a residue-free, low sulphur, synthetic crude. Carbon rejection has been the prime process technology with delayed coking being used by Suncor and FLUID COKING at Syncrude. Alternative processes for recovering greater amounts of synthetic crude are examined. These include a variety of hydrogen addition processes and combinations which produce pipelineable materials requiring further processing in downstream refineries with expanded capabilities. The Newgrade Energy Inc. upgrader, now under construction in Regina, will use fixed-bed, catalytic, atmospheric-residue, hydrogen processing. Two additional projects, also based on hydrogenation, will use ebullated bed catalyst systems; the expansion of Syncrude, now underway, is using the LC Fining Process whereas the announced Husky Bi-Provincial upgrader is based on H-Oil.
Bitumen and heavy oil upgrading in Canada
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chrones, J.
1988-06-01
A review is presented of the heavy oil upgrading industry in Canada. Up to now it has been based on the processing of bitumen extracted from oil sands mining operations at two sites, to produce a residue-free, low sulfur, synthetic crude. Carbon rejection has been the prime process technology with delayed coking being used by Suncor and FLUID COKING at Syncrude. Alternative processes for recovering greater amounts of synthetic crude are examined. These include a variety of hydrogen addition processes and combinations which produce pipelineable materials requiring further processing in downstream refineries with expanded capabilities. The Newgrade Energy Inc. upgrader, now under construction in Regina, will use fixed-bed, catalytic, atmospheric-residue, hydrogen processing. Two additional projects, also based on hydrogenation, will use ebullated bed catalyst systems: the expansion of Syncrude, now underway, is using the LC Fining Process whereas the announced Husky Bi-Provincial upgrader is based on H-Oil.
One Step at a Time: SBM as an Incremental Process.
ERIC Educational Resources Information Center
Conrad, Mark
1995-01-01
Discusses incremental SBM budgeting and answers questions regarding resource equity, bookkeeping requirements, accountability, decision-making processes, and purchasing. Approaching site-based management as an incremental process recognizes that every school system engages in some level of site-based decisions. Implementation can be gradual and…
ERIC Educational Resources Information Center
Bennett, David A.
This document comprises a report on the architectural elements of choice in the desegregation process, a review of the choice process based on Minnesota's experience, and a statement of implications for state policymakers. The following organizational principles of the choice process are discussed: (1) enrollment based on a "first come, first…
76 FR 22648 - Resolution Plans and Credit Exposure Reports Required
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-22
..., including associated services, functions and support that, in the view of the Covered Company or as jointly...-based Covered Company's overall contingency planning process, and information regarding the.... operations be linked to the contingency planning process of the foreign-based Covered Company? Process 1. Are...
Radiology information system: a workflow-based approach.
Zhang, Jinyan; Lu, Xudong; Nie, Hongchao; Huang, Zhengxing; van der Aalst, W M P
2009-09-01
Introducing workflow management technology in healthcare seems promising in dealing with the problem that current healthcare information systems cannot provide sufficient support for process management, although several challenges still exist. The purpose of this paper is to study the method of developing a workflow-based information system in a radiology department as a use case. First, a workflow model of the typical radiology process was established. Second, based on the model, the system could be designed and implemented as a group of loosely coupled components. Each component corresponded to one task in the process and could be assembled by the workflow management system. The legacy systems could be taken as special components, which also corresponded to the tasks and were integrated by transforming non-workflow-aware interfaces into standard ones. Finally, a workflow dashboard was designed and implemented to provide an integral view of radiology processes. The workflow-based Radiology Information System was deployed in the radiology department of Zhejiang Chinese Medicine Hospital in China. The results showed that it could be adjusted flexibly in response to the needs of changing processes, and enhance process management in the department. It can also provide a more workflow-aware integration method, compared with other methods such as IHE-based ones. The workflow-based approach is a new method of developing a radiology information system with more flexibility, more functionality for process management and more workflow-aware integration. The work of this paper is an initial endeavor at introducing workflow management technology in healthcare.
Assessing the structure of non-routine decision processes in Airline Operations Control.
Richters, Floor; Schraagen, Jan Maarten; Heerkens, Hans
2016-03-01
Unfamiliar severe disruptions challenge Airline Operations Control professionals most, as their expertise is stretched to its limits. This study has elicited the structure of Airline Operations Control professionals' decision process during unfamiliar disruptions by mapping three macrocognitive activities on the decision ladder: sensemaking, option evaluation and action planning. The relationship between this structure and decision quality was measured. A simulated task was staged, based on which think-aloud protocols were obtained. Results show that the general decision process structure resembles the structure of experts working under routine conditions, in terms of the general structure of the macrocognitive activities, and the rule-based approach used to identify options and actions. Surprisingly, high quality of decision outcomes was found to relate to the use of rule-based strategies. This implies that successful professionals are capable of dealing with unfamiliar problems by reframing them into familiar ones, rather than to engage in knowledge-based processing. Practitioner Summary: We examined the macrocognitive structure of Airline Operations Control professionals' decision process during a simulated unfamiliar disruption in relation to decision quality. Results suggest that successful professionals are capable of dealing with unfamiliar problems by reframing them into familiar ones, rather than to engage in knowledge-based processing.
Breaking Lander-Waterman’s Coverage Bound
Nashta-ali, Damoun; Motahari, Seyed Abolfazl; Hosseinkhalaj, Babak
2016-01-01
Lander-Waterman’s coverage bound establishes the total number of reads required to cover the whole genome of size G bases. In fact, their bound is a direct consequence of the well-known solution to the coupon collector’s problem, which proves that for such a genome, the total number of bases to be sequenced should be O(G ln G). Although the result leads to a tight bound, it is based on the tacit assumption that the set of reads is first collected through a sequencing process and then processed through a computation process, i.e., there are two different machines: one for sequencing and one for processing. In this paper, we present a significant improvement compared to Lander-Waterman’s result and prove that by combining the sequencing and computing processes, one can re-sequence the whole genome with as low as O(G) sequenced bases in total. Our approach also dramatically reduces the required computational power for the combined process. Simulation results are performed on real genomes with different sequencing error rates. The results support our theory, predicting the log G improvement on the coverage bound and a corresponding reduction in the total number of bases required to be sequenced. PMID:27806058
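The coupon collector argument behind the O(G ln G) figure is standard and can be written out explicitly:

```latex
% Coupon collector: expected number of uniformly drawn bases T
% needed to cover all G positions of the genome.
\mathbb{E}[T] \;=\; \sum_{k=1}^{G} \frac{G}{k} \;=\; G\,H_G \;=\; G\ln G + \gamma G + O(1)
```

So a sequence-first, process-later pipeline needs on the order of G ln G bases, and the paper's claim is that interleaving sequencing with computation removes the ln G factor.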
Optimization evaluation of cutting technology based on mechanical parts
NASA Astrophysics Data System (ADS)
Wang, Yu
2018-04-01
The relationship between the mechanical manufacturing process and carbon emissions is studied on the basis of the stages of the machining process. A formula for calculating carbon emissions suitable for the mechanical manufacturing process is derived. Based on this, a green evaluation method for the cold machining process of mechanical parts is proposed. The proposed evaluation method is verified and its data analyzed through an example. The results show that there is a strong relationship between mechanical manufacturing process data and carbon emissions.
Reprocessing system with nuclide separation based on chromatography in hydrochloric acid solution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Suzuki, Tatsuya; Tachibana, Yu; Koyama, Shi-ichi
2013-07-01
We have proposed a reprocessing system with nuclide separation processes based on the chromatographic technique in the hydrochloric acid solution system. Our proposed system consists of the dissolution process, the reprocessing process, the minor actinide separation process, and nuclide separation processes. In the reprocessing and separation processes, pyridine resin is used as the main separation medium. It was confirmed that dissolution in the hydrochloric acid solution is easily achieved by plasma voloxidation and by the addition of hydrogen peroxide into the hydrochloric acid solution.
Perimal-Lewis, Lua; Teubner, David; Hakendorf, Paul; Horwood, Chris
2016-12-01
Effective and accurate use of routinely collected health data to produce Key Performance Indicator reporting is dependent on the underlying data quality. In this research, Process Mining methodology and tools were leveraged to assess the data quality of time-based Emergency Department data sourced from electronic health records. This research was done working closely with the domain experts to validate the process models. The hospital patient journey model was used to assess flow abnormalities which resulted from incorrect timestamp data used in time-based performance metrics. The research demonstrated process mining as a feasible methodology to assess data quality of time-based hospital performance metrics. The insight gained from this research enabled appropriate corrective actions to be put in place to address the data quality issues. © The Author(s) 2015.
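A toy example of the kind of timestamp check such an assessment relies on, in Python/pandas; the column names and the expected event ordering are assumptions, not the study's actual patient journey model.

```python
import pandas as pd

# Hypothetical ED extract; arrival -> seen by doctor -> departure is assumed.
ed = pd.DataFrame({
    "patient_id": [1, 2, 3],
    "arrival":    pd.to_datetime(["2015-01-01 10:00", "2015-01-01 11:00", "2015-01-01 12:00"]),
    "seen_by_dr": pd.to_datetime(["2015-01-01 10:30", "2015-01-01 10:50", "2015-01-01 12:40"]),
    "departure":  pd.to_datetime(["2015-01-01 13:00", "2015-01-01 12:30", "2015-01-01 12:20"]),
})

# Flow abnormalities: events recorded out of the order the journey model allows.
bad_order = ed[(ed.seen_by_dr < ed.arrival) | (ed.departure < ed.seen_by_dr)]
print(bad_order.patient_id.tolist())  # patients 2 and 3 need data-quality review
```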
Acute stress affects prospective memory functions via associative memory processes.
Szőllősi, Ágnes; Pajkossy, Péter; Demeter, Gyula; Kéri, Szabolcs; Racsmány, Mihály
2018-01-01
Recent findings suggest that acute stress can improve the execution of delayed intentions (prospective memory, PM). However, it is unclear whether this improvement can be explained by altered executive control processes or by altered associative memory functioning. To investigate this issue, we used physical-psychosocial stressors to induce acute stress in laboratory settings. Then participants completed event- and time-based PM tasks requiring the different contribution of control processes and a control task (letter fluency) frequently used to measure executive functions. According to our results, acute stress had no impact on ongoing task performance, time-based PM, and verbal fluency, whereas it enhanced event-based PM as measured by response speed for the prospective cues. Our findings indicate that, here, acute stress did not affect executive control processes. We suggest that stress affected event-based PM via associative memory processes. Copyright © 2017 Elsevier B.V. All rights reserved.
Measurement-based reliability/performability models
NASA Technical Reports Server (NTRS)
Hsueh, Mei-Chen
1987-01-01
Measurement-based models based on real error-data collected on a multiprocessor system are described. Model development from the raw error-data to the estimation of cumulative reward is also described. A workload/reliability model is developed based on low-level error and resource usage data collected on an IBM 3081 system during its normal operation in order to evaluate the resource usage/error/recovery process in a large mainframe system. Thus, both normal and erroneous behavior of the system are modeled. The results provide an understanding of the different types of errors and recovery processes. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A sensitivity analysis is performed to investigate the significance of using a semi-Markov process, as opposed to a Markov process, to model the measured system.
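A minimal Python sketch of why the distinction matters: a semi-Markov process keeps the Markov transition structure between states but allows non-exponential holding times (Weibull here); the states and parameters are illustrative, not the measured IBM 3081 fits.

```python
import numpy as np

rng = np.random.default_rng(0)
states = ["normal", "error", "recovery"]
P = np.array([[0.0, 1.0, 0.0],   # embedded Markov chain between states
              [0.0, 0.0, 1.0],
              [0.9, 0.1, 0.0]])
shapes = {"normal": 0.7, "error": 1.5, "recovery": 1.2}  # illustrative Weibull shapes

def simulate(t_end=1000.0):
    t, s, trace = 0.0, 0, []
    while t < t_end:
        # non-exponential holding time is what makes the process semi-Markov
        dt = rng.weibull(shapes[states[s]])
        trace.append((states[s], dt))
        t += dt
        s = rng.choice(3, p=P[s])
    return trace

occupancy = {}
for state, dt in simulate():
    occupancy[state] = occupancy.get(state, 0.0) + dt
print(occupancy)  # time spent per state under Weibull holding times
```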
2008-01-01
PDA Technical Report No. 14 has been written to provide current best practices, such as application of risk-based decision making, based in sound science to provide a foundation for the validation of column-based chromatography processes and to expand upon information provided in Technical Report No. 42, Process Validation of Protein Manufacturing. The intent of this technical report is to provide an integrated validation life-cycle approach that begins with the use of process development data for the definition of operational parameters as a basis for validation, confirmation, and/or minor adjustment to these parameters at manufacturing scale during production of conformance batches and maintenance of the validated state throughout the product's life cycle.
Diagnostic and prognostic histopathology system using morphometric indices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parvin, Bahram; Chang, Hang; Han, Ju
Determining at least one of a prognosis or a therapy for a patient based on a stained tissue section of the patient. An image of a stained tissue section of a patient is processed by a processing device. A set of features values for a set of cell-based features is extracted from the processed image, and the processed image is associated with a particular cluster of a plurality of clusters based on the set of feature values, where the plurality of clusters is defined with respect to a feature space corresponding to the set of features.
The effect of individually-induced processes on image-based overlay and diffraction-based overlay
NASA Astrophysics Data System (ADS)
Oh, SeungHwa; Lee, Jeongjin; Lee, Seungyoon; Hwang, Chan; Choi, Gilheyun; Kang, Ho-Kyu; Jung, EunSeung
2014-04-01
In this paper, a set of wafers with separate processing was prepared and overlay measurement results were compared using two methods: IBO and DBO. Based on the experimental results, a theoretical approach to the relationship between overlay mark deformation and overlay variation is presented. Moreover, overlay reading simulation was used to verify and predict overlay variation due to deformation of overlay marks caused by the induced processes. This study provides an understanding of the effects of individual processes on overlay measurement error. Additionally, a guideline for selecting the proper overlay measurement scheme for a specific layer is presented.
Conflict monitoring in dual process theories of thinking.
De Neys, Wim; Glumicic, Tamara
2008-03-01
Popular dual process theories have characterized human thinking as an interplay between an intuitive-heuristic and a demanding-analytic reasoning process. Although monitoring the output of the two systems for conflict is crucial to avoid decision-making errors, there are widely different views on the efficiency of the process. Kahneman [Kahneman, D. (2002). Maps of bounded rationality: A perspective on intuitive judgement and choice. Nobel Prize Lecture. Retrieved January 11, 2006, from: http://nobelprize.org/nobel_prizes/economics/laureates/2002/kahnemann-lecture.pdf] and Evans [Evans, J. St. B. T. (1984). Heuristic and analytic processing in reasoning. British Journal of Psychology, 75, 451-468], for example, claim that the monitoring of the heuristic system is typically quite lax, whereas others such as Sloman [Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3-22] and Epstein [Epstein, S. (1994). Integration of the cognitive and psychodynamic unconscious. American Psychologists, 49, 709-724] claim it is flawless and people typically experience a struggle between what they "know" and "feel" in case of a conflict. The present study contrasted these views. Participants solved classic base rate neglect problems while thinking aloud. In these problems, a stereotypical description cues a response that conflicts with the response based on the analytic base rate information. Verbal protocols showed no direct evidence for an explicitly experienced conflict. As Kahneman and Evans predicted, participants hardly ever mentioned the base rates and seemed to base their judgment exclusively on heuristic reasoning. However, more implicit measures of conflict detection such as participants' retrieval of the base rate information in an unannounced recall test, decision making latencies, and the tendency to review the base rates indicated that the base rates had been thoroughly processed. On control problems where base rates and description did not conflict, this was not the case. Results suggest that whereas the popular characterization of conflict detection as an actively experienced struggle can be questioned, there is nevertheless evidence for Sloman's and Epstein's basic claim about the flawless operation of the monitoring. Whenever the base rates and description disagree, people will detect this conflict and consequently redirect attention towards a deeper processing of the base rates. Implications for the dual process framework and the rationality debate are discussed.
ERIC Educational Resources Information Center
Watson, Rachel M.; Willford, John D.; Pfeifer, Mariel A.
2018-01-01
In this study, a problem-based capstone course was designed to assess the University of Wyoming Microbiology Program's skill-based and process-based student learning objectives. Students partnered with a local farm, a community garden, and a free downtown clinic in order to conceptualize, propose, perform, and present studies addressing problems…
ERIC Educational Resources Information Center
Ngo, Chau M.; Trinh, Lap Q.
2011-01-01
The field of English language education has seen developments in writing pedagogy, moving from product-based to process-based and then to genre-based approaches. In Vietnam, teaching secondary school students how to write in English is still lagging behind these growing developments. Product-based approach is commonly seen in English writing…
Reference Model for Project Support Environments Version 1.0
1993-02-28
relationship with the framework’s Process Support services and with the Lifecycle Process Engineering services. Examples: * ORCA (Object-based... Design services. Examples: * ORCA (Object-based Requirements Capture and Analysis). * RETRAC (REquirements TRACeability). 4.3 Life-Cycle Process... "traditional" computer tools. Operations: Examples of audio and video processing operations include: * Create, modify, and delete sound and video data
Khanali, Majid; Mobli, Hossein; Hosseinzadeh-Bandbafha, Homa
2017-12-01
In this study, an artificial neural network (ANN) model was developed for predicting the yield and life cycle environmental impacts based on energy inputs required in processing of black tea, green tea, and oolong tea in Guilan province of Iran. A life cycle assessment (LCA) approach was used to investigate the environmental impact categories of processed tea based on the cradle to gate approach, i.e., from production of input materials using raw materials to the gate of tea processing units, i.e., packaged tea. Thus, all the tea processing operations such as withering, rolling, fermentation, drying, and packaging were considered in the analysis. The initial data were obtained from tea processing units while the required data about the background system was extracted from the EcoInvent 2.2 database. LCA results indicated that diesel fuel and corrugated paper box used in drying and packaging operations, respectively, were the main hotspots. Black tea processing unit caused the highest pollution among the three processing units. Three feed-forward back-propagation ANN models based on Levenberg-Marquardt training algorithm with two hidden layers accompanied by sigmoid activation functions and a linear transfer function in output layer, were applied for three types of processed tea. The neural networks were developed based on energy equivalents of eight different input parameters (energy equivalents of fresh tea leaves, human labor, diesel fuel, electricity, adhesive, carton, corrugated paper box, and transportation) and 11 output parameters (yield, global warming, abiotic depletion, acidification, eutrophication, ozone layer depletion, human toxicity, freshwater aquatic ecotoxicity, marine aquatic ecotoxicity, terrestrial ecotoxicity, and photochemical oxidation). The results showed that the developed ANN models with R² values in the range of 0.878 to 0.990 had excellent performance in predicting all the output variables based on inputs. Energy consumption for processing of green tea, oolong tea, and black tea were calculated as 58,182, 60,947, and 66,301 MJ per ton of dry tea, respectively.
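A structural sketch of such a network in Python with scikit-learn; note that MLPRegressor trains with lbfgs/adam rather than the paper's Levenberg-Marquardt algorithm, and the data below are random placeholders standing in for the eight energy-equivalent inputs and the 11 outputs.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# 8 energy-equivalent inputs -> 11 outputs (yield + 10 impact categories).
rng = np.random.default_rng(0)
X = rng.random((120, 8))   # fresh leaves, labor, diesel, electricity, ...
Y = rng.random((120, 11))  # yield, global warming, acidification, ...

model = MLPRegressor(hidden_layer_sizes=(12, 8),  # two hidden layers
                     activation="logistic",       # sigmoid hidden units
                     solver="lbfgs", max_iter=2000, random_state=0)
model.fit(X, Y)
print(model.score(X, Y))  # R^2 on the training data
```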
Lee, Byung Yang; Seo, Sung Min; Lee, Dong Joon; Lee, Minbaek; Lee, Joohyung; Cheon, Jun-Ho; Cho, Eunju; Lee, Hyunjoong; Chung, In-Young; Park, Young June; Kim, Suhwan; Hong, Seunghun
2010-04-07
We developed a carbon nanotube (CNT)-based biosensor system-on-a-chip (SoC) for the detection of a neurotransmitter. Here, 64 CNT-based sensors were integrated with silicon-based signal processing circuits in a single chip, which was made possible by combining several technological breakthroughs such as efficient signal processing, uniform CNT networks, and biocompatible functionalization of CNT-based sensors. The chip was utilized to detect glutamate, a neurotransmitter, where ammonia, a byproduct of the enzymatic reaction of glutamate and glutamate oxidase on CNT-based sensors, modulated the conductance signals to the CNT-based sensors. This is a major technological advancement in the integration of CNT-based sensors with microelectronics, and this chip can be readily integrated with larger scale lab-on-a-chip (LoC) systems for various applications such as LoC systems for neural networks.
Scalable graphene production: perspectives and challenges of plasma applications
NASA Astrophysics Data System (ADS)
Levchenko, Igor; Ostrikov, Kostya (Ken); Zheng, Jie; Li, Xingguo; Keidar, Michael; B. K. Teo, Kenneth
2016-05-01
Graphene, a newly discovered and extensively investigated material, has many unique and extraordinary properties which promise major technological advances in fields ranging from electronics to mechanical engineering and food production. Unfortunately, complex techniques and high production costs hinder commonplace applications. Scaling of existing graphene production techniques to the industrial level without compromising its properties is a current challenge. This article focuses on the scalability, equipment, and technological perspectives and challenges of the plasma-based techniques, which offer many unique possibilities for the synthesis of graphene and graphene-containing products. The plasma-based processes are amenable to scaling and could also be useful to enhance the controllability of the conventional chemical vapour deposition method and some other techniques, and to ensure a good quality of the produced graphene. We examine the unique features of the plasma-enhanced graphene production approaches, including the techniques based on inductively-coupled and arc discharges, in the context of their potential scaling to mass production following the generic scaling approaches applicable to the existing processes and systems. This work analyses a large amount of the recent literature on graphene production by various techniques and summarizes the results in tabular form to provide a simple and convenient comparison of several available techniques. Our analysis reveals a significant potential of scalability for plasma-based technologies, based on the scaling-related process characteristics. Among other processes, a greater yield of 1 g h⁻¹ m⁻² was reached for the arc discharge technology, whereas the other plasma-based techniques show process yields comparable to the neutral-gas-based methods. Selected plasma-based techniques show lower energy consumption than thermal CVD processes, and the ability to produce graphene flakes of various sizes reaching hundreds of square millimetres, with thickness varying from a monolayer to 10-20 layers. Additional factors such as electrical voltage and current, not available in thermal CVD processes, could potentially lead to better scalability, flexibility and control of the plasma-based processes. Advantages and disadvantages of various systems are also considered.
Case-based medical informatics
Pantazi, Stefan V; Arocha, José F; Moehr, Jochen R
2004-01-01
Background: The "applied" nature distinguishes applied sciences from theoretical sciences. To emphasize this distinction, we begin with a general, meta-level overview of the scientific endeavor. We introduce the knowledge spectrum and four interconnected modalities of knowledge. In addition to the traditional differentiation between implicit and explicit knowledge we outline the concepts of general and individual knowledge. We connect general knowledge with the "frame problem," a fundamental issue of artificial intelligence, and individual knowledge with another important paradigm of artificial intelligence, case-based reasoning, a method of individual knowledge processing that aims at solving new problems based on the solutions to similar past problems. We outline the fundamental differences between Medical Informatics and theoretical sciences and propose that Medical Informatics research should advance individual knowledge processing (case-based reasoning) and that natural language processing research is an important step towards this goal that may have ethical implications for patient-centered health medicine. Discussion: We focus on fundamental aspects of decision-making, which connect human expertise with individual knowledge processing. We continue with a knowledge spectrum perspective on biomedical knowledge and conclude that case-based reasoning is the paradigm that can advance towards personalized healthcare and that can enable the education of patients and providers. We center the discussion on formal methods of knowledge representation around the frame problem. We propose a context-dependent view on the notion of "meaning" and advocate the need for case-based reasoning research and natural language processing. In the context of memory based knowledge processing, pattern recognition, comparison and analogy-making, we conclude that while humans seem to naturally support the case-based reasoning paradigm (memory of past experiences of problem-solving and powerful case matching mechanisms), technical solutions are challenging. Finally, we discuss the major challenges for a technical solution: case record comprehensiveness, organization of information on similarity principles, development of pattern recognition and solving ethical issues. Summary: Medical Informatics is an applied science that should be committed to advancing patient-centered medicine through individual knowledge processing. Case-based reasoning is the technical solution that enables a continuous individual knowledge processing and could be applied providing that challenges and ethical issues arising are addressed appropriately. PMID:15533257
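To make the retrieve-and-reuse cycle of case-based reasoning concrete, here is a toy illustration: retrieve the most similar past case and reuse its solution. The case features, weights, and similarity measure are hypothetical, chosen only to make the idea tangible, not taken from the paper.

```python
# Toy CBR sketch: weighted feature-overlap retrieval (all names invented).
def similarity(case_a, case_b, weights):
    """Weighted overlap between two feature dictionaries (1.0 = identical)."""
    shared = [w for f, w in weights.items() if case_a.get(f) == case_b.get(f)]
    return sum(shared) / sum(weights.values())

case_base = [
    {"age": "adult", "symptom": "cough", "duration": "chronic", "solution": "plan A"},
    {"age": "child", "symptom": "rash",  "duration": "acute",   "solution": "plan B"},
]
weights = {"age": 1.0, "symptom": 2.0, "duration": 1.0}

new_case = {"age": "adult", "symptom": "cough", "duration": "acute"}
best = max(case_base, key=lambda c: similarity(new_case, c, weights))
print("Reused solution:", best["solution"])   # retrieve-and-reuse step of CBR
```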
Li, Kangkang; Yu, Hai; Feron, Paul; Tade, Moses; Wardhaugh, Leigh
2015-08-18
Using a rate-based model, we assessed the technical feasibility and energy performance of an advanced aqueous-ammonia-based postcombustion capture process integrated with a coal-fired power station. The capture process consists of three identical process trains in parallel, each containing a CO2 capture unit, an NH3 recycling unit, a water separation unit, and a CO2 compressor. A sensitivity study of important parameters, such as NH3 concentration, lean CO2 loading, and stripper pressure, was performed to minimize the energy consumption involved in the CO2 capture process. Process modifications of the rich-split process and the interheating process were investigated to further reduce the solvent regeneration energy. The integrated capture system was then evaluated in terms of the mass balance and the energy consumption of each unit. The results show that our advanced ammonia process is technically feasible and energy-competitive, with a low net power-plant efficiency penalty of 7.7%.
Aligning observed and modelled behaviour based on workflow decomposition
NASA Astrophysics Data System (ADS)
Wang, Lu; Du, YuYue; Liu, Wei
2017-09-01
When business processes are mostly supported by information systems, the availability of event logs generated from these systems, as well as the requirement for appropriate process models, is increasing. Business processes can be discovered, monitored and enhanced by extracting process-related information. However, some events cannot be correctly identified because of the explosion in the volume of event logs. Therefore, a new process mining technique based on a workflow decomposition method is proposed in this paper. Petri nets (PNs) are used to describe business processes, and conformance checking of event logs against process models is investigated. A decomposition approach is proposed to divide large process models and event logs into several separate parts that can be analysed independently, while an alignment approach based on a state equation method in PN theory enhances the performance of conformance checking. Both approaches are implemented in the process mining framework ProM. The correctness and effectiveness of the proposed methods are illustrated through experiments.
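A sketch of the Petri-net state equation that underlies alignment approaches of this kind: a marking M' can be reached from M0 only if M' = M0 + C·x has a non-negative integer solution x (a necessary, not sufficient, condition). The two-transition net below is a made-up example, not one from the paper.

```python
# State-equation feasibility check for a toy Petri net (net is invented).
import numpy as np
from scipy.optimize import nnls

C = np.array([[-1,  0],    # place p1: consumed by t1
              [ 1, -1],    # place p2: produced by t1, consumed by t2
              [ 0,  1]])   # place p3: produced by t2
M0 = np.array([1, 0, 0])       # initial marking
M_obs = np.array([0, 0, 1])    # marking suggested by the event log

# Solve C @ x = M_obs - M0 with x >= 0; integrality is checked afterwards.
x, residual = nnls(C.astype(float), (M_obs - M0).astype(float))
feasible = residual < 1e-9 and np.allclose(x, np.round(x))
print("firing counts:", np.round(x), "| log/model consistent:", feasible)
```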
Process-based tolerance assessment of connecting rod machining process
NASA Astrophysics Data System (ADS)
Sharma, G. V. S. S.; Rao, P. Srinivasa; Surendra Babu, B.
2016-06-01
Process tolerancing based on process capability studies is an optimistic and pragmatic approach to determining manufacturing process tolerances. On adopting the define-measure-analyze-improve-control approach, the process potential capability index (Cp) and the process performance capability index (Cpk) values of identified process characteristics of the connecting rod machining process are found to be greater than the industry benchmark of 1.33, i.e., the four sigma level. The tolerance chain diagram methodology is applied to the connecting rod in order to verify the manufacturing process tolerances at various operations of the connecting rod manufacturing process. This paper bridges the gap between the existing dimensional tolerances obtained via tolerance charting and the process capability studies of the connecting rod component. Finally, the process tolerancing comparison has been done using a tolerance capability expert software.
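The two capability indices named above have standard definitions: Cp = (USL − LSL)/(6σ) and Cpk = min(USL − μ, μ − LSL)/(3σ). A minimal sketch, with invented specification limits and measurements:

```python
# Process capability indices; data and spec limits are illustrative only.
import statistics

def capability(samples, lsl, usl):
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    cp = (usl - lsl) / (6 * sigma)                  # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)     # performance capability
    return cp, cpk

samples = [49.98, 50.01, 50.02, 49.99, 50.00, 50.03, 49.97, 50.01]
cp, cpk = capability(samples, lsl=49.90, usl=50.10)
print(f"Cp={cp:.2f}, Cpk={cpk:.2f}, meets 1.33 benchmark: {min(cp, cpk) >= 1.33}")
```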
ERIC Educational Resources Information Center
Chia, Robert
2017-01-01
Purpose: This paper aims to articulate a practice-based, non-cognitivist approach to organizational learning. Design/methodology/approach: This paper explores the potential contribution of a process-based "practice turn" in social theory for understanding organizational learning. Findings: In complex, turbulent environments, robust…
Trace-Based Microanalytic Measurement of Self-Regulated Learning Processes
ERIC Educational Resources Information Center
Siadaty, Melody; Gaševic, Dragan; Hatala, Marek
2016-01-01
To keep pace with today's rapidly growing knowledge-driven society, productive self-regulation of one's learning processes is essential. We introduce and discuss a trace-based measurement protocol to measure the effects of scaffolding interventions on self-regulated learning (SRL) processes. It guides tracing of learners' actions in a learning…
Process Evaluation in Corrections-Based Substance Abuse Treatment.
ERIC Educational Resources Information Center
Wolk, James L.; Hartmann, David J.
1996-01-01
Argues that process evaluation is needed to validate prison-based substance abuse treatment effectiveness. Five groups--inmates, treatment staff, prison staff, prison administration, and the parole board--should be a part of this process evaluation. Discusses these five groups relative to three stages of development of substance abuse treatment in…
The separation and recovery of VOCs from surfactant-containing aqueous solutions by a composite hollow fiber membrane-based pervaporation process has been studied. The process employed hydrophobic microporous polypropylene hollow fibers having a thin plasma polymerized silicon...
Towards an Intelligent Planning Knowledge Base Development Environment
NASA Technical Reports Server (NTRS)
Chien, S.
1994-01-01
This abstract describes work in developing knowledge base editing and debugging tools for the Multimission VICAR Planner (MVP) system. MVP uses artificial intelligence planning techniques to automatically construct executable complex image processing procedures (using models of the smaller constituent image processing steps) in response to image processing requests made to the JPL Multimission Image Processing Laboratory.
Using Data-Based Inquiry and Decision Making To Improve Instruction.
ERIC Educational Resources Information Center
Feldman, Jay; Tung, Rosann
2001-01-01
Discusses a study of six schools using data-based inquiry and decision-making process to improve instruction. Findings identified two conditions to support successful implementation of the process: administrative support, especially in providing teachers learning time, and teacher leadership to encourage and support colleagues to own the process.…
An adaptive management process for forest soil conservation.
Michael P. Curran; Douglas G. Maynard; Ronald L. Heninger; Thomas A. Terry; Steven W. Howes; Douglas M. Stone; Thomas Niemann; Richard E. Miller; Robert F. Powers
2005-01-01
Soil disturbance guidelines should be based on comparable disturbance categories adapted to specific local soil conditions, validated by monitoring and research. Guidelines, standards, and practices should be continually improved based on an adaptive management process, which is presented in this paper. Core components of this process include: reliable monitoring...
Improved Warm-Working Process For An Iron-Base Alloy
NASA Technical Reports Server (NTRS)
Cone, Fred P.; Cryns, Brendan J.; Miller, John A.; Zanoni, Robert
1992-01-01
Warm-working process produces predominantly unrecrystallized grain structure in forgings of iron-base alloy A286 (PWA 1052 composition). Yield strength and ultimate strength increased, and elongation and reduction of area at break decreased. Improved process used on forgings up to 10 in. thick and weighing up to 900 lb.
Advanced process control framework initiative
NASA Astrophysics Data System (ADS)
Hill, Tom; Nettles, Steve
1997-01-01
The semiconductor industry, one of the world's most fiercely competitive industries, is driven by increasingly complex process technologies and global competition to improve cycle time, quality, and process flexibility. Due to the complexity of these problems, current process control techniques are generally nonautomated, time-consuming, reactive, nonadaptive, and focused on individual fabrication tools and processes. As the semiconductor industry moves into higher density processes, radical new approaches are required. To address the need for advanced factory-level process control in this environment, Honeywell, Advanced Micro Devices (AMD), and SEMATECH formed the Advanced Process Control Framework Initiative (APCFI) joint research project. The project defines and demonstrates an Advanced Process Control (APC) approach based on SEMATECH's Computer Integrated Manufacturing (CIM) Framework. Its scope includes the coordination of Manufacturing Execution Systems, process control tools, and wafer fabrication equipment to provide necessary process control capabilities. Moreover, it takes advantage of the CIM Framework to integrate and coordinate applications from other suppliers that provide services necessary for the overall system to function. This presentation discusses the key concept of model-based process control that differentiates the APC Framework. This major improvement over current methods enables new systematic process control by linking the knowledge of key process settings to desired product characteristics that reside in models created with commercial model development tools. The unique framework-based approach facilitates integration of commercial tools and reuse of their data by tying them together in an object-based structure. The presentation also explores the perspective of each organization's involvement in the APCFI project. Each has complementary goals and expertise to contribute: Honeywell represents the supplier viewpoint, AMD represents the user with 'real customer requirements', and SEMATECH provides a consensus-building organization that widely disseminates technology to suppliers and users in the semiconductor industry who face similar equipment and factory control systems challenges.
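One common concrete form of model-based process control in semiconductor manufacturing is a run-to-run EWMA controller, which links a key process setting u to a desired product characteristic y through a simple linear model y = a + b·u and updates the model between runs. The sketch below is illustrative of that general idea, not of the APCFI implementation; the gain, target, and simulated tool response are invented.

```python
# Run-to-run EWMA controller sketch (all numbers illustrative).
import random

b, target, lam = 2.0, 10.0, 0.3    # model gain, desired output, EWMA weight
a_hat = 0.0                        # running estimate of the model intercept

for run in range(8):
    u = (target - a_hat) / b                       # recipe from the model
    y = 1.5 + b * u + random.gauss(0, 0.1)         # actual tool response (offset drift)
    a_hat = lam * (y - b * u) + (1 - lam) * a_hat  # EWMA update of the intercept
    print(f"run {run}: setting={u:.3f} output={y:.3f}")
```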
ERIC Educational Resources Information Center
Moallem, Mahnaz
2001-01-01
Provides an overview of the process of designing and developing a Web-based course using instructional design principles and models, including constructivist and objectivist theories. Explains the process of implementing an instructional design model in designing a Web-based undergraduate course and evaluates the model based on course evaluations.…
ERIC Educational Resources Information Center
Cobos, Pedro L.; Gutiérrez-Cobo, María J.; Morís, Joaquín; Luque, David
2017-01-01
In our study, we tested the hypothesis that feature-based and rule-based generalization involve different types of processes that may affect each other producing different results depending on time constraints and on how generalization is measured. For this purpose, participants in our experiments learned cue-outcome relationships that followed…
Testing Strategies for Model-Based Development
NASA Technical Reports Server (NTRS)
Heimdahl, Mats P. E.; Whalen, Mike; Rajan, Ajitha; Miller, Steven P.
2006-01-01
This report presents an approach for testing artifacts generated in a model-based development process. This approach divides the traditional testing process into two parts: requirements-based testing (validation testing) which determines whether the model implements the high-level requirements and model-based testing (conformance testing) which determines whether the code generated from a model is behaviorally equivalent to the model. The goals of the two processes differ significantly and this report explores suitable testing metrics and automation strategies for each. To support requirements-based testing, we define novel objective requirements coverage metrics similar to existing specification and code coverage metrics. For model-based testing, we briefly describe automation strategies and examine the fault-finding capability of different structural coverage metrics using tests automatically generated from the model.
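The report defines its own novel requirements coverage metrics; the toy fragment below shows only the simplest possible such measure, the fraction of requirements exercised by at least one test, with a hypothetical trace matrix. It is a sketch of the general idea, not the report's metrics.

```python
# Simplest requirements-coverage measure over an invented trace matrix.
requirements = {"R1", "R2", "R3", "R4"}
trace = {                        # which requirements each generated test exercises
    "test_1": {"R1", "R2"},
    "test_2": {"R2"},
    "test_3": {"R4"},
}
covered = set().union(*trace.values())
print(f"requirements coverage: {len(covered & requirements) / len(requirements):.0%}")
print("uncovered:", requirements - covered)
```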
Stochastic simulation by image quilting of process-based geological models
NASA Astrophysics Data System (ADS)
Hoffimann, Júlio; Scheidt, Céline; Barfod, Adrian; Caers, Jef
2017-09-01
Process-based modeling offers a way to represent realistic geological heterogeneity in subsurface models. The main limitation lies in conditioning such models to data. Multiple-point geostatistics can use these process-based models as training images and address the data conditioning problem. In this work, we further develop image quilting as a method for 3D stochastic simulation capable of mimicking the realism of process-based geological models with minimal modeling effort (i.e. parameter tuning) and at the same time condition them to a variety of data. In particular, we develop a new probabilistic data aggregation method for image quilting that bypasses traditional ad-hoc weighting of auxiliary variables. In addition, we propose a novel criterion for template design in image quilting that generalizes the entropy plot for continuous training images. The criterion is based on the new concept of voxel reuse, a stochastic and quilting-aware function of the training image. We compare our proposed method with other established simulation methods on a set of process-based training images of varying complexity, including a real-case example of stochastic simulation of the buried-valley groundwater system in Denmark.
42 CFR 425.112 - Required processes and patient-centeredness criteria.
Code of Federal Regulations, 2013 CFR
2013-10-01
.... (a) General. (1) An ACO must— (i) Promote evidence-based medicine and beneficiary engagement... to accomplish the following: (1) Promote evidence-based medicine. These processes must cover...) Communication of clinical knowledge/evidence-based medicine to beneficiaries in a way that is understandable to...
42 CFR 425.112 - Required processes and patient-centeredness criteria.
Code of Federal Regulations, 2014 CFR
2014-10-01
.... (a) General. (1) An ACO must— (i) Promote evidence-based medicine and beneficiary engagement... to accomplish the following: (1) Promote evidence-based medicine. These processes must cover...) Communication of clinical knowledge/evidence-based medicine to beneficiaries in a way that is understandable to...
42 CFR 425.112 - Required processes and patient-centeredness criteria.
Code of Federal Regulations, 2012 CFR
2012-10-01
.... (a) General. (1) An ACO must— (i) Promote evidence-based medicine and beneficiary engagement... to accomplish the following: (1) Promote evidence-based medicine. These processes must cover...) Communication of clinical knowledge/evidence-based medicine to beneficiaries in a way that is understandable to...
Shelf-stable egg-based products processed by high pressure thermal sterilization
USDA-ARS?s Scientific Manuscript database
Producing a thermally sterilized egg-based product with increased shelf life without losing the sensory and nutritional properties of the freshly prepared product is challenging. Until recently, all commercial shelf-stable egg-based products were sterilized using conventional thermal processing; how...
NASA Astrophysics Data System (ADS)
Acharya, Ranadip; Das, Suman
2015-09-01
This article describes additive manufacturing (AM) of IN100, a high gamma-prime nickel-based superalloy, through scanning laser epitaxy (SLE), aimed at the creation of thick deposits onto like-chemistry substrates for enabling repair of turbine engine hot-section components. SLE is a metal powder bed-based laser AM technology developed for nickel-base superalloys with equiaxed, directionally solidified, and single-crystal microstructural morphologies. Here, we combine process modeling, statistical design-of-experiments (DoE), and microstructural characterization to demonstrate fully metallurgically bonded, crack-free and dense deposits exceeding 1000 μm of SLE-processed IN100 powder onto IN100 cast substrates produced in a single pass. A combined thermal-fluid flow-solidification model of the SLE process complements DoE-based process development. A customized quantitative metallography technique analyzes digital cross-sectional micrographs and extracts various microstructural parameters, enabling process model validation and process parameter optimization. Microindentation measurements show an increase in the hardness by 10 pct in the deposit region compared to the cast substrate due to microstructural refinement. The results illustrate one of the very few successes reported for the crack-free deposition of IN100, a notoriously "non-weldable" hot-section alloy, thus establishing the potential of SLE as an AM method suitable for hot-section component repair and for future new-make components in high gamma-prime containing crack-prone nickel-based superalloys.
A CSP-Based Agent Modeling Framework for the Cougaar Agent-Based Architecture
NASA Technical Reports Server (NTRS)
Gracanin, Denis; Singh, H. Lally; Eltoweissy, Mohamed; Hinchey, Michael G.; Bohner, Shawn A.
2005-01-01
Cognitive Agent Architecture (Cougaar) is a Java-based architecture for large-scale distributed agent-based applications. A Cougaar agent is an autonomous software entity with behaviors that represent a real-world entity (e.g., a business process). A Cougaar-based Model Driven Architecture approach, currently under development, uses a description of system's functionality (requirements) to automatically implement the system in Cougaar. The Communicating Sequential Processes (CSP) formalism is used for the formal validation of the generated system. Two main agent components, a blackboard and a plugin, are modeled as CSP processes. A set of channels represents communications between the blackboard and individual plugins. The blackboard is represented as a CSP process that communicates with every agent in the collection. The developed CSP-based Cougaar modeling framework provides a starting point for a more complete formal verification of the automatically generated Cougaar code. Currently it is used to verify the behavior of an individual agent in terms of CSP properties and to analyze the corresponding Cougaar society.
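A rough CSP-flavoured sketch of the blackboard/plugin structure described above, using Python queues as the channels between one blackboard process and its plugins. The actual work models this in CSP for formal verification; the names and message shapes here are invented to show the communication pattern only.

```python
# Blackboard and plugins communicating over per-plugin channels (all names invented).
import threading, queue

def plugin(name, channel):
    channel.put((name, "publish", f"result from {name}"))   # send over the channel

def blackboard(channels, n_rounds):
    for _ in range(n_rounds):
        for ch in channels:                                 # poll each plugin channel
            try:
                sender, verb, payload = ch.get(timeout=1)
                print(f"blackboard <- {sender}: {verb} {payload}")
            except queue.Empty:
                pass

channels = [queue.Queue() for _ in range(2)]
workers = [threading.Thread(target=plugin, args=(f"plugin{i}", ch))
           for i, ch in enumerate(channels)]
for w in workers:
    w.start()
blackboard(channels, n_rounds=1)
for w in workers:
    w.join()
```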
Model-Driven Useware Engineering
NASA Astrophysics Data System (ADS)
Meixner, Gerrit; Seissler, Marc; Breiner, Kai
User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.
The development of additive manufacturing technique for nickel-base alloys: A review
NASA Astrophysics Data System (ADS)
Zadi-Maad, Ahmad; Basuki, Arif
2018-04-01
Nickel-base alloys are attractive due to their excellent mechanical properties: high resistance to creep deformation, corrosion, and oxidation. However, controlling the performance of this material during casting or forging is a hard task. In recent years, the additive manufacturing (AM) process has been implemented to replace the conventional directional solidification process for the production of nickel-base alloys. Due to its potentially lower cost and flexible manufacturing process, AM is considered a substitute for existing techniques. This paper provides a comprehensive review of previous work related to AM techniques for Ni-base alloys while highlighting current challenges and methods of solving them. The properties of conventionally manufactured Ni-base alloys are also compared with those of AM-fabricated alloys. The mechanical properties obtained from tension, hardness and fatigue tests are included, along with discussions of the effect of post-treatment processes. Recommendations for further work are also provided.
Church-Based Recruitment to Reach Korean Immigrants: An Integrative Review.
Park, Chorong; Jang, Myoungock; Nam, Soohyun; Grey, Margaret; Whittemore, Robin
2017-04-01
Although the Korean church has been frequently used to recruit Korean immigrants in research, little is known about the specific strategies and process. The purpose of this integrative review was to describe recruitment strategies in studies of Korean immigrants and to identify the process of Korean church-based recruitment. Thirty-three studies met inclusion criteria. Four stages of church-based recruitment were identified: initiation, endorsement, advertisement, and implementation. This review identified aspects of the church-based recruitment process in Korean immigrants, which are different from the Black and Hispanic literature, due to their hierarchical culture and language barriers. Getting permission from pastors and announcing the study by pastors at Sunday services were identified as the key components of the process. Using the church newsletter to advertise the study was the most effective strategy for the advertisement stage. Despite several limitations, church-based recruitment is a very feasible and effective way to recruit Korean immigrants.
An Ontology-Based Conceptual Model For Accumulating And Reusing Knowledge In A DMAIC Process
NASA Astrophysics Data System (ADS)
Nguyen, ThanhDat; Kifor, Claudiu Vasile
2015-09-01
DMAIC (Define, Measure, Analyze, Improve, and Control) is an important process used to enhance the quality of processes based on knowledge. However, it is difficult to access DMAIC knowledge. Conventional approaches meet a problem arising from structuring and reusing DMAIC knowledge, mainly because DMAIC knowledge is not represented and organized systematically. In this article, we overcome the problem with a conceptual model that combines the DMAIC process, knowledge management, and ontology engineering. The main idea of our model is to utilize ontologies to represent the knowledge generated by each of the DMAIC phases. We build five different knowledge bases for storing all knowledge of the DMAIC phases with the support of necessary tools and appropriate techniques from the Information Technology area. Consequently, these knowledge bases make knowledge available to experts, managers, and web users during or after DMAIC execution in order to share and reuse existing knowledge.
Implementation of a Web-Based Collaborative Process Planning System
NASA Astrophysics Data System (ADS)
Wang, Huifen; Liu, Tingting; Qiao, Li; Huang, Shuangxi
Under a networked manufacturing environment, all phases of product manufacturing, involving design, process planning, machining and assembly, may be accomplished collaboratively by different enterprises; even different manufacturing stages of the same part may be finished collaboratively by different enterprises. Based on the self-developed networked manufacturing platform eCWS (e-Cooperative Work System), a multi-agent-based system framework for collaborative process planning is proposed. In accordance with the requirements of collaborative process planning, shared resources provided by cooperating enterprises in the course of collaboration are classified into seven classes. A reconfigurable and extendable resource object model is then built. Decision-making strategy is also studied in this paper. Finally, a collaborative process planning system, e-CAPP, is developed and applied. It provides strong support for distributed designers to collaboratively plan and optimize product processes through the network.
Molloy Elreda, Lauren; Coatsworth, J Douglas; Gest, Scott D; Ram, Nilam; Bamberger, Katharine
2016-11-01
Although the majority of evidence-based programs are designed for group delivery, group process and its role in participant outcomes have received little empirical attention. Data were collected from 20 groups of participants (94 early adolescents, 120 parents) enrolled in an efficacy trial of a mindfulness-based adaptation of the Strengthening Families Program (MSFP). Following each weekly session, participants reported on their relations to group members. Social network analysis and methods sensitive to intraindividual variability were integrated to examine weekly covariation between group process and participant progress, and to predict post-intervention outcomes from levels and changes in group process. Results demonstrate hypothesized links between network indices of group process and intervention outcomes and highlight the value of this unique analytic approach to studying intervention group process.
Indirect three-dimensional printing of synthetic polymer scaffold based on thermal molding process.
Park, Jeong Hun; Jung, Jin Woo; Kang, Hyun-Wook; Cho, Dong-Woo
2014-06-01
One of the major issues in tissue engineering has been the development of three-dimensional (3D) scaffolds, which serve as a structural template for cell growth and extracellular matrix formation. In scaffold-based tissue engineering, 3D printing (3DP) technology has been successfully applied for the fabrication of complex 3D scaffolds by using both direct and indirect techniques. In principle, direct 3DP techniques rely on the straightforward utilization of the final scaffold materials during the actual scaffold fabrication process. In contrast, indirect 3DP techniques use a negative mold based on a scaffold design, to which the desired biomaterial is cast and then sacrificed to obtain the final scaffold. Such indirect 3DP techniques generally impose a solvent-based process for scaffold fabrication, resulting in a considerable increase in the fabrication time and poor mechanical properties. In addition, the internal architecture of the resulting scaffold is affected by the properties of the biomaterial solution. In this study, we propose an advanced indirect 3DP technique using projection-based micro-stereolithography and an injection molding system (IMS) in order to address these challenges. The scaffold was fabricated by a thermal molding process using IMS to overcome the limitation of the solvent-based molding process in indirect 3DP techniques. The results indicate that the thermal molding process using an IMS has achieved a substantial reduction in scaffold fabrication time and has also provided the scaffold with higher mechanical modulus and strength. In addition, cell adhesion and proliferation studies have indicated no significant difference in cell activity between the scaffolds prepared by solvent-based and thermal molding processes.
Durvasula, Raghu; Kelly, Janet; Schleyer, Anneliese; Anawalt, Bradley D.; Somani, Shabir; Dellit, Timothy H.
2018-01-01
Background: As healthcare costs rise and reimbursements decrease, healthcare organization leadership and clinical providers must collaborate to provide high-value healthcare. Medications are a key driver of the increasing cost of healthcare, largely as a result of the proliferation of expensive specialty drugs, including biologic agents. Such medications contribute significantly to the inpatient diagnosis-related group payment system, often with minimal or unproved benefit over less-expensive therapies. Objective: To describe a systematic review process to reduce non–evidence-based inpatient use of high-cost medications across a large multihospital academic health system. Methods: We created a Pharmacy & Therapeutics subcommittee consisting of clinicians, pharmacists, and an ethics representative. This committee developed a standardized process for a timely review (<48 hours) and approval of high-cost medications based on their clinical effectiveness, safety, and appropriateness. The engagement of clinical experts in the development of the consensus-based guidelines for the use of specific medications facilitated the clinicians' acceptance of the review process. Results: Over a 2-year period, a total of 85 patient-specific requests underwent formal review. All reviews were conducted within 48 hours. This review process has reduced the non–evidence-based use of specialty medications and has resulted in a pharmacy savings of $491,000 in fiscal year 2016, with almost 80% of the savings occurring in the last 2 quarters, because our process has matured. Conclusion: The creation of a collaborative review process to ensure consistent, evidence-based utilization of high-cost medications provides value-based care, while minimizing unnecessary practice variation and reducing the cost of inpatient care.
The remote sensing image segmentation mean shift algorithm parallel processing based on MapReduce
NASA Astrophysics Data System (ADS)
Chen, Xi; Zhou, Liqing
2015-12-01
With the development of satellite remote sensing technology and the growth of remote sensing image data, traditional remote sensing image segmentation technology cannot meet the massive processing and storage requirements. This article applies cloud computing and parallel computing technology to the remote sensing image segmentation process, building a cheap and efficient computer cluster system that uses parallel processing to realize the mean shift algorithm for remote sensing image segmentation based on the MapReduce model. This not only ensures the quality of remote sensing image segmentation but also improves segmentation speed, better meeting real-time requirements. The parallel mean shift segmentation algorithm based on MapReduce shows clear significance and practical value.
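A minimal single-machine sketch of the MapReduce decomposition described above: a map step partitions the image into tiles, each reduce step runs a greatly simplified (1-D, grayscale) mean-shift iteration on its tile, and the results are collected. A real deployment would run on Hadoop or Spark; this only shows the shape of the computation, and all parameters are invented.

```python
# Map/reduce-shaped mean-shift sketch (tile size, bandwidth, image all invented).
import numpy as np

def map_phase(image, tile_size):
    """Emit (tile_id, tile_pixels) pairs."""
    for i in range(0, image.shape[0], tile_size):
        yield i // tile_size, image[i:i + tile_size].ravel()

def reduce_phase(tile_id, pixels, bandwidth=10.0, iters=5):
    """Shift each pixel value toward its local density mode (simplified mean shift)."""
    x = pixels.astype(float)
    for _ in range(iters):
        w = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * bandwidth ** 2))
        x = (w @ x) / w.sum(axis=1)
    return tile_id, np.round(x)

image = np.random.default_rng(0).integers(0, 255, size=(8, 8))
segmented = dict(reduce_phase(t, p) for t, p in map_phase(image, tile_size=4))
print({t: np.unique(v).size for t, v in segmented.items()})   # modes found per tile
```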
Harbison, K; Kelly, J; Burnell, L; Silva, J
1995-01-01
The Scenario-based Engineering Process (SEP) is a user-focused methodology for large and complex system design. This process supports new application development from requirements analysis with domain models to component selection, design and modification, implementation, integration, and archival placement. It is built upon object-oriented methodologies, domain modeling strategies, and scenario-based techniques to provide an analysis process for mapping application requirements to available components. We are using SEP in the health care applications that we are developing. The process has already achieved success in the manufacturing and military domains and is being adopted by many organizations. SEP should prove viable in any domain containing scenarios that can be decomposed into tasks.
Technical Potential Assessment for the Renewable Energy Zone (REZ) Process: A GIS-Based Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Nathan; Roberts, Billy J
Geographic Information Systems (GIS)-based energy resource and technical potential assessments identify areas capable of supporting high levels of renewable energy (RE) development as part of a Renewable Energy Zone (REZ) Transmission Planning process. This document expands on the REZ Process to aid practitioners in conducting GIS-based RE resource and technical potential assessments. The REZ process is an approach to plan, approve, and build transmission infrastructure that connects REZs - geographic areas that have high-quality RE resources, suitable topography and land-use designations, and demonstrated developer interest - to the power system. The REZ process helps to increase the share of solar photovoltaic (PV), wind, and other resources while also maintaining reliability and economics.
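A rough sketch of the core GIS step in such an assessment: exclude unsuitable raster cells (steep slope, protected land, weak resource), then convert the remaining area to capacity with an assumed power density. All rasters, thresholds, and the power-density figure below are synthetic placeholders, not the report's criteria.

```python
# Raster-mask technical-potential sketch (all data and thresholds invented).
import numpy as np

rng = np.random.default_rng(1)
cell_km2 = 1.0                                   # area of one raster cell
resource = rng.uniform(3, 7, size=(100, 100))    # e.g., kWh/m^2/day solar resource
slope = rng.uniform(0, 30, size=(100, 100))      # percent slope
protected = rng.random((100, 100)) < 0.2         # protected-land mask

developable = (slope <= 5) & ~protected & (resource >= 4.5)
area_km2 = developable.sum() * cell_km2
mw_per_km2 = 30.0                                # assumed PV power density
print(f"developable area: {area_km2:.0f} km^2, "
      f"technical potential: {area_km2 * mw_per_km2 / 1000:.1f} GW")
```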
NASA Astrophysics Data System (ADS)
Okubo, Michinori; Kon, Tomokuni; Abe, Nobuyuki
Dissimilar smart joints are useful. In this research, the weld quality of dissimilar aluminum alloy joints of 3 mm thickness produced by various welding processes and process parameters has been investigated by hardness and tensile tests and by observation of imperfections and microstructure. The base metals used in this study are A1050-H24, A2017-T3, A5083-O, A6061-T6 and A7075-T651. The welding processes used are YAG laser beam, electron beam, metal inert gas arc, tungsten inert gas arc and friction stir welding. The properties of the weld zones are affected by the welding process, welding parameters and combination of base metals. The properties of high-strength aluminum alloy joints are improved by friction stir welding.
An expert systems application to space base data processing
NASA Technical Reports Server (NTRS)
Babb, Stephen M.
1988-01-01
The advent of space vehicles with their increased data requirements is reflected in the complexity of future telemetry systems. Space based operations with their immense operating costs will shift the burden of data processing and routine analysis from the space station to the Orbital Transfer Vehicle (OTV). A research and development project is described which addresses the real time onboard data processing tasks associated with a space based vehicle, specifically focusing on an implementation of an expert system.
Hardware based redundant multi-threading inside a GPU for improved reliability
Sridharan, Vilas; Gurumurthi, Sudhanva
2015-05-05
A system and method for verifying computation output using computer hardware are provided. Instances of computation are generated and processed on hardware-based processors. As instances of computation are processed, each instance of computation receives a load accessible to other instances of computation. Instances of output are generated by processing the instances of computation. The instances of output are verified against each other in a hardware based processor to ensure accuracy of the output.
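A software analogue of the redundant multi-threading idea above: run two instances of the same computation on an identical load and verify the outputs against each other before trusting the result. This is purely illustrative; the patent describes a hardware mechanism inside a GPU, and the kernel below is an invented stand-in.

```python
# Redundant-execution verification sketch (stand-in kernel, software only).
from concurrent.futures import ThreadPoolExecutor

def computation(load):
    return sum(x * x for x in load)          # stand-in for the real kernel

load = list(range(1000))                     # same load visible to both instances
with ThreadPoolExecutor(max_workers=2) as pool:
    out_a, out_b = (f.result() for f in [pool.submit(computation, load),
                                         pool.submit(computation, load)])
assert out_a == out_b, "redundant instances disagree: possible fault"
print("verified output:", out_a)
```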
Acausal measurement-based quantum computing
NASA Astrophysics Data System (ADS)
Morimae, Tomoyuki
2014-07-01
In measurement-based quantum computing, there is a natural "causal cone" among qubits of the resource state, since the measurement angle on a qubit has to depend on previous measurement results in order to correct the effect of by-product operators. If we respect the no-signaling principle, by-product operators cannot be avoided. Here we study the possibility of acausal measurement-based quantum computing by using the process matrix framework [Oreshkov, Costa, and Brukner, Nat. Commun. 3, 1092 (2012), 10.1038/ncomms2076]. We construct a resource process matrix for acausal measurement-based quantum computing restricting local operations to projective measurements. The resource process matrix is an analog of the resource state of the standard causal measurement-based quantum computing. We find that if we restrict local operations to projective measurements the resource process matrix is (up to a normalization factor and trivial ancilla qubits) equivalent to the decorated graph state created from the graph state of the corresponding causal measurement-based quantum computing. We also show that it is possible to consider a causal game whose causal inequality is violated by acausal measurement-based quantum computing.
ERIC Educational Resources Information Center
Kupersmidt, Janis B.; Stelter, Rebecca; Dodge, Kenneth A.
2011-01-01
The purpose of this study was to evaluate the psychometric properties of an audio computer-assisted self-interviewing Web-based software application called the Social Information Processing Application (SIP-AP) that was designed to assess social information processing skills in boys in 3rd through 5th grades. This study included a racially and…
ERIC Educational Resources Information Center
Robinson, Beatrice E.; Galbraith, Jennifer S.; Lund, Sharon M.; Hamilton, Autumn R.; Shankle, Michael D.
2012-01-01
We describe the process of adapting a community-level, evidence-based behavioral intervention (EBI), Community PROMISE, for HIV-positive African American men who have sex with men (AAMSM). The Centers for Disease Control and Prevention (CDC) Map of the Adaptation Process (MAP) guided the adaptation process for this new target population by two…
Dehydration processes using membranes with hydrophobic coating
Huang, Yu; Baker, Richard W; Aldajani, Tiem; Ly, Jennifer
2013-07-30
Processes for removing water from organic compounds, especially polar compounds such as alcohols. The processes include a membrane-based dehydration step, using a membrane that has a dioxole-based polymer selective layer or the like and a hydrophilic selective layer, and can operate even when the stream to be treated has a high water content, such as 10 wt % or more. The processes are particularly useful for dehydrating ethanol.
Janknegt, Robert; Scott, Mike; Mairs, Jill; Timoney, Mark; McElnay, James; Brenninkmeijer, Rob
2007-10-01
Drug selection should be a rational process that embraces the principles of evidence-based medicine. However, many factors may affect the choice of agent. It is against this background that the System of Objectified Judgement Analysis (SOJA) process for rational drug-selection was developed. This article describes how the information on which the SOJA process is based, was researched and processed.
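Criteria-based drug selection methods of this kind typically reduce to a weighted-score decision matrix; the sketch below shows that generic pattern only. The criteria, weights, and scores are entirely fictitious and are not taken from the SOJA method itself, which the article describes in detail.

```python
# Generic weighted-score decision matrix (criteria, weights, scores invented).
criteria_weights = {"efficacy": 0.4, "safety": 0.3, "cost": 0.2, "dosing": 0.1}
drugs = {
    "drug A": {"efficacy": 80, "safety": 70, "cost": 50, "dosing": 90},
    "drug B": {"efficacy": 75, "safety": 85, "cost": 70, "dosing": 60},
}
for name, scores in drugs.items():
    total = sum(criteria_weights[c] * s for c, s in scores.items())
    print(f"{name}: weighted score = {total:.1f}")
```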
Goal selection versus process control in a brain-computer interface based on sensorimotor rhythms.
Royer, Audrey S; He, Bin
2009-02-01
In a brain-computer interface (BCI) utilizing a process control strategy, the signal from the cortex is used to control the fine motor details normally handled by other parts of the brain. In a BCI utilizing a goal selection strategy, the signal from the cortex is used to determine the overall end goal of the user, and the BCI controls the fine motor details. A BCI based on goal selection may be an easier and more natural system than one based on process control. Although goal selection in theory may surpass process control, the two have never been directly compared, as we are reporting here. Eight young healthy human subjects participated in the present study, three trained and five naïve in BCI usage. Scalp-recorded electroencephalograms (EEG) were used to control a computer cursor during five different paradigms. The paradigms were similar in their underlying signal processing and used the same control signal. However, three were based on goal selection, and two on process control. For both the trained and naïve populations, goal selection had more hits per run, was faster, more accurate (for seven out of eight subjects) and had a higher information transfer rate than process control. Goal selection outperformed process control in every measure studied in the present investigation.
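The information transfer rate mentioned above is commonly computed with the standard Wolpaw formula: bits per selection = log2(N) + P·log2(P) + (1−P)·log2((1−P)/(N−1)) for N targets and accuracy P, scaled by the selection rate. The values below are illustrative, not the study's results.

```python
# Wolpaw ITR in bits/min (target counts, accuracies, and rates are examples).
from math import log2

def itr_bits_per_min(n_targets, accuracy, selections_per_min):
    p, n = accuracy, n_targets
    bits = log2(n) + p * log2(p) + (1 - p) * log2((1 - p) / (n - 1))
    return bits * selections_per_min

print(f"goal selection:  {itr_bits_per_min(4, 0.90, 10):.1f} bits/min")
print(f"process control: {itr_bits_per_min(4, 0.75, 10):.1f} bits/min")
```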
Contingency theoretic methodology for agent-based web-oriented manufacturing systems
NASA Astrophysics Data System (ADS)
Durrett, John R.; Burnell, Lisa J.; Priest, John W.
2000-12-01
The development of distributed, agent-based, web-oriented, N-tier Information Systems (IS) must be supported by a design methodology capable of responding to the convergence of shifts in business process design, organizational structure, computing, and telecommunications infrastructures. We introduce a contingency theoretic model for the use of open, ubiquitous software infrastructure in the design of flexible organizational IS. Our basic premise is that developers should change in the way they view the software design process from a view toward the solution of a problem to one of the dynamic creation of teams of software components. We postulate that developing effective, efficient, flexible, component-based distributed software requires reconceptualizing the current development model. The basic concepts of distributed software design are merged with the environment-causes-structure relationship from contingency theory; the task-uncertainty of organizational- information-processing relationships from information processing theory; and the concept of inter-process dependencies from coordination theory. Software processes are considered as employees, groups of processes as software teams, and distributed systems as software organizations. Design techniques already used in the design of flexible business processes and well researched in the domain of the organizational sciences are presented. Guidelines that can be utilized in the creation of component-based distributed software will be discussed.
Consumer psychology: categorization, inferences, affect, and persuasion.
Loken, Barbara
2006-01-01
This chapter reviews research on consumer psychology with emphasis on the topics of categorization, inferences, affect, and persuasion. The chapter reviews theory-based empirical research during the period 1994-2004. Research on categorization includes empirical research on brand categories, goals as organizing frameworks and motivational bases for judgments, and self-based processing. Research on inferences includes numerous types of inferences that are cognitively and/or experienced based. Research on affect includes the effects of mood on processing and cognitive and noncognitive bases for attitudes and intentions. Research on persuasion focuses heavily on the moderating role of elaboration and dual-process models, and includes research on attitude strength responses, advertising responses, and negative versus positive evaluative dimensions.
REPLACEMENT OF HAZARDOUS MATERIAL IN WIDE WEB FLEXOGRAPHIC PRINTING PROCESS
This study examined, on a technical and economic basis, the effect of substituting water-based inks in a flexographic printing process. To reduce volatile organic compound (VOC) emissions by switching from the use of solvent-based inks to water based inks, several equipment modific...
[Preface for special issue on bio-based materials (2016)].
Weng, Yunxuan
2016-06-25
Bio-based materials are new materials or chemicals made from renewable biomass such as grain, legumes, straw, bamboo and wood powder. This class of materials includes bio-based polymers, bio-based fibers, glycotechnology products, bio-based rubber, plastics produced by biomass thermoplastic processing, and basic bio-based chemicals, for instance, bio-alcohols, organic acids, alkanes, and alkenes, obtained by bio-synthesis, bio-processing and bio-refinery. Owing to their environmental friendliness and resource conservation, bio-based materials are becoming a dominant new industry taking the lead in worldwide scientific and technological innovation and economic development. An overview of bio-based materials development is reported in this special issue, and the industrial status and research progress of the following aspects are introduced: bio-based fiber, polyhydroxyalkanoates, biodegradable mulching film, bio-based polyamide, protein-based biomedical materials, bio-based polyurethane, and the modification and processing of poly(lactic acid).
Induced Polarization Influences the Fundamental Forces in DNA Base Flipping
2015-01-01
Base flipping in DNA is an important process involved in genomic repair and epigenetic control of gene expression. The driving forces for these processes are not fully understood, especially in the context of the underlying dynamics of the DNA and solvent effects. We studied double-stranded DNA oligomers that have been previously characterized by imino proton exchange NMR using both additive and polarizable force fields. Our results highlight the importance of induced polarization on the base flipping process, yielding near-quantitative agreement with experimental measurements of the equilibrium between the base-paired and flipped states. Further, these simulations allow us to quantify for the first time the energetic implications of polarization on the flipping pathway. Free energy barriers to base flipping are reduced by changes in dipole moments of both the flipped bases that favor solvation of the bases in the open state and water molecules adjacent to the flipping base. PMID:24976900
Proportional reasoning as a heuristic-based process: time constraint and dual task considerations.
Gillard, Ellen; Van Dooren, Wim; Schaeken, Walter; Verschaffel, Lieven
2009-01-01
The present study interprets the overuse of proportional solution methods from a dual process framework. Dual process theories claim that analytic operations involve time-consuming executive processing, whereas heuristic operations are fast and automatic. In two experiments to test whether proportional reasoning is heuristic-based, the participants solved "proportional" problems, for which proportional solution methods provide correct answers, and "nonproportional" problems known to elicit incorrect answers based on the assumption of proportionality. In Experiment 1, the available solution time was restricted. In Experiment 2, the executive resources were burdened with a secondary task. Both manipulations induced an increase in proportional answers and a decrease in correct answers to nonproportional problems. These results support the hypothesis that the choice for proportional methods is heuristic-based.
Soft sensor modeling based on variable partition ensemble method for nonlinear batch processes
NASA Astrophysics Data System (ADS)
Wang, Li; Chen, Xiangguang; Yang, Kai; Jin, Huaiping
2017-01-01
Batch processes are always characterized by nonlinear and uncertain properties; therefore, a conventional single model may be ill-suited. A local learning strategy soft sensor based on a variable partition ensemble method is developed for quality prediction in nonlinear and non-Gaussian batch processes. A set of input variable sets is obtained by bootstrapping and the PMI criterion. Multiple local GPR models are then developed, one for each local input variable set. When new test data arrive, the posterior probability of each best-performing local model is estimated based on Bayesian inference and used to combine the local GPR models into the final prediction result. The proposed soft sensor is demonstrated by application to an industrial fed-batch chlortetracycline fermentation process.
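A condensed sketch of the ensemble idea above: several local GPR models, each trained on its own input-variable subset, are combined by weighting each model's prediction with its Gaussian predictive likelihood at a reference point, a simple stand-in for the paper's Bayesian posterior weighting. The data, subsets, and kernel settings are synthetic.

```python
# Local-GPR ensemble with likelihood-based weights (all data synthetic).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)
X = rng.random((60, 4))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + 0.05 * rng.standard_normal(60)

subsets = [[0, 1], [0, 2], [1, 3]]             # bootstrapped input-variable sets
models = [GaussianProcessRegressor(alpha=1e-3).fit(X[:, s], y) for s in subsets]

# Weight each local model by its predictive density at a recent reference point.
x_ref, y_ref = X[-1:], y[-1]
w = []
for m, s in zip(models, subsets):
    mu, sd = m.predict(x_ref[:, s], return_std=True)
    w.append(norm.pdf(y_ref, mu[0], sd[0]))
w = np.array(w) / sum(w)

x_new = rng.random((1, 4))
preds = np.array([m.predict(x_new[:, s])[0] for m, s in zip(models, subsets)])
print("combined prediction:", float(w @ preds))
```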
Qiu, Jinshu; Li, Kim; Miller, Karen; Raghani, Anil
2015-01-01
The purpose of this article is to recommend a risk-based strategy for determining the clearance testing requirements of the process reagents used in manufacturing biopharmaceutical products. The strategy takes account of four risk factors. First, the process reagents are classified into two categories according to their safety profile and history of use: generally recognized as safe (GRAS) and potential safety concern (PSC) reagents. The clearance testing of GRAS reagents can be eliminated because of their historically safe use and the process capability to remove these reagents. An estimated safety margin (Se) value, the ratio of the exposure limit to the estimated maximum reagent amount, is then used to evaluate the necessity of testing the PSC reagents at an early development stage. The Se value is calculated from two risk factors: the starting PSC reagent amount per maximum product dose (Me) and the exposure limit (Le). A worst-case scenario is assumed to estimate the Me value: the PSC reagent of interest is co-purified with the product and no clearance occurs throughout the entire purification process. No clearance testing is required for a PSC reagent if its Se value is ≥1; otherwise clearance testing is needed. Finally, the point at which the process reagent is introduced to the process is also considered in determining the necessity of clearance testing. How to use the measured safety margin as a criterion for determining PSC reagent testing at the process characterization, process validation, and commercial production stages is also described. A large number of process reagents are used in biopharmaceutical manufacturing to control process performance. Clearance testing for all of the process reagents would be an enormous analytical task. In this article, a risk-based strategy is described to eliminate unnecessary clearance testing for the majority of the process reagents using four risk factors: (i) the safety profile of the reagents, (ii) the starting amount of the process reagents used in the manufacturing process, (iii) the maximum dose of the product, and (iv) the point of introduction of the process reagents in the process. Implementation of the risk-based strategy can eliminate clearance testing for approximately 90% of the process reagents used in the manufacturing processes. This science-based strategy allows us to ensure patient safety and meet regulatory agency expectations throughout the product development life cycle. © PDA, Inc. 2015.
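The screening rule above reduces to a one-line calculation: Se = Le/Me under the zero-clearance worst case, with testing waived when Se ≥ 1. A direct transcription, with invented reagent amounts and limits:

```python
# Se screening rule from the abstract (reagent names and numbers invented).
def needs_clearance_testing(le_ug_per_dose, me_ug_per_dose):
    se = le_ug_per_dose / me_ug_per_dose
    return se, se < 1            # Se >= 1 -> no clearance testing required

for reagent, le, me in [("PSC reagent 1", 500.0, 120.0),
                        ("PSC reagent 2", 50.0, 300.0)]:
    se, test = needs_clearance_testing(le, me)
    print(f"{reagent}: Se = {se:.2f} -> {'test' if test else 'no testing needed'}")
```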
Chang'E-3 data pre-processing system based on scientific workflow
NASA Astrophysics Data System (ADS)
tan, xu; liu, jianjun; wang, yuanyuan; yan, wei; zhang, xiaoxia; li, chunlai
2016-04-01
The Chang'E-3 (CE3) mission has obtained a huge amount of lunar scientific data. Data pre-processing is an important segment of the CE3 ground research and application system. With a dramatic increase in the demand for data research and application, a Chang'E-3 data pre-processing system (CEDPS) based on scientific workflow is proposed, with the aim of making scientists more flexible and productive by automating data-driven processing. The system should allow the planning, conduct and control of the data processing procedure, with the following capabilities: (1) describing a data processing task, including defining the input and output data, the data relationships, the sequence of tasks, the communication between tasks, the mathematical formulas, and the relationships between tasks and data; and (2) automatic processing of tasks. Accordingly, task description is the key to the system's flexibility. We design a workflow designer, a visual environment for capturing processes as workflows, and discuss its three-level model: (1) data relationships are established through a product tree; (2) the process model is constructed as a directed acyclic graph (DAG), in which a set of workflow constructs, including Sequence, Loop, Merge and Fork, are composable with one another; (3) to reduce the complexity of modeling mathematical formulas with the DAG, semantic modeling based on MathML is adopted. On top of that, we present how the CE3 data are processed with CEDPS.
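To make the DAG execution model concrete, here is a minimal sketch of topologically ordered task execution using Python's standard graphlib; the task names and the dependency graph are invented, and Loop constructs are omitted.

```python
# Sketch of DAG-based workflow execution of the kind CEDPS describes:
# Sequence, Fork and Merge arise from the edge structure alone.
from graphlib import TopologicalSorter  # Python 3.9+

# task -> set of tasks it depends on (all names hypothetical)
workflow = {
    "calibrate":  set(),
    "geo_locate": {"calibrate"},              # Sequence
    "denoise":    {"calibrate"},              # Fork: parallel branch
    "mosaic":     {"geo_locate", "denoise"},  # Merge
}

def run_task(name):
    print(f"running {name}")                  # real processing goes here

for task in TopologicalSorter(workflow).static_order():
    run_task(task)
```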
Testing Theories of Recognition Memory by Predicting Performance Across Paradigms
ERIC Educational Resources Information Center
Smith, David G.; Duncan, Matthew J. J.
2004-01-01
Signal-detection theory (SDT) accounts of recognition judgments depend on the assumption that recognition decisions result from a single familiarity-based process. However, fits of a hybrid SDT model, called dual-process theory (DPT), have provided evidence for the existence of a second, recollection-based process. In 2 experiments, the authors…
USDA-ARS?s Scientific Manuscript database
The polyol process is a widely used strategy for producing nanoparticles from various reducible metallic precursors; however it requires a bulk polyol liquid reaction with additional protective agents at high temperatures. Here, we report a water-based binary polyol process using low concentrations ...
Social Workers' Orientation toward the Evidence-Based Practice Process: A Dutch Survey
ERIC Educational Resources Information Center
van der Zwet, Renske J. M.; Kolmer, Deirdre M. Beneken genaamd; Schalk, René
2016-01-01
Objectives: This study assesses social workers' orientation toward the evidence-based practice (EBP) process and explores which specific variables (e.g. age) are associated. Methods: Data were collected from 341 Dutch social workers through an online survey which included a Dutch translation of the EBP Process Assessment Scale (EBPPAS), along with…
ERIC Educational Resources Information Center
Zhang, Xiaolei; Wong, Jocelyn L. N.
2018-01-01
Studies of professional development have examined the influence of school-based approaches on in-service teacher learning and change but have seldom investigated teachers' job-embedded learning processes. This paper explores the dynamic processes of teacher learning in school-based settings. A qualitative comparative case study based on the…
Problem Based Learning and the scientific process
NASA Astrophysics Data System (ADS)
Schuchardt, Daniel Shaner
This research project was developed to inspire students to constructively use problem-based learning and the scientific process to learn middle school science content. The student population in this study consisted of male and female seventh grade students. Students were presented with authentic problems connected to physical and chemical properties of matter. The intent of the study was to have students use the scientific process of examining existing knowledge, generating learning issues or questions about the problems, and then developing a course of action to research and design experiments to model resolutions to the authentic problems. It was expected that students would improve their ability to actively engage with others in a problem-solving process to achieve a deeper understanding of Michigan's 7th Grade Level Content Expectations, the Next Generation Science Standards, and the scientific process. Problem-based learning was statistically effective for students' learning of the scientific process: students showed statistically significant improvement from pretest to posttest scores. The teaching method of problem-based learning was effective for seventh grade science students at Dowagiac Middle School.
Ethanol precipitation for purification of recombinant antibodies.
Tscheliessnig, Anne; Satzer, Peter; Hammerschmidt, Nikolaus; Schulz, Henk; Helk, Bernhard; Jungbauer, Alois
2014-10-20
Currently, the gold standard for the purification of recombinant humanized antibodies (rhAbs) from CHO cell culture is protein A chromatography. However, due to increasing rhAb titers, alternative methods have come into focus. A new strategy for purification of recombinant human antibodies from CHO cell culture supernatant, based on cold ethanol precipitation (CEP) and CaCl2 precipitation, has been developed. This method builds on cold ethanol precipitation, the process used for purification of antibodies and other components from blood plasma. We demonstrate the applicability of the developed process for four different antibodies, obtaining yield and purity similar to a protein A chromatography-based process. The process can be further improved by using anion-exchange chromatography in flow-through mode (e.g., a monolith) as a last step, so that residual host cell protein is reduced to a minimum. Besides the ethanol-based process, our data also suggest that ethanol could be replaced with methanol or isopropanol. The process is suited for continuous operation.
Managing Analysis Models in the Design Process
NASA Technical Reports Server (NTRS)
Briggs, Clark
2006-01-01
Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.
Davis, Tyler; Love, Bradley C.; Preston, Alison R.
2012-01-01
Category learning is a complex phenomenon that engages multiple cognitive processes, many of which occur simultaneously and unfold dynamically over time. For example, as people encounter objects in the world, they simultaneously engage processes to determine their fit with current knowledge structures, gather new information about the objects, and adjust their representations to support behavior in future encounters. Many techniques that are available to understand the neural basis of category learning assume that the multiple processes that subserve it can be neatly separated between different trials of an experiment. Model-based functional magnetic resonance imaging offers a promising tool to separate multiple, simultaneously occurring processes and bring the analysis of neuroimaging data more in line with category learning’s dynamic and multifaceted nature. We use model-based imaging to explore the neural basis of recognition and entropy signals in the medial temporal lobe and striatum that are engaged while participants learn to categorize novel stimuli. Consistent with theories suggesting a role for the anterior hippocampus and ventral striatum in motivated learning in response to uncertainty, we find that activation in both regions correlates with a model-based measure of entropy. Simultaneously, separate subregions of the hippocampus and striatum exhibit activation correlated with a model-based recognition strength measure. Our results suggest that model-based analyses are exceptionally useful for extracting information about cognitive processes from neuroimaging data. Models provide a basis for identifying the multiple neural processes that contribute to behavior, and neuroimaging data can provide a powerful test bed for constraining and testing model predictions. PMID:22746951
Bosse, Stefan
2015-01-01
Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks in simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material-integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code storing the control and the data state of an agent, which is novel. The program code can be modified by the agent itself using code morphing techniques and is capable of migrating in the network between nodes. The program code is a self-contained unit (a container) and embeds the agent data, the initialization instructions and the ATG behavior implementation. The microchip agent processing platform used for the execution of the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to a small-sized agent program code, low system complexity and high system performance. The agent processing is token-queue-based, similar to Petri-nets. The agent platform can be implemented in software, too, offering compatibility at the operational and code level, supporting agent processing in strong heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level by using agent-based simulation techniques. PMID:25690550
Technology for the product and process data base
NASA Technical Reports Server (NTRS)
Barnes, R. D.
1984-01-01
The computerized product and process data base is increasingly recognized to be the cornerstone component of an overall system aimed at the integrated automation of the industrial processes of a given company or enterprise. The technology needed to support these more effective computer integrated design and manufacturing methods, especially the concept of 3-D computer-sensible product definitions rather than engineering drawings, is not fully available and rationalized. Progress is being made, however, in bridging this technology gap with concentration on the modeling of sophisticated information and data structures, high-performance interactive user interfaces and comprehensive tools for managing the resulting computerized product definition and process data base.
Design of virtual simulation experiment based on key events
NASA Astrophysics Data System (ADS)
Zhong, Zheng; Zhou, Dongbo; Song, Lingxiu
2018-06-01
Considering the complex content of and lack of guidance in virtual simulation experiments, key event technology from VR narrative theory was introduced to enhance the fidelity and vividness of the experimental process. Based on VR narrative technology, an event transition structure was designed to meet the needs of the experimental operation process, and an interactive event processing model was used to generate key events in the interactive scene. The experiment "margin value of bees foraging", based on biological morphology, was taken as an example; many objects, behaviors and other contents were reorganized. The result shows that this method can enhance the user's experience and ensure that the experimental process is complete and effective.
Light Weight Biomorphous Cellular Ceramics from Cellulose Templates
NASA Technical Reports Server (NTRS)
Singh, Mrityunjay; Yee, Bo-Moon; Gray, Hugh R. (Technical Monitor)
2003-01-01
Biomorphous ceramics are a new class of materials that can be fabricated from cellulose templates derived from natural biopolymers. These biopolymers are abundantly available in nature and are produced by photosynthesis. Wood-cellulose-derived carbon templates have three-dimensional interconnectivity. A wide variety of non-oxide and oxide based ceramics have been fabricated by template conversion using infiltration and reaction-based processes. The cellular anatomy of the cellulose templates plays a key role in determining the processing parameters (pyrolysis, infiltration conditions, etc.) and the resulting ceramic materials. The processing approach, microstructure, and mechanical properties of the biomorphous cellular ceramics (silicon carbide and oxide based) are discussed.
Kim, Youngmi; Mosier, Nathan; Ladisch, Michael R
2008-08-01
Distillers' grains (DG), a co-product of the dry grind ethanol process, are an excellent source of supplemental proteins in livestock feed. Studies have shown that, due to their high polymeric sugar content and ease of hydrolysis, distillers' grains have potential as an additional source of fermentable sugars for ethanol fermentation. The benefit of processing the distillers' grains to extract fermentable sugars lies in an increased ethanol yield without significant modification of the current dry grind technology. Three potential configurations of process alternatives, in which pretreated and hydrolyzed distillers' grains are recycled for an enhanced overall ethanol yield, are proposed and discussed in this paper based on liquid hot water (LHW) pretreatment of distillers' grains. Possible limitations of each proposed process are also discussed. This paper presents a compositional analysis of distillers' grains, as well as a simulation of the modified dry grind processes with recycle of distillers' grains. Simulated material balances for the modified dry grind processes are established based on the base case assumptions. These balances are compared to the conventional dry grind process in terms of ethanol yield, composition of co-products, and accumulation of fermentation inhibitors. Results show that a 14% higher ethanol yield is achievable by processing and hydrolyzing the distillers' grains for additional fermentable sugars, compared to the conventional dry grind process. Accumulation of fermentation by-products and inhibitory components in the proposed process is predicted to be 2-5 times higher than in the conventional dry grind process. The impact of fermentation inhibitors is reviewed and discussed. The final eDDGS (enhanced dried distillers' grains) from the modified processes has 30-40% greater protein content per mass than DDGS, and its potential as a value-added product is also analyzed. While the case studies used to illustrate the process simulation are based on LHW-pretreated DG, the simulation itself provides a framework for evaluating the impact of other pretreatments.
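The yield comparison rests on straightforward material-balance bookkeeping. A toy sketch of that bookkeeping, using the stoichiometric factors 1.11 kg glucose per kg glucan and 0.511 kg ethanol per kg glucose, with otherwise invented stream values and efficiencies:

```python
# Hypothetical illustration of comparing ethanol yield between a
# conventional dry-grind process and one that recycles hydrolyzed
# distillers' grains. All stream values below are invented.
def ethanol_yield(starch_kg, dg_glucan_kg=0.0, hydrolysis_eff=0.0):
    """Simplified: glucan -> glucose at 1.11 kg/kg, glucose -> ethanol
    at the stoichiometric 0.511 kg/kg, with an assumed 92% fermentation
    efficiency."""
    glucose = 1.11 * (starch_kg + hydrolysis_eff * dg_glucan_kg)
    return 0.511 * glucose * 0.92

base = ethanol_yield(starch_kg=620.0)          # per tonne corn (invented)
recycled = ethanol_yield(starch_kg=620.0, dg_glucan_kg=110.0,
                         hydrolysis_eff=0.75)
print(f"relative yield gain: {(recycled / base - 1) * 100:.1f}%")
```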
Process Evaluation for a Prison-based Substance Abuse Program.
ERIC Educational Resources Information Center
Staton, Michele; Leukefeld, Carl; Logan, T. K.; Purvis, Rick
2000-01-01
Presents findings from a process evaluation conducted in a prison-based substance abuse program in Kentucky. Discusses key components in the program, including a detailed program description, modifications in planned treatment strategies, program documentation, and perspectives of staff and clients. Findings suggest that prison-based programs have…
A new class of advanced oxidation processes (AOPs) based on sulfate radicals is being tested for the degradation of polycyclic aromatic hydrocarbons (PAHs) and polychlorinated biphenyls (PCBs) in aqueous solution. These AOPs are based on the generation of sulfate radicals through...
Polychlorinated biphenyls (PCBs) in the environment pose long-term risk to public health because of their persistent and toxic nature. This study investigates the degradation of PCBs using sulfate radical-based advanced oxidation processes (SR-AOPs). These processes are based o...
An Overview of Computer-Based Natural Language Processing.
ERIC Educational Resources Information Center
Gevarter, William B.
Computer-based Natural Language Processing (NLP) is the key to enabling humans and their computer-based creations to interact with machines using natural languages (English, Japanese, German, etc.) rather than formal computer languages. NLP is a major research area in the fields of artificial intelligence and computational linguistics. Commercial…
Performance-Based Assessment: An Alternative Assessment Process for Young Gifted Children.
ERIC Educational Resources Information Center
Hafenstein, Norma Lu; Tucker, Brooke
Performance-based assessment provides an alternative identification method for young gifted children. A performance-based identification process was developed and implemented to select three-, four-, and five-year-old children for inclusion in a school for gifted children. Literature regarding child development, characteristics of young gifted…
Community Leadership through Community-Based Programming: The Role of the Community College.
ERIC Educational Resources Information Center
Boone, Edgar J.; And Others
Organized around 15 tasks involved in the community-based programming (CBP) process, this book provides practical, field-tested guidance on successfully implementing CBP in community colleges. Following prefatory materials, the following chapters are provided: (1) "An Introduction to the Community-Based Programming Process" (Edgar J.…
Dynamic Approaches to Language Processing
ERIC Educational Resources Information Center
Srinivasan, Narayanan
2007-01-01
Symbolic rule-based approaches have been a preferred way to study language and cognition. Dissatisfaction with rule-based approaches in the 1980s led to alternative approaches to the study of language, the most notable being the dynamic approaches to language processing. Dynamic approaches provide a significant alternative by not being rule-based and…
ERIC Educational Resources Information Center
Cantor, Jeffrey A.
This paper describes a formative/summative process for educational program evaluation, which is appropriate for higher education programs and is based on M. Provus' Discrepancy Evaluation Model and the principles of instructional design. The Discrepancy Based Methodology for Educational Program Evaluation facilitates systematic and detailed…
Reasoning in explanation-based decision making.
Pennington, N; Hastie, R
1993-01-01
A general theory of explanation-based decision making is outlined and the multiple roles of inference processes in the theory are indicated. A typology of formal and informal inference forms, originally proposed by Collins (1978a, 1978b), is introduced as an appropriate framework to represent inferences that occur in the overarching explanation-based process. Results from the analysis of verbal reports of decision processes are presented to demonstrate the centrality and systematic character of reasoning in a representative legal decision-making task.
García-Peñalvo, Francisco J.; Pérez-Blanco, Jonás Samuel; Martín-Suárez, Ana
2014-01-01
This paper discusses how cloud-based architectures can extend and enhance the functionality of the training environments based on virtual worlds and how, from this cloud perspective, we can provide support to analysis of training processes in the area of health, specifically in the field of training processes in quality assurance for pharmaceutical laboratories, presenting a tool for data retrieval and analysis that allows facing the knowledge discovery in the happenings inside the virtual worlds. PMID:24778593
NASA Astrophysics Data System (ADS)
Wan, Chang Jin; Zhu, Li Qiang; Zhou, Ju Mei; Shi, Yi; Wan, Qing
2013-10-01
In neuroscience, signal processing, memory and learning function are established in the brain by modifying ionic fluxes in neurons and synapses. Emulation of memory and learning behaviors of biological systems by nanoscale ionic/electronic devices is highly desirable for building neuromorphic systems or even artificial neural networks. Here, novel artificial synapses based on junctionless oxide-based protonic/electronic hybrid transistors gated by nanogranular phosphorus-doped SiO2-based proton-conducting films are fabricated on glass substrates by a room-temperature process. Short-term memory (STM) and long-term memory (LTM) are mimicked by tuning the pulse gate voltage amplitude. The LTM process in such an artificial synapse is due to the proton-related interfacial electrochemical reaction. Our results are highly desirable for building future neuromorphic systems or even artificial networks via electronic elements.
The Effect of Ultrasonic Additive Manufacturing on Integrated Printed Electronic Conductors
NASA Astrophysics Data System (ADS)
Bournias-Varotsis, Alkaios; Wang, Shanda; Hutt, David; Engstrøm, Daniel S.
2018-07-01
Ultrasonic additive manufacturing (UAM) is a low temperature manufacturing method capable of embedding printed electronics in metal components. The effect of UAM processing on the resistivity of conductive tracks printed with five different conductive pastes based on silver, copper or carbon flakes/particles in either a thermoplastic or thermoset filler binder is investigated. For all but the carbon-based paste, the resistivity changed linearly with the UAM energy input. After UAM processing, a resistivity increase of more than 150 times was recorded for the copper-based thermoset paste. The silver-based pastes showed a resistivity increase of between 1.1 and 50 times from their initial values. The carbon-based paste showed no change in resistivity after UAM processing. Focussed ion beam microstructure analysis of the printed conductive tracks before and after UAM processing showed that the silver particles and flakes in at least one of the pastes partly dislodged from their thermoset filler, creating voids and thereby increasing the resistivity, whereas the silver flakes in a thermoplastic filler did not dislodge, due to material flow of the polymer binder. The lowest resistivity (8 × 10⁻⁵ Ω cm) after UAM processing was achieved for a thermoplastic paste with silver flakes at low UAM processing energy.
Reconfigurable environmentally adaptive computing
NASA Technical Reports Server (NTRS)
Coxe, Robin L. (Inventor); Galica, Gary E. (Inventor)
2008-01-01
Described are methods and apparatus, including computer program products, for reconfigurable environmentally adaptive computing technology. An environmental signal representative of an external environmental condition is received. A processing configuration is automatically selected, based on the environmental signal, from a plurality of processing configurations. A reconfigurable processing element is reconfigured to operate according to the selected processing configuration. In some examples, the environmental condition is detected and the environmental signal is generated based on the detected condition.
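As a concrete reading of that selection step, a minimal sketch in Python; the environmental signal (temperature), the thresholds, and the configuration table are invented placeholders, not the patent's embodiment.

```python
# Sketch: pick a processing configuration from a table based on an
# environmental signal. Names, thresholds and fields are hypothetical.
from dataclasses import dataclass

@dataclass
class Config:
    name: str
    clock_mhz: int

CONFIGS = [
    (lambda temp_c: temp_c > 70.0, Config("low-power", 100)),
    (lambda temp_c: temp_c > 40.0, Config("balanced", 400)),
    (lambda temp_c: True,          Config("performance", 800)),  # default
]

def select_configuration(temp_c: float) -> Config:
    """Return the first configuration whose condition matches the signal."""
    for condition, config in CONFIGS:
        if condition(temp_c):
            return config

print(select_configuration(55.0))   # -> Config(name='balanced', ...)
```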
Low-SWaP coincidence processing for Geiger-mode LIDAR video
NASA Astrophysics Data System (ADS)
Schultz, Steven E.; Cervino, Noel P.; Kurtz, Zachary D.; Brown, Myron Z.
2015-05-01
Photon-counting Geiger-mode lidar detector arrays provide a promising approach for producing three-dimensional (3D) video at full motion video (FMV) data rates, resolution, and image size from long ranges. However, coincidence processing required to filter raw photon counts is computationally expensive, generally requiring significant size, weight, and power (SWaP) and also time. In this paper, we describe a laboratory test-bed developed to assess the feasibility of low-SWaP, real-time processing for 3D FMV based on Geiger-mode lidar. First, we examine a design based on field programmable gate arrays (FPGA) and demonstrate proof-of-concept results. Then we examine a design based on a first-of-its-kind embedded graphical processing unit (GPU) and compare performance with the FPGA. Results indicate feasibility of real-time Geiger-mode lidar processing for 3D FMV and also suggest utility for real-time onboard processing for mapping lidar systems.
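A toy illustration of the coincidence idea, assuming nothing about the authors' FPGA or GPU pipelines: detections that repeat in the same range bin across frames are kept, while uncorrelated noise counts are suppressed. Bin counts, rates and the threshold are invented.

```python
# Toy coincidence filter for photon-counting lidar returns.
import numpy as np

rng = np.random.default_rng(1)
n_frames, n_bins = 20, 512
counts = rng.random((n_frames, n_bins)) < 0.01       # noise detections
counts[:, 200] |= rng.random(n_frames) < 0.6         # true surface return

def coincidence_filter(frames: np.ndarray, threshold: int) -> np.ndarray:
    """Return range bins whose detection count across frames >= threshold."""
    return np.flatnonzero(frames.sum(axis=0) >= threshold)

print(coincidence_filter(counts, threshold=6))       # usually only bin 200
```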
Enhancing Users' Participation in Business Process Modeling through Ontology-Based Training
NASA Astrophysics Data System (ADS)
Macris, A.; Malamateniou, F.; Vassilacopoulos, G.
Successful business process design requires the active participation of users who are familiar with organizational activities and business process modelling concepts. Hence, there is a need to provide users with reusable, flexible, agile and adaptable training material in order to enable them to instil their knowledge and expertise into business process design and automation activities. Knowledge reusability is of paramount importance in designing training material on process modelling, since it enables users to participate actively in process design/redesign activities stimulated by the changing business environment. This paper presents a prototype approach for the design and use of training material that provides significant advantages to both the designer (knowledge-content reusability and semantic web enablement) and the user (semantic search, knowledge navigation and knowledge dissemination). The approach is based on externalizing domain knowledge in the form of ontology-based knowledge networks (i.e. training scenarios serving specific training needs) so that it is made reusable.
Integrated controls design optimization
Lou, Xinsheng; Neuschaefer, Carl H.
2015-09-01
A control system (207) for optimizing a chemical looping process of a power plant includes an optimizer (420), an income algorithm (230), a cost algorithm (225) and chemical looping process models. The process models are used to predict the process outputs from process input variables. Some of the process input and output variables are related to the income of the plant, and others are related to the cost of the plant operations. The income algorithm (230) provides an income input to the optimizer (420) based on a plurality of input parameters (215) of the power plant. The cost algorithm (225) provides a cost input to the optimizer (420) based on a plurality of output parameters (220) of the power plant. The optimizer (420) determines an optimized operating parameter solution based on at least one of the income input and the cost input, and supplies the optimized operating parameter solution to the power plant.
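A minimal sketch of the optimizer structure described above, trading an income model against a cost model over a single operating parameter; the model functions, coefficients and bounds are invented, and scipy stands in for the plant's process models.

```python
# Sketch: maximize income(u) - cost(u) over one operating parameter u.
# Both models below are invented placeholders.
from scipy.optimize import minimize_scalar

def income(u):   # e.g., revenue rising with load, with diminishing returns
    return 120.0 * u - 0.4 * u ** 2

def cost(u):     # e.g., fuel and sorbent cost rising with load
    return 35.0 * u + 0.9 * u ** 2

result = minimize_scalar(lambda u: -(income(u) - cost(u)),
                         bounds=(0.0, 100.0), method="bounded")
print(f"optimized operating parameter: {result.x:.1f}")
```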
Technology development for lunar base water recycling
NASA Technical Reports Server (NTRS)
Schultz, John R.; Sauer, Richard L.
1992-01-01
This paper will review previous and ongoing work in aerospace water recycling and identify research activities required to support development of a lunar base. The development of a water recycle system for use in the life support systems envisioned for a lunar base will require considerable research work. A review of previous work on aerospace water recycle systems indicates that more efficient physical and chemical processes are needed to reduce expendable and power requirements. Development work on biological processes that can be applied to microgravity and lunar environments also needs to be initiated. Biological processes are inherently more efficient than physical and chemical processes and may be used to minimize resupply and waste disposal requirements. Processes for recovering and recycling nutrients such as nitrogen, phosphorus, and sulfur also need to be developed to support plant growth units. The development of efficient water quality monitors to be used for process control and environmental monitoring also needs to be initiated.
Development of Replacements for Phoscoating Used in Forging, Extrusion and Metal Forming Processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerry Barnett
2003-03-01
Many forging, extrusion, heading and other metal forming processes use graphite-based lubricants, phosphate coatings, and other potentially hazardous or harmful substances to improve the tribology of the metal forming process. The application of phosphate-based coatings has long been studied to determine if other synthetic "clean" lubricants could provide the same degree of protection afforded by phoscoatings and their formulations. So far, none meets the cost and performance objectives provided by phoscoatings as a general aid to the metal forming industry. Inasmuch as phoscoatings and graphite have replaced lead-based lubricants, the metal forming industry has had previous experience with a legislated requirement to change processes. However, without a proactive approach to phoscoating replacement, many metal forming processes could find themselves without a cost-effective tribology material necessary for the metal forming process.
Signal Processing Studies of a Simulated Laser Doppler Velocimetry-Based Acoustic Sensor
1990-10-17
investigated using spectral correlation methods. Results indicate that it may be possible to extend demonstrated LDV-based acoustic sensor sensitivities using higher order processing techniques. (Author)
NASA Astrophysics Data System (ADS)
Bianco, S.; Jones, J. A.; Gosnell, H.
2017-12-01
Process-based restoration, a new approach to river and floodplain management, is being implemented on federal lands across Oregon. These management efforts are aimed at promoting key physical processes in order to improve river ecological function, create diverse habitat, and increase biological productivity for ESA-listed bull trout and spring Chinook salmon. Although the practice is being disseminated across the Pacific Northwest, it remains unclear what is driving aquatic and riparian ecosystem restoration towards this process-based approach and away from form-based methods such as Rosgen's Natural Channel Design. The technical aspects of process-based restoration have been described in the literature (e.g., Beechie et al. 2010), but little is known about the practice from a social science perspective, and few case studies exist to assess the impact of these efforts. We combine semi-structured qualitative interviews with management experts and photogrammetric analysis to better understand how complex social processes and changing ideas about aquatic ecosystems are manifesting on the ground in federal land management. This study characterizes process-based river and floodplain restoration projects on federal lands in Oregon, and identifies catalysts and barriers to its implementation. The Deer Creek Floodplain Enhancement project serves as a case study for photogrammetric analysis. To characterize long-term changes at Deer Creek, geomorphic features were mapped and classified using orthoimage mosaics developed from a time series of historic aerial photographs dating back to 1954. 3D Digital Elevation Models (3D-DEMs) were created of portions of the modified sections of Deer Creek and its floodplain immediately before and after restoration using drone-captured aerial photography and a photogrammetric technique called Structure from Motion. These 3D-DEMs have enabled extraction of first-order geomorphic variables to compare pre- and post-project conditions. This study improves understanding of the historic range of conditions at Deer Creek, and assesses how process-based restoration activities drive short-term changes in geomorphic features, which can in turn influence complex riverine processes such as energy dissipation and sediment deposition.
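A small sketch of the DEM-differencing step, assuming two co-registered elevation rasters represented as NumPy arrays; the grids, the deposition patch, and the cell size are invented, and real data would be loaded from GeoTIFFs (e.g., with rasterio).

```python
# Sketch: DEM of difference (DoD) between pre- and post-restoration
# surfaces, of the kind derived from Structure-from-Motion models.
import numpy as np

rng = np.random.default_rng(2)
dem_pre = rng.normal(100.0, 2.0, size=(50, 50))   # elevations, m
dem_post = dem_pre.copy()
dem_post[20:30, 10:40] += 0.5                     # hypothetical deposition

dod = dem_post - dem_pre                          # DEM of difference
cell_area = 1.0 * 1.0                             # 1 m grid, m^2
net_volume = dod.sum() * cell_area
print(f"net elevation change volume: {net_volume:.1f} m^3")
```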
Schmidt, Marvin; Ullrich, Johannes; Wieczorek, André; Frenzel, Jan; Eggeler, Gunther; Schütze, Andreas; Seelecke, Stefan
2016-01-01
Shape Memory Alloys (SMA) using elastocaloric cooling processes have the potential to be an environmentally friendly alternative to the conventional vapor compression based cooling process. Nickel-Titanium (Ni-Ti) based alloy systems, especially, show large elastocaloric effects. Furthermore, they exhibit large latent heats, which is a necessary material property for the development of an efficient solid-state based cooling process. A scientific test rig has been designed to investigate these processes and the elastocaloric effects in SMAs. The realized test rig enables independent control of an SMA's mechanical loading and unloading cycles, as well as conductive heat transfer between SMA cooling elements and a heat source/sink. The test rig is equipped with a comprehensive monitoring system capable of synchronized measurements of mechanical and thermal parameters. In addition to determining the process-dependent mechanical work, the system also enables measurement of the thermal caloric aspects of the elastocaloric cooling effect through use of a high-performance infrared camera. This combination is of particular interest because it allows illustration of localization and rate effects, both important for efficient heat transfer from the medium to be cooled. The work presented describes an experimental method to identify elastocaloric material properties in different materials and sample geometries. Furthermore, the test rig is used to investigate different cooling process variations. The introduced analysis methods enable a differentiated consideration of material, process and related boundary condition influences on the process efficiency. The comparison of the experimental data with the simulation results (of a thermomechanically coupled finite element model) allows for better understanding of the underlying physics of the elastocaloric effect. In addition, the experimental results, as well as the findings based on the simulation results, are used to improve the material properties. PMID:27168093
Multi-layered reasoning by means of conceptual fuzzy sets
NASA Technical Reports Server (NTRS)
Takagi, Tomohiro; Imura, Atsushi; Ushida, Hirohide; Yamaguchi, Toru
1993-01-01
The real world consists of a very large number of instances of events and continuous numeric values. On the other hand, people represent and process their knowledge in terms of abstracted concepts derived from generalization of these instances and numeric values. Logic based paradigms for knowledge representation use symbolic processing both for concept representation and inference. Their underlying assumption is that a concept can be defined precisely. However, as this assumption hardly holds for natural concepts, it follows that symbolic processing cannot deal with such concepts. Thus symbolic processing has essential problems from a practical point of view of applications in the real world. In contrast, fuzzy set theory can be viewed as a stronger and more practical notation than formal, logic based theories because it supports both symbolic processing and numeric processing, connecting the logic based world and the real world. In this paper, we propose multi-layered reasoning by using conceptual fuzzy sets (CFS). The general characteristics of CFS are discussed along with upper layer supervision and context dependent processing.
Tadapaneni, Ravi Kiran; Banaszewski, Katarzyna; Patazca, Eduardo; Edirisinghe, Indika; Cappozzo, Jack; Jackson, Lauren; Burton-Freeman, Britt
2012-06-13
The present study investigated processing strategies and matrix effects on the antioxidant capacity (AC) and polyphenols (PP) content of fruit-based beverages: (1) strawberry powder (Str) + dairy, D-Str; (2) Str + water, ND-Str; (3) dairy + no Str, D-NStr. Beverages were subjected to high-temperature-short-time (HTST) and high-pressure processing (HPP). AC and PP were measured before and after processing and after a 5 week shelf-life study. Unprocessed D-Str had significantly lower AC compared to unprocessed ND-Str. Significant reductions in AC were apparent in HTST- compared to HPP-processed beverages (up to 600 MPa). PP content was significantly reduced in D-Str compared to ND-Str and in response to HPP and HTST in all beverages. After storage (5 weeks), AC and PP were reduced in all beverages compared to unprocessed and week 0 processed beverages. These findings indicate potentially negative effects of milk and processing on AC and PP of fruit-based beverages.
Li, Kangkang; Yu, Hai; Tade, Moses; Feron, Paul; Yu, Jingwen; Wang, Shujuan
2014-06-17
An advanced NH3 abatement and recycling process that makes full use of the waste heat in flue gas was proposed to solve the problems of ammonia slip, NH3 makeup, and flue gas cooling in the ammonia-based CO2 capture process. The rigorous rate-based model, RateFrac in Aspen Plus, was validated thermodynamically against experimental data from the open literature and kinetically against CSIRO pilot trials at Munmorah Power Station, Australia. After a thorough sensitivity analysis and process improvement, the NH3 recycling efficiency reached as high as 99.87%, and the NH3 exhaust concentration was only 15.4 ppmv. Most importantly, the energy consumption of the NH3 abatement and recycling system was only 59.34 kJ/kg CO2 of electricity. The evaluation of the mass balance and temperature stability shows that this NH3 recovery process is technically effective and feasible. This process is therefore a promising prospect for industrial application.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Shiaoguo
A novel Gas Pressurized Stripping (GPS) post-combustion carbon capture (PCC) process has been developed by Carbon Capture Scientific, LLC, CONSOL Energy Inc., Nexant Inc., and Western Kentucky University in this bench-scale project. The GPS-based process presents a unique approach that uses a gas pressurized technology for CO₂ stripping at an elevated pressure to overcome the energy use and other disadvantages associated with the benchmark monoethanolamine (MEA) process. The project was aimed at performing laboratory- and bench-scale experiments to prove its technical feasibility and generate process engineering and scale-up data, and conducting a techno-economic analysis (TEA) to demonstrate its energy use and cost competitiveness over the MEA process. To meet project goals and objectives, a combination of experimental work, process simulation, and technical and economic analysis studies were applied. The project conducted individual unit lab-scale tests for major process components, including a first absorption column, a GPS column, a second absorption column, and a flasher. Computer simulations were carried out to study the GPS column behavior under different operating conditions, to optimize the column design and operation, and to optimize the GPS process for an existing and a new power plant. The vapor-liquid equilibrium data under high loading and high temperature for the selected amines were also measured. The thermal and oxidative stability of the selected solvents were also tested experimentally and presented. A bench-scale column-based unit capable of achieving at least 90% CO₂ capture from a nominal 500 SLPM coal-derived flue gas slipstream was designed and built. This integrated, continuous, skid-mounted GPS system was tested using real flue gas from a coal-fired boiler at the National Carbon Capture Center (NCCC). The technical challenges of the GPS technology in stability, corrosion, and foaming of selected solvents, and environmental, health and safety risks have been addressed through experimental tests, consultation with vendors and engineering analysis. Multiple rounds of TEA were performed to improve the GPS-based PCC process design and operation, and to compare the energy use and cost performance of a nominal 550-MWe supercritical pulverized coal (PC) plant among the DOE/NETL report Case 11 (the PC plant without CO₂ capture), the DOE/NETL report Case 12 (the PC plant with benchmark MEA-based PCC), and the PC plant using GPS-based PCC. The results reveal that the net power produced in the PC plant with GPS-based PCC is 647 MWe, greater than that of the Case 12 (550 MWe). The 20-year LCOE for the PC plant with GPS-based PCC is 97.4 mills/kWh, or 152% of that of the Case 11, which is also 23% less than that of the Case 12. These results demonstrate that the GPS-based PCC process is energy-efficient and cost-effective compared with the benchmark MEA process.
Beyond Depression: Towards a Process-Based Approach to Research, Diagnosis, and Treatment.
Forgeard, Marie J C; Haigh, Emily A P; Beck, Aaron T; Davidson, Richard J; Henn, Fritz A; Maier, Steven F; Mayberg, Helen S; Seligman, Martin E P
2011-12-01
Despite decades of research on the etiology and treatment of depression, a significant proportion of the population is affected by the disorder, fails to respond to treatment and is plagued by relapse. Six prominent scientists, Aaron Beck, Richard Davidson, Fritz Henn, Steven Maier, Helen Mayberg, and Martin Seligman, gathered to discuss the current state of scientific knowledge on depression, and in particular on the basic neurobiological and psychopathological processes at play in the disorder. These general themes were addressed: 1) the relevance of learned helplessness as a basic process involved in the development of depression; 2) the limitations of our current taxonomy of psychological disorders; 3) the need to work towards a psychobiological process-based taxonomy; and 4) the clinical implications of implementing such a process-based taxonomy.
Enzyme-based processing of soybean carbohydrate: Recent developments and future prospects.
Al Loman, Abdullah; Ju, Lu-Kwang
2017-11-01
Soybean is well known for its high-value oil and protein. Carbohydrate is, however, an underutilized major component, representing almost 26-30% (w/w) of the dried bean. The complex soybean carbohydrate is not easily hydrolyzable and can cause indigestibility when included in food and feed. Enzymes can be used to hydrolyze the carbohydrate for improving soybean processing and the value of soybean products. Here the enzyme-based processing developed for the following purposes is reviewed: hydrolysis of different carbohydrate-rich by/products from soybean processing, improvement of soybean oil extraction, and increase of the nutritional value of soybean-based food and animal feed. Once hydrolyzed into fermentable sugars, soybean carbohydrate can find more value-added applications and further improve the overall economics of soybean processing.
Development of solution-processed nanowire composites for opto-electronics
Ginley, David S.; Aggarwal, Shruti; Singh, Rajiv; ...
2016-12-20
Here, silver nanowire-based contacts represent one of the major new directions in transparent contacts for opto-electronic devices, with the added advantage that they can have Indium-Tin-Oxide-like properties at substantially reduced processing temperatures and without the use of vacuum-based processing. However, nanowires alone often do not adhere well to the substrate or other film interfaces, even after a relatively high-temperature anneal, and unencapsulated nanowires show environmental degradation at high temperature and humidity. Here we report on the development of ZnO/Ag-nanowire composites that have sheet resistance below 10 Ω/sq and >90% transmittance from a solution-based process with process temperatures below 200 °C. These films have significant applications potential in photovoltaics and displays.
CropEx Web-Based Agricultural Monitoring and Decision Support
NASA Technical Reports Server (NTRS)
Harvey. Craig; Lawhead, Joel
2011-01-01
CropEx is a Web-based agricultural Decision Support System (DSS) that monitors changes in crop health over time. It is designed to be used by a wide range of public and private organizations, including individual producers and regional government offices with a vested interest in tracking vegetation health. The data management system automatically retrieves and ingests data for the area of interest, and a database stores the results of processing and supports the DSS. The processing engine allows server-side analysis of imagery, with support for image sub-setting and a set of core raster operations for image classification, creation of vegetation indices, and change detection. The system includes the Web-based (CropEx) interface, the data ingestion system, the server-side processing engine, and a database processing engine. The Web-based interface has multi-tiered security profiles for multiple users and provides the ability to identify areas of interest to specific users, user profiles, and methods of processing and data types for selected or created areas of interest. A compilation of programs is used to ingest available data into the system, classify the data, profile the data for quality, and make the data available to the processing engine immediately upon its availability to the system (near real time). The processing engine consists of methods and algorithms used to process the data in real-time fashion without copying, storing, or moving the raw data; it makes results available to the database processing engine for storage and further manipulation. The database processing engine ingests data from the image processing engine, distills the results into numerical indices, and stores each index for an area of interest. This process happens each time new data are ingested and processed for the area of interest, and upon subsequent database entries the database processing engine qualifies each value for each area of interest and conducts logical processing of results, indicating when and where thresholds are exceeded. Reports are provided at regular, operator-determined intervals that include variances from thresholds and links to view raw data for verification, if necessary. The technology and method of development allow the code base to be easily modified for varied use in real-time and near-real-time processing environments. In addition, the final product will be demonstrated as a means for rapid draft assessment of imagery.
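As one concrete example of what such a processing engine computes, a minimal NDVI-and-threshold sketch; the band values and the threshold are invented, and the real system would operate on ingested imagery.

```python
# Sketch: compute NDVI from red and near-infrared bands and flag an
# area of interest whose mean falls below a threshold.
import numpy as np

red = np.array([[0.10, 0.12], [0.11, 0.30]])   # toy red-band reflectance
nir = np.array([[0.55, 0.50], [0.52, 0.33]])   # toy NIR-band reflectance

ndvi = (nir - red) / (nir + red + 1e-9)        # avoid division by zero
aoi_mean = float(ndvi.mean())
THRESHOLD = 0.4                                # operator-determined
if aoi_mean < THRESHOLD:
    print(f"alert: mean NDVI {aoi_mean:.2f} below {THRESHOLD}")
else:
    print(f"ok: mean NDVI {aoi_mean:.2f}")
```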
Peri, Tuvia; Gofman, Mordechai; Tal, Shahar; Tuval-Mashiach, Rivka
2015-01-01
Exposure to the trauma memory is the common denominator of most evidence-based interventions for posttraumatic stress disorder (PTSD). Although exposure-based therapies aim to change associative learning networks and negative cognitions related to the trauma memory, emotional interactions between patient and therapist have not been thoroughly considered in past evaluations of exposure-based therapy. This work focuses on recent discoveries of the mirror-neuron system and the theory of embodied simulation (ES). These conceptualizations may add a new perspective to our understanding of change processes in exposure-based treatments for PTSD patients. It is proposed that during exposure to trauma memories, emotional responses of the patient are transferred to the therapist through ES and then mirrored back to the patient in a modulated way. This process helps to alleviate the patient's sense of loneliness and enhances his or her ability to exert control over painful, trauma-related emotional responses. ES processes may enhance the integration of clinical insights originating in psychoanalytic theories—such as holding, containment, projective identification, and emotional attunement—with cognitive behavioral theories of learning processes in the alleviation of painful emotional responses aroused by trauma memories. These processes are demonstrated through a clinical vignette from an exposure-based therapy with a trauma survivor. Possible clinical implications for the importance of face-to-face relationships during exposure-based therapy are discussed. PMID:26593097
Itô and Stratonovich integrals on compound renewal processes: the normal/Poisson case
NASA Astrophysics Data System (ADS)
Germano, Guido; Politi, Mauro; Scalas, Enrico; Schilling, René L.
2010-06-01
Continuous-time random walks, or compound renewal processes, are pure-jump stochastic processes with several applications in insurance, finance, economics and physics. Based on heuristic considerations, a definition is given for stochastic integrals driven by continuous-time random walks, which includes the Itô and Stratonovich cases. It is then shown how the definition can be used to compute these two stochastic integrals by means of Monte Carlo simulations. Our example is based on the normal compound Poisson process, which in the diffusive limit converges to the Wiener process.
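As a worked illustration of the paper's Monte Carlo idea, the sketch below computes the Itô (left-point) and Stratonovich (midpoint) integrals of f(X) = X driven by a normal compound Poisson process. The discretisation conventions follow the usual definitions; the specific parameter values are arbitrary.

```python
# Monte Carlo sketch of Ito vs Stratonovich integrals of X dX on a normal
# compound Poisson process; the path-by-path identities below serve as checks.
import numpy as np

rng = np.random.default_rng(0)
lam, T, n_paths = 5.0, 1.0, 20000

ito_vals, strat_vals, check = [], [], []
for _ in range(n_paths):
    n = rng.poisson(lam * T)             # number of jumps in [0, T]
    jumps = rng.normal(0.0, 1.0, n)      # normally distributed jump sizes
    x_pre = np.concatenate(([0.0], np.cumsum(jumps)[:-1])) if n else np.array([])
    x_post = x_pre + jumps
    ito_vals.append(np.sum(x_pre * jumps))                     # left-point rule
    strat_vals.append(np.sum(0.5 * (x_pre + x_post) * jumps))  # midpoint rule
    check.append(0.5 * x_post[-1] ** 2 if n else 0.0)          # X_T^2 / 2

# Stratonovich integral of X dX equals X_T^2 / 2 path by path; the Ito
# version differs by half the sum of squared jumps (~ lam*T/2 on average).
print(np.mean(strat_vals) - np.mean(check))     # ~0
print(np.mean(strat_vals) - np.mean(ito_vals))  # ~ lam*T/2 = 2.5
```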
Process Based on SysML for New Launchers System and Software Developments
NASA Astrophysics Data System (ADS)
Hiron, Emmanuel; Miramont, Philippe
2010-08-01
The purpose of this paper is to present the Astrium-ST engineering process based on SysML. This process is currently being set up in the frame of joint CNES/Astrium-ST R&T studies related to the Ariane 5 electrical system and flight software modelling. The tool used to set up this process is Rhapsody release 7.3 from IBM [1]. The process focuses on the system engineering phase dedicated to software, with the objective of generating both system documents (sequential system design and flight control) and software specifications.
Social Models: Blueprints or Processes?
ERIC Educational Resources Information Center
Little, Graham R.
1981-01-01
Discusses the nature and implications of two different models for societal planning: (1) the problem-solving process approach based on Karl Popper; and (2) the goal-setting "blueprint" approach based on Karl Marx. (DC)
Cultural adaptation process for international dissemination of the strengthening families program.
Kumpfer, Karol L; Pinyuchon, Methinin; Teixeira de Melo, Ana; Whiteside, Henry O
2008-06-01
The Strengthening Families Program (SFP) is an evidence-based family skills training intervention developed and found efficacious for substance abuse prevention by U.S. researchers in the 1980s. In the 1990s, a cultural adaptation process was developed to transport SFP for effectiveness trials with diverse populations (African, Hispanic, Asian, Pacific Islander, and Native American). Since 2003, SFP has been culturally adapted for use in 17 countries. This article reviews the SFP theory and research and a recommended cultural adaptation process. Challenges in the international dissemination of evidence-based programs (EBPs) are discussed based on the results of U.N. and U.S. governmental initiatives to transport EBP family interventions to developing countries. The technology transfer and quality assurance system are described, including the language translation and cultural adaptation process for materials development, staff training, and on-site and online Web-based supervision, technical assistance, and evaluation services to assure quality implementation and process evaluation feedback for improvements.
Murphy, Enda; King, Eoin A
2016-08-15
The strategic noise mapping process of the EU has now been ongoing for more than ten years. However, despite the fact that a significant volume of research has been conducted on the process and related issues, there has been little change or innovation in how relevant authorities and policymakers conduct the process since its inception. This paper reports on research undertaken to assess the possibility of integrating smartphone-based noise mapping data into the traditional strategic noise mapping process. We compare maps generated using the traditional approach with those generated using smartphone-based measurement data. The advantage of the latter approach is that it has the potential to remove the need for exhaustive input data to the source calculation model for noise prediction. In addition, the study also tests the accuracy of smartphone-based measurements against simultaneous measurements taken using traditional sound level meters in the field. Copyright © 2016 Elsevier B.V. All rights reserved.
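For readers unfamiliar with how individual smartphone samples become map values, the following sketch energy-averages geotagged level samples into an equivalent continuous level (Leq) for one map cell. The grid cell and sample values are invented; the paper's actual aggregation pipeline may differ.

```python
# Minimal sketch of aggregating smartphone sound-level samples into Leq
# for one noise-map grid cell; data and cell size are illustrative.
import numpy as np

def leq(levels_db: np.ndarray) -> float:
    """Energy-average a set of A-weighted levels (dB) into Leq."""
    return 10.0 * np.log10(np.mean(10.0 ** (levels_db / 10.0)))

# Samples geotagged to one hypothetical 10 m x 10 m cell of the noise map:
samples = np.array([58.2, 61.5, 64.0, 59.8, 70.1])
print(round(leq(samples), 1))  # dominated by the loudest events
```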
Chronic Motivational State Interacts with Task Reward Structure in Dynamic Decision-Making
Cooper, Jessica A.; Worthy, Darrell A.; Maddox, W. Todd
2015-01-01
Research distinguishes between a habitual, model-free system motivated toward immediately rewarding actions, and a goal-directed, model-based system motivated toward actions that improve future state. We examined the balance of processing in these two systems during state-based decision-making. We tested a regulatory fit hypothesis (Maddox & Markman, 2010) that predicts that global trait motivation affects the balance of habitual- vs. goal-directed processing but only through its interaction with the task framing as gain-maximization or loss-minimization. We found support for the hypothesis that a match between an individual’s chronic motivational state and the task framing enhances goal-directed processing, and thus state-based decision-making. Specifically, chronic promotion-focused individuals under gain-maximization and chronic prevention-focused individuals under loss-minimization both showed enhanced state-based decision-making. Computational modeling indicates that individuals in a match between global chronic motivational state and local task reward structure engaged more goal-directed processing, whereas those in a mismatch engaged more habitual processing. PMID:26520256
Kawakami, Tomoya; Fujita, Naotaka; Yoshihisa, Tomoki; Tsukamoto, Masahiko
2014-01-01
In recent years, sensors have become popular, and Home Energy Management Systems (HEMS) play an important role in saving energy without decreasing QoL (Quality of Life). Many rule-based HEMSs have been proposed, and almost all of them assume "IF-THEN" rules. The Rete algorithm is a typical pattern-matching algorithm for IF-THEN rules. We have previously proposed a rule-based Home Energy Management System (HEMS) using the Rete algorithm. In the proposed system, rules for managing energy are processed by smart taps in the network, so the loads for processing rules and collecting data are distributed across the smart taps. In addition, the number of processes and the amount of collected data are reduced by processing rules based on the Rete algorithm. In this paper, we evaluate the proposed system by simulation. In the simulation environment, rules are processed by the smart tap that relates to the action part of each rule. In addition, we implemented the proposed system as a HEMS using smart taps.
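The key benefit of the Rete algorithm, evaluating each shared condition once and reusing the result across all rules that reference it, can be sketched in a few lines. The rules, condition names, and fact fields below are invented for illustration and are not the paper's rule base.

```python
# Toy sketch of the Rete idea: conditions are tested once per update
# ("alpha memory") and shared, instead of re-testing every rule from scratch.
conditions = {
    "occupied":  lambda f: f["presence"] == 1,
    "hot":       lambda f: f["temp_c"] > 27.0,
    "high_load": lambda f: f["power_w"] > 1500,
}
rules = [
    ({"occupied", "hot"},       "turn_on_fan"),
    ({"high_load"},             "shed_low_priority_load"),
    ({"occupied", "high_load"}, "notify_user"),
]

def match(facts: dict) -> list:
    # Each shared condition is evaluated exactly once per fact update.
    alpha = {name: test(facts) for name, test in conditions.items()}
    return [action for needed, action in rules if all(alpha[c] for c in needed)]

print(match({"presence": 1, "temp_c": 29.0, "power_w": 1800}))
# ['turn_on_fan', 'shed_low_priority_load', 'notify_user']
```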
MIRADS-2 Implementation Manual
NASA Technical Reports Server (NTRS)
1975-01-01
The Marshall Information Retrieval and Display System (MIRADS), a data base management system designed to provide the user with a set of generalized file capabilities, is presented. The system provides a wide variety of ways to process the contents of the data base and includes capabilities to search, sort, compute, update, and display the data. The process of creating, defining, and loading a data base is generally called the loading process. The steps in the loading process, which include (1) structuring, (2) creating, (3) defining, and (4) implementing the data base for use by MIRADS, are defined. The execution of several computer programs is required to successfully complete all steps of the loading process. The MIRADS library must be established as a cataloged mass storage file as the first step in MIRADS implementation. The procedure for establishing the MIRADS library is given. The system is currently operational for the UNIVAC 1108 computer system utilizing the Executive Operating System. All procedures relate to the use of MIRADS on the U-1108 computer.
Model-Based PAT for Quality Management in Pharmaceuticals Freeze-Drying: State of the Art
Fissore, Davide
2017-01-01
Model-based process analytical technologies can be used for the in-line control and optimization of a pharmaceuticals freeze-drying process, as well as for the off-line design of the process, i.e., the identification of the optimal operating conditions. This paper aims at presenting the state of the art in this field, focusing, particularly, on three groups of systems, namely, those based on the temperature measurement (i.e., the soft sensor), on the chamber pressure measurement (i.e., the systems based on the test of pressure rise and of pressure decrease), and on the sublimation flux estimate (i.e., the tunable diode laser absorption spectroscopy and the valveless monitoring system). The application of these systems for in-line process optimization (e.g., using a model predictive control algorithm) and to get a true quality by design (e.g., through the off-line calculation of the design space of the process) is presented and discussed. PMID:28224123
Neuro-estimator based GMC control of a batch reactive distillation.
Prakash, K J Jithin; Patle, Dipesh S; Jana, Amiya K
2011-07-01
In this paper, an artificial neural network (ANN)-based nonlinear control algorithm is proposed for a simulated batch reactive distillation (RD) column. In the homogeneously catalyzed reactive process, an esterification reaction takes place for the production of ethyl acetate. The fundamental model has been derived incorporating the reaction term in the model structure of the nonreactive distillation process. The process operation is simulated at the startup phase under total reflux conditions. The open-loop process dynamics is also addressed running the batch process at the production phase under partial reflux conditions. In this study, a neuro-estimator based generic model controller (GMC), which consists of an ANN-based state predictor and the GMC law, has been synthesized. Finally, this proposed control law has been tested on the representative batch reactive distillation comparing with a gain-scheduled proportional integral (GSPI) controller and with its ideal performance (ideal GMC). Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
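The GMC law itself is compact: for a first-order model dy/dt = f(y) + g(y)u, the input is chosen so that the closed loop follows dy/dt = K1·e + K2·∫e dt. The sketch below shows this model inversion on a toy linear process; the paper's reactive distillation model and ANN state predictor are replaced by stand-ins, so treat it as a schematic of the control law only.

```python
# Hedged sketch of the generic model control (GMC) law for dy/dt = f(y) + g(y)*u.
def make_gmc(f, g, k1, k2, dt):
    state = {"integral": 0.0}
    def control(y, y_sp):
        e = y_sp - y
        state["integral"] += e * dt
        # Enforce dy/dt = k1*e + k2*integral(e) by inverting the model:
        return (k1 * e + k2 * state["integral"] - f(y)) / g(y)
    return control

# Toy stand-in process: dy/dt = -0.5*y + 2*u
f = lambda y: -0.5 * y
g = lambda y: 2.0
gmc = make_gmc(f, g, k1=2.0, k2=1.0, dt=0.01)

y = 0.0
for _ in range(1000):            # closed-loop simulation toward y_sp = 1
    u = gmc(y, 1.0)
    y += (f(y) + g(y) * u) * 0.01
print(round(y, 3))               # approaches 1.0
```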
Building a Knowledge to Action Program in Stroke Rehabilitation.
Janzen, Shannon; McIntyre, Amanda; Richardson, Marina; Britt, Eileen; Teasell, Robert
2016-09-01
The knowledge to action (KTA) process proposed by Graham et al (2006) is a framework to facilitate the development and application of research evidence into clinical practice. The KTA process consists of the knowledge creation cycle and the action cycle. The Evidence Based Review of Stroke Rehabilitation is a foundational part of the knowledge creation cycle and has helped guide the development of best practice recommendations in stroke. The Rehabilitation Knowledge to Action Project is an audit-feedback process for the clinical implementation of best practice guidelines, which follows the action cycle. The objective of this review was to: (1) contextualize the Evidence Based Review of Stroke Rehabilitation and Rehabilitation Knowledge to Action Project within the KTA model and (2) show how this process led to improved evidence-based practice in stroke rehabilitation. Through this process, a single centre was able to change clinical practice and promote a culture that supports the use of evidence-based practices in stroke rehabilitation.
List, Johann-Mattis; Pathmanathan, Jananan Sylvestre; Lopez, Philippe; Bapteste, Eric
2016-08-20
For a long time biologists and linguists have been noticing surprising similarities between the evolution of life forms and languages. Most of the proposed analogies have been rejected. Some, however, have persisted, and some even turned out to be fruitful, inspiring the transfer of methods and models between biology and linguistics up to today. Most proposed analogies were based on a comparison of the research objects rather than the processes that shaped their evolution. Focusing on process-based analogies, however, has the advantage of minimizing the risk of overstating similarities, while at the same time reflecting the common strategy to use processes to explain the evolution of complexity in both fields. We compared important evolutionary processes in biology and linguistics and identified processes specific to only one of the two disciplines as well as processes which seem to be analogous, potentially reflecting core evolutionary processes. These new process-based analogies support novel methodological transfer, expanding the application range of biological methods to the field of historical linguistics. We illustrate this by showing (i) how methods dealing with incomplete lineage sorting offer an introgression-free framework to analyze highly mosaic word distributions across languages; (ii) how sequence similarity networks can be used to identify composite and borrowed words across different languages; (iii) how research on partial homology can inspire new methods and models in both fields; and (iv) how constructive neutral evolution provides an original framework for analyzing convergent evolution in languages resulting from common descent (Sapir's drift). Apart from new analogies between evolutionary processes, we also identified processes which are specific to either biology or linguistics. This shows that general evolution cannot be studied from within one discipline alone. In order to get a full picture of evolution, biologists and linguists need to complement their studies, trying to identify cross-disciplinary and discipline-specific evolutionary processes. The fact that we found many process-based analogies favoring transfer from biology to linguistics further shows that certain biological methods and models have a broader scope than previously recognized. This opens fruitful paths for collaboration between the two disciplines. This article was reviewed by W. Ford Doolittle and Eugene V. Koonin.
Liu, Xin; Fatehi, Pedram; Ni, Yonghao
2012-07-01
A process for removing inhibitors from pre-hydrolysis liquor (PHL) of a kraft-based dissolving pulp production process by adsorption and flocculation, and the characteristics of this process were studied. In this process, industrially produced PHL was treated with unmodified and oxidized activated carbon as an absorbent and polydiallyldimethylammonium chloride (PDADMAC) as a flocculant. The overall removal of lignin and furfural in the developed process was 83.3% and 100%, respectively, while that of hemicelluloses was 32.7%. These results confirmed that the developed process can remove inhibitors from PHL prior to producing value-added products, e.g. ethanol and xylitol via fermentation. Copyright © 2012 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Holmgren, Robert
2013-01-01
This article focuses on the impact on learning processes when digital technologies are integrated into PBL (problem-based learning) oriented distance training. Based on socio-cultural perspectives on learning, and on a distance-campus comparison as well as a time perspective, instructor and student roles and learning activities were explored.…
ERIC Educational Resources Information Center
Kvavilashvili, Lia; Fisher, Laura
2007-01-01
The present research examined self-reported rehearsal processes in naturalistic time-based prospective memory tasks (Study 1 and 2) and compared them with the processes in event-based tasks (Study 3). Participants had to remember to phone the experimenter either at a prearranged time (a time-based task) or after receiving a certain text message…
2008-07-01
…generation of process partitioning, a thread pipelining becomes possible. In this paper we briefly summarize the requirements and trends for FADEC-based… FADEC environment, presenting a hypothetical realization of an example application. Finally we discuss the application of Time-Triggered…based control applications of the future. Subject terms: gas turbine, FADEC, multi-core processing technology, distributed-based control.
Truccolo, Wilson
2017-01-01
This review presents a perspective on capturing collective dynamics in recorded neuronal ensembles based on multivariate point process models, inference of low-dimensional dynamics and coarse graining of spatiotemporal measurements. A general probabilistic framework for continuous time point processes is reviewed, with an emphasis on multivariate nonlinear Hawkes processes with exogenous inputs. A point process generalized linear model (PP-GLM) framework for the estimation of discrete time multivariate nonlinear Hawkes processes is described. The approach is illustrated with the modeling of collective dynamics in neocortical neuronal ensembles recorded in human and non-human primates, and prediction of single-neuron spiking. A complementary approach to capture collective dynamics based on low-dimensional dynamics ("order parameters") inferred via latent state-space models with point process observations is presented. The approach is illustrated by inferring and decoding low-dimensional dynamics in primate motor cortex during naturalistic reach and grasp movements. Finally, we briefly review hypothesis tests based on conditional inference and spatiotemporal coarse graining for assessing collective dynamics in recorded neuronal ensembles. PMID:28336305
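As a schematic of the discrete-time PP-GLM idea, the sketch below simulates a single neuron whose spiking is Bernoulli with a log-link conditional intensity driven by its own recent history (a nonlinear Hawkes-type model). The baseline and history weights are invented, not values from the reviewed studies.

```python
# Sketch of a discrete-time point-process GLM with self-history.
import numpy as np

rng = np.random.default_rng(1)
T, mu = 5000, -3.0                  # time bins and baseline log-intensity
h = np.array([-2.0, 0.8, 0.4])      # history filter: refractory then excitatory

spikes = np.zeros(T, dtype=int)
for t in range(3, T):
    log_lam = mu + h @ spikes[t-1:t-4:-1]   # last 3 bins, most recent first
    p = 1.0 - np.exp(-np.exp(log_lam))      # P(at least one event in the bin)
    spikes[t] = rng.random() < p

# The filter h could then be recovered by regressing spikes[t] on the lagged
# spike vector with a logistic or Poisson GLM.
print(spikes.mean())                # empirical spike probability per bin
```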
Automated Signal Processing Applied to Volatile-Based Inspection of Greenhouse Crops
Jansen, Roel; Hofstee, Jan Willem; Bouwmeester, Harro; van Henten, Eldert
2010-01-01
Gas chromatograph–mass spectrometers (GC-MS) have been used and shown utility for volatile-based inspection of greenhouse crops. However, a widely recognized difficulty associated with GC-MS application is the large and complex data generated by this instrument. As a consequence, experienced analysts are often required to process this data in order to determine the concentrations of the volatile organic compounds (VOCs) of interest. Manual processing is time-consuming, labour intensive and may be subject to errors due to fatigue. The objective of this study was to assess whether or not GC-MS data can also be automatically processed in order to determine the concentrations of crop health associated VOCs in a greenhouse. An experimental dataset that consisted of twelve data files was processed both manually and automatically to address this question. Manual processing was based on simple peak integration while the automatic processing relied on the algorithms implemented in the MetAlign™ software package. The results of automatic processing of the experimental dataset resulted in concentrations similar to that after manual processing. These results demonstrate that GC-MS data can be automatically processed in order to accurately determine the concentrations of crop health associated VOCs in a greenhouse. When processing GC-MS data automatically, noise reduction, alignment, baseline correction and normalisation are required. PMID:22163594
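Two of the automated steps mentioned, baseline correction and peak integration, can be illustrated on a synthetic chromatogram. The rolling-minimum baseline below is a deliberately crude stand-in for MetAlign's more elaborate algorithms.

```python
# Illustrative baseline correction and peak integration on one synthetic
# extracted-ion chromatogram; parameters are placeholders.
import numpy as np

def rolling_min_baseline(signal: np.ndarray, window: int = 51) -> np.ndarray:
    """Crude baseline: running minimum over a window wider than any peak."""
    pad = window // 2
    padded = np.pad(signal, pad, mode="edge")
    return np.array([padded[i:i + window].min() for i in range(len(signal))])

def integrate_peak(signal, t, lo, hi):
    """Trapezoidal area of the baseline-corrected peak between lo and hi."""
    corrected = signal - rolling_min_baseline(signal)
    m = (t >= lo) & (t <= hi)
    ys, xs = corrected[m], t[m]
    return float(np.sum((ys[1:] + ys[:-1]) / 2 * np.diff(xs)))

t = np.linspace(0, 10, 501)
signal = 5.0 + 0.2 * t + 40 * np.exp(-((t - 4.0) ** 2) / 0.05)  # drift + peak
print(round(integrate_peak(signal, t, 3.0, 5.0), 1))  # concentration proxy
```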
Sens, Brigitte
2010-01-01
The concept of general process orientation as an instrument of organisation development is the core principle of quality management philosophy, i.e. the learning organisation. Accordingly, prestigious quality awards and certification systems focus on process configuration and continual improvement. In German health care organisations, particularly in hospitals, this general process orientation has not been widely implemented yet - despite enormous change dynamics and the requirements of both quality and economic efficiency of health care processes. But based on a consistent process architecture that considers key processes as well as management and support processes, the strategy of excellent health service provision including quality, safety and transparency can be realised in daily operative work. The core elements of quality (e.g., evidence-based medicine), patient safety and risk management, environmental management, health and safety at work can be embedded in daily health care processes as an integrated management system (the "all in one system" principle). Sustainable advantages and benefits for patients, staff, and the organisation will result: stable, high-quality, efficient, and indicator-based health care processes. Hospitals with their broad variety of complex health care procedures should now exploit the full potential of total process orientation. Copyright © 2010. Published by Elsevier GmbH.
Holistic processing of static and moving faces.
Zhao, Mintao; Bülthoff, Isabelle
2017-07-01
Humans' face ability develops and matures with extensive experience in perceiving, recognizing, and interacting with faces that move most of the time. However, how facial movements affect one core aspect of face ability, holistic face processing, remains unclear. Here we investigated the influence of rigid facial motion on holistic and part-based face processing by manipulating the presence of facial motion during study and at test in a composite face task. The results showed that rigidly moving faces were processed as holistically as static faces (Experiment 1). Holistic processing of moving faces persisted whether facial motion was presented during study, at test, or both (Experiment 2). Moreover, when faces were inverted to eliminate the contributions of both an upright face template and observers' expertise with upright faces, rigid facial motion facilitated holistic face processing (Experiment 3). Thus, holistic processing represents a general principle of face perception that applies to both static and dynamic faces, rather than being limited to static faces. These results support an emerging view that both perceiver-based and face-based factors contribute to holistic face processing, and they offer new insights on what underlies holistic face processing, how the information supporting holistic face processing interacts, and why facial motion may affect face recognition and holistic face processing differently. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
ERIC Educational Resources Information Center
Lin, Xiuyun; Fang, Xiaoyi; Chi, Peilian; Li, Xiaoming; Chen, Wenrui; Heath, Melissa Allen
2014-01-01
A group of 124 children orphaned by AIDS (COA), who resided in two orphanages funded by the Chinese government, participated in a study investigating the efficacy of a grief-processing-based psychological group intervention. This psychological intervention program was designed to specifically help COA process their grief and reduce their…
ERIC Educational Resources Information Center
Kawinkamolroj, Milintra; Triwaranyu, Charinee; Thongthew, Sumlee
2015-01-01
This research aimed to develop coaching process based on transformative learning theory for changing the mindset about instruction of elementary school teachers. Tools used in this process include mindset tests and questionnaires designed to assess the instructional mindset of teachers and to allow the teachers to reflect on how they perceive…
A Scheme for Understanding Group Processes in Problem-Based Learning
ERIC Educational Resources Information Center
Hammar Chiriac, Eva
2008-01-01
The purpose of this study was to identify, describe and interpret group processes occurring in tutorials in problem-based learning. Another aim was to investigate if a combination of Steiner's (Steiner, I. D. (1972). "Group process and productivity". New York: Academic Press.) theory of group work and Bion's (Bion, W. R. (1961). "Experiences in…
ERIC Educational Resources Information Center
Thomson, Jennifer M.; Leong, Victoria; Goswami, Usha
2013-01-01
The purpose of this study was to compare the efficacy of two auditory processing interventions for developmental dyslexia, one based on rhythm and one based on phonetic training. Thirty-three children with dyslexia participated and were assigned to one of three groups (a) a novel rhythmic processing intervention designed to highlight auditory…
The Effectiveness of Adopting E-Readers to Facilitate EFL Students' Process-Based Academic Writing
ERIC Educational Resources Information Center
Hung, Hui-Chun; Young, Shelley Shwu-Ching
2015-01-01
English as Foreign Language (EFL) students face additional difficulties for academic writing largely due to their level of language competency. An appropriate structural process of writing can help students develop their academic writing skills. This study explored the use of the e-readers to facilitate EFL students' process-based academic…
Polymer based tunneling sensor
NASA Technical Reports Server (NTRS)
Wang, Jing (Inventor); Zhao, Yongjun (Inventor); Cui, Tianhong (Inventor)
2006-01-01
A process for fabricating a polymer-based circuit comprises the following steps. A mold of a design is formed through a lithography process. The design is transferred to a polymer substrate through a hot embossing process. A metal layer is then deposited over at least part of said design, and at least one electrical lead is connected to said metal layer.
Local Anesthetic Microencapsulation.
1983-11-04
…following I.M. injection of microencapsulated lidocaine and etidocaine than following solution injections. Local toxicity of these microcapsule injections… [List-of-tables residue: processing summaries of lidocaine (base) and etidocaine-HCl microencapsulation, and microcapsule size distributions.]
Seven-Step Problem-Based Learning in an Interaction Design Course
ERIC Educational Resources Information Center
Schultz, Nette; Christensen, Hans Peter
2004-01-01
The objective in this paper is the implementation of the highly structured seven-step problem-based learning (PBL) procedure as part of the learning process in a human-computer interaction (HCI) design course at the Technical University of Denmark, taking into account the common learning processes in PBL and the interaction design process. These…
ERIC Educational Resources Information Center
Bouck, Emily C.; Meyer, Nancy K.; Satsangi, Rajiv; Savage, Melissa N.; Hunley, Megan
2015-01-01
Written expression is a neglected but critical component of education; yet, the writing process--from prewriting, to writing, and postwriting--is often an area of struggle for students with disabilities. One strategy to assist students with disabilities struggling with the writing process is the use of computer-based technology. This article…
Recovery Processes of Organic Acids from Fermentation Broths in the Biomass-Based Industry.
Li, Qian-Zhu; Jiang, Xing-Lin; Feng, Xin-Jun; Wang, Ji-Ming; Sun, Chao; Zhang, Hai-Bo; Xian, Mo; Liu, Hui-Zhou
2016-01-01
The new movement towards green chemistry and renewable feedstocks makes microbial production of chemicals more competitive. Among the numerous chemicals, organic acids are especially attractive targets for process development efforts in the renewables-based biorefinery industry. However, most of the production costs in microbial processes are higher than those in chemical processes, and over 60% of them are generated by separation processes. Therefore, research on separation and purification processes is important for a promising biorefinery industry. This review highlights the progress of recovery processes in the separation and purification of organic acids, including their advantages and disadvantages, current situation, and future prospects in terms of recovery yields and industrial application.
EEG feature selection method based on decision tree.
Duan, Lijuan; Ge, Hui; Ma, Wei; Miao, Jun
2015-01-01
This paper aims to solve the automated feature selection problem in brain computer interfaces (BCI). In order to automate the feature selection process, we proposed a novel EEG feature selection method based on a decision tree (DT). During electroencephalogram (EEG) signal processing, a feature extraction method based on principal component analysis (PCA) was used, and the selection process based on the decision tree was performed by searching the feature space and automatically selecting optimal features. Considering that EEG signals are a series of non-linear signals, a generalized linear classifier, the support vector machine (SVM), was chosen. In order to test the validity of the proposed method, we applied the EEG feature selection method based on the decision tree to BCI Competition II dataset Ia, and the experiment showed encouraging results.
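The described chain (PCA extraction, decision-tree-driven selection, SVM classification) maps directly onto a scikit-learn pipeline. The sketch below uses synthetic data in place of the BCI Competition II Ia recordings, so the components, not the numbers, are the point.

```python
# Sketch of the PCA -> decision-tree feature selection -> SVM chain.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectFromModel
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Stand-in for EEG feature vectors (trials x features):
X, y = make_classification(n_samples=300, n_features=64, n_informative=8,
                           random_state=0)
pipe = make_pipeline(
    PCA(n_components=20),                                     # feature extraction
    SelectFromModel(DecisionTreeClassifier(random_state=0)),  # DT-based selection
    SVC(kernel="linear"),                                     # final classifier
)
print(cross_val_score(pipe, X, y, cv=5).mean())
```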
Space-based optical image encryption.
Chen, Wen; Chen, Xudong
2010-12-20
In this paper, we propose a new method based on a three-dimensional (3D) space-based strategy for optical image encryption. The two-dimensional (2D) processing of a plaintext in conventional optical encryption methods is extended to a 3D space-based processing. Each pixel of the plaintext is considered as one particle in the proposed space-based optical image encryption, and the diffraction of all particles forms an object wave in phase-shifting digital holography. The effectiveness and advantages of the proposed method are demonstrated by numerical results. The proposed method provides a new optical encryption strategy in place of conventional 2D processing, and may open up a new research perspective for optical image encryption.
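The core mechanics, pixel-particles radiating spherical waves whose superposition is recorded and then recovered by four-step phase-shifting holography, can be verified numerically. The geometry, wavelength, and particle positions below are invented, and this sketch demonstrates only wavefront recording and recovery, not the keyed encryption itself.

```python
# Numerical sketch: particles -> object wave -> 4-step phase-shifting recovery.
import numpy as np

wavelength, z = 632.8e-9, 0.05            # red laser, 5 cm propagation distance
k = 2 * np.pi / wavelength
n = 32                                    # hologram plane is n x n, 10 um pitch
xs = (np.arange(n) - n / 2) * 10e-6
X, Y = np.meshgrid(xs, xs)

# "Plaintext": a few particles (pixel value = source amplitude) in 3D.
particles = [(1.0, 0.0, 0.0), (0.5, 50e-6, -30e-6)]
O = np.zeros((n, n), dtype=complex)
for amp, px, py in particles:
    r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + z ** 2)
    O += amp * np.exp(1j * k * r) / r     # spherical wave from each particle

R = np.abs(O).max()                       # plane reference-wave amplitude
I = [np.abs(O + R * np.exp(1j * p)) ** 2 for p in (0, np.pi/2, np.pi, 3*np.pi/2)]
O_rec = ((I[0] - I[2]) + 1j * (I[1] - I[3])) / (4 * R)   # four-step recovery
print(np.allclose(O_rec, O))              # True: object wave retrieved exactly
```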
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farmer, J; Brown, B; Bayles, B
The overall goal is to develop high-performance corrosion-resistant iron-based amorphous-metal coatings for prolonged trouble-free use in very aggressive environments: seawater and hot geothermal brines. The specific technical objectives are: (1) synthesize Fe-based amorphous-metal coatings with corrosion resistance comparable or superior to Ni-based Alloy C-22; (2) establish processing parameter windows for applying and controlling coating attributes (porosity, density, bonding); (3) assess possible cost savings through substitution of the Fe-based material for the more expensive Ni-based Alloy C-22; (4) demonstrate practical fabrication processes; (5) produce quality materials and data with complete traceability for nuclear applications; and (6) develop, validate and calibrate computational models to enable life prediction and process design.
Teaching WP and DP with CP/M-Based Microcomputers.
ERIC Educational Resources Information Center
Bartholome, Lloyd W.
1982-01-01
The use of CP/M (Control Program Monitor)-based microcomputers in teaching word processing and data processing is explored. The system's advantages, variations, dictionary software, and future are all discussed. (CT)
An Ontology-Based Approach to Incorporate User-Generated Geo-Content Into Sdi
NASA Astrophysics Data System (ADS)
Deng, D.-P.; Lemmens, R.
2011-08-01
The Web is changing the way people share and communicate information because of the emergence of various Web technologies that enable people to contribute information on the Web. User-Generated Geo-Content (UGGC) is a potential resource of geographic information. Due to different production methods, UGGC often cannot fit into formal geographic information models; there is a semantic gap between UGGC and formal geographic information. To integrate UGGC into geographic information, this study conducts an ontology-based process to bridge this semantic gap. This ontology-based process includes five steps: Collection, Extraction, Formalization, Mapping, and Deployment. In addition, this study applies the process to Twitter messages relevant to the Japan earthquake disaster. Using this process, we extract disaster relief information from Twitter messages and develop a knowledge base for GeoSPARQL queries on disaster relief information.
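A minimal sketch of the Formalization and Deployment steps, using the rdflib library as a stand-in toolchain and an invented vocabulary: tweet-derived facts are stored as RDF triples and queried with SPARQL. Full GeoSPARQL spatial functions would additionally require a spatially enabled store.

```python
# Sketch: tweet-derived facts as RDF triples, queried with plain SPARQL.
from rdflib import Graph, Literal, Namespace, URIRef

EX = Namespace("http://example.org/disaster#")   # invented vocabulary
g = Graph()
tweet = URIRef("http://example.org/tweet/1")
g.add((tweet, EX.reports, EX.ShelterNeed))
g.add((tweet, EX.place, Literal("Sendai")))
g.add((tweet, EX.lat, Literal(38.26)))
g.add((tweet, EX.lon, Literal(140.87)))

q = """
PREFIX ex: <http://example.org/disaster#>
SELECT ?t ?place WHERE { ?t ex:reports ex:ShelterNeed ; ex:place ?place . }
"""
for row in g.query(q):
    print(row.t, row.place)
```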
Biomimetic design processes in architecture: morphogenetic and evolutionary computational design.
Menges, Achim
2012-03-01
Design computation has profound impact on architectural design methods. This paper explains how computational design enables the development of biomimetic design processes specific to architecture, and how they need to be significantly different from established biomimetic processes in engineering disciplines. The paper first explains the fundamental difference between computer-aided and computational design in architecture, as the understanding of this distinction is of critical importance for the research presented. Thereafter, the conceptual relation and possible transfer of principles from natural morphogenesis to design computation are introduced and the related developments of generative, feature-based, constraint-based, process-based and feedback-based computational design methods are presented. This morphogenetic design research is then related to exploratory evolutionary computation, followed by the presentation of two case studies focusing on the exemplary development of spatial envelope morphologies and urban block morphologies.
Recent developments in membrane-based separations in biotechnology processes: review.
Rathore, A S; Shirke, A
2011-01-01
Membrane-based separations are the most ubiquitous unit operations in biotech processes. There are several key reasons for this. First, they can be used with a large variety of applications including clarification, concentration, buffer exchange, purification, and sterilization. Second, they are available in a variety of formats, such as depth filtration, ultrafiltration, diafiltration, nanofiltration, reverse osmosis, and microfiltration. Third, they are simple to operate and are generally robust toward normal variations in feed material and operating parameters. Fourth, membrane-based separations typically require lower capital cost when compared to other processing options. As a result of these advantages, a typical biotech process has anywhere from 10 to 20 membrane-based separation steps. In this article we review the major developments that have occurred on this topic with a focus on developments in the last 5 years.
Modelling of the mercury loss in fluorescent lamps under the influence of metal oxide coatings
NASA Astrophysics Data System (ADS)
Santos Abreu, A.; Mayer, J.; Lenk, D.; Horn, S.; Konrad, A.; Tidecks, R.
2016-11-01
The mercury transport and loss mechanisms in the metal oxide coatings of mercury low pressure discharge fluorescent lamps have been investigated. An existing model based on a ballistic process is discussed in the context of experimental mercury loss data. Two different approaches to the modeling of the mercury loss have been developed. The first one is based on mercury transition rates between the plasma, the coating, and the glass, without specifying the underlying physical processes. The second one is based on a transport process driven by diffusion and a binding process of mercury reacting to mercury oxide inside the layers. Moreover, we extended the diffusion-based model to handle multi-component coatings. All approaches are applied to describe mercury loss experiments under the influence of an Al2O3 coating.
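The second (diffusion-plus-binding) approach can be sketched as a one-dimensional finite-difference model in which mobile mercury diffuses into the layer and is bound at a first-order rate. All coefficients below are illustrative placeholders, not the fitted values from the paper.

```python
# Explicit finite-difference sketch of diffusion with first-order binding.
import numpy as np

L, nx, D, kb = 1e-6, 50, 1e-16, 1e-4   # thickness (m), grid, diffusivity, binding rate
dx = L / nx
dt = 0.4 * dx * dx / D                  # within the explicit stability limit
c = np.zeros(nx)                        # mobile Hg concentration (arb. units)
bound = np.zeros(nx)                    # Hg fixed as oxide

for _ in range(20000):
    c[0] = 1.0                          # plasma-side boundary at unit concentration
    lap = np.zeros(nx)
    lap[1:-1] = (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
    react = kb * c                      # first-order binding to oxide
    c += dt * (D * lap - react)
    bound += dt * react
    c[-1] = c[-2]                       # no flux into the glass (reflecting wall)

print(bound.sum() * dx)                 # total Hg lost into the layer so far
```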
NASA Astrophysics Data System (ADS)
Kaiser, C.; Roll, K.; Volk, W.
2017-09-01
In the automotive industry, the manufacturing of automotive outer panels requires hemming processes in which two sheet metal parts are joined together by bending the flange of the outer part over the inner part. Because of decreasing development times and the steadily growing number of vehicle derivatives, an efficient digital product and process validation is necessary. Commonly used simulations, which are based on the finite element method, demand significant modelling effort, which is a disadvantage especially in the early product development phase. To increase the efficiency of designing hemming processes, this paper presents a hemming-specific metamodel approach. The approach includes a part analysis in which the outline of the automotive outer panel is initially split into individual segments. By parametrizing each of the segments and assigning basic geometric shapes, the outline of the part is approximated. Based on this, hemming parameters such as flange length, roll-in, wrinkling and plastic strains are calculated for each of the basic geometric shapes by performing a metamodel-based segmental product validation. The metamodel is based on an element-similar formulation that includes a reference dataset of various basic geometric shapes. A given automotive outer panel can then be analysed and optimized based on the hemming-specific database. By implementing this approach in a planning system, efficient design and optimization of hemming processes is enabled. Furthermore, valuable time and cost benefits can be realized in a vehicle's development process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feng, Xiuli, E-mail: feng.97@osu.edu; State Key Laboratory of Advanced Welding and Joining, Harbin Institute of Technology, Harbin 150001; Liu, Huijie, E-mail: liuhj@hit.edu.cn
Aluminum alloy 2219-T6 was friction stir processed using a novel submerged processing technique to facilitate cooling. Processing was conducted at a constant tool traverse speed of 200 mm/min and spindle rotation speeds in the range from 600 to 800 rpm. The microstructural characteristics of the base metal and processed zone, including grain structure and precipitation behavior, were studied using optical microscopy (OM), scanning electron microscopy (SEM) and transmission electron microscopy (TEM). Microhardness maps were constructed on polished cross sections of as-processed samples. The effect of tool rotation speed on the microstructure and hardness of the stir zone was investigated. The average grain size of the stir zone was much smaller than that of the base metal, but the hardness was also lower due to the formation of equilibrium θ precipitates from the base metal θ′ precipitates. Stir zone hardness was found to decrease with increasing rotation speed (heat input). The effect of processing conditions on strength (hardness) was rationalized based on the competition between grain refinement strengthening and softening due to precipitate overaging. - Highlights: • SZ grain size (∼ 1 μm) is reduced by over one order of magnitude relative to the BM. • Hardness in the SZ is lower than that of the precipitation strengthened BM. • Metastable θ′ in the base metal transforms to equilibrium θ in the stir zone. • Softening in the SZ results from a decrease of precipitation strengthening.
NASA Astrophysics Data System (ADS)
Tang, Jian; Qiao, Junfei; Wu, ZhiWei; Chai, Tianyou; Zhang, Jian; Yu, Wen
2018-01-01
Frequency spectral data of mechanical vibration and acoustic signals relate to difficult-to-measure production quality and quantity parameters of complex industrial processes. A selective ensemble (SEN) algorithm can be used to build a soft sensor model of these process parameters by fusing valued information selectively from different perspectives. However, a combination of several optimized ensemble sub-models with SEN cannot guarantee the best prediction model. In this study, we use several techniques to construct mechanical vibration and acoustic frequency spectra of a data-driven industrial process parameter model based on selective fusion multi-condition samples and multi-source features. Multi-layer SEN (MLSEN) strategy is used to simulate the domain expert cognitive process. Genetic algorithm and kernel partial least squares are used to construct the inside-layer SEN sub-model based on each mechanical vibration and acoustic frequency spectral feature subset. Branch-and-bound and adaptive weighted fusion algorithms are integrated to select and combine outputs of the inside-layer SEN sub-models. Then, the outside-layer SEN is constructed. Thus, "sub-sampling training examples"-based and "manipulating input features"-based ensemble construction methods are integrated, thereby realizing the selective information fusion process based on multi-condition history samples and multi-source input features. This novel approach is applied to a laboratory-scale ball mill grinding process. A comparison with other methods indicates that the proposed MLSEN approach effectively models mechanical vibration and acoustic signals.
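The selective-ensemble principle, train many sub-models and fuse only the subset that helps, can be shown with a much simpler stand-in than the paper's GA/kernel-PLS/branch-and-bound machinery: the sketch below greedily selects sub-models trained on random feature subsets and averages their predictions.

```python
# Simplified selective-ensemble sketch; all modelling choices are stand-ins.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 12))
y = X[:, 0] - 2 * X[:, 3] + 0.1 * rng.normal(size=300)
Xtr, Xv, ytr, yv = X[:200], X[200:], y[:200], y[200:]

# Candidate sub-models, each trained on a random feature subset:
feature_sets = [rng.choice(12, size=4, replace=False) for _ in range(8)]
models = [Ridge().fit(Xtr[:, f], ytr) for f in feature_sets]
preds = [m.predict(Xv[:, f]) for m, f in zip(models, feature_sets)]

selected, best = [], np.inf
while len(selected) < len(models):
    # Greedy forward selection: add whichever sub-model most lowers validation MSE.
    err, i = min((mean_squared_error(yv, np.mean([preds[j] for j in selected + [i]],
                                                 axis=0)), i)
                 for i in range(len(models)) if i not in selected)
    if err >= best:
        break
    selected.append(i)
    best = err
print(selected, round(best, 4))
```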
NASA Astrophysics Data System (ADS)
Ghaemi, Z.; Farnaghi, M.; Alimohammadi, A.
2015-12-01
The critical impact of air pollution on human health and the environment on the one hand, and the complexity of pollutant concentration behavior on the other, lead scientists to look for advanced techniques for monitoring and predicting urban air quality. Additionally, recent developments in data measurement techniques have led to the collection of various types of data about air quality. Such data is extremely voluminous and, to be useful, it must be processed at high velocity. Due to the complexity of big data analysis, especially for dynamic applications, online forecasting of pollutant concentration trends within a reasonable processing time is still an open problem. The purpose of this paper is to present an online forecasting approach based on Support Vector Machines (SVM) to predict air quality one day in advance. In order to meet the computational requirements of large-scale data analysis, distributed computing based on the Hadoop platform has been employed to leverage the processing power of multiple processing units. The MapReduce programming model is adopted for massive parallel processing in this study. Based on the online algorithm and the Hadoop framework, an online forecasting system is designed to predict the air pollution of Tehran for the next 24 hours. The results have been assessed on the basis of processing time and efficiency. Quite accurate predictions of air pollutant indicator levels within an acceptable processing time prove that the presented approach is well suited to large-scale air pollution prediction problems.
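Setting the Hadoop/MapReduce layer aside, the forecasting core is a supervised regression from lagged observations to the next day's value. The sketch below uses scikit-learn's SVR on a synthetic series in place of the Tehran data.

```python
# Next-day pollutant forecast from lagged values with an SVM regressor.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
t = np.arange(400)
series = 50 + 10 * np.sin(2 * np.pi * t / 7) + rng.normal(0, 2, t.size)  # weekly cycle

lags = 7
X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
y = series[lags:]                       # next-day target

model = SVR(kernel="rbf", C=10.0).fit(X[:-50], y[:-50])
pred = model.predict(X[-50:])
print(round(np.mean(np.abs(pred - y[-50:])), 2))  # hold-out mean absolute error
```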
Sánchez, José; Guarnes, Miguel Ángel; Dormido, Sebastián
2009-01-01
This paper is an experimental study of the utilization of different event-based strategies for the automatic control of a simple but very representative industrial process: the level control of a tank. In an event-based control approach it is the triggering of a specific event, and not the time, that instructs the sensor to send the current state of the process to the controller, and the controller to compute a new control action and send it to the actuator. In the document, five control strategies based on different event-based sampling techniques are described, compared, and contrasted with a classical time-based control approach and a hybrid one. The common denominator in the time, the hybrid, and the event-based control approaches is the controller: a proportional-integral algorithm with adaptations depending on the selected control approach. To compare and contrast each one of the hybrid and the pure event-based control algorithms with the time-based counterpart, the two tasks that a control strategy must achieve (set-point following and disturbance rejection) are independently analyzed. The experimental study provides new proof concerning the ability of event-based control strategies to minimize the data exchange among the control agents (sensors, controllers, actuators) when an error-free control of the process is not a hard requirement. PMID:22399975
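One member of the event-based family studied, send-on-delta sampling, is easy to sketch: the sensor transmits only when the level has moved more than a dead-band since the last transmission, and the PI controller updates only on those events. The tank model, gains, and threshold below are invented; note the steady-state offset of up to one dead-band, which matches the abstract's remark that error-free control is not guaranteed.

```python
# Send-on-delta event-based PI control of a toy tank-level process.
dt, A, k_out = 0.1, 1.0, 0.2       # time step (s), tank area, outflow coefficient
kp, ki, delta = 2.0, 0.5, 0.02     # PI gains and send-on-delta dead-band
sp = 1.0                           # level set-point

h, u, integ = 0.0, 0.0, 0.0
last_sent, last_t, events = None, 0.0, 0
for step in range(3000):
    t = step * dt
    if last_sent is None or abs(h - last_sent) > delta:   # event: level moved enough
        e = sp - h
        integ += e * (t - last_t)       # advance the integral by elapsed time
        u = max(0.0, kp * e + ki * integ)
        last_sent, last_t = h, t
        events += 1
    h += dt / A * (u - k_out * h)       # tank dynamics run every step
print(round(h, 3), f"({events} transmissions instead of 3000)")
```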
Development of a Cr-Based Hard Composite Processed by Spark Plasma Sintering
NASA Astrophysics Data System (ADS)
García-Junceda, A.; Sáez, I.; Deng, X. X.; Torralba, J. M.
2018-04-01
This investigation analyzes the feasibility of processing a composite material comprising WC particles randomly dispersed in a matrix in which Cr is the main metallic binder. Thus, a new composite material is processed using a commercial, economic, and easily available Cr-based alloy, assuming that there is a certain Cr solubility in the WC particles acting as reinforcement. The processing route followed includes mechanical milling of the powders and consolidation by spark plasma sintering.
Harris, Claire; Garrubba, Marie; Allen, Kelly; King, Richard; Kelly, Cate; Thiagarajan, Malar; Castleman, Beverley; Ramsey, Wayne; Farjou, Dina
2015-12-28
This paper reports the process of establishing a transparent, accountable, evidence-based program for introduction of new technologies and clinical practices (TCPs) in a large Australian healthcare network. Many countries have robust evidence-based processes for assessment of new TCPs at national level. However many decisions are made by local health services where the resources and expertise to undertake health technology assessment (HTA) are limited and a lack of structure, process and transparency has been reported. An evidence-based model for process change was used to establish the program. Evidence from research and local data, experience of health service staff and consumer perspectives were incorporated at each of four steps: identifying the need for change, developing a proposal, implementation and evaluation. Checklists assessing characteristics of success, factors for sustainability and barriers and enablers were applied and implementation strategies were based on these findings. Quantitative and qualitative methods were used for process and outcome evaluation. An action research approach underpinned ongoing refinement to systems, processes and resources. A Best Practice Guide developed from the literature and stakeholder consultation identified seven program components: Governance, Decision-Making, Application Process, Monitoring and Reporting, Resources, Administration, and Evaluation and Quality Improvement. The aims of transparency and accountability were achieved. The processes are explicit, decisions published, outcomes recorded and activities reported. The aim of ascertaining rigorous evidence-based information for decision-making was not achieved in all cases. Applicants proposing new TCPs provided the evidence from research literature and local data however the information was often incorrect or inadequate, overestimating benefits and underestimating costs. Due to these limitations the initial application process was replaced by an Expression of Interest from applicants followed by a rigorous HTA by independent in-house experts. The program is generalisable to most health care organisations. With one exception, the components would be achievable with minimal additional resources; the lack of skills and resources required for HTA will limit effective application in many settings. A toolkit containing details of the processes and sample materials is provided to facilitate replication or local adaptation by those wishing to establish a similar program.
NASA Astrophysics Data System (ADS)
McNeil, Ronald D.; Miele, Renato; Shaul, Dennis
2000-10-01
Information technology is driving improvements in manufacturing systems, with higher productivity and quality as results. However, corporate strategy is driven by a number of factors and includes data and pressure from multiple stakeholders: employees, managers, executives, stockholders, boards, suppliers and customers. It is also driven by information about competitors and emerging technology. Much information is based on the processing of data and the resulting biases of the processors; thus, stakeholders can base inputs on faulty perceptions that are not reality based. Prior to processing, the data used may be inaccurate. Sources of data and information may include demographic reports, statistical analyses, intelligence reports (e.g., marketing data), technology and primary data collection. The reliability and validity of data, as well as the management of sources and information, are critical elements of strategy formulation. The paper explores data collection, processing and analyses from secondary and primary sources, information generation and report presentation for strategy formulation, and contrasts this with the data and information utilized to drive internal processes such as manufacturing. The hypothesis is that internal processes, such as manufacturing, are subordinate to corporate strategies. The impact of possible divergence in the quality of decisions at the corporate level on IT-driven, quality-manufacturing processes based on measurable outcomes is significant. Recommendations for IT improvements at the corporate strategy level are given.
Using artificial neural networks to model aluminium based sheet forming processes and tools details
NASA Astrophysics Data System (ADS)
Mekras, N.
2017-09-01
In this paper, a methodology and a software system are presented concerning the use of Artificial Neural Networks (ANNs) for modeling aluminium-based sheet forming processes. ANN models are created by training the ANNs on experimental, trial and historical data records of process inputs and outputs. ANN models are useful in cases where processes' mathematical models are not accurate enough, are not well defined or are missing, e.g. in cases of complex product shapes, new material alloys, new process requirements, micro-scale products, etc. Usually, after the design and modeling of the forming tools (die, punch, etc.) and before mass production, a set of trials takes place at the shop floor for finalizing process and tool details concerning e.g. tools' minimum radii, die/punch clearance, press speed, process temperature, etc., in relation to the material type, the sheet thickness and the quality achieved in the trials. Using data from the shop-floor trials and forming theory, ANN models can be trained and created, and can be used to estimate the final process and tool details, hence supporting efficient set-up of processes and tools before mass production starts. The proposed ANN methodology and the respective software system are implemented within the EU H2020 project LoCoMaTech for the aluminium-based sheet forming process HFQ (solution Heat treatment, cold die Forming and Quenching).
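A minimal sketch of the modelling idea: fit a small multilayer perceptron to (process inputs → outcome) records from trials, then query it for a candidate set-up. The feature names (die radius, clearance, press speed, temperature) and the synthetic springback response are invented placeholders, not LoCoMaTech data.

```python
# ANN metamodel of a forming process trained on (invented) trial records.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Inputs: [die radius (mm), clearance (mm), press speed (mm/s), temperature (C)]
X = rng.uniform([1, 0.05, 10, 20], [8, 0.3, 60, 250], size=(200, 4))
# Synthetic quality response standing in for a measured trial outcome:
springback = 0.5 / X[:, 0] + 2 * X[:, 1] - 0.001 * X[:, 3] + rng.normal(0, 0.01, 200)

ann = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                   random_state=0).fit(X, springback)
candidate = np.array([[4.0, 0.1, 30.0, 200.0]])
print(ann.predict(candidate))   # predicted springback for a proposed set-up
```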
Shukla, Nagesh; Keast, John E; Ceglarek, Darek
2014-10-01
The modelling of complex workflows is an important problem-solving technique within healthcare settings. However, currently most of the workflow models use a simplified flow chart of patient flow obtained using on-site observations, group-based debates and brainstorming sessions, together with historic patient data. This paper presents a systematic and semi-automatic methodology for knowledge acquisition with detailed process representation using sequential interviews of people in the key roles involved in the service delivery process. The proposed methodology allows the modelling of roles, interactions, actions, and decisions involved in the service delivery process. This approach is based on protocol generation and analysis techniques such as: (i) initial protocol generation based on qualitative interviews of radiology staff, (ii) extraction of key features of the service delivery process, (iii) discovering the relationships among the key features extracted, and, (iv) a graphical representation of the final structured model of the service delivery process. The methodology is demonstrated through a case study of a magnetic resonance (MR) scanning service-delivery process in the radiology department of a large hospital. A set of guidelines is also presented in this paper to visually analyze the resulting process model for identifying process vulnerabilities. A comparative analysis of different workflow models is also conducted. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
ERIC Educational Resources Information Center
Fillingim, Jennifer Gale
2010-01-01
Contemporary mathematics education reform has placed increased emphasis on K-12 mathematics curriculum. Reform-based curricula, often referred to as "Standards-based" due to philosophical alignment with the NCTM Process Standards, have generated controversy among families, educators, and researchers. The mathematics education research…
NASA Astrophysics Data System (ADS)
Sidelnikov, O. S.; Redyuk, A. A.; Sygletos, S.
2017-12-01
We consider neural network-based schemes of digital signal processing. It is shown that the use of a dynamic neural network-based scheme of signal processing ensures an increase in the optical signal transmission quality in comparison with that provided by other methods for nonlinear distortion compensation.
Culturally Based Intervention Development: The Case of Latino Families Dealing with Schizophrenia
ERIC Educational Resources Information Center
Barrio, Concepcion; Yamada, Ann-Marie
2010-01-01
Objectives: This article describes the process of developing a culturally based family intervention for Spanish-speaking Latino families with a relative diagnosed with schizophrenia. Method: Our iterative intervention development process was guided by a cultural exchange framework and based on findings from an ethnographic study. We piloted this…
USDA-ARS?s Scientific Manuscript database
Rice milk beverages can provide well-balanced nutrition. With healthier nutrition on consumers' minds, worldwide consumption and production of plant-based milk beverages are increasing. Much past research and invention was based on enzymatic conversion processes for starch that were uncomplicated be…
Teacher Perceptions Regarding Portfolio-Based Components of Teacher Evaluations
ERIC Educational Resources Information Center
Nagel, Charles I.
2012-01-01
This study reports the results of teachers' and principals' perceptions of the package evaluation process, a process that uses a combination of a traditional evaluation with a portfolio-based assessment tool. In addition, this study contributes to the educational knowledge base by exploring the participants' views on the impact of…
Guiding Students through the Jungle of Research-Based Literature
ERIC Educational Resources Information Center
Williams, Sherie
2005-01-01
Undergraduate students of today often lack the ability to effectively process research-based literature. In order to offer education students the most up-to-date methods, research-based literature must be considered. Hence a dilemma is born as to whether professors should discontinue requiring the processing of this type of information or teach…
ERIC Educational Resources Information Center
Frazier, Thomas W.; Youngstrom, Eric A.
2006-01-01
In this article, the authors illustrate a step-by-step process of acquiring and integrating information according to the recommendations of evidence-based practices. A case example models the process, leading to specific recommendations regarding instruments and strategies for evidence-based assessment (EBA) of attention-deficit/hyperactivity…
ERIC Educational Resources Information Center
Tutlys, Vidmantas; Spöttl, Georg
2017-01-01
Purpose: This paper aims to explore methodological and institutional challenges on application of the work-process analysis approach in the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on the review of scientific literature and the analysis of…
Beckers, Laura; van der Burg, Jan; Janssen-Potten, Yvonne; Rameckers, Eugène; Aarts, Pauline; Smeets, Rob
2018-04-24
As part of the COAD-study two home-based bimanual training programs for young children with unilateral Cerebral Palsy (uCP) have been developed, both consisting of a preparation phase and a home-based training phase. Parents are coached to use either an explicit or implicit motor learning approach while teaching bimanual activities to their child. A process evaluation of these complex interventions is crucial in order to draw accurate conclusions and provide recommendations for implementation in clinical practice and further research. The aim of the process evaluation is to systematically assess fidelity of the home-based training programs, to examine the mechanisms that contribute to their effects on child-related and parent-related outcomes, and to explore the influence of contextual factors. A mixed methods embedded design is used that emerges from a pragmatism paradigm. The qualitative strand involves a generic qualitative approach. The process evaluation components fidelity (quality), dose delivered (completeness), dose received (exposure and satisfaction), recruitment and context will be investigated. Data collection includes registration of attendance of therapists and remedial educationalists to a course regarding the home-based training programs; a questionnaire to evaluate this course by the instructor; a report form concerning the preparation phase to be completed by the therapist; registration and video analyses of the home-based training; interviews with parents and questionnaires to be filled out by the therapist and remedial educationalist regarding the process of training; and focus groups with therapists and remedial educationalists as well as registration of drop-out rates and reasons, to evaluate the overall home-based training programs. Inductive thematic analysis will be used to analyse qualitative data. Qualitative and quantitative findings are merged through meta-inference. So far, effects of home-based training programs in paediatric rehabilitation have been studied without an extensive process evaluation. The findings of this process evaluation will have implications for clinical practice and further research regarding development and application of home-based bimanual training programs, executed by parents and aimed at improving activity performance and participation of children with uCP.
A neuro-inspired spike-based PID motor controller for multi-motor robots with low cost FPGAs.
Jimenez-Fernandez, Angel; Jimenez-Moreno, Gabriel; Linares-Barranco, Alejandro; Dominguez-Morales, Manuel J; Paz-Vicente, Rafael; Civit-Balcells, Anton
2012-01-01
In this paper we present a neuro-inspired spike-based closed-loop controller written in VHDL and implemented on FPGAs. The controller focuses on controlling DC motor speed, using only spikes for information representation, processing and DC motor driving; it could be applied to other motors with proper driver adaptation. This controller architecture represents one of the latest layers in a Spiking Neural Network (SNN), implementing a bridge between robotic actuators and spike-based processing layers and sensors. The presented control system fuses actuation and sensor information as spike streams, processing these spikes in hard real-time and implementing a massively parallel information processing system through specialized spike-based circuits. This spike-based closed-loop controller has been implemented on an AER platform designed in our labs that allows direct control of DC motors: the AER-Robot. Experimental results demonstrate the viability of spike-based controllers, and hardware synthesis shows low resource requirements, allowing this controller to be replicated into a high number of parallel controllers working together for real-time robot control.
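To make the rate-coding idea concrete, below is a minimal Python sketch of a PID loop in which the error signal travels as Poisson spike counts rather than floating-point samples. It does not reproduce the paper's VHDL/AER architecture; the first-order motor model, gains, and spike rates are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters -- not taken from the paper.
dt, max_rate = 1e-3, 2000.0          # 1 ms steps, up to 2 kHz spike rate
kp, ki, kd = 1.5, 2.0, 0.01
target, speed = 0.8, 0.0             # normalized speed set-point and state
integ, err_hat, prev = 0.0, 0.0, 0.0

def spikes(x):
    """Rate-code a signed value in [-1, 1] as a Poisson spike count."""
    return np.sign(x) * rng.poisson(np.clip(abs(x), 0, 1) * max_rate * dt)

for _ in range(5000):
    err = target - speed
    est = spikes(err) / (max_rate * dt)    # decode spike count back to a rate
    err_hat = 0.95 * err_hat + 0.05 * est  # low-pass the noisy spike estimate
    integ += err_hat * dt
    deriv = (err_hat - prev) / dt
    prev = err_hat
    drive = kp * err_hat + ki * integ + kd * deriv
    speed += dt * (drive - speed)          # crude first-order motor lag

print(f"speed after 5 s: {speed:.3f} (target {target})")
```

The low-pass stage here stands in for the spike-rate decoding that dedicated spike-based circuits perform in hardware.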
Solution NMR Spectroscopy in Target-Based Drug Discovery.
Li, Yan; Kang, Congbao
2017-08-23
Solution NMR spectroscopy is a powerful tool to study protein structures and dynamics under physiological conditions. This technique is particularly useful in target-based drug discovery projects as it provides protein-ligand binding information in solution. Accumulated studies have shown that NMR plays increasingly important roles in multiple steps of the drug discovery process. In a fragment-based drug discovery process, ligand-observed and protein-observed NMR spectroscopy can be applied to screen fragments with low binding affinities. The screened fragments can be further optimized into drug-like molecules. In combination with other biophysical techniques, NMR will guide structure-based drug discovery. In this review, we describe the possible roles of NMR spectroscopy in drug discovery. We also illustrate the challenges encountered in the drug discovery process. We include several examples demonstrating the roles of NMR in target-based drug discovery, such as hit identification, ranking ligand binding affinities, and mapping the ligand binding site. We also speculate on the possible roles of NMR in target engagement based on recent progress in in-cell NMR spectroscopy.
NASA Astrophysics Data System (ADS)
Qian, Xiaoshan
2018-01-01
Traditional models of evaporation process parameters suffer from large prediction errors because of the continuity and cumulative characteristics of the process. To address this, an adaptive particle swarm optimized neural network forecasting method is proposed, in which an autoregressive moving average (ARMA) error-correction procedure compensates the neural network predictions to improve prediction accuracy. Validation against production data from an alumina plant evaporation process shows that, compared with the traditional model, the new model's prediction accuracy is greatly improved, and it can be used to predict the dynamic behaviour of sodium aluminate solution components during evaporation.
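The error-compensation idea can be sketched independently of the particle-swarm-trained network: take any base predictor, model its residuals as an autoregressive series, and add the forecast residual back to the prediction. The sketch below uses a stand-in analytic predictor and a least-squares AR(2) fit (the MA terms of a full ARMA model are omitted for brevity); all data and parameters are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in: true process = slow sine + AR(1)-correlated disturbance.
t = np.arange(600)
true = np.sin(2 * np.pi * t / 200)
noise = np.zeros_like(true)
for i in range(1, len(t)):
    noise[i] = 0.8 * noise[i - 1] + 0.1 * rng.standard_normal()
y = true + noise

base_pred = np.sin(2 * np.pi * t / 200)   # stand-in for the neural network
resid = y[:500] - base_pred[:500]         # training residuals

# Fit AR(2) coefficients to the residuals by least squares.
X = np.column_stack([resid[1:-1], resid[:-2]])
a = np.linalg.lstsq(X, resid[2:], rcond=None)[0]

# One-step-ahead corrected predictions on the held-out tail.
corrected = base_pred[500:].copy()
r1, r2 = resid[-1], resid[-2]
for k in range(100):
    corrected[k] += a[0] * r1 + a[1] * r2   # add forecast residual
    r1, r2 = y[500 + k] - base_pred[500 + k], r1

mse_base = np.mean((y[500:] - base_pred[500:]) ** 2)
mse_corr = np.mean((y[500:] - corrected) ** 2)
print(f"MSE base {mse_base:.4f} -> corrected {mse_corr:.4f}")
```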
Variance reduction for Fokker–Planck based particle Monte Carlo schemes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gorji, M. Hossein, E-mail: gorjih@ifd.mavt.ethz.ch; Andric, Nemanja; Jenny, Patrick
Recently, Fokker–Planck based particle Monte Carlo schemes have been proposed and evaluated for simulations of rarefied gas flows [1–3]. In this paper, variance reduction for particle Monte Carlo simulations based on the Fokker–Planck model is considered. First, deviational schemes are derived and reviewed, and it is shown that these deviational methods are not appropriate for practical Fokker–Planck based rarefied gas flow simulations. This is due to the fact that the deviational schemes considered in this study lead either to instabilities in the case of two-weight methods or to large statistical errors if the direct sampling method is applied. Motivated by this conclusion, we developed a novel scheme based on correlated stochastic processes. The main idea is to synthesize an additional stochastic process with a known solution, which is solved simultaneously with the main one. By correlating the two processes, the statistical errors can be dramatically reduced, especially for low Mach numbers. To assess the methods, homogeneous relaxation, planar Couette and lid-driven cavity flows were considered. For these test cases, it could be demonstrated that variance reduction based on parallel processes is very robust and effective.
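The correlated-process idea is essentially a control variate: simulate an auxiliary process with a known analytic solution using the same random numbers as the main process, then subtract its sampling error from the main estimate. Below is a minimal Python sketch under illustrative dynamics, not the paper's Fokker–Planck gas model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative parameters (not from the paper).
n, steps, dt = 20000, 200, 0.01
theta, sigma, x0 = 1.0, 1.0, 1.0

x = np.full(n, x0)   # "main" process: nonlinear diffusion, mean unknown
v = np.full(n, x0)   # auxiliary OU process with a known analytic mean

for _ in range(steps):
    dw = rng.standard_normal(n) * np.sqrt(dt)
    x += -theta * x * dt + sigma * (1 + 0.2 * np.tanh(x)) * dw
    v += -theta * v * dt + sigma * dw      # same noise -> strongly correlated

v_exact = x0 * np.exp(-theta * steps * dt)  # known E[V_T] for the OU process

plain = x.mean()
controlled = x.mean() - (v.mean() - v_exact)  # control-variate correction

print(f"plain estimate      {plain:.5f}")
print(f"controlled estimate {controlled:.5f}")
```

Because x and v share the same Brownian increments, their sampling errors largely cancel in the corrected estimator.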
Space-based infrared scanning sensor LOS determination and calibration using star observation
NASA Astrophysics Data System (ADS)
Chen, Jun; Xu, Zhan; An, Wei; Deng, Xin-Pu; Yang, Jun-Gang
2015-10-01
This paper provides a novel methodology for removing sensor bias from a space-based infrared (IR) system (SBIRS) through the use of stars detected in the background field of the sensor. A space-based IR system uses the line of sight (LOS) of a target for target location; LOS determination and calibration is therefore the key precondition for accurate location and tracking of targets, and the LOS calibration of a scanning sensor is one of the main difficulties. Subsequent changes of the sensor bias are not taken into account in the conventional LOS determination and calibration process. Based on an analysis of the imaging process of the scanning sensor, a theoretical model for estimating the bias angles from star observations is proposed: a process model of the bias angles and an observation model of the stars are established, an extended Kalman filter (EKF) is used to estimate the bias angles, and the sensor LOS is then calibrated. Time-domain simulation results indicate that the proposed method determines and calibrates the sensor LOS with high precision and smooth performance, so the timeliness and precision requirements of the target tracking process in a space-based IR tracking system can be met with the proposed algorithm.
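The estimation step can be sketched generically: for small bias angles b, the observed star direction is approximately u + b × u for catalog direction u, so the measurement Jacobian is −[u]ₓ and a standard EKF applies. The following Python sketch assumes a random-walk bias model and illustrative noise levels, not the paper's sensor parameters.

```python
import numpy as np

def skew(u):
    """Cross-product matrix [u]_x so that u x w = skew(u) @ w."""
    return np.array([[0, -u[2], u[1]],
                     [u[2], 0, -u[0]],
                     [-u[1], u[0], 0]])

rng = np.random.default_rng(3)

true_bias = np.array([2e-3, -1e-3, 5e-4])  # rad, illustrative sensor bias
b_est = np.zeros(3)                        # state: small-angle bias vector
P = np.eye(3) * 1e-4                       # state covariance
Q = np.eye(3) * 1e-10                      # slow random-walk process noise
R = np.eye(3) * 1e-7                       # star-centroiding noise

for _ in range(300):
    # Random catalog star direction (unit vector).
    u = rng.standard_normal(3)
    u /= np.linalg.norm(u)
    # Observed direction: catalog direction rotated by the true bias + noise.
    z = u + np.cross(true_bias, u) + rng.standard_normal(3) * np.sqrt(R[0, 0])
    # EKF predict (bias modeled as a random walk).
    P = P + Q
    # EKF update: h(b) = u + b x u, hence H = -[u]_x.
    H = -skew(u)
    y = z - (u + np.cross(b_est, u))
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    b_est = b_est + K @ y
    P = (np.eye(3) - K @ H) @ P

print("true bias :", true_bias)
print("estimated :", np.round(b_est, 6))
```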
Development and evaluation of spatial point process models for epidermal nerve fibers.
Olsbo, Viktor; Myllymäki, Mari; Waller, Lance A; Särkkä, Aila
2013-06-01
We propose two spatial point process models for the spatial structure of epidermal nerve fibers (ENFs) across human skin. The models derive from two point processes, Φb and Φe, describing the locations of the base and end points of the fibers. Each point of Φe (the end point process) is connected to a unique point in Φb (the base point process). In the first model, both Φe and Φb are Poisson processes, yielding a null model of uniform coverage of the skin by end points and general baseline results and reference values for moments of key physiologic indicators. The second model provides a mechanistic model to generate end points for each base, and we model the branching structure more directly by defining Φe as a cluster process conditioned on the realization of Φb as its parent points. In both cases, we derive distributional properties for observable quantities of direct interest to neurologists such as the number of fibers per base, and the direction and range of fibers on the skin. We contrast both models by fitting them to data from skin blister biopsy images of ENFs and provide inference regarding physiological properties of ENFs. Copyright © 2013 Elsevier Inc. All rights reserved.
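The second (cluster) model can be illustrated by simulation: draw Poisson base points, then let each base spawn a Poisson number of end points at Gaussian offsets. This is a Thomas-type construction; the paper's actual cluster specification may differ, and the rates below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative rates; the paper fits such models to ENF biopsy images.
lam_base, mu_end, disp_sd = 50.0, 3.0, 0.02  # bases/unit area, ends per base

# Base point process (Poisson): fiber entry points into the epidermis.
n_base = rng.poisson(lam_base)
bases = rng.uniform(0, 1, size=(n_base, 2))

# End point process: each base spawns a Poisson number of end points,
# displaced by Gaussian offsets (cluster process conditioned on the bases).
ends, parent = [], []
for i, b in enumerate(bases):
    for _ in range(rng.poisson(mu_end)):
        ends.append(b + rng.normal(0, disp_sd, 2))
        parent.append(i)
ends = np.array(ends)

# Observable summaries of the kind the paper derives distributions for.
counts = np.bincount(parent, minlength=n_base)
ranges = np.linalg.norm(ends - bases[parent], axis=1)
print(f"{n_base} bases, {len(ends)} ends")
print(f"mean fibers/base {counts.mean():.2f}, mean range {ranges.mean():.4f}")
```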
Tapia, Gustavo; Khairallah, Saad A.; Matthews, Manyalibo J.; ...
2017-09-22
Here, Laser Powder-Bed Fusion (L-PBF) metal-based additive manufacturing (AM) is complex and not fully understood. Successful processing for one material might not necessarily apply to a different material. This paper describes a workflow process that aims at creating a material data sheet standard that describes regimes where the process can be expected to be robust. The procedure consists of building a Gaussian process-based surrogate model of the L-PBF process that predicts melt pool depth in single-track experiments given a laser power, scan speed, and laser beam size combination. The predictions are then mapped onto a power versus scan speed diagram delimiting the conduction from the keyhole melting controlled regimes. This statistical framework is shown to be robust even for cases where experimental training data might be suboptimal in quality, if appropriate physics-based filters are applied. Additionally, it is demonstrated that a high-fidelity simulation model of L-PBF can equally be successfully used for building a surrogate model, which is beneficial since simulations are getting more efficient and are more practical for studying the response of different materials than re-tooling an AM machine for a new material powder.
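A minimal sketch of such a surrogate, using scikit-learn's Gaussian process regressor on synthetic stand-in single-track data (the depth trend, units, kernel length scales, and the use of scikit-learn itself are assumptions, not the paper's setup):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel, WhiteKernel

rng = np.random.default_rng(5)

# Stand-in data: (laser power [W], scan speed [mm/s], beam size [um]).
X = rng.uniform([100, 500, 50], [400, 2500, 120], size=(40, 3))
# Hypothetical melt-pool depth trend [um]: deeper with power, shallower
# with speed, plus measurement noise.
y = 30 + 0.5 * X[:, 0] - 0.03 * X[:, 1] + 0.2 * X[:, 2] + rng.normal(0, 5, 40)

kernel = ConstantKernel() * RBF(length_scale=[100, 500, 30]) + WhiteKernel()
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# Query the surrogate on a power-speed grid at a fixed beam size, as one
# would to map conduction vs. keyhole regimes against a depth threshold.
P, V = np.meshgrid(np.linspace(100, 400, 5), np.linspace(500, 2500, 5))
grid = np.column_stack([P.ravel(), V.ravel(), np.full(P.size, 80.0)])
depth, depth_sd = gp.predict(grid, return_std=True)
print(np.round(depth.reshape(P.shape), 1))
```

The predictive standard deviation returned alongside the mean is what makes a GP surrogate attractive here: it flags regions of the power-speed diagram where the data sheet should not be trusted.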
Pen-based computers: Computers without keys
NASA Technical Reports Server (NTRS)
Conklin, Cheryl L.
1994-01-01
The National Space Transportation System (NSTS) is comprised of many diverse and highly complex systems incorporating the latest technologies. Data collection associated with ground processing of the various Space Shuttle system elements is extremely challenging due to the many separate processing locations where data is generated. This presents a significant problem when the timely collection, transfer, collation, and storage of data is required. This paper describes how new technology, referred to as pen-based computers, is being used to transform the data collection process at Kennedy Space Center (KSC). Pen-based computers have streamlined procedures, increased data accuracy, and now provide more complete information than previous methods. The end result is the elimination of Shuttle processing delays associated with data deficiencies.
Influence of branding on preference-based decision making.
Philiastides, Marios G; Ratcliff, Roger
2013-07-01
Branding has become one of the most important determinants of consumer choices. Intriguingly, the psychological mechanisms of how branding influences decision making remain elusive. In the research reported here, we used a preference-based decision-making task and computational modeling to identify which internal components of processing are affected by branding. We found that a process of noisy temporal integration of subjective value information can model preference-based choices reliably and that branding biases are explained by changes in the rate of the integration process itself. This result suggests that branding information and subjective preference are integrated into a single source of evidence in the decision-making process, thereby altering choice behavior.
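The modeling claim, that branding changes the rate of evidence integration rather than some separate stage, can be illustrated with a simple diffusion simulation in which brand information adds to the drift rate; the effect sizes below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(6)

def ddm_trial(drift, bound=1.0, dt=1e-3, noise=1.0, max_t=5.0):
    """One noisy-integration trial: returns (choice, reaction time)."""
    x, t = 0.0, 0.0
    while abs(x) < bound and t < max_t:
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return (1 if x > 0 else 0), t

# Hypothetical effect sizes: branding adds directly to the drift rate,
# i.e., brand information and subjective preference form one evidence source.
base_drift, brand_boost = 0.3, 0.4

for label, drift in [("unbranded", base_drift),
                     ("branded", base_drift + brand_boost)]:
    res = [ddm_trial(drift) for _ in range(1000)]
    choices = np.array([c for c, _ in res])
    rts = np.array([t for _, t in res])
    print(f"{label:9s}: P(choose item) = {choices.mean():.3f}, "
          f"mean RT = {rts.mean():.3f} s")
```

The branded condition both raises choice probability and shortens reaction times, the joint signature that a drift-rate account predicts.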
Through-process modelling of texture and anisotropy in AA5182
NASA Astrophysics Data System (ADS)
Crumbach, M.; Neumann, L.; Goerdeler, M.; Aretz, H.; Gottstein, G.; Kopp, R.
2006-07-01
A through-process texture and anisotropy prediction for AA5182 sheet production from hot rolling through cold rolling and annealing is reported. Thermo-mechanical process data predicted by the finite element method (FEM) package T-Pack based on the software LARSTRAN were fed into a combination of physics based microstructure models for deformation texture (GIA), work hardening (3IVM), nucleation texture (ReNuc), and recrystallization texture (StaRT). The final simulated sheet texture was fed into a FEM simulation of cup drawing employing a new concept of interactively updated texture based yield locus predictions. The modelling results of texture development and anisotropy were compared to experimental data. The applicability to other alloys and processes is discussed.
Analysis of InP-based single photon avalanche diodes based on a single recess-etching process
NASA Astrophysics Data System (ADS)
Lee, Kiwon
2018-04-01
The effects of different etching techniques have been investigated by analyzing the electrical and optical characteristics of two types of single-diffused single photon avalanche diodes (SPADs). Because both types of SPADs were fabricated with a single, simultaneous diffusion process, they exhibit no diffusion-depth variation. The dry-etched SPADs show a higher temperature dependence of the breakdown voltage, a larger dark count rate (DCR), and a lower photon detection efficiency (PDE) than the wet-etched SPADs, due to plasma-induced damage from the dry-etching process. The results show that dry-etching damage can significantly affect the performance of SPADs based on a single recess-etching process.
Research on manufacturing service behavior modeling based on block chain theory
NASA Astrophysics Data System (ADS)
Zhao, Gang; Zhang, Guangli; Liu, Ming; Yu, Shuqin; Liu, Yali; Zhang, Xu
2018-04-01
According to the attribute characteristics of the processing craft, manufacturing service behavior is divided into service attributes, basic attributes, process attributes and resource attributes, and an attribute information model of the manufacturing service is established. The manufacturing service behavior information is divided into public and private domains. Additionally, block chain technology is introduced, and an information model of the manufacturing service based on the block chain principle is established, which solves the problem of sharing and securing processing-behavior information and ensures that data are not tampered with. Based on the key pairing verification relationship, a selective publishing mechanism for manufacturing information is established, achieving traceability of product data and guaranteeing processing quality.
Manufacturing process and material selection in concurrent collaborative design of MEMS devices
NASA Astrophysics Data System (ADS)
Zha, Xuan F.; Du, H.
2003-09-01
In this paper we present a knowledge-intensive approach and system for selecting suitable manufacturing processes and materials for microelectromechanical systems (MEMS) devices in a concurrent collaborative design environment. Fundamental issues in MEMS manufacturing process and material selection, such as the concurrent design framework, manufacturing process and material hierarchies, and the selection strategy, are first addressed. Then, a fuzzy decision support scheme for a multi-criteria decision-making problem is proposed for estimating, ranking and selecting possible manufacturing processes, materials and their combinations. A Web-based prototype advisory system for MEMS manufacturing process and material selection, WebMEMS-MASS, is developed based on the client-knowledge server architecture and framework to help the designer find good processes and materials for MEMS devices. The system, one of the important parts of an advanced simulation and modeling tool for MEMS design, is a concept-level process and material selection tool, which can be used as a standalone application or a Java applet via the Web. The running sessions of the system are inter-linked with web pages of tutorials and reference pages to explain the facets, fabrication processes and material choices, and the calculations and reasoning in selection are performed using process capability and material property data from a remote Web-based database and an interactive knowledge base that can be maintained and updated via the Internet. The use of the developed system, including an operation scenario, user support, and integration with a MEMS collaborative design system, is presented. Finally, an illustrative example is provided.
Empowering occupational therapists to become evidence-based work rehabilitation practitioners.
Vachon, Brigitte; Durand, Marie-José; LeBlanc, Jeannette
2010-01-01
Occupational therapists (OTs) engage in continuing education to integrate the best available knowledge and skills into their practice. However, many barriers influence the degree to which they are currently able to integrate research evidence into their clinical decision-making process. The specific objectives were to explore the clinical decision-making processes they used, and to describe the empowerment process they developed to become evidence-based practitioners. Eight OTs, who had attended a four-day workshop on evidence-based work rehabilitation, were recruited to participate in a reflective practice group. A collaborative research methodology was used. The group was convened for 12 meetings held during a 15-month period. The data collected were analyzed using the grounded theory method. The results revealed the different decision-making modes used by OTs: defensive, repressed, cautious, autonomous intuitive and autonomous thoughtful. These modes influenced utilization of evidence and determined the stances taken toward practice change. Reflective learning facilitated their utilization of an evidence-based practice model through a three-level empowerment process: deliberateness, client-centeredness and system mindedness. During the course of this study, participants learned to become evidence-based practitioners. This process had an impact on how they viewed their clients, their practice and the work rehabilitation system.
NASA Astrophysics Data System (ADS)
Telipenko, E.; Chernysheva, T.; Zakharova, A.; Dumchev, A.
2015-10-01
This article presents research results on the development of a knowledge base for an intellectual information system for enterprise bankruptcy risk assessment. The process analysis of the knowledge base development is described; the main process stages, some problems and their solutions are given. The article introduces a connectionist model for bankruptcy risk assessment based on the analysis of industrial enterprise financial accounting. The basis for this connectionist model is a three-layer perceptron trained with the back-propagation-of-error algorithm. The knowledge base for the intellectual information system consists of the processed information and the processing method, represented as the connectionist model. The article presents the structure of the intellectual information system, the knowledge base, and the information processing algorithm for neural network training. The paper shows mean values of 10 indexes for industrial enterprises, with the help of which it is possible to carry out a financial analysis of industrial enterprises and correctly identify the current situation for well-timed managerial decisions. Results are given on neural network testing against data from both bankrupt and financially strong enterprises that were not included in the training and test sets.
Müller-Staub, Maria; de Graaf-Waar, Helen; Paans, Wolter
2016-11-01
Nurses are accountable to apply the nursing process, which is key for patient care: It is a problem-solving process providing the structure for care plans and documentation. The state-of-the-art nursing process is based on classifications that contain standardized concepts, and therefore, it is named Advanced Nursing Process. It contains valid assessments, nursing diagnoses, interventions, and nursing-sensitive patient outcomes. Electronic decision support systems can assist nurses to apply the Advanced Nursing Process. However, nursing decision support systems are missing, and no "gold standard" is available. The study aim is to develop a valid Nursing Process-Clinical Decision Support System Standard to guide future developments of clinical decision support systems. In a multistep approach, a Nursing Process-Clinical Decision Support System Standard with 28 criteria was developed. After pilot testing (N = 29 nurses), the criteria were reduced to 25. The Nursing Process-Clinical Decision Support System Standard was then presented to eight internationally known experts, who performed qualitative interviews according to Mayring. Fourteen categories demonstrate expert consensus on the Nursing Process-Clinical Decision Support System Standard and its content validity. All experts agreed the Advanced Nursing Process should be the centerpiece for the Nursing Process-Clinical Decision Support System and should suggest research-based, predefined nursing diagnoses and correct linkages between diagnoses, evidence-based interventions, and patient outcomes.
Balancing Act: How to Capture Knowledge without Killing It.
ERIC Educational Resources Information Center
Brown, John Seely; Duguid, Paul
2000-01-01
Top-down processes for institutionalizing ideas can stifle creativity. Xerox researchers learned how to combine process-based and practice-based methods in order to disseminate best practices from a community of repair technicians. (JOW)
Hsu, Chun-Wei; Goh, Joshua O. S.
2016-01-01
When comparing between the values of different choices, human beings can rely on either more cognitive processes, such as using mathematical computation, or more affective processes, such as using emotion. However, the neural correlates of how these two types of processes operate during value-based decision-making remain unclear. In this study, we investigated the extent to which neural regions engaged during value-based decision-making overlap with those engaged during mathematical and emotional processing in a within-subject manner. In a functional magnetic resonance imaging experiment, participants viewed stimuli that always consisted of numbers and emotional faces that depicted two choices. Across tasks, participants decided between the two choices based on the expected value of the numbers, a mathematical result of the numbers, or the emotional face stimuli. We found that all three tasks commonly involved various cortical areas including frontal, parietal, motor, somatosensory, and visual regions. Critically, the mathematical task shared common areas with the value but not emotion task in bilateral striatum. Although the emotion task overlapped with the value task in parietal, motor, and sensory areas, the mathematical task also evoked responses in other areas within these same cortical structures. Minimal areas were uniquely engaged for the value task apart from the other two tasks. The emotion task elicited a more expansive area of neural activity whereas value and mathematical task responses were in more focal regions. Whole-brain spatial correlation analysis showed that valuative processing engaged functional brain responses more similarly to mathematical processing than emotional processing. While decisions on expected value entail both mathematical and emotional processing regions, mathematical processes have a more prominent contribution particularly in subcortical processes. PMID:27375466
Yu, Wen-Kang; Dong, Ling; Pei, Wen-Xuan; Sun, Zhi-Rong; Dai, Jun-Dong; Wang, Yun
2017-12-01
The whole-process quality control and management of traditional Chinese medicine (TCM) decoction pieces is a systems engineering task, involving the base environment, seeds and seedlings, harvesting, processing and other steps, so accurate identification of the factors in the TCM production process that may induce quality risk, as well as reasonable quality control measures, are very important. At present, the concept of quality risk is mainly discussed in terms of management and regulations, and there is no comprehensive analysis of the possible risks in the quality control process of TCM decoction pieces, nor a summary of effective quality control schemes. A whole-process quality control and management system for TCM decoction pieces based on a TCM quality tree is proposed in this study. This system effectively combines the process analysis method of the TCM quality tree with quality risk management, and can help managers make real-time decisions while realizing whole-process quality control of TCM. By providing a personalized web interface, the system realizes user-oriented information feedback and makes it convenient for users to predict, evaluate and control TCM quality. In application, the system can identify related quality factors such as the base environment, cultivation and pieces processing, extend and modify the existing scientific workflow according to the producer's own conditions, and provide different enterprises with their own quality systems to achieve personalized service. As a new quality management model, this work can provide a reference for improving the quality of Chinese medicine production and quality standardization. Copyright© by the Chinese Pharmaceutical Association.
Data near processing support for climate data analysis
NASA Astrophysics Data System (ADS)
Kindermann, Stephan; Ehbrecht, Carsten; Hempelmann, Nils
2016-04-01
Climate data repositories grow exponentially in size. Scalable data-near processing capabilities are required to meet future data analysis requirements and to replace current "download and process at home" workflows. On the one hand, these processing capabilities should be accessible via standardized interfaces (e.g. OGC WPS); on the other hand, a large variety of processing tools, toolboxes and deployment alternatives have to be supported and maintained at the data/processing center. We present a community approach of a modular and flexible system supporting the development, deployment and maintenance of OGC-WPS based web processing services. This approach is organized in an open-source GitHub project (called "bird-house") supporting individual processing services ("birds", e.g. climate index calculations, model data ensemble calculations), which rely on common infrastructural components (e.g. installation and deployment recipes, analysis code dependency management). To support easy deployment at data centers as well as home institutes (e.g. for testing and development), the system manages the often very complex package dependency chains of climate data analysis packages and supports Docker-based packaging and installation. We present a concrete deployment scenario at the German Climate Computing Center (DKRZ). DKRZ on the one hand hosts a multi-petabyte climate archive which is integrated into the European ENES and worldwide ESGF data infrastructures, and on the other hand hosts an HPC center supporting (model) data production and data analysis. The deployment scenario also includes OpenStack-based data cloud services to support data import and distribution for bird-house based WPS web processing services. Current challenges for inter-institutional deployments of web processing services supporting the European and international climate modeling community as well as the climate impact community are highlighted, and aspects supporting future WPS-based cross-community usage scenarios for data reuse and data provenance are discussed.
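As a flavor of the standardized interface involved, any OGC WPS 1.0.0 service (birdhouse-based or otherwise) can be asked for its process offerings with a plain GetCapabilities request. The endpoint URL below is a placeholder, not a real DKRZ service.

```python
import requests
import xml.etree.ElementTree as ET

# Hypothetical endpoint of a birdhouse-style WPS; substitute a real one.
url = "https://example.org/wps"
params = {"service": "WPS", "request": "GetCapabilities", "version": "1.0.0"}

resp = requests.get(url, params=params, timeout=30)
resp.raise_for_status()

# List the processes the service offers (climate indices, ensemble stats, ...).
ns = {"wps": "http://www.opengis.net/wps/1.0.0",
      "ows": "http://www.opengis.net/ows/1.1"}
root = ET.fromstring(resp.content)
for proc in root.findall(".//wps:Process", ns):
    ident = proc.find("ows:Identifier", ns)
    title = proc.find("ows:Title", ns)
    print(ident.text, "-", title.text if title is not None else "")
```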
2010-01-01
Comparative Effectiveness Research, or other efforts to determine best practices and to develop guidelines based on meta-analysis and evidence-based medicine. An...authoritative reviews or other evidence-based medicine sources, but they have been made unambiguous and computable – a process which sounds...best practice recommendation created through an evidence-based medicine (EBM) development process. The lifecycle envisions four stages of refinement
Budin, Francois; Hoogstoel, Marion; Reynolds, Patrick; Grauer, Michael; O'Leary-Moore, Shonagh K; Oguz, Ipek
2013-01-01
Magnetic resonance imaging (MRI) of rodent brains enables study of the development and the integrity of the brain under certain conditions (alcohol, drugs etc.). However, these images are difficult to analyze for biomedical researchers with limited image processing experience. In this paper we present an image processing pipeline running on a Midas server, a web-based data storage system. It is composed of the following steps: rigid registration, skull-stripping, average computation, average parcellation, parcellation propagation to individual subjects, and computation of region-based statistics on each image. The pipeline is easy to configure and requires very little image processing knowledge. We present results obtained by processing a data set using this pipeline and demonstrate how this pipeline can be used to find differences between populations.
Optimized Laplacian image sharpening algorithm based on graphic processing unit
NASA Astrophysics Data System (ADS)
Ma, Tinghuai; Li, Lu; Ji, Sai; Wang, Xin; Tian, Yuan; Al-Dhelaan, Abdullah; Al-Rodhaan, Mznah
2014-12-01
In classical Laplacian image sharpening, all pixels are processed one by one, which leads to a large amount of computation. Traditional Laplacian sharpening on a CPU is considerably time-consuming, especially for large images. In this paper, we propose a parallel implementation of Laplacian sharpening based on the Compute Unified Device Architecture (CUDA), a computing platform for Graphic Processing Units (GPUs), and analyze the impact of image size on performance as well as the relationship between data transfer time and parallel computing time. Further, according to the features of the different memory types, an improved scheme of our method is developed, which exploits shared memory in the GPU instead of global memory and further increases efficiency. Experimental results show that the two novel algorithms outperform the traditional sequential method based on OpenCV in terms of computing speed.
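The underlying per-pixel stencil is easy to state. Below is a serial NumPy/SciPy sketch of the same arithmetic a CUDA version parallelizes; the 4-neighbor kernel and unit gain are conventional assumptions, not necessarily the paper's exact choices.

```python
import numpy as np
from scipy.signal import convolve2d

# Classical Laplacian sharpening: out = img - k * (img convolved with L).
laplacian = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]], dtype=float)

def sharpen(img, k=1.0):
    """Subtract the Laplacian response to boost edges; clip to valid range."""
    lap = convolve2d(img, laplacian, mode="same", boundary="symm")
    return np.clip(img - k * lap, 0, 255)

img = np.random.default_rng(7).uniform(0, 255, (512, 512))
out = sharpen(img)
print(out.shape, out.min(), out.max())
```

Each output pixel depends only on its 3x3 neighborhood, which is why the computation maps so well onto one GPU thread per pixel, with shared memory caching each thread block's neighborhood tile.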
NASA Technical Reports Server (NTRS)
Hopkins, R. H.; Davis, J. R.; Blais, P. D.; Rohatgi, A.; Campbell, R. B.; Rai-Choudhury, P.; Mollenkopf, H. C.; Mccormick, J. R.
1979-01-01
The 13th quarterly report of a study entitled an Investigation of the Effects of Impurities and Processing on Silicon Solar Cells is given. The objective of the program is to define the effects of impurities, various thermochemical processes and any impurity-process interactions on the performance of terrestrial silicon solar cells. The Phase 3 program effort falls in five areas: (1) cell processing studies; (2) completion of the data base and impurity-performance modeling for n-base cells; (3) extension of p-base studies to include contaminants likely to be introduced during silicon production, refining or crystal growth; (4) anisotropy effects; and (5) a preliminary study of the permanence of impurity effects in silicon solar cells. The quarterly activities for this report focus on tasks (1), (3) and (4).
Process-oriented guided-inquiry learning: a natural fit for occupational therapy education.
Jaffe, Lynn; Gibson, Robert; D'Amico, Mariana
2015-04-01
After a brief review of the major group cooperative learning strategies, this article presents the format and use of Process-Oriented Guided-Inquiry Learning (POGIL) as a recommended teaching strategy for occupational therapy classes. This recommendation is based upon evidence of effectiveness of this strategy for enhancing critical thinking, content retention, and teamwork. Strategies for learning the process and suggestions for its use are based upon literature evidence and the authors' experiences with this strategy over 4 years in a class on evidence-based practice.
Wide-bandgap III-Nitride based Second Harmonic Generation
2014-10-02
[Fragment; only figure captions are recoverable: Fig. 1, the 3-step fabrication process of a GaN-based lateral polar structure (a: growth of a 20 nm AlN buffer layer; etching of the LT-AlN stripes); Fig. 2, AFM images of KOH- (a) and RIE- (b) patterned templates for lateral polar structures; Fig. 3, growth process of AlGaN-based lateral polar structures (a: RIE patterning; b: growth of HT-AlN).]
Negotiation-based Order Lot-Sizing Approach for Two-tier Supply Chain
NASA Astrophysics Data System (ADS)
Chao, Yuan; Lin, Hao Wen; Chen, Xili; Murata, Tomohiro
This paper focuses on a negotiation-based collaborative planning process for the determination of order lot-sizes over multi-period planning, confined to a two-tier supply chain scenario. The aim is to study how negotiation-based planning processes can be used to refine locally preferred ordering patterns, which consequently affect the overall performance of the supply chain in terms of costs and service level. Minimal information exchanges in the form of mathematical models are suggested to represent the local preferences and to support the negotiation processes.
Model-Based Verification and Validation of the SMAP Uplink Processes
NASA Technical Reports Server (NTRS)
Khan, M. Omair; Dubos, Gregory F.; Tirona, Joseph; Standley, Shaun
2013-01-01
This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based V&V development efforts.
ERIC Educational Resources Information Center
Türker, Fatih Mehmet
2016-01-01
In today's world, where online learning environments have increased their efficiency in education and training, the design of the websites prepared for education and training purposes has become an important process. This study is about the teaching process of the online learning environments created to teach Turkish in web based environments, and…
ERIC Educational Resources Information Center
Singh, A. S.; Chinapaw, M. J. M.; Brug, J.; van Mechelen, W.
2009-01-01
Health promotion programs benefit from an accompanying process evaluation since it can provide more insight in the strengths and weaknesses of a program. A process evaluation was conducted to assess the reach, implementation, satisfaction and maintenance of a school-based program aimed at the prevention of excessive weight gain among Dutch…
Determination of the smoke-plume heights and their dynamics with ground-based scanning LIDAR
V. Kovalev; A. Petkov; C. Wold; S. Urbanski; W. M. Hao
2015-01-01
Lidar-data processing techniques are analyzed, which allow determining smoke-plume heights and their dynamics and can be helpful for the improvement of smoke dispersion and air quality models. The data processing algorithms considered in the paper are based on the analysis of two alternative characteristics related to the smoke dispersion process: the regularized...
NASA Astrophysics Data System (ADS)
Melkozyorova, N. A.; Zinkevich, K. G.; Lebedev, E. A.; Alekseyev, A. V.; Gromov, D. G.; Kitsyuk, E. P.; Ryazanov, R. M.; Sysa, A. V.
2017-11-01
The features of electrophoretic deposition process of composite LiCoO2-based cathode and Si-based anode materials were researched. The influence of the deposition process parameters on the structure and composition of the deposit was revealed. The possibility of a local deposition of composites on a planar lithium-ion battery structure was demonstrated.
Lannering, Christina; Ernsth Bravell, Marie; Johansson, Linda
2017-05-01
A structured and systematic care process for preventive work, aimed to reduce falls, pressure ulcers and malnutrition among older people, has been developed in Sweden. The process involves risk assessment, team-based interventions and evaluation of results. Since development, this structured work process has become web-based and has been implemented in a national quality registry called 'Senior Alert' and used countrywide. The aim of this study was to describe nursing staff's experience of preventive work by using the structured preventive care process as outlined by Senior Alert. Eight focus group interviews were conducted during 2015 including staff from nursing homes and home-based nursing care in three municipalities. The interview material was subjected to qualitative content analysis. In this study, both positive and negative opinions were expressed about the process. The systematic and structured work flow seemed to only partly facilitate care providers to improve care quality by making better clinical assessments, performing team-based planned interventions and learning from results. Participants described lack of reliability in the assessments and varying opinions about the structure. Furthermore, organisational structures limited the preventive work. © 2016 John Wiley & Sons Ltd.
Simplified process model discovery based on role-oriented genetic mining.
Zhao, Weidong; Liu, Xi; Dai, Weihui
2014-01-01
Process mining is the automated acquisition of process models from event logs. Although many process mining techniques have been developed, most of them are based on control flow. Meanwhile, existing role-oriented process mining methods focus on the correctness and integrity of roles while ignoring the role complexity of the process model, which directly impacts the understandability and quality of the model. To address these problems, we propose a genetic programming approach to mine simplified process models. Using a new metric of process complexity in terms of roles as the fitness function, we can find simpler process models. The new role complexity metric of process models is designed from role cohesion and coupling, and applied to discover roles in process models. Moreover, the higher fitness derived from the role complexity metric also provides a guideline for redesigning process models. Finally, we conduct a case study and experiments to show that the proposed method is more effective for streamlining the process, in comparison with related studies.
Mode extraction on wind turbine blades via phase-based video motion estimation
NASA Astrophysics Data System (ADS)
Sarrafi, Aral; Poozesh, Peyman; Niezrecki, Christopher; Mao, Zhu
2017-04-01
In recent years, image processing techniques have been applied more often for structural dynamics identification, characterization, and structural health monitoring. Although it is a non-contact and full-field measurement method, image processing still has a long way to go to outperform conventional sensing instruments (e.g. accelerometers, strain gauges, laser vibrometers). However, the technologies associated with image processing are developing rapidly and gaining more attention in a variety of engineering applications, including structural dynamics identification and modal analysis. Among numerous motion estimation and image-processing methods, phase-based video motion estimation is considered one of the most efficient regarding computational cost and noise robustness. In this paper, phase-based video motion estimation is adopted for structural dynamics characterization on a 2.3-meter-long Skystream wind turbine blade, and the modal parameters (natural frequencies, operating deflection shapes) are extracted. The phase-based video processing adopted in this paper provides reliable full-field 2-D motion information, which is beneficial for manufacturing certification and model updating at the design stage. The approach is demonstrated by processing data on a full-scale commercial structure (i.e. a wind turbine blade) with complex geometry and properties, and the results obtained correlate well with the modal parameters extracted from accelerometer measurements, especially for the first four bending modes, which are of significant importance in blade characterization.
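The core of phase-based motion estimation can be shown in one dimension: filter two frames with a complex Gabor filter tuned near the structure's spatial frequency, then convert the local phase difference into displacement. The sketch below is a toy 1D illustration with assumed frequencies and noise, not the blade-measurement pipeline.

```python
import numpy as np

rng = np.random.default_rng(8)

# Two "frames" of a 1D intensity profile, the second shifted sub-pixel.
n, true_shift = 256, 0.3
x = np.arange(n)
frame0 = np.cos(2 * np.pi * x / 32) + 0.05 * rng.standard_normal(n)
frame1 = np.cos(2 * np.pi * (x - true_shift) / 32) \
         + 0.05 * rng.standard_normal(n)

# Complex Gabor filter tuned near the signal's spatial frequency.
f0, sd = 1 / 32, 12.0
t = np.arange(-40, 41)
gabor = np.exp(-t**2 / (2 * sd**2)) * np.exp(2j * np.pi * f0 * t)

r0 = np.convolve(frame0, gabor, mode="same")
r1 = np.convolve(frame1, gabor, mode="same")

# Local phase difference over spatial frequency gives displacement
# (the minus sign accounts for the convolution phase convention).
dphi = np.angle(r1 * np.conj(r0))
shift = -np.median(dphi[50:-50]) / (2 * np.pi * f0)
print(f"estimated shift {shift:.3f} px (true {true_shift} px)")
```

Repeating this per pixel and per frame yields the full-field displacement time histories from which operating deflection shapes and natural frequencies are extracted.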
Generalization bounds of ERM-based learning processes for continuous-time Markov chains.
Zhang, Chao; Tao, Dacheng
2012-12-01
Many existing results on statistical learning theory are based on the assumption that samples are independently and identically distributed (i.i.d.). However, the assumption of i.i.d. samples is not suitable for practical application to problems in which samples are time dependent. In this paper, we are mainly concerned with the empirical risk minimization (ERM) based learning process for time-dependent samples drawn from a continuous-time Markov chain. This learning process covers many kinds of practical applications, e.g., the prediction for a time series and the estimation of channel state information. Thus, it is significant to study its theoretical properties including the generalization bound, the asymptotic convergence, and the rate of convergence. It is noteworthy that, since samples are time dependent in this learning process, the concerns of this paper cannot (at least straightforwardly) be addressed by existing methods developed under the sample i.i.d. assumption. We first develop a deviation inequality for a sequence of time-dependent samples drawn from a continuous-time Markov chain and present a symmetrization inequality for such a sequence. By using the resultant deviation inequality and symmetrization inequality, we then obtain the generalization bounds of the ERM-based learning process for time-dependent samples drawn from a continuous-time Markov chain. Finally, based on the resultant generalization bounds, we analyze the asymptotic convergence and the rate of convergence of the learning process.
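For orientation, the classical i.i.d. result that work of this kind extends has the following Rademacher-complexity form (for losses in [0, 1]); the paper's Markov-chain bounds replace the underlying concentration step with the deviation inequality derived for time-dependent samples, and their exact form is not reproduced here.

```latex
% Classical i.i.d. ERM generalization bound, stated for orientation only.
\[
  \sup_{f \in \mathcal{F}} \Bigl( R(f) - \widehat{R}_n(f) \Bigr)
  \;\le\; 2\,\mathfrak{R}_n(\mathcal{F})
  + \sqrt{\frac{\ln(1/\delta)}{2n}}
  \quad \text{with probability at least } 1-\delta,
\]
```

Here R(f) is the expected risk, R̂_n(f) the empirical risk over n samples, and ℜ_n(F) the Rademacher complexity of the hypothesis class F.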
ERIC Educational Resources Information Center
Kennedy, Kerry J.
The processes of instructional materials development and dissemination used in four Stanford Program on International and Cross Cultural Education (SPICE) projects dealing with Latin America, Africa, China, and Japan are described, and evaluative comments based on a review of the curriculum development process are made. The major purpose of the…
Wijmans, Johannes G.; Baker, Richard W.; Merkel, Timothy C.
2012-08-21
A gas separation process for treating flue gases from combustion processes, and combustion processes including such gas separation. The invention involves routing a first portion of the flue gas stream to be treated to an absorption-based carbon dioxide capture step, while simultaneously flowing a second portion of the flue gas across the feed side of a membrane, flowing a sweep gas stream, usually air, across the permeate side, then passing the permeate/sweep gas to the combustor.
Kim, Sunghee; Shin, Gisoo
2016-02-01
Since previous studies on simulation-based education have been focused on fundamental nursing skills for nursing students in South Korea, there is little research available that focuses on clinical nurses in simulation-based training. Further, there is a paucity of research literature related to the integration of the nursing process into simulation training particularly in the emergency nursing care of high-risk maternal and neonatal patients. The purpose of this study was to identify the effects of nursing process-based simulation on knowledge, attitudes, and skills for maternal and child emergency nursing care in clinical nurses in South Korea. Data were collected from 49 nurses, 25 in the experimental group and 24 in the control group, from August 13 to 14, 2013. This study was an equivalent control group pre- and post-test experimental design to compare the differences in knowledge, attitudes, and skills for maternal and child emergency nursing care between the experimental group and the control group. The experimental group was trained by the nursing process-based simulation training program, while the control group received traditional methods of training for maternal and child emergency nursing care. The experimental group was more likely to improve knowledge, attitudes, and skills required for clinical judgment about maternal and child emergency nursing care than the control group. Among five stages of nursing process in simulation, the experimental group was more likely to improve clinical skills required for nursing diagnosis and nursing evaluation than the control group. These results will provide valuable information on developing nursing process-based simulation training to improve clinical competency in nurses. Further research should be conducted to verify the effectiveness of nursing process-based simulation with more diverse nurse groups on more diverse subjects in the future. Copyright © 2015 Elsevier Ltd. All rights reserved.
Recollection is a continuous process: implications for dual-process theories of recognition memory.
Mickes, Laura; Wais, Peter E; Wixted, John T
2009-04-01
Dual-process theory, which holds that recognition decisions can be based on recollection or familiarity, has long seemed incompatible with signal detection theory, which holds that recognition decisions are based on a singular, continuous memory-strength variable. Formal dual-process models typically regard familiarity as a continuous process (i.e., familiarity comes in degrees), but they construe recollection as a categorical process (i.e., recollection either occurs or does not occur). A continuous process is characterized by a graded relationship between confidence and accuracy, whereas a categorical process is characterized by a binary relationship such that high confidence is associated with high accuracy but all lower degrees of confidence are associated with chance accuracy. Using a source-memory procedure, we found that the relationship between confidence and source-recollection accuracy was graded. Because recollection, like familiarity, is a continuous process, dual-process theory is more compatible with signal detection theory than previously thought.
Characterisation and Processing of Some Iron Ores of India
NASA Astrophysics Data System (ADS)
Krishna, S. J. G.; Patil, M. R.; Rudrappa, C.; Kumar, S. P.; Ravi, B. P.
2013-10-01
Lack of process characterization data on the ores, covering granulometry, texture, mineralogy, physical and chemical properties, merits and limitations of the process, and market and local conditions, may mislead the mineral processing entrepreneur. Proper implementation of process characterization and geotechnical map data will result in optimized, sustainable utilization of the resource by processing. A few case studies of process characterization of some Indian iron ores are dealt with. The tentative ascending order of process refractoriness of iron ores is: massive hematite/magnetite < marine black iron oxide sands < laminated soft friable siliceous ore fines < massive banded magnetite quartzite < laminated soft friable clayey aluminous ore fines < massive banded hematite quartzite/jasper < massive clayey hydrated iron oxide ore < massive manganese-bearing iron ores < Ti-V bearing magmatic magnetite ore < ferruginous cherty quartzite. Based on diagnostic process characterization, the ores have been classified and generic processes have been adopted for some Indian iron ores.
Model-based pH monitor for sensor assessment.
van Schagen, Kim; Rietveld, Luuk; Veersma, Alex; Babuska, Robert
2009-01-01
Owing to the nature of the treatment processes, monitoring the processes based on individual online measurements is difficult or even impossible. However, the measurements (online and laboratory) can be combined with a priori process knowledge, using mathematical models, to objectively monitor the treatment processes and measurement devices. The pH measurement is commonly used at different stages of a drinking water treatment plant, although it is an unreliable instrument requiring significant maintenance. It is shown that, using a grey-box model, it is possible to assess the measurement devices effectively, even if detailed information on the specific processes is unknown.
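A minimal sketch of the monitoring idea: predict pH from a crude grey-box model of the dosing step and raise a maintenance flag when the measurement-minus-model residual grows. The chemistry, constants, and threshold below are all illustrative assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(9)

def model_ph(base_dose, ph_in):
    """Crude titration-style grey-box model: dose [mmol/L] raises pH."""
    h_in = 10.0 ** (-ph_in) * 1e3          # free H+ in mmol/L
    h_out = max(h_in - base_dose, 1e-8)    # assume dose neutralizes H+ 1:1
    return -np.log10(h_out / 1e3)

threshold = 0.15   # pH units of residual before raising an alarm
drift = 0.0

for hour in range(24):
    ph_in, dose = 6.5, 2e-4                # assumed influent pH and dosing
    ph_model = model_ph(dose, ph_in)
    drift += 0.01                          # simulated slow electrode drift
    ph_meas = ph_model + drift + rng.normal(0, 0.02)
    resid = ph_meas - ph_model
    if abs(resid) > threshold:
        print(f"hour {hour:2d}: residual {resid:+.2f} -> check/maintain sensor")
```

The point is that the model need not be exact: a systematic, growing residual signals the sensor, not the process, as long as the model captures the dominant dosing response.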
Intelligent methods for the process parameter determination of plastic injection molding
NASA Astrophysics Data System (ADS)
Gao, Huang; Zhang, Yun; Zhou, Xundao; Li, Dequn
2018-03-01
Injection molding is one of the most widely used material processing methods in producing plastic products with complex geometries and high precision. The determination of process parameters is important in obtaining qualified products and maintaining product quality. This article reviews the recent studies and developments of the intelligent methods applied in the process parameter determination of injection molding. These intelligent methods are classified into three categories: case-based reasoning methods, expert system-based methods, and data fitting and optimization methods. A framework of process parameter determination is proposed after comprehensive discussions. Finally, the conclusions and future research topics are discussed.
MEMS-based system and image processing strategy for epiretinal prosthesis.
Xia, Peng; Hu, Jie; Qi, Jin; Gu, Chaochen; Peng, Yinghong
2015-01-01
Retinal prostheses have the potential to restore some level of visual function to the patients suffering from retinal degeneration. In this paper, an epiretinal approach with active stimulation devices is presented. The MEMS-based processing system consists of an external micro-camera, an information processor, an implanted electrical stimulator and a microelectrode array. The image processing strategy combining image clustering and enhancement techniques was proposed and evaluated by psychophysical experiments. The results indicated that the image processing strategy improved the visual performance compared with direct merging pixels to low resolution. The image processing methods assist epiretinal prosthesis for vision restoration.
Model-based query language for analyzing clinical processes.
Barzdins, Janis; Barzdins, Juris; Rencis, Edgars; Sostaks, Agris
2013-01-01
Nowadays large databases of clinical process data exist in hospitals. However, these data are rarely used to full scope. In order to perform queries on hospital processes, one must either choose from predefined queries or develop queries using MS Excel-type software, which is not always a trivial task. In this paper we propose a new query language for analyzing clinical processes that is easily comprehensible also for non-IT professionals. We develop this language based on a process modeling language which is also described in this paper. Prototypes of both languages have already been verified using real examples from hospitals.
Rule-Based Event Processing and Reaction Rules
NASA Astrophysics Data System (ADS)
Paschke, Adrian; Kozlenkov, Alexander
Reaction rules and event processing technologies play a key role in making business and IT / Internet infrastructures more agile and active. While event processing is concerned with detecting events from large event clouds or streams in almost real-time, reaction rules are concerned with the invocation of actions in response to events and actionable situations. They state the conditions under which actions must be taken. In the last decades various reaction rule and event processing approaches have been developed, which for the most part have been advanced separately. In this paper we survey reaction rule approaches and rule-based event processing systems and languages.
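A toy sketch of the event-condition-action pattern such surveys are concerned with: rules bind an event type to a condition and an action, and a dispatcher fires matching rules as events arrive. The names and thresholds below are invented for illustration.

```python
from dataclasses import dataclass
from typing import Callable

# Event-condition-action (ECA) reaction rules: rules state the conditions
# under which actions are invoked in response to detected events.
@dataclass
class Rule:
    event_type: str
    condition: Callable[[dict], bool]
    action: Callable[[dict], None]

rules = [
    Rule("order",
         lambda e: e["amount"] > 1000,
         lambda e: print(f"escalate: large order {e['id']}")),
    Rule("sensor",
         lambda e: e["temp"] > 90,
         lambda e: print(f"alarm: overheating on {e['id']}")),
]

def dispatch(event: dict) -> None:
    """On each incoming event, fire every matching rule whose condition holds."""
    for rule in rules:
        if rule.event_type == event["type"] and rule.condition(event):
            rule.action(event)

stream = [
    {"type": "order", "id": "o-17", "amount": 250},
    {"type": "order", "id": "o-18", "amount": 4800},
    {"type": "sensor", "id": "s-3", "temp": 97},
]
for ev in stream:
    dispatch(ev)
```

Real event processing systems add what this sketch omits: detection of composite events over streams (sequences, windows, correlations) in near real-time, upstream of the reaction rules.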
Dependent Neyman type A processes based on common shock Poisson approach
NASA Astrophysics Data System (ADS)
Kadilar, Gamze Özel; Kadilar, Cem
2016-04-01
The Neyman type A process is used for describing clustered data, since the Poisson process is insufficient for clustering of events. In a multivariate setting, there may be dependencies between multivariate Neyman type A processes. In this study, a dependent form of the Neyman type A process is considered under the common shock approach, and the joint probability function is derived for the dependent Neyman type A Poisson processes. An application based on forest fires in Turkey is then given. The results show that the joint probability function of the dependent Neyman type A processes obtained in this study can be a good tool for probabilistic modelling of the total number of burned trees in Turkey.
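The common-shock construction can be checked by simulation: both counts share a Poisson number of clusters Z0, each cluster contributing a Poisson number of events, which makes the two Neyman type A totals positively dependent. The rates below are illustrative, not the forest-fire estimates.

```python
import numpy as np

rng = np.random.default_rng(10)

def neyman_type_a(m, phi):
    """Total count from m Poisson clusters, each of Poisson(phi) size."""
    return rng.poisson(phi, size=m).sum()

# Common-shock construction: both processes share the cluster count Z0.
lam0, lam1, lam2, phi = 2.0, 3.0, 4.0, 5.0   # illustrative rates
n = 50000

x1 = np.empty(n)
x2 = np.empty(n)
for i in range(n):
    z0 = rng.poisson(lam0)                   # shared (common-shock) clusters
    m1 = rng.poisson(lam1) + z0
    m2 = rng.poisson(lam2) + z0
    x1[i] = neyman_type_a(m1, phi)
    x2[i] = neyman_type_a(m2, phi)

print(f"E[X1] ~= {x1.mean():.2f} (theory {(lam0 + lam1) * phi})")
print(f"E[X2] ~= {x2.mean():.2f} (theory {(lam0 + lam2) * phi})")
print(f"corr(X1, X2) = {np.corrcoef(x1, x2)[0, 1]:.3f}")
```

Since only the cluster counts are shared, the covariance is phi^2 * lam0, which the sample correlation reproduces; setting lam0 = 0 recovers two independent Neyman type A processes.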
NASA Technical Reports Server (NTRS)
Basili, Victor R.
1992-01-01
The concepts of quality improvement have permeated many businesses. It is clear that the nineties will be the quality era for software, and there is a growing need to develop or adapt quality improvement approaches to the software business. Thus we must understand software as an artifact and software as a business. Since the business we are dealing with is software, we must understand the nature of software and software development. The software discipline is evolutionary and experimental; it is a laboratory science. Software is development, not production. The technologies of the discipline are human based. There is a lack of models that allow us to reason about the process and the product. All software is not the same: process is a variable, goals are variable, etc. Packaged, reusable experiences require additional resources in the form of organization, processes, people, etc. A variety of organizational frameworks have been proposed to improve quality for various businesses. The ones discussed in this presentation include: Plan-Do-Check-Act, a quality improvement process based upon a feedback cycle for optimizing a single process model/production line; the Experience Factory/Quality Improvement Paradigm, continuous improvement through the experimentation, packaging, and reuse of experiences based upon a business's needs; Total Quality Management, a management approach to long-term success through customer satisfaction based on the participation of all members of an organization; the SEI Capability Maturity Model, a staged process improvement based upon assessment with regard to a set of key process areas until level 5 is reached, which represents continuous process improvement; and Lean (software) Development, a principle supporting the concentration of production on 'value-added' activities and the elimination or reduction of 'non-value-added' activities.
A Framework for Preliminary Design of Aircraft Structures Based on Process Information. Part 1
NASA Technical Reports Server (NTRS)
Rais-Rohani, Masoud
1998-01-01
This report discusses the general framework and development of a computational tool for preliminary design of aircraft structures based on process information. The described methodology is suitable for multidisciplinary design optimization (MDO) activities associated with integrated product and process development (IPPD). The framework consists of three parts: (1) product and process definitions; (2) engineering synthesis; and (3) optimization. The product and process definitions are part of the input information provided by the design team. The backbone of the system is its ability to analyze a given structural design for performance as well as manufacturability and cost assessment. The system uses a database on material systems and manufacturing processes. Based on the identified set of design variables and an objective function, the system is capable of performing optimization subject to manufacturability, cost, and performance constraints. The accuracy of the manufacturability measures and cost models discussed here depends largely on the available data on specific methods of manufacture and assembly and associated labor requirements. As such, our focus in this research has been on the methodology itself and not so much on its accurate implementation in an industrial setting. A three-tier approach is presented for an IPPD-MDO based design of aircraft structures. The variable-complexity cost estimation methodology and an approach for integrating manufacturing cost assessment into the design process are also discussed. This report is presented in two parts. In the first part, the design methodology is presented, and the computational design tool is described. In the second part, a prototype model of the preliminary design Tool for Aircraft Structures based on Process Information (TASPI) is described. Part two also contains an example problem that applies the methodology described here for evaluation of six different design concepts for a wing spar.
NASA Astrophysics Data System (ADS)
Ducoté, Julien; Dettoni, Florent; Bouyssou, Régis; Le-Gratiet, Bertrand; Carau, Damien; Dezauzier, Christophe
2015-03-01
Patterning process control of advanced nodes has required major changes over the last few years. Process control requirements for critical patterning levels have been extremely aggressive since the 28nm technology node, showing that metrology accuracy/sensitivity must be finely tuned. The introduction of pitch splitting (Litho-Etch-Litho-Etch) at the 14nm FDSOI node requires the development of specific metrologies to adopt advanced process control (for CD, overlay and focus corrections). The pitch splitting process leads to final line CD uniformities that are a combination of the CD uniformities of the two exposures, while the space CD uniformities depend on both CD and OVL variability. In this paper, investigations of CD and OVL process control of the 64nm minimum pitch at the Metal1 level of the 14nm FDSOI technology, within the double patterning process flow (litho, hard mask etch, line etch), are presented. Various measurements with SEMCD tools (Hitachi) and overlay tools (KT for Image Based Overlay - IBO, and ASML for Diffraction Based Overlay - DBO) are compared. Metrology targets are embedded within a block instanced several times within the field to characterize intra-field process variations. Specific SEMCD targets were designed for independent measurement of both line CDs (A and B) and space CDs (A to B and B to A) for each exposure within a single measurement during the DP flow. Based on those measurements, the correlation between overlay determined with SEMCD and with standard overlay tools can be evaluated. Such correlation at different steps through the DP flow is investigated with respect to the metrology type. Process correction models are evaluated with respect to the measurement type and the intra-field sampling.
Prediction of the properties of anhydrite construction mixtures based on a neural network approach
NASA Astrophysics Data System (ADS)
Fedorchuk, Y. M.; Zamyatin, N. V.; Smirnov, G. V.; Rusina, O. N.; Sadenova, M. A.
2017-08-01
The article considers the application of a neural-network-based modeling mechanism that predicts the properties of anhydrite mixtures from their components, as part of managing the technological processes for producing construction products based on fluoranhydrite.
What will the future of cloud-based astronomical data processing look like?
NASA Astrophysics Data System (ADS)
Green, Andrew W.; Mannering, Elizabeth; Harischandra, Lloyd; Vuong, Minh; O'Toole, Simon; Sealey, Katrina; Hopkins, Andrew M.
2017-06-01
Astronomy is rapidly approaching an impasse: very large datasets require remote or cloud-based parallel processing, yet many astronomers still try to download the data and develop serial code locally. Astronomers understand the need for change, but the hurdles remain high. We are developing a data archive designed from the ground up to simplify and encourage cloud-based parallel processing. While the volume of data we host remains modest by some standards, it is still large enough that download and processing times are measured in days and even weeks. We plan to implement a Python-based, notebook-like interface that automatically parallelises execution. Our goal is to provide an interface sufficiently familiar and user-friendly that it encourages astronomers to run their analysis on our system in the cloud: astroinformatics as a service. We describe how our system addresses the approaching impasse in astronomy using the SAMI Galaxy Survey as an example.
Model-based design of experiments for cellular processes.
Chakrabarty, Ankush; Buzzard, Gregery T; Rundell, Ann E
2013-01-01
Model-based design of experiments (MBDOE) assists in the planning of highly effective and efficient experiments. Although the foundations of this field are well-established, the application of these techniques to understand cellular processes is a fertile and rapidly advancing area as the community seeks to understand ever more complex cellular processes and systems. This review discusses the MBDOE paradigm along with applications and challenges within the context of cellular processes and systems. It also provides a brief tutorial on Fisher information matrix (FIM)-based and Bayesian experiment design methods along with an overview of existing software packages and computational advances that support MBDOE application and adoption within the Systems Biology community. As cell-based products and biologics progress into the commercial sector, it is anticipated that MBDOE will become an essential practice for design, quality control, and production. Copyright © 2013 Wiley Periodicals, Inc.
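For reference, the Fisher information matrix at the heart of the FIM-based methods tutorialized in this review is commonly written as follows for additive Gaussian measurement noise; this is a standard textbook form, not the review's exact notation, and the symbols (sensitivity matrix S, noise covariance Sigma, design d) are assumptions.

```latex
% Standard FIM for parameters theta under design d, with local
% sensitivities S_i evaluated at the current estimate; D-optimal
% design maximizes the determinant of F.
F(\theta, d) \;=\; \sum_{i=1}^{n} S_i^{\mathsf{T}} \, \Sigma^{-1} S_i,
\qquad
S_i \;=\; \left. \frac{\partial y(t_i, \theta, d)}{\partial \theta} \right|_{\hat{\theta}},
\qquad
d^{*} \;=\; \arg\max_{d} \, \det F(\theta, d)
```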
The Iterative Research Cycle: Process-Based Model Evaluation
NASA Astrophysics Data System (ADS)
Vrugt, J. A.
2014-12-01
The ever-increasing pace of growth in computational power, along with continued advances in measurement technologies and improvements in process understanding, has stimulated the development of increasingly complex physics-based models that simulate a myriad of processes at different spatial and temporal scales. Reconciling these high-order system models with perpetually larger volumes of field data is becoming more and more difficult, particularly because classical likelihood-based fitting methods lack the power to detect and pinpoint deficiencies in the model structure. In this talk I will give an overview of our latest research on process-based model calibration and evaluation. This approach, rooted in Bayesian theory, uses summary metrics of the calibration data rather than the data itself to help detect which component(s) of the model is (are) malfunctioning and in need of improvement. A few case studies involving hydrologic and geophysical models will be used to demonstrate the proposed methodology.
Method of plasma etching Ga-based compound semiconductors
Qiu, Weibin; Goddard, Lynford L.
2012-12-25
A method of plasma etching Ga-based compound semiconductors includes providing a process chamber and a source electrode adjacent to the process chamber. The process chamber contains a sample comprising a Ga-based compound semiconductor. The sample is in contact with a platen which is electrically connected to a first power supply, and the source electrode is electrically connected to a second power supply. The method includes flowing SiCl4 gas into the chamber, flowing Ar gas into the chamber, and flowing H2 gas into the chamber. RF power is supplied independently to the source electrode and the platen. A plasma is generated based on the gases in the process chamber, and regions of a surface of the sample adjacent to one or more masked portions of the surface are etched to create a substantially smooth etched surface including features having substantially vertical walls beneath the masked portions.
A digital signal processing system for coherent laser radar
NASA Technical Reports Server (NTRS)
Hampton, Diana M.; Jones, William D.; Rothermel, Jeffry
1991-01-01
A data processing system for use with continuous-wave lidar is described in terms of its configuration and performance during the second survey mission of NASA's Global Backscatter Experiment. The system is designed to estimate a complete lidar spectrum in real time, record the data from two lidars, and monitor variables related to the lidar operating environment. The PC-based system includes a transient capture board, a digital signal processing (DSP) board, and a low-speed data-acquisition board. Both unprocessed and processed lidar spectrum data are monitored in real time, and the results are compared to those of a previous non-DSP-based system. Because the DSP-based system is digital, it is slower than the surface-acoustic-wave signal processor, collecting 2500 spectra/s. However, the DSP-based system provides complete data sets at two wavelengths from the continuous-wave lidars.
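The abstract does not specify the spectrum estimator used, so the following is a generic, hedged stand-in for the kind of real-time spectrum estimation a DSP board performs: a windowed periodogram over a block of samples.

```python
import numpy as np

def lidar_spectrum(samples, fs):
    # Windowed periodogram estimate of a lidar return spectrum;
    # a generic sketch, not the mission's actual DSP code.
    window = np.hanning(len(samples))
    spectrum = np.abs(np.fft.rfft(samples * window)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    return freqs, spectrum

# Example: a synthetic 1 kHz "return" sampled at 50 kHz.
t = np.arange(2048) / 50e3
freqs, spec = lidar_spectrum(np.sin(2 * np.pi * 1e3 * t), fs=50e3)
print(freqs[np.argmax(spec)])  # peak near 1000 Hz
```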
Parker, Brian Corey; Myrick, Florence
2012-07-01
The use of the high-fidelity human patient simulator (HPS)-based clinical scenario in undergraduate nursing education is a powerful learning tool, well suited to modern nursing students' preference for immersive construction of knowledge through the provision of contextually rich reality-based practice and social discourse. The purpose of this study was to explore the social-psychological processes that occur within HPS-based clinical scenarios. Grounded theory method was used to study students and faculty sampled from a Western Canadian baccalaureate nursing program. The process of leveled coding generated a substantive theory that has the potential to enable educators to empower students through the use of fading support, a twofold process composed of adaptive scaffolding and dynamic assessment that challenges students to realistically self-regulate and transform their frame of reference for nursing practice, while limiting the threats that traditional HPS-based curriculum can impose. Copyright 2012, SLACK Incorporated.
A Descriptive Study of a Building-Based Team Problem-Solving Process
ERIC Educational Resources Information Center
Brewer, Alexander B.
2010-01-01
The purpose of this study was to empirically evaluate Building-Based Teams for General Education Intervention or BBT for GEI. BBT for GEI is a team problem-solving process designed to assist schools in conducting research-based interventions in the general education setting. Problem-solving teams are part of general education and provide support…
ERIC Educational Resources Information Center
Ignatova, Natalija; Dagiene, Valentina; Kubilinskiene, Svetlana
2015-01-01
How to enable students to create a personalized learning environment? What are the criteria of evaluation of the ICT-based learning process personalization affordance? These questions are answered by conducting multiple case study research of the innovative ICT-based learning process in iTEC (Innovative Technologies for Engaging Classrooms)…
ERIC Educational Resources Information Center
Haque, Mohammad Mahfujul; Little, David C.; Barman, Benoy K.; Wahab, Md. Abdul
2010-01-01
Purpose: The purpose of the study was to understand the adoption process of ricefield based fish seed production (RBFSP) that has been developed, promoted and established in Northwest Bangladesh. Design/Methodology/Approach: Quantitative investigation based on regression analysis and qualitative investigation using semi-structured interview were…
Model of Values-Based Management Process in Schools: A Mixed Design Study
ERIC Educational Resources Information Center
Dogan, Soner
2016-01-01
The aim of this paper is to evaluate school administrators' values-based management behaviours according to teachers' perceptions and opinions and, accordingly, to build a model of the values-based management process in schools. The study was conducted using an explanatory design that includes both quantitative and qualitative methods.…
A Cochlear Implant Signal Processing Lab: Exploration of a Problem-Based Learning Exercise
ERIC Educational Resources Information Center
Bhatti, P. T.; McClellan, J. H.
2011-01-01
This paper presents an introductory signal processing laboratory and examines this laboratory exercise in the context of problem-based learning (PBL). Centered in a real-world application, a cochlear implant, the exercise challenged students to demonstrate a working software-based signal processor. Partnering in groups of two or three, second-year…
Effect of Inquiry-Based Learning Approach on Student Resistance in a Science and Technology Course
ERIC Educational Resources Information Center
Sever, Demet; Guven, Meral
2014-01-01
The aim of this study was to identify the resistance behaviors of 7th grade students exhibited during their Science and Technology course teaching-learning processes, and to remove the identified resistance behaviors through teaching-learning processes that were constructed based on the inquiry-based learning approach. In the quasi-experimentally…
ERIC Educational Resources Information Center
Koka, Andre
2017-01-01
This study examined the effectiveness of a brief theory-based intervention on muscular strength among adolescents in a physical education setting. The intervention adopted a process-based mental simulation technique. The self-reported frequency of practising for and actual levels of abdominal muscular strength/endurance as one component of…
Designing a Web-Based Science Learning Environment for Model-Based Collaborative Inquiry
ERIC Educational Resources Information Center
Sun, Daner; Looi, Chee-Kit
2013-01-01
The paper traces a research process in the design and development of a science learning environment called WiMVT (web-based inquirer with modeling and visualization technology). The WiMVT system is designed to help secondary school students build a sophisticated understanding of scientific conceptions, and the science inquiry process, as well as…
ERIC Educational Resources Information Center
Stephens, Pamela Geiger
2006-01-01
Community-based learning has the power to encourage and sustain the intellectual curiosity of learners. By most accounts, community-based learning is a process that creates a collaborative environment of scholarship that holds individual differences, as well as similarities, in high esteem. It is a process, as the phrase suggests, that extends…
ERIC Educational Resources Information Center
Cohen, Edward Charles
2013-01-01
Design-based research was utilized to investigate how students use a greenhouse effect simulation in order to derive best learning practices. During this process, students recognized the authentic scientific process involving computer simulations. The simulation used is embedded within an inquiry-based technology-mediated science curriculum known…
Image Understanding Architecture
1991-09-01
architecture to support real-time, knowledge-based image understanding, and develop the software support environment that will be needed to utilize... Keywords: Image Understanding Architecture, Knowledge-Based Vision, AI, Real-Time Computer Vision, Software Simulator, Parallel Processor. ... In addition to sensory and knowledge-based processing it is useful to introduce a level of symbolic processing. Thus, vision researchers
ERIC Educational Resources Information Center
Bertin, Evelin; Bhatt, Ramesh S.
2001-01-01
Examined three possible explanations for findings that infants detect textural discrepancies based on individual features more readily than on feature conjunctions. Found that none of the proposed factors could explain 5.5-month-olds' superior processing of featural over conjunction-based textural discrepancies. Findings suggest that in infancy,…
Mutic, Sasa; Brame, R Scott; Oddiraju, Swetha; Parikh, Parag; Westfall, Melisa A; Hopkins, Merilee L; Medina, Angel D; Danieley, Jonathan C; Michalski, Jeff M; El Naqa, Issam M; Low, Daniel A; Wu, Bin
2010-09-01
The value of near-miss and error reporting processes in many industries is well appreciated and typically can be supported with data that have been collected over time. While it is generally accepted that such processes are important in the radiation therapy (RT) setting, studies analyzing the effects of organized reporting and process improvement systems on operation and patient safety in individual clinics remain scarce. The purpose of this work is to report on the design and long-term use of an electronic reporting system in an RT department and compare it to the paper-based reporting system it replaced. A purpose-designed web-based system for reporting individual events in RT was clinically implemented in 2007. An event was defined as any occurrence that could have resulted, or had resulted, in a deviation in the delivery of patient care. The aim of the system was to support process improvement in patient care and safety. The reporting tool was designed so individual events could be quickly and easily reported without disrupting clinical work; this was very important because system use was voluntary. The spectrum of reported deviations extended from minor workflow issues (e.g., scheduling) to errors in treatment delivery. Reports were categorized based on the functional area, type, and severity of an event. The events were processed and analyzed by a formal process improvement group that used the data and statistics collected through the web-based tool for guidance in reengineering clinical processes. The reporting trends for the first 24 months with the electronic system were compared to the events reported in the same clinic with a paper-based system over a seven-year period. The reporting system and the process improvement structure resulted in increased event reporting, improved event communication, and improved identification of clinical areas that needed process and safety improvements. The reported data were also useful for the evaluation of corrective measures and the recognition of ineffective measures and efforts. The electronic system was relatively well accepted by personnel and resulted in minimal disruption of clinical work. Even in the quarters with the fewest reported events, voluntary reporting was almost four times greater than the largest number of events reported in any one quarter with the paper-based system, and it remained consistent from the inception of the process through the date of this report. However, the acceptance was not universal, validating the need for improved education regarding reporting processes and systematic approaches to developing a reporting culture. Specially designed electronic event reporting systems in a radiotherapy setting can provide valuable data for process and patient safety improvement and are more effective reporting mechanisms than paper-based systems. Additional work is needed to develop methods that can more effectively utilize reported data for process improvement, including the development of a standardized event taxonomy and a classification system for RT.
NASA Astrophysics Data System (ADS)
Timchenko, E. V.; Timchenko, P. E.; Pisareva, E. V.; Vlasov, M. Yu; Revin, V. V.; Klenova, N. A.; Asadova, A. A.
2017-01-01
In this article we present research results on the influence of the lyophilization process on the composition of hybrid materials based on bacterial cellulose (BC), using the Raman spectroscopy method. The objects of research were BC and hybrids based on it, comprising various combinations of hydroxyapatite (HAP) and collagen. Our studies showed that the lyophilization process changes the ratio of the individual components. It was found that for hybrid samples based on BC with added HAP, the intensity of the PO4(3-) peak in the region of 956 cm-1 increases while its width decreases, which indicates a change in the degree of HAP crystallinity.
The Knowledge-Based Software Assistant: Beyond CASE
NASA Technical Reports Server (NTRS)
Carozzoni, Joseph A.
1993-01-01
This paper will outline the similarities and differences between two paradigms of software development. Both support the whole software life cycle and provide automation for most of the software development process, but have different approaches. The CASE approach is based on a set of tools linked by a central data repository. This tool-based approach is data driven and views software development as a series of sequential steps, each resulting in a product. The Knowledge-Based Software Assistant (KBSA) approach, a radical departure from existing software development practices, is knowledge driven and centers around a formalized software development process. KBSA views software development as an incremental, iterative, and evolutionary process with development occurring at the specification level.
Teasing apart retrieval and encoding interference in the processing of anaphors
Jäger, Lena A.; Benz, Lena; Roeser, Jens; Dillon, Brian W.; Vasishth, Shravan
2015-01-01
Two classes of account have been proposed to explain the memory processes subserving the processing of reflexive-antecedent dependencies. Structure-based accounts assume that the retrieval of the antecedent is guided by syntactic tree-configurational information without considering other kinds of information such as gender marking in the case of English reflexives. By contrast, unconstrained cue-based retrieval assumes that all available information is used for retrieving the antecedent. Similarity-based interference effects from structurally illicit distractors which match a non-structural retrieval cue have been interpreted as evidence favoring the unconstrained cue-based retrieval account since cue-based retrieval interference from structurally illicit distractors is incompatible with the structure-based account. However, it has been argued that the observed effects do not necessarily reflect interference occurring at the moment of retrieval but might equally well be accounted for by interference occurring already at the stage of encoding or maintaining the antecedent in memory, in which case they cannot be taken as evidence against the structure-based account. We present three experiments (self-paced reading and eye-tracking) on German reflexives and Swedish reflexive and pronominal possessives in which we pit the predictions of encoding interference and cue-based retrieval interference against each other. We could not find any indication that encoding interference affects the processing ease of the reflexive-antecedent dependency formation. Thus, there is no evidence that encoding interference might be the explanation for the interference effects observed in previous work. We therefore conclude that invoking encoding interference may not be a plausible way to reconcile interference effects with a structure-based account of reflexive processing. PMID:26106337
Li, Kangkang; Yu, Hai; Yan, Shuiping; Feron, Paul; Wardhaugh, Leigh; Tade, Moses
2016-10-04
Using a rigorous, rate-based model and a validated economic model, we investigated the technoeconomic performance of an aqueous NH3-based CO2 capture process integrated with a 650-MW coal-fired power station. First, the baseline NH3 process was explored with a process design that captures CO2 and SO2 simultaneously, replacing the conventional FGD unit. This reduced the capital investment of the power station by US$425/kW (a 13.1% reduction). Integration of this NH3 baseline process with the power station gives it a CO2-avoided cost advantage over the MEA process (US$67.3/tonne vs US$86.4/tonne). We then investigated process modifications of a two-stage absorption, rich-split configuration and interheating stripping to further advance the NH3 process. The modified process reduced energy consumption by 31.7 MW/h (a 20.2% reduction) and capital costs by US$55.4 million (a 6.7% reduction). As a result, the CO2-avoided cost fell to US$53.2/tonne: a savings of US$14.1 and US$21.9/tonne CO2 compared with the NH3 baseline and the advanced MEA process, respectively. The analysis of the energy breakdown and cost distribution indicates that the technoeconomic performance of the NH3 process still has great potential for improvement.
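The CO2-avoided cost quoted above is conventionally defined as follows; this is the standard definition used in the capture literature, not a formula restated in the abstract, with COE the cost of electricity and e the emission intensity.

```latex
% COE in $/MWh, e in tCO2/MWh, result in $/tCO2 avoided.
\mathrm{Cost\ of\ CO_2\ avoided}
  \;=\; \frac{COE_{\mathrm{capture}} - COE_{\mathrm{ref}}}
             {e_{\mathrm{ref}} - e_{\mathrm{capture}}}
```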
A Search Algorithm for Generating Alternative Process Plans in Flexible Manufacturing System
NASA Astrophysics Data System (ADS)
Tehrani, Hossein; Sugimura, Nobuhiro; Tanimizu, Yoshitaka; Iwamura, Koji
The capabilities and complexity of manufacturing systems are increasing as they strive toward an integrated manufacturing environment. The availability of alternative process plans is a key factor for the integration of design, process planning and scheduling. This paper describes an algorithm for the generation of alternative process plans by extending the existing framework of process plan networks. A class diagram is introduced for generating process plans and process plan networks from the viewpoint of integrated process planning and scheduling systems. An incomplete search algorithm is developed for generating and searching the process plan networks. The benefit of this algorithm is that the whole process plan network does not have to be generated before the search starts. The algorithm is therefore applicable to very large process plan networks, can search wide areas of the network based on user requirements, and can generate alternative process plans and select a suitable one based on the objective functions.
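One plausible reading of an incomplete search that never materialises the whole network is a pruned best-first search over lazily expanded plan nodes. The sketch below is illustrative, not the paper's algorithm; the expand/cost/goal callbacks are assumptions standing in for the class-diagram-based network generation.

```python
import heapq

def best_first_plan_search(start, is_goal, expand, cost, beam=50):
    """Pruned best-first search over a lazily expanded process plan
    network. Nodes are (partial) plans; expand(plan) yields successor
    plans; cost(plan) scores them. Keeping only the `beam` cheapest
    frontier nodes makes the search incomplete but memory-bounded,
    so the whole network is never generated up front."""
    tie = 0  # tie-breaker so the heap never compares plan objects
    frontier = [(cost(start), tie, start)]
    while frontier:
        _, _, plan = heapq.heappop(frontier)
        if is_goal(plan):
            return plan
        for successor in expand(plan):
            tie += 1
            heapq.heappush(frontier, (cost(successor), tie, successor))
        if len(frontier) > beam:  # prune the worst candidates
            frontier = heapq.nsmallest(beam, frontier)
            heapq.heapify(frontier)
    return None
```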
Laser Additive Manufacturing of Magnetic Materials
NASA Astrophysics Data System (ADS)
Mikler, C. V.; Chaudhary, V.; Borkar, T.; Soni, V.; Jaeger, D.; Chen, X.; Contieri, R.; Ramanujan, R. V.; Banerjee, R.
2017-03-01
While laser additive manufacturing is becoming increasingly important in the context of next-generation manufacturing technologies, most current research efforts focus on optimizing process parameters for the processing of mature alloys for structural applications (primarily stainless steels, titanium-base, and nickel-base alloys) from pre-alloyed powder feedstocks to achieve properties superior to conventionally processed counterparts. However, laser additive manufacturing or processing can also be applied to functional materials. This article focuses on the use of directed energy deposition-based additive manufacturing technologies, such as the laser engineered net shaping (LENS™) process, to deposit magnetic alloys. Three case studies are presented: Fe-30 at.%Ni, permalloys of the type Ni-Fe-V and Ni-Fe-Mo, and Fe-Si-B-Cu-Nb (derived from Finemet) alloys. All these alloys have been processed from a blend of elemental powders used as the feedstock, and their resultant microstructures, phase formation, and magnetic properties are discussed in this paper. Although these alloys were produced from a blend of elemental powders, they exhibited relatively uniform microstructures and magnetic properties comparable to those of their conventionally processed counterparts.
Clinical process analysis and activity-based costing at a heart center.
Ridderstolpe, Lisa; Johansson, Andreas; Skau, Tommy; Rutberg, Hans; Ahlfeldt, Hans
2002-08-01
Cost studies; productivity, efficiency, and quality-of-care measures; and the links between resources and patient outcomes are fundamental issues for hospital management today. This paper describes the implementation of a model for process analysis and activity-based costing (ABC)/management at a Heart Center in Sweden as a tool for administrative cost information, strategic decision-making, quality improvement, and cost reduction. A commercial software package (QPR) containing two interrelated parts, "ProcessGuide" and "CostControl," was used. All processes at the Heart Center were mapped and graphically outlined. Processes and activities such as health care procedures, research, and education were identified together with their causal relationship to costs and products/services. The construction of the ABC model in CostControl was time-consuming. However, after the ABC/management system was created, it opened the way for new possibilities, including process and activity analysis, simulation, and price calculations. Cost analysis showed large variations in the costs obtained for individual patients undergoing coronary artery bypass grafting (CABG) surgery. We conclude that a process-based costing system is applicable and has the potential to be useful in hospital management.
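The ABC mechanics behind such a cost analysis can be sketched in a few lines: activity costs are divided by driver volumes to get unit rates, and each patient's cost is the sum of consumed driver units times those rates. All figures and activity names below are invented for illustration; they are not Heart Center data.

```python
# Toy activity-based costing allocation with illustrative activities
# and cost drivers (hypothetical numbers, not from the paper).
activities = {            # activity -> (total cost, total driver volume)
    "pre_op_assessment": (120_000.0, 600),    # driver: assessments
    "or_time":           (900_000.0, 1_800),  # driver: OR hours
    "icu_care":          (400_000.0, 1_000),  # driver: ICU hours
}
rates = {a: cost / volume for a, (cost, volume) in activities.items()}

def patient_cost(consumption):
    # Cost assigned to one patient = sum over activities of
    # (driver units consumed) x (cost per driver unit).
    return sum(rates[a] * units for a, units in consumption.items())

cabg_patient = {"pre_op_assessment": 1, "or_time": 4.5, "icu_care": 20}
print(f"${patient_cost(cabg_patient):,.0f}")
```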
New Vistas in Chemical Product and Process Design.
Zhang, Lei; Babi, Deenesh K; Gani, Rafiqul
2016-06-07
Design of chemicals-based products is broadly classified into those that are process centered and those that are product centered. In this article, the designs of both classes of products are reviewed from a process systems point of view; developments related to the design of the chemical product, its corresponding process, and its integration are highlighted. Although significant advances have been made in the development of systematic model-based techniques for process design (also for optimization, operation, and control), much work is needed to reach the same level for product design. Timeline diagrams illustrating key contributions in product design, process design, and integrated product-process design are presented. The search for novel, innovative, and sustainable solutions must be matched by consideration of issues related to the multidisciplinary nature of problems, the lack of data needed for model development, solution strategies that incorporate multiscale options, and reliability versus predictive power. The need for an integrated model-experiment-based design approach is discussed together with benefits of employing a systematic computer-aided framework with built-in design templates.
NASA Astrophysics Data System (ADS)
Wang, Xiaodong; Zhang, Xiaoyu; Cai, Hongming; Xu, Boyi
Enacting a supply-chain process involves various partners and different IT systems. REST is receiving increasing attention for distributed systems with loosely coupled resources. Nevertheless, resource model incompatibilities and conflicts prevent effective process modeling and deployment in resource-centric Web service environments. In this paper, a Petri-net based framework for supply-chain process integration is proposed. A resource meta-model is constructed to represent the basic information of resources. Then, based on the resource meta-model, XML schemas and documents are derived, which represent resources and their states in the Petri net. Thereafter, XML-net, a high-level Petri net, is employed for modeling the control and data flow of the process. From the process model in XML-net, RESTful services and choreography descriptions are deduced. Unified resource representation and RESTful service descriptions are thereby proposed for more effective cross-system integration. A case study is given to illustrate the approach, and its desirable features are discussed.
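A minimal marking/firing sketch of the Petri-net core underlying such a framework follows. Place and transition names are invented for illustration; XML-net, as described in the paper, additionally attaches XML documents to tokens and conditions transitions on their content.

```python
# Petri-net basics: a marking maps places to token counts; a transition
# is enabled when its input places hold enough tokens, and firing it
# consumes input tokens and produces output tokens.
marking = {"order_received": 1, "stock_checked": 0, "shipped": 0}

transitions = {
    "check_stock": ({"order_received": 1}, {"stock_checked": 1}),
    "ship":        ({"stock_checked": 1},  {"shipped": 1}),
}

def enabled(t):
    pre, _ = transitions[t]
    return all(marking[p] >= w for p, w in pre.items())

def fire(t):
    # Consume tokens from input places, produce tokens in output places.
    pre, post = transitions[t]
    assert enabled(t), f"{t} is not enabled"
    for p, w in pre.items():
        marking[p] -= w
    for p, w in post.items():
        marking[p] += w

fire("check_stock")
fire("ship")
print(marking)  # {'order_received': 0, 'stock_checked': 0, 'shipped': 1}
```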
Hardware design and implementation of fast DOA estimation method based on multicore DSP
NASA Astrophysics Data System (ADS)
Guo, Rui; Zhao, Yingxiao; Zhang, Yue; Lin, Qianqiang; Chen, Zengping
2016-10-01
In this paper, we present a high-speed real-time signal processing hardware platform based on a multicore digital signal processor (DSP). The platform shows several excellent characteristics, including high-performance computing, low power consumption, large-capacity data storage and high-speed data transmission, which enable it to meet the constraints of real-time direction of arrival (DOA) estimation. To reduce the high computational complexity of the DOA estimation algorithm, a novel real-valued MUSIC estimator is used. The algorithm is decomposed into several independent steps and the time consumption of each step is counted. Based on these timing statistics, we present a new parallel processing strategy that distributes the task of DOA estimation across the cores of the platform. Experimental results demonstrate that the high processing capability of the platform meets the constraints of real-time DOA estimation.
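For orientation, the standard complex-valued MUSIC estimator that the paper's real-valued variant accelerates can be sketched as follows. This is the textbook algorithm under a uniform-linear-array assumption, not the paper's optimized DSP implementation; the real-valued variant reduces the cost of these steps by working with real matrices.

```python
import numpy as np

def music_spectrum(X, n_sources, d=0.5, grid=np.linspace(-90, 90, 361)):
    """Classic MUSIC pseudospectrum for a uniform linear array.
    X: (n_antennas, n_snapshots) complex snapshot matrix,
    d: element spacing in wavelengths. Peaks mark estimated DOAs."""
    m = X.shape[0]
    R = X @ X.conj().T / X.shape[1]      # sample covariance matrix
    w, V = np.linalg.eigh(R)             # eigenvalues in ascending order
    En = V[:, : m - n_sources]           # noise-subspace eigenvectors
    spectrum = []
    for theta in np.deg2rad(grid):
        # Steering vector for arrival angle theta.
        a = np.exp(-2j * np.pi * d * np.arange(m) * np.sin(theta))
        # Pseudospectrum: reciprocal of projection onto noise subspace.
        spectrum.append(1.0 / (np.linalg.norm(En.conj().T @ a) ** 2))
    return grid, np.array(spectrum)
```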
Warnke, Tom; Reinhardt, Oliver; Klabunde, Anna; Willekens, Frans; Uhrmacher, Adelinde M
2017-10-01
Individuals' decision processes play a central role in understanding modern migration phenomena and other demographic processes. Their integration into agent-based computational demography depends largely on suitable support by a modelling language. We are developing the Modelling Language for Linked Lives (ML3) to describe the diverse decision processes of linked lives succinctly in continuous time. The context of individuals is modelled by networks the individual is part of, such as family ties and other social networks. Central concepts, such as behaviour conditional on agent attributes, age-dependent behaviour, and stochastic waiting times, are tightly integrated in the language. Thereby, alternative decisions are modelled by concurrent processes that compete by stochastic race. Using a migration model, we demonstrate how this allows for compact description of complex decisions, here based on the Theory of Planned Behaviour. We describe the challenges for the simulation algorithm posed by stochastic race between multiple concurrent complex decisions.
Virtual tryout planning in automotive industry based on simulation metamodels
NASA Astrophysics Data System (ADS)
Harsch, D.; Heingärtner, J.; Hortig, D.; Hora, P.
2016-11-01
Deep-drawn sheet metal parts are increasingly designed to the feasibility limit, so achieving robust manufacturing is often challenging. Fluctuations of process and material properties often lead to robustness problems. Therefore, numerical simulations are used to detect the critical regions. To enhance agreement with the real process conditions, the material data are acquired through a variety of experiments, and the force distribution is taken into account. The simulation metamodel contains the virtual knowledge of a particular forming process, determined from a series of finite element simulations with variable input parameters. Based on the metamodels, virtual process windows can be displayed for different configurations. This helps to improve the operating point as well as to adjust process settings in case the process becomes unstable. Furthermore, tool tryout time can be shortened by transferring the virtual knowledge contained in the metamodels to the optimisation of the drawbeads. This allows the tool manufacturer to focus on the essentials, to save time and to recognize complex relationships.
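A simulation metamodel of this kind can be as simple as a quadratic response surface fitted over a set of finite element runs and then queried as a cheap surrogate. The sketch below uses invented input/output names (blankholder force and friction mapping to thinning) and synthetic data purely for illustration; it is not the paper's metamodel.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative "FE runs": inputs x = (blankholder force, friction),
# output y = max thinning; replace with real simulation results.
X = rng.uniform([200.0, 0.05], [600.0, 0.15], size=(30, 2))
y = 0.08 + 1e-4 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.002, 30)

def design(X):
    # Full quadratic basis in two variables: 1, x1, x2, x1^2, x1*x2, x2^2.
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x1 * x2, x2**2])

coef, *_ = np.linalg.lstsq(design(X), y, rcond=None)

def metamodel(X_new):
    # Cheap surrogate: evaluate the fitted response surface instead of
    # rerunning the finite element simulation.
    return design(np.atleast_2d(X_new)) @ coef

print(metamodel(np.array([[400.0, 0.10]])))  # predicted thinning
```

Scanning such a surrogate over a grid of input combinations is what produces the virtual process windows mentioned above.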
Yue, Yonghai; Yuchi, Datong; Guan, Pengfei; Xu, Jia; Guo, Lin; Liu, Jingyue
2016-01-01
To probe the nature of metal-catalysed processes and to design better metal-based catalysts, atomic scale understanding of catalytic processes is highly desirable. Here we use aberration-corrected environmental transmission electron microscopy to investigate the atomic scale processes of silver-based nanoparticles, which catalyse the oxidation of multi-wall carbon nanotubes. A direct semi-quantitative estimate of the oxidized carbon atoms by silver-based nanoparticles is achieved. A mechanism similar to the Mars–van Krevelen process is invoked to explain the catalytic oxidation process. Theoretical calculations, together with the experimental data, suggest that the oxygen molecules dissociate on the surface of silver nanoparticles and diffuse through the silver nanoparticles to reach the silver/carbon interfaces and subsequently oxidize the carbon. The lattice distortion caused by oxygen concentration gradient within the silver nanoparticles provides the direct evidence for oxygen diffusion. Such direct observation of atomic scale dynamics provides an important general methodology for investigations of catalytic processes. PMID:27406595
Syntactic processing in the absence of awareness and semantics.
Hung, Shao-Min; Hsieh, Po-Jang
2015-10-01
The classical view that multistep rule-based operations require consciousness has recently been challenged by findings that both multiword semantic processing and multistep arithmetic equations can be processed unconsciously. It remains unclear, however, whether pure rule-based cognitive processes can occur unconsciously in the absence of semantics. Here, after presenting 2 words consciously, we suppressed the third with continuous flash suppression. First, we showed that the third word in the subject-verb-verb format (syntactically incongruent) broke suppression significantly faster than the third word in the subject-verb-object format (syntactically congruent). Crucially, the same effect was observed even with sentences composed of pseudowords (pseudo subject-verb-adjective vs. pseudo subject-verb-object) without any semantic information. This is the first study to show that syntactic congruency can be processed unconsciously in the complete absence of semantics. Our findings illustrate how abstract rule-based processing (e.g., syntactic categories) can occur in the absence of visual awareness, even when deprived of semantics. (c) 2015 APA, all rights reserved.
Shi, Haiqiang; Fatehi, Pedram; Xiao, Huining; Ni, Yonghao
2011-04-01
The presence of lignin impairs the utilization of the hemicelluloses dissolved in the pre-hydrolysis liquor (PHL) of the kraft-based dissolving pulp production process. In this paper, a novel process was developed by combining acidification and polyethylene oxide (PEO) flocculation concepts to improve lignin removal. The results showed that lignin removal was improved by the addition of PEO to the acidified PHL, particularly at a low pH of 1.5. The main mechanisms involved are lignin/PEO complex formation and the bridging of the formed complexes. This hypothesis was supported by turbidity, FTIR and particle size measurements. Interestingly, the hemicelluloses removal from the acidification/PEO flocculation was marginal, which would be beneficial for the downstream ethanol production from the PHL. Additionally, a process flow diagram was proposed that incorporates this new concept into the existing configuration of the kraft-based dissolving pulp production process. Copyright © 2011 Elsevier Ltd. All rights reserved.
Three-dimensional motor schema based navigation
NASA Technical Reports Server (NTRS)
Arkin, Ronald C.
1989-01-01
Reactive schema-based navigation is possible in space domains by extending the methods developed for ground-based navigation within the Autonomous Robot Architecture (AuRA). Reformulation of two-dimensional motor schemas for three-dimensional applications is a straightforward process. The manifold advantages of schema-based control persist, including modular development, amenability to distributed processing, and responsiveness to environmental sensing. Simulation results show the feasibility of this methodology for space docking operations in a cluttered work area.
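Schema-based control of this kind amounts to summing the output vectors of simple concurrent behaviours. A minimal 3D sketch follows, with assumed gains and field shapes chosen for illustration; it is not AuRA's actual formulation.

```python
import numpy as np

def move_to_goal(pos, goal, gain=1.0):
    # Attractive schema: unit vector toward the goal, scaled by a gain.
    v = goal - pos
    n = np.linalg.norm(v)
    return gain * v / n if n > 1e-9 else np.zeros(3)

def avoid_obstacle(pos, obs, sphere=2.0, gain=1.0):
    # Repulsive schema: pushes away from the obstacle when inside its
    # sphere of influence, and contributes nothing outside it.
    v = pos - obs
    d = np.linalg.norm(v)
    if d >= sphere or d < 1e-9:
        return np.zeros(3)
    return gain * (sphere - d) / sphere * v / d

# Schema outputs are simply summed to give the commanded velocity.
pos = np.array([0.0, 0.0, 0.0])
goal = np.array([5.0, 1.0, 3.0])
obs = np.array([1.0, 0.2, 0.8])
velocity = move_to_goal(pos, goal) + avoid_obstacle(pos, obs)
print(velocity)
```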
NASA Astrophysics Data System (ADS)
Takei, Satoshi; Maki, Hirotaka; Sugahara, Kigen; Ito, Kenta; Hanabata, Makoto
2015-07-01
An electron beam (EB) lithography method using inedible cellulose-based resist material derived from woody biomass has been successfully developed. This method allows the use of pure water in the development process instead of the conventionally used tetramethylammonium hydroxide and anisole. The inedible cellulose-based biomass resist material, as an alternative to alpha-linked disaccharides in sugar derivatives that compete with food supplies, was developed by replacing the hydroxyl groups in the beta-linked disaccharides with EB-sensitive 2-methacryloyloxyethyl groups. A 75 nm line and space pattern at an exposure dose of 19 μC/cm2, a resist thickness uniformity of less than 0.4 nm on a 200 mm wafer, and low film thickness shrinkage under EB irradiation were achieved with this inedible cellulose-based biomass resist material using a water-based development process.
The syntactic complexity of Russian relative clauses
Fedorenko, Evelina; Gibson, Edward
2012-01-01
Although syntactic complexity has been investigated across dozens of studies, the available data still greatly underdetermine relevant theories of processing difficulty. Memory-based and expectation-based theories make opposite predictions regarding fine-grained time course of processing difficulty in syntactically constrained contexts, and each class of theory receives support from results on some constructions in some languages. Here we report four self-paced reading experiments on the online comprehension of Russian relative clauses together with related corpus studies, taking advantage of Russian’s flexible word order to disentangle predictions of competing theories. We find support for key predictions of memory-based theories in reading times at RC verbs, and for key predictions of expectation-based theories in processing difficulty at RC-initial accusative noun phrase (NP) objects, which corpus data suggest should be highly unexpected. These results suggest that a complete theory of syntactic complexity must integrate insights from both expectation-based and memory-based theories. PMID:24711687
PCI-based WILDFIRE reconfigurable computing engines
NASA Astrophysics Data System (ADS)
Fross, Bradley K.; Donaldson, Robert L.; Palmer, Douglas J.
1996-10-01
WILDFORCE is the first PCI-based custom reconfigurable computer that is based on the Splash 2 technology transferred from the National Security Agency and the Institute for Defense Analyses, Supercomputing Research Center (SRC). The WILDFORCE architecture has many of the features of the WILDFIRE computer, such as field- programmable gate array (FPGA) based processing elements, linear array and crossbar interconnection, and high- performance memory and I/O subsystems. New features introduced in the PCI-based WILDFIRE systems include memory/processor options that can be added to any processing element. These options include static and dynamic memory, digital signal processors (DSPs), FPGAs, and microprocessors. In addition to memory/processor options, many different application specific connectors can be used to extend the I/O capabilities of the system, including systolic I/O, camera input and video display output. This paper also discusses how this new PCI-based reconfigurable computing engine is used for rapid-prototyping, real-time video processing and other DSP applications.
RTOS kernel in portable electrocardiograph
NASA Astrophysics Data System (ADS)
Centeno, C. A.; Voos, J. A.; Riva, G. G.; Zerbini, C.; Gonzalez, E. A.
2011-12-01
This paper presents the use of a Real Time Operating System (RTOS) on a portable electrocardiograph based on a microcontroller platform. All digital functions of the medical device are performed by the microcontroller. The electrocardiograph CPU is based on the 18F4550 microcontroller, in which a uCOS-II RTOS can be embedded. The decision to use the kernel is based on its benefits, its license for educational use, and its intrinsic time control and peripherals management. The feasibility of its use on the electrocardiograph is evaluated based on the minimum memory requirements imposed by the kernel structure. The kernel's own tools were used for time estimation and evaluation of the resources used by each process. After this feasibility analysis, the cyclic code was migrated to a structure based on separate processes, or tasks, able to synchronize on events, resulting in an electrocardiograph running on a single Central Processing Unit (CPU) under the RTOS.
Image-Based Grouping during Binocular Rivalry Is Dictated by Eye-Of-Origin
Stuit, Sjoerd M.; Paffen, Chris L. E.; van der Smagt, Maarten J.; Verstraten, Frans A. J.
2014-01-01
Prolonged viewing of dichoptically presented images with different content results in perceptual alternations known as binocular rivalry. This phenomenon is thought to be the result of competition at a local level, where local rivalry zones interact to give rise to a single, global dominant percept. Certain perceived combinations that result from this local competition are known to last longer than others, which is referred to as grouping during binocular rivalry. In recent years, the phenomenon has been suggested to be the result of competition at both eye- and image-based processing levels, although the exact contribution from each level remains elusive. Here we use a paradigm designed specifically to quantify the contribution of eye- and image-based processing to grouping during rivalry. In this paradigm we used sine-wave gratings as well as upright and inverted faces, with and without binocular disparity-based occlusion. These stimuli and conditions were used because they are known to result in processing at different stages throughout the visual processing hierarchy. Specifically, more complex images were included in order to maximize the potential contribution of image-based grouping. In spite of this, our results show that increasing image complexity did not lead to an increase in the contribution of image-based processing to grouping during rivalry. In fact, the results show that grouping was primarily affected by the eye-of-origin of the image parts, irrespective of stimulus type. We suggest that image content affects grouping during binocular rivalry at low-level processing stages, where it is intertwined with eye-of-origin information. PMID:24987847
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simpson, L.; Britt, J.; Birkmire, R.
ITN Energy Systems, Inc., and Global Solar Energy, Inc., assisted by NREL's PV Manufacturing R&D program, have continued to advance CIGS production technology by developing trajectory-oriented predictive/control models, fault-tolerance control, control platform development, in-situ sensors, and process improvements. Modeling activities included developing physics-based and empirical models for CIGS and sputter-deposition processing, implementing model-based control, and applying predictive models to the construction of new evaporation sources and for control. Model-based control is enabled by implementing reduced or empirical models into a control platform. Reliability improvement activities include implementing preventive maintenance schedules; detecting failed sensors/equipment and reconfiguring to continue processing; and systematic development of fault prevention and reconfiguration strategies for the full range of CIGS PV production deposition processes. In-situ sensor development activities have resulted in improved control and indicated the potential for enhanced process status monitoring and control of the deposition processes. Substantial process improvements have been made, including significant improvement in CIGS uniformity, thickness control, efficiency, yield, and throughput. In large measure, these gains have been driven by process optimization, which in turn has been enabled by control and reliability improvements due to this PV Manufacturing R&D program.
Risk-based process safety assessment and control measures design for offshore process facilities.
Khan, Faisal I; Sadiq, Rehan; Husain, Tahir
2002-09-02
Process operation is the most hazardous activity next to transportation and drilling operations on an offshore oil and gas (OOG) platform. Past experience of onshore and offshore oil and gas activities has revealed that a small mishap in process operation might escalate into a catastrophe. This is of special concern on an OOG platform due to the limited space and compact geometry of the process area, limited ventilation, and difficult escape routes. On an OOG platform, each extra control measure that is implemented not only occupies space on the platform and increases congestion but also adds extra load to the platform. Eventualities in OOG platform process operation can be avoided by incorporating the appropriate control measures at the early design stage. In this paper, the authors describe a methodology for risk-based process safety decision making for OOG activities. The methodology is applied to various offshore process units, that is, the compressor, separators, flash drum and driers of an OOG platform. Based on the risk potential, appropriate safety measures are designed for each unit. The paper also illustrates that implementation of the designed safety measures reduces high fatal accident rate (FAR) values to an acceptable level.
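For reference, the FAR used as the risk criterion here is conventionally defined per 10^8 hours of exposure; this is the standard process-safety definition, not a formula restated in the paper.

```latex
\mathrm{FAR} \;=\; \frac{\text{number of fatalities}}
                        {\text{total hours of exposure}} \times 10^{8}
```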
NASA Astrophysics Data System (ADS)
Wichmann, Volker
2017-09-01
The Gravitational Process Path (GPP) model can be used to simulate the process path and run-out area of gravitational processes based on a digital terrain model (DTM). The conceptual model combines several components (process path, run-out length, sink filling and material deposition) to simulate the movement of a mass point from an initiation site to the deposition area. For each component several modeling approaches are provided, which makes the tool configurable for different processes such as rockfall, debris flows or snow avalanches. The tool can be applied to regional-scale studies such as natural hazard susceptibility mapping but also contains components for scenario-based modeling of single events. Both the modeling approaches and precursor implementations of the tool have proven their applicability in numerous studies, also including geomorphological research questions such as the delineation of sediment cascades or the study of process connectivity. This is the first open-source implementation, completely re-written, extended and improved in many ways. The tool has been committed to the main repository of the System for Automated Geoscientific Analyses (SAGA) and thus will be available with every SAGA release.
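The process-path component can be pictured as a descent over the DTM grid. The toy sketch below simply follows the steepest downslope neighbour until a sink; it is a deliberate simplification, since the actual GPP model layers configurable run-out, persistence and deposition components on top of the path computation.

```python
import numpy as np

def steepest_descent_path(dtm, start, max_steps=10000):
    # Very simplified process-path component: from the initiation cell,
    # repeatedly move to the lowest cell in the 3x3 neighbourhood until
    # the current cell is itself the local minimum (a sink).
    path = [start]
    r, c = start
    for _ in range(max_steps):
        r0, c0 = max(r - 1, 0), max(c - 1, 0)
        window = dtm[r0:r + 2, c0:c + 2]
        dr, dc = np.unravel_index(np.argmin(window), window.shape)
        nr, nc = r0 + dr, c0 + dc
        if (nr, nc) == (r, c):   # local sink: stop
            break
        r, c = nr, nc
        path.append((r, c))
    return path

dtm = np.array([[5., 4., 3.],
                [4., 3., 2.],
                [3., 2., 1.]])
print(steepest_descent_path(dtm, (0, 0)))  # runs down to the corner sink
```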
Metacognitive Analysis of Pre-Service Teachers of Chemistry in Posting Questions
NASA Astrophysics Data System (ADS)
Santoso, T.; Yuanita, L.
2017-04-01
Questions addressed to something can induce metacognitive function to monitor a person’s thinking process. This study aims to describe the structure of the level of student questions based on thinking level and chemistry understanding level and describe how students use their metacognitive knowledge in asking. This research is a case study in chemistry learning, followed by 87 students. Results of the analysis revealed that the structure of thinking level of student question consists of knowledge question, understanding and application question, and high thinking question; the structure of chemistry understanding levels of student questions are a symbol, macro, macro-micro, macro-process, micro-process, and the macro-micro-process. The level Questioning skill of students to scientific articles more qualified than the level questioning skills of students to the teaching materials. The analysis result of six student interviews, a student question demonstrate the metacognitive processes with categories: (1) low-level metacognitive process, which is compiled based on questions focusing on a particular phrase or change the words; (2) intermediate level metacognitive process, submission of questions requires knowledge and understanding, and (3) high-level metacognitive process, the student questions posed based on identifying the central topic or abstraction essence of scientific articles.
NASA Astrophysics Data System (ADS)
Shen, Yan; Ge, Jin-ming; Zhang, Guo-qing; Yu, Wen-bin; Liu, Rui-tong; Fan, Wei; Yang, Ying-xuan
2018-01-01
This paper explores the problem of signal processing in optical current transformers (OCTs). Based on the noise characteristics of OCTs, such as overlapping signals, noise frequency bands, low signal-to-noise ratios, and difficulties in acquiring the statistical features of noise power, an improved standard Kalman filtering algorithm was proposed for direct current (DC) signal processing. The state-space model of the OCT DC measurement system is first established; mixed noise is then handled by incorporating it into the measurement and state parameters. According to the minimum mean squared error criterion, the state prediction and update equations of the improved Kalman algorithm are deduced from the established model. An improved central difference Kalman filter was proposed for alternating current (AC) signal processing, which improves the sampling strategy and the processing of colored noise. Real-time estimation and correction of noise were achieved by designing AC and DC noise recursive filters. Experimental results show that the improved signal processing algorithms had a good filtering effect on the AC and DC signals of the OCT with mixed noise. Furthermore, the proposed algorithm was able to achieve real-time correction of noise during the OCT filtering process.
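As a baseline for the DC branch, a scalar Kalman filter for a near-constant level observed in noise looks as follows. This is an illustrative textbook form with assumed noise variances, not the paper's improved algorithm, which additionally augments the state to absorb mixed/coloured noise.

```python
import numpy as np

def kalman_dc(z, q=1e-5, r=1e-2):
    """Scalar Kalman filter for the model x_k = x_{k-1} + w_k,
    z_k = x_k + v_k (random-walk DC level in measurement noise).
    q and r are assumed process/measurement noise variances."""
    x, p = z[0], 1.0              # initial state estimate and covariance
    out = np.empty_like(z)
    for k, zk in enumerate(z):
        p = p + q                 # predict: covariance grows by q
        kgain = p / (p + r)       # Kalman gain
        x = x + kgain * (zk - x)  # update with the innovation
        p = (1.0 - kgain) * p
        out[k] = x
    return out

z = 2.0 + 0.1 * np.random.default_rng(2).standard_normal(500)
print(kalman_dc(z)[-1])  # converges near the true DC level of 2.0
```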
Fišer, Jaromír; Zítek, Pavel; Skopec, Pavel; Knobloch, Jan; Vyhlídal, Tomáš
2017-05-01
The purpose of the paper is to achieve a constrained estimation of process state variables using an anisochronic state observer tuned by the dominant root locus technique. The anisochronic state observer is based on a state-space time-delay model of the process. The process model is identified as not only delayed but also non-linear, and is developed to describe a material flow process. The root locus technique, combined with the magnitude optimum method, is utilized to investigate the estimation process. The resulting dominant root locations serve as a measure of estimation performance: the higher the dominant (natural) frequency at its leftmost position in the complex plane, the better the performance and robustness achieved. A model-based observer control methodology for material flow processes is also provided by means of the separation principle. For demonstration purposes, the computer-based anisochronic state observer is applied to strip temperature estimation in a hot strip finishing mill composed of seven stands; this application was the original motivation for the presented research. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
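The observer structure being tuned is a delay-extended variant of the classical Luenberger form below; only the delay-free baseline is shown here (the anisochronic model inserts internal delays into the state and output maps), and the notation is illustrative. The gain L places the dominant roots of the estimation error dynamics.

```latex
% Baseline Luenberger observer; the characteristic equation of the
% error dynamics determines the dominant root locations tuned above.
\dot{\hat{x}}(t) = A\,\hat{x}(t) + B\,u(t) + L\bigl(y(t) - C\,\hat{x}(t)\bigr),
\qquad
\det\!\bigl(sI - (A - LC)\bigr) = 0
```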
Development of polypyrrole based solid-state on-chip microactuators using photolithography
NASA Astrophysics Data System (ADS)
Zhong, Yong; Lundemo, Staffan; Jager, Edwin W. H.
2018-07-01
There is a need for soft microactuators, especially for biomedical applications. We have developed a microfabrication process to create such soft, on-chip, polymer-based microactuators that can operate in air. The on-chip microactuators were fabricated using standard photolithographic techniques and wet etching, combined with a specially designed process to micropattern the electroactive polymer polypyrrole that drives the microactuators. By immobilizing a UV-patternable gel containing a liquid electrolyte on top of the electroactive polypyrrole layer, actuation in air was achieved, although with reduced movement. Further optimization of the processing is ongoing. The results show the possibility of batch-fabricating complex microsystems, such as microrobots and micromanipulators, based on these solid-state on-chip microactuators using microfabrication methods including standard photolithographic processes.
Method for fabricating beryllium-based multilayer structures
Skulina, Kenneth M.; Bionta, Richard M.; Makowiecki, Daniel M.; Alford, Craig S.
2003-02-18
Beryllium-based multilayer structures and a process for fabricating beryllium-based multilayer mirrors, useful in the wavelength region greater than the beryllium K-edge (111 Å or 11.1 nm). The process includes alternating sputter deposition of beryllium and a metal, typically from the fifth row of the periodic table, such as niobium (Nb), molybdenum (Mo), ruthenium (Ru), and rhodium (Rh). The process includes not only the method of sputtering the materials, but also the industrial hygiene controls for safe handling of beryllium. Mirrors made in accordance with the process may be utilized in soft x-ray and extreme-ultraviolet projection lithography, which requires mirrors of high reflectivity (>60%) for x-rays in the range of 60-140 Å (6.0-14.0 nm).
NASA Astrophysics Data System (ADS)
Ribes-Pleguezuelo, Pol; Inza, Andoni Moral; Basset, Marta Gilaberte; Rodríguez, Pablo; Rodríguez, Gemma; Laudisio, Marco; Galan, Miguel; Hornaff, Marcel; Beckert, Erik; Eberhardt, Ramona; Tünnermann, Andreas
2016-11-01
A miniaturized diode-pumped solid-state laser (DPSSL), designed as part of the Raman laser spectrometer (RLS) instrument for the European Space Agency (ESA) ExoMars 2020 mission, was assembled and tested against the mission's purpose and requirements. Two different processes were tried for assembling the laser: one based on adhesives, following traditional laser manufacturing processes; the other based on a low-stress, organic-free soldering technique called solderjet bumping. The manufactured devices were tested for process validation by passing mechanical, thermal-cycling, radiation, and optical functional tests. The comparative analysis showed that the soldered device improved on the adhesive-assembled one in terms of the reliability of its optical performance.
Interferometric architectures based All-Optical logic design methods and their implementations
NASA Astrophysics Data System (ADS)
Singh, Karamdeep; Kaur, Gurmeet
2015-06-01
All-Optical Signal Processing is an emerging technology that avoids the costly optical-electronic-optical (O-E-O) conversions usually required in traditional electronic signal processing systems, thus greatly enhancing the operating bit rate, with added advantages such as immunity to electromagnetic interference and low power consumption. In order to implement complex signal processing tasks, All-Optical logic gates are required as backbone elements. This review describes advances in All-Optical logic design methods based on interferometric architectures such as the Mach-Zehnder Interferometer (MZI), Sagnac interferometers and the Ultrafast Non-Linear Interferometer (UNI). All-Optical logic implementations realizing arithmetic and signal processing applications based on each interferometric arrangement are also presented in a categorized manner.
Shi, Yanwei; Ling, Wencui; Qiang, Zhimin
2013-01-01
The effect of chlorine dioxide (ClO2) oxidation on the formation of disinfection by-products (DBPs) during sequential (ClO2 pre-oxidation for 30 min) and simultaneous disinfection processes with free chlorine (FC) or monochloramine (MCA) was investigated. DBP formation from synthetic humic acid (HA) water and three natural surface waters containing low bromide levels (11-27 microg/L) was comparatively examined in the FC-based (single FC, sequential ClO2-FC, and simultaneous ClO2/FC) and MCA-based (single MCA, ClO2-MCA, and ClO2/MCA) disinfection processes. The results showed that far more DBPs were formed from the synthetic HA water than from the three natural surface waters with comparable levels of dissolved organic carbon. In the FC-based processes, ClO2 oxidation reduced trihalomethanes (THMs) by 27-35% and haloacetic acids (HAAs) by 14-22% in the three natural surface waters, but increased THMs by 19% and HAAs by 31% in the synthetic HA water after an FC contact time of 48 h. In the MCA-based processes, similar trends were observed, although DBPs were produced at a much lower level. There was no significant difference in DBP formation between the sequential and simultaneous processes. The presence of a high level of bromide (320 microg/L) markedly promoted DBP formation in the FC-based processes. Therefore, the simultaneous ClO2/MCA disinfection process is recommended, particularly for waters with a high bromide level.
Agent-Based Modeling of Growth Processes
ERIC Educational Resources Information Center
Abraham, Ralph
2014-01-01
Growth processes abound in nature, and are frequently the target of modeling exercises in the sciences. In this article we illustrate an agent-based approach to modeling, in the case of a single example from the social sciences: bullying.
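The abstract gives no model details, so the following is only a generic sketch of what an agent-based growth model looks like: agents adopt a behavior with a probability that rises with exposure to adopters, producing the S-shaped growth typical of such processes. All names and parameters are invented for illustration:

```python
import random

# Minimal agent-based growth sketch: each agent may "adopt" a behavior
# with probability proportional to the adopters it meets this step.
# Parameters are illustrative, not from the article.
N, STEPS, MEET, P_BASE = 200, 50, 5, 0.02

agents = [False] * N          # False = has not yet adopted
agents[0] = True              # seed one adopter

history = []
for _ in range(STEPS):
    for i in range(N):
        if not agents[i]:
            contacts = random.sample(range(N), MEET)
            exposed = sum(agents[j] for j in contacts)
            if random.random() < P_BASE * exposed:
                agents[i] = True
    history.append(sum(agents))

print(history)  # an S-shaped (logistic-like) growth curve
```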
Wohlmuth da Silva, Salatiel; Arenhart Heberle, Alan Nelson; Pereira Santos, Alexia; Siqueira Rodrigues, Marco Antônio; Pérez-Herranz, Valentín; Moura Bernardes, Andréa
2018-05-29
Antibiotics are not efficiently removed by conventional wastewater treatments. Different advanced oxidation processes (AOPs), including ozone, peroxide and UV radiation, among others, are therefore being investigated for the elimination of microcontaminants. Most AOPs have proved efficient at degrading antibiotics, but mineralization is either not evaluated or not high. In this work, a UV-based hybrid process, photo-assisted electrochemical oxidation (PEO), was applied with the aim of mineralizing microcontaminants such as the antibiotics amoxicillin (AMX), norfloxacin (NOR) and azithromycin (AZI). The individual contributions of electrochemical oxidation (EO) and the UV-based processes to the hybrid process (PEO) were analysed. Results showed that AMX and NOR presented higher mineralization rates under direct photolysis than AZI, owing to their high absorption of UV radiation. For the EO processes, low mineralization was found for all antibiotics, which was attributed to a mass-transport limitation related to the low concentration of contaminants (200 µg/L). Moreover, an increase in mineralization was found when heterogeneous photocatalysis is compared with EO, owing to the influence of UV radiation, which overcomes the mass-transport limitations. Although the UV-based processes control the reaction pathway leading to mineralization, the best antibiotic mineralization was achieved by the PEO hybrid process, which can be explained by the synergistic effect of its constituent processes. The higher mineralization achieved is an important and useful finding for avoiding the discharge of microcontaminants into the environment.
High-performance wavelet engine
NASA Astrophysics Data System (ADS)
Taylor, Fred J.; Mellot, Jonathon D.; Strom, Erik; Koren, Iztok; Lewis, Michael P.
1993-11-01
Wavelet processing has shown great promise for a variety of image and signal processing applications. Wavelets are also among the most computationally expensive techniques in signal processing. It is demonstrated that a wavelet engine constructed with residue number system arithmetic elements offers significant advantages over commercially available wavelet accelerators based upon conventional arithmetic elements. Analysis is presented predicting the dynamic range requirements of the reported residue number system based wavelet accelerator.
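To make the arithmetic concrete, here is a small sketch of residue number system (RNS) arithmetic of the kind such an engine exploits: additions and multiplications act independently on each residue channel (no inter-channel carries), and the Chinese Remainder Theorem recovers the conventional result. The moduli are illustrative; the paper's dynamic-range analysis would dictate the actual choice:

```python
from math import prod

MODULI = (251, 253, 255, 256)  # pairwise coprime; dynamic range = their product

def to_rns(x, moduli=MODULI):
    return tuple(x % m for m in moduli)

def rns_add(a, b, moduli=MODULI):
    # Each residue channel adds independently: no carries between channels,
    # which is what makes RNS attractive for parallel, MAC-heavy filters.
    return tuple((ai + bi) % m for ai, bi, m in zip(a, b, moduli))

def rns_mul(a, b, moduli=MODULI):
    return tuple((ai * bi) % m for ai, bi, m in zip(a, b, moduli))

def from_rns(r, moduli=MODULI):
    # Chinese Remainder Theorem reconstruction.
    M = prod(moduli)
    x = 0
    for ri, m in zip(r, moduli):
        Mi = M // m
        x += ri * Mi * pow(Mi, -1, m)
    return x % M

a, b = 1234, 5678
assert from_rns(rns_mul(to_rns(a), to_rns(b))) == a * b
print(from_rns(rns_add(to_rns(a), to_rns(b))))  # 6912
```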
ERIC Educational Resources Information Center
Gershon, Walter S.; Oded, Ben-Horin
2014-01-01
Drawing from their respective work at the intersection of music and science, the coauthors argue that engaging in processes of making music can help students more deeply engage in the kinds of creativity associated with inquiry based science education (IBSE) and scientists better convey their ideas to others. Of equal importance, the processes of…
ERIC Educational Resources Information Center
Balci, Ceyda; Yenice, Nilgun
2016-01-01
The aim of this study is to analyse the effects of scientific argumentation based learning process on the eighth grade students' achievement in the unit of "cell division and inheritance". It also deals with the effects of this process on their comprehension about the nature of scientific knowledge, their willingness to take part in…
ERIC Educational Resources Information Center
Barbera, Elena; Garcia, Iolanda; Fuertes-Alpiste, Marc
2017-01-01
This paper presents a case study of the co-design process for an online course on Sustainable Development (Degree in Tourism) involving the teacher, two students, and the project researchers. The co-design process was founded on an inquiry-based and technology-enhanced model that takes shape in a set of design principles. The research had two main…
ERIC Educational Resources Information Center
Ellett, Chad D.; Demir, Kadir; Monsaas, Judith
2015-01-01
The purpose of this study was to examine change processes, self-efficacy beliefs, and department culture and the roles these elements play in faculty engagement in working in K-12 schools. The development of three new web-based measures of faculty perceptions of change processes, self-efficacy beliefs, and department culture are described. The…
USDA-ARS?s Scientific Manuscript database
The U.S. food and non-food industries would benefit from the development of a domestically produced crude, semi-pure and pure bio-based fiber gum from corn bran and oat hulls processing waste streams. When corn bran and oat hulls are processed to produce a commercial cellulose enriched fiber gel, th...
NASA Technical Reports Server (NTRS)
2001-01-01
REI Systems, Inc. developed a software solution that uses the Internet to eliminate the paperwork typically required to document and manage complex business processes. The data management solution, called Electronic Handbooks (EHBs), is presently used for the entire SBIR program process at NASA. The EHB-based system is ideal for programs and projects whose users are geographically distributed and are involved in complex management processes and procedures. EHBs provide flexible access control and increased communications while maintaining security for systems of all sizes. Through Internet Protocol-based access, user authentication and user-based access restrictions, role-based access control, and encryption/decryption, EHBs provide the level of security required for confidential data transfer. EHBs contain electronic forms and menus, which can be used in real time to execute the described processes. EHBs use standard word processors that generate ASCII HTML code to set up electronic forms that are viewed within a web browser. EHBs require no end-user software distribution, significantly reducing operating costs. Each interactive handbook simulates a hard-copy version containing chapters with descriptions of participants' roles in the online process.
Li, Rui; Feng, Chuanping; Hu, Weiwu; Xi, Beidou; Chen, Nan; Zhao, Baowei; Liu, Ying; Hao, Chunbo; Pu, Jiaoyang
2016-02-01
Nitrate contaminated water can be effectively treated by simultaneous heterotrophic and autotrophic denitrification (HAD). In the present study, woodchips and elemental sulfur were used as co-electron donors for HAD. It was found that ammonium salts could enhance the denitrifying activity of the Thiobacillus bacteria, which utilize the ammonium produced by the dissimilatory nitrate reduction to ammonium (DNRA) in the woodchip-sulfur based heterotrophic and autotrophic denitrification (WSHAD) process. The denitrification performance of the WSHAD process (reaction constants from 0.05485 h-1 to 0.06637 h-1) is better than that of sulfur-based autotrophic denitrification (reaction constants from 0.01029 h-1 to 0.01379 h-1), and the optimized ratio of woodchips to sulfur is 1:1 (w/w). No sulfate accumulation is observed in the WSHAD process, and the alkalinity generated by the heterotrophic denitrification can compensate for the alkalinity consumed by the sulfur-based autotrophic denitrification. The symbiotic relationship between the autotrophic and heterotrophic denitrification processes plays a vital role in the mixotrophic environment. Copyright © 2015 Elsevier Ltd. All rights reserved.
Chronic motivational state interacts with task reward structure in dynamic decision-making.
Cooper, Jessica A; Worthy, Darrell A; Maddox, W Todd
2015-12-01
Research distinguishes between a habitual, model-free system motivated toward immediately rewarding actions, and a goal-directed, model-based system motivated toward actions that improve future state. We examined the balance of processing in these two systems during state-based decision-making. We tested a regulatory fit hypothesis (Maddox & Markman, 2010) that predicts that global trait motivation affects the balance of habitual- vs. goal-directed processing but only through its interaction with the task framing as gain-maximization or loss-minimization. We found support for the hypothesis that a match between an individual's chronic motivational state and the task framing enhances goal-directed processing, and thus state-based decision-making. Specifically, chronic promotion-focused individuals under gain-maximization and chronic prevention-focused individuals under loss-minimization both showed enhanced state-based decision-making. Computational modeling indicates that individuals in a match between global chronic motivational state and local task reward structure engaged more goal-directed processing, whereas those in a mismatch engaged more habitual processing. Copyright © 2015 Elsevier Inc. All rights reserved.
Ma, Jun; Marignier, Jean-Louis; Pernot, Pascal; Houée-Levin, Chantal; Kumar, Anil; Sevilla, Michael D; Adhikary, Amitava; Mostafavi, Mehran
2018-05-30
In irradiated DNA, through the base-to-base and backbone-to-base hole transfer processes, the hole (i.e., the unpaired spin) localizes on the most electropositive base, guanine. Phosphate radicals formed via ionization events in the DNA backbone must play an important role in the backbone-to-base hole transfer process. However, earlier studies on irradiated hydrated DNA, on irradiated DNA models in frozen aqueous solution, and in neat dimethyl phosphate showed the formation of carbon-centered radicals and not phosphate radicals. Therefore, to model the backbone-to-base hole transfer process, we report picosecond pulse radiolysis studies of the reactions of H2PO4˙ with the DNA bases - G, A, T, and C - in 6 M H3PO4 at 22 °C. The time-resolved observations show that in 6 M H3PO4, H2PO4˙ causes the one-electron oxidation of adenine, guanine and thymine, forming the cation radicals via a single electron transfer (SET) process; however, the rate constant of the reaction of H2PO4˙ with cytosine is too low (<10^7 L mol-1 s-1) to be measured. The rates of these reactions are influenced by the protonation states and the reorganization energies of the base radicals and of the phosphate radical in 6 M H3PO4.
Shahaf, Goded; Pratt, Hillel
2013-01-01
In this work we demonstrate the principles of a systematic modeling approach to the neurophysiologic processes underlying a behavioral function. The modeling is based upon a flexible simulation tool, which enables parametric specification of the underlying neurophysiologic characteristics. While the impact of selecting specific parameters is of interest, in this work we focus on the insights which emerge from rather accepted assumptions regarding neuronal representation. We show that harnessing even such simple assumptions enables the derivation of significant insights regarding the nature of the neurophysiologic processes underlying behavior. We demonstrate our approach in some detail by modeling the behavioral go/no-go task. We further demonstrate the practical significance of this simplified modeling approach in interpreting experimental data - the manifestation of these processes in the EEG and ERP literature of normal and abnormal (ADHD) function - as well as with a comprehensive analysis of relevant ERP data. In fact, we show that from the model-based spatiotemporal segregation of the processes it is possible to derive simple yet effective, theory-based EEG markers differentiating normal and ADHD subjects. We conclude by claiming that the neurophysiologic processes modeled for the go/no-go task are part of a limited set of neurophysiologic processes which underlie, in a variety of combinations, any behavioral function with a measurable operational definition. Such neurophysiologic processes could be sampled directly from EEG on the basis of model-based spatiotemporal segregation.
Challenges With Research Contract Negotiations in Community-Based Cancer Research.
Thompson, Michael A; Hurley, Patricia A; Faller, Bryan; Longinette, Jean; Richter, Katie; Stewart, Teresa L; Robert, Nicholas
2016-06-01
Community-based research programs face many barriers to participation in clinical trials. Although the majority of people with cancer are diagnosed and treated in the community setting, only roughly 3% are enrolled onto clinical trials. Research contract and budget negotiations have been consistently identified as time consuming and a barrier to participation in clinical trials. ASCO's Community Research Forum conducted a survey about specific challenges of research contract and budget negotiation processes in community-based research settings. The goal was to ultimately identify potential solutions to these barriers. A survey was distributed to 780 community-based physician investigators and research staff. The survey included questions to provide insight into contract and budget negotiation processes and perceptions about related barriers. A total of 77% of the 150 respondents acknowledged barriers in the process. Respondents most frequently identified budget-related issues (n = 133), inefficiencies in the process (n = 80), or legal review and negotiation issues (n = 70). Of the respondents, 44.1% indicated that contract research organizations made the contract negotiations process harder for their research program, and only 5% believed contract research organizations made the process easier. The contract negotiations process is perceived to be impeded by sponsors through underestimation of costs, lack of flexibility with the contract language, and excessive delays. Improving clinical trial activation processes and reducing inefficiencies would benefit all interested stakeholders, including patients who may ultimately stand to benefit from participation in clinical trials. The following key recommendations were made: standardize contracts and negotiation processes to promote transparency and efficiency, improve sponsor processes to minimize the burden on sites, create and promote the use of contract templates and best practices, and provide education and consultation. Copyright © 2016 by American Society of Clinical Oncology.
Hard and soft acids and bases: structure and process.
Reed, James L
2012-07-05
Under investigation are the structure and process that give rise to hard-soft behavior in simple anionic atomic bases. The expectation that, for simple atomic bases, chemical hardness is the only extrinsic component of acid-base strength has been substantiated in the current study. A thermochemically based operational scale of chemical hardness was used to identify the structure within anionic atomic bases that is responsible for chemical hardness. The base's responding electrons have been identified as that structure, and the relaxation that occurs during charge transfer has been identified as the process giving rise to hard-soft behavior. This is in contrast to the commonly accepted explanations that attribute hard-soft behavior to varying degrees of electrostatic and covalent contributions to the acid-base interaction. The ability of the atomic ion's responding electrons to cause hard-soft behavior has been assessed by examining the correlation of the estimated relaxation energies of the responding electrons with the operational chemical hardness. It has been demonstrated that the responding electrons are able to give rise to hard-soft behavior in simple anionic bases.
Quality of nursing documentation: Paper-based health records versus electronic-based health records.
Akhu-Zaheya, Laila; Al-Maaitah, Rowaida; Bany Hani, Salam
2018-02-01
To assess and compare the quality of paper-based and electronic-based health records. The comparison examined three criteria: content, documentation process and structure. Nursing documentation is a significant indicator of the quality of patient care delivery. It can be either paper-based or organised within the system known as electronic health records. Nursing documentation must be completed to the highest standards, to ensure the safety and quality of healthcare services. However, the evidence is not clear on which of the two forms of documentation (paper-based versus electronic health records) is of higher quality. A retrospective, descriptive, comparative design was used to address the study's purposes. A convenience sample of patients' records, from two public hospitals, was audited using the Cat-ch-Ing audit instrument. The sample consisted of 434 records, covering both paper-based and electronic health records, from medical and surgical wards. Electronic health records were better than paper-based health records in terms of process and structure. In terms of quantity and quality of content, paper-based records were better than electronic health records. The study affirmed the poor quality of nursing documentation and a lack of nurses' knowledge and skills in the nursing process and its application in both paper-based and electronic-based systems. Both forms of documentation revealed drawbacks in terms of content, process and structure. This study provides important information that can guide policymakers and administrators in identifying effective strategies aimed at enhancing the quality of nursing documentation. Policies and actions to ensure quality nursing documentation at the national level should focus on improving nursing knowledge, competencies and practice in the nursing process, enhancing the work environment and nursing workload, and strengthening the capacity building of nursing practice to improve the quality of nursing care and patients' outcomes. © 2017 John Wiley & Sons Ltd.
Recognition Decisions From Visual Working Memory Are Mediated by Continuous Latent Strengths.
Ricker, Timothy J; Thiele, Jonathan E; Swagman, April R; Rouder, Jeffrey N
2017-08-01
Making recognition decisions often requires us to reference the contents of working memory, the information available for ongoing cognitive processing. As such, understanding how recognition decisions are made when based on the contents of working memory is of critical importance. In this work we examine whether recognition decisions based on the contents of visual working memory follow a continuous decision process of graded information about the correct choice or a discrete decision process reflecting only knowing and guessing. We find a clear pattern in favor of a continuous latent strength model of visual working memory-based decision making, supporting the notion that visual recognition decision processes are impacted by the degree of matching between the contents of working memory and the choices given. Relation to relevant findings and the implications for human information processing more generally are discussed. Copyright © 2016 Cognitive Science Society, Inc.
NASA Astrophysics Data System (ADS)
Kobayashi, Takashi; Komoda, Norihisa
The traditional business process design methods, of which the use case is the most typical, provide no useful framework for designing the activity sequence. As a result, design efficiency and quality vary widely with the designer's experience and skill. In this paper, to solve this problem, we propose a model of business events and their state transitions (a basic business event model) based on the language/action perspective, a result from the cognitive science domain. In business process design using this model, event occurrence conditions are decided so that the events synchronize with one another. We also propose a design pattern for deciding event occurrence conditions (a business event improvement strategy). Lastly, we apply the business process design method based on the business event model and the business event improvement strategy to a credit card issuing process and estimate its effect.
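As a toy illustration of event occurrence conditions keeping events synchronized, consider the sketch below. The event names and conditions are invented (loosely echoing the paper's credit card issuing example) and are not taken from the proposed model:

```python
# Each event fires only when its occurrence condition (the states of
# other events) holds; iterating to a fixed point keeps events in sync.
states = {"application_received": True,
          "credit_checked": False,
          "card_issued": False}

conditions = {
    "credit_checked": lambda s: s["application_received"],
    "card_issued":    lambda s: s["credit_checked"],
}

changed = True
while changed:
    changed = False
    for event, cond in conditions.items():
        if not states[event] and cond(states):
            states[event] = True
            changed = True

print(states)  # all True: the issuing process completed in order
```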
Fault detection and diagnosis using neural network approaches
NASA Technical Reports Server (NTRS)
Kramer, Mark A.
1992-01-01
Neural networks can be used to detect and identify abnormalities in real-time process data. Two basic approaches can be used, the first based on training networks using data representing both normal and abnormal modes of process behavior, and the second based on statistical characterization of the normal mode only. Given data representative of process faults, radial basis function networks can effectively identify failures. This approach is often limited by the lack of fault data, but can be facilitated by process simulation. The second approach employs elliptical and radial basis function neural networks and other models to learn the statistical distributions of process observables under normal conditions. Analytical models of failure modes can then be applied in combination with the neural network models to identify faults. Special methods can be applied to compensate for sensor failures, to produce real-time estimation of missing or failed sensors based on the correlations codified in the neural network.
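A minimal sketch of the second approach (characterizing the normal mode only) might look like the following: Gaussian radial basis functions centered on normal-operation samples give a density-like score, and observations scoring far below normal are flagged as candidate faults. The data, kernel width, and decision logic are illustrative assumptions, not the paper's trained networks:

```python
import numpy as np

def rbf_novelty_score(x, centers, width):
    """Density-like score from Gaussian kernels placed at normal-mode samples.

    A low score means x lies far from everything seen in normal
    operation, i.e. a candidate fault.
    """
    d2 = ((centers - x) ** 2).sum(axis=1)
    return np.exp(-d2 / (2.0 * width ** 2)).mean()

rng = np.random.default_rng(1)
normal = rng.normal(0.0, 1.0, size=(500, 3))   # normal-mode process data
width = 0.5                                    # kernel width (assumed)

ok = rbf_novelty_score(np.array([0.1, -0.2, 0.0]), normal, width)
bad = rbf_novelty_score(np.array([6.0, 6.0, 6.0]), normal, width)
print(ok > bad)  # True: the outlier scores far lower
```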
Familiarity facilitates feature-based face processing.
Visconti di Oleggio Castello, Matteo; Wheeler, Kelsey G; Cipolli, Carlo; Gobbini, M Ida
2017-01-01
Recognition of personally familiar faces is remarkably efficient, effortless and robust. We asked if feature-based face processing facilitates detection of familiar faces by testing the effect of face inversion on a visual search task for familiar and unfamiliar faces. Because face inversion disrupts configural and holistic face processing, we hypothesized that inversion would diminish the familiarity advantage to the extent that it is mediated by such processing. Subjects detected personally familiar and stranger target faces in arrays of two, four, or six face images. Subjects showed significant facilitation of personally familiar face detection for both upright and inverted faces. The effect of familiarity on target absent trials, which involved only rejection of unfamiliar face distractors, suggests that familiarity facilitates rejection of unfamiliar distractors as well as detection of familiar targets. The preserved familiarity effect for inverted faces suggests that facilitation of face detection afforded by familiarity reflects mostly feature-based processes.
Process-based upscaling of surface-atmosphere exchange
NASA Astrophysics Data System (ADS)
Keenan, T. F.; Prentice, I. C.; Canadell, J.; Williams, C. A.; Wang, H.; Raupach, M. R.; Collatz, G. J.; Davis, T.; Stocker, B.; Evans, B. J.
2015-12-01
Empirical upscaling techniques such as machine learning and data-mining have proven invaluable tools for the global scaling of disparate observations of surface-atmosphere exchange, but are not based on a theoretical understanding of the key processes involved. This makes spatial and temporal extrapolation outside of the training domain difficult at best. There is therefore a clear need for the incorporation of knowledge of ecosystem function, in combination with the strength of data mining. Here, we present such an approach. We describe a novel diagnostic process-based model of global photosynthesis and ecosystem respiration, which is directly informed by a variety of global datasets relevant to ecosystem state and function. We use the model framework to estimate global carbon cycling both spatially and temporally, with a specific focus on the mechanisms responsible for long-term change. Our results show the importance of incorporating process knowledge into upscaling approaches, and highlight the effect of key processes on the terrestrial carbon cycle.
Ni Based Powder Reconditioning and Reuse for LMD Process
NASA Astrophysics Data System (ADS)
Renderos, M.; Girot, F.; Lamikiz, A.; Torregaray, A.; Saintier, N.
LMD is an additive manufacturing process based on the injection of metallic powder into a melt pool created by a laser heat source on a substrate. One benefit of this technology is the reduction of wasted material, since it is a near-net-shape process. However, one of its main drawbacks is the relatively low efficiency of powder capture, which can be less than 5% in some cases. The non-trapped powder represents a significant cost in the LMD process, since metal powder is very expensive and is usually not reused. This article proposes a methodology for the reconditioning and subsequent reuse of a nickel-based powder commonly used in the aerospace industry, with the main objectives of cost savings, lower environmental impact, and increased overall efficiency of the LMD process. The results are validated by building a prototype part from reused powder.
Optimizing the availability of a buffered industrial process
Martz, Jr., Harry F.; Hamada, Michael S.; Koehler, Arthur J.; Berg, Eric C.
2004-08-24
A computer-implemented process determines optimum configuration parameters for a buffered industrial process. A population size is initialized by randomly selecting a first set of design and operation values associated with subsystems and buffers of the buffered industrial process to form a set of operating parameters for each member of the population. An availability discrete event simulation (ADES) is performed on each member of the population to determine the product-based availability of each member. A new population is formed having members with a second set of design and operation values related to the first set of design and operation values through a genetic algorithm and the product-based availability determined by the ADES. Subsequent population members are then determined by iterating the genetic algorithm with product-based availability determined by ADES to form improved design and operation values from which the configuration parameters are selected for the buffered industrial process.
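The patent couples a genetic algorithm to an availability discrete event simulation (ADES); the sketch below shows only the genetic-algorithm skeleton, with a made-up closed-form availability/cost fitness standing in for ADES. Every function and parameter here is illustrative:

```python
import random

def fitness(buffers):
    # Toy stand-in for ADES: availability rises with buffer size,
    # penalized by a linear buffer cost. Purely illustrative.
    avail = 1.0
    for b in buffers:
        avail *= 1.0 - 0.2 / (1.0 + b)      # bigger buffer -> fewer stoppages
    return avail - 0.01 * sum(buffers)

def ga(n_buffers=4, pop=30, gens=40, bmax=10):
    popn = [[random.randint(0, bmax) for _ in range(n_buffers)]
            for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=fitness, reverse=True)
        elite = popn[: pop // 2]            # keep the fitter half
        children = []
        while len(elite) + len(children) < pop:
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, n_buffers)
            child = a[:cut] + b[cut:]                  # one-point crossover
            if random.random() < 0.2:                  # mutation
                child[random.randrange(n_buffers)] = random.randint(0, bmax)
            children.append(child)
        popn = elite + children
    return max(popn, key=fitness)

print(ga())  # buffer sizes balancing availability against cost
```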
da Silva, André Rodrigues Gurgel; Torres Ortega, Carlo Edgar; Rong, Ben-Guang
2016-10-01
In this work, a method based on process synthesis, simulation and evaluation has been used to set up and study industrial-scale lignocellulosic bioethanol production processes. Scenarios for the pretreatment processes of dilute acid, liquid hot water and ammonia fiber explosion were studied. Pretreatment reactor temperature, catalyst loading and water content, as well as solids loading in the hydrolysis reactor, were evaluated with regard to their effects on process energy consumption and bioethanol concentration. The best scenarios for maximizing ethanol concentration and minimizing total annual costs (TAC) were selected, and their minimum ethanol selling price was calculated. Ethanol concentrations in the range of 2-8% (wt.) after pretreatment were investigated. The best scenarios maximizing ethanol concentration and minimizing TAC achieved reductions of 19.6% and 30.2%, respectively, in the final ethanol selling price with respect to the initial base case. Copyright © 2016 Elsevier Ltd. All rights reserved.
Ye, Jianchu; Tu, Song; Sha, Yong
2010-10-01
For two-step transesterification biodiesel production from sunflower oil, based on the kinetics model of homogeneous base-catalyzed transesterification and the liquid-liquid phase equilibrium of the transesterification product, the total methanol/oil mole ratio, the total reaction time, and the split ratios of methanol and reaction time between the two reactors are determined quantitatively. Taking the transesterification intermediate product into consideration, both the traditional distillation separation process and an improved separation process for the two-step reaction product are investigated in detail by means of rigorous process simulation. In comparison with the traditional distillation process, the improved separation process has a distinct advantage in energy duty and equipment requirements, owing to the replacement of the costly methanol-biodiesel distillation column. Copyright 2010 Elsevier Ltd. All rights reserved.
Groenendyk, Derek G.; Ferré, Ty P.A.; Thorp, Kelly R.; Rice, Amy K.
2015-01-01
Soils lie at the interface between the atmosphere and the subsurface and are a key component that control ecosystem services, food production, and many other processes at the Earth’s surface. There is a long-established convention for identifying and mapping soils by texture. These readily available, georeferenced soil maps and databases are used widely in environmental sciences. Here, we show that these traditional soil classifications can be inappropriate, contributing to bias and uncertainty in applications from slope stability to water resource management. We suggest a new approach to soil classification, with a detailed example from the science of hydrology. Hydrologic simulations based on common meteorological conditions were performed using HYDRUS-1D, spanning textures identified by the United States Department of Agriculture soil texture triangle. We consider these common conditions to be: drainage from saturation, infiltration onto a drained soil, and combined infiltration and drainage events. Using a k-means clustering algorithm, we created soil classifications based on the modeled hydrologic responses of these soils. The hydrologic-process-based classifications were compared to those based on soil texture and a single hydraulic property, Ks. Differences in classifications based on hydrologic response versus soil texture demonstrate that traditional soil texture classification is a poor predictor of hydrologic response. We then developed a QGIS plugin to construct soil maps combining a classification with georeferenced soil data from the Natural Resource Conservation Service. The spatial patterns of hydrologic response were more immediately informative, much simpler, and less ambiguous, for use in applications ranging from trafficability to irrigation management to flood control. The ease with which hydrologic-process-based classifications can be made, along with the improved quantitative predictions of soil responses and visualization of landscape function, suggest that hydrologic-process-based classifications should be incorporated into environmental process models and can be used to define application-specific maps of hydrologic function. PMID:26121466
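The clustering step can be illustrated in a few lines of scikit-learn; here the response features (e.g., cumulative drainage, infiltration depth, time to steady state) are random stand-ins for the HYDRUS-1D summary metrics, and the number of clusters is an arbitrary choice:

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical feature table: one row per soil, columns are summary
# metrics of its simulated hydrologic responses (illustrative data).
rng = np.random.default_rng(42)
responses = rng.random((60, 3))

# Cluster soils by what they *do*, not by their texture label.
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(responses)
print(km.labels_)           # hydrologic-process-based class of each soil
print(km.cluster_centers_)  # representative response of each class
```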
TOWARDS A MULTI-SCALE AGENT-BASED PROGRAMMING LANGUAGE METHODOLOGY
Somogyi, Endre; Hagar, Amit; Glazier, James A.
2017-01-01
Living tissues are dynamic, heterogeneous compositions of objects, including molecules, cells and extra-cellular materials, which interact via chemical, mechanical and electrical processes and reorganize via transformation, birth, death and migration processes. Current programming languages have difficulty describing the dynamics of tissues because: 1: Dynamic sets of objects participate simultaneously in multiple processes, 2: Processes may be either continuous or discrete, and their activity may be conditional, 3: Objects and processes form complex, heterogeneous relationships and structures, 4: Objects and processes may be hierarchically composed, 5: Processes may create, destroy and transform objects and processes. Some modeling languages support these concepts, but most cannot translate models into executable simulations. We present a new hybrid executable modeling language paradigm, the Continuous Concurrent Object Process Methodology (CCOPM), which naturally expresses tissue models, enabling users to visually create agent-based models of tissues, and also allows computer simulation of these models. PMID:29282379
NASA Astrophysics Data System (ADS)
Qyyum, Muhammad Abdul; Long, Nguyen Van Duc; Minh, Le Quang; Lee, Moonyong
2018-01-01
Design optimization of the single mixed refrigerant (SMR) natural gas liquefaction (LNG) process involves highly non-linear interactions between decision variables, constraints, and the objective function. These non-linear interactions lead to irreversibilities that deteriorate the energy efficiency of the LNG process. In this study, a simple and highly efficient hybrid modified coordinate descent (HMCD) algorithm is proposed for the optimization of the natural gas liquefaction process. The single mixed refrigerant process was modeled in Aspen Hysys® and then connected to a Microsoft Visual Studio environment. The proposed algorithm provided improved results compared with existing methodologies for finding the optimal condition of the complex mixed-refrigerant natural gas liquefaction process. By applying it, the SMR process can be designed with a specific compression power of 0.2555 kW, equivalent to a 44.3% energy saving compared to the base case. Furthermore, the coefficient of performance (COP) can be enhanced by up to 34.7% compared to the base case. The proposed algorithm provides a deep understanding of the optimization of the liquefaction process from both technical and numerical perspectives. In addition, the HMCD algorithm can be applied to any mixed-refrigerant-based liquefaction process in the natural gas industry.
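The coordinate-wise idea at the core of such algorithms can be sketched in a few lines: optimize one decision variable at a time over a grid while holding the others fixed, and sweep until convergence. The toy objective below stands in for the Hysys-computed compression power; the paper's HMCD is a modified, hybrid variant of this bare skeleton:

```python
import numpy as np

def coordinate_descent(f, x0, lo, hi, sweeps=20, n_grid=25):
    """Bare-bones coordinate descent: minimize one variable at a time
    over a grid while holding the others fixed."""
    x = np.array(x0, dtype=float)
    for _ in range(sweeps):
        for i in range(len(x)):
            grid = np.linspace(lo[i], hi[i], n_grid)
            vals = []
            for g in grid:
                x[i] = g
                vals.append(f(x))
            x[i] = grid[int(np.argmin(vals))]   # keep the best grid point
    return x

# Toy stand-in for specific compression power vs. two refrigerant flows.
f = lambda x: (x[0] - 1.2) ** 2 + (x[1] - 0.7) ** 2 + 0.3 * x[0] * x[1]
print(coordinate_descent(f, [0.0, 0.0], [0.0, 0.0], [3.0, 3.0]))
```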
Sahinkaya, Erkan; Dursun, Nesrin
2012-09-01
This study evaluated the elimination of the alkalinity requirement and excess sulfate generation of the sulfur-based autotrophic denitrification process by stimulating a simultaneous autotrophic and heterotrophic (mixotrophic) denitrification process in a column bioreactor through methanol supplementation. The denitrification performances of the sulfur-based autotrophic and mixotrophic processes were also compared. In the autotrophic process, acidity produced by denitrifying sulfur-oxidizing bacteria was neutralized by external NaHCO3 supplementation. After stimulating the mixotrophic denitrification process, the alkalinity requirement of the autotrophic process was satisfied by the alkalinity produced by heterotrophic denitrifiers. Decreasing and finally eliminating the external alkalinity supplementation did not adversely affect process performance. Complete denitrification of 75 mg L-1 NO3-N under mixotrophic conditions at a 4 h hydraulic retention time was achieved without external alkalinity supplementation and with an effluent sulfate concentration lower than the drinking water guideline value of 250 mg L-1. The denitrification rate of the mixotrophic process (0.45 g NO3-N L-1 d-1) was higher than that of the autotrophic one (0.3 g NO3-N L-1 d-1). Batch studies showed that the sulfur-based autotrophic nitrate reduction rate increased with increasing initial nitrate concentration, and transient accumulation of nitrite was observed. Copyright © 2012 Elsevier Ltd. All rights reserved.
Estimating Missing Unit Process Data in Life Cycle Assessment Using a Similarity-Based Approach.
Hou, Ping; Cai, Jiarui; Qu, Shen; Xu, Ming
2018-05-01
In life cycle assessment (LCA), collecting unit process data from empirical sources (i.e., meter readings, operation logs/journals) is often costly and time-consuming. We propose a new computational approach that estimates missing unit process data relying solely on limited known data, based on a similarity-based link prediction method. The intuition is that similar processes in a unit process network tend to have similar material/energy inputs and waste/emission outputs. We use the ecoinvent 3.1 unit process data sets to test our method in four steps: (1) dividing the data sets into a training set and a test set; (2) randomly removing certain amounts of data in the test set and marking them as missing; (3) using similarity-weighted means of various numbers of the most similar processes in the training set to estimate the missing data in the test set; and (4) comparing the estimated data with the original values to determine the performance of the estimation. The results show that missing data can be accurately estimated when less than 5% of the data are missing in one process. The estimation performance decreases as the percentage of missing data increases. This study provides a new approach to compiling unit process data and demonstrates the promising potential of computational approaches for LCA data compilation.
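The core computation — a similarity-weighted mean over the k most similar complete processes, with similarity computed on the entries the target does have — can be sketched as follows. The random "library" and the use of cosine similarity are illustrative assumptions; the paper works with the ecoinvent 3.1 data sets:

```python
import numpy as np

def impute_missing(target, known_mask, library, k=5):
    """Estimate missing unit-process entries as a similarity-weighted
    mean over the k most similar library processes. Any values in the
    target's 'missing' positions are ignored and overwritten."""
    # Cosine similarity on the known coordinates only.
    t = target[known_mask]
    L = library[:, known_mask]
    sim = (L @ t) / (np.linalg.norm(L, axis=1) * np.linalg.norm(t) + 1e-12)
    top = np.argsort(sim)[-k:]                 # k most similar processes
    w = sim[top] / sim[top].sum()              # similarity weights
    est = target.copy()
    est[~known_mask] = w @ library[top][:, ~known_mask]
    return est

rng = np.random.default_rng(7)
library = rng.random((100, 8))                 # complete unit processes
target = library[0] + 0.01 * rng.random(8)     # a near-duplicate process
mask = np.ones(8, dtype=bool)
mask[[2, 5]] = False                           # entries 2 and 5 "missing"
print(impute_missing(target, mask, library[1:]))
```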
Mathematical modelling of disintegration-limited co-digestion of OFMSW and sewage sludge.
Esposito, G; Frunzo, L; Panico, A; d'Antonio, G
2008-01-01
This paper presents a mathematical model able to simulate, under dynamic conditions, the physical, chemical and biological processes prevailing in an OFMSW and sewage sludge anaerobic digestion system. The proposed model is based on differential mass balance equations for the substrates, products and bacterial groups involved in the co-digestion process, and includes the biochemical reactions of substrate conversion and the kinetics of microbial growth and decay. The main peculiarity of the model is its surface-based kinetic description of the OFMSW disintegration process, whereas pH is determined from a ninth-order polynomial equation derived from acid-base equilibria. The model can be applied to simulate the co-digestion process for several purposes, such as evaluating the optimal process conditions in terms of OFMSW/sewage sludge ratio, temperature, OFMSW particle size, solid mixture retention time, reactor stirring rate, etc. Biogas production and composition can also be evaluated, to estimate the potential energy production under different process conditions. In particular, the model simulations reported in this paper show the model's capability to predict the OFMSW amount that can be treated in the digester of an existing MWWTP and to assess the OFMSW particle-size reduction pre-treatment required to increase the rate of the disintegration process, which can otherwise severely limit the co-digestion system. Copyright IWA Publishing 2008.
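To see why particle size limits the process under surface-based kinetics, consider a shrinking-sphere sketch: if mass is lost at a rate proportional to surface area, the radius shrinks linearly in time (dr/dt = -k/ρ), so larger particles retain disproportionately more undigested mass. The rate constant and density below are assumed values, not the paper's calibration:

```python
# Shrinking-sphere view of surface-based disintegration:
# dM/dt = -k*A with M = (4/3)*pi*r^3*rho and A = 4*pi*r^2
# reduces to dr/dt = -k/rho (linear radius shrinkage).
k_sbk = 1.0e-2   # kg m^-2 d^-1, surface-based rate constant (assumed)
rho   = 1000.0   # kg m^-3, particle density (assumed)
days  = 40.0

for r0 in (1e-3, 5e-3):                       # initial radii, m
    r = max(r0 - (k_sbk / rho) * days, 0.0)
    frac_left = (r / r0) ** 3                 # remaining mass fraction
    print(f"r0 = {r0 * 1e3:.0f} mm -> mass fraction left: {frac_left:.2f}")
# Smaller particles disintegrate much faster, which is why a particle-size
# reduction pre-treatment relieves the disintegration-limited step.
```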
Reduced order model based on principal component analysis for process simulation and optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lang, Y.; Malacina, A.; Biegler, L.
2009-01-01
It is well-known that distributed parameter computational fluid dynamics (CFD) models provide more accurate results than conventional, lumped-parameter unit operation models used in process simulation. Consequently, the use of CFD models in process/equipment co-simulation offers the potential to optimize overall plant performance with respect to complex thermal and fluid flow phenomena. Because solving CFD models is time-consuming compared to the overall process simulation, we consider the development of fast reduced order models (ROMs) based on CFD results to closely approximate the high-fidelity equipment models in the co-simulation. By considering process equipment items with complicated geometries and detailed thermodynamic property models, this study proposes a strategy to develop ROMs based on principal component analysis (PCA). Taking advantage of commercial process simulation and CFD software (for example, Aspen Plus and FLUENT), we are able to develop systematic CFD-based ROMs for equipment models in an efficient manner. In particular, we show that the validity of the ROM is more robust within a well-sampled input domain and the CPU time is significantly reduced. Typically, it takes at most several CPU seconds to evaluate the ROM, compared to several CPU hours or more to solve the CFD model. Two case studies, involving two power plant equipment examples, are described and demonstrate the benefits of using our proposed ROM methodology for process simulation and optimization.
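A minimal sketch of the PCA-ROM recipe: take snapshots of the expensive model's output fields, compress them to a few principal components, and fit a cheap map from inputs to component scores. Here random data and a linear regression stand in for the CFD snapshots and the surrogate; the paper's actual workflow couples Aspen Plus and FLUENT:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.random((200, 4))                       # sampled input conditions
true_map = rng.normal(size=(4, 300))
Y = X @ true_map + 0.01 * rng.normal(size=(200, 300))  # "CFD" field snapshots

# 1. PCA of the output snapshots via SVD.
mean = Y.mean(axis=0)
U, s, Vt = np.linalg.svd(Y - mean, full_matrices=False)
r = 5                                          # retained components
scores = (Y - mean) @ Vt[:r].T                 # (200, r) PC scores

# 2. Cheap input -> score regression (the actual ROM).
coef, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], scores, rcond=None)

def rom(x):
    z = np.append(x, 1.0) @ coef               # predicted PC scores
    return mean + z @ Vt[:r]                   # reconstructed output field

err = np.abs(rom(X[0]) - Y[0]).max()
print(f"max abs error on a training point: {err:.3f}")  # small
```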
Ren, Luquan; Zhou, Xueli; Song, Zhengyi; Zhao, Che; Liu, Qingping; Xue, Jingze; Li, Xiujuan
2017-03-16
Recently, with a broadening range of available materials and alterations to feeding processes, several extrusion-based 3D printing processes for metal materials have been developed. An emerging application is the fabrication of metal parts for electronics and composites. In this paper, critical parameters of extrusion-based 3D printing were optimized through a series of experiments with a melting extrusion printer. The raw materials were copper powder and a thermoplastic organic binder system comprising paraffin wax, low-density polyethylene, and stearic acid (PW-LDPE-SA). The homogeneity and rheological behaviour of the raw materials, the strength of the green samples, and the hardness of the sintered samples were investigated, and the printing and sintering parameters were optimized with an orthogonal design method. The factors influencing the ultimate tensile strength of the green samples rank as follows: infill degree > raster angle > layer thickness. As for the sintering process, the major factor affecting hardness is sintering temperature, followed by holding time and heating rate. The highest hardness of the sintered samples was very close to the average hardness of commercially pure copper. Overall, extrusion-based printing of metal materials is a promising strategy, offering advantages over traditional approaches in cost, efficiency, and simplicity.