Digital Signal Processing Based Biotelemetry Receivers
NASA Technical Reports Server (NTRS)
Singh, Avtar; Hines, John; Somps, Chris
1997-01-01
This work is an attempt to develop a biotelemetry receiver using digital signal processing technology and techniques. The receiver developed in this work recovers signals that have been encoded using either the Pulse Position Modulation (PPM) or the Pulse Code Modulation (PCM) technique. A prototype has been developed using state-of-the-art digital signal processing technology. A Printed Circuit Board (PCB) is being developed based on the technique and technology described here. This board is intended for use in the UCSF Fetal Monitoring system developed at NASA. The board is capable of handling a variety of PPM and PCM streams that encode signals such as ECG, temperature, and pressure. A signal processing program has also been developed to analyze the received ECG signal and determine heart rate. This system provides a base for using digital signal processing in biotelemetry receivers and other similar applications.
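The abstract does not spell out the heart-rate algorithm; a common approach to that final analysis step is R-peak detection on the decoded ECG followed by averaging the R-R intervals. A minimal Python sketch under that assumption (the thresholds and the synthetic test signal are invented, not from the paper):

```python
import numpy as np
from scipy.signal import find_peaks

def heart_rate_bpm(ecg, fs):
    """Estimate heart rate from a decoded ECG trace by R-peak detection.

    ecg : 1-D array of ECG samples (arbitrary units)
    fs  : sampling rate in Hz
    """
    # Normalize, then keep peaks that stand well above the baseline
    x = (ecg - np.mean(ecg)) / np.std(ecg)
    # Require peaks > 1.5 sigma and at least 0.3 s apart (< 200 bpm)
    peaks, _ = find_peaks(x, height=1.5, distance=int(0.3 * fs))
    if len(peaks) < 2:
        return None
    rr = np.diff(peaks) / fs           # R-R intervals in seconds
    return 60.0 / np.mean(rr)          # beats per minute

# Example: 10 s of synthetic 72-bpm ECG-like pulses at 250 Hz
fs = 250
t = np.arange(0, 10, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t) ** 63   # sharp periodic peaks at 1.2 Hz
print(round(heart_rate_bpm(ecg, fs)))     # ~72
```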
Process correlation analysis model for process improvement identification.
Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong
2014-01-01
Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment that identifies strengths and weaknesses, and improvement plans are developed based on the findings. In general, a process reference model (e.g., CMMI) is used as the base throughout the software process improvement effort. CMMI defines a set of process areas involved in software development and what is to be carried out in each process area in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort involved and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. The model is defined based on CMMI and empirical data on improvement practices. We evaluate the model using industrial data.
A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems
NASA Astrophysics Data System (ADS)
Li, Yu; Oberweis, Andreas
Aiming at increasing the flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets, as the basic methodology for business process modeling, and an XML net-based software toolset providing comprehensive functionality for POIS development.
NASA Astrophysics Data System (ADS)
Musil, Juergen; Schweda, Angelika; Winkler, Dietmar; Biffl, Stefan
Based on our observations of Austrian video game software development (VGSD) practices we identified a lack of systematic process/method support and inefficient collaboration between the various disciplines involved, i.e., engineers and artists. VGSD includes heterogeneous disciplines, e.g., creative arts, game/content design, and software. Improving team collaboration and process support therefore remains an ongoing challenge in enabling a comprehensive view of game development projects. Lessons learned from software engineering practice can help game developers improve game development processes within a heterogeneous environment. Based on a state-of-the-practice survey in the Austrian games industry, this paper (a) presents first results with a focus on process/method support and (b) suggests a candidate flexible process approach based on Scrum to improve VGSD and team collaboration. Results (a) showed a trend toward highly flexible software processes involving various disciplines and (b) identified the suggested flexible process approach as feasible and useful for project application.
The Knowledge-Based Software Assistant: Beyond CASE
NASA Technical Reports Server (NTRS)
Carozzoni, Joseph A.
1993-01-01
This paper will outline the similarities and differences between two paradigms of software development. Both support the whole software life cycle and provide automation for most of the software development process, but have different approaches. The CASE approach is based on a set of tools linked by a central data repository. This tool-based approach is data driven and views software development as a series of sequential steps, each resulting in a product. The Knowledge-Based Software Assistant (KBSA) approach, a radical departure from existing software development practices, is knowledge driven and centers around a formalized software development process. KBSA views software development as an incremental, iterative, and evolutionary process with development occurring at the specification level.
Technology development for lunar base water recycling
NASA Technical Reports Server (NTRS)
Schultz, John R.; Sauer, Richard L.
1992-01-01
This paper will review previous and ongoing work in aerospace water recycling and identify research activities required to support development of a lunar base. The development of a water recycle system for use in the life support systems envisioned for a lunar base will require considerable research work. A review of previous work on aerospace water recycle systems indicates that more efficient physical and chemical processes are needed to reduce expendables and power requirements. Development work on biological processes that can be applied to microgravity and lunar environments also needs to be initiated. Biological processes are inherently more efficient than physical and chemical processes and may be used to minimize resupply and waste disposal requirements. Processes for recovering and recycling nutrients such as nitrogen, phosphorus, and sulfur also need to be developed to support plant growth units. The development of efficient water quality monitors to be used for process control and environmental monitoring also needs to be initiated.
Model-Driven Useware Engineering
NASA Astrophysics Data System (ADS)
Meixner, Gerrit; Seissler, Marc; Breiner, Kai
User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.
ERIC Educational Resources Information Center
Ngo, Chau M.; Trinh, Lap Q.
2011-01-01
The field of English language education has seen developments in writing pedagogy, moving from product-based to process-based and then to genre-based approaches. In Vietnam, teaching secondary school students how to write in English still lags behind these developments. The product-based approach is commonly seen in English writing…
Process-based Cost Estimation for Ramjet/Scramjet Engines
NASA Technical Reports Server (NTRS)
Singh, Brijendra; Torres, Felix; Nesman, Miles; Reynolds, John
2003-01-01
Process-based cost estimation plays a key role in effecting the cultural change that integrates distributed science, technology, and engineering teams to rapidly create innovative and affordable products. Working together, NASA Glenn Research Center and Boeing Canoga Park have developed a methodology of process-based cost estimation that bridges the methodologies of high-level parametric models and detailed bottom-up estimation. The NASA GRC/Boeing CP process-based cost model provides a probabilistic structure of layered cost drivers. High-level inputs characterize mission requirements, system performance, and relevant economic factors. Design alternatives are extracted from a standard, product-specific work breakdown structure to pre-load lower-level cost driver inputs and generate the cost-risk analysis. As product design progresses and matures, the lower-level, more detailed cost drivers can be revisited and the projected variation of input values narrowed, thereby generating a progressively more accurate estimate of cost-risk. Incorporated into the process-based cost model are techniques for decision analysis, specifically the analytic hierarchy process (AHP) and functional utility analysis. Design alternatives may then be evaluated not just on cost-risk, but also on user-defined performance and schedule criteria. This implementation of full trade-study support contributes significantly to the realization of the integrated development environment. The process-based cost estimation model generates development and manufacturing cost estimates. The development team plans to expand the manufacturing process base from approximately 80 manufacturing processes to over 250 processes. Operation and support cost modeling is also envisioned. Process-based estimation considers the materials, resources, and processes in establishing cost-risk and, rather than depending on weight as an input, actually estimates weight along with cost and schedule.
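The abstract names the analytic hierarchy process (AHP) as one of the incorporated decision-analysis techniques. Below is a minimal sketch of the standard AHP weighting step (principal eigenvector of a pairwise-comparison matrix); the criteria and judgments are invented for illustration and are not taken from the NASA GRC/Boeing model:

```python
import numpy as np

# Hypothetical pairwise comparisons among three criteria
# (cost-risk, performance, schedule) on Saaty's 1-9 scale:
# cost-risk is judged 3x as important as performance and 5x schedule;
# performance is 2x schedule. Reciprocals fill the lower triangle.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Principal eigenvector of A gives the criterion weights
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()
print(dict(zip(["cost-risk", "performance", "schedule"], w.round(3))))

# Consistency ratio: CR = ((lambda_max - n)/(n - 1)) / RI, with RI(3) = 0.58
n = A.shape[0]
CR = (eigvals.real.max() - n) / (n - 1) / 0.58
print("consistency ratio ~", round(CR, 3))   # < 0.1 is conventionally acceptable
```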
ERIC Educational Resources Information Center
Kennedy, Kerry J.
The processes of instructional materials development and dissemination used in four Stanford Program on International and Cross Cultural Education (SPICE) projects dealing with Latin America, Africa, China, and Japan are described, and evaluative comments based on a review of the curriculum development process are made. The major purpose of the…
Culturally Based Intervention Development: The Case of Latino Families Dealing with Schizophrenia
ERIC Educational Resources Information Center
Barrio, Concepcion; Yamada, Ann-Marie
2010-01-01
Objectives: This article describes the process of developing a culturally based family intervention for Spanish-speaking Latino families with a relative diagnosed with schizophrenia. Method: Our iterative intervention development process was guided by a cultural exchange framework and based on findings from an ethnographic study. We piloted this…
Developing evidence-based physical therapy clinical practice guidelines.
Kaplan, Sandra L; Coulter, Colleen; Fetters, Linda
2013-01-01
Recommended strategies for developing evidence-based clinical practice guidelines (CPGs) are provided. The intent is that future CPGs developed with the support of the Section on Pediatrics of the American Physical Therapy Association would consistently follow similar developmental processes to yield consistent quality and presentation. Steps in the process of developing CPGs are outlined and resources are provided to assist CPG developers in carrying out their task. These recommended processes may also be useful to CPG developers representing organizations with similar structures, objectives, and resources.
Guide Book for Curriculum Development and Adaptation. Draft.
ERIC Educational Resources Information Center
United Nations Educational, Scientific and Cultural Organization, Bangkok (Thailand). Principal Regional Office for Asia and the Pacific.
This guide was developed as a cooperative initiative by UNESCO member countries to provide information about the process of development and adaptation of a flexible, competency-based curriculum, especially in vocational education. The guidebook outlines an exemplar process of education organized in 11 stages of design and development based on…
DEVS Unified Process for Web-Centric Development and Testing of System of Systems
2008-05-20
gathering from the user. Further, methodologies have been developed to generate DEVS models from BPMN/BPEL-based and message-based requirement specifications...27] 3. BPMN/BPEL-based system specifications: Business Process Modeling Notation (BPMN) [bpm] or Business Process Execution Language (BPEL) provide a...information is stored in .wsdl and .bpel files for BPEL but in a proprietary format for BPMN. 4. DoDAF-based requirement specifications: Department of
The Use of Multi-Criteria Evaluation and Network Analysis in the Area Development Planning Process
2013-03-01
layouts. The alternative layout scoring process, based on multi-criteria evaluation, returns a quantitative score for each alternative layout and a...The purpose of this research was to develop improvements to the area development planning process. These plans are used to improve operations within...an installation sub-section by altering the physical layout of facilities. One methodology was developed based on applying network analysis concepts to
2010-01-01
Comparative Effectiveness Research, or other efforts to determine best practices and to develop guidelines based on meta-analysis and evidence-based medicine. An...authoritative reviews or other evidence-based medicine sources, but they have been made unambiguous and computable – a process which sounds...best practice recommendation created through an evidence-based medicine (EBM) development process. The lifecycle envisions four stages of refinement
Lee, Robert H; Bott, Marjorie J; Forbes, Sarah; Redford, Linda; Swagerty, Daniel L; Taunton, Roma Lee
2003-01-01
Understanding how quality improvement affects costs is important. Unfortunately, low-cost, reliable ways of measuring direct costs are scarce. This article builds on the principles of process improvement to develop a costing strategy that meets both criteria. Process-based costing has 4 steps: developing a flowchart, estimating resource use, valuing resources, and calculating direct costs. To illustrate the technique, this article uses it to cost the care planning process in 3 long-term care facilities. We conclude that process-based costing is easy to implement; generates reliable, valid data; and allows nursing managers to assess the costs of new or modified processes.
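As a concrete illustration of the four costing steps the abstract lists (flowchart the process, estimate resource use, value the resources, calculate direct costs), here is a toy calculation for a hypothetical care-planning process; the steps, times, and wage rates are invented, not taken from the study's facilities:

```python
# Step 1 (flowchart) is represented here as an ordered list of process steps;
# step 2 is the minutes column; step 3 is the wage table; step 4 is the sum.
steps = [
    # (process step, minutes per episode, staff role)
    ("review resident chart",     20, "RN"),
    ("interdisciplinary meeting", 30, "RN"),
    ("write care plan",           25, "LPN"),
    ("enter plan into record",    10, "clerk"),
]
hourly_rate = {"RN": 38.0, "LPN": 26.0, "clerk": 18.0}  # assumed $/hour

# Direct cost = sum over steps of (time used) x (unit value of the resource)
direct_cost = sum(minutes / 60 * hourly_rate[role]
                  for _, minutes, role in steps)
print(f"direct cost per care-planning episode: ${direct_cost:.2f}")  # $45.50
```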
Development of an evidence-based review with recommendations using an online iterative process.
Rudmik, Luke; Smith, Timothy L
2011-01-01
The practice of modern medicine is governed by evidence-based principles. Due to the plethora of medical literature, clinicians often rely on systematic reviews and clinical guidelines to summarize the evidence and provide best practices. Implementation of an evidence-based clinical approach can minimize variation in health care delivery and optimize the quality of patient care. This article reports a method for developing an "Evidence-based Review with Recommendations" using an online iterative process. The manuscript describes the following steps involved in this process: Clinical topic selection, Evidence-based review assignment, Literature review and initial manuscript preparation, Iterative review process with author selection, and Manuscript finalization. The goal of this article is to improve efficiency and increase the production of evidence-based reviews while maintaining the high quality and transparency associated with the rigorous methodology utilized for clinical guideline development. With the rise of evidence-based medicine, most medical and surgical specialties have an abundance of clinical topics which would benefit from a formal evidence-based review. Although clinical guideline development is an important methodology, the associated challenges limit development to only the absolute highest-priority clinical topics. As outlined in this article, the online iterative approach to the development of an Evidence-based Review with Recommendations may improve productivity without compromising the quality associated with formal guideline development methodology.
Learning-Based Curriculum Development
ERIC Educational Resources Information Center
Nygaard, Claus; Hojlt, Thomas; Hermansen, Mads
2008-01-01
This article is written to inspire curriculum developers to centre their efforts on the learning processes of students. It presents a learning-based paradigm for higher education and demonstrates the close relationship between curriculum development and students' learning processes. The article has three sections: Section "The role of higher…
Towards an Intelligent Planning Knowledge Base Development Environment
NASA Technical Reports Server (NTRS)
Chien, S.
1994-01-01
This abstract describes work in developing knowledge base editing and debugging tools for the Multimission VICAR Planner (MVP) system. MVP uses artificial intelligence planning techniques to automatically construct executable complex image processing procedures (using models of the smaller constituent image processing steps) in response to image processing requests made to the JPL Multimission Image Processing Laboratory.
NASA Astrophysics Data System (ADS)
Subhan, M.; Oktolita, N.; Kn, M.
2018-04-01
Students' weak process skills in learning are due in part to a lack of exercises in the form of LKS (student worksheets). In the 2013 curriculum there is no companion LKS for improving students' skills. To address this problem, it is necessary to develop process-skill-based LKS as teaching material to improve students' process skills. The purpose of this study is to develop process-skill-based LKS for elementary school grades IV, V, and VI, integrated with process skills. The developed LKS can be used to develop the thematic process skills of elementary school students in grades IV, V, and VI under the 2013 curriculum. The expected long-term goal is to produce process-skill-based thematic LKS teaching materials able to develop the process skills of elementary school students in grades IV, V, and VI. This development research follows the steps developed by Borg & Gall (1983). The development process is carried out through 10 stages: preliminary research and information gathering, planning, draft development, initial test (limited trial), first product revision, final trial (field trial), product operational revision, dissemination, and implementation. The subjects of the limited trial were students of SDN in Dharmasraya, grades IV, V, and VI. The field trial subjects in the experimental classes were SDN Dharmasraya students in grades IV, V, and VI who had implemented the 2013 curriculum. Data were collected using LKS validation sheets, process skill observation sheets, and thematic learning tests (pre-test and post-test). The developed LKS scored 81.70 for validity (very valid), 83.94 for practicality (very practical), and 86.67 for effectiveness (very effective). The trial of the LKS used a one-group pretest-posttest design, with the aim of determining how effectively the developed LKS improves the process skills of students in grades IV, V, and VI of elementary school. Data were collected through pre-test and post-test process-skill results and analyzed with SPSS 16.0. The analysis of students' process-skill learning gave a Sig. (2-tailed) value of 0.000 < α (0.005), so H0 is rejected: there is a significant difference in the development of process skills between students using the LKS and students who do not use it. It can be concluded that the LKS is accurate, easy to use, and improves learning outcomes on the process-skill aspect for students in grades IV, V, and VI of elementary school.
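The pre-test/post-test comparison the authors report (Sig. (2-tailed) = 0.000, H0 rejected) corresponds to a paired-samples t-test. A sketch of the same analysis in Python, with invented scores standing in for the SPSS data:

```python
import numpy as np
from scipy import stats

# Hypothetical pre/post process-skill scores for the same ten students
pre  = np.array([62, 58, 70, 65, 55, 60, 68, 63, 57, 66])
post = np.array([78, 74, 85, 80, 71, 77, 83, 79, 72, 81])

t, p = stats.ttest_rel(post, pre)   # paired (dependent-samples) t-test
print(f"t = {t:.2f}, two-tailed p = {p:.4f}")
# Reject H0 (no difference) when p < alpha, mirroring the
# "Sig. (2-tailed) (0.000) < alpha" decision reported in the abstract.
```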
Performance-Based Assessment: An Alternative Assessment Process for Young Gifted Children.
ERIC Educational Resources Information Center
Hafenstein, Norma Lu; Tucker, Brooke
Performance-based assessment provides an alternative identification method for young gifted children. A performance-based identification process was developed and implemented to select three-, four-, and five-year-old children for inclusion in a school for gifted children. Literature regarding child development, characteristics of young gifted…
Cultural adaptation process for international dissemination of the strengthening families program.
Kumpfer, Karol L; Pinyuchon, Methinin; Teixeira de Melo, Ana; Whiteside, Henry O
2008-06-01
The Strengthening Families Program (SFP) is an evidence-based family skills training intervention developed and found efficacious for substance abuse prevention by U.S. researchers in the 1980s. In the 1990s, a cultural adaptation process was developed to transport SFP for effectiveness trials with diverse populations (African, Hispanic, Asian, Pacific Islander, and Native American). Since 2003, SFP has been culturally adapted for use in 17 countries. This article reviews the SFP theory and research and a recommended cultural adaptation process. Challenges in international dissemination of evidence-based programs (EBPs) are discussed based on the results of U.N. and U.S. governmental initiatives to transport EBP family interventions to developing countries. The technology transfer and quality assurance system are described, including the language translation and cultural adaptation process for materials development, staff training, and on-site and online Web-based supervision, technical assistance, and evaluation services to assure quality implementation and process evaluation feedback for improvements.
Application of agent-based system for bioprocess description and process improvement.
Gao, Ying; Kipling, Katie; Glassey, Jarka; Willis, Mark; Montague, Gary; Zhou, Yuhong; Titchener-Hooker, Nigel J
2010-01-01
Modeling plays an important role in bioprocess development for design and scale-up. Predictive models can also be used in biopharmaceutical manufacturing to assist decision-making either to maintain process consistency or to identify optimal operating conditions. To predict the whole bioprocess performance, the strong interactions present in a processing sequence must be adequately modeled. Traditionally, bioprocess modeling considers process units separately, which makes it difficult to capture the interactions between units. In this work, a systematic framework is developed to analyze the bioprocesses based on a whole process understanding and considering the interactions between process operations. An agent-based approach is adopted to provide a flexible infrastructure for the necessary integration of process models. This enables the prediction of overall process behavior, which can then be applied during process development or once manufacturing has commenced, in both cases leading to the capacity for fast evaluation of process improvement options. The multi-agent system comprises a process knowledge base, process models, and a group of functional agents. In this system, agent components co-operate with each other in performing their tasks. These include the description of the whole process behavior, evaluating process operating conditions, monitoring of the operating processes, predicting critical process performance, and providing guidance to decision-making when coping with process deviations. During process development, the system can be used to evaluate the design space for process operation. During manufacture, the system can be applied to identify abnormal process operation events and then to provide suggestions as to how best to cope with the deviations. In all cases, the function of the system is to ensure an efficient manufacturing process. The implementation of the agent-based approach is illustrated via selected application scenarios, which demonstrate how such a framework may enable the better integration of process operations by providing a plant-wide process description to facilitate process improvement.
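A minimal sketch of the cooperating functional-agents pattern the abstract describes, in Python; the agent roles shown and the process-state fields are invented placeholders, not the actual system's components:

```python
# Each functional agent reads and annotates a shared process state;
# a coordinator runs them in sequence, standing in (very loosely) for
# the paper's process knowledge base plus process models.
class Agent:
    def act(self, state): ...

class MonitoringAgent(Agent):
    def act(self, state):
        # Flag a deviation when a unit's yield drifts below its expected band
        state["deviation"] = state["step_yield"] < 0.9 * state["expected_yield"]

class AdvisoryAgent(Agent):
    def act(self, state):
        if state.get("deviation"):
            state["advice"] = "re-run unit model; consider adjusting feed load"

class Coordinator:
    def __init__(self, agents):
        self.agents = agents

    def step(self, state):
        for agent in self.agents:
            agent.act(state)
        return state

state = {"step_yield": 0.70, "expected_yield": 0.85}
print(Coordinator([MonitoringAgent(), AdvisoryAgent()]).step(state))
```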
Harbison, K; Kelly, J; Burnell, L; Silva, J
1995-01-01
The Scenario-based Engineering Process (SEP) is a user-focused methodology for large and complex system design. This process supports new application development from requirements analysis with domain models to component selection, design and modification, implementation, integration, and archival placement. It is built upon object-oriented methodologies, domain modeling strategies, and scenario-based techniques to provide an analysis process for mapping application requirements to available components. We are using SEP in the health care applications that we are developing. The process has already achieved success in the manufacturing and military domains and is being adopted by many organizations. SEP should prove viable in any domain containing scenarios that can be decomposed into tasks.
Model-Based GN and C Simulation and Flight Software Development for Orion Missions beyond LEO
NASA Technical Reports Server (NTRS)
Odegard, Ryan; Milenkovic, Zoran; Henry, Joel; Buttacoli, Michael
2014-01-01
For Orion missions beyond low Earth orbit (LEO), the Guidance, Navigation, and Control (GN&C) system is being developed using a model-based approach for simulation and flight software. Lessons learned from the development of GN&C algorithms and flight software for the Orion Exploration Flight Test One (EFT-1) vehicle have been applied to the development of further capabilities for Orion GN&C beyond EFT-1. Continuing the use of a Model-Based Development (MBD) approach with the Matlab®/Simulink® tool suite, the process for GN&C development and analysis has been largely improved. Furthermore, a model-based simulation environment in Simulink, rather than an external C-based simulation, greatly eases the process for development of flight algorithms. The benefits seen by employing lessons learned from EFT-1 are described, as well as the approach for implementing additional MBD techniques. Also detailed are the key enablers for improvements to the MBD process, including enhanced configuration management techniques for model-based software systems, automated code and artifact generation, and automated testing and integration.
Evaluation of ERDA-sponsored coal feed system development
NASA Technical Reports Server (NTRS)
Phen, R. L.; Luckow, W. K.; Mattson, L.; Otth, D.; Tsou, P.
1977-01-01
Coal feeders were evaluated based upon criteria such as technical feasibility, performance (i.e. ability to meet process requirements), projected life cycle costs, and projected development cost. An initial set of feeders was selected based on the feeders' cost savings potential compared with baseline lockhopper systems. Additional feeders were considered for selection based on: (1) increasing the probability of successful feeder development; (2) application to specific processes; and (3) technical merit. A coal feeder development program is outlined.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simpson, L.; Britt, J.; Birkmire, R.
ITN Energy Systems, Inc., and Global Solar Energy, Inc., assisted by NREL's PV Manufacturing R&D program, have continued to advance CIGS production technology by developing trajectory-oriented predictive/control models, fault-tolerance control, control platform development, in-situ sensors, and process improvements. Modeling activities included developing physics-based and empirical models for CIGS and sputter-deposition processing, implementing model-based control, and applying predictive models to the construction of new evaporation sources and for control. Model-based control is enabled by implementing reduced or empirical models into a control platform. Reliability improvement activities include implementing preventive maintenance schedules; detecting failed sensors/equipment and reconfiguring to continue processing; and systematic development of fault prevention and reconfiguration strategies for the full range of CIGS PV production deposition processes. In-situ sensor development activities have resulted in improved control and indicated the potential for enhanced process status monitoring and control of the deposition processes. Substantial process improvements have been made, including significant improvement in CIGS uniformity, thickness control, efficiency, yield, and throughput. In large measure, these gains have been driven by process optimization, which in turn has been enabled by control and reliability improvements due to this PV Manufacturing R&D program.
NASA Astrophysics Data System (ADS)
Takei, Satoshi; Maki, Hirotaka; Sugahara, Kigen; Ito, Kenta; Hanabata, Makoto
2015-07-01
An electron beam (EB) lithography method using inedible cellulose-based resist material derived from woody biomass has been successfully developed. This method allows the use of pure water in the development process instead of the conventionally used tetramethylammonium hydroxide and anisole. The inedible cellulose-based biomass resist material, as an alternative to alpha-linked disaccharides in sugar derivatives that compete with food supplies, was developed by replacing the hydroxyl groups in the beta-linked disaccharides with EB-sensitive 2-methacryloyloxyethyl groups. A 75 nm line and space pattern at an exposure dose of 19 μC/cm2, a resist thickness uniformity of less than 0.4 nm on a 200 mm wafer, and low film thickness shrinkage under EB irradiation were achieved with this inedible cellulose-based biomass resist material using a water-based development process.
Rhodes, Scott D; Mann-Jackson, Lilli; Alonzo, Jorge; Simán, Florence M; Vissman, Aaron T; Nall, Jennifer; Abraham, Claire; Aronson, Robert E; Tanner, Amanda E
2017-12-01
The science underlying the development of individual, community, system, and policy interventions designed to reduce health disparities has lagged behind other innovations. Few models, theoretical frameworks, or processes exist to guide intervention development. Our community-engaged research partnership has been developing, implementing, and evaluating efficacious interventions to reduce HIV disparities for over 15 years. Based on our intervention research experiences, we propose a novel 13-step process designed to demystify and guide intervention development. Our intervention development process includes steps such as establishing an intervention team to manage the details of intervention development; assessing community needs, priorities, and assets; generating intervention priorities; evaluating and incorporating theory; developing a conceptual or logic model; crafting activities; honing materials; administering a pilot, noting its process, and gathering feedback from all those involved; and editing the intervention based on what was learned. Here, we outline and describe each of these 13 steps.
NASA Astrophysics Data System (ADS)
Ariana, I. M.; Bagiada, I. M.
2018-01-01
Development of spreadsheet-based integrated transaction processing systems and financial reporting systems is intended to optimize the capabilities of spreadsheets in accounting data processing. The purposes of this study are: 1) to describe the spreadsheet-based integrated transaction processing systems and financial reporting systems; and 2) to test their technical and operational feasibility. This is a research and development study. The main steps are: 1) needs analysis (needs assessment); 2) developing the spreadsheet-based integrated transaction processing systems and financial reporting systems; and 3) testing their feasibility. Technical feasibility covers the ability of hardware and operating systems to run the accounting application, and the application's simplicity and ease of use. Operational feasibility covers users' ability to operate the accounting application, the application's ability to produce information, and the controls built into the application. The instrument used to assess technical and operational feasibility is an expert perception questionnaire using a 4-point Likert scale, from 1 (strongly disagree) to 4 (strongly agree). Data were analyzed by percentage analysis, comparing the sum of answers for each item against the ideal sum for that item. The spreadsheet-based integrated transaction processing systems and financial reporting systems integrate sales, purchases, and cash transaction processing to produce financial reports (statement of profit or loss and other comprehensive income, statement of changes in equity, statement of financial position, and statement of cash flows) and other reports. The systems are feasible in both the technical aspect (87.50%) and the operational aspect (84.17%).
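The percentage analysis described above reduces to a per-item ratio of observed to ideal scores. A sketch with invented expert ratings (the 87.50% output below is reproduced only as an arithmetic illustration, not the study's data):

```python
# Feasibility score per item: (sum of expert ratings) / (ideal sum) x 100,
# where the ideal sum is (number of experts) x (top of the 4-point scale).
def item_percentage(ratings, scale_max=4):
    return 100.0 * sum(ratings) / (len(ratings) * scale_max)

technical_item = [4, 3, 4, 3]    # hypothetical ratings from four experts
print(f"{item_percentage(technical_item):.2f}%")   # 87.50% -> "feasible"
```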
Musical rhythm and reading development: does beat processing matter?
Ozernov-Palchik, Ola; Patel, Aniruddh D
2018-05-20
There is mounting evidence for links between musical rhythm processing and reading-related cognitive skills, such as phonological awareness. This may be because music and speech are rhythmic: both involve processing complex sound sequences with systematic patterns of timing, accent, and grouping. Yet there is a salient difference between musical and speech rhythm: musical rhythm is often beat-based (based on an underlying grid of equal time intervals), while speech rhythm is not. Thus, the role of beat-based processing in the reading-rhythm relationship is not clear. Is there a distinct relation between beat-based processing mechanisms and reading-related language skills, or is the rhythm-reading link entirely due to shared mechanisms for processing non-beat-based aspects of temporal structure? We discuss recent evidence for a distinct link between beat-based processing and early reading abilities in young children, and suggest experimental designs that would allow one to further methodically investigate this relationship. We propose that beat-based processing taps into a listener's ability to use rich contextual regularities to form predictions, a skill important for reading development.
ERIC Educational Resources Information Center
Lee, Il-Sun; Byeon, Jung-Ho; Kim, Young-shin; Kwon, Yong-Ju
2014-01-01
The purpose of this study was to develop a model for measuring experimental design ability based on functional magnetic resonance imaging (fMRI) during biological inquiry. More specifically, the researchers developed an experimental design task that measures experimental design ability. Using the developed experimental design task, they measured…
Responding to climate change in national forests: a guidebook for developing adaptation options
David L. Peterson; Connie I. Millar; Linda A. Joyce; Michael J. Furniss; Jessica E. Halofsky; Ronald P. Neilson; Toni Lyn Morelli
2011-01-01
This guidebook contains science-based principles, processes, and tools necessary to assist with developing adaptation options for national forest lands. The adaptation process is based on partnerships between local resource managers and scientists who work collaboratively to understand potential climate change effects, identify important resource issues, and develop...
Developing a Competency-Based Curriculum for a Dental Hygiene Program.
ERIC Educational Resources Information Center
DeWald, Janice P.; McCann, Ann L.
1999-01-01
Describes the three-step process used to develop a competency-based curriculum at the Caruth School of Dental Hygiene (Texas A&M University). The process involved development of a competency document (detailing three domains, nine major competencies, and 54 supporting competencies), an evaluation plan, and a curriculum inventory which defined…
ERIC Educational Resources Information Center
Rubin, Allen; Parrish, Danielle E.
2010-01-01
Objective: This report describes the development and preliminary findings regarding the reliability, validity, and sensitivity of a scale that has been developed to assess practitioners' perceived familiarity with, attitudes about, and implementation of the phases of the evidence-based practice (EBP) process. Method: After a panel of national…
ERIC Educational Resources Information Center
Schutte, Marc; Spottl, Georg
2011-01-01
Developing countries such as Malaysia and Oman have recently established occupational standards based on core work processes (functional clusters of work objects, activities and performance requirements), to which competencies (performance determinants) can be linked. While the development of work-process-based occupational standards is supposed…
Contingency theoretic methodology for agent-based web-oriented manufacturing systems
NASA Astrophysics Data System (ADS)
Durrett, John R.; Burnell, Lisa J.; Priest, John W.
2000-12-01
The development of distributed, agent-based, web-oriented, N-tier information systems (IS) must be supported by a design methodology capable of responding to the convergence of shifts in business process design, organizational structure, computing, and telecommunications infrastructures. We introduce a contingency theoretic model for the use of open, ubiquitous software infrastructure in the design of flexible organizational IS. Our basic premise is that developers should change the way they view the software design process: from solving a problem to dynamically creating teams of software components. We postulate that developing effective, efficient, flexible, component-based distributed software requires reconceptualizing the current development model. The basic concepts of distributed software design are merged with the environment-causes-structure relationship from contingency theory; the task-uncertainty of organizational-information-processing relationships from information processing theory; and the concept of inter-process dependencies from coordination theory. Software processes are considered as employees, groups of processes as software teams, and distributed systems as software organizations. Design techniques already used in the design of flexible business processes and well researched in the domain of the organizational sciences are presented. Guidelines that can be utilized in the creation of component-based distributed software will be discussed.
Druzinec, Damir; Salzig, Denise; Brix, Alexander; Kraume, Matthias; Vilcinskas, Andreas; Kollewe, Christian; Czermak, Peter
2013-01-01
Due to the increasing use of insect cell based expression systems in research and industrial recombinant protein production, the development of efficient and reproducible production processes remains a challenging task. In this context, the application of online monitoring techniques is intended to ensure high and reproducible product quality already during the early phases of process development. In the following chapter, the most common transient and stable insect cell based expression systems are briefly introduced. Novel applications of insect cell based expression systems for the production of insect-derived antimicrobial peptides/proteins (AMPs) are discussed using the example of G. mellonella derived gloverin. Suitable in situ sensor techniques for insect cell culture monitoring in disposable and common bioreactor systems are outlined with respect to optical and capacitive sensor concepts. Since scale-up of production processes is one of the most critical steps in process development, a concluding overview is given of scale-up aspects for industrial insect cell culture processes.
Controlled Ecological Life Support System: Research and Development Guidelines
NASA Technical Reports Server (NTRS)
Mason, R. M. (Editor); Carden, J. L. (Editor)
1982-01-01
Results of a workshop designed to provide a base for initiating a program of research and development of controlled ecological life support systems (CELSS) are summarized. Included are an evaluation of a ground based manned demonstration as a milestone in CELSS development, and a discussion of development requirements for a successful ground based CELSS demonstration. Research recommendations are presented concerning the following topics: nutrition and food processing, food production, waste processing, systems engineering and modelling, and ecology-systems safety.
NASA Technical Reports Server (NTRS)
Izygon, Michel E.
1992-01-01
The development process of the knowledge base for the generation of Test Libraries for Mission Operations Computer (MOC) Command Support focused on a series of information-gathering interviews. These knowledge capture sessions are supporting the development of a prototype for evaluating the capabilities of INTUIT on such an application. The prototype includes functions related to POCC (Payload Operation Control Center) processing. It prompts the end-users for input through a series of panels and then generates the Meds associated with the initialization and the update of hazardous command tables for a POCC Processing TLIB.
An Ibm PC/AT-Based Image Acquisition And Processing System For Quantitative Image Analysis
NASA Astrophysics Data System (ADS)
Kim, Yongmin; Alexander, Thomas
1986-06-01
In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.
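The cellular-DNA-content measurement mentioned above is typically an integrated-density computation over segmented nuclei. A toy Python version of that kind of quantitative measurement (the threshold and the synthetic image are invented for illustration):

```python
import numpy as np

def integrated_density(image, threshold):
    """Toy version of a quantitative-microscopy measurement: segment
    stained nuclei by a fixed threshold, then sum the stain signal over
    the segmented pixels (proportional, under ideal staining and optics,
    to DNA content)."""
    mask = image > threshold
    return image[mask].sum(), mask.sum()   # (integrated signal, area in px)

# Synthetic 8-bit-range "image": dim background plus one bright nucleus
img = np.full((64, 64), 10, dtype=np.uint16)
img[20:30, 20:30] = 200                     # 10x10 nucleus of intensity 200
signal, area = integrated_density(img, threshold=50)
print(signal, area)                         # 20000 100
```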
NASA Technical Reports Server (NTRS)
Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.
1995-01-01
Integrated Product and Process Development (IPPD) embodies the simultaneous application of both system and quality engineering methods throughout an iterative design process. The use of IPPD results in the time-conscious, cost-saving development of engineering systems. Georgia Tech has proposed the development of an Integrated Design Engineering Simulator that will merge Integrated Product and Process Development with interdisciplinary analysis techniques and state-of-the-art computational technologies. To implement IPPD, a Decision-Based Design perspective is encapsulated in an approach that focuses on the role of the human designer in product development. The approach has two parts and is outlined in this paper. First, an architecture, called DREAMS, is being developed that facilitates design from a decision-based perspective. Second, a supporting computing infrastructure, called IMAGE, is being designed. The current status of development is given and future directions are outlined.
Biomimetic design processes in architecture: morphogenetic and evolutionary computational design.
Menges, Achim
2012-03-01
Design computation has profound impact on architectural design methods. This paper explains how computational design enables the development of biomimetic design processes specific to architecture, and how they need to be significantly different from established biomimetic processes in engineering disciplines. The paper first explains the fundamental difference between computer-aided and computational design in architecture, as the understanding of this distinction is of critical importance for the research presented. Thereafter, the conceptual relation and possible transfer of principles from natural morphogenesis to design computation are introduced and the related developments of generative, feature-based, constraint-based, process-based and feedback-based computational design methods are presented. This morphogenetic design research is then related to exploratory evolutionary computation, followed by the presentation of two case studies focusing on the exemplary development of spatial envelope morphologies and urban block morphologies.
Recent developments in membrane-based separations in biotechnology processes: review.
Rathore, A S; Shirke, A
2011-01-01
Membrane-based separations are the most ubiquitous unit operations in biotech processes. There are several key reasons for this. First, they can be used with a large variety of applications including clarification, concentration, buffer exchange, purification, and sterilization. Second, they are available in a variety of formats, such as depth filtration, ultrafiltration, diafiltration, nanofiltration, reverse osmosis, and microfiltration. Third, they are simple to operate and are generally robust toward normal variations in feed material and operating parameters. Fourth, membrane-based separations typically require lower capital cost when compared to other processing options. As a result of these advantages, a typical biotech process has anywhere from 10 to 20 membrane-based separation steps. In this article we review the major developments that have occurred on this topic with a focus on developments in the last 5 years.
Documenting the decision structure in software development
NASA Technical Reports Server (NTRS)
Wild, J. Christian; Maly, Kurt; Shen, Stewart N.
1990-01-01
Current software development paradigms focus on the products of the development process. Much of the decision-making process which produces these products is outside the scope of these paradigms. The Decision-Based Software Development (DBSD) paradigm views the design process as a series of interrelated decisions which involve the identification and articulation of problems, alternatives, solutions, and justifications. Decisions made by programmers and analysts are recorded in a project data base. Unresolved problems are also recorded, and resources for their resolution are allocated by management according to the overall development strategy. This decision structure is linked to the products affected by the relevant decision and provides a process-oriented view of the resulting system. Software maintenance uses this decision view of the system to understand the rationale behind the decisions affecting the part of the system to be modified. D-HyperCase, a prototype Decision-Based Hypermedia System, is described and results of applying the DBSD approach during its development are presented.
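A decision record of the kind described — problem, alternatives, solution, justification, and links to affected products — can be pictured as a small data structure. This Python sketch uses invented field names and an invented example, not D-HyperCase's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class Decision:
    """One node of a DBSD-style project decision base."""
    problem: str
    alternatives: list
    solution: str
    justification: str
    affected_products: list = field(default_factory=list)
    resolved: bool = True     # unresolved problems are recorded too

d = Decision(
    problem="parser cannot handle nested includes",
    alternatives=["recursive-descent rewrite", "preprocessing pass"],
    solution="preprocessing pass",
    justification="smaller change; keeps the existing grammar intact",
    affected_products=["include.c", "design-note-12"],
)
print(d.problem, "->", d.solution)
```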
Kim, Huiyong; Hwang, Sung June; Lee, Kwang Soon
2015-02-03
Among various CO2 capture processes, the aqueous amine-based absorption process is considered the most promising for near-term deployment. However, the performance evaluation of newly developed solvents still requires complex and time-consuming procedures, such as pilot plant tests or the development of a rigorous simulator. The absence of accurate and simple calculation methods for energy performance at an early stage of process development has lengthened, and increased the expense of, the development of economically feasible CO2 capture processes. In this paper, a novel but simple method to reliably calculate the regeneration energy in a standard amine-based carbon capture process is proposed. Careful examination of stripper behavior and exploitation of energy balance equations around the stripper allow the regeneration energy to be calculated using only vapor-liquid equilibrium and caloric data. The reliability of the proposed method was confirmed by comparison against rigorous simulations for two well-known solvents, monoethanolamine (MEA) and piperazine (PZ). The proposed method can predict the regeneration energy at various operating conditions with greater simplicity, greater speed, and higher accuracy than methods proposed in previous studies. This enables faster and more precise screening of various solvents and faster optimization of process variables, and can eventually accelerate the development of economically deployable CO2 capture processes.
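The paper's own correlation is not reproduced here, but the quantity it targets is conventionally decomposed into desorption, sensible-heat, and stripping-steam terms. The sketch below illustrates that standard three-term estimate with generic, assumed MEA-like numbers — not the authors' method or data:

```python
def regen_energy(dH_abs, cp, dT, L_over_CO2, steam_ratio, dH_vap_h2o):
    """Classic three-term reboiler-duty estimate, in kJ per kg CO2 captured.
    dH_abs       : heat of CO2 desorption from the solvent
    cp, dT, L    : sensible heat to bring rich solvent up to reboiler T
    steam term   : latent heat of stripping steam lost overhead
    """
    q_desorption = dH_abs
    q_sensible   = cp * dT * L_over_CO2      # kJ/(kg.K) * K * kg solvent/kg CO2
    q_stripping  = steam_ratio * dH_vap_h2o  # kg H2O/kg CO2 * kJ/kg H2O
    return q_desorption + q_sensible + q_stripping

# Rough 30 wt% MEA figures (assumed): ~1900 kJ/kg CO2 desorption heat,
# cp ~ 3.7 kJ/(kg.K), 10 K approach, ~15 kg solvent/kg CO2,
# 0.5 kg stripping steam/kg CO2, water latent heat ~2257 kJ/kg
q = regen_energy(1900, 3.7, 10, 15, 0.5, 2257)
print(f"~{q / 1000:.1f} MJ per kg CO2")   # on the order of 3-4 MJ/kg for MEA
```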
Enzyme-based solutions for textile processing and dye contaminant biodegradation-a review.
Chatha, Shahzad Ali Shahid; Asgher, Muhammad; Iqbal, Hafiz M N
2017-06-01
The textile industry, a recognized mainstay of the world's economy, is facing serious environmental challenges. In many operations, the various chemical-based processes in practice, from initial sizing to final washing, raise serious environmental concerns. Some of these chemicals are corrosive and cause serious damage to the equipment itself. Therefore, in the twenty-first century, the chemical and allied industries seek a paradigm shift from traditional chemical-based concepts to greener, sustainable, and environmentally friendlier catalytic alternatives, at both the laboratory and industrial scales. Bio-based catalysis offers numerous benefits in the context of the biotechnological industry and environmental applications. In recent years, bio-based processing has received particular interest among scientists for inter- and multi-disciplinary investigations in the areas of natural and engineering sciences for application in the biotechnology sector at large and the textile industry in particular. Different enzymatic processes have been developed, or are in development, as substitutes for chemical treatments in various textile wet processes. In this context, the present review article summarizes current developments and highlights those areas where environment-friendly enzymatic textile processing might play an increasingly important role in the textile industry. In the first part of the review, a special focus is given to a comparative discussion of the chemical-based "classical/conventional" treatments and the modern enzyme-based treatment processes. Some relevant information is also reported to identify the major research gaps to be worked out in future.
Kahn, Jeremy M; Gould, Michael K; Krishnan, Jerry A; Wilson, Kevin C; Au, David H; Cooke, Colin R; Douglas, Ivor S; Feemster, Laura C; Mularski, Richard A; Slatore, Christopher G; Wiener, Renda Soylemez
2014-05-01
Many health care performance measures are either not based on high-quality clinical evidence or not tightly linked to patient-centered outcomes, limiting their usefulness in quality improvement. In this report we summarize the proceedings of an American Thoracic Society workshop convened to address this problem by reviewing current approaches to performance measure development and creating a framework for developing high-quality performance measures by basing them directly on recommendations from well-constructed clinical practice guidelines. Workshop participants concluded that ideally performance measures addressing care processes should be linked to clinical practice guidelines that explicitly rate the quality of evidence and the strength of recommendations, such as the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) process. Under this framework, process-based performance measures would only be developed from strong recommendations based on high- or moderate-quality evidence. This approach would help ensure that clinical processes specified in performance measures are both of clear benefit to patients and supported by strong evidence. Although this approach may result in fewer performance measures, it would substantially increase the likelihood that quality-improvement programs based on these measures actually improve patient care.
Gonzalez, John; Trickett, Edison J.
2014-01-01
This paper describes the processes we engaged in to develop a measurement protocol used to assess the outcomes in a community-based suicide and alcohol abuse prevention project with two Alaska Native communities. While the literature on community-based participatory research (CBPR) is substantial regarding the importance of collaborations, few studies have reported on this collaboration in the process of developing measures to assess CBPR projects. We first tell a story of the processes around the standard issues of doing cross-cultural work on measurement development related to areas of equivalence. A second story is provided that highlights how community differences within the same cultural group can affect both the process and content of culturally relevant measurement selection, adaptation, and development.
NASA Astrophysics Data System (ADS)
do Lago, Naydson Emmerson S. P.; Kardec Barros, Allan; Sousa, Nilviane Pires S.; Junior, Carlos Magno S.; Oliveira, Guilherme; Guimares Polisel, Camila; Eder Carvalho Santana, Ewaldo
2018-01-01
This study aims to develop an adaptive filter algorithm to determine the percentage of body fat from anthropometric indicators in adolescents. Measurements such as body mass, height, and waist circumference were collected for the analysis. The development of this filter was based on the Wiener filter, which produces an estimate of a random process by minimizing the mean square error between the estimated process and the desired process. The LMS algorithm was also studied for the development of the filter because of its simplicity and ease of computation. Excellent results were obtained with the developed filter; these results were analyzed and compared with the collected data.
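The LMS rule the authors refer to updates the filter weights by a step proportional to the instantaneous error times the input. A compact Python sketch on invented toy data (the indicator-to-body-fat mapping below is fabricated for illustration, not the study's model):

```python
import numpy as np

def lms_fit(X, d, mu=0.01, epochs=50):
    """LMS adaptation of weights w so that X @ w tracks d.
    X  : (n_samples, n_features) standardized anthropometric inputs
         (e.g., body mass, height, waist circumference) plus a bias column
    d  : desired output (reference body-fat percentage)
    mu : step size; LMS updates w by mu * error * input for each sample
    """
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, target in zip(X, d):
            e = target - x @ w      # instantaneous estimation error
            w += mu * e * x         # stochastic-gradient (LMS) step
    return w

# Invented toy data: 3 standardized indicators -> body-fat %
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w = np.array([4.0, -2.0, 6.0])
d = X @ true_w + 20 + rng.normal(scale=0.5, size=200)  # 20 ~ mean body-fat %
X1 = np.hstack([X, np.ones((200, 1))])                 # append bias column
print(lms_fit(X1, d).round(2))   # approaches [4, -2, 6, 20]
```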
System and method for deriving a process-based specification
NASA Technical Reports Server (NTRS)
Hinchey, Michael Gerard (Inventor); Rouff, Christopher A. (Inventor); Rash, James Larry (Inventor)
2009-01-01
A system and method for deriving a process-based specification for a system is disclosed. The process-based specification is mathematically inferred from a trace-based specification. The trace-based specification is derived from a non-empty set of traces or natural language scenarios. The process-based specification is mathematically equivalent to the trace-based specification. Code is generated, if applicable, from the process-based specification. A process, or phases of a process, using the features disclosed can be reversed and repeated to allow for an interactive development and modification of legacy systems. The process is applicable to any class of system, including, but not limited to, biological and physical systems, electrical and electro-mechanical systems in addition to software, hardware and hybrid hardware-software systems.
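As a rough illustration of moving from traces to a process description, the sketch below builds a prefix-tree automaton from a set of event traces. This is a toy stand-in under assumed inputs; the patented method performs a mathematically rigorous inference that this sketch does not reproduce.

```python
def build_prefix_automaton(traces):
    """Build a prefix-tree automaton (states = trace prefixes) from traces.

    A toy stand-in for deriving a process description from a trace-based
    specification; the patented method infers a mathematically equivalent
    process-based specification, which this sketch does not attempt.
    """
    transitions = {}            # (state, event) -> next state
    next_state, initial = 1, 0
    for trace in traces:
        state = initial
        for event in trace:
            key = (state, event)
            if key not in transitions:
                transitions[key] = next_state
                next_state += 1
            state = transitions[key]
    return transitions

traces = [("req", "ack", "done"), ("req", "nack"), ("req", "ack", "done")]
for (s, e), t in sorted(build_prefix_automaton(traces).items()):
    print(f"state {s} --{e}--> state {t}")
```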
Anammox-based technologies for nitrogen removal: Advances in process start-up and remaining issues.
Ali, Muhammad; Okabe, Satoshi
2015-12-01
Nitrogen removal from wastewater via the anaerobic ammonium oxidation (anammox)-based process has been recognized as an efficient, cost-effective, and low-energy alternative to the conventional nitrification and denitrification processes. To date, more than one hundred full-scale anammox plants have been installed and operated for treatment of NH4(+)-rich wastewater streams around the world, and the number is increasing rapidly. Since the discovery of the anammox process, extensive research has been done to develop various anammox-based technologies. However, there are still some challenges in the practical application of anammox-based treatment processes at full scale, e.g., a long start-up period, limited application to mainstream municipal wastewater, and poor effluent water quality. This paper summarizes the recent status of the application of the anammox process and research on technological developments aimed at solving these remaining problems. In addition, an integrated system of an anammox-based process and a microbial fuel cell is proposed for sustainable and energy-positive wastewater treatment. Copyright © 2015 Elsevier Ltd. All rights reserved.
Walsh, Jane C; Groarke, AnnMarie; Moss-Morris, Rona; Morrissey, Eimear; McGuire, Brian E
2017-01-01
Background Cancer-related fatigue (CrF) is the most common and disruptive symptom experienced by cancer survivors. We aimed to develop a theory-based, interactive Web-based intervention designed to facilitate self-management and enhance coping with CrF following cancer treatment. Objective The aim of our study was to outline the rationale, decision-making processes, methods, and findings which led to the development of a Web-based intervention to be tested in a feasibility trial. This paper outlines the process and method of development of the intervention. Methods An extensive review of the literature and qualitative research was conducted to establish a therapeutic approach for this intervention, based on theory. The psychological principles used in the development process are outlined, and we also clarify hypothesized causal mechanisms. We describe decision-making processes involved in the development of the content of the intervention, input from the target patient group and stakeholders, the design of the website features, and the initial user testing of the website. Results The cocreation of the intervention with the experts and service users allowed the design team to ensure that an acceptable intervention was developed. This evidence-based Web-based program is the first intervention of its kind based on self-regulation model theory, with the primary aim of targeting the representations of fatigue and enhancing self-management of CrF, specifically. Conclusions This research sought to integrate psychological theory, existing evidence of effective interventions, empirically derived principles of Web design, and the views of potential users into the systematic planning and design of the intervention of an easy-to-use website for cancer survivors. PMID:28676465
Aventin, Áine; Lohan, Maria; O'Halloran, Peter; Henderson, Marion
2015-04-01
Following the UK Medical Research Council's (MRC) guidelines for the development and evaluation of complex interventions, this study aimed to design, develop and optimise an educational intervention about young men and unintended teenage pregnancy based around an interactive film. The process involved identification of the relevant evidence base, development of a theoretical understanding of the phenomenon of unintended teenage pregnancy in relation to young men, and exploratory mixed methods research. The result was an evidence-based, theory-informed, user-endorsed intervention designed to meet the much neglected pregnancy education needs of teenage men and intended to increase both boys' and girls' intentions to avoid an unplanned pregnancy during adolescence. In prioritising the development phase, this paper addresses a gap in the literature on the processes of research-informed intervention design. It illustrates the application of the MRC guidelines in practice while offering a critique and additional guidance to programme developers on the MRC prescribed processes of developing interventions. Key lessons learned were: (1) know and engage the target population and engage gatekeepers in addressing contextual complexities; (2) know the targeted behaviours and model a process of change; and (3) look beyond development to evaluation and implementation. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
Lau, Nathan; Jamieson, Greg A; Skraaning, Gyrd
2016-03-01
The Process Overview Measure is a query-based measure developed to assess operator situation awareness (SA) from monitoring process plants. A companion paper describes how the measure has been developed according to process plant properties and operator cognitive work. The Process Overview Measure demonstrated practicality, sensitivity, validity and reliability in two full-scope simulator experiments investigating dramatically different operational concepts. Practicality was assessed based on qualitative feedback of participants and researchers. The Process Overview Measure demonstrated sensitivity and validity by revealing significant effects of experimental manipulations that corroborated with other empirical results. The measure also demonstrated adequate inter-rater reliability and practicality for measuring SA in full-scope simulator settings based on data collected on process experts. Thus, full-scope simulator studies can employ the Process Overview Measure to reveal the impact of new control room technology and operational concepts on monitoring process plants. Practitioner Summary: The Process Overview Measure is a query-based measure that demonstrated practicality, sensitivity, validity and reliability for assessing operator situation awareness (SA) from monitoring process plants in representative settings.
The Application of V&V within Reuse-Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward
1996-01-01
Verification and Validation (V&V) is performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors as early as possible during the development process. Early discovery is important in order to minimize the cost and other impacts of correcting these errors. In reuse-based software engineering, decisions on the requirements, design, and even implementation of domain assets can be made prior to beginning development of a specific system. In order to bring the effectiveness of V&V to bear within reuse-based software engineering, V&V must be incorporated within the domain engineering process.
Quality Assessment of TPB-Based Questionnaires: A Systematic Review
Oluka, Obiageli Crystal; Nie, Shaofa; Sun, Yi
2014-01-01
Objective This review is aimed at assessing the quality of questionnaires and their development process based on the theory of planned behavior (TPB) change model. Methods A systematic literature search for studies with the primary aim of TPB-based questionnaire development was conducted in relevant databases between 2002 and 2012 using selected search terms. Ten of 1,034 screened abstracts met the inclusion criteria and were assessed for methodological quality using two different appraisal tools: one for the overall methodological quality of each study and the other developed for the appraisal of the questionnaire content and development process. Both appraisal tools consisted of items regarding the likelihood of bias in each study and were eventually combined to give the overall quality score for each included study. Results 8 of the 10 included studies showed low risk of bias in the overall quality assessment of each study, while 9 of the studies were of high quality based on the quality appraisal of questionnaire content and development process. Conclusion Quality appraisal of the questionnaires in the 10 reviewed studies was successfully conducted, highlighting the top problem areas (including: sample size estimation; inclusion of direct and indirect measures; and inclusion of questions on demographics) in the development of TPB-based questionnaires and the need for researchers to provide a more detailed account of their development process. PMID:24722323
Hogan, Lindsay; García Bengoechea, Enrique; Salsberg, Jon; Jacobs, Judi; King, Morrison; Macaulay, Ann C
2014-12-01
This study is part of a larger community-based participatory research (CBPR) project to develop, implement, and evaluate the physical activity component of a school-based wellness policy. The policy intervention is being carried out by community stakeholders and academic researchers within the Kahnawake Schools Diabetes Prevention Project, a well-established health promotion organization in the Indigenous community of Kahnawake, Quebec. We explored how a group of stakeholders develop a school physical activity policy in a participatory manner, and examined factors serving as facilitators and barriers to the development process. This case study was guided by an interpretive description approach and draws upon data from documentary analysis and participant observation. A CBPR approach allowed academic researchers and community stakeholders to codevelop a physical activity policy that is both evidence-based and contextually appropriate. The development process was influenced by a variety of barriers and facilitators including working within existing structures, securing appropriate stakeholders, and school contextual factors. This research offers a process framework that others developing school-based wellness policies may use with appropriate modifications based on local environments. © 2014, American School Health Association.
Developing Cultural Literacy through the Writing Process: Empowering All Learners.
ERIC Educational Resources Information Center
Palmer, Barbara C.; And Others
Combining the expansion of cultural literacy with the development of process-based writing, this book addresses each stage of the writing process, with emphasis on the recursive and overlapping nature of these stages. Numerous related model activities at the end of each chapter show how to develop the writing process, while expanding the writer's…
Collaboration in Global Software Engineering Based on Process Description Integration
NASA Astrophysics Data System (ADS)
Klein, Harald; Rausch, Andreas; Fischer, Edward
Globalization is one of the big trends in software development. Development projects need a variety of different resources with appropriate expert knowledge to be successful. More and more of these resources are nowadays obtained from specialized organizations and countries all over the world, varying in development approaches, processes, and culture. As seen with early outsourcing attempts, collaboration may fail due to these differences. Hence, the major challenge in global software engineering is to streamline collaborating organizations towards a successful conjoint development. Based on typical collaboration scenarios, this paper presents a structured approach to integrate processes in a comprehensible way.
Managing Analysis Models in the Design Process
NASA Technical Reports Server (NTRS)
Briggs, Clark
2006-01-01
Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.
Three-dimensional motor schema based navigation
NASA Technical Reports Server (NTRS)
Arkin, Ronald C.
1989-01-01
Reactive schema-based navigation is possible in space domains by extending the methods developed for ground-based navigation found within the Autonomous Robot Architecture (AuRA). Reformulation of two-dimensional motor schemas for three-dimensional applications is a straightforward process. The manifold advantages of schema-based control persist, including modular development, amenability to distributed processing, and responsiveness to environmental sensing. Simulation results show the feasibility of this methodology for space docking operations in a cluttered work area.
A KPI-based process monitoring and fault detection framework for large-scale processes.
Zhang, Kai; Shardt, Yuri A W; Chen, Zhiwen; Yang, Xu; Ding, Steven X; Peng, Kaixiang
2017-05-01
Large-scale processes, consisting of multiple interconnected subprocesses, are commonly encountered in industrial systems, whose performance needs to be determined. A common approach to this problem is to use a key performance indicator (KPI)-based approach. However, the different KPI-based approaches are not developed with a coherent and consistent framework. Thus, this paper proposes a framework for KPI-based process monitoring and fault detection (PM-FD) for large-scale industrial processes, which considers the static and dynamic relationships between process and KPI variables. For the static case, a least squares-based approach is developed that provides an explicit link with least-squares regression, which gives better performance than partial least squares. For the dynamic case, using the kernel representation of each subprocess, an instrument variable is used to reduce the dynamic case to the static case. This framework is applied to the TE benchmark process and the hot strip mill rolling process. The results show that the proposed method can detect faults better than previous methods. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
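A minimal sketch of the static case might look like the following: fit a least-squares map from process variables to the KPI, then flag faults when the squared residual exceeds a threshold estimated from fault-free data. All data, the threshold rule, and function names are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np

def fit_kpi_model(X, y):
    """Static KPI model via least squares: y ~ X @ theta."""
    theta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return theta

def detect_fault(x_new, y_new, theta, threshold):
    """Flag a KPI-relevant fault when the squared residual exceeds threshold."""
    residual = y_new - x_new @ theta
    return residual ** 2 > threshold

# Hypothetical fault-free data: process variables X and a KPI y
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = X @ np.array([1.0, 0.5, 0.0, -0.3]) + 0.05 * rng.normal(size=500)
theta = fit_kpi_model(X, y)
# Threshold from training residuals (e.g., the 99th percentile)
threshold = np.percentile((y - X @ theta) ** 2, 99)
print(detect_fault(X[0], y[0] + 1.0, theta, threshold))  # injected fault -> True
```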
Strategic Planning: A (Site) Sight-Based Approach to Curriculum and Staff Development.
ERIC Educational Resources Information Center
Johnson, Daniel P.
The purpose of (Colorado's) Clear Creek School District's strategic planning process has been to develop basic district-wide parameters to promote instructional improvement through a process of shared leadership. The approach is termed "sight-based" to indicate the school district's commitment to connecting curriculum and…
Gong, Xing-Chu; Chen, Teng; Qu, Hai-Bin
2017-03-01
The quality by design (QbD) concept is an advanced pharmaceutical quality control concept. The application of the QbD concept in the research and development of pharmaceutical processes of traditional Chinese medicines (TCM) mainly contains five parts, including the definition of critical processes and their evaluation criteria, the determination of critical process parameters and critical material attributes, the establishment of quantitative models, the development of design space, as well as the application and continuous improvement of control strategy. In this work, recent research advances in QbD concept implementation methods in the secondary development of Chinese patent medicines were reviewed, and five promising fields for the implementation of the QbD concept were pointed out, including the research and development of TCM new drugs and Chinese medicine granules for formulation, modeling of pharmaceutical processes, development of control strategy based on industrial big data, strengthening the research of process amplification rules, and the development of new pharmaceutical equipment. Copyright © by the Chinese Pharmaceutical Association.
The jABC Approach to Rigorous Collaborative Development of SCM Applications
NASA Astrophysics Data System (ADS)
Hörmann, Martina; Margaria, Tiziana; Mender, Thomas; Nagel, Ralf; Steffen, Bernhard; Trinh, Hong
Our approach to the model-driven collaborative design of IKEA's P3 Delivery Management Process uses the jABC [9] for model driven mediation and choreography to complement a RUP-based (Rational Unified Process) development process. jABC is a framework for service development based on Lightweight Process Coordination. Users (product developers and system/software designers) easily develop services and applications by composing reusable building-blocks into (flow-) graph structures that can be animated, analyzed, simulated, verified, executed, and compiled. This way of handling the collaborative design of complex embedded systems has proven to be effective and adequate for the cooperation of non-programmers and non-technical people, which is the focus of this contribution, and it is now being rolled out in the operative practice.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simpson, L.
ITN Energy Systems, Inc., and Global Solar Energy, Inc., with the assistance of NREL's PV Manufacturing R&D program, have continued the advancement of CIGS production technology through the development of trajectory-oriented predictive/control models, fault-tolerance control, control-platform development, in-situ sensors, and process improvements. Modeling activities to date include the development of physics-based and empirical models for CIGS and sputter-deposition processing, implementation of model-based control, and application of predictive models to the construction of new evaporation sources and for control. Model-based control is enabled through implementation of reduced or empirical models into a control platform. Reliability improvement activities include implementation of preventive maintenance schedules; detection of failed sensors/equipment and reconfiguration to continue processing; and systematic development of fault prevention and reconfiguration strategies for the full range of CIGS PV production deposition processes. In-situ sensor development activities have resulted in improved control and indicated the potential for enhanced process status monitoring and control of the deposition processes. Substantial process improvements have been made, including significant improvement in CIGS uniformity, thickness control, efficiency, yield, and throughput. In large measure, these gains have been driven by process optimization, which, in turn, has been enabled by control and reliability improvements due to this PV Manufacturing R&D program. This has resulted in substantial improvements in flexible CIGS PV module performance and efficiency.
The Use of Intervention Mapping to Develop a Tailored Web-Based Intervention, Condom-HIM
2017-01-01
Background Many HIV (human immunodeficiency virus) prevention interventions are currently being implemented and evaluated, with little information published on their development. A framework highlighting the method of development of an intervention can be used by others wanting to replicate interventions or develop similar interventions to suit other contexts and settings. It provides researchers with a comprehensive development process of the intervention. Objective The objective of this paper was to describe how a systematic approach, intervention mapping, was used to develop a tailored Web-based intervention to increase condom use among HIV-positive men who have sex with men. Methods The intervention was developed in consultation with a multidisciplinary team composed of academic researchers, community members, Web designers, and the target population. Intervention mapping involved a systematic process of 6 steps: (1) needs assessment; (2) identification of proximal intervention objectives; (3) selection of theory-based intervention methods and practical strategies; (4) development of intervention components and materials; (5) adoption, implementation, and maintenance; and (6) evaluation planning. Results The application of intervention mapping resulted in the development of a tailored Web-based intervention for HIV-positive men who have sex with men, called Condom-HIM. Conclusions Using intervention mapping as a systematic process to develop interventions is a feasible approach that specifically integrates the use of theory and empirical findings. Outlining the process used to develop a particular intervention provides clarification on the conceptual use of experimental interventions in addition to potentially identifying reasons for intervention failures. PMID:28428162
NASA Astrophysics Data System (ADS)
Skersys, Tomas; Butleris, Rimantas; Kapocius, Kestutis
2013-10-01
Approaches for the analysis and specification of business vocabularies and rules are highly relevant topics in both the Business Process Management and Information Systems Development disciplines. However, in the common practice of Information Systems Development, business modeling activities are still mostly empirical in nature. In this paper, basic aspects of an approach for the semi-automated extraction of business vocabularies from business process models are presented. The approach is based on the novel business modeling-level OMG standards "Business Process Model and Notation" (BPMN) and "Semantics for Business Vocabularies and Business Rules" (SBVR), thus contributing to OMG's vision of Model-Driven Architecture (MDA) and to model-driven development in general.
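As a hedged illustration of the extraction idea (not the paper's BPMN-to-SBVR mapping), the following sketch pulls element names from a BPMN 2.0 XML file as candidate business-vocabulary terms; the file name is hypothetical.

```python
import xml.etree.ElementTree as ET

BPMN_NS = "http://www.omg.org/spec/BPMN/20100524/MODEL"

def extract_candidate_terms(bpmn_file):
    """Collect task/data-object names from a BPMN model as candidate
    business-vocabulary entries (a crude first pass, not the full
    BPMN-to-SBVR mapping described in the paper)."""
    tree = ET.parse(bpmn_file)
    terms = set()
    for tag in ("task", "userTask", "serviceTask", "dataObject"):
        for el in tree.iter(f"{{{BPMN_NS}}}{tag}"):
            name = el.get("name")
            if name:
                terms.add(name.strip())
    return sorted(terms)

# Hypothetical usage with an example model file:
# print(extract_candidate_terms("order_process.bpmn"))
```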
Development of modified FT (MFT) process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jinglai Zhou; Zhixin Zhang; Wenjie Shen
1995-12-31
A two-stage Modified FT (MFT) process has been developed for producing high-octane gasoline from coal-based syngas. The main R&D efforts focused on the development of catalysts and process technologies. Duration tests were completed in a single-tube reactor, a pilot plant (100 T/Y), and an industrial demonstration plant (2000 T/Y). A series of satisfactory results has been obtained in terms of operating reliability of equipment, performance of catalysts, purification of coal-based syngas, optimum operating conditions, properties of gasoline, and economics. A further scaled-up commercial plant is being considered.
NASA Technical Reports Server (NTRS)
VanDalsem, William R.; Livingston, Mary E.; Melton, John E.; Torres, Francisco J.; Stremel, Paul M.
1999-01-01
Continuous improvement of aerospace product development processes is a driving requirement across much of the aerospace community. As up to 90% of the cost of an aerospace product is committed during the first 10% of the development cycle, there is a strong emphasis on capturing, creating, and communicating better information (both requirements and performance) early in the product development process. The community has responded by pursuing the development of computer-based systems designed to enhance the decision-making capabilities of product development individuals and teams. Recently, the historical foci on sharing the geometrical representation and on configuration management are being augmented: Physics-based analysis tools for filling the design space database; Distributed computational resources to reduce response time and cost; Web-based technologies to relieve machine-dependence; and Artificial intelligence technologies to accelerate processes and reduce process variability. Activities such as the Advanced Design Technologies Testbed (ADTT) project at NASA Ames Research Center study the strengths and weaknesses of the technologies supporting each of these trends, as well as the overall impact of the combination of these trends on a product development event. Lessons learned and recommendations for future activities will be reported.
Design and development of data acquisition system based on WeChat hardware
NASA Astrophysics Data System (ADS)
Wang, Zhitao; Ding, Lei
2018-06-01
The data acquisition system based on WeChat hardware provides methods for the popularization and practical use of data acquisition. The whole system is based on the WeChat hardware platform, where the hardware part is developed on a DA14580 development board and the software part is based on Alibaba Cloud. We designed a service module, a logic processing module, a data processing module, and a database module. The communication between hardware and software uses the AirSync protocol. We tested this system by collecting temperature and humidity data, and the result shows that the system can acquire temperature and humidity in real time according to the settings.
Testing Strategies for Model-Based Development
NASA Technical Reports Server (NTRS)
Heimdahl, Mats P. E.; Whalen, Mike; Rajan, Ajitha; Miller, Steven P.
2006-01-01
This report presents an approach for testing artifacts generated in a model-based development process. This approach divides the traditional testing process into two parts: requirements-based testing (validation testing) which determines whether the model implements the high-level requirements and model-based testing (conformance testing) which determines whether the code generated from a model is behaviorally equivalent to the model. The goals of the two processes differ significantly and this report explores suitable testing metrics and automation strategies for each. To support requirements-based testing, we define novel objective requirements coverage metrics similar to existing specification and code coverage metrics. For model-based testing, we briefly describe automation strategies and examine the fault-finding capability of different structural coverage metrics using tests automatically generated from the model.
ERIC Educational Resources Information Center
Bruton, Anthony
2005-01-01
Process writing and communicative-task-based instruction both assume productive tasks that prompt self-expression to motivate students and as the principal engine for developing L2 proficiency in the language classroom. Besides this, process writing and communicative-task-based instruction have much else in common, despite some obvious…
Technical Potential Assessment for the Renewable Energy Zone (REZ) Process: A GIS-Based Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Nathan; Roberts, Billy J
Geographic Information Systems (GIS)-based energy resource and technical potential assessments identify areas capable of supporting high levels of renewable energy (RE) development as part of a Renewable Energy Zone (REZ) Transmission Planning process. This document expands on the REZ Process to aid practitioners in conducting GIS-based RE resource and technical potential assessments. The REZ process is an approach to plan, approve, and build transmission infrastructure that connects REZs - geographic areas that have high-quality RE resources, suitable topography and land-use designations, and demonstrated developer interest - to the power system. The REZ process helps to increase the share of solar photovoltaic (PV), wind, and other resources while also maintaining reliability and economics.
A Tutorial Programme to Enhance Psychiatry Learning Processes within a PBL-Based Course
ERIC Educational Resources Information Center
Hood, Sean; Chapman, Elaine
2011-01-01
This paper describes a tutorial programme developed at the University of Western Australia (UWA) to enhance medical students' learning processes within problem-based learning contexts. The programme encourages students to use more effective learning approaches by scaffolding the development of effective problem-solving strategies, and by reducing…
Deep Learning towards Expertise Development in a Visualization-Based Learning Environment
ERIC Educational Resources Information Center
Yuan, Bei; Wang, Minhong; Kushniruk, Andre W.; Peng, Jun
2017-01-01
With limited problem-solving capability and practical experience, novices have difficulties developing expert-like performance. It is important to make the complex problem-solving process visible to learners and provide them with necessary help throughout the process. This study explores the design and effects of a model-based learning approach…
2012-01-01
Background Optimization of the clinical care process by integration of evidence-based knowledge is one of the active components in care pathways. When studying the impact of a care pathway by using a cluster-randomized design, standardization of the care pathway intervention is crucial. This methodology paper describes the development of the clinical content of an evidence-based care pathway for in-hospital management of chronic obstructive pulmonary disease (COPD) exacerbation in the context of a cluster-randomized controlled trial (cRCT) on care pathway effectiveness. Methods The clinical content of a care pathway for COPD exacerbation was developed based on recognized process design and guideline development methods. Subsequently, based on the COPD case study, a generalized eight-step method was designed to support the development of the clinical content of an evidence-based care pathway. Results A set of 38 evidence-based key interventions and a set of 24 process and 15 outcome indicators were developed in eight different steps. Nine Belgian multidisciplinary teams piloted both the set of key interventions and indicators. The key intervention set was judged by the teams as being valid and clinically applicable. In addition, the pilot study showed that the indicators were feasible for the involved clinicians and patients. Conclusions The set of 38 key interventions and the set of process and outcome indicators were found to be appropriate for the development and standardization of the clinical content of the COPD care pathway in the context of a cRCT on pathway effectiveness. The developed eight-step method may facilitate multidisciplinary teams caring for other patient populations in designing the clinical content of their future care pathways. PMID:23190552
NASA Astrophysics Data System (ADS)
Nilasari, Yoni; Dasining
2018-04-01
In this era of globalization, every human resource is faced with a competitive climate that has a major impact on the development of the business and industrial sector. It is therefore deemed necessary to research the development of a curriculum based on the INQF and the business/industries sector in order to improve competence in sewing techniques among Vocational High School students in the fashion clothing program. The development of a curriculum based on the INQF and the business/industries sector is an activity to produce a curriculum that suits the needs of the business and industries sector. The formulation of the problem in this research is: (1) what is a curriculum based on the INQF and the business/industries sector?; (2) what are the process and procedure of developing a fashion program curriculum based on the INQF and the business/industries sector?; and (3) what is the resulting fashion expertise curriculum based on the INQF and the business/industries sector? The aims of the research are: (1) to explain what is meant by a curriculum based on the INQF and the business/industries sector; (2) to describe the process and procedure of developing a fashion program curriculum based on the INQF and the business/industries sector; and (3) to present the resulting clothing expertise curriculum based on the INQF and the business/industries sector. The research method chosen for developing the curriculum is the 4-D model of Thiagarajan, which includes: (1) define; (2) design; (3) development; and (4) disseminate. Step 4 was not carried out in this study. The results show that: (1) a curriculum based on the INQF and the business/industries sector is a curriculum created by applying the principles and procedures of the Indonesian National Qualification Framework (INQF), which will improve the quality of level-2 Vocational High School graduates, and by establishing cooperation with business/industries as guest teachers (counselors) in the learning process; (2) the development process and procedure comprise several stages: a feasibility and requirements study, preparation of an initial concept of curriculum planning based on the INQF and the business/industries sector in the field of fashion, and development of a plan to implement the curriculum; this development produces a fashion proficiency curriculum in the form of sewing technology learning competencies, where the learning implementer (counselor) is a guest teacher from the business/industries sector; and (3) the validity aspect of the learning device earns an average score of 3.5 (very valid criteria), and the practicality aspect obtains an average score of 3.3 (practical criteria).
Model-based query language for analyzing clinical processes.
Barzdins, Janis; Barzdins, Juris; Rencis, Edgars; Sostaks, Agris
2013-01-01
Nowadays, large databases of clinical process data exist in hospitals. However, these data are rarely used to their full extent. In order to perform queries on hospital processes, one must either choose from predefined queries or develop queries using an MS Excel-type software system, which is not always a trivial task. In this paper we propose a new query language for analyzing clinical processes that is easily comprehensible also to non-IT professionals. We develop this language based on a process modeling language which is also described in this paper. Prototypes of both languages have already been verified using real examples from hospitals.
New Vistas in Chemical Product and Process Design.
Zhang, Lei; Babi, Deenesh K; Gani, Rafiqul
2016-06-07
Design of chemicals-based products is broadly classified into those that are process centered and those that are product centered. In this article, the designs of both classes of products are reviewed from a process systems point of view; developments related to the design of the chemical product, its corresponding process, and its integration are highlighted. Although significant advances have been made in the development of systematic model-based techniques for process design (also for optimization, operation, and control), much work is needed to reach the same level for product design. Timeline diagrams illustrating key contributions in product design, process design, and integrated product-process design are presented. The search for novel, innovative, and sustainable solutions must be matched by consideration of issues related to the multidisciplinary nature of problems, the lack of data needed for model development, solution strategies that incorporate multiscale options, and reliability versus predictive power. The need for an integrated model-experiment-based design approach is discussed together with benefits of employing a systematic computer-aided framework with built-in design templates.
NASA Astrophysics Data System (ADS)
Telipenko, E.; Chernysheva, T.; Zakharova, A.; Dumchev, A.
2015-10-01
The article presents research results on the development of a knowledge base for an intellectual information system for enterprise bankruptcy risk assessment. The process of developing the knowledge base is analyzed; the main stages, some problems, and their solutions are given. The article introduces a connectionist model for bankruptcy risk assessment based on the analysis of the financial accounts of industrial enterprises. The basis for this connectionist model is a three-layer perceptron trained with the error back-propagation algorithm. The knowledge base for the intellectual information system consists of the processed information and the processing method represented as the connectionist model. The article presents the structure of the intellectual information system, the knowledge base, and the information processing algorithm for neural network training. The paper gives mean values of 10 indexes for industrial enterprises; with their help it is possible to carry out a financial analysis of industrial enterprises and correctly identify the current situation for well-timed managerial decisions. Results are reported for neural network testing on data from both bankrupt and financially strong enterprises that were not included in the training and test sets.
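A minimal sketch of such a three-layer perceptron with error back-propagation is shown below, with 10 inputs standing in for the financial indexes. The architecture size, learning rate, and synthetic data are assumptions for illustration, not the authors' trained network.

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_hidden = 10, 8            # 10 financial indexes per enterprise
W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
W2 = rng.normal(scale=0.5, size=(n_hidden, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_step(X, y, lr=0.1):
    """One epoch of back-propagation on a 10-8-1 perceptron."""
    global W1, W2
    h = sigmoid(X @ W1)                 # hidden layer
    out = sigmoid(h @ W2)               # bankruptcy-risk estimate in (0, 1)
    delta_out = (out - y) * out * (1 - out)   # output-layer error signal
    grad2 = h.T @ delta_out
    delta_h = (delta_out @ W2.T) * h * (1 - h)  # back-propagated error
    grad1 = X.T @ delta_h
    W2 -= lr * grad2
    W1 -= lr * grad1

# Hypothetical training data: rows = enterprises, label 1 = bankrupt
X = rng.normal(size=(200, n_in))
y = (X[:, :3].sum(axis=1, keepdims=True) > 0).astype(float)
for _ in range(500):
    train_step(X, y)
```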
NASA Astrophysics Data System (ADS)
Utanto, Yuli; Widhanarto, Ghanis Putra; Maretta, Yoris Adi
2017-03-01
This study aims to develop a web-based portfolio model. The model developed in this study reveals the effectiveness of the new model in experiments conducted with research respondents in the Department of Curriculum and Educational Technology, FIP Unnes. In particular, the further research objectives to be achieved through this development research are: (1) describing the process of implementing a portfolio in a web-based model; (2) assessing the effectiveness of the web-based portfolio model for the final task, especially in Web-Based Learning courses. This is development research; Borg and Gall (2008: 589) state that "educational research and development (R & D) is a process used to develop and validate educational products". The series of research and development steps carried out started with exploration and conceptual studies, followed by testing and evaluation, and also implementation. For the data analysis, the techniques used were simple descriptive analysis and analysis of learning completeness, followed by prerequisite tests for normality and homogeneity before performing a t-test. Based on the data analysis, it was concluded that: (1) a web-based portfolio model can be applied to the learning process in higher education; (2) regarding the effectiveness of the web-based portfolio model, field data from the respondents of the large-group trial (field trial) show that the number of respondents who reached mastery learning (a score of 60 and above) was 24 (92.3%), which indicates that the web-based portfolio model is effective. The conclusion of this study is that the web-based portfolio model is effective. As an implication of this development research, future researchers are expected to use the guidelines of the development model based on the research already conducted to develop models for other subjects.
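As an illustration of the reported analysis pipeline (mastery share, normality and homogeneity checks, then a t-test), here is a short sketch with synthetic scores; the data and the paired-test choice are assumptions, not the study's actual values.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
pretest  = rng.normal(55, 10, size=26)   # hypothetical scores
posttest = rng.normal(70, 10, size=26)   # after using the web-based portfolio

# Prerequisite tests: normality (Shapiro-Wilk) and homogeneity (Levene)
print("normality p:", stats.shapiro(pretest).pvalue, stats.shapiro(posttest).pvalue)
print("homogeneity p:", stats.levene(pretest, posttest).pvalue)

# Learning completeness: share of respondents scoring 60 or above
mastery = (posttest >= 60).mean()
print(f"mastery: {mastery:.1%}")

# Paired t-test comparing pre- and post-test scores
print("t-test:", stats.ttest_rel(pretest, posttest))
```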
NASA Technical Reports Server (NTRS)
Basili, Victor R.
1992-01-01
The concepts of quality improvement have permeated many businesses. It is clear that the nineties will be the quality era for software, and there is a growing need to develop or adapt quality improvement approaches to the software business. Thus we must understand software as an artifact and software as a business. Since the business we are dealing with is software, we must understand the nature of software and software development. The software discipline is evolutionary and experimental; it is a laboratory science. Software is development, not production. The technologies of the discipline are human based. There is a lack of models that allow us to reason about the process and the product. All software is not the same; process is a variable, goals are variable, etc. Packaged, reusable experiences require additional resources in the form of organization, processes, people, etc. A variety of organizational frameworks have been proposed to improve quality for various businesses. The ones discussed in this presentation include: Plan-Do-Check-Act, a quality improvement process based upon a feedback cycle for optimizing a single process model/production line; the Experience Factory/Quality Improvement Paradigm, continuous improvement through the experimentation, packaging, and reuse of experiences based upon a business's needs; Total Quality Management, a management approach to long-term success through customer satisfaction based on the participation of all members of an organization; the SEI Capability Maturity Model, a staged process improvement based upon assessment with regard to a set of key process areas until level 5 is reached, which represents continuous process improvement; and Lean (software) Development, a principle supporting the concentration of production on 'value-added' activities and the elimination or reduction of 'non-value-added' activities.
NASA Astrophysics Data System (ADS)
Liu, Z.; LU, G.; He, H.; Wu, Z.; He, J.
2017-12-01
Seasonal pluvial-drought transition processes are unique natural phenomena. To explore possible mechanisms, we considered Southwest China (SWC) as the study region and comprehensively investigated the temporal evolution of large-scale and regional atmospheric variables with the simple method of Standardized Anomalies (SA). Some key results include: (1) The net vertical integral of water vapour flux (VIWVF) across the four boundaries may be a feasible indicator of pluvial-drought transition processes over SWC, because its SA-based index is almost consistent with process development. (2) The vertical SA-based patterns of regional horizontal divergence (D) and vertical motion (ω) also coincide well with the pluvial-drought transition processes, and the SA-based index of regional D shows a relatively high correlation with the identified processes over SWC. (3) With respect to large-scale circulation anomalies, a well-organized Eurasian Pattern is one important feature during the pluvial-drought transition over SWC. (4) To explore the possibility of simulating drought development using previous pluvial anomalies, large-scale and regional atmospheric SA-based indices were used. As a whole, when SA-based indices of regional dynamic and water-vapor variables are introduced, simulations of drought development based only on large-scale anomalies can be improved considerably. (5) Finally, pluvial-drought transition processes and associated regional atmospheric anomalies over nine Chinese drought study regions were investigated. With respect to regional D, vertically single or double "upper-positive-lower-negative" and "upper-negative-lower-positive" patterns are the most common vertical SA-based patterns during the pluvial and drought parts of the transition processes, respectively.
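The SA method itself is simple to state: subtract each calendar month's climatological mean and divide by its standard deviation. A sketch under assumed synthetic data (the variable, record length, and seasonal cycle are illustrative):

```python
import numpy as np

def standardized_anomalies(series, months):
    """Standardized Anomalies: remove each calendar month's climatological
    mean and divide by its standard deviation.

    series : 1-D array of a regional variable (e.g. net VIWVF across the
             region's boundaries), one value per month.
    months : 1-D array of calendar-month labels (1..12) matching `series`.
    """
    sa = np.empty_like(series, dtype=float)
    for m in range(1, 13):
        sel = months == m
        clim_mean, clim_std = series[sel].mean(), series[sel].std()
        sa[sel] = (series[sel] - clim_mean) / clim_std
    return sa

# Hypothetical 30-year monthly record with a seasonal cycle plus noise
rng = np.random.default_rng(3)
months = np.tile(np.arange(1, 13), 30)
viwvf = 10 * np.sin(2 * np.pi * (months - 1) / 12) + rng.normal(size=months.size)
index = standardized_anomalies(viwvf, months)
```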
Soft sensor modeling based on variable partition ensemble method for nonlinear batch processes
NASA Astrophysics Data System (ADS)
Wang, Li; Chen, Xiangguang; Yang, Kai; Jin, Huaiping
2017-01-01
Batch processes are always characterized by nonlinearity and system uncertainty; therefore, a conventional single model may be ill-suited. A local learning soft sensor based on a variable partition ensemble method is developed for the quality prediction of nonlinear and non-Gaussian batch processes. A set of input variable sets is obtained by bootstrapping and the PMI criterion. Then, multiple local GPR models are developed, one for each local input variable set. When a new test sample arrives, the posterior probability of each best-performing local model is estimated based on Bayesian inference and used to combine these local GPR models to obtain the final prediction result. The proposed soft sensor is demonstrated by application to an industrial fed-batch chlortetracycline fermentation process.
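A simplified sketch of the ensemble idea follows: bootstrap local training sets and variable subsets, fit one GPR per subset, and weight predictions by a Gaussian similarity to each model's training data. The similarity-based weighting is a stand-in assumption for the paper's Bayesian posterior estimation, and all data are synthetic.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(4)
X = rng.normal(size=(300, 5))                       # batch-process variables
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.05 * rng.normal(size=300)

# Bootstrap several local input-variable sets and fit one GPR per set
n_models, models, subsets = 5, [], []
for _ in range(n_models):
    idx = rng.choice(len(X), size=100, replace=True)  # bootstrap sample
    cols = rng.choice(5, size=3, replace=False)       # variable subset
    gpr = GaussianProcessRegressor().fit(X[idx][:, cols], y[idx])
    models.append(gpr)
    subsets.append((idx, cols))

def ensemble_predict(x_new):
    """Combine local GPR predictions, weighting each model by a Gaussian
    similarity between x_new and its bootstrap training data (a simplified
    stand-in for the paper's Bayesian posterior weighting)."""
    preds, weights = [], []
    for gpr, (idx, cols) in zip(models, subsets):
        preds.append(gpr.predict(x_new[None, cols])[0])
        d = np.linalg.norm(X[idx][:, cols] - x_new[cols], axis=1).min()
        weights.append(np.exp(-d ** 2))
    w = np.array(weights) / np.sum(weights)
    return float(np.dot(w, preds))

print(ensemble_predict(rng.normal(size=5)))
```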
NASA Astrophysics Data System (ADS)
Acharya, Ranadip; Das, Suman
2015-09-01
This article describes additive manufacturing (AM) of IN100, a high gamma-prime nickel-based superalloy, through scanning laser epitaxy (SLE), aimed at the creation of thick deposits onto like-chemistry substrates for enabling repair of turbine engine hot-section components. SLE is a metal powder bed-based laser AM technology developed for nickel-base superalloys with equiaxed, directionally solidified, and single-crystal microstructural morphologies. Here, we combine process modeling, statistical design-of-experiments (DoE), and microstructural characterization to demonstrate fully metallurgically bonded, crack-free and dense deposits exceeding 1000 μm of SLE-processed IN100 powder onto IN100 cast substrates produced in a single pass. A combined thermal-fluid flow-solidification model of the SLE process compliments DoE-based process development. A customized quantitative metallography technique analyzes digital cross-sectional micrographs and extracts various microstructural parameters, enabling process model validation and process parameter optimization. Microindentation measurements show an increase in the hardness by 10 pct in the deposit region compared to the cast substrate due to microstructural refinement. The results illustrate one of the very few successes reported for the crack-free deposition of IN100, a notoriously "non-weldable" hot-section alloy, thus establishing the potential of SLE as an AM method suitable for hot-section component repair and for future new-make components in high gamma-prime containing crack-prone nickel-based superalloys.
Fleisher, Linda; Ruggieri, Dominique G.; Miller, Suzanne M.; Manne, Sharon; Albrecht, Terrance; Buzaglo, Joanne; Collins, Michael A.; Katz, Michael; Kinzy, Tyler G.; Liu, Tasnuva; Manning, Cheri; Charap, Ellen Specker; Millard, Jennifer; Miller, Dawn M.; Poole, David; Raivitch, Stephanie; Roach, Nancy; Ross, Eric A.; Meropol, Neal J.
2014-01-01
Objective This article describes the rigorous development process and initial feedback of the PRE-ACT (Preparatory Education About Clinical Trials) Web-based intervention designed to improve preparation for decision making in cancer clinical trials. Methods The multi-step process included stakeholder input, formative research, user testing, and feedback. Diverse teams (researchers, advocates, and developers) participated in content refinement, identification of actors, and development of video scripts. Patient feedback was provided in the final production period and through a vanguard group (N = 100) from the randomized trial. Results Patients/advocates confirmed barriers to cancer clinical trial participation, including lack of awareness and knowledge, fear of side effects, logistical concerns, and mistrust. Patients indicated they liked the tool's user-friendly nature, the organized and comprehensive presentation of the subject matter, and the clarity of the videos. Conclusion The development process serves as an example of operationalizing best practice approaches and highlights the value of a multi-disciplinary team in developing a theory-based, sophisticated tool that patients found useful in their decision-making process. Practice implications Best practice approaches can be addressed and are important to ensure evidence-based tools that are of value to patients; this supports the usefulness of a process map in the development of e-health tools. PMID:24813474
Implementation of a Web-Based Collaborative Process Planning System
NASA Astrophysics Data System (ADS)
Wang, Huifen; Liu, Tingting; Qiao, Li; Huang, Shuangxi
Under the networked manufacturing environment, all phases of product manufacturing, involving design, process planning, machining, and assembly, may be accomplished collaboratively by different enterprises; even different manufacturing stages of the same part may be finished collaboratively by different enterprises. Based on the self-developed networked manufacturing platform eCWS (e-Cooperative Work System), a multi-agent-based system framework for collaborative process planning is proposed. In accordance with the requirements of collaborative process planning, shared resources provided by cooperating enterprises in the course of collaboration are classified into seven classes. A reconfigurable and extendable resource object model is then built. Decision-making strategy is also studied in this paper. Finally, a collaborative process planning system, e-CAPP, is developed and applied. It provides strong support for distributed designers to collaboratively plan and optimize product processes through the network.
Measurement-based reliability/performability models
NASA Technical Reports Server (NTRS)
Hsueh, Mei-Chen
1987-01-01
Measurement-based models based on real error-data collected on a multiprocessor system are described. Model development from the raw error-data to the estimation of cumulative reward is also described. A workload/reliability model is developed based on low-level error and resource usage data collected on an IBM 3081 system during its normal operation in order to evaluate the resource usage/error/recovery process in a large mainframe system. Thus, both normal and erroneous behavior of the system are modeled. The results provide an understanding of the different types of errors and recovery processes. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A sensitivity analysis is performed to investigate the significance of using a semi-Markov process, as opposed to a Markov process, to model the measured system.
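To illustrate why the semi-Markov distinction matters, the sketch below simulates a small error/recovery model in which holding times are Weibull rather than exponential. The states, transition probabilities, and shape parameters are invented for illustration, not the measured IBM 3081 values.

```python
import numpy as np

rng = np.random.default_rng(5)
states = ["normal", "error", "recovery"]
# Transition probabilities of the embedded chain (hypothetical values)
P = {"normal":   [0.0, 1.0, 0.0],
     "error":    [0.2, 0.0, 0.8],
     "recovery": [1.0, 0.0, 0.0]}

def holding_time(state):
    """Non-exponential holding times make the process semi-Markov;
    Weibull shapes != 1 mimic the measured non-exponential behavior."""
    shape = {"normal": 0.7, "error": 0.5, "recovery": 1.5}[state]
    return rng.weibull(shape)

def simulate(t_end=1000.0):
    """Accumulate time spent in each state up to t_end."""
    t, state, time_in = 0.0, "normal", dict.fromkeys(states, 0.0)
    while t < t_end:
        dt = holding_time(state)
        time_in[state] += dt
        t += dt
        state = states[rng.choice(3, p=P[state])]
    return time_in

print(simulate())
```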
Liu, Xin; Fatehi, Pedram; Ni, Yonghao
2012-07-01
A process for removing inhibitors from pre-hydrolysis liquor (PHL) of a kraft-based dissolving pulp production process by adsorption and flocculation, and the characteristics of this process were studied. In this process, industrially produced PHL was treated with unmodified and oxidized activated carbon as an absorbent and polydiallyldimethylammonium chloride (PDADMAC) as a flocculant. The overall removal of lignin and furfural in the developed process was 83.3% and 100%, respectively, while that of hemicelluloses was 32.7%. These results confirmed that the developed process can remove inhibitors from PHL prior to producing value-added products, e.g. ethanol and xylitol via fermentation. Copyright © 2012 Elsevier Ltd. All rights reserved.
Corbett, Teresa; Walsh, Jane C; Groarke, AnnMarie; Moss-Morris, Rona; Morrissey, Eimear; McGuire, Brian E
2017-07-04
Cancer-related fatigue (CrF) is the most common and disruptive symptom experienced by cancer survivors. We aimed to develop a theory-based, interactive Web-based intervention designed to facilitate self-management and enhance coping with CrF following cancer treatment. The aim of our study was to outline the rationale, decision-making processes, methods, and findings which led to the development of a Web-based intervention to be tested in a feasibility trial. This paper outlines the process and method of development of the intervention. An extensive review of the literature and qualitative research was conducted to establish a therapeutic approach for this intervention, based on theory. The psychological principles used in the development process are outlined, and we also clarify hypothesized causal mechanisms. We describe decision-making processes involved in the development of the content of the intervention, input from the target patient group and stakeholders, the design of the website features, and the initial user testing of the website. The cocreation of the intervention with the experts and service users allowed the design team to ensure that an acceptable intervention was developed. This evidence-based Web-based program is the first intervention of its kind based on self-regulation model theory, with the primary aim of targeting the representations of fatigue and enhancing self-management of CrF, specifically. This research sought to integrate psychological theory, existing evidence of effective interventions, empirically derived principles of Web design, and the views of potential users into the systematic planning and design of the intervention of an easy-to-use website for cancer survivors. ©Teresa Corbett, Jane C Walsh, AnnMarie Groarke, Rona Moss-Morris, Eimear Morrissey, Brian E McGuire. Originally published in JMIR Cancer (http://cancer.jmir.org), 04.07.2017.
Testing the Digital Thread in Support of Model-Based Manufacturing and Inspection
Hedberg, Thomas; Lubell, Joshua; Fischer, Lyle; Maggiano, Larry; Feeney, Allison Barnard
2016-01-01
A number of manufacturing companies have reported anecdotal evidence describing the benefits of Model-Based Enterprise (MBE). Based on this evidence, major players in industry have embraced a vision to deploy MBE. In our view, the best chance of realizing this vision is the creation of a single “digital thread.” Under MBE, there exists a Model-Based Definition (MBD), created by the Engineering function, that downstream functions reuse to complete Model-Based Manufacturing and Model-Based Inspection activities. The ensemble of data that enables the combination of model-based definition, manufacturing, and inspection defines this digital thread. Such a digital thread would enable real-time design and analysis, collaborative process-flow development, automated artifact creation, and full-process traceability in a seamless real-time collaborative development among project participants. This paper documents the strengths and weaknesses in the current, industry strategies for implementing MBE. It also identifies gaps in the transition and/or exchange of data between various manufacturing processes. Lastly, this paper presents measured results from a study of model-based processes compared to drawing-based processes and provides evidence to support the anecdotal evidence and vision made by industry. PMID:27325911
The FoReVer Methodology: A MBSE Framework for Formal Verification
NASA Astrophysics Data System (ADS)
Baracchi, Laura; Mazzini, Silvia; Cimatti, Alessandro; Tonetta, Stefano; Garcia, Gerald
2013-08-01
The need for a high level of confidence and operational integrity in critical space (software) systems is well recognized in the space industry and has been addressed so far through rigorous system and software development processes and stringent verification and validation regimes. The Model Based Space System Engineering process (MBSSE) derived in the System and Software Functional Requirement Techniques study (SSFRT) focused on the application of model-based engineering technologies to support the space system and software development processes, from mission-level requirements to software implementation through model refinements and translations. In this paper we report on our work in the ESA-funded FoReVer project, in which we aim to develop methodological, theoretical, and technological support for a systematic approach to space avionics system development in phases 0/A/B/C. FoReVer enriches the MBSSE process with contract-based formal verification of properties, at different stages from system to software, through a step-wise refinement approach, with support for a Software Reference Architecture.
NASA Astrophysics Data System (ADS)
Stolk, Machiel J.; de Jong, Onno; Bulte, Astrid M. W.; Pilot, Albert
2011-05-01
Involving teachers in early stages of context-based curriculum innovations requires a professional development programme that actively engages teachers in the design of new context-based units. This study considers the implementation of a teacher professional development framework aiming to investigate processes of professional development. The framework is based on Galperin's theory of the internalisation of actions and it is operationalised into a professional development programme to empower chemistry teachers for designing new context-based units. The programme consists of the teaching of an educative context-based unit, followed by the designing of an outline of a new context-based unit. Six experienced chemistry teachers participated in the instructional meetings and practical teaching in their respective classrooms. Data were obtained from meetings, classroom discussions, and observations. The findings indicated that teachers became only partially empowered for designing a new context-based chemistry unit. Moreover, the process of professional development leading to teachers' empowerment was not carried out as intended. It is concluded that the elaboration of the framework needs improvement. The implications for a new programme are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farmer, J; Brown, B; Bayles, B
The overall goal is to develop high-performance corrosion-resistant iron-based amorphous-metal coatings for prolonged trouble-free use in very aggressive environments: seawater & hot geothermal brines. The specific technical objectives are: (1) Synthesize Fe-based amorphous-metal coating with corrosion resistance comparable/superior to Ni-based Alloy C-22; (2) Establish processing parameter windows for applying and controlling coating attributes (porosity, density, bonding); (3) Assess possible cost savings through substitution of Fe-based material for more expensive Ni-based Alloy C-22; (4) Demonstrate practical fabrication processes; (5) Produce quality materials and data with complete traceability for nuclear applications; and (6) Develop, validate and calibrate computational models to enable life prediction and process design.
Time-based analysis of total cost of patient episodes: a case study of hip replacement.
Peltokorpi, Antti; Kujala, Jaakko
2006-01-01
Healthcare in the public and private sectors is facing increasing pressure to become more cost-effective. Time-based competition and work-in-progress have been used successfully to measure and improve the efficiency of industrial manufacturing. Seeks to address this issue. Presents a framework for time-based management of the total cost of a patient episode and applies it to the six sigma DMAIC process development approach. The framework is used to analyse hip replacement patient episodes in Päijät-Häme Hospital District in Finland, which has a catchment area of 210,000 inhabitants and performs an average of 230 hip replacements per year. The work-in-progress concept is applicable to healthcare; notably, the DMAIC process development approach can be used to analyse the total cost of patient episodes. Concludes that a framework which combines the patient-in-process and DMAIC development approaches can be used not only to analyse the total cost of a patient episode but also to improve patient process efficiency. Presents a framework that combines patient-in-process and DMAIC process development approaches, which can be used to analyse the total cost of a patient episode in order to improve patient process efficiency.
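A minimal sketch of the time-based costing idea in the abstract above: the episode cost is accumulated from the duration of each care stage times its cost rate. The stages, durations, and rates below are hypothetical, not figures from the Päijät-Häme study.

```python
# Illustrative time-based episode costing: total cost of a patient
# episode as the sum of stage durations times stage cost rates.
# Stage names, durations, and rates are invented for this sketch.

episode_stages = [
    # (stage, days in stage, cost per day in EUR)
    ("pre-operative assessment", 2, 400.0),
    ("surgery and recovery", 5, 1200.0),
    ("inpatient rehabilitation", 10, 350.0),
]

def episode_cost(stages):
    return sum(days * rate for _, days, rate in stages)

print(f"Total episode cost: {episode_cost(episode_stages):.2f} EUR")
```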
Recapturing Graphite-Based Fuel Element Technology for Nuclear Thermal Propulsion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trammell, Michael P; Jolly, Brian C; Miller, James Henry
ORNL is currently recapturing graphite-based fuel forms for Nuclear Thermal Propulsion (NTP). This effort involves research and development on materials selection, extrusion, and coating processes to produce fuel elements representative of historical ROVER and NERVA fuel. Initially, lab-scale specimens were fabricated using surrogate oxides to develop processing parameters that could be applied to full-length NTP fuel elements. Progress toward understanding the effect of these processing parameters on surrogate fuel microstructure is presented.
DEVELOPMENT OF CAPE-OPEN COMPLIANT PROCESS MODELING COMPONENTS IN MICROSOFT .NET
The CAPE-OPEN middleware standards were created to allow process modeling components (PMCs) developed by third parties to be used in any process modeling environment (PME) utilizing these standards. The CAPE-OPEN middleware specifications were based upon both Microsoft's Compone...
Regulatory agencies are confronted with a daunting task of developing fish consumption advisories for a large number of lakes and rivers with little resources. A feasible mechanism to develop region-wide fish advisories is by using a process-based mathematical model. One model of...
Regulatory agencies must develop fish consumption advisories for many lakes and rivers with limited resources. Process-based mathematical models are potentially valuable tools for developing regional fish advisories. The Regional Mercury Cycling model (R-MCM) was specifically d...
Using the Logic Model to Plan Extension and Outreach Program Development and Scholarship
ERIC Educational Resources Information Center
Corbin, Marilyn; Kiernan, Nancy Ellen; Koble, Margaret A.; Watson, Jack; Jackson, Daney
2004-01-01
In searching for a process to help program teams of campus-based faculty and field-based educators develop five-year and annual statewide program plans, cooperative extension administrators and specialists in Penn State's College of Agricultural Sciences discovered that the use of the logic model process can influence the successful design of…
Modeling and managing risk early in software development
NASA Technical Reports Server (NTRS)
Briand, Lionel C.; Thomas, William M.; Hetmanski, Christopher J.
1993-01-01
In order to improve the quality of the software development process, we need to be able to build empirical multivariate models based on data collectable early in the software process. These models need to be both useful for prediction and easy to interpret, so that remedial actions may be taken in order to control and optimize the development process. We present an automated modeling technique which can be used as an alternative to regression techniques. We show how it can be used to facilitate the identification and aid the interpretation of the significant trends which characterize 'high risk' components in several Ada systems. Finally, we evaluate the effectiveness of our technique based on a comparison with logistic regression based models.
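The logistic regression baseline mentioned in the abstract above can be made concrete with a minimal sketch: a logistic model flags "high risk" components from metrics collectable early in development. The metrics, synthetic data, and threshold here are invented for illustration, not taken from the paper.

```python
# Sketch of a logistic-regression risk classifier over early-phase
# component metrics. Features and labels are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Hypothetical early metrics per component: size, fan-out, churn.
X = rng.normal(size=(200, 3))
# Synthetic ground truth: risk driven mainly by fan-out and churn.
y = (X[:, 1] + X[:, 2] + rng.normal(scale=0.5, size=200) > 1).astype(int)

model = LogisticRegression().fit(X, y)
print("Predicted high-risk flags:", model.predict(X[:5]))
print("Coefficients (interpretable trends):", model.coef_.round(2))
```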
The Use of Intervention Mapping to Develop a Tailored Web-Based Intervention, Condom-HIM.
Miranda, Joyal; Côté, José
2017-04-19
Many HIV (human immunodeficiency virus) prevention interventions are currently being implemented and evaluated, with little information published on their development. A framework highlighting the method of development of an intervention can be used by others wanting to replicate interventions or develop similar interventions to suit other contexts and settings. It provides researchers with a comprehensive development process of the intervention. The objective of this paper was to describe how a systematic approach, intervention mapping, was used to develop a tailored Web-based intervention to increase condom use among HIV-positive men who have sex with men. The intervention was developed in consultation with a multidisciplinary team composed of academic researchers, community members, Web designers, and the target population. Intervention mapping involved a systematic process of 6 steps: (1) needs assessment; (2) identification of proximal intervention objectives; (3) selection of theory-based intervention methods and practical strategies; (4) development of intervention components and materials; (5) adoption, implementation, and maintenance; and (6) evaluation planning. The application of intervention mapping resulted in the development of a tailored Web-based intervention for HIV-positive men who have sex with men, called Condom-HIM. Using intervention mapping as a systematic process to develop interventions is a feasible approach that specifically integrates the use of theory and empirical findings. Outlining the process used to develop a particular intervention provides clarification on the conceptual use of experimental interventions in addition to potentially identifying reasons for intervention failures. ©Joyal Miranda, José Côté. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 19.04.2017.
Development of a case tool to support decision based software development
NASA Technical Reports Server (NTRS)
Wild, Christian J.
1993-01-01
A summary of the accomplishments of the research over the past year is presented. Achievements include the following: demonstrated DHC, a prototype supporting the decision based software development (DBSD) methodology, for Paramax personnel at ODU; met with Paramax personnel to discuss DBSD issues, the process of integrating DBSD and Refinery, and the porting process model; completed and submitted a paper describing the DBSD paradigm to IFIP '92; completed and presented a paper describing the approach for software reuse at the Software Reuse Workshop in April 1993; continued to extend DHC with a project agenda, a facility necessary for better project management; completed a primary draft of the re-engineering process model for porting; created a logging form to trace all the activities involved in solving the reengineering problem; and developed a preliminary chart of the problems involved in the reengineering process.
Vision-based obstacle recognition system for automated lawn mower robot development
NASA Astrophysics Data System (ADS)
Mohd Zin, Zalhan; Ibrahim, Ratnawati
2011-06-01
Digital image processing (DIP) techniques have been widely used in various types of applications recently. Classification and recognition of a specific object using a vision system require some challenging tasks in the fields of image processing and artificial intelligence. The ability and efficiency of a vision system to capture and process images is very important for any intelligent system such as an autonomous robot. This paper gives attention to the development of a vision system that could contribute to the development of an automated vision-based lawn mower robot. The work involves the implementation of DIP techniques to detect and recognize three different types of obstacles that usually exist on a football field. The focus was given to the study of different types and sizes of obstacles, the development of a vision-based obstacle recognition system, and the evaluation of the system's performance. Image processing techniques such as image filtering, segmentation, enhancement and edge detection have been applied in the system. The results have shown that the developed system is able to detect and recognize various types of obstacles on a football field with a recognition rate of more than 80%.
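A minimal sketch of the kind of DIP pipeline named above (filtering, edge detection, contour-based segmentation) is shown below. The input file name, thresholds, and the area cutoff are placeholders, not values from the paper.

```python
# Sketch of a filter -> edge-detect -> segment pipeline with OpenCV.
# "field.jpg" and all thresholds are hypothetical placeholders.
import cv2

image = cv2.imread("field.jpg")                      # hypothetical input
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
blurred = cv2.GaussianBlur(gray, (5, 5), 0)          # noise filtering
edges = cv2.Canny(blurred, 50, 150)                  # edge detection
contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
# Keep large contours as obstacle candidates for later classification.
obstacles = [c for c in contours if cv2.contourArea(c) > 500]
print(f"Detected {len(obstacles)} obstacle candidate(s)")
```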
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meyer, Howard
2010-11-30
This project met the objective to further the development of an integrated multi-contaminant removal process in which H2S, NH3, HCl and heavy metals including Hg, As, Se and Cd present in the coal-derived syngas can be removed to specified levels in a single/integrated process step. The process supports the mission and goals of the Department of Energy's Gasification Technologies Program, namely to enhance the performance of gasification systems, thus enabling U.S. industry to improve the competitiveness of gasification-based processes. The gasification program will reduce equipment costs, improve process environmental performance, and increase process reliability and flexibility. Two sulfur conversion concepts were tested in the laboratory under this project, i.e., the solvent-based, high-pressure University of California Sulfur Recovery Process High Pressure (UCSRP-HP) and the catalytic-based, direct oxidation (DO) section of the CrystaSulf-DO process. Each process required a polishing unit to meet the ultra-clean sulfur content goals of <50 ppbv (parts per billion by volume) as may be necessary for fuel cells or chemical production applications. UCSRP-HP was also tested for the removal of trace, non-sulfur contaminants, including ammonia, hydrogen chloride, and heavy metals. A bench-scale unit was commissioned and limited testing was performed with simulated syngas. Aspen-Plus®-based computer simulation models were prepared and the economics of the UCSRP-HP and CrystaSulf-DO processes were evaluated for a nominal 500 MWe, coal-based, IGCC power plant with carbon capture. This report covers the progress on the UCSRP-HP technology development and the CrystaSulf-DO technology.
Updating National Topographic Data Base Using Change Detection Methods
NASA Astrophysics Data System (ADS)
Keinan, E.; Felus, Y. A.; Tal, Y.; Zilberstien, O.; Elihai, Y.
2016-06-01
The traditional method for updating a topographic database on a national scale is a complex process that requires human resources, time and the development of specialized procedures. In many National Mapping and Cadaster Agencies (NMCA), the updating cycle takes a few years. Today, the reality is dynamic and changes occur every day; therefore, users expect the existing database to portray the current reality. Global mapping projects that are based on community volunteers, such as OSM, update their databases every day based on crowdsourcing. In order to fulfil users' requirements for rapid updating, a new methodology that maps major interest areas while preserving associated decoding information should be developed. Until recently, automated processes did not yield satisfactory results, and a typical process included comparing images from different periods. The success rates in identifying the objects were low, and most were accompanied by a high percentage of false alarms. As a result, the automatic process required significant editorial work that made it uneconomical. In recent years, developments in mapping technologies, advances in image processing algorithms and computer vision, together with the development of digital aerial cameras with a NIR band and Very High Resolution satellites, allow the implementation of a cost-effective automated process. The automatic process is based on high-resolution Digital Surface Model analysis, Multi Spectral (MS) classification, MS segmentation, object analysis and shape-forming algorithms. This article reviews the results of a novel change detection methodology as a first step toward updating the NTDB at the Survey of Israel.
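One step of such a pipeline can be sketched as below: flag change where the Digital Surface Model differs between epochs and a spectral (NDVI) test rules out vegetation. The thresholds and toy arrays are hypothetical, not parameters from the Survey of Israel workflow.

```python
# Illustrative numpy sketch of DSM-difference change detection with
# an NDVI vegetation filter. All thresholds are invented.
import numpy as np

def change_mask(dsm_old, dsm_new, red, nir,
                height_thresh=2.0, ndvi_thresh=0.3):
    height_change = np.abs(dsm_new - dsm_old) > height_thresh
    ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)
    not_vegetation = ndvi < ndvi_thresh
    return height_change & not_vegetation  # candidate built-up changes

dsm_old = np.zeros((4, 4))
dsm_new = dsm_old.copy()
dsm_new[1, 1] = 6.0                        # a new structure appears
red = np.full((4, 4), 0.3)
nir = np.full((4, 4), 0.35)
print(change_mask(dsm_old, dsm_new, red, nir))
```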
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-29
... Solar and Wind Energy Development AGENCY: Bureau of Land Management. ACTION: Advance notice of proposed... to establish a competitive process for leasing public lands for solar and wind energy development... process for issuing Right-of-Way (ROW) leases for solar and wind energy development that is based upon the...
Enzyme-based processing of soybean carbohydrate: Recent developments and future prospects.
Al Loman, Abdullah; Ju, Lu-Kwang
2017-11-01
Soybean is well known for its high-value oil and protein. Carbohydrate is, however, an underutilized major component, representing almost 26-30% (w/w) of the dried bean. The complex soybean carbohydrate is not easily hydrolyzable and can cause indigestibility when included in food and feed. Enzymes can be used to hydrolyze the carbohydrate for improving soybean processing and value of soybean products. Here the enzyme-based processing developed for the following purposes is reviewed: hydrolysis of different carbohydrate-rich by/products from soybean processing, improvement of soybean oil extraction, and increase of nutritional value of soybean-based food and animal feed. Once hydrolyzed into fermentable sugars, soybean carbohydrate can find more value-added applications and further improve the overall economics of soybean processing. Copyright © 2017 Elsevier Inc. All rights reserved.
Development of solution-processed nanowire composites for opto-electronics
Ginley, David S.; Aggarwal, Shruti; Singh, Rajiv; ...
2016-12-20
Here, silver nanowire-based contacts represent one of the major new directions in transparent contacts for opto-electronic devices with the added advantage that they can have Indium-Tin-Oxide-like properties at substantially reduced processing temperatures and without the use of vacuum-based processing. However, nanowires alone often do not adhere well to the substrate or other film interfaces; even after a relatively high-temperature anneal and unencapsulated nanowires show environmental degradation at high temperature and humidity. Here we report on the development of ZnO/Ag-nanowire composites that have sheet resistance below 10 Ω/sq and >90% transmittance from a solution-based process with process temperatures below 200 °C. These films have significant applications potential in photovoltaics and displays.
ERIC Educational Resources Information Center
Xiao, Naiqi G.; Quinn, Paul C.; Ge, Liezhong; Lee, Kang
2017-01-01
Although most of the faces we encounter daily are moving ones, much of what we know about face processing and its development is based on studies using static faces that emphasize holistic processing as the hallmark of mature face processing. Here the authors examined the effects of facial movements on face processing developmentally in children…
ERIC Educational Resources Information Center
Spaulding, Trent Joseph
2011-01-01
The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…
Energy efficiency technologies in cement and steel industry
NASA Astrophysics Data System (ADS)
Zanoli, Silvia Maria; Cocchioni, Francesco; Pepe, Crescenzo
2018-02-01
In this paper, Advanced Process Control strategies aimed at achieving and improving energy efficiency in the cement and steel industries are proposed. A flexible and smart control structure consisting of several functional modules and blocks has been developed. The designed control strategy is based on Model Predictive Control techniques formulated on linear models. Two industrial control solutions have been developed, oriented to energy efficiency and process control improvement in cement industry clinker rotary kilns (clinker production phase) and in steel industry billet reheating furnaces. Tailored customization procedures for the design of ad hoc control systems have been executed, based on the specific needs and specifications of the analysed processes. The installation of the developed controllers on cement and steel plants produced significant benefits in terms of process control, allowing operation closer to the imposed operating limits. Compared with the previous control systems, based on local controllers and/or manual operator control, more profitable configurations of the crucial process variables have been achieved.
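To make the Model Predictive Control idea concrete, the toy sketch below chooses, at each step, the input that best tracks a setpoint over a prediction horizon for a single-variable linear model, then applies only the first input. The furnace model, limits, and constant-input simplification are invented for illustration and are far simpler than the industrial controllers described above.

```python
# Toy linear MPC: search the admissible input range for the constant
# input that minimizes tracking error over the horizon. The model
# x[k+1] = a*x[k] + b*u[k] and all numbers are hypothetical.
import numpy as np

a, b = 0.9, 0.5           # hypothetical first-order furnace model
horizon, setpoint = 10, 100.0
u_min, u_max = 0.0, 30.0  # actuator limits (operating constraints)

def mpc_step(x):
    best_u, best_cost = u_min, float("inf")
    for u0 in np.linspace(u_min, u_max, 61):   # coarse search over u
        x_pred, cost = x, 0.0
        for _ in range(horizon):
            x_pred = a * x_pred + b * u0       # constant-input prediction
            cost += (setpoint - x_pred) ** 2
        if cost < best_cost:
            best_u, best_cost = u0, cost
    return best_u

x = 20.0
for k in range(5):
    u = mpc_step(x)
    x = a * x + b * u
    print(f"step {k}: u={u:.1f}, x={x:.1f}")
```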
A resource for assessing information processing in the developing brain using EEG and eye tracking
Langer, Nicolas; Ho, Erica J.; Alexander, Lindsay M.; Xu, Helen Y.; Jozanovic, Renee K.; Henin, Simon; Petroni, Agustin; Cohen, Samantha; Marcelle, Enitan T.; Parra, Lucas C.; Milham, Michael P.; Kelly, Simon P.
2017-01-01
We present a dataset combining electrophysiology and eye tracking intended as a resource for the investigation of information processing in the developing brain. The dataset includes high-density task-based and task-free EEG, eye tracking, and cognitive and behavioral data collected from 126 individuals (ages: 6–44). The task battery spans both the simple/complex and passive/active dimensions to cover a range of approaches prevalent in modern cognitive neuroscience. The active task paradigms facilitate principled deconstruction of core components of task performance in the developing brain, whereas the passive paradigms permit the examination of intrinsic functional network activity during varying amounts of external stimulation. Alongside these neurophysiological data, we include an abbreviated cognitive test battery and questionnaire-based measures of psychiatric functioning. We hope that this dataset will lead to the development of novel assays of neural processes fundamental to information processing, which can be used to index healthy brain development as well as detect pathologic processes. PMID:28398357
A resource for assessing information processing in the developing brain using EEG and eye tracking.
Langer, Nicolas; Ho, Erica J; Alexander, Lindsay M; Xu, Helen Y; Jozanovic, Renee K; Henin, Simon; Petroni, Agustin; Cohen, Samantha; Marcelle, Enitan T; Parra, Lucas C; Milham, Michael P; Kelly, Simon P
2017-04-11
We present a dataset combining electrophysiology and eye tracking intended as a resource for the investigation of information processing in the developing brain. The dataset includes high-density task-based and task-free EEG, eye tracking, and cognitive and behavioral data collected from 126 individuals (ages: 6-44). The task battery spans both the simple/complex and passive/active dimensions to cover a range of approaches prevalent in modern cognitive neuroscience. The active task paradigms facilitate principled deconstruction of core components of task performance in the developing brain, whereas the passive paradigms permit the examination of intrinsic functional network activity during varying amounts of external stimulation. Alongside these neurophysiological data, we include an abbreviated cognitive test battery and questionnaire-based measures of psychiatric functioning. We hope that this dataset will lead to the development of novel assays of neural processes fundamental to information processing, which can be used to index healthy brain development as well as detect pathologic processes.
Devil is in the details: Using logic models to investigate program process.
Peyton, David J; Scicchitano, Michael
2017-12-01
Theory-based logic models are commonly developed as part of requirements for grant funding. As a tool to communicate complex social programs, theory-based logic models are an effective visual communication device. However, after initial development, theory-based logic models are often abandoned and remain in their initial form despite changes in the program process. This paper examines the potential benefits of committing time and resources to revising the initial theory-driven logic model and developing detailed logic models that describe key activities to accurately reflect the program and assist in effective program management. The authors use a funded special education teacher preparation program to exemplify the utility of drill-down logic models. The paper concludes with lessons learned from the iterative revision process and suggests how the process can lead to more flexible and calibrated program management. Copyright © 2017 Elsevier Ltd. All rights reserved.
Development of Pulsed Processes for the Manufacture of Solar Cells
NASA Technical Reports Server (NTRS)
1979-01-01
The development status of the process based upon ion implantation for the introduction of junctions and back surface fields is described. A process sequence is presented employing ion implantation and pulse processing. Efforts to improve throughput and decrease process element costs for furnace annealing are described. Design studies for a modular 3,000-wafer-per-hour pulse processor are discussed.
Puri, Vibha; Brancazio, Dave; Desai, Parind M; Jensen, Keith D; Chun, Jung-Hoon; Myerson, Allan S; Trout, Bernhardt L
2017-11-01
The combination of hot-melt extrusion and injection molding (HME-IM) is a promising process technology for continuous manufacturing of tablets. However, there has been limited research on its application to formulate crystalline drug-containing immediate-release tablets. Furthermore, studies that have applied the HME-IM process to molded tablets have used a noncontinuous 2-step approach. The present study develops maltodextrin (MDX)-based extrusion-molded immediate-release tablets for a crystalline drug (griseofulvin) using an integrated twin-screw HME-IM continuous process. At 10% w/w drug loading, MDX was selected as the tablet matrix former based on a preliminary screen. Furthermore, liquid and solid polyols were evaluated for melt processing of MDX and for impact on tablet performance. Smooth-surfaced tablets, comprising crystalline griseofulvin solid suspension in the amorphous MDX-xylitol matrix, were produced by a continuous process on a twin-screw extruder coupled to a horizontally opening IM machine. Real-time HME process profiles were used to develop automated HME-IM cycles. Formulation adjustments overcame process challenges and improved tablet strength. The developed MDX tablets exhibited adequate strength and a fast-dissolving matrix (85% drug release in 20 min), and maintained performance under accelerated stability conditions. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
Distributed Computing Framework for Synthetic Radar Application
NASA Technical Reports Server (NTRS)
Gurrola, Eric M.; Rosen, Paul A.; Aivazis, Michael
2006-01-01
We are developing an extensible software framework in response to Air Force and NASA needs for distributed computing facilities for a variety of radar applications. The objective of this work is to develop a Python-based software framework, that is, the framework elements of the middleware that allow developers to control processing flow on a grid in a distributed computing environment. Framework architectures to date allow developers to connect processing functions together as interchangeable objects, thereby allowing a data flow graph to be devised for a specific problem to be solved. The Pyre framework, developed at the California Institute of Technology (Caltech) and now being used as the basis for next-generation radar processing at JPL, is a Python-based software framework. We have extended the Pyre framework to include new facilities to deploy processing components as services, including components that monitor and assess the state of the distributed network for eventual real-time control of grid resources.
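The data-flow-graph idea described above can be sketched minimally as below: processing functions wrapped as interchangeable components and connected into a graph. This is illustrative only and is not the actual Pyre API; all names are invented.

```python
# Minimal sketch of a dataflow framework: components wrap functions
# and forward results downstream. Names are hypothetical, not Pyre's.

class Component:
    def __init__(self, name, func):
        self.name, self.func, self.downstream = name, func, []

    def connect(self, other):
        self.downstream.append(other)
        return other            # allows chaining

    def run(self, data):
        result = self.func(data)
        for node in self.downstream:
            node.run(result)

# Hypothetical radar-style pipeline: ingest -> filter -> detect.
ingest = Component("ingest", lambda d: d)
filt = Component("filter", lambda d: [x for x in d if x > 0])
detect = Component("detect", lambda d: print("detections:", d))
ingest.connect(filt).connect(detect)
ingest.run([-1.0, 2.5, 0.3, -4.2])
```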
Quality by Design (QbD)-Based Process Development for Purification of a Biotherapeutic.
Rathore, Anurag S
2016-05-01
Quality by Design (QbD) is currently receiving increased attention from the pharmaceutical community. As a result, most major biotech manufacturers are in varying stages of implementing QbD. Here, I present a case study that illustrates the step-by-step development using QbD of a purification process for the production of a biosimilar product: granulocyte colony-stimulating factor (GCSF). I also highlight and discuss the advantages that QbD-based process development offers over traditional approaches. The case study is intended to help those who wish to implement QbD towards the development and commercialization of biotech products. Copyright © 2016 Elsevier Ltd. All rights reserved.
Virtual Collaborative Simulation Environment for Integrated Product and Process Development
NASA Technical Reports Server (NTRS)
Gulli, Michael A.
1997-01-01
Deneb Robotics is a leader in the development of commercially available, leading-edge three-dimensional simulation software tools for virtual prototyping, simulation-based design, manufacturing process simulation, and factory floor simulation and training applications. Deneb has developed and commercially released a preliminary Virtual Collaborative Engineering (VCE) capability for Integrated Product and Process Development (IPPD). This capability allows distributed, real-time visualization and evaluation of design concepts, manufacturing processes, and total factories and enterprises in one seamless simulation environment.
Lövquist, Erik; Shorten, George; Aboulafia, Annette
2012-01-01
The current focus on patient safety and evidence-based medical education has led to an increased interest in utilising virtual reality (VR) for medical training. The development of VR-based systems requires experts from different disciplines to collaborate with shared and agreed objectives throughout a system's development process. Both the development of the technology and the incorporation and evaluation of relevant training have to be given appropriate attention. The aim of this article is to illustrate how constructive relationships can be established between stakeholders to develop useful and usable VR-based medical training systems. This article reports a case study of two research projects that developed and evaluated a VR-based training system for spinal anaesthesia. The case study illustrates how close relationships can be established by champion clinicians leading research in this area and by closely engaging clinicians and educators in iterative prototype design throughout a system's development process. Clinicians and educators have to strive to get more involved (ideally as champions of innovation) and actively guide the development of VR-based training and assessment systems. System developers have to strive to ensure that clinicians and educators participate constructively in the development of such systems.
Aerospace Engineering Systems and the Advanced Design Technologies Testbed Experience
NASA Technical Reports Server (NTRS)
VanDalsem, William R.; Livingston, Mary E.; Melton, John E.; Torres, Francisco J.; Stremel, Paul M.
1999-01-01
Continuous improvement of aerospace product development processes is a driving requirement across much of the aerospace community. As up to 90% of the cost of an aerospace product is committed during the first 10% of the development cycle, there is a strong emphasis on capturing, creating, and communicating better information (both requirements and performance) early in the product development process. The community has responded by pursuing the development of computer-based systems designed to enhance the decision-making capabilities of product development individuals and teams. Recently, the historical foci on sharing the geometrical representation and on configuration management are being augmented: 1) Physics-based analysis tools for filling the design space database; 2) Distributed computational resources to reduce response time and cost; 3) Web-based technologies to relieve machine-dependence; and 4) Artificial intelligence technologies to accelerate processes and reduce process variability. The Advanced Design Technologies Testbed (ADTT) activity at NASA Ames Research Center was initiated to study the strengths and weaknesses of the technologies supporting each of these trends, as well as the overall impact of the combination of these trends on a product development event. Lessons learned and recommendations for future activities are reported.
Computer-Aided Sensor Development Focused on Security Issues.
Bialas, Andrzej
2016-05-26
The paper examines intelligent sensor and sensor system development according to the Common Criteria methodology, which is the basic security assurance methodology for IT products and systems. The paper presents how the development process can be supported by software tools, design patterns and knowledge engineering. The automation of this process brings cost-, quality-, and time-related advantages, because the most difficult and most laborious activities are software-supported and the design reusability is growing. The paper includes a short introduction to the Common Criteria methodology and its sensor-related applications. In the experimental section the computer-supported and patterns-based IT security development process is presented using the example of an intelligent methane detection sensor. This process is supported by an ontology-based tool for security modeling and analyses. The verified and justified models are transferred straight to the security target specification representing security requirements for the IT product. The novelty of the paper is to provide a patterns-based and computer-aided methodology for the sensors development with a view to achieving their IT security assurance. The paper summarizes the validation experiment focused on this methodology adapted for the sensors system development, and presents directions of future research.
Computer-Aided Sensor Development Focused on Security Issues
Bialas, Andrzej
2016-01-01
The paper examines intelligent sensor and sensor system development according to the Common Criteria methodology, which is the basic security assurance methodology for IT products and systems. The paper presents how the development process can be supported by software tools, design patterns and knowledge engineering. The automation of this process brings cost-, quality-, and time-related advantages, because the most difficult and most laborious activities are software-supported and the design reusability is growing. The paper includes a short introduction to the Common Criteria methodology and its sensor-related applications. In the experimental section the computer-supported and patterns-based IT security development process is presented using the example of an intelligent methane detection sensor. This process is supported by an ontology-based tool for security modeling and analyses. The verified and justified models are transferred straight to the security target specification representing security requirements for the IT product. The novelty of the paper is to provide a patterns-based and computer-aided methodology for the sensors development with a view to achieving their IT security assurance. The paper summarizes the validation experiment focused on this methodology adapted for the sensors system development, and presents directions of future research. PMID:27240360
[Preface for special issue on bio-based materials (2016)].
Weng, Yunxuan
2016-06-25
Bio-based materials are new materials or chemicals made from renewable biomass such as grain, legumes, straw, bamboo and wood powder. This class of materials includes bio-based polymers, bio-based fibers, glycotechnology products, bio-based rubber and plastics produced by biomass thermoplastic processing, and basic bio-based chemicals, for instance, bio-alcohols, organic acids, alkanes, and alkenes, obtained by bio-synthesis, bio-processing and bio-refinery. Owing to their environmental friendliness and resource conservation, bio-based materials are becoming a new dominant industry taking the lead in worldwide scientific and technological innovation and economic development. An overview of bio-based materials development is reported in this special issue, and the industrial status and research progress of the following aspects, including bio-based fiber, polyhydroxyalkanoates, biodegradable mulching film, bio-based polyamide, protein-based biomedical materials, bio-based polyurethane, and the modification and processing of poly(lactic acid), are introduced.
Müller-Staub, Maria; de Graaf-Waar, Helen; Paans, Wolter
2016-11-01
Nurses are accountable for applying the nursing process, which is key to patient care: it is a problem-solving process providing the structure for care plans and documentation. The state-of-the-art nursing process is based on classifications that contain standardized concepts and is therefore named the Advanced Nursing Process. It contains valid assessments, nursing diagnoses, interventions, and nursing-sensitive patient outcomes. Electronic decision support systems can assist nurses in applying the Advanced Nursing Process. However, nursing decision support systems are missing, and no "gold standard" is available. The study aim is to develop a valid Nursing Process-Clinical Decision Support System Standard to guide future development of clinical decision support systems. In a multistep approach, a Nursing Process-Clinical Decision Support System Standard with 28 criteria was developed. After pilot testing (N = 29 nurses), the criteria were reduced to 25. The Nursing Process-Clinical Decision Support System Standard was then presented to eight internationally known experts, who performed qualitative interviews according to Mayring. Fourteen categories demonstrate expert consensus on the Nursing Process-Clinical Decision Support System Standard and its content validity. All experts agreed that the Advanced Nursing Process should be the centerpiece of the Nursing Process-Clinical Decision Support System and should suggest research-based, predefined nursing diagnoses and correct linkages between diagnoses, evidence-based interventions, and patient outcomes.
Knowledge/geometry-based Mobile Autonomous Robot Simulator (KMARS)
NASA Technical Reports Server (NTRS)
Cheng, Linfu; Mckendrick, John D.; Liu, Jeffrey
1990-01-01
Ongoing applied research is focused on developing guidance systems for robot vehicles. Problems facing the basic research needed to support this development (e.g., scene understanding, real-time vision processing, etc.) are major impediments to progress. Due to the complexity and the unpredictable nature of a vehicle's area of operation, more advanced vehicle control systems must be able to learn about obstacles within the range of their sensor(s). A better understanding of the basic exploration process is needed to provide critical support to developers of both sensor systems and intelligent control systems which can be used in a wide spectrum of autonomous vehicles. Elcee Computek, Inc. has been working under contract to the Flight Dynamics Laboratory, Wright Research and Development Center, Wright-Patterson AFB, Ohio to develop a Knowledge/Geometry-based Mobile Autonomous Robot Simulator (KMARS). KMARS has two parts: a geometry base and a knowledge base. The knowledge base part of the system employs the expert-system shell CLIPS ('C' Language Integrated Production System) and the rules necessary to control both the vehicle's use of an obstacle-detecting sensor and the overall exploration process. The initial project phase has focused on the simulation of a point robot vehicle operating in a 2D environment.
Structural CNT Composites. Part I; Developing a Carbon Nanotube Filament Winder
NASA Technical Reports Server (NTRS)
Sauti, Godfrey; Kim, Jae-Woo; Wincheski, Russell A.; Antczak, Andrew; Campero, Jamie C.; Luong, Hoa H.; Shanahan, Michelle H.; Stelter, Christopher J.; Siochi, Emilie J.
2015-01-01
Carbon nanotube (CNT) based materials promise advances in the production of high strength and multifunctional components for aerospace and other applications. Specifically, in tension dominated applications, the latest CNT based filaments are yielding composite properties comparable to or exceeding composites from more established fibers such as Kevlar and carbon fiber. However, for the properties of these materials to be fully realized at the component level, suitable manufacturing processes have to be developed. These materials handle differently from conventional fibers, with different wetting characteristics and behavior under load. The limited availability of bulk forms also requires that the equipment be scaled down accordingly to tailor the process development approach to material availability. Here, the development of hardware and software for filament winding of carbon nanotube based tapes and yarns is described. This hardware features precision guidance of the CNT material and control of the winding tension over a wide range in an open architecture that allows for effective process control and troubleshooting during winding. Use of the filament winder to develop CNT based Composite Overwrapped Pressure Vessels (COPVs) shall also be discussed.
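As a rough illustration of the software side of such a winder, the sketch below shows a basic proportional-integral loop holding a tension setpoint. The gains, setpoint, and plant response are invented; the abstract does not describe the actual control scheme used in the NASA hardware.

```python
# Hypothetical PI winding-tension loop: adjust spool torque command
# to hold a tension setpoint. All numbers are illustrative only.

def pi_tension_controller(measured, setpoint, state, kp=0.8, ki=0.2):
    error = setpoint - measured
    state["integral"] += error
    return kp * error + ki * state["integral"]

state = {"integral": 0.0}
tension = 0.5                      # measured tension, arbitrary units
for step in range(5):
    command = pi_tension_controller(tension, 1.0, state)
    tension += 0.3 * command       # toy first-order plant response
    print(f"step {step}: command={command:.2f}, tension={tension:.2f}")
```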
Technology and development requirements for advanced coal conversion systems
NASA Technical Reports Server (NTRS)
1981-01-01
A compendium of coal conversion process descriptions is presented. The SRS and MC data bases were utilized to provide information, particularly in the areas of existing process designs and process evaluations. Additional information requirements were established and arrangements were made to visit process developers, pilot plants, and process development units to obtain information that was not otherwise available. Plant designs, process descriptions and operating conditions, and performance characteristics were analyzed, and requirements for further development were identified and evaluated to determine the impact of these requirements on the process commercialization potential from the standpoint of economics and technical feasibility. A preliminary methodology was established for the comparative technical and economic assessment of advanced processes.
NASA Astrophysics Data System (ADS)
Kaiser, C.; Roll, K.; Volk, W.
2017-09-01
In the automotive industry, the manufacturing of automotive outer panels requires hemming processes in which two sheet metal parts are joined together by bending the flange of the outer part over the inner part. Because of decreasing development times and the steadily growing number of vehicle derivatives, efficient digital product and process validation is necessary. Commonly used simulations, which are based on the finite element method, demand significant modelling effort, which is a disadvantage especially in the early product development phase. To increase the efficiency of designing hemming processes, this paper presents a hemming-specific metamodel approach. The approach includes a part analysis in which the outline of the automotive outer panel is initially split into individual segments. By parametrizing each of the segments and assigning basic geometric shapes, the outline of the part is approximated. Based on this, the hemming parameters such as flange length, roll-in, wrinkling and plastic strains are calculated for each of the geometric basic shapes by performing a metamodel-based segmental product validation. The metamodel is based on an element-similar formulation that includes a reference dataset of various geometric basic shapes. A random automotive outer panel can now be analysed and optimized based on the hemming-specific database. By implementing this approach into a planning system, an efficient optimization of hemming process design will be enabled. Furthermore, valuable time and cost benefits can be realized in a vehicle's development process.
Process Monitoring Evaluation and Implementation for the Wood Abrasive Machining Process
Saloni, Daniel E.; Lemaster, Richard L.; Jackson, Steven D.
2010-01-01
Wood processing industries have continuously developed and improved technologies and processes to transform wood to obtain better final product quality and thus increase profits. Abrasive machining is one of the most important of these processes and therefore merits special attention and study. The objective of this work was to evaluate and demonstrate a process monitoring system for use in the abrasive machining of wood and wood based products. The system developed increases the life of the belt by detecting (using process monitoring sensors) and removing (by cleaning) the abrasive loading during the machining process. This study focused on abrasive belt machining processes and included substantial background work, which provided a solid base for understanding the behavior of the abrasive, and the different ways that the abrasive machining process can be monitored. In addition, the background research showed that abrasive belts can effectively be cleaned by the appropriate cleaning technique. The process monitoring system developed included acoustic emission sensors which tended to be sensitive to belt wear, as well as platen vibration, but not loading, and optical sensors which were sensitive to abrasive loading. PMID:22163477
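In the spirit of the monitoring approach described above, the sketch below watches an optical-sensor loading signal and triggers a cleaning cycle when a limit is exceeded. The signal values and threshold are invented placeholders, not data from the study.

```python
# Sketch of a threshold-based abrasive-loading monitor. The limit and
# the sample values are hypothetical illustration only.

def monitor(optical_samples, limit=0.6):
    for t, level in enumerate(optical_samples):
        if level > limit:
            print(f"t={t}: loading {level:.2f} exceeds limit -> clean belt")
        else:
            print(f"t={t}: loading {level:.2f} within limit")

monitor([0.20, 0.35, 0.50, 0.65, 0.72])
```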
ERIC Educational Resources Information Center
Wallace, Guy W.
2001-01-01
Explains lean instructional systems design/development (ISD) as it relates to curriculum architecture design, based on Japan's lean production system. Discusses performance-based systems; ISD models; processes for organizational training and development; curriculum architecture to support job performance; and modular curriculum development. (LRW)
ERIC Educational Resources Information Center
Zascerinska, Jelena
2010-01-01
The paradigm change from an input-based teaching/learning process to an outcome-based process (D. Bluma, 2008, p. 673) means that the efficiency of efforts applied to enhance students' learning outcomes becomes particularly important for the development of education and for culture change in a constantly changing environment. The aim of the research is…
Measuring the impact of computer resource quality on the software development process and product
NASA Technical Reports Server (NTRS)
Mcgarry, Frank; Valett, Jon; Hall, Dana
1985-01-01
The availability and quality of computer resources during the software development process were speculated to have a measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort required to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data were extracted and examined from nearly 50 software development projects. All were related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product, as exemplified by the subject NASA data, was examined. Based upon the results, a number of computer resource-related implications are provided.
Generic Raman-based calibration models enabling real-time monitoring of cell culture bioreactors.
Mehdizadeh, Hamidreza; Lauri, David; Karry, Krizia M; Moshgbar, Mojgan; Procopio-Melino, Renee; Drapeau, Denis
2015-01-01
Raman-based multivariate calibration models have been developed for real-time in situ monitoring of multiple process parameters within cell culture bioreactors. Developed models are generic, in the sense that they are applicable to various products, media, and cell lines based on Chinese Hamster Ovarian (CHO) host cells, and are scalable to large pilot and manufacturing scales. Several batches using different CHO-based cell lines and corresponding proprietary media and process conditions have been used to generate calibration datasets, and models have been validated using independent datasets from separate batch runs. All models have been validated to be generic and capable of predicting process parameters with acceptable accuracy. The developed models allow monitoring multiple key bioprocess metabolic variables, and hence can be utilized as an important enabling tool for Quality by Design approaches which are strongly supported by the U.S. Food and Drug Administration. © 2015 American Institute of Chemical Engineers.
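The multivariate calibration described above can be sketched with partial least squares, a common choice for Raman chemometrics; the abstract does not name the exact algorithm, and the synthetic spectra and analyte below are invented for illustration.

```python
# Sketch of a PLS calibration mapping spectra to an analyte
# concentration. Spectra are synthetic Gaussians plus noise.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
n_batches, n_wavenumbers = 40, 300
concentration = rng.uniform(0.5, 5.0, n_batches)   # e.g., lactate, g/L
peak = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 150) / 10) ** 2)
spectra = (np.outer(concentration, peak)
           + rng.normal(0, 0.05, (n_batches, n_wavenumbers)))

model = PLSRegression(n_components=3).fit(spectra, concentration)
print("Predicted:", model.predict(spectra[:3]).ravel().round(2))
print("True:     ", concentration[:3].round(2))
```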
Development of polypyrrole based solid-state on-chip microactuators using photolithography
NASA Astrophysics Data System (ADS)
Zhong, Yong; Lundemo, Staffan; Jager, Edwin W. H.
2018-07-01
There is a need for soft microactuators, especially for biomedical applications. We have developed a microfabrication process to create such soft, on-chip polymer-based microactuators that can operate in air. The on-chip microactuators were fabricated using standard photolithographic techniques and wet etching, combined with a specially designed process to micropattern the electroactive polymer polypyrrole that drives the microactuators. By immobilizing a UV-patternable gel containing a liquid electrolyte on top of the electroactive polypyrrole layer, actuation in air was achieved, although with reduced movement. Further optimization of the processing is currently ongoing. The results show the possibility of batch fabricating complex microsystems such as microrobots and micromanipulators based on these solid-state on-chip microactuators using microfabrication methods, including standard photolithographic processes.
A diagnostic prototype of the potable water subsystem of the Space Station Freedom ECLSS
NASA Technical Reports Server (NTRS)
Lukefahr, Brenda D.; Rochowiak, Daniel M.; Benson, Brian L.; Rogers, John S.; Mckee, James W.
1989-01-01
In analyzing the baseline Environmental Control and Life Support System (ECLSS) command and control architecture, various processes are found which would be enhanced by the use of knowledge-based system methods of implementation. The processes most suitable for prototyping using rule-based methods are documented, while domain knowledge resources and other practical considerations are examined. Requirements for a prototype rule-based software system are documented. These requirements reflect Space Station Freedom ECLSS software and hardware development efforts and knowledge-based system requirements. A quick-prototype knowledge-based system environment is researched and developed.
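A toy illustration of the rule-based approach described above is sketched below: if-then rules over telemetry from a potable water subsystem. The sensor names, limits, and diagnoses are hypothetical, not drawn from the actual ECLSS design.

```python
# Minimal rule-based diagnostic sketch: each rule pairs a telemetry
# predicate with a diagnosis. All names and limits are invented.

RULES = [
    (lambda t: t["tank_pressure_kpa"] < 90, "possible leak upstream of tank"),
    (lambda t: t["iodine_ppm"] < 1.0, "iodine injector underdosing"),
    (lambda t: t["temp_c"] > 50, "heater controller fault"),
]

def diagnose(telemetry):
    findings = [msg for predicate, msg in RULES if predicate(telemetry)]
    return findings or ["nominal"]

print(diagnose({"tank_pressure_kpa": 85, "iodine_ppm": 2.0, "temp_c": 45}))
```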
A CSP-Based Agent Modeling Framework for the Cougaar Agent-Based Architecture
NASA Technical Reports Server (NTRS)
Gracanin, Denis; Singh, H. Lally; Eltoweissy, Mohamed; Hinchey, Michael G.; Bohner, Shawn A.
2005-01-01
Cognitive Agent Architecture (Cougaar) is a Java-based architecture for large-scale distributed agent-based applications. A Cougaar agent is an autonomous software entity with behaviors that represent a real-world entity (e.g., a business process). A Cougaar-based Model Driven Architecture approach, currently under development, uses a description of system's functionality (requirements) to automatically implement the system in Cougaar. The Communicating Sequential Processes (CSP) formalism is used for the formal validation of the generated system. Two main agent components, a blackboard and a plugin, are modeled as CSP processes. A set of channels represents communications between the blackboard and individual plugins. The blackboard is represented as a CSP process that communicates with every agent in the collection. The developed CSP-based Cougaar modeling framework provides a starting point for a more complete formal verification of the automatically generated Cougaar code. Currently it is used to verify the behavior of an individual agent in terms of CSP properties and to analyze the corresponding Cougaar society.
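A rough Python analogue of the CSP structure described above is sketched below, with channels as queues connecting a blackboard to plugin processes. This only illustrates the communication topology; actual verification of CSP properties would use a dedicated refinement checker, and all names here are invented.

```python
# Sketch of blackboard-to-plugin channels using queues and threads,
# mirroring the CSP process/channel decomposition described above.
import queue
import threading

def plugin(name, channel):
    item = channel.get()               # receive over the channel
    print(f"{name} processed: {item}")

blackboard_channels = {n: queue.Queue() for n in ("planner", "logger")}
threads = [threading.Thread(target=plugin, args=(n, ch))
           for n, ch in blackboard_channels.items()]
for t in threads:
    t.start()
for name, ch in blackboard_channels.items():   # blackboard publishes
    ch.put({"task": "allocate", "to": name})
for t in threads:
    t.join()
```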
Automated knowledge base development from CAD/CAE databases
NASA Technical Reports Server (NTRS)
Wright, R. Glenn; Blanchard, Mary
1988-01-01
Knowledge base development requires a substantial investment in time, money, and resources in order to capture the knowledge and information necessary for anything other than trivial applications. This paper addresses a means to integrate the design and knowledge base development process through automated knowledge base development from CAD/CAE databases and files. Benefits of this approach include the development of a more efficient means of knowledge engineering, resulting in the timely creation of large knowledge based systems that are inherently free of error.
Creating a nursing strategic planning framework based on evidence.
Shoemaker, Lorie K; Fischer, Brenda
2011-03-01
This article describes an evidence-informed strategic planning process and framework used by a Magnet-recognized public health system in California. This article includes (1) an overview of the organization and its strategic planning process, (2) the structure created within nursing for collaborative strategic planning and decision making, (3) the strategic planning framework developed based on the organization's balanced scorecard domains and the new Magnet model, and (4) the process undertaken to develop the nursing strategic priorities. Outcomes associated with the structure, process, and key initiatives are discussed throughout the article. Copyright © 2011 Elsevier Inc. All rights reserved.
Pavement maintenance optimization model using Markov Decision Processes
NASA Astrophysics Data System (ADS)
Mandiartha, P.; Duffield, C. F.; Razelan, I. S. b. M.; Ismail, A. b. H.
2017-09-01
This paper presents an optimization model for the selection of pavement maintenance interventions using the theory of Markov Decision Processes (MDP). There are some particular characteristics of the MDP developed in this paper which distinguish it from other similar studies or optimization models intended for pavement maintenance policy development. These unique characteristics include the direct inclusion of constraints in the formulation of the MDP, the use of an average-cost method of MDP, and a policy development process based on the dual linear programming solution. The limited information and discussion available on these matters in terms of stochastic optimization models for road network management motivates this study. This paper uses a data set acquired from the road authorities of the state of Victoria, Australia, to test the model and recommends steps in the computation of an MDP-based stochastic optimization model, leading to the development of an optimum pavement maintenance policy.
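The dual linear-programming formulation mentioned above can be sketched for a two-state toy problem: variables x[s,a] are stationary state-action frequencies, the objective is expected average cost, and flow-balance constraints tie inflow to outflow. The pavement states, actions, and costs below are invented, not the Victoria data.

```python
# Dual-LP sketch for an average-cost MDP: minimize sum c(s,a)*x(s,a)
# subject to flow balance and normalization. Numbers are hypothetical.
import numpy as np
from scipy.optimize import linprog

# P[a][s][s']: transition probabilities; states: 0 = good, 1 = poor.
P = np.array([[[0.8, 0.2], [0.0, 1.0]],   # action 0: do nothing
              [[1.0, 0.0], [0.9, 0.1]]])  # action 1: rehabilitate
C = np.array([[0.0, 5.0],                 # C[s][a]: good road
              [10.0, 8.0]])               # poor road (high user costs)

nS, nA = 2, 2
c = C.reshape(nS * nA)                    # objective over x[s, a]
A_eq = np.zeros((nS + 1, nS * nA))
for s2 in range(nS):                      # flow balance for each state s'
    for a in range(nA):
        A_eq[s2, s2 * nA + a] += 1.0      # outflow of s'
        for s in range(nS):
            A_eq[s2, s * nA + a] -= P[a, s, s2]  # inflow into s'
A_eq[nS, :] = 1.0                         # frequencies sum to one
b_eq = np.zeros(nS + 1)
b_eq[nS] = 1.0

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * (nS * nA))
print("optimal state-action frequencies:", res.x.round(3))
```

The optimal frequencies encode the maintenance policy: for each state, the action carrying nonzero frequency is the one to apply.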
Some technical considerations on the evolution of the IBIS system. [Image Based Information System
NASA Technical Reports Server (NTRS)
Bryant, N. A.; Zobrist, A. L.
1982-01-01
In connection with work related to the use of earth-resources images, it became apparent by 1974 that certain system improvements were necessary for the efficient processing of digital data. To resolve this dilemma, Billingsley and Bryant (1975) proposed the use of image processing technology. Bryant and Zobrist (1976) reported the development of the Image Based Information System (IBIS) as a subset of an overall Video Image Communication and Retrieval (VICAR) image processing system. A description of IBIS is presented, and its employment in connection with advanced applications is discussed. It is concluded that several important lessons have been learned from the development of IBIS. The development of a flexible system such as IBIS is found to rest upon the prior development of a general-purpose image processing system, such as VICAR.
Conceptual information processing: A robust approach to KBS-DBMS integration
NASA Technical Reports Server (NTRS)
Lazzara, Allen V.; Tepfenhart, William; White, Richard C.; Liuzzi, Raymond
1987-01-01
Integrating the respective functionality and architectural features of knowledge base and data base management systems is a topic of considerable interest. Several aspects of this topic and associated issues are addressed. The significance of integration and the problems associated with accomplishing that integration are discussed. The shortcomings of current approaches to integration and the need to fuse the capabilities of both knowledge base and data base management systems motivates the investigation of information processing paradigms. One such paradigm is concept based processing, i.e., processing based on concepts and conceptual relations. An approach to robust knowledge and data base system integration is discussed by addressing progress made in the development of an experimental model for conceptual information processing.
Project-Based Module Development.
ERIC Educational Resources Information Center
Meel, R. M. van
A project management design for modularizing higher education at open universities was developed and tested. Literature in the fields of project management and development of modular curriculum materials was reviewed and used as a basis for developing a project-based approach to the process of developing modules for self-instruction. According to…
Markert, Sven; Joeris, Klaus
2017-01-01
We developed an automated microtiter plate (MTP)-based system for suspension cell culture to meet the increased demands for miniaturized high throughput applications in biopharmaceutical process development. The generic system is based on off-the-shelf commercial laboratory automation equipment and is able to utilize MTPs of different configurations (6-24 wells per plate) in orbital shaken mode. The shaking conditions were optimized by Computational Fluid Dynamics simulations. The fully automated system handles plate transport, seeding and feeding of cells, daily sampling, and preparation of analytical assays. The integration of all required analytical instrumentation into the system enables a hands-off operation which prevents bottlenecks in sample processing. The modular set-up makes the system flexible and adaptable for a continuous extension of analytical parameters and add-on components. The system proved suitable as a screening tool for process development by verifying the comparability of results for the MTP-based system and bioreactors regarding profiles of viable cell density, lactate, and product concentration of CHO cell lines. These studies confirmed that 6 well MTPs as well as 24 deepwell MTPs were predictive for a scale up to a 1000 L stirred tank reactor (scale factor 1:200,000). Applying the established cell culture system for automated media blend screening in late stage development, a 22% increase in product yield was achieved in comparison to the reference process. The predicted product increase was subsequently confirmed in 2 L bioreactors. Thus, we demonstrated the feasibility of the automated MTP-based cell culture system for enhanced screening and optimization applications in process development and identified further application areas such as process robustness. The system offers a great potential to accelerate time-to-market for new biopharmaceuticals. Biotechnol. Bioeng. 2017;114: 113-121. © 2016 Wiley Periodicals, Inc.
Cognitive components underpinning the development of model-based learning.
Potter, Tracey C S; Bryce, Nessa V; Hartley, Catherine A
2017-06-01
Reinforcement learning theory distinguishes "model-free" learning, which fosters reflexive repetition of previously rewarded actions, from "model-based" learning, which recruits a mental model of the environment to flexibly select goal-directed actions. Whereas model-free learning is evident across development, recruitment of model-based learning appears to increase with age. However, the cognitive processes underlying the development of model-based learning remain poorly characterized. Here, we examined whether age-related differences in cognitive processes underlying the construction and flexible recruitment of mental models predict developmental increases in model-based choice. In a cohort of participants aged 9-25, we examined whether the abilities to infer sequential regularities in the environment ("statistical learning"), maintain information in an active state ("working memory") and integrate distant concepts to solve problems ("fluid reasoning") predicted age-related improvements in model-based choice. We found that age-related improvements in statistical learning performance did not mediate the relationship between age and model-based choice. Ceiling performance on our working memory assay prevented examination of its contribution to model-based learning. However, age-related improvements in fluid reasoning statistically mediated the developmental increase in the recruitment of a model-based strategy. These findings suggest that gradual development of fluid reasoning may be a critical component process underlying the emergence of model-based learning. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Development of High Throughput Process for Constructing 454 Titanium and Illumina Libraries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deshpande, Shweta; Hack, Christopher; Tang, Eric
2010-05-28
We have developed two processes with the Biomek FX robot to construct 454 Titanium and Illumina libraries in order to meet the increasing library demands. All modifications in the library construction steps were made to enable the adaptation of the entire processes to the 96-well plate format. The key modifications include the shearing of DNA with the Covaris E210 and the enzymatic reaction clean-up and fragment size selection with SPRI beads and magnetic plate holders. The construction of 96 Titanium libraries takes about 8 hours from sheared DNA to ssDNA recovery. The processing of 96 Illumina libraries takes less time than the Titanium library process. Although both processes still require manual transfer of plates from the robot to other work stations such as thermocyclers, these robotic processes represent about a 12- to 24-fold increase in library capacity compared to the manual processes. To enable the sequencing of many libraries in parallel, we have also developed sets of molecular barcodes for both library types. The requirements for the 454 library barcodes include 10 bases, 40-60% GC, no consecutive identical bases, and at least 3 bases difference between barcodes. We have used 96 of the resulting 270 barcodes to construct libraries and pooled them to test the ability to accurately assign reads to the right samples. When allowing 1 base error in the 10-base barcodes, we could assign 99.6% of the total reads, and 100% of them were uniquely assigned. As for the Illumina barcodes, the requirements include 4 bases, balanced GC, and at least 2 bases difference between barcodes. We have begun to assess the ability to assign reads after pooling different numbers of libraries. We will discuss the progress and the challenges of these scale-up processes.
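The 454 barcode constraints described above (10 bases, 40-60% GC, no consecutive identical bases, at least 3 bases difference between any pair) lend themselves to a simple greedy filter. A minimal Python sketch of such a filter follows; the function names and the greedy lexicographic enumeration are illustrative assumptions, not the authors' actual pipeline:

    from itertools import product

    def gc_fraction(seq):
        """Fraction of G/C bases in the barcode."""
        return (seq.count("G") + seq.count("C")) / len(seq)

    def has_repeat(seq):
        """True if the same base occurs twice in a row."""
        return any(a == b for a, b in zip(seq, seq[1:]))

    def hamming(a, b):
        """Number of mismatching positions between equal-length barcodes."""
        return sum(x != y for x, y in zip(a, b))

    def pick_barcodes(length=10, gc_lo=0.4, gc_hi=0.6, min_dist=3, limit=96):
        """Greedily collect barcodes satisfying the stated constraints."""
        chosen = []
        for bases in product("ACGT", repeat=length):
            seq = "".join(bases)
            if not gc_lo <= gc_fraction(seq) <= gc_hi:
                continue
            if has_repeat(seq):
                continue
            if all(hamming(seq, c) >= min_dist for c in chosen):
                chosen.append(seq)
                if len(chosen) == limit:
                    break
        return chosen

    # codes = pick_barcodes()  # first 96 valid, mutually distant barcodes

Exhaustively enumerating all 4^10 candidates is feasible, if slow, in pure Python; a production design would presumably also screen for sequencing-chemistry issues beyond these four stated rules.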
Fragment-based design of kinase inhibitors: a practical guide.
Erickson, Jon A
2015-01-01
Fragment-based drug design has become an important strategy for drug design and development over the last decade. It has been used with particular success in the development of kinase inhibitors, which are one of the most widely explored classes of drug targets today. The application of fragment-based methods to discovering and optimizing kinase inhibitors can be a complicated and daunting task; however, a general process has emerged that has been highly fruitful. Here, a practical outline of the fragment process used in kinase inhibitor design and development is laid out with specific examples, providing a guide to the overall process from initial discovery through fragment screening, including the difficulties in detection, to the computational methods available for optimizing the discovered fragments.
A nutribusiness strategy for processing and marketing animal-source foods for children.
Mills, Edward W; Seetharaman, Koushik; Maretzki, Audrey N
2007-04-01
Nutritional benefits of animal source foods in the diets of children in developing countries indicate a need to increase the availability of such foods to young children. A nutribusiness strategy based on a dried meat and starch product could be used to increase children's access to such foods. The "Chiparoo" was developed at The Pennsylvania State University with this objective in mind. The plant-based and meat ingredients of the Chiparoo are chosen based on regional availability and cultural acceptability. Chiparoo processing procedures, including solar drying, are designed to ensure product safety and to provide product properties that allow it to be eaten as a snack or crumbled into a weaning porridge. Continued work is needed to develop formulation and processing variations that accommodate the needs of cultures around the world.
Stotz, Sarah; Lee, Jung Sun
2018-01-01
The objective of this report was to describe the development process of an innovative smartphone-based electronic learning (eLearning) nutrition education program targeted to Supplemental Nutrition Assistance Program-Education-eligible individuals, entitled Food eTalk. Lessons learned from the Food eTalk development process suggest that it is critical to include all key team members from the program's inception using effective inter-team communication systems, understand the unique resources needed, budget ample time for development, and employ an iterative development and evaluation model. These lessons have implications for researchers and funding agencies in developing an innovative evidence-based eLearning nutrition education program for an increasingly technology-savvy, low-income audience. Copyright © 2016 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.
Transforming nanomedicine manufacturing toward Quality by Design and microfluidics.
Colombo, Stefano; Beck-Broichsitter, Moritz; Bøtker, Johan Peter; Malmsten, Martin; Rantanen, Jukka; Bohr, Adam
2018-04-05
Nanopharmaceuticals aim at translating the unique features of nano-scale materials into therapeutic products and consequently their development relies critically on the progression in manufacturing technology to allow scalable processes complying with process economy and quality assurance. The relatively high failure rate in translational nanopharmaceutical research and development, with respect to new products on the market, is at least partly due to immature bottom-up manufacturing development and resulting sub-optimal control of quality attributes in nanopharmaceuticals. Recently, quality-oriented manufacturing of pharmaceuticals has undergone an unprecedented change toward process and product development interaction. In this context, Quality by Design (QbD) aims to integrate product and process development resulting in an increased number of product applications to regulatory agencies and stronger proprietary defense strategies of process-based products. Although QbD can be applied to essentially any production approach, microfluidic production offers particular opportunities for QbD-based manufacturing of nanopharmaceuticals. Microfluidics provides unique design flexibility, process control and parameter predictability, and also offers ample opportunities for modular production setups, allowing process feedback for continuously operating production and process control. The present review aims at outlining emerging opportunities in the synergistic implementation of QbD strategies and microfluidic production in contemporary development and manufacturing of nanopharmaceuticals. In doing so, aspects of design and development, but also technology management, are reviewed, as is the strategic role of these tools for aligning nanopharmaceutical innovation, development, and advanced industrialization in the broader pharmaceutical field. Copyright © 2018 Elsevier B.V. All rights reserved.
Process Evaluation in Corrections-Based Substance Abuse Treatment.
ERIC Educational Resources Information Center
Wolk, James L.; Hartmann, David J.
1996-01-01
Argues that process evaluation is needed to validate prison-based substance abuse treatment effectiveness. Five groups--inmates, treatment staff, prison staff, prison administration, and the parole board--should be a part of this process evaluation. Discusses these five groups relative to three stages of development of substance abuse treatment in…
The Growth of Developmental Thought: Implications for a New Evolutionary Psychology
Lickliter, Robert
2009-01-01
Evolution has come to be increasingly discussed in terms of changes in developmental processes rather than simply in terms of changes in gene frequencies. This shift is based in large part on the recognition that since all phenotypic traits arise during ontogeny as products of individual development, a primary basis for evolutionary change must be variations in the patterns and processes of development. Further, the products of development are epigenetic, not just genetic, and this is the case even when considering the evolutionary process. These insights have led investigators to reconsider the established notion of genes as the primary cause of development, opening the door to research programs focused on identifying how genetic and non-genetic factors coact to guide and constrain the process of development and its outcomes. I explore this growth of developmental thought and its implications for the achievement of a unified theory of heredity, development, and evolution and consider its implications for the realization of a new, developmentally-based evolutionary psychology. PMID:19956346
SCI-U: E-learning for patient education in spinal cord injury rehabilitation
Shepherd, John D.; Badger-Brown, Karla M.; Legassic, Matthew S.; Walia, Saagar; Wolfe, Dalton L.
2012-01-01
Background/objectives: To develop an online patient education resource for use in spinal cord injury rehabilitation. Participants: The development process involved more than 100 subject-matter experts (SMEs) (rehabilitation professionals and consumers) from across Canada. Preliminary evaluation was conducted with 25 end-users. Methods: An iterative development process was coordinated by a project team; SMEs (including patients) developed the content in working groups using wiki-based tools. Multiple rounds of feedback based on early prototypes helped improve the courses during development. Results: Five courses were created, each featuring more than 45 minutes of video content and hundreds of media assets. Preliminary evaluation results indicate that users were satisfied by the courses and perceived them to be effective. Conclusions: This is an effective process for developing multimedia patient education resources; the involvement of patients in all parts of the process was particularly helpful. Future work will focus on implementation, integration into clinical practice and other delivery formats (smart phones, tablets). PMID:23031169
The TMIS life-cycle process document, revision A
NASA Technical Reports Server (NTRS)
1991-01-01
The Technical and Management Information System (TMIS) Life-Cycle Process Document describes the processes that shall be followed in the definition, design, development, test, deployment, and operation of all TMIS products and data base applications. This document is a roll out of the TMIS Standards Document (SSP 30546). The purpose of this document is to define the life-cycle methodology that the developers of all products and data base applications, and of any subsequent modifications, shall follow. Included in this methodology are descriptions of the tasks, deliverables, reviews, and approvals that are required before a product or data base application is accepted in the TMIS environment.
Developing Emotion-Based Case Formulations: A Research-Informed Method.
Pascual-Leone, Antonio; Kramer, Ueli
2017-01-01
New research-informed methods for case conceptualization that cut across traditional therapy approaches are increasingly popular. This paper presents a trans-theoretical approach to case formulation based on research observations of emotion. The sequential model of emotional processing (Pascual-Leone & Greenberg, 2007) is a process research model that provides concrete markers for therapists to observe the emerging emotional development of their clients. We illustrate how this model can be used by clinicians to track change and provide a 'clinical map' by which therapists may orient themselves in-session and plan treatment interventions. Emotional processing serves as a trans-theoretical framework for therapists who wish to conduct emotion-based case formulations. First, we present criteria for why this research model translates well into practice. Second, two contrasting case studies are presented to demonstrate the method. The model bridges research with practice by using client emotion as an axis of integration. Key Practitioner Message: Process research on emotion can offer a template for therapists to make case formulations while using a range of treatment approaches. The sequential model of emotional processing provides a 'process map' of concrete markers for therapists to (1) observe the emerging emotional development of their clients, and (2) help therapists develop a treatment plan. Copyright © 2016 John Wiley & Sons, Ltd.
76 FR 41297 - Grant Program To Build Tribal Energy Development Capacity
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-13
... develop energy resources on Indian land and properly accounting for resulting energy resource production and revenues. We will use a competitive evaluation process based on criteria stated in the.... Determine what process(es) and/or procedure(s) may be used to eliminate capacity gaps or sustain the...
Evaluation development for a physical activity positive youth development program for girls.
Ullrich-French, Sarah; Cole, Amy N; Montgomery, Anna K
2016-04-01
Girls on the Run (GOTR) is an after-school program for girls in third through fifth grade which utilizes a physical activity-based positive youth development curriculum that culminates with completing a 5K run. Unfortunately, there are few empirical data documenting GOTR participant changes that align with the curriculum and describing the evaluation process. Therefore, this study presents an evaluation of GOTR consisting of three main processes: curriculum content analysis and stakeholder focus groups (N=11) to identify key outcomes of the program; community-based participatory research to collaborate with program personnel to further identify important outcomes; and the design and pilot testing of an instrument (N=104) for assessing changes in the theoretically grounded outcomes over time. Findings demonstrated a positive collaborative process that yielded important information for an impact evaluation of Girls on the Run and for future evaluation development efforts for physical activity-based positive youth development. Copyright © 2015 Elsevier Ltd. All rights reserved.
On Intelligent Design and Planning Method of Process Route Based on Gun Breech Machining Process
NASA Astrophysics Data System (ADS)
Hongzhi, Zhao; Jian, Zhang
2018-03-01
This paper presents an approach to the intelligent design and planning of process routes, based on the gun breech machining process, in response to several problems: the complexity of gun breech machining, the tedium of route design, and the long cycle and poor manageability of the traditional process route. Based on the gun breech machining process, an intelligent process route design and planning system was developed using DEST and VC++. The system includes two functional modules: intelligent process route design and process route planning. The intelligent design module, through analysis of the gun breech machining process, encodes breech process knowledge to build the knowledge base and inference engine, from which the gun breech process route is output automatically. Building on the intelligent design module, the final process route is created, edited and managed in the process route planning module.
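The pairing of a knowledge base with an inference engine can be illustrated with a toy forward-chaining loop. The rules and operation names below are invented for illustration; the actual system was built with DEST and VC++ and encodes real breech process knowledge:

    # Toy knowledge base: each rule maps a set of satisfied facts to a
    # machining operation that becomes a new fact when the rule fires.
    RULES = [
        ({"blank_forged"}, "rough_turn_outer"),
        ({"rough_turn_outer"}, "drill_breech_bore"),
        ({"drill_breech_bore"}, "semi_finish_bore"),
        ({"semi_finish_bore", "heat_treated"}, "finish_bore"),
    ]

    def plan_route(initial_facts):
        """Forward-chain over RULES, returning operations in firing order."""
        facts = set(initial_facts)
        route = []
        fired = True
        while fired:
            fired = False
            for conditions, operation in RULES:
                if conditions <= facts and operation not in facts:
                    facts.add(operation)
                    route.append(operation)
                    fired = True
        return route

    print(plan_route({"blank_forged", "heat_treated"}))
    # ['rough_turn_outer', 'drill_breech_bore', 'semi_finish_bore', 'finish_bore']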
Developing Mathematical Processes (DMP). Field Test Evaluation, 1973-1974.
ERIC Educational Resources Information Center
Schall, William; And Others
The Developing Mathematical Processes (DMP) program was field-tested in the kindergarten and first three grades of one parochial and five public schools. DMP is an activity-based program developed around a comprehensive list of behavioral objectives. The program is concerned with the development of intuitive geometric concepts as well as…
A Neuroconstructivist Model of Past Tense Development and Processing
ERIC Educational Resources Information Center
Westermann, Gert; Ruh, Nicolas
2012-01-01
We present a neural network model of learning and processing the English past tense that is based on the notion that experience-dependent cortical development is a core aspect of cognitive development. During learning the model adds and removes units and connections to develop a task-specific final architecture. The model provides an integrated…
Reduced order model based on principal component analysis for process simulation and optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lang, Y.; Malacina, A.; Biegler, L.
2009-01-01
It is well-known that distributed parameter computational fluid dynamics (CFD) models provide more accurate results than conventional, lumped-parameter unit operation models used in process simulation. Consequently, the use of CFD models in process/equipment co-simulation offers the potential to optimize overall plant performance with respect to complex thermal and fluid flow phenomena. Because solving CFD models is time-consuming compared to the overall process simulation, we consider the development of fast reduced order models (ROMs) based on CFD results to closely approximate the high-fidelity equipment models in the co-simulation. By considering process equipment items with complicated geometries and detailed thermodynamic property models, this study proposes a strategy to develop ROMs based on principal component analysis (PCA). Taking advantage of commercial process simulation and CFD software (for example, Aspen Plus and FLUENT), we are able to develop systematic CFD-based ROMs for equipment models in an efficient manner. In particular, we show that the validity of the ROM is more robust within a well-sampled input domain and the CPU time is significantly reduced. Typically, it takes at most several CPU seconds to evaluate the ROM compared to several CPU hours or more to solve the CFD model. Two case studies, involving two power plant equipment examples, are described and demonstrate the benefits of using our proposed ROM methodology for process simulation and optimization.
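The ROM construction described (sample the CFD model over the input domain, compress the output fields with PCA, and map inputs to the retained modal coefficients with a cheap surrogate) can be sketched as follows. This is a minimal scikit-learn illustration with synthetic stand-in data, not the authors' Aspen Plus/FLUENT toolchain:

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import Ridge
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures

    # X: n_samples x n_inputs (operating conditions fed to the CFD model)
    # Y: n_samples x n_field (flattened CFD output field, one row per run)
    rng = np.random.default_rng(0)
    X = rng.uniform(size=(50, 3))              # stand-in for sampled inputs
    Y = np.sin(X @ rng.normal(size=(3, 500)))  # stand-in for CFD snapshots

    pca = PCA(n_components=5).fit(Y)           # retain the dominant modes
    coeffs = pca.transform(Y)                  # per-run modal coefficients

    # Map inputs to modal coefficients with a cheap surrogate regression.
    surrogate = make_pipeline(PolynomialFeatures(2), Ridge()).fit(X, coeffs)

    def rom_predict(x_new):
        """Inputs -> reconstructed output field, in milliseconds."""
        return pca.inverse_transform(surrogate.predict(np.atleast_2d(x_new)))

Evaluating rom_predict is essentially free, which is the point: the expensive CFD solves happen once, offline, to generate the training snapshots.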
NASA Astrophysics Data System (ADS)
Luo, Keqin
1999-11-01
The electroplating industry, with over 10,000 plating plants nationwide, is one of the major waste generators in industry. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are the major wastes generated daily in plants, which cost the industry tremendously in waste treatment and disposal and hinder the further development of the industry. It has therefore become urgent for the industry to identify the technically most effective and economically most attractive methodologies and technologies to minimize waste while maintaining production competitiveness. This dissertation aims at developing a novel waste minimization (WM) methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, together with an intelligent decision support tool. The WM methodology consists of two parts: a heuristic knowledge-based qualitative WM decision analysis and support methodology, and a fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules. This becomes the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This development allows us to perform a thorough process analysis of bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation, which are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner. The unique contribution of this research is that it is the first time the electroplating industry has been able to (i) make systematic use of available WM strategies, (ii) know quantitatively and accurately what is going on in each tank, and (iii) identify all WM opportunities through process improvement. This work has formed a solid foundation for the further development of powerful WM technologies for comprehensive WM in the following decade.
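Representing a WM strategy as a fuzzy rule, as the abstract describes, might look like the following sketch. The membership breakpoints and the rule itself are invented for illustration and are not from the dissertation:

    def triangular(x, a, b, c):
        """Triangular membership function peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    # Hypothetical rule: IF drag-out is HIGH and rinse efficiency is LOW
    # THEN the priority of a drag-out recovery measure is HIGH.
    def rule_priority(drag_out_ml_per_rack, rinse_efficiency):
        high_drag_out = triangular(drag_out_ml_per_rack, 20, 60, 100)
        low_rinse_eff = triangular(rinse_efficiency, 0.0, 0.3, 0.6)
        return min(high_drag_out, low_rinse_eff)  # Mamdani-style AND

    print(rule_priority(55, 0.25))  # rule activation strength, about 0.83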
Yeo, Junyeob; Hong, Sukjoon; Lee, Daehoo; Hotz, Nico; Lee, Ming-Tsang; Grigoropoulos, Costas P.; Ko, Seung Hwan
2012-01-01
Flexible electronics opened a new class of future electronics. The foldable, light and durable nature of flexible electronics allows vast flexibility in applications such as display, energy devices and mobile electronics. Even though conventional electronics fabrication methods are well developed for rigid substrates, direct application or slight modification of conventional processes for flexible electronics fabrication cannot work. The future flexible electronics fabrication requires totally new low-temperature process development optimized for flexible substrate and it should be based on new material too. Here we present a simple approach to developing a flexible electronics fabrication without using conventional vacuum deposition and photolithography. We found that direct metal patterning based on laser-induced local melting of metal nanoparticle ink is a promising low-temperature alternative to vacuum deposition– and photolithography-based conventional metal patterning processes. The “digital” nature of the proposed direct metal patterning process removes the need for expensive photomask and allows easy design modification and short turnaround time. This new process can be extremely useful for current small-volume, large-variety manufacturing paradigms. Besides, simple, scalable, fast and low-temperature processes can lead to cost-effective fabrication methods on a large-area polymer substrate. The developed process was successfully applied to demonstrate high-quality Ag patterning (2.1 µΩ·cm) and high-performance flexible organic field effect transistor arrays. PMID:22900011
Ilott, Irene; Booth, Andrew; Rick, Jo; Patterson, Malcolm
2010-06-01
To explore how nurses, midwives and health visitors contribute to the development, implementation and audit of protocol-based care. Protocol-based care refers to the use of documents that set standards for clinical care processes with the intent of reducing unacceptable variations in practice. Documents such as protocols, clinical guidelines and care pathways underpin evidence-based practice throughout the world. An interpretative review using the five-stage systematic literature review process. The data sources were the British Nursing Index, CINAHL, EMBASE, MEDLINE and Web of Science from onset to 2005. The Journal of Integrated Care Pathways was hand searched (1997-June 2006). Thirty three studies about protocol-based care in the United Kingdom were appraised using the Qualitative Assessment and Review Instrument (QARI version 2). The literature was synthesized inductively and deductively, using an official 12-step guide for development as a framework for the deductive synthesis. Most papers were descriptive, offering practitioner knowledge and positive findings about a locally developed and owned protocol-based care. The majority were instigated in response to clinical need or service re-design. Development of protocol-based care was a non-linear, idiosyncratic process, with steps omitted, repeated or completed in a different order. The context and the multiple purposes of protocol-based care influenced the development process. Implementation and sustainability were rarely mentioned, or theorised as a change. The roles and activities of nurses were so understated as to be almost invisible. There were notable gaps in the literature about the resource use costs, the engagement of patients in the decision-making process, leadership and the impact of formalisation and new roles on inter-professional relations. Documents that standardise clinical care are part of the history of nursing as well as contemporary evidence-based care and expanded roles. Considering the proliferation and contested nature of protocol-based care, the dearth of literature about the contribution, experience and outcomes for nurses, midwives and health visitors is noteworthy and requires further investigation. © 2010 Elsevier Ltd. All rights reserved.
Khanali, Majid; Mobli, Hossein; Hosseinzadeh-Bandbafha, Homa
2017-12-01
In this study, an artificial neural network (ANN) model was developed for predicting the yield and life cycle environmental impacts based on energy inputs required in processing of black tea, green tea, and oolong tea in Guilan province of Iran. A life cycle assessment (LCA) approach was used to investigate the environmental impact categories of processed tea based on the cradle to gate approach, i.e., from production of input materials using raw materials to the gate of tea processing units, i.e., packaged tea. Thus, all the tea processing operations such as withering, rolling, fermentation, drying, and packaging were considered in the analysis. The initial data were obtained from tea processing units while the required data about the background system was extracted from the EcoInvent 2.2 database. LCA results indicated that diesel fuel and corrugated paper box used in drying and packaging operations, respectively, were the main hotspots. The black tea processing unit caused the highest pollution among the three processing units. Three feed-forward back-propagation ANN models based on the Levenberg-Marquardt training algorithm with two hidden layers accompanied by sigmoid activation functions and a linear transfer function in the output layer were applied for the three types of processed tea. The neural networks were developed based on energy equivalents of eight different input parameters (energy equivalents of fresh tea leaves, human labor, diesel fuel, electricity, adhesive, carton, corrugated paper box, and transportation) and 11 output parameters (yield, global warming, abiotic depletion, acidification, eutrophication, ozone layer depletion, human toxicity, freshwater aquatic ecotoxicity, marine aquatic ecotoxicity, terrestrial ecotoxicity, and photochemical oxidation). The results showed that the developed ANN models with R² values in the range of 0.878 to 0.990 had excellent performance in predicting all the output variables based on inputs. Energy consumption for processing of green tea, oolong tea, and black tea were calculated as 58,182, 60,947, and 66,301 MJ per ton of dry tea, respectively.
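The described architecture (8 energy-equivalent inputs, two hidden layers with sigmoid activations, a linear output layer, 11 outputs) maps directly onto a standard multilayer perceptron. A minimal sketch with synthetic placeholder data follows; note that scikit-learn does not implement Levenberg-Marquardt training, so its default solver stands in here:

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(1)
    X = rng.uniform(size=(120, 8))   # 8 energy-equivalent inputs per unit
    Y = rng.uniform(size=(120, 11))  # yield + 10 impact categories

    model = make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(16, 8),  # two hidden layers
                     activation="logistic",       # sigmoid units
                     max_iter=5000, random_state=0),
    )
    model.fit(X, Y)  # the linear (identity) output layer is the default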
Lau, Nathan; Jamieson, Greg A; Skraaning, Gyrd
2016-07-01
We introduce Process Overview, a situation awareness characterisation of the knowledge derived from monitoring process plants. Process Overview is based on observational studies of process control work in the literature. The characterisation is applied to develop a query-based measure called the Process Overview Measure. The goal of the measure is to improve coupling between situation and awareness according to process plant properties and operator cognitive work. A companion article presents the empirical evaluation of the Process Overview Measure in a realistic process control setting. The Process Overview Measure demonstrated sensitivity and validity by revealing significant effects of experimental manipulations that corroborated with other empirical results. The measure also demonstrated adequate inter-rater reliability and practicality for measuring SA based on data collected by process experts. Practitioner Summary: The Process Overview Measure is a query-based measure for assessing operator situation awareness from monitoring process plants in representative settings.
Robust Low-Cost Cathode for Commercial Applications
NASA Technical Reports Server (NTRS)
Patterson, Michael J.
2007-01-01
Under funding from the NASA Commercial Technology Office, a cathode assembly was designed, developed, fabricated, and tested for use in plasma sources for ground-based materials processing applications. The cathode development activity relied on the large prior NASA investment and successful development of high-current, high-efficiency, long-life hollow cathodes for use on the International Space Station Plasma Contactor System. The hollow cathode was designed and fabricated based on known engineering criteria and manufacturing processes for compatibility with the requirements of the plasma source. The transfer of NASA GRC-developed hollow cathode technology for use as an electron emitter in the commercial plasma source is anticipated to yield a significant increase in process control, while eliminating the present issues of electron emitter lifetime and contamination.
Vicente, Tiago; Mota, José P B; Peixoto, Cristina; Alves, Paula M; Carrondo, Manuel J T
2011-01-01
The advent of advanced therapies in the pharmaceutical industry has moved the spotlight onto virus-like particles and viral vectors produced in cell culture, holding great promise for a myriad of clinical targets, including cancer prophylaxis and treatment. Even though a couple of cases have reached the clinic, these products have yet to overcome a number of biological and technological challenges before broad utilization. Concerning the manufacturing processes, there is significant research focusing on the optimization of current cell culture systems and, more recently, on developing scalable downstream processes to generate material for pre-clinical and clinical trials. We review the current options for downstream processing of these complex biopharmaceuticals and underline current advances on knowledge-based toolboxes proposed for rational optimization of their processing. Rational tools developed to increase the yet scarce knowledge on the purification processes of complex biologicals are discussed as an alternative to the empirical, "black-box" strategies classically used for process development. Innovative methodologies based on surface plasmon resonance, dynamic light scattering, scale-down high-throughput screening and mathematical modeling for supporting ion-exchange chromatography show great potential for a more efficient and cost-effective process design, optimization and equipment prototyping. Copyright © 2011 Elsevier Inc. All rights reserved.
Information Processing Theory and Conceptual Development.
ERIC Educational Resources Information Center
Schroder, H. M.
An educational program based upon information processing theory has been developed at Southern Illinois University. The integrating theme was the development of conceptual ability for coping with social and personal problems. It utilized student information search and concept formation as foundations for discussion and judgment and was organized…
ERIC Educational Resources Information Center
Jesness, Bradley
This paper examines concepts in information-processing theory which are likely to be relevant to development and characterizes the methods and data upon which the concepts are based. Among the concepts examined are those which have slight empirical grounds. Other concepts examined are those which seem to have empirical bases but which are…
[Definition and stabilization of processes II. Clinical Processes in a Urology Department].
Pascual, Carlos; Luján, Marcos; Mora, José Ramón; Diz, Manuel Ramón; Martín, Carlos; López, María Carmen
2015-01-01
New models in clinical management seek a clinical practice based on quality, efficacy and efficiency, avoiding variability and improvisation. In this paper we develop one of the most frequent clinical processes in our specialty: the process based on DRG 311, transurethral procedures without complications. We describe its components: the stabilization form, the clinical trajectory, cost calculation, and finally the process flowchart.
Research and Analysis of Image Processing Technologies Based on DotNet Framework
NASA Astrophysics Data System (ADS)
Ya-Lin, Song; Chen-Xi, Bai
Microsoft .NET is one of the most popular program development tools. This paper presents a detailed analysis of the advantages and disadvantages of several image processing techniques under .NET, applying the same algorithm in each programming experiment. The results show that the two most efficient methods are unsafe pointers and Direct3D; Direct3D is suited to 3D simulation development, while the other techniques, though useful in some fields, are inefficient and not suited to real-time processing. The experimental results in this paper will be helpful for image processing and simulation projects based on .NET, and have strong practical applicability.
Welding process modelling and control
NASA Technical Reports Server (NTRS)
Romine, Peter L.; Adenwala, Jinen A.
1993-01-01
The research and analysis performed, the software developed, and the hardware/software recommendations made during 1992 in the development of the PC-based data acquisition system supporting Welding Process Modeling and Control are reported. A need was identified by the Metals Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and analysis system customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for data acquisition and analysis. Although the WMS supports many of the functions associated with process control, it is not intended to be used for welding process control.
Development and Validation of a Shear Punch Test Fixture
2013-08-01
composites (MMC) manufactured by friction stir processing (FSP) that are being developed as part of a Technology Investment Fund (TIF) project, as the...leading a team of government departments and academics to develop a friction stir processing (FSP) based procedure to create metal matrix composite... friction stir process to fabricate surface metal matrix composites in aluminum alloys for potential application in light armoured vehicles. The
Güttler, Karen; Lehmann, Almut
2003-06-01
This paper is based on the project "nursing process, standardisation and quality in nursing care", which is funded by the BMBF. This venture aims to develop and establish a structure for the recording and documentation of nursing processes in terms of a typology, and to standardise patient data for exchange. The typology results from both the outcomes of the current-state analysis of 128 patients in hospitals, homes for the elderly and community health care centers, and research on nursing classifications. The contents of the typology have been developed in co-operation with nurses working in such institutions. The structure and transfer of the data set will be realised by an IT-based media network. The range of the project is regional, national and international. In this project the Bremen Institute of Industrial Technology and Applied Work Science (BIBA-ATOP) was responsible for the project management and the development of the IT-based structure of the typology. The contents of the typology have been developed by the Institute of Applied Nursing Research (iap).
Natural Inspired Intelligent Visual Computing and Its Application to Viticulture.
Ang, Li Minn; Seng, Kah Phooi; Ge, Feng Lu
2017-05-23
This paper presents an investigation of nature-inspired intelligent computing and its corresponding application to visual information processing systems for viticulture. The paper has three contributions: (1) a review of visual information processing applications for viticulture; (2) the development of nature-inspired computing algorithms based on artificial immune system (AIS) techniques for grape berry detection; and (3) the application of the developed algorithms to real-world grape berry images captured in natural conditions from vineyards in Australia. The AIS algorithms in (2) were developed based on a nature-inspired clonal selection algorithm (CSA) which is able to detect the arcs in the berry images with precision, based on a fitness model. The detected arcs are then extended to multiple-arc and ring detection for the berry detection application. The performance of the developed algorithms was compared with traditional image processing algorithms like the circular Hough transform (CHT) and other well-known circle detection methods. The proposed AIS approach gave an F-score of 0.71 compared with F-scores of 0.28 and 0.30 for the CHT and a parameter-free circle detection technique (RPCD) respectively.
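A clonal selection algorithm of the general kind described (candidate solutions cloned and hypermutated according to a fitness model over edge points) can be sketched as below. This simplified version evolves full circles rather than arcs, and all parameters are illustrative, not the paper's:

    import numpy as np

    def fitness(circle, pts, tol=1.5):
        """Fraction of edge points lying within tol pixels of the circle."""
        cx, cy, r = circle
        d = np.abs(np.hypot(pts[:, 0] - cx, pts[:, 1] - cy) - r)
        return float(np.mean(d < tol))

    def clonal_selection(pts, pop=30, gens=50, n_clones=5, seed=0):
        """Evolve (cx, cy, r) antibodies toward the best-supported circle."""
        rng = np.random.default_rng(seed)
        lo, hi = pts.min(axis=0), pts.max(axis=0)
        P = np.column_stack([
            rng.uniform(lo[0], hi[0], pop),
            rng.uniform(lo[1], hi[1], pop),
            rng.uniform(5.0, (hi - lo).max(), pop),
        ])
        for _ in range(gens):
            f = np.array([fitness(c, pts) for c in P])
            elite = P[np.argsort(f)[::-1][: pop // 3]]   # select the best
            clones = np.repeat(elite, n_clones, axis=0)  # clonal expansion
            clones += rng.normal(scale=2.0, size=clones.shape)  # hypermutation
            P = np.vstack([elite, clones])[:pop]         # next repertoire
        f = np.array([fitness(c, pts) for c in P])
        return P[f.argmax()]

    # Synthetic edge points on a circle of radius 20 centred at (50, 40):
    theta = np.linspace(0, 2 * np.pi, 200)
    pts = np.column_stack([50 + 20 * np.cos(theta), 40 + 20 * np.sin(theta)])
    print(clonal_selection(pts))  # approaches (50, 40, 20)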
Building a Knowledge to Action Program in Stroke Rehabilitation.
Janzen, Shannon; McIntyre, Amanda; Richardson, Marina; Britt, Eileen; Teasell, Robert
2016-09-01
The knowledge to action (KTA) process proposed by Graham et al. (2006) is a framework to facilitate the development and application of research evidence into clinical practice. The KTA process consists of the knowledge creation cycle and the action cycle. The Evidence Based Review of Stroke Rehabilitation is a foundational part of the knowledge creation cycle and has helped guide the development of best practice recommendations in stroke. The Rehabilitation Knowledge to Action Project is an audit-feedback process for the clinical implementation of best practice guidelines, which follows the action cycle. The objective of this review was to: (1) contextualize the Evidence Based Review of Stroke Rehabilitation and Rehabilitation Knowledge to Action Project within the KTA model and (2) show how this process led to improved evidence-based practice in stroke rehabilitation. Through this process, a single centre was able to change clinical practice and promote a culture that supports the use of evidence-based practices in stroke rehabilitation.
Local spatio-temporal analysis in vision systems
NASA Astrophysics Data System (ADS)
Geisler, Wilson S.; Bovik, Alan; Cormack, Lawrence; Ghosh, Joydeep; Gildeen, David
1994-07-01
The aims of this project are the following: (1) develop a physiologically and psychophysically based model of low-level human visual processing (key components of which are local frequency coding mechanisms); (2) develop image models and image-processing methods based upon local frequency coding; (3) develop algorithms for performing certain complex visual tasks based upon local frequency representations; (4) develop models of human performance in certain complex tasks based upon our understanding of low-level processing; and (5) develop a computational testbed for implementing, evaluating and visualizing the proposed models and algorithms, using a massively parallel computer. Progress has been substantial on all aims. The highlights include the following: (1) completion of a number of psychophysical and physiological experiments revealing new, systematic and exciting properties of the primate (human and monkey) visual system; (2) further development of image models that can accurately represent the local frequency structure in complex images; (3) near completion of the construction of the Texas Active Vision Testbed; (4) development and testing of several new computer vision algorithms dealing with shape-from-texture, shape-from-stereo, and depth-from-focus; (5) implementation and evaluation of several new models of human visual performance; and (6) evaluation, purchase and installation of a MasPar parallel computer.
What will the future of cloud-based astronomical data processing look like?
NASA Astrophysics Data System (ADS)
Green, Andrew W.; Mannering, Elizabeth; Harischandra, Lloyd; Vuong, Minh; O'Toole, Simon; Sealey, Katrina; Hopkins, Andrew M.
2017-06-01
Astronomy is rapidly approaching an impasse: very large datasets require remote or cloud-based parallel processing, yet many astronomers still try to download the data and develop serial code locally. Astronomers understand the need for change, but the hurdles remain high. We are developing a data archive designed from the ground up to simplify and encourage cloud-based parallel processing. While the volume of data we host remains modest by some standards, it is still large enough that download and processing times are measured in days and even weeks. We plan to implement a Python-based, notebook-like interface that automatically parallelises execution. Our goal is to provide an interface sufficiently familiar and user-friendly that it encourages the astronomer to run their analysis on our system in the cloud: astroinformatics as a service. We describe how our system addresses the approaching impasse in astronomy using the SAMI Galaxy Survey as an example.
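The kind of interface sketched in the abstract, where a notebook cell fans a per-object analysis out over many workers instead of a serial loop, can be approximated with the Python standard library. A minimal illustration with a placeholder analysis function (the real system's API is not specified in the abstract):

    from concurrent.futures import ProcessPoolExecutor

    def measure_galaxy(cube_path):
        """Placeholder per-object analysis (e.g. one survey data cube)."""
        return len(cube_path)  # stand-in for a real measurement

    def parallel_map(func, items, workers=8):
        """What a notebook cell could call instead of a serial for-loop."""
        with ProcessPoolExecutor(max_workers=workers) as pool:
            return list(pool.map(func, items))

    if __name__ == "__main__":
        print(parallel_map(measure_galaxy, ["cube_a.fits", "cube_b.fits"]))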
ERIC Educational Resources Information Center
Moallem, Mahnaz
2001-01-01
Provides an overview of the process of designing and developing a Web-based course using instructional design principles and models, including constructivist and objectivist theories. Explains the process of implementing an instructional design model in designing a Web-based undergraduate course and evaluates the model based on course evaluations.…
The development of a fear of falling interdisciplinary intervention program
Gomez, Fernando; Curcio, Carmen-Lucia
2007-01-01
Objective: To describe the development process of a protocol for a fear of falling interdisciplinary intervention program based on the main factors associated with fear of falling. Design/methods: The process of developing the protocol consisted of defining the target population, selecting the initial assessment components, and adapting the intervention program based on findings about fear of falling and restriction of activities in this population. Settings: A university-affiliated outpatient vertigo, dizziness and falls clinic in the coffee-growing zone of the Colombian Andes Mountains. Results: An intervention program was developed based on three main conceptual models of falling: a medical intervention based on a biomedical and pathophysiological model; a physiotherapeutic intervention based on a postural control model; and a psychological intervention based on a biological-behavioral model. Conclusion: This interdisciplinary fear of falling intervention program is based on the particular characteristics of the target population, with differences in the inclusion criteria and the program intervention components, and with emphasis on medical (recurrent falls and dizziness evaluation and management), psychological (cognitive-behavioral therapy) and physiotherapeutic (balance and transfer training) components. PMID:18225468
NASA Technical Reports Server (NTRS)
Mcmanus, Shawn; Mcdaniel, Michael
1989-01-01
Planning for payload processing has always been difficult and time-consuming. With the advent of Space Station Freedom and its capability to support a myriad of complex payloads, the planning to support this ground processing maze involves thousands of man-hours of often tedious data manipulation. To provide the capability to analyze various processing schedules, an object-oriented knowledge-based simulation environment called the Advanced Generic Accommodations Planning Environment (AGAPE) is being developed. With the baseline system nearly complete, the emphasis in this paper is directed toward rule definition and its relation to model development and simulation. The focus is specifically on the methodologies implemented during knowledge acquisition, analysis, and representation within the AGAPE rule structure. A model is provided to illustrate the concepts presented. The approach demonstrates a framework for AGAPE rule development to assist expert system development.
A PBOM configuration and management method based on templates
NASA Astrophysics Data System (ADS)
Guo, Kai; Qiao, Lihong; Qie, Yifan
2018-03-01
The design of the Process Bill of Materials (PBOM) holds a pivotal position in the product development process. The requirements of PBOM configuration design and management for complex products are analysed in this paper, including the need to reuse configuration procedures and the urgent need to manage the huge quantity of product-family PBOM data. Based on this analysis, a function framework for PBOM configuration and management has been established. Configuration templates and modules are defined in the framework to support the customization and reuse of the configuration process. The configuration process of a detection sensor PBOM is shown as an illustrative case at the end. Rapid and agile PBOM configuration and management can be achieved using this template-based method, which is of vital significance for improving development efficiency for complex products.
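One plausible reading of a configuration template, a reusable routing that individual products customise, is sketched below; the class and step names are invented for illustration and are not from the paper:

    from dataclasses import dataclass, field

    @dataclass
    class PBOMTemplate:
        """Hypothetical reusable configuration template for a part family."""
        name: str
        steps: list = field(default_factory=list)  # ordered process steps

        def instantiate(self, overrides=None):
            """Customise a copy of the template routing for one product."""
            steps = list(self.steps)
            for index, step in (overrides or {}).items():
                steps[index] = step
            return steps

    sensor = PBOMTemplate(
        "detection_sensor",
        ["machine_housing", "mount_die", "bond_wires", "calibrate"],
    )
    print(sensor.instantiate({3: "calibrate_extended_range"}))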
On the Development of a Hospital-Patient Web-Based Communication Tool: A Case Study From Norway.
Granja, Conceição; Dyb, Kari; Bolle, Stein Roald; Hartvigsen, Gunnar
2015-01-01
Surgery cancellations are undesirable in hospital settings as they increase costs, reduce productivity and efficiency, and directly affect the patient. The problem of elective surgery cancellations in a North Norwegian University Hospital is addressed. Based on a three-step methodology conducted at the hospital, the preoperative planning process was modeled taking into consideration the narratives from different health professions. From the analysis of the generated process models, it is concluded that in order to develop a useful patient-centered web-based communication tool, it is necessary to fully understand how hospitals plan and organize surgeries today. Moreover, process reengineering is required to generate a standard process that can serve as a tool for health ICT designers to define the requirements for a robust and useful system.
Problem Based Learning and the scientific process
NASA Astrophysics Data System (ADS)
Schuchardt, Daniel Shaner
This research project was developed to inspire students to constructively use problem based learning and the scientific process to learn middle school science content. The student population in this study consisted of male and female seventh grade students. Students were presented with authentic problems that were connected to the physical and chemical properties of matter. The intent of the study was to have students use the scientific process of looking at existing knowledge, generating learning issues or questions about the problems, and then developing a course of action to research and design experiments to model resolutions to the authentic problems. It was expected that students would improve their ability to actively engage with others in a problem solving process to achieve a deeper understanding of Michigan's 7th Grade Level Content Expectations, the Next Generation Science Standards, and a scientific process. Problem based learning was statistically effective in students' learning of the scientific process. Students showed statistically significant improvement from pretest to posttest scores. The teaching method of Problem Based Learning was effective for seventh grade science students at Dowagiac Middle School.
Ethanol precipitation for purification of recombinant antibodies.
Tscheliessnig, Anne; Satzer, Peter; Hammerschmidt, Nikolaus; Schulz, Henk; Helk, Bernhard; Jungbauer, Alois
2014-10-20
Currently, the gold standard for the purification of recombinant humanized antibodies (rhAbs) from CHO cell culture is protein A chromatography. However, due to increasing rhAb titers, alternative methods have come into focus. A new strategy for purification of recombinant human antibodies from CHO cell culture supernatant based on cold ethanol precipitation (CEP) and CaCl2 precipitation has been developed. This method is based on cold ethanol precipitation, the process used for purification of antibodies and other components from blood plasma. We demonstrate the applicability of the developed process for four different antibodies, achieving yield and purity similar to a protein A chromatography based process. The process can be further improved using anion-exchange chromatography in flowthrough mode, e.g. a monolith, as a last step, so that residual host cell protein is reduced to a minimum. Besides the ethanol-based process, our data also suggest that ethanol could be replaced with methanol or isopropanol. The process is suited for continuous operation. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.
Validity and Reliability Study of the Turkish Version of Ego Identity Process Questionnaire
ERIC Educational Resources Information Center
Morsünbül, Ümit; Atak, Hasan
2013-01-01
The main developmental task of the adolescent period is identity development. Marcia defined four identity statuses based on the processes of exploration and commitment: achievement, moratorium, foreclosure and diffusion. Certain scales were developed to measure identity development. Another questionnaire that evaluates both the four identity statuses and the…
Rotational Molding Process Technician. Instructional Program Package.
ERIC Educational Resources Information Center
El Paso Community Coll., TX.
This curriculum package contains materials developed through a partnership of the Association of Rotational Molders, El Paso Community College (Texas), and the College of DuPage (Illinois). The materials, which were developed during a 2-day DACUM (Developing a Curriculum) process, are based on national skill standards and designed for…
Echtermeyer, Alexander; Amar, Yehia; Zakrzewski, Jacek; Lapkin, Alexei
2017-01-01
A recently described C(sp³)-H activation reaction to synthesise aziridines was used as a model reaction to demonstrate the methodology of developing a process model using model-based design of experiments (MBDoE) and self-optimisation approaches in flow. The two approaches are compared in terms of experimental efficiency. The self-optimisation approach required the least number of experiments to reach the specified objectives of cost and product yield, whereas the MBDoE approach enabled a rapid generation of a process model.
NASA Astrophysics Data System (ADS)
Liu, Zhenchen; Lu, Guihua; He, Hai; Wu, Zhiyong; He, Jian
2018-01-01
Reliable drought prediction is fundamental for water resource managers to develop and implement drought mitigation measures. Considering that drought development is closely related to the spatial-temporal evolution of large-scale circulation patterns, we developed a conceptual prediction model of seasonal drought processes based on atmospheric and oceanic standardized anomalies (SAs). Empirical orthogonal function (EOF) analysis is first applied to drought-related SAs at 200 and 500 hPa geopotential height (HGT) and sea surface temperature (SST). Subsequently, SA-based predictors are built based on the spatial pattern of the first EOF modes. This drought prediction model is essentially the synchronous statistical relationship between 90-day-accumulated atmospheric-oceanic SA-based predictors and SPI3 (3-month standardized precipitation index), calibrated using a simple stepwise regression method. Predictor computation is based on forecast atmospheric-oceanic products retrieved from the NCEP Climate Forecast System Version 2 (CFSv2), indicating that the lead time of the model depends on that of CFSv2. The model can make seamless drought predictions for operational use after a year-to-year calibration. Model application to four recent severe regional drought processes in China indicates its good performance in predicting seasonal drought development, despite its weakness in predicting drought severity. Overall, the model can be a worthy reference for seasonal water resource management in China.
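The prediction pipeline (EOF analysis of the SA fields, leading modes as predictors, stepwise regression against SPI3) can be sketched with PCA standing in for EOF analysis and a greedy forward selection standing in for stepwise regression. All data below are synthetic placeholders:

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(2)
    sa_fields = rng.normal(size=(240, 1000))  # months x grid cells (HGT/SST SAs)
    spi3 = rng.normal(size=240)               # target drought index

    # EOF analysis is PCA on the anomaly fields; the leading PCs act as
    # the 90-day-accumulated SA-based predictors (synthetic stand-ins here).
    pcs = PCA(n_components=10).fit_transform(sa_fields)

    def stepwise_forward(X, y, max_terms=4):
        """Greedy forward selection by in-sample R^2, a simple stand-in
        for the paper's stepwise regression."""
        selected, remaining = [], list(range(X.shape[1]))
        while remaining and len(selected) < max_terms:
            scores = [(LinearRegression()
                       .fit(X[:, selected + [j]], y)
                       .score(X[:, selected + [j]], y), j) for j in remaining]
            best_score, best_j = max(scores)
            selected.append(best_j)
            remaining.remove(best_j)
        return selected, LinearRegression().fit(X[:, selected], y)

    cols, model = stepwise_forward(pcs, spi3)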
Bridging process-based and empirical approaches to modeling tree growth
Harry T. Valentine; Annikki Makela; Annikki Makela
2005-01-01
The gulf between process-based and empirical approaches to modeling tree growth may be bridged, in part, by the use of a common model. To this end, we have formulated a process-based model of tree growth that can be fitted and applied in an empirical mode. The growth model is grounded in pipe model theory and an optimal control model of crown development. Together, the...
Technology CAD for integrated circuit fabrication technology development and technology transfer
NASA Astrophysics Data System (ADS)
Saha, Samar
2003-07-01
In this paper, systematic simulation-based methodologies for integrated circuit (IC) manufacturing technology development and technology transfer are presented. In technology development, technology computer-aided design (TCAD) tools are used to optimize device and process parameters to develop a new generation of IC manufacturing technology by reverse engineering from the target product specifications. In technology transfer to a manufacturing co-location, TCAD is used for process centering with respect to the high-volume manufacturing equipment of the target manufacturing facility. A quantitative model is developed to demonstrate the potential benefits of the simulation-based methodology in reducing the cycle time and cost of typical technology development and technology transfer projects over traditional practices. The strategy for predictive simulation to improve the effectiveness of a TCAD-based project is also discussed.
NASA Astrophysics Data System (ADS)
Nasution, Derlina; Syahreni Harahap, Putri; Harahap, Marabangun
2018-03-01
This research aims to: (1) develop physics learning instruments (lesson plan, worksheet, student book, teacher's guide book, and test instrument) for a scientific inquiry learning model based on Batak culture, intended to improve students' science process skills and curiosity; and (2) describe the quality of the resulting instruments for use in high school. This is development research: the physics learning instruments were developed using a model adapted from Thiagarajan, Semmel, and Semmel. The stages traversed until a valid, practical, and effective set of instruments was obtained comprise (1) the definition phase, (2) the planning phase, and (3) the development phase. Testing included expert validation, small-group trials, and limited classroom trials, the latter conducted in class X MIA at SMAN 1 Padang Bolak. This research resulted in: (1) physics learning instruments on static fluids for grade 10 (lesson plan, worksheet, student book, teacher's guide book, and test instrument) of a quality worthy of use in the learning process; and (2) instrument components that meet the criteria of validity, practicality, and effectiveness for improving students' science process skills and curiosity.
Lindgren, Helena; Lundin-Olsson, Lillemor; Pohl, Petra; Sandlund, Marlene
2014-01-01
Five physiotherapists organised a user-centric design process for a knowledge-based support system for promoting exercise and preventing falls. The process integrated focus group studies with 17 older adults and prototyping. The transformation of informal medical and rehabilitation expertise and older adults' experiences into formal information and process models during development was studied. As a tool they used ACKTUS, a development platform for knowledge-based applications. The process became agile and incremental, partly due to the diversity of expectations and preferences among both older adults and physiotherapists, and the participatory approach to design and development. In addition, there was a need to develop the knowledge content alongside the formal models and their presentations, which allowed the participants to test hands-on and evaluate the ideas, content, and design. The resulting application is modular, extendable, flexible, and adaptable to the individual end user. Moreover, the physiotherapists are able to modify the information and process models and in this way further develop the application. The main constraint was found to be the lack of support for the initial phase of concept modelling, which led to a redesigned user interface and functionality of ACKTUS.
Architecture for Survivable System Processing (ASSP)
NASA Astrophysics Data System (ADS)
Wood, Richard J.
1991-11-01
The Architecture for Survivable System Processing (ASSP) Program is a multi-phase effort to implement Department of Defense (DOD) and commercially developed high-tech hardware, software, and architectures for reliable space avionics and ground-based systems. System configuration options provide processing capabilities to address Time Dependent Processing (TDP), Object Dependent Processing (ODP), and Mission Dependent Processing (MDP) requirements through Open System Architecture (OSA) alternatives that allow for the enhancement, incorporation, and capitalization of a broad range of development assets. High-technology developments in hardware, software, and networking models address technology challenges of long processor lifetimes, fault tolerance, reliability, throughput, memories, radiation hardening, size, weight, power (SWAP), and security. Hardware and software design, development, and implementation focus on the interconnectivity/interoperability of an open system architecture and are being developed to apply new technology in practical OSA components. To ensure a widely acceptable architecture capable of interfacing with various commercial and military components, this program provides for regular interactions with standardization working groups (e.g., the International Standards Organization (ISO), the American National Standards Institute (ANSI), the Society of Automotive Engineers (SAE), and the Institute of Electrical and Electronics Engineers (IEEE)). Selection of a viable open architecture is based on the widely accepted standards that implement the ISO/OSI Reference Model.
Architecture for Survivable System Processing (ASSP)
NASA Technical Reports Server (NTRS)
Wood, Richard J.
1991-01-01
The Architecture for Survivable System Processing (ASSP) Program is a multi-phase effort to implement Department of Defense (DOD) and commercially developed high-tech hardware, software, and architectures for reliable space avionics and ground-based systems. System configuration options provide processing capabilities to address Time Dependent Processing (TDP), Object Dependent Processing (ODP), and Mission Dependent Processing (MDP) requirements through Open System Architecture (OSA) alternatives that allow for the enhancement, incorporation, and capitalization of a broad range of development assets. High-technology developments in hardware, software, and networking models address technology challenges of long processor lifetimes, fault tolerance, reliability, throughput, memories, radiation hardening, size, weight, power (SWAP), and security. Hardware and software design, development, and implementation focus on the interconnectivity/interoperability of an open system architecture and are being developed to apply new technology in practical OSA components. To ensure a widely acceptable architecture capable of interfacing with various commercial and military components, this program provides for regular interactions with standardization working groups (e.g., the International Standards Organization (ISO), the American National Standards Institute (ANSI), the Society of Automotive Engineers (SAE), and the Institute of Electrical and Electronics Engineers (IEEE)). Selection of a viable open architecture is based on the widely accepted standards that implement the ISO/OSI Reference Model.
An Evidential Reasoning-Based CREAM to Human Reliability Analysis in Maritime Accident Process.
Wu, Bing; Yan, Xinping; Wang, Yang; Soares, C Guedes
2017-10-01
This article proposes a modified cognitive reliability and error analysis method (CREAM) for estimating the human error probability in the maritime accident process on the basis of an evidential reasoning approach. The modified CREAM is developed to precisely quantify the linguistic variables of the common performance conditions and to overcome the problem of ignoring the uncertainty caused by incomplete information in existing CREAM models. Moreover, this article views maritime accident development from a sequential perspective, and a scenario- and barrier-based framework is proposed to describe the maritime accident process. The evidential reasoning-based CREAM approach, together with the proposed accident development framework, is applied to the human reliability analysis of a ship capsizing accident. It will facilitate subjective human reliability analysis in different engineering systems where uncertainty exists in practice. © 2017 Society for Risk Analysis.
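As background, the conventional CREAM adjustment that such work builds on can be caricatured in a few lines: a nominal human error probability is scaled by multipliers attached to the nine common performance conditions (CPCs). The numbers below are placeholders, and the article's evidential-reasoning treatment of CPC uncertainty is not reproduced here.

```python
# Minimal CREAM-style sketch: scale a nominal error probability by
# illustrative CPC multipliers (<1 improves reliability, >1 degrades it).
BASE_HEP = 0.01  # hypothetical nominal cognitive failure probability

cpc_multipliers = {
    "adequacy_of_organisation": 0.8,
    "working_conditions": 1.0,
    "adequacy_of_MMI": 1.0,
    "availability_of_procedures": 1.2,
    "number_of_simultaneous_goals": 1.5,
    "available_time": 2.0,
    "time_of_day": 1.0,
    "adequacy_of_training": 0.8,
    "crew_collaboration_quality": 1.0,
}

hep = BASE_HEP
for factor in cpc_multipliers.values():
    hep *= factor
print(f"adjusted HEP: {hep:.4f}")
```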
Schulz, Amy J.; Israel, Barbara A.; Coombe, Chris M.; Gaines, Causandra; Reyes, Angela G.; Rowe, Zachary; Sand, Sharon; Strong, Larkin L.; Weir, Sheryl
2010-01-01
The elimination of persistent health inequities requires the engagement of multiple perspectives, resources and skills. Community-based participatory research is one approach to developing action strategies that promote health equity by addressing contextual as well as individual level factors, and that can contribute to addressing more fundamental factors linked to health inequity. Yet many questions remain about how to implement participatory processes that engage local insights and expertise, are informed by the existing public health knowledge base, and build support across multiple sectors to implement solutions. We describe a CBPR approach used to conduct a community assessment and action planning process, culminating in development of a multilevel intervention to address inequalities in cardiovascular disease in Detroit, Michigan. We consider implications for future efforts to engage communities in developing strategies toward eliminating health inequities. PMID:21873580
The Complexity of Character: An Ability-Based Model for Higher Education
ERIC Educational Resources Information Center
Graham, Sandra E.; Diez, Mary E.
2015-01-01
Character development in higher education is a complex process. This process has often been delegated to a single course on ethics or courses on religion. The authors of this article pose an alternative higher educational process whereby character development is rooted in a series of abilities that are contextualized throughout the entire…
Developing Mathematical Processes (DMP). Field Test Evaluation, 1972-1973.
ERIC Educational Resources Information Center
Schall, William E.; And Others
The field test of the Developing Mathematical Processes (DMP) program was conducted jointly by the Falconer Central School, St. Mary's Elementary School in Dunkirk, New York, and the Teacher Education Research Center at the State University College in Fredonia, New York. DMP is a research-based, innovative, process-oriented elementary mathematics…
Advanced biologically plausible algorithms for low-level image processing
NASA Astrophysics Data System (ADS)
Gusakova, Valentina I.; Podladchikova, Lubov N.; Shaposhnikov, Dmitry G.; Markin, Sergey N.; Golovan, Alexander V.; Lee, Seong-Whan
1999-08-01
At present, in computer vision, the approach based on modeling biological vision mechanisms is being extensively developed. However, up to now, real-world image processing has had no effective solution within the frameworks of either biologically inspired or conventional approaches. Evidently, new algorithms and system architectures based on advanced biological motivation should be developed to solve the computational problems related to this visual task. The basic problem that must be solved to create an effective artificial visual system for processing real-world images is the search for new algorithms of low-level image processing, which to a great extent determine system performance. In the present paper, the results of psychophysical experiments and several advanced biologically motivated algorithms for low-level processing are presented. These algorithms are based on local space-variant filtering, context encoding of the visual information presented at the center of the input window, and automatic detection of perceptually important image fragments. The core of the latter algorithm is the use of local feature conjunctions, such as non-collinear oriented segments, and composite feature map formation. The developed algorithms were integrated into a foveal active vision model, the MARR. It is expected that the proposed algorithms may significantly improve model performance in real-world image processing during memorizing, search, and recognition.
Stochastic simulation by image quilting of process-based geological models
NASA Astrophysics Data System (ADS)
Hoffimann, Júlio; Scheidt, Céline; Barfod, Adrian; Caers, Jef
2017-09-01
Process-based modeling offers a way to represent realistic geological heterogeneity in subsurface models. The main limitation lies in conditioning such models to data. Multiple-point geostatistics can use these process-based models as training images and address the data conditioning problem. In this work, we further develop image quilting as a method for 3D stochastic simulation capable of mimicking the realism of process-based geological models with minimal modeling effort (i.e., parameter tuning) and at the same time conditioning them to a variety of data. In particular, we develop a new probabilistic data aggregation method for image quilting that bypasses traditional ad hoc weighting of auxiliary variables. In addition, we propose a novel criterion for template design in image quilting that generalizes the entropy plot for continuous training images. The criterion is based on the new concept of voxel reuse, a stochastic and quilting-aware function of the training image. We compare our proposed method with other established simulation methods on a set of process-based training images of varying complexity, including a real-case example of stochastic simulation of the buried-valley groundwater system in Denmark.
Thupayagale-Tshweneagae, Gloria
2011-12-01
The article describes a framework and the process for the development of a peer-based mental health support programme and its implementation. The development of the programme is based on Erikson's theory of the adolescent phase of development, psycho-educational processes, the peer approach, and the orphaned adolescents' lived experiences as a conceptual framework. A triangulation of five qualitative methods (photography, reflective diaries, focus groups, event history calendar, and field notes) was used to capture the lived experiences of adolescents orphaned by HIV and AIDS. Analysis of data followed Colaizzi's method. The combination of psycho-education, Erikson's stages of development, and peer support assisted the participants in gaining knowledge and skills to overcome adversity and helped them become more resilient. The peer-based mental health support programme, if used, would enhance the mental health of adolescent orphans.
A stochastic hybrid systems based framework for modeling dependent failure processes
Fan, Mengfei; Zeng, Zhiguo; Zio, Enrico; Kang, Rui; Chen, Ying
2017-01-01
In this paper, we develop a framework to model and analyze systems that are subject to dependent, competing degradation processes and random shocks. The degradation processes are described by stochastic differential equations, whereas transitions between the system discrete states are triggered by random shocks. The modeling is, then, based on Stochastic Hybrid Systems (SHS), whose state space is comprised of a continuous state determined by stochastic differential equations and a discrete state driven by stochastic transitions and reset maps. A set of differential equations are derived to characterize the conditional moments of the state variables. System reliability and its lower bounds are estimated from these conditional moments, using the First Order Second Moment (FOSM) method and Markov inequality, respectively. The developed framework is applied to model three dependent failure processes from literature and a comparison is made to Monte Carlo simulations. The results demonstrate that the developed framework is able to yield an accurate estimation of reliability with less computational costs compared to traditional Monte Carlo-based methods. PMID:28231313
A stochastic hybrid systems based framework for modeling dependent failure processes.
Fan, Mengfei; Zeng, Zhiguo; Zio, Enrico; Kang, Rui; Chen, Ying
2017-01-01
In this paper, we develop a framework to model and analyze systems that are subject to dependent, competing degradation processes and random shocks. The degradation processes are described by stochastic differential equations, whereas transitions between the system discrete states are triggered by random shocks. The modeling is, then, based on Stochastic Hybrid Systems (SHS), whose state space is comprised of a continuous state determined by stochastic differential equations and a discrete state driven by stochastic transitions and reset maps. A set of differential equations are derived to characterize the conditional moments of the state variables. System reliability and its lower bounds are estimated from these conditional moments, using the First Order Second Moment (FOSM) method and Markov inequality, respectively. The developed framework is applied to model three dependent failure processes from literature and a comparison is made to Monte Carlo simulations. The results demonstrate that the developed framework is able to yield an accurate estimation of reliability with less computational costs compared to traditional Monte Carlo-based methods.
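The two reliability estimates named in the abstract can be illustrated with a minimal numerical sketch, assuming the conditional-moment equations have already produced the first two moments of the relevant state variables; all values below are placeholders, not results from the paper.

```python
# Sketch of (a) FOSM reliability from moments of a safety margin g
# (failure when g < 0) and (b) a Markov-inequality lower bound on
# reliability for a nonnegative degradation level x.
from math import erf, sqrt

# FOSM: normal approximation from the first two moments of the margin g
mean_g, var_g = 2.5, 0.64                 # hypothetical moments
beta = mean_g / sqrt(var_g)               # reliability index
p_fail_fosm = 0.5 * (1.0 - erf(beta / sqrt(2.0)))

# Markov inequality: distribution-free bound, since for x >= 0
# P(x >= threshold) <= E[x] / threshold
mean_x, threshold = 1.2, 5.0              # hypothetical values
reliability_lower_bound = 1.0 - mean_x / threshold

print(f"FOSM failure probability ~ {p_fail_fosm:.2e}")
print(f"Markov reliability lower bound: {reliability_lower_bound:.3f}")
```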
3D Nanofabrication Using AFM-Based Ultrasonic Vibration Assisted Nanomachining
NASA Astrophysics Data System (ADS)
Deng, Jia
Nanolithography and nanofabrication processes have had a significant impact on the recent development of fundamental research areas such as physics, chemistry, and biology, as well as on modern electronic devices that have reached the nanoscale domain, such as optoelectronic devices. Many advanced nanofabrication techniques, such as electron-beam lithography, have been developed to satisfy different requirements in both research and applications; however, the equipment is expensive to use and maintain. Atomic Force Microscope (AFM) based nanolithography processes provide an alternative approach to nanopatterning at significantly lower cost. Recently, three-dimensional nanostructures have attracted much attention, motivated by applications in various fields including optics, plasmonics, and nanoelectromechanical systems. AFM nanolithography processes can create not only two-dimensional nanopatterns but also have great potential for fabricating three-dimensional nanostructures. The objectives of this research are to investigate the capability of AFM-based three-dimensional nanofabrication processes, to transfer the three-dimensional nanostructures from resists to silicon surfaces, and to use the three-dimensional nanostructures on silicon in applications. Building on the literature, a novel AFM-based ultrasonic vibration assisted nanomachining system is utilized to develop three-dimensional nanofabrication processes. In the system, high-frequency in-plane circular xy-vibration creates a virtual tool whose diameter is controlled by the amplitude of the xy-vibration and is larger than that of a regular AFM tip; the feature width of a single trench is therefore tunable. Ultrasonic vibration of the sample in the z-direction controls the depth of single trenches, creating a high-rate 3D nanomachining process. Complicated 3D nanostructures on PMMA are fabricated under both setpoint-force and z-height control modes. Complex contours and both discrete and continuous height changes can be fabricated by the novel 3D nanofabrication processes. Results are imaged clearly after cleaning the debris covering the 3D nanostructures following the nanomachining process. The process is validated by fabricating various 3D nanostructures, and the advantages and disadvantages of the two control modes are compared. Furthermore, the 3D nanostructures were transferred from PMMA surfaces onto silicon surfaces using a reactive ion etching (RIE) process. Recipes are developed based on the functionality of the etching gas in the transfer process. Tunable selectivity and controllable surface finishes are achieved by varying the flow rate of oxygen. The developed 3D nanofabrication process is used as a novel technique in two applications: master fabrication for soft lithography and SERS substrate fabrication. 3D nanostructures were reversely molded in PDMS and then duplicated on new PMMA substrates. The fabricated 3D nanostructures can be either directly used or transferred onto silicon as SERS substrates after coating with an 80 nm gold layer; they greatly enhanced the intensity of Raman scattering, with an enhancement factor of 3.11 × 10^3. These applications demonstrate the capability of the novel AFM-based 3D nanomachining process.
NASA Technical Reports Server (NTRS)
1980-01-01
The design, fabrication, and installation of an experimental process system development unit (EPSDU) were analyzed. Supporting research and development were performed to provide an information database usable for the EPSDU and for technological design and economic analysis of a potential scale-up of the process. Iterative economic analyses were conducted of the estimated product cost for the production of semiconductor-grade silicon in a facility capable of producing 1000 MT/yr.
NASA Astrophysics Data System (ADS)
Ulfa, Andi Maria; Sugiyarto, Kristian H.; Ikhsan, Jaslin
2017-05-01
Poor student performance in Chemistry may result from unfavourable learning processes; therefore, innovation in the learning process must be created. Given the rapid development of mobile technology, the learning process cannot ignore the crucial role of this technology. This research and development (R&D) study was conducted to develop an Android-based application and to study the effect of its integration into the Learning Together (LT) model on the improvement of students' learning creativity and cognitive achievement. The development of the application was carried out by adapting the Borg & Gall and Dick & Carey models. The developed product was reviewed by chemists, learning media practitioners, peer reviewers, and educators. After revision based on the reviews, the application was used in the LT model on the topic of stoichiometry in a senior high school. The instruments were questionnaires to collect the reviewers' comments and suggestions about the application and to collect data on learning creativity, and a test by which data on students' achievement were collected. The results showed that the use of the mobile-based application in Learning Together can bring about significant improvement in students' performance, including creativity and cognitive achievement.
SLS Navigation Model-Based Design Approach
NASA Technical Reports Server (NTRS)
Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas
2018-01-01
The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center, where design documentation alone conveyed the information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs), which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms, which are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper discusses the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process compared with previous processes used at Marshall Space Flight Center. Instead, it takes advantage of the overlap between the requirements development and management process and the design and analysis process by efficiently combining the control mechanism (i.e., the requirement) and the design mechanism. The design mechanism is the representation of the component behavior and performance in design and analysis tools. The focus in the early design process shifts from the development and management of design requirements to the development of usable models, model requirements, and model verification and validation efforts. The models themselves are represented in C/C++ code and accompanying data files. Under the idealized process, potential ambiguity in specification is reduced because the model must be implementable, whereas a requirement is not necessarily subject to this constraint. Further, the models are shown to emulate the hardware during validation. For models developed by the Navigation Team, a common interface/standalone environment was developed, which allows for easy implementation in design and analysis tools. Mechanisms such as unit test cases ensure implementation as the developer intended. The model verification and validation process provides a very high level of component design insight. The origin and implementation of the SLS variant of Model-based Design are described from the perspective of the SLS Navigation Team, and the format of the models and the requirements is described. The Model-based Design approach has many benefits but is not without potential complications. Key lessons learned from the implementation of the Model-based Design approach and process, from infancy to verification and certification, are discussed.
NASA Astrophysics Data System (ADS)
Krawczyk, Rafał D.; Czarski, Tomasz; Kolasiński, Piotr; Linczuk, Paweł; Poźniak, Krzysztof T.; Chernyshova, Maryna; Kasprowicz, Grzegorz; Wojeński, Andrzej; Zabolotny, Wojciech; Zienkiewicz, Paweł
2016-09-01
This article is an overview of the post-processing algorithms implemented during the development and testing of the GEM detector based acquisition system. Information is given on mex functions for extended statistics collection, a unified hex topology, and an optimized S-DAQ algorithm for splitting overlapped signals. An additional discussion of bottlenecks and the major factors concerning optimization is presented.
Requirements for company-wide management
NASA Technical Reports Server (NTRS)
Southall, J. W.
1980-01-01
Computing system requirements were developed for company-wide management of information and computer programs in an engineering data processing environment. These requirements are essential to the successful implementation of a computer-based engineering data management system; they exceed the capabilities provided by commercially available database management systems. The requirements were derived from a study entitled The Design Process, which was prepared by design engineers experienced in the development of aerospace products.
Moment-Based Physical Models of Broadband Clutter due to Aggregations of Fish
2013-09-30
statistical models for signal-processing algorithm development. These in turn will help to develop a capability to statistically forecast the impact of... aggregations of fish based on higher-order statistical measures describable in terms of physical and system parameters. Environmentally, these models... processing. In this experiment, we had good ground truth on (1) and (2), and had control over (3) and (4) except for environmentally-imposed restrictions
DOT National Transportation Integrated Search
2008-01-01
The Virginia Department of Transportation (VDOT) is increasingly involved with the land development process in evolving transportation corridors. This process includes consideration of real estate interests, rezoning and permitting approvals, site pl...
Practical Team-Based Learning from Planning to Implementation
Bell, Edward; Eng, Marty; Fuentes, David G.; Helms, Kristen L.; Maki, Erik D.; Vyas, Deepti
2015-01-01
Team-based learning (TBL) helps instructors develop an active teaching approach for the classroom through group work. The TBL infrastructure engages students in the learning process through the Readiness Assessment Process, problem solving through team discussions, and peer feedback to ensure accountability. This manuscript describes the benefits of and barriers to TBL, and the tools necessary for developing, implementing, and critically evaluating the technique within coursework, in a user-friendly format. Specifically, the manuscript describes the processes underpinning effective TBL development, preparation, implementation, assessment, and evaluation, as well as practical techniques and advice from the authors' classroom experiences. The paper also highlights published articles in the area of TBL in education, with a focus on pharmacy education. PMID:26889061
NASA Astrophysics Data System (ADS)
Prawvichien, Sutthaporn; Siripun, Kulpatsorn; Yuenyong, Chokchai
2018-01-01
STEM education could provide the context for students' learning in the 21st century. Mathematical problem solving requires a context that simulates real life in order to give students experience of the power of mathematics in the world around them. This study aimed to develop a teaching process for enhancing students' mathematical problem solving in the 21st century through STEM education. The paper clarifies STEM learning activities about graph theory organized around the six steps of the engineering design process: identify a challenge, explore ideas, design and plan, do and develop, test and evaluate, and present the solution. The learning activities start from the identify-a-challenge stage, which presents a flooding situation in the northern part of Thailand and sets the students the task of developing routes for moving people away from the flooded areas as quickly as possible. The explore-ideas stage provides activities that help students learn the knowledge base needed to design possible solutions, focusing on measuring, geometry, graph theory, and mathematical processes. The design-and-plan stage asks students to model the city based on the map and then propose possible routes. The do-and-develop stage asks students to develop the routes based on their model. The test-and-evaluate stage asks students to clarify how to test and evaluate the possible routes, and then to test them. The present-solution stage asks students to present the whole process of designing the routes. The paper then discusses how these learning activities could enhance students' mathematical problem solving, and it may have implications for STEM education in school settings.
Web-based data collection: detailed methods of a questionnaire and data gathering tool
Cooper, Charles J; Cooper, Sharon P; del Junco, Deborah J; Shipp, Eva M; Whitworth, Ryan; Cooper, Sara R
2006-01-01
There have been dramatic advances in the development of web-based data collection instruments. This paper outlines a systematic web-based approach to facilitate this process through locally developed code and describes the results of using this process after two years of data collection. We provide a detailed example of a web-based method that we developed for a study in Starr County, Texas, assessing high school students' work and health status. This web-based application includes the data instrument design, data entry and management, and the data tables needed to store the results, and attempts to maximize the advantages of this data collection method. The software also efficiently produces a coding manual, web-based statistical summary and crosstab reports, as well as input templates for use by statistical packages. Overall, web-based data entry using a dynamic approach proved to be a very efficient and effective data collection system. This data collection method expedited data processing and analysis and eliminated the need for cumbersome and expensive transfer and tracking of forms, data entry, and verification. The code has been made available for non-profit use only to the public health research community as a free download [1]. PMID:16390556
NASA Astrophysics Data System (ADS)
Zan, Hao; Li, Haowei; Jiang, Yuguang; Wu, Meng; Zhou, Weixing; Bao, Wen
2018-06-01
As part of our efforts to find ways and means to further improve regenerative cooling technology in scramjets, experiments on the dynamic characteristics of thermo-acoustic instability in flowing hydrocarbon fuel have been conducted in horizontal circular tubes under different conditions. The experimental results indicate that there is a developing process from thermo-acoustic stability to instability. To gain a deep understanding of this developing process, the method of Multi-scale Shannon Wavelet Entropy (MSWE), based on a Wavelet Transform Correlation Filter (WTCF) and Multi-Scale Shannon Entropy (MSE), is adopted in this paper. The results demonstrate that the developing process of thermo-acoustic instability is well detected from noise and weak signals by the MSWE method, and that the differences among stability, the developing process, and instability can be identified. These properties render the method particularly powerful for providing early warning of thermo-acoustic instability of hydrocarbon fuel flowing in scramjet cooling channels. The mass flow rate and the inlet pressure influence the developing process of the thermo-acoustic instability. This investigation of thermo-acoustic instability dynamic characteristics at supercritical pressure based on the wavelet entropy method offers guidance on the control of the scramjet fuel supply, which can secure stable fuel flow in the regenerative cooling system.
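A bare-bones multi-scale wavelet entropy indicator of the kind described can be sketched with PyWavelets as follows; the wavelet family, decomposition level, and placeholder signal are assumptions, and the authors' WTCF filtering step is omitted.

```python
# Sketch: decompose a signal into wavelet scales, then compute the Shannon
# entropy of the normalized coefficient energies at each scale.
import numpy as np
import pywt

def shannon_entropy(coeffs):
    p = np.abs(coeffs) ** 2
    p = p / p.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

signal = np.random.randn(4096)  # placeholder for a dynamic pressure record
scales = pywt.wavedec(signal, "db4", level=5)  # multi-scale decomposition
entropy_per_scale = [shannon_entropy(c) for c in scales]
# A rising entropy trend across successive windows could flag a developing
# instability in an online monitoring setting.
print(entropy_per_scale)
```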
Tyus, Nadra C; Freeman, Randall J; Gibbons, M Christopher
2006-09-01
There has been considerable discussion about translating science into practical messages, especially among urban minority and "hard-to-reach" populations. Unfortunately, many research findings rarely make it back to the general public in a useful format. Few innovative techniques have been established that provide researchers with a systematic process for developing health awareness and prevention messages for priority populations. The purpose of this paper is to describe the early development and experience of a unique community-based participatory process used to develop health promotion messages for a predominantly low-income, black and African-American community in Baltimore, MD. Scientific research findings from the peer-reviewed literature were identified by academic researchers. The researchers then taught the science to graphic design students and faculty, who in turn worked with both community residents and researchers to transform this information into evidence-based public health education messages. The final products were culturally and educationally appropriate health promotion messages reflecting urban imagery that were eagerly desired by the community. This early outcome is in contrast to many previously developed messages and materials created through processes with limited community involvement and by individuals with limited practical knowledge of local community culture or expertise in marketing or mass communication. This process may potentially be utilized as a community-based participatory approach to enhance the translation of scientific research into desirable and appropriate health education messages.
ERIC Educational Resources Information Center
Zhang, Xiaolei; Wong, Jocelyn L. N.
2018-01-01
Studies of professional development have examined the influence of school-based approaches on in-service teacher learning and change but have seldom investigated teachers' job-embedded learning processes. This paper explores the dynamic processes of teacher learning in school-based settings. A qualitative comparative case study based on the…
Study on virtual instrument developing system based on intelligent virtual control
NASA Astrophysics Data System (ADS)
Tang, Baoping; Cheng, Fabin; Qin, Shuren
2005-01-01
The paper introduces a non-programming development system for virtual instruments (VI), i.e., a virtual measurement instrument developing system (VMIDS) based on intelligent virtual control (IVC). The background of the IVC-based VMIDS is described briefly, and the hierarchical message bus (HMB) based software architecture of VMIDS is discussed in detail. The three parts of VMIDS and their functions are introduced, and the process of developing a VI without programming is further described.
ERIC Educational Resources Information Center
Tutlys, Vidmantas; Spöttl, Georg
2017-01-01
Purpose: This paper aims to explore the methodological and institutional challenges in applying the work-process analysis approach to the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on a review of the scientific literature and the analysis of…
NASA Astrophysics Data System (ADS)
Mayasari, D.
2017-02-01
Investigative research on the influence of bokron as a learning medium in inquiry-based learning on the development of science process skills, applied to the concept of growth and development. This research was done as an effort to follow up on students' underdeveloped skills of observing, communicating, and concluding. The study used classroom action research (PTK) consisting of three cycles: in cycle 1, students observe differences between growth and development; in cycle 2, students measure the growth rate; in cycle 3, students observe factors that influence growth and development. In all three cycles, bokron (bottles and dacron) is used as the planting medium. The study involved grade 8 junior high-school students, 14-15 years old, as research subjects over six meetings. Indicators of process skill include observation, communication, interpretation, and inference. Data were collected through students' worksheets, written tests, and observation. Microsoft Excel 2007 was used to process the data and compute the N-Gain, and the results showed an increase in science process skills with a medium N-Gain value (0.63). The bokron learning medium is easily and cheaply obtained around the students, particularly in urban areas, where land for planting media is quite difficult to get. In addition to observation of growth and development, bokron media can also be used to observe movement in plants. The use of bokron as a learning medium can train and develop science process skills, scientific attitudes, and the scientific method, and also gives students concrete experience of the process of growth and development in plants.
Synthesis and design of silicide intermetallic materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Petrovic, J.J.; Castro, R.G.; Butt, D.P.
1997-04-01
The overall objective of this program is to develop structural silicide-based materials with optimum combinations of elevated-temperature strength/creep resistance, low-temperature fracture toughness, and high-temperature oxidation and corrosion resistance for applications of importance to the U.S. processing industry. A further objective is to develop silicide-based prototype industrial components. The ultimate aim of the program is to work with industry to transfer the structural silicide materials technology to the private sector in order to promote international competitiveness in the area of advanced high-temperature materials and important applications in major energy-intensive U.S. processing industries. The program presently has a number of developing industrial connections, including a CRADA with Schuller International Inc. targeted at MoSi2-based high-temperature materials and components for fiberglass melting and processing applications. The authors are also developing an interaction with the Institute of Gas Technology (IGT) to develop silicides for high-temperature radiant gas burner applications, for the glass and other industries. Current experimental emphasis is on the development and characterization of MoSi2-Si3N4 and MoSi2-SiC composites, the plasma spraying of MoSi2-based materials, and the joining of MoSi2 materials to metals.
Wang, Yi; Lee, Sui Mae; Dykes, Gary
2015-01-01
Bacterial attachment to abiotic surfaces can be explained as a physicochemical process. Mechanisms of the process have been widely studied but are not yet well understood due to their complexity. Physicochemical processes can be influenced by various interactions and factors in attachment systems, including, but not limited to, hydrophobic interactions, electrostatic interactions and substratum surface roughness. Mechanistic models and control strategies for bacterial attachment to abiotic surfaces have been established based on the current understanding of the attachment process and the interactions involved. Due to a lack of process control and standardization in the methodologies used to study the mechanisms of bacterial attachment, however, various challenges are apparent in the development of models and control strategies. In this review, the physicochemical mechanisms, interactions and factors affecting the process of bacterial attachment to abiotic surfaces are described. Mechanistic models established based on these parameters are discussed in terms of their limitations. Currently employed methods to study these parameters and bacterial attachment are critically compared. The roles of these parameters in the development of control strategies for bacterial attachment are reviewed, and the challenges that arise in developing mechanistic models and control strategies are assessed.
Quality data collection and management technology of aerospace complex product assembly process
NASA Astrophysics Data System (ADS)
Weng, Gang; Liu, Jianhua; He, Yongxi; Zhuang, Cunbo
2017-04-01
Aiming at solving the problems of difficult management and poor traceability of discrete assembly process quality data, a data collection and management method is proposed that takes the assembly process and the BOM as its core. The method includes a data collection approach based on workflow technology, a data model based on the BOM, and quality traceability of the assembly process. Finally, an assembly process quality data management system is developed, and effective control and management of quality information for the complex product assembly process is realized.
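One way to picture a BOM-centred quality data model of the kind proposed is the toy structure below, where each BOM node carries the quality records of its assembly step so defects can be traced along the product structure; all field names are invented for illustration, not taken from the paper.

```python
# Toy BOM-centred quality data model with defect traceability.
from dataclasses import dataclass, field
from typing import List

@dataclass
class QualityRecord:
    step: str          # assembly process step that produced the record
    parameter: str     # measured characteristic, e.g. torque
    value: float
    in_spec: bool

@dataclass
class BomNode:
    part_number: str
    children: List["BomNode"] = field(default_factory=list)
    records: List[QualityRecord] = field(default_factory=list)

    def trace_defects(self):
        """Walk the BOM subtree and collect all out-of-spec records."""
        found = [r for r in self.records if not r.in_spec]
        for child in self.children:
            found.extend(child.trace_defects())
        return found

bolt = BomNode("BOLT-01", records=[QualityRecord("fastening", "torque", 8.2, False)])
panel = BomNode("PANEL-01", children=[bolt])
print(panel.trace_defects())
```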
NASA Technical Reports Server (NTRS)
Khan, Gufran Sayeed; Gubarev, Mikhail; Speegle, Chet; Ramsey, Brian
2010-01-01
The presentation includes grazing incidence X-ray optics; motivation and challenges; mid-spatial-frequency generation in cylindrical polishing; design considerations for the polishing lap; simulation studies and experimental results; future scope; and a summary. Topics include the current status of replication optics technology, the cylindrical polishing process using a large polishing lap, non-conformance of the polishing lap to the optics, development of software and a polishing machine, deterministic prediction of polishing, a polishing experiment under optimum conditions, and a polishing experiment based on a known error profile. Future plans include determination of non-uniformity in the polishing lap compliance, development of a polishing sequence based on a known error profile of the specimen, software for generating a mandrel polishing sequence, design and development of a flexible polishing lap, and a computer-controlled localized polishing process.
Expert system for web based collaborative CAE
NASA Astrophysics Data System (ADS)
Hou, Liang; Lin, Zusheng
2006-11-01
An expert system for web based collaborative CAE was developed based on knowledge engineering, a relational database, and commercial FEA (finite element analysis) software. The architecture of the system is illustrated. In this system, the experts' experiences, theories, typical examples, and other related knowledge to be used in the pre-processing stage of FEA were categorized into analysis-process knowledge and object knowledge. An integrated knowledge model based on object-oriented and rule-based methods is then described, and the integrated reasoning process based on CBR (case-based reasoning) and rule-based reasoning is presented. Finally, the analysis process of this expert system in a web based CAE application is illustrated, and an analysis example of a machine tool column is given to prove the validity of the system.
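A toy rendering of the integrated reasoning idea (retrieve the nearest stored pre-processing case, then let rules refine it) might look like the sketch below; the case attributes, similarity measure, and rule are invented for illustration.

```python
# Toy CBR + rule-based reasoning: retrieve the most similar stored FEA
# pre-processing case, then apply a rule to adapt its mesh choice.
from dataclasses import dataclass

@dataclass
class Case:
    part_type: str
    load_kN: float
    mesh_size_mm: float  # stored expert choice

case_base = [
    Case("column", 10.0, 8.0),
    Case("column", 50.0, 4.0),
    Case("beam", 20.0, 6.0),
]

def retrieve(part_type, load_kN):
    candidates = [c for c in case_base if c.part_type == part_type]
    return min(candidates, key=lambda c: abs(c.load_kN - load_kN))

def apply_rules(case, load_kN):
    mesh = case.mesh_size_mm
    if load_kN > 1.5 * case.load_kN:  # rule: refine mesh for much higher loads
        mesh *= 0.5
    return mesh

best = retrieve("column", 30.0)
print("suggested mesh size:", apply_rules(best, 30.0), "mm")
```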
Butler, Ashleigh; Hall, Helen; Copnell, Beverley
2016-06-01
The qualitative systematic review is a rapidly developing area of nursing research. In order to present trustworthy, high-quality recommendations, such reviews should be based on a review protocol to minimize bias and enhance transparency and reproducibility. Although there are a number of resources available to guide researchers in developing a quantitative review protocol, very few resources exist for qualitative reviews. To guide researchers through the process of developing a qualitative systematic review protocol, using an example review question. The key elements required in a systematic review protocol are discussed, with a focus on application to qualitative reviews: Development of a research question; formulation of key search terms and strategies; designing a multistage review process; critical appraisal of qualitative literature; development of data extraction techniques; and data synthesis. The paper highlights important considerations during the protocol development process, and uses a previously developed review question as a working example. This paper will assist novice researchers in developing a qualitative systematic review protocol. By providing a worked example of a protocol, the paper encourages the development of review protocols, enhancing the trustworthiness and value of the completed qualitative systematic review findings. Qualitative systematic reviews should be based on well planned, peer reviewed protocols to enhance the trustworthiness of results and thus their usefulness in clinical practice. Protocols should outline, in detail, the processes which will be used to undertake the review, including key search terms, inclusion and exclusion criteria, and the methods used for critical appraisal, data extraction and data analysis to facilitate transparency of the review process. Additionally, journals should encourage and support the publication of review protocols, and should require reference to a protocol prior to publication of the review results. © 2016 Sigma Theta Tau International.
Web-Based Learning Environment: A Theory-Based Design Process for Development and Evaluation
ERIC Educational Resources Information Center
Nam, Chang S.; Smith-Jackson, Tonya L.
2007-01-01
Web-based courses and programs have increasingly been developed by many academic institutions, organizations, and companies worldwide due to their benefits for both learners and educators. However, many of the developmental approaches lack two important considerations needed for implementing Web-based learning applications: (1) integration of the…
Comprehensive process for the recovery of value and critical materials from electronic waste
Diaz, Luis A.; Lister, Tedd E.; Parkman, Jacob A.; ...
2016-04-08
The development of technologies that contribute to the proper disposal and treatment of electronic waste is not just an environmental need, but an opportunity for the recovery and recycling of valuable metals and critical materials. Value elements in electronic waste include gold, palladium, silver, copper, nickel, and rare earth elements (RE). Here, we present the development of a process that enables efficient recycling of metals from scrap mobile electronics. An electro-recycling (ER) process, based on the regeneration of Fe3+ as a weak oxidizer, is studied for the selective recovery of base metals while leaving precious metals for separate extraction at reduced chemical demand. A separate process recovers rare earth oxides from magnets in the electronics. Recovery and extraction efficiencies of ca. 90% were obtained for the extraction of base metals from the non-ferromagnetic fraction in the two different solution matrices tested (H2SO4 and HCl). The effect of pre-extraction of base metals on the increase in precious metals extraction efficiency was verified. In addition, the extraction of rare earths from the ferromagnetic fraction, performed by means of anaerobic extraction in acid media, was assessed for the selective recovery of rare earths. We developed a comprehensive flow sheet to process electronic waste into value products.
Does case-mix based reimbursement stimulate the development of process-oriented care delivery?
Vos, Leti; Dückers, Michel L A; Wagner, Cordula; van Merode, Godefridus G
2010-11-01
Reimbursement based on the total care of a patient during an acute episode of illness is believed to stimulate management and clinicians to reduce quality problems like waiting times and poor coordination of care delivery. Although many studies have already shown that this kind of case-mix based reimbursement leads to more efficiency, it remains unclear whether care coordination improves as well. This study aims to explore whether case-mix based reimbursement stimulates the development of care coordination through the use of care programmes and a process-oriented way of working. Data for this study were gathered during the winter of 2007/2008 in a survey involving all Dutch hospitals. Descriptive and structural equation modelling (SEM) analyses were conducted. SEM reveals that adoption of case-mix reimbursement within hospitals' budgeting processes stimulates hospitals to establish care programmes through the use of process-oriented performance measures. However, the implementation of care programmes is not (yet) accompanied by a change in focus from function (the delivery of independent care activities) to process (the delivery of care activities as connected to a chain of interdependent care activities). This study demonstrates that hospital management can stimulate the development of care programmes by adopting case-mix reimbursement within hospitals' budgeting processes. Future research is recommended to confirm this finding and to determine whether the establishment of care programmes will in time indeed lead to a more process-oriented view among professionals. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Interactive brain shift compensation using GPU based programming
NASA Astrophysics Data System (ADS)
van der Steen, Sander; Noordmans, Herke Jan; Verdaasdonk, Rudolf
2009-02-01
Processing large image files or real-time video streams requires intense computational power. Driven by the gaming industry, the processing power of graphics processing units (GPUs) has increased significantly. With pixel shader model 4.0, the GPU can be used for image processing 10x faster than the CPU. Dedicated software was developed to deform 3D MR and CT image sets for real-time brain shift correction during navigated neurosurgery, using landmarks or cortical surface traces defined by the navigation pointer. Feedback was given using orthogonal slices and an interactively ray-traced 3D brain image. GPU-based programming enables real-time processing of high-definition image datasets, and various applications can be developed in medicine, optics, and the image sciences.
Soft sensor for monitoring biomass subpopulations in mammalian cell culture processes.
Kroll, Paul; Stelzer, Ines V; Herwig, Christoph
2017-11-01
Biomass subpopulations in mammalian cell culture processes cause impurities and influence productivity, which requires this critical process parameter to be monitored in real-time. For this reason, a novel soft sensor concept for estimating viable, dead and lysed cell concentration was developed, based on the robust and cheap in situ measurements of permittivity and turbidity in combination with a simple model. It could be shown that the turbidity measurements contain information about all investigated biomass subpopulations. The novelty of the developed soft sensor is the real-time estimation of lysed cell concentration, which is directly correlated to process-related impurities such as DNA and host cell protein in the supernatant. Based on data generated by two fed-batch processes the developed soft sensor is described and discussed. The presented soft sensor concept provides a tool for viable, dead and lysed cell concentration estimation in real-time with adequate accuracy and enables further applications with respect to process optimization and control.
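A deliberately simplified stand-in for the estimation idea can be written as a small linear system: permittivity is taken to respond only to viable cells, turbidity to all three subpopulations, and a total-cell balance closes the system. The coefficients, readings, and the balance assumption below are illustrative, not the published model.

```python
# Toy soft sensor: estimate viable (xv), dead (xd), and lysed (xl) cell
# concentrations from permittivity and turbidity plus a total-cell balance.
import numpy as np

k_perm = 0.5                     # hypothetical permittivity per viable-cell unit
a_v, a_d, a_l = 1.0, 0.8, 0.2    # hypothetical turbidity coefficients
x_total = 12.0                   # total cells from a simple growth balance

permittivity, turbidity = 3.0, 9.0  # hypothetical in situ readings

# Solve: k_perm*xv             = permittivity
#        a_v*xv + a_d*xd + a_l*xl = turbidity
#        xv + xd + xl          = x_total
A = np.array([[k_perm, 0.0, 0.0],
              [a_v,    a_d, a_l],
              [1.0,    1.0, 1.0]])
b = np.array([permittivity, turbidity, x_total])
xv, xd, xl = np.linalg.solve(A, b)
print(f"viable={xv:.2f}, dead={xd:.2f}, lysed={xl:.2f}")
```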
ERIC Educational Resources Information Center
Kawinkamolroj, Milintra; Triwaranyu, Charinee; Thongthew, Sumlee
2015-01-01
This research aimed to develop coaching process based on transformative learning theory for changing the mindset about instruction of elementary school teachers. Tools used in this process include mindset tests and questionnaires designed to assess the instructional mindset of teachers and to allow the teachers to reflect on how they perceive…
The Effectiveness of Adopting E-Readers to Facilitate EFL Students' Process-Based Academic Writing
ERIC Educational Resources Information Center
Hung, Hui-Chun; Young, Shelley Shwu-Ching
2015-01-01
English as Foreign Language (EFL) students face additional difficulties for academic writing largely due to their level of language competency. An appropriate structural process of writing can help students develop their academic writing skills. This study explored the use of the e-readers to facilitate EFL students' process-based academic…
Goal Development Practices of Physical Therapists Working in Educational Environments.
Wynarczuk, Kimberly D; Chiarello, Lisa A; Gohrband, Catherine L
2017-11-01
The aims of this study were to (1) describe the practices that school-based physical therapists use in developing student goals, and (2) identify facilitators and barriers to development of goals that are specific to participation in the context of the school setting. 46 school-based physical therapists who participated in a previous study on school-based physical therapy practice (PT COUNTS) completed a questionnaire on goal development. Frequencies and cross tabulations were generated for quantitative data. Open-ended questions were analyzed using an iterative qualitative analysis process. A majority of therapists reported that they frequently develop goals collaboratively with other educational team members. Input from teachers, related services personnel, and parents has the most influence on goal development. Qualitative analysis identified five themes that influence development of participation-based goals: (1) school-based philosophy and practice; (2) the educational environment, settings, and routines; (3) student strengths, needs, and personal characteristics; (4) support from and collaboration with members of the educational team; and (5) therapist practice and motivation. Goal development is a complex process that involves multiple members of the educational team and is influenced by many different aspects of practice, the school environment, and student characteristics.
Development of crayfish bio-based plastic materials processed by small-scale injection moulding.
Felix, Manuel; Romero, Alberto; Cordobes, Felipe; Guerrero, Antonio
2015-03-15
Protein has been investigated as a source for biodegradable polymeric materials. This work evaluates the development of plastic materials based on crayfish and glycerol blends, processed by injection moulding, as a fully biodegradable alternative to conventional polymer-based plastics. The effect of different additives, namely sodium sulfite or bisulfite as reducing agents, urea as a denaturing agent and L-cysteine as a cross-linking agent, is also analysed. The incorporation of any additive always yields an increase in energy efficiency at the mixing stage, but its effect on the mechanical properties of the bioplastics is less clear, and in some cases even dampened. The additive with the greatest effect is L-cysteine, which yields higher Young's modulus values and exhibits a remnant thermosetting potential; thus, processing at higher temperature yields a remarkable increase in extensibility. This work illustrates the feasibility of crayfish-based green biodegradable plastics, thereby contributing to the search for potential value-added applications for this by-product. © 2014 Society of Chemical Industry.
Application of space technologies for the purpose of education at the Belarusian state university
NASA Astrophysics Data System (ADS)
Liashkevich, Siarhey
Application of space technologies for the purpose of education at the Aerospace Educational Center of the Belarusian State University is discussed. The aim of the work is to prepare the launch of a small satellite. Students are expected to participate in the design of the control station and of systems for communication, earth observation, navigation, and positioning. The benefit of such project-based learning from an economic perspective is discussed. At present, our training system based on the EyasSat classroom satellite is used for teaching the management of satellite orientation and stabilization systems. Principles of video processing, communication technologies and information security for small spacecraft are taught using the Wi9M-2443 developer kit. More recent equipment allows students to acquire skills in digital signal processing based on FPGAs. Development of the ground station includes setup of a 2.6-meter-diameter dish for L-band and spiral rotational antennas for the UHF and VHF bands. Receiver equipment from National Instruments is used for digital signal processing and signal management.
SenSyF Experience on Integration of EO Services in a Generic, Cloud-Based EO Exploitation Platform
NASA Astrophysics Data System (ADS)
Almeida, Nuno; Catarino, Nuno; Gutierrez, Antonio; Grosso, Nuno; Andrade, Joao; Caumont, Herve; Goncalves, Pedro; Villa, Guillermo; Mangin, Antoine; Serra, Romain; Johnsen, Harald; Grydeland, Tom; Emsley, Stephen; Jauch, Eduardo; Moreno, Jose; Ruiz, Antonio
2016-08-01
SenSyF is a cloud-based data processing framework for EO-based services. It has been a pioneer in addressing Big Data issues from the Earth Observation point of view, and is a precursor of several of the technologies and methodologies that will be deployed in ESA's Thematic Exploitation Platforms and other related systems. The SenSyF system focuses on developing fully automated data management, together with access to a processing and exploitation framework, including Earth Observation specific tools. SenSyF is both a development and validation platform for data-intensive applications using Earth Observation data. With SenSyF, scientific, institutional or commercial institutions developing EO-based applications and services can take advantage of distributed computational and storage resources, tailored for applications dependent on big Earth Observation data, and without resorting to deep infrastructure and technological investments. This paper describes the integration process and the experience gathered from different EO service providers during the project.
Computer-aided software development process design
NASA Technical Reports Server (NTRS)
Lin, Chi Y.; Levary, Reuven R.
1989-01-01
The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model of the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as a modeling methodology. The resulting Software LIfe-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.
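SLICS itself is not specified in this abstract; as a rough illustration of the system-dynamics style it names, here is a toy stock-and-flow simulation in Python with two coupled stocks (completed tasks and undetected errors) integrated by Euler steps. All rates are assumptions, not SLICS parameters.

```python
# Toy system-dynamics sketch: tasks are completed at a fixed productivity,
# errors are injected in proportion to completed work, and review/testing
# removes a fraction of outstanding errors per time step.
def simulate(tasks=1000.0, months=24, dt=0.25,
             productivity=60.0, error_rate=0.05, detect_rate=0.3):
    done, errors, t = 0.0, 0.0, 0.0
    history = []
    while t < months and done < tasks:
        completed = min(productivity * dt, tasks - done)
        done += completed
        errors += error_rate * completed      # errors injected with work
        errors -= detect_rate * errors * dt   # review/testing removes some
        history.append((round(t, 2), round(done, 1), round(errors, 2)))
        t += dt
    return history

# Print every eighth sample of (time, tasks done, undetected errors)
for row in simulate()[::8]:
    print(row)
```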
A proposal for a drug product Manufacturing Classification System (MCS) for oral solid dosage forms.
Leane, Michael; Pitt, Kendal; Reynolds, Gavin
2015-01-01
This paper proposes the development of a drug product Manufacturing Classification System (MCS) based on processing route. It summarizes conclusions from a dedicated APS conference and subsequent discussion within APS focus groups and the MCS working party. The MCS is intended as a tool for pharmaceutical scientists to rank the feasibility of different processing routes for the manufacture of oral solid dosage forms, based on selected properties of the API and the needs of the formulation. It has many applications in pharmaceutical development; in particular, it will provide a common understanding of risk by defining what the "right particles" are, enable the selection of the best process, and aid subsequent transfer to manufacturing. The ultimate aim is the prediction of product developability and processability based upon previous experience. This paper is intended to stimulate contributions from a broad range of stakeholders to develop the MCS concept further and apply it to practice. In particular, opinions are sought on which API properties are important when selecting or modifying materials to enable an efficient and robust pharmaceutical manufacturing process. Feedback can be given by replying to our dedicated e-mail address (mcs@apsgb.org); completing the survey on our LinkedIn site; or by attending one of our planned conference roundtable sessions.
Developing a Teacher Identity: TAs' Perspectives about Learning to Teach Inquiry-Based Biology Labs
ERIC Educational Resources Information Center
Gormally, Cara
2016-01-01
Becoming a teacher involves a continual process of identity development and negotiation. Expectations and norms for particular pedagogies impact and inform this development. In inquiry-based classes, instructors are expected to act as learning facilitators rather than information providers. For novice inquiry instructors, developing a teacher…
Adaptive Signal Processing Testbed: VME-based DSP board market survey
NASA Astrophysics Data System (ADS)
Ingram, Rick E.
1992-04-01
The Adaptive Signal Processing Testbed (ASPT) is a real-time multiprocessor system utilizing digital signal processor technology on VMEbus-based printed circuit boards installed on a Sun workstation. The ASPT has specific requirements, particularly for the signal excision application, with respect to interfacing with current and planned data generation equipment, processing of the data, storage to disk of final and intermediate results, and the development tools for applications development and integration into the overall EW/COM computing environment. A prototype ASPT was implemented using three VME-C-30 boards from Applied Silicon. Experience gained during the prototype development led to the conclusions that interprocessor communications capability is the most significant contributor to overall ASPT performance and that host involvement should be minimized. Boards using different processors were evaluated with respect to the ASPT system requirements, pricing, and availability. Specific recommendations based on various priorities are made, as well as recommendations concerning the integration and interaction of various tools developed during the prototype implementation.
Propellant injection systems and processes
NASA Technical Reports Server (NTRS)
Ito, Jackson I.
1995-01-01
The previous 'Art of Injector Design' is maturing and merging with the more systematic 'Science of Combustion Device Analysis.' This technology can be based upon observation, correlation, experimentation and, ultimately, analytical modeling built on basic engineering principles. This methodology is more systematic and far superior to the historical injector design process of 'Trial and Error' or blindly 'Copying Past Successes.' The benefit of such an approach is the ability to rank candidate design concepts by relative probability of success, or technical risk, across all the important combustion device design requirements and combustion process development risk categories before committing to an engine development program. Even if a single analytical design concept cannot be developed that satisfies all requirements simultaneously, a series of risk-mitigating key enabling technologies can be identified for early resolution. Lower cost subscale or laboratory experimentation to demonstrate proof of principle, critical instrumentation requirements, and design-discriminating test plans can be developed based on the physical insight provided by these analyses.
The Process of Curriculum Development and the Use of Assessments in Independent Schools
ERIC Educational Resources Information Center
Young, JoAnn P.
2012-01-01
This qualitative study was designed to examine and identify the site-based process that two elementary, independent schools, accredited by the Southern Association of Independent Schools, use for curriculum and instructional development. Also, the study examined and identified the development and use of assessments to support each school's…
An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application
ERIC Educational Resources Information Center
Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth
2016-01-01
Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…
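The abstract names the AHP's pairwise-comparison machinery without showing it; the sketch below illustrates the standard computation (not necessarily the authors' exact procedure): derive criteria weights from the principal eigenvector of a pairwise comparison matrix and check consistency. The 3 x 3 matrix is invented for the example.

```python
import numpy as np

# a[i, j] expresses how much more important criterion i is than j on Saaty's
# 1-9 scale; the priority vector is the principal eigenvector of the matrix.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)               # index of the principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                              # normalized criteria weights

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI
n = A.shape[0]
lambda_max = eigvals.real[k]
CI = (lambda_max - n) / (n - 1)
RI = 0.58                                 # Saaty's random index for n = 3
print("weights:", w.round(3), "CR:", round(CI / RI, 3))  # CR < 0.1 is acceptable
```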
An Academic Development Model for Fostering Innovation and Sharing in Curriculum Design
ERIC Educational Resources Information Center
Dempster, Jacqueline A.; Benfield, Greg; Francis, Richard
2012-01-01
This paper outlines an academic development process based around a two- or three-day workshop programme called a Course Design Intensive (CDI). The CDI process aims to foster collaboration and peer support in curriculum development and bring about pedagogic innovation and positive experiences for both tutors and learners. Bringing participants…
Ott, Denise; Kralisch, Dana; Denčić, Ivana; Hessel, Volker; Laribi, Yosra; Perrichon, Philippe D; Berguerand, Charline; Kiwi-Minsker, Lioubov; Loeb, Patrick
2014-12-01
As the demand for new drugs is rising, the pharmaceutical industry faces the quest of shortening development time, and thus, reducing the time to market. Environmental aspects typically still play a minor role within the early phase of process development. Nevertheless, it is highly promising to rethink, redesign, and optimize process strategies as early as possible in active pharmaceutical ingredient (API) process development, rather than later at the stage of already established processes. The study presented herein deals with a holistic life-cycle-based process optimization and intensification of a pharmaceutical production process targeting a low-volume, high-value API. Striving for process intensification by transfer from batch to continuous processing, as well as an alternative catalytic system, different process options are evaluated with regard to their environmental impact to identify bottlenecks and improvement potentials for further process development activities. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
SOI-CMOS Process for Monolithic, Radiation-Tolerant, Science-Grade Imagers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, George; Lee, Adam
In Phase I, Voxtel worked with Jazz and Sandia to document and simulate the processes necessary to implement a DH-BSI SOI CMOS imaging process. The development is based upon mature SOI CMOS processes at both fabs, with the addition of only a few custom processing steps for integration and electrical interconnection of the fully-depleted photodetectors. In Phase I, Voxtel also characterized the Sandia process, including the CMOS7 design rules, and developed the outline of a process option that includes a “BOX etch”, which will permit a “detector in handle” SOI CMOS process to be developed. The process flows were developed in cooperation with both Jazz and Sandia process engineers, along with detailed TCAD modeling and testing of the photodiode array architectures. In addition, Voxtel tested the radiation performance of Jazz’s CA18HJ process, using standard and circular-enclosed transistors.
Excellent approach to modeling urban expansion by fuzzy cellular automata: agent base model
NASA Astrophysics Data System (ADS)
Khajavigodellou, Yousef; Alesheikh, Ali A.; Mohammed, Abdulrazak A. S.; Chapi, Kamran
2014-09-01
Recently, the interaction between humans and their environment has become one of the most important challenges in the world. Land use/cover change (LUCC) is a complex process that includes actors and factors at different social and spatial levels. The complexity and dynamics of urban systems make the practice of urban modeling very difficult. With increased computational power and the greater availability of spatial data, micro-simulation approaches such as agent-based and cellular automata methods have been developed by geographers, planners, and scholars, and they have shown great potential for representing and simulating the complexity of the dynamic processes involved in urban growth and land use change. This paper presents a fuzzy cellular automata approach, implemented within a geospatial information system and using remote sensing data, to simulate and predict urban expansion patterns. These FCA-based dynamic spatial urban models provide an improved ability to forecast and assess future urban growth and to create planning scenarios, allowing us to explore the potential impacts of simulations that correspond to urban planning and management policies. Semantic or linguistic knowledge of land use change is expressed as fuzzy rules, based on which fuzzy inference is applied to determine the urban development potential for each pixel. The model integrates an ABM (agent-based model) and FCA (fuzzy cellular automata) to investigate a complex decision-making process and future urban dynamics. Based on this model, rapid development and green-land protection under the influence of the behaviors and decision modes of regional authority agents, real estate developer agents, resident agents and non-resident agents, and their interactions, have been simulated to predict the future development patterns of the Erbil metropolitan region.
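No transition rules are given in the abstract; the following toy Python sketch shows only the general shape of a fuzzy-CA update, blending a neighbourhood-pressure input with a suitability factor through weighted aggregation and a threshold rule. The weights, threshold, and suitability layer are all invented, and the real model's agent components are omitted.

```python
import numpy as np

# Each cell holds a fuzzy "urban" membership in [0, 1]; development potential
# blends neighbourhood pressure with a suitability factor (e.g. slope, access).
rng = np.random.default_rng(0)
urban = (rng.random((50, 50)) > 0.95).astype(float)   # sparse initial seeds
suitability = rng.random((50, 50))                    # stand-in for real factors

def step(urban, suitability, w_neigh=0.6, w_suit=0.4, threshold=0.5):
    # 3x3 neighbourhood mean as the "neighbourhood pressure" fuzzy input
    padded = np.pad(urban, 1, mode="edge")
    neigh = sum(padded[i:i+50, j:j+50] for i in range(3) for j in range(3)) / 9.0
    potential = w_neigh * neigh + w_suit * suitability   # fuzzy aggregation
    # transition rule: a cell urbanizes once its potential crosses the threshold
    return np.maximum(urban, (potential > threshold).astype(float))

for _ in range(10):
    urban = step(urban, suitability)
print("urban cells after 10 steps:", int(urban.sum()))
```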
ERIC Educational Resources Information Center
Jacobs, Richard M.
2016-01-01
A 2 × 2 matrix identifying four discrete thought processes was presented. The contributions of the first three processes in developing the knowledge base of public administration were detailed as were their limitations. It was argued that the fourth process--insight and its mental powers--builds upon the strengths and overcomes the limitations…
TU-AB-BRD-04: Development of Quality Management Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomadsen, B.
2015-06-15
Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has adapted these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100, which has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how they can be used in a given radiotherapy clinic to develop a risk-based quality management program. Learning objectives: (1) learn how to design a process map for a radiotherapy process; (2) learn how to perform failure modes and effects analysis for a given process; (3) learn what fault trees are about; (4) learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Disclosures: Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.
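Where the session mentions ranking potential failures by relative risk, FMEA conventionally uses the risk priority number, RPN = severity x occurrence x detectability. The sketch below shows that arithmetic on invented failure modes and scores; none of it is TG-100 data.

```python
# Rank hypothetical failure modes of an IMRT process step by RPN.
failure_modes = [
    # (description, severity 1-10, occurrence 1-10, detectability 1-10)
    ("wrong CT dataset imported",        9, 2, 4),
    ("contour transferred incorrectly",  7, 3, 5),
    ("plan parameters mistyped",         8, 4, 3),
    ("QA device miscalibrated",          6, 2, 7),
]

# Higher RPN means higher priority for process controls
ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for desc, s, o, d in ranked:
    print(f"RPN={s*o*d:4d}  S={s} O={o} D={d}  {desc}")
```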
ERIC Educational Resources Information Center
Kupersmidt, Janis B.; Stelter, Rebecca; Dodge, Kenneth A.
2011-01-01
The purpose of this study was to evaluate the psychometric properties of an audio computer-assisted self-interviewing Web-based software application called the Social Information Processing Application (SIP-AP) that was designed to assess social information processing skills in boys in 3rd through 5th grades. This study included a racially and…
Janknegt, Robert; Scott, Mike; Mairs, Jill; Timoney, Mark; McElnay, James; Brenninkmeijer, Rob
2007-10-01
Drug selection should be a rational process that embraces the principles of evidence-based medicine. However, many factors may affect the choice of agent. It is against this background that the System of Objectified Judgement Analysis (SOJA) process for rational drug selection was developed. This article describes how the information on which the SOJA process is based was researched and processed.
Electronic Health Record for Intensive Care based on Usual Windows Based Software.
Reper, Arnaud; Reper, Pascal
2015-08-01
In Intensive Care Units, the amount of data to be processed for patient care, the turnover of the patients, and the necessity for reliability and for review processes indicate the use of Patient Data Management Systems (PDMS) and electronic health records (EHR). To respond to the needs of an Intensive Care Unit and not to be locked into proprietary software, we developed an EHR based on common software and components. The software was designed as a client-server architecture running on the Windows operating system and powered by the Access database system. The client software was developed using the Visual Basic interface library. The application offers the users the following functions: capture of medical notes, observations and treatments, nursing charts with administration of medications, scoring systems for classification, and the possibility to encode medical activities for billing processes. Since its deployment in September 2004, the EHR has been used to care for more than five thousand patients with the expected software reliability, and it has facilitated data management and review processes. Communications with other medical software were not developed from the start and are realized through a basic communication engine. Further upgrades of the system will include multi-platform support, use of a typed language with static analysis, and a configurable interface. The developed system, based on common software components, was able to respond to the medical needs of the local ICU environment. The use of Windows for development allowed us to customize the software to the preexisting organization and contributed to the acceptability of the whole system.
Vanegas, Fernando; Bratanov, Dmitry; Powell, Kevin; Weiss, John; Gonzalez, Felipe
2018-01-17
Recent advances in remotely sensed imagery and geospatial image processing using unmanned aerial vehicles (UAVs) have enabled the rapid and ongoing development of monitoring tools for crop management and the detection/surveillance of insect pests. This paper describes a UAV remote sensing-based methodology to increase the efficiency of existing surveillance practices (human inspectors and insect traps) for detecting pest infestations (e.g., grape phylloxera in vineyards). The methodology uses a UAV integrated with advanced digital hyperspectral, multispectral, and RGB sensors. We implemented the methodology for the development of a predictive model for phylloxera detection. In this method, we explore the combination of airborne RGB, multispectral, and hyperspectral imagery with ground-based data at two separate time periods and under different levels of phylloxera infestation. We describe the technology used (the sensors, the UAV, and the flight operations), the processing workflow of the datasets from each imagery type, and the methods for combining multiple airborne datasets with ground-based datasets. Finally, we present relevant results on the correlation between the different processed datasets. The objective of this research is to develop a novel methodology for collecting, processing, analysing and integrating multispectral, hyperspectral, ground and spatial data to remotely sense different variables in different applications, such as, in this case, plant pest surveillance. The development of such a methodology would provide researchers, agronomists, and UAV practitioners with reliable data collection protocols and methods to achieve faster processing techniques and integrate multiple sources of data in diverse remote sensing applications.
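As one concrete fragment of the kind of multispectral processing workflow described, here is a hedged Python sketch computing NDVI from co-registered red and near-infrared bands and flagging low-vigour pixels; the bands are synthetic and the 0.3 threshold is an assumption, not a value from the study.

```python
import numpy as np

# Band arrays are synthetic here; in practice they come from the orthomosaic.
rng = np.random.default_rng(1)
red = rng.uniform(0.02, 0.30, size=(100, 100))   # red reflectance
nir = rng.uniform(0.20, 0.60, size=(100, 100))   # near-infrared reflectance

# NDVI = (NIR - RED) / (NIR + RED), guarding against divide-by-zero
ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)

# Low-vigour pixels could then be correlated with ground observations
# (e.g. phylloxera hotspots); the threshold is an assumed example value.
low_vigour = ndvi < 0.3
print(f"mean NDVI: {ndvi.mean():.3f}, low-vigour pixels: {low_vigour.sum()}")
```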
Smith, Chris; Vannak, Uk; Sokhey, Ly; Ngo, Thoai D; Gold, Judy; Free, Caroline
2016-01-05
The objective of this paper is to outline the formative research process used to develop the MOTIF mobile phone-based (mHealth) intervention to support post-abortion family planning in Cambodia. The formative research process involved literature reviews, interviews and focus group discussions with clients, and consultation with clinicians and organisations implementing mHealth activities in Cambodia. This process led to the development of a conceptual framework and the intervention. Key findings from the formative research included identification of the main reasons for non-use of contraception and patterns of mobile phone use in Cambodia. We drew on components of existing interventions and behaviour change theory to develop a conceptual framework. A multi-faceted voice-based intervention was designed to address health concerns and other key determinants of contraception use. Formative research was essential in order to develop an appropriate mHealth intervention to support post-abortion contraception in Cambodia. Each component of the formative research contributed to the final intervention design.
Intervention mapping: a process for developing theory- and evidence-based health education programs.
Bartholomew, L K; Parcel, G S; Kok, G
1998-10-01
The practice of health education involves three major program-planning activities: needs assessment, program development, and evaluation. Over the past 20 years, significant enhancements have been made to the conceptual base and practice of health education. Models that outline explicit procedures and detailed conceptualization of community assessment and evaluation have been developed. Other advancements include the application of theory to health education and promotion program development and implementation. However, there remains a need for more explicit specification of the processes by which one uses theory and empirical findings to develop interventions. This article presents the origins, purpose, and description of Intervention Mapping, a framework for health education intervention development. Intervention Mapping is composed of five steps: (1) creating a matrix of proximal program objectives, (2) selecting theory-based intervention methods and practical strategies, (3) designing and organizing a program, (4) specifying adoption and implementation plans, and (5) generating program evaluation plans.
NASA Astrophysics Data System (ADS)
Purwoko, Saad, Noor Shah; Tajudin, Nor'ain Mohd
2017-05-01
This study aims to: i) develop problem-solving questions on Linear Equations Systems of Two Variables (LESTV) based on the levels of the IPT Model; ii) describe the levels of students' information-processing skill in solving LESTV problems; iii) explain students' information-processing skill in solving LESTV problems; and iv) explain students' cognitive processes in solving LESTV problems. This study involves three phases: i) development of LESTV problem questions based on the Tessmer Model; ii) a quantitative survey method analyzing students' skill levels of information processing; and iii) a qualitative case study method analyzing students' cognitive processes. The population of the study was 545 eighth-grade students, represented by a sample of 170 students from five Junior High Schools in Hilir Barat Zone, Palembang (Indonesia), chosen using cluster sampling. Fifteen students among them were drawn as a sample for the interview sessions, which continued until information saturation was obtained. The data were collected using the LESTV problem-solving test and the interview protocol. The quantitative data were analyzed using descriptive statistics, while the qualitative data were analyzed using content analysis. The findings of this study indicated that students' cognitive processing was just at the step of identifying external sources and fluently executing algorithms in short-term memory. Only 15.29% of students could retrieve type A information and 5.88% of students could retrieve type B information from long-term memory. The implication is that the developed LESTV problems validated the IPT Model for modelling students' assessment at different levels of the hierarchy.
Development of real-time extensometer based on image processing
NASA Astrophysics Data System (ADS)
Adinanta, H.; Puranto, P.; Suryadi
2017-04-01
An extensometer system was developed using a high-definition web camera as the main sensor to track object position. The developed system applied digital image processing techniques to measure the change of object position. The position measurement was done in real time so that the system could directly show the actual position on both the x- and y-axes. In this research, the relation between pixel changes and object position changes was characterized. The system was tested by moving the target over a range of 20 cm in intervals of 1 mm. To verify the long-run performance, stability and linearity of continuous measurements on both the x- and y-axes, the measurement was conducted for 83 hours. The results show that this image-processing-based extensometer has both good stability and linearity.
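The paper's algorithms are not reproduced in the abstract; the sketch below illustrates one plausible measurement core for such a system: locate a bright target by intensity centroid in each frame and scale the pixel displacement by a calibration factor. The synthetic frames and the MM_PER_PIXEL value are assumptions for illustration.

```python
import numpy as np

MM_PER_PIXEL = 0.05   # assumed, from characterizing pixel vs. position change

def target_centroid(frame):
    """Centroid of pixels above a threshold (the tracked target)."""
    mask = frame > frame.max() * 0.8
    ys, xs = np.nonzero(mask)
    return xs.mean(), ys.mean()

def synthetic_frame(cx, cy, size=200):
    """Stand-in for a camera frame: a Gaussian spot at (cx, cy)."""
    y, x = np.mgrid[0:size, 0:size]
    return np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / 50.0)

ref = target_centroid(synthetic_frame(100.0, 100.0))
cur = target_centroid(synthetic_frame(112.0, 103.0))   # target has moved

dx_mm = (cur[0] - ref[0]) * MM_PER_PIXEL
dy_mm = (cur[1] - ref[1]) * MM_PER_PIXEL
print(f"displacement: x = {dx_mm:.3f} mm, y = {dy_mm:.3f} mm")
```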
Valentijn, Pim P; Ruwaard, Dirk; Vrijhoef, Hubertus J M; de Bont, Antoinette; Arends, Rosa Y; Bruijnzeels, Marc A
2015-10-09
Collaborative partnerships are considered an essential strategy for integrating local disjointed health and social services. Currently, little evidence is available on how integrated care arrangements between professionals and organisations are achieved through the evolution of collaboration processes over time. The first aim was to develop a typology of integrated care projects (ICPs) based on the final degree of integration as perceived by multiple stakeholders. The second aim was to study how types of integration differ in changes of collaboration processes over time and in final perceived effectiveness. A longitudinal mixed-methods study design based on two data sources (surveys and interviews) was used to identify the perceived degree of integration and patterns in collaboration among 42 ICPs in primary care in The Netherlands. We used cluster analysis to identify distinct subgroups of ICPs based on the final perceived degree of integration from a professional, organisational and system perspective. With the use of ANOVAs, the subgroups were contrasted based on: 1) changes in collaboration processes over time (shared ambition, interests and mutual gains, relationship dynamics, organisational dynamics and process management) and 2) final perceived effectiveness (i.e. rated success) at the professional, organisational and system levels. The ICPs were classified into three subgroups: 'United Integration Perspectives (UIP)', 'Disunited Integration Perspectives (DIP)' and 'Professional-oriented Integration Perspectives (PIP)'. ICPs within the UIP subgroup showed the strongest increase in trust-based (mutual gains and relationship dynamics) as well as control-based (organisational dynamics and process management) collaboration processes and had the highest overall effectiveness rates. On the other hand, ICPs within the DIP subgroup declined on collaboration processes and had the lowest overall effectiveness rates. ICPs within the PIP subgroup increased in control-based collaboration processes (organisational dynamics and process management) and had the highest effectiveness rates at the professional level. The differences across the three subgroups in terms of the development of collaboration processes and the final perceived effectiveness provide evidence that united stakeholder perspectives are achieved through a constructive collaboration process over time. Disunited perspectives at the professional, organisation and system levels can be aligned by both trust-based and control-based collaboration processes.
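As an illustrative analogue of the clustering step (the abstract does not state which algorithm was used), the sketch below groups synthetic per-project integration scores at the professional, organisational and system levels into three clusters with scikit-learn's KMeans; the scores and cluster centres are invented.

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic scores for 42 projects (14 per pattern), mirroring the three
# reported subgroups (UIP, DIP, PIP); columns are the three levels.
rng = np.random.default_rng(5)
centres = np.array([[4.2, 4.1, 3.9],    # united perspectives
                    [2.1, 2.3, 2.0],    # disunited perspectives
                    [4.0, 2.4, 2.2]])   # professional-oriented
scores = np.vstack([c + rng.normal(0, 0.3, size=(14, 3)) for c in centres])

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
for k in range(3):
    print(f"cluster {k}: n={np.sum(labels == k)}, "
          f"mean scores={scores[labels == k].mean(axis=0).round(2)}")
```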
ERIC Educational Resources Information Center
Singh, Oma B.
2009-01-01
This study used a design-based research (DBR) methodology to examine how an Instructional Systems Design (ISD) process such as ADDIE (Analysis, Design, Development, Implementation, Evaluation) can be employed to develop a web-based module to teach metacognitive learning strategies to students in higher education. The goal of the study was…
USDA-ARS?s Scientific Manuscript database
This paper presents a new GIS-based Best Management Practice (BMP) Tool developed for watershed managers to assist in the decision-making process by simulating various scenarios using combinations of Best Management Practices (BMPs). The development of this BMP Tool is based on the integratio...
Development and Validation of a Theory Based Screening Process for Suicide Risk
2015-09-01
...not be delayed until all data have been collected. This is with particular respect to our data confirming that soldiers underreport suicidal ideation, and that while they say they would inform loved ones about suicidal thoughts, over 50% of soldiers who endorse ideation have not told anyone. (Award Number: W81XWH-11-1-0588)
Turnaround operations analysis for OTV. Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
1988-01-01
Analyses performed for ground processing of both expendable and reusable ground-based Orbital Transfer Vehicles (OTVs) launched on the Space Transportation System (STS), a reusable space-based OTV (SBOTV) launched on the STS, and a reusable ground-based OTV (GBOTV) launched on an unmanned cargo vehicle and recovered by the Orbiter are summarized. Also summarized are the analyses performed for space processing of the reusable SBOTV at the Space Station in low Earth orbit (LEO), as well as the maintenance and servicing of the SBOTV accommodations at the Space Station. In addition, the candidate OTV concepts, design and interface requirements, and the Space Station design, support, and interface requirements are summarized. A development schedule and associated costs for the required SBOTV accommodations at the Space Station are presented. Finally, the technology development plan to develop the capability to process both GBOTVs and SBOTVs is summarized.
King, Gillian; Shepherd, Tracy A; Servais, Michelle; Willoughby, Colleen; Bolack, Linda; Strachan, Deborah; Moodie, Sheila; Baldwin, Patricia; Knickle, Kerry; Parker, Kathryn; Savage, Diane; McNaughton, Nancy
2016-10-01
To describe the creation and validation of six simulations concerned with effective listening and interpersonal communication in pediatric rehabilitation. The simulations involved clinicians from various disciplines, were based on clinical scenarios related to client issues, and reflected core aspects of listening/communication. Each simulation had a key learning objective, thus focusing clinicians on specific listening skills. The article outlines the process used to turn written scenarios into digital video simulations, including steps taken to establish content validity and authenticity, and to establish a series of videos based on the complexity of their learning objectives, given contextual factors and associated macrocognitive processes that influence the ability to listen. A complexity rating scale was developed and used to establish a gradient of easy/simple, intermediate, and hard/complex simulations. The development process exemplifies an evidence-based, integrated knowledge translation approach to the teaching and learning of listening and communication skills.
Formal Specification of Information Systems Requirements.
ERIC Educational Resources Information Center
Kampfner, Roberto R.
1985-01-01
Presents a formal model for specification of logical requirements of computer-based information systems that incorporates structural and dynamic aspects based on two separate models: the Logical Information Processing Structure and the Logical Information Processing Network. The model's role in systems development is discussed. (MBR)
Mentoring a new science teacher in reform-based ways: A focus on inquiry
NASA Astrophysics Data System (ADS)
Schomer, Scott D.
The processes, understandings, and uses of inquiry are identified by the National Science Education Standards (National Research Council, 1996) as a key component of science instruction. Currently, there are few examples in the literature demonstrating how teachers go about co-constructing inquiry-based activities and how mentors can promote the use of reform-based practices by novices. The purpose of this interpretive case study was to investigate how a mentor and her protege collaboratively developed, implemented and assessed three inquiry-based experiences. The questions that guided this research were: (1) How does the mentor assist protege growth in the development, implementation and assessment of inquiry-based experiences for secondary science students? (2) How are the protege's perceptions of inquiry influenced by her participation in developing, implementing and assessing inquiry-based experiences for secondary science students? The co-construction of the inquiry activities and the facilitation provided by the mentor represented Lev Vygotsky's (1978) social construction of information as the mentor guided the protege beyond her cognitive zone of proximal development. The participants in this study were a veteran science teacher who was obtaining her mentor certification, or Teacher Support Specialist, and her protege who was a science teacher in the induction phase of her career. Data were collected through in-depth, semi-structured interviews, tape recordings of planning sessions, researcher field notes, and email reflections during the co-construction process. Inductive analysis of the data led to the identification of common categories and subsequent findings, which reflected what the mentor and protege discussed about inquiry and the process of collaboration. The six themes that emerged from this study led to several implications that are significant for science teacher preparation and the mentoring community. The teachers indicated tools, such as the "Essential Features and Variations of Inquiry" table, were helpful for planning and assessing inquiry-based experiences. Examination of findings revealed how the process of purposefully collaborating on the development of inquiry-based lessons fostered a more student-centered approach to teaching and learning by the protege. Therefore, having new teachers continue to collaborate with reform-minded mentors beyond their first year of teaching may help new teachers develop inquiry-based pedagogies.
Intelligent processing of acoustic emission signals
NASA Astrophysics Data System (ADS)
Sachse, Wolfgang; Grabec, Igor
1992-07-01
Recent developments in applying neural-like signal-processing procedures for analyzing acoustic emission signals are summarized. These procedures employ a set of learning signals to develop a memory that can subsequently be utilized to process other signals to recover information about an unknown source. A majority of the current applications to process ultrasonic waveforms are based on multilayered, feed-forward neural networks, trained with some type of back-error propagation rule.
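As a minimal illustration of the approach described (a memory learned from training signals, then used on unknown signals), here is a toy two-layer feed-forward network trained with back-propagation in plain NumPy. The synthetic waveforms, network size, and learning rate are all assumptions, not details from the summarized work.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "learning signals": 64-sample damped sinusoids from two classes
def make_signal(cls):
    t = np.linspace(0, 1, 64)
    freq = 5.0 if cls == 0 else 12.0
    return np.sin(2 * np.pi * freq * t) * np.exp(-3 * t) + 0.1 * rng.standard_normal(64)

X = np.array([make_signal(c) for c in (0, 1) * 50])
y = np.array([c for c in (0, 1) * 50], dtype=float).reshape(-1, 1)

W1 = rng.standard_normal((64, 8)) * 0.1; b1 = np.zeros(8)   # hidden layer
W2 = rng.standard_normal((8, 1)) * 0.1;  b2 = np.zeros(1)   # output layer
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for epoch in range(1000):                    # plain batch gradient descent
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)               # source-class estimate
    grad_out = (out - y) * out * (1 - out)   # back-propagate the error
    grad_h = grad_out @ W2.T * h * (1 - h)
    W2 -= 0.5 * h.T @ grad_out / len(X); b2 -= 0.5 * grad_out.mean(0)
    W1 -= 0.5 * X.T @ grad_h / len(X);  b1 -= 0.5 * grad_h.mean(0)

print("training accuracy:", ((out > 0.5) == y).mean())
```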
Depressive Rumination: Investigating Mechanisms to Improve Cognitive Behavioural Treatments
Watkins, Edward R.
2009-01-01
Rumination has been identified as a core process in the development and maintenance of depression. Treatments targeting ruminative processes may, therefore, be particularly helpful for treating chronic and recurrent depression. The development of such treatments requires translational research that marries clinical trials, process–outcome research, and basic experimental research that investigates the mechanisms underpinning pathological rumination. For example, a program of experimental research has demonstrated that there are distinct processing modes during rumination that have distinct functional effects for the consequences of rumination on a range of clinically relevant cognitive and emotional processes: an adaptive style characterized by more concrete, specific processing and a maladaptive style characterized by abstract, overgeneral processing. Based on this experimental work, two new treatments for depression have been developed and evaluated: (a) rumination-focused cognitive therapy, an individual-based face-to-face therapy, which has encouraging results in the treatment of residual depression in an extended case series and a pilot randomized controlled trial; and (b) concreteness training, a facilitated self-help intervention intended to increase specificity of processing in patients with depression, which has beneficial findings in a proof-of-principle study in a dysphoric population. These findings indicate the potential value of process–outcome research (a) explicitly targeting identified vulnerability processes and (b) developing interventions informed by research into basic mechanisms. PMID:19697180
A new chapter in pharmaceutical manufacturing: 3D-printed drug products.
Norman, James; Madurawe, Rapti D; Moore, Christine M V; Khan, Mansoor A; Khairuzzaman, Akm
2017-01-01
FDA recently approved a 3D-printed drug product in August 2015, which is indicative of a new chapter for pharmaceutical manufacturing. This review article summarizes progress with 3D printed drug products and discusses process development for solid oral dosage forms. 3D printing is a layer-by-layer process capable of producing 3D drug products from digital designs. Traditional pharmaceutical processes, such as tablet compression, have been used for decades with established regulatory pathways. These processes are well understood, but antiquated in terms of process capability and manufacturing flexibility. 3D printing, as a platform technology, has competitive advantages for complex products, personalized products, and products made on-demand. These advantages create opportunities for improving the safety, efficacy, and accessibility of medicines. Although 3D printing differs from traditional manufacturing processes for solid oral dosage forms, risk-based process development is feasible. This review highlights how product and process understanding can facilitate the development of a control strategy for different 3D printing methods. Overall, the authors believe that the recent approval of a 3D printed drug product will stimulate continual innovation in pharmaceutical manufacturing technology. FDA encourages the development of advanced manufacturing technologies, including 3D-printing, using science- and risk-based approaches. Published by Elsevier B.V.
Production of orthophosphate suspension fertilizers from wet-process acid
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, T.M.; Burnell, J.R.
1984-01-01
For many years, the Tennessee Valley Authority (TVA) has worked toward development of suspension fertilizers. TVA has two plants for production of base suspension fertilizers from wet-process orthophosphoric acid. One is a demonstration-scale plant where a 13-38-0 grade base suspension is produced by a three-stage ammoniation process. The other is a new batch-type pilot plant which is capable of producing high-grade base suspensions of various ratios and grades from wet-process acid. In this batch plant, suspensions and solutions can also be produced from solid intermediates.
DOT National Transportation Integrated Search
2011-01-01
This study develops an enhanced transportation planning framework by augmenting the sequential four-step planning process with post-processing techniques. The post-processing techniques are incorporated through a feedback mechanism and aim to imp...
Development of Multi-slice Analytical Tool to Support BIM-based Design Process
NASA Astrophysics Data System (ADS)
Atmodiwirjo, P.; Johanes, M.; Yatmo, Y. A.
2017-03-01
This paper describes the on-going development of a computational tool to analyse architecture and interior space based on a multi-slice representation approach that is integrated with Building Information Modelling (BIM). Architecture and interior space are experienced as a dynamic entity whose spatial properties may vary from one part of the space to another; therefore, the representation of space through standard architectural drawings is sometimes not sufficient. The representation of space as a series of slices, each with certain properties, becomes important, so that the different characteristics in each part of the space can inform the design process. The analytical tool is developed as a stand-alone application that utilises data exported from a generic BIM modelling tool. The tool would be useful for assisting design development processes that apply BIM, particularly for the design of architecture and interior spaces that are experienced as continuous spaces. The tool allows the identification of how spatial properties change dynamically throughout the space and allows the prediction of potential design problems. Integrating the multi-slice analytical tool in a BIM-based design process could thereby assist architects in generating better designs and avoiding unnecessary costs that are often caused by failure to identify problems during design development stages.
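The tool's internals are not specified in the abstract; the following sketch shows the bare idea of multi-slice analysis: cut geometry exported from a BIM model into slices along one axis and report a per-slice property. The point cloud and the 'daylight' attribute are invented stand-ins.

```python
import numpy as np

# Stand-in for geometry exported from a BIM model: points in a 20 x 8 x 3 m
# room, each carrying a synthetic "daylight" value that fades with depth.
rng = np.random.default_rng(3)
points = rng.uniform([0, 0, 0], [20.0, 8.0, 3.0], size=(5000, 3))  # x, y, z (m)
daylight = np.clip(1.0 - points[:, 0] / 20.0, 0, 1)

n_slices = 10
edges = np.linspace(0.0, 20.0, n_slices + 1)   # slice along the x axis
for i in range(n_slices):
    in_slice = (points[:, 0] >= edges[i]) & (points[:, 0] < edges[i + 1])
    print(f"slice {i:2d} [{edges[i]:5.1f}-{edges[i+1]:5.1f} m] "
          f"mean daylight = {daylight[in_slice].mean():.2f}")
```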
NASA Astrophysics Data System (ADS)
Larson, Teresa
2011-12-01
This self-study examines my experiences with implementing an inquiry-based version of a chemistry course (Chemistry 299) designed for elementary education majors. The inquiry-based curriculum design and teaching strategies that I implement in Chemistry 299 are the focus of this study. Since my previous education and professional experiences were in the physical sciences, I position myself in this study as a scientist who engages in self-study as a form of professional development for the purpose of developing an inquiry-based curriculum and instructional practices. My research provides an inside perspective of the curriculum development process. This process involves implementing the inquiry-oriented ideas and knowledge I acquired in my graduate studies to design the curriculum and influence my teaching practice. My analysis of the curriculum and my instruction is guided by two questions: What are the strengths and weaknesses of the inquiry-based Chemistry 299 curriculum design? What does the process of developing my inquiry-based teaching practice entail and what makes it challenging? Schwab's (1973) The Practical 3: Translation into Curriculum serves as the theoretical framework for this study because of the emphasis Schwab places on combining theoretical and practical knowledge in the curriculum development process and because of the way he characterizes the curriculum. The findings in this study are separated into curriculum and instruction domains. First, the Chemistry 299 curriculum was designed to make the epistemological practices of scientists "accessible" to students by emphasizing epistemic development with respect to their ideas about scientific inquiry and science learning. Using student learning as a gauge for progress, I identify specific design elements that developed transferable inquiry skills as a means to support scientific literacy and pre-service teacher education. Second, the instruction-related findings built upon the insight I gained through my analysis of the curriculum. The data reveal four areas of inner conflict I dealt with throughout the study that related to underlying beliefs I held about science teaching and learning. The implications of the study position the Chemistry 299 curriculum in the field and speak to issues related to developing science courses for elementary education majors and professional development for scientists.
Workspace definition for navigated control functional endoscopic sinus surgery
NASA Astrophysics Data System (ADS)
Gessat, Michael; Hofer, Mathias; Audette, Michael; Dietz, Andreas; Meixensberger, Jürgen; Stauß, Gero; Burgert, Oliver
2007-03-01
For the pre-operative definition of a surgical workspace for Navigated Control® Functional Endoscopic Sinus Surgery (FESS), we developed a semi-automatic image processing system. Based on observations of surgeons using a manual system, we implemented a workflow-based engineering process that led us to the development of a system that reduces the time and workload spent on workspace definition. The system uses a feature based on local curvature to align vertices of a polygonal outline along the bone structures defining the cavities of the inner nose. An anisotropic morphological operator was developed to solve problems arising from noise artifacts and partial volume effects. We used time measurements and NASA's TLX questionnaire to evaluate our system.
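The paper's curvature feature is not defined in the abstract; as a generic stand-in, the sketch below computes the turning angle at each vertex of a closed polygonal outline and flags high-curvature vertices, which is one simple way such a feature could be realized. The contour and threshold are synthetic assumptions.

```python
import numpy as np

# Synthetic closed contour (a wavy circle standing in for a cavity outline)
theta = np.linspace(0, 2 * np.pi, 100, endpoint=False)
r = 1.0 + 0.3 * np.cos(4 * theta)
contour = np.stack([r * np.cos(theta), r * np.sin(theta)], axis=1)

# Edge vectors into and out of each vertex (contour is closed, so use roll)
v1 = contour - np.roll(contour, 1, axis=0)
v2 = np.roll(contour, -1, axis=0) - contour

# Signed turning angle between successive edges; large magnitude = corner
cross = v1[:, 0] * v2[:, 1] - v1[:, 1] * v2[:, 0]
dot = (v1 * v2).sum(axis=1)
turning = np.arctan2(cross, dot)

# Flag vertices more than one standard deviation above the mean curvature
high_curvature = np.abs(turning) > np.abs(turning).mean() + np.abs(turning).std()
print("candidate anchor vertices:", np.nonzero(high_curvature)[0][:10])
```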
Research and technology, fiscal year 1982
NASA Technical Reports Server (NTRS)
1982-01-01
Advanced studies are reviewed. Atmospheric sciences, magnetospheric physics, solar physics, gravitational physics, astronomy, and materials processing in space comprise the research programs. Large space systems, propulsion technology, materials and processes, electrical/electronic systems, data bases/design criteria, and facilities development comprise the technology development activities.
Uptake, metabolism, and volatilization of selenium by terrestrial plants
USDA-ARS?s Scientific Manuscript database
The green technology of phytoremediation is being developed for the management of metal(loid)-contaminated soils and waters via the processes of phytoextraction and phytovolatilization. Based upon these processes, a plant management remediation strategy for selenium (Se) has been developed for the ...
The Building Wellness project: a case history of partnership, power sharing, and compromise.
Jones, Drew; Franklin, Charla; Butler, Brittany T; Williams, Pluscedia; Wells, Kenneth B; Rodríguez, Michael A
2006-01-01
The Institute of Medicine has recommended development of community-focused strategies to alleviate the disproportionate burden of illness on minorities, including depression. So far, limited data exist on the process of developing such partnerships within diverse racial/ethnic environments as they strive to develop community-driven, evidence-based action plans to improve the quality of outreach services. We describe such an effort around depression in south Los Angeles and explore the issues of the process in the hopes of informing future partnership development. Community meetings, presentations, feedback, discussion groups, and consensus-based action items were implemented over an 18-month period. A writing subcommittee was designated to develop a description of the group's work and process, as well as the diverse perspectives in the partnership. Data sources included meeting minutes, materials for members and community feedback presentations, scribe notes, and the reflections of the authors. Development was seen on the formal group level, in the process, and on the realization of three categories of action plans. Designed to assist social service caseworkers in the recognition of and referral for depression, the action plans included developing a website, a tool kit (modified Delphi process), and a one-page depression "fact sheet" with region-specific referrals. Through the process of developing a means to combat depression in a racially/ethnically diverse population, the community is not only better informed about depression but has become a true partner with the academic element in adapting these programs for local service providers, resulting in improved understanding of the partnership process.
Mashup Model and Verification Using Mashup Processing Network
NASA Astrophysics Data System (ADS)
Zahoor, Ehtesham; Perrin, Olivier; Godart, Claude
Mashups are defined to be lightweight Web applications aggregating data from different Web services, built using ad-hoc composition and not concerned with long-term stability and robustness. In this paper we present a pattern-based approach, called Mashup Processing Network (MPN). The idea is based on the Event Processing Network and is intended to facilitate the creation, modeling and verification of mashups. MPN provides a view of how different actors interact in mashup development, namely the producer, consumer, mashup processing agent and the communication channels. It also supports modeling transformations and validations of data and offers validation of both functional and non-functional requirements, such as reliable messaging and security, which are key issues within the enterprise context. We have enriched the model with a set of processing operations and categorized them into data composition, transformation and validation categories. These processing operations can be seen as a set of patterns facilitating the mashup development process. MPN also paves the way for realizing a Mashup Oriented Architecture, where mashups along with services are used as building blocks for application development.
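To make the actor roles concrete, here is a minimal Python sketch of an MPN-style pipeline: a producer feeds records through named processing operations from the three categories the paper lists (composition, transformation, validation) to a consumer. The operations and sample feeds are invented for illustration.

```python
# Producer: two hypothetical source feeds, as a mashup might aggregate
def produce():
    weather = [{"city": "Nancy", "temp_c": 21}, {"city": "Metz", "temp_c": 19}]
    events = [{"city": "Nancy", "event": "jazz festival"}]
    return weather, events

def compose(weather, events):          # data composition: join feeds on city
    by_city = {w["city"]: dict(w) for w in weather}
    for e in events:
        by_city.setdefault(e["city"], {"city": e["city"]}).update(e)
    return list(by_city.values())

def transform(records):                # transformation: unit conversion
    for r in records:
        if "temp_c" in r:
            r["temp_f"] = r["temp_c"] * 9 / 5 + 32
    return records

def validate(records):                 # validation: simple schema check
    return [r for r in records if "city" in r and "temp_c" in r]

def consume(records):                  # consumer: render the merged view
    for r in records:
        print(r)

consume(validate(transform(compose(*produce()))))
```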
NASA Technical Reports Server (NTRS)
Griesel, Martha Ann
1988-01-01
Several Laboratory software development projects that followed nonstandard development processes, which were hybrids of incremental development and prototyping, are being studied. Factors in the project environment leading to the decision to use a nonstandard development process and affecting its success are analyzed. A simple characterization of project environment based on this analysis is proposed, together with software development approaches which have been found effective for each category. These approaches include both documentation and review requirements.
ERIC Educational Resources Information Center
Haque, Mohammad Mahfujul; Little, David C.; Barman, Benoy K.; Wahab, Md. Abdul
2010-01-01
Purpose: The purpose of the study was to understand the adoption process of ricefield based fish seed production (RBFSP) that has been developed, promoted and established in Northwest Bangladesh. Design/Methodology/Approach: Quantitative investigation based on regression analysis and qualitative investigation using semi-structured interview were…
Designing a Web-Based Science Learning Environment for Model-Based Collaborative Inquiry
ERIC Educational Resources Information Center
Sun, Daner; Looi, Chee-Kit
2013-01-01
The paper traces a research process in the design and development of a science learning environment called WiMVT (web-based inquirer with modeling and visualization technology). The WiMVT system is designed to help secondary school students build a sophisticated understanding of scientific conceptions, and the science inquiry process, as well as…
Image Understanding Architecture
1991-09-01
architecture to support real-time, knowledge-based image understanding, and develop the software support environment that will be needed to utilize... [Report keywords: Image Understanding Architecture, Knowledge-Based Vision, AI, Real-Time Computer Vision, Software Simulator, Parallel Processor.] In addition to sensory and knowledge-based processing it is useful to introduce a level of symbolic processing. Thus, vision researchers...
Hamidi, Ahd; Kreeftenberg, Hans; V D Pol, Leo; Ghimire, Saroj; V D Wielen, Luuk A M; Ottens, Marcel
2016-05-01
Vaccination is one of the most successful public health interventions, being a cost-effective tool for preventing deaths among young children. The earliest vaccines were developed following empirical methods, creating vaccines by trial and error. New process development tools, for example mathematical modeling, as well as new regulatory initiatives requiring better understanding of both the product and the process, are being applied to well-characterized biopharmaceuticals (for example recombinant proteins). The vaccine industry still lags behind these industries. A production process for a new Haemophilus influenzae type b (Hib) conjugate vaccine, including related quality control (QC) tests, was developed and transferred to a number of emerging vaccine manufacturers. This contributed to a sustainable global supply of affordable Hib conjugate vaccines, as illustrated by the market launch of the first Hib vaccine based on this technology in 2007 and the concomitant price reduction of Hib vaccines. This paper describes the development approach followed for this Hib conjugate vaccine as well as the mathematical modeling tool applied recently in order to indicate options for further improvements of the initial Hib process. The strategy followed during the process development of this Hib conjugate vaccine was a targeted and integrated approach based on prior knowledge and experience with similar products, using multi-disciplinary expertise. Mathematical modeling was used to develop a predictive model for the initial Hib process (the 'baseline' model) as well as an 'optimized' model, by proposing a number of process changes which could lead to a further reduction in price. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:568-580, 2016. © 2016 American Institute of Chemical Engineers.
A Focusing Method in the Calibration Process of Image Sensors Based on IOFBs
Fernández, Pedro R.; Lázaro, José L.; Gardel, Alfredo; Cano, Ángel E.; Bravo, Ignacio
2010-01-01
A focusing procedure in the calibration process of image sensors based on Incoherent Optical Fiber Bundles (IOFBs) is described, using the information extracted from the fibers. These procedures differ from any other currently known focusing method due to the non-spatial in-out correspondence between fibers, which produces a natural codification of the image to transmit. Focus measurement is essential prior to carrying out calibration in order to guarantee accurate processing and decoding. Four algorithms have been developed to estimate the focus measure: two based on mean grey level, and two based on variance. In this paper, a few simple focus measures are defined and compared. Some experimental results regarding the focus measure and the accuracy of the developed methods are discussed in order to demonstrate their effectiveness. PMID:22315526
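The four algorithms themselves are not given in the abstract; the sketch below compares the two families it names, a mean-grey-level measure and a variance measure, on a sharp image versus an artificially defocused copy. The data are synthetic, and a real IOFB system would compute the statistics per fibre rather than per pixel.

```python
import numpy as np

rng = np.random.default_rng(7)
sharp = (rng.random((128, 128)) > 0.5).astype(float) * 255   # high-contrast image

def box_blur(img, k=5):
    """Defocus proxy: k x k mean filter via shifted-view summation."""
    pad = np.pad(img, k // 2, mode="edge")
    out = sum(pad[i:i+128, j:j+128] for i in range(k) for j in range(k))
    return out / (k * k)

blurred = box_blur(sharp)

for name, img in (("sharp", sharp), ("blurred", blurred)):
    print(f"{name}: mean={img.mean():.1f}, variance={img.var():.1f}")
# Variance drops sharply with defocus while the mean stays similar, which is
# why variance-based measures usually discriminate focus better.
```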
Bakker, Wilfried A M; Thomassen, Yvonne E; van't Oever, Aart G; Westdijk, Janny; van Oijen, Monique G C T; Sundermann, Lars C; van't Veld, Peter; Sleeman, Eelco; van Nimwegen, Fred W; Hamidi, Ahd; Kersten, Gideon F A; van den Heuvel, Nico; Hendriks, Jan T; van der Pol, Leo A
2011-09-22
Industrial-scale inactivated polio vaccine (IPV) production dates back to the 1960s when at the Rijks Instituut voor de Volksgezondheid (RIV) in Bilthoven a process was developed based on micro-carrier technology and primary monkey kidney cells. This technology was freely shared with several pharmaceutical companies and institutes worldwide. In this contribution, the history of one of the first cell-culture based large-scale biological production processes is summarized. Also, recent developments and the anticipated upcoming shift from regular IPV to Sabin-IPV are presented. Responding to a call by the World Health Organization (WHO) for new polio vaccines, the development of Sabin-IPV was continued, after demonstrating proof of principle in the 1990s, at the Netherlands Vaccine Institute (NVI). Development of Sabin-IPV plays an important role in the WHO polio eradication strategy as biocontainment will be critical in the post-OPV cessation period. The use of attenuated Sabin strains instead of wild-type Salk polio strains will provide additional safety during vaccine production. Initially, the Sabin-IPV production process will be based on the scale-down model of the current, and well-established, Salk-IPV process. In parallel to clinical trial material production, process development, optimization and formulation research is being carried out to further optimize the process and reduce cost per dose. Also, results will be shown from large-scale (to prepare for future technology transfer) generation of Master- and Working virus seedlots, and clinical trial material (for phase I studies) production. Finally, the planned technology transfer to vaccine manufacturers in low and middle-income countries is discussed. Copyright © 2011 Elsevier Ltd. All rights reserved.
An expert systems application to space base data processing
NASA Technical Reports Server (NTRS)
Babb, Stephen M.
1988-01-01
The advent of space vehicles with their increased data requirements is reflected in the complexity of future telemetry systems. Space-based operations, with their immense operating costs, will shift the burden of data processing and routine analysis from the space station to the Orbital Transfer Vehicle (OTV). A research and development project is described which addresses the real-time onboard data processing tasks associated with a space-based vehicle, specifically focusing on an implementation of an expert system.
Peer Review for EPA’s Biologically Based Dose-Response (BBDR) Model for Perchlorate
EPA is developing a regulation for perchlorate in drinking water. As part of the regulatory process, EPA must develop a Maximum Contaminant Level Goal (MCLG). FDA and EPA scientists developed a biologically based dose-response (BBDR) model to assist in deriving the MCLG. This mode...
Learning Strategies for Adolescents with Mild Disabilities
ERIC Educational Resources Information Center
Conderman, Greg; Koman, Kara; Schibelka, Mary; Higgin, Karen; Cooper, Cody; Butler, Jordyn
2013-01-01
Learning strategy instruction is an evidence-based practice for teaching adolescents with mild disabilities. However, researchers have not developed strategies for every content area or skill. Therefore, teachers need to be able to develop strategies based on the needs of their students. This article reviews the process for developing and teaching…
EPA announced the availability of the final report, Considerations for Developing a Dosimetry-Based Cumulative Risk Assessment Approach for Mixtures of Environmental Contaminants. This report describes a process that can be used to determine the potential value of develop...
Harris, Claire; Garrubba, Marie; Allen, Kelly; King, Richard; Kelly, Cate; Thiagarajan, Malar; Castleman, Beverley; Ramsey, Wayne; Farjou, Dina
2015-12-28
This paper reports the process of establishing a transparent, accountable, evidence-based program for the introduction of new technologies and clinical practices (TCPs) in a large Australian healthcare network. Many countries have robust evidence-based processes for assessment of new TCPs at the national level. However, many decisions are made by local health services, where the resources and expertise to undertake health technology assessment (HTA) are limited, and a lack of structure, process and transparency has been reported. An evidence-based model for process change was used to establish the program. Evidence from research and local data, the experience of health service staff, and consumer perspectives were incorporated at each of four steps: identifying the need for change, developing a proposal, implementation, and evaluation. Checklists assessing characteristics of success, factors for sustainability, and barriers and enablers were applied, and implementation strategies were based on these findings. Quantitative and qualitative methods were used for process and outcome evaluation. An action research approach underpinned ongoing refinement of systems, processes and resources. A Best Practice Guide developed from the literature and stakeholder consultation identified seven program components: Governance, Decision-Making, Application Process, Monitoring and Reporting, Resources, Administration, and Evaluation and Quality Improvement. The aims of transparency and accountability were achieved: the processes are explicit, decisions published, outcomes recorded and activities reported. The aim of ascertaining rigorous evidence-based information for decision-making was not achieved in all cases. Applicants proposing new TCPs provided the evidence from research literature and local data; however, the information was often incorrect or inadequate, overestimating benefits and underestimating costs. Due to these limitations, the initial application process was replaced by an Expression of Interest from applicants, followed by a rigorous HTA by independent in-house experts. The program is generalisable to most healthcare organisations. With one exception, the components would be achievable with minimal additional resources; the lack of skills and resources required for HTA will limit effective application in many settings. A toolkit containing details of the processes and sample materials is provided to facilitate replication or local adaptation by those wishing to establish a similar program.
Paskevich, Valerie F.
1992-01-01
The Branch of Atlantic Marine Geology has been involved in the collection, processing, and digital mosaicking of high-, medium-, and low-resolution side-scan sonar data during the past 6 years. In the past, processing and digital mosaicking were accomplished with a dedicated, shore-based computer system. With the need to process side-scan data in the field, and the increased power and reduced cost of workstations, the Branch identified a need for an image processing package on a UNIX-based computer system that could be utilized in the field as well as be more generally available to Branch personnel. This report describes the initial development of that package, referred to as the Woods Hole Image Processing System (WHIPS). The software was developed using the Unidata NetCDF software interface to allow data to be more readily portable between different computer operating systems.
Lannering, Christina; Ernsth Bravell, Marie; Johansson, Linda
2017-05-01
A structured and systematic care process for preventive work, aimed to reduce falls, pressure ulcers and malnutrition among older people, has been developed in Sweden. The process involves risk assessment, team-based interventions and evaluation of results. Since development, this structured work process has become web-based and has been implemented in a national quality registry called 'Senior Alert' and used countrywide. The aim of this study was to describe nursing staff's experience of preventive work by using the structured preventive care process as outlined by Senior Alert. Eight focus group interviews were conducted during 2015 including staff from nursing homes and home-based nursing care in three municipalities. The interview material was subjected to qualitative content analysis. In this study, both positive and negative opinions were expressed about the process. The systematic and structured work flow seemed to only partly facilitate care providers to improve care quality by making better clinical assessments, performing team-based planned interventions and learning from results. Participants described lack of reliability in the assessments and varying opinions about the structure. Furthermore, organisational structures limited the preventive work. © 2016 John Wiley & Sons Ltd.
Gehrlach, Christoph; Güntert, Bernhard
2015-01-01
Patient satisfaction (PS) surveys are frequently used evaluation methods to show performance from the customer's view. This approach has some fundamental deficits, especially with respect to theory, methodology and usage. Because of the significant theoretical value of the expectation confirmation/disconfirmation concept in the development of PS, an expectation-based experience typology has been developed and tested to check whether this approach could be a theoretical and practical alternative to the survey of PS. Because the comparison between expectations and expectation fulfilment is a mainly cognitive-rational process, it is easier to make changes at this stage than in the subsequent stage of the development of PS, which is based mainly on emotional-affective processes. The paper contains a literature review of the common concept of PS and its causal and influencing factors. Based on the theoretical part of this study, an expectation-based experience typology was developed. In the next step, the typology was subjected to exploratory testing, based on two patient surveys. In some parts of the tested typology, exploratory differences between hospitals were found. Despite this rather complex and unusual approach, the expectation-based experience typology offers the chance to change conditions not only retrospectively (based on data), but also in a prospective way, in terms of a "management of expectations". Copyright © 2014. Published by Elsevier GmbH.
Combining human and machine processes (CHAMP)
NASA Astrophysics Data System (ADS)
Sudit, Moises; Sudit, David; Hirsch, Michael
2015-05-01
Machine reasoning and intelligence is usually done in a vacuum, without consultation of the ultimate decision-maker. The late consideration of the human cognitive process causes major problems in the use of automated systems to provide reliable and actionable information that users can trust and depend on to make the best Course-of-Action (COA). On the other hand, if automated systems are created exclusively based on human cognition, then there is a danger of developing systems that don't push the boundaries of technology and are designed mainly for the comfort level of selected subject matter experts (SMEs). Our approach to combining human and machine processes (CHAMP) is based on the notion of developing optimal strategies for where, when, how, and which human intelligence should be injected within a machine reasoning and intelligence process. This combination is based on the criteria of improving the quality of the output of the automated process while maintaining the computational efficiency required for a COA to be actuated in a timely fashion. This research addresses the following problem areas:
• Providing consistency within a mission: injection of human reasoning and intelligence within the reliability and temporal needs of a mission to attain situational awareness, impact assessment, and COA development.
• Supporting the incorporation of data that is uncertain, incomplete, imprecise and contradictory (UIIC): development of mathematical models to suggest the insertion of a cognitive process within a machine reasoning and intelligent system so as to minimize UIIC concerns.
• Developing systems that include humans in the loop whose performance can be analyzed and understood to provide feedback to the sensors.
Nurse-Managed Clinics: A Blueprint for Success Using the Covey Framework.
ERIC Educational Resources Information Center
Starck, Patricia L.; And Others
1995-01-01
Describes the process from inception to successful operation of a university-based, nurse-managed clinic, based on Covey's seven habits of highly effective people. Includes information on the planning process, financing, political strategies for gaining approval, and ongoing development of services. (JOW)
Radiology information system: a workflow-based approach.
Zhang, Jinyan; Lu, Xudong; Nie, Hongchao; Huang, Zhengxing; van der Aalst, W M P
2009-09-01
Introducing workflow management technology in healthcare seems promising for dealing with the problem that current healthcare information systems cannot provide sufficient support for process management, although several challenges still exist. The purpose of this paper is to study the method of developing a workflow-based information system for a radiology department as a use case. First, a workflow model of the typical radiology process was established. Second, based on the model, the system could be designed and implemented as a group of loosely coupled components. Each component corresponded to one task in the process and could be assembled by the workflow management system. Legacy systems could be treated as special components, which also corresponded to tasks and were integrated by converting non-workflow-aware interfaces to standard ones. Finally, a workflow dashboard was designed and implemented to provide an integral view of radiology processes. The workflow-based Radiology Information System was deployed in the radiology department of Zhejiang Chinese Medicine Hospital in China. The results showed that it could be adjusted flexibly in response to the needs of changing processes, and that it enhanced process management in the department. It can also provide a more workflow-aware integration method, compared with other methods such as IHE-based ones. The workflow-based approach is a new method of developing a radiology information system with more flexibility, more process management functionality, and more workflow-aware integration. The work of this paper is an initial endeavor toward introducing workflow management technology in healthcare.
NASA Astrophysics Data System (ADS)
Jeyakumar, Lordwin; Zhao, Yaqian
2014-05-01
Increased awareness of the impacts of diffuse pollution and their intensification has pushed forward the need for the development of low-cost wastewater treatment techniques. One such effort is the use of novel DASC (Dewatered Alum Sludge Cake) based constructed wetlands (CWs) for removing nutrients, organics, trace elements and other pollutants from wastewater. Understanding of the processes in CWs requires a numerical model that describes the biochemical transformation and degradation processes in subsurface vertical flow (VF) CWs. Therefore, this research focuses on the development of a process-based model for phosphorus (P) and nitrogen (N) removal to achieve stable performance by using DASC as a substrate in a CW treatment system. An object-oriented modelling tool known as "STELLA", which works on the principle of system dynamics, is used for the development of the P and N model. The core objective of the modelling work is oriented towards understanding the processes in DASC-based CWs and optimizing design criteria. A P and N dynamic model was developed for DASC-based CWs. The P model developed exclusively for the DASC-based CW was able to simulate the effluent P concentration leaving the system satisfactorily. Moreover, the developed P dynamic model identified the major P pathways as adsorption (72%), followed by plant uptake (20%) and microbial uptake (7%), in a single-stage laboratory-scale DASC-based CW. Similarly, a P dynamic simulation model was developed to simulate the four-stage laboratory-scale DASC-based CWs. Simulated and observed values of P removal were found to be in good agreement. The fate of P in all four stages clearly shows that adsorption played a pivotal role in each stage of the system due to the use of DASC as a substrate. P adsorption by the wetland substrate/DASC represents 59-75% of total P reduction. Plant uptake and microbial uptake play a lesser role in P removal (as compared to adsorption). With regard to N, a DASC-based CW dynamic model was developed and run for 18 months, from Feb 2009 to May 2010. The results reveal that the simulated effluent DN, NH4-N, NO3-N and TN agreed considerably well with the observed results. The TN removal was found to be 52% in the DASC-based CW. Interestingly, nitrification (NIT) is the main removal agent (65.60%), followed by adsorption (11.90%), ammonification (AMM, 8.90%), plant uptake of NH4-N (5.90%) and plant uptake of NO3-N (4.40%). Denitrification (DeN) did not result in any significant removal (2.90%) in the DASC-based CW, which may be due to the lack of anaerobic conditions and the absence of carbon sources. The N model also simulated the internal process behaviour of the system, providing a useful tool for gaining insight into the N dynamics of VF CWs. The results obtained for both the N and P models can be used to improve the design of the newly developed DASC-based CWs to increase the efficiency of nutrient removal by CWs.
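As a rough illustration of the kind of stock-and-flow model a system-dynamics tool like STELLA builds, here is a minimal Python sketch of a first-order P balance with three competing removal pathways. The rate constants and inflow are hypothetical placeholders, chosen only so that the simulated pathway shares resemble the reported 72/20/7% split; they are not values from the paper.

```python
import numpy as np

# Hypothetical first-order rate constants (1/day); not from the paper.
K_ADS, K_PLANT, K_MIC = 0.72, 0.20, 0.07   # adsorption, plant, microbial
INFLOW = 5.0                                # P load entering the wetland (mg/day)

def simulate(days: int = 540, dt: float = 0.1) -> dict:
    p_water = 0.0
    removed = {"adsorption": 0.0, "plant": 0.0, "microbial": 0.0}
    for _ in range(int(days / dt)):
        fluxes = {"adsorption": K_ADS * p_water,
                  "plant": K_PLANT * p_water,
                  "microbial": K_MIC * p_water}
        p_water += (INFLOW - sum(fluxes.values())) * dt  # stock update (Euler)
        for k, f in fluxes.items():
            removed[k] += f * dt                         # cumulative removal
    total = sum(removed.values())
    return {k: round(100 * v / total, 1) for k, v in removed.items()}

print(simulate())  # pathway shares (%); rates chosen so adsorption dominates
```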
ERIC Educational Resources Information Center
Zehavi, Nurit; Mann, Giora
2011-01-01
This paper presents the development process of a "praxeology" (theory-of-practice) for supporting the teaching of proofs in a CAS environment. The characteristics of the praxeology were elaborated within the frame of a professional development course for teaching analytic geometry with CAS. The theoretical framework draws on Chevallard's…
ON DEVELOPING TOOLS AND METHODS FOR ENVIRONMENTALLY BENIGN PROCESSES
Two types of tools are generally needed for designing processes and products that are cleaner from environmental impact perspectives. The first kind is called process tools. Process tools are based on information obtained from experimental investigations in chemistry, materials…
Food drying process by power ultrasound.
de la Fuente-Blanco, S; Riera-Franco de Sarabia, E; Acosta-Aparicio, V M; Blanco-Blanco, A; Gallego-Juárez, J A
2006-12-22
Drying processes, which have great significance in the food industry, are frequently based on the use of thermal energy. Nevertheless, such methods may produce structural changes in the products. Consequently, great emphasis is presently given to novel treatments in which quality is preserved. Such is the case of the application of high-power ultrasound, which represents an emergent and promising technology. During the last few years, we have been involved in the development of an ultrasonic dehydration process based on the application of ultrasonic vibration in direct contact with the product. This process has been the object of a detailed laboratory-stage study of the influence of the different parameters involved. This paper deals with the development and testing of a prototype system for the application and evaluation of the process at a pre-industrial stage. The prototype is based on a high-power rectangular plate transducer, working at a frequency of 20 kHz, with a power capacity of about 100 W. In order to study mechanical and thermal effects, the system is provided with a series of sensors which permit monitoring of the process parameters. Specific software has also been developed to facilitate data collection and analysis. The system has been tested with vegetable samples.
ERIC Educational Resources Information Center
de Klerk, Sebastiaan; Veldkamp, Bernard P.; Eggen, Theo J. H. M.
2018-01-01
The development of any assessment should be an iterative and careful process. Ideally, this process is guided by a well-defined framework (see for example Downing in: Downing and Haladyna (eds) "Handbook of test development," Lawrence Erlbaum Associates, Mahwah, 2006; Mislevy et al. in "On the roles of task model variables in…
Development of a Goal Setting Process and Instrumentation for Teachers and Principals.
ERIC Educational Resources Information Center
Minix, Nancy; And Others
A pilot program, the Career Ladder Plan, was developed in Kentucky to evaluate a teacher's performance in terms of professional growth and development and professional leadership/initiative, based on that teacher's performance in a goal setting/goal attainment process. Goals jointly selected by the teacher and his/her principal must contribute to school…
ERIC Educational Resources Information Center
Carden, Fred; Earl, Sarah
2007-01-01
Until the recent introduction of a dynamic interview-based process, the International Development Research Centre (IDRC), a Canadian development research funding agency, faced a challenge: project completion reports (PCRs) were not being completed in a timely and quality manner. This is a common problem many organizations face in completing…
Beyond Staff Development: A Strategic Plan for School/Community Empowerment.
ERIC Educational Resources Information Center
Johnson, Daniel P.
At the beginning of his tenure in 1987, the superintendent of Clear Creek School District (Colorado) found that the district had no written K-12 curriculum, no ongoing process for developing such a curriculum, and no systematic process for staff development. To provide for change based on projected student needs for the 21st century, the…
NASA Astrophysics Data System (ADS)
Kirkire, Milind Shrikant; Rane, Santosh B.; Jadhav, Jagdish Rajaram
2015-12-01
Medical product development (MPD) is highly multidisciplinary in nature, which increases its complexity and the associated risks. Managing the risks during the MPD process is crucial. The objective of this research is to explore risks during MPD in a dental product manufacturing company and propose a model for risk mitigation during the MPD process to minimize failure events. A case study approach is employed. The existing MPD process is mapped onto the five phases of a customized phase gate process. The activities during each phase of development, and the risks associated with each activity, are identified and categorized based on the source of occurrence. The risks are analyzed using traditional Failure Mode and Effects Analysis (FMEA) and fuzzy FMEA. Comparing the results of the two methods shows that the fuzzy approach avoids duplication of risk priority numbers (RPNs) and is better at converting expert cognition into information from which risk factor values can be obtained. Critical, moderate, low-level and negligible risks are identified based on criticality, and risk treatments and a mitigation model are proposed. During the initial phases of MPD the risks are less severe, but as the process progresses the severity of risks increases. The MPD process should be critically designed and simulated to minimize the number of risk events and their severity. To successfully develop products/devices within manufacturing companies, process risk management is essential. A systematic approach to managing risks during the MPD process will lead to the development of medical products with the expected quality and reliability. This is the first research of its kind focused on MPD process risks and their management. The methodology adopted in this paper will help developers, managers and researchers gain a competitive edge over other companies by managing risks during the development process.
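The RPN duplication issue mentioned above is easy to demonstrate. Below is a small Python sketch: the classical RPN, a count of how many RPN values are shared by multiple (S, O, D) triples, and one illustrative weighted aggregation of the kind fuzzy approaches enable. This is not the paper's actual fuzzy FMEA, which relies on membership functions and expert rules.

```python
from itertools import product

def rpn(severity: int, occurrence: int, detection: int) -> int:
    # Classical FMEA risk priority number on 1-10 scales.
    return severity * occurrence * detection

# Duplication problem: many distinct (S, O, D) triples map to the same RPN,
# so ranking by RPN alone cannot distinguish between them.
by_rpn = {}
for s, o, d in product(range(1, 11), repeat=3):
    by_rpn.setdefault(rpn(s, o, d), []).append((s, o, d))
shared = sum(1 for triples in by_rpn.values() if len(triples) > 1)
print(f"{shared} of {len(by_rpn)} RPN values are produced by multiple triples")

# Illustrative weighted aggregation (placeholder weights): emphasizing
# severity breaks ties that the plain product cannot.
def weighted_rpn(s: int, o: int, d: int, ws=0.5, wo=0.3, wd=0.2) -> float:
    return (s ** ws) * (o ** wo) * (d ** wd)
```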
NASA Astrophysics Data System (ADS)
Dudziak, T.; Olbrycht, A.; Polkowska, A.; Boron, L.; Skierski, P.; Wypych, A.; Ambroziak, A.; Krezel, A.
2018-03-01
Due to the worldwide shortage of natural resources, there is a need to develop innovative technologies to save natural resources and secure Critical Raw Materials (CRM). At the same time, these new technologies should advance materials engineering in order to develop better materials for extreme conditions. One way to develop new materials is to use post-processing chips of austenitic steels (i.e. 304L stainless steel: 18/10 Cr/Ni) and other materials such as Ni-based alloys with high Cr content. In this work, the results of a preliminary study on High Velocity Oxy Fuel (HVOF) coatings developed from 304L stainless steel chips and Haynes® 282® Ni-based alloy are shown. The study covers development of the powder for HVOF technology. The produced coatings were exposed at high temperature, at 500 and 700 °C for 100 and 300 hours respectively, to assess corrosion behaviour.
Koivunen, Marita; Välimäki, Maritta; Jakobsson, Tiina; Pitkänen, Anneli
2008-01-01
This article describes the systematic process in which an evidence-based approach was used to develop a curriculum designed to support the computer and Internet skills of nurses in psychiatric hospitals in Finland. The pressure on health care organizations to have skilled and motivated nurses who use modern information and communication technology has increased due to rapid technology development at the international and national levels. However, the development of such computer education curricula has less frequently been based on evidence-based knowledge. First, we identified psychiatric nurses' learning experiences and barriers to computer use by examining written essays. Second, nurses' computer skills were surveyed. Last, evidence from the literature was scrutinized to find effective methods that can be used to teach and learn computer use in health care. This information was integrated and used in the development of an education curriculum designed to support nurses' computer and Internet skills.
ERIC Educational Resources Information Center
Santagata, Rossella; Bray, Wendy
2016-01-01
This study examined processes at the core of teacher professional development (PD) experiences that might positively impact teacher learning and more specifically teacher change. Four processes were considered in the context of a PD program focused on student mathematical errors: analysis of students' mathematical misconceptions as a lever for…
Testing Processability Theory in L2 Spanish: Can Readiness or Markedness Predict Development?
ERIC Educational Resources Information Center
Bonilla, Carrie L.
2012-01-01
The goal of this dissertation is to test the five stages of Processability Theory (PT) for second language (L2) learners of Spanish and investigate how instruction can facilitate the development through the stages. PT details five fixed stages in the acquisition of L2 morphosyntax based on principles of speech processing (Levelt, 1989) and modeled…
The Importance of Process-Oriented Accessibility Guidelines for Web Developers.
Steen-Hansen, Linn; Fagernes, Siri
2016-01-01
Current accessibility research shows that in web development, the process itself may lead to inaccessible web sites and applications. Common practices typically do not allow sufficient testing. The focus is mainly on complying with minimum standards, and on treating accessibility compliance as a sort of bug-fixing process, missing the user perspective. In addition, there is an alarming lack of knowledge of and experience with accessibility issues. It has also been argued that bringing accessibility into the development process at all stages is the only way to achieve the highest possible level of accessibility. The work presented in this paper is based on a previous project focusing on guidelines for developing accessible rich Internet applications. The guidelines were classified as either process-oriented or technology-oriented. In this paper, we examine the process-oriented guidelines and give a practical perspective on how these guidelines will make the development process more accessibility-friendly.
The DACUM Job Analysis Process.
ERIC Educational Resources Information Center
Dofasco, Inc., Hamilton (Ontario).
This document explains the DACUM (Developing A Curriculum) process for analyzing task-based jobs to: identify where standard operating procedures are required; identify duplicated low value added tasks; develop performance standards; create job descriptions; and identify the elements that must be included in job-specific training programs. The…
Emulsion based cast booster - a priming system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gupta, R.N.; Mishra, A.K.
2005-07-01
This paper explores the potential of emulsion-based cast boosters to be used as primers to initiate bulk-delivered emulsion explosives used in mines. A comparative study has been made between a conventional cast booster and an emulsion-based cast booster in terms of the initiation process developed and their capability to develop and maintain a stable detonation process in the column explosives. The study was conducted using a continuous velocity of detonation (VOD) measuring instrument. During this study three blasts were monitored. In each blast two holes were selected for study, the first hole being initiated with the conventional cast booster and the other with the emulsion-based cast booster. The findings of the study indicate that the emulsion-based cast booster is capable of efficient priming of bulk-delivered column explosive with a stable detonation process in the column. Further, the booster had advantages over the conventional PETN/TNT-based cast booster.
Low cost MATLAB-based pulse oximeter for deployment in research and development applications.
Shokouhian, M; Morling, R C S; Kale, I
2013-01-01
Problems such as motion artifacts and the effects of ambient light have forced developers to design different signal processing techniques and algorithms to increase the reliability and accuracy of the conventional pulse oximeter device. To evaluate the robustness of these techniques, they are applied either to recorded data or are implemented on chip to be applied to real-time data. Recorded data is the most common evaluation method; however, it is not as reliable as real-time measurement. On the other hand, hardware implementation can be both expensive and time consuming. This paper presents a low-cost MATLAB-based pulse oximeter that can be used for rapid evaluation of newly developed signal processing techniques and algorithms. Flexibility to apply different signal processing techniques and provision of both processed and unprocessed data, along with low implementation cost, are the important features of this design, which make it ideal for research and development purposes, as well as commercial, hospital and healthcare applications.
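For context, the core computation any pulse oximeter must perform, MATLAB-based or otherwise, is the "ratio of ratios" between the red and infrared photoplethysmograms. A minimal Python sketch follows; the linear calibration constants (110, 25) are the common textbook approximation, not values from this paper, and real probes use an empirically fitted curve.

```python
import numpy as np

def perfusion_index(ppg: np.ndarray) -> float:
    # AC/DC ratio: DC is the mean signal level, AC the pulsatile
    # peak-to-peak swing over one or more cardiac cycles.
    return (ppg.max() - ppg.min()) / ppg.mean()

def spo2_estimate(red: np.ndarray, ir: np.ndarray) -> float:
    r = perfusion_index(red) / perfusion_index(ir)  # "ratio of ratios"
    # Textbook linear calibration (placeholder, not this paper's model).
    return 110.0 - 25.0 * r
```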
Durvasula, Raghu; Kelly, Janet; Schleyer, Anneliese; Anawalt, Bradley D; Somani, Shabir; Dellit, Timothy H
2018-04-01
As healthcare costs rise and reimbursements decrease, healthcare organization leadership and clinical providers must collaborate to provide high-value healthcare. Medications are a key driver of the increasing cost of healthcare, largely as a result of the proliferation of expensive specialty drugs, including biologic agents. Such medications contribute significantly to the inpatient diagnosis-related group payment system, often with minimal or unproved benefit over less-expensive therapies. This paper describes a systematic review process to reduce non-evidence-based inpatient use of high-cost medications across a large multihospital academic health system. We created a Pharmacy & Therapeutics subcommittee consisting of clinicians, pharmacists, and an ethics representative. This committee developed a standardized process for timely review (<48 hours) and approval of high-cost medications based on their clinical effectiveness, safety, and appropriateness. The engagement of clinical experts in the development of the consensus-based guidelines for the use of specific medications facilitated the clinicians' acceptance of the review process. Over a 2-year period, a total of 85 patient-specific requests underwent formal review. All reviews were conducted within 48 hours. This review process has reduced the non-evidence-based use of specialty medications and has resulted in pharmacy savings of $491,000 in fiscal year 2016, with almost 80% of the savings occurring in the last 2 quarters as our process has matured. The creation of a collaborative review process to ensure consistent, evidence-based utilization of high-cost medications provides value-based care, while minimizing unnecessary practice variation and reducing the cost of inpatient care.
NASA Technical Reports Server (NTRS)
1983-01-01
The process technology for the manufacture of semiconductor-grade silicon in a large commercial plant by 1986, at a price less than $14 per kilogram of silicon (based on 1975 dollars), is discussed. The engineering design, installation, checkout, and operation of an Experimental Process System Development unit are discussed. Quality control for scaling up the process and an economic analysis of product and production costs are also discussed.
The CompHP core competencies framework for health promotion in Europe.
Barry, Margaret M; Battel-Kirk, Barbara; Dempsey, Colette
2012-12-01
The CompHP Project on Developing Competencies and Professional Standards for Health Promotion in Europe was developed in response to the need for new and changing health promotion competencies to address health challenges. This article presents the process of developing the CompHP Core Competencies Framework for Health Promotion across the European Union Member States and Candidate Countries. A phased, multiple-method approach was employed to facilitate a consensus-building process on the development of the core competencies. Key stakeholders in European health promotion were engaged in a layered consultation process using the Delphi technique, online consultations, workshops, and focus groups. Based on an extensive literature review, a mapping process was used to identify the core domains, which informed the first draft of the Framework. A consultation process involving two rounds of a Delphi survey with national experts in health promotion from 30 countries was carried out. In addition, feedback was received from 25 health promotion leaders who participated in two focus groups at a pan-European level and 116 health promotion practitioners who engaged in four country-specific consultations. A further 54 respondents replied to online consultations, and there were a number of followers on various social media platforms. Based on four rounds of redrafting, the final Framework document was produced, consisting of 11 core domains and 68 core competency statements. The CompHP Core Competencies Framework for Health Promotion provides a resource for workforce development in Europe, by articulating the necessary knowledge, skills, and abilities that are required for effective practice. The core domains are based on the multidisciplinary concepts, theories, and research that make health promotion distinctive. It is the combined application of all the domains, the knowledge base, and the ethical values that constitute the CompHP Core Competencies Framework for Health Promotion.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karel Grohman; Scott Stevenson
Renewable Spirits is developing an innovative pilot-plant bio-refinery to establish the commercial viability of ethanol production utilizing a processing waste from citrus juice production. A novel process based on enzymatic hydrolysis of citrus processing waste and fermentation of the resulting sugars to ethanol by yeasts was successfully developed in collaboration with a CRADA partner, the USDA/ARS Citrus and Subtropical Products Laboratory. The process was also successfully scaled up from laboratory scale to the 10,000 gal fermentor level.
NASA Astrophysics Data System (ADS)
Sennewald, C.; Vorhof, M.; Schegner, P.; Hoffmann, G.; Cherif, C.; Boblenz, J.; Sinapius, M.; Hühne, C.
2018-05-01
Flexible cellular 3D structures with structure-inherent compliance made of fiber-reinforced composites have repeatedly aroused the interest of international research groups. Such structures offer the possibility to meet the increasing demand for flexible and adaptive structures. The aim of this paper is the development of cellular 3D structures based on weaving technology. Considering the desired geometry of the 3D structure, algorithms are developed for forming the geometry from woven fabric sub-areas. Subsequently, these sub-areas are unwound into the weaving plane and appropriate weave patterns are developed. A particular challenge is the realization of compliant mechanisms in the woven fabric. This can be achieved either by combining different materials or, in particular, by implementing large stiffness gradients by varying the woven fabric's thickness, where differences in wall thickness have to be realized with a factor of 1:10. A manufacturing technology based on the weaving process is developed for the realization of the developed cellular 3D structures. To this end, solutions for the processing of hybrid thermoplastic materials (e.g. tapes), for the integration of inlays in the weaving process (thickening of partial areas), and for fabric retraction, as well as for fabric pull-off (a linear pull-off system), are being developed. In this way, woven cellular 3D structures with woven outer layers and woven joint areas (compliance) can be realized in a single process step and are subsequently characterized.
Weight and the Future of Space Flight Hardware Cost Modeling
NASA Technical Reports Server (NTRS)
Prince, Frank A.
2003-01-01
Weight has been used as the primary input variable for cost estimating almost as long as there have been parametric cost models. While there are good reasons for using weight, serious limitations exist. These limitations have been addressed by multi-variable equations and trend analysis in models such as NAFCOM, PRICE, and SEER; however, these models have not been able to address the significant time lags that can occur between the development of similar space flight hardware systems. These time lags make the cost analyst's job difficult because insufficient data exist to perform trend analysis, and the current set of parametric models is not well suited to accommodating process improvements in space flight hardware design, development, build, and test. As a result, people of good faith can have serious disagreements over the cost of new systems. To address these shortcomings, new cost modeling approaches are needed. The most promising approach is process-based (sometimes called activity-based) costing. Developing process-based models will require a detailed understanding of the functions required to produce space flight hardware, combined with innovative approaches to estimating the necessary resources. Particularly challenging will be the lack of data at the process level. One method for developing a model is to combine notional algorithms with a discrete event simulation and model changes to the total cost as perturbations to the program are introduced. Despite these challenges, the potential benefits are such that efforts should be focused on developing process-based cost models.
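A process-based (activity-based) cost roll-up can be sketched in a few lines; the point is that process improvements enter as explicit multipliers on individual activities rather than being hidden inside a weight-based regression. All figures below are hypothetical placeholders, not data from any NASA model.

```python
# Each activity: (labour hours, hourly rate in $, improvement multiplier).
# A multiplier below 1.0 models a process improvement as a perturbation.
activities = {
    "design":      (12000, 150.0, 1.00),
    "development": (30000, 140.0, 0.90),  # e.g. 10% gain from new tooling
    "build":       (45000, 110.0, 0.85),
    "test":        (20000, 130.0, 1.00),
}

def process_based_cost(acts: dict) -> float:
    # Roll up cost activity by activity instead of regressing on weight.
    return sum(hours * rate * mult for hours, rate, mult in acts.values())

print(f"estimated cost: ${process_based_cost(activities):,.0f}")
```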
Cai, Bin; Altman, Michael B; Garcia-Ramirez, Jose; LaBrash, Jason; Goddu, S Murty; Mutic, Sasa; Parikh, Parag J; Olsen, Jeffrey R; Saad, Nael; Zoberi, Jacqueline E
To develop a safe and robust workflow for yttrium-90 (Y-90) radioembolization procedures in a multidisciplinary team environment, a generalized Define-Measure-Analyze-Improve-Control (DMAIC)-based approach to process improvement was applied to a Y-90 radioembolization workflow. In the first DMAIC cycle, events within the Y-90 workflow were defined and analyzed. To improve the workflow, a web-based interactive electronic white board (EWB) system was adopted as the central communication platform and information processing hub. The EWB-based Y-90 workflow then underwent a second DMAIC cycle. Out of 245 treatments, three misses that went undetected until treatment initiation were recorded over a period of 21 months, and root-cause analysis was performed to determine the causes of each incident and opportunities for improvement. The EWB-based Y-90 process was further improved via new rules defining reliable sources of information as inputs into the planning process, as well as new check points to ensure this information was communicated correctly throughout the process flow. After implementation of the revised EWB-based Y-90 workflow, following two DMAIC-like cycles, there were zero misses out of 153 patient treatments in 1 year. The DMAIC-based approach adopted here allowed the iterative development of a robust workflow to achieve an adaptable, event-minimizing planning process despite a complex setting that requires the participation of multiple teams for Y-90 microspheres therapy. Implementation of such a workflow using the EWB or a similar platform with a DMAIC-based process improvement approach could be expanded to other treatment procedures, especially those requiring multidisciplinary management. Copyright © 2016 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
Pricing of NASA Space Shuttle transportation system cargo
NASA Technical Reports Server (NTRS)
Hale, C. W.
1979-01-01
A two-part pricing policy is investigated as the most feasible method of pricing the transportation services to be provided by NASA's SSTS. Engineering cost estimates and a deterministic operating cost model generate a data base and develop a procedure for pricing the services of the SSTS. It is expected that the SSTS will have a monopoly on space material processing in areas of crystal growth, glass processing, metallurgical space applications, and biomedical processes using electrophoresis which will require efficient pricing. Pricing problems, the SSTS operating costs based on orbit elevation, number of launch sites, and number of flights, capital costs of the SSTS, research and development costs, allocation of joint transportation costs of the SSTS to a particular space processing activity, and rates for the SSTS are discussed. It is concluded that joint costs for commercial cargoes carried in the SSTS can be most usefully handled by making cost allocations based on proportionate capacity utilization.
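A two-part tariff with joint costs allocated by proportionate capacity utilization, as described above, reduces to a very small calculation. The sketch below uses entirely hypothetical figures; the access fee, flight cost, and payload capacity are placeholders, not values from the study.

```python
ACCESS_FEE = 2.0e6     # fixed per-customer fee (hypothetical, $)
FLIGHT_COST = 120.0e6  # joint cost of one flight (hypothetical, $)

def cargo_price(mass_kg: float, capacity_kg: float = 29500.0) -> float:
    # Two-part tariff: fixed access fee plus a share of the joint flight
    # cost proportional to the capacity the payload consumes.
    return ACCESS_FEE + FLIGHT_COST * (mass_kg / capacity_kg)

print(f"${cargo_price(5000):,.0f} for a 5,000 kg payload")
```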
TRWG developmental pathway for biospecimen-based assessment modalities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Translational Research Working Group; Srivastava, Sudhir; Gray, Joe W.
The Translational Research Working Group (TRWG) was created as a national initiative to evaluate the current status of NCI's investment in translational research and envision its future. The TRWG conceptualized translational research as a set of six developmental processes or pathways focused on various clinical goals. One of those pathways describes the development of biospecimen-based assays that utilize biomarkers for the detection, diagnosis, prognosis, and assessment of response to cancer treatment. The biospecimen-based assessment modality (BM) pathway was conceived not as a comprehensive description of the corresponding real-world processes, but rather as a tool designed to facilitate movement of a candidate assay through the translational process to the point where it can be handed off for definitive clinical testing. This paper introduces the pathway in the context of prior work and discusses key challenges associated with the biomarker development process in light of the pathway.
A Generic Approach for Pen-Based User Interface Development
NASA Astrophysics Data System (ADS)
Macé, Sébastien; Anquetil, Éric
Pen-based interaction is an intuitive way to produce hand-drawn structured documents, but few applications take advantage of it. Indeed, the interpretation of the user's hand-drawn strokes in the context of a document is a complex problem. In this paper, we propose a new generic approach to developing such systems based on three independent components. The first is a set of graphical and editing functions adapted to pen interaction. The second is a rule-based formalism that models structured document composition and the corresponding interpretation process. The last is a hand-drawn stroke analyzer that is able to interpret strokes progressively, directly while the user is drawing. We highlight in particular the human-computer interaction induced by this progressive interpretation process. Thanks to this generic approach, three pen-based system prototypes have already been developed: for musical score editing, for graph editing, and for UML class diagram editing.
Oh, Pok-Ja; Kim, Il-Ok; Shin, Sung-Rae; Jung, Hoe-Kyung
2004-10-01
This study developed Web-based multimedia content for a Physical Examination and Health Assessment course. The multimedia content was developed based on Jung's teaching and learning structure plan model, using the following five processes: 1) Analysis Stage, 2) Planning Stage, 3) Storyboard Framing and Production Stage, 4) Program Operation Stage, and 5) Final Evaluation Stage. The web-based multimedia content consisted of an intro movie, a main page and sub pages. On the main page, there were six menu bars consisting of Announcement center, Information of professors, Lecture guide, Cyber lecture, Q&A, and Data centers, and a site map which introduced 15 weeks of lectures. In the operation of the web-based multimedia content, HTML, JavaScript, Flash, and multimedia technology (audio and video) were utilized, and the content consisted of text content, interactive content, animation, and audio & video. Consultation with experts in content, computer engineering, and educational technology was utilized in the development process. Web-based multimedia content is expected to offer individualized and tailored learning opportunities to maximize and facilitate the effectiveness of the teaching and learning process. Therefore, multimedia content should be utilized concurrently with lectures in Physical Examination and Health Assessment classes as a vital teaching aid to make up for the weaknesses of the face-to-face teaching-learning method.
Processing methods for photoacoustic Doppler flowmetry with a clinical ultrasound scanner
NASA Astrophysics Data System (ADS)
Bücking, Thore M.; van den Berg, Pim J.; Balabani, Stavroula; Steenbergen, Wiendelt; Beard, Paul C.; Brunker, Joanna
2018-02-01
Photoacoustic flowmetry (PAF) based on time-domain cross-correlation of photoacoustic signals is a promising technique for deep-tissue measurement of blood flow velocity. Signal processing has previously been developed for single-element transducers. Here, the processing methods for acoustic-resolution PAF using a clinical ultrasound transducer array are developed and validated using a 64-element transducer array with a -6 dB detection band of 11 to 17 MHz. Measurements were performed on a flow phantom consisting of a tube (580 μm inner diameter) perfused with human blood flowing at physiological speeds ranging from 3 to 25 mm/s. The processing pipeline comprised image reconstruction, filtering, displacement detection, and masking. High-pass filtering and background subtraction were found to be key preprocessing steps enabling accurate flow velocity estimates, which were calculated using a cross-correlation based method. In addition, the regions of interest in the calculated velocity maps were defined using a masking approach based on the amplitude of the cross-correlation functions. These developments enabled blood flow measurements using a transducer array, bringing PAF one step closer to clinical applicability.
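A minimal Python sketch of the displacement-detection step described above (high-pass filtering, background subtraction, then cross-correlation between successive A-lines) is given below. The filter order, cutoff frequency, and the assumption of 1-D A-line inputs are illustrative choices, not the paper's exact parameters.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_velocity(line_a: np.ndarray, line_b: np.ndarray,
                      dt: float, dz: float, fs: float) -> float:
    """Axial velocity from two successive A-lines sampled at fs (Hz);
    dz is the axial sample spacing (m), dt the inter-frame interval (s)."""
    # High-pass filter to suppress slowly varying clutter (1 MHz cutoff
    # is an illustrative value for a 11-17 MHz detection band).
    b, a = butter(4, 1e6, btype="highpass", fs=fs)
    fa, fb = filtfilt(b, a, line_a), filtfilt(b, a, line_b)
    # Crude background subtraction: remove the static mean component.
    fa, fb = fa - fa.mean(), fb - fb.mean()
    # Lag of the cross-correlation peak gives the displacement in samples.
    xc = np.correlate(fb, fa, mode="full")
    lag = int(np.argmax(xc)) - (len(fa) - 1)
    return lag * dz / dt
```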
NASA Astrophysics Data System (ADS)
Nerita, S.; Maizeli, A.; Afza, A.
2017-09-01
The Process Evaluation and Learning Outcomes course in Biology covers the evaluation process in learning and the application of designed learning outcomes. Problems observed in this course were that students found the subject difficult to understand and that no learning resources were available to guide them and support independent study. It is therefore necessary to develop a learning resource that prompts students to think actively and make decisions under the lecturer's guidance. The purpose of this study is to produce a handout based on the guided discovery method that matches the needs of students. The research was conducted using the 4-D model and was limited to the define phase, namely analysis of student requirements. Data were obtained from a questionnaire and analyzed descriptively. The results showed that the average student requirement was 91.43%. It can be concluded that students need a handout based on the guided discovery method in the learning process.
A Scenario-Based Process for Requirements Development: Application to Mission Operations Systems
NASA Technical Reports Server (NTRS)
Bindschadler, Duane L.; Boyles, Carole A.
2008-01-01
The notion of using operational scenarios as part of requirements development during mission formulation (Phases A & B) is widely accepted as good system engineering practice. In the context of developing a Mission Operations System (MOS), there are numerous practical challenges to translating that notion into the cost-effective development of a useful set of requirements. These challenges can include such issues as a lack of Project-level focus on operations issues, insufficient or improper flowdown of requirements, flowdown of immature or poor-quality requirements from Project level, and MOS resource constraints (personnel expertise and/or dollars). System engineering theory must be translated into a practice that provides enough structure and standards to serve as guidance, but that retains sufficient flexibility to be tailored to the needs and constraints of a particular MOS or Project. We describe a detailed, scenario-based process for requirements development. Identifying a set of attributes for high quality requirements, we show how the portions of the process address many of those attributes. We also find that the basic process steps are robust, and can be effective even in challenging Project environments.
Amelioration and Conversion of Excessive Se to New Resources from a Plant-based System
USDA-ARS?s Scientific Manuscript database
The green technology of phytoremediation has been developed for the management of metal(loid)-contaminated soils and waters via the processes of phytoextraction, phytovolatilization, and phytostabilization. Based upon these processes a plant management remediation strategy for selenium (Se) has be...
Current biotechnological developments in Belgium.
Masschelein, C A; Callegari, J P; Laurent, M; Simon, J P; Taeymans, D
1989-01-01
In recent years, actions have been undertaken by the Belgian government to promote process innovation and technical diversification. Research programs are initiated and coordinated by the study committee for biotechnology set up within the Institute for Scientific Research in Industry and Agriculture (IRSIA). As a result of this action, the main areas where biotechnological processes are developed or commercially exploited include plant genetics, protein engineering, hybridoma technology, biopesticides, production of vaccines and drugs by genetic engineering, monoclonal detection of human and animal diseases, process reactors for aerobic and anaerobic wastewater treatment, and genetic modification of yeast and bacteria as a base for biomass and energy. Development research also includes new fermentation technologies, principally based on immobilization of microorganisms, reactor design, and optimization of unit operations involved in downstream processing. Food, pharmaceutical, and chemical industries are involved in genetic engineering and biotechnology, and each of these sectors is overviewed in this paper.
Santos, João Rodrigo; Viegas, Olga; Páscoa, Ricardo N M J; Ferreira, Isabel M P L V O; Rangel, António O S S; Lopes, João Almeida
2016-10-01
In this work, a real-time and in-situ analytical tool based on near-infrared spectroscopy is proposed to predict two of the most relevant coffee parameters during the roasting process: sucrose and colour. The methodology was developed taking into consideration different coffee varieties (Arabica and Robusta), coffee origins (Brazil, East Timor, India and Uganda) and roasting procedures (slow and fast). All near-infrared spectroscopy-based calibrations were developed resorting to partial least squares regression. The results proved the suitability of this methodology, as demonstrated by range-error ratios and coefficients of determination higher than 10 and 0.85, respectively, for all modelled parameters. The relationship between sucrose and colour development during the roasting process is further discussed, in light of designing in real time coffee products with similar visual appearance and distinct organoleptic profiles. Copyright © 2016 Elsevier Ltd. All rights reserved.
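The calibration workflow described (partial least squares regression judged by the coefficient of determination and the range-error ratio) can be sketched with scikit-learn as follows. The spectra here are synthetic stand-ins; real NIR data and preprocessing would replace them.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error

# Synthetic stand-ins: rows are spectra (absorbance at 200 wavelengths),
# y is the reference sucrose value measured for each roasted sample.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))
y = 0.5 * X[:, 10] + 0.3 * X[:, 50] + rng.normal(scale=0.05, size=60)

pls = PLSRegression(n_components=5).fit(X[:40], y[:40])
y_hat = pls.predict(X[40:]).ravel()

rmsep = np.sqrt(mean_squared_error(y[40:], y_hat))
rer = (y[40:].max() - y[40:].min()) / rmsep  # paper's criterion: RER > 10
r2 = np.corrcoef(y[40:], y_hat)[0, 1] ** 2   # and R^2 > 0.85
print(f"RMSEP={rmsep:.3f}  RER={rer:.1f}  R2={r2:.2f}")
```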
Mahnke, Andrea N; Plasek, Joseph M; Hoffman, David G; Partridge, Nathan S; Foth, Wendy S; Waudby, Carol J; Rasmussen, Luke V; McManus, Valerie D; McCarty, Catherine A
2014-01-01
Many informed consent studies demonstrate that research subjects poorly retain and understand information in written consent documents. Previous research in multimedia consent is mixed in terms of success for improving participants’ understanding, satisfaction, and retention. This failure may be due to a lack of a community-centered design approach to building the interventions. The goal of this study was to gather information from the community to determine the best way to undertake the consent process. Community perceptions regarding different computer-based consenting approaches were evaluated, and a computer-based consent was developed and tested. A second goal was to evaluate whether participants make truly informed decisions to participate in research. Simulations of an informed consent process were videotaped to document the process. Focus groups were conducted to determine community attitudes towards a computer-based informed consent process. Hybrid focus groups were conducted to determine the most acceptable hardware device. Usability testing was conducted on a computer-based consent prototype using a touch-screen kiosk. Based on feedback, a computer-based consent was developed. Representative study participants were able to easily complete the consent, and all were able to correctly answer the comprehension check questions. Community involvement in developing a computer-based consent proved valuable for a population-based genetic study. These findings may translate to other types of informed consents, such as genetic clinical trials consents. A computer-based consent may serve to better communicate consistent, clear, accurate, and complete information regarding the risks and benefits of study participation. Additional analysis is necessary to measure the level of comprehension of the check-question answers by larger numbers of participants. The next step will involve contacting participants to measure whether understanding of what they consented to is retained over time. PMID:24273095
Nano-Aramid Fiber Reinforced Polyurethane Foam
NASA Technical Reports Server (NTRS)
Semmes, Edmund B.; Frances, Arnold
2008-01-01
Closed-cell polyurethane and, particularly, polyisocyanurate foams are a large family of flexible and rigid products that result from a reactive two-part process wherein a urethane-based polyol is combined with a foaming or "blowing" agent to create a cellular solid at room temperature. The ratio of reactive components, the constituency of the base materials, temperature, humidity, molding, pouring, spraying, and many other processing techniques vary greatly. However, there is no known process for incorporating reinforcing fibers small enough to be integrally dispersed within the cell walls, which would result in superior final products. The key differentiating aspect from the current state of the art resides in the many processing technologies to be fully developed from the novel concept of milled nano-pulp aramid fibers and their enabling entanglement capability, fully enclosed within the cell walls of these closed-cell urethane foams. The authors present the results of research and development on reinforced foam processing, equipment development, and strength characteristics, and the evolution of its many applications.
Demand driven salt clean-up in a molten salt fast reactor - Defining a priority list.
Merk, B; Litskevich, D; Gregg, R; Mount, A R
2018-01-01
The PUREX technology, based on aqueous processes, is currently the leading reprocessing technology in nuclear energy systems. It seems to be the most developed and established process for light water reactor fuel and the use of solid fuel. However, demand driven development of the nuclear system opens the way to liquid fuelled reactors, and disruptive technology development through the application of an integrated fuel cycle with a direct link to reactor operation. The possibilities of this new concept for innovative reprocessing technology development are analysed, the boundary conditions are discussed, and the economic as well as the neutron-physical optimization parameters of the process are elucidated. Reactor physical knowledge of the influence of different elements on the neutron economy of the reactor is required. Using an innovative study approach, an element priority list for the salt clean-up is developed, which indicates that separation of Neodymium and Caesium is desirable, as they contribute almost 50% to the loss of criticality. Additionally separating Zirconium and Samarium from the fuel salt would remove nearly 80% of the loss of criticality due to fission products. The theoretical study is followed by a qualitative discussion of the different, demand driven optimization strategies which could satisfy the conflicting interests of sustainable reactor operation, efficient chemical processing for the salt clean-up, and the related economic as well as chemical engineering consequences. A new, innovative approach of balancing the throughput through salt processing based on a low number of separation process steps is developed. Next steps for the development of an economically viable salt clean-up process are identified.
The developmental processes for NANDA International Nursing Diagnoses.
Scroggins, Leann M
2008-01-01
This study aims to provide a step-by-step procedural guideline for the development of a nursing diagnosis that meets the necessary criteria for inclusion in the NANDA International and NNN classification systems. The guideline is based on the processes developed by the Diagnosis Development Committee of NANDA International and includes the necessary processes for development of Actual, Wellness, Health Promotion, and Risk nursing diagnoses. Definitions of Actual, Wellness, Health Promotion, and Risk nursing diagnoses along with inclusion criteria and taxonomy rules have been incorporated into the guideline to streamline the development and review processes for submitted diagnoses. A step-by-step procedural guideline will assist the submitter to move efficiently and effectively through the submission process, resulting in increased submissions and enhancement of the NANDA International and NNN classification systems.
Registration of surface structures using airborne focused ultrasound.
Sundström, N; Börjesson, P O; Holmer, N G; Olsson, L; Persson, H W
1991-01-01
A low-cost measuring system, based on a personal computer combined with standard equipment for complex measurements and signal processing, has been assembled. Such a system increases the possibilities for small hospitals and clinics to finance advanced measuring equipment. A description of equipment developed for airborne ultrasound together with a personal computer-based system for fast data acquisition and processing is given. Two air-adapted ultrasound transducers with high lateral resolution have been developed. Furthermore, a few results for fast and accurate estimation of signal arrival time are presented. The theoretical estimation models developed are applied to skin surface profile registrations.
ERIC Educational Resources Information Center
Vanfretti, Luigi; Farrokhabadi, Mostafa
2015-01-01
This article presents the implementation of the constructive alignment theory (CAT) in a power system analysis course through a consensus-based course design process. The consensus-based design process involves both the instructor and graduate-level students and it aims to develop the CAT framework in a holistic manner with the goal of including…
When is good, good enough? Methodological pragmatism for sustainable guideline development.
Browman, George P; Somerfield, Mark R; Lyman, Gary H; Brouwers, Melissa C
2015-03-06
Continuous escalation in methodological and procedural rigor for evidence-based processes in guideline development is associated with increasing costs and production delays that threaten sustainability. While health research methodologists are appropriately responsible for promoting increasing rigor in guideline development, guideline sponsors are responsible for funding such processes. This paper acknowledges that other stakeholders in addition to methodologists should be more involved in negotiating trade-offs between methodological procedures and efficiency in guideline production to produce guidelines that are 'good enough' to be trustworthy and affordable under specific circumstances. The argument for reasonable methodological compromise to meet practical circumstances is consistent with current implicit methodological practice. This paper proposes a conceptual tool as a framework to be used by different stakeholders in negotiating, and explicitly reporting, reasonable compromises for trustworthy as well as cost-worthy guidelines. The framework helps fill a transparency gap in how methodological choices in guideline development are made. The principle, 'when good is good enough' can serve as a basis for this approach. The conceptual tool 'Efficiency-Validity Methodological Continuum' acknowledges trade-offs between validity and efficiency in evidence-based guideline development and allows for negotiation, guided by methodologists, of reasonable methodological compromises among stakeholders. Collaboration among guideline stakeholders in the development process is necessary if evidence-based guideline development is to be sustainable.
Process-Based Development of Competence Models to Computer Science Education
ERIC Educational Resources Information Center
Zendler, Andreas; Seitz, Cornelia; Klaudt, Dieter
2016-01-01
A process model ("cpm.4.CSE") is introduced that allows the development of competence models in computer science education related to curricular requirements. It includes eight subprocesses: (a) determine competence concept, (b) determine competence areas, (c) identify computer science concepts, (d) assign competence dimensions to…
Eliciting User Requirements Using Appreciative Inquiry
ERIC Educational Resources Information Center
Gonzales, Carol Kernitzki
2010-01-01
Many software development projects fail because they do not meet the needs of users, run over budget, and are abandoned. To address this problem, the user requirements elicitation process was modified based on principles of Appreciative Inquiry. Appreciative Inquiry, commonly used in organizational development, aims to build organizations, processes,…
Assessing Problem Solving Competence through Inquiry-Based Teaching in School Science Education
ERIC Educational Resources Information Center
Zervas, Panagiotis; Sotiriou, Sofoklis; Tiemann, Rüdiger; Sampson, Demetrios G.
2015-01-01
Nowadays, there is a consensus that inquiry-based learning contributes to developing students' scientific literacy in schools. Inquiry-based teaching strategies are promoted for the development (among others) of the cognitive processes that cultivate problem solving (PS) competence. The build up of PS competence is a central objective for most…
Towards a Trans-Disciplinary Methodology for a Game-Based Intervention Development Process
ERIC Educational Resources Information Center
Arnab, Sylvester; Clarke, Samantha
2017-01-01
The application of game-based learning adds play into educational and instructional contexts. Even though there is a lack of standard methodologies or formulaic frameworks to better inform game-based intervention development, there exist scientific and empirical studies that can serve as benchmarks for establishing scientific validity in terms of…
ERIC Educational Resources Information Center
Desyatov, Tymofiy
2015-01-01
The article analyzes the development of competency-based professional training standards and their implementation into the educational process in foreign countries. It determines that the main idea of the competency-based approach is competency-and-active learning, which aims at the complex acquirement of diverse skills and ways of practical activity via…
ERIC Educational Resources Information Center
Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse
2015-01-01
The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…
ERIC Educational Resources Information Center
Barbera, Elena; Garcia, Iolanda; Fuertes-Alpiste, Marc
2017-01-01
This paper presents a case study of the co-design process for an online course on Sustainable Development (Degree in Tourism) involving the teacher, two students, and the project researchers. The co-design process was founded on an inquiry-based and technology-enhanced model that takes shape in a set of design principles. The research had two main…
ERIC Educational Resources Information Center
Ellett, Chad D.; Demir, Kadir; Monsaas, Judith
2015-01-01
The purpose of this study was to examine change processes, self-efficacy beliefs, and department culture and the roles these elements play in faculty engagement in working in K-12 schools. The development of three new web-based measures of faculty perceptions of change processes, self-efficacy beliefs, and department culture are described. The…
USDA-ARS?s Scientific Manuscript database
The U.S. food and non-food industries would benefit from the development of a domestically produced crude, semi-pure and pure bio-based fiber gum from corn bran and oat hulls processing waste streams. When corn bran and oat hulls are processed to produce a commercial cellulose enriched fiber gel, th...
Rafiq, Qasim A; Hanga, Mariana P; Heathman, Thomas R J; Coopman, Karen; Nienow, Alvin W; Williams, David J; Hewitt, Christopher J
2017-10-01
Microbioreactors play a critical role in process development as they reduce reagent requirements and can facilitate high-throughput screening of process parameters and culture conditions. Here, we have demonstrated and explained in detail, for the first time, the amenability of the automated ambr15 cell culture microbioreactor system for the development of scalable adherent human mesenchymal multipotent stromal/stem cell (hMSC) microcarrier culture processes. This was achieved by first improving suspension and mixing of the microcarriers and then improving cell attachment thereby reducing the initial growth lag phase. The latter was achieved by using only 50% of the final working volume of medium for the first 24 h and using an intermittent agitation strategy. These changes resulted in >150% increase in viable cell density after 24 h compared to the original process (no agitation for 24 h and 100% working volume). Using the same methodology as in the ambr15, similar improvements were obtained with larger scale spinner flask studies. Finally, this improved bioprocess methodology based on a serum-based medium was applied to a serum-free process in the ambr15, resulting in >250% increase in yield compared to the serum-based process. At both scales, the agitation used during culture was the minimum required for microcarrier suspension, NJS. The use of the ambr15, with its improved control compared to the spinner flask, reduced the coefficient of variation on viable cell density in the serum-containing medium from 7.65% to 4.08%, and the switch to serum-free further reduced these to 1.06-0.54%, respectively. The combination of both serum-free and automated processing improved the reproducibility more than 10-fold compared to the serum-based, manual spinner flask process. The findings of this study demonstrate that the ambr15 microbioreactor is an effective tool for bioprocess development of hMSC microcarrier cultures and that a combination of serum-free medium, control, and automation improves both process yield and consistency. Biotechnol. Bioeng. 2017;114: 2253-2266. © 2017 Wiley Periodicals, Inc.
Hanga, Mariana P.; Heathman, Thomas R. J.; Coopman, Karen; Nienow, Alvin W.; Williams, David J.; Hewitt, Christopher J.
2017-01-01
ABSTRACT Microbioreactors play a critical role in process development as they reduce reagent requirements and can facilitate high‐throughput screening of process parameters and culture conditions. Here, we have demonstrated and explained in detail, for the first time, the amenability of the automated ambr15 cell culture microbioreactor system for the development of scalable adherent human mesenchymal multipotent stromal/stem cell (hMSC) microcarrier culture processes. This was achieved by first improving suspension and mixing of the microcarriers and then improving cell attachment thereby reducing the initial growth lag phase. The latter was achieved by using only 50% of the final working volume of medium for the first 24 h and using an intermittent agitation strategy. These changes resulted in >150% increase in viable cell density after 24 h compared to the original process (no agitation for 24 h and 100% working volume). Using the same methodology as in the ambr15, similar improvements were obtained with larger scale spinner flask studies. Finally, this improved bioprocess methodology based on a serum‐based medium was applied to a serum‐free process in the ambr15, resulting in >250% increase in yield compared to the serum‐based process. At both scales, the agitation used during culture was the minimum required for microcarrier suspension, NJS. The use of the ambr15, with its improved control compared to the spinner flask, reduced the coefficient of variation on viable cell density in the serum containing medium from 7.65% to 4.08%, and the switch to serum free further reduced these to 1.06–0.54%, respectively. The combination of both serum‐free and automated processing improved the reproducibility more than 10‐fold compared to the serum‐based, manual spinner flask process. The findings of this study demonstrate that the ambr15 microbioreactor is an effective tool for bioprocess development of hMSC microcarrier cultures and that a combination of serum‐free medium, control, and automation improves both process yield and consistency. Biotechnol. Bioeng. 2017;114: 2253–2266. © 2017 The Authors. Biotechnology and Bioengineering Published by Wiley Periodicals, Inc. PMID:28627713
Conceptualising the effectiveness of impact assessment processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chanchitpricha, Chaunjit; Bond, Alan
2013-11-15
This paper aims at conceptualising the effectiveness of impact assessment processes through the development of a literature-based framework of criteria to measure impact assessment effectiveness. Four categories of effectiveness were established: procedural, substantive, transactive and normative, each containing a number of criteria; no studies have previously brought together all four of these categories into such a comprehensive, criteria-based framework and undertaken systematic evaluation of practice. The criteria can be mapped within a cycle/or cycles of evaluation, based on the 'logic model', at the stages of input, process, output and outcome to enable the identification of connections between the criteria across the categories of effectiveness. This framework is considered to have potential application in measuring the effectiveness of many impact assessment processes, including strategic environmental assessment (SEA), environmental impact assessment (EIA), social impact assessment (SIA) and health impact assessment (HIA). Highlights: • Conceptualising effectiveness of impact assessment processes. • Identification of factors influencing effectiveness of impact assessment processes. • Development of criteria within a framework for evaluating IA effectiveness. • Applying the logic model to examine connections between effectiveness criteria.
NURBS-Based Geometry for Integrated Structural Analysis
NASA Technical Reports Server (NTRS)
Oliver, James H.
1997-01-01
This grant was initiated in April 1993 and completed in September 1996. The primary goal of the project was to exploit the emerging de facto CAD standard of Non-Uniform Rational B-spline (NURBS) based curve and surface geometry to integrate and streamline the process of turbomachinery structural analysis. We focused our efforts on critical geometric modeling challenges typically posed by the requirements of structural analysts. We developed a suite of software tools that facilitate pre- and post-processing of NURBS-based turbomachinery blade models for finite element structural analyses. We also developed tools to facilitate the modeling of blades in their manufactured (or cold) state based on nominal operating shape and conditions. All of the software developed in the course of this research is written in the C++ language using the IRIS Inventor 3D graphical interface toolkit from Silicon Graphics. In addition to enhanced modularity, improved maintainability, and efficient prototype development, this design facilitates the re-use of code developed for other NASA projects and provides a uniform and professional 'look and feel' for all applications developed by the Iowa State Team.
Development of a Cr-Based Hard Composite Processed by Spark Plasma Sintering
NASA Astrophysics Data System (ADS)
García-Junceda, A.; Sáez, I.; Deng, X. X.; Torralba, J. M.
2018-04-01
This investigation analyzes the feasibility of processing a composite material comprising WC particles randomly dispersed in a matrix in which Cr is the main metallic binder. Thus, a new composite material is processed using a commercial, economic, and easily available Cr-based alloy, assuming that there is a certain Cr solubility in the WC particles acting as reinforcement. The processing route followed includes mechanical milling of the powders and consolidation by spark plasma sintering.
Bryant, Jamie; Sanson-Fisher, Rob; Tzelepis, Flora; Henskens, Frans; Paul, Christine; Stevenson, William
2014-01-01
Background Effective communication with cancer patients and their families about their disease, treatment options, and possible outcomes may improve psychosocial outcomes. However, traditional approaches to providing information to patients, including verbal information and written booklets, have a number of shortcomings centered on their limited ability to meet patient preferences and literacy levels. New-generation Web-based technologies offer an innovative and pragmatic solution for overcoming these limitations by providing a platform for interactive information seeking, information sharing, and user-centered tailoring. Objective The primary goal of this paper is to discuss the advantages of comprehensive and iterative Web-based technologies for health information provision and propose a four-phase framework for the development of Web-based information tools. Methods The proposed framework draws on our experience of constructing a Web-based information tool for hematological cancer patients and their families. The framework is based on principles for the development and evaluation of complex interventions and draws on the Agile methodology of software programming that emphasizes collaboration and iteration throughout the development process. Results The DoTTI framework provides a model for a comprehensive and iterative approach to the development of Web-based informational tools for patients. The process involves 4 phases of development: (1) Design and development, (2) Testing early iterations, (3) Testing for effectiveness, and (4) Integration and implementation. At each step, stakeholders (including researchers, clinicians, consumers, and programmers) are engaged in consultations to review progress, provide feedback on versions of the Web-based tool, and based on feedback, determine the appropriate next steps in development. Conclusions This 4-phase framework is evidence-informed and consumer-centered and could be applied widely to develop Web-based programs for a diverse range of diseases. PMID:24641991
Smits, Rochelle; Bryant, Jamie; Sanson-Fisher, Rob; Tzelepis, Flora; Henskens, Frans; Paul, Christine; Stevenson, William
2014-03-14
Effective communication with cancer patients and their families about their disease, treatment options, and possible outcomes may improve psychosocial outcomes. However, traditional approaches to providing information to patients, including verbal information and written booklets, have a number of shortcomings centered on their limited ability to meet patient preferences and literacy levels. New-generation Web-based technologies offer an innovative and pragmatic solution for overcoming these limitations by providing a platform for interactive information seeking, information sharing, and user-centered tailoring. The primary goal of this paper is to discuss the advantages of comprehensive and iterative Web-based technologies for health information provision and propose a four-phase framework for the development of Web-based information tools. The proposed framework draws on our experience of constructing a Web-based information tool for hematological cancer patients and their families. The framework is based on principles for the development and evaluation of complex interventions and draws on the Agile methodology of software programming that emphasizes collaboration and iteration throughout the development process. The DoTTI framework provides a model for a comprehensive and iterative approach to the development of Web-based informational tools for patients. The process involves 4 phases of development: (1) Design and development, (2) Testing early iterations, (3) Testing for effectiveness, and (4) Integration and implementation. At each step, stakeholders (including researchers, clinicians, consumers, and programmers) are engaged in consultations to review progress, provide feedback on versions of the Web-based tool, and based on feedback, determine the appropriate next steps in development. This 4-phase framework is evidence-informed and consumer-centered and could be applied widely to develop Web-based programs for a diverse range of diseases.
Vanegas, Fernando; Weiss, John; Gonzalez, Felipe
2018-01-01
Recent advances in remote sensed imagery and geospatial image processing using unmanned aerial vehicles (UAVs) have enabled the rapid and ongoing development of monitoring tools for crop management and the detection/surveillance of insect pests. This paper describes a (UAV) remote sensing-based methodology to increase the efficiency of existing surveillance practices (human inspectors and insect traps) for detecting pest infestations (e.g., grape phylloxera in vineyards). The methodology uses a UAV integrated with advanced digital hyperspectral, multispectral, and RGB sensors. We implemented the methodology for the development of a predictive model for phylloxera detection. In this method, we explore the combination of airborne RGB, multispectral, and hyperspectral imagery with ground-based data at two separate time periods and under different levels of phylloxera infestation. We describe the technology used—the sensors, the UAV, and the flight operations—the processing workflow of the datasets from each imagery type, and the methods for combining multiple airborne with ground-based datasets. Finally, we present relevant results of correlation between the different processed datasets. The objective of this research is to develop a novel methodology for collecting, processing, analysing and integrating multispectral, hyperspectral, ground and spatial data to remote sense different variables in different applications, such as, in this case, plant pest surveillance. The development of such methodology would provide researchers, agronomists, and UAV practitioners reliable data collection protocols and methods to achieve faster processing techniques and integrate multiple sources of data in diverse remote sensing applications. PMID:29342101
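A minimal sketch of one routine step in such a workflow: computing a vegetation index (NDVI) from the multispectral bands and sampling it at ground-surveyed vine locations so airborne and ground-based datasets can be correlated. The band arrays and point coordinates below are invented placeholders, not the paper's datasets.

    import numpy as np

    def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
        # normalized difference vegetation index, per pixel
        return (nir - red) / (nir + red + 1e-9)   # small term avoids division by zero

    nir = np.random.rand(100, 100).astype(np.float32)   # stand-in NIR band
    red = np.random.rand(100, 100).astype(np.float32)   # stand-in red band
    index = ndvi(nir, red)

    ground_points = [(10, 22), (48, 51)]                # (row, col) of inspected vines
    samples = [float(index[r, c]) for r, c in ground_points]
    print(samples)   # values to correlate with inspector infestation scores

Correlating such per-point index samples with inspector ratings is one simple way to relate the airborne imagery to the ground observations described above.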
Kusurkar, Rashmi A; Croiset, Gerda; Mann, Karen V; Custers, Eugene; Ten Cate, Olle
2012-06-01
Educational psychology indicates that learning processes can be mapped on three dimensions: cognitive (what to learn), affective or motivational (why learn), and metacognitive regulation (how to learn). In a truly student-centered medical curriculum, all three dimensions should guide curriculum developers in constructing learning environments. The authors explored whether student motivation has guided medical education curriculum developments. The authors reviewed the literature on motivation theory related to education and on medical education curriculum development to identify major developments. Using the Learning-Oriented Teaching model as a framework, they evaluated the extent to which motivation theory has guided medical education curriculum developers. Major developments in the field of motivation theory indicate that motivation drives learning and influences students' academic performance, that gender differences exist in motivational mechanisms, and that the focus has shifted from the quantity of motivation to the quality of motivation, its determinants, and how they stimulate academic motivation. Major developments in medical curricula include the introduction of standardized and regulated medical education as well as problem-based, learner-centered, integrated teaching, outcome-based, and community-based approaches. These curricular changes have been based more on improving students' cognitive processing of content or metacognitive regulation than on stimulating motivation. Motivational processes may be a substantially undervalued factor in curriculum development. Building curricula to specifically stimulate motivation in students may powerfully influence the outcomes of curricula. The elements essential for stimulating intrinsic motivation in students, including autonomy support, adequate feedback, and emotional support, appear lacking as a primary aim in many curricular plans.
NASA Technical Reports Server (NTRS)
Spiering, Bruce; Underwood, Lauren; Ellis, Chris; Lehrter, John; Hagy, Jim; Schaeffer, Blake
2010-01-01
The goals of the project are to provide information from satellite remote sensing to support numeric nutrient criteria development and to determine data processing methods and data quality requirements to support nutrient criteria development and implementation. The approach is to identify water quality indicators that are used by decision makers to assess water quality and that are related to optical properties of the water; to develop remotely sensed data products based on algorithms relating remote sensing imagery to field-based observations of indicator values; to develop methods to assess estuarine water quality, including trends, spatial and temporal variability, and seasonality; and to develop tools to assist in the development and implementation of estuarine and coastal nutrient criteria. Additional slides present process, criteria development, typical data sources and analyses for criteria process, the power of remote sensing data for the process, examples from Pensacola Bay, spatial and temporal variability, pixel matchups, remote sensing validation, remote sensing in coastal waters, requirements for remotely sensed data products, and needs assessment. An additional presentation examines group engagement and information collection. Topics include needs assessment purpose and objectives, understanding water quality decision making, determining information requirements, and next steps.
Hypertext-based computer vision teaching packages
NASA Astrophysics Data System (ADS)
Marshall, A. David
1994-10-01
The World Wide Web Initiative has provided a means for providing hypertext- and multimedia-based information across the whole INTERNET. Many applications have been developed on such http servers. At Cardiff we have developed an http hypertext-based multimedia server, the Cardiff Information Server, using the widely available Mosaic system. The server provides a variety of information ranging from the provision of teaching modules, on-line documentation, and timetables for departmental activities to more light-hearted hobby interests. One important and novel development to the server has been the development of courseware facilities. This ranges from the provision of on-line lecture notes, exercises and their solutions to more interactive teaching packages. A variety of disciplines have benefited, notably Computer Vision and Image Processing, but also C programming, X Windows, Computer Graphics and Parallel Computing. This paper will address the issues of the implementation of the Computer Vision and Image Processing packages, the advantages gained from using a hypertext-based system, and will also relate practical experiences of using the packages in a class environment. The paper addresses issues of how best to provide information in such a hypertext-based system and how interactive image processing packages can be developed and integrated into courseware. The suite of tools developed facilitates a flexible and powerful courseware package that has proved popular in the classroom and over the Internet. The paper will also detail many future developments we see possible. One of the key points raised in the paper is that Mosaic's hypertext language (html) is extremely powerful and yet relatively straightforward to use. It is also possible to link in Unix calls so that programs and shells can be executed. This provides a powerful suite of utilities that can be exploited to develop many packages.
Scaffolding as an effort for thinking process optimization on heredity
NASA Astrophysics Data System (ADS)
Azizah, N. R.; Masykuri, M.; Prayitno, B. A.
2018-04-01
Thinking is an activity and a process of manipulating and transforming data or information in memory. The thinking process differs from one person to another, and it can be developed through interaction between students and their environment, such as scaffolding. Scaffolding is given according to each student's needs, at two levels: (1) explaining, reviewing, and restructuring; and (2) developing conceptual thinking. This research aims to describe students' thinking processes on heredity, especially inheritance, before and after scaffolding. The research used a descriptive qualitative method with three kinds of subjects: high-, middle-, and low-achieving students. The results showed that the subjects had difficulty with dihybrid inheritance questions at different points; most difficulty lay in determining the number of distinct characteristics, the parental genotypes, the gametes, and the F2 genotype and phenotype ratios. Discussion during scaffolding revealed that the subjects misunderstood some terms and had difficulty determining parentals, gametes, genotypes, and phenotypes. The final results showed that the subjects developed higher thinking processes after scaffolding and could then solve the questions properly.
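As a concrete illustration of the dihybrid computation the subjects struggled with, the short sketch below enumerates the gametes of an AaBb x AaBb cross and tallies the F2 genotype and phenotype ratios (the classic 9:3:3:1 for two independently assorting genes). It is a generic textbook worked example, not material from the study itself.

    from itertools import product
    from collections import Counter

    def gametes(genotype):
        # one allele from each gene pair: "AaBb" -> AB, Ab, aB, ab
        pairs = [genotype[i:i + 2] for i in range(0, len(genotype), 2)]
        return ["".join(p) for p in product(*pairs)]

    def cross(p1, p2):
        offspring = Counter()
        for g1, g2 in product(gametes(p1), gametes(p2)):
            # canonical allele order per gene (uppercase first), so "aA" == "Aa"
            offspring["".join("".join(sorted(a + b)) for a, b in zip(g1, g2))] += 1
        return offspring

    def phenotype(geno):
        # dominant phenotype whenever a gene pair carries an uppercase allele
        pairs = [geno[i:i + 2] for i in range(0, len(geno), 2)]
        return " ".join(p[0] + "_" if p[0].isupper() else p for p in pairs)

    f2 = cross("AaBb", "AaBb")
    ratios = Counter()
    for geno, n in f2.items():
        ratios[phenotype(geno)] += n
    print(ratios)   # Counter({'A_ B_': 9, 'A_ bb': 3, 'aa B_': 3, 'aa bb': 1})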
2012-09-01
Elmendorf, D. W., & Gregory Mankiw , N. (1999). Government debt. Handbook of Macroeconomics , 1, 1615-1669. European Union. European financial stability...budget process, based on the supply chain demand management process principles of operations and it is introduced the idea of developing a Budget... principles of systems dynamics, a proposal for the development of a Budget Management Flight Simulator, that will operate as a learning and educational
NASA Technical Reports Server (NTRS)
Bharwani, S. S.; Walls, J. T.; Jackson, M. E.
1987-01-01
A knowledge-based system to assist process engineers in evaluating the processability and moldability of polyisocyanurate (PIR) formulations for the thermal protection system of the Space Shuttle external tank (ET) is discussed. The Reaction Injection Molding-Process Development Advisor (RIM-PDA) is a coupled system which takes advantage of both symbolic and numeric processing techniques. This system will aid the process engineer in identifying a startup set of mold schedules and in refining the mold schedules to remedy specific process problems diagnosed by the system.
Building the Evidentiary Argument in Game-Based Assessment
ERIC Educational Resources Information Center
DiCerbo, Kristen E.
2017-01-01
While game-based assessment offers new potential for understanding the processes students use to solve problems, it also presents new challenges in uncovering which player actions provide evidence that contributes to understanding about students' knowledge, skill, and attributes that we are interested in assessing. A development process that…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palta, J.
2015-06-15
Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are often caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100 that has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how these tools can be used in a given radiotherapy clinic to develop a risk-based quality management program. Learning Objectives: Learn how to design a process map for a radiotherapy process; learn how to perform failure modes and effects analysis for a given process; learn what fault trees are all about; learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; Consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.
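A minimal sketch of the failure modes and effects analysis (FMEA) ranking step named above: each potential failure mode is scored for occurrence (O), severity (S), and lack of detectability (D), and modes are ranked by the risk priority number RPN = O x S x D. The failure modes and scores below are invented examples for illustration, not values from TG-100.

    # each entry: (failure mode, occurrence O, severity S, lack of detectability D)
    failure_modes = [
        ("wrong CT dataset attached to plan", 4, 9, 6),
        ("MLC leaf positions mis-transferred", 3, 8, 3),
        ("patient mis-identified at treatment", 2, 10, 2),
    ]

    # rank by risk priority number, highest first
    ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
    for name, o, s, d in ranked:
        print(f"RPN={o * s * d:4d}  {name}")
    # the highest-RPN modes receive quality-management controls first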
Virtual Sensor Web Architecture
NASA Astrophysics Data System (ADS)
Bose, P.; Zimdars, A.; Hurlburt, N.; Doug, S.
2006-12-01
NASA envisions the development of smart sensor webs: intelligent and integrated observation networks that harness distributed sensing assets, their associated continuous and complex data sets, and predictive observation processing mechanisms for timely, collaborative hazard mitigation and enhanced science productivity and reliability. This paper presents the Virtual Sensor Web Infrastructure for Collaborative Science (VSICS) architecture for sustained coordination of (numerical and distributed) model-based processing, closed-loop resource allocation, and observation planning. VSICS's key ideas include i) rich descriptions of sensors as services based on semantic markup languages like OWL and SensorML; ii) service-oriented workflow composition and repair for simple and ensemble models, with event-driven workflow execution based on event-based and distributed workflow management mechanisms; and iii) development of autonomous model interaction management capabilities providing closed-loop control of collection resources driven by competing targeted observation needs. We present results from initial work on collaborative science processing involving distributed services (the COSEC framework) that is being extended to create VSICS.
NASA Technical Reports Server (NTRS)
Hopkins, R. H.; Davis, J. R.; Rohatgi, A.; Hanes, M. H.; Rai-Choudhury, P.; Mollenkopf, H. C.
1982-01-01
The effects of impurities and processing on the characteristics of silicon and terrestrial silicon solar cells were defined in order to develop cost-benefit relationships for the use of cheaper, less pure solar grades of silicon. The concentrations of commonly encountered impurities that can be tolerated in typical p- or n-base solar cells were established; then a preliminary analytical model was developed from which the cell performance could be projected depending on the kinds and amounts of contaminants in the silicon base material. The impurity data base was expanded to include construction materials, and the impurity-performance model was refined to account for additional effects such as base resistivity, grain boundary interactions, thermal processing, synergic behavior, and nonuniform impurity distributions. A preliminary assessment of the long-term (aging) behavior of impurities was also undertaken.
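To make the idea of such a performance-projection model concrete, the sketch below assumes a toy functional form in which relative cell efficiency falls off with the logarithm of each contaminant's concentration. The element coefficients, reference level, and the functional form itself are invented for illustration only; they are not the model fitted in this work.

    import math

    # hypothetical per-element damage factors (not measured values)
    DAMAGE = {"Ti": 5e-3, "Fe": 1e-3, "Cu": 2e-4}

    def relative_efficiency(impurities_ppma):
        """Cell efficiency relative to an uncontaminated baseline, clamped to [0, 1].
        impurities_ppma: {element: concentration in ppma}."""
        loss = sum(k * math.log10(1.0 + impurities_ppma.get(el, 0.0) / 1e-4)
                   for el, k in DAMAGE.items())
        return max(0.0, 1.0 - loss)

    print(relative_efficiency({"Ti": 0.01}))   # ~0.99 for a light Ti contamination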
Developing the skills required for evidence-based practice.
French, B
1998-01-01
The current health care environment requires practitioners with the skills to find and apply the best currently available evidence for effective health care, to contribute to the development of evidence-based practice protocols, and to evaluate the impact of utilizing validated research findings in practice. Current approaches to teaching research are based mainly on gaining skills by participation in the research process. Emphasis on the requirement for rigour in the process of creating new knowledge is assumed to lead to skill in the process of using research information created by others. This article reflects upon the requirements for evidence-based practice, and the degree to which current approaches to teaching research prepare practitioners who are able to find, evaluate and best use currently available research information. The potential for using the principles of systematic review as a teaching and learning strategy for research is explored, and some of the possible strengths and weaknesses of this approach are highlighted.
Newby, Katie; Bayley, Julie; Wallace, L M
2011-03-01
This article describes the development of an intervention that aims to increase the quantity and quality of parent-child communication about sex and relationships. The intervention has been designed as part of a local strategic approach to teenage pregnancy and sexual health. The process and findings of Intervention Mapping (IM), a tool for the development of theory- and evidence-based interventions, are presented. The process involves a detailed assessment of the difficulties parents experience in communicating with their children about sex and relationships. The findings are translated into program and change objectives that specify what parents need to do to improve their communication. Theory-based practical strategies most likely to bring about the desired behavioral change are then identified and pretested. The intervention developed consists of a six-session facilitator-led program that targets parents' attitudes, knowledge, communication skills, and self-efficacy. Following on from Bartholomew's seminal work on IM, this article develops and extends the application of this process by presenting explicit detail on the behavioral change techniques used and their theoretical underpinnings. The strengths and weaknesses of IM as a process for the development of health behavior interventions are discussed.
Process Development for the Design and Manufacturing of Personalizable Mouth Sticks.
Berger, Veronika M; Pölzer, Stephan; Nussbaum, Gerhard; Ernst, Waltraud; Major, Zoltan
2017-01-01
To increase the independence of people with reduced hand/arm functionality, a process to generate personalizable mouth sticks was developed based on the participatory design principle. In a web tool, anybody can choose the geometry and the materials of their mouthpiece, stick, and tip. Manufacturing techniques (e.g., 3D printing) and materials used in the process are discussed and evaluated.
LISP based simulation generators for modeling complex space processes
NASA Technical Reports Server (NTRS)
Tseng, Fan T.; Schroer, Bernard J.; Dwan, Wen-Shing
1987-01-01
The development of a simulation assistant for modeling discrete event processes is presented. Included are an overview of the system, a description of the simulation generators, and a sample process generated using the simulation assistant.
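A minimal sketch of the kind of discrete-event engine such a generator would emit: a simulation clock, an event queue ordered by timestamp, and handlers that schedule future events. The original assistant generated LISP; this illustration uses Python, and the event names and rates are placeholders (each arrival schedules its own completion, with no resource contention modeled).

    import heapq, random

    def simulate(until=10.0):
        queue = [(random.expovariate(1.0), "arrival")]   # (timestamp, event type)
        completed = 0
        while queue:
            clock, event = heapq.heappop(queue)          # advance clock to next event
            if clock > until:
                break
            if event == "arrival":
                # schedule the next arrival and this entity's processing completion
                heapq.heappush(queue, (clock + random.expovariate(1.0), "arrival"))
                heapq.heappush(queue, (clock + random.expovariate(1.5), "done"))
            else:
                completed += 1
        return completed

    print(simulate())   # entities processed within the simulated horizon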
SDI-based business processes: A territorial analysis web information system in Spain
NASA Astrophysics Data System (ADS)
Béjar, Rubén; Latre, Miguel Á.; Lopez-Pellicer, Francisco J.; Nogueras-Iso, Javier; Zarazaga-Soria, F. J.; Muro-Medrano, Pedro R.
2012-09-01
Spatial Data Infrastructures (SDIs) provide access to geospatial data and operations through interoperable Web services. These data and operations can be chained to set up specialized geospatial business processes, and these processes can give support to different applications. End users can benefit from these applications, while experts can integrate the Web services in their own business processes and developments. This paper presents an SDI-based territorial analysis Web information system for Spain, which gives access to land cover, topography and elevation data, as well as to a number of interoperable geospatial operations by means of a Web Processing Service (WPS). Several examples illustrate how different territorial analysis business processes are supported. The system has been established by the Spanish National SDI (Infraestructura de Datos Espaciales de España, IDEE) both as an experimental platform for geoscientists and geoinformation system developers, and as a mechanism to contribute to the Spanish citizens knowledge about their territory.
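A minimal sketch of invoking one step of such a geospatial business process as an interoperable service: an OGC WPS 1.0.0 Execute request sent as key-value pairs over HTTP. The endpoint URL, process identifier, and inputs below are hypothetical placeholders, not the IDEE services' actual names.

    import requests

    WPS_URL = "https://example.org/wps"          # placeholder endpoint

    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": "ElevationStatistics",     # hypothetical process name
        "datainputs": "region=ES24;statistic=mean",
    }
    response = requests.get(WPS_URL, params=params, timeout=30)
    print(response.status_code)
    print(response.text[:200])   # ExecuteResponse XML, if the call succeeds

Chaining processes, as described above, amounts to feeding one ExecuteResponse output into the inputs of the next service call.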
Georgia resource assessment project: Institutionalizing LANDSAT and geographic data base techniques
NASA Technical Reports Server (NTRS)
Pierce, R. R.; Rado, B. Q.; Faust, N.
1981-01-01
Digital data from LANDSAT for each 1.1-acre cell in Georgia were processed and the land cover conditions were categorized. Several test cases were completed and an operational hardware and software processing capability was established at the Georgia Institute of Technology. The operational capability was developed to process the entire state (60,000 sq. miles and 14 LANDSAT scenes) in a cooperative project between eleven divisions and agencies at the regional, state, and federal levels. Products were developed for state agencies in both mapped and statistical formats. A computerized geographical data base was developed for management programs. To a large extent, the applications of the data base evolved as users of LANDSAT information requested that other data (i.e., soils, slope, land use, etc.) be made compatible with LANDSAT for management programs. To date, geographic data bases incorporating LANDSAT and other spatial data deal with elements of the municipal solid waste management program and reservoir management for the Corps of Engineers. LANDSAT data are also being used for applications in wetland, wildlife, and forestry management.
NASA Astrophysics Data System (ADS)
Sabeur, Z. A.; Wächter, J.; Middleton, S. E.; Zlatev, Z.; Häner, R.; Hammitzsch, M.; Loewe, P.
2012-04-01
The intelligent management of large volumes of environmental monitoring data for early tsunami warning requires the deployment of robust and scalable service-oriented infrastructure that is supported by an agile knowledge base for critical decision support. In the TRIDEC project (2010-2013), a sensor observation service bus of the TRIDEC system is being developed for the advancement of complex tsunami event processing and management. Further, a dedicated TRIDEC system knowledge base is being implemented to enable on-demand access to semantically rich, OGC SWE-compliant hydrodynamic observations and operationally oriented meta-information for multiple subscribers. TRIDEC decision support requires a scalable and agile real-time processing architecture which enables fast response to evolving subscribers' requirements as the tsunami crisis develops. This is also achieved with the support of intelligent processing services which specialise in multi-level fusion methods with relevance feedback and deep learning. The TRIDEC knowledge base development work, coupled with that of the generic sensor bus platform, shall be presented to demonstrate advanced decision support with situation awareness in the context of tsunami early warning and crisis management.
Ionic-Liquid-Based CO2 Capture Systems: Structure, Interaction and Process.
Zeng, Shaojuan; Zhang, Xiangping; Bai, Lu; Zhang, Xiaochun; Wang, Hui; Wang, Jianji; Bao, Di; Li, Mengdie; Liu, Xinyan; Zhang, Suojiang
2017-07-26
The inherent structure tunability, good affinity with CO2, and nonvolatility of ionic liquids (ILs) drive their exploration and exploitation in the CO2 separation field, and have attracted remarkable interest from both industry and academia. The aim of this Review is to give a detailed overview of the recent advances in IL-based materials, including pure ILs, IL-based solvents, and IL-based membranes for CO2 capture and separation, from the molecular level to engineering. The effects of anions, cations, and functional groups on the CO2 solubility and selectivity of ILs, as well as studies on the degradability of ILs, are reviewed, and recent developments in functionalized ILs, IL-based solvents, and IL-based membranes are also discussed. CO2 separation mechanisms with IL-based solvents and IL-based membranes are explained by combining molecular simulation and experimental characterization. Taking the applications and industrialization into consideration, recent achievements and developments in the transport properties of IL fluids and the process design of IL-based processes are highlighted. Finally, future research challenges and perspectives for the commercialization of CO2 capture and separation with IL-based materials are posed.
Coater/developer based techniques to improve high-resolution EUV patterning defectivity
NASA Astrophysics Data System (ADS)
Hontake, Koichi; Huli, Lior; Lemley, Corey; Hetzer, Dave; Liu, Eric; Ko, Akiteru; Kawakami, Shinichiro; Shimoaoki, Takeshi; Hashimoto, Yusaku; Tanaka, Koichiro; Petrillo, Karen; Meli, Luciana; De Silva, Anuja; Xu, Yongan; Felix, Nelson; Johnson, Richard; Murray, Cody; Hubbard, Alex
2017-10-01
Extreme ultraviolet lithography (EUVL) technology is one of the leading candidates under consideration for enabling the next generation of devices, for 7nm node and beyond. As the focus shifts to driving down the 'effective' k1 factor and enabling the full scaling entitlement of EUV patterning, new techniques and methods must be developed to reduce the overall defectivity, mitigate pattern collapse, and eliminate film-related defects. In addition, CD uniformity and LWR/LER must be improved in terms of patterning performance. Tokyo Electron Limited (TEL™) and IBM Corporation are continuously developing manufacturing quality processes for EUV. In this paper, we review the ongoing progress in coater/developer based processes (coating, developing, baking) that are required to enable EUV patterning.
Multi-enzyme logic network architectures for assessing injuries: digital processing of biomarkers.
Halámek, Jan; Bocharova, Vera; Chinnapareddy, Soujanya; Windmiller, Joshua Ray; Strack, Guinevere; Chuang, Min-Chieh; Zhou, Jian; Santhosh, Padmanabhan; Ramirez, Gabriela V; Arugula, Mary A; Wang, Joseph; Katz, Evgeny
2010-12-01
A multi-enzyme biocatalytic cascade processing simultaneously five biomarkers characteristic of traumatic brain injury (TBI) and soft tissue injury (STI) was developed. The system operates as a digital biosensor based on concerted function of 8 Boolean AND logic gates, resulting in the decision about the physiological conditions based on the logic analysis of complex patterns of the biomarkers. The system represents the first example of a multi-step/multi-enzyme biosensor with the built-in logic for the analysis of complex combinations of biochemical inputs. The approach is based on recent advances in enzyme-based biocomputing systems and the present paper demonstrates the potential applicability of biocomputing for developing novel digital biosensor networks.
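A minimal software sketch of the decision layer described above: each biomarker level is digitized against a threshold, and AND gates flag a condition only when all of its characteristic biomarkers are simultaneously elevated. The biomarker names, thresholds, and gate wiring below are invented placeholders, and the actual system implements this logic enzymatically rather than in software.

    # digitize each biomarker: 1 if its level exceeds a pathological threshold, else 0
    THRESHOLDS = {"bm1": 2.0, "bm2": 0.5, "bm3": 1.2, "bm4": 3.3, "bm5": 0.8}

    def digitize(levels):
        return {name: int(levels[name] > t) for name, t in THRESHOLDS.items()}

    def diagnose(levels):
        bits = digitize(levels)
        # concerted AND gates: a condition is flagged only when all of its
        # characteristic biomarkers are elevated at the same time
        tbi = bits["bm1"] & bits["bm2"] & bits["bm3"]
        sti = bits["bm3"] & bits["bm4"] & bits["bm5"]
        return {"TBI": bool(tbi), "STI": bool(sti)}

    print(diagnose({"bm1": 2.5, "bm2": 0.9, "bm3": 1.5, "bm4": 1.0, "bm5": 0.2}))
    # -> {'TBI': True, 'STI': False}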
A Model-Based Approach to Developing Your Mission Operations System
NASA Technical Reports Server (NTRS)
Smith, Robert R.; Schimmels, Kathryn A.; Lock, Patricia D; Valerio, Charlene P.
2014-01-01
Model-Based System Engineering (MBSE) is an increasingly popular methodology for designing complex engineering systems. As the use of MBSE has grown, it has begun to be applied to systems that are less hardware-based and more people- and process-based. We describe our approach to incorporating MBSE as a way to streamline development, and how to build a model consisting of core resources, such as requirements and interfaces, that can be adapted and used by new and upcoming projects. By comparing traditional Mission Operations System (MOS) system engineering with an MOS designed via a model, we will demonstrate the benefits to be obtained by incorporating MBSE in system engineering design processes.
Sensitivity analysis of the add-on price estimate for the silicon web growth process
NASA Technical Reports Server (NTRS)
Mokashi, A. R.
1981-01-01
The web growth process, a silicon-sheet technology option developed for the flat-plate solar array (FSA) project, was examined. Base-case data for the technical and cost parameters of the technical and commercial readiness phase of the FSA project are projected. The process add-on price is analyzed using the base-case data for cost parameters such as equipment, space, direct labor, materials, and utilities, and for production parameters such as growth rate and run length, with a computer program developed specifically to perform the sensitivity analysis with improved price estimation. Silicon price, sheet thickness, and cell efficiency are also discussed.
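A minimal sketch of the kind of sensitivity analysis described: a toy add-on price model (annualized cost divided by annual good sheet output) swept over the growth rate. All cost figures, parameter values, and the formula itself are illustrative assumptions, not the project's actual cost model.

    def addon_price(growth_rate_cm2_min, run_length_hr, runs_per_year=300,
                    annual_cost=250_000.0, yield_frac=0.9):
        """Add-on price in $/m^2 of web: annualized cost over annual good output."""
        area_per_run_m2 = growth_rate_cm2_min * 60 * run_length_hr / 10_000
        annual_output_m2 = area_per_run_m2 * runs_per_year * yield_frac
        return annual_cost / annual_output_m2

    base = addon_price(25, 100)
    for rate in (15, 25, 35):            # growth-rate sensitivity, price vs. base case
        print(rate, round(addon_price(rate, 100) / base, 2))
    # price scales inversely with growth rate: 15 -> 1.67, 25 -> 1.0, 35 -> 0.71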
Spatiotemporal-Thematic Data Processing for the Semantic Web
NASA Astrophysics Data System (ADS)
Hakimpour, Farshad; Aleman-Meza, Boanerges; Perry, Matthew; Sheth, Amit
This chapter presents practical approaches to data processing in the space, time and theme dimensions using existing Semantic Web technologies. It describes how we obtain geographic and event data from Internet sources and also how we integrate them into an RDF store. We briefly introduce a set of functionalities in space, time and semantics. These functionalities are implemented based on our existing technology for main-memory-based RDF data processing developed at the LSDIS Lab. A number of these functionalities are exposed as REST Web services. We present two sample client-side applications that are developed using a combination of our services with Google Maps service.
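A minimal sketch of a combined thematic and temporal filter over an RDF store, using rdflib and SPARQL. The namespace, predicates, and sample triples are invented for illustration and are not the vocabulary of the services described above.

    from rdflib import Graph

    g = Graph()
    g.parse(data="""
    @prefix ex: <http://example.org/> .
    @prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
    ex:e1 ex:theme "flood" ; ex:when "2008-05-01"^^xsd:date ; ex:near ex:Athens .
    ex:e2 ex:theme "flood" ; ex:when "2009-07-10"^^xsd:date ; ex:near ex:Athens .
    """, format="turtle")

    q = """
    PREFIX ex: <http://example.org/>
    PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>
    SELECT ?event WHERE {
      ?event ex:theme "flood" ; ex:when ?t .        # thematic match
      FILTER (?t >= "2009-01-01"^^xsd:date)         # temporal window
    }
    """
    for row in g.query(q):
        print(row.event)   # -> http://example.org/e2

A spatial predicate (here the stand-in ex:near) would be handled the same way, with the spatial computation delegated to a dedicated function or service.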
A laser-based vision system for weld quality inspection.
Huang, Wei; Kovacevic, Radovan
2011-01-01
Welding is a very complex process in which the final weld quality can be affected by many process parameters. In order to inspect the weld quality and detect the presence of various weld defects, different methods and systems are studied and developed. In this paper, a laser-based vision system is developed for non-destructive weld quality inspection. The vision sensor is designed based on the principle of laser triangulation. By processing the images acquired from the vision sensor, the geometrical features of the weld can be obtained. Through the visual analysis of the acquired 3D profiles of the weld, the presences as well as the positions and sizes of the weld defects can be accurately identified and therefore, the non-destructive weld quality inspection can be achieved.
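A minimal sketch of the two processing steps implied above, assuming a grayscale camera frame in which the projected laser stripe is the brightest feature in each column. The peak-detection and deviation-threshold approach here is a generic illustration of triangulation-based profile inspection, not the authors' implementation; the threshold values are assumptions.

    import numpy as np

    def extract_profile(frame):
        """Laser-line row index per image column (brightest-pixel peak detection)."""
        return frame.argmax(axis=0).astype(float)

    def find_defects(profile, window=15, tol=3.0):
        """Flag columns where the profile deviates sharply from its local average,
        a crude indicator of pits, undercut, or dropouts in the scanned 3D profile."""
        kernel = np.ones(window) / window
        padded = np.pad(profile, window, mode="edge")   # avoid edge artifacts
        baseline = np.convolve(padded, kernel, mode="same")[window:-window]
        return np.flatnonzero(np.abs(profile - baseline) > tol)

    frame = np.zeros((100, 200), dtype=np.uint8)
    frame[50, :] = 255                   # ideal laser stripe
    frame[50, 120] = 0
    frame[70, 120] = 255                 # stripe jumps at column 120: a surface pit
    print(find_defects(extract_profile(frame)))   # -> [120]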
Sakamoto, Izumi
2006-07-01
A grounded-theory study aimed at reconceptualizing cultural adaptation processes from gender role and family/couple perspectives while critically drawing from acculturation and culture and self literatures. In-depth interviews with 34 Japanese academic sojourners (international students, scholars) and their spouses (a total of 50 interviews with select longitudinal interviews) were conducted. The author earlier developed the Model of Cultural Negotiation (2001; 2006) capturing uneven and cyclical processes of dealing with multiple cultural contexts. The current study further develops more tailored versions of this model, Family-Based (Couple-Based) Cultural Negotiation and Individual-Based Cultural Negotiation, highlighting the impacts of family/couple and gender roles, especially for female spouses. These conceptualizations afford a sophisticated understanding of the processes of culture.
A Laser-Based Vision System for Weld Quality Inspection
Huang, Wei; Kovacevic, Radovan
2011-01-01
Welding is a very complex process in which the final weld quality can be affected by many process parameters. In order to inspect the weld quality and detect the presence of various weld defects, different methods and systems are studied and developed. In this paper, a laser-based vision system is developed for non-destructive weld quality inspection. The vision sensor is designed based on the principle of laser triangulation. By processing the images acquired from the vision sensor, the geometrical features of the weld can be obtained. Through the visual analysis of the acquired 3D profiles of the weld, the presences as well as the positions and sizes of the weld defects can be accurately identified and therefore, the non-destructive weld quality inspection can be achieved. PMID:22344308
A Compton suppressed detector multiplicity trigger based digital DAQ for gamma-ray spectroscopy
NASA Astrophysics Data System (ADS)
Das, S.; Samanta, S.; Banik, R.; Bhattacharjee, R.; Basu, K.; Raut, R.; Ghugre, S. S.; Sinha, A. K.; Bhattacharya, S.; Imran, S.; Mukherjee, G.; Bhattacharyya, S.; Goswami, A.; Palit, R.; Tan, H.
2018-06-01
The development of a digitizer-based pulse processing and data acquisition system for γ-ray spectroscopy with large detector arrays is presented. The system is based on 250 MHz 12-bit digitizers, and is triggered by a user-chosen multiplicity of Compton-suppressed detectors. The logic for trigger generation is similar to the one practised for analog (NIM/CAMAC) pulse processing electronics, while retaining the fast processing merits of the digitizer system. Codes for reduction of the data acquired from the system have also been developed. The system has been tested in offline studies using radioactive sources as well as in in-beam experiments with an array of Compton-suppressed Clover detectors. The results obtained therefrom validate its use in spectroscopic efforts for nuclear structure investigations.
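A minimal sketch of the trigger logic described above: count hits from distinct Compton-suppressed detectors inside a coincidence window and accept the event when the count reaches a user-chosen multiplicity. Timestamps are in arbitrary units, and the function names and data layout are illustrative, not the DAQ's actual interface.

    def multiplicity_trigger(hits, window=200, multiplicity=2):
        """hits: time-sorted (timestamp, detector_id, suppressed) tuples.
        A hit vetoed by its anti-Compton shield (suppressed=True) never counts."""
        clean = [(t, d) for t, d, suppressed in hits if not suppressed]
        accepted = []
        i = 0
        while i < len(clean):
            t0 = clean[i][0]
            group = [h for h in clean[i:] if h[0] - t0 <= window]
            if len({d for _, d in group}) >= multiplicity:   # distinct detectors
                accepted.append(group)
                i += len(group)
            else:
                i += 1
        return accepted

    hits = [(0, 1, False), (50, 2, False), (400, 3, True), (900, 1, False)]
    print(len(multiplicity_trigger(hits)))   # -> 1 coincident gamma-gamma event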
Biochips: non-conventional strategies for biosensing elements immobilization.
Marquette, Christophe A; Corgier, Benjamin P; Heyries, Kevin A; Blum, Loic J
2008-01-01
The present article draws a general picture of non-conventional methods for biomolecules immobilization. The technologies presented are based either on original solid supports or on innovative immobilization processes. Polydimethylsiloxane elastomer will be presented as a popular immobilization support within the biochip developer community. Electro-addressing of biomolecules at the surface of conducting biochips will appear to be an interesting alternative to immobilization processes based on surface functionalization. Finally, bead-assisted biomolecules immobilization will be presented as an open field of research for biochip developments.
An adaptive signal-processing approach to online adaptive tutoring.
Bergeron, Bryan; Cline, Andrew
2011-01-01
Conventional intelligent or adaptive tutoring online systems rely on domain-specific models of learner behavior based on rules, deep domain knowledge, and other resource-intensive methods. We have developed and studied a domain-independent methodology of adaptive tutoring based on domain-independent signal-processing approaches that obviate the need for the construction of explicit expert and student models. A key advantage of our method over conventional approaches is a lower barrier to entry for educators who want to develop adaptive online learning materials.
Performance measurement: integrating quality management and activity-based cost management.
McKeon, T
1996-04-01
The development of an activity-based management system provides a framework for developing performance measures integral to quality and cost management. Performance measures that cross operational boundaries and embrace core processes provide a mechanism to evaluate operational results related to strategic intention and internal and external customers. The author discusses this measurement process that allows managers to evaluate where they are and where they want to be, and to set a course of action that closes the gap between the two.
Fabrication of polydimethylsiloxane (PDMS) - based multielectrode array for neural interface.
Kim, Jun-Min; Oh, Da-Rong; Sanchez, Joaquin; Kim, Shang-Hyub; Seo, Jong-Mo
2013-01-01
Flexible multielectrode arrays (MEAs) are being developed with various materials, and polyimide has been widely used because of its convenient processing. Polyimide is available in the form of a photoresist, which enables precise and reproducible fabrication. PDMS is another good candidate for an MEA base material, but it has poor surface energy and etching properties. In this paper, we propose an improved fabrication process that modifies the PDMS surface durably and opens the electrode and pad sites efficiently without etching the PDMS.
Automated, on-board terrain analysis for precision landings
NASA Technical Reports Server (NTRS)
Rahman, Zia-ur; Jobson, Daniel J.; Woodell, Glenn A.; Hines, Glenn D.
2006-01-01
Advances in space robotics technology hinge to a large extent upon the development and deployment of sophisticated new vision-based methods for automated in-space mission operations and scientific survey. To this end, we have developed a new concept for automated terrain analysis that is based upon a generic image enhancement platform: multi-scale retinex (MSR) and visual servo (VS) processing. This pre-conditioning with the MSR and the VS produces a "canonical" visual representation that is largely independent of lighting variations and exposure errors. Enhanced imagery is then processed with a biologically inspired two-channel edge detection process, followed by a smoothness-based criterion for image segmentation. Landing sites can be automatically determined by examining the results of the smoothness-based segmentation, which shows those areas in the image that surpass a minimum degree of smoothness. Though the MSR has proven to be a very strong enhancement engine, the other elements of the approach (the VS, terrain map generation, and smoothness-based segmentation) are in early stages of development. Experimental results on data from the Mars Global Surveyor show that the imagery can be processed to automatically obtain smooth landing sites. In this paper, we describe the method used to obtain these landing sites, and also examine the smoothness criteria in terms of the imager and scene characteristics. Several examples of applying this method to simulated and real imagery are shown.
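The multi-scale retinex pre-conditioning mentioned above has a well-known textbook form: the output is a weighted sum, over several surround scales, of the difference between the log image and the log of its Gaussian-smoothed version. A minimal Python sketch follows; the scales and weights are common illustrative choices, not the values used in the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def multi_scale_retinex(image, sigmas=(15, 80, 250), weights=None, eps=1e-6):
    """Textbook MSR: weighted sum of log(image) - log(Gaussian surround)
    over several scales. Scales and weights here are illustrative."""
    weights = weights or [1.0 / len(sigmas)] * len(sigmas)
    log_img = np.log(image + eps)
    out = np.zeros_like(log_img)
    for w, sigma in zip(weights, sigmas):
        surround = gaussian_filter(image, sigma)
        out += w * (log_img - np.log(surround + eps))
    return out

# Synthetic terrain-like image with a strong illumination gradient,
# the kind of variation MSR is meant to normalize away.
rng = np.random.default_rng(0)
img = rng.random((128, 128)) * np.linspace(0.2, 1.0, 128)
enhanced = multi_scale_retinex(img)
```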
Automated assembly of VECSEL components
NASA Astrophysics Data System (ADS)
Brecher, C.; Pyschny, N.; Haag, S.; Mueller, T.
2013-02-01
Because their external cavity enables the integration of additional elements (e.g., for mode control, frequency conversion, wavelength tuning, or passive mode-locking), VECSELs are a rapidly developing laser technology. Nevertheless, they often have to compete with direct (edge-emitting) laser diodes, which can have significant cost advantages thanks to their rather simple structure and production processes. One way to compensate for the economic disadvantages of VECSELs is to optimize each component in terms of quality and cost and to apply more efficient (batch) production processes. In this context, the paper presents recent process developments for the automated assembly of VECSELs using a new type of desktop assembly station with an ultra-precise micromanipulator. The core concept is to create a dedicated process development environment from which implemented processes can be transferred smoothly to production equipment. So far, two types of processes have been put into operation on the desktop assembly station: (1) passive alignment of the pump optics using a camera-based alignment process, in which the pump spot geometry and position on the semiconductor chip are analyzed and evaluated; and (2) active alignment of the end mirror based on output power measurements and optimization algorithms. In addition to the core concept and the corresponding hardware and software developments, detailed results of both processes are presented, covering measurement setups as well as alignment strategies and results.
Developing a semantic web model for medical differential diagnosis recommendation.
Mohammed, Osama; Benlamri, Rachid
2014-10-01
In this paper we describe a novel model for differential diagnosis designed to make recommendations by utilizing semantic web technologies. The model is a response to a number of requirements, ranging from incorporating essential clinical diagnostic semantics to the integration of data mining for the process of identifying candidate diseases that best explain a set of clinical features. We introduce two major components, which we find essential to the construction of an integral differential diagnosis recommendation model: the evidence-based recommender component and the proximity-based recommender component. Both approaches are driven by disease diagnosis ontologies designed specifically to enable the process of generating diagnostic recommendations. These ontologies are the disease symptom ontology and the patient ontology. The evidence-based diagnosis process develops dynamic rules based on standardized clinical pathways. The proximity-based component employs data mining to provide clinicians with diagnosis predictions, as well as generates new diagnosis rules from provided training datasets. This article describes the integration between these two components along with the developed diagnosis ontologies to form a novel medical differential diagnosis recommendation model. This article also provides test cases from the implementation of the overall model, which shows quite promising diagnostic recommendation results.
Final Environmental Assessment for Developing Renewable Energy Enhanced Use Lease Facilities at Robins Air Force Base
2013-12-15
Blufftown is underlain by igneous and metamorphic rocks which are equivalent to those of the Georgia Piedmont. Potable and process waters are produced...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chai, X; Liu, L; Xing, L
Purpose: Visualization and processing of medical images and radiation treatment plan evaluation have traditionally been constrained to local workstations with limited computation power and limited ability for data sharing and software updates. We present a web-based image processing and planning evaluation platform (WIPPEP) for radiotherapy applications with high efficiency, ubiquitous web access, and real-time data sharing. Methods: This software platform consists of three parts: web server, image server, and computation server. Each independent server communicates with the others through HTTP requests. The web server is the key component: it provides visualizations and the user interface through front-end web browsers and relays information to the backend to process user requests. The image server serves as a PACS system. The computation server performs the actual image processing and dose calculation. The web server backend is developed using Java Servlets, and the frontend is developed using HTML5, JavaScript, and jQuery. The image server is based on the open-source DCM4CHEE PACS system. The computation server can be written in any programming language as long as it can send/receive HTTP requests. Our computation server was implemented in Delphi, Python, and PHP, which can process data directly or via a C++ program DLL. Results: This software platform runs on a 32-core CPU server virtually hosting the web server, image server, and computation servers separately. Users can visit our internal website with a Chrome browser, select a specific patient, visualize the images and RT structures belonging to that patient, and perform image segmentation on the Delphi computation server and Monte Carlo dose calculation on the Python or PHP computation servers. Conclusion: We have developed a web-based image processing and plan evaluation platform prototype for radiotherapy. This system has clearly demonstrated the feasibility of performing image processing and plan evaluation through a web browser and has exhibited potential for future cloud-based radiotherapy.
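To make the three-server architecture concrete, here is a minimal sketch, using Python's standard library, of a computation-server endpoint that accepts an HTTP request and returns a JSON result. The endpoint path and payload fields are invented for illustration; the actual platform used Delphi, Python, and PHP servers with their own interfaces.

```python
# Minimal sketch of the communication pattern described above: independent
# servers exchanging HTTP requests, shown as a stand-in computation server.
# The endpoint path and JSON fields are hypothetical, not the platform's API.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class ComputationHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/segment":              # hypothetical endpoint
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        request = json.loads(self.rfile.read(length) or b"{}")
        # Stand-in for the real work (image segmentation, dose calculation).
        result = {"patient_id": request.get("patient_id"), "status": "done"}
        body = json.dumps(result).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # The web server would POST user requests here and relay the response.
    HTTPServer(("localhost", 8080), ComputationHandler).serve_forever()
```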
Nolte, Kurt B; Stewart, Douglas M; O'Hair, Kevin C; Gannon, William L; Briggs, Michael S; Barron, A Marie; Pointer, Judy; Larson, Richard S
2008-10-01
The authors developed a novel continuous quality improvement (CQI) process for academic biomedical research compliance administration. A challenge in developing a quality improvement program in a nonbusiness environment is that the terminology and processes are often foreign. Rather than training staff in an existing quality improvement process, the authors opted to develop a novel process based on the scientific method--a paradigm familiar to all team members. The CQI process included our research compliance units. Unit leaders identified problems in compliance administration where a resolution would have a positive impact and which could be resolved or improved with current resources. They then generated testable hypotheses about a change to standard practice expected to improve the problem, and they developed methods and metrics to assess the impact of the change. The CQI process was managed in a "peer review" environment. The program included processes to reduce the incidence of infections in animal colonies, decrease research protocol-approval times, improve compliance and protection of animal and human research subjects, and improve research protocol quality. This novel CQI approach is well suited to the needs and the unique processes of research compliance administration. Using the scientific method as the improvement paradigm fostered acceptance of the project by unit leaders and facilitated the development of specific improvement projects. These quality initiatives will allow us to improve support for investigators while ensuring that compliance standards continue to be met. We believe that our CQI process can readily be used in other academically based offices of research.
Developing cloud-based Business Process Management (BPM): a survey
NASA Astrophysics Data System (ADS)
Mercia; Gunawan, W.; Fajar, A. N.; Alianto, H.; Inayatulloh
2018-03-01
In today's highly competitive business environment, modern enterprises face difficulties in cutting unnecessary costs, eliminating waste, and delivering huge benefits for the organization. Companies are increasingly turning to a more flexible IT environment to help them realize this goal. For this reason, the article applies cloud-based Business Process Management (BPM), which enables a focus on modeling, monitoring, and process management. Cloud-based BPM consists of business processes, business information, and IT resources, which help build real-time intelligence systems based on business management and cloud technology. Cloud computing is a paradigm that involves procuring dynamically scalable resources over the internet as an IT service. Cloud-based BPM also helps address common problems faced by traditional BPM, especially in promoting flexible, event-driven business processes that exploit opportunities in the marketplace.
Developing criterion-based competencies for tele-intensive care unit.
Schleifer, Sarah Joy; Carroll, Karen; Moseley, Marthe J
2014-01-01
Over the last 5 years, telemedicine has developed nursing roles that differ from traditional bedside care. In the midst of this transition, current competency development models focused on task completion may not be the most effective form of proficiency validation. The creation of competencies for the role of the tele-intensive care unit registered nurse requires a thoughtful process involving stakeholders from institutional leadership to frontline staff. The process must include stakeholder approval to ensure appropriate buy-in and follow-through on the agreed-upon criteria. This can be achieved using a standardized method of concept stimulation related to the behaviors expected of a telemedicine registered nurse, rather than a memorized list of tasks. This process serves as the foundation for the development of criterion-based competency statements, which in turn allow for clearer expectations. Continually reviewing the written competencies, ensuring current applicability, and revising as needed are necessities for maintaining competence and, therefore, patient safety.
Madsen, William C
2016-06-01
Across North America, community agencies and state/provincial jurisdictions are embracing family-centered approaches to service delivery that are grounded in strength-based, culturally responsive, accountable partnerships with families. This article details a collaborative consultation process to initiate and sustain organizational change toward this effort. It draws on innovative ideas from narrative theory, organizational development, and implementation science to highlight a three-component approach. This approach includes: the use of appreciative inquiry focus groups to elicit existing best practices; the provision of clinical training and ongoing coaching with practice leaders to build on those better moments and develop concrete practice frameworks; and leadership coaching and organizational consultation to develop organizational structures that institutionalize family-centered practice. While the article uses a principle-based practice framework, Collaborative Helping, to illustrate this process, the approach is applicable with a variety of clinical frameworks grounded in family-centered values and principles. © 2016 Family Process Institute.
Scientific and Regulatory Considerations in Solid Oral Modified Release Drug Product Development.
Li, Min; Sander, Sanna; Duan, John; Rosencrance, Susan; Miksinski, Sarah Pope; Yu, Lawrence; Seo, Paul; Rege, Bhagwant
2016-11-01
This review presents scientific and regulatory considerations for the development of solid oral modified release (MR) drug products. It includes a rationale for patient-focused development based on Quality-by-Design (QbD) principles. Product and process understanding of MR products includes identification and risk-based evaluation of critical material attributes (CMAs), critical process parameters (CPPs), and their impact on critical quality attributes (CQAs) that affect the clinical performance. The use of various biopharmaceutics tools that link the CQAs to a predictable and reproducible clinical performance for patient benefit is emphasized. Product and process understanding lead to a more comprehensive control strategy that can maintain product quality through the shelf life and the lifecycle of the drug product. The overall goal is to develop MR products that consistently meet the clinical objectives while mitigating the risks to patients by reducing the probability and increasing the detectability of CQA failures.
Food Processors Skills Building Project. Evaluation Report.
ERIC Educational Resources Information Center
White, Eileen Casey
The Food Processors Skills Building project was undertaken by four Oregon community colleges, with funds from the Oregon Economic Development Department and 11 local food processing companies, to address basic skills needs in the food processing industry through the development and implementation of an industry-specific curriculum. Based on…
ERIC Educational Resources Information Center
Stuckey, Heather L.; Taylor, Edward W.; Cranton, Patricia
2013-01-01
The purpose of this research was to develop an inclusive evaluation of "transformative learning theory" that encompassed varied perspectives of transformative learning. We constructed a validated quantitative survey to assess the potential outcomes and processes of how transformative learning may be experienced by college-educated…
Software quality and process improvement in scientific simulation codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ambrosiano, J.; Webster, R.
1997-11-01
This report contains viewgraphs on the quest to develop better simulation code quality through process modeling and improvement. This study is based on the experience of the authors and on interviews with ten subjects chosen from simulation code development teams at LANL. The study is descriptive rather than scientific.
Analyzing the Impact of a Data Analysis Process to Improve Instruction Using a Collaborative Model
ERIC Educational Resources Information Center
Good, Rebecca B.
2006-01-01
The Data Collaborative Model (DCM) assembles assessment literacy, reflective practices, and professional development into a four-component process. The sub-components include assessing students, reflecting over data, professional dialogue, professional development for the teachers, interventions for students based on data results, and re-assessing…
In-Space Manufacturing (ISM): Pioneering Space Exploration
NASA Technical Reports Server (NTRS)
Werkheiser, Niki
2015-01-01
ISM Objective: Develop and enable the manufacturing technologies and processes required to provide on-demand, sustainable operations for Exploration Missions. This includes development of the desired capabilities, as well as the required processes for the certification, characterization & verification that will enable these capabilities to become institutionalized via ground-based and ISS demonstrations.
Development and Validation of the Homeostasis Concept Inventory
ERIC Educational Resources Information Center
McFarland, Jenny L.; Price, Rebecca M.; Wenderoth, Mary Pat; Martinková, Patrícia; Cliff, William; Michael, Joel; Modell, Harold; Wright, Ann
2017-01-01
We present the Homeostasis Concept Inventory (HCI), a 20-item multiple-choice instrument that assesses how well undergraduates understand this critical physiological concept. We used an iterative process to develop a set of questions based on elements in the Homeostasis Concept Framework. This process involved faculty experts and undergraduate…
The Verification-based Analysis of Reliable Multicast Protocol
NASA Technical Reports Server (NTRS)
Wu, Yunqing
1996-01-01
Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP Multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems, and perform verification-based analysis on the formal RMP specifications. We also use the formal models of RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.
Data systems and computer science: Neural networks base R/T program overview
NASA Technical Reports Server (NTRS)
Gulati, Sandeep
1991-01-01
The research base, in the U.S. and abroad, for the development of neural network technology is discussed. The technical objectives are to develop and demonstrate adaptive, neural information processing concepts. The leveraging of external funding is also discussed.
Training tomorrow's clinicians today--managed care essentials: a process for curriculum development.
Colenda, C C; Wadland, W; Hayes, O; Anderson, W; Priester, F; Pearson, R; Keefe, C; Fleck, L
2000-05-01
To develop a managed care curriculum for primary care residents. This article outlines a 4-stage curriculum development process focusing on concepts of managed care organization and finance. The stages consist of: (1) identifying the curriculum development work group and framing the scope of the curriculum, (2) identifying stakeholder buy-in and expectations, (3) choosing curricular topics and delivery mechanisms, and (4) outlining the evaluation process. Key elements of building a curriculum development team, content objectives of the curriculum, the rationale for using problem-based learning, and finally, lessons learned from the partnership among the stakeholders are reviewed. The curriculum was delivered to an entering group of postgraduate-year 1 primary care residents. Attitudes among residents toward managed care remained relatively negative and stable over the yearlong curriculum, especially over issues relating to finance, quality of care, control and autonomy of practitioners, time spent with patients, and managed care's impact on the doctor-patient relationship. Residents' baseline knowledge of core concepts about managed care organization and finance improved during the year that the curriculum was delivered. Satisfaction with a problem-based learning approach was high. Problem-based learning, using real-life clinical examples, is a successful approach to resident instruction about managed care.
A Framework for Performing V&V within Reuse-Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward A.
1996-01-01
Verification and validation (V&V) is performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. Early discovery is important in order to minimize the cost and other impacts of correcting these errors. In order to provide early detection of errors, V&V is conducted in parallel with system development, often beginning with the concept phase. In reuse-based software engineering, however, decisions on the requirements, design and even implementation of domain assets can be made prior to beginning development of a specific system. In this case, V&V must be performed during domain engineering in order to have an impact on system development. This paper describes a framework for performing V&V within architecture-centric, reuse-based software engineering. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.
Kohlberg's theory of moral development: insights into rights reasoning.
Peens, B J; Louw, D A
2000-01-01
Kohlberg's theory of moral development was based on extensive research done on the reactions of people of all ages to specific moral situational dilemmas. Kohlberg was specifically interested in reasoning processes involved in decision-making. The way in which children perceive their rights is also based on reasoning processes that are inextricably linked to their level of development and more specifically to their level of moral development since the area of human rights can be considered essentially moral. Since Kohlberg's theory is primarily concerned with development, a great deal of insight can be gained into the developmental shift that occurs in children's reasoning about the rights to which they feel they should be entitled. This article focuses on Kohlberg's six-stage theory, specifically as it pertains to reasoning processes similar to those that would be used in rights reasoning. At each stage the authors propose a potential view of how children at each developmental stage might perceive their rights based on the description Kohlberg gives of the developmental trends associated with each stage. A critical assessment of Kohlberg's work is also given in order to highlight certain considerations about the limitations of this theory that need to be considered for future research.
Fuzzy control of burnout of multilayer ceramic actuators
NASA Astrophysics Data System (ADS)
Ling, Alice V.; Voss, David; Christodoulou, Leo
1996-08-01
To improve the yield and repeatability of the burnout process for multilayer ceramic actuators (MCAs), an intelligent processing of materials (IPM) based control system has been developed for the manufacture of MCAs. IPM involves the active (ultimately adaptive) control of a material process using empirical or analytical models and in situ sensing of critical process states (part features and process parameters) to modify the processing conditions in real time and achieve predefined product goals. Thus, the three enabling technologies for the IPM burnout control system are process modeling, in situ sensing, and intelligent control. This paper presents the design of an IPM-based control strategy for the burnout process of MCAs.
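As a toy illustration of the fuzzy-control element of such an IPM system, the Python sketch below maps a sensed process-state error to a heating-rate correction through triangular membership functions and a small rule base. The membership breakpoints, rule consequents, and variable names are invented for illustration and are not taken from the paper.

```python
import numpy as np

# Toy fuzzy controller (all breakpoints and the rule base are invented
# for illustration): sense a process-state error, e.g., a normalized
# off-gas concentration error, and adjust the furnace heating rate.

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def heating_rate_adjustment(error):
    """Mamdani-style single-input controller, defuzzified by weighted mean."""
    mu_neg = tri(error, -2.0, -1.0, 0.0)   # burnout running too slow
    mu_zero = tri(error, -1.0, 0.0, 1.0)   # on target
    mu_pos = tri(error, 0.0, 1.0, 2.0)     # burnout running too fast
    # Rule consequents: raise, hold, or lower the heating rate (deg C/min).
    outputs = np.array([+0.5, 0.0, -0.5])
    mu = np.array([mu_neg, mu_zero, mu_pos])
    return float((mu * outputs).sum() / (mu.sum() + 1e-9))

print(heating_rate_adjustment(0.6))  # small negative correction (-0.3)
```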
Masters, Kevin S; Ross, Kaile M; Hooker, Stephanie A; Wooldridge, Jennalee L
2018-05-18
There has been a notable disconnect between theories of behavior change and behavior change interventions. Because few interventions are both explicitly and adequately theory-based, investigators cannot assess the impact of theory on intervention effectiveness. Theory-based interventions, designed to deliberately engage the theory's proposed mechanisms of change, are needed to adequately test theories. Thus, systematic approaches to theory-based intervention development are needed. This article will introduce and discuss the psychometric method of developing theory-based interventions. The psychometric approach to intervention development utilizes basic psychometric principles at each step of the intervention development process in order to build a theoretically driven intervention to, subsequently, be tested in process (mechanism) and outcome studies. Five stages of intervention development are presented as follows: (i) Choice of theory; (ii) Identification and characterization of key concepts and expected relations; (iii) Intervention construction; (iv) Initial testing and revision; and (v) Empirical testing of the intervention. Examples of this approach from the Colorado Meaning-Activity Project (COMAP) are presented. Based on self-determination theory integrated with meaning or purpose, and utilizing a motivational interviewing approach, the COMAP intervention is individually based with an initial interview followed by smart phone-delivered interventions for increasing daily activity. The psychometric approach to intervention development is one method to ensure careful consideration of theory in all steps of intervention development. This structured approach supports developing a research culture that endorses deliberate and systematic operationalization of theory into behavior change intervention from the outset of intervention development.
Generalization bounds of ERM-based learning processes for continuous-time Markov chains.
Zhang, Chao; Tao, Dacheng
2012-12-01
Many existing results on statistical learning theory are based on the assumption that samples are independently and identically distributed (i.i.d.). However, the assumption of i.i.d. samples is not suitable for practical application to problems in which samples are time dependent. In this paper, we are mainly concerned with the empirical risk minimization (ERM) based learning process for time-dependent samples drawn from a continuous-time Markov chain. This learning process covers many kinds of practical applications, e.g., the prediction for a time series and the estimation of channel state information. Thus, it is significant to study its theoretical properties including the generalization bound, the asymptotic convergence, and the rate of convergence. It is noteworthy that, since samples are time dependent in this learning process, the concerns of this paper cannot (at least straightforwardly) be addressed by existing methods developed under the sample i.i.d. assumption. We first develop a deviation inequality for a sequence of time-dependent samples drawn from a continuous-time Markov chain and present a symmetrization inequality for such a sequence. By using the resultant deviation inequality and symmetrization inequality, we then obtain the generalization bounds of the ERM-based learning process for time-dependent samples drawn from a continuous-time Markov chain. Finally, based on the resultant generalization bounds, we analyze the asymptotic convergence and the rate of convergence of the learning process.
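For orientation, the standard i.i.d. ERM setup that the paper generalizes can be written as below; the paper's actual bounds replace the usual concentration step with a deviation inequality tailored to continuous-time Markov chains, so the displayed bound shape is only the familiar baseline, not the paper's result.

```latex
% Standard i.i.d. ERM baseline, for orientation only (capacity term and
% constants are schematic; the paper derives Markov-chain analogues):
\hat{f}_n = \operatorname*{arg\,min}_{f \in \mathcal{F}}
  \frac{1}{n} \sum_{i=1}^{n} \ell\bigl(f(x_i), y_i\bigr),
\qquad
\sup_{f \in \mathcal{F}} \left| \mathbb{E}\,\ell\bigl(f(x), y\bigr)
  - \frac{1}{n} \sum_{i=1}^{n} \ell\bigl(f(x_i), y_i\bigr) \right|
\le O\!\left( \sqrt{ \frac{ \mathrm{cap}(\mathcal{F}) + \ln(1/\delta) }{ n } } \right)
\quad \text{with probability } 1 - \delta .
```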
Image2000: A Free, Innovative, Java Based Imaging Package
NASA Technical Reports Server (NTRS)
Pell, Nicholas; Wheeler, Phil; Cornwell, Carl; Matusow, David; Obenschain, Arthur F. (Technical Monitor)
2001-01-01
The National Aeronautics and Space Administration (NASA) Goddard Space Flight Center's (GSFC) Scientific and Educational Endeavors (SEE) and the Center for Image Processing in Education (CIPE) use satellite image processing as part of their science lessons developed for students and educators. The image processing products that they use as part of these lessons no longer fulfill the needs of SEE and CIPE because these products are either dependent on a particular computing platform, hard to customize and extend, or lacking in functionality. SEE and CIPE began looking for what they considered the "perfect" image processing tool: one that was platform independent, rich in functionality, and easily extended and customized for their purposes. At the request of SEE, the Advanced Architectures and Automation Branch (Code 588) at NASA's GSFC developed a powerful new Java-based image processing package to support these endeavors.
Comparative evaluation of urban storm water quality models
NASA Astrophysics Data System (ADS)
Vaze, J.; Chiew, Francis H. S.
2003-10-01
The estimation of urban storm water pollutant loads is required for the development of mitigation and management strategies to minimize impacts to receiving environments. Event pollutant loads are typically estimated using either regression equations or "process-based" water quality models. The relative merit of using regression models compared to process-based models is not clear. A modeling study is carried out here to evaluate the comparative ability of the regression equations and process-based water quality models to estimate event diffuse pollutant loads from impervious surfaces. The results indicate that, once calibrated, both the regression equations and the process-based model can estimate event pollutant loads satisfactorily. In fact, the loads estimated using the regression equation as a function of rainfall intensity and runoff rate are better than the loads estimated using the process-based model. Therefore, if only estimates of event loads are required, regression models should be used because they are simpler and require less data compared to process-based models.
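As a concrete illustration of the regression-model alternative, the Python sketch below fits an event-load power function of rainfall intensity and runoff rate by least squares in log space. The functional form and the synthetic data are assumptions for illustration, not the equations calibrated in the study.

```python
import numpy as np

# Hedged sketch of the regression-model idea: fit event pollutant load as
#   L = a * I^b * Q^c
# (I = rainfall intensity, Q = runoff rate) by ordinary least squares in
# log space. Functional form and data are illustrative, not the study's.
rng = np.random.default_rng(1)
I = rng.uniform(2, 40, 50)    # rainfall intensity (mm/h)
Q = rng.uniform(0.5, 10, 50)  # runoff rate (L/s)
L = 0.8 * I**0.9 * Q**0.5 * rng.lognormal(0, 0.1, 50)  # synthetic loads

# log L = log a + b log I + c log Q  is linear in the unknowns.
X = np.column_stack([np.ones_like(I), np.log(I), np.log(Q)])
coef, *_ = np.linalg.lstsq(X, np.log(L), rcond=None)
a, b, c = np.exp(coef[0]), coef[1], coef[2]
print(f"L ~ {a:.2f} * I^{b:.2f} * Q^{c:.2f}")
```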
Schultz, Karen; Griffiths, Jane
2016-05-01
In 2009-2010, the postgraduate residency training program at the Department of Family Medicine, Queen's University, wrestled with the practicalities of competency-based medical education (CBME) implementation when its accrediting body, the College of Family Physicians of Canada, introduced the competency-based Triple C curriculum. The authors used a stepwise approach to implement CBME; the steps were to (1) identify objectives, (2) identify competencies, (3) map objectives and competencies to learning experiences and assessment processes, (4) plan learning experiences, (5) develop an assessment system, (6) collect and interpret data, (7) adjust individual residents' training programs, and (8) distribute decisions to stakeholders. The authors also note overarching processes, costs, and facilitating factors, as well as processes or steps that would have been helpful for CBME implementation. Early outcomes are encouraging. Residents are being directly observed more often, with increased documented feedback about performance based on explicit competency standards (24,000 data points for 150 residents from 2013 to 2015). These multiple observations are being collated in a way that allows the identification of patterns of performance, red flags, and competency development trajectories. Outliers are being identified earlier, resulting in earlier individualized modification of their residency training programs. The authors will continue to provide and refine faculty development, are developing an entrustable professional activity field note app for handheld devices, and are undertaking research to explore what facilitates learners' competency development, what increases assessors' confidence in making competence decisions, and whether residents are better trained as a result of CBME implementation.
Kim, Dokyoon; Lee, Nohyun; Park, Yong Il; Hyeon, Taeghwan
2017-01-18
Several types of nanoparticle-based imaging probes have been developed to replace conventional luminescent probes. For luminescence imaging, near-infrared (NIR) probes are useful in that they allow deep tissue penetration and high spatial resolution as a result of reduced light absorption/scattering and negligible autofluorescence in biological media. They rely on either an anti-Stokes or a Stokes shift process to generate luminescence. For example, transition metal-doped semiconductor nanoparticles and lanthanide-doped inorganic nanoparticles have been demonstrated as anti-Stokes shift-based agents that absorb NIR light through a two- or three-photon absorption process and an upconversion process, respectively. On the other hand, quantum dots (QDs) and lanthanide-doped nanoparticles that emit in the NIR-II range (∼1000 to ∼1350 nm) have been suggested as promising Stokes shift-based imaging agents. In this topical review, we summarize and discuss the recent progress in the development of inorganic nanoparticle-based luminescence imaging probes working in the NIR range.
Safdari, Reza; Ghazisaeedi, Marjan; Mirzaee, Mahboobeh; Farzi, Jebrail; Goodini, Azadeh
2014-01-01
Dynamic reporting tools, such as dashboards, should be developed to measure emergency department (ED) performance. However, choosing an effective, balanced set of performance measures and key performance indicators (KPIs) is a major challenge in accomplishing this. The aim of this study was to develop a balanced set of KPIs for use in ED strategic dashboards following the analytic hierarchy process (AHP). The study was carried out in 2 phases: constructing ED performance measures based on balanced scorecard perspectives and incorporating them into an AHP framework to select the final KPIs. The respondents placed the most importance on the ED internal processes perspective, especially on measures related to the timeliness and accessibility of care in the ED. Some measures from the financial, customer, and learning and growth perspectives were also selected as top KPIs. Measures of care effectiveness and care safety were ranked as the next priorities. The respondents placed the least importance on disease-/condition-specific "time to" measures. The methodology can serve as a reference model for the development of KPIs in various performance-related areas based on a consistent and fair approach. Dashboards designed based on such a balanced set of KPIs will help establish comprehensive performance measurement and fair benchmarks and comparisons.
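For readers unfamiliar with the AHP step, the Python sketch below derives priority weights for four balanced scorecard perspectives from a pairwise-comparison matrix via its principal eigenvector, along with Saaty's consistency ratio. The comparison values are invented for illustration, not the study's elicited judgments.

```python
import numpy as np

# AHP priority derivation (matrix entries are illustrative, e.g.,
# "internal processes" judged 3x as important as "financial"; they are
# not the study's elicited judgments).
A = np.array([
    [1.0, 3.0, 2.0, 4.0],  # internal processes
    [1/3, 1.0, 1.0, 2.0],  # financial
    [1/2, 1.0, 1.0, 2.0],  # customer
    [1/4, 1/2, 1/2, 1.0],  # learning & growth
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                            # priority (weight) vector

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)    # consistency index
cr = ci / 0.90                          # Saaty's random index for n=4
print("weights:", np.round(w, 3), " CR:", round(cr, 3))
```

A consistency ratio below roughly 0.1 is conventionally taken to mean the pairwise judgments are acceptably consistent.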
Microstructure based procedure for process parameter control in rolling of aluminum thin foils
NASA Astrophysics Data System (ADS)
Kronsteiner, Johannes; Kabliman, Evgeniya; Klimek, Philipp-Christoph
2018-05-01
In the present work, a microstructure-based procedure is used for the numerical prediction of the strength properties of Al-Mg-Sc thin foils during a hot rolling process. For this purpose, the following techniques were developed and implemented. First, a toolkit was developed for the numerical analysis of experimental stress-strain curves obtained during hot compression testing with a deformation dilatometer. The implemented techniques allow for the correction of the temperature increase in samples due to adiabatic heating and for the determination of the yield strength needed to separate the elastic and plastic deformation regimes during numerical simulation of multi-pass hot rolling. Next, an asymmetric hot rolling simulator (with adjustable table inlet/outlet height as well as separate roll infeed) was developed in order to match the exact processing conditions of a semi-industrial rolling procedure. At each element of a finite element mesh, the total strength is calculated by an in-house flow stress model based on the evolution of the mean dislocation density. The strength values obtained by numerical modeling were found to be in reasonable agreement with the results of tensile tests on thin Al-Mg-Sc foils. Thus, the proposed simulation procedure makes it possible to optimize the processing parameters with respect to microstructure development.
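The abstract does not spell out the in-house flow stress model, but one widely used one-internal-variable form that matches its description (strength driven by the evolution of the mean dislocation density) is the Kocks-Mecking evolution law combined with a Taylor-type stress term, shown here for orientation only:

```latex
% Classic one-internal-variable form, for orientation (the in-house
% model's exact terms are not given in the abstract): dislocation storage
% minus dynamic recovery, plus a Taylor-type strengthening contribution.
\frac{\mathrm{d}\rho}{\mathrm{d}\varepsilon}
  = k_1 \sqrt{\rho} - k_2(\dot{\varepsilon}, T)\, \rho,
\qquad
\sigma = \sigma_0 + \alpha\, M\, G\, b\, \sqrt{\rho} ,
```

where $\rho$ is the mean dislocation density, $k_1$ governs storage, $k_2$ governs dynamic recovery (strain-rate- and temperature-dependent), and $\alpha M G b \sqrt{\rho}$ is the dislocation strengthening term.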
Process Based on SysML for New Launchers System and Software Developments
NASA Astrophysics Data System (ADS)
Hiron, Emmanuel; Miramont, Philippe
2010-08-01
The purpose of this paper is to present the Astrium-ST engineering process based on SysML. This process is currently being set up in the framework of joint CNES/Astrium-ST R&T studies related to Ariane 5 electrical system and flight software modelling. The tool used to set up this process is Rhapsody release 7.3 from IBM [1]. The process focuses on the system engineering phase dedicated to software, with the objective of generating both system documents (sequential system design and flight control) and software specifications.
Framework Support For Knowledge-Based Software Development
NASA Astrophysics Data System (ADS)
Huseth, Steve
1988-03-01
The advent of personal engineering workstations has brought substantial information processing power to the individual programmer. Advanced tools and environment capabilities supporting the software lifecycle are just beginning to become generally available. However, many of these tools are addressing only part of the software development problem by focusing on rapid construction of self-contained programs by a small group of talented engineers. Additional capabilities are required to support the development of large programming systems where a high degree of coordination and communication is required among large numbers of software engineers, hardware engineers, and managers. A major player in realizing these capabilities is the framework supporting the software development environment. In this paper we discuss our research toward a Knowledge-Based Software Assistant (KBSA) framework. We propose the development of an advanced framework containing a distributed knowledge base that can support the data representation needs of tools, provide environmental support for the formalization and control of the software development process, and offer a highly interactive and consistent user interface.
What defines mindfulness-based programs? The warp and the weft.
Crane, R S; Brewer, J; Feldman, C; Kabat-Zinn, J; Santorelli, S; Williams, J M G; Kuyken, W
2017-04-01
There has been an explosion of interest in mindfulness-based programs (MBPs) such as Mindfulness-Based Stress Reduction (MBSR) and Mindfulness-Based Cognitive Therapy. This is demonstrated in increased research, implementation of MBPs in healthcare, educational, criminal justice and workplace settings, and in mainstream interest. For the sustainable development of the field there is a need to articulate a definition of what an MBP is and what it is not. This paper provides a framework to define the essential characteristics of the family of MBPs originating from the parent program MBSR, and the processes which inform adaptations of MBPs for different populations or contexts. The framework addresses the essential characteristics of the program and of the teacher. MBPs: are informed by theories and practices that draw from a confluence of contemplative traditions, science, and the major disciplines of medicine, psychology and education; are underpinned by a model of human experience which addresses the causes of human distress and the pathways to relieving it; develop a new relationship with experience characterized by present moment focus, decentering and an approach orientation; catalyze the development of qualities such as joy, compassion, wisdom, equanimity and greater attentional, emotional and behavioral self-regulation; and engage participants in a sustained intensive training in mindfulness meditation practice, in an experiential inquiry-based learning process and in exercises to develop understanding. The paper's aim is to support clarity, which will in turn support the systematic development of MBP research, and the integrity of the field during the process of implementation in the mainstream.
Biotechnology in Food Production and Processing
NASA Astrophysics Data System (ADS)
Knorr, Dietrich; Sinskey, Anthony J.
1985-09-01
The food processing industry is the oldest and largest industry using biotechnological processes. Further development of food products and processes based on biotechnology depends upon the improvement of existing processes, such as fermentation, immobilized biocatalyst technology, and production of additives and processing aids, as well as the development of new opportunities for food biotechnology. Improvements are needed in the characterization, safety, and quality control of food materials, in processing methods, in waste conversion and utilization processes, and in currently used food microorganism and tissue culture systems. Also needed are fundamental studies of the structure-function relationship of food materials and of the cell physiology and biochemistry of raw materials.
Plasma Processing of Model Residential Solid Waste
NASA Astrophysics Data System (ADS)
Messerle, V. E.; Mossé, A. L.; Nikonchuk, A. N.; Ustimenko, A. B.; Baimuldin, R. V.
2017-09-01
The authors have tested a technology for processing model residential solid waste. They have developed and built a pilot plasma unit based on a plasma-chamber incinerator. The waste processing technology has been tested and prepared for commercialization.
Indicators and Metrics for Evaluating the Sustainability of Chemical Processes
A metric-based method, called GREENSCOPE, has been developed for evaluating process sustainability. Using lab-scale information and engineering assumptions, the method evaluates full-scale representations of processes in the environmental, efficiency, energy, and economic areas. The m...
Panday, Sorab; Langevin, Christian D.; Niswonger, Richard G.; Ibaraki, Motomu; Hughes, Joseph D.
2013-01-01
A new version of MODFLOW, called MODFLOW–USG (for UnStructured Grid), was developed to support a wide variety of structured and unstructured grid types, including nested grids and grids based on prismatic triangles, rectangles, hexagons, and other cell shapes. Flexibility in grid design can be used to focus resolution along rivers and around wells, for example, or to subdiscretize individual layers to better represent hydrostratigraphic units. MODFLOW–USG is based on an underlying control volume finite difference (CVFD) formulation in which a cell can be connected to an arbitrary number of adjacent cells. To improve accuracy of the CVFD formulation for irregular grid-cell geometries or nested grids, a generalized Ghost Node Correction (GNC) Package was developed, which uses interpolated heads in the flow calculation between adjacent connected cells. MODFLOW–USG includes a Groundwater Flow (GWF) Process, based on the GWF Process in MODFLOW–2005, as well as a new Connected Linear Network (CLN) Process to simulate the effects of multi-node wells, karst conduits, and tile drains, for example. The CLN Process is tightly coupled with the GWF Process in that the equations from both processes are formulated into one matrix equation and solved simultaneously. This robustness results from using an unstructured grid with unstructured matrix storage and solution schemes. MODFLOW–USG also contains an optional Newton-Raphson formulation, based on the formulation in MODFLOW–NWT, for improving solution convergence and avoiding problems with the drying and rewetting of cells. Because the existing MODFLOW solvers were developed for structured and symmetric matrices, they were replaced with a new Sparse Matrix Solver (SMS) Package developed specifically for MODFLOW–USG. The SMS Package provides several methods for resolving nonlinearities and multiple symmetric and asymmetric linear solution schemes to solve the matrix arising from the flow equations and the Newton-Raphson formulation, respectively.
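The CVFD formulation mentioned above can be summarized, for orientation, by the schematic cell balance below, in which a cell may have an arbitrary number of connected neighbors; the precise conductance and storage terms follow the MODFLOW-USG documentation rather than this simplified form.

```latex
% Schematic CVFD balance for cell $n$ with an arbitrary set of connected
% neighbors $m \in \eta(n)$ (notation simplified for illustration):
\sum_{m \in \eta(n)} C_{nm} \bigl( h_m - h_n \bigr) + Q_n
  = \mathrm{SS}_n\, V_n\, \frac{\partial h_n}{\partial t} ,
```

where $h$ is head, $C_{nm}$ is the conductance of the connection, $Q_n$ collects sources and sinks, and $\mathrm{SS}_n V_n$ is the cell's storage term. Because $\eta(n)$ is just a list of connections, the same equation serves structured, nested, and fully unstructured grids.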
Case study: Lockheed-Georgia Company integrated design process
NASA Technical Reports Server (NTRS)
Waldrop, C. T.
1980-01-01
A case study of the development of an Integrated Design Process is presented. The approach taken in preparing for the development of an integrated design process includes some of the IPAD approaches such as developing a Design Process Model, cataloging Technical Program Elements (TPE's), and examining data characteristics and interfaces between contiguous TPE's. The implementation plan is based on an incremental development of capabilities over a period of time with each step directed toward, and consistent with, the final architecture of a total integrated system. Because of time schedules and different computer hardware, this system will not be the same as the final IPAD release; however, many IPAD concepts will no doubt prove applicable as the best approach. Full advantage will be taken of the IPAD development experience. A scenario that could be typical for many companies, even outside the aerospace industry, in developing an integrated design process for an IPAD-type environment is represented.
Developing a probability-based model of aquifer vulnerability in an agricultural region
NASA Astrophysics Data System (ADS)
Chen, Shih-Kai; Jang, Cheng-Shin; Peng, Yi-Huei
2013-04-01
Hydrogeological settings of aquifers strongly influence regional groundwater movement and pollution processes. Establishing a map of aquifer vulnerability is critical for planning a scheme of groundwater quality protection. This study developed a novel probability-based DRASTIC model of aquifer vulnerability in the Choushui River alluvial fan, Taiwan, using indicator kriging, and determined various risk categories of contamination potential based on the estimated vulnerability indexes. Categories and ratings of the six parameters in the probability-based DRASTIC model were probabilistically characterized according to two parameter classification methods: selecting the maximum estimation probability and calculating an expected value. Moreover, the probability-based estimation and assessment gave excellent insight into propagating the uncertainty of parameters due to limited observation data. To examine the model's capacity to predict pollution, the medium, high, and very high risk categories of contamination potential were compared with observed nitrate-N concentrations exceeding 0.5 mg/L, indicating anthropogenic groundwater pollution. The results reveal that the developed probability-based DRASTIC model is capable of predicting high nitrate-N groundwater pollution and of characterizing parameter uncertainty via the probability estimation processes.
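For orientation, the DRASTIC aggregation underlying the model is a weighted sum of parameter ratings; in the probability-based variant described above, each rating is characterized probabilistically, for example as an expected value over kriged category probabilities. A schematic form, with notation chosen here for illustration:

```latex
% Standard DRASTIC aggregation, notation illustrative: the vulnerability
% index DI is a weighted sum of parameter ratings; in the probability-based
% variant each rating enters as an expected value over kriged category
% probabilities at the location of interest.
\mathrm{DI} = \sum_{i} W_i\, R_i ,
\qquad
R_i = \mathbb{E}[r_i] = \sum_{k} r_{i,k}\, \Pr\{\text{category } k\} .
```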
Durvasula, Raghu; Kelly, Janet; Schleyer, Anneliese; Anawalt, Bradley D.; Somani, Shabir; Dellit, Timothy H.
2018-01-01
Background: As healthcare costs rise and reimbursements decrease, healthcare organization leadership and clinical providers must collaborate to provide high-value healthcare. Medications are a key driver of the increasing cost of healthcare, largely as a result of the proliferation of expensive specialty drugs, including biologic agents. Such medications contribute significantly to the inpatient diagnosis-related group payment system, often with minimal or unproved benefit over less-expensive therapies. Objective: To describe a systematic review process to reduce non–evidence-based inpatient use of high-cost medications across a large multihospital academic health system. Methods: We created a Pharmacy & Therapeutics subcommittee consisting of clinicians, pharmacists, and an ethics representative. This committee developed a standardized process for a timely review (<48 hours) and approval of high-cost medications based on their clinical effectiveness, safety, and appropriateness. The engagement of clinical experts in the development of the consensus-based guidelines for the use of specific medications facilitated the clinicians' acceptance of the review process. Results: Over a 2-year period, a total of 85 patient-specific requests underwent formal review. All reviews were conducted within 48 hours. This review process has reduced the non–evidence-based use of specialty medications and has resulted in a pharmacy savings of $491,000 in fiscal year 2016, with almost 80% of the savings occurring in the last 2 quarters, because our process has matured. Conclusion: The creation of a collaborative review process to ensure consistent, evidence-based utilization of high-cost medications provides value-based care, while minimizing unnecessary practice variation and reducing the cost of inpatient care.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
2015-06-15
Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100 that has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how these tools can be used in a given radiotherapy clinic to develop a risk-based quality management program. Learning Objectives: (1) Learn how to design a process map for a radiotherapy process; (2) learn how to perform failure modes and effects analysis for a given process; (3) learn what fault trees are all about; (4) learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Dunscombe: Director, TreatSafely, LLC and the Center for the Assessment of Radiological Sciences; consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President, Center for the Assessment of Radiological Sciences.
TU-AB-BRD-03: Fault Tree Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dunscombe, P.
2015-06-15
Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100 that has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how these tools can be used in a given radiotherapy clinic to develop a risk-based quality management program. Learning Objectives: (1) Learn how to design a process map for a radiotherapy process; (2) learn how to perform failure modes and effects analysis for a given process; (3) learn what fault trees are all about; (4) learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Dunscombe: Director, TreatSafely, LLC and the Center for the Assessment of Radiological Sciences; consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President, Center for the Assessment of Radiological Sciences.
TU-AB-BRD-02: Failure Modes and Effects Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huq, M.
2015-06-15
NASA Astrophysics Data System (ADS)
Norlina, M. S.; Diyana, M. S. Nor; Mazidah, P.; Rusop, M.
2016-07-01
In the RF magnetron sputtering process, the desirable layer properties are largely influenced by the process parameters and conditions. If the quality of the thin film does not reach its intended level, the experiments have to be repeated until the desired quality is met. This research proposes the Gravitational Search Algorithm (GSA) as an optimization model to reduce the time and cost of thin film fabrication. The optimization model's engine has been developed using Java. The model is based on the GSA concept, which is inspired by the Newtonian laws of gravity and motion. In this research, the model is expected to optimize four deposition parameters: RF power, deposition time, oxygen flow rate, and substrate temperature. The results are promising, and the model performs satisfactorily on this parameter optimization problem. Future work could compare GSA with other nature-inspired algorithms and test them on various data sets.
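To make the mechanics concrete, here is a minimal sketch of a GSA update loop in Python. The objective function is a toy stand-in for the actual thin-film quality model, and the bounds, population size, G0, and alpha values are illustrative assumptions, not values from the study.

```python
# Minimal GSA update loop for deposition-parameter optimization (a sketch).
# The objective below is a toy stand-in for the thin-film quality model;
# bounds, population size, G0 and alpha are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def objective(X):
    # Stand-in quality penalty over normalized (RF power, deposition time,
    # oxygen flow rate, substrate temperature); lower is better.
    return np.sum((X - 0.5) ** 2, axis=1)

n_agents, n_dims, n_iters = 20, 4, 100
G0, alpha, eps = 100.0, 20.0, 1e-9
X = rng.uniform(0.0, 1.0, (n_agents, n_dims))   # agents = parameter sets
V = np.zeros_like(X)

for t in range(n_iters):
    fit = objective(X)
    best, worst = fit.min(), fit.max()
    m = (fit - worst) / (best - worst + eps)     # best agent gets mass ~1
    M = m / (m.sum() + eps)                      # normalized masses
    G = G0 * np.exp(-alpha * t / n_iters)        # gravity decays over time
    A = np.zeros_like(X)
    for i in range(n_agents):
        diff = X - X[i]                          # vectors towards other agents
        R = np.linalg.norm(diff, axis=1)         # Euclidean distances
        w = rng.random(n_agents)                 # stochastic force weighting
        A[i] = (w * G * M / (R + eps)) @ diff    # net "gravitational" pull
    V = rng.random(X.shape) * V + A              # inertia + acceleration
    X = np.clip(X + V, 0.0, 1.0)                 # keep parameters in bounds

print("best parameter set found:", X[np.argmin(objective(X))])
```

Agent masses follow fitness, so better parameter sets pull the population toward themselves while the decaying gravitational constant shifts the search from exploration to exploitation.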
A single FPGA-based portable ultrasound imaging system for point-of-care applications.
Kim, Gi-Duck; Yoon, Changhan; Kye, Sang-Bum; Lee, Youngbae; Kang, Jeeun; Yoo, Yangmo; Song, Tai-kyong
2012-07-01
We present a cost-effective portable ultrasound system based on a single field-programmable gate array (FPGA) for point-of-care (POC) applications. In the portable ultrasound system developed, all the ultrasound signal and image processing modules, including an effective 32-channel receive beamformer with pseudo-dynamic focusing, are embedded in an FPGA chip. For overall system control, a mobile processor running Linux at 667 MHz is used. The scan-converted ultrasound image data from the FPGA are directly transferred to the system controller via external direct memory access without a video processing unit. The portable ultrasound system developed can provide real-time B-mode imaging at a maximum rate of 30 frames per second, and it has a battery life of approximately 1.5 h. These results indicate that the single FPGA-based portable ultrasound system developed is able to meet the processing requirements of medical ultrasound imaging while providing improved flexibility for adapting to emerging POC applications.
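The core operation such a beamformer implements in hardware is delay-and-sum focusing. The sketch below shows a simple software version for a single focal point; the array geometry, sampling rate, plane-wave transmit-delay assumption, and synthetic echo are all illustrative.

```python
# Minimal delay-and-sum receive beamforming sketch for one focal point.
# Array geometry, sampling rate and the synthetic echo are illustrative.
import numpy as np

c, fs = 1540.0, 40e6                 # speed of sound (m/s), sample rate (Hz)
n_ch, pitch = 32, 0.3e-3             # 32 receive channels, element pitch (m)
x_el = (np.arange(n_ch) - (n_ch - 1) / 2) * pitch   # element x-positions

def beamform_point(rf, x, z):
    """Sum channel samples aligned to the round-trip delay for point (x, z)."""
    d_rx = np.sqrt((x_el - x) ** 2 + z ** 2)        # receive path per element
    delays = (z + d_rx) / c                         # transmit (plane-wave) + receive
    idx = np.clip(np.round(delays * fs).astype(int), 0, rf.shape[1] - 1)
    return rf[np.arange(n_ch), idx].sum()

# synthetic RF data: one echo per channel from a scatterer at 20 mm depth
rf = np.zeros((n_ch, 4096))
scat_x, scat_z = 0.0, 20e-3
d = np.sqrt((x_el - scat_x) ** 2 + scat_z ** 2)
for ch, t in enumerate((scat_z + d) / c):
    rf[ch, int(round(t * fs))] = 1.0

print(beamform_point(rf, scat_x, scat_z))   # coherent sum across 32 channels
```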
Bayramzadeh, Sara; Joseph, Anjali; Allison, David; Shultz, Jonas; Abernathy, James
2018-07-01
This paper describes the process and tools developed as part of a multidisciplinary collaborative simulation-based approach for iterative design and evaluation of operating room (OR) prototypes. Full-scale physical mock-ups of healthcare spaces offer an opportunity to actively communicate with and engage multidisciplinary stakeholders in the design process. While mock-ups are increasingly being used in healthcare facility design projects, they are rarely evaluated in a manner that supports active user feedback and engagement. Researchers and architecture students worked closely with clinicians and architects to develop OR design prototypes and engaged clinical end-users in simulated scenarios. An evaluation toolkit was developed to compare design prototypes. The mock-up evaluation helped the team make key decisions about room size, location of the OR table, intra-room zoning, and door locations. Structured simulation-based mock-up evaluations conducted during the design process can help stakeholders visualize their future workspace and provide active feedback. Copyright © 2018 Elsevier Ltd. All rights reserved.
Development of a Powder Metallurgy Superalloy for Use at 1800-2000 F (980-1090 C)
NASA Technical Reports Server (NTRS)
Kortovich, C. S.
1973-01-01
A program was conducted to develop a powder metallurgy nickel-base superalloy for 1800-2000 F (980-1090 C) temperature applications. The feasibility of a unique concept for alloying carbon into a superalloy powder matrix and achieving both grain growth and a discrete particle grain boundary carbide precipitation was demonstrated. The process consisted of blending metastable carbides with a carbon free base alloy and consolidating this blend by hot extrusion. This was followed by heat treatment to grow a desired ASTM No. 2-3 grain size and to solution the metastable carbides to allow precipitation of discrete particle grain boundary carbides during subsequent aging heat treatments. The best alloy developed during this program was hydrogen-atomized, thermal-mechanically processed, modified MAR-M246 base alloy plus VC (0.28 w/o C). Although below those for cast MAR-M246, the mechanical properties exhibited by this alloy represent the best combination offered by conventional powder metallurgy processing to date.
Wang, Ning; Björvell, Catrin; Hailey, David; Yu, Ping
2014-12-01
To develop an Australian instrument, the Quality of Australian Nursing Documentation in Aged Care (QANDAC), to measure the quality of paper-based and electronic resident records. The instrument was based on the nursing process model and on three attributes of documentation quality identified in a systematic review. The development process involved five phases following approaches to designing criterion-referenced measures. The face and content validities and the inter-rater reliability of the instrument were estimated using a focus group approach and a consensus model. The instrument contains 34 questions in three sections: completion of nursing history and assessment, description of the care process, and meeting the requirements of data entry. Estimates of the validity and inter-rater reliability of the instrument gave satisfactory results. The QANDAC instrument may be a useful audit tool for quality improvement and research in aged care documentation. © 2013 ACOTA.
Kazis, Lewis E; Sheridan, Robert L; Shapiro, Gabriel D; Lee, Austin F; Liang, Matthew H; Ryan, Colleen M; Schneider, Jeffrey C; Lydon, Martha; Soley-Bori, Marina; Sonis, Lily A; Dore, Emily C; Palmieri, Tina; Herndon, David; Meyer, Walter; Warner, Petra; Kagan, Richard; Stoddard, Frederick J; Murphy, Michael; Tompkins, Ronald G
2018-04-01
There has been little systematic examination of variation in pediatric burn care clinical practices and its effect on outcomes. As a first step, current clinical care processes need to be operationally defined. The highly specialized burn care units of the Shriners Hospitals for Children system present an opportunity to describe the processes of care. The aim of this study was to develop a set of process-based measures for pediatric burn care and examine adherence to them by providers in a cohort of pediatric burn patients. We conducted a systematic literature review to compile a set of process-based indicators. These measures were refined by an expert panel of burn care providers, yielding 36 process-based indicators in four clinical areas: initial evaluation and resuscitation, acute excisional surgery and critical care, psychosocial and pain control, and reconstruction and aftercare. We assessed variability in adherence to the indicators in a cohort of 1,076 children with burns at four regional pediatric burn programs in the Shriners Hospital system. The percentages of the cohort at each of the four sites were as follows: Boston, 20.8%; Cincinnati, 21.1%; Galveston, 36.0%; and Sacramento, 22.1%. The cohort included children who received care between 2006 and 2010. Adherence to the process indicators varied both across sites and by clinical area. Adherence was lowest for the clinical areas of acute excisional surgery and critical care, with a range of 35% to 48% across sites, followed by initial evaluation and resuscitation (range, 34%-60%). In contrast, the clinical areas of psychosocial and pain control and reconstruction and aftercare had relatively high adherence across sites, with ranges of 62% to 93% and 71% to 87%, respectively. Of the 36 process indicators, 89% differed significantly in adherence between clinical sites (p < 0.05). Acute excisional surgery and critical care exhibited the most variability. The development of this set of process-based measures represents an important step in the assessment of clinical practice in pediatric burn care. Substantial variation was observed in practices of pediatric burn care. However, further research is needed to link these process-based measures to clinical outcomes. Therapeutic/care management, level IV.
ERIC Educational Resources Information Center
Pandey, Anjali
2012-01-01
This article calls for a rethinking of pure process-based approaches in the teaching of second language writers in the middle school classroom. The author provides evidence from a detailed case study of the writing of a Korean middle school student in a U.S. school setting to make a case for rethinking the efficacy of classic process-based…
IPSE: A knowledge-based system for fluidization studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reddy, S.
1991-01-01
Chemical engineers use mathematical simulators to design, model, optimize, and refine various engineering plants/processes. This procedure requires the following steps: (1) preparation of an input data file according to the format required by the target simulator; (2) executing the simulation; and (3) analyzing the results of the simulation to determine if all specified "goals" are satisfied. If the goals are not met, the input data file must be modified and the simulation repeated. This multistep process is continued until satisfactory results are obtained. This research was undertaken to develop a knowledge-based system, IPSE (Intelligent Process Simulation Environment), that can enhance the productivity of chemical engineers/modelers by serving as an intelligent assistant to perform a variety of tasks related to process simulation. ASPEN, a simulator widely used by the US Department of Energy (DOE) at the Morgantown Energy Technology Center (METC), was selected as the target process simulator in the project. IPSE, written in the C language, was developed using a number of knowledge-based programming paradigms: object-oriented knowledge representation that uses inheritance and methods, rule-based inferencing (including processing and propagation of probabilistic information), and data-driven programming using demons. It was implemented using the knowledge-based environment LASER. The relationship of IPSE with the user, ASPEN, LASER, and the C language is shown in Figure 1.
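The simulate/check-goals/revise-input cycle that IPSE automates can be illustrated with a toy loop. The simulate function and the single rule below are hypothetical stand-ins, not ASPEN or IPSE's actual rule base.

```python
# Toy sketch of the simulate / check-goals / revise-input loop that a
# knowledge-based assistant automates around a process simulator.
import math

def simulate(inputs):
    # Hypothetical stand-in for launching the target simulator on an input file.
    return {"conversion": 1.0 - math.exp(-(inputs["temperature"] - 450.0) / 100.0)}

goals = {"conversion": lambda v: v >= 0.90}

# rule base: (condition on the results, revision applied to the inputs)
rules = [
    (lambda r: r["conversion"] < 0.90,
     lambda i: i.update(temperature=i["temperature"] + 10.0)),
]

inputs = {"temperature": 500.0}
for iteration in range(50):
    results = simulate(inputs)
    if all(check(results[k]) for k, check in goals.items()):
        print(f"goals met after {iteration} revisions:", inputs, results)
        break
    for condition, revise in rules:
        if condition(results):
            revise(inputs)   # data-driven revision, akin to a demon firing
```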
The Effect of Web-Based Portfolio Use on Academic Achievement and Retention
ERIC Educational Resources Information Center
Guzeller, Cem Oktay
2012-01-01
The web-based portfolio emerged as a result of the influence of technological developments on educational practices. In this study, the effect of the web-based portfolio building process on academic achievement and retention is explored. For this purpose, a study platform known as a computer-assisted personal development portfolio was designed for…
Is Truthiness Enough? Classroom Activities for Encouraging Evidence-Based Critical Thinking
ERIC Educational Resources Information Center
Kraus, Sue; Sears, Sharon R.; Burke, Brian L.
2013-01-01
Teaching students how to think critically and develop lifelong habits of evidence-based inquiry outside of the classroom is a primary goal for educators today. This paper describes nine activities designed to promote evidence-based critical thinking in college or high school classrooms in any discipline. We have developed a seven step process for…
Informed Ignorance and the Difficulty of Using Guidelines in Policy Processes
ERIC Educational Resources Information Center
Fernler, Karin
2015-01-01
Based on an ethnographic study, this article investigates an attempt by a multidisciplinary group to employ pre-developed guidelines for producing a knowledge base that was to be used in a policy decision. The article contributes to previous studies of the development and use of knowledge-based guidelines and knowledge syntheses in policy-research…
NASA Technical Reports Server (NTRS)
Afjeh, Abdollah A.; Reed, John A.
2003-01-01
This research is aimed at developing a new and advanced simulation framework that will significantly improve the overall efficiency of aerospace systems design and development. This objective will be accomplished through an innovative integration of object-oriented and Web-based technologies with both new and proven simulation methodologies. The basic approach involves three major areas of research: (1) aerospace system and component representation using a hierarchical object-oriented component model, which enables the use of multimodels and enforces component interoperability; (2) a collaborative software environment that streamlines the process of developing, sharing, and integrating aerospace design and analysis models; and (3) development of a distributed infrastructure that enables Web-based exchange of models to simplify the collaborative design process and to support computationally intensive aerospace design and analysis processes. Research for the first year dealt with the design of the basic architecture and supporting infrastructure, an initial implementation of that design, and a demonstration of its application to an example aircraft engine system simulation.
Development of an Aeroelastic Modeling Capability for Transient Nozzle Side Load Analysis
NASA Technical Reports Server (NTRS)
Wang, Ten-See; Zhao, Xiang; Zhang, Sijun; Chen, Yen-Sen
2013-01-01
Lateral nozzle forces are known to cause severe structural damage to any new rocket engine in development. Currently there is no fully coupled computational tool to analyze this fluid/structure interaction process. The objective of this study was to develop a fully coupled aeroelastic modeling capability to describe the fluid/structure interaction process during transient nozzle operations. The aeroelastic model comprises three components: the computational fluid dynamics component based on an unstructured-grid, pressure-based computational fluid dynamics formulation; the computational structural dynamics component developed in the framework of modal analysis; and the fluid-structural interface component. The developed aeroelastic model was applied to the transient nozzle startup process of the Space Shuttle Main Engine at sea level. The computed nozzle side loads and the axial nozzle wall pressure profiles from the aeroelastic nozzle are compared with those of the published rigid nozzle results, and the impact of the fluid/structure interaction on nozzle side loads is interrogated and presented.
Business Process-Based Resource Importance Determination
NASA Astrophysics Data System (ADS)
Fenz, Stefan; Ekelhart, Andreas; Neubauer, Thomas
Information security risk management (ISRM) heavily depends on realistic impact values representing the resources’ importance in the overall organizational context. Although a variety of ISRM approaches have been proposed, well-founded methods that provide an answer to the following question are still missing: How can business processes be used to determine resources’ importance in the overall organizational context? We answer this question by measuring the actual importance level of resources based on business processes. To this end, this paper presents our novel business process-based resource importance determination method, which provides ISRM with an efficient and powerful tool for deriving realistic resource importance figures solely from existing business processes. The conducted evaluation has shown that the calculation results of the developed method comply with the results gained in traditional workshop-based assessments.
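A minimal sketch of the underlying idea, resources inheriting importance from the business processes that depend on them, is shown below. The process values, dependency map, and worst-case aggregation rule are illustrative assumptions, not the authors' exact calculation.

```python
# Sketch: a resource inherits importance from every business process that
# uses it. Process values and the aggregation rule are illustrative only.
from collections import defaultdict

# (business process, importance of the process to the organization)
processes = {"order handling": 0.9, "invoicing": 0.6, "marketing": 0.3}

# which resources each process depends on
uses = {
    "order handling": ["ERP server", "database"],
    "invoicing":      ["database", "mail gateway"],
    "marketing":      ["mail gateway"],
}

importance = defaultdict(float)
for process, value in processes.items():
    for resource in uses[process]:
        # worst-case rule: a resource is as important as the most
        # important process that depends on it
        importance[resource] = max(importance[resource], value)

for resource, value in sorted(importance.items(), key=lambda kv: -kv[1]):
    print(f"{resource}: {value:.1f}")
```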
Coal liquefaction processes and development requirements analysis for synthetic fuels production
NASA Technical Reports Server (NTRS)
1980-01-01
The focus of the study is on: (1) developing a technical and programmatic data base on direct and indirect liquefaction processes which have potential for commercialization during the 1980's and beyond, and (2) performing analyses to assess technology readiness and development trends, development requirements, commercial plant costs, and projected synthetic fuel costs. Numerous data sources and references were used as the basis for the analysis results and information presented.
Process simulation and dynamic control for marine oily wastewater treatment using UV irradiation.
Jing, Liang; Chen, Bing; Zhang, Baiyu; Li, Pu
2015-09-15
UV irradiation and advanced oxidation processes have recently been regarded as promising solutions for removing polycyclic aromatic hydrocarbons (PAHs) from marine oily wastewater. However, such treatment methods are generally not sufficiently understood in terms of reaction mechanisms, process simulation and process control. These deficiencies can drastically hinder their application in the shipping and offshore petroleum industries, which produce bilge/ballast water and produced water as the main streams of marine oily wastewater. In this study, a factorial design of experiment was carried out to investigate the degradation mechanism of a typical PAH, namely naphthalene, under UV irradiation in seawater. Based on the experimental results, a three-layer feed-forward artificial neural network simulation model was developed to simulate the treatment process and to forecast the removal performance. A simulation-based dynamic mixed integer nonlinear programming (SDMINP) approach was then proposed to intelligently control the treatment process by integrating the developed simulation model, genetic algorithm and multi-stage programming. The applicability and effectiveness of the developed approach were further tested through a case study. The experimental results showed that the influences of fluence rate and temperature on the removal of naphthalene were greater than those of salinity and initial concentration. The developed simulation model could well predict the UV-induced removal process under varying conditions. The case study suggested that the SDMINP approach, with the aid of the multi-stage control strategy, was able to significantly reduce treatment cost compared with traditional single-stage process optimization. The developed approach and its concept/framework have high potential applicability in other environmental fields where a treatment process is involved and experimentation and modeling are used for process simulation and control. Copyright © 2015 Elsevier Ltd. All rights reserved.
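A minimal sketch of the simulation component follows: a three-layer feed-forward network mapping treatment conditions to removal performance. The synthetic training data (weighted to echo the reported factor ranking) and the network size are illustrative assumptions, not the study's data or architecture details.

```python
# Sketch of a three-layer feed-forward network mapping treatment conditions
# to removal performance. Synthetic data and network size are illustrative.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
# inputs: fluence rate, temperature, salinity, initial concentration (scaled)
X = rng.uniform(0.0, 1.0, (200, 4))
# synthetic "removal efficiency": dominated by fluence rate and temperature,
# mirroring the factor ranking reported in the abstract
y = 0.5 * X[:, 0] + 0.3 * X[:, 1] + 0.1 * X[:, 2] + 0.1 * X[:, 3]

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X, y)                    # input layer -> 8 hidden units -> output
print(model.predict([[0.9, 0.8, 0.5, 0.5]]))   # forecast removal for new settings
```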
The Research and Implementation of MUSER CLEAN Algorithm Based on OpenCL
NASA Astrophysics Data System (ADS)
Feng, Y.; Chen, K.; Deng, H.; Wang, F.; Mei, Y.; Wei, S. L.; Dai, W.; Yang, Q. P.; Liu, Y. B.; Wu, J. P.
2017-03-01
High-performance data processing on a single machine is an urgent need in the development of astronomical software. However, because machine configurations differ, traditional programming techniques such as multi-threading and CUDA (Compute Unified Device Architecture) on GPUs (Graphic Processing Units) have obvious limitations in portability and seamlessness across operating systems. The OpenCL (Open Computing Language) approach used in developing the MUSER (MingantU SpEctral Radioheliograph) data processing system is introduced, and the Högbom CLEAN algorithm is re-implemented as a parallel CLEAN algorithm using Python and the PyOpenCL extension package. The experimental results show that the CLEAN algorithm based on OpenCL has approximately the same operating efficiency as the earlier CLEAN algorithm based on CUDA. More importantly, the system can also achieve high performance in a CPU-only (Central Processing Unit) environment, which solves the problem of the environmental dependence of CUDA+GPU. Overall, the research improves the adaptability of the system, with emphasis on the performance of MUSER image CLEAN computing. At the same time, the use of OpenCL in MUSER demonstrates its suitability for scientific data processing. In view of OpenCL's high-performance computing features in heterogeneous environments, it will probably become a preferred technology in future high-performance astronomical software development.
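For reference, the serial Högbom CLEAN loop that the paper parallelizes with PyOpenCL looks roughly like the sketch below; the dirty image, beam, loop gain, and stopping threshold are illustrative assumptions.

```python
# Serial Hogbom CLEAN sketch (the paper parallelizes this loop with
# PyOpenCL). Dirty image, beam, loop gain and threshold are illustrative.
import numpy as np
from scipy.signal import fftconvolve

def hogbom_clean(dirty, psf, gain=0.1, threshold=0.01, max_iter=1000):
    """Iteratively subtract scaled, shifted copies of the PSF from the image."""
    half = psf.shape[0] // 2
    residual = np.pad(dirty.astype(float), half)      # pad to simplify edges
    model = np.zeros_like(dirty, dtype=float)
    for _ in range(max_iter):
        interior = residual[half:-half, half:-half]
        y, x = np.unravel_index(np.argmax(np.abs(interior)), interior.shape)
        peak = interior[y, x]
        if abs(peak) < threshold:
            break
        model[y, x] += gain * peak                    # accumulate point source
        # interior (y, x) maps to padded (y+half, x+half), so the PSF window
        # centred there spans rows y .. y+psf_height-1 of the padded array
        residual[y:y + psf.shape[0], x:x + psf.shape[1]] -= gain * peak * psf
    return model, residual[half:-half, half:-half]

# toy test: two point sources blurred by a Gaussian beam
yy, xx = np.mgrid[-4:5, -4:5]
psf = np.exp(-(xx ** 2 + yy ** 2) / 4.0)
sky = np.zeros((64, 64)); sky[20, 20] = 1.0; sky[40, 45] = 0.5
dirty = fftconvolve(sky, psf, mode="same")
model, residual = hogbom_clean(dirty, psf)
print(model[20, 20], model[40, 45])   # flux concentrates near the true sources
```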
ERIC Educational Resources Information Center
Chang, Kuo-En; Sung, Yao-Ting; Hou, Huei-Tse
2006-01-01
Educational software for teachers is an important, yet usually ignored, link for integrating information technology into classroom instruction. This study builds a web-based teaching material design and development system. The process in the system is divided into four stages: analysis, design, development, and practice. Eight junior high school…
Developing a Web 2.0-Based System with User-Authored Content for Community Use and Teacher Education
ERIC Educational Resources Information Center
Cifuentes, Lauren; Sharp, Amy; Bulu, Sanser; Benz, Mike; Stough, Laura M.
2010-01-01
We report on an investigation into the design, development, implementation, and evaluation of an informational and instructional Website in order to generate guidelines for instructional designers of read/write Web environments. We describe the process of design and development research, the problem addressed, the theory-based solution, and the…
Mirfazaelian et al. (2006) developed a physiologically based pharmacokinetic (PBPK) model for the pyrethroid pesticide deltamethrin in the rat. This model describes gastrointestinal tract absorption as a saturable process mediated by phase III efflux transporters which pump delta...
Principles for Developing Competency-Based Education Programs
ERIC Educational Resources Information Center
Johnstone, Sally M.; Soares, Louis
2014-01-01
The 2013 US college/university policy agenda, "Making College Affordable: A Better Agenda for the Middle Class," highlighted the role of developing technologies, institutional curriculum-design processes, and new delivery methods as keys to providing quality, affordable postsecondary education. Competency-based education (CBE) is given…
NASA Technical Reports Server (NTRS)
Abhiraman, A.; Collard, D.; Cardelino, B.; Bhatia, S.; Desai, P.; Harruna, I.; Khan, I.; Mariam, Y.; Mensah, T.; Mitchell, M.
1992-01-01
The NASA funding allowed Clark Atlanta University (CAU) to establish a High Performance Polymers And Ceramics (HiPPAC) Research Center. The HiPPAC Center is consolidating and expanding the existing polymer and ceramic research capabilities at CAU through the development of interdepartmental and interinstitutional research in: (1) polymer synthesis; (2) polymer characterization and properties; (3) polymer processing; (4) polymer-based ceramic synthesis; and (5) ceramic characterization and properties. This Center has developed strong interactions between scientists and materials scientists of CAU and their counterparts from sister institutions in the Atlanta University Center (AUC) and the Georgia Institute of Technology. As a component of the center, we have started to develop strong collaborations with scientists from other universities and HBCUs, national and federal agency laboratories, and the private sector during this first year. During this first year we have refined the focus of the research in the HiPPAC Center to three areas with seven working groups that will start programmatic activities on January 1, 1993, as follows: (1) nonlinear optical properties of chitosan derivatives; (2) polymeric electronic materials; (3) nondestructive characterization and prediction of polyimide performance; (4) solution processing of high-performance materials; (5) processable polyimides for composite applications; (6) sol-gel based ceramic materials processing; and (7) synthetic based processing of pre-ceramic polymers.
Derouesné, Christian
2017-09-01
In the 1930s, L. S. Vygotsky developed an original conception of psychology and of the development of the higher psychological processes, one that stood apart from the theories then current in Russia and the West. He laid the foundations for the study of the higher mental processes and their relationship to brain functioning, which were later developed by A. R. Luria. After a brief historical note, this paper examines the relationships between Vygotsky and the philosophy of Marx and Engels, the Soviet authorities, and the works of Freud and Piaget.
Ground Vehicle Condition Based Maintenance
2010-10-04
[Briefing slide excerpts] Diagnostic Process Map. 32 FMEAs developed: diesel engine, transmission, alternators. Analysis: identify failure modes; derive design factors and... S&T initiatives: TARDEC P&D process map, component testing, ARL CBM research, AMSAA SDC and terrain modeling. CBM+ overview: RCM and CBM are core processes for CBM+ system development (Army Regulation 750-1, 20 Sep 2007, p. 79, Reliability Centered Maintenance (RCM...
Technology development in support of the TWRS process flowsheet. Revision 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Washenfelder, D.J.
1995-10-11
The Tank Waste Remediation System (TWRS) is to treat and dispose of Hanford's Single-Shell and Double-Shell Tank Waste. The TWRS Process Flowsheet (WHC-SD-WM-TI-613, Rev. 1) described a flowsheet based on a large number of assumptions and engineering judgements that require verification or further definition through process and technology development activities. This document builds on the TWRS Process Flowsheet to identify and prioritize tasks that should be completed to strengthen the technical foundation for the flowsheet.
Architecture Of High Speed Image Processing System
NASA Astrophysics Data System (ADS)
Konishi, Toshio; Hayashi, Hiroshi; Ohki, Tohru
1988-01-01
An architecture for a high-speed image processing system supporting a new shape-understanding algorithm is proposed, and a hardware system based on the architecture was developed. The main design considerations were that the processors used should match the processing sequence of the target image and that the system should be practical for industrial use. As a result, each processing step can be performed at a speed of 80 nanoseconds per pixel.
Field trials of a novel toolkit for evaluating 'intangible' values-related dimensions of projects.
Burford, Gemma; Velasco, Ismael; Janoušková, Svatava; Zahradnik, Martin; Hak, Tomas; Podger, Dimity; Piggot, Georgia; Harder, Marie K
2013-02-01
A novel toolkit has been developed, using an original approach to develop its components, for the purpose of evaluating 'soft' outcomes and processes that have previously been generally considered 'intangible': those which are specifically values based. This represents a step-wise, significant change in provision for the assessment of values-based achievements that are of key importance to most civil society organisations (CSOs) and values-based businesses, and fills a known gap in evaluation practice. In this paper, we demonstrate the significance and rigour of the toolkit by presenting an evaluation of it in three diverse scenarios where different CSOs use it to co-evaluate locally relevant outcomes and processes to obtain results which are both meaningful to them and potentially comparable across organisations. A key strength of the toolkit is its original use of a prior-generated, peer-elicited 'menu' of values-based indicators which provides a framework for user CSOs to localise. Principles of participatory, process-based and utilisation-focused evaluation are embedded in this toolkit and shown to be critical to its success, achieving high face-validity and wide applicability. The emerging contribution of this next-generation evaluation tool to other fields, such as environmental values, development and environmental sustainable development, shared values, business, education and organisational change, is outlined. Copyright © 2012 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Gabriel, Paramo; Adrian, Benitez
2014-07-01
Incremental sheet forming by the single point incremental forming method (dieless SPIF) is a widely studied process, tested and developed in countries with advanced manufacturing technologies, and its costs are attractive when production is configured around small batches. The United States, the United Kingdom, and France lead studies and cases of this type, developing trials on various experimental geometries; in national environments such as Colombia, Bolivia, Chile, Ecuador, and Peru, the process has been studied only sparsely. Accordingly, this work develops an experimental case on a particular geometry, identifying the maximum formability angle the material permits for forming a piece in one pass and analyzing the forming limit curve (FLC). The objectives are to highlight this innovative CAD-CAM-based method, compare it with analogous sheet-metal deformation processes such as embossing, support sound decisions about the viability and applicability of this dieless process to a particular industrial piece that responds to the needs of the production configurations mentioned, and position it as a manufacturing alternative to conventional sheet-metal forming processes such as embossing for small-batch production systems.
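One quantitative handle on the maximum formability angle discussed above is the sine-law thinning estimate commonly used in SPIF studies: t = t0·sin(90° − α), with α the wall angle from the horizontal. A short worked example follows; the initial thickness and candidate angles are illustrative.

```python
# Worked example of the SPIF sine-law thinning estimate,
# t = t0 * sin(90 deg - alpha). Initial thickness is illustrative.
import math

t0 = 1.0                                     # initial sheet thickness (mm)
for alpha in (20, 40, 60, 70, 75):           # candidate wall angles (deg)
    t = t0 * math.sin(math.radians(90 - alpha))
    print(f"wall angle {alpha:2d} deg -> wall thickness {t:.2f} mm")
```

The steep thinning at high wall angles is why a single-pass part has a maximum formability angle beyond which the wall fails.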
Concepts and Relations in Neurally Inspired In Situ Concept-Based Computing
van der Velde, Frank
2016-01-01
In situ concept-based computing is based on the notion that conceptual representations in the human brain are “in situ.” In this way, they are grounded in perception and action. Examples are neuronal assemblies, whose connection structures develop over time and are distributed over different brain areas. In situ concept representations cannot be copied or duplicated because that will disrupt their connection structure, and thus the meaning of these concepts. Higher-level cognitive processes, as found in language and reasoning, can be performed with in situ concepts by embedding them in specialized neurally inspired “blackboards.” The interactions between the in situ concepts and the blackboards form the basis for in situ concept computing architectures. In these architectures, memory (concepts) and processing are interwoven, in contrast with the separation between memory and processing found in Von Neumann architectures. Because the further development of Von Neumann computing (more, faster, yet power limited) is questionable, in situ concept computing might be an alternative for concept-based computing. In situ concept computing will be illustrated with a recently developed BABI reasoning task. Neurorobotics can play an important role in the development of in situ concept computing because of the development of in situ concept representations derived in scenarios as needed for reasoning tasks. Neurorobotics would also benefit from power-limited and in situ concept computing. PMID: 27242504
NASA Technical Reports Server (NTRS)
Tamir, David
1992-01-01
As we venture into space, it becomes necessary to assemble, expand, and repair space-based structures for our housing, research, and manufacturing. The zero-gravity vacuum of space challenges us to adapt construction options which are commonplace on Earth. Rockwell International (RI) has begun to undertake the challenge of space-based construction via numerous options, of which one is welding. As of today, RI divisions have developed appropriate resources and technologies to bring space-based welding within our grasp. Further work, specifically in the area of developing space experiments to test RI technology, is required. The RI Space Welding Project's achievements to date, from research and development (R&D) efforts in the areas of microgravity, vacuum, intra-/extra-vehicular activity, and spinoff technologies, are reviewed. Special emphasis is given to results from G-169's (Get Away Special) microgravity flights aboard a NASA KC-135. Based on these achievements, a path to actual development of a space welding system is proposed, with options to explore spinoff in-space metal processing technologies. This path is constructed by following a series of milestone experiments, several of which are to utilize NASA's Shuttle Small Payload Programs. Conceptual designs of the proposed shuttle payload experiments are discussed with application of lessons learned from G-169's design, development, integration, testing, safety approval process, and KC-135 flights.
Demand driven salt clean-up in a molten salt fast reactor – Defining a priority list
Litskevich, D.; Gregg, R.; Mount, A. R.
2018-01-01
The PUREX technology based on aqueous processes is currently the leading reprocessing technology in nuclear energy systems. It is the most developed and established process for light water reactor fuel and for solid fuel generally. However, demand-driven development of the nuclear system opens the way to liquid-fuelled reactors, and to disruptive technology development through the application of an integrated fuel cycle with a direct link to reactor operation. The possibilities of this new concept for innovative reprocessing technology development are analysed, the boundary conditions are discussed, and the economic as well as the neutron-physical optimization parameters of the process are elucidated. Reactor-physics knowledge of the influence of different elements on the neutron economy of the reactor is required. Using an innovative study approach, an element priority list for the salt clean-up is developed, which indicates that separation of neodymium and caesium is desirable, as they contribute almost 50% to the loss of criticality. Separating zirconium and samarium from the fuel salt in addition would remove nearly 80% of the loss of criticality due to fission products. The theoretical study is followed by a qualitative discussion of the different demand-driven optimization strategies which could satisfy the conflicting interests of sustainable reactor operation, efficient chemical processing for the salt clean-up, and the related economic as well as chemical engineering consequences. A new, innovative approach of balancing the throughput through salt processing based on a low number of separation process steps is developed. Next steps for the development of an economically viable salt clean-up process are identified. PMID: 29494604
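The priority-list construction can be illustrated by ranking elements by their contribution to criticality loss and accumulating the shares. The numbers below merely echo the percentages quoted in the abstract and are illustrative, not the study's actual data.

```python
# Sketch: rank elements by their contribution to fission-product
# criticality loss and accumulate shares. Values are illustrative only,
# chosen to echo the ~50% (Nd+Cs) and ~80% (+Zr+Sm) quoted above.
loss_share = {"Nd": 0.30, "Cs": 0.19, "Zr": 0.17, "Sm": 0.13}

cumulative = 0.0
for element, share in sorted(loss_share.items(), key=lambda kv: -kv[1]):
    cumulative += share
    print(f"{element}: {share:.0%} of criticality loss, cumulative {cumulative:.0%}")
# remaining fission products account for the final ~21% in this toy breakdown
```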
Atypical forest products, processes, and uses: a developing component of National Forest management
Mike Higgs; John Sebelius; Mike Miller
1995-01-01
The silvicultural practices prescribed under an ecosystem management regimen will alter the volume and character of the National Forests' marketable raw material base. This alteration will affect forest-dependent communities that have traditionally relied upon these resources for their economic and social well-being. Community-based atypical forest products, processes...
ERIC Educational Resources Information Center
Chan, Cecilia K. Y.
2016-01-01
Many educational researchers have established problem-based learning (PBL) as a total approach to education--both a product and a process--from a pedagogical instructional strategy to skills development to assessment. This study provides qualitative evidence from educational practitioners in various professional disciplines, namely, Medicine,…
A HO-IRT Based Diagnostic Assessment System with Constructed Response Items
ERIC Educational Resources Information Center
Yang, Chih-Wei; Kuo, Bor-Chen; Liao, Chen-Huei
2011-01-01
The aim of the present study was to develop an on-line assessment system with constructed response items in the context of the elementary mathematics curriculum. The system recorded the problem-solving process of constructed response items and transferred the process to response codes for further analyses. An inference mechanism based on artificial…
Joining precipitation-hardened nickel-base alloys by friction welding
NASA Technical Reports Server (NTRS)
Moore, T. J.
1972-01-01
A solid-state deformation welding process, friction welding, has been developed for joining precipitation-hardened nickel-base alloys and other gamma-prime-strengthened materials which heretofore have been virtually unweldable. The method requires rotation of one of the parts to be welded, but where applicable it is an ideal process for high-volume production jobs.
ERIC Educational Resources Information Center
Paleeri, Sankaranarayanan
2015-01-01
Transaction methods and approaches in value education have to change from lecturing to process-based methods, in line with the development of the constructivist approach. Process-based methods invite creative interpretation and active participation on the students' part. Teachers have to organize suitable activities to transact values through process…
Cognitive Process as a Basis for Intelligent Retrieval Systems Design.
ERIC Educational Resources Information Center
Chen, Hsinchun; Dhar, Vasant
1991-01-01
Two studies of the cognitive processes involved in online document-based information retrieval were conducted. These studies led to the development of five computational models of online document retrieval which were incorporated into the design of an "intelligent" document-based retrieval system. Both the system and the broader implications of…
We propose multi-faceted research to enhance our understanding of NH3 emissions from livestock feeding operations. A process-based emissions modeling approach will be used, and we will investigate ammonia emissions from the scale of the individual farm out to impacts on region...
ERIC Educational Resources Information Center
Liou, Hsien-Chin; Chang, Jason S; Chen, Hao-Jan; Lin, Chih-Cheng; Liaw, Meei-Ling; Gao, Zhao-Ming; Jang, Jyh-Shing Roger; Yeh, Yuli; Chuang, Thomas C.; You, Geeng-Neng
2006-01-01
This paper describes the development of an innovative web-based environment for English language learning with advanced data-driven and statistical approaches. The project uses various corpora, including a Chinese-English parallel corpus ("Sinorama") and various natural language processing (NLP) tools to construct effective English…
2003-06-26
VANDENBERG AIR FORCE BASE, CALIF. - At Vandenberg Air Force Base, Calif., the Pegasus launch vehicle is moved toward its hangar. The Pegasus will carry the SciSat-1 spacecraft in a 400-mile-high polar orbit to investigate processes that control the distribution of ozone in the upper atmosphere. The data from the satellite will provide Canadian and international scientists with improved measurements relating to global ozone processes and help policymakers assess existing environmental policy and develop protective measures for improving the health of our atmosphere, preventing further ozone depletion. The mission is designed to last two years.
Porous starch-based drug delivery systems processed by a microwave route.
Malafaya, P B; Elvira, C; Gallardo, A; San Román, J; Reis, R L
2001-01-01
A new, simple processing route to produce starch-based porous materials was developed based on a microwave baking methodology. This innovative processing route was used to obtain non-loaded controls and loaded drug delivery carriers incorporating a non-steroidal anti-inflammatory agent. This bioactive agent was selected as a model drug, with the expectation that the developed methodology might be used for other drugs and growth factors. The prepared systems were characterized by 1H and 13C NMR spectroscopy, which allowed the study of the interactions between the starch-based materials and the processing components, i.e., the blowing agents. The porosity of the prepared materials was estimated by measuring their apparent density and studied by comparing drug-loaded and non-loaded carriers. The behaviour of the porous structures immersed in aqueous media was studied in terms of swelling and degradation, both of which are intimately related to porosity. Finally, in vitro drug release studies were performed, showing a clear burst effect followed by slow, controlled release of the drug over several days (up to 10 days).
NASA Astrophysics Data System (ADS)
Sari, Anggi Ristiyana Puspita; Suyanta, LFX, Endang Widjajanti; Rohaeti, Eli
2017-05-01
Recognizing the importance of developing critical thinking and science process skills, an assessment instrument should attend to the characteristics of chemistry, and constructing an accurate instrument for measuring those skills is therefore important. However, integrated assessment instruments are limited in number. The purpose of this study is to validate an integrated assessment instrument for measuring students' critical thinking and science process skills on acid-base matter. The development of the test instrument adapted the McIntire model. The sample consisted of 392 second-grade high school students in the academic year 2015/2016 in Yogyakarta. Exploratory Factor Analysis (EFA) was conducted to explore construct validity, whereas content validity was substantiated by Aiken's formula. The results show that the KMO measure is 0.714, indicating adequate sampling for each factor, and the Bartlett test is significant (significance value less than 0.05). Furthermore, a content validity coefficient of 0.85 is obtained based on 8 experts. The findings support the integrated assessment instrument as a measure of critical thinking and science process skills on acid-base matter.
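Aiken's formula, used above for content validity, is simple enough to show as a worked example. A minimal sketch follows; the eight expert ratings are invented for illustration (the study reports a coefficient of 0.85).

```python
# Aiken's V for a single item rated by 8 experts on a 5-point scale.
# V = S / (n * (c - 1)), where S sums each rating's offset from the lowest
# category, n is the number of raters and c the number of categories.
def aikens_v(ratings, lo=1, hi=5):
    n, c = len(ratings), hi - lo + 1
    s = sum(r - lo for r in ratings)
    return s / (n * (c - 1))

print(aikens_v([5, 4, 5, 4, 4, 5, 4, 4]))   # ~0.84 for this hypothetical panel
```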
Cotton-based Cellulose Nanomaterials for Applications in Composites and Electronics
NASA Astrophysics Data System (ADS)
Farahbakhsh, Nasim
A modern society demands the development of high-value, sustainable products via innovative process technologies and the use of bio-based alternatives to petroleum-based materials. A systematic comparative study of nanocellulose particles as biodegradable, renewable reinforcing agents can help develop criteria for selecting an appropriate candidate for incorporation in polymer nanocomposites. Of particular interest have been nanocellulosic materials, including cellulose nanocrystals (CNC) and micro/nanofibrillated cellulose (MFC/NFC), which possess a hierarchical structure permitting an ordered arrangement with unique properties, and which have served as building blocks for the design of green, novel composite materials for applications in flexible electronics, medicine, and composites. Key differences exist among nanocellulosic materials as a result of the process by which the material is produced. This research demonstrates the applicability of recycled cotton as a promising sustainable material, both as a substrate for electronic applications and as a reinforcing agent that can be produced without any intensive purification process and applied to synthetic-polymer nanocomposites in melt processing.
Housing decision making methods for initiation development phase process
NASA Astrophysics Data System (ADS)
Zainal, Rozlin; Kasim, Narimah; Sarpin, Norliana; Wee, Seow Ta; Shamsudin, Zarina
2017-10-01
Late delivery and troubled ('sick') housing projects have been attributed to poor decision making. These problems stem from housing developers who prefer to create their own approaches based on their experience and expertise, taking the simplest route of merely applying available standards and rules in decision making. This paper seeks to identify the decision-making methods used for housing development at the initiation phase in Malaysia. The research applied the Delphi method, using a questionnaire survey of 50 developers as the sample for the first stage of data collection. However, only 34 developers contributed to the second stage of the information-gathering process, and only 12 developers remained for the final data collection stage. The findings affirm that Malaysian developers prefer to make their investment decisions based on simple interpolation of historical data, using simple statistical or mathematical techniques to produce the required reports. They appear to skip several important decision-making functions at the early development stage. These shortcomings were mainly due to time and financial constraints and the lack of statistical or mathematical expertise among the professional and management groups in the developer organisations.
Baradez, Marc-Olivier; Marshall, Damian
2011-01-01
The transition from traditional culture methods towards bioreactor-based bioprocessing to produce cells in commercially viable quantities for cell therapy applications requires the development of robust methods to ensure the quality of the cells produced. Standard methods for measuring cell quality parameters such as viability provide only limited information, making process monitoring and optimisation difficult. Here we describe a 3D image-based approach to develop cell distribution maps which can be used to simultaneously measure the number, confluency and morphology of cells attached to microcarriers in a stirred tank bioreactor. The accuracy of the cell distribution measurements is validated using in silico modelling of synthetic image datasets and is shown to have an accuracy >90%. Using the cell distribution mapping process and principal component analysis we show how cell growth can be quantitatively monitored over a 13 day bioreactor culture period and how changes to manufacturing processes such as initial cell seeding density can significantly influence cell morphology and the rate at which cells are produced. Taken together, these results demonstrate how image-based analysis can be incorporated in cell quality control processes, facilitating the transition towards bioreactor-based manufacture for clinical grade cells. PMID: 22028809
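As an illustration of the monitoring idea, the sketch below reduces per-microcarrier features (cell count, confluency, a morphology score) with PCA so culture progression appears as a trajectory in component space. The features and their 13-day trends are synthetic stand-ins, not the study's data or pipeline.

```python
# Sketch: track culture progression by reducing per-microcarrier features
# with PCA. The feature values below are synthetic illustrations only.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
days = np.arange(13)
# features drift over the 13-day culture: count and confluency grow
features = np.column_stack([
    20 + 15 * days + rng.normal(0, 5, 13),         # cells per microcarrier
    0.05 + 0.07 * days + rng.normal(0, 0.02, 13),  # confluency fraction
    1.0 + 0.05 * days + rng.normal(0, 0.05, 13),   # elongation (morphology)
])

scores = PCA(n_components=2).fit_transform(features)
for d, (pc1, pc2) in zip(days, scores):
    print(f"day {d:2d}: PC1 = {pc1:7.2f}, PC2 = {pc2:5.2f}")
```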
Social Information Processing in Deaf Adolescents
ERIC Educational Resources Information Center
Torres, Jesús; Saldaña, David; Rodríguez-Ortiz, Isabel R.
2016-01-01
The goal of this study was to compare the processing of social information in deaf and hearing adolescents. A task was developed to assess social information processing (SIP) skills of deaf adolescents based on Crick and Dodge's (1994; A review and reformulation of social information-processing mechanisms in children's social adjustment.…
The application of intelligent process control to space based systems
NASA Technical Reports Server (NTRS)
Wakefield, G. Steve
1990-01-01
The application of Artificial Intelligence to electronic and process control can help meet the autonomy and safety requirements of manned space systems. An overview of documented applications within various industries is presented. The development process is discussed along with associated issues for implementing an intelligent process control system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clarke, Kester Diederik
The intent of this report is to document a procedure used at LANL for HIP bonding aluminum cladding to U-10Mo fuel foils using a formed HIP can for the Domestic Reactor Conversion program in the NNSA Office of Material Management and Minimization, and to provide some details that may not have been published elsewhere. The HIP process is based on the procedures that have been used to develop the formed HIP can process, including the baseline process developed at Idaho National Laboratory (INL). The development of the HIP cladding-bonding process is summarized in the listed references. Further iteration with Babcock & Wilcox (B&W) to refine the process to meet production and facility requirements is expected.
An overview of TOUGH-based geomechanics models
Rutqvist, Jonny
2016-09-22
After the initial development of the first TOUGH-based geomechanics model 15 years ago, which linked the TOUGH2 multiphase flow simulator to the FLAC3D geomechanics simulator, at least 15 additional TOUGH-based geomechanics models have appeared in the literature. This development has been fueled by a growing demand and interest for modeling coupled multiphase flow and geomechanical processes related to a number of geoengineering applications, such as geologic CO2 sequestration, enhanced geothermal systems, unconventional hydrocarbon production, and most recently, reservoir stimulation and injection-induced seismicity. This paper provides a short overview of these TOUGH-based geomechanics models, focusing on some of those most frequently applied to a diverse set of problems associated with geomechanics and its couplings to hydraulic, thermal and chemical processes.
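The coupling pattern behind TOUGH-FLAC-style simulators can be sketched as a sequential loop in which the flow solve and the mechanics solve exchange pressure and porosity each time step. In the sketch below, flow_step and mechanics_step are hypothetical stand-ins with toy physics, not TOUGH2 or FLAC3D calls.

```python
# Sketch of sequential flow/geomechanics coupling: each time step, the flow
# solve updates pressure, and the mechanics solve returns stress and a
# stress-dependent porosity fed back to the flow solve. Toy physics only.
def flow_step(pressure, porosity, dt):
    # Hypothetical stand-in for one multiphase-flow (TOUGH2-like) step.
    return pressure + dt * (1.0 - pressure) / porosity

def mechanics_step(pressure):
    # Hypothetical stand-in for a stress/strain (FLAC3D-like) solve.
    stress = 2.0 - 0.8 * pressure
    porosity = 0.10 + 0.02 * pressure     # porosity responds to pore pressure
    return stress, porosity

pressure, porosity = 0.0, 0.10
for step in range(10):                    # one coupling pass per time step
    pressure = flow_step(pressure, porosity, dt=0.05)
    stress, porosity = mechanics_step(pressure)
    print(f"t={0.05 * (step + 1):.2f}: p={pressure:.3f}, "
          f"sigma={stress:.3f}, phi={porosity:.3f}")
```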
Learning cell biology as a team: a project-based approach to upper-division cell biology.
Wright, Robin; Boggs, James
2002-01-01
To help students develop successful strategies for learning how to learn and communicate complex information in cell biology, we developed a quarter-long cell biology class based on team projects. Each team researches a particular human disease and presents information about the cellular structure or process affected by the disease, the cellular and molecular biology of the disease, and recent research focused on understanding the cellular mechanisms of the disease process. To support effective teamwork and to help students develop collaboration skills useful for their future careers, we provide training in working in small groups. A final poster presentation, held in a public forum, summarizes what students have learned throughout the quarter. Although student satisfaction with the course is similar to that of standard lecture-based classes, a project-based class offers unique benefits to both the student and the instructor.
User-centric design of a personal assistance robot (FRASIER) for active aging.
Padir, Taşkin; Skorinko, Jeanine; Dimitrov, Velin
2015-01-01
We present our preliminary results from the design process for developing the Worcester Polytechnic Institute's personal assistance robot, FRASIER, as an intelligent service robot for enabling active aging. The robot capabilities include vision-based object detection, tracking the user and help with carrying heavy items such as grocery bags or cafeteria trays. This work-in-progress report outlines our motivation and approach to developing the next generation of service robots for the elderly. Our main contribution in this paper is the development of a set of specifications based on the adopted user-centered design process, and realization of the prototype system designed to meet these specifications.
Evaluating Process Sustainability Using Flowsheet Monitoring
Environmental metric software can be used to evaluate the sustainability of a chemical based on data from the chemical process that is used to manufacture it. One problem in developing environmental metric software is that chemical process simulation packages typically do not rea...
Comprehension of Connected Discourse.
ERIC Educational Resources Information Center
Mosberg, Ludwig; Shima, Fred
A rationale was developed for researching reading comprehension based on information gain. Previous definitions of comprehension were reviewed, including operational vs. nonoperational and skills vs. processes. Comprehension was viewed as an information-processing event which includes a constellation of cognitive and learning processes. Two…
Evans, Rachel C; Kyeremateng, Samuel O; Asmus, Lutz; Degenhardt, Matthias; Rosenberg, Joerg; Wagner, Karl G
2018-05-01
The aim of this work was to investigate the use of torasemide as a highly sensitive indicator substance and to develop a formulation thereof for establishing quantitative relationships between hot-melt extrusion process conditions and critical quality attributes (CQAs). Using solid-state characterization techniques and a 10 mm lab-scale co-rotating twin-screw extruder, we studied torasemide in a Soluplus® (SOL)-polyethylene glycol 1500 (PEG 1500) matrix, and developed and characterized a formulation which was used as a process indicator to study thermal- and hydrolysis-induced degradation, as well as residual crystallinity. We found that torasemide first dissolved into the matrix and then degraded. Based on this mechanism, extrudates with measurable levels of degradation and residual crystallinity were produced, depending strongly on the main barrel and die temperature and residence time applied. In addition, we found that 10% w/w PEG 1500 as plasticizer resulted in the widest operating space with the widest range of measurable residual crystallinity and degradant levels. Torasemide as an indicator substance behaves like a challenging-to-process API, only with higher sensitivity and more pronounced effects, e.g., degradation and residual crystallinity. Application of a model formulation containing torasemide will enhance the understanding of the dynamic environment inside an extruder and elucidate the cumulative thermal and hydrolysis effects of the extrusion process. The use of such a formulation will also facilitate rational process development and scaling by establishing clear links between process conditions and CQAs.
Unifying Model-Based and Reactive Programming within a Model-Based Executive
NASA Technical Reports Server (NTRS)
Williams, Brian C.; Gupta, Vineet; Norvig, Peter (Technical Monitor)
1999-01-01
Real-time, model-based deduction has recently emerged as a vital component in AI's toolbox for developing highly autonomous reactive systems. Yet one of the current hurdles towards developing model-based reactive systems is the number of methods simultaneously employed, and their corresponding melange of programming and modeling languages. This paper offers an important step towards unification. We introduce RMPL, a rich modeling language that combines probabilistic, constraint-based modeling with reactive programming constructs, while offering a simple semantics in terms of hidden-state Markov processes. We introduce probabilistic, hierarchical constraint automata (PHCA), which allow Markov processes to be expressed in a compact representation that preserves the modularity of RMPL programs. Finally, a model-based executive, called Reactive Burton, is described that exploits this compact encoding to perform efficient simulation, belief state update, and control sequence generation.
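For readers unfamiliar with belief-state update over hidden-state Markov processes, the core operation is standard discrete Bayesian filtering: predict with the transition model, then correct with the observation likelihood. The Python sketch below shows one such update step; the two-state component model and all numbers are hypothetical illustrations, not taken from the paper.

# Minimal sketch of the belief-state update a model-based executive performs
# over a hidden-state Markov model. Matrices below are illustrative only.

import numpy as np

def belief_update(belief, transition, obs_likelihood):
    """One step of discrete Bayesian filtering.

    belief          -- P(state) before the step, shape (n,)
    transition      -- P(next | current), shape (n, n), rows sum to 1
    obs_likelihood  -- P(observation | next state), shape (n,)
    """
    predicted = belief @ transition          # prediction (time update)
    posterior = predicted * obs_likelihood   # correction (measurement update)
    return posterior / posterior.sum()       # renormalize to a distribution

# Example: a two-state component model {nominal, failed}.
belief = np.array([0.99, 0.01])
transition = np.array([[0.995, 0.005],      # nominal mostly stays nominal
                       [0.0,   1.0]])       # failure is absorbing
obs_likelihood = np.array([0.2, 0.9])       # observation fits "failed" better
print(belief_update(belief, transition, obs_likelihood))

The compact PHCA encoding described in the abstract is what makes such updates efficient at scale, since it avoids enumerating the full flat state space.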
Model-Based Verification and Validation of the SMAP Uplink Processes
NASA Technical Reports Server (NTRS)
Khan, M. Omair; Dubos, Gregory F.; Tirona, Joseph; Standley, Shaun
2013-01-01
This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes allow by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based V&V development efforts.
Fleisher, Linda; Buzaglo, Joanne; Collins, Michael; Millard, Jennifer; Miller, Suzanne M; Egleston, Brian L; Solarino, Nicholas; Trinastic, Jonathan; Cegala, Donald J; Benson, Al B; Schulman, Kevin A; Weinfurt, Kevin P; Sulmasy, Daniel; Diefenbach, Michael A; Meropol, Neal J
2008-06-01
Although there is broad consensus that careful content vetting and user testing are important in the development of technology-based educational interventions, these steps are often overlooked. This paper highlights the development of a theory-guided, web-based communication aid (CONNECT), designed to facilitate treatment decision-making among patients with advanced cancer. The communication aid included an on-line survey, a patient skills training module, and an automated physician report. Development steps included: (1) evidence-based content development; (2) usability testing; (3) pilot testing; and (4) patient utilization and satisfaction. Usability testing identified some confusing directions and navigation in the on-line survey and validated the relevance of the "patient testimonials" in the skills module. Preliminary satisfaction results from the implementation of the communication aid showed that 66% of patients found the survey length reasonable and 70% found it helpful in talking with their physician. Seventy percent reported the skills module helpful, and about half reported that it affected the consultation. Designing patient education interventions for translation into practice requires the integration of health communication best practices, including user feedback throughout the developmental process. This developmental process can be translated to a broad array of community-based patient and provider educational interventions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soria-Lara, Julio A., E-mail: j.a.sorialara@uva.nl; Bertolini, Luca, E-mail: l.bertolini@uva.nl; Brömmelstroet, Marco te, E-mail: M.C.G.teBrommelstroet@uva.nl
The effectiveness of EIA for evaluating transport planning projects is increasingly being questioned by practitioners, institutions, and scholars. The academic literature has traditionally focused more on solving content-related problems with EIA (i.e., the measurement of environmental effects) than on process-related issues (i.e., the role of EIA in the planning process and the interaction between key actors). Focusing only on technical improvements is not sufficient for rectifying the effectiveness problems of EIA. In order to address this knowledge gap, the paper explores how EIA is experienced in the Spanish planning context and offers in-depth insight into EIA process-related issues in the field of urban transport planning. From the multitude of involved actors, the research focuses on exploring the perceptions of the two main professional groups: EIA developers and transport planners. Through a web-based survey we assess the importance of process-related barriers to the effective use of EIA in urban transport planning. The analyses revealed process issues rooted fundamentally in unstructured stakeholder involvement and inefficient public participation. Highlights: • Qualitative research on the perceptions of EIA participants regarding EIA processes. • Web-based survey with different participant groups (EIA developers; transport planners). • Inefficient stakeholder participation was observed during the EIA processes.
NASA Astrophysics Data System (ADS)
Yerizon; Jazwinarti; Yarman
2018-01-01
Students experience difficulties in the Introduction to Operational Research (PRO) course. The purpose of this study is to analyze students' requirements for developing PRO lecture materials based on Problem Based Learning that are valid, practical, and effective. The lecture materials are developed based on Plomp's model. The development process consists of three phases: front-end analysis/preliminary research, a development/prototype phase, and an assessment phase. Preliminary analysis data were obtained through observation and interviews. The research found that students need a student worksheet (LKM) for several reasons: 1) no LKM is available, 2) the subject matter is not yet presented through real problems, and 3) students experience difficulties with the current learning sources.